
(No) safe way to wrap taichi kernels in pytorch #11869

Triggered via issue comment, November 14, 2024, 13:50
@Kiord commented on #8339 (commit e9f19b8)
Status: Success
Total duration: 14s
Artifacts: –

Workflow file: issue_comment.yml
Trigger: on: issue_comment
Job: check_comments (5s)

Annotations: 1 warning

check_comments: The following actions use a deprecated Node.js version and will be forced to run on node20: peter-evans/slash-command-dispatch@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
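The warning comes from the issue_comment.yml workflow's use of peter-evans/slash-command-dispatch@v3. As context only, here is a minimal sketch of what an issue_comment-triggered workflow with a check_comments job using that action could look like; the secret name and command list are assumptions for illustration, not the repository's actual configuration:

name: issue_comment
on:
  issue_comment:
    types: [created]

jobs:
  check_comments:
    runs-on: ubuntu-latest
    steps:
      # peter-evans/slash-command-dispatch@v3 is built against an older Node.js
      # runtime, which is what triggers the deprecation annotation; GitHub now
      # forces such actions to run on node20.
      - uses: peter-evans/slash-command-dispatch@v3
        with:
          # Hypothetical secret name; the action needs a token with permission
          # to dispatch commands to the target repository.
          token: ${{ secrets.BOT_TOKEN }}
          # Hypothetical slash commands recognized in issue comments.
          commands: |
            rerun
            benchmark

Upgrading the action to a release built for Node 20, if one is available, removes the warning; otherwise GitHub simply forces the node20 runtime as the annotation states.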