This repository has been archived by the owner on Dec 20, 2024. It is now read-only.

Feature/44 make flash attention configurable #479

Triggered via pull request: December 19, 2024 17:14
Status: Failure
Total duration: 2m 3s
Artifacts

python-pull-request.yml

on: pull_request
quality / pre-commit-run (16s)
Matrix: checks

Annotations

1 error and 4 warnings
quality / pre-commit-run
Process completed with exit code 1.
checks (3.9) / Run pytest with Python 3.9 on ubuntu-latest
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
checks (3.11) / Run pytest with Python 3.11 on ubuntu-latest
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
checks (3.10) / Run pytest with Python 3.10 on ubuntu-latest
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
checks (3.10) / Run pytest with Python 3.10 on ubuntu-latest
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636