Actions: jploski/llama.cpp

Python check requirements.txt

3 workflow runs

CPU/CUDA: Gemma 2 FlashAttention support (#8542)
Python check requirements.txt #3: Commit e11bd85 pushed by jploski
master · August 24, 2024 21:38 · 2m 26s

Update CUDA ops ssm_conv and ssm_scan to match CPU implementation fro…
Python check requirements.txt #2: Commit 697fab6 pushed by jploski
mamba_cuda_pr7531 · June 2, 2024 16:15 · 3m 38s

SimpleChat: Simple histogram/repeatMatching driven garbageTrimming, S…
Python check requirements.txt #1: Commit 2ac95c9 pushed by jploski
master · June 1, 2024 18:02 · 29m 22s