certik/fastGPT

Fast GPT-2 inference written in Fortran

Languages: Fortran, Python, CMake, C, Shell. Topics: fortran, high-performance, gpt-2.
Stars and forks stats for the certik/fastGPT repository: as of 05 May 2024 it has 159 stars and 15 forks.

fastGPT

The progression of GPT-2 codes from the original to "minimal", "nano" and "pico":

- openai/gpt-2
- karpathy/minGPT
- karpathy/nanoGPT
- jaymody/picoGPT

fastGPT is very similar to picoGPT (very small and readable), but it is also fast (see the Benchmarks section below). Both the speed and the readability are achieved by using Fortran. I wrote a blog post introducing fastGPT.

fastGPT features:

- Fast? ✅
- Training code? ❌
- Batch inference? ❌
- top-p sampling? ❌
- top-k? ❌
- temperature? ❌
- categorical sampling?! ❌
- greedy?...
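The feature list above mentions greedy decoding. As a point of reference, here is a minimal sketch of what greedy sampling means in any GPT-style inference loop, written in Python (picoGPT's language). The `toy_logits` model is a hypothetical stand-in for a real forward pass, used only so the loop is runnable:

```python
import numpy as np

def greedy_decode(logits_fn, prompt_ids, n_tokens):
    """Greedy sampling: at each step, append the single
    highest-logit token and feed the sequence back in."""
    ids = list(prompt_ids)
    for _ in range(n_tokens):
        logits = logits_fn(ids)       # next-token logits for the sequence
        ids.append(int(np.argmax(logits)))  # greedy = argmax, no randomness
    return ids

# Hypothetical toy "model": always favors token (last_id + 1) mod 5.
def toy_logits(ids):
    logits = np.zeros(5)
    logits[(ids[-1] + 1) % 5] = 1.0
    return logits

print(greedy_decode(toy_logits, [0], 4))  # [0, 1, 2, 3, 4]
```

Because greedy decoding takes the argmax at every step, it is deterministic; top-k, top-p, and temperature (listed above as unimplemented in fastGPT) are different ways of injecting randomness into the same loop.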