chengchingwen/Transformers.jl

Julia Implementation of Transformer models

Languages: Julia, Jupyter Notebook. Topics: nlp, flux, machine-learning, natural-language-processing, deep-learning, transformer, attention
As of 2 May 2024, the chengchingwen/Transformers.jl repository has 463 stars and 58 forks.

Julia implementation of transformer-based models, with Flux.jl.

**Notice**: the current version is almost completely different from the 0.1.x version. If you are using the old version, make sure to update your code for the changes or stick to the old version.

## Installation

In the Julia REPL:

```
]add Transformers
```

## Example

Using pretrained BERT with Transformers.jl:

```julia
using Transformers
using Transformers.TextEncoders
using Transformers.HuggingFace

textencoder, bert_model = hgf"bert-base-uncased"
text1 = "Peter Piper picked...
```
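The README snippet above is cut off mid-example. A minimal end-to-end sketch of loading the pretrained model, encoding text, and running a forward pass might look like the following (the `encode` call and the model invocation follow the Transformers.jl text-encoder API; the sample sentence and the exact shape of the output are assumptions, not the original example's continuation):

```julia
using Transformers
using Transformers.TextEncoders
using Transformers.HuggingFace

# Download (or load from cache) the bert-base-uncased tokenizer and weights.
textencoder, bert_model = hgf"bert-base-uncased"

# Hypothetical sample sentence; the original example is truncated.
text1 = "Peter Piper picked a peck of pickled peppers."

# Tokenize the text and build the model input (token ids, segment ids, etc.).
sample = encode(textencoder, text1)

# Run BERT; the returned structure carries the final hidden states.
output = bert_model(sample)
```

Note that the `hgf"..."` string macro fetches both the text encoder and the model for a given HuggingFace model name, so switching to another pretrained checkpoint is just a matter of changing the string.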