lucidrains/PaLM-pytorch

Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways

Python · deep-learning · transformers · artificial-general-intelligence · attention-mechanism
Stars and forks stats for the lucidrains/PaLM-pytorch repository. As of 28 Apr 2024, this repository has 773 stars and 82 forks.

## PaLM - Pytorch

Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways, in less than 200 lines of code. This model is pretty much SOTA on everything language. Yannic Kilcher explanation.

It obviously will not scale, but it is just for educational purposes, to elucidate to the public how simple it all really is. Jax version also available.

### Install

```shell
$ pip install PaLM-pytorch
```

### Usage

```python
import torch
from palm_pytorch import PaLM

palm = PaLM(
    num_tokens = 20000,
    dim...
```
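Since the usage snippet above is truncated in the source, here is a minimal sketch of the "parallel" layer layout that the PaLM paper describes, where attention and feedforward both read the same pre-LayerNorm activations and are summed into the residual stream. This is an illustrative stand-in, not the repository's actual code: `ParallelBlock` is a hypothetical name, and `nn.MultiheadAttention` plus a SiLU feedforward substitute for PaLM's multi-query attention, SwiGLU, and rotary embeddings.

```python
import torch
from torch import nn

class ParallelBlock(nn.Module):
    """One PaLM-style 'parallel' layer: attention and feedforward are both
    computed from a single shared LayerNorm output and summed into the
    residual, instead of the usual sequential attn -> ff ordering.
    Hypothetical sketch; not the actual palm_pytorch implementation."""

    def __init__(self, dim, heads=8, ff_mult=4):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        # stand-in for PaLM's multi-query attention
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # stand-in for PaLM's SwiGLU feedforward
        self.ff = nn.Sequential(
            nn.Linear(dim, dim * ff_mult),
            nn.SiLU(),
            nn.Linear(dim * ff_mult, dim),
        )

    def forward(self, x):
        h = self.norm(x)  # one LayerNorm feeds both branches
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        return x + attn_out + self.ff(h)  # parallel residual sum

block = ParallelBlock(dim=64)
x = torch.randn(2, 16, 64)  # (batch, seq, dim)
out = block(x)
print(out.shape)  # torch.Size([2, 16, 64])
```

The parallel formulation lets the attention and feedforward matrix multiplies be fused and run concurrently, which is what made it attractive at PaLM's scale.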
| repo | techs | stars | weekly | forks | weekly |
|---|---|---|---|---|---|
| awslabs/gluon-ts | Python, Jupyter Notebook, Other | 3.8k | +11 | 739 | +3 |
| facebookresearch/pytorch3d | Python, C++, Cuda | 7.7k | 0 | 1.2k | 0 |
| bentoml/BentoML | Python, Shell, Starlark | 5.7k | 0 | 651 | 0 |
| uber/causalml | Python, Cython, Makefile | 4.3k | 0 | 711 | 0 |
| n64decomp/sm64 | C, C++, Assembly | 7k | 0 | 1.2k | 0 |
| qemu/qemu | C, C++, Python | 8.4k | +17 | 5k | +4 |
| linux-nvme/nvme-cli | C, Python, Shell | 1.3k | 0 | 612 | 0 |
| colmap/colmap | C++, C, Python | 5.8k | 0 | 1.3k | 0 |
| fernandocar86/seminario-gramaticas-formales | GAP, Prolog, Jupyter Notebook | 70 | 1 | 8 | 0 |
| projectcalico/calico | Go, C, Python | 5k | 0 | 1.2k | 0 |