lucidrains/x-transformers

A simple but complete full-attention transformer with a set of promising experimental features from various papers

Tags: Python, deep-learning, transformers, artificial-intelligence, attention-mechanism
As of 23 Apr 2024, this repository has 3,459 stars and 300 forks.

x-transformers

A concise but fully-featured transformer, complete with a set of promising experimental features from various papers.

Install

```bash
$ pip install x-transformers
```

Usage

Full encoder / decoder:

```python
import torch
from x_transformers import XTransformer

model = XTransformer(
    dim = 512,
    enc_num_tokens = 256,
    enc_depth = 6,
    enc_heads = 8,
    enc_max_seq_len = 1024,
    dec_num_tokens = 256,
    dec_depth = 6,
    dec_heads = 8,
    dec_max_seq_len = 1024,
    tie_token_emb = True
)
```

(The README excerpt is truncated here.)