avik-pal/ExplicitFluxLayers.jl

Explicitly Parameterized Neural Networks in Julia

Topics: Julia, machine-learning, deep-learning, neural-networks
Stars and forks statistics for the avik-pal/ExplicitFluxLayers.jl repository. As of 05 May 2024, this repository has 350 stars and 30 forks.

The 🔥 Deep Learning Framework

Installation

] add Lux

Getting Started

using Lux, Random, Optimisers, Zygote
# using LuxCUDA, LuxAMDGPU # Optional packages for GPU support

# Seeding
rng = Random.default_rng()
Random.seed!(rng, 0)

# Construct the layer
model = Chain(BatchNorm(128), Dense(128, 256, tanh), BatchNorm(256),
              Chain(Dense(256, 1, tanh), Dense(1, 10)))

# Get the device determined by Lux
device = gpu_device()

# Parameter and State Variables
ps, st = Lux.setup(rng, model) .|> ...
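Continuing from the truncated snippet above, the following is a minimal sketch of what a forward pass and a single gradient step might look like with this model. The dummy input x, the sum-of-outputs "loss", and the Adam learning rate are illustrative choices and not part of the README excerpt; moving ps and st to the device via .|> device is likewise an assumption about how the truncated line continues.

using Lux, Random, Optimisers, Zygote

rng = Random.default_rng()
Random.seed!(rng, 0)

# Same model as in the excerpt above
model = Chain(BatchNorm(128), Dense(128, 256, tanh), BatchNorm(256),
              Chain(Dense(256, 1, tanh), Dense(1, 10)))

device = gpu_device()  # falls back to CPU when no GPU backend is loaded

# Assumed continuation of the truncated line: move parameters and state to the device
ps, st = Lux.setup(rng, model) .|> device

# Dummy input: 128 features x 4 samples (shape chosen for illustration)
x = rand(rng, Float32, 128, 4) |> device

# Forward pass: Lux layers are stateless, so parameters and state are passed explicitly
y, st = Lux.apply(model, x, ps, st)

# One gradient step with an illustrative sum-of-outputs loss
gs = Zygote.gradient(p -> sum(first(Lux.apply(model, x, p, st))), ps)[1]
opt_state = Optimisers.setup(Adam(3f-4), ps)
opt_state, ps = Optimisers.update(opt_state, ps, gs)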
Repo | Techs | Stars | Stars (weekly) | Forks | Forks (weekly)
IbrahimSobh/Segmentation | Jupyter Notebook | 67 | 0 | 10 | 0
maxim5/cs229-2018-autumn | Jupyter Notebook, HTML, Python | 1k | 0 | 504 | 0
hughperkins/VeriGPU | SystemVerilog, C++, Python | 418 | 0 | 50 | 0
Oneflow-Inc/oneflow | C++, Python, Cuda | 5.3k | 0 | 615 | 0
onnx/onnx-tensorrt | C++, Python, Other | 2.6k | +5 | 526 | 0
sebastianruder/NLP-progress | Python, HTML, Ruby | 21.9k | 0 | 3.6k | 0
pyg-team/pytorch_geometric | Python, Other | 18.6k | 0 | 3.4k | 0
apachecn/sklearn-doc-zh | CSS, JavaScript, Python | 4.9k | 0 | 1.5k | 0
ianfiske/julia-svm | Julia | 2 | 0 | 0 | 0
JuliaDocs/Documenter.jl | Julia, SCSS, JavaScript | 746 | 0 | 457 | 0