ashpil/moonshine

A general purpose ray traced renderer built with Zig + Vulkan


# Moonshine

A general purpose GPU ray traced renderer built with Zig + Vulkan

*Salle de bain by nacimus, rendered with Moonshine*

## Features

### Binaries
* offline -- a headless offline renderer
* online -- a real-time windowed renderer, interactive features WIP

### Light Transport
* Global illumination
* Direct light sampling with multiple importance sampling for all lights and materials

### Lights
* 360° environment maps
* Emissive meshes

### Materials
* Standard PBR with metallic + roughness
* Mirror
* Glass

## Dependencies

### Build
* zig 0.12.0-dev.168+a31748b29
* DirectXShaderCompiler
* For the online (real-time) renderer:
    * On Linux (Ubuntu; similar on other distributions):
        * For Wayland: wayland-protocols libwayland-dev libxkbcommon-dev
        * For X11: libxcursor-dev libxrandr-dev libxinerama-dev libxi-dev
    * Should work on Windows without additional dependencies

### Run
* A GPU supporting Vulkan ray tracing

## TODO

### Features
* Bloom
* Tonemapping
* HDR display
* More camera models
    * Orthographic
* Materials
    * Metal
    * Rough metal
    * Rough glass
    * Plastic
    * Rough plastic
    * Mix
    * Layer

### Code
* Make sure we have all necessary errdefers
* Proper memory allocation interface
* Reduce unnecessary copying

## Current jankiness

### Asset system
* Currently, one can either construct a scene manually in code or, very inefficiently, import a glb
* Ideally there would be a custom scene description format that can be quickly deserialized
    * And a Blender export addon for this format, so other formats don't need to be supported in the engine directly
* I think this custom format would distinguish between scene data and staging data: it would only contain actual information about the world, while things like camera position would be kept separate

### Light system
* Currently, we only support skybox and mesh lights, which I think makes sense
* Both are explicitly sampled using the alias method, built on the CPU
* But we'd like to support more dynamic meshes, which means we should build the mesh sampling structures on the GPU
* Not sure about the proper route -- build an inversion sampler on the GPU in compute?
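For context, the alias method mentioned above turns a discrete distribution (e.g. emissive triangles weighted by power) into a table that can be sampled in O(1) with two random numbers, after an O(n) CPU build. A minimal illustrative sketch in Python (the engine itself is Zig/HLSL; all names here are hypothetical, not Moonshine's API):

```python
import random

def build_alias_table(weights):
    """Build an alias table (Vose's method) from unnormalized weights, O(n)."""
    n = len(weights)
    total = sum(weights)
    prob = [w * n / total for w in weights]  # scaled so the average is 1
    alias = [0] * n
    small = [i for i, p in enumerate(prob) if p < 1.0]
    large = [i for i, p in enumerate(prob) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l                  # s's leftover slot is covered by l
        prob[l] -= 1.0 - prob[s]      # l gave away some of its mass
        (small if prob[l] < 1.0 else large).append(l)
    for i in small + large:           # leftovers are ~1.0 up to float error
        prob[i] = 1.0
    return prob, alias

def sample_alias(prob, alias, rng=random):
    """Draw an index in O(1): pick a column, then flip a biased coin."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]
```

Because each draw needs only a uniform column index and one comparison, the `(prob, alias)` pairs map naturally onto a GPU buffer, which is one reason the method is attractive for light sampling.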
### Memory management
* A lot of unnecessary copying happens during scene construction at the moment:
    * Filesystem to RAM
    * RAM to staging buffer
    * Staging buffer to GPU
* Ideally this can be vastly reduced, depending on hardware:
    * At most we should be copying from the filesystem to the staging buffer
    * On some machines, we can copy from the filesystem to the GPU directly
* Destruction queue needs work

## Some notes about conventions

* +z is up
* phi is the azimuthal angle (0-2pi) and theta is the polar angle (0-pi)

## Some light reading

* Importance sampling
* Explicit light sampling
* Multiple importance sampling
* Microfacets
* Actual materials - ton of BRDF examples, in CODE!
* Better sky

## License

This project is licensed under the AGPL.
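As an appendix to the conventions noted above (+z up, phi azimuthal in [0, 2pi), theta polar in [0, pi]), the corresponding mapping between spherical angles and Cartesian direction vectors can be sketched as follows. This is an illustrative Python snippet, not code from the engine (whose shaders are HLSL):

```python
import math

def spherical_to_direction(phi, theta):
    """Unit direction under the +z-up convention: theta is the polar
    angle measured from +z, phi the azimuthal angle measured from +x."""
    sin_t = math.sin(theta)
    return (sin_t * math.cos(phi), sin_t * math.sin(phi), math.cos(theta))

def direction_to_spherical(x, y, z):
    """Inverse mapping for a unit vector; phi is wrapped into [0, 2*pi)."""
    theta = math.acos(max(-1.0, min(1.0, z)))      # clamp guards float error
    phi = math.atan2(y, x) % (2.0 * math.pi)
    return phi, theta
```

Under this convention `theta = 0` points straight up along +z, and the equator (`theta = pi/2`) lies in the xy-plane.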