
Material Modules Benchmark

See the charts here

How it works

  1. Commits to the master branch of Slang trigger a CI workflow that runs the benchmark and uploads the resulting JSON file to this repository (specifically to benchmarks.json).
  2. Changes to benchmarks.json trigger a CI workflow in this repository.
  3. This workflow updates the gh-pages branch using github-action-benchmark: it reads the new results from benchmarks.json and updates a database stored in gh-pages.
  4. Once gh-pages receives the update, it will trigger the final CI workflow that builds the GitHub Pages site.

The diagram below summarizes these steps.

[Diagram: overview of the benchmark workflow]

Notes:

  • In (1), the workflow from the Slang repository overwrites benchmarks.json when it pushes to this repository; this is expected behaviour.
  • This repository contains another file, current, which holds the latest commit's message and hash for debugging purposes.
  • Each time benchmarks.json is updated (2), the github-action-benchmark workflow reads its contents and appends to a database that is embedded in a JavaScript file.
  • There is currently no mechanism to limit the number of entries in the database, so it can grow to a couple of megabytes. The data can be trimmed manually by editing the JavaScript file and removing entries; a minimal sketch of such a script is shown below.
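
The sketch below is one possible way to do that trimming. It assumes the benchmark data lives in a data.js file of the form window.BENCHMARK_DATA = { ... } with an entries map keyed by benchmark suite, which is the default layout written by github-action-benchmark; the file name, the exact structure, and the KEEP_LAST value are assumptions to verify against the actual gh-pages contents.

```python
# trim_bench_data.py -- minimal sketch for pruning old benchmark results.
# ASSUMPTION: the gh-pages branch contains a data.js file of the form
#   window.BENCHMARK_DATA = { ..., "entries": { "<suite name>": [ ... ] } }
# (the default layout used by github-action-benchmark). Verify the file
# name and structure before running this against the real branch.
import json
from pathlib import Path

KEEP_LAST = 200            # how many results to keep per benchmark suite
DATA_JS = Path("data.js")  # path inside a checkout of the gh-pages branch

text = DATA_JS.read_text(encoding="utf-8")
prefix = "window.BENCHMARK_DATA = "
payload = text[text.index(prefix) + len(prefix):].strip().rstrip(";")
data = json.loads(payload)

# Keep only the most recent KEEP_LAST entries for each benchmark suite.
for suite, entries in data.get("entries", {}).items():
    data["entries"][suite] = entries[-KEEP_LAST:]

DATA_JS.write_text(prefix + json.dumps(data, indent=2) + "\n", encoding="utf-8")
```

Running this from a checkout of the gh-pages branch and committing the rewritten data.js should shrink the file while keeping recent history intact.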

Running the benchmark locally

It is possible to run the benchmark locally to get immediate results or to customize them. This requires cloning both the Slang repository and the MDL-SDK fork. Then, starting from the root of the Slang repository:

  1. Build the Release version of slangc; it should be located in build/Release/bin/slangc.
  2. Change directory to tool/benchmark within the Slang repository.
  3. Copy the Slang shaders:
    • From <MDL-SDK REPO ROOT>/examples/mdl_sdk/dxr/content/slangified (*.slang files specifically)
    • To tool/benchmark
  4. Run the compile.py script using Python (3.12+ recommended):
    • Requires a few light packages, which can be installed with pip using pip install prettytable (argparse is also needed but is part of the Python standard library, so it does not require installation).
    • Linux users may have to tweak the script so that the slangc variable ends with slangc instead of slangc.exe.
    • Script options:
      • --target to select target mode:
        • spirv generates SPIRV directly.
        • spirv-glsl generates SPIRV via GLSL.
        • dxil generates DXIL through DXC.
        • dxil-embedded generates DXIL through DXC, but precompiles Slang modules to DXIL.
      • --samples to set the number of times the measurements are repeated and averaged.
      • --output to set the path to the JSON file where results will be stored.
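
Putting the steps together, a typical local run might look like the following (shell commands for a Linux-like environment; the option values are only illustrative):

```
# From the root of the Slang repository, after building slangc in Release
# (so that build/Release/bin/slangc exists):
cd tool/benchmark

# Copy the Slang shaders from the MDL-SDK fork.
cp <MDL-SDK REPO ROOT>/examples/mdl_sdk/dxr/content/slangified/*.slang .

# Run the benchmark; adjust --target, --samples, and --output as needed.
python compile.py --target dxil --samples 10 --output benchmarks.json
```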

The script will output timings in a Markdown-friendly format, as shown below:

[Screenshot: example script output]

How to read the charts

  • Each chart is titled with the format <SHADER STAGE> : <COMPILATION MODE> : <TARGET>
    • <SHADER STAGE> is one of closesthit, anyhit, or shadow.
    • <COMPILATION MODE> is either mono for monolithic compilation or module for modular compilation.
    • <TARGET> is currently fixed to DXIL.
      • Other targets can be generated by running the benchmarking script with a different target (DXIL or SPIRV; with or without precompiled modules).
  • The $x$-axis tracks the commit hash; unfortunately, there is currently no way to display concrete dates.
  • The $y$-axis shows the time, in milliseconds, taken to compile the specific shader stage under the particular compilation mode and target.

Interacting with the charts

If a commit results in alarming measurements, there is a convenient way to reach the original commit/PR. Each data point on each graph can be highlighted like so:

[Screenshot: a highlighted data point on a chart]

Clicking on the node redirects to the associated commit in this repository.

[Screenshot: the associated commit in this repository]

Clicking on the highlighted link then redirects the user to the original commit/PR in the Slang repository.

[Screenshot: the original commit/PR in the Slang repository]
