Jure Triglav described a method for compressing 1.8 GB of baked lighting imagery into roughly 200 kB worth of neural networks.
Back in early September, computer scientist Jure Triglav published a comprehensive piece describing a novel approach to real-time lighting. In an in-depth article titled "Compressing Global Illumination with Neural Networks", he demonstrated an unconventional use of neural networks for lighting, showing how they can compress 1.8 GB worth of baked images into a roughly 600 kB demo, "of which 400 kB are dependencies."
According to Jure, the idea was to train a neural network on all the different lighting scenarios for a specific scene and then run inference in real time to obtain the desired fragment color. To test it out, he modeled a scene in Blender with a single rotating light and trained neural networks to produce ten individual fragment shaders, each handling a different lighting scenario for the scene. The final result can be seen in the demo attached above.
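The core idea can be illustrated with a toy experiment. The sketch below is hypothetical and not the author's code: it trains a tiny NumPy MLP to map a fragment's position and the light's angle to a brightness value, standing in for a baked lightmap. In the actual project, the trained network outputs the fragment color and runs inside a GLSL fragment shader; here, the `baked_lighting` function is an invented stand-in for real baked data.

```python
# Hypothetical sketch of the technique: compress "baked lighting" into a
# small neural network by learning a mapping (u, v, light angle) -> brightness.
# Not Triglav's implementation -- his networks output RGB and run in GLSL.
import numpy as np

rng = np.random.default_rng(0)

def baked_lighting(u, v, angle):
    # Toy stand-in for a baked lightmap: brightness varies smoothly with
    # fragment position and with the rotating light's angle.
    return 0.5 + 0.5 * np.sin(2 * np.pi * u) * np.cos(2 * np.pi * v + angle)

# Training set: random fragments; inputs (u, v, normalized angle) in [0, 1].
X = rng.random((4096, 3))
y = baked_lighting(X[:, 0], X[:, 1], 2 * np.pi * X[:, 2])[:, None]

# A single hidden layer of 32 tanh units: a few kB of weights replaces
# what would otherwise be a large table of baked images.
W1 = rng.normal(0, 0.5, (3, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 1)); b2 = np.zeros(1)

lr = 0.1
for step in range(5000):
    h = np.tanh(X @ W1 + b1)     # hidden activations
    pred = h @ W2 + b2           # predicted brightness per fragment
    err = pred - y
    # Full-batch backprop with a mean-squared-error loss.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final MSE: {mse:.4f}")
```

Once trained, the weight matrices are small enough to be embedded as constants in a fragment shader, where the forward pass (two matrix multiplies and a `tanh`) is evaluated per pixel; that per-pixel inference is what replaces sampling the original baked textures.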
"Given the mentioned constraints it is unlikely that this technique in its current state will be useful beyond a very narrow scope of projects, but I firmly believe that making pixels smart is just beginning to show its potential," commented the author. "The ability to successfully embed a useful neural network in a fragment shader and still have it perform well was quite a surprise for me."