At almost two hours long, this video tutorial is focused on shading and shading only. Using just the parameters exposed by default in the standard Mantra Surface shader, I'm going to create 26 materials representing different types of surfaces: wood, metal, glass, plastic, concrete, paint, etc.
The scene used in this training video is from a real production, a short film that a friend and I are currently working on.
On a few occasions I cut out render time to keep the video easy to follow.
There is no voice-over, but short tips are displayed on screen from time to time.
A small note: Remember that the Mantra Surface Shader is not a black box and that it can be heavily modified if you dig inside.
Make sure you don't miss any tutorials by sending an email with the subject "subscribe" to firstname.lastname@example.org or by following me on Twitter: AdrianLazar3D
Real-time Rendering and Simulation
To synchronize multiple simulations across multiple machines, I migrated the first step of the simulation to a server and streamed the results to clients over the WebSocket protocol.
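A minimal sketch of the message framing such streaming might use, in Python. The field names are assumptions, and the actual transport (e.g. a server built on the `websockets` library) is omitted:

```python
import json

def pack_sim_frame(frame, positions):
    """Serialize one solved simulation step for broadcast to clients.
    An explicit frame index lets each client detect dropped or
    out-of-order messages and stay in lockstep with the others."""
    return json.dumps({"frame": frame, "positions": positions})

def unpack_sim_frame(message):
    """Decode a streamed frame back into (frame index, point positions)."""
    data = json.loads(message)
    return data["frame"], data["positions"]
```

Each client only ever renders the latest frame it has received, so a slow machine skips frames rather than drifting out of sync.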
Lighting and Color Study
I wrote a GLSL lighting system with support for directional, point, and ambient lights.
Created an HTML GUI for lighting and color study.
Shading and Effects
I wrote several GLSL shaders with skinning, procedural animation, lighting, texturing, transparency, and fog. Volumetric shadows are faked with camera- and light-facing sprites.
Textures painted with Autodesk Mudbox
Created several procedural modeling networks in Houdini.
Wrote a Houdini-to-WebGL geometry exporter using Python.
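The core of such an exporter is flattening geometry into the flat, indexed arrays that WebGL vertex buffers expect. A sketch of that serialization step, assuming points and triangles have already been read from Houdini's `hou` module (which is omitted here):

```python
import json

def export_geometry(points, triangles):
    """Flatten point positions and triangle indices into a JSON payload
    that a WebGL client can upload directly into buffer objects.

    points    -- list of [x, y, z] positions
    triangles -- list of [i0, i1, i2] point-index triples
    """
    payload = {
        # interleaved xyz floats, ready for a Float32Array
        "positions": [c for p in points for c in p],
        # flat index list, ready for a Uint16Array / drawElements
        "indices": [i for tri in triangles for i in tri],
        "vertexCount": len(points),
    }
    return json.dumps(payload)
```

On the client, a few lines of JavaScript turn the decoded arrays into `gl.ARRAY_BUFFER` and `gl.ELEMENT_ARRAY_BUFFER` uploads.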
Web-based 3D Asset Viewer
List of features:
1. Dynamically loading/unloading models from a web server.
2. Lighting and animation controls.
3. Rendering to texture.
4. Post effects such as depth of field and screen-space ambient occlusion.
5. Awesome HTML5 interface made with customized dat.gui library.
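One CPU-side piece of the screen-space ambient occlusion effect is generating the hemisphere sampling kernel that gets passed to the shader. A sketch in Python; the sample count and bias curve are typical choices, not values from the viewer itself:

```python
import math
import random

def ssao_kernel(n, seed=0):
    """Generate n sample offsets in the z >= 0 hemisphere for SSAO.
    Samples are scaled so more of them cluster near the origin,
    which improves occlusion detail close to the shaded point."""
    rng = random.Random(seed)
    samples = []
    for i in range(n):
        v = [rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0), rng.uniform(0.0, 1.0)]
        length = math.sqrt(sum(c * c for c in v)) or 1.0
        v = [c / length for c in v]          # unit vector in the hemisphere
        scale = 0.1 + 0.9 * (i / n) ** 2     # bias samples toward the center
        samples.append([c * scale for c in v])
    return samples
```

The resulting offsets are uploaded once as a uniform array and rotated per-pixel with a small noise texture to hide banding.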
Used the OpenNI C++ library and modified one of the existing body-tracking examples to send the user's skeleton data from a Kinect depth sensor to a web server over a TCP connection. On the server side, I forwarded the data to a web client over the WebSocket protocol.
Used the skeleton data to articulate and propel a physics-based camera rig in the browser.
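A sketch of the browser-side idea in Python: pull one joint out of a forwarded skeleton message and use it as the target of a damped-spring camera, so the rig follows the tracked user smoothly instead of snapping. The JSON layout and spring constants are assumptions, not the OpenNI wire format:

```python
import json

def head_target(message):
    """Extract the head joint position from a forwarded skeleton message.
    Assumed layout: {"joints": {"head": [x, y, z], ...}}."""
    return json.loads(message)["joints"]["head"]

def step_camera(pos, vel, target, stiffness=8.0, damping=4.0, dt=1.0 / 60.0):
    """One semi-implicit Euler step of a damped-spring camera rig:
    the camera is pulled toward the tracked target each frame."""
    accel = [stiffness * (t - p) - damping * v
             for p, v, t in zip(pos, vel, target)]
    vel = [v + a * dt for v, a in zip(vel, accel)]
    pos = [p + v * dt for p, v in zip(pos, vel)]
    return pos, vel
```

Because the spring filters the raw tracking data, jitter in the Kinect skeleton never reaches the camera directly.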
Selected Older Work
Energy Plant - still image (Maya, Photoshop)
Fractal Broccoli – still image (PRMan and a text editor)
Paper Shutter – visual effects (Maya)