CG, Rendering, Shaders

⏱️Overview

Introduction to real-time computer graphics (in games, visualizations, etc.) through rasterisation.
A short overview, mainly gathering topics, with many links to more detailed websites.

💡Rendering types

Rendering – a very good page with a list of features (common tasks in rendering).

Three main rendering types / techniques are available:

🔺Rasterisation

Rasterisation – transforming vector graphics data (fonts, images, 3D models, scenes, etc.) into a 2D image (lines of pixels on screen).

The most common process in games and other real-time visualization software.
Look and performance (frames per second) are limited by GPU hardware and computational power.

Rasterisation is optimized for the GPU, so one is required. A GPU diagram shows that a GPU has many more cores (computational units) than a CPU: high-end GPUs now have thousands, budget ones hundreds, and integrated laptop GPUs tens. A few pictures of GPU internals are at the end here.
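As a tiny illustration of what rasterisation means at its core, here is a minimal Python sketch that tests which pixel centres fall inside one 2D screen-space triangle using edge functions. All names and numbers are illustrative; a real GPU does this massively in parallel, in hardware.

```python
def edge(a, b, p):
    """Signed area of triangle (a, b, p); >= 0 means p is on the
    left side of edge a->b (counter-clockwise winding)."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterise(v0, v1, v2, width, height):
    """Return the set of pixel coordinates covered by the triangle."""
    covered = set()
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)          # sample at the pixel centre
            if (edge(v0, v1, p) >= 0 and
                edge(v1, v2, p) >= 0 and
                edge(v2, v0, p) >= 0):
                covered.add((x, y))
    return covered

# A counter-clockwise triangle inside a 10x10 "screen":
pixels = rasterise((1, 1), (8, 1), (1, 8), 10, 10)
```

The same inside-outside test is also where the GPU interpolates depth, colors, and texture coordinates across the triangle.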

☀️Ray tracing

Ray tracing produces the most photorealistic, movie-quality look. It is the most computationally expensive, so not real-time, but GPUs have recently become powerful enough that some simpler scenes can be rendered this way.
It works by casting rays from the camera and is very complex, since rays can bounce, reflect, refract, etc. many times. This complexity is not friendly for GPU shaders, which perform better with simpler code like in rasterisation.
Usually associated with big, powerful 3D software that can work for minutes or days to produce a single picture, e.g. Blender.
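The core primitive of ray tracing, a camera ray tested against a sphere, can be sketched in a few lines of Python (illustrative names; a real ray tracer adds bouncing, materials, shadow rays, etc.):

```python
import math

def ray_sphere(origin, direction, centre, radius):
    """Return the distance t to the nearest hit, or None on a miss.
    Solves |origin + t*direction - centre|^2 = radius^2 for t,
    assuming direction is normalised (so the quadratic's a = 1)."""
    oc = tuple(o - c for o, c in zip(origin, centre))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                          # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0         # nearer of the two roots
    return t if t > 0 else None

# Camera at the origin looking down -z, unit sphere 5 units away:
t = ray_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)   # hits at t = 4
```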

🔮Ray marching

Similar to ray tracing but much less complex. Ray marching steps along each ray using a signed distance function (SDF), making it less computationally expensive, but it requires rather basic 3D objects (ones that have an SDF). With SDFs some effects are also easier to achieve in real time (like AO, glow, etc.). It seems quite popular for shader demos and 3D fractals.
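A minimal ray-marching sketch in Python, using the simplest possible SDF (a sphere): step along the ray by the SDF value, which can never overshoot the surface, until the distance is small enough. All names are illustrative.

```python
import math

def sdf_sphere(p, centre=(0.0, 0.0, -5.0), radius=1.0):
    """Signed distance from point p to the sphere surface."""
    return math.dist(p, centre) - radius

def ray_march(origin, direction, sdf, max_steps=64, eps=1e-4):
    """Return the distance along the ray to the surface, or None."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:
            return t        # close enough to the surface: hit
        t += d              # safe step: the SDF can't overshoot
        if t > 100.0:
            break           # marched too far: miss
    return None

t = ray_march((0, 0, 0), (0, 0, -1), sdf_sphere)   # hits near t = 4
```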


From now on I will focus only on ⚙️Rasterisation.

⚙️Pipeline⏩

Old GPUs (before 2001) had only a fixed-function pipeline; its steps are shown on this image. They did all the steps in hardware, with only some adjustments possible.
Now we do vertex transformation and lighting ourselves, with shaders in code, allowing plenty of customization. Modern GPUs don't have the limits on shader code that the old ones had.
An old basic demo of Quake 3 shows the first cool effects with vertex shaders.
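What the (now programmable) vertex-transform stage does can be sketched very simply: multiply each vertex, in homogeneous coordinates, by a 4×4 matrix. A Python sketch with illustrative names (a vertex shader would write the result to gl_Position):

```python
def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major, list of rows) by a vec4."""
    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(4))

# A translation by (2, 0, 0) expressed as a 4x4 model matrix:
model = [
    [1, 0, 0, 2],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
]

# A vertex position in homogeneous coordinates (w = 1):
position = (1.0, 0.0, 0.0, 1.0)
world_position = mat_vec(model, position)   # -> (3.0, 0.0, 0.0, 1.0)
```

In a real pipeline the model, view, and projection matrices are multiplied together first, and the GPU applies the combined matrix to every vertex.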

Image search for diagram. How it works video.

A few links with pipeline steps etc.:

  • Graphics_pipeline good info on wikipedia, with pipeline under Shader
  • 3D Graphics with OpenGL – long and very good tutorial page with everything including primitives, matrices, frustum, lighting.
  • khronos wiki pipeline with short info on steps
  • Introduction to OpenGL and GLSL – shows OpenGL 2, 3 and 4 pipelines, primitives etc. From 2014, long page with code and VS 2012.
  • Simpler pipeline for WebGL
  • Lots of documentation with links and a few specific pipelines, for GL 4, DX12, Vulkan, Metal, etc.
  • Very old but detailed pipeline for DirectX9

🎓Math

It is rather important, especially when creating an engine, but less so if you're already using an engine, which provides useful functions.

A few posts with math, like 4-component vectors and 4×4 matrices.
Again, the same page 3D Graphics with OpenGL has matrices for rotation, projection, etc.

Video1 is a general tutorial, Video2 covers projection.
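The perspective projection matrix those tutorials derive can be sketched in Python (OpenGL-style conventions: right-handed view space, clip-space z mapped to [-1, 1]; the helper names are illustrative):

```python
import math

def perspective(fov_y_deg, aspect, near, far):
    """Build a 4x4 perspective projection matrix (row-major)."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0, 0, 0],
        [0, f, 0, 0],
        [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0, 0, -1, 0],
    ]

def project(m, v):
    """Transform vec4 v and do the perspective divide by w."""
    clip = tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(4))
    w = clip[3]
    return tuple(x / w for x in clip[:3])    # normalised device coords

proj = perspective(90.0, 1.0, 0.1, 100.0)
ndc = project(proj, (0.0, 0.0, -0.1, 1.0))  # a point on the near plane
```

A point exactly on the near plane comes out with NDC z = -1, and one on the far plane with z = +1, which is what the depth buffer then stores (after remapping).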

📊Features (steps)

Glossary of terms on wiki.

List of common shading algorithms

🔨Basic

  • Some culling related: Hidden surface determination, Z-buffer
  • Shading – color and brightness for a surface from light
  • Lighting computation – Specular
    Blinn-Phong, Specular highlight Cook-Torrance
  • Texture mapping – a method of applying detail to surfaces.
    Also texture related are Mipmaps and anisotropic filtering.
    With textures having an alpha (transparency) value, we can render e.g. tree leaves and fences with a simple alpha test, or glass with alpha blending. Alpha-tested surfaces still write to the depth buffer, so they are easier; glass does not, and needs to be sorted back to front (unless order-independent transparency, OIT, is used).
  • Fog – for a non-clear atmosphere; blends everything toward the fog color with distance.
    The oldest effect. It was fixed-function, now done in shaders, and has to be applied the same way to everything. There can be different fog types, like height-based fog.
  • other webgl textured spotlight
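The Blinn-Phong lighting mentioned above can be sketched per pixel like this (Python, vectors as tuples, illustrative names; a fragment shader runs exactly this math per fragment):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalise(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def blinn_phong(normal, light_dir, view_dir, shininess=32.0):
    """Return (diffuse, specular) intensities in [0, 1]."""
    n = normalise(normal)
    l = normalise(light_dir)
    diffuse = max(dot(n, l), 0.0)
    # Blinn's trick: use the half vector between light and view
    # directions instead of the reflected vector.
    h = normalise(tuple(a + b for a, b in zip(l, normalise(view_dir))))
    specular = max(dot(n, h), 0.0) ** shininess
    return diffuse, specular

# Light and viewer straight above a surface facing up:
diff, spec = blinn_phong((0, 0, 1), (0, 0, 1), (0, 0, 1))
```

The final pixel color then combines ambient, diffuse × light color × surface color, and specular × light color.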

🪄Normal map

Detail added with extra normal map (texture) and some more per pixel (fragment) shader code.
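The per-fragment side of normal mapping can be sketched like this (Python, illustrative names): unpack the RGB texel from [0, 255] into a [-1, 1] tangent-space normal, then rotate it into world space with the TBN (tangent, bitangent, normal) basis.

```python
def unpack_normal(texel):
    """Map an 8-bit RGB texel to a [-1, 1] normal vector."""
    return tuple(c / 255.0 * 2.0 - 1.0 for c in texel)

def tbn_transform(n, tangent, bitangent, normal):
    """World-space normal = n.x * T + n.y * B + n.z * N."""
    return tuple(n[0] * t + n[1] * b + n[2] * nn
                 for t, b, nn in zip(tangent, bitangent, normal))

# The classic "flat" normal-map color (128, 128, 255) means
# "no perturbation": it should come out close to the surface normal.
flat = unpack_normal((128, 128, 255))
world = tbn_transform(flat, (1, 0, 0), (0, 1, 0), (0, 0, 1))
```

The perturbed world normal is then fed into the usual lighting computation, which is what makes flat surfaces appear bumpy.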

🌒Shadows

Shadows – the effect of obstructing light. shadow web demo, other demo.

Soft shadows – smoothly varying shadows from partially obscured light sources. VSM web demo

For large-scale scenes, cascaded shadow maps (several maps covering increasing distances) are used:
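The core of shadow mapping can be sketched like this (Python, with a hypothetical 1D depth "map" for brevity; a real one is a 2D depth texture rendered from the light's point of view):

```python
def build_shadow_map(occluder_depths):
    """Depth pass from the light: keep the nearest depth per texel."""
    shadow_map = [float("inf")] * len(occluder_depths[0])
    for depths in occluder_depths:
        for i, d in enumerate(depths):
            shadow_map[i] = min(shadow_map[i], d)
    return shadow_map

def in_shadow(shadow_map, texel, depth, bias=0.005):
    """A point is shadowed if something else is closer to the light.
    The small bias avoids self-shadowing ("shadow acne")."""
    return depth - bias > shadow_map[texel]

# One occluder covering texels 1-2 at depth 3 (texels 0 and 3 empty):
smap = build_shadow_map([[9, 3, 3, 9]])
lit    = not in_shadow(smap, 0, 5.0)   # nothing in front of this point
shaded =     in_shadow(smap, 1, 5.0)   # occluder at depth 3 is closer
```

Soft-shadow techniques (PCF, VSM) then filter several such comparisons per pixel instead of taking a single hard yes/no result.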

✨Shaders

Shader wiki. Languages: GLSL, HLSL. Shaders are basically pieces of code (with computations) running on the GPU, with structures of data passed between the stages and the GPU.

The main types are vertex shaders, operating on vertices (which can have 3D positions, normals, texture coordinates, etc.) and able to transform them, e.g. reading from height map texture(s) for terrain or a water surface. The second type is fragment / pixel shaders, used heavily for shading, lighting, normal mapping, etc.

🚶Animation

A vertex shader blends transform values from several bones into each vertex.

web demo with sliders for animations blend.
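Linear blend skinning, the blending a skinning vertex shader does, can be sketched like this (Python; plain 2D offsets stand in for the usual 4×4 bone matrices, and all names are illustrative):

```python
def skin(vertex, bone_offsets, weights):
    """Blend per-bone transforms (here: plain offsets) by weight."""
    assert abs(sum(weights) - 1.0) < 1e-6, "weights must sum to 1"
    x, y = vertex
    out_x = sum(w * (x + ox) for w, (ox, _) in zip(weights, bone_offsets))
    out_y = sum(w * (y + oy) for w, (_, oy) in zip(weights, bone_offsets))
    return (out_x, out_y)

# A vertex pulled half-way between a bone moving +4 in x
# and a bone that stays put:
pos = skin((1.0, 0.0), [(4.0, 0.0), (0.0, 0.0)], [0.5, 0.5])
```

In a real engine each vertex typically carries up to four bone indices and weights, and the bone matrices come from the animation system each frame.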

🌐Tessellation

A newer type of shader that allows adding more triangles, e.g. giving more geometry detail up close. It usually reads a texture to get the height.
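How a tessellation control stage might pick a subdivision level from camera distance can be sketched like this (Python; the thresholds and halving scheme here are made up for illustration):

```python
def tess_level(distance, max_level=6):
    """Drop one subdivision level roughly every doubling of distance."""
    level = max_level
    threshold = 10.0              # full detail within 10 units
    while distance > threshold and level > 1:
        level -= 1
        threshold *= 2.0
    return level

near_level = tess_level(5.0)      # close to the camera: most triangles
far_level = tess_level(500.0)     # far away: fewest triangles
```

The tessellation evaluation stage then positions the generated vertices, e.g. displacing them by a height texture.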

Tessellation wiki

Videos:

Modern OpenGL Tutorial – Tessellation Shaders – 2nd half has 2D example
Interactive Graphics 18 – Tessellation Shaders – long video but starts with basics too

🛡️Physically Based Rendering (PBR)

Physically based rendering wiki, uses Bidirectional Reflectance Distribution Function (BRDF).

Gives more realistic lighting and shading with two additional textures (or just parameters): roughness and metalness.
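Two ingredients of a typical PBR BRDF, driven by the metalness parameter above, can be sketched in Python: the base reflectance (F0) choice for metals vs dielectrics, and Schlick's approximation of the Fresnel term (helper names are illustrative):

```python
def base_reflectance(albedo, metalness):
    """Dielectrics reflect ~4% at normal incidence; metals use
    their albedo as F0. Blend per channel by metalness."""
    return tuple(0.04 * (1 - metalness) + a * metalness for a in albedo)

def fresnel_schlick(cos_theta, f0):
    """Reflectance grows toward 1 at grazing angles (cos_theta -> 0)."""
    return tuple(f + (1 - f) * (1 - cos_theta) ** 5 for f in f0)

f0 = base_reflectance((1.0, 0.8, 0.6), metalness=0.0)   # a dielectric
head_on = fresnel_schlick(1.0, f0)    # looking straight at the surface
grazing = fresnel_schlick(0.0, f0)    # grazing angle: near-mirror
```

A full BRDF such as Cook-Torrance combines this Fresnel term with a normal-distribution function (controlled by roughness) and a geometry/shadowing term.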

Simple Web demo with sliders and dynamic reflection. Another demo.

Web Examples: old gun web demo, Baja car model, Bike model.

Long PBR doc in Filament.

🔮Other

  • 🪞Reflection – mirror-like or highly glossy reflection. It was easy for flat mirrors.
    To look believable / good it needs a cubemap per object (like a car), or for corridors several cubemaps with blending between them.
  • 🔮Refraction – bending of light associated with transparency, web demo and other.
    This is one of the hardest effects for rasterisation; it would need an extra cubemap for each object.
  • Level of detail (LOD), a very important aspect of rendering. The further away from camera, the lower detail should geometry have (e.g. trees, rocks).
    Also crucial for terrain rendering. E.g. video simple grid, complex, quadtree.
  • Volume rendering, volume instancing demo, nice ☁️cloud web demo. Search for volumetric fog.
  • Particle system – fast rendering of many particles, for dirt, smoke, fire, explosions, magic effects, etc.
    It was done on the CPU before and rendered with point sprites. Now all the computation can be done by the GPU, making more particles possible (even millions).
    The movement could also be done with compute shaders, which can run any computations on the GPU, not just rendering.
  • HW Instancing, explained in Ogre. Important technique to reduce draw calls when very many instances of objects are present.
  • Translucency – highly scattered transmission of light through solid objects
  • Caustics (a form of indirect illumination) – reflection of light off a shiny object, or focusing of light through a transparent object, to produce bright highlights on another object
    Fancy water with surface and caustics 🌊water web demo.
  • Subsurface scattering demo e.g. for skin
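A particle-system update step, the same math a compute shader would run per particle on the GPU, can be sketched like this (Python, illustrative names and values):

```python
import random

GRAVITY = (0.0, -9.81, 0.0)

def spawn(n):
    """Particles as (position, velocity, remaining lifetime)."""
    return [((0.0, 0.0, 0.0),
             (random.uniform(-1, 1), random.uniform(2, 5), 0.0),
             random.uniform(1.0, 3.0)) for _ in range(n)]

def update(particles, dt):
    """Integrate motion and drop particles whose lifetime ran out."""
    alive = []
    for (px, py, pz), (vx, vy, vz), life in particles:
        life -= dt
        if life <= 0.0:
            continue                      # particle died this frame
        vy += GRAVITY[1] * dt             # apply gravity
        alive.append(((px + vx * dt, py + vy * dt, pz + vz * dt),
                      (vx, vy, vz), life))
    return alive

particles = spawn(100)
particles = update(particles, dt=0.016)   # one ~60 FPS frame
```

On the GPU the particle buffer stays in video memory, the update runs in parallel, and the result is rendered directly, which is what makes millions of particles feasible.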

🖼️Effects, post process📷

Post-processing effects work on the final 2D screen image pixels. They don't use 3D geometry at all, but can use additional screen textures like the depth buffer. They are done using fragment / pixel shaders.

  • Non-photorealistic rendering – e.g. Cel shading for comic like style.
  • Bloom – fake effect adding glow to screen. Was quite overused in past games.
    Web demo.
  • HDR – floating-point intensities and eye-adaptation dimming for the whole screen, like the aperture in cameras.
    Web demo.
    Tone mapping
  • Depth of field – Blurring screen parts out of focus. Should be only used for very close scenes, not far.
  • God Rays – sunbeams, or light shafts / rays.
    Web demo.
  • Motion blur – Blurring screen due to fast motion. Happens naturally in video cameras for movies, but in rendering has to be done specially.
  • SSAO – Screen Space Ambient Occlusion. It darkens creases and corners. It's fake, but works quite well, giving more of a shadowed feel, especially in areas that aren't directly lit (which have only ambient light and would otherwise be unshaded).
    If not done in real time as an effect, AO can be baked into textures for an object in a 3D modelling tool.
    Web demo with sliders.
  • Screen Space Reflections. Cheaper and fake; only near-horizontal camera views look okay.
    web demo.
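The tone-mapping step mentioned under HDR can be sketched with the simple Reinhard operator, applied per channel to every pixel of the HDR image (Python, illustrative names; real pipelines often use fancier curves like ACES):

```python
def reinhard(hdr_color):
    """Map unbounded HDR intensity into [0, 1) via x / (1 + x)."""
    return tuple(x / (1.0 + x) for x in hdr_color)

def tonemap(hdr_image):
    """Apply the operator to every pixel of a row-major image."""
    return [[reinhard(px) for px in row] for row in hdr_image]

# One pixel with a dark, mid, and very bright channel:
ldr = tonemap([[(0.0, 1.0, 10.0)]])
```

Note how the very bright channel is compressed below 1.0 instead of clipping, which is the whole point of tone mapping.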

Deferred rendering is a different approach (article from 2013), useful for a high light count. It seems old and unused now.

Approaching Zero Driver Overhead (AZDO) search – quite important publications for engine developers.

Polygonal-Light Shading with doc paper, web demo and video.

☀️Global Illumination (GI)

Global illumination – surfaces illuminated by light reflected off other surfaces (indirect light). More screens in e.g. radiosity.

Good long talk explanation video with history of various methods.
Video with few methods and listing comparison aspects: Scalable Global Illumination for O3DE.

Forward rendering search. Ogre-next has few demos: Forward3D, Instant Radiosity. Voxel Cone, Local Reflections.
Clustered shading demo with good document links.

📊Engines

Engines list on wiki.
I will list just a few here that I have seen more of, with links to longer lists.

Web based: WebGL and Three.js

🚗Game

Game engines already have all the components needed for games: physics, GUI, scripts, etc.
In their editor software you can create games without even writing code. Some have node graphs for game events; others require writing scripts (code). The same goes for shaders, which can be built as graphs of connected blocks.

Big commercial engines that are “free” to start with; they also have a big store with assets you can buy, and a tiny free set:

FOSS (open source):

Basic video of same game in 3 game engines.

Many lists with engines e.g. here, alternatives here or here, 100 list c++ here.

🟢Only rendering

This type lets you choose the other components (like a physics engine, GUI, etc.) and combine them yourself. It needs much more effort, but gives more freedom and customization. It requires C++ programming skills, as well as writing shaders. Importing 3D objects from modelling software can be more complicated too.

  • Ogre – older branch, has RTSS shader system.
  • Ogre-next – new with good examples of techniques, GI too. Scripts for building from sources. Has HLMS material system.
  • Wicked Engine – pretty advanced and FOSS
  • bgfx – needs building from sources, but has good examples.

Many more C++ based here, and here, search pbr here.

Assets can be found on a few websites like: blendswap, sketchfab, etc. There you can also choose CC0, CC-BY, etc. licensed ones for FOSS games.

Some game programming and C++ playlists.