How does light baking work in realtime rendering?
That's too open a question; be more specific. It varies a bit based on which technique you mean:
- Baking direct lighting only
  - almost always the first step
- Radiosity
  - stored as flat light (1 lightmap per bake)
    - Quake / old Unreal / Half-Life 1
  - stored as ambient + directional (2 lightmaps per bake)
    - The Last of Us, several others
  - stored as radiosity normal (3 lightmaps per bake; see the sketch after this list)
    - Half-Life 2, current Unreal
  - stored as spherical harmonics (9 lightmaps per bake)
  - substructuring and lattice interpolation
- Pseudo-realtime radiosity like Enlighten?
  - coarse emitter to fine lumel relationship
  - form factors
  - point clouds
- Packing?
- Generating lightmap UV coordinates?
  - use Thekla or Microsoft UVAtlas
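To make the directional formats above concrete, here is a minimal sketch of evaluating the Half-Life 2 style radiosity normal basis at one lumel (Python/numpy; the function name and sample values are made up, and the squared-and-renormalized weighting is one common variant, not the only one):

```python
import numpy as np

# Fixed tangent-space basis of Half-Life 2's radiosity normal mapping:
# three unit vectors, mutually orthogonal, each tilted toward +Z.
HL2_BASIS = np.array([
    [-1.0 / np.sqrt(6.0),  1.0 / np.sqrt(2.0), 1.0 / np.sqrt(3.0)],
    [-1.0 / np.sqrt(6.0), -1.0 / np.sqrt(2.0), 1.0 / np.sqrt(3.0)],
    [ np.sqrt(2.0 / 3.0),  0.0,                1.0 / np.sqrt(3.0)],
])

def shade_radiosity_normal(lightmaps, normal_ts):
    """lightmaps: (3, 3) array, one RGB texel from each of the three
    baked lightmaps at this lumel. normal_ts: unit tangent-space normal
    (z > 0, i.e. pointing away from the surface)."""
    # Weight each lightmap by how much the normal faces its basis vector
    # (squared and renormalized -- one common variant of the weighting).
    w = np.clip(HL2_BASIS @ normal_ts, 0.0, None) ** 2
    w /= w.sum()
    return w @ lightmaps  # blend of the three directional bakes

# Made-up texels: bright light from basis 0, dim from the others.
texels = np.array([[1.0, 0.9, 0.8],
                   [0.1, 0.1, 0.2],
                   [0.3, 0.3, 0.3]])
n = np.array([0.2, 0.4, 0.9])
print(shade_radiosity_normal(texels, n / np.linalg.norm(n)))
```

Flat light is the degenerate case of this: one lightmap, weight 1.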
That all gets nasty because there can be a lot of crossover. For instance, you can render as spherical harmonic maps but then write out the final data as ambient + directional or a radiosity normal map.
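As a sketch of one such conversion (this is my own construction, not any particular engine's exporter): an order-1 SH bake can be collapsed to an ambient color plus a dominant direction and color, since the L0 band carries the average light and the L1 band points toward the brightest light.

```python
import numpy as np

LUMA = np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 luminance weights

def sh_to_ambient_directional(sh):
    """sh: (4, 3) RGB order-1 SH coefficients [L00, L1m1, L10, L11]
    for one lumel. Returns (ambient, direction, directional_color),
    a lossy re-encoding into an ambient + directional bake."""
    # The L1 band is a vector pointing toward the dominant light;
    # collapse RGB to luminance to pick a single axis.
    axis = np.array([sh[3] @ LUMA, sh[1] @ LUMA, sh[2] @ LUMA])  # (x, y, z)
    length = np.linalg.norm(axis)
    direction = axis / length if length > 1e-8 else np.array([0.0, 0.0, 1.0])
    # L0 band evaluated with its basis constant gives the ambient color.
    ambient = sh[0] * 0.282095
    # L1 band evaluated along the dominant direction gives the directional color.
    directional = np.array([sh[3], sh[1], sh[2]]).T @ direction * 0.488603
    return ambient, direction, np.clip(directional, 0.0, None)
```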
The short summary is that the lightmap stores only the light that will hit the surface; you then use that to light the surface according to the technique.
In the crudest implementation the light at the surface will be lightmap * colorTexture.
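A sketch of that crudest version (names and sample values are placeholders):

```python
import numpy as np

def shade_lightmapped(lightmap_texel, color_texel):
    """Crudest lightmapped shading: baked light times surface color."""
    return lightmap_texel * color_texel

# Made-up texels: warm baked light on a bluish surface.
print(shade_lightmapped(np.array([1.0, 0.8, 0.6]), np.array([0.3, 0.4, 0.9])))
```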
Baking the lightmap is basically just rendering backwards: you render to UV coordinates, interpolating the position and normal per vertex of the mesh, instead of rendering to world coordinates, interpolating the UV coordinates and normal per vertex (not wholly accurate, but close enough for a summary).
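A sketch of that backwards vertex step, assuming a rasterizer fed clip-space positions (all names here are hypothetical):

```python
import numpy as np

def bake_vertex(lightmap_uv, world_pos, world_normal):
    """'Rendering backwards': output the lightmap UV as the rasterized
    position, so triangles land on their lightmap texels, and carry
    world position/normal along as the interpolated attributes."""
    # Map UV in [0,1]^2 to clip space [-1,1]^2; depth is unused.
    clip_pos = np.array([lightmap_uv[0] * 2.0 - 1.0,
                         lightmap_uv[1] * 2.0 - 1.0,
                         0.0, 1.0])
    # The baker shades each covered texel with these interpolated values.
    return clip_pos, (world_pos, world_normal)
```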
Radiosity is too long-winded to explain without knowing that's what you're after.
Is there any resource I can reference for light baking techniques?
Hugo Elias' old page is still the best reference for radiosity:
http://web.archive.org/web/20071001024020/http://freespace.virgin.net/hugo.elias/radiosity/radiosity.htm
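The core of what that page builds up to is a gather loop: every patch repeatedly sums the light arriving from every other patch, weighted by a form factor. A heavily simplified sketch (visibility folded into precomputed form factors, data layout made up):

```python
import numpy as np

def radiosity_gather(emission, albedo, form_factors, iterations=8):
    """emission, albedo: (n, 3) RGB per patch; form_factors: (n, n),
    where entry (i, j) is how much of patch j's light reaches patch i.
    Iterates B = E + rho * F @ B until it settles; the converged
    result is what gets written into the lightmap."""
    radiosity = emission.copy()
    for _ in range(iterations):
        gathered = form_factors @ radiosity   # light arriving at each patch
        radiosity = emission + albedo * gathered
    return radiosity
```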
Direct-light baking is the inverse of usual rendering, as described above.
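Per lumel that works out to the same lighting math a forward shader would run, just evaluated once offline; a sketch with a hypothetical visibility helper:

```python
import numpy as np

def bake_direct(lumel_pos, lumel_normal, lights, visible):
    """One lumel of a direct-light bake. lights: iterable of
    (position, rgb_intensity) point lights. visible(a, b) is a
    hypothetical ray test: True if nothing blocks the segment a->b."""
    total = np.zeros(3)
    for light_pos, light_rgb in lights:
        to_light = light_pos - lumel_pos
        dist = np.linalg.norm(to_light)
        to_light = to_light / dist
        n_dot_l = lumel_normal @ to_light
        if n_dot_l <= 0.0 or not visible(lumel_pos, light_pos):
            continue  # facing away, or in shadow
        # Same Lambert + inverse-square term a realtime shader would
        # evaluate, computed once here and stored in the lightmap.
        total += light_rgb * (n_dot_l / (dist * dist))
    return total
```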
e.g. how to turn baked light into a texture
It should already be a texture if it's baked.
You really need to be more specific on what you want to know.