I found this pretty neat website with SIGGRAPH project documents about different methods of capturing normals, albedo, specular, and other surface properties.
So... I was thinking... is it possible to get real-world readings with just a camera, a linear polarizing filter, a black box, and sequential hemispherical lighting covering all 360 degrees? What I mean is: aim a polarized camera straight down (90 degrees) at the surface and take a series of pictures, each lit from a different direction. Then, with some editing, reconstruct all the maps from those shots? Or is it more complicated than that?
http://www.pauldebevec.com/index.html
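To make the idea concrete, here's a minimal sketch of what the "different lighting per picture" reconstruction could look like in the simplest case: classic Lambertian photometric stereo. This is my own illustration, not taken from the linked site. It assumes you know the light direction for each shot, ignores shadows, specular highlights, and the polarizer entirely, and uses hypothetical inputs `images` (N grayscale photos of the same surface) and `lights` (the N corresponding unit light-direction vectors).

```python
import numpy as np

def photometric_stereo(images, lights):
    """Recover per-pixel albedo and surface normals from N photos
    taken under N known light directions (Lambertian assumption).

    images: (N, H, W) grayscale intensities
    lights: (N, 3) unit light-direction vectors
    """
    n_imgs, h, w = images.shape
    intensities = images.reshape(n_imgs, -1)          # (N, H*W)

    # Each pixel obeys I = albedo * dot(normal, light), so stacking the
    # N measurements gives a linear system L @ g = I with g = albedo * normal.
    g, _, _, _ = np.linalg.lstsq(lights, intensities, rcond=None)  # (3, H*W)

    albedo = np.linalg.norm(g, axis=0)                # per-pixel albedo
    normals = g / np.maximum(albedo, 1e-8)            # normalize to unit normals
    return albedo.reshape(h, w), normals.T.reshape(h, w, 3)
```

In this simplified picture, three or more well-spread light directions are enough to solve for a normal map and a diffuse albedo map. Separating out specular maps (which is presumably where the polarizing filter comes in) and handling non-Lambertian surfaces is where it gets more complicated, along the lines of the capture rigs described on that site.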