Looks great, but doesn't the Normal Map node expect the input to be a tangent space vec3? Right now it's just a single float value.
Isn't colour data a vector of r, g, and b? But good catch, there's definitely something wrong there. I'm still getting used to how Blender works with its shader nodes, and I was just feeding the Normal Map node what is really just a heightmap, to add some extra depth or maybe fake AO (not from the model geometry, I know there's an AO node for that, but from the heightmap itself).
The Bump node will take that heightmap and actually produce the results I was looking for.
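Roughly, a bump node turns the slope of the height field into a normal, rather than treating the pixel values as a normal directly. Here's a minimal sketch of that idea in plain Python/NumPy (an assumption about the general technique, not Blender's actual implementation, which works per shading sample rather than on a whole image):

```python
import numpy as np

def bump_normals(height, strength=1.0):
    """Derive per-pixel normals from a grayscale heightmap using
    finite differences: normals tilt away from the uphill direction."""
    # Slope of the height field along rows (y) and columns (x).
    dy, dx = np.gradient(height)
    # The normal is perpendicular to the slope; z points out of the surface.
    n = np.stack(
        [-dx * strength, -dy * strength, np.ones_like(height)], axis=-1
    )
    # Normalize to unit length.
    return n / np.linalg.norm(n, axis=-1, keepdims=True)

# A flat heightmap gives straight-up normals (0, 0, 1) everywhere;
# a ramp tilts them away from the slope.
flat = np.zeros((4, 4))
ramp = np.tile(np.arange(4.0), (4, 1))
print(bump_normals(flat)[0, 0])  # → [0. 0. 1.]
print(bump_normals(ramp)[0, 0])
```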
Isn’t colour data a vector of the r g and b?
Yeah, and since it's basically a grayscale image with identical RGB values, the normal map ends up representing some weird surface whose normals all lie on a single line between (0,0,0) and (1,1,1). I guess the Bump node finds the slope of the grayscale values instead.
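The usual tangent-space decode maps each colour channel from [0, 1] to [-1, 1], so equal RGB values always collapse onto that diagonal. A quick sketch of the decode (plain Python, just illustrating the convention, not Blender's node internals):

```python
def decode_tangent_normal(r, g, b):
    """Standard tangent-space normal map decode: n = 2c - 1 per channel,
    mapping colour [0, 1] to vector components [-1, 1]."""
    return (2.0 * r - 1.0, 2.0 * g - 1.0, 2.0 * b - 1.0)

# A grayscale pixel (r == g == b) always decodes to a vector with
# equal components, i.e. a point on the line along (1,1,1) --
# not a plausible spread of surface normals.
print(decode_tangent_normal(0.5, 0.5, 0.5))  # → (0.0, 0.0, 0.0)
print(decode_tangent_normal(0.8, 0.8, 0.8))

# By contrast, the typical "flat" normal-map colour (0.5, 0.5, 1.0)
# decodes to straight up in tangent space:
print(decode_tangent_normal(0.5, 0.5, 1.0))  # → (0.0, 0.0, 1.0)
```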
I think in Eevee the Displacement node does the same thing behind the scenes, so having both displacement and Bump + Normal Map deriving from the same height data will probably double the effect.
Yes. This was in Cycles with adaptive subdivision on the model though, so the displacement was actually displacing the geometry, and the normal map was there to add a little more pop without displacing the geometry more than I wanted.
Nice. I work mainly in Eevee for game assets, so I have to add the geometry manually. I wish they'd add more real-time rendering features from game engines (like parallax mapping); it would make previewing much easier. The new realtime compositor is almost good enough for prototyping post-process shaders in Blender, but it still doesn't support render passes, unfortunately.