Trying to get comfortable with the shader nodes in Blender, I sometimes create random materials. I mostly post them on my Mastodon account, but I'll test the waters here as well.

The basic idea for the shader is to get two weaves going at 90 degrees from each other. I used a Voronoi texture, scaled down in one direction so each cell is long rather than round, with a low randomness to make each weave. Then I pipe that into normal and displacement maps to get the texture out, and use a colour ramp to add the albedo and some light subsurface scattering. Tinkering with the BSDF was just me playing with the sliders; I didn't have any real direction there.
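For anyone who prefers reading the graph as a script, here is a minimal bpy sketch of one weave layer along those lines. Node and socket names are from recent Blender versions and may differ in yours; the second weave rotated 90 degrees, the subsurface input, and the exact scale/ramp values are assumptions left for tinkering.

```python
import bpy

# Rough sketch of one weave layer: stretched Voronoi -> colour ramp / bump / displacement.
mat = bpy.data.materials.new(name="WovenMaterial")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

coords = nodes.new('ShaderNodeTexCoord')

# Non-uniform scale stretches the Voronoi cells into long strands instead of round blobs.
mapping = nodes.new('ShaderNodeMapping')
mapping.inputs['Scale'].default_value = (1.0, 8.0, 1.0)

voronoi = nodes.new('ShaderNodeTexVoronoi')
voronoi.inputs['Randomness'].default_value = 0.1  # low randomness keeps the weave regular

ramp = nodes.new('ShaderNodeValToRGB')      # colour ramp for the albedo
bump = nodes.new('ShaderNodeBump')          # scalar height -> perturbed normal
disp = nodes.new('ShaderNodeDisplacement')  # same height drives displacement

bsdf = nodes['Principled BSDF']
out = nodes['Material Output']

links.new(coords.outputs['UV'], mapping.inputs['Vector'])
links.new(mapping.outputs['Vector'], voronoi.inputs['Vector'])
links.new(voronoi.outputs['Distance'], ramp.inputs['Fac'])
links.new(voronoi.outputs['Distance'], bump.inputs['Height'])
links.new(voronoi.outputs['Distance'], disp.inputs['Height'])
links.new(ramp.outputs['Color'], bsdf.inputs['Base Color'])
links.new(bump.outputs['Normal'], bsdf.inputs['Normal'])
links.new(disp.outputs['Displacement'], out.inputs['Displacement'])
```

A second copy of the mapping/Voronoi branch, rotated 90 degrees and mixed with the first, would give the cross weave.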

If this were going to be a photoreal material, this node setup would be an excellent base to start from, but it would take many more layers and details to be usable.

What do you think? Do you have any suggestions for materials for me to try?

[–] a_world_of_madness@beehaw.org 3 points 1 year ago* (last edited 1 year ago) (1 children)

> Isn't colour data a vector of the r, g, and b values?

Yeah, and since it's basically a grayscale image with identical RGB values, the normal map ends up representing some weird surface, with all the normals lying on a single line between (0,0,0) and (1,1,1). I guess the bump node finds the slope of the grayscale instead.
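To make that concrete, the usual tangent-space decode a normal map applies is roughly `normal = 2 * colour - 1`, so a grayscale input can only ever land on the (1,1,1) diagonal. A quick sketch (just the standard convention, not Blender's internal code):

```python
# Standard tangent-space decode: colour channel -> normal component.
# With r == g == b the result is always a multiple of (1, 1, 1).
def decode_normal(r, g, b):
    return (2.0 * r - 1.0, 2.0 * g - 1.0, 2.0 * b - 1.0)

print(decode_normal(0.5, 0.5, 0.5))  # (0.0, 0.0, 0.0) -- degenerate, not "straight up"
print(decode_normal(0.8, 0.8, 0.8))  # (0.6, 0.6, 0.6) -- tilted along the diagonal
```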

I think in Eevee the Displacement node does the same thing behind the scenes, so having both the displacement and the Bump + Normal Map derived from the same height data will probably double the effect.

[–] Butterbee@beehaw.org 3 points 1 year ago (1 children)

Yes. This was in Cycles with adaptive subdivision on the model, though, so the displacement was actually displacing the geometry, and the normal map was for adding a little more pop without displacing the geometry more than I wanted.
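In case anyone wants to reproduce that Cycles setup from a script, this is roughly what it looks like in bpy. Property names here are from the 3.x API and are assumptions for newer versions (the displacement method setting in particular has moved around between releases); adaptive subdivision also needs the Experimental feature set.

```python
import bpy

scene = bpy.context.scene
obj = bpy.context.object
mat = obj.active_material

# Adaptive subdivision is only available under Cycles' experimental feature set.
scene.render.engine = 'CYCLES'
scene.cycles.feature_set = 'EXPERIMENTAL'

# Subdivision Surface modifier plus adaptive dicing so the mesh is subdivided at render time.
obj.modifiers.new(name="Subdivision", type='SUBSURF')
obj.cycles.use_adaptive_subdivision = True

# Use both true displacement and bump, so the height data moves geometry
# while the normal map still adds fine detail on top.
mat.cycles.displacement_method = 'BOTH'
```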

[–] a_world_of_madness@beehaw.org 3 points 1 year ago

Nice. I work mainly in Eevee for game assets, so I have to add the geometry manually. I wish they'd add more real-time rendering features from game engines (like parallax mapping); it would make previewing much easier. The new realtime compositor is almost good enough for prototyping post-process shaders in Blender, but unfortunately it still doesn't support render passes.