
So yesterday here on Stack Exchange, I stumbled across a solution to something I've been trying to figure out for a while now, a rough way of calculating volume depth:

Distance from the surface as an input node for use with volumetric materials

After some tinkering with Rich Sedman's much-appreciated original code, I've got the start of a... maybe not a refinement, but a version that aims to trade efficiency for more accurate results, and that also generates an approximate normal for the volume point along the way:

shader volume_depthfinder(
    vector Point = P,
    float range = 1,
    vector Incoming = I,
    output vector relNormal = vector(0, 0, 1),
    output vector maxNormal = vector(0, 0, 1),
    output vector minNormal = vector(0, 0, 1),
    output float Distance = 0,
    output float MinDist = range,
    output float MaxDist = 0
) {
    // Probe forward along the incoming ray.
    if (trace(Point, Incoming, "maxdist", range)) {
        getmessage("trace", "hitdist", Distance);
        if (Distance < MinDist) {
            getmessage("trace", "N", minNormal);
            minNormal = -minNormal; // flip to face the volume point
            MinDist = Distance;
        }
        if (Distance > MaxDist) {
            MaxDist = Distance;
            getmessage("trace", "N", maxNormal);
        }
    }

    // Probe backward.
    if (trace(Point, -Incoming, "maxdist", range)) {
        getmessage("trace", "hitdist", Distance);
        if (Distance < MinDist) {
            getmessage("trace", "N", minNormal);
            minNormal = -minNormal;
            MinDist = Distance;
        }
        if (Distance > MaxDist) {
            getmessage("trace", "N", maxNormal);
            MaxDist = Distance;
        }
    }

    // Probe perpendicular to the near and far normals.
    vector CrossNorm = normalize(cross(minNormal, maxNormal));

    if (trace(Point, CrossNorm, "maxdist", range)) {
        getmessage("trace", "hitdist", Distance);
        if (Distance < MinDist) {
            getmessage("trace", "N", minNormal);
            minNormal = -minNormal;
            MinDist = Distance;
        }
        if (Distance > MaxDist) {
            getmessage("trace", "N", maxNormal);
            MaxDist = Distance;
        }
    }

    // Fold the cross vector 90 degrees toward minNormal and probe again.
    vector foldCross = rotate(CrossNorm, M_PI / 2, point(0, 0, 0), point(minNormal));

    if (trace(Point, foldCross, "maxdist", range)) {
        getmessage("trace", "hitdist", Distance);
        if (Distance < MinDist) {
            getmessage("trace", "N", minNormal);
            minNormal = -minNormal;
            MinDist = Distance;
        }
        if (Distance > MaxDist) {
            getmessage("trace", "N", maxNormal);
            MaxDist = Distance;
        }
    }

    // First guess at the relative normal (component-wise product),
    // then probe along it.
    relNormal = normalize(minNormal * -maxNormal);

    if (trace(Point, relNormal, "maxdist", range)) {
        getmessage("trace", "hitdist", Distance);
        if (Distance < MinDist) {
            getmessage("trace", "N", minNormal);
            minNormal = -minNormal;
            MinDist = Distance;
        }
        if (Distance > MaxDist) {
            getmessage("trace", "N", maxNormal);
            MaxDist = Distance;
        }
    }

    relNormal = normalize(minNormal * -maxNormal);
    Distance = (MaxDist + MinDist) / 2;
}

I know there's probably a better series of raytraces to find good approximate normals and depth, but I've been staring at the code for too long. Thought I'd share with the class and see if fresh eyes have any suggestions.

Edit:

The current idea with the algorithm is that it starts off by looking at the surfaces in front of and behind the volume point. This has two purposes:

For distance, I want to know both because I want relative depth, similar to the Solidify modifier, not just raw depth. I've been using the Suzanne primitive for testing, and the ears are a great example: if I'm doing a stepped gradient, the head and the ears should both have the lowest step at their core.

This also gives it a pair of normals: the closer one becomes minNormal, the farther one becomes maxNormal. maxNormal gives it something to mix with minNormal to produce a relative normal between the two. This morning I realized this should probably be a lerp based on the distances.
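
That lerp can be sketched in Python (outside OSL, just to check the math; `blend_normals` is a hypothetical name, not part of the shader): weighting each normal by the *opposite* distance pulls the blended result toward whichever surface is closer.

```python
def blend_normals(min_normal, max_normal, min_dist, max_dist):
    """Distance-weighted blend of the near and far hit normals.

    max_normal is negated so both vectors face the same way; each is
    then weighted by the opposite distance, so the result leans
    toward the closer surface.
    """
    w = max_dist / (min_dist + max_dist)  # weight for the near normal
    blended = [w * a + (1 - w) * -b for a, b in zip(min_normal, max_normal)]
    length = sum(c * c for c in blended) ** 0.5
    return [c / length for c in blended]
```

For example, a point three times closer to a side wall than to the back face gets a normal pointing mostly at that wall; at the exact midpoint the two normals contribute equally.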

The idea with the next two traces is to feel around for tight curves and weird geometry, using the two normals I've just grabbed to make a guess about where the next closest surface might be. This is one of the places that I know probably has a better solution; I vaguely remember curl operations from calculus that would probably be useful here, so I'll probably need to hit Wikipedia for a crash-course refresher. Any input from those who remember that part better is appreciated.

I also think I'll look at using the cross product to project a few rotated versions of minNormal. Maybe I could make the number and spread of the rays socket variables, so I can flag areas for high-density scans or skip this step entirely, too.
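
A sketch of that idea (in Python for readability; `rotate_about_axis` and `probe_fan` are hypothetical names — inside the shader, OSL's builtin rotate() would handle the per-ray rotation): Rodrigues' formula spins minNormal around the cross-product axis, and count/spread would be the socket variables.

```python
import math

def rotate_about_axis(v, axis, angle):
    """Rodrigues rotation: rotate vector v around a unit axis by angle radians."""
    dot = sum(a * b for a, b in zip(axis, v))
    cross = (axis[1] * v[2] - axis[2] * v[1],
             axis[2] * v[0] - axis[0] * v[2],
             axis[0] * v[1] - axis[1] * v[0])
    c, s = math.cos(angle), math.sin(angle)
    # v*cos + (axis x v)*sin + axis*(axis . v)*(1 - cos)
    return tuple(v[i] * c + cross[i] * s + axis[i] * dot * (1 - c)
                 for i in range(3))

def probe_fan(base, axis, count, spread):
    """Fan `count` copies of `base` around `axis`, covering `spread` radians."""
    if count == 1:
        return [base]
    step = spread / (count - 1)
    start = -spread / 2
    return [rotate_about_axis(base, axis, start + i * step)
            for i in range(count)]
```

Each direction returned by `probe_fan` would then get its own trace() call, with the same nearest/farthest bookkeeping as the existing probes.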

I'll see if I can post revised code later today.

Edit 2: So here's what I've come up with so far for the 2.0 version; I haven't added the cross-product part back in yet, but it works pretty well as-is. Pretty simple setup: it looks at the face in front of and behind it, then does a distance-weighted lerp between them to guess what its normal should be.

It then looks to see if there's a surface in that direction within range. If there is, and it's closer than what's directly in front of it, then the point is probably beside a sideways-facing surface and should use that as its front-facing normal.

It then lerps the new front-facing normal with the back-facing normal to further dial itself in, and returns all the data I thought might potentially be useful.

shader volume_depthfinder(
    vector Point = P,
    float range = 1,
    vector Incoming = I,
    output vector relNormal = vector(0, 0, 1),
    output vector maxNormal = vector(0, 0, 1),
    output vector minNormal = vector(0, 0, 1),
    //output float aveDistance = 0,
    output float minDist = range,
    output float maxDist = 0,
    output float totalDistance = 0
) {
    float Distance = 0;
    float Slider = 0;

    // Probe forward along the incoming ray.
    if (trace(Point, Incoming, "maxdist", range)) {
        getmessage("trace", "hitdist", Distance);
        if (Distance < minDist) {
            getmessage("trace", "N", minNormal);
            minNormal = -minNormal; // flip to face the volume point
            minDist = Distance;
        }
        if (Distance > maxDist) {
            maxDist = Distance;
            getmessage("trace", "N", maxNormal);
        }
    }

    // Probe backward.
    if (trace(Point, -Incoming, "maxdist", range)) {
        getmessage("trace", "hitdist", Distance);
        if (Distance < minDist) {
            getmessage("trace", "N", minNormal);
            minNormal = -minNormal;
            minDist = Distance;
        }
        if (Distance > maxDist) {
            getmessage("trace", "N", maxNormal);
            maxDist = Distance;
        }
    }

    // Distance-weighted lerp between the near and far normals;
    // the guess leans toward whichever surface is closer.
    totalDistance = maxDist + minDist;
    Slider = maxDist / totalDistance;
    relNormal = normalize((Slider * minNormal) + ((1 - Slider) * -maxNormal));

    // Probe along the guessed normal; a closer hit means the point
    // sits beside a sideways-facing surface, so fold that hit in.
    if (trace(Point, relNormal, "maxdist", range)) {
        getmessage("trace", "hitdist", Distance);
        if (Distance < minDist) {
            getmessage("trace", "N", minNormal);
            minNormal = -minNormal;
            minDist = Distance;
        }
        if (Distance > maxDist) {
            getmessage("trace", "N", maxNormal);
            maxDist = Distance;
        }
    }

    // Re-blend with the updated normals and distances.
    totalDistance = maxDist + minDist;
    Slider = maxDist / totalDistance;
    relNormal = normalize((Slider * minNormal) + ((1 - Slider) * -maxNormal));
}

[test render] [node setup] (The "test shader" output is a viewport probe I used to check that the normals were aligning to the surface correctly; the output rendered below is the full setup.) [render test image]

Edit: updated with a render and setup where I don't get dyslexic on the glass shell's Fresnel mixing, and blame the trashiness on the low sampling...

  • A more specific title (question) would be nice... – brockmann Feb 25 '20 at 21:47
  • Not really sure what I could add; it sums up what I'm looking for: I have an OSL node and am looking for help streamlining the algorithm. – Joshua Scorpion451 Evans Feb 25 '20 at 23:32
  • Can you explain your algorithm a bit? I don’t understand quite what it’s trying to achieve. MaxNormal seems to be the normal at the furthest distance - what’s the reasoning for that? The Cross Product gets the vector perpendicular to the closest and furthest normals... then look in that direction for next closest/furthest normal. I don’t follow why though.... how about using the reciprocal of the distance to form a weighted average of the normal? At least that would be weighted towards the normal at the closest point - I don’t see how the normal at the furthest point comes into it. – Rich Sedman Feb 25 '20 at 23:46
  • What's the effect you're looking for, from the algorithm? – Robin Betts Feb 27 '20 at 12:56
  • The goal is to start with a volume point and end up with a solid read on:

    A) the normal it would have if it were an outward-facing surface point, and B) how thick the surrounding mesh is, plus a sense of where the point is located within that mesh.

    Basically, when you have that data you can start doing all sorts of interesting effects that aren't possible right now. A "thick varnish" effect (a texture suspended inside a layer of transparent material) can be faked with Fresnel, for instance, but that breaks with the sort of layered geometry in the example render. This can handle it.

    – Joshua Scorpion451 Evans Feb 27 '20 at 15:09

0 Answers