So this has been bothering me for a little while. Consider scalar diffusion, $\frac{\partial c}{\partial t} = D\nabla^2 c$, where $c=c(x,y,t)$. Most people would say that the scalar moves downhill, and it is certainly true that it moves from regions of high concentration to regions of low concentration. However, it is usually also said that the diffusion is along the (negative) gradient; in other words, that the level curves move perpendicular to themselves, not at a slight angle.
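For reference, here is the setting I have in mind: the equation comes from conservation of mass together with Fick's law, in which the flux $\mathbf{J}$ is anti-parallel to the gradient,

$$\frac{\partial c}{\partial t} = -\nabla\cdot\mathbf{J}, \qquad \mathbf{J} = -D\,\nabla c \quad\Longrightarrow\quad \frac{\partial c}{\partial t} = D\,\nabla^2 c.$$

So the instantaneous flux certainly points down the gradient at every point; what I am unsure about is whether the level curves themselves move that way as the solution evolves.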
This assumption is mentioned off-hand in a lot of places, e.g. http://www.math.umn.edu/~olver/ln_/vc2.pdf (see the paragraphs near Proposition 5.3), but I cannot find a proof that it holds for an arbitrary scalar concentration field.
This seems like such a simple thing, but the more I look into it the less sure I am that it's true. When thinking about this, don't just consider a circularly symmetric distribution; take something like a wedge-shaped region of high concentration instead. Shouldn't the tip of the wedge spread faster across the opening than along it, thus bending the level curves away from the gradient direction? (A quick numerical sketch of this experiment is below.)
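Here is a minimal numerical sketch of that experiment (my own construction, not from the notes linked above): a forward-Euler, 5-point-Laplacian integration of the heat equation starting from a wedge-shaped concentration, with the level curves plotted afterwards. The grid size, diffusion coefficient, wedge angle, and contour levels are arbitrary choices.

```python
# Evolve a wedge-shaped initial concentration under dc/dt = D * lap(c)
# with a forward-Euler, 5-point-Laplacian scheme, then plot level curves
# to see how they move relative to the initial wedge outline.
import numpy as np
import matplotlib.pyplot as plt

D = 1.0                        # diffusion coefficient
N = 201                        # grid points per side, domain [-1, 1]^2
x = np.linspace(-1.0, 1.0, N)
dx = x[1] - x[0]
dt = 0.2 * dx**2 / D           # below the 2D stability limit dt <= dx^2/(4D)

X, Y = np.meshgrid(x, x, indexing="ij")

# Wedge: c = 1 inside a 60-degree wedge with its tip at the origin,
# opening along +x, truncated at radius 0.8 so it stays off the boundary.
c0 = ((X > 0)
      & (X**2 + Y**2 < 0.8**2)
      & (np.abs(np.arctan2(Y, X)) < np.pi / 6)).astype(float)
c = c0.copy()

for _ in range(500):           # integrate to t = 500 * dt
    lap = np.zeros_like(c)
    lap[1:-1, 1:-1] = (c[2:, 1:-1] + c[:-2, 1:-1] + c[1:-1, 2:]
                       + c[1:-1, :-2] - 4.0 * c[1:-1, 1:-1]) / dx**2
    c = c + dt * D * lap       # Dirichlet c = 0 on the outer boundary

# Dashed gray: initial wedge outline. Solid black: diffused level curves.
plt.contour(X, Y, c0, levels=[0.5], colors="gray", linestyles="dashed")
plt.contour(X, Y, c, levels=[0.1, 0.25, 0.5], colors="black")
plt.gca().set_aspect("equal")
plt.title("Level curves of a diffusing wedge")
plt.show()
```

Watching how the contours near the tip round off should make it easier to judge whether their motion is purely normal to themselves.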
I'd appreciate an explanation of why diffusion follows the gradient, or a counter-example.