The electroweak layer described in that paper is indeed caused by Hawking radiation. See the caption of figure 1:
"A black hole with a mass 76 kg ∼ 200 tons can heat up its neighborhood by the Hawking radiation and restores the electroweak symmetry in the neighborhood spherically."
The electroweak fields have two basic states. In the low-temperature state, the Higgs field has a nonzero energy density (a condensate) everywhere, and everything except the photon acquires its mass through interactions with that condensate. In the high-temperature state, the condensate has dissolved; what remains is a plasma of electroweak particles, all of them massless.
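To make "mass because of the condensate" concrete, here are the standard tree-level Standard Model relations (textbook results, not anything specific to that paper): every mass is proportional to the Higgs condensate v, so they all vanish when the condensate melts.

```latex
% Tree-level Standard Model mass relations (standard textbook results, quoted here
% for orientation, not taken from the paper): each mass is proportional to the
% Higgs condensate v, so all of them disappear when the condensate melts away.
\begin{align*}
  m_W &= \tfrac{1}{2}\, g\, v, &
  m_Z &= \tfrac{1}{2}\sqrt{g^2 + g'^2}\; v, &
  m_f &= \frac{y_f\, v}{\sqrt{2}}, &
  m_\gamma &= 0,
\end{align*}
% with v ~ 246 GeV in the low-temperature state, and v(T) ~ 0 above the crossover
% at roughly T_c ~ 160 GeV, where the high-temperature plasma takes over.
```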
The temperature needed to create an electroweak plasma is so high that, according to the standard understanding of astrophysics, it has existed almost nowhere in the universe since the Big Bang. But the temperature of Hawking radiation increases as a black hole shrinks, so near the end of its life the radiation from an evaporating black hole would eventually cross that threshold. The Hawking particles pouring out should then create a zone around the black hole in which the Higgs condensate can't exist. It's vaguely similar to the flame around a burning candle: outside that electroweak "flame", physics is back to normal, the Higgs condensate exists, and the Higgs mechanism is at work. Again, see figure 1 in the paper. (The "domain wall" in that figure is the transitional shell between the two states of the Higgs field.)
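For a sense of the numbers, here is a back-of-envelope sketch (my own, not the paper's calculation). It uses only the standard Hawking temperature formula and the commonly quoted value of roughly 160 GeV for the electroweak crossover; the 200-tonne mass is the figure from the caption quoted above. The paper's actual threshold depends on how efficiently the outgoing radiation heats the neighborhood, which this sketch does not attempt to model.

```python
# Back-of-envelope numbers (mine, not the paper's): the Hawking temperature
#   T_H = hbar * c**3 / (8 * pi * G * M * k_B)
# grows as the black hole mass M shrinks, so a light enough hole radiates at
# electroweak-scale temperatures.
import math

hbar = 1.054571817e-34   # reduced Planck constant, J s
c    = 2.99792458e8      # speed of light, m / s
G    = 6.67430e-11       # Newton's constant, m^3 / (kg s^2)
k_B  = 1.380649e-23      # Boltzmann constant, J / K
eV   = 1.602176634e-19   # electron volt, J

def hawking_temperature(M_kg):
    """Hawking temperature in kelvin for a black hole of mass M_kg (kilograms)."""
    return hbar * c**3 / (8.0 * math.pi * G * M_kg * k_B)

def kelvin_to_gev(T_K):
    """Convert a temperature in kelvin to an energy scale k_B * T in GeV."""
    return k_B * T_K / (1e9 * eV)

# Electroweak crossover temperature, roughly 160 GeV (commonly quoted lattice value).
T_ew = 160e9 * eV / k_B  # in kelvin

# Mass at which the Hawking temperature itself reaches the electroweak scale.
M_naive = hbar * c**3 / (8.0 * math.pi * G * k_B * T_ew)
print(f"T_H reaches ~160 GeV at M ~ {M_naive:.1e} kg (~{M_naive/1e3:.0f} tonnes)")

# Hawking temperature of a ~200-tonne hole, the mass quoted in the figure caption.
M_caption = 2e5  # kg
T_caption = hawking_temperature(M_caption)
print(f"A 200-tonne hole has T_H ~ {T_caption:.1e} K ~ {kelvin_to_gev(T_caption):.0f} GeV,")
print("far above the electroweak crossover scale.")
```

The naive estimate of a few times 10^7 kg is just the point where T_H itself crosses the electroweak scale; the much smaller mass quoted in the caption presumably reflects the stricter requirement that the heated plasma around the hole, not merely the radiation at the horizon, reaches that temperature.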
As for your deduction: yes, if a black hole surrounded by electroweak plasma emitted an electron that started out massless and moved in a straight line through the plasma into the empty space beyond, it would indeed get there a fraction more quickly than if it had carried its usual mass all the way. But that's not a violation of Hawking's theory of black hole evaporation; it just shows that the second scenario is an oversimplification for a small enough black hole.
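To put a rough number on "a fraction more quickly" (purely illustrative figures of my own; the 50 GeV electron energy is an arbitrary choice, not something from the paper):

```latex
% A massless electron crosses the symmetric region at c; an electron of energy E
% and mass m_e instead moves at v/c = sqrt(1 - (m_e c^2 / E)^2), so the fractional
% time saved over the crossing is about
\[
  \frac{\Delta t}{t} \;\approx\; \frac{(m_e c^2)^2}{2E^2}
  \;=\; \frac{(0.511\ \mathrm{MeV})^2}{2\,(50\ \mathrm{GeV})^2}
  \;\approx\; 5\times 10^{-11},
\]
% i.e. the head start from being temporarily massless inside the "flame" is about
% one part in 2 x 10^10 of the time spent crossing it.
```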