I'm trying to find the area of the following surface:
Let $C$ be the curve associated with a regular, simple path $\theta:[0,l]\rightarrow \Bbb R^2 $, $\theta(s) = (x(s), y(s))$; also assume that $(x'(s))^2+(y'(s))^2=b^2$, and let $S$ be the surface generated by the circles of radius $b$ orthogonal to the curve $\rho(s)=(\theta(s),0)$ and centered at its points.
With help from this source, I concluded that a convenient parametrization for $S$ is given by:
$$ H(s, t) = ( x(s), y(s), 0) + \sin(t)\, (0, 0, b) + \cos(t)\, (y'(s), -x'(s), 0). $$
(Since $(x'(s))^2+(y'(s))^2=b^2$, the vector $(y'(s), -x'(s))$ already has norm $b$, so the normal term needs no extra factor of $b$.)
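As a quick sanity check of this kind of parametrization (a sketch with sympy, assuming the normal term is scaled so that $(y', -x')$, which already has norm $b$, is used without an extra factor of $b$), one can verify that $H(s,t)$ stays at distance $b$ from the center $\rho(s)$ and that the displacement is orthogonal to the tangent of the curve:

```python
import sympy as sp

t, b, xp = sp.symbols('t b xp', real=True)
# Constant speed: x'(s)^2 + y'(s)^2 = b^2, so (on one branch of the root)
yp = sp.sqrt(b**2 - xp**2)

# Displacement of H(s, t) from the circle's center rho(s) = (x(s), y(s), 0);
# (y', -x') already has norm b, so no extra factor of b appears.
offset = sp.Matrix([sp.cos(t) * yp, -sp.cos(t) * xp, b * sp.sin(t)])

radius2 = sp.simplify(offset.dot(offset))               # squared distance from rho(s)
orth = sp.simplify(offset.dot(sp.Matrix([xp, yp, 0])))  # inner product with the tangent
print(radius2)  # b**2  (the circles have radius b)
print(orth)     # 0     (the circles are orthogonal to the curve)
```

So the circles traced for fixed $s$ do have radius $b$ and lie in the plane normal to the curve, as required.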
I intended to use the fact that the area of $S$ is given by the surface integral
$$ \operatorname{Area}(S) = \int_0^{l}\!\!\int_0^{2\pi} \| T_s \times T_t \| \; dt\, ds, $$
where $T_s = \partial H/\partial s$ and $T_t = \partial H/\partial t$. However, with this approach the terms of $\|T_s \times T_t\|$ become really awful. Am I doing something wrong here? What would you recommend?
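For what it's worth, on a concrete constant-speed curve a CAS suggests the integrand is less awful than it first appears. The sketch below (assuming the parametrization with the normal term $(y', -x')$, which already has norm $b$) takes the curve to be a circle of radius $R$ traversed at speed $b$ and checks that $\|T_s \times T_t\|^2$ collapses to $\bigl(b^2(1 + \tfrac{b}{R}\cos t)\bigr)^2$, so the $\cos t$ term integrates away:

```python
import sympy as sp

s, t, b, R, l = sp.symbols('s t b R l', positive=True)

# Concrete constant-speed curve: a circle of radius R traversed at speed b,
# so (x'(s))^2 + (y'(s))^2 = b^2 holds as in the problem statement.
x = R * sp.cos(b * s / R)
y = R * sp.sin(b * s / R)
xp, yp = x.diff(s), y.diff(s)

# Tube parametrization; (y', -x') has norm b, so the circles have radius b.
H = sp.Matrix([x + sp.cos(t) * yp,
               y - sp.cos(t) * xp,
               b * sp.sin(t)])

cross = H.diff(s).cross(H.diff(t))
# Conjectured closed form of the integrand (positive when R > b):
expected = b**2 * (1 + (b / R) * sp.cos(t))
print(sp.simplify(cross.dot(cross) - expected**2))  # 0

# Integrating over t in [0, 2*pi] and s in [0, l]:
area = sp.integrate(expected, (t, 0, 2 * sp.pi), (s, 0, l))
print(area)  # 2*pi*b**2*l
```

On this example the area comes out to $2\pi b^2 l$, i.e. $2\pi b$ times the length $bl$ of the curve, consistent with the usual tube-around-a-curve picture.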
I found this approach, which uses the Divergence Theorem. However, in this case I'm not sure I can use it, since I don't have a vector field.
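Independently of the Divergence Theorem route, the direct double integral can at least be checked numerically on a concrete example. In the sketch below (again assuming the normal term is scaled so the circles have radius $b$), the curve is a full circle of radius $R$ traversed at speed $b$, so $S$ is a torus whose area is known in closed form, $4\pi^2 b R$:

```python
import numpy as np
from scipy.integrate import dblquad

# Circle of radius R traversed at speed b; a full loop takes s in [0, 2*pi*R/b].
b, R = 1.0, 3.0
l = 2 * np.pi * R / b

def integrand(t, s):
    """||H_s x H_t|| for the tube around theta(s) = (R cos(bs/R), R sin(bs/R))."""
    xp = -b * np.sin(b * s / R)             # x'(s)
    yp = b * np.cos(b * s / R)              # y'(s)
    xpp = -(b**2 / R) * np.cos(b * s / R)   # x''(s)
    ypp = -(b**2 / R) * np.sin(b * s / R)   # y''(s)
    Hs = np.array([xp + np.cos(t) * ypp, yp - np.cos(t) * xpp, 0.0])
    Ht = np.array([-np.sin(t) * yp, np.sin(t) * xp, b * np.cos(t)])
    return np.linalg.norm(np.cross(Hs, Ht))

area, _ = dblquad(integrand, 0, l, 0, 2 * np.pi)
print(area, 4 * np.pi**2 * b * R)  # both approximately 118.435
```

Agreement here would at least confirm that the direct surface-integral setup is correct, even if the symbolic computation is unpleasant for a general curve.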