One general answer is in the paper mentioned in the comment by @EuYu:
I. C. F. Ipsen and R. Rehman, "Perturbation bounds for determinants and characteristic polynomials," SIAM Journal on Matrix Analysis and Applications 30.2 (2008): 762-776.
The following holds for general complex matrices.
If $M$ is the $n\times n$ complex matrix we are interested in and $\hat{M}$ is what we observe, take $E\triangleq \hat{M}-M.$ If $M$ is invertible, then by rearranging Theorem 2.13 in that paper, we have:
$$\left|\det\hat{M}-\det M\right|\leq \left|\det M\right|\cdot \left(\left(\kappa_M\frac{\|E\|_2}{\|M\|_2}+1\right)^n-1\right)$$
and if $M$ is singular and rank $r$, by Corollary 2.11 we have:
$$\left|\det\hat{M}-\det{M}\right|=\left|\det \hat{M}\right|\leq \left\|E\right\|^{n-r}_2\left(\|M\|_2+\|E\|_2\right)^r,$$
where $\|\cdot\|_2$ is the operator norm (largest singular value) and $\kappa_{M}$ is the condition number, the ratio of the largest to the smallest singular value of $M.$ These bounds are tight when the singular vectors of $E$ are aligned with those of $M$.
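As a quick numerical sanity check (my own sketch, not from the paper), here is a small NumPy snippet that evaluates both bounds on randomly generated matrices; the dimensions, perturbation scale, and target rank are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# --- Invertible case: Theorem 2.13, rearranged as above ---
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
E = 1e-3 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
M_hat = M + E

norm2 = lambda A: np.linalg.norm(A, 2)   # operator norm (largest singular value)
kappa = np.linalg.cond(M, 2)             # condition number of M

lhs = abs(np.linalg.det(M_hat) - np.linalg.det(M))
rhs = abs(np.linalg.det(M)) * ((kappa * norm2(E) / norm2(M) + 1) ** n - 1)
print(lhs <= rhs)   # expected: True

# --- Singular case: Corollary 2.11, with rank(M) = r < n ---
r = 4
U = rng.standard_normal((n, r)) + 1j * rng.standard_normal((n, r))
V = rng.standard_normal((r, n)) + 1j * rng.standard_normal((r, n))
M_sing = U @ V                            # rank-r matrix, so det(M_sing) = 0
E2 = 1e-3 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

lhs2 = abs(np.linalg.det(M_sing + E2))    # equals |det(M_hat) - det(M)| here
rhs2 = norm2(E2) ** (n - r) * (norm2(M_sing) + norm2(E2)) ** r
print(lhs2 <= rhs2)  # expected: True
```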
In the OP's setting, take $C$ to be the matrix of the $c_{ij}$'s and let $\odot$ denote the Hadamard (entrywise) product; then $E=M\odot C$. From here, if $\varepsilon<1$, we can bound $\|E\|_2\leq (1+\varepsilon)\cdot\|M\|_2$.
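To make that setup concrete, here is a hedged sketch (my own construction, not from the linked write-up) of the entrywise relative-error model: $C$ has entries of magnitude at most $\varepsilon$, $E=M\odot C$, and the resulting $\|E\|_2$ is plugged into the invertible-case bound above.

```python
import numpy as np

rng = np.random.default_rng(1)
n, eps = 6, 0.01

M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Entrywise relative perturbation: |c_ij| <= eps, E = M ⊙ C (Hadamard product)
C = eps * rng.uniform(-1.0, 1.0, size=(n, n))
E = M * C                # elementwise product in NumPy
M_hat = M + E            # i.e. entries m_ij * (1 + c_ij)

norm2 = lambda A: np.linalg.norm(A, 2)
kappa = np.linalg.cond(M, 2)

# Relative determinant error versus the Theorem 2.13 bound with this E
rel_err = abs(np.linalg.det(M_hat) - np.linalg.det(M)) / abs(np.linalg.det(M))
bound = (kappa * norm2(E) / norm2(M) + 1) ** n - 1
print(rel_err, bound, rel_err <= bound)
```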
A little more work is needed to remove the bounds' dependence on $M$, but this can be done with some simple Frobenius norm inequalities and the specific form of $E$ and $\hat{M}$. I've written up the details here: http://www.public.asu.edu/~cdchapm2/approximating-determinants/index.html.