I'm working a bit with Heyting algebras (which are pseudocomplemented distributive lattices, right?) and I have a question about De Morgan's laws. I know that, in general, $-(X \wedge Y) = -X \vee -Y$ fails in a Heyting algebra. However, it's also the case that, if $-X \vee --X = 1$ holds for every $X$ in such an algebra, then both De Morgan's laws hold, right? Further, we know that topologies give rise to Heyting algebras, since the open sets of a topological space form a pseudocomplemented distributive lattice. Take, then, the usual topology (call it $\mathcal{T}$) on the real line. Define, for $O \in \mathcal{T}$, $-O = \mathrm{int}(\mathbb{R}\setminus O)$; join is union and meet is intersection. It seems that the De Morgan law above fails here: if one considers the intervals $(0, 1)$ and $(2, 3)$, we have
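(For concreteness, here's a rough Python sketch of the pseudocomplement I've been playing with, where an open set of $\mathbb{R}$ is modelled as a sorted list of disjoint open intervals with $\pm\infty$ allowed as endpoints; the representation and the helper name `pseudocomplement` are just my own choices, not anything standard.)

```python
import math

# An open set U ⊆ R is modelled as a sorted list of disjoint, nondegenerate
# open intervals (a, b), with -math.inf / math.inf for unbounded ends.

def pseudocomplement(intervals):
    """Compute -U = int(R \\ U) for U in the representation above.

    The complement of U is a union of separated closed intervals, so
    taking the interior just strips their endpoints and drops any
    degenerate (single-point) pieces.
    """
    result = []
    left = -math.inf
    for a, b in intervals:
        if left < a:  # the gap (left, a) survives taking the interior
            result.append((left, a))
        left = b
    if left < math.inf:
        result.append((left, math.inf))
    return result

U = [(0, 1)]
print(pseudocomplement(U))                     # -U  = (-inf, 0) ∪ (1, inf)
print(pseudocomplement(pseudocomplement(U)))   # --U = (0, 1)
```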
$-((0, 1) \wedge (2, 3)) = \mathrm{int}(\mathbb{R} \setminus ((0, 1) \cap (2, 3)))\\ = \mathrm{int}(\mathbb{R} \setminus \varnothing)\\ = \mathrm{int}(\mathbb{R}) = \mathbb{R}$.
On the other hand:
$-(0, 1) \vee -(2,3) = \mathrm{int}(\mathbb{R} \setminus (0, 1)) \cup \mathrm{int}(\mathbb{R} \setminus (2,3))\\ = \mathrm{int}((-\infty, 0] \cup [1, \infty)) \cup \mathrm{int}((-\infty, 2] \cup [3, \infty))\\ = ((-\infty, 0) \cup (1, \infty)) \cup ((-\infty, 2) \cup (3, \infty))\\ = (-\infty, 0) \cup (1, \infty)$,
which is not equal to $\mathbb{R}$ (it's missing the interval $[0, 1]$).
But then, consider again a set $U \in \mathcal{T}$. By the above definitions, it follows that $-U \vee --U = \mathrm{int}(\mathbb{R} \setminus U) \cup \mathrm{int}\bar{U}$. We know that $U \subseteq \mathrm{int}\bar{U}$ (since $U$ is open and $U \subseteq \bar{U}$), so it follows that $\mathrm{int}(\mathbb{R} \setminus U) \cup U \subseteq \mathrm{int}(\mathbb{R} \setminus U) \cup \mathrm{int}\bar{U}$. However, unless I'm mistaken (and I may very well be; I haven't tried to prove this yet), $\mathrm{int}(\mathbb{R} \setminus U) \cup U = \mathbb{R}$. So $\mathbb{R} \subseteq \mathrm{int}(\mathbb{R} \setminus U) \cup \mathrm{int}\bar{U}$, whence $\mathbb{R} = \mathrm{int}(\mathbb{R} \setminus U) \cup \mathrm{int}\bar{U}$, i.e., $-U \vee --U = 1$.
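(Just to spell out the identity $--U = \mathrm{int}\bar{U}$ I'm relying on here: since $\mathbb{R}\setminus\mathrm{int}(A) = \overline{\mathbb{R}\setminus A}$ for any $A \subseteq \mathbb{R}$, taking $A = \mathbb{R}\setminus U$ gives

$--U = \mathrm{int}(\mathbb{R}\setminus\mathrm{int}(\mathbb{R}\setminus U))\\ = \mathrm{int}(\overline{\mathbb{R}\setminus(\mathbb{R}\setminus U)})\\ = \mathrm{int}\bar{U}$.)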
Hence, we seem to have an example of a pseudocomplemented distributive lattice in which the De Morgan law above fails and yet, for every $U \in \mathcal{T}$, $-U \vee --U = 1$. Obviously, I must have gone wrong somewhere, but I'm not exactly sure where. Can you guys help me locate the source of my mistake?