Any self-verification mechanism included in the executable file could be replaced or removed by an attacker who intercepts the file. Therefore, we must leave signature verification to a system external to the downloaded executable, for example, the operating system.
There are three possible outcomes such a verification system can produce:
- The signature is definitely valid (the signature and data match)
- The signature is definitely not valid (the signature and data do not match)
- There is no signature provided (the executable is not signed)
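The three outcomes above can be sketched as a small verifier. This is a simplified illustration, not a real code-signing implementation: it uses an HMAC over the file contents as a stand-in for a genuine asymmetric signature (a real verifier would use RSA or ECDSA with a trusted public key), but the three-way result is the same.

```python
import hmac
import hashlib
from enum import Enum
from typing import Optional

class Outcome(Enum):
    VALID = "signature and data match"
    INVALID = "signature and data do not match"
    UNSIGNED = "no signature provided"

def verify(data: bytes, signature: Optional[bytes], key: bytes) -> Outcome:
    """Classify an executable into one of the three outcomes.

    HMAC-SHA256 stands in for a real asymmetric signature scheme here,
    purely to keep the example self-contained and runnable.
    """
    if signature is None:
        return Outcome.UNSIGNED                      # case 3: not signed at all
    expected = hmac.new(key, data, hashlib.sha256).digest()
    if hmac.compare_digest(expected, signature):
        return Outcome.VALID                         # case 1: definitely valid
    return Outcome.INVALID                           # case 2: definitely not valid

key = b"demo-signing-key"
exe = b"original executable bytes"
sig = hmac.new(key, exe, hashlib.sha256).digest()

print(verify(exe, sig, key))               # Outcome.VALID
print(verify(b"tampered bytes", sig, key)) # Outcome.INVALID
print(verify(exe, None, key))              # Outcome.UNSIGNED
```

Note that the verifier can only distinguish these three cases; it cannot say anything about why a signature is missing, which is exactly the gap discussed below.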
In Windows, you may have seen pop-up confirmation windows like this when you try to run an executable:

Here, the executable has been signed and Windows has both
- verified the signature on the executable, and
- verified the identity of the public verifying key's owner, using a chain of trust based on one of the operating system's trusted certificate authorities.
We know with cryptographic certainty that an entity named WinZip Computing signed this executable, exactly as it exists right now on our hard drive. That's case #1, above.
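The chain-of-trust step can be sketched as follows. This is a deliberately simplified model: the `Cert` class and the issuer-name match are hypothetical stand-ins, since in a real PKI each link in the chain is verified cryptographically (the issuer's public key checks the signature on the subject's certificate), and the chain must end at a certificate authority in the operating system's trust store.

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class Cert:
    subject: str   # who this certificate identifies
    issuer: str    # who signed (vouched for) this certificate

# The operating system's built-in trust store (simplified to names).
TRUSTED_ROOTS = {"OS Root CA"}

def chain_is_trusted(chain: List[Cert]) -> bool:
    """Walk the chain from leaf to root: each certificate must be
    issued by the next one, and the final issuer must be a trusted
    root. A simple name match stands in for signature verification."""
    for cert, issuer in zip(chain, chain[1:]):
        if cert.issuer != issuer.subject:
            return False
    return chain[-1].issuer in TRUSTED_ROOTS

leaf = Cert("WinZip Computing", "Intermediate CA")
intermediate = Cert("Intermediate CA", "OS Root CA")
print(chain_is_trusted([leaf, intermediate]))          # True
print(chain_is_trusted([Cert("Evil Corp", "Nobody")])) # False
```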
That's how a security-conscious user can verify that something is definitely unmodified since its original publication. Similarly, the operating system can detect case #2 very easily: it attempts to verify the signature and finds a mismatch instead of a match. In that case, the operating system can sternly tell the user not to run the software because it is incredibly untrustworthy.
However, you want to know how to prevent a user from running something that has undergone arbitrary modifications. It's easy for an attacker to generate case #3 (no signature): simply replace your signed executable with an unsigned modified executable.

In this case, it's up to the user (or a very strict policy on the operating system) to decide never to run any software that does not have a signature. The user must expect a signature for your software and be alarmed by its absence (or otherwise always expect the worst and be deeply mistrustful of any unsigned software). There is no general way to distinguish between software that "should" have a signature but doesn't and software that simply was never given a signature at all (e.g. because the publisher just couldn't be bothered to do it). An attacker cannot fake a modified executable so that it matches your signature, but they can produce a modified executable with no signature at all.
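The signature-stripping attack, and why only a strict policy defeats it, can be demonstrated concretely. As before, this is a self-contained sketch using HMAC as a stand-in for a real asymmetric signature; the point is that the stripped package comes back as "unsigned" rather than "invalid", so a policy that only rejects invalid signatures lets it through.

```python
import hmac
import hashlib

key = b"publisher-signing-key"
original = b"original app bytes"

# What the publisher ships: executable plus its signature.
signed_package = (original, hmac.new(key, original, hashlib.sha256).digest())

# The attacker cannot forge a signature that matches the modified
# bytes, but can ship a modified executable with no signature at all.
stripped_package = (b"malicious app bytes", None)

def status(package):
    data, sig = package
    if sig is None:
        return "unsigned"
    expected = hmac.new(key, data, hashlib.sha256).digest()
    return "valid" if hmac.compare_digest(sig, expected) else "invalid"

def strict_policy_allows(package):
    # Only a definitely-valid signature may run; "unsigned" is
    # treated as just as suspect as "invalid".
    return status(package) == "valid"

print(status(signed_package), strict_policy_allows(signed_package))     # valid True
print(status(stripped_package), strict_policy_allows(stripped_package)) # unsigned False
```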
Of course, if your transmission method provides integrity (as TLS does), you can be sure that the executable that arrives on the client's computer is the same as the one sent from the server. If you control the server and distribute using TLS (e.g., through HTTPS), you (the distributor) can be sure that your clients are receiving your original software. If you also sign your software, security-conscious users can be sure they are running your original software, but this does nothing for security-unaware users, who don't understand the significance of a cryptographic signature.
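As a concrete note on the TLS point: Python's standard library enforces both certificate verification and hostname checking by default, which is what gives the download its integrity and authenticity guarantees. The download URL below is a hypothetical placeholder.

```python
import ssl

# A default SSL context already requires a valid certificate chain
# (back to the platform's trusted CAs) and checks the hostname.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True

# Hypothetical usage (not executed here; example.com is a placeholder):
# import urllib.request
# with urllib.request.urlopen("https://example.com/app.exe", context=ctx) as r:
#     executable_bytes = r.read()
```

If either check were disabled, an active attacker could substitute a modified executable in transit, which is exactly the scenario signing and TLS are meant to rule out.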