If you just want an upper bound (approximation):
Run `fsutil fsinfo ntfsinfo C:` from an elevated command prompt, read the `Mft Valid Data Length` value, and divide it by 1024 (the default NTFS file-record size; the same output shows `Bytes Per FileRecord Segment` if your volume uses something else).
Note that this number is roughly the "high-water mark" for the number of files and folders the given filesystem has ever had to keep track of at once. For example, if you fill your drive with 2 million tiny temp files and subsequently delete them all, `Mft Valid Data Length` won't shrink, so the estimate will overshoot the actual count by about 2 million.
If you need an exact value, there are faster ways, but you'd have to program something yourself (I don't know of any existing program that does precisely this). You'd read the `$MFT` file and parse it by hand, then figure out which records are file entries and which are not. Because building the hierarchy bottom-up uses only the MFT and nothing else, it's dramatically faster than Windows's "top-down" directory walk, but it's in no way easy.
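A minimal sketch of the idea, leaning on the documented `FSCTL_GET_NTFS_VOLUME_DATA` and `FSCTL_GET_NTFS_FILE_RECORD` control codes instead of raw `$MFT` reads. The `0x16`/`0x20` offsets come from the standard NTFS file-record layout; treat this as an outline, not a finished tool:

```cpp
// Count in-use, non-directory MFT records on C:.
// Must run elevated, since opening the volume handle requires admin rights.
#include <windows.h>
#include <winioctl.h>
#include <cstdio>
#include <vector>

int main() {
    HANDLE vol = CreateFileW(L"\\\\.\\C:", GENERIC_READ,
                             FILE_SHARE_READ | FILE_SHARE_WRITE,
                             nullptr, OPEN_EXISTING, 0, nullptr);
    if (vol == INVALID_HANDLE_VALUE) { fprintf(stderr, "open failed (admin?)\n"); return 1; }

    NTFS_VOLUME_DATA_BUFFER vdb;
    DWORD got = 0;
    if (!DeviceIoControl(vol, FSCTL_GET_NTFS_VOLUME_DATA, nullptr, 0,
                         &vdb, sizeof vdb, &got, nullptr)) return 1;

    const LONGLONG recSize = vdb.BytesPerFileRecordSegment;  // typically 1024
    const LONGLONG total   = vdb.MftValidDataLength.QuadPart / recSize;

    std::vector<BYTE> buf((size_t)(sizeof(NTFS_FILE_RECORD_OUTPUT_BUFFER) + recSize));
    LONGLONG files = 0, dirs = 0;

    // FSCTL_GET_NTFS_FILE_RECORD returns the nearest in-use record at or
    // BELOW the requested number, so walk top-down and let it skip the gaps.
    for (LONGLONG i = total - 1; i >= 0; ) {
        NTFS_FILE_RECORD_INPUT_BUFFER in;
        in.FileReferenceNumber.QuadPart = i;
        auto *out = reinterpret_cast<NTFS_FILE_RECORD_OUTPUT_BUFFER *>(buf.data());
        if (!DeviceIoControl(vol, FSCTL_GET_NTFS_FILE_RECORD, &in, sizeof in,
                             out, (DWORD)buf.size(), &got, nullptr))
            break;
        const BYTE *rec  = out->FileRecordBuffer;
        WORD flags       = *(const WORD *)(rec + 0x16);     // 0x1 in use, 0x2 directory
        LONGLONG baseRef = *(const LONGLONG *)(rec + 0x20); // nonzero => extension record
        if ((flags & 0x1) && baseRef == 0)
            ((flags & 0x2) ? dirs : files)++;
        i = out->FileReferenceNumber.QuadPart - 1;
    }
    // Caveat: "files" still includes NTFS metadata files ($MFT, $Bitmap, ...).
    printf("files: %lld  dirs: %lld\n", files, dirs);
    CloseHandle(vol);
    return 0;
}
```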
If you're a programmer but want a less painful (though slower) way, you could instead write a program that calls `NtQueryDirectoryFile` to traverse the folders rather than the usual `FindFirstFile`/`FindNextFile` pair. It can be a lot faster, chiefly because a single call can return a whole buffer of entries instead of one at a time, but it's a bit more tricky.
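A minimal sketch of that approach, assuming the `FILE_DIRECTORY_INFORMATION` layout from the DDK's ntifs.h (winternl.h only provides the surrounding types and the `FileDirectoryInformation` enum value):

```cpp
// Recursive file count using NtQueryDirectoryFile with a 64 KB buffer,
// so each system call returns many directory entries at once.
#include <windows.h>
#include <winternl.h>
#include <cstdio>
#include <string>
#include <vector>

typedef struct _FILE_DIRECTORY_INFORMATION {   // layout from ntifs.h
    ULONG NextEntryOffset;
    ULONG FileIndex;
    LARGE_INTEGER CreationTime, LastAccessTime, LastWriteTime, ChangeTime;
    LARGE_INTEGER EndOfFile, AllocationSize;
    ULONG FileAttributes;
    ULONG FileNameLength;                      // in bytes
    WCHAR FileName[1];
} FILE_DIRECTORY_INFORMATION;

typedef NTSTATUS (NTAPI *NtQueryDirectoryFile_t)(
    HANDLE, HANDLE, PIO_APC_ROUTINE, PVOID, PIO_STATUS_BLOCK, PVOID, ULONG,
    FILE_INFORMATION_CLASS, BOOLEAN, PUNICODE_STRING, BOOLEAN);

static NtQueryDirectoryFile_t g_query;
static long long g_files;

static void Walk(std::wstring dir) {
    if (dir.back() != L'\\') dir += L'\\';
    HANDLE h = CreateFileW(dir.c_str(), FILE_LIST_DIRECTORY,
                           FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
                           nullptr, OPEN_EXISTING, FILE_FLAG_BACKUP_SEMANTICS, nullptr);
    if (h == INVALID_HANDLE_VALUE) return;     // access denied, etc.

    std::vector<BYTE> buf(64 * 1024);
    IO_STATUS_BLOCK iosb;
    BOOLEAN restart = TRUE;
    while (g_query(h, nullptr, nullptr, nullptr, &iosb,
                   buf.data(), (ULONG)buf.size(),
                   FileDirectoryInformation, FALSE, nullptr, restart) >= 0) {
        restart = FALSE;
        for (auto *e = (FILE_DIRECTORY_INFORMATION *)buf.data();;
             e = (FILE_DIRECTORY_INFORMATION *)((BYTE *)e + e->NextEntryOffset)) {
            std::wstring name(e->FileName, e->FileNameLength / sizeof(WCHAR));
            if (name != L"." && name != L"..") {
                if (!(e->FileAttributes & FILE_ATTRIBUTE_DIRECTORY))
                    ++g_files;
                else if (!(e->FileAttributes & FILE_ATTRIBUTE_REPARSE_POINT))
                    Walk(dir + name);          // skip junctions/symlinks: see below
            }
            if (e->NextEntryOffset == 0) break;
        }
    }
    CloseHandle(h);
}

int main() {
    g_query = (NtQueryDirectoryFile_t)GetProcAddress(
        GetModuleHandleW(L"ntdll.dll"), "NtQueryDirectoryFile");
    if (!g_query) return 1;
    Walk(L"C:\\");
    printf("files: %lld\n", g_files);
}
```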
Just be aware that the notion of a "file" itself is actually quite slippery. It's entirely possible (and Windows does this itself out of the box) to have multiple hard links to the same file, and each link is just as "real" as any other... do you count that file once or twice?
Or you can have junctions or symbolic links that point to other places... should those be counted or not?
It's not as clear-cut as it might seem at first, so be aware of that.
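If you go the traversal route and want to count a multiply-linked file only once, one option is to key on the volume serial number plus file index from `GetFileInformationByHandle`, which identify the underlying file rather than the name. A sketch (`CountOnce` is a hypothetical helper you'd call per entry; opening every file for its metadata makes the traversal considerably slower):

```cpp
// De-duplicate hard links by (volume serial, file index), which identifies
// the underlying file regardless of which name reached it.
#include <windows.h>
#include <cstdint>
#include <set>
#include <utility>

static std::set<std::pair<DWORD, uint64_t>> g_seen;

// Returns true only the first time this underlying file is encountered.
static bool CountOnce(const wchar_t *path) {
    HANDLE h = CreateFileW(path, 0 /* metadata only */,
                           FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
                           nullptr, OPEN_EXISTING, FILE_FLAG_BACKUP_SEMANTICS, nullptr);
    if (h == INVALID_HANDLE_VALUE) return false;
    BY_HANDLE_FILE_INFORMATION info{};
    bool fresh = false;
    if (GetFileInformationByHandle(h, &info)) {
        // nNumberOfLinks > 1 would tell you other names point here too.
        uint64_t id = ((uint64_t)info.nFileIndexHigh << 32) | info.nFileIndexLow;
        fresh = g_seen.emplace(info.dwVolumeSerialNumber, id).second;
    }
    CloseHandle(h);
    return fresh;
}
```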
Hope that helped.
Edit:
You could run
robocopy /L /E C:\ C:\Temp > "%Temp%\Temp.log"
and then inspect the "Files" row of the summary at the end of the log (`/L` lists what would be copied without actually copying anything; `/E` recurses into every subfolder, empty ones included). :P
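If the log gets unwieldy on a large volume, robocopy's `/NFL`, `/NDL`, and `/NJH` switches (no file list, no directory list, no job header) should strip the output down to little more than that summary table.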
The `dir \ /s /a /w` method took 9 minutes on an SSD with 878,664 files. – Ben Oct 02 '20 at 18:42