
I am looking for a fast way of counting the total number of files - and directories - on any NTFS volume. (In an ad-hoc way. That is, given any random box, not a specially prepared volume.)

Note: This is not about files per directory, just the overall amount of files on the volume.

Currently, the only way I know is to open the root folder of a drive in Windows Explorer, select all elements, right-click, choose Properties, and then wait (and wait) until Explorer has counted all elements.

Is there a better/faster way?

Bobby
Martin
  • Could I ask what the intended use case for this was? I'm considering improving my answer, but it depends very much on what the number was intended to be used for. (The notion of a "file" is itself ambiguous, and whether you want to count something as a file or not depends on what you wanted to do with the result.) – user541686 Mar 13 '19 at 01:22
  • Example scenario: I am running an antivirus full scan that shows me how many files it has scanned so far. I want to know how many files are on the drive so I have an idea of how far along it is. The dir \ /s /a /w method took 9 minutes on an SSD with 878,664 files. – Ben Oct 02 '20 at 18:42

4 Answers


If you just want an upper bound (approximation):

Run fsutil fsinfo ntfsinfo C: at an elevated command prompt, read the Mft Valid Data Length value, and divide it by 1024 (the Bytes Per FileRecord Segment, which is 1 KB on most volumes).

Note that this number is roughly the "high-water mark" for the number of files and folders the given filesystem has ever had to keep track of. For example, if you filled your drive with 2 million tiny temp files and subsequently deleted them, Mft Valid Data Length won't shrink, and the estimate will exceed the actual count by about 2 million.
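Under these assumptions the estimate can be scripted. Here's a minimal Python sketch; the function names are hypothetical, and it assumes recent fsutil versions print the value in hex (older ones used decimal, which is also handled) and that the command is run from an elevated prompt:

```python
import re
import subprocess

def parse_mft_valid_data_length(fsutil_output):
    """Extract the 'Mft Valid Data Length' value, in bytes, from the
    output of `fsutil fsinfo ntfsinfo`. Accepts hex (0x...) and
    comma-grouped decimal forms."""
    match = re.search(
        r"Mft Valid Data Length\s*:\s*(0x[0-9A-Fa-f]+|[\d,]+)", fsutil_output
    )
    if match is None:
        raise ValueError("'Mft Valid Data Length' not found in fsutil output")
    raw = match.group(1)
    return int(raw, 16) if raw.startswith("0x") else int(raw.replace(",", ""))

def estimate_entry_count(volume="C:"):
    """Upper-bound estimate of files + directories on an NTFS volume:
    MFT valid data length divided by the 1 KiB file-record size used
    on most volumes. Requires an elevated prompt (Windows only)."""
    output = subprocess.check_output(
        ["fsutil", "fsinfo", "ntfsinfo", volume], text=True
    )
    return parse_mft_valid_data_length(output) // 1024
```

If your volume reports a different Bytes Per FileRecord Segment, divide by that value instead of 1024.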

If you need an exact value, there are faster ways, but you'd need to program something (I don't know of any existing programs that do precisely this): read the $MFT file and parse it by hand, then figure out which records are file entries and which are not. This is dramatically faster than Windows's "top-down" approach (because building the hierarchy bottom-up uses only the MFT and nothing else), but it's by no means easy.

If you're a programmer but want a less painful (although slower) way, you could also write a program that calls NtQueryDirectoryFile to traverse the folders instead of the usual FindFirstFile/FindNextFile functions. It can be a lot faster, but it's a bit trickier.
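For comparison, the ordinary top-down traversal that both of those approaches are trying to beat looks roughly like this. This is only an illustrative Python sketch using os.scandir, not the Win32 or native NT APIs the answer mentions:

```python
import os

def count_entries(root):
    """Count files and directories under root with a plain iterative
    traversal: the slow 'top-down' baseline, which must open and
    enumerate every directory rather than reading the MFT once."""
    files = dirs = 0
    stack = [root]
    while stack:
        path = stack.pop()
        try:
            with os.scandir(path) as it:
                for entry in it:
                    if entry.is_dir(follow_symlinks=False):
                        dirs += 1
                        stack.append(entry.path)
                    else:
                        files += 1
        except OSError:
            pass  # permission denied, directory vanished, etc.
    return files, dirs
```

Every directory open is a separate round trip into the filesystem, which is exactly why this approach gets slow on volumes with many directories.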


Just be aware that the notion of a "file" is itself actually quite tricky. It's quite possible (and Windows even does this by default) to have multiple hardlinks to the same file, each of them just as "real" as any other file... do you count that file once, or once per link?

Or you can have junctions or symbolic links that point to other places... should those be counted or not?

It's not as clear-cut a process as it might seem at first, so be aware of that.
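If you decide that hardlinked files should be counted once and links skipped entirely, one way to express that policy is to key each file on its (device, inode) pair. A sketch, assuming a Python where st_ino is populated (on NTFS it corresponds to the MFT file reference number):

```python
import os
import stat

def count_unique_files(root):
    """Count regular files under root, counting a file with several
    hardlinks only once and not counting symlinks as files.
    Identity is the (device, inode) pair."""
    seen = set()
    for dirpath, _dirnames, filenames in os.walk(root, followlinks=False):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.lstat(path)
            except OSError:
                continue  # vanished or inaccessible entry
            if stat.S_ISLNK(st.st_mode):
                continue  # skip the link itself; its target is found elsewhere
            seen.add((st.st_dev, st.st_ino))
    return len(seen)
```

Changing either `continue` is exactly the policy decision the answer is describing: count links, or don't.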

Hope that helps.


Edit

You could run

 robocopy /L /E C:\ C:\Temp > "%Temp%\Temp.log"

and then inspect the "Files" statistic that's shown. :P
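The Dirs/Files totals can also be pulled out of that log programmatically. A sketch with a hypothetical helper, assuming the English-locale summary table robocopy prints (the row labels differ in other locales):

```python
import re

def parse_robocopy_totals(log_text):
    """Extract the 'Total' column of the Dirs and Files rows from a
    robocopy summary, e.g. {'dirs': 24, 'files': 103}."""
    totals = {}
    for label in ("Dirs", "Files"):
        match = re.search(rf"^\s*{label}\s*:\s*(\d+)", log_text, re.MULTILINE)
        if match:
            totals[label.lower()] = int(match.group(1))
    return totals
```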

mrm
user541686
  • I believe with file system limitations, this is more valid than the currently highest rated answer, as this interfaces directly with the file system subsystem, not just the presented results of dir. Anyone else have an opinion? Also, isn't it best to state: (Mft Valid Data Length)/(Bytes per FileRecord segment)? – brandeded Sep 26 '12 at 13:31
  • It seems that Valid Data Length should be considered as... In NTFS, there are two important concepts of file length: the end-of-file (EOF) marker and the Valid Data Length (VDL). The EOF indicates the actual length of the file. The VDL identifies the length of valid data on disk. Any reads between VDL and EOF automatically return 0 in order to preserve the C2 object reuse requirement. (source) – brandeded Sep 26 '12 at 13:55
  • If EOF markers mark the end of the file, the VDL size should be smaller than the size of the file. But this doesn't align itself at all to the dir method marked as the answer (in a test, I see a difference of 1573080 between the two methods, with (Mft Valid Data Length)/(Bytes per FileRecord segment) being the higher of the two). As in, if we were to consider the total length of data on a disk to be greater than Mft Valid Data Length, then this number would be even greater, and much less reflective of the true file count. – brandeded Sep 26 '12 at 14:08
  • fwiw, at least on win10, fsutil requires admin, and the result isn't correct to even several orders of magnitude. I have a volume with ~50k files, and my Mft Valid Data Length / 1024 is 2.8 million. – mrm Mar 12 '19 at 00:06
  • @mrm: it just means at some point you had 2.8 million files on that volume. – user541686 Mar 12 '19 at 00:12
  • Ah, that totally makes sense. @Mehrdad thanks for the answer and the explanation! I edited your answer to include that (and will upvote it as soon as I'm allowed). – mrm Mar 13 '19 at 01:10

At the DOS prompt, type the following command:

 dir \ /s /a /w

(The "/s" switch enables a recursive search on all sub-directories {with most Unix utilities this is usually the "-R" switch}, the "/a" switch counts all files regardless of Attributes, and the "/w" displays multiple entries on a single line so that the report will finish a little bit faster. Change "\" to the desired path you wish to begin from; for a different drive letter, such as drive D:, change it to "D:\" to search that drive.)

Once finished, you'll be returned to the prompt, and the last two lines of output will show the total number of files and the total number of directories. If you're looking for the total number of filename entries, just add those two numbers together; otherwise the total number of files is all you'll need.
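If you'd rather capture those two counts from a script than read them off the console, the summary lines can be parsed. A sketch assuming the English-locale "n File(s)" / "n Dir(s)" format; the recursive listing prints per-directory summaries too, so only the last match of each pattern is the grand total:

```python
import re

def parse_dir_summary(dir_output):
    """Extract the final file and directory counts from `dir /s`
    output. Takes the last 'File(s)' and 'Dir(s)' matches, which
    belong to the 'Total Files Listed' summary at the end."""
    files = re.findall(r"([\d,]+)\s+File\(s\)", dir_output)
    dirs = re.findall(r"([\d,]+)\s+Dir\(s\)", dir_output)
    if not files or not dirs:
        raise ValueError("no summary lines found in dir output")
    return int(files[-1].replace(",", "")), int(dirs[-1].replace(",", ""))
```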

  • Thanks! This worked very nicely and seems a lot better than the flaky explorer dialog. Also, it seemed quite fast (enough). – Martin Apr 20 '11 at 10:09
  • @Martin: A tool that I find particularly useful (and a major time saver for file system administration), which I suspect you probably will too, is the free and open source FAR Manager, which was heavily inspired by Norton Commander for DOS. It's a native 32-bit/64-bit Windows application that actually uses text mode (no graphics), and has some features in it that will also count files and directories (press CTRL-Q to count totals for the highlighted/selected files/directories): http://www.farmanager.com/ – Randolf Richardson Apr 21 '11 at 01:57
  • @Randolf: Awesome! :-D Fond memories of "better" days. I'll have to try that with PsExec! – Martin Apr 21 '11 at 06:38

The simple answer is: download Everything. It counts for you, super fast, and also does near-realtime searches. It is freeware and does not contain any extra crapola.

cc0

A more accurate way is:

dir /a /b /s \ | find /c /v ""

This lists all files (/a) recursively (/s) in bare format (/b, one entry per line), then uses the find command to count the lines of output.

Bare format doesn't display per-file statistics, so it's faster, and redirecting stdout to another program avoids the time spent printing unnecessary information to the console.
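The same pipeline can be reproduced from a script. A sketch where count_nonempty_lines stands in for find /c /v "" (bare dir output has no blank lines, so skipping empties makes no difference here); the dir invocation itself is Windows-only and the function names are made up for illustration:

```python
import subprocess

def count_nonempty_lines(lines):
    """What `find /c /v ""` is being used for here: count the lines
    of output, one per file or directory."""
    return sum(1 for line in lines if line.strip())

def count_dir_entries(root="C:\\"):
    """Windows-only: run `dir /a /b /s` on root and count the lines
    it emits, streaming rather than buffering the whole listing."""
    proc = subprocess.Popen(
        ["cmd", "/c", "dir", "/a", "/b", "/s", root],
        stdout=subprocess.PIPE,
        text=True,
    )
    try:
        return count_nonempty_lines(proc.stdout)
    finally:
        proc.wait()
```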

force