Why is a date from the 1600s possible?
Windows does not store file modification timestamps like Unix systems do. According to the Windows Dev Center (emphasis mine):
> A file time is a 64-bit value that represents the number of **100-nanosecond intervals** that have elapsed since 12:00 A.M. January 1, 1601 Coordinated Universal Time (UTC). The system records file times when applications create, access, and write to files.
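As a rough illustration, here's a small Python sketch (standard library only) of how such a tick count maps to a calendar date; tick 0 is exactly the 1601 epoch:

```python
from datetime import datetime, timedelta, timezone

# Windows file-time epoch: 1601-01-01 00:00:00 UTC
WINDOWS_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_datetime(ticks: int) -> datetime:
    """Convert a Windows file time (100-ns ticks since 1601) to a datetime."""
    # One tick = 100 ns = 0.1 µs, so divide the tick count by 10
    # to get the microseconds that timedelta expects.
    return WINDOWS_EPOCH + timedelta(microseconds=ticks // 10)

print(filetime_to_datetime(0))  # 1601-01-01 00:00:00+00:00
```

Any small (or slightly wrong) tick value therefore decodes to a date in the early 1600s.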
So, by setting a wrong value here, you can easily get dates from the 1600s.
Of course, another important question is: how was this value set, and what was the actual date? You will probably never find out, as it could simply have been a calculation error in the file system driver. Another answer hypothesizes that the date is actually a Unix timestamp interpreted as a Windows timestamp, but the two are counted in different units (seconds vs. 100-nanosecond ticks).
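To see the scale mismatch, here's a quick Python sketch (the timestamp is an arbitrary example value): a Unix timestamp from 2018, misread as 100-nanosecond ticks, lands only a couple of minutes past the 1601 epoch:

```python
from datetime import datetime, timedelta, timezone

WINDOWS_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

unix_ts = 1_522_000_000  # a Unix timestamp from March 2018, in seconds
# Misread as a Windows file time, each "second" becomes one 100-ns tick:
as_filetime = WINDOWS_EPOCH + timedelta(microseconds=unix_ts // 10)
print(as_filetime)  # 1601-01-01 00:02:32.200000+00:00
```

So that confusion would indeed produce a 1600s date, though always one within minutes of January 1, 1601.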
How does this relate to the Year 2038 problem?
The use of a 64-bit data type means that Windows (generally) is not affected by the Year 2038 Problem that traditional Unix systems have: Unix initially used a 32-bit integer, which overflows much sooner than Windows' 64-bit integer, even though Unix counts seconds and Windows counts 100-nanosecond intervals.
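A small Python sketch of that overflow, using `struct` to emulate a signed 32-bit `time_t`:

```python
import struct
from datetime import datetime, timezone

# The largest value a signed 32-bit time_t can hold:
max_32bit = 2**31 - 1
print(datetime.fromtimestamp(max_32bit, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later, a signed 32-bit counter wraps to a large negative
# number, i.e. a date back before the 1970 epoch:
wrapped = struct.unpack("<i", struct.pack("<I", max_32bit + 1))[0]
print(wrapped)  # -2147483648
```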
Windows is still affected when using 32-bit programs that were compiled with old versions of Visual Studio, of course.
Newer Unix operating systems have already expanded the data type to 64 bits, thus avoiding the issue. (In fact, since Unix timestamps operate in seconds, the new wraparound date will be 292 billion years from now.)
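A back-of-the-envelope check of that figure in Python (using the average Gregorian year; the exact wraparound date depends on the calendar assumptions):

```python
SECONDS_PER_YEAR = 365.2425 * 86400  # average Gregorian year, in seconds

# A signed 64-bit time_t counting seconds overflows after this many years:
years = (2**63 - 1) / SECONDS_PER_YEAR
print(f"{years:.3e}")  # roughly 2.9e11, i.e. ~292 billion years
```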
What is the maximum date that can be set?
For the curious ones – here's how to calculate that:
- The maximum value of a signed 64-bit integer is 2^63 − 1 = 9223372036854775807.
- Each tick represents 100 nanoseconds, which is 0.1 µs or 0.0000001 s.
- The maximum time range would therefore be 9223372036854775807 ⨉ 0.0000001 s ≈ 922337203685 s, i.e. hundreds of billions of seconds.
- One hour has 3600 seconds, one day has 86400 seconds, and one year has 365 days, so there are 86400 ⨉ 365 s = 31536000 s in a year. This is, of course, only an average, ignoring leap years, leap seconds, or any calendar changes that future postapocalyptic regimes might dictate on the remaining earthlings.
- 9223372036854775807 ⨉ 0.0000001 s / 31536000 s ≈ 29247 years
- @corsiKa explains how we can subtract the leap years: 29247 / 365 / 4 ≈ 20.
- So your maximum year is 1601 + 29247 – 20 = 30828.
Some folks have actually tried to set this and came up with the same year.
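The whole calculation above collapses into a few lines of Python; using the average Gregorian year of 365.2425 days bakes the leap-year correction in directly:

```python
TICKS = 2**63 - 1                  # maximum signed 64-bit value
SECONDS = TICKS * 1e-7             # each tick is 100 ns
GREGORIAN_YEAR = 365.2425 * 86400  # average year incl. leap-year rule

max_year = 1601 + int(SECONDS / GREGORIAN_YEAR)
print(max_year)  # 30828
```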
Comments:

- "… the `clock_gettime(2)` man page for a definition of `struct timespec`, which is what `stat` and other system calls use to pass timestamps between user space and the kernel. It's a struct with a `time_t` in seconds, and a `long tv_nsec` in nanoseconds. On 64-bit systems, both are 64-bit, so the whole timestamp is 128 bits (16 bytes). (Sorry for too much detail, I got carried away.)" – Peter Cordes Mar 27 '18 at 16:03
- "… `time_t` now; in hindsight I probably shouldn't even have brought up the year 2038 issue." – slhck Mar 27 '18 at 16:14
- "… `touch -d "20 Feb 1641" file`. It's occasionally useful when testing code like build systems or source repos which use timestamps to determine some behavior." – Karl Bielefeldt Mar 29 '18 at 14:04