
I'm writing a program at the moment (on Windows 10, using Microsoft C++ 2019), where a debug build generates about a hundred megabytes of symbol files; this is going to increase as the amount of code becomes larger. Considering a debug build is a very frequent event, this could actually make a significant dent in the lifespan of the SSD.

It technically makes no sense for this to happen; these are throwaway files, and I have 32 gigabytes of RAM, so they could simply be kept in RAM and never written to disk. The ideal would be a way to tell Windows: make these throwaway files appear in the file system for software compatibility, but don't actually write them to disk; just keep them in RAM. Is there a way to do anything like that? I've never heard of such a feature. The closest thing I know of is writeback caching, which wouldn't actually help, because the files still get written, just slightly later; it assumes the problem you are trying to solve is speed, but that's not the issue here.

"Minimize writes to SSD disks with Windows 7" discusses the issue with other kinds of data, but most of the answers seem to be "... put X on a different partition". That sounds like the people writing the answers were assuming the machine contains a spinning disk as well as an SSD. I did see one suggesting plugging in an external SSD, presumably to act as, well, a sacrificial SSD; I can see that working in principle, but it seems suboptimal to treat expensive, high-performance components as sacrificial to a process that shouldn't be happening in the first place.

Is a RAM disk a necessary and sufficient solution? Do those still keep their contents in physical RAM?

rwallace
  • "Considering a debug build is a very frequent event, this could actually make a significant dent in the lifespan of the SSD" Not really. And considering the lifespan of current SSDs is the same or better than their mechanical counterparts, it's a moot point. Such concerns made (some) sense more than a decade ago, coincidentally the vintage of the Q&A you linked, and even then, in hindsight, there was excessive concern and lots of clickbaity FUD. – ChanganAuto Oct 10 '21 at 13:06
    Write to the SSD, you will replace your PC before your SSD – Keltari Oct 10 '21 at 19:34

1 Answer


A hundred megabytes of symbol files is a lot. You are probably using lots of system DLLs/assemblies and downloading their symbol files, and/or you have set Visual Studio to gather debug information at a very high level of detail.

Unless you want to trace through Microsoft's software, you can limit the symbol files to your own software only.

In Options > Debugging > Symbols, the default option is "Automatically load symbols for all modules unless excluded".

Use instead the option of "Only specified modules", described as:

This setting by default will load symbols that are next to the binary on the disk, but will not try to load symbols for any other modules unless you add them to the include list by clicking “Specify modules”.
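Separately, if the goal is to keep the compiler's own PDB output off the SSD entirely, one approach is to point the PDB at a RAM disk. This is only a sketch: it assumes a RAM disk is already mounted as drive R: (the drive letter and the ayane.pdb name are illustrative), and it relies on the documented /Fd option, which names the program database file that /Zi produces:

```shell
rem Assumes a RAM disk is mounted as R: (e.g. created with a tool such as ImDisk).
rem /Zi emits debug info into a PDB; /Fd names the file (and location) that PDB goes to.
cl /DDEBUG /EHsc /Zi /FdR:\build\ayane.pdb /std:c++17 *.cc
```

Note that the PDB is then lost on reboot, which matches the "throwaway" use case described in the question, but means crash dumps from older builds can no longer be symbolized.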

harrymc
  • I'm actually using the command line compiler with some homebrew code to print a stack trace on crash; the command line for debug build in full is cl /DDEBUG /EHsc /Feayane /I\mpir /MP /MTd /O2 /WX /Zi /std:c++17 *.cc \mpir\debug.lib dbghelp.lib. The most relevant option is /Zi; it doesn't jump out at me that it offers that much granularity... – rwallace Oct 10 '21 at 21:20
  • But it looks to me from the description on the page you linked, that it is not talking about the generation of symbol files at all, but about memory used by loading them into the debugger? – rwallace Oct 10 '21 at 21:20
  • The symbol files are downloaded to disk. The above setting will not delete files that were already downloaded; they need to be deleted manually. – harrymc Oct 11 '21 at 07:21