I apologise as I feel like I should know this off the top of my head. I will try to explain what I believe I know and then ask a simple question.
The usual file format used in software-defined radios should be just raw numbers in binary, representing the in-phase and quadrature values of the sampled waveform. Those values are often 32-bit floats (as in GNU Radio's raw file format), so every 2*32 = 64 bits is one complex sample of the waveform at that instant. When you play the file back, the DAC reproduces the waveform described by those IQ values at whatever rate you ask for. If your sample rate is 10 MHz, the DAC consumes the IQ pairs 10 million times a second and produces the corresponding waveform.
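To make that concrete, here is a minimal sketch of that layout, assuming interleaved 32-bit floats (the GNU Radio complex-float convention); the file name tone.iq and the 1 MHz tone are just made up for illustration:

```python
import numpy as np

SAMPLE_RATE = 10e6   # 10 MHz, as in the example above
N = 1000             # number of complex samples

# A 1 MHz complex tone: each sample is one (I, Q) pair of 32-bit floats.
t = np.arange(N) / SAMPLE_RATE
iq = np.exp(2j * np.pi * 1e6 * t).astype(np.complex64)

# On disk this is just interleaved floats: I0, Q0, I1, Q1, ...
iq.tofile("tone.iq")

# Reading it back: 2 * 32 bits = 8 bytes per complex sample.
raw = np.fromfile("tone.iq", dtype=np.float32)
restored = raw[0::2] + 1j * raw[1::2]   # re-pair I and Q
assert np.allclose(restored, iq)
```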
The only way to entirely corrupt this file format, then, would be to cut 32 bits (a single I or Q value) somewhere, as that would swap I and Q for the rest of the file. If you remove any multiple of 2*32 bits from the middle of the file, you would only lose those samples (and maybe disturb the one right before/after the cut), but the alignment and integrity of the file as a whole would not be compromised.
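Here is a rough sketch of what I mean, reusing the hypothetical tone.iq file from above: dropping a single 32-bit value shifts the alignment so I and Q swap from that point on, while dropping a whole 64-bit sample keeps the pairing intact:

```python
import numpy as np

raw = np.fromfile("tone.iq", dtype=np.float32)   # interleaved I0, Q0, I1, Q1, ...

# Case 1: drop a single 32-bit value (here, the I of sample 100, at index 200).
# From that point on every float lands in the wrong slot: what was written as Q
# is read back as I, and vice versa.
cut_one = np.delete(raw, 200)
cut_one = cut_one[: 2 * (len(cut_one) // 2)]     # drop the dangling last value
misaligned = cut_one[0::2] + 1j * cut_one[1::2]

# Case 2: drop a whole 2*32-bit sample (I and Q of sample 100 together).
# The pairing stays correct; you only lose that one sample, leaving a single
# discontinuity rather than corrupting everything after it.
cut_pair = np.delete(raw, [200, 201])
still_aligned = cut_pair[0::2] + 1j * cut_pair[1::2]
```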
I'm asking this because I'm interfacing with the USB driver of an SDR and I'm confused about what is actually going on.