I'm currently plotting high sample rate data by down-sampling by a fixed factor and averaging each block. Most of the time this works OK, but I'm aware that at very low zoom levels (zoomed right out) the averaging hides vast amounts of data.
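For context, what I'm doing now is essentially block averaging, roughly like this (a minimal sketch; the function name and the NumPy implementation are just illustrative, not my actual code):

    import numpy as np

    def downsample_average(timestamps, values, factor):
        """Downsample by averaging each block of `factor` consecutive samples.

        Returns the mean timestamp and mean value of each block, dropping any
        leftover samples that don't fill a complete block.
        """
        n = (len(values) // factor) * factor  # trim to whole blocks
        t_blocks = np.asarray(timestamps[:n], dtype=float).reshape(-1, factor)
        v_blocks = np.asarray(values[:n], dtype=float).reshape(-1, factor)
        return t_blocks.mean(axis=1), v_blocks.mean(axis=1)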
I have a timestamp for every data point, so are there any algorithms out there that will remove the pointless or 'less useful' points? I've done a bit of Googling and found papers related to time-series data compression, but that's not really what I'm after. I'm looking for something that takes advantage of the particular properties of plotting, i.e. if you have 10 consecutive points with the same value, the intermediate ones can effectively be removed without losing any detail in the visual representation.
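To illustrate that trivial equal-values case, this is roughly the kind of reduction I have in mind (a hypothetical sketch, not something I'm running): keep the first and last point of each constant run and drop the interior points, so the drawn line is unchanged.

    import numpy as np

    def drop_redundant_points(timestamps, values):
        """Remove interior points of runs of identical consecutive values.

        Keeping only the first and last point of each constant run leaves the
        plotted line visually identical while discarding the rest.
        """
        t = np.asarray(timestamps)
        v = np.asarray(values)
        if len(v) < 3:
            return t, v
        # A point is redundant if it equals both of its neighbours.
        redundant = (v[1:-1] == v[:-2]) & (v[1:-1] == v[2:])
        keep = np.ones(len(v), dtype=bool)
        keep[1:-1] = ~redundant
        return t[keep], v[keep]

Obviously that only handles exactly-equal runs; what I'm really after is something more general that decides which points matter visually.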