This SampleProcessor downsamples a stream of sample values so that the
number of outputs sent to the SampleConsumer is outputCount, which is less
than sampleCount. It works by keeping a history of the scanned samples that
represent at least one output sample, and choosing which of those scanned
values to output:
The rules for sample generation are:
- No averaging: the sample returned is _always_ one of the scanned
  samples.
- The output sample is always either the largest or the smallest of the
  scanned sample values.
- Whether it is the largest or smallest depends on the "trend" of the
samples:
- If they are generally high-to-low then we output the low value.
- If they are generally low-to-high then we output the high value.
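The rules above can be sketched as follows. This is a minimal illustration,
not the real implementation: it assumes fixed-size buckets over a complete
list of samples rather than a streaming history, judges the "trend" of a
bucket by comparing its first and last values, and the class and method
names (PeakPreservingDownsampler, downsample) are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the peak-preserving downsampling rules described
// above; names and bucketing strategy are illustrative assumptions.
public class PeakPreservingDownsampler {
    // Reduce `samples` to `outputCount` values. Each output is the min or
    // max of its bucket, never an average, chosen by the bucket's trend.
    public static List<Double> downsample(List<Double> samples, int outputCount) {
        List<Double> out = new ArrayList<>();
        int n = samples.size();
        for (int i = 0; i < outputCount; i++) {
            // Evenly sized buckets; each bucket holds at least one sample.
            int start = i * n / outputCount;
            int end = Math.max(start + 1, (i + 1) * n / outputCount);
            double min = samples.get(start);
            double max = samples.get(start);
            for (int j = start; j < end; j++) {
                double v = samples.get(j);
                min = Math.min(min, v);
                max = Math.max(max, v);
            }
            // Trend heuristic: a generally low-to-high bucket keeps the high
            // value, a generally high-to-low bucket keeps the low value.
            boolean rising = samples.get(end - 1) >= samples.get(start);
            out.add(rising ? max : min);
        }
        return out;
    }
}
```

For example, downsampling [0, 5, 2, 3, 9, 8] to two outputs yields [5, 9]:
both buckets trend upward, so the peak of each bucket survives, whereas an
average would have flattened the 5 and the 9 away.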
The rationale for these rules is that the most interesting information is
the peaks and valleys of the measurements, and averaging is bad because it
destroys peaks and valleys. A consequence of these rules is that quantities
that bounce around a lot will generate graphs that are a solid band between
the min and max values, but that is an accurate reflection of the state; to
get more information, you have to look at shorter time intervals. The class
tries hard to make good choices about which sample to output.
Of course this sort of crude downsampling isn't perfect, but at least it
doesn't destroy peaks and valleys.
TODO: Figure out whether the time passed to SampleConsumer should be the
time of the sample or the midpoint between the times of the first and last
samples.