I have some trouble with the definitions of queuing and buffering. What is the difference between buffering and queuing?
Can you provide some citations to how they are used in ways in which you think they mean different things? Also, this doesn't really sound signal-processing related. – Jason R Jul 27 '17 at 23:07
it's not specifically signal processing related unless we get very broad with the definition of a signal. but the question is relevant to DSP programming, particularly real-time DSP programming. – robert bristow-johnson Jul 28 '17 at 00:00
3 Answers
I think that, especially in the context of real-time DSP, the two terms really refer to the same thing.
Perhaps the most general Wikipedia reference about buffers would include both first-in-first-out (FIFO) and last-in-first-out (LIFO) buffers, but a LIFO buffer is usually called a "stack".
If it's a FIFO buffer, we usually call it a "queue", and, particularly when input and output to the buffer are asynchronous and controlled by an independent source and destination, there is a corner of the science we call "queueing theory" that worries about the size of the FIFO buffer, where the input and output pointers are, and what needs to be done to avoid "buffer starvation" (I think I am not misusing the term).
As far as real-time DSP with uniform sampling is concerned, the only thing related to this topic that I worry about is simply double buffering, of which I can't find a good on-line reference right away. By FIFO-buffering both the input and the output of a DSP process, one can write more efficient code in which the processing block time is the time it takes to fill or empty the buffer (i.e. $B$ times the sample time, where $B$ is the number of samples in the buffer block). At the beginning of that block time the process is guaranteed to have $B$ valid samples waiting at the input, and the process need not guarantee the correctness of any of the output samples until the very end of the block period. This saves processing overhead by loading states, coefficients, and other parameters just once (rather than $B$ times, once for each sample) and storing the states just once, so the cost of that overhead is amortized over the $B$ samples. The price paid is a delay of $2B$ samples.
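To make the double-buffering bookkeeping concrete, here is a minimal block-processing sketch in C. It is only an illustration (a one-pole lowpass with made-up names like `dsp_state_t` and `process_block`, and an arbitrary block size of 64), not code from any particular system:

```c
/* Minimal sketch of block (double-buffered) processing: a one-pole lowpass
 * run over a block of B samples, with the filter state and coefficient
 * loaded once and the state stored once per block instead of once per
 * sample.  All names here are made up for illustration. */

#define B 64                       /* block size in samples            */

typedef struct {
    float a;                       /* feedback coefficient             */
    float y1;                      /* previous output (filter state)   */
} dsp_state_t;

/* Called once per block period: the input buffer already holds B valid
 * samples, and the output buffer only has to be complete by the end of
 * the block period. */
void process_block(dsp_state_t *s, const float *in, float *out)
{
    float a  = s->a;               /* load coefficient once            */
    float y1 = s->y1;              /* load state once                  */

    for (int n = 0; n < B; n++) {  /* per-sample work is just the math */
        y1 = a * y1 + (1.0f - a) * in[n];
        out[n] = y1;
    }

    s->y1 = y1;                    /* store state once                 */
}
```

While `process_block()` runs on one pair of buffers, the I/O side (an ISR or DMA engine, say) is filling the other input buffer and draining the other output buffer; that is where the $2B$-sample delay comes from.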
A queue can be any of a number of structures, such as a priority queue, sometimes called a heap (see the sketch below). Queues can order their elements by arrival or by magnitude, and they are efficient data structures. I've seen them used in multi-target tracking assignment routines.
A buffer is also a queue, and buffers typically serve in asynchronous data transfer with control mechanisms such as semaphores.
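As a rough sketch of the priority-queue idea mentioned above (my own illustration, not any particular library's implementation), here is a minimal binary min-heap in C: elements leave in order of key magnitude rather than order of arrival, which is exactly what distinguishes it from a plain FIFO buffer. The names `pq_t`, `pq_push`, and `pq_pop` and the fixed capacity are made up for the example:

```c
/* Minimal binary min-heap used as a priority queue: elements leave in
 * order of magnitude (smallest key first), not in order of arrival.
 * Illustrative only: no bounds checking, fixed capacity. */

#define PQ_MAX 256

typedef struct {
    float key[PQ_MAX];
    int   n;
} pq_t;

static void pq_push(pq_t *q, float k)
{
    int i = q->n++;
    q->key[i] = k;
    while (i > 0) {                         /* sift up */
        int p = (i - 1) / 2;
        if (q->key[p] <= q->key[i]) break;
        float t = q->key[p]; q->key[p] = q->key[i]; q->key[i] = t;
        i = p;
    }
}

static float pq_pop(pq_t *q)                /* assumes q->n > 0 */
{
    float top = q->key[0];
    q->key[0] = q->key[--q->n];
    int i = 0;
    for (;;) {                              /* sift down */
        int l = 2 * i + 1, r = l + 1, m = i;
        if (l < q->n && q->key[l] < q->key[m]) m = l;
        if (r < q->n && q->key[r] < q->key[m]) m = r;
        if (m == i) break;
        float t = q->key[m]; q->key[m] = q->key[i]; q->key[i] = t;
        i = m;
    }
    return top;
}
```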
A buffer is a storage device for storing digital data. The mental image is that of a (long) shift register with data entering at one end and exiting at the other end, and indeed, once upon a time, buffers were actually built that way with discrete components such as vacuum tubes and transistors configured as flip-flops etc. Modern implementations eschew the notion of a shift register with bits chugging along from input to output (and the flip-flops changing states) because of the unnecessary energy expenditure, and simply put the data into a RAM unit where each bit stays in its allotted cell until such time as it is read out: the only flip-flops that change state are those in the address decoder for the RAM. But nonetheless, the notion of a buffer is that of a FIFO device.
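A rough C sketch of that "RAM plus two indices" FIFO might look like the following (illustrative only: no overflow or underflow handling, and `fifo_t` and its size are made-up names):

```c
/* FIFO implemented as a circular buffer in RAM: each sample stays in its
 * allotted cell; only the read and write indices advance (the modern
 * alternative to physically shifting data through a register chain). */

#define FIFO_SIZE 1024

typedef struct {
    unsigned short mem[FIFO_SIZE];
    unsigned int   wr;     /* write index (input end)  */
    unsigned int   rd;     /* read index  (output end) */
} fifo_t;

static void fifo_write(fifo_t *f, unsigned short x)
{
    f->mem[f->wr] = x;
    f->wr = (f->wr + 1) % FIFO_SIZE;
}

static unsigned short fifo_read(fifo_t *f)
{
    unsigned short x = f->mem[f->rd];
    f->rd = (f->rd + 1) % FIFO_SIZE;
    return x;
}
```

Each sample stays in its memory cell from the moment it is written until the moment it is read; only the two indices move.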
Examples of buffers are those associated with decoders for error-correcting codes. A modern-day decoder for, say, a Reed-Solomon or BCH code is usually working simultaneously on three consecutive received words. The most recent received word (word #N) is still not received completely: as the symbols arrive one by one, they enter the syndrome generator unit and are simultaneously stored in a buffer. The previous received word (word #(N-1)) is already in the buffer (with its syndrome already computed), and the syndrome is currently being worked on by the Berlekamp-Massey algorithm or the Euclidean algorithm to figure out the error-locator and error-evaluator polynomials. The word received prior to that (word #(N-2)) is being read out of the buffer and the errors in it are being corrected "on the fly" as each symbol leaves the buffer: that's the whole point of the Chien search and the Forney algorithm, which process the error-locator polynomial and error-evaluator polynomial respectively to figure out whether the symbol leaving the buffer needs to be corrected (and if so, to what), or left alone because there is no error in the symbol. So, a FIFO device capable of storing $2n$ symbols (two received words) of an $[n,k]$ code is needed as a buffer.
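For illustration only, here is a skeletal sketch (in C) of that three-word pipeline. The functions `update_syndromes()` and `correct_on_the_fly()` are placeholders that just pass data through, so this shows only the role of the $2n$-symbol FIFO, not a working Reed-Solomon decoder:

```c
/* Skeletal model of the three-word decoder pipeline: the buffer holds two
 * received words, and each arriving symbol pushes one corrected symbol out. */

#define N_CODE 255                      /* [n,k] code length, e.g. n = 255     */

typedef struct {
    unsigned char buf[2 * N_CODE];      /* holds two received words            */
    unsigned int  wr, rd;
} word_buffer_t;

/* placeholder: a real decoder folds the symbol into the running syndromes   */
static void update_syndromes(unsigned char s) { (void)s; }

/* placeholder: Chien search + Forney algorithm would decide whether the
 * departing symbol is in error and, if so, return the corrected value       */
static unsigned char correct_on_the_fly(unsigned char s) { return s; }

/* called once per received symbol, at the symbol rate */
unsigned char decoder_step(word_buffer_t *b, unsigned char sym_in)
{
    /* word #(N-2): symbol leaving the buffer, corrected as it leaves */
    unsigned char sym_out = correct_on_the_fly(b->buf[b->rd]);
    b->rd = (b->rd + 1) % (2 * N_CODE);

    /* word #N: symbol arriving, stored and folded into the syndromes;
       word #(N-1) sits in the middle of the buffer while Berlekamp-Massey
       (or Euclid) works on its already-complete syndrome                  */
    b->buf[b->wr] = sym_in;
    b->wr = (b->wr + 1) % (2 * N_CODE);
    update_syndromes(sym_in);

    return sym_out;
}
```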
Other uses of buffers are for changing data rates, when the input and output use different clock rates. An example is at the input to an encoder for an error-correcting code: in one codeword time, $k$ symbols enter the encoder and $n > k$ symbols exit it. This is accomplished by reading $k$ symbols into one of two buffers at the slow clock rate while the other buffer is being emptied into the encoder at the higher clock rate. Closer to the dsp.SE heart, consider a polyphase interpolation filter used for upsampling. The input is at the original rate and is held in a buffer. During the time for one slow sample, $L$ interpolated samples are created as different linear combinations of the contents of the buffer, and spit out at the $L$ times higher rate of the output samples.
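A minimal sketch of such a polyphase interpolator, in C, might look like this; the upsampling factor `L`, the branch length `TAPS`, and the coefficient array `h` are placeholders rather than a real filter design:

```c
/* One slow-rate input sample in, L fast-rate output samples out: the buffer
 * (delay line) holds the most recent TAPS input samples, and each polyphase
 * branch forms a different linear combination of that same buffer. */

#define L    4                          /* upsampling factor (placeholder)    */
#define TAPS 8                          /* taps per polyphase branch          */

/* h[m][k]: k-th coefficient of the m-th polyphase branch of the prototype
 * lowpass filter; real coefficient values omitted here                      */
static const float h[L][TAPS] = {{0.0f}};

typedef struct {
    float buf[TAPS];                    /* most recent TAPS input samples     */
} interp_t;

void interpolate(interp_t *st, float x_in, float y_out[L])
{
    /* shift the new slow-rate sample into the delay-line buffer */
    for (int k = TAPS - 1; k > 0; k--)
        st->buf[k] = st->buf[k - 1];
    st->buf[0] = x_in;

    /* each of the L outputs is a different linear combination of the buffer */
    for (int m = 0; m < L; m++) {
        float acc = 0.0f;
        for (int k = 0; k < TAPS; k++)
            acc += h[m][k] * st->buf[k];
        y_out[m] = acc;
    }
}
```

For every call, one slow-rate sample goes into the buffer and $L$ fast-rate samples come out, each formed from the same buffer contents but through a different polyphase branch of the prototype lowpass filter.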
A queue, on the other hand, can be a much more complicated data structure in which data can enter and exit the device in a different order, and the ordering can change on the fly depending on various conditions (as opposed to, say, a periodic interleaver, in which the data enter and leave with a fixed permutation applied to them). See the other answers for some details.