#include <DelaymUG.h>
DelayUG and DelaymUG both delay their input signal by some number of samples before producing it at their outputs. They require a SynthData object to store the delayed signal. They differ in that DelayUG will accept any SynthData object, while DelaymUG requires a SynthData object allocated as "moduls", using the Orchestra method allocModulsSynthData:. DelaymUG is much more computationally efficient than DelayUG.
Each DelayUG maintains a single pointer into the delay memory. When the object is run, a tick's worth of samples is read and replaced with an equal number of samples from the input signal. The pointer is then incremented by a tick. When the pointer reaches the end of the delay memory, it automatically jumps back to the beginning, even if it's in the middle of a tick; in other words, the length of the delay memory needn't be a multiple of the tick size. The rate at which the pointer is incremented can't be modified, nor can you offset the beginning of the delay memory. However, you can reposition the pointer to any arbitrary sample in the delay memory through the setPointer: method.
DelayUGabc
    a    output
    b    input
    c    delay memory