#import <DelayUG.h>
Public Member Functions

- (id) setInput:
    Sets the input patchpoint to aPatchPoint.
- (id) setOutput:
    Sets the output patchpoint to aPatchPoint.
- (id) setDelayMemory:
    Sets the SynthData object used as the delay memory to aSynthData.
- (id) adjustLength:
    Sets the number of delayed samples to newDelayLength.
- (id) setPointer:
    Repositions the pointer to the n'th sample in the delay memory, counting from sample 0.
- (id) resetPointer
    Resets the pointer to the beginning of the delay memory.
- (int) length
    Returns the number of samples in the delay memory.
- (id) runSelf
    Subclass implementation of this method provides instructions for making the object's DSP code usable (as defined by the subclass).
- (id) idleSelf
    You never send this message.

Static Public Member Functions

+ (BOOL) shouldOptimize:
    Specifies that all arguments are to be optimized if possible, except the delay pointer.
DelayUG and DelaymUG both delay their input signal by some number of samples before producing it at the output. Both require a SynthData object to store the delayed signal. They differ in that DelayUG accepts any SynthData object, while DelaymUG requires a SynthData object allocated as "modulus" memory, using the Orchestra method allocModulusSynthData:. DelaymUG is much more computationally efficient than DelayUG.
Each DelayUG maintains a single pointer into the delay memory. When the object is run, a tick's worth of samples is read from the delay memory and replaced with an equal number of samples from the input signal. The pointer is then incremented by a tick. When the pointer reaches the end of the delay memory, it automatically jumps back to the beginning, even if this happens in the middle of a tick; in other words, the length of the delay memory needn't be a multiple of the tick size. The rate at which the pointer is incremented can't be modified, nor can you offset the beginning of the delay memory. However, you can reposition the pointer to any arbitrary sample in the delay memory through the setPointer: method.
The memory spaces of a DelayUG's arguments are given by the letters in the leaf class name, DelayUG<a><b><c>:

    a    output
    b    input
    c    delay memory
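The fragment below sketches a typical setup: a quarter-second delay at 44.1 kHz. The leaf class name DelayUGxxy, the MKOrchestra allocation methods (allocUnitGenerator:, allocSynthData:length:, allocPatchpoint:), and the memory-segment constants are assumptions drawn from the general MusicKit Orchestra API rather than from this page; error handling is omitted.

    #import <MusicKit/MusicKit.h>    // assumed umbrella header
    #import <DelayUG.h>

    MKOrchestra *orch = [MKOrchestra new];
    [orch open];

    // Per the naming convention above: output in x memory, input in x
    // memory, delay memory in y memory (a hypothetical leaf class).
    id delay = [orch allocUnitGenerator:[DelayUGxxy class]];

    // 0.25 s at 44.1 kHz = 11025 samples of delay memory.
    id delayMem = [orch allocSynthData:MK_yData length:11025];

    id inPP  = [orch allocPatchpoint:MK_xPatch];
    id outPP = [orch allocPatchpoint:MK_xPatch];

    [delay setInput:inPP];
    [delay setOutput:outPP];
    [delay setDelayMemory:delayMem];
    [delay adjustLength:11025];      // use the entire delay memory
    [delay run];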
+ (BOOL) shouldOptimize:(unsigned)arg
Specifies that all arguments are to be optimized if possible, except the delay pointer.
Parameters: arg is an unsigned.
Reimplemented from MKUnitGenerator.
- (id) setInput:(id)aPatchPoint
Sets the input patchpoint to aPatchPoint.
Parameters: aPatchPoint is an id.
- (id) setOutput:(id)aPatchPoint
Sets the output patchpoint to aPatchPoint.
Parameters: aPatchPoint is an id.
- (id) setDelayMemory:(id)aSynthData
Sets the SynthData object used as the delay memory to aSynthData.
The length of the SynthData must be greater than or equal to the amount of delay (in samples) that's desired. If aSynthData is nil, the delay memory is set to the sink location. For DelaymUG, aSynthData must be allocated as "modulus" memory.
Parameters: aSynthData is an id.
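As an illustrative fragment (reusing orch and delay from the sketch above; the allocModulusSynthData:length: signature and the DelaymUG instance delaym are assumptions, the former extrapolated from allocSynthData:length:):

    // DelayUG accepts ordinary SynthData.
    id mem = [orch allocSynthData:MK_yData length:1024];
    [delay setDelayMemory:mem];

    // DelaymUG requires modulus memory (assumed signature).
    id modMem = [orch allocModulusSynthData:MK_yData length:1024];
    [delaym setDelayMemory:modMem];

    // nil routes the delay memory to the sink location.
    [delay setDelayMemory:nil];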
- (id) adjustLength:(int)newDelayLength
Sets the number of delayed samples to newDelayLength.
The argument must be no greater than the length of the SynthData object that's used as the delay memory.
Parameters: newDelayLength is an int.
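For example, to change the delay at run time while respecting that bound (delayMem is the SynthData from the sketch above; SynthData's length method is assumed):

    // 5 ms at 44.1 kHz is 44100 * 0.005 = 220.5, truncated to 220 samples.
    int newLength = 220;
    if (newLength > [delayMem length])    // never exceed the delay memory
        newLength = [delayMem length];
    [delay adjustLength:newLength];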
- (id) setPointer:(int)offset
Repositions the pointer to the offset'th sample in the delay memory, counting from sample 0.
Parameters: offset is an int.
- (id) resetPointer
Resets the pointer to the beginning of the delay memory.
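Per the two descriptions above, these calls should be equivalent:

    [delay setPointer:0];    // reposition to sample 0
    [delay resetPointer];    // back to the beginning of the delay memory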
- (int) length
Returns the number of samples in the delay memory.
- (id) runSelf
Subclass implementation of this method provides instructions for making the object's DSP code usable (as defined by the subclass).
You never invoke this method directly; it's invoked automatically by the run method. The default implementation does nothing and returns the receiver.
Reimplemented from MKUnitGenerator.
- (id) idleSelf
You never send this message.
Reimplemented from MKUnitGenerator.