When the Winamp/DSP plug-in is set to accept 32-bit input but is fed something different, e.g. 16-bit audio, what happens with respect to bit depth?
- Does the host or ST convert the 16-bit audio to 32 bit? Is there perhaps even a second conversion from integer to float inside ST?
- What bit depth does ST output? (I can choose different output bit depths in the host, but again I don't know whether ST or the host does that conversion.)
The idea is to minimize the number of conversions, assuming mostly 24-bit input and occasionally 16-bit.