So the next possibility was FIR filters. To get close to a brickwall response, such a filter needs a very high order, which means a very long impulse response. For example, with 3000 taps, the filter output is only valid after the 3000th input sample. So we need to run the filter over 13 repetitions of the original 256-sample loop (13 * 256 = 3328 samples), then keep only the last 256 filtered float values.
On the following screenshot, figure 2 shows the 2nd bandlimited filter and figure 3 the 7th (the last).
As you can imagine, figure 2 works fine, but the figure 3 filter gives us the same kind of issues we had with biquads: its slope is too far from a brickwall, so we keep way too many harmonics that fold back as aliasing, even if we change the cut-off frequency. A higher order could help, but 3000 taps is already enormous for the Owl. After normalizing and shifting the cut-off frequencies, we get a little less aliasing than with biquads, but still way too much on harmonic-rich waveforms.
That led me to a different way of manipulating harmonics. Going through the frequency domain, it is far more efficient to simply cancel every harmonic we don't want: we just need a Fourier transform followed by an inverse Fourier transform. So I used Rebel Technology's C++ Fourier classes. Unsurprisingly, the result is much closer to Earlevel's. After testing a few waveforms, only the really complex ones still alias at high frequencies. A solution would be to filter and store a table every 5 or 6 semitones instead of every octave, but that would take a lot more memory.
In order to do some morphing, I kept the same storage structure. I store X oscillators, each containing Y waveform samples; morph X selects the oscillator while morph Y selects the waveform. Crossfading between them is done with linear interpolation, so there are always two oscillators and two waveforms running. This means the output is a mix of four waveforms (four different sounds).
An important thing to remember is that the final patch stores X * Y * 7 bandlimited 256-sample wavetables.