Input scalar=-6.4383 offset=-0.0021
Output scalar=5.0293 offset=-0.0034
This gives the expected v/oct tracking. I've set output gain to max, while input gain is set to 92 and there's currently no way to change it in the UI. It looks like not using the maximum output gain was giving me incorrect results. I'm also thinking about exposing input gain in the volume menu, so the full range can be used for CV; that shouldn't take much time to get working.
I don't quite understand why their signs are opposite: writing a value to the DAC and reading it back from the ADC gives me the opposite sample sign when normalized as a float. But after calibration the voltage comes out as expected, not inverted. I've seen mentions of an inverted ADC in the source code, but it was referring to other devices, so I'm not sure whether this is expected for the Magus. I'm using the following code for the conversion (it should be the same thing that happens in the SampleBuffer class):
Output = (int32_t)(current_sample * 2147483648.0f) >> 8; // float [-1, 1) -> 24-bit codec word
Input = (float)(int32_t)((pv->audio_input) << 8) / 2147483648.0f; // 24-bit codec word -> float [-1, 1)
I will be opening a PR soon, but I wanted to make one more minor change to the current code first: averaging over multiple samples during measurement, which should give slightly more precise results.
And I'll rebase onto the latest commit and test with it. When I tried that a few weeks ago, I had some issues compiling: it looked like a CubeMX upgrade had broken something related to USB audio device initialization, so I used the commit from the stable release with the older CubeMX, which worked fine. Did you run the latest firmware on a Magus after your December updates?