Just got the tailpipe sniffer project fired up with both the SLC Free 2 and some new SpeedHut AFR gauges.
With the sensors in free air, the digital is reading 19.9 AFR and the gauge is reading 19.5. I don't have a stable way to check it at another fixed value (mid-range). I looked at the voltage at the output of the SLC2 (I added a jack to monitor) and I'm seeing 4.765V and 4.753V with both 4.9 sensors in free air. The gauges, as I understand they should be calibrated, read 20 AFR at 5.0V on the SpeedHuts. I swapped which gauge is on which controller and the little difference moved from one to the other, so I think the gauges are reading correctly; it's the SLC's digital that differs from the analog output.
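For reference, here's the volts-to-AFR math I'm using. This is a sketch assuming the SpeedHut scale is linear from 10.0 AFR at 0V up to 20.0 AFR at 5.0V; the 5.0V = 20 AFR point is from the gauge, but the 0V = 10 AFR endpoint is my assumption:

```python
# Linear volts-to-AFR mapping. The 5 V = 20 AFR point comes from the gauge;
# the 0 V = 10 AFR endpoint is assumed for illustration.
def afr_from_volts(v, v_lo=0.0, afr_lo=10.0, v_hi=5.0, afr_hi=20.0):
    return afr_lo + (v - v_lo) * (afr_hi - afr_lo) / (v_hi - v_lo)

print(round(afr_from_volts(4.765), 2))  # ~19.53, close to the 19.5 on the gauge
print(round(afr_from_volts(4.753), 2))  # ~19.51
# The digital's 19.9 would correspond to about 4.95 V on this scale,
# so the analog line is sitting roughly 0.19 V low.
```

On that scale the gauges appear to be tracking the analog voltage correctly, and the discrepancy with the digital works out to roughly 0.19V.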
Is this caused by the heater offset on the analog line, like the mod you have pinned at the top of the forum? I have fairly recent boards, and I'm not sure if that mod will fix or help the offset I'm seeing.
Everything on the ground side of the wiring is star-grounded to the battery negative; the SLC's grounds are thick and fairly short, if that matters. Supply voltage is running at 13.1V, and the gauge and the battery both show the same (the battery has Bluetooth monitoring, "biometrics" as it might be called…).
Trying to figure out if or how this will affect the 14.7 AFR reading at mid-scale, or whether the fix is the mod that's posted.
Link to video on my Google Drive: SLC2-Videos - Google Drive
Wiring is still in process; I'll be cleaning it up a bit and securing it.
I also ran it for 2 hours with about 65% of battery life left, so there should be plenty for testing without much worry about needing a charge.

