The other day, a friend on IRC was looking for a way to calibrate a sensor he had on his raspi’s I2C bus. He explained that the sensor gave a millivolt reading from an ADC, that the millivolt reading was linear in the measured parameter, and that his existing code was incomplete (no calibration) and thus produced readings with a considerable offset at the lower end of the scale compared to a reference sensor. So he was wondering how to do high/low point calibration in software.
He had another requirement: he wanted to map the calibrated millivolt reading from the sensor onto a zero-based ppm scale in a linear fashion.
It took me a while to figure something out, but here it is:
The idea is to provide a ratio-based mapping between the two scales (millivolts from the sensor to a value on the target scale) while referencing the zero point of the target scale to the low-point calibration millivolt value. The targetCalHigh variable is the target-scale value at the calibration high point (the known ppm value of the high-point calibration solution). Because targetCalLow is zero (he calibrates the low point with a 0 ppm solution), it drops out of the calculation. The targetValue is the calibrated reading, mapped to whatever zero-based scale targetCalHigh refers to.
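A minimal sketch of that mapping in Python (function and parameter names are mine, chosen to match the targetCalHigh/targetValue naming above; the millivolt calibration values are placeholders):

```python
def calibrate(mv_reading, mv_cal_low, mv_cal_high, target_cal_high):
    """Map a raw millivolt reading onto a zero-based target scale (e.g. ppm).

    mv_cal_low      -- millivolt reading taken in the 0 ppm (low-point) solution
    mv_cal_high     -- millivolt reading taken in the high-point solution
    target_cal_high -- known ppm value of the high-point calibration solution

    Because the low-point solution is 0 ppm, targetCalLow is zero and
    drops out: the result is just the ratio of the offset-corrected
    reading to the calibrated span, scaled to the target value.
    """
    return (mv_reading - mv_cal_low) / (mv_cal_high - mv_cal_low) * target_cal_high


# Example with made-up calibration numbers: 500 mV in the 0 ppm solution,
# 2500 mV in a 1000 ppm solution. A reading halfway between the two
# calibration points then maps to the middle of the target scale.
print(calibrate(500, 500, 2500, 1000))   # low calibration point -> 0.0
print(calibrate(2500, 500, 2500, 1000))  # high calibration point -> 1000.0
print(calibrate(1500, 500, 2500, 1000))  # halfway -> 500.0
```

Note that the low-point reading is subtracted from the live reading first, which is what removes the offset he was seeing at the bottom of the scale.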
He did some verification of the formula with different calibration solutions (0 ppm, 550 ppm, 1000 ppm and 2200 ppm) and reported that his raspi-based probe now provides accurate readings compared to the same reference meter, so the problem appears to be solved.
Feel free to use my math as you see fit.