Comfort Automation / Security System Forums > Products > Temperature Sensor > Sensor drift over time?
Moderated by: admin
leavewei (Member)
I was reading about NTC self-heating and wondering whether thermal runaway is something that could really affect a measurement: not just an offset, but an actual drift in the resistance. So I tried to work out what happens in the usual divider circuit, with the NTC in series with a reference resistor Rref across the supply. The power absorbed by the thermistor, P = V^2 * R_NTC / (R_NTC + Rref)^2, is maximum when it has the same resistance as Rref, by the maximum power transfer theorem. Plotting this absorbed power versus temperature (i.e. resistance), I get a curve that peaks at 25°C, where of course both resistances have the same value. Now, for temperatures above 25°C, I can see that there is some negative feedback that keeps the sensor from drifting: as temperature increases, the dissipated power decreases, so the temperature drops back, and I expect it to settle at a balance point with a smaller offset than it would have without this feedback. On the other hand, for temperatures below 25°C, as the sensor heats itself up, the dissipated power increases, heating it further, until it arrives at the stationary point at 25°C. If those assumptions are correct, how is this taken into account in the design? Is this effect damped out somehow? I think this question also applies to some "fixed" resistors with a significant parasitic thermal dependence.
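To check this numerically, here is a quick sketch (my own assumed numbers, not from any datasheet): a Beta-model NTC with R25 = 10 kOhm and Beta = 3950 K, in a divider with Rref = 10 kOhm from a 3.3 V supply.

```python
# Quick numerical check of the argument above (assumed component values).
import numpy as np

R25, BETA, T25 = 10e3, 3950.0, 298.15   # nominal resistance, Beta, 25°C in K
RREF, VCC = 10e3, 3.3                   # divider reference resistor and supply

def r_ntc(t_c):
    """NTC resistance at t_c (°C) from the Beta model."""
    t_k = t_c + 273.15
    return R25 * np.exp(BETA * (1.0 / t_k - 1.0 / T25))

def p_ntc(t_c):
    """Power dissipated in the NTC inside the divider (W)."""
    r = r_ntc(t_c)
    return VCC**2 * r / (r + RREF)**2

for t in (0, 10, 25, 40, 60):
    print(f"{t:3d} °C  R = {r_ntc(t)/1e3:6.2f} kOhm  P = {p_ntc(t)*1e3:.3f} mW")
```

The printout shows the absorbed power peaking at 25°C (where R_NTC = Rref) and falling off on both sides, which is exactly the shape the maximum power transfer theorem predicts.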
slychiu (Administrator)
Thanks for the analysis. The effect is not significant unless you are looking for high precision, and there are compensation techniques in use to handle it. Do you have the TSM to test?
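For example, one common compensation (just a sketch with assumed component values, not necessarily what the TSM firmware does) is to estimate the self-heating rise from the thermistor's dissipation constant and subtract it from the reading:

```python
import math

# Assumed parts: 10 kOhm NTC (Beta 3950) with Rref = 10 kOhm from 3.3 V, and a
# dissipation constant of ~1.5 mW/K in still air (a typical datasheet figure).
R25, BETA, RREF, VCC, DELTA = 10e3, 3950.0, 10e3, 3.3, 1.5e-3

def corrected_temperature(t_raw_c):
    """Subtract the estimated first-order self-heating rise (°C)."""
    t_k = t_raw_c + 273.15
    r = R25 * math.exp(BETA * (1.0 / t_k - 1.0 / 298.15))  # Beta model
    p = VCC**2 * r / (r + RREF)**2       # power dissipated in the NTC
    return t_raw_c - p / DELTA           # correction: dT = P / delta

print(corrected_temperature(25.0))       # ~24.8 °C: about 0.18 K of self-heat
```

Sampling the divider only briefly, or using a larger Rref, reduces the dissipated power and makes the residual error smaller still.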