Most modern AVR MCUs have an on-chip temperature sensor; however, neither avr-libc nor Arduino provides a simple way to read it. I'm building wireless nodes that I want to be able to sense temperature. In addition to the ATtiny88s I'm currently using, I want to be able to use other AVRs like the ATmega328. With that in mind, I decided to write a small library to read the on-chip temperature sensor.
I found a couple of people who had already done some work with the on-chip temperature sensor. Connor tested the ATmega32U4, and Albert tested the ATmega328. As can be seen from their code, each AVR seems to have a slightly different way of setting up the ADC to read the temperature. Neither the MUX bits nor the reference selection is consistent across different parts. For example, on the ATtiny88 the internal voltage reference is selected by clearing the ADMUX REFS0 bit, while on the ATmega328 it is selected by setting both REFS0 and REFS1.
One way of writing code that compiles on different MCUs is to use #ifdef statements based on the type of MCU. For example, when compiling for the ATmega328, avr-gcc defines __AVR_ATmega328__, and when compiling for the ATmega168 it defines __AVR_ATmega168__. Both MCUs are in the same family (along with the ATmega48 & ATmega88), and therefore have the same ADC settings. Facing the prospect of a big list of #ifdef statements, I decided to look for a simpler way to code the ADC settings.
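The per-MCU #ifdef approach would look something like this (a sketch only; the register values are my reading of the datasheet bits, shown to illustrate how the list grows with every new family):

```c
// Per-MCU #ifdef sketch: one branch per compiler-defined MCU macro.
#if defined(__AVR_ATmega48__) || defined(__AVR_ATmega88__) || \
    defined(__AVR_ATmega168__) || defined(__AVR_ATmega328__)
#define ADCINPUT 0xC8   // REFS1|REFS0|MUX3: internal 1.1V ref, temp channel
#elif defined(__AVR_ATtiny88__)
#define ADCINPUT 0x08   // REFS0 clear, MUX3: internal ref, temp channel
#else
#error unsupported MCU  // ...and another #elif for every other family
#endif
```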
I looked through the avr-libc headers in the include/avr directory. Although there are no definitions for the MUX settings of the various ADC inputs (i.e. ADC8 for temperature measurement on the ATtiny88), there are definitions for the individual reference and mux bits. After comparing the datasheets, I came up with the following code to define the ADC input for temperature measurement:
#if defined(REFS1) && !defined(REFS2) && !defined(MUX4)
// m48 family
#define ADCINPUT (1<<REFS0) | (1<<REFS1) | (1<<MUX3)
#elif !defined(REFS1) && !defined(MUX4)
// tiny88
#define ADCINPUT (0<<REFS0) | (1<<MUX3)
#elif defined(REFS2) && !defined(MUX4)
// tinyx5 0x0f = MUX0-3
#define ADCINPUT (0<<REFS0) | (1<<REFS1) | (0x0f)
#elif defined(MUX5)
// tinyx4 ADC8 = MUX5 + MUX1
#define ADCINPUT (0<<REFS0) | (1<<REFS1) | (1<<MUX5) | (1<<MUX1)
#else
#error unsupported MCU
#endif
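With ADCINPUT defined, starting a conversion looks the same on every supported part. A rough sketch (my assumption of the usage, not the library's exact code):

```c
#include <avr/io.h>
#include <stdint.h>

uint16_t read_adc(void)
{
    ADMUX = ADCINPUT;                 // internal reference + temp channel
    ADCSRA |= (1<<ADEN) | (1<<ADSC);  // enable ADC, start a conversion
    while (ADCSRA & (1<<ADSC));       // busy-wait until it completes
    return ADC;                       // 10-bit result, roughly in Kelvin
}
```

Note the first reading after switching references tends to be garbage, which is why a sample gets thrown away during calibration below.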
From previous experiments I had done with the ATtiny85, I knew that the ADC temperature input is quite noisy, with consecutive readings often varying by a few degrees. The datasheets suggest ADC noise reduction sleep mode as one way to reduce noise, which would require enabling interrupts and writing an empty ADC interrupt handler. I decided averaging over a number of samples would be an easier way.
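For reference, the noise-reduction route would look roughly like this (a sketch based on the datasheet description, not code from my library):

```c
#include <avr/interrupt.h>
#include <avr/io.h>
#include <avr/sleep.h>
#include <stdint.h>

// An empty handler so the ADC-complete interrupt can wake the CPU
EMPTY_INTERRUPT(ADC_vect);

uint16_t read_adc_quiet(void)
{
    ADCSRA |= (1<<ADEN) | (1<<ADIE);  // enable ADC + conversion-complete IRQ
    set_sleep_mode(SLEEP_MODE_ADC);
    sei();
    sleep_mode();   // entering ADC noise reduction sleep starts the
                    // conversion; the ADC interrupt wakes us when done
    return ADC;
}
```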
I don't want my library to take up a lot of code space, so I needed to be careful with how I do the math. Douglas Jones wrote a great analysis of doing efficient math on small CPUs. Taking an average requires adding a number of samples and then dividing. Correcting for the ADC gain error requires dividing by a floating-point number such as 1.06, which would be very slow to do at runtime. Dividing a 16-bit number by 256, however, is very fast on an AVR: avr-gcc just takes the high 8 bits. I could do the floating-point divide at compile time by making the number of samples I add equal to 256 divided by the gain:
#define ADC_GAIN 1.06
#define SAMPLE_COUNT ((256/ADC_GAIN)+0.5)
The ADC value is a 10-bit value representing the approximate temperature in Kelvin. AVRs are only rated for -40C to +85C operation, so a signed 8-bit value representing the temperature in Celsius is more practical. Subtracting 273 from each ADC reading before adding it to the sum is all that is needed for the conversion.
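Putting the averaging and conversion together gives something like the following (a host-testable sketch, not the library's exact code; read_adc() is a stand-in for a real conversion):

```c
#include <stdint.h>

#define ADC_GAIN 1.06
#define SAMPLE_COUNT ((256/ADC_GAIN)+0.5)

// Stand-in for a real ADC read; returns a raw 10-bit reading in Kelvin.
static uint16_t read_adc(void)
{
    return 298;  // pretend the die is at 25C (298K)
}

// Sum SAMPLE_COUNT readings (273 subtracted from each), then divide by
// 256: this both averages and divides out the 1.06 gain, and costs
// avr-gcc only a byte move.
int8_t read_temperature(void)
{
    int16_t sum = 0;
    for (uint8_t i = 0; i < (uint8_t)SAMPLE_COUNT; i++)
        sum += read_adc() - 273;
    return sum / 256;   // 242 samples of 25 -> 6050/256 = 23, i.e. 25/1.06
}
```

Worst case the sum is 242 × 85 = 20570, so it still fits in a signed 16-bit accumulator.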
I think one of the reasons people use external thermistors or I2C temperature-sensing chips instead of the internal AVR temperature sensor is the lack of factory calibration. As explained in Application Note AVR122, uncalibrated readings from an AVR can be off significantly. Without ADC noise reduction mode and running at 16MHz, I have observed results that were off by 50C.
My first thought was to write a calibration program that would be run while the AVR is at a known temperature, writing the temperature offset to EEPROM. Then, when the end application code is flashed, the library would read the offset from EEPROM whenever the temperature is read. A better way would be to run the calibration automatically when the application code is flashed, but how could I do that?
In my post, Trimming the fat from avr-gcc code, I showed how main() isn't actually the first code to run after an AVR is reset. Not only does avr-gcc insert code that runs before main(), it also allows you to add your own. With that technique, I wrote a calibration function that automatically gets run before main():
#include <avr/eeprom.h>

// EEPROM cell holding the calibration offset (0xff when unprogrammed)
uint8_t temp_offset EEMEM;

// temperature at programming time
#define AIR_TEMPERATURE 25

__attribute__ ((section (".init8")))
void calibrate_temp (void)
{
    if ( eeprom_read_byte(&temp_offset) == 0xff) {
        // temperature uncalibrated
        char tempVal = temperature(); // throw away 1st sample
        tempVal = temperature();
        // 0xff == -1 so final offset is reading - AIR_TEMPERATURE -1
        eeprom_write_byte( &temp_offset, (tempVal - AIR_TEMPERATURE) -1);
    }
}
The complete code is available in my Google Code repository. To use it, include temperature.h and call the temperature() function from your code. You'll also have to link in temperature.o, or just use my Makefile, which creates a library containing temperature.o that gets linked with the target code. See test_temperature.c for a basic example program.
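A minimal caller would look something like this (assuming temperature() returns a signed Celsius value, per the discussion above; what you do with the reading is up to your application):

```c
#include <stdint.h>
#include "temperature.h"

int main(void)
{
    int8_t degrees_c = temperature();  // averaged, auto-calibrated reading
    // ... send degrees_c over the wireless link, log it, etc.
    for (;;) ;
}
```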
In my testing with a Pro Mini, the temperature readings were very stable, with no variation between dozens of readings taken one second apart. I also used the ice cube technique (in a plastic bag so the water doesn't drip on the board), and got steady readings of 0C after about 30 seconds.