■ FIGURE 2. Butterfly Back: Safety Pin and Piezo Speaker.
It would be foolish to try to 'overclock' a computer: it might seem to run perfectly until a critical event, then run off crazy and raise the landing gear moments before touchdown. Also, the faster you run a CPU, the more power it uses. The Butterfly uses an ATmega169 that can run at 8 MHz, but what's the hurry? Many AVRs can run off an internal oscillator (cheap, but can be inaccurate) or an external clock (costs extra, but can be much more precise). The Butterfly uses the interesting method of doing both. It has an external watch crystal that runs at 32768 beats per second: way too slow for a computer, but very cheap and accurate. It uses this external crystal to calibrate the internal oscillator to run at 2 MHz. This provides us with a fast and accurate CPU clock, and gives us the opportunity to use the crystal to generate pulses for a real time clock.

Think for a moment about the watch crystal frequency, 32768. Seem weird? Well, remembering our last Workshop on binary numbers, we see that in binary it is:

Binary 1000 0000 0000 0000

But more importantly, 32767 (one beat short of 32768) is:

Binary 0111 1111 1111 1111

If we think electronics, we note that the highest bit changes from 0 to 1 once each second. So, if we hook up a circuit that keeps a binary count (piece of cake) and interrupts our code each time this bit changes from 0 to 1, then we can keep a count of seconds. Our main program will run along merrily doing whatever it does; once each second, we can have the current state stored, run an interrupt handler that adds one second to an external seconds variable, and then restore the main program state, which will resume whatever it was doing, unaware that a second has passed. That is, until it reaches some code that compares the external seconds variable against the local (automatic) seconds variable it keeps, notices that a second has passed, and does whatever it does for that event. For instance, change the seconds value on the Butterfly LCD.

We will simplify our lives a bit by using the library libsmws7.a (see the end of the article for download information), which hides all the timer interrupt and LCD driver stuff. (You are welcome!)

Converting Computer Time To Human Readable Time

We can keep a count of seconds, but what good does it do us if our watch reads 40241? If the count started at midnight, then this number of seconds would indicate that the time is ten minutes and 41 seconds after 11:00 in the morning. So, we are going to need to do some computing to convert the count to something we can read.

BCD — Binary Coded Decimal

Binary Coded Decimal is a coding trick (or 'algorithm' for the more OCD among us) that eases the storage and conversion of binary numbers to decimal digits. Say you have a count of the watch crystal beats in binary and want to display this number on an LCD as human readable decimal digits. Using BCD, we divide an eight-bit byte into two four-bit nibbles and store a single decimal digit (0 to 9) in each nibble. Since 9 is the largest decimal digit and we can store two digits per byte, 99 is the largest decimal value we can store. Yes, using a byte to encode a maximum value of 99 when it could encode 256 distinct values wastes space, but it provides a good way to store human readable decimal digits.

If the decimal number in a byte is 99 or less, we can convert it to a BCD byte using the following algorithm (or 'trick' for the less OCD among us):

Set the initial byte (uint8_t) to some decimal two-digit value:
uint8_t initialByte = 54;
Declare a variable for the upper nibble value:
uint8_t high = 0;
Count the tens in initialByte:
while (initialByte >= 10) {
    initialByte -= 10;
    high++;
}