16-bit signed binary to binary coded decimal (BCD) ASCII

Peter Hansen peter at engcorp.com
Thu Feb 13 23:56:08 CET 2003

Deirdre Hackett wrote:
>
> I want to take in data in binary format from a magnetometer.
> The magnetometer sends out the x,y,and z components of the magnetic field.
> The order of output for the binary format is:
> Xhi, Xlo, Yhi, Ylo, Zhi, Zlo.
> The binary format is more efficient since only 7 bytes are transmitted

First thing that isn't clear: where do the *seven* bytes come from? You
show only six items in that list.  Maybe it's not relevant, but when I
read requirements docs I'm hyper-sensitive to anything that doesn't seem
to fit...

> as opposed to BCD ascii where 28 bytes are transmitted.

How do you calculate this 28?

> After all that long-winded babble, my question is how can i convert
> the 16-bit binary information into BCD ascii, which is easier for user
> interpretation.

Could you please explain by example exactly what you mean by "BCD ASCII"?
This is not, at least in my experience, a completely well-defined technical
term.  (Examples with actual binary would be best, to avoid confusion.)

ASCII refers to a standard for encoding various characters in 7 bits,
with the eighth bit set to zero on 8-bit machines.  BCD refers, usually,
to a way of packing numeric information such that each nybble (four bits)
of an eight-bit byte holds one decimal digit.  For example, the number
67 would be stored in a byte as the binary value 0110 0111 (six then seven),
which has character value 103, and which in ASCII is the lowercase "g".

Since two 8-bit bytes (e.g. Xhi and Xlo together) can represent values from
0 to 65535, which is five digits, it would take three bytes to hold the
values when packed as BCD (with the upper nybble unused).  That doesn't
jibe with your "28 bytes" (or four per value), so I can't guess what you
meant by the term.
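For what it's worth, the three-byte packing just described could look like this in Python (a sketch under my reading of "packed BCD" above; the function name and digit order are my own assumptions):

```python
def pack_bcd16(hi, lo):
    """Combine a hi/lo byte pair into an unsigned 16-bit value and
    pack its five decimal digits into three BCD bytes, with the
    upper nybble of the first byte left as zero."""
    value = (hi << 8) | lo            # 0..65535
    d = [int(c) for c in "%05d" % value]  # five decimal digits
    return bytes([d[0],
                  (d[1] << 4) | d[2],
                  (d[3] << 4) | d[4]])

print(pack_bcd16(0xFF, 0xFF).hex())  # 065535
print(pack_bcd16(0x00, 67).hex())    # 000067
```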

> There must be a simple way to do it, or maybe someone has even written
> their own module to do it.
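If the goal is really just signed 16-bit binary to human-readable decimal ASCII, there is indeed a simple way: the standard struct module unpacks the hi/lo byte pairs directly. A minimal sketch (assuming big-endian byte order and the six-byte X/Y/Z layout quoted above; adjust the format string if the device differs):

```python
import struct

def fields_to_ascii(data):
    """data: six raw bytes in the order Xhi Xlo Yhi Ylo Zhi Zlo.
    '>hhh' = three big-endian signed 16-bit integers."""
    x, y, z = struct.unpack(">hhh", data)
    return "%d,%d,%d" % (x, y, z)

print(fields_to_ascii(b"\x01\x00\xff\x00\x00\x10"))  # 256,-256,16
```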