Help with CRC calculation

Jeff Winkelman jwinkelman at
Mon Oct 1 22:13:50 CEST 2001

I am pulling my hair out trying to figure out why my C code calculates
my CRC correctly, and the Python code does not.  Will some friendly
person out there please show me the light?

Thanks in advance.

CRC Algorithm in C:

#define POLY 0x8048

unsigned short CRC16(char *pData, unsigned short length)
      unsigned char i;
      unsigned int data;
      unsigned int crc = 0xffff;

      if (length == 0)
            return (~crc);

            for (i=0, data=(unsigned int)0xff & *pData++;
                 i < 8; 
                 i++, data >>= 1)
                  if ((crc & 0x0001) ^ (data & 0x0001))
                        crc = (crc >> 1) ^ POLY;
                  else  crc >>= 1;
      } while (--length);

      crc = ~crc;
      data = crc;
      crc = (crc << 8) | ((data >> 8) & 0xff);

      return (crc);

Possible equivalent in Python:

def crc16(packet):
    POLY = 0x8048
    crc = 0xFFFF

    if len(packet) == 0:
        return ~crc

    for byte in packet:
        data = byte & 0xFF
        for x in range(8):
            if ((crc & 0x1) ^ (data & 0x1)):
                crc = (crc >> 1) ^ POLY
                crc = crc >> 1
            data = data >> 1

    crc = ~crc
    data = crc
    crc = (crc << 8) | ((data >> 8) & 0xFF)
    return -crc

I've tried forcing everything to longs, and writting the code in C and
calling it from Python.  The bizarre thing is that when I run the C
code stand alone, it works.  But any involvement with Python, whether
it be fully implemented in python, or called by Python, it doesn't

I saw a posting from Guido that suggested casting the unsigned integer
into a long and returning it that way (to help with unsigned to signed
conversion), but it didn't work either.

Any help would be HUGELY appreciated.


More information about the Python-list mailing list