UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb6 in position 0: invalid start byte

feedthetroll at gmx.de feedthetroll at gmx.de
Fri Jul 5 14:14:13 CEST 2013

On Friday, July 5, 2013 at 12:33:05 UTC+2, Νίκος Gr33k wrote:
> ...
> Wait!
> Are you saying that the ip address is being returned as a byte string 
> which then i have to decode with something like:
> host = socket.gethostbyaddr( os.environ['REMOTE_HOST'].decode('utf-8') )[0]

I get a decode error when Python tries to automatically decode a bytestring,
assuming it to be UTF-8 encoded.
I am sure the error will disappear if I decode it explicitly using
UTF-8. Heureka! I got it!

Or in other words:
If a big stone accidentally falls on my foot, it hurts.
But I am sure it will not hurt if I take that same stone and throw it on my foot.

Heureka! I got it!
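The irony above can be made concrete: explicitly calling .decode('utf-8') on the same bytes raises exactly the same error, because 0xb6 is simply not a valid UTF-8 start byte. A minimal sketch, assuming the bytes are actually Latin-1 encoded (an assumption; the real source encoding must be known):

```python
raw = b"\xb6"  # the offending byte from the traceback

# Explicit UTF-8 decoding fails the same way the implicit decode did.
try:
    raw.decode("utf-8")
except UnicodeDecodeError as e:
    print("still fails:", e)

# Decoding works only with the encoding the bytes are really in.
# Latin-1 is just an illustration here: 0xb6 maps to U+00B6 ('¶').
print(raw.decode("latin-1"))
```

The point being: naming the same wrong codec explicitly changes nothing; you have to know (or fix) the actual encoding of the data.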


On 14.06.2013 at 10:35, Fábio Santos wrote:
> Also you have been shown this link and I feel you really need to read it.
> http://slash7.com/2006/12/22/vampires/
