UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb6 in position 0: invalid start byte
Dave Angel
davea at davea.name
Fri Jul 5 09:05:33 EDT 2013
On 07/05/2013 06:33 AM, Νίκος Gr33k wrote:
> On 5/7/2013 12:21 PM, Dave Angel wrote:
>> Traceback (most recent call last):
>> File "<stdin>", line 1, in <module>
>> File "/usr/local/lib/python3.3/os.py", line 669, in __getitem__
>> value = self._data[self.encodekey(key)]
>> KeyError: b'REMOTE_ADDR'
>
>
> Wait!
> Are you saying that the IP address is being returned as a byte string
> which I then have to decode with something like:
>
> host = socket.gethostbyaddr( os.environ['REMOTE_HOST'].decode('utf-8') )[0]
>
Don't fix the problem till you understand it. Figure out who is dealing
with a byte string here, and where that byte string came from. Adding a
decode, especially one that's going to do the same decode as your
original error message, is very premature.
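For what it's worth, a minimal sketch (simulating the CGI variable with a made-up address) of why that proposed .decode() can't work: in Python 3, os.environ hands back str, and str has no decode method.

```python
import os

# Simulated CGI variable -- in a real CGI script the web server sets this.
os.environ['REMOTE_HOST'] = '93.184.216.34'

value = os.environ['REMOTE_HOST']
print(type(value).__name__)    # str: os.environ already decoded it
try:
    value.decode('utf-8')      # the proposed fix
except AttributeError as err:
    print('decode fails:', err)
```

So tacking a .decode() onto the lookup would just trade one error for another; the bytes are gone by the time you see the value.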
You're quoting from my error output, and that error happened because I
don't have such an environment variable. But you do. So why aren't you
in there debugging it? And why on earth are you using the complex
expression instead of a refactored one, which might be simple enough
for you to figure out what's wrong with it?
There is definitely something strange going on with that os.environ
reference (NOT call). So have you yet succeeded in running the factored
lines? If you can't get them to run, at least up to the point where you
get that Unicode error, you'll make progress only by guessing.
Get to that interactive debug session, and enter the lines till you get
an error. Then at least you know which line is causing the error.
xxx = os.environ['REMOTE_HOST']
yyy = socket.gethostbyaddr(xxx)
host = yyy[0]
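Run in that spirit, the factored steps might look like this sketch (the address is simulated here, and the reverse lookup is guarded since it can fail for an unresolvable host):

```python
import os
import socket

# Simulated CGI variable; in the real script the server provides it.
os.environ['REMOTE_HOST'] = '127.0.0.1'

xxx = os.environ['REMOTE_HOST']      # step 1: raw environment value (a str)
print(repr(xxx))                     # repr shows exactly what came back
try:
    yyy = socket.gethostbyaddr(xxx)  # step 2: reverse DNS lookup
    host = yyy[0]                    # step 3: primary hostname
    print(host)
except OSError as err:               # socket.herror is a subclass of OSError
    print('lookup failed:', err)
```

With each step on its own line, whichever one raises tells you immediately where the byte string (or bad key) is coming from.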
I'll bet the real problem is that you're using some Greek characters in
the name of the environment variable, rather than "REMOTE_HOST". So
everything you show us is laboriously retyped, hiding the real problems
underneath.
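To illustrate that guess with a hypothetical example of my own (the Greek rho below is not taken from Nikos's code): ascii() makes such a lookalike visible, and the resulting KeyError carries the key in byte form, just like the traceback above.

```python
import os

os.environ['REMOTE_HOST'] = '127.0.0.1'  # the real variable exists
bad_key = '\u03a1EMOTE_HOST'             # GREEK CAPITAL LETTER RHO, not Latin 'R'

print(ascii(bad_key))                    # escapes non-ASCII, exposing the lookalike
try:
    os.environ[bad_key]
except KeyError as err:
    print('KeyError:', err)
```

On screen the two names look identical, which is exactly why retyped code hides the problem; ascii() or repr() on the actual key settles it.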
--
DaveA