Why is Decimal('0') > 9999.0 True in Python?

Posted by parxier on Stack Overflow, 2010-03-11

This is somewhat related to my earlier question "Why is ''>0 True in Python?".

In Python 2.6.4:

>>> from decimal import Decimal
>>> Decimal('0') > 9999.0
True

From the answer to my original question I understand that, when comparing objects of different types in Python 2.x, the types are ordered by their names. But in this case:

>>> type(Decimal('0')).__name__ > type(9999.0).__name__
False

Why does Decimal('0') > 9999.0 evaluate to True, then?
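
For what it's worth, the name-based fallback does seem to hold for other, non-numeric built-in types on my machine. A quick check (assuming CPython's ordering of unrelated types, which is an implementation detail, not a language guarantee):

>>> [] < 'abc'   # 'list' < 'str', so the list sorts first
True
>>> {} < []      # 'dict' < 'list'
True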

UPDATE: I usually work on Ubuntu (Linux 2.6.31-20-generic #57-Ubuntu SMP Mon Feb 8 09:05:19 UTC 2010 i686 GNU/Linux, Python 2.6.4 (r264:75706, Dec 7 2009, 18:45:15) [GCC 4.4.1] on linux2). On Windows (WinXP Professional SP3, Python 2.6.4 (r264:75706, Nov 3 2009, 13:23:17) [MSC v.1500 32 bit (Intel)] on win32) the same comparison behaves differently:

>>> Decimal('0') > 9999.0
False

I'm even more puzzled now. %-(
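
In the meantime, a minimal workaround sketch (assuming an explicit conversion of the float to Decimal is acceptable for my purposes) gives the result I would expect on both machines:

>>> from decimal import Decimal
>>> # Convert the float via str so both operands are Decimal before comparing
>>> Decimal('0') > Decimal(str(9999.0))
False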
