
Can global variable be passed into Python function?

On Mon, Feb 24, 2014 at 2:24 AM, Marko Rauhamaa <marko at pacujo.net> wrote:
> Or id(n) == 2 ** 64 + n for 63-bit integers; other objects get the
> RAM address of the internal memory block:
>    >>> id(5)
>    18446744073709551621
>    >>> id([])
>    3074657068
>    >>> id(id([]))
>    18446744076784207372

Assuming you define "63-bit integers" either as 0<=n<2**63 or as
-2**62<=n<2**62, this could work, but it would depend on never using
memory addresses with bit 63 set, since id() is (if I recall
correctly) supposed to return an integer in the native range. I'm not
sure you can depend on that sort of pattern of memory usage.
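Marko's proposed encoding can be sketched in a couple of lines. The
2**64 offset and the 63-bit range here are assumptions taken straight
from his example; this is not anything CPython actually does:

```python
# Hypothetical tagged id() scheme (a sketch, not CPython's behaviour).
# Integers in the signed 63-bit range map to ids above 2**64; every
# other object would report its memory address, which must therefore
# keep bit 63 clear to avoid colliding with this range.

def tagged_id(n):
    """Hypothetical id() for integers in the signed 63-bit range."""
    assert -2**62 <= n < 2**62, "outside the assumed 63-bit range"
    return 2**64 + n

print(tagged_id(5))  # 18446744073709551621, matching the quoted example
```

Note the scheme only works if the allocator never hands out addresses
at or above 2**64 - 2**62, hence the "bit 63" caveat above.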

In any case, you'd need some way to pretend that every integer is
really an object, so you'd need to define id(), the 'is' operator, and
everything else that can work with objects, to ensure that they
correctly handle this. It would be a reasonable performance
improvement to use native integers for the small ones (where "small"
integers might still be fairly large by human standards), but unlike
in languages like Pike (which does something like what you're saying),
Python has a concept of "object identity" which can't be broken.
(Pike's integers simply _are_, they aren't considered separate
objects. You can't test them for identity. Its strings, also, simply
are, although since Pike strings are guaranteed to be interned, their
values and identities really are the same. To Pike, it's only more
complex types that need to distinguish value from identity.) So this
optimization, which certainly does make sense on the face of it, would
potentially make a mess of things elsewhere.
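For what it's worth, CPython already caches a fixed range of small
integers (a documented implementation detail, not a language
guarantee), which is why identity tests on small ints can mislead. A
quick demonstration, using int() on strings so the compiler can't fold
the constants:

```python
# CPython keeps the integers -5 through 256 as preallocated singletons,
# so every runtime computation yielding one of them returns the same
# object.
a = int("256")
b = int("256")
print(a is b)  # True under CPython: both hit the cached object

# 257 falls outside the cache, so runtime-created copies are distinct
# objects that merely compare equal.
c = int("257")
d = int("257")
print(c is d, c == d)
```

So CPython gets part of the performance win without breaking object
identity: cached ints are genuinely the same object, and everything
else is a real, separately allocated object.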

I'm sure PyPy optimizes small integers somewhat, though.