
Entering a very large number

On 3/25/18 9:37 PM, bartc wrote:
> On 26/03/2018 00:27, Richard Damon wrote:
>> On 3/25/18 8:32 AM, bartc wrote:
>>> Using CPython on my machine, doing a string to int conversion of that 
>>> specific number took 200 times as long as doing a normal assignment. 
>>> That conversion took 4 microseconds.
>>> Not significant if it's only done once. But it might be executed a 
>>> million times.
>> The other half of that thought is how does the 4 microseconds to 
>> create the constant compare to the operations USING that number. My 
>> guess is that for most things the usage will swamp the 
>> initialization, even if that is somewhat inefficient.
> Calling a function that sets up C using 'C = 288714...' on one line, 
> and that then calculates D=C+C, takes 0.12 seconds to call 1000000 times.
> To do D=C*C, takes 2.2 seconds (I've subtracted the function call 
> overhead of 0.25 seconds; there might not be any function call).
> If I instead initialise C using 'C = int("288712...")', then timings 
> increase as follows:
> 0.12 seconds  =>  3.7 seconds
> 2.2 seconds   =>  5.9 seconds
> So the overhead /can/ be substantial, and /can/ be significant 
> compared with doing bignum calculations.
> Of course, once initialised, C might be used a hundred times, then the 
> overhead is less significant. But it is not small enough to just dismiss.
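(For anyone wanting to reproduce timings of this shape, a minimal sketch with
timeit follows. The 50-digit constant is a made-up stand-in — the number in
the thread is far longer — so the absolute figures will differ, but the gap
between a literal assignment and a per-call int("...") conversion should show
the same effect, since CPython folds the literal into a code-object constant
at compile time while int("...") re-parses the string on every call.)

```python
import timeit

def add_literal():
    # The literal is parsed once, at compile time; C + C is all that runs here.
    C = 12345678901234567890123456789012345678901234567890
    return C + C

def add_from_string():
    # int("...") re-converts the 50-digit string on every single call.
    C = int("12345678901234567890123456789012345678901234567890")
    return C + C

n = 1_000_000
t_lit = timeit.timeit(add_literal, number=n)
t_str = timeit.timeit(add_from_string, number=n)
print(f"literal:   {t_lit:.2f} s for {n} calls")
print(f"int(str):  {t_str:.2f} s for {n} calls")
```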
And my point is that writing a program to just add or multiply, a million 
times over, two FIXED big numbers (since they are in the source code, they 
seem fixed) is unlikely -- and even then the cost isn't that bad, since 
that sounds like a run-once program. And of course once the answer has 
been computed, it needs to be output somehow, which will likely add 
significantly to the time. The question is whether you have a REAL 
PRACTICAL case where this overhead actually makes a significant 
difference, or whether we are just talking about a theoretical toy. It is 
a fact that the language has no notation for expressing such a large 
number as a number across multiple source lines (except as a string). I 
would think it would take something more than a toy program to argue for 
notation to do this directly (and there are alternatives that have been 
mentioned that aren't that bad for this sort of corner case).
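(One alternative of the kind referred to -- not a proposal from this thread 
specifically, just a common Python idiom -- is to split the constant across 
lines using adjacent string literals, which the compiler concatenates, and 
pay the int() conversion cost once at module load. The digits below are 
made up for illustration.)

```python
# Adjacent string literals are concatenated at compile time, so a long
# constant can span many source lines; int() then runs once, at import.
C = int(
    "28871234567890123456789012345678901234567890"
    "12345678901234567890123456789012345678901234"
)

# Every subsequent use of C is an ordinary bignum operation, with no
# per-use string-parsing overhead.
D = C * C
```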

Richard Damon