Time complexity with bit cost

Posted by Keyser on Stack Overflow, 2012-10-06

I think I might have completely misunderstood bit cost analysis.

I'm trying to wrap my head around the concept of studying an algorithm's time complexity with respect to bit cost (instead of unit cost), and it seems to be impossible to find anything on the subject. Is this considered so trivial that no one ever needs it explained to them? Well, I do. (Also, there doesn't even seem to be anything on Wikipedia, which is very unusual.)

Here's what I have so far:

The bit cost of multiplying or dividing two n-bit numbers is O(n^2) (in general, at least with the schoolbook algorithms?)
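
To check that I'm reading that right, here is a rough sketch of my own (so the attribution of the bound to the shifted additions is just my assumption) of schoolbook binary multiplication; the up-to-n partial products, each added with on the order of n bit operations, would be what gives O(n^2):

import java.math.BigInteger;

public class SchoolbookMultiply {
    // For every set bit of b, add a shifted copy of a: up to n partial
    // products, and each addition touches on the order of n bits, which
    // is where I assume the O(n^2) bit cost comes from.
    static BigInteger multiply(BigInteger a, BigInteger b) {
        BigInteger result = BigInteger.ZERO;
        for (int i = 0; i < b.bitLength(); i++) {        // n partial products
            if (b.testBit(i)) {
                result = result.add(a.shiftLeft(i));     // each add ~ O(n) bit operations
            }
        }
        return result;
    }

    public static void main(String[] args) {
        BigInteger a = new BigInteger("12345678901234567890");
        BigInteger b = new BigInteger("98765432109876543210");
        System.out.println(multiply(a, b).equals(a.multiply(b)));  // prints true
    }
}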

So, for example:

int number = 2;
for (int i = 0; i < n; i++) {
    number = i*i;
}

has a time complexity with respect to bit cost of O(n^3), because it does n multiplications, each of which I'm counting as O(n^2) (right?)
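
Spelled out, the counting I'm doing looks like this (just my own sketch, under the assumption that every multiplication in the loop gets charged n^2 bit operations):

public class LoopBitCostCount {
    public static void main(String[] args) {
        int n = 1000;
        long bitOps = 0;
        for (int i = 0; i < n; i++) {
            bitOps += (long) n * n;   // charge each multiplication n^2 bit operations
        }
        // n iterations times n^2 per multiplication: n^3 overall
        System.out.println(bitOps == (long) n * n * n);  // prints true
    }
}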

But in a regular scenario we want the time complexity with respect to the size of the input. So, how does that scenario work? The number of bits in i could be considered a constant, which would make the time complexity the same as with unit cost, except with a bigger constant (and both would be linear).
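
To make that concrete (again just my own sketch, taking 32 bits per int as an assumption), the same loop under both cost models would be counted like this:

public class FixedWordCost {
    public static void main(String[] args) {
        int n = 1000;
        final int WORD_BITS = 32;   // i is a plain 32-bit int, so its width is a constant
        long unitCost = 0, bitCost = 0;
        for (int i = 0; i < n; i++) {
            unitCost += 1;                               // one multiplication = 1 unit
            bitCost += (long) WORD_BITS * WORD_BITS;     // ~32^2 bit operations per multiplication
        }
        // Both totals are linear in n; the bit-cost one just has a bigger constant.
        System.out.println("unit cost: " + unitCost + ", bit cost: " + bitCost);
    }
}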

Also, I'm guessing addition and subtraction can be done in constant time, O(1). I couldn't find any info on that either, but it seems reasonable since each is a single assembler operation.
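
Just to convince myself what that single assembler operation is doing at the bit level, here is a toy ripple-carry add over fixed 32-bit words (my own sketch; the point being that the bit work per addition is a fixed constant when the word size is fixed):

public class RippleCarryAdd {
    // Add two ints one bit at a time, the way a hardware adder chain would:
    // always 32 iterations, so a fixed amount of bit work per addition.
    static int add(int a, int b) {
        int sum = 0, carry = 0;
        for (int i = 0; i < 32; i++) {
            int x = (a >>> i) & 1;
            int y = (b >>> i) & 1;
            sum |= (x ^ y ^ carry) << i;                  // sum bit
            carry = (x & y) | (x & carry) | (y & carry);  // carry out
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(add(1234, 5678) == 1234 + 5678);  // prints true
    }
}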
