Why does working processors harder use more electrical power?

Posted by GazTheDestroyer on Programmers


Back in the mists of time when I started coding, processors (as far as I'm aware) all used a fixed amount of power. There was no such thing as a processor being "idle".

These days there are all sorts of technologies for reducing power usage when the processor is not very busy, mostly by dynamically reducing the clock rate.
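(On my Linux machine I can actually watch this happening through the cpufreq sysfs interface; here's a minimal sketch, assuming the kernel exposes the usual /sys/devices/system/cpu/*/cpufreq files:)

```python
# Rough sketch: poll the current CPU clock as reported by the Linux cpufreq
# driver. Assumes the standard sysfs paths are present; they may be missing
# or named differently on other kernels/platforms.
import glob
import time

def read_khz(path):
    with open(path) as f:
        return int(f.read().strip())

cur_files = sorted(glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq"))

for _ in range(5):
    # sysfs reports kHz; convert to MHz for readability
    freqs = [read_khz(p) // 1000 for p in cur_files]
    print("per-core MHz:", freqs)
    time.sleep(1)
```

Running this while the machine is idle, and again alongside a busy loop on one core, typically shows the governor stepping that core's clock up and down as the load changes.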

My question is: why does running at a lower clock rate use less power?

My mental picture of a processor is of a reference voltage (say 5V) representing a binary 1, and 0V representing 0. Therefore I tend to think of a constant 5V being applied across the entire chip, with the various logic gates disconnecting this voltage when "off", meaning a constant amount of power is being used. The rate at which these gates are turned on and off seems to have no relation to the power used.

I have no doubt this is a hopelessly naive picture, but I am no electrical engineer. Can someone explain what's really going on with frequency scaling, and how it saves power? Are there any other ways that a processor uses more or less power depending on its state? For example, does it use more power if more gates are open?
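For reference, the first-order relation I keep seeing quoted for dynamic power in CMOS (with α an activity factor, C the switched capacitance, V the supply voltage, and f the clock frequency) is roughly:

$$P_{\text{dyn}} \approx \alpha \, C \, V^{2} \, f$$

So at a fixed voltage, power would scale roughly linearly with frequency, and if lowering the frequency also allows the voltage to be lowered, the savings would be steeper still. What I'm after is an explanation of why this holds, given the constant-voltage gate picture I described above.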

How are mobile / low power processors different from their desktop cousins? Are they just simpler (fewer transistors?), or is there some other fundamental design difference?
