
Moore's Law

Moore's Law is the observation that the number of transistors on a microchip doubles approximately every two years, yielding exponential growth in computing power and efficiency. Gordon Moore, co-founder of Intel, first articulated it in 1965 (originally predicting a doubling every year; he revised the period to two years in 1975), and it has historically driven rapid advances in semiconductor technology, influencing everything from personal computers to smartphones. While originally focused on transistor density, it has become a broader principle symbolizing the pace of technological progress in the electronics industry.
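
At heart, Moore's Law is a simple exponential-growth formula, N(t) = N0 * 2^((t - t0)/T) with T of about 2 years, and a short calculation makes the doubling concrete. The sketch below is illustrative only: the Intel 4004's roughly 2,300-transistor count as the 1971 baseline and the exact two-year period are assumptions, not part of the law's formal statement.

```python
# Minimal sketch of Moore's Law as exponential growth:
#   N(t) = N0 * 2 ** ((t - t0) / T)
# The baseline (Intel 4004, ~2,300 transistors in 1971) and the
# 2-year doubling period are illustrative assumptions.

def transistors(year: float, base_year: float = 1971.0,
                base_count: float = 2300.0,
                doubling_period: float = 2.0) -> float:
    """Project the transistor count for a given year."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

if __name__ == "__main__":
    for y in (1971, 1981, 1991, 2001, 2011, 2021):
        print(f"{y}: ~{transistors(y):,.0f} transistors")
```

Running this projects roughly 77 billion transistors for 2021, in the same ballpark as the largest consumer chips of that year, which is about as much accuracy as a two-parameter trend line can claim.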

Also known as: Moore Law, Moore's Observation, Transistor Scaling Law, Semiconductor Scaling, Moores Law (common misspelling)
🧊 Why learn Moore's Law?

Developers should understand Moore's Law to grasp the historical context of hardware evolution, which underpins software performance expectations and scalability. It helps in making informed decisions about system architecture, such as estimating the hardware capacity a long-term project can expect by the time it ships, or budgeting performance headroom for future processor generations (see the sketch below). Knowledge of this concept is especially relevant in high-performance computing, embedded systems, and AI, where hardware advances directly shape algorithm design and deployment strategies.
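
For instance, the planning question "how long until hardware offers K times today's capacity?" inverts the growth formula: t = T * log2(K). Below is a hedged sketch assuming the historical two-year doubling period; actual scaling has slowed in recent years, so treat the result as an upper bound on optimism.

```python
import math

# Back-of-the-envelope planning horizon under an assumed doubling
# period T: solve K = 2 ** (t / T) for t, giving t = T * log2(K).
# The 2-year period is a historical assumption; real scaling has slowed.

def years_until(factor: float, doubling_period: float = 2.0) -> float:
    """Years until capacity grows by the given factor."""
    return doubling_period * math.log2(factor)

if __name__ == "__main__":
    for k in (2, 10, 100):
        print(f"{k}x capacity: ~{years_until(k):.1f} years")
```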
