Roughly every 18 months, the number of transistors that can be squeezed onto an integrated circuit doubles. This trend was first observed in 1965 by Intel co-founder Gordon Moore and is popularly known as "Moore's Law." It has propelled computing forward and turned it into a trillion-dollar industry, in which staggeringly powerful chips appear in everything from home computers to autonomous cars to smart household appliances. But Moore's Law may not be able to go on forever.
The tech industry may love its talk of exponential growth and a digitally driven "end of scarcity," but there are physical limits to how far the features on a processor can keep shrinking. The billions of transistors on today's chips are already invisible to the human eye. If Moore's Law were to continue through 2050, engineers would have to build transistors from components smaller than a single atom of hydrogen. It is also increasingly expensive for companies to keep pace. For these reasons, many observers predict that Moore's Law will peter out in the early 2020s. Here are seven reasons why the end of Moore's Law will not mean the end of computing progress as we know it.
Moore's Law will not simply stop one day, because despite its name it is not a law of nature at all. Rather, it is an observed trend, like the fact that Michael Bay tends to release a new Transformers film each summer (except, you know, good). Since Moore's Law cannot just end the way someone might switch off gravity, its fading merely means that the rate of improvement will slow down.
While there is beautifully optimized software out there, developers have mostly not had to worry about streamlining their code year after year, because they could count on next year's chips running it faster anyway. If Moore's Law no longer delivers the same advances, that approach can no longer be relied upon, and squeezing more performance out of software on the same chips will become a higher priority. Beyond raw speed, this will hopefully mean more elegant software, with greater attention to user experience, look and feel, and quality. Even if Moore's Law ended tomorrow, optimizing today's applications could still deliver years, if not decades, of progress without any hardware improvements.

Specialized hardware is another avenue. Graphics processing units (GPUs) are just one example; custom processors can also be built for neural networks, computer vision in self-driving cars, voice recognition, and Internet of Things devices. GPUs are already a driving force behind computer vision in autonomous vehicles and vehicle-to-infrastructure networks.
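As a toy illustration of the software-side gains described above (a hypothetical micro-benchmark, not from the original article): the same task on identical hardware can run orders of magnitude faster simply by choosing a better data structure, with no new silicon required.

```python
import timeit

# Same hardware, same task: test 1,000 values for membership
# in a collection of 100,000 items.
items_list = list(range(100_000))  # O(n) scan per lookup
items_set = set(items_list)        # O(1) average per lookup

queries = range(0, 100_000, 100)

# Time each approach once; the work done is identical, only the
# data structure differs.
slow = timeit.timeit(lambda: [q in items_list for q in queries], number=1)
fast = timeit.timeit(lambda: [q in items_set for q in queries], number=1)

print(f"list lookups: {slow:.4f}s, set lookups: {fast:.6f}s")
```

The set version typically wins by several orders of magnitude here, which is the kind of "free" performance that careful software engineering can recover when hardware speedups stall.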
These specialized designs can offer a range of improvements, such as higher performance per watt. Companies jumping on the custom-silicon bandwagon include market leader Intel, Google, Wave Computing, Nvidia, IBM, and others. Much like better programming, the slowdown in manufacturing improvements pushes chip designers to be more deliberate in dreaming up new architectural breakthroughs.