The Moore's law controversy

Definition

Moore's Law takes its name from Intel co-founder Gordon Moore, who observed in 1965 that the number of transistors on a microchip doubles roughly every two years while the cost per transistor falls. Many, however, regard Moore's law less as a law than as a guideline: a historical observation of, and a basis for predicting, the number of transistors in a semiconductor device or chip. Wikipedia notes that "we should not speak of 'Moore's laws' but of 'Moore's conjectures', since Moore's statements are only guesses." The Encyclopædia Universalis likewise specifies that "Moore's law is a set of conjectures and statements on the complexity of integrated circuits."


In its common understanding, however, Moore's law implies that as the transistors on integrated circuits become smaller and more efficient, computers and their computing power become smaller, faster, and cheaper. A corollary of the law is that the growth of microprocessor performance is exponential.
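Read literally, the doubling claim is a simple geometric progression. A minimal sketch in Python illustrates it; the starting figure of about 2,300 transistors for the 1971 Intel 4004 is a commonly cited number, and the projection is purely illustrative:

```python
def projected_transistors(base_count, base_year, year, period=2):
    """Transistor count implied by Moore's law: doubling every `period` years."""
    doublings = (year - base_year) / period
    return base_count * 2 ** doublings

# Starting from the Intel 4004 (roughly 2,300 transistors in 1971),
# the law projects tens of billions of transistors by the early 2020s:
print(f"{projected_transistors(2300, 1971, 2021):.3g}")
```

Twenty-five doublings over fifty years multiply the count by about 33 million, which is roughly the order of magnitude real chips have reached.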

Impacts and forms of use

Nevertheless, the main question remains whether Moore's law is still relevant today. More than fifty years after its formulation, its impact and lasting benefits are still felt in many ways. The personal-computer era was dominated by the feeling that the consumer needed the latest and best-performing machine on the market. Each of us bought a new computer or a new phone every two to four years, either because what we already had was getting too slow, or could not run a new application, or for some other reason that justified acquiring a more capable tool. Without knowing it, we were living out the very phenomenon Moore's law describes. On an industrial scale, practically every facet of a high-tech society passes through and benefits from the effects of Moore's law. Today, technologies ranging from mobile phones, smartphones, and tablets to video games, spreadsheets, accurate weather forecasting, and the Global Positioning System (GPS) would not function without a steady supply of new, powerful miniature processors.

Today, however, some question this logic. This is partly due to a change in consumer behavior: many computer owners use their machines for simple tasks, such as browsing the web or sending e-mail, applications that demand little computing power. The growing popularity of cloud computing is another reason powerful PCs are less necessary. Cloud computing shifts the load of data processing and storage to an online network of computers, letting users access applications and information over the Internet without needing much local computing power. It is for this reason that devices such as smartphones and netbooks are gaining popularity: they lack the raw processing power of the latest desktops and laptops, yet they still give users access to the applications and data they need.

Moore's law: between obsolescence and sustainability

Moore's law has been a subject of controversy for some years between, on the one hand, its obsolescence as a model for measuring technological development and its overtaking by new models in the computer industry, and, on the other, its durability as a frame of reference for the exponential growth of the computing power of new microprocessors. On both sides, arguments and counter-arguments have sustained a debate spanning decades. It is worth recalling that Gordon Moore himself foresaw the end of his law: in a 2005 interview, he admitted that it "cannot last forever; the nature of exponential functions is that they eventually hit a wall." Many experts in turn agree that computers should reach the physical limits of the law in the 2020s, and several arguments support this hypothesis. First, transistors today are approaching atomic dimensions that physics will not allow to be reduced much further: at some point it is simply no longer possible to make things tinier than they already are. Even if the price remains constant, performance cannot exceed the physical limits of the processors.

To get an idea of the magnitude of the exponential improvement in integrated circuits, Laurent Alexandre, in his book "La guerre des intelligences", lays out a telling chronology. He observes that "in 1951, a transistor was 10 millimeters wide; in 1971, 10 microns, or one hundredth of a millimeter; in 2017, manufacturers released the first microprocessors etched with 10-nanometer transistors, a hundred thousand times finer than a millimeter. Ten thousand transistors would fit across the width of a hair. By dint of this doubling every eighteen months, there are now 10 billion transistors on a single microprocessor. In June 2017, IBM presented the first prototype transistor etched at 5 nanometers, or 25 atoms wide. Manufacturers have announced, for 2021, the first microprocessors etched at 3 nanometers, or 15 atoms wide. An experimental transistor etched at 1 nanometer has even been produced. At the beginning of 2017, the specialists who announced the death of Moore's law were once again mistaken" [1].
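The orders of magnitude in this chronology are easy to check with unit arithmetic, working in nanometers throughout (figures taken from the quoted passage; the ~100-micron hair width is a common rough estimate):

```python
NM_PER_MM = 1_000_000  # 1 millimeter = 1,000,000 nanometers

transistor_nm = {
    1951: 10 * NM_PER_MM,  # 10 millimeters
    1971: 10_000,          # 10 microns
    2017: 10,              # 10 nanometers
}

# A 10 nm transistor is one hundred thousand times finer than a millimeter:
print(NM_PER_MM // transistor_nm[2017])  # 100000

# A human hair is on the order of 100 microns (100,000 nm) wide,
# so about ten thousand 10 nm transistors would span it:
print(100_000 // transistor_nm[2017])    # 10000
```

Both figures match the claims in the quotation.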

However, it is only recently that companies have noticed a slowdown in the growth of processing power. Besides the high temperatures of the transistors, which would ultimately make it impossible to create smaller circuits, cooling the transistors takes more energy than the energy already passing through them. "We will most likely reach the limit of 2-3 nanometers in a few years, with spacings of ten atoms between components that no longer allow heat to be evacuated and, moreover, with quantum behaviors of electrons in the components that will make them unreliable" [2]. In short, innovative alternatives are now needed to replace a paradigm of transistor miniaturization that has fed the progress of data processing for more than half a century. Engineers and scientists must find ways to make computers more efficient other than packing ever more powerful microchips onto integrated circuits. Instead of physical processes, applications and software can help improve the speed and efficiency of computers. Cloud computing, wireless communications, the Internet of Things (IoT), and quantum physics can all play a role in the future of innovation in computer technology.

One way of confronting the end of Moore's law lies in domain-specific architectures (DSAs): processors designed to accelerate the specific tasks of an application. The idea is that instead of using general-purpose processors such as the CPU (Central Processing Unit) to handle a multitude of tasks, different specialized processors are tailored to the needs of specific workloads. One example is the TPU (Tensor Processing Unit) chip built by Google for demanding inference tasks in neural networks. The TPU was designed specifically for this task, and since this task is at the heart of Google's activities, it makes perfect sense to offload it to a dedicated processing chip.

To speed up their huge workloads, cloud companies use some form of acceleration technology. One option is graphics processing units (GPUs), which have for some time been adapted to support a wide variety of applications beyond graphics. Network processing units (NPUs) have also been widely used for networking. Both options provide a large number of smaller processors across which workloads can be broken down and parallelized.
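The split-and-merge pattern these accelerators rely on can be sketched in plain Python. This is a sketch of the decomposition only: the thread pool stands in for the many small processors, whereas on a GPU or NPU the partial tasks run truly in parallel in hardware:

```python
from concurrent.futures import ThreadPoolExecutor

def kernel(chunk):
    """Stand-in for the small per-processor task, e.g. one tile of a workload."""
    return sum(x * x for x in chunk)

def parallel_map_reduce(data, n_workers=8):
    # Break the workload into one slice per worker...
    chunks = [data[i::n_workers] for i in range(n_workers)]
    # ...fan the slices out to the workers, then merge the partial results.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(kernel, chunks))

print(parallel_map_reduce(range(1000)))
```

The result is identical to the sequential computation; the gain from the pattern comes only when the chunks genuinely execute on separate processing units.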

For proponents of the durability of Moore's law, Gordon Moore's prediction that the number of transistors on a chip would double every two years is commonly interpreted through the wrong variable: time. Moore in fact built into his law the hypothesis that, over time, unit demand, and therefore production volumes, would respond to the continuous fall in prices. Empirically, and perhaps misunderstood by the market, growth in unit transistor output has slowed considerably during this economic cycle. As a result, and not surprisingly, the fall in the price of computation began to level off. The public proclaims "the end of Moore's law" without realizing that this reading has been biased from the start.

Philip Wong, vice president of corporate research at Taiwan Semiconductor Manufacturing Company (TSMC), made a presentation to this effect at a conference, arguing that not only is Moore's law alive and well, but that it will remain viable, given the available panoply of technologies, for the next three decades [3]. In his view, the only thing that matters for maintaining Moore's law is continuing to improve the density of integrated circuits. He noted that many innovations in semiconductor manufacturing have helped keep density on an upward curve, and that how these higher densities are achieved matters little, as long as companies can keep fitting more transistors into smaller spaces with better energy efficiency. Wong believes the next big push will come from artificial intelligence and 5G.

What prospects?

Technological innovation has long been driven by the need to develop new computing platforms for applications demanding faster and more energy-efficient hardware. This development extended to minicomputers in the 1970s, to personal computers in the 1980s, to the Internet in the 1990s, and now to mobile computing. Each of these platforms demanded greater density through improvements in semiconductor manufacturing, thereby reproducing the precepts of Moore's law. One may still wonder, all the same, whether this half-century-old law will one day come to an end.

For many, the answer is yes, even if no one can say when. On the one hand, it is very likely that engineers will eventually hit a technical barrier that prevents them from making ever-smaller components. On the other, even without technical barriers, economic considerations could come into play: if it is no longer economically viable to produce circuits with smaller transistors, there may be no reason to continue developing them.