On the Limits to Exponential Growth of Technology
Reading about Nvidia’s VP claiming that Moore’s Law is dead got me thinking about what this means for the expected exponential growth of technological evolution. Apparently Gordon Moore himself has said that his law is dead. Elsewhere I also read an argument that the development of any particular technology follows an S curve: the uptake of a brand new technology feeds its accelerating improvement until a saturation point, at which there is no longer sufficient demand to sustain the acceleration, and growth levels off until a new technology emerges to repeat the process.
This, however, refers to specific technologies or paradigms, and not necessarily to more fundamental measures of growth, such as the overall number of computations that can be performed in a unit of time with a given amount of power. As one technology and its associated paradigm level off, perhaps even creating a temporary dip in growth, another may very well be preparing to carry the maximum achieved performance into another S curve. In this sense, the overall exponential curve consists of S curves which always level off at a higher level than they began, as seen in this rough illustration:
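This composition of S curves into an overall exponential trend can be sketched numerically. The following is a minimal, hypothetical model (the paradigm ceilings and timings are invented purely for illustration): each paradigm follows a logistic S curve, and each successive paradigm has a ceiling roughly ten times the last, so the sum traces a roughly exponential envelope.

```python
import math

def s_curve(t, ceiling, midpoint, rate=1.0):
    """Logistic S curve: slow start, rapid growth, then leveling off at `ceiling`."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

def capability(t, paradigms):
    """Total capability at time t: each paradigm's S curve stacked on the last."""
    return sum(s_curve(t, ceiling, midpoint) for ceiling, midpoint in paradigms)

# Hypothetical paradigms as (ceiling, midpoint) pairs: each ceiling ~10x the
# previous one, each arriving a decade later. These numbers are made up.
paradigms = [(1, 5), (10, 15), (100, 25), (1000, 35)]

for t in range(0, 41, 10):
    # The envelope grows by roughly an order of magnitude per decade,
    # even though every individual paradigm flattens out.
    print(t, round(capability(t, paradigms), 2))
```

Each individual curve saturates, yet the envelope keeps accelerating as long as a bigger paradigm keeps arriving; the dips discussed below correspond to the flat stretch before the next curve takes off.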
This means that there could be dips in the rate of growth along the way without impacting the general trend of acceleration. The dips are actually times when the current technological paradigm has reached its limits and a new one is being searched for or developed. I think we may be in such a dip right now when it comes to CPU technologies. Transistors are reaching their physical limits, so chip makers turn to multi-core and parallel computing to keep increasing performance. Meanwhile, quantum computing and memristor technology are emerging as new paradigms that could begin the next S-cycle.
Each new technological paradigm seems to be orders of magnitude more powerful than the last, which contributes to the overall trend of acceleration. Memristors and quantum computers would leave transistor-based computing in the figurative dust.
It may be worth noting the role that market demand plays in driving accelerated growth. If the markets did not demand more power, more possibilities and so on, they would not pay for them, and the development of technologies which push the boundaries at ever-increasing rates would stagnate. The continued accelerated growth then seems to indicate that there is a distinct demand among the masses of individuals for more out of less, as if reflecting a kind of universal urge to transcend current limitations. I for one think this is only natural to human beings, and it’s an interesting validation of transhumanism and transhumanist thinking.
Ultimately, there is the question of whether hard limits to the exponential growth of our technological power do exist. We know that there are certain laws of nature, but so far a better understanding of them has only caused us to grow more, not less. Knowledge has proven to be power. Understanding something gives us the ability to alter it into something else, something which does not just exist beside us, but actually adds to us some new possibility we haven’t had before.
In a sense we aren’t really creating anything “more” in the universe, so as to reach some kind of limitation where “more” becomes too much. We are simply rearranging what already exists to suit us. Matter and energy never actually cease to exist; they merely transform from one form to another. All we’re doing is becoming better at performing that transformation intelligently and consciously ourselves. In that sense, the only limits are the limits of existence itself. In other words, everything that the universe can do, a conscious and intelligent being on a careful path of gradual increase in understanding and self-development can do as well.
There is also a view of limitations based on the fundamental building blocks of matter. For example, the stagnation in single-core CPU processing power is attributed to transistors approaching the fundamental physical scale of the atom. Quantum computing, however, is not bound by that limitation, as it operates at the sub-atomic level. And there are levels that go even beyond sub-atomic particles. Some expect this process of discovering ever smaller building blocks to go on indefinitely (fractal theory).
Whether that is true or not, however, does not necessarily impact the possibility of ongoing acceleration of computing power. Sometimes what is required is not building out of ever smaller particles, but building smarter systems that make the most of the level at which we operate now. Memristors seem to be an example of that.
Finally, there may be a point at which the question of whether more acceleration is possible becomes irrelevant to most of us, because what we would have reached at that point is, for all intents and purposes, the ultimate destination. It is hard to imagine right now what this may be, but that is why such a destination is post-singularity. Given what we have to work with right now, however, that ultimate destination is very far off and very much beyond such things as “merely” creating new species of “AI” or “merely” augmenting ourselves with seemingly god-like powers. The singularity itself, by contrast, given all the technologies we already possess or are developing as we speak, is already within view.