Friday, October 23, 2015

I don't fear AI

In a recent debate on artificial intelligence, one I took part in at the B3 Biennale 2015 in Frankfurt, I faced an existential question: should humans fear AI or welcome its arrival? At the time I leaned toward accepting and welcoming the rise of AI as a 'natural' evolution in robotics and science. After all, as a futurist, I do not impose my preferences on the future I see us heading toward; I try to simplify a sophisticated network of the most probable scenarios that will result from past and current trends and developments. Now, after reviewing my testimony, I would like to infuse some subjectivity into my judgement and say, not as a futurist but as a scientist, that I do not think AI is the "singularity" that will usher in the end of human domination, as some xenophobes have lately been warning. I would go one step further and say that I do not fear artificial intelligence at all.

I recall a philosophical rule I learned, though I can't recall where from: the attributes of the whole are the sum of the attributes of all the parts, and the attributes of any part are some of the attributes of the whole. The part cannot have all the attributes of the whole, or else it becomes the whole, and there cannot be two "whole"s. From that philosophy, I arrive at what collective humanity is capable of: the sum of the capabilities of all humans, dead and alive, put together, plus the capabilities that emerge when all humans come together, at any moment. Thus, no single human, and no part of humanity, could be capable of what collective humanity is capable of.

Knowledge is no exception.

Since our senses are physical, we are limited to the physical world. Our abilities, even if they seem incalculable, ultimately depend on biological limitations, or biological potentials. Humans, as primates, have proven their intelligence through their ability to extend beyond that biological potential. Creating and utilizing tools gave primates an edge over other creatures. Humans used tools to create better tools: with fire we reshaped metal, with better-shaped metallic instruments we built stronger machines, with more accurate machines we built smarter computers, and with supercomputers we have extended our biological potential to become space invaders.

No group of humans could ever claim to have achieved anything grand entirely on their own; they must have read a book they did not write themselves, and thus knowledge was transferred.

Now, "singularity" as a concept is, in principle, not only improbable but impossible. It requires the infinite knowledge of what humans do not yet know, using superior instruments they have not yet created or imagined, with an imagination they have not yet reached, using unlimited physical potential they do not yet possess. That is what singularity requires.

The starting point for singularity has not yet come into existence because, basically, civilization is way too primitive to offer the grounds for singularity to exist.

Having said that, if and when artificial intelligence comes into existence, it will be similar in some ways to, and different in other ways from, any intelligent biological being or biological system. It will have limitations imposed partially by civilization and partially by planet Earth, and the best it could achieve is access to resources without the ability to fully utilize them. Much like humans, collectively, have managed to utilize resources without being able to reach singularity. Artificial intelligence will have access to all the information humans have stored within reach, that is, digitally; yet AI will not possess the hardware capable of storing and processing that sum of information, nor will it possess the ability to generate the power necessary to run itself. We still cannot generate enough power from renewable energy resources; neither will AI. That is not singularity.

Now, having ruled out the possibility of AI with the potential for singularity, I don't see why we need to fear AI anyway. Fearing AI is just another form of xenophobia. The average adult of the 1950s wouldn't have believed in nanotechnology. Someone born in 1950 would still not trust 3D printing of human organs. AI is another technology that today's people can't fathom and can't trust, but one that will, in time, become a common feature among the advanced tools humans have created.

Singularity, if ever achieved, should not be feared. It should be worshiped, because that is what singularity is: the incarnation of the most powerful idea humans have ever created, God.

by Nael Gharzeddine
