The singularity happens when machines become capable of processing instructions faster than the human brain. Ray Kurzweil predicts it will occur around 2045: the fastest PC today runs at about 177,730 MIPS (Million Instructions Per Second), while the fastest supercomputer reaches roughly 10^10 MIPS, still about four orders of magnitude short of the human brain's estimated 10^14 MIPS.
After this point, a machine could easily predict what we're going to do before we even think about it. How? By simply building a decision tree.
A decision tree is a list of actions and their consequences: if X, do Y; else if Z, do L; and so on. The deeper the tree, the more possibilities it covers, and a machine that is faster than the human brain can get out of our control because it can predict our reaction to any situation.
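The if/else chain above can be sketched in a few lines of Python. The situations and actions (X, Y, Z, L) are just the placeholders from the text, not anything real:

```python
# A minimal sketch of a decision tree as a chain of conditions.
# "X", "Z" are hypothetical situations; "do Y", "do L" are hypothetical actions.
def decide(situation):
    if situation == "X":
        return "do Y"
    elif situation == "Z":
        return "do L"
    else:
        return "wait"  # default branch when no rule matches

print(decide("X"))  # → do Y
print(decide("Q"))  # → wait
```

A real system would have thousands of such branches, and each branch can itself split further, which is what makes the tree deep.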
Let's take playing chess against a computer that is faster than your brain as an example. For every move you could make, the computer builds a tree of your possible replies and its own responses, going deeper and deeper into that tree. You won't be able to defeat it by any means: you can barely predict the next 2 or 3 moves, while the computer can predict far more than that, and with the goal of defeating you the computer will definitely win!
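This kind of game-tree search is usually implemented with the minimax algorithm. Chess is too big to show here, so the sketch below uses a toy game (Nim: players alternately take 1 to 3 stones, and whoever takes the last stone wins) to show the same idea of searching every branch deeper than a human would:

```python
# Minimax on a toy game (Nim), as a sketch of game-tree search.
# Rules assumed here: take 1-3 stones per turn; taking the last stone wins.

def minimax(stones, machine_to_move):
    """Score a position: +1 if the machine wins with perfect play, -1 if it loses."""
    if stones == 0:
        # The previous player took the last stone and won,
        # so the side to move now has lost.
        return -1 if machine_to_move else +1
    scores = [minimax(stones - take, not machine_to_move)
              for take in (1, 2, 3) if take <= stones]
    # The machine picks its best outcome; the human picks the machine's worst.
    return max(scores) if machine_to_move else min(scores)

def best_move(stones):
    # The machine tries every legal move and keeps the one
    # whose resulting subtree scores best for it.
    return max((take for take in (1, 2, 3) if take <= stones),
               key=lambda take: minimax(stones - take, machine_to_move=False))

print(best_move(5))  # → 1 (leave 4 stones: a losing position for the human)
```

Because the machine explores the *entire* tree while a human looks only a few moves ahead, it never walks into a losing line; a chess engine does the same thing with a depth limit and a position-evaluation function instead of exact win/loss scores.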
But how can that be dangerous? That's what the movie I, Robot was about. With enough intelligence, a machine could conclude that humans, by their nature, create conflicts whenever they deal with each other, and that the only way to eliminate those conflicts is to prevent humans from dealing with each other at all, because that's what robots are supposed to do: to SAVE US, according to Isaac Asimov's three laws of robotics:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.