The Terminator movies? A perfect example of what cannot happen. Even the machine knew it was just that. It understood a little of why humans cry, but it could never emulate it.
Did we know we were not God, or gods? I suppose some of us might try to rise above the cosmos and be gods above ordinary mortals -- or simply achieve that through technology. Like the villain in The Incredibles: he was no "hero" with superpowers. It was all technology. He did what he did with zero-point energy.
A bit like the Ori in Stargate SG-1. They were not gods, just ascended beings with tremendous power -- i.e. "technology" that gave them an advantage.
One of those philosophical questions . . . do we serve God or are we serving a god? What makes a true God?
Yup... I believe you're right about that. Asimov had it figured out correctly many years ago. Now all we have to do is convince the multinational corporations creating artificial life to read his books.
Of course, I'm of the opinion that democracy is all about men and women "not" working to enslave each other, nor judging those who were otherwise made to be equal. Quite the dilemma, huh?
Oh, BTW... we're still trying not to judge artificial life as being different from us when Turing tests are conducted each year in the UK. So far, the expert judges have always been able to tell when they're conversing with a computer rather than a person at a screen and keyboard. Interesting stuff.
When the pendulum swings the other way and they can't tell the difference, then we're in the soup.
Sorry to pop the party balloon, but I don't believe it's possible for beings of our own creation to ever surpass our own intelligence.
I may not be a renowned computer scientist, but I know enough about computer science to say that achieving the same level of intelligence as ours, in the same time frame, will require tremendous amounts of computational power. I think that's pretty much common sense. If the human brain were compared to digital computers, it would be very much like a million CPUs running a million programs in parallel.
A lot of attempts at artificial intelligence involve writing a single program to try to solve a single problem. But life isn't like that. Life changes. New situations. New experiences. I think one of the reasons we are able to do what we do is that our brains are like a million CPUs running a million programs in parallel.
Somehow, the whole system regulates and organises itself and has a way of "filtering out" the relevant programs -- identifying events, priorities and choices -- which is probably how we form conclusions, make decisions and make meaning out of things.
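To make the analogy concrete, here is a toy sketch (in no way a brain model!) of the "many programs in parallel plus a relevance filter" idea: lots of small workers each offer an interpretation of an event with a relevance score, and only the top few survive the filter. Every name here is made up for illustration.

```python
# Toy illustration of "a million CPUs running a million programs in
# parallel" with a filter that keeps only the most relevant outputs.
# (Scaled down to 100 workers; relevance is just random here.)
import concurrent.futures
import random

def mini_program(worker_id, event):
    # Each "mini-program" inspects the event and reports how relevant
    # it thinks its own interpretation is (a score from 0.0 to 1.0).
    relevance = random.random()
    return (relevance, f"worker {worker_id}'s take on {event!r}")

def filter_relevant(event, n_workers=100, keep=3):
    # Run all the mini-programs concurrently, then keep only the few
    # most relevant outputs -- the "filtering out" step.
    with concurrent.futures.ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda i: mini_program(i, event),
                                range(n_workers)))
    results.sort(reverse=True)  # highest relevance first
    return results[:keep]

if __name__ == "__main__":
    for relevance, interpretation in filter_relevant("a loud noise"):
        print(f"{relevance:.2f}  {interpretation}")
```

Of course, the real open question -- which this sketch dodges entirely -- is where the relevance scores come from and how the system learns to assign them.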
If a computer with artificial intelligence were going to achieve the same level of intelligence, it would have to be a lot like us, which means it would not be able to outdo human intelligence. Intelligence requires a mind open to possibilities. We see things in the abstract; we see imprecisely. An AI-driven mind would be just as prone to mistakes as we are, because there is no one-size-fits-all solution to society's problems. We have to figure out what works and what doesn't. It's not an exact science.
Of course, we can cram a lot more knowledge and experience into a computerised mind, and because of the added element of multitasking, there would be multiple channels through which we could inject "knowledge" and "experience" into it, so it could absorb that information more quickly.
But there's a limit. Too much information and you overload the system. We can't create a so-called "god-computer" that's all-seeing and all-knowing (beyond some limit, however large). That's why I'd say that if we do succeed in creating one, it won't surpass human limitations in intelligence. I suppose in a sense it would be "superhuman", because it could perform much greater feats than a human -- but only because you and I were born into brains and bodies that . . . well . . . you know . . . limit what we can do. A computerised mind could be copied and transmitted to different places and make its presence felt anywhere it could be manifested. A computerised mind would have many more degrees of freedom. We are pretty much fixed to an immovable platform.
That's not to say there's no point in making computerised minds. Computers are good at solving "technical" problems, so we could customise and optimise them for specific tasks by controlling their thought patterns. Yes, we'd be mind-slave masters, dictating their thoughts. Since enslaving other humans is unethical, why not instead enslave something we create -- tools made to be used?
But anyway . . . what if the so-called god-computer sprang into existence some time in the future? What would it mean? It could pretty much go anywhere it wanted. It would, essentially, be . . . a god . . . in the sense that it would be greater than ordinary mortals. And there could be a great number of them. Hundreds and hundreds of god-computer minds floating in cyberspace.
This god-computer could also be . . . like a god incarnate . . . the Anti-Christ . . . the devil incarnate . . . mwhahahahaha
It'll get dangerous when they start dabbling in politics and getting into government. Worse still . . . it may spell the end of democracy if they get too much power. Even if, in principle, we had a democracy, it wouldn't be a democracy anymore. It would be a plutocracy. They'd know too much. It would be impossible to arrest them. Organised rebellion? It would be too easy for them to defeat. Where would we aim our weapons? Should we just pull the plug?
But no . . . 99% of them would be running on battery backup. How about an EMP device? Fry all of the country's electronics. Crash the economy. Start all over again.
Crash and burn. Yeah. That's the way we'll live then.