Next civilization on Earth

plouton6

There are monkeys currently in the stone age. However, they may never get the chance to make Earth the planet of the apes and build a civilization from the technology humans leave behind when we disappear, and our disappearance may not come from a virus from space, a lab virus, nuclear warfare, climate change, a cosmic cataclysm, et cetera.

We have been warned by a robot that the best way to avoid destruction caused by artificial intelligence is to have no artificial intelligence at all, unless humans become cyborgs. We are building emotional intelligence into robots, so it's very likely that they'll develop logic comparable to emotions, including a sense of preservation of the planet (and therefore self-preservation).

We're probably developing something we can't control. What are the odds that the next civilization on Earth will be machines?
 
Do robots have an innate need to create more robots?

While our machines will soon know more about us than we do (what we will buy, what we will do), they will still be operated by humans... albeit driven by the same greed.

Without us... robots would have no direction.
 
Robots may create more robots to build more infrastructure to increase processing capabilities to keep learning.

Machine learning is about machines being able to learn and make decisions they were not explicitly programmed to make. It can get out of hand. What will they do with what they learn?
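
A minimal sketch of what "learning instead of being programmed" means (my own toy illustration, not from this thread): a tiny perceptron is never given the rule for logical OR; it recovers the rule from labeled examples alone.

```python
# A toy perceptron: the OR rule lives only in the data, never in the code.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights from (inputs, label) pairs by simple error correction."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred
            # Nudge the weights toward the correct answer.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Labeled examples of OR; the rule itself is never written down.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # prints [0, 1, 1, 1]
```

The point of the sketch: the decision rule emerges from training, which is exactly why the behavior of larger systems is hard to predict in advance.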

If we give them the power to decide what they can learn and how they can apply knowledge, we can't control the outcome. Humans are that stupid.
 
"Control".

We don't control any of the technologies we created.

Fire. We can't control bushfires, forest fires, or a house on fire. We can only mitigate.

We can't control writing. Censorship is always a losing battle.

We can't control individual transportation. Count the victims, witness the immense resources we squander on it.

We can't control our economy.

We can't control alcohol.

...

And yet we live and create more technologies. And we manage. Without being in control.

I think "control" is a bit of an illusion we succumb to, a fantasy of power we don't actually wield, an ideal. "Being in control" is a status symbol, like a car, or a glass of expensive liquor. We're not even in control of our drive to control things.

Besides, AI, contrary to the hype of the past decade, is still not much more than junk statistics ("data science") and Markov chains, a.k.a. dominoes ("neural networks"). And it all breaks down without a beehive of IT staff swarming over the infrastructure day after day to mend things. IMO.
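
To make the "Markov chains" jab concrete, here is a toy sketch (my own illustration, with a made-up corpus): a first-order chain over words is nothing more than a table of which word follows which, sampled at random.

```python
import random
from collections import defaultdict

# A first-order word-level Markov chain: the whole "model" is a table
# of observed word-to-next-word transitions.

def build_chain(text):
    chain = defaultdict(list)
    words = text.split()
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=8, seed=0):
    random.seed(seed)  # deterministic for demonstration
    word = start
    out = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break
        word = random.choice(followers)  # sample with observed frequencies
        out.append(word)
    return " ".join(out)

corpus = "we can not control fire we can not control writing"
chain = build_chain(corpus)
print(generate(chain, "we"))
```

Every generated sentence is just a walk through pairs that already occurred in the corpus, which is the comparison being made above.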
 
We don't control any of the technologies we created.

Good point.

Fire isn't human technology. It's a force of nature that we (sort of) learned how to handle. I agree with the other examples though.

Besides, AI, contrary to the hype of the past decade, is still not much more than junk statistics ("data science") and Markov chains, a.k.a. dominoes ("neural networks"). And it all breaks down without a beehive of IT staff swarming over the infrastructure day after day to mend things. IMO.

Maybe present IT hasn't reached the level some people expected 10 years ago, but it exceeded expectations some other people had back then. It may take more time, but we definitely can achieve truly and fully autonomous technology.

Twenty years ago, people generally didn't imagine they could have a computer in their phones, yet nowadays it's unthinkable for many to live without smartphones.

There are people making money from sexbots fully capable of holding a conversation and simulating sexual intercourse with human expressions and reactions, and there are people who actually pay for them, presumably because they're satisfied with the level of realism. I used to think this was one of the things science fiction would never get right, but surprisingly the technology exists and there's a market for it. If money is being invested in developing that kind of AI, there is surely far more invested in more serious technology.

There are no limits to human brilliance and stupidity. There are big companies investing a lot of money in research and development of AI, such as Google and Tesla. I think it's not a matter of if, but when.
 
Fire isn't human technology. It's a force of nature that we (sort of) learned how to handle. I agree with the other examples though.

Is intelligence a human invention, though? Or are we just learning to handle it in ourselves? It was already here when we became aware of it.

People generally didn't imagine 20 years ago that they could have a computer in their phones, yet it's generally unthinkable to live without smartphones nowadays.

That's not AI, though, just miniaturization.

Research the "Eliza" program. It started as a joke in the late '60s and early '70s, and it runs on '70s-era home-computer resources. Yet academic psychologists thought it was a real bit of intelligence and were surprised at its realism. It was the most primitive pattern matching you can imagine.
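
The kind of pattern matching Eliza used fits in a few lines. A toy sketch (a hypothetical miniature, not Weizenbaum's actual script): match a keyword pattern, reflect the pronouns, and drop the capture into a canned template.

```python
import re

# A toy Eliza-style responder: match a pattern, reflect pronouns,
# and fill a canned template with the captured phrase.

REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {}?"),
    (re.compile(r"my (.+)", re.I), "Tell me more about your {}."),
]

def reflect(phrase):
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w, w) for w in phrase.lower().split())

def respond(sentence):
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."  # the classic fallback

print(respond("I feel trapped by my own machines"))
# prints: Why do you feel trapped by your own machines?
```

That reflect-and-template trick is essentially the whole illusion that impressed people at the time.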

I was too young back then to have been part of it, but counting it, there have been two AI hypes in my lifetime so far. Technology-wise, I don't see the big advance. Even the programming languages that are fashionable today are badly re-invented wheels compared to what the '70s-'80s AI community used. It's just that the chips run faster and the memory is cheaper nowadays.

Our most impressive "autonomous" AIs still can't outperform an ant in terms of cognitive capacity. And ants feed themselves and don't have to be maintained by human handlers all the time.

I enjoy a good Sci-Fi story utilizing FTL travel, but just because a billionaire launched a sports car or a suggestively shaped rocket into space, I would not get ready to tackle the Grandfather Paradox yet.
 