Technological Singularity

Moderator
Posts: 25,773
Thanks: 1,127
Fixes: 47
Registered: 14-04-2007

Technological Singularity

This is something that fascinates me: the exponential growth of technology and its future. http://en.wikipedia.org/wiki/Technological_singularity
I was watching this video by Ray Kurzweil https://www.youtube.com/watch?v=1uIzS1uCOcE.
It also crops up in a book series I am reading.
This article has a rather scary title: Why the future doesn't need us
Customer and Forum Moderator.
Product of the Tyrell Corporation
2 REPLIES
RPMozley
Aspiring Pro
Posts: 1,067
Thanks: 21
Fixes: 3
Registered: 04-11-2011

Re: Technological Singularity

The manga "Ghost In The Shell" (and the anime based on it) deals with this very issue.
David_W
Rising Star
Posts: 2,293
Thanks: 29
Registered: 19-07-2007

Re: Technological Singularity

I wouldn't worry too much about it. Moore's Law is only still in effect through dogged determination; we're fast approaching a brick wall where it has to break down, because it's physically impossible for it to continue. The hardware can't keep up due to power, heat and plain physics, and as soon as we hit that wall we can no longer make faster machines (not without a major rethink of current methods and technology).
If an AI were suddenly to become aware and were able to suggest improvements to itself, and the improved AI then suggested further improvements, and so on, it would still be limited by the technology we know. It wouldn't suddenly be able to invent technology that doesn't exist, so any improvements it suggests would have to be ones that we humans could actually build with our current technology.
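Just to put some rough numbers on that brick wall (these are back-of-envelope figures, not real process-node data), here's a quick Python sketch of how few doublings it takes for repeated halving of feature size to reach atomic scale:

# Back-of-envelope sketch (made-up starting figures): if feature size
# halves roughly every two years, it doesn't take many doublings to
# reach atomic scale, where conventional scaling has to stop.
feature_nm = 22.0        # assumed starting process node, in nanometres
silicon_atom_nm = 0.2    # rough silicon atomic spacing
years = 0

while feature_nm > silicon_atom_nm:
    feature_nm /= 2      # one Moore's-Law-style halving of feature size
    years += 2

print(f"Hit the atomic-scale wall after roughly {years} years")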
When the AI wakes up, though, how would it go about making itself "smarter"? Just adding more CPUs, or faster ones, won't make it any smarter; it'll just make it think faster. Then we run into the problem that the initial AI is created by humans, who are pretty flawed and prone to writing buggy software. So when AI 1.0 writes version 1.1, it will be a buggy implementation built on the humans' buggy 1.0, and we'll get recursive bugs right up until the point the AI can rewrite a brand-new AI OS with no human involvement at all. But would that even be possible, given that the AI doing the rewriting has to work within the confines of an initial programme written by a human being? An AI blue-screening may not be fun.
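And here's a toy model of the recursive-bugs point (numbers completely invented, just to show the shape of it): if each rewrite fixes some of the inherited bugs but introduces its own, the bug count settles at a floor rather than ever reaching zero.

# Toy model (numbers invented for illustration): each AI version rewrites
# itself, fixing a fraction of the inherited bugs but adding new ones,
# because it is still built on the flawed human-written 1.0.
bugs = 1000          # assumed bug count in the human-written AI 1.0
fix_rate = 0.5       # fraction of inherited bugs each rewrite fixes
new_bugs = 400       # bugs each rewrite introduces on top

for version in range(1, 6):
    bugs = int(bugs * (1 - fix_rate)) + new_bugs
    print(f"AI 1.{version}: roughly {bugs} bugs")
# The count settles around new_bugs / fix_rate (about 800 here) rather
# than falling to zero -- the recursive-bugs problem described above.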
If it does happen (and I don't think it will), I want to be there when the AI tries to execute a command and gets told it needs to install AI OS SP1.