
Who believes in a Singularity?

Do you believe we'll have a Singularity?

  • Yes

    Votes: 22 75.9%
  • No

    Votes: 7 24.1%

  • Total voters
    29

Architect

Professional INTP
Local time
Today 2:05 PM
Joined
Dec 25, 2010
Messages
6,691
---
Who believes a Singularity is in our future?

If you have any amendments to the Kurzweil Singularity (see link above), please vote and post your ideas. For example, if you believe a Singularity is in our future but will take longer, vote Yes and post a comment to that effect.
 

joal0503

Psychedelic INTP
Local time
Today 9:05 PM
Joined
Dec 10, 2012
Messages
700
---
isn't your presence alone enough to answer "yes"? :D

I just can't be certain, but yeah, IF the current rates of technological progress continue uninterrupted and unaltered by events on this planet, the potential for it to happen is there. The thing that gets to me is that, yeah, it's a possibility, but trying to imagine what something like 'superintelligence' could even be seems to elude the imaginative power of human minds. So to answer: technically, no, at this point I don't believe it's going to happen, but I do believe it's a possibility.

This has always interested me a great deal: the concept, or possibility, that this will eventually lead us into some form of time perception/distortion/extension (but not really an extension)...

First of all, there is this research - I'm not a neurophysiologist, but you've probably all heard of it - showing that you actually make decisions before your conscious ego is aware that the decision has been made, that there's a slight time lag. So when you think you're making certain kinds of decisions, brainwave studies show it's already a done deal. But time is set by the cycle speed of the hardware you're running on. The human body - we can argue about this because it's different parts - runs at roughly 100 hertz. Very slow. Well, if there is any meaning to the phrase "upload a human being into circuitry" - a lot of Greg Egan's fiction is based around the idea that you can copy yourself into a machine, that you can turn yourself into software - then when you enter a machine environment that's running at a thousand megahertz, you perceive that as vast amounts of time. In other words, all time is, is how much change you can pack into a second. If a second seems to last a thousand years, then ten seconds is ten thousand years.

One could imagine a technology, just in a science fiction mood, where they would come to you in your hospital bed and say: "You have five minutes of life left. Would you like to die, or would you like the five minutes to be stretched to 150,000 years by prosthetic and technical means? You're still going to die in five minutes, but you will be able to lead your elephants over the Alps and write the plays of Shakespeare and conquer the New World and still have plenty of time on your hands." In other words, time is going to become a very plastic medium.

- GUESS WHOOO?
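
A quick back-of-the-envelope sketch of the "time is cycle count" idea in the quote above. The 100 Hz and "thousand megahertz" (1 GHz) figures are the quote's own round numbers; the exact total matters less than the scaling, which is just wall-clock time multiplied by the ratio of the two clock rates.

```python
# Sketch: if subjective time scales with clock speed, experienced duration is
# wall-clock duration times the ratio of the fast clock to the slow one.
# The 100 Hz and 1 GHz figures come from the quote above, not from measurement.

def subjective_years(wall_clock_seconds, slow_hz, fast_hz):
    seconds_per_year = 60 * 60 * 24 * 365.25
    return wall_clock_seconds * (fast_hz / slow_hz) / seconds_per_year

five_minutes = 5 * 60
print(subjective_years(five_minutes, slow_hz=100, fast_hz=1e9))  # roughly 95 subjective years
```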
 

Black Rose

An unbreakable bond
Local time
Today 2:05 PM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
I think that the neuron is more complex than Kurzweil gives it credit for. They use microtubules to store memories chemically. I think that once we learn how this happens, we will be able to grow diamondoid nanocrystals that have the same quantum effects as the neuron (but more efficient), allowing for moments of experience on the terahertz scale. Super-intelligence will happen, but through the transformation of the biological.

http://www.quantumconsciousness.org/
 

Cognisant

cackling in the trenches
Local time
Today 10:05 AM
Joined
Dec 12, 2009
Messages
11,155
---
Anyone who uses the phrase "quantum consciousness" has one.
(In case you don't get it: the word quantum means very, very small.)

Now, regarding the OP, what definition are we using?
If it's the point where the complexity of technology exceeds the ability of any single human being to understand it, I wager that for the majority of the world's population we've actually passed that point. Try explaining computer science or quantum mechanics to the average person on the street and they'll get a headache; even I don't really understand QMech, because I'm not willing to invest the time teaching myself to think that way.

The fact of the matter is that the human brain has limitations, and the way the world is getting divided up into increasingly precise specialisations just goes to show how we're trying, and failing, to overcome them.
 

Hadoblado

think again losers
Local time
Tomorrow 6:35 AM
Joined
Mar 17, 2011
Messages
7,065
---
I want it.

But do I believe it will happen?

I think it probable.

We need it to both be possible to create vastly superior minds and for that potential (if it exists) to be realized before our imminent extinction. I am not sufficiently well-read to hazard a very educated guess, but I do not know of any principle that would deny us the possibility of creating a slightly superior mind. I do not believe that the science curve will continue its current trajectory forever, but the mind is a place that is ripe for continued exploration.

In short, yes I think there is a technological singularity in our future, though I could be wrong, and I am not entirely optimistic about the time frame.
 

Cognisant

cackling in the trenches
Local time
Today 10:05 AM
Joined
Dec 12, 2009
Messages
11,155
---
I think the point where people generally accept that the singularity has happened will be when we start becoming dependent upon artificial intelligence to know things for us. Say, for example, a computer programmer needs to make a program do something, such as interfacing with some other system he's not familiar with; instead of learning that system himself, he gets the AI to design and code it based upon his specifications.

Now if an AI can do that, then it's only a matter of time until programmers don't actually write code anymore. Instead their job becomes more theoretical: figuring out what the program will do and how it does it, while actually writing the program is left to the infallible expertise of the AI. Thus the programmer becomes a programming interface, a bridge between the person who needs a program but has no idea what to ask for and the AI that knows how to program but doesn't know how to get the customer to express what he wants in terms it can understand.

This sort of thing is a decade or two away, at most.

It's when technology becomes so hands-off, so theoretical, that it may as well be magic as far as the customer (or even the programmer) knows what's going on at the machine-code level. Heck, if you ask the average software developer now to write something in binary they'll laugh at you, because that's a party trick (for nerdy parties), not something an actual modern software developer needs to know.
 

BigApplePi

Banned
Local time
Today 4:05 PM
Joined
Jan 8, 2010
Messages
8,984
---
Location
New York City (The Big Apple) & State
One can store an awful lot of data in a single computer to think about. But that computer will be stuck in a single place. Our brain allows us to move around and collect data on a continuing basis. That way we keep up.

So I see distributed processing somehow where input is continually supplied. How do we coordinate this ... regardless of its brain-power? Do we have independent computers, who like people, get together and coordinate? What does this coordination do? What is brainpower? If decisions have to be made, does morality enter in? And who is to define this morality? People or computer?
 

Architect

Professional INTP
Local time
Today 2:05 PM
Joined
Dec 25, 2010
Messages
6,691
---
I think that the neuron is more complex than Kurzweil gives it credit for. They use microtubules to store memories chemically.

No, he gives neurons plenty of credit for complexity. That does not a priori mean that intelligent agents need as much complexity. For example, much of the neuron is given over to cell maintenance, which is not necessary in silicon.

Additionally, don't mix up hardware complexity and software complexity. The entire brain could be simulated on a single, simple state machine (consider the Turing machine).
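
To illustrate the Turing-machine point with a toy sketch (not anything from Kurzweil): an extremely simple control mechanism plus a tape can, in principle, run any computation; the complexity lives in the rule table and the tape contents, not in the hardware.

```python
# Minimal Turing machine sketch: a fixed rule table plus a tape. This toy
# machine only increments a binary number, but the same skeleton can, in
# principle, run any computation given the right rules.

def run_tm(tape, head, state, rules, blank="_", max_steps=10_000):
    tape = list(tape)
    while state != "halt" and max_steps > 0:
        max_steps -= 1
        if head < 0:                 # grow the tape if the head walks off an end
            tape.insert(0, blank)
            head = 0
        elif head >= len(tape):
            tape.append(blank)
        state, write, move = rules[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).strip(blank)

# Rules for binary increment, head starting on the least significant bit.
rules = {
    ("inc", "1"): ("inc", "0", "L"),   # carry: flip 1 -> 0 and keep moving left
    ("inc", "0"): ("halt", "1", "L"),  # absorb the carry and stop
    ("inc", "_"): ("halt", "1", "L"),  # ran off the left edge: new leading digit
}

print(run_tm("1011", head=3, state="inc", rules=rules))  # "1100" (11 + 1 = 12)
```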

I think that once we learn how this happens, we will be able to grow diamondoid nanocrystals that have the same quantum effects as the neuron (but more efficient), allowing for moments of experience on the terahertz scale. Super-intelligence will happen, but through the transformation of the biological.

http://www.quantumconsciousness.org/

Oh my god, please do not bring that nonsense up. Or at least get a degree in QM and we can discuss the billion ways that doesn't make sense.

Now, regarding the OP, what definition are we using?

As I say, in the Kurzweil sense. A Singularity is a point beyond which we can't presently predict what will occur, because technological change will be happening so fast.


One can store an awful lot of data in a single computer to think about. But that computer will be stuck in a single place. Our brain allows us to move around and collect data on a continuing basis. That way we keep up.

Google Street View cars are hooked up to the internet, taking pictures, and are certainly mobile.

So I see distributed processing somehow where input is continually supplied. How do we coordinate this ... regardless of its brain-power? Do we have independent computers, who like people, get together and coordinate? What does this coordination do? What is brainpower? If decisions have to be made, does morality enter in? And who is to define this morality? People or computer?

Not sure what you're asking, but it will probably look something like the internet, which is a set of competing agents. Through their competition you get cooperation. Multicellular life formed this way too, as unicellular organisms discovered they could combine, competitively/antagonistically, and beat the competition. Evidence for this is that the majority of the cells in your body aren't your own.
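
As a side note, the "competition yields cooperation" claim has a standard toy illustration that isn't from this thread: in an iterated prisoner's dilemma, purely self-interested agents that reciprocate end up cooperating and outscoring pure defectors over repeated play. A minimal sketch:

```python
# Toy iterated prisoner's dilemma: self-interested agents, repeated play.
# Tit-for-tat (reciprocate whatever the other side did last round) ends up
# mostly cooperating and outscores always-defect across the round-robin.

from itertools import combinations_with_replacement

PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def always_defect(my_moves, their_moves):
    return "D"

def tit_for_tat(my_moves, their_moves):
    return their_moves[-1] if their_moves else "C"

def play(strat_a, strat_b, rounds=200):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a = strat_a(hist_a, hist_b)
        b = strat_b(hist_b, hist_a)
        pa, pb = PAYOFF[(a, b)]
        hist_a.append(a); hist_b.append(b)
        score_a += pa; score_b += pb
    return score_a, score_b

strategies = {"always_defect": always_defect, "tit_for_tat": tit_for_tat}
totals = dict.fromkeys(strategies, 0)
for (name_a, sa), (name_b, sb) in combinations_with_replacement(strategies.items(), 2):
    score_a, score_b = play(sa, sb)
    totals[name_a] += score_a
    totals[name_b] += score_b

print(totals)  # the reciprocating strategy comes out ahead
```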
 

Cognisant

cackling in the trenches
Local time
Today 10:05 AM
Joined
Dec 12, 2009
Messages
11,155
---
As I say, in the Kurzweil sense. A Singularity is a point beyond which we can't presently predict what will occur, because technological change will be happening so fast.
Err, we're kind of at that point now. I make many generalised predictions, but in all honesty, if I talk specifics or give a projected timeline I'm talking out of my ass.

Evidence for this is that the majority of the cells in your body aren't your own.
Well there goes my sleep tonight.
 

BigApplePi

Banned
Local time
Today 4:05 PM
Joined
Jan 8, 2010
Messages
8,984
---
Location
New York City (The Big Apple) & State
BAP: Do we have independent computers, who like people, get together and coordinate? What does this coordination do? What is brainpower? If decisions have to be made, does morality enter in? And who is to define this morality? People or computer?
Not sure what you're asking, but it will probably look something like the internet, which is a set of competing agents. Through their competition you get cooperation. Multicellular life formed this way too, as unicellular organisms discovered they could combine, competitively/antagonistically, and beat the competition. Evidence for this is that the majority of the cells in your body aren't your own.
The internet. That's interesting. There is cooperation and competition on the internet, but no coordination. There is Wikipedia. That's something for brains, but it doesn't DO anything. People use it. What computer is going to make moral decisions? By moral decisions I mean issues of good and bad:

Like should a country be split up or united with another? What are we after: human welfare or quick information? Who should govern, computer or human? Do we seek the bestest for the mostest or euthanasia? What should be built up and what should be torn down?
If we let a cellular computer world answer these (Darwinian?) questions, isn't there a danger of going off in the wrong direction without some moral or directional guidance? Think "J" versus "P."

Am I on or off topic?
 

Cognisant

cackling in the trenches
Local time
Today 10:05 AM
Joined
Dec 12, 2009
Messages
11,155
---
That's something for brains, but it doesn't DO anything. People use it. What computer is going to make moral decisions? By moral decisions I mean issues of good and bad:
Thank you for clarifying for the moral relativists in the audience, it's appreciated ;)

Well, for the most part I imagine that duty will be delegated to humans, like how drones over Afghanistan can do everything within their mission parameters except fire of their own volition; a human operator has to make the call. Even when AIs are sophisticated enough to have moral judgment, there still won't be terminators until we're comfortable holding an AI on trial for war crimes.
 

Architect

Professional INTP
Local time
Today 2:05 PM
Joined
Dec 25, 2010
Messages
6,691
---
The internet. That's interesting. There is cooperation and competition on the internet, but no coordination.

There is coordination, much like how the world works: competing agents working together. There is some economics supporting the idea that people (agents) working for selfish reasons can actually create value for others, including competitors.

What you're talking about is a kind of 'uber mind' which doesn't presently exist anywhere.

There is Wikipedia. That's something for brains, but it doesn't DO anything. People use it. What computer is going to make moral decisions? By moral decisions I mean issues of good and bad:

Which people are going to make that decision? We struggle with these questions presently.
Like should a country be split up or united with another? What are we after: human welfare or quick information? Who should govern, computer or human? Do we seek the bestest for the mostest or euthanasia? What should be built up and what should be torn down?
If we let a cellular computer world answer these (Darwinian?) questions, isn't there a danger of going off in the wrong direction without some moral or directional guidance? Think "J" versus "P."

Am I on or off topic?

These are all good questions. A Singularity will raise these and many more that need to be answered.
 

Nick

Frozen Fighter
Local time
Today 11:05 PM
Joined
Jan 7, 2013
Messages
349
---
Location
Isles of Long
@Architect wouldn't you be compelled to think that a Singularity has already occurred sometime in the last 13.77 billion years and we're merely the byproduct of this?

Why would such a significant event, most likely the largest thing ever to happen in our world (and the universe, but that derails my train of thought), the Technological Singularity, an idea dreamt up 60 years ago and predicted to happen within 20-40 years from now, occur at this very point in our lives, over the vast space and time of the universe?

If you factor in the possibility and probability of a Singularity having happened at any point in the past, then what are the results of that Singularity? Look past petty moral ideas and this planet.
 

Cognisant

cackling in the trenches
Local time
Today 10:05 AM
Joined
Dec 12, 2009
Messages
11,155
---
wouldn't you be compelled to think that a Singularity has already occurred sometime in the last 13.77 billion years and we're merely the byproduct of this?
Fuck no.

Study the human condition a bit. If this is intelligent design, then I can scarcely believe the utter insanity that dreamt it up. The very fact that people do commit suicide, an act totally against the makeup of their being, just goes to show how poorly designed this world is, unless it was designed by a sadist.
 

Architect

Professional INTP
Local time
Today 2:05 PM
Joined
Dec 25, 2010
Messages
6,691
---
@Architect wouldn't you be compelled to think that a Singularity has already occurred sometime in the last 13.77 billion years and we're merely the byproduct of this?

You mean somewhere in the universe, or locally? We know it hasn't happened locally yet.

Why would such a significant event, most likely the largest thing ever to happen in our world (and the universe, but that derails my train of thought), the Technological Singularity, an idea dreamt up 60 years ago and predicted to happen within 20-40 years from now, occur at this very point in our lives, over the vast space and time of the universe?

Have you read The Singularity is Near? The idea is that the history of our planet has been leading up to a Singularity, which just means a time when change is happening so fast that we presently can't predict what will happen after that event. Certainly that hasn't happened yet, and if you plot out the growth of technology (cf. Kurzweil) you can estimate when the Singularity will happen.
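
For what it's worth, the extrapolation step is simple arithmetic once you assume a fixed doubling period; here is a minimal sketch (the doubling period and the required capability multiplier below are placeholder inputs, not Kurzweil's actual figures):

```python
# Sketch of the extrapolation: with a fixed doubling period, the year some
# capability reaches a given multiple of today's level follows from log2.
# The example inputs are placeholders, not Kurzweil's actual figures.
import math

def crossover_year(start_year, doubling_period_years, required_multiplier):
    doublings_needed = math.log2(required_multiplier)
    return start_year + doublings_needed * doubling_period_years

# e.g. a hypothetical 2-year doubling period and a millionfold gap to close:
print(crossover_year(2013, 2, 1e6))  # about 2053
```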

If you factor in the possibility and probability of a Singularity having happened at any point in the past, then what are the results of that Singularity? Look past petty moral ideas and this planet.

See above; we can't.
 

kora

Omg wow imo
Local time
Today 9:05 PM
Joined
Apr 3, 2012
Messages
2,276
---
Location
Armchair
Ha, in a way human beings are the nature-created technological singularity, and yes the way things are going it's certainly a possibility.
 

Agent Intellect

Absurd Anti-hero.
Local time
Today 4:05 PM
Joined
Jul 28, 2008
Messages
4,113
---
Location
Michigan
I'll define the singularity as such: when human input into technological systems is made obsolete - i.e. computers are capable of maintaining, upgrading, running, repairing, and operating other computers such that human input would actually be a hindrance.

I believe this will happen within the next 100 years, provided a few assumptions:

1. We don't wipe ourselves out.
2. Progress remains constant.
3. Politics don't intervene.

I think Kurzweil is a bit optimistic about his timeline, probably with a large grain of bias on his part, given that he's old.

I do agree with his assessment that the order of technological advancements will be biotech -> nanotech -> AI. As a biochemistry major, I've seen a lot happening in the field of biotech, and I think that will be the paradigm of technology for the next 20-30 years. Even nanotech is making advancements (I've attended seminars on the subject), but it is still definitely in its infancy, and biotech advancements will serve to advance nanotech.

I think right now we are in the biotech era. Because of bad PR (on both the left and the right), assumption #3 might become a problem as far as advancement goes, which may hinder assumption #2.
 


Duxwing

I've Overcome Existential Despair
Local time
Today 4:05 PM
Joined
Sep 9, 2012
Messages
3,783
---
Thank you for clarifying for the moral relativists in the audience, it's appreciated ;)

Well, for the most part I imagine that duty will be delegated to humans, like how drones over Afghanistan can do everything within their mission parameters except fire of their own volition; a human operator has to make the call. Even when AIs are sophisticated enough to have moral judgment, there still won't be terminators until we're comfortable holding an AI on trial for war crimes.

Actually, if your vision of reprogramming criminals pans out, then we'd simply edit the machine's software or adjust its hardware in response to an error, however large it may be. And, if I may interject my own bit of idealism, being able to see past our own emotions in a murder trial would be a crowning achievement: too many times, we put mentally unhealthy people in prison rather than treating (or even curing) and then releasing them*.

-Duxwing

*If the solution would be long-term medication, then careful monitoring would be necessary.
 


Matt3737

INFJ
Local time
Today 3:05 PM
Joined
Oct 7, 2012
Messages
155
---
Location
Arkansas
No, it is a form of eschatology that I do not personally support (depending on the specific definition, but in most cases I either disagree or find it semantically trivial in interpretation).
 

bartoli

Member
Local time
Today 10:05 PM
Joined
Jan 5, 2013
Messages
70
---
Location
France
I'll bump this thread since my post is more relevant here than in the '2045' thread.
Isn't the singularity concept ignoring the simple fact that every technology has a physical limit? Let me take a motor as an example. At best, we'll be able to get closer and closer to 100% efficiency in converting the source energy into mechanical energy. If we want more power than that, we will need to build a bigger motor.
Assuming computing power keeps doubling every 2 years and the singularity happens around 2045, that means we still need to multiply current computing capabilities by 2^16 (x65,536) on the same surface of 'computing material', or else we will need to use more of that material. Are there studies showing that, given the limits we know of, the needed level of performance is still reachable?
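
As a quick sanity check of that 2^16 figure (taking the post's own assumptions of a 2-year doubling period and a roughly 32-year horizon from 2013 to 2045):

```python
# Sanity check of the multiplier above: doubling every 2 years from ~2013
# to 2045 gives 16 doublings, i.e. a 65,536-fold increase.
years = 2045 - 2013          # 32 years
doublings = years // 2       # one doubling per 2 years -> 16
print(2 ** doublings)        # 65536
```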
 

Architect

Professional INTP
Local time
Today 2:05 PM
Joined
Dec 25, 2010
Messages
6,691
---
I'll bump this thread since my post is more relevant here than in the '2045' thread.
Isn't the singularity concept ignoring the simple fact that every technology has a physical limit? Let me take a motor as an example. At best, we'll be able to get closer and closer to 100% efficiency in converting the source energy into mechanical energy. If we want more power than that, we will need to build a bigger motor.

Read the book. The Law of Accelerating Returns only applies to information technologies. Yes, we don't see accelerating returns in jet engines or electric shavers.
 