
The Technological Singularity . . .

Idealist
Posts: 2,520
12/22/2013 7:31:30 PM
Okay, it sounds like the scientific version of the Rapture to me:

http://en.wikipedia.org...

But great minds and researchers (most notably Ray Kurzweil) are saying that breakthroughs in, and integration of, certain technologies will soon give us a world of eternal life, resurrection of the dead, and beings great enough to rival God. At first I dismissed it, but the more I've studied, the more I've begun to wonder. Is science really capable of virtually anything?
Bullish
Posts: 3,527
12/22/2013 7:53:23 PM
At 12/22/2013 7:31:30 PM, Idealist wrote:
Is science really capable of virtually anything?

I don't think the question is phrased correctly. As science is the study of the universe, and the universe can do whatever it can do, science can do anything the universe can do, which includes the resurrection of the dead. The question is whether we as humans can control it. And I say no.
0x5f3759df
Idealist
Posts: 2,520
12/22/2013 8:31:01 PM
At 12/22/2013 7:53:23 PM, Bullish wrote:
At 12/22/2013 7:31:30 PM, Idealist wrote:
Is science really capable of virtually anything?

I don't think the question is phrased correctly. As science is the study of the universe, and the universe can do whatever it can do, science can do anything the universe can do, which includes the resurrection of the dead. The question is whether we as humans can control it. And I say no.

I don't think I've ever heard it put quite that way. The universe cannot do anything unless it is sentient or made to be operational. The whole idea is that man will gain power through the use of his machines, possibly even joining with them in cyborg form as the next great leap. If an intelligent computer can be built that itself has the capacity to design a next generation more intelligent than itself, then the leaps of intelligence would become exponential. If that were possible, the only question would lie in our own ability to maintain control of the machines. We've all seen or heard of "The Matrix," I'm sure. Do you believe a computer can replace a human mind? We are creators ourselves, so can we create more and greater creators? What physical factors do you imagine could halt the drive of technological progress?
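
A minimal sketch of the recursive self-improvement idea described above, assuming each machine generation designs a successor a fixed factor smarter than itself; the starting level, factor, and generation count are illustrative assumptions, not figures from this thread:

# Toy model of recursive self-improvement: every generation of machine
# designs a successor whose "intelligence" is a fixed multiple of its
# own. All numbers are illustrative assumptions.

def self_improvement(start=1.0, factor=1.5, generations=10):
    """Yield (generation, intelligence) for successive machine designs."""
    level = start
    for gen in range(generations):
        yield gen, level
        level *= factor  # each generation builds a smarter successor

for gen, level in self_improvement():
    print(f"generation {gen:2d}: intelligence {level:8.2f}")
# Growth is geometric in the generation number -- the exponential
# "leaps of intelligence" the post describes -- so long as the factor
# stays above 1 at every step.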
Bullish
Posts: 3,527
12/22/2013 8:41:33 PM
At 12/22/2013 8:31:01 PM, Idealist wrote:
At 12/22/2013 7:53:23 PM, Bullish wrote:
At 12/22/2013 7:31:30 PM, Idealist wrote:
Is science really capable of virtually anything?

I don't think the question is phrased correctly. As science is the study of the universe, and the universe can do whatever it can do, science can do anything the universe can do, which includes the resurrection of the dead. The question is whether we as humans can control it. And I say no.

I don't think I've ever heard it put quite that way. The universe cannot do anything unless it is sentient or made to be operational.

We weren't in the same boat there, but I get what you're saying now, so I'll address it more directly.

The whole idea is that man will gain power through the use of his machines, possibly even joining with them in cyborg form as the next great leap. If an intelligent computer can be built that itself has the capacity to design a next generation more intelligent than itself, then the leaps of intelligence would become exponential. If that were possible, the only question would lie in our own ability to maintain control of the machines. We've all seen or heard of "The Matrix," I'm sure. Do you believe a computer can replace a human mind? We are creators ourselves, so can we create more and greater creators? What physical factors do you imagine could halt the drive of technological progress?

I think we can create things that are more intelligent than we are, in every respect. But it becomes more about chance than innovation at a certain point. Humans evolved from much less sentient animals, so nature clearly has a way of creating smarter beings. I don't see why we humans can't just become smarter on our own, by means of natural/artificial selection. The problem with that, though, is that it tends to become more and more difficult to evolve as we become more complex; most proponents of the Singularity envision exponential growth rather than logarithmic growth (there may be an upper limit, for all I know).
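
For a sense of the contrast drawn above, here is a small numerical sketch comparing exponential growth with a logistic curve that has a ceiling; the rate and ceiling are arbitrary illustrative values. The two shapes look nearly identical early on, which is part of why an upper limit is hard to rule in or out from early trends alone:

# Exponential growth vs. a logistic curve with a ceiling (illustrative
# constants only). The logistic is scaled so both start at 1 at t = 0.
import math

RATE = 0.5
CEILING = 100.0

def exponential(t):
    return math.exp(RATE * t)

def logistic(t):
    # Standard logistic function with carrying capacity CEILING.
    return CEILING / (1.0 + (CEILING - 1.0) * math.exp(-RATE * t))

for t in range(0, 21, 4):
    print(f"t={t:2d}  exponential={exponential(t):10.1f}  logistic={logistic(t):6.1f}")
# The two curves track each other at first; by t = 20 the exponential
# has run away while the logistic has saturated near its upper limit.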

I wouldn't say there is a specific obstacle other than our own intelligence. I'd say it's about luck. I say we will definitely have beings smarter than ourselves in the future, be they other humans or machines. Whether the exponential growth will come to be, I don't know; I doubt it.
0x5f3759df
Idealist
Posts: 2,520
12/22/2013 8:56:11 PM
At 12/22/2013 8:41:33 PM, Bullish wrote:
At 12/22/2013 8:31:01 PM, Idealist wrote:
At 12/22/2013 7:53:23 PM, Bullish wrote:
At 12/22/2013 7:31:30 PM, Idealist wrote:
Is science really capable of virtually anything?

I don't think the question is phrased correctly. As science is the study of the universe, and the universe can do whatever it can do, science can do anything the universe can do, which includes the resurrection of the dead. The question is whether we as humans can control it. And I say no.

I don't think I've ever heard it put quite that way. The universe cannot do anything unless it is sentient or made to be operational.

We weren't in the same boat there, but I get what you're saying now, so I'll address it more directly.

The whole idea is that man will gain power through the use of his machines, possibly even joining with them in cyborg form as the next great leap. If an intelligent computer can be built that itself has the capacity to design a next generation more intelligent than itself, then the leaps of intelligence would become exponential. If that were possible, the only question would lie in our own ability to maintain control of the machines. We've all seen or heard of "The Matrix," I'm sure. Do you believe a computer can replace a human mind? We are creators ourselves, so can we create more and greater creators? What physical factors do you imagine could halt the drive of technological progress?

I think we can create things that are more intelligent than we are, in every respect. But it becomes more about chance than innovation at a certain point. Humans evolved from much less sentient animals, so nature clearly has a way of creating smarter beings. I don't see why we humans can't just become smarter on our own, by means of natural/artificial selection. The problem with that, though, is that it tends to become more and more difficult to evolve as we become more complex; most proponents of the Singularity envision exponential growth rather than logarithmic growth (there may be an upper limit, for all I know).

Yes, the problem seems to be that they not only believe the Singularity is possible, they also want to be part of it. That means pushing all the throttles to their stops. Even if humans are continuing to evolve (which is hotly debated right now), that kind of progress is much too slow in this age of instant gratification. The last attempt at artificial selection (eugenics) ended in public outrage and dealt science a blow from which it still hasn't recovered. I've written to some relevant scientists about this, and one answered with interest. He says that we are nearing the end of the silicon age, so Moore's Law will break down before we can reach such computing potential, but that doesn't rule out possibilities like quantum computing. The last part of what you said is what I was pondering: is there an actual upper limit to what technology can achieve?
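
A back-of-the-envelope version of the "end of the silicon age" point above: if feature sizes halve on a fixed cadence, they reach atomic dimensions within a couple of decades. The starting node and atom size below are rough figures, and the halving cadence is an assumption:

# Rough estimate of how long classical feature-size scaling can
# continue before hitting atomic dimensions. All inputs are
# approximate, assumed values.
import math

feature_nm = 22.0    # roughly a 2013-era process node, in nanometres
atom_nm = 0.2        # approximate diameter of a silicon atom
halving_years = 2.0  # assumed halving period for feature size

halvings = math.log2(feature_nm / atom_nm)
print(f"halvings until atomic scale: {halvings:.1f}")
print(f"years at that cadence:       {halvings * halving_years:.0f}")
# About 6-7 halvings (~13 years on these assumptions) before feature
# sizes reach single atoms -- one way to see why Moore's Law is
# expected to break down well short of unlimited computing power.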

I wouldn't say there is a specific obstacle other than our own intelligence. I'd say it's about luck. I say we will definitely have beings smarter than ourselves in the future, be they other humans or machines. Whether the exponential growth will come to be, I don't know; I doubt it.

So even though we don't yet know how our own minds work, or even what consciousness is, you still feel confident that we will surpass ourselves? Let me ask you, then: do you think that would be wise? It seems to me that growth can be too rapid to survive. Even a physical explosion is nothing more than a rapid expansion.
Bullish
Posts: 3,527
12/22/2013 9:14:14 PM
At 12/22/2013 8:56:11 PM, Idealist wrote:
At 12/22/2013 8:41:33 PM, Bullish wrote:
I think we can create things that are more intelligent than we are, in every respect. But it becomes more about chance than innovation at a certain point. Humans evolved from much less sentient animals, so nature clearly has a way of creating smarter beings. I don't see why we humans can't just become smarter on our own, by means of natural/artificial selection. The problem with that, though, is that it tends to become more and more difficult to evolve as we become more complex; most proponents of the Singularity envision exponential growth rather than logarithmic growth (there may be an upper limit, for all I know).

Yes, the problem seems to be that they not only believe the Singularity is possible, they also want to be part of it. That means pushing all the throttles to their stops. Even if humans are continuing to evolve (which is hotly debated right now), that kind of progress is much too slow in this age of instant gratification. The last attempt at artificial selection (eugenics) ended in public outrage and dealt science a blow from which it still hasn't recovered. I've written to some relevant scientists about this, and one answered with interest. He says that we are nearing the end of the silicon age, so Moore's Law will break down before we can reach such computing potential, but that doesn't rule out possibilities like quantum computing. The last part of what you said is what I was pondering: is there an actual upper limit to what technology can achieve?

I believe as well that Moore's Law's semiconductor age is nearing its end. But I've read a bit on quantum computers, and they seem very promising. Also, with quantum computing it's no longer just about memory; it's about uncertainty, and the theoretical "2" that singularity predictors imagine, which could bring infinite^2 possibilities. I have nothing intelligent to say when it comes to upper limits. We may be in the middle of a logistic curve, or we may have infinite potential.
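
One plausible reading of the "2" mentioned above is the two basis states of a qubit: n qubits describe a superposition over 2^n amplitudes, whereas n classical bits hold a single n-bit value at a time. A minimal illustration (the qubit counts are arbitrary examples):

# State-space comparison: n classical bits vs. n qubits. The qubit
# counts are arbitrary examples; the 2**n scaling is the point.
for n in (8, 16, 32, 64):
    print(f"{n:3d} classical bits: one of {2**n:,} values at any moment")
    print(f"{n:3d} qubits:         {2**n:,} amplitudes in superposition")
# The state space grows as 2**n -- exponential, not literally infinite,
# but vastly beyond what the same number of classical bits can
# represent at once.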

I wouldn't say there is a specific obstacle other than our own intelligence. I'd say it's about luck. I say we will definitely have beings smarter than ourselves in the future, be they other humans or machines. Whether the exponential growth will come to be, I don't know; I doubt it.

So even though we don't yet know how our own minds work, or even what consciousness is, you still feel confident that we will surpass ourselves? Let me ask you, then: do you think that would be wise? It seems to me that growth can be too rapid to survive. Even a physical explosion is nothing more than a rapid expansion.

Yes, I am fairly confident that we can surpass something we fail to understand, or may never understand; it'll take a good deal of guessing, and luck. Sometimes, if we don't look at it, it has the potential to perform better.

I think it would be wise, regardless of whether it would end in our destruction. I don't care much for human existence anyway, especially if something better is created in the process. I would still urge caution, though; obviously it wouldn't be very wise to cast a new singularity virus onto the Internet without constraint.
0x5f3759df
AlbinoBunny
Posts: 3,781
12/24/2013 7:49:36 AM
I'm sure we could do a lot of things. The important questions for me are: should we do them? And would we do them?
bladerunner060 | bsh1, 2014! Presidency campaign!

http://www.debate.org...
http://www.debate.org... - Running for president.
http://www.debate.org... - Running as his vice president.

May the best man win!
Idealist
Posts: 2,520
12/26/2013 8:38:33 PM
At 12/24/2013 7:49:36 AM, AlbinoBunny wrote:
I'm sure we could do a lot of things. The important questions for me are: should we do them? And would we do them?

Yeah, I'd have to agree with that. Human beings are pretty good at focusing more on what they want than on what actually benefits them.