Singularity is a bad idea
Debate Rounds (4)
First off, I will list the "good" that is said to come from the Singularity, and then explain why I think it will not be as good as advertised. There are obviously A LOT more than what I have listed, but for the sake of time and space, I hope my opponent can accept what I have listed and argue from there.
1.) Immortality
2.) Freedom
3.) Utopia
4.) Peace
5.) End of capitalism
6.) Sci-fi fantasy world
7.) Man and machine
1.) Who wouldn't want to live forever? We have seen it in the movies, and most of us envy the very idea of immortality: the ability to live forever, to live through history itself and witness the world change forever! Sounds cool and legitimate, right? But here is the other side: humans are still evolving. If you introduce immortality, our bodies will physically stay the same, so our minds will have to adapt to the world instead of our bodies. Imagine not having the right tongue to speak the languages of one million AD. Imagine not having the right digestive system to eat the foods of millions of years from now. Will you enjoy the same entertainment? You would go crazy! And imagine a war over eternal life. If wars were harsh with the loss of life, think how wars will be when eternal life is the means by which they are fought.
2.) Advocates of the Singularity propose this idea of freedom: that you will be able to do whatever you want. You will live in a world where boundaries are endless and failure is a thing of the past. But with great power comes great responsibility... We could have the power to change our race, sex, age, color, and brain to fit our needs. We could have uploaded minds, where the brain is a storage device, so we won't forget anything (I wouldn't mind that)! Unfortunately, this sounds more like advertising than proclaiming real ideas for the future. If what they say is true, what could be done that would NOT be good, given the "freedom" we would hold? Once we have robots with a higher intelligence level than ours, we will become enslaved by the very thing we created.
3.) Utopia is claimed to be the perfect society, but perfect in whose eyes? Who will see to it? Not the average Joe. The movement toward a utopia is the first step toward a totalitarian government. The focus on a paradise gives the government full rein over your rights as a citizen. Take away a little freedom of speech, and you will have paradise. Once our government sees that a utopia is possible, what will it do to achieve it? A utopia ASSUMES that humans do not have the right traits to survive in the world; therefore, we need a state that removes those traits so we get a perfect world. Imagine the government taking away everything that makes you YOU just so you can conform to the confines of this new society. Would you be willing to do that for the perfect society?
4.) Peace from the Singularity is about as believable as a football bat. The Singularity proposes the idea that we will all be protected from any form of violence. With a perfect society, what could possibly go wrong? Something is bound to go wrong. With so much power running through technology, someone WILL have the ability to hack it. We didn't think we could be attacked on 9/11, but we were. Hacking will be the new form of terrorism, just as plausible as a bomb in a building. And what happens when something does go wrong? The government will try to hide it, as it has in past years, in order to maintain the image of a perfect society.
5.) Not everybody is a fan of capitalism nowadays. But when you don't have capitalism, you have a dictatorship, AKA communism. Instead of working to benefit yourself (capitalism), you will work to benefit the government (communism). Everything you do will be monitored by the government to make sure it benefits the whole rather than you. Imagine a world where nothing you do is done for you. Capitalism will be gone, and you will have a dictator who rules over this perfect society and oppresses anyone who detracts from the idea, whether they are civilians or politicians. A utopia is the perfect society for anyone power-hungry enough to achieve it.
6.) Flying cars! Robots! Floating mansions! Time travel! Yayyy! But for what? When the Singularity strikes, we will become a society even more materialistic than we are now. With so much possible and almost "no limitations", what's to prevent someone from exploiting what is given to them: freedom? When a person is brainwashed into believing he or she is free, they psychologically convince themselves that they really are free. With the ability to do almost anything, the very idea that something is NOT possible becomes unimaginable. So when something is not attainable, that person will do anything to achieve it, because they have the means to do it. Everybody will want to have what they WANT, not what they NEED. We will become a society where enough is not enough and not enough is unacceptable.
7.) To your point about the merging of man and machine and/or when "machine surpasses man": it isn't one or the other. The first step is when man and machine merge to form a single consciousness. The next step is when that intelligence surpasses human intelligence. We are blinded by the idea that we can control what we create, when in fact we don't. When it comes to technology involving humans and robots alike, it is IMPOSSIBLE to know exactly what the technology we create 1000 years from now will do. When the year comes that we can merge man and machine for the very first time, we will not know what the machinery will do, because we have never before given intelligence an essence. We have to be aware that whatever we create can be just as easily manipulated as it was created. When you give machinery a mind of its own, IT IS NOT YOUR MIND. It is completely separate from your image of a "perfect" robot.
I hope this is the start to a great debate! The floor is yours my friend...
Eternal life would be a useful thing if we had the resources to sustain it, but you have presented some easy-to-resolve issues with it. I will have to sidetrack into my biology studies to explain why you are incorrect about evolution.
You see, evolution and natural selection work in combination. The reason Neanderthals don't exist anymore is that something wiped them out which failed to wipe out Homo sapiens; my guess is the use of fire. Natural selection means those who are best adapted to their situation (survival of the fittest) live long enough to reproduce and carry on their genes. This means that evolution occurs, but it doesn't replace the old species unless it is necessary. What advantage could wipe out an immortal society to the point of introducing a new species? And if it did occur, what makes you think you would survive this genophage long enough to meet the new species, which we'll call Novo Hominem (new man)? This means evolution would not be an issue.
Back to immortality. We would never make food we couldn't eat, and my passage above, which will earn me 27 marks in my exam in two weeks' time, explains why: no new species. On language, maybe a new language will arise, but in the world of the Singularity, our built-in microcomputers will be able to translate languages in an instant, erasing the need. Think Star Wars, where everyone understands R2-D2 but speaks to it in English: each person hears in their own language. A war over eternal life would be interesting; no doubt some religions would be opposed, while the wealthy would remain. I believe everyone will be ready to accept death in time. This merely allows us to choose our own time of death. Death waits for no man, but he gets in line for the machines.
2; Freedom. We already experience a lack of privacy in the world; it's undeniable. But your idea of being enslaved is a false premonition. I'll have to draw from more sci-fi, as this is very much the genre. In Mass Effect, there are three kinds of intelligence: Natural Intelligence, i.e., living things; Virtual Intelligence, where a computer has preset answers to a stimulus; and Artificial Intelligence, akin to natural intelligence. You see, in the real world, AI can never truly exist. Things like Cleverbot are VIs. Thus we could never be enslaved by machines, because we would BE the machines. We would be merged with them, physically (a la Deus Ex) and mentally (by cloud storage). You could never put your personality on the computer, though, because AI is an impossibility. Identity is a complex issue.
3; Utopia, the ideal society, doesn't imply totalitarian control. It implies near-perfection in its citizens. 21st century media depicts future societies as dystopias because it makes for a better story setting. Think back to Star Trek in the 60s, around the Cold War, depicting a multinational crew aboard the Enterprise. Utopias do imply people currently lack certain virtues, but stripping away individuality is not a requirement for achieving virtue. The virtues as I understand them are Tolerance, Bravery, Justice, Wisdom, Transcendence (i.e., having a purpose other than to eat/sleep/breed) and Humanity (love). Individuality can still be Tolerated in a Utopia. Differences are to be embraced.
4; Peace. With enhancements in the technologies of war, the Mutually Assured Destruction theory will become ever more prevalent, preventing large-scale war from occurring. Add to that the assumed abundance of resources and the increased usage of the Metaverse, as Neal Stephenson first named it in his amazing book Snow Crash (back in 1992!), which predicted a society where real space wasn't an issue because you stayed at home and went out online instead. Land won't be an issue, and war won't be a requirement. Example: the US's use of fracking has led to a massive amount of gas being found, enough for the next 100 years. Now they have less need of oil and will be less inclined to begin campaigns in oil-rich countries. Also, in these oil-rich countries, where fundamentalist religions are common, the internet and the spread of technology have led to increased education thanks to the development of communications. That leads to fewer people being fanatically religious and lessens the likelihood of terrorism, especially in a world where abundance is present, meaning no one will need to make viruses to steal money, as they will have their necessities in plentiful supply. Crime in general will be lowered.
5; The internet is the ultimate place of freedom. Everyone can express their voice, and as Rich Jeni once said, "The Web brings people together because no matter what kind of a twisted sexual mutant you happen to be, you've got millions of pals out there. Type in 'Find people that have sex with goats that are on fire' and the computer will say, 'Specify type of goat.'" There is a great deal of freedom on the net that easily prevents dictatorship from occurring; anyone who tried to establish one would be challenged by defenders of anonymity, namely Anonymous, the bipolar freedom fighters, as I like to call them (Good and Bad in equal measure). Freedom of speech will never be an issue in the Age of Information. (I would also like to bring up my hypothesis that capitalising a word somehow makes it correct, a theory I will call the Theory of Capitalised Importance.)
6; In a society where resources are limitless, everything can be built by machines imprinted with software, and people will be able to effortlessly maintain these machines, as the knowledge of how to do so will be accessed via microcomputers. People will make things because they enjoy making things. Artists paint because they enjoy painting. Musicians like music and directors like to direct. This plenty could only serve to weed out those who are just in it for the money (looking at you, MB) and support the individual pursuit of dreams and goals. You can't say we wouldn't have what we need; if we didn't have what we needed, we would be dead. We will become a society that only does things because it wants to do them.
7; As earlier mentioned, true AI is an impossibility. Machines can only do what they are programmed to do, and would never be capable of Terminator-style uprisings. Yes, we would be dependent on them, but I am dependent on my lungs and can control them at will. Watch. *takes breath*............ ahhhhh..... *begins choking on bad joke, dies with face on keyboard*
That was fun. See you next round, and yes, I'll be leaving my signature at the bottom of each post.
1. Since your biology studies surpass mine (I failed that class), I shall attack from another direction! Let's talk about the psychological aspect of it. You would have to outlive your family (if immortality is not mass-produced) and see your relatives die off. What would you do when you see your children die? You would lose the incentive to have children in the first place. The point of family is to show you that you belong to something. Eternal life is more like eternal suffering; eternal life is a slow death. The whole point of life is to have a meaning, right? Well, if we live forever, what is our purpose? Purpose gives us a drive to live our lives. A limited life is what gives us the incentive to do things before we die. If life is limitless, why maintain a healthy lifestyle to "prolong" your life? You won't think it's worth it in the end.
2. I am sorry, but I am not familiar with Mass Effect 3 or any character in the game. I do, however, disagree with your statement that my idea of enslavement is "a false premonition". When you give AI decision theory, it will 1.) want to self-improve, 2.) adapt to the world, 3.) make decisions on its own, and 4.) acquire resources without human interference. If robots could soon make decisions on their own, won't they be able to CHOOSE whether to conform to society? If they don't, then, to save me some time, refer to the movie I, Robot. (As for me, ALL CAPS will signify importance, and the theory shall be called the Theory of All-Capitalized Significance.)
3.) Totalitarianism is only a utopia to the rulers. Totalitarianism: the political concept that the citizen should be totally subject to an absolute state authority. With perfection come restrictions and regulations. How do you keep a society perfect, and under whose control will it be to make it perfect? Not the people's. You set rules so nothing can go wrong. Regulations on food, jobs, and even lifestyle would skyrocket, because perfection requires limits. The right food would have to be made a certain way so pollution is no longer a factor and obesity no longer exists. Specific jobs would have to be done in order to maintain a clean, crimeless society. What are you willing to give up to live in a perfect society?
4.) Yes, it would appear that way, but a perfect society would do anything to keep itself perfect. You would never hear about any violent outbreaks over anything. You are assuming that an attack on our country would be a physical attack, such as bombs or invasions. However, with "enhancements into technologies of war", we will no longer need humans or robots to inflict harm on other countries. All that would be needed would be a single human with an IQ of a retarded lab monkey to infiltrate such a vulnerable system. If everything is controlled by technology, everything can be destroyed by technology.
5.) Yes, the Web brings people together, but what I am talking about is the deterioration of any form of government, other than communism, into a dictatorship. Communism FORMS the dictator, and in return he will make sure the ends justify the means. The Web does create a new space of freedom, but a couple of people arguing on Facebook (God knows what it will be in 100 years) about their leader will do nothing to provoke the ruler/dictator. When someone realizes the power they have, the power to control a society to mere perfection, they won't care about a couple of civilians arguing about current policies or resenting regulations. The rulers will be challenged by the "defenders of anonymity", but the ruler won't have a care in the world.
6.) "Want" and "need" are two different words. You WANT a GT 500 Mustang, but you can live without it. You NEED water because you can't live without it. You say it will be a society "where resources are limitless". My point exactly. With unlimited resources, what is there NOT to accomplish? Yes for the people that enjoy what they do will get to do just that. The other side is comprised of people that exploit this surplus of resources, especially the rich. Do they have have the power? Yes. Do they have the means? Yes of course! Sooo...why not? What stopping them? I am just afraid that with the ability to do everything and anything will dissipate the significance of what we do. Today, we have to go to classes to learn a new language. Come the Singularity, all you need a a microchip that can translate any language you hear so there is no point to learn it in the first place. Everything will be done FOR us other than BY us, and that can lead to problems out of our control.
7.) Arggg, this frustrates me... "Machines can only do what they are programmed to do". Not with decision theory in place. In the future, there will come a time when machines WILL have the ability to make decisions for themselves, and they will be able to say "no" when they want. Fortunately, you CAN control your lungs, because they are a part of you. However, AI is NOT under your control, because we will give it control of itself.
That was fun...er. But next round, I shall bring down the hammer. Off to you my brave opponent.
1; Biology isn't my shield; logic is my fortress, and it is defended in all directions. Why would you be able to live forever and not your relatives? You could be a great-great-grandfather in your own lifetime. No, you wouldn't forget your relatives, as your implanted nanocomputers would enhance your memory. Oh, and family can be a lot of things to different people. Sure, family is a sense of belonging to some, but eternal life doesn't prevent that. Rather, an Immortal could decide their own death, be satisfied with the legacy they have left and have death on their own terms. Many consider life too short, and just because the human naturally dies of old age around 80 doesn't mean we have to accept that. Humans naturally died of smallpox before inoculation.
2; My Mass Effect point was about Virtual Intelligence. Yes, I've seen I, Robot. My point is that true AI, or decision-theory AI, is impossible. We have never come close to building an independently thinking machine, because to do so would be to fully understand the brain. If you are worried about robot uprisings, have you heard of Asimov's Laws of Robotics?
The science fiction writer Isaac Asimov devised three rules which ensure robots could never take control.
1; A robot must never harm a human being, or by inaction, allow a human being to come to harm.
2; A robot must obey a human, unless the command defies the first law of robotics.
3; A robot must ensure its own survival, unless it would defy the first and second laws of robotics.
These three laws, fitted with a detailed definition of what counts as harm, act as preventative bubble wrap, keeping machines under control.
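As a purely illustrative aside (nothing from the debate itself), the priority ordering of the three laws can be sketched as an ordered rule check. The predicates `harms_human`, `human_ordered`, and `self_destructive` are hypothetical stand-ins for whatever a real system would have to evaluate:

```python
# Illustrative sketch: Asimov's three laws as an ordered rule check.
# Each law is only consulted if the higher-priority laws don't decide.

def permitted(action):
    # First Law: never harm a human (or allow harm through inaction).
    if action["harms_human"]:
        return False
    # Second Law: obey a human order, unless it violates the First Law
    # (already ruled out above).
    if action["human_ordered"]:
        return True
    # Third Law: protect own existence, unless doing so conflicts with
    # the First or Second Laws (already handled above).
    if action["self_destructive"]:
        return False
    return True

# The ordering matters: a harmful order is refused even though it was
# commanded, while a commanded self-sacrifice is carried out.
harmful_order = {"harms_human": True, "human_ordered": True, "self_destructive": False}
risky_order = {"harms_human": False, "human_ordered": True, "self_destructive": True}
print(permitted(harmful_order))  # False
print(permitted(risky_order))    # True
```

The point of the sketch is only that the laws form a strict priority chain, not parallel constraints: each lower law applies only when the laws above it are silent.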
3; We are working with different definitions of Utopia. Mine was: a utopia is a community or society possessing highly desirable or near-perfect qualities. Regulations will be more and more difficult to install, especially thanks to the internet, which allows communication between people across the world and freedom of speech.
4; Computer viruses are constantly being upgraded, as are the defenses against them. Since virus outbreaks aren't of the severity they used to be, even with more and more people online, I doubt cyber crime could be a possibility. Besides, it takes a genius to devise a working virus capable of infiltrating a decent firewall.
5; The internet is a much greater force than just Facebook. You seem to see the world under one flag, an impossibility even in a future society. No government will be able to rule the entire world, and none will be able to install a single man as ruler either. Every country is on the world stage, and dictators aren't appreciated.
6; Patents only last so long. The formulae for unlimited energy and resources will eventually be shared out, or simply leaked. Then the vast supply of the resource will result in a loss in value of the good, making it affordable for all. Only a monopoly could bring about the exploitation you fear, and governments keep a close eye on those.
7; Decision theory is an impossibility. We know more about distant stars than about why we sleep; that's how complex the brain is. We may someday know its inner workings, but to replicate the most complex organ in the body would be more amazing than any other development in the world. Virtual Intelligence, the closest thing we have, such as Siri or Cleverbot, simply has a list of answers based on input. It cannot think the way a human does, and to project human qualities such as ambition onto it would be ridiculous.
The hammer may fall, but the dagger strikes twice and safely out before a single blow is landed. See you next round my friend.
Pillowface forfeited this round.
1 votes has been placed for this debate.
Vote Placed by BradK 2 years ago
Agreed with before the debate: - / - (0 points)
Agreed with after the debate: - / - (0 points)
Who had better conduct: - / - (1 point)
Had better spelling and grammar: - / - (1 point)
Made more convincing arguments: - / - (3 points)
Used the most reliable sources: - / - (2 points)
Total points awarded: Pro 0, Con 3
Reasons for voting decision: Pro's idea that a perfect society has problems is a paradox: if it has problems, it isn't perfect. Invalid argument. Pro ignored the biology, but I understood the language-barrier-eradicating R2-D2 translator analogy. Con said, "Rather, an Immortal could decide their own death, be satisfied with the legacy they have left and have death on their own terms." A good counter to Pro's argument that death is the thing that gives life meaning, which would mean immortality would remove the meaning from life. Pro said, "All that would be needed would be a single human with an IQ of a retarded lab monkey to infiltrate such a vulnerable system" - a bare-assertion fallacy. Pro's "Arggg this frustrates me... Machines can [have decision algorithms]" countered Con's 3 AI types. Con said, "My point is that true AI, or decision theory Ai, is impossible" - an unsupported argument. P1 is undermined if a psychotic human needs to be killed for the benefit of all. Pro: 2F 1D 1C. Con: 1C 1U. Net=