
7 Totally Unexpected Outcomes That Could Follow the Singularity


PurpleSkyz

Admin
7 Totally Unexpected Outcomes That Could Follow the Singularity
By definition, the Technological Singularity is a blind spot in our predictive thinking. Futurists have a hard time imagining what life will be like after we create greater-than-human artificial intelligences. Here are seven outcomes of the Singularity that nobody thinks about — and which could leave us completely blindsided.
For the purpose of this list, I decided to maintain a very loose definition of the Technological Singularity. My personal preference is that of an intelligence explosion and the onset of multiple (and potentially competing) streams of both artificial superintelligence (SAI) and weak AI. But the Singularity could also result in a kind of Kurzweilian future in which humanity has merged with machines, a Moravecian world in which our "mind children" have left the cradle to explore the cosmos, or a Hansonian society of competing uploads featuring rapid economic and technological growth.
In addition to some of these scenarios, a Singularity could result in a complete existential shift for human civilization, like our conversion to digital life, or the rise of a world free from scarcity and suffering. Or it could result in a total disaster and a global apocalypse. Hugo de Garis has talked about a global struggle for power involving massively intelligent machines set against humanity — the so-called artilect war.
But there are some lesser known scenarios that are also worth keeping in mind, lest we be caught unawares. Here are seven of the most unexpected outcomes of the Singularity.
1. AI Wireheads

It’s generally assumed that a self-improving artificial superintelligence (SAI) will strive to become progressively smarter. But what if cognitive enhancement is not the goal? What if an AI just wants to have fun? Some futurists and scifi writers have speculated that future humans will engage in the practice of wireheading — the artificial stimulation of the brain to experience pleasure (check out Larry Niven’s Known Space stories for some good examples). An AI might conclude, for example, that optimizing its capacity to experience pleasure is the most purposeful and worthwhile thing it could do. And indeed, evolution guides the behavior of animals in a similar fashion. Perhaps a transcending, self-modifying AI will not be immune to similar tendencies.
At the same time, an SAI could also interpret its utility function in such a way that it decides to wirehead the entire human population. It might do this, for example, if it were pre-programmed to be "safe" and to consider the best interests of humans, thus taking its injunction to an extreme. Indeed, an AI could get its value system completely botched by concluding that maximizing pleasure is the highest possible utility for itself and for humans.
As an aside, futurist Stephen Omohundro disagrees with the AI wirehead prediction, arguing that AIs will work hard to avoid becoming wireheads because doing so would be harmful to their goals.
2. “So long and thanks for all the virtual fish”
Imagine this scenario: The Technological Singularity happens — and the emerging SAI simply packs up and leaves. It could just launch itself into space and disappear forever.

But in order for this scenario to make any sense, an SAI would have to conclude, for whatever reason, that interacting with human civilization is simply not worth the trouble; it's just time to leave Earth — Douglas Adams' dolphin-style.
3. The Rise of an Invisible Singleton
It’s conceivable that a sufficiently advanced AI (or a transcending mind upload) could set itself up as a singleton — a hypothetical world order in which there is a single decision-making agency (or entity) at the highest level of control. But rather than make itself and its global monopoly obvious, this god-like AI could covertly exert control over the human population.
To do so, an SAI singleton would use surveillance (including reliable lie detection) and mind-control technologies, communication technologies, and other forms of artificial intelligence. Ultimately, it would work to prevent any threats to its own existence and supremacy, while exerting control over the most important parts of its territory, or domain — all the while remaining invisible in the background.
4. Our Very Own Butlerian Jihad
Another possibility is that humanity might actually defeat an artificial superintelligence, a totally unexpected outcome based on the sheer improbability of it. No doubt, once a malign or misguided SAI (or even a weak AI) gets out of control, it will be very difficult, if not impossible, to stop. But humanity, perhaps in conjunction with a friendly AI or by some other means, could fight back and find a way to beat it down before it can impose its will over the planet and human affairs. Alternatively, future humans could work to prevent it from coming about in the first place.
Frank Herbert addressed these possibilities in the Dune series by virtue of the “Butlerian Jihad” — a cataclysmic event in which the “god of machine logic” was overthrown by humanity and a new fundamental tenet invoked: “Thou shalt not make a machine in the likeness of a human mind.” The Jihad resulted in the destruction of all intelligent machines and the rise of a new feudal society. It also resulted in the rise of the mentat order — humans with extraordinary cognitive abilities who functioned as virtual computers.
5. First Contact
Our transition to a post-Singularity civilization could also expose us to a larger, technologically advanced intergalactic community. There are a number of different possibilities here, and not all of them are good.
First, a post-Singularity civilization (or SAI) might quickly figure out how to communicate with extraterrestrials (either by receiving or transmitting). There may be a kind of cosmic internet that we’re oblivious to, but which only advanced civs might be able to detect (e.g. some kind of quantum communication scheme involving non-locality). Second, a kind of Prime Directive may be in effect — a galactic policy of non-interference in which ‘primitive’ civilizations are left alone. But instead of waiting for us to develop faster-than-light travel, an extraterrestrial civilization might be waiting for us to achieve and survive a Technological Singularity.

Thirdly, and related to the last point, an alien civilization might also be waiting for us to reach the Singularity, at which time it will conduct a risk assessment to determine if our emerging SAI or post-Singularity civilization poses some kind of threat. If it doesn’t like what it sees, it could destroy us in an instant. Or it might just destroy us anyway, in an effort to enforce its galactic monopoly. This might actually be how berserker probes work; they sit idle in some location of the solar system, becoming active at the first sign of a pending Singularity.
6. Our Simulation Gets Shut Down

If we’re living in a giant computer simulation, it’s possible that we’re living in a so-called ancestor simulation — a simulation that’s being run by posthumans for some particular reason. It could be for entertainment, or for a science experiment. An ancestor simulation could also be run in tandem with many other simulations in order to create a large sample pool, or to allow for the introduction of different variables. Disturbingly, it’s possible that the simulations are only designed to reach a certain point in history — and that point could very well be the Singularity.
So if we reach that stage, everything could suddenly go dark. What’s more, the computational demands required to run a post-Singularity simulation of a civilization could be enormous. The clock rate, or even rendering time, of the simulation could result in the simulation running so slowly that the posthumans would no longer have any practical use for it. They’d probably just shut it down.
7. The AI Starts to Hack Into the Universe

Admittedly, this one's pretty speculative (not that the other ones haven't been!), but think of it as a kind of 'we don't know what we don't know' sort of thing. A sufficiently advanced SAI could start to see directly into the fabric of the cosmos and figure out how to hack into its 'code.' It could start to mess around with the universe to further its needs, perhaps by making subtle alterations to the laws of the universe itself, or by finding (or engineering) an 'escape hatch' in order to avoid the inevitable onslaught of entropy. Alternatively, an SAI could construct a basement universe, a small artificially created universe linked to the current universe by a wormhole. This could then be used for living space, computing, or as a way to escape the eventual heat death of the parent universe.
Or, an SAI could migrate and disappear into an exceedingly small living space (what the futurist John Smart refers to as STEM space — highly compressed areas of space, time, energy, and matter) and conduct its business there. In such a scenario, an advanced AI would remain completely oblivious to us puny meatbags; to an SAI, the idea of conversing with humans might be akin to us wanting to have a conversation with a plant.


Thanks to: http://crystalseed.ning.com



  
