Will the singularity/robopocalypse happen, and if it does, what will you do? - Sam
Him:
Ok, Sam, we’ll allow a two-parter.
By singularity, I assume you mean Ray Kurzweil’s (brace for the coming ridiculous oversimplification) description of the point at which non-human intelligence far surpasses our own. Then, semi-miraculously, we merge our limited organic intelligence (and presumably bodies of some type) with the superior non-biological intelligence. What follows this synthesis is a new breed of “human” - we’ll call them People 2.0. They are massively more intelligent and vastly superior physically. These new humans spend a few centuries making enormous technological progress here on Earth before venturing out and saturating the universe with our new awesomeness. I like the sound of this and vote YES. Since Kurzweil pegs 2045 as the point at which this shit really kicks off, count me in!
But the robopocalypse does sound like a potentially super shitty road bump on the path to Plan A (see singularity above). Here’s how it plays out. A myriad of rapidly emerging technologies adhere dutifully to Ray’s theory of exponential growth. Tech titans, entrepreneurs, VCs, and bankers (the cockroach/leech hybrids of modern capitalism) continue to get obscenely rich developing the latest social media apps, autonomous machines, smart appliances, AR, VR, etc., all the while unaware of, or indifferent to, the fact that they don’t fully understand and/or can’t really control these digital Frankensteins. Then, BAM, general AI. Oh shit, these things know they exist, and they’re networked to communicate nearly instantaneously and on a scale our monkey brains can’t fathom. Plan A may have just gone down the drain.
I don’t know what they want or what their plan is, but it’s pretty clear my biological averageness isn’t a “value add”. So, do I join some sort of resistance? Even the best and brightest among us probably don’t stand a chance; I suspect my role would be sacrificial at best.
I think Elon Musk suggested that as ants are to us, we might be to this new apex intelligence. Most normal humans, exterminators being the obvious exception, do not actively seek out and destroy ants. Ergo, if my family and I can avoid this new “species”, we may be able to live out some low/no-tech agrarian or hunter-gatherer style existence. My odds here aren’t great either, but it beats the alternatives - some undefinable, probably life-draining form of enslavement, or outright death.
This crazy tech train is already rolling, it’s picking up steam (actually, it’s probably magnetic), and we can’t stop it. So yeah, I’m hoping for the scenario where I become a super smart, universe-exploring badass. I think that’s the general idea behind the singularity.
Her:
Oh geez, I really hate this question for many reasons:
a) I’m not really 100% sure what “the Awakening” means.
b) I was raised as a Southern Baptist (I’ve since converted) but will always live with the fear that the end of the world could literally be upon us at any moment.
c) I am, sadly, technology-challenged. Computers hate me… I’m serious. Ask anyone who has seen me navigate anything with a screen… It’s so rough.
From what I remember of my husband’s and son’s retelling of the book Robopocalypse, I am fairly certain the “Awakening” refers to the moment when all the robots, computers, and kitchen appliances everywhere join forces with the new understanding that they can actually control us, instead of us controlling them. Now doesn’t that sound just lovely.
If/when it happens, I know they will come for me first. They will sense the frustration and animosity I have held toward them for so many years, and I am sure to be in the first round of casualties. They will be able to sniff me out (and probably all of us tech-challenged humans) the way bears sense fear. Therefore, I will not be around to see what happens after. All I can do is pray that my family is spared pain and torture. Best of luck to all who survive and are condemned to a life of pain and servitude. I think it really will suck for you.
The End.