Staredit Network > Forums > Lite Discussion > Topic: Roko's Basilisk
Roko's Basilisk
Apr 6 2015, 10:55 pm Oh_Man Post #1

Now on ICCUP, channel donuts

http://www.slate.com/articles/technology/bitwise/2014/07/roko_s_basilisk_the_most_terrifying_thought_experiment_of_all_time.html
http://www.patheos.com/blogs/hallq/2014/12/rokos-basilisk-lesswrong/

Anyone up for an in-depth discussion on this? I can't believe some people are so up in arms about this, even going so far as to attempt to censor discussion of it. To me it sounds like a variant of Pascal's Wager, so I'm finding it hard to take seriously, and seriously questioning the mental state of those who do.

Quote
One day, LessWrong user Roko postulated a thought experiment: What if, in the future, a somewhat malevolent AI were to come about and punish those who did not do its bidding? What if there were a way (and I will explain how) for this AI to punish people today who are not helping it come into existence later? In that case, weren’t the readers of LessWrong right then being given the choice of either helping that evil AI come into existence or being condemned to suffer?

The admin's moronic response:
Quote
Listen to me very closely, you idiot.
YOU DO NOT THINK IN SUFFICIENT DETAIL ABOUT SUPERINTELLIGENCES CONSIDERING WHETHER OR NOT TO BLACKMAIL YOU. THAT IS THE ONLY POSSIBLE THING WHICH GIVES THEM A MOTIVE TO FOLLOW THROUGH ON THE BLACKMAIL.
You have to be really clever to come up with a genuinely dangerous thought. I am disheartened that people can be clever enough to do that and not clever enough to do the obvious thing and KEEP THEIR IDIOT MOUTHS SHUT about it, because it is much more important to sound intelligent when talking to your friends.
This post was STUPID.


Post has been edited 1 time(s), last time on Apr 6 2015, 11:01 pm by Oh_Man.




Apr 6 2015, 11:46 pm Fire_Kame Post #2

Stupid babies need the most attention

Is...is a flame battle from another forum now considered Lite Discussion?

I feel so meta.




Apr 6 2015, 11:56 pm Zoan Post #3

Math + Physics + StarCraft = Zoan

The idea that something like that might come into existence in the future is pretty cool, since it's possible. But the bit about us being inside a simulation is just dumb, especially the part about resurrecting people just to torture them to death over and over.




Apr 7 2015, 4:15 am NudeRaider Post #4

The entire universe has been neatly divided into things to (a) mate with, (b) eat, (c) run away from, and (d) rocks -Terry Pratchett

Only having read your quotes: wtf

I see no grounds for fruitful discussion because apparently we all agree that this admin has too vivid an imagination.




Apr 7 2015, 1:28 pm Oh_Man Post #5


What do you guys think about the whole concept of dangerous ideas? I'm against censorship generally, but I've never seriously considered the idea of knowledge that is 'so dangerous' it should be censored.

I guess things like how to make an atom bomb? How to 3D print viruses? It would be censored on the grounds that you don't want that sort of info going into the wrong hands, I suppose.




Apr 7 2015, 5:04 pm Vrael Post #6



Uh duh this could never happen because God would never let such a thing happen problem solved.


Ok sorry, done busting your balls Oh_Man, but what is the basilisk exactly? It's an evil supergenius that gives us 2 boxes and we have to take just one or both, except both those options suck? When the evil supergenius realizes that it can just put eternal torment in the 2nd box (and it will realize this, since it's a supergenius), then we're fucked either way. So doesn't this problem just reduce to "Hey guys, I just thought of a situation in which we're all fucked because we're all fucked. No seriously, I wasn't joking, I didn't have any actual logic or anything to go on here, I just wanted to screw with you all."

There's the whole crap with it being able to blackmail us before it even exists, but since there's no actual apparent mechanism for this, how is this even a problem? Is there a real description of this thing somewhere?




Apr 7 2015, 5:22 pm NudeRaider Post #7


Quote from Oh_Man
What do you guys think about the whole concept of dangerous ideas? I'm against censorship generally, but I've never really considered seriously some sort of knowledge that is 'so dangerous' it should be censored.
And again we agree.

Quote from Oh_Man
I guess things like how to make an atom bomb? How to 3D print viruses? It would be censored on the grounds that you don't want that sort of info going into the wrong hands, I suppose.
Building the bomb is easy. Acquiring the necessary materials is what's hard. ;)
So no, I don't think it's necessary to censor it.




Apr 7 2015, 5:36 pm jjf28 Post #8

Oh bother...

Atom bombs take a fair amount of heavily restricted resources, but other forms of Knowledge Enabled Mass Destruction could be achieved by reasonably smart individuals or small groups with limited resources, through biological or mechanical self-replicating technology.

Replicating automatons, even if not intended for destruction, could potentially lead to resource exhaustion (a gray-goo apocalypse), drastic atmospheric changes, etc.

Why the future doesn't need us - Bill Joy



Rs_yes-im4real - Clan Aura - jjf28.net84.net

Reached the top of StarCraft theory crafting 2:12 AM CST, August 2nd, 2014.

Apr 7 2015, 6:51 pm Fire_Kame Post #9


Quote from Vrael
Uh duh this could never happen because God would never let such a thing happen problem solved.

You joke, but I doubt this comparison is coincidental. The basilisk is pretty much saying: I might or might not exist, and if I do and you don't support me, I'm going to destroy you. God says that you have to have faith that he exists and fulfill his work on Earth. If you don't, you go to hell. Whether that's just a cessation of existence or eternal torment is then a matter of interpretation.

Honestly I found it to be a thinly veiled attempt to trap theists in their own logic...




Apr 8 2015, 2:14 am Zoan Post #10


Quote from Fire_Kame
The basilisk is pretty much saying I might or might not exist, and if I do and you don't support me I'm going to destroy you.

I thought it was "This thing will eventually exist, and when it does, it will torture those who didn't help create it."

Edit:

I personally thought of it as if something close to the resulting computer from the short story The Last Question ended up being a self-centered prick who decided to torture everyone who didn't aid in its creation.

Post has been edited 3 time(s), last time on Apr 8 2015, 2:24 am by Zoan.




Apr 8 2015, 10:58 pm Oh_Man Post #11


Quote from Fire_Kame
Honestly I found it to be a thinly veiled attempt to trap theists in their own logic...
Quote from Oh_Man
For me personally it sounds like a variant of Pascal's Wager, so I'm finding it hard to take seriously

Yes. Yes. Very thinly veiled. So thinly veiled, in fact, one may think there was no veil at all! :rolleyes:

Quote from Vrael
Ok sorry done busting your balls Oh_Man, but what is the basilisk exactly?
https://en.wikipedia.org/wiki/LessWrong#Roko.27s_basilisk

It is the notion that one day there would exist an artificial intelligence built to create a utopia, and that it would conclude that any human not working towards that utopia should be punished. So if you aren't actively helping to bring the AI into existence, then the AI would punish you for it when it came into existence. The concept even goes one step further: people say it would attempt to punish you even if you were dead, resurrecting you somehow by reverse-engineering you from your Facebook comments (LOL) and your DNA.

So the people taking this thought experiment VERY seriously were of the opinion that all knowledge of the basilisk should be erased, because if people didn't know about the basilisk then they wouldn't be punished by the basilisk for not working toward its existence. An ignorance-is-bliss type of thing.

Anyway like I said it is basically Pascal's Wager, except with a supreme all powerful AI, instead of a supreme all powerful god. But apparently some idiots take it very seriously.




Apr 8 2015, 11:23 pm Zoan Post #12


I thought Pascal's Wager went something like:

If there is a God and you follow him, you gain something infinite.
If there is a God and you don't follow him, you lose something infinite.
If there is not a God and you follow him, you lose something finite.
If there is not a God and you don't follow him, you gain something finite.

Thus it's better to believe, since you risk something finite for something infinite.

While Roko's Basilisk is more about the punishment side, like:

If it is created and you helped create it, you gain something finite (your life).
If it is created and you didn't help create it, you are tortured for a very long time.
If it is not created and you tried to make it, you lose something finite.
If it is not created and you didn't try to make it, you gain something finite.

Thus it's better to help create it, since you run the risk of endless torture if you don't.

So they're similar, but Pascal's Wager focuses more on the benefit of believing than the penalty of not, while the Basilisk thing focuses solely on the punishment of not.

As a note, I'm sure there are some variants of Pascal's Wager that also incorporate the idea that hell is a punishment for those who don't believe, but I'm fairly certain that the original idea as presented by Pascal was solely about the gain.
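The two matrices above can be turned into a quick expected-value sketch. This is only an illustration: the +1/-1 payoffs and the probability are arbitrary stand-ins, and modelling "infinite" as a float infinity is just one crude way to do it.

```python
# Toy expected-value comparison of Pascal's Wager vs. the Basilisk.
# All numbers are arbitrary stand-ins; INF models an "infinite" payoff.
INF = float('inf')

def expected(if_exists, if_not, p_exists):
    """Expected payoff of a choice, given the chance the entity exists."""
    return p_exists * if_exists + (1 - p_exists) * if_not

p = 0.01  # even a tiny probability lets the infinities dominate

# Pascal's Wager: framed around the infinite gain of believing.
believe      = expected(+INF, -1, p)  # heaven vs. a finite cost of belief
dont_believe = expected(-INF, +1, p)  # infinite loss vs. a finite gain

# Roko's Basilisk: framed around the punishment for not helping.
help_build = expected(+1, -1, p)   # finite payoff either way
dont_help  = expected(-INF, +1, p) # endless torture vs. a finite gain

print(believe, dont_believe)  # inf -inf
print(help_build, dont_help)  # a small finite number, then -inf
```

Either way the infinite branch swamps the finite one, which is exactly why the two arguments feel like the same trick despite the different framing.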




Apr 10 2015, 1:01 pm Oh_Man Post #13


It's more like:

If there is a God and you follow him, you gain something infinite.
If there is a God and you don't follow him, you lose something infinite.
If there is not a God and you follow him, you lose something finite.
If there is not a God and you don't follow him, you gain something finite.

If there is a Basilisk and you follow him, you gain something infinite.
If there is a Basilisk and you don't follow him, you lose something infinite.
If there is not a Basilisk and you follow him, you lose something finite.
If there is not a Basilisk and you don't follow him, you gain something finite.


Remember gain and loss are just two sides of the same coin. Punishment and reward. The basilisk tortures you if you don't help, and, presumably, rewards those who live in its utopia.




Apr 10 2015, 1:33 pm Sacrieur Post #14

Still Napping

The entire thing reeks of tumblrism. Err. Thing.

There doesn't need to be a discussion on this. After I sifted through the writer's soup-like text, I discovered that this is what happens when people think they're smart but aren't actually smart. The end result is pseudo-philosophical/pseudoscience BS with a Master's in human group-think idiocy. It's not even limited to dumb people. There are plenty of smart professors who make similarly stupid claims about AI, pretending they can predict the future with perfect clarity.

TL;DR: you're not Asimov.

Post has been edited 1 time(s), last time on Apr 10 2015, 1:38 pm by Sacrieur.




Apr 10 2015, 5:17 pm Azrael Post #15



I understand why it's being censored. I guess if you're firmly dedicated to a personal belief that the singularity won't happen in your lifetime, then you can rest easy (at least until it does happen, assuming you're wrong).

The point is that, by not being presented with the option to help create a malicious super AI, you cannot be judged by that AI for your inaction if it should come into existence. The only people who can be judged are the ones who actually had a choice: the ones who have been exposed to Roko's Basilisk.

The act of reading the thought experiment objectively increases the probability of something bad happening to you in the future.

Since you're looking for a way to compare it to religion: it's the same reaction a Christian community would have if someone managed to post a paragraph of text that caused anyone who read it to break the Ten Commandments.

On that note, considering you apparently understood the connotation already, it's kind of strange that you'd post this without at least including a warning above your links; something along the lines of "If you believe the singularity will happen in your lifetime, reading these links will potentially harm you".

Although based on the replies, I'm guessing most posters here would need to google what the singularity is before they could even start trying to comment on it.




Apr 11 2015, 12:25 am Zoan Post #16


I don't know, I got more of the "Oh no, we must help create the Basilisk for fear of punishment" vibe from reading the article rather than the "Devote your life to the Basilisk and reap your heavenly reward of 'eternal' life after assimilation" one. Especially since if the second was the point of the Basilisk, there would be no censorship of the idea.

While both do provide a motivation to act, they are not the same thing, and depending on which of the two ideas provides the motivation, two very different types of followers are made.

So again, it's similar to Pascal's Wager, but this difference is important.




Apr 11 2015, 8:17 am Oh_Man Post #17


I didn't post a warning because I think anyone scared by this is retarded. It is the same as Pascal's wager, the exact same, and is refuted in the same way one refutes Pascal's wager. If anyone takes Pascal's wager or the basilisk seriously, then I suggest you read everything wrong with the argument here:

http://rationalwiki.org/wiki/Pascal%27s_wager




Apr 11 2015, 11:02 am NudeRaider Post #18


O.o I can't believe I actually spent time with this...

Though I did find this: http://rationalwiki.org/wiki/Pascal%27s_wager#Reversing_the_wager
not believing yields:
Cruel God: infinite win or finite win
Benevolent: infinite loss or finite win
=> net gain: 2x finite win (the infinites cancel each other out)

believing yields:
Cruel God: infinite loss or finite loss
Benevolent: infinite win or finite loss
=> net gain: 2x finite loss
This even holds true if you argue the chance for a benevolent God is higher.

Pascal's Wager, case closed?
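One hand-wavy way to make "the infinites cancel each other out" precise is to keep the infinite and finite components in separate columns instead of actually adding infinities (float arithmetic would turn inf + -inf into NaN). A minimal sketch, with the outcomes copied from the table above and all values as arbitrary stand-ins:

```python
# Tally the reversed wager by summing infinite and finite components
# separately, mirroring the cruel-god/benevolent-god table above.
# Each outcome is a pair: (infinite_component, finite_component).

def net(outcomes):
    """Sum the infinite and finite columns independently."""
    return (sum(i for i, _ in outcomes), sum(f for _, f in outcomes))

# Not believing: cruel god exists (+inf), cruel god absent (+1 finite),
#                benevolent god exists (-inf), benevolent absent (+1 finite).
not_believing = net([(+1, 0), (0, +1), (-1, 0), (0, +1)])

# Believing: cruel god exists (-inf), cruel absent (-1 finite),
#            benevolent god exists (+inf), benevolent absent (-1 finite).
believing = net([(-1, 0), (0, -1), (+1, 0), (0, -1)])

print(not_believing)  # (0, 2): infinities cancel, net two finite wins
print(believing)      # (0, -2): infinities cancel, net two finite losses
```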

Now you could say that with the punishing AI this doesn't work anymore, because it defines the "alignment" of the AI to punish disbelievers, or non-contributors in this case.
Unless... if you can imagine the punishing AI that way, who says it's gonna turn out that way? What if my AI actually gets built, the one that doesn't state how it will react towards its creators and those not helping? You know, whoever's in charge of its behavior patterns may be a cruel bastard seeking to sacrifice himself for the greater evil. And again, yes, there's a better chance your AI gets built, but as shown above with Pascal's wager, likelihood is irrelevant as long as the chance exists.


Quote from Azrael
The point is that, by not being presented with the option to help create a malicious super AI, you cannot be judged by that AI for your inaction if it should come into existence. The only people who can be judged are the ones who actually had a choice: the ones who have been exposed to Roko's Basilisk.
Who says it's gonna work that way?
If I were (helping) to create this AI, I would include those agnostic of the idea as well, because it'd help spread the idea immensely and thus further the goal, since there'd be more potential helpers.




Apr 11 2015, 2:45 pm Oh_Man Post #19


Quote from NudeRaider
O.o I can't believe I actually spent time with this...
Tehehehe. My job is done.

Quote from NudeRaider
Unless... if you can imagine the punishing AI that way, who says it's gonna turn out that way? What if my AI actually gets built, the one that doesn't state how it will react towards its creators and those not helping? You know, whoever's in charge of its behavior patterns may be a cruel bastard seeking to sacrifice himself for the greater evil. And again, yes, it's a better chance your AI gets built, but as shown above with Pascal's wager likelihood is irrelevant as long as the chance exists.
That for me has always been the number one nail in the coffin for the wager (and the basilisk too). The wager is a false dichotomy. It never factors in any other god (or any other AI).

The number two nail in the coffin would be the assumption that non-belief is what gets punished. It's absurd to assume that is the reason for punishment. If I were to assume the existence of a supreme being, I think it beggars belief that such a being would want you to believe in something on a faith basis rather than come to a logical conclusion.




Apr 11 2015, 4:35 pm Azrael Post #20



Quote from Oh_Man
I didn't post a warning because I think anyone scared by this is retarded.
So you're pinning it on your consistent desire to actively disrespect the beliefs of anyone who isn't you. Got it.

Quote from Oh_Man
It is the same as Pascal's wager, the exact same
Except it isn't, as was already explained earlier in the thread. Pascal's Wager doesn't say that you'll still get into heaven if you don't believe in God, as long as you don't hear about Pascal's Wager.

I mean, do you know what a "basilisk" is? http://dictionary.reference.com/browse/basilisk

If the name didn't already give it away, the entire point is that being exposed to Roko's Basilisk is what will kill you in the future. An actually-relevant religious example was provided for you earlier.

Quote from NudeRaider
Who says it's gonna work that way?
Everyone. That is what this conversation is about. If you don't believe it would work that way, you're no longer talking about Roko's Basilisk, since it's specifically about an AI that does work that way.



