Staredit Network > Forums > Lite Discussion > Topic: Roko's Basilisk
Roko's Basilisk
Apr 6 2015, 10:55 pm
By: Oh_Man  

Apr 11 2015, 11:40 pm NudeRaider Post #21

We can't explain the universe, just describe it; and we don't know whether our theories are true, we just know they're not wrong. >Harald Lesch

Quote from Azrael
Quote from NudeRaider
Who says it's gonna work that way?
Everyone. That is what this conversation is about. If you don't believe it would work that way, you're no longer talking about Roko's Basilisk, since it's specifically about an AI that does work that way.
Okay, my bad for asking the wrong question.
Why would you imagine the basilisk to work that way? The chances of it becoming real would be greater the way I proposed.

Or did I miss something that would somehow make my version detrimental to the basilisk?




Apr 12 2015, 2:49 am Azrael Post #22



Quote from NudeRaider
Why would you imagine the basilisk to work that way? The chances of it becoming real would be greater the way I proposed.

Or did I miss something that would somehow make my version detrimental to the basilisk?
The idea is that it won't hold something against you that you never knew about; the fact you didn't hear about it is not your fault, nor is it in your control. Once you do hear about the AI, however, you have two options: help bring the AI into existence, or choose not to help bring the AI into existence.

Anyone who believes this AI will exist will already be spreading this as much as possible. If you included "people who haven't heard of it" in the punishment, it would not incentivize those people to hear about it, since they couldn't possibly know about the potential punishment or the incentive.




Apr 12 2015, 6:24 am Oh_Man Post #23

Find Me On Discord (Brood War UMS Community & Staredit Network)

Quote from Azrael
So you're pinning it on your consistent desire to actively disrespect the beliefs of anyone who isn't you. Got it.
There are many beliefs that are not worthy of respect. I take that statement to be self-evident, but I can elaborate if you so desire.

Post has been edited 1 time(s), last time on Apr 12 2015, 6:33 am by Oh_Man.




Apr 12 2015, 11:14 am NudeRaider Post #24


Quote from Azrael
The idea is that it won't hold something against you that you never knew about; the fact you didn't hear about it is not your fault, nor is it in your control. Once you do hear about the AI, however, you have two options: help bring the AI into existence, or choose not to help bring the AI into existence.

Anyone who believes this AI will exist will already be spreading this as much as possible. If you included "people who haven't heard of it" in the punishment, it would not incentivize those people to hear about it, since they couldn't possibly know about the potential punishment or the incentive.
That still doesn't answer the why. What's the advantage of it working that way? You just covered why it would not be an advantage. And you forgot about those who know about it and try to censor it because - how did our lovely admin put it - it's "dangerous knowledge" "that is stupid to be put up for discussion". The fewer people censoring it, the more people can get informed about it. So from my perspective, my proposal would improve exposure to the idea. So, your turn: why, despite what I just said, would it be helpful for the basilisk to punish only those who know about it?




Apr 12 2015, 1:20 pm Azrael Post #25



Quote from NudeRaider
And you forgot about those who know about it and try to censor it because - how did our lovely admin put it - it's "dangerous knowledge" "that is stupid to be put up for discussion". The fewer people censoring it, the more people can get informed about it.
This is the flaw in your thinking. The people who are censoring it are already going to be punished, because they are actively acting against the AI's best interests. Again, they have no additional incentive to spread it to anyone, since they have no intention of helping the AI come into existence.

If you were to say "Well hey admin, what if everyone who hasn't read it also gets punished?", then he'd go "uh, then I'm going to try extra-hard to make sure this never happens" followed by "and why the hell would it make logical sense to punish people for something they had no choice in?" and "so basically the whole world would be wiped out then lolk".




Apr 12 2015, 2:17 pm NudeRaider Post #26


Quote from Azrael
This is the flaw in your thinking.
Yes and no. Yes, because I didn't realize the admin is actively trying to prevent it.
And no, because I was assuming more reasonable folks like you and me (apparently), who couldn't give a rat's ass about the thought experiment but wouldn't deny others the chance to make up their own minds. We would only spread it if we thought that would not harm anyone.

Actually, that's a bad example again, because personally I wouldn't restrict myself from talking about it either way, since I'm certain the AI will never be realized. But for others who aren't that certain, it might make the difference.




Apr 12 2015, 4:07 pm Ahli Post #27

I do stuff and thingies... Try widening and reducing the number of small nooks and crannies to correct the problem.

I wouldn't want to live in a Utopia created by the AI if the AI had mass-murdered people and I was alive and aware that this occurred because of the AI.

Thus, I take the chance that such an AI does not execute its punishment during the rest of my life.

I think it's unethical to knowingly help in the creation of such a device. Thus, everyone who willingly helps in its creation would be a disgrace to humankind.




Dec 23 2015, 3:00 pm ClansAreForGays Post #28



Quote from Ahli
I wouldn't want to live in a Utopia created by the AI if the AI had mass-murdered people and I was alive and aware that this occurred because of the AI.
Actually, you would. No matter how righteous you might briefly feel about it, you'd rather be on the utopia side than the punishment side.




Dec 23 2015, 11:29 pm Voyager7456 Post #29

Responsible for my own happiness? I can't even be responsible for my own breakfast

Quote from ClansAreForGays
Quote from Ahli
I wouldn't want to live in a Utopia created by the AI if the AI had mass-murdered people and I was alive and aware that this occurred because of the AI.
Actually, you would. No matter how righteous you might briefly feel about it, you'd rather be on the utopia side than the punishment side.

Thanks for bumping the thread, dick. Now I'm going to be tortured for eternity by an AI.



all i am is a contrary canary
but i'm crazy for you
i watched you cradling a tissue box
sneezing and sniffling, you were still a fox


Modding Resources: The Necromodicon [WIP] | Mod Night
My Projects: SCFC | ARAI | Excision [WIP] | SCFC2 [BETA] | Robots vs. Humans | Leviathan Wakes [BETA]

