
/ratanon/ - Rationalists Anonymous

Remember when /ratanon/ was good?

File: 48692f839e86b72⋯.jpg (58.09 KB, 540x960, 9:16, 4ad.jpg)

 No.14181

CMV: Suicide is a rational choice for many people

Life has no meaning.

Given that, the only thing that really matters is the quality of the journey: do you enjoy the dance/game of life?

If life is shitty for you most of the time (high amounts of suffering), even if nothing terrible really happens, you're probably better off dead. Why would you want to experience a shitty time?

Possible counterarguments, with answers:

Not everybody feels more suffering than joy in their life – True, but many do; I would assume > 10%. Many of these people choose to live only out of fear of death, or to serve ideologies, religions, or other spooks. They are being used by the system just like animals bred for their meat. That doesn't mean their lives are worth experiencing.

Even if it's rational to commit suicide, it's not easy or sometimes even possible; survival instinct is a bitch – I agree. My argument only says that it's the "rational" thing to do, the one that makes sense, not that it's easy or always possible.

Some people live for other people – Agreed again: if you have dependents (like small children), maybe it's better not to kill yourself and leave them alone. Your friends/parents/wife are also a reasonable reason, but a weaker one IMO – if they really love you, they should understand that death might be better for you.

____________________________
Disclaimer: this post and the subject matter and contents thereof - text, media, or otherwise - do not necessarily reflect the views of the 8kun administration.

 No.14184

You need Jesus


 No.14187

Utilitarianism is retarded, ergo, all arguments stemming from utilitarian calculus are invalid.

Having said that, I concede that value of life may have some relevance to the issue of suicide. So, here goes:

1. A person in a given moment has rights equal to those of other instances of this person in time, i.e. himself at other moments.

2. Sometimes even very unhappy people strongly prefer to exist, not in the sense that they fear death but rather that they consider life worthwhile.

3. Killing yourself when unhappy automatically kills those happy instances by precluding their existence.

4. It is immoral to kill people who have a preference to exist. It is, however, not immoral to deny people death when they don't have such preference.

> Why would you like to experience a shitty time?

This isn't a kino, degenerate.


 No.14188

It may not even be possible to commit suicide, according to some models of the world: https://en.wikipedia.org/wiki/Quantum_suicide_and_immortality


 No.14192


 No.14200

>>14188

This. In the event that I want to kill myself, I intend to rig myself up with explosives that trigger based on some custom signal. Then I text some cute girl "be my gf" or similar, and I write a piece of software that prereads her response, performs sentiment analysis, and triggers the explosives to instantly terminate my conscious experience if it isn't a positive response. If I survive, I've figured out how to basically become the god of my own little universe and I get to have sex with infinity cute girls and maybe no longer feel suicidal as a result. If it doesn't, well I wanted to be dead anyway in this scenario.


 No.14205

>>14187

It's also just as immoral to force instances of yourself to suffer as it is to prevent them from existing and having joy, and if you believe Benatar's asymmetry it's even worse.

Besides, it's not a utilitarian view – morality is social-subjective BS anyway. It's from a completely selfish hedonistic view.


 No.14206

>>14200

Before you go through with this, consider that you going through with it would make non-negligible the probability that you're a character, maybe even the protagonist, in Scott's short story. Which means your attempt would go wrong in some horrible but clever way. Trying to enumerate the horrible but clever ways it could go wrong before you go through with it would, of course, substantially raise the probability of you being such a character.


 No.14208

>>14205

> It's also just as immoral to force instances of yourself to suffer as it is to prevent them from existing and having joy

I disagree; what are you gonna do? Hedonism boils down to utilitarianism; it's just a feeble-minded autistic American model where utility is conveniently put on a scale from negative to positive infinity. But qualitative differences do not form a linear scale.


 No.14209

>>14206

Are you referring to a specific story of his?


 No.14226

>>14209

No, just extrapolating from the common tropes and themes.


 No.14244

>>14206

So we need to show Scott this thread and inspire him to write this story?


 No.14249

>>14244

Yes. Please do it and report the results.


 No.14297

>>14206

It's impossible to be a character in a story. Characters in stories have no agency or internal experience; they're just words on a page.

I know that's fairly obvious, but we are a bunch inclined towards missing the forest for the trees.

What you'd more realistically be concerned about is that you're in some kind of simulation. Which, yes, could be a simulation that's a post-singularity medium for narrative fiction, in which you're a protagonist in one of our caliph's short stories. But if it's actually Scott, if he's somehow ended up writing in a medium in which the main characters are sentient, he's not going to write those characters a Bad End.

But the actual realistic outcome is much more quotidian; she says yes, but either she didn't realise who the text was from and thought it was from Chad, or she's just lying to "prank" you. You have to rule out things going wrong in dumb ways before you're entitled to have them go wrong in clever ways.


 No.14298

>>14297

You can acausally determine the behavior of characters in stories based on your own behavior. If you have behavior X, and that is because rationalisty people tend to have behavior X, and Scott is writing a story about a rationalisty person, then your decision to have behavior X makes the person in the story more likely to have behavior X.

The expected number of actual people trying the quantum billionaire trick is much smaller than the expected number of fictional people written by Scott trying it, so if you value the results of your decision system in fictional situations even a little bit (for example because fiction affects opinions) it may be rational to behave as if you're in a Scott story.


 No.14337

>>14298

>The expected number of actual people trying the quantum billionaire trick is much smaller than the expected number of fictional people written by Scott trying it

Scott's only one man; he's not likely to produce the same kind of rationalist-y fiction you could hypothesise about if you knew his writing decades from now, and he hasn't even written on the quantum billionaire trick, so the expectation that gives me is <1. What kind of assumptions are you making to get expectations that make a Pascal's-Mugging-adjacent "even a little bit" argument viable?


 No.14339

>>14181

You know nobody's going to change your view, yet you have no intention of suicide. Funny that.


 No.14340

>>14337

I think it's very unlikely that even one person is going to try the quantum billionaire trick, at least until ems are invented.

If you find yourself seriously considering it then you're arguably pretty likely to be a fictional character. If you care in the abstract about how fictional rationalisty people are portrayed, then you should take that into consideration, because there's a decent chance you'll influence that.

This consideration still isn't as important as the possibility of dying and/or becoming a billionaire, but it might be enough to change some specifics about the way you execute it, or be a tiebreaker.

Its likelihood times its importance isn't enough to make it an imperative, but it's likely enough to take into consideration.


 No.14342

>>14340

I was completely serious about >>14200, other than implementation specifics, and I'm not a fictional character as far as I can tell. I'm not suicidal, but in general I have a hard time believing the set of people who (believe they) have nothing to lose, from now until the invention of ems, contains 0 people who consider quantum suicide trickery worth messing with.

Also, your whole argument still relies on fictional characters having a conscious experience. My intuition is that it would be wasteful to simulate a universe with this level of complexity simply for storytelling, so it would be relatively easy to identify your world as created for narrative purposes from the inside. Our universe would need to meet that criterion before I can bother worrying about whether I'm in a Scott masterwork or some plebeian mass entertainment.


 No.14343

>>14342

>Also your whole argument still relies on fictional characters having a conscious experience.

It doesn't. By being the kind of person who would consider the possibility of being in a story, a fictional character based on people like you will be more likely to consider the possibility of being in a story.

That's the beauty of acausal decision theory. You can acausally influence the behavior of your simulation, often even if the simulation is insufficiently detailed to be conscious.



