
/ratanon/ - Rationalists Anonymous

Remember when /ratanon/ was good?
File: bef156dc53e8590⋯.png (23.36 KB, 240x240, 1:1, aed09849dc12981cd348297dd9….png)

 No.13165

Polyamory = good, because it's just your obsolete primate programming that makes you upset about Tyrone fucking your gf

EA = good, because your enlightened primate programming makes you care about maximising starving n*****s in Africa.

Am I missing something here, or is there a contradiction?

 No.13166

Some of your primate heuristics are useful, some of them aren't. Obeying them or getting rid of them is always instrumental, never a terminal goal.

 No.13167

>>13166

Some but not all? What are some terminal goals that are not primate heuristics?

 No.13168

>>13167

Certain primate heuristics are the direct cause of most things we do, but they're not usually considered goals, and they tend to conflict with other primate heuristics in some way.

The terminal goal of EA is roughly to increase global utility as much as possible, i.e. utilitarianism.

A more complicated but not much more useful explanation is that EA communities explicitly value utilitarianism, and doing things that are more utilitarian is a way to increase social status in those communities.

So if you want to get all technical and reductionist, it's caused by the drive for social status, which the group is collectively harnessing for the purpose of increasing global utility. But that way of framing things has limited use. In most contexts it's useful to skip all the parts with the words "status" and "signaling".

But that's getting away from the topic of the thread.

The reason polyamory is considered good by a lot of rationalists isn't that it suppresses obsolete primate programming; it's that it (presumably) makes the people involved happier. Suppressing primate heuristics is instrumental, not the goal.

The reason EA is considered good by a lot of rationalists is cynically that it lets you appear rational and ethical, or less cynically that it increases global happiness.

 No.13169

>>13168

So the end goal of EA is a universe tiled in hedonium? Why do rationalists still seem to be averse to wireheading themselves?

 No.13170

>>13169

Some utilitarians would be on board with that. Some utilitarians think that doesn't maximize utility (e.g. preference utilitarians) or even happiness (e.g. certain objections about unique brain states).

There exist EA people who would be on board with that. I don't know how many. I would think people who think tiling the universe with hedonium is good wouldn't object to wireheading either.

There also exist EA people who aren't utilitarians, even though the overall sensibilities of EA are pretty utilitarian. I don't know how many.

Most of EA has much more mundane goals. Preventing malaria or improving the nutritional intake of poor rural laborers through low-cost interventions is good whether you subscribe to preference utilitarianism, negative utilitarianism, or hedonic utilitarianism.

 No.13171

>>13165

Tyrone wouldn't stoop down to fucking the sort of girl who goes poly.

 No.13172

>>13166

>Obeying them or getting rid of them is always instrumental, never a terminal goal.

Why not? Did the Terminal Goal Fairy come down and bless your dear monkey brain with some sort of exogenous goals?

>>13168

>The reason polyamory is considered good by a lot of rationalists isn't that it suppresses obsolete primate programming, it's that it (presumably) makes the people involved happier.

So it's not good because it suppresses the (((obsolete))) primate desire to pair bond, sure, but it is good because it satisfies the primate desire to experience happiness. Really makes me think. And what I'm thinking is that you haven't thought this through all the way. You never answered the other anon's question: "What are some terminal goals that are not primate heuristics?"

>>13171

Tyrone will copulate with whomever he damn well pleases.

 No.13173

>>13172

>Why not? Did the Terminal Goal Fairy come down and bless your dear monkey brain with some sort of exogenous goals?

In the explicit justifications rationalists give, which are what the OP seems to be criticizing, it's an instrumental goal. In the polyamory case it's not "this is primate programming so you should get rid of it" but "this is primate programming, so it's okay to get rid of it to achieve X".

>You never answered the other anon's question, "What are some terminal goals that are not primate heuristics?"

That's what the first paragraph was about. Explicitly stated terminal goals are not always primate heuristics. Direct causes are primate heuristics, but talking in terms of those is usually not useful.

 No.13228

EA, or any other form of altruism in general, is a consequence of idiocy.
