/ratanon/ - Rationalists Anonymous

Remember when /ratanon/ was good?

File: 5e3c7d6ef3bd682⋯.jpg (71.28 KB, 960x646, 480:323, 19260544_1379592862131494_.jpg)

 No.9822

Is heroin the most rational and optimal choice for wireheading?

 No.9823

What is MTD?

 No.9824

>>9823

moralistic therapeutic deism

Read you some Bethke, anon.

 No.9825

DAE remember that time an EA literally proposed starting a farm of wireheaded rats as a way to improve the world?

http://effective-altruism.com/ea/nu/charities_i_would_like_to_see/

also, hedonium (>>2555) is way more efficient than wireheading animal minds

 No.9826

>>9825

fixed:

>>2555

 No.9827

>>9825

I'm not sure what's best about this: how well it shows up utilitarianism, effective altruism, or animal rights.

 No.9828

>>9827

>what's best about this

It's how well it shows up utilitarianism, because the true moral position is actually negative utilitarianism.

 No.9829

>>9828

Bob: "Ouch, my stomach hurts."

Classical total utilitarian: "Don't worry! Wait while I create more happy people to make up for it."

Average utilitarian: "Never fear! Let me create more people with only mild stomach aches to improve the average."

Egalitarian: "I'm sorry to hear that. Here, let me give everyone else awful stomach aches too."

Negative utilitarian: "Here, take this medicine to make your stomach feel better."

The medicine is a lethal dose of sedatives.

 No.9830

>>9829

Someone not existing is a net increase in happiness, so that's not actually negative utilitarianism.

 No.9831

>>9830

Wat? Negative utilitarianism is about minimizing suffering, not minimizing happiness.

 No.9832

>>9831

I misunderstood.

 No.9833

The core of the problem is thinking "assume every person has a utility function, then create a master utility function, such that…". Such theories can't even handle spite. Locally, things work much better, and the flaws are subtler.

 No.9834

>>9829

I think the best utilitarianism is to take the utility functions of all existing people and ignore people who don't exist yet. If new people come into existence, then add them to the utility function, maybe.

It's not perfectly consistent, but it's better than the repugnant conclusion.

 No.9835

>>9834

>better than the repugnant conclusion.

If you can't do utilitarianism, just admit it and find some other moral system. Future people are real.

 No.9836

>>9835

They're less real than animals, whose utility we also don't give a fuck about.

 No.9837

>>9835

>Future people are real.

They literally aren't. They don't exist now, and they won't exist at all if we don't create them. At least not in our little slice of the multiverse.

Please, tell me why I should prefer the universe to be filled with a quadrillion unhappy minimal agents, because that technically maximizes utility. That's clearly not what I want. It's not what anyone wants. So let's not reward that in our utility function.

 No.9838

It's sort of a variation of the robot problem. A democratic nation rules that robots are people and get a vote. And anyone can make robots.

So someone just builds a hundred million robots that all want him to be the dictator, and wins the election.

(Something like this sort of happens today when a political party favors immigration because immigrants tend to vote for them more. Although at least immigrants already exist.)

A solution is just to fix the number of votes in the society and distribute them among the existing population. If you make five robots, you and the robots each get one sixth of a vote.

You could do something similar with utility functions. You can make new agents if you want, but they split your share of the total utility.
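
A rough sketch of how that share-splitting rule could look (Python; the class name and numbers are made up for illustration, assuming equal founding shares and a conserved total):

# Each founding agent starts with a fixed share of the total weight
# (votes, or utility weight). Making new agents never mints new weight;
# it only splits the maker's current share with the children.

class Agent:
    def __init__(self, weight):
        self.weight = weight

    def spawn(self, n):
        # Maker and children end up with equal slices of the maker's old
        # share, so the society's total weight is conserved.
        share = self.weight / (n + 1)
        self.weight = share
        return [Agent(share) for _ in range(n)]

founders = [Agent(1.0) for _ in range(6)]        # six citizens, one vote each
robots = founders[0].spawn(5)                    # one citizen builds five robots
print(founders[0].weight)                        # 0.1666... -> one sixth of a vote
print(sum(a.weight for a in founders + robots))  # ~6.0 -> no new voting power was created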

 No.9839

>>9833

How can it not handle spite? People have utility functions, to the extent that they prefer some universe states over others. You can fit a utility function to those preferences.
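
For instance, a toy way to write spite into a utility over universe states (all numbers invented):

# A "universe state" here is just (my_payoff, your_payoff). A spiteful agent's
# utility can put negative weight on your payoff; nothing breaks formally.
def spiteful_utility(state, spite=0.5):
    mine, yours = state
    return mine - spite * yours

states = [(3, 3), (2, 0), (0, 5)]
print(max(states, key=spiteful_utility))   # (2, 0): accepts less to deny you more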

 No.9840

>>9836

>>9837

Read about relativity of simultaneity until you can picture the ladder + barn doors thought experiment in your head from both the stationary barn and the stationary ladder reference frames, then think about a block universe.

 No.9841

>>9840

>believing in (((Einstein)))'s theories

Glad I'm not educated stupid like this, I bet you're a spherecuck too

 No.9842

>>9838

>So someone just builds a hundred million robots that all want him to be the dictator, and wins the election.

Why would they want him to be dictator? If your answer is "Because he made them", then they wouldn't get votes.

>Although at least immigrants already exist.

Not true; the biggest consequence of immigration without integration is that the immigrants become a subculture within the society. You end up with millions more "immigrants" than came into the country in the first place, even more than are "left" in the donor country.

>A solution

The solution is recognition of the fact that some systems are fundamentally flawed and will always have situations in which they produce perverse outcomes.

 No.9843

>>9840

Physicists' highly idealized mathematical models of time have little to do with how we should treat time, as actual agents in a causal universe.

>>9842

>Why would they want him to be dictator?

Yes, that's a bit extreme, but it's a silly thought experiment.

More realistically, the first robot just builds a hundred million copies of itself and they all vote for the robot emperor and exterminate the humans.

>The solution is recognition of the fact that some systems are fundamentally flawed and will always have situations in which they produce perverse outcomes.

Um, yeah, literally everyone agrees naive utilitarianism has serious flaws. So why not fix it instead of just accepting perverse outcomes?

Especially if we plan on building actual AIs that implement this stuff. But it matters even if this were just a fun philosophical debate about what the best moral system is.

 No.9844

>>9843

>and exterminate the humans.

Same question applies. Why would they want to exterminate the humans?

inb4 "it's silly"

>So why not fix it

Sure it can go on the list of ideologies to fix, after Stalinism and communism and Nazism and theocracy and anarcho-primitivism and so on.

>accepting perverse outcomes

Fundamentally flawed.

Fundamentally.

Flawed.

 No.9845

>>9844

>Same question applies. Why would they want to exterminate the humans?

Because they are robots that don't value human life.

>Sure it can go on the list of ideologies to fix, after Stalinism and communism and Nazism and theocracy and anarcho-primitivism and so on.

Please, enlighten me on how to fix any or all of the problems with communism and make it actually work. That would be completely amazing and great if you could do that.

But you can't do that. You can fix utilitarianism, though. At its core it's an optimal system that solves tons of issues every other system has. It just has some weird edge cases that can easily be fixed with methods like the one I proposed.

 No.9846

>>9845

>Because they are robots that don't value human life.

Why wouldn't they?

>But you can't do that.

That's the point.

>that can be easily fixed with methods like I proposed.

lolno

Using your >>9838 rule ("You can make new agents if you want, but they split your share of the total utility."), we can end up with a tiny minority of the world having the vast majority of the utility. For example, if half the starting agents have children at replacement rate and the other half have children above replacement rate, you can end up with an infinitesimal percentage of the total population holding 50% of the utility. Your fix is no fix.
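
Rough arithmetic check of that claim (a throwaway simulation; the 3x growth rate is an arbitrary stand-in for "above replacement"):

# Two founding groups of equal size, each holding half of the total weight.
# Group A reproduces at replacement (size constant), group B at 3x per
# generation. Weight is only ever split among descendants, never created.
a_size, b_size = 1000, 1000
for _ in range(20):                    # 20 generations
    b_size *= 3                        # above replacement; a_size stays put
print(a_size / (a_size + b_size))      # ~2.9e-10 -> a vanishing sliver of the population
# ...yet group A's founding shares never left the group, so it still holds
# 50% of all weight, spread over just 1000 people.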

But attempting fixes is irrelevant: "utility" is impossible to properly define, and even if you could define it, you run into the problem of knowledge, since no one can know which actions have the most utility. You are in the same situation communism is in: you need to be omniscient and to be able to communicate that omniscience to everyone instantly. And we've already heard your view of communism.

Fundamentally flawed. Even more so than communism, which at least has a logically possible goal of redistribution, no matter how impractical and wrongheaded.

 No.9847

>>9846

>For example, if half the starting agents have children at replacement rate and the other half have above replacement rate, you can end up with an infinitesimal percentage of the total population having 50% of utility.

I'm completely ok with that. If you choose to not have children, you shouldn't be forced to subsidize breeders.

But if you prefer you can distribute utility among all living people.

>"Utility" is impossible to properly define

It has a precise mathematical definition. And for practical purposes we can approximate it well enough.

>you run into the problem of knowledge, since no one can know what actions have the most utility.

Hence expected utility: you take the action with the highest expected utility. This is really basic stuff.
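
For reference, the "basic stuff" is just an argmax over probability-weighted outcomes; a toy version with invented numbers:

# Pick the action whose probability-weighted utility is highest.
# The outcome distributions below are invented purely for illustration.
actions = {
    "do_nothing":    [(1.0, 0.0)],                   # (probability, utility) pairs
    "risky_plan":    [(0.6, 10.0), (0.4, -20.0)],
    "cautious_plan": [(0.9, 3.0), (0.1, -1.0)],
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

best = max(actions, key=lambda name: expected_utility(actions[name]))
print(best)   # "cautious_plan": EU 2.6 beats 0.0 and -2.0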

 No.9848

>>9847

> for practical purposes we can approximate it well enough.

 No.11175

>>9825

He even made a music video about it:

https://www.youtube.com/watch?v=1CcgrdFF6ik
