
/ratanon/ - Rationalists Anonymous

Remember when /ratanon/ was good?

File: e486ef99f352335⋯.png (129.13 KB, 458x458, 1:1, nc.png)

 No.6949

In particular, the parts of the “Less Wrong worldview” that are most notorious in the public eye are the parts I have no affinity with whatsoever. I don’t buy the Bayesian / Jaynesian stuff. I don’t think Friendly AI or MIRI are promising at all. I am skeptical of the idea that learning instrumental rationality can give you an “edge” in life (although Superforecasting has made me a bit less so). HPMOR is like fingernails on a chalkboard to me and I am still baffled as to why anyone likes it. Roko’s Basilisk is of course nonsense; it isn’t a major part of actual Less Wrong culture, but everyone’s heard of it, so it’s relevant here. The mixture of serious philosophy and (often anime-inspired) “wisdom literature” (“Tsuyoku Naritai,” “Something To Protect”) in the Sequences makes no sense to me. Eliezer Yudkowsky in general strikes me as a deeply misguided person who has some laudable ideals but fails to live up to them in his personal conduct as well as much of his writing, and whose aesthetics often repel me (for whatever that’s worth). Etc., etc.


 No.6950

What the fuck does "I don't buy the Bayesian stuff" mean? Do you think Bayes' theorem is wrong somehow?
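For reference, the theorem itself is just an identity of conditional probability; the disagreement is over how far it stretches as an epistemology. A minimal numeric sketch of the standard base-rate example, with made-up numbers (not anything claimed in the thread):

# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
# Toy example: a test with 99% sensitivity and a 5% false-positive rate
# for a condition with a 1% base rate.

p_h = 0.01              # prior P(H)
p_e_given_h = 0.99      # sensitivity P(E|H)
p_e_given_not_h = 0.05  # false-positive rate P(E|not H)

# Total probability of a positive test (law of total probability)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior: probability of the condition given a positive test
p_h_given_e = p_e_given_h * p_h / p_e

print(f"P(H|E) = {p_h_given_e:.3f}")  # ~0.167, despite the "99% accurate" test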


 No.6951

File: ba603082c44eb9c⋯.jpg (17.47 KB, 413x395, 413:395, m.jpg)

>>6949

>I don’t buy the Bayesian / Jaynesian stuff.

>isn't smart enough to understand Cox's theorem


 No.6952

>>6950

I'm guessing he thinks we're trying to build a political party so that nerds and nerdy things can be popular or whatever, and that those things are supposed to be advertising.

The anti-AI, anti-Yudkowsky bunch has a lot more groupthink going on than the weirdos with blogs they obsess about. Chances are "Less Wrong worldview" makes a lot more sense to them than it does to anyone who ever posted there.


 No.6953

I like him. He's too smart not to come around on the AI stuff if he keeps thinking about it.


 No.12458

>>6953

Has he yet? Also, how good is Almost Nowhere compared to The Northern Caves? I still haven't read Almost Nowhere.


 No.12459

>>6950

>he's not a Kalman-Filterist

>he's not a Solomonoffian


 No.12461

>>6953

I have bad news about getting people to understand things when their salary depends on not understanding them


 No.12462

>>6949

what exactly isn't being doubted here? is there anything?


 No.12464

>>12461

How does his salary depend on underestimating AI?



