/ratanon/ - Rationalists Anonymous

File: 33fafc4f391bc79⋯.png (88.44 KB, 740x584, 185:146, unknown.png)

 No.14964

What are your singularity timelines, /ratanon/? Do you think we'll all get optimized by the end of this decade?

 No.14974

What is that screencap from?

 No.14975

https://www.unz.com/akarlin/short-history-of-3rd-millennium/

>(1) (a) Direct Technosingularity - 25%, if Kurzweil/MIRI/DeepMind are correct, with a probability peak around 2045, and most likely to be implemented via neural networks (Lin & Tegmark, 2016).

>(2) The Age of Em - <1%, since we cannot obtain functional models even of 40-year-old microchips from scanning them, to say nothing of biological organisms (Jonas & Kording, 2016).

>(3) (a) Biosingularity to Technosingularity - 50%, since the genomics revolution is just getting started and governments are unlikely to want to, let alone succeed at, rigorously suppressing it. And if AGI is harder than the optimists say, and will take considerably longer than mid-century to develop, then it's a safe bet that IQ-augmented humans will come to play a critical role in eventually developing it. I would put the probability peak for a technosingularity from a biosingularity at around 2100.

>(3) (b) Direct Biosingularity - 5%, if we decide that proceeding with AGI is too risky, or that consciousness both has cardinal inherent value and is only possible with a biological substrate.

>(4) Eschaton - 10%, of which: (a) Philosophical existential risks - 5%; (b) Malevolent AGI - 1%; (c) Other existential risks, primarily technological ones: 4%.

>(5) The Age of Malthusian Industrialism - 10%, with about even odds on whether we manage to launch the technosingularity the second time round.
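
Quick sanity check on those figures, nothing fancy: a minimal Python sketch that tallies the quoted weights and samples a few futures in proportion to them. The dict keys and the 1% stand-in for the "<1%" Age of Em entry are my own shorthand, not anything from the article beyond the numbers quoted above.

import random

# Scenario weights as quoted above; "<1%" for the Age of Em is rounded
# up to 0.01 so the dict has a usable number (my assumption).
scenarios = {
    "Direct Technosingularity": 0.25,
    "The Age of Em": 0.01,
    "Biosingularity to Technosingularity": 0.50,
    "Direct Biosingularity": 0.05,
    "Eschaton": 0.10,
    "The Age of Malthusian Industrialism": 0.10,
}

# The quoted figures sum to roughly 1 (about 1.01 with the rounding above).
print(f"total probability mass: {sum(scenarios.values()):.2f}")

# Draw five hypothetical futures in proportion to the quoted weights.
print(random.choices(list(scenarios), weights=list(scenarios.values()), k=5))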
