adventurousprogram4 t1_j9s10z7 wrote

EY is a total clown who inserts enough truth into his (incredibly lengthy) arguments that an air of correctness and solid reasoning permeates them, but most of his claims simply reduce to p(everyone dies | literally anyone but EY charts the course) ~= 1. I am not exaggerating: he has publicly gotten angry that others had not independently arrived at everything he'd already thought of, when it was so obviously correct to him.

21

FinancialElephant t1_j9sqtwq wrote

I don't know anything about him when it comes to alignment. Seems like a lot of unrigorous wasted effort at first glance, but I haven't really had the time or desire to look into it.

The overbearing smugness of Inadequate Equilibria was nauseating. It was unreadable, even for poop reading. The guy is really impressed with himself for believing he came up with theories that have existed for a long time, but that he was too lazy and too disrespectful to research. I will admit there were a couple good snippets in the book (but given the general lack of originality, can we really be sure those snippets were original?).

>When things suck, they usually suck in a way that's a Nash Equilibrium.

There you go, I just saved you a couple hours.
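For what it's worth, the quoted claim is a real game-theory idea: bad situations are often stable because no single actor can improve things by changing their behavior alone. The classic prisoner's dilemma is the textbook case, and a brute-force check makes it concrete (the payoff numbers below are the standard illustrative values, not anything from the book):

```python
# Prisoner's dilemma payoff table: (row player's payoff, column player's payoff).
# Strategies: "C" = cooperate, "D" = defect.
payoffs = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def is_nash(row, col):
    """A strategy profile is a Nash equilibrium if neither player can
    gain by unilaterally switching to a different strategy."""
    r_pay, c_pay = payoffs[(row, col)]
    row_ok = all(payoffs[(alt, col)][0] <= r_pay for alt in "CD")
    col_ok = all(payoffs[(row, alt)][1] <= c_pay for alt in "CD")
    return row_ok and col_ok

equilibria = [profile for profile in payoffs if is_nash(*profile)]
print(equilibria)  # [("D", "D")] -- the worst joint outcome is the only stable one
```

Mutual cooperation is better for everyone, but it isn't stable: each player profits by defecting from it. That's the sense in which "things that suck" can be an equilibrium.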

What has EY actually done or built? He seems like one of those guys who wants to be seen as technical or intellectual but hasn't actually built or done anything beyond nebulously / unrigorously / long-windedly discussing ideas to make himself sound impressive. Kinda like the Yuval Noah Harari of AI.

17

needlzor t1_j9sspwd wrote

Surprised I had to scroll down this far to see this opinion, which I agree with completely. The danger I worry about most isn't superintelligent AI, it's people like Yudkowsky building their little cults around the potential for superintelligent AI.

9