LoquaciousAntipodean OP t1_j5cluk4 wrote
Reply to comment by sticky_symbols in The 'alignment problem' is fundamentally an issue of human nature, not AI engineering. by LoquaciousAntipodean
That's quite likely, as Shakespeare said, 'brevity is the soul of wit'. Too many philosophers forget that insight, and water the currency of human expression into meaninglessness with their tedious metaphysical over-analyses.
I try to avoid it, I try to keep my prose 'punchy' and 'compelling' as much as I can (hence the aggressive tone 😅 sorry about that), but it's hard when you're trying to drill down to the core of such ridiculously complex, nuanced concepts as 'what even is intelligence, anyway?'
Didn't name myself 'Loquacious' for nothing: I'm proactively prolix to the point of painful, punishing parody; stupidly sesquipedalian and stuffed with surplus sarcastic swill; vexatiously verbose in a vulgar, vitriolic, virtually villainous vision of vile vanity... 🤮
sticky_symbols t1_j5duh63 wrote
Ok, thanks for copping to it.
If you want more engagement, brevity is the soul of wit.
LoquaciousAntipodean OP t1_j5e1ec7 wrote
Yes, but engagement isn't necessarily my goal, and I think 111+ total comments isn't too bad going, personally. It's been quite a fun and informative discussion for me, I've enjoyed it hugely.
My broad ideological goal is to chop down ivory towers, and try to avoid building a new one for myself while I'm doing it. The 'karma points' on this OP are pretty rough, I know, but imo karma is just fluff anyway.
A view's a view, and if I've managed to make people think, even if the only thing some of them might think is that I'm an arsehole, at least I got them to think something 🤣
sticky_symbols t1_j5ftrlk wrote
You're right, it sounds like you're accomplishing what you want.