I'm not really a safetyist, but this seems to me a bad argument. "It can't get worse." Why not? I can think of many things worse than nonprofits or governments that ASI could do.
It’s not a proof of guaranteed safety; it’s meant to shift priors. Many safetyists seem to believe there has never been a misaligned ASI, because we’re not all dead. My point is that our lives are already dominated by misaligned ASIs: many things that are both evil and vastly smarter than any individual human have been out to get us for a while. We continue to survive both because we produce our own competing good egregores and because bad stuff gets less effective as t->infinity.