AGI in a vulnerable world

I’ve been thinking about a class of AI-takeoff scenarios where a very large number of people can build dangerous, unsafe AGI before anyone can build safe AGI. This seems particularly likely if: It is considerably …

AI Timelines

2019 trends in GPU price per FLOPS

We estimate that in recent years, GPU prices have fallen at rates that would yield an order of magnitude over roughly:
- 17 years for single-precision FLOPS
- 10 years for half-precision FLOPS
- 5 years for half-precision …
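As a quick sanity check on what these figures mean per year: if prices fall by a factor of 10 over N years, the implied annual decline is 1 − 10^(−1/N). A minimal sketch (the `annual_decline` helper and labels are illustrative, not from the post):

```python
def annual_decline(years_per_order_of_magnitude: float) -> float:
    """Fraction by which price falls each year, given the number of
    years needed for a 10x (one order of magnitude) price drop."""
    return 1 - 10 ** (-1 / years_per_order_of_magnitude)

# The rates quoted above, expressed as annual price declines.
for label, years in [("single-precision", 17), ("half-precision", 10)]:
    print(f"{label}: ~{annual_decline(years):.1%} per year")
```

So an order of magnitude over 17 years corresponds to prices falling roughly 13% per year, and over 10 years to roughly 21% per year.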