# 1.1 Priority objection: AGI is Too Far so it isn't worth worrying about

A frequent argument against work on AI Safety is that we are hundreds if not thousands of years away from developing superintelligent machines, and so even if they may present some danger, it is a waste of human and computational resources to allocate any effort to addressing superintelligence risk at this point in time.

# 1.2 Priority objection: A Soft Takeoff is more likely and so we will have Time to Prepare

AI takeoff refers to the speed with which an AGI can reach superintelligent capabilities. While a hard takeoff is considered likely and would mean that this process is very quick, some argue that we will instead face a soft takeoff and so will have adequate time (years) to prepare.

# 1.3 Priority objection: There is No Obvious Path to Get to AGI from Current AI

While we are making good progress on AI, it is not obvious how to get from the current state of AI to AGI, and current methods may not scale.

# 1.4 Priority objection: Something Else is More Important than AI safety / alignment

Some have argued that global climate change, pandemics, social injustice, and a dozen other more immediate concerns are more important than AI risk and should be prioritized over wasting money and human capital on something like AI Safety.
# 1.5 Priority objection: Short Term AI Concerns are more important than AI safety

Similar to the argument that something else is more important, proponents claim that immediate issues with today's AIs, such as algorithmic bias, technological unemployment, or limited transparency, should take precedence over concerns about a future technology (AGI/superintelligence) that does not yet exist and may not exist for decades.