On December 21st, Ethereum co-founder Vitalik Buterin posted on social media, stating: "My definition of AGI (Artificial General Intelligence) is:
AGI is an AI powerful enough that, if one day all humans suddenly disappeared and the AI were uploaded into robot bodies, it would be able to independently continue civilization.
Obviously, this is a very hard definition to measure, but I think it captures the core of the intuitive difference between 'the AI we are used to' and 'AGI' that many people have in mind. It marks the transition from a tool that constantly depends on human input to a self-sufficient form of life.
ASI (Artificial Superintelligence) is a completely different matter - my definition is the point at which humans in the loop no longer add value to productivity (as with board games, where we only actually reached this point in the past decade).
Yes, ASI scares me - even AGI as I define it scares me, because it brings obvious risks of loss of control. I support focusing our work on building intelligence-augmentation tools for humans, rather than on building superintelligent life forms."