The term AI itself is a shifting of goalposts. What was AI 50 years ago is now AGI, so we can call this shit AI though it’s nothing of the sort. And everybody’s falling for the hype: governments, militaries, police forces, care providers, hospitals… not to speak of the insane amounts of energy & resources this wastes, and other highly problematic, erm, problems. What a fucking disaster.
If it wasn’t for those huge caveats I’d be all for it. Use it for what it can do (which isn’t all that much), research it. But don’t fall for the shit some tech bro envisions for us.
You’re not wrong, but that’s also a bit misleading. “AI” is all-encompassing, while terms like AGI and ASI are subsets. From the 1950s onward, AI was expected to evolve quickly as computing did, but that never happened. Instead, AI mostly topped out with decision trees, like those used for AI in videogames. ML pried the field back open, but not in the ways we expected.
AGI and ASI were coined in the early 2000s to set apart the goal of human-level intelligence from other kinds of AI like videogame AI. This is a natural result of the field advancing in unexpected, divergent directions. It’s not meant to move the goal post, but to clarify future goals against past progress.
It is entirely possible that we develop multiple approaches to AGI that necessitate new terminology to differentiate them. It’s the nature of all evolution, including technology and language.