Yann LeCun said we won't reach AGI by scaling LLMs. I agree, but I'd add that LLMs, which are not AGIs, open up an interesting possibility: we might not "need" AGI to get most of the benefits we'd ever want, e.g. robotics, autonomy, etc. Maybe AGI isn't even needed in the first place. That would mean the apocalyptic scenarios never get a chance to manifest.