literally this. the worst part about current llms is that they can't say "i don't know" and would rather come up with some nonsense, making them utterly useless on their own.
llms are *really* good at writing *convincing* text, which is really all they should be doing: paired with a human who can tell hallucinations from reality, not the other way around.
agi is scary and all, but llms just won't become agi anytime soon