@GossiTheDog
It is just such a weird concept to me, that GenAI can really achieve so much, but just makes shit up.
Think about the achievement:
- we can literally just “Star Trek ask” for stuff and it tells us. About anything.
- it can parse, learn and retain huge interconnected data sets
- it can run complex tasks, managing inputs and outputs
That this can happen from saying something in plain language is amazing.
The fact that it fails at the basic fundamentals of “don’t make up stuff that isn’t in the dataset” is truly wild. No matter how many guardrails you put in place, it’s like “NOOPE”