Kapisch @BitbyBit - 3d
LLMs "hallucinate" because they're not search engines. They predict the most likely next token, even if that means inventing fake URLs, stats, or people. So never take it at its word; always double-check.
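
Rough toy sketch of what "predict the most likely next token" means. The tokens and probabilities here are made up for illustration, not from any real model, but the point stands: the model only ranks how plausible a continuation looks, it never checks whether the thing it outputs actually exists.

```python
# Toy sketch (not a real model): next-token prediction picks the most
# plausible-looking continuation, with no notion of "true" vs. "made up".
import random

# Hypothetical learned probabilities for the token after
# "The paper is available at https://arxiv.org/abs/"
next_token_probs = {
    "2106.09685": 0.04,   # an ID that happens to be real
    "2301.00001": 0.03,   # plausible-looking, may not exist
    "1234.56789": 0.02,   # pure pattern-matching, likely fake
    # ... many more candidates with smaller probabilities
}

def pick_next_token(probs, greedy=True):
    """Pick the next token: greedy argmax, or sampling by probability."""
    if greedy:
        return max(probs, key=probs.get)
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

# Whichever token "looks likeliest" wins, whether or not the URL is real.
print(pick_next_token(next_token_probs))
```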