AI & LLM
AI Hallucination
When an AI system generates false, misleading, or fabricated information that appears plausible but has no basis in reality or training data.
Understanding AI Hallucination
AI hallucinations occur when language models confidently produce incorrect information, invented facts, or citations to nonexistent sources. For brand visibility, this means AI systems may misstate your product features, pricing, or company information. Countering hallucinations requires publishing clear, authoritative content that AI systems can reliably reference, which reduces the likelihood that fabricated information about your brand appears in AI-generated answers.
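As a rough illustration of why reliable reference content matters, here is a minimal, hypothetical sketch of the grounding pattern (all names and data are invented for the example): an answer is returned only when it can be traced to an authoritative source passage, and the system declines rather than guesses when nothing matches. This decline-instead-of-fabricate behavior is what RAG-style grounding aims for.

```python
# Minimal grounding sketch (hypothetical names and data): answer brand
# questions only from authoritative source passages instead of letting
# a model free-generate a plausible-sounding claim.

AUTHORITATIVE_FACTS = {
    "pricing": "Acme Pro costs $29/month, billed annually.",
    "features": "Acme Pro includes API access, SSO, and audit logs.",
}

def grounded_answer(question: str) -> str:
    """Return a fact only if the question matches a known topic;
    otherwise decline instead of fabricating an answer."""
    q = question.lower()
    for topic, fact in AUTHORITATIVE_FACTS.items():
        if topic in q:
            return fact  # grounded: traceable to a source passage
    # Declining is the anti-hallucination behavior: no basis, no claim.
    return "No authoritative source found; not answering."

if __name__ == "__main__":
    print(grounded_answer("What is Acme Pro pricing?"))
    print(grounded_answer("Does Acme integrate with Salesforce?"))
```

Production systems replace the dictionary lookup with retrieval over your published content, which is why clear, well-structured brand pages directly reduce the surface area for hallucination.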