Generative AI and Factuality: Dealing with 'Hallucinations' in Enterprise Deployments
Hallucinations in Generative AI refer to the generation of text that is not grounded in factual information but instead arises from the model's internal processes or the biases within it. In other words, hallucinations occur when the AI generates content that is fictional, inaccurate, or disconnected from reality. In enterprise-grade deployments, factuality is one of the key challenges when utilizing Generative AI.
In this keynote, Goehkan will share methods and best practices, drawn from real-life examples as well as academic research, for dealing with this challenge in Generative AI.