AI: Student Guide to ChatGPT, CoPilot and Other AI Resources
Fact-checking is always needed
AI "hallucination"
"Hallucination" is the official term in the field of AI for when a system makes things up. It happens because these systems are probabilistic, not deterministic: they generate the most likely-sounding text based on patterns in their training data rather than retrieving verified facts.
Fact-checking AI output is crucial for catching these errors, especially as AI becomes more integrated into important decision-making processes. AI systems often sound confident but still get things wrong, usually because they lack the right information, reflect biases in their training data, or simply make guesses based on patterns.
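To see why "probabilistic, not deterministic" matters, here is a minimal sketch in Python. The word probabilities are invented for illustration and do not come from any real model; the point is only that sampling from weighted choices can return a confident-sounding wrong answer.

```python
import random

# Toy next-word probabilities for the prompt "The capital of Australia is"
# (illustrative numbers only, not taken from any real model).
next_word_probs = {
    "Canberra": 0.55,    # correct answer
    "Sydney": 0.30,      # plausible-sounding but wrong
    "Melbourne": 0.15,   # also wrong
}

def sample_next_word(probs):
    """Pick one word at random, weighted by its probability."""
    words = list(probs.keys())
    weights = list(probs.values())
    return random.choices(words, weights=weights, k=1)[0]

# Running the same "prompt" several times can give different answers,
# and sometimes a confidently wrong one -- a hallucination in miniature.
for _ in range(5):
    print("The capital of Australia is", sample_next_word(next_word_probs))
```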
ChatGPT often makes up fictional sources
One area where ChatGPT frequently gives fictional answers is when it is asked to create a list of sources. Since we've had many questions from students about this, we offer this FAQ: I can't find the citations that ChatGPT gave me. What should I do?
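Before assuming a citation is real, you can run a quick check yourself. The sketch below is one possible first pass, not a definitive method: it assumes Python with the requests package and queries Crossref's free public REST API. A match does not guarantee the citation is accurate, and a miss does not prove it is fake (Crossref mostly covers works with DOIs), so ask a librarian if you are unsure.

```python
import requests  # third-party package: pip install requests

def find_citation(title):
    """Search the free Crossref index for works matching a citation's title."""
    response = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": title, "rows": 3},
        timeout=10,
    )
    response.raise_for_status()
    items = response.json()["message"]["items"]
    if not items:
        print("No matches found -- the citation may be fabricated.")
        return
    for item in items:
        # Each record includes the indexed title and a DOI you can follow.
        print((item.get("title") or ["<no title>"])[0], "-", item.get("DOI"))

# Paste in a title from a ChatGPT-generated reference list (placeholder below).
find_citation("Title of the suspicious citation goes here")
```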
Causes of AI Hallucinations
- Insufficient or low-quality training data (see the real-world example below)
- Incorrect assumptions
- Biases in training data
- Lack of real-world understanding
Consequences of AI Hallucinations
- Misinformation
- Damage to reputation
- Legal and ethical implications
Mitigating AI Hallucinations
- Fact-checking
- Improving training data
- Developing better AI models
- User awareness
"Vegetative Electron Microscopy"???
For an entertaining, real-world example of low-quality training data, be sure to read "A weird phrase is plaguing scientific papers - and we traced it back to a glitch in AI training data," from The Conversation, April 15, 2025.