Combining Generative AI Tools with Traditional Sources for Optimal Results
Hybrid Approach: Using traditional sources and generative AI together can provide a more comprehensive understanding by drawing on each for its strengths and compensating for its weaknesses.
- Example:
- Use generative AI to get a list of the main sub-areas of a topic and of the important terminology
- Use an encyclopedia to look up the terms provided by the generative AI, both to get a broad overview and to verify the information in the AI summary
- Use articles from academic journals for detailed research
- Use generative AI to define terms or rephrase complex paragraphs when parts of academic articles are difficult to understand
Importance of Critical Thinking
- Critical evaluation: All sources, whether traditional or AI-generated, need to be evaluated critically.
- Example: Cross-check information from multiple sources to ensure accuracy and reliability.
When Traditional Sources Are Best:
- Need for verified information: When accuracy is critical (e.g., medical, legal, financial)
- Deep topic exploration: For a comprehensive understanding of established subjects
- Fast-changing topics: For topics that have changed since a pre-trained model was last updated, especially if the tool does not augment its responses with text from, and links to, more recent human-created and vetted information
- Citations required: When formal citations are needed for academic or professional work
Ways Generative AI, Especially General-Purpose Large Language Models, Can Be Unreliable:
- Hallucinations: Generative AI can confidently present incorrect information as fact.
- Example: A student using an LLM for a research paper might receive fabricated citations or misattributed quotes that appear legitimate.
- Inaccurate citations: Related to hallucinations, even when citations point to sources that exist, generative AI tools may make claims about a source's content that do not accurately represent what the source actually says.
- Example: If you try to trace information from the generative AI back to the cited primary source, you cannot find that information there.
- Outdated information: Most models have training cutoffs and lack recent information.
- Example: Using generative AI to understand current medical guidelines could lead to outdated treatment recommendations.
- Training data biases: Models reflect biases present in their training data.
- Example: Using generative AI for historical research might yield perspectives that overrepresent dominant cultural viewpoints while minimizing marginalized experiences.
- Surface-level knowledge: Related to outdated information and training data biases, if the training data lacks information about a domain, the model may not be able to give more than superficial responses, or it may be more prone to giving plausible-sounding but incorrect information.
- Example: A person using generative AI for complex case analysis might receive oversimplified interpretations that miss critical nuances related to the subject.