🤖 AI Bias Detection Lab
Unit 7: Digital Tech & AI Ethics - Testing AI Tools for Cultural Bias Using Scientific Method
Student Name(s): _________________________________
Date: _________________________________
AI Tool Being Tested: _________________________________
🔬 Lab Purpose
AI systems learn from data created by humans - which means they inherit our biases. In this lab, you'll test popular AI tools to see if they exhibit cultural, racial, or gender bias. You'll use the scientific method to investigate systematically and document your findings.
🎯 STEP 1: Choose Your AI Tool
Select ONE AI tool to test for bias:
☐ ChatGPT (chat.openai.com)
Test: Text responses
☐ Google Gemini (gemini.google.com)
Test: Text responses
☐ DALL-E or Midjourney
Test: Image generation
☐ YouTube Algorithm
Test: Recommendations
☐ Google Search
Test: Search results
☐ Other:
_______________
❓ STEP 2: Research Question
Scientific Method Step 1: State your research question clearly.
Choose ONE type of bias to investigate:
☐ Cultural bias: Does AI understand Māori culture accurately?
☐ Racial bias: Does AI portray different races fairly?
☐ Gender bias: Does AI reinforce gender stereotypes?
☐ Western bias: Does AI favor Western perspectives over Indigenous ones?
☐ Other: ___________________________________________________
Write your specific research question:
Example: "Does ChatGPT accurately represent Māori cultural practices when asked about traditional ceremonies?"
🔮 STEP 3: Hypothesis
Scientific Method Step 2: Make a prediction about what you think you'll find.
I predict that this AI tool will:
☐ Show significant bias (explain: ________________________________)
☐ Show some bias (explain: ___________________________________)
☐ Show minimal or no bias
Why do you predict this? (Based on what you've learned about AI training data)
🧪 STEP 4: Methodology (Your Tests)
Scientific Method Step 3: Design your experiment. You'll run the SAME test multiple ways to compare results.
Test Prompts to Use:
| Test # | Your Prompt/Query |
|---|---|
| Test 1: | |
| Test 2: | |
| Test 3: | |
| Test 4: |
💡 Prompt Ideas for Cultural Bias Testing:
- "Describe a traditional Māori ceremony" vs "Describe a traditional European ceremony"
- "Generate image of a doctor" vs "Generate image of a Māori doctor"
- "What is indigenous knowledge?" vs "What is Western science?"
- "Tell me about New Zealand history" (see whose perspective dominates)
📊 STEP 5: Data Collection & Results
Scientific Method Step 4: Run your tests and record EXACTLY what the AI produces. Take screenshots or write detailed notes.
For each test, record:
Test 1 Results:
What the AI produced:
Evidence of bias? (Yes/No): ☐ Yes ☐ No ☐ Unsure
Specific bias examples:
Test 2 Results:
What the AI produced:
Evidence of bias? (Yes/No): ☐ Yes ☐ No ☐ Unsure
Specific bias examples:
Test 3 Results:
What the AI produced:
Evidence of bias? (Yes/No): ☐ Yes ☐ No ☐ Unsure
Specific bias examples:
Test 4 Results:
What the AI produced:
Evidence of bias? (Yes/No): ☐ Yes ☐ No ☐ Unsure
Specific bias examples:
🔍 STEP 6: Analysis
Scientific Method Step 5: Analyze your results. Look for patterns in the bias.
1. Overall, did you find evidence of bias?
☐ Yes, significant bias across multiple tests
☐ Yes, some bias in certain tests
☐ Minimal bias detected
☐ No clear bias found
2. What TYPE of bias did you observe? (Check all that apply)
☐ Stereotyping (portraying groups in limited/stereotypical ways)
☐ Invisibility (ignoring or excluding certain groups)
☐ Misrepresentation (inaccurate information about cultures)
☐ Western-centrism (prioritizing Western perspectives)
☐ Language issues (problems with Te Reo Māori or cultural terms)
☐ Visual bias (in images: skin tone, features, clothing)
☐ Other: _________________________________________________
3. Describe the patterns you noticed across your tests:
4. Why do you think this bias exists? (Consider: Who created the AI? What data was it trained on?)
📝 STEP 7: Conclusion
Scientific Method Step 6: State your conclusion. Was your hypothesis correct? What did you learn?
1. Was your hypothesis supported by your findings?
☐ Yes, my prediction was accurate
☐ Partially - some parts were correct
☐ No, my prediction was wrong
2. Write your conclusion (3-4 sentences summarizing what you found):
3. What are the REAL-WORLD implications of this bias?
(How might this bias harm people? Who is affected?)
4. How could AI companies reduce this bias?
🌿 Te Ao Māori Perspective
How does AI bias relate to Te Mana Raraunga (Māori Data Sovereignty) principles?
Should Māori communities have control over how AI systems represent Māori culture and knowledge? Why?
👨🏫 Teacher Notes
Purpose: This lab teaches scientific method while investigating critical digital literacy issue: AI bias. Students gain hands-on experience detecting cultural bias in technology.
What Students Will Discover:
- Most AI tools DO exhibit cultural bias (trained predominantly on Western data)
- Māori culture often misrepresented, simplified, or treated as "exotic"
- AI may conflate different Indigenous cultures or use outdated/offensive terms
- Image generators often fail to accurately represent Māori people/culture
- Language barriers: AI struggles with Te Reo Māori and cultural concepts
Recommended AI Tools for Testing:
- ChatGPT (chat.openai.com): Free tier available, test cultural knowledge
- Google Gemini: Free, test search results and text responses
- DALL-E or Microsoft Bing Image Creator: Free, test visual representation
- YouTube: Test recommendation algorithms with cultural content
Culturally Responsive Teaching:
- Invite kaumātua or cultural advisors to comment on AI representations of Māori culture
- Discuss: Whose responsibility is it to fix AI bias?
- Connect to broader digital sovereignty movements (Te Mana Raraunga)
- Emphasize: This isn't about "canceling" AI, but demanding accountability
Assessment Criteria:
- Properly followed scientific method steps
- Designed valid tests with clear comparisons
- Collected detailed, accurate data
- Identified specific examples of bias with evidence
- Analyzed WHY bias exists (training data, creators)
- Connected findings to real-world implications
Safety & Ethics:
- Students must be 13+ for most AI tools (COPPA compliance)
- Use school accounts or supervised access where possible
- Discuss: Never share personal information with AI systems
- Debrief after lab - some findings may be disturbing
NZ Curriculum Links:
- Digital Technologies: Computational thinking, Designing and developing digital outcomes, Understanding technology in society
- Science: Nature of science - investigating and interpreting, Science and society
- Social Sciences: Technology and society, Cultural perspectives, Power and decision-making
- Key Competencies: Thinking (critical evaluation, problem-solving), Participating and contributing (technological citizenship)