I co-teach AP Computer Science A through Microsoft’s TEALS program. The classroom runs on Chromebooks, Google Classroom, and code.org (AWS). Corporate infrastructure top to bottom. This year I added an AI tutor. That’s apparently the controversial part.

The research is interesting: a Wharton study found students using standard ChatGPT performed 17% worse on exams—the “crutch” effect. But students using AI with pedagogical guardrails showed no negative effect. The problem isn’t AI in education. It’s unguided AI. So I built a tutor that asks probing questions instead of giving answers. I’m sharing the prompt I use and how to set one up yourself.
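For a concrete sense of the approach, here is a minimal sketch of what a "probing questions, not answers" guardrail can look like, assuming an OpenAI-style chat messages format. The prompt wording and the helper function below are illustrative placeholders, not the exact prompt from this post:

```python
# Illustrative sketch only: the guardrail idea expressed as a system prompt
# for an OpenAI-style chat API. Wording and names are placeholders.
TUTOR_SYSTEM_PROMPT = """\
You are a tutor for an AP Computer Science A class.
Never give the student a finished answer or write code for them.
Instead, respond with one probing question at a time that pushes the
student to explain their own reasoning, trace their own code, or
predict what a change would do. If the student asks you to just solve
the problem, decline and point them to the next smallest step.
"""

def tutor_messages(student_question: str) -> list[dict]:
    """Wrap a student's question in the guardrail prompt (hypothetical helper)."""
    return [
        {"role": "system", "content": TUTOR_SYSTEM_PROMPT},
        {"role": "user", "content": student_question},
    ]
```

The resulting list can be passed to any chat-completions endpoint; the point is that the guardrail lives in the system prompt, so students interact with a tutor persona rather than a raw answer machine.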

Meanwhile, China made AI education mandatory for six-year-olds this year. We’re still deciding whether to block ChatGPT.

  • GreenBeard@lemmy.ca · 3 minutes ago

    Hard pass. There should be absolutely no AI in any classroom, under any circumstances. The whole point of a classroom is to build a foundation for understanding the fundamentals before they slap a set of training wheels on and vibe-code their way into disaster. Most of these LLMs ignore whatever guardrails you slap on them far too frequently.

    The most important lesson these kids need to learn is that if you can’t do it yourself, you shouldn’t be letting an LLM do it for you. If the best you can say about the effects is “This version doesn’t seem to be actively harming them,” then the bar is in hell, and we shouldn’t be playing with these tools at all at this point.

  • XLE@piefed.social · 3 hours ago

    > I got into volunteering through TEALS, Microsoft’s nonprofit.

    Good for you / I’m sorry to hear that

    > The class runs on Chromebooks managed by Google Classroom, writing code on code.org—which is powered by AWS.

    My condolences to the students. It sounds like they’re already being brought up in a world where they are expected to own nothing and be happy.

    I hope you teach them about how terrible this privacy violation is, and how they are slowly being groomed into dependency.

    > Corporate infrastructure is already the foundation of public CS education.

    That’s very sad too.

    …wait, you’re upset because you want to indoctrinate the children with more stuff?

    • davidwkeith@lemmy.world (OP) · 3 hours ago

      Yeah, no. The state of affairs is sad, but a common complaint about AI in the classroom is that there is no open-source, federated, or otherwise ‘free’ option. It sucks, but we need to work with the tools we have.

      • RalfWausE@feddit.org · 2 hours ago

        No, we decisively don’t “need to work with the tools we have”. We have to teach our students the dangers of that abomination, and ways to disrupt, hamper, poison, and destroy this stuff.

      • XLE@piefed.social · 2 hours ago (edited)

        Don’t the tools we have include internet and even (gasp) book literacy, rather than going to a chatbot? At very best, the evidence that AI helps anyone is shaky. At worst, we are witnessing a reverse Flynn effect in education right now, and this alleged tool - besides not doing what was promised and not even making enough money to prop itself up - has been caught enticing children into suicide. If a billionaire genius like Sam Altman can’t code in a guardrail to save a child’s life, how can you?

        Why encourage it?

        Are the children being taught a tool, or are they being used as guinea pigs?

  • Kairos@lemmy.today · 3 hours ago

    “Using” AI is not well defined. I assume the one that showed no difference did so because the students found it useless.

    • davidwkeith@lemmy.world (OP) · 3 hours ago

      Exactly! The cohort that showed no difference wasn’t given guidance of any sort, just GPT-4 as a resource. The cohort that benefited had a tutor agent set up, and the students were instructed to treat it like a tutor. Like calculators, computers, and the Internet before it, we need to design curriculum with AI in mind for it to be useful.