
Unblocking AI fears

Written by Hayden Rolls
Updated today
  1. "But won't AI de-skill educators?”

    AI isn’t the problem — how we design it is

    It’s important to separate two different things: an “AI problem” and a “design problem” with AI tools.

    Research shows that AI can help or harm learning, depending on how and when it’s used.

    A 2025 study by MIT looked at students using ChatGPT to write essays. The researchers found that when students used AI too early in the learning process, they learned less. They remembered fewer facts and engaged less with the task. This often happened when students simply typed in the essay question and accepted the generated answer.

    However, the same study found that timing matters: when AI is used after learners have done their own thinking, it can improve understanding without weakening long-term thinking skills.

    Another MIT study in 2025 found similar results in the workplace. When AI was designed and used well, it helped people ask better questions, spot gaps in their knowledge, and get real value from the technology. As the researchers explained, “Metacognition — thinking about your thinking — is the missing link between simply using AI and using it well.” In other words, AI is most effective when it encourages active thinking, rather than passive reliance.

    Generic tools like ChatGPT are not built for early childhood education, and that creates real risks. With the wrong tool, an educator can type a single sentence and generate a full learning story. Over time, this can lead to de-skilling, where professional judgement and reflection are lost.

    That’s why design matters.

    At Mana, we built an AI tool purpose-built by educators, for educators. Our AI acts as a coach, not a shortcut. Coach Sue uses Socratic questioning to guide thinking, and you can only generate documentation after you’ve done enough critical reflection. The AI supports thinking; it doesn’t replace it.

    In a 2024 survey of over 10,000 educators using Mana, 89% said Coach Sue improved their critical reflection.

    Coach Sue is designed to engage the social brain. Humans learn best through interaction. Sue acts like a mini educational leader — a thought partner you can talk to about your practice.

    So AI isn’t the problem.

    The real question is how we design AI tools to support thinking, protect professional skill, and use AI’s power responsibly.


  2. "Is storing data with AI safe? ”

    Children’s data safety is about security, not AI

    When we talk about children’s data and safety, the key issue isn’t AI itself.

    It’s how information is stored, who can access it, and what safeguards are in place to prevent misuse.

    In other words, this is a data storage and cybersecurity issue, not an AI one.

    We see this distinction every day. Companies like Apple and Samsung use AI across their products. Yet many of us are comfortable storing photos and videos of our own children on our phones. That trust comes from confidence in the security systems they’ve put in place — not because the technology lacks AI.

    The same principle applies in early childhood education.

    In a recent industry-wide review by Aram Advisory, Mana was named the safest platform in early childhood education. Mana is currently the only provider in the sector with an independent SOC 2 cybersecurity attestation.

    A SOC 2 attestation means our systems have been externally audited against strict standards for data security, privacy, and access controls. This provides a higher level of assurance that children’s data is protected from leaks, misuse, or unauthorised access.

    When it comes to children’s data, the right question isn’t “Does this platform use AI?”

    It’s “How well is the data protected?”

  3. “But AI is not their authentic voice; it doesn’t reflect their own words.”

    Writing support has always existed

    Getting help with writing is nothing new.

    World leaders often work with speechwriters. Many authors use ghostwriters or editors. Executives regularly rely on graduates or assistants to draft reports and emails. Even Martin Luther King Jr. worked with two lesser-known speechwriters, Clarence B. Jones and Stanley Levison. Yet the words — and the responsibility for them — were ultimately his.

    AI can support educators in a similar way.

    Educators bring the deep knowledge, insight, and understanding of each child. AI simply helps turn those ideas into clear, professional language. The educator reviews, edits, and owns the final words.

    This isn’t about replacing professional judgement or an educator’s voice. It’s about supporting it — like a trusted co-writer.

    By reducing the time spent perfecting wording, educators can focus more on what matters most: being present with children.

  4. “But won’t they lose the ability to write learning stories truly on their own?”

    Progress changes how we work

    There was a time when people navigated by the stars. Today, we use GPS. We once solved long division by hand. Now, we use calculators. Throughout history, new tools have changed how work gets done.

    These changes aren’t about taking shortcuts. They’re about progress. New tools reduce time spent on mechanics, so we can focus on what matters most.

    For educators, this means spending more time thinking, reflecting, and being present with children — rather than perfecting every sentence of documentation.

    For parents, it’s important to understand how much time documentation can take away from direct time with children. The right tools help lighten that load, giving educators more energy and attention for what truly counts — the children themselves.
