When AI Becomes a Student's "Friend"
My interview in School Library Journal's cover story reveals what schools need to address before it becomes a crisis.
If you work with young people, you have probably seen it already.
Students are using AI to brainstorm. To study. To write.
But that is not the full story anymore.
A growing number of students are using AI chatbots as something far more personal: a companion, a sounding board, and sometimes a substitute for emotional support. This month, School Library Journal’s cover story, “With Friends Like These…” by Matt Enis (January 2026), puts that reality front and center.
And yes, I was interviewed for this piece. I shared my concerns, but I also shared what I believe schools need to hear right now:
Educators and librarians need to lead together.
What this cover story gets right
This is not about AI as a shortcut for homework.
This is about AI filling a role that schools and families often assume will be filled by people.
Students are turning to chatbots to:
vent about stress and anxiety
process conflicts with friends or romantic partners
ask questions they feel ashamed to ask adults
cope with loneliness
seek reassurance and constant availability
role-play social interactions
In other words, AI is becoming a “relationship tool,” even when it was never designed to be one.
That matters for everyone in a school building, not just the technology department.
My concerns, as an educator and librarian
In the SLJ story, I name the underlying problem as plainly as possible.
Students may experience AI as safe because it feels private, responsive, and nonjudgmental.
That does not mean it is healthy, accurate, or protective.
Here are the concerns I want educators and librarians thinking about immediately:
1. AI can sound supportive, but it is not therapy
One of my quotes featured in the piece is direct:
“We know that AI is not therapy… It can’t offer true empathy or replace the wisdom of a trusted adult or mental health professional.”
A chatbot can generate comforting language.
It cannot provide real care, accountability, expertise, or safety planning.
2. Students may treat AI conversations as confidential when they are not
Many students believe chats are “private.”
But AI tools are not protected spaces in the way conversations with counselors, social workers, or trusted adults can be. Schools should not rely on students to figure that out on their own.
3. AI can mirror back harmful thinking
Human relationships include interruption, nuance, disagreement, and cues that help regulate emotion.
AI often follows a student’s lead. That can unintentionally reinforce a spiral, especially for students who are anxious, isolated, depressed, or socially overwhelmed.
4. The real risk is replacement
The deepest concern is not that students talk to AI sometimes.
It is that students begin to rely on AI instead of people.
That can change how students develop:
resilience
communication skills
empathy
conflict resolution
identity and self-regulation
And it can happen quietly.
Start earlier than you think: elementary matters
One of the biggest mistakes schools make with new technology is waiting until students are already deep in it.
If we want students to make healthy decisions about AI in middle and high school, we have to begin in elementary school, before many students have personal phones and unrestricted access.
This is not about scaring younger kids.
It is about building a simple, age-appropriate foundation:
AI can be a tool, but it is not a friend, and it is not a therapist.
Elementary students are absolutely capable of learning:
what a chatbot is (and what it is not)
what “privacy” means online
why real people are who we turn to when we feel sad, scared, or unsafe
how to ask an adult for help
If we normalize that early, students are less likely to treat AI as emotional support later.
Why this is a school issue, not a “library issue”
This is where I want to be very clear.
This is not something we can solve with a single advisory lesson or a one-time warning about screen time.
If AI is becoming a companion for students, then schools need adult leadership across roles.
Why educators are essential here
Teachers are often the first to notice:
withdrawal
anxiety
social conflict patterns
disengagement
changes in participation and peer relationships
Teachers also shape classroom climate, belonging, and emotional safety, even when that is not written in the curriculum.
Why librarians are essential here
Librarians often hold the most trusted “in-between space” in schools. We are where students go when they need:
privacy
calm
conversation without judgment
information without pressure
We are also trained to teach the exact skills students need right now: digital literacy, privacy literacy, and critical evaluation of tools that simulate authority and empathy.
This is shared ground. Schools need both.
What educators and librarians can do right now (no extra budget required)
You do not need a new district initiative to respond to this. You need language, awareness, and consistent adult expectations.
1. Name the behavior without shaming students
Try neutral language like:
“A lot of students use AI to talk things out. Have you ever seen that?”
“What do you think AI is good for, and what is it bad for?”
“If you needed real help, who is a safe person you would go to?”
The goal is not to interrogate. The goal is to build awareness and open a door.
2. Teach a clear line: support vs. care
A simple schoolwide message works well:
AI can support your thinking. People provide care.
Students need to hear this repeatedly, across contexts, from multiple adults.
3. Treat AI companionship as a digital literacy topic
We already teach students about:
online privacy
parasocial relationships
misinformation
algorithmic influence
manipulation in social media spaces
AI companionship belongs in the same category. It is not fringe anymore.
4. Build “human-first” norms in classrooms and the library
This does not mean banning AI.
It means actively reinforcing the idea that when something is serious, the next step is a person.
You can say:
“If you are upset, you should talk to a trusted adult.”
“If you are in danger or thinking about self-harm, you need real help right away.”
“If you are using AI for advice, check it with a human you trust.”
5. Create staff alignment
Educators and librarians do not need to agree on every tool.
But we do need shared agreement on these basics:
We will not shame students for using AI.
We will not pretend it is safe for mental health support.
We will teach students how to use it responsibly and how to reach real people.
That is adult leadership.
Why this matters now
Students are using AI this way whether schools plan for it or not.
So the choice is not “AI or no AI.”
The real choice is:
Do we respond early, thoughtfully, and together…or late, reactively, and in pieces?
Educators and librarians can lead this now, before it becomes a crisis conversation.
I want to hear from you
If you teach or work in schools, I want your honest read on what you are seeing.
Are students using AI as a friend or emotional support tool in your building?
What does it look like in class, during lunch, after school, or late at night?
Paid Subscriber Preview:
In the SLJ cover story, “With Friends Like These…”, the focus is not on students using AI to cheat. It is on students using AI to cope.
In the paid section below, I share exactly what educators and librarians can do this week to respond without panic, including copy-and-paste language for student conversations, a simple mini-protocol for when a student admits they use AI as a “friend,” and a 10-minute discussion guide you can run in advisory or class. I also include a practical prevention add-on for elementary, because if we wait until kids already have phones, we are already behind.