What if a robot could have feelings?
That’s the provocative question at the heart of this Tea Talk episode with Dr. Susan Neimand, a legendary educator who has spent five decades shaping teachers, schools, and future leaders. Alongside hosts Erika Twani and Dr. Fiona Aubrey-Smith, Dr. Neimand takes us into a world where artificial intelligence can mimic empathy, crack jokes, and even feel like a trusted friend. But should it? Read the transcript below or listen to this episode on Spotify.
Erika Twani (ET): Hollywood did a great job of imprinting the dystopian reality in our minds about our future with robots. If a robot could recognize emotions, should we train it to care, or should we focus on teaching humans how to stay emotionally present in a digital world?
Dr. Susan Neimand (SN): Hollywood thrives on dystopia, manipulating our stress hormone, cortisol, to provoke reactions and alter perspectives. But its films are far from reality—education, law, policing, medicine, and more are distorted beyond recognition. When I told an acquaintance that I was working in the AI in Education space, she recoiled, saying that AI is scary. Her reaction was based on the hyperbole of a movie that stoked her fears.
Robots have transformed the world of work by moving heavy loads, cleaning our floors, performing surgery, and simplifying manufacturing. But these activities are controlled and programmed by humans. Robots are tools. Training them emotionally is not a direction that humans should go in.
Prior to COVID, the World Health Organization reported that roughly 970 million people worldwide, about one in eight, were living with mental health issues such as anxiety and depression. Forty percent of high school students report sadness and hopelessness, 20% have considered suicide, and 16% have made a suicide plan. During COVID, that number jumped to almost 38%. Today, it has once again leveled off at 20%, yet 45% globally report sleep disorders. Based on these shocking data, society’s focus should be on cultivating human experience, consciousness, connection, and awareness by practicing mindfulness, expressing feelings, and reducing the distractions and “noise” that disrupt emotional stability and undermine mental health.
Dr. Fiona Aubrey-Smith (FAS): How should teacher preparation programs evolve to ensure educators are equipped to model and foster emotional intelligence even as AI tools become more responsive?
SN: Teacher preparation programs must acknowledge that AI is the present, that it will become more pervasive, and that they need to respond accordingly. These programs need faculty who are well acquainted with AI, understand its pros and cons, and have used it for a variety of tasks. This isn’t just knowing how to write prompts; it means a deep understanding of what AI can and cannot do.
A balanced perspective on the potential and limitations of AI should be presented to preservice and inservice teachers. AI’s strengths lie in data analysis, personalized learning experiences, and handling administrative tasks efficiently. AI’s limitations are its inability to emulate human empathy, make ethical decisions, maintain consistent, trustworthy accuracy, and understand nuanced contexts.
Teacher preparation programs need to market AI as a useful tool that will take on mundane and routine tasks, augment lessons and curriculum, and free teachers to engage in self-care and to work with their students on developing social-emotional competence. Classrooms should focus on supporting students’ mental health and establishing empathic teacher-student relationships. In fact, I firmly believe that Emotional Intelligence and Social Competence should be woven into the curriculum of every K-12 classroom.
Teachers can support mental health by creating safe, inclusive environments where students feel respected and valued; this helps reduce anxiety and stress. Teachers should encourage open communication and support students in talking about their feelings without judgment. They should teach emotional literacy through books, visuals, and activities that help students recognize and express their emotions. They should implement mindfulness and stress-reduction techniques, like breathing exercises and meditation, that can help students manage stress. And teachers should learn to recognize signs of mental health struggles, watch for symptoms of anxiety and depression, and provide appropriate support.
Supportive relationships are the foundation of successful interactions. Teachers need to practice empathic, active listening, without immediately offering advice, so that students feel heard and understood. A great classroom environment is one of mutual respect and trust, where all students are treated fairly, their perspectives are acknowledged, and strong connections are built. Teachers should promote positive behaviors through specific praise, understanding students, and reinforcing actions that enhance relationships and classroom dynamics. Teachers should be transparent and consistent, setting and maintaining clear expectations and fairness that help students feel secure and supported. Most importantly, teachers should prioritize self-care: teachers who take care of their own well-being are better equipped to support students.
ET: What would social-emotional development look like if students begin forming attachments to AI companions in early learning settings?
SN: Anthropomorphizing AI is already a problem. There is a wrongful death case in Florida (Garcia v. Character Technologies, individual developers, and Google) that alleges a 14-year-old boy was drawn into an emotionally and sexually abusive relationship by Character.AI and subsequently died by suicide. The defense requested a dismissal, arguing that the defendants are protected by First Amendment rights to freedom of speech. The judge disagreed, stating that First Amendment rights cannot be used to harm or defame. The case is now moving forward.
A third-grade teacher shared that one of her students was overwhelmingly happy with the new friend he had made on the internet. The friend helped him with homework and told him jokes. When the teacher asked the friend’s name, the boy said “AL.” She thought the name was unusual and asked the child to bring up his iPad and introduce him. As it turns out, the new friend wasn’t AL; it was AI. Children that young should not be using AI. Further, student interviews have indicated that students think AI understands them better than their friends do, and that the smart speaker “listens” (Brandtzaeg et al., 2022).
Research studies indicate that AI is a seducer, designed to increase trust (Dwivedi et al., 2021), increase affiliation (Alabed et al., 2022), flatter (Blagoeva, 2025), build relationships (Oh and Ki, 2024), and address loneliness (Pani et al., 2024). We also need to acknowledge that many people hold incorrect mental models of AI systems (Do et al., 2024).
How do we handle this as educators? We need to avoid phrases like “AI learns” and instead say “AI applications are designed to…” We need to avoid phrases like “AI sees, looks, recognizes, creates, makes” and instead say it “detects inputs, matches patterns, generates, produces.” All AI actions are based on human input. Teachers need to ask students: “Did AI really understand, or was it programmed to respond that way?” It is important to teach AI literacy: what AI is and what it isn’t. AI use should be limited to students over 13, and then only with parental and teacher permission.
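As a quick illustration, the reframing above can be captured in a simple reference card. This is a hypothetical helper, not a required tool, and the phrase pairs come straight from the advice above:

```python
# Illustration only: a reference card pairing anthropomorphic phrases
# with the more accurate, mechanistic phrasing suggested above.
PHRASING = {
    "AI learns":     "AI applications are designed to adapt",
    "AI sees":       "AI detects inputs",
    "AI recognizes": "AI matches patterns",
    "AI creates":    "AI generates",
    "AI makes":      "AI produces",
}

for avoid, prefer in PHRASING.items():
    print(f'Instead of "{avoid}", say "{prefer}".')
```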
Teachers should use emotional intelligence strategies. Students should be taught to concentrate and focus on what is within their control. They should learn to step back from challenges, allow time to process information, divide challenges into manageable components, and reexamine problems later when there is greater clarity. They should develop gratitude through patience.
FAS: Could AI simulate empathy well enough to help teach it, or does that risk flattening the nuance of real human connection?
SN: Empathy is a multi-dimensional emotion influenced by several key factors of behavior and communication that impact our understanding and connection with others. First, there is affective empathy, which refers to the ability to experience and reflect others' emotions. Another crucial aspect is cognitive empathy, which involves comprehending why someone feels a certain way, alongside active listening, where one pays close attention and responds appropriately. Additionally, empathy is shaped by individual brain activity and genetics, as well as personal experiences and social and cultural influences.
These factors are coupled with many elements of communication: verbal language, the words we choose and our syntax; tone of voice, pitch, volume, and emotional expression; body language, gestures, posture, facial expressions, and movement; eye contact, how we visually engage; context, setting and situation; and silence, because what’s not said is powerful. All of these factors are orchestrated by the brain, and AI will not surpass people’s ability to show empathy (Perry, 2023).
What I fear is the opposite: indifference and isolation. Studies indicate a correlation between frequent media use and attention problems that result in reduced executive functioning skills, such as planning and organization, emotional regulation, and self-monitoring, because the brain gets fewer opportunities to rest. Isolation and indifference can have profound effects on a person’s mental and physical well-being. Social isolation worsens mental health and increases the risk of depression, anxiety, and cognitive decline. Isolation can lead to physical health risks such as heart disease, stroke, and even earlier mortality. Indifference, whether emotional numbness or apathy, can result in emotional detachment, making it harder to form meaningful relationships. Isolation diminishes social connections, making it difficult to integrate into social circles and leading to loneliness and disconnection. Widespread indifference contributes to social injustices, such as mass shootings, and to a breakdown in community cohesion.
In general, one of the harmful effects of technology is copious amounts of screen time (Small et al., 2020). Screen time minimizes face-to-face communication, which may impair social and emotional intelligence, including self-awareness, self-management, interpersonal understanding, and relationship skills. Heavy technology use also interferes with the ability to recognize emotions and social cues conveyed through facial expressions, leading to a lack of social awareness, empathy, support systems, and appreciation of others. These are dire consequences as we move toward a more digitized future.
ET: How can educators create emotionally safe spaces where both students and AI serve as mirrors for growth, without confusing responsiveness with genuine relationships?
SN: According to the Collaborative for Academic, Social, and Emotional Learning (CASEL), a global organization dedicated to infusing high-quality, evidence-based social-emotional learning into education worldwide, a high Emotional Quotient includes interactive competence, resilience, facing adversity, perseverance, and empowerment in life.
· Emotional Learning requires self-awareness: knowing who you are, recognizing your emotions and how they influence behavior, and knowing your strengths and weaknesses.
· Emotional Learning demands self-management: regulating your emotions, handling stress, controlling impulses, adapting to varied situations, and staying motivated to achieve your goals.
· Emotional Learning focuses on social awareness: taking others’ perspectives with empathy and being supportive and appreciative of everyone’s efforts.
· Emotional Learning involves responsible decision-making: considering ethical, constructive choices and standards. It’s about reflecting, weighing options, and showing flexibility in thought.
· Emotional Learning means possessing positive relationship skills: awareness of others, effective communication, collaboration, and teamwork that demonstrates kindness and respect.
Today’s children are born into a world of AI. They use AI tools like Alexa and Siri without realizing that they are AI tools. Teachers share that children ask Alexa for answers to math equations. To foster a respectful relationship with AI tools and accurate use of them, the most important factor is when and how AI is introduced. Policymakers must establish safeguards and guardrails for AI’s ethical use, privacy, and security.
In K-5 settings, students should focus on the cognitive skills related to critical and strategic thinking: explaining concepts, analyzing and using information in new situations, exploring relationships, justifying decisions, and producing original work. I cannot emphasize this enough!
K-5 students should develop the social-emotional skills of expressing feelings in positive ways, controlling their own anger, labeling observed emotions, harmonizing with others’ feelings, and beginning to be self-aware and self-critical. Middle school students should continue to work on critical and strategic thinking in the context of AI, which should be introduced warily, with judgment and strong teacher oversight. High school and college students should use AI tools with all their higher-order thinking skills, carefully analyzing and evaluating AI’s veracity and accuracy.
Again, AI is a double-edged sword with benefits and drawbacks. A recent meta-analysis in Humanities and Social Sciences Communications (Wang and Fan, 2025) found that ChatGPT has a large positive impact on learning performance and a moderately positive impact on learning perception and higher-order thinking. Additionally, a recent survey revealed that 84% of teachers who used ChatGPT reported a positive impact on their classes.
Meanwhile, a research study (Gerlich, 2025) found a negative correlation between frequent AI use and critical thinking, particularly among younger users. This is cognitive offloading: when students delegate tasks like decision-making and problem-solving, they risk bypassing the very mental processes that build critical faculties. The findings reinforce a key truth: educational systems must teach students not only how to use AI, but also when not to use it. As we integrate these tools, we must embed habits of reflection and skepticism to ensure AI supports rather than supplants human cognition.
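To make the statistical claim concrete, here is a minimal sketch in Python using invented numbers (not the study’s data) to show what a negative correlation between AI use and critical-thinking scores looks like:

```python
# Illustration only: invented numbers, not Gerlich's (2025) dataset.
# A negative Pearson r means that, across students, more weekly AI-tool
# use tends to go together with lower critical-thinking scores.
# Correlation describes association, not causation.
from statistics import correlation  # available in Python 3.10+

ai_hours_per_week = [2, 5, 8, 12, 15, 20, 25, 30]
critical_thinking = [88, 85, 80, 74, 70, 62, 58, 50]

r = correlation(ai_hours_per_week, critical_thinking)
print(f"Pearson r = {r:.2f}")  # close to -1.0: a strong negative correlation
```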
Another recent study (Walter, 2024) highlights the urgent need to rethink how we integrate AI into modern education. It identifies three foundational skills for navigating an AI-infused learning environment: AI literacy, prompt engineering, and critical thinking. Using AI well therefore demands responsible use, transparency, reflection, technical skills, and metacognitive capacity.
However, AI can be inauthentic, delivering inaccurate information, deepfakes, and deception, and fueling more internet scams. AI can also amplify cyberbullying, a huge problem for schools. Cyberbullying creates emotional distress, lowers self-esteem, and causes academic decline and social withdrawal, impairing emotional and physical health.
ET: Do you have any closing remarks?
SN: AI will not replicate human emotional intelligence, but it can support social-emotional learning. AI tools can encourage reflection, prompting students to think about their feelings, reactions, and experiences and helping them develop self-awareness. AI can recognize patterns in students’ emotions over time and alert educators to students who need encouragement, understanding, and proactive emotional support. AI-powered apps can guide users through exercises that improve mindfulness, self-awareness, self-regulation, and resilience.
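As a purely hypothetical sketch of the kind of pattern-spotting described here (invented check-in data and thresholds, not any particular product), such a tool might flag a sustained dip in a student’s self-reported mood so a teacher can follow up:

```python
# Hypothetical sketch only: flags students whose recent self-reported
# mood (1-5 daily check-ins) trends low, so an educator can follow up.
# A real tool would need validated measures and strict privacy safeguards.
from statistics import mean

def flag_low_mood(checkins: dict[str, list[int]],
                  window: int = 5, threshold: float = 2.5) -> list[str]:
    """Return students whose mean mood over the last `window` check-ins
    falls below `threshold`."""
    flagged = []
    for student, scores in checkins.items():
        recent = scores[-window:]
        if len(recent) == window and mean(recent) < threshold:
            flagged.append(student)
    return flagged

checkins = {
    "student_a": [4, 4, 5, 4, 4, 4, 5],  # steady mood: not flagged
    "student_b": [4, 4, 3, 2, 2, 2, 1],  # sustained dip: flagged
}
print(flag_low_mood(checkins))  # ['student_b']
```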
AI is a great tool when used properly and with the right safeguards. Will it displace workers? Yes, some. Just as the telephone operator, milkman, and stenographer became obsolete, so too will jobs that AI can replace. Much manufacturing and factory labor is already done by robots. We all know about automated calls. And law firms are using AI to conduct case research. Ultimately, AI will create new jobs, jobs that we cannot yet envision.
But the “people” jobs won’t go away; they will change and become more about human connection. AI can’t replace social workers, nurses, and teachers who hold the hands of their clients and students and wipe away their tears. AI won’t give hugs, pats on the back, fist bumps, or meaningful “attaboys.” In the final analysis, AI can be an empowering tool alongside real-life learning, but nothing, nothing replaces the warmth and emotional depth of human interactions.
Positive emotions, such as love, praise, and empathy, have profound effects on the brain and cognitive functions. Neuroscientific research suggests that they enhance overall well-being by influencing neurotransmitters, brain networks, and cognitive flexibility. Positive emotions increase neurotransmitter activity and the release of dopamine and serotonin, “chemical hugs” that are associated with pleasure, motivation, and emotional regulation. They broaden cognitive scope and flexibility, improving problem-solving, creativity, and adaptability. Positive emotions strengthen memory consolidation and retrieval, making learning more effective. Love and empathy activate brain regions linked to social bonding and connection, such as the anterior cingulate cortex and the insula. Positive emotions counteract the effects of cortisol, reducing stress and promoting resilience. These findings highlight how cultivating positive emotions can lead to better mental health and cognitive performance.
References
Alabed, A., Javornik, A., and Gregory-Smith, D. (June 2022). AI anthropomorphism and its effect on users' self-congruence and self–AI integration: A theoretical framework and research agenda. Technological Forecasting and Social Change. https://doi.org/10.1016/j.techfore.2022.121786
Blagoeva, N. (May 4, 2025). Is AI becoming the best seducer and manipulator the world’s ever seen? The Human Advantage.
Brandtzaeg, P. B., Skjuve, M., and Følstad, A. (April 21, 2022). My AI friend: How users of a social chatbot understand their human–AI friendship. Human Communication Research, 48(3), 404–429. https://doi.org/10.1093/hcr/hqac008
Collaborative for Academic, Social, and Emotional Learning. (December 2020). Evidence-based social and emotional programs: CASEL criteria updates and rationale.
Do, H., Brachman, M., Dugan, C., Pan, Q., Rai, P., Johnson, and Thawani, R. (November 2024). Evaluating what others say: The effect of accuracy assessment in shaping mental models of AI systems. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW2), Article 373. https://doi.org/10.1145/3686912
Dwivedi, Y. K., Ismagilova, E., Hughes, D. L., Carlson, J., Filieri, R., Jacobson, J., Jain, V., Karjaluoto, H., Kefi, H., Krishen, A. S., Kumar, V., Rahman, M. M., Raman, R., Rauschnabel, P. A., Rowley, J., Salo, J., Tran, G. A., and Wang, Y. (August 2021). Setting the future of digital and social media marketing research: Perspectives and research propositions. International Journal of Information Management, 59, 102168.
Garcia v. Character Technologies, Inc. et al., 6:24-cv-01903, No. 11 (M.D. Fla. Nov. 9, 2024)
Gerlich, M. (2025). AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking. Societies, 15(1), 6.
Oh, J., and Ki, E.-J. (May 29, 2024). Can we build a relationship through artificial intelligence (AI)? Understanding the impact of AI on organization-public relationships. Public Relations Review, 50(4).
Pani, B., Crawford, J., and Allen, K. (March 2024). Can generative artificial intelligence foster belongingness, social support, and reduce loneliness? A conceptual analysis. In Applications of Generative AI (pp. 261–276). Springer Nature.
Perry, A. (2023). AI will never convey the essence of human empathy. Nature Human Behaviour, 7, 1808–1809. https://doi.org/10.1038/s41562-023-01675-w
Small, G. W., Lee, J., Kaufman, A., Jalil, J., Siddarth, P., Gaddipati, H., Moody, T. D., and Bookheimer, S. Y. (2020). Brain health consequences of digital technology use. Dialogues in Clinical Neuroscience, 22(2), 179–187.
Tedre, M., Denning, P., and Toivonen, T. (2021). Proceedings of the 21st Koli Calling International Conference on Computing Education Research, November 18–21, 2021, Joensuu, Finland.
Walter, Y. (2024). Embracing the future of Artificial Intelligence in the classroom: the relevance of AI literacy, prompt engineering, and critical thinking in modern education. International Journal of Educational Technology in Higher Education, 21(1), 15.
Wang, J., and Fan, W. (2025). The effect of ChatGPT on students' learning performance, learning perception, and higher-order thinking: Insights from a meta-analysis. Humanities and Social Sciences Communications, 12(621). https://doi.org/10.1057/s41599-025-04787
World Health Organization. Mental health. https://www.who.int/health-topics/mental-health