Frequent AI use pushes students toward prosocial behavior but deepens digital dependence
A new study examines how frequent interaction with AI tools influences the social and behavioral patterns of undergraduates. The research uncovers a double-edged effect: AI use can both encourage positive social actions and exacerbate problematic digital behaviors.
Published in Behavioral Sciences, the study, titled "The Double-Edged Sword Effect of AI Interaction Frequency with AI on College Students: The Moderating Role of Peer Support," draws on a longitudinal survey of 388 students. Using structural equation modeling and bootstrap testing, the authors analyzed how AI engagement interacts with peer support, need for affiliation, loneliness, and outcomes such as prosocial behavior and problematic mobile phone use.
How frequent AI interaction shapes student behavior
The research identifies two parallel psychological pathways linking AI use to student outcomes. On one hand, frequent AI interaction stimulates a greater need for affiliation, which in turn promotes prosocial behavior. Students who rely on AI tools more often appear motivated to connect with others, leading to more cooperative and supportive actions within their peer groups.
On the other hand, the same pattern of AI use is associated with increased loneliness, which fuels problematic mobile phone use. The authors highlight that students who spend more time with AI systems may experience weaker interpersonal bonds, creating a reliance on digital devices for comfort or distraction.
These findings reveal the dual nature of AI’s role in education and social life. While it can indirectly encourage students to help and support one another, it also risks isolating them in ways that contribute to compulsive phone habits. The study’s model underscores that the frequency of AI interaction alone does not dictate outcomes; the psychological mediators of affiliation and loneliness determine whether the effects are constructive or harmful.
Role of peer support in mitigating risks
Peer support emerges as a crucial factor in shaping how students respond to their AI usage. The study shows that supportive peer environments amplify the positive pathway from AI use to prosocial behavior. In other words, students surrounded by peers who encourage interaction and cooperation are more likely to channel their increased affiliation needs into helpful actions.
However, peer support does not significantly alter the negative pathway leading to problematic mobile phone use. Loneliness remains a strong driver of digital overuse regardless of whether students have supportive peers. This suggests that while peer networks can strengthen social benefits, they may not be sufficient to counteract the isolating effects that frequent AI use can trigger.
The authors argue that educational institutions should pay close attention to this imbalance. Programs that encourage peer collaboration, group learning, and real-world social engagement could help maximize the benefits of AI. At the same time, targeted interventions may be needed to address loneliness and reduce reliance on mobile devices when AI interactions become excessive.
What the findings mean for education and student well-being
AI in education cannot be treated as a neutral tool, the study argues. Its effects are mediated by psychological and social dynamics that determine whether outcomes are beneficial or harmful. Universities and educators must therefore consider not only how AI is integrated into classrooms but also how it shapes the lived experiences of students.
On the positive side, AI interaction can encourage cooperative learning, empathy, and prosocial engagement, particularly in environments where peer support is strong. These outcomes align with broader goals of education that go beyond technical knowledge to include social development and civic responsibility.
Yet the risks cannot be overlooked. Increased loneliness and problematic mobile phone use carry potential long-term costs for academic performance and mental health. Left unchecked, these patterns could erode the very benefits AI is meant to deliver. The findings suggest that monitoring student well-being and providing targeted support should be integral parts of any strategy for AI adoption in higher education.
- FIRST PUBLISHED IN: Devdiscourse

