AI assistive tools boost academic success for visually impaired students

CO-EDP, VisionRI | Updated: 20-06-2025 18:22 IST | Created: 20-06-2025 18:22 IST

A newly published academic study has revealed that artificial intelligence-powered assistive technologies significantly enhance the academic performance of visually impaired university students, provided the students have strong personal motivation to adopt these tools. The study, titled “Sustainable AI Solutions for Empowering Visually Impaired Students: The Role of Assistive Technologies in Academic Success”, was published in the journal Sustainability. It focuses on university students in Saudi Arabia, offering rare empirical insight into how AI tools function as both enablers and barriers in disability-inclusive higher education.

The research tested a theoretical model based on the Unified Theory of Acceptance and Use of Technology (UTAUT) and gathered responses from 390 students with blindness or low vision across five major Saudi universities. The findings identify critical psychological, social, and institutional variables that affect whether AI-powered assistive tools genuinely translate into improved academic performance.

How do AI assistive tools impact academic success for visually impaired students?

The study explored the direct impact of AI-based assistive tools on the academic performance of blind or visually impaired students. AI applications such as screen readers, speech-to-text interfaces, and virtual assistants (e.g., Siri, Google Assistant) were considered key enablers of digital access in higher education. However, the study found that not all AI-related features contributed equally to academic success.

Performance Expectancy (the belief that AI tools will improve learning) emerged as the strongest predictor of academic improvement. Students who trusted the academic value of these tools were more likely to experience better outcomes. Social Influence (support and encouragement from peers, faculty, or family) also positively affected performance, albeit to a smaller degree. Interestingly, the perceived ease of using the tools (Effort Expectancy) and the availability of supporting infrastructure (Facilitating Conditions) had no significant direct influence on performance. In fact, some students found that heavy-handed institutional support undermined their motivation to adopt AI tools independently.

The study’s structural model showed that AI tool usage, when paired with strong behavioral intent, explained nearly 75% of the variance in academic performance among students surveyed. This underscores the importance of internal motivation and trust in the technology, rather than the presence of the tools alone.
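The variance-explained figure can be illustrated with a toy regression. The sketch below uses entirely synthetic data (the variable names, coefficients, and noise level are assumptions for illustration, not values taken from the study) to show how an R² of roughly this magnitude arises when an outcome is regressed on tool usage and behavioral intention.

```python
import numpy as np

# Hypothetical illustration only: the coefficients and noise level below
# are assumed, not drawn from the study. We simulate 390 respondents
# (matching the study's sample size) and fit ordinary least squares.
rng = np.random.default_rng(0)
n = 390
usage = rng.normal(0, 1, n)        # standardized AI tool usage
intention = rng.normal(0, 1, n)    # standardized behavioral intention
performance = 0.5 * usage + 0.7 * intention + rng.normal(0, 0.4, n)

# OLS fit via least squares, then share of variance explained (R^2)
X = np.column_stack([np.ones(n), usage, intention])
beta, *_ = np.linalg.lstsq(X, performance, rcond=None)
residuals = performance - X @ beta
r_squared = 1 - residuals.var() / performance.var()
print(f"R^2 = {r_squared:.2f}")
```

With these assumed effect sizes, most of the outcome's variance is attributable to the two predictors, mirroring the qualitative point that usage plus intention, rather than either alone, accounts for the bulk of performance differences.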

What drives students to use these technologies?

The study also examined what drives students’ intentions to adopt AI assistive technologies in the first place. Behavioral Intention emerged as the key mediator between technology features and actual academic outcomes. Three variables (Performance Expectancy, Social Influence, and Effort Expectancy) had strong positive effects on students’ willingness to use these tools. In contrast, Facilitating Conditions had a surprising negative relationship with Behavioral Intention, suggesting that institutional support that is overly structured or misaligned with students’ needs may suppress autonomous motivation.

The strongest predictor of intention was Social Influence: peer encouragement and positive reinforcement from instructors or family played a significant role in motivating students to engage with AI tools. Performance Expectancy followed closely, suggesting that belief in the academic usefulness of AI tools is a major driver of commitment. Although Effort Expectancy had a lesser impact, it still positively influenced intent, especially when students perceived the tools as user-friendly and compatible with their needs.

These findings suggest that universities and policymakers should not assume that access alone guarantees use. Instead, behavioral intention must be cultivated through awareness campaigns, peer mentorship, and value alignment.

Are institutional supports enough or is motivation the missing link?

The third pillar of the research sought to determine whether existing support systems, such as training programs, technical resources, or infrastructure, were sufficient to boost outcomes. The answer was largely negative. Facilitating Conditions, often assumed to be critical enablers of adoption, were found to have no significant impact on academic performance or usage intent. In some cases, overly formalized support mechanisms even produced a reverse effect, creating perceptions that AI tools were complicated or burdensome to use.

By contrast, the study found that Behavioral Intention had an exceptionally strong effect on performance. Students who expressed high motivation to use AI tools demonstrated significantly better academic results, regardless of the surrounding institutional framework. This finding reinforces theories from the Technology Acceptance Model (TAM) and the Theory of Planned Behavior, which emphasize the central role of internal motivation in technology adoption.

The authors warn that without efforts to build trust, agency, and intrinsic motivation among visually impaired students, AI tools may remain underutilized or fail to yield expected benefits. They call for universities to invest in peer-led training models, student-centered AI design, and inclusive policy development to close the intention–performance gap.

Toward inclusive AI-driven education

With global commitments under the UN’s SDG4 for inclusive and equitable education, the findings challenge educational systems to go beyond infrastructure provisioning and embrace user-centric approaches to AI integration. For Saudi Arabia, which has invested heavily in AI infrastructure, the research suggests that attention must now shift to aligning these technologies with user perceptions, social dynamics, and motivation systems.

The researchers recommend that policymakers embed AI assistive technologies in national digital strategies with earmarked funding for usability testing, student training, and mentorship programs. Universities are urged to conduct regular assessments of technology impact, led by students themselves, to ensure continued relevance and effectiveness.

First published in: Devdiscourse