AI could widen inequality in African agriculture due to deep gender and disability gaps

A new cross-country study has warned that artificial intelligence (AI) tools used in African agriculture risk widening inequality if they are developed without the direct involvement of women farmers and persons living with disabilities. The research examines how social inclusion affects the design and real-world impact of AI systems in farming communities across Nigeria and Uganda.

Published in AI and Society, the paper "An Integrated Approach to Gender Equality, Diversity, and Inclusion in the Development of Artificial Intelligence Tools in Agriculture and Food System in Africa" delivers one of the clearest assessments to date of why many AI tools fail to take hold in African smallholder farming, despite years of investment and rising interest in digital agriculture. The authors argue that technology that ignores the lived reality of farmers, especially women and persons with disabilities, often struggles to gain trust or solve the most urgent problems facing rural households.

The study focuses on two AI research projects funded under the AI4D Africa programme. The first is a pest detection model developed in Nigeria for yellow pepper farmers. The second is a cassava disease detection tool tested with farmers in Uganda. Together, they offer a sharp comparison of how early and continuous involvement of diverse groups shapes everything from problem definition to practical adoption. The authors state that these differences reveal how inclusive design is not only a social issue but a central requirement for the success of agricultural AI tools.
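
Neither project's code or architecture is described in the article, but both are image-based detection tasks. Purely as an illustration of what such a tool typically looks like under the hood, the sketch below builds a detector by transfer learning on a pretrained network; the ResNet backbone, class labels, and PyTorch tooling are assumptions made for this sketch, not details from the study.

```python
# Illustrative only: the paper does not publish model code or architecture.
# A common recipe for pest/disease detection from leaf photos is transfer
# learning on a pretrained CNN, fine-tuned on locally collected field images.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 3  # assumed labels: healthy, pest damage, disease lesion


def build_detector(num_classes: int = NUM_CLASSES) -> nn.Module:
    # Start from ImageNet weights and replace only the classification head,
    # a standard approach when the field-photo dataset is small.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model


model = build_detector()
model.eval()
with torch.no_grad():
    # One 224x224 RGB leaf photo (a random stand-in here).
    photo = torch.randn(1, 3, 224, 224)
    predicted_class = model(photo).argmax(dim=1)
print(predicted_class)
```

The modelling step itself is routine; as the two cases below show, the harder questions are whose fields and crops supply the training photos and whose problems the class labels encode.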

How do social barriers affect the use of AI tools in African agriculture?

Women farmers contribute between 40 and 50 percent of agricultural labor in many African countries, yet they often have less access than men to land, credit, digital devices, and extension services. Persons living with disabilities face even steeper barriers that limit their chances to use and benefit from new technology. These gaps mean that digital solutions built without their input often fail to reach the people who need them most.

The authors explain that AI-driven tools promise to improve crop yields, identify pests faster, guide field decisions, and provide real-time support for farmers dealing with unpredictable weather and shifting market conditions. But these gains cannot materialize when large parts of the farming population cannot use the tools. In many regions, women farmers lack reliable smartphones, steady data access, or the digital confidence needed to engage with new platforms. Persons living with disabilities face physical, social, and cultural obstacles that reduce their chances of being included in training sessions or tool testing.

These barriers surface at multiple stages. They shape which problems researchers think are worth solving, what data is collected and whose fields or crops it represents, and how the final tool performs in the hands of real users. When they are ignored, the AI product drifts out of alignment with daily needs. The authors stress that this misalignment can reinforce inequality because it delivers the most benefit to farmers who already have better access to resources and digital skills.

The research shows that inclusion is not a simple exercise of collecting feedback at the end of a project. Instead, it must be built into the full chain of design. This includes identifying who is represented in early workshops, how the problem is defined, whose crops are prioritized, what training needs exist, and how tools are introduced to farming groups that have less confidence with digital systems.

How did Nigeria and Uganda demonstrate the difference between inclusive and non-inclusive AI design?

In Nigeria, the research team worked with yellow pepper farmers to identify their most urgent problems before designing the AI model. Farmers from different backgrounds took part in problem mapping sessions, field discussions, and practical demonstrations. Women farmers and persons living with disabilities participated in these activities, which helped the team understand that pest damage was a top concern. This early engagement shaped the training data for the AI model and helped ensure that the tool addressed a real and widely felt need. As a result, farmers showed more trust in the tool because they saw a direct link between their daily challenges and the design choices made by the team.

In Uganda, the research followed a different path. A cassava disease detection model was developed first, and farmers were brought in later to evaluate it. During these discussions, many farmers explained that soil fertility problems and access to soil information were their greatest concerns, not disease detection. This revealed a major disconnect between what researchers thought farmers needed and what farmers actually wanted. Women farmers in particular stressed that they needed support with soil-related decisions to reduce losses and improve household food stability.

The Uganda case shows how a lack of early involvement can weaken the usefulness of a tool. Farmers may hesitate to adopt a technology that does not address their most pressing problem. They may also distrust the tool because they were not part of shaping it. By the time this gap becomes visible, the development process is already far along, making it harder to adjust the model or rebuild trust.

Together, the two cases demonstrate that the strongest AI tools come from systems where research teams listen to farmers from the start. When women and persons living with disabilities take part in shaping goals, selecting features, and testing models, the tool becomes more grounded in real experience. The study warns that excluding these groups can lead to tools that are technically sound but socially weak. This can slow adoption, reduce impact, and limit the return on investment made by governments or donors.

What framework can improve inclusion in the development of AI for African agriculture?

Based on evidence from the two case studies, the authors propose a Gender Equality, Diversity, and Inclusion framework that sets out a step-by-step approach for designing AI systems for African agriculture. The framework covers three major stages, each aimed at correcting blind spots that often arise when researchers make assumptions about user needs.

The first stage focuses on pre-development. This includes identifying who should be part of early conversations, ensuring that women farmers and persons living with disabilities can participate in workshops, and building trust before any data is collected. The authors stress that this stage shapes everything that follows because it determines how the core problem is defined. If researchers ignore the lived experience of women, they risk choosing problems that do not match household needs.

The second stage addresses the development process. It encourages research teams to work with multidisciplinary groups that include gender experts, disability rights advocates, and social scientists alongside data scientists and agricultural specialists. The framework insists on inclusive data practices, including the need to understand whether the dataset represents the fields, crops, seasons, and farming styles used by diverse communities. If women farmers grow smaller plots with different crop varieties, the model should reflect this rather than favoring data from larger commercial farms.
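
The paper frames this data audit qualitatively. As a minimal sketch of how a team might operationalize it, assuming each training photo carries metadata such as the farmer's gender, plot size, crop variety, and season (the file and column names here are hypothetical), one could tally coverage per stratum:

```python
import pandas as pd

# Hypothetical metadata recorded when each training photo is collected.
# Assumed columns: farmer_gender, plot_size_ha, variety, season.
metadata = pd.read_csv("training_images_metadata.csv")

# Bucket plots into size bands so smallholder plots appear as their own stratum.
metadata["plot_band"] = pd.cut(
    metadata["plot_size_ha"],
    bins=[0, 0.5, 2, float("inf")],
    labels=["<0.5 ha", "0.5-2 ha", ">2 ha"],
)

# Count images per stratum and compute each stratum's share of the dataset.
coverage = (
    metadata.groupby(["farmer_gender", "plot_band", "variety"], observed=True)
    .size()
    .rename("n_images")
    .reset_index()
)
coverage["share"] = coverage["n_images"] / coverage["n_images"].sum()

# Flag strata with too few examples for the model to learn from;
# the threshold is arbitrary and would be set per project.
MIN_IMAGES = 50
print(coverage[coverage["n_images"] < MIN_IMAGES])
```

Strata that come back empty or thin, say small plots farmed by women with a local variety, mark exactly the fields the deployed model is most likely to get wrong.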

The third stage covers deployment, testing, and adoption. The study finds that many AI tools fail at this stage because training sessions do not account for digital skills gaps or physical accessibility. The framework recommends safe learning environments where farmers can test tools without fear of failure. It encourages field demonstrations that let farmers learn with support, as well as follow-up visits to ensure adoption is sustained. It also stresses the importance of presenting AI information in simple language and giving farmers time to build confidence.

The authors note that inclusive design also builds better data. When women farmers help shape data collection, the model becomes more accurate for their fields. When persons living with disabilities are part of training, they help identify parts of the interface that need clearer navigation. In this way, inclusion directly raises the technical performance of the system.
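
The paper does not prescribe an evaluation protocol, but one way to make this link between inclusion and technical performance visible, consistent with the authors' argument, is to report accuracy per farmer subgroup rather than as a single average. The toy figures and column names below are illustrative only:

```python
import pandas as pd

# Toy evaluation results: one row per test prediction, tagged with the
# subgroup of the farmer whose field the photo came from.
results = pd.DataFrame({
    "farmer_gender": ["female", "female", "female", "male", "male", "male"],
    "correct": [1, 0, 0, 1, 1, 1],
})

# Accuracy per subgroup, alongside how many test cases each group has.
per_group = (
    results.groupby("farmer_gender")["correct"]
    .agg(accuracy="mean", n="count")
)
print(per_group)
# A wide accuracy gap between groups suggests the training data under-
# represents one group's fields, crop varieties, or photo conditions.
```

A model that scores well overall but poorly for one subgroup is precisely the "technically sound but socially weak" outcome the study warns against.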
