Reimagining AI Governance in Nigeria: A Feminist Decolonial Critique of Digital Transformation and Gendered Inequalities


Abstract

Nigeria’s digital transformation and AI governance initiatives, such as significant investments in digital public infrastructure and the rollout of a national digital identity system administered by the National Identity Management Commission (NIMC), promise enhanced efficiency and greater financial inclusion. Yet they risk perpetuating entrenched patriarchal and colonial inequities. This paper employs a feminist decolonial framework—drawing on diverse critical perspectives, including insights from the Pygmalion displacement approach—to interrogate how efforts to “humanise” AI concurrently contribute to dehumanising, deskilling, and marginalising women. By centring Nigerian contexts and local epistemologies, the analysis reveals how historical erasures, such as the sidelining of early female contributions to technology, are reproduced in contemporary algorithmic systems that cloak bias under the guise of neutrality.

Through detailed case studies spanning Nigeria’s public sector—from digital identity initiatives and resource allocation strategies to fintech innovations—this literature review exposes a troubling paradox. Ambitious state-led projects are undermined by stark socio-economic disparities, as exemplified by lower formal banking access among women (31% versus 61% for men) and significantly reduced internet usage. Qualitative insights, drawn from prior interviews, policy documents, and digital activist campaigns (e.g., #NameAndShameNigeria), further highlight the disconnect between technocratic governance and the lived realities of marginalised communities.

In response, this synthesis advocates for an AI governance model that reorients digital transformation as a tool for social justice. Central to this vision is the integration of participatory design, rigorous ethical oversight, and indigenous feminist epistemologies. Such an approach promises to address algorithmic biases, dismantle inherited hierarchies, and enable Nigeria to leverage technological advances in fostering inclusive growth, equitable resource distribution, and the celebration of cultural diversity.

Keywords: Feminism, governance, decolonisation, digital transformation, AI, gendered inequality

Introduction

Nigeria is undergoing a rapid digital transformation that is reshaping public policy and social governance. Ambitious state-led initiatives, such as significant investments in digital public infrastructure and the rollout of a national digital identity system administered by the National Identity Management Commission (NIMC), promise to herald an era of connectivity, expanding economic opportunities for millions of citizens (Techeconomy 2024). Yet beneath these promises lie numerous challenges. Varon and Peña (2022) examine AI governance frameworks in this context through a feminist decolonial lens, and what becomes apparent is that algorithmic systems, often presented as objective and modern, can inadvertently perpetuate existing gendered, racial, and colonial power imbalances.

The prevailing literature on AI governance has largely emerged from technocratic discourses emphasising algorithmic fairness, transparency, and efficiency. However, critical feminist and postcolonial scholars such as Morr (2024) and Ricaurte (2022) argue that these narratives obscure a fundamental reality: digital transformation is not a neutral technical process. Projects like “Not My A.I.” and broader critiques of digital colonialism highlight how AI systems often fail to acknowledge structural inequalities and injustices, thereby perpetuating existing systemic power imbalances rather than dismantling them (Varon and Peña 2022). Algorithms reduce complex human conditions—such as poverty, gender identity, and cultural diversity—to reductive, machine-readable formats (Zajko 2022). This reduction mirrors historical patterns of marginalisation, conceptually linked to Pygmalion displacement, whereby the contributions and identities of marginalised groups, including women (e.g., the early “computer girls”), are systematically erased or undervalued (Erscoi et al. 2023).

This study seeks to interrogate AI in governance and public policymaking, using a feminist decolonial lens to highlight how harmful biases are reproduced and systemic oppression perpetuated, even as algorithms are touted as objective and modern. Methodologically, this study adopts an interdisciplinary, qualitative framework, combining critical policy analysis, historical discourse analysis, engagement with contemporary scholarship in the field, and comprehensive case studies drawn from Nigeria and analogous contexts in Latin America and parts of Africa. The data sources include academic literature, official policy documents, media reports, stakeholder interviews, and analyses of digital activism, as exemplified by hashtag campaigns such as #NameAndShameNigeria and #SayHerNameNigeria (Chiluwa 2024). By triangulating these diverse materials, this research attempts to unpack the multiplicity of ways in which AI governance in Nigeria is shaped by global technocratic imperatives alongside local socio-cultural dynamics.

Preliminary findings reveal a significant disjuncture between Nigeria’s ambitious digital transformation goals and on-the-ground realities. Challenges such as uneven digital literacy, infrastructural and socio-cultural deficits that persist especially in rural areas (Tyers-Chowdhury and Binder 2021), and regulatory bottlenecks further complicate inclusive digital adoption in the country (Imaginarium HQ 2025). While investments in AI and digital public infrastructure promise improved operational efficiency and enhanced public service delivery (Asalu 2025), they also risk perpetuating historical biases rather than alleviating them. Marginalised voices remain largely absent from policymaking processes, exacerbating the disconnect between policy ambitions and lived experiences (Varon and Peña 2022). Burrell and Fourcade (2021) underscore this disconnect, warning that unchecked algorithmic governance risks reinforcing, rather than redressing, existing inequalities embedded in societal structures.

Situated at the crossroads of computer science, public policy, and feminist social theory, this review challenges the notion of AI as a purely technical artefact. It emphasises that the design, implementation, and regulation of digital systems are deeply embedded in cultural and historical contexts. In Nigeria, a nation still grappling with colonial legacies and persistent socio-economic disparities, such critical re-examination is not only timely but imperative (Onduko et al. 2024). This paper therefore advocates for a reimagined AI governance framework that centres local epistemologies, participatory design, and robust ethical oversight, ensuring that digital transformation becomes a tool for equitable and culturally resonant progress (Lugonzo 2025).

Methodology

Research Question and Relevance

This study reviews empirical and discursive academic literature to address the question: How do Nigeria’s digital transformation initiatives, particularly those involving AI technologies, reify historical gendered and colonial inequities, and how can a feminist decolonial perspective reconstitute these processes towards more equitable outcomes?

This inquiry is significant both theoretically, in challenging the dominant narrative of technological neutrality, and practically, in identifying policy gaps that marginalise local communities (especially women) and offering actionable insights for reform.

Research Framework

The study is anchored in an interdisciplinary framework, drawn from the reviewed literature, incorporating three critical perspectives:

  • Feminist Decolonial Theory: This critiques how colonial legacies and patriarchal structures shape technological practices while foregrounding indigenous epistemologies and overlooked contributions (e.g., the “computer girls”).
  • Critical Discourse Analysis (CDA): This deconstructs language in policy documents, media reports, and digital activism, revealing how constructs like “objectivity” and “data-driven decision-making” can mask underlying biases.
  • New Social Movement Theory (NSMT): This explores how Nigerian women’s rights groups mobilise through online campaigns (e.g., #NameAndShameNigeria, #SayHerNameNigeria) to challenge state-led narratives and advocate for inclusive futures.

Research Design and Methods

Given the complex interplay among gender, technology, and policy, this study adopts a qualitative approach that synthesises existing research, rather than generating new primary data. The methods employed across the reviewed literature include:

  • Document and Discourse Analysis:
    • Data Collection: Review policy documents, official publications, academic literature, and news media.
    • Analytical Process: Manual close reading and coding to identify recurring themes (e.g., “humanisation,” “efficiency,” “objectivity”), supported by comparative case studies from Europe, the U.S., Latin America, and West Africa.
  • Digital Activism Data Analysis:
    • Data Collection: Gather digital content from social media platforms using tools like Netlytic.
    • Analytical Process: Thematic coding of online activism to reveal how digital campaigns contest official narratives (a minimal coding sketch follows this list).
  • Semi-Structured Interview Analysis:
    • Participants: Review interviews with policymakers, technical developers, analysts, and representatives from feminist organisations.
    • Objective: Triangulate narrative insights on AI implementation and digital governance.
  • Participant Observation:
    • Settings: Observations during AI system deployments and policy forums.
    • Objective: Capture real-time interactions and uncover discrepancies between official rhetoric and practice.
  • Supplementary Empirical Data:
    • Incorporate quantitative indicators such as internet usage and formal banking access disparities to ground qualitative findings.
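
To make the digital activism strand more concrete, the sketch below (in Python) illustrates the kind of keyword-based thematic coding that can be applied to a CSV export of hashtag posts, such as those produced by Netlytic. It is a minimal sketch under stated assumptions: the file name, column name, and theme keywords are illustrative placeholders, not the actual instruments used in the reviewed studies.

```python
import csv
from collections import Counter

# Illustrative theme codebook; real codebooks in the reviewed studies are
# developed inductively through close reading, not fixed keyword lists.
THEMES = {
    "accountability": ["name and shame", "justice", "accountability"],
    "gender_based_violence": ["violence", "assault", "harassment"],
    "state_neglect": ["government", "police", "ignored", "silence"],
}

def code_post(text: str) -> list:
    """Return the themes whose keywords appear in a post."""
    text = text.lower()
    return [theme for theme, keywords in THEMES.items()
            if any(keyword in text for keyword in keywords)]

def theme_counts(path: str) -> Counter:
    """Tally coded themes over a CSV export of posts (assumes a 'text' column)."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts.update(code_post(row.get("text", "")))
    return counts

if __name__ == "__main__":
    # 'name_and_shame_posts.csv' is a hypothetical export file name.
    print(theme_counts("name_and_shame_posts.csv"))
```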

Data Preparation and Analytical Techniques

Data from interviews, documents, and digital content are coded thematically and analysed using narrative analysis. Triangulation across these diverse sources ensures that conclusions are robust and contextually grounded.
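
As a minimal illustration of the triangulation step, the following sketch flags themes that are corroborated by more than one source type. The theme labels and tallies are hypothetical placeholders, not findings from the reviewed literature.

```python
# Hypothetical theme tallies per source type; placeholders, not study data.
coded = {
    "interviews":   {"exclusion_by_design": 9, "efficiency_rhetoric": 6},
    "policy_docs":  {"efficiency_rhetoric": 12, "gender_blind_language": 7},
    "social_media": {"exclusion_by_design": 21, "state_neglect": 15},
}

def triangulated(coded_sources: dict, min_sources: int = 2) -> dict:
    """Return themes that appear in at least `min_sources` source types."""
    support = {}
    for source, themes in coded_sources.items():
        for theme in themes:
            support.setdefault(theme, set()).add(source)
    return {t: sorted(s) for t, s in support.items() if len(s) >= min_sources}

print(triangulated(coded))
# {'exclusion_by_design': ['interviews', 'social_media'],
#  'efficiency_rhetoric': ['interviews', 'policy_docs']}
```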

Suitability of Methods and Their Relevance

The qualitative, interdisciplinary approach is well-suited for capturing subtle socio-political nuances and context-specific insights that purely quantitative methods might overlook. By bridging computational critiques with social science methodologies, this study remains tightly linked to Nigeria’s unique socio-cultural landscape while providing actionable recommendations for more inclusive digital policies.

Ethical Considerations

This paper is firmly anchored in feminist decolonial principles and is committed to ethical rigour, especially given the sensitive intersections of technology, gender, and institutional power in Nigeria. Data privacy is a central concern, and this study primarily draws on publicly available documents—including policy texts, academic literature, and social media posts—to examine digital activism and state-led digital transformation. Where primary data such as interviews or survey responses are used, informed consent was confirmed to have been robustly secured, and all identifiable information carefully anonymised. Extra measures were implemented to protect the privacy of individuals, particularly because participation in digital activism can render a person indirectly identifiable. These measures adhere to established ethical guidelines and ensure that sensitive information is handled with the utmost care.

Recognising that biases can be woven into both AI systems and research methodologies, this study adopts a critically reflexive stance. The analysis recognises that algorithmic processes often perpetuate historical inequities—particularly in colonial contexts—and it continually reflects on the potential biases that might emerge in qualitative methods. Through the rigorous triangulation of diverse data sources and consistent reflexive methodologies, the study actively works to minimise interpretive bias and challenge the notion of technological neutrality.

Given the historical marginalisation of women and other vulnerable groups in Nigeria—and the world at large—the review takes deliberate steps to prioritise the redress of existing power imbalances. Engagements with community stakeholders, policy actors, and digital activists were conducted through collaborative dialogue and mutual respect throughout the research.

A culturally sensitive approach is integral to this study. All research instruments and analytical frameworks have been adapted to reflect local norms, values, and epistemologies. This adaptation ensures that interpretations are contextually valid and that the study respectfully engages with indigenous perspectives, even as it critiques externally imposed paradigms. Furthermore, the study acknowledges the persistent challenges posed by resource constraints and the digital divide in Nigeria—factors that affect both data availability and the broader participation of communities in digital transformation. By openly discussing these limitations, the research underscores the vital need for more inclusive and accessible data ecosystems, providing a realistic framework for understanding and addressing the broader impacts of digital innovation.

Findings

This study’s investigation into Nigeria’s digital transformation—and particularly the governance of AI through a feminist decolonial lens—reveals a multifaceted landscape. By synthesising existing research and secondary data gathered from interviews, policy documents, participant observations, and quantitative analyses drawn from 169 licensed fintech providers (in addition to data from 17 emerging ones), this review uncovered a number of interrelated themes. Overall, while AI-driven applications and digital initiatives promise enhanced efficiency, the evidence indicates that these same advances risk perpetuating long-standing gendered and colonial inequities.

One of the most striking findings on gendered digital inequalities in Nigeria is the pronounced digital divide between men and women. Quantitative data from policy documents and statistical reports show that only 20.05% of Nigerian women are regular internet users compared to 37.20% of men, a gender gap of approximately 17 percentage points (Banyan Global 2023). Trends emerging since 2019 indicate a decline in women’s digital engagement and mobile phone ownership, a pattern largely shaped by restrictive local norms. In northern Nigeria, surveys indicate that over half of male respondents oppose their wives’ use of the internet, with similar attitudes found among fathers who restrict their daughters’ digital participation. These socio-cultural restrictions, in tandem with structural limitations of infrastructure, pose a particular challenge for rural women (Banyan Global 2023). These disparities create an empirical foundation for critiquing digital transformation policies, which, despite ambitious state-led investments, fail to incorporate gender-sensitive designs.
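
For clarity, the headline internet-use figures translate into an absolute gap of roughly 17 percentage points and a relative gap of close to half; the short sketch below simply reproduces that arithmetic from the Table 1 values.

```python
men_rate, women_rate = 37.20, 20.05  # regular internet user rates (%), Table 1

absolute_gap = men_rate - women_rate          # gap in percentage points
relative_gap = 1 - (women_rate / men_rate)    # women's rate relative to men's

print(f"Absolute gap: {absolute_gap:.2f} percentage points")      # 17.15
print(f"Relative gap: women's rate is {relative_gap:.1%} lower")  # ~46.1%
```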

Table 1. Overview of Digital Access Statistics

Metric | Men | Women
Regular Internet User Rate | 37.20% | 20.05%
Mobile Phone Ownership | 92% | 88%
Mobile Internet Usage | 54% | 34%
Autonomy Over Handset (selection and acquisition) | 93% | 62%
Controlling Internet Access of Female Relatives (northern Nigeria) | Fathers: 61% | Husbands: 55%

Turning to AI adoption in the fintech sector, Omotubora’s (2024) quantitative insights reveal considerable discrepancies between claims and practices. Among the 169 fintech providers analysed, 45 claim to utilise AI; however, only 13 of these offer detailed explanations regarding how AI is integrated into their services. This gap suggests that many of these claims are superficial, reflecting an absence of substantive and transparent use of AI. Furthermore, approximately 75% of these providers are urban-based, contrasting sharply with the roughly 2% who serve rural areas. This urban concentration underscores a troubling pattern: the current AI solutions in Nigeria appear tailored to already advantaged populations to maximise profitability, leaving behind many underbanked and digitally excluded women.

Table 2. Overview of AI Adoption Metrics

Metric | Value
Total Fintech Providers Analysed | 169
Providers Claiming AI Usage | 45
Providers with Detailed AI Information | 13
Urban-based Providers | ~75%
Rural-based Providers | ~2%
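
Read together, the Table 2 figures imply that only around a quarter of providers claim AI use, and fewer than a third of those claimants, under a tenth of all providers analysed, explain how AI is actually integrated. The short sketch below reproduces that arithmetic.

```python
total_providers = 169   # fintech providers analysed (Table 2)
claim_ai = 45           # providers claiming AI usage
detailed_ai = 13        # providers with detailed AI information

print(f"Share of providers claiming AI use:      {claim_ai / total_providers:.1%}")    # ~26.6%
print(f"Detailed disclosure among AI claimants:  {detailed_ai / claim_ai:.1%}")        # ~28.9%
print(f"Detailed disclosure among all providers: {detailed_ai / total_providers:.1%}") # ~7.7%
```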

Beyond the numbers, a critical discourse analysis of digital feminist activism reveals how Nigerian women’s rights groups contest both offline oppression and the neglect embedded within current AI governance. Chiluwa (2024) highlights how hashtag campaigns such as #NameAndShameNigeria and #SayHerNameNigeria consolidate individual narratives into collective calls for accountability, amplifying marginalised voices that are often excluded from formal policy dialogues. Visual content and linguistic analyses further demonstrate a dialectical tension between state-promoted, technocratic narratives and the emotive, inclusive voices emerging from grassroots activism (Aina and Temitope 2024). While these digital campaigns have considerable potential to empower marginalised communities, they also expose the limitations of online mobilisation in achieving lasting policy change. The aforementioned socio-cultural, economic, political, and structural barriers facing women and other marginalised groups constrain the translation of online advocacy into concrete reforms (Ayana et al. 2024).

Another dimension of the findings concerns the operational dynamics of technology in governance. Many respondents in Saxena’s (2024) work note impressive efficiency gains from AI-driven tools, such as improved turnaround times in chatbots and data-driven resource allocation. AI-powered chatbots like HerSafeSpace, developed by a Nigerian NGO, provide real-time support to women facing online gender-based violence, combining technology with advocacy and education to create safer digital environments (Ileyemi 2025). However, these benefits come with hidden costs. Ayana et al. (2024) reiterate that algorithms, predominantly developed from Western-centric datasets, tend to cloak historical gender and colonial biases. Such biases manifest in the reinforcement of rigid gender binaries and in the marginalisation of non-normative identities (Botti-Lodovico 2024).

This analysis finds that while automated tools are invaluable for processing large datasets quickly, the interpretive nuance required to appreciate local idioms and cultural contexts is only achievable through manual coding and qualitative assessment. Automated approaches offer fast processing and rapid trend identification; however, they are limited in cultural sensitivity and tend to operate as “black boxes,” obscuring whether, and how, local nuances inform their outputs (Zajko 2022). In a hybrid arrangement, human oversight proves essential in addressing the complexities that AI alone may overlook. AI systems in Nigeria and comparable contexts (in Latin America and other parts of Africa) tend to reduce complex human conditions of poverty, gender identity, and cultural diversity into oversimplified, machine-readable data. This abstraction inevitably ignores the lived experiences of marginalised groups and often reinforces a one-size-fits-all approach to policy (Reid et al. 2023).

Empirical evidence shows that algorithms not only replicate but may also amplify pre-existing social biases by privileging global models over localised, indigenous epistemologies (Varon and Peña 2022). This underscores the critical need to embed AI systems within regulatory frameworks that rigorously interrogate and counter these embedded biases. Policy documents and stakeholder interviews further expose crucial gaps in public participation, with national AI initiatives often adopting Western models that overlook the perspectives of marginalised communities, especially women. Evidence of displacement is clear: the erasure of women’s historical contributions to computing reflects a broader pattern of exclusion from digital narratives (Erscoi et al. 2023).

The Nigerian government, through the National Information Technology Development Agency (NITDA) and the Federal Ministry of Women Affairs, launched the National Gender Digital Inclusion Strategy (NGDIS) for 2024–2027 (Adaramola 2025). The NGDIS aims to remove barriers to digital inclusion by expanding digital skills training, infrastructure, mentorship, and creating safe online spaces for women and girls. Yet, despite progress in regulatory guidelines, significant implementation gaps remain, partly due to insufficient demographic and local contextualisation in AI system design and deployment (Nwosu et al. 2024).

AI governance is fragmented across existing laws on data protection, cybersecurity, and competition, and in many cases, anticipated benefits in public service delivery or financial inclusion have not materialised, partly due to insufficient demographic contextualisation and robust oversight (Omotubora 2024). Salihu (2025) points out that limited institutional capacity and funding restrict regulatory authorities’ ability to conduct continuous oversight, such as mandatory bias audits or impact assessments.
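
To illustrate the kind of routine bias audit that such oversight could mandate, the sketch below computes a simple demographic-parity gap (the difference in approval rates across groups) over hypothetical credit-decision records. The records, field names, and the 10% flagging threshold are assumptions for illustration, not a standard prescribed by any Nigerian regulator or by the cited sources.

```python
from collections import defaultdict

# Hypothetical credit-decision records; illustrative only, not real data.
decisions = [
    {"gender": "female", "approved": True},
    {"gender": "female", "approved": False},
    {"gender": "female", "approved": False},
    {"gender": "male", "approved": True},
    {"gender": "male", "approved": True},
    {"gender": "male", "approved": False},
]

def approval_rates(records):
    """Compute the approval rate for each gender group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for record in records:
        totals[record["gender"]] += 1
        approved[record["gender"]] += int(record["approved"])
    return {group: approved[group] / totals[group] for group in totals}

rates = approval_rates(decisions)
gap = max(rates.values()) - min(rates.values())  # demographic-parity gap
print(rates, f"gap = {gap:.2f}")

if gap > 0.10:  # illustrative audit threshold, not a regulatory standard
    print("Flag for review: approval rates diverge markedly across groups.")
```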

Taken together, these findings point to the necessity of a hybrid governance model that marries AI’s efficiency with human-driven, context-sensitive methods. Strengthening public participation through inclusive policy frameworks and ensuring continuous regulatory oversight over privately owned enterprises—via regular audits and community feedback—are imperative steps to mitigating AI’s inherent biases and to fostering a genuinely transformative digital future.

In summary, while Nigeria’s digital transformation and AI governance initiatives may enhance operational efficiency, they remain fundamentally entangled with historical gender and colonial inequities. The evidence highlights the importance of blending computational efficiency with nuanced, human-centred approaches, a hybrid methodology that is essential for the development of equitable and responsive policies. These findings call for urgent, context-driven reforms that prioritise local participation and rigorous regulatory oversight, ultimately paving the way for a more inclusive and transformative digital future.

Interdisciplinary Implications

This study embodies an inherently interdisciplinary approach by integrating insights from fields as diverse as computer science, public policy, and feminist/decolonial theory. In doing so, it challenges dominant narratives in AI governance and digital transformation, arguing that technology is not a neutral tool but is deeply embedded in broader socio-political, cultural, and historical contexts. It bridges the gap between technical assessments and critical socio-political inquiry. Combining rigorous AI system evaluations with feminist and decolonial theoretical frameworks—such as the Pygmalion displacement approach—it reveals how algorithmic models, despite their claims of efficiency, reproduce gendered and colonial biases. Rather than treating AI solely as a technical artefact, this work reimagines it as a socio-political construct that demands a multidisciplinary critique. By fostering dialogue between computational techniques and qualitative social research, the review promotes reflexive practices that harmonise insights from STEM fields, the humanities, and public policy, thereby capturing the full cultural and ethical dimensions of digital transformation.

The implications of this interdisciplinary work also extend to educational settings, policy development, and community empowerment. Embedding feminist decolonial theory and indigenous knowledge systems into AI and public policy curricula can prepare future technologists and policymakers to address both technical and socio-cultural dimensions of digital innovation (Hooper and Oyege 2024). By bridging disciplinary divides, adapting global policies to local realities, reforming educational frameworks, and adopting resource-sensitive strategies, the study offers robust theoretical and practical models for reimagining AI governance.

Recommendations

Rooted in Nigerian socio-political realities and bearing broader resonance for similar African contexts, the study proposes practical directions for reimagining AI governance. It advocates for policymakers to adopt gender-inclusive, decolonial frameworks that transcend purely technocratic approaches by embracing human-centred models designed to actively mitigate algorithmic bias. In parallel, digital rights advocates and women’s rights groups can leverage data on the pronounced gender digital divide to press for policies that protect marginalised communities. The research demonstrates that imported, globalised models often reinforce colonial legacies and overlook local socio-cultural realities. In line with Lugonzo (2025), the study suggests that incorporating indigenous epistemologies and participatory methodologies rooted in community knowledge can produce governance frameworks that are ethical, culturally resonant, and responsive to local needs. This approach offers a replicable model for digital democracy especially suited to resource-constrained settings like Nigeria and similar African countries.

The findings advocate for regulatory frameworks that mandate transparency, participatory oversight, and continuous auditing of AI systems. Such policies would help ensure that digital transformation initiatives do not deepen existing inequalities but rather work to alleviate them. Mwelu (2025) emphasises that collective responsibility, as reflected in Ubuntu philosophy, can shift governance from individualistic, technocratic models to more inclusive, community-centred approaches. At the grassroots level, these insights support the creation of local digital literacy programmes and community-based decision-making processes, transforming citizens from passive recipients into active stakeholders in digital governance.

The study refutes the myth of AI neutrality by showing that AI systems are imbued with socio-cultural and gendered assumptions that perpetuate historical biases. Sustainable digital transformation, therefore, must balance quantitative efficiency against qualitative human complexities; this balance challenges the prevailing notion of “techno-solutionism” and calls for the integration of ethical, cultural, and historical contexts throughout technological innovation. By recognising infrastructural and economic limitations in Nigeria and similar contexts, the research recommends practical strategies, such as adopting low-tech solutions like community radio or SMS-based education and leveraging open-source AI models that are tailored to local capacities. The Thomson Reuters Foundation (2024) recommends collaborative funding partnerships among universities, government agencies, and international feminist tech movements to support context-specific digital initiatives that remain both accessible and scalable.

Policymakers are urged to incorporate inclusive, participatory practices and enforce rigorous disclosure standards in AI governance. Fintech providers must transition from models focused solely on efficiency and profitability to those that prioritise user-centred, gender-inclusive financial solutions that bridge both urban and rural divides. Furthermore, educational institutions should adopt interdisciplinary curricula that merge technical training with critical theories of ethics and cultural sensitivity, while future research is recommended to employ longitudinal, mixed-methods designs to capture the evolving digital experiences of marginalised communities. In sum, the interdisciplinary findings of this research underscore a call for collaborative, multi-sector approaches that merge technical expertise with critical social inquiry. Its ultimate aim is to transmute digital transformation into a process that is not only efficient but also equitable, culturally resonant, and socially just.

Nonetheless, several exceptions and limitations temper the general trends found in this study. The reliance on qualitative analyses and secondary data means that some on-the-ground nuances, particularly in rural areas, remain underexplored. For instance, many fintech providers offer only vague disclosures about their AI practices, limiting our comprehensive understanding of technological integration. Moreover, while digital activism—exemplified by campaigns like #NameAndShameNigeria and #SayHerNameNigeria—has amplified women’s voices online, this does not automatically translate into tangible policy reform. This “null effect” highlights the enduring gap between online mobilisations and offline change. Additionally, while the Pygmalion displacement framework yields valuable insight into gendered dynamics, it may oversimplify the intersecting influences of economic constraints, regional diversity, and infrastructural challenges on the broader landscape of digital governance.

Conclusion

The study confirms that Nigeria’s digital transformation, and specifically the integration of AI into governance and fintech, is entwined with gendered and colonial inequality. Despite the promise of enhanced efficiency and data-driven decision-making from state-led digital initiatives, synthesis of existing research reveals that these advances are consistently undermined by pervasive biases that marginalise women and reinforce entrenched power hierarchies. The study’s analysis demonstrates that digital and AI systems in Nigeria are far from neutral artefacts. Instead of being detached technical tools, these systems embed gendered biases inherited from global, patriarchal frameworks.

The process of humanising AI through feminised design elements gives rise to a dynamic—exemplified by the concept of Pygmalion displacement—where the historical contributions and lived labour of women are simultaneously mimicked and marginalised. In parallel, digital transformation efforts create conditions of financial exclusion; empirical evidence from prior studies shows significant disparities and gaps observable in digital use and access. As currently implemented, these digital initiatives—often modelled on Western technocratic paradigms that neglect local socio-cultural complexities—risk reinforcing exclusionary practices rather than redressing them. Further assessments reveal that while automation offers speed and scale, it lacks the nuance to capture local realities. Hybrid models that combine computational efficiency with context-sensitive human judgment are therefore essential for developing policies that genuinely meet the needs of marginalised communities.

While Nigeria’s digital transformation and AI governance hold significant potential for enhancement and innovation, realising these benefits demands a reimagining of digital policies through an interdisciplinary, feminist decolonial lens. This study advocates for a collaborative, multi-stakeholder approach that intersects technology, policy, and feminist critique. Such an approach is essential for forging a digital future where transformative policy reforms and robust regulatory oversight ensure that digital innovation serves as a true vehicle for social equity.

Acknowledgements

I am grateful to the Research Round team for organising the LUNE TWO AI. Humanities. Social Sciences Fellowship, which made it possible for me to embark on and complete this project. Their time, knowledge, and the learning resources provided for us were invaluable to my work and my growth as a researcher. I am thankful for the support and shared passion of Ololade Faniyi, who took into consideration my passion for intersectional studies as an information scientist and made it possible for me to participate in this programme as a navigator. I also want to appreciate Habeeb Kolade for his encouragement and support throughout this fellowship. I appreciate Khadijat Alade’s dedication and attentiveness to my needs and those of my fellow participants throughout the fellowship.

The network I have been privileged to form in the course of the fellowship is invaluable. My mentors, Dr Chinasa Okolo and Nelson Olanipekun, were indispensable in giving me the direction I needed to embark on this project. I learned immensely from my fellows and navigators, and the meaningful connections formed with colleagues like C. I. Atumah and Anthony Ojukwu were a core part of my support system while embarking on this work. I was also enriched by the support of my focus group fellows, Dr Nufaisa Garba and Samuel Sanubi, who encouraged me, even through other commitments, to persevere. I acknowledge them, alongside other participants of the fellowship, for their wisdom and engagement, from which I learned a lot.

Beyond the amazing Research Round community, I must acknowledge the unwavering faith of my friends, such as Kemi Osiyi, Moyomade Aladesuyi, and Abasi-Maenyin. They were foundational to my support system, from day to day, affirming me despite my doubts and struggles in the course of my research and fellowship. I appreciate my mother, father, and siblings for supporting me with their resources, cheering me on, and celebrating my achievements in the course of this programme.

And finally, I want to thank myself. For believing in me, even when I was not sure I could see this through to the end. I did, and I am grateful that I did. This project, and the fellowship that made embarking on it possible, was a wonderful experience.

References

Salihu, Abdulameed O. 2025. “Regulating the Future: The Current State and Prospects of Artificial Intelligence Policy in Nigeria.” SSRN, July 18. https://doi.org/10.2139/ssrn.5117653.

Adaramola, Zakariyya. 2025. “Nigeria Moves to Bridge Gender Digital Divide.” Daily Trust. April 9. https://dailytrust.com/nigeria-moves-to-bridge-gender-digital-divide-2/

Agunwa, Nkem. 2024. “Feminism and the Digital Era: Challenges and Opportunities in Africa.” Feminist Africa 5 (2): 113–26. https://www.jstor.org/stable/48799338

Aina, Adeola T. 2024. “From Hashtags to Entrepreneurship: A Qualitative Study of the Influence of Digital Activism on Female-Led Businesses in Nigeria.” Nigerian Journal of Management Sciences 25. https://nigerianjournalofmanagementsciences.com/wp-content/uploads/2024/11/From-Hashtag-to-Enterpreneurship.pdf.

Banyan Global. 2023. “Understanding the Gender Digital Divide in Nigeria.” USAID. https://banyanglobal.com/wp-content/uploads/2023/08/Nigeria-GDD-Brief_Final-508-May-2023.pdf.

Benoit, Kenneth. 2024. “AI and Data Science for Public Policy.” LSE Public Policy Review 3 (3). https://doi.org/10.31389/lseppr.115.

Botti-Lodovico, Yolanda. 2024. “Against All Odds: Increasing African Women’s Influence on AI Innovation and Policy.” Tech Policy Press. October 31, 2024. https://www.techpolicy.press/against-all-odds-increasing-african-womens-influence-on-ai-innovation-and-policy/.

Burrell, Jenna, and Marion Fourcade. 2021. “The Society of Algorithms.” Annual Review of Sociology 47 (1): 213–37. https://doi.org/10.1146/annurev-soc-090820-020800.

Erscoi, Lelia, Annelies Kleinherenbrink, and Olivia Guest. 2023. “Pygmalion Displacement: When Humanising AI Dehumanises Women.” SocArXiv, February. https://doi.org/10.31235/osf.io/jqxb6.

Ayana, Gelan, Kokeb Dese, Hundessa Daba Nemomssa, Bontu Habtamu, Bruce Mellado, Kingsley Badu, Edmund Yamba, et al. 2024. “Decolonizing Global AI Governance: Assessment of the State of Decolonized AI Governance in Sub-Saharan Africa.” Royal Society Open Science 11 (8). https://doi.org/10.1098/rsos.231994.

Hooper, Danielle, and Ivan Oyege. 2024. “Application of African Indigenous Knowledge Systems to AI Ethics Research and Education: A Conceptual Overview.” In 2024 ASEE Annual Conference & Exposition. https://peer.asee.org/46585.

Ileyemi, Mariam. 2025. “Nigerian Group Unveils AI Chatbot to Combat Online Gender-Based Violence.” Premium Times Nigeria, February 17, 2025. https://www.premiumtimesng.com/news/top-news/774916-nigerian-group-unveils-ai-chatbot-to-combat-online-gender-based-violence.html.

Imaginarium HQ. 2025. “Navigating the Challenges of Digital Transformation in Nigeria.” LinkedIn, March 17, 2025. https://www.linkedin.com/pulse/navigating-challenges-digital-transformation-nigeria-y7ykf/.

Chiluwa, Innocent. 2024. “Discourse, Digitisation and Women’s Rights Groups in Nigeria and Ghana: Online Campaigns for Political Inclusion and against Violence on Women and Girls.” New Media & Society, January. https://doi.org/10.1177/14614448231220919.

Lugonzo, Tamara. 2025. “From Colonial Bias to Relational Intelligence: Decolonizing AI with Indigenous and African Epistemologies.” Liberated Arts: A Journal for Undergraduate Research 12 (1). https://ojs.lib.uwo.ca/index.php/lajur/article/view/22436/17676.

Macdonald, Ayang. 2025. “Nigeria Re-Commits to Improving Digital Services with $2B Planned Investment.” Biometric Update, March 4, 2025. https://www.biometricupdate.com/202503/nigeria-re-commits-to-improving-digital-services-with-2b-planned-investment.

Morr, Christo El. 2024. “The Need for a Feminist Approach to Artificial Intelligence.” Proceedings of the AAAI Symposium Series 4 (1): 332–33. https://doi.org/10.1609/aaaiss.v4i1.31812.

Mwelu, Naomi. 2025. “African Epistemologies as the Pillar of AI Ethics in Peacebuilding: A Definitive Framework.” LinkedIn, March 3, 2025. https://www.linkedin.com/pulse/african-epistemologies-pillar-ai-ethics-peacebuilding-naomi-pojuf/.

Nwosu, Chibuzo Charles, Dike Chijioke Obalum, and Mathias Ozoemena Ananti. 2024. “Artificial Intelligence in Public Service and Governance in Nigeria.” Journal of Governance and Accountability Studies 4 (2): 109–20. https://doi.org/10.35912/jgas.v4i2.2425.

Faniyi, Ololade. 2024. “An African Feminist Manifesto.” The Republic, February 27, 2024. https://rpublc.com/february-march-2024/an-african-feminist-manifesto/.

Asalu, Oluwole. 2025. “Shaping Nigeria’s Digital Future: Vision for 2025 and Beyond.” Independent.ng, February 2, 2025. https://independent.ng/shaping-nigerias-digital-future-vision-for-2025-and-beyond/.

Omotubora, Adekemi. 2024. “AI for Women’s Financial Inclusion—Analysis of Product Design and Policy Approaches in Nigeria.” Data & Policy 6. https://doi.org/10.1017/dap.2024.62.

Onduko, Joseph Otochi, Michael Acharya Kalombo, Makuach Dut Kuol, Bentley Gift Makale, and Mahsen Abdulkarim Saleh. 2024. “AI-Driven Governance: Transforming Public and Addressing Legacy Issues in Post-Colonial Africa.” Proceedings of London International Conferences, no. 11 (September): 52–63. https://doi.org/10.31039/plic.2024.11.243.

Reid, Octavia Field, Anna Colom, and Roshni Modhvadia. 2023. “What Do the Public Think about AI?” Ada Lovelace Institute, October 26, 2023. https://www.adalovelaceinstitute.org/evidence-review/what-do-the-public-think-about-ai/#_ftnref81.

Ricaurte, Paola. 2022. “Artificial Intelligence and the Feminist Decolonial Imagination.” Bot Populi. March 4, 2022. https://botpopuli.net/artificial-intelligence-and-the-feminist-decolonial-imagination/.

Saxena, Ashish K. 2024. “AI in Governance and Policy Making.” International Journal of Science and Research (IJSR) 13 (5): 1218–23. https://doi.org/10.21275/sr24519015426.

Techeconomy. 2024. “Empowering Governance through Digitalisation: Nigeria’s Roadmap for Transparent Transformation.” Techeconomy, December 3, 2024. https://techeconomy.ng/empowering-governance-through-digitalisation-nigerias-roadmap-for-transparent-transformation/.

Thomson Reuters Foundation. 2024. “A.I. Governance for Africa Part 3: Building Advocacy Strategies.” Trust.org. https://www.trust.org/wp-content/uploads/2024/12/TRF-Toolkit-3-Advocacy-Strategies-2024.pdf.

Tyers-Chowdhury, Alexandra, and Gerda Binder. 2021. “What We Know about the Gender Digital Divide for Girls: A Literature Review.” UNICEF. https://www.unicef.org/eap/media/8311/file/What%20we%20know%20about%20the%20gender%20digital%20divide%20for%20girls:%20A%20literature%20review.pdf.

Varon, Joana, and Paz Peña. 2022. “Not My A.I.: Towards Critical Feminist Frameworks to Resist Oppressive A.I. Systems.” Carr Center for Human Rights Policy, Harvard Kennedy School. https://www.hks.harvard.edu/sites/default/files/2023-11/22_10JoanaVaron.pdf.

Zajko, Mike. 2022. “Artificial Intelligence, Algorithms, and Social Inequality: Sociological Contributions to Contemporary Debates.” Sociology Compass 16 (3). https://doi.org/10.1111/soc4.12962.