The Mozilla Technology Fund (MTF) supports open source technologists whose work furthers promising approaches to solving pressing issues.
In 2022, we welcomed our inaugural Mozilla Technology Fund cohort, which focused on reducing the bias in and increasing the transparency of artificial intelligence (AI) systems. We intentionally cast a wide net for our first cohort, funding art projects, creative writing utilities and crowdsourcing tools that did everything from measuring the unfair outcomes of voice assistant technology to exposing the inner workings of social media recommendation engines. Building on our learnings from that cohort, we decided to focus our resources in 2023 on an emerging area of tooling where we saw a real opportunity for impact: open source auditing tools.
This year we're asking: Is there a role AI systems can play in addressing topics like environmental degradation, climate change, Indigenous justice, food justice, and energy justice? Could AI technologies be a part of the solutions to these issues and not just a part of the problem? For this year's call, we are particularly interested in projects that address the health, economic and social impacts of climate change on the Most Affected People and Areas (MAPA): people of color, Indigenous and traditional peoples, local communities, and specific ethnic-racial groups across the Global South. Learn more about the 2024 AI and Environmental Justice awards.
Eligibility Criteria
We imagine that the Bias and Transparency in AI Awards will support a variety of software projects (including utilities and frameworks), data sets, tools and design concepts. We will not consider applications for policy or research projects, though software projects that leverage, support or amplify policy and research initiatives will be considered (for example, bias metrics and statistical analyses turned into easy-to-use, interpretable software implementations, as sketched after the list below). Some example projects we can imagine:
- Projects that help expose elements of how AI systems in consumer technology products work, the bias that may be inherent in them and/or how to mitigate the bias in these systems
- Utilities that help developers understand and identify bias when building datasets and AI systems
- Components that allow developers to provide more transparency to users on the inner workings of AI systems
- Tools to help identify, understand and mitigate the bias inherent in publicly available datasets
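
To make the idea of "bias metrics turned into software implementations" concrete, here is a minimal, illustrative sketch. It is not drawn from any funded project or existing library: the function name `demographic_parity_difference` and the sample data are hypothetical, and the sketch assumes only NumPy, binary model predictions, and a binary group attribute.

```python
# Illustrative sketch only: a standard fairness metric (demographic parity
# difference) packaged as a small, reusable function -- the kind of building
# block an open source auditing tool might expose to non-expert users.

import numpy as np


def demographic_parity_difference(y_pred: np.ndarray, group: np.ndarray) -> float:
    """Absolute gap in positive-prediction rates between two groups.

    y_pred: binary predictions (0/1) from some model.
    group:  binary group membership (0/1), e.g. a protected attribute.
    A value near 0.0 suggests similar treatment; larger values flag disparity.
    """
    rate_a = y_pred[group == 0].mean()
    rate_b = y_pred[group == 1].mean()
    return abs(rate_a - rate_b)


if __name__ == "__main__":
    # Hypothetical predictions and group labels, purely for illustration.
    preds = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
    groups = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
    gap = demographic_parity_difference(preds, groups)
    print(f"Demographic parity difference: {gap:.2f}")
```

A real auditing tool would wrap metrics like this in interfaces that are easy to use and interpret for developers and affected communities alike, which is the gap this call aims to help close.
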
Applicants should:
- Have a product or working prototype in hand; projects that have not moved beyond the idea stage will not be considered
- Already have a core team in place to support the development of the project (this team might include software developers working in close collaboration with AI researchers, designers, product/project managers and subject matter experts)
- Embrace openness, transparency, and community stewardship as methodology
- Make their work available under an open-source license
Applicants must meet the following requirements:
- Be legally able to receive funds in the form of a grant from the Mozilla Foundation (a U.S. 501(c)(3) non-profit organization)
- Be working towards solving a public interest problem or issue related to the focus of this Call for Proposals
- Meet the criteria outlined in the 'What are we looking for?' section above
Offered Benefits
Successful applicants will receive awards of up to $50,000 USD each, drawn from a total pool of $300,000 USD.