How can algorithm development help counter human bias?
Algorithm development is a core component of digital and technology innovation. Algorithms are the logical rules that information systems and processes use to make decisions. They may be simple or incredibly complex, and they are used in almost every kind of information system, defining how applications work and how they process and interrogate data. In artificial intelligence, algorithms underpin machine learning: they determine how models are trained and how they make predictions.
Algorithms will, by default, reflect the biases of their design and development teams, and of any user-testing community. This bias may manifest as ‘coded blind spots’, where logic cases are overlooked or dismissed as marginal ‘exception cases’. It may also manifest as conditions and rules that are overtly or implicitly discriminatory.
To ensure technology is socially equitable and inclusive, algorithms need to be designed, coded and tested fairly. That requires a proactive focus on all use cases, and on the ‘logic’ judgements encoded within them, from the design phase through development, testing and iterative product enhancement.
Where do I start?
Here are some practical steps you might take if you want to shift towards more inclusive algorithm development.
1. Make a commitment to inclusive algorithms
Any organisation that wishes to lead on inclusive innovation might begin by simply making a commitment to inclusive algorithm development. Setting a clear, positive statement of intent can be extremely powerful. It is important that people at all levels of the organisation, from software analysts, engineers and testers to senior leadership, recognise and respect the meaning of such a commitment. The statement will carry more validity and impact the more people contribute to its wording and phrasing.
2. Embed inclusivity metrics into software test and evaluation processes
Setting specific measures around the inclusivity of an algorithm helps ensure it is part of the success criteria of the end solution. We help organisations to define and embed a set of measures and metrics that work for them.
3. Audit existing live algorithms
By auditing algorithms through the lens of inclusivity, we can begin to identify and fix areas where they may fall short.
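To make the idea concrete, here is a minimal sketch (an illustration, not a prescribed audit method) of one common technique: a counterfactual probe, which replays the same input through an existing scoring function while varying only an attribute that should not affect the outcome. The `legacy_score` function is a deliberately biased stand-in so the audit has something to find:

```python
# Hedged sketch of a counterfactual audit of a 'live' scoring algorithm.
# The scoring function is a hypothetical, intentionally biased stand-in.

def legacy_score(applicant):
    """Stand-in for an existing algorithm under audit (intentionally biased)."""
    score = applicant["income"] / 1000
    if applicant["postcode"].startswith("9"):  # proxy-variable bias
        score -= 20
    return score

def counterfactual_audit(score_fn, applicant, attribute, alternatives):
    """Return the alternative values of `attribute` that change the score."""
    baseline = score_fn(applicant)
    findings = []
    for value in alternatives:
        variant = dict(applicant, **{attribute: value})
        if score_fn(variant) != baseline:
            findings.append((value, score_fn(variant)))
    return findings

applicant = {"income": 45000, "postcode": "10115"}
print(counterfactual_audit(legacy_score, applicant, "postcode", ["90210", "20095"]))
# Flags "90210": same applicant, different postcode, lower score
```

A probe like this needs no access to the algorithm's internals, which makes it useful for auditing systems that are already in production.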
4. Consider a diverse user population
Agile development can focus on a narrow, tightly defined set of target users or customers. Start by checking that the user or audience definition is broad and adequately inclusive. It is crucial to proactively sample needs and user journeys across a diverse stakeholder audience, which may mean actively engaging with disadvantaged and marginalised groups via alternative formats.
Sector Spotlight: Algorithms in Financial Services credit and risk scoring
Algorithmic bias in financial risk assessment and credit scoring has come under growing public and media scrutiny, particularly in the United States, and the increasing prevalence of AI-assisted decisions sharpens the concern. A five-point ‘REACT’ strategy has been proposed to help financial institutions reduce the effects of bias in credit scoring and improve their algorithmic hygiene.
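By way of a hedged illustration (this check is not part of the REACT strategy itself), a common screen for disparate impact in US credit and lending decisions is the ‘four-fifths rule’: the approval rate for any group should be at least 80% of the rate for the most-favoured group. A minimal version of that check might look like this:

```python
# Illustrative four-fifths rule screen for disparate impact in approvals.
# The figures below are invented example data.

def selection_rates(approved, applied):
    """Approval rate per group: approvals divided by applications."""
    return {g: approved[g] / applied[g] for g in applied}

def four_fifths_check(approved, applied):
    """True per group if its rate is at least 80% of the best group's rate."""
    rates = selection_rates(approved, applied)
    best = max(rates.values())
    return {g: r / best >= 0.8 for g, r in rates.items()}

applied  = {"group_a": 200, "group_b": 180}
approved = {"group_a": 120, "group_b": 72}
print(four_fifths_check(approved, applied))
# group_b falls below 80% of group_a's approval rate
```

A failed screen like this does not prove discrimination, but it flags where a deeper audit of the scoring algorithm is warranted.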
Ready to Take Action?
If you’re a digital innovation leader and would like to discuss how to activate Inclusive Innovation Strategy in your organisation, we offer a free 45-minute virtual Strategy consultation.
Inclusive Innovation leaders may wish to read the A+ Alliance call to action on Inclusive Algorithms: https://aplusalliance.org/en/pages/call_to_action.
The World Bank has produced detailed guidelines on credit and risk scoring, which are available for public download here: https://thedocs.worldbank.org/en/doc/935891585869698451-0130022020/CREDIT-SCORING-APPROACHES-GUIDELINES-FINAL-WEB