Bridging the gap between the use of AI in the private and third sectors

The 2024 AI Summit was jam-packed with cutting-edge technology, inspiring ideas, and hope for the future. What does our panel of charity leaders think this means for third sector organisations?


On the second day of the summit, DataKind UK’s Data Science Project Manager Caitlin Loftus was joined by a panel of experts with backgrounds spanning the public and private sectors: CEO of Magic Breakfast Lindsey MacDonald; CTO of Learning With Parents Peyman Owladi; and DataKind UK Ethics Advisor Michelle Lee.

They discussed the opportunities, barriers, and potential solutions to bridging the gap between the private and third sectors’ use of AI, starting with how much, and how quickly, AI is driving the progress of private companies.

The opportunities that AI offers the third sector are the same as those for private companies: improvements in staff productivity, operational efficiency, and cost-effectiveness. And the results have the potential to be fantastic, helping to tackle the world’s most wicked problems.

But by the same token, the barriers and risks of using new technology and tools in the third sector are that much higher. AI horror stories revolve around privacy issues, data leakage, and problems with bias. Understandably, when vulnerable people could be placed at risk, the sector is extremely reluctant to experiment.

Fuel innovation

These blockers are going unaddressed, resulting in a widening gap between organisations that are experimenting and those left behind. Third sector organisations cannot keep up with the speed of AI development: traditional three- to five-year plans do not suit this volatile, ambiguous environment.

And while an exploratory approach is vital in order to engage with these technologies, the sector’s resources are constrained by risk-averse stakeholders. A charity’s decision to engage with AI will always come second to its top priority of supporting service users, unlike private companies, which can place funding and development of AI at their core.

However, we know the sector is good at innovation when it has the opportunity and freedom to do so. During the pandemic, we saw frontline services and charities adapt their models rapidly to fit remote working conditions and rising demand.

To innovate and benefit from advances in AI, the sector needs the resources and flexibility to do so. This means that funders need to recognise the rapidly changing landscape, and the importance of allowing for flexibility in delivery models and plans.

Leadership and boards also need the confidence and bravery to provide more opportunities to test, and to share policies and learning. Innovation, trust, and structures that allow risk-taking are needed in order for the sector to catch up.

Our AI Summit panel: Lindsey MacDonald, Peyman Owladi, Michelle Lee, and Caitlin Loftus

Keep humans in the loop

One area in which the private sector is far ahead is the adoption of automation, such as chatbots, to tackle operational challenges that used to require lots of human involvement, like customer service.

Although time and capacity are always in short supply in the third sector, charities are rightly nervous about using automated resources in their service delivery. Removing human intervention entirely has already proven to be harmful, especially when supporting vulnerable people. Keeping people in the ‘loop’ of an automated service is essential. A good example is the UK government’s advice AI, Caddy — early trials quickly showed that it is most effective in collaboration with human support advisers.

A further risk is that digitised services can be inaccessible to minoritised groups. To usefully adapt Generative AI, Large Language Models, and other new technology, explicit consideration of how to provide safe, equitable access is key. Tools might be slower to develop, but would result in better design and more careful consideration of users’ safety.

Collaborate across sectors

To close this gap and address these challenges, the private sector needs to take responsibility for meeting the third sector’s needs for slower, safer, equitable technology.

Those building tech solutions must implement them in ways that do not cause further divide. Companies could lower the barriers to entry by sharing the resources they already have access to and providing licences, sponsorships, and discounted software.

They can also consider more pro bono support and skilled volunteering to bridge the skills and capacity gap that third sector organisations face. This kind of collaboration between sectors would also enable learning and collective thinking among peer groups. Building partnerships involves trust, commitment to shared values, and recognising the expertise of each side.

DataKind UK has seen first-hand how much impact can be gained by implementing data science solutions, even fairly simple analyses, for social causes. Projects that benefit both people and planet — such as an anti-corruption organisation that was able to provide evidence that changed UK law; a women’s employment charity that reached new regions with high unemployment; and a charity supporting seriously ill children that was able to increase uptake of its services — have a huge, positive impact. These projects have all been highly successful without using AI — so what could be achieved with it?


Huge thanks to our fantastic panel: chair and DataKind UK Data Science Project Manager Caitlin Loftus; CEO of Magic Breakfast Lindsey MacDonald; CTO of Learning With Parents Peyman Owladi; and DataKind UK Ethics Advisor Michelle Lee. If you are keen to learn more about the use of AI in the third sector, keep your eyes peeled for more relevant resources we recommend!
