
Artificial intelligence (AI) has shifted from a niche topic in tech circles to a headline conversation across health and care over the past couple of years. What was once the preserve of data scientists and software engineers is now discussed in care home corridors, home care offices, and even over the dinner table! But while the hype is loud, the reality for social care is more nuanced, filled with both opportunity and the responsibility to get it right. Join us as we explore the reality and potential of AI in social care.

The reality of AI in social care

Much of the buzz stems from Generative AI (GenAI): tools like ChatGPT and Microsoft Copilot that create new content such as text or images. These have made AI accessible to anyone, even those with no technical background. This accessibility has sparked imagination and curiosity across the care sector. Care leaders are starting to ask, “What can AI do for us?” 

However, the reality is that large-scale return on investment (ROI) for AI in social care hasn’t been fully realised yet. While the tech industry is racing ahead, the challenge for our sector is not to chase AI for its novelty, but to apply it deliberately to real business and care problems. 

Two clear paths exist: 

  1. Tech-driven innovation  
    Companies build increasingly powerful models. This is an exciting approach, but one that is often disconnected from on-the-ground needs, which makes it incompatible with community-centred care. 
  2. Problem-driven design  
    Here we start with the care challenge and design AI tools to address it in safe, specific, and scalable ways. The best forms involve lived experience throughout, an approach known as keeping a ‘human in the loop’. 

For obvious reasons, at Nourish we believe it’s the second path that holds real promise for social care. 

Why AI can be a game changer for social care

At its best, AI offers a way to augment human work, not replace it. In social care, this means easing the administrative load, surfacing critical insights faster, and supporting preventative approaches that improve quality of life for the people we serve. 

A useful way to think about this is through the Triple Aim framework from US healthcare, which focuses on: 

  1. Improving the experience of care 
  2. Improving the health of populations 
  3. Reducing the per capita cost of care 

For UK care providers, AI can directly support each of these aims. 

Crucially, this is not about replacing carers with algorithms. It’s about using AI in social care to lift some of the cognitive burden, so that staff can spend more time doing what only humans can: building relationships and delivering compassionate, intuitive care. 

How AI works in practice

AI depends on data, and in social care, the ongoing shift to digital systems means we now have more data than ever before. Care records, care notes, health metrics, and incident reports all hold valuable insights if we know how to extract them. 

Two main AI techniques are particularly relevant: 

  1. Generative AI (GenAI) 
    These models excel at working with large amounts of unstructured text. For example, they can be trained to identify patterns in free-text care notes, spotting trends that might otherwise go unnoticed. 
  2. Machine Learning (ML) 
    This involves feeding structured data into a model to detect patterns and make predictions. For instance, by analysing hydration levels and health conditions, a machine learning model can help predict falls risk. 
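To illustrate the second technique, here is a minimal, self-contained sketch of a machine learning model of this kind: a tiny logistic regression trained on hypothetical structured data. The features and figures are invented for illustration only, not drawn from any real care records or from Nourish's systems.

```python
import math

def train_logistic(rows, labels, lr=0.1, epochs=500):
    """Tiny logistic-regression trainer using plain gradient descent."""
    n = len(rows[0])
    w = [0.0] * n   # one weight per feature
    b = 0.0         # bias term
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))          # predicted probability
            err = p - y                              # gradient of log-loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def falls_risk(w, b, x):
    """Return the model's estimated probability of a fall for one person."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features: [poor hydration (0/1), mobility issue (0/1)]
X = [[0, 0], [0, 1], [1, 0], [1, 1], [0, 0], [1, 1]]
y = [0, 0, 0, 1, 0, 1]  # 1 = a fall was recorded in the following month

w, b = train_logistic(X, y)
high = falls_risk(w, b, [1, 1])  # poor hydration and a mobility issue
low = falls_risk(w, b, [0, 0])   # neither risk factor present
```

A real system would use far richer data and a properly validated model; the point of the sketch is simply that structured inputs produce a risk score a human can then review.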

The most effective approach blends these techniques with expert oversight: when clinical professionals and frontline carers label and review the data a model learns from, this is known as supervised learning. Guiding the AI’s “understanding” with their experience ensures the insights it produces are safe, relevant, and trustworthy. 

Why responsible AI matters

Social care deals with some of the most sensitive data possible, and the wellbeing of real people. That makes Responsible AI not just an ethical choice but a practical necessity. 

Responsible AI follows core principles: transparency, fairness, accountability, privacy, and human oversight. 

Human oversight is crucial. In social care, AI should suggest, not act. That is what we mean by augmenting, rather than replacing, care. A falls-risk prediction, for example, should prompt a human review and intervention, as opposed to automatically changing a care plan. 
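To make the “suggest, not act” pattern concrete, here is a minimal, hypothetical sketch. The class and field names are illustrative, not Nourish's actual system: a risk score can only ever raise a review task, and the care plan changes only through an explicit human decision.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CarePlan:
    resident: str
    actions: List[str] = field(default_factory=list)

@dataclass
class ReviewTask:
    resident: str
    suggestion: str
    rationale: str
    approved: bool = False

def handle_falls_risk(plan: CarePlan, risk: float,
                      queue: List[ReviewTask], threshold: float = 0.7) -> None:
    """A high risk score raises a review task for a human; it never edits the plan."""
    if risk >= threshold:
        queue.append(ReviewTask(
            resident=plan.resident,
            suggestion="Add hourly mobility checks",
            rationale=f"Predicted falls risk {risk:.0%} exceeds {threshold:.0%} threshold",
        ))

def approve(task: ReviewTask, plan: CarePlan) -> None:
    """Only an explicit human decision applies the suggested change."""
    task.approved = True
    plan.actions.append(task.suggestion)

plan = CarePlan("Resident A")
queue: List[ReviewTask] = []
handle_falls_risk(plan, 0.82, queue)
# At this point plan.actions is still empty: the AI has only suggested.
approve(queue[0], plan)
```

Keeping the approval step as a separate, auditable function is one simple way to guarantee a human remains in the loop.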

This protects against the risks of over-automation, so providers can ensure that the irreplaceable human qualities of care (empathy, intuition, and contextual judgment) remain at the centre. This is why we build systems that are transparent and auditable: so we understand why recommendations are given and remain accountable to them. 

Practical applications on the horizon

Responsible AI opens the door to several promising use cases, from predicting falls risk to surfacing trends hidden in free-text care notes. 

These examples share a common goal: moving from reactive care (‘What happened?’) to proactive and preventative care (‘Why is it happening, and how can we change the outcome?’). 

Building trust in AI in social care

For AI to be embraced in social care, trust must be earned and maintained. 

Trust isn’t a one-off achievement. It’s a relationship that must be nurtured through ongoing transparency and collaboration.  

The road ahead

The potential of AI in social care is undeniable. Used responsibly, it can improve outcomes, reduce costs, and allow carers to focus more on human connection. But the key word is ‘responsibly’: rooted in human experience and shaped by the people and communities it supports. 

The most effective AI in our sector will come from co-production: solutions developed hand-in-hand with those who understand the realities of care and support, both those who provide it and those who use it. This ensures the technology supports the real needs of the sector, rather than forcing the sector to adapt to the technology. 

In the end, AI in social care should not be about replacing human judgment but empowering it. The goal is a future where technology enhances the compassion, skill, and dedication that define our sector. Where AI is the assistant, and people remain firmly in charge. 

Watch our Head of Data and AI, Sudha Regmi, discuss responsible AI and our Responsive Design at UKCW 2025 here.

Our Chief Marketing Officer Lee Gilbert recently joined David Thompson and Natasha Bone of Rehability Care for a two-part episode of their podcast, Social Care Chronicles.

‘Digital Care Planning in Action: Transforming Lives with Nourish’ explores how we are working with our users to reshape the future of social care.

Social Care Chronicles Part 1

Part 1 of the episode covers a range of topics from reducing paperwork to empowering individuals with learning disabilities, autism, and mental health conditions. Our conversation uncovers the real-world impact of person-centred, data-driven care.

Watch Part 1 Here

You’ll learn

Whether you’re a care provider, tech innovator, or policymaker, this episode is packed with insights on digital transformation in social care.

Social Care Chronicles Part 2

Don’t miss Part 2, where we dive deeper into implementation, integration, and what’s next for the future of digital care!

Part 2 premieres at 11:00 a.m. on 10th September 2025.

Watch Part 2 Here

In Part 2 we explore

Learn more about the innovations at Nourish, and how we’re building for the future of social care, on our Articles page.

Technology initiated a sea change in social care. The rapid uptake of digital systems saw the number of providers using digital social care records (DSCRs) double in the past four years. This impacts every aspect of the care community, from the processes we use each day to the outcomes experienced by people using support. The benefits a provider can enjoy from ‘going digital’ are well documented. What is less well documented is the impact technology, and by recent extension AI, have on our perspective. Specifically, our perspective of ‘what good looks like’ in social care.

We sat down with Lewis Sheldrake, an expert with over 15 years’ experience working in local government and a legacy of innovative implementations of technology and training, to discuss this topic. Lewis won the prestigious Local Government Challenge in 2023 with his novel AI Labs project. This project centred on ‘leveraging AI into all aspects of local government service delivery’. Crucially, in a way that supported emotional intelligence and promoted human interaction, two core tenets that ran through our conversation. We chatted about the changing perspectives on what good looks like in social care: moving from reactivity to proactivity, the relationship between data, AI and benchmarking in care, and why we need to be open to new opportunities in technology.

Lewis Sheldrake spoke to us in a personal capacity and not on behalf of any local government or association.

“If we just look to use technology to fulfil functions already fulfilled by traditional models of care, I think that would be a missed opportunity.”

Let’s start at the start, how would you define ‘what good looks like in social care’ traditionally?

“From a local authority point of view the absolute baseline of what good looks like in social care is not having any kind of substantial safeguarding risks. Not being in a position where you’re leaving your most vulnerable without care. Or not essentially fulfilling some of the statutory duties that are placed upon local authorities under the Care Act.

“You can probably already pick up the fact that a lot of what I’m talking about is the absence of certain things happening, as opposed to it being a positive. I think that, unfortunately, is part of the challenge that we face. Looking at this through the lens of local authorities, it’s mainly focussed on avoiding crises, rather than proactive, aspirational care.

“Whereas if you were to put it from the perspective of a person receiving care, it’s different. For them, it’s having that assurance that the care that they are receiving is safe and of a good quality. So they can live safely and independently in their own home for as long as possible.

“This contrast in perspective is critical, and it’s exactly the point from which we must evolve. It creates the space for technology to bridge the gap and, crucially, help redefine what ‘good’ can and should look like in modern social care.”

What kind of technology expands upon those perspectives when introduced to the process?

“I think invariably that so many of these cases could have been foreseen with the right technology and data in place.”

“Firstly, I think it’s important to understand that by the time someone gets to their local authority they are at a certain level of need. Their needs are relatively acute and consequently are going to require a level of intervention. One that’s likely expensive at that stage. This is the reactive model we’ve become accustomed to.

“We often hear cases where a family member can no longer cope. They’ve been providing informal care to that loved one and they’re burned out. Under the Care Act 2014, the local authority has a statutory duty to assess those needs – and where they meet eligibility criteria, arrange appropriate support. By this stage, the intervention is often urgent, complex, and resource-intensive.

“This is happening at a time where councils are absolutely creaking with the volume and complexity of demand that is arriving at their front door. And I think invariably that so many of these cases could have been foreseen with the right technology and data in place.”

“During COVID a council were able to identify with 95% accuracy which of their residents would likely be on the shielding list. Through the use of data, they’re able to accurately predict those people and proactively support them.”

So technology can help social care move from reactive to proactive action?

“Absolutely, with the right technology we could intervene earlier to help the person avoid requiring a care package for longer. Keeping them there, living well and independently for longer. Supporting their next of kin to be able to continue to provide that care but also have some respite for themselves. I think this is where technology really can fit in. There are two key components of this.

“The first is about being smart in our uses of data. There are some really good examples from my experience around using data. Such as a council utilising data from other interactions it had with people to help build greater levels of prediction. Initiatives to understand when somebody is deteriorating to such an extent that a proactive intervention would be valuable.

“I know during COVID a council were able to identify with 95% accuracy which of their residents would likely be on the shielding list. Through the use of data, they’re able to accurately predict those people and proactively support them. I also know of councils who utilised data to develop predictive falls models. Again, this significantly changes the effectiveness of care, as we can proactively reach out to at-risk people and offer them interventions. Interventions which, along with improving quality of life for citizens, save the local authorities money.

“The second part of this is through digital technology devices. For example, in the case of falls, a device that can detect when a person falls and activate an alarm in response to send for support. But beyond responding to incidents, there’s increasing potential to analyse the patterns and behaviours that often precede a fall. This allows us not just to react, but to intervene earlier or mitigate the risk before a fall occurs at all.”

“We often hear the same phrases: ‘It was only a matter of time.’ ‘We could see this coming.’ These reflections highlight how predictable many crises are – with hindsight.”

“I think both of these aspects, if used coherently, will alleviate the amount of pressure arriving at the front door of local authorities. Both in volume and also in terms of acuity. Now, by the time someone is coming to you for a care package you already have a more rounded understanding of their circumstances. Who they are, the context they live in, and the support networks around them. The volume of home care they need is less than it otherwise might have been thanks to earlier, preventative interventions.

“In effect, it helps smooth the peaks in demand – reducing the levels of complexity and acuity of cases presenting at any one time. Which in turn lowers the cost to the council and the financial burden on the person receiving support.”

Within the context of this more proactive view of what ‘good’ looks like in social care, how important is data?

“It’s central. Broadly, there are two ways of using data to understand need and provide effective care.

“There’s the strategic, macro use of aggregated data across large population groups. This approach is highly effective at generating predictive models that assess risk and identify patterns. Providing valuable insights for both providers and commissioners. It enables more intelligent, data informed decisions about how services are designed and delivered, ensuring they are suitably tailored to meet the needs of their clients. We’ve seen examples of this approach applied with great success in other high-risk sectors, such as the aviation industry.

“The second way, and I think the more exciting side, is the micro, hyper personalised application. Where we can focus down on the individual to really understand their needs and ambitions. Again, we see impressive examples of this data application in other sectors. Such as the preference-driven algorithms behind Amazon, Netflix, and Spotify. As well as personalised customer journeys across digital platforms.

“If you were to think about some of the principles that underlie their architecture (albeit in very different sectors with very different objectives), it raises an important question: what if that architecture were applied to a health and social care context? How helpful would that be in ensuring people are getting exactly what they want and need, when they want and need it?

“One of the most powerful aspects of this shift from reactive to proactive care is the ability to anticipate. In social care, hospitals, and communities, we often hear the same phrases: ‘It was only a matter of time.’ ‘We could see this coming.’ These reflections highlight how predictable many crises are – with hindsight. With the right acquisition and application of data, we can change what good looks like in social care in a positive, person led way.”

How can AI support care providers to utilise their data for benchmarking what good looks like in social care?

“If we break down the core functions that exist in care, there are a number of different actors doing different tasks e.g. care planning, initial assessments, delivery of that care. I think there are really compelling applications for AI for each of those. Applications that can enhance the delivery of that function, while in turn delivering a higher level of quality and precision to the end user.

“We’re already seeing promising examples of AI reducing administrative burden with data entry. In terms of things like transcription and data input. I think it’s a good start, but there is significant untapped potential to expand AI’s role across the wider care ecosystem.


“For any care plan that’s pulled together, you think about how many other care plans have gone before that. Drawing upon the decades of experience and knowledge from the people that are inputting into those care plans. With AI this information can be readily triangulated to make the most precise care plan for any given set of circumstances. AI can prompt follow up actions or suggest referrals based on all the data your service has. These prompts support care decisions rather than automate them. Helping to standardise the service offer based on the individual needs of each client, by drawing upon the wealth of experiences and outcomes across your service to inform best practice. Ideally alleviating the variability of individual social workers, while enhancing the specificity of your care plans.

“The data gathered during this care provision is then fed back into the system. This creates a virtuous cycle of person led, community centred care. And that’s just one quick example. From high level strategy to direct care delivery in people’s homes there are applications for data and AI that improve service quality, operational efficiency and ultimately deliver the objectives that keep people living safely and independently in a place they call home for longer.”

The potential is certainly impressive, and you touched on something important about AI application. How do we make sure AI augments, rather than replaces, human interactions in care?

“The most immediate answer is reducing administrative burden. There’s lots of opportunities for AI and care technology in general to afford people more time delivering what they got into the job to do. Face to face care, in a more personalised and informed way.

“We’re using care as a kind of umbrella term for a whole number of things at the moment.”

“Let me offer a counterpoint. There’s a common misconception that more human care always equates to better care. But in some cases, that’s not true. Overprescription and unnecessarily invasive care can diminish a person’s independence and dignity. Take supported living settings for example. Imagine someone with learning disabilities who receives 24/7 care. There are people coming in, waking them up in the night to routinely check in on them. This is well-intentioned, but disruptive. In such cases, the use of technology can help provide that person with a more respectful and person-centred alternative. Providing greater levels of privacy, independence and dignity, while still ensuring support is available when genuinely needed.

“My key point is about precision and that is certainly where I think AI can play a transformative role. Ensuring care is sufficiently proportionate to the needs of the individual. I don’t think that necessarily means more care is better. I think it’s about the quality, appropriateness and value of the ‘care’ being provided.

“Care in inverted commas mind you, because we’re using care as a kind of umbrella term for a whole number of things at the moment. A lot of responsibilities that are falling under the umbrella of care are not actual direct care. They are different forms of administrative tasks. We need to think about how to displace that through the use of AI and other digital tools to ensure that we are maximising our resources and delivering the best outcomes possible.”

Do you see AI and data supporting not only care quality and cohesiveness, but also capacity?

“Absolutely, I think it has to. We have to be realistic. There are massive capacity challenges, both in terms of the workforce and in terms of the budgets to support social care.

“I personally think there are circumstances where technology could well replace some types of care which are not necessary to be delivered in person. With an ageing population and increasing levels of need and vulnerability, we have to use our finite resources wisely. Care capacity is not limitless, and technology offers a valuable opportunity to redeploy human effort where it’s needed most.”

Capacity is a sensitive subject, what kind of opportunities do you see?

“Understandably so. There are massive capacity challenges in social care, both in terms of workforce and budget. But rather than viewing these constraints purely as limitations, they invite us to re-examine our definition of ‘good’ and to imagine how technology and AI can shape what good looks like in social care into a new vision. One that’s more sustainable, personalised, and outcomes focused.

“So much care provision is historically focussed on things like washing, bathing, food, and medication. But if we consider this through the lens of Maslow’s hierarchy of needs, these are foundational; they sit at the base of the pyramid. Essential, yes, but not sufficient for a fulfilling life.

“What it often fails to address, whether due to technical limitations or lack of resource, is anything related to the higher levels of that hierarchy. Support for self-esteem, companionship, and emotional fulfilment is frequently absent. Let alone opportunities for people to self-actualise!

“I really believe that there’s an opportunity to move away from the primary function of care provision being to give people the bare necessities and to basically keep them alive.

“An opportunity for us to move to a form of care that helps people have a greater level of self-esteem, belonging and purpose. Take social isolation for example. Everyone is aware of our social isolation problem and the significance of its health implications. But actual interventions to address this issue are sparse, largely due to cost.

“I think there is huge potential to augment existing models of care using technology and AI to alleviate some of these kinds of challenges.

“AI tools, even just the currently mainstream ones like ChatGPT, offer fascinating potential in supporting social connection, stimulation, and engagement. For some people, these platforms provide opportunities to engage in meaningful conversations they might not otherwise have. Interactions that validate their experiences, challenge their thinking, and stimulate them intellectually. It’s obviously not care in the way that we understand and conceptualise it, certainly not traditionally. But when you stop to think about it, if someone is able to enjoy an engaging conversation about a subject that’s meaningful to them, that validates their experience, challenges them and stimulates them intellectually, isn’t that a core tenet of ‘good’ care?

“I think there’s value in that. These possibilities have scope and the potential to progress much further, and I don’t think they should be ruled out. Absolutely, AI and technology can help drive more informed decisions, reduce administrative burden and promote coproduction.

“But if we just look to use technology to fulfil functions already fulfilled by traditional models of care, I think that would be a missed opportunity.”