How-to guide

CQC

The complete 2026 guide to the CQC Single Assessment Framework for homecare providers

CQC Single Assessment Framework explained for homecare: Quality Statements, 6 evidence categories, scoring thresholds, DSCR requirements, and how to prepare in 2026.

The Single Assessment Framework changed how care providers are assessed, and 2026 is the year that change becomes fully operational. The CQC has set a target of 9,000 assessments published by September 2026, which means the probability of your agency being assessed in the next 12 months is higher than it has been for several years.

If you run a homecare agency and you're trying to understand what the CQC actually expects from you right now, this guide is for you. It explains what the framework contains, how the evidence and scoring system works, what is currently changing, and what you can do to prepare.

From KLOEs to Quality Statements: what actually changed

Before the Single Assessment Framework, the CQC assessed providers against Key Lines of Enquiry (KLOEs). These were prescriptive, sector-specific prompts that told inspectors what to look for in each area of care.

The SAF replaced KLOEs with 34 Quality Statements. The change is more than a rebrand. Quality Statements are written from two distinct perspectives.
- "We" statements describe what providers commit to delivering.
- "I" statements describe what a person receiving care should actually experience.

This framing is deliberate: it shifts the starting point from process compliance to care outcomes, and it is what inspectors are now looking for.

In practice, an inspector isn't just checking whether you have a medication policy: they're looking for evidence that people in your care are experiencing the outcomes that policy is supposed to produce.

One further shift worth understanding: previously, a provider either passed or failed against a KLOE. Under the SAF, every Quality Statement is scored on a four-point scale, which means partial credit exists, but so does more granular exposure of weakness. A single area of significant failure can cap your overall rating for an entire Key Question.

What's changing:

Following significant criticism of the SAF, including from Dame Penny Dash's review and the Care Providers Alliance, the CQC ran a public consultation called Better Regulation, Better Care, which closed in December 2025. The proposals include developing sector-specific frameworks, introducing supporting questions more similar to the old KLOEs, and moving away from the arithmetic scoring model currently used to convert Quality Statement scores into ratings. The five Key Questions and the four-point rating scale are confirmed to continue. Sector-specific frameworks for homecare have not yet been published. Until they are, the 34 Quality Statements and the current scoring methodology remain operative. Plan your preparation against what exists now, and watch for CQC announcements on the new sector-specific guidance.

The five Key Questions: what the CQC is still assessing

The five Key Questions haven't changed and will not change under the proposed reforms. They are Safe, Effective, Caring, Responsive, and Well-led. Understanding what each is genuinely looking for helps you focus your evidence-gathering in the right places rather than trying to cover everything equally.

Safe: Are people protected from avoidable harm? This covers medication management, safeguarding, risk assessment, incident reporting, and staff training and competency. In the current SAF, Safe carries the highest number of Quality Statements and draws the most scrutiny in homecare assessments. Your mandatory training records, medication audit trails, and alert resolution data are all directly relevant here.

Effective: Does care achieve good outcomes? Inspectors look at care planning quality, clinical oversight, whether staff receive the right training and development, and whether care is delivered based on current evidence and best practice.

Caring: Are people treated with compassion and dignity? This question is most heavily evidenced through the direct experiences of people who use services, family feedback, and observed interactions between staff and those they care for. Your investment in person-centred care shows up most clearly here.

Responsive: Is care organised around individual needs? Inspectors look at how personalised care plans are, how quickly concerns are acted on, and how accessible services are to people with different communication needs. The quality of your daily care notes and the speed at which alerts are resolved both feed into this question.

Well-led: Is there effective leadership and governance? This question tells the CQC whether your agency can sustain quality over time. It covers how you monitor and improve quality, your culture, your use of data, and whether your leadership team is visible and accountable. Your policies and procedures and your internal quality monitoring activity are the foundations here.

For homecare providers, Safe and Well-led tend to draw the most scrutiny. They're the areas where systemic failure is most likely to surface, and where having structured, accessible digital records makes the clearest difference to your evidence.

The 6 evidence categories: how the CQC gathers what it needs

Understanding the Quality Statements tells you what is being measured. Understanding the six evidence categories tells you how the CQC gathers the evidence to make that measurement. Knowing this distinction helps you prepare more precisely.

1. People's experience

What people receiving care say about their experience. This includes feedback gathered during assessments, conversations with service users, and information submitted through the CQC's Tell Us About Your Care system. Regular client surveys and consistent feedback mechanisms demonstrate an agency that actively listens and responds. If this is an area where your evidence is thin, it is the first place to address.

2. Feedback from staff and leaders

Conversations with care workers and managers about how the service operates in practice. Inspectors pay attention to whether frontline staff understand their responsibilities, feel supported, and can speak openly about concerns. Culture is assessed through conversation as much as through documentation.

3. Feedback from partners

Input from GPs, social workers, district nurses, and local authority commissioners. Agencies with strong working relationships with external partners, and with evidence of responsive, integrated communication, tend to score more strongly in Effective and Responsive.

4. Observation

What inspectors see during site visits or remote assessment activity. This includes how staff interact with people, how the office environment is managed, and how care is being delivered day to day.

5. Processes

Your documentation, policies, care records, training logs, medication records, and audit activity. This is the category where digital care records make the most material difference. An agency that can produce a clear, timestamped audit trail of any care activity within minutes presents a fundamentally different picture to one that needs to retrieve paper files. For homecare providers, this is increasingly where assessments are shaped. Birdie's care quality platform covers eMAR, alerts, auditing tools, and care planning, producing structured, inspection-ready records as a by-product of day-to-day operations.

6. Outcomes

The actual results of the care being delivered: whether people's health and wellbeing are maintained, whether medication errors or incidents are occurring, and whether care plans are being followed in practice. This category requires data, not just documentation.

The scoring system: how CQC currently converts evidence into ratings

Under the current SAF methodology, each Quality Statement assessed is scored on a scale of 1 to 4, where 1 is Inadequate and 4 is Outstanding. The scores across all Quality Statements assessed under a Key Question are combined, converted to a percentage, and mapped to a rating using the following thresholds published by CQC:

- 25 to 38% = Inadequate
- 39 to 62% = Requires Improvement
- 63 to 87% = Good
- Over 87% = Outstanding

There is a critical override: if any single Quality Statement within a Key Question is scored as Inadequate, that Key Question cannot receive a rating above Requires Improvement, regardless of how the other statements score. One area of significant failure has the potential to cap your entire rating for that question.

It's also worth noting that CQC inspectors typically assess a subset of the 34 Quality Statements during any given assessment, not all 34. For homecare, assessments tend to focus on 10 to 12 statements, with the selection reflecting the scope of the service and any known areas of concern.
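The arithmetic above can be sketched in a few lines. This is an illustrative model of the published thresholds and the Inadequate override, not CQC's own tooling; the example statement scores are hypothetical.

```python
# Sketch of the current SAF scoring arithmetic: each assessed Quality
# Statement scores 1 (Inadequate) to 4 (Outstanding); scores are summed,
# converted to a percentage of the maximum, and mapped to a rating.

def key_question_rating(statement_scores):
    """Convert Quality Statement scores (1-4 each) into a Key Question rating."""
    max_possible = 4 * len(statement_scores)
    percentage = 100 * sum(statement_scores) / max_possible

    if percentage > 87:
        rating = "Outstanding"
    elif percentage >= 63:
        rating = "Good"
    elif percentage >= 39:
        rating = "Requires Improvement"
    else:
        rating = "Inadequate"

    # Override: a single Inadequate statement caps the Key Question
    # at Requires Improvement, regardless of the arithmetic.
    if 1 in statement_scores and rating in ("Outstanding", "Good"):
        rating = "Requires Improvement"
    return percentage, rating

# Hypothetical example: three strong statements plus one scored Inadequate.
# The raw percentage would map to Good, but the override caps the rating.
pct, rating = key_question_rating([4, 4, 4, 1])
print(f"{pct:.1f}% -> {rating}")  # 81.2% -> Requires Improvement
```

Note how the override changes the outcome: without the Inadequate statement, the same percentage would rate as Good.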

What's changing:

The Better Regulation, Better Care proposals include replacing this arithmetic scoring approach with professional judgements made "in the round" against Ratings Characteristics guidance not yet published. Until that guidance is finalised and the new framework is rolled out, the current methodology applies. Continue building evidence against the existing Quality Statements.

Why digital records matter to your CQC outcome

The CQC's movement toward more frequent, data-informed assessment has made one thing operationally clear: paper-based records create risk. Not because the CQC directly penalises providers for using paper, but because paper records cannot produce the real-time audit trails, accessible evidence, and trend data that inspectors increasingly expect to see across the Processes and Outcomes evidence categories.

The Digital Social Care Record (DSCR) is the framework established by NHS England for standardising how homecare providers capture and store care information digitally. NHS England maintains an Assured Solutions List of DSCR-compliant software, which providers can use to verify that a system meets the required national standards for security, data quality, and interoperability.

The operational case is straightforward. When a CQC inspector asks for six months of medication administration records for a specific client, a digital system produces that in seconds. When they ask how many alerts have been raised and resolved in the last quarter, the same. When they want to see evidence of care plan reviews across your client base, the same. The ability to retrieve structured, timestamped records at short notice is not a feature. It's the difference between an assessment that runs on your terms and one where you are scrambling to produce evidence on the day.

Birdie is on the NHS England Assured Solutions List for Digital Social Care Records. The platform covers electronic call monitoring, eMAR, care planning, assessments, and alerts, producing the structured data that supports evidence-gathering across all six CQC evidence categories. If you're evaluating whether your current system meets the DSCR standard, the Digital Care Hub and Birdie DSCR Switching Worksheet is a practical starting point. For a broader comparison of what to look for in domiciliary care software, Birdie's 2026 buyer's guide covers the key considerations.

Monitoring quality continuously: Birdie's Q-Score

The shift toward more frequent assessment means that waiting for an inspection to discover gaps in your evidence is a significant operational risk. The agencies that handle assessments most confidently are not the ones that prepare hardest in the weeks beforehand. They're the ones that monitor quality continuously as part of how they work every day.

Birdie's Q-Score is a weekly quality monitoring tool that analyses data from your Birdie account across four areas aligned with the CQC's Key Questions: Care Delivery, Care Planning, Responsiveness, and Caring Staff. Each area is scored from 1 to 4, using the same scale as the CQC's own ratings.

The Q-Score is designed to show you where your evidence base is strong and where it needs attention. An agency scoring poorly on Responsiveness can see exactly where alerts are going unresolved or where response times are slipping. They can act on that information before an inspection surfaces those gaps.

Two things are worth being clear about. First, the Q-Score draws on the data your agency is already generating through everyday use of Birdie. It doesn't require a separate monitoring process. Second, it's a guide for continuous improvement, not a guaranteed predictor of your CQC rating. CQC inspectors draw on information from all six evidence categories, including feedback from people and staff that no software system can fully capture. The Q-Score gives you a meaningful, data-driven signal about where to direct your attention.
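To make the idea of continuous monitoring concrete: Birdie does not publish how the Q-Score is calculated internally, so the sketch below is a purely hypothetical illustration of the general pattern, mapping weekly operational metrics onto a 1-4 scale per area. The metric names, percentages, and thresholds are all invented for illustration.

```python
# Hypothetical illustration only: maps weekly operational metrics (0-100%)
# onto a 1-4 scale per quality area, mirroring the idea of a weekly
# quality signal. Thresholds and metrics are invented, not Birdie's.

def area_score(metric_pct):
    """Map a weekly percentage metric to a 1-4 band."""
    if metric_pct >= 95:
        return 4
    if metric_pct >= 85:
        return 3
    if metric_pct >= 70:
        return 2
    return 1

weekly_metrics = {
    "Care Delivery": 96.0,    # e.g. % of scheduled visits completed
    "Care Planning": 88.5,    # e.g. % of care plans reviewed on schedule
    "Responsiveness": 72.0,   # e.g. % of alerts resolved within target
    "Caring Staff": 91.0,     # e.g. % of positive feedback entries
}

scores = {area: area_score(pct) for area, pct in weekly_metrics.items()}
for area, score in scores.items():
    print(f"{area}: {score}/4")
```

Run weekly, a signal like this makes the weak area (here, Responsiveness at 2/4) visible long before an inspection would surface it.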

What that looks like in practice:

- Azure Care moved from Good to Outstanding by using Birdie Analytics to shift from firefighting to proactive quality monitoring.

- Christies Care achieved Outstanding and saw their Q-Score move from 2.8 to 3.4 after focused improvements to care planning.

- Britannia Homecare, which had received a string of Requires Improvement ratings, used Birdie to address the process and documentation gaps that had been holding them back, and moved to Good.

These are not outliers: they reflect what happens when the evidence base for quality care is organised and accessible rather than dispersed across paper files or disconnected systems.

What to do right now

With the CQC on track to publish 9,000 assessments by September 2026, the likelihood of your agency being assessed in the coming months is real. Here's how to focus your preparation.

Audit your six evidence categories. Work through each category and ask honestly where your current evidence is strong and where it is thin. People's Experience and Outcomes are the categories most likely to expose gaps in agencies that have not yet moved to structured digital record-keeping. Birdie's CQC Inspection Preparedness resource offers a practical framework for working through this exercise.

Focus on Safe and Well-led. These two Key Questions draw the most scrutiny in homecare assessments. Demonstrating that your safeguarding processes are working, your incident reporting is thorough, your leadership monitors quality actively, and your staff feel supported to speak up: these are the foundations of a strong outcome.

Get your digital records in order. If you are still using paper for any part of your care recording, medication management, or audit activity, this year is the time to change that. The DSCR Switching Guide from Digital Care Hub and Birdie explains what to look for and how to evaluate your options. Birdie can also provide official documentation confirming DSCR assured supplier status for use in tender applications.

Use your data week by week. Whether through Birdie's Q-Score or another quality monitoring approach, inspection readiness should be something you track continuously. For a practical walkthrough of how Birdie supports this, watch the How Birdie Helps with Auditing webinar or book a demo.

For a practical preparation checklist mapped to the current framework, download Birdie's CQC Toolkit. For broader context on how quality and compliance connect to commercial resilience in 2026, see Birdie's 2026 homecare growth guide.

Published date:

March 23, 2026

Author:

Hannah Nakano Stewart


Ready to work smarter, not just harder?

Transform your homecare agency with technology that connects, informs, and supports your team every step of the way.


99.9% uptime