In a hospital with 6,000 staff, about 250 are consultants: roughly 4% of the workforce. Yet their decisions activate, animate, and control everything from board to basement. The quality of the organisation is almost entirely predicated on the quality of those decisions.

The Care Quality Commission does not answer the fundamental question: are the doctors in this building making good decisions that lead to good outcomes?

CQC defines quality through five domains: Safe, Caring, Responsive, Effective, and Well-Led. Not a single hospital is rated less than Good for Caring. Good to know. You will be cared for, while all else may not be good enough. The Effective domain checks whether clinicians follow guidelines and have access to data. It does not assess decision quality or clinical outcomes. Well-Led inspects the bureaucracy of management and the board. Not how the consultants control and direct the means of production.

To be direct: CQC inspects the bureaucracy of care. It inspects everything except the outcomes and the clinical decisions made by the doctors in the building.

The food hygiene rating problem

[Figure: The food hygiene rating problem. Both tell you the process is clean; neither tells you the output is good. Food Hygiene Rating: clean kitchen, safe food handling, no rat droppings, correct temperature storage. Says nothing about whether the food is good. CQC Hospital Rating: governance in place, management arrangements, safety checklists completed, policies documented. Says nothing about clinical decision quality or outcomes. Source: STRASYS analysis]

In the UK, food establishments display Food Hygiene Ratings. A score of 5 means no rat droppings, the place is clean, staff handle food safely. It says nothing about the chefs or the quality of what comes out of the kitchen. For that, you need a food critic. Michelin, perhaps.

The CQC hospital rating works the same way. It tells you the bureaucracy of the kitchen is in order. It does not tell you whether the consultants are producing good clinical outcomes.

The Secretary of State has been remarkably blunt: CQC is "not fit for purpose". We await the second Penny Dash report. It must surely address what really matters.

The coffee room metric

Soon after arriving in a trust labelled Inadequate, I discovered that CQC had measured one thing about consultant engagement: whether the doctors' common room was occupied during the inspection visit. It was empty. That number, zero, became the measure of consultant engagement.

The chief nurse of NHS Improvement stared me down across the boardroom table: "How do I know the consultants are making good decisions if they are not even in their room?"

Eighteen months later, the room was full. CQC concluded there must have been collusion. The data point mattered, and then it did not.

In a world filled with data, this was the measure. Bizarre.

What the patient actually gets

A hospital contains more than 30 specialties and many more services, covering needs from womb to tomb. Individual services in the same building can range from outstanding to inadequate. CQC rarely reports at that granularity. The overall hospital rating is applied as a proxy for the quality of any specific service within it.

When a GP refers a patient through the Choose and Book referral system, up comes the hospital name, the address, and the CQC rating. The patient is supposed to extrapolate from the overall rating that the specific specialty they need is also Good or Outstanding. It is not that simple.

How do patients actually choose? Insider trading. Ask a friend who knows a doctor. Check the diary. Assess travel options. Nobody writes to ask: what are your outcomes, your complication rates, your complaint rates, your litigation rates?

It is, after all, just data. And we have it.

We have the data, the tools, and the imagination

At STRASYS, the Decision Intelligence engine for healthcare, we built the Board Operating System to evaluate what CQC does not: board dynamics, decision quality, performance, and risk management. The AI-driven evaluation addresses the Well-Led domain with evidence, not self-assessment. Because the quality of the board's decision-making determines the environment in which clinical decisions are made.

The Strasys Value Index goes further, measuring how effectively trusts deliver high-quality, timely healthcare relative to cost. It accounts for population need, not just raw activity. Lower mortality and readmission rates are better. Shorter waits improve outcomes. Shifting care to appropriate settings is good, if quality is maintained. The Value Index asks the question CQC does not: is this trust delivering value?

Naeem Younis, STRASYS CEO, has argued that the NHS needs to move beyond inspecting the bureaucracy of care to measuring the decisions and outcomes that actually determine whether patients are well served. We have the data. We have the tools. The question is whether the system has the imagination to use them.