Monday 22 June 2015

Understanding care quality: confronting complexity

How will we know whether the quality of care in the NHS is improving or deteriorating under the new government? Or whether and how the financial pressures on the service are affecting the quality of patient care?

The NHS has to collect vast swathes of data on the quality of care it provides, but this gives us only a tiny glimpse into the nature of the million-plus patient contacts that take place every couple of days. This is not only because the data comes from a very small sample of NHS activity, but also because the way the information is collected (the topics it focuses on, how questions are posed, and the types of information used to generate answers) shapes the data generated about the quality of patient care. This is not to say that the data produces ‘untruths’, but rather that it offers insights into only some aspects of care, in some places, at some moments. Had the focus or design of any of our quality measurement tools been different, some quite different truths might have emerged.

The limitations of current approaches to assessing quality in the NHS are one of the themes of a new BioMed Central issue on quality in health care. A number of the authors criticise the emphasis that governments and national health bodies place on the measurable aspects of care quality, arguing that it risks devaluing the less tangible, relational aspects of health care.

Numbers are fantastic things – they have a comforting air of certainty about them and are particularly good at enabling comparisons across space and time. But some of the most important characteristics of health care seem to lose something when they are translated into numbers. Iona Heath (former president of the Royal College of General Practitioners) argues in her essay for the collection that the Quality and Outcomes Framework offers only a very partial and impoverished picture of the quality of primary care. She claims that the quality of that care depends on two things, ‘judgement and human relationships’, neither of which is easy to measure.

In his paper, Dane Pflueger, who studies quality measurement in English NHS hospitals, points out that all measurement tools reflect the particular priorities and ways of thinking current at the time they were designed. Measurement processes aren’t just a means of describing reality; they also shape the very reality they report on. For example, an early political commitment by the last Labour government to end mixed-sex wards has shaped, and continues to shape, how we measure patient experience, through multiple questions on the topic in the national inpatient survey. A different political focus (such as on the security of hospital buildings) could have produced a different set of survey questions, and a different set of data to draw on in answering the question: ‘Has patient experience improved under this government?’ Of course, those alternative survey questions would probably have prompted a different set of managerial priorities at a local level, leading to change and improvement in those areas instead.

Pflueger’s point is not so much that we’ve got the wrong quality measures, but that we need to recognise how partial and limited current approaches are, and to complement them with a variety of other kinds of information. For example, he suggests that quality accounts could be audited not just against available official data but also against the views of staff about the state of care in their organisations.

One of the challenges of combining multiple approaches to understanding the quality of care is that you probably won’t get a single, easily digested message about whether care in an institution is good or bad, declining or improving. In Pflueger’s study, one head of quality improvement at an NHS trust reported that softer forms of information, such as patient stories, were seen as a ‘bit of a liability to management’ because they could produce quite different impressions from ‘official’ versions of the quality of care based on compulsory and audited data.

Drawing on a whole range of information about care quality will surely take us a bit closer to a meaningful understanding of what care is like in practice. We need to engage with the fact that care is complex and multifaceted. Summarising and analysing what we know from available data to inform policy, practice and indeed politics are essential tasks of policy analysis. But so too is recognising how limited that knowledge is. The best management reports and discussions of quality are not those that present only numbers and unequivocal judgements of performance, but those that admit uncertainty and limits of focus, acknowledge complexity, and promote action that is sensitive to those limitations.

The King’s Fund
