# QOF News

### QOF indicators consultation 2016

The NICE indicators consultation is currently live and will remain open until the end of the month. It covers a combination of CCG indicators and QOF indicators, as well as some other indicators for GPs that will not carry incentives.

You can read my response here, but there is nothing particularly inspiring. Perhaps of most concern is the suggestion of an atrial fibrillation screening scheme that is supported by neither the National Screening Committee nor the NICE guidelines, although the consultation document makes it sound as though the NICE guidelines say something else. Oops!

### Retired QOF Indicators

QOF in 2014/15 was quite a bit smaller than it had been the year before. This was largely due to the removal of clinical indicators and the funding being moved into the global sum. Other points were moved to the new admission avoidance DES.

One thing that did not receive much publicity at the time was that NHS England planned to continue to monitor some of the indicators that had been removed. These results have now been published. Things did not go entirely to plan and, for technical and other reasons, data is available for just over half of practices. HSCIC refers to these as "Indicators no longer in QOF", or INLIQ. This data is now on the site and can be identified by the grey colour of the figures in the table.

The HSCIC warns against comparing this data with previous years, as the dates and rules may differ. In practice they do not actually vary very much at all. There is a more important reason to be a little cautious: these indicators are no longer curated by the practices. Whilst exception reporting still applies, practices are far less likely to enter exception codes where there is little reason to do so. The biggest drops occur where, in the view of GPs, there has been little clinical benefit to patients.

Somerset CCG is a special case. Lots of indicators were effectively retired there in 2014/15 although prevalence was still counted and other indicators continued as quality measures. There is therefore a lot more grey in the Somerset statistics than in the rest of England.

The indicators themselves remain the same. Internally (and when I run the downloads) each indicator now has an "active" flag. If an indicator is not active then it is presented in a grey font. This gives maximum flexibility, as things may change rapidly, and differently across the country, over the next couple of years.

### Calculating prevalence

For ease of comparison, all of the prevalences on this site are based on the whole practice registered list, not just those in the eligible age group; this applies to areas such as diabetes and epilepsy. This is largely because countries other than England do not publish the specific number of patients on the practice list over, say, seventeen years old. It is also the prevalence that is used to adjust the value of each point.

I was asked this week why the whole list is used for prevalence adjustment rather than the age-adjusted subgroup. Is this unfair on practices? The answer is no: it is actually fairer the way it is, but for some quite complicated reasons. We have to look at some maths.
$\begin{array}{rcl}
\mathrm{Point\ value} &=& \mathrm{£}160 \times \dfrac{\mathrm{PracPrev}}{\mathrm{AvgPrev}} \times \dfrac{\mathrm{PracList}}{\mathrm{AvgList}} \\[1em]
&=& \mathrm{£}160 \times \dfrac{\left(\frac{\mathrm{Register}}{\mathrm{PracList}}\right)}{\left(\frac{\mathrm{AvgReg}}{\mathrm{AvgList}}\right)} \times \dfrac{\mathrm{PracList}}{\mathrm{AvgList}} \\[1em]
&=& \mathrm{£}160 \times \dfrac{\mathrm{Register}}{\mathrm{PracList}} \times \dfrac{\mathrm{AvgList}}{\mathrm{AvgReg}} \times \dfrac{\mathrm{PracList}}{\mathrm{AvgList}} \\[1em]
&=& \mathrm{£}160 \times \dfrac{\mathrm{Register}}{\mathrm{AvgReg}}
\end{array}$

We start by saying that the point value is modified by the practice prevalence relative to the average practice prevalence, and then by the relative size of the practice list overall. The second line expands this a little, expressing each prevalence as a register size divided by a list size. It is true that this is not exactly how the average prevalence is calculated, but it is pretty close.

Simplifying the formula, there is a lot we can cancel from top and bottom until we reach the final line, which basically says that the practice gets a set amount per person on the register, and that this drops as the national average register size rises. Nothing else matters.
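The cancellation can be checked numerically. This is only an illustrative sketch, not the official calculation: the £160 point base and all the example figures are assumptions for demonstration.

```python
POINT_BASE = 160.0  # assumed base value per point, in pounds

def point_value_whole_list(register, prac_list, avg_reg, avg_list):
    """Point value using whole-list prevalence, before any cancellation."""
    prac_prev = register / prac_list   # practice prevalence
    avg_prev = avg_reg / avg_list      # average prevalence
    return POINT_BASE * (prac_prev / avg_prev) * (prac_list / avg_list)

def point_value_simplified(register, avg_reg):
    """The simplified form after the list sizes cancel."""
    return POINT_BASE * register / avg_reg

# Two practices with the same register size but very different list sizes
# end up with the same point value, matching the simplified formula.
a = point_value_whole_list(register=200, prac_list=5000, avg_reg=250, avg_list=7000)
b = point_value_whole_list(register=200, prac_list=12000, avg_reg=250, avg_list=7000)
print(a, b, point_value_simplified(200, 250))  # all three agree (to floating point)
```

Whatever the list size, only the register size relative to the national average register moves the result.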

We can try again using an 'Eligible' denominator for the register.

$\begin{array}{rcl}
\mathrm{Point\ value} &=& \mathrm{£}160 \times \dfrac{\mathrm{PracPrev}}{\mathrm{AvgPrev}} \times \dfrac{\mathrm{PracList}}{\mathrm{AvgList}} \\[1em]
&=& \mathrm{£}160 \times \dfrac{\left(\frac{\mathrm{Register}}{\mathrm{Eligible}}\right)}{\left(\frac{\mathrm{AvgReg}}{\mathrm{AvgEligible}}\right)} \times \dfrac{\mathrm{PracList}}{\mathrm{AvgList}} \\[1em]
&=& \mathrm{£}160 \times \dfrac{\mathrm{Register}}{\mathrm{Eligible}} \times \dfrac{\mathrm{AvgEligible}}{\mathrm{AvgReg}} \times \dfrac{\mathrm{PracList}}{\mathrm{AvgList}} \\[1em]
&=& \mathrm{£}160 \times \dfrac{\mathrm{AvgEligible}}{\mathrm{Eligible}} \times \dfrac{\mathrm{Register}}{\mathrm{AvgReg}} \times \dfrac{\mathrm{PracList}}{\mathrm{AvgList}}
\end{array}$

Much the same process applies here, but there is a lot less to cancel out. That is not necessarily a bad thing, but we can see how this formula behaves. If we assume a practice of average list size then the last term will be one, and if it has an average register size for diabetes then the middle term will be one as well. Interestingly, the point value would then vary with the proportion of over-seventeens on the practice list (i.e. Eligible would change without the overall list size changing). This is not what we want to see at all, as the makeup of the practice list would alter income without any change to the actual number of patients treated.

So that is why the overall list size is used to calculate prevalence.

### Release notes 2015

All four countries now have data on the site for the year 2014/15. The four countries have continued to diverge in their requirements and common ground is becoming smaller. Where indicators are broadly equivalent I have tried to make them comparable.

There are a couple of things worth noting below.

### Wales

Wales was first to publish this year, about two weeks ahead of Scotland and a month ahead of England and Northern Ireland, which was impressive. There are no significant issues with their statistics. They have continued to publish data about local practice groups, so I have used these on the site. I can't find any official codes for these, so they have a "QDB" code which is entirely made up by me.

### Scotland

Most of the Scottish data is fairly straightforward. There seems to have been a bit of a muddle about the actual identifiers for indicators, and I have used those in the final publication. QS002 was used last year and then for a completely different indicator this year, so I have used QS002A for the new one.

### England

English data imported fairly easily this year, much helped by a "raw data" publication. There is a new "sub-region" level used in the publication, which adds another level to the hierarchy on the site, with the exception of Wessex. On the spreadsheets Wessex CCG was listed as its own sub-region, so to avoid a horrible loop the site simply skips the sub-region stage there.

### Northern Ireland

As things stand there is no prevalence data for Northern Ireland where there is no other indicator in that disease area. This affects obesity, epilepsy and learning disability and there is not a lot of smoking data either. I will update this if more information becomes available.

There are now Local Commissioning Groups in Northern Ireland. I have used these in much the same way as the old boards although they cover different areas, most notably by having a separate group for Belfast.

### 2015 Data publication dates

All of the data on this site comes from the various governments around the UK. Some of the publication dates have been announced. It takes me a little time to get each release onto the site, but for interest here are the dates currently available.

### QOF Consultation

It is nearly the deadline for responses to the consultation by the NICE committee on potential new QOF indicators (5pm on the 23rd of February). The earliest these indicators could be expected to appear is 2016-17, and in general the committee has been largely ignored over the last couple of years. You can read the NICE consultation papers and my response to them.

### 2014 QOF data

The data for 2013/14 is now on the site. It proved a bit of a challenge this year as, for the first time, there were material differences between the QOF in the four countries. Many of these differences were around thresholds and timings, although the gaps have widened in 2014/15 and will need to be tackled.

I had to bite the bullet and adopt some sort of policy on how to present the data. For precision, each indicator is reported at practice level as published by the four countries. Indicators particular to Scotland have an S suffix, Welsh indicators a W suffix and Northern Irish indicators end in NI. England has no suffix - the assumption seems to be that the other nations have opted out of the English QOF. Not all indicators have an equivalent in each country.

One of the strengths of this site has always been the fact that data from all four countries can be compared, and I was quite keen that this could continue. In any case the indicators are often quite similar. We are not so much comparing apples and oranges as Cox's and Braeburn apples (possibly a rather pointless comparison, as in my view Cox's knock Braeburns out of the park every time, but it will do as an analogy).

The site has grouped similar indicators as they have changed over time. If you click on the calendar icon you may see several similar indicators being used over time. This is a rough approximation but pretty effective. I have used the same sort of grouping to compare indicators between countries. The figures for the UK are based on this grouping and use indicators without a suffix where there is an English equivalent.

This does mean that the centile figures are largely confined to each country where there are country-specific indicators, but remain UK-wide where the indicators are the same across the UK. This is mostly the case for disease prevalence.

Currently there is a bug which means that practices outside England do not compare properly with the UK figures. I will correct this over the next few days.

### Welsh Practice Groupings

The data for Wales was published with small practice groupings and I have taken these onto the site. The system is pretty good at arbitrary hierarchies so manages these fairly easily. There is not much detail about these groupings other than their names, and I have made up some custom codes for them.

### Depression Prevalence

For the last couple of years the depression prevalence figure has been based on the number of patients who received a diagnosis after the first of April 2006. Prior to that it was based on the number of patients who had ever received a diagnosis. This caused a bit of a jump in the figures, which stood out last year. There was also some muddle in last year's figures.

This prevalence appeared in the figures as "DEP PREV 2". I have changed this to "DEP PREV 3" to reflect the new rules, although the two are linked and the historical trend will appear across them. I will tidy up last year's figures in the same way over the next couple of days.