At the end of last term I wrote this blog post. It was my attempt to a) predict what changes the DfE would make to the KS2 progress methodology this year, and b) get my excuses in early about why my 2016 VA Calculator could not be relied upon for predicting VA for 2017. For what it's worth, I reckon the 2017 Calculator will be better for predicting 2018 VA, but 2016 data was all over the shop and provided no basis for predicting anything.

Anyway, no doubt you've all now downloaded your data from the tables checking website (and if you haven't, please do so now; guidance is here) and have spent the last week trying to make sense of it, getting your head round what -1.8 means and how those confidence intervals work. Perhaps you've used my latest VA calculator to recalculate data with certain pupils removed, update results in light of review outcomes, or change results to those 'what if' outcomes.

This is all good fun (or not, depending on your data) and a useful exercise, especially if you are expecting a visit, but it's important to understand that the DfE has made changes to the methodology this year - some of which I predicted and some of which I didn't - and, of course, the better we understand how VA works, the better we can fight our corner.

So what's changed?

**Actually let's start with what hasn't changed:**

**1) National average is still 0**

VA is a relative measure. It involves comparing a pupil's attainment score to the national average score for all pupils with the same start point (i.e. the average KS2 score for the pupil's prior attainment group (PAG)). The difference between the actual and the estimated score is the pupil's VA score. Adding up all the differences and dividing by the number of pupils included in the progress measure gives us the school's VA score. If you calculate the national average difference the result will be 0. Always.

School VA scores can be interpreted as follows:

- Negative: progress is below average
- Positive: progress is above average
- Zero: progress is average
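The arithmetic is simple enough to sketch in a few lines of Python. The PAG estimates and pupil scores below are entirely made up for illustration (real estimates are published by the DfE each year):

```python
# Hypothetical PAG estimates -- real ones are published by the DfE each year
pag_estimates = {"PAG_A": 101.5, "PAG_B": 104.2}

# Made-up pupils: each has a PAG and an actual KS2 scaled score
pupils = [
    {"pag": "PAG_A", "ks2_score": 103.0},
    {"pag": "PAG_B", "ks2_score": 102.0},
    {"pag": "PAG_A", "ks2_score": 100.5},
]

# Pupil VA = actual score minus the national average (estimate) for the pupil's PAG
diffs = [p["ks2_score"] - pag_estimates[p["pag"]] for p in pupils]

# School VA = mean of the pupil differences
school_va = sum(diffs) / len(diffs)
print(round(school_va, 2))  # -0.57
```

Note how one pupil's -2.2 drags down two broadly average scores: exactly the single-pupil effect worth investigating.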

Note that a positive score does not necessarily mean all pupils made above average progress, and a negative score does not indicate that all pupils made below average progress. It's worth investigating the impact that individual pupils have on overall progress scores and taking them out if necessary (I don't mean in a mafia way, obviously).

**2) The latest year's data is used to generate estimates**

Pupils are compared against the average score for pupils with the same start point in the same year. This is why estimates based on the previous year's methodology should be treated with caution and used for guidance only. So, the latest VA calculator is fine for analysing 2017 data, but is not going to provide you with bombproof estimates for 2018. The same goes for FFT.

**3) KS1 prior attainment still involves double weighting maths**

KS1 APS is used to define prior attainment groups (PAGs) for the KS2 progress measure. It used to be a straight up mean average, but since 2016 has involved double weighting maths, and is calculated as follows:

(R+W+M+M)/4

If that fills you with rage and despair, try this:

(((R+W)/2)+M)/2

Bands are as follows:

Low PA: KS1 APS <12

Mid PA: KS1 APS 12-17.99

High PA: KS1 APS 18+
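As a quick sketch, the APS calculation and banding can be expressed like this (the example point scores are illustrative; note that the two formulas above are algebraically identical):

```python
def ks1_aps(reading, writing, maths):
    """KS1 APS with maths double-weighted: (R + W + M + M) / 4.
    Equivalent to (((R + W) / 2) + M) / 2."""
    return (reading + writing + 2 * maths) / 4

def pa_band(aps):
    """Prior attainment band for a given KS1 APS, per the thresholds above."""
    if aps < 12:
        return "Low"
    if aps < 18:
        return "Mid"
    return "High"

# An illustrative pupil with 15 points in all three subjects
aps = ks1_aps(15, 15, 15)
print(aps, pa_band(aps))  # 15.0 Mid
```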

**4) Writing nominal scores stay the same**

WTS: 91

EXS: 103

GDS: 113

This means that we'll continue to see wild swings in progress scores as pupils lurch 10 points in either direction depending on the assessment they get, and any pupil with a KS1 APS of 16.5 or higher has to get GDS to get a positive score, but GDS assessments are kept in a remote castle under armed guard. I love this measure.

**5) As do pre-key stage nominal scores**

No change here either, which means the problems continue. Scores assigned to pre-key stage pupils in reading, writing and maths are as follows:

PKF: 73

PKE: 76

PKG: 79

Despite reforms (see changes below) these generally result in negative scores (definitely if the pupil was P8 or above at KS1). It's little wonder so many schools are hedging their bets and entering pre-key stage pupils for tests in the hope they score the minimum of 80.

**6) Confidence intervals still define those red and green boxes**
These can go on both the changed and not-changed piles. Confidence intervals change each year due to annual changes in standard deviations and numbers of pupils in the cohort, but the way in which they are used to define statistical significance doesn't. Schools have confidence intervals constructed around their progress scores, with an upper and a lower limit. These indicate statistical significance as follows:

- Both upper and lower limits positive (e.g. 0.7 to 3.9): progress is significantly above average
- Both upper and lower limits negative (e.g. -4.6 to -1.1): progress is significantly below average
- Confidence interval straddles 0 (e.g. -1.6 to 2.2): progress is in line with average
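The classification rule is mechanical, as a short sketch shows (using the example intervals above):

```python
def significance(lower, upper):
    """Classify a progress score from its confidence interval limits."""
    if lower > 0 and upper > 0:
        return "sig+"     # significantly above average
    if lower < 0 and upper < 0:
        return "sig-"     # significantly below average
    return "in line"      # interval straddles 0

print(significance(0.7, 3.9))    # sig+
print(significance(-4.6, -1.1))  # sig-
print(significance(-1.6, 2.2))   # in line
```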

**7) Floor standards don't move**
This shocked me. If I had to pick one data thing that I thought was certain to change it would be the floor standard thresholds. But no, they remain as follows:

Reading: -5

Writing: -7

Maths: -5

Schools are below floor if they have fewer than 65% of pupils achieving the expected standard in reading, writing and maths combined, and fall below any one of the above progress thresholds (caveat: if only just below on one measure then that score needs to be sig-. Hint: it will be). Oh, and floor standards only apply to cohorts of 11 or more pupils.
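As a rough sketch of that logic, with made-up figures (simplification: it treats any score below its threshold as triggering the floor, skipping the sig- caveat, which as noted is almost always satisfied anyway):

```python
# Floor standard progress thresholds, as listed above
FLOOR = {"reading": -5, "writing": -7, "maths": -5}

def below_floor(pct_rwm_expected, progress, cohort_size):
    """Sketch of the floor standard check. Simplified: ignores the sig- caveat."""
    if cohort_size < 11:
        return False  # floor standards don't apply to small cohorts
    if pct_rwm_expected >= 65:
        return False  # attainment condition met, so not below floor
    # Below 65% combined: below floor if any progress score is under its threshold
    return any(progress[subject] < FLOOR[subject] for subject in FLOOR)

# Made-up school: 60% RWM combined, reading progress of -5.3, cohort of 30
print(below_floor(60, {"reading": -5.3, "writing": -2.0, "maths": 0.4}, 30))  # True
```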

**And now for what has changed**
**1) Estimates - most go up but some go down**
The estimates - those benchmarks representing average attainment for each PAG against which each pupil's KS2 score is compared - change every year. This year most have gone up (as expected) but some, for lower PAGs, have gone down. This is due to the inclusion of data from special schools, which was introduced to mitigate the issue of whopping negative scores for pre-key stage pupils.

Click here to view how the estimates have changed for each comparable PAG. Note that due to the new, lower PAGs introduced for 2017, not all are comparable with 2016.

**2) Four new KS1 PAGs**
The lowest PAG in 2016 (PAG1) spanned the KS1 APS range from 0 to <2.5, which includes pupils that were P1 up to P6 at KS1. Introducing data from special schools in 2017 has enabled this to be split into 4 new PAGs, which better differentiate these pupils. The use of special school data has also had the effect of lowering progress estimates for low prior attainment pupils, which goes some way to mitigating the issue described here. However, despite these reforms, if a pupil has a KS1 APS of 2.75 or above (P8 upwards), a pre-key stage assessment at KS2 **is** going to result in a negative score.

**3) New nominal scores for lowest attaining pupils at KS2**
In 2016, all pupils that were below the standards of the pre-key stage at KS2 were assigned a blanket score of 70. This has changed this year, with a new series of nominal scores assigned to individual p-scales at KS2, i.e.:

P1-3: 59 points

P4: 61 points

P5: 63 points

P6: 65 points

P7: 67 points

P8: 69 points

BLW but no p-scale: 71 points

I'm not sure how much this helps mainstream primary schools. If you have a pupil who was assessed using p-scales, they would have been better off under the 2016 scoring regime (they would have received 70 points); as it stands they can get a maximum of 69. Great.

**Please note: these nominal scores are used for progress measures only. They are not included in average scaled scores.**
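For reference, the new lookup is trivial to encode (mapping exactly as listed above):

```python
# 2017 nominal scores for pupils below the pre-key stage standard at KS2
P_SCALE_SCORES = {
    "P1": 59, "P2": 59, "P3": 59,  # P1-3 share a single score
    "P4": 61, "P5": 63, "P6": 65, "P7": 67, "P8": 69,
    "BLW": 71,  # below the standard but no p-scale reported
}

# A P8 pupil now gets 69 -- one point worse than 2016's blanket 70
print(P_SCALE_SCORES["P8"] - 70)  # -1
```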
**4) Closing the progress loophole of despair**
Remember this? In 2016, if a pupil was entered for KS2 tests and did not achieve enough marks to gain a scaled score, then they were excluded from progress measures, which was a bonus (unless they also had a PKS assessment, in which case they ended up with a nominal score that put a huge dent in the school's progress score). This year the DfE have closed this particular issue by assigning these pupils a nominal score of 79, which puts them on a par with PKG pupils (no surprise there). In the VA calculator, such pupils should be coded as N.

The loophole is still open by the way. Pupils with missing results, or who were absent from tests, are not included in progress measures, and I find that rather worrying.

**5) Standard deviations change**
These show how much, on average, pupils' scores deviate from the national average score; and they are used to construct the confidence intervals, which dictate statistical significance. This is another reason why we can't accurately predict progress in advance.

-----

So, there you go: quite a lot of change to get your head round. It has to be said that unless the DfE recalculate 2016 progress scores using this updated methodology (which they won't), I really can't see how last year's data can be compared to this year's.

But it will be, obviously.