Jon Boagey, NYA operations director, blogs on the difficulty of collecting data on youth services.
According to the Audit Commission, funding from central to local government has dropped by 37% since 2010. Few council departments could have felt the impact more than youth services, which in many cases have seen deeper cuts than the council average.
Those who counter the ‘disproportionate cuts’ argument point to additional income that is not necessarily managed by the council, and argue that stark figures on local authority spending miss major investments in programmes such as the National Citizen Service.
The reality is that both things have happened – cuts to one type of service and additional income for another.
The situation is further complicated because when we talk about youth services there is an assumption that we are talking about a homogeneous service. The change to youth services didn’t begin with the cuts – it began in earnest under the last government, when services were required to move to integrated youth support services. When the cuts hit hard a few years later, and wave after wave of wider service reorganisation followed, the impact on already integrated youth services was even greater. By the end of the 2000s it was almost impossible to compare one council youth service with another – every authority was structured differently.
The second, related issue is that, partly because of this reorganisation, it has become increasingly difficult to collect good data on youth service spending. NYA ceased its audit of local authority youth services in 2008 (a combination of funding reductions and increasingly unreliable data). The only thing left for those who wanted to understand the national spending picture was the DfE’s Section 251 data (each local authority is required to provide the department with financial data on its education spending). Tucked away in this data are a couple of questions on youth service spending (targeted and universal) which some of us have puzzled over for a number of years.
So, news that CIPFA (the Chartered Institute of Public Finance and Accountancy) has reviewed the Section 251 data and declared it ‘not fit for purpose’ doesn’t come as a great surprise. What’s shocking is that it’s not just the youth service questions. It’s the whole lot!
The data is so poor that key spending figures, such as the cost of residential child care or per-head adoption costs, are all considered unreliable. Why? CIPFA identifies three main reasons, chief among them council reorganisation and ‘noise’ (a statistical term for inaccuracy?).
Where does this leave us? NYA is committed to collecting some basic data on services for young people in 2015, but the main question lies with the DfE (which wasn’t mentioned at all in the CIPFA report). At a time when we so badly need good data to understand what’s really going on, why have we let such a key source become so unreliable, and why is it still being published? The DfE needs to give an answer.