8 PERFORMANCE AND OUTCOME MEASURES
8.1 This chapter examines issues around the measurement of performance and outcomes of services provided by TSOs. Senior managers were asked to discuss the mechanisms by which their services were monitored and evaluated by funding bodies. Evaluation of service delivery may be built into delivery contracts, which may require the service provider to establish minimum staffing ratios and deliver specific outcomes for service users. In addition to these measures, some local authorities also conducted annual audits of outcomes to ensure that service providers were meeting their contractual obligations.
8.2 On the whole, respondents were supportive of the principle of measurement, particularly as a means of evidencing the value added and contribution of the organisation. However, a number of issues were raised, which are covered in this chapter.
8.3 Firstly, there were differences in the monitoring and evaluation regimes implemented by funders. Where funding to provide a service had been secured from a local authority, there was considerable variation across different local authorities in approaches to monitoring and evaluation. Secondly, many funders were focused on measuring 'hard' outcomes which did not always capture the added value provided by the TSO. Finally, the use of SROI and other tools to measure the 'added value' of TSOs is examined.
Variability in Measurement and Scrutiny by Local Authorities
8.4 Local authorities fund a variety of services supplied by TSOs. Many of the large national TSOs that participated in the research held contracts with more than half of Scotland's local authorities. Local authorities are therefore important providers of funding for services supplied by TSOs. Interviews with senior managers in TSOs suggest that there is considerable variation across local authorities in the way that services are monitored and evaluated.
8.5 Several of the large national TSOs working across a number of local authorities were critical of perceived inconsistencies in the evaluation criteria used by different local authorities. For instance, one senior manager with a large TSO commented on the variation across local authorities:
Do you know it's so varied and you know we obviously work across 24 local authorities at the moment and it is so varied. You have some commissioners could tell you how much you spent on Post-its probably and others would be struggling you know to really tell you how many service users you're supporting.
Senior Manager, National Health and Social Care Provider
8.6 There was also variation in the perceived proportionality of external scrutiny by local authorities. Some were perceived to apply fairly minimal scrutiny, while others were over-concerned with small details.
8.7 Another large social care TSO working across many local authorities contrasted these inconsistencies with its experience of monitoring by central government statutory organisations such as Her Majesty's Inspectorate of Education (HMIE), which was perceived to 'be pretty solid…they are clear about what it is that they are looking at'. This clarity was contrasted with the bureaucratic structures imposed by some local authorities, which required service providers to supply onerous paperwork and comply with local inspection routines that varied from authority to authority.
Focus on Hard Outcomes and Added Value
8.8 Respondents across many TSOs felt that the structure of funding requirements was focused on measuring 'hard outcomes'. These outcomes were often quantitative measures of activity, such as the number of clients moving into work or training, or the number of hours spent attending training or education activities. A key point made about the focus on hard outcomes was that these did not capture the range and breadth of work being carried out by TSOs, often with hard-to-reach and/or vulnerable clients who needed significant inputs before moving into outcomes such as employment.
8.9 Other issues raised by respondents about the focus on hard outcomes included: that it produced contradictory incentives; that the most vulnerable clients lost out because they were less likely to achieve quick and measurable hard outcomes; and that it did not easily recognise the work done by multiple agencies. It was felt by some that funders did not understand the complexity of the work done by TSOs. However, some organisations used alternative methods to measure 'soft' outcomes, including using existing tools, devising their own tools and carrying out additional research.
8.10 The focus on hard outcomes as a means of evaluating the progress of TSOs towards contractual targets has led to a fundamental change in the landscape within which TSOs operate. There has been greater competition between TSOs, improvements in staff training and monitoring systems, and greater efficiencies in the delivery process. These changes were broadly welcomed by respondents from a range of TSOs. There was, however, also a sense that such changes had come to place a disproportionate value on outcome variables that could be measured, whilst lessening the weight given to soft outcomes that were considered equally important. Although the measurement of learner hours or job sustainability provided relatively simple and convenient variables with which to measure progress towards targets, such measures were not perceived to be an adequate reflection of the added value brought by the work of TSOs.
8.11 Across interviews with a wide variety of TSOs, there was widespread agreement that the added value of the activities performed by the TSO went beyond the services they were contracted to deliver. There was a view that these additional outcomes were not being adequately recognised by funders or, crucially, rewarded. For instance, much of the work of many TSOs is more qualitative, such as wider social impacts on communities and society, preventative work, or work with clients with complex problems not easily captured by the current approach to measuring outcomes. This makes it harder for TSOs to accurately evidence much of the work they are doing, yet it was something many were keen to do in order to provide evidence for all the work that they did as well as to help show clients the progress they had made:
The outcomes of the approach itself is problematic. A lot of the work is not terribly well defined by target driven outcomes it's much more social outcomes which are hard to evidence...it does force the work sometimes into fairly artificial categories in order to meet the funding targets to get money through to evidence you are doing a good job, particularly if the area you are operating in is not statutory duty and most of our work isn't, most of our work is about prevention.... so I think we end up with an issue around trying to squeeze what we think are meeting the need… into very narrow outcomes required for funding....
Senior Manager, Equalities FG
8.12 By not measuring soft outcomes or other ways in which an individual could be considered to have made progress, there was a perception across many TSOs that the full extent of the value they added to the delivery of contracts was not being recognised. For example, a respondent from one organisation felt that there was no mechanism for recognising the way in which employment training for the unemployed produced outcomes that were consistent with national performance targets, the SOAs.
8.13 One effect of existing performance and monitoring approaches was perceived to have been a lack of recognition of the deeply entrenched problems faced by some client groups. During one focus group it was suggested by participants that current monitoring arrangements were encouraging a focus on those clients that required less support, to ensure that outcomes could be achieved and measured more quickly than would occur if the client needed more intensive pre-employment support. The focus on attaining quick, clear results with clients had, it was argued, led to those with some of the greatest need being overlooked in the pursuit of targets. For instance, the outcomes-focused approach encouraged competition between services for groups of clients who could easily achieve measurable 'positive' outcomes:
Everybody is fighting for the same people. Everybody wants to enrol them on their books so they can count them in terms of positive outcomes. That's all they want to do. There's not a real commitment to helping people where they are at because the funding structure actually doesn't ...there's no recognition of the work that is needed, so...it's really important that that is recognised and properly funded.
Senior Manager, Equalities FG
8.14 Hard outcome measures also caused disquiet among TSOs when there was a need to refer a client on to another organisation before a hard outcome had been achieved. The variable nature of a client's progress means that one organisation can spend a great deal of time working with the client before they move on to another provider, where they then quickly achieve an outcome. By structuring and rewarding client engagement in a way that recognises only a successful outcome, there may be a disincentive for an organisation to provide a high level of early intervention if it is known that there will be a subsequent referral to another TSO which will receive credit for the outcome.
The organisations in our network all do slightly different things but at different stages in people's lives. Somebody can move onto another project from here called [X] and we would see that as a really positive outcome in a sense that it's a life's journey that they're going on as well as trying to get employment... sometimes it just takes another project to actually do the finishing off bit and suddenly it just clicks for somebody because somebody else will be able to do something that you can't do.
Senior Manager, Equalities FG
8.15 The key point here regarding performance and outcome measures was that the client journey, from early intervention to employment, may require the involvement of several organisations each with a different specialism in different parts of that journey.
8.16 Part of the difficulty in getting a wide range of funders to recognise the additional outcomes generated by TSO activities was a perceived lack of understanding of the complexity of the issues faced in moving some hard-to-reach client groups closer to work or training. Many respondents felt that certain funders did not appreciate the range or depth of the support that some client groups required. For instance, while time spent with clients did not count towards outcome targets, it was considered a vital step towards meeting those targets:
So part of looking at the outcomes is realising there has to be some pre-learning activities and some pre-learning outcomes…it helps make decisions about the next steps, to inform choice, giving somebody impartial guidance in their own home isn't just a kind of we just went round and said 'do you want to go on that course?' No, you sit down and explore the range of options. It's all work. It's all time. And none of that is actually counted by any funder. Like some of the stuff happens by magic…sometimes it would be nice if they could give recognition to some of the stuff that happens. And I think that a lot of people know it. They are caught by what they have to count..but at the end of the day what are they actually measuring?
Officer, Regional Learning Provider
8.17 However, the focus on hard outcomes was not universal or inevitable among funders. For instance, in spite of pressure on TSOs to work within monitoring arrangements that emphasised hard outcomes, one respondent described working with a local authority funder to accommodate the soft outcomes that arose:
Because we are working with a client group that is furthest away from the labour market there is less pressure on us to achieve hard outcomes as such, it is more important for us to engage with individuals and encourage them to sustain in the programme.
Service Manager, Local Learning Provider
8.18 There have been initiatives to promote greater streamlining of reporting. For instance, BV2 seeks to put greater emphasis on outcomes and to produce a 'more streamlined scrutiny landscape'[44]. The Enterprising Third Sector Action Plan also acknowledges that the evidence of the value added by the third sector through its provision of quality services is incomplete. One route promoted is to build a better evidence base through the use of SROI and by helping the third sector develop its own evidence[45]. The latter point is echoed in the Joint Statement, which promotes greater streamlining of reporting through increased emphasis on self-assessment in third sector reporting to local authorities[46].
SROI and Other Measures
8.19 Most organisations were keen to evidence the full extent of the work they undertook in order to show the value they added. While all respondents were aware of SROI, its use was still fairly limited. However, many organisations were using other means of measuring 'soft' outcomes, including existing measurement tools as well as devising their own tools and carrying out additional research.
8.20 It was clear that there was little use of the Social Return on Investment (SROI) approach to measuring the added value of TSOs. SROI was considered an experimental and little-known methodology that had not gained widespread acceptance among the TSOs that participated in the research.
8.21 However, a number anticipated that they might use it in the future, while one was already beginning to look at the model for one of its services. One organisation had used SROI and was pleased to be able to show the added value of its work. However, there was some scepticism within that organisation (and within other organisations) as to the influence this would have. This was because they thought local authorities did not want to see evidence of the added value of TSOs' work, because the authorities would then be under more pressure to provide additional funding to the TSOs.
I think the difficulty with SROI or something of that ilk is that it's useful for us to promote that in terms of this is what we do for every pound that the [funder] gives us. We had a SROI done recently and its £6.70 something for every pound that we get. I mean that's terrific you know…but in terms of recognition by local authorities and such like I'm not sure that they would want to totally embrace it because I think it could mean that they would need to put more money where their mouth is.
Senior Manager, Local Employability Provider
8.22 One TSO that had incorporated social accounts into its annual report had done so to enable the range and breadth of its work to be fully described. Social accounting was seen as an important tool for gaining wider recognition of the work of the organisation and for improving relations with funders, who were made aware of the range of work undertaken by the TSO.
What social accounts has helped do from our perspective as a company is to show what else we have done on top. Not just in employability but in regeneration and sustainability, you get a lot more in-depth feedback from the customers and it gets shared around. It definitely helps with recognition.
Senior Manager, National Employability Provider
8.23 A number of issues were raised around the SROI model and its potential for use in the sector. For instance, some felt that SROI was not being used to inform procurement decisions; rather, price, as outlined previously, was the main concern. It was also felt that although SROI statements are interesting for an organisation, it is impossible to compare organisations because of differences in client groups, services provided and so on. Others felt that SROI might only be useful once it was clearly established what the added value of the third sector was; suggestions for this included flexibility; responsiveness; being able to deliver services much more rapidly than local authorities; and willingness to provide services beyond those outlined in a contract because of the charitable purpose of TSOs.
8.24 The costs to TSOs of implementing SROI would also have to be considered, in particular because of the perceived complexity of the model and the additional resources required to implement it. One respondent felt there had been more interest in SROI from the private sector than the public sector, and that the Scottish Government needed to be more active in promoting the model.
8.25 A number of organisations were using alternative methods to evidence the work that they did, in particular to capture 'soft' outcomes. For instance, several of the organisations were either already using or trialling existing tools; those mentioned included HGIOCL2 (How Good is Our Community Learning and Development), the Weavers' Triangle and the Rickter Scale[47]. Others were carrying out research or developing their own tools.
8.26 A number of respondents felt that there were challenges in measuring the progress of their organisations' client groups. For instance, client groups were diverse, including some clients with learning difficulties and some whose literacy skills meant they could not complete a text-based evaluation. Another issue raised was that the progress of clients was often not linear in the way sometimes assumed by monitoring tools:
A lot of services have a tunnel vision: we start at point A, we are going to build a point B, we are going to build a point C. And it's going to go like that. With our client group it doesn't work like that. They may start quite confident, and then something happens, they might have something with their condition or whatever it might be, their confidence takes another knock. And they are not going to go straight line. They are going to go here.
Senior Manager, Regional Learning Provider
Key Points
8.27 There was considerable variation across local authorities in approaches to the monitoring and evaluation of services supplied by TSOs, and therefore in perceptions of proportionality.
8.28 There was a perception among TSOs that funders were focused on measuring 'hard' outcomes which did not capture the full range of added value of TSOs.
8.29 Much of the work carried out by TSOs was often with clients who were hard-to-reach and/or had complex issues, was focused on prevention, and had wider social impacts that were less often (and less easily) measured.
8.30 The focus on 'hard' outcomes could produce contradictory incentives, could mean that the most vulnerable clients lost out because they were less likely to achieve quick and measurable hard outcomes, and did not easily recognise the work done by multiple agencies. It was felt by some that funders did not understand the complexity of the work done by TSOs.
8.31 At the same time, the focus on 'hard' outcomes had resulted in radical changes within TSOs, including improvements in monitoring systems and staff training, and greater efficiencies in the service delivery approach.
8.32 Most organisations were keen to evidence the full extent of the work they undertook in order to show the value they added. While use of SROI was still fairly limited, many organisations were using other means of measuring 'soft' outcomes, including existing measurement tools as well as devising their own tools and carrying out additional research.