
Earlier this week, I had the pleasure of speaking at Wonkhe’s conference on Augar, in the final session of the day, alongside Rachel Wolf, on the politics of the report. Here are my remarks.

On the day the Augar report appeared, I said on the radio that it felt like Christmas Day. To a higher education policy wonk, over 200 pages of tightly-argued and long-awaited text on how to reform higher education is a rare but exciting gift.

The Augar proposals are a coherent package, largely evidence based and certainly worthy of discussion. When we evaluated the report against our original submission to the call for evidence, we found it did a good job of addressing nine of our 10 points in full or in part, and we scored it 6.5 out of 9 – which is a First (though call that grade inflation if you like).

Maintenance grants are a good example. We’ve long called for their return because it is morally wrong to expect the poorest entrants to emerge with the biggest debts. The Augar report listed our support for this (p.191) alongside the support of other institutions and made a clear call for the return of grants that was very welcome.

So I found it depressing how some senior political figures on both sides of the political spectrum tried to close down a decent conversation on Augar immediately, indeed before it had even been published in some instances.

Everyone who condemns the Augar proposals out of hand should have to pay penance if they cannot say what they would do instead – and how they would pay for it.

Philip Augar himself has said the report should be implemented in full. But this is likely to be the triumph of hope over expectation. There are lots of reasons why it is optimistic to think a report with over 50 policy recommendations won’t be subject to a pick-and-mix approach by policymakers.

For example, the Augar report is not the Post-18 Review; it is feeding into the Post-18 Review, which may come to different conclusions. Robbins, Dearing and Browne were all cherry-picked, and they had some cross-party support, whereas Augar was an explicitly Conservative document.

Moreover, even the biggest fans of the report have found some weak spots. I have yet to meet a single person who thinks the proposal to stop funding foundation years makes sense – just look at our recent blogs for more evidence.

The risk, especially given the weird political times in which we live, is that policymakers opt to do the bits that save money but reject the bits that cost money. As a senior policymaker said at a HEPI event the other day, the recommended freezing in the unit of resource between now and 2022/23 is the one thing most likely to happen. That doesn’t need legislative change, would help the public finances and won’t find much opposition outside universities.

But let’s get back to the title. It is undoubtedly true that Augar depends on politics. So let’s look at the politics. Boris Johnson and Jeremy Hunt are relatively unusual among Tory MPs as they both have university campuses in their constituencies (though the number of students is far greater in Johnson’s seat than Hunt’s). As a former London Mayor and a former Secretary of State for Health, they each have a good understanding of the importance of our higher education and research base.

So we can but hope, in particular, for a better approach to international students. It is 10 years since the Conservatives adopted a net inward migration target of tens of thousands and decided to plonk students within it. That needs to change, as do the restrictive post-study work rules.

This is even more urgent if Augar does happen, because the extra fee income from more international students could help smooth over any funding challenges. (Of course, none of this changes the fact that the greatest benefits arising from the presence of international students are non-financial.)

One final point about student loans, which links back to the Augar report’s specific recommendations. In the current leadership election, Jeremy Hunt has promised to wipe out the student loans of entrepreneurs and to reduce the interest rate on student loans (which the Brexit Party have also been flirting with).

Such changes are regressive as well as expensive, but they might also be politically smart, because the politics of student loans now centre more on the repayment terms than on the fee level and because experience abroad shows such pledges can make an electoral difference (just look at the New Zealand election of 2005).

Either way, this is very different terrain to the Augar report’s proposals to keep interest as it is once people have left university and to extend the repayment phase from 30 to 40 years.

Indeed, Jeremy Hunt’s promises combined with Jo Johnson’s role in Boris Johnson’s campaign suggest the higher education debate may already be moving away from the idea of implementing the major recommendations in the Augar report.

The post The future for Augar is political appeared first on HEPI.


It is not beyond the realms of possibility that there will be a general election before the year is out – perhaps as early as September. I once thought a second referendum was more likely than a general election but, at this moment, it seems I was almost certainly wrong.

It is also entirely possible that neither an election nor a referendum occurs, but the new Prime Minister will have to complete a huge obstacle course successfully in order to avoid it. Some commitments made during the current leadership election are hard to deliver with the existing make-up of Parliament.

So it is an apposite moment to ask if universities and other higher education institutions should be doing anything more now to prepare for the possible election. The answer is a resounding yes.

One thing institutions could quickly and easily do is to invite all the likely candidates for the mainstream political parties onto campus for a briefing and a tour. The goal should be to update them on current issues affecting the sector (international students, financial sustainability, demographic changes and so on).

When done well, the impact of such visits can be much greater than it seems at the time: they can make prospective lawmakers lifelong friends of the sector. (This may sound unlikely, but some of the most interesting things I have ever done were the result of being invited, as an election candidate, on one-off visits to places I would not otherwise have seen. Nine years on, I still recall them clearly.)

Don’t invite only the person most likely to win. Very many people who end up as MPs have stood elsewhere beforehand and will always remember the experience. They may end up representing somewhere without a university where a majority of people believe too many people reach higher education. Giving them a taste of life in a university today can provide another set of influences – and may prove to be a good long-term investment. (Besides, in the current febrile political environment, the person you are certain will win may not be the person who does…).

A second constructive job that can be done now is to build a really accessible, credible and very short document explaining all that your institution does for its area. You probably already have one – but how widely has it gone out?

Any aspiring politician on the stump needs cold, hard, memorable facts to shower around. That the local higher education institution employs W people, attracts X students, supports Y businesses and ensures Z graduates a year stick around to work in the local area – these are the sort of killer facts that politicians love. But if they take up more than one side of one sheet of paper, they are unlikely to be read and they certainly won’t be assimilated or remembered.

Thirdly, if there is an election, there may not be much time for anyone to prepare properly. So there may be fewer public events than usual. Can you put someone in charge now of organising a hustings for students and staff, so that you’re ready to roll the arrangements out the minute an election is called?

Not all these ideas will work for every institution. There may well be better ideas out there. But we have a brief moment when we have the luxury to prepare for something that we know may well occur. It would be unwise to waste it.

The post Three ways universities can prepare for a possible general election appeared first on HEPI.


A version of this guest article was originally presented by Iain Mansfield at the HEPI Partner Policy Briefing Day.

“I can call up 50 academics who will tell me how to design the perfect pension system, but I can’t find any who can tell me how to improve the one we have now.” – New Labour minister

Like all such statements, the words aren’t literally true, yet successfully convey an important message. There is too often a disconnect between the language spoken by academics and that spoken by policy makers, a disconnect which can make it harder for the world-class research taking place in our universities to influence policy.

As Paul Johnson, Director of the Institute for Fiscal Studies, wrote in the Times earlier this year, “Why do organisations like the IFS exist at all,” when “there are thousands of economists and many thousands of other social scientists employed by UK universities?” The answer, as he goes on to say, is that “there remains a yawning chasm between the work of most academic social scientists and the work of government” and so there is room for organisations, such as the IFS – or, indeed, HEPI – to bridge that divide.

Most policy makers, whether civil servants or politicians, really do want evidence. And think-tanks speak our language, the best of them combining world-class policy insights – both their own and from the wider literature – with an understanding of the political and policy pressures that form part of government policy making. So how can university researchers learn from this, to allow their research to have a greater impact on policy making, without diminishing its world-class nature?

Push and Pull Barriers

Structural and cultural issues, on both sides, create barriers even where there is great willingness to engage.

Academics are sometimes surprised to learn that civil servants don’t have access to peer-reviewed journals. Even if they did, reading original academic literature is not part of the broader culture: the typical civil servant would be far more likely to read the latest report by HEPI, the IFS or the Sutton Trust. This combines with a broader lack of understanding of the academic landscape. Of the hundreds of researchers in a field, who are really the leading experts?

Because the policy cycle is complex, there are times when government will be very open to new ideas and evidence, and others when parliamentary arithmetic or broader politics means it is focused on delivery – an idea that arrives at the wrong time is unlikely to get traction. And policy makers can be prone to looking for the silver bullet, the magic answer that a minister can announce in a forthcoming speech, whereas research often gives much more complex answers.

From the academic side, influencing policy is an entirely separate skill from carrying out research – and one individual may not have the talent or experience for both. This can be compounded by the fact that, in a ‘publish or perish’ culture, influencing policy may not be seen as a priority, reducing the opportunity for younger researchers to develop these skills.

More broadly, while officials can underestimate the complexity of research, too often academics underestimate the complexity of policy making, presenting solutions devoid of any understanding of the broader fiscal, societal or political pressures essential to implementation. There can also be a certain naivety about influencing: thinking that if research is clearly right, presenting it once to a minister or senior official will be enough. In reality, influencing requires long, hard graft to win over not just a single individual, but the broader ecosystem of advisers, politicians, select committees, think tanks and pressure groups that help determine which policies do and don’t get taken forward.

Identify-Engage-Embed-Impact

There is no unique way to have impact, but one way to think about the process is the following:

  • Identify: Who needs to know about your research? Which civil servants, politicians, MPs, select committees, think tanks? Who is already interested in the subject? Is there public concern? A burning platform? Are there any broader policy initiatives – e.g. Industrial Strategy, Northern Powerhouse – that it could be linked to?
  • Engage: Find the initial connection. Get a meeting with the minister/civil servants/MPs. Run a parliamentary briefing event. Invite an official to speak at a conference. Respond to government consultations. Have a policy blog to discuss your research and comment on relevant topical issues. Make sure your media engagement is effective – whether that’s press releases, interviews or social media.
  • Embed: Make yourself indispensable. Build strong personal relationships with officials at all levels. Run a regular series of engagements to which officials, ministers, parliamentarians and think tanks are invited. Provide the secretariat for an All-Party Parliamentary Group. Offer to lend a PhD student or post-doc to the relevant government department at a busy time for them (e.g. a consultation period). Regularly comment on government announcements in your research area and make sure the department and press are aware. Bid to carry out government-sponsored research.
  • Impact: What impact are you aiming for? Greater awareness? A change in public attitudes? Better targeted policies? New laws / regulations? A change to existing regulations? Targeted investment or research programmes? Knowing your goal will give you a greater chance of succeeding.

One Effective Operating Model: The Policy Centre

Although some academics are capable of single-handedly producing world-class research and having an impact on policy, this is relatively rare. ‘Hire Anna Vignoles’ is good advice, but not very generalisable. So what can a university do structurally to ensure that influencing with impact can occur systematically?

One effective option is to establish a policy centre, such as the Centre for Competition Policy at UEA or the UK Trade Policy Observatory at Sussex. Both are highly effective examples that demonstrate how researchers can dramatically increase their influence within government.

Centres such as these can establish a clear and recognisable brand to which research outputs, policy papers and events can be aligned. Just as policy makers look for the next think tank report, they will look for the next output from the centre. The director of such a centre can support this by establishing long-term relationships with (often transient) officials, MPs, and other policy makers, providing a familiar face and reliable source of policy help. Research outputs rarely fit perfectly into policy-shaped holes – but a centre finds it much easier to field an academic whose research does address the problem at hand.

Internally, centres provide a setting where policy engagement and influencing are genuinely valued. This enables early-career academics to get involved and learn the necessary skills, and provides a management framework in which policy impact can be appropriately recognised. It allows a division of labour: while the centre as a whole may have a clear mandate to influence policy, individual academics within it can take a greater or lesser role in that endeavour, and skilled administrative staff can be appointed to support engagement and influencing activities. Overall, a centre can be an efficient way of mobilising research that enables many of the influencing activities set out above.

I am sure there will be a continued role for think-tanks such as HEPI and the IFS for many years to come. But if more of the world-class research taking place in universities can have a greater impact on policy, we will all be better off.

The post Bridging the Research-Policy Divide appeared first on HEPI.


The results of this year’s school exams will be announced in a few weeks’ time. But as recently reported in the TES, Times and Telegraph, different examiners can legitimately give the same script different marks. As a consequence, of the more than 6 million grades to be awarded this August, over 1.5 million – that’s about 1 in 4 – will be wrong. But no one knows which specific grades, and to which specific candidates; nor does the appeals process right these wrongs. To me, this is a ‘bad thing’. I believe that all grades should be fully reliable.

When I speak about this, someone always says, “That’s impossible”.

No.

It is possible, and this blog identifies 22 ways to do it. None is perfect; some are harder to implement, some easier; some might work better in combination. So imagine you have total power. Which would you choose, and why? Or would you prefer to maintain the status quo? Answers, please, as comments, and if you can think of some other possibilities, please post those too.

Firstly, a few words of context. Even if marking is technically of high quality (as it is), it is necessarily ‘fuzzy’, with some subjects (such as History) being more fuzzy than others (such as Physics). Because fuzzy marks can straddle a grade boundary, an original mark and a re-mark might be on different sides of a grade boundary. That’s why grades are unreliable, as illustrated in Figure 1.

Figure 1: Fuzzy marks. Four candidates are given marks as shown by each X; other legitimate marks are indicated by the ‘whiskers’. The grades awarded to candidates A and B are reliable; those to candidates C and D are not. The length of any whisker is one possible measure of the examination subject’s fuzziness.

To award reliable grades, we need to ensure that any original grade is confirmed by a re-mark, even though marking is fuzzy. The BIG QUESTION, of course, is “How?”. This blog suggests some answers.

The possibilities cluster, as shown by the sub-headings; several are variations on a theme. Each possibility has two elements: the first specifying how the grade is determined from the original mark; the second defining what happens when the script is re-marked as the result, for example, of an appeal. Since this is a blog, I will be brief; more detail is available here.

A quick note on the current process, against which any alternative can be compared:

  • A script is marked once, with the grade determined by mapping the mark onto a grade scale.
  • A re-mark is allowed only if the original mark can be shown to be incorrect – attributable, for example, to a marking error – with the re-mark determining the new grade.

Possibilities based on the current process

1. Change the policy for appeals. Allow re-marks on request, rather than requiring evidence for a marking error.

2. Double marking. In the belief that ‘two heads are better than one’, mark every script twice, and award the grade based on, say, the average, with appeals as (1).

Possibilities intended to eliminate fuzziness, with the grade determined by the given mark, and appeals as (1)

3. Re-structure exams as unambiguous multiple choice questions.

4. Tighter mark schemes, so that even essays are given the same mark by different examiners.

5. Better-trained examiners, so that examiners are all ‘of the same mind’.

6. Just one examiner, so ensuring consistency.

Possibilities that accept that fuzziness is real, but do not use a measure of fuzziness on the certificate; the grade is determined by the given mark, and appeals are as (1)

7. Review all ‘boundary straddling’ scripts before the results are published.

8. Fewer grades. The fewer the grades, the wider the grade widths, and the lower the likelihood that a grade boundary is straddled.

9. Subject-dependent grade structures. Physics is inherently less fuzzy than History, so Physics can accommodate more grades than History.

Possibilities that accept that fuzziness is real, and implicitly or explicitly use a measure of fuzziness on the candidate’s certificate

Figure 2: Grading according to m + f. A script is originally marked m = 58; the grade is determined by m + f = 62, this being grade B. There are no marking errors, and the script is re-marked m* = 61. As expected, the re-mark is within the range from m – f = 54 to m + f = 62, so confirming the original grade.

10. One grade (upper). The certificate shows one grade, determined by m + f, as illustrated in Figure 2 for an examination subject for which the fuzziness f is measured as 4 marks. If the script is re-marked m*, and if a marking error is discovered, a new grade is determined by m* + f. If no marking errors are discovered, it is to be expected that any re-mark will be different from the original mark, and within the range from m – f to m + f. Since f has been taken into account in determining the original grade, a fair policy for appeals is therefore that:

  • A re-mark should be available on request (and I would argue for no fee).
  • If the re-mark m* is within the range from m – f to m + f, the original grade is confirmed.
  • If the re-mark m* is less than m – f or greater than m + f, a new grade is awarded based on m* + f.

If f is determined in a statistically sound way, new grades will be awarded only very rarely – which explains why this approach delivers reliable grades.
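
To make the mechanics of possibility 10 concrete, here is a minimal sketch in Python of grading by m + f together with the appeal rule above. The grade boundaries, the value f = 4 and the example marks are illustrative assumptions drawn from Figure 2, not real awarding-body parameters.

# A minimal sketch of possibility 10: award by m + f, confirm on re-mark.
# The grade scale, f = 4 and the marks are illustrative assumptions only.
GRADE_SCALE = [(70, "A"), (60, "B"), (50, "C"), (40, "D")]  # hypothetical boundaries

def grade_from_mark(mark):
    """Map a mark onto the hypothetical grade scale."""
    for boundary, grade in GRADE_SCALE:
        if mark >= boundary:
            return grade
    return "U"

def award(m, f):
    """Possibility 10: the certificate grade is determined by m + f."""
    return grade_from_mark(m + f)

def appeal(m, m_star, f):
    """A re-mark m* within [m - f, m + f] confirms the original grade;
    otherwise a new grade is awarded based on m* + f."""
    if m - f <= m_star <= m + f:
        return award(m, f)       # original grade confirmed
    return award(m_star, f)      # new grade from m* + f

# Worked example from Figure 2: m = 58, f = 4, re-mark m* = 61.
print(award(58, 4))        # m + f = 62, so grade B
print(appeal(58, 61, 4))   # 61 lies in [54, 62], so grade B is confirmed

On these assumptions, a re-mark changes the certificate only when it falls outside the fuzziness range – which is precisely the property that makes the awarded grades reliable.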

11. One grade (lower). The certificate shows one grade, determined by m – f; appeals as (10).

12. Two grades (upper). Award two grades, determined by m and m + f; appeals as (10).

13. Two grades (range). Award two grades, determined by m – f and m + f; appeals as (10).

14. Two grades (lower). Award two grades, determined by m – f and m; appeals as (10).

15. Three grades. Award three grades, determined by each of m – f, m and m + f; appeals as (10).

16 – 21. Variants of each of (10) to (15) using αf. The parameter α defines an ‘adjusted’ mark m + αf, and can take any value from −1 to +1. Three special cases are α = 1 (grading according to m + f), α = −1 (grading according to m − f), and α = 0 (grading according to m, as currently). The significance of α is that it determines the degree of fuzziness taken into account, so controlling the reliability of the awarded grades, from the same reliability as now (α = 0) to very close to 100% reliability (α = 1 or α = −1). The policy for appeals is as (10); a worked illustration follows possibility 22 below.

22. No grade: declare the mark and the fuzziness. Solutions 10 to 21 – each of which is a particular case of the generalised m + αf concept – represent different attempts to map a fuzzy mark onto a cliff-edge grade boundary so that any marks left dangling over the edge do as little damage as possible. This solution is different: it gets rid of the cliff – the certificate shows the mark m, and also the fuzziness f for the examination subject. The policy for appeals is as (10).
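
As the worked illustration promised above, take the Figure 2 values of m = 58 and f = 4: α = 1 grades on the adjusted mark 58 + 4 = 62 (grade B in Figure 2); α = −1 grades on 58 − 4 = 54; α = 0 grades on the raw mark of 58, exactly as now; and an intermediate value such as α = 0.5 grades on 58 + 0.5 × 4 = 60.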

None of these is perfect; all have consequences, some beneficial, some problematic. To determine the best, I believe there should be an independently-led study to identify all the possibilities (including maintaining the status quo), and to evaluate each wisely.

That said, in my opinion, possibilities 2, 3, 4, 5 and 6 are non-starters; I include them for completeness. I consider the best to be 22, with the certificate showing not a grade, but the mark m and also the measure f of the examination subject’s fuzziness. If grades must be maintained, for GCSE, AS and A level, I choose 10 (grades based on m + f), for that delivers reliability as well as assuring that no candidates are denied opportunities they might deserve. But for VQs, T levels, professional qualifications and the driving test, my vote goes to 11 (m – f ) – I find it reassuring that plumbers, bricklayers, brain surgeons, and all those other drivers, really are qualified.

What do you think?

And if you believe that having reliable grades matters, please click here

The post Students will be given more than 1.5 million wrong GCSE, AS and A level grades this summer. Here are some potential solutions. Which do you prefer? appeared first on HEPI.


Today’s guest blog, from Dr Greg Walker, CEO of MillionPlus, is a full and thoughtful critique of the Augar review, and we urge people to read it carefully. 

As someone with experience of independent reviews of higher education, I understand the energy and effort that panel members and officials put into such exercises. MillionPlus, as a mission group, contributed constructively to the Augar review process, submitting three tranches of written evidence. MillionPlus welcomed the report’s recommendations on enabling more flexible study and the proposed return to maintenance grants. Indeed, MillionPlus is unique as a mission group in that our membership spans the HE and FE sectors: our universities are so closely connected to the college sector that no fewer than 13 FE colleges are part of their university structures in England and Scotland. These structures are living, breathing evidence that clear synergies and a sensible division of labour between FE and HE are possible.

The report itself is a very particular take on post-18 education. Perhaps this was inevitable given that it is the only government-commissioned review in England ever to have looked at post-18 provision at all levels and in all contexts – contrary to the claim on page 5 that the Robbins Review had an equally broad scope. Our agreement with some key recommendations does not prevent us from being a candid friend concerning flaws in the report, most obviously the manner in which it seeks to contrast higher education and further education. Others have rightly addressed the distorted understanding of university finances in the report, points I won’t rehearse here.

The framing of the report: ‘care and neglect’

From the outset, the report draws a binary contrast in terms of ‘care and neglect’ for universities and colleges respectively (p.5), reflecting the highly challenging time faced by the FE sector since the austerity period. Yet as I have noted elsewhere, the post-2012 period has certainly not been a ‘golden age’ for a significant part of the HE sector, mainly modern universities, which have faced the negative impact of changes to government policy in a range of areas, from initial teacher and nurse education to the rapid collapse in part-time HE study and tighter visa controls.

The report’s omission of this point and the wider context of the changes to investment in HE and FE during the last nine years is an example of a selective use of evidence reinforcing the ‘care and neglect’ narrative. Yet as Lord Bob Kerslake, a former head of the civil service, set out in January, this narrative is flawed because the context of the funding changes in 2012 and beyond was not so clear cut:

A view [from Augar] seems to be forming that… universities have got too big and FE and colleges too small. For me, this is a completely specious trade-off. The enormous cut in FE funding…. was a consequence not of resources being transferred from FE to universities but of decisions made in the austerity period. The savings made in HE from shifting the burden to student loans arguably mitigated some of the potential impact of [FE] cuts. Universities have grown because more young people want to go to them.

Kerslake’s remarks call into question the HE/FE ‘trade-off’ that is central to the report. The back-to-back presentation of chapters on higher education and further education in the Augar report, without the context stated by Kerslake, seems designed to fuel an artificially contrasting view of the two sectors. Indeed, FE and HE are treated in the report as parts of a single generic ‘tertiary education’ sector, without noting that it is the university sector specifically that is one of the UK’s leading export industries (with an £18bn annual export value), delivering jobs and wider benefits for communities and the national economy.

How the report’s framing drives the evidence presented

The care/neglect framing of the report tends to drive the selection of the evidence presented in it. In the Higher Education chapter, five pages of text are spent examining the value of courses against earnings data for people in their twenties – a lagged data source with many issues and omissions that does not measure the full benefit of a degree. There is also a critique of the shifts in university provision that took place as a direct response to the major funding changes introduced in 2012, outlined above. Other areas of political controversy, as varied as Vice-Chancellor pay and degree classification issues, are given airtime with a critical edge.

Yet in the FE section of the report a critique of existing provision by colleges is limited to a couple of sentences (p.125) and is attributed to the government’s funding regime, not to college management, in contrast to the way in which the panel finds fault in higher education. The sharp increase in pay for FE college principals since 2010, despite their financial straits, is not even noted let alone critiqued in the report. Concerningly high drop-out rates for many FE courses do not feature at all in the FE section, yet non-continuation rates in HE are critiqued in some detail in the HE chapter.

The report also does not outline the rationale behind the lamented reduction in the adult education budget in the last nine years. I make no judgement here about the rights or wrongs of austerity, noting only that the report again takes decisions taken by government departments out of context by failing to connect the relevant facts to the rationale at the time. The size of the adult education budget was reduced in this period ostensibly because of the impact of the austerity programme on departmental budgets combined with two additional reasons, namely that:

  1. The UK Government was already funding, at a rapidly rising rate, apprenticeships at the same levels of skills (Levels 2-3) and for largely the same group of people as DfE funds for classroom-based adult FE. One fact demonstrates this: apprenticeship starts in the five years to 2009/10 totalled just 1.1m, but in the five years to 2015/16 the number of starts had more than doubled to 2.5m. The government believed that on-the-job training via apprenticeships would have advantages over classroom learning in FE. Funding was switched and colleges were encouraged to become apprenticeship providers in response, which only some have managed successfully.
  2. There was a substantially reduced cohort of potential adult learners to educate in FE, because of the rapidly rising proportion of young people leaving schools and colleges at 19 having already achieved the requisite qualifications at Level 2 and/or Level 3. This left a smaller pool of 20-35 year olds requiring such programmes at college, as:
  • the proportion of 19-year-olds attaining good Level 3 qualifications increased from 42% in 2004 to over 60% by 2016, while
  • the proportion of 19-year-olds gaining good Level 2 qualifications (5 A-C GCSEs) increased from 66% in 2004 to 87% by 2016.

Though Augar cites similar data (p.23), the report does not link these data to the austerity-driven spending prioritisation forced on the Department for Education at the time. Indeed, the report surprisingly draws no connection at all between the drop in the adult education budget and the increase in investment in apprenticeships at any point in its 212 pages, despite having lengthy sections devoted to both FE and apprenticeships. As a piece of analysis this is obviously lacking, raising suspicions that the report was, in its own (telling) words on p.122, making “a case for change” rather than presenting the evidence neutrally and basing recommendations strictly on the evidence uncovered.

Augar’s selective presentation of international evidence

International evidence is cited in the report where it appears to support the panel’s recommendations, such as the data cited on international funding comparisons and the relatively low take-up of ‘standalone’ Level 4 and 5 qualifications – though, misleadingly, not Level 4 and 5 study taken in the round. Yet in its discussion of student retention the report omits to note that the proportion of UK students who progress to complete their degrees is relatively high, certainly when compared with other European nations, where we are viewed as an exemplar of good practice on student retention. The report also fails to recount that student satisfaction is high in the UK compared with other comparable nations and that our personal tutoring and pastoral support systems are regarded abroad as a model to emulate.

The report also states boldly that the UK has “one of the highest university participation rates among OECD countries” (p.20), while omitting the highly pertinent facts that our overall higher education participation rate for entry is 1 percentage point below the OECD average (see p.10 here) and that our degree-level participation rates are still below those of 19 other developed OECD nations, with participation rates in some countries hitting 70%. The report implies that the UK is an outlier when it comes to participation, with ‘too many students’ at university, when the comparative data doesn’t support this narrative.

Universities should ‘buck the market’?

The section of the report on markets in higher education treads carefully, given the UK Government’s stated policy of encouraging greater HE marketisation. The candid admission that there is now a market for HE, with all its pressures and unintended consequences, is not surprising. Yet the report’s injunction that university leaders should do their best to ignore competition by not marketing their HE programmes too vigorously surely is (p.78). The recommendation that universities should, in effect, attempt to ‘buck the market’ in this and other ways isn’t a realistic one for university leaders charged with keeping their educational charity a ‘going concern’ for the benefit of prospective students in their locality.

The panel might have instead made helpful recommendations that put limits on the impact of some for-profit providers in their efforts to expand their recruitment of students in low-cost subjects, something again omitted in the report’s critique of HE provision. For example, the panel could have made a recommendation to fill the still unaddressed hole in the regulatory framework established under the HE and Research Act 2017 that allows small private HE providers to escape regulation of their provision because of the OfS’s (understandable) decision to not use the ‘basic category’ of OfS registration. This was a missed opportunity.

Conclusion: The FE/HE ‘trade-off’ – Augar’s strategic misstep

It is widely noted that the Augar Panel was established at the behest of the Treasury and No.10 Downing Street, against the advice of the Department for Education. I would argue that the fundamental misstep made by Augar was to concede, wrongly, that investment in HE should be squeezed in order for FE funding to be enhanced (the ‘trade-off’). To make this concession only months after Theresa May declared the ‘end of austerity’ at the Conservative Party conference seems a senseless nod to a now discarded fiscal agenda. This strategic error was illustrated by the fact that, within days of the Augar report being published, both Theresa May and her possible replacements were proposing a massive loosening of fiscal policy, with pledges worth tens of billions of pounds both to boost public spending and to slash taxes. These pledges include an expectation – which may well be granted in the coming days – that the education budget in England will receive an additional £3bn a year.

Yet instead of arguing the case for proper investment in both HE and FE, Augar fell into the trap of an HE/FE ‘trade-off’ that the Treasury may seek to capitalise on, to the disbenefit of future cohorts of students. The risk for the FE sector from this tie-in is that any substantial additional funds for colleges are tethered to a drop in the HE fee cap or an extension of the student loan repayment period – measures that may not be deliverable in the current hung parliament – rather than being justified independently of what the level of investment in higher education students should be. The Association of Colleges itself clearly realises this point, and is now rightly arguing that the country needs to increase the share of GDP we invest in education overall. Either way, the Augar Panel’s vision faces major obstacles to its implementation, some of which are of its own making.

The post Does Augar present ‘evidence-based policy’, or ‘policy-based evidence’? appeared first on HEPI.


A guest blog kindly contributed by Paul Woodgates of PA Consulting.

It’s hard to remember a time when universities faced so many unknowns. Will the Augar Report recommendations ever become reality? How much money will the sector have? What will Brexit mean? Will the economy nosedive? What kind of government will we have by Christmas?

And we can overlay on to those questions all the ‘usual’ uncertainties facing a university. Will students apply for our courses? What will other universities do? Will overseas students be attracted elsewhere? How will we do in the REF?

Uncertainty is the new normal

Our annual survey of UK vice-chancellors shows how much this is affecting university decision making. But contrary to the myths about ivory towers, universities have always been good at adapting to changes in the external environment. Part of the huge success of the sector has been its ability to respond – as it did to the relaxation of the Oxbridge monopoly in the early nineteenth century, to Robbins in the ’60s, to mass expansion in the ’90s, and to competition and fees in the last few years. And our survey shows that most vice-chancellors remain optimistic that they will find a way through again.

They are nevertheless asking themselves how they can create new student offers, shut down old ones, recruit new staff, remodel research capability, or make expensive changes to campuses when so much is unknown. There are three principles that might help.

1. Financial sustainability is a fixed point

Without financial sustainability nothing else can be done. Vice-chancellors know that income must cover costs and create the ability to invest for the future, and that means having a full understanding of the drivers of both revenue and spending. But fewer than a quarter of the vice-chancellors we surveyed thought they had achieved financial resilience. That makes efficiency critical, and everything from payroll to developing new programmes must be focused on the mission of the institution and done in the most effective way.

2. Manage risks – but be bold

Now is surely the time to innovate. And we are seeing innovation all over the sector – new academic partnerships, new modes of delivery, new programmes, new approaches to delivering professional services. And yet our survey revealed many vice-chancellors have a nagging feeling that these initiatives are often either too peripheral to the core business of the institution, or too slow to have the desired impact.

Innovation will only make a difference if there is an understanding that not all new ideas will succeed. Universities must be willing to embrace a portfolio approach to developing new ways of working – whether teaching, research or back office – and accept that some ideas will be runaway successes, some will be worthwhile but no more, and a few will fail and should rightly be consigned to the lessons learned file.

3. Agility is the key: be ready to change, and change again

It will become increasingly vital for universities to understand what students, funders, staff and the public value, and adapt to that better and more quickly than their competitors. Agility will perhaps become the single greatest determinant of future success. But it cannot mean simply responding to the latest fad. Agility must be seen in the context of a clear sense of purpose, a statement of direction and the values within which the university operates.

Agility will mean having data that provides an understanding of what attracts and retains students. It will mean having governance processes in place to decide quickly and confidently which changes to make and which to avoid. It will mean ensuring that spend on technology and estates specifically considers the need for repurposing in short timescales. It will mean having an organisational model that allows, for example, multi-skilled pools of professional services staff to be deployed wherever the work is. Above all, it will mean a culture in which flexibility is the norm and rigid boundaries of discipline, organisation and hierarchy are no more.

Here comes the future!

With no sign of uncertainty going away, the hidden enemy now is strategic drift. To combat that drift, university leaders should focus on financial sustainability, develop a new approach to risk taking, and create institutions that are much more agile. Their ability to lead and cajole their universities into following them on that journey will, perhaps, be the biggest determinant of future strategic success.

The post Steering a course through the chaos appeared first on HEPI.


At a recent HEPI roundtable on the future for global research, the focus of discussion was the major recent study by Elsevier, produced in partnership with Ipsos MORI, which set out three future research scenarios – Brave open world, Tech titans and Eastern ascendance – and explored how they could play out over the medium to long term. The HEPI roundtable brought together a group of senior leaders and early-career researchers to test the hypotheses laid out in the report and determine the impact this new world of research, and the key drivers of funding, could have on their institutions.

The full report is available at: https://www.elsevier.com/connect/elsevier-research-futures-report

1) Brave open world

Globally, state funders and philanthropic organizations have joined forces and pushed through the creation of platforms where the research they fund must be published open access (OA). But the form of that OA varies by region; Europe is mostly gold, while North America and Asia Pacific are generally green. Rapid advances in artificial intelligence (AI) and technology mean these platforms are flourishing – they are interoperable, and content is easy to access and showcase. As a result, there are fewer subscription-based journals.

A number of broad science, gold OA megajournals with low article publishing charges exist to publish content not captured by open platforms. Major society journals remain active, many operating a gold OA model, but struggle for manuscript submissions, so revenue is low. Preprints thrive in this world and are linked to the final article versions, which are still recognized as the authoritative version. Researchers benefit from access to data in a variety of ways, for example, via bite-sized publications and dynamic notebook-style articles.

The advances in AI and technology have also provided new methods of generating and communicating results. While research quality is still an important measure of performance, journal publication plays a diminishing role in determining a researcher’s career progress. Increasingly, research is assessed against agreed societal impact standards.

2) Tech Titans

Industry and philanthropic foundations are the principal research funders, with far-reaching consequences for the research community. Some are feeling this impact more than others, for example, academic institutions with a focus on life sciences struggle. There have been significant advances in machine learning with sophisticated artificial intelligence (AI) products driving innovation. This has led to large technology and data analytics companies becoming the curators and distributors of knowledge.

Research articles and journals play a much reduced role, with preprint servers and analytical layers over online content replacing some of their traditional functions. The article has become atomized with each part of a research publication created and hosted separately, but all elements are linked. Large technology companies have created a market shift toward AI-driven evaluation of these research outputs; however, current systems have proved susceptible to manipulation and there is pressure to increase their security.

Not all aspects of research are open; for example, where industry is funding research, key research data is not always made available so companies can retain a competitive and financial advantage.

For researchers, the developments in technology and consolidation of analytical services have revolutionized the way research is performed, enabling many to work independently of institutes and even funders – “science-as-a-service” is emerging as barriers to entry are reduced or removed.

3) Eastern Ascendance

China’s desire to transform into a knowledge-based economy has led to heavy public investment in research and development (R&D) and the systems and processes to capitalize on this in industrial and economic terms. As a result, China’s level of R&D funding is proportionally much higher than the West’s and continues to grow, changing the shape of scientific research. The sheer volume of investment by China, and other research nations in the region, has made the East a magnet for international researchers.

A lack of global alignment on grand challenges has resulted in inefficiencies in the international research system. Open science practices have been adopted in some countries and regions, but not all. Journal publishing is a mixed model of open access (OA) – gold and green – and subscription publishing. Individual research outputs can be accessed separately, but are always linked to the final article; for example, research findings, data and code.

Governments, industry and other research funders compete for scientific advantage through the controlled distribution and trading of data. When data is believed to hold no further commercial value, it is released so it can be linked back to its related research outputs.

Elsevier are gathering views on what researchers, policy makers and others in different parts of the world think the future will be like: https://www.menti.com/1ptw7ka6qf

Thoughts from our roundtable

At the HEPI roundtable, responses ranged from scepticism to support for how some of the scenarios could play out.

Against the backdrop of the current crackdown on protesters in Hong Kong, many voiced disbelief that China could establish itself as a destination of choice for top global researchers unless there was a move away from authoritarianism towards democracy. One person pointed out that the most successful universities have always been those with institutional autonomy. Others pointed to India to suggest that it could in fact be other countries or regions that, over time, become more dominant in research leadership and funding.

On Open Access, there were mixed feelings about whether the world was moving towards greater international collaboration, with some pointing to hopeful examples like the human genome project, while others decried the lack of genuinely open collaboration.

Some noted that change required people to enact it and that there were a range of interests at play. One participant noted that, as a researcher, he was boldly in favour of open access; as a university manager, he had some enthusiasm but real nervousness about the cost; and, through his role in a learned society, he could see that open access could bankrupt the institution.

Few disputed that artificial intelligence was transforming research, but would that enhance or diminish the role of blue skies thinking and life science research? And as machines increasingly automate some areas of research, would the focus move to more creative areas of research that (so far) are resistant to automation?

We would love to hear what you think of the Research Futures predictions – please let us know using this poll: https://www.menti.com/1ptw7ka6qf

The post Predicting the future of research appeared first on HEPI.


My response to last week’s National Student Survey results focused on question 26, which is generally ignored by everyone but is actually the question which always seems to score the lowest results of all.

It is about the utility of student unions. It is, admittedly, an odd question that implies student unions are primarily about securing academic quality rather than all their other roles: improving students’ social life, campaigning, delivering welfare services and lots of other things – some of which are explored in the history of student unions that we published last year. The precise wording of the question, which students are asked whether they agree with, is: ‘The students’ union (association or guild) effectively represents students’ academic interests’.

Yet Q26 is not such a bad question that it can go on being ignored altogether – and does anyone seriously believe that a different question about satisfaction with student unions would top the tree? That seems unlikely, given the turnover of elected officers and the poor financial position of many unions, but perhaps the Office for Students should test the proposition out.

Nationally, the student union movement is undergoing a crisis, which has led to the abolition of the International Students’ Officer post, along with various ‘liberation’ posts. This is despite the best efforts of the strong outgoing NUS team, headed by Shakira Martin and Amatey Doku, who were always fantastic advocates for their members. They stemmed the problems but always had limited room for manoeuvre given the state of the organisation that they inherited.

Whenever anyone flags such issues, they get accused of attacking student unions. That is tiresome, but my goal is the opposite. Student unions are incredibly important to the student experience – for example, in reducing loneliness and improving well-being, in supporting students with housing and financial challenges, in delivering policy changes and in providing a voice to the voiceless. So they deserve proper support.

Student unions also deliver social capital. My own first forays into the world of politics and policy were through a student union, at Manchester University in the early 1990s, and – as a postgraduate – I was a delegate at the fiery 1999 NUS Annual Conference, which was the first after Tony Blair had (re)introduced student fees. As with so many people who work on higher education policy, it is possible that I would not be doing the job that I do without such experiences.

That doesn’t mean the student movement is perfect. Far from it. I have long thought it too willing to become bogged down in national and international politics that it can do next to nothing about. Such virtue-signalling has three big opportunity costs. First, it takes time and effort away from other priorities. Secondly, it disengages the many students who do not identify as left-wing radicals. Thirdly, it is the gift that keeps on giving to journalists who want to take the mickey out of ‘snowflakes’. I much prefer the old approach of the NUS, where party politics were kept at bay.

But just as being a member of a political party does not mean you support every position your party takes, one doesn’t need to support every position of the student union movement (as far as a single position can ever be discerned) to believe we benefit from a strong student union movement. Even when the NUS were hosting huge protests over the introduction of £9,000 fees, Whitehall was still funding their anti-extremism work and talking to their elected officers about issues of shared interest.

Indeed, I have never met a senior university figure, such as a vice-chancellor or a chair of governors, who talks negatively about their student union. Governing body meetings are infinitely better with a student or two present to prick bubbles, and conferences on higher education only really come alive when students are present. As students increasingly come from diverse backgrounds generally very different from those of politicians, civil servants and senior academics, the role of student unions should be more important than ever before. Yet they are struggling.

So what can be done? One specific area where we need to do better as a sector is the experience of international students. In the HEPI / Advance HE Student Academic Experience Survey, non-EU international students are the least likely to say they feel they are receiving value for money. Hardly surprising, perhaps, given the higher fees they pay – each such student delivers a cross-subsidy from their fees to research of around £8,000 during their time in the UK.

But it is undeniably in the interests of everyone in the higher education sector to improve the perceptions of international students. So my suggestion is this: the sector should consider putting its hand in its pocket to show its support for student unions by funding a new full-time International Students’ Officer at the NUS.

To avoid a moral hazard problem or a long-term conflict of interest, this should be strictly time-limited, perhaps on a five-year basis with the sector funding 100% of the costs in 2019/20, 80% in 2020/21, down to 20% in 2023/24 and 0% in the following year. Assuming the gross costs are £45,000 a year, this would cost a mere £75 for each of the 600 institutions with a student union in the NUS. If only universities were to pay, the figure would be around £350 – remember that they are making, on average, an £8,000 surplus from each international student that they educate.
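
As a quick check on that arithmetic: £45,000 split across 600 institutions is £75 each, while split across universities alone – roughly 130 of them, on my assumption – it comes to around £350 each, consistent with the figures above.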

If this were to happen, it would send a clear signal that the higher education sector recognises the value of student unions, wants to improve the perceptions of international students and believes the dictum that higher education is a partnership between institutions, staff and students.

The post If student unions score so poorly, is it time for universities to dip their hands in their pockets to fund a new NUS International Students’ Officer? appeared first on HEPI.


This is a guest blog from Dr Richard Budd, from the Centre for Higher Education Research and Evaluation (CHERE) at Lancaster University.

The National Student Survey (NSS) has supposedly tracked the quality of undergraduate provision across the UK since 2005. It is apposite that universities are encouraged to pay attention to their teaching – the introduction of the NSS seems to assume that they otherwise wouldn’t – but the survey probably doesn’t achieve this. This is because it is underpinned by several flawed assumptions that render it highly problematic at best, and meaningless at worst.

The first and fundamental assumption is that NSS scores reflect teaching quality. The OfS describes it as gathering ‘students’ opinions on the quality of their courses’, which is obviously slightly different: it is a proxy measure. This is not to say that students’ perspectives are not valuable – they are – but pedagogy takes many forms, and a session or course may work well one year and not the next, or well for some groups and not for others. The processes of pedagogy are complex, and as educators we continuously adapt our classroom practice accordingly.

The survey addresses areas such as support, teaching, and feedback, but one-sidedly, focusing entirely on the university’s provision. The central role – that of the student – is curiously absent. Access to staff, information on assessment procedures, and ‘opportunities to explore ideas or concepts in depth’ will have been present, but students may not have availed themselves of them for various reasons. We can sometimes be more explicit – often more inclusive – but students are not passive recipients of an education and they know this. Capturing the extent to which students have engaged is difficult (it is invisible to systemic monitoring), and asking them to self-report would be a nonsense; in combination with the ongoing grade inflation furore, the media would have a field day.

The relationship between the NSS and teaching quality is therefore tenuous, and it is made more so by the next assumption: that responses reflect three or more years of pedagogy. Sabri (2013) explored the realities of the NSS at institutional level and reports that some students not only found the questions infuriatingly simplistic but also cited difficulties in thinking beyond their teaching at the point the survey is administered – the spring of their final year. Matters are further complicated when students are taking a degree in more than one discipline: which department do you report on? Sabri also notes that the focus for students at this point is overwhelmingly on their final assignments and life beyond graduation. (The timing is tricky for universities, too, as it comes when interim grades are being released; this can encourage universities to rush results out to please students, placing a further burden on already overworked staff.) Ipsos MORI suggests the NSS should take about ten minutes to complete, but if your main aim is to get it out of the way and stop the incessant messages exhorting you to complete it, 27 Likert-scale responses can take far less time. The survey is supposed to be voluntary, but I doubt it feels that way, and universities are under major pressure to ensure that at least 50% of every course completes it.

A further misleading premise is that the data is comparable between and within universities, and over time. This would require the NSS to be immune to changing social, political and economic conditions, as well as generically applicable across the full spectrum of teaching provision. There is some evidence that disciplines mediate how students approach the survey questions, and that this works against courses with fewer classroom hours and more discursive teaching cultures. Research also suggests that elite universities admit students with higher grades partly because those students will do well with relatively limited support, freeing up more time for research. On the flip side, less prestigious post-92 institutions, which are less academically – and therefore socially – selective, may be more supportive of their students.

NSS scores are far more valuable to post-92s, which cannot compete on research status, so they are likely to view the survey differently. Some universities expend a huge amount of energy on it, but constantly seeking feedback in the interests of delivery optimisation (and subsequent NSS results) can be counterproductive if it encourages students to be excessively critical of everything. It is undocumented but widely known that universities game the NSS relentlessly – rewards for completion, social events to improve morale, and so on – as happy students are more likely to report positively. The NSS Good Practice Guide forbids ‘inappropriate influence’, such as discussions with students about the nature of the questions or the implications of the results. This is infantilising: students know what the NSS is, they can see the issues with it, and they must sense the institutional nervousness around it. An open discussion with them about how it all works (or doesn’t) would surely be far healthier.

Another implicit, incorrect assumption is that the NSS strongly informs student choice. This rests on the false notion that all potential applicants are equally well informed and entirely rational in their decision-making, and that they place great value on NSS scores. Research from 2015 shows that the NSS has a far weaker influence on choice than university status: a 10 per cent rise in NSS scores (which would be very large) creates only a 2.5 per cent increase in applications. The results do, though, feed into a number of rankings and the TEF, which, of course, also does not certify teaching quality. League tables are widely known to be specious, and feeding bad data into a poor model is not the answer: two wrongs don’t make a right.
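Expressed as a back-of-the-envelope elasticity (my framing, not the study’s), that finding amounts to

\[
\varepsilon = \frac{\%\Delta\,\text{applications}}{\%\Delta\,\text{NSS score}} = \frac{2.5}{10} = 0.25,
\]

so even an implausibly large improvement in scores would shift applications only modestly – which is precisely the point.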

These issues in combination mean that the NSS would struggle to achieve a pass grade in Social Science Research Methods 101. The same could be said of most metrics which claim to represent the state of UK higher education. It is perverse that a sector which revolves above all around the production, verification, curation and dissemination of high-quality knowledge is partly governed by incredibly poor data. We also demean our mission when we market ourselves through it.

The post The NSS: Unfit for Purpose appeared first on HEPI.


MillionPlus has been at the forefront of organisations calling for the Level 4 and 5 education space in England to be revived as an outcome of the Augar process, and it is pleasing to see this agenda picked up in the report. A significant proportion of those studying for these important qualifications, such as Foundation degrees or Higher Nationals, have always been part-time students or learners whose courses are sponsored by their employers.

The decline in their take-up, much highlighted in the Augar Review, stemmed from the funding changes introduced by the government in 2012, which have affected part-time study generally, combined with the effects of the Great Recession and of public spending reductions on private and public sector training budgets since 2009. Modern universities have never lost their interest in, or engagement with, study at this level, something reflected in the fact that just under a third of standalone Level 4 and 5 qualifications are provided predominantly by these universities.

What’s missing?

The Augar Report uses the catchy term ‘the missing middle’ in relation to Levels 4 and 5, a phrase that oversimplifies the situation we face in reviving this area of provision. Someone reading the report without knowledge of higher education might think that those studying for a degree rather than, say, a Higher National Certificate miss out on two key levels of higher education. But any student on a degree programme at a university is, in their first and second years of study, directly engaged in Level 4 and 5 study. Furthermore, at modern universities and other providers many courses remain work-related, enabling students to benefit from programmes which should be categorised as ‘higher technical education’ in Augar’s terms.
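For readers less familiar with English qualification levels, a rough sketch of the standard mapping under the Frameworks for Higher Education Qualifications is:

  • Level 4 – the first year of a bachelor’s degree; also Certificates of Higher Education and Higher National Certificates
  • Level 5 – the second year of a bachelor’s degree; also Diplomas of Higher Education, Higher National Diplomas and Foundation degrees
  • Level 6 – the final year and the award of the bachelor’s degree itself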

There is therefore nothing ‘missing’ in this for students on degree programmes, unless higher technical education is arbitrarily defined as a specific technician qualification, which is but a subset of higher technical education on any proper definition. Yet the Augar report makes misleading claims here, including that England has:

  • a “very small number of Level 4/5 students” (p.33);
  • “only 190,000 people studying at Level 4 and 5 in 2016/17” (p.34); and even that
  • the rise in degree study has been “partly caused” by the “steady decline in the number of people studying higher technical provision” (p.123).

These statements could only be technically accurate on the premise that ‘standalone’ Level 4 and 5 qualifications constitute the whole of Level 4 and 5 study in England – a definition which tendentiously and wrongly excludes the huge scale of Level 4 and 5 study experienced by students within degree provision, that is, several hundred thousand students each year. There is no reason not to consider these students Level 4 and 5 learners, full stop.

Ironically, the report elsewhere seems implicitly to concede this reality when it calls for universities to award all students what it calls ‘interim’ HE qualifications, designed to ‘normalise’ Level 4 and 5 awards alongside degrees. Qualifications such as Certificates and Diplomas of Higher Education, which sit at Levels 4 and 5 respectively, are currently awarded to those not completing their degree, not to all students as they progress through their programme. This is certainly a positive proposal worthy of consideration, but the acid test is whether it can become universal practice across the sector, with the likes of Oxford or King’s College London awarding interim qualifications as readily as modern universities might wish to. ‘Normalising’ sub-degree qualifications by this route will only happen if it is a truly sector-wide practice.

Recommendation for duplication?

The report also lacks clarity about the distinct purposes of apprenticeships and Level 4 and 5 study. This is reflected in its endorsement of proposals for ‘National Standards for Higher Technical Education’ as the basis for kitemarking these sub-degree qualifications as relevant for employment and as eligible for full loan and maintenance support. MillionPlus welcomed those recommendations in the review designed to boost flexible learning and smaller chunks of learning (‘credit’) that fall short of a qualification. Yet apprenticeships, as the Augar report itself states, are a form of on-the-job skills training for employees, based on occupational standards now devised by employer trailblazers (previously by Sector Skills Councils) under the auspices of the Institute for Apprenticeships and Technical Education. Level 4 and 5 qualifications, with their educational as well as skills-based aspects, can and do fit into some of these standards (and frameworks elsewhere in the UK), while the cost of this education and training can be reclaimed by employers from their levy payments.

Yet the Augar report surprisingly omits any explanation, even at a high level, of how National Standards for Higher Technical Education might differ from apprenticeship standards. This might store up trouble for employers: parallel sets of apprenticeship standards and higher technical education standards will simply create duplication and confusion unless the two are carefully distinguished. Those who have watched the (painfully) slow progress in devising apprenticeship standards to cover the full range of occupational areas since the Richard Review in 2012 – a process far from complete even in 2019 – will know that this is no small task.

Engaging to address challenges

The Department for Education is working with stakeholders on this crucial agenda in a positive spirit, with officials openly examining a range of options for enhancing take-up in this sub-degree space, such as broadening the use of the apprenticeship levy to include employer-sponsored qualifications that are genuinely work-related and productivity-enhancing, not just those tied to apprenticeship standards. There is much to address here, and MillionPlus, through its agenda-setting and engagement in this area, is contributing to the development of sensible proposals that will help employers, future students and work-based learners achieve their aims.

The post The Augar report and the not-so-missing middle appeared first on HEPI.
