A new approach to academic portfolio review – Part one
Most institutions have at some point lined up student enrolments by programme and noted that a few programmes account for the bulk of their recruitment, with a long tail of programmes recruiting very few students and no plans to grow or close them. The long tail is an indicator of inefficiency, whether that’s complex curricula, the practice of resting rather than closing provision, or a willingness to approve programmes for which there is high academic supply-side interest but limited student demand.
“The long tail is an indicator of inefficiency, whether that’s complex curricula, the practice of resting rather than closing provision, or a willingness to approve programmes for which there is high academic supply-side interest but limited student demand.”
While academic portfolio reviews appear to be simple exercises which will readily generate savings, it’s easy to underestimate what it takes to complete a portfolio review well. Outside higher education, many industries rely on the concept of the product lifecycle. As tastes and demands change, providers respond and replace products when they are no longer viable. In higher education this would translate to an academic portfolio constantly renewed in response to changing student demand, new research, and the emergence of new professions. But in HE this causes a cultural problem where opportunities for change and renewal are countered by academic autonomy and protectiveness of programmes.
Rather than an occasional quick fix, academic portfolio review is more effective as an ongoing process enabled by robust data, good governance and leadership to deliver a successful portfolio. However, academic portfolio exercises are more often mediocre, low impact reviews undertaken by institutions aware there is a problem but lacking a politically acceptable solution.
Case study 1 – unintended portfolio growth
A provider undertook a portfolio review with the intention of improving teaching efficiency and releasing academic staff time to focus on research. The exercise was led by the Pro Vice Chancellor for Education with support from relevant professional services departments. The Planning department initially provided data on programmes with fifteen or fewer students registered over the previous five years, with new provision highlighted to indicate where there was an expectation of student number growth. The dataset was later expanded to module level and to show where modules were shared across programmes. This was presented to the Executive for feedback, which informed communications to Faculties and Schools. Faculty Deans were also asked to communicate the purposes and intended outcome of the exercise in their faculty.
Following the launch, the Planning department received feedback that academic colleagues were concerned that the real purpose of the exercise was to reduce overall numbers of academic staff. The faculty responses argued that for every proposed module or programme closure, there was a rationale for why it should remain open.
In the time the exercise ran, the University approved the creation of more new modules than it closed and did not close any programmes which were not already on teach-out. The portfolio review, intended to reduce the overall scale of provision, had the effect of increasing the amount and complexity of programmes and modules. It was quietly closed.
Case study 2 – programme complexity
A provider received feedback that students were frustrated by high levels of module choice rejections and overwhelmed by the breadth of the programme offering. Initial analysis by SUMS Group suggested that the root cause was a combination of high module choice and timetable clashes, which was leading to high numbers of rejections during the module selection and confirmation process. This was exacerbated by a high degree of content overlap between modules, which presented an opportunity to improve teaching efficiency and create a more distinctive and higher performing portfolio.
“Initial analysis by SUMS Group suggested that the root cause was a combination of high module choice and timetable clashes, which was leading to high numbers of rejections during the module selection and confirmation process.”
The institution developed dashboard data looking at the distinctiveness, demand and performance of modules and programmes and explored the institutional data landscape by mapping data roles. This work allowed the institution to surface issues and to develop a better understanding of current levels of complexity and distinctiveness in the portfolio as the basis for future decision-making. It also identified the root causes for the high numbers of rejections.
Case study 3 – managed change
A provider ran an exercise intended to reduce academic portfolio complexity, create capacity to manage enterprise delivery more effectively, and reduce transaction costs. The exercise was sponsored by the Vice President for Education and separated from discussions around future staffing efficiencies; instead, the discussion focused on movement towards sector norms for student:staff ratios (SSRs).
While the exercise had a senior sponsor, faculty leadership were empowered to take forward the exercise and take responsibility for the change. Faculties were set a broad ambition to reduce complexity and to cut an agreed percentage of modules and programmes overall where provision was no longer viable. The outcome was a set of recommendations on whether to retain, modify or delete programmes and modules to achieve the set reduction.
During the exercise, faculties were given additional analysis and insight support to inform their recommendations. There was also support from colleagues with change management experience who were deployed as trouble-shooters into flashpoints as issues arose. The exercise drew on change management models and there were careful communications and messaging during the process.
A working group of senior academic leaders was convened to scrutinise the recommendations and given the authority to accept or request further information from faculties.
The exercise was broadly successful in achieving its aim of reducing the unwieldiness of the academic portfolio, and the end-to-end change management approach was key to that success. The process was characterised by its consensual nature and by the empowerment and agency given to faculties over how to conduct the exercise locally, while contributing to a collective endeavour with clear expectations set from the outset.
Case study 4 – ignoring market insight
A provider was keen to take a more data-led approach to its ongoing portfolio development and asked its Planning department to develop market insight data to inform development of new provision and retitling of existing provision to stimulate demand. A dashboard was developed using HESA and UCAS data supplemented by regional employer data and Planning offered live lookups and more in-depth analysis as required. The dashboard was not made widely available as it required prior knowledge of the dataset to achieve the most useful results.
Usage of the new market insight service varied depending on the strength of the relationship between the Faculty or School and Planning. Some faculties used it to test new ideas before proceeding to programme development, but others preferred to open new provision without or against advice. The dashboard also proved useful for course review when new provision failed to recruit, although there remained a reluctance to close courses. The provider lacked strong leadership of the course approval and closure process, with the result that its portfolio continued to grow organically.
Factors for failure
Most of these case studies share several factors for failure:
- Academic portfolio review is initiated in response to a problem, but without clear objectives. The intention is often to streamline provision, sometimes with a target financial saving, rather than to create a sustainable, attractive academic portfolio.
- Communication of the exercise and its intentions to the academic community is poor and leaves the academic community feeling under attack.
- The student voice is missing, and provision is designed based on perceptions of student demand, or on what students should want to study.
- Leadership is often weak and unsupported, which means there are few or no consequences when the objective is not reached.
Academic portfolio review should be a process of renewal and ongoing dialogue which presents an opportunity to increase student satisfaction and demand. Part two of this thought piece (coming soon) proposes a new approach to academic portfolio review and the key elements for success, which could improve the success rate of both review exercises and new provision.
If you’d like to know more about SUMS Strategy, Planning and Transformation Service, or Marketing and Student Recruitment Service, more information is available on our website.
We’re happy to discuss any topics raised here and what they could mean for you in more detail. Contact Rhiannon Birch at r.birch@reading.ac.uk for more information.