Britain’s Forgotten Generation – It Needn’t Be That Way…
Can’t get a job above the minimum wage? Can’t buy a house? Are you part of a lost generation of people who feel they’re missing out on the good life their parents have enjoyed? Talk to my two twentysomething daughters, and you’ll hear such sentiments.
Alan Milburn, the British government’s “social mobility tsar”, has published a report that makes depressing reading for people who have entered the workforce over the past decade, and for those coming up behind them. As reported in the Daily Telegraph, he and his colleagues:
“…believe that policymakers have to come to terms with a new truth that emerges from the mass of evidence that is contained in our report,” he said.
“Although entrenched poverty has to be a priority and requires a specific policy agenda … transient poverty, growing insecurity and falling mobility are far more widespread than politicians, employers and educators have so far recognised.
“There is the growing cohort of low and middle income families squeezed between falling earnings and rising house prices, rising university fees, rising youth unemployment, who fear their children will be worse off than they have been.
“Many of today’s children face the prospect of having lower living standards than their parents when they grow up.”
It needn’t be that way. Here’s a speech I’ve put in the mouth of a business leader 17 years from now who is looking back on a transformation:
As we enter 2030, we in the United Kingdom can congratulate ourselves on the recent UNESCO survey rating us as having the top education system in the world both at secondary and tertiary levels. Our economy has recovered since the bad old days following the 2008 financial crisis. British talent is coveted everywhere, not just because of our technical ability but because of the balance of skills that separates mediocrity from excellence.
How did we achieve this?
Seventeen years ago our education system was in a tailspin. Kids were leaving secondary school with qualifications that had been steadily devalued over the previous thirty years at the initiative of successive governments keen to prove that they were raising educational standards. In reality they were progressively lowering the bar. As a result, more people were getting better grades because the exams were easier to pass.
The same thing was happening at tertiary level. The creation of tens of new universities meant that more people had access to university places. But not only were those places available to school leavers with exam results far less impressive than those required of entrants a generation before; an “everyone’s a winner” attitude also pervaded these institutions. Students were obtaining “good degrees” in far greater numbers. This was not a testament to good teaching, because even at the most highly-rated universities tutorial time had dramatically decreased, and the number of lectures students were required to attend had also declined.
Within the institutions themselves, attitudes had changed. Universities no longer regarded themselves as temples of learning, but as businesses. They were encouraged to do so by successive governments that saw everything in terms of business. Competition and return on investment were the primary drivers. For university staff, job protection and the imperative of maintaining self-sustaining organisations became more important than the needs of the individuals who were relying on them for the best possible education.
After 2013, it became clear to many students that they were not getting value for money. Most universities had raised their fees to the maximum level permitted by the government, and an increasing number of young people were entering the workforce saddled with crippling debt. Not only that, but they were discovering that their degrees no longer gave them a competitive advantage in the employment stakes. After all, there are only so many jobs available in the media, leisure and tourism.
Worse still, the idea that gaining a first-class honours degree in any subject was sufficient proof of a person’s intellectual rigour was increasingly questionable. Academia and employers started encouraging students in increasing numbers to re-mortgage their future by undertaking further specialised degrees. A bachelor’s degree was no longer a passport to a decent job. You needed a master’s as well.
The traditional respect accorded to degree-holders steadily eroded, as it became clear that even a doctorate was no guarantee that the holder was capable of making an effective contribution in a stressed and competitive workplace.
What was life like for an employer in those days?
We were increasingly faced with job applicants – many with “good degrees” – whose CVs provided shocking evidence of poor language skills, of an inability to organise information in a coherent way. While they may have come to us packed with recently acquired knowledge, many were sadly lacking in the kind of skills that would make them effective members of our workforce. They were naïve. They were poor communicators. They lacked empathy and common sense. They knew little about what would be expected of them, so their learning curves were long and steep. Some believed that success would spring from the number of hours they worked in a day – regardless of how productive their efforts.
Others – often products of the most privileged families, the best schools and most prestigious universities – brought with them a sense of entitlement. They saw career advancement as their right, and although many were prepared to work hard, some were reluctant to do more than the minimum necessary to achieve success. And we, the employers, were more than willing to open doors for them because we were overly impressed by their glittering backgrounds, their confident manners and even their looks.
Once the new measures were bedded down, the change started to happen. More and more young people took the view that university education was not worth the investment. Instead of going to a run-of-the-mill university, they chose instead to enrol in company apprenticeship schemes that equipped them to succeed in their chosen professions.
The worst of the universities closed down. Many of those that remained devoted themselves increasingly to the multitude of foreign students who still regarded the UK as the gold standard of tertiary education. Eager to maintain their positions in the league tables, those surviving second-tier universities used the more lucrative fees they gained from foreign students to raise their standards. By 2018 the number of places available to home-grown students had declined by 50%. This resulted in greater competition for places, and pressure on the government and the secondary schools to impose more rigorous standards in the curriculum.
In another significant development, the universities started to look at the context of the academic achievements presented by their applicants. Much as Oxford and Cambridge have done for decades, these universities started to look beyond results for evidence of the whole person. They looked for what made the person special – for ingenuity, emotional intelligence and lateral thinking. For life experience that set one person apart from another.
This presented a challenge for the secondary schools. Gradually it dawned on them that they must equip their pupils with life skills, not just the narrow academic learning to which all but the elite private schools confined themselves. Skills that were seen as essential under the increasingly stringent university entry criteria. Skills that enabled students to become productive employees in the shortest possible time. And last but not least, skills that enabled them to become independent and functional members of society.
Government recognised the need. In 2018 it introduced a set of reforms that affected both the national curriculum and employment law.
The first measure was the introduction of compulsory classes on life, employment and citizenship skills up to GCSE level. Students learned how to manage their finances, how to network, how to stay healthy, how to apply for a job, how to negotiate, how to influence and persuade. Schools were not allowed to reduce the teaching of other subjects; the additional classes were accommodated by adding 30 minutes to the standard school day.
The second introduced statutory limits to the amount of homework that schools could require students to carry out. This reduced the workload on teachers required for assessing homework projects.
The third measure was to introduce a minimum of two weeks of compulsory work experience during the GCSE year (Year 11). Businesses willing to take students for work experience were allowed tax breaks and subsidies to reimburse them for the burden of facilitating the scheme. Where work experience was not available, registered charities were encouraged, again through financial incentives, to organise community work projects in which the students could participate. The outcomes and reflections on their work experience became a part of the test criteria for the Life Skills GCSE qualification. Teachers were required to actively monitor the schemes in which their students were enrolled.
The fourth measure was to require 20% of the national curriculum from primary school onwards to be devoted to collaborative learning – wherein students worked together on projects and were assessed both on the outcomes and on their contribution to the collaborative group.
Finally, all universities were required to award marks in every degree for communication skills – written, verbal and non-verbal.
On the employment side, the Government passed a law prohibiting employers from discriminating in favour of job seekers who held university degrees over those who did not. The exception was for jobs requiring specific university qualifications – for example medicine, applied science and engineering, but not accounting and law, since other routes were available for accountants and lawyers to gain the necessary qualifications.
The test for employment switched from “what you know” to “what you can do” and “what you can learn”. Employers were required to demonstrate that they had selection criteria in place to measure actual skills, knowledge related to the job and learning ability. The focus shifted from the past – degrees and other qualifications as passports to employment – to the present and the future – knowledge and skills that are useful today, and indicators of future potential.
The apparent downgrading of degrees as a criterion for hiring led to howls of protest from the academic establishment. It predicted the demise of the liberal arts and pointed out the futility of a student striving for the best degrees. Those in favour of the measure argued that the government had no desire to snuff out the study of subjects like languages, philosophy and history. The point was that these courses should be designed to give the student the best possible chance of employment.
An additional measure allayed the fears of the naysayers. Fees for non-occupational courses – pure science and the liberal arts – were reduced by 80%. This made such courses far more affordable, on the basis that careers in academia, government and education were likely to be less well remunerated.
The result of these changes was the rejuvenation of the UK university system. While a minority of universities – Oxford and Cambridge among them – continued to function as multi-disciplinary centres of learning, a number of newer universities started to follow the German model. They set themselves up as specialist schools of medicine, engineering and natural science. This enabled them to attract the cream of foreign talent, many of whom remained in the country to carry out cutting edge research, and whose inventions and entrepreneurship created thousands of new jobs.
The government introduced one final measure to foster competition and talent in the workplace. Foreign graduates wishing to remain in the United Kingdom for further study were granted work permits for up to five years after the completion of their studies. At the same time they were allowed to apply for British citizenship under a fast track process that would result in them gaining citizenship after three years of residence. The measure was greeted with outrage by the newly-merged Conservative Independence Party. But supporters claimed that it was not sufficient for the UK to be the home of much of the world’s financial capital, and that we needed a matching inflow of human capital.
The scheme enabled the UK to benefit from the wealth created by many people who otherwise might have returned to their home countries after completing their studies. Just as the England victory in the 2022 FIFA World Cup was attributed to foreign players helping to raise the standards of English footballers in the game’s top echelon, the presence of international talent in the British workforce compelled home-grown talent to raise its game.
So how do we, as one of the leading British technology companies, attract the best available people?
First of all, employment doesn’t begin with the milk round. You may remember that twenty years ago, Britain’s largest employers would talent-spot potential employees by staging road shows for undergraduates in their final year of study.
These days the process begins much earlier. Whereas in the golden years of social media, companies saw the likes of Facebook and Twitter as a means of promoting their brands and products, some organisations started to reverse the paradigm. In addition to encouraging people to follow them, they started to follow people.
It was a form of ethical grooming. Using the demographic data that the social media and blogging sites were generating, these companies started to identify potential employees on the basis of interest, activity, location and evidence of the kind of thought processes that they believed were potentially valuable to them. They would offer them opportunities for internships and paid collaborative projects. In some cases they would enrol them in sponsored study groups and ask them to undertake research on their behalf. And at the appropriate time they would invite them to apply for permanent employment.
By the mid-2020s, in some cases up to 50% of corporate hires came from these sources. To meet their remaining needs, companies like mine developed a battery of tests to ensure that the basis of hiring was not “have done”, but “can do”. In our case, we assess written English, the ability to listen, to influence, to persuade. We look for evidence of resilience, of critical thinking and creativity. We set great store by emotional intelligence – self-awareness, the ability to empathise, to manage one’s self, to work effectively in teams. These tests, combined with occupational testing to ensure that applicants meet the minimal practical requirements for the jobs for which they are being hired, have resulted in our new employees becoming productive far more quickly than in previous decades.
And once they are in place, we engage them in learner-centred programmes that include personal development, action learning, problem-based learning and self-directed independent learning. Much of our occupational training is outsourced to the new technical universities. These days they are flexible enough to provide in-house training as well as traditional courses on their own campuses. We also enrol our staff in virtual classes offered by some of the leading international universities such as Harvard, MIT and Beijing. Our investment in learning is equal to what we invest in research and development. Why? Because we believe that if our company is to succeed, the learning journey on which our new employees embark is far more important than the one they have been on thus far.
Many organisations have followed our lead, right down to start-ups and small businesses. Those who invest in advanced recruitment techniques and employee learning programmes, including apprenticeships, have emerged as the real success stories of the 2020s.
Today our workforce is the envy of the world. The economic recovery that started in 2013 has continued ever since. The generation that feared it would never see the prosperity enjoyed by the baby boomers is wealthier and more content than any of its predecessors. The commanding heights of the economy are no longer controlled by corporate psychopaths. The 2008 crisis taught us that short-term thinking and corporate greed benefit the few and not the many. We survived the decline of the financial sector and are the stronger for it, because we have come to realise – like our German colleagues – that the longer view always wins.
And finally, after decades of post-imperial angst, Great Britain has redefined itself as an empire of the mind.
Far-fetched? Probably. And what do I know? After all, I don’t have degrees in politics, sociology, education or economics. I’m just an ageing baby boomer who refuses to believe that his children’s generation can’t fix the problems that his generation helped to create. Things could get worse, but isn’t it time we started dreaming of how they can get better?