#i completely lost the ability to logically assess the situation. it's always the worst of cases + made up horrific scenarios
newstechreviews · 4 years ago
Link
Forty-five days before the announcement of the first suspected case of what would become known as COVID-19, the Global Health Security Index was published. The project–led by the Nuclear Threat Initiative and the Johns Hopkins Center for Health Security–assessed 195 countries on their perceived ability to handle a major disease outbreak. The U.S. ranked first.
It’s clear the report was wildly overconfident about the U.S., failing to account for social ills that had accumulated in the country over the past few years, rendering the country unprepared for what was about to hit. At some point in mid-September–perhaps by the time you are reading this–the number of confirmed coronavirus-related deaths in the U.S. will have passed 200,000, more than in any other country by far.
If, early in the spring, the U.S. had mobilized its ample resources and expertise in a coherent national effort to prepare for the virus, things might have turned out differently. If, in midsummer, the country had doubled down on the measures (masks, social-distancing rules, restricted indoor activities and public gatherings) that seemed to be working, instead of prematurely declaring victory, things might have turned out differently. The tragedy is that if science and common-sense solutions had been united in a national, coordinated response, the U.S. could have avoided many thousands more deaths this summer.
Indeed, many other countries in similar situations were able to face this challenge where the U.S. apparently could not. Italy, for example, had a per capita case rate similar to that of the U.S. in April. By emerging slowly from lockdowns, limiting domestic and foreign travel, and allowing its government response to be largely guided by scientists, Italy has kept COVID-19 almost entirely at bay. In that same period, U.S. daily cases doubled before they started to fall in late summer.
Among the world’s wealthy nations, only the U.S. has an outbreak that continues to spin out of control. Of the 10 worst-hit countries, the U.S. has the seventh-highest number of deaths per 100,000 population; the other nine countries in the top 10 have an average per capita GDP of $10,195, compared to $65,281 for the U.S. Some countries, like New Zealand, have even come close to eradicating COVID-19 entirely. Vietnam, where officials implemented particularly intense lockdown measures, didn’t record a single virus-related death until July 31.
There is nothing auspicious about watching the summer turn to autumn; all the new season brings are more hard choices. At every level–from elected officials responsible for the lives of millions to parents responsible for the lives of one or two children–Americans will continue to have to make nearly impossible decisions, despite the fact that after months of watching their country fail, many are now profoundly distrustful, uneasy and confused.
John Moore—Getty Images: Friends and family mourn the death of Conrad Coleman Jr. on July 3 in New Rochelle, N.Y. Coleman, 39, died of COVID-19 on June 20, just over two months after his father also died of the disease.
At this point, we can start to see why the U.S. foundered. A failure of leadership at many levels and across parties; a distrust of scientists, the media and expertise in general; and deeply ingrained cultural attitudes about individuality and how we value human lives have all combined to produce a horrifically inadequate pandemic response. COVID-19 has weakened the U.S. and exposed the systemic fractures in the country, and the gulf between what this nation promises its citizens and what it actually delivers.
Although America’s problems are widespread, they start at the top. A complete catalog of President Donald Trump’s failures to address the pandemic will be fodder for history books. Weeks were wasted early on in stubborn clinging to the fantastical belief that the virus would simply “disappear”; testing and contact-tracing programs were inadequate; states were encouraged to reopen ahead of his own Administration’s guidelines; and statistics were repeatedly cherry-picked to make the U.S. situation look far better than it was, while scientists who said otherwise were undermined. “I wanted to always play it down,” Trump told the journalist Bob Woodward on March 19 in a newly revealed conversation. “I still like playing it down, because I don’t want to create a panic.”
Common-sense solutions like face masks were undercut or ignored. Research shows that wearing a facial covering significantly reduces the spread of COVID-19, and a pre-existing culture of mask wearing in East Asia is often cited as one reason countries in that region were able to control their outbreaks. In the U.S., Trump did not wear a mask in public until July 11, more than three months after the CDC recommended facial coverings, transforming what ought to have been a scientific issue into a partisan one. A Pew Research Center survey published on June 25 found that 63% of Democrats and Democratic-leaning independents said masks should always be worn in public, compared with 29% of Republicans and Republican-leaning independents.
By far the government’s most glaring failure was a lack of adequate testing infrastructure from the beginning. Testing is key to a pandemic response–the more data officials have about an outbreak, the better equipped they are to respond. Rather than call for more testing, Trump has instead suggested that maybe the U.S. should be testing less. He has repeatedly, and incorrectly, blamed increases in new cases on more testing. “If we didn’t do testing, we’d have no cases,” the President said in June, later suggesting he was being sarcastic. But less testing only means fewer cases are detected, not that they don’t exist. In the U.S. the percentage of tests coming back positive increased from about 4.5% in mid-June to about 5.7% as of early September, evidence the virus was spreading regardless of whether we tested for it. (By comparison, Germany’s overall daily positivity rate is under 3% and in Italy it’s about 2%.)
Testing in the U.S. peaked in July, at about 820,000 new tests administered per day, according to the COVID Tracking Project, but as of this writing has fallen to about 740,000. Some Americans now say they are waiting more than two weeks for their test results, a delay that makes the outcome all but worthless, as people can be infected in the window between when they get tested and when they receive their results.
Most experts believe that early on, we did not understand the full scale of the virus’s spread because we were testing only those who got sick. But we now know that 30% to 45% of people who contract the virus show no symptoms whatsoever and can still pass it on. When there’s a robust and accessible testing system, even asymptomatic cases can be discovered and isolated. But as soon as testing becomes inaccessible again, we’re back to where we were before: probably missing many cases.
Tod Seelie—The Guardian: People sleeping in a parking lot in Las Vegas after a homeless shelter shut down because of COVID-19
Seven months after the coronavirus was found on American soil, we’re still suffering hundreds, sometimes more than a thousand, deaths every day. An American Nurses Association survey from late July and early August found that of 21,000 U.S. nurses polled, 42% reported either widespread or intermittent shortages in personal protective equipment (PPE) like masks, gloves and medical gowns. Schools and colleges are attempting to open for in-person learning only to suffer major outbreaks and send students home; some of them will likely spread the virus in their communities. More than 13 million Americans remain unemployed as of August, according to Bureau of Labor Statistics data published Sept. 4.
U.S. leaders have largely eschewed unflashy short- and medium-term solutions in favor of perceived silver bullets, like a vaccine–hence the Administration’s “Operation Warp Speed,” an effort to accelerate vaccine development. But focusing so heavily on magic-wand solutions fails to account for the many people who will suffer and die in the meantime, even though effective strategies to fight COVID-19 already exist.
We’re also struggling because of the U.S. health care system. The country spends nearly 17% of annual GDP on health care–far more than any other nation in the Organisation for Economic Co-operation and Development. Yet it has one of the lowest life expectancies among wealthy nations, at 78.6 years, comparable to countries like Estonia and Turkey, which spend only 6.4% and 4.2% of their GDP on health care, respectively. Even the government’s decision to cover coronavirus-related treatment costs has ended in confusion and fear among lower-income patients, thanks to our dysfunctional medical billing system.
The coronavirus has laid bare the inequalities of American public health. Black Americans are nearly three times as likely as white Americans to get COVID-19, nearly five times as likely to be hospitalized and twice as likely to die. As the Centers for Disease Control and Prevention (CDC) notes, being Black in the U.S. is a marker of risk for underlying conditions that make COVID-19 more dangerous, “including socioeconomic status, access to health care and increased exposure to the virus due to occupation (e.g., frontline, essential and critical infrastructure workers).” In other words, COVID-19 is more dangerous for Black Americans because of generations of systemic racism and discrimination. The same is true to a lesser extent for Native American and Latino communities, according to CDC data.
COVID-19, like any virus, is mindless; it doesn’t discriminate based on the color of a person’s skin or the figure in their checking account. But precisely because it attacks blindly, the virus has given further evidence for the truth that was made clear this summer in response to another of the country’s epidemics, racially motivated police violence: the U.S. has not adequately addressed its legacy of racism.
Neil Blake—The Grand Rapids Press/AP: The line for a drive-through food pantry in Grand Rapids, Mich.
Americans today tend to value the individual over the collective. A 2011 Pew survey found that 58% of Americans said “freedom to pursue life’s goals without interference from the state” is more important than the state guaranteeing “nobody is in need.” It’s easy to view that trait as a root cause of the country’s struggles with COVID-19; a pandemic requires people to make temporary sacrifices for the benefit of the group, whether it’s wearing a mask or skipping a visit to their local bar.
Americans have banded together in times of crisis before, but we need to be led there. “We take our cues from leaders,” says Dr. David Rosner, a professor at Columbia University. Trump and other leaders on the right, including Gov. Ron DeSantis of Florida and Gov. Tate Reeves of Mississippi, have disparaged public-health officials, criticizing their calls for shutting down businesses and other drastic but necessary measures. Many public-health experts, meanwhile, are concerned that the White House is pressuring agencies like the Food and Drug Administration to approve treatments such as convalescent plasma despite a lack of supportive data. Governors, left largely on their own, have been a mixed bag, and even those who’ve been praised, like New York’s Andrew Cuomo, could likely have taken more aggressive action to protect public health.
Absent adequate leadership, it’s been up to everyday Americans to band together in the fight against COVID-19. To some extent, that’s been happening–doctors, nurses, bus drivers and other essential workers have been rightfully celebrated as heroes, and many have paid a price for their bravery. But at least some Americans still refuse to take such a simple step as wearing a mask.
Why? Because we’re also in the midst of an epistemic crisis. Republicans and Democrats today don’t just disagree on issues; they disagree on the basic truths that structure their respective realities. Half the country gets its news from places that parrot whatever the Administration says, true or not; half does not. This politicization manifests in myriad ways, but the most vital is this: in early June (at which point more than 100,000 Americans had already died of COVID-19), fewer than half of Republican voters polled said the outbreak was a major threat to the health of the U.S. population as a whole. Throughout July and August, the White House’s Coronavirus Task Force was sending private messages to states about the severity of the outbreak, while President Trump and Vice President Mike Pence publicly stated that everything was under control.
Some incredulity about the virus and public-health recommendations is understandable, given that scientific understanding of the newly emergent virus is evolving in real time. The ever-shifting advice from health officials doesn’t instill public confidence, especially in those already primed to be skeptical of experts. “Because this is a new infectious disease, a new virus, we don’t have all the answers scientifically,” says Colleen Barry, chair of the department of health policy and management at Johns Hopkins Bloomberg School of Public Health. “I think that creates an environment that could potentially erode trust even further over time.” But the trust fractures along partisan lines. While 43% of Democrats told Pew in 2019 that they had a “great deal” of trust in scientists, only 27% of Republicans said the same.
Truly worrying are the numbers of Americans who already say they are hesitant to receive an eventual COVID-19 vaccination. Mass vaccination will work only with enough buy-in from the public; the damage the President and others are doing to Americans’ trust in science could have significant consequences for the country’s ability to get past this pandemic.
There’s another disturbing undercurrent to Americans’ attitude toward the pandemic thus far: a seeming willingness to accept mass death. As a nation we may have grown numb to the horrors that come our way as news, from gun violence to the seemingly never-ending incidents of police brutality to the water crises in Flint, Mich., and elsewhere. Americans seem already inured to the idea that other Americans will regularly die when they do not need to.
It is difficult to quantify apathy. But what else could explain that nearly half a year in, we still haven’t figured out how to equip the frontline workers who, in trying to save the lives of others, are putting their own lives at risk? What else could explain why 66% of Americans–roughly 217.5 million people–still aren’t always wearing masks in public?
Despite all that, it seems the U.S. is finally beginning to make some progress again: daily new cases have fallen from a July high of about 20.5 per 100,000 people to around 12 in early September. But we’re still well above the springtime numbers–the curve may be flattening, but it’s leveling out at a point that’s pretty frightening. Furthermore, experts worry that yet another wave could come this winter, exacerbated by the annual flu season.
Jae C. Hong—AP: Cardboard cutout “fans” at an L.A. Angels baseball game
There are reasons for optimism. Efforts to create a vaccine continue at breakneck speed; it’s possible at least one will be available by the end of the year. Doctors are getting better at treating severe cases, in part because of new research on treatments like steroids (although some patients are suffering far longer than expected, a phenomenon known as “long-haul COVID”). As the virus rages, perhaps more Americans will follow public-health measures.
But there is plenty of room for improvement. At the very least, every American should have access to adequate PPE–especially those in health care, education, food service and other high-risk fields. We need to make a major investment in testing and tracing, as other countries have. Our leaders need to listen to experts and let policy be driven by science. And for the time being, all of us need to accept that there are certain things we cannot, or should not, do, like go to the movies or host an indoor wedding.
“Americans [may] start to say, ‘If everyone’s not wearing masks, if everyone’s not social distancing, if people are having family parties inside with lots of people together, if we’re flouting the public-health recommendations, we’re going to keep seeing transmission,'” says Ann Keller, an associate professor at the UC Berkeley School of Public Health.
The U.S. is no longer the epicenter of the global pandemic; that unfortunate torch has been passed to countries like India, Argentina and Brazil. And in the coming months there might yet be a vaccine, or more likely a cadre of vaccines, that finally halts the march of COVID-19 through the country. But even so, some 200,000 Americans have already died, and many more may do so before a vaccine emerges unless America starts to implement and invest in the science-based solutions already available to us. Each one of those lives lost represents an entire world, not only of those individuals but also of their family, friends, colleagues and loved ones. This is humbling–and it should be. The only path forward is one of humility, of recognition that if America is exceptional with regard to COVID-19, it’s in a way most people would not celebrate.
–With reporting by Emily Barone and Julia Zorthian/New York
jakegrxz · 8 years ago
Text
Struggling in the neoliberal university
How has the broader socio-economic process of neoliberalism restructured higher education in the UK? What are the everyday, human implications of this? What form should resistance to it — from students, workers, and academics — take? [See the print edition here.]
Editor's note, 20th June 2018: This piece was originally written in February 2017 as the feature of Issue 9 of Incite, the political magazine I was Editor of while completing my undergraduate degree. As a piece published for a campus magazine at the University of Surrey, it was originally intended for Surrey students to read, and some of the language (e.g. 'our Students' Union') reflects this. I have left this intact rather than amending it.
In the wake of exam season, one wonders what exactly the point of the whole exercise was. At best, the experience feels meaningless and frustrating, if manageable. At worst, it can be anxiety-inducing and sleep-depriving, making us question our own abilities and feel wholly out of control. And yet, despite this, it all feels natural at this point. We have been examined in education for years now; this is how it is.
Perhaps it is just us, too, inside our minds. As the late cultural theorist Mark Fisher, who devastatingly took his own life last month, wrote, stress in our society has been ‘privatised’. In tracing the roots of our unhappiness, instead of looking outward to deteriorating social and political conditions, we are increasingly inclined to look inward, towards brain chemistry or personal history. The deteriorating conditions we operate under, which may include precarious work and constant monitoring (via workplace appraisals, target setting, or university examinations), are deemed unfortunate yet ‘natural’, and so depoliticised.
Fisher’s analysis is no doubt pertinent to the university. Our very own Students’ Union has not agitated for fewer assessments or a less intensive exam season, but rather released a saccharine Facebook video entitled ‘You Can Do It Surrey!’, aiming to motivate students to put themselves through “late nights and early mornings” with the promise that it “will definitely be worth it”. Simultaneously, it has offered events to ease exam stress, such as “Therapy Dog Session” and “Happiness Café”. The message is clear — exams are natural, unavoidable and worth it. Either distract yourself from them or look inwards, in futility. Structural change is unimaginable.
But contra the Students’ Union, we should resist such narratives, and connect our distresses to the broader structures of neoliberalism. Indeed, the pressure, stress and general malaise of exam season functions as a highly visual spectacle of how neoliberalism, mediated by the institution of the university, is oppressing students. For what fuels such anxiety in exam season is the fear of failure, precisely constituted by a fear of becoming less ‘employable’. During exam season, our relationship to the labour market as students is even more exposed than usual. Most (but not all) students come to university to help with their future careers in some way; the expectation is that, devoid of much other choice, getting a degree will secure a certain level of income, security and perhaps ‘success’. Exam season pushes this logic to the limit, as exams are the very conditionalities we need to meet in order to pass our degree, and thus achieve that certain level of income, security and ‘success’. Exams thus come to function as unnaturally distilled and measurable indicators of our future income, security and status. With such distillation and measurability comes heightened anxiety for all, to varying degrees. A lot comes to depend on very little.
I should note that I am not arguing for the removal of assessment in education. Assessment, reduced to the bare verb ‘to assess’, is an integral part of social life. We assess when we debate with a friend, relative or coworker — we judge their arguments against what we know of the subject under discussion, retort accordingly, and then they repeat the same process themselves. Knowledge is exchanged; education takes place. Hence, what I am arguing instead is that the particular form of assessment we are exposed to as students operates under a neoliberal framework that, through commodification and grading, serves to create unnecessary stress and divisions, as well as to undermine the value of education as an end in itself. For assessments do not have to come in the form of time-restricted exams in silent teaching rooms that take place in an intense two-week period. Nor do they need to be numerically reduced to grades that hierarchically rank students and implicitly ascribe higher value to higher-scoring students. Rather, one can imagine education as radically egalitarian and cooperative — we may write essays, and then discuss them with our tutors, without the need for arbitrary grading, ranking or disciplining. Students of the natural sciences may be numerically tested, but not have their degree depend upon passing, nor be tested in the form of hours-long examinations that occur twice a year. Education need not be given a ‘score’ that inevitably becomes a symbol for our ‘value’, understood in terms of ‘employability’ or market viability.
The focus of this essay is not, however, solely on assessments. Rather, they serve as a portal into a wider topic of discussion: the neoliberal university. If exam season is noticeably distressing in part because of neoliberal logic, what other parts of university life in 2017 are too, perhaps less noticeably? How else is market logic corrupting education? And more broadly, what purpose have universities come to serve under neoliberalism? Assessing these questions, and the interplay between them, requires us first to trace the history of the institution of the university and neoliberalism.
The Road to the Neoliberal University
Universities have always served certain purposes in society, with these purposes shifting as other factors in society, such as class relations or the dominant modes of production, have shifted. This point may seem abstract, or irrelevant, but it is vital. It reveals how the university has always been situated within a particular social, political and economic context which has shaped its functions. Understanding the university, therefore, requires contextual understanding. This can be appreciated historically.
Regarding the UK, before the nineteenth century, universities primarily served as important sites for the socialisation of elites from the ruling classes, immersing them in a certain kind of knowledge and ‘high culture’. For example, late medieval and early modern universities such as Oxford and Cambridge served to educate members of the ruling classes for high positions in the church, the law and government. Thus, as Michael Rustin notes, “their primary function was more to provide a cultural and social formation for elites than to produce useful knowledge”. This function was enlarged with the onset of industrialisation in the early 1800s, when the rising bourgeoisie of industrial manufacturers contributed to the formation of great city universities such as Leeds and Sheffield which specialised in engineering and science, and the expanding bureaucratic arm of government led to modernisations in university curricula. The influence of dominant modes of production and the market on the university becomes clearer at this stage — as mass capitalist industrial production spread, so did the imperative for technical university education in subjects such as engineering.
The context of the aftermath of World War Two saw the next big institutional changes of the university. With the rise of welfare states and new class compromises across Europe, the university came to expand into a ‘mass institution’, emblematic of enhanced opportunities and shared entitlements. Universities no longer came to be seen as primarily the home of the ruling class but became open to all those with the adequate academic qualification, reflecting new class settlements. In the UK this was expressed via the 1963 Robbins Report, whose reforms began a gradual increase in young people attending university; before then the rates had been stuck at 4–5% — now nearly 50% of the 18+ age group attend university. This period saw the university, at least within the UK, at perhaps its most decommodified and egalitarian — grants were issued to all students, and the 1960s-1980s oversaw the birth of exciting, radical new academic disciplines such as sociology and cultural studies.
This particular institutional formation, however, soon began to break down. Although more democratic and egalitarian than previous formations, it was also, unsurprisingly, far more expensive. Removing tuition fees and paying for increasing numbers of young people to study at university, in the name of equal opportunities, was not — and never will be — cheap. Consequently, as the post-war class settlement lost legitimacy amidst the worldwide economic troubles of the 1970s, the university began to be gradually integrated into the emerging dominant political and economic order — neoliberalism.
One may then fairly ask at this point: what exactly is neoliberalism? There is no concrete definition, but the term generally describes a set of political and economic ideas and policies that emerged internationally (led by states such as the UK, the US and Chile) from the 1980s onwards, influenced by classical liberal economics. Policies of privatisation, reduced public spending and free trade are all classic examples of neoliberalism in action, tied together by an unconditional veneration of the market and an ethics of individualism and individual choice. A central component of neoliberalism is market creation, in all areas of life. The state’s role thus becomes to create, uphold and ‘regulate’ such markets, rather than to provide services. Early UK examples of market creation were in the energy and telecommunications sectors under Thatcher, where state-owned enterprises were sold off in order to build a market of private providers, following the logic that entrepreneurial competition would drive down prices, increase efficiency and offer consumers more choice. In the UK, neoliberal market-making has gradually ‘spilled over’ into more and more sectors — the Major government oversaw the privatisation of the railways in a crooked attempt to create a transport provider market, reforms under the New Labour and Coalition governments created internal markets inside the NHS, and — most relevant to this piece — the Blair and Cameron years were instrumental in the creation of markets within the education sector.
The Neoliberal University
Finally, then, we arrive at the concept of the neoliberal university, the current institutional formation of the university under neoliberalism. What does the neoliberal university look like, and what environment does it operate in? As with other historical formations, the answers to these questions can be found by looking at contemporary class relations and dominant modes of production. As multinational companies have come to dominate the sphere of production in ever more areas of the economy, the neoliberal university has come to be an institution that places more emphasis on ‘profitable’ subjects over less profitable, even so-called ‘mickey mouse’, ones. Indeed, the fastest growing subjects by student numbers are universally from the natural sciences (biological sciences, veterinary sciences and mathematics in particular) while the slowest growing or even shrinking subjects tend to be related to history, philosophy and languages. The reasons for this general trend are twofold. First, in the UK, the government has actively discouraged additional funding for subjects deemed antithetical to ‘enterprise culture’, such as the humanities and social sciences. The Browne Report of 2010 (which raised tuition fees to £9000 — yes, that one), for example, completely removed the teaching grant for arts, humanities and social science subjects, a grant that remains in place for STEM subjects. Second, the logic of neoliberalism makes it more rational for many applicants to choose more ‘scientific’ or ‘proper’ subjects, because the financial burden of tuition fee debt increases the incentive to seek substantial financial return upon graduation. Neoliberalism turns academic degrees into financial investments, and one can hardly blame growing numbers of students for seeking some kind of return on them, however sorry that situation may be. Thus, through this double bind, the neoliberal university represses ‘subversive’ or ‘less profitable’ academic disciplines while encouraging the growth of subjects deemed useful to big business.
Another key feature of the neoliberal university is how it operates within an artificial, manufactured higher education market. This is most pronounced in countries such as the US, but successive governments in the UK have made creeping reforms (amidst huge resistance) that are gradually constructing a ‘free market’ of higher education. The logic behind these reforms is based upon an erosion of the traditional class settlement in the UK, at least with regard to higher education. No longer is higher education understood in terms of class compromise, where the higher classes primarily fund a higher education system free and open to all; rather, class is factored out of the equation almost completely. Society is instead understood as a collection of atomised individual consumers, and consequently higher education becomes not a universal right based upon dominant notions of equality of opportunity, but a commodity to be purchased. All this was made ever clearer when responsibility for universities moved from the Department for Education to the Department for Business, Innovation and Skills in 2009.
Upon these underlying assumptions, markets are being built. The latest attempt at this is the government’s Higher Education and Research Bill, which is slowly making its way through Parliament against various currents of opposition. The 2016 White Paper on higher education preceding the Bill makes it very clear that the government ultimately seeks to create a differentiated, deregulated and competitive market in higher education, and outlines policy proposals to reach this goal. For example, the White Paper proposes streamlining bureaucratic structures to make it easier for new higher education providers to enter the ‘market’, and includes provisions for so-called ‘market exit’, where under-performing universities cease to exist. Additionally, the Paper allows better-performing universities (as judged under the controversial Teaching Excellence Framework) to charge slightly higher fees than lesser ones from the 2018–19 year onward. These are gradual reforms, and we have not yet seen the creation of a fully unleashed market. Most significantly, the maximum tuition fee cap remains in the Bill (although it now increases with inflation). Nonetheless, the reforms continue a 25-year-old trend in British higher education towards markets and away from good-quality higher education for all.
The Human Cost
The commodifying effects of these market-making reforms may appear abstract or distant, but they have very real consequences. For the transformation of higher education into a commodity bought on a market is not simply a theoretical point — it affects the everyday lives of students, professors, and university administrators. Regarding students, a key way to appreciate this is to think back to the example of exam season I used to open this piece. As noted, exams are so stressful because they reveal in stark terms how exposed our higher education is to the labour market under the neoliberal university. Higher education becomes a means to escape the precarious, low-wage labour market neoliberalism has created — but only if we do well in our exams. We are thus always under the watchful gaze of ‘employability’ while at university, exposed, and this damaging exposure manifests itself in numerous ways. The inadequacy of maintenance loans and grants is one such way, forcing many students to take up part-time casual work in order to keep themselves financially afloat at university. This leaves students with less time to focus properly on their degrees (adding pressure when examinations or assessments loom), and transforms them into a useful pool of casual labour that neoliberalism thrives on. As Jeremy Gilbert notes, exposure in this sense acts to discipline students towards a certain kind of behaviour, making it harder for students to question their place in the world at the exact time they have historically done so. Furthermore, a reliance on part-time work at university pushes students towards the mould of the passive consumer, who ‘purchases’ their education through ‘proper’ work. Critical thought is side-lined in the process.
What is less apparent to students is how academics, too, are struggling in the neoliberal university. Much of this stems from the erosion of academic freedom that neoliberalism has brought about, as universities in the UK have increasingly come to be managed like businesses or brands. Since the 1988 Education Reform Act, universities have legally been a form of ‘corporation’, governed increasingly hierarchically, marginalising formerly collegial and relatively democratic forms of internal governance. Accordingly, the power of academics over university decision-making has generally decreased, hand in hand with the growth of managerial roles within university governance. As neoliberalism has submitted the university ever more to market logic, occupations with expertise in markets and regulation have blossomed within it — accountants, public relations and human resource practitioners, administrators, and so on. With these reforms the university increasingly follows a corporate model, embedded within business culture, creating various pressures and incentives for academics. Research begins to be subtly influenced by business interests in order to bring in funding, ‘customer satisfaction’ becomes paramount as academics are subjected to new regimes of monitoring and assessment, and ultimately academic autonomy suffers. Under these pressures, it comes as little surprise that mental illness is an increasing problem among academics, as a 2013 University and College Union report found.
Pressures come not just from the content and high expectations of academics’ employment, but also from its terms. Part-time, fixed-term, and zero-hours contracts are on the rise in academia as universities seek to minimise costs and squeeze as much productivity out of their workers as possible. As a Guardian investigation revealed last November, more than half of all academic staff working in UK universities are on insecure ‘atypical’ contracts, with the more prestigious Russell Group universities being particularly guilty of this. The results are as you’d expect — low pay, for long hours, on insecure terms. A number of those interviewed by the Guardian reported pay around the extremely low mark of £6,000 a year, despite academic success (by contrast, the average yearly wage is £26,500). This is the human cost of the neoliberal university; when education becomes a commodity, so do the teachers. The result is dehumanising practices of poor pay, overwork, and insecure employment.
Conclusion: What Next?
The picture, then, seems bleak. Not only are academics and students suffering under the neoliberal university, but there seems little we can do to change it; as shown, the shift to neoliberalism is a global and historical one, infiltrating and feeding off every aspect of our lives. What, then, can we as students do to resist? Is it even possible?
The answer is: of course, provided we are pragmatic, organised, well-informed and realistic. While we may not be able to overthrow global neoliberalism by ourselves, what we can do is resist it locally, at every point it impinges upon our lives. The NUS is doing this right now with its boycott of the National Student Survey (NSS), and it is a struggle we should wholeheartedly get behind, unlike our Students’ Union which has disgracefully opposed it. From next year, the NSS will be used by the government as part of the Teaching Excellence Framework (TEF) to grade universities, allowing higher TEF-scoring universities to charge higher fees. The intention behind this, as I wrote above, is to create an artificial higher education market in the UK, and as such the NSS functions as a locality where the marketisation of higher education collides directly with students. A co-ordinated national boycott, then, could hugely complicate the government’s neoliberal higher education plans, which we as students should be 100% opposed to in every way.
For we should never underestimate the strength of mass collective action, well informed by a broad historical understanding of neoliberalism, in effecting change. Just last year, 1000 UCL students held a rent strike in protest against poor living conditions in expensive student accommodation, and won a rent cut worth £850,000 and a £350,000 bursary for students from low-income backgrounds. Their demands were carefully linked to an understanding of the neoliberal university, noting how UCL profited £16 million per year from student rent, and student ‘Cut the Rent’ groups are now spreading across the country, aided by the NUS. We should take these rent strikes as inspiration — one thousand coordinated students at UCL have started a national movement and achieved real successes. Think of what one thousand coordinated students at Surrey could do: not just rent strikes, but also exam strikes, assessment strikes, campus boycotts. So often we are demoralised and apathetic about our struggles when in reality, with organisation and conviction, together we have power. And when we resist, exercising that power, we plant the seeds for a new, better, post-neoliberal future. Only then can we begin to escape the neoliberal university, and all its oppression.