CEOs, authors and white-collar work


People trust the Apple brand – Steve Jobs during a Stevenote

How old are you? And she says, I’m 13. And I say, 13? That’s young. Is it hard to get work at Foxconn when you’re – and she says, oh no. And her friends all agree, they don’t really check ages. – Mike Daisey, Mr Daisey and the Apple Factory

We have reached a point where reading and writing are now inseparable from the consumer technology and software that produce them. This essay, for example, is written on the same computer – a MacBook Pro – on which I write code for my day job. Those who don’t use computers at work will almost certainly own a mobile phone – and increasingly, it will be a smartphone, of the kind that has made fortunes for companies and CEOs.

In a list of popular topics on Twitter during 2011, Steve Jobs made his first appearance in August: in that month, his resignation from Apple garnered 7,064 tweets per second.

Then, on 5 October, Jobs died.

Before his death, my Twitter stream had been filled with tweets about Occupy Wall Street. After eighteen days of 140-character critiques of capitalism and the distribution of wealth, the tweets praising Jobs’ life and impact began to appear, with users posting 6,049 tweets per second. This eclipsed even the response to Osama Bin Laden’s killing which, at its height, spurred 5,106 per second.

Tweeters recorded their memories of Jobs and Apple, with a strong dose of personal nostalgia. Comments ranged from references to the artefacts of Jobs’ life (‘I wrote this from my iPad’) to terse memoirs of the commentators’ childhoods (‘I remember my first Apple IIe’), usually followed by the ubiquitous ‘RIP #SteveJobs’.

Soon after the Twitter storm, the blog posts appeared. Then the eulogies from other technology company founders and rivals, like Bill Gates and Larry Page. Hagiographies of Jobs accumulated, assuring fans of his everlasting place in the technological canon.

How did a capitalist become revered as a creative genius? Where did the mythic Jobs begin?

In Peter Laurie’s 1980 The Micro Revolution, Jobs is mentioned only in passing, as Wozniak’s friend, ‘the other Steve’, whose contribution was to sell his van to fund their company. In Steven Levy’s 1984 book on hackers, Jobs’ significance lies not in his technical skills. ‘As an engineer,’ Levy writes, ‘Jobs was mediocre; his strength was as a planner, someone with vision to see how computers could extend to a point of usefulness beyond that dreamed of by pure hackers like Steve Wozniak.’

What Steve Jobs did so successfully over the years was integrate himself into the identity of the Apple inventions. Thus, the piece of technology you’re buying isn’t a faceless product; instead, it’s something that Steve made for you.

Even his keynotes were known as Stevenotes. In Jobs’ 1984 presentation of the Macintosh, a sequence of images appeared on the Mac’s screen showcasing its functions, including one of Jobs hovering above the machine, as if envisioning it. The demonstration finished with the Mac’s synthesised voice announcing: ‘I introduce a man who has been like a father to me, Steve Jobs.’

Personality is important in the creation of myths in technology. There is a style to the ‘visionary’ CEO that embodies the product – compare Jobs, in his turtleneck and jeans, to Gates in his suit, a contrast replicated in the technological ‘wars of identification’. In the Apple advertising campaign, ‘I’m a Mac, I’m a PC’, as in the more spontaneous flame wars in internet forums, the point of difference isn’t between the hardware of a non-Mac and a Mac but between Microsoft and Apple – even though much of the technological architecture is identical.

Within the neoliberal reimagining of society, this makes perfect sense. No longer are the traditional organisations – church, community, family, club – the centre of social life and identification. Relationships are marketised, formed through commercial transactions. The products we purchase reinforce both what we do and, by extension, who we are.

Where celebrities are concerned, the association is relatively straightforward. We listen to them perform music, watch them on screen, read about them in newspapers and magazines. We might know more about Beyonce than we do our neighbours; more about Neighbours than our extended family. In the arts and entertainment industry (including the most voyeuristic of entertainments: royalty), the product and presentation are inextricably linked.

In the past, technology companies couldn’t simply assume a similar relationship between product and presentation. Despite the best efforts of Mitt Romney, the public is generally resistant to thinking of corporations as people, just as corporations are resistant to reminding us that actual, living people (such as the hundreds of thousands of Foxconn employees in China) make their products. Hence the importance of a branding that presents the CEO as equivalent to a local shopkeeper – disarming, friendly, personally interested in your welfare.

More than that, a product requires an author to give it an identity. Apple fostered curiosity by never releasing details prior to an unveiling, at which Jobs would arrive to reveal the device (his device) to fans and media. That delivery granted Jobs, like a film director, chief authorial status, obfuscating his actual role in the company and establishing a rapport with consumers as social equals.

Interestingly, this image of a creative genius CEO, forging technology directly for consumers, has not been replicated by Steve Ballmer, Gates’ successor as CEO of Microsoft. Why is Ballmer unable to achieve a Gates-like celebrity? The mythologising of a CEO as creative agent builds on a romantic notion of authorship: that nineteenth-century image of the lone writer toiling away, his genius spontaneously creating a work of undeniable brilliance, with no editor, no external suggestions on drafts, no rewrites. There’s a parallel in cinema: in auteur theory, the director is perceived as the film’s key author, rather than, say, the production crew.

Ballmer may have started as Microsoft’s thirtieth employee, but he has only ever been portrayed as a businessman. Interviews with him focus on business and competition, and there are no legends of his creative role.

He is not, in other words, an author.

In technology, authorial imagery makes use of the ‘geek’ and ‘hacker’ stereotypes. The technologist (as in the hardware hacker or software engineer) has undergone a slow transmogrification in popular culture, from computer hacker David Lightman (WarGames) to unsuspecting corporate employee battling the villainous CEO (Antitrust) and, finally, to The Social Network’s ‘average Harvard guy’, Mark Zuckerberg.

In this last model, the CEO as the final expression of the technologist, the American dream is realised in a rather conventional way. That is, you get a great idea, drop out of college, start a company, make money, and live happily ever after – à la Zuckerberg, Gates, Jobs.

Wealth, however, can’t always buy popular public appeal. That’s why it matters that these technologist-CEOs are seen as company authors, since the perception provides them with authority and allows consumers a point of identification. The celebrity of startup CEOs thus grows with the brand.

The Social Network, as a film and a concept, is impossible without this formula: the CEO as both the visionary and the embodiment of the company. It’s an equation in which the company assumes the qualities of the labour it exploits, with capital as the productive element. Hence, Zuckerberg is the only creative agent in the entire film – all those other workers and contributors are mere background.

Nowadays, any new software or technology startup can build on the established mythology of the maker-CEO, establishing an immediate relationship with its audience/consumers.

The modern technology company doesn’t simply transform labour power into capital; it also transforms the capitalist into the labourer. Recall how, within Apple, ideas always belonged to Jobs. Andy Hertzfeld, in his reflections on working on the first Macintosh, relates the legend of Jobs’ ‘reality distortion field’. Hertzfeld’s immediate manager explained to him how, if anyone told Jobs a new idea: ‘he’ll usually tell you that he thinks it’s stupid. But then, if he actually likes it, exactly one week later, he’ll come back to you and propose your idea to you, as if he thought of it.’

Facebook employs over 3,000 people, Apple directly employs just over 60,000, and Microsoft 92,000. Most other technology companies are substantially smaller, but the work (although not the working conditions) is more or less the same regardless of size: products are researched, designed and written.

The outpouring of sentiment regarding Jobs’ death was intense, especially among the technology community. Programmers and tech geeks are stereotyped as cynical, but many of the eulogies were written by those very people.

In a way, it’s understandable. Younger developers and engineers grew up with Apple’s computers, while the older ones shared Jobs’ history. Many would have learned of his death via an Apple product while they were working on an Apple product.

Yet it’s this very community that is most exploited by the likes of Jobs and the other tech-CEOs. The aura of authorship helps reinforce the sense that the CEO shares a commonality of experience with the programmers he exploits, a commonality based around the act of creation.

Software itself lacks materiality: it’s a product created at the level of information. The programmer writes the same program for herself as a hobby, as the owner of a business, for a client as a contractor, for an employer, or for the open-source movement. Nothing about the activity changes except the programmer’s social relationship to that software.

This is illustrative of a broader point. Take, by way of example, two writers. The first writer works full-time writing user manuals for a technology company, while writing a novel in her spare time. The second is an established and successful novelist able to live off her royalties (and investments).

Both are novelists; both spend their days writing. They perform the same tasks yet their social roles are qualitatively different. The first writer is an employee; the second is essentially running a small business.

In an attempt to explain how class works in technological settings, McKenzie Wark, in his 2004 A Hacker Manifesto, proposes two new classes that supersede the old classes of capitalist and worker: the abstract class of ‘hacker’ (artist, author, programmer), who creates information, and the ‘vectoralist’, who controls the information circuit.

‘The vectoralist class is waging an intensive struggle to dispossess hackers of their intellectual property,’ Wark explains.

Yet is the distinction between intellectual and manual labour in technology sufficient to warrant a new category? Walter Benjamin addressed this issue in ‘The Author as Producer’, where he argued that the key to understanding intellectual labour is its relationship to the means of production: ‘revolutionary struggle does not take place between capitalism and the intellect, but between capitalism and the proletariat.’

By linking the nature of the class to the particular products of its labour, Wark instead suggests a common ground between white-collar workers and anyone else who produces intellectual property, as distinct from those who produce physical products.

It’s true that the IT field has, like other creative white-collar fields, a social fluidity, as the equipment required for programming (essentially, manufacturing software) is far cheaper than that of industrial manufacturing. As a programmer, the worker can move from being an employee within a large company, to self-employment, to running a small business, then back to being an employee in a short space of time.

Wark assumes that a material object and a non-material object are treated differently under capitalism. It is, however, social relationships, not physical artefacts, that are objectified through economic transactions. Whether the operating system of the iPhone or its hardware, the commodity need only have a use-value and an exchange-value.

In his 2012 essay ‘The Revolt of the Salaried Bourgeoisie’, Slavoj Žižek makes a similar argument to Wark:

This new bourgeoisie still appropriates surplus value, but in the (mystified) form of what has been called ‘surplus wage’: they are paid rather more than the proletarian ‘minimum wage’ (an often mythic point of reference whose only real example in today’s global economy is the wage of a sweatshop worker in China or Indonesia), and it is this distinction from common proletarians which determines their status.

Far from being limited to managers, the category of workers earning a surplus wage extends to all sorts of experts, administrators, public servants, doctors, lawyers, journalists, intellectuals and artists.

For Žižek, white-collar intellectual workers should be considered part of the ‘salaried bourgeoisie’, comparable to those who manage the companies that employ them. It is, he says, purely the monopoly over the intellectual commons that gives these companies their profitability, rather than the rate at which they exploit their workforce – indeed, ‘Microsoft pays its intellectual workers a relatively high salary’.

But relative to what? Microsoft’s profits? The hours the programmers work? The economy? Steve Ballmer? Foxconn employees?

Henry Ford was, of course, also known to pay his workers well compared to other manufacturers of his era, precisely because his system allowed more efficient exploitation. As Richard Seymour explains, in a critique of Žižek, the ‘relatively high’ salaries offered by Ford ‘were possible in part because the techniques of Taylorism allowed the more effective extraction of relative surplus value’.

Žižek’s argument suggests that software companies consist entirely of the bourgeoisie, a category that would therefore lump together owners, accountants, system administrators, information architects, business analysts, graphic designers and programmers.

What is particularly pernicious about both Wark’s and Žižek’s arguments is that they distance white-collar intellectual work from blue-collar work. Both imply that white-collar professional programmers at Facebook have more in common with Zuckerberg (as ‘hacker’ or manager) than they do with the Foxconn employees who assemble the hardware on which they, the programmers, write their code. For Wark, class is a category based on activity; for Žižek, it seems to be based on rates of pay.

Yet when we think of white-collar work, we define it by its relationship to management – that is, we imagine people working in an office. This is where the class divide is most obvious. Those who control their experience of work, even if they put in long hours themselves, can enjoy the sense of authoring without losing the products of their labour. The common identification that many white-collar workers make with films like Fight Club and Office Space has less to do with the particular tasks they undertake and much more to do with a universal feeling of disempowerment. (Interestingly, both films were made before the dot-com bubble burst, when the IT economy was at its height.) That’s why ‘wage-slave’ programmers put their own time into open-source projects, while other white-collar workers use their leisure for creative fields like writing.

So what do Californian Apple employees and Chinese Foxconn employees have in common? True, their work is different: even within Apple’s Silicon Valley headquarters, a range of roles separates employees within the company. But what matters is less the activity than the set of relations around it – working conditions, workplace, control over the job, control over the product of the labour. At face value, these seem vastly different, but within Apple the distinction is a matter of degree.

The first Macintoshes were manufactured in California. The production process took place in a warehouse featuring automated assembly equipment; a video from the time shows the machines being assembled by a casually dressed, long-haired factory employee, someone who looks very much like the programmers you’d see in the Californian offices today. Apple’s manufacturing only became so culturally distant after the rise of a heavily concentrated manufacturing sector in China.

Still, it isn’t only manufacturing that has moved offshore. White-collar programming jobs are also outsourced to China and India, where even the big IT companies now have offices. With the expansion of education and the increasingly generic nature of IT work, it isn’t unreasonable to think that other sections of the industry could be moved to lower-income economies. Instead of ‘Designed by Apple in California, Assembled in China’, we may well read, ‘Designed by Apple in California, Programmed in India, Assembled in China’.

What Apple workers in California and Foxconn workers in China have in common is their future, which will be decided by Apple’s management. The flipside is that Apple employees could also, by struggling to improve their own conditions, help improve the conditions of Foxconn employees.

Originally, the Macintosh was heralded as celebrating ‘the individuals who created it – not the corporation’. Inside the case of every Mac were the signatures of the entire Macintosh division. But those Macs were manufactured in Fremont, California, at a plant that closed down in 1992.

Benjamin Laird

Benjamin Laird is a Melbourne-based computer programmer and poet. He is currently a PhD candidate at RMIT researching poetry and programming and he is a website producer for Overland literary journal and Cordite Poetry Review.

