REWARDING WORK IN ‘RED DEAD REDEMPTION 2’

In this essay I want to explore the ways in which Rockstar’s Red Dead Redemption 2 builds into its gameplay the elements that work needs in order to be rewarding.

First, though, we need to figure out what those elements are. Barry Schwartz has looked into this, and come up with the following ideas:

“Satisfied people do their work because they feel that they are in charge. Their work day offers them a measure of autonomy and discretion. They use that autonomy and discretion to achieve a level of mastery or expertise…Finally, these people are satisfied with their work because they find what they do meaningful. Potentially, their work makes a difference to the world”.

I think the key words in that passage are ‘autonomy’, ‘discretion’, ‘mastery’ and ‘meaning’. Whenever physical or mental activity incorporates these elements, you have work that is rewarding.

So how does Red Dead Redemption 2 fare? First off, the environment in which this game is set obviously lends itself to ‘autonomy’. It is set in the vast expanse of the American Wild West and, as the game’s trailer puts it, “the world is full of adventures and experiences that you discover naturally as you move fluidly from one moment to another”. This gives the game a non-linear feel, as the player can ride off in any direction.

Along the way, the player is likely to encounter various situations and activities. Most of the time you are not required to participate and can decide for yourself whether or not to get involved. This means the game manages to incorporate another feature work needs in order to be rewarding, namely ‘discretion’. We also see discretion at work during missions, where you are asked to make decisions like what actions members of your posse should take, or whether an aggressive or pacifist response is most appropriate for the current situation.

One could also cite character management and customisation as further ways in which this game provides opportunities for discretion or judgement. As the game’s trailer says, “your experience is defined by the choices and decisions you make…You can, of course, choose what to wear, ride and eat”. Furthermore, these are not merely cosmetic choices that change your appearance without any real consequences. Your character has various health attributes that you need to take care of. A decent coat in winter could mean the difference between life and death, whereas during a hot summer it would not be wise to wear such warm clothing. From character customisation and management, to the snap decisions required of the player during missions, to the open world and the nonlinear experiences it offers, Red Dead Redemption 2 provides plenty of opportunity to apply one’s discretion.

When it comes to mastery, ever since Pong was introduced with the simple instruction to ‘avoid missing ball for high score’, videogames have provided players with challenges that test their ability and enable them to feel like their skills are developing.

The best games don’t just rely on setting a challenge like getting from A to B in a set time or shooting X number of targets. They incorporate systems of feedback into the gameplay that inform the player how well they are performing and whether they should try another strategy. Visual and audio cues let you know if things are going well or not. The best games do not leave you in the dark over what you should be doing, but neither do they hold your hand; they leave it up to you to figure out how to accomplish what needs to be done.

Finally, by providing the player with identifiable tokens of progress in the form of special items, areas and other rewards that you unlock by achieving certain objectives and challenges, games like Red Dead Redemption 2 let you feel like you are gaining mastery and making real progress as the gameplay continues.

Now, when it comes to meaningful work, one might struggle to claim Red Dead Redemption 2 provides much of this if we consider the game from the perspective of ‘real life’. This is, after all, just a videogame. Sitting in front of a TV pressing buttons on a joypad hardly stands beside researching the cure for cancer as “work that makes a positive difference to the world”.

But in the context of the in-world experience, many games offer a grand narrative that sees the player progress from a nobody at the start to a significant figure whose actions and decisions have had a decisive effect on shaping history by the end. You become the hero who saved the world. Admittedly, in Rockstar’s most famous franchise (Grand Theft Auto) you are attempting to rise in the ranks of criminals, which is not exactly everyone’s idea of a positive contribution to society. And in Red Dead Redemption 2 you are cast as an outlaw and can engage in the kind of activities for which Rockstar has earned an image as the bad boy of videogaming. You know, robbery, murder, that kind of thing. But there do appear to be opportunities to act like an outlaw with noble intentions. There are situations in which you can choose to help people or to refrain from killing foes. According to the trailer, “there are countless secrets to uncover and people to meet. You can get into raucous altercations…chase down bounties. Your behaviour has consequences and people will remember you and your actions”.

So, like all the best games, Rockstar’s Red Dead Redemption 2 successfully incorporates ‘autonomy’, ‘discretion’ and ‘mastery’ into a grand narrative that provides a sense of social meaning.

This achievement is all the more remarkable when you consider how menial an activity videogaming actually is. After all, what are you physically doing when you play these games? Repeatedly pushing buttons. Really, that’s it. Yet, somehow, games designers can take this dull, repetitive activity, one that ranks alongside rote assembly line work as the most menial ever created, and build an experience on top of it that is so compelling people happily pay good money so they can do it!

But, really, it is precisely because we the players are paying to do the work in videogames that designers have every reason to make it as engaging and rewarding as possible. Aside from stories of people in sweatshops grinding through MMORPGs to level up avatars that then get sold on to people who would rather pay than do the tiresome work of obtaining a high-level character themselves, nobody is ever coerced into playing a videogame. Participating in such activity is entirely voluntary for the vast majority of players, so if games are to sell, their developers have to make sure that the work a gamer has to do in order to get through them is rewarding.

However, when it comes to jobs, I believe there is an incentive to reduce the elements that make work rewarding. The first three features (‘autonomy’, ‘discretion’ and ‘mastery’) have something in common: they provide opportunities to express one’s individuality. The best games really do try to include ways to let the player customise the experience, which in some cases goes as far as incorporating editing tools that enable you to craft whole new levels and gameplay. I would argue that the reason why videogames tend not to make good films is because the characters in them are often pretty much blank slates intended to be filled in by the player’s personality, not well-developed characters with their own psychology.

The best videogames provide plenty of opportunity to express one’s individuality. We are each of us unique individuals with our own lifestyles, preferences and abilities, and ideally we would have jobs that reflect this. But this could potentially cause problems when combined with the other feature that makes work rewarding, which is that it should be ‘socially meaningful’.

Imagine that my job is very important and valuable to society, and that it is also perfectly tailored to suit the unique individual I am. Obviously this would be tremendously engaging and rewarding work for me, but what if one day I were run over by a bus? The company would be in big trouble, for who could replace me and fit into a position uniquely suited to the individual I am?

On the other hand, if you can somehow reduce or even eliminate the amount of autonomy, discretion and mastery a job requires, you also need to rely less on the individuality of your employees. In so doing, an employee can be treated less like a unique person and more like an interchangeable unit that can be removed and replaced at the employer’s discretion.

This obviously impacts on the employee’s bargaining power, as anyone who feels they are eminently replaceable is not going to ask for better pay or preferable working conditions. Cheaper labour means more profit for employers. Furthermore, employees are in a competitive environment in which they fight to earn enough money to keep from becoming too indebted, to keep up appearances in environments that emphasise material wealth as the sign of success, and in which there are taxes that have to be paid if you don’t want to go to jail. In other words, there is a lot of ‘negative motivation’ leading people to submit to jobs not because they expect to be rewarded if they do, but because they fear the consequences if they don’t.

So, employers have to pay their workers in order to get jobs done, and since they prefer cheaper labour where possible they are incentivised to reduce the qualities of work that make it rewarding, because in doing so they make employees more like interchangeable and replaceable units. Furthermore, in the world of wage labour there are various forms of negative motivation that push people into accepting jobs that are not very rewarding, so unlike videogame designers, employers need not be too concerned that the work they offer sucks.

I think this helps explain why people seemingly don’t want to work. It really is bizarre if you think about it. Imagine an animal like a dolphin, obviously evolved for a life in water and yet seemingly reluctant to leave dry land and go swimming. People are like that. We have large brains housing creative minds, dexterous hands that can use tools in complex ways, sophisticated language that enables us to cooperate and compete in ways no other animal can even imagine, and we are healthiest when mentally and physically active in social situations. In short, we evolved to work, but apparently we don’t want to. At least, that seems to be the attitude people adopt when the topic of UBI comes up. As well as objections about how unaffordable it would supposedly be, people claim that if you did not have to earn wages in order to survive, nobody would work and we would just passively consume TV all day.

Actually, the evidence shows that it is in countries where people spend the most time in jobs that we find the highest consumption of TV. Which makes sense if you think about it: having burned so much energy in their jobs, people don’t have much left for anything else in their spare time. On the other hand, people who live in countries where less time is expected to be spent in jobs tend to do more voluntary work and spend less time sitting in front of the TV.

Also, to get back to the theme of this essay, videogames debunk the theory that nobody wants to work. After all, if that were true, nobody would pay good money to go through the effort of trying to beat the various challenges these games set. Nor would anyone develop their sporting or artistic talents. After all, these things take work, and quite a lot of it in some cases. The reason why we so willingly pay to do the work in videogames is, as I have argued, because the designers of such games have an incentive to make that work as rewarding as they possibly can: at the end of the day they want as many people as possible to buy the game and recommend it to others. Job providers, on the other hand, are incentivised to reduce the qualities that make work rewarding in order to make employees more replaceable and exploitable. Not that all jobs can have such qualities reduced; it’s just that enough can be made unrewarding to explain why 90 percent of people don’t enjoy their paid work.

As Red Dead Redemption 2 shows, it is not really work we don’t like. It is jobs.


REFERENCES

Why We Work by Barry Schwartz

Utopia For Realists by Rutger Bregman

Red Dead Redemption 2 trailer by Rockstar


ON SLAVERY (PART ONE)

The past, it has been said, is a foreign country where things are done differently. At times, when looking back at the past, one feels a sense of relief to live now and not then. Who, for example, has heard accounts of people enduring surgery while awake and aware and not thought “thank goodness I live now, when anaesthesia exists”?

And then there is the practice that is the main topic of this series. Slavery was once legal and widely practiced. Thank goodness we live now, when it is not only illegal but considered so morally repugnant that there are calls to take down the monuments of historical figures whose fortunes partially depended on it, regardless of what philanthropic achievements they may also have accomplished. Not everyone believes this move to strip historical figures of their monuments for failing to live by modern ethical principles is just, but we must all feel that the abolition of slavery ranks as one of the high points of human progress.

Yet I feel like we have the wrong belief when it comes to slavery. Not wrong in the sense of our moral attitude toward it, but in the sense of how we think it ended and the extent to which it did end.

The way its end is popularised

In the popular imagination, it was the superior moral argument that ended slavery. Abolitionists campaigned to make it illegal and, as right was on their side, they ultimately won. And that was that: slavery was abolished. And while it was being practiced, we are encouraged to believe it was always the most brutal violation of a person’s liberty. Dramas and documentaries always portray the practice the same way: white Europeans colonise foreign lands and, finding people of colour and being too prejudiced to see we are all the same beneath the superficial difference of skin tones, treat them like beasts of burden. They round them up, clap them in chains, throw them into the cargo hold of a ship and then sell them in markets to brutal masters who force them to toil under the crack of the whip.

What these beliefs do is make it seem like a chasm exists between the past and the present. Over there, beyond that great dividing line, there was slavery. Thank goodness we live over here where there is freedom and career opportunities.

But, really, this is a flawed belief. The abolition of slavery was neither as decisive nor as complete as we are led to believe. There is no gulf between slavery and jobs; rather, they exist on a continuum. And if there is freedom to be found along this path, then we have not yet reached that point.

The transition

You only have to imagine how a sudden legal abolition of slavery would play out in practice to see why any move toward freedom would have to be a gradual evolution. Picture the scene. You are a slave and, as such, you own nothing at all, not even your own body. But then slavery is abolished and now, for the first time since your capture, there is something you can call your own. You are the sole owner of your labour power. But you don’t own anything else, and everything around you is the property of others. You cannot farm the land in order to grow your own food, because that is somebody else’s private property. You own no tools and have no money with which to buy them, and you cannot just take some, for to do so would be stealing.

All in all, as a former slave who now owns your own labour power and little else, your options are going to feel very limited. In fact, you would probably feel like there is only one thing you can do. You are going to have to beg your former masters to employ you. Now, this is hardly going to be a negotiation between equals. Pretty much all the bargaining power will be in the hands of the rich, propertied and well-connected former owners. So, when you come begging for a job, are you really going to be offered reasonable hours, paid vacations, entitlement to in-work safety protocols and a decent salary? No, certainly not, not if your former masters follow capitalist logic and are out to hire labour as cost-effectively as possible. What you will be offered is work that is barely distinguishable from your former state of complete servitude, with no rights other than the right to quit, wages so low you can only subsist on them (which of course means that actually quitting work altogether feels like an unobtainable fantasy) and (if there are plenty of former slaves also looking for employment) not much chance of a better offer anywhere else. After all, why would former owners not squeeze every last drop of value out of your labour power, when they hold all the bargaining chips?

The comedian Steve Hughes summed up how it really felt the day slavery was ‘abolished’ in one of his stand-up shows:

“Right! You lot are free to go. We’ll see you back here tomorrow at six-thirty!”.

In the next instalment, we will see that the situation was probably even worse, because of how rigged society was against those recently ‘liberated’ slaves.

REFERENCES

The New Human Rights Movement by Peter Joseph

Steve Hughes stand-up routine

ON SLAVERY (PART TWO)

Slavery and racism

No essay on slavery can avoid talking about racial prejudice. After all, racism is often portrayed as being synonymous with slavery. But while there is no denying that an attitude of white superiority has existed, especially during the late 19th and early 20th century, we are wrong to suppose that blacks were enslaved simply because white racists considered them inferior. No, what actually drove slavery (or, at least, American slavery) was economics. Simply put, there was market pressure to secure cheap labour and profitable investments, and the commodity of slave labour simply seemed a better deal than the alternatives.

As professor of sociology William Julius Wilson explained, “the conversion to slavery was not only prompted by the heightened concern over a cheap labour shortage in the face of rapid development of tobacco farming as a commercial enterprise and the declining number of white indentured servants entering the colonies, but also by the fact that the slave had become a better investment than the servant. As life expectancy increased…planters were willing to finance the extra cost of slaves. Indeed, during the first half of the seventeenth century, indentured labour was actually more advantageous than slave labour”.

That term ‘indentured labour’ is worth pondering. You may recall from part one how slavery is often portrayed as a violent theft of a person’s liberty (in movies, for example, there is often a sequence showing people being rounded up and physically forced into their new role as labourers or domestic servants). But a person did not always come to be in a position of servitude because others physically forced them into it. Sometimes people sold themselves into slavery. Now why on earth would anybody do such a thing? For the same reason plenty of people submit to employment. They are in debt and faced with likely punishment if it is not paid, and so they ‘voluntarily’ give up their liberty and labour for others until the debt burden is lifted. In the case of 17th century indentured servants (and quite a few people today, actually) debts were so substantial they could take a lifetime to clear, meaning there was little practical difference in such cases between an indentured servant and an outright slave.

I put ‘voluntarily’ in scare quotes because I believe it is possible that, even though people who made such a decision may not have been physically forced into slavery, there were nevertheless other pressures which, if coercive enough, could have psychologically forced them into a life of servitude. In other essays I have referred to this as ‘negative motivation’: taking action not because you hope to be rewarded if you do, but because you fear the outcome if you don’t. For some reason free market ideologues believe that, once you legally grant individuals the right to withdraw their labour, any deal involving the hiring of labour must be free of any form of coercion and voluntary in the true meaning of the word (“if they don’t like the deal being offered, they can walk away!”). It seems much more realistic to me that, rather than a sharp distinction between unfree slaves and employees whose decision to hire out their labour is entirely volitional, you can instead draw a smooth continuum from the slave who is physically forced into servitude, to the indentured servant who is psychologically coerced into servitude, to the employee whose experience is a ‘carrot and stick’ combination of rewards and punishments, and so on up to the worker who regards his career as his true calling and does it gladly.

European indentured servants were not only similar to slaves in practice. Attitudes toward them were also similar. As civil rights professor Carter A. Wilson explained:

“Colour prejudice against Africans was rare in the first two-thirds of the 17th century. Legal distinctions between black slaves and white servants did not appear until the 1660s…Interracial marriages were common in the first half of the 17th century and…at this time they provoked little or no reaction”.

How slavery became racist

So, if market economics and not racism was what caused slavery, how did prejudice end up such a dominant part of the practice? It seems as though racism and class distinction were deliberately stirred up as a means of exerting control. Around the last half of the 17th century, expanded agriculture in Southern states created a huge demand for cheap labour, and that demand was answered by way of the global African slave trade. That also obviously meant a dramatic increase in the size of the slave population. Thus, it was around this time that public policy began to change, with the intent to create security through hierarchical dominance. Divisions between poor whites and black slaves were deliberately invented in order to achieve the social distinction necessary for hierarchy. According to historian Edmund S. Morgan, for example, a government assembly in Virginia:

“Did what it could to foster contempt of whites for blacks and Indians…In 1680 it prescribed 30 lashes…’if any negro or other slave shall presume to lift up his hand in opposition against any Christian’. This was a particularly effective provision that allowed servants to bully slaves without fear of retaliation, thus placing them psychologically on a par with masters”.

The purpose of this prejudice-based bullying was to ensure the growing slave population remained subdued and controllable. As Peter Joseph put it, it was a move to “generate a culture of bigotry and dominance that echoes to this day. So, in a sense, racism has effectively been a system reinforcer to optimise slave labour by way of sociological manipulation”.

Even after slavery was supposedly abolished, there continued to be an interest in controlling minority and lower-class populations. Segregation played an obvious part here, effectively trapping people in areas and circumstances where political and economic oppression were ever-present. As Peter Joseph explained, “the legal system morphed from direct racial oppression to indirect by targeting the outcomes of historical and present socioeconomic inequality, rather than any specific group”.

In other words, although in theory slavery has been made illegal in most countries, in actual fact societies were, and in many places continue to be, structured in such a way as to ensure a ready supply of labour that is not as free as we would like to believe. More on that in the next instalment.

REFERENCES

The New Human Rights Movement by Peter Joseph

Centuries Of Change by Ian Mortimer

ON SLAVERY (PART THREE)

How slavery is still legal

In part one we were asked to imagine a newly liberated slave who is deciding what to do in order to live. We imagined that he would refrain from stealing or trespassing on the grounds that to do so would break laws. In reality, he would have found it incredibly hard to avoid breaking any laws, because the judicial system was so rigged against his class.

The aftermath of the Civil War left the South in a state of economic turmoil, and under such chaotic conditions authorities played fast and loose with the power to arrest and detain. Vaguely defined vagrancy laws and other dubious charges gave them reasons to round people up (typically blacks and the poor). This actually had little to do with a drive to restore law and order. The purpose was to ensure prisons were kept well stocked. You see, forced labour as a form of punishment was still legal, so anyone (a former slave, say) who got arrested and convicted of some such charge could be commanded to do what was, to all practical intents and purposes, slave labour. The practice even had a name: convict leasing. So popular was this practice that, by 1898, 73 percent of Alabama’s total revenue was derived from convict leasing, and it took many decades for federal government to shut it down completely.

But, actually, an argument could be made that the practice was never completely abolished. Even today we have private prisons and corporations exploiting the labour of inmates. Companies like McDonald’s and Starbucks ‘employ’ prisoners, who in some cases earn as little as 23 cents an hour. Also, there are contractual agreements between state and local governments and private prisons that require the state to meet prison-occupancy quotas or otherwise pay for empty cells. The practice of convict leasing resulted in corrupt arrests being carried out in order to meet labour demand, and this current practice of maximising prison occupancy regardless of a region’s actual crime levels has also resulted in corruption. There was, for example, the 2008 ‘kids for cash’ scandal in which two Pennsylvanian judges took millions in bribes from a for-profit prison company in return for increasing the number of inmates. With a pool of labour for hire at mere pennies an hour, one can appreciate the economic incentive to keep prisons well stocked.

The prison-industrial complex

Having said that, the largest beneficiary of slave labour from prison inmates is not private business but rather the State. As was explained in the Storyville documentary “Jailed In America”, “when someone is convicted and moves from jail to a federal or state prison, the government now has legal access to them as a workforce. These prisoners work for almost nothing, making road signs…or just about anything the government decides”. They may also be put to work providing services the prisons require in order to function, such as doing laundry or maintaining the building’s plumbing. Incarceration is part of a massive prison-industrial complex, an industry worth some $265 billion a year. It could not exist were it not for inmates, and so there needs to be a steady supply of new people. Hierarchical societies are structured in such a way as to ensure poor people face limited life choices that are highly likely to lead to incarceration. And the way parole is conducted further supports the idea that the prison-industrial complex is structured to provide a supply of slaves. Being on parole comes with conditions which, if broken, lead to violators being returned to prison. These conditions include such things as being homeless or out of work. Note that for everybody else these circumstances are not illegal. Nevertheless, for those on parole, being made homeless or losing your job (and plenty of other situations that involve no law-breaking) results in being thrown back into jail and the slave labour that often awaits.

Why do we punish the guilty?

When it comes to prisoners, we are encouraged to believe that inmates are just bad people who freely chose to commit crime. Such an attitude probably has its roots in monotheism and its portrayal of the human as an individual with free will who exists separate from the rest of nature. Although one should be careful not to absolve individuals of all personal responsibility, the fact of the matter is that what free will we have is easily overcome. Both magicians and fraudsters understand and exploit flaws in our ability to make decisions and process information, tricking us into carrying out actions of their choosing while believing we are exercising pure free will. There are also plenty of experiments that show how easily people’s ability to make independent choices can be swayed by peer pressure and pressure from authority. Environmental and social factors impact on our ability to freely choose, and these predominantly affect the lower classes. What kind of upbringing you had, the state of your education, the quality of your diet, economic factors and more can set people on a course that is more likely to end in a conviction compared to the life choices presented to others.

Again, I should stress that this is not being pointed out in order to argue that personal responsibility does not exist, because it does at least to some degree. But, equally, we really shouldn’t condemn those found guilty when we know nothing of the factors that may have influenced the way their life turned out. Crime is sometimes described as a ‘social disease’. Sometimes it is necessary to quarantine people who have a contagious biological disease. Note, however, that no moral condemnation is attached to such a decision. But when it comes to those who catch the social disease of criminality there does tend to be moral condemnation along with the need to separate such people from society. Any society based around competition for material advantage via whatever method you can get away with, and which also incentivises negative attitudes towards the losers of such competitive behaviour (being labelled as failures and so on), is just bound to create conditions in which some will succumb to the temptations of crime. In a neo-liberal free market where everything is a commodity with a price tag attached, how ethical you are depends on how ethical you can afford to be. Morality doesn’t really come into it. As Peter Joseph said, with regard to corporations exploiting the cheap labour of prison inmates:

“This pursuit of cost-efficiency is what notably defines market efficiency…This is simply the nature of capitalist logic, and the still-common idea that the rise of capitalism was somehow instrumental in the general ending of abject slavery on the structural level is little more than denialism”.

Indeed. For, as we have already seen, the popular conception of how slavery ended is quite wrong. It did not just end with the passing of laws that made it illegal. Rather, there has been a long process of rooting out the opportunities for exploitation and establishing the rights people (particularly the lower classes) require in order to live in reasonable comfort and security. While capitalism should get some credit for its contribution toward creating the wealth that makes it more affordable to be ethical, it should not be forgotten that most rights we have now come to expect as workers had to be fought for. I have no doubt that, were it not for pressure from socialist movements, work under capitalism would have remained so exploitative that life for the majority would be akin to slavery, and the wealth generated would be much more concentrated, as indeed it has been in all redistributive societies where the poor have little or no voice with which to protest their conditions (those being the sort of societies we have predominantly lived in since the Neolithic revolution).

Nor should we kid ourselves into thinking the struggle to end slavery is over. It continues to exist in varying degrees of obviousness, mostly because the root cause of most slavery persists to this day. We have encountered this cause several times throughout this series. It was there when we talked about people in debt selling themselves into slavery. It turned up again by implication when we discussed the forced labour of prisoners, for incarceration has long been justified as a means of making wrongdoers ‘pay their debt to society’.

Yes, the root cause of exploitation is debt. That’s what we will look into next time.

REFERENCES

Storyville: Jailed In America

The New Human Rights Movement by Peter Joseph

ON SLAVERY (PART FOUR)

Debt

Along with war and conquest, the imposition of debt stands as one of the two major causes of slavery and servitude. Some would probably argue that debt is a natural and inevitable part of any society involving interactions between individuals with differing means and needs. While it’s true that any society can only function so long as people recognise and meet ongoing obligations toward one another, the amount of debt that exists in the world today is way out of proportion to anything required to maintain a prospering, egalitarian society. It is instead diagnostic of a market system that has become decoupled from reality.

Much of the pursuit of profit now has little to do with making physical products intended to solve real problems, but instead has moved to the abstract world of financialisation in which Wall Street and its equivalent in other countries collude with governments to create and manipulate complex forms of debt. Major companies no longer derive their profits principally from selling actual products. Instead they float their shares on the stock market, borrow cheap money from the government, buy back their own shares and thereby gain a boost in the paper profit of the company.

This move into abstraction is not without consequences, for there are real downsides to this expansion of debt. Any investigation into how banking works will reveal that banks don’t actually lend out money others have deposited, but instead create money ‘out of nothing’ whenever anyone meets the criteria of being worthy of a loan.

Actually, money is not really being created out of nothing. Rather, wealth is being snatched from the future in order to pay for goods and services here and now. This practice is fair enough when the wealth snatched from the future gets into the hands of those who really can build a better tomorrow. But in reality it too often ends up being used for short-term profit that ultimately causes long-term harm. Banking is a complex system in which people bring about the creation of money out of debt (and of course it is predominantly the poor who need to take out loans) and then, thanks to the negative and positive externalities of market capitalism, the debt and money separate, with the upper classes extracting the money while the poor get burdened with the debt. Are there exceptions to this rule? Yes, but then one can also find exceptions to the ‘survival of the fittest’ rule that drives evolution. Nevertheless evolution is fact and the consequences of this kind of inequality are fact.
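
To make the ‘loans create deposits’ point concrete, here is a minimal sketch of the bookkeeping involved, written as a few lines of Python. It is a deliberate simplification (real banks must also juggle reserves, capital requirements and interbank settlement), and the figures are invented for illustration:

# A minimal sketch of 'loan creates deposit' bookkeeping.
# The two-entry balance sheet and the amount are illustrative
# assumptions, not any real bank's accounting.
bank = {"loans": 0, "deposits": 0}  # assets and liabilities

def make_loan(bank, amount):
    # No saver's money is handed over; the bank marks up both
    # sides of its balance sheet at once.
    bank["loans"] += amount     # new asset: the borrower's promise to repay
    bank["deposits"] += amount  # new liability: spendable money in the borrower's account

make_loan(bank, 100_000)
print(bank)  # {'loans': 100000, 'deposits': 100000} -- new spendable money now exists

Note that nobody’s savings were debited: the deposit is brand-new spendable money, matched by a brand-new debt, which is why money and debt begin life joined together.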

How impactful is debt? It’s relative…

How much it matters that you are in debt really depends on how likely it is that you will encounter somebody more powerful than you who can demand repayment. This means that, for the most powerful player of all, debt is of no consequence whatsoever because the day of reckoning will never come. As Alan Greenspan once pointed out, “the US can pay any debt it has because we can always print money”. Or, to put it another way, the US can endlessly snatch wealth from the future without fear that a mighty one will one day come along demanding repayment (of course this rests on the assumption that the country remains the dominant power in the world).

But for weaker players, it’s a different story altogether. Consider the words of President Obasanjo of Nigeria:

“All that we had borrowed up to 1985 or 1986 was around $5 billion and we have paid back so far about $16 billion. Yet, we are told that we still owe about $28 billion…because of the foreign creditors’ interest rates”.
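
A toy calculation shows how figures like those are arithmetically possible. The interest rate and repayment schedule below are assumptions invented purely for illustration, not Nigeria’s actual loan terms:

# Toy illustration of debt compounding faster than it is repaid.
# All figures are assumed for illustration only.
debt = 5.0           # initial debt, in billions of dollars
rate = 0.25          # assumed annual interest rate
payment = 16.0 / 15  # about $1.07bn repaid each year ($16bn over 15 years)

for year in range(15):
    debt = debt * (1 + rate) - payment

print(f"Repaid: $16bn. Still owing: ${debt:.0f}bn")  # roughly $25bn still owed

Despite repaying more than three times the original principal, the borrower in this toy example still owes around five times what was borrowed, which is the dynamic Obasanjo describes.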

By 2004, the developing world was having to pay $20 in interest repayment for every dollar received in foreign aid and grants. The result was crippling austerity and the creation of highly vulnerable people ripe for exploitation by predatory corporations. And austerity is not just a third-world phenomenon. Even rich countries had to put up with it following the last great speculative bubble (in sub-prime lending, as you may recall). But, in keeping with the idea that the powerful escape the consequences of bad societal decisions while the weak must bear the cost, the CEOs who led the way in reckless speculation got away with it for the most part, riding off into the sunset with breathtakingly large pensions and severance packages, while the poor had services cut and good, secure jobs taken away and replaced with gig work stripped of many hard-won benefits.

As debt grows and its harmful consequences fall predominantly on the vulnerable, such people become more desperate, more prepared to do anything to delay the day of reckoning. As Kevin Bales, who is an expert in human trafficking, explained, “the question isn’t ‘are they the right colour to be slaves?’, but ‘are they vulnerable enough to be enslaved?’. The criteria of enslavement today do not concern colour, tribe, or religion; they focus on weakness, gullibility and deprivation”.

How many are slaves today?

Slavery has not been abolished; it still exists to varying degrees. That it continues to this day cannot be doubted (any human rights organisation will correct you with evidence if you believe otherwise), but how much of it there is depends on how you define servitude. According to UN estimates there are roughly 27 million slaves in the world today. However, another organisation, the Walk Free Foundation, has put the total at more like 46 million. They are mostly bonded labourers or debt slaves in India, Pakistan, Bangladesh and Nepal.

But could the numbers be higher still? Think back to the notion of debt bondage and selling oneself into slavery, which we touched upon in part one. What, fundamentally, is the difference between selling yourself to one master for life, and being in a position where you must constantly make your labour available for hire, toiling away for minimal reward while others capture most of the value generated by workers like yourself? Doesn’t that just show a continuum of exploitation from abject slavery to indentured servitude to wage labour? Yes, one could argue that the conditions of wage labour are preferable to outright slavery (at least in a lot of cases), but you cannot really call either condition ‘freedom’. After all, if to be free is to work for oneself and to gain most of the rewards from a job well done (and also to carry the costs of not doing your work competently), then precious few can claim to be truly liberated from the bonds of servitude. For as Federal Reserve expert G. Edward Griffin said:

“No matter where you earn money, its origin was a bank and its ultimate destination is a bank…This total of human effort is ultimately for the benefit of those who create fiat money. It is a form of modern serfdom in which the great mass of society works as indentured servants to a ruling class of financial nobility”.

If Griffin’s argument is valid, the true number of slaves in the world today would be counted in the billions.

CONCLUSION

At the beginning of this series a question was posed: is the popular portrayal of slavery’s end incorrect? We have seen how slavery did not just get abolished with the passing of an Act, creating a gulf between the un-free past and the liberated now. Rather, escape from slavery has been a long process that has made only modest progress in breaking the bonds of servitude and, in some cases, none at all. Progress toward freedom is so slow because, at its very root, market capitalism contains the socioeconomic structures that have given rise to exploitation since the Neolithic period: systems that justify competition, self-interest, hierarchy and inequality, perpetuating scarcity and profiteering from the growing environmental and social fallout of negative externalities by exploiting the vulnerabilities that lower-class people and developing nations feel under such circumstances.

Yes, to some extent progress has been made. But not as much as we are led to believe by apologists for capitalism, and certainly nowhere near as much as is technically possible. For example, much is made of the apparent reduction in abject poverty around the world (a condition most likely to result in exploitation). But what’s not appreciated is that such results are obtained by using an infeasibly low threshold for absolute poverty. Were we instead to use the ‘Ethical Poverty Line’ devised by Peter Edward (set at about $7.40 a day), then 4.2 billion people, or 60 percent of the world, remain in an impoverished state, ripe for exploitation.

Or consider that it would cost about $30 billion a year to end world hunger, and that the 1800 billionaires in the world could fund that for 200 years and still be left with roughly $500 million each. It is disgusting that malnourishment and other forms of deprivation that are completely unnecessary continue to exist. The reason they persist is because market capitalism profits from servicing the problems they generate, and really has no interest in bringing about an end to scarcity, because assumptions of scarcity are fundamental to how this competitive system works. If you add up all the deaths caused by the various negative externalities ultimately traceable to market competition’s root socioeconomic orientation, capitalism has killed more people than all of the 20th century’s despots combined, and has enslaved more people than any other system in history.
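
The arithmetic behind that billionaires claim is easy to check. The combined-wealth figure below is my own assumption, chosen because it is consistent with the claim and broadly in line with published estimates of total billionaire wealth around the time of writing:

# Back-of-envelope check on the billionaires claim.
# combined_wealth is an assumed figure, consistent with the claim above.
billionaires = 1800
combined_wealth = 6.9e12      # assumed total wealth: $6.9 trillion
hunger_cost_per_year = 30e9   # $30 billion a year to end world hunger
years = 200

total_cost = hunger_cost_per_year * years               # $6 trillion
left_over_each = (combined_wealth - total_cost) / billionaires
print(f"${left_over_each:,.0f} left per billionaire")   # about $500,000,000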

REFERENCES

The New Human Rights Movement by Peter Joseph

The Creature From Jekyll Island by G. Edward Griffin

Modernising Money by Andrew Jackson and Ben Dyson


WHY IS THERE ‘MAKE-WORK’?

Do you always have work to do at your place of employment, or is your work of a kind where sometimes you are busy, while at other times there’s not much to do? If you are one of those employees working where activity goes through peaks and troughs, chances are you have encountered an attitude that is usually accepted as normal but which would have been regarded as quite bizarre by most of our ancestors.

The best way to explain what I mean is to quote an employee who has experienced this weird attitude. David Graeber collected several such interviews in his book, ‘Bullshit Jobs: A Theory’. Here’s a typical example from ‘Patrick’, who worked in a convenience store:

“Being on shift on a Sunday afternoon…was just appalling. They had this thing about us not being able to just do nothing, even if the shop was empty. So we couldn’t just sit at the till and read a magazine. Instead, the manager made up utterly meaningless work for us to do, like going round the whole shop and checking that things were in date (even though we knew for a fact they were because of the turnover rate) or rearranging products on shelves in even more pristine order than they already were”.

What I am referring to, then, is that attitude employers have that regards slack time as something that should be filled with pointless tasks or ‘make-work’.

How else might these slow periods be dealt with? I can think of a few alternative options. The business could send unneeded staff home without pay. They could send them home with pay. They could require them to stay at their posts, but let the staff socialise, play games or pursue their own interests until there are real work-based duties to carry out.

Of all these options, sending staff home with pay is the least popular. It hardly ever happens. Letting employees do their own thing during slow periods is also pretty unusual. Sending staff home and forfeiting their remaining wages is more widely practiced, especially with zero-hours contracts that specify no set hours. But if you are in a regular job and there are times when the work is slow, the most common solution is to have that time filled with useless tasks.

It’s hard to see how this practice of making up pointless tasks is in any way productive. Indeed, a case could be made that it encourages anti-productivity. David Graeber recalled an incident from when he worked as a cleaner in a kitchen: he and the rest of the cleaning staff pulled together to get everything done as well and as quickly as possible. With their work completed, they all relaxed…until the boss turned up.

“I don’t care if there are no more dishes coming in right now, you are on my time! You can goof around on your own time! Get back to work!”

He then had them occupy their time scrubbing baseboards that were already in pristine condition. From then on, the cleaning staff took their time carrying out their duties.

Graeber’s boss’s outburst provides insight into why this attitude exists and why it would have seemed so peculiar to our ancestors. He said, “you are on my time”. In other words, he did not consider his staff’s time to be their own. No, he had purchased their time, which made it his, and so to see them doing anything but looking busy felt almost like robbery.

How Our Ancestors Worked

But our ancestors could not possibly have conceived of time as something distinct from work and available to be purchased, and they certainly would have seen no reason to fill down time with make-work. You can tell this is so by noting how make-work is absent from the rest of the animal kingdom. You have animals that live short, frenetic lives, constantly busy at the task of survival. Think of the industrious ant or the hummingbird, forever moving in the search for nectar. You have animals that are sometimes active but at other times take life easy, such as lions who mostly sleep and only occasionally hunt. But what you never see are animals being instructed to do pointless tasks.

There’s every reason to believe our ancestors would have been under no such instructions, either, particularly when you know a bit about the kind of societies they lived in and the practicalities they faced. Our earliest ancestors lived in bands or tribes in which there were no upper or lower classes, for the simple reason that the hunter-gatherer lifestyle would not permit much social stratification.

This should not be taken to mean that there was absolute equality among members of bands or tribes, however. Leaders did emerge, distinguishing themselves from the rest of the band or tribe through qualities like personality or strength. Both bands and tribes had big-men, recognised in some ways as leaders. But such leaders would have been barely distinguishable from ordinary tribe members. At best, the big-man could sway communal decisions; he had no independent decision-making authority to wield, nor did he know any diplomatic secrets that could confer individual advantage. Moreover, the big-man’s lifestyle was indistinguishable from everyone else’s. As Jared Diamond put it, “he lives in the same type of hut, has the same clothes and ornaments, or is as naked, as everyone else”.

Given that our ancestors were hunter-gatherers, it would have made no sense for ‘big-men’ to make anyone fill spare time with make-work. No, the sensible thing would have been to permit relaxation during slack periods so that there would be plenty of energy when the time came to put it to good use. You can imagine how there would have been seasons in which there was plenty of fruit to gather, or moments when everyone should mobilise to bring home game. But afterwards, when the fruit was picked and the hog roasting on the spit, the time left was better spent playing, socialising, or resting.

This is, in fact, how we evolved to work. We are designed for occasional bursts of intense energy, each followed by relaxation as we slowly build up for the next short period of high activity.

This work pattern could hardly have changed much when human societies transitioned to farming and developed into chiefdoms and larger hierarchical societies. After all, farming is also very seasonal work, so here too it would have made much more sense to adopt work attitudes that encouraged intense activity when necessary (such as when the harvest was ready to be gathered) but at other times to just leave the peasants alone to potter about minding and maintaining things or relaxing.

Now, it’s true that the evolution of human societies into hierarchical structures not only entailed the emergence of a ruling ‘upper class’ but also a lower caste of slaves and serfs. But, although we commonly conceive of such lower caste people as being worked to death by brutal task-masters, in actual fact early upper classes were nowhere near as obsessed with time-management as is the modern boss and didn’t care what people were up to so long as the necessary work was accomplished. As Graeber explained, “the typical medieval serf, male or female, probably worked from dawn to dusk for twenty to thirty days out of any year, but just a few hours a day otherwise, and on feast days, not at all. And feast days were not infrequent”.

So, our ancestors saw no need to fill idle time with make-work, partly because it was (and still is) of little practical use. But if the masters of serfs could plainly see how silly it is to force make-work on their serfs, why can’t modern managers grasp the same thing with regard to their staff? Well, it all has to do with concepts of time, and that’s something we’ll look into next time…

REFERENCES

Bullshit Jobs: A Theory by David Graeber

Guns, Germs and Steel by Jared Diamond

WHY THERE IS ‘MAKE-WORK’

If you could go back in time and say to somebody, “can I borrow you for a few minutes?”, your request would be met with a baffled look. This is because such a person would have no understanding of time as being broken up into hours, minutes and seconds. Instead, what understanding of time there was consisted of passing seasons, cycles like day and night, or the length of time actions took, on average, to perform. “I will be there in five minutes” means nothing to a rural person in Madagascar, but saying a journey takes two cookings of a pot of rice would let somebody know how long it was likely to take. As Graeber explained, for societies without clocks, “time is not a grid against which work can be measured, because the work is the measure itself”.

It is because our ancestors had no ‘clock’ concept of time that they could not conceive of somebody’s labour-power as being distinct from the labourer himself. Consequently, if somebody came across, say, a cooper, they could imagine offering to buy the barrels he made, or they could imagine buying the cooper himself. But the notion of buying something as abstract as time? How was that possible?

Well, once slavery came about, our ancestors did have an approximation to modern employment practices, in that slaves could be rented instead of bought outright. Whenever we find examples of wage labour in ancient times, it pretty much always involves people who were slaves already, hired out to do some other master’s work for a while.

Around the 19th century we do see occasional warnings by plantation owners that slaves had best be kept busy during idle periods, for who knows what they might plot if left with time on their hands? But it took technological innovations from the 14th century onward to really make time seem like a commodity that could be bought, spent, misspent or stolen.

Clocks and buying time

What set us on the road to bosses complaining about ‘their time’ being wasted was similar to what led to the evolution of money. Our ancestors lived in gift-based economies in which favours were freely undertaken with the vague understanding that they would be suitably reciprocated at a later date. But when was a favour suitably reciprocated or a slight adequately compensated? Such questions led to rules, regulations, laws and contracts that gradually quantified obligations and transformed them into debts and credits that could be precisely calculated.

By the 14th century, clocks had been invented and began to show up in town squares. But where the clock-based concept of time really took off was in the factories of the industrial revolution. The increasing routinisation and micro-tasking of work that typified the production line brought about the quantisation of time into discrete chunks that could be bought, and the need to coordinate logistics led to standardised times (imagine running trains when no two towns agree on when it is 2PM). By dividing time into the now-familiar hours, minutes and seconds, we created a concept of time that conceives of it as a definite quantity that can be purchased, distinct from both the labourer and his produce. It became possible to conceive of buying a portion of his time and owning whatever produce was created during that time, while not actually owning the labourer himself. This, of course, is what distinguishes an employee from a slave.

But once we began thinking about time as discrete units that could be bought, that then led to a belief that time could be wastefully spent, not just by being literally idle but by spending ‘somebody else’s’ time doing your own thing, like playing a board game or reading a magazine. The attitude I referred to earlier (‘don’t let slaves be idle lest they plot to free themselves’) was carried over to working practices in industrial cities. This, combined with the idea that you could buy somebody’s time but they could then waste (misspend) ‘your’ time, led to the peculiar modern notion of time discipline and its obsession with busyness and make-work. Once you get to the 18th century and onwards, you get the emergence of bosses and upper classes who increasingly viewed the old episodic style of working (occasional bursts of intense energy, each followed by relaxation and a slow build-up to the next burst) as problematic rather than sensible. Moralists came to see poverty as being due to bad time-management. If you were poor, it was because your time was being spent recklessly or wastefully. What better remedy than to have your misspent time purchased by somebody who was rich and, therefore, better able to budget time carefully, as one who is frugal would budget and dispose of money?

It was not only the bosses who came to see time as purchasable units that might be misspent. So, too, did employees, especially since the old struggle between the conflicting interests of employer and employee meant the latter also had to adopt the clock concept of time. If you are an employee, you want an hourly wage for an hour’s work. But if you are the boss, it would be preferable to somehow extract more than an hour’s work for an hour’s pay. Early factories did not allow workers to bring in their own timepieces, which meant employees only had the owner’s clock to go on. Such owners regularly fiddled with the clock so as to appropriate more value from their employees (by getting them to do overtime for free). This led to arguments over hourly rates, free time, fixed-hour contracts and all that. But, as David Graeber pointed out, “the very act of demanding ‘free time’…had the effect of subtly reinforcing this idea that when the worker was ‘on the clock’ his time truly did belong to the person who had bought it”.

So, the belief that any spare time in work should be filled with pointless tasks came about as a result of somebody’s time becoming conceived of as distinct units that somebody else could buy and, consequently, as something that could be stolen or misspent. This in turn led to a form of moralising that regards idleness as a sin to be eradicated through the provision of make-work, that feels indignation upon seeing employees doing anything other than their jobs, and that obliges them to pretend to carry out tasks when their actual job is done.

It’s not just in stores, offices and factories that this attitude prevails. Where care work is concerned, the service being offered can sometimes consist of being on stand-by just in case the elderly client needs attention. But the elderly person can get so indignant about the carer ‘sitting around wasting my money’ that carers, too, end up being asked to pretend to do ‘something useful’, like tidying up a home they have already tidied. From the perspective of the stand-by carer, this can make the work intolerably frustrating.

The future of make-work

Make-work also has worrying implications if future technological capabilities turn out to be as potent as futurists like Ray Kurzweil claim. I would argue that each major work revolution has focused on successively less urgent demands. The agricultural revolution was concerned with food production, which is of obvious importance since we cannot live without food, nor do any other work without adequate nutrition. The industrial revolutions (and the socialist movements that accompanied them) led to higher standards of living and increased comfort. While not as essential as food, conveniences like microwaves, carpets and television sets make life more pleasant, and the products of manufacturing enable us to carry out essential work with greater ease.

But what happens when people have enough of what they need to lead healthy, comfortable lives? Their consumption slows, and that’s anathema to a growth-based system like market capitalism. No wonder, then, that from the mid-20th century onwards public-relations pioneers like Edward Bernays were working with advertising departments to create fake needs so as to sell bogus cures. No wonder, then, that we went from being utilitarian in our attitude toward products, buying them for practical purposes and making do and mending in order to get the maximum possible use out of our stuff, to adopting a throwaway culture, replacing things merely because they are out of fashion, or because they were designed to fail as soon as the manufacturer could get away with and were never built to be easily maintained.

General AI and atomically-precise manufacturing could drastically increase the efficiency with which we manage and rearrange materials, radically reduce waste, and free up time, since we would have the means to automate most of today’s jobs. Once we have automated jobs in agriculture, manufacturing, services and administration, the sensible thing would be to pursue interests outside the narrow sphere of wage labour. It would be a good time to rediscover the episodic working practices of our ancestors and the greater commitment to social capital typical of tribal living, only with the added bonus of immense technological capability to keep us safe from the hardships that do sometimes affect hunter-gatherer societies.

But is such an outcome likely to happen when it has to evolve within a system based on a throwaway culture, where work is seen as virtuous in and of itself to the extent that ‘spare time’ is considered something to be filled with pointless tasks? What I am saying is that markets have already proven themselves capable of creating scarcity where few real needs exist. It is therefore not too great a leap of imagination to suppose that the moral indignation stemming from the attitudes ‘time is money’ and ‘you are misspending my time’ could work against what should be capitalism’s greatest triumph: unlocking the potential abundance inherent in the Earth’s richness of resources and elevating us to positions where we can live comfortable lives, lives that need not come with the condition that some must adopt extreme frugality, and in which we are free to become all we can be within a more rounded existence. Instead of that promising outcome, we might well just fill the technological-unemployment gap with make-work and bullshit jobs.

What a waste of time it would be if that were to happen.

REFERENCES

‘Bullshit Jobs: A Theory’ by David Graeber

‘Guns, Germs and Steel’ by Jared Diamond


The Road To Freedom?

In 1944 the Austrian economist Friedrich August von Hayek published ‘The Road To Serfdom’. The book set out to argue that the free market is the only viable way of bringing about freedom and prosperity. Actually, the book talks not so much about the virtues of free markets as about the downsides of the alternative which, at the time, was central planning. Hayek’s argument was that we can only handle the complexities of reality in a bottom-up fashion, with individuals looking after their own self-interests while guided by pricing signals. This, he reckoned, would result in the efficient allocation of resources arising from what would now be called emergent behaviour.

On the other hand, if we instead relied on a centralised authority to determine resource allocation, such an authority would inevitably find the complexity of modern economies too much to handle. The only way the authority could gain some measure of control would be to exercise more power over the people, restricting their freedom and making them live their lives according to some plan. Thus, a socialist economy would become more authoritarian over time. As the title of the book said, Hayek reckoned socialism to be the road to serfdom.

It’s fair to say that the book remains one of the classic texts of neo-liberalism. Margaret Thatcher described Hayek as one of the great intellects of the 20th century, and he was awarded the Nobel Prize in economics in 1974. Even now, more than seventy years after its publication, it is still regarded as a definitive refutation of leftist politics and proof that only neo-liberalism can deliver prosperity. You could say that Hayek is as important a figure to the free market as Karl Marx is to communism.

But, I wonder, does Hayek’s argument really demolish every alternative to neo-liberalism? Do the selfish pursuit of money and the conversion of everything into a commodity to be bought and sold on the market still stand as the only way we can achieve peace and prosperity? Or are its advocates wrong to say there is no alternative?

I would say there is an alternative. We are no longer restricted to the either-or choice of laissez-faire capitalism or authoritarian central planning. There might just be a third way.

It’s worth bearing in mind the time in which Hayek wrote his book and how things have changed since then. At the heart of his argument is the belief that the world is really, really complex and, because of this, far too much information is generated for a centralised authority to handle without imposing real restrictions on individual liberty. Only market competition guided by pricing signals can manage such complexity. But, remember, he was writing in 1944. Communications back then were a good deal more primitive than is the case today. There was not one satellite in orbit. Now we have many hundreds, if not thousands, constantly monitoring all kinds of things: weather patterns, urban sprawl, how crops are faring, and so on. This amounts to a network of sensors enveloping our planet and allowing for real-time feedback about all kinds of important things. Such a perspective simply didn’t exist when ‘Road’ was published.

The advances we have made in our ability to transmit information are truly remarkable. The numbers are hard to grasp, being pretty astronomical, so let’s give ourselves a standard of comparison and see if that helps. The author James Martin proposed the ‘Shakespeare’ as the standard of reference for our ability to transmit information. One Shakespeare is equivalent to 70 million bits, enough to encode everything the Bard wrote in his lifetime.

Using a single laser beam, you can transmit 500 Shakespeares per second. That sounds impressive, but fibre-optic technology can do much better. Using a technique called Wavelength Division Multiplexing, the bandwidth of a fibre can be divided into many separate wavelengths. Think of it as encoding information on different colours of the spectrum. Some modern fibres can carry 96 laser beams simultaneously, each beam carrying tens of billions of bits per second, which works out at roughly 13,000 Shakespeares per second for the fibre as a whole.

But we are still not done, because many such fibres can be packed into a single cable. Indeed, some companies make cables with more than 600 strands of optical fibre. That is sufficient to handle some 14 million Shakespeares per second, or a thousand trillion bits per second.

Think about that. We can now transmit the equivalent of Shakespeare’s lifetime output 14 million times over, every second, from one side of the planet to the other, almost instantaneously. Of course, this is quantity of information and not necessarily quality (not everything we send over the Internet is of Shakespearean standards!), but the point is that we can now send an awful lot of information around the world, whereas this was simply not possible in Hayek’s day.
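For anyone who wants to check these conversions, here is a quick back-of-the-envelope sketch in Python. The per-beam rate of 10 gigabits per second is my own assumption, picked to be consistent with the ‘tens of billions of bits per second’ mentioned above; the other figures are the ones quoted in this essay.

```python
SHAKESPEARE_BITS = 70e6  # one 'Shakespeare': ~70 million bits (James Martin's unit)

def shakespeares_per_second(bits_per_second):
    """Convert a raw bit rate into Shakespeares per second."""
    return bits_per_second / SHAKESPEARE_BITS

# A single laser beam at ~35 Gbit/s:
print(shakespeares_per_second(35e9))       # ~500 Shakespeares per second

# A 96-wavelength fibre, assuming ~10 Gbit/s per beam:
print(shakespeares_per_second(96 * 10e9))  # ~13,700 Shakespeares per second

# A 600-fibre cable carrying a thousand trillion (1e15) bits per second:
print(shakespeares_per_second(1e15))       # ~14.3 million Shakespeares per second
```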

It would do little good to transmit petabits of information if we did not also improve our ability to store and crunch that data. In 1944 computers barely existed. What computers did exist came in the form of room-sized electromechanical behemoths that consumed huge amounts of power and were so temperamental only specialised engineers could be trusted to go near them.

Ray Kurzweil once said, “if all the computers in 1960 had stopped functioning, few people would have noticed. A few thousand scientists would have seen a delay in getting printouts from their last submission of data on punch cards. Some business reports would have been held up. Nothing to worry about”. And this was in 1960, over a decade after ‘Road’ was published.

Since then, Moore’s Law (which concerns the price-performance of computer circuitry) has increased the power of computers billions of times over, shrinking hardware from the room-sized calculators of old to swift, multi-tasking supercomputers that slip easily into your pocket. Price-performance has risen from about 100 calculations per second per thousand dollars in 1960 to well over a billion cps by 2000. Such an improvement means we can treat computing as essentially free, as proven by the way people are constantly on their web-enabled devices without ever fretting about how much it is costing. Computers have also become increasingly user-friendly over time, going from devices that required considerable technical skill for even simple tasks to modern conveniences like Alexa that can be interacted with through ordinary conversation.
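As a rough sanity check on those figures, we can work out the growth rate they imply. This is just arithmetic on the two data points quoted above, not a general claim about Moore’s Law:

```python
import math

cps_1960 = 100    # calculations per second per $1,000 in 1960 (figure quoted above)
cps_2000 = 1e9    # calculations per second per $1,000 by 2000 (figure quoted above)
years = 2000 - 1960

growth = cps_2000 / cps_1960    # a ten-million-fold improvement
doublings = math.log2(growth)   # ~23 doublings over the period
print(years / doublings)        # ~1.7 years per doubling
```

A doubling time of well under two years is consistent with popular formulations of Moore’s Law.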

The result of all this technological progress is that we are now practically cyborgs from infancy, thanks to the near-constant access to enormously powerful and intuitive computational devices. We live as part of a vast, dense network of bio-digital beings, connected to one another regardless of distance and with ready access to all kinds of information and digital assistance.

What this has to do with Hayek’s argument was expressed in an opinion put forward by David Graeber: “One could easily make a case that the main reason the Soviet economy worked so badly was because they were never able to develop computer technology efficient enough to coordinate such large amounts of data automatically…now it would be easy”.

In part two, we will see how the Internet and other technological advances provide options that were not feasible when ‘Road’ was written.

When Hayek wrote his book there was no Internet. Nobody was a blogger. Not one video had been uploaded. There was not a single Wikipedia entry, not one modded videogame. Linux and bitcoin were not words in anyone’s vocabulary. Now, such things are a ubiquitous part of modern life and most of them are free, part of the collaborative commons. OK, the price of bitcoin went crazily high, but its founder provided the underlying blockchain technology gratis and made its white paper public knowledge, so that anyone could improve and expand upon it to create things like a decentralised social media site built on a blockchain.

Indeed, there is now a great deal we can do on a voluntary basis. Much of the content of the web owes its existence more to passion than to the pursuit of money. Jeremy Rifkin calls this ‘collaboratism’. Collaboratism means engaging in work not because financial pressures or some authority compel you to, but because the means of producing and distributing stuff has become cheap enough that anyone with the drive to do something can flex their creative muscles and connect with others who have complementary strengths and weaknesses.

This kind of technological progress changes many things. For example, when you have ready access to manufacturing or logistical systems, it makes more sense not to own stuff privately (which nearly always entails that stuff sitting in storage, unused, for most of its life) but rather to use stuff as and when you need it, and then make it available for others to use when you don’t. Think, for example, of driverless cars that could be there when you need transport and make themselves available to others when you don’t. If that car were your own private possession, it would probably be parked somewhere, not being used by anyone, for long stretches. What a waste of resources!

This is the kind of world advocated by the Zeitgeist Movement. Critics of Peter Joseph tend to dismiss him using the same arguments Hayek used in ‘Road’. But this is to fundamentally misunderstand Joseph’s position. He is in no way advocating any centralised control, but rather more efficient decentralised methods than the corrupt monetary systems that are leaking value from today’s markets.

As to why neo-liberals tend to mistake Zeitgeist’s resource-based economy for central planning, maybe it can be traced back to concept drawings by Jacque Fresco? His Venus Project shows plans for cities whose infrastructure is organised into a circle, at the centre of which sits a big computer monitoring the various flows of information a city generates. Such an illustration certainly makes it seem like a centralised authority is in charge.

But you have to bear in mind that this city-wide perspective is only one viewpoint. If we could zoom out, we would see that the spokes of this ‘wheel’ radiate out beyond the confines of the city to connect with other cities, such that each city becomes a node in a web of interconnected smart cities. Or you could zoom in to a more personal level, and see that each person is a node in the network thanks to the web-enabled devices they have ready access to. Just shift perspective, and what seems like a centralised master computer turns out to be a node in a network.

I would make an analogy with the web of life. Imagine telling somebody that there is a digital program, encoded in DNA, running evolution. Imagine that person demanding to know where, precisely, the computer running this program is located, and telling you evolution can’t possibly work because Hayek proved centralised planning is hopeless. This would be a fundamental misunderstanding, because the code of life is not to be found in any particular location but is instead distributed throughout the world. Nobody is in charge; there is no top-down authority commanding natural selection.

Similarly, whenever Zeitgeist outlines systems of feedback that would enable us to track the world’s resources and manage them according to principles of technical efficiency, critics denounce it as central planning. It’s almost as if such people forget the Internet ever existed.

When Hayek wrote ‘Road’, mass production was the most obvious manifestation of market competition’s drive to produce sellable commodities, and mass production at that time was largely dependent on factories supplied by large, centralised power stations. These were hugely expensive means of production that only a minority could afford to own, and which were most efficiently run along fascist lines. You might have been free to quit your job, but once you clocked in you became part of a vertically-integrated management structure, with authorities whose orders had to be obeyed (and who, for the most part, were more interested in lining their own pockets, and those of the banking and governmental masters they answered to, than in rewarding your efforts).

In marked contrast, the technologies of the 21st century could enable production by the masses, for the simple reason that the means of production and distribution could become ever more accessible in terms of cost and ease-of-use. Few can own a factory but if the price-performance of atomically-precise manufacturing goes far enough, what is effectively a factory in a box could sit beside your printer, and if robots follow the same trajectory as computers they should go from being very limited, expensive and largely inaccessible labour-saving devices to cheap, versatile, user-friendly, ubiquitous helpers. We could all become owners of the means of production. Such a decentralised form of production works best when we act as collaborating individuals united by complementary strengths and weaknesses in laterally-scaled networks, which is quite different from the vertically-integrated management that jobs have traditionally been designed around.

CONCLUSION

When Hayek wrote ‘Road’, the only alternative to free markets he could imagine was central planning. But really, who could blame him? There was no satellite communication, hardly anybody had access to computers and the World Wide Web did not exist. In short there was none of the infrastructure that the digital commons needs to get off the ground, making it perfectly reasonable for Hayek not to consider collaboratism as a viable alternative to the selfish pursuit of money.

Now, the infrastructure is beginning to fall into place. We have a communications web, an information web, and the beginnings of a logistics web and an energy web too. Thanks to advances in artificial intelligence, robotics, nanotechnology and more, we are approaching the point of near-zero marginal cost for the creation and delivery of all kinds of content, not just digital stuff but physical stuff too. We can now work together, forming groups and collaborating on projects out of passion rather than out of some selfish pursuit of monetary gain.

‘The Road To Serfdom’ still stands as an effective argument that market competition is preferable to central planning. But consider how laissez-faire principles brought about the financial crisis of 2008 (Wall Street really did take advantage of Ayn Rand devotee Alan Greenspan’s deregulation, and of the commodifying of political influence, to make fraudulent activity legal and prey on people’s financial gullibility), and consider the impossibility of sustaining free market principles in anything that resembles the way market competition actually developed (covered in my essay series ‘This Is What You Get’). In light of all that, I suspect that, were he alive today, Hayek would be championing the Zeitgeist movement as the best way of bringing about prosperity. In 1944 there may have been no viable alternative to neo-liberalism, but that’s changing.

REFERENCES

‘The Road To Serfdom’ by Friedrich Hayek

‘The Zeitgeist Movement Defined’

‘The Zero Marginal Cost Society’ by Jeremy Rifkin

‘The Age Of Spiritual Machines’ and ‘The Singularity Is Near’ by Ray Kurzweil

‘The Meaning Of The 21st Century’ by James Martin

‘Bullshit Jobs: A Theory’ by David Graeber


BULLSHIT JOBS and the NEW FEUDALISTS

BULLSHIT JOBS AND THE NEW FEUDALISTS

Have you ever felt like your job was a waste of time? If so, you are not alone. When YouGov asked people ‘does your job make a meaningful contribution to the world?’, 37% replied that it did not and a further 13% were unsure. In other words, fifty percent of those polled were either certain their job was not worthwhile or did not know whether it was. If you are one of these people, chances are you have a ‘bullshit job’.

What is a ‘bullshit job’?

It might be worth talking a bit about what the term ‘bullshit job’ means. Perhaps the easiest way to grasp this is to consider its opposite. When it comes to employment, we usually assume that some need is first identified, and then some service is created to fill that gap in the market. An obvious way to tell if that service is necessary to society overall would be to observe the effect when it is removed- say, as a consequence of strike action. If society experiences a noticeable and negative effect, then it’s almost certain that the job was a valuable one.

On the other hand, if a job could disappear without almost anybody noticing (because its absence has either no effect or is actually beneficial) that would be a bullshit job.

Here’s an example of such a job, taken from David Graeber’s ‘Bullshit Jobs: A Theory’:

“I worked as a museum guard for a major global security company in a museum where one exhibition room was left unused more or less permanently. My job was to guard that empty room, ensuring no museum guests touch the…well, nothing in the room, and ensure nobody set any fires. To keep my mind sharp and attention undivided, I was forbidden any form of mental stimulation, like books, phones etc. Since nobody was ever there, in practice I sat still and twiddled my thumbs for seven and a half hours, waiting for the fire alarm to sound. If it did, I was to calmly stand up and walk out. That was it”.

Now, some points are worth going over at this stage. Firstly, a bullshit job is best thought of as one that makes no positive contribution to society overall (since it would hardly matter if the position did not exist) rather than one that is of no benefit to absolutely anyone. As we shall see, it could suit some people to employ somebody to stand or sit around wearing an impressive-looking uniform. It’s just that whatever function this serves really has little to do with capitalism as most people understand it.

Secondly, one can always invent a meaning for such a job, just as philosophers have made up reasons why Sisyphus could find meaning in his pointless task of rolling that boulder uphill in the sure and certain knowledge that it would roll back down again. But, really, all this does is highlight what bullshit such jobs are. After all, where genuine jobs are concerned one need not rack one’s brains making up justifications, because the need pre-exists the job.

So, with those points out of the way and with a definition of bullshit jobs to work with (‘employment of no positive significance to society overall’) we can return to the question ‘how come such jobs exist?’.

‘This cannot be!’

One reason, strangely enough, is that many people assume they cannot exist. This is because the very idea of bullshit jobs seems to run contrary to how capitalism is meant to work. If one word could be used to sum up the workings of capitalism in the popular imagination, that word would probably be ‘efficiency’. Capitalism is imagined to be ruthless in its drive to cut costs and reduce waste. That being the case, it surely makes no sense for any business to make up pointless jobs.

At the same time, people have no problem believing stories of how socialist countries like the USSR made up pointless jobs like having several clerks sell a loaf of bread where only one was necessary, due to some top-down command to achieve full employment. After all, governments and bureaucracies are known for wasting public money.

It’s worth thinking about what happened in the Soviet example and what did not. No authority figure ever demanded that pointless jobs be invented. Instead, there was a general push to achieve full employment but not much diligence in ensuring such jobs met actual demands. Those lower down with targets to meet did what was necessary to tick boxes and meet their quotas.

Studies from Harvard Business School, Northwestern University’s Kellogg School of Management, and others have shown that goals people set for themselves with the intention of gaining mastery are usually healthy. But when goals are imposed on people by others- sales targets, standardised test scores, quarterly returns- such incentives, though intended to ensure peak performance, often produce the opposite. They can lead to efforts to game the system and look good without producing the underlying results the metric was supposed to be assessing. As Patrick Schiltz, a professor of law, put it:

“Your entire frame of reference will change [and the dozens of quick decisions you will make every day] will reflect a set of values that embodies not what is right or wrong but what is profitable, what you can get away with”.

Practical examples abound. Sears imposed a sales quota on its auto repair staff- who responded by overcharging customers and carrying out repairs that weren’t actually needed. Ford set the goal of producing a car by a particular date, at a certain price, and at a certain weight- constraints that led to safety checks being omitted and the dangerous Ford Pinto (a car that tended to explode if involved in a rear-end collision, due to the placement of its fuel tank) being sold to the public.

Perhaps most infamously, the way extrinsic motivation can cause people to focus on the short term while discounting longer-term consequences contributed to the financial crisis of 2008, as buyers bought unaffordable homes, mortgage brokers chased commissions, Wall Street traders wanted new securities to sell, and politicians wanted people to spend, spend, spend because that would keep the economy buoyant- at least while they were in office.

With all that in mind, it’s worth remembering the one thing that unites thinkers on the left and right sides of the political spectrum in Western thinking. Both agree that there should be more jobs. I don’t think I have seen a current-affairs debate where the call for ‘more jobs’ wasn’t made, and made often.

Whether you are a ‘lefty’ or a ‘right-winger’, you probably believe that there should be ‘more jobs’. You just disagree on how to go about creating them. For those on the left, the way to do it would be through strengthening workers’ rights, improving state education and maybe through workfare programs like Roosevelt’s ‘New Deal’. For right-wingers, it’s achieved through deregulation and tax-breaks for business, the idea being that this will free up entrepreneurs and create more jobs.

But in neither case does anyone insist that whatever jobs are created should be of benefit to society overall. Instead, it’s just assumed that of course they will be. This is roughly comparable to somebody being so convinced that burglary never happens that they take no precautions against theft, which just makes them more vulnerable to criminal activity.

If this analogy is to work, it has to be the case that we are wrong to assume modern markets actively work against bullshit jobs; that, actually, there are reasons why pointless jobs are being created. In that case, our assumption that such jobs can’t exist would work against the possibility of acting to prevent their proliferation.

In fact, such reasons do exist, and a major one is something called ‘Managerial Feudalism’. What is that? Well, that’s a topic we will tackle in the next instalment.

REFERENCES

‘Bullshit Jobs: A Theory’ by David Graeber

‘Why We Work’ by Barry Schwartz

BULLSHIT JOBS AND THE NEW FEUDALISTS

Bullshit jobs are proliferating throughout the economy, and this is partly due to something called ‘managerial feudalism’. In order to understand the role it plays in the creation of bullshit jobs, we need to look at the various positions people occupied in feudal societies. If you have ever watched a drama set in such times, you will no doubt have noticed that there is always an elite class of people who employ the services of a great many others. In some cases, their servants perform tasks that would be considered useful in today’s society, attending to such things as gardening, food preparation and household duties. But the nobility also seem to be surrounded by individuals who, despite their impressive appearance in all those flashy uniforms, don’t seem to be doing much of anything.

What are all these people for? Mostly, they are just there to make their superiors look, well, ‘superior’. By being able to walk into a room surrounded by men in smart uniforms, nobles give off an air of gravitas. And the greater your entourage, the more important you must be. At least, that’s the impression you hope to convey when you employ people to stand around making you look impressive.

The desire to place oneself above subordinates, and to increase the number of those subordinates and thereby gain a show of prestige, arises whenever society structures itself into a definite hierarchy with a minority that holds a ‘noble’ position within that structure. This is exactly what we find in large businesses, where the executive classes assume the role of the nobility. In order to understand why bullshit jobs exist, we need to look at how the condition of managerial feudalism came about.

Rise of the corporate nobility

Once upon a time, from around the mid-40s to the mid-70s, businesses ran what might be called ‘paternalistic’ models that worked in the interests of all stakeholders. The need to rebuild infrastructure following the war, the desire to provide security to those who had fought in it, the strength of unions, and governments following Keynesian economics all worked to ensure that increases in productivity brought about increases in worker compensation.

But from the 80s onwards, attitudes changed: worker collectives and Keynesian economics came to be seen as stifling entrepreneurs. This gave rise to more lean-and-mean economic practices. What really helped the rise of the lean-and-mean model in the 80s and 90s were certain federal and state regulatory changes, coupled with innovations from Wall Street. The regulatory changes brought about an environment in which corporate mergers and takeovers could flourish.

Meanwhile, Michael Milken, of investment house Drexel Burnham, created high-yield debt instruments known as ‘junk bonds’, which allowed for much riskier and aggressive corporate raids. This triggered an era of hostile takeovers, leveraged buyouts and corporate bustups.

The people who most benefited from all this deregulation and financialisation were those at the executive level. Once upon a time, the CEO of a large corporation would have been the epitome of the cool, rational planner. He or she would have been trained in ‘management science’ and would probably have worked up through the ranks of the organisation so that, by the time they reached the top, the CEO had mastered every aspect of the business. Once at the apex of the corporate pyramid, this highly trained, rational specialist would have acted on the central belief of the college-educated middle class, with its mandate of progress for all and not just the few.

But as the corporate world became more volatile toward the end of the 20th century, questions began to arise over whether such rationality and level-headedness were best for delivering the new goal of short-term boosts to shareholders’ profits. With the business world now seen as so tumultuous and complex as to “defy predictability and even rationality” (as an article in Fast Company put it), a new kind of CEO emerged, one driven more by intuition and gut feeling. The new CEO was less a manager with great experience gained from working his way up the company hierarchy, and more a flamboyant leader who had achieved celebrity status in the business world and was hired on the basis of his showmanship, whether or not his prior role had anything to do with the new position. And such CEOs certainly prospered, because the focus on improving the bottom line and rewarding celebrity bosses saw executive pay soar to over three hundred times that of the typical worker.

It’s hard to exaggerate the difference between the old-style corporate boss and the new breed that arose around the late 20th century. As David Graeber pointed out, the old-fashioned leaders of industry identified much more with the workers in their own firms and it was not until the era of mergers, acquisitions and bustups that we get this fusion between the financial sector and the executive classes.

This marked change in attitudes was reflected in comments made by the Business Roundtable in the 1990s. At the start of the decade, the Business Roundtable said of corporate responsibility that corporations “are chartered to serve both their shareholders and society as a whole”. But, seven years later, the message had changed to “the notion that the board must somehow balance the interests of other stakeholders fundamentally misconstrues the role of directors”. In other words, a corporation looks after its shareholders, and the interests of other stakeholders (employees, customers, and society in general) are of far less importance.

Pointless White-Collar Jobs

Now, the term ‘lean and mean’ implies that capitalism had become more, well, ‘capitalist’, taking the axe to any unnecessary expenditure and therefore bringing about more streamlined operations run by more efficient employees. In other words, the exact opposite of conditions favourable to the growth of bullshit jobs. But, actually, the pressure to downsize was directed mostly at those near the bottom doing the blue-collar work of moving, fixing and maintaining things. They were subjected to ‘scientific management’ theories designed to dehumanise work and bring about robotic levels of efficiency, or were replaced by automation or lost their jobs when the firm took advantage of globalisation and moved abroad where more exploitable workers were available. This freed up lots of capital, and it is how that capital was used that is key to understanding how this so-called ‘lean-and-mean’ period brought about bullshit jobs. As Graeber said, “the same period that saw the most ruthless application of speed-ups and downsizing in the blue-collar sector also brought a rapid multiplication of meaningless managerial and administrative posts in almost all large firms. It’s as if businesses were endlessly trimming the fat on the shop floor and using the resulting savings to acquire even more unnecessary workers in the offices upstairs…The end result was that, just as Socialist regimes had created millions of dummy proletarian jobs, capitalist regimes somehow ended up presiding over the creation of millions of dummy white-collar jobs instead”.

REFERENCES

“White-Collar Sweatshop” by Jill Andresky Fraser

“Bullshit Jobs: A Theory” by David Graeber

“Smile Or Die” by Barbara Ehrenreich

BULLSHIT JOBS AND THE NEW FEUDALISTS

The era of mergers and acquisitions, which broke up admittedly bloated old corporations in order to bring about short-term boosts to shareholders, resulted in the creation of a ‘noble class’ of executives, and of subordinates whose only purpose was to add to the prestige of those above them. One such employee was ‘Ophelia’, interviewed in Graeber’s book. “My current job title is Portfolio Coordinator, and everyone always asks what that means, or what it is I actually do? I have no idea. I’m still trying to figure it out….Most of the midlevel managers sit around and stare at a wall, seemingly bored to death and just trying to kill time doing pointless things (like that one guy who rearranges his backpack for a half hour every day). Obviously, there isn’t enough work to keep most of us occupied, but—in a weird logic that probably just makes them all feel more important about their own jobs—we are now recruiting another manager”.

This raises a couple of questions. How come the person ultimately in charge did nothing to prevent this flagrant waste of money? And how did an era of corporate bustups, mergers and acquisitions result in a proliferation of bullshit jobs?

Well, firstly one has to recognise a crucial difference between the corporate raiders and the ‘robber barons’ they styled themselves on: people like Rockefeller and Vanderbilt, whatever you think of their practices, actually built business empires. Corporate raiders like James Goldsmith and Al ‘Chainsaw’ Dunlap didn’t do much building. No, they just took advantage of deregulation and financial innovations like junk bonds to tear apart existing businesses, lay off thousands, and gain short-term boosts to their shares. They were vultures. That’s not necessarily derogatory; vultures play a necessary part in cleaning away carcasses. Arguably, the old corporate structure had become too bloated and inefficient, and the axe really should have come down on it. What I am suggesting is that, while the raiders were good at profiteering from the death of the old corporate structure, they lacked the ability to prevent the rise of a new one just as liable to create bullshit jobs.

The Influence Of Positive Thought

We can perhaps understand why by combining ‘managerial feudalism’, with its nobles looking for shows of status and its flunkies providing a visible manifestation of that superiority, with the phenomenon I talked about in the series ‘How Religion Caused The Great Recession’.

In that series, I explained how early settlers of the United States practised Calvinism. The Calvinist religion saw much virtue in industrious labour, and particularly in constant self-examination for any sinful thought. Such an outlook probably helped settlers survive in what was, after all, the ‘Wild West’.

But as the harsh environments were gradually tamed, the constant self-examination for sinful thought, and its eradication through labour, came to impose a hefty toll on those who were cut off from industrious work. Faced with people succumbing to the symptoms of neurasthenia, and with the medical establishment seemingly unable to cure such patients, people began to reject their forebears’ punitive religion. In the 1860s, Phineas Parkhurst Quimby met one Mary Baker Eddy, and together they launched the cultural phenomenon of positive thinking. Drawing on a variety of sources, from transcendentalism to Hinduism, New Thought re-imagined God, replacing the hostile deity of Calvinism with a positive and all-powerful spirit. Humanity was brought closer to God, too, thanks to a concept of Man as part of one universal, benevolent spirit. And if reality consisted of nothing but the perfect and positive spirit of God, how could there be such things as sin, disease and other negativity? New Thought saw these as mere errors that humans could eradicate through “the boundless power of spirit”.

But although intended as an alternative to Calvinism, New Thought did not succeed in eradicating all the harmful aspects of that religion. As Barbara Ehrenreich explained in ‘Smile Or Die’, “it ended up preserving some of Calvinism’s more toxic features- a harsh judgmentalism, echoing the old religion’s condemnation of sin, and the insistence on the constant interior labour of self-examination”. The only difference was that, while the Calvinist’s introspection was intended to eradicate sin, the practitioner of New Thought and its later incarnations of positive thinking was constantly monitoring the self for negativity. Anything other than positive thought was an error that had to be driven out of the mind.

So, from the 19th century onwards, a belief that the universe is fundamentally benevolent, and that the power of positive thought could make wishes come true and prevent all negative things from happening, was simmering away in the American subconscious. When consumerism took hold in the 20th century, positive thinking would become increasingly imposed on anyone looking to get ahead in an increasingly materialistic world.

What all this has to do with the current topic is that the cult of positive thinking begun by New Thought, and amplified by 20th-century consumer culture, ended up affecting how businesses were run. Before the Great Depression, there had been campaigners speaking out against the excesses of the wealthy and the oppression imposed on the poor. The prosperity gospel, which had begun in the 19th century and was amplified by megachurches and TV evangelists responding to market signals from 20th-century consumption culture, had a markedly different message: there was nothing amiss with a deeply unequal society. Anyone at all stood to become as wealthy as the top 1 percent. Just remain resolutely optimistic and all will be well.

But, unlike the megachurches (which one could leave at any time) or the television evangelists (whom one could always just turn off), the books and seminars consumed at corporate events were often mandatory for any employee who wanted to keep his or her job. Workers were required to read books like Mike Hernacki’s ‘The Ultimate Secret to Getting Everything You Want’ or ‘Secrets Of The Millionaire Mind’ by T. Harv Eker, which encouraged practitioners of positive thinking to place their hands on their hearts and say out loud, “I love rich people! And I’m going to be one of those rich people too!”.

Remember that positive-thinking ideology considers any negativity to be a sin, and some of its gurus recommended removing negative people from one’s life. And in the world of corporate America- where, other than in clear-cut cases of racial, gender or age-related discrimination, anyone can be fired for any reason or no reason at all- that was easy to do: terminate the negative person’s employment. Joel Osteen of Houston’s Lakewood Church (described as “America’s most influential Christian” by Church Report magazine) told his followers, “employers prefer employees who are excited about working at their companies…God wants you to give it everything you’ve got. Be enthusiastic. Set an example”. And if you didn’t set an example and radiate unbridled optimism every second of the working day, you were made an example of. As banking expert Steve Eisman explained, “anybody who voiced negativity was thrown out”.

Such was the fate of Mike Gelband, who was in charge of Lehman Brothers’ real estate division. At the end of 2006 he became increasingly anxious about the growing subprime mortgage bubble and advised that “we have to rethink our business model”. For this unforgivable lapse into negativity, Lehman CEO Richard Fuld fired the miscreant.

A Bullshit Corporate Culture

So, the corporate culture had become decidedly hostile to any bad news, such that even those in positions of high authority got the sack if they voiced any negativity. As for the lower ranks, whatever misgivings they had concerning the way things were going had to be filtered through layer upon layer of management. If there’s already a culture of hiding negative reports on how business practices are shaping up, of putting a positive spin on everything, it’s not much of a step from there to not being entirely truthful about the usefulness of the people being hired. This is even more likely to happen if A) your status is defined by how many subordinates you have (so that to lose subordinates is to suffer diminished status) and B) employees come to depend on the pretty generous salaries that often come with bullshit white-collar work, for example because their consumerist lifestyle has left them with substantial mortgages and credit card bills. If that’s the case, then it’s probably not a good idea to broadcast how unnecessary some jobs are.

The idea that those in ultimate authority might be prevented from knowing everything that’s going on in their business was encapsulated by a comment that one billionaire made to crisis manager Eric Dezenhall: “I’m the most lied to man in the world”.

It’s important to point out that the role of CEO is not itself bullshit. What is being argued, instead, is that some CEOs are effectively blind to all the bullshit happening in their firms. Why wouldn’t they be, when anyone bringing them bad news is liable to be sacked, when executives and middle managers surround themselves with yes-men and flunkies, and when an obsession with increasing shareholder value is creating some decidedly dodgy business practices disguised through impenetrable economic jargon and management-speak? Such practices are well suited to redirecting resources so as to create an elite minority with sufficient wealth and power to deserve the ‘nobility’ label, to creating elaborate hierarchies of flunkies who are just there to provide visible displays of their superiors’ magnificence, and to employing spin doctors who pull the wool over people’s eyes and prevent the truth from being revealed. Medieval feudalism had its priestly caste, with religious texts written in an obscure tongue with which to justify the divine right of kings and all that. Managerial feudalism has the financial and banking sector and all the obscure language that comes with it, ceaselessly denouncing the working classes whenever they demand living wages, and justifying any money grab or show of status by the executive and managerial classes no matter how greedy and socially unjust.

It’s when we examine financialisation that we really understand how it can be that BS jobs exist. That’s a topic for next time.

REFERENCES

“White-Collar Sweatshop” by Jill Andresky Fraser

“Bullshit Jobs: A Theory” by David Graeber

“Smile Or Die” by Barbara Ehrenreich

BULLSHIT JOBS AND THE NEW FEUDALISTS

In what way does the world of finance help bring about bullshit jobs? Well, it partly has to do with the way jobs are categorised in the popular imagination. When we talk about major revolutions in working practice we speak of transitions from hunter-gathering, to farming, to manufacturing, to services. Such terms imply that at every stage people always transition to work that is of obvious benefit to society, involving as it does the creation of products that improve quality of life, or by offering services that meet some pressing need or just make life more pleasant.

What’s wrong with this belief is that it paints a misleading picture of what everyone in ‘services’ does. Contrary to what the term implies, not everyone in services is helping their fellow human beings by clipping hedges, serving ice-cream and so on. No, there’s a fourth sector involved in work of a different kind, one economists call FIRE, after Finance, Insurance and Real Estate.

The kind of thing this sector is involved in is well illustrated by the goings-on that led up to the 2008 crash. Banks’ profits once relied on the quality of the loans they extended. More recently, however, we have seen a switch toward ‘securitisation’, which in practice involves bundling multiple loans together and selling portions of those bundles to investors as collateralised debt obligations, or CDOs. Rather than earning interest as loans are repaid over time, under securitisation the bank’s profit derives from fees for arranging the loans. As for the risk inherent in lending money, it’s the buyer of the CDO who takes that on, meaning that, as far as the bank is concerned, defaults are somebody else’s problem.

This caused a shift from quality-driven lending toward quantity-driven lending. Thanks to securitisation, banks could make loans in the knowledge that those loans could be sold off to someone else, along with the associated risk, which would become the buyer’s problem. What this meant was that banks were freed from the downside of defaults. And if conditions are in place to cause wild exuberance, borrowing is bound to spiral out of control.

Of course, that’s precisely what happened in the runup to the 2008 subprime mortgage crisis. In the words of Bernard Lietaer and Jacqui Dunne, “math ‘quants’ took the giant pools of home loans now sitting on their employers’ balance sheets and repackaged them into highly complex, opaque, and difficult-to-value securities that were sold as safe bets. As more and more of these risky securities were purchased by pension funds, insurance firms, and other stewards of the global public’s savings, the quants’ securitisation machine demanded more loans, which in turn led to a massive expansion of dubious lending to low-income American households”.
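To make the change in incentives concrete, here is a minimal toy model in Python. The loan size, fee rate and default rate are invented for illustration (they are not figures from any source cited here), but the structure shows why an originate-to-distribute bank stops caring about loan quality: its fee income is untouched by defaults, while the CDO buyer absorbs all the losses.

```python
import random

random.seed(42)

FEE_RATE = 0.02      # hypothetical 2% arrangement fee per loan
DEFAULT_RATE = 0.15  # hypothetical probability that a subprime loan defaults

def originate_and_securitise(num_loans, loan_size):
    """The bank bundles loans, sells the bundle on, and keeps only the fees."""
    bank_fees = num_loans * loan_size * FEE_RATE
    # The CDO buyer is repaid only by the loans that don't default.
    repaid = sum(loan_size for _ in range(num_loans)
                 if random.random() > DEFAULT_RATE)
    buyer_loss = num_loans * loan_size - repaid
    return bank_fees, buyer_loss

fees, losses = originate_and_securitise(num_loans=1000, loan_size=100_000)
print(f"Bank's fee income:  £{fees:,.0f}")    # the same however many loans go bad
print(f"CDO buyer's losses: £{losses:,.0f}")  # the bank never sees these
```

However many borrowers default, the bank’s fee line never changes; the only way for it to earn more is to arrange more loans, which is exactly the quantity-over-quality shift described above.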

Advertisements for banks really push the message that they are but humble servants helping customers protect and manage their money. And with talk of ‘markets’ and ‘products’, the financial ‘industry’ likewise presents itself as doing the traditional work of making useful stuff and providing much-needed services. If you believe the propaganda, the primary purpose of this sector is to help direct investments to those parts of commerce and industry that will raise prosperity, while earning an honest profit in the process.

But while this kind of thing does happen, it’s very misleading to portray the financial sector as being mostly concerned with such services. We can see this by looking at where the money goes. A piffling 0.8 percent of the £435 billion created through the UK’s quantitative easing programme (i.e. money printing) went to the real, productive economy- that works out at roughly £3.5 billion. The rest went to the financial sector.

As David Graeber explained, what this sector actually does is as follows: “the overwhelming bulk of its profits comes from colluding with government to create, and then trade and manipulate, various forms of debt”. In other words, what the FIRE sector mostly does is create money from ‘nothing’. But, the thing is, there actually is no such thing as money from nothing. If somebody is making money out of thin air, somebody somewhere else is being lumbered with the cost. So, really, financialisation is the subordination of value-adding activity to the servicing of debt.

It is under such conditions, in which work is morphed into a political process of appropriating wealth and the repackaging and redistribution of debt, that the nature of BS jobs (which seems so bizarre from the traditional capitalist point-of-view) actually makes sense. From the perspective of the FIRE sector, the more inefficient and unnecessary chains of command there are, the more adept such organisations become at the art of rent-extraction, of soaking up resources before they get to claimants.

An example of such practices was provided by ‘Elliot’:

“I did a job for a little while working for one of the ‘big four’ accountancy firms. They had been contracted by a bank to provide compensation to customers that had been involved in the PPI scandal. The accountancy firm was paid by the case, and we were paid by the hour. As a result, they purposefully mis-trained and disorganised the staff so that jobs were repeatedly and consistently done wrong. The systems and practices were changed and modified all the time, to ensure no one could get used to the new practice and actually do the work correctly. This meant that cases had to be redone and contracts extended. The senior management had to be aware of this, but it was never explicitly stated. In looser moments, some of the management said things like ‘we make money from dealing with a leaky pipe-do you fix the pipe, or do you let the pipe keep leaking?’”.

In order for such organisations to continue doing what they are doing, there have to be employees who work to prevent such dubious practices from becoming widely known. Faithful allies must be rewarded, whistleblowers punished. Those on the rise must show visible signs of success, surrounded by important-looking men who make their ‘superiors’ look special, in office environments where status is determined by how many underlings one commands. Meanwhile, those flunky roles are themselves a handy means of distributing political favours and, since those in the lower ranks had best be distracted from the dodgy goings-on, all this incentivises the creation of an elaborate hierarchy of job positions, titles and honours. Let them occupy themselves squabbling over that.

So, ‘managerial feudalism’ is so called because the FIRE sector (which in practice is spreading, which is why car commercials no longer tell you what it costs to buy the vehicle, only what representative APR you can expect if you take out a loan) has brought about conditions that resemble classic medieval feudalism, which was likewise primed to create hierarchies of nobles, flunkies, mystic castes quoting obscure texts, and downtrodden masses.

This is not without consequence. In the early 20th century, economists like Keynes were tracking progress in science, technology and management, and predicting that, by the 21st century, our industries would be so productive we could drastically reduce the amount of time devoted to paid employment, investing the time gained in the pursuit of a more well-rounded existence. When you consider that 50 percent of jobs are either definitely bullshit or of vague value to society, you can see that people like Keynes were partly correct. Had we continued to focus on technical efficiency and productive capability, we would doubtless have access to much more leisure and prosperity. Instead, business, economics and politics combined in such a way as to create a new kind of feudalism that has imposed itself on top of capitalism.

Recapping what we have learned over this series: the old paternalistic corporate model came under attack during an era of bustups, mergers and acquisitions. The corporate raiders who led this attack differed from their predecessors in that they identified much more with finance than with the workers under their management. This, coupled with a cult of materialist positive thinking, gave rise to an executive class whose salary and bonus structure put them in a ‘noble’ position. It also gave rise to a corporate culture hostile to any bad news. This meant that, when the savings made by bringing the axe down on those at the lower end of the corporate hierarchy were wasted on hiring yet more levels of management, few people dared speak out against the practice. Moreover, keeping one’s mouth shut, and hoping you too might be in line for a pointless but well-paid white-collar job, had become the sensible choice for those burdened with the high costs of an over-consumptive lifestyle. And that part of the ‘service’ sector which has little to do with providing services, being more concerned with colluding with government in order to repackage and sell ever-more complex forms of debt, had every incentive to run things as inefficiently as possible, since those are the conditions in which rent-extraction can cream off more of other people’s money.

Such conditions encourage the existence of jobs that have more to do with appropriating wealth than creating it, and with disguising the fact that this is happening. When your status is defined by how many underlings you have, there is every incentive to add levels of management. If other big businesses employ somebody to sit at a desk, your company must do likewise- not necessarily because the person has anything useful to do, but simply because it’s ‘what is done’. When you make your money from a ‘leaky pipe’ (i.e. some deficiency in the system), this encourages ‘duct-taping’ jobs that merely manage the problem rather than deal with it. This is like employing somebody to replace the bucket rather than fix the leaking roof. Of course, in that overly simplistic example the ruse would be easily spotted. But in the deliberately complex world of the FIRE sector there is more chance of doing things incompetently and getting away with it, because few can penetrate the jargon and management-speak and see the bullshit hiding behind it.

What all this means is that the ‘technological unemployment’ gap that Keynes predicted has been filled with jobs that, quite frankly, don’t need to exist. If you can’t imagine how that could happen under capitalism, your mistake is in assuming our current system is something that people like Adam Smith or Milton Friedman would recognise as ‘capitalist’. Bullshit jobs really shouldn’t exist in the kind of free market that people like Stefan Molyneux promote, but they can and do exist in whatever market system dominates today.

REFERENCES

“White-Collar Sweatshop” by Jill Andresky Fraser

“Bullshit Jobs: A Theory” by David Graeber

“Smile Or Die” by Barbara Ehrenreich


WHY EXECUTIVES DON’T STRIKE

Strikes. They’re a nuisance, aren’t they? Bringing disruption to our lives by denying us the services we rely on. But have you ever noticed how the workers who organise strikes always seem to be employees at the lower end of the corporate hierarchy? It’s always blue-collar workers, junior doctors and other lowly types that are threatening such action. Executives, for some reason, never stage a walkout.

I wonder why that is?

Now, some might think the reason is obvious: Strikes are undertaken in order to get more pay, and so executives have no need for such action as they are already very handsomely compensated. For example, if you are an advertising executive your yearly salary is around half a million pounds. Not too bad!

But, actually, 'more money' is not the only reason workers strike. Sometimes strike action is undertaken to bring unfair working practices to the world's attention. And if being treated unfairly justifies a walkout, then maybe executives would have a reason to strike?

Think about how such people are portrayed in movies. In nearly all cases, executives in films are portrayed as corrupt. You have Gordon Gekko in 'Wall Street', breaking laws and destroying small businesses in his thirst for more dirty money. You have the executive classes in 'Elysium', living in luxury aboard their space station while, down on Earth, their overworked, underpaid blue-collar employees are callously discarded when they fall foul of atrocious working conditions the higher-ups are too uncaring to fix. You have the CEO of OCP looking on in concern as RoboCop 2 lays waste to the city (not concern for the people it's killing, mind you, but for what the carnage could mean for his company's shares: "this could look bad for OCP, Johnson! Scramble our best spin team!").

Those are just a few examples of films that make businessmen out to be bad guys. Now try to think of movies where executives are portrayed not as villains but as heroes. I can only think of two. Batman's Bruce Wayne has a strong moral code, but he is not a particularly good example, because he is only altruistic when he is the Caped Crusader; his 'Bruce Wayne' persona is a billionaire playboy who is a bit of a prick. And in the Christopher Nolan films, the board of directors that runs Wayne Enterprises is your usual bunch of villains in suits. The other example I can think of is Ayn Rand's 'Atlas Shrugged', and do you know what that book and its film adaptation are about? Successful businessmen becoming so disgruntled with being portrayed as villains by society that they go on strike.

So, given how often successful businessmen are portrayed as bad guys, why don’t they ever stage a walkout and remind us all of how much we rely on the work they do, just as their fictional counterparts in Rand’s opus did?

I think the reason is this: it just wouldn't work out the way it did in 'Atlas Shrugged'. In that story, society soon started falling apart. When workers low down in the corporate hierarchy stage a walkout, the effects are indeed often immediate and near-catastrophic: everything grinds to a halt, everyday life is hopelessly disrupted, and we are reminded that such people provide vital services we can scarcely do without. I would suggest that if the executive classes were to stage a walkout, life would not grind to a halt, at least not for quite some time. On the contrary, most people would not even notice anything amiss.

Now, you might counter that this is mere speculation with nothing to back it up. However, I believe there are a couple of examples that indicate that what I say is true.

The first example comes from Ireland between 1966 and 1976. During that decade, Ireland experienced three bank strikes that shut the banks down for a total of twelve months. While they were closed, no checks could be cleared, no banking transactions could be carried out, and the Irish lost access to well over 80% of the money supply.

You would have thought this would spell utter disaster for Ireland. After all, banking executives are among the top earners (paid around £5 million a year, plus endless bonuses), and we are always being told what an utterly vital function the banking and financial sectors play in the economy. Surely, then, Ireland was brought to her knees soon after the banks closed their doors and withdrew their services?

Actually, no. Instead, the Irish just carried on doing business without the banks. They understood that, since the banks were closed, there was nothing to stop people writing a check and using it like cash. Once official checks were used up, people wrote checks on stationery from shops, in denominations of fives, tens and twenties. And it was not just individuals who operated this mutual credit system; businesses got in on the act too. Large employers like Guinness issued paychecks not in the usual full-salary amount but in various smaller denominations, precisely so they could circulate as a medium of exchange, as though they were cash.

All this was possible because, at the time, Ireland had a small population of around three million. In most communities people had a high degree of personal contact with one another and, where knowledge of somebody was lacking, local shops and pubs had owners who knew their clientele very well and could vouch for a person's creditworthiness.
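
To make the mechanism concrete, here is a minimal sketch, in Python, of a trust-based mutual credit ledger. It is only a toy illustration: the names, trust links and amounts are all invented, and the real Irish arrangement was of course informal paper and personal acquaintance, not software.

balances = {"farmer": 0, "publican": 0, "brewer": 0}
trusted = {("farmer", "publican"), ("publican", "brewer"), ("brewer", "farmer")}

def pay(payer, payee, amount):
    # A 'check' here is just a promise. The payee accepts it only if
    # they know the payer - the pub owner vouching for you, in effect.
    if (payer, payee) not in trusted:
        return False  # unknown payer, so the check is refused
    balances[payer] -= amount  # balances can go negative: an outstanding IOU
    balances[payee] += amount  # no bank clears anything
    return True

pay("farmer", "publican", 20)  # a homemade check settles a bar tab
pay("publican", "brewer", 15)  # the credit circulates onward, like cash
print(balances)  # {'farmer': -20, 'publican': 5, 'brewer': 15}

The point of the sketch is that nothing needs central clearing: so long as each payee trusts each payer, credit simply circulates, which is exactly what the checks-as-cash arrangement achieved.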

According to economics professor Antoin E. Murphy, author of 'Money in an Economy Without Banks': "The Irish created an unregulated, totally anarchistic community currency matrix… there was nobody in charge and people took the checks they liked and didn't take the checks they didn't like… And, it worked! As soon as the banks opened again, you're back to fear and deprivation and scarcity. But until that point it had been a wonderful time."

A few years before the Irish incident, New York's refuse collectors went on strike, and just ten days later the city was brought to its knees. I don't think anyone would have described that situation as 'a wonderful time'. And unlike the millions paid to city bankers, refuse workers earn around £12,000 a year.

Another example suggesting that executives wouldn't be missed for quite some time were they to disappear is Uber, which saw not only the resignation of its founder, Travis Kalanick, but also a whole bunch of other top executives, so that, according to a 2017 article in MarketWatch, it "is currently operating without a CEO, chief operating officer, chief financial officer, or chief marketing officer". Did the company collapse without the aid of these essential people? No, it carried on just fine without them.

Now this is intriguing. Why is it that when low-paid staff near the bottom of the corporate hierarchy go on strike we feel the pain almost immediately, yet on the rare occasions when highly-rewarded executives fail to show up for work, nobody cares, because nothing much changes?

I think it all hinges on what these people actually do. And what do they actually do? It's hard to say, because any role you can think of that might be of use to a company turns out to be a job description for somebody lower down the hierarchy. Do they make anything, these executives? No, the workers down in manufacturing do that. Do they manage anything? No, managers do that. Are they responsible for sales? No, that's what salespeople are for. And so on. Now, I'm not suggesting the CEO does literally nothing, but it stands to reason that when you have delegated responsibility for just about everything to your subordinates, it's going to harm the company much more if the subordinates don't show up than if you were to disappear.

And that's just counting the official jobs subordinates have. But what about unofficial ones? Take personal assistants. If you have ever watched 'The Apprentice' you know the sort of employee I am talking about: the woman or man at the desk who answers the phone and says 'Lord Sugar/Mr Trump will see you now'. According to David Graeber, secretarial work like answering the phone, doing the filing and taking dictation is not all PAs do: "in fact, they often ended up doing 80 percent to 90 percent of their bosses' jobs, and sometimes, 100 percent…It would be fascinating—though probably impossible—to write a history of books, designs, plans, and documents attributed to famous men that were actually written by their secretaries".

So businesses seem not to suffer when executives don't show up for work. But when they are present, is their work of value to society? Not according to studies into negative externalities (in other words, the social costs of doing business). Let's take the advertising executive mentioned earlier. As you may recall, advertising executives bring home a yearly salary of around £500,000. But the studies reckon that around £11.50 of social value is destroyed for every £1 they are paid. Contrast this with a recycling worker, who brings home a yearly income of around £12,500 and creates £12 in social value for every £1 they are paid.
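
To get a feel for what those ratios imply, here is a rough back-of-the-envelope calculation in Python, using the salary figures and value-per-pound ratios quoted above (the arithmetic is mine; the underlying estimates are the studies' as reported here):

# Net social value per year = salary x social value per pound paid,
# using the figures quoted in the text above.
roles = {
    "advertising executive": {"salary": 500_000, "value_per_pound": -11.50},
    "recycling worker": {"salary": 12_500, "value_per_pound": 12.00},
}

for name, role in roles.items():
    net_social_value = role["salary"] * role["value_per_pound"]
    print(f"{name}: £{net_social_value:,.0f} of social value per year")

# advertising executive: £-5,750,000 of social value per year
# recycling worker: £150,000 of social value per year

On those figures, a single executive's year of work cancels out the social value created by nearly forty recycling workers.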

This, then, is why executives don't strike. Far from reminding us what a valuable service they provide, a walkout would instead shine a light on how businesses can function perfectly well without them, at least for much longer than they could function without their far lower-paid subordinates. For people who are a credit to society, creating more social value for every pound they are paid, strike action can be an effective way of emphasising the value their work generates. But that can hardly be the case when your work causes negative externalities that cost society more than it gains from your existence. In that case, a strike can only shine a light on the fact that you are not all that necessary.

REFERENCES

'Bullshit Jobs: A Theory' by David Graeber

'Rethinking Money' by Bernard Lietaer and Jacqui Dunne

'Money in an Economy Without Banks' by Antoin E. Murphy

'MarketWatch' (2017 article on Uber's executive departures)


What Videogames Teach Us About Work

Videogames have been featuring in the news recently. BBC Radio 4 is running a half-hour programme about Fortnite, and in an article written for the 'i' newspaper, Will Tanner reported that a Universal Basic Income experiment was ended because "ministers refused to extend its funding amidst concern that young teenagers would stay at home and play computer games instead of looking for work".

That argument strikes a sadly familiar tone, depicting videogaming as an addictive evil that distracts its victims from what they ought to be doing. But I think it would be more accurate to say that gamers have already found meaningful work and are reluctant to forsake it and submit to less rewarding labour instead.

This way of looking at it goes largely unrecognised because we are not taught to equate videogaming with work. Instead, you ‘play’ a videogame and we are raised to believe that play is childish, a distraction, mere fun. Play, we are encouraged to believe, is the opposite of work.

But it really isn't. One only has to look at the play other animals engage in to see that there is a serious side to it: it is a way of honing skills that will become essential in later life.

Similarly, in videogaming we find many activities that hone skills important in the digital age we live in. Authors Byron Reeves and J. Leighton Read list over a hundred such activities, including:

“Getting information: Observing, receiving and otherwise obtaining information from all relevant sources.

Identifying information by categorising, estimating, recognising differences or similarities and detecting changes in circumstances and events.

Estimating sizes, distances and quantities or determining time, cost, resources, or materials needed to perform a work activity.

Thinking creatively: developing, designing or creating new applications, ideas, relationships, systems or products, including artistic contributions”.

Also, in an article written for 'Wired' ("You Play World of Warcraft? You're Hired!"), John Seely Brown and Douglas Thomas explain how "the process of becoming an effective guildmaster amounts to a total-immersion course in leadership…to run a large one, a guild master must be adept at many skills: attracting, evaluating and recruiting new members; creating apprenticeship programs; executing group strategy…these conditions provide real-world training a manager can apply directly in the workplace".

Far from being a distraction from work, videogames are, along with jobs, one of modern life’s two main work providers. Instead of lending support to the idea that people don’t want to work, videogames demonstrate how eager we are to engage in productive activity, to reach for goals, to solve problems and to take part in collaborative projects.

It does, however, raise a question: how come one work provider is able to draw upon willing and eager volunteers, while the other (jobs) mostly creates a feeling that work is a necessary evil you wouldn't do if you had a choice? And, yes, that is how a great many people feel, as revealed by polls suggesting that as many as ninety percent of people are unhappy in their jobs.

Fundamentally, I think it all has to do with the direction in which money flows, and how that affects the design of work in videogames and jobs.

What do I mean by the direction in which money flows? Quite simply: if you have a job then, assuming you are not an unpaid intern, a company pays you to work. This means you are both an investment and a cost. When it comes to videogames, on the other hand, you pay a company to work, since you must first purchase the game (and even if it is free-to-play, like Fortnite, the company will have some means of extracting money from you). This means you represent almost all profit and only negligible cost.
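
A toy calculation makes the asymmetry plain. All the figures below are invented purely for illustration; I am not quoting any real company's accounts.

# Toy comparison of the two directions of money flow.
EMPLOYEE_WAGE = 30_000    # hypothetical: what a firm pays a worker per year
EMPLOYEE_OUTPUT = 35_000  # hypothetical: value the firm hopes the worker produces
GAME_PRICE = 60           # hypothetical: what a player pays for a game
COST_TO_SERVE = 2         # hypothetical: marginal cost of serving one more player

profit_per_employee = EMPLOYEE_OUTPUT - EMPLOYEE_WAGE  # worker must out-produce their wage
profit_per_player = GAME_PRICE - COST_TO_SERVE         # nearly the whole payment is kept

print(profit_per_employee)  # 5000: the worker is a cost to be recouped
print(profit_per_player)    # 58: the player is almost pure profit

An employer only profits on the margin between what a worker costs and what they produce, whereas almost everything a player hands over is profit, which is why the two have such different incentives when designing the 'work' involved.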

Because videogame publishers want as many people as possible to spend money on their games, it obviously makes sense for work in a gaming context to be as enjoyable and rewarding as it can be. And when it comes to making work engaging, productive activity should provide the opportunity to pursue mastery; it should offer autonomy, flexibility, judgement and creativity that are firmly in the hands of the individual doing the actual work.

The best videogames are great at providing all these conditions. Autonomy and flexibility are found in games where you don't have to tackle challenges in a strictly linear fashion but can forge your own path instead. In 'Batman: Arkham Knight', for example, you, as the Caped Crusader, are free to roam Gotham City, swooping down to fight crime as and when you find it. If you hear an alarm ringing, you can locate its source and take on a sub-mission involving a bank robbery. If you see smoke, you can attempt to arrest Firefly. Exactly how you get to the game's finale is entirely up to you.

Many games offer creativity, providing opportunities to customise the look of your character or the items you have acquired. Some come with comprehensive editing tools that offer even more scope for creative expression, such as 'LittleBigPlanet', which goes as far as enabling players to create whole new games. And since their very inception, videogames have given us the chance to exercise our judgement and gain mastery, as we make the snap decisions required to climb the high-score charts, helped by well-crafted feedback systems that inform us when we are doing well and when we should try alternative strategies.

Now, it's true that jobs may also provide the things that make work worthwhile. But the crucial difference is that, where videogames are concerned, there is never a good reason to try to reduce or eliminate such qualities; doing so would only make for a bad game that nobody would choose to play. There is, however, a reason why employers might want to reduce such qualities in a job. What unites these qualities is that they all enhance our individuality, and that is not something employers necessarily desire. The more creativity, judgement and autonomy can be reduced at the individual level, the easier it becomes to train new recruits. Indeed, in many ways it's preferable if your employees are less like unique individuals and more like interchangeable units that can be replaced at short notice. That's advantageous because it reduces the bargaining power of the workforce: you are less likely to complain about pay and working conditions if you know it won't be too difficult for the boss to fire and replace you.

The result? A cheaper workforce, more value extracted from the commodity of labour-power, and more profit for those the labourers work for. You have to bear in mind that employees are quite low down in the pecking order for rewards from the labour process. Governments want their cut, banks and financial services want their cut, the company executives want their cut, and they all take priority over the working classes, rather like the way the more powerful predators and scavengers get the juicy meat and leave only scraps for the rest to fight over. When it comes to the pursuit of more profit, it pays to make work as unrewarding (in a monetary sense) as you can get away with, which often results in work being designed to be as unrewarding (in the sense of not being engaging) as possible.

"But why would people choose to do work designed to lack the very qualities that make it engaging?", you might be asking. The answer can be found in 'negative motivation'. Being without a job can have serious consequences: cut off from an income, bills cannot be paid and the threat of rough sleeping looms ever closer. On top of that there is cultural pressure to 'get a job', so much so that we don't care if the job is useless or even harmful to society ('at least s/he has a job'). This all amounts to enormous pressure to submit to employment, not really because of the gains people expect if they have a job, but because of the punishment they dread if they don't.

Videogame companies, on the other hand, cannot rely on negative motivation, for the simple reason that hardly anyone can be forced to play games (I say 'hardly anyone' because there are sweatshops in which people grind through MMORPGs to level up characters that are then sold on to richer customers). This further emphasises the point that videogames never have an incentive to make work less rewarding, whereas such incentives do exist in the world of jobs.

CONCLUSION

Videogames, far from demonstrating our distaste for work, show how willing and eager to work we are. So willing, in fact, that our desire to work supports one of the most successful industries of the modern age. Every day, millions of us spend billions, all so we can engage in the work videogaming requires. If we really hated work, the first person to put a quarter into the first arcade game would have walked away in disgust at having to pay to stand there and perform repetitive manual labour. What, are you crazy?

What videogaming shows instead is that if you can take that simple mechanical operation and craft around it creativity, flexibility, autonomy, judgement and mastery, the result is work that people want to do so much they will gladly pay for it. But if, in the interest of extracting more value for money out of your workforce, you reduce or eliminate those qualities, people will hate the work and only submit to it if circumstances force them.

That’s what jobs teach us.

REFERENCES

'You Play World of Warcraft? You're Hired!' by John Seely Brown and Douglas Thomas, Wired

'Total Engagement' by Byron Reeves and J. Leighton Read

'Why We Work' by Barry Schwartz
