How Religion Caused The Great Recession cont.

Part three ended with a hint that something dark and troubling occurred within corporate America at the end of the 20th century. The story of that change was told in my book ‘How Jobs Destroyed Work’, which I will quote from now.
“During the war, the USA achieved full employment for the first time since the 1920s. When the war was over, there was a lot of concern about the possibility of a postwar recession, which the government sought to avoid through various acts and initiatives. The acts included the ‘Employment Act’ of 1946, which “committed the federal government to maintain maximum employment and with it a high level of aggregate demand”. The initiatives included the GI Bill, an education programme that helped upgrade the workforce, thereby providing a large pool of white-collar workers for the administrative and management-type roles that corporations increasingly depended upon.
As well as anxieties about recession prompting the State to push for high employment, conditions enabled by the war played a part in other ways. For one thing, industry in America was still largely intact, unlike that of Europe’s. The government invested heavily in the business sector, particularly through highway construction and defence-related expenditures. Also, wartime research had helped launch an era of technological innovation, such as IBM’s development of the first general-purpose computer. Finally, wage freezes had been put in place during the war, and this had required employers to use fringe benefits with which to attract employees. This favoured the largest corporations, who could afford to offer greater benefits than their smaller rivals.
But those corporate benefit packages were still costly, even forty or fifty years ago, and this might have discouraged their mass adoption had it not been for militant unions during the postwar period. The larger corporations reasoned that if they treated their employees well, emotional attachment to the company would improve and the threat of socialism would be averted.
And so it came to pass that the early postwar decades enjoyed economic growth and price stability. The large corporations delivered on their promise of long-term employment prospects, meaning that anyone fortunate enough to land a job there felt secure, and expected that their own prosperity would rise along with the company’s fortunes.
But all that was to change in the 80s and 90s. 
During the 80s, attitudes toward the paternalistic model changed. The 1970s ended in recession, and during this period two of America’s largest companies, Chrysler and Lockheed, survived only because of government bailouts. The new decade began with inflation approaching 15% and unemployment over 8.5%. Gold prices were soaring, a trend often associated with investor pessimism. Indeed, there was a general mood of unease regarding the US’s economic prospects, as the stock market went into its worst slump since the 1930s.
Amidst all this financial trouble, people began looking at those large corporations with their generous benefit packages and saw not businesses to be inspired by but dinosaurs to be blamed for worsening conditions. Increasingly, people saw the large corporations as bloated and inefficient, handicapped by too much bureaucracy and a workforce with an over-inflated sense of entitlement. America seemed increasingly unable to compete against more nimble rivals, most notably from Japan and West Germany. By 1981 the nation was importing 25% of its steel and 53% of its numerically controlled machine tools.
What really helped the rise of the lean-and-mean model in the 80s and 90s were certain federal and state regulatory changes, coupled with innovations from Wall Street. The regulatory changes brought about an environment in which corporate mergers and takeovers could flourish. For example, laws protecting local companies from out-of-state suitors were declared unconstitutional by the Supreme Court. Also, President Reagan appointed an attorney who had previously defended large corporations against antitrust suits to head the Department of Justice’s antitrust division, all but guaranteeing there would be no interference from the federal government with the growing mergers and acquisitions movement.
Meanwhile, Michael Milken, of investment house Drexel Burnham, created high-yield debt instruments known as ‘junk bonds’, which allowed for much riskier and more aggressive corporate raids. These, along with the state and federal regulatory changes mentioned earlier, triggered an era of hostile takeovers, leveraged buyouts and corporate bustups”.
So what these changes did, particularly the growth of finance capitalism in the 80s, was to transform the corporation from its traditional image as a task-based entity engaged in some collective activity, defined not just in terms of profit but of an overall contribution to society, into one in which shareholders’ profits were the be-all and end-all. Everything else, including in some cases pride in the product (consider, for example, the internal email sent by an S&P employee which read “let’s hope we are all wealthy and retired by the time this house of cards falters”), was to be disregarded. All the focus was on the short-term raising of stock prices.
This marked change in attitudes was reflected in comments made by the Business Roundtable in the 1990s. At the start of the decade, the Business Roundtable said of corporations that they “are chartered to serve both their shareholders and society as a whole”. But, seven years later, the message had changed to “the notion that the board must somehow balance the interests of other stakeholders fundamentally misconstrues the role of directors”. In other words, a corporation looks after its shareholders, and the interests of other stakeholders, whether employees, customers, or society in general, are of far less importance.
Certainly the employee of 80s and 90s corporate America would have recognised their lack of importance in what was an increasingly insecure environment. Finance capitalism had by then transformed corporations from paternal entities rewarding loyal workers with security and regular wages into aggregations of financial assets that existed only to be merged, broken apart or destroyed, according to the whims of executives chasing short-term shareholder profit.
Some observers, among them Noam Chomsky and Jacque Fresco, have noted how corporations tend to have the same organizational structure as fascist dictatorships. In other words, there is a strict hierarchy that demands tight control at the top and obedience at every level. Granted, there may be a measure of give-and-take, but the line of authority is usually clear. Others, perhaps most notably Michel Foucault, have argued that prisons and factories emerged at more or less the same time, and that their operators consciously borrowed each other’s control techniques.
For example, in the late 18th century, social theorist Jeremy Bentham designed the ‘panopticon’, from the Greek ‘pan’ (‘all’) and ‘optikon’ (‘seen’): a prison designed in such a way that all inmates could be kept under surveillance by a single watchman. True, it was impossible for a single observer to keep an eye on every inmate at once, but the panopticon made it impossible for any inmate to know whether he was being watched. The inmates knew only that they could be under surveillance at any moment. Bentham’s belief was that, under such conditions, inmates would effectively mind their own behaviour.
So what became of the panopticon? They are everywhere, only we now tend to refer to them as ‘offices’. Many a white-collar employee (below the executive level, at least) spends their in-office hours in a cubicle, most likely of a one-size-fits-all, institutional-gray design that can be set up, reconfigured, and moved at the whim of those higher up the line of authority: a constant reminder of the employee’s own lack of security and importance to the corporation. Moreover, cubicles are (in the words of one employee) “mechanisms of constant surveillance”, lacking doors and usually arranged so that managers can spy on whomever they like at any time. Employees are usually made to work facing a wall, and so cannot know whether they are being watched unless they look over their shoulder. The message such an environment sends is clear: we can see what you are, or are not, doing, so work harder or we’ll replace you. The employee found him or herself in a harsh working environment that did everything it could to underscore their vulnerability.
As conditions for the average employee deteriorated and prosperity for those at the executive level soared to dizzying heights, America in the 80s and 90s had virtually returned to the highly polarised conditions of the 1920s. David Leonhardt of the New York Times reckoned, “it’s as if every household in that bottom 80 percent is writing a check for $7,000 every year and sending it to the top 1 percent”.
But whereas, before the Great Depression, campaigners had spoken out against the excesses of the wealthy and the oppression imposed on the poor, the prosperity gospel, which had begun in the 19th century and been amplified by megachurches and TV evangelists responding to market signals from late 20th century consumption culture, had a markedly different message: there was nothing amiss with a deeply unequal society. Anyone at all stood to become as wealthy as the top 1 percent. Just remain resolutely optimistic and all will be well.
Within this highly unstable environment, the positive-thinking ideology that had begun with 19th century New Thought and been inflated by corporate-style churches found conditions to which it was well suited. All kinds of life coaches and motivational gurus emerged, spreading the gospel of prosperity and applying management-speak to disguise worsening conditions. For example, following the Chase-Chemical merger, employees who lost their jobs were not ‘laid off’; they were instead referred to as ‘saves’. Other corporations going through mass layoffs in pursuit of short-term boosts to shareholder value referred to those selected for redundancy as ‘nonselected employees’.
Over time, the message that life coaches and motivational gurus delivered became one in which everyone was supposed to regard the deterioration of work and its rewards in corporate America as, overall, a positive thing. Corporations paid substantial sums of money to the motivational industry, whose members told employees that being laid off was an opportunity for self-development, and that the volatile state of the jobs market was a welcome breeding ground for producing winners.
And, unlike the megachurches (which one could leave at any time), the books and seminars consumed at corporate events were often mandatory for any employee who wanted to keep his or her job. Workers were required to read books like Mike Hernacki’s ‘The Ultimate Secret to Getting Everything You Want’ or ‘Secrets Of The Millionaire Mind’ by T. Harv Eker, which encouraged practitioners of positive thinking to place their hands on their hearts and say out loud, “I love rich people! And I’m going to be one of those rich people too!”.
Along with being made to conform to all the rules and worksheets of the self-help literature, employees in corporate America found themselves having to attend Native American healing circles, Buddhist seminars, fire-walking sessions and other ritualistic practices, all in the name of maintaining a feverish pitch of optimism amid worsening conditions. Such was the level of religious-like devotion to the gospel of prosperity and positive thinking that a 1996 business self-help book reckoned, “if you want to find a genuine mystic, you are more likely to find one in a boardroom than in a monastery or cathedral”.
In part five we will see how CEOs were transformed into cult-like leaders during the tumultuous 80s and 90s.
‘Financial Fiasco’ by Johan Norberg
‘Smile Or Die’ by Barbara Ehrenreich
‘White-Collar Sweatshop’ by Jill Andresky Fraser
‘How Jobs Destroyed Work’ by Extropia DaSilva
In part three of this series, we saw how the consumer culture of the late 20th century inspired churches to become more secular and corporate in their appearance, and how, as they grew into gigantic organisations, pastors were obliged to become more like CEOs in how they dressed and behaved. At the same time, throughout the late 20th and early 21st centuries, actual CEOs were becoming more like cult leaders. The transformation of the corporate world during the 80s and 90s (discussed in part four) had much to do with this.
Once upon a time, the CEO of a large corporation would have been the epitome of the cool, rational planner. He or she would have been trained in ‘management science’ and probably worked his or her way up through the ranks of the organisation so that, by the time they reached the top, the CEO had mastered every aspect of the business. Once at the apex of the corporate pyramid, this highly trained, rational specialist would have carried out the central belief of the college-educated middle class, with its mandate of progress for all and not just the few.
But as the corporate world became more volatile toward the end of the 20th century, questions began to arise over whether such rationality and level-headedness was best for delivering the new goal of short-term boosts to shareholders’ profits. In 1999, Businessweek captured the changing mood when it asked, “who has time for decision trees and five-year plans any more? Unlike the marketplace of twenty years ago, today’s information and services-dominated industry is all about instantaneous decision-making”.
These changes brought about a transformation in leadership. With the business world now seen as so tumultuous and complex as to “defy predictability and even rationality” (as an article in Fast Company put it) a new kind of CEO emerged, one driven more by intuition and gut-feeling. The new CEO was less of a manager with great experience obtained from working his way up the company hierarchy, and more of a flamboyant leader who had achieved celebrity status in the business world, and was hired on the basis of his showmanship, whether his prior role had anything to do with the new position or not.
A 2002 article in Human Relations described the celebrity CEO as having “a monomaniacal conviction that there is one right way of doing things” and believing they possess “an almost divine insight into reality”.
So, whereas the pastor of a megachurch was becoming more like a corporate executive, the corporate executive was becoming more like the leader of a cult. This transformation was no doubt helped by the replacement of old-style management consultants with motivational gurus. Pastorpreneurs, celebrity motivational gurus and flamboyant CEOs socialised together, advised one another, and in so doing created a business environment shot through with irrationality. According to Ehrenreich, “forsaking the ‘science’ of management, corporate leaders began a wild thrashing around in search for new ways to explain an increasingly uncertain world-everything from chaos theory…to eastern religions”.
It was certainly a time of increasing uncertainty. With the likes of Tom Peters (described by the LA Times as the ‘uberguru’ of management) offering advice like “destroy your corporation before a competitor does!”, everybody’s position in 90s corporate America was precarious. But whereas the white-collar precariat lived with the prospect of being fired at any time while shouldering the burden of increasing debt, the focus on boosting shares and rewarding celebrity CEOs saw executive pay soar to over three hundred times that of the typical worker, and golden parachutes handed out even to bosses whose reckless behaviour crossed the line into outright criminality. For example, in 2006 the chief executive of UnitedHealth was pursued by the US Securities and Exchange Commission for illegal backdating of stock options, actions that got him fired and made to repay $465 million in partial settlement. Yet he also received the largest ‘golden handshake’ in corporate history, amounting to nearly $1 billion. As Ehrenreich said, “the combination of great danger and potentially dazzling rewards (led) to a wave of giddiness that swept through America”.
Celebrity CEOs, going from their Gulfstream jets to their limousines to their luxury villas or four-star hotels, lived (in the words of Washington DC ‘crisis manager’ Eric Dezenhall) “in an artificial bubble of constant, uncritical reinforcement…a consumer of reassuring cliches”. They had come to believe in the teachings of the motivational books and speakers they recommended (perhaps with a degree of cynicism) to their subordinates: positive-thinking preachers who claimed great wealth would come to anyone who visualised success, worked hard, and never complained. The average American did not complain, either, since by now the incessant New Thought message had convinced positive thinkers that anyone could ascend to the world of unstinting luxury. According to researchers at the Brookings Institution, “the strong belief in opportunity and upward mobility is the explanation that is often given for Americans’ high tolerance for inequality. The majority of Americans surveyed believe they will be above mean average income in the future (even though that is a mathematical impossibility)”.
But perhaps a more accurate way to put it would be to say that the average American could not complain, at least not if they wanted to keep their job. Remember that positive-thinking ideology considers any negativity to be a sin, and some of its gurus recommended removing negative people from one’s life. And in the world of corporate America, where, other than in clear-cut cases of racial, gender, or age-related discrimination, anyone can be fired for any reason or no reason at all, that was easy to do: terminate the negative person’s employment. Joel Osteen of Houston’s Lakewood Church (described as “America’s most influential Christian” by Church Report magazine) told his followers, “employers prefer employees who are excited about working at their companies…God wants you to give it everything you’ve got. Be enthusiastic. Set an example”. And if you didn’t set an example and radiate unbridled optimism every second of the working day, you were made an example of. As banking expert Steve Eisman explained, “anybody who voiced negativity was thrown out”.
Such was the fate of Mike Gelband, who was in charge of Lehman Brothers’ real estate division. At the end of 2006 he grew increasingly anxious over the growing subprime mortgage bubble and advised “we have to rethink our business model”. For this unforgivable lapse into negativity, Lehman CEO Richard Fuld fired the miscreant.
But sacking was not actually the worst fate that could befall an employee in 21st century corporate America. With every white-collar employee under pressure to work on their attitude, the demands on the group most required to maintain permanent smiles and positivity, the sales team, reached ludicrous heights. Underperforming salespeople had eggs broken on their faces, were made to bend over and receive a spanking with the metal yard signs of competing companies, and in one case were even subjected to waterboarding (“you saw how hard Chad fought for air right there. I want you to go back inside and fight that hard for sales”, in the words of the Prosper Management supervisor who conducted this example of motivational guidance).
So this was America in the 21st century. A world in which megachurch pastorpreneurs and TV evangelists preached to millions the Good News that “God caused the bank to ignore my credit score” (in the words of Osteen). A world in which CEOs became like cult leaders, infected with what Steve Eisman called ‘hedge fund disease’ (“The symptoms are megalomania, plus narcissism, plus solipsism…How could you be wrong about anything? To think something is to make it happen. You’re God”) and surrounded by yes-men who dared not raise any concerns for fear of being fired for ‘negativity’. A world in which to ‘underperform’ in sales could lead to humiliating ritual punishments like being made to wear nappies.
Pumped up with the New Thought belief that positive thinking could make wishes come true and that God would intervene to prevent any negative outcome, Americans confronted the other circumstances of the early 21st century: monetary policy from the Federal Reserve, coupled with the surpluses of fast-growing emerging economies, making money cheaper than ever; US politicians working to increase the share of home-owning families; and a financial industry apparently transforming large risks into smaller ones by repackaging, labelling and selling them, with regulations and bonuses that tempted people into the market for mortgage-backed securities.
After the subprime mortgage bubble burst and it became obvious that the good times had been propped up by out-of-control speculation and borrowing, the cry inevitably went up: ‘Why did nobody see this coming?’. Hopefully this series has offered some explanations. Prior to the 2008 crash, prosperity preachers and optimism coaches told people they could realise their material ambitions through the power of belief (self-help writer Stephen Covey encouraged those satisfied with what they had to “admit that what you have isn’t enough”). The perception of negative thought as a form of sin to be removed from one’s life had the effect of ejecting cautious people bearing bad news from the workplace. And there was an executive class making decisions based on gut feeling, behaving very much like the motivational gurus and prosperity preachers they socialised with and forced upon their subordinates, while enriching themselves through corporate mergers and bustups that unlocked share-boosting capital while destroying the jobs of hundreds of thousands of people (employees who maxed out their credit cards in spending sprees to compensate for the deterioration of rewards in the workplace).
Coming up next, the concluding chapter of this essay.
‘Financial Fiasco’ by Johan Norberg
‘Smile Or Die’ by Barbara Ehrenreich
‘White-Collar Sweatshop’ by Jill Andresky Fraser
In the aftermath of the 2008 crash, faced with an epidemic of foreclosures in the housing market, the collapse of some of the oldest financial institutions and the national debt rising to $10 trillion, people understandably asked: Why? How come all those highly respected and lavishly rewarded experts never saw the crash coming? Taking into consideration the evidence presented in this essay, I think we can conclude that the West was blinded by a combination of New Thought and neo-classical ideology.
When the likes of Mary Baker Eddy and Phineas Quimby sought to create a positive alternative to the grim outlook of Calvinism, they imagined the universe to consist of nothing but an all-nurturing, all-supplying spirit. Humanity, as part of this maximally beneficial entity, had only to exercise its powers of positive thinking, banish all negative thoughts, and everything would turn out all right.
And, as Ehrenreich pointed out, “what was market fundamentalism other than runaway positive thinking? In the ideology of the Bush administration and, to a somewhat lesser extent, the Clinton administration before it, there was no need for vigilance or anxiety about America’s financial institutions because ‘the market’ would take care of everything. It achieved the status of a deity, this market”.
The real world is too complex for human minds to fully grasp. Science is therefore obliged to devise simplified models, working with a crude ‘toy universe’ when thinking about the universe in which we live. For example, Newtonian physics has no general solution for the motion of three or more mutually orbiting bodies. So rocket scientists planning to send a probe to, say, Mars work with a simplified model containing only two objects at a time, such as the probe and the Sun. The thinking is that, over the timescales involved, the Sun’s gravity swamps everything else, so the approximation is good enough for all practical purposes.
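As an aside, the two-body simplification is easy to demonstrate numerically. The sketch below is an illustrative toy, not mission-planning code: it integrates a probe orbiting the Sun alone, ignoring every other body, using velocity-Verlet integration in units of astronomical units and years.

```python
import math

# Toy two-body model: the probe and the Sun, ignoring every other body,
# in the spirit of the simplification described above.
# Units: distance in AU, time in years, so GM_sun = 4*pi^2 (Kepler's third law).
GM_SUN = 4.0 * math.pi ** 2

def simulate(x, y, vx, vy, dt, steps):
    """Advance the probe's position and velocity with velocity-Verlet steps."""
    def accel(px, py):
        # Gravitational acceleration toward the Sun at the origin.
        r3 = (px * px + py * py) ** 1.5
        return -GM_SUN * px / r3, -GM_SUN * py / r3

    ax, ay = accel(x, y)
    for _ in range(steps):
        x += vx * dt + 0.5 * ax * dt * dt
        y += vy * dt + 0.5 * ay * dt * dt
        ax_new, ay_new = accel(x, y)
        vx += 0.5 * (ax + ax_new) * dt
        vy += 0.5 * (ay + ay_new) * dt
        ax, ay = ax_new, ay_new
    return x, y

# A circular orbit at 1 AU needs a speed of 2*pi AU/year; after one
# simulated year the probe should be back near its starting point.
x, y = simulate(1.0, 0.0, 0.0, 2.0 * math.pi, 1e-4, 10000)
print(f"after one year: x = {x:.4f} AU, y = {y:.4f} AU")
```

Adding Jupiter, the Moon and everything else would barely shift the result, which is exactly why the two-body approximation earns its keep.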
All the sciences have to make simplifying assumptions, and economics is no exception. According to Mark Braund and Ross Ashcroft (authors of “The Survival Manual: A Sane Person’s Guide to Navigating the 21st Century”) “neo-classical economics looks only at the factors influencing the investment and consumption decisions of individuals and firms. It focuses on how things would work in an imaginary world where all participants in the economy shared full and equal knowledge, not only of the market but also of the consequences of their decisions. It also assumes that everyone faces the same choices in life”.
As we have seen over the course of this essay, there are a couple of dubious claims here. There is, for example, the claim that participants share full and equal knowledge, both of the market and of the consequences of their decisions. This can hardly be said to apply to a corporate world in which celebrity CEOs floated high above the concerns of ordinary citizens in a bubble of luxury, surrounded by subordinates conditioned to bring them nothing but good news. “I’m the most lied to man in the world” was how one CEO explained his situation.
Nor could it be said to apply to ordinary Americans: folk who, at work, were obliged to attend seminars and read books by so-called experts armed with a pseudoscientific mix of economics, quantum physics and mysticism (as one life coach insisted, “with quantum physics, science is leaving behind the notion that human beings are powerless victims and moving toward an understanding that we are fully empowered creators of our lives and of our world”), in a working environment where the entrenched cult of optimism made it advisable to conform lest you be targeted for ‘releases of resources’ or whatever euphemism for layoffs the company used.
Outside of work, the American citizen was preached to by TV evangelists broadcasting their ‘prosperity gospel’ that God wanted true believers in optimism to have it all (a situation that inspired a 2008 Time article called ‘Maybe We Should Blame God For The Subprime Mortgage Mess’). They were advised by (in Ehrenreich’s words) “professional optimists (who) dominated the world of economic commentary…Escalating house prices were pumping the entire economy by encouraging people to use their homes like ATMs…taking out home equity loans to finance surging consumption-and housing prices were believed to be permanently resistant to gravity”.
According to Washington Post columnist Steve Pearlstein, “at the heart of any economic or financial mania is an epidemic of self-delusion that infects not only large numbers of sophisticated investors but also many of the smartest, most experienced and sophisticated executives and bankers”.
An economy infected with an epidemic of self-delusion and where the pressure is on to conform to a ‘yes-man’ culture of positive thinking is hardly conducive to bringing about the neo-classical concept of man as a perfectly informed and rational agent.
Then there is the notion of everyone facing the same choices in life. Here, I will just point out that some finance companies involved in subprime mortgages were running debt-to-asset ratios of 30 to 1, and ask the reader to consider a member of the white-collar proletariat: massively indebted, working in a corporate environment whose advice to those facing unprecedented levels of ‘restructuring’ and ‘career-change opportunities’ (more euphemisms for layoffs) was “don’t blame the system, don’t blame the boss, work harder and pray more” or “deal with it, you big babies!”. Compare that person to the likes of Jack Welch, the CEO who laid off over a hundred thousand workers and retired with a monthly income of $2.1 million, an $800,000-a-month Manhattan apartment, a Boeing 737 (also courtesy of the company) and free security guards for his many homes. Does anyone really believe these are people who face the same choices in life?
When we make references to the ‘free market’, what, exactly, is this ‘freedom’ we are referring to? The neo-liberal ideologue would no doubt claim it refers to the freedom to partake in voluntary exchange. As Ayn Rand said, “money is the material shape of the principle that men who wish to deal with one another must deal by trade and give value for value…An honest man is one who knows that he cannot consume more than he has produced”.
But, if that is the case, then it is difficult to imagine how all those toxic assets could have accumulated in the financial sector, or how borrowing could have pushed the national debt to ten trillion dollars. I think a more apt description would be: “the free market is a competitive environment in which players strive to obtain greater material wealth than other players, by whatever means they can get away with”. This definition leaves open the possibility that some may aim to get ahead by cheating and spreading misleading information. They may not get away with it (that depends on how clued-up and vigilant the other players are to such deception, and on what regulatory structures are in place to curb such behaviour), but, in nature, parasites can evolve to alter the minds of their hosts so that they nurture rather than fight off the bloodsucker. The same can be said of market parasites.
Gillian Tett of the Financial Times has commented on how an elite “try to stay in power; and the way they stay in power is not merely by controlling the means of production but by controlling the cognitive map, the way we think. And what really matters in that respect is…what is left undebated, unsaid”.
In a corporate environment, amidst a consumerist world feeding off New Thought ideology, there was quite a lot left unsaid. As Adam Michelson, senior vice president at Countrywide, said, “these are the times when that one person who might respond with a negative comment or a cautious appraisal might be the first to be ostracised. There is a great risk to nonconformity in any feverishly frothy environment like that”.
Indeed. America in the early 21st century was riding high on optimism. Communism had been defeated, and the turbulent world of financial capitalism was sold to the public as a rising tide that lifts all boats. According to Robert Reich, “optimism…explains why we spend so much and save so little…our willingness to go into debt is intimately related to our optimism”.
As we have seen through the course of this essay, this optimism can be traced back to the Calvinist religion that helped the founders of this nation tame the harsh wilderness, and to the New Thought ideology that attempted to undo the mental damage such a punitive religion could impose but ended up being just as harsh on ‘sin’ as what preceded it. The only difference was that negative thinking, rather than pleasure-seeking, was held up as sinful.
As Ehrenreich explained, “for centuries, or at least since the Protestant Reformation, western economic elites have flattered themselves with the idea that poverty is a voluntary condition. The Calvinist saw it as a result of sloth and other bad habits; the positive thinker blamed it on a wilful failure to embrace abundance. This victim-blaming approach meshed neatly with the prevailing economic determinism of the past two decades. Welfare recipients were pushed into low-wage jobs, supposedly in part, to boost their self-esteem; laid off and soon to be laid off workers were subjected to motivational speakers and exercises. But the economic meltdown should have undone, once and for all, the idea of poverty as a personal shortcoming…The lines at the unemployment offices and churches offering free food include strivers as well as slackers”.
It seems God was not on hand to save us from ourselves after all.
'Smile Or Die' by Barbara Ehrenreich
'The Survival Manual' by Mark Braund and Ross Ashcroft
'Atlas Shrugged' by Ayn Rand


How Religion Caused The Great Recession

Any essay with a title like ‘how religion caused the Great Recession’ had better begin with a caveat or two, so here goes. First of all, religion was not what caused the financial crash of 2008, which is to say it was not the main reason for the subprime mortgage bubble. As to what was the main culprit, well that probably depends on one’s political ideology. The anti-capitalist would likely blame ‘too big to fail’ banks and irresponsible Wall Street wolves, while the anti-Left would probably cite State interference in the mortgage market as the main villain.
While either of these doubtless stands up as a greater culprit, both politics and finance, along with other kinds of collective activity, take place amidst the societies of the day and cannot help but be influenced by the beliefs and attitudes that evolve within them. And it is here, in the influencing of minds and group action, that we will see how religion helped set us up for a subprime mortgage bubble. But now I must make the second caveat and say that there are many different kinds of religion offering diverse schools of thought, and doubtless some would have guarded against the reckless borrowing and lending that led to the 2008 Crash. But in the years leading up to that crash there was an ideology sweeping through America, one that set the world up for a fall from dizzying, delusional heights, and the origins of this hubristic attitude can be traced way back to the faith of the Pilgrim fathers.
As far as Westerners are concerned, the United States was colonised by pilgrims whose ancestry could be traced back to the Brownist English Dissenters who, in the late 16th and early 17th centuries, had fled the dangerous political climate of their native England for the Netherlands. The pilgrims arranged with English investors to establish a new North American colony, because they were concerned that remaining in the Netherlands would lead to a loss of their English identity. So, in 1620, they established the Plymouth colony in present-day Massachusetts, the second successful English settlement (Jamestown, Virginia, settled in 1607, being the first).
The pilgrims who founded the Plymouth colony subscribed to a variant of the Puritan faith known as Calvinism, named after John Calvin who lived in the 16th century. This was a particularly harsh and judgemental form of Christianity, one whose God “reveals his hatred for his creatures, not his love for them”, in the words of literary scholar Ann Douglas. Calvinists believed that this God’s heaven had only a limited number of spaces available, and whether you were chosen or not had been predetermined since before your birth. As to one’s duties here on earth, the Calvinist religion saw much virtue in industrious labour and particularly in constant self-examination for any sinful thought. Idleness and pleasure-seeking were viewed as being particularly contemptible sins.
In ‘The Protestant Ethic and the Spirit of Capitalism’, Max Weber argued that capitalism has its roots in Calvinist Protestantism, since it taught its followers to defer gratification in favour of hard work and wealth accumulation. It was also a mindset that was pretty well suited to the conditions the New World imposed on the colonists. Forget the images invoked by the patriotic song ‘America The Beautiful’ with its amber waves of grain, from sea to shining sea. What greeted the settlers was “a hideous, desolate wilderness” (in the words of William Bradford). Not for nothing was this land known as the Wild West. In a harsh environment such as this, where even subsistence demanded ceaseless effort, the tough-minded ideology of Calvinism probably helped them survive.
Elements of Calvinism would persist in America right through to the modern age, with the middle and upper classes treating busyness for its own sake as a means of obtaining status (a rather convenient mindset for the increasingly demanding corporations of the 80s and 90s, as we will see). But as the harsh Wild West was gradually tamed, the constant self-examination for sinful thought and its eradication through labour came to impose a hefty toll on those who became cut off from industrious work (as were, for example, women, barred from higher education by male prejudice and faced with industrialisation stripping away productive home tasks like sewing and soap-making). With productive activity taken away, Calvinism left these people with nothing but morbid introspection, and this led to various illnesses that we would now recognise as symptoms of mental stress.
Faced with people succumbing to the symptoms of neurasthenia, and with the medical establishment seemingly unable to cure such patients, people began to reject their forebears’ punitive religion. There was, for example, Phineas Parkhurst Quimby, a watchmaker and inventor who held metaphysical beliefs concerning (in his words) “the science of life and happiness”. In the 1860s, Quimby met up with one Mary Baker Eddy who, like many middle-class women of her day, had rejected the guilt-ridden and patriarchal Calvinism in favour of a more loving and maternal deity. 
Together, Eddy and Quimby launched what we now describe as the cultural phenomenon of positive thinking. Back in the 1800s, the post-Calvinist way of thinking that Quimby and Eddy established was known as New Thought. Drawing on a variety of sources from transcendentalism to Hinduism, New Thought re-imagined God from the hostile deity of Calvinism to a positive and all-powerful spirit. And humanity was brought closer to God, too. Out went the idea of an exclusive heaven reserved only for a select few, replaced with a concept of Man as part of one universal, benevolent spirit. And if reality consisted of nothing but the perfect and positive spirit of God, how could there be such things as sin, disease, and other negative things? New Thought saw these as mere errors that humans could eradicate through “the boundless power of spirit”.
Patients suffering mental breakdowns due to the ceaseless morbid introspection of Calvinism came to see Quimby and his ‘talking cure’, which sought to replace such negative thoughts with a belief in a benevolent universe, coupled with an insistence that the patient could ‘correct’ any negativity through positive thinking. The talking cure did indeed seem to cure the mental anxieties that were leading to invalidism among Calvinists who had idleness imposed upon them.
Meanwhile, Mary Baker Eddy went on to gain considerable wealth after founding Christian Science, the core teachings of which were that the material world did not exist; there was only Thought, Mind, Spirit, Goodness and Love. Whatever negativity or want seemed to exist were but temporary delusions.
New Thought went on to influence such people as William James, the first American psychologist, who claimed in his ‘Varieties of Religious Experience’ that, through New Thought, “lifelong invalids have had their health restored”. It also influenced Norman Vincent Peale, perhaps best known for his 1952 book ‘The Power of Positive Thinking’. But perhaps most importantly, as far as this essay is concerned, Mary Baker Eddy’s notion of negativity as controllable delusions influenced the mystical teachings of modern-day ‘motivational gurus’ who would lead those aspiring to the American Dream into believing that success and wealth would surely come their way if only they believed fervently enough.
And now we come to the dark side of New Thought. Although intended as an alternative to Calvinism, New Thought did not succeed in eradicating all the harmful aspects of that religion. As Barbara Ehrenreich explained in ‘Smile Or Die’, “it ended up preserving some of Calvinism’s more toxic features- a harsh judgmentalism, echoing the old religion’s condemnation of sin, and the insistence on the constant interior labour of self-examination”. The only difference was that while the Calvinist’s introspection was intended to eradicate sin, the practitioner of New Thought and its later incarnations of positive thinking was constantly monitoring the self for negativity. Anything other than positive thought was an error that had to be driven out of the mind.
So, from the 19th century onwards, a belief that the universe is fundamentally benevolent and that the power of positive thought could make wishes come true and prevent all negative things from happening was simmering away in the American subconscious. When consumerism took hold in the 20th century, positive thinking would become increasingly imposed on anyone looking to get ahead in an increasingly materialistic world.
To be continued…
‘Guns, Germs and Steel’ by Jared Diamond
‘Smile Or Die’ by Barbara Ehrenreich.

In part one, we saw how the Plymouth Colonists settled in a harsh, untamed environment that required ceaseless labour just to maintain subsistence living. Gradually, though, the unforgiving Wild West would be tamed, with railroads and freeways stretching from State to State, vast swathes of farmland providing an abundance of food, and industrial centres capable of such high productivity it seemed as though everybody’s needs would soon be met.
But while this might sound like a positive thing, it actually posed something of a problem to the economic system that had been established. It was a system based on perpetual growth, one fundamentally opposed to any notion of ‘enough’ that might dwell in the human soul. In the competitive world of business, companies manufacturing goods were compelled to steadily increase market share and profits, for fear of being swallowed by a larger enterprise; but how could perpetual growth be maintained when customers acted with frugality and were content with what they had?
Psychologists were therefore brought in to change the human psyche. One such expert was Edward Bernays. He took certain ideas from Freudian analysis about human status and applied them to advertisement campaigns. Products were no longer to be thought of as mere practical solutions to a limited set of problems. They were, instead, symbols representative of one’s identity, physical representations of one’s status. The car, the appliance, the furniture, were to be less relevant in terms of their utility and seen instead as fashion accessories. Advertising played a major role in developing this new consumer culture, because if the economy was to fulfil its imperative of perpetual growth, the customer had to be persuaded to buy things they did not even know they needed.
The consumer economy necessitated the rise of sales and service-based industries and those kinds of workplaces proved fertile breeding ground for positive thinking. After all, we all expect staff in shops and waiters serving us food to be friendly and greet us with smiles and a positive attitude (even if we don’t really believe the grinning sales assistant is genuinely pleased to see us). 
Increasingly, then, employees found themselves in occupations that required the kind of self-examination and improvement that practitioners of positive thinking strived to achieve. As Ehrenreich explained, “the work of Americans, and especially its ever-growing white-collar proletariat, is in no small part work that is performed on the self in order to make that self more acceptable and even likeable to employers, clients, coworkers and potential customers”. Nor were interpersonal skills and constant optimism confined to obvious places like sales and service-based industries. As Dale Carnegie observed, “even in such technical lines as engineering, about 15 percent of one’s financial success is due to one’s technical knowledge and about 85 percent is due to skill in human engineering”.
And so, whether in work or out, the consumer lived surrounded by the positive thinking message that anyone can have whatever they want, provided they exercised sufficient belief that good things will come their way. It was a belief generated in no small part to create an insatiable appetite for consumer culture. And as the corporate world seemed to ascend to increasingly dazzling heights of financial success, some clergymen noticed this ascendency and recognised within it methods to grow their churches.
Continued in part three
‘Culture In Decline’ by Peter Joseph
‘Smile Or Die’ by Barbara Ehrenreich.

In part two, we saw how a market system based on perpetual growth required a change in social attitudes once productivity was capable of meeting basic needs, and that transition was one in which we went from frugality to signalling our individuality through consumption. By the late 20th century it would have been impossible to miss consumption culture and, perhaps inevitably, marketing, advertising and other aspects of growth culture began to have an influence in areas one might consider to be outside such economic concerns.
One such example would be Church. Membership of mainstream churches had been declining in the latter part of the 20th century. In the past, churches faced with an increasing number of ‘unchurched’ folk might have sent out missionaries to try and convert the heathen population. But, this being an era of marketing, they tried something different. They did what any business would do when looking to relaunch a flagging product. They began thinking of potential members as ‘customers’ and conducted market research in order to determine what the ‘customer’ wanted. The various surveys and research indicated that people were not much interested in the kind of sermons they had sat through as children. Not for them, the angry sermon condemning sin. In fact, the market research showed people were not much interested in traditional church at all.
So pastors like Rick Warren, Bill Hybels and Robert Schuller set about reconfiguring church in order to better accommodate what the ‘customer’ wanted. Out went the hard pews, replaced with comfortable seating. Out went all the imagery of conventional churches. These new churches would have little in the way of traditional Christian iconography, such as crosses or images of Jesus. The result of this transformation was a building that looked less like a church and more like architecture that fit seamlessly with the modernist corporate-style environment of the rest of the city.
It was not only the physical appearance of the church that changed to suit the modern corporate, secular world. The sermons themselves changed as well. The more demanding principles of Christianity with its teachings of modesty and humble living were discarded, replaced with positive messages very much like the ones New Thought had preached. The new breed of pastor saw themselves not as critics of the secular, materialistic world but rather as active participants within it. They preached a ‘prosperity gospel’, one which claimed God wanted you to achieve status, wealth and other trappings of material success.
In terms of growth, this tactic of transforming churches into secular conference centres spreading the good news that God would payeth thy credit card proved very successful. The churches led by the likes of Schuller, Warren and Hybels became ‘megachurches’ which, if you include those attending via TV broadcast, preached to an audience of millions. Being so big, megachurches had to employ hundreds of people and find millions of dollars to keep the organisation running. These conditions led to their pastors becoming ever less like traditional clergy and more like CEOs of large corporations. As Ehrenreich explained, many of these churches were “nondenominational, meaning they couldn’t turn to a centralised bureaucracy for financial or any other kind of support…They depended entirely on their own charisma and salesmanship”.
So the audience of a megachurch entered a building that looked pretty much like a corporate headquarters. The person preaching to them wore a business suit like any CEO and probably thought of himself as a ‘pastorpreneur’- part pastor, part entrepreneur. And the message the pastorpreneur delivered was much the same as the one the corporate world wanted to get across: Through positive thinking, you can make anything happen. You can get ahead, you can become successful, you will become rich. Consider, for instance, the words of televangelist Joyce Meyer: “I believe God wants to give us nice things”.
But, underneath all that positivity was the dark undercurrent of New Thought’s attitude toward negativity as a sin. If, despite all your positivity, riches did not come your way, don’t look for any flaw in business, economics or politics. Instead, blame yourself. You just didn’t try hard enough. Pastor Robert Schuller advised his congregation to “never verbalise a negative emotion”.
Still, at least the megachurch managed to remain a nice, comfortable place in which to receive the prosperity gospel. As Ehrenreich said, in a megachurch “no one will yell at you, impose impossible deadlines…All the visual signs of corporate power and efficiency, only without the cruelty and fear”. 
The same could not be said of corporate America in the late 20th/early 21st century.
Continued in part four
‘Smile Or Die’ by Barbara Ehrenreich 


(Dis)honest ways to make it rich

Money. In a world where most things come with a price tag attached, we are all obliged to try and acquire the stuff. This can be achieved through fair means or foul. But what is an honest way of becoming wealthy?
To my mind there is only one way to become wealthy through entirely honest means, and that is to provide a product or service that an informed customer may choose to spend his or her money on. The company that provides this product or service relies only on the actual quality of it to keep them ahead of competitors. 
Furthermore, the honest boss of a company recognises that he or she is but one member of a team, and it was that collective which worked together to bring product X to the market. We might make a comparison to the conductor of an orchestra. A great conductor can make the difference between a performance that is merely OK and one that is sublime. It would be wrong, however, to attribute the excellence of the performance entirely to the person who happens to lead the orchestra. Obviously, were it not for the violinists, the trumpet players, the pianists, the percussionists and all the other members of the orchestra, bringing what is likely years of hard practice at perfecting the craft of playing their chosen instrument, there would be no music at all, sublime or otherwise. 
The same can be said for the CEO of a company. A great CEO can make the difference between an outstanding year for the company and a merely average (or abysmal) one. But the CEO of a Fortune 500 company could no more bring its product to market by themselves than a conductor could wave his baton like a wand and create music absent all the other members of the team we call an orchestra. The honest boss recognises that he or she is but one person among many, and that it is actually the organisation the team comprises that earns the billions multinational corporations bring in. The honest boss would not accept a financial reward so high that other members of the team must make do with so little that, even working full time, their daily life is one of constant money anxiety. Of course, it would also be wrong to pay everybody the same, since people clearly have different levels of responsibility and skill in any organisation. But there is surely a mutually beneficial compromise between the extremes of total equality and high inequality.

Do all companies competing in the market adhere strictly to these conditions for honest money-making? Clearly not, as it is not too difficult to find examples of businesses that break at least one of the rules I just mentioned. To recap, the totally honest business:
1. Sells a product or service to an informed customer. In other words, whatever advertisement is used to try and sell the product gives an honest description of its advantages and disadvantages in comparison to rival products. Any potential customer truly knows exactly what it is they are about to pay for.
2. Relies only on the actual quality of that product/service in order to stay ahead of the competition. In other words, there is no lobbying the State to pass laws that disadvantage competitors, no monopoly rights granted and then exploited through price hikes that would not be possible under true free-market competition, and no other distortions of the market.
3. Acknowledges that it is the company, not any one person, that earns the profit. This income should be distributed in a mutually beneficial way that avoids the injustices of total equality (which fails to compensate for differences in responsibility and talent) and extreme inequality (which places unnecessary anxiety on the disfavoured and can lead to structural violence, as the disenfranchised riot against an obviously rigged system). In a good business, every person from the bottom to the top is motivated by the hope of success; the bad business relies on the fear of failure to keep its employees working.
So, are customers always completely informed about the product they are about to buy? Not according to the documentary ‘Will Work For Free’:
“If I wandered into a phone shop, unsure of what to buy, and made the mistake of telling the salesperson that I am not too familiar with the differences, I would leave myself open to product sale bias. In this scenario, the store has no problem selling the best products, so instead I am presented with an inferior product which the store is struggling to offload. The salesperson’s job here becomes distorted, and I the customer will most likely be subjected to a sales pitch as opposed to honest insight”.
Or how about this quote from the Daily Mail:
“The Herts and Essex Fertility Centre charges £1,247 for three drugs routinely prescribed to women on IVF. But the same prescriptions in the same quantities cost £876.72 from Boots or £929.22 from Asda. Couples who buy drugs from the clinics may have no idea they are paying over the odds or that they can get them elsewhere…Experts accused the clinics of exploitation, calling the way they charge for IVF drugs ‘a complete racket'”.
Neither of these quotes conveys an impression of potential customers making decisions armed with a complete set of facts. I dare say most people have had the experience of dealing with some salesperson or business who appears to be, shall we say, economical with the truth in order to ensure a sale.
Moving on to the second condition for honest money-making, I think it is fair to say that there are all kinds of distortions of the free market principle of trading true value for value under conditions of competition that favour only those who genuinely provide the best product/service. In fact, this quote from Ayn Rand’s Atlas Shrugged sounds suspiciously like the actual (as opposed to some ideological ‘free’) market:
“When you see that trading is done, not by consent, but by compulsion–when you see that in order to produce, you need to obtain permission from men who produce nothing–when you see that money is flowing to those who deal, not in goods, but in favours–when you see that men get richer by graft and by pull than by work, and your laws don’t protect you against them, but protect them against you–when you see corruption being rewarded and honesty becoming a self-sacrifice–you may know that your society is doomed”. 
When you watch an athlete push their body to the limits of human capability and beat a record for the fastest sprint, the longest jump, the most perfectly executed dive and so on, you can only feel a sense of admiration for those who have put in such hard work to achieve this pinnacle, and a sense of humility that some are able to dedicate themselves to an effort more supreme than most of us could ever endure.
But, given that the human body is capable of doing only so much, at some point an ever-decreasing time limit for completing a sprint or other record-breaking achievement starts to seem kind of…dubious. And then, it happens. The once-celebrated athlete is exposed as a cheat. He took performance-enhancing drugs, or some other means of gaining an edge that is against the rules of professional sport. 
It is, by the way, a bit unfair to dismiss those who are caught relying on such dubious tactics as just cheats. It is almost certain that these athletes trained every bit as hard as those who never touched performance-enhancing drugs. It is not like they just slobbed around in front of the TV all their lives, then one day injected themselves with something and wandered down to the Olympic park to beat Usain Bolt. No, their result came about by a mixture of honest and dishonest methods. 
Just as we understand that an individual can only break a sports record by a certain amount before it becomes obvious that they must have relied on some kind of cheating, so too should we recognise that an individual can only make so much money for themselves before their wages and bonuses are the result, not simply of their own merit, but of a combination of honest work and cheating: of either rigging the system to favour themselves and disadvantage their fellow workers, or being in a position to benefit from a system that is already rigged. Is the CEO of a multinational company worth five times as much as the average employee? Doubtless, yes, because he or she shoulders enormous responsibility. Ten times more? Twenty? I think most people would still accept this is fair. But when those at the executive level start making hundreds or thousands of times more than the average employee, we really should be as dubious of this reward as we would be of an athlete who somehow manages to shave ten, fifty, or one hundred seconds off the previous world record in the sprint. Anyone who is a billionaire definitely did not earn that money through entirely honest means under conditions of true competition in which all have an equal chance to excel. No, they were in a position to take advantage of a rigged system.
Like everything created by humans, markets are not perfect but flawed creations. If we come up with a description of markets which recognises the possibility that some may violate one or more of the conditions for acquiring deserved wealth, we can see what that flaw is. So here goes: the free market is an arena of competition in which individuals and groups try to gain a greater monetary reward than other individuals and groups, via whatever method they can get away with. Clearly, such conditions are prone to cheats whose method for gaining wealth relies not entirely on making their own money but, at least in part, on taking wealth from others. Modern understandings of markets view them not as efficient machines, but rather as chaotic ecosystems. Just as the natural ecosystem inevitably allows parasites to evolve, so too do market systems give rise to parasites. And, just as in the natural world, those parasites are under competitive pressure to hide themselves from their victims, or better yet to fool their victims into believing they are something to be protected rather than fought.
Bear in mind what I said about the ‘cheating’ athlete, though. Just as the athlete was not simply a cheat but somebody who relied on a combination of meritocratic and dubious methods for achieving success, so too are the most successful cheats in business rarely simply parasites. Just as natural parasites have a competitive selective advantage in becoming interwoven with some vital function of their host’s body, such that removing the parasite without harming the host presents a great challenge, so too are market parasites under selective pressure to interweave their dubious wealth-extraction schemes with genuinely useful services. The best parasites are never just cheats.
But, whatever. It is true to say that if those who take wealth, rather than make it, are allowed to flourish too much, then society is doomed just like Rand said. We know what needs to be done to ensure that never happens, though: Don’t let them get away with it.
‘Will Work For Free’, YouTube documentary
‘And They Charge Hundreds of Pounds More For Drugs You Can Buy at Asda’, the Daily Mail
‘Atlas Shrugged’ by Ayn Rand




In 1993 the science fiction author Vernor Vinge wrote a paper in which he predicted a coming event which would radically change life as we know it. “I argue in this paper”, he wrote, “that we are on the edge of change comparable to the rise of human life on Earth. The precise cause of this change is the imminent creation by technology of entities with greater than human intelligence”. Vinge coined a name for this change. He called it the ‘Technological Singularity’.

What I want to explore in this essay is the possibility that a singularity is not a unique event but one which has happened more than once. I believe that Vinge’s own words lead us to suppose that reality has gone through profound shifts in possibility before. The key sentence is as follows:

“This change will be a throwing away of all previous rules…developments that before might only happen in “a million years” (if ever) will likely happen in the next century”.

In other words, a key aspect of a singularity is that it leads to a dramatic change in perceptions of time, or rather, a dramatic compression of possibility, such that the wildly implausible becomes likely. With that in mind, I think we can see in the past history of our universe at least three events which qualify as Singularities.


Before science fiction writers speculated on the possibility of technology bringing about such dramatic change that it imposed a ‘singularity’ on our future, one through which we could not peer and see clearly what was to come, cosmologists looked to the dim and distant past, tracing the evolution of the universe itself until they reached a point where our understanding of physics could take us no further. They called this point, where the state of existence is shrouded in utter mystery, a ‘singularity’, and no doubt those who speculate about the future (Vinge was not the first person to use the word in the context of technological change) borrowed the term from the cosmologists.

Today, by far the most popular theory of the universe’s origins is ‘The Big Bang’, which points to evidence that our universe is expanding and argues that, if this is so, as we go back in time the universe must have been smaller until a moment is reached where all of creation was compressed into a speck of infinitesimal size. This naturally leads to the question ‘what happened before?’, to which the answer seems to be: there was no ‘before’. The Big Bang marks the moment time and space began, so asking what came before the Big Bang is as nonsensical as asking ‘what’s north of the North Pole?’. It follows from this that it is meaningless to wonder how long that mysterious pre-Big Bang reality lasted, for without time a nanosecond is no different to an eternity. What bigger difference in perceptions of time can there be than the transition from timelessness to change that can be measured?

Vernor Vinge saw the Technological Singularity as an event which could compress our expectations of what is possible in a given time-frame, making ‘only in a million years’ events happen within a century, if not sooner. If we look to our past, we can see another event which dramatically speeded up possibility, and that event was the transition from single-step selection to cumulative selection.

Richard Dawkins illustrated single-step selection by taking a phrase from Shakespeare’s ‘Hamlet’ (‘METHINKS IT IS LIKE A WEASEL’) and asking how long we should expect to wait for a monkey to type it, randomly bashing away at a special word processor with only 27 characters (each letter of the alphabet, capitals only, plus a ‘space’) and which allows exactly 28 bashes per go, that being the number of characters in the phrase if we count the spaces as ‘characters’.
The chance of the monkey happening to type ‘M’ as the first character is one in 27, since there are 27 possible characters the primate could bash. In order to get the first two characters right, the monkey must beat odds of 1/27 times 1/27, which gives us odds of 1/729. In order to randomly type the entire sentence with no spelling errors and every space in the correct place, the monkey must beat odds of about 1 in 10,000 million, million, million, million, million, million. As you can imagine, then, you would likely have to wait a very, very long time for a monkey to bash out that precise phrase on Dawkins’s special keyboard.
And yet these odds are quite good compared to the odds of a haemoglobin molecule happening to assemble itself from random recombinations of the amino acids from which it is made. The haemoglobin molecule consists of four chains of amino acids, with 146 amino acids per chain, and in living things we commonly find 20 different kinds of amino acid. Another science fiction writer, Isaac Asimov, calculated the number of possible ways of arranging 20 kinds of thing in a chain 146 links long and came up with the ‘haemoglobin number’, which is (more or less) a one with one hundred and ninety noughts after it. Compare that to our ‘METHINKS’ odds, in which the monkey ‘only’ had to beat odds of about 1 in 10^40, a one followed by forty noughts. And, of course, one haemoglobin molecule makes up only a tiny fraction of the complexity of a living organism. If it were left up to random chance, we would have to wait far longer than the life of the universe itself for life to emerge.
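The arithmetic behind both figures is easy to verify. Here is a quick sanity check in Python (my own addition, not part of Dawkins’s or Asimov’s workings):

```python
from math import log10

# Monkey odds: 27 possible characters, 28 independent keystrokes per attempt.
monkey_odds = 27 ** 28
print(f"monkey odds: about 1 in 10^{log10(monkey_odds):.0f}")  # → about 1 in 10^40

# Asimov's 'haemoglobin number': 20 kinds of amino acid
# arranged in a chain 146 links long.
haemoglobin_number = 20 ** 146
print(f"haemoglobin number: about 10^{log10(haemoglobin_number):.0f}")  # → about 10^190
```

So 27^28 really is roughly 10^40, and 20^146 really is, more or less, a one with a hundred and ninety noughts after it.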

Since life evidently has emerged, we must conclude that a Vinge-style ‘possibility compression’ occurred at some point in the past, and we know exactly what that event was (although we are still in the dark as to the exact form it took). That event was the transition from single-step selection, aka random chance, to ‘cumulative selection’.

The difference between the two is that, whereas single-step selection has no memory whatsoever of the past, in cumulative selection the results of one process are fed into subsequent processes. To illustrate the power of cumulative selection, Dawkins designed a computer program that, like the monkey, generated a random sequence of 28 characters. It would then duplicate that phrase many times over, but with a certain chance that ‘copying errors’ would alter each duplicate. The computer would then examine all those ‘offspring’ phrases and breed from whichever one most closely resembled ‘METHINKS IT IS LIKE A WEASEL’.

First of all, the program typed the following:

Pretty much the kind of thing you would expect a monkey to produce were it let loose on a word processor. After ten generations of breeding and selecting the phrase closest to ‘METHINKS IT IS LIKE A WEASEL’, the program had managed to produce:
Still hardly a recognisable word, let alone Shakespearian in its quality.
By the time 30 generations had been bred and selected, a resemblance to the target phrase had become undeniable:
And within 43 generations cumulative selection had produced the exact quotation.
How long did it take for the computer to evolve the six-word quotation from ‘Hamlet’? About eleven seconds. Compare that to how long we would expect to wait if we relied only on random chance: about a million, million, million, million, million years.
Now, there is one important difference between Dawkins’s evolutionary program and natural selection, and it is this: the program was given a definite target, in that it had to search through strings of 28 characters and select the one which most resembled, however slightly, the phrase ‘METHINKS IT IS LIKE A WEASEL’. Natural selection, on the other hand, is not heading toward any definite future goal. But still, the experiment Dawkins ran gives us some inkling of the power of cumulative selection to dramatically raise the likelihood of something as improbable as the complexity of life as we know it today. Paraphrasing Vinge, we might say that ‘events which would otherwise take about a million, million, million, million, million years to happen can actually happen between eleven seconds and one hour’.
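For readers who want to watch cumulative selection at work, here is a minimal sketch of a weasel-style program in Python. The brood size (100 offspring per generation) and mutation rate (a 5 percent chance of a copying error per character) are my own guesses, as Dawkins does not specify his exact settings, so the generation count will vary from run to run and will not match his 43:

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "  # 26 capitals plus a space: 27 characters


def score(phrase: str) -> int:
    # Count how many characters match the target in the right position.
    return sum(a == b for a, b in zip(phrase, TARGET))


def mutate(phrase: str, rate: float = 0.05) -> str:
    # Duplicate the phrase, with a small chance of a 'copying error' per character.
    return "".join(
        random.choice(CHARS) if random.random() < rate else c for c in phrase
    )


# Single-step start: one random bash at the keyboard.
parent = "".join(random.choice(CHARS) for _ in range(len(TARGET)))
generation = 0
while parent != TARGET:
    # Breed 100 mutant offspring and keep whichever scores best.
    # (Including the parent ensures the best phrase so far is never lost.)
    brood = [mutate(parent) for _ in range(100)] + [parent]
    parent = max(brood, key=score)
    generation += 1

print(f"reached the target in {generation} generations")
```

Because each generation is bred from the best phrase of the last, progress accumulates, and the program homes in on the target in well under a hundred generations rather than the 10^40 attempts single-step selection would need.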

The theory of evolution by natural selection tells us that human beings are just one more species belonging to a great family tree comprised of all living things that exist, or have ever existed. But are human beings really just another animal, no more remarkable than any other creature or plant? Or is there a good reason to pick human beings out for being special in some way? 

I think the latter is true, and the special reason is as follows: Human beings, unique among life on Earth, enabled a new kind of evolutionary process. As Dawkins wrote, “There is an evolution-like process, orders of magnitude faster than biological evolution…This is variously called cultural evolution, exosomatic evolution, or technological evolution”. Whereas all other forms of life on Earth can only adapt at the speed of natural selection, human beings have the imagination, the communicative capability, and the dexterity to reshape materials around them to produce useful designs intended to suit some purpose. We don’t have to wait for natural selection to adapt us for operating under water, we can develop snorkels, aqualungs, submarines and other forms of aquatic technology. And while it took billions of years for natural selection to produce flying animals like birds, it took only a couple of million years for human beings to fly to the Moon. 

Our hominid ancestors were not always so rapid in their technological development. If we look back more than forty thousand years we find man-made artefacts that had hardly changed for a million years. Generation upon generation upon generation of humans produced the same kind of flint knife that their ancestors relied on, plus a few other tools. As for paintings and carvings and figurines, they produced none.
Our distant ancestors were no different to any other animal. Humans are not the only animals that make use of tools. Other primates have been observed using blades of grass to ‘fish’ ants out of holes; thrushes use a stone as an ‘anvil’, bashing snails against it so as to break their shells and get at the soft meat inside; beavers fell trees in order to construct dams; the list goes on. But none of those animals have anything like technology, which is an ever-accumulating family of tools and techniques solving more and more problems. No beaver ever figured out how to add hydroelectric power to its dam; no thrush ever learned to combine its snail meat with other ingredients to produce a tastier dish. Similarly, it seems our ancestors of more than 40,000 years ago never figured out that they could make dramatic improvements to their tools, and that there was an almost infinite range of possible tools and techniques waiting to be fashioned from the resources around them.
But then something happened, an event Jared Diamond called ‘The Great Leap Forward’. As I said before, prior to this Leap the tools our ancestors used hardly changed for a million years, but after it we find paintings, carvings, musical instruments, and the beginnings of true technological capability that would result in, among other wondrous inventions, the iPad 2 and the app on which I am writing this very essay, a mere 40,000 years or so later.
As Matt Ridley wrote in ‘The Rational Optimist’, the human race has “surrounded itself with peculiar, non-random arrangements of atoms called technologies, which it invents, reinvents and discards almost continuously. This is not true for other creatures…they do not ‘raise their standard of living’, or experience ‘economic growth’…they do not experience agricultural, urban, commercial, industrial, and information revolutions”.

This leads one to ask what it is about the human species that enabled it to trigger this paradigm shift to technological evolution. Some authorities think it has something to do with language. Perhaps not the evolution of language itself (linguists like Steven Pinker believe language to be older than the Leap) but rather (as Dawkins speculated) “a new trick of grammar, such as the conditional clause, which, at a stroke, would have enabled ‘what if’ imagination to flower”. You can understand why people look to language, or some adaptation of the ability to communicate linguistically, as the trigger of the Great Leap Forward, because technology is an inherently collaborative process. The idea of the lone genius who gets a great idea from out of nowhere is pretty much false. Instead, inventors take materials, tools, techniques and ideas that already exist and put them together in a new way to achieve a new result. They rely, in other words, on work that has already been accomplished. Furthermore, they nearly always rely, either directly or indirectly, on support from other people in building whatever they are making. This fact is illustrated in a famous essay called ‘I, Pencil’, in which you are challenged to make, from scratch, the kind of pencil you can buy in a shop. Of course, you could snap off a twig and use it to etch markings into soft earth, but could you make a pencil with a graphite tip and a little rubber on the opposite end, held in place by a strip of metal? In order to do that you would have to know how to mine the graphite and metal, produce the rubber, and get the graphite inside a hollow tube of wood. And, of course, you could not rely on anyone else’s tools to do this work; you would have to build those entirely from scratch as well.

Simply put, this is an impossible task for one person to do. All of us rely on work that was done almost entirely by other people. Technology is very much dependent on specialisation and exchange, on people collaborating with one another and relying on an accumulating, evolving record of knowledge. Such a capability could never have arisen had each individual only had its own mind to rely on, and no way of communicating ideas from one mind to another and across generations. Vinge himself actually used the arrival of the human species as an example of a Singularity-like change. Recall that he wrote:
“We humans have the ability to internalize the world and conduct “what if’s” in our heads; we can solve many problems thousands of times faster than natural selection”.

So, there we are: My three examples of past events which had such a dramatic effect on time, on what is possible, that they deserve to be thought of as ‘Singularities’. Makes you wonder how different the future will be if Vinge’s ‘technological singularity’ actually happens, doesn’t it?

“The Blind Watchmaker” and “The Ancestor’s Tale” by Richard Dawkins.
“The Rational Optimist” by Matt Ridley.
“The Technological Singularity” by Vernor Vinge.
“I, Pencil” by Leonard E. Read.



It is difficult to imagine going through life without relying on banks and the services they provide. We use such services every day. Our wages get wired to our accounts, we regularly use credit cards to buy things in stores and online. And then there are all those bills and loan offers that get sent to us. Yes, clearly, banking is interwoven into all our lives.
This, however, is not universally true. Around the world there are people, roughly 2.5 billion adults, who don’t have access to banks and the services they provide. Such people are unable to open savings accounts and cannot get credit cards. While in countries like Canada, the U.K. and Germany around 96 percent of people above the age of fifteen have a bank account, in Pakistan only 27 percent of the population does. Nor is this a problem exclusive to developing countries. In the USA some 88 percent of people have a bank account of some kind, but around 30 percent must still make do with nontraditional banking sources such as payday loans, and have insufficient access to the financial system. These people, who either live where there are no banks or who are unable to access all the services we have come to expect from the industry, are the ‘unbanked’.
So, how come there are people who don’t have access to such a crucial service? The two main reasons are a lack of facilities and a lack of documentation.
Starting with the former, there are people who live in places where traditional banks don’t want to go. This is partly because, being poor, such customers don’t offer the kind of profits that richer clients do. But it is also because they live in places with insufficient infrastructure and security, which makes building bank branches in those areas difficult. As such, there is a distinct lack of the services we take for granted. For example, in Uganda circa 2005 there were one hundred ATMs for 27 million people.
As well as lacking the infrastructure that banking requires, the unbanked have the problem of insufficient documentation. As the Peruvian economist Hernando de Soto has shown, economic growth and the creation of wealth depend upon clearly defined and documented property rights. In the West we have documents attached to our assets, things like our cars and our homes, that can be presented as collateral to a bank to borrow money. But in developing countries people have assets without documentation. Lacking the paperwork to prove their identity, put up collateral and create credit histories, the unbanked lack the basic foundations for participating in the banking system, and are limited to cash transactions. In the absence of mainstream banking, a shadow banking system has emerged to meet their needs, but such organisations leave these vulnerable people open to corruption.
Fortunately, we have now developed, and are expanding, the technological capability to integrate the unbanked into the global financial system. This is being achieved not by building physical bank branches but by leapfrogging brick-and-mortar banks using mobile technology. The potential in bringing those 2.5 billion unbanked into the global economy is not to be sniffed at. According to the journalist Robert Neuwirth, in aggregate there is some $10 trillion-worth of assets owned by undocumented people around the world. Were they their own country, its economy would be second only to the USA’s.
It is therefore no fool’s errand to find ways of overcoming the obstacles the unbanked face in becoming part of the global economy. Indeed, the work done so far proves how valuable this can be. Thanks to globalisation and digitisation, people in India with sufficient understanding of English and IT skills can work from home servicing computers in America and Europe. Multinational companies now source their goods from all over the world, bringing job prospects to areas that hitherto had only a hand-to-mouth existence. Of course, this is not without some negative consequences. It means jobs are lost in richer countries as they are outsourced to places where there is cheaper labour. But it helped drop the percentage of the world’s population living on less than $1.25 a day from 43.1% to 20.6%.
Perhaps the most important device for bringing the unbanked into the 21st century is the mobile phone, for it is this device, plus the wireless infrastructure and IT that support it, that more than anything have allowed people to move away from cash and opt for more secure and useful forms of money.
Developing countries may be most open to this kind of change for a couple of reasons. For one, in richer countries we are used to digital currencies that offer all the convenience of cashless currency, but hide quite a bit of expense from the end user; that expense eventually showing up as higher prices. In developing countries, corruption and excessive bureaucracy are more blatant. Workers in some African countries must pay their boss a bribe in order to receive their wages. In Cairo, acquiring and registering a plot of state-owned land involves wading through some 77 bureaucratic procedures across 33 agencies, and can take up to 14 years. So we should perhaps expect to see such people embrace blockchain 2.0 and cryptocurrency solutions faster than countries that have convenient credit-card payment schemes with hidden charges.
Secondly, in developing countries we find a greater proportion of self-employed people- rickshaw drivers, food-stall operators, small business owners- and for such people it is particularly important to save costs on financial transactions. A payment solution that is more secure, able to circumvent corruption and avoids hidden charges, has obvious benefits for people who have to look after every penny. 
I would add a third reason why mobile banking, cryptocurrency and blockchain 2.0 technology may develop fastest in countries with large unbanked populations, and that is the ‘latecomer’s advantage’. There is no rule that says a country has to retread all the steps that led to the modern world. It can leapfrog straight to the latest technologies and practices. Indeed, this leapfrogging makes a great deal of sense, because the most modern technologies often do the same job as their predecessors, only more cost-effectively and less wastefully. The cost of purchasing and burying copper wire for a communications infrastructure would be more than $100 million; cell tower infrastructure would cost a relatively small tens of thousands of dollars. If a city like Zinder in southern Niger were to adopt PCs, then by the time 10% of the population were using them, the power they consumed (1,500 kW) would exceed that of all households today. Mobile devices, on the other hand, would consume just 74 kW, and as they run off batteries they are more useful in areas where power outages are a common experience. It is for reasons such as these that countries like El Salvador and Panama have adopted mobile communications faster than the USA.

The fact that the rich nations have well-established systems and infrastructures could be an impediment to progress. W. Brian Arthur, External Professor at the Santa Fe Institute and author of ‘The Nature of Technology’ has written about how established technologies and practices can delay the adoption of new methods, even though those new methods are superior. In 1955, the economist Marvin Frankel noticed that cotton mills in Lancashire were not using the more modern and efficient machinery. This was because the old brick structures that housed the old machinery would have to be torn down before the new machinery could be installed. As Arthur wrote, “The outer assemblies thus locked in the inner machinery and thus the Lancashire mills did not change”. To this day, whenever a technology is so interwoven with the fabric of everyday life or business practice that replacing it seems too much bother, we say it has become ‘locked-in’.
There is also a psychological aspect to consider. Established technologies and practices can lead people to adopt certain ways of doing things, and upstart technologies that make the old ways obsolete can feel threatening. Sociologist Diane Vaughan called this ‘psychological dissonance’ and wrote:
“(We use) a frame of reference constructed from integrated sets of assumptions, expectations, and experiences…This frame of reference is not easily altered or dismantled, because the way we tend to see the world is intimately linked to how we see and define ourselves in relation to the world. Thus, we have a vested interest in maintaining consistency because our own identity is at risk”.
Therefore, established technologies, infrastructures and methods can create hysteresis- a delayed response to change- that holds the new at bay, at least until the old ways simply cannot be stretched any further. So, it could be that developing countries which lack many of these established infrastructures and technologies, would adopt the new and accommodate themselves more quickly to the methods and practices they make possible.
As a digital communications infrastructure is established and made accessible, the unbanked have the opportunity to pursue specialisation and exchange, building services that reduce problems in economic activity, match underused resources with unmet needs, and generally follow a proven path to prosperity. Through a combination of the Internet, mobile telephony and micro-financing, websites like Kiva allow individuals in the West to lend to African entrepreneurs who are able to deposit receipts and pay bills without having to handle cash. Zambian farmers have boosted profits by 20 percent by using their mobile phones to buy seeds and fertiliser.

Perhaps the most successful mobile banking scheme (in terms of helping the unbanked, at least) is M-Pesa. M-Pesa came into existence in 2007, when Safaricom began a pilot program that turned prepaid calling minutes into a form of currency. In order to use M-Pesa, people sign up for an account and their phone gets an e-wallet. They can then go to a local Safaricom agent and pay cash for ‘e-float’. As Paul Vigna and Michael J. Casey explained in “Cryptocurrency”, “this money isn’t actually held in the form of Kenyan shillings but as a separate claim on the overall M-Pesa e-float, all of which is backed by depositors in the banks with which Safaricom has accounts”. This currency can then be sent to other phone users who also have M-Pesa accounts, or a user can withdraw cash by going to an agent, who will hand over money provided the user has an equivalent amount of e-float in their account.
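To make those mechanics concrete, here is a toy model of an e-float ledger in Python. It is purely illustrative (my own simplification, not Safaricom’s actual system, and the phone numbers are invented), but it captures the key invariant described above: every unit of e-float in circulation is a claim backed by cash paid in through an agent.

```python
class EFloatLedger:
    """Toy model of an M-Pesa-style e-float system (illustrative only)."""

    def __init__(self):
        self.balances = {}      # phone number -> e-float balance
        self.cash_reserve = 0   # deposited cash backing the e-float

    def deposit(self, phone, cash):
        # A customer hands cash to an agent and is credited e-float.
        self.cash_reserve += cash
        self.balances[phone] = self.balances.get(phone, 0) + cash

    def send(self, sender, receiver, amount):
        # E-float moves between accounts; no cash changes hands.
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient e-float")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

    def withdraw(self, phone, amount):
        # An agent pays out cash against an equivalent amount of e-float.
        if self.balances.get(phone, 0) < amount:
            raise ValueError("insufficient e-float")
        self.balances[phone] -= amount
        self.cash_reserve -= amount
        return amount


ledger = EFloatLedger()
ledger.deposit("0712-000001", 1000)              # pay in 1,000 shillings
ledger.send("0712-000001", "0712-000002", 400)   # send e-float by phone
ledger.withdraw("0712-000002", 400)              # cash out at another agent
# The e-float still in circulation equals the cash still held: 600.
assert sum(ledger.balances.values()) == ledger.cash_reserve == 600
```

Note that the agent’s cash reserve shrinks with every withdrawal, which is exactly the weakness discussed below: somebody, somewhere, still has to handle the physical money.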

Today, two-thirds of Kenyans use M-Pesa and 25 percent of the country’s GDP flows through it. Vodafone, which owns 40 percent of Safaricom, has brought M-Pesa to Tanzania, South Africa, Fiji, India, Romania and other countries. The relief group Concern Worldwide used M-Pesa to help bring aid to Kenya’s remote Kerio Valley. With the nation’s institutions frozen by the violence that broke out following a hotly contested election, this form of digital currency provided a means of moving money around, and the transaction fee that Safaricom charged was far less than the cost of transporting food and materials. In Tanzania, people who neither live near a hospital nor can afford to travel to one are helped by an organisation called Comprehensive Community-Based Rehabilitation, which uses M-Pesa to cover their travel expenses.

As a form of digital money, M-Pesa is not without its drawbacks. To the end user it may appear automatic, but lurking in the background there is an infrastructure that is unwieldy and expensive. Agents still have to handle cash (indeed, large amounts of cash), which can leave them vulnerable to criminals. And, as Vigna and Casey explained, “when agents run out of money, they have to either stop what they are doing, close the shop and go to a bank, or stop what they are doing and send somebody on their behalf”. Also, Vodafone has partnerships with other payment networks that all charge the usual fees and banking-system-dependent costs we have unfortunately come to expect from such a middleman-heavy service.

Little wonder, then, that many cryptocurrency enthusiasts see bitcoin as a solution to the problems that M-Pesa and other banking-system-dependent forms of digital money cannot resolve. After all, bitcoin makes possible the direct transfer of money between two parties, entirely bypassing the cumbersome and expensive system for international transfers. Because bitcoins are essentially nothing but lines of code, you don’t even necessarily need a smartphone to participate in this form of currency. A project called 37Coins uses people who have Android smartphones as ‘gateways’ to relay messages, allowing others to use cheaper, more rudimentary phones to send money via SMS. Mozilla, the company perhaps best known for the Firefox browser, sells a suitable phone for just $25.

Cryptocurrency also deals with the documentation problem. As Vigna and Casey pointed out, “you, your identity and your credit history are irrelevant. You do need an electronic platform with which to connect to the Internet. But if you are able to get that, bitcoin allows you to send or receive money from anywhere”. With smartphones becoming cheaper, and bitcoin wallets becoming easier to use, this can only help decentralised, peer-to-peer cryptocurrency spread further.
Eliminating the need for middlemen who all take their cut lowers the cost of transactions. But this is only the beginning of the benefits that the technology behind bitcoin could bring. The blockchain provides a middleman-free way to exchange any asset: not just money but intellectual property, contracts, and so on. It creates an irrefutable public record that is not controlled by any one central institution. According to Vigna and Casey, “the Blockchain’s groundbreaking model for authenticating information could liberate the poor from the incompetence and corruption of bureaucrats and judges. Digitised registers of real-estate deeds, all fully administered by a cryptocurrency computer network without the engagement of a central government agency, could be created to cheaply and reliably manage people’s rights to property”.
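The core idea, an append-only record in which tampering with any past entry is detectable, can be sketched in a few lines. The toy hash-chained registry below is vastly simpler than a real blockchain (no distributed network, no consensus, no mining), but it shows why each new entry ‘locks in’ everything before it. The parcel and owner names are invented for illustration.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder 'previous hash' for the first entry


def entry_hash(record: dict, prev_hash: str) -> str:
    # Hash the record together with the previous entry's hash,
    # chaining every entry to the whole history before it.
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()


class DeedRegistry:
    """Toy append-only registry of property records."""

    def __init__(self):
        self.chain = []  # list of (record, hash) pairs

    def append(self, record: dict):
        prev = self.chain[-1][1] if self.chain else GENESIS
        self.chain.append((record, entry_hash(record, prev)))

    def verify(self) -> bool:
        # Recompute every hash from the start; any edit breaks the chain.
        prev = GENESIS
        for record, h in self.chain:
            if entry_hash(record, prev) != h:
                return False
            prev = h
        return True


registry = DeedRegistry()
registry.append({"parcel": "A-17", "owner": "Amina"})
registry.append({"parcel": "A-17", "owner": "Joseph"})  # a later transfer
assert registry.verify()

# Quietly rewriting history invalidates every hash that follows:
registry.chain[0] = ({"parcel": "A-17", "owner": "Mallory"}, registry.chain[0][1])
assert not registry.verify()
```

In a real blockchain the same chaining trick is combined with a decentralised network, so that no single clerk, agency or judge can rewrite the register unnoticed.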


Looking further into the future, we can foresee a time when your money is truly your money. This is not the case with cash. Anybody who gains access to your purse or wallet can spend the money it contains. But if your smartphone will not unlock without biometric data unique to you, then it’s useless to anyone else. The science-fiction author Charles Stross imagined a scenario in which a thief snatches a bag, only for the bag to start screaming in distress at being handled by a stranger. With a combination of sensors and artificial intelligence that can distinguish between property’s rightful owner and everybody else, and GPS tracking, our personal devices could behave just like that bag, immediately alerting authorities and providing incriminating evidence.

As for people who do have permission to access our property, intelligent devices could allow for more precise control over the extent of that access. No more having to carry different loyalty cards; the phone would track your position, know you are in a certain store, and allow you to use its customer reward scheme. People who rent out their homes on AirBnB and other such services could rely on smartphones that provide access to the home for a set period, or which permit entry to some rooms but not others. And, seeing as bitcoin does not care who you are, anything smart enough to begin using the service could do so. As Mike Hearn, a former Google employee, pointed out, “bitcoin has no intermediaries. Therefore, there’s really nothing to stop a computer from connecting to the Internet and taking part”. Indeed, any suitably smart AI could, and Hearn has envisaged driverless taxis that connect to an automated, electronic marketplace he has dubbed ‘Tradenet’. The car (or, rather, the AI that controls it) would own itself, paying its own costs and receiving its own revenue. If it were programmed to provide as cheap and efficient a service as possible, it would be focused on maximising its productivity, with no interest whatsoever in bling and other signs of material wealth.
The issue of robots taking over tasks traditionally the preserve of wage-earning humans has led some to suppose that money won’t be necessary in the future. It seems a reasonable argument: the robots don’t work for wages, humans can’t compete against them for jobs, so goods and services might as well be provided free by our tireless AI servants.

But this argument assumes that money is merely a commodity recognised as a unit of exchange in order to overcome what would otherwise be cumbersome barter exchanges. However, if we look past the physical manifestations of money (be that gold coins, paper notes or lines of code) and focus instead on the credit and trust relationships between the individual and society at large, we discover money is not so much an intrinsically-valuable commodity but rather akin to a social contract whose value depends entirely on everybody agreeing it can be redeemed for an agreed-upon measure of goods and services. Even if robots completely take over the economy and do all jobs, it is still an economy and there would still be resources and services whose relative value has to be measured, somehow. So it seems likely that robots and AI would rely on some way of measuring the relative value of the resources they are using, the goods they are creating, and the services they are offering. So long as there is a society, there will be obligations between debtors and creditors. And that, ultimately, is what money is.

The unbanked are a reminder that scarcity often has little to do with resources being scarce, and much to do with our lacking the ability to access them. There is a tremendous reserve of human potential unfortunately constrained by cumbersome bureaucracy, corruption, and unengaging work. By finding solutions to these problems, we can make the future brighter. Mobile technology, cryptocurrency and the blockchain are doing their part to make that happen.

“The Rational Optimist” by Matt Ridley.
“Accelerando” by Charles Stross.
“Cryptocurrency” by Paul Vigna and Michael J. Casey.
“Rethinking Money” by Bernard Lietaer and Jacqui Dunne.
“The Nature of Technology: What It Is and How It Evolves” by W. Brian Arthur.


Thoughts On the Baby Boomers (part two)

In part one, we saw how historical events early in the 20th century led to lower inequality and greater opportunity for the post-war generation, but that this meritocratic capitalism is no more. So what happened? Why did the conditions that Baby Boomers enjoyed (lower inequality, greater opportunity, more security) come to an end? I would argue that this was largely brought about by two things: faith in the market and distrust of the old bureaucracies.

From the 1980s through to the 21st century, politicians, economists and others on both the right and the left attempted to extend an idea of freedom that was modelled on the market. This model could trace its roots back to game theory, which was developed during the Cold War and turned by John Nash into a way of looking at social interaction in general. Nash argued that individuals lived their lives as if in a game, pursuing only their self-interests and constantly adjusting to one another’s strategies. Economists argued that, if this were true, then we should give up on the very idea of a collective popular will: there was no way of adding up all individuals’ competing desires to produce one coherent goal. Furthermore, it suggested that the old idea of politicians and civil servants being motivated by some altruistic calling to serve the public good was a lie. In reality, it was now thought, politicians and civil servants were motivated by self-interest to build up their empires. The idea of public duty was based on an illusion. Only the market could possibly respond to people’s self-interested drives and create overall prosperity, and the best thing politicians could do was stop interfering.
Political leaders like Thatcher, Major and Blair (and, in the States, presidents like Reagan and Clinton) set about deregulating the market. Following advice from management consultants, John Major set out to create an alternative system intended to mimic the self-interested drive of the free market. This involved setting performance targets, the idea being that this would harness individualism and transform self-serving bureaucrats into heroic entrepreneurs who would be driven by market forces to provide great services. The inexorable logic at the heart of game theory led to this targets-based system spreading far beyond bureaucratic institutions, as teachers, nurses and workers in the private sector were also given performance targets.
It wasn’t only politicians who had grown tired of the old bureaucracies. Throughout the 70s and 80s, popular wisdom increasingly saw corporations as bloated and inefficient, with business executives handicapping their organisations with unwieldy bureaucracies and an over-entitled workforce spoiled by overly generous rewards and low performance demands. In America, the business community was perceived as unable to compete against more nimble foreign competitors. In the UK, years of industrial action turned the public against the unions.
This perception led leaders in politics and business to take drastic deregulatory action. In America, President Reagan appointed an attorney who had previously defended large corporations against antitrust suits as head of the Department of Justice’s antitrust division, thereby all but guaranteeing non-interference from the government in the face of a growing mergers-and-acquisitions movement. Around the same time, the Supreme Court declared laws aimed at shielding local companies from out-of-state suitors to be unconstitutional, a decision that also helped accelerate an era of mergers and acquisitions.
Meanwhile, on Wall Street, Michael Milken of the investment house Drexel Burnham created high-yield debt instruments known as junk bonds, which enabled far riskier and more aggressive corporate raids than had been possible in the previous, more cautious era. This deregulation, and the move toward hostile takeovers, leveraged buyouts and corporate bust-ups, led to a dramatically different working environment. The change is perhaps best illustrated by the nicknames that CEOs of this period acquired. In 1962, Earl S. Willis, manager of employee benefits services at General Electric, wrote, “maximising employment security is a prime company goal”. In marked contrast, twenty years later General Electric’s CEO Jack Welch earned the nickname ‘Neutron Jack’ because, so the wags quipped, by the time he was done with all the layoffs and cutbacks, only the buildings were left standing.
He was hardly alone. Indeed, the 80s was a period in which employees went through many a corporate crisis, brought about by everything from deregulatory trends to global competition. The pressures these trends placed on workplaces led to a corporate perspective focused on ever shorter-term goals, and to job conditions that became increasingly uncertain, unrewarding and demanding. Permanent careers became impermanent jobs, with a move from permanent staff to contingent labour in the form of temps and independent contractors, and during the 80s and 90s pension protections that had existed for almost a century were cut back or eliminated altogether, leaving growing numbers of men and women with no pension at all.
The change that these circumstances wrought was summed up by Steven Hill, who wrote in an article for Salon:
“In a sense, employers and employees used to be married to each other, and there was a sense of commitment and a shared destiny. Now, employers just want a bunch of one-night-stands with their employees…with ‘jobs’ amounting to a series of low-paid micro-gigs and piece work, offering little empowerment for average workers”.
It was not only the workplace that underwent radical changes. The deregulation of the 80s and 90s freed up a lot of capital, but at the expense of creating some highly risky financial instruments. For example, there were collateralised debt obligations, or CDOs. As Dylan Ratigan explained, “CDOs gave banks a way to sell investors bets on whether all of us will be able to pay all our bills”. And then there were asset-backed securities, bonds that gave the buyer the right to collect on debt payments like credit cards and car loans. Aided by increasingly powerful computers capable of tracking huge quantities of data, those existing bonds were bundled into gigantic CDOs. As Ratigan explained, “the new idea in banking was to take every kind of obligation to repay borrowed money- trillions of dollars’ worth- put them in a statistical blender, and then sell portions of the mixture as investments”.
Investment banks began intentionally mixing high-risk loans, such as poor people’s housing loans, with low-risk loans, such as wealthy people’s credit card debt. The blend was awarded a credit rating comparable to that of medium-risk loans, even though the safe loans in the mix provided no protection from the really risky ones.
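The arithmetic behind this mis-rating can be sketched with a toy model (all figures here are hypothetical, purely for illustration): pooling loans averages their default rates, which makes the pool look medium-risk on paper, yet the safe loans do nothing to absorb losses when the risky ones default together.

```python
# Toy illustration (hypothetical figures): pooling risky and safe loans
# produces an average default rate that *looks* medium-risk, even though
# the safe loans offer no protection when the risky ones fail together.

risky_loans = [0.20] * 50   # 50 loans, each with a 20% chance of default
safe_loans  = [0.02] * 50   # 50 loans, each with a 2% chance of default

pool = risky_loans + safe_loans
average_default = sum(pool) / len(pool)
print(f"Pool's average default rate: {average_default:.0%}")  # 11% - looks 'medium risk'

# But if the risky loans are correlated (say, a housing downturn hits
# all subprime borrowers at once), half the pool can default together:
correlated_loss = len(risky_loans) / len(pool)
print(f"Plausible correlated loss: {correlated_loss:.0%}")    # 50% of the pool
```

The averaging step is what a headline credit rating roughly captures; the correlated-loss step is what it misses, and it is exactly the scenario that materialised when the housing market turned.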
The overall result was ever-increasing risk transfer, or what Dylan Ratigan called “playing hot potato with debt…The traditional incentive for banks to act as price integrity police- the standard of making careful, educated investments- was replaced by the incentive to sell as much insurance on as much debt as possible”.
This was a time of mortgages granted to NINJAs- people with no income, no job, and no assets. It was a time when every delivery by the postal service included pre-approved credit card applications. And it was during this time that the Wall Street cowboys used their political influence to bring about the Financial Services Modernization Act of 1999, which revoked a rule established after the Crash of 1929- the rule that no one company could simultaneously be a traditional bank, an investment firm and an insurance company. The Financial Services Modernization Act meant a bank could (in Ratigan’s words) “take your money for safekeeping and use it as collateral with no supervision, all the while insuring itself against losses that taxpayers must pay if the bets the banks made with our money went bad”.
The move to deregulate the market and free capital from political interference and bureaucracy resulted in the rise of a credit casino, which lured people into taking on ever greater levels of debt, hidden behind financial instruments so complex that nobody could really hope to understand them. The banking and financial sector became increasingly infected with toxic debt, inflating a speculative bubble that threatened to pop at any time and trigger a catastrophic downward spiral.
Of course, something like that almost happened in 2008, when the subprime mortgage bubble burst and threatened to bring down such huge banking conglomerates that the government had to bail out the banks to avoid a crash as bad as that of 1929, if not worse. The massive stimulus packages that rescued the banks have resulted in austerity for future generations, or at least for those generations who can’t afford top financial advisers willing to use every morally dubious trick in the book to protect their money. And the dubious financial instruments that led to the near collapse of the global monetary system are still largely in place, leaving us with the probability that another speculative bubble could inflate and then pop, wiping out your savings.
Still, at least those performance targets had rid us of self-serving bureaucrats and delivered efficient services run by heroic entrepreneurs, right? New Labour certainly expected that to be the case. When they came to power in 1997, New Labour modelled itself on the Clinton Administration. Like their US counterpart, New Labour gave power away to the banks and the markets. And they took the targets-based system that John Major had introduced and vastly expanded upon it, to the point where just about everyone, from cabinet ministers down- and even things previously considered unquantifiable, such as ‘happiness’- became part of a huge mathematical system that was supposed to use targets to free public servants from bureaucratic control.
But what this targets-based system actually did was provide an opportunity for cheats to succeed by finding sneaky ways of fulfilling their goals. For example, hospital managers given targets to cut waiting lists achieved them by ordering consultants to prioritise the easiest operations, like bunions, over more complicated ones, like cancer. When they were given targets to reduce the number of patients waiting on trolleys, managers removed the wheels from some of the trolleys, reclassified them as beds, and reclassified the corridors as wards. Again, this meant they could take those patients off the list and meet their targets. Obviously, such tactics did little to improve the actual quality of medical care.
At first the government dismissed reports of cheating as a few bad eggs, but as more reports of fiddling the numbers came in, it became obvious that cheating had become endemic throughout the public services. Harvard Business School and others have shown that when goals are imposed on people, though intended to ensure peak performance, they often result in efforts to game the system without producing the underlying results the metric was supposed to be assessing. As Patrick Schiltz, a professor of law, put it:
“Your entire frame of reference will change and the dozens of quick decisions you make every day will reflect a set of values that embodies not what is right, but what you think you can get away with”.
This endemic cheating turned what had been intended as a rational system for boosting efficiency into a weird world in which people were confronted with numbers and simply did not know whether to trust them. New Labour responded by adding yet more mathematical levels of management, devising complex systems of auditing to monitor workers and make sure targets were being correctly fulfilled. The effect was to turn what had been intended as a system of liberation into powerful new forms of control.
Worse still, this system created a more rigid and stratified society, and it did so through what it did to education. League tables were created showing parents which schools were the best performing and which were bottom of the heap. The intention was that such league tables would incentivise less successful schools to improve their services, raising standards across society. Instead, rich parents moved to the areas with the best schools, causing house prices to spiral and keeping poor families out. And since the league tables were based on exam results, schooling was transformed from a system intended to give poor children the well-rounded education they would need to achieve social mobility into one focused on training kids to pass exams, so that the school could rise up the league tables. The result was that by 2006, the country had become more rigid and stratified than at any time since the Second World War.
So if we compare life for the Baby Boomers with that of subsequent generations, we find the following. The Baby Boomers had an educational system designed to train them for their future careers and so enable social mobility; later generations received an education intended only to help schools look good on league tables, and a higher education that is often not worth a damn when it comes to improving one’s chances of landing a decent job. When they entered the world of employment, the paternalistic corporate model ensured Baby Boomers a secure and steady working life in which they were treated as stakeholders in the company they worked for, provided with many benefits in return for loyalty. Nowadays you enter a job with, in all probability, no idea whether you will still have it tomorrow. You are a ‘permalancer’, working the same long hours as a full-time employee but with the same lack of benefits (no sick pay, no holiday entitlement) as the self-employed. The Baby Boomers left their dependable jobs with a pension that enabled them to maintain the middle-class lifestyle they had earned throughout their working lives. These days, with austerity eating away at so many services, and a banking and financial sector still very much prone to speculative bubbles and subsequent crashes, you have no idea whether you will have any money to support you in your old age.
If we can visualise the Baby Boomers as being on a fairly well signposted path to prosperity, we can visualise later generations as being in some bewildering maze-cum-gauntlet, trying to negotiate their way around advice from genuine, well-meaning experts, and cheats disguised as servants but interested only in achieving tremendous short-term gain at their expense. In such a complex and uncertain world, is it any wonder that today’s young have adopted an “eat, drink and be merry, for tomorrow we die” outlook on life? I rather suspect that, had Baby Boomers lived under such conditions of uncertainty, they might have behaved the same.

Images from Wikimedia Commons

Capital in the Twenty-First Century by Thomas Piketty
White-Collar Sweatshop by Jill Andresky Fraser
The Trap by Adam Curtis
Greedy Bastards by Dylan Ratigan


Thoughts On the Baby Boomers (part one)

In a letter submitted to the Daily Mail, Dorothy Dobson responded to an article, published in the same paper, which attacked the Baby Boomer generation for being one that enjoyed privileges no longer available to subsequent generations. Headlined ‘Not All Baby Boomers Lived The Life Of Riley’, Mrs Dobson’s letter explained how “I was born in 1948, to a household with no bathroom, no hot water, and no inside toilet…Our first house cost us £1,400 at a time when my husband’s wages were £4 a week…My husband worked 16 hours a day, seven days a week to earn enough to live on…We’re now pensioners and have a decent home and a small amount of savings…We have good holidays but we have worked hard all our lives. Today’s whingers…think life owes them a living…They should be like we were and go without their luxuries”.
Reading her letter, one’s impression of the Dobsons’ life is hardly of one in which everything was handed to them on a silver platter. I expect many other Baby Boomers recognised their own lives in this letter, for they too were born into households with very little luxury and worked hard for decades to secure a reasonable standard of living in which to live out their old age. It must be very frustrating to hear younger generations- those selfish, smartphone-obsessed kidults, going on endless ‘gap years’ and maxing out their credit cards rather than saving for retirement- accuse the older generation of ‘having it easy’.
As it happens, an understanding of the society the Baby Boomers grew up in, and of the conditions experienced by later generations, does not in fact support the notion that our parents were born with everything handed to them on a silver platter. But things were different back then, and those differences arguably did grant the Baby Boomers opportunities that did not exist before and have not existed since.
In order to see why this might be so, one needs to understand that capitalism comes in many forms, some more inclusive than others. From the 18th to the early 20th century, the West was ruled by a form of capitalism we might call ‘patrimonial capitalism’. This is a form of capitalism dominated by old money circulated through inheritance, leading to a rigid class structure and little social mobility. This is why early 19th century novelists like Jane Austen are so preoccupied with marriage: the rigid class structure and lack of social mobility meant one’s best chance of obtaining or holding onto great wealth was to marry into old money.
This state of affairs was encapsulated in an economic formula set out by Thomas Piketty: r>g. Here r is the rate of return on capital (including profits, dividends, interest, rents and other income from capital) and g is the rate of economic growth. When r exceeds g- as it tends to do when growth is low- wealth accumulates more quickly from capital than from labour, and it accumulates disproportionately among the top 10% and the top 1%. The result is a trend toward ever higher inequality.
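A quick compounding sketch (with made-up rates, purely for illustration) shows what r>g implies over time: if capital returns 5% a year while the economy grows at 1.5%, a fortune held as capital steadily outgrows incomes earned from labour, so the owners of capital claim an ever larger share of the economy.

```python
# Toy compounding sketch of Piketty's r > g (all figures hypothetical).
# Capital returning r = 5% a year outgrows an economy growing at g = 1.5%,
# so the wealth-to-income ratio keeps climbing.

r = 0.05    # annual return on capital (profits, dividends, rents...)
g = 0.015   # annual growth rate of the economy

wealth = 1.0   # an inherited fortune, expressed as a multiple of national income
income = 1.0   # national income

for year in range(30):
    wealth *= 1 + r   # capital compounds at r
    income *= 1 + g   # the rest of the economy compounds at g

# After 30 years the fortune is nearly three times larger relative
# to national income than when it started.
print(f"Wealth/income ratio after 30 years: {wealth / income:.2f}")
```

Nothing in the loop requires the fortune’s owner to work or innovate; the divergence is driven purely by the gap between the two rates, which is Piketty’s point.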
This rise in financial inequality could have spelled the end of democracy, as society would have divided between an oligarchical elite, whose inherited wealth dominated much of society, and everybody else, lacking the power to change their circumstances. But then, in the early 20th century, some dramatic events altered the course of history: the Great Depression and the two World Wars. As you might imagine, these events destroyed much wealth, but, as Piketty argued, they were particularly bad for the wealth owned by the elite. And in the aftermath of World War II, governments took steps to redistribute wealth, while the fast economic growth occurring around the world reduced the importance of inherited wealth.
How so? Well, the clue is in the word ‘growth’. You see, when growth is slow, life changes only gradually, so a money-making venture that worked well for one generation will work for subsequent ones. The son inherits his father’s buggy-whip empire and can live off the profits it brings in. But when growth is high, change is fast and there is no guarantee that a successful venture will continue to work in the future. There is not much call for buggy whips when the internal combustion engine has rendered horse-drawn carriages obsolete.
So, between 1930 and 1975, the trend toward higher inequality was reversed, and a combination of government redistribution programs and fast growth enabled a transition away from patrimonial capitalism toward a more meritocratic form of capitalism. Whereas with patrimonial capitalism your circumstances are largely dictated by the conditions into which you are born, in meritocratic capitalism it is more how you live your life that determines where on the social ladder you end up. You can go from riches to rags if you make enough bad decisions (or have inordinate amounts of bad luck). Or, like the Dobsons, you can go from being really quite poor to really quite well-off, all through your own efforts and careful money management. Certainly, this isn’t privilege handed to you on a plate; it is the opportunity to ascend or descend the social ladder based on your own life choices.
Another force that changed society from the 1920s to the 1980s was the rise of left-wing politics and the communist revolution. This revolution did not have quite the effect that Marx expected. He believed that socialism would sweep capitalism away, as the proletariat gained awareness of themselves as a class and used their superior collective strength to wrest the means of production from the hands of the owner classes. Instead, the threat of communism and the collective strength of workers organised into unions led to a reformation of the workplace. Whereas in the dark, satanic mills of the 19th century employees were treated as commodities to be exploited for profit, made to endure conditions that would seem intolerably brutal to us, in the postwar years businesses instead came to think of their employees as stakeholders. They would treat their staff well, providing benefits like job security, paid vacations, full health coverage and a pension that would enable the employee to continue living in the middle-class lifestyle to which he had become accustomed through decades of loyal service. And it was, of course, that loyalty that businesses hoped to encourage. If the company adopted a paternalistic attitude and looked after its employees, they would want to work to ensure the company thrived.
The rise in benefits during the Baby Boom period was captured in a booklet called ‘A Record Of Progress’, published by The Consolidated Edison Company of New York. It recorded “a 143% increase” (from 1945 to 1960) in sick pay, medical coverage and paid absences; “a 172%” increase in leisure-time benefits (vacations and paid holidays); and “a 562%” increase in retirement benefits.

What the Baby Boomers enjoyed, then, was not everything handed to them on a plate, but rather a world in which there was less inequality, more opportunity to succeed, and more security for ordinary people than had been the case in the preceding two hundred years. It was a time when you were assured that doing well at school and gaining good grades would land you a decent job; that your workplace would provide security for all your working life; and that when you retired your company pension would allow you to sustain your middle-class existence for several decades- the remainder of your life, basically.

It was, arguably, a golden age of capitalism and state security. And now, it is all but gone. How that came about will be the subject of part two.

White-Collar Sweatshop by Jill Andresky Fraser
The Trap by Adam Curtis
Greedy Bastards by Dylan Ratigan
Capital in the Twenty-First Century by Thomas Piketty
Daily Mail
