BATTLE OF THE PARADIGMS: THE FIGHT OVER INTERNET NEUTRALITY
INTRODUCTION
The philosopher Thomas Kuhn saw major scientific theories as being established through ‘paradigm shifts’. On this view of progress, there is always a ‘paradigm’: a set of prevailing theories that forms a worldview through which people observe and explain their world. Any observation or evidence that seems to conflict with that worldview is rejected by the ‘establishment’, who act on the basis that the paradigm must be held inviolate. Opposing the ‘establishment’ is a heretical group beholden to a different paradigm.
This notion of competing worldviews or paradigms is perhaps not confined to major scientific theories; it also applies to one of the major issues affecting our most pervasive and important technologies. As such, there is a battle being fought that inevitably affects the lives of all of us. What battle? The fight over Internet neutrality.
ORIGINS OF THE PARADIGMS
That competing worldviews should have arisen around the Internet is perhaps not surprising given the history of its development and the wider historical context in which it emerged. The battle is between government and the private sector on one hand, who for various reasons seek to impose more centralised control over the Internet, enclosing it for profit’s or security’s sake, and advocates of commons-style management on the other, who wish to maintain and extend the Internet’s capacity to be a distributed, peer-to-peer, laterally-scaled network, one that could potentially provide most of the goods and services required for a decent life at near-zero marginal cost.
These three groups, government, the private sector, and civil society, have been part of the Internet from the start. The Internet itself is a commons, meaning it is owned by everyone and no one. It takes big telecommunications companies to establish and maintain the physical network of the Internet (the fibre-optic cables, data storage and so on), but such companies are merely facilitators and providers; they don’t actually own the Internet. Governance of the Internet is the job of various non-profit organisations, such as the World Wide Web Consortium, the Internet Engineering Task Force and the Internet Corporation for Assigned Names and Numbers. All of these organisations are open for anyone to take part in, although participation does require technical expertise, which in practice excludes the non-technical among us.
The Internet is a commons, but the Web (or rather, the set of applications that run on it) is a hybrid of commercial enterprises and non-profit organisations. The line between the two is not always clear-cut. Apple and its App Store are obviously commercial, but it can feel as though services provided by, say, Google or Facebook are not, because end users get to access them for ‘free’. This mix of purely, partly, and non-commercial applications helps make the issue of governance a thorny one.
ORIGINS OF CENTRALISATION
OK, but why should there be those opposing paradigms of an enclosed, privatised Internet on one hand, and an open, collaborative network on the other? If we look back over the history of the Internet and the wider historical context in which it evolved, the reason becomes clearer.
The Internet’s origins can be traced back to 1960s research commissioned by the US federal government to build robust, fault-tolerant computer networks that could maintain communication even in the event of nuclear attack. This research led to a precursor of the Internet known as ARPANET, which served as a backbone for the interconnection of academic and military networks. By the 1990s, the linking in of commercial networks and enterprises, and of personal and mobile computers, saw the emergence of the Internet we know today.
Now, as I said before, the Internet evolved within a wider historical context. In particular, it evolved in a world dominated by fossil fuels, and it was this more than anything that led to a worldview bent on enclosure and privatisation. The oil industry is one of the most concentrated in the world, and just about every other industry that depends upon fossil fuels requires vast capital expenditure to establish the vertical integration that brings together supply chains, production processes and distribution centres. The sheer cost of these vertically integrated corporate enterprises demands centralised management in order to increase efficiencies and lower costs. This practice of vertical integration and centralisation was both necessary, given the energy/communications matrix of the twentieth century, and actually pretty successful, bringing appreciable improvement to the lives of those in industrialised nations.
It’s perhaps not surprising, then, that governments and private enterprises accustomed to an era when vertical integration and centralisation were the optimal path to prosperity would seek to carry that worldview over to 21st-century technologies.
Another thing to consider is that the electromagnetic spectrum we use to communicate had long seemed like a scarce resource. Broadcast radio emerged in the 1920s, and whenever two or more broadcasters in close proximity used frequencies very close to one another, the result was constant interruption and interference. By 1927, radio broadcasters were causing sufficient disruption for Congress to pass the Radio Act, establishing the Federal Radio Commission (succeeded in 1934 by the Federal Communications Commission). The commission’s job was to manage the radio spectrum and determine which frequencies could be used and by whom. In practice, this meant that a broadcaster had exclusive use of a particular radio frequency licensed from the commission. It also meant that the spectrum itself was considered a scarce resource and, as such, came to be thought of as a commercial asset. So not only did we have organisations growing up in an era when vertical integration and centralisation were engines of productivity and efficiency, but also organisations growing up at a time when broadcasting capacity was a scarce commodity that had to be managed commercially. All of this contributed to a mindset wedded to the ‘enclosure’ paradigm.
ORIGIN OF THE COLLABORATIVE COMMONS
But, equally, the Internet has been a commons from the start, and there have been those wedded to that style of governance. ‘Internet neutrality’ is a concept that grew out of the end-to-end structure of the Internet. This structure favours users rather than network providers: while we must pay for an Internet connection, and the speed and quality our ISP provides can be better or worse depending on how much we pay, once we are connected all transmitted packets of data are treated the same way, with no one, commercial or otherwise, getting preferential treatment. Advocates of commons-style management believe that a neutral Internet is best suited to facilitating collaboration among millions of end users, allowing people to develop their own applications that would advance network collaboration and drive marginal costs to near zero, an eventuality that would cause a paradigm shift away from capitalist/socialist economic systems toward a new world order.
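The packet-level meaning of neutrality described above can be sketched in a toy scheduler. This is a minimal illustration, not how any real router works; the sender names and the ‘paid’ set are invented for the example. A neutral network forwards packets strictly in arrival order, never inspecting who sent them; a non-neutral one lets paying senders jump the queue.

```python
from collections import deque
import heapq

# Hypothetical traffic: (sender, size) pairs, sender names invented.
packets = [("indie-blog", 1), ("video-giant", 1),
           ("indie-blog", 1), ("video-giant", 1)]

def neutral_delivery(packets):
    """Neutral network: strict FIFO, the sender is never inspected."""
    queue = deque(packets)
    return [queue.popleft()[0] for _ in range(len(packets))]

def prioritised_delivery(packets, paid=frozenset({"video-giant"})):
    """Non-neutral network: paying senders jump the queue;
    arrival order only breaks ties within a lane."""
    heap = []
    for arrival, (sender, _size) in enumerate(packets):
        lane = 0 if sender in paid else 1   # 0 = paid 'fast lane'
        heapq.heappush(heap, (lane, arrival, sender))
    return [heapq.heappop(heap)[2] for _ in range(len(packets))]

print(neutral_delivery(packets))      # arrival order preserved
print(prioritised_delivery(packets))  # all paid packets delivered first
```

Run on the same four packets, the neutral scheduler preserves the interleaved arrival order, while the prioritised one delivers both of the paying sender’s packets before anyone else’s.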
GOVERNMENT AND CENTRALISED CONTROL
The notion of a nearly free, open, transparent Internet is not something all parties welcome. According to Jeremy Rifkin, “national governments, concerned over a spate of Internet-related policy issues that affect their general welfare and sovereign interests…are enacting legislation, some of which is threatening an essential feature of the medium- its open, universal, and transparent nature”. In 2011, China, Russia, Uzbekistan and Tajikistan submitted a proposal to the UN General Assembly pushing for new forms of government control over the Internet.
COMMERCIALISATION THROUGH ENCLOSURE
Governments find themselves caught between two interest groups, one dedicated to the capitalist model, the other subscribing to a commons mindset. Not surprisingly, each group seeks to get government on its side, enacting legislation that furthers its interests. The private sector seeks price discrimination that would increase income and profits. More precisely, it seeks to secure control of information exchanged over the Internet, control that would enable it to charge different prices for access to certain kinds of information, prioritise some transmissions, and favour some applications while blocking others. Such a move would seriously compromise network neutrality, which is based on non-discriminatory communication with equal access and inclusion for all participants.
This enclosure of the Internet is not just coming from outside, in the form of network providers fighting against what they see as unfair restrictions on their right to pursue profits. It is also coming from the inside. As Rifkin said, “some of the best-known social media sites on the Web are revving up to find ways to enclose, commercialise, and monopolise the new communications medium”.
There is a famous acronym, TANSTAAFL, which stands for ‘there ain’t no such thing as a free lunch’. In other words, in the commercial world nothing is ever really free, and whenever it seems to be, that is only because the price is hidden. A case in point is the services provided by the likes of Google and Facebook. Ever since the public began connecting their computers and phones to the Net and sharing data on the Web, a valuable resource of personal data has been waiting to be mined, commercialised, and exploited for profit. As Tim Berners-Lee explained, whenever somebody connects to commercial media sites, their vital information is (in Rifkin’s words) “immediately captured, siloed, enclosed and commodified”. That information is then sold to interested third parties.
The most familiar result is the targeted advertising that comes with using Google’s search engine. But it could also potentially mean such things as health insurance companies using your search history and other digital footprints to decide whether to provide cover or not. According to Berners-Lee, “the more this kind of architecture gains widespread use, the more the Web becomes fragmented, and the less we enjoy a single, universal, information space”. In other words, we could be witnessing a form of commercial exploitation based on the commodification of the self that is creating centralised and proprietary monopolies in virtual space.
NATURAL MONOPOLIES
It is argued by commons advocates that, since Google provides an essential service that we all depend upon, and given that no rival offers anything like a comparable service, Google should be classed as an essential facility. Concerns have been raised over loss of ‘search neutrality’, in which a dominant, privately-owned search engine is tempted, for commercial or political reasons, to manipulate search results.
In a counter to such arguments, free-market advocates warn that, by treating such services as social utilities and calling for regulations that treat them as natural monopolies, we run the risk of actually turning them into just that, because in doing so we would be protecting such companies from competition. Since regulated companies have a guaranteed rate of return and fixed prices built in, critics say, they have less incentive to be innovative. It’s not like they have competitors to worry about, after all.
There are also critics who argue that monopolisation of social media is less of a concern than monopolisation of, say, power utilities, because of the disproportionate up-front costs involved. To secure a natural monopoly, power companies had to invest huge amounts of capital to put in place the requisite physical infrastructure and secure a captive user base. Setting up something like Twitter costs far less up front, which makes it much easier for new players to come along and displace market leaders.
The problem with that argument, however, is that while it was once possible to depose market leaders in social media using little capital investment, that is no longer the case. After all, as Rifkin said, “Google…et al are investing billions of dollars in expanding their user base while simultaneously creating impenetrable enclosures, protected by layer upon layer of intellectual property, all designed to profit from the global social commons they helped create”.
The more such services grow, the stronger the network effect: the more users a service already has, the greater the benefit to everyone in using that same service. But such services remain commercial ventures, so while they are motivated to optimise social connections, in line with their users’ interests, they are also incentivised to sell information about users to third parties. This, according to Zeynep Tufekci, sociology professor at the University of North Carolina, is “the corporatization of the commons”.
ENERGY
This battle of paradigms, between an enclosed, privatised and commercialised Internet on one hand and a distributed, collaborative, laterally-scaled commons-managed network on the other, is not confined to the question of who owns our personal information. It also bears on the future of energy. I said earlier that 20th-century energy production, based on fossil fuels, required vertically-integrated infrastructure and centralised management. But 21st-century renewable-energy technologies will work best when they are distributed, laterally scaled across society, and collaboratively organised, which means they favour a communications medium that is similarly distributed, collaborative, and laterally scaled. As Rifkin said, “Internet communications and renewable energies form the inseparable matrix for a foundational infrastructure whose operating logic is best served by commons management”.
Moreover, technological progress can turn seemingly scarce resources into abundant ones (more accurately, the resource was always abundant; our ability to access it was poor). A case in point is managing communications over radio frequencies. Modern technologies in the form of smart antennas, dynamic spectrum access, cognitive radio and mesh networks employ a variety of tricks to use the radio spectrum far more efficiently. This opens up the possibility of establishing open wireless connections at near-zero marginal cost.
If that transition comes, though, it won’t come without a struggle, because there are those with a vested interest in preventing the move to an abundance-based society. We see this in the case of electric utilities, where there have been moves to design a smart grid that is centralised, proprietary, and closed. The plan is to deny users access to moment-by-moment information about changes in the price of electricity, and to prevent end users from uploading their electricity to the grid at advantageous times when prices peak. In other words, the utilities would rather not have users who are efficient at managing their electricity usage, as that would eat into profits. Concerns have also been raised over discriminatory practices that favour faster connection of green electricity from affiliated business partners. As in all cases of neutrality versus preferential treatment, the goal for those pursuing the latter is to enforce a centralised architecture (in this case, on the smart grid), thereby enabling commercial enclosure for profit’s sake.
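The stakes of price transparency can be put in toy numbers. This is a hypothetical sketch, with all prices invented and real grid settlement far more complex: a prosumer with a spare kilowatt-hour earns more when moment-by-moment prices are visible, because they can sell at the peak instead of accepting a flat, averaged rate.

```python
# Toy numbers, all invented: a prosumer with 1 kWh of surplus decides
# when to sell it back to the grid over a four-hour window.
hourly_prices = [0.08, 0.10, 0.30, 0.12]   # $/kWh, hypothetical real-time prices
surplus_kwh = 1.0

def revenue_with_transparency(prices, kwh):
    """Prices visible moment by moment: sell in the single peak-price hour."""
    return max(prices) * kwh

def revenue_without_transparency(prices, kwh):
    """Prices hidden: settle at a flat, averaged rate instead."""
    return (sum(prices) / len(prices)) * kwh

print(revenue_with_transparency(hourly_prices, surplus_kwh))     # 0.30
print(revenue_without_transparency(hourly_prices, surplus_kwh))  # 0.15
```

Even in this crude sketch the transparent prosumer earns twice as much, which is exactly the margin a closed, centralised smart grid would capture instead.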
THE BATTLE
So which paradigm will ultimately prevail? Will the Internet and its expanded future incarnation, the Internet of Things (in which not just communications and information but also energy and logistics are incorporated into our networks), become increasingly enclosed and proprietary, perpetuating the dominance of capitalist/socialist economic systems? Or will we make the paradigm shift to commons management, in which distributed, collaborative, laterally-scaled peer-to-peer networks drive marginal costs to near zero, and prosumers focus on increasing their social capital by providing access to goods and services their communities value? On that second path, capitalist/socialist work would be relegated to a supplementary role, maintaining the infrastructure on which the collaborative commons is built, but no longer dominant in our lives.
Given the financial muscle and political influence of today’s global corporations, it might seem that the collaborative commons stands little chance. But, from an environmental perspective, it would be unwise to bet against a near-zero marginal cost society. The first law of thermodynamics tells us that the total amount of energy in the universe never increases or decreases; the second tells us that all work converts some useful energy into forms unavailable for further work. What this means is that every time resources are turned into commodities, we are effectively running up a bill with nature. Some say that bill has now come due, in the form of climate change and loss of biodiversity. Even for those who deny climate change, there remain the well-established laws of thermodynamics, which demand of life constant innovation and evolution toward more efficient use of resources, on pain of extinction.
The collaborative commons, built on top of infrastructures established by a market system operating at the peak of its power to deliver maximum possible productivity, would be so efficient at converting resources into useful products and services and distributing them appropriately, that anyone interested in the long-term survival and prosperity of the human race ought to promote and work toward advancing it.
BATTLE OF THE PARADIGMS: THE FIGHT OVER INTERNET NEUTRALITY
INTRODUCTION
The philosopher Thomas Kuhn saw major scientific theories as being established through ‘paradigm shifts’. According to this view of progress, there is always a ‘paradigm’- a set of prevailing theories which form a world view through which people observe and explain their world. Any observation or evidence that seems to conflict with that worldview is rejected by the ‘establishment’ who act on the basis that the paradigm must be held inviolate. Opposing the ‘establishment’ is a heretical group beholden to a different paradigm.
This notion of competing worldviews or paradigms is perhaps not confined to major scientific theories, but also one of the major issues affecting our most pervasive and important technologies. As such, there is a battle to be fought that inevitably affects the lives of all of us. What battle? The fight over Internet neutrality.
ORIGINS OF THE PARADIGMS
That competing worldviews should have arisen thanks to the Internet is perhaps not surprising given the history of its development and the wider historical context in which it emerged. The battle is between government and the private sector on one hand, who for various reasons seek ways of imposing more centralised control over the Internet and the enclosure for profit’s or security’s sake, and advocates of commons-style management on the other, who wish to maintain and extend the Internet’s capacity to be a distributed, peer-to-peer, laterally-scaled network that could potentially provide most of the goods and services required for a decent life at near-zero marginal cost.
These three groups- government, the private sector, and civil society- have been part of the Internet from the start. The Internet itself is a commons, meaning it is owned by everyone and no-one. It takes big telecommunications companies to establish and maintain the physical network of the Internet (the fibre optic cables, data storage etc) but such companies are merely facilitators and providers; they don’t actually own the Internet. Governance of the Internet is the job of various non-profit organisations, such as the ‘World Wide Web Consortium’, the ‘Internet Engineering Taskforce’ and the ‘Internet Corporation for Assigned Names and Numbers’. All of these organisations are open to anyone to take part in, although participation does require technical expertise, which in practice excludes the non-technical among us.
The Internet is a commons, but the Web (or rather, the applications that run on it) are a hybrid of commercial enterprises and nonprofit organisations. The line between a commercial and nonprofit organisation is not always clear-cut. In the case of Apple and its App Store, that is obviously commercial. But it can feel like services provided by, say, Google or Facebook, are not commercial because end users get to access them for ‘free’. This mix of purely, partly, and non-commercial apps on the Internet help make the issue of governance a thorny one.
ORIGINS OF CENTRALISATION
OK, but why should there be those opposing paradigms of an enclosed, privatised Internet on one hand, and an open, collaborative network on the other? If we look back over the history of the Internet and the wider historical context in which it evolved, the reason becomes clearer.
The Internet’s origins can be traced back to 1960s research commissioned by the US Federal government to build robust, fault-tolerant computer networks that would provide communication even in the event of nuclear attack. This research lead to a precursor of the Internet known as Arpanet. Arpanet served as a backbone for the interconnection of academic and military networks. By the 1990s, the linking of commercial networks and enterprises, and the linking up of personal and mobile computers saw the emergence of the Internet that we know today.
Now, as I said before, the Internet evolved within a wider historical context. In particular, it evolved in a world dominated by fossil fuels and it was this more than anything that lead to there being a worldview bent on enclosure and privatisation. The oil industry is one of the most concentrated in the world. Just about every other industry that depends upon fossil fuel necessarily requires a vast capital expenditure to establish vertical integration that brings together supply chains, production processes and distribution centres. The sheer cost of these vertically integrated corporate enterprises demand centralised management in order to increase efficiencies and lower costs. This practice of vertical integration and centralisation was both necessary given the energy/communications matrix of the twentieth century, and actually pretty successful, bringing appreciable improvement to the lives of those in industrialised nations.
It’s perhaps not surprising, then, that governments and private enterprises used to an era when vertical integration and centralisation were the optimal path to prosperity would seek to carry their worldview to 21st century technologies.
Another thing to consider is the fact that the electromagnetic spectrum we use to communicate with had long seemed like a scarce resource. Broadcast radio emerged in the 1920s. Whenever two or more broadcasters in close proximity used spectrum frequencies very close to one another, this would result in constant interruption and interference. By 1927, radio broadcasters were causing sufficient disruption for congress to pass the Radio Act to establish the Federal Radio Commission. The job of the FCC was to manage the radio spectrum and determine which frequencies could be used and by whom. In practice, this meant that a broadcaster had exclusive use to a particular radio frequency that they licensed from the FCC. It also meant that the spectrum itself was considered to be a scarce resource and, as such, to be thought of as a commercial asset. Therefore, not only did we have organisations growing up in an era when vertical integration and centralisation were engines of productivity and efficiency, but also organisations growing up at a time when broadcasting was a scarce commodity that had to be managed commercially. This all contributed to the establishment of a mindset wedded to the ‘enclosure’ paradigm.
ORIGIN OF THE COLLABORATIVE COMMONS
But, equally, from the start, the Internet has been a commons and there have been those wedded to that style of governance. ‘Internet Neutrality’ is a concept that grew out of the end-to-end structure of the Internet. This structure favours users rather than network providers because, while we must pay for Internet connection and the speed and quality provided by our ISP can be better or worse depending on how much we pay for their service, once we are connected all transmitted packets of data are treated the same way, with no one- commercial or otherwise- getting preferential treatment. Advocates of commons-style management believe that a neutral Internet is best-suited to facilitate the collaboration of millions of end users, allowing people to develop their own applications that would advance network collaborations and drive marginal costs to near zero, an eventuality that would cause a paradigm shift away from capitalist/socialist economic systems toward a new world order.
GOVERNMENT AND CENTRALISED CONTROL
The notion of a nearly free, open, transparent Internet is not something all parties welcome. According to Jeremy Rifkin, “national governments, concerned over a spate of Internet-related policy issues that affect their general welfare and sovereign interests…are enacting legislation, some of which is threatening an essential feature of the medium- its open, universal, and transparent nature”. In 2011, the countries China, Russia, Uzbekistan and Tajkstan submitted a proposal to the UN General Assembly that pushed for new forms of government control over the Internet.
COMMERCIALISATION THROUGH ENCLOSURE
Governments find themselves in the middle of two interest groups, one of which is dedicated to the capitalist model, while the other subscribes to a commons mindset. Not surprisingly, each group seeks to get government on its side, enacting legislation that furthers its interests. In the case of the private sector, it seeks price discrimination that would increase income and profits. More precisely, it seeks to secure control of information exchanged over the Internet, control that would enable the charging of different prices for access to certain kinds of information, prioritise transmissions, favour some applications while blocking others. Such a move would seriously compromise network neutrality, which is based on a non-discriminatory communications with equal access and inclusion for all participants.
This enclosure of the Internet is not just coming from outside in the form of network providers fighting against what they see as unfair restrictions on their right to pursue profits. It’s also coming from the inside as well. As Rifkin said, “some of the best-known social media sites on the Web are revving up to find ways to enclose, commercialise, and monopolise the new communications medium”.
There is a famous acronym- TANSTAAFL- which stands for ‘there ain’t no such thing as a free lunch’. In other words, in the commercial world nothing is ever really free and whenever it seems like that is the case, that’s only because the price is hidden. A case in point are the services provided by the likes of Google and Facebook. Ever since the public began connecting their computers and phones to the Net and sharing data on the web, a valuable resource of personal data has been waiting to be mined, commercialised, and exploited for profit. As Tim Berners-Lee explained, whenever somebody connects to commercial media sites, their vital information is (in Rifkin’s words) “immediately captured, siloed, enclosed and commodified”. That information is then sold to interested third parties. 
The most familiar result is the targeted advertising that comes with using Google’s search engine. But it could also potentially mean such things as health insurance companies using your search history and other digital footprints to decide whether to provide cover or not. According to Berners-Lee, “the more this kind of architecture gains widespread use, the more the Web becomes fragmented, and the less we enjoy a single, universal, information space”. In other words, we could be witnessing a form of commercial exploitation based on the commodification of the self that is creating centralised and proprietary monopolies in virtual space.
NATURAL MONOPOLIES
It is argued by commons advocates that, since Google provides an essential service that we all depend upon, and given that no rival offers anything like a comparative service, then Google should be classed as an essential facility. Concerns have been raised over loss of ‘search neutrality’, in which a dominant, privately-owned search engine is tempted, for commercial or political reasons, to manipulate search results.
In a counter to such arguments, free-market advocates warn that, by treating such services as social utilities and calling for regulations that treat them as natural monopolies, we run the risk of actually turning them into just that, because in doing so we would be protecting such companies from competition. Since regulated companies have a guaranteed rate of return and fixed prices built in, critics say, they have less incentive to be innovative. It’s not like they have competitors to worry about, after all.
Also, there are critics who argue that monopolisation of social media is less of a concern than, say, power companies, because of disproportionate up-front costs. In order to guarantee a natural monopoly, power companies had to invest huge amounts of capital in order to set in place the requisite physical infrastructure and to secure a captive user-base. But when it comes to setting up something like Twitter, the up-front costs are far less. This makes it a lot easier for new players to come along and displace market leaders.
The problem with that argument, however, is that while it was once possible to depose market leaders in social media using little capital investment, that is no longer the case. After all, as Rifkin said, “Google…et al are investing billions of dollars in expanding their user base while simultaneously creating impenetrable enclosures, protected by layer upon layer of intellectual property, all designed to profit from the global social commons they helped create”.
The more such services grow, and the more users of such services there are, the more it is of benefit if everyone likewise uses the same services. But such services remain commercial ventures, so while they are motivated to optimise social connections, in line with their users’ interests, they are also incentivised to sell information about users to third parties. This, according to Zeynap Tufecki, sociology professor at the University of South Carolina, is “the corporatization of the commons”.
ENERGY
This battle of paradigms between an enclosed, privatised and commercialised Internet on one hand and a distributed, collaborative, laterally-scaled commons-management network on the other, is not just confined to the question of who owns our personal information. It also impacts on the future of energy. I said earlier that 20th century energy production, based on fossil fuels, required vertically-integrated infrastructure and centralised management. But 21st century renewable-energy technologies will work best when they are distributed, scale laterally across society, and organised collaboratively, which means they favour a communications medium that is similarly distributed, collaborative, and laterally scaled. As Rifkin said, “Internet communications and renewable energies form the inseparable matrix for a foundational infrastructure whose operating logic is best served by commons management”.
Moreover, technological progress can turn seemingly scarce resources into abundant ones (actually, it is more accurately said that the resource was always abundant; our ability to access it was poor). Case in point: Managing communications over radio frequencies. Modern technologies in the form of smart antennas, dynamic spectrum access, cognitive radio technologies and mesh networks are able to employ a variety of tricks that use the radio spectrum with far greater efficiency. This opens up the possibility of establishing open wireless connections at near-zero marginal cost.
But, if so, it won’t come without a struggle, because those with a vested interest in scarcity will resist the transition to an abundance-based society. We see this in the case of electric utilities, where there have been moves to design a smart grid that is centralised, proprietary, and closed. The plan is to deny users moment-by-moment information about changes in the price of electricity, and to prevent end users from uploading their own electricity to the grid at advantageous times, when demand and prices peak. In other words, utilities would rather not have users who manage their electricity efficiently, as that would eat into profits. Concerns have also been raised over discriminatory practices that favour faster connectivity for green electricity supplied by affiliated business partners. As in all cases of neutrality versus preferential treatment, the goal for those pursuing the latter is to enforce a centralised architecture (in this case, on the smart grid), thereby enabling commercial enclosure for profit’s sake.
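To make concrete why moment-by-moment price data matters, here is a minimal sketch — with hourly prices and a battery size invented for illustration — of what a prosumer could do with open access to that data: charge storage when electricity is cheap and sell it back when demand peaks.

```python
# Hypothetical hourly prices in cents/kWh over one day (invented for illustration).
prices = [8, 7, 6, 6, 7, 9, 14, 18, 16, 12, 10, 9,
          9, 10, 11, 13, 17, 21, 19, 15, 12, 10, 9, 8]

BATTERY_KWH = 10  # storage capacity of the prosumer's battery (hypothetical)

# With open price data, the prosumer charges in the cheapest hour
# and sells the stored energy back to the grid in the most expensive one.
buy_hour = min(range(24), key=lambda h: prices[h])
sell_hour = max(range(24), key=lambda h: prices[h])
profit = BATTERY_KWH * (prices[sell_hour] - prices[buy_hour]) / 100  # dollars

print(f"charge at hour {buy_hour} ({prices[buy_hour]}c/kWh), "
      f"sell at hour {sell_hour} ({prices[sell_hour]}c/kWh): "
      f"${profit:.2f} per cycle")
```

A closed grid that withholds the price curve, or refuses uploads at peak hours, eliminates exactly this arbitrage — which is why openness here is contested.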
THE BATTLE
So which paradigm will ultimately prevail? Will the Internet and its expanded future incarnation, the Internet of Things (in which not just communications and information but also energy and logistics are incorporated into our networks), become increasingly enclosed and proprietary, perpetuating the dominance of capitalist/socialist economic systems? Or will we make the paradigm shift to commons management, in which distributed, collaborative, laterally-scaled peer-to-peer networks drive marginal costs to near zero, and prosumers focus on increasing their social capital by providing access to the goods and services their communities value? In that scenario, capitalist/socialist work would be relegated to a supplementary role: maintaining the infrastructure on which the collaborative commons is built, but no longer dominating our lives.
Given the financial muscle and political influence of today’s global corporations, it might seem the collaborative commons stands little chance. But, from an environmental perspective, it would be unwise to bet against a near-zero marginal cost society. The first law of thermodynamics holds that the total amount of energy in the universe never increases or decreases; the second, that all work converts some useful energy into forms unavailable for further work. This means that every time resources are turned into commodities, we are effectively running up a bill with nature. Some say that bill has now come due, in the form of climate change and loss of biodiversity. Even if you deny climate change, there still remain the well-established laws of thermodynamics, which demand of life constant innovation and evolution toward ever more efficient use of resources, on pain of extinction.
The collaborative commons, built on top of infrastructures established by a market system operating at the peak of its power to deliver maximum possible productivity, would be so efficient at converting resources into useful products and services and distributing them appropriately, that anyone interested in the long-term survival and prosperity of the human race ought to promote and work toward advancing it.
