Scarcity, Spectrum, Broadband and Net Neutrality
In the following pages I review some of the main debates about airwave distribution and allocation that took place during the 20th century and the beginning of the 21st. I also investigate the ways in which new technologies and the Internet have changed the spectrum debate, and inquire whether the spectrum-scarcity argument still holds in the wake of Internet radio and mobile broadband communication. Before plunging into an analysis of the present conditions of the information and communication debates, it is worth reviewing some of the main arguments and debates of the 20th century, so that we have a conceptual background and a reference base from which to draw comparisons, analyze, and arrive at a possible forecast for the industry.
There are two conflicting narratives concerning broadcast and airwave regulation. The scarcity principle maintains that there are not enough available airwaves for all to use and that a mechanism of distribution has to be put in place in order to prevent interference. Proponents of wavelength allocation can be separated into two main groups: those who favor a market-based, price-oriented approach, proposing that frequencies be allocated to the highest bidder, and those who argue that airwaves have to be available not only to those who have the means to pay for them, but also to minorities, small communities, and individuals without resources.
There is a similarity between the argument that the state is a necessary evil needed to avoid chaos and maintain order in society and the argument that radio needs to be regulated by a central authority. On August 13, 1912, a bill approved by the Senate and the House of Representatives became law; the 1912 Act required that anyone operating a radio station have a license issued by the Secretary of Commerce. This license would record details of the ownership and location of the station, the wavelengths and hours authorized for operation, and so on. Not long after the 1912 Act was passed, attempts to change it took place. In 1918, then Secretary of the Navy Josephus Daniels, using the scarcity principle as his main argument, said that “there is a certain amount of ether, and you cannot divide it up among the people as they choose to do it; one hand must control it…radio, by virtue of the interferences is a natural monopoly…” (Coase 3). In 1917 and 1918, bills were introduced seeking to give the Navy absolute control of the radio industry. Daniels was requesting that the government have monopolistic control over wireless communication, contending that the limits of the airwaves produced the conditions for a natural monopoly in wireless communication. The natural-monopoly argument stood from the beginning in opposition to the principle of common resources, which sees spectrum as a form of social wealth, like air or water. Nevertheless, the House Committee was not convinced by the Navy’s argument and did not allow the bill to pass.
In 1921, the Intercity Radio Company sued Secretary of Commerce Herbert Hoover over his refusal to renew its license. In February 1923, a court decision held that the Secretary of Commerce had no discretion to refuse a license. Although the court thus denied the Secretary of Commerce control over the number of stations that could be established, it implicitly recognized his power to choose the wavelength that a licensee could use. A later decision denied him even that power. In 1925 the Zenith Radio Company was assigned a wavelength and hours of broadcasting that put it at a disadvantage relative to the General Electric Company’s Denver station. Zenith refused to abide by the rules and decided to broadcast on the wavelengths and at the hours of its choice. Criminal proceedings were brought against the Zenith Radio Company for violation of the 1912 Act, but in April 1926 it was held that the Act did not give the Secretary of Commerce power to make regulations and “that he was required to issue a license subject only to the regulations in the Act itself” (Ibid 5).
After that, a “period of chaos” ensued, in which the Secretary of Commerce was compelled to issue licenses to anyone who applied, and licensees became free to decide the power of their stations, their hours of operation, and the wavelengths they would use.
The Zenith decision added pressure for new legislation, and in July 1926 the two Houses of Congress passed a joint resolution “providing that no license should be granted for more than ninety days for a broadcasting station or for more than two years for any other type of station” (Ibid 5). In addition, anyone who wanted a license had to waive any right or “any claim to a right, as against the United States, to any wave length or to the use of the ether in radio transmissions” (Ibid 5); thus the principle that the airwaves were a common resource belonging to the people of the United States prevailed this time. In December 1926, the House and Senate agreed on comprehensive measures to regulate the radio industry, which became law in February 1927.
This Act brought into existence the Federal Radio Commission, whose duties included classifying radio stations, prescribing the nature of the service provided, assigning wavelengths, determining the power and location of transmitters, regulating the type of apparatus used, and preventing interference. Other requirements obliged the applicant for a license to provide proof of citizenship, evidence of financial and technical qualification and of character, the ownership and location of the station, the frequencies or wavelengths and power to be used, the hours of the day of operation, the purpose of the station, and such other information as might be required.
A fundamental aspect was that the Commission was authorized to issue a license only if the “public interest, necessity or convenience would be served” by doing so. Transfer of a license had to be authorized by the Commission. And following the spirit of the 1926 resolution, licensees were required to sign a waiver of any claim to the use of a wavelength or the ether. The Commission was therefore given great powers to regulate the radio industry but no power of censorship, although some restrictions on obscene, indecent, or profane language were imposed. Other provisions included the need for a permit from the originating station to rebroadcast programs, the announcement of the names of those paying for or furnishing programs, and equal opportunity for all candidates whenever a qualified candidate for office was allowed to broadcast on behalf of his campaign. Radio stations operated by the Federal Government were not under the jurisdiction of the Federal Radio Commission, except when their transmissions were not related to government business; these stations were subject instead to the President of the United States.
In 1934, the powers exercised by the Federal Radio Commission were transferred to the Federal Communications Commission, which was also made responsible for regulating the telephone and telegraph industries. The sections of the 1934 Act largely reproduced those of the 1927 Act. Amendments have been made since, but mainly on procedural matters, leaving the system in place today essentially the same as the one established in 1927.
The common-resource argument gave rise to a clash with the doctrine of freedom of the press. The Commission on Freedom of the Press, under chairman Robert M. Hutchins, included under the term press “all the means of communicating to the public, news and opinions, emotions and beliefs, whether by newspapers, magazines or books, by radio broadcasts, by television or by film” (Ibid 7). The Supreme Court held a similar view. When the Federal Radio Commission was established, its immediate task was to regulate and allocate wavelengths, control power, and police interference. But since one of the Commission’s goals was to enforce representation of the public interest through license renewal, it naturally entered the area of content supervision and regulation. The Commission maintained -- and this has been sustained by the courts -- that in order to renew a license it has to review past and proposed programming, thus setting the stage for the Commission to go beyond the merely technical task of distributing and allocating wavelengths.
Nevertheless, the Commission’s attempts to influence programming have drawn attention only twice. The first time was in 1940, when the Commission criticized a Boston station for editorializing in favor of a political candidate -- the so-called Mayflower case -- and renewed the station’s license only after obtaining a commitment that it would not broadcast editorials. In 1948 the Commission re-examined the issue and, without explicitly repudiating the Mayflower doctrine, approved of editorializing subject to the criterion of “overall fairness,” recognizing a necessary abridgement of freedom where radio communications are concerned.
Another controversy arose with the publication in 1946 -- under the Truman administration and under “a progressive FCC (that) emerged to challenge the commercial interests consolidating control of US media” (Pickard 1) -- of the so-called Blue Book, entitled Public Service Responsibility of Broadcast Licensees. The name and motive behind the Blue Book followed a letter from a World War II veteran named William Tymous to FCC Commissioner Clifford Durr, expressing his outrage at the racist radio programming he encountered at home after coming back from the war. The Blue Book “took the unprecedented -- and unrepeated -- step of making the privilege of holding broadcast licenses contingent upon meeting substantive public interest requirements” (Ibid 3). This time the Commission made a clear attempt to regulate content, stating that it would favor license-renewal applicants that attended to overall program balance and that provided time for content not dependent on sponsorship, as well as content representing minority points of view, the interests of non-profit organizations, and experimental, unfettered artistic self-expression. The Blue Book was held to be unconstitutional by some, but this view was not favored by the courts.
Review of applicants’ past activities has given rise to another kind of problem. The rejection of The Daily News’s application to open a radio station, on grounds of bias against minorities urged by the American Jewish Congress, posed a clearer threat to freedom of expression. Another case involved the political activities of a radio station owner accused, in 1954, of being a communist. Although the FCC renewed his license, it also asserted its right “to inquire in past associations, activities and beliefs” (Coase 11).
The issue of regulation and its relation to scarcity can also be assessed from the point of view of property rights. It has been argued that the ideas of regulation put into practice by the FCC were based on a misunderstanding of the problem of scarcity. After all, land, labor, and capital are all scarce, yet they are regulated not by government but by price. A property-rights mechanism guarantees that no chaos will ensue over a limited resource like land. In 1951, Leo Herzel proposed that the price mechanism be used to allocate frequencies, making wavelength allocation an economic decision rather than a political one.
Monopolistic control of wavelengths and several other concerns arose as part of the debates on scarcity and spectrum allocation, prominent among them the 1951 exchange between Leo Herzel and Professor Smythe of the University of Illinois. Herzel argued for the plausibility of allocating spectrum on the basis of price, leasing channels to the highest bidder. Professor Smythe replied that, while there was spectrum to allocate, the major part of it was in the hands of the military, fire departments, and amateur radio stations. Herzel answered that there was no serious argument for excluding these institutions from paying for wavelength use. Smythe also argued that the use of market controls rested on the assumption that there was “substantially perfect competition in the electronic field” (Ibid 16). In fact, allocation schemes come with a cost, and the danger of monopolistic tendencies is ever present in the communications industry; mechanisms of control other than market self-regulation are therefore needed. In this sense the FCC complements the Department of Justice’s antitrust function, acting even where a broadcasting station is not violating antitrust law. Whether the broadcasting system requires such a double mechanism of checks and balances goes beyond the scope of this paper, but it is clear that some mechanisms have to be put in place to limit the number of broadcasting stations that certain firms are allowed to operate and to define the practices that must be prohibited; institutional political regulation and market-based price regulation complement each other in curbing monopolistic tendencies and securing fair competition. Chaos can follow too much deregulation, and there is historical evidence that private enterprise and unrestrained competition are by themselves incapable of imposing order on the use of spectrum, as shown by the situation that followed in 1926 after the courts rejected the Secretary of Commerce’s intervention against the Zenith Radio Company. Development can also be retarded in a radio industry without any kind of regulation. The issue is therefore how to strike the right balance between freedom of expression and competition on one hand and regulation and order on the other -- how to foster an environment in which the industry can develop and represent the interests of both the private and the public sphere.
There is an inherent dilemma in governmental regulation, since an administrative agency that tries to replicate the allocative work performed by market behavior faces two shortcomings: one is the lack of a precise monetary accounting of costs and benefits; the other is the great difficulty of obtaining relevant information about management and about consumers’ tastes and preferences. It is because of these insufficiencies that the FCC has to take a simple approach, avoiding the lengthy proceedings that resolving such issues usually requires: it allocates certain ranges of frequencies to certain uses.
The pricing mechanism has its own complexities, expressed in the fact that resources go not to those who have the most money but to those who are willing to pay for them in order to make the best use of them. Firms that use funds more profitably tend to attract more funds. It is therefore doubtful that the FCC would allocate frequencies to firms that are in a poor position to raise capital, which undermines the well-intentioned notion that the FCC favors the weak and resource-limited over the strong and well-resourced. The FCC’s inquiry into the financial status of applicants usually tilts toward the financially stronger: “in any case, it is doubtful whether the FCC has in general, awarded frequencies to firms which are in a relatively unfavorable position from the point of view of raising capital” (Coase 19).
Transfers of ownership of television and radio stations have to be approved by the FCC, and the Commission almost always approves such transactions; it is clear that part of the money involved in a purchase goes to pay for the use of the frequency. Today the Commission usually refuses to accept the sale of a station at a price much higher than the value of its equipment and infrastructure, unlike in the early years of radio regulation, when stations like WNEW of New York and WDTV of Pittsburgh were sold at prices that reflected not only equipment and organization but also frequency. Today’s buying and selling policies allow a higher return on investment and a wider distribution of benefits among the business community, enabling “the new as well as the old owner to share in it” (Ibid 23). The extraordinary rate of return accrued under the present system of frequency allocation makes it plausible that the obvious way to distribute frequencies fairly is to make those who wish to use them bid for them.
Now, going back to the 1934 Communications Act: U.S. communications have been governed by a basic division between common carriers, such as telegraph and telephone, and mass media. Common carriers had been “generally imagined as natural monopoly services requiring price and entry regulation” (Aufderheide 260), while mass media concerned scarce spectrum belonging to the public. Both were required to serve the public interest and encourage democratic behavior.
Common carriers were to strive for universal service, making sure that anyone in society would have access at an affordable price. Cross-ownership and concentration of ownership in mass media had to be limited in order to encourage diversity of sources. It is not difficult to see that many would agree that communication -- the back and forth of messages between a sender and a receiver -- is a universal right and that the government has to make sure this right applies equally to all. Information, on the other hand -- the unidirectional message sent to a receiver -- although viewed as necessary to the health and proper functioning of a democratic society, does not need to be regulated as strictly as communication, except where moral norms are violated. Telecommunication rules respond to more stringent demands, such as safety and emergency situations, while information or media rules are applied to keep a structure working in good condition, not necessarily linked to the kind of emergency that gave birth to the 1912 Act after the Titanic disaster. It is in this sense that telecommunication rules are stricter than media rules. Media rules are also more closely tied to First Amendment or freedom-of-expression issues, since under corporate law broadcasting corporations have rights and responsibilities like actual people.
Reform of the telecommunications industry was first debated in the U.S. Congress in 1978, but it failed because of the dissimilar interests of communication businesses and ideological differences between status quo liberals and marketplace-oriented conservatives hostile to regulation. The situation changed dramatically in the following decade: “since the 1990’s telecommunication, mass-media and informatics had interpenetrated each other irrevocably, both technologically and financially. The development of digital technology and compression techniques, the increasing sophistication of satellites and of wireless technologies have created the platform for far greater flexibility in electronic communication than could have been imagined thirty years before” (Aufderheide 261-62). The same period also witnessed increasing consolidation, as multinational media conglomerates came to dominate content provision within vertically integrated companies.
The process of convergence that began in the last three decades has triggered regulatory changes. In 1982 an antitrust consent decree broke up the monopoly phone system. Large media firms won waivers from cross-ownership rules, and TV networks succeeded in gaining access to content production in the 1990’s. The ascent of the Republicans and Ronald Reagan in the 1980’s pushed communication policy toward a marketplace-oriented, deregulatory approach, later echoed by the Clinton administration. In 1993 Congress gave the FCC authority to auction spectrum.
How have new developments in wireless technologies affected the debate on spectrum allocation, given that until “recently, the only commercially useful spectrum was located below 3 GHz, but technical advancements now allow services to use ever shorter wavelengths well above 3 GHz” (Goodman 2)? Are the policies established in the 1927 Act still legitimate? Is the present technological situation making most of these rules obsolete?
There are two essential ways of dealing with spectrum allocation and interference avoidance. The first is to issue exclusive rights or a license as a means of separating users by space, time, or frequency. The other is to allocate spectrum for shared or unlicensed use while relying on technological measures to reduce interference. As of 2008 -- when Goodman’s article was written -- technology was capable of managing interference “only for a limited set of low power applications, such as Wi-Fi service and cordless telephones” (Ibid 2). The rules of spectrum licensing in force today have been in place since 1927, and spectrum licensing will remain the main form of regulation until technology enables more widespread use of shared-spectrum applications.
Today there is consensus that the future of communications will depend on the ability of broadcasters to offer a mobile product. To that end, broadcasters are developing a mobile standard that would allow them to offer services to handheld devices on spectrum previously devoted to broadcast television. If these services run on a subscription basis, they would have to pay a fee based on gross revenue. If they are advertisement-based, they would have to answer the question of what public-interest requirements they should meet in return for “free” use of the spectrum -- an issue compounded by the questions of whether mobile broadcasting will be subject to the same localism and diversity obligations as traditional broadcasting, and whether it should be regulated in the same way.
In 2002 the FCC raised the possibility of allowing low-powered unlicensed devices to use so-called “white space” areas of spectrum -- those areas of broadcast spectrum not used to carry broadcast signals. Both sides of the white-space debate -- incumbents and prospective entrants -- assert that the spectrum can be put to more intense use. The debate takes the form of whether incumbents should be granted additional rights to exploit adjacent spectrum, whether those rights should be auctioned, or whether the frequencies should be opened up to unlicensed devices.
After the broadcasting industry’s transition to DTV, at least 30 MHz, and sometimes closer to 100 MHz, of spectrum previously used for television broadcasting will be available in each U.S. market. Big corporations place a high value on this newly available spectrum: AT&T, for example, paid $55 million for the rights to a 12 MHz license covering the Denver area. Given the availability of cable, satellite, and fiber television and of Internet broadband, the FCC is under pressure to reallocate broadcast spectrum to new wireless uses. This pressure also comes from broadcasters themselves, who want either to sell the spectrum for non-broadcast purposes or to provide services under a different regulatory regime. The question looming today is whether it is viable to support existing models of regulation or whether the pertinent thing to do is to free spectrum up for new services under new rules, whether set by an independent agency like the FCC or by marketplace self-regulation.
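To put the AT&T figure in perspective, here is a back-of-the-envelope calculation of the implied price per megahertz. The inputs come from the text above; the per-MHz division is mine, added purely for illustration.

```python
# Implied unit price of AT&T's Denver license. The $55 million and 12 MHz
# figures come from the text above; only the arithmetic is added here.
license_price_usd = 55_000_000
bandwidth_mhz = 12

price_per_mhz = license_price_usd / bandwidth_mhz
print(f"${price_per_mhz:,.0f} per MHz")  # -> $4,583,333 per MHz
```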
In the words of Steven Waldman, senior advisor to the chairman of the FCC, “the media landscape is changing rapidly (…) On the one hand, we’re seeing tremendous innovation – exciting changes -- in the media world. Primarily due to the Internet, consumers are exposed to more voices and viewpoints than ever before. And they have more ways to connect with each other and make their voices heard. Traditional media business models are struggling or collapsing. Newspapers and TV stations have been laying off thousands of professional journalists. This has raised strong concerns on both sides of the political spectrum, conservatives and liberals alike, about whether mass media will remain strong and independent enough to protect consumers and hold leaders accountable. That’s potentially a huge problem for our democracy” (Waldman).
The FCC’s historic goals of using news and information to enrich the lives of people in their communities and to foster democratic values are being challenged by the collapse of traditional media businesses under the conditions of the digital era. The challenges include facilitating information about school performance, crime rates, environmental hazards, public health, and other matters at the local level -- a task that has to be accomplished without losing sight of the First Amendment and of a free press, independent of government control, as key foundational principles of democracy.
In an article about what could be the “last great newspaper war” and “the first battle of the online news age” between the NYT and the WSJ, author E. Van Buskirk notes that the Times has lost about two-thirds of its value in the past five years (Van Buskirk 1). In the local New York print market, “print circulation declined by 8.7 % in the six months ending in March,” according to the Audit Bureau of Circulations (Ibid 3). By entering the digital news market, newspapers compete on a global scale with news sources from other countries and of every kind. There is also pressure to charge for specific content on devices like the iPad and other e-readers.
Network neutrality is another of the current debates in the media and communication environment. “With the majority of Internet traffic expected to shift to congestion-prone mobile networks, there is growing debate on both sides of the Atlantic about whether operators of the networks should be allowed to treat Web users differently, based on the users’ consumption” (O’Brien). Proponents of net neutrality would require gatekeepers to treat all users equally, without regard to application, source, or download volume -- an expectation the industry itself largely created by charging users a flat rate for unlimited Internet access. From the operators’ point of view, however, networks have always been managed: otherwise a few users who represent a small percentage of consumers but are responsible for a high percentage of the traffic would burden everyone else by degrading speed and quality of service. In other words, on the operators’ side of things, users have to be differentiated according to the network capacity they occupy; a user like Facebook or Google has to be charged differently from a regular user with a personal website that generates a few thousand views a month. The debate has taken on political repercussions since “operators have increasingly micromanaged the flow of data, favoring some users over others as they have sought to handle exploding levels of traffic or deliver premium broadband service at guaranteed speeds to heavy users and businesses” (Ibid 1). Last year BitTorrent, a file-sharing service, challenged Comcast for disabling a protocol used by BitTorrent users, and the FCC intervened, ordering Comcast to stop the blockade. But on April 6, 2010, a federal appeals court in Washington sided with Comcast, ruling that under current law regulators have limited power over Web traffic control and that the FCC could not tell Comcast how to manage its network. This seems unfair from the user’s point of view, whether that user is BitTorrent or a person sharing music files over the Internet. Beyond the copyright issues involved in file sharing, operators like Comcast worry about the deterrent effect any rigid legal mandate would have on the economics of high-speed wireless broadband, which could limit future investment in and improvement of the networks. The ruling thus allows “Internet service companies to block or slow specific sites and charge video sites like YouTube to deliver their content faster to users” (Wyatt 1).
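To make the operators’ position concrete, the following minimal sketch shows the kind of consumption-based rule at stake: a subscriber who exceeds a monthly usage cap is shaped down to a lower rate. All the figures here (speeds, cap) are assumptions for the example, not any operator’s actual policy.

```python
# Illustrative sketch of consumption-based traffic management.
# All figures (speeds, cap) are assumed for the example, not drawn from
# any operator's actual policy.
from dataclasses import dataclass

@dataclass
class Subscriber:
    name: str
    monthly_gb_used: float  # traffic consumed so far in the billing cycle

FULL_SPEED_MBPS = 20.0   # advertised flat-rate speed (assumed figure)
THROTTLED_MBPS = 1.0     # post-cap speed (assumed figure)
MONTHLY_CAP_GB = 250.0   # monthly usage cap (assumed figure)

def allowed_rate_mbps(sub: Subscriber) -> float:
    """Rate a traffic-shaping middlebox would grant this subscriber."""
    if sub.monthly_gb_used <= MONTHLY_CAP_GB:
        return FULL_SPEED_MBPS
    return THROTTLED_MBPS

for sub in (Subscriber("light user", 40.0), Subscriber("heavy user", 900.0)):
    print(f"{sub.name}: {allowed_rate_mbps(sub)} Mbps")
# light user: 20.0 Mbps
# heavy user: 1.0 Mbps
```

A strict net-neutrality rule would forbid exactly this kind of differentiation; the operators’ counterargument is that without it, the heavy user degrades service for everyone else.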
By ruling that Comcast had the right to control its cable customers’ access to BitTorrent, the United States Court of Appeals for the District of Columbia Circuit also raised obstacles to the Obama administration’s effort to increase Americans’ access to high-speed Internet networks. The administration’s national broadband plan “proposed to shift billions of dollars from a fund to provide phone service in rural areas to one that helps pay for Internet access in those areas. Legal observers said the court decision suggested that the F.C.C. did not have the authority to make that switch” (Ibid 1). The FCC has several ways of responding to the ruling: it could appeal to the Supreme Court, which might “not hear the case or take years to deliver a final ruling”; Congress could “rewrite the laws to explicitly state the FCC’s powers over broadband access and content” -- which would also take a long time; or the FCC could take matters into its own hands “by reclassifying broadband services under an existing set of rules governing telephone services rather than the current status as a lightly regulated information service” (Hay 1).
To reclassify broadband services, the FCC would need to show that broadband should be treated as a basic utility such as telephone service, and thus made subject to the same strict regulations that apply to common carriers. This “could prove difficult politically, however, since some conservative Republicans philosophically oppose giving the agency more power, on the grounds that Internet providers should be able to decide what services they offer and at what price” (Wyatt 1). FCC Commissioner Michael Copps maintains that the FCC’s jurisdiction was crippled by decisions made in 2002 and 2005 that allowed the courts to step into the broadband debates (Copps).
According to reports, “FCC Chairman Julius Genachowski plans to leave broadband deregulated rather than pursuing reclassification as some supporters had hoped” (Reports). For Commissioner Copps, reclassification of broadband from Title I to Title II is a political problem that must be solved in order to improve quality of life and economic well-being in society. Consumer protection, privacy, security, and service to the public are among the principles that reformers have spent decades working to build into telecommunications, and that the FCC now wants to apply to broadband (Copps). Copps is a strong proponent of reclassification, and he is not alone: the Open Internet Coalition, which represents a group of major Internet stakeholders including Skype, Google, eBay, Amazon, Netflix, TiVo, and Facebook, has also joined the net-neutrality debate. Congress, too, has signaled that it “stands ready to write new telecommunications policy, if needed” (Reuters).
The big players of the media industry, such as AT&T, Verizon, and Comcast, have raised concerns about reclassification, saying it would be overly burdensome. To the lack of investment incentives that they say would follow reclassification, the media corporations add freedom-of-speech concerns. In the words of First Amendment attorney Robert Corn-Revere, reclassifying broadband as a Title II common carrier service would be an attempt “to change the level of First Amendment protection for a medium simply by changing its regulatory definition,” which he says the FCC has limited, if any, authority to do (Eggerton 1). According to this viewpoint, First Amendment protections are what is really at stake if reclassification happens.
To maintain an environment of fair competition, as codified in the Federal Trade Commission Act, the U.S. Federal Trade Commission can impose cease-and-desist measures on large corporations in order to curb unfair trade practices. The FTC states that “under this Act, the Commission is empowered, among other things, to (a) prevent unfair methods of competition, and unfair or deceptive acts or practices… (c) prescribe trade regulation rules defining with specificity acts or practices that are unfair or deceptive, and establishing requirements designed to prevent such acts or practices; (d) conduct investigations relating to the organization, business, practices, and management of entities…and (e) make reports and legislative recommendations to Congress” (Federal Trade Commission Act, 15 U.S.C. §§ 41-58, as amended).
How could the right balance between market-based self-regulation and governmental intervention produce the best results for society? According to The Economist, “America has adopted no policies to require the owners of broadband cables to open their infrastructure to rival sellers in order to enhance competition. America relies almost exclusively on ‘facilities competition’, the provision of rival infrastructures: a cable provider may compete, for example, with a network that runs optical fiber to the home. True, there is a legitimate worry that forcing a company to rent out parts of its infrastructure to competitors may deter investment” (The Economist). In other words, America relies on the building of new pipes rather than on opening up existing pipes to competitors. But building new pipes is a difficult, chaos-prone, and cost-intensive enterprise that few corporations can undertake and that society may hesitate to accept because of its collateral effects on the environment. Opening up a pipe to competitors, on the other hand, poses the problem of the incentive to invest -- a debate memorably caricatured in former Alaska Senator Ted Stevens’s risible claim that the Internet was not a truck but a “series of tubes” (Wired).
Other countries have shown that successful compromises can be reached, as evidenced by a study prepared for the FCC by Harvard’s Berkman Center for Internet & Society. The FCC contends that wireless broadband could provide more competition, but wireless data transfer is much slower and less reliable than fixed broadband, making it more of a complement than a competitor to fixed broadband. America ranks 16th among countries “on almost any available measure of broadband penetration or quality…The FCC says that by 2020 it aims to ensure universal access at a speed of 4 megabits per second (Mbps), a fairly feeble target. It also sets a goal, but not much of a plan, to provide 100m homes with 100 Mbps” (Ibid).
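Some simple arithmetic shows why 4 Mbps is “a fairly feeble target.” The sketch below compares transfer times under the two FCC goals; the 700 MB example file size is my own assumption for illustration, not a figure from the source.

```python
# Transfer-time comparison of the FCC's 4 Mbps universal-access target and
# its 100 Mbps goal. The 700 MB file size is an assumed example; protocol
# overhead and congestion are ignored.
def download_minutes(file_megabytes: float, speed_mbps: float) -> float:
    """Idealized transfer time: megabytes -> megabits, divided by the rate."""
    return (file_megabytes * 8) / speed_mbps / 60

FILE_MB = 700  # e.g. a standard-definition movie (hypothetical size)
for speed in (4, 100):
    print(f"{speed:>3} Mbps: {download_minutes(FILE_MB, speed):5.1f} minutes")
# 4 Mbps -> ~23.3 minutes; 100 Mbps -> ~0.9 minutes
```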
Congress is stepping more forcefully into the issue as well, with “a letter over to the Commission from Rep. Henry Waxman, chairman of the House Commerce Committee, and Jay Rockefeller, chairman of the Senate Commerce Committee (…) The chairmen said that in considering all ‘viable options’ of how to respond to the Comcast court decision, the FCC could also include on that list ‘a change in classification’ as long as it was a ‘light regulatory touch’ type of change” (Brodsky 1). At the heart of these debates about broadband lie the old controversies between self-regulated marketplace behavior and the need for government regulation, compounded by First Amendment issues.
The possibility of low-intensity radio waves and wireless communication opening the airwaves to fair competition echoes the 1951 debate between Leo Herzel and Professor Smythe over spectrum policy. A European study of satellite communication and its advantages -- and disadvantages -- for broadband Internet access in rural areas of Europe recommends mixing satellite transmission using standards-based terminals (DVB-RCS, Satmode) with other access technologies such as Wi-Fi. By providing communication and Internet access unhindered by “traditional terrestrial infrastructure obstacles” (IVSZ), satellite broadband combined with low-intensity technologies such as Wi-Fi promises to be a powerful tool for closing the digital divide.
Sky, with 8.9 million households in the UK and Ireland and a large choice of movies, news, entertainment, and sports on its digital television platform Sky Digital, has partnered with IBM Tivoli Netcool to develop a new broadband system, aiming to provide television service and challenge incumbent telecommunications companies by launching a consumer broadband service offering better value to its TV subscribers. “The company acquired Easynet, one of Europe’s leading Internet service providers (ISPs) and the owner of a state-of-the-art fiber optic transmission network in the UK. This provided the infrastructure required to launch Sky Broadband” (IBM).
On the other side of the Atlantic, Google has announced a plan to build an experimental network, similar to Google’s early efforts to provide municipal Wi-Fi in the city of Mountain View, Calif. “For starters, Google wants to offer 1 gigabit-per-second speeds to some 50,000 to 500,000 people. At 2.6 people per household, that roughly translates to between 20,000 and 200,000 homes … estimates say that it will cost Google between $3,000 and $8,000 per home, or roughly $60 million to $1.6 billion, depending upon the final size and footprint of the network. If Google reaches, say, 100,000 homes, it would cost the company about half a billion dollars…These costs are quite varied, in some cases as much as $4,000 to connect a single home. Google’s final tab will depend on where it decides to build out the network” (Malik).
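Malik’s estimate is easy to reproduce. The sketch below recomputes his figures; every input comes from the passage quoted above, and only the arithmetic is added.

```python
# Reproduction of Malik's back-of-the-envelope fiber estimate. All inputs
# (population range, household size, per-home cost range) come from the
# quoted article; only the arithmetic is added.
PEOPLE_PER_HOUSEHOLD = 2.6

people_low, people_high = 50_000, 500_000
cost_per_home_low, cost_per_home_high = 3_000, 8_000

homes_low = people_low / PEOPLE_PER_HOUSEHOLD    # ~19,231 (article: ~20,000)
homes_high = people_high / PEOPLE_PER_HOUSEHOLD  # ~192,308 (article: ~200,000)

cost_low = homes_low * cost_per_home_low         # ~$58M (article: ~$60M)
cost_high = homes_high * cost_per_home_high      # ~$1.54B (article: ~$1.6B)

print(f"Homes passed: {homes_low:,.0f} to {homes_high:,.0f}")
print(f"Total cost:   ${cost_low/1e6:,.0f} million to ${cost_high/1e9:.2f} billion")
```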
In another development, this time not from the point of view of private industry but from that of the state, “U.S. Commerce Secretary Locke, joined by U.S. Sen. Patty Murray and U.S. Rep. Jay Inslee, announced an $84 million Recovery Act investment to help bridge the technological divide, boost economic growth, create jobs, and improve education and healthcare in Washington state. The grant will bring high-speed Internet access to more than 100 community anchor institutions – including community colleges, libraries, healthcare facilities, and government agencies – and lay the groundwork for bringing affordable broadband service to thousands of homes and businesses in the region (…) the project plans to directly connect the Jamestown S’Klallam tribal center, library, and clinic, and the Shoalwater tribal center and clinic, as well as provide connection opportunities for the Makah tribal center and clinic” (Vos).
Judging by these last examples, there is still hope for an information and telecommunications industry in which private and governmental efforts coexist -- one in which both private corporations’ need to recoup their investments and the interests of minorities and the disadvantaged can be represented. Optimistic promises come from the combined use of satellite broadband and Wi-Fi, as shown by the E.U.-commissioned study. Google’s creation of a complete new infrastructure may be the way of the future, depending on costs and return expectations. More probably the answer is a combination of satellite broadband and Wi-Fi on one hand and fiber-optic infrastructure on the other. In any case, cost, returns, and government policy will determine the outcome, and it will not happen without struggle, as witnessed by the long history of acts, laws, and debates waged in the media and communication industry since its inception. The questions I posed at the beginning -- whether the scarcity principle still holds for the information and communication industry given the state of technology today, and what the industry’s future looks like -- find their answer in the combination of satellite broadband and Wi-Fi, which promises to create new spectrum capacity, and in new infrastructures, like Google’s or IBM/Sky’s, which promise to open up competition. Government regulation, on the other hand, still has its reason for being, since minorities and disadvantaged individuals need help and assistance in order to enjoy the benefits of information and communication today.
Bibliography:
1. R. H. Coase. The Federal Communications Commission. The Journal of Law and Economics. Volume II. October 1959.
2. Victor Pickard. The Great FCC Blue Book Debate: Determining the Role of Broadcasting Media in a Democratic Society, 1945-1949. Paper presented at the annual meeting of the International Communication Association, Marriott, Chicago, IL.
3. Patricia Aufderheide. Shifting Policy Paradigms and the Public Interest in the U.S. Telecommunications Act of 1996. The Communication Review. Vol. 2 (2). 1997. OPA.
4. Ellen P. Goodman. Spectrum Policy and Public Interest. Rutgers University, Law School. June 2008.
5. Steven Waldman. Future of Media Statement. http://reboot.fcc.gov/futureofmedia/blog?entryId=104620
6. Kevin J. O'Brien. Web Users Against its Gatekeepers. NYT. May 2, 2010.
7. Edward Wyatt. U.S. Court Curbs F.C.C. Authority on Web Traffic. NYT. April 6, 2010.
8. Andre Hay. http://www.reuters.com/article/idUSTRE6341CB20100405
9. Michael Copps. Communicators with Michael Copps. http://www.c-spanvideo.org/program/293234-1
10. Reports. FCC Head Won’t Try to Reclassify Broadband. Xchange. May 3, 2010. http://www.xchangemag.com/hotnews/fcc-head-wont-try-to-reclassify-broadband.html
11. Reuters. Congress might rewrite telecom policy. May 5, 2010. http://www.reuters.com/article/idUSTRE6444KY20100505?type=politicsNews
12. John Eggerton. Corn-Revere Warns FCC About Broadband Reclassification. Broadcasting & Cable, 5/5/2010.
13. Federal Trade Commission Act (15 U.S.C. §§ 41-58, as amended) http://www.ftc.gov/ogc/ftcact.shtm
14. Pipe dream: Not what was asked for. The Economist (print edition), Washington, DC. Mar 18th 2010.
15. Wired Blogs. Your Own Personal Internet. June 30, 2006. http://www.wired.com/threatlevel/2006/06/your_own_person/
16. Art Brodsky. Net Neutrality Struggle Hits Tipping Point. Huffington Post. May 5, 2010. http://www.huffingtonpost.com/art-brodsky/net-neutrality-struggle-h_b_564851.html
17. IVSZ, 30/05/2008. Draft State of The Art Satellite Broadband Technology Analysis Report. European Commission, 7th Framework, 2007-2013.
18. IBM. Sky builds a state-of-the-art broadband network with the support of IBM Tivoli Netcool. 10/2/2008. http://www01.ibm.com/software/success/cssdb.nsf/CS/STRD7K2K5B?OpenDocument&Site=default&cty=en_us
19. Om Malik. How Much Will Google’s Fiber Network Cost?. Gigaom. 02/11/2010. http://gigaom.com/2010/02/11/google-fiber-network-cost/
20. Eme Vos. Washington state gets $84M broadband stimulus grant. March 1, 2010.
(c) Renelio Marin