The Digital Policy Institute believes that advancing the interests of consumers should be a major, driving force in government policymaking processes aimed at broadband expansion and enhanced connectivity, including wireless. Our 21st Century digital infrastructure can be the conduit for many consumer benefits. As part of our review of those many benefits, we’ve developed a “Top-10” list that federal and state officials should consider when making choices that can affect the competitive nature, availability and capacity of wireless broadband. Consistent with the FCC’s newest report providing evidence that American consumers are being well served by the wireless sector, we’d like to highlight some signs that show us the marketplace is dynamic, highly competitive and consumer-centric. (FCC)
1. The Existence of “Price Rivalry”: Reports from the FCC and media confirm that providers have been cutting prices in order to effectively compete. The FCC has termed this phenomenon “price rivalry.” A key example of this behavior came in 2009, when Sprint offered unlimited mobile-to-mobile calling to any domestic wireless number. Another example was seen in late March 2013, when T-Mobile USA officially unveiled a key part of its new “Un-carrier” strategy by releasing new pricing for its Value plans (which do not include device subsidies). (FierceWireless). Actions like these, and those taken by other wireless broadband providers, pressure their competitors into lowering their prices. Former FCC Chief Economist Gerry Faulhaber notes that this result is “the epitome of competition.” (Faulhaber)
2. Declining Prices: Alongside the phenomenon of price rivalry is the reality that both cellular CPI and data prices have been declining consistently throughout the last decade. In fact, wireless service prices have declined every year since 2002, with the exception of 2008, when they remained the same as the previous year. An examination of two key pricing indicators, the Wireless Telephone Services component of the Consumer Price Index and the per-minute price of voice service, shows that mobile wireless prices declined overall in 2010 and 2011. The Wireless Telephone Services CPI declined for two consecutive years, while the per-minute price of voice service remained roughly stable in 2010 and then declined in 2011. To further illustrate this point, as of December 2011, wireless prices were 41 percent less than those of December 1997. Data prices have undergone a similar change with the cost declining 90 percent since 2008. We have seen real and substantial decreases in service pricing. (Hahn)
3. Consumer-Centric Wireless Deployment: The migration to 4G LTE is business and consumer-centric. When compared to prior mobile technologies, 4G LTE improves the speed of data transfer (greater bandwidth), offers faster network response time (lower latency) and increases overall network capacity (improved spectrum efficiency). In addition to mobile phones, consumers are seeing new consumer electronic devices hit the market (cameras, notebooks, ultra-portables, and gaming devices) that will incorporate LTE modules. According to a recent report by Arthur D. Little, 4G LTE deployment offers improved customer service, personal and team productivity gains, direct cost reductions and improved flexibility and decision making. (A. D. Little)
4. Increased Investment: The year 2012 gave further indication that wireless competition is alive and well. During that period, wireless service operators announced several current or future investments. For example, in November 2012, AT&T announced an additional $14 billion investment over the next three years into its wired and wireless infrastructure as part of the company’s Velocity IP project; in so doing, the company will expand its LTE network to cover 300 million people by the end of 2014. (TechCrunch). Similarly, Verizon Wireless announced $256 million in wireless network enhancements across New England (Verizon Wireless). According to the FCC’s March 2013 Mobile Wireless Competition Report, annual incremental investment in wireless networks rose to $25.3 billion in 2011, almost 25 percent over what it was two years before. (FCC)
5. Spectrum Scarcity: The issue of spectrum scarcity is a pertinent one for mobile wireless providers. Combine this concern with the surprising assertion from some public interest advocates that there are too few providers in the wireless market, and you are left with a unique situation. Economists at the Phoenix Center for Advanced Legal and Economic Public Policy Studies assert that the scarcity of spectrum may actually help maintain the competitive ethos needed in the industry. “The addition of a spectrum constraint to the traditional model of competition turns the conventional view that high industry concentration is a bellwether of poor economic performance on its head. Indeed, under a binding spectrum constraint, a market characterized by few firms (rather than a large number of firms) is more likely to produce lower prices and possibly increase sector investment and employment.” Spectrum scarcity, then, supports the argument that competition does, in fact, exist in the wireless market. If it did not, chances are the telecom economy would be far worse off. (Phoenix Center)
6. Choice of Providers: The FCC’s March 2013 Mobile Wireless Competition Report to Congress, using data from the Bank of America Merrill Lynch’s Global Wireless Matrix, notes that globally “the United States had the least concentrated mobile market at the end of 2011…” if Orange UK and T-Mobile UK are listed as a single provider. In short, the raw data throughout these reports clearly demonstrates “a vibrant and competitive market where consumers enjoy a tremendous array of options.” (FCC)
7. Choice of Devices: One metric to assess the deployment of wireless is to consider the number of handsets sold in the United States. A report by Strategy Analytics shows an estimated 167 million smartphones were shipped in 2012, indicating that the United States currently leads in this sector. Additionally, the U.S. market was found to lead the world in mobile broadband subscribers. (Faulhaber) Furthermore, according to the FCC’s March 2013 Mobile Wireless Competition Report, between 2009 and 2011 the number of wireless connected devices grew by 26.6 million. (FCC)
8. Growth of Mobile Traffic: U.S. mobile traffic increased 139% in 2012, growing to become 18% of total Web traffic. By 2014, it is estimated that 50% of Web traffic will come from mobile devices. The fastest-growing industries for mobile traffic are B2C services, Web publishing, healthcare and social services, arts and entertainment, and higher education. (Bluetrain Mobile)
9. Increased Advertising: Mobile advertising spending has grown tremendously in the past five years, and growth is expected to accelerate in the future. eMarketer expects overall spending on mobile advertising in the U.S., including display, search and messaging-based ads served to mobile phones and tablets, to rise 180% this year to top $4 billion. eMarketer’s previous forecast, made in September 2012, was for substantially slower growth of 80%, to just $2.61 billion. Now eMarketer expects U.S. mobile ad spending to reach $7.19 billion next year and nearly $21 billion by 2016, a significant upward revision. (eMarketer)
10. Wireless Substitution: In the first six months of 2012, more than one of every three U.S. households (35.8%) did not have a landline telephone but did have at least one wireless telephone. Approximately 34.0% of all adults (about 80 million adults) lived in households with only wireless telephones; 40.6% of all children (approximately 30 million children) lived in households with only wireless telephones. These figures represent an increase of 15.6 percentage points in wireless-only households since December 2008. This reality is a testament to the utility and affordability of wireless services and products, an outgrowth of a competitive market. (CDC)
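The growth figures quoted in items 4 and 9 above can be cross-checked against one another with some quick arithmetic. A minimal sketch, assuming the eMarketer forecasts share the same 2012 base-year spend and that the FCC’s “almost 25 percent” is exactly 25 percent:

```python
# Item 9: an earlier forecast of 80% growth reaching $2.61B implies a
# 2012 base; 180% growth on that same base should land near the quoted
# "top $4 billion".
base_2012 = 2.61 / 1.80          # implied 2012 ad spend, ~$1.45B
revised_2013 = base_2012 * 2.80  # 180% growth on the same base
print(round(base_2012, 2), round(revised_2013, 2))  # 1.45 4.06

# Item 4: $25.3B of network investment in 2011, described as "almost
# 25 percent" above the level two years earlier, implies roughly this
# 2009 figure (assuming exactly 25%):
implied_2009 = 25.3 / 1.25
print(round(implied_2009, 2))    # 20.24 ($ billions)
```

The figures hold together: the revised 180% forecast lands at about $4.06 billion, consistent with the “top $4 billion” claim.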
Michael Hanley is editor-in-chief of the International Journal of Mobile Marketing (IJMM) and a senior research fellow with the Digital Policy Institute, an independent, interdisciplinary research and policy development organization located at Ball State University in Muncie, IN. The DPI has served as a catalyst for research and education on digital media issues since 2004. Additionally, the DPI is a member of the Consumer Advisory Committee at the Federal Communications Commission (FCC).
DigitalPolicyInstitute.org @Digital_Policy Facebook.com
What is disturbing is that Twitter just revealed that it, too, was hacked earlier this month. In this attack, as many as 250,000 accounts and their login information, including email addresses, may have been stolen. While that is a small fraction of Twitter’s more than 200 million accounts, it is unsettling to think how many accounts could have had information taken from them.
More recently, starting on February 18th, the Twitter accounts of both Burger King and Jeep were hacked. Burger King’s account was made to look like McDonald’s, and Jeep’s page was changed to claim the company had been sold to Cadillac because employees were doing pain meds in the bathroom. While these hacks seem less severe, they still go to show that hacks happen, and can happen often.
So what does this all mean for using social media as a research tool? Facebook stated in its blog post that the hacks were traced to a handful of employees visiting a compromised website, which led to malware being installed on their devices. The blog described the PCs at fault as being “…protected by the latest anti-virus software and were equipped with other up-to-date protection.” This is a pure and simple example showing that the Internet will never be 100% safe. Anti-virus software is like the flu shot you hate to get before the winter months: a tiny dose of last year’s popular strains, with no guarantee that something new won’t come along to infect you.
How can accurate and meaningful research be performed on social media when, as of late, it seems so effortless to hack into an account and create false information? Falsified data is not something a company wants contaminating its sample when it is doing research to determine how it will proceed in the future. Imagine a competing company creating false data to lead its opposition in the wrong direction. These are only the first of the problems made possible by the current security measures in social media. Does this mean that social media should not be used at all? If so, will it ever be safe to use in studies?
The Digital Policy Institute is hosting a free webinar on April 26th, starting at 1 p.m. EST, that aims to open up some ideas and issues pertaining to the use of social media in research. We ask you to join us for this virtual event, where some of the leading minds in the industry will stir things up. Also, at 2:30 p.m. EST, we will follow up with another webinar discussing the upcoming Incentive Auctions.
Chris Goepfrich & Steve Jones, Ph.D., Director of the Center for Information and Communication Sciences at Ball State University
AT&T’s petition recommends that the FCC avoid using lengthy and unnecessary procedures when deciding how to transition from plain-old-telephone service (POTS) to IP-based service as the standard for the nation. Robert Yadon, director of the DPI and coauthor of the comments, pointed out that “this transition has already begun taking place. The problem, however, is with the bureaucratic mess that the FCC must contend with in order to complete the task.”
“AT&T was absolutely right to place these issues squarely on the FCC’s plate. With billions of investment dollars at stake, uncertainty exists today because the commission has failed to resolve the numerous proceedings pending before it that directly relate to the IP transition,” said Yadon. He added, “This will be an opportunity to directly address those who incorrectly believe that the provision of broadband is a natural monopoly by cable systems, or that today wireline broadband doesn’t compete with wireless broadband.”
In addition to the lag in the IP transition, the DPI/Kleinhenz filing, and the two petitions themselves, focus on needed reforms in the regulatory structure of the telecom industry. Current regulation, they contend, stifles innovation, discourages private investment and treats competitors offering the same services differently. These points are further illustrated by regional economist Jack Kleinhenz, CEO of Kleinhenz & Associates and coauthor of the reply comments, who said, “We need to displace regulatory restrictions that are impediments to needed telecommunications infrastructure investments for the IP transition. Studies have shown that infrastructure is important and associated investments stimulate economic activity either by enhancing productivity or by direct contribution to economic output.”
Yadon and Barry Umansky, DPI’s chief legal counsel and co-author of the comments, gave specific support to the AT&T suggestion that telecommunications carriers engage in “regulatory trials” around the country, at specific geographic sites and under supervision of the FCC. Among other things, the aim of these trials would be to give the Commission the opportunity to observe how the transition actually would work and how it would affect consumers. Such trials also would give the FCC the experience needed to help it avoid the imposition of unnecessary regulation on the expanded IP-based infrastructure and its operation. Noting that Muncie, Indiana has been the site of the “Middletown Studies,” developing decades of data on social, consumer and media trends in “a typical American town,” the DPI/Kleinhenz filing recommends that Muncie be one of the sites for these national trials. This would better enable DPI to provide continuing and in-depth research in support of – and providing useful guidance on – the move to an all-IP telecommunications structure in America.
You can find the joint filing of DPI and Kleinhenz at http://www.bsu.edu/digitalpolicy
The Internet, through its networks and infrastructure, allows people to communicate and connect across the world. The problem is that many people around the globe do not have access to these electronic networks.
Here in America, we are struggling with providing broadband access to all Americans. According to the article “Mixed Response to Comcast in Expanding Net Access,” published on January 20th, 2013, by Amy Chozick for The New York Times, there are more than 100 million Americans who lack high-speed Internet access in their homes.
At first, this digital divide was seen as an issue between urban and rural areas of America. However, Chozick writes, “only about 7 percent of households without broadband are in rural areas without the necessary infrastructure; the bulk of the rest are low-income families who cannot afford the monthly bill, or do not feel it is a necessity, according to government statistics” (pg. 1).
Providing affordable broadband to low-income families is where the focus needs to be, and that is what Comcast, America’s largest cable and Internet provider, is trying to do.
Starting in May of 2011, Comcast developed a program called Internet Essentials, a program that allows any family that qualifies for the National School Lunch Program to be eligible for a $9.95 per month home Internet service. While this policy is a good idea, there have been mixed responses to what Comcast is trying to accomplish.
Comcast sees an area for profit, but also is able to help provide needed home Internet to low-income families. However, Chozick writes, “many advocacy groups argue that broadband has become so crucial to success in school and the work force that it should be treated like a public utility paid in part by government subsidies” (pg. 1).
There are many pros and cons to the program Comcast has in place, but there has always been, and always will be, a “natural monopoly” in broadband service. It is key that all players within this monopoly develop policies to help low-income families receive broadband, because this is an issue that all must help solve.
Chozick, A. (2013, January 20). Mixed response to Comcast in expanding net access. The New York Times. Retrieved from http://www.nytimes.com/2013/01/21/business/media/comcast-internet-essentials-brings-access-to-low-income-homes.html?pagewanted=1&_r=0&ref=technology
While this may seem like a lofty goal today, we already have markets with access to speeds up to 1Gbps from such services as Google Fiber. A similar initiative was started in Seattle in December of 2012 by a consortium of the City of Seattle, the University of Washington, and Gigabit Squared.
According to NetIndex, the average worldwide broadband speed in July 2008 was around 4.59 Mbps. Today it is approximately 12.8 Mbps, and the pace of growth is expected only to increase over the next seven years.
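Those two data points imply a steep compound annual growth rate. A minimal sketch, assuming a five-year span between the July 2008 figure and today’s:

```python
# Compound annual growth rate (CAGR) implied by the NetIndex figures:
# ~4.59 Mbps in July 2008 rising to ~12.8 Mbps roughly five years later.
start_mbps, end_mbps, years = 4.59, 12.8, 5
cagr = (end_mbps / start_mbps) ** (1 / years) - 1
print(f"{cagr:.1%}")  # 22.8%
```

Compounded forward at that same rate for seven more years, average speeds would roughly quadruple again, which is the context for the 1 Gbps ambitions mentioned above.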
However, geography must also be taken into consideration. Many households, especially within the United States, are in rural areas that are difficult and expensive to access.
The technology behind connectivity is constantly changing as well. We have moved through PSTN, cable, satellite, WiMAX, 4G LTE, and even fiber to the home.
So, with all of these volatile factors, making an accurate prediction of the future becomes extraordinarily difficult, and therefore setting a realistic goal is also difficult.
Imagine that you have a basketball, and you’re aiming for a hoop. As you prepare to make your shot, the ball changes to a football, then the goal changes to a golf hole and moves 20 feet from its original position while the football changes to a javelin and the goal changes again as you continue in this fashion. Now you have an idea of the difficulty of setting a reasonable goal for high speed broadband access.
Dr. Robert Yadon
USTA Calls for Removal of Legacy ILEC Regs
Today (12-19-12) the US Telecom Association (USTA) asked the FCC to throw out a key doctrine that determines how telephone companies are regulated. What they want is to remove the “dominant carrier” regulations that subject the carriers to price cap or rate-of-return regulations, and also require them to ask for permission to alter their tariffs or their voice services. Why?
As DPI has noted in the past, the number of people using the public switched telephone network (PSTN) has declined over the past ten years as more and more people switch to mobile phones or other services. From a penetration high of 93 percent in 2003, dependency on wireline as the primary line service has dropped to approximately 26 percent in 2013.
With consumers leaving the plain old telephone system (POTS) behind in favor of mobile and Internet Protocol-based networks, the suggestion that the FCC needs to alter the way in which “dominant carriers” are classified and regulated seems reasonable. In a competitive world, these wireline carriers are no longer “dominant,” and our federal regulations should reflect this evolving landscape. They say that technology leads policy, and here is a case where it’s time to play catch-up.
Digital Policy Institute
On October 26th, the DPI hosted a webinar on broadband in America. One panelist, former FCC chief economist Dr. Leslie Marx, now at Duke University, underscored the need for telecom policy to adapt to the realities of the communications industry, with a focus on broadband, and in particular, mobile wireless broadband. She also noted that private investment drives broadband infrastructure deployment, both for wired and wireless services. Here’s a perfect case in point.
AT&T’s recent commitment to invest an additional $14 billion over three years to upgrade and expand its wireless and wireline broadband networks is a powerful reminder that Internet-based communication is driving America toward a future of expanded digital opportunity. While many other sectors continue to struggle to recover from the recession, the expansion of increasingly robust broadband networks is helping American businesses both here and abroad stay competitive and also creating jobs and hope for individual Americans across the country.
By providing an even more robust platform for Internet-based services and applications, enhancement of the company’s broadband infrastructure should foster innovation and facilitate investment by other Internet participants such as device manufacturers, applications developers, Internet content creators, and the businesses that produce the hardware and software that fuels the IP-broadband ecosystem. This means jobs. In the last five years, mobile innovation alone has created about 1.6 million jobs. New investment means even more jobs in the future.
By extending IP-based wired connections to one million additional business locations, the new investment will enable enterprises of all sizes to grow their own operations through better connections to business partners and customers and by opening the door to new market opportunities. In this way, American businesses will become even more productive and more competitive around the globe.
The new investment, which applies to both wired and wireless infrastructure, also will bring the United States closer to the national goal of universal broadband service. The company said its high-speed 4G LTE wireless broadband service will be available to 300 million Americans within two years and will provide a high-speed Internet option for virtually every customer in its 22-state wireline service area. The expansion of LTE service means a new range of opportunity for remote and lightly populated communities that have tended to lag behind.
The continuing expansion of broadband networks from this new infusion of investment capital also will advance the country’s critical transition to all-IP communication. When completed, this transition will enable the rapid and seamless transmission of every form of data. Voice, video, and text will all move with ease across IP networks that connect every person and every device.
AT&T’s investment commitment is great news for the IP-broadband industry, Internet-related businesses of every sort, and for every other business and individual that already takes advantage of the Internet or plans to in the years ahead. It is a win-win-win for every citizen.
August 27, 2012
Federal regulators have largely resisted regulatory imposition on the Internet – whether via wireless or wired connections. And approaching the Internet ecosystem with regulatory restraint has proven to be a prescient strategy. The exploding digital economy has been a tremendous boon for both consumers of Internet-based services and technologies, and for the businesses that invest in building the networks that make the Internet economy possible. Consumers are demanding more and faster broadband connections and broadband providers appear to be heeding their call. But will federal policies impacting broadband providers ultimately help or hinder progress towards a digitally-connected America?
Part of the answer arrived on August 22nd when the Federal Communications Commission (FCC) in Washington DC voted on party lines (3-2) to issue an order that placed a hold on the ability of telephone firms to obtain relief from “special access” regulations when they showed that there was competition for these services within a specific market. FCC Chairman Julius Genachowski said, “Based on the record and the undisputed finding that legacy regulations are not working as intended, we temporarily suspend outdated rules that not only allow incumbent carriers to raise prices in the absence of competition but also deny them the flexibility to lower prices in the presence of competition. We do this as we determine what permanent rules would best promote a healthy competitive marketplace.”
Special access services, often referred to as the “middle mile” of the Internet, have traditionally consisted of legacy, copper-based DS-1/DS-3 facilities used by businesses and mobile carriers for backhaul connection to the Web. However, in the digital era, that special access market is in rapid transition, and a number of competitors, including wireless carriers such as T-Mobile and Sprint, have stated that by the end of next year they will no longer rely on copper-based DS-1/DS-3 facilities to satisfy their backhaul needs.
One has to question the timing of this initiative when both the market participants and the FCC note that the industry is rapidly embracing the migration to fiber-based facilities, which are unregulated. How the FCC decides to pursue long-term resolution of “special access” issues gives us pause as to whether this agency has its sights set on moving quickly into the future, or whether its focus is on the past at the expense of a digitally connected future for us all.
How is this relatively arcane issue relevant to anyone other than the lawyers that have been working on it for years? Consider the following. According to a study late last year that reviewed data from the Centers for Disease Control, more than 30% of Americans are cutting the cord and opting for wireless broadband as their preferred on-ramp to the Internet. However, as noted industry analyst Roger Entner recently observed, reverting to a pre-1999 approach to special access won’t have a positive impact upon consumer welfare. Yet at the same time, some competitors are urging the FCC to take this new approach on special access so they can “game” the system, wasting competitors’ capital on obsolete technology while they build their own state-of-the-art, unregulated infrastructure.
In plain English, this means that the regulatory approach on special access this FCC is undertaking would be ineffective in reducing costs to consumers of wireless broadband. Moreover it would do nothing to hasten the investment in and buildout of next-generation wireless (or wired) broadband infrastructure. What then would be the upside of pursuing the path this FCC appears to favor? There does not seem to be a ready explanation.
If the FCC is serious about helping to achieve President Obama’s national goal of extending wireless broadband to more than 98 percent of the population by 2015, the Commission needs to signal very clearly that there is no tolerance for adopting or retaining federal policies that not only fail to further this goal, but may well delay achieving it.
Barry Umansky is a senior research fellow for the Digital Policy Institute at Ball State University, and a professor of telecommunications since 2003. His professional background includes over thirty years as a communications attorney in Washington, D.C. The Digital Policy Institute is an independent, interdisciplinary research and policy development organization and has served as a catalyst for research and education on digital media issues since 2004.
DigitalPolicyInstitute.org @Digital_Policy Facebook.com/DigitalPolicyInstitute