
Wireless Providers Study Monetizing, Controlling Your Wi-Fi Use; Do We Need Wi-Fi Neutrality?

While wireless providers currently treat Wi-Fi as a friendly way to offload wireless data traffic from their 3G and 4G networks, the wireless industry is starting to ponder whether it can also earn additional profits by regulating your use of it.

Dean Bubley has written a white paper for the wireless industry exploring Wi-Fi use by smartphone owners, and ways the industry can potentially cash in on it.

“It is becoming increasingly clear that Wi-Fi access will be a strategic part of mobile operators’ future network plans,” Bubley writes. “There are multiple use cases, ranging from offloading congested cells, through to reducing overseas roaming costs and innovative in-venue services.”

Bubley’s paper explores the recent history of some cell phone providers aggressively trying to offload traffic from their congested 3G networks onto Wi-Fi networks backed by wired broadband.

Among the most intent:

  • AT&T, which acquired Wayport, a major wireless ISP, is placing Wi-Fi hotspots at venues and in high-traffic tourist areas in major cities, and wants to seamlessly switch Apple iPhone users to Wi-Fi whenever it is available;
  • PCCW in Hong Kong;
  • KT in the Republic of Korea, which has moved as much as 67 percent of its data traffic to Wi-Fi;
  • KDDI in Japan, which is planning to deploy as many as 100,000 Wi-Fi Hotspots across the country.

America's most aggressive data offloader is pushing more and more customers toward its Wi-Fi hotspots.

Bubley says the congestion some carriers experience isn’t necessarily from users downloading too much or watching too many online shows.  Instead, it comes from “signalling congestion,” caused when a smartphone’s applications demand repeated attention from the carrier’s network.  An application that requires regular but short IP connections can pose a bigger problem than a user simply downloading a file.  Moving this traffic to Wi-Fi can be a real resource-saver for wireless carriers.
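To make the distinction concrete, here is a rough, back-of-the-envelope sketch of how a chatty application can generate far more signalling than a single large download. The figures are our own illustrative assumptions, not numbers from Bubley’s paper:

```python
# Illustrative only: how frequent, short connections translate into
# control-plane (signalling) load. All figures below are hypothetical.

polls_per_hour = 12            # an app that checks for updates every 5 minutes
signalling_msgs_per_poll = 8   # assumed radio setup/teardown messages per wake-up
handsets_on_cell = 2000        # assumed smartphones camped on one busy cell

msgs_per_hour = polls_per_hour * signalling_msgs_per_poll * handsets_on_cell
print(f"~{msgs_per_hour:,} signalling messages per hour on that cell")  # ~192,000

# A single large file download, by contrast, sets up one connection and then
# moves bulk data: heavy on bandwidth, light on signalling. Offloading the
# chatty traffic to Wi-Fi removes that signalling burden from the cellular core.
```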

Bubley notes many wireless companies would like to charge third-party developers fees for access to each provider’s “app store.”  Applications that consume a lot of resources could be charged more by providers (or banned altogether), while those that “behave well” could theoretically be charged a lower fee.  The only thing preventing this type of “two-sided business model,” charging both developers and consumers for the applications that work on smartphones, is Net Neutrality policies (or the threat of them) in many countries.

Instead, Bubley suggests, carriers should be more open and helpful with third-party developers, assisting them on a voluntary basis in building more efficient applications.

Bubley also ponders future business strategies for Wi-Fi.  He explores the next generation of Wi-Fi networks that allow users to establish automatic connections to the best possible signal without ponderous log-in screens, and new clients that can intelligently search out and connect to approved networks without user intervention.  That means data traffic could theoretically be shifted to any authenticated or preferred Wi-Fi network without users having to mess with the phone’s settings.  At the same time, that same technology could be used to keep customers off free, third-party Wi-Fi networks in favor of networks the operators run themselves.
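A minimal sketch of what such a policy-driven connection manager might look like, assuming a simple operator-supplied preference list. The network names and scoring weights below are hypothetical and not drawn from any actual client:

```python
# Sketch of carrier-controlled Wi-Fi selection: rank visible networks against an
# operator preference list and connect without ever showing a log-in screen.

OPERATOR_PREFERENCES = {      # higher score = more preferred by the carrier
    "CarrierHotspot": 100,
    "PartnerVenueWiFi": 60,
    "HomeBroadbandAP": 30,
}

def pick_network(visible):
    """visible: list of (ssid, signal_strength_dbm) tuples from a scan."""
    candidates = [
        (OPERATOR_PREFERENCES.get(ssid, 0), signal, ssid)
        for ssid, signal in visible
        if ssid in OPERATOR_PREFERENCES      # anything off the list is simply ignored
    ]
    if not candidates:
        return None                          # fall back to the cellular network
    candidates.sort(reverse=True)            # operator ranking first, then signal
    return candidates[0][2]

print(pick_network([("HomeBroadbandAP", -50), ("CarrierHotspot", -70)]))
# -> "CarrierHotspot": the carrier's own hotspot wins even with a weaker signal,
# which is exactly the behaviour a "Wi-Fi Neutrality" rule would scrutinize.
```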

Policy controls are a major focus of Bubley’s paper.  While he advocates customer-friendly use of such controls, the same sophisticated network management tools can also be used by wireless providers to nickel-and-dime customers to death with usage fees, or to open up new markets by pitching managed Wi-Fi networks to new customers.


For example, a wireless carrier could sell a retail store a ready-to-run Wi-Fi service that pushes shoppers onto a tightly controlled, store-run network — a network that forbids access to competing retailers and online merchants, in an effort to keep customers from comparing prices (or, worse, ordering) from a competitor while they shop.

Customers could also find their smartphones programmed to connect automatically to a carrier’s Wi-Fi network, blocking access to all others whenever a “preferred” network is in range.  Wireless carriers could develop the same Internet Overcharging schemes for Wi-Fi use that they have rolled out for 3G and 4G wireless network access.  Also available: speed throttles for “non-preferred” applications, speed controls for less-valued ‘heavy users,’ and extra-fee “roaming charges” for using a non-preferred Wi-Fi network.

Bubley warns carriers not to go too far.

“[We] believe that operators need to internalize the concept of ‘WiFi Neutrality’ – actively blocking or impeding the user’s choice of hotspot or private Wi-Fi is likely to be as divisive and controversial as blocking particular Internet services,” Bubley writes.

In a blog entry, Bubley expands on this concept:

I’m increasingly convinced that mobile device / computing users will need sophisticated WiFi connection management tools in the near future. Specifically, ones that allow them to choose between multiple possible accesses in any given location, based on a variety of parameters. I’m also doubtful that anyone will want to allow a specific service provider’s software to take control and choose for them – at least not always.

We may see the emergence of “WiFi Neutrality” as an issue, if particular WiFi accesses start to be either blocked or “policy-managed” aggressively.

Video: http://www.phillipdampier.com/video/The Future of Wi-Fi.flv

Edgar Figueroa, chief executive officer of The Wi-Fi Alliance, speaks about the future of Wi-Fi. Wi-Fi technology has matured dramatically since its introduction more than a decade ago and today we find Wi-Fi in a wide variety of applications, devices and environments.  (3 minutes)

Canada: Get Off the Internet and Go Outside – You Are the Second Largest ‘Data Hog’ in the World

Phillip Dampier | July 7, 2011


Except for South Korea, nobody uses the Internet more than Canadians.  That’s an important finding in a new report produced for Canada’s telecommunications regulator to better understand the current state of the broadband market in the country.

According to recent reports from the Organization for Economic Cooperation and Development, the country generates 2,288 terabytes of data traffic per month per 100,000 residents.  That’s among the highest in the world and comes from avid web browsing, watching online video, and a love affair with smartphones.
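For scale, that works out to roughly 23 GB per resident per month. The conversion below is our own arithmetic, using decimal units:

```python
# Convert 2,288 TB per month per 100,000 residents into a per-resident figure.
tb_per_month_per_100k = 2288
gb_per_month_per_100k = tb_per_month_per_100k * 1000   # decimal terabytes to gigabytes
gb_per_resident = gb_per_month_per_100k / 100_000
print(f"{gb_per_resident:.1f} GB per resident per month")   # about 22.9 GB
```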

But like many relationships, this one is also expensive.  You pay all of the money you have to spare, and your provider delivers you just enough of a usage fix to keep you from running to Ottawa to demand change.

Canadian broadband pricing is in the top third of all OECD-measured nations, with the average price for High Speed Internet running $55.18.  The median price across all OECD members runs a lot less — $39.23 per month.  If you want to pay less, you have to bundle your landline, cell phone, television, and Internet service with the same provider, or make do with a slow speed “lite user” plan, where average pricing had been running lower until this year.

The average monthly price of the Level 1 basket increased to roughly $35 in 2011, up considerably from $31 the previous year.

Similarly, the average monthly price of the Level 2 basket increased this year as well, to roughly $50, up from $48 last year.  The average advertised download speed of the services included in the Level 2 basket is close to 6.5 Mbps, which is similar to the average speed in last year’s study.

The average monthly price of the Level 3 basket also increased slightly to $63, but still remains well below the 2008 price of $69 per month.  The average advertised download speed of the services included in the Level 3 basket is roughly 14 Mbps, which is slightly higher than in last year’s study (where the average speed was 12.5 Mbps).

Lastly, the average price of the new Level 4 broadband service basket is roughly $78 per month.  The advertised download speeds for the Level 4 broadband services included in the study range from 25 to 50 Mbps – the average is close to 30 Mbps.

Roughly half of the Canadian broadband service plans surveyed for this study include monthly usage caps.  For those that do, the caps range from 1 to 13 GB on the Level 1 service basket – the average is 7 GB per month.  The range for the Level 2 service basket is from 25 to 75 GB – the average is 55 GB per month.  The range for the Level 3 service basket is from 75 to 125 GB – the average is just over 90 GB per month.  There has been little change in these monthly usage caps, on average, compared to last year.

In the case of the new Level 4 broadband service basket, for those service providers applying data usage caps, the caps range from 75 to 250 GB per month – the average was close to 140 GB per month.
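Pulling the basket figures reported above into one place, with a simple price-per-advertised-megabit calculation of our own (Level 1 is omitted because the study does not tie it to a single advertised speed):

```python
# Basket figures as reported above; the $/Mbps column is our own illustration.
baskets = {
    # level: (avg monthly price $, avg advertised download Mbps, avg usage cap GB)
    2: (50, 6.5, 55),
    3: (63, 14, 90),
    4: (78, 30, 140),
}
for level, (price, mbps, cap) in baskets.items():
    print(f"Level {level}: ${price}/mo, ~{mbps} Mbps, cap ~{cap} GB, "
          f"${price / mbps:.2f} per advertised Mbps")
```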

The Canadian pricing and usage study was developed by Wall Communications for the Canadian Radio-television and Telecommunications Commission.  The best news for Canada?  Your broadband pricing remains relatively stable, with some package pricing reducing the cost of the broadband component.  Standalone service appears to have increased in price only slightly in many markets.  Increased foreign investment in the wireless marketplace is shaking up wireless pricing, as the hegemony of Bell, Telus, Rogers, and Quebecor comes under increasing competitive pressure.  It’s a much sunnier outlook than what is taking place in the country to your south.

For Americans, pricing is headed in only one direction: up.

All charts courtesy of Wall Communications, Inc.

MetroPCS’ Nasty Terms of Use: ‘We May Not Provide You a Meaningful Data Experience’

Phillip Dampier | May 25, 2011

Unlimiting the ways a cell phone company can limit your service.

MetroPCS pitches its 4G/LTE plans to customers looking to save money over the bigger players in the marketplace.  The upstart provider, based in Richardson, Texas, serves just over a dozen major metropolitan areas with no-contract plans that deliver lower prices in return for smaller coverage areas.  As larger providers heavily market their “next generation 4G” networks, MetroPCS has been promoting its own “unlimited talk, text, and web” 4G/LTE plan for $60 a month.  But there is a catch, revealed only when customers click the fine-print link that opens the Terms of Use.  The document is a poster child for why Net Neutrality matters, because it allows the company to block, throttle, prioritize, alter, or inspect any web content.

Here is a selection of the Terms and Conditions that tarnish a great-sounding deal (emphasis ours):

You acknowledge and agree that the Internet contains Data Content which, without alteration, will or may not be available, or may not be providable to you in a way to allow a meaningful experience, on a wireless handset.

You acknowledge and agree that such alteration that MetroPCS may or will perform on your behalf as your agent may include our use of Data Content traffic management or shaping techniques such as, but not limited to delaying or controlling the speeds at which Data Content is delivered, reformatting the Data Content, compressing the Data Content, prioritizing traffic on MetroPCS’ network, and placing restrictions on the amount of Data Content made available based on the Agreement. You further acknowledge that MetroPCS may not be able to alter such Data Content for you merely by reference to the Internet address and therefore acknowledge and agree that MetroPCS may examine, including, but not limited to Shallow (or Stateful) Packet Inspection and Deep Packet Inspection, the Data Content requested by you while using the MetroWEB Service to determine how best to alter such Data Content prior to providing it to you.

If we notice excessive data traffic coming from your phone, we reserve the right to suspend, reduce the speed of, or terminate your MetroWEB Service. In addition, to provide a good experience for the majority of our customers and minimize capacity issues and degradation in network performance, we may take measures including temporarily reducing data throughput for a subset of customers who use a disproportionate amount of bandwidth; if your web and data Service Plan usage is predominantly off-portal or otherwise not provided by MetroPCS during a billing cycle, we may reduce your data speed, without notice, for the remainder of that billing cycle. We may also suspend, terminate, or restrict your data session, or MetroWEB Service if you use MetroWEB Service in a manner that interferes with other customers’ service, our ability to allocate network capacity among customers, or that otherwise may degrade service quality for other customers.

MetroPCS also wants customers to know their service is not intended as a home broadband replacement, and states it is only to be used for basic web services, including e-mail, web browsing, and downloading of legitimate audio content.  Video streaming is naughty.
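For readers wondering what “examining” and “altering” Data Content looks like in practice, here is a purely conceptual sketch of how a packet-inspection-driven policy engine could enforce terms like those quoted above. This is not MetroPCS’s actual system; the traffic classes and rate limits are invented for illustration:

```python
# Conceptual sketch only: classify a flow (the stand-in for shallow/deep packet
# inspection), then apply a carrier-defined policy. Classes and limits invented.

POLICY = {
    "web":   {"action": "allow"},
    "email": {"action": "allow"},
    "audio": {"action": "throttle", "kbps": 256},   # "legitimate audio content"
    "video": {"action": "block"},                   # video streaming is naughty
}
DEFAULT_RULE = {"action": "deprioritize"}           # off-portal / unrecognized traffic

def classify(flow):
    """Stand-in for packet inspection: in a real network, DPI happens here."""
    return flow.get("app_class", "unknown")

def enforce(flow):
    return POLICY.get(classify(flow), DEFAULT_RULE)

print(enforce({"app_class": "video"}))   # {'action': 'block'}
print(enforce({"app_class": "audio"}))   # {'action': 'throttle', 'kbps': 256}
print(enforce({"app_class": "p2p"}))     # {'action': 'deprioritize'}
```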

Public Knowledge Dips Its Toe Into Fight Against Internet Overcharging – Learn From Canada

Phillip Dampier | May 9, 2011

Among the public interest groups that have historically steered clear of the fight against usage caps and usage based billing is Public Knowledge.

Stop the Cap! took them to task more than a year ago for defending the implementation of these unjustified hidden rate hikes and usage limits.  Since then, we welcome the fact that the group has increasingly been trending toward the pro-consumer, anti-cap position, but it still has some road to travel.

Public Knowledge, joined by New America Foundation’s Open Technology Initiative, has sent a letter to the Federal Communications Commission expressing concern over AT&T’s implementation of usage caps and asking for an investigation:

[…] Public Knowledge and New America Foundation’s Open Technology Initiative urge the Bureau to exercise its statutory authority to fully investigate the nature, purpose, impact of those caps upon consumers. The need to fully understand the nature of broadband caps is made all the more urgent by the recent decision by AT&T to break with past industry practice and convert its data cap into a revenue source.

[…] Caps on broadband usage imposed by Internet Service Providers (ISPs) can undermine the very goals that the Commission has committed itself to championing. While broadband caps are not inherently problematic, they carry the omnipresent temptation to act in anticompetitive and monopolistic ways. Unless they are clearly and transparently justified to address legitimate network capacity concerns, caps can work directly against the promise of broadband access.

The groups call out AT&T for its usage cap and overlimit fee model, and ponder whether these are more about revenue enhancement than network management.  The answer to that question has been clear for more than two years now: it’s all about the money.

The two groups are to be commended for raising the issue with the FCC, but they are dead wrong about caps not being inherently problematic.  Usage caps have no place in the North American wired broadband market.  Even in Canada, providers like Bell have failed to make a case justifying their implementation.  What began as an argument about congestion has evolved into one about charging heavy users more to invest in upgrades that are simply not happening on a widespread basis.  The specific argument used is tailored to the audience: complaints about congestion to government officials, denials of congestion issues to shareholders coupled with promotion of usage pricing as a revenue enhancer.

If Bell can’t sell the Canadian government on its arguments for usage caps in a country that has a far lower population density and a much larger rural expanse to wire, AT&T certainly isn’t going to have a case in the United States, and they don’t.

The history of these schemes is clear:

  1. Providers historically conflate their wireless broadband platforms with wired broadband when arguing for Internet Overcharging schemes.  When regulators agree to arguments that wireless capacity problems justify usage limits, extending those limits to wired broadband gets carried along for the ride.  Dollar-a-holler groups supporting the industry love to use charts showing wireless data growth, and claim a similar problem afflicts wired broadband, even though the costs to cope with congestion are very different on the two platforms.
  2. Providers argue one thing while implementing another.  Most claim pricing changes allow them to introduce discounted “light user” plans.  But few customers actually save, because true “pay only for what you use” usage-based billing is not on offer.  Instead, worry-free flat use plans are taken off the menu, replaced with tiered plans that force subscribers to guess their usage (see the worked example after this list).  If they guess too little, a stiff overlimit fee applies.  If they guess too much, they overpay.  Heads AT&T wins, tails you lose.  That’s a clear warning providers are addressing revenue enhancement, not network enhancement.
  3. Claims of network congestion backed up with raw data, average usage per user, and the costs to address it are all labeled proprietary business information and are not available for independent inspection.
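A quick worked example of the “guess your usage” trap described in point 2. The tier price, allowance, and overlimit fee below are hypothetical and do not belong to any particular provider:

```python
# Hypothetical tiered plan: $35 for a 40 GB allowance, $2 per GB over it.
tier_price = 35.0
allowance_gb = 40
overlimit_per_gb = 2.0

def monthly_bill(usage_gb):
    overage = max(0, usage_gb - allowance_gb)
    return tier_price + overage * overlimit_per_gb

print(monthly_bill(55))   # guessed too low: $35 + 15 GB x $2 = $65.00
print(monthly_bill(20))   # guessed too high: still $35, an effective $1.75/GB
# Under true pay-per-use billing at, say, 50 cents per GB, both months would
# cost the customer less, which is why that option is never on the menu.
```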

There are a few other issues:

In the world of broadband data caps, the caps recently implemented by AT&T are particularly aggressive. Unlike competitors whose caps appear to be at least nominally linked to congestions during peak-use periods, AT&T seeks to convert caps into a profit center by charging additional fees to customers who exceed the cap. In addition to concerns raised by broadband caps generally, such a practice produces a perverse incentive for AT&T to avoid raising its cap even as its own capacity expands.

In North America, only a handful of providers use peak-usage pricing for wired broadband.  Cable One, America’s 10th-largest cable operator, is among the largest of them, and it serves fewer than one million customers.  Virtually all providers with usage caps count both upstream and downstream data traffic 24 hours a day against a fixed usage allowance.  The largest — Comcast — does not charge an excessive usage fee.  AT&T does.

Furthermore, it remains unclear why AT&T’s recently announced caps are, at best, equal to those imposed by Comcast over two years ago.  The caps for residential DSL customers are a full 100GB lower than those Comcast saw fit to offer in mid-2008. The lower caps for DSL customers is especially worrying because one of the traditional selling points of DSL networks is that their dedicated circuit design helps to mitigate the impacts of heavy users on the rest of the network. Together, these caps suggest either that AT&T’s current network compares poorly to that of a major competitor circa 2008 or that there are non-network management motivations behind their creation.


As Stop the Cap! has always argued, usage caps are highly arbitrary.  Providers always believe their usage caps are the best and most fair around, whether it was Frontier’s 5GB usage limit or Comcast’s 250GB limit.

AT&T experimented with usage limits in Reno, Nevada, and Beaumont, Texas, and found customers loathed them.  Comcast’s customers tolerate the cable company’s 250GB usage cap because it is not strictly enforced — only the top few violators are issued warning letters.  AT&T has established America’s first Internet pricing version of the Reese’s Peanut Butter Cup: combining Comcast’s tolerated usage cap with the overlimit fees already familiar from AT&T’s wireless side.  The bitter aftertaste arrives in the mail at the end of the month.

Why establish different usage caps for DSL and U-verse?  Marketing, of course.  This is about money, remember?

AT&T DSL delivers far less average revenue per customer than its triple-play U-verse service.  To give U-verse a higher value proposition, AT&T supplies a more generous usage allowance.  Message: upgrade from DSL for a better broadband experience.

Technically, there is no reason to enforce either usage allowance.  AT&T DSL offers each customer a dedicated connection to the central office or D-SLAM, from which fiber traditionally carries the signal to AT&T’s enormous backbone.  U-verse brings fiber to the neighborhood and a much fatter dedicated pipeline into individual subscriber homes to carry its phone, Internet, and video services.

A usage cap on U-verse makes as much sense as putting a coin meter on the television or charging for every phone call, something AT&T abandoned with their flat rate local and long distance plans.

Before partly granting AT&T’s premise that usage limits are a prophylactic for congestion, and then advocating that they be administered with oversight, why not demand proof that such pricing and usage schemes are necessary in the first place?  With independent verification of the raw data, providers like AT&T will find that an insurmountable challenge, especially if they have to open their books.

Video: http://www.phillipdampier.com/video/Bell’s Arguments for UBB 2-2011.flv

Canada’s experience with Usage-Based Billing has all of the hallmarks of the kind of consumer ripoff AT&T wants Americans to endure:

  • A provider (Bell), whose spokesman argues for these pricing schemes to address congestion and “fairness,” even as that same spokesman admits there is no congestion problem;
  • Would-be competitors being priced out of the marketplace because they lack the infrastructure, access, or fair pricing to compete;
  • Big bankers and investors who applaud price gouging and are appalled at government checks and balances.

Watch Mirko Bibic try to rationalize why Bell’s Fibe TV (equivalent to AT&T U-verse) needs Internet Overcharging schemes for broadband, but suffers no capacity issues delivering video and phone calls over the exact same line.  Then watch the company try and spin this pricing as an issue of fairness, even as an investor applauds the company: “I love this policy because I am a shareholder.  That’s all I care about.  If you can suck every last cent out of users, I’m happy for you.”  Finally, watch a company buying wholesale access from Bell let the cat out of the bag — broadband usage costs pennies per gigabyte, not the several dollars many providers want to charge.  (11 minutes)

Broadcast Lobby Says ‘Spectrum Crisis’ is Fiction; Wireless Data Tsunami Debunked


The National Association of Broadcasters (NAB), a trade association and lobbying group representing many of the nation’s television stations, says claims by wireless carriers of a nationwide spectrum crisis are troubling and counterfactual.  That conclusion comes in a new report, issued by the NAB this morning, which urges the FCC to keep its hands off the UHF broadcast channel spectrum the agency wants to sell off to expand mobile broadband.

The paper, “Solving the Capacity Crunch: Options for Enhancing Data Capacity on Wireless Networks,” written by a former FCC employee, suggests claims by wireless carriers that they will “run out” of frequencies to serve America’s growing interest in wireless services are simply overblown.

Many wireless companies own spectrum they are not using, the report argues, and other licensed users are holding onto spectrum without using it either, hoping to make a killing selling it off at enormous profits in the future.  Besides, the federal government holds the largest amount of underutilized spectrum around — frequencies that could easily be allocated to wireless use without further reducing the size of the UHF broadcast TV band.

Many of the ideas in the NAB report emphasize the need for carriers to deploy innovative technology solutions to increase the efficiency of the spectrum they are already using.  Those ideas include additional cell towers to split traffic loads into smaller regional areas, and improving on network channel-bonding, caching, and intelligent network protocols.

But the NAB report has some obvious weak spots the wireless industry will likely exploit — notably their recommendations that seek a reduction in wireless traffic — ideas that would suggest there is not enough spectrum to handle every user.  Among those recommendations:

  • Implementing Internet Overcharging schemes like “fair use” policies and consumption-based pricing to discourage use;
  • Migrating voice traffic to Internet Protocol;
  • Migrating data traffic to a prolific network of “femtocells” — mini antennas that provide 3G service inside buildings, but deliver that traffic over home or business wired broadband connections;
  • Offering wider access to Wi-Fi networks in public areas;
  • Encouraging the development of bandwidth sensitive devices and applications.

The National Broadband Plan’s conclusion of a spectrum shortage is based on little more than a wish list by wireless carriers, says the paper. Its author, Uzoma Onyeije, cites contradictory statements by high-ranking corporate officials to argue that the Plan’s call for making 500MHz of spectrum available for broadband within ten years is a gross overestimate of actual need.

“There is no denying that the corporate imperative of mobile wireless carriers is to obtain as much spectrum as they can,” Onyeije wrote. “However, the fact that wireless carriers cannot find a unified voice on the amount and timing of their spectrum needs suggests that this advocacy is more strategic gamesmanship than factual reality.”

The NAB has heavily lobbied Washington officials on the issue of spectrum because its members — broadcast television stations — are facing the loss of up to 120MHz of what’s left of the UHF dial, already shrinking because of earlier reallocations.  The FCC proposal would resize the UHF dial to channels 14-30 — 16 channels.  In crowded television markets like Los Angeles, up to 16 stations would be forced to sign off the public airwaves for good, because there would be insufficient space for them to continue broadcasting.  Instead, the FCC proposes they deliver their signals over pay television providers like cable or telco-provided IPTV.  Or they could always stream over the Internet.  But that would mean the decline of free, over-the-air television in this country.

Considering the millions of dollars many stations are worth, it’s no surprise broadcasters are howling over the proposal.

Onyeije’s report suggests AT&T and Verizon, among others, are grabbing whatever valuable spectrum they can get their hands on.  What they don’t use, they’ll “warehouse” for claimed future use.  By locking up unused spectrum, potential competitors can’t use it.  The proof, Onyeije writes, is found when comparing claims by the wireless industry with the FCC’s own independent research:

AT&T predicts 8-10 times of data growth between 2010 and 2015 and T-Mobile forecasts that data will have 10 times of growth in 5 years. Yet, the Commission’s assessment that 275MHz of spectrum is needed to meet mobile data demand is premised on data growth of 35 times between 2009 and 2014.
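To see how far apart those forecasts really are, here is the implied compound annual growth rate for each, treating every forecast as growth over a five-year span. The arithmetic is our own, not the NAB paper’s:

```python
# Implied compound annual growth rates for the competing five-year forecasts.
forecasts = {
    "AT&T (high end of 8-10x)": 10,
    "T-Mobile": 10,
    "FCC OBI working assumption": 35,
}
years = 5
for source, factor in forecasts.items():
    cagr = factor ** (1 / years) - 1
    print(f"{source}: {factor}x over {years} years = {cagr:.0%} per year")
# 10x works out to roughly 58% per year; 35x to roughly 104% per year. The
# Commission's working assumption is nearly double the carriers' own forecasts.
```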

The Data Tsunami Debunked

Some providers are sitting on spectrum they already own.

The NAB also takes to task the “evidence” many providers use to claim the zettabyte era is at hand, where a veritable exaflood of data will force America into a widespread data brownout if more capacity isn’t immediately made available.

[…] The [industry claims rely] on suspect data. In arriving at its conclusion, OBI Technical Paper No. 6 relies heavily on forecast data from Cisco that is both wildly optimistic about data growth and unscientific. In a blog entry entitled, Should a Sales Brochure Underlie US Spectrum Policy?, Steven Crowley states that “[t]here is overlap between the people who prepare the forecast and the people responsible for marketing Cisco’s line of core-network hardware to service providers. The forecast is used to help sell that hardware. Put simply, it’s a sales brochure.”

Onyeije takes apart the oft-repeated claim that a data explosion will be unyielding, unrelenting, and will be the wireless industry’s biggest challenge for years to come.  His report also speaks to issues about broadband use in general:

In particular, the paper appears to be premised on the highly suspect assumption that the high demand curve for mobile data will not slow. While smartphone growth is significantly increasing now, it will no doubt plateau and slow. It has been widely accepted for decades that the process of technological adoption over time is typically illustrated as a classic normal distribution or “bell curve” where a phase of rapid adoption ends in slowed adoption as the product matures or new technologies emerge.

As recently reported, Cisco now projects that U.S. mobile growth will drop by more than half by 2015. As Dave Burstein, Editor of DSL Prime, explains: “The growth is clearly not exponential.”  Mr. Burstein went on to say “Every CFO and engineer has to plan carefully for the network upgrades needed, but the numbers certainly don’t suggest a ‘crisis.’” Jon Healey of the Los Angeles Times Editorial Board similarly explains that “Much of the growth in the demand for bandwidth has come from two parallel forces: a new type of smartphone (epitomized by the iPhone) encourages people to make more use of the mobile Web, and more people are switching from conventional mobile phones to these new smartphones. Once everyone has an iPhone, an Android phone or the equivalent, much of the growth goes away.” AP Technology writer Peter Svensson echoes this concern and explains “AT&T’s own figures indicate that growth is slowing down now that smartphones are already in many hands.” Thus, the assumption that data demand will continue to grow unabated is deeply flawed.

Internet Overcharging is About Rationing and Reducing Use

Although the NAB favors Internet Overcharging to drive down demand, Onyeije’s report inadvertently hands additional evidence to those who oppose data caps, meters, and speed throttles, because it shows these schemes are designed to monetize usage while driving it down at the same time:

While unlimited data plans on mobile phones were once the standard, there is now more focus on using pricing as a network management tool. As AT&T Operations President John Stankey put it, “I don’t think you can have an unlimited model forever with a scarce resource. More people get drunk at an open bar than a cash bar.”  In the past year, AT&T and Virgin Mobile abandoned unlimited data plans. In 2010, T-Mobile announced that it would employ data throttling and slow the download speeds of customers that use more than five GB of data each month. And Bloomberg reported on March 1, 2011 that “Verizon Communications Inc. will stop offering unlimited data plans for Apple Inc.’s iPhone as soon as this summer and switch to a tiered pricing offering that can generate more revenue and hold the heaviest users in check.” Usage-based smartphone data plans substantially reduce per-user data traffic. As a result, data growth is likely to slow over time. And companies, including Cisco, are marketing products to carriers to help make tiered data plans easier to implement and help carriers “increase the monetization of their networks.”
