
Clearwire: Unlimited Means No More Than 5GB Or We Throw You Off

Phillip Dampier November 20, 2012 Broadband Speed, Data Caps, Wireless Broadband 7 Comments

Clearwire wants a divorce from customers it deems to be using its wireless broadband service too much — as in around 5GB per month, despite the fact that many of those customers pay for “unlimited” accounts.

Broadband Reports says several ex-customers are now complaining on Twitter about their abrupt, involuntary departure this week as paying customers, despite company promises that advance warnings would be sent if a customer was engaged in “excessive use.”

“One user excessively running heavy bandwidth applications can adversely affect the speeds and service quality for their neighbors,” Clearwire told Broadband Reports. “It is rare that we take this step and when we do it affects an extremely small percentage of our total user base. We typically contact users to notify them of this type of situation first in order to provide an opportunity to make necessary changes.”

Broadband Reports:

How much usage was considered too much? Clearwire won’t get specific, but one of the users tells Broadband Reports Clearwire informed him he’d breached 5 GB three months in a row — which frankly doesn’t sound excessive for a modern wireless network.

Clear began throttling heavy users on unlimited accounts to around 256kbps back in 2010. The company has never been specific about what triggers the throttled state, which appears to be calculated on the fly based on local tower congestion — so the trigger may differ from market to market. It is not entirely clear why throttling these users to 256kbps was not a substantial enough punishment, without also firing them as customers.

Halloween Scare Stories: Controlling the “Spectrum Shortage” Data Tsunami With Rate Hikes, Caps

Phillip Dampier October 25, 2012 Astroturf, AT&T, Broadband "Shortage", Competition, Consumer News, Data Caps, Editorial & Site News, Public Policy & Gov't, Sprint, T-Mobile, Verizon, Video, Wireless Broadband

Phillip “Halloween isn’t until next week” Dampier

Despite endless panic about spectrum shortages and data tsunamis, even more evidence arrived this week illustrating the wireless industry and their dollar-a-holler friends have pushed the panic button prematurely.

The usual suspects are at work here:

  • The CTIA – The Wireless Association is the chief lobbying group of the wireless industry, primarily representing the voices of Verizon, AT&T, Sprint, and T-Mobile. They publish regular “weather reports” predicting calamity and gnashing of teeth if Washington does not immediately cave to demands to open up new spectrum, despite the fact carriers still have not utilized all of their existing inventory;
  • Cisco – Their bread is buttered when they convince everyone that constant equipment and technology upgrades (coincidentally sold by them) are necessary. Is your enterprise ready to confront the data tsunami? Call our sales office;
  • The dollar-a-holler gang – D.C. based lobbying firms and their astroturf friends sing the tune AT&T and Verizon pay to hear. No cell company wants to stand alone in a public policy debate important to their bottom line, so they hire cheerleaders that masquerade as “research firms,” “independent academia,” “think tanks,” or “institutes.” Sometimes they even enlist non-profit and minority groups to perpetuate the myth that doing exactly what companies want will help advance the cause of the disenfranchised (who probably cannot afford the bills these companies mail to their customers).

Tim Farrar of Telecom, Media, and Finance Associates discovered something interesting about wireless data traffic in 2012. Despite blaring headlines from the wireless industry that “Consumer Data Traffic Increased 104 Percent” this year, statistics reveal a dramatic slowdown in wireless data traffic, primarily because wireless carriers are raising prices and capping usage.

The CTIA press release only quotes total wireless data traffic within the US during the previous 12 months up to June 2012 for a total of 1.16 trillion megabytes, but doesn’t give statistics for data traffic in each individual six-month period. That information, however, can be calculated from previous press releases (which show total traffic in the first six months of 2012 was 635 billion MB, compared to 525 billion MB in the final six months of 2011).

Counter to the CTIA’s spin, this represents growth of just 21 percent, a dramatic slowdown from the 54 percent growth in total traffic seen between the first and second half of 2011. Even more remarkably, on a per device basis (based on the CTIA’s total number of smartphones, tablets, laptops and modems, of which 131 million were in use at the end of June), the first half of 2012 saw an increase of merely 3 percent in average wireless data traffic per cellphone-network connected device, compared to 29 percent growth between the first and second half of 2011 (and 20-plus percent in prior periods).

[…] What was the cause of this dramatic slowdown in traffic growth? We can’t yet say with complete confidence, but it’s not an extravagant leap of logic to connect it with the widely announced adoption of data caps by the major wireless providers in the spring of 2012. It’s understandable that consumers would become skittish about data consumption and seek out free WiFi alternatives whenever possible.

Farrar
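Farrar’s back-of-the-envelope math can be checked directly against the CTIA figures quoted above. A minimal sketch (the half-year totals are the ones cited in the quote):

```python
# Half-yearly U.S. wireless data traffic, in billions of megabytes,
# taken from the CTIA press-release figures quoted above.
h2_2011 = 525.0   # final six months of 2011
h1_2012 = 635.0   # first six months of 2012

# The CTIA's headline number: total traffic for the 12 months to June 2012.
annual_total_trillion_mb = (h2_2011 + h1_2012) / 1000
print(annual_total_trillion_mb)  # 1.16 trillion MB, matching the press release

# Half-over-half growth -- the slowdown Farrar highlights.
growth = (h1_2012 - h2_2011) / h2_2011
print(f"{growth:.0%}")  # 21%
```

The same sum confirms the press release’s 1.16 trillion MB headline figure, which is why the half-year breakdown can be recovered from earlier releases.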

Cisco helps feed the flames with growth forecasts that at first glance seem stunning, until one realizes that growth and technological innovation go hand in hand when solving capacity crunches.

The CTIA’s alarmist rhetoric about America being swamped by data demand is backed by wireless carriers, at least when they are not talking to their investors. Both AT&T and Verizon claim their immediate needs for wireless spectrum have been satisfied in the near-term and Verizon Wireless even intends to sell excess spectrum it has warehoused. Both companies suggest capital expenses and infrastructure upgrades are gradually declining as they finish building out their high capacity 4G LTE networks. They have even embarked on initiatives to grow wireless usage. Streamed video, machine-to-machine communications, and new pricing plans that encourage customers to increase consumption run contrary to the alarmist rhetoric that data rationing with usage caps and usage pricing is the consequence of insufficient capacity, bound to get worse if we don’t solve the “spectrum crisis” now.

So where is the fire?

AT&T’s conference call with investors this week certainly isn’t warning the spectrum-sky is falling. In fact, company executives are currently pondering ways to increase data usage on their networks to support the higher revenue numbers demanded by Wall Street.

If you ask carriers’ investor relations departments in New York, they cannot even smell smoke. But company lobbyists are screaming fire inside the D.C. beltway. A politically responsive Federal Communications Commission has certainly bought in. FCC chairman Julius Genachowski has rung the alarm bell repeatedly, notes Farrar:

Even such luminaries as FCC Chairman Julius Genachowski have stated in recent speeches that we are at a crisis point, claiming “U.S. mobile data traffic grew almost 300 percent last year” — while CTIA says it was less than half that, at 123 percent. “There were many skeptics [back in 2009] about whether we faced a spectrum crunch. Today virtually every expert confirms it.”

A smarter way of designing high capacity wireless networks to handle increased demand.

So how are consumers responding to the so-called spectrum crisis?

Evidence suggests they are offloading an increasing amount of their smartphone and tablet traffic to free Wi-Fi networks to avoid eroding their monthly data allowance. In fact, Farrar notes Wi-Fi traffic leads the pack in wireless data growth. Consumers will choose the lower cost or free option if given a choice.

So how did we get here?

When cellular service was first conceived, wireless carriers built long range, low density networks. Today’s typical unsightly cell tower covers a significant geographic area that can reach thousands of customers (or many more in dense cities). If everyone decides to use their smartphones at the same time, congestion results unless a larger amount of spectrum supports a bigger wireless data “pipe.” But some network engineers recognize that additional spectrum allocated to that type of network only delays the inevitable next wave of congestion.

Wi-Fi hints at the smarter solution — building short range, high density networks that can deliver a robust wireless broadband experience to a much smaller number of potential users. Your wireless phone company may even offer you this solution today in the form of a femtocell which offloads your personal wireless usage to your home or business Wi-Fi network.

Some wireless carriers are adopting much smaller “cell sites” which are installed on light poles or in nearby tall buildings, designed to only serve the immediate neighborhood. The costs to run these smaller cell sites are dramatically less than a full-fledged traditional cell tower complex, and these antennas do not create as much visual pollution.

To be fair, wireless growth will eventually tap out the airwaves currently allocated for wireless data traffic. But more spectrum is on the way even without alarmist rhetoric that demands a faster solution more than a smart one that helps bolster both spectrum and competition.

Running a disinformation campaign and hiring lobbyists remains cheaper than modifying today’s traditional cellular network design, at least until spectrum limits or government policy force the industry’s hand towards innovation. Turning over additional frequencies to the highest bidder that currently warehouses unused spectrum is not the way out of this. Allocating spectrum to guarantee those who need it most get it first is a better choice, especially when those allocations help promote a more competitive wireless marketplace for consumers.

[flv width=”600″ height=”358″]http://www.phillipdampier.com/video/KGO San Francisco FCC considers spectrum shortage 9-12-12.flv[/flv]

KGO in San Francisco breaks down the spectrum shortage issue in a way ordinary consumers can understand. FCC chairman Julius Genachowski and even Google’s Eric Schmidt are near panic. But the best way to navigate growing data demand isn’t just about handing over more frequencies for the exclusive use of Verizon, AT&T and others. Sharing spectrum among multiple users may offer a solution that could open up more spectrum for everyone.  (2 minutes)

Bottom-Ranked Suddenlink Upset About Frontier’s Ad Claims Their DSL is Better

Suddenlink is throwing a hissy fit over Frontier’s aggressive advertising.

Now come on, you are both pretty… slow that is.

Suddenlink Communications is crawling mad that Frontier Communications has been hammering the cable company over their broadband speeds, which PC Magazine this week proclaimed were nothing to write home about. The cable operator successfully challenged some of Frontier’s ads with the National Advertising Division of the Council of Better Business Bureaus.

The group recommended Frontier cease making claims that its DSL service offers “dedicated” lines to the Internet in contrast to Suddenlink, which forces customers to share their connection with the whole neighborhood.

Frontier claims Suddenlink’s network can bog down during peak hours, while Frontier makes sure customers consistently get the speeds they pay for.

Many of the ads targeted customers in West Virginia, who regularly tell Stop the Cap! neither provider competing there offers particularly good service.

“Is Frontier kidding?” says Shane Foster, a former Frontier customer in West Virginia. “I was supposed to be getting up to 6Mbps service and I was lucky to get 1.5Mbps at 2 am.”

Foster says he believes Frontier oversold its DSL network in his area, with speeds slowing even further during the evening and weekends when everyone got online. While Frontier may not require customers to share a line from their home to the company’s central office, congestion can occur within Frontier’s local exchange or on the connection Frontier maintains with Internet backbone providers.

“The technician sent to my house even privately admitted it,” Foster tells Stop the Cap!

Foster switched to Suddenlink, but he is not exactly a happy customer there either.

“Their usage caps suck, the service is slow, and their measurement tool is always broken,” Foster shares. “West Virginia doesn’t just get the bottom of the barrel, it gets the dirt underneath it.”

Frontier Communications says it has been making improvements in West Virginia and other states where it provides DSL broadband. Some areas can now subscribe to 25Mbps service because of network upgrades. Foster says he would dump Suddenlink and go back to Frontier, if they can deliver speeds the rest of the country gets.

“Sorry, but 1.5Mbps is not broadband and with their prices, tricky fees and contracts it is robbery,” says Foster. “They need to clean up their act and I’ll come back. I hate usage caps with a passion.”

Frontier says it will appeal the NAD’s decision. But Frontier might do better advertising its broadband service as usage cap free — something customers consistently value over those running Internet Overcharging schemes.

Latest FCC Report on Broadband Speeds: Good for Verizon, Cablevision; Bad for Frontier

The Federal Communications Commission’s July report on America’s broadband speeds shows virtually every major national provider, with the exception of Frontier Communications, made significant improvements in delivering the broadband service and speeds they advertise to customers.

Utilizing thousands of volunteer testers agreeing to host a router that performs automated speed tests and other sampling measurements (full disclosure: your editor is a volunteer participant), the FCC speed measurement program is one of the most comprehensive independent broadband assessments in the country.

Hourly Sustained Download Speeds as a Percentage of Advertised, by Provider—April 2012 Test Data

The FCC found Cablevision’s improvements last year paid off handsomely for the company, which now effectively ties with Verizon Communication’s FiOS fiber-to-the-home service for delivering promised speeds during peak usage times. The cable operator was embarrassed in 2011 when the FCC found Cablevision broadband customers’ speeds plummeted during prime time Internet usage. Those problems have since been corrected with infrastructure upgrades — particularly important for a cable operator that faces near-ubiquitous competition from Verizon’s fiber network.

“This report demonstrates our commitment to delivering more than 100 percent of the speeds we advertise to our broadband customers – over the entire day and during peak hours – in addition to free access to the nation’s largest Wi-Fi network and other valuable product features and enhancements,” said Amalia O’Sullivan, Cablevision’s vice president of broadband operations.

Verizon also blew its own horn in a press statement released this afternoon.

“Verizon’s FiOS service continues to demonstrate its mastery of broadband speed, reliability and consistency for consumers as represented in today’s FCC-SamKnows residential broadband report,” said Mike Ritter, chief marketing officer for Verizon’s consumer and mass market business unit. “The FCC’s findings reaffirm the results from the 2011 report, which found that FiOS provides blazing-fast and sustained upstream and downstream speeds even during peak usage periods. This year’s results also show once again that FiOS Internet customers are receiving speeds that meet or exceed those we advertise, adding even more value to the customer experience.”

Average Peak Period Sustained Download and Upload Speeds as a Percentage of Advertised, by Provider—April 2012 Test Data

Cable operators’ investments in DOCSIS 3 technology also allowed their broadband networks to perform well even as broadband usage continues to grow. Comcast delivered 103% of promised speeds during peak usage, Time Warner Cable – 96%, and Cox – 95%.

Just one nationwide provider lost ground in the last year — Frontier Communications, whose DSL service has grown more congested than ever, with insufficient investment in network upgrades apparent by the company’s dead-last results.

Frontier managed 81% of promised speeds in 2011, partly thanks to its inherited fiber to the home network. This year, it managed only 79%.

Frontier performed adequately for customers choosing its lowest 1Mbps speed tier. It also performed well in areas where its fiber network can sustain much faster speeds. The biggest problems show up for Frontier’s DSL customers buying service at speeds of 3-10Mbps. At peak times, network congestion brings those speeds down.

On average, the FCC found fiber to the home service delivers the best broadband performance, followed by cable broadband, and then telephone company DSL. Five ISPs now routinely deliver nearly one hundred percent or greater of the speed advertised to the consumer even during time periods when bandwidth demand is at its peak. In the August 2011 Report, only two ISPs met this level of performance. In 2011, the average ISP delivered 87 percent of advertised download speed during peak usage periods; in 2012, that jumped to 96 percent. In other words, consumers today are experiencing performance more closely aligned with what is advertised than they experienced one year ago.

The FCC report also found that outlier performers in the 2011 study, with the exception of Frontier, worked hard to make their differences in performance disappear. Last year, the standard deviation from promised broadband speeds was 14.4 percent. This year it is 12.2 percent.

Peak Period Sustained Download Performance, by Provider—April 2012 Test Data

The FCC also found consumers are gravitating towards higher-priced, higher-speed broadband service. Last year’s average broadband speed tier was 11.1Mbps. This year it is 14.3Mbps, almost 30% higher. Along with faster speeds comes more usage. Customers paying for more speed expect to use their broadband connections more, and the FCC found they do.

Overall, the FCC was encouraged to see broadband speed tiers on the increase, some to 100Mbps or higher.

Highlights from the report:

  • Actual versus advertised speeds. The August 2011 Report showed that the ISPs included in the Report were, on average, delivering 87 percent of advertised speeds during the peak consumer usage hours of weekdays from 7:00 pm to 11:00 pm local time. The July 2012 Report finds that ISP performance has improved overall, with ISPs delivering on average 96 percent of advertised speeds during peak intervals, and with five ISPs routinely meeting or exceeding advertised rates.
  • Sustained download speeds as a percentage of advertised speeds. The average actual sustained download speed during the peak period was calculated as a percentage of the ISP’s advertised speed. This calculation was done for each speed tier offered by each ISP.
    • Results by technology:
      • On average, during peak periods DSL-based services delivered download speeds that were 84 percent of advertised speeds, cable-based services delivered 99 percent of advertised speeds, and fiber-to-the-home services delivered 117 percent of advertised speeds. This compared with 2011 results showing performance levels of 82 percent for DSL, 93 percent for cable, and 114 percent for fiber. All technologies improved in 2012.
      • Peak period speeds decreased from 24-hour average speeds by 0.8 percent for fiber-to-the-home services, 3.4 percent for DSL-based services and 4.1 percent for cable-based services. This compared with 0.4 percent for fiber services, 5.5 percent for DSL services and 7.3 percent for cable services in 2011.
    • Results by ISP:
      • Average peak period download speeds varied from a high of 120 percent of advertised speed to a low of 77 percent of advertised speed. This is a dramatic improvement from last year where these numbers ranged from a high of 114 percent to a low of 54 percent.
      • In 2011, on average, ISPs had a 6 percent decrease in delivered versus advertised download speed between their 24 hour average and their peak period average. In 2012, average performance improved, and there was only a 3 percent decrease in performance between 24 hour and peak averages.
  • Sustained upload speeds as a percentage of advertised speeds. With the exception of one provider, upload speeds during peak periods were 95 percent or better of advertised speeds. On average, across all ISPs, upload speed was 107 percent of advertised speed. While this represents improvement over the 103 percent measured for 2011, upload speeds have not been a limiting factor in performance and most ISPs last year met or exceeded their advertised upload speeds. Upload speeds showed little evidence of congestion with little variance between 24 hour averages and peak period averages.
    • Results by technology: On average, fiber-to-the-home services delivered 106 percent, DSL-based services delivered 103 percent, and cable-based services delivered 110 percent of advertised upload speeds. These compare with figures from 2011 of 112 percent for fiber, 95 percent for DSL, and 108 percent for cable.
    • Results by ISP: Average upload speeds among ISPs ranged from a low of 91 percent of advertised speed to a high of 122 percent of advertised speed. In 2011, this range was from a low of 85 percent to a high of 125 percent.
  • Latency. Latency is the time it takes for a packet of data to travel from one designated point to another in a network, commonly expressed in terms of milliseconds (ms). Latency can be a major controlling factor in overall performance of Internet services. In our tests, latency is defined as the round-trip time from the consumer’s home to the closest server used for speed measurement within the provider’s network. We were not surprised to find latency largely unchanged from last year, as it primarily depends upon factors intrinsic to a specific architecture and is largely outside the scope of improvement if networks are appropriately engineered. In 2012, across all technologies, latency averaged 31 milliseconds (ms), as opposed to 33 ms measured in 2011.
    • During peak periods, latency increased across all technologies by 6.5 percent, which represents a modest drop in performance. In 2011 this figure was 8.7 percent.
      • Results by technology:
        • Latency was lowest in fiber-to-the-home services, and this finding was true across all fiber-to-the-home speed tiers.
        • Fiber-to-the-home services provided 18 ms round-trip latency on average, while cable-based services averaged 26 ms, and DSL-based services averaged 43 ms. This compares to 2011 figures of 17 ms for fiber, 28 ms for cable and 44 ms for DSL.
      • Results by ISP: The highest average round-trip latency for an individual service tier among ISPs was 70.2 ms, while the lowest average latency within a single service tier was 12.6 ms. This compares to last year’s maximum latency of 74.8 ms and minimum of 14.5 ms.
  • Effect of burst speed techniques. Some cable-based services offer burst speed techniques, marketed under names such as “PowerBoost,” which temporarily allocate more bandwidth to a consumer’s service. The effect of burst speed techniques is temporary—it usually lasts less than 15 to 20 seconds—and may be reduced by other broadband activities occurring within the consumer household. Burst speed is not equivalent to sustained speed. Sustained speed is a measure of long-term performance. Activities such as large file transfers, video streaming, and video chat require the transfer of large amounts of information over long periods of time. Sustained speed is a better measure of how well such activities may be supported. However, other activities such as web browsing or gaming often require the transfer of moderate amounts of information in a short interval of time. For example, a transfer of a web page typically begins with a consumer clicking on the page reference and ceases when the page is fully downloaded. Such services may benefit from burst speed techniques, which for a period of seconds will increase the transfer speed. The actual effect of burst speed depends on a number of factors explained more fully below.
    • Burst speed techniques increased short-term download performance by as much as 112 percent during peak periods for some speed tiers. The benefits of burst techniques are most evident at intermediate speeds of around 8 to 15 Mbps and appear to tail off at much higher speeds. This compares to 2011 results with maximum performance increases of approximately 50 percent at rates of 6 to 7 Mbps with tail offs in performance beyond this.
  • Web Browsing, Voice over Internet Protocol (VoIP), and Streaming Video.
    • Web browsing. In specific tests designed to mimic basic web browsing—accessing a series of web pages, but not streaming video or using video chat sites or applications—the total time needed to load a page decreased with higher speeds, but only up to about 10 Mbps. Latency and other factors limited response time starting around speed tiers of 10 Mbps and higher. For these high speed tiers, consumers are unlikely to experience much if any improvement in basic web browsing from increased speed–i.e., moving from a 10 Mbps broadband offering to a 25 Mbps offering. This is comparable to results obtained in 2011 and suggests intrinsic factors (e.g. effects of latency, protocol limitations) limit overall performance at higher speeds. It should be noted that this is from the perspective of a single user with a browser and that higher speeds may provide significant advantages in a multi-user household or where a consumer is using a specific application that may be able to benefit from a higher speed tier.
    • VoIP. VoIP services, which can be used with a data rate as low as 100 kilobits per second (kbps) but require relatively low latency, were adequately supported by all of the service tiers discussed in this Report. However, VoIP quality may suffer during times when household bandwidth is shared by other services. The VoIP measurements utilized for this Report were not designed to detect such effects.
    • Streaming Video. 2012 test results suggest that video streaming will work across all technologies tested, though the quality of the video that can be streamed will depend upon the speed tier. For example, standard definition video is currently commonly transmitted at speeds from 1 Mbps to 2 Mbps. High quality video can demand faster speeds, with full HD (1080p) demanding 5 Mbps or more for a single stream. Consumers should understand the requirements of the streaming video they want to use and ensure that their chosen broadband service tier will meet those requirements, including when multiple members of a household simultaneously want to watch streaming video on separate devices. For the future, video content delivery companies are researching ultra high definition video services (e.g. 4K technology which has a resolution of 12 Megapixels per frame versus present day 1080p High Definition television with a 2 Megapixel resolution), which would require higher transmission speeds.
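The report’s core metric can be illustrated with a minimal sketch (this is not the FCC/SamKnows code; the advertised tier and sample values are hypothetical) showing how sustained download speed is reduced to a percentage of advertised speed for one tier, by averaging measurements taken during the 7:00 pm to 11:00 pm peak window:

```python
# Hypothetical peak-period (7-11 pm) download measurements for one
# subscriber on a 10 Mbps advertised tier, in Mbps.
advertised_mbps = 10.0
peak_samples = [9.4, 9.8, 9.1, 9.6, 9.7]

# Average sustained speed across the peak-period samples...
avg_sustained = sum(peak_samples) / len(peak_samples)

# ...expressed as a percentage of the advertised rate -- the statistic
# the FCC reports per speed tier and per ISP.
pct_of_advertised = 100 * avg_sustained / advertised_mbps
print(f"{pct_of_advertised:.0f}% of advertised")  # 95% of advertised
```

The published per-ISP numbers aggregate this calculation across every tier and every volunteer panelist, which is why a single household’s results can differ from its provider’s average.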

Year by Year Comparison of Sustained Actual Download Speed as a Percentage of Advertised Speed (2011/2012)


Innovation Reality Check: Give Broadband Consumers the Flat Rate Service They Demand

Phillip "Is this 'innovation' or more 'alienation' from Big Cable" Dampier

While Federal Communications Commission chairman Julius Genachowski pals around with his cable industry friends at this week’s Cable Show in Boston, observers could not miss the irony of the current FCC chairman nodding in repeated agreement with former FCC chairman Michael Powell, whose bread is now buttered by the industry he used to regulate.

The revolving door remains well-greased at the FCC, with Mr. Powell assuming the role of chief lobbyist for the cable industry’s National Cable and Telecommunications Association (and convention host) and former commissioner Meredith Attwell Baker enjoying her new office and high priced position at Comcast Corporation, just months after voting to approve its multi-billion dollar merger with NBC-Universal.

Genachowski’s announcement that he favors “usage-based pricing” as healthy and beneficial for broadband and high-tech industries reflects the view of a man who doesn’t worry about his monthly broadband bill. As long as he works for taxpayers, we’re covering most of those expenses for him.

Former FCC chairman Powell said cable providers want to be able to experiment with pricing broadband by usage. That represents the first step towards monetizing broadband usage, an alarming development for consumers and a welcome one for Wall Street who understands the increased earnings that will bring.

Unfortunately, the unspoken truth is that the majority of consumers who endure these “experiments” are unwilling participants. The plan is to transform today’s broadband Internet ecosystem into one checked by usage gauges, rationing, bill shock, and reduced innovation. The director of the FCC’s National Broadband Plan, Blair Levin, recently warned the United States is on the verge of throwing away its leadership in online innovation, distracted by trying to cope with a regime of usage limits that will force every developer and content producer to focus primarily on living within the usage allowances providers grant their customers.

“I’d rather be the country that developed fantastic applications that everyone in the world wants to use than the country that only invented data compression technology [to reduce usage],” Levin said.

Genachowski’s performance in Boston displayed a public servant primarily concerned about the business models of the companies he is supposed to oversee.

Genachowski: Abdicating his responsibility to protect the public in favor of the interests of the cable industry.

“Business model innovation is very important,” Genachowski said. “There was a point of view a couple years ago that there was only one permissible pricing model for broadband. I didn’t agree.”

We are still trying to determine what Genachowski is talking about. In fact, providers offer numerous pricing models for broadband service in the United States, almost uniformly built around speed-based tiers, which offer customers both a choice in pricing and include a worry-free usage cap defined by the maximum speed the connection supports.

Broadband providers experimenting with Internet Overcharging schemes like usage caps, speed throttles, and usage-billing only layer an additional profit incentive or cost control measure on top of existing pricing models.  A usage cap limits a customer to a completely arbitrary level of usage a provider determines is sufficient. But such caps can also be used to control over-the-top streaming video by limiting its consumption — an important matter for companies witnessing a decline in cable television customers.  Speed throttles are a punishing reminder to customers who “use too much” they need to ration their usage to avoid being reduced to mind-numbing dial-up speeds until the next billing cycle begins. Usage billing discourages consumers from ever trying new and innovative services that could potentially chew up their allowance and deliver bill shock when overlimit fees appear on the bill.

The industry continues to justify these experiments with wild claims of congestion, which do not prevent companies like Comcast, Time Warner Cable, and Cox from sponsoring their own online video streaming services which even they admit burn through bandwidth. Others claim customers should pay for what they use, which is exactly what they do today when they write a check to cover their growing monthly bill. Broadband pricing is not falling in the United States, it is rising — even in places where companies claim these pricing schemes are designed to save customers money. The only money saved is that not spent on network improvements companies can now delay by artificially reducing demand.

It’s having your cake and eating it too, and this is one expensive cake.

Comcast is selling broadband service for $40-50 that one research report found only costs them $8 a month to provide. That’s quite a markup, but it never seems to be enough. Now Comcast claims it is ditching its usage cap (it is not), raising usage allowances (by 50GB — four years after introducing a cap the company said it would regularly revisit), and testing a new Internet overlimit usage fee it literally stole from AT&T’s bean counters (a whopping $10 for an anti-granular 50GB).
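The arithmetic of that AT&T-style overlimit fee is worth spelling out. A minimal sketch, assuming a hypothetical $50 base rate, a 300GB allowance, and a $10 charge per 50GB block (figures as reported above; a customer who goes even 1GB over pays for a full block):

```python
import math

def monthly_bill(usage_gb, base=50.0, cap_gb=300, block_gb=50, block_fee=10.0):
    """Flat base rate plus a $10 fee for each 50GB block (or part) over the cap."""
    overage = max(0, usage_gb - cap_gb)
    blocks = math.ceil(overage / block_gb)  # partial blocks round up
    return base + blocks * block_fee

print(monthly_bill(280))  # under the cap: 50.0
print(monthly_bill(310))  # 10GB over still buys a full 50GB block: 60.0
print(monthly_bill(425))  # 125GB over rounds up to three blocks: 80.0
```

The rounding-up is the "anti-granular" part: the fee is not proportional to actual overage, so light transgressors subsidize heavy ones.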

In my life, all of the trials and experiments I have participated in have been voluntary. But the cable industry (outside of Time Warner Cable, for the moment) has a garlic-to-a-vampire reaction to the concept of "opting out," and customers are told they will participate and they'll like it. Pay for what you use! (At our inflated prices, with a usage limit that was not there yesterday and an overlimit fee for transgressors that is here today. Does not, under any circumstances, apply to our cable television service.)

No wonder Americans despise cable companies.

Michael Powell, former FCC chairman, is now the host and chief lobbyist for the National Cable & Telecommunications Association's Cable Show in Boston. (Photo courtesy: NCTA)

For some reason, Chairman Genachowski cannot grasp the pocket-picking potential usage billing offers an industry insatiable for enormous profits and facing little competition.

Should consumers be allowed to pay for broadband in different ways?  Sure. Must they be compelled into usage pricing schemes they want no part of? No, but that’s too far into the tall grass for the guy overseeing the FCC and the market players to demand.

Of course, we’ve been here and done this all before.

America’s dinosaur phone companies have been grappling with the mysterious concept of ‘flat-rate envy’ for more than 100 years, and they made billions from delivering it. While the propaganda department at the NCTA conflates broadband usage with water, gas, and electricity, they always avoid comparing broadband with its closest technological relative: the telephone. It gets hard to argue broadband is a precious, limited resource when your local phone company is pelting you with offers for unlimited local and long distance calling plans. Thankfully, a nuclear power plant or “clean coal” isn’t required to generate a high-powered dial tone, and telephone call tsunamis are rarely a problem for companies that upgraded networks long ago to keep up with demand. Long distance rates fell so far that per-minute charges are now as rare as a rotary dial phone.

In the 20th century, landline telephone companies grappled with how to price their service to consumers. Businesses paid “tariff” rates, which typically amounted to 7-10 cents per minute for phone calls. But residential customers, particularly those outside of the largest cities, were offered the opportunity to choose flat-rate local calling service. Customers were also offered measured rate services that either charged a flat rate per call or offered one or two tiers of calling allowances, above which consumers paid for each additional local call.

Consumers given the choice overwhelmingly picked flat-rate service, even in cases where their calling patterns proved they would save money with a measured rate plan.
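A toy comparison makes the paradox concrete. Assuming hypothetical rates (a $20 flat plan versus 7 cents per minute measured, the kind of figures cited above), a light caller pays more than double under flat rate:

```python
def flat_bill(minutes, rate=20.0):
    """Flat-rate plan: the bill never depends on usage."""
    return rate

def measured_bill(minutes, per_min=0.07):
    """Measured plan: the bill scales with minutes used."""
    return minutes * per_min

# A light caller at 120 minutes a month:
print(flat_bill(120))      # 20.0
print(measured_bill(120))  # 8.4 -- measured is far cheaper
```

Yet, as the studies below found, customers in this position still overwhelmingly chose the flat rate.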

"All you can eat" pricing is increasingly common with phone service, the closest cousin to broadband.

The concept baffled the economic intelligentsia who wondered why consumers would purposefully pay more for a service than they had to. A series of studies were commissioned to explore the psychology of flat-rate pricing, and the results were consistent: customers wanted the peace of mind a predictable price for service would deliver, and did not want to think twice about using a service out of fear it would increase their monthly bill.

In most cases, flat rate service has delivered a gold mine of profits for companies that offer it. It makes billing simple and delivers consistent financial results. But there occasionally comes a time when the economics of flat-rate service no longer make sense to the company or its shareholders. That typically happens when the costs to provide the service are increasing and the ability to raise flat rates to a new price point is constrained. Neither has been true in any respect for the cable broadband business, where costs to provide the service continue to decline on a per-customer basis and rates have continued to increase for consumers. The other warning sign is when economic projections show an even greater amount of revenue and profits can be earned by measuring and monetizing a service experiencing high growth in usage. Why leave money on the table, Wall Street asks.

That leaves us with companies that used to make plenty of profit charging $50 a month for flat rate broadband, now under pressure to still charge $50, but impose usage limits that reduce costs and set the stage for rapacious profit-taking when customers blow through their usage caps. It also delivers a useful fringe benefit by keeping high bandwidth content companies from entering the marketplace, as consumers fret about their impact on monthly usage allowances. Nothing eats a usage allowance like online video. Limit it and companies can also limit cable-TV cord-cutting.

Fabian Herweg and Konrad Mierendorff at the Department of Economics at the University of Zurich found the economics of flat rate pricing still work well for providers and customers, who clearly prefer unlimited-use pricing:

We developed a model of firm pricing and consumer choice, where consumers are loss averse and uncertain about their own future demand. We showed that loss-averse consumers are biased in favor of flat-rate contracts: a loss-averse consumer may prefer a flat-rate contract to a measured tariff before learning his preferences even though the expected consumption would be cheaper with the measured tariff than with the flat rate. Moreover, the optimal pricing strategy of a monopolistic supplier when consumers are loss averse is analyzed. The optimal two-part tariff is a flat-rate contract if marginal costs are low and if consumers value sufficiently the insurance provided by the flat-rate contract. A flat-rate contract insures a loss-averse consumer against fluctuations in his billing amounts and this insurance is particularly valuable when loss aversion is intense or demand is highly uncertain.

Applied to broadband, Herweg and Mierendorff’s conclusions fit almost perfectly:

  1. Consumers often do not understand the measurement units of broadband usage and do not want to learn them (gigabytes, megabytes, etc.)
  2. Consumers cannot predict a consistent level of usage, leading to wild fluctuations in billing under usage-based pricing;
  3. The peace of mind, or “insurance” factor, gives consumers a stable, predictable bill for service, which they prefer over fluctuating usage fees, even when measured billing would cost less than the flat rate;
  4. Flat rate works in an industry with stable or declining marginal costs. Incremental technology upgrades and falling broadband delivery costs offer the cable industry exceptional profits even at flat-rate prices.
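The intuition behind Herweg and Mierendorff's result can be sketched numerically. This is a toy simulation, not the paper's actual model: it assumes a hypothetical $45 flat rate, a 30-cents-per-GB metered price, uncertain monthly demand, and the classic Kahneman-Tversky loss-aversion coefficient of 2.25, under which bill overshoots hurt more than equal undershoots feel good:

```python
import random

random.seed(1)

LAMBDA = 2.25     # loss-aversion coefficient (assumed, per Kahneman-Tversky)
FLAT_RATE = 45.0  # hypothetical flat monthly price
PER_GB = 0.30     # hypothetical metered price per GB

# Uncertain monthly demand in GB, and the resulting metered bills
demand = [random.gauss(140, 40) for _ in range(10_000)]
bills = [max(0.0, d) * PER_GB for d in demand]
mean_bill = sum(bills) / len(bills)

# Gain-loss disutility: a bill above the expected level hurts LAMBDA
# times more than an equal shortfall below it feels good.
penalty = sum(LAMBDA * (b - mean_bill) if b > mean_bill else (b - mean_bill)
              for b in bills) / len(bills)

felt_cost_metered = mean_bill + penalty
print(f"expected metered bill:          {mean_bill:.2f}")
print(f"felt cost under loss aversion:  {felt_cost_metered:.2f}")
# The flat rate has zero billing variance, so its felt cost is just 45.00.
# Here the metered plan is cheaper on average, yet its felt cost exceeds
# the flat rate -- a loss-averse consumer rationally picks the flat plan.
```

With these assumed numbers the expected metered bill comes in below $45, but the loss-aversion penalty on billing fluctuations pushes the felt cost above it, which is exactly the "insurance" preference the paper formalizes.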

Time Warner Cable (for now) is proposing usage-based pricing as an option, while leaving flat rate broadband a choice on the service menu. But will it last?

Time Warner Cable (so far) is the only cable operator in the country that has announced a usage-based pricing experiment that it claims is completely optional, and will not impact the broadband rates of current flat rate customers. If this remains the case, the cable operator will have taken the first step to successfully duplicate the pricing model of traditional phone company calling plans, offering price-sensitive light users a measured usage plan and risk-averse customers a flat-rate plan. The unfortunate pressure and temptation to eliminate the flat rate pricing plan remains, however. Company CEO Glenn Britt routinely talks of favoring usage-based pricing, and Wall Street continues to pressure the company to exclusively adopt metered plans to increase profits.

Other cable operators compel customers to adopt plans that bundle speed tiers with usage limits, which often require a customer either to ration usage to avoid an overlimit fee or to pay for an expensive service upgrade with a more generous allowance. The result is customers stuck with plans they do not want that deliver little or no savings and often cost much more.

Why wouldn’t a company sell you a plan you want? Either because they cannot afford to or because they can make a lot more selling you something else. Guess which is true here?

Broadband will not remain an American success story if current industry plans to further monetize usage come to fruition. The United States is already falling behind in global broadband rankings. In fact, the countries that lived under congestion- and capacity-induced usage limits in the last decade are rapidly moving to discard them altogether, even as providers in this country seek to adopt them. That is an ominous sign for this country's lead role in online innovation. How will consumers embrace the tele-medicine, education, and entertainment services of the future when each eats away at their usage allowance?

Even worse, with no evidence of a broadband capacity problem in the United States, Mr. Genachowski’s apparent ignorance of the anti-competitive duopoly’s influence on pricing power is frankly disturbing. Why innovate prices down in a market where most Americans have just one or two choices for service? Economic theory tells us that in the absence of regulatory oversight or additional competition, prices have nowhere to go but up.

To believe otherwise is to consider your local cable operator the guardian angel of your wallet, and just about every American with a cable bill knows that is about as real as the tooth fairy.
