
Internet’s Biggest Frauds: Traffic Tsunamis and Usage-Based Pricing

Providers’ tall tales.

Year after year, equipment manufacturers and internet service providers trot out predictions of a storm surge of internet traffic threatening to overwhelm the internet as we know it. But growing evidence suggests such scare stories are more about lining the pockets of those predicting traffic tsunamis and the providers that use them to justify raising your internet bill.

This month Cisco, one of the country’s largest internet equipment suppliers, released its latest predictions of astounding internet traffic growth. The company is so confident its annual predictions of traffic deluges are real that it coined a term to describe them: the Zettabyte Era. (A zettabyte, for those who don’t know, is one sextillion bytes, perhaps more comfortably expressed as one trillion gigabytes.)

Cisco’s business thrives on scaring network engineers with predictions that customers will overwhelm their broadband networks unless they upgrade their equipment now, as in ‘right now!‘ In turn, the broadband industry’s bean counters find predictions of traffic explosions useful to justify revenue enhancers like usage caps, usage-based billing, and constant rate increases.

“As we make these and other investments, we periodically need to adjust prices due to increases [in] business costs,” wrote Comcast executive Sharon Powell in a letter defending a broad rate increase imposed on customers in Philadelphia late last year.

In 2015, as that cable company was expanding its usage caps to more markets, spokesman Charlie Douglas tried to justify the usage caps claiming, “When you have 10 percent of the customers consuming 50 percent of the network bandwidth, it’s only fair that those consumers should pay more.”

When Cisco released its 2017 predictions of internet traffic growth, it once again suggested a lot more data will need to be accommodated across America’s broadband and wireless networks. But broadband expert Dave Burstein has a good memory, built on his long involvement in the industry, and the data he saw from Cisco actually deflates the internet traffic panic and, more importantly, undermines provider arguments for higher-cost, usage-capped internet access.

“Peak Internet growth may have been a couple of years ago,” wrote Burstein. “For more than a decade, internet traffic went up ~40% every year. Cisco’s VNI, the most accurate numbers available, sees growth this year down to 27% on landlines and falling to 15-20% many places over the next few years. Mobile growth is staying higher — 40-50% worldwide. Fortunately, mobile technology is moving even faster. With today’s level of [provider investments], LTE networks can increase capacity 10x to 15x.”

According to Burstein, Cisco estimates mobile traffic in the U.S. and Canada at 4,525 petabytes in 2020 and 5,883 petabytes in 2021, a 30% growth rate. Cisco puts total consumer traffic in the U.S. and Canada at 48,224 petabytes in 2020 and 56,470 petabytes in 2021, a 17% growth rate, with growth much lower on wired networks.
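A quick sanity check of the percentages Burstein cites, computed directly from the Cisco VNI figures quoted above:

```python
# Recompute the growth rates cited above from Cisco's VNI figures
# (U.S./Canada traffic, in petabytes, as reported by Burstein).
mobile_2020, mobile_2021 = 4525, 5883        # mobile traffic
consumer_2020, consumer_2021 = 48224, 56470  # total consumer traffic

mobile_growth = (mobile_2021 / mobile_2020 - 1) * 100
consumer_growth = (consumer_2021 / consumer_2020 - 1) * 100

print(f"Mobile growth:   {mobile_growth:.0f}%")    # 30%
print(f"Consumer growth: {consumer_growth:.0f}%")  # 17%
```

Both figures match the rates quoted, and both sit far below the ~40% annual growth the industry long treated as normal.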

Burstein’s findings are in agreement with those of Professor Andrew Odlyzko, who has debunked “exaflood/data tsunami” scare stories for over a decade.

“[The] growth rate has been decreasing for almost two decades,” Odlyzko wrote in a 2016 paper published in IPSI BgD Transactions. “Even the growth rate in wireless data, which was extremely high in the last few years, shows clear signs of a decline. There is still rapid growth, but it is simply not at the rates observed earlier, or hoped for by many promoters of new technologies and business methods.”


The growth slowdown, according to Odlyzko, actually began all the way back in 1997, providing the first warning that the dot-com bubble of the time was preparing to burst. He argued the data models used by equipment manufacturers and the broadband industry to measure growth have been flawed for a long time.

When new internet trends became popular, assumptions were made about what impact they would have, but few models accurately predicted whether those trends would remain a major factor for internet traffic over the long-term.

Peer-to-peer file sharing, one of the first technologies Comcast attempted to use as a justification for its original 250GB usage cap, is now considered almost a footnote among the applications that have a profound impact on internet traffic today. Video game play, also occasionally mentioned as a justification for usage caps or network management like speed throttling, was hardly ever a major factor in traffic slowdowns, and most games today exchange player actions using the smallest amount of traffic possible to keep games fast and responsive. In fact, the biggest impact video games have on the internet is the size of the downloads required to acquire and update them.

Odlyzko also debunked alarmist predictions of traffic overloads coming from the two newest and largest traffic contributors of the period 2001-2010 — cloud backups and online video.


“Actual traffic trends falsified this conjecture, as the first decade of the 21st century witnessed a substantial [traffic growth rate] slowdown,” said Odlyzko. “The frequent predictions about ‘exafloods’ overwhelming the networks that were frequent a decade ago have simply not come to pass. At the 20 to 30% per year growth rates that are observed today in industrialized countries, technology is advancing faster than demand, so there is no need for increasing the volume of investments, or for the fine-grained traffic control schemes that are beloved by industry managers as well as researchers.”

That’s a hard pill to swallow for companies that manufacture equipment designed to “manage,” throttle, cap, and charge customers based on their overuse of the internet. It also gives fits to industry executives, lobbyists, and the well-paid public policy researchers who produce on-spec studies and reports attempting to justify such schemes. But the numbers don’t lie, even if the industry does.

Although a lot of growth measured these days comes from wireless networks, they are not immune to growth slowdowns either. The arrival of the smartphone was hailed by wireless companies and Wall Street as a rocket engine to propel wireless revenue sky high. Company presidents even based part of their business plans on revenue earned from monetizing data usage allegedly to pay for spectrum acquisitions and upgrades.


Verizon CEO Lowell McAdam told investors as recently as a year ago that “unlimited data” could never work on Verizon Wireless again.

“With unlimited, it’s the physics that breaks it,” he said. “If you allow unlimited usage, you just run out of gas.”

The laws of physics must have changed this year when Verizon reintroduced unlimited data for its wireless customers.

John Walls, then vice president of public affairs for CTIA, the wireless industry’s top lobbying group, argued back in 2010 that AT&T’s decision to establish pricing tiers was a legitimate way for carriers to manage the ‘explosive growth in data usage.’ Walls complained the FCC was taking too long to free up critically needed wireless spectrum, so carriers needed “other tools” to manage their networks.

“This is one of the measures that carriers are considering to make sure everyone has a fair and equal experience,” Walls said, forgetting to mention the wireless industry was cashing in on wireless data revenue, which increased from $8.5 billion annually in 2005 to $41.5 billion in 2009, and Wall Street was demanding more.

“There were again many cries about unsustainable trends, and demands for more spectrum (even though the most ambitious conceivable re-allocation of spectrum would have at most doubled the cellular bands, which would have accommodated only a year of the projected 100+% annual growth),” Odlyzko noted.

What the industry and Wall Street did not fully account for is that their economic models and pricing had the effect of modifying consumer behavior and changed internet traffic growth rates. Odlyzko cites the end of unlimited data plans and the introduction of “tight data caps” as an obvious factor in slowing down wireless traffic growth.

“But there were probably other significant ones,” Odlyzko wrote. “For example, mobile devices have to cope not just with limited transmission capacity, but also with small screens, battery
limits, and the like. This may have led to changes of behavior not just of users, but also of app developers. They likely have been working on services that can function well with modest
bandwidth.”

“U.S. wireless data traffic, which more than doubled from 2012 to 2013, increased just 26% from 2013 to 2014,” Odlyzko reported. “This was a surprise to many observers, especially since there is still more than 10 times as much wireline Internet traffic as wireless Internet traffic.”

Many believe that was around the same time smartphones achieved peak penetration in the marketplace. Virtually everyone who wanted a smartphone had one by 2014, and as a result of fewer first-time users on their networks, data traffic growth slowed. At the same time, some Wall Street analysts also began to worry the companies were reaching peak revenue per user, meaning there was nothing significant to sell wireless customers that they didn’t already have. At that point, future revenue growth would come primarily from rate increases and poaching customers from competitors. Or, as some providers hoped, further monetizing data usage.

The Net Neutrality debate has kept most companies from “innovating” with internet traffic “fast lanes” and other monetization schemes out of fear of stoking political blowback. Wireless companies could make significant revenue selling customers performance boosters like higher-priority access on a cell tower, or exemption from a speed throttle that compromises video quality. But until providers have a better idea whether the current administration’s efforts to neuter Net Neutrality will succeed, some have satisfied themselves with zero rating schemes and bundling that offer customers content without data caps or usage billing, or access to discounted packages of TV services like DirecTV Now.

Verizon is also betting its millions that “content is king” and the next generation of revenue enhancers will come from owning and distributing exclusive video content it can offer its customers.

Odlyzko believes providers are repeating an old mistake by stubbornly insisting on acquiring content, or at least charging content providers for streaming content across their networks. That debate began more than a decade ago when then-SBC/AT&T CEO Edward Whitacre Jr. insisted content companies like Google were not going to use AT&T’s “pipes for free.”

“Much of the current preoccupation of telecom service providers with content can be explained away as following historical precedents, succumbing to the glamour of ‘content,'” Odlyzko wrote. “But there is likely another pressing reason that applies today. With connection speeds growing, and the ability to charge according to the value of traffic being constrained either directly by laws and regulations, or the fear of such, the industry is in a desperate search for ways not to be a ‘dumb pipe.'”

AT&T and Verizon: The Doublemint Twins of Wireless

A number of Wall Street analysts also fear common carrier telecom companies are a revenue growth ‘dead-end,’ offering up a commodity service about as exciting as electricity. Customers given a choice between AT&T, Verizon, Sprint, or T-Mobile need something to differentiate one network from another. Verizon Wireless claims it has a best-in-class LTE network with solid rural coverage. AT&T offers bundling opportunities with its home broadband and DirecTV satellite service. Sprint is opting to be the low-price leader, and T-Mobile keeps its customers with a network that outperforms expectations, pitching constant promotions and giveaways to customers who crave constant gratification and change.

The theory goes that acquiring video content will drive data usage revenue, further differentiate providers, and keep customers from switching to a competitor. But Odlyzko predicts these acquisitions and offerings will ultimately fail to make much difference.

“Dumb pipes [are] precisely what society needs,” Odlyzko claims, and in his view it is the telecom industry alone that has the “non-trivial skills” required to provide ubiquitous, reliable broadband. The industry also ignores the utility-like built-in advantage of owning pre-existing wireline and wireless networks. The amortized cost of network infrastructure, often built decades ago, offers natural protection from marketplace disruptors, which likely lack the fortitude to spend the billions of dollars required to invade markets with newly constructed networks of their own.

Odlyzko is also critical of the industry’s ongoing failure of imagination.

Stop the Cap! calls this the industry’s “broadband scarcity” business model. It is predicated on the idea that broadband is a limited resource that must be carefully managed and, in some cases, metered. Companies like Cox and Comcast now usage-cap their customers and deter them from exceeding their allowance with overlimit penalties. AT&T imposes usage caps as well, but strictly enforces them only for its legacy DSL customers. Charter Communications sells Spectrum customers on the idea of a one-size-fits-all, faster broadband option, but then repels those looking to upgrade to even faster speeds with an indefensible $200 upgrade fee.

Rationing Your Internet Experience?

“The fixation with video means the telecom industry is concentrating too much on limiting user traffic,” Odlyzko writes. “In many ways, the danger for the industry, especially in the wireline arena, is from too little traffic, not too much. The many debates as to whether users really need 100Mbps connections, much less 1Gbps ones, reveal lack of appreciation that burst capability is the main function of modern telecom, serving human impatience. Although pre-recorded video dominates in the volume of traffic, the future of the Net is likely to be bursts of traffic coming from cascades of interactions between computers reacting to human demands.”

Burstein agrees.

“The problem for most large carriers is that they can’t sell the capacity they have, not that they can’t keep up,” he writes. “The current surge in 5G millimeter wave [talk] is not because the technology will be required to meet demand. Rather, it is inspired by costs coming down so fast the 5G networks will be a cheaper way to deliver the bits. In addition, Verizon sees a large opportunity to replace cable and other landlines.”

On the subject of cost and broadband economics, Burstein sees almost nothing to justify broadband rate hikes or traffic management measures like usage caps or speed throttling.

“Bandwidth cost per month per subscriber will continue flat to down,” Burstein notes. “For large carriers, that’s been about $1/month [per customer] since ~2003. Moore’s Law has been reducing equipment costs at a similar rate.”

“Cisco notes people are watching more TV over the net in evening prime time, so demand in those hours is going up somewhat faster than the daily average,” he adds. “This could be costly – networks have to be sized for highest demand – but is somewhat offset by the growth of content delivery networks (CDN), like Akamai and Netflix. (Google, YouTube, and increasingly Microsoft and Facebook have built their own.) CDNs eliminate the carrier cost of transit and backhaul. They deliver the bits to the appropriate segment of the carrier network, reducing network costs.”

Both experts agree there is no evidence of internet traffic jams; routine upgrades, carried out as a normal course of doing business, remain sufficient and do not justify some of the price and policy changes wired and wireless providers are seeking.

But Wall Street doesn’t agree. Analysts like New Street Research’s Jonathan Chaplin believe broadband prices should rise because, with a lack of competition, nothing stops cable companies from collecting more money from subscribers. He isn’t concerned with network traffic growth, just revenue growth.

“As the primary source of value to households shifts increasingly from pay-TV to broadband, we would expect the cable companies to reflect more of the annual rate increases they push through on their bundles to be reflected in broadband than in the past,” Chaplin wrote to investors. Comcast apparently was listening, because Chaplin noticed it priced standalone broadband at a premium $85 for its flagship product, which is $20 more than Comcast’s non-promotional rate for customers choosing a TV-internet bundle.

“Our analysis suggests that broadband as a product is underpriced,” Chaplin wrote. “Our work suggests that cable companies have room to take up broadband pricing significantly and we believe regulators should not oppose the re-pricing. The companies will undoubtedly have to take pay-TV pricing down to help ‘fund’ the price increase for broadband, but this is a good thing for the business. Post re-pricing, [online video] competition would cease to be a threat and the companies would grow revenue and free cash flow at a far faster rate than they would otherwise.”

Cox Introducing $50 Option to Waive Data Caps: The ‘Freedom from Extortion Plan’

Phillip Dampier | August 14, 2017

As Cox Communications continues to expand its arbitrary data cap program on its broadband customers, the company has announced a ‘cap relief’ option for customers willing to pay $50 more for the same service they enjoyed last year without a data cap.

Company insiders tell DSL Reports that Cox will introduce a new $50 option to avoid the data caps and overlimit fees the company began imposing in 2015, starting in its Cleveland, Ohio service area.

On Wednesday, Cox is expected to introduce two add-on options to help avoid the bill shock likely if customers exceed 1TB of usage per month and face the $10 overlimit fee for each 50GB of data consumed:

  • $30 a month for 500GB of extra data;
  • $50 a month to avoid data caps altogether and get back unlimited service.
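For illustration, here is how the overlimit arithmetic described above plays out for a hypothetical customer using 1.5TB in a month (the usage figure is an assumption chosen for illustration; the prices are those reported above):

```python
# Compare Cox's three options for a hypothetical month of 1.5 TB (1,500 GB)
# of usage against the 1 TB (1,000 GB) cap, using the reported pricing.
import math

usage_gb = 1500   # hypothetical heavy user
cap_gb = 1000     # Cox's 1 TB allowance

overage_gb = max(0, usage_gb - cap_gb)
pay_as_you_go = math.ceil(overage_gb / 50) * 10  # $10 per 50 GB block
extra_500gb_addon = 30                           # flat fee for 500 GB more
unlimited_addon = 50                             # flat fee, no cap at all

print(f"Pay-as-you-go overage fees: ${pay_as_you_go}")  # $100
print(f"500 GB add-on:              ${extra_500gb_addon}")
print(f"Unlimited add-on:           ${unlimited_addon}")
```

At this usage level, the per-block overlimit fees reach $100, which is exactly the kind of bill shock that makes a $30 or $50 add-on look like relief, on top of the regular broadband bill.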

Cox customers in Cleveland were unimpressed with Cox’s data caps when they were introduced in 2015.

These fees are in addition to whatever Cox customers currently pay for broadband service.

“An overwhelming majority of data is consumed by a very small percentage of internet users,” a memo to employees documenting the changes reads. “The new choices are great options for the small percentage of heavy users who routinely use 1TB+ per month and prefer a flat monthly rate, rather than purchasing additional data blocks. In Cox markets with usage-based billing, the less than two percent of customers who exceed the amount of data included in their plan still have the option of paying $10 for each additional 50GB of data when they need it.”

Such claims raise the same questions Stop the Cap! has always asked since we began fighting data caps in 2008:

If data caps only impact <2% of customers, why impose them at all?

Is the actual revenue earned from overlimit fees worth the expense of introducing usage measurement tools, billing system changes, and the cost of customer dissatisfaction at the prospect of an unexpectedly high bill?

For what technical reasons did Cox choose 1TB as its arbitrary usage allowance, other than the fact that Comcast and other operators chose this level first?

Time Warner Cable executives privately admitted in internal company documents obtained by the New York Attorney General’s office that internet traffic costs represent little more than “a rounding error” in expenses for cable companies. But for most consumers, $30-50 to buy a bigger data allowance is hardly that.

In short, the “solution” Cox has decided on this week comes in response to a problem the company itself created — imposing arbitrary, unwanted data caps and overlimit fees on a product that is already intensely profitable at the prices Cox has charged for years. This internet overcharging scheme is just another way to gouge captive customers that will likely have only one alternative — the phone company and its various flavors of DSL or a U-verse product that cannot compete on speed unless you are lucky enough to live in a fiber-to-the-home service area.

Net Neutrality: A Taste of Preferential Fast Lanes of Web Traffic in India

Unclear and unenforced Net Neutrality rules in India offer a cautionary tale to U.S. internet users, who could soon find Net Neutrality guarantees replaced with industry-written rules filled with loopholes, or with no Net Neutrality protections at all.

As India considers stronger enforcement of Net Neutrality protection, broadband providers have been merrily violating current Net Neutrality guidelines with fast lanes, sometimes advertised openly. Many of those ISPs are depending on obfuscation and grey areas to effectively give their preferred partners a leg up on the competition while claiming they are not giving them preferential treatment.

Medianama notes Ortel advertises two different internet speeds for its customers – one for regular internet traffic and the other for preferred partner websites cached by Ortel inside its network. The result is that preferred websites load 10-40 times faster than regular internet traffic.

Ortel’s vice president of broadband business, Jiji John, said Ortel is not violating Net Neutrality.

“Cache concept is totally based on the Internet user’s browsing. ISP does not control the contents and it has nothing to do with Net Neutrality,” John said in a statement.

Critics contend ISPs like Ortel may not control the contents of websites, but they do control which websites are cached and which are not.

Alliance Broadband, a West Bengal-based internet provider, goes a step further, advertising higher speeds for Hotstar (a legal streaming platform), Google, and popular movie, TV, and software torrents, which arrive at speeds 3-12Mbps faster than the rest of the internet. Alliance also establishes a reserved lane for each service, meaning that regardless of what else one does with an internet connection, Hotstar content will arrive at 8Mbps, torrents at 12Mbps, and the rest of the internet at 5Mbps, all concurrently. Customers can therefore reach up to 25Mbps when combining traffic from the three sources, even if they subscribe to a much slower tier.

Alliance Broadband’s rate card. Could your ISP be next?

Which services are deemed “preferred” is up to the ISP. While Alliance may favor Google, Wishnet in West Bengal offers up preferential speeds for YouTube videos.

The ISPs claim these faster speeds result from “peering” with those websites on their own internal networks, reducing traffic slowdowns and delays. In some cases, ISPs store the most popular content on their own servers, where it can be delivered to customers more rapidly. This alone does not violate Net Neutrality, but when an ISP reserves bandwidth for a preferred partner’s website or application, that can come at the expense of websites that do not have such an arrangement. Some ISPs have sought to devote extra bandwidth to those reserved lanes so they do not appear to impact other traffic, but that still gives preferential treatment to some over others.

Remarkably, Indian ISPs frequently give preferential treatment to peer-to-peer services that routinely flout copyright laws while leaving legal streaming services other than Hotstar in the slow lane, encouraging copyright theft.

American ISPs have already volunteered not to block or directly impede the traffic of websites, but this may not go far enough to prevent the kinds of clever preferential runarounds ISPs can engineer where Net Neutrality is in place but isn’t well defined or enforced.

Pondering the Future of AT&T’s Dead-Brand Walking U-verse, DirecTV, and Data Caps

With the advent of DirecTV Now, AT&T’s new over-the-top streaming TV service launching later this year, AT&T is preparing to bury the U-verse brand.

Earlier this year, AT&T customers noticed a profound shift in the company’s marketing priorities. The phone company began steering potential customers to AT&T’s latest acquisition, satellite television provider DirecTV, instead of U-verse. There is an obvious reason for this – DirecTV has 20.45 million customers as of the second quarter of 2016 compared to 4.87 million customers for AT&T U-verse TV. Volume discounts make all the difference for pay television companies and AT&T hopes to capitalize on DirecTV’s lower programming costs.

AT&T’s buyout of DirecTV confused many Wall Street analysts, some of whom believe satellite television is past its peak. Satellite providers lack the ability to bundle services, although some phone companies partner with the satellite company to pitch phone, broadband, and satellite TV to their customers. But consider for a moment what would happen if DirecTV introduced satellite television without the need for a satellite dish.

Phillip Dampier: The “U” in U-verse doesn’t stand for “unlimited.”

AT&T’s DirecTV Now will rely on the internet to deliver television channels instead of a satellite. AT&T is currently negotiating with most of the programmer conglomerates that own popular cable channels to allow them to be carried “over-the-top” through broadband connections. If successful, DirecTV Now could become a nationwide powerhouse alternative to traditional cable TV.

AT&T is clearly considering a potential future where DirecTV could dispense with satellites and rely on broadband instead. The company quietly began zero rating DirecTV streaming in September for AT&T Mobility customers, which means watching that programming will not count against your data plan. For current U-verse customers, broadband speeds have always been constrained by the need to reserve large amounts of bandwidth to manage television viewing. Although AT&T has been boosting speeds in selected areas, a more fundamental speed boost could be achieved if AT&T dropped U-verse television and turned the service into a simple broadband pipe that relied on DirecTV Now to manage television service for customers.

AT&T seems well on the way, adding this notice to customer bills:

“To make it simpler for our customers U-verse High Speed Internet and U-verse Voice services have new names: AT&T Internet and AT&T Phone. AT&T Internet product names will now align with our Internet speed tiers. Our voice plan names will remain the same.”

An earlier internal company memo suggested AT&T would eventually transition all of its TV products into “AT&T Entertainment” after completing a transition to its “next generation TV platform.” Increasingly, that platform seems to be an internet-powered streaming solution and not U-verse or DirecTV satellite. That transition should begin in January.


It would represent a formidable change, but one that makes sense for AT&T’s investors. The transition to IP networks means providers will offer one giant broadband pipe, across which television, phone and internet access will travel. The bigger that pipe becomes, the more services customers are likely to use — and that means growing data usage. Having a lot of fiber infrastructure also lays the foundation for expansion of AT&T’s wireless network — particularly towards 5G service, which is expected to rely on small cell technology to offer faster speeds to a more localized area — fast enough to serve as a home broadband replacement. Powering that network will require plenty of fiber optics to provide backhaul access to those small cells.

Last week, AT&T announced it launched a trial 100Mbps service using point-to-point millimeter-wave spectrum to offer broadband to subscribers in multiple apartment complexes around the Minneapolis area. If the initial trial is successful, AT&T will boost speeds to include 500Mbps service to those same complexes. AT&T has chosen to provide the service outside of its usual service area — Minneapolis is served by CenturyLink. AT&T acquired a nationwide license to offer service in the 70-80GHz band back in 2009, and an AT&T spokesperson claimed the wireless signal can reach up to two miles. The company is also experimenting with new broadband over power lines technology that could offer service in rural areas.

Just like its wireless service, AT&T stands to make money not just by selling access to broadband and entertainment, but also by metering customer usage to monetize every aspect of how customers communicate. Getting customers used to the idea of having their consumption measured and billed could gradually eliminate the expectation of flat rate service, at which point customers can be manipulated into spending even more to access the same services that now cost providers less than ever to deliver. Even zero rating helps drive a belief that the provider is doing the customer a favor by waiving data charges for certain content, delivering a perception of value made possible by first overcharging for data and then giving the customer “a break.”

As of mid-September, streaming media analyst Dan Rayburn noted that Akamai, a major content delivery network, was selling content delivery contracts at $0.002 per gigabyte delivered, the lowest price Rayburn has ever seen. Other bids Rayburn has reviewed recently topped out at 0.5 cents per gigabyte. According to industry expert Dave Burstein, that suggests large ISPs like AT&T are paying something less than a penny per gigabyte for internet traffic.

“If you use 139GB a month, that costs your provider something like $1/month,” Burstein wrote, noting that doubling backbone transit costs gives a rough estimate of the total cost to the carrier, which also has to carry the bits to your local exchange. In this context, telecom services like broadband and phone service should be decreasing in cost, not increasing. But the opposite is true. Large providers with usage caps expect to be compensated many times over, charging $10 per 50GB in overlimit fees while their true cost is well under 50 cents. Customers buying a cell phone are often fitted with a data plan that represents an unprecedented markup. The extent of the price increases customers can expect can be previewed by looking at the cost of phone service over the last 20 years. The average, often flat rate telephone bill in 1995 was $19.98 a month. In 2014, it was $73 a month. In 2015, it was $90 a month. Those dramatically rising prices in the last few years are mostly the result of the increased cost of the data plans providers charge to cash in on customers’ growing data usage.
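A rough sketch of that markup, using the figures cited above (transit at roughly half a cent per gigabyte at the high end, doubled per Burstein's rule of thumb to approximate the carrier's full delivery cost):

```python
# Estimate the markup on a $10-per-50GB overlimit fee, using the
# transit prices cited above and Burstein's doubling rule of thumb.
transit_cost_per_gb = 0.005                    # high end of the bids cited
carrier_cost_per_gb = transit_cost_per_gb * 2  # rough full cost to carrier

block_gb = 50
overlimit_fee = 10.0                           # $10 per 50 GB block
carrier_cost_per_block = carrier_cost_per_gb * block_gb

markup = overlimit_fee / carrier_cost_per_block
print(f"Carrier cost per 50 GB block: ${carrier_cost_per_block:.2f}")  # $0.50
print(f"Markup on the $10 fee:        {markup:.0f}x")                  # 20x
```

Even with these deliberately generous cost assumptions, the overlimit fee runs about twenty times the carrier's underlying cost, consistent with the “well under 50 cents” figure above.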

Both Comcast and AT&T are dedicated to a campaign of getting customers to forget about flat rate, unlimited service at a reasonable cost. Even as both companies raise usage caps, they continue to raise prices as well, even as their costs to provide the service continue to drop. Both companies hope to eventually create the kind of profitable windfall with wired services that wireless providers like AT&T and Verizon Wireless have enjoyed for years since they abandoned unlimited flat rate plans. Without significant new competition, the effective duopoly most Americans have for telecommunications services offers the opportunity to create a new, more costly (and false) paradigm for telecom services, based on three completely false claims:

  • data costs are expensive,
  • usage must be monetized, and
  • without a bigger return on investment, investors will not finance the next generation of telecom upgrades.

But as the evidence clearly shows, profits from selling high-speed internet access are only growing, even as costs are falling. Much of the drag on profits comes from the increasing costs of licensing television content. Voice over IP telephone service is almost an afterthought for most cable and phone companies, often thrown in for $10-20 a month.

AT&T’s transition puts all the attention and its quest for fatter profits on its broadband service. That’s a bad deal for AT&T customers no matter what the company calls its “next generation” network.

Slow Broadband = Low Usage, Finds New Study

Phillip Dampier | June 15, 2016

How much you use the Internet is often a matter of how fast your broadband connection is, according to a new study.

King’s College London researchers found a clear correlation between bad broadband and low usage rates, as customers avoided high bandwidth apps like online video because they were frustrating or impossible to use. One analyst said the findings show rural areas are being “deprived of the full benefits of broadband.”

One of Britain’s most used apps is the BBC iPlayer, which streams live and on-demand programming from multiple BBC radio and television networks. It is a well-known bandwidth consumer, using a significant proportion of a customer’s broadband connection to deliver up to HD-quality video streams. The study found users in South Ayrshire, Ards, the Isle of Wight, the East Riding of Yorkshire, North Down and Midlothian were among the areas where people used iPlayer the least. It wasn’t because they didn’t want to. Those areas were identified by Ofcom, the British telecom regulator, as receiving some of the worst Internet speeds in the UK. Conversely, areas with robust broadband, including London, south Gloucestershire and Bristol, showed above average usage.


“It is clear that high-speed broadband is an important factor in the use of bandwidth-intensive applications such as BBC iPlayer,” said Dr. Nishanth Sastry, a senior lecturer at King’s College London and the lead researcher. “With technological advancements, it is likely that more services important to daily life will move online, yet there is a significant proportion of the population with inadequate broadband connections who won’t be able to access such services.”

Ian Watt, a telecommunications consultant with the analyst firm Ovum, said broadband speeds must rise to ensure users can watch HD video while simultaneously sharing their internet connection with other members of the household.

“Recent Ovum research indicated a speed of 25Mbps was an appropriate target access speed to provide a high quality experience for video services,” Watt said. In the United States, 25Mbps is the current minimum speed to qualify as broadband, according to the most recent FCC definition.

The findings may also explain why U.S. broadband providers only capable of delivering relatively low-speed Internet access report lower average usage than those capable of providing service at or above 25Mbps. Those offering the fastest speeds are also the most likely to attract higher volumes of Internet traffic as customers take advantage of those speeds.
