
An AT&T Emergency Generator Left On for Weeks Drives San Jose Family Out of Their Home

Phillip Dampier July 8, 2015 AT&T, Consumer News, HissyFitWatch, Video
U-verse cabinets often make the evening news when they are plunked down in your front yard, as this report from North Carolina shows. With statewide video franchise laws, you and your local community leaders no longer have a say.

A drunk driver who managed to take out one of AT&T’s “lawn refrigerators” powering its U-verse service in San Jose was the start of a four-week nightmare for a family driven from their home by a loud, polluting emergency generator the phone company left running 24 hours a day.

AT&T responded to the accident scene after half the neighborhood lost service. Technicians installed a replacement green lawn box and fired up an emergency generator to restore service until Pacific Gas & Electric could arrive to hook up regular power to the AT&T box.

And then nothing happened… for weeks.

Emily White’s home on New Jersey Ave was treated to nearly a month of continuous generator noise and fumes that made staying in the house impossible.

“We could smell the exhaust in our house and the noise was just endless and loud,” White told KGO-TV. “It vibrated the windows, we couldn’t use our backyard, we went away on the weekends just to get away from it.”

The family ended up canceling their Father’s Day barbecue and left for an area hotel, regularly calling AT&T to get it to deal with the generator, but received no response. As it turned out, they may have been complaining to the wrong company.

Nearly a month after the accident, PG&E trucks arrived to finally restore power to AT&T’s equipment. The utility also assumed full responsibility for the delay.

“We could have and should have done better by this customer. We want to do a deeper dive into why the work took so long,” PG&E spokesperson Nicole Liebelt said.

The electric company is also picking up the cost of the family’s hotel stay.

KGO-TV reports PG&E may have been responsible for leaving an AT&T emergency generator up and running for nearly a month. (2:11)

Approval of AT&T-DirecTV Merger Expected Next Week

Phillip Dampier July 2, 2015 AT&T, Competition, Consumer News, DirecTV, Public Policy & Gov't, Rural Broadband, Wireless Broadband
The headquarters building of U.S. satellite TV operator DirecTV is seen in Los Angeles, California May 18, 2014. REUTERS/Jonathan Alcorn


WASHINGTON (Reuters) – AT&T Inc’s proposed $48.5 billion acquisition of DirecTV is expected to get U.S. regulatory approval as soon as next week, according to people familiar with the matter, a decision that will combine the country’s No. 2 wireless carrier with the largest satellite-TV provider.

The Department of Justice, which assesses whether deals violate antitrust law, has completed its review of the merger and is waiting on the Federal Communications Commission to wrap up its own, according to three people familiar with the matter.

The FCC, which reviews whether deals are in the public interest, is poised to approve the deal with conditions as early as next week, according to three other people familiar with the matter.

All the sources asked not to be named because they were not authorized to speak with the media. An AT&T spokeswoman and FCC spokesman declined comment. Justice Department representatives were not immediately available for comment.

AT&T’s merger with DirecTV, announced in May 2014, would create the country’s largest pay-TV company, giving DirecTV a broadband product and AT&T new avenues of growth beyond the maturing and increasingly competitive wireless service.

The deal has been expected to pass regulatory muster in contrast with the rival mega-merger between cable and Internet providers Comcast and Time Warner Cable, which was rejected in April largely over the combined companies’ reach into the broadband market.

The FCC and AT&T have been in negotiations over conditions for the merger for several weeks, the people said, adding that none of the conditions are controversial enough to break the deal.

Those conditions are expected to include assurances that both middle-class and low-income Americans have access to affordable high-speed Internet, including an offering of broadband subscriptions as a standalone service without a TV bundle, according to two of the people.

AT&T earlier committed to expand access to broadband service in rural areas and to offer standalone Internet service at speeds of at least 6 Megabits per second to ensure consumers can access rival video services online, such as Netflix.

FCC officials are also considering ways to ensure that the conditions are properly enforced in the future, possibly through a third-party monitor, according to the two sources.

The FCC is also weighing how to ensure the merged companies abide by the so-called net neutrality rules, which regulate how Internet service providers manage traffic on their networks.

AT&T has promised to abide by net neutrality principles such as no-blocking of traffic, but is challenging in court the FCC’s newest net neutrality regulations that have expanded the agency’s authority over various deals between Internet providers and content companies.

FCC reviewers are weighing what net neutrality-related conditions to apply to the merger and how to address the possibility that the court throws out the latest rules, the two sources said.

Reported by: Alina Selyukh and Diane Bartz

Big City Telecom Infrastructure is Often Ancient: Conduits 70+ Years Old, Wiring from 1960s-1980s

A panel electromechanical switch similar to those in use in New York until the 1970s. They were installed in the 1920s.

As late as the 1970s, New York Telephone (today Verizon) was still maintaining electromechanical panel switches in its telephone exchanges that were developed in the middle of World War I and installed in Manhattan between 1922-1930. Reliance on infrastructure 40-50 years old is nothing new for telephone companies across North America. A Verizon technician in New York City is just as likely to descend into tunnels constructed well before they were born as is a Bell technician in Toronto.

Slightly marring last week’s ambitious announcement that Bell (Canada) would begin upgrading to fiber-to-the-home service across the Greater Toronto Area was word from a frank Bell technician in attendance, who predicted Bell’s plans were likely to run into problems as workers deal with aging copper infrastructure originally installed by their fathers and grandfathers decades earlier.

The technician said some of the underground conduits he was working in just weeks earlier in Toronto’s downtown core were “easily 60-70 years old” and the existing optical fiber cables running through some of them were installed in the mid-1980s.

At least that conduit contained fiber. In many other cities, copper infrastructure from the 1960s-1980s is still in service, performing unevenly in some cases and not much at all in others.

Earlier this year, several hundred Verizon customers were without telephone service for weeks because of water intrusion into copper telephone cables, possibly amplified by the corrosive road salt dumped on New York streets to combat a severe winter. Verizon’s copper was down and out while its fiber optic network was unaffected. On the west coast, AT&T deals with similar outages caused by flooding. If that doesn’t affect service, copper theft might.


Fiber optic cable

Telephone companies fight to get their money’s worth from infrastructure, no matter how old it is. Western Electric first envisioned the panel switch back in 1916, yet the switches remained in use in New York City telephone exchanges until the end of the Carter administration. It was all part of AT&T’s revolutionary plan to move to subscriber-dialed calls, ending an era of asking an operator to connect you to another customer.

AT&T engineer W.G. Blauvelt wrote the plan that moved New York to fully automatic dialing. By 1930, every telephone exchange in Manhattan was served by a panel switch that allowed customers to dial numbers by themselves. But Blauvelt could not have envisioned that equipment would still be in use fifty years later.

As demand for telephones grew, the phone company did not expand its network of panel switches, which were huge (occupying entire buildings), loud, and very costly to maintain. It did not replace them either. Instead, newer exchanges got the latest equipment, starting with the more modern Crossbar #1 switches in 1938. In the 1950s, Crossbar #5 arrived and became a hit worldwide. Crossbar #5 switches usually stood alone or worked alongside older switching equipment in fast-growing exchanges. They occupied less space, worked well without obsessive maintenance, and were reliable.

It was not until the 1970s that the Bell System decided to completely scrap its electromechanical switches in favor of newer electronic technology. The advantages were obvious: the newer equipment occupied a fraction of the space and had considerably more capacity than the older switches. That became critical in New York starting in the late 1960s, when customer demand for additional phone lines exploded. New York Telephone simply could not keep up, and waiting lists often grew to weeks as technicians looked for spare capacity. The Bell System’s answer to this growth was a new generation of electronic switches.

The #1 ESS was an analog electronic switch first introduced in New Jersey in 1965. Although it worked fine in smaller and medium-sized communities, the switch’s software bugs were notorious when traffic on the exchange reached peak loads. It was clear to New York Telephone the #1 ESS was not ready for Manhattan until the bugs were squashed.

Bell companies, along with some independent phone companies that depended on the same equipment, moved cautiously to begin upgrades. It would take North American phone companies until August 2001 to retire what was reportedly the last electromechanical switch, serving the small community of Nantes, Quebec.


A notorious 1975 fire destroyed a phone exchange serving lower Manhattan. That was one way to guarantee an upgrade from New York Telephone.

On rare occasions, phone companies didn’t have much of a choice. The most notorious example was the Feb. 27, 1975 fire in the telephone exchange located at 204 Second Avenue and East 13th Street in New York. The five-alarm fire destroyed the switching equipment and knocked out telephone service for 173,000 customers before 700 firefighters from 72 fire units managed to put it out more than 16 hours later. That fire is still memorialized today by New York firefighters because it injured nearly 300 of them. But its legacy continued for decades, as long-term health effects from the toxic smoke, including cancer, would haunt those who fought it.

The New York Telephone building still stands and today also houses a street level Verizon Wireless retail store.

New York Telephone engineers initially rescued a decommissioned #1 Crossbar switch that had been waiting to be melted down for scrap. It came from the West 18th Street office and was cleaned, repaired, and put into emergency service until a #1 ESS switch originally destined for another central office was diverted. This part of Manhattan got its upgrade early for all the wrong reasons.

Throughout the Bell System in the 1970s and 80s, older switches were gradually replaced with all-electronic switches, especially the #5 ESS, introduced in 1982 and still widely in service today, serving about 50% of all landlines in the United States. Canadian telephone companies often favored telephone switches manufactured by Northern Telecom (Nortel), based in Mississauga, Ontario. They generally worked as well as their American counterparts and are also in service in parts of the United States.

The legacy of more than 100 years of telephone service has made running old and new technology side by side nothing unusual for telephone companies. It has worked for them before, as has their belief in incremental upgrades. So Bell’s announcement it would completely blanket Toronto with all-fiber service is a departure from standard practice.

For Bell in Toronto, the gigabit upgrade will begin by pushing fiber cables through existing conduits that are also home to copper and fiber wiring still in service. If a conduit is blocked or lacks enough room for new fiber cables, the Bell technician predicted delays. It is very likely that sometime after fiber service is up and running, copper wire decommissioning will begin in Toronto. Whether those cables remain dormant underground and on phone poles for cost reasons or are torn out and sold for scrap will largely depend on scrap copper prices, Bell’s budget, and possible regulator intervention.

But Bell’s upgrade will clearly be as important as the retirement of mechanical phone switches a few decades earlier, if not more so, and for the same reasons: decreased maintenance costs, increased capacity, better reliability, and the chance to market new revenue-generating services make fiber just as good an investment for Bell as electronic switches were in the 1970s and 1980s.

[flv]http://www.phillipdampier.com/video/ATT Reconnecting 170000 Phone Customers in NYC After a Major Fire 1975.mp4[/flv]

AT&T produced this documentary in the mid-1970s about how New York Telephone recovered from a fire that destroyed a phone exchange in lower Manhattan and wiped out service for 173,000 customers in 1975. The phone company managed to get service restored after an unprecedented three weeks. It gives viewers a look at the enormous size of old electromechanical switching equipment and masses of phone wiring. (22:40) 

The ISP Defense Squad Attacks Guardian Story on Internet Slowdowns

Phillip "Speaking as a Customer" Dampier


Two defenders of large Internet Service Providers are questioning a Guardian article that reported major ISPs were intentionally allowing the performance of Content Delivery Networks and other high-volume Internet traffic to degrade in a dispute over money.

Richard Bennett and Dan Rayburn today both published articles attempting to discredit Battle for the Net’s effort highlighting the impact interconnection disputes can have on consumers.

Rayburn:

On Monday The Guardian ran a story with a headline stating that major Internet providers are slowing traffic speeds for thousands of consumers in North America. While that’s a title that’s going to get a lot of people’s attention, it’s not accurate. Even worse, other news outlets like Network World picked up on the story, re-hashed everything The Guardian said, but then mentioned they could not find the “study” that The Guardian is talking about. The reason they can’t find the report is because it does not exist.

[…] Even if The Guardian article was trying to use data collected via the BattlefortheNet website, they don’t understand what data is actually being collected. That data is specific to problems at interconnection points, not inside the last mile networks. So if there isn’t enough capacity at an interconnection point, saying ISPs are “slowing traffic speeds” is not accurate. No ISP is slowing down the speed of the consumers’ connection to the Internet as that all takes place inside the last mile, which is outside of the interconnection points. Even the Free Press isn’t quoted as saying ISPs are “slowing” down access speed, but rather access to enough capacity at connection points.

Bennett:

In summary, it appears that Battle for the Net may have cooked up some dubious tests to support their predetermined conclusion that ISPs are engaging in evil, extortionate behavior.

It may well be the case that they want to, but AT&T, Verizon, Charter Cable, Time Warner Cable, Brighthouse, and several others have merger business and spectrum auction business pending before the FCC. If they were manipulating customer experience in such a malicious way during the pendency of their critical business, that would constitute executive ineptitude on an enormous scale. The alleged behavior doesn’t make customers stick around either.

I doubt the ISPs are stupid enough to do what the Guardian says they’re doing, and a careful examination of the available test data says that Battle for the Net is actually cooking the books. There is no way a long haul bandwidth and latency test says a thing about CDN performance. Now it could be that Battle for the Net has a secret test that actually measures CDNs, but if so it’s certainly a well-kept one. Stay tuned.

The higher line measures speeds received by Comcast customers connecting to websites handled by GTT in Atlanta. The lower line represents speeds endured by AT&T customers, as measured by MLab.

Stop the Cap! was peripherally mentioned in Rayburn’s piece because we originally referenced one of the affected providers as a Content Delivery Network (CDN). In fact, GTT is a Tier 1 IP Network, providing service to CDNs, among others — a point we made in a correction prompted by one of our readers yesterday.

Both Rayburn and Bennett scoff at Battle for the Net’s methodology, results, and conclusion that your Internet Service Provider might care more about money than about keeping customers satisfied with decent Internet speeds. Bennett alludes to the five groups backing the Battle for the Net campaign as “comrades” and Rayburn comes close to suggesting the Guardian piece represented journalistic malpractice.

Much was made of the missing “study” that the Guardian referenced in its original piece. Stop the Cap! told readers in our original story we did not have a copy to share either, but would update the story once it became available.

We published our own story because we were able to find, without much difficulty, plenty of raw data collected by MLab from consumers conducting voluntary Internet Health Tests, on which Battle for the Net drew its conclusions about network performance. A review of that data independently confirmed all the performance assertions made in the Guardian story, with or without a report. There are obvious and undeniable significant differences in performance between certain Internet Service Providers and traffic distribution networks like GTT.

So let’s take a closer look at the issues Rayburn and Bennett either dispute or attempt to explain away:

  1. MLab today confirmed there is a measurable and clear problem with ISPs serving around 75% of Americans that apparently involves under-provisioned interconnection capacity. That means the connection your ISP has with some content distributors is inadequate to handle the amount of traffic requested by customers. Some very large content distributors like Netflix increasingly use their own Content Delivery Networks, while others rely on third-party distributors to move that content for them. But the problem affects more than just high traffic video websites. If Stop the Cap! happens to reach you through one of these congested traffic networks and your ISP won’t upgrade that connection without compensation, not only will video traffic suffer slowdowns and buffering, but so will traffic from every other website, including ours, that happens to be sent through that same connection.

MLab: "Customers of Comcast, Time Warner Cable, and Verizon all saw degraded performance [in NYC] during peak use hours when connecting across transit ISPs GTT and Tata. These patterns were most dramatic for customers of Comcast and Verizon when connecting to GTT, with a low speed of near 1 Mbps during peak hours in May. None of the three experienced similar problems when connecting with other transit providers, such as Internap and Zayo, and Cablevision did not experience the same extent of problems."


MLab:

Our initial findings show persistent performance degradation experienced by customers of a number of major access ISPs across the United States during the first half of 2015. While the ISPs involved differ, the symptoms and patterns of degradation are similar to those detailed in last year’s Interconnections study: decreased download throughput, increased latency and increased packet loss compared to the performance through different access ISPs in the same region. In nearly all cases degradation was worse during peak use hours. In last year’s technical report, we found that peak-hour degradation was an indicator of under-provisioned interconnection capacity whose shortcomings are only felt when traffic grows beyond a certain threshold.

Patterns of degraded performance occurred across the United States, impacting customers of various access ISPs when connecting to measurement points hosted within a number of transit ISPs in Atlanta, Chicago, Los Angeles, New York, Seattle, and Washington, D.C. Many of these access-transit ISP pairs have not previously been available for study using M-Lab data. In September, 2014, several measurement points were added in transit networks across the United States, making it possible to measure more access-transit ISP interconnection points. It is important to note that while we are able to observe and record these episodes of performance degradation, nothing in the data allows us to draw conclusions about who is responsible for the performance degradation. We leave determining the underlying cause of the degradation to others, and focus solely on the data, which tells us about consumer conditions irrespective of cause.
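The peak-hour comparison MLab describes can be illustrated with a short sketch: flag an access/transit ISP pair as degraded when its median peak-hour throughput falls well below its off-peak throughput. This is our illustration only, not MLab’s actual pipeline; the sample figures, the peak-hour window, and the 50% threshold are all assumptions.

```python
# Illustrative sketch of peak-hour degradation detection, in the spirit of
# MLab's methodology. Thresholds and sample data are hypothetical.
from statistics import median

def is_degraded(samples, peak_hours=range(19, 23), threshold=0.5):
    """samples: list of (hour_of_day, download_mbps) speed-test results.

    Returns True when median peak-hour throughput drops below `threshold`
    times the median off-peak throughput.
    """
    peak = [mbps for hour, mbps in samples if hour in peak_hours]
    off_peak = [mbps for hour, mbps in samples if hour not in peak_hours]
    if not peak or not off_peak:
        return False  # not enough data to compare
    return median(peak) < threshold * median(off_peak)

# Hypothetical measurements for one access/transit ISP pair:
tests = [(3, 42.0), (10, 40.5), (14, 38.9), (20, 1.2), (21, 0.9), (22, 1.4)]
print(is_degraded(tests))  # True: ~1 Mbps at peak vs ~40 Mbps off-peak
```

The pattern MLab reports, near 1 Mbps at peak against tens of Mbps off-peak, trips this kind of check easily.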

Rayburn attempts to go to town highlighting MLab’s statement that the data does not allow it to draw conclusions about who is responsible for the traffic jam. But any effort to extend that to a broader conclusion the Guardian article is “bogus” is folly. MLab’s findings clearly state there is a problem affecting the consumer’s Internet experience. To be fair, Rayburn’s view generally accepts there are disputes involving interconnection agreements, but he defends the current system that requires IP networks sending more traffic than they return to pay the ISP for a better connection.

Rayburn's website refers to him as "the voice of industry."


  2. Rayburn comes to the debate with a different perspective than ours. Rayburn’s website highlights the fact he is the “voice of the industry.” He also helped launch the industry trade group Streaming Video Alliance, which counts Comcast as one of its members. Anyone able to afford the dues for sponsor/founding member ($25,000 annually); full member ($12,500); or supporting member ($5,500) can join.

Stop the Cap! unreservedly speaks only for consumers. In these disputes, paying customers are the undeniable collateral damage when Internet slowdowns occur and more than a few are frequently inconvenienced by congestion-related slowdowns.

It is our view that allowing paying customers to be caught in the middle of these disputes is a symptom of the monopoly/duopoly marketplace broadband providers enjoy. In any industry where competition demands a provider deliver an excellent customer experience, few would ever allow these kinds of disputes to alienate customers. In Atlanta, Los Angeles, and Chicago, for example, AT&T has evidently made a business decision to allow its connections with GTT to degrade to just a fraction of the performance achieved by other providers. Nothing else explains consistent slowdowns that have affected AT&T U-verse and DSL customers for months on end that involve GTT while Comcast customers experience none of those problems.

We also know why this is happening because AT&T and GTT have both confirmed it to Ars Technica, which covered this specific slowdown back in March. As is always the case about these disputes, it’s all about the money:

AT&T is seeking money from network operators and won’t upgrade capacity until it gets paid. Under its peering policy, AT&T demands payment when a network sends more than twice as much traffic as it receives.

“Some providers are sending significantly more than twice as much traffic as they are receiving at specific interconnection points, which violates our peering policy that has been in place for years,” AT&T told Ars. “We are engaged in commercial-agreement discussions, as is typical in such situations, with several ISPs and Internet providers regarding this imbalanced traffic and possible solutions for augmenting capacity.”
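The 2:1 traffic ratio AT&T describes is a simple rule to express: settlement-free peering holds only while a peer’s outbound traffic stays within twice its inbound. A rough sketch (the function name and traffic figures are ours, for illustration; AT&T’s actual policy has further terms):

```python
# Illustrative check of a 2:1 peering ratio policy like the one AT&T
# describes. Numbers are hypothetical.
def violates_peering_policy(sent_tb, received_tb, max_ratio=2.0):
    """True when a peer sends more than `max_ratio` times what it receives."""
    return sent_tb > max_ratio * received_tb

print(violates_peering_policy(sent_tb=900, received_tb=300))  # True: 3:1 ratio
print(violates_peering_policy(sent_tb=500, received_tb=300))  # False: under 2:1
```

A transit network like GTT, hauling video toward AT&T’s customers, almost inevitably lands on the wrong side of such a ratio, which is precisely the dispute described above.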

Missing from this discussion are AT&T customers directly affected by slowdowns. AT&T seems uninterested in the customer experience, and the company feels safe stonewalling GTT until it gets a check in the mail. It matters less that AT&T customers have paid $40, $50, even $70 a month for high quality Internet service they are not getting.

In a more competitive marketplace, we believe no ISP would ever allow these disputes to impact paying subscribers, because a dissatisfied customer can cancel service and switch providers. That is much less likely if you are an AT&T DSL customer with no cable competition or if your only other choice cannot offer the Internet speed you need.

  3. Consolidating the telecommunications industry will only guarantee these problems will get worse. If AT&T is allowed to merge with DirecTV and expand Internet service to more customers in rural areas where cable broadband does not reach, does that not strengthen AT&T’s ability to further stonewall content providers? Of course it does. In fact, even a company the size of Netflix eventually relented and wrote a check to Comcast to clear up major congestion problems experienced by Comcast customers in 2014. Comcast could have solved the problem itself for the benefit of its paying customers, but refused. The day Netflix’s check arrived, problems with Netflix magically disappeared.

More mergers and more consolidation do not enhance competition. They embolden big ISPs to play more aggressive hardball with content providers at the expense of consumers.

Even Rayburn concedes these disputes are “not about ‘fairness,’ it’s business,” he writes. “Some pay based on various business terms, others might not. There is no law against it, no rule that prohibits it.”

Battle for the Net’s point may be that there should be.

Cable Companies Demand Satellite Providers Pay Up; Customer Bills Expected to Rise

Two cable industry trade associations have asked the Federal Communications Commission to start collecting more fees from satellite television operators to cover the FCC’s regulatory expenses — a move satellite providers argue will cause consumers to suffer bill shock from increased prices.

The American Cable Association and the National Cable & Telecommunications Association have filed comments with the FCC asking the commission to impose the same regulatory fees on satellite subscribers that cable companies are likely to pay in 2015 — 95 cents a year per subscriber.

The FCC has proposed initially charging satellite operators $0.12 this year per customer, or about one cent a month. The two cable lobbying groups want that 12 cent fee doubled to 24 cents and then raised an additional 24 cents each year until it reaches parity with what cable companies pay.
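The cable groups’ proposed phase-in works out to parity in the fourth year. A quick sketch of the schedule as described (the function is ours, for illustration): start at 24 cents per subscriber per year, add 24 cents annually, and cap at the 95-cent cable rate.

```python
# Sketch of the ACA/NCTA proposed satellite fee phase-in, per subscriber
# per year: $0.24 to start, +$0.24 annually, capped at parity with the
# $0.95 cable fee. Function name is ours, for illustration.
def phase_in(start=0.24, step=0.24, parity=0.95):
    fee, schedule = start, []
    while fee < parity:
        schedule.append(round(fee, 2))
        fee += step
    schedule.append(parity)  # capped at parity with cable operators
    return schedule

print(phase_in())  # [0.24, 0.48, 0.72, 0.95] -> parity in the fourth year
```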

“The FCC is off to a good start by declaring that Dish and DirecTV should pay regulatory fees to support the work of the agency’s Media Bureau for the first time and proposing setting the initial per subscriber fee at one cent per month in 2015,” said Matthew Polka, president and CEO of the ACA. “But given the FCC proposes that cable operators pay nearly 8 cents per month, per customer, it must do more, including requiring these two multibillion dollar companies with national reach to shoulder more of the fee burden next year that is now disproportionately borne by smaller, locally based cable operators.”

The satellite industry has filed its own comments with the FCC objecting to any significant fee increases, claiming they will cause consumers to experience bill shock and that satellite companies pose less of a regulatory burden on the FCC than cable operators do.

The ACA counters that even if the satellite companies were required to pay the full 95 cents this year — the same rate small independent cable operators pay — it would add a trivial $0.08 a month to customer bills — less than a 0.4% increase on the lowest priced introductory offer sold by satellite providers.

The ACA reminded the FCC it did not seem too concerned about rate shock when it imposed a 99 cent fee on IPTV providers like AT&T U-verse in 2014 without a phase-in.

DirecTV and Dish argue the FCC has jurisdiction over cable’s television, phone and Internet packages — a more complex assortment of services. Satellite providers currently only sell television service, so charging the same fee cable companies pay would be disproportionate and unfair, both claim.

Despite the sudden introduction of the IPTV fee last year, AT&T managed to use the opportunity to turn lemons into lemonade.

AT&T added a “Regulatory Video Cost Recovery Charge” to customers’ bills after the FCC assessed a 99 cent fee on IPTV services like U-verse in 2014. But AT&T charged nearly three times what it actually owed. U-verse customers were billed $0.24 a month ($2.88 a year) in 2014 for “regulatory fee cost recovery,” yet AT&T paid the FCC only $0.99 for each of its 5.7 million customers. It kept the remaining $1.89 per customer, amounting to $10,773,000 in excess profit.
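The arithmetic behind that excess-profit figure is easy to verify:

```python
# Verifying the article's arithmetic on AT&T's "Regulatory Video Cost
# Recovery Charge": $0.24/month billed vs. a $0.99/year FCC fee owed,
# across 5.7 million U-verse subscribers.
subscribers = 5_700_000
billed_per_year = 0.24 * 12      # $2.88 charged to each customer in 2014
owed_per_year = 0.99             # FCC fee actually paid per customer
excess = (billed_per_year - owed_per_year) * subscribers
print(f"${excess:,.0f}")  # $10,773,000 kept by AT&T
```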

This year the FCC expects to collect $0.95 from each U-verse subscriber, a four-cent decline.
