Home Network

5 Reasons Why Your Internet Has Slowed Down and How You Can Fix It

Apr 15, 2020

As most of the world is now under stay-at-home orders and social-distancing guidelines, the internet is more popular than ever. While we all do our part to work and learn from home, you may be wondering whether the internet can handle this much traffic. Fortunately, the answer is a resounding YES! Our members are working round the clock to provide superior network performance and are committed to keeping us all connected. However, you may still be questioning why your connection sometimes appears to be slow. We are here to help! These five common problems, identified by our engineers, may help you pin down the cause:

5 Reasons Why Your Internet Has Slowed Down and How You Can Fix It

Consider sharing the infographic above with your friends and family to help those of us who are less tech savvy. An image on the fridge can be a great reminder! Overall, just remember to hang in there and breathe; with a good setup, you can turn your home into a productive workplace and fun learning environment for the kids. On the bright side, you have a very short commute from the bed to the kitchen, and your fuzzy slippers don’t violate the office dress code.

You can find more information about the cable industry’s response to COVID-19 on NCTA’s COVID-19 dashboard. If you are interested in learning more tips and tricks to improve your Wi-Fi performance, check out our blog post. Subscribe to our blog to keep up to date with COVID-19 news and the communications industry.



A Sneak Peek of SCTE Cable-Tec Expo

Ralph Brown

Sep 13, 2016

CableLabs and Kyrio will be hosting a booth at the SCTE Cable-Tec Expo 2016.  To provide you with a sneak peek of what we plan to show at the event, below are highlights of six demonstrations:

Full Duplex DOCSIS® 3.1 Technology

With Full Duplex DOCSIS 3.1 technology, the HFC network can support capacities of 10 Gbps downstream and 6 Gbps upstream in 1.2 GHz of spectrum. Multi-Gbps symmetric services will meet user demands and support future applications. Learn more about the next evolution of DOCSIS technology.

3.5GHz Shared Spectrum and Wi-Fi Traffic Aggregation

3.5 GHz (3.55–3.7 GHz) shared spectrum offers the potential democratization of LTE. Cable operators can deploy LTE-based solutions within homes, offices and even public environments to create low-cost mobile networks. See how CableLabs' multipath TCP technology can help cable operators aggregate IP traffic from both 3.5 GHz spectrum and existing Wi-Fi access points to provide their customers with great wireless speeds.

Energy Efficiency of CPE (Customer Premises Equipment)

CableLabs provides technical leadership to the industry through influencing energy efficiency voluntary agreements for set-top boxes and small network equipment as well as other energy efficiency initiatives.  CableLabs also works closely with SCTE Energy 2020 to address end-to-end energy efficiency in the cable infrastructure.  Learn more about the voluntary agreement initiatives and see CPE energy efficiency in action!

Automated Leakage Detection and Time Domain Reflectometer (TDR)

Cable operators can automatically gather their own leakage data for use as a diagnostic tool. This new detection method employs GPS and a continuous wave test signal and can pinpoint leakage sources. The leakage data can be used to make Proactive Network Maintenance (PNM) map overlays to speed problem resolution and to prevent LTE interference.  The TDR uses standing waves on digital signals to accurately calculate the distance to reflections without interrupting service.  Learn more about how these methods can improve network performance.

DOCSIS® 3.1 Profile Management Application (PMA)

The Profile Management Application (PMA) is a software application that can configure and manage DOCSIS 3.1 OFDM subcarrier modulation profiles on a DOCSIS 3.1 CMTS. This demonstration shows how the PMA interacts with CMTSs, CMs, and other network elements to monitor, create, modify and then assign specific profiles to specific DOCSIS 3.1 CMs to maximize the capacity of a DOCSIS 3.1 OFDM channel.

Real World Testing

In a world of constant change and innovation, the need for agility and assurance is crucial. Learn how Kyrio Testing Services provides their customers with the ability to adapt to new requirements, accelerate change, and assure quality of product performance and usability from the end user perspective.

We look forward to meeting you at the SCTE Cable-Tec Expo, Booth # 1424, September 26 – 29 in Philadelphia.


Li-Fi – A Bright Future for Home Networks

Josh Redmore
Principal Architect, Wireless Research & Development

Mar 8, 2016

At CableLabs, we are continually researching new methods of in-home wireless network distribution, and one exciting new contender is Li-Fi.

What is Li-Fi?

Li-Fi is the modulation of a free-space beam of light in order to transmit a signal. It can be thought of as analogous to Wi-Fi, just in a much higher frequency band (430 – 770 THz vs. 2.4 GHz). We’ve actually been using this same basic concept for over 100 years, in the form of Morse code being transmitted from ship to ship via signal lamps.

The Shannon–Hartley theorem allows us to calculate the maximum bit rate of a communications channel with a given bandwidth and signal-to-noise ratio. Since capacity increases with bandwidth, we can immediately see the vast potential of Li-Fi, which has ~340 THz to work with in the visible-light frequencies. Compare that with Wi-Fi, which has less than 1 GHz available yet is able to provide over a gigabit per second, and you can see the potential for ultra-high-speed in-home networks. As virtual and augmented reality achieve widespread adoption, these ultra-high speeds will become mandatory.

To give you an idea of just how large this increase in bandwidth is: the ratio (roughly 340,000 to 1) is close to the ratio between the mass of the Sun and the mass of the Earth!
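The arithmetic behind that comparison is easy to check. Below is a back-of-envelope sketch using the Shannon–Hartley formula C = B·log2(1 + SNR). The 20 dB SNR is an assumption for illustration only (real Li-Fi and Wi-Fi links will differ), and the bandwidth figures are the rough ones quoted above.

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (20 / 10)   # assume 20 dB SNR on both channels (illustrative)
wifi_bw = 1e9           # ~1 GHz of usable Wi-Fi spectrum
lifi_bw = 340e12        # ~340 THz of visible light

print(f"Wi-Fi capacity ceiling: {shannon_capacity_bps(wifi_bw, snr) / 1e9:.1f} Gbps")
print(f"Li-Fi capacity ceiling: {shannon_capacity_bps(lifi_bw, snr) / 1e12:.0f} Tbps")

# The bandwidth ratio really is in Sun-vs-Earth territory:
print(f"Bandwidth ratio: {lifi_bw / wifi_bw:,.0f}x (Sun/Earth mass ratio ~333,000x)")
```

Even at the same SNR, capacity scales linearly with bandwidth, which is where the enormous Li-Fi headroom comes from.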

Where are we going?

The ideal product is a Li-Fi enabled light bulb in the same form factor that consumers are used to now – and with the same ease of installation. Li-Fi could be a viable solution to improving the coverage and reliability of a home network by reusing existing light fixtures. Just screw the bulb in and you’ve expanded your network.


The other side of the connection is the endpoint device, and a number of consumer device manufacturers are beginning the process of integrating Li-Fi. Apple is exploring adding Li-Fi to their mobile devices, which is a natural product evolution, as the majority of smartphones already contain the two things needed for Li-Fi: a light detector (the camera) and a light emitter (the camera flash).

Where are we today?


CableLabs has fully functioning prototypes of a single-channel Li-Fi system, which achieve data rates of around 300 Mbps. The system is free from Wi-Fi interference and simple to use. Currently the devices need to be directly in line with each other, so research into improving the signal-to-noise ratio (SNR) is needed before we can achieve omnidirectional Li-Fi.

We’ve also done extensive research into the necessary backhaul systems that will make Li-Fi a useful reality, such as next-generation powerline networking, which uses your existing home electrical wiring as a network. By networking the Li-Fi bulbs together, you can achieve seamless, whole-home coverage. Anywhere there is light, there is connectivity.

The cable industry, with the introduction of DOCSIS 3.1 and beyond, continues to increase Internet connection speeds. These speeds are currently beyond what any current-generation in-home wireless system can handle, so research into technologies like Li-Fi will play a vital part in ensuring customers are able to fully utilize their connection.

Josh Redmore is the Lead Architect in the Wireless Network Optimization group at CableLabs.
Follow him on Twitter.


First Impressions of the Mobile World Congress in Barcelona

Pete Smyth
VP, Core Innovations

Feb 23, 2016

CableLabs is hosting its first tour of Mobile World Congress (MWC) here in Barcelona, Spain. MWC is the world’s largest mobile congress, with attendance likely to top one hundred thousand people from two hundred countries. It’s the place to be seen in the mobile world, which today is about more than handsets and networks. All the largest companies in telecom are here making major announcements.

I began the day scouting with the team to make sure that our tour would visit the best companies and technologies. It was an amazing day of observing technological breakthroughs that I would like to share.

1 Gbps Mobile

Let’s first discuss speed. Many of the world’s largest telecom vendors are presenting LTE technologies that will enable 1 Gbps to your mobile phone – that is, if you are the only user in a cell close to the base station. To achieve these speeds, handsets will need four antennas integrated into them – now possible at frequencies as low as 2.1 GHz – in order to achieve 4x4 MIMO. It also means combining 3 mobile carriers with 256 QAM. What does that mean in a real-world situation in a few years’ time? Well, the average LTE speed in the US today is circa 10 Mbps, and these new technologies will take this to circa 60 Mbps in a few years’ time, with all the associated improvements in cellular backhaul. Qualcomm is making all this technology available in its new X16 chip, which is being used by companies such as Ericsson, Nokia, Huawei and the rest – all here at MWC.
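As a sanity check on that 1 Gbps figure, here is a hedged back-of-envelope. The per-carrier numbers (1200 subcarriers, 14 OFDM symbols per 1 ms subframe for a 20 MHz carrier) are standard LTE parameters; the 10-layer split across three aggregated carriers and the ~25% overhead factor are illustrative assumptions, not the X16’s exact configuration.

```python
# Rough peak-rate arithmetic for a 1 Gbps-class LTE device.
SUBCARRIERS_PER_20MHZ = 1200   # 100 resource blocks x 12 subcarriers
SYMBOLS_PER_SUBFRAME = 14      # OFDM symbols per 1 ms subframe (normal CP)
BITS_PER_SYMBOL = 8            # 256-QAM
TOTAL_LAYERS = 10              # e.g. 4x4 MIMO on two carriers + 2x2 on a third

symbol_rate = SUBCARRIERS_PER_20MHZ * SYMBOLS_PER_SUBFRAME * 1000  # per layer, per second
raw_bps = symbol_rate * BITS_PER_SYMBOL * TOTAL_LAYERS

print(f"Raw PHY rate: {raw_bps / 1e9:.2f} Gbps")                 # 1.34 Gbps
print(f"After ~25% control/coding overhead: {raw_bps * 0.75 / 1e9:.2f} Gbps")
```

The raw rate lands around 1.3 Gbps, which after realistic overheads is consistent with the ~1 Gbps peak being demonstrated here.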


5G is based on the evolution of LTE, with the use of millimeter waves to extend the frequency operation of today’s wireless systems from sub-6 GHz to 100 GHz. Ericsson was demonstrating a 15 GHz system with an 800 MHz channel bandwidth to support users with up to 20 Gbps. Because of the smaller physical size of antennas at these millimeter-wave frequencies, it is possible to build arrays of these for 256x256 MIMO and to steer their pencil-like beams to individual users. There are expectations of commercialization of this type of technology as early as 2020.
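For a sense of what 20 Gbps in an 800 MHz channel implies, the sketch below works backwards to the required spectral efficiency and a rough number of MIMO layers. The ~6 b/s/Hz per-layer figure (roughly 256-QAM with coding) is our assumption for illustration, not Ericsson’s published configuration.

```python
import math

channel_bw_hz = 800e6   # demonstrated channel bandwidth
target_bps = 20e9       # demonstrated peak user rate

se_needed = target_bps / channel_bw_hz       # bits/s/Hz required overall
per_layer_se = 6.0                           # assumed b/s/Hz per spatial layer
layers = math.ceil(se_needed / per_layer_se)

print(f"Required spectral efficiency: {se_needed:.0f} b/s/Hz")      # 25 b/s/Hz
print(f"=> roughly {layers} spatial layers at ~{per_layer_se:.0f} b/s/Hz each")
```

An overall 25 b/s/Hz is far beyond a single stream, which is why the large steerable antenna arrays enabled by millimeter waves are essential to these rates.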

5G - IoT

5G is about more than speed. The evolution of LTE will support the aggregation of small channels for massive IoT applications, which require relatively small bandwidths.

5G - 1 ms Latency

One of the most exciting aspects of 5G is the requirement to support 1 ms latency. Today’s mobile networks typically have latency of 60-100 ms. Why is this new requirement important? The time from the touch of an object to the sensation registering in your brain is approximately 1 ms. With this target of low latency, it would be possible for people to see and react to events in real time for critical control operations. Real-time control of cars driving at high speed would be possible across networks, with no traffic lights!

It is not all 5G – WiFi 802.11ax

Qualcomm was demonstrating pre-standard 802.11ax WiFi. This brings the advances of LTE technology to WiFi, supporting multiple users in complex environments such as offices and maintaining speeds through scheduled transmissions.

LTE coming to WiFi soon - MuLTEfire

Qualcomm is developing a new technology called MuLTEfire, which will support LTE in the 5 GHz WiFi band with Listen Before Talk (LBT) as a good neighbor to WiFi. MSOs could then support LTE-based services without the need for an anchor mobile carrier. Qualcomm has formed the MuLTEfire Alliance for like-minded members to exploit LTE in the WiFi bands. Today this includes companies such as Intel and Ruckus.

3.5GHz mobile is on its way

The FCC is opening the 3.55-3.7 GHz band (LTE bands 42 and 43) within the next few years. People will be able to access up to 150 MHz of free mobile spectrum for small cells. Qualcomm has formed an alliance with Ruckus to demonstrate a 3.5 GHz LTE small-cell base station in a dongle connected to what looked like a Google OnHub. 3.5 GHz presents great opportunities for CableLabs MSO members seeking to become mobile operators.

And finally:

If you had told me as little as a few years ago that anyone could become a mobile operator in the near future, enjoying free spectrum with a base station on a dongle and a packet core network on a laptop, or virtualized in the cloud, I would not have believed you. What I saw at MWC on the first day makes me feel that this will be real within the next 3+ years! CableLabs is uniquely positioned to work with our Members, some of whom are gathered here, to drive innovation to make this happen.


Coming Up for Air on LTE-U Coexistence: An Update

Rob Alderfer
VP, Technology Policy

Feb 12, 2016

If you have been tracking CableLabs’ work to ensure that the introduction of LTE into unlicensed spectrum does not do disproportionate harm to Wi-Fi consumers, you may have noticed that it has been a while since we wrote in this blog on the topic. That’s not because we’ve moved on, however. To the contrary, we’ve been hard at work with industry stakeholders.

News of the Week: The Wi-Fi Alliance Workshop

This week, CableLabs contributed to the second Wi-Fi Alliance (WFA) LTE-U Coexistence Test Workshop, just as we did at the first WFA event. The proceedings served as an update on work within the WFA – of which CableLabs is a member, along with 600 or so other companies, including the proponents of LTE-U – to develop consensus-based technical procedures to validate the coexistence performance of LTE-U devices. This is occurring primarily through the development of a test plan, of which a draft was released this week for informational purposes.

As Edgar Figueroa, WFA CEO, noted at the event, the coexistence test plan is not complete. Since the test plan is to be used in its entirety as a holistic suite of tests that LTE-U devices must pass, we must reserve final judgment until it is complete. But it looks promising at this stage.

Principles for LTE-U Coexistence Testing

As we outlined at the WFA workshop alongside Google, Broadcom, and Comcast, any satisfactory test plan must:

  • Specify a rigorous and realistic set of test cases that validates LTE-U’s ability to coexist fairly with Wi-Fi;
  • Not simply be a demonstration of equipment’s ability to satisfy the LTE-U Forum Coexistence and CSAT specifications, or vendor solutions and claims, which are insufficient to prove coexistence;
  • Include clear, quantified pass/fail metrics to ensure that LTE-U does not disproportionately degrade Wi-Fi performance; and,
  • Be conducted in an open manner, with results available to the public.

The current draft of the WFA test plan appears to be on a path to meet these criteria. Some important test cases have yet to be written – for example, verifying that Wi-Fi consumers will still be able to choose their preferred network even in the presence of LTE-U – and we are working with the WFA to see that this and other unfinished pieces proceed expeditiously to completion.

Once the test plan is complete and verified to be effective, LTE-U devices will start proceeding through the tests. If a device passes the full suite of tests, stakeholders should (ideally) be assured that Wi-Fi consumers will not be harmed by its deployment. If a device fails a part of the WFA tests, then it will be clear that improvement in its coexistence technology is needed before it is introduced to the marketplace.

An Evolution in Tone

The cautious optimism of the day is in contrast to the tone of the LTE-U debate in 2015. Then, the discourse was marked by competing technical studies and at-odds assertions. Now, a common, industry-led technical engagement may produce definitive results. A couple of takeaways come to mind:

First, given the substantial effort of the unlicensed community to develop test procedures within the WFA, it should be clear to all that the bare-bones test cases in the LTE-U Forum Coexistence Specification are vastly inadequate – as we have been saying for some time.

And second, given the progress to date, industry collaboration must be given time to play out. Though it is proceeding expeditiously, work within the WFA is not done, and can only serve to validate LTE-U coexistence as a whole work product.

That is an important point that was missed in press coverage of the FCC’s recent grant of an experimental license to Qualcomm. In a blog post, Julius Knapp of the FCC was careful to distinguish between those independent experiments and the collaborative coexistence process of the WFA. Any testing that is based on a draft, incomplete set of technical procedures cannot, by definition, yield meaningful information on LTE-U coexistence. The independent experiments that will be conducted by Qualcomm under this grant of temporary authority are much different than the collaborative industry process that remains underway at the WFA.

3GPP Parallels

We certainly hope that the work in the WFA will lead to meaningful assurance that Wi-Fi consumers will not be harmed. For that to occur, the WFA test plan is likely to only be a start, even if it is proven to be rigorous in its final form. It is possible that evolution in LTE-U technology will be needed as well. That is how the development of License Assisted Access LTE (LAA-LTE) has occurred in 3GPP, the mobile standards body.

In 3GPP, a robust technical debate has led to improvements in coexistence features, with listen-before-talk procedures that should make the technology more friendly to its neighbors in shared, unlicensed spectrum. Work is just now beginning to simulate the efficacy of LAA coexistence properties. If the WFA coexistence testing model proves to be effective in protecting wireless consumers, it may also be adapted to LAA.

What’s Next

LAA is scheduled to be part of 3GPP’s Release 13 in March, and work within the WFA to develop the LTE-U coexistence test plan should be nearing completion around that time as well. Stay tuned… it looks like 2016 will see some major developments for wireless consumers.

Jennifer Andreoli-Fang and Bernie McKibben also contributed to this article.

Rob Alderfer is Vice President of Technology Policy at CableLabs.


Fair LTE-U Coexistence Far From Proven In CableLabs / Qualcomm Testing

Rob Alderfer
VP, Technology Policy

Nov 11, 2015

CableLabs has been working hard to ensure that the introduction of LTE into unlicensed spectrum is a win for wireless broadband, and that it does not disrupt the Wi-Fi services that consumers have come to rely on. We have covered in prior posts why that is a challenge, since LTE-U can take advantage of Wi-Fi’s inherent politeness. Here we will review recent technical work we did with a major proponent of LTE-U, Qualcomm Technologies, and explain why that effort only reinforces our concerns.

In brief, we observed that current LTE-U prototype equipment is quite primitive — it is really just in mock-up state at this point — and incapable of demonstrating important coexistence features. The vendor-promised features, most of which are not required or even identified by the LTE-U Forum in its latest specifications, are not yet working to enable fair and reliable coexistence and confidence in testing. In addition, we found that claims of its ability to share fairly rest on a seemingly faulty understanding of how Wi-Fi shares spectrum. For CableLabs, this reinforces the need for a collaborative and open standards development process.

The Wi-Fi community has long sought the same collaborative standards development process for LTE-U that LAA has enjoyed (License Assisted Access LTE is the flavor of unlicensed LTE being developed in the mobile standards body, 3GPP). But in the absence of LTE-U standards development, CableLabs has engaged directly with the promoters of LTE-U in an attempt to do the fundamental research required to find coexistence solutions. Far from converging on solutions, however, our work to date on LTE-U has raised more questions than answers.

Lessons Learned with Qualcomm’s LTE-U

CableLabs recently concluded a brief technical engagement at Qualcomm’s campus in San Diego, which was preceded by a lengthy negotiation of what we would be allowed to test on site. Ultimately, the scope of the plan was much narrower than our guidance and focused on a limited set of basic coexistence tests. It certainly was not the comprehensive research that we recommended, and that is required to address the concerns of Wi-Fi technologists, which we summarized in a presentation to the Wi-Fi Alliance last week.

We began this limited test hoping that we would be able to independently validate the definitive statements of LTE-U proponents that it has been ‘proven to coexist’, and is ‘more friendly to Wi-Fi than Wi-Fi is to itself’.

Unfortunately, our main conclusion from the three weeks we spent on site at Qualcomm is that there is no basis for definitive technical statements about LTE-U coexistence. The reason for this is surprisingly simple: LTE-U is in a prototype phase of development, and does not possess the features that its proponents have noted are important to coexistence.

For instance, Qualcomm has noted that their LTE-U solution will sense the spectrum for Wi-Fi activity and adjust its duty cycle ‘on’ time for rapidly changing congestion conditions. But that is not what we were shown. What we saw was an LTE-U prototype that must have its duty cycle manually programmed; it has no adaptation capabilities at all. There were other issues as well: For example, the equipment did not natively use the 5 GHz band that is targeted for LTE-U, and it only supported a single user device. We will refrain from going on at length here, but as is apparent in the photo below, LTE-U requires substantial further development. In short, we were surprised to see that the state of the art plainly won’t work in the real world, despite assurances to the contrary and claims of comprehensive testing.

Figure 1: Prototype LTE-U base station (left) and user device (right)

Importance of a Common Research Framework

Since LTE-U equipment is not mature, it should come as no surprise that coexistence research leaves much to be desired as well. Statements that LTE-U is ‘more friendly to Wi-Fi than Wi-Fi is to itself’ necessarily rely on a baseline understanding of how Wi-Fi shares the spectrum with other Wi-Fi networks. But, in our three weeks at Qualcomm, engineers spent the majority of the time grappling with that crucial baseline information. The test setup at Qualcomm was uncontrolled and provided strangely imbalanced measurements. Afterwards, a CableLabs engineer replicated the setup at our Colorado facilities with the same make of Wi-Fi equipment, and within a half hour obtained balanced results, suggesting that problematic baseline measurements were somehow endemic to Qualcomm’s research environment. CableLabs and its members regularly test, configure, and operate Wi-Fi networks, and the behavior we observed on-site in San Diego seems quite out of the ordinary. Selected baseline measurements are shown in Figure 2 below, including the expected balanced baseline we observed in our Colorado lab.

Figure 2: Imbalanced Wi-Fi baseline behavior in Qualcomm research not observed in follow-on CableLabs work

This highlights the fundamental problem with the LTE-U coexistence research done to date: There is no common technical framework in which stakeholders are working, which makes it very difficult, if not impossible, to interpret research results across studies. This is apparent in our limited work with Qualcomm, and in Qualcomm’s prior studies, which also reflect baseline imbalances and call into question the research conclusions of LTE-U proponents.

Since we spent most of our time at Qualcomm working to diagnose apparent problems with the research environment, we did not come close to executing against the already modest test plan developed at the outset. We did however take some limited measurements of Wi-Fi behavior in the presence of LTE-U (which was tuned to 50% duty cycle ‘on’ time, so is only a narrow representation of possible real-world LTE-U configurations).

As seen in Figure 3 below, the impact of LTE-U depends on what your comparison point is: To conclude that LTE-U coexists better than Wi-Fi, one would need to lower the bar as much as possible — using an imbalanced baseline and referencing only the lower end in the analysis. This would clearly be a skewed approach, and even when doing so, it still doesn’t tell a conclusive story — reference the first case in the figure below, where the presence of LTE-U degrades Wi-Fi more than either baseline case. We would submit that the better approach is to diagnose the problems with the baseline measurements, rather than using unexplained results to justify definitive conclusions.

Figure 3: LTE-U coexistence not reliably determined

Furthermore, the in-home research detailed in a recent blog post by our principal architect, Jennifer Andreoli-Fang, made it clear that LTE-U is likely to have a disproportionately negative impact to Wi-Fi when the baseline is properly calibrated. An open standards process with common research methods is clearly needed to drive greater consistency and confidence in results.

We certainly hope that Qualcomm’s LTE-U solution will move from prototype to product in the near future, so that the Wi-Fi community can attempt to validate its coexistence efficacy in an open and comprehensive fashion. But that would only be one necessary step on the path toward equitable spectrum sharing. As we have detailed before, the LTE-U Forum coexistence specification leaves substantial room for different vendor and carrier approaches, which are likely to do disproportionate harm to Wi-Fi.

While this quite limited testing at Qualcomm’s facilities raised more questions than answers for us, CableLabs remains fully committed to rectifying the shortcomings in unlicensed LTE coexistence. Indeed, we have seen more hopeful progress in the development of the global standard form of the technology, LAA-LTE, which we are cautiously optimistic is on a path to coexist well.  The more aggressive approach taken by LTE-U clearly poses significant challenges, but we see promise in the open standards process and the particular technical choices of LAA as the basis for a more effective and comprehensive solution.

Reliable coexistence in unlicensed spectrum requires a broadly supported agreement on specific solutions. That is why a collaborative, open standards development process is so important — that is how Wi-Fi is developed. Its success is self-evident in the marketplace.

Rob Alderfer is Vice President of Technology Policy at CableLabs.


Wi-Fi vs. Duty Cycled LTE-U: In-Home Testing Reveals Coexistence Challenges

Jennifer Andreoli-Fang
Distinguished Technologist, Wireless Technologies

Nov 5, 2015

Rob Alderfer, VP Technology Policy, CableLabs and Nadia Yoza-Mitsuishi, Wireless Architect Intern, CableLabs also contributed to this article.

In our last blog on Wi-Fi / LTE coexistence, we laid out the dangers attending the apparent decision of a few large carriers to go forward with the carrier scale deployment of a non-standard form of unlicensed LTE in shared spectrum. This time, we will review some of the testing conducted by CableLabs recently to explain why we are worried. We covered this material at a recent Wi-Fi Alliance workshop on LTE-U coexistence, along with the broader roadmap of research that we see as needed to get to solutions that are broadly supported.

Before we begin, let’s recap: there are two different approaches to enabling LTE in unlicensed spectrum: LAA-LTE and LTE-U. LAA is the version that the cellular industry standards body (3GPP) has been working on for the past year, and its coexistence measures appear to be on a path similar to what is used in Wi-Fi: “listen before talk,” or LBT. In contrast, LTE-U, the technology being developed for the US market, is taking a wholly different approach. LTE-U has not been submitted for consideration to a collaborative standards-setting body like 3GPP; instead, it is being developed by a small group of companies through a closed process. LTE-U uses a carrier-controlled on/off switch that “duty cycles” the LTE signal. It turns on to transmit for some time determined by the wireless carrier, then it switches off (again, at the discretion of the carrier) to allow other technologies such as Wi-Fi the opportunity to access the channel.

CableLabs has tested duty-cycled LTE-U in our lab, our office environment, and most recently in a residential environment (test house) to research how technologies are experienced by consumers. Our research to date has raised significant concerns about the impact LTE-U will have on Wi-Fi services. Today we will review our most recent research, which was conducted in our test house.

How We Did Our In-Home Tests

Proponents of LTE-U claim that the technology will be ‘more friendly to Wi-Fi than Wi-Fi is to itself.’ So we decided to test that statement, using our test house to approximate a real-world environment, with technology that would comply with the LTE-U Forum Coexistence Specification v1.2.

(Note that just this week, the LTE-U Forum released a new version of their Coexistence Specification. We’re still looking at it along with the rest of the Wi-Fi community, since it is again a product of their closed process, but we don’t think it changes much for our purposes here. And judging by the discussion at this week’s Wi-Fi Alliance workshop, much work remains to get to reliable coexistence.)

To do our tests, we went to Best Buy and bought two identical off-the-shelf Wi-Fi APs, and we used an LTE signal generator that we can program to cycle the signal on and off – “duty cycling,” in the parlance of the LTE-U Forum.

We first established a baseline of how fair Wi-Fi is to itself. We had our off-the-shelf APs send full buffer downlink traffic (i.e., an AP to a Wi-Fi device), and took throughput, latency and packet loss measurements. We found that the two APs shared the spectrum about equally – throughput was roughly 50:50, for instance.

Next, we replaced one of the APs with LTE-U and repeated our tests, using various duty cycle configurations (LTE ‘on’ times). To make things simpler, we made sure that all of our measurements were done at the relatively strong signal level (-62 dBm) that is used by the LTE-U Forum as the level for detecting Wi-Fi.

What Our Research Found

If our Wi-Fi AP performed better in the presence of LTE-U, that would validate the claims made about it being a better neighbor.

But, we unfortunately did not find that. Quite the opposite, in fact. Our results showed that Wi-Fi performance suffered disproportionately in the presence of LTE-U. Wi-Fi throughput (“speed”) is degraded by 70% when LTE-U is on only 50% of the time, for instance. And more aggressive LTE-U configurations (longer ‘on’ times) would do even more damage, as seen in Figure 1.

Figure 1

Why does LTE-U do disproportionate damage to Wi-Fi? The primary reason is that it interrupts Wi-Fi mid-stream, instead of waiting its turn. This causes errors in Wi-Fi transmissions, ratcheting down its performance. We ran tests across a range of duty cycles to explore this effect. In our test case of two networks sharing with each other, to maintain Wi-Fi at 50% throughput, LTE-U could be on for no more than 35% of the time, as seen in Figure 2.

Figure 2
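One way to see how far this is from proportional sharing is the quick model below. The 30% and 35% data points come from the measurements described above (Figures 1 and 2); the “expected” values are our own assumption of what a perfectly polite sharer would leave Wi-Fi, expressed relative to the 50:50 Wi-Fi/Wi-Fi baseline.

```python
def expected_vs_baseline(lte_duty_on: float) -> float:
    """If LTE-U shared proportionally, Wi-Fi would get the remaining
    airtime, (1 - duty) of the channel, i.e. (1 - duty) / 0.5 of the
    throughput it had in the 50:50 Wi-Fi/Wi-Fi baseline."""
    return (1.0 - lte_duty_on) / 0.5

# LTE-U duty cycle -> measured Wi-Fi throughput relative to baseline
measured = {0.50: 0.30,   # "degraded by 70%" at a 50% 'on' time
            0.35: 1.00}   # duty cycle at which Wi-Fi just keeps its baseline

for duty, got in sorted(measured.items()):
    exp = expected_vs_baseline(duty)
    print(f"LTE-U on {duty:.0%}: expected ~{exp:.2f}x baseline, measured {got:.2f}x")
```

At a 50% duty cycle, proportional sharing would leave Wi-Fi its full baseline throughput; instead it kept only about 0.3x, because mid-stream interruptions cause transmission errors on top of the lost airtime.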

Does this mean that if LTE-U were to limit its duty cycle to 35% that Wi-Fi would perform acceptably? Unfortunately, it is not that simple. Our test case is admittedly limited: We are showing only two networks sharing the spectrum here, but in reality there can sometimes be hundreds of Wi-Fi APs within range of each other. If LTE-U took 35% of the airtime when sharing with 5 Wi-Fi networks, or 50 for that matter, the Wi-Fi experience would surely suffer significantly. And the problem gets worse if there are multiple LTE-U networks as well, which seems likely.

The effect that LTE-U would have across a range of real-world circumstances is, frankly, unknown. It has not been researched – by anyone. What we do know, based on our work here, is that the 50% LTE-U ‘on’ time considered by the LTE-U Forum when two networks are sharing (see Section 6.2.1 of their coexistence spec) does not yield proportionate throughput results.

That’s the effect of LTE-U on throughput, which is important. But equally important is latency, or the amount of delay on the network, which can have a dramatic impact on real-time applications like voice and video conferencing. We tested Wi-Fi latency across different duty cycles, and the results are seen in Figure 3 below. What’s important to note is that while the two Wi-Fi APs we tested can co-exist while providing smooth operation of real-time communications, that doesn’t appear to be the case if LTE-U is present. Even if LTE-U is only on 50% of the time, it degrades real-time Wi-Fi communications in a way that would likely be irritating to Wi-Fi users. And if LTE-U is on 70% of the time, then latency reaches levels where even latency-tolerant applications, like web page loading, are likely to become irritating.

Figure 3

So, we have seen in our research that LTE-U causes disproportionate harm to Wi-Fi, even when it is configured to share roughly 50:50 in time – which is not a given, as we have noted before. That’s because LTE-U interrupts Wi-Fi signals instead of waiting its turn through the listen-before-talk approach that Wi-Fi uses. We discussed the importance of using listen-before-talk in our last blog, and now you can see why we think it is important for LTE-U.

This research shows that LTE-U is not, in fact, a better neighbor to Wi-Fi than Wi-Fi is to itself, as its proponents claim. What should we make of these competing results? Is one claim right, and the other wrong? It’s not really that simple – the answer depends on how LTE-U is configured and deployed, and what coexistence features it actually adopts.

This highlights the need for the open and collaborative R&D that we have long been urging, so that we can find solutions that actually work for everyone. That has been happening with LAA, the 3GPP standard form of unlicensed LTE, where there is room for cautious optimism on its ability to coexist well with Wi-Fi. Hopefully LTE-U proponents will move toward actual collaboration.

Jennifer Andreoli-Fang is a Principal Architect in the Network Technologies group at CableLabs.

Rob Alderfer, VP of Technology Policy, and Nadia Yoza-Mitsuishi, Wireless Architect Intern, both at CableLabs, also contributed to this article.


Can a Wi-Fi radio detect Duty Cycled LTE?

Joey Padden
Distinguished Technologist, Wireless Technologies

Jun 24, 2015

For my third blog I thought I’d give you a preview of a side project I’ve been working on. The original question was pretty simple: Can I use a Wi-Fi radio to identify the presence of LTE?

Before we go into what I’m finding, let’s recap: We know LTE is coming into the unlicensed spectrum in one flavor or another. We know it is (at least initially) going to be a tool that only mobile network operators with licensed spectrum can use, as both LAA and LTE-U will be “license assisted” – locked to licensed spectrum. We know there are various views about how well it will coexist with Wi-Fi networks. In my last two blog posts (found here and here) I shared my views on the topic, while some quick Googling will find you a different view supported by Qualcomm, Ericsson, and even the 3GPP RAN Chairman. Recently the coexistence controversy even piqued the interest of the FCC who opened a Public Notice on the topic that spawned a plethora of good bedtime reading.

One surefire way to settle the debate is to measure the effect of LTE on Wi-Fi, in real deployments, once they occur. However, to do that, you must have a tool that can measure the impact on Wi-Fi. You’d also need a baseline, a first measurement of Wi-Fi-only performance in the wild to use as a reference.

So let’s begin by considering this basic question: using only a Wi-Fi radio, what would you measure when looking for LTE? What Key Performance Indicators (KPIs) would you expect to show you that LTE was having an impact? After all, to a Wi-Fi radio LTE just looks like loud informationless noise, so you can’t just ask the radio “Do you see LTE?” and expect an answer. (Though that would be super handy.)

To answer these questions, I teamed up with the Wi-Fi performance and KPI experts at 7Signal to see if we could use their Eye product to detect, and better yet, quantify the impact of a co-channel LTE signal on Wi-Fi performance.

Our first tests were in the CableLabs anechoic chamber. This chamber is a quiet RF environment used for very controlled and precise wireless testing. The chamber afforded us a controlled environment to make sure we’d be able to see any “signal” or difference in the KPIs produced by the 7Signal system with and without LTE. After we confirmed that we could see the impact of LTE on a number of KPIs, we moved to a less controlled, but more real world environment.

Over the past week I’ve unleashed duty cycled LTE at 5GHz (a la LTE-U) on the CableLabs office Wi-Fi network. To ensure the user/traffic load and KPI sample noise was as real world as possible… I didn’t warn anybody. (Sorry guys! That slow/weird Wi-Fi this week was my fault!)

In the area tested, our office has about 20 cubes and a break area with the majority of users sharing the nearest AP. On average throughout the testing we saw ~25 clients associated to the AP.

We placed the LTE signal source ~3m from the AP. We chose two duty cycles, 50% of 80 ms and 50% of 200 ms, and always shared a channel with a single Wi-Fi access point within energy detection range. We also tested two power levels, -40 dBm and -65 dBm at the AP, so we could test with LTE power above and just below the Wi-Fi LBT energy detection threshold of -62 dBm.
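The duty-cycle configurations above can be unpacked into concrete on/off timings. A quick sketch (an illustrative helper, not the actual test harness):

```python
# Compute the LTE 'on' and 'off' durations for a duty-cycled configuration:
# e.g., "50% of 80 ms" means LTE transmits for 40 ms, then stays quiet 40 ms.

def duty_cycle_timing(duty_cycle, period_ms):
    """Return (on_ms, off_ms) for a given duty cycle and period."""
    on_ms = duty_cycle * period_ms
    return on_ms, period_ms - on_ms

for duty, period in [(0.5, 80), (0.5, 200)]:
    on_ms, off_ms = duty_cycle_timing(duty, period)
    print(f"{duty:.0%} of {period} ms -> LTE on {on_ms:.0f} ms, off {off_ms:.0f} ms")
```

Note that the two configurations give the same average airtime but different interruption patterns, which is exactly why both were tested: a longer period means longer continuous blocks of Wi-Fi silence.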

We will have more analysis and results later, but I just couldn’t help but share some preliminary findings. The impact to many KPIs is obvious and 7Signal does a great job of clearly displaying the data. Below are a couple of screen grabs from our 7Signal GUI.

The first two plots show the tried and true throughput and latency. These are the KPIs most likely to show the impact, and sure enough, the impact is clear.

Figure 1 - Wi-Fi Throughput Impact from Duty Cycle LTE

Figure 2 - Wi-Fi Latency Impact from Duty Cycle LTE

We were able to discern a clear LTE “signal” from many other KPIs. Some notable examples were channel noise floor and the rate of client retransmissions. Channel noise floor is pretty self-explanatory. Retransmissions occur when either the receiver was unable to successfully receive a frame or was blocked from sending the ACK frame back to the transmitter. The ACK frame, or acknowledgement frame, is used to tell the sender that the receiver got the frame successfully. The retransmission plot shows the ratio of retransmitted frames over total captured frames in a 10 second sample.
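The retransmission-rate KPI just described is simple to compute. A minimal sketch, with a hypothetical frame representation (a real capture would parse the Retry bit from each 802.11 frame's Frame Control field):

```python
# Retransmission-rate KPI: ratio of retransmitted frames to total
# captured frames within one sample window (10 seconds in our tests).

def retransmission_rate(frames):
    """frames: list of dicts with a boolean 'retry' flag per captured frame."""
    if not frames:
        return 0.0
    retries = sum(1 for f in frames if f["retry"])
    return retries / len(frames)

# Hypothetical sample: 10 retransmissions out of 100 captured frames.
sample = [{"retry": False}] * 90 + [{"retry": True}] * 10
print(retransmission_rate(sample))  # 0.1
```

A rising ratio across sample windows, with no change in client population or traffic, is the kind of indirect fingerprint that let us see LTE through a Wi-Fi radio's eyes.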

As a side note: Our findings point to a real problem when the LTE power level at an AP is just below the Wi-Fi energy detection threshold. These findings are similar to those found in the recent Google authored white paper attached as Appendix A to their FCC filing.

Figure 3 - Channel Noise Floor Impact from Duty Cycle LTE

Figure 4 - Wi-Fi Client Retransmission Rate Impact of Duty Cycle LTE

Numerous other KPIs showed the impact but require more post processing of the data and/or explanation so I’ll save that for later.

In addition to the above plots, I have a couple of anecdotal results to share. First, our Wi-Fi controller was continuously changing the channel on us, making testing a bit tricky. I guess it didn’t like the LTE much. Also, we had a handful of CableLabs employees figure out what was happening and say “Oh! That’s why my Wi-Fi has been acting up!” followed by various defamatory comments about me and my methods.

Hopefully all LTE flavors coming to the unlicensed bands will get to a point where coexistence is assured and such measures won’t be necessary. If not — and we don’t appear to be there yet — it is looking pretty clear that we can detect and measure the impact of LTE on Wi-Fi in the wild if need be. But again, with continued efforts by the Wi-Fi community to help develop fair sharing technologies for LTE to use, it won’t come to that.


A Look at CableLabs, from CEO Phil McKinney

Phil McKinney
President & CEO

May 8, 2014

Phil McKinney, CEO at CableLabs, recently wrote an article for Cablefax highlighting the history and work of CableLabs within the cable industry.  Here is a snippet, and be sure to follow the link to read the entire article:

"Today’s cable industry is a far cry from what it was twenty years ago when the Internet was in its infancy, the cloud did not exist, and mobile devices were not in everyone’s pocket. Slowly at first, and then with an almost voracious momentum, sophisticated networks have grown up to support the rapid transfer of data and information that feed our financial, educational, social, and entertainment infrastructures.

For many consumers and small businesses around the world, it is a cable network, built on the foundational efforts of an independent entrepreneur, that delivers the services linking them to the ever expanding global web—services that today we routinely take for granted: high speed broadband Internet, WiFi, high quality video (pay TV), and increasingly, the infrastructure for connected devices to communicate with each other — 'the Internet of things.'

None of this happened by magic. It was the hard work of many individuals across many companies, standards bodies and research organizations that created the cable service infrastructure we have today. And hard work continues—focused on building out a cable infrastructure for a next generation of services targeting the needs of a global market. CableLabs has been at the core of these efforts from the start."

Read the entire article here.


Carrier-Grade Wi-Fi Keeps Pace With Wi-Fi Network Growth: How CableLabs is Contributing

Mark Poletti
Director, Wireless Network Technologies

Apr 28, 2014

“Operators of all kinds – fixed, mobile, converged and pure-play Wi-Fi – are moving beyond using Wi-Fi just for convenient access, or data offload, and are making it a central part of their broader strategies to support a high quality broadband experience everywhere.”
(Excerpt from WBA Industry Report 2013: Global Trends in Public Wi-Fi p3)

Over the past few years Wi-Fi networks have seen substantial growth in the number of hotspots to deliver expanded coverage and new service offerings.  Wi-Fi networks offer unique characteristics, such as very high data rates (50 Mbps up to 400 Mbps using 802.11n), easy connection to neutral hosts, and unlicensed shared spectrum.  Hotspots have very small coverage areas and are especially useful in densely populated areas.  This is very attractive to 3G/4G cellular carriers experiencing congestion due to spectrum and network capacity limitations.  Cellular networks experiencing such congestion use Wi-Fi networks to offload cellular data to maintain service and data integrity.


The Wireless Broadband Alliance (WBA) projects that 22% of new carrier capacity will come from Wi-Fi networks in 2014.

Wi-Fi network operators are exploring opportunities beyond 3G offload that will offer differentiated services to their core business.  For example, Wi-Fi network operators, such as mobile network operators (MNOs) and cable operators (MSOs), offer service plans that enhance existing service (e.g., cable operators extending fixed cable service to wireless) or extend it (e.g., mobile to Wi-Fi, as used by T-Mobile and Verizon).  As mobile and Wi-Fi services continue to overlap and converge, the integration of mobile and Wi-Fi networks, such as that used for 3G-to-Wi-Fi network handoff, will become increasingly prevalent and more complex.


(Excerpt from WBA Industry Report 2013: Global Trends in Public Wi-Fi)

Vision of Carrier Grade Wi-Fi

As Wi-Fi networks are used for faster and more robust data, video, and voice services, maintaining a quality user experience is becoming increasingly important.  “Carrier grade Wi-Fi” is a phrase used for an industry effort to improve Wi-Fi network design, management, and performance to closely match that of cellular networks within the inherent limitations of Wi-Fi.  The carrier grade Wi-Fi effort touches all aspects of the Wi-Fi ecosystem including vendors, operators, chipset manufacturers, industry groups, and standards bodies.

Wi-Fi was created to share unlicensed spectrum on a non-interfering basis with autonomous performance and control for any number of users.  As industry and market demand has evolved over the years, Institute of Electrical and Electronics Engineers (IEEE) standards and industry groups – such as the Wi-Fi Alliance (WFA) and the Wireless Broadband Alliance (WBA) – have continually added features to improve the quality of Wi-Fi network service, performance, and management.  Recent activities have begun to address specific features that will enable Wi-Fi to be carrier grade.

The WBA recently submitted a white paper addressing carrier grade Wi-Fi guidelines and providing definitions of carrier grade Wi-Fi features.  In addition, the WFA has been incorporating these features into interoperability certification.

CableLabs has submitted several contributions that provide carrier grade Wi-Fi feature definition and use cases.  These features have been incorporated into an initial operators’ requirements document that will, once completed, be referenced by WFA task groups as test requirements for WFA certification.

CableLabs has also submitted a contribution to the Wi-Fi Mobile Converged Wireless Group (CWG) Test Plan Group.  This group has the charter to define the testing procedures for radio-frequency performance of all Wi-Fi devices (i.e. access points, tablets, Wi-Fi clients, and smartphones).

Radio Frequency (RF) Performance Is Key to Carrier Grade Wi-Fi

One key aspect of carrier grade Wi-Fi is RF performance.  At present, Wi-Fi access points (APs) and devices are not held to defined RF performance standards. This impacts coverage and capacity performance of Wi-Fi networks. For enterprise and consumer devices, defining minimum RF performance requirements will ensure that poorly performing devices don’t reduce overall network capacity.  It is important to note that maximizing transmitter and receiver performance has a dramatic impact on the coverage and capacity of Wi-Fi networks.

The graph in Figure 1 shows that a Wi-Fi device underperforming by just a few decibels (dB), a logarithmic measure of RF power, can suffer shorter range and lower throughput.  Figure 1 also shows coverage heat maps for carrier grade and non-compliant Wi-Fi APs, which indicate a significant reduction in coverage for the non-compliant APs.
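A back-of-the-envelope sketch shows why a few dB matters so much for coverage. This uses the standard log-distance path-loss model as an illustrative assumption (the figure's heat maps come from actual measurements, not this formula):

```python
# Log-distance path loss: PL(d) = PL(d0) + 10 * n * log10(d / d0), where n is
# the path-loss exponent (~2 free space, ~3-4 indoors). A link-budget
# shortfall of delta_db therefore shrinks the usable radius by a factor
# of 10 ** (-delta_db / (10 * n)).

def range_factor(delta_db, path_loss_exponent=3.0):
    """Fraction of the original coverage radius retained after delta_db loss."""
    return 10 ** (-delta_db / (10 * path_loss_exponent))

for db in (1, 3, 6):
    print(f"{db} dB shortfall -> {range_factor(db):.0%} of original range")
```

With an indoor exponent of 3, a 3 dB shortfall cuts the coverage radius to roughly 79% of its original value, and coverage area (which scales with radius squared) shrinks even faster, consistent with the pronounced heat-map difference in Figure 1.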


Figure 1 – Coverage Performance of Carrier Grade vs. Underperforming Wi-Fi Devices

By improving the RF component selection and design optimization in Wi-Fi devices, the achievable network throughput significantly improves.  CableLabs is working with industry bodies to determine reasonable performance thresholds for APs and Wi-Fi devices.

Current State of RF Certifications

At present, RF certification of Wi-Fi devices and access points is optional with no minimum performance standards.  Original Equipment Manufacturers (OEMs) can choose whether or not to test their equipment against WFA standards.  Although current test plans are thorough, they do not contain pass/fail criteria or minimum performance requirements.

Currently, Federal Communications Commission (FCC) RF regulations are only intended to ensure public safety and non-interference with co-channel and adjacent channel systems. The regulations are designed around “not to exceed” power levels, not minimum levels. In other words, the FCC focus has been on avoiding interference to other systems rather than requiring minimum performance.  Because receiver sensitivity is not tested, end-user performance goes unassessed.  In addition, FCC tests do not cover device throughput performance versus coverage or interference.



Figure 2 – CWG RF Certification Testing for Smart Phones, Home and Small Office APs

As shown in Figure 2, CWG RF certification testing of Wi-Fi devices has seen little uptake by OEMs, and device OEM participation is decreasing.  CableLabs is working with industry bodies to recommend that Wi-Fi devices be required to meet minimum RF performance criteria during WFA RF testing, and to encourage RF certification testing.

CableLabs to Install RF Anechoic Chamber


CableLabs is in the process of installing an in-house RF anechoic chamber with the capability to measure RF performance of Wi-Fi devices such as total isotropic sensitivity (TIS), total radiated power (TRP), Wi-Fi adjacent-/co-channel interference and LTE interference.  Scheduled completion is late July 2014.  Once complete, CableLabs will have the ability to test RF performance of all Wi-Fi devices.  The RF chamber will be available to all member MSOs with the potential to lead Wi-Fi RF certification tests within the Wi-Fi industry.

Wi-Fi First Service Will Benefit from Carrier Grade Wi-Fi

Wi-Fi first service, a hybrid Wi-Fi/mobile service in which voice over Wi-Fi is selected ahead of mobile voice service, is among the many new innovative services being introduced by Wi-Fi network operators.  This service presents several technical challenges, such as:

  • Maintaining seamless Wi-Fi roaming and handover between Wi-Fi and 3G/4G networks
  • Maintaining authentication and security across Wi-Fi and 3G/4G networks
  • Supporting service profile across Wi-Fi and 3G/4G networks
  • Providing efficient mobile device network discovery, selection, and attachment
  • Delivering quality of service

Wi-Fi network operators are developing customized solutions for the network architectures and target markets unique to their core business. As carrier grade Wi-Fi continues to gain momentum, it will improve Wi-Fi network quality and user experience. CableLabs is working with vendors and operators to develop solutions, with an emphasis on carrier grade Wi-Fi features.

Other Aspects of Carrier Grade Wi-Fi: What’s Next?

As carrier grade Wi-Fi continues to gain traction in the industry, CableLabs will maintain pace with standards body activities, product roadmaps, technology roadmaps, new services, business paradigms, and technology disruptors that will impact user experience and quality of Wi-Fi networks.  Future blog posts will provide updates to carrier grade Wi-Fi features that may include:

  • Solutions to Wi-Fi first network architectures and mobile applications
  • Carrier Grade Wi-Fi WFA certifications
  • IEEE 802.11 feature enhancements
  • Carrier Grade best practices for Wi-Fi  network planning, operation and performance
  • Quality of Service


Mark Poletti is a Lead Wireless Architect with CableLabs.  He has been addressing mobile operator design, operations and performance issues of 2G, 3G, 4G, and satellite networks for over 20 years.   Mark is a member of the WFA and WBA and is focused on the wireless convergence of MSO and MNO networks.