
Fair LTE-U Coexistence Far From Proven In CableLabs / Qualcomm Testing

Rob Alderfer
VP, Technology Policy

Nov 11, 2015

CableLabs has been working hard to ensure that the introduction of LTE into unlicensed spectrum is a win for wireless broadband, and that it does not disrupt the Wi-Fi services that consumers have come to rely on. We have covered in prior posts why that is a challenge, since LTE-U can take advantage of Wi-Fi’s inherent politeness. Here we will review recent technical work we did with a major proponent of LTE-U, Qualcomm Technologies, and explain why that effort only reinforces our concerns.

In brief, we observed that the current LTE-U prototype equipment is quite primitive (it is really just in mock-up state at this point) and incapable of demonstrating important coexistence features. The features the vendor has promised, most of which are neither required nor even identified by the LTE-U Forum in its latest specifications, do not yet work well enough to enable fair, reliable coexistence or to support confident testing. In addition, we found that claims of LTE-U's ability to share fairly rest on a seemingly faulty understanding of how Wi-Fi shares spectrum. For CableLabs, this reinforces the need for a collaborative and open standards development process.

The Wi-Fi community has long sought for LTE-U the same collaborative standards development process that LAA has enjoyed (License Assisted Access, or LAA, is the flavor of unlicensed LTE being developed in the mobile standards body, 3GPP). But in the absence of LTE-U standards development, CableLabs has engaged directly with the promoters of LTE-U in an attempt to do the fundamental research required to find coexistence solutions. Far from converging on solutions, however, our work to date on LTE-U has raised more questions than answers.

Lessons Learned with Qualcomm’s LTE-U

CableLabs recently concluded a brief technical engagement at Qualcomm's campus in San Diego, which was preceded by a lengthy negotiation over what we would be allowed to test on site. Ultimately, the scope of the plan was much narrower than our guidance and focused on a limited set of basic coexistence tests. It certainly was not the thorough research that we recommended, and that is required to address the concerns of Wi-Fi technologists, which we summarized in a presentation to the Wi-Fi Alliance last week.

We began this limited test hoping that we would be able to independently validate the definitive statements of LTE-U proponents that the technology has been 'proven to coexist' and is 'more friendly to Wi-Fi than Wi-Fi is to itself'.

Unfortunately, our main conclusion from the three weeks we spent on site at Qualcomm is that there is no basis for definitive technical statements about LTE-U coexistence. The reason for this is surprisingly simple: LTE-U is in a prototype phase of development, and does not possess the features that its proponents have noted are important to coexistence.

For instance, Qualcomm has noted that their LTE-U solution will sense the spectrum for Wi-Fi activity and adjust its duty cycle 'on' time in response to rapidly changing congestion conditions. But that is not what we were shown. What we saw was an LTE-U prototype that must have its duty cycle manually programmed; it has no adaptation capability at all. There were other issues as well: for example, the equipment did not natively use the 5 GHz band that is targeted for LTE-U, and it supported only a single user device. We will refrain from going on at length here, but as is apparent in the photo below, LTE-U requires substantial further development. In short, we were surprised to see that the state of the art plainly won't work in the real world, despite assurances to the contrary and claims of comprehensive testing.

Figure 1: Prototype LTE-U base station (left) and user device (right)
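
The adaptation feature Qualcomm describes is often referred to as CSAT (Carrier Sense Adaptive Transmission). To make concrete what we expected to see, here is a minimal sketch of such a control loop, based on our reading of the public descriptions; the sensing and radio-control functions are hypothetical placeholders, not Qualcomm's algorithm or API:

```python
# Minimal sketch of CSAT-style adaptive duty cycling as publicly described:
# sense Wi-Fi activity while LTE is 'off', then nudge the 'on' fraction.
# Both functions below are hypothetical placeholders, not a real API.

def measure_wifi_utilization() -> float:
    """Placeholder: fraction of the LTE 'off' window with Wi-Fi energy present."""
    raise NotImplementedError

def set_duty_cycle(on_fraction: float) -> None:
    """Placeholder: program the LTE 'on' fraction for the next cycle."""
    raise NotImplementedError

def csat_like_loop(target_wifi_share: float = 0.5, step: float = 0.05,
                   min_on: float = 0.05, max_on: float = 0.5) -> None:
    on_fraction = min_on
    while True:
        set_duty_cycle(on_fraction)
        busy = measure_wifi_utilization()
        if busy > target_wifi_share:
            # Neighbors are active: back off to leave them more airtime.
            on_fraction = max(min_on, on_fraction - step)
        else:
            # Channel is relatively idle: claim more airtime.
            on_fraction = min(max_on, on_fraction + step)
```

The prototype we were shown implemented nothing like this loop; its duty cycle was a fixed value, set by hand.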

Importance of a Common Research Framework

Since LTE-U equipment is not mature, it should come as no surprise that coexistence research leaves much to be desired as well. Statements that LTE-U is ‘more friendly to Wi-Fi than Wi-Fi is to itself’ necessarily rely on a baseline understanding of how Wi-Fi shares the spectrum with other Wi-Fi networks. But, in our three weeks at Qualcomm, engineers spent the majority of the time grappling with that crucial baseline information. The test setup at Qualcomm was uncontrolled and provided strangely imbalanced measurements. Afterwards, a CableLabs engineer replicated the setup at our Colorado facilities with the same make of Wi-Fi equipment, and within a half hour obtained balanced results, suggesting that problematic baseline measurements were somehow endemic to Qualcomm’s research environment. CableLabs and its members regularly test, configure, and operate Wi-Fi networks, and the behavior we observed on-site in San Diego seems quite out of the ordinary. Selected baseline measurements are shown in Figure 2 below, including the expected balanced baseline we observed in our Colorado lab.

Figure 2: Imbalanced Wi-Fi baseline behavior seen in Qualcomm research was not observed in follow-on CableLabs work

This highlights the fundamental problem with the LTE-U coexistence research done to date: There is no common technical framework in which stakeholders are working, which makes it very difficult, if not impossible, to interpret research results across studies. This is apparent in our limited work with Qualcomm, and in Qualcomm’s prior studies, which also reflect baseline imbalances and call into question the research conclusions of LTE-U proponents.

Since we spent most of our time at Qualcomm working to diagnose apparent problems with the research environment, we did not come close to executing the already modest test plan developed at the outset. We did, however, take some limited measurements of Wi-Fi behavior in the presence of LTE-U (which was tuned to a 50% duty cycle 'on' time, so this is only a narrow representation of possible real-world LTE-U configurations).

As seen in Figure 3 below, the impact of LTE-U depends on the comparison point: to conclude that LTE-U coexists better than Wi-Fi, one would need to lower the bar as much as possible by using an imbalanced baseline and referencing only its lower end in the analysis. That would clearly be a skewed approach, and even then it does not tell a conclusive story; see the first case in the figure below, where the presence of LTE-U degrades Wi-Fi more than either baseline case. We would submit that the better approach is to diagnose the problems with the baseline measurements, rather than using unexplained results to justify definitive conclusions.

Figure 3: LTE-U coexistence not reliably determined

Furthermore, the in-home research detailed in a recent blog post by our principal architect, Jennifer Andreoli-Fang, made it clear that LTE-U is likely to have a disproportionately negative impact on Wi-Fi when the baseline is properly calibrated. An open standards process with common research methods is clearly needed to drive greater consistency and confidence in results.

We certainly hope that Qualcomm’s LTE-U solution will move from prototype to product in the near future, so that the Wi-Fi community can attempt to validate its coexistence efficacy in an open and comprehensive fashion. But that would only be one necessary step on the path toward equitable spectrum sharing. As we have detailed before, the LTE-U Forum coexistence specification leaves substantial room for different vendor and carrier approaches, which are likely to do disproportionate harm to Wi-Fi.

While this quite limited testing at Qualcomm's facilities raised more questions than answers for us, CableLabs remains fully committed to rectifying the shortcomings in unlicensed LTE coexistence. Indeed, we have seen more hopeful progress in the development of the global standard form of the technology, LAA-LTE, which we are cautiously optimistic is on a path to coexist well. The more aggressive approach taken by LTE-U clearly poses significant challenges, but we see promise in the open standards process and the particular technical choices of LAA as the basis for a more effective and comprehensive solution.

Reliable coexistence in unlicensed spectrum requires a broadly supported agreement on specific solutions. That is why a collaborative, open standards development process is so important — that is how Wi-Fi is developed. Its success is self-evident in the marketplace.

Rob Alderfer is Vice President of Technology Policy at CableLabs.


Wi-Fi vs. Duty Cycled LTE-U: In-Home Testing Reveals Coexistence Challenges

Jennifer Andreoli-Fang
Distinguished Technologist, Wireless Technologies

Nov 5, 2015

Rob Alderfer, VP of Technology Policy at CableLabs, and Nadia Yoza-Mitsuishi, Wireless Architect Intern at CableLabs, also contributed to this article.

In our last blog on Wi-Fi / LTE coexistence, we laid out the dangers attending the apparent decision of a few large carriers to go forward with carrier-scale deployment of a non-standard form of unlicensed LTE in shared spectrum. This time, we will review some of the testing CableLabs conducted recently to explain why we are worried. We covered this material at a recent Wi-Fi Alliance workshop on LTE-U coexistence, along with the broader roadmap of research that we see as needed to get to solutions that are broadly supported.

Before we begin, let's recap: there are two different approaches to enabling LTE in unlicensed spectrum, LAA-LTE and LTE-U. LAA is the version that the cellular industry standards body (3GPP) has been working on for the past year, and its coexistence measures appear to be on a path similar to what is used in Wi-Fi: "listen before talk," or LBT. In contrast, LTE-U, the technology being developed for the US market, is taking a wholly different approach. LTE-U has not been submitted for consideration to a collaborative standards-setting body like 3GPP; instead, it is being developed by a small group of companies through a closed process. LTE-U uses a carrier-controlled on/off switch that "duty cycles" the LTE signal. It turns on to transmit for a period determined by the wireless carrier, then it switches off (again, at the discretion of the carrier) to allow other technologies such as Wi-Fi the opportunity to access the channel.
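
The difference is easy to see in code. Here is a toy contrast (our illustration, not anything from a specification): a duty-cycled transmitter keys off a clock and transmits on schedule no matter what is on the air, while an LBT transmitter defers whenever it senses the channel is busy.

```python
# Toy contrast (our illustration, not from any spec): duty cycling
# transmits on a fixed timer; listen-before-talk (LBT) defers to an
# occupied channel the way Wi-Fi does.

def duty_cycle_wants_to_tx(t_ms: float, period_ms: float = 100.0,
                           on_fraction: float = 0.5) -> bool:
    """Transmit during the first on_fraction of every period, clock-driven."""
    return (t_ms % period_ms) < on_fraction * period_ms

def lbt_wants_to_tx(channel_busy: bool) -> bool:
    """Transmit only when the channel is sensed idle."""
    return not channel_busy

# At t = 10 ms a duty-cycled radio transmits even if Wi-Fi is mid-frame:
print(duty_cycle_wants_to_tx(10.0))        # True, regardless of the channel
print(lbt_wants_to_tx(channel_busy=True))  # False, waits its turn
```

Note that nothing in the duty-cycling function consults the channel at all; that is the crux of the coexistence concern.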

CableLabs has tested duty-cycled LTE-U in our lab, our office environment, and most recently in a residential environment (test house) to research how technologies are experienced by consumers. Our research to date has raised significant concerns about the impact LTE-U will have on Wi-Fi services. Today we will review our most recent research, which was conducted in our test house.

How We Did Our In-Home Tests

Proponents of LTE-U claim that the technology will be ‘more friendly to Wi-Fi than Wi-Fi is to itself.’ So we decided to test that statement, using our test house to approximate a real-world environment, with technology that would comply with the LTE-U Forum Coexistence Specification v1.2.

(Note that just this week, the LTE-U Forum released a new version of their Coexistence Specification. We’re still looking at it along with the rest of the Wi-Fi community, since it is again a product of their closed process, but we don’t think it changes much for our purposes here. And judging by the discussion at this week’s Wi-Fi Alliance workshop, much work remains to get to reliable coexistence.)

To do our tests, we went to Best Buy and bought two identical off-the-shelf Wi-Fi APs, and we used an LTE signal generator that we can program to cycle the signal on and off: "duty cycling," in the parlance of the LTE-U Forum.

We first established a baseline of how fair Wi-Fi is to itself. We had our off-the-shelf APs send full-buffer downlink traffic (i.e., from an AP to a Wi-Fi device) and took throughput, latency, and packet loss measurements. We found that the two APs shared the spectrum about equally; throughput was roughly 50:50, for instance.
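
For readers who want to reproduce this kind of baseline, a script along the following lines will do it. This is a sketch, not our actual harness; iperf3, the addresses, and the topology are our assumptions here (each Wi-Fi station runs "iperf3 -s", and a wired host pushes full-buffer TCP downlink through both APs at once):

```python
# Sketch of a Wi-Fi fairness baseline (hypothetical addresses, not our
# actual harness): push full-buffer TCP downlink through two APs at the
# same time with iperf3 and compare the throughput split.
import json
import subprocess
from concurrent.futures import ThreadPoolExecutor

STATIONS = ["192.0.2.10", "192.0.2.20"]  # one station behind each AP

def run_iperf3(server_ip: str, seconds: int = 30) -> float:
    """Return mean received throughput in Mbps from iperf3's JSON output."""
    out = subprocess.run(
        ["iperf3", "-c", server_ip, "-t", str(seconds), "-J"],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["end"]["sum_received"]["bits_per_second"] / 1e6

# Run both flows simultaneously so the APs contend for the channel.
with ThreadPoolExecutor() as pool:
    mbps = list(pool.map(run_iperf3, STATIONS))

share = mbps[0] / sum(mbps)
print(f"AP1: {mbps[0]:.1f} Mbps, AP2: {mbps[1]:.1f} Mbps "
      f"(split {share:.0%} / {1 - share:.0%})")
```

A healthy Wi-Fi-to-Wi-Fi baseline should come out near 50% / 50%, which is what we saw.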

Next, we replaced one of the APs with LTE-U and repeated our tests, using various duty cycle configurations (LTE 'on' times). To keep things simple, we took all of our measurements at the relatively strong signal level (-62 dBm) that the LTE-U Forum uses as its threshold for detecting Wi-Fi.

What Our Research Found

If our Wi-Fi AP had performed better in the presence of LTE-U than in the presence of the second Wi-Fi AP, that would have validated the claim that LTE-U is the better neighbor.

But, unfortunately, we did not find that. Quite the opposite, in fact. Our results showed that Wi-Fi performance suffered disproportionately in the presence of LTE-U. Wi-Fi throughput ("speed") dropped by 70% when LTE-U was on only 50% of the time, for instance. And more aggressive LTE-U configurations (longer 'on' times) did even more damage, as seen in Figure 1.

Figure 1

Why does LTE-U do disproportionate damage to Wi-Fi? The primary reason is that it interrupts Wi-Fi mid-stream, instead of waiting its turn. This causes errors in Wi-Fi transmissions, which in turn push Wi-Fi's rate adaptation to ratchet down its performance. We ran tests across a range of duty cycles to explore this effect. In our test case of two networks sharing with each other, to maintain Wi-Fi at 50% throughput, LTE-U could be on for no more than 35% of the time, as seen in Figure 2.

Figure 2
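
For intuition about the mechanism, here is a deliberately crude model (entirely our simplification, with made-up rate steps; it is qualitative, not a reproduction of our measurements): every Wi-Fi frame that collides with LTE 'on' time is lost, and each loss pushes a Minstrel-style rate controller down one step, so Wi-Fi keeps far less than the proportional share the duty cycle alone would suggest.

```python
# Toy model (ours, qualitative only): Wi-Fi sends fixed 2 ms frames; any
# frame overlapping LTE 'on' time is corrupted; a simple rate controller
# steps down on each loss and back up after 5 straight successes.
RATES = [6, 12, 24, 36, 48, 54]   # simplified PHY rates, Mbps
FRAME_MS, PERIOD_MS = 2, 100

def wifi_goodput_mbit(duty_on: float, periods: int = 1000) -> float:
    """Total delivered Mbit with LTE 'on' for duty_on of each period."""
    rate_idx, streak, bits = len(RATES) - 1, 0, 0.0
    for t in range(0, periods * PERIOD_MS, FRAME_MS):
        if (t % PERIOD_MS) < duty_on * PERIOD_MS:
            rate_idx, streak = max(0, rate_idx - 1), 0   # loss: step down
        else:
            bits += RATES[rate_idx] * FRAME_MS / 1000.0  # delivered
            streak += 1
            if streak >= 5:                              # slow recovery
                rate_idx, streak = min(len(RATES) - 1, rate_idx + 1), 0
    return bits

base = wifi_goodput_mbit(0.0)
for duty in (0.3, 0.5, 0.7):
    kept = wifi_goodput_mbit(duty) / base
    print(f"LTE-U on {duty:.0%}: Wi-Fi keeps ~{kept:.0%} of baseline "
          f"(proportional sharing would leave {1 - duty:.0%})")
```

The absolute numbers from a toy like this mean little, but the shape matches our measurements: the loss is consistently worse than proportional, because every 'on' edge costs corrupted frames and depressed rates, not just airtime.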

Does this mean that if LTE-U were to limit its duty cycle to 35%, Wi-Fi would perform acceptably? Unfortunately, it is not that simple. Our test case is admittedly limited: we are showing only two networks sharing the spectrum here, but in reality there can sometimes be hundreds of Wi-Fi APs within range of each other. If LTE-U took 35% of the airtime when sharing with 5 Wi-Fi networks, or 50 for that matter, the Wi-Fi experience would surely suffer significantly. And the problem gets worse if there are multiple LTE-U networks as well, which seems likely.
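
The airtime arithmetic behind that concern is simple enough to check yourself (an equal-split assumption on our part, not a measurement):

```python
# Quick arithmetic (equal-split assumption, not a measurement): LTE-U's
# duty cycle takes a fixed slice of airtime; the Wi-Fi networks in range
# divide whatever is left.
def per_wifi_airtime(lte_share: float, n_wifi: int) -> float:
    """Airtime share left for each Wi-Fi network."""
    return (1.0 - lte_share) / n_wifi

for n in (1, 5, 50):
    print(f"{n:>2} Wi-Fi networks + LTE-U at 35%: "
          f"{per_wifi_airtime(0.35, n):.1%} airtime per Wi-Fi network")

# If the LTE-U node were instead a sixth Wi-Fi AP, six contending Wi-Fi
# networks would each converge on roughly 1/6, about 16.7%, not 13.0%.
```

The point is that Wi-Fi's share shrinks as the channel gets crowded, while a fixed LTE-U duty cycle does not.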

The effect that LTE-U would have across a range of real-world circumstances is, frankly, unknown. It has not been researched – by anyone. What we do know, based on our work here, is that the 50% LTE-U ‘on’ time considered by the LTE-U Forum when two networks are sharing (see Section 6.2.1 of their coexistence spec) does not yield proportionate throughput results.

That's the effect of LTE-U on throughput, which is important. But equally important is latency, the amount of delay on the network, which can have a dramatic impact on real-time applications like voice and video conferencing. We tested Wi-Fi latency across different duty cycles, and the results are seen in Figure 3 below. What's important to note is that the two Wi-Fi APs we tested can coexist while providing smooth operation of real-time communications, but that does not appear to be the case when LTE-U is present. Even if LTE-U is on only 50% of the time, it degrades real-time Wi-Fi communications in a way that would likely be irritating to Wi-Fi users. And if LTE-U is on 70% of the time, latency reaches levels where even latency-tolerant applications, like web page loading, are likely to become irritating.

Figure 3
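
Latency numbers like those in Figure 3 require nothing exotic to gather; a timestamped ping from a station is enough. Here is a sketch (the address is hypothetical, and this is not our actual harness) that samples round-trip times and reports the tail, which is what real-time applications feel:

```python
# Sketch of a latency sampler (hypothetical address, not our harness):
# ping a Wi-Fi station while LTE-U runs at a given duty cycle, then
# report the median and the 95th percentile of the round-trip times.
import re
import statistics
import subprocess

def ping_samples(host: str, count: int = 200) -> list[float]:
    """Collect round-trip times in ms by parsing ping's output."""
    out = subprocess.run(["ping", "-c", str(count), host],
                         capture_output=True, text=True).stdout
    return [float(m) for m in re.findall(r"time=([\d.]+) ms", out)]

rtts = sorted(ping_samples("192.0.2.10"))
print(f"median {statistics.median(rtts):.1f} ms, "
      f"p95 {rtts[int(0.95 * len(rtts))]:.1f} ms")
```

Running this once per duty cycle configuration yields the kind of trend shown in Figure 3; the 95th percentile is usually the number that tracks user irritation.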

So, we have seen in our research that LTE-U causes disproportionate harm to Wi-Fi, even when it is configured to share roughly 50:50 in time – which is not a given, as we have noted before. That’s because LTE-U interrupts Wi-Fi signals instead of waiting its turn through the listen-before-talk approach that Wi-Fi uses. We discussed the importance of using listen-before-talk in our last blog, and now you can see why we think it is important for LTE-U.

This research shows that LTE-U is not, in fact, a better neighbor to Wi-Fi than Wi-Fi is to itself, as its proponents claim. What should we make of these competing results? Is one claim right, and the other wrong? It’s not really that simple – the answer depends on how LTE-U is configured and deployed, and what coexistence features it actually adopts.

This highlights the need for the open and collaborative R&D that we have long been urging, so that we can find solutions that actually work for everyone. That has been happening with LAA, the 3GPP standard form of unlicensed LTE, where there is room for cautious optimism on its ability to coexist well with Wi-Fi. Hopefully LTE-U proponents will move toward actual collaboration.

Jennifer Andreoli-Fang is a Distinguished Technologist in the Wireless Technologies group at CableLabs.

Rob Alderfer, VP of Technology Policy, and Nadia Yoza-Mitsuishi, Wireless Architect Intern, both at CableLabs, also contributed to this article.


Can a Wi-Fi radio detect Duty Cycled LTE?

Joey Padden
Distinguished Technologist, Wireless Technologies

Jun 24, 2015

For my third blog I thought I'd give you a preview of a side project I've been working on. The original question was pretty simple: can I use a Wi-Fi radio to identify the presence of LTE?

Before we go into what I'm finding, let's recap: We know LTE is coming into the unlicensed spectrum in one flavor or another. We know it is (at least initially) going to be a tool that only mobile network operators with licensed spectrum can use, as both LAA and LTE-U will be "license assisted," locked to licensed spectrum. We know there are various views about how well it will coexist with Wi-Fi networks. In my last two blog posts (found here and here) I shared my views on the topic, while some quick Googling will find you a different view supported by Qualcomm, Ericsson, and even the 3GPP RAN Chairman. Recently the coexistence controversy even piqued the interest of the FCC, which opened a Public Notice on the topic that spawned a plethora of good bedtime reading.

One surefire way to settle the debate is to measure the effect of LTE on Wi-Fi, in real deployments, once they occur. However, to do that, you must have a tool that can measure the impact on Wi-Fi. You'd also need a baseline: a first measurement of Wi-Fi-only performance in the wild to use as a reference.

So let’s begin by considering this basic question: using only a Wi-Fi radio, what would you measure when looking for LTE? What Key Performance Indicators (KPIs) would you expect to show you that LTE was having an impact? After all, to a Wi-Fi radio LTE just looks like loud informationless noise, so you can’t just ask the radio “Do you see LTE?” and expect an answer. (Though that would be super handy.)

To answer these questions, I teamed up with the Wi-Fi performance and KPI experts at 7Signal to see if we could use their Eye product to detect, and better yet, quantify the impact of a co-channel LTE signal on Wi-Fi performance.

Our first tests were in the CableLabs anechoic chamber. This chamber is a quiet RF environment used for very controlled and precise wireless testing. The chamber afforded us a controlled environment to make sure we’d be able to see any “signal” or difference in the KPIs produced by the 7Signal system with and without LTE. After we confirmed that we could see the impact of LTE on a number of KPIs, we moved to a less controlled, but more real world environment.

Over the past week I've unleashed duty-cycled LTE at 5 GHz (à la LTE-U) on the CableLabs office Wi-Fi network. To ensure the user/traffic load and KPI sample noise was as real-world as possible… I didn't warn anybody. (Sorry guys! That slow/weird Wi-Fi this week was my fault!)

In the area tested, our office has about 20 cubes and a break area with the majority of users sharing the nearest AP. On average throughout the testing we saw ~25 clients associated to the AP.

We placed the LTE signal source ~3 m from the AP. We chose two duty cycle configurations, 50% 'on' over an 80 ms period and 50% 'on' over a 200 ms period, and the LTE source always shared a channel with a single Wi-Fi access point within energy detection range. We also tested two power levels, -40 dBm and -65 dBm as received at the AP, so we could test with LTE power above and just below the Wi-Fi LBT energy detection threshold of -62 dBm.
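
For clarity, here is that test matrix as a script skeleton; the signal-generator control function is a placeholder for our instrument scripting (which I'm not publishing), so treat this as a sketch of the procedure rather than runnable lab code:

```python
# The test matrix as data. program_lte_source() is a placeholder for our
# instrument control, not a real API.
from itertools import product

DUTY_CYCLES = [(0.5, 80), (0.5, 200)]  # (on fraction, period in ms)
POWER_DBM_AT_AP = [-40, -65]           # above / just below the -62 dBm
                                       # Wi-Fi energy detection threshold

def program_lte_source(on_fraction: float, period_ms: int,
                       power_dbm: int) -> None:
    """Placeholder: configure the LTE signal generator for one run."""
    raise NotImplementedError

for (on, period), power in product(DUTY_CYCLES, POWER_DBM_AT_AP):
    print(f"run: {on:.0%} duty over {period} ms at {power} dBm at the AP")
    # program_lte_source(on, period, power)  # then collect 7Signal KPIs
```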

We will have more analysis and results later, but I just couldn’t help but share some preliminary findings. The impact to many KPIs is obvious and 7Signal does a great job of clearly displaying the data. Below are a couple of screen grabs from our 7Signal GUI.

The first two plots show the tried and true throughput and latency. These are the KPIs most likely to show the impact, and sure enough, the impact is clear.

Figure 1 - Wi-Fi Throughput Impact from Duty Cycle LTE

Figure 2 - Wi-Fi Latency Impact from Duty Cycle LTE

We were able to discern a clear LTE "signal" in many other KPIs. Some notable examples were the channel noise floor and the rate of client retransmissions. Channel noise floor is pretty self-explanatory. Retransmissions occur when either the receiver was unable to successfully receive a frame or was blocked from sending the ACK frame back to the transmitter. (The ACK frame, or acknowledgement frame, tells the sender that the receiver got the frame successfully.) The retransmission plot shows the ratio of retransmitted frames to total captured frames in a 10-second sample.
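
That retransmission ratio is easy to reproduce with any monitor-mode capture, by the way. Here is a sketch using scapy; the interface name is an assumption, and you'd need a NIC in monitor mode plus root privileges:

```python
# Sketch: 802.11 retransmission ratio from a 10-second monitor-mode
# capture. "wlan0mon" is an assumed interface name; requires root and a
# Wi-Fi NIC already placed in monitor mode on the channel under test.
from scapy.all import sniff
from scapy.layers.dot11 import Dot11

frames = sniff(iface="wlan0mon", timeout=10)
dot11 = [f[Dot11] for f in frames if f.haslayer(Dot11)]
retries = sum(1 for f in dot11 if f.FCfield & 0x08)  # Retry bit in frame control
print(f"retransmission ratio: {retries / max(len(dot11), 1):.1%}")
```

A spike in that ratio, alongside a raised noise floor with no corresponding Wi-Fi traffic, is exactly the kind of LTE fingerprint we went looking for.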

As a side note: our findings point to a real problem when the LTE power level at an AP is just below the Wi-Fi energy detection threshold. These findings are similar to those in the recent Google-authored white paper attached as Appendix A to their FCC filing.

Figure 3 - Channel Noise Floor Impact from Duty Cycle LTE

Figure 4 - Wi-Fi Client Retransmission Rate Impact of Duty Cycle LTE

Numerous other KPIs showed the impact but require more post processing of the data and/or explanation so I’ll save that for later.

In addition to the above plots, I have a couple of anecdotal results to share. First, our Wi-Fi controller kept changing the channel on us, making testing a bit tricky. I guess it didn't like the LTE much. Also, we had a handful of CableLabs employees figure out what was happening and say "Oh! That's why my Wi-Fi has been acting up!" followed by various defamatory comments about me and my methods.

Hopefully all LTE flavors coming to the unlicensed bands will get to a point where coexistence is assured and such measures won’t be necessary. If not — and we don’t appear to be there yet — it is looking pretty clear that we can detect and measure the impact of LTE on Wi-Fi in the wild if need be. But again, with continued efforts by the Wi-Fi community to help develop fair sharing technologies for LTE to use, it won’t come to that.
