5 paradigm shifts with LTE!
LTE, or Long Term Evolution, has brought to the fore capabilities that technologists and regulators (a.k.a. FCC/3GPP) have been championing for the longest time: advantages identified after years of operating GSM/CDMA systems, but which were never going to be realized without investment from MNOs (mobile network operators). These, along with the parallel development of exciting new phones and tablet devices, have changed the way we communicate today. Pull mechanisms have replaced the push mechanisms of yesterday. A few years back, if we had to share photos with our friends, we sent an email with links to our photos on Yahoo Photos, Picasa Web Albums or Flickr; today we just post them on Facebook or tweet the link to our ‘followers’. The way we use the internet has changed, and a recent study by Harvard Business Review has captured this paradigm shift.
Source: Harvard Business Review
With this shift in the way we interact with our fellow humans, the devices and means have also changed; gone are the phones that could only make voice calls. In comes the smartphone that revolutionized the wireless world: the iPhone, the icon that signaled Telecom 2.0, the arrival of an era of data-hungry devices that crave more spectrum and bandwidth.
Here are five things that I feel LTE can bring along, as it has been the biggest technology refresh since 3G.
The biggest benefit of GSM networks has been seamless roaming across countries and continents, largely because of harmonized spectrum spanning large parts of the world. The LTE specifications inherit all the frequency bands defined for UMTS, a list that continues to grow. There are now 13 FDD bands and 8 TDD bands. Significant overlap exists between some of the bands, but this does not necessarily simplify designs, since there can be band-specific performance requirements based on regional needs. There is no consensus on which band LTE will first be deployed in, since the answer is highly dependent on local variables. This lack of consensus is a significant complication for equipment manufacturers and contrasts with the start of GSM and W-CDMA, both of which were specified for only one band. What is now firmly established is that one may no longer assume that any particular band is reserved for any one access technology. To achieve LTE's maximum data rates, 20MHz of contiguous FDD spectrum is required.
To support seamless global roaming, harmonized spectrum will be needed – otherwise the burden is shifted to terminals (e.g., handsets or mobile devices) to support multiple frequency bands, which adds time, expense, complexity and inefficiency to the equation as well as silicon development.
Today, LTE is far from achieving harmonized spectrum. The three main emerging options for LTE spectrum allocation are:
- Digital Dividend: In the U.S., the 700MHz band has already been auctioned and Verizon has deployed LTE in it with a single 10MHz carrier. Currently in Europe there is a strong push to free up common 800MHz bandwidth across all countries; however, this remains an open and somewhat contentious issue with the regulators.
- 2.6GHz: Spectrum at 2.6GHz is available in large parts of the world and can serve as harmonized spectrum. However, there are a few important attributes to the 2.6GHz frequency band. First, relatively poor propagation characteristics will significantly impact indoor coverage, an issue already quite visible in 3G HSPA networks deployed in the 2.1GHz band. Second, poor propagation also translates into a smaller cell radius, and hence the need for more cells, which adds expense and complexity. Lastly and most importantly, frequency has a direct impact on network costs, as both OpEx and CapEx increase significantly at higher frequencies.
- 2G / 3G Spectrum Re-farming: There are a few spectrum re-farming options available, such as the 2.1GHz band (for 5MHz carriers) or the GSM 900MHz band (for carriers of 5MHz or less).
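The propagation penalty of the higher bands can be made concrete with the free-space path loss formula. This is a deliberately simplified sketch (real deployments use empirical models such as Okumura-Hata, and the band values here are just the 800MHz Digital Dividend vs. 2.6GHz examples above):

```python
import math

def fspl_db(d_km: float, f_mhz: float) -> float:
    """Free-space path loss in dB, for distance in km and frequency in MHz."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

# For a fixed link budget, free-space range scales inversely with frequency.
f_low, f_high = 800.0, 2600.0      # MHz: Digital Dividend vs 2.6GHz band
radius_ratio = f_high / f_low      # cell radius shrinks by this factor
site_ratio = radius_ratio ** 2     # sites needed scale with 1/r^2 (coverage area)

print(f"Cell radius shrinks ~{radius_ratio:.2f}x at 2.6GHz vs 800MHz")
print(f"~{site_ratio:.1f}x more sites for the same coverage area")
```

Even under this idealized model, moving from 800MHz to 2.6GHz costs about 10dB of path loss at the same distance, which is why the higher band implies many more (and thus more expensive) cell sites.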
There are no clear answers for LTE spectrum allocation and harmonization; this complex issue needs to be addressed by the industry in strong collaboration with regulators – otherwise seamless roaming across LTE networks will remain just a vision.
A Self-Organizing Network (SON) is a goal that operators and vendors alike are striving to achieve. The core principle is to completely eliminate the need for cumbersome radio network planning: as new cell sites are added to the network, neighboring cells dynamically learn about changes in the radio environment, recalibrate their configurations, and adjust accordingly to minimize interference and update neighbor cell lists for handoff support. The eventual goal is to integrate network planning, configuration and optimization into a single, mostly automated process requiring minimal manual intervention.
Presumably SON will enable quick network rollouts and upgrades and should greatly reduce the OpEx typically associated with planning and managing radio networks, and the architects of the Next Generation Mobile Network (NGMN) clearly mandated SON as a key requirement for 4G networks.
Most SON solutions today are Operations, Administration, Maintenance & Provisioning (OAM&P)-centric and address only a subset of the complete SON requirements. Current solutions can address certain aspects of SON, but there are many corner cases where they break, and thus SON will remain a research and innovation area for the next few years. Eventually the industry needs standardized SON specifications so that solutions from different vendors can co-exist in the RAN, but we are still a few years away from realizing this.
Future communication networks will nevertheless exhibit a significant degree of self-organization, comprising self-optimization, self-configuration and self-healing.
3GPP Specs – http://www.3gpp.org/ftp/Specs/html-info/32521.htm
Socrates Project – http://www.fp7-socrates.org/?q=node/1
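The "cells learn their neighbors" idea above can be sketched in a few lines. This is a toy illustration of the automatic-neighbor-relation concept, not the actual 3GPP ANR procedure; the cell names and the `on_measurement_report` method are hypothetical:

```python
class Cell:
    """Toy model of a cell that builds its neighbor list from UE reports."""

    def __init__(self, cell_id):
        self.cell_id = cell_id
        self.neighbors = set()  # learned dynamically, not planned manually

    def on_measurement_report(self, reported_cell_id):
        """A UE reported hearing another cell; learn it if it is new.
        Returns True if the neighbor list changed."""
        if reported_cell_id != self.cell_id and reported_cell_id not in self.neighbors:
            self.neighbors.add(reported_cell_id)
            return True
        return False

# A new site "eNB-42" appears; a surrounding cell learns it automatically
# instead of waiting for a planner to update its neighbor list.
cell = Cell("eNB-7")
print(cell.on_measurement_report("eNB-42"))  # new neighbor learned
print(cell.on_measurement_report("eNB-42"))  # already known, no change
```

The hard parts that keep SON a research area are exactly what this sketch omits: deciding when to *remove* stale neighbors, resolving conflicting physical-cell-identity assignments, and coordinating these updates across vendors.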
2G and 3G networks were built with the philosophy of "build it and they will come," and the good news is that consumers have adopted these technologies, although 3G saw nearly a four-year delay in adoption. With femtocells, operators now have a new tool in their kit as they consider deployment options: an "inside-out" rollout strategy for delivering LTE data rates in the indoor environment (e.g., home, café, airport), while leveraging their UMTS/HSPA networks for macro coverage.
With LTE femtocells, operators can deliver dedicated capacity to consumers in their homes and offices, and as LTE adoption increases over time there will be a tipping point at which the LTE macro network deployment business case becomes solid and should thus proceed. In other words, the inside-out rollout strategy greatly reduces Return on Investment (ROI) risk. Another aspect that needs to be considered is the bundling of femtocells with devices (e.g., providing a home base station with every new netbook or smartphone purchase). These devices generate huge amounts of data traffic, and this traffic can easily be offloaded from the macro-cellular network, significantly lowering the price/bit of delivery.
The “small or large cell” issue is also tied to spectrum, and for LTE, network operators can consider dedicated carriers for macrocells vs. femtocells. For example, the 2.6GHz band might not be suitable for macrocells, but it is very well suited for femtocells. Femtocells greatly increase frequency “re-use” in the indoor environment and are the cheapest capacity augmentation solution available today. The biggest game-changer will be enterprise femtocells, which will provide much-needed capacity in offices and malls. The other kind will be metro femtocells, deployed in problem areas where a new macro site would struggle to win approval from a zoning board.
Network operators face two challenges when it comes to LTE backhaul:
- Adding raw capacity (T1s, AAV, microwave, etc.)
- Realizing full mesh backhaul
Because wireless networks are spectrum-limited, the bottleneck so far has typically been the air interface. LTE will shift the capacity bottleneck from the air interface to the backhaul link between base stations and the core network. This challenge spans the board, from macrocells to femtocells. For macrocells, a large part of the current backhaul infrastructure consists of bundled T1/E1s, but these will not suffice for 100Mbps+ data rates. Similarly, for femtocells the current IP broadband infrastructure consists mainly of DSL and cable modem links, which are insufficient for carrying 100Mbps+ data rates.
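The T1/E1 shortfall is easy to quantify with back-of-envelope arithmetic (the 100Mbps figure is the indicative per-site demand mentioned above, not a measured value):

```python
import math

T1_MBPS = 1.544        # North American T1 line rate
E1_MBPS = 2.048        # European E1 line rate
LTE_SITE_MBPS = 100.0  # indicative LTE cell-site backhaul demand

# How many leased lines would have to be bundled per site?
t1_needed = math.ceil(LTE_SITE_MBPS / T1_MBPS)
e1_needed = math.ceil(LTE_SITE_MBPS / E1_MBPS)

print(f"~{t1_needed} bundled T1s or ~{e1_needed} E1s per LTE site")
```

Bundling on the order of 65 T1s per site is economically absurd, which is why fiber, Ethernet and high-capacity microwave dominate the LTE backhaul conversation.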
LTE’s flat network architecture eliminates the Radio Network Controller / Base Station Controller (RNC/BSC)-like aggregation node in the Radio Access Network (RAN), which is both good and bad. While this simplifies the network architecture and reduces latency (on paper, anyway), in the real world it creates a completely new challenge: how to realize a full mesh backhaul between eNodeBs. Remember, in LTE networks the eNodeB needs to communicate with peer eNodeBs in order to support handoff functions – both control plane and data plane.
Operators will need to make significant investments in their backhaul infrastructure as they deploy LTE. Eventually, in order to comprehensively address the backhaul challenge, operators will use a full range of technologies including microwave, fiber/Ethernet, CATV Ethernet and potentially WiMAX. Whether operators deploy their own backhaul solutions or continue to lease backhaul for LTE remains to be seen. It will also be interesting to see what role pure backhaul leasing providers will play in LTE networks.
Last but not least, with the advent of LTE comes a slew of exciting new devices: tablets, smart phones, intelligent hotspots, etc. LTE smart phones are limited in number for now, but data cards, dongles and netbooks have already been launched on Verizon and MetroPCS. Most observers anticipate that data cards and dongles will be available first, followed by smart phones, but then again we might have another big surprise from Apple!
LTE devices need to support Multiple Input Multiple Output (MIMO) in order to deliver high data rates – but this directly increases a device’s complexity. One of the interesting things to watch will be whether the initial devices will support only 2×2 MIMO or whether some vendors will launch devices supporting 4×4 MIMO from the start. This choice is directly related to battery life, and while data cards and dongles might get enough juice from their hosting laptops and netbooks, smart phone designers have critical design challenges in front of them in terms of balancing battery life with MIMO support.
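The bandwidth and MIMO tradeoff above can be sketched with a rough scaling rule. The ~3.75 bps/Hz/layer figure is an illustrative assumption, chosen only so the numbers line up with commonly quoted LTE peak rates; it is not a 3GPP-specified constant:

```python
def peak_rate_mbps(bandwidth_mhz, layers, bps_per_hz_per_layer=3.75):
    """Rough LTE peak downlink rate: bandwidth x spatial layers x
    an assumed per-layer spectral efficiency (ideal conditions)."""
    return bandwidth_mhz * layers * bps_per_hz_per_layer

print(peak_rate_mbps(20, 2))  # 2x2 MIMO in 20MHz: on the order of 150 Mbps
print(peak_rate_mbps(20, 4))  # 4x4 MIMO doubles the layers, and the rate
```

The takeaway matches the battery-life dilemma: doubling the spatial layers roughly doubles the peak rate, but every extra receive chain is extra RF hardware and extra power draw in the handset.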
Yet another key issue with devices and terminals is the frequency bands that they will support – a factor which is directly related to spectrum harmonization and has significant impact on device complexity and costs.
Lastly, how much legacy support will be embedded in these devices? It is clear that LTE networks will not have nationwide coverage for a long while; to support full mobility, will these devices support both 2G and 3G fallback mechanisms? I have a feeling that most new devices will also support HSPA+, though it remains to be seen whether all types of HSPA devices will be supported.
These are the opinions of Harish Vadada. Please leave me feedback, thanks!