Elastic Telecom networks: a path towards cellular network virtualization

Over the last few months we saw two big announcements of telecom consolidation in the US from the 3rd and 4th largest carriers. Is this a surprise? By no means – I have been busy lately studying demand and supply curves for my microeconomics class (part of my MBA program) and voilà – this all makes perfect sense! It is all down to economies of scale – the larger the firm, the better it is able to supply the service needed at a better rate (meaning fewer dollars!). The one graphic that always hurts carrier budgets is the ARPU distribution and its long-tail economics.

So what will happen over the next few years is a shift for MNOs from selling voice and data to becoming service enablers. This requires several market plays, which in turn require two main ingredients – investment dollars and partnerships with niche market players – and this will allow MNOs to move into the VAS space.

Several years ago Microsoft faced similar competition from the likes of Google, which challenged the whole notion of selling licenses for an OS. Google brought in the freemium play – give Android away for free and sell the apps! A similar shift is happening today in the wireless ecosystem – old models need to be thrown out and value creation needs to shape the networks!

Operators need to take these steps to survive or risk being bought out –

Who is the competitor? The primary competitors for telecom service providers used to be other service providers. Over the last few years, however, players have been migrating and surfing across segments – from Android and Apple to PayPal, from P&G to AT&T, from Facebook to Time Warner, from Google to Best Buy, every company wants to capture mindshare and a piece of the consumer’s pocketbook. The fine line between partners and competitors can be obliterated in a quarter. One product launch or one acquisition can change the game in an instant.

Deploy an end-to-end framework starting with the mobile packet core & IMS: Innovation is not just for the edge network or the devices; rather, the entire mobile network – from RNC, SGSN, GGSN and billing systems to the devices – should be available as a platform for innovation. Obviously, one can’t just open up all the critical pieces of information at once; it should be done in a methodical and thoughtful manner. In order to provide a good experience for the customer and a robust API framework for developers, the various network components need to work in sync to help understand user behavior at a granular level and turn observations into insights that can be exposed to developers, who can leverage that input to build new experiences.

Charge for OTT services: The consequences of not playing an active role in OTT services can be severely detrimental to operator profitability. Given the significant pressure on the margins of the voice, access, and messaging businesses, operators have to find new sources of sustainable revenue in the next 5 years or else accept margin declines of 30-50%. Netflix, Pandora and several other OTT products have made enormous inroads in revenue generation while the MNOs have passively watched their ARPU slip! Toll-free apps from AT&T are one step toward correcting that. A detailed explanation of the concept has been provided here in my blogpost.

Exploit the long tail: Voice, access, and messaging will continue to be the three dominant revenue-generating applications for mobile operators for some time. However, the next bucket of revenue isn’t in any one or two applications but rather in the long tail, which can include hundreds of applications. While individually they might not generate a significant amount of revenue, collectively they can rival the top three in generating billions of dollars in the coming years. By focusing on vertical areas such as health, retail, education and energy, and horizontal areas such as security, cloud computing, payments and others, mobile operators can create a sustainable revenue source for the future.

Foster a developer ecosystem and provide tools: Mobile operator VAS revenues are tightly linked to the developer ecosystem they are able to foster. Just as Android and iOS have huge developer followings that are helping them dominate the mobile OS platform landscape, mobile operators who open up APIs for network, billing, profile, authorization, location, performance, security, quality of service, etc. will build a robust ecosystem that churns out new services and applications that drive enormous value to the end customer. Directly or indirectly, this will add to the bottom line, and the operator will be seen as a service innovator in the marketplace. Application developers primarily focus on how their application or service works; they rarely spend time examining how their API requests will impact the network. Operators need to set up testing labs and provide simulation tools to developers so that they are more informed about the traffic and signaling load generated by their applications and better prepared to address data consumption issues. Here are some examples of the developer outreach that has happened over the years – AT&T, Verizon, Sprint, T-Mobile & Clearwire.
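To make the signaling-load point concrete, here is a back-of-envelope sketch (in Python, with entirely illustrative numbers) of how an app's keepalive interval drives the number of signaling transactions a device generates per day – the kind of figure an operator simulation tool could surface to developers:

```python
# Back-of-envelope estimate of the signaling load an app's keepalive
# behavior generates; all figures are illustrative assumptions, not
# measurements from any real network.

def daily_signaling_transactions(keepalive_interval_s: float,
                                 transactions_per_wakeup: int,
                                 active_hours: float = 24.0) -> int:
    """Signaling transactions per device per day caused by periodic keepalives."""
    wakeups = (active_hours * 3600) / keepalive_interval_s
    return int(wakeups * transactions_per_wakeup)

# A chatty app waking the radio every 30 s, each wakeup costing ~10
# signaling messages (paging, RRC setup/teardown), vs. the same app
# batching its traffic into one wakeup every 10 minutes.
chatty = daily_signaling_transactions(30, 10)    # 28800 transactions/day
batched = daily_signaling_transactions(600, 10)  # 1440 transactions/day
print(chatty, batched, chatty / batched)         # a 20x reduction
```

The point of such a tool is not precision but making developers see that the signaling cost of an app can dwarf its data cost.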

Create value rather than nickel-and-diming for bytes: MNOs must manage their data margins, but the pricing of data shouldn’t focus exclusively on the amount of data transmitted. A byte carried during a financial transaction or for a medical application is far more valuable than a byte transferred for social networking updates. A health care application that provides peace of mind to family members might not send gigabytes of data, but consumers care more about reliability and immediacy than about app tonnage. Remote monitoring apps can lower overall costs for healthcare and insurance providers, and those providers will measure the cost of the app not by the amount of data transmitted but by the value it provides.

How can the MNOs play this space and get ahead of the game with the changes coming over the next few years? Below is a very good video describing the change model for the next few years.



OpenFlow, SDN: Impact on Telecom networks

I heard of OpenFlow three years ago while working for a WiMAX operator, but it never piqued my interest enough to read up on it until Nicira was acquired by VMware this year. That is when it occurred to me that this corner of the network industry is gearing up for serious change – a scenario that will impact the mobile ecosystem even at the edge. In contrast to the LAN, the WAN is staid. In the early 2000s, IT organizations began to move away from Frame Relay and ATM and adopt MPLS WAN services. Up until now, however, the conventional wisdom in the IT industry has been that there is no fundamentally new technology in development that will replace MPLS. A key consequence of that assumption is that, on a going-forward basis, IT organizations will have to build their WANs using a combination of MPLS and the Internet.

Software defined networking (SDN) has the potential to change the conventional wisdom about WANs. SDN isn’t a single technology but a way of building networks. Like many commercial successes in IT and software, it started as academic research at Stanford, growing out of work in Nick McKeown’s group. OpenFlow, on which much SDN work is based, is a programmatic interface for controlling network switches, routers, Wi-Fi access points, cellular base stations and WDM/TDM equipment. While there is no uniform agreement within the industry on the definition of SDN, the general consensus is that it involves separating the control and forwarding planes of networking devices, centralizing the control planes, and providing programmatic interfaces into those centralized control planes.
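The control/forwarding split can be illustrated with a toy sketch. The class below is not the real OpenFlow protocol – the field names and action strings are simplified assumptions – but it captures the core match-action idea: a centralized controller installs prioritized rules, the data plane does pure lookups, and a table miss is punted back to the controller:

```python
# Toy illustration of an OpenFlow-style match-action flow table.
# Field names ("dst_ip") and actions ("output:port2") are simplified
# assumptions, not the actual OpenFlow wire format.

class FlowTable:
    def __init__(self):
        self.rules = []  # list of (priority, match_dict, action)

    def install(self, priority, match, action):
        """Called by the centralized controller to program the data plane."""
        self.rules.append((priority, match, action))
        self.rules.sort(key=lambda r: -r[0])  # highest priority first

    def forward(self, packet):
        """Pure data-plane lookup: first (highest-priority) matching rule wins."""
        for _, match, action in self.rules:
            if all(packet.get(k) == v for k, v in match.items()):
                return action
        return "send_to_controller"  # table miss escalates to the control plane

switch = FlowTable()
switch.install(10, {"dst_ip": "10.0.0.2"}, "output:port2")
switch.install(1, {}, "drop")  # low-priority catch-all

print(switch.forward({"dst_ip": "10.0.0.2"}))  # output:port2
print(switch.forward({"dst_ip": "10.0.0.9"}))  # drop
```

The essential property is that all forwarding policy lives in rules pushed down from one logically centralized place, rather than in per-device configuration.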

While most of the interest in SDN has been focused on the data center LAN, earlier this year Google discussed how it has used SDN in the WAN to carry traffic between its data centers. According to Google, when the project began there weren’t any network devices that could meet its requirements, so the company built its own. It also built its own traffic engineering (TE) service. The TE service collects real-time utilization metrics and topology data from the underlying network, as well as bandwidth demand from applications and services. It uses this data to compute the best path for traffic flows and then programs those paths into its switches.
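As a rough illustration of the idea (not Google's actual system), a TE service can be thought of as a shortest-path computation over link costs derived from real-time utilization; the topology, utilization figures and cost function below are invented for the example:

```python
# Hedged sketch of a traffic-engineering path computation: Dijkstra
# over costs derived from link utilization, so congested links become
# expensive and traffic is steered around them. All data is made up.
import heapq

def best_path(links, src, dst):
    """links: list of (node_a, node_b, utilization in [0,1)). Returns node list."""
    graph = {}
    for a, b, util in links:
        cost = 1.0 / (1.0 - util)  # a nearly full link costs a lot
        graph.setdefault(a, []).append((b, cost))
        graph.setdefault(b, []).append((a, cost))
    dist, heap = {src: 0.0}, [(0.0, src, [src])]
    while heap:
        d, node, path = heapq.heappop(heap)
        if node == dst:
            return path
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, c in graph.get(node, []):
            nd = d + c
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt, path + [nxt]))
    return None

# The direct DC1-DC3 link is 90% utilized; the lightly loaded two-hop
# path through DC2 wins.
links = [("DC1", "DC3", 0.9), ("DC1", "DC2", 0.2), ("DC2", "DC3", 0.2)]
print(best_path(links, "DC1", "DC3"))  # ['DC1', 'DC2', 'DC3']
```

A real TE system would also weigh application bandwidth demands and priorities, but the feedback loop – measure, compute, program switches – is the same shape.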

 

opennetsummit.org


Categories: IPV6, Mobile Convergence

Understanding the Signaling Tsunami

The recent selection of Tekelec as the Diameter signaling supplier for T-Mobile's LTE deployment has once again brought forth the discussion of control-plane congestion and operator readiness to address it. I have been working in this area for a long time and have seen the impact first hand in several customer outages. 3GPP and the infrastructure vendors have come a long way since the first smartphones changed the trend of user behavior and network congestion. It is not just an improvement in the ‘plumbing’ of smart pipes but the overall end-to-end change that has helped mitigate many problems, from both a signaling and a performance perspective. QoE (quality of experience) plays a big role in network planning and management today, as users now demand more from a handset/tablet than ever before. Signaling-plane control gives the operator better leverage and control over the various network elements that handle customer management and the quality of service that goes with it.

 

Understanding user behavior with Smartphones

Users consume far more content than they produce; this, together with the popularity of multimedia services and the limited processing power of Smartphones, results in far greater traffic flowing downstream into Smartphones than flowing upstream from Smartphones into the cloud. Naturally, communication channels are typically asymmetric, reserving greater bandwidth for the downlink than the uplink. In spite of the asymmetric bandwidth allocation, mobile networks are seeing significant traffic pressure on the downlink due to the sheer number of applications using the networks and the multimedia-heavy nature of the traffic many of those applications generate.

In addition, a number of symmetric applications such as file sharing or Peer-to-Peer (P2P), mobile Voice over IP (VoIP) and gaming contribute to traffic on the downlink. However, P2P traffic has been on the decline over the last several years as first audio content and then video content became readily available through legitimate storefronts. In addition, relative to other traffic types, Mobile VoIP and Mobile Gaming are low bit rate services, which do not impact the overall traffic as much in spite of a very large number of concurrent sessions in use.

As Smartphones become more capable, services such as video conferencing, User Generated Content (UGC) uploads, P2P applications, video surveillance and augmented reality are gaining popularity. Further, as voice services transition to VoIP in LTE networks, the increased traffic from voice will also contribute to the uplink traffic mix. The aggregate of these services is making network capacity requirements more symmetric and putting significant pressure on uplink capacity. Foreseeing this trend, HSPA and LTE have increased the relative throughput of the uplink vs. the downlink compared to earlier technologies, and LTE also allows for more flexible spectrum allocation to account for evolving network traffic usage patterns.

Nevertheless, until LTE is widely deployed, accelerated growth in uplink traffic will require immediate and unique solutions on deployed HSPA and HSPA+ networks. Machine-to-Machine (M2M) devices are expected to grow into a significant share of the devices on the network. These M2M applications have diverse network requirements, varying from heavy signaling with low throughput in the case of geo-tracking, to low signaling with high downlink throughput in the case of Near-Video On Demand (VOD), to low signaling with high uplink throughput in the case of webcams. Further, M2M applications may also be widely distributed in large numbers due to the low-cost, low-maintenance nature of these devices, resulting in rapid growth in M2M traffic. In addition, the wide distribution of these devices will require remote manageability using OTA software updates, further adding to the traffic demands on the network.


Mobile Apps and User Experience

I have been a smartphone user since the iPhone 2G was launched, though I had to jailbreak it to use it on the T-Mobile network. My user experience improved from the day Android devices landed on T-Mobile, and being part of the team that launched them helped me bridge the gap. But how has the user experience for mobile apps improved after all these years of smartphone adoption? We have grown smarter in how we use our phones, networks and software in terms of design, implementation and optimization.

 

UI/UX Design: Application interface and interaction is something users, including yours truly, struggle with: the position and location of keys on the smartphone screen, difficulty resizing or scrolling webpages, disagreements with built-in dictionary suggestions, and inefficient manual input (e.g., the “fat finger” problem). Many users prefer interacting with the web-based interface of a particular application (e.g., Facebook) rather than with its widget.

Each user interaction with an app should reflect the story of the brand and should increase recognition, loyalty and satisfaction. Identifying which elements contribute most to the brand’s identity is essential – examples are features, visuals, wording, fonts and animations. Our design teams work on many different products across different product teams, which could easily lead to several design and implementation variations of similar UI elements. Defining the core building blocks encourages reuse, discourages reinvention and therefore optimizes the design and implementation of a set of components.

Categories: Mobile Web

Beyond Phones: 5 things that will boost wireless adoption

Today’s wireless networks have become a complex mix of different flavors of services. Big macro networks can no longer be the sole provider of services; they will instead become a conduit for heterogeneous networks and a ‘backhaul’ for many services in the years ahead. With limited spectrum, the future looks bleak for wireless networks unless a smart strategy emerges to inter-operate wireless across various protocols and software. HetNets are evolving as we speak, and implementation is a complex mix of various 3GPP and IEEE networks. At least within 3GPP we now have backward compatibility and convergence on LTE, though the bands of operation are fragmented and cumbersome for one phone to work worldwide.

M2M (Machine-to-Machine)

M2M is probably the first application of wireless beyond providing voice or data access to a live user. Wireless monitoring of fleets and SCADA devices on 2G networks is how it all started. Now everything from a Coke vending machine to a buoy at sea is reporting back stats. There have been a number of working definitions of M2M; some have included OnStar and e-readers such as the Kindle as examples of machines. However, we can take the CTIA definition of M2M as “applications or mobile units that use wireless networks to communicate with other machines. These applications may include telemetry and telematic devices, remote monitoring systems (e.g. transportation, etc.) and other devices that provide status reports to businesses’ centers (e.g. operations, traffic management, data management, etc.).” The M2M market doesn’t need high data rates so much as good coverage and long service life, including battery life. Moreover, M2M is growing more slowly than smartphones, so the overall fraction of traffic attributable to M2M has actually declined over the past several years and is forecast to keep declining through 2016 and beyond.

A growing number of businesses across many sectors are investigating M2M applications to transform the way they do business. These applications have broad potential; for example, they can be used for video surveillance and home security, automated meter reading, remote equipment monitoring, fleet management and public safety. As a result, the worldwide M2M cellular market is expected to reach $2.14 billion by 2017 (source: IDG). Given their diversity, M2M applications require a wide range of products, connectivity and support.

Wireless Health Monitoring


BYOD, Toll-free Data – Operator Strategies for ARPU

BYOD – Bring Your Own Device – has become the new nightmare for CIOs and security folks in the world of IT. I remember a ticket assigned to a security analyst while I was working with a mobile operator that had a BYOD policy; he left me a cryptic message to meet him with my laptop. It turned out that the Outlook and Windows accounts were in conflict, flooding the account servers with authentication requests. Can this happen over the air for mobile operators?


Mobile Network Operators – retool, rethink & reinvent

Some time back I had an opportunity to speak with a technology pioneer who helped introduce the best multi-media device – the iPhone – on AT&T. We went into the technical details of the experiences, the paradigm shift that never happened and the “data tsunami” that is unfolding as we speak. I have been blogging about this very data explosion for a long time now. I have been a traffic planner for the last 5-6 years of my career as a telecom engineer. I have seen the evolution of wireless networks from voice-centric GSM to data-centric LTE, a shift in the thought processes of the big-iron telco companies that have shaped the way we communicate and interact with the world. MNOs (Mobile Network Operators) are in the cross hairs of technology evolution; data pipes are filling up faster than they can build. I monitor capacity on the radio access side for an operator on a day-to-day basis, and take my word for it – we ding your data experience at the cost of voice. I can say the same happens for many operators – it is what it is. Some operators offered unlimited data and then pulled the wool over your eyes by ‘throttling’ users. Why does this happen? What prevents MNOs from offering data at the promised speeds? The problems are umpteen, and taking the bull by the horns is hard. Will 4G and LTE solve the problem? Initially they will, but as soon as more devices are offered, capacity will again be the crunch point, and techniques to optimize and improvise will be needed.


Categories: LTE, Mobile Convergence

Optimizing Mobile Web

Not all apps on our smartphones are created equal, just as not all network technologies work the same. Some perform great and some not so much, but do we know how each one differs from the others? The other day I was discussing mobile applications with my wife, who is a mobile developer, when I realized that we telecom network architects pay very little attention to the details of mobile application design and performance. It is not always about coverage and getting the best signal – indoors or outdoors. I was always aware of the implications of a badly designed mobile application going rogue, as we saw a few years ago with the Android launch, or the recent case of NTT DOCOMO asking Google for help. The rules for development have changed significantly and they are here to stay, as Apple moves into mobile publication and Amazon democratizes publishing. I am all for democratization of the mobile web – it is all about you, the user. The fundamentals of mobile web design are a great UI, good performance and a good experience.

Network Optimization

Networks have been built on the premise of “build it and they will come.” Those were the years before Apple and Google made their foray into wireless. A recent report says there will be more smartphones than humans by the end of this year! What will that mean from a network perspective? It definitely bodes well for operators who are in it to make money, but will it kill their networks with congestion? The iPhone has definitely not helped the MNOs in monetization terms.

Categories: Mobile Web

Extending Coverage: Distributed Antenna Systems

With Super Bowl XLVI around the corner, as a wireless engineer I wait with bated breath to see how the network performs. How many calls did we drop? What were congestion and customer satisfaction like? How did we fare against other operators? These questions and more are always on my mind, as I have worked across different networks in the US. I still remember the days when COWs (cells on wheels) were the only option, but DAS (Distributed Antenna Systems) have come to the rescue. Businesses today face growing demand to provide a wide variety of wireless technologies in indoor spaces, with coverage in stadiums and casinos leading the need. Wireless cellular customers depend on the mobility of their devices wherever they go. DAS deployments need a breadth of design requirements that allows them to carry a wide range of technologies, and to do it well. Both venue owners and wireless carriers wish to provide their customers and occupants with a satisfying wireless user experience.

In addition, regulatory movement dictates that public-safety communication service is a gating item to building occupancy covering the indoor area, including both public and back-of-house areas. The DAS can provide a broadcast mechanism for reaching public-safety personnel throughout a building, and can be done with an economy of scale when combined with commercial cellular services. Public safety can be broadcast across a range of frequencies that the DAS may provide. Building owners often demand Wi-Fi service as well, from small offices to large venues such as airports and convention centers.

 

Source: DAS Forum

DAS design should consider the perspective of integrating multiple wireless services that meet performance standards demanded by the wireless provider and building owner. The design must also consider the economies of scale of sharing DAS resources among carriers. DAS vendors have advanced their equipment to meet a wider range of frequency bands and higher power outputs to address these high standards. Key performance indicators for DAS design include frequency bands, protocols, sectorization, coverage and interference, among other parameters.

DAS Sharing Mechanisms
Sharing DAS resources among multiple wireless service providers is an important facet of designing and operating a neutral-host DAS. Sharing can occur on various levels, from completely independent use of separate DAS equipment for each MNO to fully integrated usage of the same equipment. Depending on the carrier, the preference can be for either dedicated or shared equipment. A DAS can be designed to accommodate either architecture or a combination of both. For the shared neutral-host architecture, the challenge is designing it to meet the quality-of-service and growth needs of each participant while sharing resources. If a first MNO commissions its services on a DAS, the continued level of performance must be addressed when an anticipated second or third MNO is added.

Categories: Broadband

Wireless Churn, Metrics and Big Data

Churn has a simple definition for a wireless operator – it is the number of net deactivations (i.e. gross adds minus net adds) divided by the average number of subscriptions during the year. The mobile telecommunication market has changed from a rapidly growing market into a state of saturation and fierce competition. The focus of telecommunication companies has therefore shifted from building a large customer base to keeping customers ‘in house’. Customers who switch to a competitor are so-called churned customers. Churn prevention, through churn prediction, is one way to keep customers ‘in house’. In contrast to post-paid customers, prepaid customers are not bound by a contract. The central problem with prepaid customers is that the actual churn date is in most cases difficult to assess. This is a direct consequence of the difficulty of providing an unequivocal definition of churning and a lack of understanding of churn behavior. In the telecom service industry, churn can be of several types:

  • Involuntary churn: This occurs when subscribers fail to pay for service and as a result the provider terminates service. Termination of service due to theft or fraudulent usage is also classified as involuntary churn.
  • Unavoidable churn: This occurs when customers die, move or are otherwise permanently removed from the market place.
  • Voluntary churn: Service termination on the part of the customer, leaving one operator and possibly joining another for better value.

In reality, it is very unlikely that MNOs could differentiate unavoidable and voluntary churn and predict them separately.

Nokia Siemens Customer Acquisition & Churn Study

 Churn can be shown as follows: 

Monthly Churn = (C0 + A1 – C1) / C0

Where:

C0 = Number of customers at the start of the month

C1 = Number of customers at the end of the month

A1 = Gross new customers during the month

 As an example, suppose a carrier has 100 customers at the start of the month, acquires 20 new customers during the month, and has 110 customers at the end of the month. It must have lost 10 customers during the month, 10 percent of the customers it had at the start of the month.

According to the formula:

Monthly Churn = (100 + 20 – 110) / 100 = 10%
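The same arithmetic as a tiny helper function, for readers who want to plug in their own numbers:

```python
# The monthly-churn formula from above: churned customers as a
# fraction of the starting customer base.

def monthly_churn(start_customers: int, gross_adds: int, end_customers: int) -> float:
    """Monthly churn = (C0 + A1 - C1) / C0."""
    churned = start_customers + gross_adds - end_customers
    return churned / start_customers

# The worked example: 100 at month start, 20 gross adds, 110 at month end.
print(monthly_churn(100, 20, 110))  # 0.1, i.e. 10% monthly churn
```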


Evolving Trends in Wireless Network Optimization

The long-standing objective of any wireless network operator has always been optimization of network performance, maximizing efficiency and thereby providing the highest customer satisfaction.

Today’s wireless networks have come a long way from providing just voice services to delivering high-speed, content-rich data applications to wireless handsets in addition to traditional voice. With these changes has come a wide variety of tools and applications that help network operators achieve their objectives.

Drive testing has always been an integral part of Wireless Network optimization and it continues to be so even after the emergence of newer and faster Voice and Data Technologies.

But is Drive Testing absolutely necessary or can it be replaced completely with the tools that today’s operators have at their disposal?

Let’s look at some key areas why we would need Drive Testing today:

  • Green Field Operation/Benchmarking: For any operator deploying a brand-new wireless network, there is currently no other way than to deploy the network and drive test it to gauge performance. Also, drive testing is critical for benchmark performance analysis of multiple wireless network providers.
  • Some RF Engineers think that the key to network optimization is to actually understand the subscriber’s experience and consider drive testing to be the only way to simulate actual subscriber experience.

While both of the above points ring true, there are some highly developed tools available today that can provide data which significantly reduces drive testing and cuts costs for network operators.


LTE Connect Cars – the new social medium


Like most guys I love my car (after my gadgets and my Triumph), and like most of you I drool at the prospect of getting a car that will talk to me and connect all the pieces together. As a child, the first car that caught my imagination was Herbie (the wonderful little Bug) and its adventures. I make a point of taking my son to local car shows every year so that when he grows up he learns to appreciate that fast cars are meant to be revered! But the connected car is something very special – it connects what I do as a wireless engineer to what I love as a driver! So what is a connected car? Will it talk to you, listen to your commands and maybe even drive for you? If you wait, say, 10 years, Google will make them – and make them cheap enough for you to actually get one. My grandchildren will ask me how we ever lived without self-driving cars.

Categories: LTE

Connected world: a changing wireless paradigm

We humans inherently want to stay connected with the rest of the world – by phone, email, social media, television and radio. Things are changing fast: now it’s about connected devices, appliances, automobiles, transport systems and even plants. Anything that can be connected will be connected. Anything that can have a chipset will have a chipset.

When we talk about connectivity, wireless comes to mind.

In the wireless landscape there are several technologies, each with its own set of advantages and disadvantages. Broadly, wireless technologies divide into WAN, MAN, LAN and PAN. A distance-vs-throughput comparison chart, with applications, is located below.

 

Wireless Standards Primer

Traditionally, 3GPP standards-based technologies dominate the WAN and MAN landscape. In the MAN segment, WiMAX – an IEEE standards-based technology (IEEE 802.16*) – is used to some extent, but it did not get traction worldwide. In the wireless LAN and PAN segments, IEEE standards-based technologies are very common. Bluetooth, which started at Ericsson, was later taken up by the IEEE. Zigbee, which is built on IEEE 802.15.4, is very popular in sensor networks, connected homes and the smart-grid home-area-network segment.


Categories: Broadband, LTE

Defining Mobile User Experience – QOE & QOS

I am not an early adopter of technology, though I am one of the gears that drive the engine of change for wireless technology. But I read most of my news, books, email and YouTube – 80% of the time these days – on my cellphone or iPad (love Flipboard & Zite!), which runs on my wireless carrier. Can I cut the cables and just go wireless – are we there yet? My answer is not yet, but maybe three years from now the momentum from LTE and the device ecosystem will make telecommunications capable of wireless speeds of 1 Gbps and more.

The wireless industry measures churn, an important KPI (key performance indicator) that reflects customer satisfaction and operator performance. One of the biggest causes of churn is poor user experience; quality of experience (QoE) is used to describe end-users' perception of how usable the services are. QoS (Quality of Service), on the other hand, describes the ability of the network to provide a service at an assured service level. In order to provide the best QoE to users in a cost-effective, competitive and efficient manner, network and service providers must manage QoS and services in proper and appropriate ways.

QoS and QoE are so interdependent that they have to be studied and managed with a common understanding, from planning through implementation and engineering (optimization). In short, the aim of the network and its services should be to achieve the maximum user rating (QoE), while network quality (QoS) is the main building block for reaching that goal effectively. QoE, however, is not limited to the technical performance of the network; there are also non-technical aspects which influence the overall user perception to a great degree.

QoS is defined as the ability of the network to provide a service at an assured service level. QoS encompasses all functions, mechanisms and procedures in the cellular network and terminal that ensure the provision of the negotiated service quality between the user equipment (UE) and the core network (CN).


Unlicensed Spectrum and Wireless Networks

November 17th, 2011

A recent media article stating that Steve Jobs wanted to build an unlicensed network for the iPhone piqued my interest in writing about unlicensed spectrum and the way it has been carved out by the FCC. I have long felt that licensed spectrum makes wireless expensive. The recent AWS and 700 MHz auctions have shown that it is all but a numbers game: the deeper the pockets of the operator, the more spectrum it is able to garner. Spectrum has been called the oxygen for wireless operators, and in many ways it is, for all commercial operators. Recognizing this, the Obama administration and the FCC have made plans to make available 300 MHz of new spectrum over 5 years and 500 MHz over the next 10 years, which would almost double the 547 MHz of spectrum that is licensed out today.

As consumers race to embrace all that wireless broadband connectivity has to offer and U.S. mobile innovation continues to advance at an astounding pace, there is a clear and compelling national interest in ensuring adequate spectrum is available to continue this progress. Unfortunately, we cannot simply flip a switch and make more broadband spectrum available. It typically takes several years for spectrum to be repurposed and released into the marketplace. And the clock is ticking with rising demand rapidly closing the gap with existing supply. The consequences of inaction are severe, widespread and wholly negative for consumers and the U.S. economy.

European countries, which had been leading the world in mobile communications, embraced the auction to promote competition and regional integration through the entrance of international operators into many countries. When 3G auctions were held in 2000, at the peak of the “wireless bubble”, license fees skyrocketed far above their value; the fees amounted to more than 100 billion euros for all of Europe. After the bubble collapsed, however, the expected market for “mobile multimedia” proved almost nonexistent. Mobile operators in Europe fell into a business crisis due to huge liabilities. Deployment of 3G services was delayed – some were even aborted – because of technical problems and financial difficulties.

Economists offer the excuse that it was not the auction but the operators’ extremely speculative behavior that was to blame. Through auctions, at least theoretically, spectrum can be allocated efficiently if operators behave rationally. This would be better than traditional licensing by paper examinations, known as “beauty contests”, in promoting competition and in realizing the full value of spectrum. Yet it is undeniable that auctions induced the “winner’s curse”, which is not rational but is regular behavior in financial markets. A more important problem is that spectrum auctions are tied to the legacy systems of telephone switching, which are inefficient and expensive in the Internet age, as the tragedy of 3G evidenced.

Another problem is that very little spectrum is available for auctions. Relocation of spectrum is conducted by governments after the removal of incumbent operators by negotiation, which takes a long time. Because spectrum is allotted by licenses for specific uses, even if a band is idle, nobody is allowed to use it and incumbents cannot convert it to a different use. As a result, it is estimated that, integrating space and time, more than 90 percent of the spectrum below 6 GHz in the metropolitan area of Tokyo is not used. Rural areas must be even less efficiently used.

UNII bands

The Unlicensed National Information Infrastructure (U-NII) radio band is part of the radio frequency spectrum used by IEEE 802.11a devices and by many wireless ISPs.

It operates over three ranges:
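As a sketch of the original FCC carve-out (the commission later extended the 5 GHz rules with additional sub-bands, and the exact power and DFS requirements are simplified here), the three classic U-NII ranges can be captured in a small lookup:

```python
# The three original U-NII ranges in GHz (power/DFS rules simplified).
UNII_BANDS = {
    "U-NII-1": (5.150, 5.250),  # low power, originally restricted to indoor use
    "U-NII-2": (5.250, 5.350),  # requires DFS/TPC to protect radar users
    "U-NII-3": (5.725, 5.825),  # higher power, popular with wireless ISPs
}

def classify(freq_ghz):
    """Return the U-NII band containing freq_ghz, or None if outside all three."""
    for band, (lo, hi) in UNII_BANDS.items():
        if lo <= freq_ghz < hi:
            return band
    return None

print(classify(5.18))  # -> U-NII-1
print(classify(5.80))  # -> U-NII-3
```

The gap between U-NII-2 and U-NII-3 (5.35–5.725 GHz) is occupied by radar and other incumbents, which is why unlicensed 5 GHz devices must skip over it.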


Capacity Crunch – stitching networks together!


Having seen the evolution of wireless from voice-centric to data-centric, I can truly say that the spotlight now lies on data. And who would have thought a few years ago that the tipping point would come from Apple? Apple products have done to the wireless ecosystem what the warm waters of the Gulf of Mexico do to hurricanes on the gulf coast. I am an engineer by profession, but all my theories of radio propagation, design principles and Erlang B calculations stop to bow before the devices that are unleashed today on our networks! Wireless has become a utility, like PG&E, and the pipes in the networks are clogging – they are filling up faster than capacity can be laid. Have we all become bandwidth hogs? What are we doing today that we never did in the past? One example from my own life: I update my Facebook status from my phone, tweet every now and then, and send MMS messages to my circle. It is ‘my’ profile that has changed; I used to maybe browse on my phone and send emails from a Blackberry once in a while. So has the world changed faster than I did? Have the networks changed? Very much so – we are now on 4G, and LTE/LTE-A is being launched. Once SMS was the cash cow for wireless companies; now it is data. Future networks will be networks of networks, consisting of multiple access technologies, multiple bands and widely varying coverage areas, all self-organized and self-optimized.
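The Erlang B principle mentioned above – the classic tool for sizing voice trunks – can be sketched with the standard recursion; the traffic figures below are hypothetical, and real dimensioning adds margins for busy-hour variation:

```python
# Standard Erlang B recursion: blocking probability for `traffic` Erlangs
# offered to a trunk group of `channels` circuits.
def erlang_b(traffic, channels):
    b = 1.0  # blocking with zero channels
    for n in range(1, channels + 1):
        b = traffic * b / (n + traffic * b)
    return b

# A cell sector offered 10 Erlangs of voice traffic: adding channels
# drives the blocking probability down sharply.
for c in (10, 15, 20):
    print(f"{c} channels: {erlang_b(10, c):.2%} blocking")
```

Data traffic breaks these assumptions – sessions are bursty and elastic rather than constant-rate calls – which is part of why the old voice-era dimensioning rules "bow before" today's devices.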

Capacity Crunch

MNOs have seen a multi-fold increase in data traffic. With more than 5.75 billion mobile devices in service across the globe—including 5.2 billion devices on the GSM-to-LTE (3GPP) family of technologies—mobile telephony is the most dominant form of communications on the planet. Mobile devices are stoking a dramatic and unprecedented transformation in personal communications and Internet access. And wireless technology is expanding the concept of mobility and connectivity beyond the traditional phone. The opportunity for operators and their vendor partners is not just in increasing voice and data subscribers, but also in connecting every facet of a person’s technology world. Someday, the industry will look back at terms such as voice and data ARPU as legacy analytics as it considers new research terms more relevant for the all-consuming, growing connected world.


Telecom Cloud Operators – Internet, Content and Cloud Evolution

Data Tsunami

Mobile networks of today will evolve to become the cloud providers of tomorrow. What does a cloud provider mean? It means that the networks of the future will become your only source for Internet, TV, cellular service – voice and data – home automation, car connectivity and so much more. Networks of today have an inherent advantage: they have the existing infrastructure – cell towers, backhaul and other services – in place to compete tomorrow. How many providers will we have? My guess is as good as yours. We will have a duopoly of Verizon and AT&T from the existing carriers; the smaller carriers will no longer matter, surviving as bottom-feeders on pre-paid plans, entry-level voice and data plans and rural service. Cable companies like Comcast, Roadrunner and Qwest will exist either in partnerships or will merge with the wireless giants.

That will depend on the evolution of cloud architectures and how users evolve from consumers of the Internet to content creators.


Cloud RAN, Radio-over-Fiber: Cloud paradigm for Wireless Networks

The distributed Node-B architecture called Cloud Radio Access Network (C-RAN) is the new paradigm in base station architecture. It aims to reduce the number of cell sites while increasing base station deployment density, bypassing some of the zoning and construction hurdles to bringing new sites on air. Metro cities like New York, Los Angeles and San Francisco already have a high density of cell towers. As LTE and more complex wireless technologies are being deployed, would it not make sense to re-use and harness the existing infrastructure?

The Cloud RAN concept comes with a new architecture that breaks down the base station into a Base Unit (BU) – a digital unit that implements the MAC, PHY and AAS (Antenna Array System) functionality – and the Remote Radio Head (RRH), which receives the digital (optical) signals, converts them to analog, amplifies the power, and handles the actual transmission. By making the RRH an active unit capable of converting between analog and digital, operators can place numerous BUs at a single geographical point while distributing the RRHs according to the RF plans. The RRH becomes an intelligent antenna array that not only transmits RF signals but also handles the conversion between the digital baseband and the analog RF domain. New RRHs can also support multiple cellular generations (2G, 3G and LTE), eliminating the need for multiple antennas.

The Cloud RAN lowers operating expenses and simplifies the deployment process. By centralizing all the active electronics of multiple cell sites at one location (the “base station server”), energy, real-estate and security costs are minimized. The RRH can be mounted outdoors or indoors – on poles, sides of buildings, or anywhere power and a broadband connection exist – making installation less costly and easier. The RRH is typically connected to the BU using fiber, creating a cloud-like radio access network topology. This topology saves costs both during installation and later during technology upgrades, for software as well as hardware, saving operators millions of dollars in CAPEX/OPEX.
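The fiber link between BU and RRH is not free, though: carrying digitized I/Q samples requires far more bandwidth than the user traffic itself. Here is a rough back-of-the-envelope estimate in the style of CPRI framing (illustrative only – real CPRI links use fixed line-rate options and the framing differs in detail):

```python
# Rough fronthaul bit-rate estimate for one RRH carrying digitized I/Q
# samples, CPRI-style (illustrative; real CPRI line rates are fixed options).
def fronthaul_mbps(sample_rate_msps, bits_per_sample, antennas):
    iq = sample_rate_msps * bits_per_sample * 2   # I and Q components per sample
    framed = iq * antennas * (16 / 15)            # 1 control word per 15 data words
    return framed * (10 / 8)                      # 8b/10b line coding overhead

# One 20 MHz LTE carrier: 30.72 Msps, 15-bit samples, 2x2 MIMO (2 antennas)
print(f"{fronthaul_mbps(30.72, 15, 2):.1f} Mbps")  # -> 2457.6 Mbps
```

So a single 20 MHz LTE carrier that delivers perhaps 100 Mbps of user data consumes roughly 2.5 Gbps of fronthaul – which is why C-RAN deployments presume abundant dark fiber between the antenna sites and the centralized BU pool.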

Enablers for Trending towards RAN Clouds


Telecom Clouds and Internet of Things

The growth of wireless telecom is no longer about customer acquisition – that is a dead paradigm. As CTIA aptly puts the numbers here, one operator can grow only by cannibalizing another’s subscribers. How can operators build a sustainable business model? They would need to differentiate by engaging with the customer more and more – be available to them whenever and wherever, with great portfolios of devices, services and wireless modes. We are doing many things today that seemed impossible just a few years ago.

 

 

How do we do that?

Several different ways – there are clear trends and paths that need to be explored and engaged by the MNOs. Are they doing it? Clearly they are, but is the industry looking to benefit from it? To a layman it is not obviously visible, but the industry has taken a winding road that has led to one of the most vibrant parts of the US economy. If we just look at the top 10 trends put out by CTIA below, it shows why we are where we are today.


Network as a Service (NaaS)

Selling a service instead of access will be the future for all telecommunication companies. The current trends all point in the direction of cannibalizing the ‘control’ that telecom operators have in today’s world. The data deluge has begun, as we have seen in the last few years, and it will only increase astronomically. MNOs, in order to retain control over their networks while preventing customer churn, diminished profitability and brand devaluation, must shift their role from traffic carrier to “application enabler.” With this approach, they can derive the greatest possible value from their network and its capabilities by developing and delivering first- and third-party applications.

Today’s mobile application explosion and evolution to 4G technology give service providers a valuable opportunity to transform their networks and services to deliver a truly next-generation Web 2.0 experience — profitably. Wireless providers have a number of key assets they can leverage to drive incremental revenues. When these assets are integrated within innovative applications, they strengthen the value of the application and the end-user experience.

 

Source: Alcatel-Lucent

Application enablement addresses the mobile web’s disruptive changes by helping MNOs partner with developers, media and content players in new ways, so they can open and enrich their networks and, ultimately, grow their brand. Application enablement contributes to market relevance by removing barriers and fostering new ecosystems. To enable applications, service providers require a high-leverage network — one that offers scalability, awareness and optimization, while being oriented toward service delivery, monetization, faster time-to-market and accelerated ROI. Operators can leverage these solutions to transport and deliver traffic more reliably, efficiently, flexibly and at the lowest cost. The challenge, in the current environment, is to grow beyond core telecom capabilities. Growth areas can include extended and enhanced “franchise” services, as well as the creation of new branded services that combine network enablers with application developers’ capabilities. Additional revenue streams can also be generated with new (potentially unbranded) services in non-traditional areas.

Understanding the Cloud Architecture


Recession Impact on Wireless Ecosystem

With the release of the FCC report last month on the outlook for wireless for the next year, the picture doesn’t seem as dismal as in so many other sectors like housing and banking. But the last three years have made an impact that is here to stay and has shaped the way people spend and MNOs budget. As an engineer who has worked at three levels – corporate, regional and market – I understand the decisions that are made solely on economic merits and that impact the network and the customers. Understanding the economics is harder for me as an engineer, because technological advantages, not short-term cash-flow savings, make sense in the longer run. But economics plays a bigger role than any engineering marvel, as most MNOs are for-profit organizations run solely on ARPU, churn and EBITDA (Earnings before Interest, Taxes, Depreciation and Amortization).

The current stagflation and recession have taken a toll on the mindset of customers and their spending habits, which is a strong indicator for the value proposition that MNOs bring along. In 2008, the FCC observed for the first time in 11 years that wireless voice usage per subscriber had declined. At the same time, use of text messaging and other wireless data services increased over the previous year. The decline in voice minutes of use, coupled with the increase in data use, suggests that although only about 40 percent of consumers currently use data services, these consumers may be substituting data services, such as text messaging, for traditional voice services. Customers changing their behavior and trimming their cellular spending by attempting to be smarter in multiple ways is shaping the way MNOs win or lose in the marketplace. The biggest impact for any MNO is net-adds, and the performance of an operator is judged by them!

Source: FCC

 


10 Cloud-Centric advances with LTE

With the advent of the iPhone and iPad, new classes of devices were born – ones that captivate the imagination and capabilities of the common man. The devices now are always on, power hungry and data hungry – a multimedia player and a computer married together. Hence NGMN (a consortium of operators) and 3GPP laid a foundation back in 2006 for LTE and LTE-Advanced standards development, which started to take off in a big way from mid-2010 onwards with the rollout of LTE (read: data-centric) networks. So what has changed from the UMTS world? How and why is it so different that operators will need to upgrade their networks, which is capital intensive? All this and more on how, inadvertently, we are moving to cloud-centric systems, sans the hoopla about iCloud, Amazon Web Services or Hadoop. Let us look at the problems LTE resolves in moving our applications to the cloud and beyond.

Flatter Architecture and Air Interface developments (read lower Latency)

Problem

Latency and round-trip times over GERAN and UTRAN were always a performance bottleneck.

Solution

A flatter architecture, with some of the functions of the RNC moving to the eNodeB and MME, along with a complete redesign of the air interface, with OFDMA replacing the W-CDMA air interface.

Air Interface


5 paradigm shifts with LTE!

LTE, or Long Term Evolution, has brought to the fore things that technologists and administrators (a.k.a. the FCC/3GPP) have been championing for the longest time – advantages understood after operating GSM/CDMA systems for years, but never going to be realized without investment from MNOs (mobile network operators). These, and the parallel development of exciting new phones and tablet devices, have changed the way we communicate today. The pull mechanisms of today have replaced the push mechanisms of yesterday. A few years back, if we had to share photos with our friends, we sent an email with links to our photos on Yahoo Photos, Picasa Web Albums or Flickr. Today we just post them on Facebook or tweet the link to our ‘followers’. The way we use the Internet has changed, and a recent study by Harvard Business Review has captured this paradigm shift.

Source: Harvard Business Review

With this shift in the way we interact with our fellow humans, the devices and means have also changed; gone are the phones with which we could make voice calls only. In came the smartphone that revolutionized the wireless world – the iPhone: the icon that signaled Telecom 2.0, the arrival of an era of data-hungry devices that crave more spectrum and bandwidth.

Here are five things that I feel LTE can bring along, as it has been the biggest technology refresh since 3G.


Plugging the Deadzones – Het Nets, Radio Clouds & Cooperative networks

The wireless telecommunication market is witnessing a shift in business models and market structure as a result of the deployment of new broadband access technologies, spectrum management techniques, policy-based network management, and the drive of new entrants to compete against the incumbents. Look at operators in the US today: Verizon has 2G, 3G and now LTE; AT&T is rapidly looking to deploy LTE, and so is Sprint with its technology refresh program. There are reasons why all the operators need to look inward as well, to plug their coverage holes. Deploying new technologies is very exciting, but the need to plug holes in the current technologies – to address traffic offload, inter-technology handovers, and IP as a layer that guarantees the QoS needed for continuity of a data or VoIP session – is often overlooked. Technologies like IMS have been around for quite some time, but operators have been reluctant to utilize their benefits due to the high initial investment needed. As the market matures and the need for always-on connectivity overtakes the cost, incubating technologies like IMS, software-defined radios and inter-radio-technology solutions will have to take hold, both on the access side and on the backhaul (transport) side. In the last few years we have seen an emerging trend in the use of microwave as a backhaul tool, mainly due to Clearwire and its innovative approach of using microwave rings to aggregate cell-site traffic.

Coverage Maps for top four Operators in the US 

Source: Deadcellzones.com

 

Addressing Dead zones

By far the biggest factor in wireless churn has been coverage holes – other than, of course, the iPhone! A bad customer experience is the biggest factor that influences churn. So how can an operator address this? What can be done to mitigate these dead zones? Building new cell sites in coverage-deficient areas is one option, but in cities like San Francisco, Los Angeles or New York zoning is difficult, and getting space, power and leases signed might take up to two years. There are other ways and solutions to get around these bottlenecks – strand-mounted femtocells, public Wi-Fi, and relay technologies with LTE and 4G.


Intelligent Femtocells, SON & Convergence

Femtocells – or small cells, or liquid radio, or one of the umpteen names this small base station goes by – have evolved organically in the last few years to make an appearance in the home. Is it finally ready to replace my Wi-Fi router at home? Maybe not, as it still remains a one-trick pony that guarantees indoor coverage for operators but offers no visible benefit for the consumer. So why is the femtocell important for a consumer? What does it offer that gives operators an incentive to subsidize it and give it away for free? How has the ecosystem changed over the years to say that now is the time for femtocells to become the differentiator between operators? Come next year, when 70% of US wireless customers will be on either AT&T or Verizon, how would somebody in the market for a ‘new’ wireless connection differentiate between them? So what is different now from the cellular services of yesteryear?

Connection – and a great connection at that! Robust data service along with ubiquitous voice coverage, without service interruption. 3G as a technology had some technical problems, like cell breathing and SINR and UL power limitations, but with LTE, WiMAX and whitespace ecosystems evolving to reach the mainstream, the intelligent femtocell is ready for main street.

Traditional problems for Femtocells

Interference Mitigation

Interference is one of the biggest issues associated with femtocell adoption. There are a number of interference problems, all of which have needed to be investigated and solved to ensure that femtocell deployment can take place successfully.


Telecom Evolution: Impact of Economic Cycles, Consolidation & Managed Services

Source: Wall Street Journal

The other day, when I saw this graphic in the WSJ, it reminded me of the time when I was starting out in the field of telecom as a junior engineer – the world looked so brilliant. I could go work for so many operators, wireless and wired! The world was going wireless, GSM was new to India, and industry captains were predicting multi-fold growth. In the US it was CDMA vs GSM – two very strong contenders that were changing the way the world communicated – while the cola wars dominated the rest of the news and Billy Joel sang “We Didn’t Start the Fire”!

  
 Economic Cycles:


Radio Network Sharing – new paradigm for LTE

Network sharing, or Radio Access Network sharing between two or more operators, will become a reality in the near future as the ARPU trend decreases and CAPEX (capital expenditure) and spectrum costs climb. The time is coming when operators have to decide either to evolve or to go out of business, swallowed by bigger rivals. The days when a voice-centric network paid for itself over time are gone; as data-centric networks evolve, they have to be supplemented with value-added services – apps, faster download speeds, and so on. Wouldn’t it be wonderful if we could do multiple things with our phones – monitor our home, watch football, control home appliances, monitor kids, stay on top of work email, control our work computers, use GPS, do video chat on demand – and all of the above at one flat rate? This is going to be the reality in the next 5-7 years as vendors look to provide these services in collaboration with the operators and smartphones.
What does this mean for an operator? Fatter pipes, new investments in hardware infrastructure upgrades, more space, more power and, above all, more capacity in terms of bandwidth and spectrum. So the path for an operator would be to either choose the upgrade path or become obsolete. Is there a third way, a middle ground, that mitigates spending so much on infrastructure investments? It comes in the form of a sharing agreement, where operator A shares the network or other resources with operator B, as pooled resources mean greater reach in terms of capacity and coverage.
Why do we need RAN sharing or for that matter any network-sharing models?

Capacity

Ironically, there is much talk of an impending spectrum crisis – an alleged shortage of spectrum for cellular networks. Yet a survey commissioned by the NSF in 2005 concluded that radio spectrum is underutilized; it found only 5.2% of the spectrum occupied in the range from 30 MHz to 3000 MHz. So how can there be a spectrum shortage?

The contradiction is that the problem at hand is not a lack of spectrum but that it is so inefficiently used. Pockets of spectrum are quite heavily used – for example, the cell phone and specialized mobile radio (SMR) band (a narrow band from 806 MHz to 902 MHz) is 46.3% utilized in New York City – while other bands are barely used at all. Cognitive radios are one approach to more efficient usage of the spectrum. It might be argued that there is, in fact, not really a shortage of capacity; it is simply off limits to most users, most of the time. Switch on a cell phone in most locations in the world and you can see 5-10 different cellular networks and several Wi-Fi networks. However, most users can only use one of the visible cellular networks, constrained by the contract they are locked into. Nearby Wi-Fi networks are usually off-limits too, because they are secured by their owners.

If we really want to give users access to the abundant wireless capacity around them, why don’t we make it easier – by design and by policy – for a mobile client to move freely between the spectrum, and networks, owned by different cellular and Wi-Fi providers? While this approach is clearly counter to current business practices and would require cellular providers to exchange access to their networks more freely than they do today, I believe it is worth exploring because of the much greater efficiencies it would bring, and the much greater capacity that could be made available to end users. Interestingly, a several-fold increase in capacity could be made available for little or no additional infrastructure cost.
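The efficiency gain from pooling is a textbook trunking effect, which the classic Erlang B recursion can illustrate; the channel counts and traffic loads below are hypothetical, chosen only to show the direction of the effect:

```python
# Trunking gain from RAN sharing, via the standard Erlang B recursion.
def erlang_b(traffic, channels):
    """Blocking probability for `traffic` Erlangs on `channels` circuits."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic * b / (n + traffic * b)
    return b

# Two operators, each with 20 channels carrying 15 Erlangs (hypothetical):
separate = erlang_b(15, 20)  # each operator keeps its own spectrum
pooled = erlang_b(30, 40)    # channels and load pooled under a sharing deal
print(f"separate: {separate:.2%} blocking, pooled: {pooled:.2%} blocking")
```

Pooling always yields lower blocking than running two half-sized trunk groups at the same total load – the same spectrum serves users better when it is shared, which is the quantitative case behind RAN sharing.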


ATT&T – what it means for us …..

Now that the dust is settling on the AT&T and T-Mobile deal, let us get to some straight talk. It is possibly the biggest deal I have seen happen in telecom in the last few years; the one that comes close is the Cingular–AT&T deal a few years back. This deal means different things for different people. Is it good for customers, investors, the financial institutions behind the deal – and, not to forget, the employees of T-Mobile? The “For Sale” sign has been written all over T-Mobile USA for the last couple of years, as growth halted and churn ate into the customer base, but market forces had to improve before such a deal could take place. The biggest impact is on the wireless-broadband workforce in Seattle; that area has been a hotbed for wireless since Craig McCaw started the original McCaw Cellular (acquired by AT&T Wireless), and it has also absorbed the failure of Clearwire – a frontrunner in 4G solutions – to make an impact on the market with its lead in WiMAX-based broadband.

What forced this sale? People may say that the iPhone was the cause – possibly, as one of several reasons – but I would say that in a data-centric world, voice as a paradigm for wireless growth had declined, and T-Mobile was slow in adapting to this change. It was still the cheapest nationwide voice-plan operator, with data thrown in as an afterthought. This was one of the root causes of decline for a network that brought so many firsts to the market – Blackberry devices, Android devices, UMA, etc. In a marketplace where the big fish eats the small fish there is no room for second place; for the record, T-Mobile is the fourth-largest carrier in the continental US.

Customers and Rateplans

What does this change mean for existing customers – will they be able to keep their rate plans unchanged in this scenario? Let us look back at history and see what happened in the past: AT&T customers were allowed to keep their plans as long as they did not make any changes to their handsets or their contracts. That means that when the acquisition is finalized, existing customer contracts will likely be grandfathered and honored. But in the longer term the loser will be the customer, due to less competition among national operators – and T-Mobile is the best value for money, per Billshrink.

The deal will also give T-Mobile users access to a planned 4G wireless network using LTE, or Long Term Evolution, technology. T-Mobile has HSPA+, which delivers 4G-like speeds, but it is not a true 4G technology, and the company hasn’t announced any plans for network expansion beyond HSPA+. As for customers, while the move leaves them with fewer choices, current AT&T and T-Mobile customers may experience improved service quality.

