Wednesday, 18 March 2015
The wave of cellular network operator consolidation is being driven by the need to cut costs and achieve economies of scale.
The wave of operator consolidation across the Americas and Europe is driven by the urgent need to cut costs and achieve economies of scale. With average revenue per user (ARPU) falling, even for supposedly premium LTE and fiber services, and with huge infrastructure upgrades still ahead, carriers need to find drastic cost efficiencies or be consigned to a spiral of rising debt and falling margins.
Even in the US, the big four operators, whose ARPUs are the envy of the rest of the world, are seeing their profits squeezed by price competition and huge capital expenditure (capex) bills, and now they are being hit by another factor, the inflated cost of spectrum. All four have seen their market values slashed in recent days, with investors scared by the bids being made in the current AWS-3 spectrum auction, as well as warnings of difficult quarters from the two market leaders and the ongoing price wars.
Such jitters will put even more pressure on carriers to adopt a dramatically different approach to building their networks, including the software-defined strategies which will eventually eat into the revenues of traditional equipment makers.
All four national US cellcos have seen their stock dumped in the past week, and have lost a total of $45bn in market value between them since mid-November, according to Wall Street Journal calculations. That loss in value is greater than the combined market capitalisation of Sprint and T-Mobile.
Sprint suffered the most, with a drop of 16 per cent in the second week of December, reflecting the persistent lack of confidence in the carrier, which has been less aggressive than T-Mobile USA in price cutting, but has failed to increase its premium base significantly either. It has also been hit by delays in its ambitious LTE rollout plan, and the transition has caused temporary network quality issues which have added to its churn woes.
The drop in Sprint’s value shows the new coolness on carrier stocks is not just about the headline factor of the huge prices bid in the AWS-3 auction, which have now topped $43bn. Sprint was the only one of the big four to stay out of that process, saying it has enough capacity to support its Network Vision multimode infrastructure and its Spark triband LTE offering for now. It will concentrate on leveraging its great asset, its huge quantity of 2.5GHz spectrum, to improve its cost-to-capacity position relative to its rivals, and wait until the 2016 600MHz auction to acquire more frequencies.
Staying out of the AWS-3 madness has not saved Sprint from investor anger, partly because of its ongoing issues, and partly because the current spectrum sale has set new expectations for the 600MHz incentive auction of broadcast frequencies. Since sub-1GHz bands are generally seen as "beachfront" assets, investors fear that the 2016 sale may command even higher prices than AWS-3, and at that stage, Sprint is expected to be a heavy buyer, as is TMo.
Of course, it is not as simple as that. Many market changes will have occurred by the time the delayed 2016 procedure gets under way, and many operators will be setting higher value on capacity bands by that stage, rather than coverage-focused sub-1GHz frequencies.
However, for now, fears that operators will never be able to call a halt to spectrum spending – even with the use of Wi-Fi and other options – are haunting the carrier stocks. T-Mobile lost 10 per cent in the first week of December, while Verizon lost six per cent and AT&T five per cent, partly because of the high sums they will end up paying if they are successful bidders.
But warnings about lower profits in the current quarter, issued by both Verizon and AT&T, are also a big factor, especially as these negative vibes are not about the wireline business but about wireless, which is usually the growth driver. The markets are concerned that the price wars, and the ever greater promotions and discounts operators must offer to lure and retain consumers, will drag on into next year.
Consolidation is the usual remedy for excessive price wars, as seen in other competitive markets like France. In a research note last week, Jefferies analysts Mike McCormack, Scott Goldman and Tudor Mustata wrote that they had "doubts for the sustainability of the four-player market," and that "without a more accommodative M&A environment, short term lower pricing for consumers will likely end poorly for all".
These are the kind of opinions which send investors running, but have less impact on competition authorities. While Sprint owner Softbank argued persuasively, earlier in the year, that a merger with TMo would create a more viable third player and more profits all round – which could then be invested in services and capacity – it backed away from making a bid in the end, deterred by likely antitrust scrutiny of a deal which would reduce the number of carriers. Other offers for the fourth operator could still materialise in 2015, from Dish or even Vodafone (see separate item), but for now TMo is aggressively independent and continually upping the ante in terms of consumer offers. Even Verizon, usually aloof from such mud-fighting, has been forced to join in, hence its warning about current profit levels.
Apart from mergers, carriers may also choose to preserve profit margins by sacrificing market share, losing the lower value consumers and keeping those who choose an operator not on price but for network quality, added value services or choice of devices, for example.
AT&T and Verizon are both working hard on adding new revenue streams, many of them not dependent on the fickle consumer – Verizon’s alliances with carmakers are a good example, as is the involvement of both giants in GE’s Industrial Internet initiative and in smart city projects.
If they can build new revenues, and increase wholesale activity (which protects them from the cost of acquiring and retaining users), they may decide to take the hit of lower subscriber numbers. The Jefferies note argues that it is better to lose customers than ARPU, calculating that a fall in ARPU is 2.5 times more damaging to EBITDA (earnings before interest, taxes, depreciation, and amortization) profits than a loss of subscribers. "The bottom line is the industry needs to stop obsessing about subscribers when ARPU is so much more important," they wrote. "In our opinion, going negative on subscriber growth may be just what the doctor ordered."
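The Jefferies ratio follows from a simple margin argument: losing a subscriber also sheds that subscriber's variable costs, while an ARPU decline cuts revenue across the entire base with costs unchanged. A minimal sketch of this logic, using illustrative figures (the subscriber base, per-user cost and fixed cost below are assumptions, not the analysts' actual inputs):

```python
# Toy model of the Jefferies claim that an ARPU decline hurts EBITDA
# more than an equal-percentage subscriber loss. All figures are
# illustrative assumptions, not the analysts' inputs.

SUBS = 100_000_000        # subscriber base
ARPU = 50.0               # monthly revenue per user ($)
VAR_COST = 30.0           # assumed variable cost per user ($/month)
FIXED_COST = 500_000_000  # assumed fixed network/overhead cost ($/month)

def ebitda(subs, arpu):
    """EBITDA proxy: per-user margin times the base, minus fixed costs."""
    return subs * (arpu - VAR_COST) - FIXED_COST

base = ebitda(SUBS, ARPU)

# Losing 1% of subscribers sheds their variable costs too...
loss_subs = base - ebitda(SUBS * 0.99, ARPU)
# ...but a 1% ARPU cut hits revenue across the whole remaining base.
loss_arpu = base - ebitda(SUBS, ARPU * 0.99)

print(f"EBITDA hit from -1% subscribers: ${loss_subs:,.0f}")
print(f"EBITDA hit from -1% ARPU:        ${loss_arpu:,.0f}")
print(f"ratio: {loss_arpu / loss_subs:.1f}x")  # 2.5x with these inputs
```

With a per-user variable cost of 60 per cent of ARPU, the damage ratio comes out at exactly 2.5x; the real ratio depends on each carrier's cost structure.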
Another analyst, Kevin Smithen of Macquarie Capital, urged Verizon and AT&T to let about five per cent of their customer bases go, and concentrate on a clear network performance advantage over rivals, which would be helped by a lower burden of subscribers, and would drive higher (or at least stable) ARPU. Sprint and TMo could then pick up those users – a total of eight million to nine million between them – and gain what they most need, better scale to improve cashflow and lower debt. That would see a clear divide between the two pairs of operators, with the leading two investing their improved profits in spectrum and network quality, not on subsidies and customer retention.
That model reflects what has already happened in the US, with Sprint and TMo increasingly focused on prepaid and lower cost businesses and AT&T and Verizon staying at the premium end. However, LTE has changed that pattern somewhat, because the smaller operators will need to climb back up the value tree in order to recoup their investments, hence the spread of price wars and heavy discounting from the low-end segment right into postpaid 4G.
LTE has been a big disappointment as a profit generator, especially as it has coincided with a rise in usage which is far bigger than anticipated a few years ago, and a period of heavy spending on device financing. The network upgrade which was supposed to improve the carriers’ economics with its lower cost of capacity and its improved spectral efficiency has caused further pain to some, forcing them to make major investments in spectrum and infrastructure, only to hunt in vain for ROI in a price-competitive market.
The Jefferies trio added: "In our view, innovation, both in the network and in the handset, has caused increasing industry pain. Now, with too many competitors in the market, carriers are unable to price at levels to properly re-coup investment."
The carriers will learn some lessons from this difficult period. One will be to think about their infrastructure investments in a completely new way in future. This is one driver for general interest in virtualised and software-defined networks, which currently come with all kinds of uncertainties and upfront costs, but should eventually allow operators to adopt the economics of IT – commoditised hardware running network functions as apps in the cloud, and supporting very low cost cell site end points.
AT&T is a leading light in this process with its Domain 2.0 program, which sets out a multi-year process to move its network functions to the cloud and create a next generation platform based on SDN (software-defined networking). This will be directly responsible for a reduction in capex, it says. Verizon expects to spend about the same on its networks in 2015 as it did in 2014 – its LTE roll-out is almost complete in coverage terms, but it now needs to shovel in more capacity – but AT&T says it will reduce its bill from $21bn this year to $18bn in 2015, and down from there.
This week, it announced new vendors for Domain 2.0, taking the total to 10, and said in a statement that it expected its next generation network “to reflect a downward bias toward capital spending. This will come from relying less on specialized hardware and deploying more open source and reusable software”.
The latest additions to its supplier roster are Brocade, Ciena and Cisco. These suppliers will see it as an important endorsement – AT&T is such a frontrunner in telco SDN that inclusion on its list is seen as a signal that a vendor is geared up to adapt to the software-led world and its new economics, rather than battling against the huge upheaval. Cisco was omitted from the first wave of Domain 2.0 selections, which helped create considerable market concern that it would struggle to convert its hardware-based business to SDN, despite all its high-profile efforts in this direction. Cisco subsequently acquired one of the Domain 2.0 choices, Tail-f, but has now also joined the elite club under its own steam. The other participants so far are Affirmed Networks, Alcatel-Lucent, Amdocs, Ericsson, Fujitsu, Juniper and Metaswitch.
With its high profile focus on SDN, AT&T is differentiating itself clearly from its arch-rival in terms of its approach to its network. Verizon was the clear winner of the race to get to LTE first, but AT&T has been more experimental with its architecture, and more concerned with adopting technologies which alter the cost structure – it was a pioneer of mobile offload with its acquisition of Wayport; it is prominent in industry developments and organisations for both small cells and WiFi; it is intensely involved with NFV and SDN.
Its architecture chief, John Donovan, summed up the differences in philosophy at the recent Barclays 2014 Global Technology Conference, saying: "Can we benefit from standards? Yes. Can we flatten the network? Yes. Can we reduce components? Yes. But I'm more enthused about making that stuff software-defined than I am about next-gening it.”
In a new blog post, he pledged to make the AT&T network 75 per cent software-driven by 2020, building on existing projects such as the virtualisation during 2015 of network analytics, edge routers and some data platforms.
This emphasis keeps Donovan out of the most hype-infested areas of debate about "5G". The senior EVP of architecture, technology and operations is not uninterested in a possible air interface upgrade, but only if it is a real leapfrog “that would allow us to carry more bits per hertz, so that we can gain more efficiency in our spectrum ... It's sort of a high bar saying if you can clear this, then it might make sense to take a look. Absent that, then it's good to sit and talk and plan, but it's not going to trigger any sort of investment cycle."
This is the sort of comment which strikes fear into the hearts of traditional network vendors, many of which are busily testing technologies which could go into a brand new air interface and power a conventional 5G upgrade. But the operators know there are likely to be more significant, and possibly more immediate, cost gains to be made from turning the network into software.
The initial effort and disruption will be just as great as ripping and replacing a generation of mobile kit, but the eventual savings should be greater for many, and they will gain a flexible network which can support many kinds of as yet unimagined new services and behaviours, without the need for yet another rip and replace.
South Korea's LG Electronics has announced its first self-made mobile system-on-chip (SoC), along with plans to debut the chip in a new smartphone to be released soon.
Nuclun (LG says it's pronounced "NOO-klun," somewhat like how Americans pronounce "nuclear") is an eight-core application processor based on ARM's big.LITTLE technology. It marries four 1.5GHz Cortex-A15 cores with four 1.2GHz Cortex-A7 cores, plus next-generation LTE-A Cat 6 wireless that's backward-compatible with current LTE networks.
From the sound of it, LG doesn't plan to market these chips to other phone makers. Instead, it will use Nuclun as a way to cut costs and deliver devices that are better differentiated from the competition. Previously, LG has mostly sourced SoCs for its smartphones from Qualcomm, as do many other vendors. "With this in-house solution, we will be able to achieve better vertical integration and further diversify our product strategy against stronger competition," Jong-seok Park, CEO of LG's Mobile Communications Company, said in a canned statement. "Nuclun will give us greater flexibility in our mobile strategy going forward."
This is the first time that LG has created its own processor since it divested its earlier chipmaking business during the Asian financial crisis of the 1990s.
Behind the scenes, LG is reportedly working with Taiwan Semiconductor Manufacturing Company to fab the chips, as the Korean firm has no production facilities of its own. According to a report in the Wall Street Journal, LG has been collaborating with TSMC for the last few months to develop Nuclun.
LG says its new Nuclun mobile chips will give its smartphone line greater diversity. The move sees LG joining Apple and Samsung in creating its own silicon for its mobile devices – although Samsung, at least, is not exclusive to its homebrewed Exynos line and still sources processors from several other vendors, as will LG.
The Korean firm says the first phone to be built around Nuclun is the G3 Screen, an Android 4.4 "KitKat" device that's based on the popular LG G3, with a 5.9-inch full HD screen, 2GB of RAM, 32GB of eMMC storage, a 13-megapixel rear camera, and a 2.1-megapixel front camera. LG says the G3 Screen will ship "this week," but there's a catch: it will only be available in South Korea, having been "developed specifically" for the Korean market.
Mobile operators’ business models need to evolve to address the changing capex environment, which has become more challenging over the last decade, said Hatem Bamatraf, Etisalat group CTO, at Mobile World Congress.
The key question, he said, is “how are we going to justify investing in a certain technology? People love technology, but there needs to be a balance between the investment and the business case. There are lots of ratios we are looking at before we decide how much money to invest in a certain market and in a certain technology”.
Speaking during a panel session on 5G, he said all of those ratios are related to revenue and profitability. “We all know there are huge challenges when it comes to top-line growth as well as bottom-line growth.” Asked how much operators will need to invest to the point of being able to launch 5G, he joked that you first need to ask Huawei what it will sell its 5G equipment for, “then we’ll be able to address that”.
Mischa Dohler, professor of wireless communications at King’s College London, said there is an opportunity to crowdsource some of the deployment of 5G technology, because the industry is not only connecting people but also industries.
“When you start adding industry by industry, you crowdsource in a sense, so your capex can actually go down because someone else is paying.”
Seizo Onoe, NTT DoCoMo CTO, insisted that the market leader in Japan is keeping to its aggressive 5G roadmap schedule. “I’m sure we can launch 5G by 2020.”
He said his preference is to have one 5G network for all 5G use cases, but “we’ll probably end up using different technologies. My dream is to finally have one technology covering all use cases, however, that will probably be 6G.”
Yang Chaobin, Huawei’s CMO of wireless networks, agreed, noting different vertical industries should be able to share a common infrastructure and be able to tap services on demand. He noted that 5G shouldn’t just be an update of existing technologies but should be an enabler of many disruptive innovations in other industries.
The mobile industry should consider 5G as a “special generation”, introducing “challenges in all layers of the technology”, Mike Short, VP of public affairs at Telefónica Europe, said at Mobile World Congress.
Speaking in one of the sessions covering the next-generation mobile technology, he said: “It’s beyond what we’re doing today with mobile. 5G will have a huge influence on our connectivity to the internet and wireless broadcast capabilities.” But Matt Grob, EVP and CTO of Qualcomm, questioned any need to rush. “We have engineering teams working on LTE Advanced and 5G. Each time the 5G team unveils a new performance leap, the LTE engineers respond by matching it.”
Underlining the need for a new generation, Mischa Dohler, professor of wireless communications at King’s College London, insisted that “We really do need 5G in order to have a paradigm shift. The order of magnitude jump in traffic is what is really driving this move.” He said the industry is nearing the limit and needs to have breakthroughs, which will hopefully come from the 5G developments.
This stand was supported by Allan Kock, director of RAN development at TeliaSonera. “5G is a fundamental change in technology and will have a significant impact on how we offer services. We must look at performance and coverage, and not just consider microcells.”
Ericsson group CTO Ulf Ewaldsson said it’s important to have a full understanding of the needs of industries that are being transformed. This means there is a wide range of potential requirements, from low-power sensors to fast-moving vehicles that require extremely low latency. “And we’re going to fit all that on one network. The risk with 5G is we’re stretching it too wide to be able to build it. But we don’t know that yet,” he cautioned.
Chaesub Lee, director of ITU’s Standardisation Bureau, however, noted there is a long way to go, since the industry’s current treatment of traffic isn’t smart enough to serve all the business models.
“We now only have one classification of traffic – broadband or not.”
A 5G White Paper unveiled by the Next Generation Mobile Networks (NGMN) Alliance aims to settle industry debate over the technology’s future standards.
Following a decision late last year by the NGMN board of more than 20 operator CTOs, a team of 100 technical specialists were asked to contribute towards defining the end-to-end requirements for 5G. The resulting White Paper, according to the NGMN, now presents a consolidated view of operator requirements intended to support the standardisation and subsequent availability of 5G for 2020 and beyond.
“The White Paper provides essential and long expected input for the work of many industry bodies”, said Peter Meissner, CEO of the NGMN Alliance. “Together with our global partners from within industry and research, we will now focus on setting up and implementing a 5G work-programme ensuring that future solutions will meet our ambitious targets.”
The NGMN document states it would like to see any 5G ecosystem as global, free of fragmentation and open to innovation. However, while conceding that commercial deployment timescales will vary across the operator community, it urges 5G availability by 2020. The White Paper is set for open discussion at the NGMN industry conference in Frankfurt. Meanwhile, equipment vendors have been calling for open collaboration across industry sectors for some time.
“We want the standardisation of 5G to be done differently to past efforts,” said Ken Hu, Huawei’s rotating CEO, at Tuesday’s keynote presentation at Mobile World Congress. “There should be a better understanding of the particular requirements of vertical industries and improved communication between interested parties.”
Technology vendors disagreed over the timescales for 5G deployments at a Mobile World Congress keynote.
Qualcomm’s CEO Steve Mollenkopf called for the benefits of 4G LTE Advanced to be fully maximised to protect R&D and deployment investment, whereas Ken Hu, Huawei’s rotating CEO, stressed the benefits of 5G over today’s LTE.
“The debate is when do we call it 5G,” said Mollenkopf. “There’s still a lot to do.” He added: “The biggest challenge we face with 5G is the extreme number of use cases. There will be many new methods for billions of devices to connect and interact, and we need to transform the edge of the internet to better support mobile devices. It’s at the edge that real innovation will take place.” However, he warned that there is an ongoing need for multimode support to protect existing and future LTE investment, and “we don’t need to make a huge technology jump when LTE is providing some of this already”.
Hu countered this viewpoint with the claim that LTE cannot support the sheer number of connections needed for IoT services. “5G will be capable of connecting 100 billion devices, which will be very valuable for industrial applications.” The Huawei exec noted that only 5G latency capabilities could fulfil the much-hyped driverless car concept. “Stopping a car travelling at 100km/h would extend braking distance by another 1.4 metres due to LTE latency, but only 2.8cm with 5G.”
“5G provides us with a very powerful applications platform that will take the technology into new industry segments and trigger positive disruption. But we must involve the key industry verticals in how 5G evolves. The communications industry did this in isolation in the past, which resulted in a fragmented approach.”
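Hu's braking figures can be sanity-checked by computing the distance a car covers while a network round-trip completes. A quick sketch, assuming roughly 50ms end-to-end latency for LTE and the 1ms target widely cited for 5G (the latency values are back-derived assumptions; Hu did not state them):

```python
# Sanity-check of Huawei's braking-distance figures. The latency values
# (50 ms for LTE, 1 ms for 5G) are assumptions inferred from the quoted
# distances, not figures given in the keynote.

def extra_braking_distance_m(speed_kph, latency_s):
    """Distance travelled before braking begins, due to network latency."""
    speed_ms = speed_kph * 1000 / 3600  # convert km/h to m/s
    return speed_ms * latency_s

lte = extra_braking_distance_m(100, 0.050)  # ~50 ms LTE round-trip
nr  = extra_braking_distance_m(100, 0.001)  # ~1 ms 5G latency target

print(f"LTE: +{lte:.2f} m")        # ≈ 1.39 m, close to the ~1.4 m quoted
print(f"5G:  +{nr * 100:.2f} cm")  # ≈ 2.78 cm
```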
As an indication of Huawei’s keenness to move forward, the company confirmed it had already developed a new air interface for 5G. Hu added that 5G would have a virtualised architecture leading to a single physical network providing support for a multitude of different apps.
Mollenkopf, meanwhile, indicated that the concept of 5G was still very much under discussion, emphasising Qualcomm’s views that this next-gen technology must target user-centric connectivity. “It’s important that the mobile user is seen as part of the network, a node.”
Despite being one of the hot topics at Mobile World Congress, 5G is “still a technology push story, not a market pull one”, Michael Peeters, Wireless CTO for Alcatel-Lucent, told Mobile World Live.
“Operators have two primary approaches to 5G: one which is driven by their research organisations which need to understand whatever 5G may be in order to be ready and to drive the direction of research and standards; and another which is driven by commercial and operational needs which are trying to understand how 5G fits into future operations and revenue streams. It is clear that today the first one is the more important one,” he said.
There are two “likely, or rather, visible” paths for operators looking to make the most of 5G, Peeters continued. The first will be through the continued support of “ultra broadband applications”, solving capacity issues where heavy users are connected to networks. And the second is “enabling the world of ubiquitous IoT” – “where an infinity of devices (real, or virtual i.e. applications) each use almost zero bandwidth, but nonetheless eat up all of the control plane of the network”.
“IoT today is a catch-all which contains wearables, objects about the house, whitegoods that are powered, cars, camera networks and whatnot – indeed a huge spectrum of bandwidth and connectivity requirements. 5G can become the network that unites technologies in a way that creates a better end-user experience – if the industry ecosystem can see the value in ‘one network to rule them all’ – 5G can be the overseer of the synergies between many different technologies that each add their own unique value,” the executive said.
Peeters also took a light-hearted view of the term “4.5G”, which many in the industry – including Alcatel-Lucent – have used to describe technology beyond LTE-Advanced. “We’re already sorry we ever used the term. We’d said in some talks that we’d call the set of foundational technologies ‘4.5G’ as an easy shorthand. Immediately, the industry jumped on this and everything LTE-A, beyond release 12, was suddenly 4.5G,” he said.
Although 5G is not yet an official mobile standard, ABI Research has gone out on a limb to suggest that it will take more than five years to reach 100 million connections – two years longer than 4G, which grew faster than previous generations.
The research firm put 4G technology’s growth down to the capabilities of increasingly powerful smartphones and the wide availability of 4G devices, and predicts that 5G growth will “be a bit more muted at first due to the increased complexity of cells and networks, but will pick up in 2023.”
According to research director Philip Solis, the US, China, Japan, South Korea and the UK will drive 5G subscriber volumes because they are early builders of 5G networks and have a large population living in urban areas. The report also says that it is important to understand the nuances around 5G to understand where it is headed.
“5G will be a spectrum of evolution to revolution—it will be an evolution of the way the core network and network topology is transforming now, but it will be clearly delineated as a fifth generation mobile air interface on which the mobile network of the 2020s and 2030s will be built,” Solis explained.
According to the study, 5G will encompass spatial division as the foundation of the air interface, leveraging techniques like massive MIMO and 3D beamforming. Client devices will have links to multiple cells simultaneously for robust connectivity.
Spectrum will be used flexibly and shift as needed between access, fronthaul and backhaul. A 5G network will be made up of small cells, but the report says to expect a scaled-down version using existing spectrum for macrocells in the longer term as well. One problem, however, could be regulatory issues concerning concentrated RF beams in centimeter and millimeter wave spectrum, the report notes.
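The beam-steering idea behind 3D beamforming and massive MIMO can be illustrated with a uniform linear array: applying a per-element phase ramp makes the elements add coherently in one chosen direction and largely cancel elsewhere. A toy sketch (the element count, half-wavelength spacing and 30° target below are illustrative assumptions, not any vendor's design):

```python
import cmath
import math

# Toy beam steering with a uniform linear array: a per-element phase
# ramp makes all N elements add coherently in one chosen direction.
N = 64                        # array elements (massive-MIMO scale)
d = 0.5                       # element spacing in wavelengths
steer = math.radians(30.0)    # direction we want the main lobe to point

# Steering weights: cancel the inter-element phase at the target angle.
w = [cmath.exp(-1j * 2 * math.pi * d * n * math.sin(steer)) for n in range(N)]

def array_gain(theta_deg):
    """Normalised array-factor magnitude (1.0 = full coherent gain)."""
    th = math.radians(theta_deg)
    a = [cmath.exp(1j * 2 * math.pi * d * n * math.sin(th)) for n in range(N)]
    return abs(sum(wn * an for wn, an in zip(w, a))) / N

print(f"gain towards 30 degrees: {array_gain(30.0):.2f}")  # 1.00 (main lobe)
print(f"gain towards  0 degrees: {array_gain(0.0):.2f}")   # 0.00 (off-beam)
```

The bigger the array, the narrower the main lobe, which is why reports describe the result as a "pencil beam".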
UK watchdog Ofcom is calling for industry input on the use of “very high frequency spectrum” to deliver so-called 5G services.
It noted that 5G is likely to use large blocks of spectrum to achieve the fastest speeds, which are “difficult to find at lower frequencies”. Higher bands, above 6GHz, will therefore be important. The regulator said that it expects lower frequency bands will be needed for wide-area coverage, while the higher-frequency range will provide increased performance and capacity at specific locations.
The growing use of small cells will also support the use of different spectrum bands, because the power and coverage requirements are different to those of macro sites, making higher bands more viable. And other new technology developments may make spectrum above 6GHz viable for mobile services, it continued.
Ofcom has already started looking at the use of spectrum below 6GHz that could be used for mobile applications, including 5G. Noting that the launch time frame for 5G services is uncertain, Ofcom said that it is important that it “does the groundwork now to understand how these frequencies might be used to serve citizens and consumers in the future”.
Spectrum above 6GHz currently supports various uses, the regulator said, from scientific research to satellite broadcasting and weather monitoring – and “one of Ofcom’s core roles” is managing spectrum taking into account current and future demands. The closing date for responses was 27 February 2015.
SK Telecom and Samsung Electronics demonstrated a record mobile transmission rate of 7.55Gb/s using ‘3D beamforming’ antenna technology at Mobile World Congress last week. The antenna technology, the companies said, compensates for high propagation loss at millimeter-wave frequencies by producing a pencil beam and steering the direction of radio signals.
Millimeter-wave frequency bands have not generally been used for cellular communication due to the high propagation loss. However, as the frequency bands below 6GHz become more saturated, mobile operators are pushing to develop new technologies that can make use of ultra-high frequency bands.
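The scale of that propagation loss follows from the Friis free-space equation, in which loss grows with the square of frequency. A quick comparison (the 2.6GHz band, 28GHz band and 100m distance are illustrative assumptions, not figures from the demo):

```python
import math

# Free-space path loss (Friis) comparison, illustrating why mmWave bands
# need beamforming gain. Frequencies and distance are illustrative.

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

lte_band = fspl_db(100, 2.6e9)  # a typical sub-6GHz LTE band
mmwave   = fspl_db(100, 28e9)   # a candidate mmWave 5G band

print(f"2.6 GHz @ 100 m: {lte_band:.1f} dB")
print(f"28 GHz  @ 100 m: {mmwave:.1f} dB")
print(f"penalty: {mmwave - lte_band:.1f} dB")  # ~20.6 dB extra loss
```

Roughly 20dB of extra loss is what the pencil-beam antenna gain has to claw back before mmWave links become usable.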
The two South Korean firms also showcased what they called “full dimension MIMO” that uses tens to hundreds of antennas. The current LTE-Advanced standard uses a maximum of eight antennas.
The companies also said they plan to strengthen their cooperation to develop key 5G technologies, with the goal of being the first to commercialise 5G in 2020. They signed an agreement in October 2014 to start joint research on 5G network technology and service development.
South Korea’s leading operator SK Telecom (SKT) has demoed a data download speed of 600Mb/s using Nokia Networks’ 4X4 MIMO technology combined with carrier aggregation (CA).
By using four transmit antennas and four receive antennas to link a mobile device and a base station, the technology is able to double download speeds compared with typical 2X2 MIMO LTE, which has two transmit and two receive antennas.
Applied to current LTE systems on a 10MHz bandwidth, the technology can double the maximum downlink speed from 75Mb/s to 150Mb/s, the companies claimed. To reach the 600Mb/s LTE-Advanced data speed, SKT and Nokia Networks first achieved a 300Mb/s data rate using 4X4 MIMO on a 20MHz block of spectrum.
They then combined two of these LTE bandwidths using CA to hit the 600Mb/s rate. Since a mobile device with four antennas and CA capability has not been developed, the two companies used a mobile device simulator made by Aeroflex to measure the data speeds during the demonstration.
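The arithmetic behind the 600Mb/s figure is multiplicative: peak LTE rate scales with the number of spatial streams, the bandwidth, and the number of aggregated carriers. A sketch from the standard 75Mb/s baseline (2X2 MIMO on 10MHz), assuming ideal linear scaling with no overhead differences:

```python
# Peak-rate arithmetic for the SKT/Nokia demo, scaling linearly from
# the standard 75 Mb/s baseline (2x2 MIMO on a 10 MHz carrier).
BASELINE_MBPS = 75
BASELINE_STREAMS = 2
BASELINE_BW_MHZ = 10

def peak_rate_mbps(streams, bw_mhz, carriers=1):
    """Ideal peak rate, assuming perfect linear scaling of the baseline."""
    return (BASELINE_MBPS
            * (streams / BASELINE_STREAMS)
            * (bw_mhz / BASELINE_BW_MHZ)
            * carriers)

print(f"{peak_rate_mbps(4, 10):.0f} Mb/s")              # 150: 4x4 doubles 2x2
print(f"{peak_rate_mbps(4, 20):.0f} Mb/s")              # 300: 4x4 on 20 MHz
print(f"{peak_rate_mbps(4, 20, carriers=2):.0f} Mb/s")  # 600: two carriers via CA
```

Real peak rates depend on modulation, coding and control overheads, so the linear scaling here is an idealisation of the vendors' round numbers.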
The companies said the demonstration is important because MIMO is a key technology needed to achieve early commercialisation of supposed ’5G’ networks.
Park Jin-hyo, the head of SKT’s Network Technology R&D Centre, said the demonstration “not only strengthens our competitiveness in LTE-A, but also marks another important milestone in our journey towards achieving the next-generation network.”
The demonstration was showcased at Mobile World Congress in Barcelona, where SKT and Samsung separately demonstrated a record mobile transmission rate of 7.55Gb/s using ‘3D beamforming’ antenna technology.
Back in January ZTE boasted a “world first” by completing pre-commercial field testing of multi-user and multi-stream transmission on a massive MIMO (multiple input multiple output) base station, claiming it had set “new records” in single-carrier transmission capacity and spectral efficiency.
The Chinese supplier didn’t reveal the actual speeds, saying only that its proprietary massive MIMO base station “demonstrated peak data throughput that is more than three times that of traditional base stations, and average data throughput that exceeds conventional systems by at least five times”. The test was done using handsets based on existing 4G standards. “Being a pre-5G technology, ZTE’s massive MIMO solution is delivering exponential advances to 4G networks without modifying existing air interfaces, making it possible for carriers to provide a 5G-like user experience on existing 4G handsets in an accelerated timeframe,” said Dr Xiang Jiying, chief scientist at ZTE.
“ZTE successfully overcame the challenge of doing multi-user and multi-stream spatial multiplexing in a scattered-signal environment, clearing the main hurdle in the development of massive MIMO technology.” No doubt intended to underline its ‘5G’ credentials, the Chinese firm added that “several major international telecommunications operators indicated they would deepen their collaborations with ZTE, after they were invited to attend the field test, noting the record-breaking results had exceeded their expectations”.
The latest development, said ZTE, comes two months after it completed a pre-commercial test of what it claimed was the world’s first pre-5G massive MIMO base station in November 2014. The announcement about the record capacity also gave some detail about the base station itself. Comprising 128 antennae, ZTE said it occupies a frontal area similar to existing eight-antenna systems. And by integrating the antennae, base station units and RF units in one module, ZTE also claimed it uses only one-third of the installation space of traditional systems, lowering operating costs and total cost of ownership. ZTE has been talking up the pre-5G concept for quite some time, saying it will be available much sooner than 5G (which has a time frame for commercial launch of beyond 2020).
The supplier believes too that pre-5G will deliver a comparable user experience to eventual 5G technology, offering high throughput and low latency through the likes of MIMO technology. While ZTE is no stranger to 5G media announcements, neither are companies such as Samsung and Huawei.
Distributed Broadband Optical Wireless MIMO Antennae Networking goes along with 3D Beamforming antennae
3D beamforming MIMO antennae technology is clearly key to early deployment of 5G, but a more vital issue is the development of inexpensive multi-gigabit optical fibre backhaul, or better still free-space optical links, between closely spaced small cells on street lamp posts, which will require a new mesh-based metropolitan area architecture.
We all know IoT and 5G are going to be huge, and as with most technology, the industrial IoT will eventually eclipse its consumer equivalent. But while the two depend on each other, industrial adoption is certainly in no rush. From smart socks to smart homes, consumer tech is flying off the shelves, but industrial IoT is taking its sweet time.
As always there’s a good reason for that. Consumer product cycles are often measured in months, while industrial hardware changes over a period of years or more. An oil pipeline or a stretch of train track may last several decades before retirement or major servicing. That’s typically a good thing, but with the IoT, reliability can get in the way. Sure, adding sensors to industrial systems can add value, but is the payoff worth a rip-and-replace? If you’re managing lots of expensive, perfectly reliable systems spread over a large area, the answer is often “No.”
In Employing Industrial IoT: A Framework for CIOs, Gigaom Research Analyst Adam Lesser outlines a number of inhibitors to industrial IoT adoption: standards, security, staffing and power efficiency. These are tough enough on their own, and any industrial IoT implementation will have to cope with them, but for companies with large sunk costs in facilities and physical infrastructure, the very act of upgrading equipment can cost money and time the business just isn’t prepared to absorb. But what if instead of sensor-enabling the environment we want to measure, we put the sensors on the people already moving through it?
I’m not talking about Google Glass or the HoloLens. Most augmented reality platforms are great for visualizing 3D spaces, providing contextual documentation, or otherwise enhancing your field of view, but they’re usually focused on helping us do one thing more efficiently. They’re primarily push tools, with very limited perception beyond the vision and hearing we already have. But if we enhance perception beyond human limits, we’ve just put legs on some pretty powerful data collection, and we can put that to use without upsetting a single apple in the cart.
A few weeks ago DAQRI’s 4D Expo conference was held in Los Angeles. While the show featured everything from 3D printing to augmented colouring books, the centerpiece of the show was the DAQRI Smart Helmet, a wearable AR platform designed exclusively for industrial use. While most of the attendees were focused on the AR output of the device (admittedly, the Predator-like holographic wrist controls were pretty cool), the most impressive part of the helmet was the sensor package pulling in data that unassisted humans could never access: the test unit had built-in thermal sensors and 360-degree cameras.
Both of these could assist workers in accomplishing their usual tasks more safely or efficiently (e.g., “Don’t touch that until it cools down” or “Watch out for that beam behind you”), but they could also provide passive scanning for entirely different use cases. For example, equipped with similarly-specced devices, security guards with no additional effort or training could scan factory equipment for excessive heat or out-of-bounds behavior while they do their nightly rounds.
Of course, not everything can be assessed externally. If you want to know what’s going on inside a data center, nuclear reactor, or deep-sea cable, there are no shortcuts. You’re going to need to put a sensor inside. But there are plenty of opportunities to enhance the perception of existing employees and get a lot of bang for a comparatively small buck, with no disruption in business flow.
There may even be opportunities to scan for data entirely unrelated to the task at hand on behalf of a third party, creating new revenue streams while increasing the safety and efficiency of core business processes. Large, established businesses have a lot to lose, making true ground-up innovation a tough sell. But big industry has one massively underutilized resource — knowledgeable people.
After decades of trying to automate them out of existence, humans may turn out to be the ideal bridge we need to get a jump on the next 5, 10 and perhaps 20 years of 5G expansion.
Speaking at Mobile World Congress in Barcelona, EU Digital Economy Commissioner Günther Oettinger laid out his “grand vision” for 5G, saying that his “ambitious strategy” will give the EU a strong voice in international deals to set 5G standards.
“The EU industry has a major role to play in the context of global 5G. It has a strong influence on the competitiveness and innovation of other sectors. Beyond economic matters, it is also about security and technological sovereignty for Europe,” said Oettinger.
However, despite this he was keen to push the idea of international collaboration. Last June the Commission signed a deal with South Korea to work together towards a global definition of 5G and to cooperate in 5G research.
“It is our intention to sign similar agreements with other key regions of the world, notably Japan, China, and the US,” said Oettinger on Tuesday. Chinese representatives were quick to jump on the high-speed bandwagon.
“5G will represent a new wave of innovation. We need to ensure that technological revolution and business model evolution go hand-in-hand,” said Dr Li Yingtao of Huawei, who stressed that the company is committed to research and innovation within the EU. “We are joining forces with our European partners to help the EU take the lead on the road towards 5G,” he said.
Luigi Gambardella, President of ChinaEU, a bilateral trade lobby group, added: “Cooperation between Europe and China on 5G will open the door for both the Chinese and the European industry to the biggest and richest markets in the world respectively, and create the conditions to invest more in European and Chinese digital economies.”
ChinaEU wants nothing less than to put together “the political institutions and the main manufacturers of both regions to achieve this common objective”.
With companies including Huawei and ZTE, China is the world’s largest manufacturer of mobile equipment and network infrastructure, something Oettinger knows he cannot ignore.
But he must get the European house in order first. As part of his “grand vision”, he wants early identification of 5G spectrum bands and with the next International Telecommunication Union World Radio Conference looming, time is running out.
5G standardisation is expected to start in 2016 and a major China-EU summit is due to be held in Brussels in June.
“There can be no successful 5G deployment in Europe without enhanced coordination of spectrum assignments between member states,” said Oettinger before warning squabbling national ministers that they must get their act together and agree on the proposed Telecoms law.
“It contains important measures to facilitate small cell deployment and Wi-Fi which are at the heart of 5G success. Removing administrative barriers for their rapid deployment is the forward-looking policy of today to enable 5G tomorrow,” said the Commissioner.
Huawei has talked up its plans for 5G at Mobile World Congress, with an emphasis on building its patent book for the next mobile standard.
The company already ranks fifth among European patent applicants for 2014, according to the European Patent Office, with nearly 500 patents granted that year. New filings across its whole portfolio came to 1,600 in 2014, the EPO report says.
CEO Ken Hu told MWC attendees the company's spending on 5G would rise; as Reuters notes, it had previously committed to tip US$600 million into its 5G bucket between 2013 and 2018.
The Register noted that technologies like Huawei's 5G air interface represent a shot at capturing the flag in the emerging technology.
Hu said the company's work in 5G was giving it a strong position in intellectual property.
Since he expects the 5G network to have to serve 100 billion “smart nodes”, those that get their patents into standards will secure a fabulous annuity in the form of licence payments.
Getting there, he said, will require cooperation between carriers, vendors and vertical industries.