Wednesday, 16 January 2019

Samsung launching 5G smartphones at Barcelona in February.

Meet the 5G Samsung Galaxy S10 X and its monstrous specs!

Samsung is pulling out all the stops for its all-new 5G colossus. The company will unveil its all-new Galaxy S10 range on February 20, 2019 at simultaneous Galaxy Unpacked events in San Francisco and London.

Thanks to the prominent "10" emblazoned on the invitation, it seems pretty clear we'll see the Samsung Galaxy S10, Galaxy S10 Plus, and new affordable Galaxy S10 Lite revealed during the keynote presentation next month.

However, the latest whispers suggest we could also see the foldable Galaxy X, a pair of sensor-packed Smart Shoes, and a 5G-enabled Galaxy S10 announced. The latter has been the subject of a new leak which gives us a boatload of new details on the 5G-compatible Galaxy S10 handset.

According to sources speaking to the publication, the so-called ultra-flagship model will be branded Galaxy S10 X. The addition of the "X" to the Galaxy S10 name is not only meant to signal that the Galaxy S10 marks the tenth anniversary of the best-selling Galaxy S smartphone series, but also purportedly feeds into the marketing planned for the handset, which will centre around the buzzwords "eXperience" and "eXpansion".

Those of us who have been closely following the whispers from the supply chain in the run-up to the Galaxy S10 launch event will know that this name seems somewhat problematic. After all, Galaxy X has been heavily rumoured to be the name of the long-awaited foldable flagship phone due to be unveiled during the same keynote as the Galaxy S10 series.

Samsung briefly unveiled the pliable phone during its annual developer conference in San Francisco in November 2018, but neglected to confirm the name of the handset. The sources have not only revealed the branding of the upcoming Galaxy S10 X, but also some new details about the specs squeezed inside the ultra-flagship. And it sounds like a beast of a smartphone. The 5G-enabled Galaxy S10 will include the same in-screen ultrasonic fingerprint scanner scheduled for the regular 4G-touting Galaxy S10 and Galaxy S10 Plus, as well as a whopping 1TB of built-in storage, an enormous 5,000mAh battery, and 10GB of RAM or more.

Wow. In addition to serious overkill specs inside the handset, there are claims that the flagship will boast six cameras (a dual-selfie camera on the front, and a quadruple set-up like the Galaxy A9 on the back), 3D depth-sensing for improved images, and a humongous 6.7-inch Super AMOLED Infinity-O style display.

Finally, the 5G-charged Galaxy S10 X will also include exclusive new software capabilities. The next-generation handset will purportedly use AI to attempt to automate some of its users' most common software tasks. This sounds similar to what Google has introduced in Android 9.0 Pie. However, Samsung is tipped to be pushing the features further using its talkative Bixby assistant.

So, what will the 5G Samsung Galaxy S10 X cost? Prices for the entry-level (non-5G) Galaxy S10 Lite will sit somewhere between £555 ($712) and £625 ($801). Meanwhile, sources claim the maxed-out Galaxy S10 could reach as high as £1,250 ($1,603).

However, it's unclear whether this is a reference to the Galaxy S10 Plus or the 5G-enabled Galaxy S10 X. According to an earlier Samsung Galaxy S10 UK price leak, the top-end smartphone model will reach an eye-watering £1,399 ($1,791). Given that the Galaxy Note 9 currently costs £1,099 ($1,149.99) for the maxed-out model with 512GB of built-in storage, these leaked prices really don't seem that unrealistic.

5G emerges in alpha-test mode in the UK, writes Wired magazine.

Hybrid 5G performance is being analyzed.

5G device-to-device communication is a product of progressive thinking: a network that uses LTE in conjunction with low-band Wi-Fi communication.

The main idea behind combining two different types of network is that base stations suffer from heavy traffic loads and tend to drop data and information in such cases. Beyond this, another mainstream goal is to provide security for such a communication technology.

The future 5G network relies on transitioning nodes: a cluster head communicates with another cluster head via base stations, and the nodes in between relay traffic from one cluster head to another.

A gray hole attack is a situation in which the attacker inserts a malicious node into a cluster and steals information. This paper examines the performance of 5G networks and the likely effects of gray hole attacks on them.

A further proposal emerges around Hybrid5G

In this paper, we propose a user-centred handoff scheme for hybrid 5G environments. The handoff problem is formulated as a multi-objective optimization problem which maximizes the achievable data receiving rate and minimizes the blocking probability simultaneously. When a user needs to select a new Base Station (BS) in handoff, the user calculates the achievable data receiving rate and estimates the blocking probability for each available BS based on limited local information. By taking the throughput metric into consideration, the formulated multi-objective optimization problem is then transformed into a maximization problem. We solve the transformed maximization problem to calculate the network selection result in a distributed manner. The calculated network selection result is proved to be a Pareto-optimal solution of the original multi-objective optimization problem. The proposed scheme guarantees that, based on limited local information, each user can select a new BS with a high achievable data receiving rate and a low blocking probability in handoff. Comprehensive experiments have been conducted, and they show that the proposed scheme significantly improves the total throughput and the ratio of users served. INDEX TERMS 5G mobile communication, Base stations, Optimization, Mobile handsets, Bandwidth, Quality of service, Interference
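To make the abstract's idea concrete, here is a minimal sketch of user-centred base station selection, assuming a simple utility of rate × (1 − blocking probability). The class, station names, and numbers are purely illustrative; this is not the paper's actual formulation, just the intuition.

```python
# Toy sketch: each user scores every reachable base station by its
# achievable data rate discounted by the estimated blocking probability,
# using only local information, then picks the best one.
from dataclasses import dataclass

@dataclass
class BaseStation:
    name: str
    achievable_rate_mbps: float   # estimated data receiving rate
    block_probability: float      # estimated chance the BS rejects the user

def select_base_station(candidates):
    """Pick the BS maximizing expected throughput = rate * (1 - P_block)."""
    return max(candidates,
               key=lambda bs: bs.achievable_rate_mbps * (1 - bs.block_probability))

candidates = [
    BaseStation("macro-LTE",     achievable_rate_mbps=80.0,  block_probability=0.30),
    BaseStation("5G-small-cell", achievable_rate_mbps=400.0, block_probability=0.10),
    BaseStation("WiFi-AP",       achievable_rate_mbps=150.0, block_probability=0.05),
]

best = select_base_station(candidates)
print(best.name)  # the 5G small cell: 400 * 0.9 = 360 expected Mbps
```

Note how the discount term lets a slightly slower but rarely-blocking access point beat a congested faster one, which is the trade-off the paper's multi-objective formulation balances.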

Tuesday, 15 January 2019

LiFi has a fascinating history, which is revealed by Professor Harald Haas, interviewed in Edinburgh, Scotland.

Project Loon: The Journey is the reward not just the destination!

The earliest tests of Loon started back in 2011, eight years ago, using a weather balloon and basic, off-the-shelf radio parts as the first prototype. The years that followed were a process of rapid iteration to prove beyond any doubt that balloon-powered internet works very well indeed in rural and metro areas around the globe.

Razer Blade 15 Advanced // 4K OLED & 240 HZ!?

4K relevance in our development of Hybrid5G Infrastructure

Hands down the most fabulous high-end 4K laptop launched in 2019, the Razer Blade 15 Advanced features the new RTX 2060, 2070 Max-Q and 2080 Max-Q GPUs, making it an ideal gaming device.

I find a 4K OLED panel paired with a 240Hz display to be an extraordinarily powerful solution for livestreaming.

Attend our secure audio/video conference call where we review our design of Hybrid5G infrastructure and discuss Smart Cities development issues. Join our secure, free Livestream daily at the link below; we support up to 12 attendees for training.

We are online every day of the week for an hour or two beginning at 10 AM EST in January and February. If you have any questions about Hybrid5G Infrastructure or Smart Cities and IoT, feel free to hit us up and drop in at the above site at your earliest convenience.

Growth of cellular mobile subscriptions continues unabated

How many will support 5G?

First sight of a 5G Smartphone

Samsung had a 5G smartphone prototype on display at CES and almost nobody noticed it!!


  • Samsung had a 5G smartphone prototype on display at CES 2019.
  • It's the closest official look we have to future smartphones from Samsung that come with 5G compatibility.
  • A variant of Samsung's upcoming Galaxy S10 is said to come with 5G support.
Strolling innocently through Samsung's massive booth at CES 2019 in Las Vegas, I stumbled upon Samsung's 5G prototype smartphone.
There it was, a 5G smartphone prototype from the biggest mobile company in the world, with no fanfare or much interest from any other CES attendee around me. I almost started to believe that only I could see it.
To be clear, the prototype I saw was, indeed, just a prototype. It was the smartphone used to show off 5G capabilities when Verizon and Samsung announced in early December, during a Qualcomm event in Hawaii, that a 5G phone would be coming in early 2019. I doubt that Samsung would brazenly display this device if it bore any similarity to the upcoming Galaxy S10, one model of which is rumored to support 5G connectivity.
Still, it's the closest thing we've seen to an official 5G smartphone from Samsung at the moment.
Check out Samsung's 5G prototype smartphone:

There it is, casually hanging out at the far end of Samsung's 5G hardware display. You can barely see it.

Here's a closer look at the prototype of Samsung's vision of a 5G smartphone. If you were expecting 5G phones to look different, you could be set up for disappointment.

It looked like a perfectly operational unit with a working display, volume buttons, a power button, and even a Bixby button. It even had a case.
The phone's screen cycled through a canned demo that showcased some of the device's features and functions. None of the demos appeared to make use of wireless capabilities, however, so it didn't provide much insight into what the super-fast 5G wireless data experience will actually be like.

It's hard to tell from the photo, but the prototype had a notch on the top right corner.
Samsung's upcoming Galaxy S10 is said to have a hole-punch cutout on the top right of the display. The prototype's corner notch isn't the same thing, but it does show that Samsung is at least experimenting with moving selfie cameras and sensors away from the top centre of its phones.

Caged in its glass enclosure, Samsung's prototype is still a mystery and doesn't reveal that much about the company's future phones.
All we can assume about a 5G smartphone from Samsung is that it'll include the 5G-compatible model of Qualcomm's Snapdragon 855 mobile chip. The speeds and performance we can realistically expect from mobile 5G vary greatly, but it's generally expected that mobile 5G networks will be significantly faster than the 4G LTE networks we currently use to stream data on our smartphones.
We'll have to wait for a true 5G smartphone connected to a 5G mobile network to find out just how much better 5G will be than what we have now.

How Did Big Wireless Convince Us Cellular Mobile Phones and Wi-Fi Routers were Safe?

January 15, 2019
A Kaiser Permanente study (published in early December 2017 in Scientific Reports) conducted controlled research testing on hundreds of pregnant women in the San Francisco Bay area and found that those who had been exposed to magnetic field (MF) non-ionizing radiation associated with cell phones and wireless devices had 2.72 times more risk of miscarriage than those with lower MF exposure. Furthermore, the study reported that the association was “much stronger” when MF was measured “on a typical day of participants’ pregnancies.” According to lead investigator De-Kun Li, the possible effects of MF exposure have been highly controversial because, “from a public health point of view, everybody is exposed. If there is any health effect, the potential impact is huge.” [See article by Julian Klein and Casey Lewis, with Kenn Burrows and Peter Phillips, “Accumulating Evidence of Ongoing Wireless Technology Health Hazards,” in Censored 2015: Inspiring We the People.]
A March 2018 investigation for the Nation by Mark Hertsgaard and Mark Dowie showed how the scope of this public health issue has been inadequately reported by the press and under-appreciated by the public. Hertsgaard and Dowie reported that the telecom industry has employed public relations tactics, first pioneered by Big Tobacco in the 1960s and developed by fossil-fuel companies in the 1980s, to influence both the public’s understanding of wireless technologies and regulatory debates. 
The wireless industry has “war-gamed” science by playing offence as well as defence, actively sponsoring studies that result in published findings supportive of the industry while aiming to discredit competing research that raises serious questions about the safety of cellular devices and other wireless technologies. [On “war-gaming,” see, for example, a 1994 Motorola memo, now published online.] When studies have linked wireless radiation to cancer and genetic damage, industry spokespeople have pointed out that the findings are disputed by other researchers. This strategy has proven highly effective, Hertsgaard and Dowie reported, because “the apparent lack of certainty helps to reassure customers, even as it fends off government regulations and lawsuits that might pinch their profits.” As Hertsgaard and Dowie concluded, 
Lack of definitive proof that this technology is harmful does not mean the technology is safe, yet the wireless industry has succeeded in selling this logical fallacy to the entire world . . . The upshot is that, over the past 30 years, billions of people around the world have been subjected to a massive public-health experiment: Use a cell phone today, find out later if it causes cancer and genetic damage. Meanwhile, the wireless industry has obstructed a full and fair understanding of the current science, aided by government agencies that have prioritized commercial interests over human health and news organizations that have failed to inform the general public about what the scientific community really thinks. In other words, this public-health experiment has been conducted without the informed consent of its subjects, even as the industry keeps its thumb on the scale.
The stakes of this public-health experiment continue to rise with the increasing prevalence of Wi-Fi and Bluetooth technologies as well as the development of the “Internet of Things” and anticipated 5G wireless networks.
Multiple studies, including one published in the American Journal of Epidemiology in October 2017, have correlated long-term exposure to cell phone radiation with the risk for glioma (a type of brain tumour), meningioma, DNA damage, and other health risks. In May 2017, the California Department of Public Health released safety guidelines in response to possible health impacts from cell phone radiation. Yet this information was withheld from the public for seven years and only released after litigation. The American Academy of Pediatrics has clear recommendations to reduce children’s exposure to cell phone radiation—yet pregnant women continue to use wireless devices worn on their abdomens and children are given cell phones as toys.
The wireless industry claims to be in full compliance with health and safety regulations and opposes mandatory disclaimers about keeping phones at a safe distance. Yet the industry also opposes updating cell phone radiation testing methods in ways that would accurately represent real-life use.
As the Environmental Health Trust and Marc Arazi have reported, recent scientific research and court rulings from France underscore these concerns about wireless technology radiation. Under court order, the National Frequency Agency of France (ANFR) recently disclosed that nine out of ten cell phones exceed government radiation safety limits when tested in the way they are actually used, next to the human body. As the Environmental Health Trust reported, French activists coined the term “PhoneGate” because of parallels to the 2015 Volkswagen emissions scandal (referred to informally as “Dieselgate”) in which Volkswagen cars “passed” diesel emission tests in the lab, but actually had higher emissions when driven on real roads. In the same way, cell phones “passed” laboratory radiation tests when the “specific absorption rate” (SAR), which indicates how much radiation the body absorbs, was measured at a distance of 15 mm (slightly more than half an inch). However, the way people actually carry and use cell phones (for example, tucked into a jeans pocket or bra, or held in contact with the ear) results in higher levels of absorbed radiation than those found in lab tests.
As reported by Environmental Health Trust (EHT), French law ensures that SAR levels are identified prominently on cell phone packaging and that cell phone sales are banned for young children. In 2016, new French policies stated, “ALL wireless devices, including tablets, cordless phones, remote controlled toys, wireless toys, baby monitors and surveillance bracelets, should be subjected to the same regulatory obligations as cell phones.” EHT also reported that, according to Le Monde, France would attempt to ban the use of cell phones in schools, colleges, and playgrounds as of 2017.
Although local media might announce the findings of a few selected studies, as the San Francisco Chronicle did when the Kaiser Permanente study was published, the norm for corporate media is to report the telecom industry line—that is, that evidence linking Wi-Fi and cell phone radiation to health issues, including cancer and other medical problems, is either inconclusive or disputed. Such was the case, for example, when the Wall Street Journal published an article in February 2018 titled “Why the Largest Study Ever on Cellphones and Cancer Won’t Settle the Debate.” Similarly, in May 2016 the Washington Post published an article titled “Do Cellphones Cause Cancer? Don’t Believe the Hype.” As Hertsgaard and Dowie’s Nation report suggested, corporate coverage of this sort is partly how the telecom industry remains successful in avoiding the consequences of their actions.
Mark Hertsgaard and Mark Dowie, “How Big Wireless Made Us Think That Cell Phones are Safe: A Special Investigation,” The Nation, March 29, 2018,
“Phonegate: French Government Data Indicates Cell Phones Expose Consumers to Radiation Levels Higher Than Manufacturers Claim,” Environmental Health Trust, June 2, 2017, updated June 2018,
Marc Arazi, “Cell Phone Radiation Scandal: French Government Data Indicates Cell Phones Expose Consumers to Radiation Levels Higher Than Manufacturers Claim,” Dr. Marc Arazi blog, June 3, 2017,
Marc Arazi, “Phonegate: New Legal Proceedings against ANFR and Initial Reaction to the Communiqué of Nicolas Hulot,” Dr. Marc Arazi blog, December 2, 2017,
Student Researchers: John Michael Dulalas, Bethany Surface, and Kamila Janik
 (San Francisco State University) and Shannon Cowley (University of Vermont)
Faculty Evaluators: Kenn Burrows (San Francisco State University) and Rob Williams (University of Vermont)

Monday, 14 January 2019

LiFi in much greater detail

LiFi is a wireless communication technology that uses the infrared and visible light spectrum for high-speed data communication. LiFi, first coined in [1], extends the concept of visible light communication (VLC) to achieve high-speed, secure, bi-directional and fully networked wireless communications [2]. It is important to note that LiFi supports user mobility and multiuser access. The size of the infrared and visible light spectrum together is approximately 2600 times the size of the entire radio frequency spectrum of 300 GHz (see Fig. 1). It is shown in [3] that the compound annual growth rate (CAGR) of wireless traffic has been 60% during the last 10 years. If this growth is sustained for the next 20 years, which is a reasonable assumption given the advent of the Internet of Things (IoT) and machine-type communication (MTC), this would mean a demand for 12,000 times the current bandwidth, assuming the same spectral efficiency. As an example, the industrial, scientific and medical (ISM) RF band in the 5.4 GHz region is about 500 MHz, and this is primarily used by wireless fidelity (WiFi). This bandwidth is already becoming saturated, which is one reason for the introduction of the Wireless Gigabit Alliance (WiGig). WiGig uses the unlicensed spectrum between 57 GHz and 66 GHz, i.e., a maximum bandwidth of 9 GHz. Twenty years from now, the bandwidth demand for future wireless systems would, however, be 12,000 × 500 MHz, which results in a demand for 6 THz of bandwidth. The entire RF spectrum is only 0.3 THz. This means a 20-times shortfall compared to the entire RF spectrum, and a 667-times shortfall compared to the bandwidth currently allocated to WiGig. In comparison, the 6 THz of bandwidth is only 0.8% of the entire IR and visible light spectrum. One could argue that a more aggressive spatial reuse of frequency resources could be adopted to overcome this looming spectrum crunch.
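The arithmetic behind those headline numbers is easy to reproduce. The following back-of-the-envelope check assumes exactly the figures quoted above (60% CAGR, 500 MHz of WiFi spectrum, a 300 GHz RF ceiling, 9 GHz for WiGig, and an optical spectrum 2600 times the RF spectrum):

```python
# Back-of-the-envelope check of the spectrum-demand numbers quoted above,
# assuming 60% compound annual growth sustained for 20 years.
growth = 1.60 ** 20
print(round(growth))                 # ~12,089 -> the "12,000x" figure

current_band_hz = 500e6              # ~500 MHz of ISM spectrum used by WiFi
demand_hz = 12_000 * current_band_hz
print(demand_hz / 1e12)              # 6.0 THz of demanded bandwidth

rf_spectrum_hz = 0.3e12              # entire RF spectrum up to 300 GHz
print(demand_hz / rf_spectrum_hz)    # 20x shortfall vs the whole RF spectrum
print(demand_hz / 9e9)               # ~667x shortfall vs WiGig's 9 GHz

optical_hz = 2600 * rf_spectrum_hz   # IR + visible light ~2600x the RF spectrum
print(100 * demand_hz / optical_hz)  # ~0.77%, i.e. the "only 0.8%" claim
```

Every figure in the paragraph checks out from the two stated inputs, which is a useful sanity test before quoting them onward.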
This approach has been used very successfully in the past and has led to the ‘small cell concept’. In fact, it has been the major contributor to the improvements in data rates, as illustrated in Fig. 2. The cell sizes in cellular communication have shrunk dramatically. The cell radius in early 2G systems was 35 km, in 3G systems 5 km, in 4G systems 100 m, and in 5G it will probably be about 25 m in order to reuse the available RF spectrum more efficiently and to achieve higher data densities. However, further reductions in cell size are more difficult to achieve due to the high infrastructure cost of the backhaul and front-haul data links which connect these distributed access points to the core network. Moreover, with a smaller cell size the likelihood of line-of-sight between an interfering base station and a user terminal increases. The resulting interference can significantly diminish data rates and may cause a major problem in cellular networks [4]. For this reason, WiFi access points have been mounted under the seats in stadia to use the human body as an attenuator for the RF signals and to avoid line-of-sight interference links. Clearly, this is not a viable solution for office and home deployments. For these reasons, it is conceivable that future mobile data traffic growth will stem from more spectrum rather than spatial reuse. In particular, the optical resources are very attractive, as they are plentiful (as shown in Fig. 1) and license-free.

Fig. 1

Fig. 2
These resources can be used for data communication, as has been successfully demonstrated for decades in fibre-optic communication using light amplification by stimulated emission of radiation (lasers). With the widespread adoption of high-brightness light-emitting diodes (LEDs), an opportunity has arisen to use the visible light spectrum for pervasive wireless networking.
Traditionally, a VLC system has been conceived as a single point-to-point wireless communication link between an LED light source and a receiver equipped with a photodetection device such as a photodetector (PD). The achievable data rate depends on the digital modulation technology used as well as the lighting technology. The available lighting technologies are summarised in Fig. 3.

Fig. 3
Most commercial LEDs are composed of a blue high-brightness LED with a phosphor coating that converts blue light into yellow. When blue light and yellow light are combined, the result is white light. This is the most cost-efficient way to produce white light today, but the phosphor colour-converting material slows down the frequency response, i.e., higher frequencies are heavily attenuated. Consequently, the bandwidth of this type of LED is merely in the region of 2 MHz. With a blue filter at the receiver to remove the slow yellow components, however, it is possible to achieve data rates in the region of 1 Gbps with these devices. More advanced red, green and blue (RGB) LEDs enable data rates up to 5 Gbps, as white light is produced by mixing the base colours instead of using a colour-converting chemical. Record transmission speeds of 8 Gbps with a single micro-LED have been demonstrated [5], and it was shown that 100 Gbps is feasible with laser-based lighting [6].
The key advantages of a LiFi wireless networking layer are: (i) three orders of magnitude enhanced data densities [7]; (ii) unique properties to enhance physical layer security [8]; (iii) use in intrinsically safe environments such as petrochemical plants and oil platforms where RF is often banned; and (iv) with the advent of power-over-ethernet (PoE) and its use in lighting, the opportunity to piggy-back on existing data network infrastructures for the required backhaul connections between the light sources, with their integrated LiFi modems, and the Internet.

LiFi networking

Fig. 4 illustrates the concept of a LiFi attocell network. The room is lit by several light fixtures, which provide illumination. Each light is driven by a LiFi modem or a LiFi chip and, therefore, also serves as an optical base station or access point (AP). The optical base stations are connected to the core network by high-speed backhaul connections. The light fixtures also have an integrated infrared detector to receive signals from the terminals. The illuminating lights are modulated at high rates. The resulting high-frequency flicker, at rates much higher than the refresh rate of a computer monitor, is not visible to the occupants of the room. Power and data can be provided to each light fixture using a number of different techniques, including PoE and power-line communication (PLC) [9][10]. An optical uplink is implemented by using a transmitter on the user equipment (UE), often using an IR source (so it is invisible to the user). Each of these light fixtures, which at the same time act as wireless LiFi APs, creates an extremely small cell, an optical attocell [11]. Because light is spatially confined, it is possible in LiFi to take the ‘small cell concept’ to a new level by creating ultra-small cells with radii less than 5 m while exploiting the huge additional unlicensed spectrum in the optical domain. The balance of light fixtures that contain APs and those that provide only illumination is determined by the requirements of the network, but potentially all light fixtures can contain APs. Compared to a single-AP wireless hot-spot system, such cellular systems can cover a much larger area and allow multiple UEs to be connected simultaneously [12]. In cellular networks, dense spatial reuse of the wireless transmission resources is used to achieve very high data density, measured in bits per second per square metre (bps/m²). Consequently, links using the same channel in adjacent cells interfere with each other, which is known as co-channel interference (CCI) [13]. Fig. 5 illustrates CCI in an optical attocell network.

Fig. 4

Fig. 5
The move from point-to-point links to full wireless networks based on light poses several challenges. Within each cell there can be several users, and therefore multiple access schemes are required. The provision of an uplink can also require a different approach from the downlink. This is because low energy consumption is required in the portable device, and an uplink visible light source on the device is likely to be distracting to the user. Therefore, the use of the infrared spectrum seems most appropriate for the uplink. In addition, modulation techniques for a high-speed uplink have to be spectrum efficient and power efficient at the same time. Two recently developed modulation techniques that achieve this are enhanced unipolar OFDM (EU-OFDM) [14] and spectrally and energy efficient OFDM (SEE-OFDM) [15]. Advanced CCI mitigation techniques [16] often require that these multiple LiFi APs are operated by means of a centralized control mechanism such as ‘resource schedulers’ within the controller of a software-defined network (SDN) [17]. The main tasks of the ‘resource scheduler’ are to adaptively allocate signal power, frequency, time and wavelength resources. Typically, there are trade-offs between signalling overhead, computational complexity, user data rates, aggregate data rates and user fairness, and the optimum selection of CCI mitigation and resource scheduling techniques depends on the actual use cases and system constraints [18][19]. Other functions of the central controller include managing multi-user access and the process of handover from cell to cell when terminals move. Handover plays an important role in LiFi networks. For example, the handover controller has to ensure that connectivity is maintained when users leave a room, or the premises. There might therefore be situations where there is no LiFi coverage. In these scenarios, to avoid loss of connectivity, we utilize the fact that LiFi is complementary to RF networks.
To this end, there have been studies on hybrid LiFi/RF networks, and the three key findings are: (i) LiFi networks significantly improve service quality for mobile users; (ii) service delivery can be uninterrupted; and (iii) WiFi networks benefit significantly from LiFi networks. The latter is because well-designed load balancing ensures that WiFi networks suffer less from the inefficient traffic overheads caused by constant re-transmissions, which happen when two or more terminals are in contention [20].
LiFi attocell networks have many advantages over incumbent technologies. Firstly, unlike omnidirectional RF antennas radiating signals in all directions, a LED light source typically radiates optical power directionally because of the way it is constructed. Therefore, the radiation of the visible light signals is naturally confined within a limited region. In contrast, RF mm-wave systems require complicated and expensive antenna beamforming techniques to achieve the same objective. Secondly, LiFi attocell networks can be implemented by modifying existing lighting systems. Any LiFi attocell network can provide extra wireless capacity without interference to RF networks that may already exist. LiFi attocell networks, therefore, have the potential to augment 5G cellular systems in a cost-effective manner [21].
A unique feature of LiFi is that it combines illumination and data communication by using the same device to transmit data and to provide lighting. Fig. 6(a) depicts a simple room scenario with two lights. Fig. 6(b) shows the resulting illuminance at a desk height of 0.75 m. In this particular example, the lights are placed such that within the plane at desk height, 90% of the area achieves an illuminance of 400 lx based on a given illumination requirement. Fig. 6(c) depicts the resulting signal-to-interference-plus-noise ratio (SINR). The region where the light cones overlap is subject to strong CCI, and the SINR drops significantly. It is interesting to note that the SINR can vary by about 30 dB within a few centimetres. This example also highlights that the peak SINR can be in the region of 50 dB, which is two to three orders of magnitude higher than the peak SINR in RF-based wireless systems. The achievable data rate strongly depends on the location of the receiver and also on the field of view (FoV) of the receiver [22].

Fig. 6
Interference mitigation techniques are required to ensure that within the region of strong CCI, a mobile station can also achieve high SINR, and this is a non-trivial problem which involves signal processing such as successive interference cancellation [23].

LiFi misconceptions

There are many misconceptions in relation to LiFi:
LiFi is a LoS technology: This is perhaps the greatest misconception. With an orthogonal frequency division multiplexing (OFDM)-type intensity modulation (IM)/direct detection (DD) scheme [24], the data rate scales with the achieved signal-to-noise ratio (SNR). In a typical office environment, where the minimum level of illumination for reading purposes is 500 lx, the SNR at table height is between 40 dB and 60 dB [25]. This means higher-order digital modulation schemes can be used in conjunction with OFDM to harness the available channel capacity. By using adaptive modulation and coding (AMC), it is possible to transmit data at SNRs as low as −6 dB. Fig. 7 illustrates a video transmission over a distance of about 3 m to a laptop at the front of the room, while the LED light fixture points at a white wall in the opposite direction to the receiver. No direct LoS component therefore reaches the receiver, yet the video is successfully received. Obviously, if the wall were dark, more light would be absorbed, which would compromise the SNR at the receiver; if the SNR dropped below the −6 dB threshold, an error-free communication link would no longer be possible. In low-light conditions, however, single-photon avalanche diodes may be used at the receiver, enhancing receiver sensitivity by at least an order of magnitude [26].

Fig. 7. Non-LoS video transmission over about 3 m via reflections off a white wall.
LiFi does not work in sunlight conditions: Sunlight constitutes a constant interfering signal that lies outside the bandwidth used for data modulation. LiFi typically operates at frequencies greater than 1 MHz, so constant sunlight can be removed using electrical filters. An additional effect of sunlight is enhanced shot noise, which cannot easily be eliminated by optical filters. A study [27] investigated the impact of this shot noise quantitatively and found that the data rate is reduced by only 1.5% and 4.5% for a 0.19 mm² detector and a 2 mm² detector, respectively. Receiver saturation can be avoided by using automatic gain control algorithms in combination with optical filters. In fact, we argue that sunlight is hugely beneficial, as it enables solar-cell-based LiFi receivers in which the solar cell acts as the data receiver while simultaneously harvesting energy from sunlight [28].
Lights cannot be dimmed: Advanced modulation techniques such as eU-OFDM [14] enable LiFi to operate close to the turn-on voltage (ToV) of the LED, which means that the lights can run at very low light output levels while maintaining high data rates.
The lights flicker: The lowest frequency at which the lights are modulated is in the region of 1 MHz, whereas the refresh rate of a computer screen is about 100 Hz. The modulation rate of a LiFi light bulb is thus 10,000 times higher than the refresh rate of a computer screen, far beyond what the human eye can resolve. There is therefore no perceived flicker.
LiFi is for downlink only: A key advantage of LiFi is that it can be combined with LED illumination. This, however, does not mean that both functions must always be used together; they can easily be separated (see the comment on dimming above). As a result, LiFi can also be used very effectively for uplink communication, where lighting is not required, and the infrared spectrum lends itself perfectly to this purpose. We have conducted an experiment in which we sent data at 1.1 Gbps over a distance of 10 m with an LED of only 4.5 mW optical output power.
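The claim that the data rate scales with the achieved SNR can be made concrete via the Shannon bound per unit bandwidth, log2(1 + SNR): AMC selects the highest practical modulation-and-coding rate that stays below this bound. The sketch below is a deliberate simplification that ignores the specific constraints of IM/DD optical channels:

```python
import math

def capacity_bits_per_hz(snr_db):
    """Shannon bound log2(1 + SNR) in bits/s/Hz for a given SNR in dB;
    AMC picks the highest practical modulation-and-coding rate below it."""
    return math.log2(1.0 + 10.0 ** (snr_db / 10.0))

# Even at -6 dB the bound is positive (about 0.32 bits/s/Hz), which is why
# robustly coded low-order modulation still gets data through; at the
# 40-60 dB SNRs quoted for desk height, high-order modulation is feasible.
rates = {s: capacity_bits_per_hz(s) for s in (-6, 0, 20, 40, 60)}
```

This monotone growth with SNR is the quantitative basis for the statement above that AMC keeps a link alive at −6 dB while high-order schemes exploit the 40–60 dB available under a luminaire.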

Market disruption potential

LiFi is a disruptive technology that is poised to impact many industries, and it is a fundamental 5G technology. It can unlock the IoT, drive Industry 4.0 applications, enable light-as-a-service (LaaS) in the lighting industry, support new intelligent transport systems, enhance road safety as driverless cars become more common, create new cyber-secure wireless networks, enable new ways of monitoring the health of aging societies, offer new solutions to close the digital divide, and provide very high-speed wireless connectivity in future data centers. LiFi will have a catalytic effect on the merger of two major industries: i) the wireless communications industry and ii) the lighting industry. We argue that 25 years from now the LED light bulb will serve thousands of applications and will be an integral part of the emerging smart cities, smart homes and the IoT. LaaS will be a dominant theme in the lighting industry, driving the new business models required when LED lamps last 20 years or more. LaaS in combination with LiFi will, therefore, provide a business-model-driven ‘pull’ for the lighting industry to enter what has traditionally been a wireless communications market. In the wireless industry, LiFi has the potential to create a paradigm shift by moving from cm-wave to nm-wave communication – see Fig. 8. It is, therefore, conceivable that the wireless industry and the lighting industry will merge into one. An important prerequisite for the large-scale adoption of LiFi technology is the availability of standards; in this context, efforts have started in IEEE 802.15.7, IEEE 802.11 as well as ITU-R to standardize LiFi technology.

Fig. 8. The paradigm shift from cm-wave to nm-wave communication.


Conclusions

In this paper, we have shown that there has been a clear trend in wireless communications towards ever higher frequencies. This is a consequence of the limited availability of RF spectrum in the lower frequency bands and of the exponential growth in wireless data traffic witnessed at the same time during the last decade. This growth will continue. It is, therefore, inevitable that spectrum beyond the RF bands must be used for future wireless communication systems. We, therefore, forecast a paradigm shift in wireless communications from mm-wave to nm-wave communication, which consequently involves light – i.e., LiFi. There has been significant research into physical layer technologies for LiFi during the past 15 years, and data rates have increased from a few Mbps around 2002 to 8 Gbps from a single LED in 2016. In the last five years there has been growing research into LiFi networking techniques such as multiuser access, interference mitigation and mobility support, and in parallel LiFi products have entered the market, enabling wireless networking with light. LiFi has therefore become a reality, and this technology is here to stay.


Acknowledgement

Prof Haas acknowledges support from the EPSRC under established career fellowship grant EP/K008757/1.