Thursday, August 12, 2021

Will Millimeter Wave Frequencies Above 100 GHz be a Big Small Cell Problem?

Lots of people who know a bit about radio frequency networks will justifiably have some concern about how well commercial mobile systems will function at frequencies between 30 GHz and 140 GHz, for example.


All radio frequency signals suffer high attenuation in the first meter or so after they are transmitted. That first-meter loss grows with frequency; beyond it, the additional falloff with distance is essentially frequency independent.

[Figure omitted; source: Semfio Networks]


[Figure omitted; source: Hyperphysics]


In other words, radio frequency signals get weaker according to an inverse square law: received power is inversely proportional to the square of the distance from the source (1/d²).
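The inverse square law shows up in a link budget as roughly 6 dB of extra path loss every time the distance doubles, at any frequency. A minimal sketch using the standard Friis free-space path loss formula (the frequencies chosen are illustrative):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss (Friis formula), in dB."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Doubling the distance always adds 20*log10(2) ~= 6 dB,
# regardless of frequency -- that is the inverse square law:
for f in (600e6, 3.5e9, 28e9, 140e9):
    delta = fspl_db(200, f) - fspl_db(100, f)
    print(f"{f/1e9:6.1f} GHz: +{delta:.2f} dB per distance doubling")
```

Each line prints the same +6.02 dB, because the distance term of the formula does not involve frequency.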



Attenuation also increases with rain or snow, and that effect is frequency dependent.


[Figure omitted; source: Researchgate]


So as we start to commercially deploy radio systems between 10 GHz and 400 GHz, we will see higher free-space attenuation than at sub-1 GHz or mid-band frequencies (mid-band being roughly 2 GHz to 6 GHz).


[Figure omitted; source: Electronic Design]


That higher free-space attenuation at higher frequencies is why small cells are important for millimeter wave signals (the 24 GHz, 28 GHz, 39 GHz, and 60 GHz bands in the U.S. market).


But some are optimistic that as we begin to deploy mobile generations beyond 5G, and open up millimeter wave and terahertz frequencies for use, we will find affordable ways to deploy commercially useful radio infrastructure.


One big concern is signal propagation. It might ultimately be less of a problem than we imagine.


“Between sub-6 GHz and 140 GHz frequencies, the propagation path loss for an urban radio channel doesn't really differ at different frequencies, after accounting for the radiated signal's first meter of travel,”  says Theodore S. Rappaport, Professor of Electrical Engineering at the NYU Tandon School of Engineering.


This means that once a radio signal reaches what's called the "far field" (beyond the first meter or so), frequency has surprisingly little impact on its attenuation as it travels through urban and indoor channels.
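Rappaport's point can be seen directly in the free-space path loss formula: the frequency-dependent term is entirely captured in the loss over the first meter, and everything beyond that is pure inverse-square falloff. A sketch (frequencies are illustrative):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(d_m: float, f_hz: float) -> float:
    """Free-space path loss (Friis formula), in dB."""
    return 20 * math.log10(4 * math.pi * d_m * f_hz / C)

for f in (3.5e9, 28e9, 140e9):
    first_meter = fspl_db(1.0, f)              # frequency-dependent part
    beyond = fspl_db(100.0, f) - first_meter   # distance-only part
    print(f"{f/1e9:6.1f} GHz: first meter costs {first_meter:5.1f} dB, "
          f"the next 99 m add {beyond:.1f} dB")
```

Every frequency pays a different price in the first meter, but each one adds exactly the same 40.0 dB (20·log10(100)) going from 1 m to 100 m.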


The caveat is that rain will increase attenuation. And signals at some frequencies are more liable to be absorbed by oxygen; that is a big deal around 60 GHz and near 120 GHz. In those two regions, point-to-point operation is still possible, as are indoor, short-range use cases.
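These extra losses scale with path length, which is why they matter far less over a 200 m small-cell hop than over a multi-kilometer backhaul link. A rough sketch; the per-kilometer figures below are order-of-magnitude assumptions for illustration, not measured values:

```python
# Illustrative extra-attenuation terms that free-space path loss alone
# does not capture. Both constants are assumptions for this sketch.
O2_60GHZ_DB_PER_KM = 15.0   # assumed oxygen-absorption loss near the 60 GHz peak
HEAVY_RAIN_DB_PER_KM = 6.0  # assumed heavy-rain loss at millimeter wave

def extra_loss_db(path_km: float, oxygen_db_km: float = 0.0,
                  rain_db_km: float = 0.0) -> float:
    """Additional loss beyond FSPL over a path of path_km kilometers."""
    return path_km * (oxygen_db_km + rain_db_km)

# A 200 m small-cell link at 60 GHz in heavy rain, under these assumptions:
print(extra_loss_db(0.2, O2_60GHZ_DB_PER_KM, HEAVY_RAIN_DB_PER_KM), "dB")
```

Under these assumed figures the 200 m link loses only about 4 dB extra, a margin a link budget can absorb; a 2 km link at the same frequency would lose ten times that.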


Rappaport believes that future mobile networks using frequencies above 100 GHz will likely work at most of the same locations. “Networks won't likely require further densification, and today's new tower sites for 5G will be usable for decades to come without the need to build many more,” he argues.


Beginning with 5G, wireless systems are using directional antennas that have high antenna gains and narrow beam widths on both the mobile and the base station ends of each link, Rappaport says. This offers more, not less, signal strength to each user as we move to millimeter wave, sub-terahertz, and ultimately terahertz frequencies.
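The physics behind that claim: an antenna of fixed physical size has gain proportional to frequency squared (G = 4πA/λ²), rising 20 dB per decade of frequency, which is exactly the rate at which first-meter free-space loss rises. A sketch, with an assumed 10 cm × 10 cm aperture and illustrative frequencies:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def aperture_gain_db(area_m2: float, freq_hz: float) -> float:
    """Gain of an antenna with a fixed effective aperture: G = 4*pi*A/lambda^2."""
    lam = C / freq_hz
    return 10 * math.log10(4 * math.pi * area_m2 / lam**2)

# For the same physical antenna size, gain climbs 20 dB per decade of
# frequency -- offsetting the 20 dB/decade rise in first-meter path loss:
for f in (3.5e9, 28e9, 140e9):
    print(f"{f/1e9:6.1f} GHz: {aperture_gain_db(0.01, f):5.1f} dBi")
```

This is why highly directional arrays let higher-frequency links deliver comparable (or better) signal strength from the same sites, provided the beams can be steered accurately.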


Good news, if he proves right for networks built at scale.


