5G ?

 

 

 

 

How will we achieve the technology ? What would be the challenges ?

 

Since no technical specification has been completed for 5G yet, I cannot talk about any 5G technology that is officially described. What I want to do in this section is just to think of some possible technical factors and the challenges that come with each of them. Most of my comments on each factor may sound too pessimistic, but it has always been like this before any new technology comes out, and I believe that eventually most of these challenges will be overcome, or totally new concepts that can overcome these obstacles will emerge. This is how all the current technologies have evolved. The purpose of my comments here is to give you something to think about.

 

One thing that is for sure is that the required data throughput for 5G will be much higher (probably tens of Gbps) than for 4G. So I want to talk about the general techniques for increasing the data rate.

 

< Increasing Modulation Depth >

 

Whenever a technology reaches a point where it has to jump up the data rate, one of the first steps has almost always been "to increase the modulation depth". The most common evolution path for the modulation scheme has been as follows :

         BPSK -> QPSK (4 QAM) -> 16 QAM -> 64 QAM

What would be the next step ? Logically it should be 256 QAM. (If it is too much, we may think about 128 QAM)

Would this be possible ? It may be possible in wired communication (possible, but not easy even there). However, it would be very difficult (almost impossible ?) in wireless communication when we think about the various wireless factors (AWGN, fading, available power, dynamic range, PAPR handling, etc). <-- This was the status as of when I first wrote this, about 2 years ago. Now (as of Oct 2015), 256 QAM in current LTE is already implemented in some chipsets and is being tested at the UE level.
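
Just to get a rough feel for what deeper modulation buys, here is a minimal sketch in Python (my own illustration, not anything from a specification) that computes the bits per symbol for each QAM order and the relative gain over 64 QAM :

    import math

    # Bits carried per modulation symbol is log2(M) for M-ary modulation.
    for name, m in [("BPSK", 2), ("QPSK", 4), ("16 QAM", 16),
                    ("64 QAM", 64), ("256 QAM", 256)]:
        bits = math.log2(m)
        gain_vs_64qam = bits / math.log2(64)
        print(f"{name:8s}: {bits:.0f} bits/symbol "
              f"({gain_vs_64qam:.2f}x relative to 64 QAM)")

Note the diminishing return : each doubling of the constellation size adds only one more bit per symbol, while demanding a noticeably cleaner signal (better SNR/EVM), which is exactly why the wireless factors listed above make this step so hard.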

 

 

< Increasing the Number of Antennas and Spatial Multiplexing Factors >

 

When we reach the maximum modulation depth, the next step we would think of to push up the throughput would be to increase the number of antennas and the spatial multiplexing factor.

The antenna configuration for multiplexing that we use most commonly as of now (May 2013) is 2 x 2 MIMO. The maximum configuration that is specified in 3GPP as part of the final goal of LTE Advanced is 8 x 8.

Considering that the 5G data rate will be much higher than the final-stage data rate of LTE Advanced, it is highly probable that even more antennas (more complicated spatial multiplexing) will be adopted. How many antennas can we use ? How difficult would it be ?

We don't know... but the number of antennas that Samsung claimed to use for their next generation technology trial (May 2013) was a 64 antenna array.

Would this really be feasible/practical ? Look at the mobile phone that you have now and imagine that you have to put in 64 antennas, properly spaced to maximize the spatial multiplexing. At the least, antenna and RF designers would not want to even think about it -:)

<-- (As of Oct 2015), it is likely that Massive MIMO would operate as MU-MIMO (Multi-User MIMO). In this case, the eNodeB (not sure if it will still be called an eNodeB in 5G) will have a huge number of antennas (like 64 or even more than 100), but the UE will have a much smaller number of antennas.
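
To see why more antennas push up the throughput, here is a minimal sketch (my own illustration, assuming an ideal N x N i.i.d. Rayleigh channel with equal power per transmit antenna, not any 3GPP-defined configuration) of how the Shannon capacity scales with the antenna count :

    import numpy as np

    def avg_mimo_capacity(n_ant, snr_db, trials=500):
        """Average capacity (bit/s/Hz) of an n_ant x n_ant i.i.d. Rayleigh channel
        with equal power allocation: C = log2 det(I + (SNR/Nt) * H * H^H)."""
        rng = np.random.default_rng(0)
        snr = 10 ** (snr_db / 10)
        total = 0.0
        for _ in range(trials):
            h = (rng.standard_normal((n_ant, n_ant)) +
                 1j * rng.standard_normal((n_ant, n_ant))) / np.sqrt(2)
            m = np.eye(n_ant) + (snr / n_ant) * (h @ h.conj().T)
            total += np.log2(np.linalg.det(m).real)
        return total / trials

    for n in (1, 2, 4, 8):
        print(f"{n} x {n} MIMO at 20 dB SNR : ~{avg_mimo_capacity(n, 20):.1f} bit/s/Hz")

The point is that the spectral efficiency grows roughly in proportion to min(Nt, Nr), which is why 8 x 8 (and beyond) looks so attractive on paper, and why the antenna spacing problem on a handset is so painful in practice.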

 

 

< Increasing the operating bandwidth >

 

If all the other techniques mentioned above are not enough, an alternative we can think of is to increase the channel bandwidth (RF bandwidth).

What is the maximum bandwidth we use for LTE as of now ? (May 2013)

It is 20 MHz assuming a single carrier. If we adopt the LTE Advanced features technically available as of now, it can be a maximum of 40 MHz using Carrier Aggregation with 2 carriers. At the final stage of LTE Advanced, we may be able to use a maximum of 100 MHz using 5 carriers.

If you think about 100 MHz within the currently available spectrum (mostly 800 MHz ~ 3 GHz), first you would have a spectrum license issue; most of that spectrum is already sold out. Another issue would be developing the various RF components to handle 100 MHz of contiguous spectrum: a single 100 MHz band would be too large a fractional bandwidth (operational bandwidth divided by the center frequency). You can think of implementing 100 MHz using 5 separate 20 MHz bands, but in that case you need to implement 5 separate RF chains, which would make the RF/hardware design very complicated.
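
Just to put numbers on the fractional bandwidth argument, here is a minimal sketch (my own illustration; the center frequencies are just examples) :

    def fractional_bandwidth(bw_hz, fc_hz):
        """Fractional bandwidth = operational bandwidth / center frequency."""
        return bw_hz / fc_hz

    BW = 100e6  # 100 MHz operating bandwidth
    for fc in (800e6, 2e9, 3e9, 20e9):  # example center frequencies
        print(f"100 MHz at {fc/1e9:.1f} GHz -> "
              f"{fractional_bandwidth(BW, fc)*100:.1f} % fractional bandwidth")

At 800 MHz the same 100 MHz is a 12.5 % fractional bandwidth, which is very hard on filters and antennas, while at 20 GHz it drops to 0.5 %, which is exactly the motivation for the next section.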

<-- (As of Oct 2015) Then you may ask how wide the bandwidth would be in 5G ? We don't know yet, but from what I hear from chipset vendors, network vendors and test equipment vendors, it is likely to start with under 200 MHz (e.g., 160 MHz) or 400 MHz. The ideal maximum being suggested is around 2 GHz of bandwidth.

 

 

< Using very high frequency spectrum >

 

One of the ways to get around the 'increasing the operating bandwidth' issues described above would be to use very high frequency spectrum, which has not been licensed out much yet. For example, if you go to the 20 GHz spectrum, a 100 MHz BW is only a 0.5 % fractional bandwidth.. and several hundred MHz of BW is less than 5 %.

In the case of the recent Samsung trial (May 2013), it is claimed to use 28 GHz spectrum.

But one of the biggest problems with this kind of high frequency (millimeter wave) is that it has extremely high path loss and is very vulnerable to various environmental/weather factors like buildings, trees, moisture, rain, etc.
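
To put a rough number on the 'extremely high path loss' point, here is a minimal sketch using the free space path loss formula (free space only, so it completely ignores the building/tree/rain losses mentioned above; the 2 GHz and 28 GHz points are just example frequencies) :

    import math

    def fspl_db(distance_m, freq_hz):
        """Free space path loss in dB: 20*log10(4*pi*d*f/c)."""
        c = 3e8  # speed of light in m/s
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

    d = 100  # example distance in meters
    for f in (2e9, 28e9):
        print(f"FSPL at {f/1e9:.0f} GHz over {d} m : {fspl_db(d, f):.1f} dB")

    # The gap depends only on the frequency ratio : 20*log10(28/2) ~= 22.9 dB
    print(f"Extra loss at 28 GHz vs 2 GHz : {20*math.log10(28/2):.1f} dB")

So even before any blocking or rain, going from 2 GHz to 28 GHz costs roughly 23 dB (assuming isotropic antennas on both ends), which is one of the main reasons high gain antenna arrays keep coming up together with mmWave.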

Another important problem with this kind of high frequency is that it would be very difficult to get small sized RF components working properly at these frequencies. (For example, the SAW duplexers and SAW filters, which are some of the most commonly used components in current mobile phones, cannot be used at these frequencies.)

 

One area that could be more practical, at least as of now (July 2013), would be the 5 GHz range, since chipsets and devices supporting 802.11ac are already emerging in the market, even though the performance may not exactly meet the claimed criteria. However, it is pretty certain that higher and wider spectrum will be considered as well in real 5G mobile communication.

<-- (As of Oct 2015) The FCC has started promoting 5 different blocks in the mmWave range. The lowest one is around 29 GHz and the highest is around 59 GHz.

 

 

< Dynamically Configurable Spectrum both in frequency and time domain >

 

Unlike in the current communication system, it is expected that multiple radio technologies, operators and carrier frequencies would change dynamically even during a single user service session. If you think of LTE-A Carrier Aggregation and WiFi offload, it would give you a very primitive idea of this. But this kind of combination and dynamic change will become a default mode of operation in 5G.

Then.. the challenging issue would be how the RF front end of the device handles this kind of situation.

 

< Massive MIMO >

 

If I were asked to list a couple of critical features that should be accomplished in 5G before anything else, I would list the following :

  • Extremely high data rate
  • Handling simultaneous (concurrent) users in PHY/MAC in much greater numbers than in current LTE
  • Handling the path loss which is normally observed in the mmWave (millimeter wave) region

Even though there is no single solution that can completely achieve all of these features, one of the key factors would be Massive MIMO (in the case of the prototype that Samsung came up with in May 2013, it used 64 antennas at 28 GHz). I strongly recommend that you google this topic and study it.
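
As a very rough illustration of why such a large antenna array also helps with the mmWave path loss listed above, here is a minimal sketch (my own simplification: ideal coherent beamforming toward a single user, where the array gain is simply 10*log10(N) dB and all implementation losses are ignored) :

    import math

    def array_gain_db(n_antennas):
        """Ideal coherent beamforming (array) gain of an N-element array, in dB."""
        return 10 * math.log10(n_antennas)

    for n in (8, 64, 128, 256):
        print(f"{n:4d} antennas -> ~{array_gain_db(n):.1f} dB array gain")

With 64 elements, the roughly 18 dB of array gain recovers a large part of the extra free space loss computed earlier for 28 GHz versus 2 GHz, which is one reason Massive MIMO and mmWave are almost always discussed together.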

 

< Very Short TTI and Extremely Low Latency >

 

These properties can be described at several different layers.. but when we talk about TTI it usually refers to a MAC/PHY property, and when we talk about latency it usually refers to the higher layers, including the IP layer.

Regarding TTI, in WCDMA R99 the most common MAC layer TTIs were 10 ms (U-Plane) or 40 ms (C-Plane); in HSPA it became 2 ms or 10 ms; in LTE it became 1 ms. I think 1 ms is already close to a minimum TTI for the current designs, but we would need an even shorter TTI to achieve the throughput/latency criteria required for 5G.

Another possibility would be that a couple of additional TTI lengths are introduced and the system selects one of them dynamically or semi-statically depending on the situation.

<-- (As of Oct 2015) Nothing has been determined yet about the TTI length (subframe length) of 5G. But as far as I hear, a 0.2 ms (200 us) or 0.1 ms (100 us) subframe length is most frequently mentioned.
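
To see how the TTI length flows into user plane latency, here is a minimal sketch (my own toy model: one-way air interface delay approximated as frame alignment + transmission + a fixed processing time + the expected HARQ retransmission delay; the numbers are assumptions for illustration, not anything defined in a specification) :

    def rough_one_way_latency_ms(tti_ms, processing_ms=1.0,
                                 harq_rtt_ttis=8, retx_probability=0.1):
        """Very rough one-way user plane latency estimate (toy model)."""
        alignment = 0.5 * tti_ms          # waiting for the next TTI boundary, on average
        transmission = tti_ms             # one TTI to send the transport block
        harq = retx_probability * harq_rtt_ttis * tti_ms   # expected retransmission delay
        return alignment + transmission + processing_ms + harq

    for tti in (1.0, 0.2, 0.1):
        print(f"TTI = {tti:3.1f} ms -> ~{rough_one_way_latency_ms(tti):.2f} ms one-way")

The TTI-dependent terms shrink in proportion to the TTI, but the fixed processing time quickly becomes the bottleneck, which is why shortening the TTI alone is not enough and the processing budget has to shrink along with it.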

 

< Extremely High Sampling Rate ADC/DAC >

 

Even though nothing specific is formally defined for the 5G requirements, I think it is highly probable that the system bandwidth will be extended to at least a couple of hundred MHz. As I mentioned above, there will be a lot of issues related to RF (or millimeter wave), but there will also be tough challenges at the baseband level. The first question you would have would be "what would be the sampling rate ?". If the system bandwidth at the carrier frequency level is a couple of hundred MHz, the sampling rate at the baseband level should also be in the range of several hundred Msps.

It means that you need an ADC which can sample and convert at a rate of several hundred Msps. You may think this would not be a big issue, since you may have seen various digital oscilloscopes that cover sampling rates of even a few Gsps.

Yes.. it is true. This kind of super high sampling rate ADC is already used in various areas, especially in high end digital oscilloscopes. But that high rate sampling is not done by a single ADC; it is done by multiple ADCs working in parallel and sampling in an interleaved pattern. To make this kind of sampling work properly, very complicated hardware design and control algorithms are required.
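
Here is a minimal sketch of the time interleaving idea (my own toy model: M ideal sub-ADCs, each running at 1/M of the aggregate rate, taking turns sample by sample; a real design also has to fight the gain, offset and clock skew mismatches between the sub-ADCs, which is where the complicated calibration comes in) :

    import numpy as np

    def time_interleaved_sample(signal, num_subadc):
        """Split an aggregate-rate sample stream across num_subadc sub-ADCs
        (sub-ADC k takes samples k, k+M, k+2M, ...) and then re-interleave them."""
        streams = [signal[k::num_subadc] for k in range(num_subadc)]  # each at fs/M
        recombined = np.empty_like(signal)
        for k, s in enumerate(streams):
            recombined[k::num_subadc] = s
        return streams, recombined

    fs = 400e6                           # example aggregate rate : 400 Msps
    t = np.arange(1024) / fs
    x = np.sin(2 * np.pi * 10e6 * t)     # a 10 MHz test tone
    streams, y = time_interleaved_sample(x, num_subadc=4)
    print("each sub-ADC runs at :", fs / 4 / 1e6, "Msps")
    print("perfect reconstruction :", np.allclose(x, y))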

To make this kind of technology usable in a mobile device, which should not be as expensive and bulky as a high end digital oscilloscope, a great deal of improvement in ADC/DAC technology is required as well.

 

Think about how you could make this kind of technology usable in a mobile device.

 

< RACH mechanism to handle very large numbers of subscribers >

 

One of the important goals of 5G is to implement a system that can handle very large numbers of subscribers. These subscribers are not only humans but also various types of machines and sensors. In this case, the chance of many PRACH transmissions reaching the network simultaneously and causing contention would increase a lot. Figuring out a way to handle this situation will be an important topic to be researched (especially in MAC layer design).
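
As a rough illustration of how quickly the contention builds up, here is a minimal sketch (my own simplification: each of N devices independently picks one of 64 contention based preambles in the same RACH occasion, which is the usual textbook collision model rather than any 5G-specific design) :

    def collision_probability(num_devices, num_preambles=64):
        """Probability that a given device's randomly chosen preamble
        is also picked by at least one of the other devices."""
        return 1 - (1 - 1 / num_preambles) ** (num_devices - 1)

    for n in (5, 20, 50, 100, 500):
        print(f"{n:4d} devices in one RACH occasion -> "
              f"{collision_probability(n)*100:5.1f} % collision probability")

Once hundreds of devices (for example, sensors waking up at the same time) attempt access in the same occasion, almost every attempt collides, which is exactly why this is such an important MAC layer research topic.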