
Study On Transmission Time Interval Optimization Based On LTE-A Pro

Posted on: 2018-12-21
Degree: Master
Type: Thesis
Country: China
Candidate: X T Sun
Full Text: PDF
GTID: 2348330512979439
Subject: Communication and Information System

Abstract/Summary:
Higher data rates and lower latency have always been targets of wireless communication systems. They are also significant advantages of the fourth-generation communication system (4G) over previous generations of mobile wireless communication. Compared with 4G, the fifth-generation mobile communication system (5G) explicitly aims to further increase peak user data rates and reduce air-interface latency. As the transitional stage between 4G and 5G, Long Term Evolution-Advanced Pro (LTE-A Pro) aims to further reduce user latency and increase system throughput on the basis of 4G in preparation for 5G. Reducing user latency improves the efficiency of wireless resource utilization, speeds up the transmission of control signaling, and shortens call setup and bearer setup times. Since the transmission time interval (TTI) is an important component of user transmission latency, this paper proposes a method of reducing latency by optimizing the TTI. Although the Third Generation Partnership Project (3GPP) has put forward a variety of pre-scheduling strategies to reduce latency, they cause efficiency problems of their own. The method proposed in this paper does not introduce such problems, which motivates this study.

To reduce system transmission latency and improve system throughput by optimizing the TTI, two phases of work have been carried out in this paper. In the first phase, since the TTI corresponds to the length of one subframe in the Long Term Evolution (LTE) system, the TTI is optimized by optimizing the subframe structure. To ensure backward compatibility, the physical-layer channels and signals need to be modified. After that, the LTE downlink subframe length can be reduced from 14 orthogonal frequency division multiplexing (OFDM) symbols to 7 or 2 OFDM symbols. Under the premise of static control overhead, different maximum numbers of scheduled users and different packet sizes are set for simulation and analysis. It can be concluded that
when channel conditions are good, the shorter the subframe length, the lower the system latency. The second phase of work is carried out under adaptive control overhead and is divided into two parts. In the first part, simulations are performed with different packet sizes and slow-start thresholds. The simulations show that when channel conditions are good, the shorter the subframe length (that is, the shorter the TTI), the better the system performance. However, due to hardware constraints, protocol specifications, and other factors, a shorter subframe length is harder to implement. In the second part, to balance system performance against implementation difficulty, that is, to help subframes of 7 OFDM symbols perform best, this paper puts forward a method of optimizing the uplink access latency and the hybrid automatic repeat request (HARQ) acknowledgement/negative acknowledgement (ACK/NACK) round-trip time. The simulation results indicate that with this method, subframes of 7 OFDM symbols reduce system latency by 45.6% for cell-edge users and 40.6% for cell-center users under low system load, outperforming subframes of 2 OFDM symbols without the method.
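The scaling the abstract relies on can be sketched numerically. The sketch below assumes standard LTE normal-cyclic-prefix numerology (a 1 ms subframe carries 14 OFDM symbols) and the nominal 8-TTI FDD HARQ round trip; these constants are generic LTE values, not results from the thesis.

```python
# How shortening the subframe shortens the TTI and the HARQ ACK/NACK
# round-trip time. Assumes LTE normal cyclic prefix (14 OFDM symbols per
# 1 ms subframe) and a nominal FDD HARQ round trip of 8 TTIs.

SYMBOLS_PER_FULL_SUBFRAME = 14   # LTE normal CP: 14 OFDM symbols per 1 ms
FULL_SUBFRAME_MS = 1.0
HARQ_RTT_TTIS = 8                # nominal FDD LTE HARQ round trip in TTIs

def tti_ms(num_symbols: int) -> float:
    """TTI length for a shortened subframe of num_symbols OFDM symbols."""
    return FULL_SUBFRAME_MS * num_symbols / SYMBOLS_PER_FULL_SUBFRAME

def harq_rtt_ms(num_symbols: int) -> float:
    """HARQ ACK/NACK round-trip time, assuming it remains 8 TTIs."""
    return HARQ_RTT_TTIS * tti_ms(num_symbols)

for n in (14, 7, 2):
    print(f"{n:2d}-symbol subframe: TTI = {tti_ms(n):.3f} ms, "
          f"HARQ RTT = {harq_rtt_ms(n):.2f} ms")
```

Under these assumptions a 7-symbol subframe halves both the TTI (1.0 ms to 0.5 ms) and the HARQ round trip (8 ms to 4 ms), which is the latency lever the second phase of work exploits.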
Keywords/Search Tags: Transmission time interval, Subframe length, OFDM symbols, Latency