The Long Term Evolution (LTE) telecommunication technology was introduced to provide the capabilities and functionality needed to support innovative mobile services. LTE represents a revolution in telecom technology: it provides faster communication and higher data rates, with improved coverage and spectrum efficiency as well as an optimized radio access network.
The LTE technology introduces architectural changes in which the EPC core network becomes more centralized, taking on more responsibilities, while the radio access network becomes more distributed. Furthermore, the LTE architecture simplifies the radio access network side to a single node, the eNodeB, and moves the more complex functionality and intelligence into the EPC network. The EPC systems are therefore expected to take on new and evolved responsibilities in LTE compared to the 3G and 2G technologies (3GPP TS 36.300 (2015)). Hence, the EPC network has more challenges to resolve and more intelligence to provide in order to meet the expectations of the evolved LTE technology and Next Generation Network services.
Real-time and conversational LTE services require guaranteed resources to be strictly allocated for the whole lifetime of the service call. The resource allocation approach of the LTE mobile core network (EPC) is inadequate with regard to the guaranteed resources used by those services. More precisely, the EPC mobile gateway system is not capable of properly utilizing the unused portion of the guaranteed bandwidth when the mobile service is not fully using its reservation. In this thesis, we focus on optimizing guaranteed resource utilization for LTE mobile services and present an adaptive approach that enhances resource reservation for LTE guaranteed services. Our approach provides techniques to: analyze the ongoing guaranteed mobile traffic usage, build time-series models that mathematically represent the collected data, forecast the guaranteed resource consumption of the mobile service, identify the wasted/unused resources, and make those resources available to other services. Our approach introduces a novel type of resource allocation with respect to the 3GPP standards. Our experiments are conducted on datasets captured in an emulated LTE environment; their goal is to show that our approach is feasible and beneficial in enhancing resource allocation for LTE mobile services and increasing the overall throughput of LTE/EPC networks.
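The idea of forecasting a guaranteed bearer's usage and reclaiming the unused headroom can be illustrated with a minimal sketch. This is not the thesis implementation: the moving-average forecaster stands in for the time-series models mentioned above, and the reservation value, sample data, and safety margin are hypothetical.

```python
# Illustrative sketch (not the thesis implementation): estimate the unused
# portion of a guaranteed (GBR) bearer's reservation from recent throughput
# samples, so it could temporarily serve other traffic.

def forecast_next_usage(samples, window=3):
    """Forecast next-interval usage (kbps) as a simple moving average
    over the most recent `window` samples."""
    recent = samples[-window:]
    return sum(recent) / len(recent)

def reusable_bandwidth(reserved_kbps, samples, safety_margin=0.1):
    """Bandwidth (kbps) that could be lent to other services.

    A safety margin is kept above the forecast so the guaranteed
    service is not starved if its usage rises unexpectedly.
    """
    forecast = forecast_next_usage(samples)
    headroom = reserved_kbps - forecast * (1 + safety_margin)
    return max(0.0, headroom)

# Hypothetical example: a 256 kbps GBR reservation whose recent usage
# hovers near 100 kbps leaves substantial reusable headroom.
usage_kbps = [95.0, 102.0, 98.0, 101.0]
print(reusable_bandwidth(256.0, usage_kbps))
```

A real system would replace the moving average with the fitted time-series models and re-evaluate the headroom every measurement interval.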
The 4G/LTE stage of the telecommunication evolution was introduced to provide higher data rates and lower delay, with improved coverage and spectrum efficiency. The LTE systems provide capable signaling as well as optimized radio transmission in the radio access network.
Compared with earlier wireless technologies such as 2G, 3G, and WiMAX, the 4G/LTE radio access system represents an evolution of telecommunications: it offers higher data rates, greater capacity, lower delay, and improved coverage and spectrum efficiency (Shin et al. (2008b); Verizon (2010)). The radio access network of the LTE system has only one node, the Evolved Node B (eNodeB), whereas the 3G Universal Mobile Telecommunications System (UMTS) has two nodes, the NodeB and the Radio Network Controller (RNC), and the 2G technology likewise has two, the Radio Base Station (RBS) and the Base Station Controller (BSC). This architectural change reduces transmission delay in the radio access network and moves more intelligence towards the core network. In addition, 3GPP has proposed key system features such as the default PDN session for the User Equipment (UE), as well as a new policy control and charging architecture that gives operators more control over mobile broadband services.
The Evolved Packet System (EPS) defines a single, all-IP core network for multiple heterogeneous accesses that provides triple-play services in next generation networks. In the 2G and 3G technologies, there were two separate core networks: the circuit-switched core network, which delivers telephony services over circuit switching, and the packet-switched core network, which delivers wireless mobile broadband data services over packet switching using an all-IP network. In the LTE technology, the Evolved Packet Core (EPC) serves as a single mobile core network that runs all services required by the wireless user equipment (3GPP TS 36.300 (2015)).
A Packet Data Network (PDN) session is established in the LTE/EPC network to connect the UE to the internet or to any other network. As part of the PDN session, the UE can have a default bearer and one or more dedicated bearers. The default bearer is used for session connectivity and best-effort traffic; it is established once the UE attaches to the network, at which point an IP address is assigned to the UE. It is the responsibility of the EPC gateway system to assign the IP address and maintain the UE's PDN session(s) and their bearers. Dedicated bearers, on the other hand, can be activated to run specific services with special QoS requirements. Based on whether its resource reservation is guaranteed, a dedicated bearer is classified as either a guaranteed or a non-guaranteed bearer (Ekstrom (2009)).
The dedicated guaranteed bit rate (GBR) bearer is used to run services that require bandwidth to be reserved for the whole lifetime of the service/call. Typically, a GBR bearer is established once the UE requests a service that is provisioned to trigger guaranteed bearer creation; conversational voice, conversational video, and real-time gaming are examples of services that would use a GBR bearer. The dedicated non-guaranteed bearer (non-GBR), on the other hand, is used for services that require special priorities but do not need bandwidth reserved for the whole lifetime of the service. A non-GBR bearer can remain established for a long time, as it does not require bandwidth to be strictly reserved.
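The bearer types described above can be sketched as a small data model. The GBR/non-GBR split follows the standardized QoS Class Identifiers (QCIs 1-4 are GBR, QCIs 5-9 non-GBR, per 3GPP TS 23.203); the field and class names here are illustrative, not taken from any real EPC implementation.

```python
from dataclasses import dataclass
from typing import Optional

# QCIs 1-4 are the standardized GBR classes (3GPP TS 23.203);
# e.g. QCI 1 = conversational voice, QCI 9 = default best-effort.
GBR_QCIS = {1, 2, 3, 4}

@dataclass
class EpsBearer:
    """Hypothetical model of an EPS bearer within a PDN session."""
    qci: int
    gbr_kbps: Optional[float] = None  # reserved rate; GBR bearers only

    @property
    def is_guaranteed(self) -> bool:
        return self.qci in GBR_QCIS

# A default bearer carries best-effort traffic with no reservation,
# while a dedicated GBR bearer reserves bandwidth for the call's lifetime.
default_bearer = EpsBearer(qci=9)
voice_bearer = EpsBearer(qci=1, gbr_kbps=64.0)
print(default_bearer.is_guaranteed, voice_bearer.is_guaranteed)
```

The thesis's problem statement concerns the `gbr_kbps`-style reservation: it stays allocated even when the service momentarily uses less than it.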
CHAPTER 1: INTRODUCTION