In recent years, research on the congestion problem in 4G and 5G networks has grown, especially work based on artificial intelligence (AI). Although 4G with LTE is regarded as a mature technology, continuous improvement of the infrastructure has led to the emergence of 5G networks. The large-scale services provided to industries, Internet of Things (IoT) applications, and smart cities, which involve large volumes of exchanged data, high densities of connected devices per area, and high data rates, have brought their own problems and challenges, chief among them congestion. In this context, AI models can be considered one of the main techniques for solving network congestion problems. Since AI technologies can extract relevant features from data and handle huge amounts of it, integrating communication networks with AI to solve the congestion problem appears promising and merits further exploration. This paper reviews how AI technologies can be used to solve the congestion problem in 4G and 5G networks. We examine previous studies addressing congestion in networks, including congestion prediction, congestion control, congestion avoidance, and TCP development for congestion control. Finally, we discuss the future vision of using AI technologies in 4G and 5G networks to solve congestion problems and identify research issues that need further study.
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Lev
Regression analysis is a foundation stone of statistics, and it mostly depends on the ordinary least squares method. As is well known, however, this method requires several conditions to hold if it is to operate accurately, and its results can otherwise be unreliable; moreover, the absence of certain conditions makes it impossible to complete the analysis with this method. Among those conditions is the multicollinearity problem, which we detect between the independent variables using the Farrar–Glauber test. In addition, given the linearity requirement on the data and the failure of the last condition, we resorted to the
This review investigates the practice and influence of chatbots and ChatGPT as employable tools in writing for scientific academic purposes. A primary collection of 150 articles was gathered from academic databases and then systematically screened and refined to 30 studies that focused on the use of ChatGPT and chatbot technology in academic writing contexts. The reviewed literature covers chatbots and ChatGPT in writing enhancement, support for student learning at higher education institutions, scientific and medical writing, and the evolution of research and academic publishing. The review finds these tools helpful, with their greatest advantages being in areas such as structuring writings, gram
This paper presents a proposed method for content-based image retrieval (CBIR) using the Discrete Cosine Transform combined with the Kekre Wavelet Transform (DCT/KWT), and the Daubechies Wavelet Transform combined with the Kekre Wavelet Transform (D4/KWT), to extract features for a distributed database system in which clients and server form a star topology: the client sends the query image, and the server (which holds the database) performs all the work and then sends the retrieved images to the client. Two comparisons are made: first, DCT against DCT/KWT; second, D4 against D4/KWT. The work is evaluated on an image database of 200 images in 4 categories, and retrieval performance is measured with respect to two similarity measures, namely Euclidean distance (ED) and sum of absolute diff
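As a rough illustration of how the two similarity measures named above rank database images against a query, here is a minimal sketch. The random vectors stand in for the actual DCT/KWT feature vectors, which the abstract does not specify; the function names and dimensions are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def euclidean_distance(q, f):
    """Euclidean distance (ED) between query and database feature vectors."""
    return float(np.sqrt(np.sum((q - f) ** 2)))

def sum_abs_diff(q, f):
    """Sum of absolute differences between query and database feature vectors."""
    return float(np.sum(np.abs(q - f)))

def retrieve(query_features, db_features, metric=euclidean_distance, top_k=5):
    """Rank database images by ascending distance to the query."""
    scores = [(i, metric(query_features, f)) for i, f in enumerate(db_features)]
    scores.sort(key=lambda t: t[1])
    return scores[:top_k]

# Toy database: 10 images, 16 coefficients each (placeholders for DCT/KWT features).
rng = np.random.default_rng(0)
db = rng.random((10, 16))
query = db[3] + 0.01 * rng.random(16)   # near-duplicate of image 3
print(retrieve(query, db)[0][0])        # → 3 (image 3 is the closest match)
```

Either metric can be passed as `metric`; ED penalizes large per-coefficient differences more heavily, while the sum of absolute differences weighs all deviations linearly.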
This research addresses the practical side by taking case studies of construction projects across the various Iraqi governorates. It includes a field survey to identify the impact of parametric costs on construction projects, comparing the survey findings with what was reached during the analysis and assessing their validity and accuracy, as well as personal interviews to establish the actual state of construction projects. After comparing the field data and its measurement in construction projects for the public and private sectors, the results showed that the correlation between the expected and actual cost change was 97.8%, which means that the data can be adopted in the re
XML is being incorporated into the foundation of e-business data applications. This paper addresses the problem of the freeform information stored in any organization and how XML, using this new approach, can make search operations efficient and time-saving. The paper introduces a new solution and methodology developed to capture and manage such unstructured freeform information (multi-information), based on XML schema technologies, neural network concepts, and an object-oriented relational database, in order to provide a practical solution for efficiently managing a multi-freeform information system.
Human interaction technology based on motion capture (MoCap) systems is a vital tool for human kinematics analysis, with applications in clinical settings, animation, and video games. We introduce a new method for analyzing and estimating dorsal spine movement using a MoCap system. The data captured by the MoCap system are processed and analyzed to estimate the motion kinematics of three primary regions: the shoulders, spine, and hips. This work contributes a non-invasive and anatomically guided framework that enables region-specific analysis of spinal motion, which could serve as a clinical alternative to invasive measurement techniques. The hierarchy of our model consists of five main levels: motion capture system settings, marker data
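The region-level estimation described above can be pictured with a toy computation: the tilt of the line joining a left/right marker pair (e.g. the two shoulder markers) relative to the horizontal. This is only a hedged illustration of marker-pair kinematics; the marker names, coordinate convention, and positions below are assumptions, not the paper's actual model.

```python
import numpy as np

def segment_tilt_deg(left_marker, right_marker):
    """Tilt of the line joining two markers (e.g. left/right shoulder)
    relative to the horizontal, measured in the frontal (x-z) plane, in degrees."""
    l, r = np.asarray(left_marker, float), np.asarray(right_marker, float)
    dx, dz = r[0] - l[0], r[2] - l[2]   # lateral and vertical separation
    return float(np.degrees(np.arctan2(dz, dx)))

# Hypothetical marker positions (x, y, z) in metres: right shoulder 5 cm higher.
left_shoulder, right_shoulder = (-0.20, 0.0, 1.45), (0.20, 0.0, 1.50)
print(round(segment_tilt_deg(left_shoulder, right_shoulder), 1))  # → 7.1
```

Repeating such measurements per frame for the shoulder, spine, and hip marker sets yields time series of region-specific angles.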
Malaysia's growing population and industrialisation have increased solid waste accumulation in landfills, leading to a rise in leachate production. Leachate, a highly contaminated liquid from landfills, poses environmental risks and affects water quality. Conventional leachate treatments are costly and time-consuming due to the need for additional chemicals; the electrocoagulation process could therefore be used as an alternative method. Electrocoagulation is an electrochemical method of treating water that eliminates impurities by applying an electric current. In the present study, the optimisation of contaminant removal was investigated using Response Surface Methodology. Three parameters were considered for optimisation: the curr
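Response Surface Methodology of the kind named above typically fits a second-order polynomial to the experimental runs and then locates the predicted optimum. The sketch below shows that workflow for two of the factors; the run values, factor ranges, and responses are invented placeholders, not the study's data.

```python
import numpy as np

# Hypothetical electrocoagulation runs: (current [A], time [min]) -> removal [%].
X = np.array([[1.0, 10], [1.0, 30], [2.0, 10], [2.0, 30], [1.5, 20],
              [1.5, 10], [1.5, 30], [1.0, 20], [2.0, 20]])
y = np.array([52.0, 61.0, 58.0, 65.0, 74.0, 63.0, 69.0, 60.0, 64.0])

def design_matrix(X):
    """Second-order RSM model: intercept, linear, interaction, quadratic terms."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

# Least-squares fit of the response surface coefficients.
coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

def predict(x1, x2):
    return float(coef @ np.array([1.0, x1, x2, x1 * x2, x1**2, x2**2]))

# Coarse grid search inside the experimental region for the predicted optimum.
grid = [(a, b) for a in np.linspace(1, 2, 21) for b in np.linspace(10, 30, 21)]
best = max(grid, key=lambda p: predict(*p))
print(best, round(predict(*best), 1))
```

In practice the grid search would be replaced by solving the stationarity conditions of the fitted quadratic, and ANOVA would be used to check the model's significance.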
This paper experimentally investigates the heating process of a hot water supply using a neural network implementation of a self-tuning PID controller on a microcontroller system. The Particle Swarm Optimization (PSO) algorithm employed in system tuning proved very effective, as it is a simple and fast optimization algorithm. The PSO tuning of the PID parameters is executed on the Matlab platform in order to load these parameters into the real-time digital PID controller, which was tested in a pilot study on a microcontroller platform. Instead of the traditional phase-angle power control (PAPC) method, the cycle-by-cycle power control (CBCPC) method is implemented because it yields a better power factor and eliminates harmonics
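The PSO-based PID tuning described above can be sketched as minimizing a tracking-error cost over candidate gain vectors. This is a minimal, hedged sketch in Python rather than the paper's Matlab code: the first-order plant model, gain bounds, and swarm settings are all assumptions for illustration.

```python
import numpy as np

def pid_cost(params, dt=0.01, steps=500):
    """Integral of absolute error for a first-order plant y' = (-y + u) / tau
    under PID control tracking a unit step. The plant model is illustrative."""
    kp, ki, kd = params
    tau, y, integ, prev_e, cost = 0.5, 0.0, 0.0, 1.0, 0.0
    for _ in range(steps):
        e = 1.0 - y
        integ += e * dt
        deriv = (e - prev_e) / dt
        u = kp * e + ki * integ + kd * deriv
        y += dt * (-y + u) / tau
        prev_e = e
        cost += abs(e) * dt
        if not np.isfinite(y) or abs(y) > 1e6:   # penalize unstable gains
            return 1e9
    return cost

def pso(cost, bounds, n=20, iters=40, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer over box-constrained parameters."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    x = rng.uniform(lo, hi, (n, len(lo)))
    v = np.zeros_like(x)
    pbest, pcost = x.copy(), np.array([cost(p) for p in x])
    g = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        improved = c < pcost
        pbest[improved], pcost[improved] = x[improved], c[improved]
        g = pbest[pcost.argmin()].copy()
    return g, float(pcost.min())

# Search Kp, Ki in [0, 10] and Kd in [0, 1] (assumed ranges).
gains, best = pso(pid_cost, bounds=[(0, 10), (0, 10), (0, 1)])
print(gains, best)
```

The tuned gains would then be transferred to the real-time digital PID controller, as the abstract describes for the microcontroller pilot study.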