In recent years, research on the congestion problem in 4G and 5G networks has grown, particularly research based on artificial intelligence (AI). Although 4G with LTE is regarded as a mature technology, continuous improvement of the infrastructure has led to the emergence of 5G networks. The large-scale services provided in industry, Internet of Things (IoT) applications, and smart cities, which involve huge volumes of exchanged data, a high density of connected devices per area, and high data rates, have brought their own problems and challenges, congestion in particular. In this context, AI models can be considered one of the main techniques for solving network congestion problems. Since AI techniques can extract relevant features from data and handle huge amounts of it, integrating communication networks with AI to address congestion appears promising, and the area merits exploration. This paper provides a review of how AI techniques can be used to solve the congestion problem in 4G and 5G networks. We examine previous studies addressing network congestion, including congestion prediction, congestion control, congestion avoidance, and TCP development for congestion control. Finally, we discuss the future vision of using AI in 4G and 5G networks to solve congestion problems and identify research issues that need further study.
This study develops several research axes, including: deliberative (pragmatic) language and idiom; the description of language as a social phenomenon; the definition of speech act theory; and the problem of the conflict between tradition and innovation. Its stated objective is to revive deliberative (pragmatic) thought among Arab scholars and to strike a balance between Arabic rhetoric and Western speech act theory, two traditions that meet out of intellectual necessity: a sober reading that preserves the prestige of the Arabic language and its position in light of the growing sciences of language, given that we have inherited unique minds and a vast heritage capable of consolidating an Arabic linguistic theory within linguistics.
Luminescent sensor membranes and sensor microplates are presented for continuous or high-throughput wide-range measurement of pH based on a europium probe.
In this paper, a design for a broadband thin metamaterial absorber (MMA) is presented. Compared with previously reported metamaterial absorbers, the proposed structure provides a wide bandwidth with a comparable overall size. The designed absorber consists of a combination of an octagonal disk and a split octagonal resonator to provide a wide bandwidth over the Ku- and K-band frequency range. Cheap FR-4 material is chosen as the substrate of the proposed absorber, with a 1.6 mm thickness and a 6.5 × 6.5 mm² overall unit cell size. CST Studio Suite was used for the simulation of the proposed absorber. The proposed absorber provides a wide absorption bandwidth of 14.4 GHz over the frequency range of 12.8-27.5 GHz with more than 90% absorption.
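As a quick sanity check on the reported band, the relative (fractional) bandwidth implied by the 12.8-27.5 GHz band edges can be computed directly. The band-edge values below are taken from the abstract; the script is only an illustrative calculation, not part of the CST workflow:

```python
# Fractional bandwidth implied by the >90% absorption band edges quoted above.
f_low, f_high = 12.8e9, 27.5e9             # band edges in Hz (from the abstract)
f_center = (f_low + f_high) / 2            # arithmetic center frequency
fractional_bw = (f_high - f_low) / f_center
print(f"center: {f_center / 1e9:.2f} GHz, fractional bandwidth: {fractional_bw:.1%}")
```

A fractional bandwidth above 70% places the absorber well into the broadband regime.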
The city is a built-up urban space with multifunctional structures that ensure safety, health, and the best shelter for humans. Its built structures have various urban roofs influenced by different climate circumstances, which creates peculiarities and changes within the local urban climate and increases the impact of urban heat islands (UHI), with wasted energy. The research question concerns the lack of information on renovating existing urban roofs using color as a strategy to mitigate the impact of UHI. In order to achieve local urban sustainability, the research focused on solutions using different materials and treatments to reduce urban surface heat emissions. The results showed that the new and old technologies, produ
This study employs wavelet transforms to address the issue of boundary effects. Additionally, it utilizes probit transform techniques, based on probit functions, to estimate the copula density function. The estimation depends on the empirical distribution function of the variables, and the density is estimated within a transformed domain. Recent research indicates that early implementations of this strategy may have been more efficient. Nevertheless, in this work we implemented two novel methodologies utilizing the probit transform and the wavelet transform, and then evaluated and contrasted them using three criteria: root mean square error (RMSE), the Akaike information criterion (AIC), and log
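As a minimal sketch of the probit-transform idea described above, assuming pseudo-observations obtained from normalized ranks and an ordinary Gaussian kernel estimator standing in for the wavelet step (the function name and interface are illustrative, not from the paper):

```python
import numpy as np
from scipy.stats import norm, rankdata, gaussian_kde

def probit_copula_density(u, v, eval_u, eval_v):
    """Estimate a copula density c(u, v) via the probit transform.

    u, v: pseudo-observations in (0, 1), e.g. normalized ranks.
    eval_u, eval_v: arrays of points in (0, 1) at which to evaluate.
    """
    # Map the unit square to the plane with the probit (inverse normal CDF),
    # pushing the data away from the boundaries of [0, 1]^2.
    x, y = norm.ppf(u), norm.ppf(v)
    kde = gaussian_kde(np.vstack([x, y]))   # density estimate in the transformed domain

    ex, ey = norm.ppf(eval_u), norm.ppf(eval_v)
    # Back-transform: divide by the normal marginals (Jacobian of the probit map).
    return kde(np.vstack([ex, ey])) / (norm.pdf(ex) * norm.pdf(ey))

# Pseudo-observations from a positively dependent sample.
rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=2000)
u = rankdata(z[:, 0]) / (len(z) + 1)
v = rankdata(z[:, 1]) / (len(z) + 1)
c = probit_copula_density(u, v, np.array([0.5]), np.array([0.5]))
```

For positively dependent data such as this, the estimated density at the center of the unit square comes out above 1, as the true Gaussian copula density would.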
Cyber-attacks keep growing, so stronger image-protection schemes are needed. This paper presents DGEN, a Dynamic Generative Encryption Network that combines Generative Adversarial Networks with a key system that adapts to context. Unlike a fixed scheme such as AES, the method can potentially adjust itself when new threats appear, and it aims to resist brute-force, statistical, and quantum attacks. The design adds randomness, uses learning, and generates keys that depend on each image, which should give strong security and flexibility while keeping computational cost low. Tests were run on several public image sets, and the results show that DGEN outperforms AES, chaos-based schemes, and other GAN-based approaches. Entropy reached 7.99 bits per pixel
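The per-pixel entropy figure quoted for cipher images is the Shannon entropy of the 8-bit gray-level histogram, which tops out at 8 bits per pixel for a perfectly uniform histogram. A minimal sketch of that metric (not code from the paper):

```python
import numpy as np

def image_entropy_bits(img):
    """Shannon entropy, in bits per pixel, of an 8-bit image's gray-level histogram."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins (0 * log 0 := 0)
    return float(-np.sum(p * np.log2(p)))

# A uniformly random, cipher-like image scores close to the 8-bit maximum,
# while a constant image scores 0.
rng = np.random.default_rng(0)
cipher_like = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
flat = np.zeros((256, 256), dtype=np.uint8)
```

Values approaching 8.0, such as the reported 7.99, indicate a near-uniform cipher-image histogram that leaks little statistical information.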
Background: Secondary skull tumors are malignant bone tumors that are increasing in incidence. Objective: The objectives of this study were to present the clinical features, assess the outcome of patients with secondary skull tumors, characterize the MRI features, locations, and extent of secondary skull tumors, and determine the frequency of symptomatic disease. Type of the study: This is a prospective study. Methods: This prospective study ran from February 2000 to February 2008. The patients were selected from five neurosurgical centers and one oncology hospital in Baghdad, Iraq. The inclusion criteria were an MRI study of the head (either as an initial radiological study or following head CT scan when a secondary brain tumor was suspected, vis
This paper compares denoising techniques using a statistical approach: principal component analysis with local pixel grouping (PCA-LPG), in which the procedure is iterated a second time to further improve denoising performance, alongside other enhancement filters. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the input noisy image, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter.
Experimental results show LPG-
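The classical comparison filters mentioned above (median, Gaussian, and adaptive Wiener) are all available in SciPy. The following is a minimal, self-contained sketch of such a comparison on a synthetic grayscale image, with PSNR as the quality score; the test image and parameters are illustrative, not those of the paper:

```python
import numpy as np
from scipy.ndimage import median_filter, gaussian_filter
from scipy.signal import wiener

def psnr_db(clean, estimate):
    """Peak signal-to-noise ratio in dB for images scaled to [0, 1]."""
    mse = np.mean((clean - estimate) ** 2)
    return 10 * np.log10(1.0 / mse)

# Smooth synthetic image degraded by constant-power additive Gaussian noise.
rng = np.random.default_rng(1)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
noisy = np.clip(clean + rng.normal(0.0, 0.1, clean.shape), 0.0, 1.0)

denoised = {
    "median 3x3": median_filter(noisy, size=3),       # median of each 3x3 neighborhood
    "gaussian":   gaussian_filter(noisy, sigma=1.0),  # Gaussian low-pass filter
    "wiener 3x3": wiener(noisy, mysize=3),            # adaptive Wiener from local statistics
}
for name, out in denoised.items():
    print(f"{name}: {psnr_db(clean, out):.2f} dB (noisy: {psnr_db(clean, noisy):.2f} dB)")
```

On this smooth test image, each filter raises the PSNR well above that of the noisy input; real images with edges and texture separate the methods more sharply.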
Average interstellar extinction curves for the Galaxy and the Large Magellanic Cloud (LMC) over the wavelength range 1100 Å - 3200 Å were obtained from observations by the IUE satellite. The two extinction curves, for our Galaxy and the LMC, are normalized to Av = 0 and E(B-V) = 1 to meet standard criteria. It is found that the differences between the two extinction curves appear clearly in the middle- and far-ultraviolet regions due to the presence of different populations of small grains, which contribute very little at longer wavelengths. Using new IUE reduction techniques leads to more accurate results.
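The normalization mentioned above is the standard one in which the curve is expressed as E(λ-V)/E(B-V), so its value is zero at the V band and unity at B. A minimal sketch, assuming total extinctions A(λ) in magnitudes are already in hand (the function name and example values are illustrative, not observed data):

```python
import numpy as np

def normalize_extinction(a_lambda, a_v, a_b):
    """Normalize an extinction curve so that its value at V is 0 and E(B-V) = 1.

    Returns E(lambda - V) / E(B - V) = (A_lambda - A_V) / (A_B - A_V).
    """
    return (np.asarray(a_lambda, dtype=float) - a_v) / (a_b - a_v)

# Illustrative extinction values in magnitudes: at V, at B, and in the far UV.
a_v, a_b = 1.0, 1.32                       # a nominal sight line with E(B-V) = 0.32
curve = normalize_extinction([1.0, 1.32, 2.6], a_v, a_b)
```

Normalizing both the Galactic and LMC curves this way removes the dependence on the amount of dust along each sight line, so the remaining differences reflect grain populations rather than column density.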