Massive multiple-input multiple-output (massive MIMO) is a promising technology for next-generation wireless communication systems due to its capability to increase the data rate and accommodate the ongoing explosion in data traffic. However, in non-reciprocal channels, such as those encountered in frequency division duplex (FDD) systems, channel state information (CSI) estimation using a downlink (DL) training sequence remains a very challenging problem, especially when the channel exhibits a short coherence time. In particular, the availability of sufficiently accurate CSI at the base transceiver station (BTS) enables an efficient precoding design for the DL transmission and, thus, reliable communication. To achieve these objectives, this paper presents a feasible DL training sequence design based on a partial CSI estimation approach for an FDD massive-MIMO system with a short coherence time. To this end, a threshold-based approach is proposed for suitable DL pilot selection by exploiting the statistical information in the channel covariance matrix. The mean square error of the proposed design is derived, and the achievable sum rate and bit-error rate for maximum ratio transmission and regularized zero-forcing precoding are investigated over different BTS topologies with uniform linear and uniform rectangular arrays. The results show that feasible performance in DL FDD massive-MIMO systems can be achieved even when a large number of antenna elements is deployed at the BTS and a short coherence time is considered.
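The abstract does not spell out the selection rule, so the following is only a minimal sketch of one plausible reading of the threshold-based pilot selection: the DL training directions are taken as the dominant eigenvectors of the channel covariance matrix, keeping just enough of them to capture a chosen fraction of the channel power. The function name, the power-fraction criterion, and the exponential-correlation covariance used in the toy example are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def select_dl_pilots(R, threshold=0.95):
    """Hypothetical threshold-based DL pilot selection.

    R         : (M x M) channel covariance matrix at the BTS.
    threshold : fraction of the total channel power the retained
                eigen-directions must capture (assumed criterion).

    Returns a pilot matrix whose columns are the dominant eigenvectors
    of R, i.e. the only directions that are actually trained.
    """
    # Eigendecomposition of the (Hermitian) covariance matrix.
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]          # strongest directions first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    # Keep just enough eigen-directions to capture `threshold` of the
    # total power; only this partial CSI is estimated in the DL.
    captured = np.cumsum(eigvals) / np.sum(eigvals)
    k = int(np.searchsorted(captured, threshold)) + 1
    return eigvecs[:, :k]                      # (M x k) training matrix

# Toy usage: exponential-correlation covariance for a 64-antenna ULA.
M, rho = 64, 0.9
R = rho ** np.abs(np.subtract.outer(np.arange(M), np.arange(M)))
pilots = select_dl_pilots(R, threshold=0.95)
print(pilots.shape)  # training length k << M when the channel is highly correlated
```

The point of the sketch is that the required training length shrinks with spatial correlation, which is what makes DL training feasible within a short coherence time.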
Basketball is a sport with its own particular physical and functional (physiological) demands, which arise from moving around the court with or without the ball, the means of evading an opponent's marking during defense, how to maneuver during attack, and mastery of all types of shooting on the court. From this, the research problem took shape: how to develop those functional factors that influence the improvement of a player's performance level during play, and through the two researchers' use of the method of the … apparatus
As cities across the world grow and populations become more mobile, the number of vehicles on the roads has increased correspondingly. This has created a proliferation of challenges for authorities with regard to road traffic management, including traffic congestion, more accidents, and pollution. Accidents are still a major cause of death, despite the development of sophisticated traffic management systems and other vehicle-related technologies. Hence, it is necessary to develop a common system for accident management. For instance, traffic congestion in most urban areas can be alleviated by real-time route planning. However, the design of an efficient …
Survival analysis is one of the modern methods of analysis based on the fact that the dependent variable represents the time until the event of interest. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by the statistician David Cox, one of the most important and most common survival models. It consists of two functions: a parametric function that does not depend on the survival time, and a nonparametric function that does depend on the survival time, which is why the Cox model is described as a semi-parametric model. The set of parametric models that depend on the time-to-event distribution parameters, such as …
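For reference, the semi-parametric structure described above is the standard Cox proportional-hazards form (the notation here is the textbook convention, not necessarily the paper's):

```latex
h(t \mid \mathbf{x})
  \;=\;
  \underbrace{h_0(t)}_{\text{nonparametric, depends on } t}
  \;\times\;
  \underbrace{\exp\!\bigl(\beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_p x_p\bigr)}_{\text{parametric, free of } t}
```

Here $h_0(t)$ is the baseline hazard left unspecified (the nonparametric part), while the exponential term carries the covariate effects through the parameters $\beta_1,\dots,\beta_p$ (the parametric part).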
In this research, several estimators of the hazard function are introduced. These estimators are based on one of the nonparametric methods, namely the kernel method, for censored data with varying bandwidths and boundary kernels. Two types of bandwidth are used: a local bandwidth and a global bandwidth. Moreover, four types of boundary kernel are used, namely the rectangle, Epanechnikov, biquadratic, and triquadratic kernels, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results showed that the local bandwidth is the best for all types of boundary kernel functions.
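As a minimal sketch of the kind of estimator being compared, the code below smooths the Nelson-Aalen increments of right-censored data with an Epanechnikov kernel. It uses a plain (uncorrected) kernel and a single global bandwidth; the boundary-corrected kernels and local bandwidths that the paper actually studies, and its proposed function, are not reproduced here.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, one of the four kernels named above."""
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def kernel_hazard(t_grid, times, events, bandwidth):
    """Kernel-smoothed Nelson-Aalen hazard estimate for right-censored data.

    times     : observed times (event or censoring)
    events    : 1 if the observation is an event, 0 if censored
    bandwidth : global bandwidth b (a local bandwidth would vary with t)
    """
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)                 # size of the risk set at each time
    increments = events / at_risk              # Nelson-Aalen jump sizes

    u = (np.asarray(t_grid, dtype=float)[:, None] - times[None, :]) / bandwidth
    return (epanechnikov(u) * increments).sum(axis=1) / bandwidth

# Toy usage with simulated exponential lifetimes and random censoring.
rng = np.random.default_rng(0)
life, cens = rng.exponential(2.0, 200), rng.exponential(3.0, 200)
times = np.minimum(life, cens)
events = (life <= cens).astype(float)
grid = np.linspace(0.1, 4.0, 50)
print(kernel_hazard(grid, times, events, bandwidth=0.5)[:5])
```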
The aim of this paper is to design an artificial neural network (ANN) as an alternative, accurate tool to estimate the concentration of cadmium in contaminated soils for any depth and time. First, fifty soil samples were collected from a phytoremediated contaminated site located in Qanat Aljaeesh in Baghdad city, Iraq. Second, a series of measurements was performed on the soil samples; the inputs are the soil depth, the time, and the soil parameters, and the output is the cadmium concentration in the soil at depth x and time t. Third, an ANN was designed, its performance was evaluated using a test data set, and it was then applied to estimate the cadmium concentration. The performance of the ANN technique was compared with traditional laboratory inspection…
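A minimal sketch of this input-output mapping is given below, assuming a small fully connected network. The feature set (depth, time, pH, organic matter), the hidden-layer sizes, and the synthetic data standing in for the fifty field samples are all illustrative assumptions; the paper's actual architecture and measurements are not reproduced here.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical layout: each row is (depth_cm, time_days, pH, organic_matter),
# the target is the measured Cd concentration.
rng = np.random.default_rng(1)
X = rng.uniform([0, 0, 5.5, 0.5], [60, 180, 8.5, 5.0], size=(50, 4))
y = 10.0 * np.exp(-0.01 * X[:, 1]) + 0.05 * X[:, 0] + rng.normal(0, 0.2, 50)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

# Small multilayer perceptron; inputs are standardized before training.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=1),
)
model.fit(X_train, y_train)
print("test R^2:", model.score(X_test, y_test))

# Estimate the concentration at an unsampled depth/time combination.
print(model.predict([[30.0, 90.0, 7.0, 2.0]]))
```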
This paper aims to determine the best parameter estimation methods for the parameters of the Gumbel type-I distribution under the type-II censoring scheme. For this purpose, classical and Bayesian parameter estimation procedures are considered. The maximum likelihood estimators are used for the classical parameter estimation procedure, and the asymptotic distributions of these estimators are also derived. Since explicit solutions for the Bayesian estimators cannot be obtained, Markov chain Monte Carlo and Lindley approximation techniques are used to estimate the unknown parameters. In Bayesian analysis, it is very important to determine an appropriate combination of a prior distribution and a loss function. Therefore, two different …
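As a reminder of what the maximum likelihood estimators maximize here, the standard type-II censored likelihood (only the r smallest of n ordered observations $x_{(1)} \le \dots \le x_{(r)}$ are available) is written below, with the Gumbel type-I density in the maxima convention; whether the paper works with the maxima or minima form is not stated in the abstract.

```latex
L(\mu,\sigma)
  \;=\;
  \frac{n!}{(n-r)!}
  \Bigl[\prod_{i=1}^{r} f\bigl(x_{(i)};\mu,\sigma\bigr)\Bigr]
  \bigl[1 - F\bigl(x_{(r)};\mu,\sigma\bigr)\bigr]^{\,n-r},
\qquad
f(x;\mu,\sigma)
  \;=\;
  \frac{1}{\sigma}
  \exp\!\Bigl(-\frac{x-\mu}{\sigma} - e^{-(x-\mu)/\sigma}\Bigr),
\quad
F(x;\mu,\sigma) = \exp\!\bigl(-e^{-(x-\mu)/\sigma}\bigr).
```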
In this research, some robust non-parametric methods were used to estimate the semi-parametric regression model, and these methods were then compared using the MSE criterion under different sample sizes, variance levels, and contamination rates, with three different models. The methods are S-estimation with local linear smoothing (S-LLS), M-estimation with local linear smoothing (M-LLS), S-estimation with Nadaraya-Watson smoothing (S-NW), and M-estimation with Nadaraya-Watson smoothing (M-NW).
The results for the first model showed that the S-LLS method was the best for large sample sizes, while for small sample sizes the results showed that the …
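The Nadaraya-Watson smoother referenced in the method names above is sketched below on contaminated data; the robust S/M weighting applied to the parametric part of the semi-parametric model is not reproduced, and the Gaussian kernel and bandwidth are assumptions for illustration only.

```python
import numpy as np

def nadaraya_watson(x_grid, x, y, bandwidth):
    """Nadaraya-Watson kernel regression: a locally weighted average of y."""
    u = (np.asarray(x_grid)[:, None] - x[None, :]) / bandwidth
    w = np.exp(-0.5 * u**2)                    # Gaussian kernel (an assumption)
    return (w * y).sum(axis=1) / w.sum(axis=1)

# Toy usage on noisy data with a few outliers (the "contamination").
rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 2 * np.pi, 150))
y = np.sin(x) + rng.normal(0, 0.2, 150)
y[rng.choice(150, 10, replace=False)] += 3.0   # contaminated observations
grid = np.linspace(0, 2 * np.pi, 20)
print(nadaraya_watson(grid, x, y, bandwidth=0.3))
```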
The normal distribution is transformed through the Kummer Beta generator to obtain the Kummer Beta Generalized Normal Distribution (KBGND). The distribution parameters and the hazard function are then estimated using the maximum likelihood (MLE) method, and these estimates are improved by employing a genetic algorithm. Simulation is used, assuming a number of models and different sample sizes. The main finding was that the standard maximum likelihood (MLE) method was the best for estimating the parameters of the KBGND according to the mean squared error (MSE) and integrated mean squared error (IMSE) criteria for estimating the hazard function, while the pr…
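The combination of MLE with an evolutionary search can be illustrated as below. The KBGND density involves the Kummer (confluent hypergeometric) function and is not reproduced here, so a plain normal log-likelihood stands in for it, and SciPy's differential evolution stands in for the genetic algorithm; both substitutions are assumptions for the sake of a runnable sketch.

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import norm

# Simulated data; in the paper this would be a KBGND sample.
rng = np.random.default_rng(3)
sample = rng.normal(loc=2.0, scale=1.5, size=200)

def neg_log_likelihood(theta):
    """Negative log-likelihood to be minimized by the evolutionary search."""
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    return -np.sum(norm.logpdf(sample, loc=mu, scale=sigma))

result = differential_evolution(
    neg_log_likelihood,
    bounds=[(-10, 10), (0.01, 10)],   # search box for (mu, sigma)
    seed=3,
)
print("evolutionary MLE:", result.x)  # compare with the sample mean / std
```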
The objective of this research was to estimate the dose distribution delivered by radioactive gold nanoparticles (¹⁹⁸AuNPs or ¹⁹⁹AuNPs) to a tumor inside the human prostate, as well as to the normal tissues surrounding the tumor, using the Monte Carlo N-Particle code (MCNP 6.1.1). Background: Radioactive gold nanoparticles are emerging as promising agents for cancer therapy and are being investigated to treat prostate cancer in animals. In order to use them as a new therapeutic modality for human prostate cancer, accurate radiation dosimetry simulations are required to estimate the energy deposition in the tumor and surrounding tissue and to establish the course of therapy for the patient. Materials and methods: A simple geometrical…