Entropy, defined as a measure of uncertainty, was transformed using the cumulative distribution function and the reliability function of the Burr Type-XII distribution. For data that suffer from volatility, a probability model was built on the failure (hazard) function of a sample, after verifying the conditions of a probability distribution. A formula was derived for the probability distribution obtained by applying the entropy transform to the continuous Burr Type-XII distribution; the new function was tested and found to satisfy the conditions of a probability distribution. The mean and the cumulative distribution function were also derived so that data could be generated for the simulation experiments. The parameters of the distribution extracted from the failure-function formula were then estimated using the maximum likelihood method, White's method, and a mixed estimation method, and the estimators were compared by the mean squared error (MSE) criterion in simulation experiments with different sample sizes for the scale and shape parameters. The results reveal that the mixed estimator is the best for the shape parameter, while the White estimator is the best for the scale parameter.
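The simulation-based MSE comparison described above can be sketched minimally. The sketch below assumes the standard two-parameter Burr Type-XII with CDF F(x) = 1 − (1 + x^c)^(−k) and shows only the maximum likelihood estimator of the shape parameter k with c treated as known (in that case the MLE has a closed form); the White and mixed estimators from the study are not reproduced here:

```python
import numpy as np

# Burr Type-XII: F(x) = 1 - (1 + x**c)**(-k), x > 0, shape parameters c, k.
# With c known, the MLE of k has the closed form k_hat = n / sum(log(1 + x**c)).

def rburr12(n, c, k, rng):
    """Draw Burr XII samples by inverting the CDF."""
    u = rng.uniform(size=n)
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

def mle_k(x, c):
    """Closed-form MLE of k when c is known."""
    return len(x) / np.log1p(x ** c).sum()

def mse_of_mle(n, c, k, reps=500, seed=0):
    """Monte Carlo estimate of the MSE of the MLE of k."""
    rng = np.random.default_rng(seed)
    est = np.array([mle_k(rburr12(n, c, k, rng), c) for _ in range(reps)])
    return ((est - k) ** 2).mean()

if __name__ == "__main__":
    for n in (40, 60, 100):          # illustrative sample sizes
        print(n, mse_of_mle(n, c=3.0, k=2.0))
```

As expected for a consistent estimator, the estimated MSE shrinks as the sample size grows.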
Abstract
The issue of protecting the environment is a shared responsibility among several bodies and sectors, and constitutes a main subject through which sustainable development can be achieved. Within the government sectors, programs can be set up to move the government sector towards a green environment, so that the implementation…
This study discussed a biased estimator of the Negative Binomial regression model known as the Liu estimator. This estimator was used to reduce the variance and overcome the problem of multicollinearity among the explanatory variables. Other estimators were also used, such as the ridge regression and maximum likelihood estimators. This research aims at theoretical comparisons between the new estimator (the Liu estimator) and the estimators…
In this paper, an FPGA model of an intelligent traffic light system with power saving was built. The system consists of sensors placed at the ends of the intersection's sides to sense the presence or absence of vehicles. It reduces the waiting time at a red light by transitioning from one traffic light state to the other when the current state has spent a long time serving no vehicles. The proposed system is described in VHDL, simulated using the Xilinx ISE 9.2i package, and implemented on a Spartan-3A XC3S700A FPGA kit. Implementation and behavioral simulation results show that the proposed intelligent traffic light system model satisfies the specified operational requirements.
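The skip-empty-approach idea summarized above can be sketched in language-neutral form. The actual design is in VHDL; the transition function below, including the sensor encoding, tick counts, and function name, is a hypothetical illustration, not the paper's code:

```python
# Sketch of the intelligent traffic light idea: each approach holds the green
# only while its sensor reports waiting vehicles; an empty approach is skipped
# instead of holding the light for its full fixed interval.

GREEN_TICKS = 30   # hypothetical full green interval (clock ticks)
MIN_TICKS = 5      # hypothetical minimum green time before an early switch

def next_state(state, ticks_in_state, sensors):
    """Return the index of the approach that should hold the green next.

    state          -- index of the approach currently green
    ticks_in_state -- ticks spent in the current state
    sensors        -- list of booleans, True if vehicles are waiting
    """
    n = len(sensors)
    # Switch early when the current approach is empty (time/power saving),
    # or normally when its full green interval elapses.
    expired = ticks_in_state >= GREEN_TICKS
    empty = ticks_in_state >= MIN_TICKS and not sensors[state]
    if not (expired or empty):
        return state
    # Hand the green to the next approach with waiting vehicles.
    for step in range(1, n + 1):
        cand = (state + step) % n
        if sensors[cand]:
            return cand
    return state  # nobody waiting anywhere: keep the current state
```

In hardware this corresponds to a finite state machine whose next-state logic is gated by the sensor inputs and a tick counter.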
The aim of this research is to identify the impact of digital transformation technology on improving the efficiency of the insurance service in the public insurance companies in Iraq, across the stages of the insurance process: promotion of the insurance product, submission of the insurance application, underwriting, and settlement of losses. To achieve this goal, a survey questionnaire was designed and distributed within the surveyed community, which represents the decision makers in the companies under study. One of the most important results of the research was the existence of a relationship between digital transformation and improving the efficiency of the insurance process, whether through…
In this paper, a method is proposed to increase the compression ratio of color images by dividing the image into non-overlapping blocks and applying a different compression ratio to each block depending on the importance of the information it contains. In regions that contain important information, the compression ratio is reduced to prevent loss of that information, while in smooth regions, which contain little important information, a high compression ratio is used. The proposed method shows better results when compared with classical methods (wavelet and DCT).
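As an illustration of the block-importance idea (not the paper's exact scheme: the block size, variance threshold, and quantization steps below are assumptions, and a single channel is shown for brevity), a sketch:

```python
import numpy as np

BLOCK = 8              # block size (assumed)
FINE, COARSE = 4, 32   # quantization steps: low loss vs. high compression (assumed)

def importance(block):
    """Use local variance as a crude measure of information content."""
    return block.var()

def compress_block(block, threshold=100.0):
    """Quantize coarsely in smooth regions, finely in detailed regions."""
    step = FINE if importance(block) > threshold else COARSE
    return (np.round(block / step) * step).clip(0, 255), step

def compress(img):
    """Apply block-adaptive quantization over a 2-D image channel."""
    out = np.empty_like(img, dtype=float)
    steps = []
    h, w = img.shape
    for r in range(0, h, BLOCK):
        for c in range(0, w, BLOCK):
            out[r:r+BLOCK, c:c+BLOCK], step = compress_block(
                img[r:r+BLOCK, c:c+BLOCK].astype(float))
            steps.append(step)
    return out, steps
```

Smooth blocks lose almost nothing even at the coarse step, while detailed blocks keep their information because they are quantized finely.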
In the present work, a leaching process was studied using organic acids (acetic acid and lactic acid) to extract phosphate from the Iraqi Akashat phosphate ore by separating the calcareous materials (mainly calcite). This approach is characterized by energy conservation and environmental enhancement through recovery of the calcite as calcium sulfate (gypsum), while keeping the physical and chemical properties of the apatite. Samples were analyzed using X-ray diffraction and an FTIR spectrophotometer. From the experimental data it was found that the two organic acids yield close purity values of the produced apatite at the optimum conditions, while at different acid concentrations the efficiency of acetic acid is higher at low acid concentrations…
The research aims to measure and evaluate the efficiency of the directorates of Anbar Municipalities using the Data Envelopment Analysis (DEA) method, because the municipality sector is an important sector that is in direct contact with citizens' lives and provides essential services to them. The researcher used a case study method, with monthly reports as the source of data. The research population is represented by the Directorate of Anbar Municipalities, and the research sample consists of 7 municipalities that differ in category and size. The most important conclusion reached by the research is…
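As an illustration of the DEA method named above (not the study's own computation: the function name and the tiny example data are invented, and SciPy is assumed available), an input-oriented CCR efficiency score can be obtained by linear programming:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR (DEA) efficiency of decision-making unit o.

    Solves  min theta  s.t.  sum_j lam_j * X[j] <= theta * X[o],
                             sum_j lam_j * Y[j] >= Y[o],  lam >= 0,
    where X is the (n, m) input matrix and Y the (n, s) output matrix.
    Decision vector z = [theta, lam_1, ..., lam_n].
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                    # minimize theta
    # Input rows:  -theta * X[o, i] + sum_j lam_j * X[j, i] <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # Output rows: -sum_j lam_j * Y[j, r] <= -Y[o, r]
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                  bounds=[(0, None)] * (1 + n))
    return res.fun
```

In a municipal application, the inputs could be resources such as budget and staff and the outputs the services delivered; an efficiency of 1 means the unit lies on the efficient frontier.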
The most popular medium used by people on the internet nowadays is video streaming. Nevertheless, streaming video consumes much of the internet's traffic: nearly 70% of internet usage goes to video streaming. Some constraints of interactive media, such as increased bandwidth usage and latency, might be removed. The need for real-time transmission of live video streams leads to the employment of fog computing technologies, an intermediary layer between the cloud and the end user. The latter technology has been introduced to alleviate those problems by providing fast real-time response and computational resources near to the…
This study used the soil map of the LD7 project to interpret the distribution and shapes of map units, employing the compaction index as an index for explaining map unit shape. The compaction indices of the map units covered wide and varied ranges: the maximum value was 0.892 for the MF9 map unit, and the minimum value was 0.010 for the same unit. Because MF9 showed such a wide range of compaction index values, the indices were statistically analyzed using cluster analysis to group similar ranges together and ease the use of their values. The unit MF9 was therefore considered a key map unit of the soils of the LD7 project, which may be used to predict the existence of other map units in the area of…
In this paper, we compare different parametric, nonparametric, and semiparametric estimators for the partial linear regression model (PLM): the parametric approach represented by ordinary least squares (OLS), and the nonparametric methods represented by the cubic smoothing spline estimator and the Nadaraya-Watson estimator. We study three nonparametric regression models with sample sizes n = 40, 60, 100 and variances σ² = 0.5, 1, 1.5. The results for the first model show that the Nadaraya-Watson estimator for the PLM is the best, followed by the cubic smoothing spline estimator for the PLM; the results of the second and third models show that the best estimator is the cubic smoothing spline, followed by the Nadaraya-Watson estimator for the PLM…
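A minimal sketch of the Nadaraya-Watson estimator used in the comparison (Gaussian kernel; the bandwidth h, the test function, and the noise level are assumptions for illustration, not taken from the study):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    m_hat(x) = sum_i K((x - x_i)/h) * y_i / sum_i K((x - x_i)/h)
    """
    u = (x_eval[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * u ** 2)                  # Gaussian kernel weights
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 100
    x = np.sort(rng.uniform(0, np.pi, n))
    y = np.sin(x) + rng.normal(scale=0.5, size=n)   # noisy sine curve
    grid = np.linspace(0.2, np.pi - 0.2, 50)
    fit = nadaraya_watson(x, y, grid, h=0.3)
    mse = ((fit - np.sin(grid)) ** 2).mean()        # MSE against the true curve
    print(mse)
```

The MSE against the true regression curve, averaged over the evaluation grid, is the kind of criterion such simulation comparisons report.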