The goal of this research is to review methods used to estimate the parameters of the Logistic distribution. An exact estimation method, the method of moments, is compared with approximate estimators derived essentially from White's approach, namely OLS, Ridge, and Adjusted Ridge, the last of which is suggested here for application to this distribution. The results of all these methods are based on a simulation experiment with different models and a variety of sample sizes. The comparison is made with respect to two criteria: Mean Square Error (MSE) and Mean Absolute Percentage Error (MAPE).
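As a rough illustration of the exact (method-of-moments) estimator and the two comparison criteria, the sketch below fits the Logistic distribution by moments in a small simulation; the true parameter values, sample size, and number of replications are assumptions for illustration, not the settings used in the study.

```python
import numpy as np

def logistic_moment_estimates(x):
    """Method-of-moments estimates for Logistic(mu, s): E[X] = mu, Var[X] = s^2 * pi^2 / 3."""
    mu_hat = x.mean()
    s_hat = np.sqrt(3.0 * x.var(ddof=1)) / np.pi
    return mu_hat, s_hat

def mse(estimates, true_value):
    return np.mean((np.asarray(estimates) - true_value) ** 2)

def mape(estimates, true_value):
    return np.mean(np.abs((np.asarray(estimates) - true_value) / true_value)) * 100

# Illustrative simulation; true parameters, sample size, and replications are assumed
rng = np.random.default_rng(0)
mu_true, s_true, n, reps = 2.0, 1.5, 50, 1000
mu_hats, s_hats = [], []
for _ in range(reps):
    sample = rng.logistic(loc=mu_true, scale=s_true, size=n)
    m, s = logistic_moment_estimates(sample)
    mu_hats.append(m)
    s_hats.append(s)

print("MSE(mu):", mse(mu_hats, mu_true))
print("MAPE(s):", mape(s_hats, s_true), "%")
```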
The prevention of bankruptcy not only prolongs the economic life of a company and increases its financial performance, but also helps to improve the general economic well-being of the country. Therefore, forecasting financial distress can influence various factors and affect different aspects of the company, including dividends. In this regard, this study examines the prediction of the financial deficit of companies using the logistic regression method and its impact on the earnings per share of companies listed on the Iraqi Stock Exchange. The time period of the research is from 2015 to 2020, and 33 companies listed on the Iraqi Stock Exchange were selected as the sample, and the res…
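A minimal sketch of the logistic regression step is given below, assuming hypothetical financial-ratio predictors and simulated distress labels; the actual variables and data of the study are not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical firm-year data: the feature names are illustrative assumptions,
# not the variables actually used in the study
rng = np.random.default_rng(1)
n = 198  # e.g. 33 firms x 6 years
X = np.column_stack([
    rng.normal(0.1, 0.3, n),   # return on assets
    rng.normal(1.5, 0.8, n),   # current ratio
    rng.normal(0.5, 0.2, n),   # debt-to-assets
])
# Simulated distress label, for illustration only
y = (X[:, 0] - 0.5 * X[:, 1] + X[:, 2] + rng.normal(0, 0.5, n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("distress probabilities:", model.predict_proba(X_test)[:5, 1])
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```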
The two parameters of the Exponential-Rayleigh distribution were estimated using the maximum likelihood estimation (MLE) method for progressively censored data. Estimated values for these two scale parameters were obtained using real COVID-19 data taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The Chi-square test was then utilized to determine whether the sample (data) conformed to the Exponential-Rayleigh (ER) distribution. The nonlinear membership function (s-function) was employed to find fuzzy numbers for these parameter estimators, and the ranking function was then used to transform the fuzzy numbers into crisp numbers. Finally, the mean square error (MSE) was used to compare the outcomes of the survival…
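The sketch below illustrates maximum likelihood estimation for an assumed Exponential-Rayleigh form with hazard h(t) = λ + θt, fitted to complete (uncensored) data; the paper's exact parameterization, the progressive censoring likelihood, and the fuzzy s-function step are not reproduced, and the survival times shown are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, t):
    """Negative log-likelihood for an assumed Exponential-Rayleigh form with
    hazard h(t) = lam + theta*t, i.e. S(t) = exp(-lam*t - theta*t^2/2)."""
    lam, theta = params
    if lam <= 0 or theta <= 0:
        return np.inf
    return -np.sum(np.log(lam + theta * t) - lam * t - 0.5 * theta * t**2)

# Hypothetical survival times in days (the real COVID-19 data are not reproduced here)
t = np.array([5.0, 8.0, 12.0, 3.0, 20.0, 15.0, 7.0, 9.0, 11.0, 6.0])

result = minimize(neg_log_likelihood, x0=[0.1, 0.01], args=(t,), method="Nelder-Mead")
lam_hat, theta_hat = result.x
print("MLE estimates: lambda =", lam_hat, ", theta =", theta_hat)
```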
Entropy, defined as an uncertainty measure, has been transformed using the cumulative distribution function and the reliability function of the Burr Type-XII distribution. This is intended for data that suffer from volatility, in order to build a probability-distribution model on every failure of a sample after the conditions of a probability distribution are satisfied. The formula of the probability distribution resulting from applying the new entropy transform to the continuous Burr Type-XII distribution has been derived; the new function was tested and found to satisfy the conditions of a probability distribution, and its mean and cumulative distribution function were derived in order to be used in generating data for the purpose of implementing the simulation…
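For reference, a minimal sketch of the Burr Type-XII density, cumulative distribution, and reliability functions, together with a numerical (differential) entropy computation, is given below; the specific entropy transform derived in the paper is not reproduced, and the parameter values are assumptions.

```python
import numpy as np
from scipy.integrate import quad

def burr12_pdf(x, c, k):
    """Burr Type-XII density f(x) = c*k*x^(c-1)*(1 + x^c)^(-(k+1)), x > 0."""
    return c * k * x**(c - 1) * (1.0 + x**c) ** (-(k + 1))

def burr12_cdf(x, c, k):
    """Cumulative distribution function F(x) = 1 - (1 + x^c)^(-k)."""
    return 1.0 - (1.0 + x**c) ** (-k)

def burr12_reliability(x, c, k):
    """Reliability (survival) function R(x) = 1 - F(x)."""
    return (1.0 + x**c) ** (-k)

def differential_entropy(c, k):
    """Shannon (differential) entropy -integral f(x) ln f(x) dx, computed numerically."""
    def integrand(x):
        p = burr12_pdf(x, c, k)
        return -p * np.log(p) if p > 0.0 else 0.0
    value, _ = quad(integrand, 0.0, np.inf)
    return value

# Illustrative parameter values (assumed, not taken from the paper)
c, k = 2.0, 3.0
print("F(1.5) =", burr12_cdf(1.5, c, k), ", R(1.5) =", burr12_reliability(1.5, c, k))
print("differential entropy =", differential_entropy(c, k))
```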
Self-assessment is a process of formative assessment during which teachers reflect on and evaluate the quality of their work, decide the degree to which it reflects explicitly stated goals or criteria, identify strengths and weaknesses in their work, and revise accordingly. The present study is an attempt to find out the self-assessment (SA) practices of Iraqi English language teachers. The sample consists of 100 teachers in Baghdad. An inventory covering several domains was distributed to the teachers: routines, expectations, language, time, opportunities, physical environment, and interactions. The results show that the EFL teachers practice four domains of SA: routines, physical environment, time, and language.
The right to property is one of the most fundamental rights enjoyed by individuals. Most national constitutions and laws, as well as international conventions, stipulate that it must be respected and protected, and that it may be restricted only in accordance with the economic and social development of the country (the so-called public benefit) and in return for just compensation. What, then, is fair compensation?
In this research work, a simulator with time-domain visualizers and configurable parameters, using a continuous-time simulation approach with Matlab R2019a, is presented for modeling and investigating the performance of optical fiber and free-space quantum channels as part of a generic quantum key distribution system simulator. The modeled optical fiber quantum channel is characterized by a maximum allowable distance of 150 km with an attenuation of 0.2 dB/km at λ = 1550 nm, while at λ = 900 nm and λ = 830 nm the attenuation values are 2 dB/km and 3 dB/km, respectively. The modeled free-space quantum channel is characterized by 0.1 dB/km at λ = 860 nm, also with a maximum allowable distance of 150 km. The simulator was investigated in terms of the execution of the BB84 protocol…
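The simulator itself is implemented in Matlab; the short Python sketch below only illustrates how the quoted attenuation figures translate into photon survival probability (transmittance) over a given channel length, and is not the authors' implementation.

```python
import numpy as np

def channel_transmittance(alpha_db_per_km, length_km):
    """Fraction of photons surviving a channel with attenuation alpha (dB/km) over length_km."""
    return 10 ** (-alpha_db_per_km * length_km / 10.0)

# Attenuation figures quoted in the abstract (fiber at 1550/900/830 nm, free space at 860 nm)
channels = {
    "fiber_1550nm": 0.2,
    "fiber_900nm": 2.0,
    "fiber_830nm": 3.0,
    "free_space_860nm": 0.1,
}
for name, alpha in channels.items():
    print(name, "transmittance over 150 km:", channel_transmittance(alpha, 150.0))
```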
The presented work shows a preliminary analytic method for estimating the load and pressure distributions on low-speed wings with flow separation and wake rollup phenomena. A higher-order vortex panel method is coupled with the numerical lifting-line theory by means of an iterative procedure including models of separation and wake rollup. The computer programs, written in FORTRAN, are stable and efficient.
The capability of the present method is investigated through a number of test cases with different types of wing sections (NACA 0012 and GA(W)-1) for different aspect ratios and angles of attack. The results include the lift and drag curves, and the lift and pressure distributions along the wing…
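As a simplified stand-in for the coupled panel/lifting-line procedure, the sketch below solves the classical lifting-line equations for an untwisted rectangular wing with a 2π sectional lift slope; the higher-order vortex panel method and the separation and wake-rollup models of the paper are not reproduced.

```python
import numpy as np

def lifting_line(alpha_deg, AR, n_terms=20):
    """Classical lifting-line theory for an untwisted rectangular wing with a 2*pi
    sectional lift slope; returns the lift and induced-drag coefficients."""
    alpha = np.radians(alpha_deg)
    theta = np.pi * (np.arange(1, n_terms + 1) - 0.5) / n_terms  # collocation points
    n = np.arange(1, n_terms + 1)
    # alpha = sum_n A_n [ (2*AR/pi) sin(n*theta) + n sin(n*theta)/sin(theta) ]
    sin_n_theta = np.sin(np.outer(theta, n))
    M = (2.0 * AR / np.pi) * sin_n_theta + n * sin_n_theta / np.sin(theta)[:, None]
    A = np.linalg.solve(M, np.full(n_terms, alpha))
    CL = np.pi * AR * A[0]
    CDi = np.pi * AR * np.sum(n * A**2)
    return CL, CDi

# Illustrative case: 5 degrees angle of attack, aspect ratio 6 (assumed values)
print(lifting_line(alpha_deg=5.0, AR=6.0))
```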
This paper includes a comparison between denoising techniques using a statistical approach, principal component analysis with local pixel grouping (PCA-LPG); this procedure is iterated a second time to further improve the denoising performance. Other enhancement filters were also used: an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the input noisy image, in which each output pixel contains the median value in the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method…
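A minimal sketch of the baseline filters and a PSNR comparison is shown below, using a synthetic grayscale image degraded by additive Gaussian noise; the PCA-LPG procedure itself is not reproduced, and the image and noise level are assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter, gaussian_filter
from scipy.signal import wiener

def psnr(clean, restored, peak=1.0):
    """Peak signal-to-noise ratio in dB between a clean and a restored image."""
    mse = np.mean((clean - restored) ** 2)
    return 10.0 * np.log10(peak**2 / mse)

# Synthetic grayscale image degraded by constant-power additive Gaussian noise (assumed)
x = np.linspace(0, 4 * np.pi, 128)
clean = 0.5 + 0.5 * np.outer(np.sin(x), np.cos(x))
rng = np.random.default_rng(0)
noisy = clean + rng.normal(scale=0.1, size=clean.shape)

restored = {
    "adaptive Wiener 5x5": wiener(noisy, mysize=5),
    "median 3x3": median_filter(noisy, size=3),
    "Gaussian low-pass": gaussian_filter(noisy, sigma=1.0),
}
for name, img in restored.items():
    print(f"{name}: PSNR = {psnr(clean, img):.2f} dB")
```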