Entropy, defined as a measure of uncertainty, is transformed using the cumulative distribution function and the reliability function of the Burr Type-XII distribution. For data that exhibit volatility, a probability model is built for each failure in a sample subject to the constraints of a probability function. A new probability distribution formula is derived by applying the entropy transformation to the continuous Burr Type-XII distribution; the new function is tested and shown to satisfy the conditions of a probability function. The mean and the cumulative distribution function are then derived so that data can be generated for simulation experiments. The parameters of the derived failure distribution are estimated by maximum likelihood, White's method, and a mixed estimation method, and the estimators are compared using the mean squared error (MSE) criterion across simulation experiments with different sample sizes and different values of the scale and shape parameters. The results reveal that the mixed estimator is best for the shape parameter, while the White estimator is best for the scale parameter.
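As a minimal sketch of the quantities this abstract works with, the following assumes the standard two-parameter Burr Type-XII form F(x) = 1 - (1 + x^c)^(-k); the entropy transformation and the White and mixed estimators of the paper itself are not reproduced here, only the CDF, reliability function, inverse-CDF data generation, and the MSE comparison criterion:

```python
import numpy as np

def burr_xii_cdf(x, c, k):
    """CDF of the Burr Type-XII distribution: F(x) = 1 - (1 + x**c)**(-k)."""
    return 1.0 - (1.0 + x ** c) ** (-k)

def burr_xii_reliability(x, c, k):
    """Reliability (survival) function: R(x) = 1 - F(x) = (1 + x**c)**(-k)."""
    return (1.0 + x ** c) ** (-k)

def burr_xii_sample(n, c, k, seed=None):
    """Generate data for simulation experiments by inverting the CDF."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

def mse(estimates, true_value):
    """Mean squared error criterion used to compare estimators."""
    estimates = np.asarray(estimates, dtype=float)
    return float(np.mean((estimates - true_value) ** 2))
```

A simulation study in this style would repeatedly call `burr_xii_sample`, apply each estimator, and rank the estimators by `mse` for each sample size.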
In this article, numerical and approximate solutions for a nonlinear system of differential equations, represented by the epidemic SIR model, are determined. Effective iterative methods, namely the Daftardar-Jafari method (DJM), the Temimi-Ansari method (TAM), and the Banach contraction method (BCM), are used to obtain the approximate solutions. The results show many advantages over other iterative methods, such as the Adomian decomposition method (ADM) and the variational iteration method (VIM), which handle the nonlinear terms via Adomian polynomials and the Lagrange multiplier, respectively. Furthermore, numerical solutions were obtained using the fourth-order Runge-Kutta method (RK4), where the maximum remaining errors showed th
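The RK4 benchmark mentioned above can be sketched as follows; this is a generic classical fourth-order Runge-Kutta integrator applied to the standard normalized SIR system (S' = -βSI, I' = βSI - γI, R' = γI), with the parameter values in the usage note chosen purely for illustration, not taken from the article:

```python
import numpy as np

def sir_rhs(t, y, beta, gamma):
    """Right-hand side of the SIR system: S' = -b*S*I, I' = b*S*I - g*I, R' = g*I."""
    S, I, R = y
    return np.array([-beta * S * I, beta * S * I - gamma * I, gamma * I])

def rk4_step(f, t, y, h, *args):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(t, y, *args)
    k2 = f(t + h / 2, y + h * k1 / 2, *args)
    k3 = f(t + h / 2, y + h * k2 / 2, *args)
    k4 = f(t + h, y + h * k3, *args)
    return y + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

def solve_sir(y0, beta, gamma, t_end, h):
    """Integrate the SIR model with RK4; returns time points and states."""
    steps = int(round(t_end / h))
    ts = np.linspace(0.0, steps * h, steps + 1)
    ys = np.empty((steps + 1, 3))
    ys[0] = y0
    for i in range(steps):
        ys[i + 1] = rk4_step(sir_rhs, ts[i], ys[i], h, beta, gamma)
    return ts, ys
```

For example, `solve_sir([0.99, 0.01, 0.0], beta=0.5, gamma=0.2, t_end=50.0, h=0.1)` conserves the total population S + I + R, a useful sanity check when comparing against the iterative series solutions.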
The aim of this study is to provide an overview of various models of drug diffusion into and within the human body over a sustained period. Emphasis is placed on mathematical compartment models using a fractional-derivative (Caputo) approach to investigate changes in sustained drug concentration in different compartments of the human body through the oral route or the intravenous route. The law of mass action, first-order kinetics, and Fick's perfusion principle were used to develop mathematical compartment models representing sustained drug diffusion throughout the human body. To adequately predict sustained drug diffusion into the various compartments of the human body, a fractional-derivative (Caputo) model is considered to investiga
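As background for the compartment models mentioned above, here is a sketch of the classical (integer-order) one-compartment solutions for the two administration routes; these are the standard pharmacokinetic formulas, which correspond to the α = 1 special case of the Caputo model, where the exponential would be replaced by a Mittag-Leffler function. All parameter names are generic, not taken from the study:

```python
import math

def one_compartment_iv(dose, volume, k_e, t):
    """IV bolus with first-order elimination: C(t) = (dose/volume) * exp(-k_e * t).
    In the Caputo fractional model this exponential becomes E_alpha(-k_e * t**alpha)."""
    return (dose / volume) * math.exp(-k_e * t)

def one_compartment_oral(dose, volume, k_a, k_e, bioavail, t):
    """Oral route with first-order absorption (Bateman function), k_a != k_e:
    C(t) = F*dose*k_a / (V*(k_a - k_e)) * (exp(-k_e*t) - exp(-k_a*t))."""
    scale = (bioavail * dose * k_a) / (volume * (k_a - k_e))
    return scale * (math.exp(-k_e * t) - math.exp(-k_a * t))
```

The oral-route concentration starts at zero, rises during absorption, and then decays, while the IV bolus starts at its maximum dose/volume and decays monotonically.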
The aim of the research is to identify the cognitive style (rigidity-flexibility) of third-stage students in the College of Physical Education and Sports Sciences at the University of Baghdad, to determine the effect of using the McCarthy model in learning some gymnastics skills, and to identify which groups learned the skills best. An experimental design with equivalent groups and pre- and post-tests was used. The research population was identified as third-stage students in the academic year (2020-2021); two divisions were randomly selected, after which the cognitive-style measure was distributed to the sample, so that (32) students were distributed into four groups, and which the pre te
In the current worldwide health crisis produced by coronavirus disease (COVID-19), researchers and medical specialists began looking for new ways to tackle the epidemic. According to recent studies, machine learning (ML) has been effectively deployed in the health sector. Medical imaging sources (radiography and computed tomography) have aided the development of artificial intelligence (AI) strategies to tackle the coronavirus outbreak. As a result, a classical machine learning approach for coronavirus detection from computerized tomography (CT) images was developed. In this study, a convolutional neural network (CNN) model is used for feature extraction and a support vector machine (SVM) for the classification of axial
Skill tests in most sport activities monitor many of the dynamic behaviours that occur in playing situations, helping experts in the sport field to evaluate performance and propose appropriate solutions. Soccer is one of the games studied in the third stage of the college and includes the skills of dribbling, passing, and shooting, which help players execute game plans. The researchers noticed that there was no test measuring the skills of the game at the beginning of the first semester, especially in the soccer course in the College of Physical Education. The research problem was addressed by answering the question: is there a test connecting one or more skills that measures the ability of students to execute plans in soccer? The conclusion was the bui
The concept of quality in the auditing profession is at the top of the concerns of the international business community and international institutions, particularly following the several failures and financial hardships suffered by major companies in the recent collapse of money markets in some countries of the world, and the fear of their recurrence in the future. An observer of local and international rules and standards (or principles) finds that these carry implications with direct or indirect effects on the performance of the services of the accountant and auditor, who should upgrade their professional performance in these services to a high level of quality so as to be in line with the requirements, principles
Due to the lack of statistical research studying (p) exogenous input variables that contribute to a time-series phenomenon as a cause, yielding (q) output variables as a result, a conceptual framework is formed similar to classical linear regression, which studies the relationship between a dependent variable and explanatory variables. This highlights the importance of providing a full analysis of this important kind of phenomenon, applied here to consumer price inflation in Iraq. Several variables with a direct influence on the phenomenon were taken and analyzed after treating the problem of outliers in the observations by the (EM) approach, and expanding the sample size (n=36) to
Abstract The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a polynomial model of low degree was used to address the boundary problem in wavelet shrinkage, in addition to using flexible, level-dependent threshold values in the case of correlated errors, which treat the coefficients at each level separately, unlike global threshold values that deal with all levels simultaneously, such as the (VisuShrink), (False Discovery Rate), (Improvement Thresholding), and (SureShrink) methods. The study was conducted on real monthly data representing the rates of theft crimes f
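For context on the thresholding methods named above, the following is a minimal sketch of the standard soft-thresholding rule and the universal (VisuShrink-style) threshold with the usual MAD noise estimate; the level-dependent thresholds and boundary-correcting polynomial of the study itself are not reproduced here:

```python
import numpy as np

def soft_threshold(coeffs, t):
    """Soft thresholding: shrink wavelet detail coefficients toward zero,
    setting those with |coefficient| <= t exactly to zero."""
    coeffs = np.asarray(coeffs, dtype=float)
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

def universal_threshold(detail_coeffs, n):
    """Universal (VisuShrink-style) threshold sigma * sqrt(2 * log n),
    with sigma estimated from the median absolute deviation (MAD)
    of the finest-level detail coefficients."""
    sigma = np.median(np.abs(detail_coeffs)) / 0.6745
    return sigma * np.sqrt(2.0 * np.log(n))
```

A level-dependent scheme, as used in the correlated-errors setting, would compute a separate threshold from the coefficients at each resolution level instead of one global value.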
In this paper, a new high-performance lossy compression technique based on the DCT is proposed. The image is partitioned into blocks of size NxN (where N is a multiple of 2), and each block is categorized as high frequency (uncorrelated) or low frequency (correlated) according to its spatial detail. This is done by calculating the energy of the block as the absolute sum of the differential pulse code modulation (DPCM) differences between pixels, and comparing it against a specified threshold value to determine the level of correlation. The image blocks are scanned and converted into 1D vectors using a horizontal scan order. Then, a 1D-DCT is applied to each vector to produce transform coefficients. The transformed coefficients will be qua
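The block-classification step described above can be sketched as follows; this assumes the DPCM differences are taken between horizontally adjacent pixels, and the threshold value is an illustrative free parameter, since the paper's specific choice is not given in the excerpt:

```python
import numpy as np

def block_energy(block):
    """Energy of a block: the absolute sum of DPCM differences
    between horizontally adjacent pixels."""
    diffs = np.diff(block.astype(np.int64), axis=1)
    return int(np.abs(diffs).sum())

def classify_blocks(image, n, threshold):
    """Partition an image into n x n blocks and label each block
    'high' (uncorrelated, detailed) or 'low' (correlated, smooth)
    by comparing its energy against the threshold."""
    h, w = image.shape
    labels = {}
    for r in range(0, h - n + 1, n):
        for c in range(0, w - n + 1, n):
            block = image[r:r + n, c:c + n]
            labels[(r, c)] = "high" if block_energy(block) > threshold else "low"
    return labels
```

After classification, each block would be flattened in horizontal scan order into a 1D vector and passed to a 1D-DCT, with the quantization step then adapted to the block's category.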
Encryption of data is the translation of data into another shape or symbol so that only people with access to the secret key or password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values within an image are dispersed across more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying the "CAST-128" and
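The entropy measure described above is the standard Shannon entropy of the gray-level histogram, H = -Σ p·log2(p), which can be sketched as:

```python
import numpy as np

def image_entropy(pixels):
    """Shannon entropy in bits per pixel, computed from the gray-level
    histogram: H = -sum(p * log2(p)). The more evenly the pixel values
    spread across the gray levels, the higher the entropy."""
    values, counts = np.unique(np.asarray(pixels).ravel(), return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

A constant image has entropy 0, while an 8-bit image with a uniform histogram reaches the maximum of 8 bits per pixel; well-encrypted frames are expected to score near that maximum, which is the basis of the comparison in this research.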