Artificial intelligence algorithms have been applied in many scientific fields in recent years. We propose employing the flower pollination algorithm in the environmental field to find the best estimate of the semi-parametric regression function when measurement errors are present in both the explanatory variables and the dependent variable; such errors appear frequently in chemistry, the biological sciences, medicine, and epidemiological studies, where exact measurement is rarely possible. The regression function of the semi-parametric model is estimated by combining a parametric component and a non-parametric component: the parametric component is estimated with instrumental-variable methods (the Wald, Bartlett, and Durbin methods), while the non-parametric component is estimated with kernel smoothing (Nadaraya-Watson), K-nearest-neighbour smoothing, and median smoothing. The flower pollination algorithm was employed to build the ecological model and estimate the semi-parametric regression function with measurement errors in the explanatory and dependent variables; the resulting models were then compared to choose the best model for the environmental measurement-error setting, with the comparison carried out using the mean square error (MSE).
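As an illustration of the non-parametric component, the sketch below implements a Nadaraya-Watson kernel estimator with a Gaussian kernel; the bandwidth `h` and the toy data are assumptions for demonstration and do not reproduce the study's environmental data.

```python
import numpy as np

def nadaraya_watson(x_query, x, y, h=0.5):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    x_query : points at which to evaluate the estimate
    x, y    : observed explanatory and response values
    h       : bandwidth (assumed value for illustration)
    """
    x_query = np.atleast_1d(x_query)
    # Gaussian kernel weights for every (query, observation) pair
    u = (x_query[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u**2)
    # Weighted average of the responses at each query point
    return (w @ y) / w.sum(axis=1)

# Toy example (synthetic data, not the study's measurements)
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)   # noisy response
grid = np.linspace(0, 10, 50)
y_hat = nadaraya_watson(grid, x, y, h=0.4)
mse = np.mean((np.sin(grid) - y_hat) ** 2)           # MSE, the comparison criterion named above
print(f"MSE on the grid: {mse:.4f}")
```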
This study was performed on 50 serum specimens from patients with type 2 diabetes; 50 specimens from healthy subjects were investigated as a control group. The LAP activity was (560.46 ± 10.504) I.U/L in patients and (10.58 ± 4.39) I.U/L in healthy subjects. The results of the study reveal that leucine aminopeptidase (LAP) activity in the serum of type 2 diabetes patients shows a highly significant increase (p < 0.001) compared to healthy subjects. In addition, leucine amide was prepared as a substrate for LAP and characterised by its melting point and FTIR spectra.
The aerodynamic characteristics of a forward-swept-wing aircraft have been studied theoretically, and an experimental investigation of the wake field generated by this configuration has been carried out. A low-order panel method with the Dirichlet boundary condition has been used to solve the case of steady, inviscid, compressible flow. Two different panel-method techniques have been employed: the source-doublet method and the doublet method. The thickness of the various components was considered in the study. The Prandtl-Glauert similarity rule has been used to account for compressibility effects. Experimentally, a model was manufactured from wood with a body length of 290 mm and a main wing span of 204 mm. The primary objective of the …
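For reference, the Prandtl-Glauert similarity rule mentioned above corrects an incompressible pressure (or lift) coefficient for subsonic compressibility; the notation below is standard and not taken verbatim from the paper.

```latex
C_p = \frac{C_{p,0}}{\sqrt{1 - M_\infty^{2}}}, \qquad
C_L = \frac{C_{L,0}}{\sqrt{1 - M_\infty^{2}}}
```

where $C_{p,0}$ and $C_{L,0}$ are the incompressible coefficients and $M_\infty$ is the free-stream Mach number.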
Aspect categorisation, and its utmost importance in the field of Aspect-based Sentiment Analysis (ABSA), has encouraged researchers to improve topic-model performance for grouping aspects into categories. In general, most current methods implement parametric models that require a pre-determined number of topics beforehand. However, this is not efficiently undertaken with unannotated text data, as such data lack any class label. Therefore, the current work presents a novel non-parametric model that draws the number of topics from the semantic association between opinion targets (i.e., aspects) and their respective expressed sentiments. The model incorporates Semantic Association Rules (SAR) into the Hierarchical Dirichlet Process …
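A minimal sketch of a non-parametric topic model of the kind described, using gensim's Hierarchical Dirichlet Process implementation; the toy corpus and the choice of gensim are assumptions for illustration, and the SAR component described in the abstract is not reproduced here.

```python
from gensim.corpora import Dictionary
from gensim.models import HdpModel

# Toy corpus of tokenised reviews (illustrative only)
docs = [
    ["battery", "life", "great", "screen", "bright"],
    ["battery", "drains", "fast", "poor", "charging"],
    ["camera", "photos", "sharp", "low", "light"],
    ["screen", "cracked", "fragile", "camera", "blurry"],
]

dictionary = Dictionary(docs)
corpus = [dictionary.doc2bow(doc) for doc in docs]

# HDP infers the number of topics from the data instead of fixing it in advance
hdp = HdpModel(corpus, id2word=dictionary, random_state=0)
for topic_id, words in hdp.show_topics(num_topics=5, num_words=4, formatted=False):
    print(topic_id, [w for w, _ in words])
```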
A- The research problem: the garments industry as a whole does not rely on a single system for clothing sizes, and the operating companies consider it implausible for sizes to be unified and consistent across all companies. The current sizes in the domestic Iraqi markets are not suitable for some females; on the other hand, the Iraqi industry suffers from the lack of a modern sizing standard for Iraqi female bodies.
B- The significance of the research: it lies in studying the diversity of human body sizes and naming them so as to reflect the desires and requirements of the consumer, in trying to find a method to meet those expectations, and in raising the level of the garments industry …
OpenStreetMap (OSM) is the world’s biggest publicly licensed geographic data collection. Because OSM is rapidly being adopted in a wide range of applications, researchers have focused their efforts on determining its quality. The quality of OSM buildings data is still ambiguous owing to its limitations, and few researchers have evaluated it, partly because authoritative reference data are often not obtainable. The focus of this research is to analyse and assess the accuracy of OSM buildings, including completeness and positional accuracy. Two study areas in Baghdad city, Iraq, have been investigated: Al-Rasheed and Al-Karrada. The OSM data evaluation process involved identifying the corresponding …
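As a simple illustration of the completeness measure referred to above, the sketch below compares OSM building counts and footprint areas against a reference dataset; the GeoPackage file names and layer contents are hypothetical, and geopandas is assumed as the tooling rather than stated in the paper.

```python
import geopandas as gpd

# Hypothetical input layers; file names are placeholders, not from the study
osm = gpd.read_file("osm_buildings_alkarrada.gpkg")
ref = gpd.read_file("reference_buildings_alkarrada.gpkg").to_crs(osm.crs)

# Count-based and area-based completeness of OSM relative to the reference data
count_completeness = len(osm) / len(ref)
area_completeness = osm.geometry.area.sum() / ref.geometry.area.sum()

print(f"Count completeness: {count_completeness:.2%}")
print(f"Area completeness:  {area_completeness:.2%}")
```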
Permeability data are of major importance and must be handled carefully in all reservoir simulation studies. Their importance increases in mature oil and gas fields owing to their sensitivity to the requirements of some specific improved-recovery methods. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, because of the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, which are conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of data loss and poorly consolidated formations, or in cases …
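The paper's own correlation is not reproduced in this excerpt; as a generic illustration of converting gas (air) permeability to an equivalent liquid permeability, the sketch below applies the classical Klinkenberg correction, with the slip factor `b` and the sample data assumed for demonstration.

```python
import numpy as np

def klinkenberg_liquid_perm(k_air_md, p_mean_psi, b_psi=15.0):
    """Classical Klinkenberg correction (illustrative, not the paper's correlation).

    k_air_md   : measured air permeability, mD
    p_mean_psi : mean flowing pressure of the gas measurement, psi
    b_psi      : gas slippage factor, psi (assumed value for demonstration)
    """
    # k_air = k_liquid * (1 + b / p_mean)  =>  k_liquid = k_air / (1 + b / p_mean)
    return k_air_md / (1.0 + b_psi / p_mean_psi)

# Example: three core plugs measured at different mean pressures (synthetic values)
k_air = np.array([120.0, 45.0, 8.5])      # mD
p_mean = np.array([30.0, 25.0, 20.0])     # psi
print(klinkenberg_liquid_perm(k_air, p_mean))
```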
Poiseuille's equation was used to give an approximate value of bladder pressure for 25 healthy people, and the results were compared with those obtained by the routine indirect method using a catheter. This method was found to be cheap, harmless, and easy.
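For reference, the Hagen-Poiseuille relation presumably referred to (the abstract's transliteration is ambiguous) links pressure drop to laminar flow through a tube:

```latex
\Delta P = \frac{8 \mu L Q}{\pi r^{4}}
```

where $\mu$ is the fluid viscosity, $L$ the tube (catheter) length, $Q$ the volumetric flow rate, and $r$ the tube radius.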
Fossil fuel consumption is increasing globally, as it represents the main source of energy around the world, and heavy oil resources exceed light ones; different techniques are therefore used to reduce the viscosity and increase the mobility of heavy crude oil. This study focuses on experimental tests and modelling, with a Back Feed Forward Artificial Neural Network (BFF-ANN), of the dilution technique for reducing the viscosity of a heavy oil collected from the southern Iraq oil fields, using organic diluents at different weight percentages (5, 10 and 20 wt.%) of n-heptane, toluene, and a mixture of different ratios …
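A minimal sketch of the kind of feed-forward network described, using scikit-learn's MLPRegressor to map diluent type and concentration to viscosity; the architecture, feature encoding, and synthetic data are assumptions for illustration, not the paper's trained model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic training data (illustrative only):
# features = [wt% diluent, fraction n-heptane, fraction toluene], target = viscosity (cP)
X = np.array([
    [5, 1.0, 0.0], [10, 1.0, 0.0], [20, 1.0, 0.0],
    [5, 0.0, 1.0], [10, 0.0, 1.0], [20, 0.0, 1.0],
    [5, 0.5, 0.5], [10, 0.5, 0.5], [20, 0.5, 0.5],
])
y = np.array([950.0, 620.0, 280.0, 900.0, 580.0, 250.0, 920.0, 600.0, 265.0])

# Small fully connected network; layer sizes are an assumption for illustration
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0),
)
model.fit(X, y)
print(model.predict([[15, 1.0, 0.0]]))   # predicted viscosity at 15 wt.% n-heptane
```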
The Weibull distribution is considered one of the Type-I Generalized Extreme Value (GEV) distributions, and it plays a crucial role in modelling extreme events in various fields such as hydrology, finance, and the environmental sciences. Bayesian methods play a strong, decisive role in estimating the parameters of the GEV distribution because of their ability to incorporate prior knowledge and to handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and the linear exponential (LINEX) loss functions; they were adopted and compared by Monte Carlo simulation. The performance of these methods is assessed in terms of their accuracy and computational efficiency in estimating …
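For reference, the two loss functions named above in standard notation (the shape parameter of the LINEX loss is denoted $a$ here; the paper's own notation is not given in this excerpt), together with the corresponding Bayes estimators:

```latex
L_{\mathrm{SE}}(\hat{\theta},\theta) = (\hat{\theta}-\theta)^{2}
\quad\Longrightarrow\quad
\hat{\theta}_{\mathrm{SE}} = E[\theta \mid \text{data}],
\\[4pt]
L_{\mathrm{LINEX}}(\hat{\theta},\theta) = e^{a(\hat{\theta}-\theta)} - a(\hat{\theta}-\theta) - 1
\quad\Longrightarrow\quad
\hat{\theta}_{\mathrm{LINEX}} = -\frac{1}{a}\ln E\!\left[e^{-a\theta} \mid \text{data}\right].
```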
Abstract
In this study, we compare the autoregressive approximation methods (the Yule-Walker equations, least squares, least squares (forward-backward), and Burg's (geometric and harmonic) methods) to determine the optimal approximation to time series generated from a first-order non-invertible moving average process and a fractionally integrated noise process, with several values of d (d = 0.15, 0.25, 0.35, 0.45) and different sample sizes (small, medium, large) for the two processes. We rely on the figure-of-merit function proposed by Shibata (1980) to determine the theoretical optimal order according to the minimum …
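A minimal sketch of fitting autoregressive approximations to a simulated first-order moving average series using two of the methods named above (Yule-Walker and Burg) via statsmodels; the AR order and simulation settings are assumptions for illustration, and the Shibata figure of merit is not reproduced here.

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker, burg

# Simulate a first-order MA process x_t = e_t + theta * e_{t-1} (synthetic settings)
rng = np.random.default_rng(1)
theta, n = 0.9, 500
e = rng.normal(size=n + 1)
x = e[1:] + theta * e[:-1]

order = 5  # AR approximation order (assumed for illustration)
rho_yw, sigma_yw = yule_walker(x, order=order)   # Yule-Walker estimates
rho_burg, sigma2_burg = burg(x, order=order)     # Burg estimates

print("Yule-Walker AR coefficients:", np.round(rho_yw, 3))
print("Burg        AR coefficients:", np.round(rho_burg, 3))
```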