This paper presents a new algorithm in the important research field of semantic word similarity estimation. A new feature-based algorithm is proposed for measuring word semantic similarity for the Arabic language, a highly systematic language whose words exhibit elegant and rigorous logic. The semantic similarity score between two Arabic words is calculated as a function of their common and total taxonomical features. An Arabic knowledge source is employed for extracting the taxonomical features of a word as the set of all concepts that subsume the concepts containing the compared words. Previously developed Arabic word benchmark datasets are used for optimizing and evaluating the proposed algorithm. In this paper, the performance of the new feature-based algorithm is compared against the performance of seven ontology-based algorithms adapted to Arabic. The results of the evaluation and comparison experiments show that the proposed algorithm outperforms the adapted word similarity algorithms on the Arabic word benchmark dataset. The proposed algorithm will be included in AWN-Similarity, which is free open-source software for Arabic.
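The abstract describes the score as a function of the common and total taxonomical features of the two words, where a word's features are all concepts that subsume the concepts containing it. A minimal sketch of that idea is given below, assuming a Jaccard-style ratio of common to total features and a toy taxonomy with hypothetical identifiers (not real AWN synsets); the paper's exact scoring function and its extraction from the Arabic knowledge source may differ.

```python
def ancestors(concept, parents):
    """Return all concepts that subsume `concept` (including itself)."""
    seen, stack = set(), [concept]
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(parents.get(c, []))
    return seen

def feature_similarity(word1, word2, word2concepts, parents):
    """Assumed form: ratio of common to total taxonomical features."""
    feats1 = set().union(*(ancestors(c, parents) for c in word2concepts[word1]))
    feats2 = set().union(*(ancestors(c, parents) for c in word2concepts[word2]))
    common, total = feats1 & feats2, feats1 | feats2
    return len(common) / len(total) if total else 0.0

# Toy taxonomy fragment (hypothetical identifiers).
parents = {"horse": ["equine"], "equine": ["mammal"], "mammal": ["animal"],
           "cat": ["feline"], "feline": ["mammal"]}
word2concepts = {"حصان": ["horse"], "قط": ["cat"]}
print(feature_similarity("حصان", "قط", word2concepts, parents))  # 2 shared of 6 features ≈ 0.33
```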
Generally, statistical methods are used in various fields of science, especially in research, where statistical analysis is carried out using several techniques chosen according to the nature of the study and its objectives. One of these techniques is building statistical models, which is done through regression models. This technique is considered one of the most important statistical methods for studying the relationship between a dependent variable, also called the response variable, and other variables, called covariates. This research describes the estimation of the partial linear regression model, as well as the estimation of values that are missing at random (MAR). Regarding the …
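The partially linear regression model takes the form y = Xβ + g(t) + ε, combining a parametric part Xβ with an unknown smooth function g of a covariate t. As a hedged illustration only (not the paper's estimator, which also treats responses missing at random, e.g. by imputation before this step), the sketch below implements a Speckman-type two-step estimate: smooth out t with a Nadaraya-Watson smoother, regress the partial residuals to obtain β, then smooth the remaining residuals to recover g.

```python
import numpy as np

def nw_smoother(t, bandwidth=0.1):
    """Nadaraya-Watson smoother matrix S for a scalar covariate t (Gaussian kernel)."""
    d = (t[:, None] - t[None, :]) / bandwidth
    K = np.exp(-0.5 * d ** 2)
    return K / K.sum(axis=1, keepdims=True)

def fit_partially_linear(y, X, t, bandwidth=0.1):
    """Speckman-type estimate of y = X @ beta + g(t) + eps."""
    S = nw_smoother(t, bandwidth)
    X_tilde = X - S @ X            # partial out the nonparametric component
    y_tilde = y - S @ y
    beta, *_ = np.linalg.lstsq(X_tilde, y_tilde, rcond=None)
    g_hat = S @ (y - X @ beta)     # smooth the parametric residuals to estimate g
    return beta, g_hat

# Toy data: beta_true = [1.5, -2.0], g(t) = sin(2*pi*t)
rng = np.random.default_rng(0)
n = 300
t = rng.uniform(0, 1, n)
X = rng.normal(size=(n, 2))
y = X @ np.array([1.5, -2.0]) + np.sin(2 * np.pi * t) + rng.normal(scale=0.2, size=n)

beta_hat, g_hat = fit_partially_linear(y, X, t)
print(beta_hat)  # should be close to [1.5, -2.0]
```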
Microencapsulated paraffin wax, acting as the core material of a phase change material covered by a polymer, was prepared using a rapid physico-chemical, low-energy (green) method. A prepolymer of condensed melamine-formaldehyde resin was solidified gradually by heating and surrounded the paraffin wax to form microcapsules. The diameter of the prepared capsules was about 170-220 micron and was proportional to the prepolymer temperature, while thermal analysis showed the best enthalpy value (ΔH) of 12 J/g when the prepolymer temperature was 60 °C.
The researcher addressed the problem of delays faced by researchers who must all wait for experts to evaluate their instruments, soliciting their views to establish face validity, the essential first step in building any standard. Differences of opinion among the experts about the items, whatever their specializations, lead to confusion: should these items be retained or deleted? Should the experts' views be ignored and the researcher's own opinion kept? Or should the decision be made in agreement with the supervisor, if the researcher is a student? This is especially difficult when the construct being built rests on new, modern concepts.
Therefore, the researcher sought to find a solution to this problem by conducting an experiment to test the steps of building standards.
Calculating similarities between texts written in one language or in multiple languages is still one of the most important challenges facing natural language processing. This work presents several approaches used for text similarity. The proposed system finds the similarity between two Arabic texts using hybrid similarity measures: a semantic similarity measure, the cosine similarity measure, and an N-gram measure (using the Dice similarity coefficient). In the proposed system, an Arabic SemanticNet is designed to store the keywords of a specific field (computer science); through this network, the semantic similarity between words can be found according to specific equations. Cosine and N-gram similarity measures are used in order to …
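As a rough illustration of how such a hybrid score could be assembled, the sketch below combines a bag-of-words cosine similarity with a character-bigram Dice similarity and a pluggable semantic measure. The weights and the combination rule are assumptions; in practice the semantic component would query the Arabic SemanticNet described above, which is not reproduced here.

```python
from collections import Counter
import math

def cosine_sim(a, b):
    """Cosine similarity over word-frequency vectors."""
    va, vb = Counter(a.split()), Counter(b.split())
    dot = sum(va[w] * vb[w] for w in set(va) & set(vb))
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def ngram_dice(a, b, n=2):
    """Dice coefficient over character n-grams."""
    ga = {a[i:i + n] for i in range(len(a) - n + 1)}
    gb = {b[i:i + n] for i in range(len(b) - n + 1)}
    return 2 * len(ga & gb) / (len(ga) + len(gb)) if ga and gb else 0.0

def hybrid_similarity(a, b, w_sem=0.4, w_cos=0.3, w_ngram=0.3, semantic=None):
    """Weighted combination (weights are illustrative, not the paper's)."""
    sem = semantic(a, b) if semantic else 0.0   # would query the Arabic SemanticNet
    return w_sem * sem + w_cos * cosine_sim(a, b) + w_ngram * ngram_dice(a, b)

a = "الحاسوب جهاز إلكتروني لمعالجة البيانات"
b = "الحاسب الآلي جهاز لمعالجة البيانات"
print(hybrid_similarity(a, b))
```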
Economic units operating in the Iraqi industrial sector face many pressures when seeking to measure and evaluate their performance because of the variables of today's business environment. This situation calls for a methodology that can be adopted to evaluate performance more holistically, rather than being limited to traditional measures that are no longer sufficient to keep pace with the rapid changes in today's corporate environment. It requires that performance measures be derived from the unit's strategy and be commensurate with the specific characteristics of the Iraqi environment. The research attempts to discuss the development of performance measurement indicators and systems to suit business strategies and the directions of change.
The objective of an Optimal Power Flow (OPF) algorithm is to find a steady-state operating point which minimizes generation cost, losses, etc., while maintaining acceptable system performance in terms of limits on generators' real and reactive powers, line flow limits, etc. The OPF solution includes an objective function; a common objective function concerns the active power generation cost. A linear programming method is proposed to solve the OPF problem. The Linear Programming (LP) approach transforms the nonlinear optimization problem into an iterative algorithm that in each iteration solves a linear optimization problem resulting from linearizing both the objective function and the constraints. A computer program, written in the MATLAB environment, …
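To make the linearized subproblem concrete, the sketch below solves one LP iteration of a simplified, DC-style dispatch: minimize linear generation costs subject to a power-balance equality and generator limits. The system data, cost coefficients, and the use of SciPy rather than MATLAB are assumptions for illustration; the paper's algorithm re-linearizes the objective and constraints around the current operating point and iterates.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical 3-generator system (illustrative data only).
cost = np.array([20.0, 25.0, 30.0])    # linearized cost coefficients, $/MWh
p_min = np.array([10.0, 10.0, 5.0])    # MW lower limits
p_max = np.array([100.0, 80.0, 50.0])  # MW upper limits
demand = 150.0                         # MW total load (losses neglected in this sketch)

# Equality constraint: total generation equals demand.
A_eq = np.ones((1, 3))
b_eq = [demand]

res = linprog(cost, A_eq=A_eq, b_eq=b_eq,
              bounds=list(zip(p_min, p_max)), method="highs")
print("dispatch (MW):", res.x, " cost ($/h):", res.fun)
```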
A sensitive turbidimetric method at 0-180° was used for the determination of mebeverine in drugs, using two solar cells and six sources with CFIA. The method was based on the formation of an ion pair, a pinkish banana-colored precipitate, by the reaction of mebeverine hydrochloride with phosphotungstic acid. Turbidity was measured via the reflection of incident light that collides with the surface particles of the precipitate at 0-180°. All variables were optimized. The linearity range of mebeverine hydrochloride was 0.05-12.5 mmol L-1, the limit of detection (S/N = 3) (3SB) was 521.92 ng/sample, depending on dilution for the minimum concentration, with correlation coefficient r = 0.9966, while the R.S.D.% …
In the current worldwide health crisis produced by coronavirus disease (COVID-19), researchers and medical specialists began looking for new ways to tackle the epidemic. According to recent studies, Machine Learning (ML) has been effectively deployed in the health sector. Medical imaging sources (radiography and computed tomography) have aided in the development of artificial intelligence (AI) strategies to tackle the coronavirus outbreak. As a result, a classical machine learning approach for coronavirus detection from Computerized Tomography (CT) images was developed. In this study, a convolutional neural network (CNN) model is used for feature extraction and a support vector machine (SVM) for the classification of axial …
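A minimal sketch of the CNN-features-into-SVM pipeline described above is shown below. The backbone (a pretrained ResNet-18), the input size, and the placeholder tensors standing in for preprocessed axial CT slices are all assumptions made for illustration; the study's actual architecture, preprocessing, and data are not given in this excerpt.

```python
import torch
import torchvision.models as models
from sklearn.svm import SVC

# Pretrained CNN used as a fixed feature extractor (backbone choice is an assumption).
cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
cnn.fc = torch.nn.Identity()   # drop the classifier head -> 512-d feature vectors
cnn.eval()

# Placeholder tensors standing in for preprocessed axial CT slices (3x224x224);
# real slices would be loaded and normalised with torchvision transforms.
images = torch.randn(16, 3, 224, 224)
labels = [0, 1] * 8            # toy labels: 0 = non-COVID, 1 = COVID

with torch.no_grad():
    features = cnn(images).numpy()

# Classical classifier on the deep features.
svm = SVC(kernel="rbf", C=1.0)
svm.fit(features, labels)
print(svm.predict(features[:4]))
```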
This paper proposes two hybrid feature subset selection approaches based on the combination (union or intersection) of both supervised and unsupervised filter approaches before using a wrapper, aiming to obtain low-dimensional features with high accuracy and interpretability and low time consumption. Experiments with the proposed hybrid approaches have been conducted on seven high-dimensional feature datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA), and K-nearest neighbour (KNN). Experimental results have demonstrated the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional space in terms of the number of selected features and the time spent.
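The sketch below illustrates the filter-then-wrapper pipeline on synthetic data: a supervised filter (ANOVA F-score) and an unsupervised filter (variance threshold) are combined by intersection (or union), and the surviving candidate pool is passed to a sequential forward selection wrapper around an SVM. The specific filter criteria, the value of k, and the wrapper settings are assumptions for illustration, not the paper's exact configuration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import (SelectKBest, f_classif,
                                        VarianceThreshold, SequentialFeatureSelector)
from sklearn.svm import SVC

# Synthetic stand-in for a high-dimensional dataset.
X, y = make_classification(n_samples=200, n_features=50, n_informative=8, random_state=0)

# Supervised filter: top-k features by ANOVA F-score.
sup = SelectKBest(f_classif, k=20).fit(X, y)
sup_idx = set(np.flatnonzero(sup.get_support()))

# Unsupervised filter: keep above-median-variance features (stand-in criterion).
var = VarianceThreshold(threshold=np.median(X.var(axis=0))).fit(X)
uns_idx = set(np.flatnonzero(var.get_support()))

# Combination step: intersection variant (use sup_idx | uns_idx for the union variant).
candidates = sorted(sup_idx & uns_idx)

# Wrapper stage: sequential forward selection with an SVM on the reduced pool.
sfs = SequentialFeatureSelector(SVC(kernel="linear"),
                                n_features_to_select=5, direction="forward")
sfs.fit(X[:, candidates], y)
selected = [candidates[i] for i in np.flatnonzero(sfs.get_support())]
print("final subset:", selected)
```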