Processing of missing values in survey data using Principal Component Analysis and probabilistic Principal Component Analysis methods

The idea of carrying out research on incomplete data arose from the circumstances of our country and the horrors of war, which led to the loss of much important data in all aspects of economic, natural, health, and scientific life. The reasons for missingness differ: some lie outside the control of those concerned, while others are deliberate, planned because of cost, risk, or a lack of means of inspection. The missing data in this study were treated using Principal Component Analysis (PCA) and probabilistic Principal Component Analysis (PPCA) with simulation. The variables considered were child health and the variables affecting it: breastfeeding and maternal health. The maternal health variable contained missing values and was processed in MATLAB R2015a using the PCA and PPCA methods; the two methods were then compared using the root mean square error (RMSE). The best method for treating the missing values was PCA.
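The general workflow the abstract describes (fill the gaps with a low-rank PCA reconstruction, then score the imputation with RMSE) can be sketched as follows. This is a minimal Python/NumPy illustration, not the authors' MATLAB code; the iterative rank-k SVD scheme and all names here are assumptions.

```python
import numpy as np

def pca_impute(X, n_components=2, n_iter=50):
    """Iteratively impute missing entries (NaN) with a rank-k PCA reconstruction."""
    X = X.astype(float)
    mask = np.isnan(X)
    # start from column means
    col_means = np.nanmean(X, axis=0)
    X_filled = np.where(mask, col_means, X)
    for _ in range(n_iter):
        mu = X_filled.mean(axis=0)
        Xc = X_filled - mu
        # rank-k reconstruction via truncated SVD
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        Xc_hat = U[:, :n_components] @ np.diag(s[:n_components]) @ Vt[:n_components]
        X_hat = Xc_hat + mu
        X_filled[mask] = X_hat[mask]  # only overwrite the missing cells
    return X_filled

def rmse(true_vals, imputed_vals):
    """Root mean square error between true and imputed entries."""
    return np.sqrt(np.mean((true_vals - imputed_vals) ** 2))
```

On exactly low-rank data this iteration recovers the masked entries almost perfectly; on real survey data the achievable RMSE depends on how well a few components describe the variables.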

Publication Date
Wed Jun 30 2021
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Bennett's inequality and regression in determining the optimum sample size for estimating the Net Reclassification Index (NRI) using simulation

Researchers have shown increasing interest in recent years in determining the optimum sample size needed to obtain sufficient accuracy in estimation and high-precision parameters when evaluating a large number of diagnostic tests at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is estimated at the sample size given by each method in high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data…
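As a hedged sketch of how a Bennett-type bound yields a sample size: one common two-sided form of Bennett's inequality, for observations with variance sigma^2 and range bound |X - mu| <= b, gives P(|mean_n - mu| >= eps) <= 2*exp(-n*(sigma^2/b^2)*h(b*eps/sigma^2)) with h(u) = (1+u)ln(1+u) - u. Inverting for n at a target risk level alpha gives a closed-form minimum sample size. The function below implements that inversion; it is a generic illustration, not the paper's procedure.

```python
import math

def bennett_sample_size(sigma2, b, eps, alpha=0.05):
    """Smallest n such that this form of Bennett's inequality guarantees
    P(|sample mean - true mean| >= eps) <= alpha, for observations with
    variance sigma2 and |X - mu| <= b."""
    u = b * eps / sigma2
    h = (1 + u) * math.log(1 + u) - u          # Bennett's h function
    n = math.log(2 / alpha) * b**2 / (sigma2 * h)
    return math.ceil(n)
```

Tightening the precision eps increases the required n, as expected.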

Publication Date
Fri Mar 01 2019
Journal Name
Al-khwarizmi Engineering Journal
Reverse Engineering Representation Using an Image Processing Modification

In the reverse engineering approach, a massive amount of point data is gathered during data acquisition, which leads to larger file sizes and longer data-handling times. In addition, fitting surfaces to these data points is time-consuming and demands particular skills. In the present work, a method for obtaining the control points of any profile is presented. Several image-modification steps are explained using the SolidWorks program, and a parametric equation of the proposed profile is derived using the Bezier technique with the adopted control points. Finally, the proposed profile was machined on a 3-axis CNC milling machine, and a comparison of dimensions was carried out betwe…
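For reference, the parametric Bezier form the abstract mentions writes a profile as B(t) = sum_i C(n,i) (1-t)^(n-i) t^i P_i for control points P_i and t in [0, 1]. A minimal Python sketch of that evaluation (not the paper's SolidWorks workflow, and with illustrative names) is:

```python
import numpy as np
from math import comb

def bezier_curve(control_points, n_samples=100):
    """Evaluate B(t) = sum_i C(n,i) * (1-t)^(n-i) * t^i * P_i on a grid of t."""
    P = np.asarray(control_points, dtype=float)
    n = len(P) - 1                              # curve degree
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    # Bernstein basis weighted sum of the control points
    return sum(comb(n, i) * (1 - t) ** (n - i) * t**i * P[i] for i in range(n + 1))
```

By construction the curve interpolates the first and last control points, which is why those two points pin down the machined profile's endpoints.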

Publication Date
Tue Oct 12 2021
Journal Name
Engineering, Technology And Applied Science Research
Automated Pavement Distress Detection Using Image Processing Techniques

Pavement crack and pothole identification are important tasks in transportation maintenance and road safety. This study offers a novel technique for automatic asphalt pavement crack and pothole detection based on image processing. Different types of distress (transverse cracks, longitudinal cracks, alligator cracking, and potholes) can be identified with such techniques. The goal of this research is to evaluate road surface damage by extracting cracks and potholes, categorizing them from images and videos, and comparing the manual and automated methods. The proposed method was tested on 50 images. The results showed that the proposed method can detect cracks and potholes and identify their severity levels wit…
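As a toy sketch of the kind of pipeline the abstract refers to: dark, elongated defects can be flagged by intensity thresholding, and a severity level assigned from the fraction of distressed pixels. This is a deliberately simplified NumPy illustration; the thresholding rule and the severity cut-offs are assumptions, not the paper's method (a production system would use adaptive thresholding and morphological filtering, e.g. with OpenCV).

```python
import numpy as np

def crack_mask(gray, k=1.5):
    """Flag pixels noticeably darker than the pavement background
    (global threshold: mean minus k standard deviations)."""
    thresh = gray.mean() - k * gray.std()
    return gray < thresh

def severity_level(mask, low=0.01, high=0.05):
    """Toy severity rating from the fraction of distressed pixels.
    The cut-offs here are illustrative, not from the paper."""
    ratio = mask.mean()
    if ratio < low:
        return "low"
    return "medium" if ratio < high else "high"
```

On a synthetic 100x100 image with a single thin dark line, the mask isolates exactly the line pixels, giving a small distress ratio and a "low" rating.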

Publication Date
Sun Mar 01 2009
Journal Name
Journal Of Economics And Administrative Sciences
Evaluation of the foundations of preparing the general budget of the State through the 2008 federal budget analysis

The budget is the basic instrument for implementing the priorities of any state, and it must be viewed in light of the social, political, and economic climate, because it helps steer the economy toward achieving growth and raising the level of welfare. In preparing the annual budget after 9 April 2003, the Ministry of Finance adopted an approach different from the one followed in past decades: there were two budgets, the first the current (operating) budget and the second the investment budget, despite the existence of a law requiring the issuance of a budget…

Publication Date
Mon Mar 11 2024
Journal Name
Nibal
Sports Organization

Management is sometimes called the science of organization; perhaps this names the whole after one of its parts, to show that part's importance. The importance of organization grows, and becomes ever clearer, as the volume of administrative work expands and, with it, the administrative apparatus that carries out this work. Organization and management are complementary elements, even though they differ: organization means defining functions and distributing them within specific productive relationships, and it aims to provide the needed skills and responsibilities through the appropriate distribution of…

Publication Date
Sat Jan 01 2022
Journal Name
Journal Of Petroleum Science And Engineering
Performance evaluation of analytical methods in linear flow data for hydraulically-fractured gas wells

Publication Date
Fri Apr 14 2023
Journal Name
Journal Of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have small or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Usually, every DL framework is fed a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data generates a better DL model, and its performance is also application dependent. This issue is the main barrier for…
Publication Date
Thu Apr 27 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Error Analysis in Numerical Algorithms

In this paper, we applied the concept of error analysis using the linearization method and new condition numbers constituting optimal bounds for appraising the possible errors: evaluations of finite continued fractions, computations of determinants of tridiagonal systems, of determinants of second order, and a "fast" complex multiplication. As in Horner's scheme, we present a rounding-error analysis of product and summation algorithms. The error estimates are tested by numerical examples. The program used for the calculations is MATLAB 7, from the website Mathworks.com.
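As an illustration of the rounding-error behaviour such an analysis quantifies (this example is not from the paper): evaluating p(x) = (x - 1)^3 by Horner's scheme in double precision near x = 1 suffers catastrophic cancellation. With x = 1 + 2^-20, the exact value is 2^-60, yet the floating-point Horner evaluation returns exactly 0.

```python
from fractions import Fraction

def horner(coeffs, x):
    """Evaluate a_0 + a_1*x + ... + a_n*x^n by Horner's scheme."""
    acc = 0
    for a in reversed(coeffs):
        acc = acc * x + a
    return acc

x = 1 + 2**-20                                       # exactly representable
approx = horner([-1.0, 3.0, -3.0, 1.0], x)           # double precision
exact = horner([Fraction(-1), Fraction(3), Fraction(-3), Fraction(1)],
               Fraction(x))                          # exact rational arithmetic
# approx is 0.0 although the true value is 2**-60: total cancellation
```

This is exactly the situation where condition numbers and a priori error bounds, rather than the computed result itself, tell you how many digits can be trusted.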

Publication Date
Tue Jan 01 2019
Journal Name
Indian Journal Of Ecology
Horizontal variability of some soil properties in Wasit Governorate using time series analysis

Publication Date
Sat Jan 20 2024
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
A review of the use of the continuous flow injection analysis technique in the determination of several drugs

Continuous flow injection analysis (CFIA) is one of the simplest, easiest, and most versatile analytical automation methods in wet chemical analysis. The method depends on changes in the physical and chemical properties of a portion of the specimen as it disperses from the injected sample into the carrier stream. The CFIA technique analyzes samples automatically and with high efficiency. Its PC compatibility also allows specimens to be treated automatically, reagents to be added, and reaction conditions to be closely monitored. CFIA is one of the automated chemical analysis methods in which successive specimen samples to be estimated are injected into a carrier stream of flowing solution that meets the reagent and mixes at a spe…
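One standard figure of merit in flow injection analysis is the dispersion coefficient D = C0 / Cmax, the ratio of the injected concentration to the peak concentration reaching the detector; it quantifies how much the carrier stream dilutes the sample zone. A one-line helper (illustrative only; the truncated abstract does not show whether the review uses this quantity):

```python
def dispersion_coefficient(c0, c_max):
    """Dispersion coefficient D = C0 / Cmax: injected concentration over
    peak concentration at the detector. D = 2 means the sample zone was
    diluted two-fold by the carrier stream before detection."""
    return c0 / c_max
```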
