Split and Merge Regions of Satellite Images using the Non-Hierarchical Algorithm of Cluster Analysis

Image segmentation is one of the main and necessary objectives of digital image processing. It seeks to partition the studied images into multiple, more useful regions that summarize the regions of interest in satellite images. These are multispectral images provided by satellites using the principle of remote sensing, which has become one of the important concepts whose applications are relied on in most necessities of daily life, especially after the rapid developments witnessed in various fields of life, many of which have been reached by software algorithms and techniques. Such images are essential for studying a wide spectrum of targets in many scientific domains. In this research, the non-hierarchical cluster analysis algorithm was used as an image segmentation method (splitting and merging regions) in order to demonstrate the value of statistical methods in image processing tasks such as segmentation. The K-Means technique was adopted to carry out this task, and its algorithm was applied to a multispectral satellite image of a scene of western Iraq. The results showed the flexibility of this algorithm in dealing with variation in the illumination of the pixels of the colour image, its efficiency in forming cluster regions composed of groups of pixels that are homogeneous in intensity, and finally its ability to produce images of high quality, measured according to the Peak Signal to Noise Ratio (PSNR) image quality metric.
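
As a rough illustration of the kind of pipeline this abstract describes (not the paper's exact implementation), the sketch below clusters the pixels of a multispectral image with scikit-learn's K-Means, replaces each pixel by its cluster centre, and scores the result with PSNR. The image shape, cluster count, and the use of scikit-learn are assumptions made for the example.

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_segment(image, n_clusters=5, random_state=0):
    """Cluster the pixels of a multispectral image (H x W x bands) by intensity."""
    h, w, bands = image.shape
    pixels = image.reshape(-1, bands).astype(np.float64)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=random_state).fit(pixels)
    # Replace every pixel with its cluster centre to obtain the segmented image.
    segmented = km.cluster_centers_[km.labels_].reshape(h, w, bands)
    return segmented.astype(image.dtype), km.labels_.reshape(h, w)

def psnr(original, processed, max_value=255.0):
    """Peak Signal-to-Noise Ratio: 10 * log10(MAX^2 / MSE) between two images."""
    mse = np.mean((original.astype(np.float64) - processed.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_value ** 2 / mse)

# Usage (hypothetical array): segmented, labels = kmeans_segment(satellite_image, 6)
#                             print(psnr(satellite_image, segmented))
```
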

Publication Date: Wed Jun 01 2022
Journal Name: Baghdad Science Journal
Variable Selection Using a Modified Gibbs Sampler Algorithm with Application on Rock Strength Dataset

Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question a researcher needs to ask is which variables are most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage …

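The following is a minimal sketch of how an indicator-based Gibbs sampler for variable selection can work; it is a Kuo-Mallick style spike-and-slab sampler, not the modified sampler proposed in the paper, and the noise variance, priors, and parameter names are assumptions made to keep the example short.

```python
import numpy as np

def gibbs_variable_selection(y, X, n_iter=5000, burn_in=1000,
                             tau2=10.0, sigma2=1.0, p_incl=0.5, seed=0):
    """Kuo-Mallick style Gibbs sampler for y = X @ (gamma * beta) + noise.

    Priors: beta_j ~ N(0, tau2), gamma_j ~ Bernoulli(p_incl); the noise
    variance sigma2 is treated as known to keep the sketch short.
    Returns the posterior inclusion probability of each predictor.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = np.zeros(d)
    gamma = np.ones(d, dtype=int)
    incl = np.zeros(d)
    xtx = np.sum(X * X, axis=0)            # x_j' x_j for every column

    for it in range(n_iter):
        for j in range(d):
            # Partial residual with predictor j's current contribution removed.
            r = y - X @ (gamma * beta) + gamma[j] * beta[j] * X[:, j]

            # Update beta_j: conjugate normal draw if included, prior draw otherwise.
            if gamma[j] == 1:
                prec = xtx[j] / sigma2 + 1.0 / tau2
                mean = (X[:, j] @ r / sigma2) / prec
                beta[j] = rng.normal(mean, np.sqrt(1.0 / prec))
            else:
                beta[j] = rng.normal(0.0, np.sqrt(tau2))

            # Update gamma_j from the two residual sums of squares.
            rss_in = np.sum((r - beta[j] * X[:, j]) ** 2)
            rss_out = np.sum(r ** 2)
            log_odds = np.log(p_incl / (1 - p_incl)) + (rss_out - rss_in) / (2 * sigma2)
            prob = 1.0 / (1.0 + np.exp(-np.clip(log_odds, -30, 30)))
            gamma[j] = int(rng.random() < prob)

        if it >= burn_in:
            incl += gamma
    return incl / (n_iter - burn_in)
```

Predictors with high posterior inclusion probability are retained; baselines such as OLS and shrinkage methods (e.g. the Lasso mentioned in the abstract) can be fitted on the same data for comparison.
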
Publication Date: Wed Sep 11 2024
Journal Name: Lecture Notes In Civil Engineering
An Image Processing Algorithm to Address the Problem of Stains Merge on Water Sensitive Papers and Its Impact on the Evaluation of Spray Quality Indicators

There are many techniques that can be used to estimate spray quality traits such as spray coverage, droplet density, droplet count, and droplet diameter. One of the most common techniques is to use water sensitive papers (WSP) as spray collectors under field conditions and to analyze them with software. However, some droplets may merge after they deposit on the WSP, and this can affect the accuracy of the results. In this research, an image processing technique was used for better estimation of the spray traits and to overcome the problem of droplet merger. The droplets were classified as non-merged and merged droplets based on their roundness, then the merged droplets were separated based on the average non-m…

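A minimal sketch of the roundness-based classification step described above, assuming an OpenCV 4 pipeline; the thresholds, the Otsu binarization, and the simple count heuristic in the final comment are illustrative assumptions rather than the paper's exact procedure.

```python
import cv2
import numpy as np

def classify_stains(wsp_image_path, roundness_threshold=0.8, min_area=5.0):
    """Threshold a scanned water-sensitive paper and sort stains by roundness.

    Roundness = 4*pi*area / perimeter^2 equals 1.0 for a perfect circle, so
    stains below the threshold are treated as likely merged droplets.
    """
    gray = cv2.imread(wsp_image_path, cv2.IMREAD_GRAYSCALE)
    # Stains are darker than the paper; Otsu chooses the threshold automatically.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    single_areas, merged_areas = [], []
    for c in contours:
        area = cv2.contourArea(c)
        perimeter = cv2.arcLength(c, True)
        if perimeter == 0 or area < min_area:      # ignore noise specks
            continue
        roundness = 4.0 * np.pi * area / perimeter ** 2
        (single_areas if roundness >= roundness_threshold else merged_areas).append(area)

    # A merged stain can then be assigned an estimated droplet count, for example
    # round(stain_area / mean single-stain area), as one simple separation heuristic.
    return single_areas, merged_areas
```
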
Publication Date: Wed Dec 30 2015
Journal Name: College Of Islamic Sciences
Of non-Muslim minorities In the Muslim community


Publication Date: Tue Oct 01 2019
Journal Name: Journal Of Economics And Administrative Sciences
Determination of the lot size using the Wagner-Whitin algorithm under the Constraint Theory / Case Study of Diyala Public Company

International companies strive to reduce their costs and increase their profits, and these trends have produced many methods and techniques to achieve these goals; some of these methods are heuristic and others are optimization-based. This research is an attempt to adapt some of these techniques to Iraqi companies, namely determining the optimal lot size using the Wagner-Whitin algorithm under the Theory of Constraints. The research adopted a case study methodology to identify the research problem objectively, namely determining the optimal lot size for each of the products of the electronic measurement laboratory in Diyala, in light of the bottlenecks in w…

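For reference, the sketch below shows the basic Wagner-Whitin dynamic program for lot sizing, without the theory-of-constraints adjustments discussed in the paper; the demand, setup cost, and holding cost values in the example are purely illustrative.

```python
def wagner_whitin(demand, setup_cost, holding_cost):
    """Dynamic-programming lot sizing: choose in which periods to produce.

    demand[t]     -- demand in period t
    setup_cost[t] -- fixed setup/ordering cost if a lot is produced in period t
    holding_cost  -- cost of carrying one unit for one period
    Returns (minimum total cost, production plan as (period, covered periods)).
    """
    T = len(demand)
    INF = float("inf")
    best = [0.0] + [INF] * T        # best[t] = min cost of covering periods 1..t
    last_setup = [0] * (T + 1)      # back-pointer: production period chosen for best[t]

    for t in range(1, T + 1):
        for j in range(1, t + 1):   # produce in period j to cover periods j..t
            carry = sum(holding_cost * (i - j) * demand[i - 1] for i in range(j, t + 1))
            cost = best[j - 1] + setup_cost[j - 1] + carry
            if cost < best[t]:
                best[t], last_setup[t] = cost, j

    # Recover the production plan from the back-pointers.
    plan, t = [], T
    while t > 0:
        j = last_setup[t]
        plan.append((j, list(range(j, t + 1))))
        t = j - 1
    return best[T], plan[::-1]

# Example: six periods of demand, constant setup cost 100, holding cost 1 per unit-period.
cost, plan = wagner_whitin([20, 50, 10, 50, 50, 10], [100] * 6, 1.0)
print(cost, plan)
```
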
Publication Date: Fri Jan 01 2016
Journal Name: Journal Of Engineering
Enhanced Chain-Cluster Based Mixed Routing Algorithm for Wireless Sensor Networks

Energy efficiency is a significant aspect in designing robust routing protocols for wireless sensor networks (WSNs). A reliable routing protocol has to be energy efficient and adaptive to the network size. To achieve high energy conservation and data aggregation, there are two major techniques: clusters and chains. In the clustering technique, sensor networks are often divided into non-overlapping subsets called clusters. In the chain technique, sensor nodes are connected to their two closest neighbors, starting with the farthest node from the base station and ending with the closest node to the base station. Each technique has its own advantages and disadvantages, which motivates some researchers to come up with a hybrid routing algorit…

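A small sketch of the chain-building idea mentioned in the abstract (a greedy nearest-neighbour chain of the kind used in PEGASIS-style protocols); it is not the hybrid algorithm proposed in the paper, and the node coordinates and sink position are hypothetical.

```python
import math
import random

def build_chain(nodes, base_station):
    """Greedy chain construction: start at the node farthest from the base
    station and repeatedly link the nearest still-unvisited neighbour.

    nodes        -- list of (x, y) sensor positions
    base_station -- (x, y) position of the sink
    Returns the chain as an ordered list of node indices.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    remaining = set(range(len(nodes)))
    current = max(remaining, key=lambda i: dist(nodes[i], base_station))
    chain = [current]
    remaining.remove(current)
    while remaining:
        nxt = min(remaining, key=lambda i: dist(nodes[i], nodes[current]))
        chain.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return chain

# Example: ten randomly placed sensors and a sink at the origin.
random.seed(1)
sensors = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(10)]
print(build_chain(sensors, (0.0, 0.0)))
```
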
Publication Date: Mon Dec 18 2017
Journal Name: Al-khwarizmi Engineering Journal
Optimization and Prediction of Process Parameters in SPIF that Affecting on Surface Quality Using Simulated Annealing Algorithm

Incremental sheet metal forming is a modern sheet metal forming technique in which a uniform sheet is locally deformed during the progressive action of a forming tool. The tool movement is governed by a CNC milling machine, and in this way the tool locally deforms the sheet by pure stretching deformation. In the SPIF process, research is concentrated on developing predictive models to estimate product quality. Surface quality in SPIF has been modeled using the simulated annealing algorithm (SAA). In developing this predictive model, spindle speed, feed rate and step depth have been considered as model parameters. Maximum peak height (Rz) and arithmetic mean surface roughness (Ra) are used as response parameters to assess th…

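A generic simulated annealing sketch over the three process parameters named in the abstract; the cooling schedule, bounds, and the toy roughness model are assumptions for illustration, not the fitted model developed in the paper.

```python
import math
import random

def simulated_annealing(cost, bounds, n_iter=2000, t_start=1.0, t_end=1e-3, seed=0):
    """Generic simulated annealing over a box-constrained parameter vector.

    cost   -- function mapping a parameter list to a scalar to be minimised
              (e.g. a fitted roughness model of spindle speed, feed rate, step depth)
    bounds -- list of (low, high) pairs, one per parameter
    """
    rng = random.Random(seed)
    current = [rng.uniform(lo, hi) for lo, hi in bounds]
    current_cost = cost(current)
    best, best_cost = current[:], current_cost

    for k in range(n_iter):
        temp = t_start * (t_end / t_start) ** (k / n_iter)   # geometric cooling
        # Propose a neighbour by perturbing each parameter within its range.
        candidate = [
            min(hi, max(lo, x + rng.gauss(0, 0.1 * (hi - lo))))
            for x, (lo, hi) in zip(current, bounds)
        ]
        cand_cost = cost(candidate)
        # Accept better moves always, worse moves with Boltzmann probability.
        if cand_cost < current_cost or rng.random() < math.exp((current_cost - cand_cost) / temp):
            current, current_cost = candidate, cand_cost
            if cand_cost < best_cost:
                best, best_cost = candidate[:], cand_cost
    return best, best_cost

# Illustrative stand-in for a fitted Ra model; parameters are
# [spindle speed (rpm), feed rate (mm/min), step depth (mm)].
toy_ra = lambda p: 0.5 + 1e-4 * p[1] + 0.8 * p[2] - 1e-5 * p[0]
print(simulated_annealing(toy_ra, [(500, 2000), (200, 1000), (0.2, 1.0)]))
```
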
Publication Date: Sat Sep 30 2017
Journal Name: Al-khwarizmi Engineering Journal
Robot Arm Path Planning Using Modified Particle Swarm Optimization based on D* algorithm


Much attention has been paid to the use of robot arms in various applications. Therefore, optimal path finding has a significant role in upgrading and guiding the arm movement. The essential function of path planning is to create a path that satisfies the aims of motion, including avoiding obstacle collisions, reducing the time interval, decreasing the path traveling cost and satisfying the kinematic constraints. In this paper, the free Cartesian space map of a 2-DOF arm is constructed to attain the joint variables at each point without collision. The D* algorithm and Euclidean distance are applied to obtain the exact and estimated distances to the goal, respectively. The modified Particle Swarm Optimization al…

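Below is a plain particle swarm optimisation sketch of the kind such a planner could build on; it is not the modified PSO of the paper, the D*-based cost is not implemented, and the 2-DOF arm objective, link lengths, and swarm settings are illustrative assumptions.

```python
import numpy as np

def pso_minimize(cost, dim, bounds, n_particles=30, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain particle swarm optimisation over a box-constrained search space.

    cost(x) -- scalar objective, e.g. a path cost built from D*-style distance
               estimates plus collision penalties (not implemented here).
    """
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    gbest_cost = pbest_cost.min()

    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # Velocity mixes inertia, attraction to personal best and to global best.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        if costs.min() < gbest_cost:
            gbest, gbest_cost = pos[costs.argmin()].copy(), costs.min()
    return gbest, gbest_cost

# Toy usage: joint angles (theta1, theta2) minimising the distance of a 2-DOF
# arm's end effector (unit link lengths) from a hypothetical goal point.
goal = np.array([1.2, 0.8])
def end_effector_error(theta):
    x = np.cos(theta[0]) + np.cos(theta[0] + theta[1])
    y = np.sin(theta[0]) + np.sin(theta[0] + theta[1])
    return float(np.hypot(x - goal[0], y - goal[1]))

print(pso_minimize(end_effector_error, dim=2, bounds=[(-np.pi, np.pi)] * 2))
```
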
Publication Date: Tue Mar 30 2021
Journal Name: Baghdad Science Journal
Delivery Route Management based on Dijkstra Algorithm

For businesses that provide delivery services, the efficiency of the delivery process in terms of punctuality is very important. In addition to increasing customer trust, efficient route management, and selection are required to reduce vehicle fuel costs and expedite delivery. Some small and medium businesses still use conventional methods to manage delivery routes. Decisions to manage delivery schedules and routes do not use any specific methods to expedite the delivery settlement process. This process is inefficient, takes a long time, increases costs and is prone to errors. Therefore, the Dijkstra algorithm has been used to improve the delivery management process. A delivery management system was developed to help managers and drivers

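For context, a standard Dijkstra shortest-path sketch over a road network is shown below; the depot, customer, and travel times are hypothetical, and this is the textbook algorithm rather than the delivery system described in the abstract.

```python
import heapq

def dijkstra(graph, source):
    """Shortest travel times from a depot to every node.

    graph  -- dict: node -> list of (neighbour, edge_weight) pairs
    source -- starting node (e.g. the warehouse)
    Returns (dist, prev) where prev lets a route be reconstructed.
    """
    dist = {node: float("inf") for node in graph}
    prev = {node: None for node in graph}
    dist[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:          # stale queue entry
            continue
        for v, w in graph[u]:
            if d + w < dist[v]:
                dist[v], prev[v] = d + w, u
                heapq.heappush(heap, (d + w, v))
    return dist, prev

def route_to(prev, target):
    """Walk the predecessor map back from the target to the source."""
    path = []
    while target is not None:
        path.append(target)
        target = prev[target]
    return path[::-1]

# Hypothetical road network with travel times in minutes.
roads = {
    "Depot": [("A", 7), ("B", 9)],
    "A": [("B", 10), ("C", 15)],
    "B": [("C", 11), ("D", 2)],
    "C": [("Customer", 6)],
    "D": [("Customer", 9)],
    "Customer": [],
}
dist, prev = dijkstra(roads, "Depot")
print(dist["Customer"], route_to(prev, "Customer"))
```
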
Publication Date: Thu Jul 31 2025
Journal Name: Iraqi Journal For Administrative Sciences
Using the Statistical Analysis to study the important reasons of the pollution in the Iraqi Marshlands Areas

There is a need to detect and investigate the causes of pollution of the marshes and to provide an accurately evaluated statistical study to the competent authorities. To achieve this goal, factor analysis was applied to a sample of marsh-water pollutant measurements, namely: electrical conductivity (EC), power of hydrogen (pH), temperature (T), turbidity (TU), total dissolved solids (TDS) and dissolved oxygen (DO). A sample of 44 sites was withdrawn and examined in the laboratories of the Iraqi Ministry of Environment, and the results were obtained using the SPSS program. The most important recommendation was to increase the pumping of addit…

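The study used SPSS; purely as an illustration of the same factor-analysis step, the sketch below runs scikit-learn's FactorAnalysis on a synthetic table with the six pollutant variables named in the abstract. The data values are randomly generated placeholders, not the study's measurements.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

# Placeholder measurements for the six variables named in the abstract; in the
# study these came from 44 sampled marsh sites.
rng = np.random.default_rng(0)
data = pd.DataFrame(rng.normal(size=(44, 6)),
                    columns=["EC", "PH", "T", "TU", "TDS", "DO"])

# Standardise, extract two common factors, and inspect the loadings to see
# which measured variables move together.
scaled = StandardScaler().fit_transform(data)
fa = FactorAnalysis(n_components=2, random_state=0).fit(scaled)
loadings = pd.DataFrame(fa.components_.T, index=data.columns,
                        columns=["Factor 1", "Factor 2"])
print(loadings)
```
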
Publication Date: Fri Mar 31 2017
Journal Name: Al-khwarizmi Engineering Journal
Big-data Management using Map Reduce on Cloud: Case study, EEG Images' Data

A database is characterized as an arrangement of data that is organized and distributed in a way that allows the user to access the stored data simply and conveniently. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed clear enhancement for managing and processing the EEG big data, with an average reduction of 50% in response time. The obtained results provide EEG r…

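To show the map/shuffle/reduce structure the abstract refers to, here is a tiny in-memory sketch of the paradigm in plain Python; a real job would run on Hadoop against data in HDFS, and the EEG-style records and per-channel mean are invented for the example.

```python
from collections import defaultdict

# Toy records standing in for EEG samples: (channel, amplitude). A real job
# would read these from HDFS; a small in-memory list keeps the sketch runnable.
records = [("C3", 12.5), ("C4", 7.1), ("C3", 9.8), ("O1", 4.4), ("C4", 6.0)]

def map_phase(record):
    """Map: emit (key, value) pairs - here the channel and its amplitude."""
    channel, amplitude = record
    yield channel, amplitude

def reduce_phase(key, values):
    """Reduce: aggregate all values seen for one key - here a per-channel mean."""
    values = list(values)
    return key, sum(values) / len(values)

# Shuffle: group the mapper output by key, which is what the framework does
# between the map and reduce stages.
grouped = defaultdict(list)
for record in records:
    for key, value in map_phase(record):
        grouped[key].append(value)

results = dict(reduce_phase(k, v) for k, v in grouped.items())
print(results)   # e.g. {'C3': 11.15, 'C4': 6.55, 'O1': 4.4}
```
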