Data Aggregation in Wireless Sensor Networks Using Modified Voronoi Fuzzy Clustering Algorithm

This paper presents a data-centric technique: data aggregation via a modified algorithm that combines fuzzy clustering with the Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA). In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying the Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election: the node energy, the distance between the CH and its neighboring sensors, and the packet loss value. Furthermore, data aggregation is employed at each CH to reduce the amount of data transmitted, which extends the network lifetime and reduces the traffic that may accumulate in the buffer of the sink node. Each cluster head collects data from its members and forwards it to the sink node. A comparative study between the modified VFCA and the LEACH protocol shows that the modified VFCA is more efficient in terms of network lifetime and average energy consumption. A second comparison, with the K-means clustering algorithm, shows that the modified VFCA is more efficient in terms of packets transmitted to the sink node, buffer utilization, packet loss, and running time. The simulation was developed and tested in Matlab R2010a on a computer with Windows 7 (32-bit), a Core i7 processor, 4 GB of RAM, and a 1 TB hard disk.
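The CH election step can be sketched as a weighted score over the three parameters the abstract names (energy, distance to neighbors, packet loss). The weights and node fields below are illustrative assumptions, not values from the paper:

```python
import math

def elect_cluster_head(nodes, w_energy=0.5, w_dist=0.3, w_loss=0.2):
    """Elect the node with the best weighted score as cluster head.

    Each node is a dict with 'id', 'pos' (x, y), 'energy' (residual),
    and 'loss' (packet-loss rate in [0, 1]). Weights are hypothetical.
    """
    def mean_dist(n):
        # average distance from node n to every other cluster member
        others = [m for m in nodes if m['id'] != n['id']]
        return sum(math.dist(n['pos'], m['pos']) for m in others) / len(others)

    max_e = max(n['energy'] for n in nodes)
    max_d = max(mean_dist(n) for n in nodes)

    def score(n):
        # favour high residual energy, short average distance, low packet loss
        return (w_energy * n['energy'] / max_e
                - w_dist * mean_dist(n) / max_d
                - w_loss * n['loss'])

    return max(nodes, key=score)['id']

nodes = [{'id': 'a', 'pos': (0, 0),   'energy': 2.0, 'loss': 0.10},
         {'id': 'b', 'pos': (1, 0),   'energy': 1.0, 'loss': 0.50},
         {'id': 'c', 'pos': (0.5, 0), 'energy': 1.9, 'loss': 0.05}]
ch = elect_cluster_head(nodes)
```

Normalizing energy and distance before weighting keeps the three criteria on comparable scales, so the weights express pure preference.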

Publication Date: Sun Mar 06 2016
Journal Name: Baghdad Science Journal
A Note on the Perturbation of arithmetic expressions

In this paper we present the theoretical foundation of forward error analysis of numerical algorithms under:
• approximations in "built-in" functions;
• rounding errors in floating-point arithmetic operations;
• perturbations of data.
The error analysis is based on the linearization method. The fundamental tools of forward error analysis are systems of linear absolute and relative a priori and a posteriori error equations and the associated condition numbers, which constitute optimal bounds on the possible cumulative round-off errors. The condition numbers enable simple, general, quantitative definitions of numerical stability. The theoretical results have been applied to Gaussian elimination and have proved to be a very effective means of both a priori…
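The role of condition numbers as amplification factors can be illustrated with the classic ill-conditioned case, subtraction of nearly equal numbers (a toy example, not taken from the paper):

```python
def subtraction_condition(a, b):
    """Relative condition number of f(a, b) = a - b.

    This is the factor by which small relative perturbations of the
    inputs (data errors or prior rounding errors) can be amplified in
    the result; large values flag catastrophic cancellation.
    """
    return (abs(a) + abs(b)) / abs(a - b)

well = subtraction_condition(3.0, 1.0)         # operands far apart: harmless
ill = subtraction_condition(1.0 + 1e-12, 1.0)  # near-cancellation: huge amplification
```

The same operation is exactly rounded in both cases; only the condition number distinguishes the benign subtraction from the unstable one.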
Publication Date: Thu Jun 30 2022
Journal Name: Iraqi Journal Of Science
A Comparative Study for Supervised Learning Algorithms to Analyze Sentiment Tweets

Twitter's popularity has grown considerably in recent years, influencing the social, political, and business aspects of life. People leave tweets on social media about an event and, at the same time, look up other people's experiences and whether their opinion about that event was positive or negative. Sentiment analysis can be used to obtain this categorization: product reviews, events, and other topics gathered from users as unstructured text comments are classified as positive, negative, or neutral. Such problems are called polarity classification. This study aims to use Twitter data about cuisine reviews obtained from the Amazon website and compare the effectiveness…
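The polarity-classification idea can be sketched with a tiny naive Bayes classifier, one of many supervised learners such a comparison could include; the toy tweets and whitespace tokenization below are made up for illustration:

```python
from collections import Counter
import math

def train_nb(samples):
    """samples: list of (text, label) pairs. Returns a model for predict_nb."""
    labels = Counter(lab for _, lab in samples)
    words = {lab: Counter() for lab in labels}
    for text, lab in samples:
        words[lab].update(text.lower().split())
    vocab = {w for c in words.values() for w in c}
    return labels, words, vocab

def predict_nb(model, text):
    """Pick the label with the highest log-posterior (Laplace smoothing)."""
    labels, words, vocab = model
    total = sum(labels.values())

    def log_posterior(lab):
        lp = math.log(labels[lab] / total)
        denom = sum(words[lab].values()) + len(vocab)
        for w in text.lower().split():
            lp += math.log((words[lab][w] + 1) / denom)
        return lp

    return max(labels, key=log_posterior)

tweets = [("great food loved it", "pos"), ("awful service never again", "neg"),
          ("loved the taste", "pos"), ("terrible awful meal", "neg")]
model = train_nb(tweets)
```

Comparing learners then amounts to swapping `train_nb`/`predict_nb` for other classifiers and measuring accuracy on a held-out split.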
Publication Date: Tue Jan 04 2022
Journal Name: Iraqi Journal Of Science
Identifying of User Behavior from Server Log File

Due to the increase of information on the World Wide Web (WWW), the question of how to extract new and useful knowledge from log files has gained great interest among researchers in data mining and knowledge discovery.
Web mining, a subset of data mining, is divided into three categories: web content mining, web structure mining, and web usage mining. This paper is concerned with the server log file, which belongs to the third category (web usage mining). The file is analyzed according to the suggested algorithm to extract the behavior of the user; the behavior is identified by reconstructing the complete path taken by a specific user.
Extracting these types of knowledge requires many KDD…
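Reconstructing a user's path from a server log might look like the following sketch, assuming a common Apache-style log line format (the paper's own algorithm and log schema are not reproduced here):

```python
import re
from collections import defaultdict

# Matches "<ip> - - [timestamp] "GET/POST <url> ..."" in Apache-style logs.
LOG_LINE = re.compile(r'(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)[^"]*"')

def user_paths(lines):
    """Group requested URLs by client IP, preserving request order."""
    paths = defaultdict(list)
    for line in lines:
        m = LOG_LINE.match(line)
        if m:
            ip, url = m.groups()
            paths[ip].append(url)
    return dict(paths)

log = [
    '10.0.0.1 - - [10/Oct/2020:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.2 - - [10/Oct/2020:13:55:40 +0000] "GET /about.html HTTP/1.1" 200 256',
    '10.0.0.1 - - [10/Oct/2020:13:56:01 +0000] "GET /products.html HTTP/1.1" 200 1024',
]
paths = user_paths(log)
```

In practice, identifying a "user" by IP alone is a simplification; sessions are often split further by user agent and inactivity timeouts.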
Publication Date: Thu Feb 27 2020
Journal Name: Journal Of Mechanics Of Continua And Mathematical Sciences
SUGGESTING MULTIPHASE REGRESSION MODEL ESTIMATION WITH SOME THRESHOLD POINT

Estimating the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts joined by threshold point(s); this situation is regarded as a violation of the linearity of regression. The multiphase regression model has therefore received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold point estimation. The maximum likelihood estimator (MLE) has been used for both model and threshold point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of…
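A minimal version of threshold-point estimation can be sketched by profiling over candidate split points and keeping the one that minimizes the two segments' combined least-squares error (a simple least-squares stand-in for the MLE machinery described above; the data are synthetic):

```python
def fit_line(xs, ys):
    """Ordinary least squares slope and intercept for one segment."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def sse(xs, ys):
    """Sum of squared residuals of the best-fit line for one segment."""
    a, b = fit_line(xs, ys)
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def estimate_threshold(xs, ys):
    """Profile search: try each candidate split, keep the least total SSE."""
    best = None
    for k in range(2, len(xs) - 2):  # keep at least 2 points per segment
        total = sse(xs[:k], ys[:k]) + sse(xs[k:], ys[k:])
        if best is None or total < best[0]:
            best = (total, xs[k])
    return best[1]

# Synthetic two-phase data: slope 1 up to x = 5, slope 3 afterwards.
xs = list(range(10))
ys = [0, 1, 2, 3, 4, 8, 11, 14, 17, 20]
threshold = estimate_threshold(xs, ys)
```

Under normal errors this profile-SSE search coincides with profiling the likelihood over the threshold, which is why it serves as a readable stand-in.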
Publication Date: Tue Feb 01 2022
Journal Name: Journal Of Engineering
Geomechanical study to predict the onset of sand production formation

One of the costliest problems facing hydrocarbon production from unconsolidated sandstone reservoirs is the production of sand once hydrocarbon production starts. A model that predicts the onset of sanding is very important for deciding on future sand control, including whether and when sand control should be used. This research developed an easy-to-use computer program to determine where sanding begins in the drainage area. The model is based on estimating the critical pressure drop at which sand production begins. The outcomes are drawn as a function of free sand production, with the critical flow rates for declining reservoir pressure. The results show that the pressure drawdown required to…
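The core decision, flagging sanding onset when drawdown exceeds a critical pressure drop, can be sketched as follows; the critical value is taken as an input rather than computed, since the paper's geomechanical correlation is not reproduced here:

```python
def sanding_onset(reservoir_p, flowing_bhp, critical_drawdown):
    """Flag sand-production onset when drawdown exceeds the critical value.

    reservoir_p and flowing_bhp are pressures in consistent units (e.g. psi).
    critical_drawdown would come from a geomechanical model (rock strength,
    in-situ stresses); here it is simply a supplied threshold.
    """
    drawdown = reservoir_p - flowing_bhp
    return drawdown > critical_drawdown

onset = sanding_onset(4000, 2500, critical_drawdown=1200)  # 1500 psi drawdown
safe = sanding_onset(4000, 3200, critical_drawdown=1200)   # 800 psi drawdown
```

Sweeping `flowing_bhp` downward as reservoir pressure declines reproduces the kind of critical-rate map the abstract describes.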
Publication Date: Sat Jan 30 2021
Journal Name: Iraqi Journal Of Science
Dynamic Fault Tolerance Aware Scheduling for Healthcare System on Fog Computing

The Internet of Things (IoT) contributes to improving the quality of life, as it supports many applications, especially healthcare systems. Data generated from IoT devices is sent to Cloud Computing (CC) for processing and storage, despite the latency caused by the distance. Because of the revolution in IoT devices, the amount of data sent to the CC has been increasing. As a result, in addition to latency, congestion on the cloud network has grown. Fog Computing (FC) is used to solve these problems because of its proximity to IoT devices, filtering the data sent on to the CC. FC is a middle layer located between the IoT devices and the CC layer. Due to the massive data generated by IoT devices at the FC, Dynamic Weighted Round Robin (DWRR)…
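A simplified, static-weight version of weighted round robin can be sketched as follows; in a fog node the weights would be adjusted dynamically from load measurements (the "dynamic" part of DWRR), and the queue names and weights here are invented:

```python
from collections import deque

def dwrr_schedule(queues, weights, rounds):
    """Serve up to `weight` tasks from each queue per round.

    queues: dict name -> deque of task ids; weights: dict name -> int.
    Returns the order in which tasks were served.
    """
    order = []
    for _ in range(rounds):
        for name, q in queues.items():
            for _ in range(weights[name]):
                if q:
                    order.append(q.popleft())
    return order

# Critical (e.g. patient-alert) traffic gets twice the service share.
queues = {"critical": deque(["c1", "c2", "c3"]), "normal": deque(["n1", "n2"])}
order = dwrr_schedule(queues, {"critical": 2, "normal": 1}, rounds=2)
```

The weight ratio directly controls the bandwidth share each class receives, which is why healthcare traffic classes map naturally onto it.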
Publication Date: Tue Aug 31 2021
Journal Name: Iraqi Journal Of Science
Development of a Job Applicants E-government System Based on Web Mining Classification Methods

Governmental establishments maintain historical data on job applicants for future analysis, prediction, improvement of benefits and profits, and development of organizations and institutions. In e-government, a decision about job seekers can be made after mining their information, leading to beneficial insight. This paper proposes the development and implementation of a system that predicts the job appropriate to an applicant's skills using web content classification algorithms (LogitBoost, J48, PART, Hoeffding Tree, Naive Bayes). Furthermore, the results of the classification algorithms are compared on a data set called the "job classification data" set. Experimental results indicate…
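As a rough illustration of matching applicants to jobs by skills, here is a nearest-neighbour sketch using Jaccard overlap; it is a much simpler stand-in for the classifiers compared in the paper, and the job profiles are invented:

```python
def jaccard(a, b):
    """Similarity of two skill sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def predict_job(applicant_skills, jobs):
    """Return the job title whose required skills best overlap the applicant's."""
    return max(jobs, key=lambda title: jaccard(applicant_skills, jobs[title]))

# Hypothetical job profiles; a real system would learn these from labeled data.
jobs = {
    "web developer": {"html", "css", "javascript", "sql"},
    "data analyst": {"sql", "python", "statistics", "excel"},
    "network admin": {"tcp/ip", "routing", "firewalls", "linux"},
}
match = predict_job({"python", "sql", "excel"}, jobs)
```

Trained classifiers improve on this baseline mainly by weighting skills by how discriminative they are rather than treating all overlaps equally.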
Publication Date: Wed Feb 01 2023
Journal Name: Journal Of Engineering
Checking the Accuracy of Selected Formulae for both Clear Water and Live Bed Bridge Scour

Many bridges worldwide have failed due to severe scouring. The safety of an existing bridge (after construction) therefore depends mainly on continuous monitoring of local scour at the substructure, while the safety of a bridge before construction depends mainly on proper estimation of local scour at the substructure. Local scour at bridge piers is usually estimated using the available formulae, almost all of which were derived from laboratory data, so it is essential to test their performance against field data. In this study, the performance of selected bridge scour estimation formulae was validated and…
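One widely used formula such a validation study could include is the CSU/HEC-18 pier scour equation. The sketch below assumes SI units and common textbook default correction factors, not the paper's selections:

```python
import math

def hec18_pier_scour(y1, v, a, k1=1.0, k2=1.0, k3=1.1):
    """CSU/HEC-18 local pier scour depth (SI units).

    y1: approach flow depth (m); v: approach velocity (m/s); a: pier width (m).
    k1..k3: correction factors for pier shape, flow attack angle, and bed
    condition; the defaults are common textbook values, used here only as
    placeholders.
    """
    fr = v / math.sqrt(9.81 * y1)  # approach Froude number
    return 2.0 * k1 * k2 * k3 * y1**0.35 * a**0.65 * fr**0.43

# A single illustrative case: 3 m deep flow at 1.5 m/s past a 1 m pier.
scour = hec18_pier_scour(y1=3.0, v=1.5, a=1.0)
```

Validating a formula then means running field-measured `y1`, `v`, and `a` through it and comparing predicted depths against observed scour.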
Publication Date: Mon Jan 01 2018
Journal Name: International Journal Of Science And Research (IJSR)
Optimal Economic Design of Diversion Structures during Construction of a Dam by Particle Swarm Optimization

Diverting river flow during construction of a main dam involves the construction of cofferdams and tunnels, channels, or other temporary passages. Diversion channels are commonly used in wide valleys where the high flow makes tunnels or culverts uneconomic. The diversion works must form part of the overall project design, since they have a major impact on its cost as well as on the design, construction program, and overall cost of the permanent works. Construction costs comprise excavation, lining of the channel, and construction of the upstream and downstream cofferdams. The optimization model was applied to obtain the optimal channel cross-section and the heights of the upstream and downstream cofferdams with minimum construction cost…
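The mechanics of particle swarm optimization can be sketched on a one-dimensional toy cost function with a known minimum; the real model optimizes the channel cross-section and cofferdam heights against an engineering cost function, which is not reproduced here:

```python
import random

def pso_minimize(cost, bounds, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer for a 1-D cost function.

    w is the inertia weight; c1/c2 weight the pull toward each particle's
    personal best and the swarm's global best (standard textbook values).
    """
    random.seed(seed)
    lo, hi = bounds
    xs = [random.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                  # each particle's best-seen position
    gbest = min(xs, key=cost)      # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vs[i] = (w * vs[i] + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))  # clamp to bounds
            if cost(xs[i]) < cost(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=cost)
    return gbest

# Toy "construction cost" with a known minimum at x = 3.
best = pso_minimize(lambda x: (x - 3) ** 2 + 5, bounds=(0, 10))
```

For the dam problem, the particle position would simply be a vector (channel dimensions, two cofferdam heights) instead of a scalar, with the same update rules.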
Publication Date: Sun Mar 04 2018
Journal Name: Iraqi Journal Of Science
Improving Detection Rate of the Network Intrusion Detection System Based on Wrapper Feature Selection Approach

Regarding the security of computer systems, intrusion detection systems (IDSs) are essential components for detecting attacks at an early stage. They monitor and analyze network traffic, looking for abnormal behavior or attack signatures to detect intrusions in real time. A major drawback of IDSs is their inability to provide adequate sensitivity and accuracy, coupled with their failure to process enormous amounts of data. Feature selection greatly reduces the classification time of the IDS. In this paper, a new feature selection algorithm based on the Firefly Algorithm (FA) is proposed. In addition, the naïve Bayesian classifier is used to discriminate attack behavior from normal behavior in the network traffic…
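A wrapper feature-selection loop can be sketched with random search standing in for the Firefly Algorithm (much simpler, but the same wrapper structure). The scoring callback would normally return validation accuracy of the naïve Bayes classifier trained on the chosen features; the objective below is a toy:

```python
import random

def wrapper_select(n_features, score, n_candidates=50, seed=0):
    """Wrapper feature selection via random search over feature subsets.

    score(subset) must return the quality of a feature subset, e.g. a
    classifier's validation accuracy when trained on those features only.
    Returns the best subset found, as a sorted list of feature indices.
    """
    random.seed(seed)
    best_subset, best_score = None, float("-inf")
    for _ in range(n_candidates):
        # each feature included with probability 1/2
        mask = frozenset(i for i in range(n_features) if random.random() < 0.5)
        if not mask:
            continue
        s = score(mask)
        if s > best_score:
            best_subset, best_score = mask, s
    return sorted(best_subset)

# Toy objective: features {0, 2} are informative, extras are penalized.
useful = {0, 2}
toy_score = lambda m: len(m & useful) - 0.1 * len(m - useful)
selected = wrapper_select(6, toy_score)
```

The FA replaces the random-subset proposal with fireflies moving toward brighter (higher-scoring) subsets, but the evaluate-subset-with-the-classifier loop is the same.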