Steganography and Cryptography Techniques Based Secure Data Transferring Through Public Network Channel

Attacks on data transferred over networks happen millions of times a day. To address this problem, a secure scheme is proposed for transferring data over a network. The proposed scheme uses two techniques to guarantee secure transfer of a message: the message is first encrypted, then hidden in a video cover. The RC4 stream cipher algorithm is used for encryption to increase the message's confidentiality, and the least significant bit (LSB) embedding algorithm is improved by adding an additional layer of security. The improvement to the LSB method replaces the usual sequential selection with random selection of the frames and pixels, driven by two secret random keys. The hidden message therefore remains protected even if the stego-object is intercepted, because the attacker cannot know which frames and pixels hold each bit of the secret message, which makes it difficult to rebuild the message successfully. The results indicate that the proposed scheme performs well on the evaluation metrics used for this purpose when compared with a large number of related previous methods.
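As an illustration of the two layers described in this abstract, the following sketch combines RC4 encryption with key-driven random selection of LSB embedding positions. The frame representation, key values, and function names are illustrative assumptions, not the paper's actual implementation:

```python
import random

def rc4(key: bytes, data: bytes) -> bytes:
    """Standard RC4: key-scheduling (KSA) then keystream XOR (PRGA)."""
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = [], 0, 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

def embed(frames, message: bytes, frame_key: int, pixel_key: int):
    """Hide message bits in the LSBs of randomly selected (frame, pixel) slots.
    The two secret keys seed the selection, replacing sequential LSB order."""
    bits = [(b >> k) & 1 for b in message for k in range(8)]
    rng = random.Random(frame_key * 1_000_003 + pixel_key)
    slots = [(f, p) for f in range(len(frames)) for p in range(len(frames[f]))]
    chosen = rng.sample(slots, len(bits))          # non-repeating random positions
    for (f, p), bit in zip(chosen, bits):
        frames[f][p] = (frames[f][p] & ~1) | bit
    return chosen

def extract(frames, positions, n_bytes: int) -> bytes:
    """Read LSBs back from the selected positions and reassemble the bytes."""
    bits = [frames[f][p] & 1 for f, p in positions]
    return bytes(sum(bits[i * 8 + k] << k for k in range(8)) for i in range(n_bytes))
```

A receiver holding both secret keys can regenerate the same pseudo-random position sequence and recover the ciphertext, then decrypt it with RC4 (RC4 is its own inverse). Without the keys, an attacker cannot tell which pixels carry payload bits.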

Publication Date
Wed Dec 30 2020
Journal Name
Iraqi Journal Of Science
Subsurface Imaging of 2D Seismic Data Using Kirchhoff Time Migration Method, Central Iraq

The Kirchhoff time migration method was applied in pre- and post-stack time migration for post-processing of images collected from the Balad-Samarra (BS-92) survey line, which is located across the Ajeel anticline oilfield. The results showed that the Ajeel anticline structure was relocated to the correct position in the migrated stacked section. Both migration approaches (pre- and post-stack) produced enhanced subsurface images and increased horizontal resolution, which was clear after the broadening of the syncline and the narrowing, or compression, of the anticline. However, each of these methods was associated with migration noise. Thus, a post-stack process was applied using Dip-Removal (DDMED) and Band-Pass filters to eliminate the artifact noise. The time-fr…
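Kirchhoff time migration is, at its core, a diffraction summation: each output image sample accumulates input amplitudes along the diffraction hyperbola implied by the velocity. A minimal constant-velocity, zero-offset sketch follows; the grid sizes, velocity, and function name are illustrative assumptions, not the actual processing parameters of the BS-92 line:

```python
import math

def kirchhoff_migrate(section, dx, dt, v):
    """Constant-velocity zero-offset Kirchhoff time migration (diffraction stack).
    section[trace][sample] holds amplitudes; dx trace spacing (m), dt sample
    interval (s), v constant velocity (m/s)."""
    n_traces = len(section)
    n_samples = len(section[0])
    image = [[0.0] * n_samples for _ in range(n_traces)]
    for ix in range(n_traces):             # output image trace
        for it in range(n_samples):        # output two-way time sample
            t0 = it * dt
            total = 0.0
            for jx in range(n_traces):     # sum along the diffraction hyperbola
                offset = (jx - ix) * dx
                t = math.sqrt(t0 ** 2 + (2.0 * offset / v) ** 2)
                js = int(round(t / dt))
                if js < n_samples:
                    total += section[jx][js]
            image[ix][it] = total
    return image
```

Energy scattered along a hyperbola in the input section collapses back to its apex in the output, which is why migration repositions structures such as the Ajeel anticline and sharpens synclines and anticlines; the residual summation artifacts are the "migration noise" the abstract filters out afterwards.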

Publication Date
Sat Dec 30 2023
Journal Name
Journal Of Economics And Administrative Sciences
The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on compiling and analyzing such data. Cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then fitted with the nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
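The cubic B-spline basis underlying such a smoothing model can be evaluated with the Cox–de Boor recursion. A minimal sketch with uniform knots; the function name and knot choice are illustrative assumptions, not taken from the paper:

```python
def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion for the i-th B-spline basis function of degree k.
    For k = 3 (cubic) the basis has continuous first and second derivatives,
    which is what gives the fitted curve its smoothness."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + k] - knots[i]
    if d1 > 0:
        left = (t - knots[i]) / d1 * bspline_basis(i, k - 1, t, knots)
    d2 = knots[i + k + 1] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + k + 1] - t) / d2 * bspline_basis(i + 1, k - 1, t, knots)
    return left + right
```

A smoothing fit then expresses each longitudinal profile as a linear combination of these basis functions, with coefficients chosen by (penalized) least squares; on the interior of the knot span the basis functions sum to one, so the fitted curve is a local weighted average of the coefficients.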

The longitudinal balanced data profile was compiled into subgroup…

Publication Date
Mon May 15 2023
Journal Name
Iraqi Journal Of Science
Sub-Surface Investigation of Khashim Al-Ahmer Gas Field Using Seismic Reflection Data

This paper detects subsurface structures of geological and economic importance by interpreting the available seismic reflection data of an area estimated at about 740 km². The Khashim Al-Ahmer structure is part of a series of structures (Injana, Khashim Al-Ahmer, Mannsorya) running from NW to SE, and it is located within a deep faulted area. It consists of one elongated, asymmetrical dome whose SW limb is steeper than the NE limb. Twenty-three seismic sections from two seismic surveys were interpreted, with a total line length of about 414.7 km. Interpretation of the seismic data focused on two reflectors (Fatha and Jeribi)…

Publication Date
Tue Dec 01 2015
Journal Name
Journal Of Engineering
Ten Years of OpenStreetMap Project: Have We Addressed Data Quality Appropriately? – Review Paper

It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms. It is an online geospatial database that produces and supplies free, editable geospatial datasets worldwide. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlighting the diff…

Publication Date
Mon May 15 2017
Journal Name
Journal Of Theoretical And Applied Information Technology
Anomaly detection in text data that represented as a graph using dbscan algorithm

Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points falling outside the behavior of any cluster are considered noise or anomalies. DBSCAN can therefore detect abnormal points that lie beyond a given distance threshold (extreme values). However, not all anomalies are of this kind, abnormal and far from any specific group; there is a type of data point that does not occur repeatedly but is considered abnormal with respect to the known group. The analysis showed DBSCAN using the…
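DBSCAN's noise label is what makes it usable as an anomaly detector: points that are neither core points nor density-reachable from one are labelled -1. A minimal sketch over plain 2D points; the paper's graph concept frame (CFG) representation is not reproduced here, and all names and parameters are illustrative assumptions:

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN. Returns a label per point; -1 marks noise,
    which is the set of anomaly candidates."""
    def neighbors(i):
        # All points within eps of point i (including i itself)
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps ** 2]

    labels = [None] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1                  # not dense enough: noise / anomaly
            continue
        labels[i] = cluster                 # new cluster seeded at a core point
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster         # border point adopted by the cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:      # expand only through core points
                seeds.extend(j_nbrs)
        cluster += 1
    return labels
```

The abstract's caveat maps directly onto this sketch: DBSCAN flags only density-based outliers (far from every dense region), so anomalies that sit near a cluster but are semantically unusual require the additional representation the paper proposes.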

Publication Date
Tue Nov 30 2021
Journal Name
Iraqi Journal Of Science
Inspecting Hybrid Data Mining Approaches in Decision Support Systems for Humanities Texts Criticism

The majority of systems dealing with natural language processing (NLP) and artificial intelligence (AI) can assist in making automated and automatically supported decisions. However, these systems may face challenges and difficulties, or find it confusing to identify the information (characterization) required to elicit a decision, when extracting or summarizing relevant information from large text documents or colossal content. When these documents are obtained online, for instance from social networking or social media, such sites undergo a remarkable increase in textual content. The main objective of the present study is to conduct a survey and show the latest developments in the implementation of text-mining techniqu…

Publication Date
Wed Dec 30 2020
Journal Name
Iraqi Journal Of Science
A Comparison of Different Estimation Methods to Handle Missing Data in Explanatory Variables

Missing data is one of the problems that may occur in regression models. It is usually handled by the deletion mechanism available in statistical software, but this weakens statistical inference because deletion reduces the sample size. In this paper, the Expectation-Maximization algorithm (EM), the Multicycle Expectation-Conditional Maximization algorithm (MC-ECM), Expectation-Conditional Maximization Either (ECME), and Recurrent Neural Networks (RNN) are used to estimate multiple regression models when explanatory variables have some missing values. Experimental datasets were generated using the Visual Basic programming language, with missing values of explanatory variables following a missing-at-random general pattern and s…
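The EM-style treatment of missing explanatory values can be illustrated with a simplified single-variable alternation: fit a regression on the currently filled data (an M-step), then replace the missing entries with their regression expectations (an E-step), and repeat until stable. This is a hypothetical sketch of the general idea, not the MC-ECM, ECME, or RNN estimators compared in the paper:

```python
def impute_em_style(x1, x2):
    """Iteratively re-impute missing values of x2 (marked None) from a simple
    linear regression on the fully observed x1. A single-imputation sketch
    of the EM alternation, avoiding listwise deletion."""
    miss = [i for i, v in enumerate(x2) if v is None]
    obs = [i for i, v in enumerate(x2) if v is not None]
    mean_obs = sum(x2[i] for i in obs) / len(obs)
    filled = [v if v is not None else mean_obs for v in x2]   # crude start
    n = len(x1)
    for _ in range(20):
        # M-step: least-squares line x2 ~ a + b*x1 on the current filled data
        mx = sum(x1) / n
        my = sum(filled) / n
        sxx = sum((x1[i] - mx) ** 2 for i in range(n))
        b = sum((x1[i] - mx) * (filled[i] - my) for i in range(n)) / sxx
        a = my - b * mx
        # E-step: replace missing entries with their regression expectation
        for i in miss:
            filled[i] = a + b * x1[i]
    return filled
```

Because the whole sample stays in the fit, the estimated coefficients keep the precision that the deletion mechanism would throw away, which is the point the abstract makes against deletion.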

Publication Date
Tue Nov 30 2021
Journal Name
Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: for a higher level of secure communication, the key plays an essential role. To increase the level of security in any communication, both parties must have a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. The encryption key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposed a combination of two efficient encryption algorithms…
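The encrypt-decrypt-encrypt (EDE) structure of Triple DES referred to above can be illustrated with a toy block cipher standing in for DES (modular addition on 8-bit blocks, offering no security whatsoever); the paper's actual scheme pairs 3DES with NTRU-based key material, which is beyond this sketch:

```python
def toy_encrypt(block: int, key: int) -> int:
    """Toy 8-bit 'cipher' standing in for DES. Illustration only."""
    return (block + key) % 256

def toy_decrypt(block: int, key: int) -> int:
    return (block - key) % 256

def triple_ede_encrypt(block: int, k1: int, k2: int, k3: int) -> int:
    """3DES-style EDE composition: encrypt with k1, decrypt with k2,
    encrypt with k3."""
    return toy_encrypt(toy_decrypt(toy_encrypt(block, k1), k2), k3)

def triple_ede_decrypt(block: int, k1: int, k2: int, k3: int) -> int:
    """Inverse composition: decrypt with k3, encrypt with k2, decrypt with k1."""
    return toy_decrypt(toy_encrypt(toy_decrypt(block, k3), k2), k1)
```

The EDE shape (rather than encrypt-encrypt-encrypt) was chosen for backward compatibility: setting k1 = k2 collapses the construction to a single encryption under k3. It also shows why 3DES security hinges entirely on how the three subkeys are generated, the weakness the paper's NTRU-based key generation targets.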

Publication Date
Fri Jul 21 2023
Journal Name
Journal Of Engineering
A Modified 2D-Checksum Error Detecting Method for Data Transmission in Noisy Media

In data transmission, a change in a single bit of the received data may lead to misunderstanding or to a disaster. Each bit in the sent information has high priority, especially information such as the address of the receiver. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but failed to cope with an increasing number of errors.
Two novel methods are suggested to detect binary bit-change errors when transmitting data in noisy media. Those methods were: 2D-Checksum me…
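Classic 2D parity, the baseline that a modified 2D-checksum improves on, can be sketched as follows. The sketch also shows the failure mode that motivates modification: a rectangular four-bit error pattern cancels in every row and column parity. The names and block size are illustrative assumptions:

```python
def parity_bits(matrix):
    """Row and column parity for a 2D block of bits (classic 2D parity check)."""
    rows = [sum(r) % 2 for r in matrix]
    cols = [sum(c) % 2 for c in zip(*matrix)]
    return rows, cols

def detect(matrix, rows, cols):
    """True if the received block is consistent with the transmitted parities;
    False means at least one bit error was detected."""
    r2, c2 = parity_bits(matrix)
    return r2 == rows and c2 == cols
```

Any single-bit flip disturbs exactly one row parity and one column parity, so it is always caught; but flipping the four corners of a rectangle leaves every parity unchanged, which is one reason plain 2D schemes fail as the number of errors grows.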
