Chaos-based Color Image Steganography Method Using 3D Cat Map

Steganography is a technique for hiding a secret message within a different multimedia carrier so that the secret message cannot be identified. The goals of steganography techniques include improvements in imperceptibility, information hiding, capacity, security, and robustness. In spite of the numerous secure methodologies that have been introduced, there are ongoing attempts to develop these techniques further to make them more secure and robust. This paper introduces a color image steganographic method based on a secret map, namely the 3D cat map. The proposed method aims to embed data using a secure structure of chaotic steganography, ensuring better security. Rather than using the complete image for data hiding, the image band and pixel coordinates are selected using the 3D map, which produces irregular outputs for embedding the secret message randomly in the least significant bit (LSB) of the cover image. This increases the complexity encountered by attackers. The performance of the proposed method was evaluated, and the results reveal that it provides a high level of security by defeating various attacks, such as statistical attacks, with no detectable distortion in the stego-image. Comparison results confirm that the proposed method surpasses other existing steganographic methods with respect to the Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR).
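As a rough illustration of the embedding structure described in this abstract, the Python sketch below iterates a 3D cat-map matrix to pick a pixel coordinate and a color band for each secret bit and writes that bit into the chosen pixel's least significant bit. The map composition, the key handling, and the collision check are illustrative assumptions, not the paper's exact construction.

import numpy as np

def cat_map_3d(n):
    # Compose three 2D Arnold cat-map shears into one 3D unimodular matrix (det = 1 mod n).
    ax = np.array([[1, 0, 0], [0, 1, 1], [0, 1, 2]])   # shear in the y-z plane
    ay = np.array([[2, 0, 1], [0, 1, 0], [1, 0, 1]])   # shear in the x-z plane
    az = np.array([[1, 1, 0], [1, 2, 0], [0, 0, 1]])   # shear in the x-y plane
    return (az @ ay @ ax) % n

def embed_lsb(cover, bits, key=(7, 11, 13)):
    # Hide a sequence of 0/1 bits at pseudo-randomly chosen (row, column, band) slots.
    stego = cover.copy()
    h, w, _ = stego.shape
    n = max(h, w)
    a = cat_map_3d(n)
    v = np.array(key) % n            # the secret initial state acts as the stego key
    used = set()
    for bit in bits:
        while True:                  # iterate the map until an unused slot appears
            v = a @ v % n
            pos = (int(v[0]) % h, int(v[1]) % w, int(v[2]) % 3)
            if pos not in used:
                used.add(pos)
                break
        r, c, band = pos
        stego[r, c, band] = (stego[r, c, band] & 0xFE) | bit   # overwrite the LSB
    return stego

Extraction would re-run the same map from the same key and read the LSBs back in the same order; without the key, the embedding positions appear irregular across bands and pixels.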

Publication Date: Tue Oct 01 2013
Journal Name: Journal Of Economics And Administrative Sciences
Comparison Robust M Estimate With Cubic Smoothing Splines For Time-Varying Coefficient Model For Balance Longitudinal Data

In this research, a comparison has been made between the robust M-estimators for the cubic smoothing splines technique, introduced to avoid the problems of non-normality and error contamination in the data, and the traditional estimation method for cubic smoothing splines, using two comparison criteria (MADE and WASE) for different sample sizes and contamination levels. The aim is to estimate the time-varying coefficient functions for balanced longitudinal data, which consist of observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points, since the repeated measurements within the subjects are almost connected…
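The abstract above is truncated, but the core idea, downweighting outlying observations inside a cubic smoothing spline fit, can be sketched briefly. The snippet below (Python, illustrative only) wraps SciPy's cubic smoothing spline in an iteratively reweighted loop with Huber weights; the paper's exact M-estimation scheme and its MADE/WASE comparison criteria are not reproduced here.

import numpy as np
from scipy.interpolate import UnivariateSpline

def robust_smoothing_spline(x, y, s, c=1.345, n_iter=10):
    # Iteratively reweighted cubic smoothing spline with Huber weights.
    w = np.ones_like(y)
    for _ in range(n_iter):
        spl = UnivariateSpline(x, y, w=w, k=3, s=s)
        r = y - spl(x)
        scale = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12   # robust MAD scale
        u = np.abs(r) / scale
        w = np.minimum(1.0, c / np.maximum(u, 1e-12))   # Huber weights downweight outliers
    return spl

x = np.linspace(0, 10, 200)
y = np.sin(x) + 0.2 * np.random.default_rng(0).normal(size=200)
y[::25] += 4.0                                          # inject contamination
classic = UnivariateSpline(x, y, k=3, s=len(x))         # traditional smoothing spline
robust = robust_smoothing_spline(x, y, s=len(x))        # M-type alternative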

Publication Date: Thu Dec 01 2011
Journal Name: Journal Of Engineering
ORGANIZATION OF MEMORY CHIPS IN MEMORY SYSTEMS THAT HAVE WORD SIZE WIDER THAN 8-BIT

This paper presents a method for organizing memory chips when they are used to build memory systems with a word size wider than 8 bits. Most memory chips have an 8-bit word size. When the memory system has to be built from several memory chips of various sizes, this method gives all possible organizations of these chips in the memory system. The paper also suggests a precise definition of the term “memory bank” as it is usually used in memory systems. Finally, an illustrative design problem is worked through to demonstrate the presented method in practice.
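As a toy illustration of the counting involved (not the paper's full enumeration procedure), the short Python sketch below shows how many 8-bit chips sit side by side in one bank to widen the word, and how many banks are stacked to reach the required depth; the chip depth of 1K words is an assumed example value.

def organize(word_bits, total_words, chip_bits=8, chip_words=1024):
    chips_per_bank = word_bits // chip_bits     # chips placed side by side to widen the word
    banks = total_words // chip_words           # identical banks stacked to reach the depth
    return {"chips_per_bank": chips_per_bank,
            "banks": banks,
            "total_chips": chips_per_bank * banks}

# A 32-bit wide, 4K-word memory built from 8-bit x 1K chips: 4 chips per bank, 4 banks, 16 chips.
print(organize(word_bits=32, total_words=4096))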

Publication Date: Tue Oct 20 2020
Journal Name: Ibn Al-haitham Journal For Pure And Applied Sciences
Estimating of Survival Function under Type One Censoring Sample for Mixture Distribution

In this article, it is of interest to estimate and derive the three parameters, two scale parameters and one shape parameter, of a new mixture distribution for singly type-I censored data, a branch of right-censored sampling. Some special mathematical and statistical properties of this new mixture distribution, which is considered one of the continuous distributions characterized by its flexibility, are then defined. Next, the maximum likelihood estimation method for singly type-I censored data, based on the Newton-Raphson matrix procedure, is used to estimate the values of these three parameters, utilizing real data taken from the National Center for Research and Treatment of Hematology/University of Mus…
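Because the paper's three-parameter mixture density is not reproduced in this truncated abstract, the sketch below uses a plain exponential rate as a stand-in simply to show the Newton-Raphson structure for singly type-I (right) censored data; the parameterization and starting value are illustrative assumptions.

import numpy as np

def mle_exponential_type1(t, censor_time, tol=1e-8, max_iter=100):
    # Newton-Raphson MLE of an exponential rate under type-I right censoring.
    # t holds the observed times, with censored units recorded at censor_time.
    d = float((t < censor_time).sum())      # number of uncensored (event) observations
    total = float(t.sum())                  # total time on test
    theta = np.log(1.0 / t.mean())          # iterate on log(rate) for numerical stability
    for _ in range(max_iter):
        lam = np.exp(theta)
        score = d - lam * total             # first derivative of the log-likelihood in theta
        hess = -lam * total                 # second derivative
        step = score / hess
        theta -= step                       # Newton-Raphson update
        if abs(step) < tol:
            break
    return np.exp(theta)

rng = np.random.default_rng(1)
t = np.minimum(rng.exponential(scale=2.0, size=200), 3.0)   # synthetic data, censored at 3.0
print(mle_exponential_type1(t, censor_time=3.0))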

Publication Date: Wed Oct 17 2018
Journal Name: Journal Of Economics And Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data


This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the downhill simplex algorithm. Two data cases were considered; the first assumed the original (non-contaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The downhill simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in both the non-contaminated and contaminated data cases.
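The comparison described above can be sketched generically with SciPy, whose Nelder-Mead method is the downhill simplex algorithm. The four-parameter compound exponential Weibull-Poisson density is not reproduced here, so a two-parameter Weibull log-likelihood stands in purely to show the fitting structure.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_log_likelihood(theta, data):
    shape, scale = theta
    if shape <= 0 or scale <= 0:
        return np.inf                        # keep the simplex inside the valid region
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

data = weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=0)   # synthetic sample
fit = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(data,), method="Nelder-Mead")
print(fit.x)                                 # estimated (shape, scale)

In the simulation setting of the paper, the same structure would be repeated over sample sizes, initial values, and contamination levels, with the estimated parameters fed into the probability and reliability functions.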

 

Publication Date: Wed Dec 30 2020
Journal Name: Iraqi Journal Of Science
A Comparison of Different Estimation Methods to Handle Missing Data in Explanatory Variables

Missing data is one of the problems that may occur in regression models. This problem is usually handled by the deletion mechanism available in statistical software, which weakens statistical inference because deletion reduces the sample size. In this paper, the Expectation Maximization algorithm (EM), the Multicycle Expectation-Conditional Maximization algorithm (MC-ECM), Expectation-Conditional Maximization Either (ECME), and Recurrent Neural Networks (RNN) are used to estimate multiple regression models when the explanatory variables have some missing values. Experimental datasets were generated using the Visual Basic programming language, with missing values in the explanatory variables following a missing-at-random mechanism with a general pattern and s…
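A stripped-down version of the EM idea for missing explanatory values is sketched below: missing entries are repeatedly replaced by their conditional means under a multivariate-normal model, after which the regression can be refit on the completed design matrix. This is a simplified conditional-mean loop for illustration; full EM also carries a conditional-covariance correction, and the paper's MC-ECM, ECME, and RNN variants are not reproduced.

import numpy as np

def em_impute(X, n_iter=20):
    X = X.astype(float).copy()
    miss = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[miss] = np.take(col_means, np.where(miss)[1])     # crude initial fill with column means
    for _ in range(n_iter):
        mu = X.mean(axis=0)
        sigma = np.cov(X, rowvar=False)
        for i in np.where(miss.any(axis=1))[0]:
            m, o = miss[i], ~miss[i]
            # conditional mean of the missing block given the observed block
            coef = sigma[np.ix_(m, o)] @ np.linalg.pinv(sigma[np.ix_(o, o)])
            X[i, m] = mu[m] + coef @ (X[i, o] - mu[o])
    return X

# After imputation, ordinary least squares gives the regression coefficients, e.g.:
# beta = np.linalg.lstsq(np.column_stack([np.ones(len(y)), X_imputed]), y, rcond=None)[0]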

Publication Date: Sun Dec 04 2016
Journal Name: Baghdad Science Journal
Synthesis, Characterization, Antimicrobial and Theoretical Studies of V(IV), Fe(III), Co(II), Ni(II), Cu(II), and Zn(II) Complexes with Bidentate (NN) Donor Azo Dye Ligand

The new 4-[(7-chloro-2,1,3-benzoxadiazole)azo]-4,5-diphenyl imidazole (L) has been synthesized and characterized by microelemental and thermal analyses as well as 1H NMR, FT-IR, and UV-Vis spectroscopic techniques. (L) acts as a ligand coordinating with the metal ions V(IV), Fe(III), Co(II), Ni(II), Cu(II), and Zn(II). The structures of the new compounds were characterized by elemental and thermal analyses as well as FT-IR and UV-Vis spectra. The magnetic properties and electrical conductivities of the metal complexes were also determined. The nature of the complexes formed in ethanol was studied following the mole ratio method. The work also includes a theoretical treatment of the formed complexes in the gas phase, carried out using the (hyperch…

Publication Date: Thu Jul 01 2021
Journal Name: University Of Northampton Pue
Validating a Proposed Data Mining Approach (SLDM) for Motion Wearable Sensors to Detect the Early Signs of Lameness in Sheep

Publication Date: Sun Feb 25 2024
Journal Name: Baghdad Science Journal
Prioritized Text Detergent: Comparing Two Judgment Scales of Analytic Hierarchy Process on Prioritizing Pre-Processing Techniques on Social Media Sentiment Analysis

Most companies use social media data for business. Sentiment analysis automatically gathers, analyses, and summarizes this type of data. Managing unstructured social media data is difficult, and noisy data is a challenge to sentiment analysis. Since over 50% of the sentiment analysis process is data pre-processing, processing big social media data is challenging too. If pre-processing is carried out correctly, data accuracy may improve. The sentiment analysis workflow is also highly dependent on its pre-processing stage. Because no pre-processing technique works well in all situations or with all data sources, choosing the most important ones is crucial, and prioritization is an excellent technique for doing so. As one of many Multi-Criteria Decision Mak…
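The AHP step referred to above reduces, in its simplest form, to extracting the principal eigenvector of a pairwise comparison matrix and checking its consistency. The sketch below shows that calculation in Python; the 3x3 judgment matrix is a made-up example, not the study's actual comparisons of pre-processing techniques or either of its two judgment scales.

import numpy as np

def ahp_priorities(pairwise):
    # Principal right eigenvector of the pairwise comparison matrix, normalised to sum to 1.
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

def consistency_ratio(pairwise):
    n = pairwise.shape[0]
    lam_max = np.max(np.linalg.eigvals(pairwise).real)
    ci = (lam_max - n) / (n - 1)                            # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]   # Saaty's random indices
    return ci / ri

# Three hypothetical pre-processing techniques compared pairwise on a 1-9 judgment scale.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])
print(ahp_priorities(A), consistency_ratio(A))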

Publication Date: Thu Oct 31 2013
Journal Name: Al-khwarizmi Engineering Journal
Pressure Control of Electro-Hydraulic Servovalve and Transmission Line Effect

The effect of a long transmission line (TL) between the actuator and the hydraulic control valve is sometimes significant. This study is concerned with modeling the TL that carries the oil from the electro-hydraulic servovalve to the actuator. The pressure inside the TL is controlled by the electro-hydraulic servovalve through a voltage supplied to the servovalve amplifier. The flow rate through the TL has been simulated using the lumped π element electrical analogy method for laminar flow. The control voltage supplied to the servovalve can be provided either directly from a voltage function generator or indirectly through a C++ program connected to the DAP-view program built into the DAP-card data acquisition…
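A minimal version of the lumped π element analogy mentioned above can be written as three state equations: half the line capacitance at each end and the laminar resistance plus fluid inertance in the series branch. The sketch below integrates these with SciPy; the oil properties and line dimensions are placeholder values, not those of the paper's test rig.

import numpy as np
from scipy.integrate import solve_ivp

rho, mu, beta = 870.0, 0.03, 1.4e9          # oil density, viscosity, bulk modulus (assumed)
d, l = 0.01, 5.0                            # line bore and length in metres (assumed)
A = np.pi * d ** 2 / 4
R = 128 * mu * l / (np.pi * d ** 4)         # laminar (Hagen-Poiseuille) resistance
L = rho * l / A                             # fluid inertance
C = A * l / beta                            # fluid capacitance of the line volume

def pi_element(t, x, q_in, q_out):
    p1, q, p2 = x
    dp1 = (q_in - q) / (C / 2)              # upstream half-capacitance
    dq = (p1 - p2 - R * q) / L              # series resistance and inertance
    dp2 = (q - q_out) / (C / 2)             # downstream half-capacitance
    return [dp1, dq, dp2]

# Step in supply flow with a blocked downstream end: pressure builds up along the line.
sol = solve_ivp(pi_element, (0.0, 0.05), [0.0, 0.0, 0.0], args=(1e-4, 0.0), max_step=1e-4)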

Publication Date: Wed Apr 25 2018
Journal Name: Ibn Al-haitham Journal For Pure And Applied Sciences
On Estimating the Survival Function for the Patients Suffer from the Lung Cancer Disease

In this paper, the survival function has been estimated for patients with lung cancer using different parametric estimation methods, based on a sample of complete real data describing the survival periods of patients who were ill with lung cancer, dated either from the diagnosis of the disease or from the patients' entry into a hospital, over a period of two years (from 2012 to the end of 2013). Comparisons between the mentioned estimation methods have been performed using the mean squared error as a statistical indicator, concluding that estimating the survival function for lung cancer using the pre-test single-stage shrinkage estimator method was the best.
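The paper's pre-test single-stage shrinkage estimator is not reproducible from this abstract, but the general idea of shrinking a parametric survival estimate toward a prior guess and comparing by mean squared error can be sketched as follows; the exponential model, the prior guess, and the fixed shrinkage weight are all illustrative assumptions.

import numpy as np

def exponential_survival_estimates(times, theta0, k=0.5):
    theta_mle = times.mean()                            # MLE of the exponential mean lifetime
    theta_shrunk = k * theta_mle + (1 - k) * theta0     # shrink toward the prior guess theta0
    survival = lambda t, th: np.exp(-t / th)
    return survival, theta_mle, theta_shrunk

times = np.random.default_rng(0).exponential(scale=24.0, size=40)   # synthetic survival months
s, th_mle, th_shr = exponential_survival_estimates(times, theta0=20.0)
grid = np.linspace(0.0, 60.0, 7)
true_s = np.exp(-grid / 24.0)                           # known truth for the synthetic data
mse_mle = np.mean((s(grid, th_mle) - true_s) ** 2)
mse_shr = np.mean((s(grid, th_shr) - true_s) ** 2)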
