Data Hiding in 3D-Medical Image

Information hiding strategies have recently gained popularity in a variety of fields. Digital audio, video, and images are increasingly being labelled with distinct but imperceptible marks that may contain a hidden copyright notice or serial number, or even directly help to prevent unauthorized duplication. This approach is extended to medical images by hiding secret information in them using the structure of a different file format; the hidden information may be related to the patient. In this paper, a method for hiding secret information in DICOM images is proposed based on the Discrete Wavelet Transform (DWT). Firstly, all slices of a 3D image are segmented into blocks of a specific size and the host image is assembled depending on a generated key; secondly, the block number and slice number are selected; thirdly, the low-high (LH) band is used for embedding after adding the generated number; fourthly, the Hessenberg transform is applied to the blocks that partition the LH band into a specific size. The secret information (image or text) is a binary value. It is embedded by setting the positive value on the diagonal to an odd value if the secret bit is one and to an even value if the secret bit is zero. Several tests were applied, such as the mean square error (MSE), peak signal-to-noise ratio (PSNR), and structural similarity index measure (SSIM). Analyses such as noise addition, scaling, and rotation were applied to test robustness. The results of these tests showed the strength of the proposed method.
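The odd/even parity embedding step described above can be sketched as a minimal illustration; the diagonal values below are made-up stand-ins for entries of Hessenberg-transformed DWT blocks, not the paper's actual data or implementation:

```python
def embed_bit(diag_value, bit):
    """Embed one secret bit by forcing the (positive) diagonal value
    to odd parity for bit 1 and even parity for bit 0."""
    v = int(round(abs(diag_value)))
    if v % 2 != bit:            # parity mismatch: nudge by one
        v += 1
    return v

def extract_bit(diag_value):
    return int(round(abs(diag_value))) % 2

# hypothetical diagonal values from transformed blocks
secret_bits = [1, 0, 1, 1]
diagonals = [12.3, 7.8, 4.1, 9.6]
stego = [embed_bit(d, b) for d, b in zip(diagonals, secret_bits)]
recovered = [extract_bit(s) for s in stego]   # recovers [1, 0, 1, 1]
```

Because extraction only reads parity, the secret survives as long as the marked values are not perturbed by more than half a quantization step.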

Publication Date
Thu Sep 30 2021
Journal Name
Iraqi Journal Of Science
Image Focus Enhancement Using Focusing Filter and DT-CWT Based Image Fusion

Combining multi-focus images of the same scene taken at different focus distances can produce clearer and sharper images with a larger depth of field. Most available image fusion algorithms produce good results; however, they do not take the focus of the image into account. In this paper, a fusion method is proposed to increase the focus of the fused image and to achieve the highest image quality using the suggested focusing filter and the Dual Tree-Complex Wavelet Transform (DT-CWT). The focusing filter consists of a combination of two filters: a Wiener filter and a sharpening filter. This filter is applied before the fusion operation using DT-CWT. The common fusion rules, which are the average-fusion rule and the maximum …

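The focusing filter described above can be sketched roughly as a denoising pass followed by a sharpening pass; the local-mean Wiener approximation, the unsharp-masking sharpener, and the `noise_var` parameter are all assumptions for illustration, not the paper's exact filters:

```python
import numpy as np

def box_blur(img):
    """3x3 mean filter with edge padding."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def focusing_filter(img, noise_var=0.01):
    """Local-mean Wiener-style smoothing followed by unsharp masking."""
    img = np.asarray(img, dtype=float)
    mean = box_blur(img)
    var = np.maximum(box_blur(img ** 2) - mean ** 2, 0.0)
    denoised = mean + var / (var + noise_var) * (img - mean)
    return denoised + (denoised - box_blur(denoised))   # sharpening stage
```

In the paper's pipeline this filtering would be applied to each source image before the DT-CWT decomposition and fusion.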
Publication Date
Wed Aug 31 2022
Journal Name
Al-kindy College Medical Journal
Assessment of Awareness And Knowledge among Medical Students Regarding Radiation Exposure from Common Diagnostic Imaging Procedures: Radiation exposure awareness among medical students

Objective: To assess the awareness and knowledge of our medical students regarding dose levels of imaging procedures and radiation safety issues, and to determine how the clinical radiology curriculum in the college's medical program impacts such knowledge.

Subjects and methods: This is a cross-sectional study conducted among 150 medical students at Alkindy College of Medicine from January 2021 to July 2021, regardless of age or gender. The study included all six grades of the 2020-2021 academic year. A questionnaire consisting of 12 multiple-choice questions was administered via an online survey using Google Forms. The questions were divided into two parts …

Publication Date
Sat Oct 08 2022
Journal Name
Aro-the Scientific Journal Of Koya University
Data Analytics and Techniques

Big data of different types, such as texts and images, is rapidly generated by the internet and other applications. Dealing with this data using traditional methods is not practical, since it comes in various sizes and types and with different processing-speed requirements. Data analytics has therefore become an important tool, because only meaningful information is analyzed and extracted, which makes it essential for big data applications. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution in data analytics based on artificial intelligence algorithms might provide …

Publication Date
Mon Apr 01 2019
Journal Name
2019 International Conference on Automation, Computational and Technology Management (ICACTM)
Multi-Resolution Hierarchical Structure for Efficient Data Aggregation and Mining of Big Data

Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional algorithms in data mining and machine learning do not scale well with data size. Mining and learning from big data need time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an…

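The multi-resolution trade-off described above can be illustrated with a toy summary structure; the pairwise-averaging scheme below is a simplified stand-in for the paper's aggregation structure, invented here for illustration:

```python
import numpy as np

def multi_resolution_summary(data, levels=3):
    """Summarize a 1-D stream at several resolutions: each level halves
    the number of cells by averaging adjacent pairs, trading accuracy
    for memory (a simplified stand-in for the paper's structure)."""
    summaries = [np.asarray(data, dtype=float)]
    for _ in range(levels):
        prev = summaries[-1]
        if len(prev) % 2:                    # pad odd lengths
            prev = np.append(prev, prev[-1])
        summaries.append(prev.reshape(-1, 2).mean(axis=1))
    return summaries

s = multi_resolution_summary([1, 2, 3, 4, 5, 6, 7, 8])
# s[0] keeps all 8 values; s[3] is a single cell holding the overall mean
```

A mining algorithm can then read the coarsest level that meets its accuracy requirement, touching far fewer cells than the raw data.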
Publication Date
Fri Mar 31 2017
Journal Name
Al-khwarizmi Engineering Journal
Big-data Management using Map Reduce on Cloud: Case study, EEG Images' Data

A database is an organized and distributed collection of data arranged so that the user can access the stored data in a simple and convenient way. In the era of big data, however, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…

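The MapReduce pattern used above can be illustrated with a toy in-memory run; the (channel, sample) records and the per-channel averaging are invented stand-ins for the EEG workload, and a real deployment would execute the phases on a Hadoop cluster rather than in one process:

```python
from collections import defaultdict

# toy records standing in for distributed EEG samples
records = [("C3", 0.2), ("C4", 0.5), ("C3", 0.4), ("C4", 0.1)]

def map_phase(record):
    channel, value = record
    yield channel, value                     # emit key/value pairs

def reduce_phase(key, values):
    return key, sum(values) / len(values)    # mean amplitude per channel

# shuffle step: group mapped pairs by key, as Hadoop does between phases
groups = defaultdict(list)
for rec in records:
    for key, value in map_phase(rec):
        groups[key].append(value)

result = dict(reduce_phase(k, vs) for k, vs in groups.items())
```

Because the map and reduce phases are independent per key, the same code structure parallelizes across cluster nodes, which is the source of the response-time reduction reported above.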
Publication Date
Wed Jul 25 2018
Journal Name
International Journal Of Engineering Trends And Technology
Polynomial Color Image Compression

Publication Date
Sun Jun 01 2008
Journal Name
Baghdad Science Journal
Monochrome Image Hologram (MIH)

A new computer-generated optical element called a monochrome image hologram (MIH) is described. A real, nonnegative function is used to represent the transmittance of the synthesized hologram. The technique uses the positions of the samples in the synthesized hologram to record the phase information of a complex wavefront. The synthesized hologram is printed on a laser printer and recorded on film. Finally, the reconstruction is performed computationally.

Publication Date
Thu Oct 01 2015
Journal Name
Journal Of Economics And Administrative Sciences
Estimation Multivariate data points in spatial statistics with application

This paper deals with how to estimate non-measured points in spatial data when the spatial sample contains only a few observations, which is unfavourable for the estimation process, since it is well known that with larger data sets the estimates of non-measured points are better and the estimation variance is smaller. The idea of this paper is therefore to take advantage of secondary (auxiliary) data that have a strong correlation with the primary (basic) data in order to estimate individual non-measured points, as well as to measure the estimation variance. The co-kriging technique has been used in this field to build spatial predictions, and the idea was then applied to real data in th…

Publication Date
Wed Jan 13 2021
Journal Name
Iraqi Journal Of Science
Modern Probabilistic Model: Filtering Massive Data in E-learning

So much information keeps being digitized and stored in several forms (web pages, scientific articles, books, etc.) that the task of discovering information has become more and more challenging. The requirement for new IT tools to retrieve and organize these vast amounts of information is growing step by step. Furthermore, e-learning platforms are developing to meet the intended needs of students.
The aim of this article is to utilize machine learning to determine the appropriate actions that support the learning procedure, and Latent Dirichlet Allocation (LDA) to find the topics contained in the connections proposed in a learning session. Our purpose is also to introduce a course which moves toward the student's attempts a…

Publication Date
Fri Jul 01 2022
Journal Name
Iraqi Journal Of Science
Statistical Analysis of Extreme Rainfall Data in Baghdad City

Studying extreme precipitation is very important in Iraq. In particular, the last decade witnessed an increasing trend in extreme precipitation as the climate changes, some episodes of which had disastrous consequences for the social and economic environment in many parts of the country. In this paper, a statistical analysis of rainfall data is performed. Annual maximum rainfall data obtained from monthly records for a period of 127 years (1887-2013 inclusive) at the Baghdad meteorology station have been analyzed. The three distributions chosen to fit the data were the Gumbel, Fréchet, and generalized extreme value (GEV) distributions. Using the maximum likelihood method, results showed that the GEV distribution was the best, followed by the Fréchet distribution …

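As a sketch of how fitted GEV parameters are typically used in such an analysis, the T-year return level follows from the GEV quantile function; the parameter values below are made up for illustration and are not the paper's fitted estimates:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level from GEV parameters (location mu, scale sigma,
    shape xi != 0): z_T = mu + (sigma / xi) * ((-ln(1 - 1/T)) ** (-xi) - 1)."""
    y = -math.log(1.0 - 1.0 / T)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# illustrative, made-up parameters -- not the paper's fitted values
levels = {T: gev_return_level(30.0, 10.0, 0.1, T) for T in (10, 50, 100)}
```

With a positive shape parameter (heavy upper tail), the return level grows without bound as the return period increases, which is why the choice among Gumbel, Fréchet, and GEV matters for design decisions.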