Articles
Multiple and Coherent Noise Removal from X-Profile 2D Seismic Data of Southern Iraq Using Normal Move Out-Frequency Wavenumber Technique

Multiple elimination (de-multiple) is a seismic processing step that removes the effect of multiples and delineates the correct primary reflectors. Applying normal moveout to flatten the primaries and then transforming the data to the frequency-wavenumber domain provides a way to eliminate multiples: the flattened primaries align with the zero-wavenumber axis of the frequency-wavenumber domain, while all other events (multiples and random noise) are distributed elsewhere. A dip filter that passes the aligned data and rejects everything else then separates primaries from multiples once the data are transformed back from the frequency-wavenumber domain to the time-distance domain. For this reason, the technique is referred to here as the normal moveout-frequency-wavenumber (NMO-FK) method for multiple elimination. The method is tested on a synthetic reflection event to verify its validity and then applied to real X-profile 2D field seismic data from southern Iraq. The results confirm that internal multiples exist in the deep reflection data of Iraq and must be removed so that interpretation of the true reflectors is valid. The final stacked seismic section processed with the NMO-FK technique shows clearer and sharper reflectors than the conventional NMO stack. The open-source, reproducible Madagascar package is used for all processing steps of this study; it proved efficient, accurate, and easy to use for the normal moveout, frequency-wavenumber, and dip-filter programs. The aim of the current study is to separate internal multiples and noise from real 2D seismic data.
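As a rough illustration of the workflow described in this abstract (not the authors' Madagascar scripts), the following Python/NumPy sketch applies a fan-shaped dip filter in the frequency-wavenumber domain to an NMO-corrected gather; the gather layout, sampling intervals, and the max_dip threshold are assumptions made for the example.

```python
import numpy as np

def fk_dip_filter(gather, dt, dx, max_dip=0.0005):
    """Illustrative f-k dip filter for an NMO-corrected gather.

    gather : 2D array (time samples x offset traces) after NMO,
             so primaries are approximately flat.
    dt, dx : time and trace spacings (assumed known).
    max_dip: maximum |k/f| slope (s per unit distance) that is passed;
             flat primaries map near k = 0 and are kept, while events
             with residual moveout (multiples) map to larger |k| and
             are rejected.
    """
    nt, nx = gather.shape
    spectrum = np.fft.fft2(gather)                 # to the f-k domain
    f = np.fft.fftfreq(nt, d=dt)[:, None]          # temporal frequencies
    k = np.fft.fftfreq(nx, d=dx)[None, :]          # spatial wavenumbers

    # Pass the fan around the zero-wavenumber axis, reject steeper dips.
    mask = np.abs(k) <= max_dip * np.maximum(np.abs(f), 1e-12)
    return np.real(np.fft.ifft2(spectrum * mask))  # back to time-distance
```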

Publication Date: Wed Mar 10 2021
Journal Name: Baghdad Science Journal
Merge Operation Effect On Image Compression Using Fractal Technique

Fractal image compression offers desirable properties such as fast decoding and very good rate-distortion curves, but it suffers from a long encoding time. Fractal image compression requires partitioning the image into range blocks. In this work, we introduce an improved partitioning process based on a merge approach, exploiting the fact that some ranges are connected to others. The method reduces the encoding time of the technique by reducing the number of range blocks, merging them according to statistical measures computed between them. Experimental results on standard images show that the proposed method decreases the encoding time while the visual quality remains acceptable.
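The exact statistical measures are not spelled out in the abstract, so the sketch below is only an assumed illustration: horizontally adjacent range blocks are merged when their means and variances are close, which reduces the number of ranges that must be encoded. The block size and tolerances are hypothetical.

```python
import numpy as np

def merge_ranges(image, block=8, mean_tol=4.0, var_tol=20.0):
    """Illustrative merge step: group horizontally adjacent range blocks
    whose mean and variance are close, so fewer ranges need encoding.
    The 8x8 block size and the tolerances are assumptions for the example.
    """
    h, w = image.shape
    ranges = []            # each entry: (row, start_col, n_merged_blocks)
    for r in range(0, h - block + 1, block):
        run_start, run_len = 0, 1
        prev = image[r:r + block, 0:block]
        for c in range(block, w - block + 1, block):
            cur = image[r:r + block, c:c + block]
            if (abs(cur.mean() - prev.mean()) < mean_tol and
                    abs(cur.var() - prev.var()) < var_tol):
                run_len += 1       # statistically similar: extend the merged range
            else:
                ranges.append((r, run_start, run_len))
                run_start, run_len = c, 1
            prev = cur
        ranges.append((r, run_start, run_len))
    return ranges
```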

Publication Date: Wed Jul 29 2020
Journal Name: Iraqi Journal of Science
Fractal Image Compression Using Block Indexing Technique: A Review

Fractal image compression represents an image by a set of affine transformations. The main concern of research on fractal image compression (FIC) algorithms is to decrease the encoding time needed to compress image data. The basic principle is that each portion of the image is similar to other portions of the same image, and many models have been developed around it. The presence of fractals was initially noticed and handled using the Iterated Function System (IFS), which is used for encoding images. In this paper, fractal image compression and its variants are reviewed along with other techniques, and the contributions are summarized to determine the fulfillment of fractal image compression …
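For readers unfamiliar with the affine-transformation idea mentioned above, here is a minimal, generic sketch of the core FIC matching step (not any specific algorithm from the reviewed literature): each range block is approximated by a contrast-scaled, brightness-shifted domain block chosen by least squares. Isometries and coefficient quantization are omitted.

```python
import numpy as np

def best_affine_match(range_block, domain_blocks):
    """For one range block R, find the (downsampled) domain block D and
    affine map s*D + o that minimise the squared error."""
    R = range_block.astype(float).ravel()
    best = (None, 0.0, 0.0, np.inf)                # (index, s, o, error)
    for i, D in enumerate(domain_blocks):
        d = D.astype(float).ravel()
        # Least-squares contrast s and brightness o for R ~ s*d + o
        A = np.vstack([d, np.ones_like(d)]).T
        (s, o), *_ = np.linalg.lstsq(A, R, rcond=None)
        err = np.sum((s * d + o - R) ** 2)
        if err < best[3]:
            best = (i, s, o, err)
    return best
```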

Publication Date: Mon Sep 01 2025
Journal Name: Journal of Information Hiding and Multimedia Signal Processing
Steganography Based on Image Compression Using a Hybrid Technique

Information security is a crucial factor when communicating sensitive information between two parties, and steganography is one of the techniques most commonly used for this purpose. This paper aims to enhance the capacity and robustness of information hiding by compressing the image data to a small size while maintaining high quality, so that the secret information remains invisible and only the sender and recipient can recognize the transmission. Three techniques are employed to conceal color and gray images: the Wavelet Color Process Technique (WCPT), the Wavelet Gray Process Technique (WGPT), and the Hybrid Gray Process Technique (HGPT). A comparison between the first and second techniques according to quality metrics, Root-Mean-Square Error (RMSE), Compression-…
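The paper's WCPT, WGPT, and HGPT techniques are its own contribution and are not reproduced here; the sketch below only shows how the quality metrics named in the abstract, RMSE and a compression ratio, are typically computed.

```python
import numpy as np

def rmse(original, reconstructed):
    """Root-mean-square error between the original image and its
    compressed-then-reconstructed version (both as arrays)."""
    diff = original.astype(float) - reconstructed.astype(float)
    return np.sqrt(np.mean(diff ** 2))

def compression_ratio(original_bytes, compressed_bytes):
    """Compression ratio = original size / compressed size."""
    return original_bytes / compressed_bytes
```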

Publication Date: Tue Jun 01 2021
Journal Name: Journal of Optical Communications
Improving the optical link for UVLC using MIMO technique
This paper proposes a theoretical treatment of an underwater wireless optical communication (UWOC) system with different modulation schemes using multiple-input multiple-output (MIMO) technology in coastal water. MIMO technology provides high data rates over a longer link distance. The technique is employed to assess the system in terms of BER, Q factor, and data rate for coastal water types. The reliability of the system is examined for the 1Tx/1Rx, 2Tx/2Rx, 3Tx/3Rx, and 4Tx/4Rx configurations. The results show that the proposed MIMO technique achieves better performance than the other configurations in terms of BER. Theoretical results were also obtained to compare PIN and APD …
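The abstract evaluates the link by BER and Q factor; a standard relation between the two for intensity-modulated optical links (not taken from the paper itself) is BER = 0.5 * erfc(Q / sqrt(2)), sketched below.

```python
from math import erfc, sqrt

def ber_from_q(q_factor):
    """Standard approximation linking the Q factor to the bit-error rate
    for intensity-modulated optical links: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * erfc(q_factor / sqrt(2))

# e.g. Q = 6 gives a BER of roughly 1e-9, a common acceptance threshold.
print(ber_from_q(6.0))
```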
Publication Date: Sat Nov 26 2022
Journal Name: Sensors
3D Object Recognition Using Fast Overlapped Block Processing Technique

Three-dimensional (3D) image and medical image processing, which are considered big-data analysis, have attracted significant attention during the last few years. Efficient 3D object recognition techniques could therefore benefit such image and medical image processing. To date, however, most of the proposed methods for 3D object recognition face major challenges of high computational complexity, because computational complexity and execution time increase as the dimensions of the object increase, which is the case in 3D object recognition. Therefore, finding an efficient method that achieves high recognition accuracy with low computational complexity is essential …
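As an assumed illustration of the overlapped-block idea (the block size and step are hypothetical, not the paper's settings), the sketch below extracts overlapping cubic blocks from a 3D volume.

```python
import numpy as np

def overlapped_blocks_3d(volume, block=8, step=4):
    """Extract overlapping blocks from a 3D volume; the overlap along each
    axis is block - step samples."""
    nz, ny, nx = volume.shape
    blocks = []
    for z in range(0, nz - block + 1, step):
        for y in range(0, ny - block + 1, step):
            for x in range(0, nx - block + 1, step):
                blocks.append(volume[z:z + block, y:y + block, x:x + block])
    return np.stack(blocks)        # shape: (n_blocks, block, block, block)

# Example: a 32^3 volume with 8^3 blocks shifted by 4 gives 7^3 = 343 blocks.
print(overlapped_blocks_3d(np.zeros((32, 32, 32))).shape)
```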

Publication Date: Wed Oct 06 2021
Journal Name: Periodicals of Engineering and Natural Sciences (PEN)
Image segmentation by using thresholding technique in two stages

Publication Date: Thu Nov 21 2019
Journal Name: Journal of Engineering
Assessment of Observed Building Structure Setback of Shops along an Arterial Road and Noise Intrusion Level

Roads, irrespective of type, have a specific standard horizontal distance, measured at 90 degrees from a lot boundary to a development, known as a setback. Non-observance of the setbacks recommended in an urban center's master plan creates a noise hazard to public health and safety, since the movement of vehicular traffic is never without attendant noise. This study assessed the noise intrusion level in shops along a section of the Ibadan-Abeokuta road, with due consideration of compliance with the recommended building-structure setback. Analysis of the noise descriptors evaluated in this study gave an average A-weighted equivalent sound pressure level of 91.3 dBA and a daytime average sound level (LD) of 92.27 dBA …
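For reference, the A-weighted equivalent level reported above is an energy average rather than an arithmetic one; a minimal sketch of that standard calculation, with made-up readings, is given below.

```python
import numpy as np

def leq(levels_dba):
    """Equivalent continuous sound level from a set of A-weighted readings:
    Leq = 10*log10(mean(10^(Li/10)))."""
    levels = np.asarray(levels_dba, dtype=float)
    return 10.0 * np.log10(np.mean(10.0 ** (levels / 10.0)))

# e.g. readings of 88, 91 and 94 dBA average to about 91.7 dBA, not 91.0,
# because louder intervals dominate the energy mean.
print(round(leq([88, 91, 94]), 1))
```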

Publication Date: Mon Apr 01 2019
Journal Name: 2019 International Conference on Automation, Computational and Technology Management (ICACTM)
Multi-Resolution Hierarchical Structure for Efficient Data Aggregation and Mining of Big Data

Big-data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional data-mining and machine-learning algorithms do not scale well with data size; mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining and …
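The paper's aggregation structure itself is not reproduced here; the following toy sketch only illustrates the general idea of keeping incremental summaries at several resolutions, with bin widths and statistics chosen arbitrarily for the example.

```python
from collections import defaultdict

class MultiResolutionAggregate:
    """Toy multi-resolution summary: numeric records are binned at several
    resolutions, and each bin keeps count/sum/sum-of-squares so mean and
    variance can be recovered at whichever resolution a task needs."""
    def __init__(self, bin_widths=(1.0, 10.0, 100.0)):
        self.bin_widths = bin_widths
        self.levels = [defaultdict(lambda: [0, 0.0, 0.0]) for _ in bin_widths]

    def add(self, value):
        """Incremental update: one pass over the data builds every level."""
        for width, level in zip(self.bin_widths, self.levels):
            stats = level[int(value // width)]
            stats[0] += 1
            stats[1] += value
            stats[2] += value * value

    def mean(self, level_idx, bin_key):
        count, total, _ = self.levels[level_idx][bin_key]
        return total / count if count else float("nan")

agg = MultiResolutionAggregate()
for v in [3.2, 3.9, 47.5, 512.0]:
    agg.add(v)
print(agg.mean(1, 0))   # mean of values in [0, 10) at the 10-wide resolution
```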

Publication Date: Mon Jan 01 2024
Journal Name: The Scientific World Journal
Efficient Removal of Brilliant Green Dye Using Mesoporous Attapulgite Clay: Investigating Adsorption Kinetics, Isotherms, and Mechanisms

This study examined the effectiveness of Iraqi attapulgite (IQATP) clay as an environmentally friendly material that readily adsorbs brilliant green (BG) dye from water systems. The clay was characterized by various complementary methods (FTIR, SEM-EDS, XRD, ICP-OES, pHpzc, and BET), which showed that the IQATP specific surface area is 29.15 m²/g. A systematic analysis was carried out to evaluate the impact of different variables on the adsorption performance for BG dye decontamination. These variables included IQATP dosage (0.02-0.8 g/L), solution pH (3.05-8.15), contact time (2-25 min), and initial BG dye concentration (20-80 mg/L). The parameter …
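The abstract mentions isotherm analysis; as a generic illustration (using hypothetical data and SciPy, not the study's measurements or fitted constants), a Langmuir isotherm can be fitted as follows.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, kl):
    """Langmuir isotherm: qe = qmax*KL*Ce / (1 + KL*Ce)."""
    return qmax * kl * ce / (1.0 + kl * ce)

# Hypothetical equilibrium data (Ce in mg/L, qe in mg/g) just to show the fit;
# the study's own measurements are not reproduced here.
ce = np.array([5.0, 10.0, 20.0, 40.0, 60.0])
qe = np.array([12.0, 20.0, 30.0, 38.0, 42.0])

(qmax, kl), _ = curve_fit(langmuir, ce, qe, p0=(50.0, 0.05))
print(f"qmax = {qmax:.1f} mg/g, KL = {kl:.3f} L/mg")
```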

Publication Date: Wed May 10 2023
Journal Name: Journal of Planner and Development
Relationship of LST, NDVI, and NDBI using Landsat-8 data in Duhok city in 2019-2022

Vegetation cover is one of the most significant elements influencing weather, climate, and the environment. The Normalized Difference Vegetation Index (NDVI) and the Normalized Difference Built-up Index (NDBI) over the years 2019-2022 were estimated from four Landsat 8 TIRS images covering Duhok City. Using the radiative transfer model, the city's land surface temperature (LST) over these four years was calculated. The aim of this study is to compute the land surface temperature (LST) for the years 2019-2022 and to understand the link between LST, NDVI, and NDBI and the capability of mapping them with Landsat-8 TIRS. The findings revealed that the NDBI and the NDVI had the strongest correlation with the …
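The two indices are computed from standard band ratios; a minimal NumPy sketch is shown below, assuming Landsat 8 reflectance arrays for the red (band 4), NIR (band 5), and SWIR1 (band 6) channels.

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); Landsat 8 bands 5 and 4."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-6)

def ndbi(swir1, nir):
    """NDBI = (SWIR1 - NIR) / (SWIR1 + NIR); Landsat 8 bands 6 and 5."""
    swir1, nir = swir1.astype(float), nir.astype(float)
    return (swir1 - nir) / np.maximum(swir1 + nir, 1e-6)
```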
