Signal Processing Techniques for Robust Spectrum Sensing

Cognitive radios have the potential to greatly improve spectral efficiency in wireless networks. They are considered lower-priority, secondary users of spectrum allocated to a primary user, and their fundamental requirement is to avoid interference to potential primary users in their vicinity. Spectrum sensing has been identified as a key enabling functionality to ensure that cognitive radios do not interfere with primary users, by reliably detecting primary-user signals. In addition, reliable sensing creates spectrum opportunities for capacity increases in cognitive networks. One of the key challenges in spectrum sensing is the robust detection of primary signals in highly negative signal-to-noise-ratio (SNR) regimes. In this paper, we present a system design approach to meet this challenge. The design space is diverse, as it involves various types of primary-user signals; we analyze signal processing approaches and identify the regimes where these techniques are applicable. The goal of this paper is to present a practical system design view of the spectrum sensing functionality.
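The baseline technique in this design space is energy detection, which needs no knowledge of the primary signal's structure. The sketch below is a minimal, self-contained illustration; the sample count, noise power, and target false-alarm probability are invented, not taken from the paper:

```python
import math
import random

def energy_detector(samples, noise_power, pfa=0.01):
    """Classic energy detector: declare the primary user present when the
    summed sample energy exceeds a threshold chosen for a target
    false-alarm probability (Gaussian approximation, valid for large N)."""
    n = len(samples)
    energy = sum(s * s for s in samples)

    def q_inv(p):
        # invert the Gaussian tail Q(x) = 0.5*erfc(x/sqrt(2)) by bisection
        lo, hi = 0.0, 10.0
        for _ in range(60):
            mid = (lo + hi) / 2
            if 0.5 * math.erfc(mid / math.sqrt(2)) > p:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    threshold = noise_power * (n + math.sqrt(2 * n) * q_inv(pfa))
    return energy > threshold

random.seed(1)
noise_power = 1.0
n = 4000
noise_only = [random.gauss(0, 1) for _ in range(n)]
# BPSK-like primary signal at 0 dB SNR added to fresh noise samples
with_signal = [random.gauss(0, 1) + (1 if i % 2 else -1) for i in range(n)]
print(energy_detector(noise_only, noise_power),
      energy_detector(with_signal, noise_power))
```

At strongly negative SNR the same detector needs many more samples or a known noise floor, which is exactly the robustness regime the paper analyzes.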

Publication Date
Tue Feb 12 2019
Journal Name
Iraqi Journal Of Laser
Propanol Vapor Sensor Utilizing Air-Gap as Sensing Part of the Mach-Zehnder Interferometer

The purpose of this study is to demonstrate a simple, high-sensitivity vapor sensor for propanol ((CH3)2CHOH). A free-space gap was employed in the two arms of a Mach-Zehnder interferometer, serving as the sensing mechanism when propanol volumes of (0.2, 0.4, 0.6, 0.8, and 1) ml were added, and as the phase reference, with physical spacings of (0.5, 1, 1.5, and 2) mm. The propagation constant of the light transmitted through the interferometer’s gap changes due to the small variation in the refractive index inside the sensing arm, which further shifts the optical phase of the signal. Experimental results indicated that the highest sensitivity to propanol was about 0.0275 nm/ml across the different liquid volumes, while the highest phase shift was 0.182×10³ i
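The phase mechanism described above follows the standard interferometer relation Δφ = 2πLΔn/λ. A small sketch with that relation; the gap length, index change, and wavelength below are illustrative assumptions, not values from the study:

```python
import math

def mzi_phase_shift(gap_length_m, delta_n, wavelength_m):
    """Optical phase shift accumulated across the open sensing gap of a
    Mach-Zehnder interferometer when the vapor changes the refractive
    index of the gap by delta_n: dphi = 2*pi*L*dn/lambda."""
    return 2 * math.pi * gap_length_m * delta_n / wavelength_m

# Hypothetical numbers: a 1 mm gap, a 1550 nm source, and a 1e-4
# refractive-index change caused by the propanol vapor.
dphi = mzi_phase_shift(1e-3, 1e-4, 1550e-9)
print(round(dphi, 3), "rad")
```

The linearity in Δn is what makes the interferometer's output a direct proxy for vapor concentration.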

Publication Date
Thu May 21 2015
Journal Name
Environmental Monitoring And Assessment
Water quality monitoring of Al-Habbaniyah Lake using remote sensing and in situ measurements
Publication Date
Wed Jan 01 2025
Journal Name
Journal Of Cybersecurity And Information Management
A New Automated System Approach to Detect Digital Forensics using Natural Language Processing to Recommend Jobs and Courses

A resume creates the first impression between you and a potential employer; therefore, the importance of a resume can never be underestimated. Selecting the right candidates for a job within a company can be a daunting task for recruiters when they have to review hundreds of resumes. To reduce time and effort, we can use NLTK and Natural Language Processing (NLP) techniques to extract essential data from a resume. NLTK is a free, open-source, community-driven project and the leading platform for building Python programs to work with human language data. To select the best resume according to the company’s requirements, an algorithm such as KNN is used. To be selected from hundreds of resumes, your resume must be one of the best. Theref
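A minimal stand-in for the pipeline sketched above: simple bag-of-words vectors replace the NLTK feature extraction, and a nearest-neighbour ranking by cosine similarity plays the role of the KNN selection step. The job description and resumes are invented examples:

```python
from collections import Counter
import math

def vectorize(text):
    # Tiny bag-of-words term-frequency vector (stand-in for NLTK
    # tokenization and feature extraction)
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_rank(job_description, resumes, k=2):
    """Rank resumes by similarity to the job description and return the
    top-k, mimicking the nearest-neighbour selection step."""
    jd = vectorize(job_description)
    scored = sorted(resumes, key=lambda r: cosine(jd, vectorize(r)),
                    reverse=True)
    return scored[:k]

job = "python machine learning nlp engineer"
resumes = [
    "java backend developer spring",
    "python nlp machine learning researcher",
    "sales manager retail experience",
]
print(knn_rank(job, resumes, k=1))
```

A production system would add stemming, stop-word removal, and named-entity extraction before the ranking step.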

Publication Date
Thu Feb 01 2024
Journal Name
Iop Conference Series: Earth And Environmental Science
Processing and re-interpretation of gravity bouguer map of a selected area in the W-NW of Iraq
...Show More Authors
Geophysical data interpretation is crucial in characterizing the subsurface structure. The Bouguer gravity map analysis of the W-NW region of Iraq serves as the basis for the current geophysical research. The Bouguer gravity data were processed using the Power Spectrum Analysis (PSA) method, from which four depth slices were acquired: 390 m, 1300 m, 3040 m, and 12600 m. The gravity-anomaly depth maps show that shallow-depth anomalies are mainly related to the sedimentary cover layers and structures, while the gravity anomaly of the deeper 12600 m slice corresponds more to the basement rocks and mantle uplift. The 2D modeling technique was used for
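Power Spectrum Analysis estimates source depth from the slope of the radially averaged log power spectrum: under the common Spector–Grant convention, ln P(k) falls off linearly with wavenumber k and the slope equals −2h for an ensemble of sources at depth h. A synthetic 1D check of that relation (not the paper's data; only the 3040 m depth slice value is borrowed as the test depth):

```python
h_true = 3040.0  # metres, one of the depth slices quoted above
ks = [i * 1e-4 for i in range(1, 50)]   # wavenumbers, rad/m
lnP = [-2 * h_true * k for k in ks]     # ideal log power spectrum

# Least-squares slope of lnP vs k, then depth = -slope / 2
n = len(ks)
kbar = sum(ks) / n
pbar = sum(lnP) / n
slope = sum((k - kbar) * (p - pbar) for k, p in zip(ks, lnP)) / \
        sum((k - kbar) ** 2 for k in ks)
h_est = -slope / 2
print(round(h_est, 1))  # recovers 3040.0
```

On real data, each straight-line segment of the spectrum is fitted separately, which is how the four depth slices above are separated.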
Publication Date
Sun Mar 15 2020
Journal Name
Al-academy
Techniques of Acting Performance in Fantasy Theatrical Show
Publication Date
Mon Aug 01 2016
Journal Name
Journal Of Engineering
Application Artificial Forecasting Techniques in Cost Management (review)

Over the last few decades, many developments in computer technology, software programming, and application production have been adopted by various engineering disciplines. These developments mostly focus on artificial intelligence techniques. Therefore, a number of definitions are provided, which consider the concept of artificial intelligence from different viewpoints. This paper shows current applications of artificial intelligence (AI) that facilitate cost management in civil engineering projects. An evaluation of artificial intelligence in its specific sub-branches is provided. These branches or techniques contributed to the creation of a sizable group of models s
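As a toy illustration of the forecasting idea (the review itself covers neural-network and fuzzy models), ordinary least squares on a single project feature can stand in; every number below is invented:

```python
# Hypothetical historical projects: floor area (m^2) vs final cost units
areas = [100.0, 150.0, 200.0, 250.0, 300.0]
costs = [120.0, 170.0, 230.0, 270.0, 330.0]

# Fit cost = a + b * area by least squares
n = len(areas)
xbar = sum(areas) / n
ybar = sum(costs) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(areas, costs)) / \
    sum((x - xbar) ** 2 for x in areas)
a = ybar - b * xbar

def predict(area):
    """Forecast the cost of a new project from its floor area."""
    return a + b * area

print(round(predict(220.0), 1))  # 244.8 under these toy numbers
```

The AI models surveyed in the paper replace this single linear feature with many nonlinear ones, but the forecasting workflow (fit on history, predict on new projects) is the same.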

Publication Date
Wed May 31 2023
Journal Name
Iraqi Geological Journal
A Survey of Infill Well Location Optimization Techniques

The maximization of the net present value of the investment in oil-field improvements is greatly aided by the optimization of well location, which plays a significant role in the production of oil. However, using optimization methods in well-placement development is exceedingly difficult, since the well-placement optimization scenario involves a large number of decision variables, objective functions, and restrictions. In addition, a wide variety of computational approaches, both traditional and unconventional, have been applied in order to maximize the efficiency of well-installation operations. This research demonstrates how the optimization approaches used in well placement have progressed since the last time they were examined. Fol
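In its simplest form, the problem reduces to searching a discretized reservoir map for the most productive cell. The sketch below uses an invented smooth "productivity" surface and exhaustive search; real workflows substitute simulator-derived NPV for the surface and population-based optimizers (genetic algorithms, particle swarm) for the search:

```python
import math

def productivity(x, y):
    # Hypothetical smooth reservoir-property map peaking near cell (6, 4)
    return math.exp(-((x - 6) ** 2 + (y - 4) ** 2) / 8.0)

# Exhaustive search over a 10x10 grid of candidate well locations
best = max(((x, y) for x in range(10) for y in range(10)),
           key=lambda p: productivity(*p))
print(best)  # the cell at the peak: (6, 4)
```

Exhaustive search is only feasible on tiny grids; the combinatorial growth with multiple wells and constraints is exactly why the surveyed metaheuristics exist.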

Publication Date
Tue Feb 01 2022
Journal Name
Int. J. Nonlinear Anal. Appl.
Computer-based plagiarism detection techniques: A comparative study

Plagiarism is becoming more of a problem in academics. It is made worse by the ease with which a wide range of resources can be found on the internet, as well as the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has “taken” and presented the work of others as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is hard for anyone to compare their work against all existing material. Plagiarism is a big problem in higher education, and it can happen on any topic. Plagiarism detection has been studied in many scientific articles, and methods for recognition have been created utilizing plagiarism analysis, authorship identification, and
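One of the simplest techniques in the plagiarism-analysis family compares word n-gram overlap between documents: a high Jaccard similarity over 3-grams flags near-copied passages. The sample sentences below are invented:

```python
def ngrams(text, n=3):
    """Set of word n-grams of a document."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b, n=3):
    """Jaccard similarity of the two documents' n-gram sets."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    return len(ga & gb) / len(ga | gb) if ga | gb else 0.0

original = "plagiarism is a growing problem in higher education worldwide"
copied   = "plagiarism is a growing problem in higher education today"
fresh    = "optical sensors measure refractive index changes in vapors"
print(round(jaccard(original, copied), 2))  # 0.75: near-copy flagged
print(jaccard(original, fresh))             # 0.0: unrelated text
```

Real systems add stemming, fingerprinting, and the authorship-identification features mentioned above on top of this string-matching core.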

Publication Date
Wed Jul 06 2022
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
Image Compression using Polynomial Coding Techniques: A review

Publication Date
Sun Apr 23 2017
Journal Name
International Conference Of Reliable Information And Communication Technology
Classification of Arabic Writer Based on Clustering Techniques

Arabic text categorization for pattern recognition is challenging. We propose, for the first time, a novel holistic method based on clustering for classifying Arabic writers. The categorization is accomplished stage-wise. First, the document images are sectioned into lines, words, and characters. Second, structural and statistical features are obtained from the sectioned portions. Third, the F-measure is used to evaluate the performance of the extracted features, and of their combinations, under different linkage methods for each distance measure and different numbers of groups. Finally, experiments are conducted on the standard KHATT dataset of Arabic handwritten text, comprising varying samples from 1000 writers. The results in the generatio
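The clustering stage can be sketched with agglomerative single-linkage grouping; the 2-D feature vectors below are invented stand-ins for the structural and statistical features extracted from the KHATT samples:

```python
import math

def single_linkage(points, k):
    """Agglomerative clustering: repeatedly merge the two clusters with
    the smallest minimum pairwise distance until k clusters remain."""
    clusters = [[p] for p in points]
    while len(clusters) > k:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: min(math.dist(a, b)
                                      for a in clusters[ij[0]]
                                      for b in clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

# Hypothetical per-writer feature vectors: two writers with similar
# features, two with clearly different ones
writers = [(0.1, 0.2), (0.15, 0.22), (0.9, 0.8), (0.95, 0.85)]
groups = single_linkage(writers, k=2)
print(sorted(len(g) for g in groups))  # two groups of two writers
```

In the paper, each resulting group is then scored with the F-measure against the known writer labels to compare linkage methods and distance measures.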
