Articles
New Techniques to Enhance Data Deduplication using Content based-TTTD Chunking Algorithm

Publication Date: 1 December 2002
Journal Name: Iraqi Journal of Physics
An edge detection algorithm matching visual contour perception

For many applications, it is important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by Maximum a Posteriori (MAP) and Maximum Entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered among the best edge detection algorithms in terms of matching human visual contour perception.
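The smoothing-versus-sharpening trade-off the abstract describes can be illustrated with a generic gradient-based detector. This is a hedged sketch, not the paper's MAP/ME-based method; the box blur, the threshold value, and the synthetic image are all illustrative assumptions:

```python
def smooth(img):
    """3x3 box blur on interior pixels to suppress noise before differentiation."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    return out

def edges(img, threshold):
    """Mark pixels whose central-difference gradient magnitude exceeds threshold."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            if (gx * gx + gy * gy) ** 0.5 > threshold:
                out[y][x] = 1
    return out

# Synthetic 8x8 image: dark left half, bright right half, one vertical contour.
img = [[0.0] * 4 + [1.0] * 4 for _ in range(8)]
edge_map = edges(smooth(img), threshold=0.25)
```

Smoothing first spreads the step over neighboring columns, so the threshold must be lowered to keep the contour; raising it back suppresses both the noise and the edge, which is exactly the trade-off in question.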

Publication Date: 1 December 2011
Journal Name: 2011 Developments in E-Systems Engineering
Enhanced Computation Time for Fast Block Matching Algorithm

Publication Date: 14 February 2014
Journal Name: International Journal of Computer Applications
Parallelizing RSA Algorithm on Multicore CPU and GPU

Publication Date: 3 September 2021
Journal Name: Entropy
Reliable Recurrence Algorithm for High-Order Krawtchouk Polynomials

Krawtchouk polynomials (KPs) and their moments are promising techniques for applications in information theory, coding theory, and signal processing, owing to the special capabilities of KPs in feature extraction and classification. The main challenge in existing KP recurrence algorithms is numerical error, which occurs during computation of the coefficients at large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation for computing the coefficients of KPs at high orders. In particular, the paper discusses the development of a new algorithm and presents a new mathematical model for computing the …
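For context, the classical three-term recurrence that such algorithms start from can be sketched as follows. This is the textbook recurrence for K_n(x; p, N), not the paper's stabilized relation, and at high orders it accumulates exactly the numerical error the abstract describes:

```python
def krawtchouk(n, x, p, N):
    """Evaluate the Krawtchouk polynomial K_n(x; p, N) with the classical
    three-term recurrence, starting from K_0 = 1 and K_1 = 1 - x/(pN).
    For large n and p far from 0.5 this recurrence loses accuracy, which
    is the problem the paper's new algorithm addresses."""
    k_prev, k_curr = 1.0, 1.0 - x / (p * N)
    if n == 0:
        return k_prev
    for m in range(1, n):
        k_next = ((p * (N - m) + m * (1 - p) - x) * k_curr
                  - m * (1 - p) * k_prev) / (p * (N - m))
        k_prev, k_curr = k_curr, k_next
    return k_curr
```

Two quick sanity checks on the recurrence: K_n(0) = 1 for every n, and K_n(N; 0.5, N) = (-1)^n.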

Publication Date: 1 August 2022
Journal Name: Bulletin of Electrical Engineering and Informatics
Solid waste recycling and management cost optimization algorithm

Solid waste is a major issue in today's world and a contributing factor to pollution and the spread of vector-borne diseases. Because of its complicated nonlinear processes, this problem is difficult to model and optimize using traditional methods. In this study, a mathematical model was developed to optimize the cost of solid waste recycling and management. In the optimization phase, the salp swarm algorithm (SSA) is utilized to determine the levels of discarded and reclaimed solid waste. SSA is a new optimization technique that finds the ideal solution of a mathematical relationship based on leaders and followers. It takes many random solutions, as well as their outward or inward fluctuations, …
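The leader-and-follower update that SSA is built on can be sketched minimally as below. The paper's waste-cost model is not reproduced; the sphere objective, swarm size, bounds, and iteration budget are all stand-in assumptions:

```python
import math
import random

def sphere(x):
    """Stand-in objective to minimize (not the paper's cost model)."""
    return sum(v * v for v in x)

def ssa(obj, dim=2, n_salps=30, iters=300, lb=-10.0, ub=10.0, seed=1):
    """Minimal salp swarm algorithm: the leader explores around the best
    solution found so far (the 'food source'), and each follower moves to
    the average of itself and its predecessor in the chain."""
    rng = random.Random(seed)
    swarm = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_salps)]
    food = min(swarm, key=obj)[:]
    for l in range(1, iters + 1):
        # c1 decays from ~2 toward 0: early exploration, late exploitation.
        c1 = 2 * math.exp(-((4 * l / iters) ** 2))
        for j in range(dim):  # leader update around the food source
            step = c1 * ((ub - lb) * rng.random() + lb)
            leader = food[j] + step if rng.random() < 0.5 else food[j] - step
            swarm[0][j] = min(ub, max(lb, leader))
        for i in range(1, n_salps):  # followers chain behind the leader
            swarm[i] = [(swarm[i][j] + swarm[i - 1][j]) / 2 for j in range(dim)]
        best = min(swarm, key=obj)
        if obj(best) < obj(food):
            food = best[:]
    return food

best = ssa(sphere)
```

The decaying coefficient c1 is what makes the swarm fluctuate outward early (global search) and inward late (local refinement), as the abstract's "outward or inward fluctuations" phrasing suggests.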

Publication Date: 6 June 2024
Journal Name: Journal of Applied Engineering and Technological Science (JAETS)
Lightweight Block and Stream Cipher Algorithm: A Review

Most Internet of Things (IoT), cell phone, and Radio Frequency Identification (RFID) applications need high speed in the execution and processing of data, achieved by reducing system energy consumption, latency, and processing time and by improving throughput. This can come at the expense of the security of such devices, which may then be attacked by malicious programs. Lightweight cryptographic algorithms are among the most suitable methods for securing these IoT applications. Cryptography obfuscates and removes the ability to capture key information patterns, and ensures that all data transfers occur safely, accurately, verifiably, legally, and undeniably. Fortunately, various lightweight encryption algorithms can be used to increase defense against various at…
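As a toy illustration of the stream-cipher pattern such reviews survey, a 16-bit LFSR keystream XORed with the data is sketched below. This is emphatically not a secure or standardized lightweight cipher (real candidates include PRESENT, Grain, and Ascon); the seed, message, and key size are illustrative only:

```python
def lfsr_keystream(seed, nbytes):
    """Keystream from a 16-bit Fibonacci LFSR with feedback polynomial
    x^16 + x^14 + x^13 + x^11 + 1 (taps 16, 14, 13, 11).
    TOY ONLY: LFSR output is linear and trivially breakable."""
    state = seed & 0xFFFF
    out = bytearray()
    for _ in range(nbytes):
        byte = 0
        for _ in range(8):
            bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
            state = (state >> 1) | (bit << 15)
            byte = (byte << 1) | (state & 1)
        out.append(byte)
    return bytes(out)

def xor_cipher(data, seed):
    """Stream-cipher pattern: XOR with the keystream; the same call decrypts."""
    return bytes(d ^ k for d, k in zip(data, lfsr_keystream(seed, len(data))))

msg = b"sensor reading: 42"
ct = xor_cipher(msg, seed=0xACE1)
pt = xor_cipher(ct, seed=0xACE1)
```

The appeal for constrained devices is visible even in the toy: the whole cipher state is 16 bits and the per-byte work is a few shifts and XORs, which is the resource profile lightweight designs aim for with far stronger keystream generators.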

Publication Date: 1 April 2022
Journal Name: Symmetry
Fast Overlapping Block Processing Algorithm for Feature Extraction

In many video and image processing applications, the frames are partitioned into blocks, which are extracted and processed sequentially. In this paper, we propose a fast algorithm for calculation of features of overlapping image blocks. We assume the features are projections of the block on separable 2D basis functions (usually orthogonal polynomials), where we benefit from the symmetry with respect to spatial variables. The main idea is based on a construction of auxiliary matrices that virtually extend the original image and make it possible to avoid time-consuming computation in loops. These matrices can be pre-calculated, stored and used repeatedly, since they are independent of the image itself. We validated experimentally th…
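The per-block separable projection that the paper accelerates can be sketched as follows. The auxiliary-matrix speedup itself is not reproduced here, and the 2x2 orthonormal basis is a stand-in for the orthogonal polynomials the abstract mentions:

```python
def matmul(A, B):
    """Plain list-of-lists matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def block_features(block, P):
    """Project a block on the separable 2-D basis formed by the columns of P:
    F = P^T . block . P, so F[m][n] is the projection of the block on the
    outer product of basis vectors m and n."""
    return matmul(matmul(transpose(P), block), P)

# Orthonormal 2x2 basis (constant vector and ramp), a stand-in for the
# orthogonal polynomials mentioned in the abstract.
r = 2 ** -0.5
P = [[r, r], [r, -r]]
block = [[1.0, 2.0], [3.0, 4.0]]
F = block_features(block, P)
```

Separability is what makes this cheap: instead of projecting on each 2-D basis function individually, two small matrix products compute every feature of the block at once, and the paper's auxiliary matrices go further by sharing work across overlapping blocks.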

Publication Date: 1 December 2020
Journal Name: Gulf Economist
The Bayesian Estimation in Competing Risks Analysis for Discrete Survival Data under Dynamic Methodology with Application to Dialysis Patients in Basra/ Iraq

Survival analysis describes the time period until the occurrence of an event of interest, such as death or another event important in determining what will happen to the studied phenomenon. There may be more than one endpoint for the event, in which case the setting is called competing risks. The purpose of this research is to apply the dynamic approach to the analysis of discrete survival time in order to estimate the effect of covariates over time, as well as to model the nonlinear relationship between the covariates and the discrete hazard function, through the use of the multinomial logistic model and the multivariate Cox model. For the purpose of conducting the estimation process for both the discrete …
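In the multinomial-logistic formulation of discrete-time competing risks, each time interval contributes one multinomial outcome: survive the interval, or fail from one of the competing causes. A minimal sketch of that hazard computation is below; the single covariate and the coefficient values are hypothetical, not the paper's fitted model:

```python
import math

def discrete_hazards(x, betas):
    """Cause-specific discrete hazards via a multinomial logit: category 0 is
    'survive the interval' (reference category, score 0); categories 1..R are
    the competing events.  betas[r] = [intercept, coefficient, ...] are
    hypothetical coefficient vectors."""
    z = [1.0] + list(x)                   # prepend 1 for the intercept
    scores = [0.0] + [sum(b * v for b, v in zip(beta, z)) for beta in betas]
    m = max(scores)                       # shift for a stabilized softmax
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# One covariate and two competing events with made-up coefficients:
probs = discrete_hazards([0.5], betas=[[-2.0, 0.8], [-3.0, 1.2]])
```

The returned vector sums to one by construction, which is what lets the cause-specific hazards of all competing endpoints be estimated jointly in one model.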

Publication Date: 1 August 2024
Journal Name: The American Journal of Management and Economics Innovations
The Role of Economic Data Analysis in Managing Medium and Small Companies to Make Strategic Decisions and Improve Performance: An Analytical Study

Economic analysis plays a pivotal role in managerial decision-making processes. It is predicated on a deep understanding of the economic forces and market factors influencing corporate strategies and decisions. This paper delves into the role of economic data analysis in managing small and medium-sized enterprises (SMEs) to make strategic decisions and enhance performance. The study underscores the significance of this approach and its impact on corporate outcomes. The research analyzes annual reports from three companies: Al-Mahfaza for Mobile and Internet Financial Payment and Settlement Services Company Limited, Al-Arab for Electronic Payment Company, and Iraq Electronic Gateway for Financial Services Company. The paper concl…

Publication Date: 1 February 2017
Journal Name: Journal of Economics and Administrative Sciences
A comparison between the logistic regression model and Linear Discriminant analysis using Principal Component unemployment data for the province of Baghdad

The objective of the study is to determine which has the better predictive ability: the logistic regression model or the linear discriminant function, using the original data first and then principal components to reduce the dimensionality of the variables. The data come from the socio-economic family survey of Baghdad province in 2012 and comprise a sample of 615 observations on 13 variables, 12 of which are explanatory; the dependent variable is the number of workers and unemployed.

A comparison of the two methods showed that the logistic regression model outperforms the linear discriminant function …