Image retrieval is an active research area in image processing, pattern recognition, and computer vision. The proposed method extracts the feature vector in two ways: the first applies a transform algorithm to the whole image, and the second divides the image into four equal blocks and applies the transform algorithm to each part. In each technique, three transform algorithms are applied (DCT, Walsh Transform, and Kekre's Wavelet Transform); similarity is then computed and the images are ranked using the correlation between the feature vector of the query image and those of the images in the database. Retrieval is based on the highest ranking. Experimental results show that applying the DCT yields better results (higher precision and recall) than the other transform algorithms, and that performance improves when the image is divided into four equal blocks and the transform is applied to each part.
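The block-based pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names are our own, the DCT is a plain orthonormal DCT-II written out in pure Python, and similarity is the Pearson correlation between concatenated block coefficients.

```python
import math

def dct_1d(x):
    # Orthonormal DCT-II of a 1-D sequence.
    N = len(x)
    out = []
    for k in range(N):
        s = sum(x[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N)) for n in range(N))
        scale = math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
        out.append(scale * s)
    return out

def dct_2d(img):
    # Separable 2-D DCT: transform rows, then columns.
    rows = [dct_1d(r) for r in img]
    cols = [dct_1d(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

def quadrants(img):
    # Split the image into four equal blocks (assumes even dimensions).
    h, w = len(img) // 2, len(img[0]) // 2
    return [[r[:w] for r in img[:h]], [r[w:] for r in img[:h]],
            [r[:w] for r in img[h:]], [r[w:] for r in img[h:]]]

def feature_vector(img):
    # Block-based variant: concatenate the DCT coefficients of each quadrant.
    # (The whole-image variant would simply flatten dct_2d(img).)
    fv = []
    for block in quadrants(img):
        for row in dct_2d(block):
            fv.extend(row)
    return fv

def correlation(a, b):
    # Pearson correlation between two feature vectors (assumes nonzero variance).
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)
```

Ranking a query then amounts to computing `correlation(feature_vector(query), fv_db)` against every stored feature vector and returning the highest-scoring images.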
A novel median filter based on the crow optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB color and gray images. The fundamental idea of the approach is that the crow optimization algorithm first detects noisy pixels and then replaces them with an optimum median value according to a criterion that maximizes a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error are used to test the performance of the suggested filters (original and improved median filter) in removing noise from images. The simulation is carried out in MATLAB R2019b and the resul
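The detect-then-replace idea and the PSNR measure can be illustrated with a short sketch. Note the simplification: the paper selects noisy pixels with a crow-search optimizer, while this stand-in flags a pixel as noise simply when its value sits at an intensity extreme, which is the classic salt-and-pepper heuristic; the function names are illustrative.

```python
import math

def median_filter_sp(img, lo=0, hi=255):
    # Replace only suspected salt-and-pepper pixels (extreme values) with the
    # median of their 3x3 neighborhood; clean pixels are left untouched.
    # (Noise detection is a simplified stand-in for the crow optimizer.)
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            if img[i][j] in (lo, hi):
                nb = [img[y][x]
                      for y in range(max(0, i - 1), min(h, i + 2))
                      for x in range(max(0, j - 1), min(w, j + 2))]
                nb.sort()
                out[i][j] = nb[len(nb) // 2]
    return out

def psnr(ref, test, peak=255.0):
    # Peak signal-to-noise ratio between a reference and a test image.
    n = sum(len(r) for r in ref)
    mse = sum((a - b) ** 2 for ra, rb in zip(ref, test) for a, b in zip(ra, rb)) / n
    return float('inf') if mse == 0 else 10 * math.log10(peak ** 2 / mse)
```

A higher PSNR of the filtered image against the clean reference indicates better noise removal, which is how the original and improved filters are compared.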
Finding similarities between texts is important in many areas, such as information retrieval, automated article scoring, and short-answer categorization. Evaluating short answers is not an easy task due to differences in natural language. Methods for calculating the similarity between texts depend on semantic or grammatical aspects. This paper discusses a method for evaluating short answers using semantic networks to represent the model (correct) answer and the students' answers. A semantic network of nodes and relationships represents the text (the answers). Moreover, grammatical aspects are captured by measuring the similarity of parts of speech between the answers. In addition, finding hierarchical relationships between nodes in netwo
Digital forensics is the part of forensic science that covers crime related to computers and other digital devices. Academic studies have been interested in digital forensics for a while. Researchers aim to establish a discipline based on scientific structures that defines a model reflecting their observations. This paper suggests a model to improve the whole investigation process and obtain accurate and complete evidence, and adopts securing the digital evidence with cryptographic algorithms so that it can be presented as reliable evidence in a court of law. The paper also presents the main and basic concepts of the frameworks and models used in digital forensics investigation.
The corrosion of metals is of great economic importance. Estimates show that a quarter of the iron and steel produced is destroyed in this way. Rubber lining has been used for severe corrosion protection because NR and certain synthetic rubbers have a basic resistance to very corrosive chemicals, particularly acids. The present work includes producing ebonite from both natural and synthetic rubbers; therefore, the following materials were chosen to produce ebonite rubber: a) natural rubber (NR), b) styrene butadiene rubber (SBR), c) nitrile rubber (NBR), and d) neoprene rubber (CR) [WRT]. The best ebonite vulcanizates are obtained in the presence of 30 pphr sulfur, with carbon black as reinforcing filler. The relation between
Merging images is one of the most important technologies in remote sensing applications and geographic information systems. In this study, images captured with a camera were fused after resizing them with interpolation methods (nearest-neighbor, bilinear, and bicubic). Statistical techniques were used as an efficient merging approach in the image-integration process, employing different models, namely Local Mean Matching (LMM) and Regression Variable Substitution (RVS), together with a spatial-frequency technique, the high pass filter additive method (HPFA). Statistical measures were then used to check the quality of the merged images, carried out by calculating the correlation a
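Of the fusion models mentioned, the high pass filter additive (HPFA) idea is the simplest to sketch: the high-frequency detail of the sharper band (the band minus a local mean) is added to each band being enhanced. The sketch below is a generic illustration of that principle, not the paper's implementation; the 3x3 mean window and the function name are our own assumptions.

```python
def hpfa_fuse(ms, pan):
    # High Pass Filter Additive fusion sketch: add the high-frequency detail
    # of the panchromatic band (pan minus its 3x3 local mean) to each
    # multispectral band value. Both inputs are 2-D lists of equal size.
    h, w = len(pan), len(pan[0])

    def local_mean(i, j):
        nb = [pan[y][x]
              for y in range(max(0, i - 1), min(h, i + 2))
              for x in range(max(0, j - 1), min(w, j + 2))]
        return sum(nb) / len(nb)

    return [[ms[i][j] + (pan[i][j] - local_mean(i, j)) for j in range(w)]
            for i in range(h)]
```

When the panchromatic band is flat, the extracted detail is zero and the multispectral band passes through unchanged, which makes the additive nature of the method easy to verify.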
The transition of customers from one telecom operator to another has a direct impact on a company's growth and revenue. Traditional classification algorithms fail to predict churn effectively. This research introduces a deep learning model for predicting customers who plan to move to another operator. The model works on a high-dimensional, large-scale data set. Its performance was measured against other classification algorithms, such as Gaussian Naive Bayes, Random Forest, and Decision Tree, in predicting churn. The evaluation was based on accuracy, precision, recall, F-measure, Area Under the Curve (AUC), and the Receiver Operating Characteristic (ROC) curve. The proposed deep learning model performs better than othe
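The threshold-based metrics listed in the evaluation can be computed directly from the confusion counts, as in this minimal sketch (the function name is illustrative; churn is taken as the positive class 1):

```python
def churn_metrics(y_true, y_pred):
    # Binary-classification metrics from confusion counts:
    # accuracy, precision, recall, and F-measure.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    acc = (tp + tn) / len(y_true)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return {"accuracy": acc, "precision": prec, "recall": rec, "f_measure": f1}
```

AUC and the ROC curve differ in that they sweep the decision threshold over the model's predicted probabilities rather than scoring a single hard labeling.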
With the increasing use of social media today, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and messy text, which makes finding topics in them difficult. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short text content like tweets. Fortunately, Twitter has many features that represent the interaction between users, and tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned
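A common way to exploit hashtags for topic modeling is hashtag pooling: tweets that share a hashtag are concatenated into one pseudo-document per tag, so that LDA sees texts long enough to estimate topics from. The sketch below illustrates only the pooling step under that assumption; the function name and the `_untagged` bucket are our own.

```python
import re
from collections import defaultdict

def pool_by_hashtag(tweets):
    # Aggregate tweets sharing a hashtag into one pseudo-document per tag,
    # so a standard topic model (e.g. LDA) can be run on longer texts.
    pools = defaultdict(list)
    for tweet in tweets:
        tags = re.findall(r"#(\w+)", tweet.lower())
        for tag in tags or ["_untagged"]:
            pools[tag].append(tweet)
    return {tag: " ".join(docs) for tag, docs in pools.items()}
```

Each pooled pseudo-document is then fed to the topic model in place of the individual tweets.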
In this study, two correlations are developed to calculate the absolute permeability of rocks from core samples tested with a gas permeameter apparatus. The first correlation applies when Kg ≤ 100 and the second when Kg > 100. Sixty core samples with different permeabilities were used to give the wide range of values necessary to build the correlations.
The developed correlations are easy to apply and offer a quick method that avoids repeating the test at different pressure values; only one pressure test is required to reach absolu
Osteoarthritis (OA) is a series of aggressive, destructive inflammatory processes. Synovitis is common at both an early and a late phase. The disease may begin at a single site but, at some point in time, produces a common outcome of dysfunction, disability, socioeconomic disruption, and sometimes socioeconomic failure. The articular cartilage, subchondral bone, and synovial membrane are the sites of the major abnormalities in this disease process. Rheumatoid factor (RF) is one of the routine laboratory tests performed for all patients with joint complaints. Chloroquine phosphate (CQP) is an agent belonging to the disease-modifying osteoarthritic drugs (DMOADs). Chloroquine and its derivatives have been used for t
Background: Cystatin C has recently been considered a good predictor of cardiovascular morbidity and mortality in patients with coronary artery disease (CAD). Objectives: To correlate cystatin C with ischemic heart disease. Methods: One hundred and forty (140) patients with ischemic heart disease were admitted to this study at Baghdad Teaching Hospital in the period from June 2011 to January 2012. These patients were categorized into three groups. Group (A): patients with ischemic heart failure. Group (B): patients with myocardial infarction. Group (C): patients with unstable angina. All groups were compared to fifty (50) healthy controls. Fasting serum cystatin C was measured in all patients and controls, in addition to all other routine inves