Anomaly Detection in Text Data Represented as a Graph Using the DBSCAN Algorithm

Anomaly detection remains a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by first converting the text data into a concept frame graph (CFG). As is well known, DBSCAN groups data points that belong to the same cluster, while points that fall outside every cluster are treated as noise or anomalies. In this way, DBSCAN can detect abnormal points that lie farther than a set threshold from any cluster (extreme outliers). However, not all anomalies are of this kind: some data points do not lie far from a specific group, yet occur rarely and are therefore considered abnormal relative to the known groups. The analysis showed that the improved DBSCAN algorithm can detect this type of anomaly as well. Thus, our approach is effective in finding abnormalities.
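The abstract gives no implementation details, but the DBSCAN behaviour it relies on (points that belong to no dense cluster are labelled noise) can be illustrated with a short sketch. The example below is hypothetical: it assumes the graph representation has already been reduced to a pairwise distance matrix, and the data, eps, and min_samples values are invented for illustration.

```python
# Minimal sketch (not the authors' implementation): flagging anomalies with
# DBSCAN, assuming the text has already been converted into a graph-based
# representation from which a pairwise distance matrix can be derived.
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical "distances" between text units; here a small synthetic example
# with two dense groups and one isolated point.
rng = np.random.default_rng(0)
points = np.vstack([rng.normal(0, 0.3, (20, 2)),   # dense cluster
                    rng.normal(5, 0.3, (20, 2)),   # second cluster
                    [[2.5, 2.5]]])                  # isolated point
dist = np.linalg.norm(points[:, None] - points[None, :], axis=-1)

# metric="precomputed" lets DBSCAN work directly on a distance matrix.
labels = DBSCAN(eps=1.0, min_samples=4, metric="precomputed").fit_predict(dist)

# DBSCAN marks points that belong to no cluster with the label -1;
# those are the candidate anomalies.
anomalies = np.where(labels == -1)[0]
print("anomalous indices:", anomalies)
```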

Publication Date
Mon Mar 27 2017
Journal Name
Iraqi Journal of Pharmaceutical Sciences (P-ISSN 1683-3597, E-ISSN 2521-3512)
Preparation and Evaluation of Meloxicam Microsponges as Transdermal Delivery System

The aim of the present study was to develop a gel formulation of microsponges of the poorly soluble drug meloxicam (MLX) in order to enhance its release and dissolution, which are the limiting factors for preparation in topical forms. Skin delivery is also an alternative route of administration for MLX that can minimize gastrointestinal (GI) side effects and improve patient compliance. The microsponges of MLX were prepared by the quasi-emulsion solvent diffusion method. The effects of drug:polymer ratio, stirring time, and Eudragit polymer type on the physical characteristics of the microsponges were investigated, and the microsponges were characterized for production yield, loading efficiency, particle size, surface morphology, and in vitro drug release. The selec…
Publication Date
Mon Aug 01 2016
Journal Name
Journal of Engineering
Prediction of Monthly Fluoride Content in Tigris River using SARIMA Model in R Software

The need to create an optimal water quality management process has motivated researchers to pursue the development of prediction models. One widely used forecasting model is the seasonal autoregressive integrated moving average (SARIMA) model. In the present study, a SARIMA model was developed in R software to fit a time series of monthly fluoride content collected from six stations on the Tigris River for the period from 2004 to 2014. The adequate SARIMA model with the least Akaike's information criterion (AIC) and mean squared error (MSE) was found to be SARIMA (2,0,0)(0,1,1). The model parameters were identified and diagnosed to derive the forecasting equations at each selected location. The correlation coefficien…
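The study fits its model in R; purely as an illustration, the same SARIMA(2,0,0)(0,1,1) structure can be written down with statsmodels in Python. The monthly series below is synthetic, and the seasonal period of 12 months is assumed from the monthly sampling; none of the real station data appears here.

```python
# Illustrative sketch only: the study fits its SARIMA model in R, but the same
# SARIMA(2,0,0)(0,1,1) structure, with an assumed seasonal period of 12 months,
# can be expressed with statsmodels. The fluoride series below is synthetic.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly series, 2004-2014, with a mild seasonal pattern.
idx = pd.date_range("2004-01-01", "2014-12-01", freq="MS")
rng = np.random.default_rng(1)
fluoride = 0.3 + 0.05 * np.sin(2 * np.pi * idx.month / 12) + rng.normal(0, 0.02, len(idx))
series = pd.Series(fluoride, index=idx)

# SARIMA(p,d,q)(P,D,Q)s with the orders reported in the abstract and s = 12.
model = SARIMAX(series, order=(2, 0, 0), seasonal_order=(0, 1, 1, 12))
result = model.fit(disp=False)

print(result.aic)                 # model selection in the study used the smallest AIC
print(result.forecast(steps=12))  # one-year-ahead monthly forecast
```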
Publication Date
Sun Dec 27 2020
Journal Name
Journal of Accounting and Financial Studies (JAFS)
Evaluating the Profitability of Public Commercial Banks Using Liquidity Indicators: A Comparative Study of the Rafidain and Rasheed Banks

The issues of liquidity, profitability, the employment of funds, and capital adequacy are among the most important topics that have received close attention from authors and researchers in their attempts to determine the real relationship among them and how a balance can be achieved, which is the main goal of every depository institution.

To bring the study variables together, the research formulates its problem around a bank's ability to increase profits without eroding its liquidity, since such erosion would reflect negatively on the bank's reputation and the customers' trust. To that end, the researcher has proposed a set of aims, the most important of which is the estimation of the bank's profitability; liquid…
Publication Date
Fri Feb 01 2019
Journal Name
Journal of Economics and Administrative Sciences
A Comparison of Parameter Estimation Methods for the Negative Binomial Regression Model under the Multicollinearity Problem Using Simulation

This study discusses a biased estimator of the negative binomial regression model known as the Liu estimator. This estimator is used to reduce variance and overcome the problem of multicollinearity between explanatory variables. Other estimators, such as the ridge regression and maximum likelihood estimators, were also considered. The research aims at theoretical comparisons between the new estimator (the Liu estimator) and the estimators…
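The abstract does not state the estimator's algebraic form. As a reference point only, a commonly used Liu-type estimator for generalized linear models such as the negative binomial model takes the following form (a sketch, assuming the usual IRLS weight matrix; the paper's exact definition may differ):

```latex
% Hedged sketch: a commonly used Liu-type estimator for GLMs such as the
% negative binomial model, not necessarily the exact form used in the paper.
% \hat{\beta}_{ML} is the maximum likelihood estimate, \hat{W} the estimated
% weight matrix from the IRLS fit, and 0 < d < 1 the Liu shrinkage parameter.
\[
  \hat{\beta}_{\mathrm{Liu}}(d)
    = \left( X^{\top}\hat{W}X + I_{p} \right)^{-1}
      \left( X^{\top}\hat{W}X + d\,I_{p} \right) \hat{\beta}_{\mathrm{ML}},
  \qquad 0 < d < 1 .
\]
```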
Publication Date
Wed Oct 01 2025
Journal Name
Chinese Journal of Analytical Chemistry
Synthesis, characterization, antioxidant and bioactivity assessment, and thermodynamic analysis of metal ion complexes using a novel azo dye

Publication Date
Wed Jan 01 2025
Journal Name
Journal of Cybersecurity and Information Management
A New Automated System Approach to Detect Digital Forensics using Natural Language Processing to Recommend Jobs and Courses

A resume is the first impression between you and a potential employer. Therefore, the importance of a resume can never be underestimated. Selecting the right candidates for a job within a company can be a daunting task for recruiters when they have to review hundreds of resumes. To reduce time and effort, we can use NLTK and Natural Language Processing (NLP) techniques to extract essential data from a resume. NLTK is a free, open-source, community-driven project and the leading platform for building Python programs to work with human language data. To select the best resume according to the company's requirements, an algorithm such as KNN is used. To be selected from hundreds of resumes, your resume must be one of the best. Theref…
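As a rough illustration of the kind of pipeline the abstract describes, the sketch below tokenizes resumes with NLTK, vectorizes them with TF-IDF, and ranks them against a job description with a k-nearest-neighbors search. It is not the authors' implementation; the resumes, job description, and parameters are invented.

```python
# Hypothetical, minimal sketch of the pipeline described in the abstract:
# NLTK tokenization, TF-IDF vectorization, and a KNN-style ranking of resumes
# against a job description. Data and parameters are invented for illustration.
import nltk
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestNeighbors

# Tokenizer models for nltk.word_tokenize (newer NLTK versions use punkt_tab).
for pkg in ("punkt", "punkt_tab"):
    nltk.download(pkg, quiet=True)

resumes = [
    "Python developer with NLP and machine learning experience",
    "Accountant skilled in auditing and financial reporting",
    "Data scientist experienced in scikit-learn, NLTK and deep learning",
]
job_description = "Looking for a machine learning engineer with NLP skills"

# NLTK word tokenization plugged into the TF-IDF vectorizer.
vectorizer = TfidfVectorizer(tokenizer=nltk.word_tokenize, lowercase=True)
X = vectorizer.fit_transform(resumes)

# Find the resumes closest to the job description in TF-IDF space.
knn = NearestNeighbors(n_neighbors=2, metric="cosine").fit(X)
distances, indices = knn.kneighbors(vectorizer.transform([job_description]))
print("best-matching resumes:", indices[0])
```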
Publication Date
Fri May 01 2015
Journal Name
IEEE Transactions on Microwave Theory and Techniques
On the Design of Gyroelectric Resonators and Circulators Using a Magnetically Biased 2-D Electron Gas (2-DEG)

Publication Date
Fri Nov 01 2013
Journal Name
Journal of Cosmetics, Dermatological Sciences and Applications
Treatment of chronic paronychia: A double-blind comparative clinical trial using vaseline, nystatin, and fusidic acid ointment singly

K. E. Sharquie, A. A. Noaimi, S. A. Galib, Journal of Cosmetics, Dermatological Sciences and Applications, 2013 (cited by 4).

Publication Date
Fri Jan 20 2023
Journal Name
Ibn Al-Haitham Journal for Pure and Applied Sciences
A Study of the Land Cover of Razzaza Lake during the Past 25 Years Using Remote Sensing Methods

In this study, the land cover of the Razzaza Lake area was studied over 25 years using remote sensing methods. Images from the Landsat 5 (TM) and Landsat 8 (OLI) satellites were used to study and determine the components of the land cover. The study covered the years 1995-2021 with an interval of 5 years, as this region is uninhabited, so the change in the land cover is slow. The land cover was divided into three main classes and seven subclasses and classified using the maximum likelihood classifier, with the help of training sets collected to represent the classes that made up the land cover. The changes detected in the land cover were studied by considering 1995 as the reference year. It was found that there was a significant reduction in the water mass…
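For readers unfamiliar with the classifier, the maximum likelihood classification step can be sketched as fitting a Gaussian density per class from the training sets and assigning each pixel to the most likely class; with full covariance matrices this is equivalent to quadratic discriminant analysis. The band values, classes, and pixel data below are invented and are not the study's Landsat data.

```python
# Hedged sketch of a maximum likelihood classifier for multispectral pixels.
# Per-class Gaussian densities with full covariances (the classic ML classifier)
# correspond to quadratic discriminant analysis with priors taken from the
# training sets. Band values and classes below are invented; the real study
# used Landsat 5 (TM) and Landsat 8 (OLI) scenes.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(2)

# Hypothetical training pixels (rows) with a few spectral bands (columns)
# for three land-cover classes: water, bare soil, vegetation.
water      = rng.normal([0.05, 0.04, 0.02], 0.01, (50, 3))
bare_soil  = rng.normal([0.25, 0.30, 0.35], 0.02, (50, 3))
vegetation = rng.normal([0.08, 0.12, 0.40], 0.02, (50, 3))

X_train = np.vstack([water, bare_soil, vegetation])
y_train = np.array([0] * 50 + [1] * 50 + [2] * 50)

clf = QuadraticDiscriminantAnalysis(store_covariance=True).fit(X_train, y_train)

# Classify an "image" of unseen pixels and tally the class areas.
pixels = rng.normal([0.07, 0.08, 0.20], 0.05, (1000, 3))
labels = clf.predict(pixels)
print("pixel counts per class:", np.bincount(labels, minlength=3))
```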