Discussion on techniques of data cleaning, user identification, and session identification phases of web usage mining from 2000 to 2022

The data preprocessing step is an important step in web usage mining because log data are heterogeneous, unstructured, and noisy. To keep the pattern-discovery algorithms scalable and efficient, a preprocessing step must be applied first. In this study, the sequential methodologies used to preprocess web server log data are comprehensively evaluated and examined, with an emphasis on the sub-phases of data cleaning, user identification, and session identification.
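To make the three sub-phases concrete, here is a minimal Python sketch; it is not taken from any of the surveyed papers, and the field names, the static-file filter, and the common 30-minute inactivity timeout are illustrative assumptions. It cleans parsed log entries, groups them into users by IP address and user agent, and splits each user's requests into sessions:

```python
from collections import defaultdict
from datetime import timedelta

# Illustrative only: the field names ("ip", "agent", "time", "url", "status"),
# the static-file filter, and the 30-minute timeout are common heuristics,
# not details taken from the surveyed papers.
SESSION_TIMEOUT = timedelta(minutes=30)
STATIC_SUFFIXES = (".css", ".js", ".png", ".jpg", ".gif", ".ico")

def clean(entries):
    """Data cleaning: drop static-resource requests and failed requests."""
    return [e for e in entries
            if not e["url"].lower().endswith(STATIC_SUFFIXES) and e["status"] == 200]

def identify_users(entries):
    """User identification: heuristically group entries by (IP, user agent)."""
    users = defaultdict(list)
    for e in entries:
        users[(e["ip"], e["agent"])].append(e)
    return users

def identify_sessions(user_entries):
    """Session identification: split one user's requests on inactivity gaps."""
    sessions, current = [], []
    for e in sorted(user_entries, key=lambda e: e["time"]):
        if current and e["time"] - current[-1]["time"] > SESSION_TIMEOUT:
            sessions.append(current)
            current = []
        current.append(e)
    if current:
        sessions.append(current)
    return sessions
```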

Publication Date: Mon Dec 05 2022
Journal Name: Baghdad Science Journal
Effects of Castor Oil Nanoemulsion Extracted by Hexane on the Fourth Larval Stage of Culex quinquefasciatus from Al Hawizeh Marsh/Iraq, and Non-Targeted Organism

The current study aims to show the importance of plant products as mosquitocides against Culex quinquefasciatus. Castor oil nanoemulsions were prepared in various ratios of castor oil, ethanol, Tween 80, and deionized water by using ultrasonication. Thermodynamic, centrifugation, and pH assays showed that the formula of 10 ml castor oil, 5 ml ethanol, 14 ml Tween 80, and 71 ml deionized water was more stable than the other formulas. The stable castor oil nanoemulsion formula was characterized by transmission electron microscopy (TEM) and dynamic light scattering (DLS). The nanoemulsion droplets were spherical in shape and had a Z-average diameter of 87.4 nm. A concentration of ca…

Publication Date: Wed Jun 28 2023
Journal Name: Internet Technology Letters
The Blockchain for Healthcare 4.0 Apply in Standard Secure Medical Data Processing Architecture

Cloud-based Electronic Health Records (EHRs) have seen a substantial increase in usage in recent years, especially for remote patient monitoring. Researchers are interested in investigating the use of Healthcare 4.0 in smart cities. This involves using Internet of Things (IoT) devices and cloud computing to remotely access medical processes. Healthcare 4.0 focuses on the systematic gathering, merging, transmission, sharing, and retention of medical information at regular intervals. Protecting the confidential and private information of patients presents several challenges in terms of thwarting illegal intrusion by hackers. Therefore, it is essential to prioritize the protection of patient medical data that is stored, accessed, and shared on…
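As a rough illustration of the integrity property that blockchain brings to medical-record storage, the toy Python ledger below hash-chains record entries so that any later modification can be detected. It is a generic sketch under simplifying assumptions, not the Healthcare 4.0 architecture proposed in the paper; a real deployment would store encrypted records or off-chain references rather than raw patient data.

```python
import hashlib
import json
import time

# Minimal hash-chained ledger sketch (illustrative only). Each block stores a
# medical-record payload and the hash of the previous block, so any later
# tampering with a stored block breaks the chain.
def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, record):
    block = {
        "index": len(chain),
        "timestamp": time.time(),
        "record": record,                       # e.g. an encrypted EHR reference
        "prev_hash": chain[-1]["hash"] if chain else "0" * 64,
    }
    block["hash"] = block_hash(block)
    chain.append(block)
    return block

def verify(chain):
    """Re-compute every hash and check each block links to its predecessor."""
    for i, block in enumerate(chain):
        expected = block_hash({k: v for k, v in block.items() if k != "hash"})
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, "enc:patient-123-visit-01")    # hypothetical encrypted reference
add_block(chain, "enc:patient-123-lab-02")
print(verify(chain))                            # True; editing any field makes this False
```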

Publication Date: Sun Dec 01 2019
Journal Name: Journal Of Economics And Administrative Sciences
Contemporary Challenges for Cloud Computing Data Governance in Information Centers: An analytical study

Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informatics and technological infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impact…

Publication Date: Sun Jan 01 2017
Journal Name: Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of secret-key cryptography, is the key: it plays a central role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; a stronger encryption key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to…
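For context, the sketch below shows plain Triple DES encryption using the PyCryptodome library. It is only a baseline illustration, not the paper's proposed scheme: the random key here stands in for the NTRU-strengthened key generation that the paper combines with 3DES.

```python
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

# A random 24-byte key with proper parity; in the paper's scheme this key
# material would instead come from the NTRU-based key generation step.
key = DES3.adjust_key_parity(get_random_bytes(24))

cipher = DES3.new(key, DES3.MODE_CBC)
ciphertext = cipher.encrypt(pad(b"confidential message", DES3.block_size))
iv = cipher.iv

# Receiver side: the same key and IV recover the plaintext.
plaintext = unpad(DES3.new(key, DES3.MODE_CBC, iv=iv).decrypt(ciphertext),
                  DES3.block_size)
assert plaintext == b"confidential message"
```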

Publication Date: Mon May 15 2017
Journal Name: Journal Of Theoretical And Applied Information Technology
Anomaly detection in text data represented as a graph using the DBSCAN algorithm

Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the behavior of every cluster are treated as noise or anomalies; in this way the DBSCAN algorithm can detect abnormal points that lie beyond a certain threshold (extreme values). However, anomalies are not only cases that are unusual or far from a specific group: there is also a type of data that does not occur repeatedly but is considered abnormal with respect to the known group. The analysis showed that DBSCAN using the…
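As a generic illustration of how DBSCAN flags anomalies (its noise label, -1, marks points that are not density-reachable from any cluster), consider the scikit-learn sketch below. The random 2-D points are a stand-in for the graph-derived text features used in the paper, and the eps and min_samples values are arbitrary example settings.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Points labelled -1 by DBSCAN (noise) are treated as anomalies.
# The feature vectors are random stand-ins, not graph-derived text features.
rng = np.random.default_rng(0)
normal = rng.normal(loc=0.0, scale=0.5, size=(200, 2))   # dense "normal" region
outliers = rng.uniform(low=-4, high=4, size=(10, 2))     # scattered anomalies
X = np.vstack([normal, outliers])

labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]
print(f"{len(anomalies)} points flagged as anomalies out of {len(X)}")
```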

Publication Date: Sun Mar 01 2015
Journal Name: Journal Of Engineering
Multi-Sites Multi-Variables Forecasting Model for Hydrological Data using Genetic Algorithm Modeling

A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was…
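The optimization idea, scoring candidate parameter sets with the Akaike criterion and refining them with a genetic-algorithm-style mutation step, can be sketched as below. This is a simplified single-candidate illustration with a hypothetical linear forecasting relation, not the paper's multi-site, multi-variable model.

```python
import numpy as np

# Illustrative sketch of the optimization idea only, not the paper's model:
# candidate parameter vectors for a hypothetical linear forecasting relation
# are scored with the Akaike Information Criterion (least-squares form) and
# refined by a simple Gaussian mutation step, keeping the better candidate.
def aic(resid, n_params):
    n = len(resid)
    rss = float(np.sum(resid ** 2))
    return n * np.log(rss / n) + 2 * n_params

def mutate(params, rng, scale=0.1):
    return params + rng.normal(0.0, scale, size=params.shape)

def optimize(X, y, generations=500, seed=1):
    rng = np.random.default_rng(seed)
    best = rng.normal(size=X.shape[1])
    best_score = aic(y - X @ best, best.size)
    for _ in range(generations):
        cand = mutate(best, rng)
        score = aic(y - X @ cand, cand.size)
        if score < best_score:          # keep the candidate with lower AIC
            best, best_score = cand, score
    return best, best_score
```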

Publication Date: Fri Jul 21 2023
Journal Name: Journal Of Engineering
A Modified 2D-Checksum Error Detecting Method for Data Transmission in Noisy Media

In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Every bit in the sent information has high priority, especially in fields such as the address of the receiver, so detecting each single-bit change is a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, show better results but still fail to cope with an increasing number of errors.
Two novel methods are suggested to detect binary bit-change errors when transmitting data in noisy media. Those methods are: a 2D-Checksum method…
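For reference, a plain (unmodified) two-dimensional checksum scheme can be sketched as follows; the data matrix, the mod-256 row/column sums, and the single flipped bit are illustrative examples, and the paper's modified method is not reproduced here.

```python
# Generic 2D-checksum sketch: data bytes are arranged in a matrix, and a
# checksum (sum mod 256) is appended for every row and every column. The
# receiver recomputes both sets; any mismatch indicates a transmission error.
def checksums(matrix):
    rows = [sum(row) % 256 for row in matrix]
    cols = [sum(col) % 256 for col in zip(*matrix)]
    return rows, cols

def detect_errors(matrix, rows, cols):
    got_rows, got_cols = checksums(matrix)
    bad_rows = [i for i, (a, b) in enumerate(zip(rows, got_rows)) if a != b]
    bad_cols = [j for j, (a, b) in enumerate(zip(cols, got_cols)) if a != b]
    return bad_rows, bad_cols

# Sender side
data = [[0x41, 0x42, 0x43, 0x44],
        [0x45, 0x46, 0x47, 0x48],
        [0x49, 0x4A, 0x4B, 0x4C]]
row_sums, col_sums = checksums(data)

# Receiver side: one bit of data[1][2] flipped by the noisy channel
received = [row[:] for row in data]
received[1][2] ^= 0x04
print(detect_errors(received, row_sums, col_sums))   # -> ([1], [2])
```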

Publication Date: Sun Jul 01 2018
Journal Name: Journal Of The American Pharmacists Association
Evaluation of community pharmacist–provided telephone interventions to improve adherence to hypertension and diabetes medications

Publication Date: Mon Jun 30 2008
Journal Name: Iraqi Journal Of Chemical And Petroleum Engineering
Optimal Quantitative and Distributive Analysis of Thermal Pollution due to Heated Water Released to Rivers

To reduce the effects of heated water discharged into a river flow by a single thermal source, two parameters were varied to find the minimum effect using optimization. The first parameter is the distribution of the total flow of the heated water between two disposal points (a double source) instead of one, and the second is the distance between these two points. To achieve the solution, a two-dimensional numerical model was developed to simulate and predict the changes in temperature distribution in the river due to disposal of the heated water from these two points.
MATLAB-7 software was used to build a program that solves the governing partial differential equations of thermal pollution in rivers by using t…
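As a generic illustration of the kind of two-dimensional model involved (not the paper's MATLAB-7 program), the Python sketch below advances an excess-temperature field in a river with an explicit upwind-advection and central-difference-diffusion step; the grid, flow velocity, dispersion coefficient, and source placement are made-up assumptions.

```python
import numpy as np

# Illustrative explicit finite-difference step for excess temperature T in a
# river: upwind advection along the flow (x) plus diffusion in x and y.
# All values below are assumed for the sketch, not taken from the paper.
nx, ny = 200, 40          # grid cells along and across the river
dx = dy = 1.0             # cell size [m]
dt = 0.1                  # time step [s]
u, D = 0.5, 0.05          # flow velocity [m/s], dispersion coefficient [m^2/s]

T = np.zeros((ny, nx))
T[18:22, 0] = 10.0        # heated-water source at the upstream boundary

def step(T):
    Tn = T.copy()
    Tn[1:-1, 1:-1] = (
        T[1:-1, 1:-1]
        - u * dt / dx * (T[1:-1, 1:-1] - T[1:-1, :-2])                      # advection
        + D * dt / dx**2 * (T[1:-1, 2:] - 2 * T[1:-1, 1:-1] + T[1:-1, :-2])  # x-diffusion
        + D * dt / dy**2 * (T[2:, 1:-1] - 2 * T[1:-1, 1:-1] + T[:-2, 1:-1])  # y-diffusion
    )
    Tn[18:22, 0] = 10.0   # keep the source boundary fixed
    return Tn

for _ in range(1000):
    T = step(T)
```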

Publication Date: Tue Jan 01 2019
Journal Name: Journal Of Global Pharma Technology
Study of the genetic diversity in three species belonging to family: Daphniidae (Crustaceae, Cladocera) collected from different regions in Baghdad Province/Iraq
