Mining categorical Covid-19 data using chi-square and logistic regression algorithms

Publication Date
Fri Jan 24 2020
Journal Name
Petroleum And Coal
Evaluation of Geomechanical Properties for Tight Reservoir Using Uniaxial Compressive Test, Ultrasonic Test, and Well Logs Data

Tight reservoirs have attracted the interest of the oil industry in recent years because of their significant contribution to global oil production. Several challenges arise when producing from these reservoirs due to their low to extra-low permeability and very narrow pore-throat radii. Selecting a development strategy for these reservoirs, covering horizontal well placement, hydraulic fracture design, well completion, smart production programs, and wellbore stability, requires accurate characterization of their geomechanical parameters. Geomechanical properties, including uniaxial compressive strength (UCS), static Young’s modulus (Es), and Poisson’s ratio (υs), were measured experimentally using both static and dynamic methods…

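For context, when core measurements are unavailable, the dynamic elastic moduli are conventionally derived from well-log sonic velocities and bulk density. The relations below are the standard ones, not quoted from the paper, with V_p the compressional velocity, V_s the shear velocity, and \rho the bulk density:

\nu_d = \frac{V_p^2 - 2V_s^2}{2\left(V_p^2 - V_s^2\right)}, \qquad E_d = \rho\, V_s^2\, \frac{3V_p^2 - 4V_s^2}{V_p^2 - V_s^2}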
Publication Date
Fri Jul 01 2022
Journal Name
Iraqi Journal Of Science
X.K.N: A Proposed Method for Data Encryption Using XOR and NOT Logical Gates with LFSR Generated Keys

In this paper, a method for data encryption is proposed using two secret keys: the first is a matrix of XOR and NOT gates (the XN key), while the second is a binary matrix (the KEYB key). XN and KEYB are (m×n) matrices with m equal to n. Furthermore, the paper proposes a strategy for generating secret keys (KEYBs) using the concept of the LFSR (Linear Feedback Shift Register) method, which depends on a secret starting point (a third secret key, the s-key). The proposed method is named X.K.N. X.K.N is a type of symmetric encryption; it treats the data as a set of blocks during preprocessing and then encrypts the binary data as a stream cipher.

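As a rough sketch of the key-generation idea, the Python below expands a secret seed (playing the role of the s-key) into an m×m binary key matrix using a Fibonacci LFSR. The register width and tap positions are hypothetical choices for illustration, not the paper's:

# Minimal Fibonacci LFSR sketch: expands a secret seed into a bit stream.
# Width and taps are illustrative (a maximal-length 16-bit register).
def lfsr_bits(seed: int, taps=(16, 14, 13, 11), width=16):
    state = seed & ((1 << width) - 1)
    assert state != 0, "an all-zero state would lock the LFSR"
    while True:
        fb = 0
        for t in taps:                      # XOR the tapped bit positions
            fb ^= (state >> (t - 1)) & 1
        state = ((state << 1) | fb) & ((1 << width) - 1)
        yield state & 1

def make_keyb(seed: int, m: int):
    """Fill an m x m binary matrix (the KEYB key) from the LFSR stream."""
    bits = lfsr_bits(seed)
    return [[next(bits) for _ in range(m)] for _ in range(m)]

keyb = make_keyb(seed=0xACE1, m=8)          # 8x8 binary key matrix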
Publication Date
Tue Oct 23 2018
Journal Name
Journal Of Economics And Administrative Sciences
Processing of missing values in survey data using Principal Component Analysis and probabilistic Principal Component Analysis methods

The idea of carrying out research on incomplete data came from the circumstances of our dear country and the horrors of war, which caused the loss of much important data across all aspects of life: economic, natural, health, scientific, and so on. The reasons for missingness vary; some lie outside the control of those concerned, while others are deliberate, planned because of cost or risk or because inspection was not feasible. The missing data in this study were processed using Principal Component Analysis and self-organizing map methods, using simulation. Variables of child health and variables affecting children's health were taken into account: breastfeeding…

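For illustration, a common textbook scheme for PCA-based imputation (not necessarily the authors' exact procedure, which also involves self-organizing maps) initializes the missing cells with column means and then alternates a low-rank fit with re-imputation:

import numpy as np

def pca_impute(X, k=2, iters=50):
    """Iterative rank-k PCA imputation of the NaN entries of X."""
    X = np.asarray(X, dtype=float)
    mask = np.isnan(X)
    X_hat = np.where(mask, np.nanmean(X, axis=0), X)   # mean-initialize
    for _ in range(iters):
        mu = X_hat.mean(axis=0)
        U, s, Vt = np.linalg.svd(X_hat - mu, full_matrices=False)
        low_rank = (U[:, :k] * s[:k]) @ Vt[:k] + mu    # rank-k reconstruction
        X_hat[mask] = low_rank[mask]                   # refresh missing cells only
    return X_hat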
Publication Date
Fri Mar 01 2024
Journal Name
Baghdad Science Journal
A Comparison between Ericson's Formulae Results and Experimental Data Using New Formulae of Single Particle Level Density

The partial level density (PLD) of pre-equilibrium reactions described by Ericson’s formula has been studied using different formulae for the single-particle level density g. The parameter g was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and further formulae for g were derived from the relation between g and the level density parameter a. The formulae used to derive g are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas-Fermi formula. The partial level density results that depend on g from the Thomas-Fermi formula show good agreement with the experimental data.

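For reference, Ericson's partial level density for p particles and h holes at excitation energy E has the standard form below, where g is the single-particle level density and n = p + h; g is commonly tied to the level density parameter through a = \pi^2 g / 6:

\rho(p, h, E) = \frac{g\,(gE)^{\,n-1}}{p!\; h!\; (n-1)!}, \qquad n = p + h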
Publication Date
Fri Jan 01 2010
Journal Name
Conference Proceedings
Assessing the accuracy of 'crowdsourced' data and its integration with official spatial data sets

Publication Date
Sat May 01 2021
Journal Name
Journal Of Physics: Conference Series
Regression shrinkage and selection variables via an adaptive elastic net model
In this paper, a new variable selection method is presented to select essential variables from large datasets. The new model is a modified version of the Elastic Net model, and the modified variable selection procedure is summarized in an algorithm. It is applied to the Leukemia dataset, which has 3051 variables (genes) and 72 samples; in practice, working with a dataset of this size is not easy. The modified model is compared with some standard variable selection methods and achieves perfect classification, giving the best performance. All the calculations for this paper were carried out in …
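For orientation, a plain elastic-net fit on data of the same shape (72 samples, 3051 variables) is sketched below with scikit-learn; this is the standard estimator on synthetic data, not the authors' modified model:

import numpy as np
from sklearn.linear_model import ElasticNetCV

# Synthetic stand-in with the Leukemia dataset's shape: 72 samples, 3051 genes,
# of which only 10 actually influence the response.
rng = np.random.default_rng(0)
X = rng.normal(size=(72, 3051))
beta = np.zeros(3051)
beta[:10] = 2.0
y = X @ beta + rng.normal(size=72)

# Cross-validated elastic net; nonzero coefficients are the selected variables.
model = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5).fit(X, y)
selected = np.flatnonzero(model.coef_)
print(f"kept {selected.size} of {X.shape[1]} variables")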
Publication Date
Wed Dec 28 2022
Journal Name
Al–bahith Al–a'alami
Contents of Campaign Advertisements “Take the Vaccine .. to Protect Yourself” to Raise Awareness about Vaccines Against the Covid-19 Virus (Analytical Study of the Ministry of Health Facebook Page)

This paper aims to identify the contents of the advertisements of the (Take the Vaccine .. to Protect Yourself) campaign carried out by the Iraqi Ministry of Health for the period from 11/19/2020 to 4/1/2022 to raise awareness of the anti-Covid-19 vaccines, which it published on its official Facebook page. The researcher used a comprehensive inventory method for the research community and used content analysis as the research tool.

Publication Date
Fri Feb 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
A Comparison of Parameters Estimation Methods for the Negative Binomial Regression Model under Multicollinearity Problem by Using Simulation

This study discusses a biased estimator of the Negative Binomial Regression model known as the Liu estimator. This estimator is used to reduce the variance and overcome the multicollinearity problem among the explanatory variables. Estimators such as Ridge Regression and Maximum Likelihood were also used. This research presents theoretical comparisons between the new estimator (the Liu estimator) and those estimators…

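For reference, the classical Liu estimator in the linear model has the closed form below, where \hat{\beta} is the least-squares/ML estimate and 0 < d < 1 is the biasing parameter; the negative-binomial version adapts this by replacing X^{\top}X with the iteratively weighted information matrix (the paper's exact variant is not reproduced here):

\hat{\beta}_d = \left(X^{\top}X + I\right)^{-1}\left(X^{\top}X + d\,I\right)\hat{\beta}, \qquad 0 < d < 1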
Publication Date
Sun Sep 24 2023
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method is based on a number of steps, starting with converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the original…

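To make the bit-plane step concrete, the sketch below decomposes an 8-bit grayscale image into its bit planes and keeps the most significant ones; everything downstream (the parameterized iris localization) is the paper's own contribution and is not sketched here:

import numpy as np

def bit_planes(gray: np.ndarray) -> list:
    """Split an 8-bit grayscale image into 8 binary bit-plane images."""
    assert gray.dtype == np.uint8
    return [((gray >> k) & 1).astype(np.uint8) for k in range(8)]

img = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in image
planes = bit_planes(img)
msb = planes[7]            # most significant plane: coarse dark/light structure
top3 = planes[5:8]         # a plausible "most significant planes" selection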
Publication Date
Sun Jan 01 2017
Journal Name
Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to the data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key; for a high level of secure communication, the key plays an important role. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. An enhanced encryption key strengthens the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to…

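As a baseline only, standard 3DES encryption with PyCryptodome looks as follows; the paper's contribution, deriving stronger key material via NTRU (the Nth degree truncated polynomial ring unit), is not reproduced here:

from Crypto.Cipher import DES3              # pip install pycryptodome
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

# Plain 3DES with a random 24-byte key. The paper would instead derive or
# strengthen this key material through an NTRU-based step.
key = DES3.adjust_key_parity(get_random_bytes(24))
cipher = DES3.new(key, DES3.MODE_CBC)
ct = cipher.encrypt(pad(b"secret message", DES3.block_size))

# Decryption requires the same key plus the IV generated for CBC mode.
pt = unpad(DES3.new(key, DES3.MODE_CBC, cipher.iv).decrypt(ct), DES3.block_size)
assert pt == b"secret message"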