The suggested threshold to reduce data noise for a factorial experiment

In this research, a 4×4 factorial experiment applied in a randomized complete block design was studied. Designed experiments are used to study the effect of treatments on experimental units and thus obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observed values and thereby increases the mean square error of the experiment. To reduce this noise, multiple wavelet shrinkage was used as a filter for the observations, by proposing an improved threshold that takes the different transform levels into account through the logarithm of the base, yielding several values of the proposed threshold. The Haar wavelet function was then applied with hard and mid thresholding, and the results were compared according to several criteria.
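The filtering step described above can be sketched in a few lines. This is a minimal one-level Haar illustration with hard thresholding on synthetic observations; the universal threshold σ·sqrt(2 log n) is used as a stand-in for the paper's proposed level-dependent threshold, which is not specified here.

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar transform: approximation and detail coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation (pair sums)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail (pair differences)
    return a, d

def haar_idwt(a, d):
    """Inverse one-level Haar transform."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def hard_threshold(d, thr):
    """Hard thresholding: zero out coefficients at or below the threshold."""
    return np.where(np.abs(d) > thr, d, 0.0)

rng = np.random.default_rng(0)
signal = np.repeat([2.0, 5.0, 1.0, 4.0], 16)      # piecewise-constant "treatment means"
noisy = signal + rng.normal(0, 0.5, size=signal.size)

a, d = haar_dwt(noisy)
# Universal threshold sigma * sqrt(2 log n), sigma estimated by the MAD of details
sigma = np.median(np.abs(d)) / 0.6745
thr = sigma * np.sqrt(2 * np.log(noisy.size))
denoised = haar_idwt(a, hard_threshold(d, thr))

print(np.mean((noisy - signal) ** 2), np.mean((denoised - signal) ** 2))
```

The denoised reconstruction should have a visibly smaller mean square error than the raw observations, which is the criterion the abstract uses to judge the proposed threshold.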

Publication Date
Sun Jan 01 2023
Journal Name
2nd International Conference on Mathematical Techniques and Applications: ICMTA2021
Review of clustering for gene expression data

Publication Date
Wed Feb 01 2017
Journal Name
Journal of Economics and Administrative Sciences
A comparison between the logistic regression model and Linear Discriminant analysis using Principal Component unemployment data for the province of Baghdad

The objective of the study is to determine which has the better predictive ability: the logistic regression model or the linear discriminant function, using the original data first and then principal components to reduce the dimensionality of the variables. The data come from the socio-economic survey of families in the province of Baghdad in 2012 and comprise a sample of 615 observations with 13 variables, 12 of which are explanatory; the dependent variable is the number of workers and unemployed.

The two methods were compared, and the comparison showed that the logistic regression model is better than the linear discriminant function.
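The principal-component reduction step that precedes either classifier can be sketched with plain numpy. The data below are a synthetic stand-in for the survey, and the 80% variance cutoff is an illustrative choice, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in for the survey: 615 observations, 12 explanatory variables
X = rng.normal(size=(615, 12))

# Principal components via SVD of the centered data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)                        # variance share per component
k = int(np.searchsorted(np.cumsum(explained), 0.80)) + 1   # components covering 80% of variance
scores = Xc @ Vt[:k].T   # reduced design matrix, usable by either classifier

print(scores.shape)
```

The `scores` matrix then replaces the original 12 explanatory variables as input to the logistic regression or the linear discriminant function.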

Publication Date
Thu Jun 01 2017
Journal Name
Journal of Economics and Administrative Sciences
Customers' emotional blackmail and reducing it through the new product: a study of the opinions of a sample of customers of Peak Economy for household items in Najaf al-Ashraf

The challenges facing today's customers are many, owing to the multiplicity of products and the speed at which new products are launched. The research therefore set out to reveal the classification standards of the new product through the relationship among (good products, low-interest products, useful products, and desired products) and customers' emotional blackmail through (fear, obligation, and guilt). The research problem was identified in several questions focused on the nature of the relationship between the research variables; accordingly, an outline was proposed, expressed in one main hypothesis from which four sub-hypotheses branched. To verify the validity of the ass…

Publication Date
Tue Dec 01 2020
Journal Name
Gulf Economist
The Bayesian Estimation in Competing Risks Analysis for Discrete Survival Data under Dynamic Methodology with Application to Dialysis Patients in Basra/ Iraq

Survival analysis is a type of data analysis that describes the time period until the occurrence of an event of interest, such as death or other events important for determining what will happen to the studied phenomenon. When there is more than one possible endpoint for the event, the setting is called competing risks. The purpose of this research is to apply the dynamic approach to the analysis of discrete survival time in order to estimate the effect of covariates over time, as well as to model the nonlinear relationship between the covariates and the discrete hazard function through the multinomial logistic model and the multivariate Cox model. For the purpose of conducting the estimation process for both the discrete…
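The discrete cause-specific hazard at the core of this setup can be estimated directly from life-table counts: events of a given cause at time t divided by the number still at risk at t. A toy illustration (the data and the two causes are invented; the paper's multinomial logistic and Cox modelling layers are omitted):

```python
import numpy as np

# Toy discrete survival data: event time and cause (0 = censored, 1/2 = competing causes)
times  = np.array([1, 1, 2, 2, 2, 3, 3, 4, 4, 5])
causes = np.array([1, 0, 2, 1, 1, 0, 2, 1, 0, 2])

def discrete_cause_specific_hazard(times, causes, cause, t):
    """lambda_c(t) = P(T = t, C = c | T >= t): events of this cause at t
    divided by the number of subjects still at risk at t."""
    at_risk = np.sum(times >= t)
    events = np.sum((times == t) & (causes == cause))
    return events / at_risk if at_risk else 0.0

for t in range(1, 6):
    h1 = discrete_cause_specific_hazard(times, causes, 1, t)
    h2 = discrete_cause_specific_hazard(times, causes, 2, t)
    print(t, round(h1, 3), round(h2, 3))
```

The regression models in the paper replace these raw proportions with covariate-dependent hazards, but the quantity being modelled is the same.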

Publication Date
Sun Jan 14 2024
Journal Name
Journal of Al-Rafidain University College for Sciences (Print ISSN: 1681-6870, Online ISSN: 2790-2293)
Using Nonparametric Procedure to Develop an OCMT Estimator for Big Data Linear Regression Model with Application Chemical Pollution in the Tigris River

Chemical pollution is a very important issue that people suffer from; it often affects the health of society and of future generations. Consequently, it must be studied in order to discover suitable models and find descriptions that predict its behavior in the coming years. Chemical pollution data in Iraq have great scope and manifold sources and kinds, which makes them Big Data that need to be studied using novel statistical methods. The research focuses on using a proposed nonparametric procedure (NP method) to develop an OCMT test procedure to estimate the parameters of a linear regression model with a large volume of data (Big Data) comprising many indicators associated with chemi…
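The OCMT (One Covariate at a time Multiple Testing) idea can be sketched as follows: test each regressor marginally against the response and keep the ones whose t-statistic clears a critical value. The data are synthetic, and the fixed critical value 3.0 is an illustrative stand-in for the procedure's calibrated critical value.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 2000, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[[0, 3, 7]] = [1.5, -2.0, 1.0]        # only three truly active covariates
y = X @ beta + rng.normal(size=n)

def ocmt_select(X, y, crit=3.0):
    """One covariate at a time: regress y on each covariate separately and
    keep those whose marginal t-statistic exceeds the critical value."""
    n = len(y)
    keep = []
    for j in range(X.shape[1]):
        r = np.corrcoef(X[:, j], y)[0, 1]
        t = r * np.sqrt((n - 2) / (1 - r ** 2))
        if abs(t) > crit:
            keep.append(j)
    return keep

selected = ocmt_select(X, y)
print(selected)
```

Because each test involves only one covariate, the cost grows linearly in p, which is what makes this style of screening attractive for Big Data regression.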

Publication Date
Thu Aug 01 2019
Journal Name
Journal of Economics and Administrative Sciences
Some Estimation methods for the two models SPSEM and SPSAR for spatially dependent data

ABSTRACT

In this paper, some semi-parametric spatial models were estimated: the semi-parametric spatial error model (SPSEM), which suffers from the problem of spatially dependent errors, and the semi-parametric spatial autoregressive model (SPSAR). The method of maximum likelihood was used to estimate the spatial error parameter (λ) in the SPSEM model and the spatial dependence parameter (ρ) in the SPSAR model, while nonparametric methods were used to estimate the smoothing function m(x) for the two models. These nonparametric methods include the local linear estimator (LLE), which requires finding the smoo…
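The local linear estimator mentioned above fits, at each evaluation point, a weighted least-squares line with kernel weights; the fitted intercept is the estimate of m(x). A minimal numpy sketch on synthetic data (the Gaussian kernel and the bandwidth 0.3 are illustrative choices, and the spatial structure of the paper's models is omitted):

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear estimate of m(x0): weighted least squares of y on
    (1, x - x0) with Gaussian kernel weights of bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]  # the intercept is the fitted value at x0

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.2, size=x.size)

mhat = local_linear(np.pi / 2, x, y, h=0.3)
print(mhat)  # close to sin(pi/2) = 1
```

In the semi-parametric spatial models, this smoother supplies the estimate of m(x) while λ or ρ is estimated by maximum likelihood.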

Publication Date
Fri Mar 31 2017
Journal Name
Iraqi Journal of Biotechnology
Reliable Reference Gene for Normalization of RT-qPCR Data in Human Cancer Cell Lines Subjected to Gene Knockdown

Quantitative real-time polymerase chain reaction (RT-qPCR) has become a valuable molecular technique in biomedical research. The selection of suitable endogenous reference genes is necessary for normalization of target gene expression in RT-qPCR experiments. The aim of this study was to determine the suitability of each of 18S rRNA and ACTB as internal control genes for normalization of RT-qPCR data in some human cell lines transfected with small interfering RNA (siRNA). Four cancer cell lines, including MCF-7, T47D, MDA-MB-231 and HeLa cells, along with HEK293 representing an embryonic cell line, were depleted of E2F6 using siRNA specific for E2F6, compared to negative control cells transfected with siRNA not specific for any gene. Us…
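Reference-gene normalization of this kind is commonly summarized by the 2^(−ΔΔCt) method of Livak and Schmittgen: the target gene's Ct is first normalized against the reference gene within each sample, then against the control condition. A minimal sketch with invented Ct values (not the study's measurements):

```python
# Minimal sketch of the 2^(-ΔΔCt) relative-expression calculation.
# All Ct values below are invented for illustration only.
target_ct_knockdown = 28.0   # target gene Ct in siRNA-treated cells (hypothetical)
ref_ct_knockdown    = 15.0   # reference gene (e.g. ACTB) Ct, same sample
target_ct_control   = 25.5   # target gene Ct in negative-control cells
ref_ct_control      = 15.2   # reference gene Ct, control sample

d_ct_knockdown = target_ct_knockdown - ref_ct_knockdown   # normalize to reference
d_ct_control   = target_ct_control - ref_ct_control
dd_ct = d_ct_knockdown - d_ct_control                     # normalize to control
fold_change = 2 ** (-dd_ct)                               # relative expression

print(round(fold_change, 3))
```

A fold change well below 1, as here, would indicate successful knockdown; the point of the study is that this number is only trustworthy if the reference gene itself is stable under the treatment.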

Publication Date
Tue Dec 01 2020
Journal Name
Journal of Economics and Administrative Sciences
Use The moment method to Estimate the Reliability Function Of The Data Of Truncated Skew Normal Distribution

The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution flexible enough to deal with such data. In the data of Diyala Company for Electrical Industries, a positive skew was observed in the data collected from the Power and Machinery Department, which required a distribution that handles those data and a search for methods that accommodate this problem and lead to accurate estimates of the reliability function, …
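The method-of-moments step can be illustrated with the simplest case, a plain normal model: match sample moments to distribution moments, then evaluate R(t) = P(T > t). The paper's truncated skew normal requires matching additional moments; the lifetimes below are synthetic, not the company's data.

```python
import math
import numpy as np

rng = np.random.default_rng(4)
lifetimes = rng.normal(loc=100.0, scale=15.0, size=500)   # synthetic component lifetimes

# Method of moments for a plain normal model: the first sample moment gives
# the location, the second central moment gives the scale.
mu_hat = lifetimes.mean()
sigma_hat = lifetimes.std(ddof=0)

def reliability(t, mu, sigma):
    """R(t) = P(T > t) = 1 - Phi((t - mu) / sigma) for the fitted normal."""
    return 0.5 * math.erfc((t - mu) / (sigma * math.sqrt(2)))

print(reliability(100.0, mu_hat, sigma_hat))
```

For a truncated skew normal the same matching is done against that distribution's moment expressions, but the pipeline (moments in, reliability function out) is identical.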

Publication Date
Tue Sep 01 2015
Journal Name
2015 IEEE International Circuits and Systems Symposium (ICSyS)
Investigating the impact of on-chip interconnection noise on Dynamic Thermal Management efficiency

Dynamic Thermal Management (DTM) emerged as a solution to the reliability challenges posed by thermal hotspots and unbalanced temperatures. DTM efficiency is strongly affected by the accuracy of the temperature information presented to the DTM manager. This work investigates how inaccuracy caused by deep sub-micron (DSM) noise during the transmission of temperature information to the manager affects DTM efficiency. A simulation framework was developed, and results show up to 38% DTM performance degradation and 18% unattended cycles at emergency temperature under DSM noise. The findings highlight the importance of further research into reliable on-chip data transmission for DTM applications.
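The failure mode under study can be illustrated with a toy simulation: corrupt transmitted temperature words with random bit flips and count how often the DTM threshold decision changes. This is not the paper's framework; the trip point, word width, and bit error rate below are all invented.

```python
import numpy as np

rng = np.random.default_rng(5)
THRESHOLD = 85          # emergency temperature trip point (hypothetical)
BITS = 8                # temperature encoded as an 8-bit word on the interconnect

true_temp = rng.integers(60, 100, size=10000)   # sensor readings before transmission
ber = 1e-3                                      # assumed bit error rate on the link

# Flip each bit of each transmitted word independently with probability `ber`
flips = rng.random((true_temp.size, BITS)) < ber
noise = (flips * (1 << np.arange(BITS))).sum(axis=1)
received = true_temp ^ noise

# Fraction of DTM threshold decisions that differ from the noise-free case
wrong = np.mean((received > THRESHOLD) != (true_temp > THRESHOLD))
print(wrong)
```

Even a small per-bit error rate corrupts a noticeable fraction of threshold decisions, because a single high-order bit flip moves a reading across the trip point.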

Publication Date
Thu Dec 11 2025
Journal Name
Journal of Al-Muthanna for Agricultural Sciences
A Proposed Approach to Agricultural Extension in Iraq for a Better Response to the Needs of Farmers to Address Their Challenges
