Using panel data in structural equations with application

A non-stationary series is a persistent problem for statistical analysis. As theoretical work has explained, the properties of statistical regression analysis are lost when non-stationary series are used, giving the slope of a spurious relationship among the variables under consideration. A non-stationary series can be made stationary by adding a time variable to the analysis to remove the general trend, by adding seasonal dummy variables to remove the seasonal effect, by converting the data to exponential or logarithmic form, or by differencing the series d times, in which case it is said to be integrated of order d. The theoretical side of the research consists of several parts. The first part presents the research methodology and emphasizes the importance of the research as a new contribution for professionals and researchers in this area. The second part introduces the concept of a system of simultaneous equations for combining cross-sectional data and time series, estimated once with fixed effects for periods and groups and once without them. It also presents the diagnostic conditions of the model used in the analysis, which include the rank and order conditions, illustrates the use of the two-stage least squares method in estimating the data used in the research, presents tests of the fixed effects for each of the groups and periods, and introduces the Phillips-Perron test. The research problem can be summarized as the non-stationarity of the panel data, which was detected by applying the Phillips-Perron test at the level of the series and at the first and second differences, with fixed effects for each of the periods and groups; the research also states its goal, its premises, and the nature of the variables used.
On the practical side, the results of estimating the system of simultaneous equations used in the research for the period (1990-2005) are presented, disaggregated by estimation method and by the function of each sector (public, mixed, cooperative, private) separately.
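The "integrated of order d" idea above can be sketched with a minimal differencing routine (pure Python; the data below are hypothetical, not the research series):

```python
def difference(series, d=1):
    """Apply the difference operator d times: y'_t = y_t - y_(t-1).

    A series that becomes stationary after d differences is said to be
    integrated of order d, written I(d).
    """
    for _ in range(d):
        series = [b - a for a, b in zip(series, series[1:])]
    return series

# A series with a linear trend is non-stationary in the mean:
trend = [2 * t + 5 for t in range(6)]   # 5, 7, 9, 11, 13, 15
print(difference(trend, d=1))           # [2, 2, 2, 2, 2] -- trend removed
```

One difference removes a linear trend; a quadratic trend needs d = 2, matching the "difference repeated d times" description in the abstract.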

Publication Date
Thu Feb 01 2024
Journal Name
Baghdad Science Journal
Estimating the Parameters of Exponential-Rayleigh Distribution for Progressively Censored Data with S-Function about COVID-19

The two parameters of the Exponential-Rayleigh distribution were estimated using the maximum likelihood estimation (MLE) method for progressively censored data. Estimated values for the two scale parameters were obtained using real COVID-19 data taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The Chi-square test was then utilized to determine whether the sample data corresponded to the Exponential-Rayleigh (ER) distribution. The nonlinear membership function (S-function) was employed to find fuzzy numbers for these parameter estimators, and the ranking function was used to transform the fuzzy numbers into crisp numbers. Finally, mean square error (MSE) was used to compare the outcomes of the survival…
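The abstract does not give the paper's exact S-function, but the classic S-shaped (Zadeh) membership function it presumably refers to can be sketched as:

```python
def s_function(x, a, c):
    """Zadeh's S-shaped membership function on [a, c].

    Rises smoothly from 0 at x = a to 1 at x = c, with the
    crossover point 0.5 at the midpoint b = (a + c) / 2.
    """
    b = (a + c) / 2
    if x <= a:
        return 0.0
    if x <= b:
        return 2 * ((x - a) / (c - a)) ** 2
    if x <= c:
        return 1 - 2 * ((x - c) / (c - a)) ** 2
    return 1.0
```

Applied to a crisp parameter estimate, such a membership function yields the fuzzy number that a ranking function then maps back to a crisp value, as the abstract describes.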

Publication Date
Fri Mar 31 2017
Journal Name
Al-khwarizmi Engineering Journal
Big-data Management using Map Reduce on Cloud: Case study, EEG Images' Data

A database is an organized and distributed collection of data arranged so that the client can access the stored data in a simple and convenient way. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large amounts of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique on big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
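The MapReduce pattern the study applies on Hadoop can be sketched in miniature as a word count over data chunks (a toy stand-in for the EEG records, not the paper's implementation):

```python
from collections import defaultdict

def map_phase(chunk):
    # Mapper: emit a (key, 1) pair for every word in one data chunk.
    return [(word, 1) for word in chunk.split()]

def reduce_phase(pairs):
    # Reducer: after the shuffle, sum the values for each key.
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value
    return dict(groups)

chunks = ["eeg signal eeg", "signal noise"]          # data split across nodes
mapped = [p for c in chunks for p in map_phase(c)]   # map step (parallelizable)
print(reduce_phase(mapped))  # {'eeg': 2, 'signal': 2, 'noise': 1}
```

The map step runs independently per chunk, which is what lets Hadoop distribute the work across cloud nodes and produce the response-time reduction the abstract reports.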

Publication Date
Tue Dec 01 2020
Journal Name
Gulf Economist
The Bayesian Estimation in Competing Risks Analysis for Discrete Survival Data under Dynamic Methodology with Application to Dialysis Patients in Basra/ Iraq

Survival analysis is a type of data analysis that describes the time period until the occurrence of an event of interest, such as death or another event that determines what will happen to the phenomenon under study. There may be more than one endpoint for the event, in which case the setting is called competing risks. The purpose of this research is to apply the dynamic approach to the analysis of discrete survival time in order to estimate the effect of covariates over time, as well as to model the nonlinear relationship between the covariates and the discrete hazard function through the multinomial logistic model and the multivariate Cox model. For the purpose of conducting the estimation process for both the discrete…
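The discrete hazard and survival functions at the core of this setting can be sketched from a simple life table (the counts below are hypothetical, not the dialysis data):

```python
def discrete_hazard(at_risk, events):
    """Discrete hazard h(t) = (events at time t) / (subjects at risk at t)."""
    return [d / n for d, n in zip(events, at_risk)]

def survival(hazards):
    """Discrete survival S(t) = product over k <= t of (1 - h(k))."""
    out, s = [], 1.0
    for h in hazards:
        s *= (1 - h)
        out.append(s)
    return out

h = discrete_hazard(at_risk=[100, 90, 70], events=[10, 20, 7])
print(h)            # [0.1, 0.222..., 0.1]
print(survival(h))  # roughly [0.9, 0.7, 0.63]
```

With competing risks, the single event count at each time splits into one count per cause, giving one cause-specific hazard per endpoint; the multinomial logistic model in the abstract is the standard way to parameterize those cause-specific hazards with covariates.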

Publication Date
Fri Feb 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of estimation methods of the entropy function for the random coefficients of two models: the general regression and Swamy models of panel data

In this study, we focus on random-coefficient estimation for the general regression and Swamy models of panel data. Using this type of data gives a better chance of obtaining a better method and better indicators. Entropy methods were used to estimate the random coefficients for the general regression and Swamy models of the panel data in two ways: the first is the maximum dual entropy method and the second is the general maximum entropy method, and a comparison between them was made using simulation to choose the optimal method.

The results were compared using mean squared error and mean absolute percentage error for different cases in terms of the correlation value…
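The two comparison criteria mentioned, mean squared error and mean absolute percentage error, can be written out directly (the actual/predicted values below are hypothetical):

```python
def mse(actual, predicted):
    """Mean squared error: average of the squared residuals."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    """Mean absolute percentage error (actual values must be nonzero)."""
    return 100 * sum(abs((a - p) / a)
                     for a, p in zip(actual, predicted)) / len(actual)

actual, pred = [100, 200, 400], [110, 190, 360]
print(mse(actual, pred))   # (100 + 100 + 1600) / 3 = 600.0
print(mape(actual, pred))  # (10% + 5% + 10%) / 3, about 8.33
```

MSE penalizes large errors quadratically, while MAPE is scale-free, which is why simulation studies often report both when ranking estimation methods.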

Publication Date
Thu Dec 31 2020
Journal Name
Journal Of Accounting And Financial Studies ( Jafs )
Application of the data envelopment analysis (DEA) technique to evaluate performance efficiency: applied research in the General Tax Authority

The aim of the research is to use the data envelopment analysis (DEA) technique to evaluate the efficiency of the performance of the eight branches of the General Tax Authority located in Baghdad (Karrada, Karkh parties, Karkh Center, Dora, Bayaa, Kadhimiya, New Baghdad, Rusafa). The inputs are the numbers of non-accountable taxpayers by category (professions and commercial business, deduction, transfer of property ownership, real estate, and tenders), and the outputs are determined according to a checklist containing nine dimensions for assessing how efficiently the investigated branches invest their available resources…
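As a rough illustration of the efficiency idea behind DEA, the sketch below scores units by their output/input ratio relative to the best-performing unit. This is a deliberate single-input, single-output simplification with hypothetical figures; the actual CCR/BCC DEA models for multi-input, multi-output branches require solving a linear program per unit:

```python
def dea_efficiency(inputs, outputs):
    """Single-input, single-output efficiency scores.

    Each unit's output/input ratio is divided by the best ratio,
    so the frontier (best) unit scores exactly 1.0.
    """
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical branches: taxpayers processed (output) per staff member (input).
print(dea_efficiency(inputs=[10, 20, 40], outputs=[50, 80, 200]))
# [1.0, 0.8, 1.0] -- units 1 and 3 define the frontier
```

A score below 1.0 flags a branch that produces less output per unit of input than the frontier branches, which is the comparison the checklist-based evaluation in the abstract formalizes.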

Publication Date
Thu Feb 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of sliced inverse regression with principal components in reducing high-dimensional data using simulation

This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide a good estimation of the parameters, so the problem must be dealt with directly. Two methods were used to address high-dimensional data. The first is the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method; the second is principal component analysis (PCA), the standard method for reducing dimensions. Both SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear…
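As a minimal illustration of the PCA side of the comparison, the first principal component of two-dimensional data can be computed from the closed-form eigendecomposition of the 2x2 covariance matrix (hypothetical data, not the simulation design in the paper):

```python
import math

def pca_2d(xs, ys):
    """First principal component of 2-D data.

    Uses the closed-form eigenvalues of the 2x2 sample covariance
    matrix [[sxx, sxy], [sxy, syy]]; returns (largest eigenvalue,
    unit eigenvector) -- the direction of maximal variance.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    lam1 = tr / 2 + math.sqrt(tr ** 2 / 4 - det)   # largest eigenvalue
    # Eigenvector for lam1: (A - lam1*I)v = 0  =>  v = (sxy, lam1 - sxx).
    v = (sxy, lam1 - sxx) if sxy != 0 else (1.0, 0.0)
    norm = math.hypot(*v)
    return lam1, (v[0] / norm, v[1] / norm)

# Nearly collinear data: the component should point along y = x.
lam, direction = pca_2d([1, 2, 3, 4], [1.1, 2.0, 2.9, 4.2])
```

Projecting the data onto this direction is the one-dimensional reduction; SIR differs in that its directions are chosen using the response variable, not the predictors alone.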

Publication Date
Fri Apr 01 2022
Journal Name
Baghdad Science Journal
Improved Firefly Algorithm with Variable Neighborhood Search for Data Clustering

Among metaheuristic algorithms, population-based algorithms are explorative search algorithms that are superior to local search algorithms in exploring the search space for globally optimal solutions. However, their primary downside is low exploitative capability, which prevents expanding the search-space neighborhood toward more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the…
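The standard FA position update underlying the paper can be sketched as follows (a minimal sketch with the conventional parameter names beta0, gamma, alpha; the paper's variable neighborhood search extension is not shown):

```python
import math
import random

def move_firefly(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """Move firefly xi toward a brighter firefly xj.

    Standard FA update:
      x_i <- x_i + beta0 * exp(-gamma * r^2) * (x_j - x_i)
                 + alpha * (rand - 0.5)
    where r is the distance between the two fireflies: attraction
    decays with distance, plus a small random exploration term.
    """
    rng = rng or random.Random(0)   # fixed seed for reproducibility
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    beta = beta0 * math.exp(-gamma * r2)
    return [a + beta * (b - a) + alpha * (rng.random() - 0.5)
            for a, b in zip(xi, xj)]

xi, xj = [0.0, 0.0], [1.0, 1.0]
xi_new = move_firefly(xi, xj)   # pulled part of the way toward xj
```

Premature convergence arises when all fireflies collapse around one bright solution; a neighborhood search step applied to the best solutions is one way to keep exploiting the local region without freezing the population.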

Scopus (12)
Crossref (3)
Publication Date
Sat Oct 20 2018
Journal Name
Journal Of Economics And Administrative Sciences
The Effect of Extreme Values on Streeter-Phelps Model Parameter Estimators, with Application

Extreme values in the readings of the parameters BOD (biological oxygen demand) and DO (dissolved oxygen) can cause errors in estimating the parameters of the model used to determine the deoxygenation and reoxygenation rates of the dissolved oxygen (DO). This in turn leads to discharging large amounts of sewage-polluted water into rivers, which negatively affects the ecosystem and the different types of water resources.

As a result, this research estimates the Streeter-Phelps model parameters (Kd, Kr), the deoxygenation and reoxygenation rates, with respect…
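The classic Streeter-Phelps oxygen-sag equation behind the Kd and Kr estimators can be written out directly (a sketch with hypothetical inputs, not the research data):

```python
import math

def streeter_phelps(L0, D0, Kd, Kr, t):
    """Streeter-Phelps oxygen-sag equation.

    Returns the dissolved-oxygen deficit D(t) given the initial BOD L0,
    initial deficit D0, deoxygenation rate Kd and reoxygenation rate Kr
    (rates per day, Kd != Kr):
      D(t) = Kd*L0/(Kr - Kd) * (exp(-Kd*t) - exp(-Kr*t)) + D0*exp(-Kr*t)
    """
    return (Kd * L0 / (Kr - Kd)) * (math.exp(-Kd * t) - math.exp(-Kr * t)) \
           + D0 * math.exp(-Kr * t)

# Hypothetical values: L0 = 10 mg/L BOD, D0 = 2 mg/L initial deficit.
D = streeter_phelps(L0=10, D0=2, Kd=0.3, Kr=0.5, t=2.0)
```

Because D(t) depends on Kd and Kr through these exponentials, extreme BOD or DO readings propagate nonlinearly into the fitted rates, which is the sensitivity the research investigates.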

Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of conditional logistic regression models with fixed and mixed effects for longitudinal data

Mixed-effects conditional logistic regression is evidently more effective in studying qualitative differences in longitudinal pollution data and their implications for heterogeneous subgroups. This study argues that conditional logistic regression is a robust evaluation method for environmental studies, through the analysis of environmental pollution as a function of oil production and environmental factors. Consequently, it has been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design. The candidate model should achieve generalizability, goodness of fit, and parsimony, and establish equilibrium between bias and variab…

Publication Date
Tue Aug 01 2023
Journal Name
Baghdad Science Journal
Digital Data Encryption Using a Proposed W-Method Based on AES and DES Algorithms

This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity raises the level of the ciphering process; moreover, the operation shifts only one bit to the right. The second is the nature of the encryption process: it includes two keys and mixes one round of DES with one round of AES to reduce the running time. The W-method deals with…
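The key merge and one-bit right shift described above can be sketched as bit operations (a hypothetical sketch: how the 64-bit DES-derived and AES-derived halves are actually produced is not shown in the excerpt):

```python
def merge_root_key(des_half, aes_half):
    """Concatenate a 64-bit DES-derived half with a 64-bit AES-derived
    half into one 128-bit root key (hypothetical sketch of the
    W-method's key merge)."""
    assert des_half < 2 ** 64 and aes_half < 2 ** 64
    return (des_half << 64) | aes_half

def rotate_right_1(key, width=128):
    """Circularly shift the key one bit to the right, so each round key
    is derived from the previous one without losing any bits."""
    return (key >> 1) | ((key & 1) << (width - 1))

root = merge_root_key(0x0123456789ABCDEF, 0xFEDCBA9876543210)
round_keys, k = [], root
for _ in range(15):          # the 15 remaining round keys
    k = rotate_right_1(k)
    round_keys.append(k)
```

A circular shift (rather than a plain shift) is assumed here so the derivation is reversible; 128 successive rotations return the original root key.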

Scopus (6)
Crossref (2)