Some Estimation methods for the two models SPSEM and SPSAR for spatially dependent data

ABSTRACT

In this paper, two semi-parametric spatial models are estimated: the semi-parametric spatial error model (SPSEM), which suffers from spatially dependent errors, and the semi-parametric spatial autoregressive model (SPSAR). The maximum likelihood method is used to estimate the spatial error parameter (λ) in the SPSEM model and the spatial dependence parameter (ρ) in the SPSAR model, while non-parametric methods are used to estimate the smoothing function m(x) in both models. These non-parametric methods are: the local linear estimator (LLE), which requires choosing the smoothing parameter (h) by the cross-validation (CV) criterion; the two-step local linear estimator applied after removing the effect of spatial error dependence, once using the spatial variance-covariance matrix of the errors (Ω) obtained with a kernel function (LLEK2), and once using the matrix (Ω*) obtained with a cubic B-spline estimator (LLECS2); and the two-step local linear estimator based on a suggested kernel estimator, again once with the kernel-based variance-covariance matrix (SUGK2) and once with the cubic B-spline-based matrix (SUGCS2), in each case to remove the effect of the spatial error dependence.

A simulation experiment with 1000 replications was run for three sample sizes, three levels of variance, and the two models, with the matrix of distances between observation sites computed from the Euclidean distance. The estimation methods described above were applied to the SPSEM and SPSAR models using a spatial neighborhood matrix modified under the Rook contiguity criterion. Comparing the methods by the mean absolute percentage error (MAPE) shows that the best method for the SPSEM model is SUGCS2, and for the SPSAR model it is LLECS2.
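The local linear smoother, its cross-validated bandwidth, and the MAPE comparison criterion described above can be sketched as follows. This is a minimal Python illustration with a Gaussian kernel; the function names and the leave-one-out form of CV are our own assumptions for the sketch, not details taken from the paper:

```python
import numpy as np

def local_linear(x, y, x0, h):
    """Local linear estimate of m(x0) with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local linear design
    W = np.diag(w)
    beta = np.linalg.lstsq(X.T @ W @ X, X.T @ W @ y, rcond=None)[0]
    return beta[0]                                  # intercept = m(x0)

def cv_bandwidth(x, y, grid):
    """Choose h from a candidate grid by leave-one-out cross-validation."""
    best_h, best_score = None, np.inf
    for h in grid:
        resid = []
        for i in range(len(x)):
            mask = np.arange(len(x)) != i
            resid.append(y[i] - local_linear(x[mask], y[mask], x[i], h))
        score = float(np.mean(np.square(resid)))
        if score < best_score:
            best_h, best_score = h, score
    return best_h

def mape(y_true, y_pred):
    """Mean absolute percentage error, the comparison criterion."""
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)
```

The two-step estimators in the paper additionally transform the data with an estimated spatial variance-covariance matrix before smoothing; that step is omitted here.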

Publication Date
Thu Jun 30 2022
Journal Name
Journal Of Economics And Administrative Sciences
Bayes Analysis for the Scale Parameter of Gompertz Distribution

In this paper, we investigate the behavior of the Bayes estimators for the scale parameter of the Gompertz distribution under two different loss functions: the squared error loss function and the (proposed) exponential loss function, based on different double prior distributions, namely Erlang with inverse Levy prior, Erlang with non-informative prior, inverse Levy with non-informative prior, and Erlang with chi-square prior.

A simulation study was carried out to obtain the results, including the estimated values and the mean square error (MSE) for the scale parameter of the Gompertz distribution, for different cases of the scale parameter of the Gompertz distr
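The Monte Carlo workflow behind such a comparison (repeatedly simulate a sample, apply an estimator, average the squared errors) can be sketched generically. This is an illustration of the MSE-by-simulation idea only; for brevity it uses an exponential scale parameter and the sample mean as a stand-in estimator, not the Gompertz distribution or the double-prior Bayes estimators of the paper:

```python
import numpy as np

def simulate_mse(estimator, true_theta, sample, n, reps=1000, seed=0):
    """Monte Carlo MSE of an estimator: average squared error over
    repeated samples drawn at the true parameter value."""
    rng = np.random.default_rng(seed)
    errs = [(estimator(sample(rng, true_theta, n)) - true_theta) ** 2
            for _ in range(reps)]
    return float(np.mean(errs))
```

With the paper's setup, `estimator` would be the posterior mean (squared error loss) or the exponential-loss Bayes estimator under each double prior, and `sample` would draw from the Gompertz distribution.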

Publication Date
Wed Jun 30 2010
Journal Name
Al-kindy College Medical Journal
Electrical stimulation for the treatment of Knee joint osteoarthritis

Abstract: 20 patients with osteoarthritis of the knee joint were treated by electrical stimulation in the form of 6 sessions every other day, each session consisting of diphase fixe (DF) current for 4 minutes, followed by rest for 4 minutes, then monophase fixe (MF) current for 2 minutes. By clinical and statistical analysis (P value < 0.05), we conclude that electrical stimulation is effective as one method in the treatment of osteoarthritis.

Publication Date
Sun Oct 03 2021
Journal Name
Journal Of Discrete Mathematical Sciences And Cryptography
Analysing the structure of A4-graphs for Mathieu groups

Publication Date
Tue Feb 28 2023
Journal Name
Periodicals Of Engineering And Natural Sciences (pen)
Modeling the trend of Iraqi GDP for 1970-2020

The study of economic growth indicators is of fundamental importance in assessing the effectiveness of economic development plans, and it plays a large role in determining appropriate economic policies for the optimal use of the factors that drive growth dynamics in Iraq over a given period of time. Gross domestic product (GDP) at current prices, part of the national accounts, is an integrated statistical measure that allows policy makers to determine whether the economy is in a state of expansion, and to evaluate economic activity and its efficiency in order to gauge the size of the overall economy. The research aims
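One common way to model the trend of a GDP series over a fixed period is a log-linear fit, whose slope approximates the average annual growth rate. This is a generic sketch of that idea, not the specific model used in the paper, and the function name is our own:

```python
import numpy as np

def log_linear_trend(years, gdp):
    """Fit log(GDP_t) = a + b*t by least squares over the sample period.
    exp(b) - 1 approximates the average annual growth rate."""
    t = np.asarray(years, dtype=float)
    slope, intercept = np.polyfit(t - t[0], np.log(gdp), 1)
    return intercept, slope
```

For a 1970-2020 series, `years` would run over those 51 observations; deviations of the actual series from the fitted trend indicate episodes of expansion or contraction.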

Publication Date
Tue Nov 09 2021
Journal Name
Journal Of Accounting And Financial Studies ( Jafs )
Relationship between accounting Conservatism, Persistence and volatility of earnings of companies listed on the Iraq Stock Exchange

The study aimed to show the concept and importance of accounting conservatism and the persistence and volatility of earnings of companies listed on the Iraq Stock Exchange. A sample of 35 companies was selected for the period 2013 to 2017. The book-to-market ratio was used to measure conservatism, while the regression of future earnings on current earnings was used to measure persistence; Eviews 9 was used to test the two hypotheses. The results found a statistically significant, inverse relationship between accounting conservatism and earnings persistence, and no statistically significant relationship between conservatism and earnings volatility.
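The two measures described above (a book-to-market proxy for conservatism, and a slope from regressing next-period earnings on current earnings for persistence) can be sketched as follows. This is a minimal illustration of those standard constructions, not the paper's Eviews specification, and the function names are our own:

```python
import numpy as np

def persistence_slope(earnings):
    """Regress next-period earnings on current earnings,
    E_{t+1} = a + b * E_t; the slope b measures earnings persistence."""
    e = np.asarray(earnings, dtype=float)
    x, y = e[:-1], e[1:]
    b, a = np.polyfit(x, y, 1)   # polyfit returns highest degree first
    return a, b

def book_to_market(book_value, market_value):
    """Book-to-market ratio used as a conservatism proxy
    (a lower ratio indicates more conservative reporting)."""
    return book_value / market_value
```

In a panel setting the persistence regression would be run across firms and years rather than on a single series as here.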

Publication Date
Mon Mar 08 2021
Journal Name
Baghdad Science Journal
Synthesis and some electrical properties of the conducting polymer polypyrrole

Thin films of polypyrrole were prepared by an electrochemical method on a platinum electrode, at different concentrations of both pyrrole and the salt in the acetonitrile electrolyte, using a positive applied potential of 7 volts on the electrode, and the electrical properties of the film were recorded.

Publication Date
Sat Dec 30 2023
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Interpretation of Mud Losses in Carbonates Based on Cuttings Description, Well-Logging, Seismic and Coherency Data

The Hartha Formation is a horizon in the overburden of the X-oilfield that generates a lot of non-productive time (NPT) associated with drilling mud losses. This study was conducted to investigate the loss events in this formation and to provide geological interpretations based on datasets from nine wells in the field of interest. The interpretation drew on several analyses, including wireline logs, cuttings descriptions, image logs, and analog data. Seismic and coherency data were also used to formulate the geological interpretations and calibrate them against the loss events of the Hartha Fm.

The results revealed that the upper part of the Hartha Fm. was identified as an interval capable of creating potentia

Publication Date
Fri Oct 01 2010
Journal Name
2010 Ieee Symposium On Industrial Electronics And Applications (isiea)
Distributed t-way test suite data generation using exhaustive search method with map and reduce framework

Publication Date
Fri Apr 14 2023
Journal Name
Journal Of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Abstract

Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have small or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Usually, every DL framework is fed a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data would generate a better DL model, and its performance is also application dependent. This issue is the main barrier for
Publication Date
Sun May 01 2011
Journal Name
Information Sciences
Design and implementation of a t-way test data generation strategy with automated execution tool support
