The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on grouping and analyzing such data, since cluster analysis plays an important role in identifying and grouping co-expressed profiles over time. The profiles are modeled with the nonparametric smoothing cubic B-spline model, which provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.

The balanced longitudinal data profiles were grouped into subgroups by penalizing the pairwise distances between the coefficients of the cubic B-spline model using one of the common penalty functions, the Minimax Concave Penalty (MCP). This method determines the number of clusters through one of the model selection criteria, the Bayesian information criterion (BIC), and optimization methods were used to solve the resulting equations. We therefore applied the alternating direction method of multipliers (ADMM) algorithm to reach approximate solutions for the estimators of the nonparametric model, using the R statistical software.
Balanced longitudinal data were generated in the simulation study, with 60 subjects and 10 repeated measurements (time points) per subject. The simulation was iterated 100 times and showed that applying the MCP penalty to the cubic B-spline model can group the profiles into clusters, which is the aim of this paper.
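The clustering step can be sketched in miniature. The sketch below is a simplified stand-in, not the paper's method: it replaces the cubic B-spline basis with an ordinary cubic polynomial fit, and replaces the MCP-penalized pairwise fusion solved by ADMM with a simple distance-threshold grouping of the smoothed curves. The mean curves, noise level, and threshold are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 10)                  # 10 repeated measurements per subject

# Two true mean curves, a hypothetical stand-in for the simulation design
means = [np.sin(2 * np.pi * t), 1.5 * t ** 2]
profiles = np.array([means[i % 2] + 0.1 * rng.standard_normal(t.size)
                     for i in range(60)])  # 60 subjects, balanced design

# Cubic least-squares fit per subject: a crude surrogate for the
# cubic B-spline coefficients used in the paper
coefs = np.array([np.polyfit(t, y, deg=3) for y in profiles])
fits = np.array([np.polyval(c, t) for c in coefs])

# Greedy grouping of profiles whose smoothed curves are close -- a simple
# stand-in for fusing coefficients with the pairwise MCP penalty via ADMM
threshold = 1.0                            # hypothetical tuning constant
labels = np.full(60, -1)
k = 0
for i in range(60):
    if labels[i] >= 0:
        continue
    labels[i] = k
    for j in range(i + 1, 60):
        if labels[j] < 0 and np.linalg.norm(fits[i] - fits[j]) < threshold:
            labels[j] = k
    k += 1

print(k)  # number of clusters found
```

With the two well-separated mean curves above, the greedy grouping recovers the two simulated subgroups.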

 

Paper type: Research paper.

Publication Date
Thu Mar 09 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Improve Performance of Solar Cell by using Grooves which Have Semicircular Shape on The Surface by using Program (ZEMAX)

In this work, a silicon solar cell with semicircular grooves has been used to improve efficiency by reducing the reflection of rays and increasing the optical path through the cell. The optical design software ZEMAX has been used in ray-tracing mode to evaluate the prototype's efficiency with a detector placed beneath the cell. The prototype with aspect ratio A.R = 0.2 gives the best efficiency at an incident angle θ = 0° and the best acceptance angle θ = 50°.

Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
Simplified Novel Approach for Accurate Employee Churn Categorization using MCDM, De-Pareto Principle Approach, and Machine Learning

Churning of employees from organizations is a serious problem. Turnover, or churn, of employees within an organization needs to be addressed, since it has a negative impact on the organization. Manual detection of employee churn is quite difficult, so machine learning (ML) algorithms have frequently been used for employee churn detection, as well as for employee categorization according to turnover. To date, only one study has looked into the categorization of employees using machine learning. A novel multi-criteria decision-making (MCDM) approach coupled with the De-Pareto principle has been proposed to categorize employees; this is referred to as the SNEC scheme. An AHP-TOPSIS De-Pareto principle model (AHPTOPDE) has been designed that uses 2-stage MCDM s

... Show More
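The TOPSIS stage of an AHP-TOPSIS pipeline can be illustrated compactly. The sketch below is not the SNEC/AHPTOPDE scheme itself: the decision matrix, criteria, and weights are hypothetical, and all criteria are treated as benefit criteria.

```python
import numpy as np

# Hypothetical decision matrix: 4 employees scored on 3 benefit criteria
# (e.g. performance, tenure, engagement)
X = np.array([[7., 5., 8.],
              [6., 9., 4.],
              [9., 6., 7.],
              [4., 3., 5.]])
w = np.array([0.5, 0.3, 0.2])   # criterion weights (e.g. taken from AHP)

# Vector-normalize each column, then apply the weights
V = w * X / np.linalg.norm(X, axis=0)

# Ideal and anti-ideal solutions (benefit criteria only)
ideal, anti = V.max(axis=0), V.min(axis=0)
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)   # TOPSIS closeness coefficient in [0, 1]

ranking = np.argsort(-closeness)
print(ranking)  # employee indices from best to worst
```

The closeness coefficients could then be cut into categories (e.g. by a De-Pareto-style split) rather than merely ranked.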
Publication Date
Sun Feb 28 2021
Journal Name
Journal Of Economics And Administrative Sciences
Effects of Macroeconomic Variables on Gross Domestic Product in Saudi Arabia using ARDL model for the period 1993-2019

 

This paper analyses the relationship between selected macroeconomic variables and gross domestic product (GDP) in Saudi Arabia for the period 1993-2019. Specifically, it measures the effects of the interest rate, oil price, inflation rate, budget deficit, and money supply on the GDP of Saudi Arabia. The method employed in this paper is based on a descriptive analysis approach and an ARDL model through the bounds testing approach to cointegration. The results of the research reveal that the budget deficit, oil price, and money supply have positive, significant effects on GDP, while the other variables have no effect on GDP and turned out to be insignificant. The findings suggest that both fiscal and monetary policies should be fo

... Show More
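A minimal ARDL illustration, under assumptions of my own (a single simulated regressor, not the paper's Saudi series): an ARDL(1,1) regression is estimated by OLS and the long-run multiplier is recovered from the estimated coefficients.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200
# Simulate one regressor (think: oil price) and a GDP-like series with a
# known long-run link; all parameters here are hypothetical
x = np.cumsum(rng.standard_normal(T))     # I(1)-like regressor
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    # ARDL(1,1): y_t = 0.5 y_{t-1} + 0.4 x_t + 0.1 x_{t-1} + noise
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t] + 0.1 * x[t - 1] \
        + 0.05 * rng.standard_normal()

# OLS estimation of the ARDL(1,1) regression
Z = np.column_stack([np.ones(T - 1), y[:-1], x[1:], x[:-1]])
coef, *_ = np.linalg.lstsq(Z, y[1:], rcond=None)
c, phi, b0, b1 = coef

# Long-run multiplier of x on y: (b0 + b1) / (1 - phi); true value is 1.0
long_run = (b0 + b1) / (1 - phi)
print(round(long_run, 2))
```

A bounds test would additionally compare an F-statistic on the lagged levels against the Pesaran critical bounds; that step is omitted here.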
Publication Date
Wed Nov 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
Strong criminal capabilities, using simulation.

The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. The penalized least squares method offers high prediction accuracy and performs estimation and variable selection at once. It yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain the robust penalized least squares method, and hence a robust penalized estimator and

... Show More
Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
Solve the fuzzy Assignment problem by using the Labeling method

The assignment model is a mathematical model that expresses a real problem facing factories and companies, whose aim is to make the appropriate decision about the best allocation of jobs or workers to machines, in order to raise efficiency or profits to the highest possible level, or to reduce costs or time as far as possible. In this research, the labeling method has been used to solve a fuzzy assignment problem with real data approved by the Diwaniya tire factory, where the data included two factors, efficiency and cost; the problem was solved manually through a number of iterations until reaching the optimal solution,

... Show More
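For a small crisp (defuzzified) cost matrix, the assignment problem can be solved by brute force, which is enough to illustrate what the labeling iterations compute. The cost matrix below is hypothetical, and the fuzzy-ranking step that would precede it in the paper is omitted.

```python
from itertools import permutations

# Hypothetical defuzzified cost matrix: cost[i][j] = cost of worker i on job j
# (in the paper, fuzzy costs would first be ranked/defuzzified)
cost = [[9, 2, 7, 8],
        [6, 4, 3, 7],
        [5, 8, 1, 8],
        [7, 6, 9, 4]]

n = len(cost)
best_perm, best_cost = None, float("inf")
for perm in permutations(range(n)):          # worker i is given job perm[i]
    total = sum(cost[i][perm[i]] for i in range(n))
    if total < best_cost:
        best_perm, best_cost = perm, total

print(best_perm, best_cost)
```

Brute force is O(n!) and only viable for tiny instances; the labeling (or Hungarian) method reaches the same optimum in polynomial time.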
Publication Date
Mon Mar 11 2019
Journal Name
Baghdad Science Journal
Solving Mixed Volterra - Fredholm Integral Equation (MVFIE) by Designing Neural Network

In this paper, we focus on designing a feed-forward neural network (FFNN) for solving Mixed Volterra-Fredholm Integral Equations (MVFIEs) of the second kind in 2 dimensions. In our method, we present a multi-layer model consisting of a hidden layer with five hidden units (neurons) and one linear output unit. The log-sigmoid transfer function is used as the activation of each hidden unit, and the Levenberg-Marquardt algorithm is used for training. A comparison between the results of the numerical experiments and the analytic solutions of some examples has been carried out in order to justify the efficiency and accuracy of our method.


... Show More
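The architecture described (five log-sigmoid hidden units, one linear output, 2-D input) can be sketched in a few lines. Plain gradient descent stands in for Levenberg-Marquardt here, and the target is a hypothetical smooth surface rather than an actual MVFIE residual, so this only illustrates the network, not the paper's solver.

```python
import numpy as np

rng = np.random.default_rng(3)

def logsig(z):                       # log-sigmoid transfer function
    return 1.0 / (1.0 + np.exp(-z))

# Architecture from the abstract: 5 hidden units, one linear output
W1 = 0.5 * rng.standard_normal((5, 2)); b1 = np.zeros(5)
W2 = 0.5 * rng.standard_normal(5);      b2 = 0.0

# Collocation grid on [0,1]^2; the target u(x, y) = x*y is a hypothetical
# smooth stand-in for an MVFIE solution, not an equation from the paper
g = np.linspace(0, 1, 6)
P = np.array([[x, y] for x in g for y in g])
u = P[:, 0] * P[:, 1]

lr = 0.5
for _ in range(5000):                # plain gradient descent stands in for
    H = logsig(P @ W1.T + b1)        # the Levenberg-Marquardt training
    out = H @ W2 + b2
    err = out - u
    dW2 = H.T @ err / len(u); db2 = err.mean()
    dH = np.outer(err, W2) * H * (1 - H)   # backprop through log-sigmoid
    dW1 = dH.T @ P / len(u); db1 = dH.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2; W1 -= lr * dW1; b1 -= lr * db1

mse = float(np.mean((logsig(P @ W1.T + b1) @ W2 + b2 - u) ** 2))
print(mse)
```

In the paper's setting, `u` would be replaced by the residual of the integral equation evaluated at the collocation points.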
Publication Date
Wed Feb 16 2022
Journal Name
Journal Of Economics And Administrative Sciences
Solving Resource Allocation Model by Using Dynamic Optimization Technique for Al-Raji Group Companies for Soft Drinks and Juices

In this paper, the problem of resource allocation at the Al-Raji Company for soft drinks and juices was studied. The company performs several types of tasks to produce juices and soft drinks, which need machines to accomplish them: it has 6 machines to be allocated to 4 different tasks. The machines assigned to each task are subject to failure, and failed machines are repaired so that they can participate again in the production process. From the company's past records, the probability of machine failure at each task was calculated, and the time required for each machine to complete each task was recorded. The aim of this paper is to determine the minimum expected ti

... Show More
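The flavor of the optimization can be shown with hypothetical data. One common modeling assumption (mine, not necessarily the paper's) is that a failed run is repaired and retried, so the number of attempts is geometric and the expected time becomes t / (1 - p). With 6 machines and 4 tasks, the instance is small enough to enumerate.

```python
from itertools import permutations

# Hypothetical data: t[m][k] = time of machine m on task k,
# p[m][k] = probability that machine m fails on task k
t = [[4, 7, 3, 6],
     [5, 2, 8, 4],
     [6, 5, 4, 7],
     [3, 6, 5, 2],
     [8, 4, 6, 5],
     [5, 5, 7, 3]]
p = [[0.1, 0.2, 0.1, 0.3],
     [0.2, 0.1, 0.3, 0.1],
     [0.1, 0.1, 0.2, 0.2],
     [0.3, 0.2, 0.1, 0.1],
     [0.1, 0.3, 0.2, 0.1],
     [0.2, 0.1, 0.1, 0.2]]

# Geometric-retry model: expected time = t / (1 - p)
def expected(m, k):
    return t[m][k] / (1.0 - p[m][k])

best, best_val = None, float("inf")
for machines in permutations(range(6), 4):   # machines[k] performs task k
    val = sum(expected(machines[k], k) for k in range(4))
    if val < best_val:
        best, best_val = machines, val

print(best, round(best_val, 2))
```

A dynamic-programming formulation over tasks (as in the paper) would reach the same optimum without enumerating all 360 ordered selections.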
Publication Date
Thu Dec 01 2022
Journal Name
Baghdad Science Journal
The Approximation of Weighted Hölder Functions by Fourier-Jacobi Polynomials to the Singular Sturm-Liouville Operator

In this work, a weighted Hölder function that approximates a Jacobi polynomial solving the second-order singular Sturm-Liouville equation is discussed. This is generally equivalent to the Jacobi translations and the moduli of smoothness. This paper aims to improve methods of approximation and to find the upper and lower estimates for the degree of approximation in weighted Hölder spaces by modifying the modulus of continuity and smoothness. Moreover, some properties of the moduli of smoothness, with direct and inverse results, are considered.
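The central objects of the abstract admit standard textbook formulations; the definitions below are the usual ones (my assumption, not notation taken from the paper), with a weighted norm written as \(\|\cdot\|_w\):

```latex
% Modulus of continuity of f with respect to a weighted norm \|\cdot\|_w:
\omega(f,\delta)_w = \sup_{0 < h \le \delta} \, \| f(\cdot + h) - f(\cdot) \|_w ,
% and the weighted H\"older norm of order \alpha \in (0,1]:
\| f \|_{H^{\alpha}_{w}} = \| f \|_w
  + \sup_{\delta > 0} \frac{\omega(f,\delta)_w}{\delta^{\alpha}} .
```

Direct (Jackson-type) and inverse (Bernstein-type) results then bound the degree of approximation by Fourier-Jacobi partial sums above and below in terms of \(\omega(f,\delta)_w\).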

Publication Date
Sun Jun 22 2025
Journal Name
Journal Of Engineering
Image Compression Using 3-D Two-Level Techniques

In this paper, three techniques for image compression are implemented: a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) technique. Daubechies and Haar filters are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to examine how the compression ratio (CR) increases with the level of the 3-D transformation, so the compression ratio is measured for each level. To obtain good compression, image data properties were measured, such as image entropy (He) and percent root-

... Show More
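One level of the 2-D Haar DWT, the basic building block that the 3-D two-level schemes extend to a third dimension, can be written directly. The 8x8 test surface is hypothetical, and compressibility is gauged crudely by the share of negligible coefficients; this is an illustration, not the paper's pipeline.

```python
import numpy as np

def haar2d_level(a):
    """One level of the 2-D Haar DWT (rows, then columns)."""
    s = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2)   # row averages
    d = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2)   # row details
    rows = np.hstack([s, d])
    s = (rows[0::2, :] + rows[1::2, :]) / np.sqrt(2)
    d = (rows[0::2, :] - rows[1::2, :]) / np.sqrt(2)
    return np.vstack([s, d])                     # low-pass block is top-left

# Hypothetical smooth 8x8 "image": most detail coefficients become small
x, y = np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8))
img = np.sin(np.pi * x) * np.sin(np.pi * y)

lvl1 = haar2d_level(img)
lvl2 = lvl1.copy()
lvl2[:4, :4] = haar2d_level(lvl1[:4, :4])   # second level on the LL quarter

# Crude compressibility indicator: share of negligible coefficients
small = np.mean(np.abs(lvl2) < 0.1)
print(round(small, 2))
```

Because the Haar transform is orthonormal, the total energy of the coefficients equals that of the image; compression comes from discarding or coarsely quantizing the small detail coefficients.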
Publication Date
Sun Apr 08 2018
Journal Name
Al-khwarizmi Engineering Journal
Kinetic Study of the Leaching of Iraqi Akashat Phosphate Ore Using Lactic Acid

In the present work, a kinetic study was performed on the extraction of phosphate from Iraqi Akashat phosphate ore using an organic acid. Leaching with lactic acid was studied for the separation of calcareous materials (mainly calcite). Reaction conditions were an acid concentration of 2% by weight and an acid-volume-to-ore-weight ratio of 5 ml/g. Reaction time was varied from 2 to 30 minutes (in steps of 2 minutes) to determine the reaction rate constant k based on the change in calcite concentration. A further investigation was carried out to determine the activation energy as the reaction temperature was varied from 25 to 65 °C. From the kinetic data, it was found that selective leaching was controlled by

... Show More
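The activation-energy step follows the standard Arrhenius linearization, ln k = ln A - (Ea/R)(1/T). The rate constants below are illustrative numbers chosen for the sketch, not the paper's measurements.

```python
import numpy as np

# Hypothetical rate constants k (1/min) at several temperatures (degrees C);
# these values are illustrative, not data from the paper
T_C = np.array([25., 35., 45., 55., 65.])
k = np.array([0.012, 0.021, 0.035, 0.056, 0.088])

R = 8.314                       # gas constant, J/(mol*K)
T = T_C + 273.15                # convert to kelvin

# Arrhenius: k = A * exp(-Ea / (R*T))  =>  ln k = ln A - (Ea/R) * (1/T),
# so a linear fit of ln k against 1/T gives the activation energy
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R                 # activation energy in J/mol

print(round(Ea / 1000, 1), "kJ/mol")
```

The magnitude of Ea is what distinguishes chemically controlled leaching (larger Ea) from diffusion-controlled leaching (smaller Ea).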