The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on grouping and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, and it is employed here on the nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, yielding a smoother curve with fewer abrupt changes in slope, and it is flexible enough to capture more complex patterns and fluctuations in the data.

The balanced longitudinal data profiles were grouped into subgroups by penalizing the pairwise distances between the coefficients of the cubic B-spline model using a common penalty function, the minimax concave penalty (MCP). The number of clusters is determined through a model selection criterion, the Bayesian information criterion (BIC), and optimization methods are used to solve the resulting equations. We therefore applied the alternating direction method of multipliers (ADMM) algorithm to obtain approximate solutions for the estimators of the nonparametric model, using the R statistical software.
Balanced longitudinal data were generated in the simulation study, with 60 subjects and 10 repeated measurements (time points) per subject. The simulation was iterated 100 times and showed that applying the MCP penalty to the cubic B-spline model can group the profiles into clusters, which is the aim of this paper.
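As a rough illustration of the pipeline described above (not the paper's MCP/ADMM implementation), the following R sketch fits a cubic B-spline to each simulated profile and groups subjects by the similarity of their coefficient vectors; the subject count, basis dimension, and the use of hierarchical clustering in place of the penalized fusion step are assumptions.

```r
# Minimal sketch (not the paper's ADMM/MCP implementation): fit a cubic
# B-spline to each subject's profile and cluster the coefficient vectors.
library(splines)

set.seed(1)
n_subj <- 60
t_grid <- seq(0, 1, length.out = 10)
B      <- bs(t_grid, df = 6, degree = 3)            # cubic B-spline basis

# Simulated balanced profiles: two latent groups with different mean curves
y <- t(sapply(1:n_subj, function(i) {
  mu <- if (i <= n_subj / 2) sin(2 * pi * t_grid) else cos(2 * pi * t_grid)
  mu + rnorm(length(t_grid), sd = 0.2)
}))

# Per-subject coefficient vectors (least-squares fit to the basis)
coefs <- t(apply(y, 1, function(yi) coef(lm(yi ~ B))))

# Group subjects by the similarity of their spline coefficients; the paper
# instead fuses pairwise coefficient differences with an MCP penalty (ADMM)
# and selects the number of clusters by BIC.
cl <- cutree(hclust(dist(coefs)), k = 2)
table(cl, rep(1:2, each = n_subj / 2))
```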

 

Paper type: Research paper.

Publication Date: Wed Jun 01 2022
Journal Name: Bulletin Of Electrical Engineering And Informatics
Proposed model for data protection in information systems of government institutions

Information systems and data exchange between government institutions are growing rapidly around the world, and with them, the threats to information within government departments are growing as well. In recent years, research into the development and construction of secure information systems in government institutions appears to have been very effective. Based on information-system principles, this study proposes a model for providing and evaluating security for all departments of government institutions. The requirements of any information system begin with the organization's surroundings and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite the relevance of…

Publication Date: Mon Dec 25 2023
Journal Name: IEEE Access
ITor-SDN: Intelligent Tor Networks-Based SDN for Data Forwarding Management

The Tor (The Onion Routing) network was designed to enable users to browse the Internet anonymously. It is known for its anonymity and privacy protection against the many agents who seek to observe users' locations or trace their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic must be encrypted and decrypted before sending and receiving, which leads to delay and even interruption in data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes delays…

Publication Date: Wed Nov 01 2017
Journal Name: Journal Of Economics And Administrative Sciences
Robust penalized estimators, using simulation

The penalized least squares method is a popular way to deal with high-dimensional data, where the number of explanatory variables is larger than the sample size. The penalized least squares method offers high prediction accuracy and performs estimation and variable selection at once. It yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, we can use a robust loss function to obtain the robust penalized least squares method and a robust penalized estimator and…
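For illustration only, the sketch below wraps a Huber-type reweighting loop around the lasso solver in the glmnet package to approximate a robust penalized least-squares fit; the dimensions, tuning constant, fixed penalty level, and the IRLS scheme itself are assumptions, not the estimator studied in the paper.

```r
# Illustrative robust penalized least squares: iteratively reweighted lasso
# with Huber-type weights; data and lambda are invented for the demo.
library(glmnet)

set.seed(2)
n <- 50; p <- 100                         # p > n: high-dimensional setting
X <- matrix(rnorm(n * p), n, p)
beta_true <- c(3, -2, 1.5, rep(0, p - 3))
y <- drop(X %*% beta_true + rnorm(n))
y[1:3] <- y[1:3] + 15                     # a few outlying observations

# Huber-type weights: large scaled residuals are down-weighted
huber_w <- function(r, k = 1.345) pmin(1, k / pmax(abs(r / mad(r)), 1e-8))

w   <- rep(1, n)
lam <- 0.2                                # fixed penalty level for the demo
for (it in 1:10) {                        # IRLS: reweight, refit, repeat
  fit <- glmnet(X, y, weights = w, alpha = 1, lambda = lam)
  r   <- y - drop(predict(fit, newx = X))
  w   <- huber_w(r)
}

b <- as.matrix(coef(fit))
which(b[-1, 1] != 0)                      # variables kept by the penalized fit
```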

Publication Date: Sat Dec 01 2018
Journal Name: Al-Khwarizmi Engineering Journal
Classical and Statistical Optimization of Medium Composition for Promoting Prodigiosin Produced by Local Isolate of Serratia Marcescens

Prodigiosin is a natural red pigment produced by Serratia marcescens which exhibits immunosuppressive and anticancer properties in addition to antimicrobial activities. This work presents an attempt to maximize the production of prodigiosin by two different strategies: one factor at a time (OFAT) and statistical optimization. The OFAT results revealed that sucrose and peptone were the best carbon and nitrogen sources for pigment production, with a prodigiosin concentration of about 135 mg/L. This value was increased to 331.6 mg/L with an optimized C/N ratio (60:40) and reached 356.8 mg/L at pH 6 and a 2% inoculum size at the end of classical optimization. Statistical experimental design based on response surface methodology was co…
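As a hedged sketch of the response-surface step only, the R code below fits a second-order model in two coded factors (standing in for the carbon and nitrogen source levels) and solves for the stationary point; the design points and responses are invented and do not reproduce the paper's experiments.

```r
# Second-order response-surface fit and stationary point; demo data only.
d <- expand.grid(x1 = c(-1, 0, 1), x2 = c(-1, 0, 1))
set.seed(3)
d$y <- 300 - 20 * d$x1^2 - 15 * d$x2^2 + 5 * d$x1 * d$x2 + rnorm(9, sd = 2)

fit <- lm(y ~ x1 + x2 + I(x1^2) + I(x2^2) + x1:x2, data = d)
b   <- coef(fit)

# Stationary point of y = b0 + b'x + x'Ax, i.e. x* = -0.5 * solve(A, b_linear)
A  <- matrix(c(b["I(x1^2)"], b["x1:x2"] / 2,
               b["x1:x2"] / 2, b["I(x2^2)"]), 2, 2)
xs <- -0.5 * solve(A, c(b["x1"], b["x2"]))
xs                  # coded factor levels that maximise the fitted response
```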

Publication Date: Sun Feb 28 2021
Journal Name: Journal Of Economics And Administrative Sciences
Effects of Macroeconomic Variables on Gross Domestic Product in Saudi Arabia using ARDL model for the period 1993-2019

 

This paper analyses the relationship between selected macroeconomic variables and gross domestic product (GDP) in Saudi Arabia for the period 1993-2019. Specifically, it measures the effects of the interest rate, oil price, inflation rate, budget deficit and money supply on the GDP of Saudi Arabia. The method employed in this paper is based on a descriptive analysis approach and an ARDL model through the bounds-testing approach to cointegration. The results of the research reveal that the budget deficit, oil price and money supply have positive significant effects on GDP, while the other variables have no effect on GDP and turned out to be insignificant. The findings suggest that both fiscal and monetary policies should be fo…
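A minimal base-R sketch of an ARDL-type regression is given below for orientation; the series are simulated, the lag order is fixed at one, and the bounds-testing step used in the paper is not reproduced.

```r
# ARDL(1,1)-style regression on simulated annual series (1993-2019 length);
# variable names mirror the paper's setting, but the data are invented.
set.seed(4)
n   <- 27
oil <- cumsum(rnorm(n, 0.5))                # stand-in for oil price
m2  <- cumsum(rnorm(n, 0.3))                # stand-in for money supply
gdp <- 2 + 0.6 * oil + 0.3 * m2 + cumsum(rnorm(n, 0, 0.2))

lag1 <- function(x) c(NA, x[-length(x)])
d <- data.frame(gdp, gdp_l1 = lag1(gdp),
                oil, oil_l1 = lag1(oil),
                m2,  m2_l1  = lag1(m2))

# Current GDP on its own lag and on current + lagged regressors
fit <- lm(gdp ~ gdp_l1 + oil + oil_l1 + m2 + m2_l1, data = d)

# Implied long-run coefficient of oil price: (b_oil + b_oil_l1) / (1 - b_gdp_l1)
b <- coef(fit)
unname((b["oil"] + b["oil_l1"]) / (1 - b["gdp_l1"]))
```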

Publication Date: Sun Feb 25 2024
Journal Name: Baghdad Science Journal
Simplified Novel Approach for Accurate Employee Churn Categorization using MCDM, De-Pareto Principle Approach, and Machine Learning

Churning of employees from organizations is a serious problem. Turnover, or churn, of employees within an organization needs to be addressed since it has a negative impact on the organization. Manual detection of employee churn is quite difficult, so machine learning (ML) algorithms have frequently been used for employee churn detection as well as for categorizing employees according to turnover. To date, only one study has looked into the categorization of employees using machine learning. A novel multi-criteria decision-making (MCDM) approach coupled with the De-Pareto principle has been proposed to categorize employees; this is referred to as the SNEC scheme. An AHP-TOPSIS De-Pareto principle model (AHPTOPDE) has been designed that uses a two-stage MCDM s…
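Since TOPSIS is one of the MCDM building blocks named above, the following R sketch implements a plain TOPSIS ranking on an invented employee-by-criteria matrix; the criteria, weights, and benefit/cost directions are assumptions, not the SNEC scheme itself.

```r
# Plain TOPSIS ranking: normalize, weight, then measure closeness to the
# ideal and anti-ideal alternatives; all inputs below are invented.
topsis <- function(X, w, benefit = rep(TRUE, ncol(X))) {
  R <- sweep(X, 2, sqrt(colSums(X^2)), "/")      # vector normalisation
  V <- sweep(R, 2, w, "*")                       # weighted normalised matrix
  ideal <- ifelse(benefit, apply(V, 2, max), apply(V, 2, min))
  anti  <- ifelse(benefit, apply(V, 2, min), apply(V, 2, max))
  d_pos <- sqrt(rowSums(sweep(V, 2, ideal)^2))   # distance to ideal
  d_neg <- sqrt(rowSums(sweep(V, 2, anti)^2))    # distance to anti-ideal
  d_neg / (d_pos + d_neg)                        # closeness coefficient
}

set.seed(5)
X <- matrix(runif(5 * 3), 5, 3,
            dimnames = list(paste0("emp", 1:5),
                            c("performance", "absenteeism", "tenure")))
w <- c(0.5, 0.3, 0.2)
scores <- topsis(X, w, benefit = c(TRUE, FALSE, TRUE))
sort(scores, decreasing = TRUE)        # rank employees by closeness (illustrative)
```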

Publication Date: Fri Dec 01 2023
Journal Name: Methods And Objects Of Chemical Analysis
Partial Least Squares Method for the Multicomponent Analysis of Antibacterial Mixture

This study's objective is to assess how well UV spectrophotometry can be used in conjunction with multivariate calibration based on partial least squares (PLS) regression for the concurrent quantitative analysis of an antibacterial mixture (Levofloxacin (LIV), Metronidazole (MET), Rifampicin (RIF) and Sulfamethoxazole (SUL)) in artificial mixtures and pharmaceutical formulations. The experimental calibration and validation matrices were created using 42 and 39 samples, respectively. The concentration range considered was 0-17 μg/mL for all components. The calibration standards' absorbance measurements were made between 210 and 350 nm, at intervals of 0.2 nm. The associated parameters were examined in order to develop the optimal c…
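A hedged sketch of a PLS calibration in R (pls package) is shown below on simulated spectra; the pure-component spectra, noise level, and number of components are placeholders, with only the calibration sample count and concentration range echoing the abstract.

```r
# PLS calibration on simulated multicomponent spectra (demo data only).
library(pls)

set.seed(6)
n_cal <- 42; n_wl <- 200
conc  <- matrix(runif(n_cal * 4, 0, 17), n_cal, 4,
                dimnames = list(NULL, c("LIV", "MET", "RIF", "SUL")))
S     <- matrix(abs(rnorm(4 * n_wl)), 4, n_wl)      # pure-component "spectra"
A     <- conc %*% S + matrix(rnorm(n_cal * n_wl, 0, 0.01), n_cal, n_wl)

cal <- data.frame(conc = I(conc), A = I(A))
fit <- plsr(conc ~ A, ncomp = 8, data = cal, validation = "CV")

# Cross-validated RMSEP per analyte guides the choice of components
RMSEP(fit)
```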

Publication Date: Wed Feb 16 2022
Journal Name: Journal Of Economics And Administrative Sciences
Solving Resource Allocation Model by Using Dynamic Optimization Technique for Al-Raji Group Companies for Soft Drinks and Juices

In this paper, the problem of resource allocation at the Al-Raji Company for soft drinks and juices was studied. The company performs several types of tasks to produce juices and soft drinks, and these tasks require machines: the company has 6 machines that it wants to allocate to 4 different tasks. The machines assigned to each task are subject to failure and are repaired so that they can participate again in the production process. From the company's past records, the probability of machine failure at each task was calculated based on company data. The time required for each machine to complete each task was also recorded. The aim of this paper is to determine the minimum expected time…
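For orientation only, the sketch below solves a one-shot expected-time assignment of the 6 machines to the 4 tasks with the lpSolve package; the times, failure probabilities, and the expected-time adjustment are invented, and the paper's dynamic-optimization formulation is not reproduced.

```r
# One-shot machine-task assignment on invented expected times.
library(lpSolve)

set.seed(7)
time_mat <- matrix(runif(6 * 4, 2, 10), 6, 4)     # machine-task completion times
p_fail   <- matrix(runif(6 * 4, 0.05, 0.3), 6, 4) # machine-task failure probabilities
exp_time <- time_mat / (1 - p_fail)               # crude expected-time adjustment

# Pad with two zero-cost dummy tasks so the cost matrix is square,
# then solve the assignment problem
cost <- cbind(exp_time, matrix(0, 6, 2))
sol  <- lp.assign(cost)

which(sol$solution[, 1:4] == 1, arr.ind = TRUE)   # machine assigned to each real task
sum(exp_time[sol$solution[, 1:4] == 1])           # total expected time of the plan
```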

Publication Date: Sun Jun 01 2014
Journal Name: Baghdad Science Journal
Clouds Height Classification Using Texture Analysis of Meteosat Images

In the present work, pattern recognition is carried out using the contrast and relative variance of clouds. The K-means clustering process is then applied to classify the cloud type; texture analysis is also adopted to extract textural features and use them in the cloud classification process. The test image used in the classification process is the Meteosat-7 image of the D3 region. The K-means method is adopted as an unsupervised classification; this method depends on the initially chosen cluster seeds. Since the initial seeds are chosen randomly, the user supplies a set of means, or cluster centers, in the n-dimensional space. K-means clustering has been applied on two bands (the IR2 band and the water vapour band). The textural analysis is used…
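The following R sketch illustrates the kind of block-wise texture features plus K-means step described above; the image, block size, feature set, and number of clusters are assumptions, not the Meteosat-7 processing chain of the paper.

```r
# Block-wise texture features from a grayscale image matrix, clustered with
# k-means as an unsupervised classification (demo data only).
set.seed(8)
img <- matrix(runif(128 * 128), 128, 128)         # stand-in for one band
bsz <- 8                                          # block size
idx <- expand.grid(r = seq(1, 128, bsz), c = seq(1, 128, bsz))

features <- t(apply(idx, 1, function(p) {
  blk <- img[p["r"]:(p["r"] + bsz - 1), p["c"]:(p["c"] + bsz - 1)]
  c(mean     = mean(blk),
    variance = var(as.vector(blk)),
    contrast = diff(range(blk)))                  # simple contrast measure
}))

# Unsupervised classification of blocks into (say) three cloud-height classes
cl <- kmeans(scale(features), centers = 3, nstart = 20)
table(cl$cluster)
```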

Publication Date: Sat Jan 01 2022
Journal Name: The International Journal Of Nonlinear Analysis And Applications
Developing Bulk Arrival Queuing Models with Constant Batch Policy Under Uncertainty Data Using (0-1) Variables

This paper delves into some significant performance measures (PMs) of a bulk arrival queueing system with constant batch size b, where the arrival rates and service rates are fuzzy parameters. In a bulk arrival queueing system, arrivals enter the system as groups of constant size before individual customers enter service. This leads to a new tool obtained with the aid of generating-function methods. The corresponding traditional bulk queueing system model becomes more convenient under an uncertain environment. The α-cut approach is applied with the conventional Zadeh's extension principle (ZEP) to transform the triangular membership functions (Mem. Fs) of the fuzzy queues into a family of conventional b…
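As a minimal sketch of the α-cut idea, the R code below propagates triangular fuzzy batch-arrival and service rates through the traffic intensity ρ = λb/μ, exploiting its monotonicity to get interval bounds at each α level; the fuzzy numbers, batch size, and the choice of this simple measure are assumptions, not the paper's generating-function derivations.

```r
# Alpha-cuts of a triangular fuzzy number and interval bounds of rho = lambda*b/mu.
tri_cut <- function(a, m, b, alpha) c(lo = a + alpha * (m - a),
                                      hi = b - alpha * (b - m))

lambda_tfn <- c(2, 3, 4)       # triangular fuzzy batch-arrival rate (per hour)
mu_tfn     <- c(20, 25, 30)    # triangular fuzzy service rate (per hour)
batch      <- 4                # constant batch size b

alphas <- seq(0, 1, by = 0.25)
rho <- t(sapply(alphas, function(a) {
  L <- tri_cut(lambda_tfn[1], lambda_tfn[2], lambda_tfn[3], a)
  M <- tri_cut(mu_tfn[1],     mu_tfn[2],     mu_tfn[3],     a)
  # rho increases in lambda and decreases in mu, so the interval endpoints
  # come from opposite ends of the two alpha-cuts
  c(alpha = a,
    lo = L[["lo"]] * batch / M[["hi"]],
    hi = L[["hi"]] * batch / M[["lo"]])
}))
rho
```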
