Two-dimensional unsteady mixed convection in a porous cavity with a heated bottom wall is numerically studied in the present paper. The forced flow conditions are imposed by providing a hydrostatic pressure head at an inlet port located at the bottom of one vertical side wall and an open vent at the top of the other vertical side wall. The Darcy model is adopted for the fluid flow in the porous medium, and the combined effects of the hydrostatic pressure head and the heat flux are carefully investigated. These governing parameters are varied over wide ranges and their effect on the heat transfer characteristics is studied in detail. It is found that the time required to reach a desired temperature at the bottom wall decreases as the heat flux and pressure head increase. Higher heat fluxes leave wider regions near the top wall at lower temperatures, which is important in many engineering applications such as drying.
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, and here it is applied to bring the predictive capabilities of logistic regression into environmental studies. This research seeks to demonstrate a novel approach to implementing conditional logistic regression in environmental research through inference methods based on longitudinal data. Statistical analysis of longitudinal data requires methods that properly take into account the within-subject interdependence of the response measurements. If this correlation is ignored, inferences such as statistical tests and confidence intervals can be largely invalid.
In the current research, multiple mixing ratios of γ-transitions between energy levels of the ¹⁴²⁻¹⁵⁰Nd isotopes (Z = 60), populated in the ¹⁴²⁻¹⁵⁰Nd(n, n′γ)¹⁴²⁻¹⁵⁰Nd reaction, are calculated using the constant statistical tensor (CST) method. The results obtained are, in general, in good agreement or consistent, within experimental error, with the results published in previous studies. The existing discrepancies arise from inaccuracies in the experimental results of those earlier works. The current results confirm the validity of the constant statistical tensor method for calculating the values of mixing ratios and its ability to predict errors in experimental results.
Excessive skewness, which sometimes occurs in data, is an obstacle to assuming a normal distribution. Recent studies have therefore turned to the skew-normal distribution (SND), which matches skewed data and is regarded as a generalization of the normal distribution with an additional skewness parameter (α) that gives it more flexibility. When estimating the parameters of the SND, we face the problem of non-linear likelihood equations, so solutions obtained by the method of Maximum Likelihood estimation (ML) can be inaccurate and unreliable. To solve this problem, two methods can be used: the genetic algorithm (GA) and the iterative reweighting algorithm (IR) based on the M
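As an illustrative aside (not code from the study), the SND log-likelihood that a GA or IR scheme would maximize can be sketched in Python. The density is f(x) = (2/σ)·φ(z)·Φ(αz) with z = (x − μ)/σ; the function names and the crude grid search over α are hypothetical stand-ins for the actual genetic algorithm:

```python
import math

def skewnorm_logpdf(x, loc, scale, alpha):
    # SND density: f(x) = (2/scale) * phi(z) * Phi(alpha * z), z = (x - loc)/scale
    z = (x - loc) / scale
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    Phi = 0.5 * (1.0 + math.erf(alpha * z / math.sqrt(2.0)))
    return math.log(2.0 * phi * max(Phi, 1e-300) / scale)

def loglik(data, loc, scale, alpha):
    # Log-likelihood of the sample; non-linear in alpha, hence the need
    # for numerical search (GA in the study; a grid here for illustration)
    return sum(skewnorm_logpdf(x, loc, scale, alpha) for x in data)

def grid_search_alpha(data, loc, scale, alphas):
    # Hypothetical stand-in for the GA: pick alpha maximizing the log-likelihood
    return max(alphas, key=lambda a: loglik(data, loc, scale, a))
```

For a right-skewed sample the search selects a positive α, mirroring how the skewness parameter bends the normal density.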
The purpose of this study is to diagnose the factors that affect the behavioral intention of Thi-Qar users to use the internet. A sample of 127 internet users drawn from university staff was analyzed using path analysis. The study concluded that there is a set of significant correlations: the exogenous variables (gender, income, perceived fun, perceived usefulness, image, and ease of use) have a significant effect on the endogenous variable (behavioral intention). The analysis indicated that image comes first, ease of use second, then perceived fun and perceived usefulness, in their effect on the dependent variables (daily internet usage and diversity of internet usage). Implications of these results are discussed. the st
In this paper, two local search algorithms, a genetic algorithm and particle swarm optimization, are used to schedule a number of products (n jobs) on a single machine so as to minimize a multi-objective function: the sum of total completion time, total tardiness, total earliness, and total late work. A branch and bound (BAB) method is used to compare the results for n jobs ranging from 5 to 18. The results show that the two algorithms find optimal and near-optimal solutions in appropriate times.
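For concreteness, the multi-objective cost that both algorithms minimize can be sketched as follows (an illustrative reading of the stated objective with hypothetical names; the actual GA/PSO implementations are not reproduced here):

```python
def multi_objective(seq, p, d):
    # seq: job order; p[j]: processing time of job j; d[j]: due date of job j
    t = 0                       # current machine time
    Ct = Tt = Et = Vt = 0       # total completion time, tardiness, earliness, late work
    for j in seq:
        t += p[j]               # completion time of job j on the single machine
        Ct += t
        tard = max(0, t - d[j])
        Tt += tard              # tardiness: lateness past the due date
        Et += max(0, d[j] - t)  # earliness: slack before the due date
        Vt += min(p[j], tard)   # late work: portion of the job processed late
    return Ct + Tt + Et + Vt
```

A GA or PSO would evaluate this cost for each candidate permutation of the n jobs and evolve the sequence with the smallest total.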
In this study, different methods were used for estimating the location and scale parameters of the extreme value distribution, such as maximum likelihood estimation (MLE), the method of moments (ME), and percentile-based approximation estimators known as the White method, since the extreme value distribution is one of the exponential-family distributions. Ordinary least squares estimation (OLS), weighted least squares estimation (WLS), ridge regression estimation (Rig), and adjusted ridge regression estimation (ARig) were also used. Two parameters for expected value to the percentile as estimation for distribution f
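As a minimal sketch of one of the listed estimators (not the study's code), the method-of-moments estimates for the Gumbel-type extreme value distribution follow from its mean μ + γσ (γ the Euler-Mascheroni constant) and variance π²σ²/6:

```python
import math

EULER_GAMMA = 0.57721566490153286

def gumbel_moments(data):
    # Method of moments for the Gumbel (extreme value type I) distribution:
    # Var = pi^2 * scale^2 / 6  =>  scale = sqrt(6 * Var) / pi
    # Mean = loc + EULER_GAMMA * scale  =>  loc = mean - EULER_GAMMA * scale
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi
    loc = mean - EULER_GAMMA * scale
    return loc, scale
```

MLE for the same distribution requires solving a non-linear equation numerically, which is what motivates the approximation and regression-based estimators compared in the study.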
The common normal distribution was transformed, through the generated Kummer Beta model, into the Kummer Beta Generalized Normal Distribution (KBGND). The distribution parameters and hazard function were then estimated using the MLE method, and these estimates were improved by employing a genetic algorithm. Simulation was carried out assuming a number of models and different sample sizes. The main finding was that the maximum likelihood (MLE) method is best for estimating the parameters of the KBGND according to the Mean Squared Error (MSE) criterion, and for the hazard function according to the Integrated Mean Squared Error (IMSE) criterion. While the pr
Longitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been developed for analyzing this type of data.
In this research, the focus was on grouping and analyzing these data: cluster analysis plays an important role in identifying and grouping similarly expressed sub-profiles over time, which are then fitted with a nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroup
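A minimal sketch of the cubic B-spline machinery mentioned above (the Cox-de Boor recursion for the basis functions; the knot choices and names are illustrative, not the study's code):

```python
def bspline_basis(i, k, t, knots):
    # Cox-de Boor recursion: value of the i-th B-spline basis of degree k at t.
    # A cubic B-spline (k = 3) is C^2-continuous, i.e. it has continuous
    # first and second derivatives, which yields the smooth fitted curves.
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + k] - knots[i]
    if d1 > 0:
        left = (t - knots[i]) / d1 * bspline_basis(i, k - 1, t, knots)
    d2 = knots[i + k + 1] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + k + 1] - t) / d2 * bspline_basis(i + 1, k - 1, t, knots)
    return left + right
```

A smoothed profile is then a linear combination of these basis functions, with coefficients chosen to trade fit against roughness; inside the valid interval the basis functions sum to one (partition of unity).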
The main problem when dealing with fuzzy data is that a model representing the data cannot be formed by the Fuzzy Least Squares Estimator (FLSE) method, which gives false estimates and is invalid when the problem of multicollinearity exists. To overcome this problem, the Fuzzy Bridge Regression Estimator (FBRE) method was relied upon to estimate a fuzzy linear regression model with triangular fuzzy numbers. Moreover, the problem of multicollinearity in the fuzzy data can be detected using the Variance Inflation Factor when the input variables of the model are crisp and the output variable and parameters are fuzzy. The results were compared usin
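As an illustrative sketch of the multicollinearity diagnostic mentioned above (not the study's code), for the simple case of two crisp input variables the Variance Inflation Factor reduces to VIF = 1/(1 − r²), where r is the correlation between the predictors:

```python
def vif_two_predictors(x1, x2):
    # Two-predictor special case: VIF_1 = VIF_2 = 1 / (1 - r^2),
    # with r the Pearson correlation between the crisp inputs x1 and x2.
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    cov = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    v1 = sum((a - m1) ** 2 for a in x1)
    v2 = sum((b - m2) ** 2 for b in x2)
    r2 = cov * cov / (v1 * v2)
    return 1.0 / (1.0 - r2)
```

A VIF near 1 indicates no collinearity; values above the usual threshold of 10 flag the severe multicollinearity that invalidates FLSE and motivates the bridge-regression estimator.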