This research aims to estimate stock returns using the Rough Set Theory approach, to test the theory's effectiveness and accuracy in predicting stock returns and its potential in the field of financial markets, and to rationalize investor decisions. The research sample totals 10 companies traded on the Iraq Stock Exchange. The results showed a remarkable capability of Rough Set Theory in data reduction, contributing to the rationalization of investment decisions. The most prominent conclusion is the capability of rough set theory in handling financial data and applying it to forecast stock returns. The research provides those interested in investing in stocks in financial markets with significant financial analysis tools that exceed traditional statistical methods. The originality of the research lies in the diversification of financial and statistical analysis tools and methods of forecasting stock returns.
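As a concrete illustration of the rough-set machinery behind such data reduction, the sketch below computes the lower and upper approximations of a target set of stocks under the indiscernibility relation induced by a set of attributes. The attribute names and sample objects are hypothetical, not taken from the study.

```python
from collections import defaultdict

def approximations(objects, attrs, target):
    """Rough-set lower/upper approximation of `target` under the
    indiscernibility relation induced by `attrs`.
    objects: dict name -> dict of attribute values; target: set of names."""
    # group objects into indiscernibility (equivalence) classes
    classes = defaultdict(set)
    for name, row in objects.items():
        key = tuple(row[a] for a in attrs)
        classes[key].add(name)
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target:       # class entirely inside target -> certain members
            lower |= cls
        if cls & target:        # class overlapping target -> possible members
            upper |= cls
    return lower, upper

# hypothetical discretized indicators for three stocks
stocks = {
    "s1": {"trend": "up", "vol": "hi"},
    "s2": {"trend": "up", "vol": "hi"},
    "s3": {"trend": "down", "vol": "lo"},
}
lower, upper = approximations(stocks, ["trend", "vol"], {"s1", "s3"})
```

Objects s1 and s2 are indiscernible here, so only s3 is certainly in the target; the boundary region (upper minus lower) flags where the data cannot discriminate.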
We study the physics of flow due to the interaction between a viscous dipole and boundaries that permit slip, including partial and free slip, and interactions near corners. The problem is investigated using a two-relaxation-time lattice Boltzmann equation with moment-based boundary conditions. Navier-slip conditions, which involve gradients of the velocity, are formulated and applied locally. The implementation of free-slip conditions with the moment-based approach is discussed. Collision angles of 0°, 30°, and 45° are investigated. Stable simulations are shown for Reynolds numbers between 625 and 10 000 and various slip lengths. Vorticity generation on the wall is shown to be affected by slip length, angle of incidence,
Flow-production systems whose components are connected in series (electricity plants, cement plants, water desalination plants) may lack fixed maintenance-scheduling procedures because failures occur at unpredictable times. Contemporary software and artificial intelligence (AI) technologies are used to fulfill the research objectives by developing a predictive maintenance program. The data of the fifth thermal unit of the power station for the electricity of Al Dora/Baghdad are used in this study. The research was conducted in three stages. First, missing data without temporal sequences were processed. The data were filled using a time series, hour after hour, and the times were filled as system working hours, making the volume of the data relativel
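A minimal sketch of the hour-by-hour gap-filling step might look like the following. The abstract does not specify the exact filling rule, so simple linear interpolation between the nearest known hourly readings is assumed here for illustration.

```python
def fill_hourly_gaps(readings):
    """Linearly interpolate missing (None) values in an hourly series.
    Assumes the first and last readings are present."""
    filled = list(readings)
    i = 0
    while i < len(filled):
        if filled[i] is None:
            j = i
            while filled[j] is None:   # find the next known reading
                j += 1
            # spread the change evenly across the gap
            step = (filled[j] - filled[i - 1]) / (j - i + 1)
            for k in range(i, j):
                filled[k] = filled[i - 1] + step * (k - i + 1)
            i = j
        i += 1
    return filled
```

For example, `[1.0, None, None, 4.0]` becomes `[1.0, 2.0, 3.0, 4.0]`.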
In this study, three methods, Williamson-Hall, size-strain plot, and Halder-Wagner, were used to analyze X-ray diffraction lines to determine the crystallite size and lattice strain of nickel oxide nanoparticles, and the results were compared with two other methods. The crystallite sizes calculated by these methods are (0.42554) nm, (1.04462) nm, and (3.60880) nm, and the lattice strains are (0.56603), (1.11978), and (0.64606), respectively; these were compared with the results of the Scherrer method, (0.29598) nm and (0.34245), and the modified Scherrer method, (0.97497). A difference in the calculated results was observed for each of these methods in this study.
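For reference, the Scherrer and Williamson-Hall calculations underlying such analyses can be sketched as below. The Cu K-alpha wavelength and the peak values in the usage note are illustrative assumptions, not data from the study.

```python
import math

CU_KA_NM = 0.15406   # Cu K-alpha wavelength in nm (assumed radiation source)

def scherrer_size(fwhm_deg, two_theta_deg, k=0.9, lam=CU_KA_NM):
    """Scherrer equation D = K*lambda / (beta * cos(theta)), in nm.
    fwhm_deg: peak full width at half maximum in degrees;
    two_theta_deg: diffraction angle 2-theta in degrees."""
    beta = math.radians(fwhm_deg)            # FWHM in radians
    theta = math.radians(two_theta_deg / 2)  # Bragg angle
    return k * lam / (beta * math.cos(theta))

def williamson_hall(peaks, k=0.9, lam=CU_KA_NM):
    """Williamson-Hall: beta*cos(theta) = K*lambda/D + 4*eps*sin(theta).
    A least-squares line through (4 sin(theta), beta cos(theta)) gives
    slope = strain eps and intercept = K*lambda/D.
    peaks: list of (fwhm_deg, two_theta_deg). Returns (size_nm, strain)."""
    xs, ys = [], []
    for fwhm_deg, two_theta_deg in peaks:
        beta = math.radians(fwhm_deg)
        theta = math.radians(two_theta_deg / 2)
        xs.append(4 * math.sin(theta))
        ys.append(beta * math.cos(theta))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return k * lam / intercept, slope
```

For a hypothetical peak at 2-theta = 43.3° with FWHM 0.5°, `scherrer_size(0.5, 43.3)` gives a size of roughly 17 nm; the Williamson-Hall fit additionally separates the strain contribution from the size broadening.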
In the current article, an easy and selective method is proposed for the spectrophotometric estimation of metoclopramide (MCP) in pharmaceutical preparations using a cloud point extraction (CPE) procedure. The method involves the reaction of MCP with 1-naphthol under alkaline conditions using Triton X-114 to form a stable dark purple dye. Beer's law is obeyed in the range 0.34–9 μg mL⁻¹ of MCP with r = 0.9959 (n = 3) after optimization. The relative standard deviation (RSD) and percentage recoveries were 0.89% and 96.99–104.11%, respectively. Moreover, using surfactant cloud point extraction to extract MCP enhanced the molar extinction coefficient (ε) to 1.7333×10⁵ L/mol·cm in the surfactant-rich phase. The small volume of organi
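The effect of the enhanced extinction coefficient can be illustrated with a small Beer-Lambert helper. The 1 cm path length and the metoclopramide molar mass used for unit conversion below are assumptions for illustration, not values stated in the article.

```python
MCP_MOLAR_MASS = 299.8  # g/mol, assumed approximate molar mass of metoclopramide

def beer_lambert_conc(absorbance, epsilon=1.7333e5, path_cm=1.0):
    """Beer-Lambert law A = epsilon * b * c, solved for c in mol/L.
    epsilon defaults to the coefficient reported for the MCP dye."""
    return absorbance / (epsilon * path_cm)

def conc_ug_per_ml(absorbance, molar_mass=MCP_MOLAR_MASS):
    """Convert the molar concentration to ug/mL (mol/L -> g/L -> ug/mL)."""
    return beer_lambert_conc(absorbance) * molar_mass * 1000.0
```

With such a large ε, an absorbance of 0.5 already corresponds to well under 1 μg mL⁻¹, which is consistent with the sub-microgram end of the reported linear range.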
This research investigated the importance and priorities of project overhead costs in Iraq via a questionnaire using the fuzzy analytic hierarchy process (FAHP) technique. This technique is very useful under uncertain circumstances such as those in our country. From the resulting priority weights, the researcher framed an equation that includes the percentages of each of the main items of project overhead costs. The researcher tested this equation by applying it to one of the completed projects, and the results showed its suitability for application. The percentages of (salaries, grants, and incentives) and (fieldwork requirements) in the equation represent approximately two-thirds of project overhe
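One common way to turn fuzzy pairwise judgments into crisp priority weights is Buckley's geometric-mean method; the sketch below assumes triangular fuzzy numbers (l, m, u) and is illustrative only, since the paper's exact FAHP variant is not specified in the abstract.

```python
def buckley_fahp_weights(matrix):
    """Crisp criterion weights from a pairwise-comparison matrix of
    triangular fuzzy numbers (l, m, u), via Buckley's method:
    fuzzy geometric mean per row, fuzzy normalization, centroid
    defuzzification, then renormalization of the crisp weights."""
    n = len(matrix)
    gmeans = []
    for row in matrix:
        l = m = u = 1.0
        for (lo, mid, up) in row:           # component-wise product
            l *= lo; m *= mid; u *= up
        gmeans.append((l ** (1 / n), m ** (1 / n), u ** (1 / n)))
    total = tuple(sum(g[i] for g in gmeans) for i in range(3))
    # fuzzy weight = gmean_i * total^-1, where (l, m, u)^-1 = (1/u, 1/m, 1/l)
    fuzzy_w = [(g[0] / total[2], g[1] / total[1], g[2] / total[0]) for g in gmeans]
    crisp = [(l + m + u) / 3 for (l, m, u) in fuzzy_w]   # centroid defuzzification
    s = sum(crisp)
    return [c / s for c in crisp]
```

For two criteria where the first is judged roughly "moderately more important" (fuzzy 2–3–4), the method returns weights summing to 1 with the first weight larger.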
A continuous sliding mode control (SMC) based on a reduced-order extended state observer (RESO) is proposed in this paper for the tracking problem of high-order Brunovsky systems in the presence of external perturbations and system uncertainties. For this purpose, a composite control is constituted in two consecutive steps. First, the reduced-order ESO (RESO) is designed to estimate the unknown system states and the total disturbance without estimating the available state. Second, the continuous SMC law is designed based on the estimates supplied by the RESO in order to govern the nominal system part. More importantly, robust performance is achieved by compensating not only the lumped disturbance but also its esti
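The RESO-plus-continuous-SMC idea can be illustrated on a first-order toy plant; the gains, disturbance value, and tanh boundary-layer smoothing below are illustrative choices for this sketch, not the paper's design.

```python
import math

def simulate_reso_smc(T=5.0, dt=1e-3, d=0.5, L=50.0, k=2.0, phi=0.01):
    """First-order toy plant x' = u + d with unknown constant disturbance d.
    A reduced-order ESO with internal state xi estimates the disturbance as
    d_hat = xi + L*x (so d_hat' = L*(d - d_hat)), and a continuous
    (tanh-smoothed) SMC law uses that estimate to cancel d."""
    x = 1.0           # plant state, to be driven to the origin
    xi = -L * x       # observer state, chosen so d_hat starts at 0
    for _ in range(int(T / dt)):
        d_hat = xi + L * x
        u = -d_hat - k * math.tanh(x / phi)   # disturbance rejection + smooth SMC
        x += dt * (u + d)                     # explicit Euler step, plant
        xi += dt * (-L * d_hat - L * u)       # explicit Euler step, observer
    return x, xi + L * x

x_end, d_hat_end = simulate_reso_smc()
```

Note that the observer needs only `x` and `u`, never the disturbance itself; after the transient, the estimate converges to d and the smoothed sliding law keeps the state near the origin without chattering.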
Abstract
Objective: This study aims to improve the transdermal permeability of methotrexate using eucalyptus oil, olive oil, and peppermint oil as enhancers.
Method: Eucalyptus oil (2% and 4%), peppermint oil (2% and 4%), and olive oil (2% and 4%) were used as natural enhancers to improve the transdermal permeability of methotrexate via a gel formulation. The gel was subjected to several physicochemical tests. In-vitro release and permeability studies for the drug were performed by Franz cell diffusion across a synthetic membrane, and the release kinetics were modeled with the Korsmeyer-Peppas equation.
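The Korsmeyer-Peppas fit mentioned above is a linear regression in log-log space of the power law Mt/Minf = k·t^n; a minimal sketch with synthetic release data (not the study's measurements) follows.

```python
import math

def korsmeyer_peppas_fit(times, fractions):
    """Fit Mt/Minf = k * t**n by least squares on log(Mt/Minf) vs log(t).
    times: sampling times; fractions: cumulative fraction released (0-1).
    Returns (k, n); the exponent n characterizes the release mechanism."""
    xs = [math.log(t) for t in times]
    ys = [math.log(f) for f in fractions]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    n = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    k = math.exp(my - n * mx)
    return k, n
```

For synthetic data generated from Mt/Minf = 0.1·t^0.5, the fit recovers k = 0.1 and n = 0.5 exactly, since the points lie on a straight line in log-log space.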
Result: The results demonstrate that a gel that is safe, non-irritant, causes no necrosis to rats' skin, and is stable for up to 60 days was successfully formulated.
Feature selection (FS) constitutes a series of processes used to decide which relevant features/attributes to include and which irrelevant features to exclude for predictive modeling. It is a crucial task that aids machine learning classifiers in reducing error rates, computation time, and overfitting, and in improving classification accuracy. It has demonstrated its efficacy in myriad domains, ranging from text classification (TC) and text mining to image recognition. While there are many traditional FS methods, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews concerning TC. Therefore, a comprehensive overview was systematicall
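A toy example of a metaheuristic wrapper FS method, here simple bit-flip hill climbing over feature subsets with a hypothetical fitness function, can be sketched as:

```python
import random

def hill_climb_fs(n_features, fitness, iters=200, seed=0):
    """Toy metaheuristic wrapper FS: hill climbing over boolean feature
    masks. Each step flips one randomly chosen bit and keeps the flip
    only if the fitness does not decrease."""
    rng = random.Random(seed)
    mask = [rng.random() < 0.5 for _ in range(n_features)]
    best = fitness(mask)
    for _ in range(iters):
        j = rng.randrange(n_features)
        mask[j] = not mask[j]
        score = fitness(mask)
        if score >= best:
            best = score
        else:
            mask[j] = not mask[j]   # revert a worsening flip
    return mask, best

def demo_fitness(mask):
    """Hypothetical score: features 0 and 2 are 'relevant' (reward 1 each);
    every selected feature carries a 0.1 cost, penalizing large subsets."""
    return sum(1.0 for i in (0, 2) if mask[i]) - 0.1 * sum(mask)

selected, score = hill_climb_fs(6, demo_fitness)
```

Real metaheuristic FS methods (genetic algorithms, particle swarm, etc.) replace the single-flip neighborhood with richer search operators and use a classifier's cross-validated accuracy as the fitness, but the subset-encoding idea is the same.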