This work studies the solvability of an optimal classical continuous control quaternary vector problem governed by a quaternary linear hyperbolic boundary value problem. The existence of a unique quaternary state vector solution of the quaternary linear hyperbolic boundary value problem, when the classical continuous control quaternary vector is known, is demonstrated by the Galerkin method. The existence theorem for an optimal classical continuous control quaternary vector related to the quaternary linear hyperbolic boundary value problem is then proved. The existence of a unique solution to the adjoint quaternary linear hyperbolic boundary value problem associated with the state problem is formulated and studied. The directional derivative of the cost functional is derived. Finally, the necessary optimality theorem for the optimal classical continuous control quaternary vector is proved.
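As a generic illustration of the Galerkin framework the abstract refers to (the notation below is assumed for illustration, not taken from the paper): for a scalar hyperbolic state equation \(y_{tt} - \Delta y = f + u\) with control \(u\), the weak formulation and its finite-dimensional Galerkin approximation read

```latex
% Illustrative weak (Galerkin) formulation -- notation assumed, not the paper's.
% State y, control u, test functions v in H^1_0(\Omega), Q = \Omega x (0,T):
\begin{aligned}
&\langle y_{tt}, v\rangle + \int_\Omega \nabla y \cdot \nabla v \, dx
   = \int_\Omega (f + u)\, v \, dx
   \qquad \forall v \in H^1_0(\Omega), \\
&y_n(t) = \sum_{j=1}^{n} c_j(t)\,\varphi_j, \qquad
   \{\varphi_j\}_{j=1}^{n} \text{ a basis of the Galerkin subspace }
   V_n \subset H^1_0(\Omega),
\end{aligned}
```

where one solves the resulting system of ordinary differential equations for the coefficients \(c_j(t)\), derives a priori bounds, and passes to the limit \(n \to \infty\) to obtain existence and uniqueness of the state.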
The aim of this paper is to introduce and study the concept of SN-spaces via the notion of simply-open sets, to investigate their relationship to other topological spaces, and to give some of their properties.
The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, together with level-dependent threshold values for the case of correlated errors, which treat the coefficients at each resolution level separately, unlike global threshold values that treat all levels simultaneously, such as the VisuShrink, False Discovery Rate, Improvement Thresholding and SureShrink methods. The study was conducted on real monthly data representing the rates of theft crimes …
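A minimal sketch of level-dependent wavelet shrinkage, using a hand-rolled Haar transform and NumPy only (the paper's exact transform, polynomial boundary correction and threshold rules are not reproduced; the per-level universal threshold below is a standard stand-in):

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return a, d

def haar_idwt(a, d):
    """Inverse of one Haar level."""
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft(c, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def level_dependent_shrink(y, levels=3):
    """Shrink each detail level with its own threshold, estimating the
    noise scale per level (robust MAD estimate), so correlated errors
    with different energy at different scales get different thresholds."""
    details, a = [], np.asarray(y, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        sigma = np.median(np.abs(d)) / 0.6745        # per-level noise scale
        t = sigma * np.sqrt(2.0 * np.log(len(y)))    # universal threshold
        details.append(soft(d, t))
    for d in reversed(details):
        a = haar_idwt(a, d)
    return a
```

A globally flat signal passes through unchanged, since all its detail coefficients (and hence all per-level thresholds) are zero.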
Studying extreme precipitation is very important in Iraq. In particular, the last decade witnessed an increasing trend in extreme precipitation as the climate changes, some of which caused disastrous consequences for the social and economic environment in many parts of the country. In this paper a statistical analysis of rainfall data is performed. Annual maximum rainfall data obtained from monthly records for a period of 127 years (1887-2013 inclusive) at the Baghdad meteorology station have been analyzed. The three distributions chosen to fit the data were the Gumbel, Fréchet and generalized extreme value (GEV) distributions. Using the maximum likelihood method, results showed that the GEV distribution gave the best fit, followed by the Fréchet distribution …
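A hedged sketch of the fitting step with SciPy: the sample below is synthetic (the Baghdad series is not reproduced), and SciPy's `genextreme` uses the shape convention `c = -xi`. Comparing fitted log-likelihoods is one simple way to rank the candidate families, and a return level illustrates the typical use of the fitted GEV:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical annual-maximum rainfall sample (mm), 127 values like the
# 1887-2013 record length; NOT the real Baghdad data.
sample = stats.genextreme.rvs(c=-0.1, loc=40.0, scale=12.0,
                              size=127, random_state=rng)

# Maximum-likelihood fit of the GEV distribution.
c, loc, scale = stats.genextreme.fit(sample)
ll_gev = np.sum(stats.genextreme.logpdf(sample, c, loc, scale))

# Gumbel is the c = 0 special case; its own ML fit for comparison.
gl, gs = stats.gumbel_r.fit(sample)
ll_gumbel = np.sum(stats.gumbel_r.logpdf(sample, gl, gs))

# 100-year return level: the quantile exceeded with probability 1/100 per year.
rl_100 = stats.genextreme.ppf(1 - 1 / 100, c, loc, scale)
```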
The current study aims to apply methods of evaluating investment decisions to extract the highest value and reduce the economic and environmental costs of the health sector according to the strategy. In order to achieve the objectives of the study, the researcher relied on the deductive approach in the theoretical part by collecting sources and previous studies, and used the applied practical approach, relying on the data and reports of Amir almuminin Hospital for the period (2017-2031), for the purpose of evaluating investment decisions in the hospital. The study reached a set of conclusions, the most important of which is the failure to apply …
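A minimal sketch of one standard investment-decision criterion, net present value; the cash flows and discount rate below are invented for illustration and are not the hospital's figures:

```python
def npv(rate, cash_flows):
    """Net present value of cash flows, one per period (t = 0, 1, 2, ...)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical project: outlay now, then net benefits over four years.
flows = [-1_000_000, 300_000, 350_000, 400_000, 450_000]

# Classic decision rule: accept when NPV at the required rate is positive.
decision = "accept" if npv(0.10, flows) > 0 else "reject"
```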
A fault is an error that affects system behaviour. A software metric is a value that represents the degree to which software processes work properly and where faults are more probable to occur. In this research, we study the effects of removing redundancy and of log transformation based on threshold values for identifying fault-prone classes of software. The study also compares the metric values of an original dataset with those after removing redundancy and applying the log transformation. E-learning and system datasets were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets, and from 1%-10% and 0%-4% after removing redundancy and log transformation, respectively. These results impacted direct…
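A small sketch of the two preprocessing steps named in the abstract, on invented metric rows (the actual datasets and thresholds are not reproduced): duplicate records are dropped, then numeric metrics are compressed with `log(1 + x)` so zero-valued metrics stay defined:

```python
import math

# Hypothetical class-level rows: (metric values..., faulty?)
rows = [
    (12, 3, 0.8, True),
    (12, 3, 0.8, True),    # exact duplicate: redundancy to remove
    (40, 9, 2.5, False),
    (7, 1, 0.2, False),
]

# Step 1: remove redundant (duplicate) records while preserving order.
unique_rows = list(dict.fromkeys(rows))

# Step 2: log-transform the numeric metrics, keeping the fault label.
transformed = [
    tuple(math.log1p(v) for v in r[:-1]) + (r[-1],) for r in unique_rows
]

# Fault ratio of the deduplicated data (fraction of faulty classes).
fault_ratio = sum(r[-1] for r in unique_rows) / len(unique_rows)
```

Note how deduplication alone changes the fault ratio here (from 2/4 to 1/3), mirroring the shift in ratios the abstract reports.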
The process of evaluating stocks is considered one of the challenges of financial analysis, since the evaluation focuses on defining the present value of the cash flows which the shareholders expect to receive. Due to the importance of this subject, the current research chose the Fama and French five-factor model to evaluate common stocks and to assess the model's accuracy for 2014. The factors of size, book-to-market value, profitability and investment were used, in addition to the beta coefficient used in the capital asset pricing model, as components of the Fama and French five-factor model. The research sample included 11 banks listed on the Iraq Stock Exchange which have me…
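A hedged sketch of estimating five-factor loadings by ordinary least squares; the factor series and return data below are simulated (the Iraqi bank data are not reproduced), and the factor labels are only illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 60  # hypothetical months of data for one bank stock

# Simulated factor series standing in for the market, size (SMB),
# value (HML), profitability (RMW) and investment (CMA) factors.
F = rng.normal(0.0, 0.03, size=(T, 5))
true_betas = np.array([1.1, 0.4, 0.3, 0.2, -0.1])
# Excess return = alpha + factor exposures + idiosyncratic noise.
r_excess = 0.002 + F @ true_betas + rng.normal(0.0, 0.01, size=T)

# OLS estimate of alpha (intercept) and the five factor loadings.
X = np.column_stack([np.ones(T), F])
coef, *_ = np.linalg.lstsq(X, r_excess, rcond=None)
alpha, betas = coef[0], coef[1:]
```

A stock is then judged fairly priced under the model when the estimated alpha is statistically indistinguishable from zero.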
The present study focused mainly on the vibration analysis of composite laminated plates subjected to thermal and mechanical loads or without any load (free vibration). Natural frequency and dynamic response are analyzed by analytical, numerical and experimental methods (using an impact hammer) for different cases. The experimental investigation was to manufacture the laminates and to find the mechanical and thermal properties of glass-polyester, such as the longitudinal and transverse Young's moduli, shear modulus, longitudinal and transverse thermal expansion coefficients and thermal conductivity. The vibration test was carried out to find the first three natural frequencies of the plate. The design parameters of the laminates, such as aspect ratio and thickness …
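As a rough order-of-magnitude companion to the abstract above, the classical-plate-theory formula for a simply supported thin isotropic plate can be sketched as follows; the material values are assumed glass-polyester-like placeholders, not the paper's measured laminate properties, and laminate anisotropy is ignored:

```python
import math

def plate_natural_frequencies(E, nu, rho, h, a, b, n_modes=3):
    """First natural frequencies (Hz) of a simply supported thin isotropic
    plate via classical plate theory:
    w_mn = pi^2 * ((m/a)^2 + (n/b)^2) * sqrt(D / (rho * h))."""
    D = E * h**3 / (12.0 * (1.0 - nu**2))   # flexural rigidity
    freqs = []
    for m in range(1, 4):
        for n in range(1, 4):
            w = math.pi**2 * ((m / a) ** 2 + (n / b) ** 2) \
                * math.sqrt(D / (rho * h))
            freqs.append(w / (2.0 * math.pi))  # rad/s -> Hz
    return sorted(freqs)[:n_modes]

# Assumed illustrative values: E = 15 GPa, nu = 0.3, rho = 1800 kg/m^3,
# h = 4 mm, plate 0.3 m x 0.2 m.
f1, f2, f3 = plate_natural_frequencies(15e9, 0.3, 1800.0, 0.004, 0.3, 0.2)
```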
In this paper we introduce many new concepts, all of which depend completely on the concept of a feebly open set. The main concepts introduced in this paper are minimal f-open and maximal f-open sets. New types of topological spaces, called Tf min and Tf max spaces, are also introduced. Besides, we present a package of maps called minimal f-continuous, maximal f-continuous, f-irresolute minimal, f-irresolute maximal, minimal f-irresolute and maximal f-irresolute. Additionally, we investigate some fundamental properties of the concepts presented in this paper.
This paper presents a hybrid approach for solving the null-values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model. A large set of complete data, called learning data, is used to find the decision rule sets that are then used in solving the incomplete data problem. The intelligent swarm algorithm is used for feature selection: the bees algorithm serves as a heuristic search algorithm combined with rough set theory as the evaluation function. Another feature selection algorithm, ID3, is also presented; it works as a statistical algorithm instead of an intelligent algorithm. A comparison between these two approaches is made in terms of their performance for null-values estima…
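A toy sketch of the supervised idea (learn rules from complete records, then fill nulls); the paper pairs rough sets with the bees algorithm for feature selection, while here a plain most-frequent-decision rule over one invented conditioning attribute stands in:

```python
from collections import Counter, defaultdict

# Hypothetical learning data (complete records): (weather, traffic, late?)
complete = [
    ("rain", "heavy", "yes"),
    ("rain", "heavy", "yes"),
    ("sun", "light", "no"),
    ("sun", "light", "no"),
    ("sun", "heavy", "yes"),
]

# Rule table: for each value of the selected feature (column 0),
# count the decisions seen among complete records.
rules = defaultdict(Counter)
for weather, _traffic, late in complete:
    rules[weather][late] += 1

def fill(record):
    """Replace a None decision using the learned rule set
    (most frequent decision for the record's feature value)."""
    weather, traffic, late = record
    if late is None:
        late = rules[weather].most_common(1)[0][0]
    return (weather, traffic, late)
```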
In the present work, an image compression method has been modified by combining the Absolute Moment Block Truncation Coding (AMBTC) algorithm with VQ-based image coding. At the beginning, the AMBTC algorithm, based on a Weber's law condition, is used to distinguish low- and high-detail blocks in the original image. The coder transmits only the mean of a low-detail block (i.e. a uniform block such as background) on the channel, instead of transmitting the two reconstruction mean values and the bit map for this block. A high-detail block is coded by the proposed fast encoding algorithm for the vector quantization method based on the Triangular Inequality Theorem (TIE); the coder then transmits the two reconstruction mean values (i.e. H&L) …
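A minimal sketch of the block classification and two-level AMBTC coding described above; the Weber-style uniformity test and its threshold are simplified assumptions, and the VQ/TIE stage for high-detail blocks is omitted:

```python
import numpy as np

def ambtc_block(block, weber_t=0.03):
    """Code one image block: only the mean for a low-detail block
    (Weber-style contrast test against background brightness), or the
    two reconstruction means H, L plus a bit map otherwise."""
    mean = block.mean()
    if block.max() - block.min() <= weber_t * max(mean, 1.0):
        return ("uniform", mean)              # one value on the channel
    bitmap = block >= mean
    hi = block[bitmap].mean()                 # H: mean of bright pixels
    lo = block[~bitmap].mean()                # L: mean of dark pixels
    return ("detail", hi, lo, bitmap)

def decode(code, shape):
    """Rebuild a block from its transmitted representation."""
    if code[0] == "uniform":
        return np.full(shape, code[1])
    _, hi, lo, bitmap = code
    return np.where(bitmap, hi, lo)
```

A two-valued block is reconstructed exactly, while a flat block costs a single transmitted value, which is the saving the abstract describes.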