The present study offers a detailed discussion and analysis of parenthetical constructions in English and Arabic, the aim being to pinpoint the similarities and differences between the two languages in this particular linguistic area. The study claims that various types of constructions in English and Arabic can be considered parenthetical; these include non-restrictive relative clauses, non-restrictive appositives, comment clauses, vocatives, and interjections, among others. These are identified, classified, and analyzed according to the Quirk grammar, the approach to grammatical description pioneered by Randolph Quirk and his associates and published in a series of reference grammars during the 1970s and 1980s, notably A Grammar of Contemporary English (1972) and its successor A Comprehensive Grammar of the English Language (1985). Reference is made, wherever necessary, to the principles, techniques, and terminology of other models of grammar; the method is thus more or less eclectic. The concluding part of the research offers the main findings of the study.
The current research aims to train students to draw on their studies to analyze and appreciate works of art, one of the most important components of the academic preparation of students specializing in the visual arts, and then to activate this ability while training them in teaching methods. To this end, mind maps were employed as a tool: each student was left free to analyze a model artwork and to reflect on his own analytical principles in light of what he knows. A new stage then begins, centered on the possibility of transforming this analysis into a teaching style by thinking about how a student would approach it; the same person who undertook the analysis should then present this work.
In the present work, different remote sensing techniques have been used to analyze remote sensing data spectrally using ENVI software. Most of the algorithms used in spectral processing can be organized into target detection, change detection, and classification. In this paper, several methods of target detection have been studied, such as the matched filter and constrained energy minimization.
Water body maps have been obtained, and the results show changes in the study area over the period 1995-2000. The results obtained from applying constrained energy minimization were also more accurate than those of the other method when compared with the real situation.
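As a rough illustration of the second detector, the sketch below implements the standard constrained energy minimization (CEM) filter w = R⁻¹d / (dᵀR⁻¹d) on a synthetic scene; the array shapes, the implanted target, and all parameter values are assumptions for demonstration, not the paper's ENVI workflow.

```python
# A minimal sketch of CEM target detection, assuming a NumPy array of
# shape (pixels, bands) and a known target spectrum; names are
# illustrative, not from the paper.
import numpy as np

def cem_detector(pixels: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Score each pixel by the CEM filter w = R^-1 d / (d^T R^-1 d)."""
    n = pixels.shape[0]
    R = pixels.T @ pixels / n        # sample correlation matrix (bands x bands)
    Rinv_d = np.linalg.solve(R, target)
    w = Rinv_d / (target @ Rinv_d)
    return pixels @ w                # values near 1 indicate likely targets

# Synthetic example: 1000 pixels, 50 bands, target implanted in 10 pixels.
rng = np.random.default_rng(0)
scene = rng.normal(size=(1000, 50))
signature = rng.normal(size=50)
scene[:10] += signature
scores = cem_detector(scene, signature)
print(scores[:10].mean(), scores[10:].mean())
```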
In this paper, the non-dimensional magnetohydrodynamics (MHD) problem of nanoparticles in Jeffery-Hamel flow (JHF) has been studied. The fundamental equations for this problem reduce to a third-order ordinary differential equation. The current project investigated the effect of the angle between the plates, the Reynolds number, the nanoparticle volume fraction parameter, and the magnetic number on the velocity distribution by using an analytical technique known as the perturbation iteration scheme (PIS). The effect of these parameters is similar in the converging and diverging channels, except for the magnetic number, whose effect differs in the divergent channel. Furthermore, the resulting solutions show good convergence and high accuracy.
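For orientation, a minimal numerical sketch of the underlying third-order boundary value problem is given below, using the classical Jeffery-Hamel equation f''' + 2αRe f f' + 4α²f' = 0 with f(0)=1, f'(0)=0, f(1)=0; the paper's MHD and nanofluid terms would modify these coefficients, the parameter values are illustrative, and SciPy's BVP solver stands in for the perturbation iteration scheme.

```python
# Numerical cross-check of the classical Jeffery-Hamel BVP (assumed
# parameter values), not the paper's PIS solution.
import numpy as np
from scipy.integrate import solve_bvp

alpha = np.deg2rad(5.0)   # half-angle of the diverging channel (assumed)
Re = 50.0                 # Reynolds number (assumed)

def rhs(eta, y):
    f, fp, fpp = y
    fppp = -2.0 * alpha * Re * f * fp - 4.0 * alpha**2 * fp
    return np.vstack([fp, fpp, fppp])

def bc(ya, yb):
    return np.array([ya[0] - 1.0, ya[1], yb[0]])

eta = np.linspace(0.0, 1.0, 101)
guess = np.zeros((3, eta.size))
guess[0] = 1.0 - eta**2            # parabolic initial guess
sol = solve_bvp(rhs, bc, eta, guess)
print(sol.status, sol.y[0, ::20])  # sampled velocity profile
```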
There are many different methods for the analysis of two-way reinforced concrete slabs. The most efficient methods depend on using certain factors given in different codes of reinforced concrete design. Other ways of analyzing two-way slabs are the direct design method and the equivalent frame method, but these methods usually need a long time for the analysis of the slabs.
In this paper, a new simple method has been developed to analyze two-way slabs using empirical formulae, and the results of the final analysis of some examples have been compared with the other methods given in different codes of practice.
The comparison proves that this simple proposed method gives good results and can be used in the analysis of two-way slabs.
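The paper's empirical formulae are not reproduced in this abstract, so as a generic illustration of coefficient-based two-way slab analysis, the sketch below computes design moments as M = C·w·lx² with hypothetical coefficients of the kind tabulated in design codes; every value here is for demonstration only.

```python
# Generic coefficient-method sketch; the coefficients are hypothetical
# stand-ins, not the paper's proposed formulae.
def panel_moments(w: float, lx: float, coeffs: dict) -> dict:
    """w: factored load (kN/m^2), lx: short span (m),
    coeffs: dimensionless moment coefficients per location."""
    return {loc: c * w * lx**2 for loc, c in coeffs.items()}

# Hypothetical coefficients for an interior panel (illustration only).
coeffs = {"mid_span_x": 0.035, "support_x": 0.047,
          "mid_span_y": 0.024, "support_y": 0.032}
print(panel_moments(w=12.0, lx=5.0, coeffs=coeffs))  # kN*m per metre width
```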
Feature extraction is a critical step in image discrimination: it makes the content of images as representative as possible. A Gaussian blur filter is used to eliminate noise and improve the purity of images, and principal component analysis (PCA) is a straightforward and effective method for deriving feature vectors and reducing the dimensionality of a data set. This paper proposes using the Gaussian blur filter to eliminate image noise and an improved PCA for feature extraction. The traditional PCA achieves average recall and precision of 93% and 97%, while the improved PCA achieves 98% and 100%, showing that the improved PCA is more effective in both recall and precision.
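A minimal sketch of the blur-then-PCA pipeline is given below, assuming a stack of grayscale images as a NumPy array; the sigma, image size, and component count are illustrative choices, and the paper's specific PCA improvement is not reproduced here.

```python
# Blur-then-PCA feature extraction on synthetic images (illustrative
# parameters, not the paper's improved PCA).
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
images = rng.random((200, 32, 32))             # 200 synthetic 32x32 images

# Step 1: Gaussian blur each image to suppress noise.
blurred = np.stack([gaussian_filter(img, sigma=1.0) for img in images])

# Step 2: flatten and project onto the leading principal components.
flat = blurred.reshape(len(blurred), -1)       # (200, 1024)
pca = PCA(n_components=50)
features = pca.fit_transform(flat)             # (200, 50) feature vectors
print(features.shape, pca.explained_variance_ratio_[:5].round(3))
```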
In multivariate survival analysis, estimating the multivariate distribution functions and then measuring the association between survival times are of great interest. Copula functions, such as Archimedean copulas, are commonly used to estimate unknown bivariate distributions from known marginal functions. In this paper, the feasibility of using the idea of local dependence to identify the most efficient copula model among some Archimedean copulas, used to construct a bivariate Weibull distribution for bivariate survival times, is explored. Furthermore, to evaluate the efficiency of the proposed procedure, a simulation study is implemented. It is shown that this approach is useful and applicable for practical situations.
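As a small sketch of the construction involved, the code below samples bivariate survival times from an Archimedean (here Clayton) copula with Weibull marginals; theta and the Weibull parameters are illustrative, and the paper's local-dependence model-selection step is not reproduced.

```python
# Clayton copula with Weibull marginals via conditional inversion
# (assumed parameter values, for illustration only).
import numpy as np

rng = np.random.default_rng(1)
theta = 2.0                       # Clayton dependence parameter (assumed)
n = 1000

# Conditional-inversion sampling from the Clayton copula.
u1 = rng.random(n)
v = rng.random(n)
u2 = (u1**(-theta) * (v**(-theta / (1.0 + theta)) - 1.0) + 1.0)**(-1.0 / theta)

# Map uniforms to Weibull survival times via the inverse CDF.
def weibull_ppf(u, shape, scale):
    return scale * (-np.log1p(-u))**(1.0 / shape)

t1 = weibull_ppf(u1, shape=1.5, scale=10.0)
t2 = weibull_ppf(u2, shape=2.0, scale=8.0)
print(np.corrcoef(t1, t2)[0, 1])  # positive association induced by theta
```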
This study's objective is to assess how well UV spectrophotometry can be used in conjunction with multivariate calibration based on partial least squares (PLS) regression for concurrent quantitative analysis of an antibacterial mixture (Levofloxacin (LIV), Metronidazole (MET), Rifampicin (RIF), and Sulfamethoxazole (SUL)) in artificial mixtures and pharmaceutical formulations. The experimental calibration and validation matrices were created using 42 and 39 samples, respectively. The concentration range taken into account was 0-17 μg/mL for all components. The calibration standards' absorbance measurements were made between 210 and 350 nm, at intervals of 0.2 nm. The associated parameters were examined in order to develop the optimal calibration model.
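A minimal sketch of this kind of multivariate calibration follows, using scikit-learn's PLS regression on synthetic absorbance matrices of shape (samples, wavelengths) with four analyte concentrations per sample; the spectra are stand-ins, not the paper's calibration set, and the component count is an assumption.

```python
# PLS calibration sketch on synthetic UV spectra (illustrative data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
wavelengths = 700                       # ~210-350 nm at 0.2 nm steps
pure = rng.random((4, wavelengths))     # pure-component spectra (assumed)

def simulate(n):
    conc = rng.uniform(0.0, 17.0, size=(n, 4))   # ug/mL, as in the paper
    spectra = conc @ pure + rng.normal(0.0, 0.01, (n, wavelengths))
    return spectra, conc

X_cal, Y_cal = simulate(42)             # calibration set size from the paper
X_val, Y_val = simulate(39)             # validation set size from the paper

pls = PLSRegression(n_components=6)     # component count is an assumption
pls.fit(X_cal, Y_cal)
print(r2_score(Y_val, pls.predict(X_val)))
```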
Water quality planning relies on biochemical oxygen demand (BOD), but BOD testing takes five days. Particle swarm optimization (PSO) is increasingly used for water resource forecasting. This work designed a PSO technique for estimating the daily BOD at the inlet of the Al-Rustumiya wastewater treatment plant, which provided 702 plant-scale data sets during 2012-2022. The PSO model uses daily data of the water quality parameters, including chemical oxygen demand (COD), chloride (Cl-), suspended solids (SS), total dissolved solids (TDS), and pH, to determine how each variable affects the daily incoming BOD. PSO and multiple linear regression (MLR) findings are compared, and their performance is evaluated.
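The sketch below shows one plausible setup: a plain global-best PSO fitting a linear BOD predictor from the five inputs named above. The swarm settings and the synthetic data are illustrative, not the plant's records, and the paper's exact model form is an assumption.

```python
# Global-best PSO fitting linear model weights + bias on synthetic data.
import numpy as np

rng = np.random.default_rng(3)
n, p = 702, 5                                    # samples x (COD, Cl, SS, TDS, pH)
X = rng.random((n, p))
true_w = np.array([0.8, 0.1, 0.3, 0.2, -0.4])
y = X @ true_w + 1.5 + rng.normal(0.0, 0.05, n)  # synthetic BOD values

def mse(params):                                 # params = [weights..., bias]
    return np.mean((X @ params[:p] + params[p] - y) ** 2)

swarm = rng.uniform(-1.0, 2.0, (30, p + 1))
vel = np.zeros_like(swarm)
pbest, pbest_f = swarm.copy(), np.array([mse(s) for s in swarm])
gbest = pbest[pbest_f.argmin()].copy()
for _ in range(300):
    r1, r2 = rng.random(swarm.shape), rng.random(swarm.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - swarm) + 1.5 * r2 * (gbest - swarm)
    swarm += vel
    f = np.array([mse(s) for s in swarm])
    better = f < pbest_f
    pbest[better], pbest_f[better] = swarm[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()
print(gbest.round(2), mse(gbest))
```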
Linear discriminant analysis and logistic regression are among the most widely used multivariate statistical methods for the analysis of data with categorical outcome variables. Both are appropriate for the development of linear classification models. Linear discriminant analysis requires that the explanatory variables follow a multivariate normal distribution, while logistic regression makes no assumptions about their distribution. Hence, logistic regression is assumed to be the more flexible and more robust method in case of violations of these assumptions.
In this paper, we focus on the comparison between three forms of classification models for such data.
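As a small sketch of the kind of comparison involved, the code below fits LDA and logistic regression as linear classifiers on the same data and reports test accuracy; the dataset and the metric are illustrative stand-ins for the paper's comparison.

```python
# Side-by-side LDA vs logistic regression on synthetic data.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for model in (LinearDiscriminantAnalysis(), LogisticRegression(max_iter=1000)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, model.score(X_te, y_te))
```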
In aspect-based sentiment analysis (ABSA), implicit aspect extraction is a fine-grained task that aims to extract the aspects hidden in the in-context meaning of online reviews. Previous methods have shown that handcrafted rules interpolated into a neural network architecture are promising for this task. In this work, we reduce the need for crafted rules, which must be laboriously rearticulated for each new training domain or text data set, by proposing a new architecture that relies on multi-label neural learning. The key idea is to capture the semantic regularities of the explicit and implicit aspects using word-embedding vectors and to interpolate these as a front layer in a bidirectional long short-term memory (Bi-LSTM) network.
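A minimal sketch of such a multi-label Bi-LSTM over word embeddings follows; the vocabulary size, dimensions, and aspect count are illustrative, and the embedding layer would normally be initialized from pretrained vectors rather than trained from scratch.

```python
# Multi-label Bi-LSTM sketch for implicit aspect tagging (PyTorch,
# illustrative dimensions; not the paper's exact architecture).
import torch
import torch.nn as nn

class AspectBiLSTM(nn.Module):
    def __init__(self, vocab=5000, emb=100, hidden=64, n_aspects=12):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb)              # front embedding layer
        self.lstm = nn.LSTM(emb, hidden, batch_first=True,
                            bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_aspects)      # one logit per aspect

    def forward(self, tokens):
        x = self.emb(tokens)                             # (batch, seq, emb)
        _, (h, _) = self.lstm(x)                         # final hidden states
        h = torch.cat([h[-2], h[-1]], dim=1)             # both directions
        return self.out(h)                               # multi-label logits

model = AspectBiLSTM()
batch = torch.randint(0, 5000, (8, 20))                  # 8 reviews, 20 tokens
labels = torch.randint(0, 2, (8, 12)).float()            # multi-hot aspects
loss = nn.BCEWithLogitsLoss()(model(batch), labels)
print(loss.item())
```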