Through a shared language grounded in the interpretation and identification of symbols and signs, the relationship between Sufism and artistic semiotics manifests itself in the construction of the text, the intensity of its reading, and the dismantling of its intellectual systems.
Sufism emerged with its religious features and its spiritual revelations of divine love and absolute reality, carrying images and language in a unique and harmonious intellectual and artistic stream. This stream communicates with the themes of Arabic literature and their implications, yet it stands apart as a distinct entity whose signals and symbols belong to mysticism and worship.
The unleashing of imagination, personification, and embodiment formed the repository from which Sufism drew the powers of the soul and its raw material, shaping a world of images directed toward the divine names and the divine self. This helped the Sufis, in their utterances and strivings, to forge a special language that relies on the symbol to express ideas whose meanings lie hidden beneath the apparent words; such meanings are attainable only by the initiated, since their outward form invites ambiguity and concealment.
Within the mechanisms of producing a new text, semiotics continues to meet the language of Sufism, between the breadth of thought and the deciphering of symbols, revealing the signs and forms of the mystical text within the space of plastic modernism.
The Sufi text has taken on a distinctive character in its different formations across different poets and their differing experiences. Hence, this study is analytical and semi-descriptive in nature, working within the clusters of Sufi symbols: wine, nature, beauty, and the journeys toward the Divine Self and the Sufi self. The research proceeds in two frameworks: the first is an inductive framework that defines terms such as semiotics and mysticism and introduces the book Al-Khuraida; the second is the analytical framework.
One of the most important features of the Amazon Web Services (AWS) cloud is that a program can be run and accessed from any location. The result of the program can be accessed and monitored from anywhere, many images can be stored, and computation becomes faster. This work proposes a face detection and classification model based on the AWS cloud that aims to classify faces into two classes, a non-permission class and a permission class, by training on a real data set collected from our cameras. The proposed cloud-based Convolutional Neural Network (CNN) system shares computational resources for Artificial Neural Networks (ANN) to reduce redundant computation. The test system uses Internet of Things (IoT) services through our cameras.
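The abstract does not give the network architecture or the data pipeline, so the following is only a minimal sketch, assuming a small Keras CNN and a hypothetical directory layout (train/permission, train/non_permission); it is not the authors' model.

```python
# Minimal sketch (not the authors' exact architecture): a small CNN that
# classifies face images into "permission" vs "non-permission".
# Directory names and image size are illustrative assumptions.
import tensorflow as tf

IMG_SIZE = (96, 96)

# Hypothetical directory layout: train/permission, train/non_permission
train_ds = tf.keras.utils.image_dataset_from_directory(
    "train", image_size=IMG_SIZE, batch_size=32, label_mode="binary")

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # permission probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```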
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X) under assumptions such as homogeneity of variance. Here the dependent variable is a binary response that takes two values (one when a specific event occurs and zero when it does not), such as injured/uninjured or married/unmarried. A large number of explanatory variables gives rise to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were therefore used to estimate the binary-response logistic regression model, adopting the Jackknife technique.
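As a minimal sketch of the comparison described here (not the authors' Jackknife-based procedure), one can fit a binary-response logistic model both by plain maximum likelihood and with an L2 (ridge) penalty on simulated, highly collinear data; scikit-learn is assumed below.

```python
# Minimal sketch, not the authors' procedure: binary-response logistic
# regression fit by (i) plain maximum likelihood and (ii) an L2 (ridge)
# penalty, on simulated data with highly correlated explanatory variables.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)       # nearly collinear with x1
X = np.column_stack([x1, x2])
p = 1 / (1 + np.exp(-(0.5 + 1.0 * x1 - 0.5 * x2)))
y = rng.binomial(1, p)                         # 1 = event occurred, 0 = not

# penalty=None needs scikit-learn >= 1.2; use penalty="none" in older versions
mle = LogisticRegression(penalty=None).fit(X, y)            # maximum likelihood
ridge = LogisticRegression(penalty="l2", C=0.5).fit(X, y)   # ridge-penalized
print("MLE coefficients:  ", mle.coef_)
print("Ridge coefficients:", ridge.coef_)
```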
Digital Elevation Models (DEMs) are thus simply regular grids of elevation measurements over the land surface. The aim of the present work is to produce a high-resolution DEM for the investigated region (Baghdad University campus, College of Science). The easting and northing of 90 locations, including the ground surface and buildings of the studied area, were obtained by field survey using the Global Positioning System (GPS). The image of the investigated area was extracted from the QuickBird satellite sensor (with a spatial resolution of 0.6 m) and was geo-referenced and rectified using a first-order polynomial transformation. Several interpolation methods were used to estimate the elevation, such as ordinary Kriging and inverse distance weighting.
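Of the interpolation methods mentioned, inverse distance weighting is the simplest to sketch. The following is an illustrative numpy implementation with stand-in coordinates, not the surveyed Baghdad University data.

```python
# Minimal sketch of inverse distance weighting (IDW), one of the
# interpolation methods the abstract mentions; coordinates are illustrative.
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Interpolate elevations at query points from scattered measurements."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)               # closer points weigh more
    return (w @ z_known) / w.sum(axis=1)

# 90 surveyed points (easting, northing) with elevations, here random stand-ins
rng = np.random.default_rng(1)
pts = rng.uniform(0, 1000, size=(90, 2))
elev = 30 + 0.01 * pts[:, 0] + rng.normal(scale=0.5, size=90)

# Regular DEM grid over the same area
gx, gy = np.meshgrid(np.linspace(0, 1000, 50), np.linspace(0, 1000, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
dem = idw(pts, elev, grid).reshape(gx.shape)   # 50 x 50 elevation grid
```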
In this paper, we propose a method to estimate missing values of the explanatory variables in a non-parametric multiple regression model and compare it with the arithmetic-mean imputation method. The idea is to exploit the causal relationship between the variables to obtain an efficient estimate of the missing value. We rely on the kernel estimate given by the Nadaraya-Watson estimator, use least squares cross-validation (LSCV) to estimate the bandwidth, and carry out a simulation study to compare the two methods.
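A minimal sketch of the estimator described here, assuming a Gaussian kernel and a simple grid search for the LSCV bandwidth; imputing a missing value then amounts to evaluating the fitted regression at the observed covariate.

```python
# Minimal sketch: Nadaraya-Watson kernel regression with a Gaussian kernel,
# bandwidth chosen by leave-one-out least squares cross-validation (LSCV).
import numpy as np

def nw_estimate(x0, x, y, h):
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)     # Gaussian kernel weights
    return np.sum(w * y) / np.sum(w)

def lscv_bandwidth(x, y, grid):
    """Pick h minimizing the leave-one-out squared prediction error."""
    best_h, best_err = None, np.inf
    for h in grid:
        err = 0.0
        for i in range(len(x)):
            mask = np.arange(len(x)) != i
            err += (y[i] - nw_estimate(x[i], x[mask], y[mask], h)) ** 2
        if err < best_err:
            best_h, best_err = h, err
    return best_h

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 100)
y = np.sin(x) + rng.normal(scale=0.2, size=100)   # variable with missing cases
h = lscv_bandwidth(x, y, np.linspace(0.1, 2.0, 20))
imputed = nw_estimate(4.5, x, y, h)               # estimate at observed x = 4.5
```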
In this article we study a single stochastic process model for evaluating asset pricing and stock returns, namely a Lévy model built on Brownian subordination, the so-called Normal Inverse Gaussian (NIG) process. The article estimates the parameters of this model using two methods, method of moments estimation (MME) and maximum likelihood estimation (MLE), and then employs these parameter estimates to study the stock returns and asset pricing of the United Bank and the Bank of the North, whose data were taken from the Iraq Stock Exchange.
The results showed a preference for MLE over MME based on the mean squared error criterion of comparison.
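A minimal sketch of the MLE fit on synthetic data (the banks' series from the Iraq Stock Exchange are not reproduced here), using scipy's norminvgauss distribution; the method-of-moments fit is omitted for brevity.

```python
# Minimal sketch (synthetic data, not the banks' series from the Iraq Stock
# Exchange): simulate Normal Inverse Gaussian (NIG) log-returns and recover
# the parameters by maximum likelihood with scipy's norminvgauss.
import numpy as np
from scipy.stats import norminvgauss

rng = np.random.default_rng(3)
a_true, b_true, loc_true, scale_true = 2.0, 0.3, 0.0, 0.01   # assumed values
returns = norminvgauss.rvs(a_true, b_true, loc=loc_true,
                           scale=scale_true, size=2000, random_state=rng)

# MLE fit of (a, b, loc, scale); the abstract also compares this with a
# method-of-moments fit, which is omitted here.
a_hat, b_hat, loc_hat, scale_hat = norminvgauss.fit(returns)
print("MLE estimates:", a_hat, b_hat, loc_hat, scale_hat)
```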
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, creating a random distribution for the model parameters, which are dependent on time t. The LHS technique allows the MLHFD method to produce fast variation of the parameter values over a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is then integrated with the FD method to solve the model.
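The abstract does not list the cocaine abuse model equations, so the sketch below only illustrates the MLHFD idea under assumed ingredients: Latin hypercube samples of two hypothetical parameters drive repeated forward-Euler finite-difference solves of a placeholder ODE, and the mean trajectory is taken over all samples.

```python
# Minimal sketch of the MLHFD idea under assumed ingredients: Latin hypercube
# samples of the model parameters drive repeated finite-difference (forward
# Euler) solves, and the mean trajectory over all samples is reported.
# A simple logistic-decay ODE stands in for the cocaine abuse model, whose
# equations are not given in the abstract.
import numpy as np
from scipy.stats import qmc

n_samples, n_steps, dt = 1000, 200, 0.05
sampler = qmc.LatinHypercube(d=2, seed=4)
# Scale unit-cube samples to assumed ranges for hypothetical (alpha, beta)
params = qmc.scale(sampler.random(n_samples),
                   l_bounds=[0.1, 0.01], u_bounds=[0.5, 0.1])

trajectories = np.empty((n_samples, n_steps + 1))
for k, (alpha, beta) in enumerate(params):
    u = 0.2                                    # initial abusing fraction
    trajectories[k, 0] = u
    for n in range(n_steps):                   # forward-Euler FD step
        u = u + dt * (alpha * u * (1 - u) - beta * u)
        trajectories[k, n + 1] = u

mean_solution = trajectories.mean(axis=0)      # mean Latin hypercube FD result
```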