The usual methods of distance determination in astronomy (parallax, spectroscopic, and expansion methods) are seldom applicable to nebulae. In this work, distances to individual nebulae, and hence their distances from the Earth, are determined and discussed. The accuracy of these distances is tested using the Aladin Sky Atlas and by comparing nebular properties derived from them with results from statistical distance determinations. The results show that angular expansion may occur in a part of a nebula that moves at a velocity different from the observed velocity. A comparison of our spectroscopic distances with trigonometric distances shows that the spectroscopic distances are 55% larger. Since trigonometric parallaxes with large relative measurement errors can introduce systematic errors, we carried out a Monte Carlo simulation of the biases introduced by selection effects and measurement errors. It turns out that a difference between the two distance scales of the observed size is expected for the present-day data if the underlying distance scales are identical.
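To illustrate the kind of bias experiment described above, the following is a minimal sketch of a Monte Carlo simulation of measurement errors and selection effects on trigonometric distances. The assumed population model, the 50% relative parallax error, and the 2-sigma selection cut are illustrative assumptions, not the values used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: true distances (pc) drawn with uniform space density
n = 100_000
d_true = 1000.0 * rng.random(n) ** (1.0 / 3.0)   # uniform density out to 1 kpc
plx_true = 1.0 / d_true                           # true parallax (arcsec)

# Gaussian measurement errors with a large relative uncertainty, as in the text
sigma_plx = 0.5 * plx_true                        # 50% relative error (illustrative)
plx_obs = plx_true + rng.normal(0.0, sigma_plx)

# Keep only objects with nominally "significant" observed parallaxes,
# a simple stand-in for the selection effects mentioned in the abstract
sel = plx_obs > 2.0 * sigma_plx
d_trig = 1.0 / plx_obs[sel]                       # trigonometric distance estimate

# If the other distance scale were unbiased it would recover d_true; the ratio
# below shows the systematic offset produced by errors plus selection alone
ratio = np.median(d_true[sel] / d_trig)
print(f"median (true / trigonometric) distance ratio: {ratio:.2f}")
```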
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, the SVM is widely used because it selects an optimal hyperplane that separates the two classes. The SVM achieves very good accuracy and is extremely robust compared with other classification methods such as logistic regression, random forest, k-nearest neighbors, and the naïve model. However, working with large datasets causes problems such as long training times and inefficient results. In this paper, the SVM is modified by using a stochastic gradient descent procedure. The modified method, stochastic gradient descent SVM (SGD-SVM), is evaluated on two simulated datasets. Since the classification of different ca
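A minimal sketch of a linear SVM trained by stochastic gradient descent on the regularized hinge loss (a Pegasos-style update) is shown below; the synthetic two-class data and the hyperparameters are illustrative only and do not reproduce the paper's SGD-SVM implementation or its simulation datasets.

```python
import numpy as np

def sgd_svm(X, y, lam=0.01, epochs=20, seed=0):
    """Linear SVM fit by stochastic (sub)gradient descent on the regularized
    hinge loss: lam/2 * ||w||^2 + mean(max(0, 1 - y * (Xw + b)))."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)                 # decaying step size
            margin = y[i] * (X[i] @ w + b)
            w *= (1.0 - eta * lam)                # gradient of the L2 penalty
            if margin < 1.0:                      # subgradient of the hinge loss
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

# Illustrative two-class data (not the paper's simulation datasets)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 1, (200, 2)), rng.normal(+1, 1, (200, 2))])
y = np.array([-1] * 200 + [+1] * 200)
w, b = sgd_svm(X, y)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```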
In this paper, we are mainly concerned with estimating the cascade reliability model (2+1) based on the inverted exponential distribution and with comparing the estimation methods used. The maximum likelihood estimator and the uniformly minimum variance unbiased estimator are used to estimate the strengths and the stress (k = 1, 2, 3). Then, using the unbiased estimators, we propose a preliminary test single-stage shrinkage (PTSSS) estimator for the case where prior knowledge of the scale parameter is available as an initial value from past experience. The mean squared error (MSE) of the proposed estimator is derived in order to compare the methods. Numerical results on the behavior of the considered
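As a rough illustration of the estimation setting, the sketch below samples from an inverted exponential distribution, computes the maximum likelihood estimate of its scale parameter, and compares its Monte Carlo MSE with that of a simple shrinkage estimator pulling the MLE toward a prior guess. The fixed shrinkage weight is a simplified stand-in for the PTSSS rule, and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def rinvexp(theta, size, rng):
    """Sample from an inverted exponential distribution with scale theta:
    if E ~ Exp(rate=theta) then X = 1/E has pdf (theta / x^2) * exp(-theta / x)."""
    return 1.0 / rng.exponential(scale=1.0 / theta, size=size)

def mle_theta(x):
    """Maximum likelihood estimate of the scale: theta_hat = n / sum(1/x_i)."""
    return len(x) / np.sum(1.0 / x)

def shrinkage_theta(x, theta0, k=0.5):
    """Illustrative shrinkage estimator toward a prior guess theta0 with weight k
    (a simplified stand-in for the preliminary test single-stage shrinkage rule)."""
    return k * theta0 + (1.0 - k) * mle_theta(x)

# Monte Carlo comparison of MSE for the two estimators (illustrative setup)
theta_true, theta0, n, reps = 2.0, 1.8, 30, 5000
mle_vals = np.array([mle_theta(rinvexp(theta_true, n, rng)) for _ in range(reps)])
shr_vals = np.array([shrinkage_theta(rinvexp(theta_true, n, rng), theta0) for _ in range(reps)])

print("MSE (MLE):       ", np.mean((mle_vals - theta_true) ** 2))
print("MSE (shrinkage): ", np.mean((shr_vals - theta_true) ** 2))
```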
The importance of the research lies in preparing exercises that use a proposed device to learn the human wheel skill on the ground-movements mat apparatus in artistic gymnastics. As for the research problem: through their presence as teachers and observers of this sport in the gymnastics hall, the two researchers noticed that students have difficulty performing the round off skill on the ground-movements mat apparatus. In the researchers' opinion, the reason is that these skills are taught with limited availability of assistive devices, and such devices are not used in exercises designed according to biomechanical variables, although they facilitate the learning process
The wastewater arising from pulp and paper mills is highly polluted and has to be treated before being discharged into rivers. Coagulation-flocculation using natural polymers has grown rapidly in wastewater treatment. In this work, the performance of alum and polyaluminum chloride (PACl), used alone and coupled with Fenugreek mucilage, in the treatment of pulp and paper mill wastewater was studied. The experiments were carried out as jar tests with alum, PACl, and Fenugreek mucilage dosages in the range of 50-2000 mg/L, rapid mixing at 200 rpm for 2 min, followed by slow mixing at 40 rpm for 15 min and a settling time of 30 min. The effectiveness of Fenugreek mucilage was measured by the reduction of turbidity and chemical oxygen demand (COD)
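The effectiveness figures in such jar tests are usually expressed as percent removal relative to the raw wastewater; a small sketch with hypothetical turbidity and COD readings (not the study's data) is given below.

```python
def removal_efficiency(initial, final):
    """Percent removal: 100 * (initial - final) / initial."""
    return 100.0 * (initial - final) / initial

# Hypothetical jar-test readings (illustrative values, not the study's results)
turbidity_in, turbidity_out = 850.0, 42.0      # NTU
cod_in, cod_out = 3200.0, 1150.0               # mg/L

print(f"Turbidity removal: {removal_efficiency(turbidity_in, turbidity_out):.1f}%")
print(f"COD removal:       {removal_efficiency(cod_in, cod_out):.1f}%")
```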
This research deals with building a probabilistic linear programming model representing the production operations of the Middle Refinery Company (Dura, Semawa, Najaif), considering that the demand for each product (gasoline, kerosene, gas oil, fuel oil) is a random variable following a certain probability distribution. These distributions were tested using the statistical program EasyFit and were found to be the Cauchy, Erlang, Pareto, Normal, and Generalized Extreme Value distributions.
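The distribution-fitting step can be illustrated with a short sketch that fits the candidate families named above by maximum likelihood and ranks them with a Kolmogorov-Smirnov test, similar in spirit to what EasyFit reports; the demand series is synthetic, and the use of scipy's gamma family as a stand-in for the Erlang distribution is an assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical monthly demand figures for one product (illustrative data only)
demand = rng.gamma(shape=4.0, scale=250.0, size=120)

# Candidate families mentioned in the abstract (the Erlang is a gamma with
# integer shape, so scipy's gamma is used here as a stand-in for it)
candidates = {
    "cauchy": stats.cauchy,
    "erlang (gamma)": stats.gamma,
    "pareto": stats.pareto,
    "normal": stats.norm,
    "gen. extreme value": stats.genextreme,
}

# Fit each family by maximum likelihood and rank by the Kolmogorov-Smirnov statistic
for name, dist in candidates.items():
    params = dist.fit(demand)
    ks_stat, p_value = stats.kstest(demand, dist.cdf, args=params)
    print(f"{name:20s}  KS = {ks_stat:.3f}  p = {p_value:.3f}")
```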
The proposed algorithm presented in this paper is based on the principle of translating texts from one language to another, but this idea is developed to encipher texts by using any electronic dictionary as a ciphering tool, based on the locations in the dictionary of the words that the text contains. The text file is then converted into a picture file, such as the BMP-24 format, and the picture file is transmitted to the receiver. The same algorithm is used for encryption and decryption, running in the forward direction at the sender and in the backward direction at the receiver. Visual Basic 6.0 is used to implement the proposed cryptography algorithm.
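A minimal sketch of the dictionary-location idea is given below (the paper's implementation is in Visual Basic 6.0 and also packs the result into a BMP-24 picture file, which is omitted here); the tiny word list is a hypothetical stand-in for the electronic dictionary the method assumes.

```python
# Hypothetical shared dictionary; the real method assumes a full electronic dictionary.
DICTIONARY = ["attack", "dawn", "hold", "position", "retreat", "the", "at"]
WORD_TO_INDEX = {w: i for i, w in enumerate(DICTIONARY)}

def encrypt(plaintext):
    """Replace each word by its location (index) in the shared dictionary."""
    return [WORD_TO_INDEX[w] for w in plaintext.lower().split()]

def decrypt(indices):
    """Recover the text by looking the locations back up in the same dictionary."""
    return " ".join(DICTIONARY[i] for i in indices)

cipher = encrypt("attack at dawn")
print(cipher)            # [0, 6, 1]
print(decrypt(cipher))   # attack at dawn
```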
Steganography is the art of secret communication. Its purpose is to hide the presence of information, using, for example, images as covers. The frequency domain is well suited for embedding in images, since hiding in frequency-domain coefficients is robust to many attacks. This paper proposes hiding a secret image whose size is a quarter of the cover image. A Set Partitioning in Hierarchical Trees (SPIHT) codec is used to code the secret image to achieve security. The proposed method applies the Discrete Multiwavelet Transform (DMWT) to the cover image, and the coded bit stream of the secret image is embedded in the high-frequency subbands of the transformed cover. Scaling factors α and β in the frequency domain control the quality of the stego image
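The embedding step can be sketched as follows, with a plain discrete wavelet transform (via pywt) standing in for the DMWT and a raw bitstream standing in for the SPIHT-coded secret image; the cover size, the single scaling factor alpha, and the non-blind extraction are illustrative assumptions.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)

# Hypothetical 8-bit cover and a "coded" secret bitstream; the secret is a
# quarter of the cover area, as in the abstract
cover = rng.integers(0, 256, size=(128, 128)).astype(float)
secret_bits = rng.integers(0, 2, size=(64, 64)).astype(float)

alpha = 8.0                                   # scaling factor controlling stego quality

# One-level DWT of the cover; LL is the approximation, (LH, HL, HH) the details
LL, (LH, HL, HH) = pywt.dwt2(cover, "haar")

# Embed the bitstream additively in the highest-frequency subband
HH_stego = HH + alpha * (2.0 * secret_bits - 1.0)    # map bits {0,1} -> {-1,+1}
stego = pywt.idwt2((LL, (LH, HL, HH_stego)), "haar")

# Extraction (non-blind: assumes the receiver knows the original HH subband;
# a blind scheme would recover the bits from the stego image alone)
_, (_, _, HH_recv) = pywt.dwt2(stego, "haar")
recovered = (HH_recv - HH) > 0
print("bit recovery rate:", np.mean(recovered == secret_bits.astype(bool)))
```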