The major goal of this research was to use the Euler method to determine the best starting value for eccentricity. Various heights were chosen for satellites affected by atmospheric drag. The conversion of position and velocity components into orbital elements was explained, as was the Euler integration method. The results indicated that drag deviates the satellite trajectory from a Keplerian orbit, so the Keplerian orbital elements change over time. The current analysis also showed that the Euler method is applicable only to low Earth orbits between 100 and 500 km with very small eccentricity (e = 0.001).
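To make the propagation scheme concrete, the sketch below Euler-integrates a 2-D two-body orbit with a simple exponential-atmosphere drag term. It is a minimal illustration only: the ballistic-coefficient term `bc`, the scale height (8.5 km), and the sea-level density are assumed placeholder values, not the parameters used in the study.

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
RE = 6.371e6          # mean Earth radius, m

def euler_propagate(r, v, dt, steps, bc=0.01):
    """Explicit-Euler propagation of a 2-D orbit under point-mass gravity
    plus a simple exponential-atmosphere drag term.
    bc is a hypothetical ballistic-coefficient factor (Cd*A / (2*m))."""
    x, y = r
    vx, vy = v
    for _ in range(steps):
        rad = math.hypot(x, y)
        # point-mass gravitational acceleration
        ax = -MU * x / rad**3
        ay = -MU * y / rad**3
        # assumed exponential atmosphere (scale height 8.5 km)
        h = rad - RE
        rho = 1.225 * math.exp(-h / 8500.0)
        vmag = math.hypot(vx, vy)
        ax -= bc * rho * vmag * vx   # drag opposes the velocity vector
        ay -= bc * rho * vmag * vy
        # explicit Euler update: derivatives evaluated at the step start
        x += vx * dt
        y += vy * dt
        vx += ax * dt
        vy += ay * dt
    return (x, y), (vx, vy)
```

Because drag removes energy, the specific orbital energy of a low orbit decreases over the propagation, which is what drives the drift of the Keplerian elements described above.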
A novel analytical method was developed for the determination of azithromycin. The method uses continuous flow injection analysis to enhance the chemiluminescence system of luminol, H2O2, and Cr(III). For chemiluminescence emission versus azithromycin concentration, it showed a linear dynamic range of 0.001–100 mmol L−1 with a high correlation coefficient (r) of 0.9978, and of 0.001–150 mmol L−1 with r = 0.9769. The limit of detection (L.O.D.) was found to be 18.725 ng/50 µL, based on stepwise dilution of the lowest concentration within the linear dynamic range of the calibration graph. The relative standard deviation (R.S.D.%) for n = 6 was less than 1.2%.
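The correlation coefficients quoted above come from fitting a straight calibration line to emission intensity versus concentration. A minimal sketch of that calculation is below; the concentration and intensity values in the usage example are made up for illustration and are not the paper's data.

```python
def linear_calibration(x, y):
    """Least-squares calibration line y = a + b*x and the
    Pearson correlation coefficient r of the fit."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                 # slope (sensitivity)
    a = my - b * mx               # intercept
    r = sxy / (sxx * syy) ** 0.5  # correlation coefficient
    return a, b, r

# hypothetical calibration points (concentration, intensity)
conc = [1.0, 2.0, 5.0, 10.0]
intensity = [3.0 + 2.0 * c for c in conc]
a, b, r = linear_calibration(conc, intensity)
```

An r close to 1 (as with 0.9978 above) indicates the emission responds linearly to concentration across the stated dynamic range.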
In this work, a simple and very sensitive cloud point extraction (CPE) procedure was developed for the determination of trace amounts of metoclopramide hydrochloride (MTH) in pharmaceutical dosage forms. The method is based on extraction of the azo dye that results from the coupling reaction of diazotized MTH with p-coumaric acid (p-CA) using the nonionic surfactant Triton X-114. The extracted azo dye in the surfactant-rich phase was dissolved in ethanol and detected spectrophotometrically at λmax 480 nm. The reaction was studied using both batch and CPE methods (with and without extraction), and a simple comparison between the two methods was performed. The conditions that may affect the extraction process and the sensitivity of the method were also studied.
This paper presents a new algorithm in an important research field: semantic word similarity estimation. A new feature-based algorithm is proposed for measuring word semantic similarity for the Arabic language, a highly systematic language whose words exhibit elegant and rigorous logic. The semantic similarity score between two Arabic words is calculated as a function of their common and total taxonomical features. An Arabic knowledge source is employed to extract the taxonomical features as the set of all concepts that subsume the concepts containing the compared words. Previously developed Arabic word benchmark datasets are used for optimizing and evaluating the proposed algorithm.
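A score "as a function of common and total taxonomical features" can be sketched as a set-overlap ratio. The Dice-style formula below is an illustrative assumption, not necessarily the paper's exact scoring function, and the feature sets in the example are invented.

```python
def feature_similarity(features_a, features_b):
    """Feature-based similarity: twice the number of shared taxonomical
    features divided by the total feature count of both words
    (a Dice-style score; the paper's exact formula may differ)."""
    a, b = set(features_a), set(features_b)
    common = a & b
    total = len(a) + len(b)
    return 2 * len(common) / total if total else 0.0

# hypothetical taxonomical feature sets for two words
w1 = {"entity", "object", "animal", "mammal"}
w2 = {"entity", "object", "animal", "bird"}
score = feature_similarity(w1, w2)   # 2*3 / (4+4) = 0.75
```

Words sharing more of their subsuming concepts score closer to 1, which matches the intuition that taxonomically close words are semantically similar.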
Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue degrades the performance of machine learning models because the values of some features are missing. Therefore, specific methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes (NB), was improved after imputation.
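For readers unfamiliar with SSA, the sketch below shows the standard salp swarm update rules on a toy 1-D minimization problem. It is a generic illustration of the optimizer only, with assumed parameter values; the paper's ISSA adapts SSA specifically to search for plausible imputed values, which is not reproduced here.

```python
import math
import random

def ssa_minimize(f, lb, ub, n_salps=20, iters=100, seed=1):
    """Minimal salp swarm optimizer for a 1-D objective f on [lb, ub].
    Leader salp explores around the food source (best solution found);
    followers drift toward the salp ahead of them in the chain."""
    rng = random.Random(seed)
    pos = [rng.uniform(lb, ub) for _ in range(n_salps)]
    best_x = min(pos, key=f)
    best_f = f(best_x)
    for t in range(1, iters + 1):
        # c1 decays over iterations: exploration early, exploitation late
        c1 = 2 * math.exp(-(4 * t / iters) ** 2)
        for i in range(n_salps):
            if i == 0:  # leader update around the food source
                c2, c3 = rng.random(), rng.random()
                step = c1 * ((ub - lb) * c2 + lb)
                pos[i] = best_x + step if c3 >= 0.5 else best_x - step
            else:       # follower: average with the preceding salp
                pos[i] = (pos[i] + pos[i - 1]) / 2
            pos[i] = min(max(pos[i], lb), ub)   # keep within bounds
            fi = f(pos[i])
            if fi < best_f:
                best_f, best_x = fi, pos[i]
    return best_x, best_f
```

In an imputation setting, the objective would score a candidate fill-in value (for instance by downstream classifier accuracy), and the swarm would search the feature's value range for the best one.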
In this paper, an efficient new procedure is proposed to modify the third-order iterative method of Rostom and Fuad [Saeed, R. K. and Khthr, F. W. New third-order iterative method for solving nonlinear equations. J. Appl. Sci. 7 (2011): 916-921], using three steps based on the Newton equation, the finite difference method, and linear interpolation. A convergence analysis is given to show the efficiency and performance of the new method for solving nonlinear equations. Its efficiency is demonstrated by numerical examples.
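The building blocks named above (a Newton step with a finite-difference derivative) can be sketched as follows. This is not the paper's three-step scheme, only an illustration of the Newton-plus-finite-difference ingredient it is built from; the step size `h` and tolerances are assumed values.

```python
def fd_newton(f, x0, h=1e-6, tol=1e-10, max_iter=50):
    """Newton-type iteration for f(x) = 0 with the derivative replaced
    by a central finite difference (one ingredient of such schemes;
    not the paper's exact three-step method)."""
    x = x0
    for _ in range(max_iter):
        # central finite-difference approximation of f'(x)
        d = (f(x + h) - f(x - h)) / (2 * h)
        if d == 0:
            break  # flat spot: the Newton step is undefined
        x_new = x - f(x) / d
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

root = fd_newton(lambda x: x**2 - 2, 1.0)   # approximates sqrt(2)
```

Higher-order variants like the one in the paper chain several such sub-steps per iteration to raise the convergence order beyond Newton's quadratic rate.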
In data transmission, a change in a single bit of the received data may lead to misunderstanding or to disaster. Every bit of the transmitted information has high priority, especially fields such as the receiver's address. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can efficiently detect an odd number of errors, but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, show better results but still fail to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: the 2D-Checksum me…
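The weakness of single parity described above is easy to demonstrate: flipping one bit changes the XOR of the word and is caught, while flipping two bits cancels out and slips through. The sketch below shows exactly that; the data word is arbitrary.

```python
def parity_bit(bits):
    """Even-parity bit: XOR of all data bits."""
    p = 0
    for b in bits:
        p ^= b
    return p

def parity_check(bits, p):
    """True if the received word is consistent with its parity bit."""
    return parity_bit(bits) == p

data = [1, 0, 1, 1, 0, 0, 1, 0]   # arbitrary example word
p = parity_bit(data)

one_err = data[:]
one_err[2] ^= 1                   # one flipped bit: parity changes, detected

two_err = data[:]
two_err[2] ^= 1
two_err[5] ^= 1                   # two flipped bits: parity unchanged, missed
```

Two-dimensional schemes reduce this blind spot by adding a parity bit per row and per column, so an even number of errors must align in both dimensions to escape detection.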
Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is based on different signal preprocessing techniques; therefore, developing efficient techniques is essential to achieving fast and reliable processing. Various signal preprocessing operations have been used for computer vision, including smoothing, signal analysis, resizing, sharpening, and enhancement, to reduce unwanted distortions and to support segmentation and image feature improvement. For example, to reduce the noise in a disturbed signal, smoothing kernels can be effectively used. This is achieved by convolving the disturbed signal with smoothing kernels. In addition, orthogonal moments (OMs) are a crucial …
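The smoothing-by-convolution step mentioned above can be sketched in a few lines. The box kernel and the input signal below are illustrative choices, not taken from the paper.

```python
def smooth(signal, kernel):
    """1-D smoothing by discrete convolution, same-length output,
    zero padding at the boundaries."""
    n, k = len(signal), len(kernel)
    half = k // 2
    out = []
    for i in range(n):
        acc = 0.0
        for j in range(k):
            idx = i + j - half        # kernel centered on sample i
            if 0 <= idx < n:          # zero padding outside the signal
                acc += signal[idx] * kernel[j]
        out.append(acc)
    return out

box = [1/3, 1/3, 1/3]                 # simple 3-tap averaging kernel
noisy_spike = [0.0, 0.0, 3.0, 0.0, 0.0]
smoothed = smooth(noisy_spike, box)   # the spike is spread and attenuated
```

Each output sample is a local weighted average of its neighbourhood, which suppresses isolated noise spikes while preserving the broad shape of the signal.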