In this study of the Moon's synodic month, its relationship to the mean anomaly of the lunar orbit and to the calendar date (A.D.) over a long interval (100 years), we designed a computer program that calculates the length of each synodic month and the coordinates of the Moon at the moment of new moon with high accuracy. Over the 100-year interval there are 1236 synodic months.
We found that when the new moon occurs near perigee (mean anomaly = 0°), the length of the synodic month is at a minimum; similarly, when the new moon occurs near apogee (mean anomaly = 180°), the length of the synodic month reaches a maximum. The shortest synodic month began on 2053/1/16 and lasted 29.27436 days. The longest synodic month began on 2008/11/27 and lasted 29.81442 days. The mean synodic month is 29.53109 days. We also found a relationship between the synodic month and the time of year: the shortest synodic months occur in June and July, when the Earth is near aphelion, and the longest synodic months occur in December and January, when the Earth is near perihelion.
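The dependence of synodic-month length on the Moon's mean anomaly at new moon can be illustrated with a toy first-order model. The cosine interpolation below is an assumption for illustration only, not the study's ephemeris calculation; only the minimum, maximum, and mean values come from the reported results.

```python
import math

# Values reported in the study (days)
MIN_SYNODIC = 29.27436   # new moon near perigee (mean anomaly ~ 0 deg)
MAX_SYNODIC = 29.81442   # new moon near apogee  (mean anomaly ~ 180 deg)
MEAN_SYNODIC = 29.53109

def synodic_length(mean_anomaly_deg):
    """Toy model: interpolate synodic-month length between the reported
    perigee minimum and apogee maximum with a single cosine term."""
    amplitude = (MAX_SYNODIC - MIN_SYNODIC) / 2.0
    midpoint = (MAX_SYNODIC + MIN_SYNODIC) / 2.0
    return midpoint - amplitude * math.cos(math.radians(mean_anomaly_deg))

for m in (0, 90, 180):
    print(m, round(synodic_length(m), 5))
```

At 0° the model returns the reported minimum and at 180° the reported maximum; the midpoint (29.54439 days) is close to, but not identical with, the reported mean, since the true variation is not exactly sinusoidal.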
This investigation proposed an offline signature identification system utilizing rotation compensation based on features stored in a database. The proposed system contains five principal stages: (1) data acquisition, (2) signature data file loading, (3) signature preprocessing, (4) feature extraction, and (5) feature matching. Feature extraction includes determining the center-point coordinates and the rotation-compensation angle (θ), applying rotation compensation, and determining discriminating features and statistical measures. In this work seven essential collections of features are utilized to acquire the characteristics: (i) density (D), (ii) average (A), (iii) s
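The center-point and rotation-compensation step described above can be sketched as follows. This is a minimal illustration, not the study's actual implementation; the point list and the 30° tilt angle are hypothetical examples.

```python
import math

def centroid(points):
    """Center point (mean x, mean y) of the signature's pixel coordinates."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def rotate_about(points, center, theta_rad):
    """Rotate every point about `center` by -theta to undo signature tilt."""
    cx, cy = center
    cos_t, sin_t = math.cos(-theta_rad), math.sin(-theta_rad)
    out = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        out.append((cx + dx * cos_t - dy * sin_t,
                    cy + dx * sin_t + dy * cos_t))
    return out

# Hypothetical example: a signature tilted 30 degrees is re-aligned
pts = [(0, 0), (4, 0), (2, 1)]
c = centroid(pts)
aligned = rotate_about(pts, c, math.radians(30))
```

Rotating about the centroid leaves the center point unchanged, so features measured relative to it remain comparable before and after compensation.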
Training has an effect on employees' performance. Accordingly, the person responsible for employee development must determine the most effective way to train and develop employees. Central Michigan University (CMU) has recognized the importance of providing appropriate training for employees who advise students, because these employees have a significant impact on students' educational performance. Thus, special attention to this category of employees is needed to improve advising quality. This research attempted to explore the impact of training on academic advising at CMU. Face-to-face interviews and online surveys were used as data collection tools for this study. The study scope c
This study aims to analyze the content of computer textbooks for the preparatory stage with respect to logical thinking. The researcher followed the descriptive analytical research approach (content analysis) and adopted the explicit idea as the unit of analysis. A content-analysis tool, designed around the mental processes employed during logical thinking, was used to obtain the results. The findings revealed that logical thinking skills constituted 52% of the fourth preparatory textbook and 47% of the fifth preparatory textbook.
To decrease the dependence of high-octane gasoline production on catalytic processes in petroleum refineries, and to increase the gasoline pool, the effect of adding a suggested composite octane-number-blending enhancer to motor gasoline was investigated. The enhancer is composed of a mixture of oxygenated materials (ethanol and ether) and aromatic materials (toluene and xylene), and the study used a design of experiments created with Minitab 15 statistical software. The original gasoline, before addition of the blending enhancer, has a research octane number (RON) of 79. The design of experiments, which studies the optimum volumetric percentages of the four variables (ethanol, toluene, ether, and xylene), leads
This research applies statistical methods to improve the quality of plastic cans produced at the State Company for Vegetable Oils (Al-Maamon factory), using the fraction-defective control chart (p-chart) with a fixed sample size. A sample of 450 cans per day for 30 days was inspected to determine the rejected product. Operations research with the WinQSB package for the p-chart was used to determine the test quality level required by the product specification and to verify that the process is statistically controlled. The results show a high degree of accuracy using the program and the mathematical operations (primary and secondary) used to draw the control-limit charts and to reject the statistically uncontr
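The p-chart limits for a constant sample size can be sketched in a few lines. The daily defect counts below are hypothetical placeholders, not the factory's data; only the sample size (450 cans) and the number of days (30) follow the abstract.

```python
import math

def p_chart_limits(defectives, sample_size):
    """3-sigma control limits for a p-chart with constant sample size.
    `defectives` holds the daily counts of rejected cans."""
    p_bar = sum(defectives) / (len(defectives) * sample_size)
    sigma = math.sqrt(p_bar * (1 - p_bar) / sample_size)
    ucl = p_bar + 3 * sigma
    lcl = max(0.0, p_bar - 3 * sigma)          # fraction cannot go below 0
    return p_bar, lcl, ucl

# Hypothetical daily rejected-can counts for 30 samples of 450 cans each
counts = [12, 9, 15, 11, 8, 14, 10, 13, 9, 12,
          11, 10, 16, 9, 12, 13, 8, 11, 14, 10,
          12, 9, 13, 11, 10, 15, 9, 12, 11, 10]
p_bar, lcl, ucl = p_chart_limits(counts, 450)
out_of_control = [i for i, d in enumerate(counts)
                  if not lcl <= d / 450 <= ucl]
```

Days whose defective fraction falls outside [LCL, UCL] signal a process that is not in statistical control and are candidates for investigation and removal before recomputing the limits.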
The goal of this work is to demonstrate, through the gradient observation of a class of linear ( -systems), the possibility of reducing the effect of any disturbance (pollution, radiation, infection, etc.) asymptotically, by a suitable choice of actuators for these systems. Thus, a class of ( -system) was developed based on the finite-time ( -system). Furthermore, definitions and some properties of this -system concept and of asymptotically gradient controllable systems ( -controllable) were stated and studied. More precisely, asymptotically gradient efficient actuators ensuring weak asymptotic gradient compensation ( -system) of known or unknown disturbances are examined. Consequently, under convenient hypo
Partial shading is one of the problems that affect the power production and efficiency of a photovoltaic module. A series of experiments on partial shading of a monocrystalline PV module (50 W, Isc: 3.1 A, Voc: 22 V, with 36 cells in series) was carried out. Non-linear power output responses of the module were observed under various cases of partial shading (vertical and horizontal shading of solar cells in the module). Shading a single cell (a corner cell) has the greatest impact on output energy. Horizontal or vertical shading reduced the power from 41 W to 18 W at a constant solar radiation of 1000 W/m2 under steady-state conditions. Vertical blocking a column
In this paper, point estimation of the parameter θ of the Maxwell-Boltzmann distribution has been investigated using a simulation technique. The parameter is estimated by two groups of methods: the first includes non-Bayesian estimation methods (the maximum-likelihood estimator and the moment estimator), while the second includes standard Bayesian estimation using two different priors (inverse chi-square and Jeffreys), namely the standard Bayes estimator and the Bayes estimator based on Jeffreys' prior. Comparisons among these methods were made using the mean-square-error measure. Simulations for different sample sizes were used to compare the methods.
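The non-Bayesian part of such a comparison can be sketched with standard closed-form estimators for the Maxwell scale parameter a: the MLE from E[X²] = 3a² and the moment estimator from E[X] = 2a√(2/π). This is an illustrative simulation in the spirit of the abstract, not the paper's code; the sample sizes and replication count are arbitrary choices.

```python
import math
import random

def maxwell_sample(a, n, rng):
    """Draw n Maxwell variates as the speed of a 3-D Gaussian
    with per-axis standard deviation a."""
    return [a * math.sqrt(sum(rng.gauss(0, 1) ** 2 for _ in range(3)))
            for _ in range(n)]

def mle(sample):
    """Maximum-likelihood estimate of a: a^2 = sum(x^2) / (3n)."""
    return math.sqrt(sum(x * x for x in sample) / (3 * len(sample)))

def moment(sample):
    """Method-of-moments estimate from E[X] = 2a * sqrt(2/pi)."""
    return (sum(sample) / len(sample)) / (2 * math.sqrt(2 / math.pi))

def mse(estimator, a, n, reps, rng):
    """Monte Carlo mean squared error of an estimator of a."""
    errs = [(estimator(maxwell_sample(a, n, rng)) - a) ** 2
            for _ in range(reps)]
    return sum(errs) / reps

rng = random.Random(1)
for n in (10, 50, 100):
    print(n, mse(mle, 1.0, n, 500, rng), mse(moment, 1.0, n, 500, rng))
```

Both estimators' mean squared errors shrink roughly as 1/n, which is the pattern a simulation comparison across sample sizes is designed to expose; the Bayesian estimators would slot into the same `mse` harness.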