The characteristics of the ionosphere vary significantly with the solar cycle, geomagnetic conditions, season, latitude, and even local time. This research focuses on the global distribution of electron temperature (Te) and ion temperature (Ti) during great and severe geomagnetic storms (GMS), their daily and seasonal variation over the years 2001-2013, and the variation of Te and Ti during GMS with plasma velocity and geographic latitude. Finally, the observed Te and Ti are compared with the values predicted by the IRI model for the two kinds of storm selected. Te, Ti, and plasma velocity data at different latitudes were taken from the Defense Meteorological Satellite Program (DMSP) satellite at 850 km altitude for great and severe geomagnetic storms from 2001 to 2013, according to availability; only 22 severe and great geomagnetic storm events occurred in the selected years, all during 2001-2005, within the maximum of solar cycle 23. The data analysis shows that, in general, the electron temperature is greater than the ion temperature, but some disturbances occur during storm time: during the day there are fluctuations in the values of Te and Ti, with Ti exceeding Te. Comparison with the Dst index shows that Te and Ti do not depend on the strength of the geomagnetic storm. The plasma velocity variation shows the same profile as the Te and Ti variation during storm time, and there is a linear relation between Te, Ti, and plasma velocity. The variation of electron and ion temperature with geographic latitude during severe and great storms shows that the ion temperature increases with latitude, reaching a maximum of approximately 80000 K at the poles.
Comparing the Te and Ti values predicted by the IRI model during the great and severe storms with the observed values shows that the predicted values are much lower than the observed ones and that the variation is nonlinear over 24 hours. From this we conclude that the model must be corrected for Te and Ti for these two kinds of storms.
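The linear relation reported between Te, Ti, and plasma velocity can be illustrated with an ordinary least-squares fit. The sketch below is a minimal Python illustration: the velocity and temperature arrays are hypothetical placeholders, not actual DMSP measurements, and the fit simply recovers a slope and intercept for each temperature.

```python
# Minimal sketch of the linear Te/Ti-vs-plasma-velocity fit described above.
# The arrays below are hypothetical placeholders, not actual DMSP measurements.
import numpy as np

velocity = np.array([200.0, 350.0, 500.0, 650.0, 800.0])       # plasma velocity (m/s), hypothetical
te = np.array([2400.0, 2650.0, 2900.0, 3150.0, 3400.0])        # electron temperature (K), hypothetical
ti = np.array([1100.0, 1250.0, 1400.0, 1550.0, 1700.0])        # ion temperature (K), hypothetical

# Least-squares slope and intercept for each temperature against velocity.
te_slope, te_intercept = np.polyfit(velocity, te, deg=1)
ti_slope, ti_intercept = np.polyfit(velocity, ti, deg=1)

print(f"Te = {te_slope:.3f} * v + {te_intercept:.1f}")
print(f"Ti = {ti_slope:.3f} * v + {ti_intercept:.1f}")
```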
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using data collected on the operating and stoppage times of the case study. A probability distribution is considered appropriate when the data lie on or close to the fitted line of the probability plot and pass a goodness-of-fit test. Minitab 17 software was used for this purpose after arranging the collected data and entering it into the program.
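A workflow similar to the Minitab probability-plot and goodness-of-fit procedure can be sketched in Python. This is only an illustration of the selection idea, assuming hypothetical operating-time data and using SciPy's maximum-likelihood fits with a Kolmogorov-Smirnov test in place of Minitab's distribution-identification report.

```python
# Sketch of selecting a reliability distribution by goodness of fit.
# The sample times below are hypothetical, not the study's collected data.
import numpy as np
from scipy import stats

operating_times = np.array([12.0, 18.5, 23.1, 30.4, 35.9, 41.2, 55.0, 62.3, 70.8, 88.1])

candidates = {
    "weibull": stats.weibull_min,
    "exponential": stats.expon,
    "lognormal": stats.lognorm,
}

for label, dist in candidates.items():
    params = dist.fit(operating_times)                              # maximum-likelihood fit
    ks_stat, p_value = stats.kstest(operating_times, dist.name, args=params)
    print(f"{label:12s}  KS={ks_stat:.3f}  p={p_value:.3f}")

# The distribution with the smallest KS statistic (largest p-value) is a reasonable
# candidate, analogous to reading how closely points follow the probability-plot line.
```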
This paper proposes a new method to tune a fractional-order PID controller. The method utilizes both analytic and numeric approaches to determine the controller parameters. The control design specifications that must be achieved by the control system are the gain crossover frequency, the phase margin, and the peak magnitude at the resonant frequency, where the latter is a new design specification suggested by this paper. These specifications result in three equations in five unknown variables. Assuming that certain relations exist between two of the variables and discretizing one of them, a performance index can be evaluated, and the optimal controller parameters that minimize this performance index are selected. As a case study, a thir…
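The discretize-and-minimize step can be illustrated with a small grid search. In the sketch below, the performance index performance_index() and the discretized variable are hypothetical stand-ins, since the abstract does not reproduce the paper's actual equations or design-specification values.

```python
# Sketch of the "discretize one variable, minimize a performance index" idea.
# The index and parameter range are hypothetical stand-ins for the paper's equations.
import numpy as np

def performance_index(lam):
    """Hypothetical performance index evaluated at a candidate fractional order."""
    # Stand-in cost: penalize deviation from a nominal order of 0.9.
    return (lam - 0.9) ** 2 + 0.05 * np.sin(5.0 * lam) ** 2

# Discretize the free variable and select the value that minimizes the index.
grid = np.linspace(0.1, 1.5, 141)
costs = np.array([performance_index(lam) for lam in grid])
best = grid[np.argmin(costs)]
print(f"optimal discretized parameter ~ {best:.3f}, J = {costs.min():.4f}")
```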
Alternative distributions to estimate the dose-response model in bioassay experiments
This research is concerned with studying five different distributions (probit, logistic, arcsine, extreme value, and one-hit) to estimate the dose-response model using maximum likelihood estimation (MLE) and the probit method. This is done by determining different weights for each distribution, in addition to finding all the relevant statistics for the vital model.
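One of the fits compared here, a probit dose-response model estimated by maximum likelihood, can be sketched with the statsmodels GLM interface. The doses and response counts below are hypothetical, and the binomial/probit GLM formulation is one common way to carry out the estimation rather than the study's exact procedure.

```python
# Sketch of a probit dose-response fit by maximum likelihood.
# Doses and response counts are hypothetical, not the study's bioassay data.
import numpy as np
import statsmodels.api as sm

dose = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # administered dose, hypothetical
n = np.array([50, 50, 50, 50, 50])               # subjects per dose group
responders = np.array([4, 12, 24, 38, 47])       # responders per group, hypothetical

X = sm.add_constant(np.log10(dose))              # probit line fitted against log10(dose)
y = np.column_stack([responders, n - responders])  # successes / failures for a binomial GLM

probit_model = sm.GLM(y, X, family=sm.families.Binomial(link=sm.families.links.Probit()))
result = probit_model.fit()                       # maximum-likelihood estimate via IRLS
print(result.params)                              # intercept and slope of the probit line
```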
Learning to program is among the top challenges in computer science education. As part of addressing this, program visualization (PV) is used as a tool to overcome the high failure and drop-out rates in introductory programming courses. Nevertheless, there are rising concerns about the effectiveness of existing PV tools, following the mixed results derived from various studies. Student engagement is also considered a vital factor in building a successful PV, and it is an important part of the learning process in general. Several techniques have been introduced to enhance PV engagement; however, student engagement with PV remains challenging. This paper employed three theories: constructivism, social constructivism, and cognitive load theory…
Stroke is the second leading cause of death and one of the most common causes of disability in the world. Researchers have found that brain-computer interface (BCI) techniques can result in better rehabilitation for stroke patients. This study used the proposed motor imagery (MI) framework to analyze an electroencephalogram (EEG) dataset from eight subjects in order to enhance MI-based BCI systems for stroke patients. The preprocessing portion of the framework comprises conventional filters and the independent component analysis (ICA) denoising approach. Fractal dimension (FD) and the Hurst exponent (Hur) were then calculated as complexity features, and Tsallis entropy (TsEn) and dispersion entropy (DispEn) were assessed as…
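One of the complexity features listed above, the Hurst exponent, can be estimated from how the spread of lagged differences scales with the lag. The sketch below uses a synthetic random walk rather than EEG from the study's dataset, and this particular estimator is only one common choice, not necessarily the one used in the paper.

```python
# Sketch of a Hurst-exponent complexity feature of the kind listed above.
# The signal is synthetic, not EEG from the study's dataset.
import numpy as np

def hurst_exponent(signal, max_lag=20):
    """Estimate the Hurst exponent from the scaling of lagged-difference spreads."""
    lags = np.arange(2, max_lag)
    # For a self-affine path, std(x[t+lag] - x[t]) grows roughly as lag**H.
    tau = [np.std(signal[lag:] - signal[:-lag]) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(tau), 1)
    return slope

rng = np.random.default_rng(0)
random_walk = np.cumsum(rng.standard_normal(4096))   # Brownian path, expected H close to 0.5
print(f"estimated Hurst exponent ~ {hurst_exponent(random_walk):.2f}")
```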
This study is the first record of the spider Scytodes univittata Simon, 1882 (Araneae: Scytodidae) in Baghdad, Iraq. The Scytodes univittata spiders were collected from Baghdad province in Iraq. The genus Scytodes belongs to the family Scytodidae, one of the most widely distributed families around the world; its members have six eyes and are slow moving, and spiders of the genus Scytodes are known by the common name spitting spiders. Female Scytodes univittata can be characterized by a large round cephalothorax (length 4.45 mm), an abdomen length of 3.50 mm, a total body length of 7.95 mm, a V-shaped fovea, a large triangular scutellum, and long thin legs; femur I has two rows of spines while femur IV is spineless, and the coloration is yellow…
In this research, a 4×4 factorial experiment was studied, applied in a completely randomized block design with a given number of observations. The design of experiments is used to study the effect of treatments on the experimental units and thus to obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observed values and thus increases the mean square error of the experiment. To reduce this noise, multilevel wavelet shrinkage was used as a filter for the observations, by suggesting an improved threshold that takes into account the different transformation levels based on the logarithm of the b…
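Level-dependent wavelet thresholding of the kind described can be sketched with PyWavelets. The noisy series, the db4 wavelet, and the universal threshold rule below are illustrative assumptions, not the improved threshold proposed in this research.

```python
# Sketch of level-dependent wavelet thresholding used to denoise observations.
# Signal, wavelet, and threshold rule are illustrative, not the paper's improved threshold.
import numpy as np
import pywt

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512)
clean = np.sin(2 * np.pi * 4 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)

coeffs = pywt.wavedec(noisy, "db4", level=4)           # multilevel decomposition
sigma = np.median(np.abs(coeffs[-1])) / 0.6745         # noise estimate from finest detail level

denoised_coeffs = [coeffs[0]]                          # keep approximation coefficients
for detail in coeffs[1:]:
    thr = sigma * np.sqrt(2 * np.log(detail.size))     # size-dependent universal threshold
    denoised_coeffs.append(pywt.threshold(detail, thr, mode="soft"))

denoised = pywt.waverec(denoised_coeffs, "db4")
print(f"noise std before: {np.std(noisy - clean):.3f}, "
      f"after: {np.std(denoised[:clean.size] - clean):.3f}")
```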
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results obtained according to the RMSE and NCC criteria show that the spline method gives the most accurate results compared with the other statistical methods.
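The two comparison criteria named above, RMSE and normalized cross-correlation (NCC), can be computed directly from a reference image and its enhanced version. The sketch below assumes two hypothetical grayscale arrays of the same shape rather than the study's test images.

```python
# Sketch of the RMSE and NCC comparison criteria for an enhanced image.
# The two arrays are hypothetical grayscale images, not the study's test images.
import numpy as np

def rmse(reference, enhanced):
    return np.sqrt(np.mean((reference.astype(float) - enhanced.astype(float)) ** 2))

def ncc(reference, enhanced):
    a = reference.astype(float) - reference.mean()
    b = enhanced.astype(float) - enhanced.mean()
    return np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2))

rng = np.random.default_rng(2)
reference = rng.integers(0, 256, size=(64, 64))
enhanced = np.clip(reference + rng.normal(0, 5, size=(64, 64)), 0, 255)

print(f"RMSE = {rmse(reference, enhanced):.2f}, NCC = {ncc(reference, enhanced):.4f}")
```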