High smoke, nitrogen oxide, and particulate matter emissions are typically produced by diesel engines. Reducing exhaust emissions without significant changes to the engine's mechanical configuration is a challenging problem. Adding hydrogen to the conventional fuel is therefore a practical choice for improving diesel engine performance and reducing emissions. The air-hydrogen mixer is an essential component for converting a diesel engine to dual-fuel (hydrogen-diesel) operation without any engine modification. In this study, an air-hydrogen mixer is developed to obtain a homogeneous mixture of hydrogen and air at a stoichiometric air-fuel ratio matched to the engine speed. The mixer depends on the balance between th
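The stoichiometric target the mixer must hold can be checked with a back-of-envelope calculation from the hydrogen combustion reaction; this is a sketch for illustration only, not a value taken from the study.

```python
# Stoichiometric air-fuel ratio for hydrogen, from 2 H2 + O2 -> 2 H2O.
M_H2 = 2.016    # g/mol, molar mass of hydrogen
M_O2 = 31.998   # g/mol, molar mass of oxygen
O2_MASS_FRACTION_IN_AIR = 0.232  # air is ~23.2% oxygen by mass

# Each mole of H2 requires 0.5 mol of O2; convert that to grams of air.
air_per_mol_h2 = 0.5 * M_O2 / O2_MASS_FRACTION_IN_AIR
afr_stoich = air_per_mol_h2 / M_H2

print(f"Stoichiometric AFR for hydrogen: {afr_stoich:.1f} : 1 by mass")
# roughly 34 : 1, far leaner by mass than diesel's ~14.5 : 1
```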
In this project we analyze data for a large sample of gas-rich dwarf galaxies, including Low Surface Brightness Galaxies (LSBGs), Blue Compact Galaxies (BCGs), and dwarf Irregulars (dIr). We then study the differences between the properties of these galaxies in the radio frequency range (B-band). The data are available in the HIPASS catalogue and on McGaugh's Data Page. We also relied on the NASA/IPAC Extragalactic Database web site http://ned.ipac.caltech.edu in the data reduction. We measured the gas evolution (HI mass), the gas mass-to-luminosity ratio, and elemental abundances such as the oxygen abundance for these galaxies. Our results show a
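The HI mass measurement mentioned above is conventionally derived from the integrated 21 cm line flux and the galaxy distance via the standard single-dish relation; the numbers below are hypothetical inputs, not values from the study.

```python
# Standard HI mass estimate:
#   M_HI [M_sun] = 2.356e5 * D^2 [Mpc^2] * integral(S dv) [Jy km/s]
def hi_mass_msun(distance_mpc, flux_integral_jy_kms):
    """Return the HI gas mass in solar masses."""
    return 2.356e5 * distance_mpc**2 * flux_integral_jy_kms

# Example: a dwarf at 10 Mpc with an integrated flux of 5 Jy km/s
m_hi = hi_mass_msun(10.0, 5.0)
print(f"M_HI ~ {m_hi:.2e} M_sun")  # ~1.18e8 M_sun
```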
Wireless sensor applications are susceptible to energy constraints. Most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are two widely used strategies for reducing energy usage and extending the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly; hence, deploying effective data aggregation schemes is vital to eliminate data redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, since the selection of an appropriate clustering algorithm may yield positive results in the data aggregati
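The idea behind cluster-based aggregation can be sketched minimally: nodes report to a cluster head, which transmits one aggregated value per cluster instead of N raw readings, cutting radio traffic. This is an illustrative toy, not one of the surveyed schemes.

```python
# Minimal cluster-head aggregation sketch for a sensor network.
from statistics import mean

def aggregate_by_cluster(readings, cluster_of):
    """readings: {node_id: value}; cluster_of: {node_id: cluster_id}.
    Returns one mean value per cluster (the cluster head's single transmission)."""
    clusters = {}
    for node, value in readings.items():
        clusters.setdefault(cluster_of[node], []).append(value)
    return {cid: mean(vals) for cid, vals in clusters.items()}

readings = {"n1": 21.0, "n2": 23.0, "n3": 30.0, "n4": 32.0}
cluster_of = {"n1": "A", "n2": "A", "n3": "B", "n4": "B"}
print(aggregate_by_cluster(readings, cluster_of))  # {'A': 22.0, 'B': 31.0}
```

Four raw transmissions collapse to two aggregated ones; in a real deployment the aggregate could equally be a max, min, or count depending on the tracking query.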
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
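The Downhill Simplex (Nelder-Mead) approach to likelihood estimation can be sketched as follows. For brevity this fits a two-parameter Weibull rather than the four-parameter compound exponential Weibull-Poisson of the study; the mechanics (minimizing the negative log-likelihood with a derivative-free simplex search) are the same, and all values here are synthetic.

```python
# Downhill Simplex (Nelder-Mead) maximum-likelihood fit of a Weibull(k, lam).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
data = rng.weibull(1.5, size=500) * 2.0  # true shape k=1.5, scale lam=2.0

def neg_log_lik(params):
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf  # keep the simplex inside the valid parameter region
    x = data
    # log pdf of Weibull: log(k/lam) + (k-1)*log(x/lam) - (x/lam)^k
    return -np.sum(np.log(k / lam) + (k - 1) * np.log(x / lam) - (x / lam) ** k)

res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
k_hat, lam_hat = res.x
print(f"k_hat = {k_hat:.2f}, lam_hat = {lam_hat:.2f}")  # near the true (1.5, 2.0)
```

The simplex method needs no gradients, which is what makes it attractive when the compound distribution's likelihood is awkward to differentiate.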
In this paper, we study a nonparametric model when the response variable has missing data (non-response) in its observations under the MCAR missing-data mechanism. We then suggest kernel-based nonparametric single imputation in place of the missing values and compare it with nearest-neighbor imputation through simulation over several different models and different cases: sample size, variance, and rate of missing data.
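The two imputation ideas being compared can be sketched in miniature: a kernel-weighted (Nadaraya-Watson style) average of the observed responses versus copying the single nearest neighbor's response. This is an assumed illustrative form, not the paper's exact estimator, and the data are made up.

```python
# Kernel-based single imputation vs. nearest-neighbor imputation (1-D sketch).
import math

def gaussian_kernel(u):
    return math.exp(-0.5 * u * u)

def kernel_impute(x_miss, xs, ys, bandwidth=1.0):
    """Fill a missing y with a kernel-weighted average of observed ys."""
    weights = [gaussian_kernel((x_miss - x) / bandwidth) for x in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

def nn_impute(x_miss, xs, ys):
    """Fill a missing y by copying the nearest observed neighbor's y."""
    i = min(range(len(xs)), key=lambda j: abs(xs[j] - x_miss))
    return ys[i]

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.1, 1.1, 1.9, 3.2]  # roughly y = x
print(kernel_impute(1.5, xs, ys))  # smooth weighted average near 1.5
print(nn_impute(1.4, xs, ys))      # copies the nearest neighbor's y: 1.1
```

The kernel version blends all observed responses by proximity, while the nearest-neighbor version inherits the noise of a single donor observation, which is one intuition for why the two can differ across sample sizes and variances.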
Urban gentrification is a global phenomenon of restructuring cities at all levels. The research proposes a focused study of the concept of urban gentrification in cities, presenting its characteristics and results and how to deal with the changes that occur in cities through improvements carried out as part of urban renewal projects. The general axis of the research is then narrowed by choosing urban centers as the most important areas subject to the urban gentrification process, due to their direct connection with individuals and social change. To address the specific axis of the research, theses and studies that discuss the topic from different research directions are presented, and emerged
In the last two decades, networks have changed in response to rapidly changing requirements. Current Data Center Networks contain large numbers of hosts (tens of thousands) with special bandwidth needs as cloud networking and multimedia content computing grow. Conventional Data Center Networks (DCNs) are strained by the increased number of users and bandwidth requirements, which in turn impose many implementation limitations. Current networking devices, with their coupled control and forwarding planes, result in network architectures that are not suitable for dynamic computing and storage needs. Software-Defined Networking (SDN) is introduced to change this notion of traditional networks by decoupling control and