The catalytic cracking of three extract lubricating oil feeds, produced as a by-product of the furfural extraction of lubricating oil base stock at the AL-Dura refinery, was carried out in a fixed-bed laboratory reactor at different operating conditions. The initial boiling points of these feeds were 140 ºC for sample (1), 86 ºC for sample (2) and 80 ºC for sample (3). The catalytic cracking runs were carried out over the temperature range 325-400 ºC, initially at atmospheric pressure, for 30 minutes over a 9.88 % HY-zeolite catalyst loading. Comparison of the conversions at the different operating conditions indicates that the highest yield, in terms of gasoline production, was obtained at 375 ºC. The cracking liquid products were distilled by standard ASTM distillation (ASTM D-86) to separate the gasoline fraction (up to 220 ºC) from the light cycle oil fraction (above 220 ºC). In terms of gasoline production, the feed with the lowest initial boiling point (80 ºC, sample 3) gave more gasoline than the other feeds (samples 1 and 2). At the best temperature (375 ºC), with sample (3) the best feed for gasoline production, the gasoline + kerosene yields were 19.315, 16.16 and 12.95 wt.% for samples 2, 3 and 1, respectively. The RON of the gasoline produced by catalytic cracking of the feed with the lowest initial boiling point was 92.3.
In this paper, the reliability and the maintenance schedules of a number of medical devices were estimated from a single variable, time (the failure times), on the assumption that the failure times of all devices follow the same distribution (the Weibull distribution).
The distribution parameters for each device were estimated by the OLS method.
The main objective of this research is to determine the optimal time for preventive maintenance of medical devices. Two methods were adopted to estimate the optimal preventive-maintenance time. The first method builds the maintenance schedule from information on the cost of maintenance and the cost of stopping work and acc…
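A minimal sketch of the kind of calculation described above, assuming median-rank regression as the OLS fitting step and a standard age-replacement cost model for choosing the preventive-maintenance interval; the failure times and the two costs (c_p for planned maintenance, c_f for failure and downtime) are illustrative placeholders, not the paper's data.

```python
# Hedged sketch: OLS (median-rank regression) fit of a Weibull distribution to
# failure times, then a numerical search for the age-replacement interval that
# minimizes the long-run cost rate.  All numbers below are illustrative.
import numpy as np

def weibull_ols_fit(times):
    """Estimate Weibull shape (beta) and scale (eta) by median-rank regression."""
    t = np.sort(np.asarray(times, dtype=float))
    n = len(t)
    ranks = (np.arange(1, n + 1) - 0.3) / (n + 0.4)      # Bernard's median ranks
    x = np.log(t)                                         # ln(t)
    y = np.log(-np.log(1.0 - ranks))                      # ln(-ln(1 - F))
    beta, intercept = np.polyfit(x, y, 1)                 # y = beta*x - beta*ln(eta)
    eta = np.exp(-intercept / beta)
    return beta, eta

def optimal_pm_interval(beta, eta, c_p, c_f, grid=None):
    """Age-replacement interval T minimizing (c_p*R(T) + c_f*F(T)) / E[min(X, T)]."""
    if grid is None:
        grid = np.linspace(0.05 * eta, 3.0 * eta, 2000)
    best_T, best_rate = None, np.inf
    for T in grid:
        ts = np.linspace(0.0, T, 400)
        R = np.exp(-(ts / eta) ** beta)                   # reliability curve on [0, T]
        expected_cycle = np.trapz(R, ts)                  # E[min(X, T)]
        rate = (c_p * R[-1] + c_f * (1.0 - R[-1])) / expected_cycle
        if rate < best_rate:
            best_T, best_rate = T, rate
    return best_T, best_rate

failure_times = [120, 340, 410, 560, 700, 820, 950]       # hypothetical failure hours
beta, eta = weibull_ols_fit(failure_times)
T_opt, rate = optimal_pm_interval(beta, eta, c_p=100.0, c_f=900.0)
print(f"beta={beta:.2f}, eta={eta:.1f}, optimal PM interval={T_opt:.0f} h")
```

With these placeholder numbers the search simply returns the interval at which the long-run cost per operating hour is smallest; an actual schedule would substitute each device's estimated parameters and real costs.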
Joining tissue is a growing challenge in surgery as technology advances and more precise and difficult operations are performed. Tissue welding using laser is a promising technique that may help advance surgical practice. Objectives: To study the ability of laser to join tissues and the optimum parameters for good tissue welding. Methods: An in-vitro study, done at the Institute of Laser, Baghdad University, during the period from October 2008 to February 2009. Diode and Nd-YAG lasers were applied, in different sessions, to sheep small intestine, with or without solder, to weld a 2-mm-long full-thickness incision. Different powers and energies were used to obtain the maximum effect. Re…
The main aim of this paper is to study how different estimators of the two unknown parameters (the shape and scale parameters) of a generalized exponential distribution behave for different sample sizes and different parameter values. In particular, Maximum Likelihood, Percentile and Ordinary Least Squares estimators were implemented for different sample sizes (small, medium, and large) and for several contrasting initial values of the two parameters. Two performance indicators, Mean Square Error and Mean Percentile Error, were used, and the comparisons between the different estimation methods were carried out using the Monte Carlo simulation technique.
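The following sketch shows one way such a Monte Carlo comparison can be set up, assuming the generalized exponential CDF F(x) = (1 - exp(-lam*x))**alpha; only the Maximum Likelihood and Ordinary Least Squares estimators are illustrated, and the sample size, true parameter values and number of replications are placeholders rather than the paper's settings.

```python
# Hedged sketch of a Monte Carlo comparison of ML and OLS estimators for the
# generalized exponential distribution with shape alpha and rate lam.
import numpy as np
from scipy.optimize import minimize

def rgenexp(n, alpha, lam, rng):
    """Draw n variates by inverting F(x) = (1 - exp(-lam*x))**alpha."""
    u = rng.uniform(size=n)
    return -np.log(1.0 - u ** (1.0 / alpha)) / lam

def mle(x):
    """Maximum-likelihood estimates of (alpha, lam)."""
    def nll(theta):
        a, l = np.exp(theta)                      # optimize on log scale for positivity
        z = 1.0 - np.exp(-l * x)
        return -(len(x) * np.log(a * l) + (a - 1) * np.log(z).sum() - l * x.sum())
    return np.exp(minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead").x)

def ols(x):
    """Least-squares estimates: fit the CDF to empirical plotting positions."""
    xs = np.sort(x)
    p = np.arange(1, len(x) + 1) / (len(x) + 1)
    def ssd(theta):
        a, l = np.exp(theta)
        return np.sum(((1.0 - np.exp(-l * xs)) ** a - p) ** 2)
    return np.exp(minimize(ssd, x0=[0.0, 0.0], method="Nelder-Mead").x)

rng = np.random.default_rng(1)
true_alpha, true_lam, n, reps = 1.5, 1.0, 50, 500
est = {"MLE": [], "OLS": []}
for _ in range(reps):
    sample = rgenexp(n, true_alpha, true_lam, rng)
    est["MLE"].append(mle(sample))
    est["OLS"].append(ols(sample))
for name, vals in est.items():
    vals = np.array(vals)
    mse = np.mean((vals - [true_alpha, true_lam]) ** 2, axis=0)
    print(f"{name}: MSE(alpha)={mse[0]:.4f}  MSE(lam)={mse[1]:.4f}")
```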
Implementation of the TSFS (Transposition, Substitution, Folding, and Shifting) algorithm as an encryption algorithm for database security had limitations in its character set and in the number of keys used. The proposed cryptosystem is based on enhancing the phases of the TSFS encryption algorithm by computing the determinant of the key matrices, which affects how the algorithm phases are implemented. These changes provided high security for the database against different types of security attacks by achieving the goals of both confusion and diffusion.
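As a loose illustration only (the abstract does not give the TSFS key-matrix details), one plausible reading of the determinant enhancement is a check that each key matrix is invertible modulo the character-set size before it is used, so that every phase remains reversible at decryption; the matrix size, modulus and key values below are assumptions.

```python
# Hedged sketch, not the authors' implementation: verify that a key matrix has a
# determinant coprime with the character-set size before using it.
import math
import numpy as np

CHARSET_SIZE = 95                      # assumed printable-ASCII character set

def det_mod(key_matrix, m=CHARSET_SIZE):
    """Integer determinant of the key matrix reduced modulo the character-set size."""
    d = round(np.linalg.det(np.asarray(key_matrix, dtype=float)))
    return int(d) % m

def key_is_usable(key_matrix, m=CHARSET_SIZE):
    """A key matrix is usable only if its determinant is coprime with the modulus."""
    return math.gcd(det_mod(key_matrix, m), m) == 1

key = [[1, 2, 3],
       [0, 1, 4],
       [5, 6, 0]]                      # hypothetical 3x3 key matrix (det = 1)
print(det_mod(key), key_is_usable(key))
```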
In this paper three techniques for image compression are implemented. The proposed techniques are a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT) and a 3-D two-level hybrid (wavelet-multiwavelet) transform. Daubechies and Haar wavelets are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the level of the 3-D transformation increases, so the compression ratio is measured for each level. To obtain good compression, image data properties were measured, such as the image entropy (He) and the percent root-…
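A small sketch of the measurements mentioned above, assuming PyWavelets' n-dimensional transform for the 3-D two-level DWT (Haar) and a simple hard threshold to define the compression ratio; the toy volume and threshold are illustrative, and the multi-wavelet (DMWT) branch is not reproduced here.

```python
# Hedged sketch: image entropy (He) of a 3-D volume and a compression ratio (CR)
# from a two-level 3-D Haar DWT with hard thresholding of small coefficients.
import numpy as np
import pywt

def image_entropy(volume):
    """Shannon entropy He of the voxel intensity histogram (bits per voxel)."""
    hist, _ = np.histogram(volume, bins=256, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def dwt3d_compression_ratio(volume, wavelet="haar", level=2, threshold=5.0):
    """CR = total coefficients / coefficients kept after thresholding."""
    coeffs = pywt.wavedecn(volume, wavelet, level=level)
    arr, _ = pywt.coeffs_to_array(coeffs)
    kept = np.count_nonzero(np.abs(arr) > threshold)
    return arr.size / max(kept, 1)

g = np.arange(32)
X, Y, Z = np.meshgrid(g, g, g, indexing="ij")
volume = 128 + 60 * np.sin(X / 6.0) * np.cos(Y / 6.0) + 2.0 * Z / 31.0  # smooth toy 3-D "image"
print("He =", round(image_entropy(volume), 3), "bits/voxel")
print("CR =", round(dwt3d_compression_ratio(volume), 2))
```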
Social networking has come to dominate the world by providing a platform for information dissemination. People often share information without knowing whether it is true. Nowadays social networks are used to gain influence in many fields, such as elections and advertising. It is not surprising that social media has become a weapon for manipulating sentiment by spreading disinformation. Propaganda is one of the systematic and deliberate attempts to influence people for political or religious gain. In this research paper, efforts were made to classify propagandist text from non-propagandist text using supervised machine learning algorithms. Data were collected from news sources from July 2018 to August 2018. After annota…
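A minimal sketch of the supervised classification step: a TF-IDF representation of the news text feeding a linear classifier, evaluated on a held-out split. The tiny in-line snippets and labels stand in for the July 2018 to August 2018 news corpus, which is not available here, and this pipeline is an assumption rather than the authors' exact model.

```python
# Hedged sketch: TF-IDF + logistic regression for propagandist vs. non-propagandist text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.metrics import classification_report

texts = [
    "Our glorious leader alone can save the nation from its enemies",
    "The council approved the annual budget after a routine vote",
    "Only traitors would question the party's heroic struggle",
    "Rainfall this week was slightly above the seasonal average",
]                                   # illustrative snippets only
labels = [1, 0, 1, 0]               # 1 = propagandist, 0 = non-propagandist

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.5, stratify=labels, random_state=42)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test), zero_division=0))
```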
A modification to the cascaded single-stage distributed amplifier (CSSDA) design using an active inductor is proposed. This modification is shown to render the amplifier suitable for high-gain operation in a small on-chip area. Microwave Office simulation of the novel design approach shows that its performance is compatible with conventional distributed amplifiers but with a smaller area. The CSSDA is suitable for optical and satellite communication systems.
The hexapod robot is a flexible mechanical robot with six legs that has the ability to walk over terrain. The hexapod robot looks like an insect, so it uses the same gaits: tripod, wave and ripple. The hexapod robot needs to remain statically stable at all times during each gait, with three or more legs continuously in contact with the ground, so that it does not fall. The margin of safe statically stable walking is called the stability margin. In this paper, the forward and inverse kinematics are derived for each of the hexapod's legs in order to simulate the hexapod robot walking in MATLAB R2010a for all gaits, and the geometry is used to derive the equations of the sub-constraint workspaces for each…
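A brief sketch of the per-leg kinematics such a simulation needs, assuming a standard 3-DOF insect-type leg (coxa, femur, tibia); the link lengths are illustrative, and the paper's actual geometry and MATLAB model are not reproduced.

```python
# Hedged sketch: forward and inverse kinematics of one 3-DOF hexapod leg.
import numpy as np

L_COXA, L_FEMUR, L_TIBIA = 0.05, 0.10, 0.12   # assumed link lengths (m)

def forward_kinematics(theta1, theta2, theta3):
    """Foot-tip position (x, y, z) in the leg's coxa frame, angles in radians."""
    r = L_COXA + L_FEMUR * np.cos(theta2) + L_TIBIA * np.cos(theta2 + theta3)
    x = r * np.cos(theta1)
    y = r * np.sin(theta1)
    z = L_FEMUR * np.sin(theta2) + L_TIBIA * np.sin(theta2 + theta3)
    return np.array([x, y, z])

def inverse_kinematics(x, y, z):
    """Joint angles (theta1, theta2, theta3) reaching foot position (x, y, z)."""
    theta1 = np.arctan2(y, x)
    r = np.hypot(x, y) - L_COXA                     # planar reach beyond the coxa
    d2 = r**2 + z**2
    c3 = (d2 - L_FEMUR**2 - L_TIBIA**2) / (2 * L_FEMUR * L_TIBIA)
    theta3 = -np.arccos(np.clip(c3, -1.0, 1.0))     # knee-down branch
    theta2 = np.arctan2(z, r) - np.arctan2(L_TIBIA * np.sin(theta3),
                                           L_FEMUR + L_TIBIA * np.cos(theta3))
    return theta1, theta2, theta3

foot = forward_kinematics(0.3, 0.4, -1.0)
print(foot, inverse_kinematics(*foot))              # round-trip check
```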
The penalized least squares method is a popular approach for high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares gives high prediction accuracy and performs estimation and variable selection at once. It yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method, and hence a robust penalized estimator and…
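One simple way to realize the idea in this passage is an iteratively re-weighted LASSO in which Huber-type weights down-weight outlying observations, giving a robust penalized least-squares fit; the sketch below follows that route with illustrative data and tuning constants, and is not the paper's exact estimator.

```python
# Hedged sketch: robust penalized least squares via Huber re-weighting of a LASSO fit.
import numpy as np
from sklearn.linear_model import Lasso

def huber_weights(residuals, c=1.345):
    """Huber weights: 1 inside the threshold, c/|r| (on a robust scale) outside it."""
    scale = np.median(np.abs(residuals)) / 0.6745 + 1e-12   # MAD-based scale
    r = residuals / scale
    return np.where(np.abs(r) <= c, 1.0, c / np.abs(r))

def robust_lasso(X, y, alpha=0.1, n_iter=10):
    """Penalized least squares with a robust (Huber) loss via IRLS re-weighting."""
    model = Lasso(alpha=alpha)
    w = np.ones(len(y))
    for _ in range(n_iter):
        model.fit(X, y, sample_weight=w)
        w = huber_weights(y - model.predict(X))
    return model

# High-dimensional toy data (p > n) with a few gross outliers in the response.
rng = np.random.default_rng(0)
n, p = 40, 100
X = rng.normal(size=(n, p))
beta = np.zeros(p); beta[:3] = [3.0, -2.0, 1.5]            # sparse true signal
y = X @ beta + rng.normal(scale=0.5, size=n)
y[:4] += 15.0                                              # outlying observations

plain = Lasso(alpha=0.1).fit(X, y)
robust = robust_lasso(X, y, alpha=0.1)
print("plain LASSO  first coefs:", np.round(plain.coef_[:3], 2))
print("robust LASSO first coefs:", np.round(robust.coef_[:3], 2))
```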