The present study aimed to investigate the histological and enzyme histochemical changes and the liver function effects of the insecticide Maxxthor on the liver of albino rats. The experiment included 20 rats divided into four groups: the first group of 5 rats served as the control animals, and the remaining rats were divided equally into three groups dosed with 0.01, 0.1, and 1 mg/kg of body weight, respectively, for a period of 40 days. The animals were given Maxxthor every 48 hours by oral gavage after it was dissolved in distilled water. Microscopic examination of the liver showed inflammatory cell aggregation around vessels, congestion and dilation of sinusoids, hepatocyte hypertrophy with severe inflammatory cell infiltration, Kupffer cell proliferation, and hydropic degeneration. The enzyme histochemical study of the liver showed weak expression of ALP activity in hepatocytes at the low dose, severe expression at the middle dose, and mild expression at the high dose. There was a significant increase in serum aminotransferase (ALT, AST) and alkaline phosphatase (ALP) levels in the treated groups of rats compared with the control group.
Fractal image compression depends on representing an image using affine transformations. The main concern for researchers in the discipline of fractal image compression (FIC) is to decrease the encoding time needed to compress image data. The basic premise is that each portion of the image is similar to other portions of the same image. Many models have been developed around this process. The presence of fractals was initially noticed and handled using the Iterated Function System (IFS), which is used for encoding images. In this paper, a review of fractal image compression and its variants is presented along with other techniques. A summarized review of contributions is provided to determine the fulfillment of fractal image compression
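To make the affine-transformation idea concrete, the sketch below shows the core encoding step under common assumptions (grayscale NumPy image, 8x8 range blocks, 16x16 domain blocks, brute-force search): for each range block, a domain block is downsampled and fitted with a contrast scale s and brightness offset o by least squares. Block sizes and the search strategy are illustrative choices, not the exact scheme of any reviewed paper.

```python
# Minimal sketch of the core fractal-encoding step (assumed parameters).
import numpy as np

def best_affine_match(range_block, domain_blocks):
    """Find the domain block and affine (contrast s, brightness o) map
    that minimizes the squared error against one range block."""
    best = (None, 0.0, 0.0, np.inf)
    r = range_block.astype(float).ravel()
    for idx, d in enumerate(domain_blocks):
        dv = d.astype(float).ravel()
        # least-squares fit of r ~ s * dv + o
        A = np.vstack([dv, np.ones_like(dv)]).T
        (s, o), *_ = np.linalg.lstsq(A, r, rcond=None)
        err = np.sum((s * dv + o - r) ** 2)
        if err < best[3]:
            best = (idx, s, o, err)
    return best  # (domain index, contrast, brightness, error)

def downsample(block):
    """Average 2x2 neighbourhoods so a domain block matches the range size."""
    h, w = block.shape
    return block.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

if __name__ == "__main__":
    img = np.random.randint(0, 256, (64, 64))
    B = 8                                    # range block size (assumed)
    ranges = [img[i:i+B, j:j+B] for i in range(0, 64, B) for j in range(0, 64, B)]
    domains = [downsample(img[i:i+2*B, j:j+2*B])
               for i in range(0, 64 - 2*B + 1, B) for j in range(0, 64 - 2*B + 1, B)]
    print(best_affine_match(ranges[0], domains))
```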
In this research, several estimators of the hazard function are introduced. These estimators are constructed using a nonparametric method, namely the kernel function for censored data, with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most of the cases, the results have shown that the local bandwidth is the best for all the
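As a minimal illustration of the kernel approach for censored data, the sketch below smooths the Nelson-Aalen increments with an Epanechnikov kernel and a single global bandwidth h. The local-bandwidth and boundary-kernel variants studied in the paper would replace the kernel and the fixed h; this is only the baseline form, not the proposed estimator.

```python
# Kernel-smoothed hazard estimator for right-censored data (baseline sketch).
import numpy as np

def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def kernel_hazard(t_grid, times, events, h):
    """times: observed times; events: 1 = event, 0 = censored; h: global bandwidth."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)              # number still at risk at each time
    increments = events / at_risk           # Nelson-Aalen jump sizes
    # smooth the jumps with the kernel
    return np.array([np.sum(epanechnikov((t - times) / h) * increments) / h
                     for t in t_grid])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = rng.exponential(1.0, 200)           # true hazard = 1 for Exp(1)
    c = rng.exponential(1.5, 200)           # censoring times
    obs, ev = np.minimum(t, c), (t <= c).astype(int)
    print(kernel_hazard(np.linspace(0.1, 2.0, 10), obs, ev, h=0.3))
```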
Restoration is the main process in many applications. Restoring an original image from a damaged image is the foundation of the restoration operation, whether blind or non-blind. One of the main challenges in the restoration process is estimating the degradation parameters. The degradation parameters include the blurring function (Point Spread Function, PSF) and the noise function. The most common causes of image degradation are errors in transmission channels, defects in the optical system, an inhomogeneous medium, relative motion between object and camera, etc. In our research, a novel algorithm based on the Circular Hough Transform was adopted to estimate the width (radius, sigma) of the Point Spread Function. This algorithm is based on
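The abstract does not give the full pipeline, so the following is only a hypothetical sketch of how a Circular Hough Transform could yield a PSF radius using OpenCV: detect the strongest circle in a blurred point-like response and read off its radius. The preprocessing steps and parameter values are assumptions, not the paper's exact method.

```python
# Hypothetical PSF-radius estimate via the Circular Hough Transform (OpenCV).
import cv2
import numpy as np

def estimate_psf_radius(blurred_point_image):
    """Return the radius of the strongest circle found in the image."""
    img = cv2.normalize(blurred_point_image, None, 0, 255,
                        cv2.NORM_MINMAX).astype(np.uint8)
    img = cv2.medianBlur(img, 5)
    circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                               param1=100, param2=15, minRadius=1, maxRadius=50)
    if circles is None:
        return None
    _, _, r = circles[0, 0]          # (x, y, radius) of the strongest detection
    return float(r)                  # interpreted as the PSF radius / width

if __name__ == "__main__":
    canvas = np.zeros((200, 200), np.uint8)
    cv2.circle(canvas, (100, 100), 20, 255, -1)    # synthetic point response
    canvas = cv2.GaussianBlur(canvas, (15, 15), 5)
    print(estimate_psf_radius(canvas))
```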
In this paper, we studied the scheduling of jobs on a single machine. Each of n jobs is to be processed without interruption and becomes available for processing at time zero. The objective is to find a processing order of the jobs that minimizes the sum of maximum earliness and maximum tardiness. Because this problem seeks to minimize earliness and tardiness values, the model is equivalent to a just-in-time production system. Our lower bound depended on the decomposition of the problem into two subproblems. We presented a novel heuristic approach to find a near-optimal solution for the problem. This approach depends on finding efficient solutions for two problems. The first problem is minimizing total completion time
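To make the objective concrete, the small sketch below (not the paper's heuristic) evaluates the max-earliness plus max-tardiness value of a given job sequence on a single machine, assuming no inserted idle time, and compares the earliest-due-date (EDD) order against a random order on toy data.

```python
# Evaluating the max-earliness + max-tardiness objective for a job sequence.
import random

def objective(sequence, p, d):
    """sequence: job indices; p: processing times; d: due dates."""
    t, max_e, max_t = 0, 0, 0
    for j in sequence:
        t += p[j]                       # completion time C_j (no idle time)
        max_e = max(max_e, d[j] - t)    # earliness  E_j = max(0, d_j - C_j)
        max_t = max(max_t, t - d[j])    # tardiness  T_j = max(0, C_j - d_j)
    return max_e + max_t

if __name__ == "__main__":
    p = [3, 7, 2, 5]                    # toy processing times (assumed data)
    d = [6, 14, 5, 16]                  # toy due dates
    edd = sorted(range(len(p)), key=lambda j: d[j])
    rnd = list(range(len(p))); random.shuffle(rnd)
    print("EDD   :", objective(edd, p, d))
    print("random:", objective(rnd, p, d))
```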
Digital forensics is a branch of forensic science that covers crime involving computers and other digital devices. Academic studies have been interested in digital forensics for a while. Researchers aim to establish a discipline based on scientific structures and to define a model that reflects their observations. This paper suggests a model to improve the whole investigation process and to obtain accurate and complete evidence, and it adopts securing the digital evidence with cryptographic algorithms so that reliable evidence can be presented in a court of law. This paper also presents the main and basic concepts of the frameworks and models used in digital forensics investigation.
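As one hypothetical illustration of securing evidence with cryptography in such a model, the sketch below hashes an evidence file with SHA-256 and records the digest in a simple chain-of-custody entry, so later tampering can be detected. The specific algorithms and record fields are assumptions; the paper's model may prescribe different ones.

```python
# Hypothetical evidence-integrity step: hash the file and log a custody entry.
import hashlib
import json
import time

def hash_evidence(path, chunk_size=1 << 20):
    """Return the SHA-256 digest of an evidence file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def custody_record(path, examiner):
    """Produce a simple chain-of-custody entry for the evidence item."""
    return json.dumps({
        "file": path,
        "sha256": hash_evidence(path),
        "examiner": examiner,
        "timestamp": time.time(),
    })
```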
The strong cryptography employed by PGP (Pretty Good Privacy) is one of the best available today. The PGP protocol is a hybrid cryptosystem that combines some of the best features of both conventional and public-key cryptography. This paper aims to improve the PGP protocol by combining a random genetic algorithm and the NTRU (N-th degree Truncated polynomial Ring Unit) algorithm with the PGP protocol stages in order to increase the protocol's speed and security and to make it harder for a counterfeiter to break. This can be achieved by using a genetic algorithm that generates the keys according to random genetic equations. The final keys obtained from the genetic algorithm were observed to be purely random (according to the randomness
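Since the paper's "random genetic equations" are not reproduced in this abstract, the toy sketch below only illustrates the general pattern of evolving a key bitstring with a genetic algorithm: the stand-in fitness rewards a balanced, frequently changing bit pattern, and one-point crossover with single-bit mutation drives the search. It is not a cryptographic key generator.

```python
# Toy genetic-algorithm key-bitstring generator (illustrative fitness only).
import random

def fitness(bits):
    balance = 1 - abs(bits.count(1) / len(bits) - 0.5)            # ~half ones
    runs = sum(a != b for a, b in zip(bits, bits[1:])) / (len(bits) - 1)
    return balance + runs

def evolve_key(length=128, pop_size=40, generations=100):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)                     # one-point crossover
            child = a[:cut] + b[cut:]
            child[random.randrange(length)] ^= 1                  # single-bit mutation
            children.append(child)
        pop = parents + children
    return "".join(map(str, max(pop, key=fitness)))

if __name__ == "__main__":
    print(evolve_key())
```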
Scheduling timetables for courses in large university departments is a very hard problem that has often been addressed in previous work, although the results are only partially optimal. This work implements the principle of an evolutionary algorithm, using genetic theory to solve the timetabling problem and obtain a random, fully optimal timetable with the ability to generate multiple solution timetables for each stage in the college. The main idea is to generate course timetables automatically while exploring the space of constraints to obtain an optimal and flexible schedule with no redundancy through changes to a viable course timetable. The main contribution of this work is the increased flexibility of generating optimal timetables
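The compact sketch below shows the genetic-algorithm idea for timetabling under assumed data structures: a chromosome assigns each course event to a timeslot, the fitness counts lecturer and room clashes, and uniform crossover plus mutation evolves the population. The constraint sets and operators in the actual work may differ.

```python
# Genetic-algorithm timetabling sketch with assumed events, rooms, and slots.
import random

EVENTS = [("Math", "Dr.A", "Room1"), ("Physics", "Dr.B", "Room1"),
          ("Logic", "Dr.A", "Room2"), ("DB", "Dr.C", "Room2")]
SLOTS = list(range(10))                     # e.g. 5 days x 2 periods (assumed)

def clashes(chromosome):
    """Count lecturer/room double-bookings for a slot assignment."""
    penalty = 0
    for i in range(len(EVENTS)):
        for j in range(i + 1, len(EVENTS)):
            if chromosome[i] == chromosome[j]:
                penalty += (EVENTS[i][1] == EVENTS[j][1])   # same lecturer
                penalty += (EVENTS[i][2] == EVENTS[j][2])   # same room
    return penalty

def ga_timetable(pop_size=30, generations=200):
    pop = [[random.choice(SLOTS) for _ in EVENTS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=clashes)               # fewer clashes = fitter
        survivors = pop[: pop_size // 2]
        pop = survivors + [
            [random.choice(genes) for genes in zip(*random.sample(survivors, 2))]
            for _ in range(pop_size - len(survivors))       # uniform crossover
        ]
        mutant = random.choice(pop)
        mutant[random.randrange(len(EVENTS))] = random.choice(SLOTS)
    return min(pop, key=clashes)

if __name__ == "__main__":
    best = ga_timetable()
    print(best, "clashes:", clashes(best))
```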
A new human-based heuristic optimization method, named the Snooker-Based Optimization Algorithm (SBOA), is introduced in this study. The inspiration for this method is drawn from the traits of sales elites, the qualities every salesperson aspires to possess. Typically, salespersons strive to enhance their skills through autonomous learning or by seeking guidance from others. Furthermore, they engage in regular communication with customers to gain approval for their products or services. Building upon this concept, SBOA aims to find the optimal solution within a given search space, traversing all positions to obtain all possible values. To assess the feasibility and effectiveness of SBOA in comparison to other algorithms, we conducted
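The abstract does not state SBOA's update equations, so the sketch below only illustrates the generic population-based pattern it describes: each candidate partly imitates the current elite (learning from others) and partly explores at random (autonomous learning), evaluated here on a standard test function.

```python
# Generic population-based search pattern (not SBOA's actual equations).
import random

def sphere(x):                               # standard benchmark: minimum at 0
    return sum(v * v for v in x)

def population_search(f, dim=5, pop_size=20, iters=300, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=f)
    for _ in range(iters):
        for i, x in enumerate(pop):
            new = [xi + random.random() * (bi - xi)           # imitate the elite
                   + 0.1 * random.uniform(-(hi - lo), hi - lo)  # explore on its own
                   for xi, bi in zip(x, best)]
            new = [min(max(v, lo), hi) for v in new]          # keep inside bounds
            if f(new) < f(x):                                 # greedy acceptance
                pop[i] = new
        best = min(pop + [best], key=f)
    return best, f(best)

if __name__ == "__main__":
    print(population_search(sphere))
```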
In this paper, two ranking functions are employed to treat the fuzzy multiple objective (FMO) programming model, using two kinds of membership function: the first is the ordinary trapezoidal fuzzy (TF) membership function, and the second is a weighted trapezoidal fuzzy membership function. When the objective function is fuzzy, the fuzzy model should be transformed and reduced to a traditional (crisp) model; finally, these models are solved to determine which one is better.
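For readers unfamiliar with the building blocks, the sketch below defines a trapezoidal fuzzy number with its membership function and one commonly used ranking (the average of the four parameters, a simple defuzzification). The two ranking functions and the weighted membership used in the paper may differ from this example.

```python
# Trapezoidal fuzzy number: membership function and a simple ranking.
from dataclasses import dataclass

@dataclass
class TrapezoidalFuzzy:
    a: float                  # parameters with a < b <= c < d assumed
    b: float
    c: float
    d: float

    def membership(self, x):
        if self.a <= x < self.b:
            return (x - self.a) / (self.b - self.a)   # rising edge
        if self.b <= x <= self.c:
            return 1.0                                # plateau
        if self.c < x <= self.d:
            return (self.d - x) / (self.d - self.c)   # falling edge
        return 0.0

    def rank(self):
        """Defuzzified value used to compare fuzzy coefficients."""
        return (self.a + self.b + self.c + self.d) / 4.0

if __name__ == "__main__":
    cost = TrapezoidalFuzzy(2, 3, 5, 6)
    print(cost.membership(4.0), cost.rank())          # 1.0 and 4.0
```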
The purpose of this paper is to introduce and study the concepts of fuzzy generalized open sets, fuzzy generalized closed sets, and generalized continuous fuzzy proper functions, and to prove results about these concepts.
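For orientation, the block below restates the standard definitions of fuzzy generalized closed and open sets from the fuzzy-topology literature; the paper's own definitions may be phrased differently.

```latex
% Standard definitions (the paper's formulations may differ):
% a fuzzy set \lambda in a fuzzy topological space (X,\tau) is
% fuzzy generalized closed (fg-closed) if its closure is contained in every
% fuzzy open set containing \lambda, and fuzzy generalized open if its
% complement 1-\lambda is fg-closed.
\[
  \lambda \ \text{is fg-closed} \iff
  \bigl(\forall \mu \in \tau:\ \lambda \le \mu \implies \operatorname{cl}(\lambda) \le \mu\bigr),
  \qquad
  \lambda \ \text{is fg-open} \iff 1-\lambda \ \text{is fg-closed}.
\]
```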