The present study aimed to investigate the possible dose-dependent deleterious effects of 30-day Tramadol administration on selected hematological and biochemical parameters of laboratory male rats (Rattus norvegicus). The study used eighteen adult male rats randomly divided into three equal groups of six. Group 1 (control) received an intraperitoneal (i.p.) injection of normal saline solution (0.2 ml); group 2 (low dose) received i.p. Tramadol at a dose of 50 mg/kg/day; and group 3 (high dose) received i.p. Tramadol at a dose of 100 mg/kg/day for 30 days. At the end of the experimental period, the rats were sacrificed and blood was collected by cardiac puncture to examine the blood film and biochemical parameters, including aspartate transaminase (AST), alanine transaminase (ALT), urea, and glucose. The results showed a significant reduction in hemoglobin (Hb), packed cell volume (PCV), and red blood cell (RBC) count in both treated groups, and a significant elevation in the white blood cell (WBC) count that was most apparent in the lymphocyte count, while the biochemical results showed a significant increase in ALT and blood urea and a decrease in the blood glucose level, mostly in the high-dose group.
Drones have become a focus of researchers’ attention because they have entered many areas of daily life. The Tri-copter was chosen because it combines the quadcopter’s advantages of stability with rapid manoeuvrability. In this paper, the nonlinear Tri-copter model is fully derived and three controllers are applied: Proportional-Integral-Derivative (PID), Fractional-Order PID (FOPID), and Nonlinear PID (NLPID). The controllers’ parameters were tuned using the Grey Wolf Optimization (GWO) algorithm, and the resulting responses were compared. The improvement rate for the Tri-copter model of the nonlinear controller (NLPID), compared with …
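As an illustration of the tuning loop described above, the following Python sketch applies Grey Wolf Optimization to PID gains. It is a minimal sketch under stated assumptions, not the paper's implementation: the plant is a toy first-order system standing in for the Tri-copter dynamics, and the cost function, gain bounds, and pack size are chosen only for the example.

```python
import numpy as np

def pid_step(err, state, kp, ki, kd, dt=0.01):
    """One discrete PID update; state = (integral, previous_error)."""
    integral, prev = state
    integral += err * dt
    deriv = (err - prev) / dt
    u = kp * err + ki * integral + kd * deriv
    return u, (integral, err)

def track_cost(gains, setpoint=1.0, steps=500, dt=0.01):
    """Time-weighted absolute error of a PID loop on a toy plant y' = -y + u."""
    kp, ki, kd = gains
    y, state, cost = 0.0, (0.0, 0.0), 0.0
    for k in range(steps):
        err = setpoint - y
        u, state = pid_step(err, state, kp, ki, kd, dt)
        y += dt * (-y + u)
        cost += (k * dt) * abs(err)
    return cost

def gwo(cost_fn, dim=3, wolves=20, iters=50, lo=0.0, hi=20.0):
    """Grey Wolf Optimization: the three best wolves (alpha, beta, delta)
    jointly pull the rest of the pack toward promising regions."""
    rng = np.random.default_rng(0)
    X = rng.uniform(lo, hi, (wolves, dim))
    for t in range(iters):
        fit = np.array([cost_fn(x) for x in X])
        leaders = X[np.argsort(fit)[:3]].copy()   # alpha, beta, delta
        a = 2 - 2 * t / iters                     # decreases linearly 2 -> 0
        for i in range(wolves):
            new = np.zeros(dim)
            for leader in leaders:
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                new += leader - A * np.abs(C * leader - X[i])
            X[i] = np.clip(new / 3, lo, hi)
    fit = np.array([cost_fn(x) for x in X])
    return X[np.argmin(fit)]

kp, ki, kd = gwo(track_cost)
print(f"tuned gains: kp={kp:.2f}, ki={ki:.2f}, kd={kd:.2f}")
```

The three-leader update is what distinguishes GWO from single-leader swarm methods: each wolf averages the pulls of alpha, beta, and delta rather than chasing one global best.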
The objective of this work is to combine human biometric characteristics with unique computer attributes in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the biometric side, the best available methods and algorithms were studied, and the fingerprint was found to be the best, although it has some flaws. The fingerprint algorithm was therefore improved so that its performance enhances the clarity of the ridge and valley structures of fingerprint images, taking into account the estimated orientation and frequency of the neighbouring ridges. On the computer side, a computer and its components, like a human, have unique …
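A standard building block of the ridge-and-valley enhancement described above is gradient-based estimation of the local ridge orientation. The Python sketch below shows that common estimator; the block size, the synthetic demo image, and the exact post-processing are assumptions for the example, and this is not the paper's improved algorithm itself.

```python
import numpy as np

def ridge_orientation(img, block=16):
    """Least-squares ridge orientation per block, in radians.
    img: 2-D grayscale array; classic gradient-covariance estimator."""
    gy, gx = np.gradient(img.astype(float))
    h, w = img.shape
    theta = np.zeros((h // block, w // block))
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            bx = gx[i:i+block, j:j+block]
            by = gy[i:i+block, j:j+block]
            vx = np.sum(2.0 * bx * by)
            vy = np.sum(bx**2 - by**2)
            # ridge direction is perpendicular to the dominant gradient
            theta[i // block, j // block] = 0.5 * np.arctan2(vx, vy) + np.pi / 2
    return theta

# demo on a synthetic ridge pattern (hypothetical stand-in for a real scan)
y, x = np.mgrid[0:128, 0:128]
img = np.sin(0.3 * (x + 0.5 * y))   # parallel synthetic ridges
print(ridge_orientation(img).round(2))
```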
Crime is a threat to any nation’s security administration and jurisdiction. Crime analysis has therefore become increasingly important, because it assigns the time and place of crimes based on collected spatial and temporal data. However, older techniques, such as paperwork, investigative judges, and statistical analysis, are not efficient enough to predict accurately the time and location at which a crime will take place. When machine learning and data mining methods were deployed in crime analysis, however, analysis and prediction accuracy increased dramatically. In this study, various types of crime analysis and prediction using several machine learning and data mining techniques, based on …
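To make the machine-learning side concrete, here is a minimal Python sketch of a spatio-temporal crime classifier built with scikit-learn. The feature set (latitude, longitude, hour, weekday), the synthetic data, and the random-forest choice are all assumptions for illustration; the study surveys several such techniques rather than prescribing this one.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical spatio-temporal features: [latitude, longitude, hour, weekday]
rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.uniform(33.2, 33.4, n),   # latitude (illustrative range)
    rng.uniform(44.3, 44.5, n),   # longitude (illustrative range)
    rng.integers(0, 24, n),       # hour of day
    rng.integers(0, 7, n),        # day of week
])
y = rng.integers(0, 3, n)         # placeholder crime-type labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

With real incident records in place of the synthetic arrays, the same pipeline yields per-cell, per-hour crime-type predictions.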
This study examines the many methods used in risk assessment procedures in today’s construction industry. Because novel assessment methods are adopted slowly, professionals frequently resort to strategies that have already been validated as successful. For risk assessment, a precise analytical tool that uses the cost of risk as its measurement and draws on the knowledge of professionals could help bridge the gap between theory and practice. This step examines the relevant literature, sorts articles by publication year, and identifies domains and qualities. Consequently, the most significant findings are presented in a manner …
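One common analytical formulation of the cost of risk mentioned above is the expected monetary value (EMV): probability times impact, summed over the risk register. The short Python sketch below illustrates that arithmetic; the register entries and figures are hypothetical, and the paper's own tool may weight risks differently.

```python
# Hypothetical risk register: (name, probability, cost impact in $)
risks = [
    ("design change",   0.30, 120_000),
    ("material delay",  0.15,  80_000),
    ("labour shortage", 0.10,  50_000),
]

# Expected monetary value (EMV) = probability x impact, summed over risks
for name, p, impact in risks:
    print(f"{name:16s} EMV = ${p * impact:,.0f}")
total = sum(p * impact for _, p, impact in risks)
print(f"total expected cost of risk = ${total:,.0f}")
```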
Fuzzy logic is used to solve the load-flow and contingency-analysis problems, decreasing computing time and making it a better choice than the traditional methods. The proposed method is very accurate with outstanding computation time, which makes the fuzzy load flow (FLF) suitable for real-time application to small- as well as large-scale power systems. In addition, the FLF can efficiently solve the load-flow problem of ill-conditioned power systems and perform contingency analysis. The FLF method using a Gaussian membership function requires fewer iterations and less computing time than the FLF method using a triangular membership function. Using a sparsity technique for the input Ybus sparse-matrix data gi…
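To illustrate the membership-function comparison, the sketch below implements Gaussian and triangular membership functions in Python. The per-unit voltage samples and parameter values are assumptions for the example (the paper's actual fuzzy inputs are not given here); the smooth, everywhere-nonzero Gaussian shape is one intuition for why it can converge in fewer iterations than the compact-support triangular form.

```python
import numpy as np

def gaussian_mf(x, c, sigma):
    """Gaussian membership: smooth and nonzero everywhere."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def triangular_mf(x, a, b, c):
    """Triangular membership: piecewise linear with compact support [a, c]."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

v = np.linspace(0.90, 1.10, 5)   # illustrative per-unit bus voltages
print("gaussian:  ", gaussian_mf(v, c=1.0, sigma=0.03).round(3))
print("triangular:", triangular_mf(v, a=0.95, b=1.0, c=1.05).round(3))
```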
Disease diagnosis with computer-aided methods has been extensively studied and applied to the diagnosis and monitoring of several chronic diseases. Early detection and risk assessment of breast diseases based on clinical data help doctors make an early diagnosis and monitor disease progression. The purpose of this study is to exploit a Convolutional Neural Network (CNN) to discriminate breast MRI scans into pathological and healthy. A fully automated and efficient deep-feature extraction algorithm is presented that exploits the spatial information obtained from both T2W-TSE and STIR MRI sequences to discriminate between pathological and healthy breast MRI scans. The breast MRI scans are preprocessed prior to the feature …
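As a rough illustration of CNN-based discrimination, the following PyTorch sketch defines a small binary classifier whose two input channels stand in for the T2W-TSE and STIR sequences. The architecture, input size, and channel pairing are assumptions for the example, not the paper's network.

```python
import torch
import torch.nn as nn

class BreastMRIClassifier(nn.Module):
    """Tiny CNN: 2 input channels (stand-ins for T2W-TSE and STIR),
    2 output logits (pathological vs. healthy)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),          # global pooling of feature maps
        )
        self.classifier = nn.Linear(32, 2)

    def forward(self, x):
        f = self.features(x).flatten(1)       # deep feature vector per scan
        return self.classifier(f)

# usage with a random batch standing in for preprocessed scans
logits = BreastMRIClassifier()(torch.randn(4, 2, 128, 128))
print(logits.shape)   # torch.Size([4, 2])
```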
This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: the approach merges 64 bits of DES with 64 bits of AES to produce a 128-bit root key from which the remaining 15 keys are derived, and this complexity raises the level of the ciphering process; moreover, each derivation shifts the key only one bit to the right. The second is the nature of the encryption process: it uses two keys and mixes one round of DES with one round of AES to reduce execution time. The W-method deals with …
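The key-generation step described above can be sketched as follows in Python. The random bytes are hypothetical stand-ins for the 64-bit DES and AES contributions (the paper derives these from the ciphers' key schedules), and interpreting the one-bit shift as a right rotation, so that no key bits are lost across the 15 derivations, is an assumption for the example.

```python
import os

# Hypothetical 64-bit contributions from the DES and AES sides
des_part = int.from_bytes(os.urandom(8), "big")
aes_part = int.from_bytes(os.urandom(8), "big")

# Merge into a single 128-bit root key
root_key = (des_part << 64) | aes_part

def ror128(k, n=1):
    """Rotate a 128-bit integer right by n bits (assumed shift rule)."""
    return ((k >> n) | (k << (128 - n))) & ((1 << 128) - 1)

# Root key plus the 15 remaining keys, each one bit to the right
keys = [root_key]
for _ in range(15):
    keys.append(ror128(keys[-1]))

print(f"{len(keys)} keys, root: {keys[0]:032x}")
```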
This paper proposes a new method for network self-fault management (NSFM) based on two technologies: intelligent agents to automate fault-management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are independent (different types of devices). The proposed NSFM reduces the network traffic load by reducing the requests and responses between the server and clients, which achieves less downtime for each node when a fault occurs in the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. A high average efficiency is obtained, depending on the faults occurring in the system, which reaches …
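The availability and reliability metrics named above are commonly computed from fault logs via MTBF and MTTR; the Python sketch below shows one such computation. The log entries and the exponential reliability model are assumptions for illustration, not necessarily the paper's exact formulas.

```python
import math

# Hypothetical per-node fault log: (time_between_failures_h, repair_time_h)
faults = [(120.0, 0.5), (200.0, 0.8), (150.0, 0.3)]

mtbf = sum(tbf for tbf, _ in faults) / len(faults)   # mean time between failures
mttr = sum(rep for _, rep in faults) / len(faults)   # mean time to repair

availability = mtbf / (mtbf + mttr)                  # steady-state availability
reliability_24h = math.exp(-24 / mtbf)               # exponential failure model

print(f"availability: {availability:.4f}")
print(f"24-hour reliability: {reliability_24h:.4f}")
```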