Body odour is the smell caused by bacteria feeding on sweat on the skin, especially in the armpit and groin areas. Fifty-four volunteers, students and employees of the College of Education Ibn Al-Haitham, were surveyed. Data were collected on subject details and microbial examination. The following conclusions were reached: 1) coagulase-negative Staphylococcus was the most common isolate; 2) the most effective antibiotics were, in order, amikacin, ciprofloxacin, vancomycin, cephalothin, tobramycin, and gentamicin, while the isolates were least sensitive to methicillin and penicillin G; 3) aluminium zirconium and aluminium chlorohydrate were the most effective antiperspirants.
In this paper, two local search algorithms, the genetic algorithm and particle swarm optimization, are used to schedule a number of products (n jobs) on a single machine to minimize a multi-objective function comprising total completion time, total tardiness, total earliness, and total late work. A branch and bound (BAB) method is used to compare the results for n ranging from 5 to 18 jobs. The results show that the two algorithms find optimal and near-optimal solutions in appropriate time.
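The four objectives named above can be evaluated for any job sequence in one pass over the schedule. The sketch below is illustrative only (the job data and function names are hypothetical, not the paper's), assuming the standard definitions of completion time, tardiness, earliness, and late work:

```python
# Minimal sketch of the multi-objective cost for single-machine scheduling.
# p[j] = processing time of job j, d[j] = due date of job j (hypothetical data).

def schedule_cost(sequence, p, d):
    """Return (sum C_j, sum T_j, sum E_j, sum V_j) for a job order."""
    t = 0
    total_C = total_T = total_E = total_V = 0
    for j in sequence:
        t += p[j]                                # completion time C_j
        total_C += t
        total_T += max(0, t - d[j])              # tardiness T_j
        total_E += max(0, d[j] - t)              # earliness E_j
        total_V += min(p[j], max(0, t - d[j]))   # late work V_j
    return total_C, total_T, total_E, total_V

p = [3, 1, 4]   # processing times (illustrative)
d = [4, 2, 6]   # due dates (illustrative)
print(schedule_cost([1, 0, 2], p, d))
```

A metaheuristic such as a genetic algorithm or PSO would call a function like this to score each candidate permutation.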
Wireless sensor networks (WSNs) are emerging in various applications such as military, area monitoring, health monitoring, and industry monitoring. A key challenge for a successful WSN application is energy consumption, since the small, portable batteries integrated into the sensor chips cannot be recharged easily from an economical point of view. This work focuses on prolonging the network lifetime of WSNs by reducing and balancing energy consumption during the routing process from a hop-count point of view. In this paper, a performance simulation was carried out between two protocols, LEACH, which uses a single-hop path, and MODLEACH, which uses a multi-hop path, on an Intel Core i3 CPU (2.13 GHz) laptop running MATLAB (R2014a).
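The single-hop versus multi-hop trade-off can be illustrated with the first-order radio energy model commonly used in LEACH-style simulations. The constants and packet size below are typical textbook defaults, not values taken from this paper:

```python
# First-order radio model sketch: transmitting k bits over distance d costs
# E_elec*k + eps_fs*k*d^2; receiving k bits costs E_elec*k. Multi-hop lowers
# the amplifier (d^2) term but pays extra electronics energy at each relay.

E_ELEC = 50e-9    # electronics energy, J/bit (assumed typical value)
EPS_FS = 10e-12   # free-space amplifier energy, J/bit/m^2 (assumed)

def tx_energy(k_bits, d):
    return E_ELEC * k_bits + EPS_FS * k_bits * d ** 2

def rx_energy(k_bits):
    return E_ELEC * k_bits

k = 4000  # packet size in bits (illustrative)
single = tx_energy(k, 100)                     # one 100 m hop
multi = 2 * tx_energy(k, 50) + rx_energy(k)    # two 50 m hops, relay receives
print(single, multi)
```

Comparing the two totals for different distances and hop counts is essentially what a hop-count-oriented routing simulation sweeps over.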
There is evidence that channel estimation in communication systems plays a crucial role in recovering transmitted data. In recent years, there has been increasing interest in solving problems of channel estimation and equalization, especially when the channel impulse response follows a fast time-varying Rician fading distribution, meaning the channel impulse response changes rapidly. Therefore, optimal channel estimation and equalization are needed to recover the transmitted data. This paper compares the epsilon-normalized least mean square (ε-NLMS) and recursive least squares (RLS) algorithms by computing their ability to track a fast time-varying Rician fading channel with different values of Doppler
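The ε-NLMS update referred to above has the standard form w ← w + μ·e·x/(ε + ‖x‖²) with a-priori error e = d − w·x. The following is a minimal sketch under that standard form, not the paper's implementation; the channel taps and parameters are illustrative:

```python
# Sketch of epsilon-normalized LMS (eps-NLMS) adapting a FIR channel estimate.
import numpy as np

def nlms(x, d, taps=4, mu=0.5, eps=1e-6):
    """Adapt FIR weights toward the channel; returns weights and error history."""
    w = np.zeros(taps)
    errors = []
    for n in range(taps - 1, len(x)):
        xn = x[n - taps + 1:n + 1][::-1]     # regressor, most recent sample first
        e = d[n] - w @ xn                    # a-priori estimation error
        w += mu * e * xn / (eps + xn @ xn)   # normalized weight update
        errors.append(e)
    return w, errors

rng = np.random.default_rng(0)
h = np.array([0.8, -0.3, 0.1, 0.05])         # hypothetical (static) channel
x = rng.standard_normal(2000)
d = np.convolve(x, h)[:len(x)]               # channel output, noise-free
w, errors = nlms(x, d)
print(np.round(w, 2))                        # weights should approach h
```

A tracking comparison against RLS would replace the static h with a time-varying (e.g., Rician-fading) tap sequence and compare the error histories.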
Wet granulation was used instead of the conventional pan-coating or fluidized-bed coating technique to prepare enteric-coated diclofenac sodium granules, using an ethanolic solution of Eudragit™ L100 as coating, binding, and granulating agent. The addition of PEG 400 or di-n-butyl phthalate as a plasticizer was found to improve the enteric property of the coat.
Part of the resulting granules was filled into hard gelatin capsules (size 0), while the other part was compressed into tablets with and without a disintegrant.
The release profiles of these two dosage forms in 0.1 N HCl (pH 1.2) for 2 hours and in phosphate buffer (pH 6.8) for 45 minutes, as well as the release kinetics, were compared with those of the en
Immune-mediated hepatitis is a severe threat to human health, and no effective treatment is currently available. Therefore, new, safe, low-cost therapies are desperately required. Berbamine (BE), a natural substance obtained primarily from
The Tanuma and Zubair formations are known as the most problematic intervals in the Zubair Oilfield, causing wellbore instability due to possible shale-fluid interaction. This results in a vast loss of time dealing with various downhole problems (e.g., stuck pipe), which increases overall well cost through their consequences (e.g., fishing and sidetracking). This paper aims to test shale samples with various laboratory tests for shale evaluation and drilling-mud development. The shale's physical properties are described using a stereomicroscope, and its structures are observed with a scanning electron microscope. The shale reactivity and behavior are analyzed using cation exchange capacity testing, and the capillary suction test is
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environments. However, this creates a need to store huge amounts of data with large storage and high computational capabilities. Cloud computing can be used to store such big data. The data of IoT devices are transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to achieve a high-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual r
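Dynamic workload distribution of the kind described can be sketched with a least-loaded dispatch strategy: each incoming task goes to whichever node currently carries the lightest load. This is a generic illustration, not the paper's system; the function and data are hypothetical:

```python
# Least-loaded dispatch sketch: a min-heap keyed on current node load.
import heapq

def balance(loads, tasks):
    """Assign each task's cost to the currently least-loaded node."""
    heap = [(load, node) for node, load in enumerate(loads)]
    heapq.heapify(heap)
    assignment = []
    for cost in tasks:
        load, node = heapq.heappop(heap)   # node with the smallest load
        assignment.append(node)
        heapq.heappush(heap, (load + cost, node))
    return assignment

# Three idle nodes, four tasks of varying cost:
print(balance([0, 0, 0], [5, 3, 2, 4]))
```

The heap keeps each dispatch decision O(log n) in the number of nodes, which matters when messages arrive at IoT rates.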
Ahmed Dheyaa Al-Obaidi, Ali Tarik Abdulwahid, Mustafa Najah Al-Obaidi, Abeer Mundher Ali, eNeurologicalSci, 2023
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have small or inadequate data to train DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must usually be fed a significant amount of labeled data to automatically learn representations. Ultimately, more data generally yields a better DL model, and performance is also application-dependent. This issue is the main barrier for
Feature selection (FS) comprises a series of processes used to decide which relevant features/attributes to include and which irrelevant features to exclude for predictive modeling. It is a crucial task that helps machine learning classifiers reduce error rates, computation time, and overfitting while improving classification accuracy. It has demonstrated its efficacy in myriad domains, ranging from text classification (TC) and text mining to image recognition. While there are many traditional FS methods, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews concerning TC. Therefore, a comprehensive overview was systematicall
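Metaheuristic FS of the kind surveyed treats a feature subset as a bit mask and searches over masks guided by a score. The toy sketch below uses a greedy bit-flip local search with a stand-in scoring function; in a real TC pipeline the score would be a classifier's validation accuracy, and none of the names or weights here come from the review:

```python
# Toy wrapper-style feature selection: flip one feature in/out at a time,
# keep the move only if the score (a stand-in for accuracy) improves.

def local_search_fs(n_features, score, iters=100):
    """Greedy bit-flip search over feature subsets; returns best mask and score."""
    mask = [0] * n_features
    best = score(mask)
    for i in range(iters):
        j = i % n_features
        mask[j] ^= 1            # try toggling feature j
        s = score(mask)
        if s > best:
            best = s            # keep an improving move
        else:
            mask[j] ^= 1        # revert a non-improving move
    return mask, best

# Stand-in score: features 0 and 2 help, feature 1 hurts, feature 3 is neutral.
weights = [3, -1, 2, 0]
score = lambda m: sum(w * b for w, b in zip(weights, m))
print(local_search_fs(4, score))
```

Genetic algorithms and swarm methods differ mainly in how they propose the next masks (crossover/mutation, velocity updates) rather than in this evaluate-and-keep loop.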