A rapid, simple and sensitive spectrophotometric method for the determination of trace amounts of chromium is described. The method is based on the interaction of chromium with indigo carmine dye in acidic medium, with oxalate present as a catalyst for the reaction. The absorption spectrum of the resulting solution shows a decrease in absorption intensity as the dye is bleached; this decrease, measured against a blank solution at a wavelength of 610 nm, is directly proportional to the amount of chromium(VI). A plot of absorbance against chromium(VI) concentration gives a straight line, indicating that Beer's law is obeyed over the range 0.5–70 µg/25 ml, i.e., 0.02–2.8 ppm, with a molar absorptivity for chromium(VI) of 1.71 × 10⁴ l·mol⁻¹·cm⁻¹ and a Sandell's sensitivity index of 0.0030 µg·cm⁻². The detection limit (DL) was 0.0012 µg·mL⁻¹, and the relative standard deviation ranged from 0.70% to 1.86%, depending on the concentration level. The method was also developed for the determination of chromium(III) and has been successfully applied to the determination of chromium in various water samples, pharmaceutical preparations, and the standard rock sample MRG-1.
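The reported figures of merit are internally consistent: Sandell's sensitivity (the surface density of analyte giving an absorbance of 0.001) follows from the molar absorptivity as S = M/ε in µg·cm⁻². A minimal check, using the molar mass of chromium:

```python
# Consistency check for the reported spectrophotometric figures of merit.
# Sandell's sensitivity (µg per cm^2 for A = 0.001) relates to molar
# absorptivity epsilon (L mol^-1 cm^-1) by S = M / epsilon.
M_CR = 52.00        # molar mass of chromium, g/mol
EPSILON = 1.71e4    # reported molar absorptivity, L mol^-1 cm^-1

sandell = M_CR / EPSILON   # µg cm^-2
print(round(sandell, 4))   # → 0.003
```

The result matches the abstract's quoted sensitivity index of 0.0030 µg·cm⁻².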
This paper proposes the Teaching–Learning-Based Optimization (TLBO) algorithm for solving the 3-D container packing problem. The objective, which can be expressed in a mathematical model, is to optimize the space usage in a container. Besides the interaction between students and the teacher, the algorithm also models the learning process among students in the classroom, and it requires no algorithm-specific control parameters. TLBO therefore uses a teacher phase and a student phase as its main updating processes to find the best solution. To validate the algorithm's effectiveness, it was implemented on three sample cases: small data which had 5 size-types of items with 12 units, medium data which had 10 size-types of items w…
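The two-phase update the abstract describes can be sketched as follows. This is a generic TLBO loop on a stand-in objective, not the paper's packing model: a real 3-D packing solver would decode each candidate vector into item placements and score wasted container volume, and the bounds, population size, and iteration count here are illustrative assumptions.

```python
import random

def tlbo(f, dim, bounds, pop=20, iters=100, seed=1):
    """Minimise f over a box with Teaching-Learning-Based Optimization.

    TLBO needs no algorithm-specific control parameters: each iteration
    runs a teacher phase (pull learners toward the current best) and a
    learner phase (pairwise learning between random students)."""
    rnd = random.Random(seed)
    lo, hi = bounds
    clip = lambda x: [min(hi, max(lo, v)) for v in x]
    X = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    F = [f(x) for x in X]
    for _ in range(iters):
        teacher = X[min(range(pop), key=lambda i: F[i])]
        mean = [sum(col) / pop for col in zip(*X)]
        for i in range(pop):
            # Teacher phase: move toward the teacher, away from the mean.
            tf = rnd.choice((1, 2))            # teaching factor, 1 or 2
            r = rnd.random()
            cand = clip([X[i][d] + r * (teacher[d] - tf * mean[d])
                         for d in range(dim)])
            fc = f(cand)
            if fc < F[i]:
                X[i], F[i] = cand, fc
            # Learner phase: learn from (or move away from) a random peer.
            j = rnd.choice([k for k in range(pop) if k != i])
            r = rnd.random()
            sign = 1.0 if F[j] < F[i] else -1.0
            cand = clip([X[i][d] + sign * r * (X[j][d] - X[i][d])
                         for d in range(dim)])
            fc = f(cand)
            if fc < F[i]:
                X[i], F[i] = cand, fc
    best = min(range(pop), key=lambda i: F[i])
    return X[best], F[best]

# Stand-in objective (sphere function); a packing solver would instead
# decode the vector into placements and minimise unused container space.
sphere = lambda x: sum(v * v for v in x)
x_best, f_best = tlbo(sphere, dim=3, bounds=(-5.0, 5.0))
```

Greedy acceptance in both phases (keep a candidate only if it improves the learner) is the standard TLBO rule; it is what lets the method run without tuning parameters beyond population size and iteration budget.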
Regression analysis is a foundation stone of statistics, and it mostly depends on the ordinary least squares method. As is well known, this method requires several conditions to hold in order to operate accurately; otherwise its results can be unreliable, and the absence of certain conditions can make it impossible to complete the analysis. Among those conditions is the absence of the multicollinearity problem, and we detect that problem among the independent variables using the Farrar–Glauber test. In addition, linearity of the data is required, and since that last condition was not met, we resorted to …
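The overall Farrar–Glauber diagnostic mentioned above is a chi-square statistic built from the determinant of the regressors' correlation matrix: near-collinear predictors drive the determinant toward zero and the statistic upward. A minimal sketch (my own pure-Python illustration, not the paper's implementation; Laplace-expansion determinants are only practical for a handful of regressors):

```python
import math

def correlation_matrix(cols):
    """Pearson correlation matrix for a list of equal-length columns."""
    n = len(cols[0])
    def corr(a, b):
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = math.sqrt(sum((x - ma) ** 2 for x in a))
        vb = math.sqrt(sum((y - mb) ** 2 for y in b))
        return cov / (va * vb)
    return [[corr(a, b) for b in cols] for a in cols]

def det(m):
    """Determinant by Laplace expansion (fine for a few regressors)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] *
               det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def farrar_glauber_chi2(cols, n):
    """Overall Farrar-Glauber statistic; compare against a chi-square
    with k(k-1)/2 degrees of freedom. Large values flag collinearity."""
    k = len(cols)
    d = det(correlation_matrix(cols))
    return -(n - 1 - (2 * k + 5) / 6.0) * math.log(d)

# Example: x2 is almost an exact multiple of x1, so the statistic is large.
x1 = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
x2 = [2 * v + 0.1 * (-1) ** v for v in x1]
stat = farrar_glauber_chi2([x1, x2], n=len(x1))
```

For two regressors the determinant reduces to 1 − r², so the statistic is essentially a transformed pairwise correlation; the test becomes more informative as the number of predictors grows.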
Facial recognition has been an active field of imaging science. With recent progress in computer vision, it is extensively applied in various areas, especially in law enforcement and security. The human face is a viable biometric that can be used effectively in both identification and verification. Thus far, regardless of the facial model and metrics employed, its main shortcoming is that it requires a facial image against which the comparison is made. Therefore, closed-circuit televisions and a facial database are always needed in an operational system. Over the last few decades, unfortunately, we have experienced an emergence of asymmetric warfare, where acts of terrorism are often committed in secluded areas with no …
Wireless sensor network (WSN) security is an important component for protecting data from an attacker. Cryptography technologies for improving security are divided into two kinds: symmetric and asymmetric. Protocols for generating a secret key that are based on an asymmetric method take a long time relative to the sensors' limitations and therefore decrease network throughput; asymmetric algorithms are complex and reduce throughput. In this paper, symmetric secret-key encryption for wireless sensor networks (WSN) is proposed. Twenty-four experiments are carried out, covering encryption with the AES algorithm in the cases of 1 key, 10 keys, 25 keys, and 50 keys. …
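The shared-key model the paper relies on (a pre-distributed pool of symmetric keys, so nodes avoid expensive asymmetric handshakes) can be illustrated with standard-library primitives. Note the assumptions: AES itself needs a third-party library, so this sketch substitutes a toy HMAC-SHA256 keystream cipher as a stand-in for AES, and the key-pool derivation is hypothetical, not the paper's scheme.

```python
import hashlib, hmac, os

def keystream_encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy symmetric stream cipher: XOR data with an HMAC-SHA256 keystream.
    A stand-in for AES (which requires a third-party library); it shows the
    shared-key model only. Encryption and decryption are the same call."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, nonce + counter.to_bytes(4, "big"),
                         hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(d ^ k for d, k in zip(data, out))

# Hypothetical pre-shared key pool (1, 10, 25 or 50 keys in the paper's
# experiments); each message selects one key by index, so no asymmetric
# key-agreement protocol is needed on the sensor.
pool = [hashlib.sha256(b"network-secret" + bytes([i])).digest()
        for i in range(10)]
nonce = os.urandom(16)
msg = b"sensor reading: 23.5C"
ct = keystream_encrypt(pool[3], nonce, msg)
pt = keystream_encrypt(pool[3], nonce, ct)   # symmetric: same call decrypts
```

In a deployment the toy cipher would be replaced by AES (e.g. AES-CTR from a vetted library); the pool-indexing pattern is what changes between the 1-, 10-, 25-, and 50-key experimental cases.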
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Lev…
Institutions and companies are looking to reduce spending on buildings and services using scientific methods, provided they achieve the same purpose at a lower cost. On this basis, this paper proposes a model to measure and reduce maintenance costs in one of the public-sector institutions in Iraq by using performance indicators that fit the nature of the institution's work and the available data. The paper relied on studying the nature of the institution's work in the maintenance field and examining the type of data available to determine the type and number of indicators appropriate for building the model. Maintenance data were collected for the previous six years by reviewing the maintenance and financial dep…
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results, evaluated according to the RMSE and NCC measures, show that the spline method is the most accurate compared with the other statistical methods.
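The two evaluation measures named above are standard and easy to state. A minimal sketch of both, over flattened pixel lists (the data here are illustrative, not the paper's images):

```python
import math

def rmse(a, b):
    """Root-mean-square error between two equal-size images (flat lists);
    0.0 means the images are identical."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def ncc(a, b):
    """Normalised cross-correlation; 1.0 means the images share the same
    structure (it is invariant to brightness shifts and contrast scaling)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den

a = [10, 20, 30, 40]
b = [12, 22, 32, 42]   # same structure, brightness shifted by +2
```

Here `rmse(a, b)` is 2.0 while `ncc(a, b)` is 1.0, which is why the two measures are complementary: RMSE penalises intensity differences, NCC judges structural agreement.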
Forest fires continue to rise during the dry season, and they are difficult to stop. High temperatures in the dry season can increase the drought index, which could potentially set the forest alight at any time, so the government must conduct surveillance throughout the dry season. Continuous surveillance without focusing on particular times becomes ineffective and inefficient, because preventive measures are carried out without knowledge of the potential fire risk. In the Keetch–Byram Drought Index (KBDI), the Drought Factor formulation calculates only today's drought, based on current weather conditions and yesterday's drought index. However, to find out the drought factors one day ahead, the data …
This paper introduces the use of a neural network as a type of associative memory, applied to the problem of mobile position estimation, in which a mobile station estimates its location from the signal strengths reaching it from several surrounding base stations; the neural network can be implemented inside the mobile. The traditional methods of time of arrival (TOA) and received signal strength (RSS) are used and compared with two analytical methods, an optimal positioning method and an average positioning method. The data used for training are ideal, since they can be obtained from the geometry of the CDMA cell topology. The tests of the TOA and RSS methods cover many cases along a nonlinear path through which the MS can move in that region. The result…
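The geometric step underlying TOA positioning can be written in closed form: each base station's measured range defines a circle around it, and subtracting the first circle's equation from the others linearises the problem. A minimal 2-D sketch (the station layout and mobile position are hypothetical, chosen only to mirror how ideal training data can be generated from cell geometry):

```python
import math

def toa_position(stations, distances):
    """Closed-form 2-D TOA fix from three base stations.
    Each range d_i defines a circle (x - x_i)^2 + (y - y_i)^2 = d_i^2;
    subtracting the first circle from the other two yields a 2x2 linear
    system in (x, y), solved here by Cramer's rule."""
    (x0, y0), (x1, y1), (x2, y2) = stations
    d0, d1, d2 = distances
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = d0**2 - d1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = d0**2 - d2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Hypothetical cell layout: mobile at (3, 4), ideal ranges computed from
# geometry, the same way the paper derives its ideal training data.
bs = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ms = (3.0, 4.0)
d = [math.dist(ms, b) for b in bs]
est = toa_position(bs, d)   # recovers (3.0, 4.0) from the ideal ranges
```

With noisy measured ranges the same linear system is solved by least squares over more than three stations; the ideal version above is exactly the kind of input-output pair a neural associative memory can be trained on.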