This study estimated the concentrations of the heavy metals cadmium, lead, nickel, and iron in 15 samples of Iraqi honey (3 replicates per sample) collected from apiaries near potential contamination areas in five Iraqi governorates: Baghdad, Karbala, Babylon, Diyala, and Salah al-Din. Atomic absorption spectroscopy was used to measure the heavy metal concentrations. The results showed significant differences (P ≤ 0.05) among the concentrations of these elements in the honey samples. The highest cadmium concentration (0.123 mg/kg) was recorded in Baghdad, near the petrochemical production complex; the highest lead (4.657 mg/kg) and nickel (0.023 mg/kg) concentrations were recorded in Babylon, near the power plant; and the highest iron concentration (1.863 mg/kg) was recorded in Karbala, near the waste collection and incineration plant. All cadmium and lead concentrations in the studied honey samples exceeded the acceptable limits set by European Commission regulation.
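The between-sample significance test described above can be sketched with a one-way ANOVA F-statistic. This is a minimal illustration, not the study's analysis; the replicate concentrations and governorate groupings below are hypothetical values, not the paper's data.

```python
# Minimal one-way ANOVA sketch for comparing a metal's concentration
# across sampling locations. All numbers are hypothetical.

def anova_f(groups):
    """One-way ANOVA F-statistic for a list of sample groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

cadmium = [                 # mg/kg, 3 replicates per location (hypothetical)
    [0.120, 0.123, 0.121],  # e.g. near a petrochemical complex
    [0.050, 0.052, 0.049],  # e.g. near a power plant
    [0.030, 0.031, 0.029],  # e.g. a less exposed area
]
print(anova_f(cadmium))  # a large F suggests between-location differences
```

A large F-value relative to the F-distribution's critical value at P ≤ 0.05 corresponds to the significant differences the study reports.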
Cuneiform images require several processing steps before their contents can be recognized, including image enhancement to clarify the objects (symbols) found in the image. The vector used to classify a symbol, called the symbol structural vector (SSV), is built from the information wedges in the symbol. Experimental tests covered various sample sizes and relevancies, including various drawings, in an online method, and the results show high accuracy. The methods and algorithms were programmed in Visual Basic 6.0. In this research, more than one method was applied to extract information from digital images of cuneiform tablets, in order to identify most of the signs of Sumerian cuneiform.
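The enhancement step mentioned above can be illustrated with min-max contrast stretching, one common way to make faint symbols stand out in a grayscale image. This is a generic sketch, not the paper's method; the tiny 2x2 pixel grid is hypothetical.

```python
# Hedged sketch: min-max contrast stretching of a grayscale image,
# remapping the observed intensity range [mn, mx] onto [lo, hi].
# Pixel values are hypothetical.

def stretch(img, lo=0, hi=255):
    flat = [p for row in img for p in row]
    mn, mx = min(flat), max(flat)
    if mx == mn:                       # flat image: nothing to stretch
        return [[lo for _ in row] for row in img]
    return [[lo + (p - mn) * (hi - lo) // (mx - mn) for p in row]
            for row in img]

img = [[90, 100], [110, 120]]          # low-contrast patch (hypothetical)
print(stretch(img))                    # full-range version of the patch
```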
Localization is an essential requirement in wireless sensor networks (WSNs) and relies on several types of measurements. This paper focuses on positioning in 3-D space using time-of-arrival- (TOA-) based distance measurements between the target node and a number of anchor nodes. Centralized localization is assumed, and RF, acoustic, or UWB signals are used for the distance measurements. The problem is treated using iterative gradient descent (GD), and an iterative GD-based algorithm for localizing moving sensors in a WSN is proposed. To localize a node in 3-D space, at least four anchors are needed; in this work, however, five anchors are used to obtain better accuracy. In GD localization of a moving sensor, the algo
HM Al-Dabbas, RA Azeez, AE Ali, IRAQI JOURNAL OF COMPUTERS, COMMUNICATIONS, CONTROL AND SYSTEMS ENGINEERING, 2023
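The core of TOA-based GD localization can be sketched as minimizing the sum of squared range residuals over five anchors. The anchor coordinates, step size, and iteration count below are illustrative assumptions, not the paper's configuration.

```python
import math

# Hedged sketch: 3-D localization by gradient descent on
# f(x) = sum_i (||x - a_i|| - d_i)^2, with five anchors and
# noiseless TOA ranges. All coordinates are hypothetical.

anchors = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10), (10, 10, 10)]
target = (3.0, 4.0, 5.0)
dists = [math.dist(target, a) for a in anchors]   # simulated TOA ranges

def gd_locate(anchors, dists, x0=(5.0, 5.0, 5.0), lr=0.05, iters=2000):
    x = list(x0)
    for _ in range(iters):
        grad = [0.0, 0.0, 0.0]
        for a, d in zip(anchors, dists):
            r = math.dist(x, a)
            if r == 0:
                continue                          # avoid division by zero
            coef = 2.0 * (r - d) / r              # d/dx of (r - d)^2
            for k in range(3):
                grad[k] += coef * (x[k] - a[k])
        x = [x[k] - lr * grad[k] for k in range(3)]
    return x

est = gd_locate(anchors, dists)
```

With noiseless ranges and well-spread anchors the estimate converges to the true position; in practice, noise and anchor geometry limit the achievable accuracy, which is why a fifth anchor helps.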
Estimates of average crash density as a function of traffic elements and characteristics can support sound decisions on planning, designing, operating, and maintaining roadway networks. This study describes the relationships between total, collision, turnover, and runover accident densities and factors such as hourly traffic flow and average spot speed on multilane rural highways in Iraq. The study is based on data collected from two sources: police stations and traffic surveys. Three highways in Wassit governorate are selected as a case study to cover the studied accident locations. The selection
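A relationship between crash density and a traffic variable like hourly flow can be sketched as a simple least-squares fit. This is a generic illustration; the observations below are hypothetical and the study's actual model form is not reproduced here.

```python
# Hedged sketch: ordinary least squares for crash density vs. hourly
# traffic flow. All data points are hypothetical.

def fit_line(xs, ys):
    """Return intercept a and slope b minimizing sum (y - a - b*x)^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

flow = [200, 400, 600, 800, 1000]     # veh/h (hypothetical)
density = [1.1, 2.0, 3.1, 3.9, 5.0]   # crashes per km (hypothetical)
a, b = fit_line(flow, density)
```

The fitted slope then estimates how crash density changes per unit of hourly flow, which is the kind of relationship the study quantifies.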
Generally, sending secret information via a transmission channel or any carrier medium is not secure; for this reason, information-hiding techniques are needed, and steganography must take place before transmission. In this paper, a developed particle swarm optimization algorithm (Dev.-PSO) is used to embed a secret message at optimal positions of the cover image in the spatial domain, based on Least Significant Bit (LSB) substitution. The main aim of the Dev.-PSO algorithm is to determine optimal paths that reach the required goals in the specified search space; using the Dev.-PSO algorithm produces the paths to the required goals with most effi
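The embedding primitive underlying the paper, plain LSB substitution, can be sketched as follows. This omits the Dev.-PSO position search the paper develops and embeds sequentially instead; the pixel values and bit string are hypothetical.

```python
# Hedged sketch of LSB substitution: each message bit replaces the
# least significant bit of one cover byte, changing it by at most 1.

def embed(cover, bits):
    stego = list(cover)
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & ~1) | b   # clear LSB, then set message bit
    return stego

def extract(stego, n):
    return [p & 1 for p in stego[:n]]    # read back the first n LSBs

cover = [52, 55, 61, 66, 70, 61, 64, 73]  # sample pixel values (hypothetical)
bits = [1, 0, 1, 1, 0, 0, 1, 0]           # secret message bits
stego = embed(cover, bits)
```

Because each byte changes by at most one intensity level, the visual distortion is minimal; choosing *which* pixels to modify is what the optimization step improves.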
Color image compression is a good way to encode digital images by decreasing the number of bits needed to store the image. The main objectives are to reduce storage space, reduce transmission costs, and maintain good quality. In the current work, a simple, effective methodology is proposed for compressing color art digital images and obtaining a low bit rate: the matrix resulting from the scalar quantization process (reducing the pixel depth from 24 to 8 bits) is compressed using displacement coding, and the remainder is then compressed using the Lempel-Ziv-Welch (LZW) algorithm. The proposed methodology maintains the quality of the reconstructed image. Macroscopic and
Induced electro-Fenton (EF) is among the most important advanced oxidation processes (AOPs) and has been employed to treat different kinds of wastewater. In the present review, the types and mechanism of induced EF are outlined, and the parameters affecting the process are discussed in detail: current density, pH, H2O2 concentration, and time. Applications of induced electro-Fenton in various industrial sectors, such as textiles, petroleum refineries, and pharmaceuticals, are outlined. The outcomes of this review demonstrate the vital role of induced EF in treating wastewater at high efficiency and low cost in contrast with conventional technique
Recent years have seen an explosion in graph data from a variety of scientific, social, and technological fields. Within these fields, emotion recognition is an interesting research area because it finds many real-life applications, such as social robotics that increase a robot's interactivity with humans, driver safety monitoring during driving, and pain monitoring during surgery. A novel facial emotion recognition approach based on graph mining is proposed in this paper to make a paradigm shift in the way the face region is represented: the face region is represented as a graph of nodes and edges, and the gSpan frequent-subgraph mining algorithm is used to find the frequent substructures in the graph database of each emotion. T
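The idea of mining frequent substructures from a graph database per emotion can be illustrated with a much-simplified stand-in: counting single-edge substructures that appear in enough graphs. This is not gSpan (which mines arbitrary connected subgraphs canonically), and the face graphs and labels below are hypothetical.

```python
from collections import Counter

# Hedged sketch: frequent single-edge substructures across a small
# graph database, a simplified stand-in for gSpan-style mining.

def frequent_edges(graphs, min_support):
    counts = Counter()
    for g in graphs:
        seen = {frozenset(e) for e in g}          # count each edge once per graph
        counts.update(tuple(sorted(e)) for e in seen)
    return {e for e, c in counts.items() if c >= min_support}

happy_graphs = [                                   # hypothetical face graphs
    [("eye", "brow"), ("mouth", "cheek")],
    [("eye", "brow"), ("nose", "mouth")],
]
print(frequent_edges(happy_graphs, min_support=2))
```

Substructures frequent in one emotion's database but rare in others then serve as discriminative features for classification.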
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing, especially with the large daily increase in the volume of textual data produced. Traditional approaches that calculate the degree of similarity between two texts from the words they share do not perform well on short texts, because two similar texts may be written with different terms by employing synonyms. As a result, short texts should be compared semantically. In this paper, a semantic similarity measurement method for texts is presented which combines knowledge-based and corpus-based semantic information to build a semantic network that repre
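The word-overlap baseline that the paper improves on can be sketched as cosine similarity over bag-of-words term vectors. This captures only shared surface terms, which is exactly the weakness with synonyms described above; the knowledge-based fusion into a semantic network is omitted here, and the example sentences are hypothetical.

```python
import math
from collections import Counter

# Hedged sketch: bag-of-words cosine similarity between two short
# texts. Synonyms score zero here, motivating semantic approaches.

def cosine_sim(a, b):
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

print(cosine_sim("deep learning model", "deep learning network"))
print(cosine_sim("car", "automobile"))  # synonyms, yet no shared terms
```

The second pair scores zero despite being semantically identical, which is the gap that knowledge-based resources are meant to close.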