<p>In this paper, a simple color image compression system is proposed based on image signal decomposition. The RGB color bands are first converted to the less correlated YUV color model, and the pixel magnitude in each band is decomposed into two values: a most significant value (MSV) and a least significant value (LSV). Because the MSV is sensitive to even small modifications, it is handled by an adaptive lossless compression system using bit-plane (BP) slicing, delta pulse code modulation (Delta PCM), and adaptive quadtree (QT) partitioning, followed by an adaptive shift encoder. The LSV, in contrast, is handled by a lossy, adaptive, error-bounded coding system based on the DCT compression scheme. The performance of the developed compression system was analyzed and compared with the universal JPEG standard; the results indicate that its performance is comparable to or better than that of JPEG.</p>
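The MSV/LSV decomposition described above can be sketched as a bit split of each 8-bit pixel. This is a minimal illustration only; the 4/4 split point and function names are assumptions for demonstration, not taken from the paper.

```python
# Illustrative sketch: split an 8-bit pixel into a most significant value
# (MSV, top nibble) and least significant value (LSV, bottom nibble) --
# one simple form of bit-plane decomposition. The 4/4 split is assumed.

def decompose_pixel(p, msb_bits=4):
    """Split an 8-bit value into (MSV, LSV) parts."""
    shift = 8 - msb_bits
    msv = p >> shift                  # most significant bits
    lsv = p & ((1 << shift) - 1)      # least significant bits
    return msv, lsv

def recompose_pixel(msv, lsv, msb_bits=4):
    """Reassemble the original pixel from its two parts."""
    shift = 8 - msb_bits
    return (msv << shift) | lsv

band = [200, 13, 97, 255]
parts = [decompose_pixel(p) for p in band]
restored = [recompose_pixel(m, l) for m, l in parts]
assert restored == band   # the decomposition itself is lossless
```

The split is lossless by construction; the lossy/lossless distinction in the abstract comes from how each part is subsequently encoded, not from the decomposition step itself.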
In this paper, a refractive index sensor based on a micro-structured optical fiber is proposed and modeled using the Finite Element Method (FEM). The designed fiber has a hexagonal cladding structure with six rings of air holes running around its solid core. The air holes are infiltrated with different liquids (water, ethanol, methanol, and toluene), and sensor characteristics such as the effective refractive index, confinement loss, beam profile of the fundamental mode, and sensor resolution are investigated using the FEM. The designed sensor is characterized by low confinement loss and high resolution, so a small change in the analyte refractive index can be detected, which could be useful to detect the change of
Carbon monoxide (CO) acts as an important indirect greenhouse gas due to its influence on the budgets of hydroxyl radicals (OH) and ozone (O3). Atmospheric CO observations on global and continental scales can only be made by remote sensing instruments situated in space. One such instrument is the Measurements of Pollution in the Troposphere (MOPITT) instrument, which is designed to measure tropospheric CO and CH4 using a nadir-viewing geometry and was launched aboard the Earth Observing System (EOS) Terra spacecraft on 18 December 1999. Results from the retrieved monthly (1°×1°) spatial grid resolution MOPITT data were utilized to analyze the distribution of the CO surface mixing ratio in Iraq for th
In modern times, face recognition is one of the vital areas of computer vision. This is due to many factors, including the availability and accessibility of technologies and commercial applications. Face recognition, stated briefly, is automatically recognizing a person from an image or video frame. In this paper, an efficient face recognition algorithm is proposed that uses wavelet decomposition to extract the most important and distinctive features of the face, and the Eigenface method to classify faces according to the minimum distance between feature vectors. The Faces94 database is used to test the method. Excellent recognition with minimum computation time is obtained, with accuracy reaching 100% and recognition time decrease
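The two stages named above (wavelet feature extraction, then minimum-distance classification) can be sketched in miniature. This is not the paper's exact pipeline: a one-level 1-D Haar transform stands in for the wavelet decomposition, the toy "faces" are short intensity rows rather than images, and all names are illustrative.

```python
# Hedged sketch: Haar approximation coefficients as a compact feature vector,
# plus a minimum-distance (nearest-neighbor) classifier over stored features,
# in the spirit of the wavelet + Eigenface approach described above.

def haar_1d(signal):
    """One-level Haar transform: (approximation, detail) coefficients."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def nearest_class(feature, gallery):
    """Return the label whose stored feature vector is closest (Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(gallery, key=lambda label: dist(feature, gallery[label]))

# Toy "faces": flattened intensity rows; real inputs would be 2-D images.
gallery = {"A": haar_1d([10, 12, 50, 52])[0], "B": haar_1d([90, 92, 20, 22])[0]}
probe = haar_1d([11, 13, 49, 53])[0]
print(nearest_class(probe, gallery))  # closest to "A"
```

Keeping only the approximation coefficients halves the feature length while preserving the coarse structure, which is the efficiency argument behind wavelet-based feature extraction.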
Social media and news agencies are major sources for tracking news and events. With the massive amounts of data from these sources, it is easy to spread false or misleading information. Given the great danger fake news poses to societies, previous studies have given great attention to detecting it and limiting its impact. As such, this work aims to use modern deep learning techniques to detect Arabic fake news. In the proposed system, an attention model is combined with bidirectional long short-term memory (Bi-LSTM) to identify the most informative words in a sentence. Then, a multi-layer perceptron (MLP) is applied to classify news articles as fake or real. The experiments are conducted on a newly launched Arabic dataset called the Ara
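The attention step described above can be sketched in isolation: given one relevance score per token (which in the full model would come from the Bi-LSTM hidden states), a softmax produces attention weights, and a weighted sum yields a sentence vector for the downstream classifier. The scoring values below are made up for illustration.

```python
# Hedged toy of softmax attention pooling; in the real system the scores and
# token vectors come from a trained Bi-LSTM, not hand-picked numbers.
import math

def softmax(scores):
    m = max(scores)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(token_vectors, scores):
    """Weighted sum of token vectors using softmax attention weights."""
    weights = softmax(scores)
    dim = len(token_vectors[0])
    return [sum(w * v[d] for w, v in zip(weights, token_vectors))
            for d in range(dim)]

tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
scores = [0.1, 2.0, 0.5]   # higher score = more informative word
sentence_vec = attend(tokens, scores)
print(sentence_vec)
```

The sentence vector is dominated by the highest-scoring token, which is how the model "identifies the most informative words": the weights themselves can also be inspected for interpretability.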
This paper describes a practical study of the impact that learning partners, a Bluetooth broadcasting system, an interactive board, a real-time response system, notepads, free internet access, computer-based examinations, an interactive classroom, etc., have on undergraduate student performance, achievement, and involvement with lectures. The goal of this study is to test the hypothesis that the use of such learning techniques, tools, and strategies improves student learning, especially among the poorest-performing students. It also gives a practical comparison between the traditional and interactive ways of learning in terms of lecture time, number of tests, types of tests, students' scores, and students' involvement with lectures
The emphasis of Master Production Scheduling (MPS), or tactical planning, is on the temporal and spatial disaggregation of the cumulative planning targets and forecasts, along with the provision and forecasting of the required resources. This procedure becomes considerably more difficult and slower as the number of resources, products, and periods considered increases. A number of studies have been carried out to understand these impediments and formulate algorithms to optimise the production planning problem, or more specifically the master production scheduling (MPS) problem. These algorithms include an evolutionary algorithm called the Genetic Algorithm, a swarm intelligence methodology called the Gravitational Search Algorithm (GSA), the Bat Algorithm (BAT), T
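To make the Genetic Algorithm approach concrete, the toy below evolves per-period production quantities for a single product so that cumulative production tracks a demand forecast under a capacity limit. The encoding, fitness function, and all parameters here are assumptions for demonstration, not the formulation used in the studies cited above.

```python
# Illustrative toy GA for a one-product MPS: minimize inventory/backlog penalty.
import random

random.seed(0)
DEMAND   = [40, 55, 30, 70]   # forecast units per period
CAPACITY = 60                 # max production per period

def fitness(plan):
    """Penalty: absolute inventory (surplus or backlog) summed over periods."""
    inventory, penalty = 0, 0
    for produced, demand in zip(plan, DEMAND):
        inventory += produced - demand
        penalty += abs(inventory)
    return penalty

def evolve(pop_size=30, generations=200):
    pop = [[random.randint(0, CAPACITY) for _ in DEMAND] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(DEMAND))
            child = a[:cut] + b[cut:]             # one-point crossover
            if random.random() < 0.3:             # mutation
                child[random.randrange(len(child))] = random.randint(0, CAPACITY)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Note that period 4's demand (70) exceeds capacity (60), so a good plan must pre-build inventory in an earlier period; the GA discovers this trade-off through selection pressure rather than explicit lot-sizing rules, which is the appeal of metaheuristics for MPS.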
<p>In combinatorial testing development, the fabrication of covering arrays is the key challenge, owing to the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique like hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic methods are used to deal with tuples that may be left over after the greedy strategy's redundancy elimination; the metaheuristic algorithm then assures that the result is near-optimal. As a result, the use of both greedy and HC algorithms in a single test generation system is a good candidate if constructed correctly. T
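The hill-climbing step can be sketched for a tiny pairwise covering array: starting from a seed array, single-cell flips are kept whenever they do not increase the number of uncovered 2-way tuples. The problem size, seed, and neighborhood move below are assumptions for illustration; the actual system described above may differ.

```python
# Sketch: hill climbing toward covering all 2-way parameter-value pairs
# (a pairwise covering array) for three binary parameters.
import itertools, random

random.seed(1)
PARAMS = 3            # three parameters
VALUES = [0, 1]       # each binary

def uncovered_pairs(rows):
    """All (param_i, val_i, param_j, val_j) 2-way tuples not yet covered."""
    needed = {(i, vi, j, vj)
              for i, j in itertools.combinations(range(PARAMS), 2)
              for vi in VALUES for vj in VALUES}
    for row in rows:
        for i, j in itertools.combinations(range(PARAMS), 2):
            needed.discard((i, row[i], j, row[j]))
    return needed

def hill_climb(rows, steps=500):
    """Flip single cells; keep moves that do not increase uncovered pairs."""
    best = len(uncovered_pairs(rows))
    for _ in range(steps):
        if best == 0:
            break
        r = random.randrange(len(rows))
        c = random.randrange(PARAMS)
        rows[r][c] ^= 1                       # flip one cell
        score = len(uncovered_pairs(rows))
        if score <= best:
            best = score                      # accept improving/sideways move
        else:
            rows[r][c] ^= 1                   # undo a worsening move
    return rows, best

rows, remaining = hill_climb([[0] * PARAMS for _ in range(4)])
print(rows, remaining)
```

Allowing sideways moves (equal score) lets the search escape plateaus, a standard HC refinement; in a combined system the seed array would come from the greedy stage rather than an all-zeros start.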
Social media is known as a detector platform that is used to measure the activities of users in the real world. However, the huge and unfiltered feed of messages posted on social media triggers social warnings, particularly when these messages contain hate speech toward a specific individual or community. The negative effect of these messages on individuals or society at large is of great concern to governments and non-governmental organizations. Word clouds provide a simple and efficient means of visually conveying the most common words from text documents. This research aims to develop a word cloud model based on hateful words in online social media environments such as Google News. Several steps are involved, including data acq
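The counting step behind such a word cloud can be sketched as follows: tally how often each term from a lexicon of flagged words appears in a stream of posts. The lexicon and posts below are invented for illustration; rendering the cloud itself would scale each word's font size by its count.

```python
# Minimal sketch of word-cloud frequency counting over a hypothetical lexicon.
from collections import Counter

LEXICON = {"hate", "attack", "threat"}        # illustrative lexicon, not real data

posts = [
    "no place for hate in our community",
    "this is a threat and an attack",
    "report hate speech when you see it",
]

counts = Counter(
    word
    for post in posts
    for word in post.lower().split()
    if word in LEXICON
)
print(counts.most_common())  # [('hate', 2), ('threat', 1), ('attack', 1)]
```

A real pipeline would add the preprocessing the abstract alludes to (tokenization, normalization of Arabic or English text, stop-word removal) before the counting step.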
Governmental establishments maintain historical data on job applicants for future predictive analysis, improvement of benefits and profits, and development of organizations and institutions. In e-government, decisions about job seekers can be made after mining their information, leading to beneficial insights. This paper proposes the development and implementation of a system that predicts an applicant's appropriate job to suit his or her skills, using web content classification algorithms (LogitBoost, J48, PART, Hoeffding Tree, Naive Bayes). Furthermore, the results of the classification algorithms are compared on data sets called "job classification data" sets. Experimental results indicate
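To illustrate the classification idea (predicting a suitable job category from an applicant's skills), the sketch below hand-rolls one of the listed algorithms, Naive Bayes, on invented data. The skills, labels, and dataset are hypothetical; the paper applies Weka-style classifiers to its own "job classification data" sets.

```python
# Toy Laplace-smoothed Naive Bayes: predict a job category from skill tokens.
from collections import Counter, defaultdict
import math

train = [
    (["python", "sql"], "data"),
    (["python", "ml"], "data"),
    (["html", "css"], "web"),
    (["css", "js"], "web"),
]

label_counts = Counter(label for _, label in train)
skill_counts = defaultdict(Counter)
vocab = set()
for skills, label in train:
    for s in skills:
        skill_counts[label][s] += 1
        vocab.add(s)

def predict(skills):
    """Pick the label with the highest log posterior, Laplace-smoothed."""
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        total = sum(skill_counts[label].values())
        score = math.log(label_counts[label] / len(train))
        for s in skills:
            score += math.log((skill_counts[label][s] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(predict(["python"]))  # "data"
print(predict(["css"]))     # "web"
```

Comparing several such classifiers on the same train/test split, as the paper does, then reduces to measuring each one's accuracy on held-out applicants.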