One of the main difficulties facing certified-document archiving systems is stamp verification, since stamps may contain complex backgrounds and be surrounded by unwanted data. Therefore, the main objective of this paper is to isolate the background and remove the noise that may surround the stamp. Our proposed method comprises four phases. First, we apply the k-means algorithm to cluster the stamp image into a number of clusters, which are then merged using the ISODATA algorithm. Second, we compute the mean and standard deviation of each remaining cluster to separate the background cluster from the stamp clusters. Third, a region-growing algorithm is applied to segment the image, and the connected regions are chosen to produce a binary mask for the stamp area. Finally, the binary mask is combined with the original image to extract the stamp regions. The results indicate that the number of clusters can be determined dynamically and that the largest cluster with the minimum standard deviation is always the background cluster. They also show that the binary mask can be built from more than one segment to cover all of the stamp's disconnected pieces, and that it can be used to remove the noise that appears within the stamp region.
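The background-separation heuristic described in this abstract can be sketched as follows. This is a minimal NumPy-only illustration on a synthetic image, not the authors' implementation: the ISODATA merging and region-growing phases are omitted, and the toy image, farthest-point initialization, and helper names are assumptions for the sketch.

```python
import numpy as np

def init_centers(pixels, k):
    # farthest-point initialization keeps this sketch deterministic
    centers = [pixels[0]]
    for _ in range(1, k):
        d = np.min([np.linalg.norm(pixels - c, axis=1) for c in centers], axis=0)
        centers.append(pixels[d.argmax()])
    return np.array(centers)

def kmeans(pixels, k, iters=20):
    centers = init_centers(pixels, k)
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(iters):
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels

def background_label(pixels, labels, k):
    # the abstract's heuristic: the largest cluster with the smallest
    # standard deviation is taken to be the background
    sizes = np.array([np.sum(labels == j) for j in range(k)])
    stds = np.array([pixels[labels == j].std() if sizes[j] else np.inf
                     for j in range(k)])
    largest = np.where(sizes == sizes.max())[0]
    return largest[np.argmin(stds[largest])]

# toy image: light background with a dark "stamp" square
img = np.full((20, 20, 3), 230.0)
img[5:12, 5:12] = (180.0, 40.0, 40.0)
pix = img.reshape(-1, 3)
labels = kmeans(pix, k=2)
bg = background_label(pix, labels, 2)
mask = (labels != bg).reshape(20, 20)   # binary mask of the stamp area
```

Combining `mask` with the original image (e.g., `img * mask[..., None]`) would then extract the stamp region as in the final phase.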
Genetic algorithms (GAs) are a helpful instrument for planning and controlling the activities of a project. They are based on the techniques of natural selection and survival of the fittest. GAs have been used in different sectors of construction and building, but this use is rarely documented. This research aimed to examine the utilisation of genetic algorithms in construction project management. For this purpose, the research focused on the benefits and challenges of genetic algorithms and the extent to which they are utilised in construction project management. Results showed that GAs can generate near-optimal solutions, which can be adopted to reduce complexity in project management and resolve difficult problems.
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Twitter consists of short, unstructured, and messy text, which makes it difficult to find topics in tweets. Topic-modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short-text content like Twitter. Fortunately, Twitter has many features that represent the interaction between users, and tweets are rich in user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned.
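One common way to exploit hashtags for short-text topic modeling is hashtag pooling: tweets that share a hashtag are concatenated into one pseudo-document, giving the topic model longer, denser texts. The sketch below shows that pooling step only (the paper's exact scheme may differ, and the sample tweets are invented for illustration):

```python
import re
from collections import defaultdict

def pool_by_hashtag(tweets):
    """Aggregate tweets sharing a hashtag into one pseudo-document,
    so a topic model such as LDA sees longer texts than single tweets."""
    pools = defaultdict(list)
    for t in tweets:
        tags = re.findall(r"#(\w+)", t.lower())
        # tweets without any hashtag fall into a catch-all pool
        for tag in tags or ["_untagged"]:
            pools[tag].append(t)
    return {tag: " ".join(ts) for tag, ts in pools.items()}

tweets = [
    "great match tonight #football",
    "the keeper saved it #football #sports",
    "new phone looks amazing #tech",
]
docs = pool_by_hashtag(tweets)
```

The resulting pseudo-documents in `docs` can then be fed to a standard LDA implementation in place of the raw tweets.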
The purpose of this paper is to solve the stochastic-demand unbalanced transportation problem using heuristic algorithms to obtain the optimal solution, by minimizing the costs of transporting gasoline for the Oil Products Distribution Company of the Iraqi Ministry of Oil. The most important conclusion is that the results prove that the random transportation problem with uncertain demand can be solved by a stochastic programming model. The most obvious finding to emerge from this work is that the genetic algorithm was able to address the problem of unbalanced transport, and that the approved model can be applied by the Oil Products Distribution Company of the Iraqi Ministry of Oil to m
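A genetic algorithm for a transportation problem is often built around a permutation chromosome that orders the route cells, decoded by greedy allocation. The toy sketch below illustrates that idea with invented supply, demand, and cost numbers (not the company's data), and is not the paper's actual model:

```python
import random

supply = [30, 40]           # toy depots (hypothetical numbers)
demand = [20, 25, 25]       # toy stations, total matches supply
cost = [[4, 6, 8],
        [5, 3, 7]]

def decode(perm):
    """Greedily allocate shipments in the order given by the permutation."""
    s, d = supply[:], demand[:]
    alloc = [[0] * len(demand) for _ in supply]
    for i, j in perm:
        q = min(s[i], d[j])
        alloc[i][j] = q
        s[i] -= q
        d[j] -= q
    return alloc

def fitness(perm):
    a = decode(perm)
    return sum(cost[i][j] * a[i][j]
               for i in range(len(supply)) for j in range(len(demand)))

cells = [(i, j) for i in range(len(supply)) for j in range(len(demand))]

def ga(pop_size=30, gens=50, seed=1):
    rng = random.Random(seed)
    # seed the population with the identity ordering; elitism then
    # guarantees the result is never worse than that baseline
    pop = [cells[:]] + [rng.sample(cells, len(cells))
                        for _ in range(pop_size - 1)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            child = rng.choice(elite)[:]
            i, j = rng.sample(range(len(cells)), 2)
            child[i], child[j] = child[j], child[i]   # swap mutation
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = ga()
```

Because total supply equals total demand here, every decoded chromosome is a feasible plan; handling a truly unbalanced instance would add a dummy depot or station, which this sketch omits.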
Document clustering is the process of organizing a particular electronic corpus of documents into subgroups with similar text features. Formerly, a number of conventional algorithms were applied to perform document clustering. There are current endeavors to enhance clustering performance by employing evolutionary algorithms, and such endeavors have become an emerging topic gaining more attention in recent years. The aim of this paper is to present an up-to-date and self-contained review fully devoted to document clustering via evolutionary algorithms. It first provides a comprehensive inspection of the document clustering model, revealing its various components and related concepts. Then it presents and analyzes the principal research wor
The aim of this paper is to discuss several high-performance training algorithms that fall into two main categories. The first category uses heuristic techniques, which were developed from an analysis of the performance of the standard gradient descent algorithm. The second category of fast algorithms uses standard numerical optimization techniques such as quasi-Newton methods. A further aim is to address the drawbacks of these training algorithms and propose an efficient training algorithm for feed-forward neural networks (FFNNs).
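The first category above covers heuristic modifications of gradient descent such as adding a momentum term. The sketch below contrasts plain gradient descent with momentum on an ill-conditioned quadratic; it is a generic illustration of that category, not the paper's proposed algorithm, and the problem and step sizes are assumptions:

```python
import numpy as np

A = np.diag([1.0, 25.0])   # ill-conditioned quadratic f(w) = 0.5 * w^T A w

def grad(w):
    return A @ w

def plain_gd(w, g, state, lr=0.03):
    # standard gradient descent step
    return w - lr * g, state

def momentum(w, g, v, lr=0.03, beta=0.9):
    # heuristic improvement: accumulate a velocity (momentum) term
    v = beta * v - lr * g
    return w + v, v

def descend(update, steps=100):
    w = np.array([1.0, 1.0])
    state = np.zeros(2)
    for _ in range(steps):
        w, state = update(w, grad(w), state)
    return w

w_gd = descend(plain_gd)
w_mom = descend(momentum)
```

On this problem momentum reaches the minimum at the origin noticeably faster, which is the kind of speed-up the heuristic category aims at; quasi-Newton methods (the second category) instead approximate curvature information directly.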
Eye loss may result from eye trauma, accidents, or malignant tumors, which leads the patient to undergo surgery to remove the damaged parts. This research examines the potential of computer vision, represented by Structure from Motion (SfM) photogrammetry, for fabricating an orbital prosthesis as a noninvasive and low-cost technique. A low-cost camera was used to collect the data for extracting dense 3D data of the patient's facial features following Structure from Motion-Multi View Stereo (SfM-MVS) algorithms. To restore the defective orbit, a Reverse Engineering (RE) based approach was applied, using similarity RE algorithms based on the opposite healthy eye to rehabilitate the defective orbit precisely.
A method is developed for the determination of iron(III) in pharmaceutical preparations by coupling cloud point extraction (CPE) with UV-Vis spectrophotometry. The method is based on the reaction of Fe(III) with an excess of the drug ciprofloxacin (CIPRO) in dilute H2SO4, forming a hydrophobic Fe(III)-CIPRO complex that can be extracted into the non-ionic surfactant Triton X-114; the iron ions are then determined spectrophotometrically at an absorption maximum of 437 nm. Several variables that affect the extraction and determination of Fe(III) are optimized in order to maximize the extraction efficiency and improve the sensitivity of the method. An interference study is also included to check the accuracy of the procedure. The results hav
In this paper a nonlinear adaptive control method is presented for a pH process, which is difficult to control due to its nonlinearity and uncertainties. A theoretical and experimental investigation of the dynamic behavior of the neutralization process in a continuous stirred tank reactor (CSTR) was conducted. The process control was implemented using different control strategies: the velocity form of PI control and nonlinear adaptive control. Simulation studies have shown that the estimated parameters are in good agreement with the actual values and that the proposed adaptive controller has excellent tracking and regulation performance.
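The velocity (incremental) form of PI control mentioned above computes a change in controller output each step, du_k = Kc * [(e_k - e_{k-1}) + (dt/Ti) * e_k], rather than the output itself. The sketch below closes the loop around a generic first-order stand-in plant; the plant model, gains, and setpoint are assumptions for illustration, not the paper's CSTR neutralization model:

```python
def simulate(kc=0.8, ti=2.0, dt=0.1, steps=200, setpoint=7.0):
    """Closed-loop simulation of a velocity-form PI controller on a
    stand-in first-order plant dy/dt = -y + 4 + u (hypothetical model)."""
    y, u, e_prev = 4.0, 0.0, 0.0
    for _ in range(steps):
        e = setpoint - y
        # velocity-form PI increment: du = Kc*[(e_k - e_{k-1}) + (dt/Ti)*e_k]
        u += kc * ((e - e_prev) + (dt / ti) * e)
        e_prev = e
        y += dt * (-y + 4.0 + u)   # forward-Euler step of the plant
    return y

final_y = simulate()
```

A practical advantage of the velocity form is that it is bumpless on mode switches and avoids integral windup in the absolute output, since only increments are accumulated.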
Harold Pinter's The Caretaker (1959) clearly portrays a lack of communication among the characters of the play, which reflects the condition of modern man. This failure of communication led Harold Pinter to use many pauses and silences instead of words in all the plays he wrote. Samuel Beckett preceded Pinter in doing so in his plays, and one way to express the bewilderment of modern man during the 20th century is through the absence of language in dramatic works. Language is no longer important to modern man; instead, he uses silence to express his feelings. Silence is more powerful than words themselves. That is why long and short pauses can be seen throughout all of Pinter's plays.
In this play, th