This paper focuses on reducing the execution time of text-processing operations by enumerating each string with a multi-hashing methodology. Text analysis is an important subject for any system that deals with strings (sequences of characters from an alphabet) and text processing (e.g., word processors, text editors, and other text-manipulation systems). Many problems arise when string operations deal with an unfixed number of characters, notably long execution times caused by embedded overhead operations such as symbol matching and conversion. The execution time depends largely on the characteristics of the string, especially its length (i.e., the number of characters in the string plus the number of words in the sentence). In other words, the variable length of strings is an obstacle to achieving processing uniformity when manipulating strings. Many string-matching algorithms in the literature were designed to deal with a fixed number of characters per string. In this paper, test results are provided for a number of string operations (such as simple string matching, hashing indexing systems, stop-word collection, and text extraction). To demonstrate the advantage of the proposed method, these operations were applied to text files of different sizes, and the results were compared with those of traditional methods that operate on the strings directly. The overall results demonstrate the effectiveness of the proposed approach.
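The core idea of this abstract, replacing repeated character-by-character comparisons with comparisons of fixed-size hash codes, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the two hash functions and the use of a code pair to reduce collisions are assumptions.

```python
# Toy sketch: enumerate each word with a pair of hash codes so that later
# comparisons work on fixed-size integers instead of variable-length strings.
# The two hash functions below are illustrative choices, not the paper's.

def hash_a(s: str, mod: int = 2**31 - 1) -> int:
    """Polynomial rolling hash (base 257)."""
    h = 0
    for ch in s:
        h = (h * 257 + ord(ch)) % mod
    return h

def hash_b(s: str, mod: int = 2**31 - 1) -> int:
    """FNV-1a style hash, used as a second code to reduce collisions."""
    h = 2166136261
    for ch in s:
        h = ((h ^ ord(ch)) * 16777619) % mod
    return h

def enumerate_words(words):
    """Map each distinct word to its (hash_a, hash_b) code pair."""
    return {w: (hash_a(w), hash_b(w)) for w in words}

def same_word(codes, u, v):
    """Compare two words by their fixed-size code pairs: O(1) per compare."""
    return codes[u] == codes[v]

codes = enumerate_words(["matching", "hashing", "matching"])
print(same_word(codes, "matching", "hashing"))   # False
print(same_word(codes, "matching", "matching"))  # True
```

Once every word is enumerated, a match test costs a constant-size integer comparison regardless of string length, which is the uniformity the abstract describes.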
Theatrical performance began with the Greeks, when theatrical scenes and skeletal figures were first encoded. The large wall of the Alskina contained three doors: the middle one of great height, while the two side doors took natural size. The middle door carried the symbolism of the god or demigods, so that the symbol was condensed in the very architecture of the theatre. Within the theatrical scene, the symbol has since developed semantically and aesthetically and been reinterpreted up to the present day, where laser light now forms the symbolism of the contemporary virtual scene. In order to identify the aesthetics of the symbol in the theatrical scene, the current research was evaluated into fo…
Cuneiform images require many processing steps before their contents can be recognized, including image enhancement to clarify the objects (symbols) found in the image. The vector used for classifying a symbol, called the symbol structural vector (SSV), is built from the information of the wedges in the symbol. The experimental tests cover a number of samples of varying relevancy, including various drawings, in an online method. The results of this research show high accuracy; the methods and algorithms were programmed using Visual Basic 6.0. In this research, more than one method was applied to extract information from digital images of cuneiform tablets, in order to identify most of the signs of Sumerian cuneiform.
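The abstract does not spell out how the SSV is constructed. As a hedged illustration only, assume each symbol is summarized by counts of wedge strokes per orientation and classified by nearest neighbour over those count vectors; the feature set, the reference signs, and the classifier below are all assumptions, not the paper's method.

```python
# Toy sketch of classifying a cuneiform symbol by a structural vector.
# ASSUMPTION: the vector counts wedge strokes per orientation
# (horizontal, vertical, oblique, Winkelhaken); the real SSV may differ.

REFERENCE_SIGNS = {
    # sign name -> (horizontal, vertical, oblique, winkelhaken) wedge counts
    "AN":  (1, 2, 2, 0),
    "DIS": (0, 1, 0, 0),
    "U":   (0, 0, 0, 1),
}

def manhattan(a, b):
    """L1 distance between two wedge-count vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

def classify(ssv):
    """Return the reference sign whose wedge-count vector is closest."""
    return min(REFERENCE_SIGNS, key=lambda name: manhattan(REFERENCE_SIGNS[name], ssv))

print(classify((0, 1, 0, 0)))  # DIS
```

A count vector like this is robust to where the wedges sit on the tablet, which is one plausible reason to classify on wedge structure rather than raw pixels.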
All modern critical approaches attempt to uncover the meanings and overtones of a text, each claiming to be better than the others at analyzing and reaching its intended meanings. The structural approach claims to do so more than any other modern critical approach, for it holds that what is read can be separated from the reader, on the presumed belief that a text can be read with a zero memory. However, studies in the criticism of criticism state that each of these approaches succeeds with the text in one or more respects while failing in others. Consequently, the question of whether the approach possesses the text, or whether the text rejects this possession, r…
This work concerns manufacturing high-efficiency polymeric materials that moderate fast neutrons by converting them into slow or thermal neutrons; these materials absorb thermal neutrons as well as the gamma rays associated with neutrons. Materials of low mass number are used to slow down fast neutrons, because neutrons have a high interaction cross-section with such materials, while materials of high mass number absorb gamma rays. Polyurethane and epoxy were mixed in various ratios to create a blend to serve as a neutron shield; lead (Pb) was then added to the blend at weight percentages of 20%, 30%, 40%, 50%, and 70% to produce a polymer composite. Polymeric materials reinforced with lead in various ratios were tested to select the best…
Feature selection (FS) is a series of processes that decides which relevant features/attributes to include and which irrelevant features to exclude for predictive modeling. It is a crucial task that helps machine-learning classifiers reduce error rates, computation time, and overfitting, and improve classification accuracy. It has demonstrated its efficacy in a myriad of domains, ranging from text classification (TC) and text mining to image recognition. While there are many traditional FS methods, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews concerning TC. Therefore, a comprehensive overview was systematicall…
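A metaheuristic FS method of the kind this review surveys can be sketched in miniature: a hill climber searches over feature subsets, scoring each one. This is a hedged illustration only; the fitness function below is a stand-in for the classifier accuracy a real wrapper method would measure, and the "relevant" feature set is invented.

```python
import random

# Toy wrapper-style feature selection with a hill-climbing metaheuristic.
# ASSUMPTION: fitness() stands in for a classifier's accuracy; real work
# would train and evaluate a TC model on each candidate subset.

N_FEATURES = 10
RELEVANT = {1, 3, 7}  # hypothetical ground-truth relevant features

def fitness(subset):
    """Reward covering relevant features, penalize subset size."""
    return len(subset & RELEVANT) - 0.1 * len(subset)

def hill_climb(iters=200, seed=0):
    rng = random.Random(seed)
    best = set(rng.sample(range(N_FEATURES), 3))
    for _ in range(iters):
        cand = set(best)
        cand.symmetric_difference_update({rng.randrange(N_FEATURES)})  # flip one feature
        if cand and fitness(cand) >= fitness(best):
            best = cand
    return best

selected = hill_climb()
print(sorted(selected))
```

Population-based metaheuristics (genetic algorithms, particle swarms) replace the single `best` subset with a pool of candidates, but the subset-encoding and fitness-evaluation loop is the same shape.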
The current study is unique in its emphasis on investigating the design process and concept from multiple scientific perspectives, including invention, technique, and design components. This research studies the methodology and creation of the design process in a holistic manner, so that readers may grasp their characteristics and properties down to the minutest epistemological detail. The investigation of the design concept is where the real groundwork and pressing need for the study begin. Creation and methodology are two primary concepts in relation to design, and these relationships can be formed in any design because of the various forces that act upon it. The primordial objective of this study is to evaluate the relationship betw…
Automatic document-summarization technology is evolving and may offer a solution to the problem of information overload. Multi-document summarization is an optimization problem that demands optimizing more than one objective function concurrently. The proposed work balances two significant objectives, content coverage and diversity, while generating a summary from a collection of text documents. Despite the large efforts devoted by several researchers to designing and evaluating many text-summarization techniques, their formulations lack any model that gives an explicit representation of coverage and diversity, the two contradictory semantics of any summary. The design of gener…
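The tension between the two objectives can be made concrete with a greedy, MMR-style selection: each candidate sentence is scored by its similarity to the whole collection (coverage) minus its similarity to sentences already chosen (redundancy, the inverse of diversity). This is an illustration of the two objectives, not the paper's model; the Jaccard similarity and the 0.7 trade-off weight are assumptions.

```python
# Hedged sketch of balancing coverage and diversity when picking summary
# sentences: a greedy MMR-style selection over bag-of-words similarity.

def words(sent):
    return set(sent.lower().split())

def sim(a, b):
    """Jaccard similarity between two sentences' word sets."""
    wa, wb = words(a), words(b)
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def summarize(sentences, k=2, trade_off=0.7):
    doc = " ".join(sentences)
    chosen = []
    while len(chosen) < k:
        def score(s):
            coverage = sim(s, doc)  # relevance to the whole collection
            redundancy = max((sim(s, c) for c in chosen), default=0.0)
            return trade_off * coverage - (1 - trade_off) * redundancy
        chosen.append(max((s for s in sentences if s not in chosen), key=score))
    return chosen

docs = [
    "the model improves coverage of the documents",
    "the model improves coverage of the documents greatly",
    "diversity avoids redundant sentences in the summary",
]
print(summarize(docs))
```

On this toy input the redundancy term stops the two near-duplicate sentences from both entering the summary, which is exactly the contradictory pull between coverage and diversity that the abstract describes.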
Flow-production systems whose pieces are connected in a row (electricity plants, cement plants, water-desalination plants) may not have fixed maintenance-scheduling procedures, because problems occur at different times. Contemporary software and artificial-intelligence (AI) technologies are used to fulfil the research objectives by developing a predictive-maintenance program. The data of the fifth thermal unit of the power station for the electricity of Al Dora/Baghdad are used in this study. The research was conducted in three stages. First, missing data without temporal sequences were processed: the data were filled in hour after hour as a time series, and the times were recorded as system working hours, making the volume of the data relativel…
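The first stage, rebuilding an hourly time series and filling the gaps, can be sketched as follows. This is a hedged illustration: the sensor values are invented, and the fill rule (carry the last reading forward) is an assumption, not necessarily the study's actual procedure.

```python
from datetime import datetime, timedelta

# Hedged sketch: rebuild an hour-by-hour series from a sensor log with
# missing hours, carrying the last known value forward into each gap.

readings = {  # hypothetical sensor log; hours 1 and 2 are missing
    datetime(2022, 1, 1, 0): 95.0,
    datetime(2022, 1, 1, 3): 97.5,
    datetime(2022, 1, 1, 4): 96.0,
}

def fill_hourly(log, start, end):
    """Return an hour-by-hour series, filling gaps with the last value."""
    series, t, last = [], start, None
    while t <= end:
        last = log.get(t, last)
        series.append((t, last))
        t += timedelta(hours=1)
    return series

filled = fill_hourly(readings, datetime(2022, 1, 1, 0), datetime(2022, 1, 1, 4))
for t, v in filled:
    print(t.strftime("%H:00"), v)
```

Interpolating between the neighbouring readings instead of carrying forward is an equally plausible choice; which rule fits depends on how the unit's sensors behave between samples.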
Electric discharge machining (EDM) is a novel thermoelectric manufacturing technique in which material is removed by a controlled spark-erosion process between two electrodes immersed in a dielectric medium. Because of the complexities of EDM, determining the optimum cutting parameters to improve cutting performance is extremely difficult. As a result, optimizing operating parameters is a critical processing step, particularly for non-traditional machining processes like EDM. Adequate selection of processing parameters for the EDM process does not guarantee ideal conditions, owing to the unpredictable processing time required for a given operation. Multiple-regression models and genetic algorithms are considered effective methods for determ…
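The pairing the abstract names, a regression model of the process response searched by a genetic algorithm, can be sketched in miniature. Everything concrete here is an assumption: the experimental data, the choice of discharge current and pulse-on time as inputs, surface roughness as the response, and the GA settings are invented for illustration.

```python
import random

# Hedged sketch: fit a multiple-regression model of an EDM response, then
# let a tiny genetic algorithm search the parameter box for a minimum.
# Data and variable choices are invented, not taken from the paper.

def fit_linear(xs, ys):
    """Least-squares fit y = b0 + b1*x1 + b2*x2 via the normal equations."""
    rows = [[1.0, x1, x2] for x1, x2 in xs]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    for c in range(3):  # Gaussian elimination with partial pivoting
        p = max(range(c, 3), key=lambda r: abs(xtx[r][c]))
        xtx[c], xtx[p] = xtx[p], xtx[c]
        xty[c], xty[p] = xty[p], xty[c]
        for r in range(c + 1, 3):
            f = xtx[r][c] / xtx[c][c]
            xtx[r] = [a - f * b for a, b in zip(xtx[r], xtx[c])]
            xty[r] -= f * xty[c]
    beta = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):  # back substitution
        beta[r] = (xty[r] - sum(xtx[r][j] * beta[j] for j in range(r + 1, 3))) / xtx[r][r]
    return beta

# Invented experiments: (current A, pulse-on us) -> surface roughness um
X = [(6, 50), (6, 100), (12, 50), (12, 100), (18, 50), (18, 100)]
y = [2.1, 2.6, 3.0, 3.6, 4.1, 4.8]
b0, b1, b2 = fit_linear(X, y)

def roughness(i, ton):
    return b0 + b1 * i + b2 * ton

def ga_minimize(gens=60, pop=20, seed=1):
    """Tiny GA: keep the best half, mutate it, within 6<=I<=18, 50<=Ton<=100."""
    rng = random.Random(seed)
    pop_ = [(rng.uniform(6, 18), rng.uniform(50, 100)) for _ in range(pop)]
    for _ in range(gens):
        pop_.sort(key=lambda p: roughness(*p))
        elite = pop_[: pop // 2]
        pop_ = elite + [
            (min(18, max(6, i + rng.gauss(0, 1))),
             min(100, max(50, t + rng.gauss(0, 5))))
            for i, t in elite
        ]
    return min(pop_, key=lambda p: roughness(*p))

best = ga_minimize()
# Roughness grows with both inputs here, so the GA drifts toward the (6, 50) corner.
print(best)
```

With a linear model the optimum is trivially at a corner of the box; the GA earns its keep once the regression includes interaction or quadratic terms, where the response surface is no longer monotone.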
Crime is a threat to any nation's security administration and jurisdiction. Crime analysis therefore becomes increasingly important, because it assigns a time and place based on collected spatial and temporal data. However, old techniques such as paperwork, investigative judges, and statistical analysis are not efficient enough to predict accurately the time and location at which a crime will take place. But when machine-learning and data-mining methods were deployed, crime-analysis and prediction accuracy increased dramatically. In this study, various types of criminal analysis and prediction using several machine-learning and data-mining techniques, based o…