Most systems that deal with natural language processing (NLP) and artificial intelligence (AI) can assist in making automated or automatically supported decisions. However, these systems may struggle to identify the information needed to elicit a decision by extracting or summarizing relevant material from large text documents, particularly documents obtained online from social networks and social media, where textual content is increasing remarkably. The main objective of the present study is to survey the latest developments in applying text-mining techniques in the humanities to summarization and automated decision making. The survey relies on technological advancement and considers (1) the automated-decision support techniques commonly used in the humanities, (2) the evolution of performance and the use of the stylometric approach in text mining, and (3) comparisons of the results of chunking text using different attributes in Burrows' Delta method. The study also provides an overview of the efficiency of applying selected data-mining (DM) methods with various text-mining techniques to support critics' decisions in artistry, one field of the humanities, where the automatic choice of criticism was supported by a hybrid approach combining these procedures.
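As a concrete illustration of the Delta method the survey compares, here is a minimal sketch of Burrows' Delta: word frequencies are z-scored against corpus statistics, and the distance between two texts is the mean absolute difference of their z-scores. The word list, frequencies, and texts below are toy values, not data from the study.

```python
# Minimal sketch of Burrows' Delta for stylometric comparison.
# All frequency figures here are invented toy values.

def burrows_delta(freqs_a, freqs_b, corpus_means, corpus_stds):
    """Mean absolute difference of z-scored word frequencies."""
    total = 0.0
    for w in corpus_means:
        z_a = (freqs_a.get(w, 0.0) - corpus_means[w]) / corpus_stds[w]
        z_b = (freqs_b.get(w, 0.0) - corpus_means[w]) / corpus_stds[w]
        total += abs(z_a - z_b)
    return total / len(corpus_means)

# Toy relative frequencies (per 1000 words) for three function words.
means = {"the": 60.0, "of": 30.0, "and": 25.0}
stds  = {"the": 5.0,  "of": 4.0,  "and": 3.0}
text_x = {"the": 65.0, "of": 28.0, "and": 26.0}
text_y = {"the": 55.0, "of": 33.0, "and": 22.0}

print(round(burrows_delta(text_x, text_y, means, stds), 3))
```

Smaller Delta values indicate more similar word-usage profiles; in practice the method is run over the few hundred most frequent words of a corpus rather than three.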
This paper is concerned with introducing and studying the M-space by using mixed degree systems, which are the core concept of this paper. A necessary and sufficient condition for the equivalence of two reflexive M-spaces is established. In addition, the m-derived graphs, m-open graphs, m-closed graphs, m-interior operators, m-closure operators, and M-subspaces are introduced. From an M-space, a unique supratopological space is obtained. Furthermore, m-continuous (m-open and m-closed) functions are defined, and the fundamental theorem of m-continuity is provided. Finally, the m-homeomorphism is defined and some of its properties are investigated.
Graphic design has recently seen great variety in its curricula and design methods, leading to applications that advance the profession and its aesthetic quality. This is due to the communicative role it plays in the movement of human beings and their aspirations. Among these applications are the designs of magazine covers in their different variations. Cover design is based on a structural system that requires awareness and skill from the designer, especially since the organizational structure of a cover is limited in both vocabulary and space: a dominant image usually prevails, sometimes alongside a secondary headline, while color values carry further meanings.
In all
The traffic congestion caused by the growing number of vehicles in cities, itself a result of population growth and construction density, requires the provision of appropriate infrastructure, transport systems, and logistics services that meet the population's needs and the many challenges of the present and the future. This can be achieved by introducing various modes of transport under integrated plans, such as pedestrian-friendly environments, bicycles with dedicated paths, light rail, metro, express buses, and public transport buses, and by developing high-level road projects, such as ring and major roads, integrated with the urban planning of
The aim of the research is to measure the correlation between modern manufacturing systems and process design and to measure the effect by adopting regression. The research consists of two main variables, modern manufacturing systems and process design, and was applied to the production lines of the General Company for Construction Industries. A sample of managers, engineers, technicians, administrators, and some workers was selected to fill in the questionnaire; (70) forms were distributed and (65) were approved as suitable for use. For data analysis, the correlation coefficient was adopted to measure the relationship and regression analysis to find out the effect, using (SPSS). So the first h
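The two analyses the abstract reports running in SPSS, a correlation coefficient and a simple regression, can be sketched as follows. The scores below are hypothetical Likert-scale averages invented for illustration, not the questionnaire data.

```python
# Sketch of Pearson correlation and simple OLS regression on paired scores.
# The numbers are made-up stand-ins, not the study's survey responses.
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def ols_slope_intercept(x, y):
    """Least-squares slope and intercept for y = b1*x + b0."""
    mx, my = mean(x), mean(y)
    b1 = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return b1, my - b1 * mx

manufacturing  = [3.2, 3.8, 4.1, 2.9, 3.5]  # hypothetical predictor scores
process_design = [3.0, 3.9, 4.3, 2.7, 3.4]  # hypothetical outcome scores

r = pearson_r(manufacturing, process_design)
b1, b0 = ols_slope_intercept(manufacturing, process_design)
print(round(r, 3), round(b1, 3))
```

A strong positive `r` together with a significant regression slope is the pattern such a study would report as supporting the relationship between the two variables.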
Today, there are large amounts of geospatial data available on the web, such as Google Map (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others. All of these services are called open-source geospatial data. Geospatial data from different sources often has variable accuracy due to different data-collection methods; therefore, data accuracy may not meet user requirements across organizations. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was asses
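A common way to express positional accuracy in comparisons of this kind is the root-mean-square error of the planar offsets between matched point pairs. The sketch below uses invented coordinates, not the Baghdad datasets, and is written in Python rather than the Visual Basic used for the actual tool.

```python
# Illustrative positional-accuracy check: RMSE of planar offsets between
# test points and reference points. Coordinates are hypothetical.
from math import hypot, sqrt

def rmse(points_test, points_ref):
    """RMSE of 2D offsets between matched point pairs (same units as input)."""
    errors = [hypot(xt - xr, yt - yr)
              for (xt, yt), (xr, yr) in zip(points_test, points_ref)]
    return sqrt(sum(e * e for e in errors) / len(errors))

gm_points = [(100.0, 200.0), (150.0, 250.0), (130.0, 220.0)]  # hypothetical GM coords (m)
mb_points = [(101.0, 201.0), (149.0, 252.0), (130.0, 223.0)]  # hypothetical reference coords (m)

print(round(rmse(gm_points, mb_points), 2))
```

Comparing the resulting RMSE against a mapping-accuracy threshold is one way a tool like the one described could decide whether the open-source data meets a user's requirement.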
The study relied on data about the health sector in Iraq in 2006, gathered in cooperation with the Ministry of Health and the Central Bureau of Statistics and Information Technology in 2007. The data included estimates of the population distribution of Baghdad province and the country, based on the 1997 population distribution, and an evaluation of the health sector covering health institutions, health staff, and other health services. The research aims to measure the amount and size of the growth of health services (increase and decrease), to compare what was achieved in Iraq and in Baghdad, and to evaluate the effectiveness of the distribution of health supplies and services (physical and human) relative to the size of the population distribution and
This study deals with the structural interpretation of 3D seismic reflection data from the Kumait oil field in southern Iraq, within the administrative boundaries of Maysan Governorate. Synthetic seismograms were prepared from the available data of the Kt-1 well using Petrel software to define and pick the reflector of the Zubair Formation, which represents the Cretaceous age, on the seismic section. The study shows that the Kumait structure is an anticline fold. It is thought to be a structural trap caused by the collision of the Arabian and Iranian plates, trending in the same direction as the driving forces in the area, from the northwest to the southeast, with the overall trend of the strata to the north and northeast. Sei
The research deals with the absorption and fluorescence spectra of an epoxy resin doped with the organic dye Rhodamine 6G (R6G) at different concentrations (5×10⁻⁶, 5×10⁻⁵, 1×10⁻⁵, 1×10⁻⁴, 5×10⁻⁴) mol/ℓ at room temperature. The quantum efficiency Q_fm, the rate of fluorescence emission K_fm (s⁻¹), the non-radiative lifetime τ_fm (s), the fluorescence lifetime τ_f, and the Stokes shift were calculated. The energy gap (E_g) for each dye concentration was also evaluated. The results showed that the maximum quantum effi
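For reference, the photophysical quantities listed in this abstract are related by standard textbook formulas; the notation below is generic and may differ from the paper's own symbol definitions.

```latex
% Standard photophysics relations (textbook forms, not the paper's data):
Q = \frac{k_r}{k_r + k_{nr}} = \frac{\tau_f}{\tau_0},
\qquad \tau_0 = \frac{1}{k_r},
\qquad \Delta\bar{\nu}_{\mathrm{Stokes}} = \frac{1}{\lambda_{\mathrm{abs}}} - \frac{1}{\lambda_{\mathrm{em}}}
```

Here $k_r$ is the radiative (emission) rate, $k_{nr}$ the non-radiative rate, $\tau_f$ the measured fluorescence lifetime, $\tau_0$ the natural (radiative) lifetime, and the Stokes shift is expressed in wavenumbers from the absorption and emission peak wavelengths.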
As material flow cost accounting technology focuses on the most efficient use of resources, such as energy and materials, while minimizing negative environmental effects, the research aims to show how this technology can be applied to promote green productivity and how this is reflected in attaining sustainable development. In addition to studying sustainability, which helps to reduce environmental impacts and increase green productivity, the research aims to demonstrate the knowledge bases of accounting for the costs of material flow and of green productivity. It also studies the role of material flow cost accounting technology and the role of green productivity in achieving sustainable development. According
With the freedom offered by the Deep Web, people have the opportunity to express themselves freely and discreetly, and sadly, this is one of the reasons why people carry out illicit activities there. In this work, a novel dataset of Dark Web active domains, known as crawler-DB, is presented. To build crawler-DB, the Onion Routing network (Tor) was sampled, and a web crawler capable of following links was built. The link addresses gathered by the crawler are then classified automatically into five classes. The algorithm built in this study demonstrated good performance, achieving an accuracy of 85%. A popular text representation method was used with the proposed crawler-DB crossed by two different supervise
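The abstract does not name the text representation or the classifiers, but the pipeline shape it describes (vectorize crawled text, then assign one of five classes) can be sketched with TF-IDF vectors and a nearest-neighbour cosine match. All class labels and text snippets below are illustrative stand-ins, not crawler-DB data.

```python
# Sketch of a TF-IDF + cosine-similarity text classifier, mirroring the
# pipeline shape described in the abstract. Labels and snippets are invented.
from collections import Counter
from math import log, sqrt

def tfidf_vectors(docs):
    """Return one term->weight dict per document (raw tf * smoothed idf)."""
    n = len(docs)
    df = Counter(t for doc in docs for t in set(doc.split()))
    idf = {t: log(n / df[t]) + 1.0 for t in df}
    return [{t: c * idf[t] for t, c in Counter(doc.split()).items()} for doc in docs]

def cosine(u, v):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(u[t] * v[t] for t in u if t in v)
    nu = sqrt(sum(w * w for w in u.values()))
    nv = sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

train_texts = ["market sell buy bitcoin", "forum chat post reply",
               "leak dump credentials password"]
train_labels = ["marketplace", "forum", "data-leak"]

# Vectorize training snippets together with one unseen query document.
vecs = tfidf_vectors(train_texts + ["buy bitcoin market"])
query = vecs[-1]
best = max(range(len(train_texts)), key=lambda i: cosine(vecs[i], query))
print(train_labels[best])
```

A real system would train proper supervised classifiers on many labelled pages per class; this sketch only shows how a vectorized page gets mapped to its closest class.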