Crime is unlawful activity of any kind and is punished by law. Crime affects a society's quality of life and economic development, and with crime rising globally there is a need to analyze crime data in order to bring the rate down. Such analysis helps the police and the public take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid crime pattern analysis and thus support the Boston police department's crime prevention efforts. Geographical location was adopted as a feature of our model because it is influential in many situations, whether traveling to a specific area or living in it, and it helps people distinguish between secure and insecure environments. Geo-location, combined with new approaches and techniques, can be extremely useful in crime investigation. The work is a comparative study of three supervised learning algorithms, each trained and tested on the same data. Decision Tree, Naïve Bayes, and Logistic Regression classifiers were applied to the Boston city crime dataset to predict the type of crime occurring in an area. The outputs of these methods were compared to find the model that best fits this type of data with the best performance. From the results obtained, the Decision Tree achieved the highest result compared to Naïve Bayes and Logistic Regression.
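The three-classifier comparison described above can be sketched as follows. This is a minimal illustration only: the Boston crime dataset is not reproduced here, so a small synthetic stand-in (invented features such as an encoded district, hour of day, and day of week) is used in its place.

```python
# Hedged sketch: compare Decision Tree, Naive Bayes, and Logistic Regression
# on synthetic stand-in data (the real Boston crime dataset is not included).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(0, 12, n),   # hypothetical district code
    rng.integers(0, 24, n),   # hour of day
    rng.integers(0, 7, n),    # day of week
])
# hypothetical crime-type label, loosely tied to the features
y = (X[:, 0] + X[:, 1] // 8) % 3

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    scores[name] = accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: {scores[name]:.3f}")
```

The same held-out accuracy comparison, run on the actual dataset, is how the best-fitting model would be selected.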
The research aims to measure the technical and scale efficiency (SE) of the departments of the College of Administration and Economics at the University of Baghdad over the academic years 2013-2014 to 2018-2019 using Data Envelopment Analysis (DEA) with both input and output orientations, in order to maintain the college's distinguished competitive position and to identify and address weaknesses in performance. The research problem lies in diagnosing which specializations are most accepted in the labor market and determining the reasons for students' reluctance to enter some departments. The (Win4DEAp) program was used to measure technical and scale efficiency (SE) and rely on
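As an illustration of the kind of efficiency score DEA software such as Win4DEAp reports, the input-oriented CCR model (constant returns to scale) can be sketched as a linear program. The department data below are invented for the example.

```python
# Hedged sketch of the input-oriented CCR DEA model: for each decision-making
# unit (DMU) o, minimise theta subject to a reference combination of all DMUs
# using no more than theta * inputs_o while producing at least outputs_o.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Technical efficiency of DMU o. X: (n_dmu, n_in), Y: (n_dmu, n_out)."""
    n = X.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # variable 0 is theta
    # inputs:  sum_j lam_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
    b_in = np.zeros(X.shape[1])
    # outputs: -sum_j lam_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

# toy data: 4 departments, 2 inputs (staff, budget), 1 output (graduates)
X = np.array([[5., 8.], [6., 10.], [4., 6.], [8., 12.]])
Y = np.array([[60.], [65.], [50.], [70.]])
scores = [ccr_efficiency(X, Y, o) for o in range(len(X))]
print(np.round(scores, 3))
```

A score of 1 marks a department on the efficient frontier; scores below 1 quantify how far inputs could be proportionally reduced.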
This paper sheds light on the dimensions of scheduling the nursing service, including ease of performing the service, willingness, health factors, psychological factors, family matters, and reduced waiting time, which improve the performance of the nursing process, including willingness to perform, ability to perform, and the opportunity to perform. A genuine problem in Iraqi hospitals lies in the weakness of nursing staffs and the absence of a central decision to define and organize schedules; the researcher has therefore chosen this problem as the research topic. The research aims to develop the nursing service
The objective of image fusion is to merge multiple source images in such a way that the final representation contains more useful information than any single input. In this paper, a weighted average fusion method is proposed. It relies on weights extracted from the source images using the contourlet transform. The weights are obtained by setting the approximation coefficients of the transform to zero, then taking the inverse contourlet transform to recover the details of the images to be fused. The performance of the proposed algorithm has been verified on several grey-scale and color test images and compared with existing methods.
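The detail-weighted averaging step can be sketched as follows. Since the contourlet transform is not available in standard libraries, this sketch approximates the "zero the approximation coefficients, then inverse-transform" step with a simple box-filter low-pass split (details = image minus smoothed image); it illustrates the weighting idea, not the paper's exact transform.

```python
# Hedged sketch of weighted-average image fusion. The contourlet detail
# extraction is replaced by a box-filter high-pass residue for illustration.
import numpy as np

def details(img, k=5):
    """High-frequency residue of img after a k x k box low-pass filter."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    smooth = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            smooth += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    smooth /= k * k
    return img - smooth

def fuse(a, b, eps=1e-8):
    """Per-pixel weighted average; weights follow local detail strength."""
    wa = np.abs(details(a)) + eps
    wb = np.abs(details(b)) + eps
    return (wa * a + wb * b) / (wa + wb)

rng = np.random.default_rng(1)
a = rng.random((64, 64))   # stand-ins for two registered source images
b = rng.random((64, 64))
f = fuse(a, b)
```

Because the weights are positive and normalized, each fused pixel lies between the two source pixels, with the more detailed source dominating locally.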
Wireless sensor applications are constrained by limited energy, and most of that energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly, so deploying effective data aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as the selection of an appropriate clustering algorithm may yield positive results in the data aggregation
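The energy-saving idea behind cluster-based aggregation can be sketched briefly: member nodes report to a nearby cluster head, and each head forwards one aggregated value instead of every raw reading. The node positions, readings, and choice of heads below are invented for illustration.

```python
# Hedged sketch of cluster-based data aggregation in a wireless sensor
# network: 40 messages from individual nodes collapse to 3 aggregated ones.
import numpy as np

rng = np.random.default_rng(2)
positions = rng.random((40, 2))                    # 40 nodes on a unit square
readings = 20 + 5 * positions[:, 0] + rng.normal(0, 0.5, 40)  # e.g. temperature
heads = positions[[3, 17, 29]]                     # arbitrarily chosen heads

# assign each node to its nearest cluster head
dist = np.linalg.norm(positions[:, None, :] - heads[None, :, :], axis=2)
cluster = dist.argmin(axis=1)

# each head aggregates its members' readings into a single mean value
aggregated = np.array([readings[cluster == c].mean()
                       for c in range(len(heads))])
print(len(readings), "raw messages ->", len(aggregated), "aggregated messages")
```

The surveyed approaches differ mainly in how the heads are chosen and rotated, which is where the choice of clustering algorithm affects energy use.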
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have small or inadequate datasets. Manual labeling is usually needed to provide labeled data, typically involving human annotators with broad background knowledge; this annotation process is costly, time-consuming, and error-prone. Every DL framework is usually fed a significant amount of labeled data in order to learn representations automatically, and a larger amount of data generally yields a better DL model, although performance is also application dependent. This issue is the main barrier for
The study analyzes the inconsistent structural and semantic aspects found in the plays of N. A. Ostrovsky. The analysis, which draws on the linguistic schools of thought in the modern Russian language, is performed chronologically to clarify the ambiguities that learners of Russian may face. Such difficulties lie in the use of inconsistent aspects with complete declarative sentences and adverbial clauses; hence, the study constructs a new sentence category consisting of a secondary clause and its syncretic semantics.
The study illustrates the wide scope of studying the sentence's inconsistent
The purpose of the current study is to analyze the content of computer textbooks for the intermediate stage in Iraq according to the theory of multiple intelligences, by answering the following question: what is the percentage of availability of multiple intelligences in the content of the computer textbooks for the intermediate stage (grades I, II) for the academic year (2017-2018)? The researcher followed the descriptive analytical research approach (content analysis) and adopted the explicit idea as the unit of registration. The research tool was prepared according to Gardner's classification of multiple intelligences, and its validity and reliability were established. The study found the percentage of multiple intelligences in the content of computer textbooks for the intermediate
To limit or reduce the common microbial contamination found in dairy products in general, and in soft cheese in particular, produced in local plants, this study was performed to demonstrate the possibility of implementing HACCP in one of the dairy plants in Baghdad city.
An HACCP plan was proposed for the soft cheese production line. A pre-evaluation of the line was performed, and the HACCP prerequisite programs were evaluated for their presence and effectiveness. The evaluation identified risks in each of the prerequisites, including the Good Manufacturing Practice (GMP) program, evaluated as a microbial and physical risk and considered as a critical r
The critical buckling and natural frequency behavior of laminated composite thin plates subjected to in-plane uniform load is obtained using classical laminated plate theory (CLPT). An analytical investigation is presented using the Ritz method for the eigenvalue problems of buckling load solutions for symmetric and anti-symmetric, angle- and cross-ply laminated composite plates with different elastic supports along their edges. The equation of motion of the plate was derived using the principle of virtual work and solved using a modified Fourier displacement function that satisfies general edge conditions. Various numerical investigations were conducted to demonstrate the convergence and accuracy of the present solution for design parameters such as edge
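In a Ritz formulation of this kind, the assumed displacement series reduces both problems to generalized eigenvalue problems. The forms below are the standard CLPT statements, not the paper's exact notation: K is the stiffness matrix from the strain energy, K_G the geometric stiffness from the in-plane load, M the mass matrix, and c the vector of Ritz coefficients.

```latex
% Buckling (lambda scales the in-plane load) and free vibration
% (omega is the natural frequency) in generic Ritz form:
\left( \mathbf{K} - \lambda\, \mathbf{K}_G \right)\mathbf{c} = \mathbf{0},
\qquad
\left( \mathbf{K} - \omega^{2}\, \mathbf{M} \right)\mathbf{c} = \mathbf{0}
```

The smallest eigenvalue of each problem gives the critical buckling load and the fundamental frequency, respectively.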
This paper addresses principal component analysis (PCA) for dimensionality reduction via linear combinations in digital image processing and analysis. PCA is a statistical technique that compresses a multivariate data set of inter-correlated variables into a data set of uncorrelated linear combinations, while ensuring the least possible loss of useful information. The method was applied to a group of satellite images of an area in the province of Basra covering the mouth of the Tigris and Euphrates rivers at the Shatt al-Arab.
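The core computation can be sketched on a multiband image: each pixel is treated as a vector of band values, the bands are centered, and an SVD yields uncorrelated components ordered by explained variance. The 6-band scene below is random stand-in data, not the Basra imagery.

```python
# Hedged sketch of PCA dimensionality reduction on a multiband image.
import numpy as np

rng = np.random.default_rng(3)
bands = rng.random((6, 100, 100))        # stand-in for a 6-band satellite scene
X = bands.reshape(6, -1).T               # (n_pixels, n_bands) data matrix

Xc = X - X.mean(axis=0)                  # center each band
# SVD of the centered data gives the principal components directly
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
var = S**2 / (len(Xc) - 1)               # component variances, descending
ratio = var / var.sum()                  # explained-variance ratios

k = 3                                    # keep the first k components
scores = Xc @ Vt[:k].T                   # reduced (n_pixels, k) representation
print("explained variance of first", k, "components:", ratio[:k].sum())
```

Keeping only the leading components discards the correlated redundancy between bands while retaining most of the scene's variance, which is the compression the abstract describes.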