Crime is any unlawful activity punishable by law, and it affects a society's quality of life and economic development. With crime rising sharply worldwide, crime data must be analyzed to help bring the crime rate down, enabling police and the public to take the measures needed to restrict crime more effectively. The purpose of this research is to develop predictive models that can aid crime pattern analysis and thus support the Boston Police Department's crime prevention efforts. Geographical location has been adopted as a factor in our model because it is influential in several situations, whether travelling to a specific area or living in it, and it helps people distinguish between secure and insecure environments. Geo-location, combined with new approaches and techniques, can be extremely useful in crime investigation. The aim is a comparative study of three supervised learning algorithms, each trained and tested on the dataset to obtain the desired results. Decision Tree, Naïve Bayes, and Logistic Regression classifiers have been applied to the Boston city crime dataset to predict the type of crime that occurs in an area. The outputs of these methods are compared to find the model that best fits this type of data with the best performance. From the results obtained, the Decision Tree achieved the highest performance compared to Naïve Bayes and Logistic Regression.
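The three-classifier comparison described above can be sketched as follows. This is a minimal illustration using scikit-learn on a synthetic stand-in dataset; the real Boston crime features and preprocessing are not reproduced here, and the feature names in the comment are assumptions.

```python
# Hedged sketch: comparing the three classifiers named in the abstract.
# The synthetic data stands in for crime records (e.g. coordinates, hour).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=6, n_classes=3,
                           n_informative=4, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=42)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)                       # train on the split
    scores[name] = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {scores[name]:.3f}")
```

On the actual crime data, accuracy alone may mislead for imbalanced crime types; per-class precision and recall would normally accompany such a comparison.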
BN Rashid
This research examined the effect of increasing the volume of indebtedness of companies listed on the Iraq Stock Exchange on the trading volume of those companies. It covers theoretical concepts related to both debt financing and trading volume. The research community comprises the joint-stock companies listed on the Iraq Stock Exchange (the banking sector). The research sample was deliberately chosen to consist of companies with continuous, uninterrupted trading, amounting to 10 joint-stock companies, over the period 2011-2015, and a set of indicators and financial methods was used in measuring research v
The research studied the relation between the market share of the sample banks and the revenues achieved from investment. The prevailing belief is that return on investment can be enhanced as banks increase their shares of their markets, after succeeding in achieving successive growth rates in their sales, attaining suitable market coverage for their products, and carrying out suitable dissemination and promotion activity. Market share represents competition among the banks, and markets treat market share as a strategic objective, both to maintain it and also increasi
In this study, we briefly review the ARIMA (p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
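The recursive KF estimation mentioned above can be illustrated with the simplest dynamic linear model, a 1-D local level. This is a minimal sketch, not the study's exact specification; the noise variances `q` and `r` and the priors are assumed values.

```python
# Minimal Kalman-filter recursion for a local-level DLM (illustrative only).
def kalman_filter(observations, q=0.1, r=1.0, m0=0.0, p0=10.0):
    """Return filtered state means.
    q: state noise variance, r: observation noise variance (assumed)."""
    m, p = m0, p0
    filtered = []
    for y in observations:
        p_pred = p + q              # predict: uncertainty grows by q
        k = p_pred / (p_pred + r)   # Kalman gain
        m = m + k * (y - m)         # update mean toward the observation
        p = (1 - k) * p_pred       # update variance
        filtered.append(m)
    return filtered

ys = [1.0, 1.2, 0.9, 1.1, 5.0, 1.0]
est = kalman_filter(ys)
print(est[-1])
```

Each pass uses only the previous mean and variance, which is what makes the estimator recursive and suitable for correlated, sequentially arriving observations.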
Data compression offers an attractive approach to reducing communication costs by using available bandwidth effectively, so it makes sense to pursue research on algorithms that use the available network most effectively. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
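A simple pipeline combining the two operations can be sketched as follows. Note this sequential compress-then-encrypt sketch is not the paper's embedded scheme, and the SHA-256-based keystream is a toy construction for illustration, not a secure cipher.

```python
import zlib
import hashlib

def keystream(key: bytes):
    """Toy keystream from iterated SHA-256 (illustrative, NOT secure)."""
    block = key
    while True:
        block = hashlib.sha256(block).digest()
        yield from block

def compress_encrypt(text: str, key: bytes) -> bytes:
    # Compress first: ciphertext is high-entropy and would not compress.
    compressed = zlib.compress(text.encode("utf-8"))
    return bytes(b ^ k for b, k in zip(compressed, keystream(key)))

def decrypt_decompress(blob: bytes, key: bytes) -> str:
    # XOR with the same keystream undoes the encryption, then decompress.
    compressed = bytes(b ^ k for b, k in zip(blob, keystream(key)))
    return zlib.decompress(compressed).decode("utf-8")

msg = "compress then encrypt " * 20
blob = compress_encrypt(msg, b"secret")
print(len(msg.encode("utf-8")), "->", len(blob))
```

The ordering matters: compressing before encrypting preserves the redundancy the entropy coder needs, which is one reason embedding the cipher inside the coder, as this work proposes, is attractive.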
Today, large amounts of geospatial data are available on the web, such as Google Map (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others; all of these services are called open-source geospatial data. Geospatial data from different sources often has variable accuracy due to different data collection methods, so its accuracy may not meet user requirements in various organizations. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was asses
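A common way to quantify positional accuracy in such comparisons is the RMSE of planar offsets between matched point pairs. The sketch below (in Python rather than the paper's Visual Basic, with made-up coordinates) shows the computation; it is an illustration, not the tool's actual code.

```python
import math

def positional_rmse(reference, test_points):
    """RMSE of planar offsets between matched (E, N) point pairs, in metres."""
    sq = [(xr - xt) ** 2 + (yr - yt) ** 2
          for (xr, yr), (xt, yt) in zip(reference, test_points)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical matched points: formal (MB) coordinates vs GM coordinates.
mb = [(100.0, 200.0), (150.0, 250.0), (180.0, 300.0)]
gm = [(101.0, 201.0), (149.0, 251.0), (182.0, 299.0)]
print(round(positional_rmse(mb, gm), 3))  # → 1.732
```

The pairing of points (matching the same feature in both datasets) is the hard part in practice; the RMSE itself is a one-line summary once pairs exist.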
Immunization is one of the most cost-effective and successful public health interventions. Its results are difficult to see, because the incidence of disease is low, while adverse effects following immunization are noticeable, particularly when the vaccine is given to an apparently healthy person. The population holds high safety expectations for vaccines, so people are prone to hesitancy over even a small risk of adverse events, which may lead to loss of pub
The vast advantages of the 3D modelling industry have urged competitors to improve capturing techniques and processing pipelines to minimize labour requirements, save time, and reduce project risk. When it comes to digital 3D documentation and conservation projects, laser scanning and photogrammetry are compared in order to choose between the two. Since both techniques have pros and cons, this paper examines the potential issues of each technique in terms of time, budget, accuracy, density, methodology, and ease of use. A terrestrial laser scanner and close-range photogrammetry are tested to document a unique, invaluable artefact (the Lady of Hatra) located in Iraq for future data fusion sc
The oil and gas industry relies heavily on IT innovations to manage business processes, but the exponential generation of data has led to concerns about processing big data, generating valuable insights, and making timely decisions. Many companies have adopted Big Data Analytics (BDA) solutions to address these challenges. However, deciding to adopt BDA solutions requires a thorough understanding of the contextual factors influencing these decisions. This research explores these factors using a new Technology-Organisation-Environment (TOE) framework, presenting technological, organisational, and environmental factors. The study used a Delphi research method with seven heterogeneous panellists from an Omani oil and gas company