Crime is unlawful activity of any kind and is punished by law. Crime affects a society's quality of life and economic development. With the large rise in crime globally, there is a need to analyze crime data to bring down the crime rate, and such analysis encourages the police and the public to take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid in crime pattern analysis and thus support the Boston police department's crime prevention efforts. Geographical location has been adopted as a factor in our model because it is influential in several situations, whether travelling to a specific area or living in it, and it helps people distinguish between a secure and an insecure environment. Geo-location, combined with new approaches and techniques, can be extremely useful in crime investigation. The aim is a comparative study of three supervised learning algorithms, each trained and tested on the same data sets. Three machine learning classifiers, Decision Tree, Naïve Bayes, and Logistic Regression, have been applied to the Boston city crime dataset to predict the type of crime that occurs in an area. The outputs of these methods are compared to find the model that best fits this type of data with the best performance. From the results obtained, the Decision Tree demonstrated the best performance compared to Naïve Bayes and Logistic Regression.
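The comparison described in this abstract can be sketched as follows. This is a minimal illustration only: the synthetic features below stand in for the Boston crime data (the real dataset, its features, and its class labels are assumptions, not reproduced here), and the three classifiers are the ones the abstract names, as implemented in scikit-learn.

```python
# Sketch: comparing Decision Tree, Naive Bayes, and Logistic Regression on a
# synthetic stand-in for a crime-type classification task.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Synthetic features standing in for geo-location, time, etc. (assumption)
X, y = make_classification(n_samples=2000, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy = {acc:.3f}")
```

On the real data, the same held-out accuracy comparison (optionally with cross-validation) is what supports the abstract's conclusion that one classifier fits the data best.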
A database is an arrangement of data organized and distributed in a way that allows the client to access the stored data simply and conveniently. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
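The map-shuffle-reduce pattern the abstract relies on can be sketched in plain Python. The `channel,sample` record format is an assumption for illustration only; on a real Hadoop deployment the `mapper` and `reducer` functions would run as distributed tasks over the stored EEG big data rather than over an in-memory list.

```python
# Minimal single-process sketch of the Map-Reduce pattern:
# map -> shuffle/sort by key -> reduce, computing per-channel mean amplitude.
from itertools import groupby
from operator import itemgetter

def mapper(record):
    """Emit a (channel, sample) pair from one raw EEG record."""
    channel, value = record.split(",")
    yield channel, float(value)

def reducer(channel, values):
    """Aggregate all samples for one channel into a mean amplitude."""
    values = list(values)
    return channel, sum(values) / len(values)

raw = ["C3,1.0", "C4,2.0", "C3,3.0", "C4,4.0", "C3,5.0"]  # hypothetical records

# Map phase
pairs = [kv for rec in raw for kv in mapper(rec)]
# Shuffle/sort phase: group intermediate pairs by key
pairs.sort(key=itemgetter(0))
# Reduce phase
result = dict(reducer(k, (v for _, v in g))
              for k, g in groupby(pairs, key=itemgetter(0)))
print(result)  # → {'C3': 3.0, 'C4': 3.0}
```

The response-time gain the abstract reports comes from running many such map and reduce tasks in parallel across the cluster, not from the functions themselves.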
The "Nudge" Theory is one of the most recent theories, clearly visible in the economic, health, and educational sectors owing to the intensity of studies on it and of its applications, but it has not yet been included in crime-prevention studies. Using Nudge theory appears to enrich the field of crime prevention and to provide modern, effective, and implementable mechanisms.
The study follows the "integrative review" approach, a distinctive form of research that generates new knowledge on a topic by reviewing, criticizing, and synthesizing representative literature on the topic in an integrated manner, so that new frameworks and perspectives are created around it.
The study is bas…
This review explores the Knowledge Discovery in Databases (KDD) approach, which helps the bioinformatics domain progress efficiently, and illustrates its relationship with data mining. It is important to exploit the advantages of Data Mining (DM) strategy management, such as its role in cost control, which is the principle of competitive intelligence; its role in information management; and its ability to discover hidden knowledge. However, there are many challenges, such as inaccurate, hand-written data and the need to analyze large amounts of varied information in order to extract useful knowledge with DM strategies. These strategies are successfully applied in several applications, such as data wa…
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault-detection performance, which is measured by the average percentage of fault detection (APFD). History-based TCP is a family of TCP techniques that uses past execution data to prioritize test cases. Allocating equal priority to several test cases is a common problem for most TCP techniques, yet it has not been explored in history-based TCP techniques. To work around this problem in regression testing, most researchers resort to randomly ordering the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement…
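The APFD metric the abstract uses to compare prioritizations has a standard closed form, sketched below. The example fault positions are hypothetical, chosen only to exercise the formula.

```python
# APFD = 1 - (TF1 + ... + TFm) / (n * m) + 1 / (2n)
# where TFi is the 1-based position, in the prioritized suite, of the first
# test that reveals fault i; n is the number of tests; m the number of faults.
def apfd(first_positions, n_tests):
    """Average percentage of fault detection for one prioritized test suite."""
    m = len(first_positions)
    return 1 - sum(first_positions) / (n_tests * m) + 1 / (2 * n_tests)

# Hypothetical suite of 5 tests whose 3 faults are first revealed by the
# tests at positions 1, 2, and 4 in the prioritized order.
print(apfd([1, 2, 4], 5))  # → 0.6333...
```

A prioritization that surfaces fault-revealing tests earlier lowers the `TFi` values and therefore raises APFD, which is why ties broken at random (rather than by history) can hurt the measured score.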
This paper deals with the design and implementation of an ECG system. The proposed system offers a new concept of ECG signal manipulation, storage, and editing. It consists mainly of hardware circuits and the related software. The hardware includes the ECG signal-capture circuits and the system interfaces. The software is written in the Visual Basic language to perform the task of identifying the ECG signal. The main advantage of the system is that it provides a reported ECG recording on a personal computer, so that the recording can be stored and processed at any time as required. The system was tested on different ECG signals, some abnormal and others normal, and the results show that the system has a good quality of diagno…
Land Use / Land Cover (LULC) classification is one of the basic tasks that decision makers and map makers rely on to evaluate infrastructure, using different types of satellite data, despite the large spectral differences or overlaps within the same land cover, in addition to the problems of aberration and image tilt that may negatively affect classification performance. The main objective of this study is to develop a working method for classifying land cover from high-resolution satellite images using an object-based method. Pixel-based maximum-likelihood supervised classification as well as object-based approaches were examined on a QuickBird satellite image of Karbala, Iraq. This study illustrated that…
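The pixel-based maximum-likelihood step mentioned above can be sketched as follows. Each land-cover class is modelled as a multivariate Gaussian over band values, and a pixel is assigned to the class with the highest likelihood; scikit-learn's `QuadraticDiscriminantAnalysis` implements exactly this Gaussian rule. The band values below are synthetic stand-ins for QuickBird pixels (an assumption, not the study's data).

```python
# Sketch: Gaussian maximum-likelihood classification of pixels into two
# land-cover classes from synthetic 4-band spectra.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)
# Training pixels for two classes (e.g. vegetation vs. built-up), 4 bands each
veg = rng.normal(loc=[0.1, 0.3, 0.2, 0.6], scale=0.05, size=(100, 4))
urb = rng.normal(loc=[0.4, 0.4, 0.5, 0.3], scale=0.05, size=(100, 4))
X = np.vstack([veg, urb])
y = np.array([0] * 100 + [1] * 100)   # 0 = vegetation, 1 = built-up

clf = QuadraticDiscriminantAnalysis(store_covariance=True).fit(X, y)
pixel = np.array([[0.12, 0.29, 0.21, 0.58]])  # spectrum near the class-0 mean
print(clf.predict(pixel))  # → [0]
```

The object-based approach the study favours differs in that it first segments the image and classifies whole segments (using shape and texture as well as spectra) rather than individual pixels.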
The emergence of SARS-CoV-2, the virus responsible for the COVID-19 pandemic, has resulted in a global health crisis causing widespread illness, death, and disruption to daily life. A vaccine for COVID-19 is crucial to controlling the spread of the virus, which will help end the pandemic and restore normalcy to society. Messenger RNA (mRNA) vaccines led the way as the swiftest vaccine candidates for COVID-19, but they face key potential limitations, including spontaneous degradation. To address the mRNA-degradation issue, Stanford University academics and the Eterna community sponsored a Kaggle competition. This study aims to build a deep learning (DL) model that predicts degradation rates at each base of the mRNA…
... Show MoreCybercrime and risks to children between the problems and solutions( An analytical study in the light of international, Arab and national statistics).
The problem of the study lies in identifying a new type of crime that differs in character from the traditional crimes customary since human creation. With the information revolution and our entry into the era of globalization came what is called cybercrime, with its negative impact on all segments of society, especially children, as it is today among the most important threats to social security for all local and international communities alike. These risks require collective action across the various sectors and segments of society, especially the educated classes, in order t…
The most significant task in oil exploration is determining the reservoir facies, which are based mostly on the primary features of the rocks. Porosity, water saturation, and shale volume, as well as the sonic log and bulk density, are the input data used in the Interactive Petrophysics software to compute rock facies. These data are used to create 15 clusters and four groups of rock facies. Furthermore, accurate matching between core and well-log data is established by a neural-network technique. In the current study, to evaluate the applicability of the cluster-analysis approach, the rock-facies results derived from cluster analysis of 29 wells were used to redistribute the petrophysical properties for six units of the Mishri…
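The clustering step described above can be sketched generically with k-means over the five log responses the abstract lists. The synthetic samples, the scaling step, and the use of k-means itself are assumptions for illustration; the study performs its clustering inside Interactive Petrophysics, producing 15 clusters.

```python
# Sketch: grouping well-log responses (porosity, Sw, Vsh, sonic, bulk density)
# into 15 facies clusters with k-means on standardized features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Columns: porosity (frac), water saturation (frac), shale volume (frac),
# sonic transit time (us/ft), bulk density (g/cc) — synthetic values
logs = np.column_stack([
    rng.uniform(0.05, 0.30, 500),
    rng.uniform(0.10, 1.00, 500),
    rng.uniform(0.00, 0.60, 500),
    rng.uniform(55, 110, 500),
    rng.uniform(2.2, 2.7, 500),
])

X = StandardScaler().fit_transform(logs)   # put logs on a common scale
facies = KMeans(n_clusters=15, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(facies, minlength=15))   # samples assigned to each cluster
```

Standardizing first matters because the sonic log (tens of us/ft) would otherwise dominate the fractional porosity and saturation values in the Euclidean distance.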
Background: Many thymoma classifications have been proposed and later updated or replaced by newer or alternative schemes. Many classifications used the morphology and histogenesis of the normal thymus as their backbone, while other classifications followed a more simplified scheme in which thymomas were grouped by biological behavior. The WHO classification is the one currently advocated; it is based on "organotypical" features (i.e., histological characteristics mimicking those observed in the normal thymus), including cytoarchitecture (encapsulation and a "lobular architecture") and the cellular composition, of which the nuclear morphology is the most generally appreciated.
Objectives: Thi…