Crime is a threat to any nation's security administration and jurisdiction. Crime analysis has therefore become increasingly important, because it infers the time and place of offences from collected spatial and temporal data. Traditional techniques, such as paperwork, investigative judges, and simple statistical analysis, are not efficient enough to predict accurately when and where a crime will take place. When machine learning and data mining methods were deployed in crime analysis, however, analysis and prediction accuracy increased dramatically. In this study, various types of crime analysis and prediction using several machine learning and data mining techniques are surveyed and introduced, compared on the accuracy reported in previous work, with the aim of producing a concise review of the use of these algorithms in crime prediction. This review is expected to help present such techniques to crime researchers, and to support future research on developing these techniques for crime analysis by providing crime definitions, the challenges facing prediction systems, and classifications with a comparative study. The literature shows that supervised learning approaches have been used in more crime-prediction studies than other approaches, and that logistic regression is the most powerful method for predicting crime.
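As a minimal sketch of the supervised approach this abstract highlights, the snippet below trains a scikit-learn logistic regression classifier on synthetic spatio-temporal features; the feature names (hour, latitude, longitude) and all data are assumptions for illustration, not from any study surveyed here.

```python
# Minimal sketch: logistic regression for crime prediction on
# hypothetical spatio-temporal features. All data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 2000
# Assumed features: hour of day, latitude, longitude.
X = np.column_stack([
    rng.integers(0, 24, n),        # hour of day
    rng.normal(33.3, 0.1, n),      # latitude (hypothetical city)
    rng.normal(44.4, 0.1, n),      # longitude
])
# Synthetic binary label: crime type A is more likely late at night.
y = (rng.random(n) < np.where(X[:, 0] >= 20, 0.7, 0.3)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```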
This research studies panel (paired) data models with mixed parameters, which contain two types of parameters: one random and the other fixed. The random parameter arises from differences in the marginal tendencies of the cross-sections, while the fixed parameter arises from differences in the fixed terms, with random errors for each cross-section. These random errors exhibit heteroskedasticity in addition to first-order serial correlation. The main objective of this research is to use efficient estimation methods suited to paired data in the case of small samples; to achieve this goal, the feasible generalized least squares (FGLS) method was applied.
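A hedged sketch of a textbook two-step FGLS for this error structure (unit-level heteroskedasticity plus AR(1) serial correlation) is given below; it is not the paper's exact estimator, and all data and parameter values are synthetic assumptions.

```python
# Hedged sketch of two-step feasible GLS (FGLS) for panel data whose
# errors show unit-level heteroskedasticity and AR(1) serial correlation.
import numpy as np

rng = np.random.default_rng(1)
n_units, T, k = 8, 12, 2                    # small-sample setting
beta_true = np.array([1.0, 2.0, -0.5])      # intercept + 2 slopes

X = np.column_stack([np.ones(n_units * T),
                     rng.normal(size=(n_units * T, k))])
e = np.empty(n_units * T)
for i in range(n_units):
    sigma_i = 0.5 + i * 0.3                 # unit-specific error scale
    u = rng.normal(0, sigma_i, T)
    for t in range(T):
        prev = e[i*T + t - 1] if t > 0 else 0.0
        e[i*T + t] = 0.6 * prev + u[t]      # AR(1) errors
y = X @ beta_true + e

# Step 1: pooled OLS residuals.
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
r = y - X @ b_ols

# Step 2: estimate rho from within-unit lagged residuals.
num = den = 0.0
for i in range(n_units):
    ri = r[i*T:(i+1)*T]
    num += ri[1:] @ ri[:-1]
    den += ri[:-1] @ ri[:-1]
rho = num / den

# Step 3: quasi-difference within each unit (Cochrane-Orcutt), then
# weight each unit by its transformed residual spread (WLS).
rows_y, rows_X, w = [], [], []
for i in range(n_units):
    yi, Xi = y[i*T:(i+1)*T], X[i*T:(i+1)*T]
    yi_s = yi[1:] - rho * yi[:-1]
    Xi_s = Xi[1:] - rho * Xi[:-1]
    ri_s = yi_s - Xi_s @ b_ols
    w += [1.0 / ri_s.std(ddof=1)] * (T - 1)
    rows_y.append(yi_s); rows_X.append(Xi_s)
W = np.array(w)
y_s, X_s = np.concatenate(rows_y), np.vstack(rows_X)
b_fgls, *_ = np.linalg.lstsq(X_s * W[:, None], y_s * W, rcond=None)
print("OLS: ", b_ols.round(3))
print("FGLS:", b_fgls.round(3))
```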
This research aims to study dimension-reduction methods that overcome the curse of dimensionality, a problem that must be dealt with directly when traditional methods fail to provide good parameter estimates. Two methods were used to handle the high-dimensional data: the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problems of heterogeneity and multicollinearity.
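The sketch below implements classic SIR only; the paper's proposed weighted variant (WSIR) is known here only by name and is not reproduced. The slicing scheme, toy model, and parameter choices are assumptions for illustration.

```python
# Hedged sketch of standard sliced inverse regression (SIR).
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=1):
    """Estimate effective dimension-reduction directions via SIR."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    # Whiten X with the inverse square root of its covariance.
    vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = Xc @ inv_sqrt
    # Slice observations by sorted response; average Z within slices.
    order = np.argsort(y)
    M = np.zeros((p, p))
    for chunk in np.array_split(order, n_slices):
        m = Z[chunk].mean(axis=0)
        M += (len(chunk) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original scale.
    _, evecs = np.linalg.eigh(M)
    B = inv_sqrt @ evecs[:, -n_dirs:]
    return B / np.linalg.norm(B, axis=0)

# Toy check: y depends on X only through one direction beta.
rng = np.random.default_rng(2)
X = rng.normal(size=(500, 6))
beta = np.array([1., 1., 0., 0., 0., 0.]) / np.sqrt(2)
y = (X @ beta) ** 3 + 0.1 * rng.normal(size=500)
print(sir_directions(X, y).ravel().round(2))  # approx. +/- beta
```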
The proliferation of many editing programs based on artificial intelligence techniques has contributed to the emergence of deepfake technology. Deepfakes fabricate and falsify facts by making a person appear to do actions or say words that they never did or said, so developing an algorithm for deepfake detection is very important for discriminating real from fake media. Convolutional neural networks (CNNs) are among the most complex classifiers, but choosing the nature of the data fed to these networks is extremely important. For this reason, we capture fine texture details of the input data frames using 16 Gabor filters in different directions and then feed them to a binary CNN classifier instead of using the red-green-blue (RGB) channels directly.
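A hedged sketch of the input-preparation step this abstract describes is shown below: a bank of 16 Gabor filters at different orientations applied to a grayscale frame using OpenCV. The kernel size and sigma/lambda/gamma values are assumptions, not the paper's settings.

```python
# Hedged sketch: 16-orientation Gabor filter bank for texture features
# that would feed a binary CNN classifier. Parameters are assumed.
import cv2
import numpy as np

def gabor_features(frame_gray, n_orientations=16, ksize=31):
    responses = []
    for i in range(n_orientations):
        theta = i * np.pi / n_orientations   # filter orientation
        # getGaborKernel(ksize, sigma, theta, lambd, gamma, psi)
        kern = cv2.getGaborKernel((ksize, ksize), 4.0, theta, 10.0, 0.5, 0)
        responses.append(cv2.filter2D(frame_gray, cv2.CV_32F, kern))
    # Stack into an (H, W, 16) tensor as CNN input.
    return np.stack(responses, axis=-1)

# Stand-in frame (random noise) just to demonstrate the shapes.
frame = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
print(gabor_features(frame).shape)  # (128, 128, 16)
```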
This study investigates the impact of spatial-resolution enhancement on supervised classification accuracy using Landsat 9 satellite imagery, achieved through pan-sharpening techniques that leverage Sentinel-2 data. Various methods were employed to synthesize a panchromatic (PAN) band from Sentinel-2 data, including dimension-reduction algorithms and weighted averages based on correlation coefficients and standard deviation. Three pan-sharpening algorithms (Gram-Schmidt, Principal Components Analysis, Nearest Neighbour Diffusion) were employed, and their efficacy was assessed using seven fidelity criteria. Classification tasks were performed using Support Vector Machine and Maximum Likelihood algorithms. Results reveal that specific…
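As a hedged illustration of one of the three pan-sharpening algorithms named (PCA-based), the sketch below projects the multispectral bands onto principal components, swaps the first component for a variance-matched PAN band, and inverts; the synthetic arrays stand in for the study's Landsat 9 and synthesized PAN data.

```python
# Hedged sketch of PCA pan-sharpening: replace the first principal
# component of the multispectral image with a matched PAN band.
import numpy as np
from sklearn.decomposition import PCA

def pca_pansharpen(ms, pan):
    """ms: (H, W, B) multispectral resampled to PAN size; pan: (H, W)."""
    H, W, B = ms.shape
    pcs = PCA(n_components=B)
    scores = pcs.fit_transform(ms.reshape(-1, B))
    p = pan.ravel().astype(float)
    # Match PAN to the first component's mean/std before substitution.
    p = (p - p.mean()) / p.std() * scores[:, 0].std() + scores[:, 0].mean()
    scores[:, 0] = p
    return pcs.inverse_transform(scores).reshape(H, W, B)

# Synthetic stand-ins for the multispectral bands and synthesized PAN.
rng = np.random.default_rng(3)
ms = rng.random((64, 64, 4))
pan = ms.mean(axis=-1) + 0.05 * rng.random((64, 64))  # correlated PAN
print(pca_pansharpen(ms, pan).shape)  # (64, 64, 4)
```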
Atrial fibrillation is associated with an elevated risk of stroke. The simplest stroke-risk assessment schemes are the CHADS2 and CHA2DS2-VASc scores. Aspirin and oral anticoagulants are recommended for stroke prevention in such patients.
The aim of this study was to assess the status of the CHADS2 and CHA2DS2-VASc scores in Iraqi atrial fibrillation patients and to report the current status of stroke prevention in these patients with either warfarin or aspirin in relation to these scores.
This prospective cross-sectional study was carried out at Tikrit, Samarra, Sharqat, Baquba, and AL-Numaan hospitals from July 2017 to October 2017. CHADS2…
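Both risk scores the study applies are simple integer sums; the worked example below computes them from patient flags using the standard published criteria (the patient shown is hypothetical, not from this study's data).

```python
# Worked illustration of the CHADS2 and CHA2DS2-VASc stroke-risk scores.
def chads2(chf, htn, age, diabetes, stroke_tia):
    # CHF 1, hypertension 1, age >= 75 1, diabetes 1, prior stroke/TIA 2.
    return (int(chf) + int(htn) + int(age >= 75)
            + int(diabetes) + 2 * int(stroke_tia))

def cha2ds2_vasc(chf, htn, age, diabetes, stroke_tia, vascular, female):
    # Adds vascular disease, female sex, and a graded age component.
    score = int(chf) + int(htn) + int(diabetes) + int(vascular) + int(female)
    score += 2 if age >= 75 else (1 if age >= 65 else 0)
    score += 2 * int(stroke_tia)
    return score

# Hypothetical patient: 68-year-old woman with hypertension and diabetes.
print(chads2(False, True, 68, True, False))                     # 2
print(cha2ds2_vasc(False, True, 68, True, False, False, True))  # 4
```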
The research aims to: 1. prepare and implement special educational units using multimedia to teach the skill of the underhand pass; 2. identify the impact of using multimedia on learning the underhand pass; 3. identify the differences between the post-tests of the two research groups in learning the underhand pass in volleyball. The research population comprised first-stage students; a sample of (50) students was drawn randomly and divided into an experimental group and a control group of (25) students each. Standardized tests were used, pre-tests were conducted, and the main experiment was implemented…
The study aims to build a training program based on Connectivism Theory to develop e-learning competencies for Islamic education teachers in the Governorate of Dhofar, and to identify its effectiveness. The study sample consisted of (30) randomly selected Islamic education teachers who implemented the training program. The study used the descriptive approach to determine the electronic competencies and build the training program, and the quasi-experimental approach to determine the program's effectiveness. The study tools were a cognitive achievement test and an observation card, both applied before and after the program. The study found that the training program was effective…
Background: The risk of antibiotic resistance (AR) increases due to excessive use of antibiotics, whether by health care providers or by patients.
Objective: To assess the practice of self-medication with over-the-counter (OTC) drugs and prescription drugs, and its associated risk factors.
Subjects and Methods: Study design: A descriptive study was conducted from 20 December 2019 to 8 January 2021. A pre-validated, structured questionnaire in English and Urdu was created to avoid language barriers; it covered personal details, the reasons for and sources of self-medication, and knowledge about over-the-counter drugs and antibiotics. The study sample was randomly selected.
The current research aims to identify the types, rates, and causes of crimes originating in sexual deviance that are spread within the fabric of Iraqi society yet kept quiet because of the community's culture. The researcher interviewed investigative judges, tribal leaders, and lawyers about the proportions, types, and causes of crimes originating in sexual deviance. Finally, the researcher surveyed 1000 young men who lived their adolescence during the days of the former regime (1979-2003) about visiting brothels. The researcher found that 920 of them (92%) had gone to such houses to practice adultery, that 70 of them (7%) had engaged in masturbation only, and that 10 of them (1%) had not committed anything.