Crime, an unlawful activity of any kind punishable by law, affects a society's quality of life and economic development. With crime rising sharply worldwide, there is a need to analyze crime data to bring down the crime rate, encouraging the police and the public to take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid crime pattern analysis and thus support the Boston Police Department's crime prevention efforts. The geographical location factor has been adopted in our model because it is influential in several situations, whether traveling to a specific area or living in it, helping people distinguish between safe and unsafe environments. Geo-location, combined with new approaches and techniques, can be extremely useful in crime investigation. The aim is a comparative study of three supervised learning algorithms, in which data sets are used to train and test the models. Decision Tree, Naïve Bayes, and Logistic Regression classifiers have been applied to the Boston city crime dataset to predict the type of crime that happens in an area. The outputs of these methods are compared to find the model that best fits this type of data with the best performance. From the results obtained, the Decision Tree achieved the highest performance compared to Naïve Bayes and Logistic Regression.
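The three-classifier comparison described above can be sketched as follows. This is a minimal illustration using scikit-learn on synthetic data (a hypothetical stand-in for the Boston crime dataset, which is not reproduced here); the feature names, sizes, and scores carry no claim about the study's actual results.

```python
# Illustrative comparison of Decision Tree, Naive Bayes, and Logistic Regression
# on synthetic multi-class data standing in for crime records (location, time, etc.).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Synthetic features playing the role of crime attributes; 3 classes = crime types.
X, y = make_classification(n_samples=2000, n_features=10, n_informative=6,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores[name] = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {scores[name]:.3f}")
```

In practice the same loop would be run with additional metrics (precision, recall, F1) before declaring one model the best fit.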
This research reviews the aesthetic variables founded on (theatrical rehearsal) as one of the most important pillars on which the theatrical process is based, given its necessity in developing theatrical art on several levels and in helping the theatrical director organize his work. This became clear through the research chapters: the first chapter (methodological framework); the second chapter, consisting of the first topic (the duality of watching / rehearsal) and the second topic (the applications of theatrical rehearsal in theatrical experiences); and the third chapter (research procedures), which included the analysis of the theatrical rehearsal (sharing on life), and the …
This study depicts the removal of manganese ions (Mn2+) from simulated wastewater by combined electrocoagulation/electroflotation technologies. The effects of initial Mn concentration, current density (C.D.), electrolysis time, and different mesh numbers of stainless-steel screen electrodes were investigated in a batch cell by adopting a Taguchi experimental design to explore the optimum conditions for maximum Mn removal efficiency. The results of multiple regression and the signal-to-noise ratio (S/N) showed that the optimum conditions were an initial Mn concentration of 100 ppm, a C.D. of 4 mA/cm2, a time of 120 min, and a mesh no. of 30 (wire/inch). Also, the relative significance of each factor was attained by the analysis of variance (ANOVA).
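The Taguchi S/N ratio used to rank factor settings has a standard closed form. For a "larger-the-better" response such as removal efficiency, S/N = −10·log10((1/n)·Σ 1/yᵢ²). A minimal sketch, with hypothetical replicate efficiencies (the paper's actual trial data are not reproduced here):

```python
import math

def sn_larger_is_better(values):
    """Taguchi signal-to-noise ratio for a 'larger-the-better' response,
    e.g. Mn removal efficiency (%): S/N = -10 * log10(mean(1 / y^2))."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in values) / len(values))

# Hypothetical removal efficiencies (%) from replicate runs of one Taguchi trial:
print(round(sn_larger_is_better([92.0, 95.0, 90.0]), 2))
```

The trial (factor combination) with the highest S/N is taken as closest to optimum; ANOVA on the S/N values then apportions each factor's relative significance.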
Semi-parametric model analysis is one of the most interesting subjects in recent studies because it gives an efficient model estimation. The problem arises when the response variable takes one of two values, either 0 (no response) or 1 (response), which leads to the logistic regression model.
We compare two methods, the Bayesian method and the … method, and the results were compared using the MSE criterion.
A simulation was used to study the empirical behavior of the logistic model with different sample sizes and variances. The results indicate that the Bayesian method is better than the … method at small sample sizes.
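A simulation of this kind can be sketched as follows. This is an illustration only, not the authors' exact estimators: the "Bayesian" fit is approximated by a MAP estimate under a standard-normal prior on the coefficients (equivalent to L2-penalized logistic regression), the maximum-likelihood fit by a near-unpenalized one, and both are scored by the MSE of the estimated coefficients against a hypothetical true vector.

```python
# Simulation sketch: ML logistic fit vs. Bayesian-style MAP fit (Gaussian prior,
# i.e. L2 penalty), compared by MSE of coefficient estimates at a small sample size.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
true_beta = np.array([1.0, -2.0, 0.5])  # hypothetical true coefficients

def one_replication(n):
    X = rng.normal(size=(n, 3))
    p = 1.0 / (1.0 + np.exp(-X @ true_beta))
    y = rng.binomial(1, p)
    ml = LogisticRegression(C=1e6, max_iter=2000).fit(X, y)    # ~unpenalized ML
    map_ = LogisticRegression(C=1.0, max_iter=2000).fit(X, y)  # MAP under N(0, I) prior
    return (np.mean((ml.coef_[0] - true_beta) ** 2),
            np.mean((map_.coef_[0] - true_beta) ** 2))

reps = [one_replication(30) for _ in range(100)]
mse_ml = float(np.mean([r[0] for r in reps]))
mse_map = float(np.mean([r[1] for r in reps]))
print(f"MSE (ML): {mse_ml:.3f}  MSE (MAP/Bayesian): {mse_map:.3f}")
```

Repeating the loop over several sample sizes reproduces the kind of small-sample comparison the abstract describes.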
Abstract: The utility of DNA sequencing in the diagnosis and prognosis of diseases is vital for assessing the risk of genetic disorders, particularly for asymptomatic individuals with a genetic predisposition. Such diagnostic approaches are integral in guiding health and lifestyle decisions and in preparing families with the foreknowledge needed to anticipate potential genetic abnormalities. The present study explores implementing a define-by-run deep learning (DL) model optimized using the Tree-structured Parzen estimator algorithm to enhance the precision of genetic diagnostic tools. Unlike conventional models, the define-by-run model bolsters accuracy through dynamic adaptation to data during the learning process and iterative optimization …
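The "define-by-run" idea is that the hyperparameter search space is constructed while each trial executes, rather than declared up front. A toy sketch of the concept follows; for brevity it uses random sampling where a real optimizer (e.g. Optuna's TPE sampler, as named above) would use the Tree-structured Parzen estimator, and the objective is a hypothetical score standing in for validation accuracy.

```python
# Minimal define-by-run sketch: the number of layers chosen first determines
# how many layer-size parameters exist, so the space is built during the run.
import random

class Trial:
    def __init__(self, rng):
        self.rng = rng
        self.params = {}

    def suggest_int(self, name, low, high):
        # A real TPE sampler would propose values from density models here.
        self.params[name] = self.rng.randint(low, high)
        return self.params[name]

def objective(trial):
    n_layers = trial.suggest_int("n_layers", 1, 3)
    units = [trial.suggest_int(f"units_l{i}", 16, 128) for i in range(n_layers)]
    # Hypothetical score standing in for the accuracy of a trained network.
    return sum(units) / (n_layers * 128.0)

rng = random.Random(0)
trials = []
for _ in range(20):
    t = Trial(rng)
    trials.append((objective(t), t.params))
best_score, best_params = max(trials, key=lambda x: x[0])
print(best_score, best_params)
```

Because each trial's parameter set can differ in shape, define-by-run search handles conditional spaces (e.g. per-layer sizes) that a fixed grid cannot express.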
This paper deals with the prediction of random spatial data with two properties, the first called the primary variable and the second the secondary variable. The method used in the prediction process for this type of data is the co-kriging technique. The method is usually used when the primary variable to be predicted has only a few elements measured at particular locations (because of the cost or difficulty of obtaining them), compared with the secondary variable, whose elements are available in large numbers and are highly correlated with the primary variable, as was the …
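The building block of co-kriging is the ordinary kriging system for a single variable: weights are solved from a covariance matrix subject to an unbiasedness constraint (weights summing to one). A compact sketch follows, with a hypothetical exponential covariance model and 1-D sample locations; full co-kriging extends the same linear system with cross-covariances to the secondary variable and is omitted here for brevity.

```python
# Ordinary kriging weights for one (primary) variable, solved from the system
# [C 1; 1^T 0] [w; mu] = [c0; 1], where C is the sample covariance matrix.
import numpy as np

def exp_cov(h, sill=1.0, corr_range=10.0):
    """Exponential covariance model C(h) = sill * exp(-h / range) (hypothetical)."""
    return sill * np.exp(-h / corr_range)

# Known 1-D sample locations and primary-variable values (illustrative numbers).
xs = np.array([0.0, 3.0, 7.0])
zs = np.array([1.2, 0.8, 1.5])
x0 = 5.0  # prediction location

n = len(xs)
A = np.ones((n + 1, n + 1))
A[:n, :n] = exp_cov(np.abs(xs[:, None] - xs[None, :]))
A[n, n] = 0.0                      # Lagrange-multiplier corner
b = np.append(exp_cov(np.abs(xs - x0)), 1.0)
sol = np.linalg.solve(A, b)
w = sol[:n]                        # kriging weights (sum to 1 by construction)
z_hat = float(w @ zs)              # predicted primary-variable value at x0
print(w, z_hat)
```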
This work aims to detect the associations of C-peptide and the homeostasis model assessment of beta-cell function (HOMA2-B%) with inflammatory biomarkers in pregnant women in comparison with non-pregnant women. Sera of 28 normal pregnant women in late pregnancy versus 27 age-matched non-pregnant women (control) were used to estimate C-peptide, triiodothyronine (T3), and thyroxine (T4) by enzyme-linked immunosorbent assay (ELISA); fasting blood sugar (FBS) by the automatic analyzer Biolis 24i; and hematology tests by a hematology analyzer, with the calculation of HOMA2-B% and the homeostasis model assessment of insulin sensitivity (HOMA2-S%) using C-peptide values instead of insulin. The comparison, correlation, and regression analysis tests were performed …
Permeability data is of major importance and must be handled in all reservoir simulation studies. Its importance increases in mature oil and gas fields due to its sensitivity to the requirements of some specific improved-recovery methods. However, the industry has a huge store of air permeability measurements against a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation introduces a feasible estimation in cases of data loss and poorly consolidated formations, or in cases …
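The classical reference point for this kind of air-to-liquid conversion is the Klinkenberg gas-slippage correction, k_air = k_L·(1 + b/p̄), where b is the slip factor and p̄ the mean measurement pressure. The sketch below shows that correction for illustration only; the study proposes its own correlation, and all numbers here are hypothetical.

```python
def klinkenberg_liquid_perm(k_air_md, b_psi, p_mean_psi):
    """Classical Klinkenberg correction (illustrative, not the study's correlation):
    k_air = k_L * (1 + b / p_mean), so k_L = k_air / (1 + b / p_mean)."""
    return k_air_md / (1.0 + b_psi / p_mean_psi)

# Hypothetical core measurement: 50 mD air permeability at 20 psi mean pressure,
# slip factor b = 5 psi -> equivalent-liquid permeability 50 / 1.25 = 40 mD.
print(round(klinkenberg_liquid_perm(50.0, 5.0, 20.0), 2))
```

In practice b is itself estimated by extrapolating several k_air measurements against 1/p̄ to infinite pressure.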
Visual analytics has become an important approach for discovering patterns in big data. As visualization struggles with the high dimensionality of data, issues like a concept hierarchy on each dimension add more difficulty and make visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visu…
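The cube operations named above (roll-up, slice) reduce to group-by aggregations over a flat fact table. A tiny sketch with pandas follows; the table, column names, and figures are hypothetical.

```python
# Data cube operations on a flat fact table: base cuboid, roll-up, and slice.
import pandas as pd

sales = pd.DataFrame({
    "year":    [2021, 2021, 2021, 2022, 2022, 2022],
    "quarter": ["Q1", "Q1", "Q2", "Q1", "Q2", "Q2"],
    "region":  ["East", "West", "East", "East", "West", "East"],
    "amount":  [100, 150, 120, 130, 160, 110],
})

# Base cuboid: finest aggregation over all three dimensions.
base = sales.groupby(["year", "quarter", "region"])["amount"].sum()
# Roll-up: climb the concept hierarchy quarter -> year (drop the quarter level).
rollup = sales.groupby(["year", "region"])["amount"].sum()
# Slice: fix one dimension (year == 2021), aggregate over the rest.
slice_2021 = (sales[sales["year"] == 2021]
              .groupby(["quarter", "region"])["amount"].sum())
print(rollup)
```

Drill-down is the inverse of roll-up (re-introducing the dropped level), and dicing is a slice on two or more dimensions at once.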
In recent years, data centre (DC) networks have improved their rapid data-exchange abilities. Software-defined networking (SDN) was introduced to change the conception of conventional networks by segregating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly increasing numbers of apps, websites, data storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior behaviour for executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands between the end devices to avoid link congestion. In short, SDN is proposed to manage more operative configur…
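The LB function described above can be sketched as a greedy least-loaded assignment, the kind of policy an SDN controller might apply when installing flow rules. This is a conceptual illustration only; link names and demand values are hypothetical, and real controllers weigh link capacity, latency, and path constraints as well.

```python
# Greedy least-loaded link assignment: each new flow demand goes to the link
# currently carrying the smallest total load.
def assign_flows(links, demands):
    """Returns (total load per link, chosen link per demand)."""
    load = {link: 0.0 for link in links}
    placement = []
    for d in demands:
        target = min(load, key=load.get)  # current least-loaded link
        load[target] += d
        placement.append(target)
    return load, placement

load, placement = assign_flows(["link-A", "link-B", "link-C"], [5, 3, 4, 2, 6])
print(load)
```

Even this simple policy avoids the congestion of routing all flows over one link; an OpenFlow controller would realise each `placement` decision as a forwarding rule pushed to the switches.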