With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Leveraging sophisticated AI algorithms, the study focuses on scrutinizing subtle periodic patterns and uncovering relationships among the collected datasets. Through this comprehensive analysis, the research endeavors to pinpoint crime hotspots, detect fluctuations in frequency, and identify underlying causes of criminal activities. Furthermore, the research evaluates the efficacy of the AI model in generating productive insights and providing the most accurate predictions of future criminal trends. These predictive insights are poised to revolutionize the strategies of law enforcement agencies, enabling them to adopt proactive and targeted approaches. Emphasizing ethical considerations, this research ensures the continued feasibility of AI use while safeguarding individuals' constitutional rights, including privacy. The outcomes of this research are anticipated to furnish actionable intelligence for law enforcement, policymakers, and urban planners, aiding in the identification of effective crime prevention strategies. By harnessing the potential of AI, this research contributes to the promotion of proactive strategies and data-driven models in crime analysis and prediction, offering a promising avenue for enhancing public security in Los Angeles and other metropolitan areas.
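The abstract does not name a specific hotspot-detection algorithm, so the following is only a minimal sketch of one common approach, kernel density estimation over geocoded incident coordinates; the column layout, coordinates, and bandwidth are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: locating crime hotspots with kernel density estimation.
# The incident coordinates and bandwidth are illustrative assumptions.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)
# Stand-in for geocoded incident records (latitude, longitude).
incidents = rng.normal(loc=[34.05, -118.25], scale=0.05, size=(1000, 2))

kde = KernelDensity(kernel="gaussian", bandwidth=0.01).fit(incidents)

# Score a grid of candidate locations; high densities mark hotspots.
lats = np.linspace(33.9, 34.2, 60)
lons = np.linspace(-118.5, -118.0, 60)
grid = np.array([[la, lo] for la in lats for lo in lons])
density = np.exp(kde.score_samples(grid))  # score_samples returns log-density

top = grid[np.argsort(density)[-5:]]
print("Highest-density grid cells (lat, lon):\n", top)
```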
Project suspensions are among the most persistent problems confronting the construction sector, owing to the industry's complexity and the interdependence of its underlying delay risk factors. Machine learning provides a well-suited set of techniques for attacking such complex systems. The study aimed to identify and develop a well-organized predictive data tool that examines and learns from delay sources based on historical data of construction projects, using decision tree and naïve Bayes classification algorithms. An intensive review of the available data was conducted to explore the real reasons for and causes of construction project delays. The results show that the postpo
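A minimal sketch of the two classifier families the abstract names, applied to hypothetical project features; the feature set, labels, and data are illustrative stand-ins, not the paper's dataset.

```python
# Decision tree and naive Bayes classifiers for delay prediction.
# Features and labels below are synthetic stand-ins for project records.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Stand-in features: budget (M$), duration (months), change orders, crew size.
X = rng.uniform([1, 6, 0, 10], [50, 48, 30, 200], size=(500, 4))
# Stand-in label: 1 = project delayed, 0 = on time.
y = (0.02 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 0.4, 500) > 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (DecisionTreeClassifier(max_depth=4), GaussianNB()):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, "accuracy:", model.score(X_te, y_te))
```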
Information systems and data exchange between government institutions are growing rapidly around the world, and with them, so are the threats to information within government departments. In recent years, research into the development and construction of secure information systems in government institutions has become increasingly active. Based on information system principles, this study proposes a model for providing and evaluating security across all departments of government institutions. The requirements of any information system begin with the organization's environment and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite the relevance of
The assessment of data quality from different sources can be considered a key challenge in supporting effective geospatial data integration and promoting collaboration in mapping projects. This paper presents a methodology for assessing positional and shape quality for authoritative large-scale data, such as Ordnance Survey (OS) UK data and General Directorate for Survey (GDS) Iraq data, and Volunteered Geographic Information (VGI), such as OpenStreetMap (OSM) data, with the intention of assessing possible integration. It is based on the measurement of discrepancies among the datasets, addressing positional accuracy and shape fidelity, using standard procedures and also directional statistics. Line feature comparison has been und
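A minimal sketch of a directional-statistics comparison between matched line segments from two datasets (e.g. OS vs OSM). The segment pairs below are hypothetical, and the paper's actual feature-matching procedure is not reproduced here.

```python
# Circular-statistics comparison of bearings between matched line segments.
import numpy as np

def segment_bearing(p0, p1):
    """Bearing of a segment in radians, measured from east, in [0, pi)."""
    dx, dy = np.subtract(p1, p0)
    return np.arctan2(dy, dx) % np.pi  # undirected lines: fold to half-circle

# Hypothetical matched segment pairs: (authoritative, volunteered).
pairs = [(((0, 0), (10, 1)), ((0.2, 0.1), (10.1, 1.4))),
         (((5, 5), (5, 15)), ((5.1, 5.0), (4.8, 14.9)))]

# Angular discrepancies are doubled before averaging so that undirected
# axes are handled correctly (a standard trick in circular statistics).
diffs = [segment_bearing(*a) - segment_bearing(*b) for a, b in pairs]
mean_angle = 0.5 * np.angle(np.mean(np.exp(2j * np.array(diffs))))
print("Mean directional discrepancy (degrees):", np.degrees(mean_angle))
```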
Document source identification in printer forensics involves determining the origin of a printed document based on characteristics such as the printer model, serial number, defects, or unique printing artifacts. This process is crucial in forensic investigations, particularly in cases involving counterfeit documents or unauthorized printing. However, consistent pattern identification across various printer types remains challenging, especially when efforts are made to alter printer-generated artifacts. Machine learning models are often used in these tasks, but selecting discriminative features while minimizing noise is essential. Traditional KNN classifiers require a careful selection of distance metrics to capture relevant printing
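A minimal sketch of the distance-metric sensitivity the abstract mentions: the same KNN classifier evaluated under several metrics. The features here are random stand-ins for printer-artifact descriptors (e.g. texture or noise statistics extracted from scans), not the paper's feature set.

```python
# KNN accuracy under different distance metrics, on synthetic features.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 12))            # stand-in artifact descriptors
y = rng.integers(0, 4, size=300)          # stand-in printer-model labels

for metric in ("euclidean", "manhattan", "cosine"):
    knn = KNeighborsClassifier(n_neighbors=5, metric=metric)
    score = cross_val_score(knn, X, y, cv=5).mean()
    print(f"{metric:>10}: mean CV accuracy = {score:.3f}")
```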
This study assesses the short-term and long-term interactions between firm performance, financial education, and political instability in the case of Malaysian Small and Medium Enterprises (SMEs). Financial education and political instability are inserted into the study simultaneously and intentionally, to inspect the effect of these two elements within one equation for the Malaysian economy. Using the bounds testing methodology for cointegration and error correction models, developed within an autoregressive distributed lag (ARDL) framework, we examine whether a long-run equilibrium relationship exists between firm performance and the above-mentioned independent variables. Using this method, we uncover evidence of a positive long-term link b
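A minimal sketch of the bounds-testing idea under stated assumptions: regress the first difference of the outcome on lagged levels plus lagged differences, then F-test the joint significance of the lagged levels (the Pesaran et al. bounds procedure). The series names are hypothetical stand-ins for firm performance, financial education, and political instability, and the data are synthetic.

```python
# Bounds-test-style F-test on lagged levels in an error-correction regression.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200
d = pd.DataFrame({
    "perf": np.cumsum(rng.normal(size=n)),      # stand-in firm performance
    "fin_edu": np.cumsum(rng.normal(size=n)),   # stand-in financial education
    "pol_inst": np.cumsum(rng.normal(size=n)),  # stand-in political instability
})

df = pd.DataFrame({
    "d_perf": d["perf"].diff(),
    "perf_l1": d["perf"].shift(1),           # lagged levels
    "fin_edu_l1": d["fin_edu"].shift(1),
    "pol_inst_l1": d["pol_inst"].shift(1),
    "d_perf_l1": d["perf"].diff().shift(1),  # short-run dynamics
}).dropna()

X = sm.add_constant(df.drop(columns="d_perf"))
res = sm.OLS(df["d_perf"], X).fit()

# Joint F-test on the lagged levels; in practice the statistic is compared
# against Pesaran bounds critical values (not computed here) to decide on
# cointegration.
print(res.f_test("perf_l1 = 0, fin_edu_l1 = 0, pol_inst_l1 = 0"))
```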
Bearing capacity of soil is an important factor in designing shallow foundations; it is directly related to foundation dimensions and consequently to foundation performance. Calculating the bearing capacity of a soil requires many varying parameters, for example soil type, depth of foundation, and unit weight of soil, which makes the calculation highly parameter-dependent. This paper presents the results of a comparison between the theoretical equation stated by Terzaghi and the Artificial Neural Network (ANN) technique for estimating the ultimate bearing capacity of a strip shallow footing on sandy soils. The results show very good agreement between the theoretical solution and the ANN technique. Results revealed that us
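A minimal sketch of the theoretical side of that comparison: the classical bearing-capacity equation for a strip footing, q_ult = c·Nc + γ·Df·Nq + 0.5·γ·B·Nγ. Closed-form factors (Reissner/Prandtl/Vesić) are used here as a stand-in for Terzaghi's original tabulated factors, and all input values are illustrative.

```python
# Ultimate bearing capacity of a strip footing from the classical equation.
import math

def bearing_factors(phi_deg: float):
    """Closed-form Nc, Nq, Ngamma (Reissner/Prandtl/Vesic approximations)."""
    phi = math.radians(phi_deg)
    Nq = math.exp(math.pi * math.tan(phi)) * math.tan(math.pi / 4 + phi / 2) ** 2
    Nc = (Nq - 1) / math.tan(phi)
    Ngamma = 2 * (Nq + 1) * math.tan(phi)
    return Nc, Nq, Ngamma

def q_ultimate(c, gamma, Df, B, phi_deg):
    """Ultimate bearing capacity (kPa) of a strip footing."""
    Nc, Nq, Ngamma = bearing_factors(phi_deg)
    return c * Nc + gamma * Df * Nq + 0.5 * gamma * B * Ngamma

# Sandy soil (c = 0): unit weight 18 kN/m^3, footing 1.5 m wide at 1 m depth.
print(f"q_ult = {q_ultimate(c=0, gamma=18, Df=1.0, B=1.5, phi_deg=32):.1f} kPa")
```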