With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study conducts a rigorous and precise analysis and forecast of crime rates in Los Angeles using advanced Artificial Intelligence (AI) technologies. The research combines diverse datasets encompassing crime history, socio-economic indicators, and geographical locations to build a comprehensive understanding of how crime manifests within the city. Leveraging sophisticated AI algorithms, the study scrutinizes subtle periodic patterns and uncovers relationships among the collected datasets. Through this analysis, the research pinpoints crime hotspots, detects fluctuations in frequency, and identifies underlying causes of criminal activity. The research also evaluates the efficacy of the AI model in generating productive insights and accurate predictions of future criminal trends. These predictive insights can reshape the strategies of law enforcement agencies, enabling proactive and targeted approaches. Emphasizing ethical considerations, the research ensures the continued feasibility of AI use while safeguarding individuals' constitutional rights, including privacy. The outcomes are expected to furnish actionable intelligence for law enforcement, policymakers, and urban planners, aiding the identification of effective crime prevention strategies. By harnessing the potential of AI, this research promotes proactive, data-driven models for crime analysis and prediction, offering a promising avenue for enhancing public security in Los Angeles and other metropolitan areas.
This study assesses the short-term and long-term interactions between firm performance, financial education, and political instability for Malaysian Small and Medium Enterprises (SMEs). Financial education and political instability are included together intentionally, to inspect the effect of both elements within a single equation for the Malaysian economy. Using the bounds testing methodology for cointegration and error correction models, developed within an autoregressive distributed lag (ARDL) framework, we examine whether a long-run equilibrium relationship exists between firm performance and the above-mentioned independent variables. Using this method, we uncover evidence of a positive long-term link b…
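As an illustration of the bounds-testing workflow described above, here is a minimal sketch using the ARDL/UECM tooling in statsmodels (0.13+). The series, lag limits, and deterministic case are all assumptions for demonstration, not details taken from the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ardl_select_order, UECM

# Synthetic annual series standing in for the study's (unpublished) data.
rng = np.random.default_rng(0)
n = 60
fin_edu = np.cumsum(rng.normal(0.2, 1.0, n))       # trending regressor
pol_inst = rng.normal(0.0, 1.0, n)                 # stationary regressor
perf = 0.5 * fin_edu - 0.3 * pol_inst + rng.normal(0, 0.5, n)
df = pd.DataFrame({"performance": perf,
                   "fin_education": fin_edu,
                   "pol_instability": pol_inst})

# Choose lag orders by AIC, then fit the unrestricted error-correction model.
sel = ardl_select_order(df["performance"], maxlag=3,
                        exog=df[["fin_education", "pol_instability"]],
                        maxorder=3, trend="c", ic="aic")
res = UECM.from_ardl(sel.model).fit()

# Pesaran-Shin-Smith bounds test: a level (long-run) relationship is inferred
# when the F-statistic exceeds the upper-bound critical value.
print(res.bounds_test(case=3))
```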
This research estimates stock returns using the Rough Set Theory approach, tests its effectiveness and accuracy in predicting stock returns and its potential in the field of financial markets, and rationalizes investor decisions. The research sample totals 10 companies traded on the Iraq Stock Exchange. The results showed a remarkable application of Rough Set Theory to data reduction, contributing to the rationalization of investment decisions. The most prominent conclusion is the capability of rough set theory to handle financial data and to forecast stock returns. The research provides those interested in investing stocks in financial …
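The core idea behind rough-set data reduction can be shown with a toy example: objects that are indiscernible on their condition attributes either certainly or only possibly belong to a decision class. The stocks, attributes, and values below are invented for illustration and are not the study's data.

```python
from collections import defaultdict

# (liquidity, leverage) -> condition attributes; True -> "high return" decision
data = {
    "s1": (("low", "high"), True),
    "s2": (("low", "high"), False),  # indiscernible from s1, conflicting decision
    "s3": (("high", "low"), True),
    "s4": (("high", "high"), False),
}

# Group objects into indiscernibility classes over the condition attributes.
classes = defaultdict(set)
for obj, (cond, _) in data.items():
    classes[cond].add(obj)

target = {obj for obj, (_, d) in data.items() if d}   # high-return stocks
lower = set().union(*(c for c in classes.values() if c <= target))
upper = set().union(*(c for c in classes.values() if c & target))

print("lower approximation:", lower)   # {'s3'}: certainly high-return
print("upper approximation:", upper)   # {'s1','s2','s3'}: possibly high-return
```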
Document source identification in printer forensics involves determining the origin of a printed document from characteristics such as the printer model, serial number, defects, or unique printing artifacts. This process is crucial in forensic investigations, particularly in cases involving counterfeit documents or unauthorized printing. However, consistent pattern identification across various printer types remains challenging, especially when efforts are made to alter printer-generated artifacts. Machine learning models are often used in these tasks, but selecting discriminative features while minimizing noise is essential. Traditional KNN classifiers require a careful selection of distance metrics to capture relevant printing …
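The metric-selection point can be illustrated with a standard cross-validated grid search over KNN distance metrics. The feature matrix here is a random placeholder standing in for whatever print-artifact descriptors a real forensic pipeline would extract from scanned pages.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))      # placeholder print-artifact features
y = rng.integers(0, 5, size=200)    # placeholder printer labels

# Compare candidate distance metrics and neighbourhood sizes by 5-fold CV.
pipe = make_pipeline(StandardScaler(), KNeighborsClassifier())
grid = GridSearchCV(
    pipe,
    {"kneighborsclassifier__metric": ["euclidean", "manhattan", "cosine"],
     "kneighborsclassifier__n_neighbors": [3, 5, 7]},
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```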
Abstract:
This research, "The Effect of Information in Iraqi Universities", examines the aims and opinions of lecturers at Iraqi universities in the era of globalization and information technology. The results indicate that lecturers at Iraqi universities use computers mainly for illustration and statistics, depending on the nature of their work.
Bearing capacity of soil is an important factor in designing shallow foundations; it is directly related to foundation dimensions and, consequently, foundation performance. Calculating the bearing capacity of a soil requires many parameters, for example soil type, depth of foundation, and unit weight of soil, which makes the calculation highly parameter-dependent. This paper compares the theoretical equation stated by Terzaghi with an Artificial Neural Network (ANN) technique for estimating the ultimate bearing capacity of a strip shallow footing on sandy soils. The results show very good agreement between the theoretical solution and the ANN technique. Results revealed that us…
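For reference, Terzaghi's strip-footing equation against which the ANN is benchmarked is q_ult = c·Nc + γ·Df·Nq + 0.5·γ·B·Nγ. The sketch below evaluates it for cohesionless sand; since Terzaghi gave Nγ only graphically, the expression used for it here is one common published approximation, and the example inputs are illustrative.

```python
import math

def terzaghi_strip(gamma, Df, B, phi_deg, c=0.0):
    """Ultimate bearing capacity (kPa) of a strip footing, Terzaghi's equation."""
    phi = math.radians(phi_deg)
    # Terzaghi's Nq and Nc; Ngamma via a common closed-form approximation.
    Nq = math.exp(2 * (0.75 * math.pi - phi / 2) * math.tan(phi)) / (
        2 * math.cos(math.radians(45) + phi / 2) ** 2)
    Nc = (Nq - 1) / math.tan(phi) if phi > 0 else 5.7
    Ngamma = 2 * (Nq + 1) * math.tan(phi)   # approximation; Terzaghi's chart value differs slightly
    return c * Nc + gamma * Df * Nq + 0.5 * gamma * B * Ngamma

# Example: sand with unit weight 18 kN/m^3, footing 1.5 m wide at 1 m depth.
print(round(terzaghi_strip(gamma=18, Df=1.0, B=1.5, phi_deg=35), 1), "kPa")
```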
Global consumption of fossil fuels continues to increase, as they remain the main source of energy worldwide, and heavy oil resources exceed light ones. Different techniques are used to reduce the viscosity and increase the mobility of heavy crude oil. This study focuses on experimental tests, and on modeling with a Back Feed Forward Artificial Neural Network (BFF-ANN), of the dilution technique for reducing the viscosity of heavy oil collected from south Iraq oil fields using organic solvents, with organic diluents at different weight percentages (5, 10, and 20 wt.%) of (n-heptane, toluene, and a mixture of different ratio…
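A feed-forward regression of viscosity on diluent dose could look like the sketch below. The paper's BFF-ANN architecture and dataset are not given, so the synthetic data, feature columns, and layer sizes are all assumptions for illustration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for lab measurements: viscosity falls roughly
# exponentially with diluent weight fraction (a common empirical pattern).
rng = np.random.default_rng(1)
wt_pct = rng.choice([5.0, 10.0, 20.0], size=300)
is_toluene = rng.integers(0, 2, size=300)      # 0 = n-heptane, 1 = toluene
visc = 5000.0 * np.exp(-0.12 * wt_pct - 0.25 * is_toluene) \
       * rng.lognormal(0.0, 0.05, 300)
X = np.column_stack([wt_pct, is_toluene])

# Fit a small feed-forward net on log-viscosity for numerical stability.
X_tr, X_te, y_tr, y_te = train_test_split(X, np.log(visc), random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8),
                                   max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)
print("held-out R^2 (log-viscosity):", round(model.score(X_te, y_te), 3))
```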
The assessment of data quality from different sources is a key challenge in supporting effective geospatial data integration and promoting collaboration in mapping projects. This paper presents a methodology for assessing positional and shape quality for authoritative large-scale data, such as Ordnance Survey (OS) UK data and General Directorate for Survey (GDS) Iraq data, and Volunteered Geographic Information (VGI), such as OpenStreetMap (OSM) data, with the intention of assessing possible integration. It is based on measuring discrepancies among the datasets, addressing positional accuracy and shape fidelity, using standard procedures as well as directional statistics. Line feature comparison has been und…
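One simple positional-discrepancy measure in this spirit is the mean perpendicular offset between matched line features, sketched below with shapely. The coordinates are invented, and real use would follow feature matching between, for example, OSM and OS layers.

```python
import numpy as np
from shapely.geometry import LineString

def mean_offset(ref: LineString, test: LineString, n_samples: int = 50) -> float:
    """Average distance from points sampled along `test` to the reference line."""
    pts = [test.interpolate(f, normalized=True)
           for f in np.linspace(0.0, 1.0, n_samples)]
    return float(np.mean([ref.distance(p) for p in pts]))

ref = LineString([(0, 0), (100, 0)])                  # authoritative centreline
osm = LineString([(0, 1.5), (50, 2.5), (100, 1.0)])   # volunteered counterpart
print(f"mean positional offset: {mean_offset(ref, osm):.2f} m")
```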
Abstract
The current research problem encompasses a variety of motivations to serve the private health sector, which faces strong competition from internal and external environments. In this regard, private medical clinics increasingly seek to attract and retain customers through the quality of their health service offerings, and through innovative and effective marketing methods that improve performance and sustain competitiveness, by relying on the physical evidence of the product as a component of the services marketing mix, and in particular its role in packaging and supporting the health service with tangible evidence that affects the customer an…