With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Leveraging sophisticated AI algorithms, the study focuses on scrutinizing subtle periodic patterns and uncovering relationships among the collected datasets. Through this comprehensive analysis, the research endeavors to pinpoint crime hotspots, detect fluctuations in frequency, and identify underlying causes of criminal activities. Furthermore, the research evaluates the efficacy of the AI model in generating productive insights and providing the most accurate predictions of future criminal trends. These predictive insights are poised to reshape the strategies of law enforcement agencies, enabling them to adopt proactive and targeted approaches. Emphasizing ethical considerations, this research ensures the continued feasibility of AI use while safeguarding individuals' constitutional rights, including privacy. The outcomes of this research are anticipated to furnish actionable intelligence for law enforcement, policymakers, and urban planners, aiding in the identification of effective crime prevention strategies. By harnessing the potential of AI, this research contributes to the promotion of proactive strategies and data-driven models in crime analysis and prediction, offering a promising avenue for enhancing public security in Los Angeles and other metropolitan areas.
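To make the temporal-pattern analysis described above concrete, the following is a minimal sketch of how monthly crime counts could be aggregated and forecast with lag features. The file name "la_crime.csv", the "date" column, and the choice of a random-forest regressor are illustrative assumptions, not the paper's actual dataset or model.

```python
# Hedged sketch: aggregate incident records into monthly counts and forecast
# them from lagged values, the kind of periodic-pattern analysis the abstract
# describes. Paths, column names, and the model choice are placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("la_crime.csv", parse_dates=["date"])        # hypothetical file
monthly = df.set_index("date").resample("MS").size().rename("crimes").to_frame()

# Simple lag features let the model pick up seasonal/periodic structure.
for lag in (1, 2, 3, 12):
    monthly[f"lag_{lag}"] = monthly["crimes"].shift(lag)
monthly = monthly.dropna()

X, y = monthly.drop(columns="crimes"), monthly["crimes"]
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:-12], y[:-12])
print("held-out 12-month predictions:", model.predict(X[-12:]).round())
```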
In the present investigation, bed porosity and solid holdup in a viscous three-phase inverse fluidized bed (TPIFB) are determined for aqueous carboxymethyl cellulose (CMC) solutions, using low-density polyethylene and polypropylene particles of 5 mm diameter in a vertical Perspex column of 9.2 cm inner diameter and 200 cm height. The effects of gas velocity Ug, liquid velocity UL, liquid viscosity μL, and particle density ρs on bed porosity (BP) and solid holdup εs were determined. Bed porosity increases with increasing gas velocity, liquid velocity, and liquid viscosity. Solid holdup decreases with increasing gas, liquid
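For reference, the quantities reported in this abstract are usually related through the standard phase balance of a gas-liquid-solid bed. The relation below is the generic convention from the TPIFB literature, not an equation quoted from the paper, and the symbols (εl liquid holdup, Ws solids mass, A column cross-section, H expanded bed height) are generic.

```latex
% Standard phase balance in a three-phase (gas-liquid-solid) fluidized bed:
\varepsilon_g + \varepsilon_l + \varepsilon_s = 1,
\qquad
\text{bed porosity (voidage)}\ \varepsilon = \varepsilon_g + \varepsilon_l = 1 - \varepsilon_s,
\qquad
\varepsilon_s = \frac{W_s}{\rho_s\, A\, H}
```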
Purpose: The purpose of the study is to compare and evaluate Earnings Management in Tunisia and Iraq. Theoretical framework: Earnings Management is an important topic that has been studied by a significant number of researchers, as well as by those interested in the accounting profession. Earnings Management has received a great deal of attention from academics, professionals, and other interested parties in recent years (e.g., Kliestik et al., 2020; Rahman et al., 2021; Gamra & Ellouze, 2021). Design/methodology/approach: The sample includes ten banks listed on the Bourse of Tunisia and the Iraq Stock Exchange for the year 2017. We used the model of Kothari et al. (2005) as a tool to measure Earnings Management in both mark
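For context, the model attributed to Kothari et al. (2005) is commonly written as the performance-adjusted accruals regression below, with discretionary accruals taken as the residuals. This is the commonly cited specification, not necessarily the exact variant estimated in the paper; TA is total accruals, A lagged total assets, ΔREV and ΔREC changes in revenues and receivables, PPE gross property, plant and equipment, and ROA return on assets.

```latex
% Performance-matched discretionary accruals model (common form attributed to
% Kothari et al., 2005); discretionary accruals are the residuals \varepsilon_{it}.
\frac{TA_{it}}{A_{it-1}} =
  \beta_0 + \beta_1 \frac{1}{A_{it-1}}
  + \beta_2 \frac{\Delta REV_{it} - \Delta REC_{it}}{A_{it-1}}
  + \beta_3 \frac{PPE_{it}}{A_{it-1}}
  + \beta_4\, ROA_{it} + \varepsilon_{it}
```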
Rumors are typically described as statements whose truth value is unknown. A rumor on social media has the potential to spread erroneous information to a large group of individuals, and such false information will influence decision-making across many communities. In online social media, where enormous amounts of information are freely distributed over a large network of sources with unverified authority, detecting rumors is critical. This research proposes that rumor detection be done using Natural Language Processing (NLP) tools as well as six distinct Machine Learning (ML) methods (Naïve Bayes (NB), Random Forest (RF), K-Nearest Neighbor (KNN), Logistic Regression (LR), Stochastic Gradient Descent (SGD) and Decision Tree (
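A minimal sketch of the kind of NLP-plus-classifier pipeline named in this abstract is shown below, using TF-IDF features and two of the listed classifiers. The CSV path and the "text"/"label" column names are hypothetical placeholders rather than the paper's dataset, and the feature settings are illustrative only.

```python
# Hedged sketch: rumor classification with TF-IDF features and two of the
# classifiers named in the abstract (Naive Bayes, Logistic Regression).
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("rumors.csv")                               # hypothetical dataset
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=42)

vec = TfidfVectorizer(lowercase=True, stop_words="english")   # basic NLP features
X_train_tfidf = vec.fit_transform(X_train)
X_test_tfidf = vec.transform(X_test)

for name, clf in [("Naive Bayes", MultinomialNB()),
                  ("Logistic Regression", LogisticRegression(max_iter=1000))]:
    clf.fit(X_train_tfidf, y_train)
    print(name, accuracy_score(y_test, clf.predict(X_test_tfidf)))
```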
Heavy oil is classified as an unconventional oil resource because of the difficulty of recovering it in its natural state and the difficulties of transporting and marketing it. Upgrading heavy oil has a positive technical and economic impact, especially when it becomes competitive with conventional oils from a marketing perspective. Development of the Qaiyarah heavy oil field was neglected over the last five decades, mainly because of the low quality of the crude oil, reflected in its high viscosity and density, which was and remains a major challenge to placing the field on the main production stream in Iraq. The low quality of the crude properties led to lower oil prices in the global markets
Progress in computer networks and the emergence of new technologies in this field help to produce new protocols and frameworks that provide new computer network-based services. E-government services, a modernized version of conventional government, are created through the steady evolution of technology in addition to the growing need of societies for numerous services. Government services are deeply related to citizens' daily lives; therefore, it is important to evolve with technological developments: it is necessary to move from the traditional methods of managing government work to cutting-edge technical approaches that improve the effectiveness of government systems in providing services to citizens. Blockchain technology is amon
Generally, the process of sending secret information via a transmission channel or any carrier medium is not secure. For this reason, information-hiding techniques are needed, and steganography must take place before transmission. In this paper, a developed particle swarm optimization algorithm (Dev.-PSO) is used to embed a secret message at optimal positions of the cover image in the spatial domain, based on Least Significant Bit (LSB) substitution. The main aim of the (Dev.-PSO) algorithm is to determine optimal paths to reach the required goals in the specified search space; using the (Dev.-PSO) algorithm produces the paths to the required goals with most effi
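To illustrate the LSB-substitution step that the Dev.-PSO positions would drive, here is a minimal sketch with purely sequential embedding positions; the paper's PSO-based position selection is not reproduced, and the function names and toy data are assumptions for illustration.

```python
# Hedged sketch: plain sequential LSB substitution in a grayscale cover image.
# The paper chooses embedding positions with its Dev.-PSO algorithm; here the
# positions are simply sequential to keep the example self-contained.
import numpy as np

def embed_lsb(cover: np.ndarray, message_bits: list[int]) -> np.ndarray:
    stego = cover.copy().ravel()
    if len(message_bits) > stego.size:
        raise ValueError("message longer than cover capacity")
    for i, bit in enumerate(message_bits):
        stego[i] = (stego[i] & 0xFE) | bit      # replace the least significant bit
    return stego.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bits: int) -> list[int]:
    return [int(p & 1) for p in stego.ravel()[:n_bits]]

cover = np.random.randint(0, 256, (8, 8), dtype=np.uint8)    # toy cover image
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb(cover, bits)
assert extract_lsb(stego, len(bits)) == bits
```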
In this paper, we investigate the automatic recognition of emotion in text. We perform experiments with a new method of classification based on the PPM character-based text compression scheme. These experiments involve both coarse-grained classification (whether a text is emotional or not) and fine-grained classification such as recognising Ekman's six basic emotions (Anger, Disgust, Fear, Happiness, Sadness, Surprise). Experimental results with three datasets show that the new method significantly outperforms traditional word-based text classification methods. The results show that the PPM compression-based classification method is able to distinguish between emotional and non-emotional text with high accuracy, between texts invo
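The idea behind compression-based classification can be sketched as below: a document is assigned to the class whose training corpus compresses it most cheaply. The paper uses a PPM compressor; zlib is used here only as a readily available stand-in, and the tiny training texts are illustrative placeholders.

```python
# Hedged sketch of compression-based text classification (zlib stands in for PPM).
import zlib

def compressed_size(text: str) -> int:
    return len(zlib.compress(text.encode("utf-8"), level=9))

def classify(doc: str, class_corpora: dict[str, str]) -> str:
    # The fewer extra bytes needed to compress (corpus + doc) relative to the
    # corpus alone, the better the document fits that class's statistics.
    costs = {label: compressed_size(corpus + " " + doc) - compressed_size(corpus)
             for label, corpus in class_corpora.items()}
    return min(costs, key=costs.get)

corpora = {"emotional": "I am so happy and thrilled, this is wonderful and I love it",
           "neutral":   "the report was filed on tuesday and the meeting is at noon"}
print(classify("what a wonderful thrilled feeling", corpora))
```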
In the present work, effort has been put into finding the most suitable color model for information hiding in color images. We test the most commonly used color models: RGB, YIQ, YUV, YCbCr1 and YCbCr2. The same procedures of embedding, detection, and evaluation were applied to find which color model is most appropriate for information hiding. What is new in this work is that we take into consideration the errors generated during transformations among color models. The results show that the YUV and YIQ color models are the best for information hiding in color images.
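For reference, the RGB-to-YUV and RGB-to-YIQ transforms compared in such studies are typically the BT.601-based matrices sketched below; these are the commonly quoted coefficients, not necessarily the exact values used in the paper, and the sample pixel is illustrative.

```python
# Hedged sketch: RGB -> YUV and RGB -> YIQ conversion with BT.601 luma weights,
# the kind of colour-model transform whose rounding error the study accounts for.
import numpy as np

RGB_TO_YUV = np.array([[ 0.299,    0.587,    0.114   ],
                       [-0.14713, -0.28886,  0.436   ],
                       [ 0.615,   -0.51499, -0.10001 ]])

RGB_TO_YIQ = np.array([[0.299,  0.587,  0.114],
                       [0.596, -0.274, -0.322],
                       [0.211, -0.523,  0.312]])

def convert(rgb: np.ndarray, matrix: np.ndarray) -> np.ndarray:
    """rgb: array of shape (..., 3) with channel values scaled to [0, 1]."""
    return rgb @ matrix.T

pixel = np.array([0.5, 0.25, 0.75])
print("YUV:", convert(pixel, RGB_TO_YUV))
print("YIQ:", convert(pixel, RGB_TO_YIQ))
```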
In this research, the performance of two kinds of membranes was examined for recovering nutrients (protein and lactose) from the whey produced by the soft cheese industry at the General Company for Food Products in Abo-Ghraab. The whey was treated in two stages. The first stage passes the whey through a microfiltration membrane made of polyvinylidene difluoride (PVDF), standard plate type, with an 800 kilodalton cut-off; this membrane separates the whey into a permeate carrying the main nutrients while removing fat and microorganisms. The second stage isolates the protein using an ultrafiltration membrane made of polyethersulfone (PES), plate type, with cut-offs of 10 and 60 kilodalton, and recovers lactose in the form of permeate.
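The recovery figures that follow are usually reported with the generic membrane-performance definitions below; these are the standard conventions rather than equations quoted from the paper, with C_p and C_f the permeate and feed concentrations of a given component.

```latex
% Typical definitions for reporting membrane performance:
R = 1 - \frac{C_p}{C_f},
\qquad
\text{Recovery (\%)} =
  \frac{\text{mass of component in permeate (or retentate)}}
       {\text{mass of component in feed}} \times 100
```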
The results showed that the percen