Information systems and data exchange between government institutions are growing rapidly around the world, and with them the threats to information held within government departments. In recent years, research into the development and construction of secure information systems in government institutions has proven effective. Based on information system principles, this study proposes a model for providing and evaluating security for all departments of government institutions. The requirements of any information system begin with the organization's environment and objectives. Despite its relevance to the application of access and control methods for security, most prior techniques did not take into account the organizational component on which the information system runs. Accordingly, we propose a model for improving security across all departments of government institutions by addressing security issues early in the system's life cycle, integrating them with functional elements throughout the life cycle, and focusing on the system's organizational aspects. The main security aspects covered are system administration, organizational factors, enterprise policy, and awareness and cultural aspects.
Optimization is essentially the art, science, and mathematics of choosing the best among a given set of finite or infinite alternatives. Though optimization is now an interdisciplinary subject cutting across the boundaries of mathematics, economics, engineering, the natural sciences, and many other fields of human endeavour, it has its roots in antiquity. In modern language, the problem is stated mathematically as follows: among all closed curves of a given length, find the one that encloses the maximum area. This is called the isoperimetric problem, and it is now routinely covered in any course on the calculus of variations. However, most problems of antiquity came from geometry, and since there were no general methods to solve such
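For reference, the classical statement of the isoperimetric inequality (a standard fact, added here only to make the problem statement precise) is: for a closed plane curve of length L enclosing area A,

```latex
% Isoperimetric inequality: among all closed curves of length L,
% the circle encloses the largest area A.
\[
  4\pi A \le L^{2},
  \qquad \text{with equality if and only if the curve is a circle.}
\]
```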
The aim of the research is to determine the impact of profit management practices on the quality of profits through the use of flexibility in choosing accounting methods and practices. Profit information is among the most important information for current users in general and prospective users in particular. Some corporate managements manipulate the results of the company's profit or loss (income statement) and its statement of financial position for multiple reasons, including capital market motivations to raise their share prices in the stock market and attract investors, as well as financing and loan-borrowing motives, using the flexibility in accounting policies and estimates to change the in
A field experiment was conducted to grow the wheat crop during the fall season of 2020 in Karbala province, north of the Ain Al-Tamr District, at two locations with different textures and parent materials. The first site (calcareous soil), with a sandy loam texture, is located at 44° 40′ 37″ east longitude and 32° 41′ 34″ north latitude, at an altitude of 32 m above sea level, with an area of 20 hectares. The second location (gypsum soil), with a loam texture, is located at 45° 41′ 39″ east longitude and 33° 43′ 34″ north latitude, at an altitude of 33 m above sea level, with an area of 20 hectares. The aim was to determine the effect of different tillage systems on water productivity and wheat yield under center-pivot irrigation
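For context, water productivity is conventionally defined as yield produced per unit of water used; the formulation below is the standard agronomic definition and is given as an assumption, since the truncated abstract does not state which variant the study adopted.

```latex
% Water productivity (WP, kg per cubic metre): grain yield divided by the
% volume of water applied (or consumed) over the growing season.
\[
  \mathrm{WP} = \frac{Y}{W}
\]
% Y: grain yield (kg), W: seasonal water applied or evapotranspired (m^3).
```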
In this paper we estimate the coefficients and scale parameter of a linear regression model whose residuals follow the type 1 extreme value distribution for largest values. This can be regarded as an improvement over studies based on the smallest values. We study two estimation methods (OLS and MLE), resorting to the Newton-Raphson and Fisher scoring methods to obtain the MLE because of the difficulty of applying the usual approach. The relative efficiency criterion is considered alongside statistical inference procedures for the type 1 extreme value regression model for largest values: confidence intervals and hypothesis tests for both the scale parameter and the regression coefficients
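A minimal sketch of the kind of fit described above, assuming Gumbel (type 1, largest values) errors with location X·beta and scale sigma; the simulated data, starting values, and the use of a quasi-Newton optimizer (standing in for Newton-Raphson / Fisher scoring) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical illustration: linear regression y = X @ beta + e where the
# errors e follow the type 1 extreme value (Gumbel) distribution for largest
# values. The data below are simulated, not the paper's data.
rng = np.random.default_rng(0)
n, p = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
true_beta, true_scale = np.array([1.0, 2.0, -0.5]), 1.5
y = X @ true_beta + rng.gumbel(loc=0.0, scale=true_scale, size=n)

def neg_log_lik(theta):
    """Negative Gumbel (largest values) log-likelihood; theta = (beta, log sigma)."""
    beta, sigma = theta[:-1], np.exp(theta[-1])   # log-parameterize sigma > 0
    z = (y - X @ beta) / sigma
    return -np.sum(-np.log(sigma) - z - np.exp(-z))

# OLS estimates serve as starting values; BFGS plays the role of the
# Newton-type iterations (Newton-Raphson / Fisher scoring) mentioned above.
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
start = np.append(beta_ols, 0.0)
fit = minimize(neg_log_lik, start, method="BFGS")
beta_mle, sigma_mle = fit.x[:-1], np.exp(fit.x[-1])
print("MLE coefficients:", beta_mle, "scale:", sigma_mle)
```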
This work develops a robust and practical algorithm for estimating vehicle travel times on a highway from traffic information extracted from roadside camera image sequences. The estimation of vehicle travel times relies on identification of the traffic state. Individual vehicle velocities are obtained from detected vehicle positions in two consecutive images by computing the distance covered during the elapsed time, fusing the extracted traffic flow data and developing a scheme to accurately predict vehicle travel times. An Erbil road database is used to identify road regions around road segments, which are projected into the camera
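The core computation described above can be sketched as follows; the coordinates, frame interval, and segment length are illustrative assumptions rather than values from the Erbil data.

```python
# Hypothetical sketch: estimate a vehicle's speed from its positions in two
# consecutive camera frames and use it to predict travel time over a segment.

def vehicle_speed(pos_prev, pos_curr, dt_seconds):
    """Speed (m/s) from two ground-plane positions (metres) dt_seconds apart."""
    dx = pos_curr[0] - pos_prev[0]
    dy = pos_curr[1] - pos_prev[1]
    distance = (dx ** 2 + dy ** 2) ** 0.5
    return distance / dt_seconds

def travel_time(segment_length_m, speed_mps):
    """Predicted time (s) to traverse a segment at the estimated speed."""
    return segment_length_m / speed_mps

# Example: positions 12.5 m apart observed 0.5 s apart -> 25 m/s (90 km/h),
# so a 1.2 km segment takes about 48 s.
v = vehicle_speed((0.0, 0.0), (12.5, 0.0), 0.5)
print(travel_time(1200.0, v))
```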
Crime is an unlawful activity of any kind that is punished by law. Crimes have an impact on a society's quality of life and economic development. With a large rise in crime globally, there is a need to analyze crime data to bring down the crime rate. This helps the police and the public take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid in crime pattern analysis and thus support the Boston department's crime prevention efforts. The geographical location factor has been adopted in our model because it is an influential factor in several situations, whether travelling to a specific area or living
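A minimal illustration of a predictive model that includes the geographical location factor; the synthetic data, feature set, and the choice of a random forest classifier are assumptions made for this sketch, not the study's actual dataset or chosen model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Placeholder data: latitude, longitude, and hour of day as predictors of a
# crime-category label. Values are randomly generated for illustration only.
rng = np.random.default_rng(42)
n = 1000
X = np.column_stack([
    rng.uniform(42.23, 42.40, n),    # latitude
    rng.uniform(-71.18, -71.00, n),  # longitude
    rng.integers(0, 24, n),          # hour of day
])
y = rng.integers(0, 3, n)            # placeholder crime-category labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```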
Within the framework of big data, energy issues are highly significant. Despite the significance of energy, theoretical studies focusing primarily on the issue of energy within big data analytics in relation to computational intelligence algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligence algorithms, since this is critical for exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligence algorithms in big data analytics. This work highlights that big data analytics using computational intelligence algorithms generates a very high amount
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep learning neural network method, in which a dynamic neural network is built that suits the nature of discrete survival data and time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on the Bayesian methodology, called the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms re
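A rough sketch of discrete-time survival modelling with a neural network, assuming the usual person-period expansion (one row per interval a subject survives, labelled 1 in the interval where the event occurs); the data, network size, and sklearn's MLP optimizer (used here instead of Levenberg-Marquardt) are illustrative assumptions and not the paper's PDANN implementation.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Simulated subjects: one covariate each and a discrete event time.
rng = np.random.default_rng(1)
n_subjects, n_intervals = 300, 8
x = rng.normal(size=n_subjects)
event_time = rng.integers(1, n_intervals + 1, n_subjects)

# Person-period expansion: the label is the hazard indicator for each interval.
rows, labels = [], []
for i in range(n_subjects):
    for t in range(1, event_time[i] + 1):
        rows.append([x[i], t])                          # covariate + interval index
        labels.append(1 if t == event_time[i] else 0)   # event occurs in this interval?
X, y = np.array(rows), np.array(labels)

# Small feed-forward network estimating the discrete hazard h(t | x).
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
net.fit(X, y)
print("predicted hazard at t=3 for x=0.5:", net.predict_proba([[0.5, 3]])[0, 1])
```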