General medical fields and computer science frequently combine to produce impressive results in both areas through the applications, programs, and algorithms provided by the field of data mining. The title of the present research contains the term hygiene, which may be described as the principle of maintaining cleanliness of the external body, while environmental hygienic hazards can present themselves in various media, e.g. air, water, soil, etc. The influence they can exert on our health is very complex and may be modulated by our genetic makeup, psychological factors, and our perceptions of the risks they present. Our main concern in this research is not to improve general health, but rather to propose a data mining approach that gives a clearer understanding and automated general steps the data analyst can use to obtain better results than typical statistical tests and database queries provide. This research proposes a new approach involving three techniques selected from data mining, namely association rule mining, the Apriori algorithm, and Naïve Bayesian classification applied in sequence, to offer improved decision support results that can serve researchers in their fields.
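The first stage of such a pipeline, mining frequent itemsets with Apriori, can be sketched as follows. This is a minimal illustration, not the paper's implementation; the transaction data and support threshold are invented for the example.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return every frequent itemset (as a frozenset) with its support."""
    n = len(transactions)
    items = {i for t in transactions for i in t}
    current = {frozenset([i]) for i in items}   # candidate 1-itemsets
    frequent = {}
    k = 1
    while current:
        # Count each candidate's occurrences across all transactions
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        survivors = {c: cnt / n for c, cnt in counts.items()
                     if cnt / n >= min_support}
        frequent.update(survivors)
        # Join surviving k-itemsets to build candidate (k+1)-itemsets
        keys = list(survivors)
        current = {a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1}
        k += 1
    return frequent

# Toy "hygiene hazard" transactions, purely illustrative
data = [{"air", "water"}, {"air", "soil"}, {"air", "water", "soil"}, {"water"}]
freq = apriori(data, 0.5)
```

The frequent itemsets found here could then feed feature selection for the Naïve Bayesian stage the abstract describes.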
The traditional centralized network management approach presents severe efficiency and scalability limitations in large-scale networks. The process of data collection and analysis typically involves huge transfers of management data to the manager, which consumes considerable network throughput and creates bottlenecks at the manager side. These problems are addressed using agent technology as a solution that distributes the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor the logging on and off of the client computers and which user is working on each of them. A file-system watcher mechanism is used to indicate any change in files. The results were presented
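The file-system watcher idea can be approximated portably by a snapshot-and-diff poll. The sketch below is an assumption for illustration only (the paper's agent-based mechanism is not shown in the abstract): it records each file's modification time and reports created, deleted, and modified files between two snapshots.

```python
import os

def snapshot(root):
    """Map every file under root to its last-modified time."""
    state = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                state[path] = os.stat(path).st_mtime
            except OSError:  # file vanished between walk() and stat()
                pass
    return state

def diff(old, new):
    """Compare two snapshots; return (created, deleted, modified) path lists."""
    created = [p for p in new if p not in old]
    deleted = [p for p in old if p not in new]
    modified = [p for p in new if p in old and new[p] != old[p]]
    return created, deleted, modified
```

A client agent could run `snapshot`/`diff` periodically and forward only the change lists to the server agent, which is exactly the traffic reduction the distributed approach aims for.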
The web service security challenge is to understand and assess the risk involved in securing a web-based service today, based on our existing security technology, and at the same time track emerging standards and understand how they will be used to offset the risk in new web services. Any security model must illustrate how data can flow through an application and network topology to meet the requirements defined by the business without exposing the data to undue risk. In this paper we propose a
This research deals with a very important subject, as it attempts to revise the theoretical and scientific heritage and some professional rules adopted in the newsroom. Most media students have difficulties in writing press news correctly. The researcher tries to identify the compatibility of what is published by local news agencies with professional and academic standards.
The research establishes detailed editorial rules for a number of news formats, which will play an important role in making news writing easier, especially for beginners and newcomers. It also uncovers a new fact refuting the belief of some researchers and writers that news edited according to the inverted-pyramid pattern has no conclusion.
In this paper, we present a proposed enhancement of the image compression process using the RLE algorithm. The proposal decreases the size of the compressed image, whereas the original method, used primarily for compressing binary images [1], mostly increases the size of the original image when used for color images. The enhanced algorithm is tested on a sample of ten 24-bit true-color BMP images; an application built with Visual Basic 6.0 shows the size before and after the compression process and computes the compression ratio for both the original and the enhanced RLE algorithm.
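The expansion problem the abstract describes is easy to reproduce with classic byte-pair RLE. The sketch below shows the textbook (count, value) scheme, not the paper's enhancement: on runs it compresses, but on non-repetitive data, typical of true-color pixels, every byte becomes two.

```python
def rle_encode(data: bytes) -> bytes:
    """Classic RLE: emit (count, value) byte pairs, runs capped at 255."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

def rle_decode(encoded: bytes) -> bytes:
    """Invert rle_encode by expanding each (count, value) pair."""
    out = bytearray()
    for count, value in zip(encoded[::2], encoded[1::2]):
        out += bytes([value]) * count
    return bytes(out)
```

Feeding 100 distinct byte values through `rle_encode` yields 200 bytes, which is precisely the doubling behaviour that motivates the enhanced algorithm.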
In this work, satellite images of Razaza Lake and the surrounding district in Karbala province are classified for the years 1990, 1999, and 2014 using two software packages (MATLAB 7.12 and ERDAS Imagine 2014). Proposed unsupervised and supervised classification methods, mean value and Singular Value Decomposition respectively, have been implemented in MATLAB, while the unsupervised K-Means and supervised Maximum Likelihood classifiers are utilized in ERDAS Imagine, in order to obtain the most accurate results, compare the results of each method, and calculate the changes that took place in 1999 and 2014 compared with 1990. The results from the classification indicated that
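Of the methods named above, K-Means is the simplest to illustrate. The sketch below clusters 1-D pixel intensities; the pixel values and cluster count are invented for the example and do not come from the study's imagery.

```python
import random

def kmeans(pixels, k, iters=20, seed=0):
    """Plain k-means on 1-D pixel values (e.g. grey-level intensities)."""
    rng = random.Random(seed)
    centers = rng.sample(pixels, k)          # initial centers from the data
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            nearest = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[nearest].append(p)
        # Move each center to the mean of its cluster (keep it if empty)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers

# Two well-separated intensity groups, purely illustrative
pixels = [1, 2, 3, 100, 101, 102]
centers = kmeans(pixels, 2)
```

Real remote-sensing K-Means works on multi-band feature vectors rather than scalars, but the assign-then-update loop is identical.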
Honeywords are fake passwords that serve as an accompaniment to the real password, which is called a “sugarword.” The honeyword system is an effective password-cracking detection system designed to easily detect password cracking in order to improve the security of hashed passwords. For every user, the password file of the honeyword system holds one real hashed password accompanied by numerous fake hashed passwords. If an intruder steals the password file from the system, successfully cracks the passwords, and attempts to log in to users’ accounts, the honeyword system will detect this attempt through the honeychecker. A honeychecker is an auxiliary server that distinguishes the real password from the fake passwords and triggers an alarm when a fake one is submitted.
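The honeychecker's role can be sketched as follows. This is a toy illustration under stated assumptions, not the paper's scheme: the decoy generator simply appends random digits (real honeyword generators are far more careful), and the honeychecker stores only the real password's index, never any password.

```python
import hashlib
import random

def make_sweetwords(real_password, k, seed=0):
    """Return k sweetwords (real + k-1 decoys) and the real one's index.
    Appending random digits is a toy stand-in for a real honeyword generator."""
    rng = random.Random(seed)
    sweetwords = [real_password + str(rng.randint(100, 999)) for _ in range(k - 1)]
    index = rng.randrange(k)
    sweetwords.insert(index, real_password)
    return sweetwords, index

# Honeychecker: an auxiliary store of (user -> real index), no passwords.
honeychecker = {}

def register(user, password, k=5):
    sweetwords, index = make_sweetwords(password, k)
    honeychecker[user] = index
    # The login server keeps only the hashes of all k sweetwords.
    return [hashlib.sha256(w.encode()).hexdigest() for w in sweetwords]

def check(user, submitted, stored_hashes):
    h = hashlib.sha256(submitted.encode()).hexdigest()
    if h not in stored_hashes:
        return "wrong password"
    if stored_hashes.index(h) == honeychecker[user]:
        return "login ok"
    return "ALARM: honeyword used, password file likely stolen"
```

Because a thief who cracks the stolen file cannot tell which of the k hashes is real, a login with any decoy betrays the breach.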
The volume of sensitive and important data has increased rapidly in recent decades, owing to the tremendous growth of networking infrastructure and communications. Securing this data becomes necessary as its volume increases; to achieve this, different cipher techniques and methods are used to ensure the goals of security: integrity, confidentiality, and availability. This paper presents a proposed hybrid text cryptography method that encrypts sensitive data using different encryption algorithms, such as Caesar, Vigenère, Affine, and multiplicative ciphers. Using this hybrid text cryptography method aims to make the encryption process more secure and effective. The hybrid text cryptography method depends on a circular queue. Using circ
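The four classical ciphers the abstract names can be chained as follows. This sketch is an illustration of the individual ciphers and one possible composition order, assumed for the example; the paper's circular-queue scheduling is not reproduced here since its details are truncated above. The multiplicative cipher is the Affine cipher with b = 0.

```python
def caesar(text, shift):
    """Shift each uppercase letter by `shift` positions (mod 26)."""
    return "".join(chr((ord(c) - 65 + shift) % 26 + 65) if c.isupper() else c
                   for c in text)

def affine(text, a, b):
    """Map each letter x to a*x + b (mod 26); a must be coprime with 26."""
    return "".join(chr(((ord(c) - 65) * a + b) % 26 + 65) if c.isupper() else c
                   for c in text)

def vigenere(text, key):
    """Shift each letter by the corresponding (repeating) key letter."""
    out, j = [], 0
    for c in text:
        if c.isupper():
            out.append(chr((ord(c) - 65 + ord(key[j % len(key)]) - 65) % 26 + 65))
            j += 1
        else:
            out.append(c)
    return "".join(out)

def hybrid_encrypt(plaintext, shift, a, b, key):
    """One illustrative chain: Caesar, then Affine, then Vigenère."""
    return vigenere(affine(caesar(plaintext, shift), a, b), key)
```

Each stage is independently invertible (Caesar by shifting back, Affine via the modular inverse of a, Vigenère by subtracting the key), so the chain decrypts in reverse order.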
Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. It has been applied in the present study using reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW shelf of Australia, was also investigated. Three hypotheses were suggested to investigate the AVO behaviour of the amplitude anomalies, in which three different factors were tested: fluid substitution, porosity, and thickness (wedge model). The AVO models with the synthetic gathers were analysed using log information to find which of these is the
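For readers unfamiliar with AVO, the standard two-term Shuey approximation captures the basic offset dependence: reflectivity varies as R(θ) ≈ R₀ + G·sin²θ, where R₀ is the normal-incidence intercept and G the gradient. This is the textbook formula, not necessarily the model the authors used; the numeric values below are invented for illustration.

```python
import math

def shuey_reflectivity(r0, gradient, theta_deg):
    """Two-term Shuey approximation of the AVO response:
    R(theta) ~ R0 + G * sin^2(theta)."""
    theta = math.radians(theta_deg)
    return r0 + gradient * math.sin(theta) ** 2
```

A class III gas-sand anomaly, for instance, shows a negative intercept whose magnitude grows with offset (negative gradient), which is the kind of behaviour the fluid-substitution hypothesis tests against.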
Corruption, in all its forms, has prevailed over the whole world, though to different degrees depending on those who lead each country: the rulers, followers, and officials who have been deemed the main cause of that corruption. When rulers are righteous, their countries are blessed and elevated; when they are corrupt, unjust tyrants, they become the disaster that befalls those countries with ruin, misery, and backwardness. This is what we sought to prove and clarify by considering the verses of the Holy Qur’an relevant to this topic. The research comprises an introduction, two topics, and a conclusion.
 