Medicine and computer science frequently combine to produce impressive results in both fields through the applications, programs and algorithms that the data mining field provides. The title of the present research contains the term hygiene, which may be described as the principle of maintaining the cleanliness of the external body. Environmental hygienic hazards can present themselves in various media, e.g. air, water, soil, etc., and the influence they exert on our health is complex and may be modulated by our genetic makeup, psychological factors and our perceptions of the risks they present. The main concern of this research is not to improve general health directly, but rather to propose a data mining approach that gives a clearer understanding and a set of automated general steps the data analyst can apply to obtain better results than typical statistical tests and database queries. The research proposes a new approach involving three data mining techniques applied in sequence, namely association rule mining, the Apriori algorithm and the Naïve Bayes classifier, to offer improved decision support results that can serve researchers in their fields.
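As an illustration of the rule-mining stage only, the following is a minimal, self-contained Apriori sketch in Python; the transactions, item names and thresholds are hypothetical stand-ins, not the study's hygiene data.

```python
# Minimal Apriori sketch: mine frequent itemsets level by level, then derive
# simple association rules {antecedent} -> {consequent}. Toy data only.
from itertools import combinations

transactions = [                       # hypothetical survey records
    {"poor_water", "skin_irritation"},
    {"poor_water", "air_pollution", "skin_irritation"},
    {"air_pollution"},
    {"poor_water", "skin_irritation"},
]
min_support, min_confidence = 0.5, 0.7   # illustrative thresholds

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Level-wise candidate generation (Apriori principle: every subset of a
# frequent itemset must itself be frequent).
items = sorted({i for t in transactions for i in t})
frequent, k = [], 1
current = [frozenset([i]) for i in items if support({i}) >= min_support]
while current:
    frequent.extend(current)
    k += 1
    candidates = {a | b for a in current for b in current if len(a | b) == k}
    current = [c for c in candidates if support(c) >= min_support]

# Keep only the confident rules.
for itemset in (s for s in frequent if len(s) > 1):
    for r in range(1, len(itemset)):
        for antecedent in map(frozenset, combinations(itemset, r)):
            confidence = support(itemset) / support(antecedent)
            if confidence >= min_confidence:
                print(set(antecedent), "->", set(itemset - antecedent),
                      f"support={support(itemset):.2f} confidence={confidence:.2f}")
```

On the toy data this prints the two rules between poor_water and skin_irritation; the frequent itemsets would then feed the Naïve Bayes stage described in the abstract.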
Botnet detection has become a challenging problem in numerous fields such as cybersecurity, law and order, finance, healthcare, and so on. A botnet is a group of compromised Internet-connected devices controlled by cyber criminals to launch coordinated attacks and carry out various malicious activities. Because botnets evolve continuously against the counter-measures deployed by both network- and host-based detection techniques, conventional techniques fail to provide sufficient protection against botnet threats. Thus, machine learning approaches have been established for detecting and classifying botnets for cybersecurity. This article presents a novel dragonfly algorithm with multi-class support vector machines enabled botnet…
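A hedged sketch of the multi-class SVM component alone follows; the dragonfly optimiser the abstract pairs with it is not reproduced, and the features, class count and hyperparameter values are illustrative assumptions rather than the article's settings.

```python
# Multi-class SVM sketch on synthetic "traffic features"; in the article's
# pipeline, a swarm optimiser such as the dragonfly algorithm would be the
# component asked to tune parameters like C and gamma (assumption).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import classification_report

# Synthetic flow features for four classes (e.g. benign + 3 botnet families).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

clf = SVC(kernel="rbf", C=10.0, gamma="scale", decision_function_shape="ovo")
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```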
This research aims to predict new COVID-19 cases in Bandung, Indonesia. The system implemented two types of deep learning methods for this purpose: the recurrent neural network (RNN) and long short-term memory (LSTM) algorithms. The data used in this study were the numbers of confirmed COVID-19 cases in Bandung from March 2020 to December 2020. Pre-processing of the data, namely data splitting and scaling, was carried out to obtain optimal results. During model training, hyperparameter tuning was performed on the sequence length and the number of layers. The results showed that the RNN gave better performance. The evaluation used the RMSE, MAE, and R2 metrics, with the best values being 0.66975075, 0.470…
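For illustration, a minimal LSTM forecasting sketch under assumed settings follows; the sequence length, layer size and the synthetic case series are placeholders for the tuned values and the Bandung data described above.

```python
# Sliding-window LSTM forecaster on a synthetic cumulative case series.
import numpy as np
from tensorflow import keras

cases = np.cumsum(np.random.poisson(30, size=300)).astype("float32")  # stand-in for daily counts
cases = (cases - cases.min()) / (cases.max() - cases.min())           # min-max scaling

seq_len = 7                                   # assumed sequence length
X = np.array([cases[i:i + seq_len] for i in range(len(cases) - seq_len)])
y = cases[seq_len:]
X = X[..., None]                              # (samples, timesteps, 1 feature)

model = keras.Sequential([
    keras.Input(shape=(seq_len, 1)),
    keras.layers.LSTM(32),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=16, validation_split=0.2, verbose=0)
print("MSE on the full series:", model.evaluate(X, y, verbose=0))
```

The RNN variant mentioned in the abstract would differ only in swapping the LSTM layer for a SimpleRNN layer of the same size.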
A mathematical method and a new algorithm, implemented with the aid of the Matlab language, are proposed to compute the linear equivalence (or recursion length) of pseudo-random key-stream periodic sequences using the Fourier transform. The proposed method enables computation of the linear equivalence in order to determine the degree of complexity of any binary or real periodic sequence produced by linear or nonlinear key-stream generators. The procedure can be used with comparatively greater computational ease and efficiency. The results of this algorithm are compared with the Berlekamp-Massey (BM) method, and good results are obtained: the Fourier transform results are more accurate than those of the BM method for computing the linear equivalenc…
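Since the abstract uses the Berlekamp-Massey method as its comparison baseline, a small Python sketch of that baseline is given below (the proposed Fourier-transform method itself is not reproduced); the example LFSR sequence and its degree-4 recursion are illustrative.

```python
def berlekamp_massey(bits):
    """Linear complexity (recursion length) of a binary sequence over GF(2)."""
    n = len(bits)
    c, b = [1] + [0] * n, [1] + [0] * n       # current and previous connection polynomials
    L, m = 0, -1
    for i in range(n):
        # Discrepancy: does the current LFSR reproduce bit i?
        d = bits[i]
        for j in range(1, L + 1):
            d ^= c[j] & bits[i - j]
        if d:
            t, shift = c[:], i - m
            for j in range(n - shift + 1):
                c[j + shift] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L

# Sequence from the recursion s[i] = s[i-1] XOR s[i-4] (connection polynomial 1 + x + x^4).
seq = [1, 0, 0, 0]
for i in range(4, 16):
    seq.append(seq[i - 1] ^ seq[i - 4])
print(berlekamp_massey(seq))   # 4, the degree of the recursion
```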
This study aims to design a unified electronic information system for managing student attendance at the Lebanese French University/Erbil. The system simplifies entering and counting student absences and generates absence reports to expel students who exceed the acceptable absence limit, so that the traditional paper-based way of counting absences can be replaced with a fully electronic system for managing student attendance, one that makes the results accurate and unchangeable by the students.
In order to achieve the study's objectives, we designed an information syst…
The main aim of this paper is to study how different estimators of the two unknown parameters (shape and scale) of the generalized exponential distribution behave for different sample sizes and different parameter values. In particular, the Maximum Likelihood, Percentile and Ordinary Least Squares estimators were implemented for different sample sizes (small, medium and large) under several contrasting initial values of the two parameters. Two performance indicators, Mean Square Error and Mean Percentile Error, were used, and the comparisons between the estimation methods were carried out using the Monte Carlo simulation technique. It was obse…
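As an illustration of the simulation setup, the following hedged sketch covers the maximum-likelihood estimator only, with assumed true parameter values, sample size and replication count; the percentile and ordinary least squares estimators are omitted.

```python
# Monte Carlo study of the MLE for the generalized exponential distribution,
# whose pdf is f(x) = a*l*(1 - exp(-l*x))**(a-1) * exp(-l*x), x > 0.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
alpha_true, lam_true, n, reps = 2.0, 1.5, 50, 500   # assumed true values, sample size, replications

def sample_ge(alpha, lam, size):
    """Inverse-CDF sampling: F(x) = (1 - exp(-lam*x))**alpha."""
    u = rng.uniform(size=size)
    return -np.log(1.0 - u ** (1.0 / alpha)) / lam

def neg_loglik(theta, x):
    a, l = theta
    if a <= 0 or l <= 0:
        return np.inf
    return -np.sum(np.log(a) + np.log(l) + (a - 1) * np.log1p(-np.exp(-l * x)) - l * x)

estimates = []
for _ in range(reps):
    x = sample_ge(alpha_true, lam_true, n)
    res = minimize(neg_loglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
    estimates.append(res.x)

estimates = np.array(estimates)
mse = np.mean((estimates - [alpha_true, lam_true]) ** 2, axis=0)   # Mean Square Error per parameter
print("mean estimates:", estimates.mean(axis=0), "MSE:", mse)
```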
Givers of external audit on the social responsibility of profit organizations: the present period is characterized by large-scale activity of economic organizations, since there are many transactions between these organizations and the various development techniques of the financial markets.
This encourages business people to increase their investment efforts in these markets. Because accounting, in general terms, represents a language for these organizations' activities and translates them into factual numbers, there is a need for accounting records that verify the behaviour of these organizations and its harmony with their objectives.
In this respect, the audit function comes to che…
Breast cancer is the second deadliest disease infecting women worldwide. For this reason, early detection is one of the most essential steps to overcome it, relying on automatic tools such as artificial intelligence. Medical applications of machine learning algorithms are mostly based on their ability to handle classification problems, including classifying illnesses or estimating prognosis. Before machine learning is applied for diagnosis, it must first be trained. The research methodology evaluates different machine learning algorithms, such as Random Tree, ID3, CART, SMO, C4.5 and Naive Bayes, to find the algorithm with the best training result. The contribution of this research is to test the data set with mis…
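A hedged sketch of the comparison step is shown below, using scikit-learn analogues of the Weka classifiers named above (DecisionTreeClassifier in place of Random Tree/ID3/CART/C4.5, SVC in place of SMO, GaussianNB for Naive Bayes); the Wisconsin breast cancer data bundled with scikit-learn is a stand-in for the study's data set.

```python
# Compare several classifiers with 10-fold cross-validation on a breast cancer dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB

X, y = load_breast_cancer(return_X_y=True)
models = {
    "decision tree (CART analogue)": DecisionTreeClassifier(random_state=0),
    "SVM (SMO analogue)": SVC(kernel="linear"),
    "Naive Bayes": GaussianNB(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10)   # 10-fold cross-validation accuracy
    print(f"{name:30s} mean accuracy = {scores.mean():.3f}")
```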
The last two decades have seen a marked increase in illegal activities on the Dark Web. Rapid evolution and the use of sophisticated protocols make it difficult for security agencies to identify and investigate these activities by conventional methods. Moreover, tracing criminals and terrorists poses a great challenge, bearing in mind that cybercrimes are no less serious than real-life crimes. At the same time, computer security communities and law enforcement pay a great deal of attention to detecting and monitoring illegal sites on the Dark Web. Retrieving relevant information is not an easy task because of the vastness and ever-changing nature of the Dark Web; as a result, web crawlers play a vital role in achieving this task. The…
Evaluating data on the age and gender structure of the population is one of the important factors that help any country draw up plans and programmes for the future. This study discusses the errors in the population data of the 1997 Iraqi census, with the aim of correcting and revising them to serve planning purposes, by smoothing the population data with a nonparametric regression estimator (the Nadaraya-Watson estimator). This estimator depends on a bandwidth (h) that can be calculated in two ways using the Bayesian method: the first when the observation distribution is a Lognormal kernel, and the second when it is a Normal kernel.
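The estimator itself is m̂(x) = Σᵢ K((x − xᵢ)/h) yᵢ / Σᵢ K((x − xᵢ)/h). A minimal sketch with a Gaussian kernel and a fixed, illustrative bandwidth follows; the age and count values are synthetic, not the 1997 census data, and the Bayesian bandwidth selection is not reproduced.

```python
# Nadaraya-Watson kernel regression with a Gaussian kernel.
import numpy as np

def nadaraya_watson(x_grid, x, y, h):
    """m_hat(x0) = sum_i K((x0 - x_i)/h) * y_i / sum_i K((x0 - x_i)/h)."""
    u = (x_grid[:, None] - x[None, :]) / h
    k = np.exp(-0.5 * u ** 2)              # unnormalised Gaussian kernel (constant cancels)
    return (k * y).sum(axis=1) / k.sum(axis=1)

ages = np.arange(0, 80, dtype=float)                        # single-year ages
counts = 1000 * np.exp(-ages / 40) + np.random.default_rng(1).normal(0, 30, ages.size)
smoothed = nadaraya_watson(ages, ages, counts, h=2.5)       # h would come from the Bayesian rule
print(smoothed[:5])
```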