General medicine and computer science frequently converge to produce impressive results in both fields, using the applications, programs, and algorithms provided by data mining. The title of the present research contains the term hygiene, which may be described as the principle of maintaining cleanliness of the external body. Environmental hygienic hazards can present themselves in various media, e.g. air, water, and soil. The influence they exert on our health is complex and may be modulated by our genetic makeup, psychological factors, and our perception of the risks they present. Our main concern in this research is not to improve general health as such, but to propose a data mining approach that offers a clearer understanding and automated general steps the data analyst can use to obtain better results than typical statistical tests and database queries. This research proposes a new approach that applies three data mining techniques in sequence, association rule mining, the Apriori algorithm, and Naïve Bayes, to offer improved decision support results that can serve researchers in their fields.
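The Apriori step named in this abstract can be illustrated with a minimal sketch; the transaction data below is hypothetical and the candidate-generation join is simplified (no full pruning), so this is an illustration of the idea rather than the study's implementation:

```python
def apriori(transactions, min_support):
    """Return frequent itemsets (frozensets) mapped to their support."""
    n = len(transactions)
    # Candidate 1-itemsets drawn from the data
    current = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    k = 1
    while current:
        # Count support for each candidate itemset
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        survivors = {c: cnt / n for c, cnt in counts.items() if cnt / n >= min_support}
        frequent.update(survivors)
        # Simplified join: unions of surviving k-itemsets that have size k+1
        keys = list(survivors)
        current = {a | b for i, a in enumerate(keys)
                   for b in keys[i + 1:] if len(a | b) == k + 1}
        k += 1
    return frequent

# Hypothetical market-basket transactions
transactions = [frozenset(t) for t in [
    {"milk", "bread"}, {"milk", "eggs"},
    {"milk", "bread", "eggs"}, {"bread"}]]
result = apriori(transactions, min_support=0.5)
```

The frequent itemsets found this way are the raw material from which association rules (and, downstream, Naïve Bayes features) would be derived.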
The very fast development of web and data collection technologies has enabled non-experts to collect and disseminate geospatial datasets through web applications. This new type of spatial data is usually known as collaborative mapping or volunteered geographic information (VGI). Various countries around the world could benefit from collaborative mapping data because it is cost-free, easy to access, and provides more customised data. However, there is a concern about its quality, because the data collectors may lack sufficient experience and training in geospatial data production. Most previous studies that have outlined and analysed VGI quality focused on positional and linear features. The current research has been
In this research, aflatoxins were produced, extracted, isolated, and purified in order to optimize the storage conditions of feed. Using a preparative thin layer chromatography (TLC) method, with commercially available plates of 1.5 to 2.0 mm, B1 aflatoxin was isolated from the feed samples of whole wheat, maize, and crushed rice, and the procedure was repeated four times. A purity value of 99 % for B1 aflatoxin was achieved and verified using spectrophotometric and chromatographic methods. As solvents, acetone:water (85:15) was used for aflatoxin extraction from the feed sample, whereas methanol:water (50:50) was used for trichothecenes extraction. The primary findings of this research indicate that B1 aflatoxin reac
Many images require a large storage space. With the continued evolution of computer storage technology, there is a pressing need to reduce the storage space required for images and to compress them effectively using the wavelet transform method
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, also face heightened vulnerability to hacking due to their role as the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within the confines of green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the
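As an illustration of the swarm intelligence family this abstract invokes, here is a minimal particle swarm optimisation sketch; the objective function, coefficients, and swarm size are illustrative assumptions, not the paper's detection model:

```python
import random

def pso(f, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimisation: minimise f over a box."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + pull toward personal best + pull toward global best
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimise the 2-D sphere function as a stand-in objective
best, best_val = pso(lambda x: sum(v * v for v in x), dim=2)
```

In a detection setting, the objective would score how anomalous a candidate device profile is; here the sphere function simply demonstrates the swarm dynamics.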
This research addresses the practical side by taking case studies of construction projects across the various Iraqi governorates. It includes a field survey to identify the impact of parametric costs on construction projects, to compare them with what was reached during the analysis, and to assess their validity and accuracy, as well as adopting personal interviews to learn the actual state of construction projects. After comparing field data and its measurement in construction projects for the public and private sectors, the results showed that the correlation between the expected and actual cost change was 97.8%, which means that the data can be adopted in the re
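A correlation of the kind reported above is conventionally computed as a Pearson coefficient; a minimal sketch with hypothetical project data (not the study's field data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical expected vs. actual cost changes (%) for five projects
expected = [4.0, 7.5, 3.2, 9.1, 5.6]
actual = [4.3, 7.1, 3.5, 9.4, 5.2]
r = pearson(expected, actual)
```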
Twitter's popularity has grown increasingly in the last few years, influencing the social, political, and business aspects of life. People leave tweets on social media about an event and simultaneously inquire into other people's experiences and whether they had a positive or negative opinion about that event. Sentiment analysis can be used to obtain this categorization. Product reviews, events, and other topics from all users, comprising unstructured text comments, are gathered and categorized as positive, negative, or neutral using sentiment analysis. Such tasks are called polarity classification. This study aims to use Twitter data about OK cuisine reviews obtained from the Amazon website and compare the effectiveness
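Polarity classification of the kind described can be sketched with a small bag-of-words Naïve Bayes classifier; the training reviews below are invented for illustration, not drawn from the study's dataset:

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (tokens, label). Returns the fitted model pieces."""
    labels = Counter(lbl for _, lbl in docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, lbl in docs:
        word_counts[lbl].update(tokens)
        vocab.update(tokens)
    return labels, word_counts, vocab, len(docs)

def classify_nb(tokens, model):
    """Pick the label with the highest log-posterior under the model."""
    labels, word_counts, vocab, n = model
    best, best_lp = None, float("-inf")
    for lbl, cnt in labels.items():
        lp = math.log(cnt / n)  # log prior
        total = sum(word_counts[lbl].values())
        for w in tokens:
            # Laplace smoothing over the shared vocabulary
            lp += math.log((word_counts[lbl][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = lbl, lp
    return best

# Hypothetical labelled reviews
train = [("great tasty food".split(), "positive"),
         ("loved the food".split(), "positive"),
         ("terrible awful service".split(), "negative"),
         ("bad food awful".split(), "negative")]
model = train_nb(train)
pred = classify_nb("tasty great".split(), model)
```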
The research aims to identify the theoretical framework of technical reserves in insurance activity and the role of the auditor in verifying the soundness of the estimates of technical provisions (technical reserves) for the general insurance branches of insurance companies, based on the relevant international auditing standards. A proposed audit program has been prepared in accordance with international auditing standards that enables the auditor to express a sound opinion on the fairness of the financial statements of these companies. The research reached many conclusions, the most important of which is the existence of deficiencies in the audit procedures of insurance companies, as the audit program of those companies did
There are many tools and software systems to generate finite state automata (FSA), owing to their importance in modeling and simulation and their wide variety of applications. However, no appropriate tool can generate FSAs for DNA motif templates, due to the huge size of the motif template and the optional paths in the motif structure, which are represented by the gap. These factors mean the specifications of the automata to be generated are unavailable, and this absence of specifications makes the generation process very difficult. This paper presents a novel algorithm to construct FSAs for DNA motif templates. This research is the first to present the problem of generating FSAs for DNA motif temp
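A minimal sketch of building an FSA for a linear DNA motif follows, with 'N' as a wildcard standing in for ambiguous positions; this is an illustrative simplification (no gap handling), not the paper's algorithm:

```python
def motif_fsa(motif):
    """Build a transition table for a linear DNA motif; 'N' matches any base."""
    bases = "ACGT"
    trans = {}
    for state, sym in enumerate(motif):
        for b in bases:
            if sym == "N" or sym == b:
                trans[(state, b)] = state + 1  # advance one position per match
    return trans, len(motif)

def accepts(seq, fsa):
    """Run the automaton; accept iff the whole motif is consumed exactly."""
    trans, accept = fsa
    state = 0
    for b in seq:
        state = trans.get((state, b))
        if state is None:
            return False  # no transition: reject
    return state == accept

fsa = motif_fsa("ACNGT")
```

Handling gaps of variable length, as the paper's templates require, would add epsilon-like optional transitions on top of this linear chain.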
Churning of employees from organizations is a serious problem. Employee turnover, or churn, within an organization needs to be addressed since it has a negative impact on the organization. Manual detection of employee churn is quite difficult, so machine learning (ML) algorithms have frequently been used for employee churn detection as well as for categorizing employees according to turnover. To date, only one study has looked into the categorization of employees using machine learning. A novel multi-criteria decision-making (MCDM) approach coupled with the DE-PARETO principle has been proposed to categorize employees; this is referred to as the SNEC scheme. An AHP-TOPSIS DE-PARETO PRINCIPLE model (AHPTOPDE) has been designed that uses a 2-stage MCDM s
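The TOPSIS stage of an AHP-TOPSIS pipeline can be sketched as follows; the employee criteria, scores, and weights below are hypothetical, and in the full scheme the weights would come from the AHP stage:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution.
    matrix: rows = alternatives, cols = criteria.
    benefit[j] is True if higher is better for criterion j."""
    ncols = len(weights)
    # Vector-normalise each column, then apply the criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((x - i) ** 2 for x, i in zip(row, ideal)))
        d_neg = math.sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient in [0, 1]
    return scores

# Hypothetical employees scored on (performance, absenteeism)
matrix = [[9, 2], [5, 6], [7, 3]]
scores = topsis(matrix, weights=[0.6, 0.4], benefit=[True, False])
```

The resulting closeness scores give the ranking on which a Pareto-style cut (e.g. the DE-PARETO split named above) could then categorize employees.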