Psychological research centers indirectly support professionals concerned with human life, the job environment, family life, and the psychological infrastructure for psychiatric patients. This research aims to detect job apathy patterns from the behavior of employee groups at the University of Baghdad and the Iraqi Ministry of Higher Education and Scientific Research. The investigation presents an approach that uses data mining techniques to acquire new knowledge, and it differs from statistical studies in its support for the researchers' evolving needs. These techniques handle redundant and irrelevant attributes to uncover interesting patterns. The principal task is to identify the most important and effective questions in a questionnaire recommended by psychiatric researchers; useless questions are pruned using an attribute selection method. The information gain of each remaining question is then measured with respect to a specific class, and the questions are ranked accordingly. Association rule mining with the Apriori algorithm is used to detect the most influential and interrelated questions in the questionnaire. Consequently, the decisive parameters that may lead to job apathy are determined.
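As an illustration of the two steps just described, the sketch below ranks questionnaire items by information gain and then mines frequently co-occurring items with Apriori. The item names (q1..q4), the apathy label, and the support threshold are hypothetical placeholders, not values from the study.

```python
# Minimal sketch: information-gain ranking + Apriori, on toy questionnaire data.
import math
import pandas as pd
from mlxtend.frequent_patterns import apriori

def entropy(labels):
    """Shannon entropy of a label column."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in labels.value_counts())

def information_gain(df, attr, target="apathy"):
    """Gain of `attr` with respect to the class column `target`."""
    base = entropy(df[target])
    cond = sum(len(g) / len(df) * entropy(g[target])
               for _, g in df.groupby(attr))
    return base - cond

# Toy responses: 1 = agrees with the statement, 0 = disagrees.
df = pd.DataFrame({
    "q1": [1, 1, 0, 1, 0, 1], "q2": [0, 1, 0, 0, 1, 1],
    "q3": [1, 0, 1, 1, 0, 0], "q4": [1, 1, 1, 0, 0, 1],
    "apathy": [1, 1, 0, 1, 0, 1],
})

# 1) Attribute selection: rank questions by information gain, prune the rest.
ranked = sorted(df.columns[:-1], key=lambda q: information_gain(df, q), reverse=True)
print("questions ranked by gain:", ranked)

# 2) Association mining: frequent question combinations via Apriori.
items = df.astype(bool)
frequent = apriori(items, min_support=0.5, use_colnames=True)
print(frequent.sort_values("support", ascending=False))

# Confidence of an illustrative rule q1 -> apathy, computed from supports.
sup = lambda cols: items[list(cols)].all(axis=1).mean()
print("conf(q1 -> apathy) =", sup(("q1", "apathy")) / sup(("q1",)))
```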
Rising temperature is the most significant aspect of climate variability. In this study, PRECIS model data and observed data are used to assess temperature change scenarios for Sindh province during the first half of the present century. Observed data from various meteorological stations in Sindh are the primary source for temperature change detection. The baseline scenario (1961-1990) and the future one (2010-2050) are simulated by the PRECIS Regional Climate Model at a spatial resolution of 25 × 25 km. A Regional Climate Model (RCM) can yield reasonably suitable projections for use in climate scenario studies. The main objective of the study is to prepare temperature change maps. The simulated temperature as obtained from the climate model…
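A minimal sketch of the comparison that underlies such maps, assuming hypothetical station means rather than actual PRECIS output: the per-station difference between the future (2010-2050) and baseline (1961-1990) mean temperatures is the change signal that would be mapped over the province.

```python
# Minimal sketch: per-station temperature-change signal between two periods.
import pandas as pd

# Toy annual-mean temperatures (deg C) per station and period (hypothetical).
data = pd.DataFrame({
    "station": ["Karachi", "Hyderabad", "Jacobabad"] * 2,
    "period":  ["1961-1990"] * 3 + ["2010-2050"] * 3,
    "t_mean":  [26.1, 26.8, 27.9, 27.4, 28.3, 29.6],
})

means = data.pivot_table(index="station", columns="period", values="t_mean")
means["delta"] = means["2010-2050"] - means["1961-1990"]
print(means)  # `delta` is the warming signal to be mapped over Sindh
```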
Radiation treatment has long been the conventional approach for treating nasopharyngeal cancer (NPC) tumors because of their anatomic features, biological characteristics, and radiosensitivity; radiotherapy is the most common treatment for nasopharyngeal carcinoma. This study aimed to compare the plan quality of two radiotherapy treatment techniques: intensity-modulated radiotherapy (IMRT) and volumetric-modulated arc therapy (VMAT). Forty patients with nasopharyngeal carcinoma referred for radiotherapy were planned with both advanced techniques, IMRT and VMAT, using Varian Eclipse software. The x-ray energy was set at 6 MV, and the total prescribed dose was 70 Gy. The results show that the…
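One common way plans from two such techniques are compared is through dose-volume metrics; the sketch below computes D2%, D50%, D98%, and the homogeneity index from synthetic voxel doses. It is an illustration under assumed data, not the study's evaluation protocol.

```python
# Minimal sketch: dose-volume metrics and homogeneity index HI = (D2 - D98) / D50.
import numpy as np

rng = np.random.default_rng(0)
plans = {
    "IMRT": rng.normal(70.0, 1.5, 10_000),  # toy PTV voxel doses (Gy)
    "VMAT": rng.normal(70.0, 1.0, 10_000),
}

for name, dose in plans.items():
    # Dx% = dose received by the hottest x% of the volume, i.e. the
    # (100 - x)th percentile of the voxel-dose distribution.
    d2, d50, d98 = np.percentile(dose, [98, 50, 2])
    hi = (d2 - d98) / d50  # closer to 0 means a more homogeneous plan
    print(f"{name}: D2={d2:.1f} Gy  D98={d98:.1f} Gy  HI={hi:.3f}")
```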
Background: The aims of the study were to evaluate unclean/clean root canal surface areas using a histopathological cross-sectional view of the root canal and the isthmus, and to evaluate the efficiency of instrumentation of the isthmus using different rotary instrumentation techniques. Materials and Methods: The mesial roots of thirty human mandibular molars were divided into six groups, each composed of five roots (10 root canals), which were prepared and irrigated as follows: Group one A: ProTaper system to size F2 and hypodermic syringe; Group one B: ProTaper system to size F2 and EndoActivator system; Group two A: WaveOne small then primary file and hypodermic syringe; Group two B: WaveOne small then primary file and EndoActivator system; Gr…
The study aims to demonstrate the significance of metaverse technology across various disciplines, academic degrees, scientific fields, and academic titles, and to assess the level of knowledge and understanding of university teachers (the research sample) regarding metaverse technology. Hence, a descriptive research methodology based on a statistical survey of the sample was adopted, involving a set of organized scientific steps to derive data from the statistical sample and its nature in order to achieve the objectives of the study. A questionnaire was used as the data collection tool, administered to a random sample of 121 teachers and instructors from the University of Baghdad. This app…
The Federal Supreme Court of Iraq was established by the Federal Supreme Court Law No. (30) of 2005 on the basis of the provisions of the Law of Administration for the State of Iraq for the Transitional Period of 2004. The law included a clear reference to the main purpose for which the Federal Supreme Court was established: the separation of jurisdictions between the different levels of government, as well as its competence to review the constitutionality of laws. The Court was also referred to in the Constitution of 2005, which defined its powers in Article (93), in addition to other terms of reference in the other articles of the Constitution and the laws in force. From a reading of the above constitutional texts, the Federal Supreme Court…
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a concept frame graph (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the behavior of any cluster are treated as noise or anomalies. DBSCAN can thus detect abnormal points that lie beyond a set distance threshold from any cluster (extreme values). However, not all anomalies are of this kind, that is, unusual points far from a specific group; there is also a type of data that occurs rarely rather than repeatedly and is therefore considered abnormal relative to the known groups. The analysis showed DBSCAN using the…
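A minimal sketch of the DBSCAN detection step described here, using scikit-learn: points labeled -1 fall outside every dense region and are flagged as anomalies. The eps and min_samples values are illustrative, and the paper's CFG conversion is not reproduced.

```python
# Minimal sketch: DBSCAN noise points (label -1) treated as anomalies.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
cluster_a = rng.normal(0.0, 0.3, size=(100, 2))
cluster_b = rng.normal(4.0, 0.3, size=(100, 2))
outliers = np.array([[2.0, 9.0], [-5.0, -5.0]])   # far from both groups
X = np.vstack([cluster_a, cluster_b, outliers])

labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]
print(f"{len(anomalies)} anomalies detected:\n{anomalies}")
```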
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying the method to plaintext (the original message): intelligible plaintext is transformed into unintelligible ciphertext so as to secure information from unauthorized access and theft. An encryption scheme usually uses a pseudo-random encryption key generated by an algorithm; here, all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB as the programming language, with Notepad++ used to write the input text.
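A minimal sketch of one way a Pascal-matrix cipher can work, assuming block-wise multiplication modulo 256; the paper implements its scheme in MATLAB, so this Python reconstruction is illustrative rather than the authors' exact method. The lower-triangular Pascal matrix has determinant 1, so it has an exact integer inverse and decryption is lossless.

```python
# Minimal sketch: block cipher via the lower-triangular Pascal matrix mod 256.
import numpy as np
from math import comb

N = 4  # block size (illustrative)

# Lower-triangular Pascal matrix L[i][j] = C(i, j) and its exact integer
# inverse L_inv[i][j] = (-1)^(i-j) * C(i, j).
L = np.array([[comb(i, j) for j in range(N)] for i in range(N)], dtype=np.int64)
L_inv = np.array([[(-1) ** (i - j) * comb(i, j) for j in range(N)]
                  for i in range(N)], dtype=np.int64)

def encrypt(text: str) -> np.ndarray:
    data = list(text.encode("utf-8"))
    data += [0] * (-len(data) % N)              # zero-pad to a whole block
    blocks = np.array(data, dtype=np.int64).reshape(-1, N)
    return (blocks @ L.T) % 256                 # c = L p (mod 256) per block

def decrypt(cipher: np.ndarray) -> str:
    blocks = (cipher @ L_inv.T) % 256           # p = L^{-1} c (mod 256)
    return bytes(blocks.reshape(-1).tolist()).rstrip(b"\x00").decode("utf-8")

c = encrypt("Pascal matrix cipher")
print(c)
print(decrypt(c))  # -> "Pascal matrix cipher"
```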
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key itself: for a higher level of secure communication, the key plays an essential role, and both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weakened by weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; a stronger encryption key enhances the security of the Triple Data Encryption Standard. This paper proposed a combination of two efficient encryption algorithms to…
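A minimal sketch of one standard key-hygiene step for Triple DES using pycryptodome: draw random key material, fix the DES parity bits, and reject keys that degenerate to single DES. This illustrates stronger key generation only, not the paper's combined two-algorithm scheme.

```python
# Minimal sketch: generating a well-formed Triple DES key with pycryptodome.
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes

def fresh_3des_key() -> bytes:
    while True:
        try:
            # 24 bytes = three independent DES subkeys (keying option 1);
            # adjust_key_parity fixes the parity bit of each key byte.
            return DES3.adjust_key_parity(get_random_bytes(24))
        except ValueError:
            continue  # raised when the subkeys degenerate to single DES

key = fresh_3des_key()
cipher = DES3.new(key, DES3.MODE_EAX)
ciphertext, tag = cipher.encrypt_and_digest(b"secret message")
print(ciphertext.hex())
```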
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike information criterion (AIC) value. The case study covers four variables and three sites: the variables are monthly air temperature, humidity, precipitation, and evaporation, and the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was…
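A minimal sketch of the model skeleton, assuming synthetic series in place of the monthly records: a lag-2 (two-time-step) linear model is fitted across the stacked variable-site series and scored with AIC, the objective the genetic-algorithm mutation step would go on to minimize (the GA itself is omitted).

```python
# Minimal sketch: lag-2 multivariate linear model scored by AIC.
import numpy as np

rng = np.random.default_rng(2)
n_series, n_months = 12, 240          # 4 variables x 3 sites, 20 years (toy)
Y = np.cumsum(rng.normal(size=(n_months, n_series)), axis=0)

# Design matrix: values at lags 1 and 2 predict the current time step,
# so every series can draw on every other series at both lags.
X = np.hstack([Y[1:-1], Y[:-2]])      # shape (n_months - 2, 2 * n_series)
T = Y[2:]                             # targets

B, *_ = np.linalg.lstsq(X, T, rcond=None)   # least-squares parameter estimate
resid = T - X @ B
n, k = resid.shape[0] * resid.shape[1], B.size
aic = n * np.log((resid ** 2).sum() / n) + 2 * k   # objective to minimize
print(f"AIC = {aic:.1f}")
```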
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been analyzed and developed to analyze this type of data.
In this research, the focus was on grouping and analyzing these data: cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time, and the grouped profiles are fitted with a nonparametric cubic B-spline smoothing model. This model provides continuous first and second derivatives, yielding a smoother curve with fewer abrupt changes in slope; it is also flexible enough to capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroup…
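A minimal sketch of the smoothing step, fitting a cubic smoothing spline (built from B-splines, hence the continuous first and second derivatives) to one synthetic longitudinal profile; the clustering of co-expressed profiles is not shown.

```python
# Minimal sketch: cubic smoothing spline for one longitudinal profile.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 25)                    # repeated measurement times
y = np.sin(t) + rng.normal(0, 0.15, t.size)   # one noisy profile (toy)

spline = UnivariateSpline(t, y, k=3, s=0.5)   # k=3: cubic, C2-continuous
print(spline(t[:5]))                 # smoothed values
print(spline.derivative(2)(t[:5]))   # continuous second derivative
```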