Purpose: The research aims to estimate models for phenomena that follow the logic of circular (angular) data, accounting for the 24-hour periodicity in measurement. Theoretical framework: The regression model is developed to account for the periodic nature of the circular scale, considering periodicity in the dependent variable y, the explanatory variables x, or both. Design/methodology/approach: Two estimation methods were applied: a parametric model, represented by the Simple Circular Regression (SCR) model, and a nonparametric model, represented by the Nadaraya-Watson Circular Regression (NW) model. The analysis used real data from 50 patients at Al-Kindi Teaching Hospital in Baghdad. Findings: The Mean Circular Error (MCE) criterion was used to compare the two models, leading to the conclusion that the Nadaraya-Watson (NW) circular model outperformed the parametric model in estimating the parameters of the circular regression model. Research, Practical & Social Implications: The recommendation emphasized using the Nadaraya-Watson nonparametric smoothing method to capture the nonlinearity in the data. Originality/value: The results indicated that the Nadaraya-Watson circular model (NW) outperformed the parametric model. Paper type: Research paper.
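As a rough illustration of the nonparametric side (a sketch, not the authors' exact specification), a Nadaraya-Watson estimator for a circular response can weight observations with a von Mises kernel on the angular distance and return the weighted mean direction. The function name, the concentration parameter `kappa`, and the toy data below are illustrative assumptions:

```python
import numpy as np

def nw_circular(t0, t, y, kappa=5.0):
    """Nadaraya-Watson estimate at query angle t0 for a circular response y.

    Weights come from a von Mises kernel on the angular distance between
    the query and the observed predictor angles; the fitted value is the
    weighted mean direction of the response angles.
    """
    w = np.exp(kappa * np.cos(t0 - t))   # von Mises kernel weights
    s = np.sum(w * np.sin(y))            # weighted sine component
    c = np.sum(w * np.cos(y))            # weighted cosine component
    return np.arctan2(s, c)              # weighted mean direction
```

A larger `kappa` narrows the kernel (less smoothing); in practice the bandwidth would be chosen by cross-validation against a criterion such as the mean circular error.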
Authors: Muayad Kadhim Raheem, Lina Fouad Jawad. Published in: Opción: Revista de Ciencias Humanas y Sociales, No. 21, 2019. Journal article indexed in Dialnet.
Because of the rapid development and use of the Internet as a communication medium, the need emerged for a high level of security during data transmission, and one of the ways to achieve it is steganography. This paper reviews Least Significant Bit (LSB) steganography used for embedding a text file, together with a related image, in a gray-scale image. We also discuss the bit planes: the image is divided into eight bit-plane images which, when combined, reproduce the actual image. The findings of the research were that the stego-image is indistinguishable to the naked eye from the original cover image when the bit plane used is lower than the fourth; thus we reach the goal, which is to conceal the existence of a connection or hidden data. The Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (
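As a minimal sketch of the embedding idea (illustrative code, not the paper's implementation; the function names are assumptions), writing a bit stream into the lowest bit plane of a gray-scale cover changes each pixel value by at most 1, which is why the stego-image stays visually indistinguishable:

```python
import numpy as np

def embed_lsb(cover, bits):
    """Write a bit stream into the least significant bit plane
    of a uint8 gray-scale cover image (one bit per pixel)."""
    stego = cover.ravel().copy()
    stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | bits
    return stego.reshape(cover.shape)

def extract_lsb(stego, n):
    """Read back the first n embedded bits."""
    return stego.ravel()[:n] & 1

def psnr(cover, stego):
    """Peak Signal-to-Noise Ratio between cover and stego images."""
    mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
```

Embedding in higher bit planes (generalizing the mask from `0xFE` to clearing k low bits) distorts the cover more, consistent with the observation that visual quality holds only below the fourth bit plane.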
In the fields of image processing and computer vision, it is important to represent an image by its information. Image information comes from the image's features, which are extracted using feature detection/extraction techniques and feature description. In computer vision, features define informative data. The human eye readily extracts information from a raw image, but a computer cannot recognize image information directly; this is why various feature extraction techniques have been presented and have progressed rapidly. This paper presents a general overview of the categories of image feature extraction.
The present research deals with the influencing factors that depend on the perceptual approach of the graphic designer, which enters into the design of European health logos. The research comprises four chapters. In chapter one the researcher reviews the methodological frame of the research; the second chapter reviews the theoretical frame and the previous studies in three sections: the first section covers perception, its definition and its types; the second section covers the factors influencing the designer's perceptual approach and their divisions; and the third section covers perception in graphic design through perceived shapes and their relation with the ground and colors to express the i
The e-health care system is one of the great technology enhancements, achieved by using medical devices with sensors worn on or implanted in the patient's body. A Wireless Body Area Network (WBAN) offers remarkable help through wireless transmission of the patient's data over an agreed distance, keeping the patient's status under constant control by regularly transmitting vital-sign indications to the receiver. Security and privacy are a major concern for data sent from the WBAN and biological sensors. Several algorithms have been proposed under many hypotheses in order to find optimum solutions. In this paper, an encrypting algorithm is proposed using the hyper-chaotic Zhou system, which provides high security, privacy, efficiency and
Attacks on data transferred over a network happen millions of times a day. To address this problem, a secure scheme is proposed for securing data transferred over a network. The proposed scheme uses two techniques to guarantee secure transfer of a message: the message is first encrypted and then hidden in a video cover. The encryption technique is the RC4 stream cipher algorithm, used to increase the message's confidentiality, while the least significant bit (LSB) embedding algorithm is improved by adding an additional layer of security. The improvement of the LSB method comes from replacing the adopted sequential selection with a random selection of the frames and the pixels wit
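The two layers described above can be sketched as follows (an illustrative reconstruction, not the authors' code; function names and the shared seed are assumptions): RC4 produces the ciphertext, and a seeded pseudo-random generator picks which positions receive the bits, so only a receiver holding both the key and the seed can locate and decrypt the message:

```python
import random

def rc4(key: bytes, data: bytes) -> bytes:
    """RC4 stream cipher: XOR the data with the RC4 keystream.
    Encryption and decryption are the same operation."""
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA)
    out, i, j = bytearray(), 0, 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

def random_positions(seed: int, total: int, needed: int):
    """Seeded random choice of embedding positions (frames/pixels);
    the receiver regenerates the same sequence from the shared seed."""
    return random.Random(seed).sample(range(total), needed)
```

Because RC4 is symmetric, applying `rc4` with the same key to the ciphertext recovers the plaintext, and the same seed always reproduces the same embedding positions.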
General medical fields and computer science usually come together to produce impressive results in both fields through the applications, programs and algorithms provided by the field of data mining. The title of the present research contains the term hygiene, which may be described as the principle of maintaining cleanliness of the external body, while environmental hygienic hazards can present themselves in various media, e.g. air, water, soil, etc. The influence they can exert on our health is very complex and may be modulated by our genetic makeup, psychological factors and our perceptions of the risks they present. Our main concern in this research is not to improve general health but rather to propose a data mining approach
The influx of data in bioinformatics is primarily in the form of DNA, RNA, and protein sequences, a condition that places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that can be used to divide unlabeled data into clusters. The k-means and fuzzy c-means (FCM) algorithms are examples of algorithms that can be used for clustering. Clustering is thus a common approach that divides an input space into several homogeneous zones and can be achieved using a variety of algorithms. This study used three models to cluster a brain tumor dataset. The first model uses FCM, whic
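As a compact sketch of one technique named above (illustrative, not the study's implementation; the fuzzifier `m` and iteration count are assumed defaults), fuzzy c-means alternates between updating cluster centers from membership-weighted means and updating the soft memberships from distances:

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Fuzzy c-means: returns cluster centers and a soft membership
    matrix u of shape (n_samples, c) whose rows sum to 1."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)  # random initial memberships
    for _ in range(iters):
        um = u ** m                    # fuzzified memberships
        centers = (um.T @ X) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)  # standard FCM update
    return centers, u
```

Unlike k-means, each sample keeps a graded membership in every cluster; hardening with `u.argmax(axis=1)` recovers a k-means-style partition.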