A database is an organized collection of data, structured and distributed so that users can access the stored information easily and conveniently. In the era of big data, however, traditional data-analytics methods may be unable to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to process big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
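The MapReduce processing the abstract describes can be illustrated with a minimal, self-contained sketch; plain Python stands in for an actual Hadoop job here, and the per-channel-mean task and channel names are hypothetical examples, not the paper's workload:

```python
from collections import defaultdict

# Illustrative MapReduce-style pipeline: each record is (channel, sample_value);
# the map phase emits partial (value, count) pairs per chunk, and the reduce
# phase aggregates them so per-channel means can be computed across chunks.

def map_phase(records):
    """Emit (channel, (value, 1)) pairs for one distributed data chunk."""
    for channel, value in records:
        yield channel, (value, 1)

def reduce_phase(pairs):
    """Aggregate partial (sum, count) pairs and return per-channel means."""
    acc = defaultdict(lambda: [0.0, 0])
    for channel, (value, count) in pairs:
        acc[channel][0] += value
        acc[channel][1] += count
    return {ch: s / n for ch, (s, n) in acc.items()}

# Two chunks standing in for blocks of EEG data spread across cluster nodes.
chunk1 = [("Fp1", 1.0), ("Fp2", 3.0), ("Fp1", 3.0)]
chunk2 = [("Fp2", 5.0)]
pairs = list(map_phase(chunk1)) + list(map_phase(chunk2))
means = reduce_phase(pairs)
```

In a real Hadoop deployment the map and reduce phases run on separate nodes and the framework handles the shuffle between them; the sketch only shows the data flow.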
The Role of External Audit in the Social Responsibility of Profit Organizations. The present era is characterized by large-scale economic activity, with many transactions between organizations and developing financial markets. This encourages business people to increase their investment in these markets. Because accounting is, in general terms, the language of these activities, translating them into concrete figures, accounting records are needed to verify organizations' behavior and its consistency with their objectives. In this respect, the audit function comes to che
The two parameters of the Exponential-Rayleigh distribution were estimated using the maximum likelihood estimation (MLE) method for progressively censored data. Estimated values for the two scale parameters were obtained using real COVID-19 data taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The Chi-square test was then used to determine whether the sample corresponded to the Exponential-Rayleigh (ER) distribution. The nonlinear membership function (s-function) was employed to find fuzzy numbers for these parameter estimators, and a ranking function was then used to transform the fuzzy numbers into crisp numbers. Finally, mean square error (MSE) was used to compare the outcomes of the survival
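The fuzzification and ranking steps can be sketched with the standard Zadeh S-shaped membership function and a centroid-style ranking (defuzzification). The paper does not specify its exact s-function parameters or ranking function, so the forms below are generic textbook choices, not the authors' method:

```python
def s_membership(x, a, c):
    """Standard Zadeh S-function with crossover point b = (a + c) / 2:
    0 below a, rises smoothly (piecewise quadratic), 1 above c."""
    b = (a + c) / 2.0
    if x <= a:
        return 0.0
    if x <= b:
        return 2.0 * ((x - a) / (c - a)) ** 2
    if x <= c:
        return 1.0 - 2.0 * ((x - c) / (c - a)) ** 2
    return 1.0

def centroid_rank(a, c, steps=1000):
    """Defuzzify by the centroid of the membership curve on [a, c];
    one common choice of ranking function, used here for illustration."""
    xs = [a + (c - a) * i / steps for i in range(steps + 1)]
    ws = [s_membership(x, a, c) for x in xs]
    return sum(x * w for x, w in zip(xs, ws)) / sum(ws)
```

Applied to a parameter estimate, `a` and `c` would bracket the MLE value; the crisp number returned by the ranking function then replaces the fuzzy estimator in subsequent comparisons.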
Merging biometrics with cryptography has become more familiar, and a rich scientific field has opened for researchers. Biometrics adds a distinctive property to security systems, since biometric features are unique to each individual. In this study, a new method is presented for ciphering data based on fingerprint features: plaintext characters are placed, according to the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message is embedded directly in the random text at the minutiae positions; in the second scenario, the message is encrypted with a chosen word before ciphering
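The first scenario (placing message characters into a random cover text at minutiae-derived positions) might look roughly like the sketch below. The position-mapping formula, cover length, and seed are invented for illustration; the paper's actual mapping from minutiae to positions is not given here:

```python
import random

def cipher(message, minutiae, cover_length=64, seed=7):
    """Hide message characters inside a pseudo-random cover text at positions
    derived from minutiae (x, y) coordinates. Illustrative scheme only."""
    rng = random.Random(seed)
    cover = [chr(rng.randrange(97, 123)) for _ in range(cover_length)]
    # Hypothetical position mapping; duplicates collapse via the set.
    positions = sorted({(x * 31 + y) % cover_length for x, y in minutiae})
    for pos, ch in zip(positions, message):
        cover[pos] = ch
    return "".join(cover), positions

def decipher(cover, positions, length):
    """Recover the message by reading the minutiae-derived positions."""
    return "".join(cover[p] for p in positions[:length])

minutiae = [(3, 5), (10, 2), (7, 9), (20, 1)]
cover, positions = cipher("key", minutiae)
```

Both parties would need the same fingerprint-derived positions (and, in the second scenario, the shared keyword) to recover the plaintext.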
The aim of this research is to show the importance of the effective use of the internet in academic libraries, both to improve services and to increase the competence of librarians. The research offers recommendations for improving the quality of services and highlights the need for a cooperative network among academic libraries.
An acceptable bit error rate can be maintained by adapting design parameters such as modulation, symbol rate, constellation size, and transmit power according to the channel state. An estimate of HF propagation effects can be used to design an adaptive data transmission system over an HF link. The proposed system combines the well-known Automatic Link Establishment (ALE) with a variable-rate transmission system. The standard ALE is modified to suit the goal of selecting the best carrier frequency (channel) for a given transmission, based on measuring SINAD (Signal plus Noise plus Distortion to Noise plus Distortion ratio), RSL (Received Signal Level), multipath phase distortion, and BER (Bit Error Rate) fo
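The adaptive-rate idea (choosing a constellation from the measured channel state) can be sketched as a simple threshold table. The thresholds and schemes below are illustrative assumptions, not values from the proposed system, which adapts on SINAD, RSL, phase distortion, and BER rather than a single SNR figure:

```python
def select_modulation(snr_db):
    """Pick a constellation from channel quality (illustrative thresholds).
    A real ALE-based system would combine SINAD, RSL, and BER measurements."""
    table = [
        (18.0, "64-QAM"),   # good channel: highest spectral efficiency
        (12.0, "16-QAM"),
        (6.0, "QPSK"),
        (0.0, "BPSK"),      # poor channel: most robust scheme
    ]
    for threshold, scheme in table:
        if snr_db >= threshold:
            return scheme
    return None  # channel too poor: defer transmission or rescan channels
```

Falling back to `None` models the ALE step of abandoning the current carrier frequency and probing for a better channel.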
ABSTRACT:
The study aims at expounding the correlation and effect between the human resource development strategy and the quality of municipal service, within a theoretical and a practical framework, conducted at the Directorate of Municipalities in holy Karbala. A pilot study showed that the Directorate of Municipalities does not pay sufficient attention to developing its human resources through one or more strategies, nor to their effect on the quality of municipal service. Accordingly, a number of research questions were set concerning the existence of a clear perception in the Directorates of Municipalities of the strategies of developing both the human resource an Qualit
In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, characterized by observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points. Although the measurements are independent across subjects, they are typically correlated within each subject. The applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, a two-step method is used to estimate the coefficient functions with the former technique. Since the two-
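The local linear kernel smoother at the core of the LLPK technique can be sketched as a weighted least-squares fit around each target time point; the Epanechnikov kernel and bandwidth below are illustrative choices, and this pointwise smoother is only one ingredient of the paper's two-step estimator:

```python
def epanechnikov(u):
    """Epanechnikov kernel weight (an illustrative kernel choice)."""
    return 0.75 * (1.0 - u * u) if abs(u) < 1.0 else 0.0

def local_linear(x, y, t, h):
    """Local linear kernel estimate of the regression function at point t,
    bandwidth h: weighted least squares of y on (x - t); the fitted
    intercept is the estimate at t."""
    s0 = s1 = s2 = t0 = t1 = 0.0
    for xi, yi in zip(x, y):
        w = epanechnikov((xi - t) / h)
        d = xi - t
        s0 += w
        s1 += w * d
        s2 += w * d * d
        t0 += w * yi
        t1 += w * yi * d
    return (s2 * t0 - s1 * t1) / (s0 * s2 - s1 * s1)

x = [0.0, 0.5, 1.0, 1.5, 2.0]
y = [2.0 * v + 1.0 for v in x]          # exactly linear data
est = local_linear(x, y, t=1.0, h=1.5)  # local linear reproduces lines exactly
```

Because a local linear fit reproduces straight lines without boundary bias, `est` equals 3.0 here; in the longitudinal setting this smoother is applied pointwise over the grid of time points.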
The research aims to identify the extent to which Iraqi private banks practice profit management, motivated by reducing the taxable base through increasing the provision for loan losses, relying on the LLP model, which consists of a main independent variable (net profit before tax) and independent sub-variables (bank size, total debts to total equity, loans granted to total obligations) under the name of variables governing the banking business. The Kolmogorov-Smirnov test was used to check the normal distribution of the data for all banks during the period 2017-2020; the correlation between the main and sub independent variables and the dependent variable was then found by means of the Pearson correlation coefficient, and then using the multiple
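The Pearson correlation coefficient used to relate the independent variables to the dependent variable has the standard form, sketched here on toy data (the figures are illustrative, not the banks' data):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples:
    covariance divided by the product of the standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Values near +1 or -1 indicate a strong linear relationship; in the study this is computed between each governing variable and the loan-loss provision before fitting the multiple regression.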
Multilocus haplotype analysis of candidate variants with genome-wide association study (GWAS) data may provide evidence of association with disease even when the individual loci themselves do not. Unfortunately, when a large number of candidate variants are investigated, identifying risk haplotypes can be very difficult. To meet this challenge, a number of approaches have been put forward in recent years. However, most of them are not directly linked to the disease penetrances of haplotypes and thus may not be efficient. To fill this gap, we propose a mixture model-based approach for detecting risk haplotypes. Under the mixture model, haplotypes are clustered directly according to their estimated d
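The mixture-model idea (clustering haplotypes directly by estimated penetrance) can be illustrated with EM for a two-component binomial mixture over per-haplotype case counts. The model, initial values, and data below are simplified assumptions for illustration, not the authors' actual formulation:

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability of k cases among n carriers with penetrance p."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def em_two_binomial(counts, iters=200):
    """EM for a two-component binomial mixture: cluster haplotypes into a
    high-penetrance (risk) group and a low-penetrance group."""
    pi, p_hi, p_lo = 0.5, 0.7, 0.3  # crude initial values
    r = []
    for _ in range(iters):
        # E-step: responsibility of the high-penetrance component
        r = []
        for k, n in counts:
            hi = pi * binom_pmf(k, n, p_hi)
            lo = (1 - pi) * binom_pmf(k, n, p_lo)
            r.append(hi / (hi + lo))
        # M-step: re-estimate mixture weight and the two penetrances
        pi = sum(r) / len(r)
        p_hi = (sum(ri * k for ri, (k, n) in zip(r, counts))
                / sum(ri * n for ri, (k, n) in zip(r, counts)))
        p_lo = (sum((1 - ri) * k for ri, (k, n) in zip(r, counts))
                / sum((1 - ri) * n for ri, (k, n) in zip(r, counts)))
    return pi, p_hi, p_lo, r

counts = [(18, 20), (17, 20), (3, 20), (2, 20)]  # (cases, carriers) per haplotype
pi, p_hi, p_lo, resp = em_two_binomial(counts)
```

Haplotypes with responsibility `resp[i]` near 1 fall in the high-penetrance cluster and would be flagged as candidate risk haplotypes.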