Storing, transferring, and processing high-dimensional electroencephalogram (EEG) signals is a critical challenge. The goal of EEG compression is to remove redundant data from EEG signals, while medical signals such as EEG must retain high quality for diagnosis. This paper presents a compression system with near-zero Mean Squared Error (MSE) based on the Discrete Cosine Transform (DCT) and double shift coding for fast and efficient EEG data compression. It investigates and compares applying versus omitting delta modulation on the transformed and quantized input signal. As a final step, double shift coding is applied after mapping the output to positive values. System performance is tested using EEG data files from the CHB-MIT Scalp EEG Database, with the Compression Ratio (CR) as the evaluation metric. The results are encouraging when compared with previous works on the same data samples.
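As a rough illustration of the pipeline described above (transform, quantize, optional delta modulation, then mapping to non-negative values ahead of entropy coding), here is a minimal Python sketch. The function name, quantization step, and the zig-zag signed-to-unsigned mapping are illustrative assumptions, not the paper's exact implementation; the double shift coding stage itself is omitted.

```python
import numpy as np
from scipy.fft import dct

def compress_stage(signal, q_step=8.0, use_delta=True):
    """Hypothetical front end: DCT -> uniform quantization -> optional delta
    modulation -> mapping to non-negative integers, ready for a
    variable-length code such as double shift coding."""
    coeffs = dct(signal, norm="ortho")            # DCT-II for energy compaction
    q = np.rint(coeffs / q_step).astype(int)      # uniform quantization
    if use_delta:
        q = np.diff(q, prepend=0)                 # delta modulation of the stream
    # zig-zag mapping of signed to non-negative ints: 0,-1,1,-2,2 -> 0,1,2,3,4
    mapped = np.where(q >= 0, 2 * q, -2 * q - 1)
    return mapped
```

The zig-zag mapping is one common way to make all symbols non-negative before coding; it is invertible, so the decoder can recover the signed quantized coefficients exactly.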
Abstract Background: Coronavirus disease 2019 (COVID-19), caused by the novel coronavirus SARS-CoV-2, is primarily a pulmonary disease that can lead to cardiac, hematologic, and renal complications. Anticoagulants are used for COVID-19 infected patients because the infection increases the risk of thrombosis. The World Health Organization (WHO) recommends a prophylactic dose of anticoagulants (enoxaparin or unfractionated heparin) for hospitalized patients with COVID-19. This has created an urgent need to identify effective medications for COVID-19 prevention and treatment. The value of COVID-19 treatments is assessed by cost-effectiveness analysis (CEA) to inform relative value and how best to maximize social welfare through evidence-based pricing decisions.
Quantitative real-time Polymerase Chain Reaction (RT-qPCR) has become a valuable molecular technique in biomedical research. The selection of suitable endogenous reference genes is necessary for normalization of target gene expression in RT-qPCR experiments. The aim of this study was to determine the suitability of 18S rRNA and ACTB as internal control genes for normalization of RT-qPCR data in several human cell lines transfected with small interfering RNA (siRNA). Four cancer cell lines (MCF-7, T47D, MDA-MB-231, and HeLa) along with HEK293, an embryonic cell line, were depleted of E2F6 using siRNA specific for E2F6 and compared with negative control cells transfected with siRNA not specific for any gene.
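Normalization of target-gene expression to a reference gene is commonly computed with the Livak 2^-ΔΔCt method. The sketch below assumes that method and uses hypothetical Ct values; it is not taken from the study itself.

```python
def ddct_fold_change(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control):
    """Relative expression by the Livak 2^-ddCt method: the target gene's Ct
    is first normalized to a reference gene (e.g. 18S rRNA or ACTB), then to
    the negative-control sample."""
    d_ct_treated = ct_target_treated - ct_ref_treated    # dCt in treated cells
    d_ct_control = ct_target_control - ct_ref_control    # dCt in control cells
    dd_ct = d_ct_treated - d_ct_control                  # ddCt
    return 2.0 ** (-dd_ct)                               # fold change

# Hypothetical Ct values: a one-cycle shift in ddCt halves relative expression.
fold = ddct_fold_change(25.0, 15.0, 24.0, 15.0)
```

A fold change below 1 here would indicate knockdown of the target gene relative to the negative control, assuming the reference gene is stably expressed, which is exactly what the study evaluates for 18S rRNA and ACTB.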
These days, it is crucial to discern between different types of human behavior, and artificial intelligence techniques play a large part in that. The characteristics of the feedforward artificial neural network (FANN) algorithm and the genetic algorithm have been combined to create a working mechanism that aids in this field. The proposed system can be used for essential tasks such as analysis, automation, control, and recognition. Crossover and mutation are the two primary mechanisms the genetic algorithm uses in the proposed system to replace the backpropagation process in the ANN, while the feedforward artificial neural network handles input processing.
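A minimal sketch of the idea, replacing backpropagation with selection, one-point crossover, and mutation over the weights of a small feedforward network, is shown below on a toy XOR task. The network size, GA parameters, and fitness definition are illustrative assumptions, not the proposed system's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR with a 2-2-1 tanh feedforward network (9 weights incl. biases).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def forward(w, x):
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8];              b2 = w[8]
    h = np.tanh(x @ W1 + b1)          # feedforward pass only, no gradients
    return np.tanh(h @ W2 + b2)

def fitness(w):
    return -np.mean((forward(w, X) - y) ** 2)   # negative MSE

def evolve(pop_size=60, gens=200, mut_rate=0.3, mut_scale=0.4):
    pop = rng.normal(0, 1, (pop_size, 9))
    for _ in range(gens):
        scores = np.array([fitness(w) for w in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, 9)
            child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
            mask = rng.random(9) < mut_rate
            child = child + mask * rng.normal(0, mut_scale, 9)  # mutation
            children.append(child)
        pop = np.vstack([parents, children])
    return max(pop, key=fitness)

best = evolve()
```

Because the GA only ever calls `forward`, no gradient computation is needed, which is the sense in which crossover and mutation stand in for backpropagation.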
Structure type and disorder have become important questions in catalyst design, with the most active catalysts often described as “disordered” or “amorphous” in nature. To quantify the effects of disorder and structure type systematically, a test set of manganese(III,IV) oxides was developed and their reactivity as oxidants and catalysts was tested against three substrates: methylene blue, hydrogen peroxide, and water. We find that disorder destabilizes the materials thermodynamically, making them stronger chemical oxidants but not necessarily better catalysts. For the disproportionation of H2O2 and the oxidative decomposition of methylene blue, MnOx-mediated direct oxidation competes with catalytically mediated oxidation.
Merging biometrics with cryptography has become more familiar, and a rich scientific field has emerged for researchers. Biometrics add a distinctive property to security systems because biometric features are unique to every individual. In this study, a new method is presented for ciphering data based on fingerprint features. The plaintext message is addressed, based on the positions of minutiae extracted from a fingerprint, into a generated random text file regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message is placed directly inside the random text at the minutiae positions; in the second scenario, the message is encrypted with a chosen word before ciphering.
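A toy sketch of the first scenario might look like the following, where hypothetical minutiae positions index into a generated random cover text. Real minutiae extraction from a fingerprint image is out of scope here; the positions, function names, and cover-text size are illustrative stand-ins, not the paper's implementation.

```python
import random
import string

def embed_at_minutiae(message, minutiae_positions, cover_len=500, seed=42):
    """Write the plaintext characters into a random cover text at positions
    derived from (hypothetical) fingerprint minutiae. Assumes the positions
    are distinct modulo cover_len and at least as numerous as the message."""
    rnd = random.Random(seed)
    cover = [rnd.choice(string.ascii_letters) for _ in range(cover_len)]
    for ch, pos in zip(message, minutiae_positions):
        cover[pos % cover_len] = ch
    return "".join(cover)

def extract_at_minutiae(cover, minutiae_positions, msg_len):
    """Recover the message by reading the same minutiae positions back."""
    return "".join(cover[p % len(cover)] for p in minutiae_positions[:msg_len])
```

The receiver needs the same fingerprint (hence the same minutiae positions) to know where to read, which is what ties the scheme's secrecy to the biometric.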
This paper considers the maximum number of weekly cases and deaths caused by the COVID-19 pandemic in Iraq from its outbreak in February 2020 until the first of July 2022. Some probability distributions were fitted to the data. Maximum likelihood estimates were obtained and goodness of fit tests were performed. Results revealed that the maximum weekly cases were best fitted by the Dagum distribution, which was accepted by three goodness of fit tests. The generalized Pareto distribution best fitted the maximum weekly deaths, and was also accepted by the goodness of fit tests. The statistical analysis was carried out using the Easy-Fit software and Microsoft Excel 2019.
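The workflow described (maximum likelihood fitting followed by goodness-of-fit testing) can be sketched with SciPy. Synthetic data stands in for the Iraqi weekly maxima, and a single Kolmogorov-Smirnov test stands in for Easy-Fit's battery of tests; note that testing against parameters estimated from the same data makes the KS p-value optimistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic stand-in for weekly maxima; the real Iraqi data is not reproduced.
# (The Dagum distribution is also available in SciPy as stats.burr, Burr III.)
data = stats.genpareto.rvs(c=0.3, loc=0, scale=50, size=120, random_state=rng)

# Maximum likelihood fit of the generalized Pareto distribution.
c, loc, scale = stats.genpareto.fit(data)

# Kolmogorov-Smirnov goodness-of-fit test against the fitted model.
ks_stat, p_value = stats.kstest(data, "genpareto", args=(c, loc, scale))
```

A small KS statistic with a large p-value means the fitted distribution is not rejected, mirroring the acceptance criterion reported in the paper.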
The Internet provides vital communications between millions of individuals. It is also increasingly utilized as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard, which is the main reason an improved structure of the Data Encryption Standard algorithm is needed. This paper proposes a new, improved structure for the Data Encryption Standard to make it secure and immune to attacks. The improved structure was accomplished using the standard Data Encryption Standard with a new method of two-key generation.
In the last two decades, networks have changed in response to rapidly evolving requirements. Current Data Center Networks (DCNs) contain large numbers of hosts (tens of thousands) with special bandwidth needs as cloud networking and multimedia content computing grow. Conventional DCNs are strained by the increased number of users and bandwidth requirements, which in turn impose many implementation limitations. Current networking devices, with their coupled control and forwarding planes, result in network architectures that are not suitable for dynamic computing and storage needs. Software Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling the control and forwarding planes.