A biosensor is defined as a device that transforms interactions between bioreceptors and analytes into a signal proportional to the concentration of the reactants. Biosensors have diverse applications aimed primarily at detecting diseases, monitoring medicines, ensuring food safety, measuring toxin levels in water, and other uses that safeguard the health of the organism. The main challenge of biosensors is the difficulty of obtaining sensors with the accuracy, specificity, sensitivity, and repeatability required to give reliable results for each patient use. The rapid diversification of biosensors is due to the precision of the techniques and materials used in their manufacture and to the interplay between disciplines in scientific research, e.g., physics and biology, engineering and biology. This research aims to define biosensors in general, classify them, and present their most important applications, with a brief account of their development over time and the reason for their spread across all fields.
Developing a solid e-voting system that offers fairness and privacy for users is a challenging objective. This paper addresses whether blockchain can be used to build an efficient e-voting system and identifies four blockchain technologies along with their features and limitations. Many papers have been reviewed in a study covering the ten years from 2011 to 2020. The study concludes that a blockchain platform can serve as a successful public ledger for implementing an e-voting system. Four blockchain technologies emerged from this study: blockchain using smart contracts, blockchain relying on the Zcash platform, blockchain programmed from scratch, and blockchain depending on digital signatures. Each blockchain …
Fractal image compression depends on representing an image using affine transformations. The main concern of research in the discipline of fractal image compression (FIC) is to decrease the encoding time needed to compress image data. The basic premise is that each portion of the image is similar to other portions of the same image. Many models have been developed around this process. The presence of fractals was initially noticed and handled using the Iterated Function System (IFS), which is used for encoding images. In this paper, fractal image compression and its variants are reviewed along with other techniques. A summarized review of contributions is provided to assess the fulfillment of fractal image compression …
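To make the affine-transformation idea concrete, the standard contractive map used in partitioned IFS coding can be sketched as follows (textbook notation, not taken from any specific reviewed paper). Each small range block of the image is approximated from a larger domain block by a map of the form

w_i(x, y, z) = (a_i x + b_i y + e_i, c_i x + d_i y + f_i, s_i z + o_i),

where (x, y) are pixel coordinates, z is the gray level, s_i is a contrast scaling with |s_i| < 1 to ensure contractivity, and o_i is a brightness offset. Encoding stores only the coefficients of the best-matching map for each block; decoding iterates the full set of maps from an arbitrary starting image until it converges to the attractor, which approximates the original image. The expensive domain-block search in this step is the encoding-time bottleneck the reviewed algorithms try to reduce.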
Since Internet Protocol version 6 is a new technology, insecure network configurations are inevitable. Researchers have contributed much to spreading knowledge about IPv6 vulnerabilities and how to address them over the past two decades. In this study, a systematic literature review is conducted to analyze research progress in the IPv6 security field following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) method. A total of 427 studies have been reviewed from two databases, IEEE and Scopus. To fulfil the review goal, several key data elements were extracted from each study and two kinds of analysis were administered: descriptive analysis and literature classification. The results show positive signs of …
Glaucoma is one of the most dangerous eye diseases. It occurs as a result of an imbalance in the drainage and flow of the retinal fluid, which generates intraocular pressure, a significant risk factor for glaucoma. Intraocular pressure causes progressive damage to the optic nerve head, leading to vision loss in the advanced stages. Glaucoma gives no signs of disease in the early stages, so it is called "the Silent Thief of Sight". Early diagnosis and treatment of retinal eye disease are therefore extremely important to prevent vision loss. Many articles aim to analyze fundus retinal images and diagnose glaucoma. This review can be used as a guideline to help diagnose glaucoma. It presents 63 articles …
Malicious software (malware) performs malicious functions that compromise a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect a newly unrecognized attack attempt and raise an early alarm to inform the system of the suspicious intrusion attempt. This paper proposes a hybrid IDS for detecting intrusions, especially malware, that considers both network packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection …
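A minimal sketch of the anomaly-plus-misuse combination described above, assuming a classification-based design; the feature layout, the choice of models, and the OR-fusion rule are all illustrative assumptions, not the paper's implementation:

# Hybrid IDS sketch: misuse detection via a supervised classifier on
# labeled attacks, anomaly detection via a model of normal traffic.
# All feature names, models, and the fusion rule below are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, IsolationForest

rng = np.random.default_rng(0)

# Toy feature vectors: rows = connections, columns = hypothetical
# network/host features (packet counts, durations, failed logins, ...).
X_train = rng.normal(size=(1000, 8))
y_train = rng.integers(0, 2, size=1000)   # 1 = known attack, 0 = normal
X_new = rng.normal(size=(10, 8))

# Misuse detection: supervised classifier trained on labeled attacks.
misuse = RandomForestClassifier(n_estimators=100, random_state=0)
misuse.fit(X_train, y_train)

# Anomaly detection: fit only on normal traffic; predict() returns -1
# for outliers and 1 for inliers.
anomaly = IsolationForest(random_state=0)
anomaly.fit(X_train[y_train == 0])

# Simple OR-fusion (an assumed rule): alert if either detector fires.
alerts = (misuse.predict(X_new) == 1) | (anomaly.predict(X_new) == -1)
print(alerts)

The appeal of such a hybrid is that the misuse component catches known attack signatures with high precision, while the anomaly component can flag previously unseen intrusions that no labeled example covers.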
... Show MoreRMK Al-Zaid, AT Al-Musawi, SJ Mohammad
The polymerase chain reaction (PCR) is the standard molecular biology technique for duplicating DNA enzymatically without employing a living organism such as E. coli or yeast. This technology allows exponential amplification of a minute quantity of DNA. With more DNA available, analysis becomes straightforward.
A thermal cycler performs the polymerase chain reaction through repeated cycles of heating and cooling the reaction tubes to the temperature required for each reaction step. A heated lid is positioned above the reaction tubes to prevent evaporation of the reaction mixture (volumes normally range from 15 to 100 µl per tube), or an oil layer can be placed on the reaction mixture surface.
In this paper, we introduce for the first time a new four-parameter model called the Gumbel-Pareto distribution, constructed using the T-X method. Some of its mathematical and structural properties are studied. The method of maximum likelihood is used for estimating the model parameters. A numerical illustration and an application to a real data set are given to show the flexibility and potential of the new model.
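For background, one common form of the T-X construction (the abstract does not reproduce the formulas, so the parameterization below is an assumption) combines a transformer random variable T with CDF R(t) and a baseline CDF G(x) through

F(x) = R( W(G(x)) ), with the usual link W(G(x)) = -log(1 - G(x)).

Taking T to be Gumbel, R(t) = exp{ -exp( -(t - μ)/σ ) }, and the baseline to be Pareto, G(x) = 1 - (θ/x)^k for x ≥ θ, the link value becomes k log(x/θ), yielding a four-parameter Gumbel-Pareto CDF

F(x) = exp{ -exp( -( k log(x/θ) - μ ) / σ ) }, x ≥ θ,

with parameters (k, θ, μ, σ). This illustrates how the T-X method generates the family; whether it matches the paper's exact parameterization is not confirmed by the abstract.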
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have too little data to train DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically. Ultimately, a larger amount of data generally yields a better DL model, although performance is also application dependent. This issue is the main barrier for …