Confocal microscope imaging has become popular in biotechnology laboratories. Confocal imaging uses fluorescence optics: laser light is focused onto a specific spot at a defined depth in the sample. Research routinely produces large numbers of such images, and these images require unbiased quantification methods if their analysis is to be meaningful. Increasing efforts to tie reimbursement to outcomes will likely raise the need for objective data in confocal image analysis in the coming years. Visual quantification, counting and estimating fluorescence with the naked eye, is an essential but often underreported outcome measure because of the time manual counting and estimation require. This current method is time-consuming and cumbersome, and manual measurement is imprecise because visual ability naturally differs from person to person. Objective outcome evaluation can therefore avoid these drawbacks and facilitate recording for documentation and research purposes. To obtain a fast and useful objective estimate of the fluorescence in each image, an algorithm based on machine-vision techniques was designed to extract the targeted objects from confocal images and then estimate the area they cover, producing a percentage value comparable to the outcome of the current method; it is expected to contribute to sustainable biotechnology image analysis by reducing time and labor. The results give strong evidence that the designed objective algorithm can replace the current manual, visual quantification method, with an Intraclass Correlation Coefficient (ICC) of 0.9.
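The core of the percentage-coverage idea above can be sketched in a few lines. This is a minimal illustration, not the paper's actual pipeline: the fixed intensity `threshold` and the flat pixel list are assumptions standing in for the unspecified segmentation step.

```python
def coverage_percent(pixels, threshold=128):
    """Estimate the fluorescence-covered fraction of a grayscale image.

    `pixels` is a flat list of 0-255 intensities; `threshold` is a
    hypothetical cutoff separating fluorescent signal from background.
    Returns the covered area as a percentage, mirroring the visual
    estimate the abstract describes.
    """
    covered = sum(1 for p in pixels if p >= threshold)
    return 100.0 * covered / len(pixels)

# Example: a 4-pixel "image" where half the pixels exceed the cutoff.
print(coverage_percent([0, 0, 200, 255]))  # → 50.0
```

A real implementation would segment objects (e.g. with adaptive thresholding or morphology) before counting, but the final statistic is still a covered-area percentage like this one.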
Most drinking water consumed around the world is treated at a water treatment plant (WTP), where raw water is abstracted from reservoirs and rivers. Turbidity removal efficiency is critical to supplying safe drinking water. This study focuses on using multiple linear regression (MLR) and artificial neural network (ANN) models to predict the turbidity removal efficiency of the Al-Wahda WTP in Baghdad city. The measured physico-chemical parameters were used to determine their effect on turbidity removal efficiency in the various treatment processes. A suitable formulation of the ANN model was examined through many configurations, trials, and evaluation steps. The predict
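To make the regression idea concrete, here is a one-variable simplification of the MLR approach the abstract names. The data values and the single predictor (raw-water turbidity) are hypothetical placeholders; the study fits multiple physico-chemical parameters.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = intercept + slope * x.

    A one-predictor sketch of the study's multiple linear regression
    (MLR); the real model would regress on several measured
    physico-chemical parameters at once.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

# Hypothetical data: raw-water turbidity (NTU) vs. removal efficiency (%).
raw_turbidity = [50.0, 80.0, 120.0, 200.0, 310.0]
removal_eff = [90.1, 92.4, 94.0, 95.8, 97.1]

intercept, slope = fit_line(raw_turbidity, removal_eff)

def predict_removal(turbidity_ntu):
    """Predicted removal efficiency (%) from the fitted linear model."""
    return intercept + slope * turbidity_ntu
```

The ANN model plays the same role — mapping measured parameters to predicted removal efficiency — but with a nonlinear, trained mapping instead of this closed-form fit.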
This qualitative study was conducted on eight types of commercial baking yeast available in local markets to estimate their fermentation activity as it affects the bread industry, together with the impact of salt added to the dough on leavening. The results showed great variation in the fermentation capacity of the yeast samples (their ability to swell the dough): sample Y3 was highest and sample Y7 lowest, at 80% and 20% respectively. The leavening values for these two yeasts with three added salt levels (0, 1, and 2%) were 20.0, 19.7, and 15.7 for sample Y3, compared with 10.5, 10.3, and 8.8 for sample Y7 at the respective salt levels, reflect
The pilgrimage takes place in several countries around the world and involves the simultaneous movement of a huge crowd of pilgrims, which poses many challenges for the pilgrimage authorities, who must track, monitor, and manage the crowd to minimize the chance of overcrowding accidents. There is therefore a need for an efficient monitoring and tracking system for pilgrims. This paper proposes a powerful pilgrim tracking and monitoring system based on three Internet of Things (IoT) technologies, namely: Radio Frequency Identification (RFID), ZigBee, and Internet Protocol version 6 (IPv6). In addition, it requires a low-cost, low-power-consumption implementation. The proposed
A security system can be defined as a method of providing a form of protection to any type of data. Most security systems perform a sequential process in order to achieve good protection. Authentication is one part of such a sequential process, used to verify a user's permission to access and use the system. Several kinds of authentication methods are used, including knowledge-based methods and biometric features. The electroencephalograph (EEG) signal is one of the most widely used signals in the bioinformatics field. EEG has five major wave patterns: Delta, Theta, Alpha, Beta, and Gamma. Every wave has five features: amplitude, wavelength, period, speed, and frequency. The linear
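The five wave patterns the abstract lists are conventionally distinguished by frequency range. A small sketch of that classification follows; the exact band boundaries vary slightly between sources, so the values here are the commonly cited conventions, not the paper's own definitions.

```python
# Conventional EEG frequency bands in Hz (boundary conventions vary by source).
EEG_BANDS = [
    ("Delta", 0.5, 4.0),
    ("Theta", 4.0, 8.0),
    ("Alpha", 8.0, 13.0),
    ("Beta", 13.0, 30.0),
    ("Gamma", 30.0, 100.0),
]

def classify_band(freq_hz):
    """Return the name of the EEG band containing `freq_hz`, or None
    if the frequency falls outside all listed bands."""
    for name, lo, hi in EEG_BANDS:
        if lo <= freq_hz < hi:
            return name
    return None
```

An authentication pipeline built on EEG would typically band-filter the raw signal this way, then extract per-band features (amplitude, frequency, etc.) as the abstract describes.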
This study aims to enhance the RC5 algorithm to improve encryption and decryption speeds in devices with limited power and memory resources. These resource-constrained applications, which range from wearables and smart cards to microscopic sensors, frequently operate in settings where traditional cryptographic techniques are impracticable because of their high computational overhead and memory requirements. The Enhanced RC5 (ERC5) algorithm integrates the PKCS#7 padding method to adapt effectively to various data sizes. Empirical investigation reveals significant improvements in encryption speed with ERC5, ranging from 50.90% to 64.18% for audio files and 46.97% to 56.84% for image files, depending on file size. A substanti
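PKCS#7 padding, which the abstract says ERC5 integrates, is standard and simple to illustrate: the message is extended to a block-size multiple, and every padding byte equals the number of padding bytes added. The 8-byte block size below is an assumption based on RC5's common 64-bit block configuration.

```python
def pkcs7_pad(data: bytes, block_size: int = 8) -> bytes:
    """Pad `data` to a multiple of `block_size` (RC5 is commonly run
    with 64-bit, i.e. 8-byte, blocks). A full block of padding is added
    when the input is already aligned, so padding is always removable."""
    pad_len = block_size - (len(data) % block_size)
    return data + bytes([pad_len] * pad_len)

def pkcs7_unpad(padded: bytes) -> bytes:
    """Strip PKCS#7 padding, validating that every pad byte is correct."""
    pad_len = padded[-1]
    if pad_len == 0 or pad_len > len(padded) \
            or padded[-pad_len:] != bytes([pad_len] * pad_len):
        raise ValueError("invalid PKCS#7 padding")
    return padded[:-pad_len]
```

Because the pad length is encoded in the pad bytes themselves, the cipher can accept inputs of any size while decryption recovers the exact original length.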
Dust is a frequent contributor to health risks and climate change and is one of the most dangerous issues facing people today. It is driven by desertification, drought, agricultural practices, and sand and dust storms from neighboring regions. Deep learning (DL) regression based on long short-term memory (LSTM) was proposed to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts for detecting and monitoring dust: in the first step, LSTM and dense layers are used to build a dust-detection system, while in the second step, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experiment DL system
A frequently used approach to denoising is shrinkage of the coefficients of the noisy signal's representation in a transform domain. This paper proposes an algorithm based on a hybrid transform (a stationary wavelet transform followed by a slantlet transform); the slantlet transform is applied to the approximation subband of the stationary wavelet transform. The BlockShrink thresholding technique is applied to the hybrid-transform coefficients; it can decide the optimal block size and threshold for every wavelet subband by minimizing Stein's unbiased risk estimate (SURE). The proposed algorithm was executed using MATLAB R2010a on natural images contaminated by white Gaussian noise. Numerical results show that our algorithm co
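The shrinkage step at the heart of such schemes can be sketched with scalar soft-thresholding. This is only the elementary building block: BlockShrink applies a block-wise rule with a SURE-chosen threshold and block size, which is not reproduced here.

```python
def soft_threshold(coeffs, t):
    """Soft-threshold shrinkage of transform coefficients.

    Coefficients with magnitude below `t` are set to zero (treated as
    noise); the rest are shrunk toward zero by `t`. BlockShrink-style
    denoising applies a related rule to whole blocks of coefficients,
    with `t` and the block size chosen by minimizing Stein's unbiased
    risk estimate (SURE).
    """
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

# Small coefficients vanish; large ones survive, slightly shrunk.
print(soft_threshold([3.0, -0.5, 1.5], 1.0))  # → [2.0, 0.0, 0.5]
```

Denoising then inverts the transform on the shrunken coefficients, so suppressing small coefficients removes mostly noise while the large, signal-bearing coefficients are retained.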