Confocal microscope imaging has become popular in biotechnology laboratories. Confocal imaging technology relies on fluorescence optics, in which laser light is focused onto a specific spot at a defined depth in the sample. Research routinely produces a considerable number of such images, and these images require unbiased quantification methods if the analyses are to be meaningful. Increasing efforts to tie reimbursement to outcomes will likely increase the need for objective data in confocal image analysis in the coming years. Visual quantification of confocal images by the naked eye is an essential but often underreported outcome measure because of the time required for manual counting and estimation. The current method of visual quantification is time-consuming and cumbersome, and manual measurement is imprecise because of natural differences among human observers. Objective outcome evaluation can therefore obviate these drawbacks and facilitate recording for both documentation and research purposes. To achieve a fast and useful objective estimate of fluorescence in each image, an algorithm based on machine vision techniques was designed to extract the targeted objects in confocal images and then estimate the covered area, producing a percentage value comparable to the outcome of the current method; it is expected to contribute to sustainable biotechnology image analysis by reducing the time and labor consumed. The results show strong evidence that the designed objective algorithm can replace the current manual visual quantification method, to the extent that the Intraclass Correlation Coefficient (ICC) is 0.9.
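The covered-area estimation step described above can be sketched minimally as intensity thresholding followed by a pixel-fraction computation. The threshold value and the toy frame below are illustrative assumptions, not parameters of the published algorithm.

```python
# Minimal sketch of area-coverage estimation for a fluorescence image.
# The fixed threshold (128) and the toy 4x4 "frame" are assumptions for
# illustration only, not the paper's machine-vision pipeline.

def fluorescence_coverage(image, threshold=128):
    """Return the percentage of pixels whose intensity exceeds `threshold`.

    `image` is a 2-D list of grayscale intensities (0-255), standing in
    for one channel of a confocal frame.
    """
    total = sum(len(row) for row in image)
    covered = sum(1 for row in image for px in row if px > threshold)
    return 100.0 * covered / total

# Example: a 4x4 frame where 4 of 16 pixels are "fluorescent".
frame = [
    [0,   0,   0,   0],
    [0, 200, 210,   0],
    [0, 190, 255,   0],
    [0,   0,   0,   0],
]
print(fluorescence_coverage(frame))  # -> 25.0
```

A real pipeline would add segmentation and noise filtering before the percentage step, but the final reported value is this kind of covered-area fraction.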
In this paper, several algorithms combining Partial Update LMS (PU LMS) methods with a previously proposed algorithm, the New Variable Length LMS (NVLLMS), have been developed. The new proposed algorithms were then applied to an Acoustic Echo Cancellation (AEC) system in order to reduce the number of filter coefficients, shorten the convergence time, and enhance performance in terms of Mean Square Error (MSE) and Echo Return Loss Enhancement (ERLE). These proposed algorithms use the ERLE to control the variation of the filter's coefficient length; in addition, a time-varying step size is used. The total number of coefficients required was reduced by about 18%, 10%, and 6% …
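The adaptive-filter core that the PU LMS and NVLLMS variants build on can be sketched as a standard LMS system-identification loop. The filter length, step size, and toy "echo path" below are illustrative assumptions, not the proposed algorithms themselves.

```python
# A minimal standard-LMS sketch (the baseline that partial-update and
# variable-length variants modify). The 2-tap toy echo path, 4-tap filter,
# and mu = 0.05 are assumptions for illustration.
import random

def lms_identify(x, d, n_taps=4, mu=0.05):
    """Adapt an FIR filter w so that w applied to x tracks the desired
    signal d. Returns the final weights and per-sample squared errors."""
    w = [0.0] * n_taps
    errors = []
    for n in range(n_taps, len(x)):
        window = x[n - n_taps + 1:n + 1][::-1]   # x[n], x[n-1], ...
        y = sum(wi * xi for wi, xi in zip(w, window))
        e = d[n] - y                              # a-priori error
        errors.append(e * e)
        w = [wi + mu * e * xi for wi, xi in zip(w, window)]  # LMS update
    return w, errors

# Toy "echo path": the unknown system is the 2-tap FIR [0.5, -0.3].
random.seed(0)
x = [random.uniform(-1, 1) for _ in range(500)]
d = [0.0, 0.0] + [0.5 * x[n] - 0.3 * x[n - 1] for n in range(2, 500)]
w, err = lms_identify(x, d)
print(sum(err[-50:]) / 50 < sum(err[:50]) / 50)  # error shrinks as w adapts
```

Partial-update schemes would update only a subset of `w` per iteration, and the variable-length scheme would grow or shrink `n_taps` based on the measured ERLE.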
International companies are striving to reduce their costs and increase their profits, and these trends have produced many methods and techniques to achieve these goals; some of these methods are heuristic, while others are optimization techniques. The research attempts to adapt some of these techniques to Iraqi companies, specifically to determine the optimal lot size using the Wagner-Whitin algorithm under the theory of constraints. The research adopted the case-study methodology to objectively identify the research problem, namely determining the optimal lot size for each of the products of the electronic measurement laboratory in Diyala, in light of the bottlenecks in w…
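The Wagner-Whitin lot-sizing step mentioned above is a classic dynamic program over "last order period" decisions; it can be sketched as follows. The setup cost, holding cost, and demand figures are illustrative assumptions, not data from the Diyala case study.

```python
# A minimal Wagner-Whitin dynamic program for single-item lot sizing.
# The cost parameters and demand series below are illustrative only.

def wagner_whitin(demand, setup_cost, holding_cost):
    """Return (minimum total cost, order periods) for demands d_1..d_T."""
    T = len(demand)
    best = [0.0] + [float("inf")] * T  # best[t] = optimal cost, periods 1..t
    order_at = [0] * (T + 1)
    for t in range(1, T + 1):
        for j in range(1, t + 1):      # candidate: last order placed in j
            hold = sum((i - j) * demand[i - 1] * holding_cost
                       for i in range(j, t + 1))
            cost = best[j - 1] + setup_cost + hold
            if cost < best[t]:
                best[t], order_at[t] = cost, j
    # Recover the ordering plan by walking the decisions backwards.
    plan, t = [], T
    while t > 0:
        plan.append(order_at[t])
        t = order_at[t] - 1
    return best[T], sorted(plan)

cost, periods = wagner_whitin([20, 50, 10, 50],
                              setup_cost=100, holding_cost=1)
print(cost, periods)
```

Here two setups (periods 1 and 4) beat ordering every period, because holding 50 and 10 extra units is cheaper than two more setups. A theory-of-constraints treatment would further restrict the plan by bottleneck capacity, which this sketch omits.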
Variable selection is an essential and necessary task in statistical modeling. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question a researcher needs to ask is: what are the most significant variables that should be used to describe a given dataset's response? In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage and Selection Operator (LASSO) …
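The Gibbs-sampler machinery underlying the method, alternately drawing each parameter from its full conditional given the others, can be sketched on a simple bivariate normal target. The target distribution and correlation below are illustrative assumptions, not the posterior derived in the paper.

```python
# A minimal Gibbs sampler for a bivariate normal N(0, [[1, rho], [rho, 1]]),
# illustrating conditional sampling; rho = 0.8 and the chain settings are
# assumptions for illustration only.
import math
import random

def gibbs_bivariate_normal(rho, n_iter=20000, burn_in=2000, seed=1):
    """Alternate the full conditionals x|y ~ N(rho*y, 1-rho^2) and
    y|x ~ N(rho*x, 1-rho^2); return post-burn-in draws."""
    random.seed(seed)
    sd = math.sqrt(1.0 - rho * rho)
    x = y = 0.0
    samples = []
    for i in range(n_iter):
        x = random.gauss(rho * y, sd)   # draw x given the current y
        y = random.gauss(rho * x, sd)   # draw y given the new x
        if i >= burn_in:
            samples.append((x, y))
    return samples

draws = gibbs_bivariate_normal(rho=0.8)
mean_x = sum(x for x, _ in draws) / len(draws)
corr = sum(x * y for x, y in draws) / len(draws)  # E[xy] ~ rho here
print(round(mean_x, 2), round(corr, 2))
```

In a variable-selection setting the same loop would cycle over regression coefficients and inclusion indicators instead of x and y, but the conditional-draw structure is identical.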
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of the genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was …
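The Akaike criterion that the genetic-algorithm step minimizes balances goodness of fit against the number of parameters; its common least-squares form can be sketched directly. The residual series and parameter counts below are illustrative assumptions, not output of the hydrological model.

```python
# A minimal sketch of the Akaike information criterion (least-squares form),
# AIC = n*ln(SSE/n) + 2k. The residuals and k values are illustrative only.
import math

def aic(residuals, n_params):
    """Akaike criterion: lower is better; the 2k term penalizes complexity."""
    n = len(residuals)
    sse = sum(e * e for e in residuals)
    return n * math.log(sse / n) + 2 * n_params

# A richer model earns its extra parameters only if it cuts the error enough.
simple_fit = [0.9, -1.1, 1.0, -0.8, 1.2, -1.0]   # larger residuals, k = 2
richer_fit = [0.2, -0.1, 0.1, -0.2, 0.1, -0.1]   # smaller residuals, k = 4
print(aic(simple_fit, 2) > aic(richer_fit, 4))    # lower AIC wins
```

In the paper's setting, each candidate parameter set produced by GA mutation would be scored this way and the minimum-AIC set retained.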
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: for a high level of secure communication, the key plays an important role. To increase the level of security in any communication, both parties must have a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make this algorithm more secure, effective, and strong; an enhanced encryption key strengthens the security of Triple DES. This paper proposes a combination of two efficient encryption algorithms to …
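One common way to avoid weak or degenerate Triple DES keys is to derive the three 8-byte subkeys from the shared secret with a key-stretching function. The use of PBKDF2-SHA256 below is an illustrative assumption, not the key-setup scheme proposed in the paper.

```python
# A minimal key-derivation sketch: stretch a passphrase into 24 bytes
# (three 8-byte DES subkeys). PBKDF2-SHA256, the salt, and the iteration
# count are assumptions for illustration, not the paper's construction.
import hashlib

def derive_3des_key(passphrase: bytes, salt: bytes) -> bytes:
    """Derive 24 pseudorandom key bytes from low-entropy input material."""
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt,
                               100_000, dklen=24)

key = derive_3des_key(b"weak secret", b"per-session-salt")
k1, k2, k3 = key[:8], key[8:16], key[16:24]
print(len(key), k1 != k2 and k2 != k3)   # 24 bytes, distinct subkeys
```

Because the derived bytes are pseudorandom, the three subkeys avoid the repeated-subkey patterns (e.g. K1 = K2) that reduce Triple DES to single DES.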
In this paper, the construction of Hermite wavelet functions and their operational matrix of integration is presented. The Hermite wavelets method is applied to solve nth-order Volterra integro-differential equations (VIDEs) by expanding the unknown functions as series in terms of Hermite wavelets with unknown coefficients. Finally, two examples are given.
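The Hermite wavelet basis is built from Hermite polynomials, which are cheap to evaluate via their three-term recurrence H_{n+1}(x) = 2x H_n(x) - 2n H_{n-1}(x). A minimal evaluation sketch follows; the wavelet dilation/translation and the operational matrix of integration from the paper are not reproduced here.

```python
# Minimal sketch: evaluate the physicists' Hermite polynomials by the
# recurrence H_{n+1}(x) = 2x*H_n(x) - 2n*H_{n-1}(x), with H_0 = 1, H_1 = 2x.

def hermite(n, x):
    """Physicists' Hermite polynomial H_n evaluated at a scalar x."""
    h_prev, h = 1.0, 2.0 * x          # H_0 and H_1
    if n == 0:
        return h_prev
    for k in range(1, n):
        h_prev, h = h, 2.0 * x * h - 2.0 * k * h_prev
    return h

# Check against the closed forms H_2(x) = 4x^2 - 2, H_3(x) = 8x^3 - 12x:
print(hermite(2, 1.5), hermite(3, 1.5))  # -> 7.0 9.0
```

The wavelet method then collocates a truncated series of such basis terms and solves the resulting algebraic system for the unknown coefficients.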
Evolutionary algorithms (EAs), as global search methods, have proved to be more robust than their local-heuristic counterparts for detecting protein complexes in protein-protein interaction (PPI) networks. Typically, the robustness of these EAs comes from their components and parameters; these components are solution representation, selection, crossover, and mutation. Unfortunately, almost all EA-based complex detection methods suggested in the literature were designed with only canonical or traditional components. Further, the topological structure of the protein network is the main information used in the design of almost all such components. The main contribution of this paper is to formulate a more robust E…
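The canonical components named above (selection, crossover, mutation) can be made concrete with a minimal EA on a toy objective. The OneMax fitness, bit-string representation, and parameter values are illustrative assumptions; the paper's PPI-specific representation and topology-aware operators are not shown.

```python
# A minimal canonical EA: binary tournament selection, one-point crossover,
# bit-flip mutation, maximizing OneMax (count of 1-bits). All settings here
# are toy assumptions for illustration.
import random

def evolve(n_bits=30, pop_size=40, generations=60, seed=3):
    random.seed(seed)

    def fitness(ind):
        return sum(ind)                           # OneMax objective

    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1 = max(random.sample(pop, 2), key=fitness)  # tournament
            p2 = max(random.sample(pop, 2), key=fitness)
            cut = random.randrange(1, n_bits)             # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(n_bits):                       # bit-flip mutation
                if random.random() < 1.0 / n_bits:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(sum(ind) for ind in pop)

print(evolve())   # best fitness after 60 generations (max possible: 30)
```

A complex-detection EA would replace the bit string with a cluster assignment over proteins and score candidates by network-topology measures such as intra-cluster density.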
This study involved evaluation of the side effects of two weight-reduction pills that have been widely distributed recently. Two weight-reduction compounds are studied: Reductil (containing chemical substances) and Chinese weight-reduction herbs (containing natural substances). Two doses of each compound are used in this research: 5 mg/ml and 0.5 mg/ml for Reductil, and 30 mg/ml and 10 mg/ml for the Chinese weight-reduction herbs. To evaluate the toxic effects of these compounds, the following parameters were determined: mitotic index (cytogenetic analysis), serum FSH and LH hormone levels (follicle-stimulating hormone, FSH, and luteinizing hormone, LH), and histological examination of female mice ovaries. The control group …
The discovery of novel therapeutic molecules is always difficult, and there are a variety of methodologies that use the most diverse and innovative medicinal chemistry approaches. One such approach is the deuteration technique: deuteration is the process of substituting deuterium for hydrogen in a molecule. Compared to the parent drug molecule, its deuterated analogues may retain the features of the original molecule and, in some cases, improve its pharmacological activity, with fewer side effects and lower toxicity. Metronidazole is a commonly used antibiotic to treat anaerobic bacterial, protozoal, and microaerophilic bacterial infections. Met…