Confocal microscope imaging has become popular in biotechnology laboratories. Confocal imaging uses fluorescence optics: laser light is focused onto a specific spot at a defined depth in the sample. A considerable number of images are produced regularly during research, and these images require unbiased quantification methods for meaningful analysis. Increasing efforts to tie reimbursement to outcomes will likely increase the need for objective data in analyzing confocal microscope images in the coming years. Visual quantification of confocal images with the naked eye is an essential but often underreported outcome measure because of the time required for manual counting and estimation. The current visual quantification method is time-consuming and cumbersome, and manual measurement is imprecise because of natural differences among human observers. Objective outcome evaluation can therefore obviate the drawbacks of the current method and facilitate recording for documentation and research purposes. To achieve a fast and useful objective estimate of the fluorescence in each image, an algorithm was designed based on machine vision techniques to extract the targeted objects from the confocal images and then estimate the covered area, producing a percentage value comparable to the outcome of the current method; it is expected to contribute to sustainable biotechnology image analysis by reducing time and labor. The results provide strong evidence that the designed objective algorithm can replace manual visual quantification, with an Intraclass Correlation Coefficient (ICC) of 0.9.
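The core of such an approach, segmenting the fluorescent pixels and reporting the covered area as a percentage, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the segmentation rule (mean plus one standard deviation) and the function name are assumptions.

```python
import numpy as np

def fluorescence_area_percent(image, threshold=None):
    """Estimate the percentage of image area covered by fluorescence.

    If no threshold is given, use mean + one standard deviation of the
    pixel intensities as a simple adaptive cutoff (an assumption; the
    study's segmentation step is not specified in this excerpt).
    """
    img = np.asarray(image, dtype=float)
    if threshold is None:
        threshold = img.mean() + img.std()
    mask = img > threshold  # pixels classified as fluorescent
    return 100.0 * mask.sum() / mask.size

# Example: a synthetic 100x100 image with one bright 20x20 region
img = np.zeros((100, 100))
img[40:60, 40:60] = 255
print(fluorescence_area_percent(img, threshold=128))  # 4.0
```

The returned percentage is directly comparable to a human rater's visual estimate, which is what makes an agreement statistic such as the ICC applicable.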
Machine learning offers significant advantages for many problems in the oil and gas industry, especially for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they present are outdated and poorly suited to the permeability computation. To
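The general shape of an ML permeability workflow, training a regressor on well-log features to predict (log) permeability, can be sketched as below. The features, the random-forest model, and the synthetic data are all assumptions for illustration; this excerpt does not state which logs or model the study uses.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for well-log features (e.g. gamma ray, porosity,
# bulk density); permeability is made to depend mostly on the second
# feature, mimicking its usual link to porosity.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
perm = np.exp(2.0 * X[:, 1] + rng.normal(scale=0.1, size=500))

# Predict log-permeability, since permeability spans orders of magnitude
X_tr, X_te, y_tr, y_te = train_test_split(X, np.log10(perm), random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out samples: {model.score(X_te, y_te):.2f}")
```

Working in log space is the usual design choice here, because raw permeability values vary over several orders of magnitude and would otherwise dominate the loss.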
The objective of the current research is to identify the degree of awareness of Arabic language teachers of the requirements of sustainable development. The research sample consisted of (100) male and female teachers of the Arabic language. A 3-point Likert scale of (71) items, grouped into practical and cognitive aspects with five dimensions per aspect, was designed by the researcher to collect the required data. The results showed that the level of awareness of Arabic language teachers was moderate in both the cognitive and practical aspects of sustainable education, with means of (1.69) and (1.48) respectively. The researcher presented a set of recommendations and suggestions.
The research aims to demonstrate the impact of governance mechanisms on the quality of financial reports in light of accounting disclosure for sustainable development, represented by (accounting disclosure for economic development, accounting disclosure for environmental development, and accounting disclosure for social development), in a sample of banks listed on the Iraq Stock Exchange.
Governance mechanisms were measured by evaluating and analyzing the mechanisms in the (15) banks of the research sample, based on the governance guide issued by the Central Bank as well as the banks' financial reports for the years 2016-2018, and the dimensions of accounting disclosure for sust
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into the graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points outside the clusters' pattern of behavior are treated as noise or anomalies. DBSCAN can thus detect abnormal points that lie farther than a set threshold from any cluster (extreme values). However, not all anomalies are of this kind, abnormal and far from a specific group; there is a type of data point that does not occur repeatedly but is nevertheless abnormal with respect to the known group. The analysis showed DBSCAN using the
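The noise-based detection behavior described above can be shown with plain DBSCAN. The CFG conversion step from the study is not reproduced here; this only illustrates how DBSCAN flags points outside any dense region with the label -1, and the data and parameters are made up for the example.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Two dense clusters plus two far-away points
rng = np.random.default_rng(1)
cluster_a = rng.normal(loc=0.0, scale=0.1, size=(50, 2))
cluster_b = rng.normal(loc=5.0, scale=0.1, size=(50, 2))
outliers = np.array([[2.5, 2.5], [8.0, -3.0]])
X = np.vstack([cluster_a, cluster_b, outliers])

# Points not density-reachable from any cluster get label -1 (noise)
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]
print(len(anomalies))  # 2
```

The abstract's point is precisely the limitation visible here: only points that are *spatially* far from every dense region are flagged, while rare-but-nearby cases are missed.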
One family of ciphering systems depends on transposing the letters of the plain text to generate the cipher text. The implementation of transposition relies mainly on a two-dimensional matrix in both methods, but they differ: in the columnar method, the columns of the matrix are written out in the order given by their numbers in the key, while in the fixed method the cipher text is obtained by reading the matrix row by row.
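The columnar variant described above can be sketched as follows. Padding the final row with 'X' is an assumption; the abstract does not specify how incomplete rows are handled.

```python
def columnar_encrypt(plaintext, key):
    """Columnar transposition: write the text into rows under the key,
    then read whole columns in ascending order of the key's symbols.
    """
    cols = len(key)
    while len(plaintext) % cols:          # pad so the matrix is rectangular
        plaintext += "X"
    rows = [plaintext[i:i + cols] for i in range(0, len(plaintext), cols)]
    order = sorted(range(cols), key=lambda i: key[i])
    return "".join(row[c] for c in order for row in rows)

# Key "3142": columns are read in the order 2nd, 4th, 1st, 3rd
print(columnar_encrypt("ATTACKATDAWN", "3142"))  # TKAATNACDTAW
```

The fixed method mentioned in the abstract would instead concatenate `rows` directly, reading the same matrix row by row.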
Data-Driven Requirements Engineering (DDRE) represents a vision of a shift from the static, traditional methods of doing requirements engineering to dynamic, data-driven, user-centered methods. Given the data available and the increasingly complex requirements of software systems, whose functions must adapt to changing needs to gain users' trust, an approach embedded in a continuous software engineering process is needed. This need drives the emergence of new challenges in the requirements engineering discipline. The problem addressed in this study was data discrepancies, which hampered the elicitation process, so that in the end the developed software exhibited discrepancies and could not meet the need
Emergency vehicle (EV) services save lives around the world. The fast response EVs must deliver requires minimising travel time, and preempting traffic signals can enable EVs to reach the desired location quickly. Most current research tries to decrease EV delays but neglects the resulting negative impact of preemption on other vehicles on the side roads. This paper proposes a dynamic preemption algorithm that controls the traffic signal by adjusting some cycles to balance two critical goals: minimal delay for EVs with no stop, and only a small additional delay for vehicles on the side roads. This method is applicable to preempting traffic lights for EVs through an Intelli
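The balancing idea can be illustrated with a toy cycle adjustment: extend the main-road green just enough for the EV to pass without stopping, cap the extension so side roads are not starved, and repay the borrowed time in the next cycle. All names, parameters, and the cap are hypothetical; this is not the paper's actual algorithm.

```python
def adjust_cycle(base_green, ev_eta, cycle_pos, extend_cap=15):
    """Toy sketch of balanced preemption (all values in seconds).

    base_green: planned green duration for the EV's approach
    ev_eta:     EV's estimated arrival time at the stop line
    cycle_pos:  how far into the green phase we currently are
    extend_cap: assumed upper bound protecting side-road traffic
    """
    # Extra green needed for the EV to clear without stopping
    needed = max(0, ev_eta - (base_green - cycle_pos))
    extension = min(needed, extend_cap)
    side_road_compensation = extension  # returned to side roads next cycle
    return base_green + extension, side_road_compensation

green, payback = adjust_cycle(base_green=30, ev_eta=12, cycle_pos=25)
print(green, payback)  # 37 7
```

The cap-and-repay structure is what distinguishes this from naive preemption, which would hold the green indefinitely and push all the delay onto the side roads.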
Community detection is an important and interesting topic for better understanding and analyzing complex network structures. Detecting hidden partitions in complex networks is proven to be an NP-hard problem that may not be accurately resolved using traditional methods, so it is modeled in the literature as an optimization problem and solved using evolutionary computation methods. In recent years, many researchers have directed their efforts toward community structure detection by developing different algorithms and making use of single-objective optimization methods. In this study, we continue that line of research by improving the Particle Swarm Optimization (PSO) algorithm using a
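The single-objective PSO mechanics the study builds on can be sketched in continuous form. The study applies PSO to community detection with a discrete, graph-based encoding that is not shown in this excerpt; this sketch only demonstrates the core velocity/position update, with standard (assumed) coefficient values.

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal single-objective PSO with inertia w and cognitive/social
    coefficients c1, c2 (common default-style values, assumed here)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + pull toward personal and global bests
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Sanity check on the sphere function; the swarm converges toward 0
best, val = pso_minimize(lambda p: float(np.sum(p ** 2)), dim=2)
print(val)
```

For community detection, `f` would be replaced by a partition-quality objective such as negative modularity over a discrete encoding of node-to-community assignments.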
Metaheuristics in the swarm intelligence (SI) class have proven to be efficient and have become popular methods for solving different optimization problems. Based on their usage of memory, metaheuristics can be classified into algorithms with memory and without memory (memory-less). The absence of memory in some metaheuristics leads to the loss of information gained in previous iterations: the search tends to divert from promising areas of the solution space, which leads to non-optimal solutions. This paper aims to review memory usage and its effect on the performance of the main SI-based metaheuristics. The investigation covers SI metaheuristics, memory-using and memory-less metaheuristics, memory char
The research aims to shed light on the nature of the tax gap in income tax collected by direct deduction and its reflection on the financial objective of the tax, and to determine the reasons for this gap between the tax due in accordance with the laws and instructions in force and the tax actually paid. The tax gap is a real problem that cannot be ignored, given the loss of financial revenues due to the state that it represents.
The research problem is represented by the existence of a gap between the tax due according to direct-deduction instructions and the tax actually paid according to the financial statements. To achieve the objectives of the research and test the hypotheses, t