The use of deep learning.
Medicine is one of the fields in which advances in computer science are making significant progress. Some diseases require an immediate diagnosis to improve patient outcomes, and the use of computers in medicine improves precision and accelerates data processing and diagnosis. To categorize biological images, this research used hybrid machine learning, a combination of various deep learning approaches, together with a meta-heuristic algorithm. In addition, two medical datasets were introduced: one covering magnetic resonance imaging (MRI) of brain tumors, and the other chest X-rays (CXRs) of COVID-19 cases. These datasets were introduced to the combination network that contained deep learning …
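One simple form of such a hybrid scheme is feature fusion: extract features with several models and concatenate them before classification. The sketch below illustrates the idea only; the projections, shapes, and nearest-centroid classifier are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two fixed random projections stand in for pretrained deep feature
# extractors (e.g. CNN backbones for MRI and CXR images); all names and
# shapes here are illustrative assumptions, not the paper's architecture.
W_a = rng.standard_normal((64, 16))
W_b = rng.standard_normal((64, 8))

def hybrid_features(x):
    """Fuse two feature views by concatenation (a simple hybrid scheme)."""
    return np.concatenate([np.tanh(x @ W_a), np.tanh(x @ W_b)], axis=1)

def fit_centroids(feats, labels):
    return {c: feats[labels == c].mean(axis=0) for c in np.unique(labels)}

def predict(feats, centroids):
    classes = sorted(centroids)
    dists = np.stack([np.linalg.norm(feats - centroids[c], axis=1)
                      for c in classes])
    return np.array(classes)[dists.argmin(axis=0)]

# Synthetic stand-ins for flattened image vectors of two classes.
x0 = rng.standard_normal((50, 64)) + 1.0
x1 = rng.standard_normal((50, 64)) - 1.0
X = np.vstack([x0, x1])
y = np.array([0] * 50 + [1] * 50)

feats = hybrid_features(X)
centroids = fit_centroids(feats, y)
acc = (predict(feats, centroids) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

In a real pipeline the random projections would be replaced by trained networks and the centroid step by a learned classifier; concatenation is just the simplest fusion rule.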
The research aims to identify how to audit potential and contingent liabilities in light of the pandemic and how this is reflected in the auditor's report. The research problem lies in the complexity of examining potential and contingent liabilities in insurance companies, which was negatively reflected in the auditor's neutral technical opinion. The researchers hypothesize that auditing potential and contingent liabilities in light of the Corona pandemic is positively reflected in the auditor's report. The research concludes that the process of examining potential and contingent liabilities is …
Accounting disclosure is the principal means and an effective tool for communicating business results to users in support of their decisions, especially users with expertise and specialization, namely academics and professionals in the field of accounting and auditing, who recognize the importance of accounting disclosure and transparency in financial reports.
Contingent liabilities represent commitments that depend on the occurrence of one or more future events to confirm the amount due, the party entitled to it, the maturity date, or the existence of the obligation itself; therefore, they should not be recognized as a contingent liability …
The data preprocessing step is important in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. To preserve the scalability and efficiency of pattern-discovery algorithms, a preprocessing step must be applied. In this study, the sequential methodologies used to preprocess web server log data are comprehensively evaluated and meticulously examined, with an emphasis on sub-phases such as session identification, user identification, and data cleansing.
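The session-identification sub-phase can be sketched as follows, assuming a 30-minute inactivity timeout (a common heuristic in web usage mining); the record fields and values are hypothetical cleaned log entries, not data from the study.

```python
from datetime import datetime, timedelta

# Illustrative sketch of the session-identification sub-phase, assuming a
# 30-minute inactivity timeout; records are hypothetical cleaned log entries.
TIMEOUT = timedelta(minutes=30)

records = [  # (user_id, timestamp, requested_url)
    ("10.0.0.1", "2024-01-01 09:00:00", "/index"),
    ("10.0.0.1", "2024-01-01 09:10:00", "/products"),
    ("10.0.0.1", "2024-01-01 10:30:00", "/index"),  # >30 min gap: new session
    ("10.0.0.2", "2024-01-01 09:05:00", "/about"),
]

def sessionize(records, timeout=TIMEOUT):
    sessions, last_seen = {}, {}
    for user, ts, url in sorted(records, key=lambda r: (r[0], r[1])):
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        if user not in last_seen or t - last_seen[user] > timeout:
            sessions.setdefault(user, []).append([])  # open a new session
        sessions[user][-1].append(url)
        last_seen[user] = t
    return sessions

print(sessionize(records))
```

Here user identification is reduced to a pre-assigned `user_id`; in practice that sub-phase would itself combine IP address, user agent, and referrer heuristics before sessionization runs.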
In this research, we study the non-homogeneous Poisson process, one of the most important statistical topics with a role in scientific development, as it relates to events that occur in reality and are modeled as Poisson processes because their occurrence depends on time, whether time is changing or stable. Our research clarifies the non-homogeneous Poisson process and uses one of its models, the exponentiated-Weibull model with three parameters (α, β, σ), as a function to estimate the rate of occurrence of earthquakes over time in Erbil Governorate, since the governorate is adjacent to two countries …
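A minimal sketch of how an exponentiated-Weibull form with parameters (α, β, σ) could drive a non-homogeneous Poisson process: the mean-value function Λ(t) is taken proportional to the exponentiated-Weibull CDF, so the intensity λ(t) is its derivative. The proportionality constant and this particular link are assumptions for illustration, not necessarily the paper's construction.

```python
import math

# Exponentiated-Weibull CDF with shapes alpha, beta and scale sigma.
def ew_cdf(t, alpha, beta, sigma):
    return (1.0 - math.exp(-((t / sigma) ** beta))) ** alpha

# Assumed NHPP link: mean-value function Lambda(t) = scale * ew_cdf(t),
# so the expected number of events in [0, T] is Lambda(T) - Lambda(0) = Lambda(T).
def expected_events(T, alpha, beta, sigma, scale=100.0):
    return scale * ew_cdf(T, alpha, beta, sigma)

# Intensity lambda(t) as the numerical derivative of Lambda(t).
def intensity(t, alpha, beta, sigma, scale=100.0, h=1e-6):
    return scale * (ew_cdf(t + h, alpha, beta, sigma)
                    - ew_cdf(t - h, alpha, beta, sigma)) / (2 * h)

print(f"expected events by t=5: {expected_events(5, 2.0, 1.5, 4.0):.2f}")
```

Because Λ(t) is nondecreasing and bounded by `scale`, this particular construction models a finite expected total number of events; an unbounded Λ(t) would need a different link.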
This paper estimates the parameters and system reliability of a stress-strength model in which the system contains several parallel components whose strengths are subject to a common stress, where both stress and strengths follow the generalized inverse Rayleigh distribution, using different Bayesian estimation methods. A Monte Carlo simulation is introduced to compare the proposed methods based on the mean squared error criterion.
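The quantity being estimated can be sketched by direct Monte Carlo: for a parallel system under one common stress, the system survives when the strongest component exceeds the stress. The generalized inverse Rayleigh is taken below in the exponentiated form F(x) = exp(-a(s/x)²); this parameterization and all parameter values are assumptions for illustration, not necessarily the paper's.

```python
import math
import random

random.seed(1)

# Assumed GIR parameterization: F(x) = exp(-a * (s / x)**2).
def sample_gir(a, s):
    """Inverse-transform sampling: solve F(x) = u for x."""
    u = random.random()
    return s / math.sqrt(-math.log(u) / a)

def mc_reliability(k, a_y, s_y, a_x, s_x, reps=20000):
    """Monte Carlo estimate of R = P(max strength > stress) for k parallel
    components with i.i.d. strengths under one common stress."""
    hits = 0
    for _ in range(reps):
        stress = sample_gir(a_x, s_x)
        strongest = max(sample_gir(a_y, s_y) for _ in range(k))
        if strongest > stress:
            hits += 1
    return hits / reps

# Strength scale 2 vs stress scale 1: strengths tend to dominate.
R = mc_reliability(k=3, a_y=2.0, s_y=2.0, a_x=2.0, s_x=1.0)
print(f"estimated system reliability: {R:.3f}")
```

The paper's comparison then computes, for each Bayesian estimator of R, the mean squared error of such estimates across repeated simulated samples.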
In real situations, all observations and measurements are not exact numbers but more or less non-exact, also called fuzzy. In this paper, we therefore use approximate non-Bayesian computational methods to estimate the inverse Weibull parameters and reliability function from fuzzy data. The maximum likelihood and moment estimates are obtained as non-Bayesian estimators. The maximum likelihood estimators are derived numerically based on two iterative techniques, namely the Newton-Raphson and the Expectation-Maximization techniques. In addition, a Monte Carlo simulation study is provided to compare numerically the obtained estimates of the parameters and reliability function …
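The Newton-Raphson step can be sketched for the crisp-data case, taking the inverse Weibull pdf as f(x) = bλx^(−b−1)exp(−λx^(−b)): λ is profiled out in closed form and Newton's method solves the score equation for the shape b. This is an illustrative crisp-data sketch only; the paper's fuzzy-data likelihood replaces each observation's contribution with a membership-weighted integral, which is not implemented here.

```python
import math
import random

random.seed(0)

# Inverse Weibull log-likelihood for crisp data,
# pdf f(x) = b * l * x**(-b - 1) * exp(-l * x**(-b)).
def loglik(b, l, xs):
    return sum(math.log(b) + math.log(l) - (b + 1) * math.log(x)
               - l * x ** (-b) for x in xs)

def profile_score(b, xs, h=1e-5):
    """Numerical derivative in b of the profile log-likelihood, where the
    scale is profiled out as l_hat(b) = n / sum(x**-b)."""
    n = len(xs)
    def ell(bb):
        l = n / sum(x ** (-bb) for x in xs)
        return loglik(bb, l, xs)
    return (ell(b + h) - ell(b - h)) / (2 * h)

def newton_mle(xs, b0=1.0, tol=1e-6, max_iter=100):
    b, h = b0, 1e-4
    for _ in range(max_iter):
        g = profile_score(b, xs)
        dg = (profile_score(b + h, xs) - profile_score(b - h, xs)) / (2 * h)
        step = g / dg
        b = max(b - step, 0.05)  # keep the shape parameter positive
        if abs(step) < tol:
            break
    l = len(xs) / sum(x ** (-b) for x in xs)
    return b, l

# Simulated data: F(x) = exp(-l * x**-b), so X = (-log(U) / l) ** (-1 / b).
true_b, true_l = 2.0, 1.5
xs = [(-math.log(random.random()) / true_l) ** (-1.0 / true_b)
      for _ in range(2000)]
b_hat, l_hat = newton_mle(xs)
print(f"b_hat={b_hat:.3f}, l_hat={l_hat:.3f}")
```

Numerical derivatives are used here for brevity; the paper's derivation would use the analytic score and Hessian.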
The aim of this study is to screen the phytochemicals found in Populus euphratica leaves, since these trees are used traditionally by many villagers to treat eczema and other skin diseases, and the plant is poorly investigated for its phytochemicals, especially in Iraq. Phytochemical screening of the extracts obtained from the n-hexane and chloroform fractions of Populus euphratica leaves was performed by thin-layer chromatography with various spraying reagents to test for the presence of alkaloids, sterols, and other compounds. UPLC-electrospray ionization tandem mass spectrometry (UPLC-ESI-MS/MS), along with GC-MS and HPTLC, was used to identify the phytochemicals present in the plant leaves. By the UPLC-ESI-MS/MS method, 20 compounds …