The success of any institution rests on the means to protect its resources and assets from waste, loss, and misuse, and on the availability of accurate and reliable data in accounting reports to increase operational efficiency; the internal control system is therefore regarded as a safety valve for top management in any economic unit. The research problem lies in the need for an efficient system: to ensure its success, external parties must monitor and evaluate performance against clear criteria. The research therefore addresses the performance evaluation indicators set by the Federal Board of Supreme Audit (FBSA), the extent of their contribution to achieving an efficient internal control system for the General Commission of Taxes (GCT) and to fulfilling the requirements of tax reform, the shortcomings in these indicators, and the role of internal control in the GCT in realizing the FBSA's aim of raising the efficiency of tax work. The research aims to clarify the role of the FBSA in evaluating performance so as to raise the efficiency of the internal control system and of the tax administration in general, and to find out how modern methods and techniques can be applied in the control process over tax procedures; the importance of the research lies in showing the role of the FBSA in evaluating tax administration performance. Internal control is one of the fundamental foundations of management performance and an indispensable stage of the tax collection mechanism as a whole, being a cornerstone of the tax system; an efficient control system, applied in a scientific manner, increases the effectiveness of management performance and can help achieve the desired economy.
This article presents the development and use of a side-polished fiber optic sensor that detects changes in the refractive index of a glucose solution through the surface plasmon resonance (SPR) effect. To enhance efficiency, a 50 nm-thick gold layer was deposited on the D-shaped sensing area of the fiber. The sensor was fabricated from a silica optical fiber (SOF) whose cladding was stripped over three distinct lengths, after which the fiber was polished to remove a portion of its diameter and produce a D-shaped cross-section. During experimentation with glucose solution, the side-polished fiber optic sensor revealed an adept detection
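As background for how such a sensor transduces refractive-index changes: resonance occurs where the guided mode of the polished fiber is phase-matched to the surface plasmon at the gold-analyte interface, so the resonance wavelength shifts as the solution index changes. A standard textbook form of this condition (not stated in the abstract; eps_m(lambda) denotes the gold permittivity and n_s the refractive index of the glucose solution):

```latex
% SPR phase-matching: the effective index of the fiber's guided mode
% equals that of the surface plasmon wave at the gold-analyte interface.
n_{\mathrm{eff}} \;=\; \operatorname{Re}\sqrt{\frac{\varepsilon_m(\lambda)\, n_s^{2}}{\varepsilon_m(\lambda) + n_s^{2}}}
```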
In real situations, observations and measurements are not exact numbers but more or less non-exact, also called fuzzy. In this paper, we use approximate non-Bayesian computational methods to estimate the inverse Weibull parameters and reliability function from fuzzy data. The maximum likelihood and moment estimators are obtained as non-Bayesian estimators; the maximum likelihood estimators are derived numerically using two iterative techniques, namely the Newton-Raphson and Expectation-Maximization algorithms. In addition, the resulting estimates of the parameters and the reliability function are compared numerically through a Monte Carlo simulation study
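To make the Newton-Raphson step concrete, the sketch below fits the two-parameter inverse Weibull distribution (pdf f(x) = a*b*x^-(b+1)*exp(-a*x^-b), x > 0) to crisp data by iterating on the gradient and Hessian of the log-likelihood. It is a minimal illustration, not the paper's fuzzy-data procedure, which would additionally weight the likelihood by each observation's membership function.

```python
import numpy as np

def inv_weibull_mle(x, alpha0=1.0, beta0=1.0, tol=1e-8, max_iter=100):
    """Newton-Raphson MLE for the inverse Weibull distribution with
    pdf f(x) = a*b * x**-(b+1) * exp(-a * x**-b), x > 0.
    Crisp-data sketch without step-size safeguards."""
    x = np.asarray(x, dtype=float)
    n, lx = len(x), np.log(x)
    theta = np.array([alpha0, beta0])
    for _ in range(max_iter):
        a, b = theta
        xb = x ** -b                                  # x_i^(-beta)
        s1, s2 = (xb * lx).sum(), (xb * lx**2).sum()
        grad = np.array([n / a - xb.sum(),            # dL/da
                         n / b - lx.sum() + a * s1])  # dL/db
        hess = np.array([[-n / a**2, s1],
                         [s1, -n / b**2 - a * s2]])
        step = np.linalg.solve(hess, grad)
        theta = theta - step                          # Newton update
        if np.max(np.abs(step)) < tol:
            break
    return theta

# Simulated check: inverse-transform sampling with alpha=2, beta=1.5.
u = np.random.default_rng(0).uniform(size=5000)
sample = (-np.log(u) / 2.0) ** (-1 / 1.5)
print(inv_weibull_mle(sample))   # should land near [2.0, 1.5]
```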
There is evidence that channel estimation plays a crucial role in recovering the transmitted data in communication systems. In recent years there has been increasing interest in channel estimation and equalization problems, especially when the channel impulse response follows a fast time-varying Rician fading distribution, i.e., when it changes rapidly; optimal channel estimation and equalization are then needed to recover the transmitted data. This paper compares the epsilon normalized least mean square (ε-NLMS) and recursive least squares (RLS) algorithms by computing their ability to track multiple fast time-varying Rician fading channels at different values of Doppler
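For concreteness, a minimal NumPy sketch of the two adaptive algorithms being compared is given below. The filter length, step size mu, regularization eps, forgetting factor lam, and initialization delta are illustrative choices rather than the paper's settings, and in the tracking experiment the desired signal d would be produced by the time-varying Rician channel.

```python
import numpy as np

def nlms(x, d, taps, mu=0.5, eps=1e-6):
    """epsilon-normalized LMS: gradient step scaled by input power."""
    w, e = np.zeros(taps), np.zeros(len(d))
    for n in range(taps - 1, len(d)):
        u = x[n - taps + 1:n + 1][::-1]       # x[n], x[n-1], ...
        e[n] = d[n] - w @ u                   # a-priori error
        w += mu * e[n] * u / (eps + u @ u)    # normalized update
    return w, e

def rls(x, d, taps, lam=0.99, delta=100.0):
    """Recursive least squares with exponential forgetting."""
    w, e = np.zeros(taps), np.zeros(len(d))
    P = delta * np.eye(taps)                  # inverse correlation estimate
    for n in range(taps - 1, len(d)):
        u = x[n - taps + 1:n + 1][::-1]
        k = P @ u / (lam + u @ P @ u)         # gain vector
        e[n] = d[n] - w @ u
        w += k * e[n]
        P = (P - np.outer(k, u @ P)) / lam
    return w, e
```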
A heat island is an increase in air temperature in large, industrial cities compared with the surrounding rural areas. In this study, remote sensing is used to monitor and track thermal variations within the city center of Baghdad using Landsat satellite images for the period from 2000 to 2015. Several processing steps were applied to these images in GIS 10.6 and ERDAS 2014, including image correction and extraction, supervised classification, and selection of training samples. Urban areas were detected by linking the supervised classification to surface temperature readings derived from the thermal bands of the satellite images. The results showed that the surface temperature of the city
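As an illustration of how temperature readings are derived from Landsat thermal bands, the sketch below converts thermal-band digital numbers to at-sensor brightness temperature. The constants shown are the published Landsat 8 TIRS band 10 values; scenes from earlier in the 2000-2015 period (Landsat 5/7) carry their own per-scene constants, and a full land-surface-temperature workflow would further correct for emissivity.

```python
import numpy as np

# Radiometric rescaling and thermal constants for Landsat 8 TIRS band 10,
# as published in standard Landsat metadata.
ML, AL = 3.342e-4, 0.1            # radiance gain / offset
K1, K2 = 774.8853, 1321.0789      # thermal conversion constants

def brightness_temperature(dn):
    """Digital numbers -> at-sensor brightness temperature (deg C)."""
    radiance = ML * np.asarray(dn, dtype=float) + AL
    return K2 / np.log(K1 / radiance + 1.0) - 273.15

print(brightness_temperature(25000))   # mid-range DN -> roughly 18-19 degC
```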
Vehicular ad hoc networks (VANETs) suffer from a dynamic network environment and topological instability caused by high mobility and varying vehicle density. Emerging 5G mobile technologies offer new opportunities to design an improved VANET architecture for future intelligent transportation systems. However, current software-defined networking (SDN) based handover schemes show poor handover performance in VANET environments, with notable issues in connection establishment and ongoing communication sessions; these connectivity and inflexibility challenges appear at high vehicle speeds and high data rates. Therefore, this paper proposes a flexible handover solution for VANETs by integrating SDN and
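To give a feel for the controller-side decision such a scheme centralizes, here is a toy sketch in which an SDN controller hands a vehicle over only when a candidate roadside unit beats the serving one by a hysteresis margin and is not overloaded; all names and thresholds are hypothetical illustrations, not the paper's design.

```python
from dataclasses import dataclass

@dataclass
class Rsu:
    rsu_id: str
    rssi_dbm: float   # signal strength reported for the vehicle
    load: float       # fraction of capacity in use, 0..1

def pick_handover_target(candidates, serving_id,
                         rssi_margin_db=3.0, max_load=0.9):
    """Toy controller-side handover decision: switch only when a
    candidate RSU beats the serving one by a hysteresis margin and
    is not overloaded. Thresholds are illustrative."""
    serving = next(c for c in candidates if c.rsu_id == serving_id)
    best = max(candidates, key=lambda c: c.rssi_dbm)
    if (best.rsu_id != serving_id
            and best.rssi_dbm >= serving.rssi_dbm + rssi_margin_db
            and best.load <= max_load):
        return best.rsu_id       # instruct switch to this RSU
    return serving_id            # stay on the current RSU

print(pick_handover_target(
    [Rsu("rsu-1", -80, 0.4), Rsu("rsu-2", -72, 0.5)], "rsu-1"))  # -> rsu-2
```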
Data scarcity is a major challenge when training deep learning (DL) models, which demand large amounts of data to achieve exceptional performance. Unfortunately, many applications have small or inadequate datasets for training DL frameworks. Manual labeling is usually needed to provide labeled data, typically involving human annotators with broad background knowledge, and this annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically; a larger amount of data generally yields a better DL model, although performance is also application dependent. This issue is the main barrier for
Feature selection (FS) is a series of processes used to decide which relevant features/attributes to include and which irrelevant features to exclude for predictive modeling. It is a crucial task that helps machine learning classifiers reduce error rates, computation time, and overfitting, and improve classification accuracy. It has demonstrated its efficacy in myriad domains, including text classification (TC), text mining, and image recognition. While many traditional FS methods exist, recent research has been devoted to applying metaheuristic algorithms as FS techniques for the TC task; however, there are few literature reviews on FS for TC. Therefore, a comprehensive overview was systematically
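To illustrate the wrapper-style metaheuristic FS such reviews survey, here is a toy hill-climbing search over binary feature masks, scored by cross-validated classifier accuracy. The synthetic dataset, classifier, and evaluation budget are illustrative stand-ins for the TF-IDF text features and GA/PSO-style searches used in actual TC studies.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for a (documents x features) matrix.
X, y = make_classification(n_samples=300, n_features=30, n_informative=6,
                           n_redundant=4, random_state=0)
rng = np.random.default_rng(0)

def fitness(mask):
    """Wrapper fitness: CV accuracy of a classifier on the feature subset."""
    if not mask.any():
        return 0.0
    return cross_val_score(GaussianNB(), X[:, mask], y, cv=3).mean()

mask = rng.random(X.shape[1]) < 0.5          # random initial feature subset
best = fitness(mask)
for _ in range(200):                         # evaluation budget
    trial = mask.copy()
    trial[rng.integers(X.shape[1])] ^= True  # flip one randomly chosen bit
    score = fitness(trial)
    if score >= best:                        # accept non-worsening moves
        mask, best = trial, score

print(f"kept {mask.sum()} of {mask.size} features, CV accuracy {best:.3f}")
```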
This review focuses on protein and peptide separation studies from 1995 to 2010. Peptide and protein analysis developed dramatically after the application of mass spectrometry (MS) and related techniques, such as two-dimensional liquid chromatography and two-dimensional gel electrophoresis. Mass spectrometry measures the mass-to-charge ratios of ionized sample species. High-performance liquid chromatography (HPLC) is an important technique that is usually applied before MS because of its efficient separation. Characterization of proteins provides a foundation for the fundamental understanding of aspects of biology. In this review, instrumentation, principles, applications, developments, and accuracy of
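Since the mass-to-charge measurement is central here, a small worked example may help: in positive-mode electrospray, a peptide of neutral monoisotopic mass M carrying z protons appears at m/z = (M + z*1.007276)/z. The 1045.53 Da mass below is a nominal peptide value chosen for illustration, not a figure from the review.

```python
PROTON = 1.007276  # Da, proton mass

def mz(neutral_mass, z):
    """m/z of an [M + zH]^z+ ion in positive-mode ESI."""
    return (neutral_mass + z * PROTON) / z

M = 1045.53  # Da, nominal peptide monoisotopic mass (illustrative)
for z in (1, 2, 3):
    print(f"[M+{z}H]{z}+ at m/z {mz(M, z):.4f}")
```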
Recent studies have reported conflicting results on the health effects of caffeine; these studies are inconsistent in design, population, and the source of the caffeine consumed. In the current study, we aimed to evaluate the possible health effects of dietary caffeine intake among overweight and obese individuals.
In this cross-sectional study, 488 apparently healthy individuals with overweight or obesity participated. Dietary intake was assessed by a Food Frequency Questionnaire (FFQ) and