The success of any institution depends on means of protecting its resources and assets from waste, loss, and misuse, and on the availability of accurate and reliable accounting data that increase operational efficiency; the internal control system is therefore a safety valve for top management in any economic unit. The research problem is the need for an efficient internal control system: to ensure its success, external parties must monitor and evaluate performance against clear criteria. The research therefore addresses the performance evaluation indicators set by the Federal Board of Supreme Audit (FBSA), the extent of their contribution to an efficient internal control system at the General Commission of Taxes (GCT), how they fulfil the requirements of tax reform, the shortcomings in these indicators, and the role of internal control in the GCT in achieving the FBSA's aspiration of raising the efficiency of tax work. The aim of the research is to understand the role of the FBSA in evaluating performance so as to raise the efficiency of the internal control system and of the tax administration in general, and to find out how modern methods and techniques can be used in the control of tax procedures; the importance of the research lies in the FBSA's role in evaluating the performance of the tax administration. Internal control is one of the fundamental foundations of management performance and an important, indispensable stage of the tax collection mechanism as a whole; it is a cornerstone of the tax system, can help achieve the desired economy, and, when applied efficiently and scientifically, increases the effectiveness of management performance.
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have too little data to train DL frameworks. Manual labeling is usually needed to provide labeled data, typically involving human annotators with extensive background knowledge; this annotation process is costly, time-consuming, and error-prone. Every DL framework is typically fed a significant amount of labeled data from which it automatically learns representations. Ultimately, a larger amount of data generally yields a better DL model, although performance is also application dependent. This issue is the main barrier for
Feature selection (FS) is a series of processes used to decide which relevant features/attributes to include and which irrelevant features to exclude for predictive modeling. It is a crucial task that helps machine learning classifiers reduce error rates, computation time, and overfitting, and improve classification accuracy. It has demonstrated its efficacy in a myriad of domains, including text classification (TC), text mining, and image recognition. While many traditional FS methods exist, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews concerning TC. Therefore, a comprehensive overview was systematicall
Water quality sensors have recently received a lot of attention due to their impact on human health, and carbon quantum dots (CQDs) are well suited to environmental sensors because of their distinct features. In this study, CQDs were prepared using the electrochemical method, and their structural and optical properties were studied. The quantum dots were used in an environmental sensor application by preparing three different solutions: CQDs, Alq3 polymer, and a mixture of CQDs and Alq3, deposited on silicon using two different methods: drop casting and spin coating. The sensitivity to water pollutants was studied for each of the prepared samples after measuring the change in resistance of the samples at a temperature of
A cut-off low is a closed low with a low value of geopotential height at upper atmospheric levels that has fully detached (cut off) from the westerly flow and moves independently. Cut-off lows cause extreme rainfall events in mid-latitude regions. The main aim of this paper is to investigate the cut-off low at 500 hPa over Iraq from a synoptic point of view, together with the behavior of geopotential height at 500 hPa. To examine the association of the 500 hPa cut-off low with rainfall events across Iraq, two case studies of heavy rainfall events from different times were conducted. The results showed that a cut-off low at 500 hPa with a low value of geopotential height strengthens the low-pressure system at the surface, lea
'Steganography is the science of hiding information in the cover media', a force in the context of information sec
In this study, an efficient photocatalyst for the dissociation of water was prepared and studied. A chromium oxide (Cr2O3) with titanium dioxide (TiO2) nanofibers (Cr2O3-TNFs) nanocomposite with chitosan extract was synthesized using ecologically friendly methods such as ultrasonic and hydrothermal techniques; the TiO2 exhibits a nanofiber (TNFs) shape struct
In this research, the Artificial Neural Networks (ANNs) technique was applied in an attempt to predict the water levels and some water quality parameters of the Tigris River at five different sites in Wasit Governorate. These predictions are useful in the planning, management, and evaluation of the water resources in the area. Spatial data along a river system usually have missing measurements at some locations in a catchment area, hence an accurate prediction model to fill these missing values is essential.
The selected sites for water quality data prediction were the Sewera, Numania, Kut u/s, Kut d/s, and Garaf observation sites. At these five sites, models were built for the prediction of the water level and water quality parameters.
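The gap-filling approach described above can be sketched as follows: a small feed-forward ANN learns the relationship between measurements at neighbouring sites and the value at a target site, then fills the target site's missing records. This is an illustrative sketch only, with synthetic numbers standing in for the Tigris River data; the network size and site relationship are assumptions, not the paper's configuration.

```python
# Hypothetical sketch: predict water level at a downstream site from two
# upstream sites with a small feed-forward ANN (MLP). Synthetic data only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
upstream = rng.uniform(18.0, 22.0, size=(200, 2))   # levels at 2 upstream sites (m)
downstream = upstream.mean(axis=1) - 0.5 + rng.normal(0.0, 0.05, 200)

X_train, X_test, y_train, y_test = train_test_split(
    upstream, downstream, random_state=0)

model = make_pipeline(
    StandardScaler(),                                # scale inputs for the ANN
    MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                 max_iter=2000, random_state=0))
model.fit(X_train, y_train)

r2 = model.score(X_test, y_test)                     # goodness of fit (R^2)
print(f"test R^2 = {r2:.2f}")
# model.predict(...) would then fill missing downstream records
```

In practice the model would be trained on dates where all sites were measured and applied to dates where the target site's record is missing.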
This article presents the development and use of a side-polished fiber optic sensor that can identify changes in the refractive index of a glucose solution by exploiting the surface plasmon resonance (SPR) effect. Efficiency was enhanced by depositing a 50 nm-thick layer of gold on the D-shape fiber sensing area. The sensor was fabricated from a silica optical fiber (SOF) that underwent a cladding-stripping process over three distinct lengths, followed by polishing to remove a portion of the fiber diameter and produce a cross-sectional D-shape. During experimentation with glucose solution, the side-polished fiber optic sensor revealed an adept detection
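The sensing principle behind such a D-shape SPR sensor can be summarized by the standard phase-matching condition (a textbook relation, not a formula taken from this article): resonance occurs at the wavelength where the propagation constant of the guided core mode equals the real part of the surface plasmon wave vector at the gold/solution interface, so a change in the solution's refractive index shifts the resonance.

```latex
% Phase-matching condition for SPR excitation (textbook form):
%   n_eff : effective index of the guided core mode
%   eps_m : (complex) permittivity of the gold layer
%   n_s   : refractive index of the glucose solution
\frac{2\pi}{\lambda}\, n_{\mathrm{eff}}
  \;=\;
\operatorname{Re}\!\left\{ \frac{2\pi}{\lambda}
  \sqrt{\frac{\varepsilon_{m}\, n_{s}^{2}}{\varepsilon_{m} + n_{s}^{2}}} \right\}
```

Because the right-hand side depends on n_s, a higher glucose concentration (higher n_s) moves the resonance wavelength, which is what the sensor reads out.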