The research aims to evaluate the illustrative images in the content of the second intermediate stage computer textbook for the academic year (2019-2020) and to determine how far they meet good-image standards, as seen by computer teachers. The sample was selected randomly: (30) teachers who actually teach the subject in schools within the geographical area of Baghdad province (Karkh III). To achieve this goal, ten standards were identified: scientific accuracy, suitability to the students' level, image clarity, image freshness, quality of coloring, suitability of the image's location to the subject, whether the image's content can be grasped at a glance, appropriateness of the subject in terms of area, matching of the title…
Internal control is particularly important in improving performance and in tax reform; it plays an important role in the regularity and development of work, in combating corruption, and in activating decisions and tax legislation, as embodied in the organizational plan and in the means, procedures, and components designed to ensure that policies and implementation plans are carried out. The research aims to review the reality of internal control in the General Commission for Taxes, to identify its deficiencies, and to strengthen the role of internal control in the Commission on the basis of the applicable laws and regulations, using modern working methods as well as developing the performance of its employees, including helping…
To transfer data securely from sender to receiver, cryptography is one of the approaches used for this purpose. To raise the level of data security further, DNA was introduced into cryptography as a new concept: DNA can easily be used to store and transfer data, has become an effective medium for such aims, and can be used to carry out computation. A new cryptography system is proposed, consisting of two phases: an encryption phase and a decryption phase. The encryption phase includes six steps, starting by converting the plaintext to its equivalent ASCII values and converting these to binary values. The binary values are then converted to DNA characters and afterwards to their equivalent complementary DNA…
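The abstract does not give the coding tables, but the early steps can be illustrated with a small sketch. Below, an assumed 2-bit mapping (00→A, 01→C, 10→G, 11→T) and standard Watson-Crick complementation stand in for the paper's actual tables; only the plaintext → ASCII → binary → DNA → complementary DNA chain is shown, not the remaining encryption steps.

```python
# Early encryption steps only: plaintext -> ASCII -> binary -> DNA -> complement.
# The 2-bit base table and the complement rule are assumed for illustration.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def text_to_dna(plaintext: str) -> str:
    """Convert text to a DNA string via 8-bit ASCII codes, two bits per base."""
    binary = "".join(format(ord(ch), "08b") for ch in plaintext)
    return "".join(BITS_TO_BASE[binary[i:i + 2]] for i in range(0, len(binary), 2))

def complement(dna: str) -> str:
    """Replace every base with its complementary base."""
    return "".join(COMPLEMENT[base] for base in dna)

dna = text_to_dna("Hi")          # 'H','i' -> 72,105 -> 16 bits -> 8 bases
print(dna, complement(dna))      # prints: CAGACGGC GTCTGCCG
```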
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing, especially with the large increase in the volume of textual data produced daily. Traditional approaches to calculating the degree of similarity between two texts, based on the words they share, do not perform well with short texts, because two similar texts may be written with different terms through the use of synonyms. As a result, short texts should be compared semantically. This paper presents a method for measuring semantic similarity between texts that combines knowledge-based and corpus-based semantic information to build a semantic network that represents…
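As a rough illustration of why word-overlap measures fail and what a knowledge-based component contributes, the sketch below scores two short texts with WordNet path similarity and greedy word matching. This is not the paper's semantic-network method; the aggregation scheme is an assumption, and a corpus-based signal (e.g., embedding similarity) would be blended in on top of it.

```python
# Knowledge-based word-to-word similarity (WordNet path similarity) aggregated by
# greedy matching; not the paper's semantic-network construction.
# Requires: pip install nltk  and  nltk.download('wordnet') once.
from nltk.corpus import wordnet as wn

def word_similarity(w1: str, w2: str) -> float:
    """Best WordNet path similarity over all synset pairs; falls back to exact match."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
    return max(scores, default=1.0 if w1 == w2 else 0.0)

def text_similarity(t1: str, t2: str) -> float:
    """Average, over words of each text, of the best match found in the other text."""
    a, b = t1.lower().split(), t2.lower().split()
    def directed(x, y):
        return sum(max(word_similarity(w, v) for v in y) for w in x) / len(x)
    return (directed(a, b) + directed(b, a)) / 2

# Word overlap alone rates these as quite different; synonym knowledge brings them closer.
print(text_similarity("the car is fast", "the automobile is quick"))
```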
The worldwide spread of the internet, together with the huge number of users exchanging important information over it, highlights the need for new methods to protect that information from corruption or modification by intruders. This paper suggests a new method to ensure that the text of a given document cannot be modified by intruders. The method consists of a mixture of three steps. The first step borrows some concepts of the "Quran" security system to detect certain types of change in a given text: a key for each paragraph is extracted from the group of letters in that paragraph whose positions are multiples of a given prime number. This step cannot detect the change…
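The first step can be pictured with a small sketch: for each paragraph, the characters at positions that are multiples of a chosen prime are collected and condensed into a key, and re-deriving the keys later exposes edits. The prime value, the 1-based indexing, and the use of a hash to condense the key are illustrative assumptions, not details taken from the paper.

```python
# First step only: a per-paragraph key built from characters at positions that are
# multiples of a prime. Prime value, 1-based positions, and SHA-256 are assumptions.
import hashlib

def paragraph_key(paragraph: str, prime: int = 7) -> str:
    """Characters at 1-based positions divisible by `prime`, condensed to a hash."""
    picked = [ch for i, ch in enumerate(paragraph, start=1) if i % prime == 0]
    return hashlib.sha256("".join(picked).encode("utf-8")).hexdigest()

def document_keys(text: str, prime: int = 7) -> list:
    """One key per paragraph; recomputing and comparing keys exposes edits."""
    return [paragraph_key(p, prime) for p in text.split("\n\n") if p.strip()]

original = "Some paragraph of the protected document.\n\nA second paragraph."
tampered = original.replace("Some", "Any")
print(document_keys(original) == document_keys(tampered))   # False: the edit is detected
```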
In this article, we design an optimal neural network based on a new LM (Levenberg-Marquardt) training algorithm. The traditional LM algorithm requires large amounts of memory, storage, and computation because it has to update Hessian approximations in each iteration. The suggested design converts the original problem into a minimization problem and uses a feed-forward network to solve non-linear 3D PDEs. An optimal design is also obtained by computing the learning parameters with high precision. Examples are provided to portray the efficiency and applicability of this technique, and comparisons with other designs demonstrate the accuracy of the proposed design.
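For readers unfamiliar with why LM training is memory-hungry, the generic Levenberg-Marquardt update is sketched below on a tiny curve-fitting problem rather than the paper's PDE network: every iteration forms the damped Gauss-Newton matrix JᵀJ + μI, which is exactly the Hessian-approximation cost the proposed design seeks to reduce. The model and data here are illustrative only.

```python
# Classical Levenberg-Marquardt on a toy least-squares fit of y ~ a*exp(b*x).
# Each iteration builds J^T J + mu*I (the Hessian approximation the abstract mentions).
import numpy as np

def lm_fit(x, y, theta, mu=1e-2, iters=50):
    """Return [a, b] fitted by damped Gauss-Newton (Levenberg-Marquardt) steps."""
    for _ in range(iters):
        a, b = theta
        r = y - a * np.exp(b * x)                        # residual vector
        J = np.column_stack([-np.exp(b * x),             # dr/da
                             -a * x * np.exp(b * x)])    # dr/db
        step = np.linalg.solve(J.T @ J + mu * np.eye(2), -J.T @ r)
        trial = theta + step
        if np.sum((y - trial[0] * np.exp(trial[1] * x)) ** 2) < np.sum(r ** 2):
            theta, mu = trial, mu * 0.5                  # accept step, reduce damping
        else:
            mu *= 2.0                                    # reject step, increase damping
    return theta

x = np.linspace(0.0, 1.0, 40)
y = 2.0 * np.exp(1.5 * x) + 0.01 * np.random.randn(40)   # noisy synthetic data
print(lm_fit(x, y, np.array([1.0, 1.0])))                # close to [2.0, 1.5]
```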
Optimizing Access Point (AP) deployment is of great importance in wireless applications, owing to the requirement to provide efficient and cost-effective communication. Quality of Service (QoS), heavily targeted by researchers and industry, is a primary parameter and objective to keep in mind alongside AP placement and overall deployment cost. This study proposes and investigates a multi-level optimization algorithm based on Binary Particle Swarm Optimization (BPSO). It aims at an optimal multi-floor AP placement with effective coverage, making it more capable of supporting QoS and cost effectiveness. Five pairs (coverage, AP placement) of weights, signal threshold…
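A generic BPSO skeleton of the kind this study builds on is sketched below, with each particle encoded as a bit vector over candidate AP sites and a toy fitness that trades covered clients against the number of deployed APs. The site grid, coverage radius, and weights are assumptions for illustration; the paper's multi-level, multi-floor formulation and QoS terms are not reproduced here.

```python
# Generic Binary PSO: bit j = 1 means "place an AP at candidate site j".
import numpy as np

rng = np.random.default_rng(0)
sites = rng.uniform(0, 100, (12, 2))      # candidate AP positions (x, y), assumed
clients = rng.uniform(0, 100, (60, 2))    # demand points, assumed
RADIUS, COST_W = 30.0, 0.02               # coverage radius and per-AP cost weight, assumed

def fitness(bits):
    aps = sites[bits.astype(bool)]
    if len(aps) == 0:
        return 0.0
    d = np.linalg.norm(clients[:, None, :] - aps[None, :, :], axis=2)
    covered = (d.min(axis=1) <= RADIUS).mean()           # fraction of covered clients
    return covered - COST_W * bits.sum()                  # coverage vs. deployment cost

def bpso(n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    dim = len(sites)
    x = rng.integers(0, 2, (n_particles, dim))
    v = rng.uniform(-1, 1, (n_particles, dim))
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = (rng.random((n_particles, dim)) < 1 / (1 + np.exp(-v))).astype(int)  # sigmoid
        f = np.array([fitness(p) for p in x])
        improved = f > pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmax()].copy()
    return gbest, pbest_f.max()

print(bpso())   # chosen AP sites (bit vector) and the best fitness found
```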
The increasing availability of computing power over the past two decades has been used to develop new techniques for optimizing the solution of estimation problems. Today's computational capacity and the widespread availability of computers have enabled the development of a new generation of intelligent computing techniques, such as the algorithm of interest here. This paper presents one member of a new class of stochastic search algorithms (known as the Canonical Genetic Algorithm, CGA) for optimizing the maximum likelihood function. The strategy is composed of three main steps: recombination, mutation, and selection. The experimental design is based on simulating the CGA with different values of …, and the results are compared with those of the method of moments. Based on the MSE values obtained from both…
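The three steps named in the abstract can be seen in a compact canonical GA sketch that maximizes a log-likelihood over a binary-encoded parameter. Since the abstract does not state the distribution being estimated, the exponential model, the 16-bit encoding, and the GA settings below are illustrative assumptions only.

```python
# Canonical GA (binary chromosomes, roulette selection, one-point crossover, bit-flip
# mutation) maximizing an assumed exponential log-likelihood.
import numpy as np

rng = np.random.default_rng(1)
data = rng.exponential(scale=2.0, size=200)      # simulated sample, true rate = 0.5
BITS, LO, HI = 16, 0.01, 5.0                     # chromosome length and rate range, assumed

def decode(bits):
    return LO + int("".join(map(str, bits)), 2) / (2**BITS - 1) * (HI - LO)

def log_lik(rate):                               # exponential log-likelihood
    return len(data) * np.log(rate) - rate * data.sum()

def cga(pop_size=40, gens=80, pc=0.8, pm=0.02):
    pop = rng.integers(0, 2, (pop_size, BITS))
    for _ in range(gens):
        fit = np.array([log_lik(decode(c)) for c in pop])
        w = fit - fit.min() + 1e-9                        # shift so roulette weights are positive
        parents = pop[rng.choice(pop_size, pop_size, p=w / w.sum())]   # selection
        for i in range(0, pop_size, 2):                   # one-point crossover (recombination)
            if rng.random() < pc:
                cut = rng.integers(1, BITS)
                a, b = parents[i, cut:].copy(), parents[i + 1, cut:].copy()
                parents[i, cut:], parents[i + 1, cut:] = b, a
        flips = rng.random((pop_size, BITS)) < pm         # bit-flip mutation
        pop = np.where(flips, 1 - parents, parents)
    best = max(pop, key=lambda c: log_lik(decode(c)))
    return decode(best)

print(cga())    # GA estimate; compare with the closed-form MLE 1/mean(data) ~ 0.5
```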