This research aims to evaluate the teaching methods used by intermediate-school Arabic language teachers. To achieve this goal, the researcher followed the descriptive-analytical approach. The research sample was limited to Arabic language teachers at intermediate schools for the academic year (2017-2018). The researcher chose a random sample of (155) teachers, forming (40%) of the original population in Baghdad / Rusafa 1st. The researcher developed an instrument of teaching-method standards consisting of (7) standards with (39) items, each offering three response alternatives. The results showed the need to pay attention to the use of various modern teaching methods. Moreover, the researcher suggested evaluating the teaching methods used by Arabic…
The present study aims to identify the most and least common teaching practices among faculty members at Northern Border University according to brain-based learning theory, as well as to identify the effect of sex, qualification, faculty type, and years of experience on teaching practices. The study sample consisted of (199) participants: 100 males and 99 females. The results revealed that the most common teaching practice in the sample was 'I try to create an environment of encouragement and support within the classroom', with a mean of (4.4623), while the least common was 'I use natural musical sounds to set the students' mood for learning', with a mean of (2.2965). The study results also in…
In this research, a group of gray texture images from the Brodatz database was studied by building a feature database for the images using the gray-level co-occurrence matrix (GLCM), where the distance between pixels was one unit, for four angles (0, 45, 90, 135). The k-means classifier was used to classify the images into groups of classes, from two up to eight classes, for all angles used in the co-occurrence matrix. The distribution of the images across the classes was compared pairwise between methods (projection of one class onto another); the distribution of images was uneven, with one class being dominant. The classification results were studied for all cases using the confusion matrix between every…
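As a concrete illustration (not the paper's code), a GLCM for one distance/angle offset and one Haralick-style feature can be sketched as follows; the tiny 3x3 image, the gray-level count, and the choice of the contrast feature are assumptions for the demo:

```python
# Illustrative sketch only: a normalized GLCM for one offset and a
# Haralick-style contrast feature. Image and parameters are invented.
import numpy as np

def glcm(img, dx, dy, levels):
    """Normalized co-occurrence matrix for offset (dx, dy)."""
    p = np.zeros((levels, levels), dtype=float)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                p[img[y, x], img[y2, x2]] += 1
    total = p.sum()
    return p / total if total else p

def contrast(p):
    """Haralick contrast: sum over i, j of (i - j)^2 * p(i, j)."""
    i, j = np.indices(p.shape)
    return float(((i - j) ** 2 * p).sum())

img = np.array([[0, 0, 1],
                [0, 1, 2],
                [1, 2, 2]])
p0 = glcm(img, dx=1, dy=0, levels=3)  # distance 1, angle 0 degrees
print(round(contrast(p0), 3))         # one entry of the feature vector
```

In this pixel-offset convention, the abstract's four angles (0, 45, 90, 135) would correspond to offsets such as (1, 0), (1, -1), (0, -1), and (-1, -1); stacking such features per image would then form the vectors fed to k-means.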
The research included a comparison between two etchants for etching the CR-39 nuclear track detector, by calculating the bulk etch rate (Vb), one of the track-etching parameters, using two measuring methods (thickness change and mass change). The first etchant, a solution of NaOH in ethanol (NaOH/Ethanol) at varied normalities, at a temperature of (55 °C) and an etching time of (30 min), was compared with the second, a solution of NaOH in water (NaOH/Water) at varied normalities, at (70 °C) and an etching time of (60 min). All detectors were irradiated with (5.48 MeV) α-particles from an 241Am source for (10 min). The results showed that Vb increases with the increase of…
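The two measuring methods named above correspond to the standard bulk-etch-rate relations for a detector etched on both faces, Vb = ΔT / (2t) for thickness loss and Vb = Δm / (2ρAt) for mass loss. A minimal sketch, with all sample values (including the density) invented for illustration:

```python
# Illustrative sketch (values invented): two bulk-etch-rate estimates
# for a track detector etched on both faces.
#   thickness method:  Vb = dT / (2 * t)
#   mass method:       Vb = dm / (2 * rho * A * t)
def vb_thickness(d_thickness_um, t_min):
    """Bulk etch rate (um/min) from total thickness loss in um."""
    return d_thickness_um / (2.0 * t_min)

def vb_mass(d_mass_g, rho_g_cm3, area_cm2, t_min):
    """Bulk etch rate (um/min) from mass loss over one detector face area."""
    cm_per_min = d_mass_g / (2.0 * rho_g_cm3 * area_cm2 * t_min)
    return cm_per_min * 1.0e4  # cm/min -> um/min

print(vb_thickness(3.0, 30.0))          # um/min for a 3 um total loss
print(vb_mass(0.0006, 1.3, 1.0, 30.0))  # um/min for a 0.6 mg loss
```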
This paper analysed the effect of electronic internal auditing (EIA) based on the Control Objectives for Information and Related Technologies (COBIT) framework. Organisations must implement an up-to-date accounting information system (AIS) capable of meeting their auditing requirements. Electronic audit risk (compliance assessment, control assurance, and risk assessment) is a development by Weidenmier and Ramamoorti (2006) to improve the AIS. To fulfil the study's objectives, a questionnaire was prepared and distributed to a sample of 120 employees: financial managers, internal auditors, and workers in the information security departments of the General Company for Electricity D…
Radiation is a form of energy; it is emitted either as particles, such as α-particles and β-particles (the latter including electrons and positrons), or as waves, such as sunlight, X-rays, and γ-rays. Radiation is found everywhere around us and comes from many different natural or man-made sources. In this study, a questionnaire was distributed to people working in the field of X-rays used for medical imaging (X-ray and CT scan) to evaluate the extent of their awareness and knowledge in estimating the harm of ionizing radiation resulting from incorrect use. The questionnaire was distributed to medical clinics in Al-Harithiya, Baghdad, which is considered…
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding (LPC). Signal compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while discarding the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample; these files are very small compared to the original signals. The compression ratio is calculated from the size of th…
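The Levinson-Durbin recursion named in the abstract can be sketched as follows: it solves the LP normal equations from autocorrelations r[0..order], yielding exactly the three quantities the abstract lists (LP coefficients, reflection coefficients, and prediction-error power). The sample autocorrelation values are invented for the demo:

```python
# Sketch of the Levinson-Durbin recursion (not the paper's exact code).
def levinson_durbin(r, order):
    """r: autocorrelation r[0..order]; returns (a, k, err)."""
    a = [1.0] + [0.0] * order
    k_all = []
    err = r[0]
    for i in range(1, order + 1):
        acc = sum(a[j] * r[i - j] for j in range(i))
        k = -acc / err
        k_all.append(k)
        a_new = a[:]
        for j in range(1, i):
            a_new[j] = a[j] + k * a[i - j]  # symmetric coefficient update
        a_new[i] = k
        a = a_new
        err *= 1.0 - k * k                  # error power shrinks each order
    return a, k_all, err

# An AR(1)-like autocorrelation: the order-2 model should zero out a[2].
a, ks, err = levinson_durbin([1.0, 0.5, 0.25], 2)
print(a, err)  # a ~ [1.0, -0.5, 0.0], err ~ 0.75
```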
The research presents an evolutionary algorithm with a functional-annotation-based mutation to identify protein complexes within protein-protein interaction (PPI) networks. Revealing complexes in protein interaction networks is a difficult and fundamental challenge and an important field of research in computational biology. The complex-detection models developed to tackle it depend mostly on topological properties and rarely use the biological properties of PPI networks. This research aims to push the evolutionary algorithm to its maximum by employing gene ontology (GO) annotations to relate proteins through the similarity of their biological information. The outcomes show that the suggested method can be utilized to improve the…
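One simple way to score such annotation-based similarity between two proteins is set overlap of their GO terms; the Jaccard measure and the GO IDs below are assumptions for illustration, not the paper's exact semantic-similarity measure:

```python
# Illustrative sketch: Jaccard overlap of two proteins' GO annotation sets,
# a stand-in for the biological-information similarity the abstract describes.
def go_jaccard(terms_a, terms_b):
    a, b = set(terms_a), set(terms_b)
    union = a | b
    return len(a & b) / len(union) if union else 0.0

sim = go_jaccard({"GO:0005634", "GO:0003677"},
                 {"GO:0005634", "GO:0006351"})
print(round(sim, 3))  # one shared term out of three distinct terms
```

A mutation operator could use such a score to prefer moving a protein into the cluster whose members are most functionally similar to it.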