The present research aims at identifying the difficulties and problems which hamper teachers and educators alike when using the internet for educational purposes. It discusses the benefits of the internet as a source of information, a means of publication, and a communication tool. A random sample of 30 teachers working at schools in Baghdad / Second Risafa was selected. Three of the sample members use the internet for student project plans via internet centers, whereas 16 of them use it for chatting, emailing, and research purposes. The rest of the sample have limited knowledge of the internet. The researcher used the interviewing method to gather data from the sample members; the interview comprised eleven questions requiring their replies. Frequency distribution and percentages were employed to analyze the collected data. Among the conclusions reached is that the difficulties confronting the teachers are computer and internet illiteracy, particularly in teaching science; a lack of internet and computer training courses; the unavailability of computers; a shortage of computer and internet guides and manuals, which, even when available, do not keep pace with ongoing progress; a lack of technical support; continual power failures; the high cost of computers and internet access; anxiety and fear of misusing the internet, which leads to a negative orientation towards internet use; fear of accessing non-educational sites; fear of losing focus when browsing the web; and the teachers' lack of knowledge of other languages such as English. The study makes the following recommendations: the Ministry of Education should provide computers connected to the internet; computer and internet training courses should be held to give teachers the skills they need to use the internet in teaching science; curricula should be prepared that incorporate the use of computers and the internet in most subjects, particularly science; and material and technical support should be provided to schools.
In modern technology, ownership of electronic data is the key to protecting one's privacy and identity from tracking or interference. A new class of identity management system, Digital Identity Management, has therefore been implemented in recent years to act as the holder of identity data, maintaining the holder's privacy and preventing identity theft. However, an overwhelming number of users face two major problems: users who own their data but must hand it over to third-party applications, and users who have no ownership of their data at all. Maintaining these identities is a major challenge today. This paper proposes a system that addresses the problem using blockchain technology for Digital Identity Management systems. Blockchain is a powerful technique
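For illustration, the following minimal Python sketch shows the hash-chaining mechanism that underlies blockchain-based identity records in general; the class names and the choice to store only SHA-256 digests of identity attributes are illustrative assumptions, not the system proposed in the paper.

```python
# Minimal sketch of the hash-chaining idea behind blockchain-based identity
# records. Names (Block, IdentityChain) are illustrative, not from the paper.
import hashlib
import json
import time


class Block:
    def __init__(self, index, record, prev_hash):
        self.index = index
        self.timestamp = time.time()
        self.record = record          # e.g. a digest of an identity attribute
        self.prev_hash = prev_hash
        self.hash = self.compute_hash()

    def compute_hash(self):
        payload = json.dumps(
            {"index": self.index, "timestamp": self.timestamp,
             "record": self.record, "prev_hash": self.prev_hash},
            sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()


class IdentityChain:
    def __init__(self):
        self.blocks = [Block(0, "genesis", "0" * 64)]

    def add_record(self, identity_attribute):
        # Store only a digest of the attribute, never the raw data,
        # so the holder keeps ownership of the underlying identity data.
        digest = hashlib.sha256(identity_attribute.encode()).hexdigest()
        self.blocks.append(Block(len(self.blocks), digest, self.blocks[-1].hash))

    def is_valid(self):
        # Any tampering with a stored record breaks the hash links.
        for prev, cur in zip(self.blocks, self.blocks[1:]):
            if cur.prev_hash != prev.hash or cur.hash != cur.compute_hash():
                return False
        return True


chain = IdentityChain()
chain.add_record("passport:A1234567")
print(chain.is_valid())  # True
```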
With the rapid development of computers and network technologies, the security of information on the internet has become compromised, and many threats may affect the integrity of such information. Many researchers have focused their work on providing solutions to this threat. Machine learning and data mining are widely used in anomaly-detection schemes to decide whether or not a malicious activity is taking place on a network. In this paper, a hierarchical classification scheme for an anomaly-based intrusion detection system is proposed. Two levels of feature selection and classification are used. In the first level, the global feature vector for detecting the basic attacks (DoS, U2R, R2L, and Probe) is selected. In the second level, four local feature vectors
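As a rough illustration of the two-level structure described above, the sketch below trains a level-one classifier on a "global" feature subset to detect the basic attack category, and a separate level-two classifier per category on its "local" features; the synthetic data, random-forest classifiers, and feature indices are assumptions made for illustration, not the paper's configuration or dataset.

```python
# A minimal sketch of a two-level (hierarchical) classifier, assuming
# scikit-learn and synthetic data in place of a real intrusion dataset;
# the feature index lists below are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                      # 20 candidate features
y_cat = rng.integers(0, 5, size=1000)                # 0=normal, 1..4 = DoS/U2R/R2L/Probe
y_sub = rng.integers(0, 3, size=1000)                # finer label within each attack category

global_idx = [0, 1, 2, 3, 4, 5]                      # level-1 "global" feature vector
local_idx = {1: [6, 7, 8], 2: [9, 10, 11],           # level-2 "local" feature vectors,
             3: [12, 13, 14], 4: [15, 16, 17]}       # one subset per basic attack class

# Level 1: detect the basic attack category from the global features.
level1 = RandomForestClassifier(random_state=0).fit(X[:, global_idx], y_cat)

# Level 2: one classifier per attack category, trained on its local features.
level2 = {}
for cat, idx in local_idx.items():
    mask = y_cat == cat
    level2[cat] = RandomForestClassifier(random_state=0).fit(X[mask][:, idx], y_sub[mask])

def predict(x):
    cat = int(level1.predict(x[None, global_idx])[0])
    if cat == 0:
        return "normal"
    sub = int(level2[cat].predict(x[None, local_idx[cat]])[0])
    return f"attack category {cat}, subtype {sub}"

print(predict(X[0]))
```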
Ultimate oil recovery and displacement efficiency at the pore scale are controlled by rock wettability; there is therefore a growing interest in the wetting behaviour of reservoir rocks, as production from fractured oil-wet or mixed-wet limestone formations has remained a key challenge. Conventional waterflooding methods are inefficient in such formations due to poor spontaneous imbibition of water into the oil-wet rock capillaries. However, altering the wettability to water-wet could yield significant amounts of additional oil; this study therefore investigates the influence of nanoparticles on wettability alteration. The efficiency of various formulated zirconium-oxide (ZrO2) based nanofluids at different nanoparticle concentrations (0
Image steganography is undoubtedly significant in the field of secure multimedia communication. Undetectability and high payload capacity are two of the important characteristics of any form of steganography. In this paper, the level of image security is improved by combining steganography and cryptography techniques in order to produce a secured image. The proposed method depends on using LSBs as an indicator for hiding encrypted bits in dual-tree complex wavelet transform (DT-CWT) coefficients. The cover image is divided into non-overlapping blocks of size 3×3. After that, a key is produced by extracting the center pixel (pc) from each block to encrypt each character in the secret text. The cover image is converted using DT-CWT, then the p
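The sketch below illustrates only the key-generation step described above: the cover image is split into non-overlapping 3×3 blocks and the center pixels are used to encrypt the secret text. The grayscale NumPy cover, the simple XOR cipher, and the omission of the DT-CWT embedding step are assumptions made for brevity, not the paper's exact scheme.

```python
# Sketch of the center-pixel key-generation step, assuming a grayscale
# cover image stored as a NumPy array; XOR stands in for the actual cipher.
import numpy as np

cover = np.random.default_rng(1).integers(0, 256, size=(9, 9), dtype=np.uint8)
secret = "hello"

# Split the cover into non-overlapping 3x3 blocks and take each center pixel.
h, w = cover.shape
centers = [cover[r + 1, c + 1]
           for r in range(0, h - h % 3, 3)
           for c in range(0, w - w % 3, 3)]

# Use the center-pixel key stream to encrypt each character of the secret text.
cipher = bytes(ord(ch) ^ int(centers[i % len(centers)])
               for i, ch in enumerate(secret))

# Decryption reverses the same operation, given the cover image.
plain = "".join(chr(b ^ int(centers[i % len(centers)]))
                for i, b in enumerate(cipher))
assert plain == secret
```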
Numeral recognition is considered an essential preliminary step for optical character recognition, document understanding, and other tasks. Although several handwritten numeral recognition algorithms have been proposed so far, achieving adequate recognition accuracy and execution time remains challenging to date. In particular, recognition accuracy depends on the feature extraction mechanism. As such, a fast and robust numeral recognition method is essential, one that meets the desired accuracy by extracting features efficiently while maintaining a fast execution time. Furthermore, to date most existing studies have focused on evaluating their methods in clean environments, thus limiting understanding of their potential a
Microalgae have been increasingly used for wastewater treatment due to their capacity to assimilate nutrients. Samples of wastewater were taken from the Erbil wastewater channel near Dhahibha village in northern Iraq. The microalga Coelastrella sp. was used at three doses (0.2, 1, and 2 g L-1) in this experiment for 21 days, and samples were periodically (every 3 days) analyzed for physicochemical parameters such as pH, EC, phosphate, nitrate, and BOD5, in addition to chlorophyll a concentration. Results showed that the highest dose, 2 g L-1, was the most effective for removing nutrients, confirmed by significant differences (p≤0.05) between all doses. The highest removal percentage was
Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is based on different signal preprocessing techniques; therefore, developing efficient techniques is essential to achieving fast and reliable processing. Various signal preprocessing operations have been used in computer vision, including smoothing, signal analysis, resizing, sharpening, and enhancement, to reduce unwanted distortions, support segmentation, and improve image features. For example, to reduce the noise in a disturbed signal, smoothing kernels can be used effectively. This is achieved by convolving the disturbed signal with smoothing kernels. In addition, orthogonal moments (OMs) are a crucial
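As a concrete example of the smoothing operation mentioned above, the following sketch convolves a noisy one-dimensional signal with a moving-average kernel and a Gaussian kernel using NumPy; the signal and kernel sizes are arbitrary illustrative choices.

```python
# Minimal example of smoothing a disturbed signal by convolving it with
# smoothing kernels (moving average and Gaussian), using only NumPy.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.normal(size=t.size)  # disturbed signal

# Moving-average kernel: each sample is replaced by the mean of its neighbours.
box = np.ones(11) / 11
smoothed_box = np.convolve(signal, box, mode="same")

# Gaussian kernel: nearby samples are weighted more heavily than distant ones.
x = np.arange(-15, 16)
gauss = np.exp(-x**2 / (2 * 3.0**2))
gauss /= gauss.sum()
smoothed_gauss = np.convolve(signal, gauss, mode="same")

print(signal.std(), smoothed_box.std(), smoothed_gauss.std())
```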
ECG is an important tool for the primary diagnosis of heart diseases, as it shows the electrophysiology of the heart. In our method, a single maternal abdominal ECG signal is taken as the input signal, and the maternal P-QRS-T complexes of the original signal are averaged and repeated to form a reference signal. LMS and RLS adaptive filter algorithms are then applied. The results showed that the fetal ECGs were successfully detected. The accuracy on the Daisy database was up to 84% for LMS and 88% for RLS, while on PhysioNet it reached 98% and 96% for LMS and RLS, respectively.
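A minimal LMS adaptive-cancellation sketch is shown below, using synthetic stand-ins for the abdominal recording and the averaged maternal template; the filter length, step size, and signal models are illustrative assumptions, not the parameters used in the study.

```python
# LMS adaptive noise cancellation sketch: the maternal template is the
# reference input, the abdominal recording is the primary input, and the
# residual approximates the fetal ECG. Signals here are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
t = np.arange(n)
maternal = np.sin(2 * np.pi * t / 200)            # stand-in for the averaged maternal complexes
fetal = 0.2 * np.sin(2 * np.pi * t / 80)          # stand-in for the fetal component
abdominal = 1.5 * maternal + fetal + 0.05 * rng.normal(size=n)  # recorded abdominal signal

order, mu = 8, 0.01                               # filter length and LMS step size
w = np.zeros(order)
fetal_est = np.zeros(n)
for i in range(order, n):
    x = maternal[i - order:i][::-1]               # reference tap vector
    y = w @ x                                     # estimate of the maternal part
    e = abdominal[i] - y                          # residual ~ fetal ECG + noise
    w += 2 * mu * e * x                           # LMS weight update
    fetal_est[i] = e

print(np.corrcoef(fetal_est[order:], fetal[order:])[0, 1])
```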
Bayesian models are commonly used in recent research across many scientific fields. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution of the Bayesian model. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500. The procedure was also extended to a real dataset, the rock intensity dataset. The actual dataset is collected
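The sketch below shows a Gibbs sampler of this general form for a Bayesian linear regression model, with a multivariate normal prior on the coefficients and an inverse-gamma prior on the error variance; the regression model, prior hyperparameters, and simulated data are assumptions for illustration rather than the paper's exact specification.

```python
# Gibbs sampler sketch for Bayesian linear regression with a multivariate
# normal prior on beta and an inverse-gamma prior on the error variance,
# run on simulated data (not the paper's rock intensity dataset).
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.8, size=n)

# Priors: beta ~ N(m0, V0), sigma^2 ~ Inverse-Gamma(a0, b0)
m0, V0_inv = np.zeros(p), np.eye(p) / 100.0
a0, b0 = 2.0, 1.0

beta, sigma2 = np.zeros(p), 1.0
draws = []
for it in range(5000):
    # Full conditional of beta: multivariate normal
    Vn = np.linalg.inv(V0_inv + X.T @ X / sigma2)
    mn = Vn @ (V0_inv @ m0 + X.T @ y / sigma2)
    beta = rng.multivariate_normal(mn, Vn)
    # Full conditional of sigma^2: inverse gamma
    resid = y - X @ beta
    an = a0 + n / 2.0
    bn = b0 + resid @ resid / 2.0
    sigma2 = 1.0 / rng.gamma(an, 1.0 / bn)
    if it >= 1000:                     # discard burn-in draws
        draws.append(np.append(beta, sigma2))

post = np.array(draws)
print("posterior means:", post.mean(axis=0))
```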