Corpus linguistics is a methodology for studying language through corpus-based research. It differs from the traditional (prescriptive) approach to language study in its insistence on the systematic study of authentic examples of language in use (the descriptive approach). A "corpus" is a large, machine-readable, structured collection of naturally occurring linguistic data, either written texts or transcriptions of recorded speech, which can serve as a starting point for linguistic description or as a means of verifying hypotheses about a language. In the past decade, interest in the use of language corpora for language education has grown tremendously. The ways in which corpora have been employed in language pedago
The need for steganography methods that hide a secret message inside an image has grown. This study therefore develops a practical steganography procedure for hiding text in an image. The operation allows the user to provide the system with both a text and a cover image, and to obtain a resulting image that contains the hidden text. The suggested technique hides the text inside the header formats of a digital image. The Least Significant Bit (LSB) method is used to hide the message or text in order to preserve the features and characteristics of the original image. A new method is applied that uses the whole image (header formats) to hide the text. The experimental results show that the suggested technique gives a higher embe
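As background to the abstract above, LSB embedding can be sketched as replacing the least significant bit of successive pixel values with the message bits. This is a minimal illustrative sketch of the general LSB idea, not the paper's specific header-based procedure; the function names and the null-byte terminator are assumptions.

```python
def embed_lsb(pixels, message):
    """Hide message bytes in the least significant bits of a flat pixel list."""
    data = message.encode() + b"\x00"  # null terminator marks end of message (assumption)
    bits = []
    for byte in data:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))  # MSB first
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit  # overwrite only the LSB
    return stego

def extract_lsb(pixels):
    """Recover the message by reading LSBs until the null terminator."""
    out, byte = bytearray(), 0
    for i, p in enumerate(pixels):
        byte = (byte << 1) | (p & 1)
        if i % 8 == 7:
            if byte == 0:  # terminator reached
                break
            out.append(byte)
            byte = 0
    return out.decode()
```

Because only the lowest bit of each pixel changes, the stego image differs from the cover by at most 1 in any pixel value, which is why LSB embedding preserves the visual characteristics of the original image.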
Cloud computing provides a huge amount of storage space for data, but as the number of users and the size of their data increase, cloud storage environments face serious problems such as saving storage space, managing this large volume of data, and ensuring data security and privacy. One of the important methods for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of the data and eliminates the extra copies. To offer security and privacy for sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of
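The deduplication mechanism the abstract builds on can be sketched as block-level, hash-keyed storage: each unique block is stored once and files are just lists of block hashes. This is an illustrative sketch, not the paper's actual system; the class and method names are invented. It also hints at why the abstract's attacks work: when a short hash alone stands in for possession of the whole block, knowing the hash can be enough to claim the data.

```python
import hashlib

class DedupStore:
    """Toy block-level deduplicating store (hypothetical names, for illustration only)."""

    def __init__(self):
        self.blocks = {}  # SHA-256 hex digest -> block bytes (each unique block stored once)
        self.files = {}   # filename -> ordered list of block digests

    def put(self, name, data, block_size=4096):
        """Split data into fixed-size blocks; store only blocks not seen before."""
        digests = []
        for i in range(0, len(data), block_size):
            block = data[i:i + block_size]
            h = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(h, block)  # duplicate blocks are not stored again
            digests.append(h)
        self.files[name] = digests

    def get(self, name):
        """Reassemble a file from its block digests."""
        return b"".join(self.blocks[h] for h in self.files[name])
```

Storing two identical files consumes the space of one, which is the space saving deduplication provides; securing this scheme against hash-only ownership claims is the problem the paper addresses.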
In this paper, we derive an estimator of the reliability function for the two-parameter Laplace distribution using the Bayes method with a squared-error loss function, Jeffreys' prior, and the conditional probability of the observed random variable. The main objective of this study is to compare the efficiency of the derived Bayesian estimator with the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood estimator and the moment estimator at all sample sizes.
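For reference, under the usual parameterization of the two-parameter Laplace distribution (location $\mu$, scale $b$; the abstract does not state its notation, so this parameterization is an assumption), the reliability function being estimated is:

```latex
f(t;\mu,b) = \frac{1}{2b}\exp\!\left(-\frac{|t-\mu|}{b}\right),
\qquad
R(t) = 1 - F(t) =
\begin{cases}
1 - \tfrac{1}{2}\exp\!\left(\dfrac{t-\mu}{b}\right), & t < \mu,\\[4pt]
\tfrac{1}{2}\exp\!\left(-\dfrac{t-\mu}{b}\right), & t \ge \mu.
\end{cases}
```

Under squared-error loss, the Bayes estimator of $R(t)$ is the posterior mean $E[R(t)\mid \text{data}]$, which is the quantity the simulation study compares against the maximum likelihood and moment estimators.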
Cassava, a significant crop in Africa, Asia, and South America, is a staple food for millions. However, classifying cassava species using conventional color, texture, and shape features is inefficient, as cassava leaves exhibit similarities across different types, including toxic and non-toxic varieties. This research aims to overcome the limitations of traditional classification methods by employing deep learning techniques with a pre-trained AlexNet as the feature extractor to accurately classify four types of cassava: Gajah, Manggu, Kapok, and Beracun. The dataset was collected from local farms in Lamongan, Indonesia, with the help of agricultural research experts; it consists of 1,400 images, and each type of cassava has
The nature of the dark sector of the Universe remains one of the outstanding problems in modern cosmology, with the search for new observational probes guiding the development of the next generation of observational facilities. Clues come from the tension between the predictions of Λ cold dark matter (ΛCDM) and observations of gravitationally lensed galaxies. Previous studies showed that galaxy clusters in ΛCDM are not strong enough lenses to reproduce the observed number of lensed arcs. This work aims to constrain warm dark matter (WDM) cosmologies by means of the lensing efficiency of galaxy clusters drawn from these alternative models. The lensing characteristics of two samples of simulated clusters in the Λ warm dark matter and ΛCDM
Five serological methods for the detection of Brucella were compared in this study. Four of the methods are commonly used for detection:
1. Rose Bengal: a primary screening test that detects antibodies in the blood serum.
2. IFAT: detects IgG and IgM antibodies in the serum.
3. ELISA: detects IgG antibodies in the serum.
4. 2ME: detects IgG antibodies.
The fifth method was developed by a researcher in one of the health centers in Baghdad and was named the Spot Immune Assay (SIA). The results show that among 100 patient blood samples, 76, 49, 49, 37, and 28 samples were positive by the Rose Bengal, ELISA, SIA, 2ME, and IFAT tests, respectively. When efficiency, sensitivity, and specific
The current study aims to:
1st: To identify the level of alexithymia among 6th-grade students (the study sample) through the following null hypotheses:
1. There are no statistically significant differences at the (0.05) level between the arithmetic mean of the sample scores as a whole and the hypothetical mean on the scale of lack of emotional expression.
2. There are no statistically significant differences at the (0.05) level between the arithmetic mean of the male students and the arithmetic mean of the female students on the alexithymia scale.
2nd: To identify the level of emotional intelligence among 6th-grade students (the study sample) through the following null hypotheses:
1) There are no statistically si
The Tor (The Onion Routing) network was designed to enable users to browse the Internet anonymously. It is known for its anonymity and privacy protections against the many agents who wish to observe users' locations or track their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic must be encrypted and decrypted before the sending and receiving process, which leads to delay and even interruption in the data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes del