In the field of data security, preserving sensitive information during its transmission over public channels is a critical challenge. Steganography, a method that conceals data within carrier objects such as text, has been proposed to address this challenge. Text, owing to its extensive usage and constrained bandwidth, stands out as an optimal medium for this purpose. Despite the richness of the Arabic language in its linguistic features, only a small number of studies have explored Arabic text steganography. Arabic script offers distinctive attributes, including complex character shapes, diacritical marks, and ligatures, that make it a promising domain for effectively protecting information. In this work, we propose a new text steganography method based on Arabic language characteristics, with two levels of security: Arabic encoding and word shifting. In the first step, a new Arabic encoding mapping table is built to convert an English plaintext into Arabic characters; a word-shifting process is then applied to add an authentication phase for the sent message and a further level of security to the resulting ciphertext. The proposed method achieved processing times of 0.15 ms, 1.0033 ms, 2.331 ms, and 5.22 ms for file sizes of 1 kB, 3 kB, 5 kB, and 10 kB, respectively.
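The two-level scheme described above can be sketched as follows. This is a minimal illustration only: the mapping table (a simple alphabetical assignment into the Unicode Arabic block) and the word-rotation rule are hypothetical placeholders, since the paper's actual encoding table and shifting scheme are not reproduced in the abstract.

```python
# Level 1 (hypothetical table): map each English letter to an Arabic character
# in the Unicode Arabic block starting at U+0621.
ENCODE = {c: chr(0x0621 + i) for i, c in enumerate("abcdefghijklmnopqrstuvwxyz")}
DECODE = {v: k for k, v in ENCODE.items()}

def encode_plaintext(text: str) -> str:
    """Level 1: convert English plaintext to Arabic characters (spaces kept)."""
    return "".join(ENCODE.get(c, c) for c in text.lower())

def word_shift(cipher: str, key: int) -> str:
    """Level 2 (illustrative): rotate the word order by `key` positions,
    serving as a simple shared-key authentication step."""
    words = cipher.split(" ")
    k = key % len(words)
    return " ".join(words[k:] + words[:k])

# Hide, then recover, a two-word message with key = 1.
stego = word_shift(encode_plaintext("hidden message"), key=1)
n_words = len(stego.split(" "))
recovered = word_shift(stego, key=n_words - 1)      # inverse rotation
plain = "".join(DECODE.get(c, c) for c in recovered)
```

A receiver who knows the mapping table but not the shift key would recover the characters in the wrong word order, which is what gives the second level its authentication role.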
Alzheimer’s disease (AD) is an age-related, progressive neurodegenerative disorder characterized by loss of memory and cognitive decline. It is the main cause of disability among older people. The rapid increase in the number of people living with AD and other forms of dementia due to the aging population represents a major challenge to health and social care systems worldwide. Degeneration of brain cells due to AD starts many years before the clinical manifestations become clear. Early diagnosis of AD will contribute to the development of effective treatments that could slow, stop, or prevent significant cognitive decline. Consequently, early diagnosis of AD may also be valuable in detecting patients with dementia who have n
This study assessed the advantage of using earthworms in combination with punch waste and nutrients in remediating drill cuttings contaminated with hydrocarbons. Analyses were performed on days 0, 7, 14, 21, and 28 of the experiment. Two hydrocarbon concentrations (20000 mg/kg and 40000 mg/kg) were tested against three earthworm group sizes: five, ten, and twenty earthworms. After 28 days, the total petroleum hydrocarbon (TPH) concentration of 20000 mg/kg was reduced to 13200 mg/kg, 9800 mg/kg, and 6300 mg/kg in the treatments with five, ten, and twenty earthworms, respectively. Likewise, the TPH concentration of 40000 mg/kg was reduced to 22000 mg/kg, 10100 mg/kg, and 4200 mg/kg in the treatments with the same numbers of earthworms, respectively. The p
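The reported concentrations can be restated as percent TPH removal, which makes the dose-response trend across earthworm group sizes easier to compare. A quick computation from the figures above:

```python
# Percent TPH removal after 28 days, computed from the reported concentrations.
def percent_removal(initial: float, final: float) -> float:
    return 100.0 * (initial - final) / initial

results_20k = {5: 13200, 10: 9800, 20: 6300}   # earthworms -> final TPH (mg/kg)
results_40k = {5: 22000, 10: 10100, 20: 4200}

for worms, final in results_20k.items():
    print(f"20000 mg/kg, {worms} worms: {percent_removal(20000, final):.1f}% removed")
for worms, final in results_40k.items():
    print(f"40000 mg/kg, {worms} worms: {percent_removal(40000, final):.1f}% removed")
# 20000 mg/kg: 34.0%, 51.0%, 68.5%; 40000 mg/kg: 45.0%, 74.8%, 89.5%
```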
Image retrieval is used to search for images in an image database. In this paper, content-based image retrieval (CBIR) using four feature extraction techniques has been achieved. The four techniques are the color histogram features technique, the properties features technique, the gray-level co-occurrence matrix (GLCM) statistical features technique, and a hybrid technique. The features are extracted from the database images and the query (test) images in order to compute a similarity measure. Similarity-based matching is very important in CBIR, so three types of similarity measure are used: normalized Mahalanobis distance, Euclidean distance, and Manhattan distance. A comparison between them has been implemented. From the results, it is conclud
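The three distance measures named above can be sketched as plain functions over feature vectors. Note the Mahalanobis variant here uses a simplified diagonal-covariance (per-feature variance) form for illustration; the full normalized Mahalanobis distance uses the inverse covariance matrix of the feature database.

```python
import math

def euclidean(a, b):
    """L2 distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    """L1 (city-block) distance between two feature vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

def normalized_mahalanobis(a, b, variances):
    """Diagonal-covariance Mahalanobis distance: each squared difference
    is scaled by that feature's variance across the database."""
    return math.sqrt(sum((x - y) ** 2 / v for x, y, v in zip(a, b, variances)))

# Illustrative query vs. database feature vectors.
query, db_img = [0.2, 0.5, 0.1], [0.3, 0.4, 0.2]
print(euclidean(query, db_img), manhattan(query, db_img))
```

In a CBIR loop, the query's distance to every database image is computed and the smallest distances determine the retrieved matches.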
Plagiarism is becoming more of a problem in academia. It is made worse by the ease with which a wide range of resources can be found on the internet, as well as the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has "taken" and presented the work of others as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is impractical for anyone to compare a piece of work against all existing material. Plagiarism is a big problem in higher education, and it can happen on any topic. Plagiarism detection has been studied in many scientific articles, and methods for recognition have been created utilizing the Plagiarism analysis, Authorship identification, and
In this paper, we implement and examine a Simulink model that uses electroencephalography (EEG) to control multiple actuators based on brain waves. This will be in great demand, since it is useful for individuals who are unable to operate control units that require direct physical contact. Initially, ten volunteers across a wide age range (20 to 66 years) participated in this study, and the statistical measures were first calculated for all eight channels. The number of channels was then reduced by half according to the activation of brain regions within the utilized protocol, which also decreased the processing time. Consequently, four of the participants (three males and one female) were chosen to examine the Simulink model during di
Biomarkers to detect Alzheimer’s disease (AD) would enable patients to gain access to appropriate services and may facilitate the development of new therapies. Given the large numbers of people affected by AD, there is a need for a low-cost, easy to use method to detect AD patients. Potentially, the electroencephalogram (EEG) can play a valuable role in this, but at present no single EEG biomarker is robust enough for use in practice. This study aims to provide a methodological framework for the development of robust EEG biomarkers to detect AD with a clinically acceptable performance by exploiting the combined strengths of key biomarkers. A large number of existing and novel EEG biomarkers associated with slowing of EEG, reductio
In this paper, a least-squares group finite element method for solving the coupled Burgers' problem in 2-D is presented. A fully discrete formulation of the least-squares finite element method is analyzed: the backward-Euler scheme is used for the time variable, and the spatial discretization uses biquadratic quadrangular elements with nine nodes per element. The continuity, ellipticity, stability condition, and error estimate of the least-squares group finite element method are proved. The theoretical results show that the error estimate of this method is . The numerical results are compared with the exact solution and other available literature for the convection-dominated case to illustrate the effic
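For orientation, the backward-Euler time discretization named above, applied to the 2-D coupled Burgers' system, takes the standard fully implicit form below. This is a sketch of the generic scheme only; the paper's exact least-squares group formulation is not reproduced in the abstract.

$$\frac{u^{n+1}-u^{n}}{\Delta t} + u^{n+1}u_x^{n+1} + v^{n+1}u_y^{n+1} = \frac{1}{Re}\left(u_{xx}^{n+1}+u_{yy}^{n+1}\right),$$

$$\frac{v^{n+1}-v^{n}}{\Delta t} + u^{n+1}v_x^{n+1} + v^{n+1}v_y^{n+1} = \frac{1}{Re}\left(v_{xx}^{n+1}+v_{yy}^{n+1}\right),$$

where $(u, v)$ is the velocity field, $\Delta t$ the time step, and $Re$ the Reynolds number; the spatial terms are then discretized with the biquadratic nine-node elements described above.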
A modified version of the Generalized Standard Addition Method (GSAM) was developed. This modified version was used for the quantitative determination of arginine (Arg) and glycine (Gly) in an arginine acetylsalicylate-glycine complex. According to this method, two linear equations were solved to obtain the amounts of Arg and Gly. The first equation was obtained by spectrophotometric measurement of the total absorbance of the Arg and Gly colored complexes with ninhydrin. The second equation was obtained by measuring the total acid consumed by the total amino groups of Arg and Gly. The titration was carried out in non-aqueous media using perchloric acid in glacial acetic acid as the titrant. The developed metho
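The two measurements described above yield a 2x2 linear system in the two unknown amounts, which can be solved directly by Cramer's rule. The coefficients below are purely illustrative placeholders (the paper's calibration values are not given in the abstract): a1, b1 stand for the per-unit absorbance responses of Arg and Gly with ninhydrin, and a2, b2 for their per-unit acid consumption in the titration.

```python
# Solve the two linear equations for the amounts of arginine (x) and glycine (y):
#   a1*x + b1*y = r1   (total ninhydrin-complex absorbance)
#   a2*x + b2*y = r2   (total perchloric acid consumed)
def solve_2x2(a1, b1, a2, b2, r1, r2):
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("the two measurements are not independent")
    x = (r1 * b2 - r2 * b1) / det   # amount of Arg
    y = (a1 * r2 - a2 * r1) / det   # amount of Gly
    return x, y

# Hypothetical coefficients and totals, for illustration only.
arg, gly = solve_2x2(0.8, 0.5, 1.2, 0.6, 1.3, 1.8)
```

The method requires the determinant to be nonzero, i.e. the two analytes must respond differently enough across the two measurements for the system to be solvable.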
Spent hydrodesulfurization (Co-Mo/γ-Al2O3) catalyst generally contains valuable metals such as molybdenum (Mo), cobalt (Co), and aluminium (Al) on a supporting material such as γ-Al2O3. In the present study, a two-stage alkali/acid leaching process was conducted to study the leaching of cobalt, molybdenum, and aluminium from the Co-Mo/γ-Al2O3 catalyst. The acid leaching of the spent catalyst, previously treated with an alkali solution to remove molybdenum, yielded a solution rich in cobalt and aluminium.