In this paper, an algorithm for binary codebook design is used in a vector quantization technique to improve the performance of the absolute moment block truncation coding (AMBTC) method. Vector quantization (VQ) is used to compress the bitmap output produced by the first stage (AMBTC). The binary codebook is generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress the bitmaps of all these images. Whether an image's bitmap is compressed with this codebook is decided by the criterion of the average bitmap replacement error (ABPRE). The proposed approach reduces bit rates (increases compression ratios) with little reduction in performance (PSNR).
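As a rough illustration of the idea, the sketch below builds a binary codebook by random selection, encodes bitmap blocks by Hamming-nearest codeword, and computes an ABPRE-style error. The 4x4 block size, codebook size, and function names are assumptions for illustration, not the paper's specification.

```python
import numpy as np

def build_codebook(bitmap_blocks, codebook_size, seed=0):
    """Randomly pick binary code vectors from a pool of bitmap blocks."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(bitmap_blocks), size=codebook_size, replace=False)
    return bitmap_blocks[idx]

def encode_bitmap(blocks, codebook):
    """Replace each binary block with the index of the Hamming-nearest codeword."""
    # Hamming distance between every block and every codeword
    dists = (blocks[:, None, :] != codebook[None, :, :]).sum(axis=2)
    return dists.argmin(axis=1)

def abpre(blocks, codebook, indices):
    """Average bitmap replacement error: mean fraction of flipped bits per block."""
    recon = codebook[indices]
    return (blocks != recon).mean()

# Toy demo with hypothetical 4x4 AMBTC bitmap blocks flattened to 16-bit vectors.
rng = np.random.default_rng(1)
blocks = rng.integers(0, 2, size=(1000, 16))
cb = build_codebook(blocks, codebook_size=64)
idx = encode_bitmap(blocks, cb)
print("ABPRE:", abpre(blocks, cb, idx))  # low ABPRE -> codebook suits this bitmap
```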
In this paper, the electrical performance of a pentacene-based top-contact bottom-gate (TCBG) organic field-effect transistor (OFET) model with polymethyl methacrylate (PMMA) and silicon nitride (Si3N4) as gate dielectrics was simulated. The effects of gate dielectric thickness on device performance were investigated. The thickness of the two gate dielectric materials was varied in the range of 100-200 nm to maintain a large current density and stable performance. MATLAB simulations of the model produced output and transfer characteristics for the drain current and the transconductance. A layer thickness of 200 nm may result in gate leakage current, which points to the requirement of optimizing the thickness…
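A minimal sketch of the kind of calculation involved, using the standard square-law transistor equations with C_i = eps0 * eps_r / t. All parameter values (mobility, W/L, threshold, PMMA permittivity) are illustrative assumptions, not the paper's:

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def c_insulator(eps_r, t_diel):
    """Gate dielectric capacitance per unit area: C_i = eps0 * eps_r / t."""
    return EPS0 * eps_r / t_diel

def drain_current(vgs, vds, mu, ci, w, l, vt):
    """Square-law drain current using voltage magnitudes (pentacene is p-type,
    so the physical polarities are negative)."""
    vov = np.maximum(vgs - vt, 0.0)   # gate overdrive
    vde = np.minimum(vds, vov)        # effective drain voltage (saturation clamp)
    return mu * ci * (w / l) * (vov - vde / 2.0) * vde

# Illustrative values (assumptions): mobility 0.5 cm^2/Vs, W/L = 1000/50 um, |Vt| = 2 V.
mu, w, l, vt = 0.5e-4, 1000e-6, 50e-6, 2.0
vgs = np.linspace(0.0, 20.0, 201)
for t in (100e-9, 150e-9, 200e-9):    # gate dielectric thickness sweep, PMMA eps_r ~ 3.6
    ci = c_insulator(3.6, t)
    i_d = drain_current(vgs, 20.0, mu, ci, w, l, vt)   # transfer curve at |Vds| = 20 V
    gm = np.gradient(i_d, vgs)                         # transconductance dId/dVgs
    print(f"t = {t*1e9:.0f} nm: C_i = {ci*1e5:.1f} nF/cm^2, peak gm = {gm.max():.2e} S")
```

Thinner dielectrics give a larger C_i and hence larger drain current and transconductance at the same bias, which is why the thickness sweep matters.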
Background: As a multifactorial disorder, temporomandibular disorder (TMD) is difficult to diagnose, and multiple factors affect the joint and cause the disorder. Standardized clinical diagnosis of TMD should be used to reach a definite clinical diagnosis; the condylar bone may degenerate in accordance with these disorders. Aims: To evaluate the correlation between the clinical diagnosis and degenerative condylar changes (flattening, sclerosis, erosion, and osteophyte). Materials and Methods: A prospective study of 97 TMD patients (a total of 194 joints) aged 20 to 50 years. Patients were sent for cone beam computed tomography (CBCT) to assess degenerative condylar changes. Results: No association was found between…
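The abstract does not name the statistical test; a common choice for association between a clinical diagnosis category and the presence of a CBCT finding is the chi-square test. A minimal sketch with purely hypothetical counts, not the study's data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table (illustrative counts only):
# rows = clinical diagnosis subgroup present/absent,
# cols = CBCT degenerative change (e.g., flattening) present/absent.
table = np.array([[42, 55],
                  [38, 59]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")  # p >= 0.05 -> no association
```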
Dam operation and management have become more complex recently because of the need to consider hydraulic structure sustainability and environmental protection. An earthfill dam that includes a powerhouse system is a significant multipurpose hydraulic structure. Understanding the effects of running hydropower plant turbines on the dam body is one of the major safety concerns for earthfill dams. In this research, dynamic analysis of an earthfill dam integrated with a hydropower plant system containing six vertical Kaplan turbines (i.e., Haditha dam) is investigated. In the first stage of the study, ANSYS-CFX was used to represent one vertical Kaplan turbine unit by designing a three-dimensional (3-D) finite element (FE)…
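The study itself uses a 3-D FE model in ANSYS-CFX; as a much simpler stand-in for the dynamic-analysis idea, one can integrate a single-degree-of-freedom model of a dam section under harmonic turbine-induced forcing. Every parameter below is hypothetical, not Haditha dam's:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy single-degree-of-freedom stand-in for a dam section under harmonic
# turbine-induced forcing; all values are illustrative assumptions.
m, c, k = 5e6, 2e6, 8e8            # mass (kg), damping (N*s/m), stiffness (N/m)
f0, omega = 1e5, 2 * np.pi * 1.9   # forcing amplitude (N) and frequency (rad/s)

def rhs(t, y):
    x, v = y
    return [v, (f0 * np.sin(omega * t) - c * v - k * x) / m]

sol = solve_ivp(rhs, (0.0, 60.0), [0.0, 0.0], max_step=0.01)
print("peak displacement (m):", np.abs(sol.y[0]).max())
```

With these numbers the natural frequency (about 2 Hz) sits close to the forcing frequency, illustrating why turbine-operation frequencies must be checked against the dam body's modal response.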
Permeability estimation is a vital step in reservoir engineering because of its effect on reservoir characterization, planning for perforations, and the economic efficiency of reservoirs. Core data and well-log data are the main sources for measuring and calculating permeability, respectively. There are multiple methods to predict permeability, such as classical, empirical, and geostatistical methods. In this research, two statistical approaches have been applied and compared for permeability prediction: Multiple Linear Regression and Random Forest, for the (M) reservoir interval in the (BH) Oil Field in the northern part of Iraq. The dataset was separated into two subsets, training and testing, in order to cross-validate the accuracy…
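A minimal sketch of the comparison, with synthetic stand-ins for the well-log features and core permeability (the real (M)-interval data and the paper's exact split are not reproduced here). Log-permeability is regressed because permeability is roughly log-normal:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical well-log features (e.g., porosity, gamma ray, resistivity)
# and core permeability labels; replace with the real dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = np.exp(1.5 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=500))  # perm, mD

# Train/test split; an 80/20 split is a common choice (assumed here).
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for name, model in [("MLR", LinearRegression()),
                    ("Random Forest", RandomForestRegressor(n_estimators=200,
                                                            random_state=0))]:
    model.fit(X_tr, np.log(y_tr))                 # fit on log-permeability
    r2 = r2_score(np.log(y_te), model.predict(X_te))
    print(f"{name}: test R2 = {r2:.3f}")
```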
This article proposes a new strategy based on a hybrid method that combines the gravitational search algorithm (GSA) with the bat algorithm (BAT) to solve a single-objective optimization problem. It first runs GSA, followed by BAT as the second step. The proposed approach relies on a parameter between 0 and 1 to address the problem of getting trapped in local optima: the lack of a local search mechanism weakens intensification while diversity remains high, so the search easily falls into a local optimum. The improvement retains the speed of the original BAT while accelerating convergence towards the best solution. All solutions in the population are updated before the proposed algorithm terminates. The diversification…
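A compact sketch of the two-phase idea on a benchmark objective. Both phases are heavily simplified, and interpreting the 0-1 parameter as a split of the iteration budget between GSA and BAT is an assumption, not the paper's stated mechanism:

```python
import numpy as np

def sphere(x):                                   # benchmark objective (assumption)
    return np.sum(x**2, axis=-1)

def gsa_phase(f, pop, iters, g0=100.0):
    """Simplified gravitational search: fitness-based masses, force-driven moves."""
    n, d = pop.shape
    vel = np.zeros_like(pop)
    for t in range(iters):
        fit = f(pop)
        g = g0 * np.exp(-20.0 * t / iters)       # decaying gravitational constant
        m = (fit.max() - fit) / (fit.max() - fit.min() + 1e-12)
        m /= m.sum() + 1e-12
        acc = np.zeros_like(pop)
        for i in range(n):                       # total force on agent i
            diff = pop - pop[i]
            dist = np.linalg.norm(diff, axis=1) + 1e-12
            acc[i] = (g * m[:, None] * diff / dist[:, None]
                      * np.random.rand(n, 1)).sum(axis=0)
        vel = np.random.rand(n, d) * vel + acc
        pop = pop + vel
    return pop

def bat_phase(f, pop, iters, fmin=0.0, fmax=2.0, loud=0.9, pulse=0.5):
    """Simplified bat algorithm: frequency-tuned moves toward the current best."""
    n, d = pop.shape
    vel = np.zeros_like(pop)
    best = pop[f(pop).argmin()].copy()
    for _ in range(iters):
        freq = fmin + (fmax - fmin) * np.random.rand(n, 1)
        vel = vel + (pop - best) * freq
        cand = pop + vel
        local = np.random.rand(n) > pulse        # occasional local walk near best
        cand[local] = best + 0.01 * np.random.randn(local.sum(), d)
        better = (f(cand) < f(pop)) & (np.random.rand(n) < loud)
        pop[better] = cand[better]
        best = pop[f(pop).argmin()].copy()
    return best

# Hybrid: alpha in [0, 1] splits the budget between the GSA and BAT stages.
alpha, budget = 0.5, 200
pop = np.random.uniform(-5, 5, size=(30, 10))
pop = gsa_phase(sphere, pop, iters=int(alpha * budget))
best = bat_phase(sphere, pop, iters=budget - int(alpha * budget))
print("best fitness:", sphere(best))
```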
In the field of data security, the critical challenge of preserving sensitive information during its transmission through public channels takes centre stage. Steganography, a method employed to conceal data within various carrier objects such as text, can be applied to address these security challenges. Text, owing to its extensive usage and constrained bandwidth, stands out as an optimal medium for this purpose. Despite the richness of the Arabic language in linguistic features, only a small number of studies have explored Arabic text steganography. Arabic text, characterized by its distinctive script and linguistic features, has gained notable attention as a promising domain for steganographic ventures. Arabic text steganography harnesses…
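One widely cited Arabic-text technique (not necessarily the paper's) is kashida insertion: the elongation character U+0640 is added after letters that join to the following letter, and its presence or absence at each eligible position encodes one bit. A minimal sketch with a deliberately simplified joining rule:

```python
# Kashida (tatweel, U+0640) steganography sketch; the joining rule below is
# deliberately simplified and is not the paper's actual scheme.
KASHIDA = "\u0640"
JOINING = set("بتثجحخسشصضطظعغفقكلمنهي")  # forward-joining letters (not exhaustive)

def is_arabic(ch):
    return "\u0600" <= ch <= "\u06FF"

def is_slot(text, i):
    """Carrier slot: a forward-joining letter followed by another Arabic letter."""
    return text[i] in JOINING and i + 1 < len(text) and is_arabic(text[i + 1])

def embed(cover, bits):
    out, b = [], 0
    for i, ch in enumerate(cover):
        out.append(ch)
        if b < len(bits) and is_slot(cover, i):
            if bits[b] == "1":
                out.append(KASHIDA)  # kashida present -> bit 1; absent -> bit 0
            b += 1
    if b < len(bits):
        raise ValueError("cover text too short for payload")
    return "".join(out)

def extract(stego, nbits):
    bits, i = [], 0
    while i < len(stego) and len(bits) < nbits:
        ch = stego[i]
        nxt = stego[i + 1] if i + 1 < len(stego) else ""
        follower = stego[i + 2] if nxt == KASHIDA and i + 2 < len(stego) else nxt
        if ch in JOINING and follower and is_arabic(follower):
            bits.append("1" if nxt == KASHIDA else "0")
        i += 2 if nxt == KASHIDA else 1
    return "".join(bits)

stego = embed("السلام عليكم ورحمة الله وبركاته", "1011")
print(extract(stego, 4))  # -> 1011
```

Because the kashida only elongates the joining stroke, the stego text stays visually close to the cover, which is the property that makes this family of methods attractive.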
Abstract. Full-waveform airborne laser scanning data has shown its potential to enhance available segmentation and classification approaches through the additional information it can provide. However, this additional information is unable to directly provide a valid physical representation of surface features, because many variables affect the backscattered energy during its travel between the sensor and the target. Effectively, this produces a mismatch between signals from overlapping flightlines. Therefore, direct use of this information is not recommended without the adoption of a comprehensive radiometric calibration strategy that accounts for all these effects. This paper presents a practical and reliable radiometric calibration…
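A minimal sketch of one common amplitude-based calibration form, sigma = C_cal * R^4 * amplitude * echo_width, with the constant derived from a reference target of known cross-section (a Wagner-style approach, assumed here rather than taken from the paper; all numbers are illustrative):

```python
import numpy as np

def cal_constant(ref_sigma, ref_range, ref_amp, ref_width):
    """Solve sigma = C * R^4 * A * w for C using a known reference target."""
    return ref_sigma / (ref_range**4 * ref_amp * ref_width)

def backscatter_cross_section(c_cal, rng, amp, width):
    """Calibrated cross-section per echo, removing the R^4 range dependence."""
    return c_cal * rng**4 * amp * width

# Hypothetical reference target (e.g., an asphalt area of known reflectance).
c = cal_constant(ref_sigma=0.25, ref_range=600.0, ref_amp=120.0, ref_width=4.0)

# Echoes of the same surface seen from two overlapping flightlines at
# different ranges: raw amplitudes disagree, calibrated values should not.
r = np.array([550.0, 720.0])
a = np.array([140.0, 48.0])
w = np.array([4.1, 4.2])
print(backscatter_cross_section(c, r, a, w))
```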
This work implements an electroencephalogram (EEG) signal classifier. The implemented method uses Orthogonal Polynomials (OP) to convert the EEG signal samples to moments. A Sparse Filter (SF) reduces the number of converted moments to increase the classification accuracy. A Support Vector Machine (SVM) is used to classify the reduced moments between two classes. The proposed method's performance is tested and compared with two other methods on two datasets. The datasets are divided into 80% for training and 20% for testing, with 5-fold cross-validation. The results show that this method surpasses the accuracy of the other methods: the proposed method's best accuracies on the two datasets are 95.6% and 99.5%, respectively. Finally, from the results, it…
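A minimal sketch of the pipeline shape: discrete Legendre coefficients stand in for the orthogonal-polynomial moments, and a univariate selector stands in for the paper's sparse filter; the synthetic two-class signals are illustrative only:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def op_moments(epochs, order=20):
    """Project each EEG epoch onto a discrete Legendre basis (one choice of
    orthogonal polynomials) and use the coefficients as moment features."""
    n = epochs.shape[1]
    t = np.linspace(-1, 1, n)
    basis = np.stack([np.polynomial.legendre.Legendre.basis(k)(t)
                      for k in range(order)])            # (order, n)
    basis /= np.linalg.norm(basis, axis=1, keepdims=True)
    return epochs @ basis.T                              # (n_epochs, order)

# Synthetic two-class EEG-like epochs (illustrative, not the paper's datasets).
rng = np.random.default_rng(0)
tt = np.linspace(0, 1, 256)
X0 = np.sin(2 * np.pi * 10 * tt) + rng.normal(0, 1, (100, 256))
X1 = np.sin(2 * np.pi * 22 * tt) + rng.normal(0, 1, (100, 256))
X = op_moments(np.vstack([X0, X1]))
y = np.r_[np.zeros(100), np.ones(100)]

# SelectKBest stands in for the sparse-filter reduction step.
clf = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=8), SVC(kernel="rbf"))
print("5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```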