Image quality plays a vital role in improving and assessing image compression performance. Image compression maps large image data to a new image of smaller size suitable for storage and transmission. This paper evaluates the implementation of hybrid techniques based on the tensor product mixed transform. Compression and quality metrics such as compression ratio (CR), rate-distortion (RD), peak signal-to-noise ratio (PSNR), and structural content (SC) are utilized to evaluate the hybrid techniques. The techniques are then compared according to these metrics to identify the best one. The main contribution is the improvement of the hybrid techniques. The proposed hybrid techniques consist of the discrete wavelet transform (W), multi-wavelet transform (M), and tensor product mixed transform (T) as the 1-level W, M, and T techniques. WT and MT are the 2-level techniques, while WWT, WMT, MWT, and MMT are the 3-level techniques. At each level of each technique, a reconstruction process is applied. Simulation results using MATLAB 2019a indicated that MMT is the best technique, with CR = 1024, R(D) = 4.154, and PSNR = 81.9085. It is also faster than the techniques reported in previous works. Further research might investigate whether this technique can benefit image and speech recognition.
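The metrics named above (CR, PSNR, SC) have standard textbook definitions; a minimal sketch of how they might be computed is shown below. The function names are our own illustration, not the paper's implementation, and peak value 255 assumes 8-bit images.

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")          # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

def compression_ratio(original_bits, compressed_bits):
    """CR: original size divided by compressed size."""
    return original_bits / compressed_bits

def structural_content(original, reconstructed):
    """SC: ratio of the energy of the original image to that of the reconstruction."""
    o = original.astype(np.float64)
    r = reconstructed.astype(np.float64)
    return np.sum(o ** 2) / np.sum(r ** 2)
```

With these definitions, a perfect reconstruction gives infinite PSNR and SC equal to 1.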
A perturbed linear system with the strong observability property admits a sliding mode observer that estimates the unknown inputs together with the states. For the electro-hydraulic system with the piston position as the measured output, this property is not met. In this paper, estimates of the output and its derivatives are used to build a dynamic structure that satisfies the strong observability condition. A high-order sliding mode observer (HOSMO) is used to estimate both the resulting unknown perturbation term and the output derivatives. Thereafter, with one signal from the whole system (the piston position), the piston position tracks the desired one with a simple linear output feedback controller after ca
Companies compete greatly with each other today, so they need to focus on innovation to develop their products and make them competitive. Lean product development is the ideal way to develop products, foster innovation, maximize value, and reduce time. Set-Based Concurrent Engineering (SBCE) is an approved lean product development mechanism that builds on the creation of a number of alternative designs at the subsystem level. These designs are improved and tested simultaneously, and the weaker choices are removed gradually until the optimum solution is finally reached. SBCE implementations have been performed extensively in the automotive industry, and there are a few case studies in the aerospace industry. This research describes the use o
This paper analysed the effect of electronic internal auditing (EIA) based on the Control Objectives for Information and Related Technologies (COBIT) framework. Organisations must implement an up-to-date accounting information system (AIS) capable of meeting their auditing requirements. Electronic audit risk (compliance assessment, control assurance, and risk assessment) is a development by Weidenmier and Ramamoorti (2006) to improve AIS. In order to fulfil the study’s objectives, a questionnaire was prepared and distributed to a sample comprising 120 employees. The employees were financial managers, internal auditors, and workers involved in the company’s information security departments in the General Company for Electricity D
Within this work, to promote the efficiency of organic-based solar cells, a series of novel A-π-D type small molecules were scrutinised. The acceptors which we designed had a moiety of N,N-dimethylaniline as the donor and a catechol moiety as the acceptor linked through various conjugated π-linkers. We performed DFT (B3LYP) as well as TD-DFT (CAM-B3LYP) computations using 6-31G (d,p) for scrutinising the impact of various π-linkers upon optoelectronic characteristics, stability, and rate of charge transport. In comparison with the reference molecule, various π-linkers led to a smaller HOMO–LUMO energy gap. Compared to the reference molecule, there was a considerable red shift in the molecules under study (A1–A4). Therefore, based on
A robust video-bitrate adaptive scheme on the client side plays a significant role in maintaining a good quality of experience for video streaming. Video quality of experience is affected by the amount of time playback is stalled due to an empty buffer state. Therefore, to maintain continuous video streaming under smooth bandwidth fluctuation, a video buffer structure based on adapting the video bitrate is considered in this work. Initially, the video buffer structure is formulated as an optimal control-theoretic problem that combines both video bitrate and video buffer feedback signals. While protecting the video buffer occupancy from exceeding the limited operating level can provide continuous video str
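The buffer-feedback idea described above can be illustrated with a deliberately simplified policy: pick a bitrate from the available ladder according to current buffer occupancy. This is only a sketch of the general principle, not the paper's optimal control-theoretic design; the reservoir/cushion thresholds (20% and 80%) and the bitrate ladder are assumed values.

```python
def select_bitrate(buffer_level, buffer_max, ladder):
    """Map buffer occupancy linearly onto a sorted bitrate ladder (kbps).
    Below the reservoir, play it safe; above the cushion, go full quality."""
    r, c = 0.2 * buffer_max, 0.8 * buffer_max   # reservoir / cushion thresholds
    if buffer_level <= r:
        return ladder[0]             # near-empty buffer: lowest bitrate
    if buffer_level >= c:
        return ladder[-1]            # comfortably full: highest bitrate
    frac = (buffer_level - r) / (c - r)          # position inside the cushion
    return ladder[min(int(frac * len(ladder)), len(ladder) - 1)]
```

A real client would additionally smooth the decision against throughput estimates to avoid oscillating between adjacent rungs.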
Abstract
Zigbee is considered one of the wireless sensor network (WSN) technologies designed for short-range communications applications. It follows the IEEE 802.15.4 specifications, which aim at networks with the lowest possible cost and power consumption in addition to the minimum possible data rate. In this paper, a Zigbee transmitter system is designed based on the PHY layer specifications of this standard. The modulation technique applied in this design is offset quadrature phase shift keying (OQPSK) with half-sine pulse shaping, to achieve the minimum possible number of phase transitions. In addition, the applied spreading technique is the direct sequence spread spectrum (DSSS) technique, which has
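As a rough illustration of the half-sine-shaped OQPSK mentioned above: even-indexed chips drive the I branch and odd-indexed chips drive the Q branch, with the Q branch offset by half a symbol, which yields a constant-envelope waveform. This is a baseband sketch under our own conventions (unit amplitude, 8 samples per chip), not the paper's transmitter; the 32-chip DSSS symbol mapping of the standard is omitted.

```python
import numpy as np

def half_sine_oqpsk(chips, n=8):
    """Baseband OQPSK with half-sine pulse shaping.
    chips: sequence of 0/1; n: samples per chip period."""
    chips = 2 * np.asarray(chips, dtype=float) - 1        # 0/1 -> -1/+1
    pulse = np.sin(np.pi * np.arange(2 * n) / (2 * n))    # half-sine over one symbol (2 chips)
    length = len(chips) * n + n                           # extra half symbol for the Q offset
    i_sig = np.zeros(length)
    q_sig = np.zeros(length)
    for k, c in enumerate(chips):
        # even chips -> I branch; odd chips -> Q branch delayed by one chip (n samples)
        start = (k // 2) * 2 * n + (n if k % 2 else 0)
        branch = q_sig if k % 2 else i_sig
        branch[start:start + 2 * n] += c * pulse
    return i_sig + 1j * q_sig
```

In the steady-state region where both branches are active, the magnitude of the complex envelope stays constant, which is what limits the phase transitions.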
In this paper, an algorithm for binary codebook design is used in the vector quantization technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. The vector quantization (VQ) method is used to compress the bitmap (the output of the first method, AMBTC). The binary codebook is generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress all bitmaps of these images. The bitmap of an image is selected for compression with this codebook based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates
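For context on the AMBTC stage whose bitmap output is being vector-quantized: AMBTC encodes each block as a binary bitmap plus two quantization levels (the means of the pixels above and below the block mean). A minimal per-block sketch follows; the function names are our own, and the codebook-based bitmap compression step is not shown.

```python
import numpy as np

def ambtc_block(block):
    """Encode one block with AMBTC: bitmap + low/high reconstruction levels."""
    m = block.mean()
    bitmap = block >= m                       # 1 where pixel is at or above the mean
    hi = block[bitmap].mean() if bitmap.any() else m
    lo = block[~bitmap].mean() if (~bitmap).any() else m
    return bitmap, lo, hi

def ambtc_decode(bitmap, lo, hi):
    """Reconstruct the block from the bitmap and the two levels."""
    return np.where(bitmap, hi, lo)
```

The bitmap is the dominant cost per block (one bit per pixel), which is why replacing it with a close codebook entry reduces the bit rate.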
This paper describes a practical study of the impact that learning partners, a Bluetooth broadcasting system, an interactive board, a real-time response system, notepads, free internet access, computer-based examinations, an interactive classroom, etc., had on undergraduate student performance, achievement, and involvement with lectures. The goal of this study is to test the hypothesis that the use of such learning techniques, tools, and strategies improves student learning, especially among the poorest-performing students. It also gives a practical comparison between the traditional and interactive ways of learning in terms of lecture time, number of tests, types of tests, students' scores, and students' involvement with lectures
Recommender systems are tools for understanding the huge amount of data available in the internet world. Collaborative filtering (CF) is one of the most successful knowledge discovery methods used in recommender systems. Memory-based collaborative filtering emphasizes using facts about present users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy mostly depends on similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate the relationships among users over the MovieLens data set rating matrix. The advantages and disadvantages of each measure are spotted. From the study, a n
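As a concrete example of the memory-based CF pipeline described above, one traditional similarity measure (cosine over co-rated items) can feed a similarity-weighted prediction. This is a generic sketch, not the weighted-parameter combination the study proposes; zero entries are assumed to mean "unrated".

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine similarity between two users, restricted to co-rated items."""
    mask = (u > 0) & (v > 0)
    if not mask.any():
        return 0.0
    a, b = u[mask], v[mask]
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def predict(target, others, item):
    """Similarity-weighted average of other users' ratings for one item."""
    num = den = 0.0
    for v in others:
        if v[item] > 0:                       # only users who rated the item
            s = cosine_similarity(target, v)
            num += s * v[item]
            den += abs(s)
    return num / den if den else 0.0
```

On a rating matrix like MovieLens, the choice of similarity measure in the first function is exactly what drives the prediction accuracy the study compares.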
Twitter data analysis is an emerging field of research that utilizes data collected from Twitter to address many issues such as disaster response, sentiment analysis, and demographic studies. The success of data analysis relies on collecting accurate and representative data of the studied group or phenomenon to get the best results. Various Twitter analysis applications rely on collecting the locations of the users sending the tweets, but this information is not always available. There have been several attempts at estimating location-based aspects of a tweet. However, there is a lack of work investigating the data collection methods that are focused on location. In this paper, we investigate the two methods for obtaining location-based dat