Image compression is an important problem in computer storage and transmission: it makes efficient use of the redundancy embedded within an image itself, and it may additionally exploit the limitations of human vision to discard imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed. The first stage utilizes the lossy predictor model along with multiresolution base and thresholding techniques; the second stage incorporates a near-lossless compression scheme on top of the first stage. The test results of both stages are promising, enhancing the performance of the traditional polynomial model in terms of compression ratio while preserving image quality.
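The predictor-plus-residual structure described in this abstract can be sketched as follows. This is a minimal, hypothetical illustration of lossy predictive (polynomial) coding — the predictor, quantization step `q`, and neighbourhood are assumptions, not the authors' exact model:

```python
import numpy as np

def polynomial_encode(img, q=8):
    """Sketch of predictive (polynomial) coding: a first-order model
    predicts each pixel from its causal neighbours, and the quantized
    residual is what would be entropy-coded.  Returns the lossy
    reconstruction; smaller q gives near-lossless behaviour."""
    img = img.astype(np.int32)
    pred = np.zeros_like(img)
    # mean of the top and left neighbours (first row/column left unpredicted)
    pred[1:, 1:] = (img[:-1, 1:] + img[1:, :-1]) // 2
    residual = img - pred
    quantized = (residual // q) * q      # lossy step: discard low-order bits
    return pred + quantized              # decoder-side reconstruction

a = np.array([[10, 12, 14], [11, 13, 15], [12, 14, 16]])
rec = polynomial_encode(a, q=2)
```

With a small quantizer the reconstruction error stays bounded by `q`, which is the sense in which such a scheme is "near lossless".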
There have been many writings and discussions dealing with the details and interpretation of research methods and with identifying the methodological approaches used by researchers and writers as they address research topics and problems across the natural and human sciences. We noticed, however, that the movement, knowledge, and development of science require the identification of suitable tools and methodological approaches appropriate to each type of science. In other words, attempts should be made to build appropriate methodological tools for human and cognitive activity that can be referred to as a specific science that sets out certain paths of the human sciences, which is certainly the ori
Cognitive radios have the potential to greatly improve spectral efficiency in wireless networks. Cognitive radios are considered lower-priority or secondary users of spectrum allocated to a primary user. Their fundamental requirement is to avoid interference to potential primary users in their vicinity. Spectrum sensing has been identified as a key enabling functionality to ensure that cognitive radios would not interfere with primary users, by reliably detecting primary user signals. In addition, reliable sensing creates spectrum opportunities for capacity increases in cognitive networks. One of the key challenges in spectrum sensing is the robust detection of primary signals in highly negative signal-to-noise ratio (SNR) regimes. In this paper,
Social networking has dominated the whole world by providing a platform for information dissemination. People usually share information without knowing its truthfulness. Nowadays, social networks are used for gaining influence in many fields, such as elections and advertising. It is not surprising that social media has become a weapon for manipulating sentiments by spreading disinformation. Propaganda is one of the systematic and deliberate attempts used to influence people for political or religious gains. In this research paper, efforts were made to classify propagandist text from non-propagandist text using supervised machine learning algorithms. Data was collected from news sources from July 2018 to August 2018. After annota
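The excerpt does not show which supervised algorithms were used; as a hedged sketch, a multinomial Naive Bayes baseline over bag-of-words features — one common choice for this kind of binary text classification — could look like this (the toy labels and examples below are hypothetical):

```python
import math
from collections import Counter

def train_nb(docs):
    """docs: list of (text, label).  Multinomial Naive Bayes with
    add-one smoothing over whitespace-tokenized bag-of-words."""
    word_counts = {}          # label -> Counter of word occurrences
    doc_counts = Counter()    # label -> number of training documents
    vocab = set()
    for text, label in docs:
        words = text.lower().split()
        word_counts.setdefault(label, Counter()).update(words)
        doc_counts[label] += 1
        vocab.update(words)
    return word_counts, doc_counts, vocab

def predict_nb(text, model):
    word_counts, doc_counts, vocab = model
    total_docs = sum(doc_counts.values())
    best, best_lp = None, float("-inf")
    for label in doc_counts:
        lp = math.log(doc_counts[label] / total_docs)     # log prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.lower().split():                    # log likelihoods
            lp += math.log((word_counts[label][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

docs = [
    ("our glorious leader will crush the traitors", "prop"),
    ("the enemy spreads lies and fear", "prop"),
    ("the council met to discuss the budget", "news"),
    ("rainfall was higher than average this month", "news"),
]
model = train_nb(docs)
```

A real study would of course use a much larger annotated corpus and richer features (n-grams, stylistic cues) than this toy example.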
The matching and mosaicking of satellite imagery play an essential role in many remote sensing and image processing projects. These techniques are required at a particular step of a project, for example in remote change detection applications and in the study of large regions of interest. The matching and mosaic methods depend on many image parameters, such as the pixel values in the two or more images, the projection system associated with the header files, and the spatial resolutions; many of these methods construct the matching and mosaic manually. In this research, georeferencing techniques were used to carry out the image matching task in a semi-automatic method. The decision about the quality of the technique can be considered i
Significant advances in automated glaucoma detection techniques have been made through the employment of Machine Learning (ML) and Deep Learning (DL) methods, an overview of which is provided in this paper. What sets the current literature review apart is its exclusive focus on the aforementioned techniques for glaucoma detection, using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines to filter the selected papers. To achieve this, an advanced search was conducted in the Scopus database, specifically looking for research papers published in 2023 with the keywords "glaucoma detection", "machine learning", and "deep learning". Among the many papers found, the ones focusing
This research aims to solve a nonlinear model formulated as a system of differential equations with an initial value problem (IVP), represented by a COVID-19 mathematical epidemiology model, as an application of a new approach: approximate shrunken estimators are proposed to solve the model under investigation, combining a classic numerical method and numerical simulation techniques in an effective statistical form, the shrunken estimation formula. Two numerical simulation methods are first used to solve this model: the Mean Monte Carlo Runge-Kutta and Mean Latin Hypercube Runge-Kutta methods. Then two approximate simulation methods are proposed to solve the current study. The results of the proposed approximate shrunken methods and the numerical
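The deterministic backbone of such simulations is a Runge-Kutta ODE solver. Below is a minimal classical RK4 sketch applied to a simple SIR-type compartment model; the compartments, parameter values (beta, gamma) and step size are illustrative assumptions, not taken from the paper:

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def sir(t, y, beta=0.3, gamma=0.1):
    """Toy SIR right-hand side (fractions of population)."""
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

y = [0.99, 0.01, 0.0]          # initial value problem: S, I, R at t = 0
for step in range(100):        # 100 days with step size h = 1
    y = rk4_step(sir, float(step), y, 1.0)
```

In the paper's setting, a Monte Carlo or Latin Hypercube scheme would draw the model parameters from distributions and average many such deterministic runs.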
This work aims to investigate the tensile and compression strengths of heat-cured acrylic resin denture base material after adding styrene-butadiene (S-B) to polymethyl methacrylate (PMMA). The most well-known issue in prosthodontic practice is fracture of a denture base. All samples were a blend of 90% or 80% PMMA and 10% or 20% S-B powder melted in Oxolane (tetrahydrofuran). These samples were cut into specimens of dimensions 100 x 10 x 2.5 mm to meet the requirements of the tensile tests. The compression strength test specimens were shaped into cylinders 12.7 mm in diameter and 20 mm in length. The experimental results show a significant increase in both tensile and compression strengths when compared to cont
Compression is the reduction in size of data in order to save space or transmission time. For data transmission, compression can be performed on just the data content or on the entire transmission unit (including header data), depending on a number of factors. In this study, we considered the application of an audio compression method using text coding, where the audio compression is represented by converting an audio file to a text file in order to reduce the data transfer time over a communication channel. Approach: we propose two coding methods that are applied to optimize the solution by using a CFG. Results: we tested our application using a 4-bit coding algorithm; since the results of this method were not satisfactory, we proposed a new approach to compress audio fil
In this paper, we investigate the automatic recognition of emotion in text. We perform experiments with a new method of classification based on the PPM character-based text compression scheme. These experiments involve both coarse-grained classification (whether a text is emotional or not) and fine-grained classification, such as recognising Ekman's six basic emotions (Anger, Disgust, Fear, Happiness, Sadness, Surprise). Experimental results with three datasets show that the new method significantly outperforms traditional word-based text classification methods. The results show that the PPM compression-based classification method is able to distinguish between emotional and non-emotional text with high accuracy, and between texts invo
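The compression-based idea can be sketched with a simplified character-model classifier: estimate the number of bits each class's model would need to encode a text, and pick the class with the smallest code length. The sketch below uses fixed-order character n-gram counts with add-one smoothing as a hypothetical stand-in for PPM's adaptive, escape-based context mixing; the tiny training texts are invented for illustration:

```python
import math
from collections import defaultdict

def train(texts, n=3):
    """Build character n-gram and context counts from training texts."""
    counts, contexts = defaultdict(int), defaultdict(int)
    for t in texts:
        t = "~" * (n - 1) + t               # pad so every char has a context
        for i in range(n - 1, len(t)):
            counts[t[i - n + 1:i + 1]] += 1
            contexts[t[i - n + 1:i]] += 1
    return counts, contexts

def bits(text, model, n=3, alphabet=256):
    """Approximate code length of text (in bits) under a class model."""
    counts, contexts = model
    text = "~" * (n - 1) + text
    total = 0.0
    for i in range(n - 1, len(text)):
        c = counts[text[i - n + 1:i + 1]] + 1          # add-one smoothing
        ctx = contexts[text[i - n + 1:i]] + alphabet
        total -= math.log2(c / ctx)
    return total

def classify(text, models):
    """Pick the class whose model compresses the text best."""
    return min(models, key=lambda label: bits(text, models[label]))

models = {
    "happy": train(["i am so happy and glad today", "what a joyful day"]),
    "angry": train(["i am furious and angry", "this makes me so mad"]),
}
```

Real PPM blends contexts of several orders and handles novel symbols via escape probabilities, which is what gives the published method its accuracy; this fixed-order version only conveys the minimum-code-length principle.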
Abstract: Narrow laser pulses have been essential sources in optical communication systems. High-data-rate optical communication network systems demand a compressed laser source with unique optical properties. In this work, a laser source with a pulse duration of 9 ns, a peak power of 1.2297 mW, a full width at half maximum (FWHM) of 286 pm, and a center wavelength of 1546.7 nm is used as the compressed laser source. A Mach-Zehnder interferometer (MZI) is built in two ways. First, a polarization-maintaining fiber (PMF) of 10 cm length is used to connect the laser source and the fiber Bragg grating analyzer (FBGA). Second, a nested Mach-Zehnder interferometer (NMZI) was designed using three PMFs of 10 cm length. These three fibers are spliced to sing