Data <span>transmission in orthogonal frequency division multiplexing (OFDM) systems requires source and channel coding, and the transmitted data suffers from the adverse effect of a large peak-to-average power ratio (PAPR). Source codes and channel codes can be combined using different joint codes; the variable length error correcting code (VLEC) is one such joint code. VLEC is used in a MATLAB simulation of image transmission over an OFDM system. Different VLEC code lengths are used and compared, showing that the PAPR decreases as the code length increases. Several PAPR reduction techniques are applied and compared. The PAPR of the OFDM signal is measured for an image coded with VLEC and compared with the same image coded by Huffman source coding and Bose-Chaudhuri-Hocquenghem (BCH) channel coding. The VLEC code reduces the transmitted data size while keeping the same level of PAPR reduction as the data coded by Huffman and BCH codes when PAPR reduction methods are used.</span>
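As an illustrative sketch (not the paper's own simulation code), the PAPR metric referred to above can be computed for one OFDM symbol as the ratio of peak to mean instantaneous power of the IFFT output. The function name `papr_db` and the 64-subcarrier QPSK example are assumptions made for the example:

```python
import numpy as np

def papr_db(symbols: np.ndarray, n_subcarriers: int = 64) -> float:
    """PAPR (in dB) of one OFDM symbol: peak power over mean power
    of the time-domain signal obtained by the IFFT."""
    x = np.fft.ifft(symbols, n_subcarriers)  # time-domain OFDM signal
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

# Example: random QPSK symbols on 64 subcarriers (illustrative only)
rng = np.random.default_rng(0)
qpsk = (2 * rng.integers(0, 2, 64) - 1
        + 1j * (2 * rng.integers(0, 2, 64) - 1)) / np.sqrt(2)
papr = papr_db(qpsk)
```

For 64 subcarriers the worst-case PAPR is 10·log10(64) ≈ 18 dB, reached when all subcarriers add coherently; typical random QPSK symbols land well below that, which is why statistical reduction methods are effective.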
With the continuous progress of image retrieval technology, the speed of finding a required image in a large collection of image data has become an important issue. Convolutional neural networks (CNNs) have been used in image retrieval; however, many CNN-based image retrieval systems have a poor ability to express image features. Content-based image retrieval (CBIR) is a method of finding desired images in image databases, but it suffers from low accuracy when retrieving images from large-scale databases. In this paper, the proposed system is an improvement of the convolutional neural network for greater accuracy, together with a machine learning tool that can be used for automatic image retrieval. It includes two phases
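A minimal sketch of the retrieval step described above, assuming feature vectors have already been extracted by a CNN (the function name `retrieve` and the toy vectors are illustrative assumptions, not the paper's system):

```python
import numpy as np

def retrieve(query_feat: np.ndarray, db_feats: np.ndarray, top_k: int = 3) -> np.ndarray:
    """Rank database images by cosine similarity to the query feature vector."""
    q = query_feat / np.linalg.norm(query_feat)
    db = db_feats / np.linalg.norm(db_feats, axis=1, keepdims=True)
    sims = db @ q                      # cosine similarity per database image
    return np.argsort(-sims)[:top_k]   # indices of the most similar images

# Toy 2-D "features": image 0 matches the query exactly, image 2 partially
db = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
order = retrieve(np.array([1.0, 0.0]), db)
```

In a real CBIR pipeline the feature vectors would come from a trained CNN layer rather than hand-written arrays; the ranking logic is the same.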
The basic solution for overcoming the difficulties posed by the huge size of digital images is to employ image compression techniques that reduce image size for efficient storage and fast transmission. In this paper, a new pixel-based scheme is proposed for grayscale image compression. It implicitly utilizes a hybrid of a spatial-modelling technique based on minimum residuals and the transform technique of the Discrete Wavelet Transform (DWT), mixing lossless and lossy techniques to ensure high performance in terms of compression ratio and quality. The proposed technique has been applied to a set of standard test images, and the results obtained are significantly encouraging compared with Joint P
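To illustrate the DWT building block mentioned above, here is a one-level 2-D Haar wavelet transform (a hedged sketch of the general technique, not the paper's specific hybrid scheme):

```python
import numpy as np

def haar_dwt_2d(img: np.ndarray):
    """One level of the 2-D Haar DWT.

    Returns the four subbands (LL, LH, HL, HH): LL is the coarse
    approximation; the others hold horizontal, vertical, and diagonal
    detail, which are typically small and compress well.
    """
    a = img.astype(np.float64)
    # Transform rows: pairwise averages (low-pass) and differences (high-pass)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2
    hi = (a[:, 0::2] - a[:, 1::2]) / 2
    # Transform columns of each result
    ll = (lo[0::2] + lo[1::2]) / 2
    lh = (lo[0::2] - lo[1::2]) / 2
    hl = (hi[0::2] + hi[1::2]) / 2
    hh = (hi[0::2] - hi[1::2]) / 2
    return ll, lh, hl, hh
```

In lossy compression, the detail subbands (LH, HL, HH) are quantized or thresholded, which is where most of the size reduction comes from.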
This research aims to identify the intellectual picture that displaced people have formed about aid organizations and to determine whether it is positive or negative. The researchers used a survey as the study tool for the population represented by displaced people living in Baghdad camps, from the Shiite, Sunni, Shabak, Turkmen, Christian, and Ezidi communities.
The researchers reached important results; most notably, they found that the displaced people living in the camps included in this survey hold a positive opinion about the organizations working to meet their demands, but they complain about shortfalls in health care.
The research also found that displaced people from the Shabak, Turkmen, and Ezidi minorities see that internati
This paper presents a combination of enhancement techniques for fingerprint images affected by different types of noise. These techniques were applied to improve image quality and achieve acceptable image contrast. The proposed method included five enhancement techniques: Normalization, Histogram Equalization, Binarization, Skeletonization, and Fusion. The Normalization process standardized the pixel intensities, which facilitated the subsequent image enhancement stages. The Histogram Equalization technique then increased the contrast of the images. Furthermore, the Binarization and Skeletonization techniques were implemented to differentiate between the ridge and valley structures and to obtain one
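The first stages of the pipeline above can be sketched with plain NumPy (a hedged illustration of histogram equalization and global-mean binarization, not the paper's exact implementation; the synthetic patch stands in for a real fingerprint image):

```python
import numpy as np

def hist_equalize(img: np.ndarray) -> np.ndarray:
    """Histogram equalization: stretch the intensity CDF over [0, 255]."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(np.float64)
    cdf_min = cdf[cdf > 0].min()
    lut = np.round((cdf - cdf_min) * 255.0 / (cdf.max() - cdf_min))
    return np.clip(lut, 0, 255).astype(np.uint8)[img]

def binarize(img: np.ndarray) -> np.ndarray:
    """Global-mean thresholding: dark ridges map to 0, light valleys to 255."""
    return np.where(img < img.mean(), 0, 255).astype(np.uint8)

# Low-contrast synthetic patch (values 100-139) standing in for a fingerprint
rng = np.random.default_rng(1)
patch = rng.integers(100, 140, size=(64, 64)).astype(np.uint8)
eq = hist_equalize(patch)   # contrast stretched to the full 0-255 range
bw = binarize(eq)           # binary ridge/valley map
```

Real fingerprint systems usually use local (block-wise) normalization and adaptive thresholds rather than the global versions shown here; the global forms keep the sketch short.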
In this paper, membrane-computing-based image segmentation, both region-based and edge-based, is proposed for medical images, involving two types of neighborhood relations between pixels. These neighborhood relations, namely the 4-adjacency and 8-adjacency of a membrane computing approach, construct a family of tissue-like P systems for segmenting actual 2D medical images in a constant number of steps; the two types of adjacency were compared on different hardware platforms. The process involves the generation of membrane-based segmentation rules for 2D medical images. The rules are written in the P-Lingua format and appended to the input image for visualization. The findings show that the neighborhood relations between pixels o
The recent emergence of sophisticated Large Language Models (LLMs) such as GPT-4, Bard, and Bing has revolutionized the domain of scientific inquiry, particularly in the realm of large pre-trained vision-language models. This pivotal transformation is driving new frontiers in various fields, including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field gaining immense relevance in the digital era. The study is specifically geared towards addressing the emerging challenge of distinguishing between authentic images and deep fakes, a task that has become critically important in a world increasingly reliant on digital med
Background: The spleen is a hemopoietic organ capable of supporting elements of different systems. It is affected by several groups of diseases: inflammatory, hematopoietic, reticuloendothelial proliferation, portal hypertension, and storage diseases. Ultrasound (US) may detect mild splenomegaly before it is clinically palpable. Knowledge of the normal range of spleen size in the population being examined is a prerequisite. Racial differences in splenic length could result in incorrect interpretation of splenic measurements, and such differences would make it difficult to standardize the expected splenic length and to determine non-palpable splenic enlargement. Objectives: To measure the normal values of splenic length in Iraqi subjects an
Background: Arterial stiffness is associated with atherosclerosis and cardiovascular disease events. Patients with atherosclerotic disease tend to have larger lumen diameters, reduced arterial compliance, and lower flow velocities. Aim of study: To compare patients of two age groups with the concomitant diseases diabetes and hypertension with regard to intima-media thickness and blood flow characteristics, in order to estimate blood perfusion to the brain via the common and internal carotid arteries. Subjects and Methods: 40 patients with diabetes and hypertension were enrolled and classified according to age. Color Doppler and B-mode ultrasound were used to determine lumen diameter (D), intima-media thickness (IMT)
In this work, the performance of the receiver in a quantum cryptography system based on the BB84 protocol is assessed by calculating the receiver's Quantum Bit Error Rate (QBER). To apply this performance test, an optical setup was arranged, and a circuit was designed and implemented to calculate the QBER. This electronic circuit counts the pulses per second generated by the avalanche photodiodes in the receiver. The measured counts per second are used to calculate the QBER of the receiver, which gives an indication of its performance. A minimum QBER of 6% was obtained with an avalanche photodiode excess voltage of 2 V and a laser diode power of 3.16 nW at an avalanche photodiode temperature of -10
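The QBER figure quoted above is simply the fraction of detection events received in error. A minimal sketch (the function name and the example counts are illustrative, not the paper's circuit logic):

```python
def qber(error_counts: int, total_counts: int) -> float:
    """Quantum Bit Error Rate: fraction of sifted-key detections in error."""
    if total_counts <= 0:
        raise ValueError("need at least one detection event")
    return error_counts / total_counts

# e.g. 60 erroneous detections out of 1000 counts per second gives 6%,
# matching the minimum QBER reported above.
rate = qber(60, 1000)  # 0.06
```

In hardware, the error and total counts would come from the coincidence counters attached to the receiver's avalanche photodiodes rather than from fixed integers.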
This research reviews the least absolute deviations method, based on linear programming, for estimating the parameters of the simple linear regression model, and gives an overview of this model. The absolute-deviations method was modelled using a proposed measure of dispersion, and a simple linear regression model was constructed based on the proposed measure. The object of the work is to obtain estimates that are not affected by abnormal (outlier) values, using a numerical method with the fewest possible iterations.
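A minimal sketch of the least-absolute-deviations idea as a linear program, using the standard trick of splitting each residual into two nonnegative parts (this is the textbook formulation, not code from the paper; it assumes SciPy's `linprog` with the HiGHS solver is available):

```python
import numpy as np
from scipy.optimize import linprog

def lad_fit(x: np.ndarray, y: np.ndarray):
    """Least absolute deviations fit y ~ a + b*x via linear programming.

    Each residual r_i = y_i - a - b*x_i is written as u_i - v_i with
    u_i, v_i >= 0, and the LP minimizes sum(u_i + v_i), which equals
    sum(|r_i|) at the optimum.
    """
    n = len(x)
    # Decision vector: [a, b, u_1..u_n, v_1..v_n]
    c = np.concatenate([[0.0, 0.0], np.ones(2 * n)])
    # Constraint rows: a + b*x_i + u_i - v_i = y_i
    A_eq = np.hstack([np.ones((n, 1)), x.reshape(-1, 1), np.eye(n), -np.eye(n)])
    bounds = [(None, None), (None, None)] + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[0], res.x[1]

# Line y = 2 + 3x with one gross outlier at x = 4
x = np.arange(5.0)
y = 2 + 3 * x
y[4] = 100.0
a, b = lad_fit(x, y)
```

Unlike least squares, the LAD fit passes through the four uncontaminated points and ignores the outlier, which illustrates the robustness to abnormal values discussed above.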