In this paper, a new high-performance lossy image compression technique based on the DCT is proposed. The image is partitioned into blocks of size N×N (where N is a multiple of 2), and each block is categorized as high frequency (uncorrelated) or low frequency (correlated) according to its spatial detail. This is done by calculating the energy of the block as the absolute sum of the differential pulse code modulation (DPCM) differences between pixels and comparing it against a specified threshold to determine the level of correlation. The image blocks are scanned and converted into 1D vectors using horizontal scan order, and a 1D DCT is applied to each vector to produce the transform coefficients. The transformed coefficients are then quantized with different quantization values according to the energy of the block. Finally, an enhanced entropy encoding technique is applied to store the quantized coefficients. To assess the level of compression, the quantitative measures of peak signal-to-noise ratio (PSNR) and compression ratio (CR) are used to verify the effectiveness of the suggested system. The PSNR values of the reconstructed images fall within the intermediate range of 28 dB to 40 dB, and the best compression gain attained on the standard Lena image is around 96.60%. The results were also compared with those of the standard JPEG codec in the "ACDSee Ultimate 2020" software to evaluate the performance of the proposed system.
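The block-classification and transform steps can be illustrated with a minimal Python sketch. The 8×8 block size, the threshold value, and the function names below are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

def block_energy(block):
    """Absolute sum of DPCM differences between consecutive pixels of the
    row-scanned (horizontal scan order) block -- a measure of spatial detail."""
    vec = block.reshape(-1).astype(np.int64)
    return int(np.abs(np.diff(vec)).sum())

def classify_block(block, threshold):
    """High frequency (uncorrelated) if the DPCM energy exceeds the
    threshold, otherwise low frequency (correlated)."""
    return "high" if block_energy(block) > threshold else "low"

def dct_1d(vec):
    """Orthonormal 1-D DCT-II applied to the row-scanned vector."""
    n = len(vec)
    k = np.arange(n)[:, None]
    x = np.arange(n)[None, :]
    coef = np.cos(np.pi * (2 * x + 1) * k / (2 * n)) @ vec.astype(np.float64)
    coef[0] *= np.sqrt(1.0 / n)
    coef[1:] *= np.sqrt(2.0 / n)
    return coef

smooth = np.full((8, 8), 128, dtype=np.uint8)              # correlated block
noisy = np.random.default_rng(0).integers(0, 256, (8, 8))  # uncorrelated block
print(classify_block(smooth, 1000), classify_block(noisy, 1000))
```

A smooth block has zero DPCM energy and so is always classified as correlated, while a noisy block's energy grows with its pixel-to-pixel variation; the quantization step would then be chosen per class.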
The primary goal of in-situ load testing is to evaluate the safety and performance of a structural system under particular loading conditions. Advancements in building techniques, analytical tools, and monitoring instruments are prompting re-evaluation of the appropriate loading value, loading procedure, and examination criteria. The procedure for testing reinforced concrete (RC) structures on-site, as outlined in the ACI Building Code, involves conducting a 24-h load test and applying specific evaluation criteria. This article details a retrofitting project for an RC slab-beam system using carbon fiber-reinforced polymer (CFRP) sheets to strengthen the structure following a fire incident. The RC structure showed indicators of deterioration
Information-centric networking (ICN) is the next generation of Internet architecture; its in-network caching lets users retrieve their data efficiently regardless of their location. In ICN, security is applied to the data itself rather than to communication channels or devices. In-network caches are vulnerable to many types of attacks, such as cache poisoning attacks, cache privacy attacks, and cache pollution attacks (CPA). In a CPA, an attacker floods the network with non-popular content and makes the caches evict popular content. As a result, the cache hit ratio for legitimate users suffers a performance degradation and the content retrieval latency increases. In this paper, a popularity variation me
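The hit-ratio degradation caused by a CPA can be reproduced with a toy simulation. This is a minimal sketch assuming an LRU replacement policy and hypothetical content names; it illustrates the attack's effect, not the paper's detection scheme:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache standing in for an ICN in-network cache."""
    def __init__(self, capacity):
        self.capacity, self.store = capacity, OrderedDict()
        self.hits = self.requests = 0

    def request(self, name):
        self.requests += 1
        if name in self.store:
            self.hits += 1
            self.store.move_to_end(name)       # refresh recency
        else:
            self.store[name] = True
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)  # evict least recently used

    def hit_ratio(self):
        return self.hits / self.requests

popular = [f"popular/{i}" for i in range(10)]

# Legitimate users repeatedly request a small popular set: high hit ratio.
clean = LRUCache(capacity=10)
for _ in range(50):
    for name in popular:
        clean.request(name)
baseline = clean.hit_ratio()

# A CPA attacker interleaves unique non-popular names, evicting the
# popular content before it can be re-requested.
polluted = LRUCache(capacity=10)
tick = 0
for _ in range(50):
    for name in popular:
        polluted.request(name)
        polluted.request(f"junk/{tick}")
        tick += 1
print(baseline, polluted.hit_ratio())
```

With the cache exactly sized for the popular set, the flood of one-off names drives the legitimate hit ratio from near 1 down to 0, which is the degradation the attack exploits.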
Much attention has been paid to the use of robot arms in various applications; therefore, optimal path finding plays a significant role in upgrading and guiding arm movement. The essential function of path planning is to create a path that satisfies the aims of motion, including avoiding collisions with obstacles, reducing the travel time, decreasing the path traveling cost, and satisfying the kinematic constraints. In this paper, the free Cartesian-space map of a 2-DOF arm is constructed to obtain the joint variables at each point without collision. The D* algorithm and the Euclidean distance are applied to obtain the exact and estimated distances to the goal, respectively. The modified Particle Swarm Optimization algorithm
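The two distance notions used above can be sketched as follows. This is a minimal illustration assuming unit link lengths for the planar 2-DOF arm; the function names are illustrative:

```python
import math

def forward_kinematics(theta1, theta2, l1=1.0, l2=1.0):
    """End-effector position of a planar 2-DOF arm with link lengths l1, l2,
    used to map joint variables to points of the Cartesian-space map."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def euclidean_to_goal(point, goal):
    """Straight-line (Euclidean) estimate of the remaining distance to the
    goal, used alongside the exact distance computed by D*."""
    return math.dist(point, goal)

tip = forward_kinematics(0.0, 0.0)   # arm fully stretched along the x-axis
print(tip, euclidean_to_goal(tip, (0.0, 0.0)))
```

The Euclidean estimate never exceeds the exact path distance around obstacles, which is what makes it a useful optimistic guide for the search.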
A quantitative description of the microstructure governs the characteristics of a material. Various heat and surface treatments reveal microstructures when the material is prepared. Depending on the microstructure, mechanical properties such as hardness, ductility, strength, toughness, and corrosion resistance also vary. Microstructures are characterized by morphological features such as the volume fractions of the different phases, particle size, etc. The relative volume fractions of the phases must be known to correlate them with the mechanical properties. In this work, using image processing techniques, an automated scheme is presented to calculate the relative volume fractions of the phases, namely ferrite, martensite, and bainite, present in the
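Once a micrograph has been segmented so that each pixel carries a phase label, the relative volume fractions reduce to pixel counting (by stereology, the area fraction of a phase in a random 2-D section estimates its volume fraction). A minimal sketch, assuming a hypothetical pre-segmented label image rather than the paper's actual segmentation pipeline:

```python
import numpy as np

def phase_volume_fractions(labels, phases=("ferrite", "martensite", "bainite")):
    """Relative volume fraction of each phase from a segmented micrograph,
    where each pixel holds a phase index 0..len(phases)-1."""
    total = labels.size
    return {name: np.count_nonzero(labels == idx) / total
            for idx, name in enumerate(phases)}

# Hypothetical 4x4 segmented image: 8 ferrite, 4 martensite, 4 bainite pixels.
seg = np.array([[0, 0, 0, 0],
                [0, 0, 0, 0],
                [1, 1, 2, 2],
                [1, 1, 2, 2]])
print(phase_volume_fractions(seg))
```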
Future wireless communication systems must be able to accommodate a large number of users while simultaneously providing high data rates at the required quality of service. In this paper, a method is proposed to perform the N-point Discrete Hartley Transform (N-DHT) mapper, which is equivalent to 4-Quadrature Amplitude Modulation (QAM), 16-QAM, 64-QAM, 256-QAM, etc. in spectral efficiency. The N-DHT mapper is chosen in the Multi-Carrier Code Division Multiple Access (MC-CDMA) structure to serve as a data mapper instead of conventional data mapping techniques such as the QPSK and QAM schemes. The proposed system is simulated in MATLAB and compared with conventional MC-CDMA over Additive White Gaussian Noise, flat, and multi-path selective fading channels
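The core transform is standard: the N-point DHT uses the real cas kernel, cas(t) = cos(t) + sin(t), and is (up to a factor 1/N) its own inverse, which is convenient for a mapper/demapper pair. A minimal Python sketch of the transform itself (not the paper's full MC-CDMA chain), computed from the FFT:

```python
import numpy as np

def dht(x):
    """N-point discrete Hartley transform:
    H[k] = sum_n x[n] * cas(2*pi*k*n/N) = Re(FFT(x)) - Im(FFT(x))."""
    X = np.fft.fft(x)
    return X.real - X.imag

def idht(h):
    """The DHT is involutory up to 1/N, so the inverse is dht(h)/N."""
    return dht(h) / len(h)

data = np.array([1.0, -1.0, 1.0, 1.0, -1.0, 1.0, -1.0, -1.0])
recovered = idht(dht(data))
print(np.allclose(recovered, data))
```

Because the DHT is real-valued, the mapper avoids complex arithmetic while the round trip remains exact, which is the property the data-mapping role relies on.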
A special methodology for adding a watermark to a colored (RGB) image, using the wavelet transform as a tool, is presented in this paper. The watermark is added in two components. The first takes a key that contains eight numbers in the range (0...7); each number determines a particular bit position in a specific component of the cover image. If that bit is identical to the corresponding watermark bit, a (0) is stored in the Least Significant Bit (LSB) of the watermarked image; otherwise a (1) is stored. The second adds multiple secret keys using shift and rotate operations. The watermark is embedded redundantly over all extracted blocks of the image to increase image protection. This embedding is completed with
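The comparison rule of the first component can be sketched for a single byte. This is an illustrative interpretation, not the paper's exact implementation; the function names and the demo key position are assumptions (for key position 0 the flagged bit would coincide with the LSB itself, so the demo uses position 3):

```python
def embed_bit(cover_byte, watermark_bit, key_pos):
    """Store 0 in the LSB if the cover bit at key_pos matches the
    watermark bit, otherwise store 1, as described above."""
    cover_bit = (cover_byte >> key_pos) & 1
    flag = 0 if cover_bit == watermark_bit else 1
    return (cover_byte & ~1) | flag

def extract_bit(marked_byte, key_pos):
    """Recover the watermark bit: the LSB flag says whether it equals
    the cover bit found at key_pos."""
    cover_bit = (marked_byte >> key_pos) & 1
    return cover_bit if (marked_byte & 1) == 0 else cover_bit ^ 1

marked = embed_bit(0b10101100, 1, key_pos=3)
print(bin(marked), extract_bit(marked, key_pos=3))
```

Only the LSB of the cover byte changes, so the distortion per pixel is at most one intensity level while the key position stays secret.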
This research deals with an unusual approach to analyzing simple linear regression via linear programming using the two-phase method known in Operations Research ("O.R."). The estimate here is found by solving an optimization problem after adding artificial variables Ri. Another method for analyzing simple linear regression is also introduced, in which the conditional median of (y) is considered by minimizing the sum of absolute residuals, instead of finding the conditional mean of (y), which depends on minimizing the sum of squared residuals; this is called "median regression". Also, an iteratively reweighted least squares procedure based on the absolute residuals as weights is performed as another method to
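The reweighting idea can be sketched as follows: weighted least squares with weights w_i = 1/max(|r_i|, eps) approximates minimizing the sum of absolute residuals, since each squared term w_i*r_i^2 then behaves like |r_i|. A minimal sketch with hypothetical data (not the paper's dataset):

```python
import numpy as np

def lad_regression_irls(x, y, iters=50, eps=1e-8):
    """Median (least-absolute-deviations) regression of y on x via
    iteratively reweighted least squares; returns (intercept, slope)."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS starting point
    for _ in range(iters):
        r = y - X @ beta
        w = 1.0 / np.maximum(np.abs(r), eps)             # absolute-residual weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 + 3.0 * x
y_out = y.copy()
y_out[4] += 100.0                                        # one gross outlier
b = lad_regression_irls(x, y_out)
print(b)
```

Unlike ordinary least squares, the median-regression fit is barely moved by the single outlier, recovering an intercept near 2 and a slope near 3.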
The estimation of the initial oil in place is a crucial topic during the exploration, appraisal, and development of a reservoir. In the current work, two conventional methods were used to determine the initial oil in place: a volumetric method and a reservoir-simulation method. Each method requires its own type of data: the volumetric method depends on geological, core, well-log, and petrophysical-property data, while the reservoir-simulation method also needs capillary pressure versus water saturation, fluid production, and static pressure data for all active wells in the Mishrif reservoir. The petrophysical properties of the studied reservoir are calculated using a neural network technique
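The volumetric method rests on the standard oilfield relation N = 7758 · A · h · φ · (1 − Sw) / Bo, where 7758 is barrels per acre-foot, A is area (acres), h is net pay (ft), φ is porosity, Sw is water saturation, and Bo is the formation volume factor (rb/STB). A minimal sketch with hypothetical input values, not the Mishrif reservoir's actual data:

```python
def ooip_volumetric(area_acres, thickness_ft, porosity, sw, bo):
    """Volumetric initial oil in place in stock-tank barrels:
    N = 7758 * A * h * phi * (1 - Sw) / Bo."""
    return 7758.0 * area_acres * thickness_ft * porosity * (1.0 - sw) / bo

# Hypothetical reservoir: 2000 acres, 50 ft net pay, 18% porosity,
# 25% water saturation, Bo = 1.2 rb/STB.
n = ooip_volumetric(2000, 50, 0.18, 0.25, 1.2)
print(f"{n:,.0f} STB")
```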
A fast laser texturing technique has been utilized to produce micro/nano surface textures in silicon by means of a UV femtosecond laser, preparing a good absorber surface for photovoltaic cells. The textured silicon surface absorbs more of the incident light than the non-textured surface. The results show a photovoltaic current increase of about 21.3% for a photovoltaic cell with a two-dimensional pattern compared with the same cell without texturing.
The process of evaluating demographic data (the age and gender structure) is one of the important factors that help any country draw plans and programs for the future. This paper discusses the errors in the population data of the 1997 Iraqi census, aiming to correct and revise them to serve planning purposes. The population data are smoothed using a nonparametric regression estimator (the Nadaraya-Watson estimator). This estimator depends on a bandwidth (h), which can be calculated in two ways using the Bayesian method: the first when the observations' distribution is a Lognormal kernel, and the second when it is a Normal kernel.
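The Nadaraya-Watson estimator itself is a kernel-weighted average, m(x0) = Σ K((x0 − xi)/h) · yi / Σ K((x0 − xi)/h). A minimal sketch with a Gaussian (Normal) kernel and hypothetical counts, not the actual census figures or the Bayesian bandwidth selection:

```python
import math

def nadaraya_watson(x0, xs, ys, h):
    """Nadaraya-Watson regression estimate at x0 with a Gaussian kernel;
    the bandwidth h controls the amount of smoothing."""
    weights = [math.exp(-0.5 * ((x0 - xi) / h) ** 2) for xi in xs]
    return sum(w * yi for w, yi in zip(weights, ys)) / sum(weights)

ages = [0, 1, 2, 3, 4, 5]
counts = [100, 120, 80, 130, 90, 110]   # hypothetical raw census counts
smoothed = [nadaraya_watson(a, ages, counts, h=1.0) for a in ages]
print(smoothed)
```

Because the estimate is a convex combination of the observed counts, every smoothed value lies between the minimum and maximum of the raw data; a larger h flattens the curve, a smaller h follows the raw counts more closely, which is why the bandwidth choice is the focus of the paper.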