The search process in the combined Block Truncation Coding (BTC) and Vector Quantization (VQ) method using a binary codebook, i.e., a full codebook search for each input image vector to find the best-matched codeword, requires a long time. Therefore, in this paper, after designing a small binary codebook, we adopt a new method that rotates each binary codeword in this codebook through 90° to 270° in steps of 90°. Each codeword is then classified by its angle into one of four types of binary codebooks (Pour, Flat, Vertical, or Zigzag). The proposed scheme reduces the time of the coding procedure, with very small distortion per block, by designing a small binary codebook and then rotating each block in it. Moreover, it can improve the efficiency of the coding process even further by decreasing the bit rate (i.e., increasing the compression ratio).
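As a hedged illustration of the rotation step, the sketch below expands a small binary codebook with the 90°, 180°, and 270° rotations of each codeword; the 4×4 block size and NumPy representation are assumptions for illustration, not details taken from the abstract.

```python
import numpy as np

# Minimal sketch of the codebook-expansion idea: each binary codeword
# (here assumed to be a 4x4 bit-plane block, as in BTC) is rotated by
# 90, 180, and 270 degrees so a small codebook covers four orientations
# without extra design effort. Block size and layout are assumptions.

def expand_codebook(codebook):
    """Return the codebook augmented with 90/180/270-degree rotations."""
    expanded = []
    for codeword in codebook:
        for k in range(4):                # k = 0 keeps the original
            expanded.append(np.rot90(codeword, k))
    return expanded

# Example: one codeword with a vertical edge pattern.
cw = np.array([[1, 1, 0, 0]] * 4)
print(len(expand_codebook([cw])))         # -> 4 orientations of one codeword
```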
A true random TTL pulse generator was implemented and investigated for quantum key distribution systems. The random TTL signals are generated by low-cost components available in local markets. The TTL signals are obtained from true random binary sequences based on the photon arrival-time differences registered in coincidence windows between two single-photon detectors. The performance of the true random TTL pulse generator was tested using time-to-digital converters, which give accurate readings of photon arrival times. The proposed true random TTL pulse generator can be used in any quantum key distribution system for random operation of the system's transmitters.
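A minimal sketch of the bit-extraction idea, assuming timestamped detector events and a simple first-to-arrive rule inside each coincidence window; the window width, timestamp units, and pairing logic here are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: within each coincidence window, whichever detector
# fires first decides the bit value. All parameters are assumptions.

COINCIDENCE_WINDOW_NS = 10  # assumed window width (ns)

def bits_from_timestamps(t_a, t_b):
    """t_a, t_b: sorted photon arrival times (ns) from two detectors."""
    bits = []
    i = j = 0
    while i < len(t_a) and j < len(t_b):
        dt = t_a[i] - t_b[j]
        if abs(dt) <= COINCIDENCE_WINDOW_NS:   # coincidence: extract a bit
            bits.append(0 if dt < 0 else 1)
            i += 1
            j += 1
        elif dt < 0:                           # unpaired event on detector A
            i += 1
        else:                                  # unpaired event on detector B
            j += 1
    return bits

print(bits_from_timestamps([5, 40, 90], [12, 38, 95]))  # -> [0, 1, 0]
```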
DeepFake is a concern for celebrities and everyone else because it is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by humans, by local descriptors, and by current approaches. On the other hand, detecting manipulation in video is more tractable than in a single image, and many state-of-the-art systems address it; moreover, video manipulation detection ultimately relies on detection in the individual frames. Many studies have addressed DeepFake detection in images, but they involve complex mathematical calculations in their preprocessing steps and many limitations, including that the face must be frontal, the eyes must be open, and the mouth should be open with teeth visible. Also, the accuracy of their counterfeit detection…
Energy saving is a common concern in IoT sensor networks because IoT sensor nodes operate on their own limited batteries. Data transmission in IoT sensor nodes is very costly and consumes much of the energy, while the energy used for data processing is considerably lower. There are several energy-saving strategies and principles, mainly dedicated to reducing the transmission of data; therefore, minimizing data transfers in IoT sensor networks can conserve a considerable amount of energy. In this research, a Compression-Based Data Reduction (CBDR) technique is suggested which works at the level of the IoT sensor nodes. CBDR includes two stages of compression: a lossy SAX quantization stage, which reduces the dynamic range of the sensor data…
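A minimal sketch of a SAX-style quantization stage like the one named above, assuming a four-symbol alphabet with standard Gaussian breakpoints; the paper's actual segment counts, alphabet size, and breakpoints are not given here.

```python
import numpy as np

# Sketch of lossy SAX quantization: z-normalize the signal, average it
# into segments (PAA), then map each segment mean to a small alphabet,
# shrinking the dynamic range before further compression.

BREAKPOINTS = [-0.67, 0.0, 0.67]        # 4-symbol alphabet (Gaussian quartiles)

def sax(signal, n_segments):
    x = np.asarray(signal, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-12)          # z-normalize
    segments = np.array_split(x, n_segments)        # PAA segments
    paa = [seg.mean() for seg in segments]
    return [int(np.searchsorted(BREAKPOINTS, v)) for v in paa]  # symbol ids

readings = [21.0, 21.2, 21.1, 24.8, 25.0, 24.9, 22.0, 21.9]
print(sax(readings, 4))                 # -> [0, 2, 3, 1]: 4 symbols, not 8 floats
```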
The aim of the present work is to study the effect of changing velocity (Reynolds number) on oxygen cathodic polarization using a brass rotating cylinder electrode in 0.1, 0.3, and 0.5 N NaCl solutions (pH = 7) at temperatures of 40, 50, and 60 °C. Cathodic polarization experiments were conducted as a function of electrode rotational speed and concentration.
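For orientation, a hedged back-of-the-envelope calculation of the Reynolds number for a rotating cylinder electrode, using the common definition Re = U·d/ν with peripheral velocity U = π·d·N; all numerical values below are hypothetical, not measurements from the study.

```python
import math

# Illustrative mapping from rotational speed to Reynolds number for a
# rotating cylinder electrode. Diameter, speed, and kinematic viscosity
# are assumed values for illustration only.

d = 0.012            # cylinder diameter (m), assumed
N = 1000 / 60        # rotation rate (rev/s), assumed 1000 rpm
nu = 1.0e-6          # kinematic viscosity of the solution (m^2/s), assumed

U = math.pi * d * N  # peripheral velocity (m/s)
Re = U * d / nu
print(f"U = {U:.3f} m/s, Re = {Re:.0f}")   # -> Re on the order of 7.5e3
```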
The effect of using three different interpolation methods (nearest-neighbour, linear, and non-linear) on a 3D sinogram to restore the data missing when the angular difference exceeds 1° (1° sampling being considered the optimum 3D sinogram) is presented. Two reconstruction methods are adopted in this study: the back-projection method and the Fourier slice theorem method. The results show that the second reconstruction method is promising when combined with linear interpolation, provided the angular difference is less than 20°.
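A minimal sketch of restoring missing projection angles by linear interpolation along the angular axis of a sinogram; the array shapes and the 5° to 1° resampling are illustrative assumptions.

```python
import numpy as np

# Sketch: fill in missing projection angles of a sinogram by linear
# interpolation along the angular axis, one detector bin at a time.

def interpolate_angles(sinogram, coarse_angles, fine_angles):
    """sinogram: (n_angles, n_bins) array sampled at coarse_angles (degrees).
    Returns the sinogram resampled at fine_angles by linear interpolation."""
    n_bins = sinogram.shape[1]
    out = np.empty((len(fine_angles), n_bins))
    for b in range(n_bins):            # interpolate each detector bin
        out[:, b] = np.interp(fine_angles, coarse_angles, sinogram[:, b])
    return out

coarse = np.arange(0, 180, 5.0)                  # coarse 5-degree sampling, assumed
fine = np.arange(0, 176, 1.0)                    # target 1-degree sampling
sino = np.random.rand(len(coarse), 64)           # stand-in projection data
print(interpolate_angles(sino, coarse, fine).shape)   # -> (176, 64)
```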
The primary objective of this study is to manage the market prices of materials in the construction of walls for affordable structures with load-bearing hollow masonry units, using the ACI 211.1 mix design with a slump range of 25-50 mm that follows the specification limits of IQS 1077. It was difficult to reach a suitable minimum cement content (an economic and environmental goal), so many trial mixtures were cast. A portion (10-20%) of the coarse aggregate was replaced with concrete, tile, and clay-brick waste. Finally, two curing methods were used: immersion under water as normal curing, and water spraying, as it is closer to field conditions. The recommendation in IQS 1077 to increase the curing period from 14 to 28 days was taken…
A numerical study has been conducted to investigate the thermal performance enhancement of a flat-plate solar water collector by integrating the collector with metal foam blocks. The flow is assumed to be steady, incompressible, and two-dimensional in an inclined channel. The channel is provided with eight foam blocks manufactured from copper. The Brinkman-Forchheimer extended Darcy model is utilized to simulate the flow in the porous medium, and the Navier-Stokes equations in the fluid region. The energy equation is used with the local thermal equilibrium (LTE) assumption to simulate the thermal field inside the porous medium. The current investigation covers a range of solar radiation intensities at 09:00 AM, 12:00 PM, and 04:00 PM.
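For reference, the momentum equation of the Brinkman-Forchheimer extended Darcy model used for such porous-media simulations is commonly written as follows (notation: $\varepsilon$ porosity, $K$ permeability, $C_F$ the Forchheimer inertia coefficient; the exact form adopted in the study may differ):

$$ \frac{\rho_f}{\varepsilon^{2}}\,(\mathbf{u}\cdot\nabla)\mathbf{u} = -\nabla p + \frac{\mu_f}{\varepsilon}\nabla^{2}\mathbf{u} - \frac{\mu_f}{K}\,\mathbf{u} - \frac{\rho_f C_F}{\sqrt{K}}\,\lvert\mathbf{u}\rvert\,\mathbf{u} $$

The Brinkman term ($\nabla^{2}\mathbf{u}$) recovers viscous shear near walls, while the Darcy and Forchheimer terms account for the linear and inertial drag of the foam, which is why this model, rather than plain Darcy flow, is typically paired with the Navier-Stokes equations in the clear-fluid region.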
In this paper, a fast lossless image compression method is introduced for compressing medical images. It is based on splitting the image into blocks according to their nature, using polynomial approximation to decompose the image signal, and then applying run-length coding to the residue part of the image, which represents the error caused by the polynomial approximation. Huffman coding is applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method delivers promising performance.
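A minimal sketch of the block-wise pipeline, assuming a first-order polynomial fit per block and simple run-length coding of the rounded residue; the final Huffman stage is omitted, and the block size and fit order are illustrative assumptions.

```python
import numpy as np

# Sketch: fit a low-order polynomial to a block's pixel trend, keep the
# coefficients, and run-length encode the (often highly repetitive)
# residue. Huffman coding of both outputs would follow in a full codec.

def encode_block(block, order=1):
    x = np.arange(block.size)
    flat = block.astype(float).ravel()
    coeffs = np.polyfit(x, flat, order)             # polynomial approximation
    residue = np.round(flat - np.polyval(coeffs, x)).astype(int)
    rle = []                                        # run-length code the residue
    for v in residue:
        if rle and rle[-1][0] == v:
            rle[-1][1] += 1
        else:
            rle.append([v, 1])
    return coeffs, rle

block = (2 * np.arange(64) + 3).reshape(8, 8)       # smooth stand-in 8x8 block
coeffs, rle = encode_block(block)
print(f"{len(rle)} run(s) for {block.size} residues")   # -> 1 run(s) for 64 residues
```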
There are many methods of searching a large amount of data to find one particular piece of information, such as finding a person's name among the contacts stored on a mobile phone. Certain methods of organizing data make the search process more efficient; the objective of these methods is to find the element at the least cost (least time). The binary search algorithm is faster than sequential search and other commonly used search algorithms. This research develops the binary search algorithm by using a new structure called Triple, in which data are represented as triples; each triple consists of three locations (1-Top, 2-Left, and 3-Right). Binary search divides the search interval in half at each step, which bounds the maximum number of comparisons (average-case com…
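For contrast, the classic array-based binary search that the Triple-based variant builds on is sketched below; the Triple structure itself is the paper's contribution and is not reproduced here.

```python
# Classic array-based binary search. Each step halves the search
# interval, giving O(log n) comparisons in the worst case.

def binary_search(items, target):
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid                 # found: return its index
        elif items[mid] < target:
            lo = mid + 1               # discard the left half
        else:
            hi = mid - 1               # discard the right half
    return -1                          # not found

contacts = ["Ali", "Huda", "Noor", "Omar", "Zaid"]  # sorted contact list
print(binary_search(contacts, "Noor"))              # -> 2
```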