JPEG is the most popular image compression and encoding technique, and it is widely used in many applications (images, videos, and 3D animations). Researchers are therefore keen to develop this technique further, compressing images at higher compression ratios while preserving image quality as much as possible. For this reason, this paper introduces an improved JPEG scheme based on a fast DCT that removes most of the zero coefficients while keeping a record of their positions within each transformed block. Additionally, arithmetic coding is applied instead of Huffman coding. The results show that the proposed algorithm achieves better image quality than the traditional JPEG technique.
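As an illustration of the transform stage this abstract builds on, the following is a minimal NumPy sketch of the 8×8 DCT-II followed by uniform quantization. It is not the paper's fast DCT and does not reproduce its zero-position bookkeeping; the block values and the quantizer step of 16 are illustrative assumptions.

```python
import numpy as np

def dct_matrix(n=8):
    # Orthonormal DCT-II basis matrix C, so that coeffs = C @ block @ C.T
    C = np.zeros((n, n))
    for k in range(n):
        for i in range(n):
            alpha = np.sqrt(1.0 / n) if k == 0 else np.sqrt(2.0 / n)
            C[k, i] = alpha * np.cos((2 * i + 1) * k * np.pi / (2 * n))
    return C

# A flat 8x8 block of pixel values, level-shifted by -128 as in JPEG
block = np.full((8, 8), 130.0) - 128.0
C = dct_matrix()
coeffs = C @ block @ C.T              # forward 2-D DCT
quantized = np.round(coeffs / 16.0)   # uniform quantization (step is illustrative)
print(int(np.count_nonzero(quantized)))  # only the DC term survives -> 1
```

For such a smooth block, almost all quantized coefficients are zero, which is exactly the redundancy the abstract's "remove zeros, keep their positions" idea exploits.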
This article investigates how an appropriate chaotic map (Logistic, Tent, Henon, Sine, ...) should be selected for image encipherment, taking each map's advantages and disadvantages into consideration. Does the selection of an appropriate map depend on the image's properties? The proposed system shows that relevant image properties influence the evaluation of the selected chaotic map. The first chapter discusses the main principles of chaos theory and its applicability to image encryption, including various kinds of chaotic maps and their mathematics. The research also explores the factors that determine the security and efficiency of such a map. Hence the approach presents a practical standpoint on the extent to which certain chaotic maps will bec
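A common way such maps are used in image ciphers is to iterate the map into a keystream and XOR it with the pixel bytes. The sketch below uses the logistic map x_{n+1} = r·x_n·(1 − x_n); the parameter r = 3.99, the seed x0, and the byte quantization are illustrative assumptions, not a specific scheme from the article.

```python
import numpy as np

def logistic_keystream(length, x0=0.3579, r=3.99):
    # Logistic map x_{n+1} = r * x_n * (1 - x_n); r near 4 behaves chaotically
    x = x0
    out = np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256   # quantize the orbit to one byte
    return out

# XOR the keystream with flattened pixel bytes; XOR again to decrypt
pixels = np.arange(16, dtype=np.uint8)
ks = logistic_keystream(pixels.size)
cipher = pixels ^ ks
assert np.array_equal(cipher ^ ks, pixels)   # round trip recovers the image
```

Because decryption is the same XOR, the security of such a scheme rests entirely on the unpredictability and key sensitivity of the chosen map, which is what the article's evaluation criteria target.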
The objective of image fusion is to merge multiple source images in such a way that the final representation contains more useful information than any single input. In this paper, a weighted-average fusion method is proposed. It relies on weights extracted from the source images using the contourlet transform. The extraction is done by setting the approximation coefficients of the transform to zero and then taking the inverse contourlet transform to obtain the details of the images to be fused. The performance of the proposed algorithm has been verified on several greyscale and color test images and compared with some existing methods.
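As a rough sketch of the weighted-average idea, the toy example below substitutes a simple Laplacian high-pass response for the contourlet detail coefficients (the actual transform is not reproduced here); each source pixel is weighted by its local detail energy, and the images are made up.

```python
import numpy as np

def detail_energy(img):
    # Stand-in for contourlet detail coefficients: a discrete Laplacian
    # high-pass response serves as the local-detail measure here.
    pad = np.pad(img, 1, mode="edge")
    lap = (pad[:-2, 1:-1] + pad[2:, 1:-1]
           + pad[1:-1, :-2] + pad[1:-1, 2:] - 4 * img)
    return np.abs(lap) + 1e-9   # small epsilon avoids divide-by-zero

def fuse(a, b):
    wa, wb = detail_energy(a), detail_energy(b)
    return (wa * a + wb * b) / (wa + wb)   # pixel-wise weighted average

a = np.zeros((8, 8)); a[2:6, 2:6] = 1.0   # toy source image with an edge
b = np.full((8, 8), 0.5)                  # toy flat source image
f = fuse(a, b)
print(f.shape)
```

The fused pixel always lies between the two source values, with the more detailed source dominating near edges, which is the intuition behind detail-driven weights.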
An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land, whether it occurs naturally or through human action, resulting in severe damage and financial loss. Satellite imagery is one of the most powerful tools currently used for capturing vital information about the Earth's surface, but the complexity and vast amount of data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, however, these processes are now automated to extract vital information from real-time satellite images. This paper applied three deep-learning algorithms to satellite image classification
The recent emergence of sophisticated Large Language Models (LLMs) such as GPT-4, Bard, and Bing has revolutionized the domain of scientific inquiry, particularly in the realm of large pre-trained vision-language models. This pivotal transformation is driving new frontiers in various fields, including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field gaining immense relevance in the digital era. The study specifically addresses the emerging challenge of distinguishing between authentic images and deep fakes, a task that has become critically important in a world increasingly reliant on digital med
The relative strength index (RSI) is one of the best-known technical analysis indicators; it provides speculators with advance signals about future stock prices. Because speculation in the shares of companies listed on the Iraq Stock Exchange carries a high degree of risk, such as the risk of falling share prices, speculators have become committed to using methods that reduce these risks. One such method is technical analysis using the relative strength index (RSI), which enables speculators to choose the right time to buy and sell stocks and the right time to enter or leave the market, using historical price data. From here, the research problem is formulated as: "Is the use of
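The standard RSI calculation the abstract refers to is RSI = 100 − 100 / (1 + RS), where RS is the ratio of average gain to average loss over a look-back period (commonly 14). Below is a minimal sketch using Wilder's smoothing; the price series is illustrative, not Iraq Stock Exchange data.

```python
def rsi(prices, period=14):
    """Wilder's relative strength index: RSI = 100 - 100 / (1 + RS)."""
    gains, losses = [], []
    for prev, cur in zip(prices, prices[1:]):
        change = cur - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))
    avg_gain = sum(gains[:period]) / period
    avg_loss = sum(losses[:period]) / period
    for g, l in zip(gains[period:], losses[period:]):
        # Wilder's smoothing of the running averages
        avg_gain = (avg_gain * (period - 1) + g) / period
        avg_loss = (avg_loss * (period - 1) + l) / period
    if avg_loss == 0:
        return 100.0   # no losses in the window: fully overbought
    return 100.0 - 100.0 / (1.0 + avg_gain / avg_loss)

# A steadily rising series saturates the indicator at the overbought extreme
print(rsi([float(p) for p in range(100, 120)]))  # -> 100.0
```

Readings above 70 are conventionally treated as overbought (a sell signal) and below 30 as oversold (a buy signal), which is how the indicator times entry and exit.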
The consensus algorithm is the core mechanism of a blockchain and is used to ensure data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in consortium chains because it is resistant to Byzantine faults. However, PBFT still has issues with random master-node selection and complicated communication. This study proposes an enhanced consensus algorithm, IBFT, based on node trust values and BLS (Boneh-Lynn-Shacham) aggregate signatures. In IBFT, multi-level indicators are used to calculate the trust value of each node, and some nodes are selected to take part in network consensus on the basis of this calculation. The master node is chosen
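For context on the fault-tolerance arithmetic underlying PBFT-family protocols: with n = 3f + 1 replicas, up to f may be Byzantine, and agreement requires 2f + 1 matching messages. A tiny sketch of that quorum rule follows; the trust-value and BLS-signature machinery of IBFT is not modeled here.

```python
def quorum_size(n):
    # PBFT tolerates f Byzantine replicas out of n = 3f + 1 total,
    # so a prepare/commit quorum needs 2f + 1 matching messages.
    f = (n - 1) // 3
    return 2 * f + 1

def commit_reached(n, matching_votes):
    return matching_votes >= quorum_size(n)

print(quorum_size(4))         # f = 1 -> quorum of 3
print(commit_reached(7, 4))   # f = 2 needs 5 matching votes -> False
```

The quadratic message complexity the abstract calls "complicated communication" comes from every replica broadcasting to every other replica in each phase; BLS aggregate signatures reduce this by compressing many signatures into one.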
Finding a path solution in a dynamic environment represents a challenge for robotics researchers; moreover, it is a central issue for autonomous robots and manipulators. Finding a collision-free path for a robot in an environment with moving obstacles such as objects, humans, animals, or other robots is a practical problem that needs to be solved. In addition, local minima and sharp edges are the most common problems in all path-planning algorithms. The main objective of this work is to overcome these problems by demonstrating robot path planning and obstacle avoidance using the D star (D*) algorithm based on Particle Swarm Optimization (PSO).
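The PSO component can be illustrated in isolation. The sketch below is a generic particle swarm optimizer minimizing a toy quadratic cost that stands in for a path cost; it is not the paper's D*/PSO hybrid, and the swarm size, inertia weight, and acceleration coefficients are illustrative defaults.

```python
import random

def pso(cost, dim=2, swarm=20, iters=150, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization over the box [-5, 5]^dim."""
    random.seed(0)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_cost = [cost(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Toy cost: squared distance to a goal point, standing in for a path cost
goal = (3.0, -2.0)
best, best_cost = pso(lambda p: (p[0] - goal[0]) ** 2 + (p[1] - goal[1]) ** 2)
print(round(best_cost, 8))
```

In a D*-based planner, a cost of this shape would instead score candidate waypoints or smoothing parameters, letting the swarm steer the path away from local minima.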
The main objective of this research is to build an optimal investment portfolio of stocks listed on the Iraq Stock Exchange by employing a multi-objective genetic algorithm over the period from 1/1/2006 to 1/6/2018, using the closing prices of 43 companies whose data were complete and met the inspection conditions. The literature review supported the diagnosis of the knowledge gap and identified deficiencies in the level of experimentation; accordingly, the current direction of the research was to address aspects unexamined and untreated by other researchers, in particular the missing data and non-reversed pieces of the reality of trading at the level of compani
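For illustration only, the toy genetic algorithm below optimizes a single scalarized objective (expected return minus a risk penalty) over normalized portfolio weights. The research itself uses a multi-objective formulation on real market data; the return and risk numbers here are made up, and the operators are deliberately simple.

```python
import random

def ga_portfolio(returns, risks, pop=30, gens=60, mut=0.1):
    """Toy single-objective GA over normalized long-only weights."""
    random.seed(1)
    n = len(returns)

    def norm(w):                      # project onto the weight simplex
        s = sum(w)
        return [x / s for x in w]

    def fitness(w):                   # scalarized return-minus-risk objective
        return (sum(wi * r for wi, r in zip(w, returns))
                - sum(wi * q for wi, q in zip(w, risks)))

    popl = [norm([random.random() for _ in range(n)]) for _ in range(pop)]
    for _ in range(gens):
        popl.sort(key=fitness, reverse=True)
        elite = popl[: pop // 2]      # elitist selection keeps the best half
        children = []
        while len(children) < pop - len(elite):
            a, b = random.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # averaging crossover
            if random.random() < mut:                     # random mutation
                child[random.randrange(n)] += random.random()
            children.append(norm(child))
        popl = elite + children
    best = max(popl, key=fitness)
    return best, fitness(best)

rets = [0.12, 0.08, 0.03]   # hypothetical annual returns
risk = [0.10, 0.04, 0.01]   # hypothetical risk penalties
w, f = ga_portfolio(rets, risk)
print(abs(sum(w) - 1.0) < 1e-9)
```

A multi-objective version would keep return and risk as separate objectives and evolve a Pareto front of portfolios rather than a single scalarized optimum.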