Steganography is the art of secret communication: its purpose is to hide the very presence of information, using, for example, images as covers. The frequency domain is well suited for embedding in images, since hiding information in frequency-domain coefficients is robust to many attacks. This paper proposes hiding a secret image whose size is one quarter of the cover image. The Set Partitioning in Hierarchical Trees (SPIHT) codec is used to code the secret image to achieve security. The proposed method applies the Discrete Multiwavelet Transform (DMWT) to the cover image, and the coded bit stream of the secret image is embedded in the high-frequency subbands of the transformed cover. Two scaling factors in the frequency domain control the quality of the stego images. The proposed algorithm is compared with a wavelet-based algorithm and shows favorable results in terms of PSNR, reaching 18 dB.
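As a minimal sketch of the embedding step described above: the code below uses an ordinary 2-D DWT (via PyWavelets) as a stand-in for the paper's DMWT, assumes the secret image has already been SPIHT-coded into a bitstream, and uses an illustrative scaling factor named `alpha`; none of these names or choices are taken from the paper.

```python
# Minimal sketch of frequency-domain embedding, assuming a plain 2-D DWT
# ('haar', via PyWavelets) as a stand-in for the paper's DMWT and a
# pre-coded 0/1 bitstream for the secret image.  All names are illustrative.
import numpy as np
import pywt

def embed(cover: np.ndarray, bitstream: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Additively hide a 0/1 bitstream in the high-frequency subbands of the cover."""
    LL, (LH, HL, HH) = pywt.dwt2(cover.astype(float), 'haar')
    flat = np.concatenate([LH.ravel(), HL.ravel(), HH.ravel()])
    bits = 2.0 * bitstream[:flat.size] - 1.0              # map {0,1} -> {-1,+1}
    flat[:bits.size] += alpha * bits                      # scaled additive embedding
    LH2, HL2, HH2 = np.split(flat, [LH.size, LH.size + HL.size])
    return pywt.idwt2((LL, (LH2.reshape(LH.shape),
                            HL2.reshape(HL.shape),
                            HH2.reshape(HH.shape))), 'haar')

def psnr(a: np.ndarray, b: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio between cover and stego image, in dB."""
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return 10 * np.log10(peak ** 2 / mse)
```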
Grabisch and Labreuche have recently proposed a generalization of capacities called bi-capacities. More recently, the author proposed a new approach to studying bi-capacities based on a notion of ternary-element sets. In this paper, we present several results, such as the bipolar Möbius transform, the importance index, and the interaction index of bi-capacities, derived from this approach.
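For orientation, the classical Möbius transform of an ordinary capacity, which the bipolar Möbius transform generalizes to bi-capacities, can be written as follows; this is the standard identity, not the paper's bipolar formula.

```latex
% Classical Möbius transform of a capacity v on N (standard identity;
% the bipolar version for bi-capacities, defined on pairs of disjoint
% sets, is not reproduced here).
m^{v}(A) \;=\; \sum_{B \subseteq A} (-1)^{|A \setminus B|}\, v(B),
\qquad
v(A) \;=\; \sum_{B \subseteq A} m^{v}(B),
\qquad A \subseteq N .
```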
DeepFake is a concern for celebrities and for everyone, because DeepFakes are simple to create. DeepFake images, especially high-quality ones, are difficult to detect for humans, local descriptors, and current approaches. On the other hand, video manipulation detection is more accessible than image manipulation detection, and many state-of-the-art systems address it; moreover, detecting video manipulation ultimately depends on detecting manipulation in individual images. Many works have addressed DeepFake detection in images, but they involve complex mathematical calculations in the preprocessing steps and have many limitations, including that the face must be frontal, the eyes must be open, and the mouth should be open with visible teeth. Also, the accuracy of their counterfeit detection…
Image fusion is one of the most important techniques in digital image processing; it involves developing software to integrate multiple sets of data for the same location. It is one of the newer fields adopted to solve problems of digital images and to produce high-quality images that contain more information for purposes such as interpretation, classification, segmentation, and compression. In this research, problems faced by different digital images, such as multi-focus images, are addressed through a simulation process using a camera to fuse various digital images based on previously adopted fusion techniques, such as arithmetic techniques (BT, CNT and MLT) and statistical techniques (LMM,…
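The abbreviations BT, CNT, MLT, and LMM are not expanded in the truncated abstract; as a minimal illustration of arithmetic pixel-level fusion of two co-registered multi-focus images, generic averaging and maximum rules might look like the sketch below (illustrative only, not the paper's specific techniques).

```python
# Minimal sketch of arithmetic pixel-level fusion of two co-registered
# multi-focus images; the averaging/maximum rules are generic illustrations,
# not the BT/CNT/MLT/LMM techniques referenced in the abstract.
import numpy as np

def fuse_average(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Pixel-wise mean of the two source images."""
    return ((img_a.astype(float) + img_b.astype(float)) / 2.0).astype(img_a.dtype)

def fuse_max(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Pixel-wise maximum, which tends to keep the sharper, brighter detail."""
    return np.maximum(img_a, img_b)
```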
Digital tampering identification, which detects picture modification, is a significant area of image-analysis research. Over the last five years it has grown considerably, achieving exceptional precision through machine learning and deep learning-based strategies. Synthesis- and reinforcement-based learning techniques must now evolve to keep pace with the research. However, before doing any experimentation, a scientist must first comprehend the current state of the art in that domain; diverse paths, associated outcomes, and analysis lay the groundwork for successful experimentation and superior results. Before starting experiments, universal image-forensics approaches must be thoroughly researched. As a result, this review of various…
It is well known that sonography is not the first choice for detecting early breast tumors. Improving the resolution of breast sonographic images is the goal of many researchers, so that sonography can become a first-choice examination, as it is a safe, easy, and cost-effective procedure. In this study, infrared light exposure of the breast prior to ultrasound examination was implemented to see its effect on the resolution of the sonographic image. Results showed that significant improvement was obtained in 60% of cases.
Maximizing the net present value (NPV) of oil-field development depends heavily on optimizing well placement. The traditional approach relies on expert intuition to design well configurations and locations, followed by economic analysis and reservoir simulation to determine the most effective plan. However, this approach often proves inadequate due to the complexity and nonlinearity of reservoirs. In recent years, computational techniques have been developed to optimize well placement by defining decision variables (such as well coordinates), objective functions (such as NPV or cumulative oil production), and constraints. This paper presents a study on the use of genetic algorithms for well placement optimization, a type of…
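A minimal sketch of the idea: a genetic algorithm searching over well coordinates against an objective function. Here `estimate_npv` is only a placeholder for the reservoir-simulation and economics step, and the grid size, population size, and operators are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of a genetic algorithm over well coordinates.  The objective
# `estimate_npv` is a placeholder (it just rewards spreading wells apart);
# all parameters are illustrative, not taken from the paper.
import random

GRID = (60, 60)            # reservoir grid extent (i, j)
N_WELLS = 3                # wells per candidate plan
POP, GENS, MUT = 40, 50, 0.1

def estimate_npv(plan):
    """Placeholder objective standing in for simulated NPV."""
    return sum(abs(a[0] - b[0]) + abs(a[1] - b[1])
               for i, a in enumerate(plan) for b in plan[i + 1:])

def random_plan():
    return [(random.randrange(GRID[0]), random.randrange(GRID[1]))
            for _ in range(N_WELLS)]

def crossover(p1, p2):
    cut = random.randrange(1, N_WELLS)
    return p1[:cut] + p2[cut:]

def mutate(plan):
    return [((random.randrange(GRID[0]), random.randrange(GRID[1]))
             if random.random() < MUT else w) for w in plan]

population = [random_plan() for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=estimate_npv, reverse=True)
    parents = population[:POP // 2]                       # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

print("best plan:", max(population, key=estimate_npv))
```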
Abstract:
The distribution or retention of profits is the third financial-management decision in terms of priority, both in theory and in practice. The issue of distribution versus retention involves multiple parties in terms of influence and impact, and determining the optimal percentage for each component is still the subject of intellectual debate, because these decisions are tied to the future of the organization and to several other considerations. The research focuses on the nature of the policies followed by the Iraqi banking sector; the sample, chosen by intentional (purposive) sampling, was represented by the Commercial Bank of…
Because of threats and attacks against databases during transmission from sender to receiver, which are among the foremost security concerns of network users, a lightweight cryptosystem using the Rivest Cipher 4 (RC4) algorithm is proposed. This cryptosystem maintains data privacy by encrypting the data into cipher form, transferring it over the network, and then decrypting it back to the original data. Hence, the ciphers represent an encapsulating system for the database tables.
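For concreteness, a minimal RC4 sketch (key-scheduling plus pseudo-random generation, XORed with the data) applied to a serialized record is shown below. The key and record format are illustrative assumptions; the paper's key handling and database encapsulation are not reproduced here, and RC4 is considered weak by modern standards.

```python
# Minimal sketch of the RC4 stream cipher (KSA + PRGA) over a byte string,
# e.g. a serialized database record.  Key and record below are hypothetical.
def rc4(key: bytes, data: bytes) -> bytes:
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA), keystream XORed with the data
    out, i, j = bytearray(), 0, 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

record = b"id=7;name=Alice;balance=1500"      # hypothetical table row
cipher = rc4(b"secret-key", record)           # encryption
assert rc4(b"secret-key", cipher) == record   # decryption (same operation)
```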
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environments. However, this creates a need to store huge amounts of data, requiring large storage and high computational capability, and cloud computing can be used to store such big data. The data of IoT devices is transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance and more reliable system through efficient use of resources; thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual resource…
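A minimal sketch of the load-balancing idea: dispatch each incoming IoT message to the currently least-loaded cloud node. The node names, the load metric, and the least-loaded policy are illustrative assumptions, not the paper's actual balancing scheme.

```python
# Minimal sketch of dynamic load balancing: each incoming IoT message is
# dispatched to the least-loaded cloud node.  Node names and the unit load
# metric are illustrative only.
import heapq

class LeastLoadedBalancer:
    def __init__(self, nodes):
        # min-heap of (current_load, node_name)
        self.heap = [(0, n) for n in nodes]
        heapq.heapify(self.heap)

    def dispatch(self, message_cost: int = 1) -> str:
        load, node = heapq.heappop(self.heap)
        heapq.heappush(self.heap, (load + message_cost, node))
        return node

balancer = LeastLoadedBalancer(["node-a", "node-b", "node-c"])
for i in range(6):
    print(f"message {i} -> {balancer.dispatch()}")
```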
In this article, we aim to define a universal set consisting of the subscripts of the fuzzy differential equation (5), except for two particular elements; subsets of that universal set are then defined according to certain conditions. We use the constructed universal set with its subsets to suggest an analytical method that facilitates solving fuzzy initial value problems of any order using strongly generalized H-differentiability. Valid sets, with graphs, for solutions of fuzzy initial value problems of higher orders are also found.
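For background, a first-order fuzzy initial value problem under strongly generalized H-differentiability is commonly reduced to crisp systems on the α-level sets as shown below; this is the standard first-order reduction only, whereas the paper treats problems of higher order.

```latex
% Standard alpha-level reduction of the first-order fuzzy IVP
% y'(t) = f(t, y(t)), y(t_0) = y_0, with
% [y(t)]_\alpha = [\underline{y}_\alpha(t), \overline{y}_\alpha(t)].
% Shown as background only; the paper itself addresses higher-order problems.
\text{(i)-differentiable:}\quad
\underline{y}_{\alpha}'(t) = \underline{f}_{\alpha}\bigl(t, y(t)\bigr),
\qquad
\overline{y}_{\alpha}'(t) = \overline{f}_{\alpha}\bigl(t, y(t)\bigr),
\\[4pt]
\text{(ii)-differentiable:}\quad
\underline{y}_{\alpha}'(t) = \overline{f}_{\alpha}\bigl(t, y(t)\bigr),
\qquad
\overline{y}_{\alpha}'(t) = \underline{f}_{\alpha}\bigl(t, y(t)\bigr).
```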