Information security has become a major concern for researchers, whose efforts aim to develop solutions and technologies that allow information to be transferred more securely over networks, especially the Internet, without compromise, given the risk of sending digital data between two parties over an insecure channel. This paper presents two data protection techniques. The first is encryption using the Menezes-Vanstone elliptic curve cryptosystem, which is based on public-key technology. The encrypted data is then embedded at random positions in the frame, determined by the seed used. The experimental results, with an average PSNR of 65 and an average MSE of 85, indicate that the proposed method can embed data efficiently.
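The PSNR and MSE figures quoted above are standard distortion metrics for comparing a cover image with its stego counterpart. A minimal sketch of how they are computed (the 4x4 images and pixel values here are illustrative, not the paper's test set):

```python
import numpy as np

def mse(cover, stego):
    # Mean squared error between cover and stego images
    diff = cover.astype(np.float64) - stego.astype(np.float64)
    return np.mean(diff ** 2)

def psnr(cover, stego, max_val=255.0):
    # Peak signal-to-noise ratio in dB; higher means less distortion
    m = mse(cover, stego)
    if m == 0:
        return float("inf")
    return 10.0 * np.log10((max_val ** 2) / m)

cover = np.full((4, 4), 100, dtype=np.uint8)
stego = cover.copy()
stego[0, 0] += 1  # change one pixel by 1, as an LSB-style embed might
print(round(psnr(cover, stego), 2))  # → 60.17
```

A single 1-level pixel change in a 16-pixel image already gives a PSNR around 60 dB, which puts the paper's reported average of 65 in context: distortion at that level is imperceptible.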
In this paper, we introduce a DCT-based steganographic method for grayscale images. The embedding approach is designed to achieve an efficient trade-off among three conflicting goals: maximizing the amount of hidden message, minimizing distortion between the cover image and the stego-image, and maximizing the robustness of embedding. The main idea of the method is to create a safe embedding area in the middle- and high-frequency region of the DCT domain using a magnitude modulation technique. The magnitude modulation is applied using uniform quantization with magnitude adder/subtractor modules. Test results indicate that the proposed method achieves high capacity and high preservation of the perceptual and statistical properties of the steg
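The magnitude modulation step can be illustrated with a quantization-index-modulation-style sketch: a coefficient's magnitude is pushed onto an even multiple of a step size for bit 0, or an odd multiple for bit 1. The step size `DELTA` and the simplified adjustment rule (which only adds, rather than choosing the nearer level) are assumptions for illustration, not the paper's exact adder/subtractor modules:

```python
import numpy as np

DELTA = 8.0  # quantization step (assumed; not from the paper)

def embed_bit(coeff, bit):
    # Quantize |coeff| to a multiple of DELTA whose parity encodes the
    # bit (even multiple -> 0, odd multiple -> 1), preserving the sign.
    sign = -1.0 if coeff < 0 else 1.0
    q = int(np.round(abs(coeff) / DELTA))
    if q % 2 != bit:
        q += 1  # simplified adder step: shift to an adjacent valid level
    return sign * q * DELTA

def extract_bit(coeff):
    # Recover the bit from the parity of the quantized magnitude
    return int(np.round(abs(coeff) / DELTA)) % 2

c = embed_bit(-37.3, 1)
print(c, extract_bit(c))  # → -40.0 1
```

Extraction needs no side information beyond `DELTA`, which is what makes this family of schemes robust to small perturbations: the coefficient can drift by up to `DELTA/2` before the wrong parity is read.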
Full-waveform airborne laser scanning data has shown its potential to enhance available segmentation and classification approaches through the additional information it can provide. However, this additional information cannot directly provide a valid physical representation of surface features, because many variables affect the backscattered energy as it travels between the sensor and the target. In effect, this produces a mismatch between signals from overlapping flightlines. Therefore, direct use of this information is not recommended without the adoption of a comprehensive radiometric calibration strategy that accounts for all these effects. This paper presents a practical and reliable radiometric calibration r
The study of the validity and probability of failure in solids and structures is regarded as one of the most prominent fields of study in many science and engineering applications. Design analysts must therefore investigate the points where failing strains may occur, the probability that these strains cause existing cracks to propagate through the fractured medium, and the techniques that can be adopted to reduce or arrest these propagating cracks. In the present study, a theoretical investigation of simply supported thin plates having surface cracks within their structure is carried out, and the applied impact load to the
Semantic segmentation is effective in numerous object classification tasks such as autonomous vehicles and scene understanding. With advances in deep learning, much effort has gone into applying deep learning algorithms to semantic segmentation. Most of these algorithms attain the required accuracy at the cost of heavy storage and computational requirements. This work presents an implementation of a Convolutional Neural Network (CNN) using the Discrete Cosine Transform (DCT), which exhibits exceptional energy compaction properties. The proposed Adaptive Weight Wiener Filter (AWWF) rearranges the DCT coefficients by truncating the high-frequency coefficients. The AWWF-DCT model reinstates the convolutional l
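The energy compaction that motivates truncating high-frequency DCT coefficients can be sketched as follows. The 8x8 block size, the `keep=4` low-frequency band, and the hand-rolled orthonormal DCT matrix are illustrative choices for the sketch, not the AWWF-DCT model itself:

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis matrix (rows are basis vectors)
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    m = np.cos(np.pi * (2 * i + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    m[0] /= np.sqrt(2.0)
    return m

def truncate_high_freq(block, keep=4):
    # Forward 2-D DCT, zero everything outside the keep x keep
    # low-frequency corner (energy compaction), then invert.
    n = block.shape[0]
    D = dct_matrix(n)
    coeffs = D @ block @ D.T
    mask = np.zeros_like(coeffs)
    mask[:keep, :keep] = 1.0
    return D.T @ (coeffs * mask) @ D

# A block whose energy sits entirely in the kept low-frequency band
C = np.zeros((8, 8))
C[1, 2] = 5.0
D = dct_matrix(8)
block = D.T @ C @ D              # inverse DCT of the chosen coefficients
rec = truncate_high_freq(block, keep=4)
print(np.allclose(block, rec))   # → True: no information lost
```

For smooth image content, most energy sits in that low-frequency corner, so discarding the remaining coefficients shrinks storage and computation with little reconstruction error.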
Minimizing the power consumption of electronic systems is one of the most critical concerns in the design of integrated circuits for very-large-scale integration (VLSI). Although VLSI design is known for its compact size, low power, low price, excellent dependability, and high functionality, the design stage remains difficult to improve in terms of time and power. Several optimization algorithms have been designed to tackle the present issues in VLSI design. This study discusses a bi-objective optimization technique for circuit partitioning based on a genetic algorithm. The motivation for the proposed research derives from the basic idea that, if some portions of a circuit's system are deactivated during th
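A toy version of GA-based circuit partitioning might look like the following. The six-gate netlist, the weighted-sum combination of cut size and balance, and all GA parameters are assumptions for illustration; the paper's actual bi-objective formulation is not reproduced here:

```python
import random

# Toy netlist: wires between 6 gates (assumed example, not from the paper)
EDGES = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (0, 5), (1, 4)]
N = 6

def fitness(part):
    # Bi-objective cost collapsed to a weighted sum: cut size (wires
    # crossing the partition) plus an imbalance penalty. Lower is better.
    cut = sum(part[a] != part[b] for a, b in EDGES)
    balance = abs(sum(part) - N / 2)
    return cut + 2 * balance

def evolve(pop_size=20, gens=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]          # elitism: keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut_pt = rng.randrange(1, N)
            child = a[:cut_pt] + b[cut_pt:]   # one-point crossover
            if rng.random() < 0.2:            # bit-flip mutation
                j = rng.randrange(N)
                child[j] ^= 1
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Each genome assigns every gate to block 0 or 1; a true multi-objective treatment would keep a Pareto front rather than a weighted sum, but the sketch shows the encoding, crossover, and selection machinery.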
Medical imaging is a technique used for the diagnosis and treatment of a large number of diseases. It has therefore become necessary to apply good image processing to extract the finest desired results and information. In this study, genetic algorithm (GA)-based clustering techniques (K-means and Fuzzy C-Means (FCM)) were used to segment thyroid computed tomography (CT) images and extract thyroid tumors. Traditional GA, K-means, and FCM algorithms were applied separately to the original images and to images enhanced with an Anisotropic Diffusion Filter (ADF). The resulting cluster centers from K-means and FCM were used as the initial population in GA for the implementation of GAK-Mean and GAFCM. The Jaccard index was used to s
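The K-means step whose resulting centers seed the GA can be sketched on raw pixel intensities. The 1-D intensity values and parameters below are illustrative, not the thyroid CT data:

```python
import numpy as np

def kmeans_1d(values, k=2, iters=20, seed=0):
    # Plain K-means on pixel intensities; per the abstract, the converged
    # centres would then seed the GA's initial population (GAK-Mean).
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False).astype(float)
    for _ in range(iters):
        # Assign each value to its nearest centre, then recompute centres
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return np.sort(centers), labels

# Two well-separated intensity groups, mimicking tissue vs. background
pixels = np.array([10, 12, 11, 200, 198, 205, 9, 202], dtype=float)
centers, labels = kmeans_1d(pixels)
print(centers)  # → [ 10.5  201.25]
```

Seeding the GA with these centers instead of random values is what lets the hybrid GAK-Mean/GAFCM schemes start near a good solution.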
The primary objective of this study is to monitor and collect data from the main tributaries of the Smaquli stream during rainfall storm events, which can be used to establish the relationship between suspended sediment concentration and discharge. The Smaquli catchment is divided into two sub-catchments, Sarwchawa and Krosh, with areas of 80.64 and 34.82 km² respectively. The Jali dam is built at the watershed outlet. Rainfall, stream discharge, and suspended sediment concentration were monitored during ten rainfall storms in the water years 2012-2013 and 2013-2014. Analysis of the data from the two sampling sites shows two different responses of suspended sediment concentration. The Krosh sub-catchment reacts rapi
Twitter data analysis is an emerging field of research that uses data collected from Twitter to address issues such as disaster response, sentiment analysis, and demographic studies. The success of such analysis relies on collecting accurate data that are representative of the studied group or phenomenon. Various Twitter analysis applications rely on the locations of the users sending the tweets, but this information is not always available. There have been several attempts at estimating location-based aspects of a tweet; however, there is a lack of work investigating data collection methods that focus on location. In this paper, we investigate the two methods for obtaining location-based dat
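The two location sources typically available per tweet, an exact geotag versus a free-text profile location, can be separated with a simple filter. The record layout below is a loose, hypothetical mirror of Twitter's geo fields, not the paper's collection pipeline:

```python
# Hypothetical tweet records; 'coordinates' stands in for an exact geotag
# and 'user_location' for the free-text profile field (field names assumed).
tweets = [
    {"text": "flood downtown", "coordinates": (36.19, 44.01), "user_location": None},
    {"text": "all fine here", "coordinates": None, "user_location": "Erbil"},
    {"text": "no info", "coordinates": None, "user_location": None},
]

# Exact geotags are precise but rare; profile locations are common but
# noisy free text that needs geocoding before use.
geotagged = [t for t in tweets if t["coordinates"] is not None]
profile_only = [t for t in tweets
                if t["coordinates"] is None and t["user_location"]]
print(len(geotagged), len(profile_only))  # → 1 1
```

The trade-off sketched here (precision versus coverage) is the crux of comparing location-focused collection methods.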
Deficiencies in revenue-related accounting standards, both American and international, prompted the issuance of International Financial Reporting Standard IFRS 15, "Revenue from Contracts with Customers," as part of the convergence plan between the FASB and the International Accounting Standards Board (IASB) under the joint project between the two boards. The standard aims to define the basis for reporting useful information to the users of financial statements about the nature, amount, timing, and uncertainty of the revenues and cash flows arising from a contract with a customer. The standard is base