Cloud computing offers a new way of service provisioning by rearranging various resources over the Internet. The most important and popular cloud service is data storage. To preserve the privacy of data holders, data are often stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for big-data storage and processing in the cloud, because traditional deduplication schemes cannot operate on encrypted data. Digital videos in particular are very large in terms of storage cost and size, so techniques that support the legal interests of video owners, such as copyright protection, while reducing cloud storage cost and size are always desirable. This paper focuses on video copyright protection and deduplication. A video copyright and deduplication scheme for cloud storage environments using the H.264 compression algorithm and the SHA-512 hashing technique is proposed: a combined copyright protection and deduplication approach based on video content that authenticates and verifies the integrity of the compressed H.264 video. The design of the proposed scheme consists of two modules. First, the H.264 compression algorithm is applied to the video supplied by the user. Second, the user generates a unique signature over different time intervals of the compressed video, in such a way that the cloud service provider (CSP) can compare the user's video against other videos without compromising its security. To prevent an attacker from gaining access to the hash signature while it is uploaded to the cloud, the hash signature is encrypted with the user's password. Experimental results are provided, showing the effectiveness of the proposed copyright protection and deduplication system.
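The abstract leaves the signature construction unspecified; as a minimal sketch only (the even byte-range chunking below stands in for the paper's time intervals, which is an assumption on our part), per-interval SHA-512 signatures over a compressed video could be computed like this:

```python
import hashlib

def interval_signatures(video_bytes: bytes, n_intervals: int = 4) -> list[str]:
    """Split the compressed video into n_intervals byte ranges (a stand-in
    for the paper's time intervals) and hash each range with SHA-512."""
    bounds = [len(video_bytes) * k // n_intervals for k in range(n_intervals + 1)]
    return [hashlib.sha512(video_bytes[bounds[k]:bounds[k + 1]]).hexdigest()
            for k in range(n_intervals)]

def is_duplicate(sigs_a: list[str], sigs_b: list[str]) -> bool:
    """The CSP compares signature lists only; the plaintext video is never needed."""
    return sigs_a == sigs_b
```

In the paper's setting the signature list would additionally be encrypted with the user's password before upload; that step is omitted in this sketch.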
One of the most important problems in statistical inference is the estimation of parameters, including the reliability parameter, as well as interval estimation and hypothesis testing. This paper addresses the estimation of the two parameters of the exponential distribution, and of the reliability parameter in a stress-strength model. Specifically, it deals with estimating the scale parameter and the location parameter µ of two exponential distributions, using the method of moments and the maximum likelihood estimator. We also estimate the reliability parameter R = Pr(X > Y), where X and Y are independent two-parameter exponential random variables.
Statistical properties of this distribution and its properti…
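For context, in the location-free special case (µ = 0 for both variables, a simplification of the paper's two-parameter setting), the stress-strength reliability has a well-known closed form for independent X ~ Exp(λ_x) and Y ~ Exp(λ_y):

```latex
R = \Pr(X > Y)
  = \int_0^{\infty} \Pr(X > y)\, f_Y(y)\, \mathrm{d}y
  = \int_0^{\infty} e^{-\lambda_x y}\, \lambda_y e^{-\lambda_y y}\, \mathrm{d}y
  = \frac{\lambda_y}{\lambda_x + \lambda_y}.
```

Nonzero location parameters shift the lower integration limit and change this closed form; the paper's two-parameter derivation is not reproduced here.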
Power-electronic converters are essential elements for the effective interconnection of renewable energy sources to the power grid, as well as for integrating energy storage units, vehicle charging stations, microgrids, and similar systems. Converter models that accurately represent their wideband operation and interconnection with other active and passive grid components and systems are necessary for reliable steady-state and transient analyses during normal or abnormal grid operating conditions. This paper introduces two Laplace-domain approaches to modelling buck and boost DC-DC converters for electromagnetic transient studies. The first approach is an analytical one, where the converter is represented by a two-port admittance model via mo…
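The abstract is cut off mid-description; for reference, a generic Laplace-domain two-port admittance model (standard notation, not the paper's specific derivation) relates terminal currents and voltages as:

```latex
\begin{bmatrix} I_1(s) \\ I_2(s) \end{bmatrix}
=
\begin{bmatrix} Y_{11}(s) & Y_{12}(s) \\ Y_{21}(s) & Y_{22}(s) \end{bmatrix}
\begin{bmatrix} V_1(s) \\ V_2(s) \end{bmatrix}
```

How the converter's switching behaviour is folded into the admittance entries is the subject of the paper and is not assumed here.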
There are many diseases that affect the arteries, especially those related to their elasticity and stiffness, and these conditions can be assessed by estimating and calculating the modulus of elasticity. Accurate calculation of the elastic modulus therefore leads to an accurate assessment of these diseases, especially in their early stages, which can contribute to treating them early. Most calculations use the one-dimensional (1D) modulus of elasticity. From a mechanical point of view, however, the stresses to which the artery is subjected are not one-dimensional but three-dimensional, so estimating at least a two-dimensional (2D) modulus of elasticity will necessarily be more accurate. To the knowledge of the researchers, there i…
The study aims to analyze the content of computer textbooks for the preparatory stage with respect to logical thinking. The researcher followed the descriptive analytical research approach (content analysis), adopting the explicit idea as the unit of analysis. A content analysis tool designed around the mental processes employed during logical thinking was used to obtain the study results. The findings revealed that logical thinking skills constituted 52% of the fourth preparatory textbook and 47% of the fifth preparatory textbook.
BACKGROUND: Femoral shaft fracture is a common fracture in the pediatric age group, reaching 62% of all femoral shaft fractures in children. Despite a rapid union rate and generally successful conservative treatment, some cases need surgical intervention; one method uses a plate and screws through the lateral approach. AIM: This study aims to compare the functional outcome of fixation of mid-shaft femoral fractures in children with plate and screws between the subvastus lateralis and transvastus lateralis approaches, regarding infection, union, and limitation of knee movement. PATIENTS AND METHODS: The study was done on 30 children who had a diaphyseal femoral fracture at Al-Kindy Teaching Hospital in the period April 2018–April 2020, with 6 months of follow-up, and the pa…
Authors: Muhammad Hamza Shihab, Nuha Mohsin Dhahi. Published in: Revista iberoamericana de psicología del ejercicio y el deporte, No. 4, 2022. Journal article indexed in Dialnet.
Intrusion detection systems (IDS) are useful tools that help security administrators in the demanding task of securing the network, alerting on any potentially harmful event. An IDS can be classified as either misuse-based or anomaly-based, depending on the detection methodology. A misuse IDS recognizes known attacks by their signatures; the main disadvantage of such systems is that they cannot detect new attacks. An anomaly IDS, by contrast, models normal behaviour; its main advantage is the ability to discover new attacks, while its main drawback is a high false-alarm rate. Therefore, a hybrid IDS, a combination of misuse and anomaly detection, acts as a solution to overcome the dis…
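As a hypothetical illustration of the hybrid idea (the event fields, signatures, and threshold below are assumptions, not taken from the paper), a misuse check against known signatures can be combined with a simple anomaly score applied to events that match no signature:

```python
KNOWN_SIGNATURES = {"sql-injection", "port-scan"}  # hypothetical misuse signatures

def anomaly_score(event: dict, baseline_rate: float) -> float:
    """Toy anomaly measure: relative deviation of the event's request rate
    from the learned normal baseline."""
    return abs(event["requests_per_s"] - baseline_rate) / baseline_rate

def hybrid_ids(event: dict, baseline_rate: float, threshold: float = 2.0) -> str:
    # Misuse stage: known attacks are matched by signature.
    if event.get("signature") in KNOWN_SIGNATURES:
        return "alert:misuse"
    # Anomaly stage: unknown attacks are flagged by deviation from normal behaviour.
    if anomaly_score(event, baseline_rate) > threshold:
        return "alert:anomaly"
    return "ok"
```

The false-alarm trade-off the abstract mentions lives in the threshold: lowering it catches more novel attacks at the cost of more benign events flagged.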
A resume is the first impression between you and a potential employer, so its importance can never be underestimated. Selecting the right candidates for a job within a company can be a daunting task for recruiters when they have to review hundreds of resumes. To reduce time and effort, NLTK and natural language processing (NLP) techniques can be used to extract essential data from a resume. NLTK is a free, open-source, community-driven project and the leading platform for building Python programs that work with human language data. To select the best resume according to the company's requirements, an algorithm such as k-nearest neighbours (KNN) is used. To be selected from hundreds of resumes, your resume must be one of the best. Theref…
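A minimal sketch of the KNN-style ranking step, under assumptions: scikit-learn's TF-IDF vectorizer and `NearestNeighbors` stand in here for the NLTK extraction pipeline the abstract describes, and the resume texts are invented examples.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestNeighbors

resumes = [  # hypothetical resume texts
    "python machine learning pandas data analysis",
    "java spring backend microservices",
    "python nlp nltk text processing",
]
job_description = "python nlp text processing engineer"

# Embed resumes as TF-IDF vectors, then rank by cosine distance to the job posting.
vec = TfidfVectorizer()
X = vec.fit_transform(resumes)
knn = NearestNeighbors(n_neighbors=2, metric="cosine").fit(X)
dist, idx = knn.kneighbors(vec.transform([job_description]))
best = [resumes[i] for i in idx[0]]  # resumes closest to the job description
```

In practice the extraction stage would first reduce each raw resume to cleaned text (skills, experience, education) before vectorizing.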