Cloud storage provides scalable, low-cost resources whose economies of scale rest on a cross-user architecture. Data storage is the most important cloud service, and as the volume of outsourced data grows explosively, data deduplication, a technique that eliminates redundant data, becomes essential. To protect the data owner's privacy, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud deduplication: traditional deduplication schemes cannot operate on ciphertext, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes a scheme that combines compressive sensing with video deduplication to maximize the deduplication ratio. Our approach uses data deduplication to remove identical copies of a video. Our experimental results show significant storage savings while providing a strong level of security
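The paper's exact scheme is not detailed in the abstract; as a rough illustration of the deduplication step, block-level deduplication is commonly content-addressed, storing each chunk under a cryptographic hash so identical chunks are kept only once (a minimal sketch; `dedup_store` and the pre-chunked input are hypothetical, and real systems dedup at the chunk or segment level):

```python
import hashlib

def dedup_store(chunks, store=None):
    """Store each chunk under its SHA-256 digest; identical chunks
    are kept only once (content-addressed deduplication)."""
    store = {} if store is None else store
    refs = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # keep only the first copy
        refs.append(digest)              # later copies become references
    return refs, store

# Two video segments plus a duplicate: only two chunks are stored.
refs, store = dedup_store([b"segment-A", b"segment-B", b"segment-A"])
```

Note that this sketch deduplicates plaintext; deduplicating encrypted data, the challenge the abstract highlights, requires deterministic schemes such as convergent encryption so identical plaintexts yield identical ciphertexts.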
Data-Driven Requirements Engineering (DDRE) represents a shift from static, traditional methods of requirements engineering to dynamic, data-driven, user-centered methods. Given the data now available and the increasingly complex requirements of software systems whose functions must adapt to changing needs to retain users' trust, an approach embedded in a continuous software engineering process is needed. This need drives new challenges in the requirements engineering discipline to meet the required changes. The problem addressed in this study was a method affected by data discrepancies, which hampered the needs elicitation process, so that the developed software ultimately exhibited discrepancies and could not meet the need
Canonical correlation analysis is a common method for analyzing data and understanding the relationship between two sets of variables under study; it depends on analyzing the variance matrix or the correlation matrix. Researchers use many methods to estimate the canonical correlation (CC); some are biased by outliers, while others are resistant to such values. In addition, there are criteria for checking the efficiency of the estimation methods.
In our research, we dealt with robust estimation methods that depend on the correlation matrix in the analysis process to obtain a robust canonical correlation coefficient, which is the method of Biwe
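As background to the classical estimator that such robust methods modify, the canonical correlations can be computed as the singular values of the whitened cross-covariance matrix (a minimal sketch of standard, non-robust CCA; the synthetic data are illustrative only):

```python
import numpy as np

def canonical_correlations(X, Y):
    """Classical CCA: singular values of Sxx^{-1/2} Sxy Syy^{-1/2}."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / (n - 1)
    Syy = Y.T @ Y / (n - 1)
    Sxy = X.T @ Y / (n - 1)

    def inv_sqrt(S):
        # Inverse matrix square root via eigendecomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)  # descending order

# Two datasets sharing one latent factor z: first canonical
# correlation is near 1, the second is near 0.
rng = np.random.default_rng(0)
z = rng.standard_normal((500, 1))
X = np.hstack([z + 0.1 * rng.standard_normal((500, 1)),
               rng.standard_normal((500, 1))])
Y = np.hstack([z + 0.1 * rng.standard_normal((500, 1)),
               rng.standard_normal((500, 1))])
cc = canonical_correlations(X, Y)
```

Robust variants replace the sample covariance/correlation matrices above with outlier-resistant estimates (such as biweight-based ones) before the same whitening step.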
The aim of this study was to investigate the removal of the antibiotic amoxicillin from synthetic pharmaceutical wastewater. Titanium dioxide (TiO2) was used as the photocatalyst under natural solar irradiation in a tubular reactor. Photocatalytic removal efficiency was evaluated from the reduction in amoxicillin concentration. The effects of antibiotic concentration, TiO2 dose, irradiation time, and pH were studied. The optimum conditions were found to be an irradiation time of 5 hr, a catalyst dosage of 0.6 g/L, a flow rate of 1 L/min, and pH 5. The photocatalytic treatment was able to degrade the amoxicillin in 5 hr and induced an amoxicillin reduction of about 10% with 141.8 kJ/L accumulate
Kidney tumors are of different types with different characteristics and remain challenging in the field of biomedicine. It is very important to detect and classify a tumor at an early stage so that appropriate treatment can be planned. Accurate estimation of kidney tumor volume is essential for clinical diagnosis and therapeutic decisions related to renal diseases. The main objective of this research is to use Computer-Aided Diagnosis (CAD) algorithms to help in the early detection of kidney tumors, addressing the challenges of accurate kidney tumor volume estimation caused by extensive variations in kidney shape, size, and orientation across subjects.
In this paper, we have tried to implement an automated segmentati
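Once a segmentation mask is available, the volume estimation the abstract refers to typically reduces to counting labeled voxels and scaling by the physical voxel size (a minimal sketch; the mask and spacing values are hypothetical, not taken from the paper):

```python
import numpy as np

def tumor_volume_ml(mask, spacing_mm):
    """Volume of a binary segmentation mask in millilitres:
    voxel count x voxel volume in mm^3, divided by 1000."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return float(mask.sum()) * voxel_mm3 / 1000.0

# A 10x10x10-voxel "tumor" at 1 mm isotropic spacing -> 1 mL
mask = np.zeros((64, 64, 64), dtype=bool)
mask[20:30, 20:30, 20:30] = True
vol = tumor_volume_ml(mask, (1.0, 1.0, 1.0))  # -> 1.0
```

The variation in kidney shape, size, and orientation mentioned above affects the segmentation step; the volume calculation itself is insensitive to orientation.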
The behavior of incompressible two-phase flow in a T-junction, a configuration applied across different industries, is analyzed using a Computational Fluid Dynamics (CFD) model. The level set method was based on the finite element method. In our work, the behavior of a two-phase flow of oil and water was studied. The two-phase flow was simulated using COMSOL 4.3. Several variables were studied, such as the velocity distribution, shear rate, pressure, and volume fraction at various times. Inlet velocities of 0.2633, 0.1316, 0.0547, and 0.0283 m/s were applied for water and 0.1316 m/s for oil, and in addition the pressure was set at the outlet as a boundary condition. It was observed through the program
A rapid, sensitive, extraction-free spectrophotometric method for the determination of clonazepam (CLO) in pure form and in pharmaceutical dosage forms is described. The proposed method depends simply on a charge-transfer reaction between reduced CLO (n-donor) and metol (N-methyl-p-aminophenol sulfate) as a chromogenic reagent (π-acceptor). The drug, reduced with zinc and concentrated hydrochloric acid, produced a purple, soluble charge-transfer complex with metol in the presence of sodium metaperiodate in neutral medium, which was measured at λmax 532 nm. All variables affecting the development and stability of the colored product, such as the concentrations of the reagent and oxidant, temperature, and time of rea
Software-Defined Networking (SDN) with centralized control provides a global view and achieves efficient management of network resources. However, centralized controllers have several limitations related to scalability and performance, especially with the exponential growth of 5G communication. This paper proposes a novel traffic scheduling algorithm to avoid congestion in the control plane. The Packet-In messages received from different 5G devices are classified into two classes, critical and non-critical 5G communication, by adopting a Dual-Spike Neural Network (DSNN) classifier implemented on a Virtualized Network Function (VNF). Dual spikes identify each class to increase the reliability of the classification
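The DSNN classifier itself is not reproduced here; as an illustration of the scheduling step that can follow such a two-class split, critical Packet-In messages may be drained from a priority queue ahead of non-critical ones (a minimal sketch under that assumption; the queue discipline and message names are hypothetical, not the paper's algorithm):

```python
import heapq
import itertools

CRITICAL, NON_CRITICAL = 0, 1  # lower value = higher priority

def make_scheduler():
    """Two-class priority scheduler: FIFO within a class,
    critical messages always dequeued before non-critical ones."""
    queue, counter = [], itertools.count()
    def enqueue(msg, klass):
        # The counter breaks ties so ordering stays FIFO per class.
        heapq.heappush(queue, (klass, next(counter), msg))
    def dequeue():
        return heapq.heappop(queue)[2]
    return enqueue, dequeue

enqueue, dequeue = make_scheduler()
enqueue("pkt-a", NON_CRITICAL)
enqueue("pkt-b", CRITICAL)
enqueue("pkt-c", CRITICAL)
order = [dequeue(), dequeue(), dequeue()]  # -> ['pkt-b', 'pkt-c', 'pkt-a']
```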
Improving students' use of argumentation is front and center in the increasing emphasis on scientific practice in K-12 science and STEM programs. We explore the construct validity of scenario-based assessments of claim-evidence-reasoning (CER) and the structure of the CER construct with respect to a learning-progression framework. We also seek to understand how middle school students progress. Establishing the purpose of an argument is a competency that a majority of middle school students meet, whereas quantitative reasoning is the most difficult, and the Rasch model indicates that the competencies form a unidimensional hierarchy of skills. We also find no evidence of differential item functioning between different scenarios, suggesting
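For reference, the dichotomous Rasch model mentioned above gives the probability of a correct response as a logistic function of the gap between person ability θ and item difficulty b (a minimal sketch; the ability and difficulty values are hypothetical, not estimates from the study):

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model:
    P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A student at ability theta = 1.0 facing an easy item (b = -1)
# versus a hard item (b = 2): harder items -> lower probability.
p_easy = rasch_probability(1.0, -1.0)  # ~0.88
p_hard = rasch_probability(1.0, 2.0)   # ~0.27
```

A unidimensional hierarchy of skills, as reported above, means the item difficulties b can be placed on the same single latent scale as the person abilities θ.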