With the growth of cloud computing in recent years, data center networks have become a major topic in both industry and academia. Traditional approaches based on manual configuration and dedicated hardware devices are burdensome, expensive, and cannot fully exploit the capacity of the physical network infrastructure. Software-Defined Networking (SDN) has therefore been promoted as one of the most promising solutions for future Internet performance. SDN is distinguished by two features: the separation of the control plane from the data plane, and the replacement of hardware-bound network development with programmable capabilities. This paper introduces an SDN-based optimized Reschedule Algorithm (SDN-RA) for cloud data center networks. The performance of SDN-RA is validated and compared against two corresponding SDN schemes, ECMP and Hedera. The simulation environment is implemented as a Fat-Tree topology in the Mininet emulator connected to a Ryu SDN controller. The evaluation shows that SDN-RA increases network throughput and link utilization while reducing RTT delay and loss rate.
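As a rough illustration of this experimental setup (not the paper's code), the sketch below builds a small Fat-Tree-like slice in Mininet and attaches it to a remote Ryu controller; the topology size, switch and host names, and controller address are assumptions. A full k-ary Fat-Tree would add multiple equal-cost paths between pods, which is what ECMP, Hedera, and SDN-RA differ in exploiting.

```python
# Minimal sketch: a tiny Fat-Tree-like Mininet topology attached to a remote
# Ryu controller. Topology size, names, and the controller IP/port are assumptions.
from mininet.net import Mininet
from mininet.node import RemoteController
from mininet.topo import Topo


class MiniFatTree(Topo):
    """Toy two-pod slice: 1 core, 2 aggregation, 2 edge switches, 4 hosts."""

    def build(self):
        core = self.addSwitch('c1')
        for pod in range(2):
            agg = self.addSwitch('a%d' % (pod + 1))
            edge = self.addSwitch('e%d' % (pod + 1))
            self.addLink(core, agg)
            self.addLink(agg, edge)
            for h in range(2):
                host = self.addHost('h%d' % (pod * 2 + h + 1))
                self.addLink(edge, host)


if __name__ == '__main__':
    net = Mininet(topo=MiniFatTree(), controller=None)
    # Ryu is assumed to be running separately, e.g. `ryu-manager ryu.app.simple_switch_13`
    net.addController('c0', controller=RemoteController, ip='127.0.0.1', port=6633)
    net.start()
    net.pingAll()   # quick reachability check; the paper measures throughput, RTT, and loss
    net.stop()
```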
Lasmiditan (LAS) was formulated as a nanoemulsion-based in situ gel (NEIG) with the aim of overcoming its limited oral bioavailability via intranasal application. The solubility of LAS in oils, emulsifiers, and co-emulsifiers was determined to identify the nanoemulsion (NE) components, and phase diagrams were constructed to identify the nanoemulsification region. LAS NE was formulated using the spontaneous nanoemulsification method. Four NEs (F19, F24, F31, and F34), containing 7–15% oleic acid (OA) as the oily phase and 40–55% of a Labrasol (LR) and Transcutol (TC) emulsifier mixture at 1:1, 2:1, 3:1, and 1:2 ratios with 30–53% (w/w) aqueous phase, showed suitable optical transparency of 95–98%, globule sizes of 104–140 nm, and polydispersity index
This research aims to identify the relationship between occupational hypocrisy and organizational strategic success by analyzing the correlations and influence between the two variables, applied to a random sample of university professors at the Faculty of Administration and Economics, University of Kufa.
The main data-collection tool was a survey: questionnaires were distributed randomly to the professors, and (43) questionnaires were returned. The instrument's validity was tested using Structural Equation Modeling (SEM), and the hypotheses were tested using the Statistical Package for the Social Sciences (SPSS v. 18). The research reached a set of conclusions, among them that occupational hypocrisy has
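As a generic illustration of the correlation analysis described above (hypothetical data, not the study's questionnaires or SPSS output), the following sketch tests a correlation hypothesis on simulated Likert-scale scores for the two variables.

```python
# Hypothetical data only: testing the kind of correlation hypothesis described
# above with Pearson's r instead of SPSS.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 43                                      # number of returned questionnaires
hypocrisy = rng.normal(3.0, 0.6, n)         # assumed 5-point Likert scale means
success = 4.0 - 0.5 * hypocrisy + rng.normal(0, 0.4, n)

r, p = stats.pearsonr(hypocrisy, success)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")  # reject H0 of no correlation if p < 0.05
```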
Finding orthogonal matrices of different sizes is complex and important because such matrices are used in applications such as image processing and communications (e.g., CDMA and OFDM). In this paper we introduce a new method for constructing orthogonal matrices by taking tensor products of two or more orthogonal matrices with real and imaginary entries, and we apply it to image and communication-signal processing. The resulting matrices are themselves orthogonal, and the construction is much simpler than classical methods that rely on direct proofs. The results are acceptable for communication signals and images, but further research is needed.
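The property the construction relies on can be checked directly: the tensor (Kronecker) product of orthogonal or unitary matrices is again orthogonal or unitary. The sketch below verifies this for a real Hadamard matrix and an illustrative complex unitary matrix (the specific matrices are assumptions, not the paper's).

```python
# Sketch of the stated property: the Kronecker (tensor) product of two
# orthogonal/unitary matrices is again orthogonal/unitary.
import numpy as np

H = (1 / np.sqrt(2)) * np.array([[1, 1],
                                 [1, -1]])        # 2x2 Hadamard (real, orthogonal)
U = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                 [1j, 1]])        # 2x2 unitary with imaginary entries

K = np.kron(H, U)                                 # 4x4 tensor (Kronecker) product
print(np.allclose(K.conj().T @ K, np.eye(4)))     # True: K is unitary as well
```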
Cloud storage provides scalable, low-cost resources and achieves economies of scale through a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates redundant data, becomes essential. Data storage is the most important cloud service, and to protect the owner's privacy, data are stored in the cloud in encrypted form. Encrypted data, however, introduce new challenges for cloud deduplication: traditional deduplication schemes cannot operate on ciphertext, and existing encrypted-data deduplication solutions suffer from security weaknesses. This paper proposes a combined compressive-sensing and video deduplication scheme to maximize
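Since the abstract is truncated, the sketch below illustrates only the generic building block that makes deduplication of encrypted data possible at all: convergent encryption, where the key is derived from the chunk content so identical chunks encrypt to identical ciphertexts. It is not the paper's compressive-sensing scheme, and the chunking, key handling, and tag format are assumptions.

```python
# Generic illustration of convergent encryption for encrypted-data deduplication
# (not the paper's scheme): identical plaintext chunks yield identical ciphertexts,
# so the server can deduplicate without seeing the plaintext.
import hashlib
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes


def convergent_encrypt(chunk: bytes):
    key = hashlib.sha256(chunk).digest()          # key derived from the content itself
    nonce = hashlib.sha256(key).digest()[:16]     # deterministic nonce (assumption)
    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ciphertext = enc.update(chunk) + enc.finalize()
    tag = hashlib.sha256(ciphertext).hexdigest()  # dedup index used by the server
    return ciphertext, tag


store = {}                                        # server side: tag -> ciphertext
for chunk in [b"frame-0001", b"frame-0002", b"frame-0001"]:   # one duplicate chunk
    ct, tag = convergent_encrypt(chunk)
    store.setdefault(tag, ct)                     # duplicate ciphertext is stored only once

print(len(store))                                 # 2, not 3: the duplicate was removed
```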
Bayesian models are commonly used in recent research across many scientific fields. This research presents a new Bayesian model for parameter estimation and forecasting based on the Gibbs sampler algorithm. Posterior distributions are generated using an inverse gamma distribution and a multivariate normal distribution as priors. The new method is used to investigate and summarize the resulting posterior distribution, and the theory and derivation of the posterior are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500, and the procedure is further extended to a real dataset, the rock intensity dataset. The actual dataset was collected
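A minimal sketch of the kind of Gibbs sampler described above, for a Bayesian linear regression with a multivariate normal prior on the coefficients and an inverse gamma prior on the error variance; the data are simulated here and the prior hyperparameters are assumptions, not the paper's.

```python
# Gibbs sampler sketch: y = X b + e, e ~ N(0, sigma2 I),
# priors b ~ N(m0, V0) (multivariate normal), sigma2 ~ InvGamma(a0, b0).
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 3                                   # one of the paper's sample sizes
X = rng.normal(size=(n, p))
beta_true, sigma2_true = np.array([1.5, -2.0, 0.7]), 0.5
y = X @ beta_true + rng.normal(0, np.sqrt(sigma2_true), n)

m0, V0 = np.zeros(p), 10.0 * np.eye(p)          # assumed prior on beta
a0, b0 = 2.0, 1.0                               # assumed prior on sigma^2
beta, sigma2 = np.zeros(p), 1.0
draws = []

for it in range(5000):
    # beta | sigma2, y  ~  N(mu_n, V_n)
    Vn = np.linalg.inv(np.linalg.inv(V0) + X.T @ X / sigma2)
    mu_n = Vn @ (np.linalg.inv(V0) @ m0 + X.T @ y / sigma2)
    beta = rng.multivariate_normal(mu_n, Vn)
    # sigma2 | beta, y  ~  InvGamma(a0 + n/2, b0 + 0.5 * RSS)
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + 0.5 * resid @ resid))
    if it >= 1000:                              # discard burn-in
        draws.append(np.append(beta, sigma2))

print(np.mean(draws, axis=0))                   # posterior means near beta_true, sigma2_true
```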
This paper addresses the nature of Spatial Data Infrastructure (SDI), considered one of the most important concepts for ensuring the effective functioning of a modern society. An SDI comprises a continually evolving set of methods and procedures that provide the geospatial base supporting a country’s governmental, environmental, economic, and social activities. In general, the SDI framework integrates several elements, including standards, policies, networks, data, end users, and application areas. The transformation of previously paper-based map data into digital form, the emergence of GIS, and the Internet with its host of online applications (e.g., environmental impact analysis, navigation, and applications of VGI data