With the development of cloud computing in recent years, data center networks have become a major topic in both industry and academia. However, traditional approaches based on manual configuration and dedicated hardware devices are burdensome, expensive, and cannot fully exploit the capabilities of the physical network infrastructure. Software-Defined Networking (SDN) has therefore been promoted as one of the most promising solutions for the future Internet. SDN is notable for two features: the separation of the control plane from the data plane, and support for network evolution through programmable capabilities rather than hardware solutions. This paper introduces an SDN-based optimized Resch
The key objective of the study is to understand the best processes that are currently used in managing talent in Australian higher education (AHE) and design a quantitative measurement of talent management processes (TMPs) for the higher education (HE) sector.
Three qualitative methods commonly used in empirical multi-method studies were considered, namely brainstorming, focus group discussions, and semi-structured individual interviews. Twenty
We use Bayes estimators for the unknown scale parameter of the Erlang distribution when the shape parameter is known, assuming different informative priors for the scale parameter. We derive the posterior density, together with the posterior mean and posterior variance, under four informative priors for the unknown scale parameter: the inverse exponential distribution, the inverse chi-square distribution, the inverse gamma distribution, and the standard Levy distribution. Bayes estimators are then derived under the general entropy loss function (GELF), and a simulation study is used to obtain the results: we generate different cases of the Erlang model parameters for different sample sizes. The estimates have been comp
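As a hedged illustration of the estimation step described above (not the paper's code): for Erlang data with known shape k and an inverse gamma prior on the scale theta, the posterior is again inverse gamma, and the GELF Bayes estimator has the closed form theta_hat = [E(theta^-c | data)]^(-1/c). The function names and prior hyperparameters below are assumptions for the sketch.

```python
import math
import random

def gelf_estimate(alpha, beta, c):
    """GELF Bayes estimate of the scale theta when the posterior is
    inverse gamma IG(alpha, beta):
    theta_hat = [E(theta^-c)]^(-1/c) = beta * [Gamma(alpha)/Gamma(alpha+c)]^(1/c)."""
    return beta * math.exp((math.lgamma(alpha) - math.lgamma(alpha + c)) / c)

def posterior_params(data, k, a0, b0):
    """With an IG(a0, b0) prior on theta and an Erlang(k, theta) likelihood,
    the posterior is IG(a0 + n*k, b0 + sum(x))."""
    n = len(data)
    return a0 + n * k, b0 + sum(data)

# Toy simulation: draw Erlang(k, theta) samples as sums of k exponentials.
rng = random.Random(0)
k, theta = 2, 1.5
data = [sum(rng.expovariate(1.0 / theta) for _ in range(k)) for _ in range(200)]
a, b = posterior_params(data, k, a0=2.0, b0=1.0)
estimate = gelf_estimate(a, b, c=1.0)  # close to theta for large n
```

For c = 1 the estimator reduces to beta/alpha, slightly below the posterior mean beta/(alpha - 1), reflecting the asymmetry of the entropy loss.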
A true random TTL pulse generator was implemented and investigated for quantum key distribution systems. The random TTL signals are generated with low-cost components available in local markets. The TTL signals are obtained from true random binary sequences based on registering the photon arrival time difference within coincidence windows between two single-photon detectors. The generator's performance was tested using time-to-digital converters, which give accurate readings of photon arrival times. The proposed true random TTL pulse generator can be used in any quantum key distribution system for random operation of the system's transmitters.
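A minimal sketch of the bit-extraction idea described above, assuming one bit is derived per coincidence from which detector fired first (the function name and the 5 ns window are illustrative assumptions, not taken from the paper):

```python
def bits_from_coincidences(t_a, t_b, window_ns=5.0):
    """Derive one random bit per coincidence: timestamps (ns) from two
    single-photon detectors; bit = 1 if detector A fired first, 0 if B did.
    Pairs outside the coincidence window, or exact ties, are discarded."""
    bits = []
    for ta, tb in zip(t_a, t_b):
        dt = ta - tb
        if dt != 0 and abs(dt) <= window_ns:
            bits.append(1 if dt < 0 else 0)
    return bits

# Example: third pair falls outside the 5 ns window and is dropped.
sample_bits = bits_from_coincidences([1.0, 10.0, 20.0], [2.0, 9.0, 40.0])
```

In practice such raw bits would still pass through randomness extraction and statistical testing before use in a QKD transmitter.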
Financial analysis of published financial statements is the means that enables businessmen, financial institutions, financial analysts, and others to conduct their studies and draw conclusions, obtaining information that supports decision-making, including decisions related to investment. National in making the decision on investment activity for the period from 2012 to 2018, through the information provided by the annual financial statements, by selecting a set of indicators provided by those statements (the liquidity ratio, activity ratios, and profitability ratios) to measure the ability of these indicators and determine their role in making an investment decision.
Error control schemes have become a necessity in network-on-chip (NoC) designs to improve reliability, as on-chip interconnect errors increase with the continuous shrinking of geometry. Accordingly, many researchers are trying to devise multi-bit error correction coding schemes that achieve high error correction capability with the simplest possible design, to minimize area and power consumption. A recent scheme, Multi-bit Error Correcting Coding with Reduced Link Bandwidth (MECCRLB), showed a large reduction in area and power consumption compared to a well-known scheme, Hamming product code (HPC) with Type-II HARQ. Moreover, its authors showed that the scheme can correct 11 random errors, which is considered a high
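MECCRLB itself is not reproduced here, but the syndrome-based correction that HPC builds on can be sketched with a plain Hamming(7,4) code, which corrects any single-bit error per codeword (a hedged illustration; the function names are made up):

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit Hamming codeword.
    Bit positions (1-based): parity at 1, 2, 4; data at 3, 5, 6, 7."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit via the syndrome, then return the data."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the error, 0 if none
    c = c[:]
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]
```

Multi-bit schemes such as MECCRLB or HPC extend this basic mechanism so that more than one error per flit can be corrected, at the cost of extra check bits and decoder logic.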
This paper presents a hybrid genetic algorithm (hGA) for optimizing the maximum likelihood function ln L(phi(1), theta(1)) of the mixed ARMA(1,1) model. The hGA couples two processes: the canonical genetic algorithm (cGA), composed of three main steps (selection, local recombination, and mutation), and a local search represented by the steepest descent algorithm (sDA), which is defined by three basic parameters: frequency, probability, and number of local search iterations. The experimental design is based on simulating the cGA, hGA, and sDA algorithms with different values of the model parameters and sample size (n). The study compares these algorithms by MSE value. One can conc
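As a hedged sketch of the hGA structure (not the paper's implementation), the following couples a canonical GA (tournament selection, blend recombination, Gaussian mutation) with a fixed-step steepest descent local search applied at a given frequency and probability. A simple quadratic stands in for the negative log-likelihood of ARMA(1,1); all names and parameter values are illustrative:

```python
import random

def hybrid_ga(objective, bounds, pop_size=30, generations=60,
              ls_frequency=5, ls_prob=0.5, ls_iters=10, ls_step=0.1, seed=1):
    """Minimize `objective` with a canonical GA hybridized with steepest descent.
    ls_frequency, ls_prob, ls_iters mirror the three sDA parameters named above."""
    rng = random.Random(seed)

    def clip(x):
        return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]

    def num_grad(x, h=1e-5):  # central-difference gradient
        g = []
        for i in range(len(x)):
            xp, xm = x[:], x[:]
            xp[i] += h
            xm[i] -= h
            g.append((objective(xp) - objective(xm)) / (2 * h))
        return g

    def steepest_descent(x):
        for _ in range(ls_iters):
            x = clip([v - ls_step * g for v, g in zip(x, num_grad(x))])
        return x

    def tournament(pop):
        a, b = rng.sample(pop, 2)
        return a if objective(a) < objective(b) else b

    pop = [clip([rng.uniform(lo, hi) for lo, hi in bounds])
           for _ in range(pop_size)]
    for gen in range(generations):
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(pop), tournament(pop)
            w = rng.random()  # local (blend) recombination
            child = [w * u + (1 - w) * v for u, v in zip(p1, p2)]
            child = clip([v + rng.gauss(0, 0.05) if rng.random() < 0.2 else v
                          for v in child])
            children.append(child)
        pop = children
        if gen % ls_frequency == 0:  # periodic probabilistic local search
            pop = [steepest_descent(x) if rng.random() < ls_prob else x
                   for x in pop]
    return steepest_descent(min(pop, key=objective))

# Stand-in for -ln L(phi(1), theta(1)): minimum at phi1 = 0.5, theta1 = -0.3.
neg_log_lik = lambda p: (p[0] - 0.5) ** 2 + (p[1] + 0.3) ** 2
phi1, theta1 = hybrid_ga(neg_log_lik, [(-1.0, 1.0), (-1.0, 1.0)])
```

The design choice is the one the abstract describes: the GA explores globally while the embedded steepest descent refines promising individuals, so the hybrid typically reaches a lower MSE than either component alone.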