The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices, improving the quality of human life by collecting data from their environments. However, these devices generate huge volumes of data that demand large storage and high computational capability, which cloud computing can provide. IoT device data is transferred using two protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance, more reliable system through efficient use of resources. To that end, load balancing in cloud computing is used to distribute the workload dynamically across nodes and avoid overloading any individual resource, by combining two algorithms: a dynamic one (adaptive firefly) and a static one (weighted round robin). The results show improved resource utilization, increased productivity, and reduced response time.
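The static half of the scheme, weighted round robin, can be sketched as follows (a generic formulation assuming integer weights; the node names and weights are illustrative, not from the paper):

```python
from itertools import cycle

def weighted_round_robin(servers):
    """servers: dict mapping node name -> integer weight.
    Yields nodes in proportion to their weights (static load balancing:
    the schedule is fixed in advance and ignores current node load)."""
    # Expand each node by its weight, then cycle through the expanded list.
    expanded = [name for name, w in servers.items() for _ in range(w)]
    return cycle(expanded)

# node-a has three times node-b's weight, so it receives 3 of every 4 requests
rr = weighted_round_robin({"node-a": 3, "node-b": 1})
first_eight = [next(rr) for _ in range(8)]
```

The dynamic half (adaptive firefly) would then adjust such weights from observed node load; being a heuristic search, it is not sketched here.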
In today’s competitive environment, organizational efficiency and sustained growth are crucial for survival. An organization’s performance is closely tied to strategic planning, prompting firms to gather and leverage competitive information for a competitive advantage; senior managers, recognizing this, initiate actions accordingly. This study investigates the relationship between foresight, vision, strategic partnerships, motivation, systems thinking, and organizational performance. Data gathered through a self-administered questionnaire from various textile units were analysed using structural equation modelling (SEM). The findings indicate that sub-constructs of strategic intelligence positively impact organizational performance …
This research aims to identify the effect of the numbered-heads strategy on developing oral expression skills among fifth-grade primary students in Bisha Province. To achieve this, the researcher prepared a research tool, an observation card consisting of (27) statements distributed across four axes. The tool was sent to (5) experts in the field to verify its validity, and in light of their corrections it was refined until valid for gathering field information. To verify content validity and the reliability of the tool, the researcher applied it to a sample consisting of (20) students from outside the research group. The overall coefficient of correlation between the statements of the tool is as follows: (.95, …
DeepFake is a concern for celebrities and everyone else because it is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by humans, by local descriptors, and by current approaches. Detecting manipulation in video, on the other hand, is more accessible than in a single image, and many state-of-the-art systems address it; moreover, video manipulation detection depends entirely on detection in individual frames. Much prior work on DeepFake detection in images involved complex mathematical preprocessing and many limitations, including that the face must be frontal, the eyes must be open, and the mouth must be open with teeth visible, etc. Also, the accuracy of their counterfeit detection …
Cryptography algorithms play a critical role in information technology against the various attacks witnessed in the digital era, and many studies and algorithms have addressed the security of information systems. Traditional cryptography algorithms are characterized by high computational complexity, whereas lightweight algorithms can solve most of the security issues that arise when applying traditional cryptography in constrained devices; symmetric ciphers in particular are widely applied to secure data communication on such devices. In this study, we propose a hybrid algorithm based on two cryptography algorithms, PRESENT and Salsa20. Also, a 2D logistic map of a chaotic system is …
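As a hedged illustration of how a chaotic map can feed a cipher, the classic 1-D logistic map can be iterated to produce key material (the paper uses a 2-D variant whose exact equations are not given here; the seed and parameter below are illustrative only):

```python
def logistic_keystream(x0, r, n):
    """Iterate the logistic map x <- r*x*(1-x) and quantize each state to
    a byte. Illustrative only: chaotic maps need careful parameter choice
    (r near 4 for chaotic behaviour) and a vetted construction to be of
    any cryptographic use."""
    x = x0
    out = bytearray()
    for _ in range(n):
        x = r * x * (1 - x)            # chaotic state update, x stays in (0, 1)
        out.append(int(x * 256) % 256)  # quantize state to one byte
    return bytes(out)

ks = logistic_keystream(0.3141592, 3.99, 16)   # 16 key bytes from the seed
```

The same seed always reproduces the same bytes, which is what lets sender and receiver derive matching key material.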
There is evidence that channel estimation in communication systems plays a crucial role in recovering transmitted data. In recent years there has been increasing interest in problems of channel estimation and equalization, especially when the channel impulse response follows a fast time-varying Rician fading distribution, meaning the impulse response changes rapidly; optimal channel estimation and equalization are therefore needed to recover the transmitted data. This paper compares the epsilon normalized least mean square (ε-NLMS) and recursive least squares (RLS) algorithms by computing their ability to track multiple fast time-varying Rician fading channels with different values of Doppler …
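A standard ε-NLMS update (one common formulation; the filter length, step size μ, regularizer ε, and the synthetic channel below are illustrative, not the paper's values) looks like:

```python
import numpy as np

def eps_nlms(x, d, taps=8, mu=0.5, eps=1e-6):
    """Epsilon-normalized LMS: identify the channel mapping input x to
    desired output d. Returns final weights and the error signal."""
    w = np.zeros(taps)
    e = np.zeros(len(d))
    for n in range(taps - 1, len(d)):
        u = x[n - taps + 1:n + 1][::-1]        # newest sample first
        e[n] = d[n] - w @ u                    # a-priori estimation error
        w += (mu / (eps + u @ u)) * e[n] * u   # normalized weight update
    return w, e

# identify a known channel from noiseless input/output data
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
h = np.array([0.9, -0.4, 0.2, 0.1, 0.0, 0.0, 0.0, 0.0])
d = np.convolve(x, h)[:len(x)]   # channel output d[n] = sum_k h[k] x[n-k]
w, e = eps_nlms(x, d)
```

RLS would typically converge in fewer samples than this update, at the cost of a per-sample matrix computation instead of a vector one, which is the trade-off such comparisons quantify.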
Biomedical signals such as the ECG are extremely important in patient diagnosis and are commonly recorded with noise. Many kinds of noise exist in the biomedical environment, such as Power Line Interference Noise (PLIN). Adaptive filtering is selected to contend with these defects, since adaptive filters can adjust their coefficients for a given filter order. The objectives of this paper are, first, an application of the Least Mean Square (LMS) algorithm and, second, an application of the Recursive Least Square (RLS) algorithm to remove PLIN. The LMS and RLS adaptive-filter algorithms were proposed to adapt the filter order and the filter coefficients simultaneously; the performance of existing LMS …
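For PLIN specifically, a textbook LMS arrangement adapts two weights on a synthetic quadrature reference at the mains frequency (the 50 Hz tone, sampling rate, step size, and the synthetic test signal below are assumptions for illustration, not the paper's setup):

```python
import numpy as np

def cancel_plin(noisy, fs=360.0, f0=50.0, mu=0.005):
    """Adaptive noise canceller: two LMS weights scale a sin/cos pair at
    f0 so their sum tracks the interference; the error e is the cleaned
    signal. A generic scheme, not the paper's exact filter."""
    n = len(noisy)
    t = np.arange(n) / fs
    ref = np.column_stack([np.sin(2 * np.pi * f0 * t),
                           np.cos(2 * np.pi * f0 * t)])
    w = np.zeros(2)
    e = np.zeros(n)
    for k in range(n):
        y = w @ ref[k]               # current interference estimate
        e[k] = noisy[k] - y          # cleaned sample
        w += 2 * mu * e[k] * ref[k]  # LMS weight update
    return e

# synthetic test: slow 1.2 Hz "ECG" component plus 50 Hz interference
t = np.arange(3600) / 360.0
clean = np.sin(2 * np.pi * 1.2 * t)
e = cancel_plin(clean + 0.5 * np.sin(2 * np.pi * 50.0 * t + 0.7))
```

Because the sin/cos pair spans any amplitude and phase at f0, the canceller removes the 50 Hz tone while the slow component, uncorrelated with the reference, passes through in e.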
This study uses the load factor and loss factor to determine the power losses of electrical feeders. An approach is presented to calculate power losses in the distribution system, using each feeder's technical data and daily recorded operating data.
This paper presents a more realistic method for calculating power losses, based on load and loss factors, instead of the traditional method that uses the RMS value of the load current and therefore does not account for load variation over time. Eight 11 kV feeders are taken as a case study to calculate the load factor, loss factor, and power losses. Four of them (F40, F42, F43 and F…
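The two factors can be computed as sketched below (the k = 0.3 coefficient is the commonly used Buller–Woodrow empirical value and the demand profile is hypothetical; the paper's own coefficients and feeder data may differ):

```python
def load_factor(profile):
    """Load factor = average demand / peak demand over the period."""
    return sum(profile) / (len(profile) * max(profile))

def loss_factor(lf, k=0.3):
    """Empirical loss-factor approximation LsF = k*LF + (1 - k)*LF**2."""
    return k * lf + (1 - k) * lf ** 2

def energy_loss_kwh(peak_loss_kw, lf, hours=24.0, k=0.3):
    """Energy lost over the period = peak power loss * loss factor * hours."""
    return peak_loss_kw * loss_factor(lf, k) * hours

profile = [3.0, 2.5, 4.0, 5.0, 4.5, 3.5]   # hypothetical MW demand readings
lf = load_factor(profile)                   # 22.5 / (6 * 5) = 0.75
```

This is why the method is more realistic than an RMS-current calculation: the loss factor weights the squared-current losses by how the demand actually varies over the day.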
Finger vein recognition and user identification is a relatively recent biometric recognition technology with a broad variety of applications, and biometric authentication is extensively employed in the information age. As one of the most essential authentication technologies available today, finger vein recognition captures our attention owing to its high level of security, dependability, and track record of performance. Embedded convolutional neural networks are based on early or intermediate fusion of inputs; in early fusion, pictures are categorized according to their location in the input space. In this study, we employ a highly optimized network and late fusion, rather than early fusion, to create a fusion convolutional neural network …
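Late fusion in this sense can be sketched at the decision level (a generic scheme with an assumed equal weighting, not the paper's exact fusion network):

```python
import numpy as np

def late_fusion(scores_a, scores_b, w=0.5):
    """Decision-level (late) fusion: combine the class-probability outputs
    of two independently trained branches by weighted averaging, then pick
    the winning class. Early fusion would instead merge the raw inputs
    before any network sees them."""
    fused = w * np.asarray(scores_a) + (1 - w) * np.asarray(scores_b)
    return int(np.argmax(fused)), fused

# branch A is unsure, branch B is confident: fusion follows the stronger vote
label, fused = late_fusion([0.45, 0.55], [0.9, 0.1])
```

Keeping the branches separate until this point lets each be optimized independently, which is the practical appeal of late over early fusion.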
Polymer electrolytes were prepared using the solution-cast technique. The electrolytes were prepared at a fixed PVA/PVP ratio (50:50) with ethylene carbonate (EC) and propylene carbonate (PC) (1:1), varying proportions of potassium iodide (KI) (10, 20, 30, 40, 50 wt%), and iodine (I2) at 10 wt% of the salt. Fourier Transform Infrared (FTIR) studies confirmed complex formation in the polymer blends. Electrical conductivity was measured with an impedance analyzer over the frequency range 50 Hz–1 MHz and the temperature range 293–343 K. The highest electrical conductivity, 5.3 × 10⁻³ S/cm, was observed for the electrolyte with 50 wt% KI at room temperature …
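Conductivity values of this kind are conventionally derived from the bulk resistance read off the impedance spectrum via σ = t / (R_b · A); a minimal sketch (the sample dimensions and resistance below are hypothetical, not the paper's measurements):

```python
def conductivity_s_per_cm(thickness_cm, bulk_resistance_ohm, area_cm2):
    """Ionic conductivity sigma = t / (R_b * A), with film thickness t in
    cm, bulk resistance R_b in ohm, and electrode area A in cm^2, giving
    sigma in S/cm."""
    return thickness_cm / (bulk_resistance_ohm * area_cm2)

# e.g. a 0.02 cm thick film, 4 ohm bulk resistance, 1 cm^2 electrode area
sigma = conductivity_s_per_cm(0.02, 4.0, 1.0)   # 5.0e-3 S/cm
```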