The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environment. However, these devices generate huge volumes of data that demand large storage and high computational capabilities. Cloud computing can be used to store this big data. The data of IoT devices is transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance, more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes and avoid overloading any individual resource, by combining two types of algorithms: a dynamic algorithm (adaptive firefly) and a static algorithm (weighted round robin). The results show improved resource utilization, increased productivity, and reduced response time.
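As a rough illustration of the static half of such a scheme, the sketch below implements weighted round-robin dispatch with weights that an external dynamic optimizer could update at run time. The class, node names, and the update hook are hypothetical assumptions, not the paper's implementation.

```python
# Minimal weighted round-robin sketch with dynamically adjustable weights
# (illustrative only; node names and the load-update hook are hypothetical).
from itertools import cycle

class WeightedRoundRobin:
    def __init__(self, nodes):
        # nodes: dict mapping node name -> integer weight
        self.nodes = dict(nodes)
        self._rebuild()

    def _rebuild(self):
        # Repeat each node according to its weight and cycle over the list.
        expanded = [n for n, w in self.nodes.items() for _ in range(max(1, w))]
        self._cycle = cycle(expanded)

    def update_weight(self, node, new_weight):
        # A dynamic balancer (e.g. a firefly-style optimizer) could call this
        # whenever the measured load of a node changes.
        self.nodes[node] = new_weight
        self._rebuild()

    def next_node(self):
        return next(self._cycle)

balancer = WeightedRoundRobin({"vm1": 3, "vm2": 1})
print([balancer.next_node() for _ in range(4)])  # "vm1" is chosen three times as often as "vm2"
```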
This paper describes the application of consensus optimization to a Wireless Sensor Network (WSN) system. A consensus algorithm is usually run for a certain number of iterations for a given graph topology. Nevertheless, the best Number of Iterations (NOI) to reach consensus varies with any change in the number of nodes or other parameters of the graph topology. As a result, a time-consuming trial-and-error procedure would otherwise be needed to obtain the best NOI. Introducing an intelligent optimization can effectively help to find the optimal NOI. The performance of the consensus algorithm has been considerably improved by the inclusion of Particle Swarm Optimization (PSO).
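To make the NOI question concrete, the following sketch runs a standard discrete-time average-consensus update for a fixed number of iterations on a small ring graph; in the paper a PSO search tunes this iteration count, which is not shown here. The step size, topology, and initial states are illustrative assumptions.

```python
import numpy as np

def average_consensus(x0, adjacency, eps, num_iters):
    # Discrete-time average consensus:
    #   x_i(k+1) = x_i(k) + eps * sum_j a_ij (x_j(k) - x_i(k))
    x = np.asarray(x0, dtype=float)
    A = np.asarray(adjacency, dtype=float)
    for _ in range(num_iters):
        x = x + eps * (A @ x - A.sum(axis=1) * x)
    return x

# Ring of 4 nodes; the true average of the initial states is 2.5.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
print(average_consensus([1.0, 2.0, 3.0, 4.0], A, eps=0.25, num_iters=50))
```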
The improvement in Direction of Arrival (DOA) estimation when the received signals impinge on Active-Parasitic Antenna (APA) arrays is studied in this work. An APA array consists of several active antennas; the others are parasitic antennas. The responses to the received signals are measured at the loaded terminals of the active elements, while the terminals of the parasitic elements are short-circuited. The effect of the received signals on the parasites, i.e., the induced short-circuit current, is mutually coupled to the active elements. Eigendecomposition of the covariance matrix of the APA array measurements generates a third subspace in addition to the traditional signal and noise subspaces generated by all-active antenna arrays.
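A generic (non-APA-specific) version of the subspace step is sketched below: eigendecomposition of a sample covariance matrix split into signal and noise subspaces, assuming the number of sources is known. It does not model the third subspace introduced by the parasitic elements.

```python
import numpy as np

def split_subspaces(snapshots, num_sources):
    # snapshots: (num_elements, num_snapshots) complex measurement matrix.
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)                      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]                         # sort descending
    eigvecs = eigvecs[:, order]
    signal_subspace = eigvecs[:, :num_sources]
    noise_subspace = eigvecs[:, num_sources:]
    return signal_subspace, noise_subspace

# Illustrative call on random snapshots from an 8-element array with 2 sources assumed.
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((8, 200)) + 1j * rng.standard_normal((8, 200))
Es, En = split_subspaces(snapshots, num_sources=2)
print(Es.shape, En.shape)   # (8, 2) (8, 6)
```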
This paper adapts a neural network for estimating the direction of arrival (DOA). It uses an unsupervised adaptive neural network with the Generalized Hebbian Algorithm (GHA) to extract the principal components, which are in turn used by the Capon method to estimate the DOA. Because the PCA neural network yields the signal subspace only, the Capon estimator works with the signal subspace and ignores the noise subspace.
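A minimal sketch of the GHA (Sanger's rule) update used to extract the leading principal components is given below; feeding the resulting signal-subspace vectors into a Capon-style estimator is not shown. The learning rate, epoch count, and synthetic data are assumptions.

```python
import numpy as np

def gha_principal_components(X, num_components, eta=1e-3, epochs=50):
    # Generalized Hebbian Algorithm (Sanger's rule):
    #   W <- W + eta * (y x^T - lower_triangular(y y^T) W),  with y = W x.
    # Returns an estimate of the leading principal components of X (rows are samples).
    rng = np.random.default_rng(0)
    dim = X.shape[1]
    W = rng.standard_normal((num_components, dim)) * 0.01
    for _ in range(epochs):
        for x in X:
            y = W @ x
            W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Example: leading 2 components of a synthetic 3-D data set (rows = samples).
X = np.random.default_rng(1).standard_normal((500, 3)) * np.array([3.0, 1.0, 0.3])
print(gha_principal_components(X, num_components=2))
```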
In this paper, an FPGA model of an intelligent traffic light system with power saving was built. The intelligent traffic light system consists of sensors placed at the ends of the intersection's approaches to sense the presence or absence of vehicles. The system reduces waiting time at a red light by switching from one traffic-light state to the other when the current state has lasted a long time and no more vehicles remain on its approach. The proposed system is built using VHDL, simulated using the Xilinx ISE 9.2i package, and implemented on a Spartan-3A XC3S700A FPGA kit. Implementation and behavioral-simulation results show that the proposed intelligent traffic light system model satisfies the specified operational requirements.
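The paper's design is in VHDL; purely as a behavioral illustration of the early-transition rule, the Python sketch below switches phase either when the green timer expires or when the active approach's sensor reports no vehicles. Timing values and sensor names are assumptions.

```python
# Behavioral sketch (Python, not the paper's VHDL) of the early-transition rule:
# a green phase ends early once its approach sensor reports no vehicles.
GREEN_TIME = 30   # maximum green duration in ticks (illustrative value)

def traffic_controller(sensor_readings):
    # sensor_readings: iterable of (north_south_busy, east_west_busy) booleans per tick.
    phase, timer = "NS_GREEN", 0
    for ns_busy, ew_busy in sensor_readings:
        timer += 1
        expired = timer >= GREEN_TIME
        idle = (phase == "NS_GREEN" and not ns_busy) or (phase == "EW_GREEN" and not ew_busy)
        if expired or idle:                     # switch early when the active approach is empty
            phase = "EW_GREEN" if phase == "NS_GREEN" else "NS_GREEN"
            timer = 0
        yield phase

# Example: east-west stays busy, north-south goes empty after tick 3 -> early switch at tick 4.
readings = [(True, True)] * 3 + [(False, True)] * 5
print(list(traffic_controller(readings)))
```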
The phenomenon of extreme values (maximum or rare values) is an important one, and two sampling techniques are used to deal with it: the Peaks-Over-Threshold (POT) technique and the Annual Maximum (AM) sampling technique. The extreme value (Gumbel) distribution is fitted to the AM sample, while the generalized Pareto and exponential distributions are fitted to the POT sample. The cross-entropy algorithm was applied in two of its variants: the first estimates using order statistics, and the second uses order statistics together with the likelihood ratio; a third method is proposed by the researcher. The estimated parameters and the probability density function of each distribution were compared using the mean squared error (MSE).
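For orientation, the sketch below fits the two sampling schemes with standard estimators rather than the paper's cross-entropy variants: a Gumbel distribution to annual maxima (AM) and a generalized Pareto distribution to peaks-over-threshold (POT) exceedances. The synthetic data and threshold choice are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
daily = rng.gumbel(loc=10.0, scale=2.0, size=(50, 365))    # 50 synthetic "years" of daily values

# AM scheme: fit a Gumbel distribution to the annual maxima.
annual_maxima = daily.max(axis=1)
loc, scale = stats.gumbel_r.fit(annual_maxima)

# POT scheme: fit a generalized Pareto distribution to exceedances over a high threshold.
threshold = np.quantile(daily, 0.99)
exceedances = daily[daily > threshold] - threshold
shape, _, pot_scale = stats.genpareto.fit(exceedances, floc=0)

print("AM/Gumbel:", loc, scale, " POT/GPD:", shape, pot_scale)
```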
In this research, the focus was on estimating the parameters of the min-Gumbel distribution using the maximum likelihood method and the Bayes method. A genetic algorithm was employed in estimating the parameters for both the maximum likelihood method and the Bayes method. The comparison was made using the mean squared error (MSE), where the best estimator is the one with the smallest MSE. It was noted that the best estimator was (BLG_GE).
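A toy genetic-algorithm search for the maximum-likelihood parameters of a min-Gumbel sample is sketched below; the population size, mutation scale, and generation count are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = stats.gumbel_l.rvs(loc=5.0, scale=1.5, size=200, random_state=rng)  # min-Gumbel sample

def neg_log_likelihood(params):
    loc, scale = params
    if scale <= 0:
        return np.inf
    return -stats.gumbel_l.logpdf(data, loc=loc, scale=scale).sum()

pop = rng.uniform([0.0, 0.1], [10.0, 5.0], size=(40, 2))    # initial population of (loc, scale)
for _ in range(100):                                         # generations
    fitness = np.array([neg_log_likelihood(p) for p in pop])
    parents = pop[np.argsort(fitness)[:10]]                  # selection: keep the 10 fittest
    children = parents[rng.integers(0, 10, size=30)]         # reproduction by copying parents
    children += rng.normal(scale=0.1, size=children.shape)   # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin([neg_log_likelihood(p) for p in pop])]
print("GA estimate (loc, scale):", best)
```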
In this research, the focus was placed on estimating the parameters of the Hypoexponential distribution function using the maximum likelihood method and a genetic algorithm. More than one criterion, including the MSE, was adopted for comparison using the simulation method.
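The simulation-based MSE comparison can be illustrated as below: draw repeated hypoexponential samples (sums of independent exponentials with distinct rates), apply an estimator, and average the squared error. The moment-style estimator here is only a placeholder for the MLE and genetic-algorithm estimators compared in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
true_rates = (1.0, 3.0)

def hypoexp_sample(n):
    # Hypoexponential variate: sum of independent exponentials with distinct rates.
    return rng.exponential(1 / true_rates[0], n) + rng.exponential(1 / true_rates[1], n)

def mse_of(estimator, true_value, replications=1000, n=100):
    estimates = np.array([estimator(hypoexp_sample(n)) for _ in range(replications)])
    return np.mean((estimates - true_value) ** 2)

# Placeholder estimator of the hypoexponential mean 1/rate1 + 1/rate2.
print(mse_of(np.mean, sum(1 / r for r in true_rates)))
```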
In this paper, the generalized inverted exponential distribution is considered as one of the most important distributions in studying failure times. The shape and scale parameters of the distribution are estimated after removing the fuzziness that characterizes its data, since the observations are triangular fuzzy numbers. To convert the fuzzy data to crisp data, the researcher used the centroid method. Because the two parameters of the studied distribution are difficult to separate and estimate directly by the MLE method, the Newton-Raphson method has been used.
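The two computational steps named above can be sketched as follows: centroid defuzzification of triangular fuzzy observations, and a generic one-dimensional Newton-Raphson solver of the kind used on the likelihood equation. The example root-finding problem is a placeholder, not the generalized inverted exponential likelihood.

```python
import numpy as np

def centroid_defuzzify(triangular_data):
    # For a triangular fuzzy number (a, b, c) the centroid is (a + b + c) / 3.
    return np.asarray(triangular_data, dtype=float).mean(axis=1)

def newton_raphson(f, fprime, x0, tol=1e-8, max_iter=100):
    # Solves f(x) = 0; for MLE, f is the score function and fprime its derivative.
    x = float(x0)
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

crisp = centroid_defuzzify([(1.0, 2.0, 3.5), (0.5, 1.0, 1.2)])
print(crisp)                                                  # [2.1666..., 0.9]
print(newton_raphson(lambda t: t**2 - 2, lambda t: 2 * t, 1.0))  # ~1.41421 (placeholder equation)
```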
Multi-document summarization is an optimization problem demanding optimization of more than one objective function simultaneously. The proposed work addresses balancing two significant objectives, content coverage and diversity, when generating summaries from a collection of text documents.
Any automatic text summarization system faces the challenge of producing a high-quality summary. Despite the existing efforts in designing and evaluating many text summarization techniques, their formulations lack a model that gives an explicit representation of coverage and diversity, the two contradictory semantics of any summary.
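A minimal way to score a candidate extractive summary on these two competing objectives is sketched below: coverage as similarity of the selected sentences to the document centroid, and diversity as low redundancy among them. The cosine measure and the equal weighting are assumptions, not the paper's model.

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def summary_score(sentence_vectors, selected, weight=0.5):
    # Coverage: mean similarity of the selected sentences to the document centroid.
    centroid = sentence_vectors.mean(axis=0)
    chosen = sentence_vectors[selected]
    coverage = np.mean([cosine(v, centroid) for v in chosen])
    # Diversity: one minus the mean pairwise similarity among the selected sentences.
    pairwise = [cosine(chosen[i], chosen[j])
                for i in range(len(chosen)) for j in range(i + 1, len(chosen))]
    diversity = 1.0 - (np.mean(pairwise) if pairwise else 0.0)
    return weight * coverage + (1.0 - weight) * diversity

# Example: three toy sentence vectors; the selection mixes both topics.
vecs = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
print(summary_score(vecs, [0, 2]))
```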
Hepatitis is one of the diseases that has become more widespread in recent years in terms of the high number of infections. Hepatitis causes inflammation that destroys liver cells, and it occurs as a result of viruses, bacteria, blood transfusions, and other causes. There are five types of hepatitis viruses (A, B, C, D, E), which differ in severity, and the disease varies by type. Accurate and early diagnosis is the best way to prevent the disease, as it allows infected people to take preventive steps so that they do not transmit it to other people, and diagnosis using artificial intelligence gives an accurate and rapid diagnostic result. The analytical method relied on a radial basis network to diagnose the disease.
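As a generic illustration (not the paper's trained diagnostic model), the sketch below builds a small radial basis function network: Gaussian activations around sampled centres followed by a least-squares output layer, applied to synthetic data standing in for clinical features.

```python
import numpy as np

def rbf_features(X, centres, width=1.0):
    # Gaussian radial basis activations of each sample around each centre.
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 5))                        # 100 synthetic patients, 5 features
y = (X[:, 0] + X[:, 1] > 0).astype(float)            # synthetic binary diagnosis label

centres = X[rng.choice(len(X), size=10, replace=False)]
Phi = rbf_features(X, centres)
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)    # train the linear output layer
predictions = (rbf_features(X, centres) @ weights > 0.5).astype(int)
print("training accuracy:", (predictions == y).mean())
```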