Nonlinear models are among the most important tools in time series analysis, with wide potential for predicting physical, engineering, and economic phenomena by studying the characteristics of random disturbances in order to arrive at accurate predictions.
In this work, a threshold autoregressive model with an exogenous variable was built as the first method, using two proposed approaches to determine the best cut-off point for both forward predictability (forecasting) and within-series predictability (prediction), through the threshold point indicator. Box-Jenkins seasonal models are used as a second method, based on the principle of the two proposed approaches in dete
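The threshold selection idea described above can be illustrated with a minimal sketch: a two-regime threshold autoregressive model with one exogenous variable, where the cut-off point is chosen by grid search over candidate thresholds. The simulated data, the grid search, and all parameter values here are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a hypothetical two-regime TARX series: the AR coefficient switches
# when the lagged value crosses a threshold (true threshold = 0 here).
n = 300
x = rng.normal(size=n)                      # exogenous variable (assumed)
y = np.zeros(n)
for t in range(1, n):
    if y[t - 1] <= 0.0:
        y[t] = 0.6 * y[t - 1] + 0.5 * x[t] + rng.normal(scale=0.1)
    else:
        y[t] = -0.4 * y[t - 1] + 0.5 * x[t] + rng.normal(scale=0.1)

def sse_for_threshold(r):
    """Sum of squared OLS residuals when y[t-1] <= r splits the two regimes."""
    total = 0.0
    lag, exo, cur = y[:-1], x[1:], y[1:]
    for mask in (lag <= r, lag > r):
        if mask.sum() < 10:                  # guard against tiny regimes
            return np.inf
        A = np.column_stack([lag[mask], exo[mask]])
        beta, *_ = np.linalg.lstsq(A, cur[mask], rcond=None)
        total += float(np.sum((cur[mask] - A @ beta) ** 2))
    return total

# Grid search: the candidate with minimal pooled SSE estimates the cut point.
grid = np.linspace(-1.0, 1.0, 81)
best = min(grid, key=sse_for_threshold)
print(round(best, 2))
```

The estimated cut point should land near the true threshold of zero; the paper's two proposed approaches replace this naive SSE criterion with their own threshold point indicator.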
Existing leachate models over- or underestimate leachate generation by up to three orders of magnitude. Practical experiments show that channeled flow in waste leads to rapid discharge of large leachate volumes and heterogeneous moisture distribution. To predict leachate generation more accurately, leachate models must be improved. To predict moisture movement through waste, the two-domain model PREFLO is tested. Experimental waste and leachate flow values are compared with model predictions. When calibrated with experimental parameters, PREFLO provides estimates of breakthrough time. In the short term, field capacity has to be reduced to 0.12 and effective storage and hydraulic conductivity of the waste must be increased to
The transport layer is responsible for delivering data to the appropriate application process on the host computers. The two most popular transport layer protocols are the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP). TCP is considered one of the most important protocols on the Internet; UDP is a minimal message-oriented transport layer protocol. In this paper we compare the performance of TCP and UDP on a wired network. Network Simulator 2 (NS2) has been used for the performance comparison, since it is preferred by the networking research community. Constant bit rate (CBR) traffic is used for both the TCP and UDP protocols.
Cooperative spectrum sensing in cognitive radio networks is analogous to distributed decision-making in wireless sensor networks, where each sensor makes a local decision and the decision results are reported to a fusion center, which produces the final decision according to some fusion rules. In this paper, the performance of cooperative spectrum sensing is examined using a new optimization strategy to find the optimal weight and threshold curves that enable each secondary user to sense the spectrum environment independently according to a floating threshold with respect to its local environment. Our proposed approach depends on proving the convexity of the well-known optimization problem in cooperative spectrum sensing that states maximizing the probability of detec
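The fusion-center idea can be sketched as a soft-combining rule: each secondary user reports an energy statistic, and a weighted sum is compared against a threshold. The weights, threshold, and energy values below are arbitrary placeholders for illustration, not the optimal curves derived in the paper.

```python
import numpy as np

def fusion_decision(energies, weights, threshold):
    """Declare the primary user present if the weighted energy sum exceeds threshold."""
    return float(np.dot(weights, energies)) > threshold

# Assumed per-user reliability weights (sum to 1) and a toy decision threshold.
weights = np.array([0.5, 0.3, 0.2])
noise_only = np.array([9.0, 11.0, 10.0])       # toy energies under H0 (noise only)
with_signal = noise_only + 8.0                 # toy energies under H1 (signal present)

print(fusion_decision(noise_only, weights, 15.0))    # noise-only case
print(fusion_decision(with_signal, weights, 15.0))   # signal-present case
```

In the paper's setting, the weights and the floating per-user thresholds are what the convex optimization chooses; this sketch only fixes them by hand to show the decision structure.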
Experimental and theoretical investigations are presented on the flocculation process in a pulsator clarifier. An experimental system was designed to study the factors affecting the performance of the pulsator clarifier. These factors were the water level in the vacuum chamber, which ranged from 60 to 150 cm; the rising time of water in the vacuum chamber, at 20, 30, and 40 seconds; and the sludge blanket height, at 20, 30, and 40 cm. The turbidity and pH of the raw water used were 200 NTU and 8.13, respectively. According to the jar test, the alum dose required for this turbidity was 20 mg/l. Performance parameters of the pulsator clarifier such as turbidity, total solids (TS), shear rate, volume concentration of the sludge blanket an
Biometrics is widely used in security systems nowadays; each biometric modality can be useful and has distinctive properties that provide uniqueness and ambiguity for security systems, especially in communication and network technologies. This paper is about using biometric features of the fingerprint, called minutiae, to encipher a text message and ensure safe arrival of the data at the receiver end. Classical cryptosystems (Caesar, Vigenère, etc.) have become obsolete encryption methods because of high-performance machines, whose attacks focus on repetition of the key to break the cipher. Several researchers in cryptography have made efforts to modify and develop the Vigenère cipher by addressing its weaknesses.
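The key-repetition weakness mentioned above can be seen in a minimal Vigenère-style sketch whose shift schedule is derived from minutiae coordinates rather than a repeated word. The minutiae list below is made-up sample data, and the key derivation is an illustrative assumption, not the paper's construction.

```python
# Hypothetical minutiae positions (x, y); real minutiae come from a
# fingerprint feature extractor, not a hard-coded list.
minutiae = [(12, 40), (33, 7), (58, 91)]
key = [(x + y) % 26 for x, y in minutiae]      # assumed shift schedule

def vigenere(text, key, decrypt=False):
    """Shift each letter by the key value at its position; leave other chars alone."""
    sign = -1 if decrypt else 1
    out = []
    for i, ch in enumerate(text):
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + sign * key[i % len(key)]) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)

cipher = vigenere("attack at dawn", key)
plain = vigenere(cipher, key, decrypt=True)
print(cipher, "->", plain)
```

Because the key length and values depend on the fingerprint rather than a short dictionary word, attacks that exploit periodic key repetition become harder; the core substitution mechanics stay the same.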
To achieve secure transfer of data from sender to receiver, cryptography is one of the means used for such purposes. To increase the level of data security further, DNA was introduced to cryptography as a new term. DNA can easily be used to store and transfer data; it has become an effective procedure for such aims and is used to implement the computation. A new cryptography system is proposed, consisting of two phases: an encryption phase and a decryption phase. The encryption phase includes six steps, starting by converting the plaintext to its equivalent ASCII values and converting them to binary values. After that, the binary values are converted to DNA characters and then converted to their equivalent complementary DN
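The first encryption steps described above (ASCII to binary, binary to DNA, DNA to complement) can be sketched as follows. The 2-bit encoding table (00 to A, 01 to C, 10 to G, 11 to T) and the Watson-Crick complement rule are common conventions in DNA cryptography; the paper's exact tables may differ.

```python
# Assumed 2-bit-to-base encoding and base-pair complement tables.
ENCODE = {'00': 'A', '01': 'C', '10': 'G', '11': 'T'}
COMPLEMENT = {'A': 'T', 'T': 'A', 'C': 'G', 'G': 'C'}

def text_to_dna(text):
    """Plaintext -> ASCII bits -> DNA bases -> complementary strand."""
    bits = ''.join(format(ord(ch), '08b') for ch in text)   # ASCII -> binary
    dna = ''.join(ENCODE[bits[i:i + 2]] for i in range(0, len(bits), 2))
    return ''.join(COMPLEMENT[b] for b in dna)              # complementary DNA

print(text_to_dna("Hi"))   # -> GTCTGCCG
```

Each step is invertible, so the decryption phase can reverse the chain: complement back to the direct strand, bases back to bit pairs, bits back to ASCII characters.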
Ensuring the security and confidentiality of multimedia data is a serious challenge given the growing dependence on digital communication. This paper offers a new image cryptosystem based on the Chebyshev chaotic polynomial map, employing the randomness characteristic of chaos to improve security. The suggested method includes block shuffling, dynamic-offset chaotic key production, inter-layer XOR, and 90-degree block rotations to break the correlations intrinsic to an image. The method is designed for efficiency and scalability, achieving a stated complexity order for n pixels over specific cipher rounds. The experimental outcomes show strong resistance to cryptanalysis attacks, including statistical, differential, and brut
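The chaotic-key step can be sketched with the Chebyshev map x_{k+1} = cos(n * arccos(x_k)), iterated to produce a byte keystream that is XORed with pixel data. The degree, the seed, and the byte quantisation below are illustrative assumptions, not the paper's parameters; the full scheme also shuffles and rotates blocks, which this sketch omits.

```python
import math

def chebyshev_keystream(seed, degree, length):
    """Iterate the Chebyshev map on [-1, 1] and quantise each state to a byte."""
    x, out = seed, []
    for _ in range(length):
        x = math.cos(degree * math.acos(x))    # map keeps x inside [-1, 1]
        out.append(int((x + 1) / 2 * 255) & 0xFF)
    return out

pixels = bytes([10, 200, 33, 47])              # toy stand-in for image data
ks = chebyshev_keystream(seed=0.3, degree=4, length=len(pixels))
cipher = bytes(p ^ k for p, k in zip(pixels, ks))
plain = bytes(c ^ k for c, k in zip(cipher, ks))
print(plain == pixels)
```

Because XOR is its own inverse, regenerating the same keystream from the shared seed and degree recovers the original pixels exactly, which is why the chaotic parameters act as the secret key.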

