In this paper, wavelets were used to study multivariate fractional Brownian motion through the deviations of the random process, in order to obtain an efficient estimator of the Hurst exponent. The results of simulation experiments showed that the proposed estimator performs efficiently. The estimation was carried out by exploiting the stationarity of the detail coefficients of the wavelet transform, since the variance of these coefficients exhibits power-law behavior. Two wavelet filters (Haar and db5) were used to minimize the mean square error of the model.
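As a minimal sketch of the scaling relation this kind of estimator exploits (not the paper's exact method), the variance of the level-j Haar detail coefficients of fractional Brownian motion grows as Var(d_j) ∝ 2^(j(2H+1)), so a log-log regression across dyadic scales recovers H. The signal length, level range, and the choice to drop the finest level (where the discrete Haar transform is a coarse approximation) are illustrative assumptions:

```python
import numpy as np

def hurst_wavelet(x, max_level=8):
    """Estimate H from the power-law scaling of Haar detail variances:
    log2 Var(d_j) ~ (2H + 1) * j + const."""
    a = np.asarray(x, dtype=float)
    levels, log_vars = [], []
    for j in range(1, max_level + 1):
        n = len(a) // 2 * 2                            # even length for pairing
        detail = (a[:n:2] - a[1:n:2]) / np.sqrt(2.0)   # Haar detail coefficients
        a      = (a[:n:2] + a[1:n:2]) / np.sqrt(2.0)   # Haar approximation
        if len(detail) < 8:      # too few coefficients for a stable variance
            break
        levels.append(j)
        log_vars.append(np.log2(np.var(detail)))
    # skip the finest level, where the discrete transform is least accurate
    slope, _ = np.polyfit(levels[1:], log_vars[1:], 1)  # slope ~ 2H + 1
    return (slope - 1.0) / 2.0

# quick check on ordinary Brownian motion (true H = 0.5):
rng = np.random.default_rng(0)
bm = np.cumsum(rng.standard_normal(2**16))
print(f"estimated H = {hurst_wavelet(bm):.3f}")
```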
This paper proposes a new method for network self-fault management (NSFM) based on two technologies: an intelligent agent to automate fault management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are independent (different types of devices). The proposed network self-fault management reduces the network traffic load by reducing the requests and responses between the server and clients, which achieves less downtime for each node when a fault occurs in the client. The performance of the proposed system is evaluated by three measures: efficiency, availability, and reliability. A high average efficiency is obtained, depending on the faults that occur in the system, which reaches …
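A minimal sketch of local WMI fault polling on a client node, assuming the third-party `wmi` package (`pip install wmi`, Windows only). The free-space threshold and the notion of a "fault" here are illustrative, not the paper's; the point is that the agent detects the condition locally instead of exchanging polling traffic with the server:

```python
import wmi

def check_disk_faults(threshold=0.10):
    conn = wmi.WMI()                                  # connect to local WMI
    faults = []
    for disk in conn.Win32_LogicalDisk(DriveType=3):  # 3 = local fixed disks
        if not disk.Size:                             # skip unreadable volumes
            continue
        free, size = int(disk.FreeSpace), int(disk.Size)
        if free / size < threshold:
            faults.append(f"{disk.DeviceID}: only {free / size:.1%} free")
    return faults

if __name__ == "__main__":
    for fault in check_disk_faults():
        print("FAULT:", fault)   # an agent would act on this locally rather
                                 # than reporting every poll to the server
```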
Cryptography can be thought of as a toolbox, where potential attackers gain access to various computing resources and technologies to try to compute key values. In modern cryptography, the strength of the encryption algorithm is determined only by the size of the key. Therefore, our goal is to create a strong key value with a minimal bit length that is useful in lightweight encryption. Using elliptic curve cryptography (ECC) with a Rubik's cube and image density, the image colors are combined and distorted, and by using the chaotic logistic map and image density with a secret key, the Rubik's cubes for the image are encrypted, obtaining an image that is secure against attacks. ECC itself is a powerful algorithm that generates a pair of p…
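A minimal sketch of the chaotic logistic map x_{n+1} = r·x_n·(1 − x_n) used as a keystream generator for image encryption. The parameter r = 3.99, the seed derived from the secret key, and the plain XOR whitening step are illustrative assumptions; the paper combines the map with ECC, Rubik's-cube scrambling, and image density rather than using XOR alone:

```python
import numpy as np

def logistic_keystream(x0, n, r=3.99):
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)            # chaotic iteration, sensitive to x0
        xs[i] = x
    return (xs * 256).astype(np.uint8)   # quantize to one byte per pixel

def xor_cipher(image_bytes, x0):
    ks = logistic_keystream(x0, image_bytes.size)
    return image_bytes ^ ks              # XOR is its own inverse

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in image
enc = xor_cipher(img.ravel(), x0=0.61803).reshape(img.shape)
dec = xor_cipher(enc.ravel(), x0=0.61803).reshape(img.shape)
assert (dec == img).all()                # same key decrypts exactly
```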
Nowadays, internet security is a critical concern, and one of the most difficult research issues in network security is intrusion detection, the fight against external threats. Intrusion detection is a novel method of securing computers and data networks that are already in use. To boost the efficacy of intrusion detection systems, machine learning and deep learning are widely deployed. While existing work on intrusion detection systems based on data mining and machine learning is effective, it detects intrusions by training static batch classifiers without considering the time-varying characteristics of a regular data stream. Real-world problems, on the other hand, rarely fit into models with such constraints. Furthermore, …
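A minimal sketch of the incremental (stream) learning the abstract argues for, as opposed to a static batch classifier, using scikit-learn's `SGDClassifier.partial_fit`. The synthetic drifting stream and the feature layout are assumptions; a real IDS would consume flow features (e.g., from NSL-KDD or live capture):

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
clf = SGDClassifier(loss="log_loss")
classes = np.array([0, 1])                 # 0 = normal, 1 = intrusion

for t in range(50):                        # 50 incoming mini-batches
    drift = t / 50.0                       # traffic statistics shift over time
    X = rng.normal(0, 1, (128, 10))
    y = rng.integers(0, 2, 128)
    X[y == 1, 0] += 2.0 + drift            # attack signature slowly drifts
    clf.partial_fit(X, y, classes=classes) # update model, no full retraining

print("accuracy on latest batch:", clf.score(X, y))
```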
In this paper, a refractive index sensor based on a micro-structured optical fiber is proposed and analyzed using the finite element method (FEM). The designed fiber has a hexagonal cladding structure with six rings of air holes running around its solid core. The air holes of the fiber are infiltrated with different liquids such as water, ethanol, methanol, and toluene, and then sensor characteristics such as the effective refractive index, confinement loss, beam profile of the fundamental mode, and sensor resolution are investigated by employing the FEM. The designed sensor is characterized by its low confinement loss and high resolution, so a small change in the analyte refractive index can be detected, which could be useful to detect the change of …
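For context, the confinement loss reported by such FEM studies is conventionally derived from the imaginary part of the complex effective index via CL = 8.686·k₀·Im(n_eff) in dB/m, with k₀ = 2π/λ. A short sketch follows; the example value of Im(n_eff) is an assumption, not a result from the paper:

```python
import math

def confinement_loss_db_per_m(n_eff_imag, wavelength_m):
    """Standard confinement-loss formula: 8.686 * k0 * Im(n_eff) [dB/m]."""
    k0 = 2 * math.pi / wavelength_m      # free-space wavenumber
    return 8.686 * k0 * n_eff_imag

# e.g. Im(n_eff) = 1e-8 at 1550 nm gives roughly 0.35 dB/m:
print(confinement_loss_db_per_m(1e-8, 1550e-9))
```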
Several previous investigations utilized silica fume (SF, or micro-silica) particles as a supplementary cementitious material substituted into cement-based mortars and studied their effect on the overall properties, especially the physical, strength, and mechanical properties. This study investigated the impact of the inclusion of silica fume (SF) particles on the residual compressive strengths and microstructure properties of cement-based mortars exposed to the severe conditions of elevated temperatures. The prepared specimens were subjected to 25, 250, 450, 600, and 900 °C, and their residual compressive strengths and microstructure were evaluated and compared with control samples (C…
In this paper, a new method is proposed to perform N-Radon orthogonal frequency division multiplexing (OFDM), which is equivalent to 4-quadrature amplitude modulation (QAM), 16-QAM, 64-QAM, 256-QAM, etc. in spectral efficiency. This non-conventional method is proposed in order to reduce the constellation energy and increase spectral efficiency. The proposed method gives a significant improvement in bit error rate performance, and keeps the bandwidth efficiency and spectrum shape as good as conventional fast Fourier transform (FFT) based OFDM. The new structure was tested and compared with conventional OFDM for additive white Gaussian noise, flat, and multi-path selective fading channels. Simulation tests were generated for different channels …
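A minimal sketch of the conventional FFT-based OFDM baseline that the proposed N-Radon scheme is compared against: 4-QAM mapping, IFFT at the transmitter, cyclic prefix, an AWGN channel, and FFT demodulation at the receiver. The Radon-based mapping itself is the paper's contribution and is not reproduced here; subcarrier count, prefix length, and noise level are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N, cp = 64, 16                              # subcarriers, cyclic-prefix length
bits = rng.integers(0, 2, 2 * N)
sym = ((2 * bits[0::2] - 1) + 1j * (2 * bits[1::2] - 1)) / np.sqrt(2)  # 4-QAM

tx = np.fft.ifft(sym) * np.sqrt(N)          # OFDM modulation (unitary scaling)
tx = np.concatenate([tx[-cp:], tx])         # cyclic prefix against multipath

noise = (rng.normal(size=tx.shape) + 1j * rng.normal(size=tx.shape)) * 0.05
rx = tx + noise                             # AWGN channel

rx_sym = np.fft.fft(rx[cp:]) / np.sqrt(N)   # strip prefix, demodulate
bits_hat = np.empty(2 * N, dtype=int)
bits_hat[0::2] = (rx_sym.real > 0).astype(int)
bits_hat[1::2] = (rx_sym.imag > 0).astype(int)
print("bit errors:", np.count_nonzero(bits_hat != bits))
```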
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the human biometric side, the best methods and algorithms in use were studied, and the conclusion is that the fingerprint is the best, although it has some flaws. The fingerprint algorithm was improved so that its performance enhances the clarity of the ridge and valley structures of fingerprint images, taking into account the estimated orientation and frequency of the neighboring ridges. On the computer side, a computer and its components, like a human, have unique …
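A minimal sketch of the central idea of binding an authentication token to both a human biometric and a machine-unique attribute, so that neither alone is sufficient. The placeholder template bytes and the use of the MAC address (`uuid.getnode()`) as the machine attribute are assumptions for illustration, not the paper's scheme:

```python
import hashlib
import uuid

def auth_token(fingerprint_template: bytes) -> str:
    """Derive a token from biometric features plus a machine identifier."""
    machine_id = uuid.getnode().to_bytes(6, "big")   # e.g. the MAC address
    return hashlib.sha256(fingerprint_template + machine_id).hexdigest()

template = b"minutiae-features-extracted-from-the-enhanced-image"
print(auth_token(template))   # valid only for this user *on this machine*
```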
Most of today's techniques encrypt all of the image data, which consumes a tremendous amount of time and computational payload. This work introduces a selective image encryption technique that encrypts predetermined bulks of the original image data in order to reduce the encryption/decryption time and the computational complexity of processing the huge image data. The technique applies a compression algorithm based on the discrete cosine transform (DCT). Two approaches are implemented based on color space conversion (YCbCr and RGB) as a preprocessing step for the compression phase, where the resulting compressed sequence is selectively encrypted using a randomly generated combined secret key. The results showed a significant reduction …
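A minimal sketch of selective encryption in the DCT domain: only the low-frequency quarter of each 8×8 block, where most of the image energy sits, is masked. The block size, the additive keystream, and the "low-frequency only" selection rule are illustrative assumptions, not the paper's combined secret key:

```python
import numpy as np
from scipy.fft import dctn, idctn

rng_key = np.random.default_rng(12345)        # seeded from the secret key

img = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)
out = np.empty_like(img)

for r in range(0, 64, 8):
    for c in range(0, 64, 8):
        block = dctn(img[r:r+8, c:c+8], norm="ortho")   # 8x8 block DCT
        mask = rng_key.normal(0, 50, (4, 4))            # keystream sub-block
        block[:4, :4] += mask        # encrypt low frequencies only; the rest
                                     # of the block is left untouched
        out[r:r+8, c:c+8] = idctn(block, norm="ortho")

# decryption re-derives the same masks from the key and subtracts them
```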
Companies compete intensely with each other today, so they need to focus on innovation to develop their products and make them competitive. Lean product development is the ideal way to develop products, foster innovation, maximize value, and reduce time. Set-Based Concurrent Engineering (SBCE) is a proven lean product development mechanism that builds on the creation of a number of alternative designs at the subsystem level. These designs are improved and tested simultaneously, and the weaker alternatives are removed gradually until the optimal solution is finally reached. SBCE implementations have been performed extensively in the automotive industry, and there are a few case studies in the aerospace industry. This research describes the use of …
This research deals with an evolutionary-based mutation with functional annotation to identify protein complexes within PPI networks. An important field of research in computational biology is the difficult and fundamental challenge of revealing complexes in protein interaction networks. The complex detection models developed to tackle this challenge depend mostly on topological properties and rarely use the biological properties of PPI networks. This research aims to push the evolutionary algorithm to its maximum by employing gene ontology (GO) to communicate across proteins based on biological-information similarity for direct genes. The outcomes show that the suggested method can be utilized to improve the …
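A minimal sketch of what a GO-guided mutation operator could look like: instead of adding a random neighbor to a candidate complex, prefer the boundary protein whose GO-based similarity to the current members is highest. The toy PPI graph and similarity table below are fabricated placeholders, not data from the paper:

```python
# toy PPI adjacency and pairwise GO semantic-similarity scores
PPI = {"A": {"B", "C"}, "B": {"A", "C", "D"},
       "C": {"A", "B"}, "D": {"B", "E"}, "E": {"D"}}
GO_SIM = {frozenset({"A", "B"}): 0.90, frozenset({"A", "C"}): 0.80,
          frozenset({"B", "C"}): 0.85, frozenset({"B", "D"}): 0.20,
          frozenset({"D", "E"}): 0.70}

def sim(p, q):
    return GO_SIM.get(frozenset({p, q}), 0.0)

def go_guided_mutation(complex_):
    """Add the boundary protein with the highest mean GO similarity."""
    boundary = {n for p in complex_ for n in PPI[p]} - complex_
    if not boundary:
        return complex_
    best = max(boundary,
               key=lambda n: sum(sim(n, p) for p in complex_) / len(complex_))
    return complex_ | {best}

print(go_guided_mutation({"A", "B"}))   # picks C (biologically similar), not D
```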