In this paper, a new method is proposed to perform N-Radon orthogonal frequency division multiplexing (OFDM), which is equivalent in spectral efficiency to 4-quadrature amplitude modulation (QAM), 16-QAM, 64-QAM, 256-QAM, etc. This nonconventional method is proposed in order to reduce the constellation energy and increase spectral efficiency. The proposed method gives a significant improvement in bit error rate performance while keeping bandwidth efficiency and spectrum shape as good as those of conventional fast Fourier transform based OFDM. The new structure was tested and compared with conventional OFDM over additive white Gaussian noise, flat fading, and multipath frequency-selective fading channels. Simulation tests were generated for different channel parameter values, including the multipath gain vector, the multipath delay vector, and the maximum Doppler shift. © 2009 Springer Science+Business Media, LLC.
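The BER comparison described above can be illustrated with a minimal baseline simulation. The sketch below implements only conventional FFT-based OFDM with 4-QAM over an AWGN channel (not the proposed N-Radon mapping, whose details are not given in the abstract); all parameter values (`n_sub`, `n_sym`, `snr_db`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sub = 64      # subcarriers per OFDM symbol (assumed)
n_sym = 200     # number of OFDM symbols (assumed)
snr_db = 15     # per-subcarrier SNR in dB (assumed)

# Random bits, two per subcarrier for 4-QAM.
bits = rng.integers(0, 2, size=(n_sym, n_sub, 2))
# Gray-mapped 4-QAM: bit pairs -> {+-1 +-1j}/sqrt(2), unit average energy.
sym = ((1 - 2 * bits[..., 0]) + 1j * (1 - 2 * bits[..., 1])) / np.sqrt(2)

# OFDM modulation: IFFT per symbol, scaled to keep unit average power.
tx = np.fft.ifft(sym, axis=1) * np.sqrt(n_sub)

# AWGN channel.
noise_var = 10 ** (-snr_db / 10)
noise = np.sqrt(noise_var / 2) * (
    rng.standard_normal(tx.shape) + 1j * rng.standard_normal(tx.shape)
)

# OFDM demodulation and hard-decision 4-QAM demapping.
rx = np.fft.fft(tx + noise, axis=1) / np.sqrt(n_sub)
bits_hat = np.stack(
    [(rx.real < 0).astype(int), (rx.imag < 0).astype(int)], axis=-1
)
ber = np.mean(bits_hat != bits)
print(f"BER at {snr_db} dB: {ber:.2e}")
```

At 15 dB per-subcarrier SNR, 4-QAM errors are essentially negligible, so the measured BER is near zero; sweeping `snr_db` reproduces the usual waterfall curve against which a modified constellation would be compared.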
The structural, optical, and electrical properties of SnSe and its application in photovoltaic devices have been reported widely. Interest in SnSe is due to its magnificent optoelectronic properties together with other encouraging properties. The main applications in this area are PV devices and batteries. In this study, the structure, optical properties, and surface morphology of tin selenide were investigated. Thin films of SnSe were deposited on p-Si substrates to establish a junction as solar cells. The effects of different annealing temperatures (as-prepared, 125, 200, and 275 °C) on the SnSe thin films were investigated. The structural properties of SnSe were studied through X-ray diffraction, and the results show an increase in the peaks
This paper deals with a new Henstock-Kurzweil integral in a Banach space with a bilinear triple n-tuple and an integrator function Ψ that depends on multiple points in the partition. Finally, standard results of the generalized Henstock-Kurzweil integral are exhibited within the theory of integration.
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively. It makes sense to pursue research on developing algorithms that can most effectively use the available network. It is also important to consider security, since the data being transmitted are vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
This paper presents a numerical scheme for solving nonlinear time-fractional differential equations in the sense of Caputo. The method relies on the Laplace transform together with the modified Adomian method (LMADM), compared with the Laplace transform combined with the standard Adomian method (LADM). For comparison purposes, we applied both LMADM and LADM to nonlinear time-fractional differential equations to identify their differences and similarities. Finally, we provide two examples of nonlinear time-fractional differential equations, which show that the current scheme converges with high accuracy and requires only a few terms to solve this type of equation.
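As a concrete illustration of the Laplace-transform step that LMADM and LADM share, consider the standard Caputo-derivative identity applied to the simplest linear test problem (this worked case is textbook material, not taken from the paper itself):

```latex
% Laplace transform of the Caputo derivative of order 0 < \alpha \le 1:
\mathcal{L}\{{}^{C}\!D^{\alpha}_{t}u(t)\}(s) = s^{\alpha}U(s) - s^{\alpha-1}u(0).
% For the linear test problem {}^{C}\!D^{\alpha}_{t}u = -u,\; u(0) = 1:
s^{\alpha}U(s) - s^{\alpha-1} = -U(s)
\quad\Longrightarrow\quad
U(s) = \frac{s^{\alpha-1}}{s^{\alpha}+1},
% whose inverse transform is the Mittag-Leffler function:
u(t) = E_{\alpha}(-t^{\alpha}) = \sum_{k=0}^{\infty}\frac{(-t^{\alpha})^{k}}{\Gamma(\alpha k + 1)}.
```

In the Adomian-based schemes, the nonlinear term is what prevents this direct inversion; it is instead expanded in Adomian polynomials and the solution is built up term by term, which is where the modified and standard decompositions differ.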
A risk assessment was performed for three pipelines belonging to the Basra Oil Company (X1, X2, X3) in order to develop an appropriate risk-mitigation plan for each pipeline that addresses all high risks. Corrosion risks were assessed using a 5 × 5 matrix. The assessment for X1 showed that the probability of failure (POF) for internal corrosion is 5, meaning its risk is high due to salinity and the presence of CO2 and H2S, while the POF for external corrosion is 1, lower than that for internal corrosion. For flowline X2, the probability of internal corrosion is 4 and that of external corrosion is 4, because no cathodic protection is applied, due to CO2 and H2S. Flowline X3 has had 8 leaks due to internal corrosion, so its hazard rating was very high (5), possibly due to salinity, CO2, and the fluid flow rate.
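A 5 × 5 risk matrix of the kind described combines a probability-of-failure score with a consequence-of-failure score. The sketch below shows one common way to map a matrix cell to a qualitative rating; the band thresholds are illustrative assumptions, not the company's actual criteria.

```python
def risk_rating(pof: int, cof: int) -> str:
    """Map a cell of a 5x5 risk matrix to a qualitative rating.

    pof: probability-of-failure score, 1 (remote) .. 5 (almost certain)
    cof: consequence-of-failure score, 1 (negligible) .. 5 (catastrophic)
    The score bands below are illustrative assumptions.
    """
    if not (1 <= pof <= 5 and 1 <= cof <= 5):
        raise ValueError("scores must be between 1 and 5")
    score = pof * cof
    if score >= 20:
        return "Very High"
    if score >= 12:
        return "High"
    if score >= 5:
        return "Medium"
    return "Low"

# Example: X3-like case, POF 5 for internal corrosion with severe consequences.
print(risk_rating(5, 5))   # -> Very High
```

A pipeline whose internal-corrosion POF is 5 lands in the top band for any significant consequence score, which is why mitigation (e.g. inhibition or cathodic protection) targets the high-POF threats first.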
This research aims to choose an appropriate probability distribution for the reliability analysis of an item, using collected data on the operating and stoppage times of the case study.
The probability distribution is chosen appropriately when the data points fall on or close to the fitted line of the probability plot and the data pass a goodness-of-fit test.
Minitab 17 software was used for this purpose after arranging the collected data and entering it into the program.
With the freedom offered by the Deep Web, people have the opportunity to express themselves freely and discreetly; sadly, this is one of the reasons why people carry out illicit activities there. In this work, a novel dataset of active Dark Web domains, known as crawler-DB, is presented. To build crawler-DB, the Onion Routing network (Tor) was sampled, and a web crawler capable of following links was built. The link addresses gathered by the crawler are then classified automatically into five classes. The algorithm built in this study demonstrated good performance, achieving an accuracy of 85%. A popular text representation method was used with the proposed crawler-DB crossed by two different supervise
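The link-gathering step of such a crawler amounts to extracting `.onion` hrefs from fetched pages. The stdlib-only sketch below shows that step in isolation (the class name, the sample page, and the address-length pattern are illustrative; v2 onion addresses were 16 characters and v3 addresses are 56, hence the range in the regex). It is not the paper's crawler, which also follows the links and classifies the pages.

```python
import re
from html.parser import HTMLParser

# Onion hostnames are base32 strings: 16 chars (v2) or 56 chars (v3).
ONION_RE = re.compile(r"https?://\w{16,56}\.onion\S*")

class OnionLinkExtractor(HTMLParser):
    """Collect hrefs on a page that point at .onion services."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and ONION_RE.match(value):
                    self.links.append(value)

# Hypothetical sample page: one hidden-service link, one clearnet link.
page = (
    '<a href="http://exampleonionaddr.onion/">hidden</a> '
    '<a href="https://clearnet.example.com">clear</a>'
)
p = OnionLinkExtractor()
p.feed(page)
print(p.links)   # only the .onion link survives the filter
```

In a full crawler, each extracted address would be queued for fetching over a Tor SOCKS proxy, and the fetched text would then feed the representation and classification stages described in the abstract.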
Due to the increase of information on the World Wide Web (WWW), the question of how to extract new and useful knowledge from log files has gained great interest among researchers in data mining and knowledge discovery.
Web mining, a subset of data mining, is divided into three particular areas: web content mining, web structure mining, and web usage mining. This paper is concerned with the server log file, which belongs to the third category (web usage mining). This file is analyzed according to the suggested algorithm to extract the behavior of the user. The behavior is determined by knowing the complete path taken by a specific user.
Extracting these types of knowledge requires many of the KDD
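The path-extraction step described above can be sketched directly: parse each server log line, group requests by client, and keep the ordered sequence of pages per client. The sketch below assumes the NCSA Common Log Format and uses the client IP as a simple stand-in for a user (real session reconstruction would also use timestamps, user agents, or cookies); the sample log lines are hypothetical.

```python
import re
from collections import defaultdict

# NCSA Common Log Format:
# host ident authuser [timestamp] "method url protocol" status size
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" (?P<status>\d{3}) \S+'
)

def user_paths(lines):
    """Group requests by client IP (a simple proxy for 'user') and
    return the ordered sequence of pages each one visited."""
    paths = defaultdict(list)
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group("status").startswith("2"):   # successful hits only
            paths[m.group("ip")].append(m.group("url"))
    return dict(paths)

# Hypothetical log excerpt.
log = [
    '10.0.0.1 - - [01/Jan/2024:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.1 - - [01/Jan/2024:10:00:05 +0000] "GET /products.html HTTP/1.1" 200 734',
    '10.0.0.2 - - [01/Jan/2024:10:00:07 +0000] "GET /index.html HTTP/1.1" 404 90',
]
print(user_paths(log))   # -> {'10.0.0.1': ['/index.html', '/products.html']}
```

The resulting per-user paths are the raw material for the usage-mining algorithms the paper discusses, such as frequent-path or next-page-prediction analysis.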
The estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts that are then joined at threshold point(s). This situation is regarded as a violation of the linearity of regression. Therefore, the multiphase regression model has received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold-point estimation. The maximum likelihood estimator (MLE) has been used for both the model and threshold-point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t
One of the costliest problems facing hydrocarbon production in unconsolidated sandstone reservoirs is the production of sand once hydrocarbon production starts. A sanding-onset prediction model is very important for deciding on future sand control, including whether or when sand control should be used. This research developed an easy-to-use computer program to determine the onset of sanding sites in the driven area. The model is based on estimating the critical pressure drop at which sand production begins. The outcomes are drawn as a function of free sand production against the critical flow rates for reservoir pressure decline. The results show that the pressure drawdown required to