Constructing orthogonal matrices of different sizes is a complex but important problem, with applications in image processing and communications (e.g., CDMA and OFDM). In this paper we introduce a new method for generating orthogonal matrices by taking tensor products of two or more orthogonal matrices with real and imaginary entries, and we apply it to image and communication-signal processing. The resulting matrices are themselves orthogonal, and the construction is considerably simpler than classical methods that rely on first-principles proofs. The results for communication signals and images are acceptable, although further research is needed.
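The property this abstract relies on is the standard fact that the tensor (Kronecker) product of orthogonal matrices is again orthogonal. Below is a minimal Python/NumPy sketch of that construction; the specific matrices are illustrative choices, not the authors' examples.

```python
import numpy as np

# Two small orthogonal matrices (a rotation and a normalized Hadamard matrix).
theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
B = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

# Tensor (Kronecker) product: A is 2x2, B is 2x2, so C is 4x4.
C = np.kron(A, B)

# Orthogonality is preserved: C.T @ C equals the identity.
print("C is orthogonal:", np.allclose(C.T @ C, np.eye(4)))
```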
Potential-field data interpretation is significant for subsurface structure characterization. The current study is an attempt to explore the magnetic low lying between the Najaf and Diwaniyah cities in central Iraq. It aims to understand the subsurface structures that may produce this anomaly and to provide a better subsurface structural image of the region. The study area is situated in the transition zone known as the Abu Jir Fault Zone, a tectonic boundary that is an inherited basement weak zone extending in the NW-SE direction. Gravity and magnetic data processing and enhancement techniques (Total Horizontal Gradient, Tilt Angle, Fast Sigmoid Edge Detection, Improved Logistic, and Theta Map filters) highlight source boundaries and the …
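For context, the Total Horizontal Gradient and Tilt Angle filters named above have standard definitions: THG = sqrt((df/dx)^2 + (df/dy)^2) and Tilt = arctan((df/dz) / THG). The sketch below is a generic Python/NumPy implementation on a synthetic grid, not the authors' code; it assumes the usual wavenumber-domain relation for the vertical derivative.

```python
import numpy as np

def tilt_angle(field, dx=1.0, dy=1.0):
    """Tilt angle of a gridded potential field: arctan(dF/dz / THG).

    Horizontal derivatives use finite differences; the vertical derivative
    uses the standard wavenumber-domain relation dF/dz = IFFT(|k| * FFT(F)).
    """
    gy, gx = np.gradient(field, dy, dx)           # horizontal derivatives
    thg = np.hypot(gx, gy)                        # Total Horizontal Gradient
    ny, nx = field.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, dy)
    k = np.hypot(*np.meshgrid(kx, ky))            # radial wavenumber |k|
    gz = np.real(np.fft.ifft2(np.fft.fft2(field) * k))  # vertical derivative
    return np.arctan2(gz, thg), thg

# Toy anomaly: a smooth Gaussian "source" on a 128x128 grid.
y, x = np.mgrid[-64:64, -64:64]
field = np.exp(-(x**2 + y**2) / 400.0)
tilt, thg = tilt_angle(field)
print(tilt.shape, round(float(thg.max()), 4))
```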
Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One efficient approach to data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to a cluster head (CH). Data packets are then forwarded to a sink node in a single hop or over multiple hops, which can increase the energy depletion of the CH relative to other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co…
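As an illustration of the clustering idea the abstract describes (not this paper's specific mechanism), a common CH-selection heuristic weighs each node's residual energy against its distance to the sink; the names, fields, and weights below are hypothetical.

```python
from dataclasses import dataclass
import math

@dataclass
class Node:
    node_id: int
    x: float
    y: float
    residual_energy: float  # joules remaining

def select_cluster_head(nodes, sink=(0.0, 0.0), w_energy=0.7, w_dist=0.3):
    """Toy CH selection: favor high residual energy and proximity to the sink."""
    max_e = max(n.residual_energy for n in nodes)
    max_d = max(math.dist((n.x, n.y), sink) for n in nodes)
    def score(n):
        return (w_energy * n.residual_energy / max_e
                - w_dist * math.dist((n.x, n.y), sink) / max_d)
    return max(nodes, key=score)

cluster = [Node(1, 10, 5, 2.0), Node(2, 3, 4, 1.5), Node(3, 8, 9, 2.4)]
print("CH:", select_cluster_head(cluster).node_id)
```

Rotating the CH role this way spreads the relay burden, which is the energy-depletion concern the abstract raises.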
In this study, we compare the LASSO and SCAD methods, two penalized approaches for fitting partial quantile regression models. The Nadaraya-Watson kernel estimator was used to estimate the nonparametric part, and the rule-of-thumb method was used to choose the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, but the SCAD method performed best according to the mean squared error (MSE) criterion after the missing data were estimated using mean imputation.
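The Nadaraya-Watson estimator and the rule-of-thumb bandwidth mentioned here have standard forms: m(x) = sum_i K((x - X_i)/h) Y_i / sum_i K((x - X_i)/h), with Silverman's h = 1.06 * sigma * n^(-1/5). Below is a minimal Python sketch with a Gaussian kernel on synthetic data, a generic illustration rather than the authors' implementation.

```python
import numpy as np

def nadaraya_watson(x_grid, x, y, h):
    """Nadaraya-Watson estimate: kernel-weighted average of y at each grid point."""
    u = (x_grid[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u**2)                 # Gaussian kernel (unnormalized)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(0)
x = rng.uniform(0, 2 * np.pi, 200)
y = np.sin(x) + rng.normal(0, 0.3, 200)

# Rule-of-thumb bandwidth (Silverman): h = 1.06 * sigma * n^(-1/5)
h = 1.06 * x.std(ddof=1) * len(x) ** (-0.2)

grid = np.linspace(0, 2 * np.pi, 100)
m_hat = nadaraya_watson(grid, x, y, h)
print("bandwidth h =", round(h, 3))
```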
Correlation equations expressing the boiling temperature as a direct function of liquid composition have been tested successfully and applied to predict the azeotropic behavior of multicomponent mixtures and the type of azeotrope (minimum, maximum, or saddle) using a modified correlation of the Gibbs-Konovalov theorem. In addition, binary and ternary azeotropic points have been determined experimentally by graphical methods on the basis of experimental binary and ternary vapor-liquid equilibrium data.
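The Gibbs-Konovalov theorem implies that, at constant pressure, a binary azeotrope coincides with an interior extremum of the boiling temperature versus composition curve. The sketch below locates and classifies such an extremum for a hypothetical polynomial correlation T(x1); the coefficients are illustrative only, not the correlations tested in the paper.

```python
import numpy as np

# Hypothetical boiling-temperature correlation T(x1) for a binary mixture,
# expressed as a polynomial in the mole fraction x1 (made-up coefficients).
T = np.poly1d([4.0, -12.0, 6.0, 80.0])   # T(x1) = 4x^3 - 12x^2 + 6x + 80
dT = T.deriv()

# Gibbs-Konovalov: an azeotrope is an interior extremum of T(x1).
roots = [r.real for r in dT.roots if abs(r.imag) < 1e-9 and 0 < r.real < 1]
for x_az in roots:
    kind = "minimum-boiling" if T.deriv(2)(x_az) > 0 else "maximum-boiling"
    print(f"azeotrope at x1 = {x_az:.3f}, T = {T(x_az):.2f} C ({kind})")
```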
In this study, isobaric vapor-liquid equilibrium for two ternary systems, "1-Propanol – Hexane – Benzene" and its binaries "1-Propanol – …
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, are also highly vulnerable to hacking because they serve as convergence points for numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within green data centers, is a pressing concern. This paper proposes a novel approach to mitigating this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the …
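To make the colony idea concrete: in ant-colony-style detection schemes, probing agents deposit "pheromone" on devices in proportion to observed anomaly signals, and persistently high pheromone flags a suspect. The toy sketch below is purely illustrative, with hypothetical device names and probe functions; it is not the paper's algorithm.

```python
import random

def ant_colony_scan(devices, n_ants=50, evaporation=0.1, rounds=20):
    """Toy ant-colony sweep: ants probe devices, deposit pheromone in
    proportion to anomaly scores, and pheromone evaporates each round."""
    pheromone = {d: 1.0 for d in devices}
    for _ in range(rounds):
        for _ in range(n_ants):
            # Ants preferentially revisit devices with high pheromone.
            names = list(devices)
            d = random.choices(names, weights=[pheromone[n] for n in names])[0]
            pheromone[d] += devices[d]()        # anomaly probe returns a score
        for d in pheromone:
            pheromone[d] *= (1 - evaporation)   # evaporation forgets noise
    return max(pheromone, key=pheromone.get)

# Hypothetical anomaly probes: the compromised device emits stronger signals.
devices = {
    "node-A": lambda: random.random() * 0.1,
    "node-B": lambda: random.random() * 0.1,
    "node-C": lambda: random.random() * 0.9,   # "compromised"
}
print("flagged:", ant_colony_scan(devices))
```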
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such a…
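Entropy discretization, one of the two algorithms named, typically works by choosing cut points that minimize the class-weighted entropy of the resulting bins. Below is a generic single-split sketch in Python, shown for orientation; it is not the paper's multi-resolution variant.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def best_split(values, labels):
    """Pick the cut point minimizing the weighted class entropy of the two bins."""
    order = np.argsort(values)
    v, y = values[order], labels[order]
    best = (None, np.inf)
    for i in range(1, len(v)):
        if v[i] == v[i - 1]:
            continue
        cut = (v[i] + v[i - 1]) / 2
        left, right = y[:i], y[i:]
        w = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        if w < best[1]:
            best = (cut, w)
    return best

vals = np.array([1.0, 1.2, 2.9, 3.1, 3.3, 5.0])
labs = np.array([0, 0, 1, 1, 1, 0])
print("cut point, weighted entropy:", best_split(vals, labs))
```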
This work studies rock facies and flow unit classification for the Mishrif carbonate reservoir in the Buzurgan oil field, located in southeastern Iraq, using wireline logs, core samples, and petrophysical data (log porosity and core permeability). Hydraulic flow units were identified using the flow zone indicator approach and assessed within each rock type to reach a better understanding of the controlling role of pore types and geometry in reservoir quality variations. Additionally, the distribution of sedimentary facies and rock fabric number, along with porosity and permeability, was analyzed in three wells (BU-1, BU-2, and BU-3). The Interactive Petrophysics (IP) software was used to assess the rock fabric number, flow zon…
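The flow zone indicator approach mentioned above rests on the standard relations RQI = 0.0314 * sqrt(k/phi), phi_z = phi/(1 - phi), and FZI = RQI/phi_z. The sketch below applies them to illustrative core-plug values, not to the Buzurgan well data.

```python
import numpy as np

def flow_zone_indicator(porosity, permeability_md):
    """Flow Zone Indicator (FZI) from the standard relations:
       RQI = 0.0314 * sqrt(k / phi)   (k in mD, phi as a fraction)
       phi_z = phi / (1 - phi)        (normalized porosity)
       FZI = RQI / phi_z
    """
    rqi = 0.0314 * np.sqrt(permeability_md / porosity)
    phi_z = porosity / (1.0 - porosity)
    return rqi / phi_z

# Illustrative core-plug values (not data from the Buzurgan wells).
phi = np.array([0.12, 0.18, 0.25])
k = np.array([5.0, 40.0, 300.0])
print("FZI:", flow_zone_indicator(phi, k).round(2))
```

Samples with similar FZI are grouped into a hydraulic flow unit, since they share comparable pore-throat geometry.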
Today, NOMA has seen exponential growth in optical visible light communication (OVLC) owing to features such as high spectral efficiency, low BER, and flexibility. This growth creates huge demand for electronic devices with high-speed processing and high data rates, which leads to greater FPGA power consumption. It is therefore a major challenge for scientists and researchers to address this problem by reducing FPGA power and device size. The subject of this article is an algorithmic model for reducing the power consumption of the Field Programmable Gate Array (FPGA) used in the design of non-orthogonal multiple access (NOMA) techniques applied in OVLC systems combined with a blue laser. However, the po…
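In power-domain NOMA, the variant commonly used in such systems, users' symbols are superimposed at different power levels and separated by successive interference cancellation (SIC). A minimal baseband sketch follows, with illustrative BPSK symbols and power allocations; it is not the article's FPGA design.

```python
import numpy as np

rng = np.random.default_rng(1)
# BPSK symbols for two users; the "far" (weak-channel) user gets more power.
s1 = rng.choice([-1.0, 1.0], 1000)   # far user
s2 = rng.choice([-1.0, 1.0], 1000)   # near user
p1, p2 = 0.8, 0.2                    # power allocation, p1 + p2 = 1

x = np.sqrt(p1) * s1 + np.sqrt(p2) * s2     # superposition coding
y = x + rng.normal(0, 0.05, x.shape)        # noisy channel seen by near user

# Successive interference cancellation at the near user:
s1_hat = np.sign(y)                          # decode the far user's symbols first
residual = y - np.sqrt(p1) * s1_hat          # cancel their contribution
s2_hat = np.sign(residual)                   # then decode the near user's own symbols
print("user-2 BER:", np.mean(s2_hat != s2))
```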
Monetary policy is a vital means of achieving monetary stability through: the management of income and the adjustment of prices (monetary targets) so as to promote the stability and growth of real output (non-monetary goals); the interest rate instrument, which guides or steers direct investment toward the desired destination; and the supervisory instruments of monetary policy, both quantitative and qualitative. The latter are very important as a compass for tracing the intended direction of monetary policy in the economy. The public and businesses receive monetary policy signals through these tools. In fiscal policy, there are specific techniques to follow for spending and for the collection of revenue. This is done in order to …