This work presents a symmetric cryptography scheme coupled with a chaotic neural network (NN). The encryption algorithm processes the data as blocks and consists of multiple levels (character coding, generation of a key array (weights), text coding, and the chaotic NN); likewise, the decryption process consists of multiple levels (generation of the key array (weights), the chaotic NN, text decoding, and character decoding). A chaotic neural network is used as part of the proposed system with modifications, and the keys used in the chaotic sequence are produced by a proposed key-generation algorithm. The proposed algorithm proves efficient in execution time, as it can encrypt and decrypt long messages in a short time and with little memory (the chaotic NN offers a capacity of m ...
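A minimal sketch of the key idea follows, assuming (not taken from the paper) that a logistic map supplies the chaotic sequence, that its samples are mapped to key bytes standing in for the weight array, and that each block is combined with those keys by XOR; the actual multilevel scheme (character coding, key/weight generation, text coding, chaotic NN) is more involved.

```python
# Minimal sketch of chaotic-sequence key generation and block encryption.
# Assumptions (not from the paper): a logistic map drives the chaotic sequence,
# the derived bytes act as the "weights"/keys, and encryption is a simple XOR
# of each block with those keys.

def chaotic_keys(x0: float, r: float, n: int) -> list[int]:
    """Generate n key bytes from a logistic map x_{k+1} = r * x_k * (1 - x_k)."""
    keys, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        keys.append(int(x * 256) % 256)
    return keys

def xor_block(block: bytes, keys: list[int]) -> bytes:
    """Encrypt/decrypt one block by XOR with the chaotic key stream (symmetric)."""
    return bytes(b ^ k for b, k in zip(block, keys))

plaintext = b"chaotic neural network demo"
keys = chaotic_keys(x0=0.7, r=3.99, n=len(plaintext))
ciphertext = xor_block(plaintext, keys)
assert xor_block(ciphertext, keys) == plaintext  # the same keys recover the message
```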
The purpose of this research is to test liquidity ratios to assess bank liquidity risks, represented by the ratios current assets / current liabilities, current assets / total deposits, current assets / total assets, cash credit / total deposits, the liquidity coverage ratio (LCR), and the net stable funding ratio (NSFR). The research evaluates these risks in banks via these ratios and identifies the most important means used to address them, including the capital adequacy ratio under the Basel II decisions, for a selected period (2017-2019). The most important conclusion is that the sample bank did not fall into bank liquidity risk throughout the years of the research. Tracking specific ratio with adequ ...
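The simple ratios listed above can be illustrated with hypothetical balance-sheet figures, as in the sketch below; LCR and NSFR are omitted because they require regulatory weightings not described in the abstract.

```python
# Illustrative computation of the simple liquidity ratios named above.
# All balance-sheet figures are hypothetical (in millions).

current_assets = 850.0
current_liabilities = 600.0
total_deposits = 1200.0
total_assets = 2000.0
cash_credit = 700.0

ratios = {
    "current assets / current liabilities": current_assets / current_liabilities,
    "current assets / total deposits": current_assets / total_deposits,
    "current assets / total assets": current_assets / total_assets,
    "cash credit / total deposits": cash_credit / total_deposits,
}
for name, value in ratios.items():
    print(f"{name}: {value:.2f}")
```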
The investigation of machine learning techniques for addressing missing well-log data has garnered considerable interest recently, especially as the oil and gas sector pursues novel approaches to improve data interpretation and reservoir characterization. However, for wells that have been in operation for several years, conventional measurement techniques frequently encounter challenges related to availability, including the lack of well-log data, cost considerations, and precision issues. This study's objective is to enhance reservoir characterization by automating well-log creation using machine-learning techniques. Among the methods are multi-resolution graph-based clustering and the similarity threshold method. By using cutti ...
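As a rough illustration of automated log creation (a stand-in, not the study's multi-resolution graph-based clustering or similarity-threshold methods), the sketch below trains a random forest on synthetic well-log curves and fills a missing sonic (DT) interval from GR, RHOB, and NPHI; all values and column names are hypothetical.

```python
# Sketch: predict a missing log from available logs with a regression model.
# The synthetic table merely mimics the shape of real well-log data.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 400
logs = pd.DataFrame({
    "GR": rng.uniform(20, 150, n),        # gamma ray, API
    "RHOB": rng.uniform(2.0, 2.8, n),     # bulk density, g/cc
    "NPHI": rng.uniform(0.05, 0.40, n),   # neutron porosity, v/v
})
logs["DT"] = 200 - 50 * logs["RHOB"] + 100 * logs["NPHI"] + rng.normal(0, 2, n)
logs.loc[300:, "DT"] = np.nan             # pretend an interval was never logged

known = logs.dropna(subset=["DT"])
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(known[["GR", "RHOB", "NPHI"]], known["DT"])

missing = logs["DT"].isna()
logs.loc[missing, "DT"] = model.predict(logs.loc[missing, ["GR", "RHOB", "NPHI"]])
print(logs["DT"].tail())                  # reconstructed sonic values
```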
The unconventional techniques called "the quick look techniques" have been developed to present well-log data calculations so that they can be scanned easily to identify the zones that warrant more detailed analysis. These techniques, generated by service companies at the well site, are among the most useful because they provide the information needed to make decisions quickly when time is of the essence. The techniques used in this paper are:
- Apparent resistivity Rwa
- Rxo/Rt
The above two methods were used to evaluate the formations of the Nasiriyah oil field (well NS-3) and identify the hydrocarbon-bearing formations. A compu ...
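A hedged sketch of the two quick-look indicators is given below, assuming Archie parameters a = 1 and m = 2 and the common rules of thumb that Rwa well above Rw, or Rxo/Rt clearly below Rmf/Rw, flags possible hydrocarbons; the input values are illustrative and are not data from well NS-3.

```python
# Quick-look indicators sketched from the two techniques listed above.

def apparent_rw(rt: float, phi: float, a: float = 1.0, m: float = 2.0) -> float:
    """Apparent water resistivity: Rwa = Rt * phi**m / a (Archie with Sw = 1)."""
    return rt * phi**m / a

def rxo_rt_indicator(rxo: float, rt: float, rmf: float, rw: float) -> bool:
    """Flag hydrocarbons when Rxo/Rt is clearly lower than Rmf/Rw."""
    return (rxo / rt) < 0.7 * (rmf / rw)

rw = 0.03                                  # formation-water resistivity, ohm.m
rwa = apparent_rw(rt=20.0, phi=0.18)       # zone resistivity and porosity
print("possible hydrocarbons (Rwa):", rwa > 3 * rw)
print("possible hydrocarbons (Rxo/Rt):",
      rxo_rt_indicator(rxo=15.0, rt=20.0, rmf=0.6, rw=rw))
```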
It is an established fact that substantial amounts of oil usually remain in a reservoir after primary and secondary recovery processes; therefore, there is an ongoing effort to sweep that remaining oil. Field optimization includes many techniques, and horizontal wells are one of the most motivating factors for field optimization. The selection of new horizontal wells must be accompanied by the right selection of well locations. However, modeling horizontal well locations by trial and error is time consuming. Therefore, an Artificial Neural Network (ANN) method has been employed to help predict the optimum performance of proposed new well locations by incorporatin ...
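The ANN idea can be sketched, under assumptions not taken from the abstract, as a small network that maps candidate well locations and a couple of reservoir attributes to a performance proxy and then ranks the candidates; the training data below are synthetic, whereas the actual study would train on simulation or field results.

```python
# Sketch: rank candidate horizontal-well locations with a small neural network.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(500, 4))    # [x, y, thickness, permeability], scaled 0-1
y = 100 * X[:, 2] * X[:, 3] + 10 * np.sin(6 * X[:, 0]) + rng.normal(0, 2, 500)

ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
ann.fit(X, y)                            # proxy for cumulative oil production

candidates = rng.uniform(0, 1, size=(20, 4))          # proposed new locations
best = candidates[np.argmax(ann.predict(candidates))]
print("highest-ranked candidate (scaled features):", np.round(best, 3))
```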
Full-waveform airborne laser scanning data has shown its potential to enhance available segmentation and classification approaches through the additional information it can provide. However, this additional information cannot directly provide a valid physical representation of surface features, because many variables affect the backscattered energy during its travel between the sensor and the target. In effect, this produces a mismatch between signals from overlapping flightlines. Therefore, direct use of this information is not recommended without the adoption of a comprehensive radiometric calibration strategy that accounts for all these effects. This paper presents a practical and reliable radiometric calibration r ...
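One building block such a calibration strategy typically includes is a range and incidence-angle normalization of return amplitude so that overlapping flightlines become comparable; the sketch below shows only that simplified correction, with a hypothetical reference range and atmospheric term, not the paper's full procedure.

```python
# Simplified range and incidence-angle correction of return amplitude.
import numpy as np

def normalize_amplitude(amplitude, range_m, incidence_rad,
                        ref_range_m=1000.0, atm_transmittance=0.98):
    """Normalize amplitude to a reference range and zero incidence angle."""
    range_term = (range_m / ref_range_m) ** 2      # spherical-loss compensation
    angle_term = 1.0 / np.cos(incidence_rad)       # Lambertian incidence correction
    return amplitude * range_term * angle_term / atm_transmittance**2

# Two overlapping flightlines observing the same target with different geometry
print(normalize_amplitude(amplitude=80.0, range_m=1200.0, incidence_rad=np.radians(10)))
print(normalize_amplitude(amplitude=110.0, range_m=950.0, incidence_rad=np.radians(25)))
```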
The influx of data in bioinformatics is primarily in the form of DNA, RNA, and protein sequences. This condition places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that can be used to partition unlabeled data into clusters. The k-means and fuzzy c-means (FCM) algorithms are examples of algorithms that can be used for clustering; clustering is thus a common approach that divides an input space into several homogeneous zones and can be achieved using a variety of algorithms. This study used three models to cluster a brain tumor dataset. The first model uses FCM, whic ...
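A minimal clustering sketch follows, using k-means on synthetic feature vectors standing in for the brain tumor dataset (which is not reproduced here); FCM would follow the same pattern with soft cluster memberships, for example via the scikit-fuzzy package.

```python
# Sketch: k-means clustering of feature vectors into three groups.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Three synthetic groups of 4-dimensional feature vectors
X = np.vstack([rng.normal(loc, 0.5, size=(100, 4)) for loc in (0.0, 3.0, 6.0)])

X_scaled = StandardScaler().fit_transform(X)       # scale features first
km = KMeans(n_clusters=3, n_init=10, random_state=42).fit(X_scaled)
print("cluster sizes:", np.bincount(km.labels_))
```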
The study aims to integrate visually impaired people into the art connoisseur community by producing special print artworks that they can experience through their other senses, using artistic printing techniques that add raised materials to the printing colors or create effects the visually impaired can perceive. The study also aims to set up art exhibitions that display tangible works, enabling visually impaired people to feel an artwork and understand its elements through senses other than sight.
The study follows the experimental method, using artistic printing techniques that allow printing with prominent textur ...