In this paper, a membrane-computing approach to image segmentation, both region-based and edge-based, is proposed for medical images; it involves two types of neighborhood relations between pixels, namely 4-adjacency and 8-adjacency. These neighborhood relations are used to construct a family of tissue-like P systems that segment actual 2D medical images in a constant number of steps, and the two types of adjacency were compared on different hardware platforms. The process involves generating membrane-based segmentation rules for 2D medical images. The rules are written in the P-Lingua format and appended to the input image for visualization. The findings show that the 8-adjacency neighborhood relation gives better results than the 4-adjacency relation, because 8-adjacency considers the eight pixels around the center pixel, which reduces the communication rules required to obtain the final segmentation. The experimental results show that the proposed approach is superior in terms of the number of computational steps and processing time. To the best of our knowledge, this is the first time an evaluation procedure has been conducted to evaluate the efficiency of real-image segmentation using membrane computing.
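As background for the two adjacency schemes compared above, the following minimal Python sketch (illustrative only, not the paper's P-Lingua rules; the function and variable names are assumptions) enumerates the 4-adjacent and 8-adjacent neighbors of a pixel, which is the relation the segmentation rules are defined over.

# Illustrative sketch of the two pixel-neighborhood relations
# (not the paper's P-Lingua implementation; names are assumed).

OFFSETS_4 = [(-1, 0), (1, 0), (0, -1), (0, 1)]                 # 4-adjacency
OFFSETS_8 = OFFSETS_4 + [(-1, -1), (-1, 1), (1, -1), (1, 1)]   # 8-adjacency

def neighbors(row, col, height, width, offsets):
    """Return the in-bounds neighbors of pixel (row, col) for a given adjacency."""
    return [(row + dr, col + dc)
            for dr, dc in offsets
            if 0 <= row + dr < height and 0 <= col + dc < width]

# An interior pixel of a 5x5 image has 4 or 8 neighbors, so 8-adjacency
# lets each rule inspect more of the local context in a single step.
print(len(neighbors(2, 2, 5, 5, OFFSETS_4)))  # 4
print(len(neighbors(2, 2, 5, 5, OFFSETS_8)))  # 8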
This study aims to prepare a standards code for sustainability requirements, in order to contribute to a better understanding of the concept of sustainability assessment systems for Iraqi projects in general and for high-rise buildings in particular. Iraq is one of the developing countries that face significant environmental, economic, and social sustainability challenges, so it has become necessary to develop an effective building sustainability assessment system that respects the local context in Iraq. This study presents a proposal for a system for assessing the sustainability requirements of Iraqi high-rise buildings (ISHTAR), which has been developed through several integrated
Visual analytics has become an important approach for discovering patterns in big data. As visualization already struggles with the high dimensionality of data, issues such as concept hierarchies on each dimension add further difficulty and can make visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All of these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visu
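To make the cube operations named above concrete, here is a minimal pandas sketch (illustrative only; the column names and toy data are assumptions, not from the paper) showing roll-up, drill-down, slicing, and dicing on a small sales cube.

import pandas as pd

# Toy fact table (hypothetical): dimensions = year, region, product; measure = sales.
facts = pd.DataFrame({
    "year":    [2020, 2020, 2021, 2021, 2021],
    "region":  ["N", "S", "N", "S", "N"],
    "product": ["A", "A", "B", "A", "B"],
    "sales":   [10, 7, 12, 5, 9],
})

# Base cuboid: aggregate the measure over all three dimensions.
cube = facts.groupby(["year", "region", "product"])["sales"].sum()

roll_up = cube.groupby(["year", "region"]).sum()   # roll-up: aggregate out the product dimension
drill_down = cube                                  # drill-down: return to the finer (year, region, product) cuboid
slice_ = cube.xs(2021, level="year")               # slice: fix one dimension (year = 2021)
dice = facts[(facts.year == 2021) & (facts.region == "N")]  # dice: sub-cube over two dimensions

print(roll_up)
print(slice_)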
This research aims to analyze and simulate real biochemical test data in order to uncover the relationships among the tests and how each of them affects the others. The data were acquired from an Iraqi private biochemical laboratory. However, these data have many dimensions, a high rate of null values, and a large number of patients. Several experiments were then applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB).
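As an illustration of the workflow described above (not the authors' actual data or tuning; the synthetic dataset, missing-value rate, and parameter choices below are assumptions), a minimal scikit-learn sketch that imputes null values and compares the listed classifiers with cross-validation:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.pipeline import make_pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for the (private) biochemical dataset, with nulls injected.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
rng = np.random.default_rng(0)
X[rng.random(X.shape) < 0.05] = np.nan   # about 5% missing values

models = {
    "LDA":  LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(random_state=0),
    "LR":   LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(),
    "NB":   GaussianNB(),
}

for name, model in models.items():
    # Preprocessing (imputation, scaling) followed by the classifier.
    pipe = make_pipeline(SimpleImputer(strategy="mean"), StandardScaler(), model)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")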
Copper telluride thin films with thicknesses of 700 nm and 900 nm were prepared by thermal evaporation onto cleaned Si substrates kept at 300 K under a vacuum of about 4×10⁻⁵ mbar. XRD analysis and AFM measurements were used to study the structural properties. The sensitivity (S) of the fabricated sensors to NO2 and H2 was measured at room temperature. The experimental relationship between S and the thickness of the sensitive film was investigated, and higher S values were recorded for the thicker sensors. The results showed that the best sensitivity was obtained for the 900 nm thick Cu2Te film exposed to H2 gas.
Data communication has been growing steadily in the present day. Therefore, data encryption has become essential for secure data transmission and storage and for protecting data contents from intruders and unauthorized persons. In this paper, a fast technique for text encryption based on a genetic algorithm is presented. The encryption is achieved by the genetic operators crossover and mutation. The proposed technique is based on dividing the plain-text characters into pairs and applying the crossover operation between them, followed by the mutation operation, to obtain the encrypted text. The experimental results show that the proposal provides an important improvement in encryption rate with comparatively high-speed processing.
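The following minimal Python sketch illustrates the general idea of pair-wise crossover followed by mutation on character codes; it is an interpretation rather than the paper's exact algorithm, and the crossover point and key-derived mutation mask are assumptions.

# Illustrative GA-style text encryption: crossover between paired character
# codes, then mutation by XOR with a key byte (an assumed scheme).

CROSS_MASK = 0x0F  # swap the low 4 bits of each pair (assumed crossover point)

def crossover(codes):
    out = list(codes)
    for i in range(0, len(out) - 1, 2):
        a, b = out[i], out[i + 1]
        out[i]     = (a & ~CROSS_MASK & 0xFF) | (b & CROSS_MASK)
        out[i + 1] = (b & ~CROSS_MASK & 0xFF) | (a & CROSS_MASK)
    return out

def mutate(codes, key):
    # XOR each code with a repeating key byte; XOR is its own inverse.
    return [c ^ key[i % len(key)] for i, c in enumerate(codes)]

def encrypt(text, key=b"k3y"):
    return bytes(mutate(crossover(text.encode("utf-8")), key))

def decrypt(cipher, key=b"k3y"):
    # Reverse the operations: undo mutation, then undo crossover (self-inverse).
    return bytes(crossover(mutate(cipher, key))).decode("utf-8")

cipher = encrypt("plain text")
assert decrypt(cipher) == "plain text"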
In many oil fields, only BHC (borehole compensated sonic tool) logs are available to provide the interval transit time (Δtp), the reciprocal of the compressional wave velocity Vp.
To calculate the elastic or inelastic rock properties and to detect gas-bearing formations, the shear wave velocity Vs is needed. Vs is also useful in fluid identification and matrix mineral identification.
Because of the lack of wells with shear wave velocity data, many empirical models have been developed to predict the shear wave velocity from the compressional wave velocity. Some are mathematical models; others use the multiple regression method or neural network techniques.
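A widely cited example of such an empirical relation is Castagna's mudrock line for water-saturated clastic rocks, Vs = 0.8621 Vp − 1.1724 (velocities in km/s). The short sketch below (illustrative only; the example transit-time value is assumed, not taken from the study) applies it to a logged BHC reading, with Vp obtained as the reciprocal of the interval transit time.

# Hypothetical example: predict Vs from Vp with Castagna's mudrock line
# (Vs = 0.8621 * Vp - 1.1724, velocities in km/s); the input value is illustrative.

def vp_from_transit_time(dt_us_per_ft):
    """Compressional velocity in km/s from a sonic interval transit time in µs/ft."""
    return (1e6 / dt_us_per_ft) * 0.0003048

def vs_mudrock(vp_km_s):
    """Castagna et al. (1985) mudrock-line estimate of Vs (km/s) from Vp (km/s)."""
    return 0.8621 * vp_km_s - 1.1724

dt = 80.0                      # example BHC reading, µs/ft (assumed value)
vp = vp_from_transit_time(dt)  # about 3.81 km/s
print(f"Vp = {vp:.2f} km/s, predicted Vs = {vs_mudrock(vp):.2f} km/s")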
In this study a number of em
In the present work, strong-lensing observations of several gravitational lenses have been adopted to study the geometry of the universe and to explain the physics and the size of the quasars. The first procedure was to study the geometry of the lensing system in order to determine the relation between the redshift of the gravitational observations and their distances. The second procedure was to compare the angular diameter distances (DA) calculated for the Euclidean case with those from the Friedmann models, and then to evaluate the diameter of the lens system. The results indicate that the phenomenon is governed by the ratio of the distance between the lens and the source to the diameter of the lens.
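As a worked illustration of the second procedure (not the paper's actual computation; the H0 value, the Einstein-de Sitter form chosen for the Friedmann-model DA, and the example lens numbers are assumptions), the physical diameter of the lens follows from the small-angle relation d = θ · DA:

import math

C_KM_S = 299792.458   # speed of light, km/s
H0 = 70.0             # assumed Hubble constant, km/s/Mpc

def d_a_euclidean(z):
    """Angular diameter distance (Mpc) in the simple Euclidean/Hubble-law case, D = c z / H0."""
    return C_KM_S * z / H0

def d_a_eds(z):
    """Angular diameter distance (Mpc) in a flat, matter-dominated (Einstein-de Sitter) model."""
    return (2.0 * C_KM_S / H0) * (1.0 - 1.0 / math.sqrt(1.0 + z)) / (1.0 + z)

def lens_diameter_kpc(theta_arcsec, d_a_mpc):
    """Physical diameter (kpc) subtending angle theta at angular diameter distance D_A."""
    theta_rad = theta_arcsec * math.pi / (180.0 * 3600.0)
    return theta_rad * d_a_mpc * 1000.0

z_lens, theta = 0.5, 2.0   # illustrative lens redshift and angular size (arcsec)
for name, d in (("Euclidean", d_a_euclidean(z_lens)), ("EdS", d_a_eds(z_lens))):
    print(f"{name}: D_A = {d:.0f} Mpc, diameter = {lens_diameter_kpc(theta, d):.1f} kpc")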