Visual analytics has become an important approach for discovering patterns in big data. Visualization already struggles with the high dimensionality of data, and issues such as concept hierarchies on each dimension add further difficulty, making visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All of these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visualization technique for visual analytics of data cubes using parallel coordinates. The proposed technique extends parallel coordinates to a 3D space to reflect concept hierarchy …
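The roll-up and drill-down operations named above can be sketched as grouped aggregations over a fact table. The following is a minimal illustration in Python with pandas, using hypothetical toy sales data (the dimension and measure names are assumptions, not from the paper):

```python
# Sketch of cube roll-up / drill-down as grouped aggregation,
# on a hypothetical toy fact table with two dimensions and one measure.
import pandas as pd

df = pd.DataFrame({
    "region": ["North", "North", "South", "South"],
    "month":  ["Jan", "Feb", "Jan", "Feb"],
    "sales":  [100, 150, 80, 120],
})

# Drill-down view: one cell per (region, month) combination.
detail = df.groupby(["region", "month"], as_index=False)["sales"].sum()

# Roll-up: climb the concept hierarchy by dropping the `month` dimension.
rollup = df.groupby("region", as_index=False)["sales"].sum()
```

Rolling up collapses the four detail cells into two region-level cells; drilling down reverses the move by reintroducing the finer dimension.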
In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid dispersion and output from petroleum resources. Traditionally, obtaining these crucial reservoir properties requires taking rock samples from the reservoir and performing suitable laboratory studies. Although kr is primarily a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impossible to make kr measurements in all of them. The unsteady-state approach was used to calculate the relat
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original data (non-polluted), while the second assumed data contamination. Simulation experiments were conducted for different sample sizes and initial parameter values, and under different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in both the natural and contaminated data cases.
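The estimation workflow described above can be sketched by minimizing a negative log-likelihood with the Downhill Simplex (Nelder-Mead) method. Since the compound exponential Weibull-Poisson density is not reproduced in the abstract, a plain two-parameter Weibull stands in here; the sample and the starting values are assumptions for illustration only:

```python
# Sketch of Downhill Simplex (Nelder-Mead) parameter estimation:
# minimize the negative log-likelihood directly. A two-parameter
# Weibull stands in for the compound distribution; the same recipe
# applies with the compound density substituted.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
true_k, true_lam = 1.5, 2.0
x = true_lam * rng.weibull(true_k, size=2000)  # simulated "clean" sample

def neg_log_lik(theta):
    k, lam = theta
    if k <= 0 or lam <= 0:
        return np.inf  # keep the simplex inside the valid parameter region
    z = x / lam
    return -np.sum(np.log(k / lam) + (k - 1) * np.log(z) - z**k)

res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
k_hat, lam_hat = res.x
```

The simplex needs no derivatives, which is what makes it attractive when the compound likelihood is awkward to differentiate.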
Wireless sensor applications are subject to energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are two widely used strategies for reducing energy usage and extending the lifetime of wireless sensor networks. In target-tracking applications, a large amount of redundant data is produced regularly; hence, the deployment of effective data aggregation schemes is vital to eliminate data redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target-tracking applications, as the selection of an appropriate clustering algorithm may yield positive results in the data aggregati
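The clustering-plus-aggregation idea surveyed above can be sketched in a few lines: group nodes spatially, then have each cluster forward one aggregated reading instead of every raw sample. This is a minimal illustration with k-means; the node positions, readings, and cluster count are hypothetical, not drawn from any of the surveyed schemes:

```python
# Sketch of clustering-based data aggregation in a sensor network:
# nodes are grouped spatially with k-means, and each cluster head
# forwards a single mean reading in place of all raw samples.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
positions = rng.uniform(0, 100, size=(60, 2))  # 60 nodes on a 100x100 field
readings = rng.normal(25.0, 2.0, size=60)      # redundant raw measurements

k = 4  # assumed number of clusters / cluster heads
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(positions)

# Each cluster head transmits one aggregate: 60 packets reduced to 4.
aggregated = [readings[labels == c].mean() for c in range(k)]
```

The energy saving comes from the reduced packet count; the choice of clustering algorithm governs how well the clusters match the target's trajectory.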
In this research, the Iraqi flagpole at Baghdad University, the tallest in Baghdad with a height of 75 m, was monitored. Given the importance of this structure, its displacement (vertical deviation) was monitored using a Total Station device. Several observations were taken at different times over two years; the monitoring started in November 2016 and continued until May 2017, at a rate of four observations per year. The observations were adjusted using the least squares method with circle fitting, and the data was then processed. The deviation was calculated using a Matlab program to compute the values of the corrections, where
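The least-squares circle fit mentioned above can be sketched with the algebraic (Kåsa) formulation, which reduces the fit to a linear system. The example below is in Python rather than Matlab, and the sample points are synthetic, not the survey observations:

```python
# Sketch of a least-squares circle fit (algebraic Kasa method):
# rewrite x^2 + y^2 = 2*a*x + 2*b*y + c and solve linearly for
# (a, b, c), where (a, b) is the centre and r = sqrt(c + a^2 + b^2).
import numpy as np

rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi, 50)
cx, cy, r = 3.0, -1.0, 5.0  # synthetic ground truth, not survey data
x = cx + r * np.cos(theta) + rng.normal(0, 0.01, 50)
y = cy + r * np.sin(theta) + rng.normal(0, 0.01, 50)

A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
b_vec = x**2 + y**2
(a_hat, b_hat, c_hat), *_ = np.linalg.lstsq(A, b_vec, rcond=None)
r_hat = np.sqrt(c_hat + a_hat**2 + b_hat**2)
```

With observations taken around the pole's cross-section, the fitted centre at each epoch gives a point whose drift over time is the measured deviation.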
Eye loss may be caused by eye trauma, accidents, or malignant tumors, leading the patient to undergo surgery to remove the damaged parts. This research examines the potential of computer vision, represented by Structure from Motion (SfM) photogrammetry, for fabricating the orbital prosthesis as a noninvasive, low-cost technique. A low-cost camera was used to collect the data for extracting dense 3D data of the patient's facial features following Structure from Motion-Multi View Stereo (SfM-MVS) algorithms. To restore the defective orbital region, a Reverse Engineering (RE) based approach was applied using similarity RE algorithms based on the opposite healthy eye to rehabilitate the defective orbital precisely.
Neuro-ophthalmology, bridging neurology and ophthalmology, highlights the nervous system’s crucial role in vision, encompassing afferent and efferent pathways. The evolution of this field has emphasized the importance of neuroanatomy for precise surgical interventions, presenting educational challenges in blending complex anatomical knowledge with surgical skills. This review examines the interplay between neuroanatomy and surgical practices in neuro-ophthalmology, aiming to identify educational gaps and suggest improvements.
A literature search across databases such as PubMed, Scopus, and W
In this paper, an integrated quantum neural network (QNN), which is a class of feedforward neural networks (FFNNs), is constructed by combining quantum computing (QC) with an artificial neural network (ANN) classifier. It is used as a data classification technique, and here the iris flower data set is used as the classification signals. For this purpose, independent component analysis (ICA) is used as a feature extraction technique after normalization of these signals. The architecture of QNNs has inherently built-in fuzzy hidden units, which develop quantized representations of the sample information provided by the training data set at various graded levels of certainty. Experimental results presented here show that
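The preprocessing pipeline described above (normalization, then ICA feature extraction, then classification of the iris data) can be sketched as follows. The quantum neural network itself is not reproduced; a classical logistic regression stands in for it, and the component count and split are assumptions:

```python
# Sketch of the normalize -> ICA -> classify pipeline on the iris data.
# A classical logistic regression stands in for the QNN classifier,
# which this sketch does not attempt to reproduce.
from sklearn.datasets import load_iris
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

clf = make_pipeline(
    StandardScaler(),                          # normalization step
    FastICA(n_components=3, random_state=0),   # ICA feature extraction
    LogisticRegression(max_iter=1000),         # stand-in for the QNN
)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

The point of the sketch is the ordering of the stages: the classifier, whether quantum or classical, sees the independent components rather than the raw measurements.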