Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as decision trees and nearest neighbor search. The proposed method can handle streaming data efficiently and, for entropy discretization, provide the optimal split value.
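As a minimal illustration of entropy-based discretization (a naive exhaustive scan, not the paper's multi-resolution summarization algorithm; the function names are hypothetical), the following sketch finds the cut point on a numeric attribute that minimizes the weighted class entropy of the two resulting partitions:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_entropy_split(values, labels):
    """Scan candidate cut points (midpoints between consecutive distinct
    values) and return (threshold, weighted entropy) of the best split."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best_t, best_e = None, float("inf")
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no valid cut between equal values
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        e = (len(left) * entropy(left) + len(right) * entropy(right)) / n
        if e < best_e:
            best_e = e
            best_t = (pairs[i - 1][0] + pairs[i][0]) / 2
    return best_t, best_e
```

A streaming or summarization-based variant would replace the exhaustive scan with statistics maintained in the summary structure; this sketch only shows the split criterion itself.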
Intelligent buildings offer various incentives, yet they often exhibit highly inefficient energy use caused by non-stationary building environments. In the presence of such dynamic excitation, with higher levels of nonlinearity and the coupling effect of temperature and humidity, the HVAC system transitions from underdamped to overdamped indoor conditions. This leads to highly inefficient energy use and fluctuating indoor thermal comfort. To address these concerns, this study develops a novel framework based on deep clustering of Lagrangian trajectories for multi-task learning (DCLTML), adding a pre-cooling coil in the air handling unit (AHU) to alleviate the coupling issue. The proposed DCLTML exhibits great overall control and is
Investigating gender differences based on emotional changes is essential to understanding various human behaviors in daily life. Ten students from the University of Vienna were recruited, and an electroencephalogram (EEG) dataset was recorded while they watched four short emotional video clips (anger, happiness, sadness, and neutral) of audiovisual stimuli. In this study, conventional filter and wavelet transform (WT) denoising techniques were applied as a preprocessing stage, and the Hurst exponent
We present here a 65-year-old lady with an unusual presentation of a large epigastric hernia of twenty years' duration. The swelling occupied the entire right hypochondrial region. The diagnosis was made on operative identification of the defect in the linea alba, which was sutured after removal of the hernial sac and its contents. The postoperative course was uneventful, and the patient remained free of complications or recurrence over more than two years of follow-up.
In this paper, the problem of developing turbulent flow in a rectangular duct is investigated by obtaining numerical results for the velocity profiles in the duct using a two-dimensional large eddy simulation model with different Reynolds numbers, filter equations, and mesh sizes. Reynolds numbers range from 11,000 to 110,000 for velocities of 1 m/s to 50 m/s, with 56×56, 76×76, and 96×96 mesh sizes and different filter equations. The numerical results of the large eddy simulation model are compared with the k-ε model and the analytic velocity distribution, and validated against the experimental data of other researchers. The large eddy simulation model shows good agreement with the experimental data for high Reynolds numbers with the first, seco
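For context on the quoted Reynolds-number range, a small sketch of the standard relations is given below (the duct dimensions and kinematic viscosity are illustrative assumptions, not values from the abstract): the hydraulic diameter of a rectangular duct is Dh = 4A/P = 2ab/(a+b), and Re = V·Dh/ν.

```python
def hydraulic_diameter(a, b):
    """Hydraulic diameter (m) of a rectangular duct with sides a and b:
    Dh = 4A/P = 2ab/(a + b)."""
    return 2 * a * b / (a + b)

def reynolds(velocity, d_h, nu=1.5e-5):
    """Reynolds number Re = V * Dh / nu.
    nu defaults to an assumed kinematic viscosity of air (~1.5e-5 m^2/s)."""
    return velocity * d_h / nu
```

For example, an assumed 0.2 m × 0.2 m duct at 1 m/s gives Re on the order of 10^4, consistent with the lower end of the range studied.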
Steganography is a technique of concealing secret data within other everyday files of the same or different types. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the images of a video file. A video steganography model is proposed by training a model to hide a video (or images) within another video using convolutional neural networks (CNNs). Using a CNN in this approach serves a main goal of any steganographic method: increasing security (the difficulty of the hidden data being observed and broken by steganalysis programs), which was achieved in this work because the weights and architecture are randomized. Thus,
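For contrast with the CNN-based approach, the classical baseline it improves upon is least-significant-bit (LSB) embedding, where each cover pixel's lowest bit carries one bit of the secret. This sketch is illustrative only and is not the paper's method:

```python
def embed_lsb(pixels, bits):
    """Embed a list of bits (0/1) into the least significant bit of the
    first len(bits) cover pixel values."""
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear LSB, then set it to the secret bit
    return out

def extract_lsb(pixels, n):
    """Recover the first n embedded bits from the stego pixels."""
    return [p & 1 for p in pixels[:n]]
```

Because LSB embedding follows a fixed, public rule, it is easily detected by steganalysis; the randomized weights and architecture described in the abstract are precisely what removes such a fixed pattern.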
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault detection performance, which is measured by the average percentage of fault detection (APFD). History-based TCP is one of the TCP techniques that considers the history of past data to prioritize test cases. The allocation of equal priority to multiple test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To work around it, most researchers resort to random ordering of the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
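The APFD metric mentioned above has a standard closed form: for n test cases and m faults, APFD = 1 − (ΣTFᵢ)/(nm) + 1/(2n), where TFᵢ is the position of the first test case that reveals fault i. A minimal sketch (the function name and input format are assumptions for illustration):

```python
def apfd(order, faults_detected):
    """APFD for a prioritized test order.
    order: test case ids in execution order.
    faults_detected: dict mapping test id -> set of fault ids it reveals."""
    n = len(order)
    all_faults = set().union(*faults_detected.values())
    m = len(all_faults)
    # TF_i: 1-based position of the first test revealing fault i
    tf = {}
    for pos, test in enumerate(order, start=1):
        for fault in faults_detected.get(test, set()):
            tf.setdefault(fault, pos)
    return 1 - sum(tf[f] for f in all_faults) / (n * m) + 1 / (2 * n)
```

Running the fault-revealing tests earlier lowers the TFᵢ values and therefore raises the APFD, which is why the ordering produced by a TCP technique matters.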
Image processing applications are currently spreading rapidly in industrial agriculture. Sorting agricultural fruits by color ranks first among the many studies conducted in this field. It is therefore worthwhile to develop an agricultural crop separator with a low economic cost that works automatically to increase the effectiveness and efficiency of sorting agricultural crops. In this study, colored pepper fruits were sorted using a Pixy2 camera on the basis of algorithmic image analysis, and using a TCS3200 color sensor on the basis of analyzing the outer surface of the pepper fruits; the separation process thus classifies each pepper according to its color
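As a toy illustration of the color-classification step (a dominant-channel rule on raw RGB readings; the actual Pixy2 and TCS3200 outputs and the study's decision logic are not specified in the abstract):

```python
def classify_pepper(r, g, b):
    """Assign a color label from an (r, g, b) reading by dominant channel.
    Illustrative only; real sensor calibration would be required."""
    if r > g and r > b:
        return "red"
    if g > r and g > b:
        return "green"
    if b > r and b > g:
        return "blue"
    return "undetermined"
```

A real sorter would calibrate thresholds against the sensor's measured response for each pepper variety rather than use raw channel dominance.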