This paper proposes two hybrid feature subset selection approaches based on the combination (union or intersection) of supervised and unsupervised filter approaches before applying a wrapper, aiming to obtain low-dimensional feature subsets with high accuracy and interpretability at low computational cost. Experiments with the proposed hybrid approaches have been conducted on seven high-dimensional datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA), and K-nearest neighbour (KNN). Experimental results demonstrate the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional spaces, in terms of both the number of selected features and the time needed to achieve the best classification accuracy.
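The union/intersection idea behind the hybrid filters can be sketched as follows. This is a minimal illustration, not the paper's exact pipeline: the supervised filter here is absolute correlation with the class label and the unsupervised filter is feature variance, both assumed stand-ins for whatever filters the paper actually uses.

```python
import numpy as np

# Illustrative sketch: rank features with a supervised filter (|correlation|
# with the label) and an unsupervised filter (variance), then combine the two
# top-k sets by union or intersection before handing the pool to a wrapper.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))          # 200 samples, 50 features
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

k = 15  # how many top-ranked features each filter keeps (illustrative choice)

corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
sup_top = np.argsort(corr)[::-1][:k]              # supervised ranking
unsup_top = np.argsort(X.var(axis=0))[::-1][:k]   # unsupervised ranking

union_pool = np.union1d(sup_top, unsup_top)       # broader candidate pool
inter_pool = np.intersect1d(sup_top, unsup_top)   # stricter candidate pool

# A wrapper (e.g., sequential search scored by SVM/LDA/KNN accuracy) would
# then explore only union_pool or inter_pool instead of all 50 features.
print(len(union_pool), len(inter_pool))
```

The wrapper's search space shrinks from all features to at most 2k candidates (union) or at most k (intersection), which is where the reported savings in selection time would come from.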
The development of information systems (IS) in recent years has contributed various methods of gathering information to evaluate IS performance. The most common approach used to collect information is the survey. This method, however, suffers one major drawback: decision makers spend considerable time transferring data from survey sheets into analytical programs. This paper therefore proposes a method called 'survey algorithm based on R programming language', or SABR, for transforming data from survey sheets within the R environment by treating the arrangement of data as a relational format. R and the relational data format provide an excellent opportunity to manage and analyse the accumulated data. Moreover, a survey syste
Ti6Al4V alloy is widely used in aerospace and medical applications. It is classified as a difficult-to-machine material due to its low thermal conductivity and high chemical reactivity. In this study, hybrid intelligent models have been developed to predict surface roughness when end milling Ti6Al4V alloy with a physical vapor deposition (PVD) coated tool under dry cutting conditions. A back-propagation neural network (BPNN) has been hybridized with two heuristic optimization techniques, namely the gravitational search algorithm (GSA) and the genetic algorithm (GA). The Taguchi method was used with an L27 orthogonal array to generate 27 experimental runs. Design-Expert software was used to perform analysis of variance (ANOVA). The experimental data were
Abstract: The increased interest in developing new photonic devices that can support high data rates, high sensitivity, and fast processing capabilities for all-optical communications motivates research on a pre-stage pulse compressor. The pre-stage research was based on cascading a single-mode fiber and a polarization-maintaining fiber to obtain pulse compression with a compression factor of 1.105. Driven by the demand for more precise photonic devices, this work experimentally studied the behavior of a polarization-maintaining fiber (PMF) sandwiched between two cascaded single-mode fibers (SMF) and fiber Bragg gratings (FBG). Therefore, the introduced interferometer performed hybrid interference of both Mach-Zehnder
Finding orthogonal matrices of different sizes is a complex and important task because such matrices can be used in various applications, such as image processing and communications (e.g., CDMA and OFDM). In this paper, we introduce a new method for finding orthogonal matrices by taking tensor products of two or more orthogonal matrices of real and imaginary numbers, and we apply it to image and communication-signal processing. The output matrices are orthogonal as well, and processing with the new method is very easy compared to classical methods that rely on basic proofs. The results are acceptable for communication signals and images, but further research is needed.
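The key property behind the tensor-product construction above is standard linear algebra: if A and B are orthogonal, then (A ⊗ B)ᵀ(A ⊗ B) = (AᵀA) ⊗ (BᵀB) = I, so the Kronecker (tensor) product is again orthogonal. A minimal sketch with two small, arbitrarily chosen orthogonal matrices:

```python
import numpy as np

# Two small orthogonal matrices: a 2x2 rotation and a normalized Hadamard matrix.
A = np.array([[np.cos(0.3), -np.sin(0.3)],
              [np.sin(0.3),  np.cos(0.3)]])
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# Kronecker (tensor) product of two orthogonal matrices is again orthogonal.
M = np.kron(A, H)

# Verify: M.T @ M equals the 4x4 identity (up to floating-point error).
print(np.allclose(M.T @ M, np.eye(4)))  # True
```

Repeating the product builds orthogonal matrices of size 8, 16, and so on, which is why the construction scales easily to the matrix sizes needed in OFDM- or CDMA-style applications.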
Cloud storage provides scalable, low-cost resources featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. To protect the privacy of the data owner, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes combining compressive sensing and video deduplication to maximize
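The core tension named above, that conventional encryption makes identical plaintexts look different and so defeats deduplication, is commonly resolved with convergent (message-locked) encryption, where the key is derived from the plaintext itself. This is a toy sketch of that standard idea, not the scheme proposed in the paper; the XOR "cipher" stands in for a real block cipher and is not secure.

```python
import hashlib

def convergent_encrypt(data: bytes) -> bytes:
    """Toy message-locked encryption: key = H(plaintext), XOR keystream.

    Identical plaintexts always yield identical ciphertexts, which is what
    makes cross-user deduplication of encrypted data possible.
    """
    key = hashlib.sha256(data).digest()
    stream = (key * (len(data) // len(key) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))

store = {}  # cloud-side dedup index: H(ciphertext) -> ciphertext

def upload(data: bytes) -> bool:
    """Returns True if the blob was deduplicated (already stored)."""
    ct = convergent_encrypt(data)
    tag = hashlib.sha256(ct).hexdigest()
    if tag in store:
        return True          # duplicate found without ever decrypting
    store[tag] = ct
    return False

print(upload(b"same chunk"))  # False: first copy is stored
print(upload(b"same chunk"))  # True: duplicate detected server-side
```

The cloud never sees plaintext, yet two users uploading the same chunk produce the same ciphertext tag, so only one copy is kept; the known weakness of this approach (offline brute-force on predictable plaintexts) is one of the security issues such schemes must address.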
The research aims to form a clear theoretical philosophy and set of perceptions about strategic entrepreneurship through the relationship between high-involvement management practices, which form the basis for creating such entrepreneurship, and high-performance work systems as a supporting tool for achieving it, following the proposals of Hitt et al. (2011), in an attempt to generalize this theoretical philosophy and set out how to apply it within the Iraqi environment. On this basis, the problem of the current research was formulated to bridge the knowledge gap between the previous proposals and the possibility of their application, aiming to identify the practices of high-involvement management and the possibility of high-performance work systems and thei
Cloud computing (CC) is a fast-growing technology that offers computing, networking, and storage services that can be accessed and used over the internet. Cloud services save users money because they are pay-per-use, and they save time because they are on-demand and elastic, a distinctive aspect of cloud computing. However, several security issues must be addressed before users store data in the cloud. Because users have no direct control over data outsourced to the cloud, particularly personal and sensitive data (health, finance, military, etc.), and do not know where the data are stored, they must ensure that the cloud stores and maintains the outsourced data appropriately. The study's primary goals are to mak