Extracting knowledge from raw data has yielded useful information in several domains. The widespread use of social media has produced extraordinary quantities of social data. Put simply, social media provides an accessible platform on which users share information. Data mining can reveal relevant patterns that are useful to users, businesses, and customers. Social media data are noisy, massive, unstructured, and dynamic by nature, so new challenges arise. The purpose of this study is to survey the data mining methods applied to social networks, adopting a criteria-based review plan and selecting a number of papers to serve as the foundation for this article. After a careful evaluation of these papers, it was found that numerous data-mining approaches have been applied to social media data to address a variety of research goals across several industrial and service fields. However, data mining applications are still immature and require further work from industry and academia to make them sufficiently robust. To conclude this analysis: data mining is the principal tool for uncovering hidden information in large datasets, particularly in social network analysis, and it represents a key social media technology.
The removal of the anti-inflammatory drug ibuprofen (IBU) by a UV/H2O2/Fe2+ photo-degradation system was investigated in a batch reactor under different initial concentrations of H2O2 (100–500 mg/L), Fe2+ (10–40 mg/L), pH (3–9), and IBU (10–80 mg/L), and their relationship with the degradation efficiency was studied. The results demonstrated that the maximum elimination of IBU was 85.54%, achieved at 300 mg/L of H2O2, 30 mg/L of Fe2+, pH = 3, and an irradiation time of 150 min, for 10 mg/L of IBU. The results have shown that the oxidation reagent H2O2 plays a very important role in IBU degradation.
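For reference, the removal (degradation) efficiency reported above is conventionally computed from the initial and residual concentrations (a standard formula; $C_0$ and $C_t$ denote the IBU concentration before treatment and after irradiation time $t$):

$$\eta\,(\%) = \frac{C_0 - C_t}{C_0} \times 100$$

At 85.54% removal of a 10 mg/L solution, the residual concentration is about 1.45 mg/L.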
Three-dimensional (3D) image and medical image processing, which are considered big-data analyses, have attracted significant attention during the last few years. To this end, efficient 3D object recognition techniques could benefit such image and medical image processing. However, to date, most of the proposed methods for 3D object recognition face major challenges in terms of high computational complexity. This is because computational complexity and execution time grow as the dimensionality of the object grows, which is the case in 3D object recognition. Therefore, finding an efficient method that achieves high recognition accuracy with low computational complexity is essential.
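As a rough illustration of why recognition cost explodes with dimensionality (this sketch and its numbers are not from the paper), the snippet below counts multiply-accumulate operations for a naive cross-correlation of an object volume with a template:

```python
def naive_xcorr_ops(n: int, k: int, d: int) -> int:
    """Multiply-accumulate count for naive cross-correlation of an
    n-per-side d-dimensional image with a k-per-side template
    (output positions approximated as n**d for simplicity)."""
    return (n ** d) * (k ** d)

# Illustrative numbers only: a 256-per-side object, 7-per-side template.
for d in (2, 3):
    print(f"{d}D: {naive_xcorr_ops(256, 7, d):.2e} MACs")
# 2D: ~3.2e+06; 3D: ~5.8e+09, i.e. roughly 1800x more work
```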
Word sense disambiguation (WSD) is a significant field in computational linguistics, as it is indispensable for many language-understanding applications. Automatic processing of documents is made difficult by the fact that many of the terms they contain are ambiguous. WSD systems try to resolve these ambiguities and find the correct meaning. Genetic algorithms can be applied to this problem, since they have been used effectively for many optimization problems. In this paper, a genetic algorithm is proposed to solve the word sense disambiguation problem by automatically selecting the intended meaning of a word in context without any additional resources. The proposed algorithm is evaluated on a collection …
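A minimal sketch of a genetic algorithm for sense selection (the toy sense inventory, gloss-overlap fitness, and all parameters are illustrative assumptions, not the paper's actual setup):

```python
import random

# Hypothetical sense inventory: word -> list of candidate sense glosses.
SENSES = {
    "bank": ["financial institution for money", "sloping land beside a river"],
    "deposit": ["money placed in an account", "layer of sediment left by water"],
}
CONTEXT = {"money", "account", "loan"}  # illustrative context words

def fitness(chromosome):
    """Score a sense assignment by gloss overlap with the context."""
    score = 0
    for word, sense_idx in chromosome.items():
        gloss_words = set(SENSES[word][sense_idx].split())
        score += len(gloss_words & CONTEXT)
    return score

def mutate(chromosome, rate=0.2):
    child = dict(chromosome)
    for word in child:
        if random.random() < rate:
            child[word] = random.randrange(len(SENSES[word]))
    return child

def crossover(a, b):
    return {w: random.choice((a[w], b[w])) for w in a}

def run_ga(generations=50, pop_size=20):
    pop = [{w: random.randrange(len(s)) for w, s in SENSES.items()}
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

print(run_ga())  # e.g. the money-related senses for this context
```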
The main goal of this investigation is to dispose of waste materials by transforming them into high-fineness powder and producing self-consolidating concrete (SCC) at lower cost and in a more eco-friendly way by reducing the cement weight, taking the fresh and strength properties into consideration. The reference mix design was prepared following the European guide. Five waste materials (clay brick, ceramic, granite tiles, marble tiles, and thermostone blocks) were ground to a high-fineness particle-size distribution and then used as 5, 10, and 15% weight replacements of cement. The improvement in strength properties is most significant when using clay brick, compared with the other activated waste materials.
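A small helper showing how the stated 5–15% weight replacements translate into mix quantities (illustrative only; the 400 kg/m³ cement content is an assumed figure, not taken from the study):

```python
def replace_cement(cement_kg_per_m3: float, replacement_pct: float):
    """Split a mix's cement mass into remaining cement and waste powder."""
    powder = cement_kg_per_m3 * replacement_pct / 100.0
    return cement_kg_per_m3 - powder, powder

for pct in (5, 10, 15):
    cement, powder = replace_cement(400.0, pct)  # assumed 400 kg/m3 mix
    print(f"{pct}%: {cement:.0f} kg cement + {powder:.0f} kg waste powder")
```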
A green and low-cost method was used to prepare graphene oxide (GO) and reduced graphene oxide (rGO) by chemical exfoliation of graphite powder via a modified Hummers method, followed by reduction using ascorbic acid. X-ray diffractometry (XRD) and field emission scanning electron microscopy (FE-SEM) were used to analyze the structure and morphology of the synthesized materials. Fourier transform infrared spectroscopy (FTIR) and ultraviolet-visible spectroscopy were used to confirm the formation of the GO and rGO.
In the present work, theoretical relations are derived to evaluate the efficiency of generating the third and fourth harmonics using a crystal-cascading configuration. These relations can be applied to a wide class of nonlinear optical materials. Calculations are made for a beta barium borate (BBO) crystal with a ruby laser (λ = 694.3 nm). The case study involves producing the third harmonic of the fundamental beam at λ = 231.4 nm. The efficiency formula involves many parameters, which can be varied to enhance the efficiency. The results showed that the efficiency does not vary linearly with the crystal length, and that the efficiency increases as the input power increases. The walk-off length is calculated for …
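The reported wavelength is consistent with the standard harmonic relation (shown here only as a check):

$$\lambda_n = \frac{\lambda_1}{n}, \qquad \lambda_3 = \frac{694.3\ \text{nm}}{3} \approx 231.4\ \text{nm}$$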
Digital images are now used in various fields, such as physics, computer science, engineering, chemistry, biology, and medicine, to extract important information. However, any image acquired by optical or electronic means is likely to be degraded by the sensing environment. In this paper, the iterative Tikhonov-Miller filter and the Wiener filter are studied and derived using a criterion function. The filters are then used to restore degraded images, and it is shown that the iterative Tikhonov-Miller filter performs better as the number of iterations increases, up to a certain limit, after which performance decreases. The iterative Tikhonov-Miller filter also performs better for less degraded images.
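A minimal frequency-domain Wiener restoration sketch (textbook form with a known point-spread function and a constant noise-to-signal ratio K; not the paper's exact derivation):

```python
import numpy as np

def wiener_restore(degraded: np.ndarray, psf: np.ndarray, K: float = 0.01):
    """Restore `degraded` given the blur PSF using the classic Wiener filter
    G = conj(H) / (|H|^2 + K), with K approximating the noise-to-signal ratio."""
    H = np.fft.fft2(psf, s=degraded.shape)   # PSF spectrum, zero-padded
    G = np.conj(H) / (np.abs(H) ** 2 + K)    # Wiener transfer function
    restored = np.fft.ifft2(G * np.fft.fft2(degraded))
    return np.real(restored)

# Illustrative usage with a synthetic 3x3 box blur:
img = np.random.rand(64, 64)
psf = np.ones((3, 3)) / 9.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(psf, s=img.shape) * np.fft.fft2(img)))
print(wiener_restore(blurred, psf, K=1e-3).shape)  # (64, 64)
```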
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding (LPC). The compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while discarding the detail coefficients; the approximation coefficients are then windowed with a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample, and they are very small compared with the original signals. The compression ratio is calculated from the sizes of the original and compressed files.
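A sketch of the described pipeline (wavelet approximation coefficients followed by LP analysis via a hand-rolled Levinson-Durbin recursion; the PyWavelets dependency, decomposition level, and LP order are illustrative assumptions):

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def levinson_durbin(r, order):
    """Solve the Yule-Walker equations via the Levinson-Durbin
    recursion; returns (LP coefficients, reflection coeffs, error)."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    k = np.zeros(order)
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k[i - 1] = -acc / err
        a_prev = a.copy()
        a[1:i + 1] = a_prev[1:i + 1] + k[i - 1] * a_prev[i - 1::-1]
        err *= 1.0 - k[i - 1] ** 2
    return a, k, err

# Toy "speech": a 220 Hz tone plus noise, 8 kHz sample rate.
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 220 * np.arange(8000) / 8000)
signal += 0.05 * rng.standard_normal(8000)

coeffs = pywt.wavedec(signal, "db4", level=3)
approx = coeffs[0]  # keep approximation, drop detail coefficients
r = np.correlate(approx, approx, "full")[len(approx) - 1:]  # autocorrelation
lpc, refl, pred_err = levinson_durbin(r, order=10)
print(len(signal), "->", len(approx), "coefficients; LP error:", pred_err)
```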