Nearly everyone is connected through social media platforms such as Facebook, Twitter, LinkedIn, and Instagram, which generate quantities of data that traditional applications are inadequate to process. Social media are regarded as an important platform through which many subscribers share information, opinions, and knowledge. These characteristics also tie Big Data to many issues, such as data collection, storage, movement, updating, reviewing, posting, scanning, visualization, and data protection. Dealing with these problems requires an adequate system that not only prepares the data but also provides meaningful analysis that can be exploited in difficult situations relevant to business, decision making, health, social media, science, telecommunications, the environment, and so on. Through a review of previous studies, the authors note that different analyses have been carried out with Hadoop and its various tools, such as real-time sentiment analysis. However, dealing with such Big Data is a challenging task, and this type of analysis is performed more efficiently through the Hadoop ecosystem. The purpose of this paper is to review the literature on Big Data analysis of social media using the Hadoop framework, covering most of the analysis tools available under the Hadoop umbrella and their orientations, as well as their difficulties and the modern methods used to overcome the challenges of Big Data in offline and real-time processing. Real-time analytics accelerates decision-making and provides access to business metrics and reporting. A comparison between Hadoop and Spark is also illustrated.
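The abstract surveys tools rather than presenting code, but a minimal PySpark sketch (an illustration added here, not taken from the paper) shows the kind of batch word-count analysis over collected social-media text that is typically used to contrast Spark's in-memory processing with Hadoop MapReduce. The input file name is a hypothetical placeholder and pyspark is assumed to be installed.

# Minimal PySpark sketch (illustrative only, not from the paper): a batch word count
# over a hypothetical file of collected tweets, the classic example used to compare
# Spark's in-memory processing with Hadoop MapReduce.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("social-media-wordcount").getOrCreate()

# Hypothetical input: one tweet per line, already collected to HDFS or local disk.
tweets = spark.read.text("tweets.txt")

word_counts = (
    tweets
    .select(F.explode(F.split(F.lower(F.col("value")), r"\s+")).alias("word"))
    .where(F.col("word") != "")
    .groupBy("word")
    .count()
    .orderBy(F.desc("count"))
)

word_counts.show(20)
spark.stop()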
The main objective of this paper is to develop and validate a flow injection method: a precise, accurate, simple, economical, low-cost, and specific turbidimetric method for the quantitative determination of mebeverine hydrochloride (MbH) in pharmaceutical preparations. A homemade NAG Dual & Solo (0-180º) analyser, which contains two identical detection units (cells 1 and 2), was applied for the turbidity measurements. The developed method was optimized for different chemical and physical parameters such as precipitating reagent concentration, aqueous salt solutions, flow rate, intensity of the light sources, sample volume, mixing coil, and purge time. The correlation coefficients (r) of the developed method were 0.9980 and 0.9986 for cell
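As an illustration of the reported figures of merit (not the paper's data), the sketch below fits a straight calibration line and computes the correlation coefficient r with NumPy; the concentration and signal values are hypothetical placeholders.

# Illustrative sketch: correlation coefficient r of a turbidimetric calibration curve.
# The concentrations and signals below are hypothetical, not the paper's measurements.
import numpy as np

conc = np.array([1.0, 2.0, 4.0, 8.0, 16.0])        # hypothetical MbH concentrations
signal = np.array([0.12, 0.25, 0.49, 0.98, 1.95])  # hypothetical turbidity responses

slope, intercept = np.polyfit(conc, signal, 1)      # linear calibration fit
r = np.corrcoef(conc, signal)[0, 1]                 # correlation coefficient

print(f"calibration: signal = {slope:.4f}*conc + {intercept:.4f}, r = {r:.4f}")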
Background: Synthetic hydroxyapatite, Ca10(PO4)6(OH)2, can bond directly to bone without infection or fibrous encapsulation and is therefore regarded as bioactive and biocompatible. The aim of the study was to estimate bone microarchitecture parameters, including bone mass (gm/cm2), cortical bone width (mm), thread width (mm), marrow space star volume (V*m), and osteoblast and osteocyte cell numbers. Materials and methods: Ninety-six (96) commercially pure titanium (CpTi) implants were used in this study; (48) implants were coated with HA by dip coating and (48) implants were used as controls. They were inserted in (32) New Zealand white rabbits and followed for 2 and 6 weeks. Mechanical torque removal testing and histomorphometric analysis of bone microarchitecture
The penalized regression model has received considerable attention for variable selection, where it plays an essential role in dealing with high-dimensional data. The arctangent (Atan) penalty has recently been used as an efficient method for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable and to heavy-tailed error distributions, while least absolute deviation (LAD) is a good way to obtain robustness in regression estimation. The specific objective of this research is to propose a robust Atan estimator by combining these two ideas. Simulation experiments and real data applications show that the proposed LAD-Atan estimator
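A minimal sketch of what a LAD-Atan objective might look like (not the authors' implementation): LAD loss plus an Atan penalty, here assumed to take the form lambda*(gamma + 2/pi)*arctan(|beta|/gamma), minimized with a generic derivative-free optimizer on simulated heavy-tailed data.

# Illustrative LAD-Atan sketch on simulated data; penalty form and tuning values
# (lam, gamma) are assumptions for demonstration, not the paper's settings.
import numpy as np
from scipy.optimize import minimize

def atan_penalty(beta, lam=0.1, gamma=0.05):
    # Assumed Atan penalty form: lam * (gamma + 2/pi) * arctan(|beta_j| / gamma)
    return lam * (gamma + 2.0 / np.pi) * np.sum(np.arctan(np.abs(beta) / gamma))

def lad_atan_objective(beta, X, y, lam=0.1, gamma=0.05):
    residuals = y - X @ beta
    return np.sum(np.abs(residuals)) + atan_penalty(beta, lam, gamma)

rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.normal(size=(n, p))
true_beta = np.array([2.0, 0.0, -1.5, 0.0, 0.0])
y = X @ true_beta + rng.standard_t(df=2, size=n)   # heavy-tailed noise

result = minimize(lad_atan_objective, x0=np.zeros(p), args=(X, y),
                  method="Nelder-Mead", options={"maxiter": 5000, "xatol": 1e-6})
print("estimated beta:", np.round(result.x, 3))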
The research aimed to build and test a structural equation model of the tourist attraction factors in the Asir Region. The research population is the people of the region, from which a simple random sample of 332 individuals was selected. Factor analysis, a reliable statistical method for this kind of phenomenon, was used to build and test the structural model of tourism, and the data were analyzed using the SPSS and AMOS statistical packages. The study reached a number of results, the most important of which are: the tourist attraction model consists of five factors that explain 69.3% of the total variance, namely the provision of tourist services, social and historical factors, mountains, weather, and natural parks. And the differenc
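For illustration only (not the study's questionnaire data), the sketch below extracts five factors from simulated survey responses with scikit-learn's FactorAnalysis and inspects the loadings; the number of items and the random responses are hypothetical stand-ins for the 332 questionnaires.

# Illustrative factor-analysis sketch on hypothetical 5-point Likert responses.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
responses = rng.integers(1, 6, size=(332, 20)).astype(float)  # hypothetical survey items

X = StandardScaler().fit_transform(responses)
fa = FactorAnalysis(n_components=5, rotation="varimax", random_state=0)
scores = fa.fit_transform(X)

loadings = fa.components_.T              # rows = items, columns = factors
print("loadings shape:", loadings.shape)
print(np.round(loadings[:5], 2))         # loadings of the first five items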
This research work dealt with the problem of laying out the production line of the roof-fan engine at the General Company for Electrical Industries (GCEI). It was observed that the engine assembly line was unstable and subject to severe fluctuations; moreover, the execution of tasks at some stations was very fast while at others it was slow. This resulted in bottlenecks between workstations, idle time, and work in process. The system design was used to assign tasks to workstations according to different heuristics (Ranked Positional Weight, Longest Task Time, Most Following Tasks, Shortest Task Time, Least Number of Following Tasks).
The study revealed that th
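As an illustration of one of the heuristics listed above, the following sketch implements the Ranked Positional Weight rule on hypothetical task times and precedence relations; it is not the company's data or the authors' code.

# Illustrative Ranked Positional Weight (RPW) line-balancing sketch.
# Task times, precedence relations, and the cycle time are hypothetical.
from collections import defaultdict

task_time = {"A": 4, "B": 3, "C": 5, "D": 2, "E": 6, "F": 4}
successors = {"A": ["C", "D"], "B": ["D"], "C": ["E"], "D": ["E"], "E": ["F"], "F": []}
predecessors = defaultdict(set)
for t, succs in successors.items():
    for s in succs:
        predecessors[s].add(t)

def positional_weight(task):
    # Task time plus the times of every transitive successor.
    seen, stack, total = set(), list(successors[task]), task_time[task]
    while stack:
        s = stack.pop()
        if s not in seen:
            seen.add(s)
            total += task_time[s]
            stack.extend(successors[s])
    return total

cycle_time = 10
ranked = sorted(task_time, key=positional_weight, reverse=True)

stations, assigned = [], set()
while len(assigned) < len(task_time):
    load, station = 0, []
    progress = True
    while progress:
        progress = False
        for t in ranked:
            # Assign the highest-ranked feasible task: predecessors done, capacity left.
            if (t not in assigned and predecessors[t] <= assigned
                    and load + task_time[t] <= cycle_time):
                station.append(t)
                assigned.add(t)
                load += task_time[t]
                progress = True
                break
    stations.append((station, load))

for i, (tasks, load) in enumerate(stations, 1):
    print(f"Station {i}: {tasks} (load {load}/{cycle_time})")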