Nearly everyone is connected to social media platforms (Facebook, Twitter, LinkedIn, Instagram, etc.), which generate quantities of data that traditional applications are inadequate to process. Social media are regarded as important platforms through which many subscribers share information, opinions, and knowledge. These characteristics also tie Big Data to many issues, such as data collection, storage, transfer, updating, review, posting, scanning, visualization, and data protection. Addressing these problems requires an adequate system that not only prepares the data but also provides meaningful analysis that can be exploited in demanding contexts relevant to business, decision-making, health, social media, science, telecommunications, the environment, and more. Through a review of previous studies, the authors note that various analyses have been performed with Hadoop and its tools, such as real-time sentiment analysis. However, handling such Big Data remains a challenging task, and this type of analysis is efficiently feasible only within the Hadoop ecosystem. The purpose of this paper is to survey the literature on analyzing social media Big Data with the Hadoop framework, covering most of the analysis tools available under the Hadoop umbrella, their orientations, their difficulties, and modern methods for overcoming Big Data challenges in both offline and real-time processing. Real-time analytics accelerates decision-making and provides access to business metrics and reporting. A comparison between Hadoop and Spark is also presented.
In this paper, a wireless network based on the IEEE 802.16e (WiMAX) standard is planned. The targets of this paper are maximized coverage, good service, and low operational fees. The WiMAX network is planned through three approaches. In approach one, the network coverage is maximized by extending cell coverage and selecting the best sites (with a bandwidth (BW) of 5 MHz, 20 MHz per sector, and four sectors per cell). In approach two, interference is analyzed in CNIR mode. In approach three, Quality of Service (QoS) is tested and evaluated. ATDI ICS software (Interference Cancellation System) is used to perform the design. The results show that the planned area covers 90.49% of Baghdad City and used 1000 mob
Density-based spatial clustering of applications with noise (DBSCAN) is one of the most popular clustering methods in data mining, and it is used to identify useful patterns and interesting distributions in the underlying data. It is applied here as an aggregation method for classifying nonlinearly aggregated data, in particular DNA methylation and gene expression data, in which sites are differentially skewed by distance and grouped nonlinearly by cancer disease and by changing conditions of gene expression. Under these conditions, DBSCAN is expected to exhibit a desirable clustering behavior that can be used to show the results of these changes. This research reviews DBSCAN and compares its performance with other algorithms, such as the tradit
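To make the density-based idea concrete, here is a minimal pure-Python sketch of DBSCAN: core points (those with at least `min_pts` neighbors within `eps`) seed clusters, border points join them, and isolated points are labeled noise. The `eps`/`min_pts` values and sample points below are illustrative, not parameters from the study.

```python
# Minimal DBSCAN sketch (pure Python). Labels: 0..k-1 for clusters, -1 for noise.
from math import dist

def dbscan(points, eps=1.0, min_pts=3):
    labels = [None] * len(points)          # None = unvisited
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = [j for j in range(len(points)) if dist(points[i], points[j]) <= eps]
        if len(neighbors) < min_pts:       # not a core point: provisionally noise
            labels[i] = -1
            continue
        cluster += 1                       # start a new cluster from this core point
        labels[i] = cluster
        seeds = [j for j in neighbors if j != i]
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:            # noise reached by a core point -> border
                labels[j] = cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = [k for k in range(len(points)) if dist(points[j], points[k]) <= eps]
            if len(j_nbrs) >= min_pts:     # expand only from core points
                seeds.extend(j_nbrs)
    return labels

# Two dense groups plus one distant noise point
pts = [(0, 0), (0.5, 0), (0, 0.5), (5, 5), (5.5, 5), (5, 5.5), (20, 20)]
print(dbscan(pts, eps=1.0, min_pts=3))
```

Running this labels the two tight groups as clusters 0 and 1 and the far point as noise (-1), which is the behavior the comparison in the paper relies on.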
An experimental and theoretical study was conducted to investigate the thermal performance of different types of air solar collectors. In this work, an air solar collector with dimensions of (120 cm x 90 cm x 12 cm) was tested under the climate conditions of Baghdad city at a 43° tilt angle, using an absorber plate (1.45 mm thick, 115 cm high x 84 cm wide) manufactured from iron painted matt black.
The experimental tests covered five types of absorber:
a conventional smooth flat-plate absorber, a finned absorber, a corrugated absorber plate, an iron wire mesh on the absorber, and a matrix of porous media on the absorber.
The hourly and average efficiency of the collectors
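The hourly efficiency mentioned above is conventionally the ratio of useful heat gained by the air stream to the solar energy incident on the aperture. A hedged sketch of that calculation follows; the mass flow rate, temperatures, and irradiance are illustrative values, not measurements from the study, and only the aperture dimensions come from the text.

```python
# Instantaneous thermal efficiency of an air solar collector:
# eta = m_dot * cp * (T_out - T_in) / (I * A). Sample numbers are illustrative.
def collector_efficiency(m_dot, t_in, t_out, irradiance, area, cp=1005.0):
    """m_dot in kg/s, temperatures in deg C, irradiance in W/m^2, area in m^2,
    cp = specific heat of air at constant pressure (J/kg.K)."""
    useful_gain = m_dot * cp * (t_out - t_in)   # heat carried off by the air (W)
    incident = irradiance * area                # solar power on the aperture (W)
    return useful_gain / incident

# Aperture of the tested collector: 1.20 m x 0.90 m = 1.08 m^2
eta = collector_efficiency(m_dot=0.02, t_in=25.0, t_out=45.0,
                           irradiance=800.0, area=1.08)
print(round(eta, 3))
```

Averaging such hourly values over the test day gives the daily average efficiency that the five absorber types would be compared on.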
Multiple elimination (de-multiple) is one of the seismic processing steps used to remove multiple reflections and delineate the correct primary reflectors. Applying normal moveout to flatten the primaries, and then transforming the data to the frequency-wavenumber domain, is one way to eliminate multiples. The flattened primaries align with the zero axis of the frequency-wavenumber domain, while all other reflection types (multiples and random noise) are distributed elsewhere. A dip filter applied to pass the aligned data and reject the rest separates primaries from multiples once the data are transformed back from the frequency-wavenumber domain to the time-distance domain. A suggested name for this technique is therefore normal moveout-frequency-wavenumber domain
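The workflow above can be sketched numerically: 2-D FFT a time-offset gather into the f-k domain, pass only energy near wavenumber zero (where the NMO-flattened primaries live), and inverse-transform. This is a minimal illustration, not the authors' implementation; the mask width and the synthetic gather are assumptions.

```python
# Hedged f-k dip-filter sketch: flattened (dip-free) events survive,
# dipping energy is rejected.
import numpy as np

def fk_dip_filter(gather, k_keep=1):
    """gather: 2-D array (time samples x traces).
    k_keep: half-width, in wavenumber samples, of the pass band around k = 0."""
    fk = np.fft.fft2(gather)                 # to the frequency-wavenumber domain
    mask = np.zeros_like(fk)
    mask[:, :k_keep + 1] = 1                 # small positive wavenumbers
    mask[:, -k_keep:] = 1                    # small negative wavenumbers
    return np.real(np.fft.ifft2(fk * mask))  # back to time-distance

# A perfectly flat event (constant along traces) passes through unchanged
flat = np.tile(np.sin(np.linspace(0, 6.28, 64))[:, None], (1, 32))
out = fk_dip_filter(flat, k_keep=1)
print(np.allclose(out, flat, atol=1e-6))
```

Because a flattened primary has all its spatial energy at k = 0, it lies entirely inside the pass band, while a residual-moveout multiple maps to nonzero wavenumbers and is attenuated.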
Background: Appreciation of the crucial role of risk factors in the development of coronary artery disease (CAD) is one of the most significant advances in the understanding of this important disease. Extensive epidemiological research has established cigarette smoking, diabetes, hyperlipidemia, and hypertension as independent risk factors for CAD. Objective: To determine the prevalence of the four conventional risk factors (cigarette smoking, diabetes, hyperlipidemia, and hypertension) among patients with CAD, and to determine the correlation of the Thrombolysis in Myocardial Infarction (TIMI) risk score with the extent of CAD in patients with unstable angina/non-ST-elevation myocardial infarction (UA/NSTEMI). Methods: We
This study was accomplished by testing three different models to determine rock types, pore throat radius, and flow units for the Mishrif Formation in the West Qurna oilfield in southern Iraq, based on full-diameter Mishrif cores from 20 wells. The three models used were the Lucia rock-type classification; the Winland plot, used to determine the pore throat radius from the mercury injection test (r35); and the flow zone indicator (FZI) concept, used to identify flow units. Together they enabled us to recognize the differences between the Mishrif units in these three categories. The study of pore characteristics is very significant in reservoir evaluation, since it controls the storage mechanism and reservoir fluid prope
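Both relations named above have standard published forms: the Winland r35 correlation and the FZI definition built from the reservoir quality index and normalized porosity. The sketch below uses those standard forms; the permeability and porosity values are illustrative, not data from the Mishrif cores.

```python
# Standard-form Winland r35 and flow zone indicator (FZI) calculations.
import math

def winland_r35(k_md, phi_pct):
    """Winland pore-throat radius r35 in microns.
    k_md: air permeability in mD; phi_pct: porosity in percent."""
    log_r35 = 0.732 + 0.588 * math.log10(k_md) - 0.864 * math.log10(phi_pct)
    return 10 ** log_r35

def fzi(k_md, phi_frac):
    """Flow zone indicator in microns.
    k_md: permeability in mD; phi_frac: porosity as a fraction."""
    rqi = 0.0314 * math.sqrt(k_md / phi_frac)   # reservoir quality index
    phi_z = phi_frac / (1.0 - phi_frac)         # normalized (pore/grain) porosity
    return rqi / phi_z

# Hypothetical core plug: 100 mD, 20 % porosity
print(round(winland_r35(100.0, 20.0), 2))
print(round(fzi(100.0, 0.20), 2))
```

Samples with similar FZI values are grouped into one flow unit, and r35 bins the rock into Winland port-size classes, which is how the three classification schemes in the study complement each other.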
The main aim of this paper is to give a comprehensive presentation of estimation methods, namely maximum likelihood, Bayes, and proposed methods, for the parameter
In this paper, two new simple, fast, and efficient block matching algorithms are introduced. Both methods begin the block matching process at the image's center block and move across the blocks toward the image boundaries. For each block, the motion vector is initialized by linear prediction from the motion vectors of neighboring blocks that have already been scanned and assessed. A hybrid mechanism is also introduced; it mixes the two proposed predictive mechanisms with the Exhaustive Search (ES) mechanism in order to attain matching accuracy near or equal to that of ES, but with a search time (ST) less than 80% of that of ES, while offering more control over search errors. The experimental tests
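The core idea, searching a small window around a predicted motion vector rather than the whole frame, can be sketched as follows. The SAD cost, block size, search range, and synthetic frames are illustrative assumptions; the paper's own predictor and scan order are not reproduced here.

```python
# Hedged block-matching sketch: search a small window centered on a
# predicted motion vector, minimizing the sum of absolute differences (SAD).
def sad(ref, cur, ry, rx, cy, cx, bs):
    """SAD between a bs x bs block of ref at (ry, rx) and of cur at (cy, cx)."""
    return sum(abs(ref[ry + i][rx + j] - cur[cy + i][cx + j])
               for i in range(bs) for j in range(bs))

def match_block(ref, cur, by, bx, bs, pred, rng=2):
    """Best motion vector (dy, dx) for the cur block at (by, bx), searching a
    (2*rng+1)^2 window around the predicted vector pred."""
    h, w = len(ref), len(ref[0])
    best, best_mv = None, (0, 0)
    for dy in range(pred[0] - rng, pred[0] + rng + 1):
        for dx in range(pred[1] - rng, pred[1] + rng + 1):
            y, x = by + dy, bx + dx
            if 0 <= y <= h - bs and 0 <= x <= w - bs:   # stay inside the frame
                cost = sad(ref, cur, y, x, by, bx, bs)
                if best is None or cost < best:
                    best, best_mv = cost, (dy, dx)
    return best_mv

# Reference frame: 8x8 gradient; current frame = reference shifted up one row,
# so the true motion vector of an interior block is (1, 0).
ref = [[8 * r + c for c in range(8)] for r in range(8)]
cur = [ref[(r + 1) % 8] for r in range(8)]
print(match_block(ref, cur, by=2, bx=2, bs=4, pred=(0, 0)))
```

A good predictor from already-matched neighbors lets `rng` stay small, which is where the reported speedup over exhaustive search comes from: ES would evaluate every position in the frame instead of a (2*rng+1)^2 window.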
Two simple, sensitive, and accurate methods are described for the determination of terazosin. The spectrophotometric method (A) is based on measuring the absorbance of the ion-pair complex formed between terazosin and eosin Y in an acetate buffer medium at pH 3, at 545 nm. Method (B) is based on the quantitative quenching effect of terazosin on the native fluorescence of eosin Y at pH 3; the quenching of the fluorescence of eosin Y was measured at 556 nm after excitation at 345 nm. The two methods obey Beer's law over concentration ranges of 0.1-8 and 0.05-7 µg/mL for methods A and B, respectively. Both methods succeeded in the determination of terazosin in its tablets
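Obeying Beer's law over a range means absorbance is linear in concentration there, so unknowns are read off a least-squares calibration line. The sketch below illustrates that step; the absorbance readings, slope, and intercept are synthetic, not data from the paper, and only the 0.1-8 µg/mL range of method A is taken from the text.

```python
# Hedged Beer's law calibration sketch: fit A = m*c + b by ordinary least
# squares, then invert the line to read an unknown concentration.
def fit_line(xs, ys):
    """Ordinary least squares for y = m*x + b."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - m * mean_x
    return m, b

# Synthetic standards spanning the 0.1-8 ug/mL range of method A
conc = [0.1, 1.0, 2.0, 4.0, 8.0]            # ug/mL
absb = [0.005 + 0.12 * c for c in conc]     # hypothetical slope 0.12, intercept 0.005
m, b = fit_line(conc, absb)
unknown = (0.485 - b) / m                   # concentration for a sample reading
print(round(m, 3), round(b, 3), round(unknown, 2))
```

The same inversion applies to the fluorescence-quenching method B, with the quenched intensity at 556 nm taking the place of absorbance.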