Everyone is connected through social media platforms (Facebook, Twitter, LinkedIn, Instagram, etc.) that generate large quantities of data which traditional applications are inadequate to process. Social media are regarded as important platforms through which many subscribers share information, opinions, and knowledge. These characteristics also tie Big Data to many issues, such as data collection, storage, transfer, updating, reviewing, posting, scanning, visualization, and data protection. To deal with all these problems, there is a need for a system that not only prepares the data but also provides meaningful analysis to support decisions in difficult situations relevant to business, health, social media, science, telecommunications, the environment, and other domains. Through a review of previous studies, the authors observe that many kinds of analysis, such as real-time sentiment analysis, are carried out with Hadoop and its various tools. However, dealing with this Big Data is a challenging task, and such analysis is feasible at scale mainly through the Hadoop ecosystem. The purpose of this paper is to review the literature on the analysis of social media Big Data using the Hadoop framework, covering the main analysis tools available under the Hadoop umbrella and their orientations, together with the difficulties involved and the modern methods used to overcome the challenges of Big Data in offline and real-time processing. Real-time analytics accelerates decision-making and provides access to business metrics and reporting. A comparison between Hadoop and Spark is also presented.
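The offline (batch) processing that Hadoop performs follows the MapReduce model. The word-count toy below is a minimal pure-Python sketch of that model only; the document strings are invented for illustration, and this is not Hadoop code itself:

```python
from collections import defaultdict

def map_phase(doc):
    # Emit (word, 1) pairs, as a Hadoop mapper would.
    for word in doc.lower().split():
        yield word, 1

def shuffle(pairs):
    # Group values by key, as Hadoop's shuffle/sort stage does.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum the counts per word, as a Hadoop reducer would.
    return {word: sum(values) for word, values in grouped.items()}

docs = ["big data on social media", "big data analysis"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"])  # → 2
```

In a real cluster the map, shuffle, and reduce stages run distributed across nodes; Spark expresses the same pipeline in memory, which is the basis of the speed comparison the paper discusses.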
Social networking has come to dominate the world by providing a platform for information dissemination, yet people often share information without knowing whether it is true. Social networks are now used to gain influence in many fields, such as elections and advertising, and it is not surprising that social media has become a weapon for manipulating sentiment by spreading disinformation. Propaganda is a systematic and deliberate attempt to influence people for political or religious gain. In this research paper, efforts were made to classify propagandist text from non-propagandist text using supervised machine learning algorithms. Data was collected from news sources from July 2018 to August 2018. After annota
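The supervised classification step described above can be sketched with a minimal multinomial Naive Bayes classifier in pure Python. The training sentences and labels below are invented placeholders, not the paper's annotated news dataset, and the paper's actual choice of algorithms is not specified here:

```python
import math
from collections import Counter, defaultdict

def train(samples):
    # samples: list of (text, label) pairs.
    class_counts = Counter(label for _, label in samples)
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in samples:
        for word in text.lower().split():
            word_counts[label][word] += 1
            vocab.add(word)
    return class_counts, word_counts, vocab

def predict(text, model):
    class_counts, word_counts, vocab = model
    total = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        # Log prior plus Laplace-smoothed log likelihoods.
        score = math.log(class_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

samples = [
    ("our glorious leader will crush the traitors", "propaganda"),
    ("the enemy spreads lies to deceive you", "propaganda"),
    ("the council approved the new budget today", "news"),
    ("rainfall was higher than average this month", "news"),
]
model = train(samples)
print(predict("the glorious leader will crush the enemy", model))  # → propaganda
```

In practice the annotated corpus would be far larger and the features richer (n-grams, TF-IDF), but the train/predict split above is the shape of any supervised pipeline of this kind.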
Antiviral medications may be the best choice for COVID-19 treatment until specific therapeutic treatments become available. Tamiflu (oseltamivir) is a neuraminidase inhibitor licensed for the treatment and prevention of influenza types A and B. Oseltamivir-based medication combinations are currently being used to treat COVID-19 patients infected with the novel coronavirus SARS-CoV-2. Oseltamivir administration has been associated with a shorter hospital stay, quicker recovery and discharge, and a decreased mortality rate. Docking is a modern computational method for identifying a hit molecule by assessing the binding ability of candidate drug molecules within the binding pocket of a target. In this work, we chose 21 ligand compounds that
Simultaneous determination of Furosemide, Carbamazepine, Diazepam, and Carvedilol in bulk and pharmaceutical formulations using partial least squares regression (PLS-1 and PLS-2) is described in this study. The two methods were successfully applied to estimate the four drugs in their quaternary mixture using UV spectral data of 84 synthetic mixtures in the range of 200-350 nm at intervals of Δλ = 0.5 nm. The linear concentration range was 1-20 μg.mL-1 for all four drugs. The correlation coefficients (R2) and root mean square errors of calibration (RMSE) for FURO, CARB, DIAZ, and CARV were 0.9996, 0.9998, 0.9997, and 0.9997, and 0.1128, 0.1292, 0.1868, and 0.1562, respectively, for PLS-1; for PLS-2 they were 0.9995, 0.9999, 0.9997, and 0.9998, and 0.1127, 0.
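The PLS-1 calibration described above can be sketched with the NIPALS algorithm in NumPy. The "spectra" here are synthetic mixtures of two invented Gaussian band profiles, not the paper's UV data, and the two-component model recovers the concentration of one analyte from overlapping signals:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    # NIPALS for PLS1: returns regression coefficients for centered X.
    X = X - X.mean(axis=0)
    y = y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)        # weight vector
        t = X @ w                     # score vector
        p = X.T @ t / (t @ t)         # X loading
        q = (y @ t) / (t @ t)         # y loading
        X = X - np.outer(t, p)        # deflate X and y
        y = y - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, Q)

rng = np.random.default_rng(0)
profile_a = np.exp(-((np.arange(50) - 15) ** 2) / 40.0)  # invented band
profile_b = np.exp(-((np.arange(50) - 35) ** 2) / 60.0)  # invented band
conc_a = rng.uniform(1, 20, 30)       # calibration concentrations
conc_b = rng.uniform(1, 20, 30)
X = np.outer(conc_a, profile_a) + np.outer(conc_b, profile_b)
X += rng.normal(0, 0.01, X.shape)     # small measurement noise
beta = pls1_fit(X, conc_a, n_components=2)
x_new = 10 * profile_a + 5 * profile_b      # unknown mixture spectrum
pred = float((x_new - X.mean(axis=0)) @ beta + conc_a.mean())
print(round(pred, 1))
```

PLS-2 differs only in fitting all four analyte concentrations simultaneously as a response matrix rather than one column at a time.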
Empirical and statistical methodologies have been established to acquire accurate permeability identification and reservoir characterization based on rock type and reservoir performance. Rock facies are usually identified either by using core analysis to interpret lithofacies visually or indirectly from well-log data. Traditional facies prediction from well-log data is characterized by uncertainty and can be time-consuming, particularly when working with large datasets, whereas machine learning can predict patterns more efficiently when applied to large data. Taking the electrofacies distribution into account, this work was conducted to predict permeability for the four wells, FH1, FH2, F
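Log-based permeability prediction of this kind can be sketched with a tiny k-nearest-neighbours regressor. The gamma-ray/porosity readings and permeabilities below are invented toy values, not data from wells FH1 or FH2, and the paper's actual model choice is not specified here:

```python
import math

def knn_predict(train, query, k=3):
    # train: list of ((gamma_ray_api, porosity_frac), permeability_md) pairs.
    dists = sorted(
        (math.dist(features, query), perm) for features, perm in train
    )
    nearest = dists[:k]                     # k closest log signatures
    return sum(perm for _, perm in nearest) / k

train = [
    ((30.0, 0.25), 500.0),  # clean, porous rock: high permeability
    ((35.0, 0.22), 350.0),
    ((80.0, 0.10), 5.0),    # shaly, tight rock: low permeability
    ((90.0, 0.08), 1.0),
    ((60.0, 0.15), 50.0),
]
print(knn_predict(train, (32.0, 0.24)))  # → 300.0
```

Grouping samples by electrofacies first, as the study does, amounts to restricting the neighbour search (or fitting a separate model) within each facies class.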
The present work reports the performance of three types of polyethersulfone (PES) membrane in the removal of the highly polluting and toxic lead (Pb2+) and cadmium (Cd2+) ions from single-salt solutions. The study investigated the effect of operating variables, including pH, type of PES membrane, and feed concentration, on the separation process. The transport parameters and mass transfer coefficient (k) of the membranes were estimated using the combined film theory-solution-diffusion (CFSD), combined film theory-Spiegler-Kedem (CFSK), and combined film theory-finely-porous (CFFP) membrane transport models. Various parameters were used to estimate the enrichment factors, concentration polarization modulus, and Péclet number. The pH values signif
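The Spiegler-Kedem relation underlying the CFSK model gives the real rejection as a function of volume flux Jv, the reflection coefficient σ, and the solute permeability P. The parameter values below are illustrative assumptions, not fitted values from this study:

```python
import math

def spiegler_kedem_rejection(jv, sigma, perm):
    # Real rejection R = sigma * (1 - F) / (1 - sigma * F),
    # with F = exp(-Jv * (1 - sigma) / P).
    f = math.exp(-jv * (1.0 - sigma) / perm)
    return sigma * (1.0 - f) / (1.0 - sigma * f)

sigma = 0.95      # assumed reflection coefficient
perm = 2.0e-6     # assumed solute permeability, m/s
for jv in (1e-6, 5e-6, 2e-5):   # volume flux, m/s
    print(jv, round(spiegler_kedem_rejection(jv, sigma, perm), 3))
```

As the loop shows, rejection rises with flux and approaches σ asymptotically; combining this with film theory (the "CF" part) corrects the observed rejection for concentration polarization at the membrane surface.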
Quality control charts are limited to controlling one characteristic of a production process, and they need a large amount of data to determine the control limits. Another limitation of the traditional control chart is that it does not handle vague data, whereas fuzzy control charts work with the uncertainty that exists in the data and also investigate the random variation between samples. In modern industries, production often consists of many different designs in small volumes, driven by market demand (short-run production) and implemented on the same types of machines in the production units. In such cases, it is difficult to determine the contr
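One common fuzzy-chart step can be sketched as follows: vague measurements are represented as triangular fuzzy numbers (low, mode, high), defuzzified by the centroid (a + b + c) / 3, and then crisp Shewhart-style limits are computed from the resulting sample means. The sample values and the centroid defuzzification choice are illustrative assumptions, not the paper's method:

```python
def centroid(tfn):
    # Centroid defuzzification of a triangular fuzzy number (a, b, c).
    a, b, c = tfn
    return (a + b + c) / 3.0

samples = [  # each sample: three vague measurements of one characteristic
    [(4.8, 5.0, 5.3), (4.9, 5.1, 5.2), (5.0, 5.2, 5.4)],
    [(4.7, 5.0, 5.1), (5.0, 5.1, 5.3), (4.8, 5.0, 5.2)],
    [(4.9, 5.1, 5.4), (4.8, 5.0, 5.1), (5.0, 5.3, 5.5)],
]
means = [sum(centroid(x) for x in s) / len(s) for s in samples]
grand_mean = sum(means) / len(means)
spread = (sum((m - grand_mean) ** 2 for m in means) / (len(means) - 1)) ** 0.5
ucl = grand_mean + 3 * spread   # upper control limit
lcl = grand_mean - 3 * spread   # lower control limit
in_control = all(lcl <= m <= ucl for m in means)
print(in_control)
```

Fully fuzzy variants keep the limits themselves as fuzzy numbers and compare sample membership against them instead of defuzzifying first.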
The Iraqi economy suffered from inflation for a long period because of the war and the resolutions and sanctions imposed on Iraq. This phenomenon affected various aspects of the economy, including the tax revenue through which the State seeks to maximize the total income of the budget. The research covers the years 1990-2010, divided according to the country's economic variables.
The research adopted an econometric analysis based on the available information and data, using statistical methods to test the formulated functions.
The research concluded that the impact of inflation rates and GDP is limited to direct and indirect taxation at current prices a
Microfluidic devices provide distinct benefits for developing effective drug assays and screening, and microfluidic platforms may offer a faster and less expensive alternative. Fluids are confined in devices with micrometer-scale dimensions, and owing to this tight confinement, drug assay volumes are minute (milliliters down to femtoliters). In this research, a microfluidic chip consisting of micro-channels engraved on an acrylic (polymethyl methacrylate, PMMA) substrate was fabricated using a carbon dioxide (CO2) laser machine. The CO2 laser parameters influence the channel's width, depth, and roughness. To obtain a regular channel surface and low roughness, the laser power (60 W) with scanning speed (250 m/s)
Error control schemes have become a necessity in network-on-chip (NoC) design to improve reliability, as on-chip interconnect errors increase with the continuous shrinking of geometry. Accordingly, many researchers are trying to present multi-bit error correction coding schemes that deliver high error correction capability with the simplest possible design, to minimize area and power consumption. A recent scheme, Multi-bit Error Correcting Coding with Reduced Link Bandwidth (MECCRLB), showed a huge reduction in area and power consumption compared to a well-known scheme, Hamming product code (HPC) with Type-II HARQ. Moreover, its authors showed that the scheme can correct 11 random errors, which is considered a high
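The single-error correction that Hamming-based NoC schemes such as HPC build on can be shown with a Hamming(7,4) encode/decode round trip over one flipped link bit. This illustrates the underlying principle only, not the MECCRLB scheme itself:

```python
def hamming74_encode(d):
    # Hamming(7,4): 4 data bits, 3 parity bits at positions 1, 2, 4.
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]   # codeword positions 1..7

def hamming74_decode(c):
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]        # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]        # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]        # parity check over positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3       # 1-based position of the error
    if syndrome:
        c = c.copy()
        c[syndrome - 1] ^= 1              # correct the flipped bit
    return [c[2], c[4], c[5], c[6]]       # recover the data bits

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
codeword[4] ^= 1                           # inject a single link error
print(hamming74_decode(codeword) == data)  # → True
```

A product code such as HPC arranges data in a grid and applies Hamming codes along rows and columns, which is how it reaches multi-bit correction; MECCRLB achieves its capability with a different, lower-overhead construction.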