Nearly everyone is connected to social media platforms (Facebook, Twitter, LinkedIn, Instagram, etc.), which generate quantities of data that traditional applications are inadequate to process. Social media are regarded as important platforms through which many subscribers share information, opinions, and knowledge. These platforms also raise many Big Data issues, such as data collection, storage, movement, updating, reviewing, posting, scanning, visualization, and data protection. Addressing all these problems requires an adequate system that not only prepares the data but also provides meaningful analysis to take advantage of difficult situations in business, decision-making, health, social media, science, telecommunications, the environment, and other domains. From a review of previous studies, the authors note that various analyses, such as real-time sentiment analysis, have been carried out with Hadoop and its tools. However, handling such Big Data remains a challenging task, and this type of analysis is efficiently feasible only through the Hadoop ecosystem. The purpose of this paper is to survey the literature on analyzing social-media Big Data with the Hadoop framework, in order to cover most of the analysis tools available under the Hadoop umbrella and their orientations, as well as their difficulties and modern methods for overcoming Big Data challenges in both offline and real-time processing. Real-time analytics accelerates decision-making while providing access to business metrics and reporting. A comparison between Hadoop and Spark is also presented.
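The batch-processing model that Hadoop implements at scale can be illustrated with a minimal, framework-free sketch of the MapReduce idea the abstract above alludes to: map each record to (key, 1) pairs, shuffle by key, then reduce by summing. The sample posts are hypothetical, and real Hadoop jobs would distribute these phases across a cluster.

```python
from collections import defaultdict

# Hypothetical social-media posts standing in for a large corpus
posts = [
    "big data needs hadoop",
    "spark processes big data in memory",
    "hadoop handles batch big data",
]

def map_phase(records):
    # Emit (word, 1) for every word, like a Hadoop Mapper
    for record in records:
        for word in record.split():
            yield word, 1

def shuffle_and_reduce(pairs):
    # Group by key and sum counts, like the shuffle + Reducer stages
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

word_counts = shuffle_and_reduce(map_phase(posts))
```

Spark expresses the same pipeline as in-memory transformations (`flatMap`, `reduceByKey`), which is the main source of its speed advantage over disk-based Hadoop MapReduce noted in the comparison.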
The parameters of the two-parameter gamma distribution in the case of missing data have been estimated using two important methods: the maximum likelihood method and the shrinkage method. For the former, three methods are used to solve the non-linear maximum likelihood equation and obtain the maximum likelihood estimators: the Newton-Raphson, Thom, and Sinha methods. The Thom and Sinha methods are adapted by the researcher to suit the missing-data case. Furthermore, the Bowman, Shenton, and Lam method, which relies on the three-parameter gamma distribution to obtain the maximum likelihood estimators, has been extended. The methods are compared experimentally to find the best method.
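As a rough illustration of one of the estimators named above, the following sketch applies Thom's closed-form approximation for the shape and scale of a two-parameter gamma distribution. Missing values are simply discarded here, which is only a crude stand-in; the paper's adjusted estimators for missing data are not reproduced. The sample values are hypothetical.

```python
import math

def thom_gamma_fit(sample):
    # Thom's approximation: with s = ln(mean) - mean(ln x),
    # shape ≈ (1 + sqrt(1 + 4s/3)) / (4s) and scale = mean / shape.
    data = [x for x in sample if x is not None]  # drop missing observations
    mean = sum(data) / len(data)
    mean_log = sum(math.log(x) for x in data) / len(data)
    s = math.log(mean) - mean_log                # >= 0 by Jensen's inequality
    shape = (1 + math.sqrt(1 + 4 * s / 3)) / (4 * s)
    scale = mean / shape
    return shape, scale

shape, scale = thom_gamma_fit([1.2, None, 0.8, 2.5, 1.7, None, 3.1])
```

By construction the fitted distribution matches the sample mean exactly (shape × scale = mean), which is a quick sanity check on any implementation.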
After 2003, Iraq went through multiple waves of violence at different levels: security, intellectual, political, and social. Behind these stood several motives and incentives that enabled violence, and they represent the first axis of this research. The most important were political motives, which created an atmosphere that set politics against society and transformed power into a field of political brutality against the individual and the group at once. There are also cultural, intellectual, media, and economic motives, such as weak cultural independence, poverty, marginalization, unemployment, and want, as well as the absence of a media discourse that rejects violence rather than inciting it; on the other hand…
Testing is a vital phase in software development, and having the right amount of test data is important for speeding up the process. Because test-case generation is a combinatorial optimization challenge, exhaustive testing may not always be practicable; shortages of resources, budget, and schedule also impede the testing process. Combinatorial testing (CT) can be described as a basic strategy for creating new test cases. Several scholars have discussed CT while establishing alternative tactics based on the interactions between parameters. Accordingly, an investigation of current CT methods was undertaken in order to better understand their capabilities and limitations. In this study, 97 publications were evaluated…
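The core idea behind the combinatorial testing discussed above can be sketched briefly: instead of running the full Cartesian product of parameter values, select a smaller suite and measure which 2-way value interactions (pairs) it covers. The parameters and values below are hypothetical.

```python
from itertools import combinations, product

# Hypothetical configuration parameters for a system under test
params = {
    "os": ["linux", "windows"],
    "browser": ["firefox", "chrome"],
    "db": ["mysql", "sqlite"],
}

def pairs_covered(tests, names):
    # Collect every 2-way (parameter, value) interaction a suite exercises
    covered = set()
    for test in tests:
        for (i, a), (j, b) in combinations(enumerate(test), 2):
            covered.add(((names[i], a), (names[j], b)))
    return covered

names = list(params)
all_tests = list(product(*params.values()))          # 8 exhaustive tests
# A hand-picked 4-test suite that still hits every 2-way interaction
suite = [all_tests[0], all_tests[3], all_tests[5], all_tests[6]]
coverage = len(pairs_covered(suite, names)) / len(pairs_covered(all_tests, names))
```

Here half the exhaustive suite achieves full pairwise coverage; for larger parameter spaces the savings grow dramatically, which is what makes CT attractive when resources and schedules are constrained.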
The research aims to demonstrate the dual use of financial-failure prediction according to the Altman model and of stress tests, to achieve integration in banking risk management and to assess the bank's ability to withstand crises, especially in light of its low rating according to the Altman model and the possibility of its failure in the future, thereby proving or denying the research hypothesis. The research reached a set of conclusions, the most important of which is that the bank, according to the Altman model, is threatened with failure in the near future, as it lies within the red zone in the model's classification, and will incur losses if it is exposed to crises in the future, according to the stress-test analysis…
Metadiscourse markers are means of organizing a writer's information and creating a connection with readers. When students write, they usually focus on one type of these markers, the interactive markers, and belittle the use of the other type, the interactional markers; that is, they emphasize only presenting and organizing their information. This study is therefore conducted to bridge this gap. The researchers selected 18 thesis abstracts, nine written by Iraqi students of English and the rest by American students. The aims of the study are to examine the types and sub-types of metadiscourse markers used by American and Iraqi students, and to investigate comparatively the impact of the metadiscourse…
In recent years, organizations have experienced significant challenges, especially with the spread of economic globalization, which requires them to deliver new and better offerings through experience, creativity, and innovation in order to achieve quality and high-grade products of all kinds. To achieve the objectives of the study and answer its questions, the study was applied in the woolen-industries sector in Baghdad, to a sample of 30 people from the senior, middle, and lower management of the company (section managers, division managers, unit managers, and office managers). Several statistical methods were used to process the data and information, and the results extracted…
A new distribution, the epsilon skew gamma (ESΓ) distribution, first introduced by Abdulah [1], is applied to near-gamma data. We first restate the ESΓ distribution, its properties, and its characteristics; we then estimate its parameters using the maximum likelihood and moment estimators; and we finally use these estimators to fit the data with the ESΓ distribution.
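The moment-estimation step mentioned above can be illustrated with a minimal method-of-moments fit for a plain two-parameter gamma distribution, shown as a stand-in since the ESΓ moment equations from [1] are not reproduced in this abstract: the sample mean and variance are equated to αβ and αβ², then solved for the parameters. The data values are hypothetical.

```python
def gamma_moment_fit(data):
    # Method of moments for Gamma(shape=alpha, scale=beta):
    # mean = alpha*beta, variance = alpha*beta**2
    # => shape = mean**2 / var, scale = var / mean
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / n
    shape = mean * mean / var
    scale = var / mean
    return shape, scale

shape, scale = gamma_moment_fit([0.9, 1.4, 2.2, 0.7, 1.8, 1.1])
```

The ESΓ case adds a skewness parameter, so its moment system involves the third sample moment as well; the two-parameter solution above is only the simplest instance of the approach.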
Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. It has been applied in the present study using reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW shelf of Australia, was also investigated. Three hypotheses were proposed to investigate the AVO behaviour of the amplitude anomalies, in which three different factors were tested: fluid substitution, porosity, and thickness (wedge model). The AVO models with the synthetic gathers were analysed using log information to find which of these is the…
In the field of civil engineering, the adoption and use of falling weight deflectometers (FWDs) is seen as a response to an ever-changing, technology-driven world. Specifically, FWDs are devices that help evaluate the physical properties of a pavement. This paper assesses the concepts of data processing, storage, and analysis via FWDs. The device has been found to play an important role in enabling operators and field practitioners to understand vertical deflection responses when pavements are subjected to impulse loads. In turn, the resulting data and its analysis outcomes lead to backcalculation of the state of stiffness, with initial analyses of the deflection bowl occurring in conjunction with the measured or assumed…
Characterization of a heterogeneous reservoir is a complex task of representing and evaluating petrophysical properties, and porosity-permeability relationships within the framework of hydraulic flow units are applied to estimate permeability in un-cored wells. Flow-unit, or hydraulic flow unit (HFU), techniques divide the reservoir laterally and vertically into zones that can be managed, with fluid flow controlled within each flow unit, and each unit differs substantially from the other flow units in the reservoir. Each flow unit can be distinguished by applying the flow zone indicator (FZI) method. Supporting the porosity-permeability relationship with the flow zone indicator is ca…
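The FZI calculation referred to above can be sketched using the standard definitions: RQI = 0.0314·√(k/φ) (permeability k in mD, porosity φ as a fraction), normalized porosity φz = φ/(1 − φ), and FZI = RQI/φz. The core-plug values below are hypothetical.

```python
import math

def flow_zone_indicator(k_md, phi):
    # Reservoir quality index, in microns (0.0314 converts mD/fraction units)
    rqi = 0.0314 * math.sqrt(k_md / phi)
    # Normalized porosity (pore volume to grain volume ratio)
    phi_z = phi / (1.0 - phi)
    return rqi / phi_z

# Samples with similar FZI are grouped into the same hydraulic flow unit
fzi = flow_zone_indicator(k_md=100.0, phi=0.20)
```

In practice, cored samples are clustered by FZI to define the HFUs, and each unit's porosity-permeability trend is then used to predict permeability from log-derived porosity in un-cored wells.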