Big data of different types, such as text and images, is rapidly generated by the internet and other applications. Handling this data with traditional methods is impractical because it arrives in varying volumes and formats and with varying processing-speed requirements. Data analytics has therefore become an essential tool for big data applications, since it extracts and analyzes only the meaningful information. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. It also discusses how the shift toward data analytics driven by artificial intelligence algorithms may benefit many applications. In addition, critical challenges and open research issues are identified from the limitations reported in published papers, to help researchers distinguish between various analytics techniques and develop highly consistent, logical, and information-rich analyses based on valuable features. The findings of this paper may also be used to identify the best methods in each sector covered by these publications, to assist future researchers in conducting more systematic and comprehensive analyses, and to point to areas where a novel or hybrid data-analysis technique could be developed.
Vegetation cover is one of the most significant factors influencing weather, climate, and the environment. The Normalized Difference Vegetation Index (NDVI) and the Normalized Difference Built-up Index (NDBI) were estimated for the years 2019–2022 from four Landsat 8 TIRS images covering Duhok City, and the city's land surface temperature (LST) over those four years was calculated using a radiative transfer model. The aim of this study is to compute LST for 2019–2022, to understand the relationships between LST, NDVI, and NDBI, and to assess the mapping capability of Landsat 8 TIRS. The findings revealed that NDBI and NDVI had the strongest correlation with the …
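The index and temperature computations described above follow standard remote-sensing formulas. The sketch below is a minimal Python/NumPy illustration, assuming surface-reflectance values for Landsat 8 bands 4 (red), 5 (NIR), and 6 (SWIR) and a band-10 brightness temperature; the emissivity is estimated from NDVI with illustrative threshold values, and the exact radiative transfer coefficients used in the study are not reproduced here.

```python
import numpy as np

def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red); Landsat 8 bands 5 and 4
    return (nir - red) / (nir + red + 1e-10)

def ndbi(swir, nir):
    # NDBI = (SWIR - NIR) / (SWIR + NIR); Landsat 8 bands 6 and 5
    return (swir - nir) / (swir + nir + 1e-10)

def lst_celsius(bt_kelvin, ndvi_arr, ndvi_soil=0.2, ndvi_veg=0.5,
                wavelength=10.895e-6):
    # Proportion of vegetation from NDVI thresholds (illustrative values),
    # then a Sobrino-style emissivity estimate and the single-channel
    # emissivity correction of the band-10 brightness temperature.
    pv = np.clip((ndvi_arr - ndvi_soil) / (ndvi_veg - ndvi_soil), 0, 1) ** 2
    emissivity = 0.004 * pv + 0.986
    rho = 1.438e-2  # h * c / k_B in m*K
    lst_k = bt_kelvin / (1 + (wavelength * bt_kelvin / rho) * np.log(emissivity))
    return lst_k - 273.15
```

For a fully vegetated pixel (NDVI above the vegetation threshold) with a brightness temperature of 300 K, the emissivity correction raises the estimate to roughly 27.5 °C.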
In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid distribution in, and output from, petroleum reservoirs. Traditionally, obtaining these crucial reservoir properties requires taking rock samples from the reservoir and performing suitable laboratory studies. Although kr is a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impractical to measure kr in all of them. The unsteady-state approach was used to calculate the relat…
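Relative permeability is commonly expressed as a parametric function of water saturation. The Corey-type correlation below is a minimal, hypothetical sketch of such a curve; the endpoint values and exponents are illustrative defaults, not the values measured in the study.

```python
def corey_kr(sw, swc=0.2, sor=0.2, krw_max=0.4, kro_max=0.9, nw=2.0, no=2.0):
    """Corey-type water/oil relative permeabilities at water saturation sw.

    swc: connate water saturation, sor: residual oil saturation,
    krw_max/kro_max: endpoint permeabilities, nw/no: Corey exponents.
    All parameter values here are illustrative.
    """
    # Normalized (movable) water saturation, clamped to [0, 1]
    swn = (sw - swc) / (1.0 - swc - sor)
    swn = min(max(swn, 0.0), 1.0)
    krw = krw_max * swn ** nw          # water curve rises with saturation
    kro = kro_max * (1.0 - swn) ** no  # oil curve falls with saturation
    return krw, kro
```

At connate water saturation the water curve is zero and the oil curve is at its endpoint; the two curves cross somewhere in between, which is the qualitative shape unsteady-state experiments are fitted to.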
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
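Maximum-likelihood estimation with the Downhill Simplex algorithm can be sketched as below. For brevity the example fits a plain two-parameter Weibull with SciPy rather than the four-parameter compound exponential Weibull-Poisson of the study, so the likelihood is a stand-in; the estimation pattern, minimizing the negative log-likelihood with Nelder-Mead (SciPy's name for the Downhill Simplex), is the same.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Synthetic sample from a Weibull(shape=1.5, scale=2.0)
data = weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=42)

def neg_log_likelihood(params):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf  # keep the simplex inside the valid parameter region
    return -np.sum(weibull_min.logpdf(data, shape, scale=scale))

# Downhill Simplex (Nelder-Mead) minimization of the negative log-likelihood
result = minimize(neg_log_likelihood, x0=[1.0, 1.0], method="Nelder-Mead")
shape_hat, scale_hat = result.x
```

The same pattern extends to four parameters by widening `x0` and the likelihood; Nelder-Mead needs no derivatives, which is what makes it attractive for a compound likelihood with no closed-form score equations.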
The purpose of this paper is to apply different transportation models, in both their minimization and maximization forms, by finding a starting basic feasible solution and then the optimal solution. The requirements of transportation models are presented along with one of their applications to minimizing the objective function, which the researcher conducted on real data collected over one month in 2015 at a poultry farm for egg production.
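A starting basic feasible solution for a transportation model can be obtained with the classical northwest-corner rule, as sketched below. The supply, demand, and cost figures are illustrative, not the poultry-farm data from the study.

```python
def northwest_corner(supply, demand):
    """Starting basic feasible solution by the northwest-corner rule.

    Assumes a balanced problem: sum(supply) == sum(demand).
    Returns the allocation matrix (rows = sources, columns = destinations).
    """
    supply, demand = supply[:], demand[:]  # work on copies
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        q = min(supply[i], demand[j])  # ship as much as possible at (i, j)
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            i += 1  # source exhausted: move down
        else:
            j += 1  # destination satisfied: move right
    return alloc

def total_cost(alloc, cost):
    # Objective value of an allocation under a unit-cost matrix
    return sum(a * c
               for row_a, row_c in zip(alloc, cost)
               for a, c in zip(row_a, row_c))
```

This initial solution ignores costs, so it is then improved toward optimality with a method such as stepping-stone or MODI.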
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have too little data to train DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with broad background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must usually be fed a significant amount of labeled data to learn representations automatically. Ultimately, more data generally yields a better DL model, and performance is also application-dependent. This issue is the main barrier for …
Since the Internet has become more widely used and more people have access to multimedia content, copyright hacking and piracy have risen. Watermarking techniques make security, asset protection, and authentication possible. In this paper, a comparison between fragile and robust watermarking techniques is presented so that recent studies can exploit them to increase the security of critical media. A new technique is suggested in which an embedded value (129) is added to each pixel of the cover image and used as a key to thwart attackers, increase security, raise imperceptibility, and make the system faster at detecting tampering by unauthorized users. Using the two watermarking ty…
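The key-embedding step described above, adding the value 129 to each pixel, can be sketched as follows. Modulo-256 wraparound is assumed here for 8-bit pixels, since the abstract does not specify how overflow is handled; the tamper check simply flags pixels that no longer match the keyed value.

```python
import numpy as np

def embed_key(cover, key=129):
    # Add the key to every pixel of an 8-bit image, wrapping modulo 256
    # (widen to uint16 first so the addition cannot overflow).
    return ((cover.astype(np.uint16) + key) % 256).astype(np.uint8)

def tamper_map(watermarked, original, key=129):
    # True where a pixel no longer equals original + key: possible tampering.
    expected = ((original.astype(np.uint16) + key) % 256).astype(np.uint8)
    return watermarked != expected
```

Because the check is per pixel, any unauthorized modification is localized immediately, which is the fragile-watermarking behavior the paper exploits for fast tamper detection.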
The general crisis of research methods in the social sciences
Research methodology comprises philosophy and techniques: founded by philosophers and applied by scientists, its techniques cannot be applied accurately without a deep understanding of the philosophy as a prerequisite. This fact is almost entirely absent from the Iraqi and Arab academic mentality, and it constitutes one dimension of the double crisis, theoretical and applied, of research methods in the social sciences. First, philosophy of science is taught neither as an independent subject nor as an introductory one; it does not receive even a verbal acknowledgment. Second, quantitative research methods are presented without their background philosophy, as sol…
Applications of the Multilevel Converter (MLC) have increased because of the huge demand for clean power, and these converters are especially compatible with renewable energy sources. In addition, they are capable of high-voltage, high-power operation. A nine-level converter in three modes of implementation, the Diode Clamped MLC (DC-MLC), the Capacitor Clamped MLC (CC-MLC), and the Modular Structured MLC (MS-MLC), is analyzed and simulated in this paper. Various Multicarrier Modulation Techniques (MMTs), level shifted (LS) and phase shifted (PS), are used to operate the proposed nine-level MLCs. The Matlab/Simulink environment is used for the simulation, extraction, and ana…
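Level-shifted multicarrier modulation for a nine-level converter compares one sinusoidal reference against eight stacked triangular carriers; the output level at each instant is the number of carriers the reference exceeds. The Python sketch below (phase-disposition variant, illustrative frequencies) shows the idea; the study itself used Matlab/Simulink.

```python
import numpy as np

def level_shifted_pwm(levels=9, f_ref=50.0, f_carrier=2000.0, samples=20000):
    """Phase-disposition level-shifted PWM over one reference cycle.

    Returns (t, ref, level) where level[k] in 0..levels-1 is the number of
    carriers the reference exceeds at sample k. Parameters are illustrative.
    """
    n_c = levels - 1  # eight carriers for a nine-level converter
    t = np.linspace(0.0, 1.0 / f_ref, samples, endpoint=False)
    ref = np.sin(2 * np.pi * f_ref * t)  # normalized sinusoidal reference

    # Unit triangle wave in [0, 1] at the carrier frequency
    tri = 2.0 * np.abs(f_carrier * t - np.floor(f_carrier * t + 0.5))

    # Stack the carriers edge to edge between -1 and +1 (phase disposition)
    bands = np.linspace(-1.0, 1.0, n_c + 1)
    carriers = bands[:-1, None] + tri[None, :] * (2.0 / n_c)

    # Output level = count of carriers below the reference at each sample
    level = (ref[None, :] > carriers).sum(axis=0)
    return t, ref, level
```

Mapping `level` (0 to 8) onto the converter's output steps reproduces the staircase waveform whose harmonic content the paper compares across the DC-MLC, CC-MLC, and MS-MLC topologies.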