Bayesian estimation of reliability in the stress(Y)-strength(X) model has been studied. The model describes the life of a component with strength X subjected to stress Y: the component fails if and only if, at any time, the applied stress exceeds its strength, so the reliability R = P(Y < X) can be taken as a measure of component performance. In this paper, a Bayesian analysis of R is carried out when X and Y are independent Weibull random variables with a common shape parameter α, in order to study the effect of each of the two scale parameters β and λ, respectively. Three loss functions (weighted, quadratic, and entropy) are used under two prior distributions (Gamma and an extension of Jeffreys' prior), together with an empirical Bayes estimator under the Gamma prior, for singly type II censored samples. An empirical study compares the three estimators of reliability for the stress-strength Weibull model by the mean squared error (MSE) criterion, taking different sample sizes (small, moderate, and large) for the two random variables in eight experiments with different parameter values. The weighted loss function was found to be best for small sample sizes, while the entropy and quadratic loss functions were best for moderate and large sample sizes, under both prior distributions and for the empirical Bayes estimator.
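As a sanity check on the quantity being estimated: under the hazard-rate parameterization of the Weibull, with survival functions S_X(x) = exp(-βx^α) and S_Y(y) = exp(-λy^α) and a common shape α, the reliability has the closed form R = λ/(β + λ), independent of α. The sketch below is only a Monte Carlo frequency check of that closed form, not the Bayesian estimators studied in the abstract; the parameter values are illustrative.

```python
import math
import random

def rweibull(alpha, rate, rng):
    """Sample from a Weibull with survival S(x) = exp(-rate * x**alpha), by inversion."""
    u = rng.random()
    return (-math.log(u) / rate) ** (1.0 / alpha)

def reliability_mc(alpha, beta, lam, n=200_000, seed=1):
    """Monte Carlo estimate of R = P(Y < X), X ~ Weibull(alpha, beta), Y ~ Weibull(alpha, lam)."""
    rng = random.Random(seed)
    hits = sum(rweibull(alpha, lam, rng) < rweibull(alpha, beta, rng) for _ in range(n))
    return hits / n

alpha, beta, lam = 2.0, 1.5, 3.0
r_closed = lam / (beta + lam)           # closed form, independent of alpha
r_mc = reliability_mc(alpha, beta, lam)
print(r_closed, r_mc)
```

The simulated frequency agrees with λ/(β + λ) to Monte Carlo accuracy, which is why the comparison of estimators in the paper can focus entirely on how well each procedure recovers this single functional of β and λ.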
In the current study, 2D seismic data from west An-Najaf (line WN-36) were received after several processing steps carried out by the Oil Exploration Company in 2018. Surface Consistent Amplitude Compensation (SCAC) was applied to the seismic data. The processing sequence in our study started by sorting the data into common mid-point (CMP) gathers in order to apply velocity analysis using the Interactive Velocity Analysis Application (INVA) within the Omega system. Velocity semblance was prepared to perform the normal move-out (NMO) correction versus time. An accurate root mean square velocity (VRMS) was selected, controlled by the flatness of the primary events. The resultant seismic velocity section for the study area shows that the veloci…
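The flatness criterion mentioned above comes from the hyperbolic move-out equation t(x) = sqrt(t0² + x²/V_rms²): when the correct V_rms is picked, subtracting Δt = t(x) − t0 flattens a primary reflection across all offsets in the CMP gather. A minimal sketch of that correction (the t0, offset, and velocity values below are hypothetical, not taken from line WN-36):

```python
import math

def nmo_time(t0, offset, v_rms):
    """Hyperbolic two-way travel time: t(x) = sqrt(t0**2 + (x / v_rms)**2)."""
    return math.sqrt(t0 ** 2 + (offset / v_rms) ** 2)

def nmo_shift(t0, offset, v_rms):
    """NMO correction dt = t(x) - t0, subtracted from each trace to flatten the event."""
    return nmo_time(t0, offset, v_rms) - t0

# Hypothetical CMP gather: t0 = 1.2 s, offsets in metres, v_rms in m/s.
for x in (0.0, 500.0, 1000.0, 1500.0):
    print(f"offset {x:6.0f} m  dt = {nmo_shift(1.2, x, 2200.0):.4f} s")
```

If the picked velocity is too low, the computed shifts are too large and the event over-corrects (curves upward); too high and it under-corrects, which is exactly what the interactive semblance picking in INVA is used to diagnose.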
Arabic grammatical theory is characterized by features that distinguish it from the theories of other languages. Taken as a whole, it forms a homogeneous linguistic system that blends with the social nature of the Arabs, their beliefs, and their culture.
This means that the theory was born naturally, after the labor of preserving an integrated heritage, beginning with its canonical text (the Koran) and ending with its many distinctive features.
Sibawayh carried the founding crucible of that theory, taking over from his teacher al-Khalil and building on what he had achieved. It would be redundant to point out his standing and the status of his book.
Thus came my research, titled: (c…
3D models derived from digital photogrammetric techniques have increased massively and developed to meet the requirements of many applications. The reliability of these models depends essentially on the data processing cycle and the adopted tool solution, in addition to data quality. Agisoft PhotoScan is a professional image-based 3D modelling package that seeks to create orderly, precise 3D content from still images. It works with arbitrary images acquired under both controlled and uncontrolled conditions. Following the recommendations of many users around the globe, Agisoft PhotoScan has become an important source for generating precise 3D data for different applications. How reliable is this data for accurate 3D mo…
The smart city concept has attracted considerable research attention in recent years within diverse application domains, such as crime suspect identification, border security, transportation, aerospace, and so on. Specific focus has been placed on increased automation using data-driven approaches, while leveraging remote sensing and real-time streaming of heterogeneous data from various sources, including unmanned aerial vehicles, surveillance cameras, and low-earth-orbit satellites. One of the core challenges in exploiting such high-temporal data streams, specifically videos, is the trade-off between the quality of video streaming and limited transmission bandwidth. An optimal compromise is needed between video quality and, subsequently, rec…
Dust storms take place in barren and dry regions all over the world. They may be caused by intense ground winds that lift dust and sand from soft, arid land surfaces, causing them to rise into the air. These phenomena may have harmful effects on health, climate, infrastructure, and transportation. GIS and remote sensing have played a key role in studying dust detection. This study was conducted in Iraq with the objective of validating dust detection. These techniques were used to derive dust indices, namely the Normalized Difference Dust Index (NDDI) and the Middle East Dust Index (MEDI), which are based on MODIS images, together with in-situ observation based on hourly wi…
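For context, the NDDI is commonly computed from MODIS top-of-atmosphere reflectances as a normalized difference between the shortwave-infrared band 7 (2.13 µm) and the blue band 3 (0.469 µm), with higher positive values suggesting dust-laden pixels. A minimal sketch, assuming that band definition (the reflectance values below are made up for illustration, and any dust/no-dust threshold is site-dependent):

```python
def nddi(swir_b7, blue_b3):
    """Normalized Difference Dust Index from MODIS band 7 (2.13 um) and band 3 (0.469 um).

    NDDI = (b7 - b3) / (b7 + b3); dust raises SWIR reflectance relative to blue.
    """
    denom = swir_b7 + blue_b3
    if denom == 0:
        raise ValueError("reflectance sum is zero; pixel is invalid")
    return (swir_b7 - blue_b3) / denom

# Hypothetical TOA reflectances for one pixel.
print(nddi(0.45, 0.20))  # higher values suggest airborne dust
```

In practice the index is evaluated per pixel over the whole MODIS scene and then compared against the in-situ observations, which is the validation step the study describes.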
The internet of medical things (IoMT) is expected to become the largest technology in worldwide distribution. Using 5th generation (5G) transmission, market opportunities and hazards related to the IoMT are improved and detected. This framework describes a strategy for proactively addressing concerns and offering a forum to promote development, change attitudes, and maintain people's confidence in the broader healthcare system without compromising security. It is combined with a data offloading system to speed up the transmission of medical data and improve the quality of service (QoS). As a result of this development, we propose the enriched energy efficient fuzzy (EEEF) data offloading technique to enhance the delivery of dat…
Storing, transferring, and processing high-dimensional electroencephalogram (EEG) signals is a critical challenge. The goal of EEG compression is to remove redundant data in EEG signals. Medical signals like EEG must be of high quality for medical diagnosis. This paper uses a compression system with near-zero Mean Squared Error (MSE) based on the Discrete Cosine Transform (DCT) and double shift coding for fast and efficient EEG data compression. The paper investigates and compares the use or non-use of delta modulation, which is applied to the transformed and quantized input signal. Double shift coding is applied after mapping the output to positive values as a final step. The system performance is tested using EEG data files from the C…
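The stages named in the abstract (transform, quantize, optional delta modulation, map to positive values) can be sketched as below. This is only an illustrative pipeline under assumed details: the quantization step of 0.5, the toy signal, and the zigzag-style sign mapping are my choices, the final double shift coding stage is omitted, and a naive O(n²) orthonormal DCT-II stands in for whatever fast transform the paper uses.

```python
import math

def dct_ii(x):
    """Orthonormal DCT-II (naive O(n^2); adequate for a sketch)."""
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[i] * math.cos(math.pi * (i + 0.5) * k / n) for i in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out

def quantize(coeffs, step):
    """Uniform scalar quantization of the transform coefficients."""
    return [round(c / step) for c in coeffs]

def delta(values):
    """Delta modulation: keep the first value, then successive differences."""
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def to_nonnegative(values):
    """Map signed ints to nonnegative ints (0,-1,1,-2 -> 0,1,2,3), ready for entropy coding."""
    return [2 * v if v >= 0 else -2 * v - 1 for v in values]

# Toy EEG-like samples (made up); real input would be an EEG record.
signal = [12.0, 11.5, 11.0, 10.2, 9.8, 10.1, 10.9, 11.7]
symbols = to_nonnegative(delta(quantize(dct_ii(signal), 0.5)))
print(symbols)
```

Each stage is exactly invertible except the quantizer, which is where the "near-zero MSE" trade-off between reconstruction error and the size of the symbol stream is controlled.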
The aim of this study is to measure the level of psychological stress among unemployed individuals and the level of their wellbeing, and to find the correlation between these two variables.
The research sample consisted of (99) currently unemployed people registered at the Ministry of Labor Affairs.
Schafer's (1996) scale for psychological stress was used alongside Ziout's (2012) scale for wellbeing.
The results of the research showed an inverse relationship between unemployment and wellbeing.
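An inverse relationship of this kind is typically reported as a negative Pearson correlation between the two scale scores. A minimal sketch of that computation, using made-up scores rather than the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (fabricated) stress and wellbeing scores, not the study's data.
stress    = [72, 65, 80, 58, 90, 61]
wellbeing = [40, 48, 33, 55, 25, 50]
print(pearson_r(stress, wellbeing))  # negative: higher stress, lower wellbeing
```

A value near -1 indicates a strong inverse relationship; the study's actual coefficient would of course come from the 99 participants' scale scores.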
Long memory analysis is one of the most active areas in econometrics and time series analysis, where various methods have been introduced to identify and estimate the long memory parameter in partially integrated time series. One of the most common models used to represent time series with long memory is the ARFIMA (Autoregressive Fractionally Integrated Moving Average) model, in which the differencing order is a fractional number called the fractional parameter. To analyze and fit the ARFIMA model, the fractional parameter must be estimated. There are many methods for fractional parameter estimation. In this research, the estimation methods were divided into indirect methods, where the Hurst parameter is estimated first…
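For the fractional differencing at the heart of ARFIMA, the operator (1 − B)^d has a binomial expansion whose weights satisfy the recursion π₀ = 1, π_k = π_{k-1}(k − 1 − d)/k; the indirect route mentioned above uses the relation d = H − 1/2 once the Hurst exponent H is estimated. A minimal sketch of the weights and of applying a truncated fractional difference (the series values below are arbitrary):

```python
def fracdiff_weights(d, n_terms):
    """Weights of (1 - B)**d: pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    w = [1.0]
    for k in range(1, n_terms):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

def fracdiff(series, d):
    """Apply fractional differencing to a series (expansion truncated at the sample start)."""
    w = fracdiff_weights(d, len(series))
    return [sum(w[k] * series[t - k] for k in range(t + 1)) for t in range(len(series))]

# d = 1 recovers ordinary first differencing: weights 1, -1, 0, 0, ...
print(fracdiff_weights(1.0, 4))
# A long-memory value such as d = 0.3 (i.e. Hurst H = 0.8) gives slowly decaying weights.
print(fracdiff_weights(0.3, 4))
```

The slow, hyperbolic decay of the weights for 0 < d < 0.5 is exactly what lets ARFIMA capture long memory that an ordinary ARIMA, with integer differencing, cannot.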
Background: The most common pattern of dyslipidemia in diabetic patients is an increased triglyceride (TG) level and a decreased HDL cholesterol level. The concentration of LDL cholesterol in diabetic patients is usually not significantly different from that in non-diabetic individuals, although diabetic patients may have elevated levels of non-HDL cholesterol (LDL + VLDL). However, type 2 diabetic patients typically have a preponderance of smaller, denser LDL particles, which possibly increases atherogenicity even if the absolute concentration of LDL cholesterol is not significantly increased. The Third Adult Treatment Panel of the National Cholesterol Education Program (NCEP III) and the American Heart Association (AHA) have designated diabetes as a coronary heart disease…
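The lipid fractions discussed above are related by simple arithmetic: non-HDL cholesterol is total cholesterol minus HDL (equivalently LDL + VLDL), and LDL is often estimated rather than measured, via the Friedewald formula LDL = TC − HDL − TG/5 (all in mg/dL, conventionally considered unreliable when TG ≥ 400 mg/dL). A small sketch with a hypothetical lipid panel, not values from the study:

```python
def non_hdl(total_chol, hdl):
    """Non-HDL cholesterol (mg/dL): everything except HDL, i.e. LDL + VLDL."""
    return total_chol - hdl

def friedewald_ldl(total_chol, hdl, triglycerides):
    """Friedewald estimate (mg/dL): LDL = TC - HDL - TG/5; not valid for TG >= 400."""
    if triglycerides >= 400:
        raise ValueError("Friedewald formula is unreliable for TG >= 400 mg/dL")
    return total_chol - hdl - triglycerides / 5.0

# Hypothetical lipid panel (mg/dL): TC = 210, HDL = 38, TG = 250.
print(non_hdl(210, 38))              # 172
print(friedewald_ldl(210, 38, 250))  # 122.0
```

This illustrates the abstract's point: with high TG and low HDL, the non-HDL value (172 here) flags substantially more atherogenic cholesterol than the LDL estimate alone suggests.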