In practical engineering problems, uncertainty exists not only in the external excitations but also in the structural parameters. This study investigates the influence of uncertainty in structural geometry, elastic modulus, mass density, and section dimensions on the stochastic earthquake response of portal frames subjected to random ground motions. The north–south component of the 1940 El Centro, California earthquake is selected as the ground excitation. Using the power spectral density function, the base motion of the two-dimensional finite element model of the portal frame is modified to account for random ground motions. A probabilistic study of the portal frame using stochastic finite elements with Monte Carlo simulation is presented, carried out in the finite element program ABAQUS. The dynamic reliability and probability of failure of the stochastic and deterministic structures, based on first-passage failure, are examined. The results reveal that the probability of failure increases due to the randomness of the stiffness and mass of the structure, and that the influence of the uncertain parameters on the reliability analysis depends on the extent of the variance in the structural parameters.
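The Monte Carlo first-passage approach described above can be illustrated with a minimal sketch: a single-degree-of-freedom oscillator with randomly perturbed mass and stiffness is integrated against a synthetic excitation, and the fraction of samples whose peak response crosses a barrier estimates the probability of failure. All numerical values (coefficients of variation, barrier level, white-noise excitation in place of the El Centro record) are illustrative assumptions, not taken from the study.

```python
# Hedged sketch: Monte Carlo first-passage failure probability for a SDOF
# oscillator with random mass and stiffness. Parameter values are assumed.
import math
import random

random.seed(1)

def simulate_peak(m, k, c, accel, dt):
    """Integrate m*u'' + c*u' + k*u = -m*a_g(t) by central differences;
    return the peak absolute displacement."""
    u_prev, u, peak = 0.0, 0.0, 0.0
    for a_g in accel:
        u_next = (2 * u - u_prev
                  + dt * dt * (-m * a_g - c * (u - u_prev) / dt - k * u) / m)
        peak = max(peak, abs(u_next))
        u_prev, u = u, u_next
    return peak

# synthetic ground acceleration (white-noise stand-in for a recorded motion)
dt = 0.01
accel = [random.gauss(0.0, 1.0) for _ in range(2000)]

threshold = 0.05            # first-passage barrier (assumed)
n_samples = 500
failures = 0
for _ in range(n_samples):
    m = random.gauss(1.0, 0.05)    # 5% CoV in mass (assumed)
    k = random.gauss(100.0, 10.0)  # 10% CoV in stiffness (assumed)
    c = 2.0 * 0.05 * math.sqrt(k * m)  # 5% damping ratio
    if simulate_peak(m, k, c, accel, dt) > threshold:
        failures += 1

pf = failures / n_samples
print(f"estimated probability of failure: {pf:.3f}")
```

Setting the coefficients of variation to zero recovers the deterministic structure, so the same loop can be used to compare the two failure probabilities as the abstract describes.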
The study aimed to identify E-leadership practice at Princess Nora Bint Abdul Rahman University, to explore the obstacles facing leaders in implementing E-leadership, and to find mechanisms for improving the practice of E-leadership at the university in light of the COVID-19 pandemic. A descriptive analytical approach was used, relying on a questionnaire to collect the needed data. The study sample consisted of 139 heads of departments, college deputies, and supporting deanships at the university. The results showed that E-leadership was employed to a medium-to-high degree; in particular, the university publishes its decisions electronically and uses technical means in providing its services.
Purpose: The objective of this study is to determine the effect of information technology governance (ITG) under the Control Objectives for Information and Related Technologies (COBIT) framework on financial performance. Additionally, the article seeks to examine the relationships between the factors under consideration. Theoretical framework: Information technology and operational processes are evaluated to ensure their compliance with the instructions of the Central Bank of Iraq. The research therefore develops a conceptual framework by reviewing the literature on the importance of the COBIT framework in assessing financial performance. Design/methodology/approach: To investigate the effect of information technology, we …
Abstract
This research deals with building a probabilistic linear programming model representing production operations in the Middle Refinery Company (Dura, Semawa, Najaif), where the demand for each product (gasoline, kerosene, gas oil, fuel oil) is a random variable following a probability distribution. The distributions, tested using the statistical program EasyFit, are found to be the Cauchy, Erlang, Pareto, normal, and generalized extreme value distributions.
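The distribution-screening step can be sketched in a few lines: fit a candidate distribution to demand data by maximum likelihood and compare its CDF to the empirical CDF with a Kolmogorov–Smirnov statistic, which is the kind of goodness-of-fit ranking EasyFit automates. The demand data below are synthetic, and only the normal candidate is shown, as an illustrative assumption.

```python
# Hedged sketch: goodness-of-fit check of a fitted normal distribution
# against synthetic product-demand data (stand-in for EasyFit screening).
import math
import random
import statistics

random.seed(7)
demand = [random.gauss(500.0, 40.0) for _ in range(200)]  # synthetic demand

# maximum-likelihood fit of the normal candidate
mu = statistics.mean(demand)
sigma = statistics.pstdev(demand)

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Kolmogorov–Smirnov statistic: largest gap between the empirical CDF
# and the fitted CDF over the sorted sample
xs = sorted(demand)
n = len(xs)
ks = max(max(abs((i + 1) / n - normal_cdf(x, mu, sigma)),
             abs(i / n - normal_cdf(x, mu, sigma)))
         for i, x in enumerate(xs))
print(f"fitted N({mu:.1f}, {sigma:.1f}), KS statistic = {ks:.3f}")
```

Repeating this for each candidate family and keeping the smallest statistic gives the per-product distribution that then feeds the stochastic constraints of the linear programming model.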
Cryptography is the process of transforming messages to prevent unauthorized access to data. One of the main problems, and an important element of secret-key cryptography, is the key itself: for a high level of secure communication, the key plays a crucial role. To increase the level of security of any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; an enhanced encryption key strengthens the security of Triple Data Encryption Standard. This paper proposes a combination of two efficient encryption algorithms to …
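The key-strengthening idea can be illustrated with a small sketch: instead of using a weak or predictable secret directly as 3DES keying material, derive the 24-byte key (three 64-bit DES subkeys) from the shared secret with a slow key-derivation function. This is a general-purpose illustration using the standard library's PBKDF2, not the specific scheme proposed in the paper.

```python
# Hedged sketch: deriving 3DES keying material from a shared secret with
# PBKDF2 (stdlib). Illustrates key strengthening, not the paper's scheme.
import hashlib
import os

def derive_3des_key(secret: bytes, salt: bytes) -> bytes:
    # 24 bytes = three 64-bit DES subkeys (3DES keying option 1);
    # 100,000 iterations slows brute-force attacks on weak secrets
    return hashlib.pbkdf2_hmac("sha256", secret, salt, 100_000, dklen=24)

salt = os.urandom(16)          # random per-session salt
key = derive_3des_key(b"shared secret", salt)
print(len(key), key.hex()[:16])
```

The derived bytes then key a 3DES implementation in the usual way; the salt must be shared alongside the ciphertext so both parties derive the same key.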
An edge dominating set X of a graph G is said to be an odd (even) sum degree edge dominating set (osded (esded)-set) of G if the sum of the degrees of all edges in X is an odd (even) number. The odd (even) sum degree edge domination number is the minimum cardinality taken over all odd (even) sum degree edge dominating sets of G, and is defined as zero if no such odd (even) sum degree edge dominating set exists in G. In this paper, the odd (even) sum degree domination concept is extended to the co-dominating set E-T of a graph G, where T is an edge dominating set of G. The corresponding parameters are the co-odd (even) sum degree edge dominating set, the co-odd (even) sum degree edge domination number, and the co-odd (even) sum degree edge domin…
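The definitions above can be made concrete by brute force on a small graph: the degree of an edge uv is deg(u) + deg(v) − 2, and an osded-set is an edge dominating set whose edge-degree sum is odd. The example graph below is an illustrative assumption.

```python
# Hedged sketch: brute-force search for a minimum odd sum degree edge
# dominating set (osded-set) on a small example graph.
from itertools import combinations

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # 4-cycle plus a chord

def vdeg(v):
    return sum(1 for e in edges if v in e)

def edge_degree(e):
    u, v = e
    return vdeg(u) + vdeg(v) - 2   # number of edges adjacent to e

def adjacent(e, f):
    return e != f and bool(set(e) & set(f))

def dominates(X):
    # every edge is in X or adjacent to some edge of X
    return all(f in X or any(adjacent(e, f) for e in X) for f in edges)

best = None
for r in range(1, len(edges) + 1):   # smallest cardinality first
    for X in combinations(edges, r):
        if dominates(X) and sum(edge_degree(e) for e in X) % 2 == 1:
            best = X
            break
    if best:
        break

print("minimum osded-set:", best)
```

On this graph the chord (0, 2) alone is an edge dominating set, but its edge degree is even, so the minimum osded-set needs two edges; the odd sum degree edge domination number here is 2.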
The goal of health awareness is to familiarize people with health information and facts, to make them realize their responsibility toward their own health and the health of others, and to translate that goal into practice through the spontaneous adoption of healthy, sound behaviors. Health awareness announcements pursue this goal by employing persuasion as a method of influencing the masses to adopt ideas and behaviors and to adhere to health advice and guidelines. The study aims to analyze the content of the Ministry of Health and Environment's health awareness announcements and to reveal the persuasive appeals most frequently employed in health awareness announcements …
The question of estimation has attracted great interest in engineering and statistical applications and in various applied and human sciences; the methods it provides help to identify many random processes accurately.
In this paper, methods are used to estimate the reliability function, the risk (hazard) function, and the distribution parameters, namely the moment method and the maximum likelihood method. An experimental study was conducted using simulation in order to compare the methods and show which of them performs best in practical application, based on observations generated from the Rayleigh logarithmic distribution (RL) with various sample sizes …
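The moment-versus-maximum-likelihood comparison can be sketched for the plain Rayleigh distribution, used here as a simpler, clearly labeled stand-in for the Rayleigh logarithmic model of the paper: for Rayleigh data the moment estimator is the sample mean scaled by sqrt(2/pi), the MLE has the closed form sqrt(sum(x^2)/(2n)), and either plugs into the reliability function R(t) = exp(−t²/(2σ²)).

```python
# Hedged sketch: moment and maximum-likelihood estimators for the plain
# Rayleigh distribution (stand-in for the RL model), with the reliability
# function implied by the fitted parameter. True sigma is assumed.
import math
import random

random.seed(3)
sigma_true = 2.0
# Rayleigh variates via the inverse CDF: x = sigma * sqrt(-2 ln U)
data = [sigma_true * math.sqrt(-2.0 * math.log(random.random()))
        for _ in range(1000)]

n = len(data)
sigma_mom = (sum(data) / n) * math.sqrt(2.0 / math.pi)     # moment method
sigma_mle = math.sqrt(sum(x * x for x in data) / (2 * n))  # MLE

def reliability(t, sigma):
    # R(t) = P(X > t) = exp(-t^2 / (2 sigma^2))
    return math.exp(-t * t / (2.0 * sigma * sigma))

print(f"moment: {sigma_mom:.3f}  MLE: {sigma_mle:.3f}")
print(f"R(2) with MLE: {reliability(2.0, sigma_mle):.3f}")
```

Repeating the simulation over many replications and sample sizes, and ranking the estimators by mean square error against the true parameter, reproduces the comparison design the abstract describes.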
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution flexible enough to handle such data. This is the case for the data of Diyala Company for Electrical Industries, where positive skewness was observed in the data collected from the Power and Machinery Department; this required a distribution suited to those data, and a search for methods that accommodate this problem and lead to accurate estimates of the reliability function …
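The skewness screening that motivates the choice of distribution can be sketched directly: compute the standardized third moment of the failure-time sample, and a clearly positive value signals that a symmetric model would misestimate the reliability function. The failure times below are synthetic (exponential-like), as an illustrative assumption in place of the company data.

```python
# Hedged sketch: sample skewness check on synthetic, positively skewed
# failure-time data, motivating a skew-capable fitting distribution.
import math
import random

random.seed(5)
# synthetic positively skewed failure times (exponential via inverse CDF)
times = [-8.0 * math.log(random.random()) for _ in range(500)]

n = len(times)
mean = sum(times) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in times) / n)
skew = sum((x - mean) ** 3 for x in times) / (n * sd ** 3)

print(f"sample skewness = {skew:.2f}")  # > 0 indicates positive skew
```

A skewness well above zero, as here, is the empirical signal the abstract describes for preferring a flexible, right-skewed distribution when estimating reliability.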
This paper deals with the estimation of the reliability function and one shape parameter of the two-parameter Burr XII distribution, when the other shape parameter is known (taking the values 0.5, 1, 1.5) and the initial value of the estimated parameter is 1, while different sample sizes (n = 10, 20, 30, 50) are used. The results rest on an empirical study: simulation experiments are applied to compare four methods of estimation, as well as to compute the reliability function. The mean square error results indicate that the jackknife estimator is better than the other three estimators for all sample sizes and parameter values.
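The jackknife idea can be sketched for this setting: starting from the closed-form estimator of the unknown Burr XII shape parameter when the other shape parameter c is known (the MLE is n divided by the sum of log(1 + x^c)), the leave-one-out estimates give a bias-corrected jackknife estimate, which then plugs into the reliability function R(t) = (1 + t^c)^(−k). The parameter values and sample size below are illustrative assumptions, not those of the study.

```python
# Hedged sketch: jackknife bias-corrected estimate of the Burr XII shape
# parameter k when the other shape parameter c is known. Values assumed.
import math
import random

random.seed(11)
c, k_true = 2.0, 1.5
# Burr XII variates via the inverse CDF of F(x) = 1 - (1 + x^c)^(-k)
data = [((1.0 - random.random()) ** (-1.0 / k_true) - 1.0) ** (1.0 / c)
        for _ in range(200)]

def k_hat(sample):
    # closed-form MLE of k given known c
    return len(sample) / sum(math.log(1.0 + x ** c) for x in sample)

n = len(data)
theta = k_hat(data)
# leave-one-out estimates and the jackknife bias correction
loo = [k_hat(data[:i] + data[i + 1:]) for i in range(n)]
theta_jack = n * theta - (n - 1) * sum(loo) / n

def reliability(t):
    return (1.0 + t ** c) ** (-theta_jack)

print(f"plain: {theta:.3f}  jackknife: {theta_jack:.3f}  "
      f"R(1) = {reliability(1.0):.3f}")
```

Wrapping this in replications and averaging the squared error against the true parameter gives the mean-square-error comparison against the other estimators that the abstract reports.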