Pushover analysis is an efficient method for the seismic evaluation of buildings under severe earthquakes. This paper develops and verifies a pushover-analysis methodology for reinforced concrete frames. The technique relies on a nonlinear representation of the structure in SAP2000, with plastic-hinge properties defined from moment-curvature analyses of all frame sections (beams and columns). The methodology was verified against a previous study of two-dimensional frames (4- and 7-story). The earlier study relied on automatic identification of positive and negative moments, where the concrete sections and steel reinforcement quantities …
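The plastic hinges mentioned above are commonly idealized with a bilinear moment-curvature backbone: elastic up to the yield moment, then a nearly flat hardening branch. The sketch below illustrates that idea with invented section properties (`EI`, `M_y`, and the hardening ratio are hypothetical, not values from the paper, which derives them per section via moment-curvature analysis in SAP2000):

```python
def bilinear_moment(curvature, EI=50_000.0, M_y=300.0, hardening=0.02):
    """Moment (kN*m) for a given curvature (1/m) on an idealized
    elastic-plastic backbone with slight strain hardening.
    All section properties here are illustrative assumptions."""
    phi_y = M_y / EI                      # yield curvature
    if abs(curvature) <= phi_y:
        return EI * curvature             # elastic branch
    sign = 1.0 if curvature > 0 else -1.0
    # post-yield branch: small fraction of the elastic stiffness
    return sign * (M_y + hardening * EI * (abs(curvature) - phi_y))

if __name__ == "__main__":
    for phi in (0.002, 0.006, 0.012):
        print(f"phi={phi:.3f} 1/m -> M={bilinear_moment(phi):.1f} kN*m")
```

A pushover run evaluates such a backbone at every hinge as lateral load is incremented, which is what the moment-curvature definitions in the paper feed into.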

A Tonido cloud server provides a private cloud-storage solution and synchronizes customers and employees with the required cloud services across the enterprise. Users generally access cloud services over an Internet connection, which can suffer from weak connectivity or heavy load, particularly with live video-streaming applications over the cloud. In this work, flexible and inexpensive access methods for real-time applications are proposed and implemented, enabling users to reach cloud services locally and regionally. Practically, to simulate our network connection, we proposed to use the Raspberry-pi3 m…

Storing, transferring, and processing high-dimensional electroencephalogram (EEG) signals is a critical challenge. The goal of EEG compression is to remove redundant data from EEG signals, while medical signals like EEG must retain high quality for diagnosis. This paper uses a compression system with near-zero Mean Squared Error (MSE) based on the Discrete Cosine Transform (DCT) and double shift coding for fast and efficient EEG data compression, and it investigates and compares applying or omitting delta modulation on the transformed and quantized input signal. As a final step, double shift coding is applied after mapping the output to positive values. The system performance is tested using EEG data files from the C…
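The pipeline described above (transform, quantize, optionally delta-modulate, map to non-negative symbols) can be sketched as follows. This is a minimal illustration with toy samples, not the paper's implementation: the final double-shift entropy-coding stage is omitted, the quantization step size is an arbitrary assumption, and the DCT is a slow pure-Python reference version:

```python
import math

def dct(signal):
    """Orthonormal DCT-II (pure-Python reference implementation)."""
    N = len(signal)
    out = []
    for k in range(N):
        s = sum(x * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
                for n, x in enumerate(signal))
        scale = math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)
        out.append(scale * s)
    return out

def quantize(coeffs, step=0.5):
    """Uniform scalar quantization (step size is an assumption)."""
    return [round(c / step) for c in coeffs]

def delta_modulate(values):
    """Replace each value with its difference from the previous one."""
    return [v - (values[i - 1] if i else 0) for i, v in enumerate(values)]

def to_nonnegative(values):
    """Map signed ints to non-negative codes: 0,-1,1,-2,... -> 0,1,2,3,..."""
    return [2 * v if v >= 0 else -2 * v - 1 for v in values]

if __name__ == "__main__":
    eeg = [1.0, 1.2, 0.8, 0.9, 1.1, 1.0, 0.7, 0.95]   # toy samples
    symbols = to_nonnegative(delta_modulate(quantize(dct(eeg))))
    print(symbols)   # non-negative symbols ready for entropy coding
```

Every stage here is invertible up to quantization error, which is how a near-zero-MSE reconstruction remains possible.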

Objective: To establish a growth curve for a sample of infertile women and to assess Body Mass Index.
Methodology: A non-probability (purposive) sample of 100 infertile women who visited the fertility and IVF center at Kamal Al-Samaraee Hospital. Data were collected with a constructed questionnaire of two parts: part 1 consists of 5 items on demographic characteristics, and part 2 consists of 4 items on reproductive status. Descriptive statistical procedures were applied (frequency, percentage, contingency coefficients, and a cubic-order polynomial fit).
Results: The infertile women in the study group showed a decrease in Body Mass Index with aging (and with increasing infertility duration).
The prevention of bankruptcy not only prolongs the economic life of a company and improves its financial performance, but also helps the general economic well-being of the country. Forecasting a financial shortfall therefore matters for many aspects of the company, including dividends. In this regard, this study examines the prediction of companies' financial deficits using the logistic regression method and its impact on the earnings per share of companies listed on the Iraqi Stock Exchange. The research period runs from 2015 to 2020; 33 companies listed on the Iraqi Stock Exchange were selected as a sample, and the res…
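Logistic regression for distress prediction, as named above, can be sketched with a tiny gradient-descent implementation. The features (leverage ratio, return on assets) and all values below are invented placeholders, not the study's variables or its Iraqi Stock Exchange sample:

```python
import math

# Hypothetical training data: [leverage ratio, return on assets];
# label 1 = financially distressed (invented, illustrative values).
TOY_X = [[0.90, -0.05], [0.80, -0.02], [0.85, -0.08],
         [0.30,  0.10], [0.20,  0.12], [0.35,  0.07]]
TOY_Y = [1, 1, 1, 0, 0, 0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi                      # gradient of log-loss
            for j in range(len(w)):
                w[j] -= lr * err * xi[j]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Probability of financial distress for feature vector x."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

if __name__ == "__main__":
    w, b = fit_logistic(TOY_X, TOY_Y)
    print(round(predict(w, b, [0.90, -0.06]), 2),   # highly leveraged firm
          round(predict(w, b, [0.25,  0.11]), 2))   # healthy firm
```

The fitted probabilities can then be related to a second outcome such as earnings per share, which is the kind of two-stage question the study poses.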

In recent years, researchers have shown increased interest in determining the optimum sample size needed to achieve sufficient accuracy in estimation and to obtain high-precision parameters when evaluating large numbers of diagnostic tests simultaneously. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated with the sample size from each method, using artificial intelligence in the form of an artificial neural network (ANN), which gives a high-precision estimate commensurate with the dat…
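One standard way Bennett's inequality yields a sample size (the paper's exact derivation is not shown here) is to invert the two-sided tail bound P(|sample mean − mean| ≥ ε) ≤ 2·exp(−(nσ²/b²)·h(bε/σ²)), with h(u) = (1+u)ln(1+u) − u, for variables bounded by b with variance σ². A sketch under that assumed form:

```python
import math

def bennett_sample_size(sigma2, b, eps, delta):
    """Smallest n such that Bennett's inequality guarantees
    P(|sample mean - true mean| >= eps) <= delta for i.i.d.
    variables bounded by b with variance sigma2.

    Uses the two-sided bound 2*exp(-(n*sigma2/b**2) * h(b*eps/sigma2)),
    where h(u) = (1+u)*ln(1+u) - u.  This is one common form of the
    inequality, assumed here for illustration."""
    u = b * eps / sigma2
    h = (1 + u) * math.log(1 + u) - u
    return math.ceil(b ** 2 * math.log(2 / delta) / (sigma2 * h))

if __name__ == "__main__":
    # e.g. variance 1, bound 2, accuracy 0.1, confidence 95%
    print(bennett_sample_size(sigma2=1.0, b=2.0, eps=0.1, delta=0.05))
```

As expected, tightening the accuracy ε or the confidence δ drives the required n up, which is the trade-off such sample-size methods quantify.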

Cloud storage provides scalable, low-cost resources that achieve economies of scale through a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. Data storage is the most important cloud service, and to protect the privacy of data owners, data are stored in the cloud in encrypted form. Encrypted data, however, introduce new challenges for deduplication: traditional deduplication schemes cannot operate on encrypted data, and existing encrypted-data deduplication solutions suffer from security weaknesses. This paper proposes combining compressive sensing and video deduplication to maximize …
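Why ordinary encryption defeats deduplication, and how it can be worked around, is easiest to see with convergent (message-locked) encryption: deriving the key from the chunk itself makes identical plaintexts produce identical ciphertexts, so the store keeps one physical copy per unique chunk. This is a generic sketch of that idea, not the paper's compressive-sensing scheme, and the XOR keystream is a toy, not real cryptography:

```python
import hashlib

def convergent_encrypt(chunk: bytes):
    """Toy message-locked encryption: the key is a hash of the chunk,
    so equal plaintexts yield equal ciphertexts (dedup-friendly).
    Illustrative XOR keystream only -- NOT real cryptography."""
    key = hashlib.sha256(chunk).digest()
    stream = b""
    counter = 0
    while len(stream) < len(chunk):                 # hash-based keystream
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    cipher = bytes(c ^ s for c, s in zip(chunk, stream))
    tag = hashlib.sha256(cipher).hexdigest()        # content-address for dedup
    return cipher, tag

class DedupStore:
    """Content-addressed store: one physical copy per unique ciphertext."""
    def __init__(self):
        self.blobs = {}

    def put(self, chunk: bytes) -> str:
        cipher, tag = convergent_encrypt(chunk)
        self.blobs.setdefault(tag, cipher)          # duplicate uploads are free
        return tag

if __name__ == "__main__":
    store = DedupStore()
    t1 = store.put(b"same video segment")
    t2 = store.put(b"same video segment")           # from a different user
    t3 = store.put(b"different segment")
    print(t1 == t2, len(store.blobs))               # prints: True 2
```

The known security weakness of this basic scheme is that anyone who can guess a chunk can confirm its presence, which is one reason the literature, including this paper, keeps proposing stronger constructions.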