Image compression has become one of the most important applications of image processing because of the rapid growth in computer power, the corresponding growth of the multimedia market, and the advent of the World Wide Web, which makes the internet easily accessible to everyone. Since the early 1980s, digital image sequence processing has been an attractive research area because an image sequence, as a collection of images, may achieve much higher compression than a single image frame, and the increased computational complexity and memory space it requires have, in fact, become more attainable. This research adopts the Absolute Moment Block Truncation Coding (AMBTC) compression technique, which builds on the strengths of other techniques, together with an algorithm for efficient block positioning. A modified orthogonal search algorithm (OSA) is also introduced as the searching scheme, which contributes to decreasing the motion-search time between successive inter frames.
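AMBTC works block-wise: each block is represented by a bitmap plus two reconstruction levels derived from the block mean. The sketch below is a minimal, illustrative implementation of standard AMBTC on a single grayscale block; it does not reproduce the block-positioning or OSA components described above, and the 4x4 block size is an assumption.

```python
import numpy as np

def ambtc_encode(block: np.ndarray):
    """Encode one grayscale block with Absolute Moment Block Truncation Coding.

    Returns a binary bitmap and the two reconstruction levels (low, high).
    """
    mean = block.mean()
    bitmap = block >= mean                 # 1 where the pixel is at or above the block mean
    high_pixels = block[bitmap]
    low_pixels = block[~bitmap]
    # Reconstruction levels are the means of each group (fall back to the
    # block mean if a group is empty, e.g. for a perfectly flat block).
    high = high_pixels.mean() if high_pixels.size else mean
    low = low_pixels.mean() if low_pixels.size else mean
    return bitmap, low, high

def ambtc_decode(bitmap, low, high):
    """Rebuild the block from the bitmap and the two reconstruction levels."""
    return np.where(bitmap, high, low)

# Example on a hypothetical 4x4 block.
block = np.array([[12, 14, 200, 201],
                  [13, 15, 198, 202],
                  [11, 16, 199, 203],
                  [12, 14, 200, 204]], dtype=float)
bitmap, low, high = ambtc_encode(block)
print(ambtc_decode(bitmap, low, high))
```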
Enhanced-quality image fusion is proposed using new algorithms for auto-focus image fusion. The first algorithm is based on the standard deviation to combine two images. The second algorithm uses the contrast at edge points and a correlation method as the criteria for the quality of the resulting image. This algorithm considers three blocks of different sizes in a homogeneous region and moves them 10 pixels within the same region; the statistical properties of each block are examined to decide the next step automatically. The resulting combined image has better contrast.
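A common way to realize standard-deviation-based auto-focus fusion is to compare the local standard deviation of corresponding blocks in the two source images and keep the sharper (higher-variance) block. The sketch below illustrates that idea only; the block size and the tie-breaking rule are assumptions, not the exact procedure of the proposed algorithms.

```python
import numpy as np

def fuse_by_std(img_a: np.ndarray, img_b: np.ndarray, block: int = 16) -> np.ndarray:
    """Block-wise fusion: for each block, keep the source with the larger
    standard deviation (assumed to be the better-focused one)."""
    assert img_a.shape == img_b.shape
    fused = np.empty_like(img_a)
    h, w = img_a.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            a = img_a[y:y + block, x:x + block]
            b = img_b[y:y + block, x:x + block]
            fused[y:y + block, x:x + block] = a if a.std() >= b.std() else b
    return fused
```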
This paper presents an enhancement technique for tracking and regulating the blood glucose level of diabetic patients using an intelligent auto-tuning Proportional-Integral-Derivative (PID) controller. The proposed controller aims to generate the best insulin control action for regulating the blood glucose level precisely, accurately, and quickly. The tuning algorithm uses the Dolphin Echolocation Optimization (DEO) algorithm to obtain near-optimal PID controller parameters with a proposed time-domain specification performance index. MATLAB simulation results for three different patients showed the effectiveness and robustness of the proposed control algorithm in terms of fast generation of the insulin control action.
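The gains of such a controller come from the DEO search, which is not reproduced here. The sketch below only shows a minimal discrete PID update that tuned gains would plug into; the gain values, sample time, and glucose setpoint are placeholders.

```python
class PIDController:
    """Minimal discrete PID controller; gains would come from a tuner such as DEO."""

    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measurement: float) -> float:
        """Return the control action (e.g. insulin infusion rate) for one step."""
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical gains and a target glucose level of 100 mg/dL.
pid = PIDController(kp=0.5, ki=0.01, kd=0.1, dt=1.0)
action = pid.update(setpoint=100.0, measurement=160.0)
```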
This work presents the simulation of a Low-Density Parity-Check (LDPC) coding scheme with a multiuser Multi-Carrier Code Division Multiple Access (MC-CDMA) system over Additive White Gaussian Noise (AWGN) and multipath fading channels. Iterative decoding was used in the simulation, since it gives maximum efficiency within ten iterations. The modulation schemes used are Phase Shift Keying (BPSK, QPSK and 16-PSK), together with Orthogonal Frequency Division Multiplexing (OFDM). Twelve pilot carriers were used in the estimator to compensate for the channel effect. The channel model is a Long Term Evolution (LTE) channel with Technical Specification TS 25.101 v2.10 and 5 MHz bandwidth, including the channel
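As a point of reference for such a simulation setup, the sketch below estimates the bit error rate of uncoded BPSK over an AWGN channel. It is an illustrative baseline only and does not include the LDPC coding, MC-CDMA spreading, OFDM, pilot estimation, or LTE channel model described above.

```python
import numpy as np

def bpsk_awgn_ber(ebn0_db: float, n_bits: int = 100_000) -> float:
    """Monte Carlo BER of uncoded BPSK over AWGN at a given Eb/N0 (dB)."""
    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, n_bits)
    symbols = 1 - 2 * bits                      # map 0 -> +1, 1 -> -1
    ebn0 = 10 ** (ebn0_db / 10)
    noise_std = np.sqrt(1 / (2 * ebn0))         # unit symbol energy assumed
    received = symbols + noise_std * rng.standard_normal(n_bits)
    decided = (received < 0).astype(int)
    return np.mean(decided != bits)

for snr_db in (0, 2, 4, 6, 8):
    print(snr_db, "dB ->", bpsk_awgn_ber(snr_db))
```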
Coronavirus disease (Covid-19) has threatened human life, so it has become necessary to study this disease from many aspects. In this paper, a network of countries in the Middle East is described using the tools of graph theory. The study aims to identify the nature of the interdependence between these countries and their impact on one another by designating the countries as vertices of the proposed graph and measuring the distances between them using the ultrametric spanning tree.
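An ultrametric (minimax) distance between two vertices can be read off the minimum spanning tree of a weighted graph. The sketch below builds a small country graph with hypothetical edge weights and computes its minimum spanning tree with networkx; the country list and weights are placeholders, not the paper's data.

```python
import networkx as nx

# Hypothetical pairwise weights (e.g. an epidemic-related dissimilarity measure).
edges = [
    ("Iraq", "Iran", 3.0),
    ("Iraq", "Jordan", 5.0),
    ("Iran", "Turkey", 4.0),
    ("Jordan", "Saudi Arabia", 2.0),
    ("Turkey", "Saudi Arabia", 6.0),
]

G = nx.Graph()
G.add_weighted_edges_from(edges)

# Minimum spanning tree; the ultrametric distance between two countries is the
# largest edge weight on the unique tree path connecting them.
mst = nx.minimum_spanning_tree(G)
path = nx.shortest_path(mst, "Iraq", "Saudi Arabia")
ultrametric_dist = max(mst[u][v]["weight"] for u, v in zip(path, path[1:]))

print(sorted(mst.edges(data="weight")))
print("Iraq <-> Saudi Arabia ultrametric distance:", ultrametric_dist)
```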
Interest in developing accurate automatic facial emotion recognition methodologies is growing rapidly, and it remains an active research field in computer vision, artificial intelligence and automation. However, building an automated system that matches the human ability to recognize facial emotion is challenging because of the lack of an effective facial feature descriptor and the difficulty of choosing a proper classification method. In this paper, a geometric-based feature vector is proposed. For classification, three different types of methods are tested: statistical, artificial neural network (NN) and Support Vector Machine (SVM). A modified K-Means clustering algorithm
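As an illustration of the classification stage only, the sketch below trains an SVM on synthetic geometric feature vectors with scikit-learn; the feature dimensionality, emotion labels, and data are placeholders and do not reflect the paper's descriptor or dataset.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Placeholder data: 300 samples of a 12-dimensional geometric feature vector,
# each labelled with one of 6 basic emotions (0..5).
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 12))
y = rng.integers(0, 6, size=300)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Scale the features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```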
In this paper, the survival function is estimated for patients with lung cancer using different parametric estimation methods, based on a sample of complete real data describing the survival times of lung cancer patients, from diagnosis or hospital admission, over a period of two years (from 2012 to the end of 2013). Comparisons between the estimation methods were performed using the mean squared error as a statistical indicator, concluding that estimating the survival function for lung cancer with the pre-test single-stage shrinkage estimator method was the best.
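To make the comparison criterion concrete, the sketch below simulates exponential survival times and compares the maximum likelihood estimator against a generic linear shrinkage estimator by the mean squared error of the estimated survival function at a fixed time point. The exponential model, sample size, shrinkage weight, and time point are assumptions for illustration; this is not the paper's pre-test single-stage shrinkage estimator or its data.

```python
import numpy as np

rng = np.random.default_rng(1)
true_rate = 0.5            # assumed true exponential hazard rate
prior_rate = 0.6           # hypothetical prior guess used by the shrinkage estimator
n, reps, k = 25, 5_000, 0.3  # sample size, replications, shrinkage weight
t0 = 2.0                   # time at which the survival function is evaluated

def survival(rate, t):
    """Exponential survival function S(t) = exp(-rate * t)."""
    return np.exp(-rate * t)

mse_mle, mse_shrink = 0.0, 0.0
for _ in range(reps):
    sample = rng.exponential(scale=1 / true_rate, size=n)
    rate_mle = 1 / sample.mean()                       # maximum likelihood estimate
    rate_shrink = k * prior_rate + (1 - k) * rate_mle  # generic linear shrinkage
    target = survival(true_rate, t0)
    mse_mle += (survival(rate_mle, t0) - target) ** 2
    mse_shrink += (survival(rate_shrink, t0) - target) ** 2

print("MSE (MLE):      ", mse_mle / reps)
print("MSE (shrinkage):", mse_shrink / reps)
```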
This paper deals with constructing a mixed probability distribution from an exponential distribution with scale parameter (β) and a Gamma distribution with parameters (2, β), with mixing proportions ( ). First, the probability density function (p.d.f.), the cumulative distribution function (c.d.f.) and the reliability function are obtained. The parameters of the mixed distribution, ( , β), are estimated by three different methods: maximum likelihood, the method of moments, and a proposed method, the Differential Least Square Method (DLSM). The comparison is done using a simulation procedure, and all the results are presented in tables.
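For concreteness, a two-component mixture of an exponential(β) and a Gamma(2, β) density (with β as a scale parameter, as stated above) takes the form below; the mixing proportion p is a placeholder because the proportion symbol is missing from the abstract, and the reliability function follows as the mixture of the component survival functions.

```latex
f(x) = \frac{p}{\beta}\, e^{-x/\beta}
     + (1-p)\, \frac{x}{\beta^{2}}\, e^{-x/\beta}, \qquad x > 0,\; 0 \le p \le 1,
\qquad
R(x) = 1 - F(x) = p\, e^{-x/\beta} + (1-p)\, e^{-x/\beta}\!\left(1 + \frac{x}{\beta}\right).
```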
In this paper, for the first time we introduce a new four-parameter model called the Gumbel-Pareto distribution by using the T-X method. We obtain some of its mathematical properties. Some structural properties of the new distribution are studied. The method of maximum likelihood is used for estimating the model parameters. Numerical illustration and an application to a real data set are given to show the flexibility and potentiality of the new model.
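As a reminder of the general T-X family construction: if T has pdf r(t) and cdf R(t) on [a, b], G(x) is the cdf of the baseline distribution, and W(·) is a suitable link function, the cdf of the new distribution is given below. The specific choice of W and the Gumbel and Pareto components for the proposed model are those defined in the paper and are not reproduced here.

```latex
F(x) = \int_{a}^{W\!\left(G(x)\right)} r(t)\, dt = R\!\left(W\!\left(G(x)\right)\right).
```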
Diverting river flow during construction of a main dam involves the construction of cofferdams and of tunnels, channels or other temporary passages. Diversion channels are commonly used in wide valleys where the high flow makes tunnels or culverts uneconomic. The diversion works must form part of the overall project design, since they have a major impact on its cost, as well as on the design, construction program and overall cost of the permanent works. Construction costs consist of excavation, lining of the channel, and construction of the upstream and downstream cofferdams. An optimization model was applied to obtain the optimal channel cross section, upstream cofferdam height, and downstream cofferdam height with minimum construction cost.
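As an illustration of this kind of optimization, the sketch below minimizes a simple placeholder cost (excavation + lining + two cofferdams) over the channel width, depth, and cofferdam heights with scipy. The unit costs, channel length, and capacity constraint are hypothetical stand-ins for the hydraulic and economic relations developed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical unit costs and data (placeholders, not the paper's values).
LENGTH = 500.0        # channel length, m
C_EXC = 4.0           # excavation cost per m^3
C_LINE = 25.0         # lining cost per m^2
C_COFFER = 1200.0     # cofferdam cost per m of height
Q_REQ = 300.0         # required conveyance proxy (m^3/s), simplistic

def cost(x):
    width, depth, h_up, h_down = x
    excavation = C_EXC * width * depth * LENGTH
    lining = C_LINE * (width + 2 * depth) * LENGTH   # wetted perimeter of a rectangular section
    cofferdams = C_COFFER * (h_up + h_down)
    return excavation + lining + cofferdams

def capacity_constraint(x):
    width, depth, _, _ = x
    # Crude stand-in for a hydraulic capacity relation (>= 0 when satisfied).
    return 2.0 * width * depth**1.5 - Q_REQ

x0 = np.array([20.0, 4.0, 8.0, 5.0])
bounds = [(5, 60), (1, 10), (2, 20), (2, 20)]
res = minimize(cost, x0, bounds=bounds,
               constraints=[{"type": "ineq", "fun": capacity_constraint}],
               method="SLSQP")
print("optimal [width, depth, h_up, h_down]:", res.x, "cost:", res.fun)
```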