Texture synthesis using genetic algorithms, proposed in previous research, is one way to synthesize texture quickly and easily. In genetic texture synthesis algorithms, the chromosome consists of random blocks selected manually by the user. However, this method of selection is highly dependent on the user's experience; hence, a wrong selection of blocks will greatly degrade the synthesized texture. In this paper, a new method is suggested for selecting the blocks automatically, without user participation. The results show that this method of selection eliminates some of the blending artifacts caused by the previous manual selection method.
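The abstract does not state the automatic selection criterion, so the following is only a minimal sketch of one plausible approach, assuming candidate blocks are scored by how closely their grey-level histogram matches that of the whole sample; the block size, candidate count, and scoring metric are all illustrative assumptions.

```python
# Sketch: automatic block selection for a GA texture-synthesis chromosome.
# Assumption (not from the paper): blocks are ranked by how closely their
# grey-level histogram matches the whole sample's histogram.
import numpy as np

def select_blocks(texture, block=16, candidates=200, keep=20, seed=None):
    """Return the `keep` blocks whose histograms best match the sample."""
    rng = np.random.default_rng(seed)
    h, w = texture.shape
    target, _ = np.histogram(texture, bins=32, range=(0, 256), density=True)
    scored = []
    for _ in range(candidates):
        y = rng.integers(0, h - block)
        x = rng.integers(0, w - block)
        patch = texture[y:y + block, x:x + block]
        hist, _ = np.histogram(patch, bins=32, range=(0, 256), density=True)
        scored.append((np.abs(hist - target).sum(), patch))  # L1 distance
    scored.sort(key=lambda s: s[0])
    return [patch for _, patch in scored[:keep]]  # initial chromosome genes

sample = np.random.default_rng(0).integers(0, 256, (128, 128))
genes = select_blocks(sample)
```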
A two-time-step stochastic, multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, the cross-site correlations, and the two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was …
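The abstract names the Akaike value as the quantity the genetic algorithm minimizes; below is a minimal sketch of how such a fitness could be computed for a single series under a lag-2 linear model. The least-squares fitting step and the AIC form n·ln(RSS/n) + 2k are standard assumptions, not the paper's exact multi-variable formulation.

```python
# Sketch: Akaike information criterion as a GA fitness for a lag-2 model.
# Assumption: a linear autoregression y_t = a*y_{t-1} + b*y_{t-2} + c fitted
# by least squares; AIC = n*ln(RSS/n) + 2k for k estimated parameters.
import numpy as np

def aic_lag2(series):
    y = np.asarray(series, dtype=float)
    X = np.column_stack([y[1:-1], y[:-2], np.ones(len(y) - 2)])
    target = y[2:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    rss = np.sum((target - X @ coef) ** 2)
    n, k = len(target), X.shape[1]
    return n * np.log(rss / n) + 2 * k  # lower is better; the GA minimizes it

# Synthetic monthly series with an annual cycle, for illustration only.
rng = np.random.default_rng(1)
monthly_temp = np.sin(np.arange(120) * 2 * np.pi / 12) + rng.normal(0, 0.2, 120)
print(aic_lag2(monthly_temp))
```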
The main goal of this study was the ability to produce load-bearing masonry units adopting the ACI 211.1 mix design with proportions of 1:3.2:2.5 (cement : fine aggregate : coarse aggregate) and a slump range of 25-50 mm, conforming to the dimension, absorption, and compressive strength requirements of IQS 1077/1987, type A. The ability to use a low cement content (300 kg/m³) to keep our products competitive in the market was encouraging, since most consumption is in wall construction for low-cost buildings. The use of 10% and 20% LECA as a partial volume replacement of coarse aggregate to reduce the heavy weight of masonry blocks can also be recommended. The types of production of the load-bearing masonry units were A and B for (…
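As a worked-arithmetic illustration of these proportions, the sketch below converts the 1:3.2:2.5 weight ratio at 300 kg/m³ of cement into batch quantities and applies the 10% and 20% LECA volume replacement; the aggregate and LECA densities are assumed values that the abstract does not provide.

```python
# Sketch: batch quantities per m^3 from the study's 1:3.2:2.5 mix
# (cement : fine : coarse by weight) at 300 kg/m^3 cement content.
# The LECA step uses assumed particle densities for illustration.
cement = 300.0                # kg per m^3 of concrete (from the study)
fine = 3.2 * cement           # 960 kg fine aggregate
coarse = 2.5 * cement         # 750 kg coarse aggregate

rho_coarse, rho_leca = 2600.0, 600.0   # kg/m^3, assumed densities
for repl in (0.10, 0.20):              # 10% and 20% volume replacement
    removed = repl * coarse                        # gravel taken out, kg
    leca = removed / rho_coarse * rho_leca         # LECA filling that volume
    total = cement + fine + (coarse - removed) + leca
    print(f"{repl:.0%} LECA: {leca:.0f} kg LECA, batch mass {total:.0f} kg/m^3")
```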
Image compression is very important in reducing the costs of data storage and transmission over relatively slow channels. The wavelet transform has received significant attention because its multiresolution decomposition allows efficient image analysis. This paper attempts to give an understanding of the wavelet transform using two of the more popular wavelet families, Haar and Daubechies, and compares their effects on image compression.
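A minimal sketch of such a comparison follows, assuming compression is approximated by zeroing the smallest 90% of wavelet coefficients; the PyWavelets package, the db4 member of the Daubechies family, and the random test image are illustrative choices.

```python
# Sketch: comparing Haar and Daubechies (db4) wavelets for lossy compression
# by discarding the smallest-magnitude coefficients.
import numpy as np
import pywt

def compress(img, wavelet, discard=0.90, level=3):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), discard)   # keep the largest 10%
    arr[np.abs(arr) < thresh] = 0.0
    kept = pywt.array_to_coeffs(arr, slices, output_format='wavedec2')
    return pywt.waverec2(kept, wavelet)

img = np.random.default_rng(2).random((256, 256))
for w in ('haar', 'db4'):
    rec = compress(img, w)[:256, :256]           # trim any padding
    mse = np.mean((img - rec) ** 2)
    print(w, f"MSE={mse:.5f}")
```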
Several problems must be solved in image compression to make the process workable and more efficient. Much work has been done in the field of lossy image compression based on the wavelet transform and the Discrete Cosine Transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a combined encoding transform scheme. It consists of the following steps: 1) a bi-orthogonal (tap 9/7) wavelet transform to split the image data into sub-bands, 2) a DCT to de-correlate the data, 3) scalar quantization of the combined transform stage's output, followed by mapping to positive values, and 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used t…
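A minimal sketch of the four-stage pipeline is given below; the quantization step size, applying the DCT to the assembled coefficient array rather than per sub-band, and the toy LZW coder are assumptions made for illustration, since the abstract does not give the exact settings.

```python
# Sketch of the four-stage pipeline: bior 9/7 wavelet -> DCT ->
# scalar quantization with positive mapping -> LZW encoding.
import numpy as np
import pywt
from scipy.fft import dctn

def lzw_encode(symbols):
    """Minimal LZW over a sequence of integer symbols (illustrative)."""
    table = {(s,): i for i, s in enumerate(sorted(set(symbols)))}
    out, cur = [], ()
    for s in symbols:
        if cur + (s,) in table:
            cur += (s,)
        else:
            out.append(table[cur])
            table[cur + (s,)] = len(table)
            cur = (s,)
    out.append(table[cur])
    return out

img = np.random.default_rng(3).random((64, 64))
coeffs = pywt.wavedec2(img, 'bior4.4', level=2)   # 1) sub-band split (9/7)
arr, slices = pywt.coeffs_to_array(coeffs)
arr = dctn(arr, norm='ortho')                     # 2) de-correlate with DCT
q = np.round(arr / 0.05).astype(int)              # 3) scalar quantization...
q = q - q.min()                                   #    ...mapped to positive
codes = lzw_encode(q.ravel().tolist())            # 4) LZW entropy coding
print(f"{q.size} symbols -> {len(codes)} LZW codes")
```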
In this paper, a new tunable approach is presented for fusing satellite images that fall in different electromagnetic wave ranges. It gives us the ability to make the features of one image slightly superior to the other's without reducing the overall quality of the fused image. The approach is based on the principal component analysis (PCA) fusion method. A comparison is made between the results of the proposed approach and two existing fusion methods (the PCA fusion method and the projection-of-eigenvectors-on-the-bands fusion method), and the results show the validity of the new method.
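A minimal sketch of classic PCA fusion with a tunable bias term added is shown below; the `alpha` parameter is only an illustrative way to favor one image's features, not the paper's actual tuning mechanism.

```python
# Sketch: PCA image fusion with a tunable bias toward one input image.
import numpy as np

def pca_fuse(img_a, img_b, alpha=0.0):
    """Fuse two same-size images; alpha in [-1, 1] shifts weight to a/b."""
    data = np.stack([img_a.ravel(), img_b.ravel()])
    vals, vecs = np.linalg.eigh(np.cov(data))
    w = np.abs(vecs[:, np.argmax(vals)])     # principal eigenvector weights
    w = w / w.sum()
    w = np.clip(w + np.array([alpha, -alpha]), 0, None)  # tunable bias
    w = w / w.sum()
    return w[0] * img_a + w[1] * img_b

rng = np.random.default_rng(4)
pan, ir = rng.random((128, 128)), rng.random((128, 128))
fused = pca_fuse(pan, ir, alpha=0.1)   # favor the first band slightly
```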
Regression testing, being expensive, requires a notion of optimization. Typically, the optimization of test cases results in selecting a reduced subset of test cases or prioritizing them to detect potential faults at an earlier phase. Many former studies revealed heuristic-dependent mechanisms to attain optimality while reducing or prioritizing test cases. Nevertheless, those studies lacked systematic procedures to manage the issue of tied test cases. Moreover, evolutionary algorithms such as the genetic algorithm often help in reducing test cases, together with a concurrent decrease in computational runtime. However, when examining the fault detection capacity along with other parameters is required, the method falls sh…
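As a sketch of the genetic approach the abstract refers to, the following prioritizes test cases with a permutation GA whose fitness is the standard APFD (average percentage of faults detected) metric; the fault matrix, population size, and swap-mutation operator are illustrative assumptions.

```python
# Sketch: GA test-case prioritization with APFD as the fitness function.
import random

FAULTS = {  # test id -> set of faults it detects (hypothetical data)
    0: {0, 2}, 1: {1}, 2: {2, 3}, 3: {0}, 4: {3, 4},
}

def apfd(order, n_faults=5):
    """APFD = 1 - sum(first-detection positions)/(n*m) + 1/(2n)."""
    first = {}
    for pos, t in enumerate(order, start=1):
        for f in FAULTS[t]:
            first.setdefault(f, pos)
    n = len(order)
    return 1 - sum(first.values()) / (n * n_faults) + 1 / (2 * n)

def evolve(gens=100, pop_size=20):
    pop = [random.sample(list(FAULTS), len(FAULTS)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=apfd, reverse=True)
        pop = pop[: pop_size // 2]                   # elitist selection
        while len(pop) < pop_size:
            child = random.choice(pop)[:]
            i, j = random.sample(range(len(child)), 2)
            child[i], child[j] = child[j], child[i]  # swap mutation
            pop.append(child)
    return max(pop, key=apfd)

print(evolve())
```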