Extractive multi-document text summarization – the task of removing redundant information from a document collection while preserving its salient sentences – has recently attracted considerable interest in automatic model design. This paper proposes an extractive multi-document text summarization model based on a genetic algorithm (GA). First, the problem is modeled as a discrete optimization problem and a specific fitness function is designed to cope effectively with the proposed model. Then, a binary-encoded representation together with a heuristic mutation operator and a local repair operator are proposed to characterize the adopted GA. Experiments are conducted on ten topics from the Document Understanding Conference DUC2002 datasets (d061j through d070f). Results demonstrate the effectiveness of the proposed model when compared with a state-of-the-art model.
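As one concrete reading of this design, the sketch below evolves a binary chromosome with one bit per sentence and applies a local repair step to respect a summary-length budget, as the abstract describes. The fitness used here (similarity to the document centroid minus pairwise redundancy), the one-point crossover, and all parameter values are illustrative assumptions, since the paper's actual fitness function is not given.

```python
import random

def cosine(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den if den else 0.0

def fitness(chrom, vectors, centroid):
    # Coverage of the document centroid minus pairwise redundancy
    # between the chosen sentences (an assumed, illustrative fitness).
    chosen = [v for bit, v in zip(chrom, vectors) if bit]
    if not chosen:
        return 0.0
    coverage = sum(cosine(v, centroid) for v in chosen)
    redundancy = sum(cosine(a, b) for i, a in enumerate(chosen)
                     for b in chosen[i + 1:])
    return coverage - redundancy

def repair(chrom, max_sentences):
    # Local repair: drop random picks until the length budget is met.
    ones = [i for i, b in enumerate(chrom) if b]
    while len(ones) > max_sentences:
        chrom[ones.pop(random.randrange(len(ones)))] = 0
    return chrom

def ga_summarize(vectors, max_sentences, pop_size=30, gens=100, pm=0.1):
    n = len(vectors)
    centroid = [sum(col) / n for col in zip(*vectors)]
    pop = [repair([random.randint(0, 1) for _ in range(n)], max_sentences)
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda c: fitness(c, vectors, centroid), reverse=True)
        elite = pop[:pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)
            cut = random.randrange(1, n)              # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (random.random() < pm) for b in child]  # bit-flip mutation
            children.append(repair(child, max_sentences))
        pop = elite + children
    best = max(pop, key=lambda c: fitness(c, vectors, centroid))
    return [i for i, bit in enumerate(best) if bit]   # indices of summary sentences
```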
An immuno-haematological genetic marker study was carried out to understand the genetic background variations among the indigenous population of Kirkuk (Iraq). A cross-sectional study of 179 patients with thalassemia major was conducted in Kirkuk. A detailed review was undertaken to define the relationships between ethnic origins, phenotype, and immuno-genetic marker uniformity in relation to genetic isolation and interethnic admixture. A total of 179 thalassemia major patients were analysed at the hereditary blood diseases centre, including 18 (10.05%) from intermarriages between different ethnic origins, whereas the overall consanguineous marriage rate was estimated at 161 (89.9%), including 63 (35.1%) for first cousin marriages.
The study aimed to reveal the extent to which international issues are integrated into Saudi public education social studies and citizenship textbooks in the light of the principles of international education, and to determine the continuity and integration of these issues across the books, in order to build a scope and sequence matrix of international issues. The study followed the descriptive analytical method, using a content analysis card as the study tool once it had achieved the necessary validity and reliability characteristics. The data of this study were processed using the SPSS statistical program according to a set of appropriate methods of descriptive and inferential statistics.
The resu
Many painters have tried to mix colors with music, either by direct employment through colorful musical pieces or through the use of multiple instruments and techniques, or vice versa. Among them is the French artist (Robert Stroben), who transferred pieces of music to be depicted on the painting: working with the music of (Johann Sebastian Bach), he dropped colors onto the lines of the musical scale, for example the tone C ranging from brown to red and the tone La (A) from gray to orange, and so on. The presence of links and similarity factors between the world of music and the world of colors facilitated the process of linking musical notes with colors, the most famous of which was presented by the scientist (Newton) in the circle of basic colors and linking
In this paper, we propose using the extreme value distribution as the rate of occurrence of the non-homogeneous Poisson process, in order to improve the rate of occurrence of the non-homogeneous process; the result has been called the Extreme Value Process. To estimate the parameters of this process, we propose the maximum likelihood method, the method of moments, and a smart method represented by the Artificial Bee Colony (ABC) algorithm, to reach the estimator that best represents the data. The results of the three methods are compared through a simulation of the model, and it is concluded that the ABC estimator is better than the estimators of the maximum likelihood method and the method of moments.
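For illustration, the sketch below simulates a non-homogeneous Poisson process whose intensity is a scaled Gumbel (type-I extreme value) density, using Lewis-Shedler thinning; the specific parameterisation and the scaling are assumptions, not the paper's exact model.

```python
import math
import random

def gumbel_intensity(t, mu, beta, scale):
    # Scaled Gumbel (type-I extreme value) density used as lambda(t).
    z = (t - mu) / beta
    return scale * math.exp(-(z + math.exp(-z))) / beta

def simulate_nhpp(intensity, t_max, lam_max):
    # Lewis-Shedler thinning: draw candidates from a homogeneous Poisson
    # process of rate lam_max, accept each with probability lambda(t)/lam_max.
    t, events = 0.0, []
    while True:
        t += random.expovariate(lam_max)
        if t > t_max:
            return events
        if random.random() < intensity(t) / lam_max:
            events.append(t)

events = simulate_nhpp(lambda t: gumbel_intensity(t, mu=5.0, beta=1.5, scale=40.0),
                       t_max=10.0, lam_max=12.0)
print(len(events), "event times simulated")
```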
This work presents a comparison between Convolutional Encoding (CE), parallel Turbo coding, and Low-Density Parity-Check (LDPC) coding schemes in a Multi-User Single-Output (MUSO) Multi-Carrier Code Division Multiple Access (MC-CDMA) system over multipath fading channels. The decoding technique used in the simulation was iterative decoding, since it gives maximum efficiency at higher iterations. The modulation scheme used is Quadrature Amplitude Modulation (QAM). Eight pilot carriers were used to compensate for the channel effect using the least-squares estimation method. The channel model used is a Long Term Evolution (LTE) channel per Technical Specification TS 25.101 v2.10 with 5 MHz bandwidth, including the indoor-to-outdoor/pedestrian channels
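A minimal sketch of the pilot-aided least-squares step is given below: the channel is estimated at 8 pilot subcarriers as H = Y / X and interpolated across the band. Subcarrier count, pilot placement, QPSK data, and the 3-tap channel are illustrative assumptions rather than the paper's simulation settings.

```python
import numpy as np

n_sc = 64
pilot_idx = np.linspace(0, n_sc - 1, 8, dtype=int)   # 8 equally spaced pilots

# Frequency response of a short (3-tap) multipath channel
taps = np.array([1.0, 0.5, 0.2]) * np.exp(1j * np.random.uniform(0, 2 * np.pi, 3))
h_true = np.fft.fft(taps, n_sc)

# QPSK data with known unit pilots inserted, passed through the channel
tx = (np.random.choice([-1.0, 1.0], n_sc)
      + 1j * np.random.choice([-1.0, 1.0], n_sc)) / np.sqrt(2)
tx[pilot_idx] = 1.0
rx = h_true * tx + 0.05 * (np.random.randn(n_sc) + 1j * np.random.randn(n_sc))

# LS estimate at the pilots (H = Y / X), then linear interpolation
h_ls = rx[pilot_idx] / tx[pilot_idx]
h_hat = (np.interp(np.arange(n_sc), pilot_idx, h_ls.real)
         + 1j * np.interp(np.arange(n_sc), pilot_idx, h_ls.imag))

print("mean absolute estimation error:", np.mean(np.abs(h_hat - h_true)))
```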
Association rule mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules that are not appropriate for a given user. Recent research in ARM investigates metaheuristic algorithms that search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining (DCS-ARM) is proposed for this purpose. The effectiveness of the algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms existing metaheuristic methods.
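The sketch below gives one possible shape of such a discrete cuckoo search for rule mining. The rule encoding (one gene per item: absent, antecedent, or consequent), the support/confidence fitness, and the discrete stand-in for the Levy flight are all assumptions made for illustration; the abstract does not detail the operators of DCS-ARM.

```python
import random

def measures(rule, transactions):
    # Gene values: 0 = item absent, 1 = antecedent, 2 = consequent.
    ante = {i for i, g in enumerate(rule) if g == 1}
    cons = {i for i, g in enumerate(rule) if g == 2}
    if not ante or not cons:
        return 0.0, 0.0
    n_a = sum(ante <= t for t in transactions)
    n_ac = sum(ante <= t and cons <= t for t in transactions)
    return n_ac / len(transactions), (n_ac / n_a if n_a else 0.0)

def fitness(rule, transactions, alpha=0.5):
    s, c = measures(rule, transactions)
    return alpha * s + (1 - alpha) * c

def discrete_levy_step(rule, strength=0.3):
    # Discrete stand-in for a Levy flight: heavy-tailed burst of gene changes.
    new = rule[:]
    for _ in range(max(1, int(random.paretovariate(1.5) * strength * len(rule)))):
        new[random.randrange(len(new))] = random.choice([0, 1, 2])
    return new

def dcs_arm(transactions, n_items, nests=20, iters=200, pa=0.25):
    pop = [[random.choice([0, 1, 2]) for _ in range(n_items)] for _ in range(nests)]
    for _ in range(iters):
        cand = discrete_levy_step(random.choice(pop))
        j = random.randrange(nests)
        if fitness(cand, transactions) > fitness(pop[j], transactions):
            pop[j] = cand
        pop.sort(key=lambda r: fitness(r, transactions))
        for k in range(int(pa * nests)):       # abandon the worst nests
            pop[k] = [random.choice([0, 1, 2]) for _ in range(n_items)]
    return max(pop, key=lambda r: fitness(r, transactions))

transactions = [{0, 1, 2}, {0, 2}, {1, 2, 3}, {0, 1, 3}, {2, 3}]
best = dcs_arm(transactions, n_items=4)
print(best, measures(best, transactions))
```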
In this research, an adaptive Canny algorithm using the fast Otsu multithresholding method is presented, in which the fast Otsu multithresholding method is used to calculate the optimal maximum and minimum hysteresis values, which serve as automatic thresholds for the fourth stage of the Canny algorithm. The new adaptive Canny algorithm and the standard Canny algorithm (with manual hysteresis values) were tested on a standard image (Lena) and a satellite image. The results confirmed the validity and accuracy of the new algorithm in finding image edges for personal and satellite images as a pre-step for image segmentation.
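A minimal sketch of the core idea, assuming OpenCV: the global Otsu threshold is taken as the high hysteresis value and half of it as the low value, a common heuristic; the paper's fast multi-Otsu variant would derive both values from multi-level thresholding instead.

```python
import cv2

# Load a greyscale test image (file name is illustrative).
img = cv2.imread("lena.png", cv2.IMREAD_GRAYSCALE)

# Otsu's method picks the global threshold that best separates the
# intensity histogram; reuse it as the high hysteresis value.
high, _ = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
low = 0.5 * high

edges = cv2.Canny(img, low, high)
cv2.imwrite("lena_edges.png", edges)
```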
Image compression has become one of the most important applications of the image processing field because of the rapid growth in computer power, the corresponding growth in the multimedia market, and the advent of the World Wide Web, which makes the internet easily accessible to everyone. Since the early 1980s, digital image sequence processing has been an attractive research area, because an image sequence, as a collection of images, may allow much greater compression than a single image frame, while the increased computational complexity and memory space required for image sequence processing have, in fact, become more attainable. This research uses the Absolute Moment Block Truncation compression technique, which depends on adopting the good points of oth
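Since the abstract is cut off before describing the method, the sketch below shows only the standard Absolute Moment Block Truncation Coding (AMBTC) step it builds on: each block is reduced to a bit plane plus two reconstruction levels.

```python
import numpy as np

def ambtc_encode(block):
    # Split the block around its mean; keep the bit plane and the means
    # of the two groups (the "absolute moment" reconstruction levels).
    mean = block.mean()
    bitplane = block >= mean
    hi = block[bitplane].mean() if bitplane.any() else mean
    lo = block[~bitplane].mean() if (~bitplane).any() else mean
    return bitplane, hi, lo

def ambtc_decode(bitplane, hi, lo):
    return np.where(bitplane, hi, lo)

block = np.array([[12, 240, 35, 200],
                  [18, 230, 40, 190],
                  [25, 210, 50, 180],
                  [30, 205, 60, 170]], dtype=float)
bitplane, hi, lo = ambtc_encode(block)
print(ambtc_decode(bitplane, hi, lo))
```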
The chemical additives used to enhance the properties of drilling mud cause damage to humans and the environment. Therefore, it is necessary to search for alternative additives to add to the drilling mud. Thus, this study investigates the effects of pomegranate peel and grape seed powders, as natural wastes, when added to un-weighted water-based mud. The tests include measurements of the rheological properties and filtration, as well as the alkalinity and density of the drilling mud. The results showed a decrease in pH values with an increase in the concentration of pomegranate peel or grape seed, and a decrease in mud density with an increase in pomegranate peel and grape seed powder concentrations that resulted f
The intellectual property of digital documents has been protected using many methods of digital watermarking. Digital documents have many advantages over printed documents: they are less expensive and easier to store, transport, and search. But they have their own limitations too. A simple image editor can be used to modify a document and produce a forgery; digital documents can be tampered with easily. In order to utilize the full benefits of digital documents, these limitations have to be overcome by embedding some text or logo sequence that identifies the owner of the document.
In this research, the LSB (least significant bit) technique has been used
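A minimal sketch of LSB embedding, assuming a greyscale cover image held as a NumPy array: watermark bits overwrite the least significant bit of the leading pixels, and extraction reads them back. Names and sizes are illustrative.

```python
import numpy as np

def embed_lsb(cover, bits):
    # Overwrite the LSB of the first bits.size pixels with the watermark bits.
    flat = cover.flatten()                     # flatten() returns a copy
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bits):
    return stego.flatten()[:n_bits] & 1

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in cover image
bits = np.unpackbits(np.frombuffer(b"owner-id", dtype=np.uint8))

stego = embed_lsb(cover, bits)
recovered = np.packbits(extract_lsb(stego, bits.size)).tobytes()
print(recovered)                               # b'owner-id'
```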