The huge amount of information on the internet creates an urgent need for text summarization. Text summarization is the process of selecting important sentences from documents while preserving the main idea of the original documents. This paper proposes a method based on the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). The first step of our model extracts seven features for each sentence in the document set. Multiple Linear Regression (MLR) is then used to assign a weight to each of the selected features, and the TOPSIS method is applied to rank the sentences; the sentences with the highest scores are selected for inclusion in the generated summary. The proposed model is evaluated on the English-document dataset supplied by the Text Analysis Conference (TAC-2011), using the Recall-Oriented Understudy for Gisting Evaluation (ROUGE) metric. The obtained results support the effectiveness of the proposed model.
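As a sketch of the ranking step, the following code applies standard TOPSIS to a toy sentence-feature matrix. The matrix, the three illustrative features, and the weight vector are invented here for illustration; the paper itself uses seven features with MLR-derived weights.

```python
import numpy as np

def topsis_rank(features, weights):
    """Rank alternatives (rows) by TOPSIS closeness to the ideal solution.

    features: (n_sentences, n_features) matrix; all criteria treated as benefits.
    weights:  (n_features,) weight vector (e.g. learned by MLR), summing to 1.
    """
    # Vector-normalize each column, then apply the feature weights.
    norm = features / np.linalg.norm(features, axis=0)
    v = norm * weights
    # Ideal (best) and anti-ideal (worst) values per feature.
    ideal, anti = v.max(axis=0), v.min(axis=0)
    # Euclidean distances to both, then relative closeness in [0, 1].
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Toy example: 4 sentences scored on 3 hypothetical features.
X = np.array([[0.8, 0.2, 0.5],
              [0.4, 0.9, 0.1],
              [0.6, 0.6, 0.6],
              [0.1, 0.1, 0.2]])
w = np.array([0.5, 0.3, 0.2])            # hypothetical MLR weights
scores = topsis_rank(X, w)
summary_order = np.argsort(scores)[::-1]  # highest-scoring sentences first
```

Sentences are then taken from `summary_order` until the summary length budget is reached.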
Steganography (hiding text in images) remains an important issue for researchers at the present time. Steganography methods vary in their hiding styles, from simple techniques to complex ones that resist potential attacks. The present research does not consider attacks on the host's secret text; instead, it proposes and implements an improved, highly confidential method of hiding text within an image, combined with a strong password method, so as to ensure that no change is made to the pixel values of the host image after the text is hidden. The phrase "highly confidential" denotes the low level of suspicion the method raises about the cover image. The experimental results show that the cover…
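The paper's exact embedding scheme is not given in this excerpt. One known way to hide text without altering any pixel, as the abstract requires, is to search the image for pixels whose values already equal the message bytes and record only their coordinates as a secret key. The sketch below illustrates that idea under that assumption; the function names and key format are hypothetical, not the paper's.

```python
import numpy as np

def hide_without_change(image, text):
    """Map each message byte to a pixel that already holds that value.

    Returns a list of (row, col) coordinates (the 'key'); the image is
    never modified, so its pixel values stay identical. Hypothetical
    illustration of zero-distortion hiding, not the paper's scheme.
    """
    key = []
    for byte in text.encode("utf-8"):
        rows, cols = np.where(image == byte)
        if rows.size == 0:
            raise ValueError(f"no pixel holds value {byte}")
        key.append((int(rows[0]), int(cols[0])))
    return key

def reveal(image, key):
    """Recover the hidden text by reading the pixels named in the key."""
    data = bytes(int(image[r, c]) for r, c in key)
    return data.decode("utf-8")

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
key = hide_without_change(img, "hi")
```

In such a scheme the key itself must be protected (e.g. by the password mechanism the abstract mentions), since anyone holding both the image and the key can recover the text.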
This text is one of the confiscated cuneiform texts in the possession of the Iraq Museum, bearing the museum number (235869), with dimensions of 12.7 × 6 × 2.5 cm. It records incoming quantities of barley. The text is dated to the Ur III period (2012-2004 BC) and belongs to the third year of the reign of King Ibbi-Sin (2028-2004 BC). The principal figure in this text is (Ba-a-ka, the livestock fattener) from the city of Iri-Sagrig; compared with the published cuneiform texts belonging to his archive, numbering (196) texts, which documented his activities…
Text linguistics has presented a set of criteria that make the text, in essence, a project for producing and using texts. Text linguists gathered the means of textual cohesion under two standards: cohesion (al-sabk) and coherence (al-habk). This study finds equivalents of these means in the Arabic rhetorical heritage. For the standard of cohesion, it identifies conjunction (the grammatical link, treated in the chapters on connection and disjunction), reference (represented by the pronoun, the definite article, and the demonstrative), ellipsis, repetition, and fronting and deferral. As for the standard of coherence, the Arab rhetoricians emphasized the semantic consistency between the components of the text, reflected in the semantic link in the chapters on transitions, as well as in the semantic coherence between the parts of the text…
In this paper, we introduce a method to recognize printed Arabic text, since recognition of printed text is very important in information-technology applications. Arabic belongs to a group of languages with connected characters, which also includes Urdu, Kurdish, Persian, and old Ottoman Turkish. Recognizing a connected letter is difficult because it takes several shapes: it has one form at the beginning of a word, another form in the middle of a word, and another form at the end. For texts in languages whose characters are not connected, the image of a letter is the same in any position in the word, and ready-made recognition programs have been available for them for a long time.
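The positional shaping described above can be seen directly in Unicode, which encodes the isolated, initial, medial, and final presentation forms of each connected Arabic letter as separate code points. A small sketch, using the letter beh as an example (the code points are from the Arabic Presentation Forms-B block and are believed correct, but are supplied here only for illustration):

```python
import unicodedata

# The four contextual presentation forms of the Arabic letter beh
# (base letter U+0628) from the Arabic Presentation Forms-B block.
forms = {
    "isolated": "\uFE8F",
    "final":    "\uFE90",
    "initial":  "\uFE91",
    "medial":   "\uFE92",
}

for position, glyph in forms.items():
    # Each name reads "ARABIC LETTER BEH <POSITION> FORM": one abstract
    # letter, four distinct shapes -- the core difficulty for recognizing
    # connected scripts.
    print(position, unicodedata.name(glyph))
```

A recognizer for a connected script must therefore either model all positional variants of each letter or segment the connected word before classification.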
MR Younus, Nasaq Journal, 2022
Deconstruction theory is a theory that appeared after structuralism, and through some key principles it seeks to reach the purposive, main meaning of the text by means of different perspectives. In other words, deconstruction is a critical literary theory and a contemporary philosophical approach that work together to reach the exact concept of the text, and this is achieved through reading and analyzing it. Deconstruction has therefore specified certain principles through which the exact meaning of the text can be reached.
Introduction:
Deconstruction theory is a theory that emerged after structuralism, and it seeks to…
Abstract
This research aims to overcome the problem of dimensionality by using a non-linear regression method that reduces the root mean square error (RMSE), called projection pursuit regression (PPR). PPR is one of the dimension-reduction methods that address the curse of dimensionality. It is a statistical technique for finding the most informative projections in multi-dimensional data: with each projection found, the data are reduced along linear combinations over that projection, and the process is repeated until the best projections are obtained. The main idea of PPR is to model…
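A single projection-pursuit term fits y ≈ g(w·x), where w is a direction in the input space and g a smooth ridge function. The sketch below illustrates the idea on toy data by searching random unit directions and fitting a cubic ridge function to each; it is a minimal illustration of the principle, not the full Friedman-Stuetzle algorithm and not the paper's implementation, and all names and data are invented.

```python
import numpy as np

def one_term_ppr(X, y, n_directions=200, degree=3, seed=0):
    """Fit a single projection-pursuit term y ~ g(w @ x).

    Tries random unit directions w; for each, fits a polynomial ridge
    function g to the projected data and keeps the direction with the
    lowest RMSE. Returns (rmse, w, polynomial coefficients of g).
    """
    rng = np.random.default_rng(seed)
    best = (np.inf, None, None)
    for _ in range(n_directions):
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        z = X @ w                               # 1-D projection of the data
        coeffs = np.polyfit(z, y, degree)       # ridge function g as a cubic
        rmse = np.sqrt(np.mean((np.polyval(coeffs, z) - y) ** 2))
        if rmse < best[0]:
            best = (rmse, w, coeffs)
    return best

# Toy data: a 5-D input whose response depends on one hidden direction.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
true_w = np.array([1.0, 2.0, 0.0, 0.0, 0.0]) / np.sqrt(5.0)
y = np.sin(X @ true_w)
rmse, w_hat, g = one_term_ppr(X, y)
```

The full algorithm additionally refines each direction by optimization, subtracts the fitted term from the residuals, and repeats to add further projections.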
The theory of probabilistic programming may be conceived in several different ways. As a programming method, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The mechanism generating such probabilistic variations in economic models may be incomplete information about changes in demand, production, and technology; specification errors in the econometric relations presumed for different economic agents; uncertainty of various sorts; and the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem when the coefficient bi is a random variable.
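As an illustration of one standard treatment of a random right-hand side bi (which may or may not be the paper's approach), the chance constraint P(a·x ≤ b) ≥ α with b ~ N(μ, σ²) reduces to the deterministic constraint a·x ≤ μ − z_α·σ, where z_α is the standard normal quantile. The sketch below computes that deterministic equivalent; the numbers are invented for illustration.

```python
from statistics import NormalDist

def deterministic_rhs(mu, sigma, alpha):
    """Deterministic equivalent of P(a.x <= b) >= alpha with b ~ N(mu, sigma^2).

    The random right-hand side b is replaced by mu - z_alpha * sigma: the
    largest threshold that b exceeds with probability at least alpha.
    """
    z = NormalDist().inv_cdf(alpha)
    return mu - z * sigma

# Resource availability with mean 100 and std 10, required reliability 95%.
rhs = deterministic_rhs(100.0, 10.0, 0.95)
```

Demanding higher reliability tightens the constraint: raising α shrinks the usable right-hand side, which is the price paid for hedging against the randomness in bi.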
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample; these files are very small compared with the size of the original signals. The compression ratio is calculated from the size of th…
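The Levinson-Durbin recursion mentioned above solves the Toeplitz normal equations of linear prediction in O(p²), yielding the LP coefficients, the reflection coefficients, and the prediction error in one pass. A minimal sketch (the function name and the toy autocorrelation sequence are mine, not the paper's):

```python
import numpy as np

def levinson_durbin(r, order):
    """Solve the Toeplitz normal equations of linear prediction.

    r: autocorrelation sequence r[0..order].
    Returns (a, k, err): LP coefficients a (with a[0] == 1), reflection
    coefficients k, and the final prediction error err.
    """
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    k = np.zeros(order)
    for i in range(1, order + 1):
        # Correlation of the current predictor with the next lag.
        acc = r[i] + np.dot(a[1:i], r[1:i][::-1])
        k[i - 1] = -acc / err
        # Order update: a_new[j] = a[j] + k * a[i - j] for j = 1..i.
        a[1:i + 1] = a[1:i + 1] + k[i - 1] * a[i - 1::-1][:i]
        err *= (1.0 - k[i - 1] ** 2)
    return a, k, err

# Autocorrelation of an ideal AR(1) process with pole 0.5: r[k] = 0.5**k.
r = np.array([1.0, 0.5, 0.25])
a, k, err = levinson_durbin(r, order=2)
# Recovers the predictor a = [1, -0.5, 0] with error 0.75.
```

Only the low-order LP coefficients (plus the previous sample) need to be stored, which is what makes the compressed files so much smaller than the original signal.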