Recommendation techniques are widely employed in e-commerce applications. Existing systems recommend items to a particular user according to that user's preferences and prior high ratings. A number of methods, such as collaborative filtering and content-based approaches, dominate the architectural design of recommendation systems. Many recommendation schemes are domain-specific and cannot be deployed in a general-purpose setting. This study proposes a two-dimensional (User × Item)-space multimode recommendation scheme for settings with an enormous user base but few items on offer. The design of the scheme is anchored on the users and items, as expressed by a particular user's ratings, and combines association rule mining with the content-based method. The experiments used the MovieLens dataset, consisting of 100,000 movie ratings on a scale of 1-5, from 943 users on 1,682 movies, and applied five-fold cross-validation to a (User × Item) rating matrix of 12 items each rated by a minimum of 320 users. A total of 16 rules were generated for both users and items, at 35% minimum support and 80% confidence for the items and 50% similarity for the users. Experimental results showed that the predicted ratings, expressed on a decimal scale, achieved better accuracy than the other measures considered. In conclusion, the proposed algorithm works well on a two-dimensional (User × Item)-space in which items are significantly fewer than users, making it applicable and effective in a variety of uses and scenarios as a general-purpose utility.
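The support and confidence thresholds mentioned above can be made concrete with a small sketch of association-rule mining over a (User × Item) rating matrix. The toy ratings, the item names, and the "liked" threshold of 4 are illustrative assumptions, not values from the study.

```python
# Toy (User x Item) rating matrix: user -> {item: rating on a 1-5 scale}.
# Items and ratings are invented for illustration.
RATINGS = {
    "u1": {"A": 5, "B": 4, "C": 2},
    "u2": {"A": 4, "B": 5},
    "u3": {"A": 5, "C": 4},
    "u4": {"B": 4, "C": 1},
    "u5": {"A": 4, "B": 4, "C": 5},
}

def liked(user, item, threshold=4):
    """Binarize a rating: a user 'likes' an item if rated >= threshold."""
    return RATINGS[user].get(item, 0) >= threshold

def support(items):
    """Fraction of users who liked every item in `items`."""
    hits = sum(all(liked(u, i) for i in items) for u in RATINGS)
    return hits / len(RATINGS)

def confidence(antecedent, consequent):
    """Estimated P(likes consequent | likes antecedent)."""
    return support(antecedent + consequent) / support(antecedent)

# A candidate rule {A} -> {B} passes, e.g., 35% min support and a
# confidence threshold if these two numbers clear the cutoffs.
sup_ab = support(["A", "B"])
conf_ab = confidence(["A"], ["B"])
```

A mining pass would enumerate candidate rules and keep only those whose support and confidence clear the chosen thresholds (35% and 80% in the study).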
Multi-document summarization is an optimization problem demanding the simultaneous optimization of more than one objective function. The proposed work addresses the balancing of two significant objectives, content coverage and diversity, when generating summaries from a collection of text documents.
Any automatic text summarization system faces the challenge of producing a high-quality summary. Despite existing efforts on designing and evaluating many text summarization techniques, their formulations lack a model that gives an explicit representation of the two contradictory semantics of any summary: coverage and diversity. In this work, the design of
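One standard way to trade coverage against diversity is a greedy maximal-marginal-relevance (MMR) style selection, sketched below. The word-overlap similarity, the weight `lam`, and the sample sentences are assumptions for the example, not the model proposed in the abstract above.

```python
def jaccard(a, b):
    """Crude lexical similarity between two sentences (word overlap)."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def mmr_summary(sentences, query, k=2, lam=0.7):
    """Greedily pick k sentences, rewarding relevance to `query`
    (coverage) and penalizing similarity to already-chosen sentences
    (enforcing diversity)."""
    chosen, candidates = [], list(sentences)
    while candidates and len(chosen) < k:
        def score(s):
            relevance = jaccard(s, query)
            redundancy = max((jaccard(s, c) for c in chosen), default=0.0)
            return lam * relevance - (1 - lam) * redundancy
        best = max(candidates, key=score)
        chosen.append(best)
        candidates.remove(best)
    return chosen

summary = mmr_summary(
    ["the cat sat", "the cat sat down", "dogs bark loudly"],
    query="the cat",
)
```

Raising `lam` favors coverage of the query; lowering it favors diversity among the selected sentences.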
In this paper, the survival function has been estimated for patients with lung cancer using different parametric estimation methods, based on a sample of complete real data describing the survival period of patients with lung cancer, from the diagnosis of the disease or the admission of patients to a hospital, over a period of two years (from 2012 to the end of 2013). Comparisons between the mentioned estimation methods were performed using the mean squared error as a statistical indicator, concluding that estimating the survival function for lung cancer using the pre-test single-stage shrinkage estimator method was the best.
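As a minimal illustration of parametric survival-function estimation, the sketch below fits an exponential lifetime model by maximum likelihood; the survival times are invented, and the exponential model stands in for whichever parametric families the paper compared.

```python
import math

# Toy survival times in months (complete, uncensored data for simplicity).
times = [3, 6, 8, 12, 18, 24]

# MLE of the exponential rate: n / sum of observed times.
lam = len(times) / sum(times)

def survival(t):
    """Estimated S(t) = P(T > t) under the exponential model."""
    return math.exp(-lam * t)
```

A method comparison like the paper's would fit several such estimators and rank them by mean squared error against an empirical survival curve.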
Hepatitis is one of the diseases whose incidence has risen markedly in recent years. Hepatitis causes inflammation that destroys liver cells, and it occurs as a result of viruses, bacteria, blood transfusions, and other causes. There are five types of hepatitis viruses (A, B, C, D, E), classified according to their severity, and the disease varies by type. Accurate and early diagnosis is the best way to prevent the disease, as it allows infected people to take preventive steps so that they do not transmit the infection to other people, and diagnosis using artificial intelligence gives an accurate and rapid diagnostic result. The analytical method applied to the data relied on the radial basis network to diagnose the
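A radial-basis-function network of the kind mentioned above can be sketched in a few lines: Gaussian units respond to the distance between an input and learned centres, and a weighted sum of those responses produces the diagnostic score. The centres, widths, weights, and two-feature input here are untrained stand-ins, not values from the study.

```python
import numpy as np

# Stand-in trained parameters: two Gaussian basis units over 2-D inputs.
centers = np.array([[0.0, 0.0], [1.0, 1.0]])
widths = np.array([0.5, 0.5])
weights = np.array([-1.0, 1.0])  # single output unit

def rbf_output(x):
    """Weighted sum of Gaussian basis responses to feature vector x."""
    d2 = np.sum((centers - x) ** 2, axis=1)      # squared distances
    phi = np.exp(-d2 / (2 * widths ** 2))        # Gaussian activations
    return float(weights @ phi)

# An input near the second centre yields a positive (say, "positive
# diagnosis") score; near the first centre, a negative one.
score = rbf_output(np.array([0.9, 0.9]))
```

Training such a network amounts to choosing the centres and widths (often by clustering) and then solving for the output weights.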
In this paper, we design a fuzzy neural network to solve a fuzzy singularly perturbed Volterra integro-differential equation by using a high-performance training algorithm, Levenberg-Marquardt (TrainLM), with the hyperbolic tangent activation function for the hidden units. A fuzzy trial solution to the fuzzy singularly perturbed Volterra integro-differential equation is written as a sum of two components. The first component satisfies the fuzzy conditions but does not contain any fuzzy adjustable parameters. The second component is a feed-forward fuzzy neural network with fuzzy adjustable parameters. The proposed method is compared with the analytical solutions. We find that the proposed method
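The two-component trial-solution construction can be illustrated with a crisp (non-fuzzy) simplification: a fixed part that satisfies the condition plus a network part multiplied by a factor that vanishes at the initial point, so the condition holds for any network parameters. The tiny one-layer "network" and the initial condition y(0) = 2 below are assumptions for the example.

```python
import numpy as np

# Untrained stand-in parameters for a one-hidden-layer network N(x).
rng = np.random.default_rng(1)
W = rng.normal(size=5)
V = rng.normal(size=5)
B = rng.normal(size=5)

def net(x):
    """One-hidden-layer network with tanh hidden units."""
    return float(V @ np.tanh(W * x + B))

def trial(x, y0=2.0):
    """Trial solution y_t(x) = y0 + x * N(x).

    The first term satisfies y_t(0) = y0 and has no adjustable
    parameters; the second term carries all trainable parameters but
    vanishes at x = 0, so the condition survives training."""
    return y0 + x * net(x)
```

Training then adjusts W, V, B so that `trial` satisfies the differential equation at collocation points, without ever violating the initial condition.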
The main aim of image compression is to reduce the image size to make it suitable for transmission and storage; therefore, many methods have appeared to compress images, one of which is the Multilayer Perceptron (MLP), an artificial neural network method based on the back-propagation algorithm. If this algorithm depends only on the number of neurons in the hidden layer, that alone will not be enough to reach the desired results; we must also take into consideration the standards on which the compression process depends to get the best results. In our research, we trained a group of TIFF images of size 256×256 and compressed them by using the MLP for each
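The bottleneck idea behind MLP compression can be sketched as an autoencoder-style forward pass: an 8×8 block is flattened to 64 inputs, squeezed through a smaller hidden layer, and reconstructed at the output. The block size, 16-unit bottleneck, and random weights below are illustrative stand-ins for trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_HIDDEN = 64, 16  # 4:1 compression of each 8x8 block

# Random stand-ins for weights that back-propagation would learn.
W_enc = rng.normal(scale=0.1, size=(N_HIDDEN, N_IN))
W_dec = rng.normal(scale=0.1, size=(N_IN, N_HIDDEN))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def compress(block):
    """Encode an 8x8 block into its 16 hidden-layer activations."""
    return sigmoid(W_enc @ block.reshape(N_IN))

def decompress(code):
    """Reconstruct the 8x8 block from its 16-value code."""
    return sigmoid(W_dec @ code).reshape(8, 8)

block = rng.random((8, 8))
code = compress(block)
recon = decompress(code)
```

Storing the 16 hidden activations instead of the 64 pixels is what yields the compression; reconstruction quality depends on how well the weights are trained.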
In this paper, two local search algorithms (genetic algorithm and particle swarm optimization) are used to schedule a number of products (n jobs) on a single machine to minimize a multi-objective function comprising total completion time, total tardiness, total earliness, and total late work. A branch and bound (BAB) method is used for comparing the results for n jobs ranging from 5 to 18. The results show that the two algorithms found optimal and near-optimal solutions in appropriate times.
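The multi-objective cost of a candidate sequence can be evaluated as below; the processing times and due dates are toy values, and the exhaustive search stands in for the BAB baseline (GA and PSO would instead search the permutation space heuristically).

```python
from itertools import permutations

def schedule_cost(jobs):
    """jobs: tuples (processing_time, due_date) in processing order.
    Returns total completion time + total tardiness + total earliness
    + total late work for the single-machine sequence."""
    t = 0       # running completion time
    total = 0
    for p, d in jobs:
        t += p
        tardiness = max(0, t - d)
        earliness = max(0, d - t)
        late_work = min(p, tardiness)  # portion of the job done late
        total += t + tardiness + earliness + late_work
    return total

def best_sequence(jobs):
    """Exhaustive baseline (the role BAB plays in the paper)."""
    return min(permutations(jobs), key=schedule_cost)
```

For n up to around 10, the exhaustive baseline is feasible; beyond that, heuristics such as GA and PSO become necessary, which is exactly the regime (n up to 18) the paper studies.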
In this research, kernel estimators (nonparametric density estimators) were relied upon in estimating the two-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve as it approaches the real curve. The goal of using the kernel estimator is to modify the observations so that we can obtain estimators with characteristics close to the properties of the real parameters. Based on medical data for patients with chro
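The Nadaraya-Watson estimator named above is a kernel-weighted local average, sketched here with a Gaussian kernel; the sample data and the bandwidth value are illustrative, not from the study.

```python
import numpy as np

def nadaraya_watson(x0, x, y, bandwidth):
    """Estimate E[y | x = x0] as a Gaussian-kernel-weighted mean:
    m(x0) = sum_i K((x_i - x0)/h) y_i / sum_i K((x_i - x0)/h)."""
    w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)
    return float(np.sum(w * y) / np.sum(w))

# Toy sample roughly following y = x^2.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 1.0, 4.0, 9.0])

yhat = nadaraya_watson(1.5, x, y, bandwidth=0.5)
```

The bandwidth plays the role of λ in the abstract: small values track the sample closely (rough curve), large values oversmooth; cross-validation picks the value minimizing out-of-sample prediction error.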
In this work, lactone (1) was prepared from the reaction of p-nitrophenylhydrazine with ethyl acetoacetate, which upon treatment with benzoyl chloride afforded the lactam (2). The reaction of (2) with 2-aminophenol produced a new Schiff base (L) in good yield. Complexes of V(IV), Zr(IV), Rh(III), Pd(II), Cd(II), and Hg(II) with the new Schiff base (L) have been prepared. The compounds (1, 2) were characterized by FT-IR and UV spectroscopy, as was the ligand (L), with additional elemental analysis (C.H.N) and 1H-NMR. The prepared complexes were identified and their structural geometries were suggested by using elemental analysis (C.H.N), the flame atomic absorption technique, and FT-IR and UV-Vis spectroscopy, in addition