As Web 2.0 has evolved, people have become able to quickly convey their thoughts and opinions on various services and items, and these reviews offer a window into public perception. Aspect-based sentiment analysis (ABSA) takes a set of texts (e.g., product reviews or online reviews) and identifies the opinion target (aspect) within each review. Contemporary ABSA systems, such as those performing aspect categorization, rely predominantly on lexicons or manually labelled seeds incorporated into topic models, and on either handcrafted rules or pre-labelled clues for implicit aspect detection. Such approaches are restricted to a particular domain or language, i.e., they are domain-dependent. In this work, we first propose a novel unsupervised probabilistic model, Topic-seeds Latent Dirichlet Allocation (TSLDA), that leverages semantic regularities to articulate explicit aspect categories. Then, based on the articulated categories, a distributed vector representation is used to identify implicit aspects. The experimental results show that our approach outperforms baseline methods on data from different domains with minimal configuration. Specifically, on the RI measure, the proposed TSLDA outperformed multiple clustering and topic models by an average of 0.83% across diverse domain data, and by roughly 0.89% on the Precision metric for implicit aspect detection.
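The TSLDA model itself is not spelled out in the abstract; as a rough illustration of the underlying idea (an LDA-style topic model whose topics are read as aspect categories), a minimal sketch with gensim's standard LdaModel might look as follows. The toy reviews, the topic count, and the aspect labels are hypothetical, and plain LDA lacks TSLDA's seed mechanism.

```python
# Minimal sketch (not the authors' TSLDA): plain LDA over toy review
# snippets with gensim; TSLDA additionally incorporates topic seeds
# into the model, which plain LdaModel does not.
from gensim import corpora
from gensim.models import LdaModel

reviews = [
    "the battery life is great but the screen is dim",
    "fast delivery and friendly service",
    "screen resolution is sharp but battery drains quickly",
]
texts = [r.lower().split() for r in reviews]

dictionary = corpora.Dictionary(texts)
corpus = [dictionary.doc2bow(t) for t in texts]

# Two hypothetical aspect categories, e.g. "hardware" and "service".
lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2,
               passes=50, random_state=0)

for topic_id, terms in lda.show_topics(num_topics=2, num_words=4,
                                       formatted=False):
    print(topic_id, [word for word, _ in terms])
```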
This paper investigates the capacitated vehicle routing problem (CVRP), one of the many problems for which no exact method scales well. Over the last few decades, numerous researchers have studied it with a variety of strategies and techniques. For all of these approaches, finding the least-cost solution remains highly complicated; nevertheless, they produce approximate solutions whose quality varies with the search space. In this work, tabu search (TS) is used to tackle the problem, as it has proven capable of solving many complicated combinatorial problems. The algorithm has been adapted to the problem studied here, where its methodology is not quite the same as the normal a
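The paper's specific adaptation is cut off above; as a generic illustration of how tabu search explores a routing neighbourhood, a minimal sketch follows. The cost function, pairwise-swap moves, and fixed tabu tenure are illustrative assumptions, not the paper's design.

```python
# Generic tabu-search skeleton (not the paper's adaptation): minimizes
# a single route's travel cost with pairwise-swap moves and a
# fixed-length tabu list of recently applied moves.
import itertools

def route_cost(route, dist):
    # Cost of the closed tour route[0] -> ... -> route[-1] -> route[0].
    return sum(dist[a][b] for a, b in zip(route, route[1:] + route[:1]))

def tabu_search(route, dist, iters=200, tenure=10):
    best = list(route)
    current = list(route)
    tabu = []                                  # recently applied (i, j) swaps
    for _ in range(iters):
        candidates = []
        for i, j in itertools.combinations(range(len(current)), 2):
            if (i, j) in tabu:
                continue
            neighbour = list(current)
            neighbour[i], neighbour[j] = neighbour[j], neighbour[i]
            candidates.append((route_cost(neighbour, dist), (i, j), neighbour))
        if not candidates:
            break
        cost, move, current = min(candidates)  # best admissible neighbour
        tabu.append(move)
        if len(tabu) > tenure:
            tabu.pop(0)                        # expire the oldest tabu move
        if cost < route_cost(best, dist):
            best = list(current)
    return best
```

In a full CVRP solver the neighbourhood would also respect vehicle capacities and span multiple routes; the sketch only conveys the tabu-list mechanics.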
This work presents the simulation of a Low-Density Parity-Check (LDPC) coding scheme with a multiuser Multi-Carrier Code Division Multiple Access (MC-CDMA) system over an Additive White Gaussian Noise (AWGN) channel and multipath fading channels. The decoding technique used in the simulation was iterative decoding, since it gives maximum efficiency within ten iterations. The modulation schemes used are Phase Shift Keying (BPSK, QPSK and 16-PSK), along with Orthogonal Frequency Division Multiplexing (OFDM). Twelve pilot carriers were used in the estimator to compensate for the channel effect. The channel model used is a Long Term Evolution (LTE) channel per Technical Specification TS 25.101 v2.10 with 5 MHz bandwidth, including the chan
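As a toy illustration of the simulation setting (not the paper's full LDPC/MC-CDMA/OFDM chain), a minimal BPSK-over-AWGN bit-error-rate sweep in numpy, with an assumed Eb/N0 range:

```python
# Toy BPSK-over-AWGN BER simulation; the paper's full chain adds LDPC
# coding, MC-CDMA spreading, OFDM and pilot-based channel estimation.
import numpy as np

rng = np.random.default_rng(0)
n_bits = 100_000

for ebn0_db in (0, 2, 4, 6, 8):
    bits = rng.integers(0, 2, n_bits)
    symbols = 1 - 2 * bits                    # BPSK mapping: 0 -> +1, 1 -> -1
    ebn0 = 10 ** (ebn0_db / 10)
    noise_std = np.sqrt(1 / (2 * ebn0))       # unit symbol energy assumed
    received = symbols + noise_std * rng.normal(size=n_bits)
    decoded = (received < 0).astype(int)      # hard decision
    ber = np.mean(decoded != bits)
    print(f"Eb/N0 = {ebn0_db} dB -> BER = {ber:.4f}")
```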
Coronavirus disease (Covid-19) has threatened human life, so it has become necessary to study this disease from many aspects. This study aims to characterize the interdependence among Middle East countries and their influence on one another by representing these countries as vertices of a proposed graph and measuring the distance between them using an ultrametric spanning tree. In this paper, a network of countries in the Middle East is thus described using the tools of graph theory.
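The ultrametric spanning tree construction is closely related to the minimum spanning tree: taking the largest edge weight along the unique MST path between two vertices yields an ultrametric distance. A minimal sketch with networkx, using toy vertex labels and weights rather than the paper's data:

```python
# MST-based ultrametric distance sketch ("countries" A-D and weights are
# toy values, not the paper's data): the largest edge on the unique MST
# path between two vertices defines the ultrametric distance.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 3.0), ("A", "C", 5.0),
    ("B", "C", 2.0), ("B", "D", 4.0), ("C", "D", 1.0),
])

mst = nx.minimum_spanning_tree(G)

def ultrametric(u, v):
    path = nx.shortest_path(mst, u, v)        # unique path in a tree
    return max(mst[a][b]["weight"] for a, b in zip(path, path[1:]))

print(ultrametric("A", "D"))   # 3.0: max edge on the path A-B-C-D
```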
In this paper, a decoder for a binary BCH code of length n = 127 bits with multiple-error-correction capability is implemented on a PIC microcontroller; results are presented for correcting up to 13 errors. The Berlekamp-Massey decoding algorithm was chosen for its efficiency. The PIC18F45K22 microcontroller was chosen for the implementation and programmed in assembly language to achieve the highest performance. This makes the BCH decoder implementable as a low-cost module that can be used as part of larger systems. The performance evaluation is presented in terms of the total number of instructions and the bit rate.
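The paper's decoder runs in PIC assembly over an extension field; as a language-neutral illustration of the Berlekamp-Massey idea, here is a short sketch of the binary (GF(2)) special case, which synthesizes the shortest LFSR generating a bit sequence. The full BCH decoder applies the same recurrence-fitting step to syndromes in GF(2^m).

```python
# Berlekamp-Massey over GF(2): returns the shortest LFSR (connection
# polynomial and length L) that generates the given bit sequence.
def berlekamp_massey(s):
    n = len(s)
    c = [1] + [0] * n          # current connection polynomial C(x)
    b = [1] + [0] * n          # copy of C(x) before the last length change
    L, m = 0, 1                # LFSR length, steps since last length change
    for i in range(n):
        # Discrepancy: does the current LFSR predict s[i]?
        d = s[i]
        for j in range(1, L + 1):
            d ^= c[j] & s[i - j]
        if d == 0:
            m += 1
        elif 2 * L <= i:
            t = c[:]
            for j in range(n + 1 - m):
                c[j + m] ^= b[j]           # C(x) += x^m * B(x)
            L, b, m = i + 1 - L, t, 1
        else:
            for j in range(n + 1 - m):
                c[j + m] ^= b[j]
            m += 1
    return c[:L + 1], L

poly, L = berlekamp_massey([0, 0, 1, 1, 0, 1, 1, 1, 0])
print(L, poly)
```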
This paper discusses computer neologisms in the Russian language. The study of computer terminology is continually complicated by the spread of computer technology into all walks of life. The study identifies ways of word formation, the origins of computer terms, and the possibilities of their usage in Russian. The Internet is considered a worldwide communication tool used extensively by students, housewives and professionals alike. It is a heterogeneous environment consisting of various hardware and software configurations that need to be set up to support the languages used. The development of Internet content and services is essential for expanding Internet usage. Some of the
Interest in developing accurate automatic facial emotion recognition methodologies is growing rapidly, and the topic remains an ever-growing research field across computer vision, artificial intelligence and automation. However, building an automated system that matches the human ability to recognize facial emotion is challenging because of the lack of an effective facial feature descriptor and the difficulty of choosing a proper classification method. In this paper, a geometric feature vector is proposed. For classification, three different types of methods are tested: statistical, artificial neural network (NN) and Support Vector Machine (SVM). A modified K-Means clustering algorithm
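The abstract is cut off before the feature details; as a generic illustration of the classification stage only, an SVM sketch with scikit-learn follows. The features and labels are synthetic stand-ins, not the paper's geometric descriptor.

```python
# Generic SVM classification sketch (synthetic data, not the paper's
# geometric facial features): train/test split, RBF-kernel SVM, accuracy.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Hypothetical: 200 samples of 10 geometric measurements per face,
# labelled with one of 3 emotion classes.
X = rng.normal(size=(200, 10))
y = rng.integers(0, 3, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
```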
Hemorrhagic insult is a major source of morbidity and mortality in both adults and newborn babies in developed countries. The mechanisms underlying the non-traumatic rupture of cerebral vessels are not fully clear, but there is strong evidence that stress, which is associated with an increase in arterial blood pressure, plays a crucial role in the development of acute intracranial hemorrhage (ICH), and that alterations in cerebral blood flow (CBF) may contribute to the pathogenesis of ICH. The problem is that there are no effective diagnostic methods that allow a prognosis of the risk of developing ICH. Therefore, quantitative assessment of CBF may significantly advance the underst
Diverting river flow during construction of a main dam involves the construction of cofferdams and of tunnels, channels or other temporary passages. Diversion channels are commonly used in wide valleys where the high flow makes tunnels or culverts uneconomic. The diversion works must form part of the overall project design, since they have a major impact on its cost, as well as on the design, construction program and overall cost of the permanent works. Construction costs consist of excavation, lining of the channel, and construction of the upstream and downstream cofferdams. An optimization model was applied to obtain the optimal channel cross-section, height of the upstream cofferdam, and height of the downstream cofferdam with minimum construction cost.
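As a generic illustration of such a cost-minimization setup (the cost terms, coefficients and bounds below are hypothetical placeholders, not the paper's hydraulic model), a sketch with scipy.optimize:

```python
# Hypothetical cost-minimization sketch (not the paper's model): decision
# variables are channel width and the two cofferdam heights; all cost
# coefficients and bounds are illustrative placeholders.
from scipy.optimize import minimize

def construction_cost(x):
    width, h_up, h_down = x
    excavation = 120.0 * width            # assumed cost per unit width
    lining = 45.0 * (width + 2 * h_up)    # assumed wetted-perimeter proxy
    cofferdams = 80.0 * h_up**2 + 60.0 * h_down**2
    return excavation + lining + cofferdams

# Hypothetical lower bounds, e.g. from required flow capacity/freeboard.
bounds = [(5.0, 50.0), (2.0, 15.0), (1.0, 10.0)]

res = minimize(construction_cost, x0=[10.0, 5.0, 3.0], bounds=bounds)
print(res.x, res.fun)
```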
Dust is a common cause of health risks and also contributes to climate change, one of the most threatening problems facing humans. In the recent decade, climate change in Iraq, typified by increased droughts and desertification, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using machine learning, within a supervised-learning framework of five regression algorithms. It was assessed using a dataset from the Iraqi Meteorological Organization and Seismology (IMOS). Simulation results show that the gradient boosting regressor (GBR) has a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT), where the mean square error is 8.965, c
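As a minimal illustration of the reported setup (synthetic features in place of the IMOS data), a gradient-boosting regression sketch with scikit-learn:

```python
# Gradient-boosting regression sketch on synthetic data (a stand-in for
# the IMOS meteorological features): fit, predict, report MSE.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))        # e.g. wind, humidity, temperature, ...
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=500)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

gbr = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("MSE:", mean_squared_error(y_test, gbr.predict(X_test)))
```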
A great number of image-processing systems are being used and developed on a daily basis. Those systems rely on basic operations such as detecting regions of interest, matching those regions, and describing their properties. These operations play a significant role in the decision making required by subsequent operations, depending on the assigned task. To accomplish these tasks, various algorithms have been introduced over the years. One of the most popular is the Scale Invariant Feature Transform (SIFT). The efficiency of this algorithm lies in its performance in detection and property description, and that is due to the fact that
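As a minimal illustration of the detect-describe-match pipeline around SIFT (the image paths are placeholders), a sketch with OpenCV:

```python
# SIFT detect/describe/match sketch with OpenCV (image paths are
# placeholders): keypoints, descriptors, brute-force matching, ratio test.
import cv2

img1 = cv2.imread("scene_a.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("scene_b.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des1, des2, k=2)

# Lowe's ratio test: keep matches clearly better than the runner-up.
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(kp1)} / {len(kp2)} keypoints, {len(good)} good matches")
```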