This article provides a bibliometric analysis of intellectual capital research published in the Scopus database from 1956 to 2020, tracing the development of scientific activity in the field and paving the way for future studies by shedding light on existing gaps. The analysis covers 638 intellectual capital-related papers indexed in Scopus over more than six decades, drawing on a bibliometric analysis performed with VOSviewer. The paper highlights the mainstream of current research in the intellectual capital field by presenting a detailed bibliometric account of the trend and development of intellectual capital research over this period, including journals, authors, countries, institutes, co-occurrence analysis, and author keywords. The findings imply that intellectual capital researchers do not draw on relevant theories and findings from studies outside their own clusters. Another result is that developing nations remain underexplored in intellectual capital research, owing to a lack of representation and a shortage of suitable investigators. Finally, the data analysis identifies a number of potential research issues concerning intellectual capital development that can serve as raw material for future research. In addition, the study provides a framework that firms can use to build and implement intellectual capital development plans.
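As an illustration of the kind of keyword co-occurrence analysis that underlies a VOSviewer map, the short Python sketch below counts how often pairs of author keywords appear together in a Scopus CSV export. The column name "Author Keywords", the semicolon separator, and the file name are assumptions based on the usual Scopus export format, not details taken from the study.

```python
# Minimal sketch (not the authors' pipeline): counting author-keyword
# co-occurrences from a Scopus CSV export, the same kind of matrix that
# VOSviewer visualizes as a keyword map.
from collections import Counter
from itertools import combinations
import csv

def keyword_cooccurrence(path):
    pairs = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # "Author Keywords" and the ";" separator follow the usual Scopus export format.
            raw = row.get("Author Keywords") or ""
            kws = {k.strip().lower() for k in raw.split(";") if k.strip()}
            for a, b in combinations(sorted(kws), 2):
                pairs[(a, b)] += 1
    return pairs

# Example: the ten most frequent keyword pairs across the corpus
# (file name is a placeholder).
# top10 = keyword_cooccurrence("scopus_export.csv").most_common(10)
```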
Multilayer reservoirs are currently modeled as a single-zone system by averaging the reservoir parameters associated with each zone. However, this type of modeling is rarely accurate, because a single-zone system does not account for the fact that each zone's pressure depletes independently. The pressure drop in each zone affects the total output and can result in cross-flow between zones and the premature depletion of one of them. Understanding reservoir performance therefore requires a precise estimate of each layer's permeability and skin factor. Multilayer Transient Analysis is a well-testing technique designed to determine formation properties in more than one layer, and its effectiveness over the past two decades has been …
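To make concrete why layer-by-layer estimates of permeability and skin matter, the following Python sketch allocates production among layers using a simplified steady-state radial Darcy inflow equation with a skin term. This is only an illustration of differential depletion against a common bottom-hole pressure, not the Multilayer Transient Analysis procedure itself, and all layer values below are made up.

```python
import math

# Simplified steady-state radial-inflow illustration: each layer produces
# against the same bottom-hole pressure, so its rate depends on its own
# permeability k, thickness h, layer pressure, and skin s.
# All quantities are in consistent SI units; the values are hypothetical.

def layer_rate(k, h, p_layer, p_wf, mu, B, re, rw, skin):
    """Darcy radial inflow with a skin term for one layer (m^3/s)."""
    return 2 * math.pi * k * h * (p_layer - p_wf) / (mu * B * (math.log(re / rw) + skin))

layers = [  # (k [m^2], h [m], p_layer [Pa], skin)
    (150e-15, 8.0, 25.0e6, 2.0),
    (40e-15, 12.0, 27.5e6, 0.5),
]
p_wf, mu, B, re, rw = 20.0e6, 1.0e-3, 1.2, 300.0, 0.1

rates = [layer_rate(k, h, p, p_wf, mu, B, re, rw, s) for k, h, p, s in layers]
total = sum(rates)
for i, q in enumerate(rates, 1):
    print(f"layer {i}: {q:.5f} m^3/s ({100 * q / total:.1f}% of total)")
```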
A new modified differential evolution algorithm, DE-BEA, is proposed to improve the reliability of the standard DE/current-to-rand/1/bin scheme by implementing a new mutation operator inspired by the bacterial evolutionary algorithm (BEA). The crossover and selection schemes of the DE method are also modified to fit the new DE-BEA mechanism. The new scheme diversifies the population by applying a segment-based operator to all individuals: multiple copies (clones) are generated from each individual one by one, and the BEA segment-wise mechanism is applied to them. These new steps are embedded in the DE/current-to-rand/1/bin scheme. The performance of the new algorithm has been compared with several DE variants over eighteen benchmark functions, including sever…
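For reference, the sketch below implements one generation of the standard DE/current-to-rand/1/bin scheme that DE-BEA takes as its starting point; the BEA-inspired segment-wise cloning described above is the paper's contribution and is not reproduced here. The parameter values (F, K, CR) and the sphere test function are illustrative assumptions.

```python
import numpy as np

# Baseline DE/current-to-rand/1/bin generation step, shown only as the
# starting point that DE-BEA modifies (the BEA segment-wise cloning is
# not reproduced here).

def de_current_to_rand_1_bin(pop, fitness, objective, F=0.5, K=0.5, CR=0.9, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    n, d = pop.shape
    new_pop, new_fit = pop.copy(), fitness.copy()
    for i in range(n):
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        # current-to-rand/1 mutation
        v = pop[i] + K * (pop[r1] - pop[i]) + F * (pop[r2] - pop[r3])
        # binomial crossover with one guaranteed dimension from the mutant
        mask = rng.random(d) < CR
        mask[rng.integers(d)] = True
        trial = np.where(mask, v, pop[i])
        # greedy selection: keep the trial only if it is no worse
        f_trial = objective(trial)
        if f_trial <= fitness[i]:
            new_pop[i], new_fit[i] = trial, f_trial
    return new_pop, new_fit

# Example usage on the sphere function (minimization).
# rng = np.random.default_rng(0)
# pop = rng.uniform(-5, 5, size=(30, 10))
# sphere = lambda x: float(np.sum(x ** 2))
# fit = np.array([sphere(x) for x in pop])
# for _ in range(100):
#     pop, fit = de_current_to_rand_1_bin(pop, fit, sphere, rng=rng)
```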
The main problem of the current study is applying critical discourse analysis to examine the textual, discoursal, and social features of reduplication in selected English newspaper headlines. The main aim is to analyze the linguistic features of reduplication by adopting Fairclough's three-dimensional model (2001). The study sets forth the following hypotheses: (1) English newspaper headlines comprise various textual, discoursal, and social features; (2) the chosen model of analysis is best suited for the current study. To achieve the aims and verify the hypotheses, a critical discourse analysis approach is used, represented by Fairclough's socio-cultural approach (2001). The present study has examined the use of …
With the increasing rates of cancer worldwide, a great deal of scientific discourse is devoted to arguments and statements about cancer and its causes. Scientists from different fields try to seize any available chance to warn people of the risk of consuming and being exposed to carcinogens that have, unfortunately, become an essential part of modern life. The present paper investigates the proximization strategy through which scientists construct carcinogen risk in order to enhance people's preventive actions against these carcinogens. The paper targets the construction that depends on producing a conflict between the values of the people themselves and the contrasting values assigned to carcinogens. To achieve this aim, Cap's (2…
In recent years, the migration of computational workloads to computational clouds has attracted intruders who target and exploit cloud networks both internally and externally. Investigating such hazardous network attacks in the cloud requires comprehensive network forensics methods (NFMs) to identify the source of the attack. However, cloud computing lacks NFMs that can identify network attacks which affect various cloud resources by propagating through cloud networks. This paper is motivated by the need to assess the applicability of current network forensics methods (C-NFMs) to cloud networks in cloud computing. The applicability is evaluated in terms of strengths, weaknesses, opportunities, and threats (SWOT) to provide an outlook on the cloud network. …
The present study investigates the use of intensifiers as linguistic devices employed by Charles Dickens in Hard Times. For ease of analysis, the data are obtained through rigorous observation of spontaneously occurring intensifiers in the text. The study aims at exploring the pragmatic functions and aesthetic impact of using intensifiers in Hard Times. The study is mainly descriptive-analytical and is based on analyzing and interpreting the use of intensifiers in terms of Holmes's (1984) and Cacchiani's (2009) models. The findings show that the novelist overuses intensifiers, to the extent that 280 intensifiers are used in the text. Of these intensifiers, 218 are undistinguished …
The research addresses the problem of the performative treatments of the concept of time in Impressionism and Superrealism: the concept of time and how it is presented in the artwork. The research comprises four sections. The first presents the general framework of the research, identifies the research problem and the need for it, and indicates its importance. The research objectives are then set out (to reveal the performative treatments of the concept of time in the works of both Impressionism and Superrealism, and to compare them in order to reveal similarities and differences), followed by establishing the three boundaries of the research (objective, temporal, and spatial) and defining the terms related to the title. It then provides the theore…
In this paper, the all-possible-regressions procedure as well as the stepwise regression procedure were applied to select the best regression equation explaining the effect of human capital, represented by different levels of human cadres, on the productivity of the processing industries sector in Iraq, using time-series data covering a period of 21 years. The statistical program SPSS was used to perform the required calculations.
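As a rough sketch of the stepwise procedure described above (the study itself used SPSS), the Python code below performs forward stepwise selection by adjusted R-squared with statsmodels; the predictor matrix X and response y are placeholders for the human-capital variables and the productivity series, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative forward stepwise selection by adjusted R^2. X is a pandas
# DataFrame of candidate predictors and y the response series; both are
# placeholders, not the study's data.

def forward_stepwise(X, y):
    selected, remaining = [], list(X.columns)
    best_adj_r2 = -np.inf
    while remaining:
        # Fit a model for each candidate added to the current selection.
        scores = []
        for col in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [col]])).fit()
            scores.append((model.rsquared_adj, col))
        adj_r2, best_col = max(scores)
        if adj_r2 <= best_adj_r2:  # stop when no candidate improves the fit
            break
        best_adj_r2 = adj_r2
        selected.append(best_col)
        remaining.remove(best_col)
    return selected, best_adj_r2
```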
Background: Odontogenic tumors are a diverse group of lesions with a variety of clinical behaviors and histopathologic subtypes, ranging from hamartomatous and benign to malignant. The study aimed to examine the clinical and pathological features of odontogenic tumors in Baghdad over the last 11 years (2011–2021). Materials and Methods: This retrospective study analyzed all formalin-fixed, paraffin-embedded tissue blocks of patients diagnosed with an odontogenic tumor that were retrieved from the archives of a teaching hospital/College of Dentistry at Baghdad University, Iraq, between 2011 and 2021. The diagnosis of each case was confirmed by examination of the hematoxylin- and eosin-stained sections by two expert pathologists. Data from pati…