Modeling data acquisition systems (DASs) can support the vehicle industry in the development and design of sophisticated driver assistance systems. Modeling DASs on the basis of multiple criteria is considered a multicriteria decision-making (MCDM) problem. Although literature reviews have provided models for DASs, the issue of imprecise, unclear, and ambiguous information remains unresolved. Compared with existing MCDM methods, the fuzzy decision by opinion score method II (FDOSM II) and fuzzy weighted with zero inconsistency II (FWZIC II) have demonstrated robustness in modeling DASs. However, these methods are implemented in an intuitionistic fuzzy set environment, which restricts the ability of experts to assign membership and nonmembership degrees freely, limits the efficient simulation of real-world ambiguity, covers only a narrow fuzzy number space, and cannot handle interval data. Thus, this study used a more efficient fuzzy environment, the interval-valued linear Diophantine fuzzy set (IVLDF), combining it with FWZIC II for criterion weighting and with FDOSM II for DAS modeling, to address these issues and support the industrial community in the design and implementation of advanced driver assistance systems in vehicles. The proposed methodology comprises two consecutive phases. The first phase involves adapting a decision matrix that intersects DAS alternatives and criteria. The second phase (the development phase) proposes a decision modeling approach based on the formulation of IVLD-FWZIC II and IVLD-FDOSM II to model DASs. A total of 14 DASs were modeled on the basis of 15 DAS criteria, comprising seven subcriteria for "comprehensive complexity assessment" and eight subcriteria for "design and implementation," which had a remarkable effect on DAS design when implemented by industrial communities.
Systematic ranking, sensitivity analysis, and modeling checklists were conducted to validate the results. The modeling results exhibited systematic ranking, as indicated by the high correlations across all described scenarios of changing criterion weight values, supporting the most important research points and adding value to the process of modeling the most desirable DAS.
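The abstract-level idea of ranking alternatives against a weighted decision matrix can be illustrated with a classical simple additive weighting (SAW) step. This is only a hypothetical sketch with invented scores and weights; the study's IVLD-FWZIC II and IVLD-FDOSM II methods operate on interval-valued linear Diophantine fuzzy numbers rather than the crisp values used here.

```python
# Illustrative simple additive weighting (SAW) over a decision matrix.
# Alternatives, scores, and weights are hypothetical; the paper's methods
# use interval-valued linear Diophantine fuzzy numbers instead.

alternatives = ["DAS-A", "DAS-B", "DAS-C"]
# Rows: alternatives; columns: normalized criterion scores in [0, 1].
matrix = [
    [0.8, 0.6, 0.9],
    [0.7, 0.9, 0.5],
    [0.6, 0.7, 0.8],
]
weights = [0.5, 0.3, 0.2]  # criterion weights, summing to 1 (hypothetical)

# Score each alternative as the weighted sum of its criterion values.
scores = {a: sum(w * x for w, x in zip(weights, row))
          for a, row in zip(alternatives, matrix)}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # best alternative first
```

In the actual FDOSM-style methods, the crisp matrix above would be replaced by expert opinions expressed as fuzzy numbers, and the aggregation step would follow the fuzzy arithmetic of the chosen environment.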
The main objective of this study is to develop predictive models, using SPSS software (version 18), for Marshall Test results of asphalt mixtures compacted by Hammer, Gyratory, and Roller compaction. A bulk density of 2.351 g/cc at an optimum asphalt content (OAC) of 4.7% was obtained as a benchmark using the Marshall compactor with 75 blows as the laboratory compactive effort. The same density was achieved with the Roller and Gyratory compactors using their respective mix design methods.
A total of 75 specimens for the Marshall, Gyratory, and Roller compactors were prepared at the OAC of 4.7% and at asphalt contents 0.5% above and below the optimum value. All specimens were subjected to the Marshall Test. Mathematical model
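The predictive-modeling step can be sketched as an ordinary least-squares fit of a Marshall property against asphalt content. The study used SPSS; the data points below are invented purely for illustration of the fitting procedure.

```python
# Hypothetical sketch: fitting a simple linear model to Marshall Test
# results by ordinary least squares, analogous in spirit to the SPSS
# regression models in the study. All data points are invented.

asphalt_content = [4.2, 4.7, 5.2]        # % (around the 4.7% OAC)
bulk_density    = [2.330, 2.351, 2.340]  # g/cc (hypothetical readings)

n = len(asphalt_content)
mean_x = sum(asphalt_content) / n
mean_y = sum(bulk_density) / n

# Slope = covariance(x, y) / variance(x); intercept from the means.
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(asphalt_content, bulk_density)) \
        / sum((x - mean_x) ** 2 for x in asphalt_content)
intercept = mean_y - slope * mean_x

def predict(x):
    """Predicted bulk density (g/cc) at asphalt content x (%)."""
    return intercept + slope * x
```

A real model for this data would likely be quadratic (density peaks near the OAC); the linear form is kept only to show the least-squares mechanics.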
Building Information Modeling (BIM) and Lean Construction (LC) are two quickly growing applied research areas in construction management. This study focuses on identifying the most essential benefits, and analyzing the most significant constraints, that construction players face as they attempt to combine BIM and LC in the Iraqi construction sector. Experts assessed 30 benefits and 28 constraints identified from the previous literature, with responses collected through a two-round Delphi survey. Expert consensus analysis was used to elaborate and validate the responses after descriptive statistical checks had been applied for data processing.
According to the study's findings, the benefits include ensuring the most ef
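One common way to quantify expert agreement across a Delphi round is Kendall's coefficient of concordance (W). The abstract does not name the consensus statistic used, so the following is a hedged sketch with hypothetical expert rankings.

```python
# Hedged sketch of a Delphi consensus check using Kendall's coefficient
# of concordance (W). The rankings are hypothetical, not survey data.

# Each inner list: one expert's ranks for 4 items (1 = most important).
rankings = [
    [1, 2, 3, 4],
    [1, 3, 2, 4],
    [2, 1, 3, 4],
]
m = len(rankings)      # number of experts
n = len(rankings[0])   # number of ranked items

# Sum of ranks per item, deviation from the mean rank sum, then W.
rank_sums = [sum(r[i] for r in rankings) for i in range(n)]
mean_rank = m * (n + 1) / 2
s = sum((rs - mean_rank) ** 2 for rs in rank_sums)
w = 12 * s / (m ** 2 * (n ** 3 - n))   # W in [0, 1]; 1 = full agreement
print(round(w, 3))
```

Values of W near 1 indicate strong consensus; a second Delphi round is typically run when W is low.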
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an unprecedented increase in the amount of data that business organizations handle. The sheer volume of data available through the internet is a problem for which many parties seek solutions: why is so much of it available there, seemingly at random? Many forecasts suggested that by 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, the field known as data mining emerged as a
Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of the sensitivity of certain improved recovery methods to permeability. However, the industry holds a huge body of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, which are conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of loose and poorly consolidated formations, or in cas
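Air-to-liquid permeability correlations of the kind described above are often fitted as a power law, k_liquid = a * k_air^b, estimated by least squares in log space. The sketch below uses invented core-analysis values and is not the study's actual correlation.

```python
import math

# Hypothetical sketch of fitting a power-law correlation
# k_liquid = a * k_air**b by least squares in log-log space.
# All permeability values below are invented for illustration.

k_air    = [10.0, 50.0, 200.0, 800.0]   # mD, air permeability
k_liquid = [6.0, 35.0, 150.0, 650.0]    # mD, matched liquid permeability

log_x = [math.log(k) for k in k_air]
log_y = [math.log(k) for k in k_liquid]
n = len(log_x)
mx, my = sum(log_x) / n, sum(log_y) / n

# Ordinary least squares on the log-transformed data.
b = sum((x - mx) * (y - my) for x, y in zip(log_x, log_y)) \
    / sum((x - mx) ** 2 for x in log_x)
a = math.exp(my - b * mx)

def liquid_perm(air_perm):
    """Estimate liquid permeability (mD) from air permeability (mD)."""
    return a * air_perm ** b
```

In practice such a fit would be built from matched special-core-analysis pairs and validated against held-out samples before being applied field-wide.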
An acceptable bit error rate can be maintained by adapting design parameters such as modulation, symbol rate, constellation size, and transmit power according to the channel state.
An estimate of HF propagation effects can be used to design an adaptive data transmission system over an HF link. The proposed system combines the well-known Automatic Link Establishment (ALE) with a variable-rate transmission system. The standard ALE is modified to suit the required goal of selecting the best carrier frequency (channel) for a given transmission. This selection is based on measuring SINAD (signal-plus-noise-plus-distortion to noise-plus-distortion ratio), RSL (received signal level), multipath phase distortion, and BER (bit error rate) fo
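The rate-adaptation idea can be sketched as a threshold rule: measure the channel quality and pick the densest constellation whose minimum requirement the channel still meets. The modes and thresholds below are assumed values for illustration, not the system's actual parameters.

```python
# Minimal sketch of rate adaptation by channel quality: choose the
# highest-rate modulation whose SNR threshold is met. The mode table
# and dB thresholds are assumed values, not measured requirements.

MODES = [                 # (name, bits per symbol, minimum SNR in dB)
    ("BPSK",   1,  4.0),
    ("QPSK",   2,  7.0),
    ("16-QAM", 4, 12.0),
    ("64-QAM", 6, 18.0),
]

def select_mode(snr_db):
    """Return the highest-rate mode the channel supports, else None."""
    usable = [m for m in MODES if snr_db >= m[2]]
    if not usable:
        return None       # channel too poor: defer or retry another channel
    return max(usable, key=lambda m: m[1])

print(select_mode(13.5))  # -> ('16-QAM', 4, 12.0)
```

In the described system the decision input would be the combined SINAD/RSL/BER measurements per candidate ALE channel rather than a single SNR figure.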
A non-stationary series is a persistent problem in statistical analysis, as some theoretical work has explained: the properties of statistical regression analysis are lost when non-stationary series are used, and the analysis yields a spurious slope for the relationship under consideration. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, together with dummy seasonal variables to remove seasonal effects; by converting the data to exponential or logarithmic form; or by applying the difference operator repeatedly, d times, in which case the series is said to be integrated of order d. The theoretical side of the research is organized in parts: in the first part, the research methodology ha
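The differencing step described above (a series integrated of order d becomes stationary after d rounds of differencing) can be shown on a toy trending series:

```python
# Toy illustration of differencing: a series with a linear trend is
# non-stationary; its first differences fluctuate around the trend
# slope, so the original series is integrated of order 1.

series = [2.0, 4.1, 6.0, 8.2, 10.1]   # roughly linear upward trend

def difference(xs, d=1):
    """Apply d rounds of first differencing to the sequence xs."""
    for _ in range(d):
        xs = [b - a for a, b in zip(xs, xs[1:])]
    return xs

diffed = difference(series)
print(diffed)   # values hover around the trend slope (about 2)
```

Each round of differencing shortens the series by one observation, which is why d is kept as small as the data allow.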
In recent years, data centre (DC) networks have improved their rapid data-exchange capabilities. Software-defined networking (SDN) has been introduced to change the conception of conventional networks by separating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly increasing number of applications, websites, data storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior behaviour when executing traffic load-balancing (LB) jobs. The LB function divides traffic-flow demands between the end devices to avoid link congestion. In short, SDN is proposed to manage more operative configur
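The load-balancing function can be sketched as a controller decision that steers each new flow onto the least-loaded candidate path. The paths and utilisations below are hypothetical; a real OpenFlow controller would additionally install the matching flow rules on the switches along the chosen path.

```python
# Hedged sketch of least-loaded path selection, the core of the LB idea:
# the controller tracks per-path utilisation and assigns each new flow
# to the currently least-loaded path. Paths and loads are hypothetical.

path_load = {"path-1": 0.62, "path-2": 0.35, "path-3": 0.80}  # utilisation

def route_flow(demand):
    """Pick the least-loaded path and account for the new flow's demand."""
    best = min(path_load, key=path_load.get)
    path_load[best] += demand
    return best

print(route_flow(0.10))   # -> 'path-2'
print(route_flow(0.10))   # 'path-2' again until its load catches up
```

Production schemes refine this with flow sizes, link capacities, and periodic statistics pulled from the switches, but the greedy least-loaded rule is the common baseline.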
The multi-focus image fusion method fuses more than one focused image to generate a single image with a more accurate description of the scene. The purpose of image fusion is to generate one image by combining information from many source images of the same scene. In this paper, a multi-focus image fusion method is proposed with a hybrid pixel level obtained in the spatial and transform domains. The proposed method is implemented on multi-focus source images in the YCbCr color space. As a first step, a two-level stationary wavelet transform is applied to the Y channel of the two source images. The fused Y channel is obtained using several fusion rule techniques. The Cb and Cr channels of the source images are fused using principal component analysis (PCA).
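The PCA fusion rule for the chrominance channels can be sketched as weighting the two source channels by the components of the leading eigenvector of their 2x2 covariance matrix. The 4-pixel channels below are toy values, not real image data.

```python
# Minimal sketch of PCA-based fusion of two chrominance channels:
# weight each source channel by the components of the leading
# eigenvector of the channels' 2x2 covariance matrix. Toy data only.

cb1 = [120.0, 130.0, 125.0, 140.0]   # Cb channel from source image 1 (toy)
cb2 = [118.0, 134.0, 120.0, 138.0]   # Cb channel from source image 2 (toy)

n = len(cb1)
m1, m2 = sum(cb1) / n, sum(cb2) / n
c11 = sum((a - m1) ** 2 for a in cb1) / n
c22 = sum((b - m2) ** 2 for b in cb2) / n
c12 = sum((a - m1) * (b - m2) for a, b in zip(cb1, cb2)) / n

# Leading eigenvalue of [[c11, c12], [c12, c22]], closed form for 2x2.
lam = 0.5 * (c11 + c22 + ((c11 - c22) ** 2 + 4 * c12 ** 2) ** 0.5)
v1, v2 = c12, lam - c11                    # unnormalised eigenvector
w1, w2 = v1 / (v1 + v2), v2 / (v1 + v2)    # fusion weights, sum to 1

fused = [w1 * a + w2 * b for a, b in zip(cb1, cb2)]
```

On real images the same weights are applied per pixel across the whole channel; the channel with the larger variance (more information) receives the larger weight.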