Modeling data acquisition systems (DASs) can support the vehicle industry in the development and design of sophisticated driver assistance systems. Modeling DASs on the basis of multiple criteria is considered a multicriteria decision-making (MCDM) problem. Although literature reviews have provided models for DASs, the issue of imprecise, unclear, and ambiguous information remains unresolved. Compared with existing MCDM methods, the fuzzy decision by opinion score method II (FDOSM II) and the fuzzy weighted with zero inconsistency method II (FWZIC II) have demonstrated robustness for modeling DASs. However, these methods are implemented in an intuitionistic fuzzy set environment, which restricts the ability of experts to provide membership and nonmembership degrees freely, limits the simulation of real-world ambiguity, relies on a narrow fuzzy number space, and cannot deal with interval data. Thus, this study used a more efficient fuzzy environment, the interval-valued linear Diophantine fuzzy set (IVLDF), with FWZIC II for criterion weighting and with FDOSM II for DAS modeling, to address these issues and support the industrial community in the design and implementation of advanced driver assistance systems in vehicles. The proposed methodology comprises two consecutive phases. The first phase involves adapting a decision matrix that intersects DAS alternatives and criteria. The second (development) phase proposes a decision modeling approach based on the formulation of IVLD-FWZIC II and IVLD-FDOSM II to model DASs. A total of 14 DASs were modeled on the basis of 15 DAS criteria, including seven subcriteria for “comprehensive complexity assessment” and eight subcriteria for “design and implementation,” which have a remarkable effect on DAS design when implemented by industrial communities. Systematic ranking evaluation, sensitivity analysis, and modeling checklists were conducted; the high correlations across all described scenarios of changing criterion weight values demonstrated that the modeling results follow a systematic ranking, supporting the most important research points and providing a value-adding process for modeling the most desirable DAS.
In this work, a study and calculation of the normal approach between two bodies, a sphere and a rough flat surface, were conducted with the aid of an image processing technique. Four metals with different work-hardening indices were used as the surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach could be obtained. The compression tests were performed in the strength of materials laboratory of the mechanical engineering department, and a Monsanto tensometer was used to conduct the indentation tests. A light-section measuring microscope (BK 70x50) was used to calculate the surface texture profile parameters, such as the standard deviation of asperity peak heights and the centre line average.
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in the cases of both natural and contaminated data.
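For readers unfamiliar with the approach, the following is a minimal sketch of downhill-simplex (Nelder-Mead) maximum-likelihood estimation in Python. Because the exact pdf of the compound exponential Weibull-Poisson distribution is not reproduced in the abstract, a plain two-parameter Weibull density stands in for it; the simulated sample, starting point, and parameter values are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Stand-in density: two-parameter Weibull (shape c, scale s).
# The paper's compound exponential Weibull-Poisson pdf would replace this.
def neg_log_likelihood(theta, data):
    c, s = theta
    if c <= 0 or s <= 0:               # reject points outside the valid parameter region
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c, scale=s))

rng = np.random.default_rng(0)
sample = weibull_min.rvs(1.5, scale=2.0, size=200, random_state=rng)

# Downhill simplex (Nelder-Mead) needs only function values, no derivatives,
# which is one reason it copes with awkward or contaminated likelihood surfaces.
result = minimize(neg_log_likelihood, x0=[1.0, 1.0],
                  args=(sample,), method="Nelder-Mead")
print("estimated (shape, scale):", result.x)
```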
Active learning is a teaching method that involves students actively participating in activities, exercises, and projects within a rich and diverse educational environment. The teacher plays a role in encouraging students to take responsibility for their own education under the teacher’s scientific and pedagogical supervision and motivates them to achieve ambitious educational goals that focus on developing an integrated personality for today’s students and tomorrow’s leaders. It is important to understand the impact of two proposed strategies based on active learning on the academic performance of first intermediate class students in computer subjects and their social intelligence. The research sample was intentionally selected, consis
A simulation of the Linguistic Fuzzy Trust Model (LFTM) over oscillating Wireless Sensor Networks (WSNs), in which the goodness of the servers belonging to them can change over time, is presented in this paper. The outcomes achieved with the LFTM model over oscillating WSNs are compared with those obtained by applying the model over static WSNs, where the servers always maintain the same goodness, in terms of the selection percentage of trustworthy servers (the accuracy of the model) and the average path length. A comparison between the LFTM and the Bio-inspired Trust and Reputation Model for Wireless Sensor Networks is also presented.
This study deals with multi-choice linear mathematical programming, in which the right-hand side of the constraints is multi-choice. Such a problem cannot be solved directly through linear or nonlinear techniques, so the idea is to transform it into an ordinary linear problem and solve it. In this research, a simple technique is introduced that enables us to treat this problem as regular linear programming: a number of binary variables are introduced and used to form a linear combination that selects a single parameter value from the multiple choices. The technique was also applied to a linear programming model for maximizing the profits of the General Company for Plastic Industries' irrigation system product.
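As a hedged illustration of the binary-variable reformulation described above, the sketch below builds a tiny multi-choice LP in Python, assuming the PuLP library is available; the objective coefficients, the single resource constraint, and the three candidate right-hand-side values are hypothetical and are not taken from the paper.

```python
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

# Hypothetical two-product profit model: maximise 3*x1 + 5*x2 subject to one
# resource constraint whose right-hand side may be chosen from {40, 55, 70}.
choices = [40, 55, 70]

prob = LpProblem("multi_choice_lp", LpMaximize)
x1 = LpVariable("x1", lowBound=0)
x2 = LpVariable("x2", lowBound=0)
# One binary variable per candidate right-hand-side value.
z = [LpVariable(f"z_{k}", cat=LpBinary) for k in range(len(choices))]

prob += 3 * x1 + 5 * x2                                   # objective: profit
prob += lpSum(z) == 1                                     # exactly one choice is active
prob += 2 * x1 + 4 * x2 <= lpSum(b * zk for b, zk in zip(choices, z))

prob.solve(PULP_CBC_CMD(msg=False))
print("x1 =", x1.value(), "x2 =", x2.value(),
      "chosen RHS =", sum(b * zk.value() for b, zk in zip(choices, z)))
```

The binary variables plus the "sum to one" constraint are what turn the multi-choice right-hand side into an ordinary linear expression the solver can handle.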
This paper proposes two hybrid feature subset selection approaches based on the combination (union or intersection) of supervised and unsupervised filter approaches before using a wrapper, aiming to obtain low-dimensional features with high accuracy and interpretability and low time consumption. Experiments with the proposed hybrid approaches were conducted on seven high-dimensional feature datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA), and K-nearest neighbour (KNN). Experimental results demonstrate the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional space in terms of the number of selected features and time spent.
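A minimal scikit-learn sketch of the filter-then-wrapper idea follows; the breast-cancer dataset, the ANOVA F-score and variance filters, the value of k, and the KNN wrapper are stand-in assumptions rather than the paper's exact configuration.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif, SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
k = 15

# Supervised filter: top-k features by ANOVA F-score (uses the labels).
sup_idx = set(SelectKBest(f_classif, k=k).fit(X, y).get_support(indices=True))
# Unsupervised filter: top-k features by variance (ignores the labels).
unsup_idx = set(np.argsort(X.var(axis=0))[-k:])

# Combine the two filter rankings before the wrapper stage;
# swap "|" for "&" to use the intersection instead of the union.
combined = sorted(sup_idx | unsup_idx)

# Wrapper: greedy forward selection with a KNN classifier on the reduced subset.
wrapper = SequentialFeatureSelector(KNeighborsClassifier(n_neighbors=5),
                                    n_features_to_select=5, direction="forward")
wrapper.fit(X[:, combined], y)
print("final feature indices:", [combined[i] for i in wrapper.get_support(indices=True)])
```

Running the wrapper only on the filtered subset is what keeps the search cheap in high-dimensional spaces.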
In this research, several estimators closely related to the hazard function are introduced using one of the nonparametric methods, namely the kernel function for censored data, with varying bandwidths and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all types of boundary kernel functions.
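To make the kernel construction concrete, here is a minimal Python sketch of a kernel-smoothed (Nelson-Aalen type) hazard estimate for right-censored data with an Epanechnikov kernel and a single global bandwidth; the simulated exponential lifetimes, the censoring mechanism, and the bandwidth value are illustrative assumptions, and the boundary-corrected and local-bandwidth variants studied in the paper are not reproduced.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, one of the kernels considered in the paper."""
    return 0.75 * (1.0 - u**2) * (np.abs(u) <= 1.0)

def kernel_hazard(t_grid, times, events, bandwidth):
    """Kernel-smoothed Nelson-Aalen hazard estimate with one global bandwidth."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)                 # size of the risk set at each ordered time
    increments = events / at_risk              # Nelson-Aalen jumps (0 at censored points)
    u = (t_grid[:, None] - times[None, :]) / bandwidth
    return (epanechnikov(u) * increments).sum(axis=1) / bandwidth

rng = np.random.default_rng(1)
lifetimes = rng.exponential(2.0, size=300)
censor = rng.exponential(3.0, size=300)
obs = np.minimum(lifetimes, censor)            # observed time
delta = (lifetimes <= censor).astype(float)    # 1 = event observed, 0 = censored

grid = np.linspace(0.1, 5.0, 50)
print(kernel_hazard(grid, obs, delta, bandwidth=0.8)[:5])
```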
The gravity method measures relatively noticeable variations in the Earth’s gravitational field caused by lateral variations in rock density. In the current research, a new technique is applied to the Bouguer map from earlier gravity surveys (conducted during 1940–1950 of the last century) by selecting certain areas in the south-western desert of Iraq within the administrative boundaries of Najaf and Anbar provinces. According to the theory of gravity inversion, gravity values reflect density-contrast variations with depth; thus, gravity data inversion can be used to calculate density and velocity models at four selected depth slices: 9.63 km, 1.1 km, 0.682 km, and 0.407 km.
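The premise that gravity anomalies reflect density contrast can be illustrated with the standard infinite Bouguer slab relation, delta_g = 2*pi*G*delta_rho*t; the slab thickness and density contrast in the short sketch below are hypothetical values, not taken from the surveys described above.

```python
import numpy as np

G = 6.674e-11                      # gravitational constant, m^3 kg^-1 s^-2

def bouguer_slab_anomaly(density_contrast, thickness):
    """Gravity anomaly (mGal) of an infinite horizontal slab: 2*pi*G*delta_rho*t."""
    return 2 * np.pi * G * density_contrast * thickness * 1e5   # 1 m/s^2 = 1e5 mGal

# A hypothetical 500 m thick slab with a +200 kg/m^3 density contrast (~4.2 mGal).
print(bouguer_slab_anomaly(200.0, 500.0), "mGal")
```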