When choosing a material for an engineering application, the designer must find the optimum match between the object's technical and economic requirements and the performance and production characteristics of the candidate materials. This study proposes an integrated (hybrid) strategy for selecting the optimal material for an engineering design based on the design requirements. The primary objective is to shortlist candidate materials for drone wings using Ashby's performance indices and then rank them with grey relational analysis combined with the entropy weight method. Aluminum alloys, titanium alloys, composites, and wood have been suggested as suitable materials for manufacturing drone wings. The design requirement is to make the wings as light as possible while meeting the stiffness, strength, and fracture toughness criteria. The results indicate that carbon fiber-reinforced polymer (CFRP) is the best material for producing drone wings, whereas wood and aluminum alloys were the cheapest options when the design had to be inexpensive.
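The entropy-weight and grey relational steps described above can be sketched as follows. The decision matrix below is purely illustrative (four hypothetical materials scored on three benefit criteria), not the paper's data:

```python
import numpy as np

def entropy_weights(X):
    """Objective criterion weights from Shannon entropy of the decision matrix."""
    P = X / X.sum(axis=0)                     # column-wise proportions (X > 0 assumed)
    E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])  # normalized entropy
    d = 1.0 - E                               # degree of divergence per criterion
    return d / d.sum()

def grey_relational_grades(X, w, rho=0.5):
    """Grey relational grade of each alternative against the ideal sequence."""
    Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))  # min-max normalize
    delta = np.abs(1.0 - Z)                   # distance to the ideal (all ones)
    dmin, dmax = delta.min(), delta.max()
    xi = (dmin + rho * dmax) / (delta + rho * dmax)  # grey relational coefficients
    return xi @ w                             # entropy-weighted grade per alternative

# hypothetical 4 materials x 3 benefit criteria (stiffness, strength, toughness indices)
X = np.array([[ 70., 250., 23.],
              [110., 900., 55.],
              [150., 600., 40.],
              [ 10.,  40.,  8.]])
w = entropy_weights(X)
grades = grey_relational_grades(X, w)
ranking = np.argsort(grades)[::-1]            # best alternative first
```

The distinguishing coefficient `rho = 0.5` is the conventional default in grey relational analysis; the ranking is read off from the sorted grades.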
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time by hybridizing the classical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique to create a random distribution for the model parameters, which depend on time t. The LHS technique allows the MLHFD method to vary the parameter values rapidly across a number of multidimensional simulations (100, 1000, and 5000). The generated Latin hypercube sample, which is random (non-deterministic) in nature, is then further integrated …
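The core LHS idea, drawing one jittered point from each of n equal-probability strata in every dimension, can be sketched in a few lines (a generic sampler on the unit hypercube, not the paper's MLHFD code):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Latin hypercube sample on [0, 1)^d: each column is a random permutation
    of the n strata, with one uniformly jittered point inside each stratum."""
    rng = np.random.default_rng(seed)
    strata = np.array([rng.permutation(n_samples) for _ in range(n_dims)]).T
    return (strata + rng.random((n_samples, n_dims))) / n_samples

# e.g. 1000 joint draws for 4 model parameters; each marginal covers all strata
sample = latin_hypercube(1000, 4, seed=0)
```

Because every one-dimensional projection hits each of the n strata exactly once, LHS covers the parameter space far more evenly than plain Monte Carlo for the same sample count.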
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
This research focuses on grouping and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time, which are then fitted with a nonparametric smoothing cubic B-spline model. The cubic B-spline provides continuous first and second derivatives, yielding a smoother curve with fewer abrupt changes in slope; it is also flexible enough to capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroups …
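A minimal sketch of smoothing one longitudinal profile with a cubic spline, using SciPy's `UnivariateSpline` with `k=3` (synthetic data, not the study's profiles):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 50)                   # observation times
y = np.sin(t) + rng.normal(0.0, 0.2, t.size)     # one noisy longitudinal profile

# cubic (k=3) smoothing spline: continuous first and second derivatives;
# s controls the residual budget and hence the amount of smoothing
spl = UnivariateSpline(t, y, k=3, s=t.size * 0.04)
y_hat = spl(t)                                   # smoothed profile
slope = spl.derivative(1)(t)                     # continuous first derivative
```

The smoothing factor `s` here follows the common heuristic of roughly n times the noise variance; in practice it would be tuned, e.g., by cross-validation per cluster.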
The main objective of this paper is to present a novel method for choosing a wind turbine for a specific site using normalized power and capacity factor curves. Site matching is based on identifying the optimum turbine rotational-speed parameters from the turbine performance index (TPI) curve, which is obtained from the higher values of the normalized power and capacity factor curves. The TPI, a new ranking parameter, is defined to optimally match turbines to a wind site. Normalized power, capacity factor, and TPI are plotted against normalized rated wind speed for a known value of the site's Weibull shape parameter; thus a superior method is used for Weibull parameter estimation …
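The capacity factor underlying such curves is the expected power output under the site's Weibull wind-speed density, divided by rated power. A minimal numerical sketch, assuming an idealized power curve that rises linearly from cut-in to rated speed (illustrative turbine and site values):

```python
import numpy as np

def capacity_factor(k, c, v_in, v_rated, v_out, n=20000):
    """CF = E[P(v)] / P_rated under a Weibull(k, c) wind-speed density,
    assuming power rises linearly between cut-in and rated speed."""
    v = np.linspace(0.0, v_out, n)
    pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))  # Weibull density
    p = np.clip((v - v_in) / (v_rated - v_in), 0.0, 1.0)          # normalized power
    return float(np.sum(p * pdf) * (v[1] - v[0]))                 # Riemann-sum integral

# hypothetical site (Rayleigh-like: k = 2, scale c = 8 m/s) and turbine speeds in m/s
cf = capacity_factor(k=2.0, c=8.0, v_in=3.0, v_rated=12.0, v_out=25.0)
```

Sweeping `v_rated` (normalized by the site's mean speed) while holding the Weibull shape parameter fixed reproduces the kind of capacity factor curve the paper ranks turbines with.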
Reliability analysis methods evaluate the safety of reinforced concrete structures through the limit state function g(Xi). For implicit limit state functions and nonlinear analysis, advanced reliability analysis methods are needed. Monte Carlo simulation (MCS) can be used in this case; however, as the number of input variables increases, the time required for MCS also increases, making it time-consuming, especially for complex problems with implicit performance functions. In such cases, MCS-based FORM (First-Order Reliability Method) and artificial neural network-based FORM (ANN-FORM) have been proposed as alternatives. However, both MCS-FORM and ANN-FORM can also be time-consuming …
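The basic MCS estimate of a failure probability Pf = P(g(X) < 0) can be sketched with an explicit linear limit state g = R − S (the resistance and load distributions below are illustrative, not the paper's structural model):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# hypothetical linear limit state g(X) = R - S:
# R ~ N(200, 20) resistance, S ~ N(120, 30) load effect (illustrative units)
R = rng.normal(200.0, 20.0, n)
S = rng.normal(120.0, 30.0, n)
g = R - S

pf = np.mean(g < 0.0)        # crude MCS estimate of the failure probability
beta = g.mean() / g.std()    # first-order reliability index from sample moments
```

For this normal-normal case the answer is available in closed form (Pf = Φ(−μg/σg)), which is what makes it a convenient check; the methods the paper compares exist precisely because implicit g(Xi) from nonlinear finite element analysis offers no such shortcut and each MCS sample is expensive.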
Doppler assessment may lead to interventions that reduce the risk of fetal brain damage. Aim of the study: to assess the relation between ultrasonic hemodynamic Doppler indices of the middle cerebral and umbilical arteries (PI, RI) and growth indices to immediate neonatal outcomes (weight, head and abdominal circumference, APGAR scores at 1 and 5 minutes, and neonatal unit admission) in women with mild, moderate, and severe anemia during pregnancy. The present study is a prospective clinical study carried out in Al-Elwiya Maternity Teaching Hospital during January–June 2019; all anemic pregnant women presenting to the obstetrical wards for emergency cesarean section were the study population. The final sample comprised 120 pregnant women. Ultrasound …
Abstract: Lymphoproliferative disorders (LPDs) are a group of neoplasms affecting various cells within the lymphoid system. Each type has a different treatment …
Link failure refers to the failure of the connection between two nodes in an otherwise working simulation scenario at a particular instant. Transport-layer protocols form an important basis for setting up a simulation, with the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP) being the primary ones. This research uses Network Simulator v2.35 to conduct different simulation experiments for link failure and provide validation results. In this paper, both protocols, TCP and UDP, are compared based on the throughput of packets delivered from one node to the other, under the condition that the link fails for a certain interval of time while the simulation time remains the same for both protocols. Overall, …
Big data analysis has important applications in areas such as sensor networks and connected healthcare. The high volume and velocity of big data pose many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such as …
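Entropy discretization in its simplest binary form picks the boundary on a continuous feature that minimizes the class-weighted entropy of the two induced intervals. A minimal sketch on a toy feature (not the paper's multi-resolution variant):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of a label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def best_entropy_cut(x, y):
    """Binary entropy discretization: choose the midpoint boundary on feature x
    that minimizes the size-weighted entropy of the two resulting intervals."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best_cut, best_h = None, np.inf
    for i in range(1, len(x)):
        if x[i] == x[i - 1]:
            continue                       # no boundary between equal values
        cut = (x[i] + x[i - 1]) / 2.0
        h = (i * entropy(y[:i]) + (len(x) - i) * entropy(y[i:])) / len(x)
        if h < best_h:
            best_cut, best_h = cut, h
    return best_cut, best_h

x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([0, 0, 0, 1, 1, 1])
cut, h = best_entropy_cut(x, y)    # perfectly separable: weighted entropy is 0
```

Recursive application of this split with a stopping rule (e.g., the MDL criterion) yields the multi-interval discretization commonly paired with naive Bayes classifiers.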
Steganography is defined as hiding confidential information in some other chosen medium without leaving any clear evidence of a change in the medium's features. Most traditional hiding methods embed the message directly in the cover medium (text, image, audio, or video). Some hiding techniques leave a detectable effect on the cover image, so the change in the carrier medium can sometimes be detected by humans and machines. The purpose of the proposed hiding scheme is to make this change undetectable. The current research focuses on a complex spiral-search-based method to prevent detection of the hidden information by humans and machines; the Structural Similarity Index (SSIM) measure is used to assess the accuracy and quality …
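The SSIM check used to quantify stego-image quality can be sketched with global image statistics. Note this single-window simplification differs from the usual implementation, which averages SSIM over local sliding windows; the cover/stego images below are synthetic:

```python
import numpy as np

def ssim_global(x, y, data_range=255.0):
    """SSIM from global statistics of two images (single-window simplification;
    production implementations average over local sliding windows)."""
    C1 = (0.01 * data_range) ** 2          # stabilizing constants from the SSIM paper
    C2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / \
           ((mx**2 + my**2 + C1) * (vx + vy + C2))

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64)).astype(float)   # synthetic cover image
stego = cover + rng.integers(0, 2, size=(64, 64))           # LSB-style 0/+1 changes
score = ssim_global(cover, stego)                           # near 1 for tiny changes
```

An SSIM close to 1 between cover and stego images is the usual quantitative evidence that the embedding left the carrier visually unchanged.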