Cloth simulation and animation have been topics of research in computer graphics since the mid-1980s. Enforcing incompressibility is critical in real-time simulation. Although there have been significant achievements in this regard, existing methods still suffer from unnecessary time consumption in steps common to real-time applications. This research develops a real-time cloth simulator for a virtual human character (VHC) with wearable clothing. It achieves cloth simulation on the VHC by enhancing the position-based dynamics (PBD) framework with a series of positional constraints that enforce constant densities. Self-collision and collision with moving capsules are also implemented to achieve realistic behavior of cloth modelled on animated characters, enabling incompressibility and convergence comparable to raised cosine deformation (RCD) function solvers. In the implementation, this research optimizes collision between clothes, synchronizes the animation with the cloth simulation, and tunes the cloth properties to obtain the best possible results. A real-time cloth simulation with believable output on an animated VHC is thereby achieved. We believe the proposed method can complement the clothing pipeline for game assets.
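The PBD framework the abstract builds on works by iteratively projecting positional constraints. As a minimal sketch (a generic PBD distance-constraint solve in the style of Müller et al., not the paper's density-constraint formulation), the core projection loop looks like this; the function names and the simple Jacobi-style update are illustrative assumptions:

```python
import math

def solve_distance_constraints(positions, inv_mass, edges, rest_len,
                               iters=20, stiffness=1.0):
    """Project PBD distance constraints: for each edge (i, j) with rest
    length r, move both endpoints along the edge direction, weighted by
    inverse mass, until |p_j - p_i| approaches r."""
    p = [list(x) for x in positions]
    for _ in range(iters):
        for (i, j), r in zip(edges, rest_len):
            dx = [p[j][k] - p[i][k] for k in range(3)]
            d = math.sqrt(sum(c * c for c in dx))
            w = inv_mass[i] + inv_mass[j]
            if d == 0.0 or w == 0.0:
                continue  # degenerate edge or both endpoints pinned
            corr = stiffness * (d - r) / (d * w)
            for k in range(3):
                p[i][k] += inv_mass[i] * corr * dx[k]
                p[j][k] -= inv_mass[j] * corr * dx[k]
    return p
```

A density constraint replaces the per-edge scalar `(d - r)` with a per-particle density error, but the projection structure is the same.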
High vehicular mobility causes frequent changes in vehicle density, discontinuity in inter-vehicle communication, and constraints on routing protocols in vehicular ad hoc networks (VANETs). Routing must avoid forwarding packets through segments with low network density and a high rate of network disconnection, which may result in packet loss, delays, and increased communication overhead during route recovery. Therefore, both traffic and segment status must be considered. This paper presents real-time intersection-based segment-aware routing (RTISAR), an intersection-based segment-aware algorithm for geographic routing in VANETs. This routing algorithm provides an optimal route for forwarding the data packets toward their destination
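The abstract says both traffic and segment status must be considered when choosing the next segment. One illustrative way to combine such factors is a weighted score per candidate segment; the weights, the normalization to [0, 1], and the function names below are hypothetical, not RTISAR's actual metric:

```python
def segment_score(density, connectivity, progress, weights=(0.4, 0.4, 0.2)):
    """Score a road segment. density, connectivity, and progress (toward
    the destination) are each assumed normalized to [0, 1]; higher is better."""
    w_d, w_c, w_p = weights
    return w_d * density + w_c * connectivity + w_p * progress

def best_next_segment(candidates):
    """candidates: list of (segment_id, density, connectivity, progress).
    Pick the segment with the highest combined score."""
    return max(candidates, key=lambda s: segment_score(s[1], s[2], s[3]))[0]
```

A dense, well-connected segment wins even when a sparse segment offers more geographic progress, which matches the abstract's goal of avoiding low-density, disconnection-prone segments.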
In this paper, the problem of developing turbulent flow in a rectangular duct is investigated by obtaining numerical results for the velocity profiles in the duct using a two-dimensional large eddy simulation (LES) model with different Reynolds numbers, filter equations, and mesh sizes. Reynolds numbers range from 11,000 to 110,000 for velocities of 1 m/s to 50 m/s, with 56×56, 76×76, and 96×96 mesh sizes and different filter equations. The numerical results of the LES model are compared with the k-ε model and the analytic velocity distribution, and validated against experimental data of other researchers. The LES model shows good agreement with the experimental data at high Reynolds numbers with the first, seco
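The Reynolds-number range quoted above follows from the standard duct-flow definition Re = V·D_h/ν, with the hydraulic diameter D_h = 4A/P for a rectangular cross-section. A small sketch (the duct dimensions and the kinematic viscosity value below are illustrative assumptions, not taken from the paper):

```python
def hydraulic_diameter(width, height):
    """D_h = 4A / P for a rectangular duct of the given cross-section."""
    return 4.0 * (width * height) / (2.0 * (width + height))

def reynolds_number(velocity, d_h, kinematic_viscosity):
    """Re = V * D_h / nu for internal duct flow."""
    return velocity * d_h / kinematic_viscosity

# Example: a hypothetical 0.2 m x 0.2 m duct with air (nu ~ 1.5e-5 m^2/s)
re_low = reynolds_number(1.0, hydraulic_diameter(0.2, 0.2), 1.5e-5)
```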
The research problem was to determine the effect of using specification-based costing on the validity of determining and measuring the costs of executing contracts, applied to Al-Mansour General Construction Contracting Company as an appropriate alternative to the traditional costing system currently adopted, which suffers from many shortcomings and weaknesses that are reflected in the validity and integrity of cost calculations. To address this problem, the research was based on the premise that: (the application of specification-based costing will result in calculating the cost of the product according to the specification required by the customer, to meet his wishes properly and witho
A database is characterized as an arrangement of data that is organized and distributed in a way that allows the client to access the stored data in a simple and more convenient manner. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
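MapReduce splits a job into a map phase that emits key-value pairs, a shuffle that groups them by key, and a reduce phase that aggregates each group. A minimal in-process sketch (the EEG-style record layout and the per-channel mean are illustrative assumptions, not the paper's actual Hadoop job):

```python
from collections import defaultdict

def map_phase(records):
    """Emit (channel, sample) pairs; records stand in for EEG readings."""
    for channel, value in records:
        yield channel, value

def reduce_phase(pairs):
    """Shuffle (group by key) and then reduce each group to its mean."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {ch: sum(vals) / len(vals) for ch, vals in groups.items()}
```

In Hadoop these two functions would run as distributed mapper and reducer tasks; the framework performs the shuffle between them.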
Twilight is the light that appears on the horizon before sunrise and after sunset. Astronomically, it is known that sunrise and sunset times are affected by height above sea level, but the effect of height above sea level on the time of astronomical twilight is still undecided and a matter of controversy among astronomers. In this research we study the effect of height above sea level on the time of astronomical twilight by adding the height-above-sea-level correction to the twilight computation and then calculating the change in twilight time for different heights (0–10000 m) above sea level; the corresponding increase in time ranges between 15.45 and 20.5 minutes. It was found that there was an increase in the time of the twilight along
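One common height correction in sunrise/sunset work is the dip of the apparent horizon, approximately 1.76′ × √h for h in metres (including standard refraction). Whether this is the exact correction the paper adds is an assumption; the sketch only illustrates the kind of term involved:

```python
import math

def horizon_dip_deg(height_m):
    """Approximate dip of the apparent horizon, in degrees, for an observer
    height_m metres above sea level (1.76 arcmin * sqrt(h) rule of thumb)."""
    return 1.76 * math.sqrt(height_m) / 60.0
```

Such a dip lowers the effective horizon, so events pinned to a solar altitude (sunset, or the -18° astronomical-twilight limit) shift in time with observer height.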
The virtual decomposition control (VDC) is an efficient tool suited to the full-dynamics-based control problem of complex robots. However, the regressor-based adaptive control used by VDC to control every subsystem and to estimate the unknown parameters demands specific knowledge of the system physics. Therefore, in this paper, we focus on reformulating the VDC equations for a serial-chain manipulator using the adaptive function approximation technique (FAT), without requiring specific system physics. The dynamic matrices of the dynamic equation of every subsystem (e.g. link and joint) are approximated by orthogonal functions, owing to the minimal approximation errors they produce. The contr
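FAT represents unknown dynamic terms as weighted sums of orthogonal basis functions, and the controller adapts the weights rather than physical parameters. As a small sketch of the approximation step only (using Chebyshev polynomials, a common orthogonal basis; the paper does not necessarily use this particular basis or these function names):

```python
import math

def cheb_coeffs(f, n):
    """Coefficients of f on [-1, 1] in the first n Chebyshev polynomials,
    computed at the Gauss-Chebyshev nodes x_j = cos(pi*(j+0.5)/n)."""
    nodes = [math.cos(math.pi * (j + 0.5) / n) for j in range(n)]
    return [2.0 / n * sum(f(x) * math.cos(k * math.pi * (j + 0.5) / n)
                          for j, x in enumerate(nodes))
            for k in range(n)]

def cheb_eval(c, x):
    """Evaluate c0/2 + sum_k c_k T_k(x) via T_k = 2x*T_{k-1} - T_{k-2}."""
    t_prev, t_cur = 1.0, x
    total = 0.5 * c[0] + (c[1] * x if len(c) > 1 else 0.0)
    for k in range(2, len(c)):
        t_prev, t_cur = t_cur, 2.0 * x * t_cur - t_prev
        total += c[k] * t_cur
    return total
```

In an adaptive controller, `cheb_coeffs`-style weights would not be precomputed from a known `f`; they would be updated online by an adaptation law, which is what removes the need for exact system physics.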
Economic units have always sought to maintain their market position and to consolidate the management techniques and modern methods that support their success factors. In addition, customer profitability analysis has become one of the most practical ways for economic units to benefit, as modern management focuses its attention on achieving customer satisfaction, since customers constitute the axis of success of every organization. Many government units that aim for profit direct their attention to customers, and the number of these units increases continuously. Management uses customer profitability analysis in order to obtain information that assists in the decision-making process. How to use modern tec
In this study, from a total of 856 mastitis cases in lactating ewes, only 34 Streptococcus agalactiae isolates showed various types of resistance to three antibiotics (penicillin, erythromycin, and tetracycline). S. agalactiae isolates were identified according to standard methods, including a newly suggested technique using a specific chromogenic agar. Antibiotic resistance was clearly identified using the MIC microplate (dilution) assay. In addition, real-time PCR determined that three antibiotic-resistance genes were present (pbp2b, tetO, and mefA). The highest percentage of isolates carried a single gene, namely tetracycline resistance (20.59%), followed by the percentage for penicillin, which was
This paper proposes a new encryption method. It combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity increases the level of the ciphering process. Moreover, it shifts the operation by only one bit to the right. The second is the nature of the encryption process: it includes two keys and mixes one round of DES with one round of AES to reduce processing time. The W-method deals with
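The key-schedule idea described above, merging two 64-bit halves into a 128-bit root key and deriving subsequent keys by a one-bit right shift, can be sketched with plain integer bit operations. This is an illustrative reading of the abstract, not the W-method's actual schedule; the rotation (rather than a plain shift) and function names are assumptions:

```python
def merge_root_key(des_half, aes_half):
    """Concatenate a 64-bit DES-derived half and a 64-bit AES-derived half
    into a 128-bit root key."""
    assert des_half < (1 << 64) and aes_half < (1 << 64)
    return (des_half << 64) | aes_half

def rotate_right_128(key, n=1):
    """Rotate a 128-bit key right by n bits (bits wrap around)."""
    n %= 128
    mask = (1 << 128) - 1
    return ((key >> n) | (key << (128 - n))) & mask

def derive_subkeys(root, count=15):
    """Derive the remaining keys, each one bit to the right of the last."""
    keys, k = [], root
    for _ in range(count):
        k = rotate_right_128(k)
        keys.append(k)
    return keys
```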
The art of preventing the detection of hidden messages is how steganography works. Several steganographic algorithms have been proposed, and a major portion of them target image steganography because images have a high level of redundancy. This paper proposes an image steganography technique using a dynamic threshold produced from the discrete cosine coefficients. After dividing the green and blue channels of the cover image into 1×3-pixel blocks, if any bit of a green-channel block is less than or equal to the threshold, the secret bits are stored in the corresponding blue-channel block; to increase security, not all bits of the chosen block are used to store the secret bits. Firstly, store in the cente
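The gating idea, testing a green-channel block against the threshold and, if it qualifies, embedding secret bits in the paired blue-channel block, can be sketched with simple LSB replacement. This is a loose illustration: the exact eligibility test, the DCT-derived threshold, and the rule for which bits within a block are skipped are the paper's, not reproduced here:

```python
def embed_bits(green_block, blue_block, secret_bits, threshold):
    """Hypothetical sketch: if any green pixel is <= threshold, write one
    secret bit into the LSB of each blue pixel of the 1x3 block.
    Returns (new_blue_block, remaining_secret_bits); consumes secret_bits."""
    if not any(g <= threshold for g in green_block):
        return list(blue_block), secret_bits  # block not eligible, skip it
    out = []
    for px in blue_block:
        if secret_bits:
            out.append((px & ~1) | secret_bits.pop(0))  # replace the LSB
        else:
            out.append(px)
    return out, secret_bits
```

Because eligibility depends on the cover image itself, the embedding positions vary per image, which is the security property the abstract is aiming for.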