The aim of this study is to propose mathematical expressions for estimating the flexural strength of plain concrete members from ultrasonic pulse velocity (UPV) measurements. More than two hundred precast concrete kerb units were subjected to a scheduled test program. The tests fell into two categories: non-destructive ultrasonic tests and bending (rupture) tests. For each precast unit, direct and indirect (surface) ultrasonic pulses were transmitted through the concrete medium to measure their travel velocities. The results were plotted in two graphs from which two mathematical relationships were drawn: the first relates direct pulse velocity to flexural strength, while the second expresses flexural strength as a function of indirect (surface) pulse velocity. The application of these equations may be extended to the assessment of the flexural strength of installed concrete kerb units, in-situ cast kerbstones, and other precast concrete units. Finally, a relation between the direct and indirect pulse velocities of a given concrete was derived and is suggested for use when one of the two velocities is unavailable or cannot be measured, and for other ultrasonic pulse test applications.
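As an illustration of how such calibration equations are applied in practice, the sketch below assumes exponential regression forms f = a·exp(b·V), a shape commonly fitted to strength-versus-UPV data; the coefficients are hypothetical placeholders, not the values fitted from this test program.

```python
import math

# Hypothetical calibration curves of the form f = a * exp(b * V), where V is
# the pulse velocity (km/s) and f the flexural strength (MPa). The coefficients
# below are placeholders for illustration, NOT the fitted values from the study.
DIRECT = (0.08, 1.0)     # (a, b) for direct transmission
INDIRECT = (0.12, 0.9)   # (a, b) for indirect (surface) transmission

def pulse_velocity(path_length_m: float, travel_time_us: float) -> float:
    """Pulse velocity in km/s from path length (m) and transit time (us)."""
    return path_length_m / (travel_time_us * 1e-6) / 1000.0

def flexural_strength(velocity_km_s: float, mode: str = "direct") -> float:
    """Estimate flexural strength (MPa) from a UPV reading (km/s)."""
    a, b = DIRECT if mode == "direct" else INDIRECT
    return a * math.exp(b * velocity_km_s)

v = pulse_velocity(0.3, 68.0)  # e.g. a 300 mm path crossed in 68 us
print(f"V = {v:.2f} km/s  ->  f = {flexural_strength(v):.2f} MPa")
```

Given calibrated direct and indirect curves, one way to obtain the relation between the two velocities is to eliminate f between the two equations, which is what makes one velocity usable as a stand-in for the other.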
Simplification of new fashion design methods
In this paper, we define a new subclass of multivalent functions with negative coefficients in the open unit disk U, defined by a generalized integral operator. We also give and study some interesting properties, such as coefficient estimates, subordination theorems, and integral means inequalities, using the famous Littlewood subordination theorem. Finally, we conclude with a type of inequality giving upper and lower bounds for multivalent functions within the class of all analytic functions.
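For orientation, the block below records the conventional normal form for a p-valent function with negative coefficients and the Littlewood subordination inequality that underlies integral means results; the specific operator and subclass studied in the paper are not stated in the abstract, so this is standard background notation only.

```latex
% Conventional form of a p-valent (multivalent) function with negative
% coefficients, analytic in the open unit disk U = {z : |z| < 1}:
\[
  f(z) = z^{p} - \sum_{k=p+1}^{\infty} a_{k} z^{k},
  \qquad a_{k} \ge 0,\; z \in U .
\]
% Littlewood's subordination theorem: if g is subordinate to h in U, then
\[
  \int_{0}^{2\pi} \bigl| g(re^{i\theta}) \bigr|^{\mu} \, d\theta
  \;\le\;
  \int_{0}^{2\pi} \bigl| h(re^{i\theta}) \bigr|^{\mu} \, d\theta,
  \qquad 0 < r < 1,\ \mu > 0 ,
\]
% which is the standard tool behind integral means inequalities.
```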
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that make the most effective use of the available network. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption of the same data, performing the two operations simultaneously. This is achieved by embedding encryption into the compression algorithm, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
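As a minimal sketch of the compress-and-encrypt idea (not the authors' module, whose description is truncated above), the code below composes zlib compression with a toy counter-mode keystream XOR so the receiver can invert both operations with the shared key; the keystream construction is an illustrative assumption, not a production cipher.

```python
import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy counter-mode keystream from SHA-256. Illustrative only, not secure."""
    out, counter = bytearray(), 0
    while len(out) < n:
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(out[:n])

def xor(data: bytes, mask: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, mask))

def compress_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Compress first, then mask the compressor output with the keystream."""
    compressed = zlib.compress(plaintext)
    return xor(compressed, keystream(key, len(compressed)))

def decrypt_decompress(ciphertext: bytes, key: bytes) -> bytes:
    return zlib.decompress(xor(ciphertext, keystream(key, len(ciphertext))))

msg = b"attack at dawn " * 40
ct = compress_encrypt(msg, b"shared secret")
assert decrypt_decompress(ct, b"shared secret") == msg
```

Ordering matters here: compressing after encryption is futile, because a good cipher's output has essentially no redundancy left for an entropy coder to remove.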
Environmental exposure to active pharmaceutical ingredients (APIs) can have negative effects on the health of ecosystems and humans. While numerous studies have monitored APIs in rivers, these employ different analytical methods, measure different APIs, and have ignored many of the countries of the world. This makes it difficult to quantify the scale of the problem from a global perspective. Furthermore, comparison of the existing data, generated for different studies/regions/continents, is challenging due to the vast differences between the analytical methodologies employed. Here, we present a global-scale study of API pollution in 258 of the world's rivers, representing the environmental influence of 471.4 million people across…
Doses for most drugs are determined from population-level information, resulting in a standard 'one-size-fits-all' dose range for all individuals. This review explores how doses can be personalised through the use of an individual's pharmacokinetic (PK)-pharmacodynamic (PD) profile, the particular application of this approach in children, and the therapy areas where such approaches have made inroads.
The Bayesian forecasting approach, based on population PK/PD models that account for variability in exposure and response, is a potent method for personalising drug therapy. Its potential utility is eve…
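To make the Bayesian forecasting idea concrete, the sketch below computes maximum a posteriori (MAP) estimates of an individual's clearance and volume for a one-compartment IV bolus model, shrinking sparse observations toward population priors; the model, priors, and data are illustrative assumptions, not taken from the review.

```python
import numpy as np
from scipy.optimize import minimize

# One-compartment IV bolus: C(t) = (dose / V) * exp(-(CL / V) * t)
DOSE = 100.0                      # mg, hypothetical
PRIOR = {"CL": (5.0, 0.3),        # population mean (L/h) and log-SD
         "V": (50.0, 0.2)}        # population mean (L) and log-SD
SIGMA = 0.15                      # proportional residual error (log scale)

t_obs = np.array([1.0, 4.0, 8.0])      # h, sparse sampling times
c_obs = np.array([1.75, 1.30, 0.85])   # mg/L, hypothetical observations

def neg_log_posterior(theta):
    log_cl, log_v = theta
    cl, v = np.exp(log_cl), np.exp(log_v)
    pred = (DOSE / v) * np.exp(-(cl / v) * t_obs)
    # Gaussian likelihood on log-concentrations (proportional error)
    ll = np.sum(((np.log(c_obs) - np.log(pred)) / SIGMA) ** 2)
    # Log-normal population priors penalise departures from typical values
    lp = sum(((lt - np.log(m)) / s) ** 2
             for lt, (m, s) in zip(theta, PRIOR.values()))
    return 0.5 * (ll + lp)

start = [np.log(PRIOR["CL"][0]), np.log(PRIOR["V"][0])]
fit = minimize(neg_log_posterior, start)
cl_hat, v_hat = np.exp(fit.x)
print(f"MAP estimates: CL = {cl_hat:.2f} L/h, V = {v_hat:.1f} L")
```

The MAP clearance can then drive an individualised maintenance dose, since at steady state the average concentration equals dose rate divided by clearance.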
Deepfakes have become possible using artificial intelligence techniques, replacing one person's face with another person's face (primarily a public figure) and making the latter do or say things he would not have done. Contributing to a solution for video credibility has therefore become a critical goal, which we address in this paper. Our work exploits the visible artifacts (blur inconsistencies) that are generated by the manipulation process. We analyse focus quality and its ability to detect these artifacts. The focus measure operators in this paper include the image Laplacian and image gradient groups, which are very fast to compute and do not need a large dataset for training. The results showed that i) the Laplacian…
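As a rough sketch of the focus-measure approach (the abstract names the Laplacian and gradient operator groups but not the exact pipeline), the code below scores sharpness with the variance of the Laplacian and the Tenengrad gradient energy, and flags a face crop whose sharpness diverges strongly from the rest of the frame; the face box and thresholds are hypothetical.

```python
import cv2
import numpy as np

def laplacian_focus(gray: np.ndarray) -> float:
    """Variance of the Laplacian: low values indicate blur."""
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def tenengrad_focus(gray: np.ndarray) -> float:
    """Tenengrad: mean squared Sobel gradient magnitude."""
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    return float(np.mean(gx ** 2 + gy ** 2))

def blur_inconsistency(frame_bgr, face_box, measure=laplacian_focus) -> float:
    """Ratio of face sharpness to whole-frame sharpness; values far from 1
    suggest the face region was blended in at a different blur level."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    x, y, w, h = face_box
    return measure(gray[y:y + h, x:x + w]) / (measure(gray) + 1e-9)

# Hypothetical usage on a single decoded frame with a detected face box:
# frame = cv2.imread("frame.png")
# r = blur_inconsistency(frame, (120, 80, 160, 160), measure=tenengrad_focus)
# print("suspicious" if r < 0.5 or r > 2.0 else "consistent")
```

Both measures are cheap local filters, which is consistent with the abstract's point that no large training dataset is needed.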