Abstract: In this research, we prepared nanofibers from poly(vinyl alcohol) (PVA)/TiO2 by electrospinning. The emission spectrum of the solution was studied at 772 nm. Several process parameters were investigated: the PVA concentration, the distance from the nozzle tip to the grounded collector (gap distance), and the applied high voltage. We found that the optimum condition for preparing narrow nanofibers is a PVA concentration of 16 g, which yields fibers about 20 nm in diameter.
Research on the automated extraction of essential data from electrocardiography (ECG) recordings has long been a significant topic. The main focus of digital signal processing here is to measure the fiducial points that determine the beginning and end of the P, QRS, and T waves based on their waveform properties. The presence of unavoidable noise during ECG data collection and inherent physiological differences among individuals make it challenging to accurately identify these reference points, resulting in suboptimal performance. This is done through several primary stages that rely on preliminary processing of the ECG electrical signal through a set of steps (preparing raw data and converting them into files tha
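The core of such fiducial-point measurement, locating the dominant QRS complexes, can be sketched as follows. This is a simplified stand-in for full detectors such as Pan-Tompkins; the function name, window lengths, and thresholds below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold_frac=0.5):
    """Simplified R-peak (QRS) detector in the spirit of Pan-Tompkins:
    differentiate, square, moving-window integrate, then pick local
    maxima above a threshold, enforcing a refractory period."""
    diff = np.diff(ecg)                       # emphasize steep QRS slopes
    squared = diff ** 2                       # all values positive
    win = max(1, int(0.15 * fs))              # ~150 ms integration window
    energy = np.convolve(squared, np.ones(win) / win, mode="same")
    thresh = threshold_frac * energy.max()
    refractory = int(0.2 * fs)                # ~200 ms between beats
    peaks, last = [], -refractory
    for i in range(1, len(energy) - 1):
        is_local_max = energy[i] >= energy[i - 1] and energy[i] > energy[i + 1]
        if is_local_max and energy[i] > thresh and i - last >= refractory:
            peaks.append(i)
            last = i
    return peaks
```

Once R peaks are known, the onsets and offsets of the P, QRS, and T waves are typically searched in windows around each peak.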
In this paper, a subspace identification method for bilinear systems is used, wherein "three-block" and "four-block" subspace algorithms are applied. In these algorithms, the input signal to the system does not have to be white. Simulation of these algorithms shows that the "four-block" algorithm gives faster convergence, and the dimensions of the matrices involved are significantly smaller, so its computational complexity is lower in comparison with the "three-block" algorithm.
Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods have the ability to compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods. For some problems, the computed error bounds become overly pessimistic, or integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimations in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an ef
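The dependency problem mentioned above can be shown in a few lines: interval arithmetic treats the two occurrences of the same variable as independent, so even x − x does not evaluate to zero. The minimal `Interval` class below is purely illustrative:

```python
class Interval:
    """Minimal closed interval [lo, hi] for illustration only."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        # Interval subtraction assumes the operands vary independently,
        # which is exactly the source of the dependency problem.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(1.0, 2.0)
print(x - x)   # [-1.0, 1.0], not [0, 0]
```

Taylor model methods reduce this overestimation by carrying a symbolic polynomial part, so identical subexpressions cancel before interval bounds are taken.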
The extraction of a personal sprite from a whole image faces many problems in separating the sprite edge from the unneeded parts; some image software tries to automate this process but usually cannot find the edge or produces false results. In this paper, the authors enhance the use of Canny edge detection to locate the sprite within the whole image by adding some enhancement steps using MATLAB. Moreover, all non-relevant information is removed from the image by selecting only the sprite and placing it on a transparent background. Comparing Canny edge detection with the proposed method shows improvement in the edge detection.
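The paper's pipeline is in MATLAB; as a rough Python/NumPy illustration only, the sketch below uses a plain Sobel gradient threshold (full Canny adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding) and then writes the sprite mask into an alpha channel. Function names and thresholds are assumptions, not the authors' code:

```python
import numpy as np

def sobel_edges(gray, threshold=0.5):
    """Simplified edge detector: Sobel gradient magnitude + threshold."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = gray.shape
    gx, gy = np.zeros((h, w)), np.zeros((h, w))
    padded = np.pad(gray.astype(float), 1, mode="edge")
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    mag = np.hypot(gx, gy)
    if mag.max() > 0:
        mag /= mag.max()               # normalize to [0, 1]
    return mag > threshold

def cut_out_sprite(rgb, mask):
    """Place the selected sprite on a transparent background (RGBA)."""
    alpha = np.where(mask, 255, 0).astype(np.uint8)
    return np.dstack([rgb, alpha])
```

In practice one would call a production implementation such as OpenCV's `cv2.Canny` rather than the naive loop above; the sketch only shows the structure of the edge-then-mask pipeline.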
A coin has two sides. Steganography conceals the existence of a message but is not completely secure; it is not meant to supersede cryptography but to supplement it. The main goal of this method is to minimize the number of LSBs that are changed when substituting them with the bits of the characters of the secret message. This decreases the distortion (noise) introduced in the pixels of the stego-image and, as a result, increases the immunity of the stego-image against the visual attack. The experiment shows that the proposed method gives a good enhancement to the steganography technique, and there is no difference between the cover-image and the stego-image that can be seen by the human visual system (HVS), so this method c
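For context, plain LSB substitution, the baseline whose bit changes the paper's method seeks to minimize, works as sketched below. The helper names are illustrative, not from the paper:

```python
def embed_lsb(pixels, message):
    """Baseline LSB substitution: write each message bit (MSB first per
    character) into the least significant bit of one pixel value."""
    bits = [(byte >> i) & 1
            for byte in message.encode("ascii")
            for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    stego = list(pixels)
    for k, bit in enumerate(bits):
        stego[k] = (stego[k] & ~1) | bit   # changes the pixel by at most 1
    return stego

def extract_lsb(pixels, n_chars):
    """Recover n_chars ASCII characters from the pixel LSBs."""
    bits = [p & 1 for p in pixels[:n_chars * 8]]
    data = bytes(sum(b << (7 - i) for i, b in enumerate(bits[k:k + 8]))
                 for k in range(0, len(bits), 8))
    return data.decode("ascii")
```

Each embedded bit perturbs a pixel value by at most 1, which is why LSB stego-images are imperceptible to the HVS; the proposed method further reduces how many of those bits actually flip.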
Abstract
The Non-Homogeneous Poisson process is considered one of the statistical subjects that has importance in other sciences and wide application in different areas, such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others. It is also used in modeling phenomena that occur at a non-constant rate over time (events whose rate changes with time).
This research deals with some of the basic concepts related to the Non-Homogeneous Poisson process. It carries out two models of the Non-Homogeneous Poisson process, the power law model and the Musa–Okumoto model, to estimate th
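For reference, under one common parameterization from the reliability literature (the symbols α, β, λ₀, θ below are standard choices, not taken from this abstract), the intensity functions λ(t) and mean value functions m(t) of the two models are:

```latex
% Power law (Crow-AMSAA) model:
\lambda(t) = \alpha \beta\, t^{\beta - 1}, \qquad
m(t) = \alpha\, t^{\beta}

% Musa-Okumoto (logarithmic Poisson) model:
\lambda(t) = \frac{\lambda_0}{1 + \lambda_0 \theta t}, \qquad
m(t) = \frac{1}{\theta} \ln\!\bigl(1 + \lambda_0 \theta t\bigr)
```

In both cases m(t) gives the expected number of events by time t, and the non-homogeneity lies in λ(t) varying with t rather than being constant as in the ordinary Poisson process.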
This study focuses on improving the safety of embankment dams by considering the effects of vibration due to powerhouse operation on the dam body. The study contains two main parts. In the first part, ANSYS-CFX is used to create a three-dimensional (3D) Finite Volume (FV) model of one vertical Francis turbine unit. The 3D model is run under various reservoir conditions and unit dimensions. The Re-Normalization Group (RNG) k-ε turbulence model is employed, and the physical properties of water and the flow characteristics are defined in the turbine model. In the second part, a 3D finite element (FE) numerical model of a rock-fill dam is created using ANSYS®, considering the dam connection with its powerhouse