This study was conducted to explore the effects of ionized water on the productive and physiological performance of Japanese quails (Coturnix japonica). The study was conducted at a private farm from 20 April 2016 to 13 July 2016 (84 d). One hundred 42-day-old Japanese quail chicks were divided randomly into 5 groups with 4 replicates. Treatments consisted of a control group (T1, normal water), alkaline water (T2, pH 8; T3, pH 9), and acidic water (T4, pH 6; T5, pH 5). All birds were fed a diet balanced for energy and protein. The egg production ratio, egg weight, cumulative number of eggs, egg mass, feed conversion ratio, productivity per hen per week, and effects on plasma lipids, uric acid, glucose, calcium, and phosphorus were studied. The T3 group exhibited greater (P ≤ 0.05) average egg production, productivity per hen per week, cumulative number of eggs, and egg mass than the other groups. Moreover, all treated groups (T2 to T5) had higher (P ≤ 0.05) overall average egg weights than the T1 group. At 18 wk of age, plasma cholesterol was decreased (P ≤ 0.05) in the T3 and T5 groups. Furthermore, more (P ≤ 0.05) high-density lipoprotein and less low-density lipoprotein were observed in all treated groups compared with the T1 control group. In conclusion, the productive and physiological performance of Japanese quail was improved in the T3 (alkaline water, pH 9) and T5 (acidic water, pH 5) groups compared with the other experimental groups.
In this paper, three techniques for image compression are implemented. The proposed techniques consist of a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) transform. Daubechies and Haar wavelets are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the level of the 3-D transformation increases, so the compression ratio is measured at each level. To achieve good compression, image data properties were measured, such as image entropy (He) and percent root-mean-square difference.
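As a minimal illustrative sketch (our own, not the paper's implementation), a one-dimensional two-level Haar DWT with coefficient thresholding shows how the compression ratio and entropy measurements described above can be computed; the 3-D case applies the same transform along each axis in turn.

```python
import math
from collections import Counter

def haar_step(x):
    # One Haar analysis step: pairwise averages (approximation) and differences (detail).
    avg = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    dif = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return avg, dif

def haar_dwt(x, levels):
    # Multi-level 1-D Haar DWT: returns [approximation, coarsest detail, ..., finest detail].
    approx, details = list(x), []
    for _ in range(levels):
        approx, d = haar_step(approx)
        details.append(d)
    return [approx] + details[::-1]

def compression_ratio(bands, eps=1e-6):
    # CR = total coefficients / coefficients surviving the threshold.
    flat = [c for band in bands for c in band]
    kept = sum(1 for c in flat if abs(c) > eps)
    return len(flat) / max(kept, 1)

def entropy(samples):
    # Shannon entropy He in bits per sample.
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())
```

For a constant eight-sample signal, every detail coefficient vanishes, so only the two approximation coefficients survive thresholding and the two-level CR is 8/2 = 4.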
In the reverse engineering approach, a massive amount of point data is gathered during data acquisition, which leads to larger file sizes and longer data-handling times. In addition, fitting surfaces to these data points is time-consuming and demands particular skills. In the present work, a method for obtaining the control points of any profile is presented. Several image-modification processes are explained using the SolidWorks program, and a parametric equation of the proposed profile is derived using the Bezier technique with the adopted control points. Finally, the proposed profile was machined using a 3-axis CNC milling machine, and a comparison of dimensions was carried out between the designed profile and the machined part.
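As a hedged sketch of the Bezier technique mentioned above (the control points below are hypothetical, not those of the paper), de Casteljau's algorithm evaluates the parametric curve defined by a set of control points:

```python
def bezier_point(control, t):
    # De Casteljau's algorithm: repeatedly interpolate between neighbouring
    # control points until a single point on the curve remains (0 <= t <= 1).
    pts = [tuple(p) for p in control]
    while len(pts) > 1:
        pts = [((1 - t) * p[0] + t * q[0], (1 - t) * p[1] + t * q[1])
               for p, q in zip(pts, pts[1:])]
    return pts[0]

def sample_profile(control, n=50):
    # Sample n + 1 points along the curve, e.g. to export toolpath coordinates.
    return [bezier_point(control, i / n) for i in range(n + 1)]
```

A quadratic curve through control points (0, 0), (1, 2), (2, 0) passes through (1, 1) at t = 0.5, and its endpoints coincide with the first and last control points.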
Exchange of information through communication channels can be unsafe; communication media are not secure enough for sending sensitive information, so it is necessary to protect information from disclosure to unauthorized persons. This research presents a method for information security in which information is hidden in a cover image using the least significant bit (LSB) technique, after a text file is encrypted using a secret sharing scheme. Positions for hiding the information in the cover image are then generated in a random manner, which is difficult to predict by image or statistical analysis. This provides two levels of information security: encryption of the text file using the secret sharing scheme, and random hiding in the cover image.
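A minimal sketch of the LSB step (the helper names are ours, and the paper's secret-sharing encryption stage is omitted) writes each message bit into the least significant bit of a cover byte at a keyed position:

```python
def lsb_embed(cover, bits, positions):
    # Write each message bit into the LSB of the cover byte at the given position.
    stego = list(cover)
    for bit, pos in zip(bits, positions):
        stego[pos] = (stego[pos] & ~1) | bit
    return stego

def lsb_extract(stego, positions):
    # Read the LSBs back in the same (secret) position order.
    return [stego[pos] & 1 for pos in positions]
```

In practice the position list would come from a keyed pseudorandom generator shared with the receiver, so an analyst cannot predict where the payload sits; note each cover byte changes by at most 1.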
This paper determines the difference between a first, healthy image and a second, infected image using logic gates. The proposed algorithm was applied first to binary images, second to gray-scale images, and third to color images. At the start of the proposed algorithm, the images are processed by applying convolution to zero-extended images to obtain more visible features; the images are then enhanced with an edge-detection filter (Laplacian operator) and smoothed with a mean filter. To determine the change between the original image and the injured one, logic gates, particularly XOR gates, are applied. Applying the technique to tooth decay, this comparison can locate the injured areas.
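The XOR comparison described above can be sketched on already-binarized images (a toy 2-D example of our own; real use would follow the filtering steps first):

```python
def xor_diff(img_a, img_b):
    # Pixel-wise XOR of two equally sized binary images:
    # 1 marks a location where the images disagree (a candidate lesion).
    return [[a ^ b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

def changed_pixels(diff):
    # (row, column) coordinates of all differing pixels.
    return [(r, c) for r, row in enumerate(diff)
            for c, v in enumerate(row) if v]
```

Because XOR is 0 wherever the two images agree, the surviving 1-pixels directly localize the injury.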
Attacking a stream cipher system using ciphertext only depends on the characteristics of the plaintext language and the randomness of the key used in encryption, without detailed knowledge of the cipher algorithm, by benefiting from the balance between 0's and 1's in the key to reduce the probability of the key space.
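A toy sketch (our own illustration, not the paper's attack) of the two ingredients above: bitwise XOR decryption and the 0/1 balance statistic that helps shrink the effective key space:

```python
def bit_balance(key_bits):
    # Fraction of 1-bits in the keystream; a truly random keystream should be
    # close to 0.5, so a strong imbalance narrows the candidate key space.
    return sum(key_bits) / len(key_bits)

def xor_decrypt(cipher_bits, key_bits):
    # Stream-cipher decryption is bitwise XOR with the keystream.
    return [c ^ k for c, k in zip(cipher_bits, key_bits)]
```

A ciphertext-only attacker would score each candidate keystream by the balance statistic and by how language-like the resulting plaintext looks.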
The experiment was conducted to test the sorting and grading of agricultural crops using image analysis technology. A locally factory-made, cube-shaped studio with dimensions 50 * 75 * 75 and a camera with a charge-coupled device sensor were used. The studio was equipped with triple lighting (red - green - blue), and photos were taken in the studio to study the external characteristics of the fruits (damage, quality, and maturity) using image processing technology and artificial neural networks. The artificial neural network was used to predict damage; the regression value was 0.92062, the regression for quality was 0.97981, and the regression for maturity was 0.98654, by means of a regression scheme using the network and the Marr algorithm.
Hexapod robot is a flexible mechanical robot with six legs. It has the ability to walk over terrain. The hexapod robot looks like an insect, so it has the same gaits; these gaits are the tripod, wave, and ripple gaits. The hexapod robot needs to stay statically stable at all times during each gait in order not to fall, with three or more legs continuously in contact with the ground. The margin of safe static-stability walking is called the stability margin. In this paper, the forward and inverse kinematics are derived for each of the hexapod's legs in order to simulate the walking of the hexapod robot model in MATLAB R2010a for all gaits, and the geometry is used to derive the equations of the sub-constraint workspaces for each leg.
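As an illustrative sketch (the link lengths and joint conventions here are assumptions, not the paper's), the forward and inverse kinematics of a planar two-link leg segment relate joint angles to the foot position; each of the six legs is treated the same way:

```python
import math

def forward_kinematics(theta1, theta2, l1=0.05, l2=0.10):
    # Foot position of a planar 2-link leg: hip angle theta1, knee angle theta2
    # (radians); l1 and l2 are assumed link lengths in metres.
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(x, y, l1=0.05, l2=0.10):
    # Closed-form 2-link inverse kinematics (elbow-down solution) via the
    # law of cosines for the knee and an angle offset for the hip.
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

With both joints at zero the leg is fully stretched, so the foot sits at x = l1 + l2; running the inverse kinematics on a forward-kinematics result recovers the original joint angles.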
Permeability data is of major importance and should be handled carefully in all reservoir simulation studies. The importance of permeability data increases in mature oil and gas fields because of its sensitivity to the requirements of some specific improved-recovery methods. However, the industry has a huge source of air permeability measurements against a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of data loss, in poorly consolidated formations, or in cases where direct liquid permeability measurements are not available.
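A hedged sketch of how such a correlation might be fitted (the power-law form k_liquid = a * k_air^b and the data are illustrative assumptions, not the study's actual correlation): taking logarithms turns the fit into ordinary linear least squares.

```python
import math

def fit_power_law(k_air, k_liquid):
    # Fit k_liquid = a * k_air**b by least squares in log-log space.
    xs = [math.log(k) for k in k_air]
    ys = [math.log(k) for k in k_liquid]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

def liquid_permeability(k_air, a, b):
    # Predict liquid permeability from an air measurement.
    return a * k_air ** b
```

Fitting on a hypothetical dataset that exactly obeys k_liquid = 0.6 * k_air^1.2 recovers those coefficients, and the fitted pair then converts any new air measurement.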