In this paper, using δ-semi.open sets, we introduce the concepts of weakly δ-semi.normal and δ-semi.normal spaces. Many of their properties and results are investigated and studied. We also present the notion of δ-semi.compact spaces and compare them with δ-semi.regular spaces.
Background: This study aimed to assess the effect of tooth width on malocclusion in normal, crowded, and spaced dentitions. Materials and methods: The sample comprised dental casts of dental students and orthodontic patients aged 18-25 years. It was divided equally into three groups (normal, crowding, and spacing dentition); each group contained 50 maxillary and 50 mandibular casts, further subdivided by gender. All stone casts were measured with a highly sensitive digital vernier caliper. Results and Conclusions: No significant side difference was found in either dental arch in the three studied groups. Males had higher mesiodistal …
The aqueous crude extract of Silybum marianum dry grains was prepared by dissolving them in distilled water using the soak-and-shake method. The effect of the Silybum marianum crude extract was studied in vitro on three tumor cell lines (Hep-2, AMN-3, and RD) for 24, 48, and 72 hours of exposure, and on one normal cell line (REF) for 72 hours of exposure. The results showed a toxic effect of the aqueous crude extract on the Hep-2, AMN-3, and RD cell lines at 10 and 100 µg/ml and up to the higher concentrations after 48 hours of exposure, compared with the control treatment; when the exposure period was increased to 72 hours, the toxic effect started at lower concentrations (5 and 10 µg/ml) compared with the control group.
Excessive skewness, which sometimes occurs in data, is an obstacle to assuming a normal distribution. Recent studies have therefore been active in examining the skew-normal distribution (SND), which fits skewed data and is regarded as a generalization of the normal distribution with an additional skewness parameter (α) that gives it more flexibility. When estimating the parameters of the SND, we face nonlinear likelihood equations, and the solutions obtained by maximum likelihood (ML) estimation can be inaccurate and unreliable. To solve this problem, two methods can be used: the genetic algorithm (GA) and the iterative reweighting algorithm (IR) based on the M…
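As a minimal sketch of the distribution discussed above: the standard skew-normal density is f(x; α) = 2φ(x)Φ(αx), which reduces to the standard normal at α = 0. The snippet below verifies this identity and runs an illustrative ML fit with SciPy's built-in numerical optimizer; the sample size and true α used here are assumptions for demonstration, not values from the paper.

```python
# Skew-normal density f(x; a) = 2*phi(x)*Phi(a*x) and an illustrative ML fit.
# Sample parameters (a = 4, n = 2000) are assumptions for demonstration only.
import numpy as np
from scipy.stats import skewnorm, norm

def sn_pdf(x, a):
    """Standard skew-normal density: 2 * phi(x) * Phi(a * x)."""
    return 2.0 * norm.pdf(x) * norm.cdf(a * x)

# With a = 0 the skew-normal reduces to the standard normal density.
assert np.isclose(sn_pdf(0.0, 0.0), norm.pdf(0.0))

# Direct numerical ML estimation; the GA and iterative-reweighting methods
# mentioned in the abstract address cases where this kind of direct
# likelihood maximization becomes unstable.
rng = np.random.default_rng(0)
data = skewnorm.rvs(a=4.0, size=2000, random_state=rng)
a_hat, loc_hat, scale_hat = skewnorm.fit(data)
print(a_hat, loc_hat, scale_hat)
```

The manual density agrees with `scipy.stats.skewnorm.pdf`, so either form can be used when building a custom likelihood for GA-based estimation.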
This study focused on the expression and regulation of BRCA1 in breast cancer cell lines compared with normal breast tissue. BRCA1 transcript levels were assessed by real-time quantitative polymerase chain reaction (RT-qPCR) in the cancer cell lines. Our data show overexpression of BRCA1 mRNA in all the studied breast cancer cell lines (MCF-7, T47D, MDA-MB-231, and MDA-MB-468), along with Jurkat, a leukemia T-lymphocyte line used as the positive control, relative to normal breast tissue. To investigate whether a positive or negative correlation exists between BRCA1 and the transcription factor E2F6, three different siRNAs specific for E2F6 were used to transfect the normal and cancerous breast cell lines. Interestingly, a strong negative relationship was found between BRCA1 and E2F6.
Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effects of multiples and delineate the correct primary reflectors. Applying normal moveout to flatten the primaries makes it possible to eliminate multiples by transforming the data to the frequency-wavenumber (f-k) domain. The flattened primaries align with the zero-wavenumber axis of the f-k domain, while other event types (multiples and random noise) are distributed elsewhere. A dip filter applied to pass the aligned data and reject the others will then separate primaries from multiples once the data are transformed back from the f-k domain to the time-distance domain. For that reason, a suggested name for this technique is the normal moveout-frequency-wavenumber domain technique.
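The separation step described above can be sketched numerically. In this simplified illustration (not the paper's exact processing flow), NMO-flattened primaries are constant along the trace axis, so their energy concentrates at wavenumber k = 0; a dip filter that passes only k ≈ 0 keeps them and attenuates dipping events such as residual multiples. The synthetic gather is an assumption for demonstration.

```python
# Simplified f-k dip-filter sketch: flattened primaries live at k = 0,
# dipping events (e.g. residual multiples) are spread over other wavenumbers.
# The synthetic gather below is an illustrative assumption.
import numpy as np

def keep_zero_dip(gather):
    """FFT along the trace axis, keep only the k = 0 column, invert."""
    fk = np.fft.fft(gather, axis=1)
    mask = np.zeros(gather.shape[1])
    mask[0] = 1.0                       # pass only zero wavenumber
    return np.fft.ifft(fk * mask, axis=1).real

nt, nx = 64, 32
flat = np.zeros((nt, nx)); flat[20, :] = 1.0    # flattened primary event
dip = np.zeros((nt, nx))
for ix in range(nx):
    dip[10 + ix, ix] = 1.0                      # dipping (multiple-like) event

out = keep_zero_dip(flat + dip)
# The flat event is passed essentially unchanged; the dipping event is
# strongly attenuated, mimicking the primary/multiple separation.
```

A production dip filter would pass a tapered wedge around k = 0 in the full 2-D f-k plane rather than a single wavenumber column, but the principle is the same.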
In this paper, reliable computational methods (RCMs) based on monomial standard polynomials are executed to solve the problem of Jeffery-Hamel flow (JHF). In addition, convenient basis functions, namely the Bernoulli, Euler, and Laguerre polynomials, are used to enhance the reliability of the computational methods. Using such functions turns the problem into a solvable set of nonlinear algebraic equations, which Mathematica® 12 can solve. The JHF problem is solved with the help of improved reliable computational methods (I-RCMs), and a review of the methods is given. Published results are also used for comparison. As further evidence of the accuracy and dependability of the proposed methods, the maximum error remainder …
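The core idea of such polynomial-based methods can be sketched as follows. This is an illustrative assumption, not the paper's exact scheme: a monomial-basis collocation that turns a nonlinear BVP into a nonlinear algebraic system, here solved with SciPy instead of Mathematica 12, on the standard benchmark u'' = 1.5u², u(0) = 4, u(1) = 1 (exact solution u = 4/(1+x)²).

```python
# Monomial collocation sketch (assumed benchmark, not the JHF equations):
# expand u(x) = sum c_i x^i, enforce the ODE at interior points plus the
# boundary conditions, and solve the resulting nonlinear algebraic system.
import numpy as np
from scipy.optimize import fsolve

N = 8                                        # polynomial degree
xs = np.linspace(0.0, 1.0, N + 1)[1:-1]      # interior collocation points

def u(c, x):
    return sum(c[i] * x**i for i in range(N + 1))

def upp(c, x):
    return sum(i * (i - 1) * c[i] * x**(i - 2) for i in range(2, N + 1))

def equations(c):
    res = [upp(c, x) - 1.5 * u(c, x)**2 for x in xs]   # ODE residuals
    res += [u(c, 0.0) - 4.0, u(c, 1.0) - 1.0]          # boundary conditions
    return res

c0 = np.zeros(N + 1); c0[0], c0[1] = 4.0, -3.0          # linear initial guess
c = fsolve(equations, c0)
print(u(c, 0.5))    # exact solution gives 4/(1.5)^2 = 4/2.25
```

Swapping the monomials for Bernoulli, Euler, or Laguerre polynomials, as the paper does, changes only the basis functions in `u` and `upp`; the nonlinear-system structure is the same.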
In this work, a study and calculation of the normal approach between two bodies, a sphere and a rough flat surface, was conducted with the aid of an image-processing technique. Four metals with different work-hardening indices were used as surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach was obtained. The compression tests were carried out in the strength-of-materials laboratory of the mechanical engineering department, where a Monsanto tensometer was used to conduct the indentation tests.
A light-section measuring microscope (BK 70x50) was used to calculate the surface texture profile parameters, such as the standard deviation of asperity peak heights and the centre-line …