A new, simple, sensitive and fast method was developed for the determination of methyldopa in pure form and in pharmaceutical formulations using continuous flow injection analysis. The method is based on the formation of a burgundy-colored complex between methyldopa and ammonium cerium(IV) nitrate in aqueous medium, using a long-distance chasing photometer (NAG-ADF-300-2). The linear range of the calibration graph was 0.05-8.3 mmol/L for cell A and 0.1-8.5 mmol/L for cell B, with LODs of 952.8 ng/200 µL for cell A and 3.3348 µg/200 µL for cell B, respectively, and correlation coefficients (r) of 0.9994 for cell A and 0.9991 for cell B; RSD% was lower than 1% for n = 8. The results were compared with a classical UV-spectrophotometric method at λmax = 280 nm and a turbidimetric method, using the standard addition method, via a t-test at the 95% confidence level. An F-test was also applied to determine which of the methods is more precise. The comparison of the data shows that the long-distance chasing photometer (NAG-ADF-300-2) is the method of choice, with excellent extended detection, wide applicability and higher sensitivity.
The main problem of the current study concentrates on applying critical discourse analysis to examine textual, discoursal and social features of reduplication in selected English newspaper headlines. The main aim of the study is to analyze the linguistic features of reduplication by adopting Fairclough's three-dimensional model (2001). The study sets forth the following hypotheses: (1) English newspaper headlines comprise various textual, discoursal and social features; (2) the model of analysis is best suited for the current study. To achieve the aims and verify the hypotheses, a critical discourse analysis approach is used, represented by Fairclough's socio-cultural approach (2001). The present study has examined the use of
Today in the digital realm, images constitute a massive share of social media content but suffer from two issues, size and transmission, for which compression is the ideal solution. Pixel-based techniques are modern, spatially optimized modeling techniques with deterministic and probabilistic bases that involve mean, index, and residual components. This paper introduces adaptive pixel-based coding techniques for the probabilistic part of a lossy scheme by incorporating the MMSA of the C321 base, while utilizing the deterministic part losslessly. The tested results achieved higher size-reduction performance compared to the traditional pixel-based techniques and standard JPEG, by about 40% and 50%,
In the field of data security, the critical challenge of preserving sensitive information during its transmission through public channels takes centre stage. Steganography, a method employed to conceal data within various carrier objects such as text, can be proposed to address these security challenges. Text, owing to its extensive usage and constrained bandwidth, stands out as an optimal medium for this purpose. Despite the richness of the Arabic language in its linguistic features, only a small number of studies have explored Arabic text steganography. Arabic text, characterized by its distinctive script and linguistic features, has gained notable attention as a promising domain for steganographic ventures. Arabic text steganography harn
Some nonlinear differential equations of fractional order are solved using a novel approach, the Sumudu and Adomian Decomposition Technique (STADM). To obtain the results of the given model, the Sumudu transformation and an iterative technique are employed. The suggested method has an advantage over alternative strategies in that it does not require additional resources or calculations. The approach works well, is easy to use, and yields good results. The solution graphs are plotted using MATLAB software, and the true solution of the fractional Newell-Whitehead equation is shown together with the approximate solutions of STADM. The results showed our approach is a reliable and easy method to deal with specific problems in
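The abstract does not give the equation's exact form; as an assumed illustration, a commonly studied time-fractional Newell-Whitehead(-Segel) form, with the nonlinearity that STADM handles via Adomian polynomials, can be written as:

```latex
% Time-fractional Newell-Whitehead(-Segel) equation (assumed standard form),
% with a Caputo time derivative of order 0 < alpha <= 1:
\[
  \frac{\partial^{\alpha} u}{\partial t^{\alpha}}
  = k\,\frac{\partial^{2} u}{\partial x^{2}} + a\,u - b\,u^{q},
  \qquad 0 < \alpha \le 1 .
\]
% STADM applies the Sumudu transform in t, uses the initial condition to
% obtain the zeroth iterate u_0, expands the nonlinear term u^q in Adomian
% polynomials A_n, and recovers successive iterates u_{n+1} by inverting
% the transform, so the approximate solution is the partial sum of u_n.
```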
In this paper, we provide a proposed method to estimate missing values of the explanatory variables in a non-parametric multiple regression model and compare it with imputation by the arithmetic mean. The idea of the method is based on employing the causal relationship between the variables to find an efficient estimate of the missing value. We rely on the kernel estimate via the Nadaraya-Watson estimator, and on least squares cross-validation (LSCV) to estimate the bandwidth, and we use a simulation study to compare the two methods.
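The abstract names the Nadaraya-Watson estimator as the imputation engine. A minimal sketch of the idea follows, assuming a Gaussian kernel, a single related covariate `x` used to predict the variable with missing entries, and a user-supplied bandwidth `h` (the paper selects `h` by LSCV, which is not reproduced here); all function names are illustrative, not from the paper.

```python
import numpy as np

def nadaraya_watson(x_obs, y_obs, x_query, h):
    """Nadaraya-Watson regression estimate at x_query with a Gaussian kernel:
    a kernel-weighted average of the observed y values."""
    u = (x_query - x_obs) / h
    w = np.exp(-0.5 * u ** 2)          # Gaussian kernel weights
    return np.sum(w * y_obs) / np.sum(w)

def impute_missing(x, y, h):
    """Fill NaN entries of y using the NW estimate built from the
    complete (x, y) pairs, exploiting the relationship between x and y."""
    y = y.astype(float).copy()
    obs = ~np.isnan(y)                 # mask of complete cases
    for i in np.where(~obs)[0]:
        y[i] = nadaraya_watson(x[obs], y[obs], x[i], h)
    return y

# Usage: y = x^2 with one value knocked out; the imputed value should sit
# close to the true curve at that point.
x = np.linspace(0.0, 1.0, 21)
y = x ** 2
y[10] = np.nan                         # true value is 0.25 at x = 0.5
y_filled = impute_missing(x, y, h=0.1)
```

In practice the bandwidth controls the bias-variance trade-off, which is why the paper tunes it by cross-validation rather than fixing it by hand.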
This paper proposes a completion that can allow fracturing four zones in a single trip in the well called "Y" (for confidential reasons) of the field named "X" (for confidential reasons). The steps to design a well completion for multiple fracturing are first to select the best completion method, then the required equipment and the materials it is made of. After that, the completion schematic must be drawn, using Power Draw in this case, and the summary installation procedures explained. The data used to design the completion are the well trajectory, the reservoir data (including temperature, pressure and fluid properties), and the production and injection strategy. The results suggest that multi-stage hydraulic fracturing can
Deep learning convolution neural networks have been widely used to recognize or classify voice. Various techniques have been used together with convolution neural networks to prepare voice data before the training process in developing a classification model. However, not all models can produce good classification accuracy, as there are many types of voice or speech. Classification of Arabic alphabet pronunciation is one such type, and accurate pronunciation is required in learning Qur'an reading. Thus, processing the pronunciation and training on the processed data require a specific approach. To overcome this issue, a method based on padding and a deep learning convolution neural network is proposed to
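The abstract's key preprocessing step is padding, which forces variable-length recordings into the fixed input shape a CNN expects. A minimal sketch under that assumption (the paper's exact padding scheme, target length, and network are not given; the names and the 16000-sample target below are illustrative):

```python
import numpy as np

def pad_to_length(signal, target_len, value=0.0):
    """Zero-pad a 1-D audio signal to target_len (or truncate if longer),
    so every recording has the same shape before entering the CNN."""
    signal = np.asarray(signal, dtype=np.float32)
    if len(signal) >= target_len:
        return signal[:target_len]
    out = np.full(target_len, value, dtype=np.float32)
    out[: len(signal)] = signal
    return out

# Usage: three recordings of different lengths become one uniform batch
# that a Conv1D model can consume directly.
rng = np.random.default_rng(0)
batch = [rng.standard_normal(n).astype(np.float32) for n in (8000, 12000, 16000)]
fixed = np.stack([pad_to_length(s, 16000) for s in batch])
# fixed.shape is (3, 16000); the padded tails are zeros.
```

Zero-padding at the tail is only one choice; centering the signal or padding with edge values are common alternatives when silence at the end carries no information.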
ABSTRACT This paper has a three-pronged objective: offering a unitary set of semantic distinctive features for the analysis of nominal "hatred synonyms" in the lexicon of both English and Standard Arabic (SA), applying it procedurally to test its scope of functionality cross-linguistically, and singling out the closest synonymous noun equivalents among the members of the two sets in this particular lexical semantic field in both languages. The componential analysis and the matching procedures carried out have been functional in identifying ten totally matching equivalents (i.e. 55.6%) and eight partially matching ones (i.e. 44.4%). This result shows that while total matching equivalences do exist in the translation of certain Eng