A simple, economical and selective method employing ion-pair dispersive liquid−liquid microextraction (DLLME) coupled with spectrophotometric determination of carbamazepine (CBZ) in pharmaceutical preparations and biological samples was developed. The method is based on the reduction of Mo(VI) to Mo(V) using a combination of ammonium thiocyanate and ascorbic acid in acidic medium to form a red binary Mo(V) thiocyanate complex. After addition of CBZ to the complex, the formed CBZ−Mo(V)−(SCN)6 was extracted using a mixture of methylene chloride and methanol, and the absorbance of the extracted complex was then measured at 470 nm. The extraction parameters affecting the efficiency of DLLME were studied and optimized in detail. Under optimum conditions, the linear range was 0.02–0.2 µg/mL, and the limits of detection and quantification were 0.01 and 0.04 µg/mL, respectively. A high enrichment factor (118) was obtained, and good recoveries (93–102%) were achieved at 0.06, 0.15 and 0.2 µg/mL. The proposed method was successfully applied to the determination of CBZ in pharmaceutical formulations and biological samples.
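As a rough illustration of how the figures of merit reported above (calibration slope, LOD, LOQ) are typically computed from a calibration curve, the sketch below fits a least-squares line to hypothetical absorbance readings at 470 nm and applies the common 3.3σ/S and 10σ/S formulas. The absorbance values are invented for illustration and are not the paper's measurements.

```python
# Hypothetical calibration data within the reported linear range
# (concentration in ug/mL vs. absorbance at 470 nm) -- illustrative only.
conc = [0.02, 0.05, 0.10, 0.15, 0.20]
absorb = [0.11, 0.26, 0.52, 0.77, 1.03]

n = len(conc)
mx = sum(conc) / n
my = sum(absorb) / n
slope = sum((x - mx) * (y - my) for x, y in zip(conc, absorb)) \
        / sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx

# Residual standard deviation of the regression line
resid = [y - (intercept + slope * x) for x, y in zip(conc, absorb)]
s = (sum(r * r for r in resid) / (n - 2)) ** 0.5

# Common sigma/slope-based detection and quantification limits
lod = 3.3 * s / slope
loq = 10.0 * s / slope

def concentration(a):
    """Back-calculate a sample concentration from a measured absorbance."""
    return (a - intercept) / slope
```

The same back-calculation step is what converts a sample's absorbance into the reported recovery percentages.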
A comparison of double informative and non-informative priors assumed for the parameter of the Rayleigh distribution is considered. Three different pairs of priors are examined for the single unknown parameter of the Rayleigh distribution: the square root inverted gamma (SRIG, the natural conjugate family of priors) paired with the natural conjugate prior, the SRIG paired with a non-informative prior, and the natural conjugate prior paired with a non-informative prior. The data are generated from the Rayleigh distribution for three cases with different sample sizes (small, medium, and large), and Bayes estimators for the parameter are derived under a squared error…
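A minimal sketch of the estimation setting described above, under simplifying assumptions of my own: Rayleigh data are generated by inverse transform, and the Bayes estimator of θ = σ² under squared error loss is taken as the posterior mean from an inverse-gamma conjugate prior (the prior hyperparameters a, b are illustrative, not from the paper).

```python
import math
import random

random.seed(42)

sigma_true = 2.0
n = 500

# Rayleigh samples by inverse transform: X = sigma * sqrt(-2 ln U), U in (0, 1]
xs = [sigma_true * math.sqrt(-2.0 * math.log(1.0 - random.random()))
      for _ in range(n)]

s2 = sum(x * x for x in xs)   # sufficient statistic sum(x_i^2)
mle = s2 / (2 * n)            # maximum-likelihood estimate of theta = sigma^2

# Inverse-gamma(a, b) prior on theta; the posterior is IG(a + n, b + s2/2),
# so the Bayes estimator under squared error loss is the posterior mean.
a, b = 2.0, 1.0               # illustrative hyperparameters
bayes = (b + s2 / 2) / (a + n - 1)
```

With a large sample the Bayes estimate and the MLE nearly coincide, which is the kind of behavior such simulation comparisons examine across small, medium, and large sample sizes.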
Cancer is generally not the result of an abnormality in a single gene but a consequence of changes in many genes; it is therefore of great importance to understand the roles of different oncogenic and tumor-suppressor pathways in tumorigenesis. In recent years, many computational models have been developed to study the genetic alterations of different pathways in the evolutionary process of cancer. However, most of these methods are knowledge-based enrichment analyses and are inflexible for analyzing user-defined pathways or gene sets. In this paper, we develop a nonparametric, data-driven approach to testing for dynamic changes of pathways over cancer progression. Our method is based on an expansion and refinement of the pathway…
Because of the significance of image compression in reducing the volume of data, compression remains a permanent requirement: compressed data can be transferred more quickly over communication channels and stored in less memory. In this study, an efficient compression system is suggested; it relies on transform coding (the Discrete Cosine Transform or the bi-orthogonal tap-9/7 wavelet transform) combined with the LZW compression technique. The suggested scheme was applied to color and gray-scale models, with the transform coding applied to decompose each color and gray sub-band individually. Quantization is then performed, followed by LZW coding to compress the images. The suggested system was applied to a set of seven standard…
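The transform-then-entropy-code pipeline described above can be sketched in miniature: a naive 1-D DCT-II of one block, coarse quantization, then a textbook LZW coder over the resulting bytes. This is a minimal stand-in for the paper's system (which works on 2-D sub-bands of full images), not a reimplementation of it.

```python
import math

def dct(block):
    """Naive orthonormal 1-D DCT-II of a length-N block."""
    N = len(block)
    out = []
    for k in range(N):
        s = sum(block[n] * math.cos(math.pi * (n + 0.5) * k / N)
                for n in range(N))
        scale = math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
        out.append(scale * s)
    return out

def lzw_encode(data):
    """Textbook LZW over byte values; returns a list of integer codes."""
    table = {bytes([i]): i for i in range(256)}
    w, codes, nxt = b"", [], 256
    for b in data:
        wb = w + bytes([b])
        if wb in table:
            w = wb
        else:
            codes.append(table[w])
            table[wb] = nxt
            nxt += 1
            w = bytes([b])
    if w:
        codes.append(table[w])
    return codes

def lzw_decode(codes):
    """Inverse of lzw_encode, rebuilding the dictionary on the fly."""
    table = {i: bytes([i]) for i in range(256)}
    nxt = 256
    w = table[codes[0]]
    out = bytearray(w)
    for c in codes[1:]:
        entry = table[c] if c in table else w + w[:1]
        out += entry
        table[nxt] = w + entry[:1]
        nxt += 1
        w = entry
    return bytes(out)

# Transform one block, quantize coarsely (many coefficients collapse to
# small repeated values, which LZW then codes compactly).
block = [52, 55, 61, 66, 70, 61, 64, 73]
q = [round(c / 10) for c in dct(block)]
payload = bytes((v + 128) % 256 for v in q)  # shift into byte range
codes = lzw_encode(payload)
```

The decoder reverses only the lossless LZW stage; the quantization step is where the scheme trades fidelity for compression, exactly as in the full 2-D system.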
The objective of this research is to assess the economic feasibility of hydroponics technology by estimating the expected demand for green forage for the years 2021–2031, and to identify and analyze project data and information in a way that helps the investor make an appropriate investment decision. In addition, a detailed preliminary technical study of the cultivated-barley project is prepared, focusing on the commercial and financing aspects and on criteria that account for risk and uncertainty, indicating the economic feasibility of the project to produce green forage using hydroponics technology. Cultured barley as a product falls within the blue-ocean strategy. Accordingly, the research recommends the necessity…
Image classification is the process of finding common features in images from various classes and using them to categorize and label the images. The key obstacles in image classification are the abundance of images, the high complexity of the data, and the shortage of labeled data. The cornerstone of image classification is extracting convolutional features with deep learning models and training machine learning classifiers on them. This study proposes a new "hybrid learning" approach that combines deep learning with machine learning for image classification, based on convolutional feature extraction using the VGG-16 deep learning model and seven classifiers…
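The hybrid pipeline described above (deep feature extraction feeding a classical classifier) can be sketched without the heavy VGG-16 dependency. Here a trivial hand-rolled feature extractor stands in for the CNN, and a minimal nearest-centroid classifier stands in for the seven classifiers the study evaluates; both substitutions are mine, made only to show the two-stage structure.

```python
import random
import statistics

random.seed(1)

def extract_features(image):
    """Stand-in for a deep feature extractor (the study uses VGG-16):
    here, simple summary statistics of the pixel values."""
    return [statistics.fmean(image), statistics.pstdev(image)]

class NearestCentroid:
    """Minimal classical classifier trained on the extracted features."""
    def fit(self, X, y):
        self.centroids = {}
        for label in set(y):
            pts = [x for x, l in zip(X, y) if l == label]
            self.centroids[label] = [sum(col) / len(pts) for col in zip(*pts)]
        return self

    def predict(self, x):
        return min(self.centroids,
                   key=lambda l: sum((a - b) ** 2
                                     for a, b in zip(x, self.centroids[l])))

# Synthetic "dark" vs "bright" images stand in for two image classes.
dark = [[random.gauss(50, 10) for _ in range(64)] for _ in range(20)]
bright = [[random.gauss(200, 10) for _ in range(64)] for _ in range(20)]

X = [extract_features(im) for im in dark + bright]
y = ["dark"] * 20 + ["bright"] * 20
clf = NearestCentroid().fit(X, y)
```

In the real system the only change is the first stage: `extract_features` would run each image through VGG-16 and return its convolutional feature vector.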
The analysis of survival and reliability is among the vital statistical topics and methods of the present time because of its importance in various demographic, medical, industrial and engineering fields. This research focuses on generating random samples from the generalized gamma (GG) probability distribution using the inverse transformation method (ITM). Because the distribution function involves the incomplete gamma integral, which has no closed form, classical estimation is difficult; a numerical approximation method is therefore illustrated, and the survival function is then estimated by Monte Carlo simulation. The entropy method was used for the…
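The combination of steps named above (numerical approximation of the incomplete gamma integral, inverse-transform sampling, Monte Carlo estimation of the survival function) can be sketched as follows. The GG parameter values are illustrative, and the CDF is approximated by a simple trapezoid rule with bisection for the inverse, as one plausible reading of the "numerical approximation" the abstract refers to.

```python
import math
import random

random.seed(7)

# Generalized gamma density (Stacy form) with scale A and shapes D, P.
A, D, P = 1.0, 2.0, 1.5

def pdf(x):
    if x <= 0:
        return 0.0
    return (P / (A ** D * math.gamma(D / P))) * x ** (D - 1) \
        * math.exp(-(x / A) ** P)

def cdf(x, steps=200):
    """CDF via composite trapezoid rule: the incomplete gamma integral
    has no closed form, so it is approximated numerically."""
    if x <= 0:
        return 0.0
    h = x / steps
    s = 0.5 * (pdf(0.0) + pdf(x)) + sum(pdf(i * h) for i in range(1, steps))
    return s * h

def inverse_cdf(u, lo=0.0, hi=20.0):
    """Inverse-transform step: solve F(x) = u by bisection."""
    for _ in range(40):
        mid = (lo + hi) / 2
        if cdf(mid) < u:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Generate a GG sample by the inverse transformation method.
sample = [inverse_cdf(random.random()) for _ in range(100)]

def survival_mc(t):
    """Monte Carlo estimate of S(t) = P(X > t) from the generated sample."""
    return sum(x > t for x in sample) / len(sample)
```

The empirical survival curve obtained this way is the quantity whose estimators the research then compares.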
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components, and these methods were compared according to their results in estimating the component parameters. Observation membership was also inferred and assessed for each method. The results showed that the flexible mixture model outperformed the…
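A minimal sketch of the mixture-regression idea: data generated from two regression lines are fitted with a two-component EM algorithm, where the E-step assigns each observation a membership probability and the M-step refits each line by weighted least squares. The data, initial values, and two-component restriction are all simplifying assumptions of this sketch, not the paper's setup.

```python
import math
import random

random.seed(3)

# Synthetic data from a two-component mixture of linear regressions.
xs, ys = [], []
for _ in range(100):
    x = random.uniform(0, 3)
    xs.append(x)
    ys.append(1 + 2 * x + random.gauss(0, 0.3))   # component 1: y = 1 + 2x
for _ in range(100):
    x = random.uniform(0, 3)
    xs.append(x)
    ys.append(5 - 1 * x + random.gauss(0, 0.3))   # component 2: y = 5 - x

def normal_pdf(e, s2):
    return math.exp(-e * e / (2 * s2)) / math.sqrt(2 * math.pi * s2)

def wls(xs, ys, w):
    """Weighted least-squares fit of y = a + b*x; returns (a, b, sigma^2)."""
    sw = sum(w)
    mx = sum(wi * x for wi, x in zip(w, xs)) / sw
    my = sum(wi * y for wi, y in zip(w, ys)) / sw
    b = sum(wi * (x - mx) * (y - my) for wi, x, y in zip(w, xs, ys)) \
        / sum(wi * (x - mx) ** 2 for wi, x in zip(w, xs))
    a = my - b * mx
    s2 = max(sum(wi * (y - a - b * x) ** 2
                 for wi, x, y in zip(w, xs, ys)) / sw, 1e-6)
    return a, b, s2

# EM, started from two rough candidate lines: (a, b, sigma^2, weight)
params = [(0.0, 1.5, 1.0, 0.5), (4.0, -0.5, 1.0, 0.5)]
for _ in range(50):
    # E-step: responsibility of component 1 for each observation
    r1 = []
    for x, y in zip(xs, ys):
        p1 = params[0][3] * normal_pdf(y - params[0][0] - params[0][1] * x,
                                       params[0][2])
        p2 = params[1][3] * normal_pdf(y - params[1][0] - params[1][1] * x,
                                       params[1][2])
        r1.append(p1 / (p1 + p2))
    # M-step: weighted refit of each component
    new = []
    for r in (r1, [1 - t for t in r1]):
        a, b, s2 = wls(xs, ys, r)
        new.append((a, b, s2, sum(r) / len(r)))
    params = new
```

The final responsibilities `r1` are the inferred observation memberships that the paper's comparison assesses alongside the component parameter estimates.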
Evolutionary algorithms (EAs), as global search methods, have proved more robust than their local heuristic counterparts for detecting protein complexes in protein-protein interaction (PPI) networks. The robustness of these EAs typically comes from their components and parameters: solution representation, selection, crossover, and mutation. Unfortunately, almost all EA-based complex detection methods suggested in the literature were designed with only canonical or traditional components. Further, the topological structure of the protein network is the main information used in the design of almost all such components. The main contribution of this paper is to formulate a more robust EA…
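The canonical EA components named above (representation, selection, crossover, mutation) can be shown on a toy version of the complex-detection task: a bitstring selects a node subset of a small graph, and fitness rewards internal edge density. The graph, fitness function, and operator choices are illustrative canonical ones, not the paper's proposed design.

```python
import random

random.seed(5)

# Toy PPI network: nodes 0-3 form a dense "complex" (a clique);
# the remaining edges are sparse background.
N = 10
edges = {(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3),
         (4, 5), (6, 7), (0, 4), (5, 8)}

def fitness(bits):
    """Reward edges inside the selected node set, penalise missing pairs."""
    sel = [i for i in range(N) if bits[i]]
    inside = sum(1 for i in sel for j in sel if i < j and (i, j) in edges)
    pairs = len(sel) * (len(sel) - 1) // 2
    return 2 * inside - pairs

def tournament(pop):
    """Canonical binary tournament selection."""
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(30)]
for _ in range(150):
    nxt = [max(pop, key=fitness)]  # elitism: keep the current best
    while len(nxt) < len(pop):
        p1, p2 = tournament(pop), tournament(pop)
        child = [random.choice(g) for g in zip(p1, p2)]       # uniform crossover
        child = [g ^ 1 if random.random() < 0.1 else g        # bit-flip mutation
                 for g in child]
        nxt.append(child)
    pop = nxt

best = max(pop, key=fitness)  # expected to recover the planted clique {0,1,2,3}
```

The paper's argument is precisely that these canonical operators, which here only see topology, can be made more robust by redesigning them with additional information.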