In this paper, simulation studies and applications of the New Weibull-Inverse Lomax (NWIL) distribution were presented. The simulation studies considered sample sizes of 30, 50, 100, 200, 300, and 500, with 1,000 replications per experiment. The NWIL distribution is fat-tailed, so higher moments are not easily derived except through approximation; nevertheless, the estimates show high precision with low variances. Finally, the usefulness of the NWIL distribution was illustrated by fitting two data sets.
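The replication scheme described above can be sketched as a small Monte Carlo loop. Since the NWIL density itself is not reproduced in this excerpt, a simple exponential model with its closed-form maximum-likelihood estimator stands in for the NWIL sampler and estimator; the sample sizes, replication count, and the `true_rate` value are the only quantities taken from (or illustrating) the text.

```python
import random
import statistics

random.seed(42)

true_rate = 2.0          # stand-in parameter; the NWIL density is not given in this excerpt
sample_sizes = [30, 50, 100, 200, 300, 500]
replications = 1000

for n in sample_sizes:
    # For each sample size, replicate the experiment and record the MLE.
    estimates = []
    for _ in range(replications):
        sample = [random.expovariate(true_rate) for _ in range(n)]
        estimates.append(1.0 / statistics.mean(sample))  # exponential-rate MLE
    bias = statistics.mean(estimates) - true_rate
    var = statistics.variance(estimates)
    print(f"n={n:3d}  bias={bias:+.4f}  variance={var:.5f}")
```

Running this shows the pattern the abstract reports: as the sample size grows from 30 to 500, the variance of the estimates shrinks and the bias approaches zero.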
A series of new coumarin and N-amino-2-quinolone derivatives has been synthesized. The reaction of coumarin (1) with an excess of 98% hydrazine hydrate yielded 1-amino-2-quinolone (2). Compound (2) was reacted with different sulfonyl chlorides to yield the sulfonamides N-(2-oxoquinolin-1(2H)-yl)methanesulfonamide (3), N-(2-oxoquinolin-1(2H)-yl)benzenesulfonamide (4), and 4-methyl-N-(2-oxoquinolin-1(2H)-yl)benzenesulfonamide (5), while the reaction of 2-(4-methyl-2-oxo-2H-chromen-7-yloxy)acetic acid (8) with different amines yielded the compounds 2-(4-methyl-2-oxo-2H-chromen-7-yloxy)-N-(2-oxoquinolin-1(2H)-yl)acetamide (9) and N-(5-methyl-1,3,4-thiadiazol-2-yl)-2-(4-methyl-2-oxo-2H-chromen-7-yloxy)acetamide (10) th
Images hold important information, especially in military and commercial surveillance as well as in industrial inspection and communication. Therefore, protecting images from abuse, unauthorized access, and damage has become a significant demand. This paper introduces a new Beta chaotic map for encrypting and confusing a color image with a Deoxyribonucleic Acid (DNA) sequence. First, the DNA addition operation is used to diffuse each component of the plain image. Then, the new Beta chaotic map is used to shuffle the DNA color image. In addition, two chaotic maps, namely the proposed new Beta and Sine chaotic maps, are used for key generation. Finally, the DNA XOR operation is applied between the generated key and the shuffled DNA image
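The DNA-level operations mentioned above can be sketched at the level of a single byte. The coding rule below (00→A, 01→C, 10→G, 11→T) is one common convention and is an assumption here; the paper's actual rule tables, Beta/Sine map parameters, and key schedule are not reproduced in this excerpt.

```python
# One common DNA coding rule (an assumption; the paper's tables are not shown).
ENCODE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
DECODE = {v: k for k, v in ENCODE.items()}

def byte_to_dna(b: int) -> str:
    # Split one byte into four 2-bit pairs, most significant pair first.
    return "".join(ENCODE[(b >> s) & 0b11] for s in (6, 4, 2, 0))

def dna_to_byte(s: str) -> int:
    out = 0
    for base in s:
        out = (out << 2) | DECODE[base]
    return out

def dna_xor(a: str, b: str) -> str:
    # DNA XOR: XOR the underlying 2-bit codes base by base.
    return "".join(ENCODE[DECODE[x] ^ DECODE[y]] for x, y in zip(a, b))

pixel, key = 0x3C, 0xA5            # illustrative pixel and key bytes
cipher = dna_xor(byte_to_dna(pixel), byte_to_dna(key))
assert dna_to_byte(cipher) == pixel ^ key  # consistent with bitwise XOR
```

Under this rule, DNA XOR on the encoded strands agrees exactly with bitwise XOR on the raw bytes, which is why it can serve as the final diffusion step between the generated key and the shuffled DNA image.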
Convergence properties of Jackson polynomials were considered by Zygmund [1, Ch. X] in 1959 and by J. Szabados [2] (p = ∞), while in 1983 V. A. Popov and J. Szabados [3] (1 ≤ p ≤ ∞) proved a direct inequality for Jackson polynomials in the Lp-space of 2π-periodic bounded Riemann-integrable functions (f ∈ R) in terms of some modulus of continuity.
In 1991, S. K. Jassim proved direct and inverse inequalities for Jackson polynomials in locally global norms (Lδ,p) of 2π-periodic bounded measurable functions (f ∈ L∞) in terms of a suitable Peetre K-functional [4].
Now the aim of our paper is to prove direct and inverse inequalities for Jackson polynomials.
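For orientation, the two quantities these inequalities are stated in terms of have standard forms, sketched below; the paper's exact normalizations and the order r of the K-functional are assumptions here, not taken from the text.

```latex
% Modulus of continuity in L_p and the Peetre K-functional (standard forms).
\omega(f,t)_p = \sup_{0 < |h| \le t} \left\| f(\cdot + h) - f \right\|_p,
\qquad
K(f,t;\, L_p, W_p^r) = \inf_{g \in W_p^r}
    \left( \| f - g \|_p + t\, \| g^{(r)} \|_p \right).
```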
The Bayesian approach promises to combine features of the regression analysis model and the classification tree: it takes advantage of prior information on the one hand, and of ensembles of trees over all the explanatory variables together at every stage on the other, in addition to obtaining posterior information at each node in the construction of the classification tree. Although Bayesian estimates are generally accurate, the logistic model remains a good competitor for binary responses through its flexibility and mathematical representation. Accordingly, three methods were used to process the research data, namely: the logistic model, the classification and regression tree model, and the Bayesian regression tree model
This research studies dual data models with mixed random parameters, which contain two types of parameters: one random and the other fixed. The random parameter arises from differences in the marginal tendencies of the cross-sections, while the fixed parameter arises from differences in the fixed limits and in the random errors of each section, which carry the property of heteroscedasticity in addition to first-order serial correlation. The main objective of this research is the use of efficient methods suited to paired data in the case of small samples; to achieve this goal, the feasible generalized least squares
Encryption of data translates data into another shape or symbol, so that only people with access to the secret key or password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values within an image are dispersed across more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method on video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying the "CAST-128" and
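The entropy measure described above is Shannon entropy over the gray-level histogram. A minimal sketch, assuming a grayscale frame given as a flat list of pixel values (the papers' actual video frames and ciphers are not reproduced here):

```python
import math
from collections import Counter

def shannon_entropy(pixels):
    """Shannon entropy in bits per pixel of a sequence of gray-level values."""
    counts = Counter(pixels)
    total = len(pixels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A uniform spread over all 256 gray levels gives the maximum of 8 bits/pixel,
# which is why a well-encrypted 8-bit image should score close to 8.
flat = list(range(256)) * 4        # every gray level equally likely
constant = [128] * 1024            # a single gray level: entropy 0
print(shannon_entropy(flat))       # → 8.0
print(shannon_entropy(constant) == 0.0)
```

This matches the criterion in the abstract: the method whose cipher frames have entropy closest to 8 bits per pixel disperses the pixel values most evenly across the gray levels.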
The increasing population growth, resulting in a tremendous increase in the consumption of fuels, energy, and petrochemical products, coupled with the depletion of conventional crude oil reserves and production, makes it imperative for Nigeria to explore her bitumen reserves so as to meet her energy and petrochemical needs. Samples of Agbabu bitumen were subjected to thermal cracking in a tubular steel reactor operated at 10 bar pressure to investigate the effect of temperature on the cracking reaction. The gas produced was analyzed in a gas chromatograph, while the liquid products were subjected to Gas Chromatography-Mass Spectrometry (GC-MS) analysis. Heptane was the dominant gas produced in bitumen cracking at all temperatures and the r
The science of information security has become a concern of many researchers, whose efforts aim to produce solutions and technologies that ensure the transfer of information through networks, especially the Internet, in a more secure manner without any penetration of that information, given the risk of digital data being sent between two parties through an insecure channel. This paper includes two data protection techniques. The first technique is cryptography using the Menezes-Vanstone elliptic curve ciphering system, which depends on public-key technologies. Then, the encoded data is randomly embedded in the frame, depending on the seed used. The experimental results, using a PSNR within avera
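The PSNR figure used to evaluate the embedding can be sketched as follows, assuming 8-bit grayscale frames given as flat pixel lists; the frames, embedding seed, and Menezes-Vanstone cipher parameters themselves are not reproduced in this excerpt.

```python
import math

def psnr(original, stego, max_value=255):
    """Peak signal-to-noise ratio in dB between two equal-length pixel sequences."""
    mse = sum((a - b) ** 2 for a, b in zip(original, stego)) / len(original)
    if mse == 0:
        return float("inf")  # identical frames: no embedding distortion
    return 10 * math.log10(max_value ** 2 / mse)

cover = [100, 120, 130, 140]       # illustrative cover-frame pixels
stego = [100, 121, 130, 141]       # two pixels changed by 1 after embedding
print(round(psnr(cover, stego), 2))
```

Higher PSNR means the stego frame is closer to the original cover frame, so small, sparse pixel changes from embedding yield values well above the distortion a viewer would notice.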