Information security has become a concern of many researchers: digital data exchanged between two parties over an insecure channel is at risk, so research efforts aim at solutions and technologies that transfer information across networks, especially the Internet, securely and without interception. This paper combines two data-protection techniques. The first is cryptography using the Menezes-Vanstone elliptic curve cryptosystem, a public-key scheme. The encrypted data is then embedded at random positions in the cover frame, determined by the seed used. The experimental results, with an average PSNR of 65 and an average MSE of 85, indicate that the proposed method can embed data efficiently.
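As a rough illustration of how the reported quality metrics are usually computed (a generic sketch, not the authors' code; the flattened grayscale pixel list and the single LSB flip are illustrative assumptions), MSE and PSNR between a cover image and its stego version can be obtained as:

```python
import math

def mse(cover, stego):
    """Mean squared error between two equal-length 8-bit pixel sequences."""
    return sum((a - b) ** 2 for a, b in zip(cover, stego)) / len(cover)

def psnr(cover, stego, max_val=255.0):
    """Peak signal-to-noise ratio in decibels; higher means less distortion."""
    err = mse(cover, stego)
    return float("inf") if err == 0 else 10.0 * math.log10(max_val ** 2 / err)

# Toy 4x4 grayscale image flattened to a list; embedding flips one LSB.
cover = [128] * 16
stego = list(cover)
stego[0] ^= 1   # a single least-significant-bit change
print(round(psnr(cover, stego), 2))   # one flipped bit already gives ~60 dB
```

The higher the PSNR (equivalently, the lower the MSE), the less visible the embedding is in the stego image.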
Recently, there has been increasing advancement in communications technology, and with the growing use of cellphone applications in diverse aspects of life it has become possible to automate home appliances. This is a goal desired by residents worldwide, since it provides considerable comfort: their appliances run at their highest efficiency whenever required without their intervention, and they can control the devices while away from home, turning them on or off as needed. The design and implementation of this system is carried out using the Global System for Mobile communications (GSM) technique to control the home appliances. In this work, an ele…
This study deals with estimation of the reliability function and one shape parameter of the two-parameter Burr-XII distribution, when the other shape parameter is known (taking the values 0.5, 1, and 1.5) and the initial value of the estimated parameter is 1, while different sample sizes (n = 10, 20, 30, 50) are used. The results rest on an empirical study in which simulation experiments are applied to compare four estimation methods, as well as to compute the reliability function. The mean squared error results indicate that the Jackknife estimator is better than the other three estimators, for all sample sizes and parameter values.
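The simulation can be sketched as follows, assuming the common two-parameter Burr-XII form with reliability function R(t) = (1 + t^c)^(−k), where c is the known shape parameter and k is estimated. The maximum-likelihood and Jackknife estimators shown are textbook constructions, not necessarily the exact four methods compared in the study:

```python
import math
import random

def mle_k(sample, c):
    """MLE of the Burr-XII shape parameter k when c is known:
    k_hat = n / sum(ln(1 + t_i^c))."""
    return len(sample) / sum(math.log(1.0 + t ** c) for t in sample)

def jackknife_k(sample, c):
    """Jackknife bias-corrected estimator built on the MLE."""
    n = len(sample)
    k_full = mle_k(sample, c)
    leave_one_out = [mle_k(sample[:i] + sample[i + 1:], c) for i in range(n)]
    return n * k_full - (n - 1) * sum(leave_one_out) / n

def reliability(t, c, k):
    """Burr-XII reliability function R(t) = (1 + t^c)^(-k)."""
    return (1.0 + t ** c) ** (-k)

# Draw a Burr-XII sample by inverse transform: T = ((1-U)^(-1/k) - 1)^(1/c).
random.seed(1)
c_true, k_true, n = 0.5, 1.0, 30
sample = [((1.0 - random.random()) ** (-1.0 / k_true) - 1.0) ** (1.0 / c_true)
          for _ in range(n)]
k_hat = jackknife_k(sample, c_true)
```

Repeating the draw-and-estimate loop many times and averaging (k_hat − k_true)² gives the empirical MSE used to rank the estimators.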
Currently, with the huge increase in modern communication and network applications, the speed of data transformation and storage in compact forms are pressing issues. An enormous number of images are stored and shared among people every moment, especially in the social media realm, yet even with these marvelous applications the limited size of transmitted data remains the main restriction; essentially all these applications utilize the well-known Joint Photographic Experts Group (JPEG) standard techniques. For the same reason, the construction of universally accepted standard compression systems is urgently required to play a key role in this immense revolution. This review is concerned with different…
Visible light communication (VLC) is an upcoming wireless technology for next-generation high-speed data transmission. It has the potential for capacity enhancement due to its characteristically large bandwidth. Concerning signal processing and suitable transceiver design for the VLC application, an amplification-based optical transceiver is proposed in this article. The transmitter consists of a driver and a laser diode as the light source, while the receiver contains a photodiode and a signal-amplifying circuit. The design model is proposed for its simplicity in replacing the trans-impedance and transconductance circuits of conventional modules with a simple amplification circuit and interface converter. …
This study dealt with the management strategy as an independent variable and integrated industrial distribution as a dependent variable. The study aimed to find the integrated industrial distribution that fits the management strategy, providing the needs of the firm on the one hand and reducing the cost of management, which is reflected in increased profits, on the other.
The researcher collected data from 130 decision makers in the corporation, using a questionnaire as the data-collection instrument together with a set of statistical tools suited to the nature of the information; the data were processed with SPSS version 24. Based on the analysis of the sample's responses and the tests of correlation and …
This paper is concerned with a Double Stage Shrinkage Bayesian (DSSB) estimator for lowering the mean squared error of the classical estimator q̂ of the scale parameter (q) of an exponential distribution, in a region (R) around available prior knowledge (q0) about the actual value (q) taken as an initial estimate, as well as for reducing the cost of experimentation. In situations where the experiments are time-consuming or very costly, a double stage procedure can be used to reduce the expected sample size needed to obtain the estimator. This estimator is shown to have smaller mean squared error for certain choices of the shrinkage weight factor ψ(·) and of the acceptance region R. Expressions for …
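The two-stage logic can be sketched as below. This is a simplified illustration, not the paper's estimator: the constant shrinkage weight, the interval form of the acceptance region R, and the pooled-mean fallback at the second stage are all assumptions made for clarity.

```python
import random

def shrinkage_estimate(sample1, theta0, psi, region, draw_second_stage):
    """Two-stage shrinkage sketch for the exponential scale parameter.

    If the first-stage MLE (the sample mean) falls inside the acceptance
    region R around theta0, shrink it toward theta0 with weight psi;
    otherwise draw a second-stage sample and use the pooled classical
    estimate, so no shrinkage is applied outside R.
    """
    theta1 = sum(sample1) / len(sample1)   # MLE of the exponential scale
    lo, hi = region
    if lo <= theta1 <= hi:
        return psi * theta1 + (1.0 - psi) * theta0
    pooled = sample1 + draw_second_stage()
    return sum(pooled) / len(pooled)

# Illustrative run: true scale 2.0, prior guess 1.8 (hypothetical values).
random.seed(0)
theta_true, theta0 = 2.0, 1.8
first = [random.expovariate(1.0 / theta_true) for _ in range(10)]
estimate = shrinkage_estimate(
    first, theta0, psi=0.6, region=(0.9, 2.7),
    draw_second_stage=lambda: [random.expovariate(1.0 / theta_true)
                               for _ in range(10)])
```

Inside R the estimate is pulled toward the prior guess q0, which is what lowers the MSE when the prior knowledge is close to the true value.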
Face detection is one of the important applications of biometric technology and image processing. Convolutional neural networks (CNN) have been used with great success in image processing as well as pattern recognition. In recent years, deep learning techniques, specifically CNN techniques, have achieved remarkable accuracy rates in the face detection field. Therefore, this study provides a comprehensive analysis of face detection research and applications that use various CNN methods and algorithms. This paper presents ten of the most recent studies and illustrates the performance achieved by each method.
Improving performance is an important issue in Wireless Sensor Networks (WSN). WSNs have many limitations, including constrained network performance. The research question is how to reduce the amount of transmitted data in order to improve network performance.
The work covers one of the dictionary compression methods, Lempel-Ziv-Welch (LZW). One problem with the dictionary method is that the token size is fixed. The LZW dictionary method is not very useful with little data, because it loses many bytes …
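A minimal LZW compressor illustrates both the dictionary mechanism and the fixed-token overhead noted above (plain Python integers stand in for the fixed-width bit codes; this is the textbook algorithm, not the variant developed in the work):

```python
def lzw_compress(data: str) -> list:
    """Textbook LZW: the dictionary starts with all 256 single bytes and
    grows with every phrase seen; each emitted token has a fixed width,
    which is why very small inputs can actually expand."""
    table = {chr(i): i for i in range(256)}
    next_code = 256
    w, out = "", []
    for ch in data:
        wc = w + ch
        if wc in table:
            w = wc                      # keep extending the current phrase
        else:
            out.append(table[w])        # emit the longest known phrase
            table[wc] = next_code       # learn the new phrase
            next_code += 1
            w = ch
    if w:
        out.append(table[w])
    return out

# Repetitive input compresses well: 6 characters become 4 tokens.
print(lzw_compress("ABABAB"))
# Tiny input does not: "AB" still costs 2 tokens, each wider than 8 bits.
print(lzw_compress("AB"))
```

On short inputs every token is at least 9 bits wide while the raw symbols are 8 bits, which is exactly the small-data loss the abstract refers to.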