A joke is something said, written, or done to cause amusement or laughter. It may be a short piece or a long narrative, but either way it ends in a punchline, where the joke reveals a second, conflicting meaning. Sometimes when we read a joke we understand it directly and fully, but this is not always the case. A writer often intends to manipulate the reader so that the reader does not get the joke at once, using puns or other forms of wordplay. As listeners to a joke, we grasp the message mostly from the tone of voice, along with other cues of vocabulary and grammar. As readers of a joke, however, we depend on additional factors to reach its intended meaning. One of these important factors is punctuation: the use of certain signs that help a reader understand a piece of writing. Punctuation creates clarity, sense, and stress in context, because using it correctly helps us convey our thoughts as we intend them. So is the use of punctuation marks essential to understanding a written joke, or is it only sometimes helpful? What happens if the writer omits punctuation marks when writing a joke? Would this affect our grasp of its meaning? In this study we try to answer these questions.
The problem of the high peak-to-average power ratio (PAPR) in OFDM signals is investigated, with a brief presentation of the various methods used to reduce the PAPR and special attention to the clipping method. An alternative clipping approach is presented in which clipping is performed right after the IFFT stage, unlike conventional clipping, which is performed at the power amplifier stage and causes undesirable out-of-band spectral growth. In the proposed method, individual samples are clipped rather than the continuous waveform, so spectral distortion is avoided. Coding is required to correct the errors introduced by the clipping, and the overall system is tested for two types of modulations, the QPSK as a constant-amplitude modulation
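As an illustrative sketch (not the paper's exact implementation), clipping samples right after the IFFT stage amounts to limiting each time-domain sample's magnitude while preserving its phase. The subcarrier count and clipping ratio below are assumed values chosen for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64              # number of OFDM subcarriers (assumed)
clip_ratio = 1.5    # clipping threshold relative to the RMS level (assumed)

# Random QPSK symbols, one per subcarrier
symbols = (rng.choice([1, -1], N) + 1j * rng.choice([1, -1], N)) / np.sqrt(2)

# IFFT produces the time-domain OFDM samples
x = np.fft.ifft(symbols) * np.sqrt(N)

# Clip the magnitude of individual samples, preserving each sample's phase
threshold = clip_ratio * np.sqrt(np.mean(np.abs(x) ** 2))
mag = np.abs(x)
x_clipped = np.where(mag > threshold, threshold * x / mag, x)

# PAPR before and after clipping (peak power over mean power)
papr_before = np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2)
papr_after = np.max(np.abs(x_clipped) ** 2) / np.mean(np.abs(x_clipped) ** 2)
```

Because only sample magnitudes are altered, the out-of-band regrowth associated with clipping a continuous waveform in the amplifier is avoided, at the cost of in-band errors that the abstract's coding stage must correct.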
The aim of this paper is to present a method for solving a system of first-order initial value problems of ordinary differential equations by a semi-analytic technique that constructs polynomial solutions. The original problem is treated using two-point osculatory interpolation, with the fit matching the number of derivatives at the end points of the interval [0, 1].
The disposal of waste material is the main goal of this investigation, achieved by transforming it into high-fineness powder and producing self-consolidating concrete (SCC) at lower cost and in a more eco-friendly way by reducing the cement weight, while taking the fresh and strength properties into consideration. The reference mix design was prepared following the European guide. Five waste materials (clay brick, ceramic, granite tiles, marble tiles, and thermostone blocks) were ground to a high-fineness particle size distribution and then used as 5, 10, and 15% weight replacements of cement. The improvement in strength properties is more significant when using clay bricks compared to the other activated waste.
In this study, a fast block-matching search algorithm based on block descriptors and multilevel block filtering is introduced. The descriptors used are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure were adopted to nominate the best similar blocks lying within the pool of neighboring blocks. As a next step after block nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least). The main reason for introducing the nomination and classification steps is a significant reduction in the number of matching instances for the pixels belonging to the c
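A minimal sketch of the filtering idea, with invented names and a single mean-based filtering level (the study also uses low-order moments and three priority sub-pools): candidates whose mean descriptor is far from the target block's mean are discarded cheaply before the full MAE comparison.

```python
import numpy as np

def mae(a, b):
    """Mean absolute error between two equally sized blocks."""
    return np.mean(np.abs(a.astype(float) - b.astype(float)))

def best_match(target, candidates, mean_tol=10.0):
    """Return (index, error) of the best MAE match among candidates whose
    mean lies within mean_tol of the target's mean (descriptor filter)."""
    t_mean = target.mean()
    best_idx, best_err = None, np.inf
    for i, cand in enumerate(candidates):
        if abs(cand.mean() - t_mean) > mean_tol:
            continue                      # filtered out: no pixel matching done
        err = mae(target, cand)           # full per-pixel comparison
        if err < best_err:
            best_idx, best_err = i, err
    return best_idx, best_err
```

The filter skips the expensive per-pixel MAE for blocks whose cheap descriptor already rules them out, which is the source of the reduction in matching instances the abstract describes.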
In this paper we present the first ever measured experimental electron momentum density of Cu2Sb at an intermediate resolution (0.6 a.u.), using a 59.54 keV 241Am Compton spectrometer. The measurements are compared with theoretical Compton profiles computed using density functional theory (DFT) within a linear combination of atomic orbitals (LCAO) method. In the DFT calculation, the Perdew-Burke-Ernzerhof (PBE) scheme is employed to treat correlation, whereas exchange is included following the Becke scheme. It is seen that various approximations within LCAO-DFT show relatively better agreement with the experimental Compton data. Ionic model calculations for a number of configurations (Cu+x/2)2(Sb-x) (0.0≤x≤2.0) are also performed utilizing free a
In this research, an analysis of the standard Hueckel edge detection algorithm's behaviour is presented, using three-dimensional representations of the edge goodness criterion after applying the algorithm to a real high-texture satellite image; the edge goodness criterion is analyzed statistically. The Hueckel edge detection algorithm showed an exponential relationship between execution time and the disk radius used. The restrictions mentioned in Hueckel's papers are adopted in this research. A discussion of the resultant edge shape and malformation is presented, since this is the first practical study applying the Hueckel edge detection algorithm to a real high-texture image containing ramp edges (a satellite image).
Diffie-Hellman is a key exchange protocol that provides a way to establish shared secret keys between two parties, even though those parties may never have communicated before. This paper suggests a new way to transfer keys through public or non-secure channels, based on video files sent over the channel from which the keys are then extracted. The proposed method of key generation depends on the video file content, using the entropy values of the video frames. The proposed system addresses the weaknesses of the Diffie-Hellman key exchange algorithm, namely MIMA (man-in-the-middle attack) and DLA (discrete logarithm attack). When the method used high-definition videos with a vast amount of data, the keys generated with a large number up to 5
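For reference, the textbook Diffie-Hellman exchange that the paper builds on can be sketched with toy (deliberately insecure) parameters; the entropy-based key extraction from video frames is the paper's own contribution and is not reproduced here.

```python
# Toy Diffie-Hellman exchange: both parties derive the same shared secret
# from public values alone. All numbers below are illustrative, not secure.
p, g = 23, 5          # small prime modulus and generator (toy values)

a, b = 6, 15          # private keys of parties A and B (assumed)
A = pow(g, a, p)      # public value A sends over the open channel
B = pow(g, b, p)      # public value B sends over the open channel

shared_a = pow(B, a, p)   # A combines B's public value with its private key
shared_b = pow(A, b, p)   # B combines A's public value with its private key
assert shared_a == shared_b   # both sides hold the same secret
```

With such small parameters an eavesdropper can solve the discrete logarithm by brute force, which illustrates the DLA weakness; the lack of authentication of the public values A and B is what enables the man-in-the-middle attack the paper targets.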
The aim of this research is to compare traditional and modern methods for obtaining the optimal solution, using dynamic programming and intelligent algorithms to solve project management problems.
It shows possible ways in which these problems can be addressed, drawing on a schedule of interrelated and sequential activities, and clarifies the relationships between activities in order to determine the beginning and end of each activity, the duration and cost of the total project, and the time consumed by each activity, as well as the objectives the project pursues through planning, implementation, and monitoring to keep within the assessed budget.
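The scheduling idea described above (determining each activity's beginning and end from its predecessors and durations) can be sketched as a CPM-style forward pass; the activity network below is invented for illustration:

```python
# Forward pass over a small, made-up activity network: each activity's
# earliest start is the latest finish among its predecessors.
durations = {"A": 3, "B": 2, "C": 4, "D": 2}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

earliest_start, earliest_finish = {}, {}
for act in ["A", "B", "C", "D"]:   # activities listed in topological order
    es = max((earliest_finish[p] for p in predecessors[act]), default=0)
    earliest_start[act] = es
    earliest_finish[act] = es + durations[act]

project_duration = max(earliest_finish.values())   # total project time
```

A backward pass computed the same way from the project end gives each activity's latest start, and activities with zero slack form the critical path that determines the total project duration.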
Three-dimensional (3D) image and medical image processing, which are considered forms of big data analysis, have attracted significant attention during the last few years. To this end, efficient 3D object recognition techniques could be beneficial to such image and medical image processing. However, to date, most of the proposed methods for 3D object recognition face major challenges in terms of high computational complexity. This is because computational complexity and execution time grow as the dimensions of the object increase, which is the case in 3D object recognition. Therefore, finding an efficient method that achieves high recognition accuracy with low computational complexity is essential.