Recommender systems are tools for making sense of the huge amount of data available on the internet. Collaborative filtering (CF) is one of the most widely used knowledge-discovery methods in recommendation systems. Memory-based collaborative filtering relies on the ratings of existing users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends largely on how similarity is calculated. In this study, weighted parameters are combined with traditional similarity measures to quantify the relationships among users over the MovieLens rating matrix, and the advantages and disadvantages of each measure are identified. Based on this analysis, a new measure is proposed that combines existing measures to capture the global meaning of the data set's ratings. The experimental results show that the proposed measure achieves the main objective of maximizing prediction accuracy.
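As a point of reference for the traditional measures discussed above, the sketch below computes cosine and Pearson similarity between users of a toy rating matrix. The matrix values, the convention that 0 denotes an unrated item, and the function names are illustrative assumptions; the paper's proposed combined measure is not reproduced here.

```python
import numpy as np

def cosine_sim(u, v):
    """Cosine similarity over the items both users rated (0 = unrated)."""
    mask = (u > 0) & (v > 0)
    if not mask.any():
        return 0.0
    a, b = u[mask], v[mask]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def pearson_sim(u, v):
    """Pearson correlation over the co-rated items."""
    mask = (u > 0) & (v > 0)
    if mask.sum() < 2:
        return 0.0
    a, b = u[mask] - u[mask].mean(), v[mask] - v[mask].mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

# Toy user-item rating matrix (rows = users, columns = items, 0 = unrated)
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5]])
print(cosine_sim(R[0], R[1]), pearson_sim(R[0], R[2]))
```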
Channel estimation and synchronization are among the most challenging issues in Orthogonal Frequency Division Multiplexing (OFDM) systems. OFDM is highly sensitive to synchronization errors, which reduce subcarrier orthogonality and lead to significant performance degradation. Synchronization errors cause two problems: Symbol Time Offset (STO), which produces inter-symbol interference (ISI), and Carrier Frequency Offset (CFO), which results in inter-carrier interference (ICI). The aim of this research is to simulate comb-type pilot-based channel estimation for an OFDM system, showing the effect of the number of pilots on estimation performance, and to propose a modified STO estimation method.
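For orientation, the following sketch simulates comb-type pilot least-squares (LS) channel estimation over a single OFDM symbol, with linear interpolation between pilot subcarriers. The FFT size, pilot spacing, channel taps, and noise level are illustrative assumptions, not the paper's simulation parameters.

```python
import numpy as np

N, spacing = 64, 8                              # FFT size and pilot spacing (assumed)
pilot_idx = np.arange(0, N, spacing)            # comb-type: every 8th subcarrier is a pilot
pilots = np.ones(len(pilot_idx), dtype=complex)

# Transmitted symbol: QPSK data with known pilots inserted at the comb positions
data = (np.random.choice([1, -1], N) + 1j * np.random.choice([1, -1], N)) / np.sqrt(2)
tx = data.copy()
tx[pilot_idx] = pilots

# Frequency-selective channel (3 illustrative taps) plus additive noise
h = np.array([1.0, 0.4, 0.2j])
H = np.fft.fft(h, N)
rx = H * tx + 0.05 * (np.random.randn(N) + 1j * np.random.randn(N))

# LS estimate at the pilot positions, then linear interpolation to all subcarriers
H_ls = rx[pilot_idx] / pilots
H_est = (np.interp(np.arange(N), pilot_idx, H_ls.real)
         + 1j * np.interp(np.arange(N), pilot_idx, H_ls.imag))

print("mean estimation error:", np.mean(np.abs(H_est - H)))
```

Increasing the pilot spacing in this sketch (i.e., using fewer pilots) degrades the interpolated estimate, which is the trade-off the abstract refers to.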
This study investigates the surgical and thermal effects on oral soft tissues produced by a CO2 laser emitting at 10.6 µm at three fluences: 490.79, 1226.99, and 1840.4 J/cm². The effects examined are incision depth, incision width, and the width and depth of tissue damage. The results show that increasing the fluence and/or the number of beam passes increases the average ablation depth and also increases the width and depth of damage to adjacent tissue. Surgeons using a CO2 laser should therefore avoid applying multiple pulses over the same area, to prevent unintentional tissue damage.
Since the introduction of HTTP/3, research has focused on evaluating its influence on existing HTTP adaptive streaming (HAS). Because the two protocol versions run over different transports, the cross-protocol unfairness between HAS over HTTP/3 (HAS/3) and HAS over HTTP/2 (HAS/2) has attracted considerable attention. It has been found that HAS/3 clients tend to request higher bitrates than HAS/2 clients because QUIC obtains more bandwidth for its HAS/3 clients than TCP does for its HAS/2 clients. As the problem originates in the transport layer, server-based unfairness solutions are likely to help clients overcome it. Therefore, this paper presents an experimental study of such server-based solutions.
Incremental forming is a flexible sheet-metal forming process in which simple tools locally deform a sheet along a predefined tool path without the use of dies. This work applies single-point incremental forming to produce a pyramid geometry and studies the effect of tool geometry, tool diameter, and spindle speed on the residual stresses. The residual stresses were measured by X-ray diffraction using an ORIONRKS 6000 instrument at four angles (0°, 15°, 30°, and 45°), and the average residual stress was determined; the residual stress in the original blanks was 10.626 MPa.
The Internet, networking, and cloud computing support a wide range of new technologies. Blockchain is one of these, and it has increased the interest of researchers concerned with providing a safe environment for circulating important information over the Internet. Maintaining the solidity and integrity of a blockchain's transactions is an important issue that must always be borne in mind. Blockchain transactions rely on asymmetric cryptography with public and private keys. This work proposes using a user's DNA as a supporting technology for storing and recovering those keys if they are lost, as an effective bio-cryptographic recovery method for the RSA private key.
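As one possible reading of such a DNA-supported recovery scheme (the paper's actual encoding and recovery procedure is not reproduced here), the sketch below derives a symmetric key from a hypothetical DNA string and uses it to encrypt and later recover an RSA private key. The dna_profile value and the SHA-256/Fernet combination are assumptions made purely for illustration.

```python
import base64
import hashlib
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Hypothetical DNA string standing in for the user's sequenced profile
dna_profile = "ACGTTGCAAGGCTTACGGTACCAGT"

# Derive a symmetric key from the DNA string (illustrative: SHA-256, then Fernet)
sym_key = base64.urlsafe_b64encode(hashlib.sha256(dna_profile.encode()).digest())
vault = Fernet(sym_key)

# Generate an RSA key pair and store the private key encrypted under the DNA-derived key
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pem = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.NoEncryption(),
)
stored_blob = vault.encrypt(pem)

# Recovery: the same DNA profile reproduces the key and decrypts the stored blob
recovered_pem = Fernet(sym_key).decrypt(stored_blob)
assert recovered_pem == pem
```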
This article proposes a new strategy based on a hybrid method that combines the gravitational search algorithm (GSA) with the bat algorithm (BAT) to solve single-objective optimization problems. GSA is run first, followed by BAT as a second step. The proposed approach relies on a parameter between 0 and 1 to address the problem of becoming trapped in local optima: without a local search mechanism, search intensification increases while diversity remains high, so the search easily settles into a local optimum. The improvement retains the speed of the original BAT while reaching the best solution faster, and all solutions in the population are updated before the proposed algorithm terminates.
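A minimal sketch of the two-phase GSA-then-BAT pipeline is given below, assuming a split parameter in [0, 1] that divides the iteration budget between the two phases and using the sphere function as the objective. The update rules are simplified textbook forms of GSA and BAT, not the paper's exact formulation.

```python
import numpy as np

def sphere(x):
    # Example objective: minimise the sphere function
    return np.sum(x ** 2)

def gsa_phase(pop, fit, func, iters, lb, ub, g0=100.0):
    """Simplified Gravitational Search Algorithm phase."""
    n, d = pop.shape
    vel = np.zeros((n, d))
    for t in range(iters):
        g = g0 * np.exp(-20.0 * t / iters)            # decaying gravitational constant
        worst, best = fit.max(), fit.min()
        m = (fit - worst) / (best - worst + 1e-12)     # masses for minimisation
        m = m / (m.sum() + 1e-12)
        acc = np.zeros((n, d))
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                dist = np.linalg.norm(pop[i] - pop[j]) + 1e-12
                acc[i] += np.random.rand(d) * g * m[j] * (pop[j] - pop[i]) / dist
        vel = np.random.rand(n, d) * vel + acc
        pop = np.clip(pop + vel, lb, ub)
        fit = np.array([func(x) for x in pop])
    return pop, fit

def bat_phase(pop, fit, func, iters, lb, ub, fmin=0.0, fmax=2.0, loud=0.9, rate=0.5):
    """Simplified Bat Algorithm phase, seeded with the GSA population."""
    n, d = pop.shape
    vel = np.zeros((n, d))
    best_idx = np.argmin(fit)
    best, best_fit = pop[best_idx].copy(), fit[best_idx]
    for _ in range(iters):
        for i in range(n):
            freq = fmin + (fmax - fmin) * np.random.rand()
            vel[i] += (pop[i] - best) * freq
            cand = np.clip(pop[i] + vel[i], lb, ub)
            if np.random.rand() > rate:                # local walk around the best bat
                cand = np.clip(best + 0.01 * np.random.randn(d), lb, ub)
            f_cand = func(cand)
            if f_cand <= fit[i] and np.random.rand() < loud:
                pop[i], fit[i] = cand, f_cand
            if f_cand <= best_fit:
                best, best_fit = cand.copy(), f_cand
    return best, best_fit

# Hypothetical split parameter in [0, 1]: fraction of iterations given to GSA
split, total_iters, n, d, lb, ub = 0.5, 200, 30, 10, -5.0, 5.0
pop = np.random.uniform(lb, ub, (n, d))
fit = np.array([sphere(x) for x in pop])
pop, fit = gsa_phase(pop, fit, sphere, int(split * total_iters), lb, ub)
best_x, best_f = bat_phase(pop, fit, sphere, total_iters - int(split * total_iters), lb, ub)
print("best fitness:", best_f)
```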
Photonic crystal fiber (PCF) interferometers are widely used in sensing applications. In this work, a solid-core PCF Mach-Zehnder modal interferometer for refractive-index sensing is presented. The sensor is built by splicing a short length of PCF between two pieces of conventional single-mode fiber (SMF-28). To realize the modal interferometer, a collapsing technique based on fusion splicing is used to excite the LP01 and LP11 modes. A 1550 nm laser diode is used as the light source, and a high-sensitivity optical spectrum analyzer (OSA) is used to monitor and record the transmitted spectrum. The experimental work examines the interference spectrum of the PCF interferometer.
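The transmitted intensity of such a two-mode (LP01/LP11) Mach-Zehnder modal interferometer is commonly described by the standard two-beam interference relation below, where $I_{01}$ and $I_{11}$ are the powers carried by the two modes, $\Delta n_{\mathrm{eff}}$ is their effective-index difference, $L$ is the PCF length, and $\lambda$ is the wavelength; this is a textbook expression rather than one taken from the paper.

```latex
I(\lambda) = I_{01} + I_{11}
           + 2\sqrt{I_{01} I_{11}}\,
             \cos\!\left(\frac{2\pi\,\Delta n_{\mathrm{eff}}\,L}{\lambda}\right)
```

A change in the surrounding refractive index typically alters $\Delta n_{\mathrm{eff}}$ and shifts the fringe positions, which is the effect an OSA trace of the transmitted spectrum reveals.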
Conventional clustering algorithms cannot cope with the difficulty of managing and analyzing the rapid growth of data generated from different sources. Parallel clustering is one of the robust solutions to this problem. The Apache Hadoop architecture is one of the ecosystems that provides the capability to store and process data in a distributed, parallel fashion. In this paper, a parallel model is designed to run the k-means clustering algorithm in the Apache Hadoop ecosystem by connecting three nodes, one serving as the name (master) node and the other two as data (client) nodes. The aim is to speed up the processing of data at this massive scale.
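To make the parallel design concrete, the sketch below expresses one k-means iteration as a Hadoop Streaming mapper/reducer pair in Python. The input format (one comma-separated point per line), the centroids.txt side file, and the map/reduce command-line switch are illustrative assumptions rather than the paper's implementation.

```python
#!/usr/bin/env python3
"""One k-means iteration as a Hadoop Streaming job (sketch)."""
import sys
import numpy as np

def load_centroids(path="centroids.txt"):
    # Hypothetical side file with current centroids, shipped to every node (e.g. via -files)
    return np.loadtxt(path, delimiter=",")

def mapper():
    # Emit (nearest-centroid-index, point) for every input point on stdin
    centroids = load_centroids()
    for line in sys.stdin:
        point = np.array([float(v) for v in line.strip().split(",")])
        idx = int(np.argmin(np.linalg.norm(centroids - point, axis=1)))
        print(f"{idx}\t{line.strip()}")

def reducer():
    # Average all points assigned to the same centroid (input arrives sorted by key)
    current, points = None, []

    def emit():
        if points:
            print(f"{current}\t{','.join(map(str, np.mean(points, axis=0)))}")

    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            emit()
            current, points = key, []
        points.append(np.array([float(v) for v in value.split(",")]))
    emit()

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```

In this sketch the script would be invoked twice by Hadoop Streaming, once as the mapper ("map" argument) and once as the reducer, and repeated job runs would replace centroids.txt with the reducer output until the centroids converge.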