Laser micro-cutting is one of the most widely applied machining processes and can be used on practically all metallic and non-metallic materials, although it still faces challenges in cutting-quality criteria such as geometrical precision, surface quality, and numerous others. This article investigates the laser micro-cutting of PEEK composite material using a nano-fiber laser, owing to the significant importance and efficiency of lasers in various manufacturing processes. A design-of-experiments tool based on Response Surface Methodology (RSM) with a Central Composite Design (CCD) was used to generate the statistical model. This method was employed to analyze the influence of parameters including laser speed, laser power, laser frequency, and number of passes on the cutting characteristics and geometry. Cutting-geometry requirements are significant quality features, since they are one of the metrics for the geometrical precision of the micro-cutting process. It was concluded that higher laser power, slower speed, and a larger number of passes result in a low kerf taper, and that these parameters have a significant impact on the other cutting characteristics and geometry, whereas the frequency has the lowest impact on the cutting geometry. Finally, the experiments achieved a maximum depth of 2000 µm, a minimum top kerf width of 305.56 µm, and a minimum kerf taper angle of 2.8906°.
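A minimal sketch of the RSM-CCD workflow described above, assuming coded factor levels for the four parameters (speed, power, frequency, number of passes): a face-centered central composite design is built and a second-order response model is fitted. The response values here are placeholders, not the paper's measured data, and the run counts are illustrative.

```python
# Sketch: face-centered CCD for 4 factors + quadratic RSM fit (placeholder response).
from itertools import product

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

k = 4  # factors in coded units (-1 .. +1): speed, power, frequency, passes

factorial = np.array(list(product([-1, 1], repeat=k)), dtype=float)  # 2^k corner runs
axial = np.vstack([v * row for v in (-1, 1) for row in np.eye(k)])   # 2k face-centered axial runs
center = np.zeros((6, k))                                            # replicated center runs
design = np.vstack([factorial, axial, center])                       # 16 + 8 + 6 = 30 runs

# Placeholder response (e.g., kerf taper) -- replace with measured values.
rng = np.random.default_rng(0)
y = rng.normal(size=len(design))

# Second-order (linear + interaction + quadratic) RSM model.
quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(design), y)
print("R^2 on the design points:", model.score(quad.transform(design), y))
```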
Pattern matching algorithms are commonly used as the detection step in intrusion detection systems. The efficiency of these algorithms affects the performance of the intrusion detection system, which motivates new investigation in this field. Four matching algorithms, plus a combination of two of them, are applied to an intrusion detection system based on a new DNA encoding and evaluated for their performance. These algorithms are the Brute-force algorithm, the Boyer-Moore algorithm, the Horspool algorithm, the Knuth-Morris-Pratt algorithm, and the combination of the Boyer-Moore and Knuth-Morris-Pratt algorithms. The performance of the proposed approach is calculated based on the execution time, where these algorithms are applied on …
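As an illustration of one of the evaluated algorithms, here is a minimal Horspool implementation in Python applied to a DNA-like string; the paper's DNA encoding of network traffic is not reproduced, and the payload below is purely illustrative.

```python
def horspool_search(text: str, pattern: str) -> int:
    """Return the index of the first occurrence of pattern in text, or -1."""
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return -1 if m > n else 0
    # Shift table: for each character of pattern (except the last one),
    # the distance from its last occurrence to the end of the pattern.
    shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
    i = m - 1
    while i < n:
        k = 0
        # Compare the pattern against the text from right to left.
        while k < m and pattern[m - 1 - k] == text[i - k]:
            k += 1
        if k == m:
            return i - m + 1
        # Slide the pattern according to the character aligned with its last position.
        i += shift.get(text[i], m)
    return -1


payload = "ACCTGATTACAGGT"            # a DNA-encoded payload (illustrative)
print(horspool_search(payload, "TTACA"))  # -> 6
```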
Cuneiform images require several processing steps in order to recognize their contents, and image enhancement is used to clarify the objects (symbols) found in the image. The vector used for classifying a symbol, called the symbol structural vector (SSV), is built from the information of the wedges in the symbol. The experimental tests cover a number of samples of varying relevance, including various drawings, in an online method. The results of this research show high accuracy, and the methods and algorithms were programmed using Visual Basic 6.0. In this research, more than one method was applied to extract information from digital images of cuneiform tablets, in order to identify most of the signs of Sumerian cuneiform.
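The paper's implementation is in Visual Basic 6.0; the sketch below instead uses Python with OpenCV purely to illustrate the kind of enhancement-then-feature-extraction pipeline described. The file name, the contrast-enhancement choice (CLAHE), and the per-region (area, width, height) features are assumptions, not the SSV defined in the paper.

```python
# Sketch: enhance a tablet image and extract crude per-wedge features.
import cv2

img = cv2.imread("tablet.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input file

# Local contrast enhancement to clarify the symbols.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(img)

# Binarize and extract candidate wedge regions.
_, binary = cv2.threshold(enhanced, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Toy feature vector per region: (area, width, height) of each wedge candidate.
features = [
    (cv2.contourArea(c), *cv2.boundingRect(c)[2:])
    for c in contours
    if cv2.contourArea(c) > 10  # drop speckle noise
]
print(f"{len(features)} wedge candidates found")
```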
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing, especially with the large increase in the volume of textual data produced daily. Traditional approaches that calculate the degree of similarity between two texts from the words they share do not perform well with short texts, because two similar texts may be written in different terms through the use of synonyms. As a result, short texts should be compared semantically. In this paper, a semantic similarity measurement method between texts is presented that combines knowledge-based and corpus-based semantic information to build a semantic network that represents …
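A minimal sketch of combining a knowledge-based score (WordNet path similarity via NLTK) with a corpus-based score (TF-IDF cosine similarity); the weighting and combination rule are illustrative assumptions, not the paper's exact measure or its semantic network.

```python
# Sketch: hybrid knowledge-based + corpus-based short-text similarity.
from itertools import product

from nltk.corpus import wordnet as wn  # first run may need: import nltk; nltk.download('wordnet')
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def wordnet_similarity(t1: str, t2: str) -> float:
    """Average of the best WordNet path similarity over all word pairs."""
    scores = []
    for w1, w2 in product(t1.lower().split(), t2.lower().split()):
        syns1, syns2 = wn.synsets(w1), wn.synsets(w2)
        best = max((s1.path_similarity(s2) or 0.0
                    for s1, s2 in product(syns1, syns2)), default=0.0)
        scores.append(best)
    return sum(scores) / len(scores) if scores else 0.0


def corpus_similarity(t1: str, t2: str) -> float:
    """Cosine similarity between TF-IDF vectors of the two texts."""
    tfidf = TfidfVectorizer().fit_transform([t1, t2])
    return float(cosine_similarity(tfidf[0], tfidf[1])[0, 0])


def hybrid_similarity(t1: str, t2: str, alpha: float = 0.5) -> float:
    # alpha is an assumed mixing weight between the two information sources.
    return alpha * wordnet_similarity(t1, t2) + (1 - alpha) * corpus_similarity(t1, t2)


print(hybrid_similarity("the car is fast", "the automobile is quick"))
```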
This study proposes a biometric-based digital signature scheme built on facial recognition. The scheme is designed to verify a person's identity during a registration process and retrieve their public and private keys stored in the database. The RSA algorithm is used as the asymmetric encryption method to encrypt the hashes generated for digital documents, and the hash function SHA-256 is used to generate the digital signatures. Local binary patterns histograms (LBPH) were used for facial recognition, and the facial recognition method was evaluated on the ORL faces database of Cambridge University. From the analysis, the LBPH algorithm achieved 97.5% accuracy; the real-time testing was done on thirty subjects …
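A minimal sketch of the sign-and-verify step using RSA with SHA-256 (via the Python cryptography library); the facial-recognition-based key retrieval and the database lookup described in the abstract are not reproduced here, and the key pair is generated in place purely for illustration.

```python
# Sketch: RSA/SHA-256 digital signature for a document.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# In the scheme, the keys would be retrieved after facial verification;
# here a fresh key pair stands in for them.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

document = b"contents of the digital document"

# Sign: the library hashes the document with SHA-256 and encrypts the hash.
signature = private_key.sign(document, padding.PKCS1v15(), hashes.SHA256())

# Verify: raises InvalidSignature if the document or signature was altered.
try:
    public_key.verify(signature, document, padding.PKCS1v15(), hashes.SHA256())
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```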
Background: The risk of antibiotic resistance (AR) increases due to excessive use of antibiotics, either by health care providers or by patients.
Objective: To assess the practice of self-medication with over-the-counter drugs and other prescription drugs and its associated risk factors.
Subjects and Methods: Study design: A descriptive study was conducted from 20th December 2019 to 8th January 2021. A pre-validated, structured questionnaire in English and Urdu was created to avoid a language barrier, covering personal details, reasons, sources, and knowledge about over-the-counter drugs and antibiotics. The study sample was randomly selected.
Generally, the process of sending secret information via a transmission channel or any carrier medium is not secure. For this reason, information-hiding techniques are needed, and steganography must therefore take place before transmission. In this paper, a developed particle swarm optimization algorithm (Dev.-PSO) is used to embed a secret message at optimal positions of the cover image in the spatial domain, based on Least Significant Bit (LSB) substitution. The main aim of the Dev.-PSO algorithm is to determine optimal paths for reaching the required goals in the specified search space based on disposal of them; using the Dev.-PSO algorithm produces the paths of the required goals with the most efficiency …
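A minimal sketch of plain LSB substitution with NumPy; the Dev.-PSO step that selects which pixel positions to use is not reproduced here, so the bits are simply embedded sequentially, which is the baseline that position optimization improves on. The random cover image is illustrative only.

```python
# Sketch: sequential LSB embedding and extraction in a grayscale cover image.
import numpy as np


def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    """Hide `message` in the least significant bits of the cover pixels."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten()  # flatten() returns a copy, so the cover stays intact
    if bits.size > flat.size:
        raise ValueError("message does not fit in the cover image")
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite the LSBs
    return flat.reshape(cover.shape)


def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    """Read back `n_bytes` bytes from the least significant bits."""
    bits = stego.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()


cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # illustrative cover
stego = embed_lsb(cover, b"secret")
assert extract_lsb(stego, 6) == b"secret"
```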
Crime is considered an unlawful activity of all kinds and is punished by law. Crimes have an impact on a society's quality of life and economic development. With the large rise in crime globally, there is a need to analyze crime data to bring down the crime rate. This encourages the police and the public to take the required measures and restrict crimes more effectively. The purpose of this research is to develop predictive models that can aid in crime pattern analysis and thus support the Boston police department's crime prevention efforts. The geographical location factor has been adopted in our model because it is an influential factor in several situations, whether traveling to a specific area or living …
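A minimal sketch of a crime-category classifier that includes the geographical location factor (latitude and longitude) among its features, assuming a CSV export of Boston crime records; the file name, the column names ("Lat", "Long", "HOUR", "OFFENSE_CODE_GROUP"), and the choice of a random forest are illustrative assumptions, not the paper's models or dataset schema.

```python
# Sketch: predicting the offense category from location and time-of-day features.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical CSV of incident records with coordinates, hour, and offense group.
df = pd.read_csv("boston_crime.csv").dropna(
    subset=["Lat", "Long", "HOUR", "OFFENSE_CODE_GROUP"]
)

X = df[["Lat", "Long", "HOUR"]]       # geographical location + time features
y = df["OFFENSE_CODE_GROUP"]          # crime category to predict

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```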