Four simply supported reinforced concrete (RC) beams were tested experimentally and analyzed using the extended finite element method (XFEM). This method is used to treat the discontinuities resulting from the fracture process and crack propagation that occur in concrete. The Meso-Scale Approach (MSA) models concrete as a heterogeneous, three-phase material (coarse aggregate, mortar, and air voids in the cement paste). The coarse aggregate used in casting the beams was of rounded and crushed shape with a maximum size of 20 mm. The concrete compressive strengths of the beams were 17 MPa and 34 MPa, respectively. The RC beams were designed to fail in flexure when subjected to two-point loading. To model the coarse aggregate realistically, the aggregate must be distributed randomly according to the grading and amount actually used in the mix design. Because ABAQUS does not provide this capability, a separate program was used to generate the random aggregate arrangement. The random representation of the aggregate was then transferred to ABAQUS, using commands and instructions the program can interpret, to draw it as a sketch. The comparison between experimental and numerical results showed that XFEM is well suited to simulating non-smooth behavior in RC beams, such as discontinuity and singularity, while the mesoscale model can simulate the non-homogeneity of the concrete.
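The random aggregate generation described above can be sketched with a simple take-and-place algorithm: circles drawn from each sieve class are dropped at random positions and rejected if they overlap a previously placed particle. The function names, grading classes, and section dimensions below are illustrative assumptions, not the authors' actual program, and the resulting coordinates would still have to be translated into ABAQUS sketch commands.

```python
import math
import random

def overlaps(x, y, r, placed, gap=1.0):
    """True if a circle at (x, y) with radius r touches any placed circle,
    keeping a small mortar gap between particles."""
    return any(math.hypot(x - px, y - py) < r + pr + gap for px, py, pr in placed)

def place_aggregates(width, height, fractions, target_ratio, seed=1, max_tries=20000):
    """Take-and-place aggregate generation for a 2-D section.

    fractions: list of (d_min_mm, d_max_mm, area_share), coarsest class first.
    target_ratio: aggregate area divided by total section area.
    Returns a list of (x, y, r) circles, all inside the section, none overlapping.
    """
    rng = random.Random(seed)
    placed = []
    target_area = target_ratio * width * height
    for d_min, d_max, share in fractions:
        class_target = share * target_area
        class_area = 0.0
        tries = 0
        while class_area < class_target and tries < max_tries:
            tries += 1
            r = rng.uniform(d_min, d_max) / 2.0
            x = rng.uniform(r, width - r)   # keep the particle inside the section
            y = rng.uniform(r, height - r)
            if not overlaps(x, y, r, placed):
                placed.append((x, y, r))
                class_area += math.pi * r * r
    return placed
```

Placing the coarsest class first makes it easier to fit the large particles before the section fills up; the attempt cap keeps the loop finite when the target ratio is too ambitious for the section size.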
Abstract— The growing use of digital technologies across various sectors and daily activities has made handwriting recognition a popular research topic. Despite the continued relevance of handwriting, people still need to convert handwritten copies into digital versions that can be stored and shared electronically. Handwriting recognition is a computer's ability to identify and understand legible handwritten input from various sources, including documents, photographs, and others. It poses a complex challenge due to the diversity of handwriting styles among individuals, especially in real-time applications. In this paper, an automatic system was designed for handwriting recognition.
This work examines the ability of a special type of smart antenna array, known as the Switched Active Switched Parasitic Antenna (SASPA), to produce a directive and electronically steerable radiation pattern. The SASPA array consists of antenna elements that are switchable between active and parasitic states using P-Intrinsic-N (PIN) diodes. The active element is the one fed with radio-frequency power, while short-circuiting the terminals of an element in the array makes it parasitic. Due to the strong mutual coupling between the elements, a directional radiation pattern with high gain and a small beamwidth can be produced with only one active element operating at a time. By switching the parasitic state to the active state for successive elements, the radiation pattern can be steered electronically.
The nurse scheduling problem is a combinatorial optimization problem; it is NP-hard and therefore difficult to solve to optimality. In this paper, we propose a hybrid simulated annealing algorithm to solve the nurse scheduling problem, developed from the simulated annealing algorithm and the genetic algorithm. The proposed hybrid simulated annealing algorithm (GS-h) is the best of the methods used in this paper, because it achieved the minimum average total cost and the maximum number of solved, best, and optimal problems. The ratios of optimal solutions are 77% for the proposed algorithm (GS-h) and 28.75% for simulated annealing.
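To make the metaheuristic concrete, here is a minimal simulated annealing sketch for a toy nurse roster; it is not the GS-h hybrid itself, and the shift codes, coverage requirements, penalty weights, and cooling schedule are illustrative assumptions.

```python
import math
import random

SHIFTS = (0, 1, 2)  # 0 = off, 1 = day, 2 = night (assumed encoding)

def cost(roster, required=(0, 2, 1)):
    """Penalty: unmet daily coverage plus night-after-night assignments."""
    pen = 0
    days = len(roster[0])
    for d in range(days):
        for s in (1, 2):
            have = sum(1 for nurse in roster if nurse[d] == s)
            pen += max(0, required[s] - have)
    for nurse in roster:
        pen += sum(1 for d in range(days - 1) if nurse[d] == 2 and nurse[d + 1] == 2)
    return pen

def anneal(nurses=4, days=7, t0=5.0, cooling=0.995, steps=4000, seed=0):
    """Simulated annealing: perturb one cell, accept worse moves with
    probability exp(-delta / temperature), and cool geometrically."""
    rng = random.Random(seed)
    roster = [[rng.choice(SHIFTS) for _ in range(days)] for _ in range(nurses)]
    c = cost(roster)
    best, best_c = [row[:] for row in roster], c
    t = t0
    for _ in range(steps):
        i, d = rng.randrange(nurses), rng.randrange(days)
        old = roster[i][d]
        roster[i][d] = rng.choice(SHIFTS)
        nc = cost(roster)
        if nc <= c or rng.random() < math.exp((c - nc) / t):
            c = nc
            if c < best_c:
                best, best_c = [row[:] for row in roster], c
        else:
            roster[i][d] = old  # reject the move
        t *= cooling
    return best, best_c
```

A hybrid such as GS-h would additionally apply genetic operators (crossover, mutation over a population of rosters) around this inner loop.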
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widely spread and easy to use. Its use has led to an unprecedented increase in the amount of data that business organizations deal with. The amount of data available through the internet, accumulating in such huge quantities at random, is a problem for which many parties seek solutions. Expectations indicated that by 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a means of extracting useful knowledge from these huge amounts of data.
Exchange of information through communication channels can be unsafe. Communication media are not secure for sending sensitive information, so it is necessary to protect information from disclosure to unauthorized persons. This research presents a method for information security in which information is hidden in a cover image using the least significant bit (LSB) technique, after a text file is first encrypted using a secret sharing scheme. Positions for hiding information in the cover image are then generated in a random manner, which makes the hidden data difficult to detect by image or statistical analysis. The method thus provides two levels of information security: encryption of the text file using the secret sharing scheme, and random hiding of the encrypted data in the cover image.
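The key-driven random-position LSB idea can be sketched as follows; this is an illustrative reimplementation over a flat byte array of pixel values, not the paper's code, and the secret sharing step is assumed to have already produced the byte string to hide.

```python
import random

def embed(pixels, message, key):
    """Hide message bits in the least significant bit of pseudo-randomly
    chosen pixel bytes; the key seeds the position sequence, so the same
    key regenerates the same positions for extraction."""
    bits = [b >> i & 1 for b in message for i in range(7, -1, -1)]  # MSB first
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for this message")
    out = list(pixels)
    order = random.Random(key).sample(range(len(pixels)), len(bits))
    for pos, bit in zip(order, bits):
        out[pos] = (out[pos] & ~1) | bit  # overwrite only the LSB
    return bytes(out)

def extract(pixels, n_bytes, key):
    """Recover n_bytes hidden by embed() using the same key."""
    order = random.Random(key).sample(range(len(pixels)), n_bytes * 8)
    bits = [pixels[p] & 1 for p in order]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k * 8:(k + 1) * 8]))
        for k in range(n_bytes)
    )
```

Because only least significant bits change, at most one bit per selected pixel differs between cover and stego image, and without the key an analyst cannot reconstruct the embedding order.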
The technological development in the field of information and communication has been accompanied by the emergence of security challenges related to the transmission of information, and encryption is a good solution. Encryption is one of the traditional methods of protecting plain text by converting it into an unintelligible form, and it can be implemented using substitution techniques, shifting techniques, or mathematical operations. This paper proposes a method with two branches to encrypt text. The first branch is a new mathematical model to create and exchange keys; the proposed key exchange method is a development of Diffie-Hellman, a new mathematical model for exchanging keys based on prime numbers.
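The paper's modified key-exchange model is not reproduced here; the sketch below shows only the classic Diffie-Hellman exchange it builds on, where both parties derive the same secret from public values and a large prime. The prime and generator are illustrative assumptions chosen small enough to demonstrate the arithmetic; real deployments use standardized groups of 2048 bits or more.

```python
import secrets

P = 2**127 - 1   # a Mersenne prime, for illustration only
G = 3            # assumed public base

def keypair():
    """Pick a random private exponent and compute the public value G^priv mod P."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared(priv, other_pub):
    """Derive the shared secret: (G^b)^a mod P == (G^a)^b mod P."""
    return pow(other_pub, priv, P)
```

Each side publishes only its public value; an eavesdropper who sees both public values still faces the discrete logarithm problem to recover either private exponent.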
Permeability data are of major importance and must be handled in all reservoir simulation studies. Their importance increases in mature oil and gas fields due to their sensitivity to the requirements of certain improved recovery methods. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation offers a feasible estimate in cases of data loss, in poorly consolidated formations, or in cases where special core analysis is not available.