In this work, a study and calculation of the normal approach between two bodies, a sphere and a rough flat surface, was conducted with the aid of an image processing technique. Four metals with different work-hardening indices were used as surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach could be obtained. The compression tests were carried out in the strength of materials laboratory of the Mechanical Engineering Department, and a Monsanto tensometer was used to conduct the indentation tests. A light-section measuring microscope (BK 70x50) was used to determine the surface texture parameters, such as the standard deviation of asperity peak heights, the centre-line average, the asperity density, and the radius of the asperities. A Gaussian distribution of asperity peak heights was assumed in calculating the theoretical value of the normal approach in the elastic and plastic regions, and the theoretical values were compared with those obtained experimentally to verify the results.
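The abstract does not reproduce the theoretical formulation, so the following is only a minimal Python sketch of a Greenwood-Williamson-type calculation under the stated assumption of Gaussian asperity peak heights; the surface and material values (sigma, R, eta, A_n, E_star, H) and the elastic/plastic load expressions below are illustrative placeholders, not the paper's measured data or equations.

import numpy as np

# Illustrative rough-contact sketch; every numerical value is an assumed placeholder.
sigma  = 1.2e-6   # standard deviation of asperity peak heights [m]
R      = 50e-6    # mean asperity tip radius [m]
eta    = 4.0e9    # asperity density [asperities per m^2]
A_n    = 1.0e-4   # nominal contact area [m^2]
E_star = 110e9    # effective elastic modulus [Pa]
H      = 1.0e9    # hardness, taken as the fully plastic flow pressure [Pa]

def gaussian_pdf(z, s):
    # Gaussian distribution of asperity peak heights about the mean plane.
    return np.exp(-0.5 * (z / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def total_load(d, plastic=False):
    # Total normal load carried at mean-plane separation d: only asperities
    # taller than d make contact, each deflected by delta = z - d.
    z = np.linspace(d, d + 6.0 * sigma, 2000)
    delta = z - d
    if plastic:
        per_asperity = H * 2.0 * np.pi * R * delta                        # fully plastic contact
    else:
        per_asperity = (4.0 / 3.0) * E_star * np.sqrt(R) * delta ** 1.5   # Hertzian (elastic)
    dz = z[1] - z[0]
    return eta * A_n * np.sum(per_asperity * gaussian_pdf(z, sigma)) * dz  # numerical integration

# Example: load versus normal approach, measured from a reference separation of 3*sigma.
for approach in np.array([0.5, 1.0, 1.5, 2.0]) * sigma:
    d = 3.0 * sigma - approach
    print(f"approach = {approach / sigma:3.1f} sigma : "
          f"elastic load = {total_load(d):10.2f} N, "
          f"plastic load = {total_load(d, plastic=True):10.2f} N")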
This study aims to identify the role of strategic leadership in achieving competitiveness in industrial establishments by examining the respondents’ perceptions of the level of availability of the dimensions of leadership strategies (creativity and innovation, risk tolerance, available opportunities) in Bashir Al-Siksek & Partners Company for the manufacture of sanitary and plastic ware in the Gaza Strip.
To achieve this, a questionnaire was developed and distributed to a sample of managers, auditors, accountants, and administrative employees in the company under study. The questionnaire was distributed to (60) employees, of which (52) were retrieved, or 86.6%, and (8) were excluded for la
This paper introduces some properties of separation axioms called α-feeble regular and α-feeble normal spaces (which are weaker than the usual axioms) by using elements of graphs, which are the essential parts of the α-topological spaces that we study. It also presents some related concepts, studies their properties, and establishes some relationships between them.
Variable-Length Subnet Masking (VLSM), often referred to as "subnetting a subnet", is used to maximize addressing efficiency. The network administrator is able to use a long mask on networks with few hosts and a short mask on subnets with many hosts. This addressing scheme allows growth and does not involve wasting addresses. VLSM gives a way of subnetting a network with minimal loss of IP addresses for a specific range. Unfortunately, the network administrator has to perform several mathematical steps (or use charts) to get the required results from VLSM. In this paper, a simple graph simulator is proposed (using the Visual Basic 6.0 language) to perform all the required mathematical steps and to display the required information.
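The simulator itself is written in Visual Basic 6.0 and is not reproduced in the abstract; as an illustration of the arithmetic such a tool automates, here is a minimal Python sketch that sizes each subnet from its host requirement and allocates the subnets largest-first from a base network. The base network, group names, and host counts in the example are made up for illustration.

import ipaddress
import math

def vlsm_allocate(base_cidr, host_counts):
    # Allocate VLSM subnets from base_cidr, largest host requirement first,
    # so every allocation stays aligned to its own block size (IPv4 only).
    network = ipaddress.ip_network(base_cidr)
    next_addr = int(network.network_address)
    plan = []
    for label, hosts in sorted(host_counts.items(), key=lambda kv: kv[1], reverse=True):
        prefix = 32 - math.ceil(math.log2(hosts + 2))   # +2 for network and broadcast addresses
        size = 2 ** (32 - prefix)
        subnet = ipaddress.ip_network((next_addr, prefix))
        plan.append((label, subnet, hosts, size - 2))
        next_addr += size
    if next_addr > int(network.broadcast_address) + 1:
        raise ValueError("host requirements do not fit in the base network")
    return plan

# Example: subnetting 192.168.10.0/24 for four groups of different sizes.
for label, subnet, need, usable in vlsm_allocate(
        "192.168.10.0/24", {"Sales": 60, "IT": 28, "HR": 12, "WAN link": 2}):
    print(f"{label:8s} needs {need:3d} hosts -> {subnet}  ({usable} usable addresses)")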
In developing countries, individual students and researchers cannot afford the high subscription prices of international publishers such as JSTOR, ELSEVIER, …; therefore, the governments and/or universities of those countries aim to purchase one global subscription to the international publishers in order to provide these educational resources at a cheaper price, or even freely, to all students and researchers of those institutions. To realize this concept, we must build a system that sits between the publishers and the users (students or researchers) and acts as a gatekeeper and a director of information: this system must register its users and must have adequate security to e
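The abstract only outlines this gatekeeper architecture, so the following is a hypothetical minimal sketch, not the proposed system: a class that registers users, verifies their credentials, and only then fetches a resource on their behalf through the institution's single subscription. The class and method names are invented for illustration.

import hashlib
import hmac
import os
import urllib.request

class Gatekeeper:
    # Hypothetical sketch of the intermediary: users authenticate to the
    # gatekeeper, and the publisher only ever sees the gatekeeper.

    def __init__(self):
        self._users = {}   # username -> (salt, password hash)

    def register(self, username, password):
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        self._users[username] = (salt, digest)

    def _authenticate(self, username, password):
        if username not in self._users:
            return False
        salt, stored = self._users[username]
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return hmac.compare_digest(stored, candidate)

    def fetch(self, username, password, article_url):
        # Forward the request only for registered, authenticated users.
        if not self._authenticate(username, password):
            raise PermissionError("user is not registered with the institution")
        with urllib.request.urlopen(article_url) as response:
            return response.read()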
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying this method to the plain text (the original message) and how the intelligible plain text is made unintelligible in order to secure information from unauthorized access and from information theft. An encryption scheme usually uses a pseudo-random encryption key generated by an algorithm; here, this is done by using the Pascal matrix. Encryption and decryption are implemented using MATLAB as the programming language and Notepad++ to write the input text.
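The paper's implementation is in MATLAB and the abstract does not give the exact scheme, so the following is only an illustrative Python sketch of the general idea: a lower-triangular Pascal matrix maps the character codes of the message to an unintelligible vector of integers, and because the inverse of that matrix is also an integer matrix the plain text is recovered exactly. The block handling and the example message are assumptions, not the authors' code.

from math import comb

def pascal_lower(n):
    # Lower-triangular Pascal matrix: P[i][j] = C(i, j) for j <= i, else 0.
    return [[comb(i, j) if j <= i else 0 for j in range(n)] for i in range(n)]

def pascal_lower_inv(n):
    # Exact integer inverse: (P^-1)[i][j] = (-1)**(i - j) * C(i, j) for j <= i.
    return [[(-1) ** (i - j) * comb(i, j) if j <= i else 0 for j in range(n)]
            for i in range(n)]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def encrypt(text):
    codes = [ord(ch) for ch in text]          # works for English and Arabic code points alike
    return matvec(pascal_lower(len(codes)), codes)

def decrypt(cipher):
    return "".join(chr(c) for c in matvec(pascal_lower_inv(len(cipher)), cipher))

cipher = encrypt("Hello")
print(cipher)            # the unintelligible integer vector
print(decrypt(cipher))   # recovers "Hello" exactly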
The revolution of technology in the 21st century has radically changed the climate of opinion concerning second language education. In order to excel in today’s world, teachers and learners need to adopt new roles and be equipped with new skills and competencies that go beyond the basic ones of listening, speaking, reading, and writing; skills that cannot be gained if teachers teach mere academic subjects and students are evaluated on how well they have learnt the minute sub-skills in those content areas.

This session will touch upon several skills which may be considered the new basics of the 21st century. Among these skills are: autonomy, active learning, critical thinking, cooperative learning, and digita