This research examines the extent to which the Independent High Electoral Commission applies information security risk management in accordance with the international standard ISO/IEC 27005, in terms of policies, administrative and technical procedures, and the techniques used to manage information security risks. It draws on the opinions of experts in the sector who hold relevant positions (the directorate's general manager, department heads and their deputies, project managers, heads of divisions, and staff authorized to access systems and software). The importance of the research lies in giving a clear picture of information security risk management in the organization under study, because of its significant role in identifying risks and setting appropriate controls to manage or eliminate them, in allowing flexibility when setting controls, and in assuring stakeholders and customers that their data is protected. Compliance with the controls gives customers confidence that the organization is a reliable supplier, raises its ability to meet tender requirements, and thus opens new business opportunities. This motivated addressing the topic by focusing on the basic requirements of the standard, studying them, and identifying the most critical problems that prevent their application in the commission under study.
The Independent High Electoral Commission / National Office in Baghdad was chosen as the research site. A case-study and applied-research approach was followed, using field observation, interviews, and access to documents and information extracted from records, in order to determine the size of the gap between the commission's information security department and the system required by the standard, to analyze the causes of the gaps, and to develop solutions. The research relied on the checklists prepared by the International Organization for Standardization. For data analysis, a seven-point scale was used in the checklists to measure how closely the actual implementation and documentation conform to the requirements of the standard, with a specific weight allocated to each point of the scale. Two statistical measures, the percentage and the weighted mean, were used to express the extent of application and documentation of the standard's clauses, and the main causes behind the emergence of the gaps were identified. The results revealed several reasons that prevented the application of information security risk management, and in light of them treatments were developed to reduce the gaps that appeared. The most important findings are that the commission has not adopted a clear, documented strategy for treating risks, and that its information security risk management is ineffective and not fully secured against external and internal threats.
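The two statistics above can be sketched in a few lines. The counts and the 0-6 weighting below are hypothetical illustrations; the research defines its own weights for the seven scale points.

```python
# A minimal sketch (hypothetical data) of the two statistics used in the
# checklist analysis: the weighted mean of seven-point-scale answers and
# the percentage expressing the degree of application and documentation.

# Hypothetical answer counts per scale point for one checklist item:
# index 0 = "not applied / not documented" ... index 6 = "fully applied and documented"
counts = [2, 1, 0, 3, 4, 5, 10]
weights = list(range(7))  # assumed weights 0..6, one per scale point

total_answers = sum(counts)
weighted_mean = sum(w * c for w, c in zip(weights, counts)) / total_answers

# Percentage of the maximum attainable score for this item
percentage = 100 * weighted_mean / max(weights)

print(f"weighted mean = {weighted_mean:.2f}")
print(f"application percentage = {percentage:.1f}%")
```

Comparing each item's percentage against a conformance threshold is then what exposes the gap between actual practice and the standard's requirement.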
Attention was also paid to documenting fixed and portable hardware: the computers used at the directorate's headquarters, the servers and small computers used as workstations in divisions and departments and their connection to senior management, as well as laptops and personal digital assistants. This revealed a gap attributed to totally undocumented application for hardware (automatic data-processing equipment, processing accessories, and electronic media), while application was partial and undocumented for other electronic media, including disk drives, printers, paper, and documents.
In this paper, the theoretical cross section of a pre-equilibrium nuclear reaction has been studied at an energy of 22.4 MeV. Ericson's formula for the partial level density (PLD) and its corrections (Williams' correction and the spin correction) have been substituted into the theoretical cross section and compared with the experimental data for the nucleus. It was found that the theoretical cross section with the one-component PLD from Ericson's formula does not agree with the experimental value; there is limited agreement only at the high end of the energy range. The theoretical cross section that depends on the one-component Williams formula and on the one-component formula corrected for spin
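For reference, the one-component partial level density that the comparison above starts from is commonly written as follows. This is the standard form from the pre-equilibrium literature, not reproduced from this paper:

```latex
% Ericson's one-component partial level density for p particles and h holes
% at excitation energy E, with single-particle level density g and n = p + h:
\omega(p,h,E) = \frac{g\,(gE)^{n-1}}{p!\,h!\,(n-1)!}, \qquad n = p + h
% Williams' correction subtracts the Pauli-blocking term A_{p,h}:
\omega(p,h,E) = \frac{g\,\bigl(gE - A_{p,h}\bigr)^{n-1}}{p!\,h!\,(n-1)!},
\qquad A_{p,h} = \frac{p^{2} + h^{2} + p - 3h}{4}
```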
Exposure to loud noise can damage the auditory system and affect human health, and traffic noise is a primary contributor to noise pollution. To measure noise levels, three variables were examined at 25 locations. The main factors found to increase the noise level are traffic volume, vehicle speed, and road functional class. Data were collected during three different periods per day so that they represent and cover the city's traffic noise under heavy traffic-flow conditions. Traffic noise prediction was then carried out using a simple linear regression model to predict the equivalent continuous sound level. The difference between the predicted and the measured noise shows that
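A simple linear regression of this kind can be sketched as below. The volume and Leq values are synthetic placeholders, not the paper's measurements, and the single-predictor form (traffic volume only) is an illustrative simplification of the three-variable model described above.

```python
# A minimal sketch of simple linear regression predicting the equivalent
# continuous sound level (Leq, dBA) from traffic volume, using ordinary
# least squares via numpy.polyfit. Data here are synthetic.
import numpy as np

# Hypothetical observations: traffic volume (veh/h) and measured Leq (dBA)
volume = np.array([400, 800, 1200, 1600, 2000], dtype=float)
leq = np.array([62.0, 66.5, 69.0, 71.5, 73.0])

# Fit Leq = a * volume + b (degree-1 polynomial = simple linear regression)
a, b = np.polyfit(volume, leq, 1)
predicted = a * volume + b

# Residuals: the difference between predicted and measured noise
residuals = leq - predicted

print(f"Leq ~ {a:.5f} * volume + {b:.2f}")
```

Inspecting the residuals is what lets the authors judge how accurately the fitted model reproduces the measured noise.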
A new, simple, sensitive, and accurate spectrophotometric method has been developed for the determination of the drug sulfamethoxazole (SMZ) in pure and dosage forms. The method is based on the reaction of SMZ with 1,2-naphthoquinone-4-sulphonic acid (NQS) to form an N-alkylamino naphthoquinone by replacement of the sulphonate group of NQS with an amino group. The coloured chromogen shows an absorption maximum at 460 nm. The optimum conditions of the condensation reaction were investigated by (1) the univariable method, optimizing the effect of the experimental variables (different bases, reagent concentration, borax concentration, and reaction time), and (2) central composite design (CCD), including the effect of
In this work we present a technique to extract heart contours from noisy echocardiograph images. The technique is based on improving the image before applying contour detection, in order to reduce heavy noise and obtain better image quality. To do this, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiograph images. After applying these techniques, we obtain a legible detection of the heart boundaries and valve movement using traditional edge-detection methods.
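The filter-then-detect chain described above can be sketched with numpy alone. The synthetic image, function names, and threshold below are illustrative assumptions (a real pipeline would use a library such as OpenCV, and the paper's exact filters and morphological operators are not reproduced here):

```python
# A minimal sketch of the pre-processing chain: noise filtering,
# contrast adjustment, then gradient-based edge detection.
import numpy as np

def median_filter3(img):
    """3x3 median filter to suppress speckle-like noise."""
    padded = np.pad(img, 1, mode="edge")
    shifts = [padded[y:y + img.shape[0], x:x + img.shape[1]]
              for y in range(3) for x in range(3)]
    return np.median(np.stack(shifts), axis=0)

def stretch_contrast(img):
    """Linear contrast stretching to the full 0..1 range."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo) if hi > lo else img

def edge_map(img, thresh=0.2):
    """Binary edge map from the gradient magnitude (illustrative threshold)."""
    gy, gx = np.gradient(img)
    return np.hypot(gx, gy) > thresh

# Synthetic "echocardiogram": a bright structure buried in heavy noise
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0                    # bright chamber-wall-like region
img += rng.normal(0, 0.2, img.shape)       # heavy additive noise

clean = stretch_contrast(median_filter3(img))
edges = edge_map(clean)
print("edge pixels found:", int(edges.sum()))
```

Running edge detection on the raw noisy image instead of `clean` produces far more spurious responses, which is the motivation for the pre-processing stage.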
In this paper, a fast lossless image compression method is introduced for compressing medical images. It is based on splitting the image into blocks according to their nature, using polynomial approximation to decompose the image signal, and then applying run-length coding to the residue part of the image, which represents the error caused by the polynomial approximation. Finally, Huffman coding is applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method achieves promising performance.
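The run-length stage applied to the residue can be sketched as below. The (value, count) pair format is illustrative, not the paper's exact code stream, and the residue values are hypothetical:

```python
# A minimal sketch of run-length coding on a residue sequence, the kind
# of small-error signal left after polynomial approximation.

def rle_encode(seq):
    """Encode a sequence as (value, run_length) pairs."""
    runs = []
    for v in seq:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([v, 1])       # start a new run
    return [(v, n) for v, n in runs]

def rle_decode(pairs):
    """Invert rle_encode, recovering the original sequence losslessly."""
    return [v for v, n in pairs for _ in range(n)]

residue = [0, 0, 0, 1, 1, 0, 0, 0, 0, -1]
encoded = rle_encode(residue)
print(encoded)                        # [(0, 3), (1, 2), (0, 4), (-1, 1)]
assert rle_decode(encoded) == residue  # lossless round trip
```

Residues from a good polynomial fit are dominated by long runs of small values, which is exactly where run-length coding (followed by Huffman coding of the pairs) pays off.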
Starting from 4,4′-dimercaptobiphenyl, a variety of phenolic Schiff-base derivatives (methylolic, etheric, epoxy) have been synthesized. All proposed structures were supported by FTIR, 1H-NMR, 13C-NMR, and elemental analysis; all analyses were performed at the consultation centre of Jordan University.
Features are descriptions of image content, which could be corners, blobs, or edges. Corners are among the most important features for describing an image, so many algorithms exist to detect them, such as Harris, FAST, and SUSAN. Harris corner detection is an efficient and accurate feature-detection method; it is rotation invariant but not scale invariant. This paper presents a Harris corner detector that is also invariant to scale, an improvement achieved by applying the Gaussian function at different scales. The experimental results illustrate that using the Gaussian in this way is very useful for overcoming the Harris detector's weakness.
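The multi-scale idea can be sketched as below: smooth the image with Gaussians of different sigmas, then compute the standard Harris response on each smoothed copy. The parameter values (`k`, the sigmas) and the synthetic test image are illustrative assumptions, not the paper's implementation; production code would typically use OpenCV's `cornerHarris`.

```python
# A minimal numpy-only sketch of the Harris response evaluated at
# several Gaussian scales.
import numpy as np

def gaussian_kernel(sigma):
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur (rows, then columns)."""
    k = gaussian_kernel(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def harris_response(img, sigma=1.0, k=0.04):
    """Harris corner response R = det(M) - k * trace(M)^2."""
    smoothed = blur(img, sigma)                  # the scale-selection step
    gy, gx = np.gradient(smoothed)
    # Structure tensor entries, averaged over a local Gaussian window
    ixx, iyy, ixy = blur(gx * gx, 1.0), blur(gy * gy, 1.0), blur(gx * gy, 1.0)
    return ixx * iyy - ixy**2 - k * (ixx + iyy)**2

img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0                            # a square: four true corners

for sigma in (1.0, 2.0):                         # evaluate at two scales
    r = harris_response(img, sigma)
    y, x = np.unravel_index(np.argmax(r), r.shape)
    print(f"sigma={sigma}: strongest response near ({y}, {x})")
```

At each scale the strongest responses land near the square's corners; comparing responses across scales is what restores the scale invariance that plain Harris lacks.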
Background: Obesity typically results from a variety of contributing causes and factors, including genetics and lifestyle choices, and is described as an excessive accumulation of body fat; it is a chronic disorder that combines pathogenic environmental and genetic factors. The objective of the current study was therefore to investigate the association between the FTO gene rs9939609 polymorphism and obesity risk, explaining the relationship between the fat mass and obesity-associated (FTO) gene rs9939609 polymorphism and obesity in adults. Methods: We identified research exploring the association between obesity risk and the rs9939609 polymorphism of the FTO gene. We combined the odds ratios (OR) for the total group and subgroups
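The pooled OR computation in a meta-analysis of this kind can be sketched as below. The genotype counts are hypothetical, and the fixed-effect inverse-variance weighting shown here is one common pooling choice, not necessarily the model this study used:

```python
# A minimal sketch (hypothetical counts) of an odds ratio from a 2x2
# genotype-by-outcome table and an inverse-variance pooled OR.
import math

def odds_ratio(a, b, c, d):
    """OR for the table [[carrier cases a, carrier controls b],
                         [non-carrier cases c, non-carrier controls d]]."""
    return (a * d) / (b * c)

def pooled_or(tables):
    """Fixed-effect (inverse-variance) pooled OR on the log scale."""
    num = den = 0.0
    for a, b, c, d in tables:
        log_or = math.log(odds_ratio(a, b, c, d))
        var = 1/a + 1/b + 1/c + 1/d        # variance of log(OR) (Woolf)
        num += log_or / var                # weight each study by 1/var
        den += 1 / var
    return math.exp(num / den)

# Hypothetical studies: (carrier cases, carrier controls,
#                        non-carrier cases, non-carrier controls)
studies = [(120, 80, 100, 110), (60, 45, 50, 70)]
print(f"pooled OR = {pooled_or(studies):.2f}")
```

A pooled OR above 1 would indicate that carrying the risk allele is associated with higher odds of obesity across the combined studies.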