For many applications, it is important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by maximum a posteriori (MAP) and maximum entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the techniques of Marr and Geuen, which are considered the best edge detection algorithms in terms of matching human visual contour perception.
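The MAP/ME-based algorithm itself is not reproduced in the abstract; as a minimal illustration of the smoothing-versus-sharpening trade-off it negotiates, a smooth-then-differentiate edge detector (a hedged stand-in, not the paper's method) might look like:

```python
# Minimal sketch (not the paper's MAP/ME algorithm): smooth first to
# suppress noise, then differentiate to find edges. sigma controls the
# smoothing/sharpening trade-off described in the abstract.
import numpy as np
from scipy import ndimage

def edge_map(image, sigma=1.5, threshold=0.1):
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma=sigma)
    gx = ndimage.sobel(smoothed, axis=1)   # horizontal intensity gradient
    gy = ndimage.sobel(smoothed, axis=0)   # vertical intensity gradient
    magnitude = np.hypot(gx, gy)
    magnitude /= magnitude.max() or 1.0    # normalise to [0, 1]
    return magnitude > threshold           # boolean edge map
```

A larger sigma gives less noise sensitivity at the cost of blurrier contours, which is the trade-off the abstract refers to.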
In recent years, the world has witnessed a rapid growth in attacks on the internet, resulting in degraded network performance. The growth has been in both the quantity and the versatility of the attacks. To cope with this, new detection techniques are required, especially ones that use artificial intelligence, such as machine-learning-based intrusion detection and prevention systems. Many machine learning models are used to deal with intrusion detection, each with its own pros and cons, and this is where this paper falls in: a performance analysis of different machine learning models for intrusion detection systems based on supervised machine learning algorithms. Using the Python Scikit-Learn library, KNN, Support Vector …
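A hedged sketch of the kind of Scikit-Learn pipeline such a comparison rests on; the synthetic dataset here is a stand-in for real intrusion records, since the abstract does not say which dataset is used:

```python
# Hedged sketch of a Scikit-Learn KNN classifier of the kind compared in
# the paper; make_classification is a stand-in for real intrusion data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import classification_report

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```

Swapping KNeighborsClassifier for the other supervised estimators under test is what makes a side-by-side performance analysis straightforward in this library.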
The density functional B3LYP is used to investigate the effect of decorating a silver (Ag) atom on the sensing capability of an AlN nanotube (AlN-NT) in detecting thiophosgene (TP). There is a weak interaction between the pristine AlN-NT and TP, with a sensing response (SR) of approximately 9.4. Decorating an Ag atom onto the structure of AlN-NT causes the adsorption energy of TP to fall from −6.2 to −22.5 kcal/mol, and the corresponding SR increases significantly to 100.5. Moreover, the recovery time for TP desorbing from the surface of the Ag-decorated AlN-NT (Ag@AlN-NT) is short, i.e., 24.9 s. The results show that Ag@AlN-NT can selectively detect TP among other gases such as N2, O2, CO2, CO, and H2O.
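The abstract quotes the sensing response and recovery time without their working definitions; the conventional relations used in this class of DFT sensor studies (assumed here, not taken from the paper) are:

```latex
% Conductivity-based sensing response (assumed convention): adsorption
% changes the band gap E_g, which changes the conductivity sigma.
\sigma \propto \exp\!\left(\frac{-E_g}{2kT}\right), \qquad
\mathrm{SR} = \left|\frac{\sigma_{\text{complex}} - \sigma_{\text{pristine}}}
                         {\sigma_{\text{pristine}}}\right| \times 100

% Recovery time from transition-state theory, with attempt frequency nu_0:
% stronger adsorption (more negative E_ad) gives exponentially longer tau.
\tau = \nu_0^{-1}\,\exp\!\left(\frac{-E_{\mathrm{ad}}}{kT}\right)
```

Under these definitions, the stronger adsorption after Ag decoration explains both the jump in SR and why a recovery time of 24.9 s counts as short.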
Regression analysis is a cornerstone of statistics, and it mostly depends on the ordinary least squares (OLS) method. As is well known, however, OLS requires several conditions to hold if it is to operate accurately, and its results can be unreliable otherwise; moreover, the absence of certain conditions makes it impossible to complete the analysis with this method at all. Among those conditions is the absence of multicollinearity, and we detect that problem between the independent variables using the Farrar–Glauber test. Another is the linearity of the data; since this last condition was not met, we resorted to the …
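For context, the Farrar–Glauber overall test builds a chi-square statistic from the determinant of the predictors' correlation matrix; a minimal sketch of the standard formula (not code from the paper):

```python
# Hedged sketch of the Farrar-Glauber chi-square test for multicollinearity
# among the independent variables. X: (n observations x p predictors).
import numpy as np
from scipy import stats

def farrar_glauber_chi2(X):
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)        # predictor correlation matrix
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2                    # H0: predictors are orthogonal
    return chi2, stats.chi2.sf(chi2, df)    # statistic, p-value
```

A determinant near zero (highly correlated predictors) drives the statistic up, rejecting the null hypothesis of orthogonality.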
Many carbonate reservoirs around the world show a tilted original oil-water contact (OOWC), which requires special consideration in the selection of capillary pressure curves and an understanding of reservoir fluid distribution when initializing reservoir simulation models.
An analytical model for predicting the capillary pressure across the interface that separates two immiscible fluids was derived from reservoir pressure transient analysis. The model reflected the entire interaction between the reservoir-aquifer fluids and rock properties measured under downhole reservoir conditions.
This model retained the natural coupling of oil reservoirs with the aquifer zone and treated them as an explicit-region composite system.
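The derived model itself is not reproduced in the abstract; the basic capillary-equilibrium relations it builds on are standard and are sketched here for context:

```latex
% Capillary pressure across the oil-water interface:
P_c = p_o - p_w

% Height of the oil-water contact above the free-water level; lateral
% variation in P_c (through rock and fluid properties) is what allows
% the contact to tilt across the reservoir:
h = \frac{P_c}{(\rho_w - \rho_o)\,g}
```

This is why the choice of capillary pressure curves controls the initialized fluid distribution in a simulation model with a tilted OOWC.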
Solar photovoltaic (PV) power generation is being implemented in every nation worldwide due to its environmentally clean characteristics, and PV technology is growing significantly in present applications and usage of PV power systems. Despite the strength of PV arrays in power systems, the arrays remain susceptible to certain faults. An effective supply requires economic returns, the safety of equipment and humans, and precise tools for fault identification, diagnosis, and interruption. Meanwhile, unidentified arc faults lead to serious fire hazards in commercial, residential, and utility-scale PV systems. To ensure a secure and dependable distribution of electricity, the detection of such hazards …
Suicidal ideation is one of the most severe mental health issues faced by people all over the world. Various risk factors can lead to suicide; the most common and critical among them are depression, anxiety, social isolation, and hopelessness. Early detection of these risk factors can help in preventing or reducing the number of suicides. Online social networking platforms such as Twitter, Reddit, and Facebook are becoming a new way for people to express themselves freely without worrying about social stigma. This paper presents a methodology and experimentation using social media as a tool to analyse suicidal ideation in a better way, thus helping to prevent the chances of someone becoming a victim of …
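A minimal sketch of the general approach (TF-IDF features plus a linear classifier over posts); the toy corpus and the model choice are illustrative assumptions, not the paper's pipeline:

```python
# Hedged sketch: classify social-media posts for at-risk language.
# The tiny inline corpus is illustrative only, not real data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "I feel hopeless and completely alone",   # toy at-risk example
    "nothing is worth it anymore",            # toy at-risk example
    "had a great day hiking with friends",    # toy neutral example
    "excited about the new semester",         # toy neutral example
]
labels = [1, 1, 0, 0]                         # 1 = at-risk language

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(posts, labels)
print(clf.predict(["everything feels pointless lately"]))
```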
With the rapid development of computers and network technologies, the security of information on the internet is becoming compromised, and many threats may affect the integrity of such information. Much research has focused on providing solutions to this threat. Machine learning and data mining are widely used in anomaly-detection schemes to decide whether or not a malicious activity is taking place on a network. In this paper, a hierarchical classification scheme for an anomaly-based intrusion detection system is proposed, using two levels of feature selection and classification. In the first level, a global feature vector for detecting the basic attack classes (DoS, U2R, R2L, and Probe) is selected. In the second level, four local feature vectors …
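A hedged sketch of the two-level idea: a global classifier assigns the coarse attack class, then a per-class model with its own local feature subset refines the decision. The model choices and structure here are assumptions, not the paper's implementation:

```python
# Hedged sketch of a two-level hierarchical IDS. X, y_coarse, y_fine are
# NumPy arrays; local_features maps a coarse class to its feature columns.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class HierarchicalIDS:
    def __init__(self, local_features):
        self.local_features = local_features
        self.global_clf = DecisionTreeClassifier()
        self.local_clfs = {c: DecisionTreeClassifier() for c in local_features}

    def fit(self, X, y_coarse, y_fine):
        self.global_clf.fit(X, y_coarse)              # level 1: basic class
        for c, cols in self.local_features.items():   # level 2: refinement
            mask = y_coarse == c
            if mask.any():
                self.local_clfs[c].fit(X[mask][:, cols], y_fine[mask])
        return self

    def predict(self, X):
        coarse = self.global_clf.predict(X)
        fine = np.empty(len(X), dtype=object)         # None if no local model
        for c, cols in self.local_features.items():
            mask = coarse == c
            if mask.any():
                fine[mask] = self.local_clfs[c].predict(X[mask][:, cols])
        return fine
```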
This research deals with the qualitative and quantitative interpretation of Bouguer gravity anomaly data for a region located to the SW of Qa'im City within Anbar province, using 2-D mapping methods. The residual gravity field was obtained graphically by subtracting the regional gravity values from the total Bouguer anomaly. The residual field was then processed to reduce noise by applying the gradient operator and first directional-derivative filtering, which helped in locating points of sudden variation in gravity values. Such variations may be produced by subsurface faults, fractures, cavities, or the limits of lateral variations in subsurface facies. A major fault was predicted to extend in the direction NE-…
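As an illustration of the gradient-operator step, a minimal sketch that computes the horizontal-gradient magnitude of a gridded residual field (an assumed NumPy implementation, not the paper's workflow):

```python
# Hedged sketch: horizontal-gradient filtering of a gridded residual
# Bouguer anomaly; maxima flag abrupt changes (possible faults, fractures,
# cavities, or facies limits).
import numpy as np

def horizontal_gradient(residual, dx=1.0, dy=1.0):
    """residual: 2-D grid of residual gravity values (mGal);
    dx, dy: grid spacing along x and y."""
    gy, gx = np.gradient(residual, dy, dx)   # first directional derivatives
    return np.hypot(gx, gy)                  # gradient magnitude map
```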
The consensus algorithm is the core mechanism of blockchain and is used to ensure data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in consortium chains because it is resistant to Byzantine faults. However, present-day PBFT still has issues: master-node selection is random, and communication is complicated. This study proposes an enhanced consensus algorithm, IBFT, based on node trust values and BLS (Boneh-Lynn-Shacham) aggregate signatures. In IBFT, multi-level indicators are used to calculate the trust value of each node, and based on this calculation some nodes are selected to take part in network consensus. The master node is chosen …
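A minimal sketch of the trust-value selection step described above; the indicator names and weights are placeholders, and the BLS signature aggregation is omitted:

```python
# Hedged sketch: score nodes on weighted multi-level indicators, take the
# top-k as the consensus committee, highest-trust node as master.
def select_consensus_nodes(nodes, weights, k):
    """nodes: {node_id: {indicator: score}}; weights: {indicator: weight}."""
    trust = {n: sum(weights[i] * scores[i] for i in weights)
             for n, scores in nodes.items()}
    ranked = sorted(trust, key=trust.get, reverse=True)
    return ranked[0], ranked[:k]             # master node, consensus committee

# Hypothetical indicators (uptime, latency score, past behaviour):
nodes = {
    "A": {"uptime": 0.99, "latency": 0.8, "history": 0.9},
    "B": {"uptime": 0.90, "latency": 0.9, "history": 0.7},
    "C": {"uptime": 0.60, "latency": 0.5, "history": 0.4},
}
weights = {"uptime": 0.5, "latency": 0.2, "history": 0.3}
master, committee = select_consensus_nodes(nodes, weights, k=2)
```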
In this study, an analysis of the effect of re-using the JPEG lossy algorithm on the quality of satellite imagery is presented. The standard JPEG compression algorithm is adopted and applied using the IrfanView program, with the JPEG quality factor ranging from 50 to 100. Based on the calculated variation in satellite image quality, the JPEG lossy algorithm is re-used up to 50 times in this study. The degradation of image quality as a function of the JPEG quality factor and of the number of times the JPEG algorithm is re-used to store the satellite image is analyzed.
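A hedged sketch of the repeated-compression experiment using Pillow instead of IrfanView (an assumed re-implementation, not the authors' tooling):

```python
# Hedged sketch: repeatedly save/reload a JPEG at a fixed quality factor
# and report the quality loss as PSNR against the original image.
import io
import numpy as np
from PIL import Image

def recompress(image, quality=75, cycles=50):
    ref = np.asarray(image, dtype=float)
    for _ in range(cycles):
        buf = io.BytesIO()
        image.save(buf, format="JPEG", quality=quality)
        buf.seek(0)
        image = Image.open(buf).convert(image.mode)  # decode the new copy
    out = np.asarray(image, dtype=float)
    mse = np.mean((ref - out) ** 2)
    psnr = (10 * np.log10(255 ** 2 / mse)) if mse else float("inf")
    return image, psnr
```

Sweeping quality over 50 to 100 and cycles up to 50 reproduces the kind of degradation curve the study analyzes.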