Segmentation of urban features is considered a major research challenge in the fields of photogrammetry and remote sensing. However, the dense datasets now readily available through airborne laser scanning (ALS) offer increased potential for 3D object segmentation. Such potential is further augmented by the availability of full-waveform (FWF) ALS data. FWF ALS has demonstrated enhanced performance in segmentation and classification through the additional physical observables it can provide alongside standard geometric information. However, use of FWF information is not recommended without prior radiometric calibration that takes into account all parameters affecting the backscattered energy. This paper reports the implementation of a radiometric calibration workflow for FWF ALS data and demonstrates how the resulting FWF information can be used to improve segmentation of an urban area. The developed segmentation algorithm presents a novel approach which uses the calibrated backscatter cross-section as a weighting function to estimate the segmentation similarity measure. The normal vector and the local Euclidean distance are used as criteria to segment the point clouds through a region-growing approach. The paper demonstrates the potential to enhance 3D object segmentation in urban areas by integrating the FWF physical backscattered energy alongside geometric information. The method is demonstrated through application to an area of interest sampled from a relatively dense FWF ALS dataset. The results are assessed through comparison to those delivered by using only geometric information. Validation against a manual segmentation demonstrates a successful automatic implementation, achieving a segmentation accuracy of 82% and outperforming a purely geometric approach.
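To illustrate the kind of procedure described, the following is a minimal region-growing sketch, not the paper's implementation: point coordinates, unit normals, and calibrated cross-sections are assumed to be NumPy arrays, and the function name and thresholds are hypothetical. The cross-section enters as a weight on the normal-agreement similarity, while the local Euclidean distance is handled by the neighbourhood radius.

```python
# A minimal sketch, assuming points (N, 3), unit normals (N, 3), and
# calibrated cross-sections (N,) as NumPy arrays; thresholds are illustrative.
import numpy as np
from scipy.spatial import cKDTree

def region_grow(points, normals, cross_section, radius=0.5, sim_thresh=0.8):
    """Grow regions over neighbours whose normal agreement, weighted by
    the calibrated backscatter cross-section, exceeds a threshold."""
    tree = cKDTree(points)
    labels = np.full(len(points), -1, dtype=int)  # -1 = not yet segmented
    region = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        labels[seed] = region
        stack = [seed]
        while stack:
            i = stack.pop()
            # Local Euclidean distance criterion: only points within `radius`.
            for j in tree.query_ball_point(points[i], r=radius):
                if labels[j] != -1:
                    continue
                geom = abs(np.dot(normals[i], normals[j]))  # normal agreement
                # Cross-section acts as a weight on the similarity measure.
                w = np.exp(-abs(cross_section[i] - cross_section[j])
                           / max(cross_section[i], 1e-9))
                if geom * w < sim_thresh:
                    continue
                labels[j] = region
                stack.append(j)
        region += 1
    return labels
```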
Corpus linguistics is a methodology for studying language through corpus-based research. It differs from the traditional (prescriptive) approach to studying a language in its insistence on the systematic study of authentic examples of language in use (the descriptive approach). A “corpus” is a large body of machine-readable, systematically collected, naturally occurring linguistic data, either written texts or transcriptions of recorded speech, which can be used as a starting point of linguistic description or as a means of verifying hypotheses about a language. In the past decade, interest has grown tremendously in the use of language corpora for language education. The ways in which corpora have been employed in language pedagogy…
The need for steganography methods to hide a secret message inside an image has risen. This study therefore develops a practical steganography procedure for hiding text in an image. This operation allows the user to provide the system with both a text and a cover image, and to obtain a resulting image that contains the hidden text. The suggested technique is to hide a text inside the header formats of a digital image. The Least Significant Bit (LSB) method is used to hide the message, in order to keep the features and characteristics of the original image. A new method is applied by using the whole image (header formats) to hide the message. The experimental results show that the suggested technique gives a higher embedding…
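As background for the LSB component, here is a minimal pixel-LSB sketch; the paper's header-format embedding is not reproduced, and the 8-bit grayscale assumption and function names are illustrative.

```python
# A minimal LSB text-embedding sketch (generic pixel LSB, not the paper's
# header-format variant); assumes an 8-bit grayscale image as a NumPy array.
import numpy as np

def embed_lsb(image: np.ndarray, text: str) -> np.ndarray:
    # NUL-terminate the message so extraction knows where it ends.
    bits = np.unpackbits(np.frombuffer(text.encode() + b"\x00", dtype=np.uint8))
    flat = image.flatten()
    if len(bits) > len(flat):
        raise ValueError("message too long for this cover image")
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits  # overwrite LSBs only
    return flat.reshape(image.shape)

def extract_lsb(stego: np.ndarray) -> str:
    bits = stego.flatten() & 1                   # read back the LSB plane
    data = np.packbits(bits).tobytes()
    return data.split(b"\x00", 1)[0].decode()    # stop at the NUL terminator
```

Because only the least significant bit of each pixel changes, the visual features and statistics of the cover image are largely preserved, which is the property the abstract relies on.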
Cloud computing provides a huge amount of space for storing data, but with the increase in the number of users and the size of their data, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and ensuring the security and privacy of data. One of the important methods for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of the data and eliminates the extra copies. To offer security and privacy for sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of…
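The sketch below, with a hypothetical store API, shows the mechanism such attacks exploit: in hash-based deduplication, a short content hash is treated as proof of possession, so anyone who learns the hash can test for (or claim) a file without ever holding its bytes.

```python
# A minimal deduplicating-store sketch (hypothetical API), illustrating why
# a short hash signature alone can leak whether a file already exists.
import hashlib

class DedupStore:
    def __init__(self):
        self._blobs = {}  # content hash -> file bytes

    def upload(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self._blobs.setdefault(digest, data)  # keep only the first copy
        return digest

    def exists(self, digest: str) -> bool:
        # Hash-only check: an attacker holding just the digest learns
        # whether some other user has already stored this exact file.
        return digest in self._blobs
```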
In this paper, we derive an estimator of the reliability function for the Laplace distribution with two parameters using the Bayes method with a squared error loss function, Jeffreys' formula, and the conditional probability of a random variable of observation. The main objective of this study is to find the efficiency of the derived Bayesian estimator compared with the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators for all sample sizes.
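A minimal Monte Carlo sketch of the comparison framework follows; the paper's Bayes estimator is derivation-specific and is not reproduced, so only the maximum likelihood and moment plug-in estimators of the reliability function R(t) = P(X > t) are compared by mean squared error, with parameter values chosen for illustration.

```python
# Monte Carlo MSE comparison of ML and moment plug-in estimators of the
# Laplace(mu, b) reliability function; the Bayes estimator is omitted here.
import numpy as np

def reliability(t, mu, b):
    # R(t) = P(X > t) for Laplace(mu, b).
    z = (t - mu) / b
    return np.where(z >= 0, 0.5 * np.exp(-z), 1 - 0.5 * np.exp(z))

def mc_mse(mu=0.0, b=1.0, t=1.0, n=30, reps=5000, seed=0):
    rng = np.random.default_rng(seed)
    true_R = reliability(t, mu, b)
    err_ml, err_mm = [], []
    for _ in range(reps):
        x = rng.laplace(mu, b, size=n)
        mu_ml = np.median(x)                      # MLE of location
        b_ml = np.mean(np.abs(x - mu_ml))         # MLE of scale
        mu_mm = np.mean(x)                        # moment estimators:
        b_mm = np.sqrt(np.var(x) / 2)             # Var(X) = 2 b^2
        err_ml.append((reliability(t, mu_ml, b_ml) - true_R) ** 2)
        err_mm.append((reliability(t, mu_mm, b_mm) - true_R) ** 2)
    return np.mean(err_ml), np.mean(err_mm)

print(mc_mse())  # relative efficiency = ratio of the two MSEs
```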
Cassava, a significant crop in Africa, Asia, and South America, is a staple food for millions. However, classifying cassava species using conventional color, texture, and shape features is inefficient, as cassava leaves exhibit similarities across different types, including toxic and non-toxic varieties. This research aims to overcome the limitations of traditional classification methods by employing deep learning techniques with a pre-trained AlexNet as the feature extractor to accurately classify four types of cassava: Gajah, Manggu, Kapok, and Beracun. The dataset was collected from local farms in Lamongan, Indonesia, with the help of agricultural research experts; it consists of 1,400 images, and each type of cassava has…
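A minimal sketch of the feature-extraction setup follows, assuming PyTorch/torchvision; the training loop, data pipeline, and hyperparameters are not taken from the paper.

```python
# A minimal transfer-learning sketch: pre-trained AlexNet as a frozen
# feature extractor with a new 4-class head (Gajah, Manggu, Kapok, Beracun).
import torch
import torch.nn as nn
from torchvision import models

model = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False               # freeze the pre-trained backbone
model.classifier[6] = nn.Linear(4096, 4)  # replace the 1000-class head

# Only the new head is trained; all features come from the frozen network.
optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-4)
```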
The nature of the dark sector of the Universe remains one of the outstanding problems in modern cosmology, with the search for new observational probes guiding the development of the next generation of observational facilities. Clues come from tension between the predictions of Λ cold dark matter (ΛCDM) and observations of gravitationally lensed galaxies. Previous studies showed that galaxy clusters in ΛCDM are not strong enough lenses to reproduce the observed number of lensed arcs. This work aims to constrain warm dark matter (WDM) cosmologies by means of the lensing efficiency of galaxy clusters drawn from these alternative models. The lensing characteristics of two samples of simulated clusters in the Λ warm dark matter and ΛCDM cosmologies…
Five serological methods for the detection of Brucella were compared in this study. Four of the methods are commonly used for detection: (1) Rose Bengal, a primary screening test that depends on detecting antibodies in the blood serum; (2) IFAT, which detects IgG and IgM antibodies in the serum; (3) the ELISA test, which detects IgG antibodies in the serum; and (4) the 2ME test, which detects IgG antibodies. The fifth method was developed by a researcher in one of the health centers in Baghdad and was given the name Spot Immune Assay (SIA). The results show that among 100 samples of patients' blood, 76, 49, 49, 37, and 28 samples were positive in the Rose Bengal, ELISA, SIA, 2ME, and IFAT tests, respectively. When efficiency, sensitivity, and specificity…
This study aims to:
1st: Identify the level of alexithymia among 6th grade students (the study sample) through the following null hypotheses:
1. There are no statistically significant differences at the (0.05) level between the arithmetic mean of the sample scores as a whole and the hypothetical mean on the scale of deficit in expressing emotions.
2. There are no statistically significant differences at the (0.05) level between the arithmetic mean of the male students' sample and the arithmetic mean of the female students' sample on the scale of alexithymia.
2nd: Identify the level of emotional intelligence among 6th grade students (the study sample) through the following null hypothesis:
1) There are no statistically significant…
The application of test case prioritization is a key part of system testing, intended to detect and resolve issues early in the development stage. Traditional prioritization techniques frequently fail to take into account the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully solve this problem. This study proposes a hybrid meta-heuristic method that addresses these challenges. The strategy combines genetic algorithms with the black hole algorithm to create a smooth trade-off between exploring numerous possibilities and exploiting the best solutions. The proposed hybrid genetic black hole algorithm (HGBH) uses the…
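A schematic sketch of such a hybrid follows; it is not the paper's HGBH. Test orderings are permutations scored by APFD (average percentage of faults detected), GA-style swap mutation provides exploration, and a black-hole attraction step pulls solutions toward the current best ordering; crossover is omitted for brevity, and all parameters are illustrative.

```python
# A schematic GA + black-hole hybrid for test case prioritization
# (hypothetical; not the paper's HGBH). Assumes every fault in `faults`
# is detected by at least one test.
import random

def apfd(order, faults):
    """faults[f] = set of test indices that detect fault f."""
    n, m = len(order), len(faults)
    pos = {t: i + 1 for i, t in enumerate(order)}          # 1-based ranks
    first = [min(pos[t] for t in detecting) for detecting in faults]
    return 1 - sum(first) / (n * m) + 1 / (2 * n)

def mutate(sol, rate=0.2):
    """GA-style exploration: occasionally swap two positions."""
    sol = sol[:]
    if random.random() < rate:
        i, j = random.sample(range(len(sol)), 2)
        sol[i], sol[j] = sol[j], sol[i]
    return sol

def pull_toward(sol, best, strength=0.3):
    """Black-hole step: fix a fraction of tests at their positions in
    the best ordering, filling the rest in sol's own order."""
    keep = set(random.sample(best, int(strength * len(best))))
    rest = iter(t for t in sol if t not in keep)
    return [t if t in keep else next(rest) for t in best]

def hgbh(n_tests, faults, pop=20, gens=100):
    pop_ = [random.sample(range(n_tests), n_tests) for _ in range(pop)]
    for _ in range(gens):
        pop_.sort(key=lambda s: apfd(s, faults), reverse=True)
        best = pop_[0]                                      # the "black hole"
        pop_ = [best] + [pull_toward(mutate(s), best) for s in pop_[1:]]
    return max(pop_, key=lambda s: apfd(s, faults))

# Example: 6 tests, 3 faults, each fault listed with the tests detecting it.
print(hgbh(6, [{0, 3}, {2}, {1, 4, 5}]))
```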