Zernike Moments have been widely used in shape-based image retrieval studies due to their powerful shape representation. However, their strengths and weaknesses have not been clearly highlighted in previous studies, so this representational power could not be fully exploited. In this paper, a method to fully capture the shape representation properties of Zernike Moments is implemented and tested on single-object binary and grey-level images. The proposed method works by determining the boundary of the shape object and then resizing the object to the boundary of the image. Three case studies were conducted. In Case 1, Zernike Moments are computed on the original shape object image. In Case 2, the centroid of the shape object from Case 1 is relocated to the centre of the image. In Case 3, the proposed method first detects the outer boundary of the shape object and then resizes the object to the boundary of the image. Experimental investigations on two benchmark shape image datasets showed that the proposed method in Case 3 provides superior image retrieval performance compared with Case 1 and Case 2. In conclusion, to fully capture the powerful shape representation properties of Zernike Moments, a shape object should be resized to the boundary of the image.
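A minimal sketch of the Case 3 preprocessing described above (detect the outer boundary of the single shape object, then stretch it to the image boundary before computing Zernike Moments) is given below. The use of OpenCV and mahotas, the output size, and the moment order are assumptions for illustration; the paper does not specify its implementation libraries or parameters.

```python
# Hedged sketch of "Case 3" preprocessing + Zernike Moments on a binary image.
# Libraries, image size and moment degree are illustrative assumptions.
import cv2
import numpy as np
from mahotas.features import zernike_moments

def case3_zernike(binary_img, out_size=128, degree=8):
    # Outer contour of the (single) shape object.
    contours, _ = cv2.findContours(binary_img, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    # Crop to the bounding box and resize the object to the image boundary.
    cropped = binary_img[y:y + h, x:x + w]
    resized = cv2.resize(cropped, (out_size, out_size),
                         interpolation=cv2.INTER_NEAREST)
    # Zernike Moments over a disc covering the (now object-filling) image.
    return zernike_moments(resized, radius=out_size // 2, degree=degree)
```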
Reliability analysis methods are used to evaluate the safety of reinforced concrete structures by evaluating the limit state function g(Xi). For implicit limit state functions and nonlinear analysis, advanced reliability analysis methods are needed. Monte Carlo simulation (MCS) can be used in this case; however, as the number of input variables increases, the time required for MCS also increases, making it a time-consuming method, especially for complex problems with implicit performance functions. In such cases, MCS-based FORM (First Order Reliability Method) and Artificial Neural Network-based FORM (ANN-FORM) have been proposed as alternatives. However, it is important to note that both MCS-FORM and ANN-FORM can also be time-consuming.
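To make the role of MCS concrete, the following is a minimal sketch of crude Monte Carlo estimation of the failure probability P_f = P(g(X) <= 0). The limit state function and the input distributions here are illustrative placeholders, not the structures or variables analysed in the paper.

```python
# Hedged sketch: crude Monte Carlo estimate of P_f = P(g(X) <= 0).
import numpy as np

def mcs_failure_probability(g, sample_inputs, n_samples=100_000, seed=0):
    rng = np.random.default_rng(seed)
    x = sample_inputs(rng, n_samples)      # shape (n_samples, n_vars)
    failures = g(x) <= 0.0                 # failure domain: g(X) <= 0
    p_f = failures.mean()
    # Coefficient of variation of the estimator indicates sampling accuracy.
    cov = np.sqrt((1.0 - p_f) / (n_samples * p_f)) if p_f > 0 else np.inf
    return p_f, cov

# Illustrative example: resistance R minus load effect S, both normal.
p_f, cov = mcs_failure_probability(
    g=lambda x: x[:, 0] - x[:, 1],
    sample_inputs=lambda rng, n: np.column_stack(
        [rng.normal(30.0, 3.0, n), rng.normal(20.0, 4.0, n)]),
)
```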
In this research, a 4×4 factorial experiment applied in a randomized complete block design with a given number of observations was studied. The design of experiments is used to study the effect of treatments on experimental units and thereby obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions introduces noise that affects the observation values and thus increases the mean square error of the experiment. To reduce this noise, multilevel wavelet shrinkage was used as a filter for the observations, by suggesting an improved threshold that takes the different transformation levels into account based on the logarithm of the b
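A hedged sketch of level-dependent wavelet threshold denoising in the spirit described above is shown below. The level-wise threshold rule used here (a universal threshold scaled per decomposition level) is an illustrative placeholder and is not the improved threshold proposed in the paper.

```python
# Hedged sketch: level-dependent soft thresholding of wavelet coefficients.
import numpy as np
import pywt

def denoise_observations(y, wavelet="db4", level=3):
    coeffs = pywt.wavedec(y, wavelet, level=level)
    # Robust noise estimate from the finest detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    denoised = [coeffs[0]]  # keep the approximation coefficients untouched
    for j, d in enumerate(coeffs[1:], start=1):
        # Level-dependent threshold (illustrative placeholder rule).
        thr = sigma * np.sqrt(2.0 * np.log(len(y))) / np.log2(level - j + 2)
        denoised.append(pywt.threshold(d, thr, mode="soft"))
    return pywt.waverec(denoised, wavelet)[: len(y)]
```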
Purpose: This research seeks to develop the implications of intellectual human capital and social capital in business organizations, and is carried out on three levels. The first level (description) identifies, diagnoses, and presents the philosophical content of strategic human resource management in modern administrative thought, represented by human capital and social capital. The second level (analysis) examines the extent to which the alignment between human capital and social capital affects the organizational strength of organizations. The third level (prediction) formulates a plan to strengthen the organizational strength of business organizations and to develop speci
DBN Rashid, Asian Quarterly: An International Journal of Contemporary Issue, 2018
Owing to its distinctive features and applications in various fields, nanotechnology has become a focus of attention for researchers worldwide. In this study, the concepts of nanotechnology and nanomaterials are introduced, together with the most important methods and techniques for preparing nanomaterials and the principal instruments used in their characterization.
model is derived, and the methodology is given in detail. The model is constructed based on measurement criteria, namely the Akaike and Bayesian information criteria. For the new time series model, a new algorithm has been generated. The forecasting process, one and two steps ahead, is discussed in detail. Some exploratory data analysis is presented at the beginning. The best model is selected based on several criteria and compared with some naïve models. The modified model is applied to a monthly chemical sales dataset (January 1992 to December 2019); the dataset used in this work was downloaded from the United States census website (www.census.gov). Ultimately, the forecasted sales
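The model-selection step by information criteria can be illustrated as follows. The candidate ARIMA orders, the synthetic monthly series, and the use of statsmodels are assumptions for illustration and do not reproduce the paper's model or dataset.

```python
# Hedged sketch: ranking candidate models for a monthly series by AIC and BIC.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

def select_by_information_criteria(y, candidate_orders):
    results = []
    for order in candidate_orders:
        fit = ARIMA(y, order=order).fit()
        results.append({"order": order, "AIC": fit.aic, "BIC": fit.bic})
    return pd.DataFrame(results).sort_values("AIC")

# Synthetic monthly series standing in for the sales data (Jan 1992 - Dec 2019).
rng = np.random.default_rng(0)
y = pd.Series(100 + np.cumsum(rng.normal(0, 1, 336)),
              index=pd.date_range("1992-01", periods=336, freq="MS"))
print(select_by_information_criteria(y, [(1, 1, 0), (0, 1, 1), (1, 1, 1)]))
```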
In this paper, a compact genetic algorithm (CGA) is enhanced by integrating its selection strategy with a steepest descent algorithm (SDA) as a local search method, giving I-CGA-SDA. This system is an attempt to avoid the large CPU time and computational complexity of the standard genetic algorithm. Here, the CGA dramatically reduces the number of bits required to store the population and converges faster. Consequently, this integrated system is used to optimize the maximum likelihood function lnL(φ1, θ1) of the mixed model. Simulation results based on the MSE were compared with those obtained from the SDA and showed that the hybrid genetic algorithm (HGA) and I-CGA-SDA can give good estimators of (φ1, θ1) for the ARMA(1,1) model. Anot
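The compact GA idea referenced above can be sketched as follows: a probability vector replaces the explicit population, two candidates are sampled per generation, and the vector is shifted toward the winner. The bit-string encoding and the OneMax fitness below are illustrative placeholders, not the ARMA(1,1) log-likelihood lnL(φ1, θ1) optimized in the paper.

```python
# Hedged sketch of a compact genetic algorithm (CGA) with a probability vector.
import numpy as np

def compact_ga(fitness, n_bits=16, pop_size=50, generations=2000, seed=0):
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)                 # probability vector replaces the population
    for _ in range(generations):
        a = (rng.random(n_bits) < p).astype(int)
        b = (rng.random(n_bits) < p).astype(int)
        winner, loser = (a, b) if fitness(a) >= fitness(b) else (b, a)
        # Shift the probability vector toward the winner where the bits differ.
        p = np.clip(p + (winner - loser) / pop_size, 0.0, 1.0)
    return (p > 0.5).astype(int)

# Illustrative fitness: maximize the number of ones (OneMax).
best = compact_ga(lambda bits: bits.sum())
```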