The conventional fuzzy c-means (FCM) algorithm does not fully utilize the spatial information in the image. In this research, we use an FCM algorithm that incorporates spatial information into the membership function for clustering. The spatial function is the summation of the membership functions in the neighborhood of each pixel under consideration. The advantages of the method are that it is less sensitive to noise than other techniques and that it yields more homogeneous regions than other methods, making it a powerful technique for noisy image segmentation.
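To make the update concrete, here is a minimal sketch of a spatially weighted FCM of the kind described above, for a grayscale image. The fuzzifier m, the spatial weights p and q, the window size, and the iteration count are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of spatial FCM: the spatial function h is the sum of
# memberships over each pixel's neighborhood, as in the abstract.
import numpy as np
from scipy.ndimage import uniform_filter

def spatial_fcm(img, c=3, m=2.0, p=1, q=1, win=3, iters=50):
    x = img.astype(float).ravel()
    centers = np.linspace(x.min(), x.max(), c)            # initial cluster centers
    for _ in range(iters):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-9  # distances to centers
        u = 1.0 / (d ** (2.0 / (m - 1.0)))                # standard FCM memberships
        u /= u.sum(axis=0)
        # spatial function: sum of memberships over each pixel's neighborhood
        h = np.stack([uniform_filter(ui.reshape(img.shape), win).ravel()
                      for ui in u]) * (win * win)
        u = (u ** p) * (h ** q)                           # spatially weighted membership
        u /= u.sum(axis=0)
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)               # update cluster centers
    return u.argmax(axis=0).reshape(img.shape)            # hard segmentation labels
```

Because the spatial term pools memberships over a neighborhood, an isolated noisy pixel surrounded by one cluster is pulled toward that cluster, which is where the noise robustness comes from.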
Many image processing and machine learning applications require sufficient image feature selection and representation. This can be achieved by imitating the human ability to process visual information; in particular, human eyes are much more sensitive to changes in intensity (luminance) than to color information. In this paper, we present how to exploit luminance information, organized in a pyramid structure, to transfer properties between two images. Two applications demonstrate the use of the luminance channel in the similarity metric of two images: image generation, where a target image is generated from a source one, and image colorization, where color information is to be borrowed from …
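A minimal sketch of the idea follows: reduce both images to luminance, build a pyramid, and compare level by level. The Rec. 601 luma weights, the 2x box downsampling, and the L1 per-level comparison are assumptions for illustration; the paper's exact pyramid construction and metric may differ. Same-size RGB inputs in [0, 1] are assumed.

```python
import numpy as np

def luminance(rgb):
    # Rec. 601 weighting: eyes are most sensitive to green, least to blue
    return rgb @ np.array([0.299, 0.587, 0.114])

def pyramid(lum, levels=4):
    out = [lum]
    for _ in range(levels - 1):
        h, w = (s // 2 * 2 for s in lum.shape)            # crop to even size
        lum = lum[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        out.append(lum)                                    # 2x box downsample
    return out

def similarity(a_rgb, b_rgb, levels=4):
    # sum of mean absolute luminance differences across pyramid levels;
    # lower values mean the two images are more alike in luminance
    return sum(np.mean(np.abs(pa - pb))
               for pa, pb in zip(pyramid(luminance(a_rgb), levels),
                                 pyramid(luminance(b_rgb), levels)))
```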
Cyber-attacks continue to grow, demanding stronger methods for protecting images. This paper presents DGEN, a Dynamic Generative Encryption Network that combines Generative Adversarial Networks with a key system that adapts to context. Unlike a fixed scheme such as AES, DGEN is designed to adjust itself as new threats appear and to resist brute-force, statistical, and quantum attacks. The design injects randomness, uses learning, and derives keys that depend on each individual image, which should provide strong security and flexibility while keeping computational cost low. Tests were run on several public image datasets, and the results show DGEN outperforming AES, chaos-based methods, and other GAN approaches. Entropy reached 7.99 bits per pixel…
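The entropy figure cited above can be reproduced in form (not in value) with the standard Shannon estimate over the 256-bin histogram of an 8-bit cipher image; an ideal ciphertext approaches 8 bits per pixel. This is a generic metric sketch, not DGEN itself.

```python
import numpy as np

def shannon_entropy(img_u8):
    counts = np.bincount(img_u8.ravel(), minlength=256)
    p = counts / counts.sum()
    p = p[p > 0]                          # drop empty bins to avoid log(0)
    return -(p * np.log2(p)).sum()        # bits per pixel

# a uniformly random image scores close to the ideal 8 bits per pixel
rng = np.random.default_rng(0)
print(shannon_entropy(rng.integers(0, 256, (256, 256), dtype=np.uint8)))
```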
Heart failure (HF) is characterized by family history and clinical examination combined with diagnostic tools such as electrocardiography, chest X-ray, and assessment of left ventricular function by echocardiography. An early diagnosis of heart failure is still based on symptoms of dyspnea and fatigue and on signs of fluid overload. Serum N-terminal pro-B-type natriuretic peptide (NT-pro BNP) is a cardiac biomarker that has emerged as a potential predictor of heart failure; it is used as a sensitive biomarker in the diagnosis and in assessing the severity of heart failure. This study assessed the diagnostic value of NT-pro BNP in Iraqi pediatric patients with heart failure and its correlation with LVEF%, especially in hospital emergency rooms. Ninety (90) consecutive…
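In form, the correlation analysis described is a paired correlation between the biomarker and LVEF%. A minimal sketch follows, assuming SciPy's pearsonr; the values are placeholders for illustration, not study data.

```python
from scipy.stats import pearsonr

nt_pro_bnp = [1200.0, 450.0, 3100.0, 800.0, 2500.0]   # hypothetical pg/mL values
lvef_pct   = [35.0, 55.0, 25.0, 48.0, 30.0]           # hypothetical LVEF% values

r, p_value = pearsonr(nt_pro_bnp, lvef_pct)
# a negative r would mean higher NT-pro BNP accompanies lower LVEF%
print(f"r = {r:.2f}, p = {p_value:.3f}")
```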
Around the mid-1970s, the split-brain theory became the dominant theory of human creativity used in fine art and art education schools. In fact, this theory, which first appeared in the mid-1940s, underwent many radical changes in its concepts and structures, and these changes affected both the teaching of art and art criticism. To update awareness within the art field, this paper reviews the split-brain theory and its relationship with teaching art, from its appearance to its decline in 2013 and after.
The sensor sampling rate (SSR) is an effective and crucial factor in networked control systems, and changing the sensor sampling period after a networked control system has been designed is critical to the system's stability. In this article, a wireless networked control system with multi-rate sensor sampling is proposed to control the temperature of a multi-zone greenhouse. A behavior-based Mamdani fuzzy system is used in three roles: first, to design the fuzzy temperature controller; second, to design a fuzzy gain selector; and third, to design a fuzzy error handler. The main approach of the control system design is to control the input gain of the fuzzy temperature controller depending on the current…
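As a concrete illustration, here is a minimal Mamdani-style sketch of the temperature-controller stage alone. The membership functions, rule base, and output universe are assumptions for illustration, not the paper's design, which also includes the gain selector and error handler.

```python
import numpy as np

def tri(x, a, b, c):
    # triangular membership function peaking at b
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_heater_gain(error):
    """error = setpoint - measured temperature, in degrees C (assumed input)."""
    u = np.linspace(0.0, 1.0, 101)                 # output universe: heater gain
    # rule firing strengths from the error input
    cold = np.clip(error / 5.0, 0.0, 1.0)          # IF error positive (zone too cold)
    ok   = tri(error, -2.0, 0.0, 2.0)              # IF error near zero
    hot  = np.clip(-error / 5.0, 0.0, 1.0)         # IF error negative (zone too hot)
    # Mamdani inference: min implication, max aggregation
    agg = np.maximum.reduce([
        np.minimum(cold, tri(u, 0.6, 1.0, 1.4)),   # THEN gain is high
        np.minimum(ok,   tri(u, 0.2, 0.5, 0.8)),   # THEN gain is medium
        np.minimum(hot,  tri(u, -0.4, 0.0, 0.4)),  # THEN gain is low
    ])
    return float((u * agg).sum() / (agg.sum() + 1e-9))  # centroid defuzzification

print(fuzzy_heater_gain(4.0))   # large positive error -> high gain (about 0.86)
```

The gain-selector and error-handler stages described in the abstract would sit around this block, scaling the error input before it reaches the rule base.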
Regression testing is expensive and therefore calls for optimization. Typically, optimizing test cases means selecting a reduced subset of test cases or prioritizing them so that potential faults are detected at an earlier phase. Many former studies revealed heuristic-dependent mechanisms to attain optimality while reducing or prioritizing test cases; nevertheless, those studies lacked systematic procedures for managing the issue of tied test cases. Moreover, evolutionary algorithms such as genetic algorithms often help in reducing test cases, together with a concurrent decrease in computational runtime. However, when the fault detection capacity must be examined along with other parameters, the method falls short…
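As an illustration of the kind of approach discussed, here is a minimal genetic prioritization sketch with an explicit, deterministic tie-breaking step. The fault matrix, the APFD fitness, and the GA operators are assumptions for illustration, not the study's method.

```python
import random

# FAULTS[i][j] = 1 if test i detects fault j (a hypothetical matrix)
FAULTS = [
    [1, 0, 0, 1],
    [1, 0, 0, 1],   # identical profile to test 0: a tied test case
    [0, 1, 0, 0],
    [0, 0, 1, 0],
]

def apfd(order):
    # Average Percentage of Faults Detected for one test ordering
    n, m = len(order), len(FAULTS[0])
    first = [next(pos + 1 for pos, t in enumerate(order) if FAULTS[t][j])
             for j in range(m)]
    return 1 - sum(first) / (n * m) + 1 / (2 * n)

def prioritize(pop=20, gens=50, seed=0):
    rng = random.Random(seed)
    n = len(FAULTS)
    population = [rng.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        # ties in APFD are broken deterministically by the permutation itself,
        # so equal-fitness orderings never shuffle arbitrarily between runs
        population.sort(key=lambda o: (-apfd(o), o))
        survivors = population[: pop // 2]
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)                  # one-point order crossover
            child = a[:cut] + [t for t in b if t not in a[:cut]]
            i, j = rng.sample(range(n), 2)             # swap mutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        population = survivors + children
    population.sort(key=lambda o: (-apfd(o), o))
    return population[0]

print(prioritize())   # a permutation of test indices, highest APFD first
```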