Hierarchical temporal memory (HTM) is a biomimetic sequence memory algorithm that holds promise for invariant representations of spatial and spatio-temporal inputs. This article presents a comprehensive neuromemristive crossbar architecture for the spatial pooler (SP) and the sparse distributed representation classifier, which are fundamental to the algorithm. There are several unique features in the proposed architecture that tightly link with the HTM algorithm. A memristor that is suitable for emulating the HTM synapses is identified and a new Z-window function is proposed. The architecture exploits the concept of synthetic synapses to enable potential synapses in the HTM. The crossbar for the SP avoids dark spots caused by unutilized crossbar regions and supports rapid on-chip training within two clock cycles. This research also leverages plasticity mechanisms such as neurogenesis and homeostatic intrinsic plasticity to strengthen the robustness and performance of the SP. The proposed design is benchmarked for image recognition tasks using Modified National Institute of Standards and Technology (MNIST) and Yale faces datasets, and is evaluated using different metrics including entropy, sparseness, and noise robustness. Detailed power analysis at different stages of the SP operations is performed to demonstrate the suitability for mobile platforms.
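The spatial pooler operation summarized above can be illustrated with a minimal software sketch of the standard HTM SP idea: each column's overlap with the input is the number of connected synapses (permanence at or above a threshold) attached to active input bits, and global inhibition keeps only the top-k columns active, yielding a sparse distributed representation. The function name and parameters below are illustrative assumptions; this is a plain-Python sketch of the algorithmic concept, not the proposed neuromemristive crossbar implementation.

```python
def spatial_pooler_step(input_bits, perms, threshold=0.5, active_cols=2):
    """One inference step of an HTM-style spatial pooler (illustrative).

    perms[c] is the permanence vector of column c; a synapse is
    'connected' when its permanence >= threshold.  Each column's
    overlap is the count of connected synapses on active input bits;
    global inhibition then keeps only the top-k columns active,
    producing a sparse distributed representation.
    """
    overlaps = []
    for col in perms:
        overlap = sum(1 for i, p in enumerate(col)
                      if p >= threshold and input_bits[i] == 1)
        overlaps.append(overlap)
    # k-winners-take-all: indices of the active_cols highest overlaps
    winners = sorted(range(len(perms)), key=lambda c: overlaps[c],
                     reverse=True)[:active_cols]
    return sorted(winners), overlaps
```

Learning would then nudge the permanences of winning columns toward the active input bits (Hebbian-style reinforcement), which is the part the crossbar's two-cycle on-chip training accelerates.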
This work presents a newly developed analyzer, named NAG-5SX1-1D-SSP, that is simple, accurate, reproducible, and affordable for the determination of cefotaxime sodium (CFS) in both pure form and pharmaceutical preparations. The analyzer was designed according to flow injection analysis and coupled with turbidimetric measurements. Ammonium cerium nitrate was used as the precipitating agent. After optimizing the conditions, the analysis system exhibited a linear range of 0.008-27 mmol L-1 (n=29), with a limit of detection of 439.3 ng/sample, a limit of quantification of 0.4805 mg/sample, and a correlation coefficient of 0.9988. The repeatability of the responses was assessed by performing six successive injections of CFS at concentra
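Figures of merit such as the linear range and correlation coefficient reported above come from a least-squares calibration line fitted to standard concentrations versus the turbidimetric signal. A generic sketch of that fit (hypothetical data and function names, not the authors' exact procedure):

```python
def linear_calibration(conc, signal):
    """Least-squares calibration line: signal = slope*conc + intercept.

    Returns the slope, intercept, and Pearson correlation coefficient r,
    the same statistics a flow-injection method validation reports.
    """
    n = len(conc)
    mx = sum(conc) / n
    my = sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    syy = sum((y - my) ** 2 for y in signal)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r
```

A limit of detection is then conventionally estimated from the blank noise and this slope (e.g. 3.3 times the blank standard deviation divided by the slope), though the abstract does not state which convention was used.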
The present study aimed to use magnetic fields and nanotechnology for water purification, offering high efficiency in removing biological contaminants such as viruses and bacteria, rather than chemical and physical treatments such as chlorine, bromine, ultraviolet light, boiling, sedimentation, distillation, ozone, and others that have a direct negative impact on human safety and the environment. The presence of coliphages in the water samples under study was investigated using the single agar layer method; samples positive for phages were then exposed to three fixed magnetic field configurations (north pole, south pole, bipolar) to compare the re
Urban gentrification is an inclusive global phenomenon that restructures cities at all levels. The research proposes a specific study of the concept of urban gentrification in cities, presenting its characteristics and results, and how to deal with the changes that occur in cities through improvements carried out as part of urban renewal projects. The general scope of the research is then narrowed, choosing urban centers as the most important areas involved in the urban gentrification process due to their direct connection with individuals and social change. To address the specific scope of the research, theses and studies that discuss the topic from different research directions are reviewed, and emerged
Steganography is the art of hiding the very presence of communication by embedding a secret message into an innocuous-looking cover document, such as a digital image, video, sound file, or other computer file that contains perceptually irrelevant or redundant information serving as a cover or carrier for the secret message.
In this paper, a new Least Significant Bit (LSB) nonsequential embedding technique for wave audio files is introduced. To strengthen the immunity of the proposed hiding system, and to overcome some weaknesses inherent in pure implementations of stego-systems, auxiliary processes were suggested and investigated, including a hidden-text jumping process and a stream ciphering algorithm. Besides, the suggested
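Plain LSB substitution with a key-driven nonsequential sample order can be sketched as follows. The function names and the use of Python's seeded PRNG to generate the jumping order are illustrative assumptions, not the paper's exact scheme; in a real system the bits would first pass through the stream cipher mentioned above.

```python
import random

def embed_lsb(samples, bits, key):
    """Hide bits in the LSBs of audio samples, visiting the samples in a
    key-driven nonsequential order (the 'jumping' idea)."""
    order = random.Random(key).sample(range(len(samples)), len(bits))
    out = list(samples)
    for pos, b in zip(order, bits):
        out[pos] = (out[pos] & ~1) | b  # overwrite only the LSB
    return out

def extract_lsb(samples, nbits, key):
    """Regenerate the same key-driven order and read the LSBs back."""
    order = random.Random(key).sample(range(len(samples)), nbits)
    return [samples[pos] & 1 for pos in order]
```

Because only the least significant bit of each 16-bit sample changes, the distortion is at most one quantization step per touched sample, which is what makes the embedding perceptually transparent.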
The cuneiform images require many processing steps in order to recognize their contents, using image enhancement to clarify the objects (symbols) found in the image. The vector used for classifying a symbol, called the symbol structural vector (SSV), is built from the information wedges in the symbol. The experimental tests cover various numbers of symbols and various drawings in an online method. The results of this research show high accuracy; the methods and algorithms were implemented in Visual Basic 6.0. In this research, more than one method was applied to extract information from digital images of cuneiform tablets, in order to identify most of the signs of Sumerian cuneiform.
Estimates of average crash density as a function of traffic elements and characteristics can be used for making sound decisions related to planning, designing, operating, and maintaining roadway networks. This study describes the relationships between total, collision, turnover, and runover accident densities and factors such as hourly traffic flow and average spot speed on multilane rural highways in Iraq. The study is based on data collected from two sources: police stations and traffic surveys. Three highways were selected in Wassit governorate as a case study to cover the studied accident locations. The selection
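A crash-density relationship of the kind described above is typically fitted by multiple linear regression of density on flow and speed. A minimal sketch with hypothetical numbers (the data, model form, and coefficients are illustrative only, not the study's results):

```python
import numpy as np

# Hypothetical observations: hourly flow (veh/h), mean spot speed (km/h),
# and total crash density (crashes/km) -- illustrative numbers only.
flow = np.array([400.0, 800.0, 1200.0, 1600.0])
speed = np.array([90.0, 85.0, 80.0, 75.0])
density = np.array([1.0, 1.8, 2.6, 3.4])

# Fit density = b0 + b1*flow + b2*speed by ordinary least squares.
X = np.column_stack([np.ones_like(flow), flow, speed])
coef, *_ = np.linalg.lstsq(X, density, rcond=None)
predicted = X @ coef
```

With real crash data the fit would not be exact, and goodness-of-fit statistics (R-squared, residual plots) would decide whether a linear form is adequate for each crash type.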
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing, especially with the large daily increase in the volume of textual data produced. Traditional approaches to calculating the degree of similarity between two texts, based on the words they share, do not perform well with short texts, because two similar texts may be written in different terms through the use of synonyms. As a result, short texts should be compared semantically. In this paper, a semantic similarity measurement method for texts is presented that combines knowledge-based and corpus-based semantic information to build a semantic network that repre
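The failure mode described above, where shared-word measures miss synonym matches, can be shown with a minimal sketch: normalize words through a synonym map (a toy stand-in for a knowledge base such as WordNet), then compare the texts with cosine similarity over term counts. The synonym entries and function names below are hypothetical illustrations, not the paper's method.

```python
from collections import Counter
from math import sqrt

# Toy synonym map standing in for a knowledge base (illustrative only).
SYNONYMS = {"car": "automobile", "quick": "fast"}

def normalize(text):
    """Lowercase, split, and map each word to its canonical synonym."""
    return [SYNONYMS.get(w, w) for w in text.lower().split()]

def cosine_similarity(a, b):
    """Cosine similarity of the two texts' synonym-normalized term counts."""
    va, vb = Counter(normalize(a)), Counter(normalize(b))
    dot = sum(va[w] * vb[w] for w in va)
    na = sqrt(sum(c * c for c in va.values()))
    nb = sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0
```

Without the synonym step, "a quick car" and "a fast automobile" share only one word and score low; after normalization they become identical, which is the effect a knowledge-based component contributes.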
In this work a chemical sensor was built using the Plane Wave Expansion (PWE) modeling technique by filling the core of a 1550 hollow-core photonic crystal fiber with chloroform at different concentrations after dilution with distilled water. The minimum photonic bandgap widths are 0.0003 and 0.0005 rad/sec with 19 and 7 cells, respectively, and the chloroform concentration filling these two fibers is 75%.
Any software application can be divided into four distinct interconnected domains: the problem domain, usage domain, development domain, and system domain. A methodology for assistive technology software development is presented here that seeks to provide a framework for requirements elicitation studies together with their subsequent mapping, implementing use-case-driven object-oriented analysis for component-based software architectures. Early feedback on the effectiveness of user interface components is obtained through process usability evaluation. A model is suggested that consists of three environments: the problem, conceptual, and representational environments (or worlds). This model emphasizes the relationship between the objects
An experimental study was carried out on a KIA Pride (SAIPA 131) car model at a scale of 1:14 in a wind tunnel, alongside tests of the real car. Several passive flow control modifications (a vortex generator, a spoiler, and a slice diffuser) were added to the car to reduce the drag force, an undesirable characteristic that increases fuel consumption and toxic exhaust emissions. Two types of calculations were used to determine the drag force acting on the car body: first, by integrating the pressure values recorded along the pressure taps (for both the wind tunnel and the real-car testing), and second, by using a one-component balance device (wind tunnel testing) to measure the force. The results show that the average drag estimated on
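The first calculation route, integrating tap pressures, amounts to summing each tap's pressure times its panel area, projected onto the free-stream direction. A minimal sketch under that assumption (function names, discretization, and numbers are illustrative, not the study's data):

```python
from math import cos, radians

def drag_from_taps(pressures, areas, angles_deg):
    """Approximate pressure drag by summing p*A*cos(theta) over taps,
    where theta is the angle between each panel's normal and the
    free-stream direction (a simple panel discretization)."""
    return sum(p * a * cos(radians(t))
               for p, a, t in zip(pressures, areas, angles_deg))

def drag_coefficient(drag, rho, velocity, frontal_area):
    """Nondimensionalize drag by the dynamic pressure times frontal area."""
    return drag / (0.5 * rho * velocity ** 2 * frontal_area)
```

Expressing both the model-scale and full-scale results as drag coefficients is what makes the 1:14 wind-tunnel data comparable with the real-car measurements.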