Brachytherapy is primarily used for the treatment of certain kinds of cancerous tumours. The use of radionuclides for treating tumours has been studied for a very long time, but the introduction of mathematical (radiobiological) models has made treatment planning easier. Mathematical models help to compute the survival probabilities of irradiated tissues and cancer cells. With the expanding use of high-dose-rate (HDR) and low-dose-rate (LDR) brachytherapy for cancer treatment, fractionated dose treatment plans are required to irradiate the tumour. In this paper, the authors discuss the dose calculation algorithms used in brachytherapy treatment planning. Precise and fast calculation of the 3D dose distribution for the patient is one of the important requirements of modern radiation oncology, which in turn demands accurate algorithms in the treatment planning system (TPS). The algorithms used for calculating the dose have certain limitations. This work evaluates the accuracy of the algorithms presently employed for treatment planning, including pencil beam convolution (PBC), superposition (SP), the anisotropic analytical algorithm (AAA), Monte Carlo (MC), the Clarkson method, the Fast Fourier Transform, and the convolution method. The algorithms used in radiotherapy treatment planning are categorized as correction-based and model-based.
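The survival probabilities mentioned above are conventionally computed with the linear-quadratic (LQ) radiobiological model; the sketch below illustrates the idea for a fractionated plan. The α and β values are purely illustrative assumptions, not parameters taken from the paper.

```python
import math

def lq_survival(dose_gy, alpha=0.35, beta=0.035):
    """Linear-quadratic cell-survival fraction S = exp(-(alpha*D + beta*D^2)).

    alpha (Gy^-1) and beta (Gy^-2) are illustrative values only;
    real parameters are tissue- and tumour-specific.
    """
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

def fractionated_survival(dose_per_fraction, n_fractions,
                          alpha=0.35, beta=0.035):
    """Survival after n equal fractions: S_total = S(d) ** n."""
    return lq_survival(dose_per_fraction, alpha, beta) ** n_fractions
```

With these illustrative parameters, a single 2 Gy fraction gives a survival fraction of about 0.43, and fractionation simply raises the per-fraction survival to the power of the number of fractions.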
Realistic implementation of nanofluids in subsurface projects, including carbon geosequestration and enhanced oil recovery, requires a full understanding of nanoparticle (NP) adsorption behaviour in porous media. The physicochemical interactions between NPs, and between the NPs and the grain surface of the porous medium, control the adsorption behaviour of NPs. This study investigates the reversible and irreversible adsorption of silica NPs onto oil-wet and water-wet carbonate surfaces at reservoir conditions. Each carbonate sample was treated with different concentrations of silica nanofluid to investigate NP adsorption in terms of initial NP size and hydrophobicity at different temperatures and pressures. Aggregation behaviour and the
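As an illustration of how reversible adsorption of this kind is often described (a generic model, not necessarily the one used in this study), the Langmuir isotherm relates the adsorbed amount to the equilibrium NP concentration; all parameter names and values below are hypothetical.

```python
def langmuir_adsorption(c_eq, q_max, k_l):
    """Langmuir isotherm: q = q_max * K*C / (1 + K*C).

    c_eq  : equilibrium NP concentration (illustrative units)
    q_max : monolayer adsorption capacity of the surface
    k_l   : Langmuir affinity constant
    """
    return q_max * k_l * c_eq / (1.0 + k_l * c_eq)
```

At low concentration the adsorbed amount grows almost linearly; at high concentration it saturates toward q_max, which is the signature of monolayer coverage.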
The modern systems that are based upon hash functions are more suitable than the conventional systems; however, the complicated algorithms for generating invertible functions are highly time-consuming. With the use of genetic algorithms (GAs), the key strength is enhanced, which ultimately makes the entire algorithm sufficiently strong. Initially, key generation is performed using the results of the n-queens problem, which is solved by the genetic algorithm with a random number generator and the application of GA operations. Finally, the data is encrypted using the Modified Reverse Encryption Algorithm (MREA). It was noticed that the
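A minimal sketch of the key-generation idea described above: a genetic algorithm searches for an n-queens solution, and the solved board is packed into key material. The GA settings (population size, elitism, mutation rate) are illustrative assumptions, not the paper's.

```python
import random

def conflicts(board):
    """Number of attacking queen pairs (one queen per column, value = row)."""
    n = len(board)
    return sum(
        1
        for i in range(n) for j in range(i + 1, n)
        if board[i] == board[j] or abs(board[i] - board[j]) == j - i
    )

def solve_n_queens_ga(n=8, pop_size=100, generations=2000, seed=42):
    """Simple GA: elitism + one-point crossover + point mutation."""
    rng = random.Random(seed)
    pop = [[rng.randrange(n) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=conflicts)
        if conflicts(pop[0]) == 0:
            return pop[0]
        next_pop = pop[:10]                  # keep the 10 best boards
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:50], 2)   # parents from the better half
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]        # one-point crossover
            if rng.random() < 0.3:           # point mutation
                child[rng.randrange(n)] = rng.randrange(n)
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=conflicts)

def board_to_key(board):
    """Pack the queen positions into bytes as illustrative key material."""
    return bytes(board)
```

In practice the raw board would be only a seed for further key expansion; here it simply demonstrates how a GA-solved n-queens board yields deterministic byte material.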
This work presents a comparison between the Convolutional Encoding (CE), parallel Turbo code, and Low-Density Parity-Check (LDPC) coding schemes for a Multi-User Single-Output (MUSO) Multi-Carrier Code Division Multiple Access (MC-CDMA) system over multipath fading channels. The decoding technique used in the simulation was iterative decoding, since it gives maximum efficiency at higher iterations. The modulation scheme used is Quadrature Amplitude Modulation (QAM). Eight pilot carriers were used to compensate for the channel effect with the Least Squares estimation method. The channel model used is the Long Term Evolution (LTE) channel per Technical Specification TS 25.101 v2.10 with 5 MHz bandwidth, including the indoor-to-outdoor/pedestrian channels
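The pilot-based Least Squares step can be sketched as follows: at each pilot subcarrier the channel is estimated as H_k = Y_k / X_k, and the estimate is then used to equalise the received symbols. The channel values in the test are illustrative, not taken from the LTE model above.

```python
def ls_channel_estimate(rx_pilots, tx_pilots):
    """Least-squares estimate at each pilot subcarrier: H_k = Y_k / X_k."""
    return [y / x for y, x in zip(rx_pilots, tx_pilots)]

def equalize(rx_symbols, h_est):
    """Zero-forcing equalisation using the estimated channel."""
    return [y / h for y, h in zip(rx_symbols, h_est)]
```

In a full receiver the eight pilot estimates would be interpolated across the data subcarriers; the sketch shows only the per-pilot LS division itself.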
High Power Amplifiers (HPAs) used in wireless communication are distinctly characterized by nonlinear behaviour. Linearity can be achieved by backing the HPA off so that it operates in its linear region, at the cost of power performance. Meanwhile, the Orthogonal Frequency Division Multiplexing (OFDM) signal exhibits large envelope fluctuations, so a large back-off into the linear operating region is required, leading to a significant loss in power efficiency. Back-off is therefore not a practical solution. Digital predistorters based on the Simplicial Canonical Piecewise-Linear (SCPWL) model are widely employed to compensate for the nonlinear distortion introduced by the HPA in OFDM systems. In this paper, the genetic al
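The SCPWL model mentioned above represents a nonlinearity as a linear term plus weighted absolute-value kinks at a set of breakpoints. A minimal evaluation routine might look like this; in practice the coefficients and breakpoints are fitted to invert the measured HPA characteristic, and all values in the test are illustrative.

```python
def scpwl(x, a0, a1, coeffs, breakpoints):
    """Simplicial Canonical Piecewise-Linear model:

        y = a0 + a1*x + sum_k c_k * |x - beta_k|

    coeffs (c_k) and breakpoints (beta_k) must have equal length.
    """
    return a0 + a1 * x + sum(c * abs(x - b) for c, b in zip(coeffs, breakpoints))
```

With no breakpoints the model reduces to an affine map; each added breakpoint introduces one slope change, which is what lets the predistorter approximate the inverse of a smooth HPA curve segment by segment.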
Currently, the prominence of the automatic multi-document summarization task stems from the rapid increase of information on the Internet. Automatic document summarization technology is progressing and may offer a solution to the problem of information overload.
An automatic text summarization system faces the challenge of producing a high-quality summary. In this study, the design of a generic text summarization model based on sentence extraction has been redirected into a more semantic measure reflecting two significant objectives individually: content coverage and diversity when generating summaries from multiple documents, cast as an explicit optimization model. The proposed two models have then been coupled and def
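One common way (not necessarily the paper's exact formulation) to trade coverage against diversity is a greedy MMR-style selection: each step picks the sentence that best covers the documents while penalising similarity to sentences already selected. Jaccard word overlap stands in here for whatever semantic similarity measure the model actually uses.

```python
def jaccard(a, b):
    """Word-overlap similarity between two sentences."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def mmr_summary(sentences, k=2, lam=0.7):
    """Greedy selection balancing coverage (vs. the whole document set)
    and diversity (vs. already-selected sentences)."""
    doc = " ".join(sentences)
    selected, candidates = [], list(sentences)
    while candidates and len(selected) < k:
        def score(s):
            coverage = jaccard(s, doc)
            redundancy = max((jaccard(s, t) for t in selected), default=0.0)
            return lam * coverage - (1 - lam) * redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected
```

The redundancy term is what keeps a near-duplicate sentence out of the summary even when its coverage score is high.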
Image compression has become one of the most important applications of the image processing field because of the rapid growth in computer power, the corresponding growth in the multimedia market, and the advent of the World Wide Web, which makes the Internet easily accessible for everyone. Since the early 1980s, digital image sequence processing has been an attractive research area, because an image sequence, as a collection of images, may allow much greater compression than a single image frame. The increased computational complexity and memory space required for image sequence processing have, in fact, become more attainable. This research uses the Absolute Moment Block Truncation compression technique, which depends on adopting the good points of oth
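The Absolute Moment Block Truncation technique named above quantises each pixel block to two reconstruction levels plus a one-bit plane: pixels at or above the block mean map to the high level, the rest to the low level. A minimal sketch for a single block:

```python
def ambtc_encode(block):
    """AMBTC for one block (flat list of pixel values):
    returns a bitmap plus the low/high reconstruction levels."""
    m = sum(block) / len(block)
    hi = [p for p in block if p >= m]
    lo = [p for p in block if p < m]
    high = sum(hi) / len(hi) if hi else m
    low = sum(lo) / len(lo) if lo else m
    bitmap = [1 if p >= m else 0 for p in block]
    return bitmap, low, high

def ambtc_decode(bitmap, low, high):
    """Rebuild the block from the bitmap and the two levels."""
    return [high if b else low for b in bitmap]
```

Storage drops from one full value per pixel to one bit per pixel plus two levels per block, which is the source of the compression.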
Recognition is one of the basic capabilities of the human brain, and of living creatures in general. It is possible to recognize images, persons, or patterns according to their characteristics. This recognition can be done by eye or by dedicated methods. There are numerous applications for pattern recognition, such as recognizing printed or handwritten letters, for example reading postal addresses automatically, reading documents, or check reading in banks.
One of the challenges facing researchers in the character recognition field is the recognition of handwritten digits. This paper describes a classification method for on-line handwrit
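As a generic illustration of digit classification (the paper's own on-line method is truncated above, so this is a stand-in, not its algorithm), a nearest-neighbour classifier over extracted feature vectors:

```python
def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def knn_classify(sample, training, k=3):
    """training: list of (feature_vector, digit_label) pairs.
    Returns the majority label among the k nearest neighbours."""
    nearest = sorted(training, key=lambda t: euclidean(sample, t[0]))[:k]
    labels = [lbl for _, lbl in nearest]
    return max(set(labels), key=labels.count)
```

For on-line handwriting the feature vectors would be derived from the pen trajectory (stroke directions, curvature, etc.); the classifier itself is unchanged.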
The strong cryptography employed by PGP (Pretty Good Privacy) is one of the best available today. The PGP protocol is a hybrid cryptosystem that combines some of the best features of both conventional and public-key cryptography. This paper aims to improve the PGP protocol by combining a random genetic algorithm and the NTRU (N-th degree Truncated polynomial Ring Unit) algorithm with the PGP protocol stages, in order to increase the protocol's speed and security and make it harder for a counterfeiter to break. This is achieved by using the genetic algorithm to generate keys according to random genetic equations. The final keys obtained from the genetic algorithm were observed to be purely random (according to the randomne
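The hybrid structure described above can be sketched as follows. A toy XOR stream stands in for both the symmetric session cipher and the NTRU key wrapping, purely to show the data flow of a hybrid cryptosystem; it offers no real security.

```python
import os

def xor_bytes(data, key):
    """Toy XOR stream cipher; a stand-in for the real session cipher."""
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

def hybrid_encrypt(message, wrap_key):
    """Hybrid scheme: a fresh random session key encrypts the data,
    and the session key itself is wrapped (here with XOR in place of
    an asymmetric algorithm such as NTRU)."""
    session_key = os.urandom(16)
    ciphertext = xor_bytes(message, session_key)
    wrapped = xor_bytes(session_key, wrap_key)
    return ciphertext, wrapped

def hybrid_decrypt(ciphertext, wrapped, wrap_key):
    """Unwrap the session key, then decrypt the data with it."""
    session_key = xor_bytes(wrapped, wrap_key)
    return xor_bytes(ciphertext, session_key)
```

The point of the hybrid design is exactly this split: bulk data goes through the fast symmetric cipher, while the slow public-key step touches only the short session key.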
Nowadays it is convenient to use a search engine to obtain the information we need, but that information is sometimes misunderstood because of differing media reports. The Recommender System (RS) is popular in business, since it can provide information to users that attracts more revenue for companies; however, the system sometimes recommends information the user does not need. For this reason, this paper proposes the architecture of a recommender system based on user-oriented preference, called UOP-RS. To make the UOP-RS meaningful, this paper focuses on movie theatre information and collects the movie database from the IMDb website, which provides informatio
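A minimal sketch of user-oriented movie recommendation (a generic user-based collaborative filter, not the published UOP-RS design): unseen movies are scored by the similarity-weighted ratings of other users, with ratings given as movie-title-to-score dictionaries.

```python
def cosine(a, b):
    """Cosine similarity over the movies both users rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[m] * b[m] for m in common)
    da = sum(a[m] ** 2 for m in common) ** 0.5
    db = sum(b[m] ** 2 for m in common) ** 0.5
    return num / (da * db)

def recommend(user, others, top_n=1):
    """Score movies the user has not seen by similarity-weighted
    ratings from the other users, highest score first."""
    scores = {}
    for other in others:
        sim = cosine(user, other)
        for movie, rating in other.items():
            if movie not in user:
                scores[movie] = scores.get(movie, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

A preference-oriented system would additionally weight these scores by explicit user preferences (genre, theatre, showtimes); the similarity core stays the same.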
Association rule mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules that are not appropriate for a given user. Recent research in ARM investigates the use of metaheuristic algorithms, which search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining, DCS-ARM, is proposed for this purpose. The effectiveness of our algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms existing metaheuristic methods.
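Metaheuristic ARM methods such as the cuckoo search above typically rank candidate rules with a fitness built from support and confidence; a minimal sketch, with illustrative weights rather than the paper's:

```python
def support(itemset, transactions):
    """Fraction of transactions containing the itemset."""
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """P(consequent | antecedent) estimated from the transactions."""
    s_a = support(antecedent, transactions)
    return support(antecedent | consequent, transactions) / s_a if s_a else 0.0

def rule_fitness(antecedent, consequent, transactions, w_s=0.5, w_c=0.5):
    """Weighted support/confidence score used to rank candidate rules."""
    return (w_s * support(antecedent | consequent, transactions)
            + w_c * confidence(antecedent, consequent, transactions))
```

The search itself then only has to propose candidate antecedent/consequent pairs; this fitness is what steers it toward the small subset of high-quality rules.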