Radiation Safety of Population in the Region of NPP-2006/MIR-1200 Site

The main features of the NPP-2006/MIR-1200 design are described. Estimation of individual doses for the population under normal operation and accident conditions is performed, with Leningradskaya NPP-2 as an example. The radiation effect on the population and the environment does not exceed the established normative limits and is as low as reasonably achievable. The NPP-2006/MIR-1200 design meets all Russian and international requirements for power units under construction.

Topology Preservation in SOM

The SOM has several beneficial features that make it a useful method for data mining. One of the most important is its ability to preserve the topology of the data in the projection. Several measures can be used to quantify the goodness of the map and obtain the optimal projection, including the average quantization error and various topological errors. Many studies have investigated how topology preservation should be measured. One option is the topographic error, which considers the fraction of data vectors for which the first and second best-matching units (BMUs) are not adjacent on the map. In this work we present a study of the behaviour of the topographic error on different kinds of maps. We have found that this error penalizes rectangular maps, and we have studied the reasons why this happens. Finally, we suggest a new topological error that remedies this deficiency of the topographic error.
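
As a concrete reference, here is a minimal sketch of the topographic error described above, assuming a trained SOM given as a codebook matrix and a grid-coordinate array (both names hypothetical):

```python
import numpy as np

def topographic_error(data, codebook, grid_coords):
    """Fraction of data vectors whose two best-matching units (BMUs)
    are not adjacent on the map grid.

    data        : (n_samples, dim) array of input vectors
    codebook    : (n_units, dim) array of SOM weight vectors
    grid_coords : (n_units, 2) array of row/column unit positions
    """
    errors = 0
    for x in data:
        d = np.linalg.norm(codebook - x, axis=1)   # distance to every unit
        bmu1, bmu2 = np.argsort(d)[:2]             # first- and second-best units
        # adjacency test on a rectangular lattice: at most one grid step apart
        if np.abs(grid_coords[bmu1] - grid_coords[bmu2]).max() > 1:
            errors += 1
    return errors / len(data)
```

The adjacency test is exactly where rectangular and hexagonal lattices differ, which is the source of the bias the abstract discusses.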

Extraction of Semantic Digital Signatures from MRI Photos for Image-Identification Purposes

This paper attempts to solve the problem of searching for and retrieving similar MRI images via Internet services, using morphological features sourced from the original image. The study aims to serve as an additional tool for search and retrieval methods; until now, the main search mechanism has been syntactic, based on keywords. The proposed technique aims to serve the new requirements of libraries, one of which is the development of computational tools for the control and preservation of the intellectual property of digital objects, especially digital images. For this purpose, the paper proposes the use of a serial number extracted by a previously tested semantic-properties method. This method, centred on multiple layers of a set of arithmetic points, assures two properties: the uniqueness of the final extracted number and the semantic dependence of this number on the image used as the method's input. The major advantage of the method is that it can control the authentication of a published image, or its partial modification, to a reliable degree. It also improves on the known hash functions used in digital signature schemes, producing alphanumeric strings for authentication checking and for measuring the degree of similarity between an unknown image and an original image.

Speckle Reducing Contourlet Transform for Medical Ultrasound Images

Speckle noise affects all coherent imaging systems, including medical ultrasound. In medical images, noise suppression is a particularly delicate and difficult task: a tradeoff between noise reduction and the preservation of actual image features has to be made in a way that enhances the diagnostically relevant image content. Even though wavelets have been used extensively for denoising speckle images, we have found that denoising using contourlets gives much better performance in terms of SNR, PSNR, MSE, variance and correlation coefficient. The objective of the paper is to determine the number of levels of Laplacian pyramidal decomposition, the number of directional decompositions to perform at each pyramidal level, and the thresholding schemes that yield optimal despeckling of medical ultrasound images in particular. In the proposed method, the log-transformed original ultrasound image is subjected to the contourlet transform to obtain contourlet coefficients. The transformed image is denoised by applying thresholding techniques to the individual bandpass subbands using a Bayes shrinkage rule. We quantify the achieved performance improvement.
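
Since contourlet transform code is not part of standard numeric libraries, the following sketch shows only the Bayes shrinkage step applied to one generic bandpass subband array; the robust median noise estimate and the clipping floor are standard choices, assumed here rather than taken from the paper:

```python
import numpy as np

def bayes_shrink_threshold(subband):
    """BayesShrink threshold for one bandpass subband: noise variance
    over estimated signal standard deviation."""
    sigma_noise = np.median(np.abs(subband)) / 0.6745      # robust noise estimate
    sigma_signal = np.sqrt(max(np.var(subband) - sigma_noise**2, 1e-12))
    return sigma_noise**2 / sigma_signal

def soft_threshold(coeffs, t):
    # shrink coefficients toward zero by t, preserving sign
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

# usage on one hypothetical contourlet subband of the log-transformed image:
# denoised = soft_threshold(subband, bayes_shrink_threshold(subband))
```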

Mining Frequent Patterns with Functional Programming

Frequent patterns are patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a dataset. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages such as C, C++, and Java. The imperative paradigm is significantly inefficient when the itemset is large and the frequent patterns are long. We suggest a high-level declarative style of programming using a functional language. Our supposition is that the problem of frequent pattern discovery can be efficiently and concisely implemented via a functional paradigm, since pattern matching is a fundamental feature supported by most functional languages. Our frequent pattern mining implementation in the Haskell language confirms our hypothesis about the conciseness of the program. Performance studies of speed and memory usage support our intuition about the efficiency of the functional approach.
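
The paper's implementation is in Haskell; as an illustration of the declarative style it advocates, here is a minimal functional-style Apriori sketch in Python (not the authors' code), with transactions represented as sets:

```python
from itertools import combinations
from collections import Counter

def apriori(transactions, min_support):
    """Minimal Apriori: returns frequent itemsets (frozensets)
    mapped to their support counts."""
    def frequent(candidates):
        counts = Counter(c for t in transactions
                           for c in candidates if c <= t)
        return {c: n for c, n in counts.items() if n >= min_support}

    items = {frozenset([i]) for t in transactions for i in t}
    level, result = frequent(items), {}
    while level:
        result.update(level)
        # join step: combine frequent k-itemsets into (k+1)-candidates
        prev = list(level)
        k = len(prev[0]) + 1
        candidates = {a | b for a, b in combinations(prev, 2)
                      if len(a | b) == k}
        level = frequent(candidates)
    return result

# e.g. apriori([{'a','b'}, {'a','c'}, {'a','b','c'}], min_support=2)
```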

Drum-Buffer-Rope: The Technique to Plan and Control the Production Using Theory of Constraints

The Theory of Constraints (TOC) has been emerging as an important tool for the optimization of manufacturing and service systems. Goldratt, in his first book, "The Goal", introduced the Theory of Constraints and its applications in a factory scenario. A large number of production managers around the globe read that book, but only a few could implement it in their plants, because the book did not explain the steps needed to implement TOC in a factory. To overcome these limitations, Goldratt wrote a later book explaining TOC, Drum-Buffer-Rope (DBR) and the method to implement them. In this paper, an attempt has been made to summarize the salient features of TOC and DBR presented in that book and the correct approach to implementing TOC in a factory setting. The authors used the simulator supplied with the book to test Goldratt's claim that DBR and buffer management ease the work of production managers, and found the claim to be correct.

Testing Object-Oriented Framework Applications Using FIST2 Tool: A Case Study

An application framework provides a reusable design and implementation for a family of software systems. Frameworks are introduced to reduce the cost of a product line (i.e., a family of products that share common features). Software testing is a time-consuming and costly ongoing activity during the application software development process. Generating reusable test cases for framework applications during the framework development stage, then providing and using those test cases to test parts of a framework application whenever the framework is used, reduces application development time and cost considerably. This paper introduces the Framework Interface State Transition Tester (FIST2), a tool for automated unit testing of Java framework applications. During the framework development stage, given the formal descriptions of the framework hooks, the specifications of the methods of the framework's extensible classes, and the illegal-behavior descriptions of the Framework Interface Classes (FICs), FIST2 generates unit-level test cases for the classes. At the framework application development stage, given the customized method specifications of the implemented FICs, FIST2 automates the use, execution, and evaluation of the already generated test cases to test the implemented FICs. The paper illustrates the use of the FIST2 tool by testing several applications that use the SalesPoint framework.

Image Retrieval: Techniques, Challenges, and Trends

This paper discusses the evolution of retrieval techniques, focusing on the development, challenges and trends of image retrieval. It highlights both the issues already addressed and those still outstanding. The explosive growth of image data drives the need for research and development in image retrieval. Image retrieval research is moving from keywords, to low-level features, and on to semantic features. The drive towards semantic features is due to the problem that keywords can be very subjective and time consuming, while low-level features cannot always describe the high-level concepts in users' minds.

An Architecture for High Performance File System I/O

This paper presents the architecture of current filesystem implementations, as well as our new filesystem SpadFS and our operating system Spad with a rewritten VFS layer, targeted at high-performance I/O applications. The paper presents microbenchmarks and real-world benchmarks of different filesystems on the same kernel, as well as benchmarks of the same filesystem on different kernels, enabling the reader to draw conclusions about how much the performance of various tasks is affected by the operating system and how much by the physical layout of data on disk. The paper describes our novel features, most notably continuous allocation of directories and cross-file readahead, and shows their impact on performance.

An Improved Fast Video Clip Search Algorithm for Copy Detection Using Histogram-Based Features

In this paper, we present an improved fast and robust search algorithm for copy detection, using histogram-based features, for short MPEG video clips from a large video database. Two types of histogram features are used to generate more robust features. The first is based on the adjacent pixel intensity difference quantization (APIDQ) algorithm, which had previously been applied reliably to human face recognition; an APIDQ histogram is utilized as the feature vector of the frame image. The other is an ordinal histogram feature, which is robust to color distortion. Furthermore, by combining these with a temporal division method, the spatial and temporal features of the video sequence are integrated to realize fast and robust video search for copy detection. Experimental results show that the proposed algorithm can detect similar video clips more accurately and robustly than conventional fast video search algorithms.
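
As a rough illustration of the APIDQ idea, the sketch below histograms quantized adjacent-pixel intensity differences for one frame. The published APIDQ scheme quantizes the horizontal/vertical difference pair in polar (r, θ) form; this simplified version uses uniform scalar bins instead:

```python
import numpy as np

def apidq_histogram(frame, n_bins=32):
    """Simplified APIDQ-style feature: quantize adjacent-pixel
    intensity differences and histogram them.

    frame : 2-D grayscale image (assumed 8-bit intensities) as a float array
    """
    dx = np.diff(frame, axis=1).ravel()   # horizontal neighbour differences
    dy = np.diff(frame, axis=0).ravel()   # vertical neighbour differences
    d = np.concatenate([dx, dy])
    # quantize differences in [-255, 255] into n_bins uniform levels
    hist, _ = np.histogram(d, bins=n_bins, range=(-255, 255))
    return hist / hist.sum()              # normalized per-frame feature vector
```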

Fast Search Method for Large Video Database Using Histogram Features and Temporal Division

In this paper, we propose an improved fast search algorithm that uses combined histogram features and a temporal division method for short MPEG video clips from a large video database. Two types of histogram features are used to generate more robust features. The first is based on the adjacent pixel intensity difference quantization (APIDQ) algorithm, which had previously been applied reliably to human face recognition; an APIDQ histogram is utilized as the feature vector of the frame image. The other is an ordinal feature, which is robust to color distortion. Combined with active search [4], a temporal pruning algorithm, fast and robust video search can be realized. The proposed search algorithm has been evaluated on 6 hours of video, searching for 200 given MPEG video clips, each 30 seconds long. Experimental results show that the proposed algorithm can detect a similar video clip in merely 120 ms with an Equal Error Rate (EER) of 1%, which is more accurate and robust than conventional fast video search algorithms.
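
To make the temporal division step concrete, here is a minimal exhaustive sliding-window sketch over per-frame feature vectors; the active-search pruning of [4], which skips window positions using a distance bound, is deliberately omitted:

```python
import numpy as np

def temporal_division_search(db_feats, query_feats, n_segments=4):
    """Slide the query clip over the database video and compare
    segment-averaged histogram features.

    db_feats, query_feats : (n_frames, dim) per-frame feature vectors
    Returns (best_start_frame, best_distance).
    """
    def segment_means(feats):
        # split the clip into n_segments temporal blocks and average each
        return np.stack([s.mean(axis=0)
                         for s in np.array_split(feats, n_segments)])

    q = segment_means(query_feats)
    w = len(query_feats)
    best = (None, np.inf)
    for start in range(len(db_feats) - w + 1):
        d = np.linalg.norm(segment_means(db_feats[start:start + w]) - q)
        if d < best[1]:
            best = (start, d)
    return best
```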

Flow Characteristics of Pulp Liquid in Straight Ducts

An experimental investigation was performed on pulp liquid flow in straight ducts with a square cross section. Fully developed steady flow was visualized, and the fiber concentration was obtained using a light-section method developed by the authors. The obtained results reveal, quantitatively and in a definite form, the distribution of the fiber concentration. From these results and measurements of pressure loss, it is found that the flow characteristics of pulp liquid in ducts can be classified into five patterns. The relationships among the distributions of the mean and fluctuation of the fiber concentration, the pressure loss and the flow velocity are discussed, and the features of each pattern are extracted. The degree of nonuniformity of the fiber concentration, indicated by the standard deviation of its distribution, decreases from 0.3 to 0.05 with increasing flow velocity for the tested pulp liquids of 0.4 to 0.8% concentration.

Collaborative Web Platform for Rich Media Educational Material Creation

This paper describes a platform that addresses the main research areas for e-learning educational content. Reusability tackles the possibility of using content in different courses, reducing costs and exploiting available data from repositories; in our approach, the production of educational material is based on templates that reuse learning objects. In terms of interoperability, the main challenge lies in reaching the audience through different platforms: an e-learning solution must track the evolution of social consumption, where much multimedia content is now accessed through social networks. Our work addresses this by implementing a platform for generating multimedia presentations focused on the social media paradigm. The system produces video courses on top of the web standard SMIL (Synchronized Multimedia Integration Language), ready to be published and shared. Regarding interfaces, it is mandatory to satisfy user needs and ease communication; to this end, the platform deploys virtual teachers that provide natural interfaces, while multimodal features remove barriers for pupils with disabilities.
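
As an illustration of the SMIL output format, here is a minimal sketch that writes a two-region SMIL 3.0 document synchronizing a lecture video with a slide sequence; the file names and template layout are hypothetical, not the platform's actual templates:

```python
# Minimal SMIL 3.0 skeleton: a video region played in parallel
# with a timed sequence of slide images.
SMIL_TEMPLATE = """<smil xmlns="http://www.w3.org/ns/SMIL" version="3.0">
  <head>
    <layout>
      <region id="video" left="0" top="0" width="640" height="480"/>
      <region id="slides" left="640" top="0" width="640" height="480"/>
    </layout>
  </head>
  <body>
    <par>
      <video src="{video_src}" region="video"/>
      <seq>
        <img src="{slide1}" region="slides" dur="30s"/>
        <img src="{slide2}" region="slides" dur="30s"/>
      </seq>
    </par>
  </body>
</smil>"""

with open("course.smil", "w") as f:
    f.write(SMIL_TEMPLATE.format(video_src="lecture.mp4",
                                 slide1="slide1.png", slide2="slide2.png"))
```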

Identification of MIMO Systems Using Neuro-Fuzzy Models with a Shuffled Frog Leaping Algorithm

In this paper, a TSK-type neuro-fuzzy inference system, which combines the features of fuzzy sets and neural networks, is applied to the identification of MIMO systems. The procedure for adapting the parameters of the TSK model employs a Shuffled Frog Leaping Algorithm (SFLA), which is inspired by the memetic evolution of a group of frogs searching for food. To demonstrate the accuracy and effectiveness of the proposed approach, two nonlinear systems have been considered as the MIMO plant, and the results have been compared with those of other learning methods based on the Particle Swarm Optimization algorithm (PSO) and the Genetic Algorithm (GA).
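
For reference, the core SFLA position-update rule (standard in the SFLA literature; in the paper the candidate vectors would encode the TSK model parameters) can be sketched as follows, with illustrative function and variable names:

```python
import numpy as np

def sfla_memeplex_step(frogs, fitness, rng, lower, upper, global_best):
    """One SFLA update inside a memeplex: move the worst frog toward
    the memeplex best, falling back to the global best, then to a
    random restart.

    frogs   : (n, dim) array of candidate parameter vectors
    fitness : callable on one vector; lower is better
    rng     : np.random.Generator
    """
    order = np.argsort([fitness(f) for f in frogs])
    best, worst = frogs[order[0]], frogs[order[-1]]

    for leader in (best, global_best):
        step = rng.random(worst.shape) * (leader - worst)  # random leap
        trial = np.clip(worst + step, lower, upper)
        if fitness(trial) < fitness(worst):
            frogs[order[-1]] = trial
            return frogs
    # no improvement from either leader: replace with a random frog
    frogs[order[-1]] = rng.uniform(lower, upper, size=worst.shape)
    return frogs
```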

Numerical Simulation of Cavitation and Aeration in Discharge Gated Tunnel of a Dam Based on the VOF Method

Cavitation, usually known as a destructive phenomenon, involves turbulent, unsteady two-phase flow. Having such features, cavitating flows have become a challenging topic in numerical studies, and much research is being done to better understand bubbly flows and to propose solutions that reduce their destructive effects. Aeration may be regarded as an effective protection against cavitation erosion in many hydraulic structures, such as gated tunnels. This paper concerns the numerical simulation of flow in the discharge gated tunnel of a dam using the RNG k-ε model coupled with the volume of fluid (VOF) method, and the zone of the tunnel susceptible to cavitation inception is predicted. In a second step, a vent is placed in this zone for aeration, and the numerical simulation is repeated to study the effects of aeration. The results show that aeration is an effective method for preventing cavitation in such tunnels.

Deployment of Service Quality Characteristics

This work discusses an innovative methodology for the deployment of service quality characteristics. Four groups of organizational features that may influence the quality of services are identified: human resources, technology, planning, and organizational relationships. A House of Service Quality (HOSQ) matrix is built to extract the desired improvements in the service quality characteristics and to translate them into a hierarchy of important organizational features. The Mean Square Error (MSE) criterion enables pinpointing the few essential service quality characteristics to be improved, as well as selecting the vital organizational features. The method was implemented in an engineering supply enterprise and provides useful information on its vital service dimensions.

A Study of Computational Organizational Narrative Generation for Decision Support

Narratives are invaluable assets of human lives. Due to their distinct features, narratives are useful for supporting human reasoning processes. However, many useful narratives nowadays lie unused in organizations or in human minds. Researchers have devoted effort to investigating and improving narrative generation processes. This paper contemplates the essential components of narratives and explores a computational approach to acquiring and extracting knowledge to generate narratives. The methodology and its significant benefits for decision support are presented.

Standard Deviation of Mean and Variance of Rows and Columns of Images for CBIR

This paper describes a novel and effective approach to content-based image retrieval (CBIR) that represents each image in the database by a vector of feature values called the "standard deviation of mean vectors of color distribution of rows and columns of images for CBIR". In many areas of commerce, government, academia, and healthcare, large collections of digital images are being created. This paper describes an approach that uses image content as the feature vector for retrieval of similar images. Several classes of features are used to specify queries: color, texture, shape, and spatial layout. Color features are often easily obtained directly from the pixel intensities. In this paper, feature extraction is performed for the texture descriptors 'variance' and 'variance of variances'. First, the standard deviation of the row means and of the column means is calculated for the R, G, and B planes; these six values form one feature vector per image. Second, we calculate the variance of each row and column of the R, G and B planes; the six standard deviations of these variance sequences form a second feature vector of dimension six. We applied our approach to a database of 300 BMP images and determined the capability of automatic indexing by analyzing image content, with color and texture as features and Euclidean distance as the similarity measure.
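
The two feature vectors described above reduce to a few lines of array code; a minimal sketch (function name hypothetical):

```python
import numpy as np

def row_col_features(img):
    """Two 6-D feature vectors per image, as described above.

    img : (H, W, 3) RGB array.
    Feature A: std of the row means and column means, per channel.
    Feature B: std of the row variances and column variances, per channel.
    """
    feat_mean, feat_var = [], []
    for c in range(3):                       # R, G, B planes
        plane = img[:, :, c].astype(float)
        row_m, col_m = plane.mean(axis=1), plane.mean(axis=0)
        row_v, col_v = plane.var(axis=1), plane.var(axis=0)
        feat_mean += [row_m.std(), col_m.std()]
        feat_var  += [row_v.std(), col_v.std()]
    return np.array(feat_mean), np.array(feat_var)

# retrieval: rank database images by Euclidean distance between
# their feature vectors and the query image's feature vectors.
```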

On Developing an Automatic Speech Recognition System for Standard Arabic Language

Automatic Speech Recognition (ASR) applied to the Arabic language is a challenging task. This is mainly related to the specificities of the language, which confront researchers with multiple difficulties, such as insufficient linguistic resources and the very limited number of available transcribed Arabic speech corpora. In this paper, we are interested in the development of an HMM-based ASR system for the Standard Arabic (SA) language. Our fundamental research goal is to select the most appropriate acoustic parameters describing each audio frame, the acoustic models and the speech recognition unit. To achieve this purpose, we analyze the effect of varying the frame windowing (size and period), the number of acoustic parameters resulting from the feature extraction methods traditionally used in ASR, the speech recognition unit, the number of Gaussians per HMM state, and the number of embedded re-estimations of the Baum-Welch algorithm. To evaluate the proposed ASR system, a multi-speaker SA connected-digits corpus was collected, transcribed and used throughout all experiments. A further evaluation was conducted on a speaker-independent continuous SA speech corpus. The phoneme recognition rate is 94.02%, which is relatively high compared with another ASR system evaluated on the same corpus.
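
To illustrate the frame windowing and parameter-count knobs the paper varies, here is an equivalent feature extraction sketch using librosa; the paper's own toolchain is not specified, and the file name and parameter values are placeholders:

```python
import librosa

# Placeholder parameter choices: the paper varies exactly these.
FRAME_SIZE_MS, FRAME_PERIOD_MS, N_MFCC = 25, 10, 13

y, sr = librosa.load("digit_utterance.wav", sr=16000)   # file name assumed
mfcc = librosa.feature.mfcc(
    y=y, sr=sr, n_mfcc=N_MFCC,
    n_fft=int(sr * FRAME_SIZE_MS / 1000),        # window size in samples
    hop_length=int(sr * FRAME_PERIOD_MS / 1000)  # frame period in samples
)
# mfcc has shape (N_MFCC, n_frames): one acoustic vector per frame,
# the input to the per-unit (phoneme or word) HMMs.
```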

Wavelet and K-L Separability Based Feature Extraction Method for Functional Data Classification

This paper proposes a novel feature extraction method, based on the Discrete Wavelet Transform (DWT) and K-L Separability (KLS), for the classification of Functional Data (FD). The method combines the decorrelation and reduction properties of the DWT with the additive independence property of KLS, which helps extract classification features from FD. It is an advance on the popular wavelet-based shrinkage method for functional data reduction and classification. A theoretical analysis is given in the paper to prove the consistent convergence property, and a simulation study is done to compare the proposed method with the earlier shrinkage methods. The experimental results show that this method has advantages in improving classification efficiency, precision and robustness.
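
A rough sketch of the pipeline, assuming two-class functional data; since the paper's exact KLS score is not reproduced here, a Fisher-style ratio stands in for it (pywt is the PyWavelets package):

```python
import numpy as np
import pywt

def dwt_select_features(X, y, wavelet="db4", level=3, n_keep=20):
    """Decorrelate curves with a DWT, then keep the coefficients
    with the highest class-separability score.

    X : (n_samples, n_points) functional observations
    y : (n_samples,) binary class labels (0 or 1)
    """
    coeffs = np.stack([np.concatenate(pywt.wavedec(x, wavelet, level=level))
                       for x in X])
    a, b = coeffs[y == 0], coeffs[y == 1]
    # Fisher-style separability per coefficient, standing in for the
    # paper's K-L separability criterion
    score = (a.mean(0) - b.mean(0))**2 / (a.var(0) + b.var(0) + 1e-12)
    keep = np.argsort(score)[::-1][:n_keep]
    return coeffs[:, keep], keep
```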