On Quantum BCH Codes and Their Duals

Classical Bose-Chaudhuri-Hocquenghem (BCH) codes C that contain their dual codes can be used to construct quantum stabilizer codes; this chapter studies the properties of such codes. It has been shown that a BCH code of length n which contains its dual code satisfies a bound on the weight of every non-zero codeword in C, and that the converse is also true. One major difficulty in quantum communication and computation is to protect information-carrying quantum states against undesired interactions with the environment. To address this difficulty, many good quantum error-correcting codes have been derived as binary stabilizer codes. We were able to shed more light on the structure of dual-containing BCH codes. These results make it possible to determine the parameters of quantum BCH codes in terms of the weights of non-zero dual codewords.
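
For context, the link exploited here is the standard CSS construction (a well-known result, stated only for reference): a dual-containing classical code yields a quantum stabilizer code whose parameters follow from the classical ones,

\[
C^{\perp} \subseteq C,\quad C = [n,\,k,\,d] \;\Longrightarrow\; Q = [[\,n,\;2k - n,\;d_{Q}\,]],\quad d_{Q} \ge d,
\]

so bounds on the weights of the non-zero (dual) codewords translate directly into bounds on the quantum code parameters.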

Application of Multi-objective Optimization Packages in Design of an Evaporator Coil

A novel methodology has been used to design the evaporator coil of a refrigeration unit. The methodology follows a complete Computer Aided Design/Computer Aided Engineering (CAD/CAE) approach, by means of a Computational Fluid Dynamics/Finite Element Analysis (CFD/FEA) model which is executed many times for the thermal-fluid exploration of several design configurations by a commercial optimizer. Hence the design is carried out automatically by parallel computations, with an optimization package taking the decisions rather than the design engineer. The engineer instead makes decisions regarding the physical settings and initialization of the computational models, the number and range of the geometrical parameters of the coil fins, and the optimization tools to be employed. The final coil geometry was found to be better than the initial design.
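
Purely as an illustrative sketch of the automated loop described (not the authors' toolchain: the objective function below is a hypothetical stand-in for a CFD/FEA evaluation, and the fin parameters are invented), a set of candidate fin geometries can be evaluated in parallel and the best one retained:

```java
import java.util.*;
import java.util.concurrent.*;

// Illustrative sketch only: "thermalObjective" is a hypothetical stand-in for
// the CFD/FEA evaluation of one coil-fin geometry, not the paper's model.
public class CoilDesignSweep {
    record Design(double finPitchMm, double finHeightMm) {}

    // Placeholder objective: lower is better. A real evaluation would launch a CFD/FEA run.
    static double thermalObjective(Design d) {
        return Math.abs(d.finPitchMm() - 2.5) + 0.5 * Math.abs(d.finHeightMm() - 10.0);
    }

    public static void main(String[] args) throws Exception {
        List<Design> candidates = new ArrayList<>();
        for (double pitch = 1.5; pitch <= 3.5; pitch += 0.5)
            for (double height = 6; height <= 14; height += 2)
                candidates.add(new Design(pitch, height));

        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Future<double[]>> results = new ArrayList<>();
        for (int i = 0; i < candidates.size(); i++) {
            final int idx = i;
            results.add(pool.submit(() ->
                    new double[]{idx, thermalObjective(candidates.get(idx))}));
        }

        double[] best = {-1, Double.MAX_VALUE};
        for (Future<double[]> f : results) {
            double[] r = f.get();
            if (r[1] < best[1]) best = r;
        }
        pool.shutdown();
        System.out.println("Best design: " + candidates.get((int) best[0]));
    }
}
```

In the actual workflow each evaluation would launch a CFD/FEA run, and the optimizer would propose new candidates rather than sweep a fixed grid.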

Lodging Business Management in Nakhon Pathom with a Sufficiency Economy Approach

The objectives of this research are to identify the management patterns of Nakhon Pathom lodging entrepreneurs following the sufficiency economy approach, to understand the threats that affect this sector, and to design a suitable management model to sustain their businesses in the Nakhon Pathom style. What will happen if they do not use this approach? Will they face a financial crisis? The data and information are collected through informal discussions with 12 managers and 400 questionnaires. A mixed method combining qualitative and quantitative research is used. Bent Flyvbjerg's phronesis is utilized for the analysis. Our research aims to show that the sufficiency economy approach can help small business firms solve their problems. We expect the results of our research to form a financial model that addresses many of the entrepreneurs' problems, and this approach can serve as a model for other provinces of Thailand.

Analytical Solution of Time-Harmonic Torsional Vibration of a Cylindrical Cavity in a Half-Space

In this article an isotropic linear elastic half-space with a cylindrical cavity of finite length is considered to be under the effect of a ring-shaped time-harmonic torsional force applied at an arbitrary depth on the surface of the cavity. The equation of equilibrium is written in a cylindrical coordinate system. By means of the Fourier cosine integral transform, the non-zero displacement component is obtained in the transformed domain. With the aid of the inversion theorem of the Fourier cosine integral transform, the displacement is obtained in the real domain. Using the boundary conditions, the boundary value problem for the fundamental solution is reduced to a generalized Cauchy singular integral equation. Integral representations of the stress and displacement are obtained, and it is shown that their degenerate forms for the static problem coincide with existing solutions in the literature.
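
For reference, in axisymmetric torsion only the circumferential displacement survives, and for time-harmonic motion the equation of motion reduces to the standard form below (generic notation; the paper's exact symbols may differ):

\[
\frac{\partial^{2} u_{\theta}}{\partial r^{2}}
+ \frac{1}{r}\frac{\partial u_{\theta}}{\partial r}
- \frac{u_{\theta}}{r^{2}}
+ \frac{\partial^{2} u_{\theta}}{\partial z^{2}}
+ \frac{\rho\,\omega^{2}}{\mu}\,u_{\theta} = 0,
\]

where u_\theta(r,z) is the amplitude of the circumferential displacement, \mu the shear modulus, \rho the mass density, and \omega the excitation frequency; the Fourier cosine transform is then applied with respect to z.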

Rhetorical Communication in the CogSci Discourse Community: The Cognitive Neurosciences (2004) in the Context of Scientific Dissemination

In recent years linguistic research has turned increasing attention to the covert/overt strategies used to modulate authorial stance and positioning in scientific texts, and to the recipients' response. This study discusses some theoretical implications of the use of rhetoric in scientific communication and analyses qualitative data from the authoritative volume The Cognitive Neurosciences III (2004). Its genre identity, status and readability are considered in the socially interactive context of contemporary disciplinary discourses, with their polyphony of traditional and new, emerging genres. Evidence is given of the ways its well-known authors negotiate and shape knowledge and research results, explicitly appraising team work and promoting faith in the fast-paced progress of Cognitive Neuroscience, also through experiential metaphors; this is done by presenting a set of examples ordered according to their dominant rhetorical quality.

Adaptive Naïve Bayesian Anti-Spam Engine

The problem of spam has been seriously troubling the Internet community during the last few years and has now reached an alarming scale. Observations made at CERN (European Organization for Nuclear Research, located in Geneva, Switzerland) show that spam can constitute up to 75% of daily SMTP traffic. A naïve Bayesian classifier based on a bag-of-words representation of an email is widely used to stop this unwanted flood, as it combines good performance with simplicity of the training and classification processes. However, facing the constantly changing patterns of spam, it is necessary to ensure online adaptability of the classifier. This work proposes combining such a classifier with another NBC (naïve Bayesian classifier) based on pairs of adjacent words. Only the latter is retrained with examples of spam reported by users. Tests are performed on considerable sets of mails, both from public spam archives and from CERN mailboxes. They suggest that this architecture can increase spam recall without hurting classifier precision, as happens when only the single-word NBC is retrained.
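
A minimal sketch of the two-classifier combination described above (the Laplace smoothing, the equal weighting of the two classifiers, and the toy counts are assumptions made for illustration, not the paper's exact settings):

```java
import java.util.*;

// Illustrative sketch: two naive Bayesian classifiers, one over single words and
// one over adjacent-word pairs, whose log-scores are summed. Smoothing, priors and
// the equal weighting are assumptions, not the paper's exact scheme.
public class DualNaiveBayes {
    // Log-likelihood of a token list under a class model with Laplace smoothing.
    // For simplicity a single smoothing vocabulary size is used for both models.
    static double logLikelihood(List<String> tokens, Map<String, Integer> counts,
                                int totalCount, int vocabSize) {
        double logP = 0.0;
        for (String t : tokens) {
            int c = counts.getOrDefault(t, 0);
            logP += Math.log((c + 1.0) / (totalCount + vocabSize));
        }
        return logP;
    }

    static List<String> bigrams(List<String> words) {
        List<String> out = new ArrayList<>();
        for (int i = 0; i + 1 < words.size(); i++)
            out.add(words.get(i) + " " + words.get(i + 1));
        return out;
    }

    // Returns true if the combined unigram+bigram score favours spam.
    static boolean isSpam(List<String> words,
                          Map<String, Integer> spamUni, int spamUniTotal,
                          Map<String, Integer> hamUni, int hamUniTotal,
                          Map<String, Integer> spamBi, int spamBiTotal,
                          Map<String, Integer> hamBi, int hamBiTotal,
                          int vocabSize, double logPriorSpam, double logPriorHam) {
        List<String> bi = bigrams(words);
        double spamScore = logPriorSpam
                + logLikelihood(words, spamUni, spamUniTotal, vocabSize)
                + logLikelihood(bi, spamBi, spamBiTotal, vocabSize);
        double hamScore = logPriorHam
                + logLikelihood(words, hamUni, hamUniTotal, vocabSize)
                + logLikelihood(bi, hamBi, hamBiTotal, vocabSize);
        return spamScore > hamScore;
    }

    public static void main(String[] args) {
        // Toy counts only, to show the call shape; a real engine is trained on corpora.
        Map<String, Integer> spamUni = Map.of("free", 8, "offer", 5);
        Map<String, Integer> hamUni  = Map.of("meeting", 7, "report", 6);
        Map<String, Integer> spamBi  = Map.of("free offer", 4);
        Map<String, Integer> hamBi   = Map.of("project report", 3);
        List<String> mail = List.of("free", "offer", "today");
        boolean spam = isSpam(mail, spamUni, 13, hamUni, 13, spamBi, 4, hamBi, 3,
                              1000, Math.log(0.5), Math.log(0.5));
        System.out.println(spam ? "spam" : "ham");
    }
}
```

In the adaptive setting described by the paper, only the bigram model's counts would be updated with user-reported spam, leaving the single-word model untouched.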

Synthesis and Characterization of ZnO and Fe3O4 Nanocrystals from Oleate-based Organometallic Compounds

Magnetic and semiconductor nanomaterials exhibit novel magnetic and optical properties owing to unique size- and shape-dependent effects. As the size shrinks into the nanoscale region, various anomalous properties that are normally absent in the bulk start to dominate. The ability to harness these anomalous properties in the design of advanced electronic devices depends strictly on the synthetic strategy. Hence, the present work focuses on developing rational synthetic control to produce high-quality nanocrystals, using an organometallic approach to tune both the size and the shape of the nanomaterials. In order to elucidate the growth mechanism, transmission electron microscopy was employed as a powerful tool for time-resolved morphological and structural characterization of the magnetic (Fe3O4) and semiconductor (ZnO) nanocrystals. The present synthetic approach is found to produce nanostructures with well-defined shapes. We have found that oleic acid is an effective capping ligand for preparing oxide-based nanostructures without agglomeration, even at high temperature. The oleate-based precursors and capping ligands are fatty-acid compounds derived from natural palm oil and have low toxicity. Compared with other synthetic approaches for producing nanostructures, the present method offers an effective route to oxide-based nanomaterials with well-defined shapes and good monodispersity. The nanocrystals are well separated from each other, without any stacking effect. In addition, the as-synthesized nanopellets are chemically and physically stable compared with previously reported nanomaterials. Further development and extension of the current synthetic strategy are being pursued to combine both materials into a nanocomposite form to be used as a "smart magnetic nanophotocatalyst" for industrial wastewater treatment.

Action Functional of the Electromagnetic Field: Effect of Gravitation

The scalar wave equation for a potential in a curved space-time, i.e., the Laplace-Beltrami equation, has been studied in this work. An action principle is used to derive a finite element algorithm for determining the modes of propagation inside a waveguide of arbitrary shape. Generalizing this idea, Maxwell theory in a curved space-time determines a set of linear partial differential equations for the four electromagnetic potentials, with coefficients given by the metric of space-time. Similarly to Einstein's formulation of the field equations of gravitation, these equations are also derived from an action principle. In this paper, the expressions for the action functional of the electromagnetic field are derived in the presence of a gravitational field.
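
For reference, the textbook form of the electromagnetic action on a curved background (standard normalization; the paper's conventions may differ) is

\[
S[A] \;=\; -\frac{1}{4}\int \sqrt{-g}\; g^{\mu\alpha} g^{\nu\beta} F_{\mu\nu} F_{\alpha\beta}\; d^{4}x,
\qquad
F_{\mu\nu} = \partial_{\mu}A_{\nu} - \partial_{\nu}A_{\mu},
\]

where g is the determinant of the metric g_{\mu\nu}; varying S with respect to A_\mu yields the source-free curved-space Maxwell equations \nabla_{\mu}F^{\mu\nu} = 0.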

3D Shape Modelling of Left Ventricle: Towards Correlation of Myocardial Scintigraphy Data and Coronarography Result

Myocardial scintigraphy is an imaging modality which provides functional information, whereas coronarography gives useful information about coronary artery anatomy. In the case of coronary artery disease (CAD), coronarography cannot determine precisely which moderate lesions (artery narrowing between 50% and 70%), known as the "gray zone", are haemodynamically significant. In this paper, we aim to define the relationship between the location and degree of stenosis in the coronary arteries and the perfusion observed on myocardial scintigraphy. This allows us to model the evolving impact of these stenoses in order to justify a coronarography, or to avoid it, for patients suspected of being in the gray zone. Our approach is decomposed into two steps. The first step consists in modelling a coronary artery bed and stenoses of different locations and degrees. The second step consists in modelling the left ventricle at stress and at rest using the spherical harmonics model and myocardial scintigraphic data. We use the spherical harmonics descriptors to analyse the deformation of the left ventricle model between stress and rest, which allows us to determine whether an ischemia exists and to quantify it.
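
The spherical harmonics model referred to here is commonly written as the following surface expansion (the general SPHARM form; the specific parameterization used in the paper may differ):

\[
r(\theta,\varphi) \;=\; \sum_{l=0}^{L}\sum_{m=-l}^{l} c_{lm}\, Y_{l}^{m}(\theta,\varphi),
\]

where Y_l^m are the spherical harmonic basis functions, the coefficients c_{lm} act as shape descriptors that can be compared between the stress and rest reconstructions, and L is the truncation order.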

Adaptive Impedance Control for Unknown Non-Flat Environment

This paper presents a new adaptive impedance control strategy, based on the Function Approximation Technique (FAT), to compensate for an unknown non-flat environment shape or a time-varying environment location. The target impedance in the force-controllable direction is modified by incorporating adaptive compensators, and the uncertainties are represented by FAT, allowing the update law to be derived easily. Force-error feedback is utilized in the estimation, and accurate knowledge of the environment parameters is not required by the algorithm. It is shown mathematically that the stability of the controller is guaranteed based on Lyapunov theory. Simulation results are presented to demonstrate the validity of the proposed controller.
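
For orientation, a typical force-tracking target impedance in the force-controllable direction has the form below (generic notation; the adaptive compensators described in the paper modify this relation and are not shown):

\[
M_{d}\,\ddot{e} + B_{d}\,\dot{e} + K_{d}\, e \;=\; f_{e} - f_{d}, \qquad e = x - x_{r},
\]

where M_d, B_d and K_d are the desired inertia, damping and stiffness, x_r is the reference trajectory, f_e is the measured contact force, and f_d is the desired force.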

Analysis of DNA Microarray Data using Association Rules: A Selective Study

DNA microarrays allow the measurement of expression levels for a large number of genes, perhaps all genes of an organism, within a number of different experimental samples. It is very important to extract biologically meaningful information from this huge amount of expression data in order to know the current state of the cell, because most cellular processes are regulated by changes in gene expression. Association rule mining techniques help to find association relationships between genes. Numerous association rule mining algorithms have been developed to analyze and associate this huge amount of gene expression data. This paper focuses on some of the popular association rule mining algorithms developed to analyze gene expression data.
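
To illustrate the underlying idea (not a specific algorithm from this survey), the support and confidence of simple one-gene-implies-one-gene rules over a discretized expression matrix can be computed as follows; the gene names, data and thresholds are invented for the example:

```java
import java.util.*;

// Illustrative sketch: mine simple {geneA} -> {geneB} rules from a boolean
// expression matrix (rows = samples, columns = genes, true = "up-regulated").
public class GeneRuleMiner {
    public static void main(String[] args) {
        String[] genes = {"G1", "G2", "G3"};
        boolean[][] upRegulated = {
            {true,  true,  false},
            {true,  true,  true },
            {false, true,  true },
            {true,  true,  false},
        };
        double minSupport = 0.5, minConfidence = 0.7;
        int samples = upRegulated.length;

        for (int a = 0; a < genes.length; a++) {
            for (int b = 0; b < genes.length; b++) {
                if (a == b) continue;
                int countA = 0, countAB = 0;
                for (boolean[] row : upRegulated) {
                    if (row[a]) {
                        countA++;
                        if (row[b]) countAB++;
                    }
                }
                double support = (double) countAB / samples;
                double confidence = countA == 0 ? 0.0 : (double) countAB / countA;
                if (support >= minSupport && confidence >= minConfidence) {
                    System.out.printf("%s -> %s  (support=%.2f, confidence=%.2f)%n",
                            genes[a], genes[b], support, confidence);
                }
            }
        }
    }
}
```

Algorithms such as Apriori extend this idea to itemsets of arbitrary size by pruning candidates that fall below the minimum support.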

Simulation of Large Deformations of Rubbers by the RKPM Method

In this paper, processes involving large deformations of a rubber with hyperelastic material behavior are simulated by the reproducing kernel particle method (RKPM). Due to the lack of the Kronecker delta property in meshless shape functions, the imposition of essential boundary conditions consumes significant CPU time in meshfree computations. In this work, the transformation method is used to impose essential boundary conditions. An RKPM material shape function is used in this analysis. The support of the material shape functions covers the same set of particles during material deformation, and hence the transformation matrix is formed only once, at the initial stage. A MATLAB program is developed for the simulations.
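
For context, the transformation method mentioned here is usually stated as follows (standard form, with notation introduced for this sketch rather than taken from the paper):

\[
u^{h}(x) = \sum_{I}\Phi_{I}(x)\,d_{I},
\qquad
\hat{u}_{I} \equiv u^{h}(x_{I}) = \sum_{J}\Phi_{J}(x_{I})\,d_{J} = \sum_{J}\Lambda_{IJ}\,d_{J},
\]

so the generalized coefficients follow from the nodal values as d = \Lambda^{-1}\hat{u}, and essential boundary conditions can be imposed directly on \hat{u}. Because the material shape functions keep the same particle support throughout the deformation, \Lambda needs to be formed and inverted only once.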

Fast Facial Feature Extraction and Matching with Artificial Face Models

Facial features are frequently used to represent local properties of a human face image in computer vision applications. In this paper, we present a fast algorithm that can extract facial features online so that they give a satisfactory representation of a face image. It includes one step for coarse detection of each facial feature by AdaBoost and another to increase the accuracy of the found points by Active Shape Models (ASM) in the regions of interest. The resulting facial features are evaluated by matching them with artificial face models in physiognomy applications. The distance between the extracted features and those of the face models in the database is measured by means of the Hausdorff distance. In the experiments, the proposed method shows efficient performance in facial feature extraction and in an online physiognomy system.
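
For reference, the symmetric Hausdorff distance used for the matching step can be computed over two 2-D point sets as below; the toy coordinates are invented for illustration:

```java
// Symmetric Hausdorff distance between two 2-D point sets, as used for
// feature-to-model matching. Toy coordinates are invented for illustration.
public class HausdorffDistance {
    // Directed Hausdorff distance: the largest nearest-neighbour distance from a to b.
    static double directed(double[][] a, double[][] b) {
        double max = 0.0;
        for (double[] p : a) {
            double min = Double.MAX_VALUE;
            for (double[] q : b) {
                double d = Math.hypot(p[0] - q[0], p[1] - q[1]);
                if (d < min) min = d;
            }
            if (min > max) max = min;
        }
        return max;
    }

    static double hausdorff(double[][] a, double[][] b) {
        return Math.max(directed(a, b), directed(b, a));
    }

    public static void main(String[] args) {
        double[][] features = {{10, 12}, {30, 12}, {20, 25}};   // extracted feature points
        double[][] model    = {{11, 13}, {29, 11}, {21, 27}};   // points of one face model
        System.out.println("H = " + hausdorff(features, model));
    }
}
```

The model from the database with the smallest Hausdorff distance to the extracted features is then taken as the best match.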

Computer Aided Design of Reshaping Process of Circular Pipes into Square Pipes

Square pipes (pipes with square cross sections) are used for various industrial purposes, such as machine structure components and housing/building elements, and their use is expanding rapidly and widely. Hence, the output of these pipes is increasing and new application fields are continually developing. Due to various recent demands, the products have to satisfy demanding specifications with high dimensional accuracy. The design of the reshaping process for pipes with square cross sections, however, is performed by trial and error, based on experts' experience. In this paper, a computer-aided simulation is developed, based on the 2-D elastic-plastic method with consideration of shear deformation, to analyze the reshaping process. The effect of various parameters, such as the diameter of the circular pipe and the mechanical properties of the metal, on product dimensions and quality can be evaluated using this simulation. Moreover, aspects of the reshaping process design, including determination of the cross-section shrinkage, the necessary number of stands, the roll radius, and the pipe height at each stand, are investigated. Further, good agreement is shown between the results of the design method and the experimental results.
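
As a simple first-order check (assuming the wall mid-line perimeter is conserved, which is an assumption made here for illustration and not part of the paper's elastic-plastic analysis), the side a of the resulting square section can be estimated from the circular pipe diameter D:

\[
4a \approx \pi D \;\Longrightarrow\; a \approx \frac{\pi D}{4} \approx 0.785\,D ;
\]

the cross-section shrinkage determined by the simulation then appears as a deviation from this estimate.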

Six Sigma Solutions and Their Benefit-Cost Ratio for Quality Improvement

This is applied research presenting the improvement of production quality using six sigma solutions and an analysis of the benefit-cost ratio. The case of interest is the production of concrete tiles. This production had faced a high rate of nonconforming products due to inappropriate surface coating, and had low process capability with respect to tile strength. Surface coating and tile strength are the characteristics most critical to the quality of this product. The improvements followed the five stages of six sigma (define, measure, analyze, improve, and control). After the improvement, the production yield reached the 80% target, and the proportion of defective products from the coating process was remarkably reduced from 29.40% to 4.09%. The process capability based on strength was increased from 0.87 to 1.08, in line with customer requirements. The improvement saved material losses of 3.24 million baht (about 0.11 million dollars). The benefits from the improvement were analyzed from (1) the reduction in the number of nonconforming tiles, valued at factory price, for the surface coating improvement, and (2) the materials saved from the increase in process capability. The benefit-cost ratio of the overall improvement was as high as 7.03. The investment showed no return during the define, measure, analyze, and early improve stages, after which the ratio kept increasing. This is because there are no benefits in the define, measure, and analyze stages of six sigma, since these three stages mainly determine the causes of the problem and its effects rather than improve the process. The benefit-cost ratio only begins to appear in the improve stage and grows thereafter. Within each stage, the individual benefit-cost ratio was much higher than the cumulative one, because costs had accumulated since the first stage of six sigma. Considering the benefit-cost ratio during the improvement project helps in making cost-saving decisions for similar activities during the improvement and for new projects. In conclusion, determining the behavior of the benefit-cost ratio throughout the six sigma implementation period provides useful data for managing quality improvement with optimal effectiveness; this is an additional outcome beyond the regular six sigma procedure.
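
The distinction drawn between the stage-wise and the cumulative ratio can be written as follows (notation introduced here for clarity, not taken from the paper):

\[
\mathrm{BCR}^{\text{stage}}_{k} = \frac{B_{k}}{C_{k}},
\qquad
\mathrm{BCR}^{\text{cum}}_{k} = \frac{\sum_{i=1}^{k} B_{i}}{\sum_{i=1}^{k} C_{i}},
\]

where B_i and C_i are the benefits and costs incurred in stage i. Since the benefits of the define, measure and analyze stages are essentially zero while their costs are not, the cumulative ratio stays below the stage-wise ratio once benefits start to appear in the improve stage.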

Concurrent Approach to Data Parallel Model using Java

Parallel programming models exist as an abstraction of hardware and memory architectures. There are several parallel programming models in common use: the shared memory model, the thread model, the message passing model, the data parallel model, the hybrid model, Flynn's models, the embarrassingly parallel computations model, and the pipelined computations model. These models are not specific to a particular type of machine or memory architecture. This paper presents a model program for a concurrent approach to the data parallel model using Java.
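
A minimal Java sketch of the data parallel idea (each thread applies the same operation to its own partition of the data; this is a generic illustration, not the paper's model program):

```java
import java.util.stream.IntStream;

// Data parallel sketch: the same operation (scaling) is applied concurrently
// to disjoint partitions of an array. Generic illustration only.
public class DataParallelScale {
    public static void main(String[] args) throws InterruptedException {
        double[] data = new double[1_000_000];
        IntStream.range(0, data.length).forEach(i -> data[i] = i);

        int threads = 4;
        Thread[] workers = new Thread[threads];
        int chunk = (data.length + threads - 1) / threads;

        for (int t = 0; t < threads; t++) {
            final int start = t * chunk;
            final int end = Math.min(start + chunk, data.length);
            workers[t] = new Thread(() -> {
                for (int i = start; i < end; i++) data[i] *= 2.0;  // same op, own partition
            });
            workers[t].start();
        }
        for (Thread w : workers) w.join();

        System.out.println("data[42] = " + data[42]);  // expect 84.0
    }
}
```

Because the partitions are disjoint, no synchronization is needed inside the loop; the join calls form the single synchronization point.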

A Multi-Step Algorithm for Sperm Segmentation in Microscopic Images

Noting that an effective treatment for infertility depends on finding the right solution, a great deal of study has been done in this field, and it remains a hot research subject today. Analyzing men's semen makes it possible to learn about fertility and infertility and, from this, to work towards an appropriate treatment; since this is a non-invasive and low-risk procedure, it would be greatly welcomed. In this research, the procedure is based on several image enhancement and segmentation algorithms applied to microscope images taken at different fertility institutions. Suitable results have been obtained from the processed images, which in turn help us to distinguish the sperms from the fluid and their surroundings.
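
As a sketch of the kind of segmentation step involved (a plain global threshold on a grayscale image; the multi-step algorithm of the paper is more elaborate and is not reproduced here):

```java
// Sketch of a global-threshold segmentation step on a grayscale image stored as
// an int matrix (0-255). The mean-based threshold is a simple choice made up for
// illustration; the paper's multi-step algorithm is more elaborate.
public class ThresholdSegmentation {
    static boolean[][] segment(int[][] gray) {
        long sum = 0;
        int pixels = 0;
        for (int[] row : gray) {
            for (int v : row) { sum += v; pixels++; }
        }
        int threshold = (int) (sum / pixels);           // global mean as threshold
        boolean[][] mask = new boolean[gray.length][gray[0].length];
        for (int y = 0; y < gray.length; y++)
            for (int x = 0; x < gray[0].length; x++)
                mask[y][x] = gray[y][x] < threshold;    // dark objects = foreground
        return mask;
    }

    public static void main(String[] args) {
        int[][] toy = { {200, 198, 40}, {210, 35, 38}, {205, 202, 45} };
        boolean[][] mask = segment(toy);
        System.out.println("foreground at (0,2): " + mask[0][2]);  // expect true
    }
}
```

In practice, the enhancement steps applied beforehand determine how well such a threshold separates the sperm heads from the surrounding fluid.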

Fluidity of A713 Cast Alloy with and without Scrap Addition using Double Spiral Fluidity Test: A Comparison

Recycling of aluminum alloys often decreases fluidity and consequently influences the castability of the alloy. In this study, the fluidity of Al-Zn alloys, namely the standard A713 alloy with and without scrap addition, has been investigated. The added scrap comprised contaminated alloy turning chips. Fluidity measurements were performed with the double spiral fluidity test, consisting of gravity casting of double spirals in green sand moulds with good reproducibility. The influence of the recycled alloy on fluidity has been compared with that of the virgin alloy, and the results showed that the fluidity decreased with increasing recycled content at the minimum pouring temperature. Interestingly, an appreciable improvement in the fluidity was observed at the maximum pouring temperature, especially for coated spirals.

Happiness Understanding Depending on Features of Coping Behavior

The importance of research on the understanding of happiness stems from the fundamental changes experienced in people's value systems in the post-Soviet countries. "The time of changes", characterized by the destruction of old values without the creation of new ones, induces in the person an experience of existential vacuum. The present research is relevant not only in connection with meaning formation, but also in connection with the need to adapt creatively in an integrative space. Following numerous works [1,2,3], we define happiness as a peak experience connected with the satisfaction of a correlated system of needs, dependent on the style of the subject's coping behavior.

Face Localization and Recognition in Varied Expressions and Illumination

In this paper, we propose a robust scheme for face alignment and recognition under various influences. For face representation, illumination and variable expressions are important factors, affecting in particular the accuracy of facial localization and face recognition. To address these factors, we propose a robust approach that overcomes these problems. This approach consists of two phases. In the first phase, face images are preprocessed by means of the proposed illumination normalization method. The location of facial features can then be fitted more efficiently and quickly based on the proposed image blending. In addition, based on template matching, we further improve the active shape model (called IASM) to locate the face shape more precisely, which raises the recognition rate in the next phase. The second phase performs feature extraction using principal component analysis and face recognition using support vector machine classifiers. The results show that the proposed method can achieve good facial localization and face recognition under varied illumination and local distortion.
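
As a sketch of a generic illumination normalization step (plain histogram equalization, shown only for illustration; the paper proposes its own normalization method, which is not reproduced here):

```java
// Generic histogram equalization on an 8-bit grayscale image, shown only as an
// example of an illumination normalization step; the paper's own normalization
// method is not reproduced here.
public class HistogramEqualization {
    static int[][] equalize(int[][] gray) {
        int h = gray.length, w = gray[0].length, total = h * w;
        int[] hist = new int[256];
        for (int[] row : gray)
            for (int v : row) hist[v]++;

        // Cumulative distribution function mapped back to the 0..255 range.
        int[] cdf = new int[256];
        int running = 0;
        for (int i = 0; i < 256; i++) { running += hist[i]; cdf[i] = running; }

        int[][] out = new int[h][w];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                out[y][x] = (int) Math.round(255.0 * cdf[gray[y][x]] / total);
        return out;
    }

    public static void main(String[] args) {
        int[][] dim = { {50, 52, 54}, {51, 53, 55}, {50, 54, 52} };  // low-contrast patch
        int[][] eq = equalize(dim);
        System.out.println("equalized (0,0) = " + eq[0][0]);
    }
}
```

Spreading the intensity histogram in this way reduces the effect of uneven lighting before the feature localization and recognition stages.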