Abstract: In this paper we present high-performance
dynamically allocated multi-queue (DAMQ) buffer schemes for
fault-tolerant system-on-chip applications that require an
interconnection network. Two virtual channels share the same
buffer space. Fault-tolerance mechanisms for interconnection
networks are becoming a critical design issue for large massively
parallel computers. They are also important for high-performance
SoCs as system complexity keeps increasing rapidly. At the
message-switching layer, we make improvements that boost system
performance when faults occur during component communication. In
the proposed scheme, when a node or a physical channel is deemed
faulty, the previous-hop node terminates the buffer occupancy of
messages destined for the failed link. Buffer-usage decisions are
made at the switching layer without interaction with higher
abstraction layers, so buffer space is quickly released to
messages destined for other healthy nodes. The buffer space is
therefore used efficiently when faults occur at some nodes.
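The release mechanism described above can be sketched in a few lines. Everything here (slot counts, the per-channel FIFO structure, method names) is an illustrative assumption, not the paper's implementation:

```python
from collections import deque

class DAMQBuffer:
    """Illustrative shared buffer: two virtual channels draw slots
    from one common pool (structure and sizes are assumptions)."""

    def __init__(self, total_slots=8):
        self.free_slots = total_slots
        self.queues = {0: deque(), 1: deque()}  # one FIFO per virtual channel

    def enqueue(self, vc, msg, dest_link):
        # A message occupies one slot from the shared pool.
        if self.free_slots == 0:
            return False
        self.queues[vc].append((dest_link, msg))
        self.free_slots -= 1
        return True

    def release_faulty(self, failed_link):
        # On a fault, drop all buffered messages destined for the
        # failed link so their slots return to the shared pool.
        for vc, q in self.queues.items():
            kept = deque(m for m in q if m[0] != failed_link)
            self.free_slots += len(q) - len(kept)
            self.queues[vc] = kept

buf = DAMQBuffer(total_slots=4)
buf.enqueue(0, "pkt-a", dest_link=1)
buf.enqueue(1, "pkt-b", dest_link=2)
buf.release_faulty(failed_link=2)   # slot held by pkt-b is reclaimed
print(buf.free_slots)
```

The key point the abstract makes is visible here: the release happens inside the buffer layer itself, with no appeal to a higher protocol layer.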
Abstract: The paper attempts to elucidate the columnar structure
of the cortex by answering the following questions. (1) Why do
cortical neurons with similar functional preferences tend to be
vertically arrayed, forming what are known as cortical columns?
(2) How can the cortex as a whole be described in concise
mathematical terms? (3) How can efficient digital models of the
cortex be designed?
Abstract: In this article, we present our research work in
human-machine interaction. The research consists in controlling
the workspace with the eyes. We present some of our results, in
particular the detection of the eyes and the recognition of mouse
actions. Thus, a disabled user becomes able to interact with the
machine in a more intuitive way in diverse applications and
contexts. To test our application, we chose to work in real time
on videos captured by a camera placed in front of the user.
Abstract: In molecular biology, microarray technology is widely and successfully utilized to efficiently measure gene activity. When working with less-studied organisms, methods are available to design custom-made microarray probes. One design criterion is to select probes with minimal melting-temperature variance, thus ensuring similar hybridization properties. If the microarray application focuses on the investigation of metabolic pathways, it is not necessary to cover the whole genome; it is more efficient to cover each metabolic pathway with a limited number of genes. Firstly, an approach is presented which minimizes the overall melting-temperature variance of the selected probes for all genes of interest. Secondly, the approach is extended to include the additional constraint of covering all pathways with a limited number of genes while minimizing the overall variance. The new optimization problem is solved by a bottom-up programming approach which reduces the complexity to make it computationally feasible. As an example, the new method is applied to the selection of microarray probes covering all fungal secondary-metabolite gene clusters of Aspergillus terreus.
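As a toy illustration of the first design criterion, the sketch below picks one probe per gene so that the overall melting-temperature variance is minimal. The candidate temperatures are invented, and the brute-force enumeration is only viable for tiny instances; the paper's bottom-up approach exists precisely to avoid it:

```python
from itertools import product
from statistics import pvariance

# Hypothetical candidate melting temperatures (degrees C), one list per gene.
candidate_tm = {
    "geneA": [58.0, 62.0, 66.0],
    "geneB": [59.0, 63.0],
    "geneC": [61.0, 67.0],
}

# Enumerate every one-probe-per-gene selection and keep the one whose
# Tm values have the smallest population variance.
best = min(product(*candidate_tm.values()), key=pvariance)
print(best)
```

Here the selection (62.0, 63.0, 61.0) wins because its temperatures cluster most tightly, which is exactly the "similar hybridization properties" goal stated above.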
Abstract: The limit load-carrying capacity of functionally
graded material (FGM) circular plates subjected to an arbitrary
rotationally symmetric loading has been computed. It is assumed
that the plate material behaves as rigid perfectly plastic and
obeys either the square or the Tresca yield criterion. To this
end, the upper- and lower-bound principles of limit analysis are
employed to determine the exact value of the limiting load. The
correctness of the results is verified, and finally the limiting
loads for two examples, namely through-radius and
through-thickness FGM circular plates with simply supported
edges, are calculated; moreover, the values of the critical
loading factor are determined.
Abstract: Geometric errors in the manufacturing process can be
reduced by optimally positioning the fixture elements in the
fixture to make the workpiece stiff. In this paper we propose a
new fixture layout optimization method, N-3-2-1, for large metal
sheets that combines a genetic algorithm with finite element
analysis. The objective is to minimize the sum of the nodal
deflections normal to the surface of the workpiece. Two different
kinds of case studies are presented, and the optimal positions of
the fixturing elements are obtained for each case.
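The genetic-algorithm loop described above can be sketched as follows. Since the FEA solver is outside the scope of a sketch, a toy surrogate stands in for the nodal-deflection sum; the locator encoding, population size, and operators are all assumptions:

```python
import random

random.seed(0)

def deflection_sum(positions):
    # Surrogate for the FEA evaluation: in the paper the fitness is
    # the sum of nodal deflections normal to the sheet surface. Here a
    # toy bowl-shaped function stands in (minimum at position 0.6).
    return sum((p - 0.6) ** 2 for p in positions)

def genetic_search(n_locators=4, pop_size=20, generations=40):
    # Each individual encodes normalized locator positions in [0, 1].
    pop = [[random.random() for _ in range(n_locators)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=deflection_sum)          # lower deflection = fitter
        parents = pop[: pop_size // 2]        # elitist selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_locators)   # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_locators)        # point mutation
            child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.05)))
            children.append(child)
        pop = parents + children
    return min(pop, key=deflection_sum)

best = genetic_search()
```

Replacing `deflection_sum` with a call into an FEA package is the step that turns this skeleton into the method the abstract describes.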
Abstract: Vernacular buildings are considered sustainable in
terms of energy consumption and environmental impact, and their
thermal performance is receiving growing attention from
researchers. This paper investigates the thermal properties of
vernacular buildings in Lhasa by theoretical analysis of building
form, envelope, and materials. The thermal resistance and thermal
capacity of the envelope are calculated and compared with the
current Chinese building code and a modern building case. It is
concluded that Lhasa vernacular buildings meet the current
Chinese thermal standards and perform better in some respects,
which is achieved by various passive means closely matched to
local climate conditions.
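The envelope quantities mentioned above follow from standard layer sums: thermal resistance R = sum(d/lambda) and areal heat capacity C = sum(d*rho*c) over the wall layers. The layer data below are generic assumptions for illustration, not the measured Lhasa values:

```python
# Each layer: (thickness d [m], conductivity [W/(m*K)],
#              density [kg/m^3], specific heat [J/(kg*K)])
layers = [
    (0.02, 0.81, 1600, 1050),   # interior plaster (assumed)
    (0.40, 0.58, 1800, 1010),   # rammed earth / stone masonry (assumed)
    (0.02, 0.81, 1600, 1050),   # exterior plaster (assumed)
]

# Thermal resistance of the cross-section [m^2*K/W].
R = sum(d / lam for d, lam, _, _ in layers)
# Areal heat capacity [J/(m^2*K)].
C = sum(d * rho * c for d, _, rho, c in layers)
print(round(R, 3), round(C))
```

Values computed this way are what get compared against the code's required resistance and against the modern reference building.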
Abstract: The scientific achievements coming from molecular
biology depend greatly on the capability of computational
applications to analyze laboratory results. A comprehensive
analysis of an experiment typically requires the simultaneous
study of the obtained dataset together with data available in
several distinct public databases. Nevertheless, developing
centralized access to these distributed databases raises a set of
challenges: what is the best integration strategy, how to resolve
nomenclature clashes, how to handle data that overlaps between
databases, and how to deal with huge datasets. In this paper we
present GeNS, a system that uses a simple yet innovative approach
to address several biological data integration issues. Compared
with existing systems, the main advantages of GeNS are its
maintenance simplicity and its coverage and scalability in terms
of the number of supported databases and data types. To support
our claims, we present the current use of GeNS in two concrete
applications. GeNS currently contains more than 140 million
biological relations and can be publicly downloaded or accessed
remotely through SOAP web services.
Abstract: Forest fires in Thailand are an annual occurrence and a cause of air pollution. This study estimates the emissions from forest fires during 2005-2009 using the MODerate-resolution Imaging Spectroradiometer (MODIS) sensor aboard the Terra and Aqua satellites, experimental data, and statistical data. The forest fire emissions are estimated using the equation established by Seiler and Crutzen in 1982. The spatial and temporal variation of the emissions is analyzed and displayed in the form of a grid density map. Analysis of the satellite data suggests that 86,877 fire hotspots occurred between 2005 and 2009, with the significant majority (more than 80% of fire hotspots) in deciduous forest. The peak period of forest fire is January to May. The estimated emissions from forest fires during 2005 to 2009 amounted to about 3,133,845 tons of CO, 47,610.337 tons of CO2, 204,905 tons of CH4, and 6,027 tons of N2O, or about 6,171,264 tons of CO2eq. The fires also emitted 256,132 tons of PM10. The year 2007 was found to have the largest emissions, and within each year March has the maximum amount of forest fire emissions. The areas with a high density of forest fire emissions were the forests situated in the northern, western, and upper northeastern parts of the country.
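The Seiler and Crutzen (1982) estimate has the form M = A * B * alpha * beta (burned area times biomass density times the above-ground fraction times the burning efficiency), with the emission of each species obtained as M times its emission factor. The numbers below are illustrative assumptions, not the study's data:

```python
# Illustrative inputs (all assumed, not the Thailand values):
A = 1.5e4        # burned area [ha]
B = 40.0         # average biomass density [t/ha]
alpha = 0.5      # fraction of above-ground biomass
beta = 0.25      # burning efficiency

# Biomass burned [t], per Seiler & Crutzen's form M = A*B*alpha*beta.
M = A * B * alpha * beta

# Species emission = M * emission factor [t emitted / t biomass burned].
EF = {"CO2": 1.58, "CO": 0.10, "CH4": 0.007}   # assumed emission factors
emissions = {s: M * f for s, f in EF.items()}
print(M, emissions["CO2"])
```

In the study this calculation is applied per grid cell and per year, with the MODIS hotspots supplying the burned-area term.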
Abstract: Thermo-chemical treatment (TCT) such as pyrolysis is
gaining recognition as a valid route for (i) the recovery of
materials, valuable products, and petrochemicals; (ii) waste
recycling; and (iii) elemental characterization. Pyrolysis is
also receiving renewed attention for its operational, economic,
and environmental advantages. In this study, samples of
polyethylene terephthalate (PET) and polystyrene (PS) were
pyrolysed in a micro-thermobalance reactor (a thermogravimetric,
TGA, setup). Both polymers were prepared and conditioned prior to
experimentation. The main objective was to determine the kinetic
parameters of the depolymerization reactions that occur within
the thermal degradation process. Overall kinetic rate constants
(ko) and activation energies (Eo) were determined using the
general kinetics theory (GKT) method previously used by a number
of authors. Fitted correlations were found and validated using
the GKT; errors were within ±5%. This study represents a
fundamental step towards the development of scaling relationships
for the investigation of larger-scale reactors relevant to
industry.
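Extracting an overall rate constant and activation energy from rate data is commonly done with the linearized Arrhenius form ln k = ln k0 - E/(R*T). The sketch below recovers both parameters by least squares from synthetic (T, k) pairs generated with known values; it illustrates the fitting idea only, not the GKT method or the PET/PS measurements:

```python
import math

R_GAS = 8.314  # universal gas constant [J/(mol*K)]

# Synthetic data generated with k0 = 1e12 1/s and E = 200 kJ/mol.
data = [(T, 1e12 * math.exp(-200e3 / (R_GAS * T))) for T in (600, 650, 700, 750)]

# Linear regression of ln k against 1/T: slope = -E/R, intercept = ln k0.
xs = [1.0 / T for T, _ in data]
ys = [math.log(k) for _, k in data]
n = len(data)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
E_fit = -slope * R_GAS                    # activation energy [J/mol]
k0_fit = math.exp(y_bar - slope * x_bar)  # pre-exponential factor [1/s]
print(E_fit / 1000)
```

Because the synthetic data are exactly Arrhenius, the fit recovers the generating parameters; with real TGA data the residuals of this regression are what the quoted ±5% error bound refers to.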
Abstract: The phase diagrams and compositions of coexisting
phases have been determined for aqueous two-phase systems
containing poly(propylene glycol) with an average molecular
weight of 425 and sodium citrate at various pH values (3.93,
4.44, 4.6, 4.97, 5.1, and 8.22). The effect of pH on the
salting-out of poly(propylene glycol) by sodium citrate has been
studied. It was found that an increase in pH expands the
two-phase region. Increasing the pH also increases the
concentration of PPG in the PPG-rich phase, while the salt-rich
phase becomes somewhat more dilute.
Abstract: This paper uses a quasi-steady molecular statics model
and a diamond tool to simulate the temperature rise in nanoscale
orthogonal cutting of single-crystal silicon. It further
qualitatively analyzes the temperature field of the silicon
workpiece both with and without heat transfer considered. The
temperature rise of the workpiece is assumed to be caused mainly
by two heat sources: plastic deformation heat and friction heat.
A theoretical model is then developed for the generation of
plastic deformation heat and friction heat during nanoscale
orthogonal cutting. After the temperature increases produced by
these two heat sources are summed, the resulting total
temperature rise at each atom of the workpiece is substituted
into a finite-difference heat-transfer equation to carry out the
heat transfer, compute the temperature field at each step, and
perform the related analysis.
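The heat-transfer step described above can be illustrated with an explicit 1-D finite-difference update, T_i <- T_i + r*(T_{i+1} - 2*T_i + T_{i-1}) with r = alpha*dt/dx^2: a local temperature rise from the two heat sources is injected and then diffused. Grid size, r, and the source magnitude are illustrative assumptions (the paper works per atom, not on this toy grid):

```python
r = 0.25               # alpha*dt/dx^2; must satisfy r <= 0.5 for stability
T = [300.0] * 11       # initial temperature field [K]
T[5] += 100.0          # local rise from deformation + friction heat (assumed)

for _ in range(50):
    # Interior nodes diffuse; the two end nodes act as a fixed
    # ambient-temperature boundary.
    T = ([T[0]]
         + [T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1]) for i in range(1, 10)]
         + [T[10]])

print(round(max(T), 2))
```

After the loop the injected spike has spread into a smooth bump centered on the heated node, which is the qualitative behavior the temperature-field analysis tracks step by step.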
Abstract: Synthetic Aperture Radar (SAR) is an imaging radar that takes full advantage of the relative movement of the antenna with respect to the target. By processing the radar reflections collected over the antenna's movement with the Range Doppler Algorithm (RDA), the superior resolution of a theoretically wider antenna, termed a synthetic aperture, is obtained. SAR can therefore achieve high-resolution two-dimensional imagery of the ground surface. In addition, two filtering steps, in the range and azimuth directions, provide sufficiently accurate results. This paper develops a simulation in which realistic SAR images can be generated, and the effect of velocity errors on the resulting image is also investigated. Simulation results on the image resolution are presented for several velocity errors. In most cases, such algorithms need to be adjusted for particular datasets or particular applications.
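The first of the two filtering steps, range compression, correlates the received signal with the transmitted chirp. The sketch below does this with a time-domain matched filter on a single point target (real RDA processors use FFTs, and the chirp parameters here are illustrative assumptions):

```python
import cmath

N, K = 64, 0.02                          # chirp length [samples] and rate (assumed)
chirp = [cmath.exp(1j * cmath.pi * K * (n - N / 2) ** 2) for n in range(N)]

delay = 100                              # point target at this range bin (assumed)
rx = [0j] * 256                          # received signal: a delayed chirp echo
for n in range(N):
    rx[delay + n] += chirp[n]

# Matched filter: correlate rx with the conjugate chirp; the output
# magnitude peaks where the echo lines up with the reference.
compressed = [abs(sum(rx[m + n] * chirp[n].conjugate() for n in range(N)))
              for m in range(len(rx) - N)]
peak = max(range(len(compressed)), key=compressed.__getitem__)
print(peak)
```

The compressed output concentrates the chirp's spread energy into one bin at the target's delay, which is what gives SAR its fine range resolution; azimuth compression applies the same idea along the flight direction.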
Abstract: The automatic discrimination of seismic signals is an important practical goal for earth-science observatories because of the large amount of information they receive continuously. An essential discrimination task is to allocate an incoming signal to a group associated with the kind of physical phenomenon producing it. In this paper, we present new techniques for seismic signal classification: local, regional, and global discrimination. These techniques were tested on seismic signals from the database of the National Geophysical Institute of the Centre National pour la Recherche Scientifique et Technique (Morocco), using the Moroccan software for seismic signal analysis.
Abstract: This article investigates the contribution of synthesized visual speech. Visual speech synthesized by a computer consists of an animation, in particular of lip movements. Visual speech is also a necessary part of the non-manual component of a sign language. A methodology is proposed to determine the quality and accuracy of synthesized visual speech, and it is validated on Czech speech. The article therefore presents a procedure for recording speech data both to set up the synthesis system and to evaluate the synthesized speech. Furthermore, one option for the evaluation process is elaborated in the form of a perceptual test. This test procedure is verified on the measured data with two settings of the synthesis system. The results of the perceptual test show a statistically significant increase in intelligibility evoked by both real and synthesized visual speech. The aim now is to present one part of the evaluation process leading to a more comprehensive evaluation of the sign speech synthesis system.
Abstract: Since large power transformers are the most expensive
and strategically important components of any power generation
and transmission system, their reliability is crucial for the
operation of the energy system. Circuit breakers are likewise
very important elements of the power transmission line, so
monitoring their events provides a knowledge base for determining
the time to the next maintenance. This paper introduces a
comparative method for estimating the state of transformers and
circuit breakers using continuous monitoring of voltage and
current. It details a new wavelet-based method for monitoring
apparatus insulation: for transformer insulation monitoring, a
method based on the wavelet transformation and neutral-point
analysis is proposed. Using EMTP tools, faults in the transformer
winding were simulated with a detailed transformer winding model.
The current at the neutral point of the winding was analyzed by
wavelet transformation. It is shown that the neutral current of
the transformer winding carries useful information about faults
in the insulation of the transformer.
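The wavelet analysis of the neutral current can be illustrated with a single level of a Haar decomposition: a fault-like transient in an otherwise smooth current shows up as a large detail (high-frequency) coefficient. The signal is synthetic, and the study's EMTP waveforms and choice of wavelet may well differ:

```python
import math

# Synthetic neutral-point current: one cycle-like sine with an
# injected transient that mimics an insulation fault.
signal = [math.sin(2 * math.pi * n / 16) for n in range(64)]
signal[40] += 2.0

# One level of the Haar discrete wavelet transform: pairwise
# averages (approximation) and differences (detail), both scaled.
s = math.sqrt(2.0)
approx = [(signal[2 * i] + signal[2 * i + 1]) / s for i in range(32)]
detail = [(signal[2 * i] - signal[2 * i + 1]) / s for i in range(32)]

# The largest detail coefficient localizes the transient in time.
spike = max(range(32), key=lambda i: abs(detail[i]))
print(spike)
```

The smooth sine contributes only small detail coefficients, so the fault's abrupt edge stands out clearly; this time localization is what makes the wavelet transform useful for insulation monitoring.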
Abstract: Accounts of language acquisition differ significantly in their treatment of the role of prediction in language learning. In particular, nativist accounts posit that probabilistic learning about words and word sequences has little to do with how children come to use language. The accuracy of this claim was examined by testing whether distributional probabilities and frequency contributed to how well 3- to 4-year-olds repeat simple word chunks. Corresponding chunks were the same length, expressed similar content, and were all grammatically acceptable, yet the results of the study showed marked differences in performance when overall distributional frequency varied. A distributional model of language predicted the empirical findings better than a number of other models, replicating earlier findings and showing that children attend to distributional probabilities in an adult corpus. This suggests that language learning is prediction-and-error based rather than based on the abstract rules that nativist accounts propose.
Abstract: Hypernetworks are a generalized graph structure
representing higher-order interactions between variables. We present a
method for self-organizing hypernetworks to learn an associative
memory of sentences and to recall the sentences from this memory.
This learning method is inspired by the “mental chemistry” model of
cognition and the “molecular self-assembly” technology in
biochemistry. Simulation experiments are performed on a corpus of
natural-language dialogues of approximately 300K sentences
collected from TV drama captions. We report on the sentence
completion performance as a function of the order of word-interaction
and the size of the learning corpus, and discuss the plausibility of this
architecture as a cognitive model of language learning and memory.
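A minimal sketch of hypernetwork-style sentence completion: store fixed-order hyperedges of (position, word) pairs from a corpus, then fill a blank by letting matching hyperedges vote. The toy corpus, the order-3 setting, and the voting rule are illustrative assumptions; the paper's model is learned from roughly 300K dialogue sentences:

```python
from itertools import combinations
from collections import Counter

ORDER = 3                                  # word-interaction order (assumed)
corpus = [
    "i love you so much",
    "i love you too",
    "do you love me",
    "i miss you so much",
]

# Associative memory: count every order-3 hyperedge of (position, word)
# pairs observed in the corpus sentences.
memory = Counter()
for sent in corpus:
    tagged = list(enumerate(sent.split()))
    for edge in combinations(tagged, ORDER):
        memory[edge] += 1

def complete(sentence):
    # Recall: hyperedges consistent with the known words of the query
    # vote (weighted by count) for the word at the blank position.
    words = sentence.split()
    blank = words.index("_")
    tagged = [(i, w) for i, w in enumerate(words) if w != "_"]
    votes = Counter()
    for edge, count in memory.items():
        known = [p for p in edge if p[0] != blank]
        if len(known) == ORDER - 1 and all(p in tagged for p in known):
            for pos, word in edge:
                if pos == blank:
                    votes[word] += count
    return votes.most_common(1)[0][0]

print(complete("i love _ so much"))
```

Raising `ORDER` makes recall more context-sensitive but sparser, which is essentially the order-of-interaction trade-off the experiments above measure.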
Abstract: The purpose of this paper is to shed light on the
controversial subject of tax incentives to promote regional
development. Although extensive research has been conducted, a
review of the literature gives an inconclusive answer to whether
economic incentives are effective. One reason is that for some
researchers “effective” means the significant location of new
firms in targeted areas, while for others it means the creation
of jobs regardless of whether new firms arrive in significant
numbers. We
present this dichotomy by analyzing a tax incentive program via both
alternatives: location and job creation. The contribution of the paper
is to inform policymakers about the potential opportunities and
pitfalls when designing incentive strategies. This is particularly
relevant, given that both the US and Europe have been promoting
incentives as a tool for regional economic development.
Abstract: In a state-of-the-art industrial production line for
photovoltaic products, the handling and automation processes are
of particular importance. While being processed into a fully
functional crystalline solar cell, an as-cut photovoltaic wafer
is subject to numerous repeated handling steps. In view of
stronger productivity requirements and the need to decrease
defect-related rejections, the mechanical stress on the thin
wafers has to be reduced to a minimum, as fragility increases
with decreasing wafer thickness. In relation to the increasing
wafer fragility, research at the Fraunhofer Institutes IPA and
CSP showed a negative correlation between multiple handling
processes and wafer integrity. Recent work has therefore focused
on the analysis and optimization of the dry wafer-stack
separation process with compressed air. Achieving a
wafer-sensitive, capable process and a high production throughput
rate is the basic motivation of this research.