Abstract: Various regression-based methods have been developed to handle data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than several other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for the Buckley-James method thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's Distance, RD*_i, and has been developed based on Cook's idea. The renovated Cook's Distance RD*_i has advantages (depending on the analyst's demands) over (i) the change in the fitted value for a single case, DFIT*_i, since it measures the influence of case i on all n fitted values Y^* (not just the fitted value for case i, as DFIT*_i does), and (ii) the change in the coefficient estimates when the ith case is deleted, DBETA*_i, since DBETA*_i yields one value for each of the p variables, so it is usually easier to look at a single diagnostic measure such as RD*_i in which information from the p variables is considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
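Since RD*_i builds on Cook's deletion idea, the classical uncensored measure it renovates can be sketched as follows. This is the standard OLS Cook's distance, not the Buckley-James version described in the abstract, and the implementation is a generic illustration:

```python
import numpy as np

def cooks_distance(X, y):
    """Classical Cook's distance for OLS regression: the influence of
    each case i on all n fitted values, computed without refitting by
    using the leverage h_ii.  This is the uncensored measure that
    RD*_i renovates, not the Buckley-James version itself."""
    n, p = X.shape
    H = X @ np.linalg.inv(X.T @ X) @ X.T   # hat matrix
    h = np.diag(H)                         # leverages h_ii
    e = y - H @ y                          # residuals
    s2 = (e @ e) / (n - p)                 # residual variance estimate
    return (e**2 / (p * s2)) * h / (1 - h)**2
```

Each D_i equals the scaled squared distance between the coefficient vectors fitted with and without case i, which is why one number per case can summarize the influence on all p coefficients at once.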
Abstract: As a tool for human spatial cognition and thinking, the map has long played an important role; maps are perhaps as fundamental to society as language and the written word. Economic and social development requires an extensive and in-depth understanding of the living environment, from the global scale down to urban housing, and this has brought unprecedented opportunities and challenges for traditional cartography. Through an analysis of existing multi-scale representation techniques, this paper first proposes the concept of the scaleless map and its basic characteristics. Some strategies for automated map compilation are then presented. Taking the demands of automated map compilation into account, the paper details four technical features that the proposed software, the WJ workstation, must have: generalization operators, symbol primitives, dynamic annotation, and mapping process templates. This paper provides a more systematic new idea and solution for improving the intelligence and automation of scaleless cartography.
Abstract: A combination of photosynthetic bacteria along with
anaerobic acidogenic bacteria is an ideal option for efficient
hydrogen production. In the present study, the optimum
concentration of substrates for the growth of Rhodobacter
sphaeroides was found by response surface methodology. The
optimum combination of three individual fatty acids was determined
by a Box-Behnken design. Increasing the volatile fatty acid concentration
decreased growth. The combination of sodium acetate and sodium
propionate was the most significant for the growth of the organism. The
results showed that a maximum biomass concentration of 0.916 g/l
was obtained when the concentrations of acetate, propionate and
butyrate were 0.73 g/l, 0.99 g/l and 0.799 g/l, respectively. The growth
was then studied under these optimum volatile fatty acid concentrations
at a light intensity of 3000 lux, an initial pH of 7 and a temperature
of 35 °C; a maximum biomass concentration of 0.92 g/l was
obtained, which verified the practicability of this optimization.
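The response-surface step above can be illustrated generically: a Box-Behnken analysis fits a full second-order polynomial to the design data and reads the optimum off its stationary point. The sketch below uses synthetic data and generic names, not the paper's measurements:

```python
import numpy as np
from itertools import combinations

def fit_quadratic_rsm(X, y):
    """Least-squares fit of the full second-order model used with
    Box-Behnken designs:
    y = b0 + sum b_i x_i + sum b_ii x_i^2 + sum b_ij x_i x_j."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
    b, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
    return b

def stationary_point(b, k):
    """Optimum of the fitted surface: write y = b0 + g.x + x'Bx and
    solve grad y = g + 2 B x = 0."""
    g = b[1:1 + k]
    B = np.diag(b[1 + k:1 + 2 * k])
    for (i, j), c in zip(combinations(range(k), 2), b[1 + 2 * k:]):
        B[i, j] = B[j, i] = c / 2.0
    return np.linalg.solve(-2.0 * B, g)
```

With three factors (as for acetate, propionate and butyrate) the model has ten coefficients, so a 13-run Box-Behnken design already suffices to fit it.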
Abstract: Efforts to secure supervisory control and data acquisition
(SCADA) systems must be supported under the guidance of
sound security policies and mechanisms to enforce them. Critical
elements of the policy must be systematically translated into a format
that can be used by policy enforcement components. Ideally, the
goal is to ensure that the enforced policy is a close reflection of
the specified policy. However, security controls commonly used to
enforce policies in the IT environment were not designed to satisfy
the specific needs of the SCADA environment. This paper presents
a language, based on the well-known XACML framework, for the
expression of authorization policies for SCADA systems.
Abstract: The Sphere Method, developed mainly by Professor Katta G. Murty, is a flexible interior point algorithm for linear programming problems. It consists of two steps, the centering step and the descent step, of which the centering step is the most expensive part of the algorithm. For the centering step we propose some improvements, such as introducing two or more initial feasible solutions and selecting the more favorable new solution by objective value, while working with rigorous updates of the feasible region, along with some ideas integrated into the descent step. An illustration is given confirming the advantage of using the proposed procedure.
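The centering step seeks a ball center: the feasible point whose minimum distance to the constraint hyperplanes is largest. For a system Ax <= b this is itself a small LP, sketched here in a generic textbook formulation (not Murty's full procedure, and `ball_center` is an illustrative name):

```python
import numpy as np
from scipy.optimize import linprog

def ball_center(A, b):
    """Largest inscribed ball in {x : A x <= b}: maximize delta subject
    to a_i . x + ||a_i|| * delta <= b_i.  Decision variables (x, delta)."""
    norms = np.linalg.norm(A, axis=1)
    A_ub = np.hstack([A, norms[:, None]])        # append the delta column
    c = np.zeros(A.shape[1] + 1)
    c[-1] = -1.0                                 # minimize -delta
    res = linprog(c, A_ub=A_ub, b_ub=b, bounds=[(None, None)] * len(c))
    return res.x[:-1], res.x[-1]                 # center and radius
```

For the unit square 0 <= x1, x2 <= 1 this returns the center (0.5, 0.5) with radius 0.5, which is the kind of deep interior point from which a descent step can make long moves.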
Abstract: Architecture as a form of art, whilst actively
developing, finds new methods and conceptions. Currently,
architectural animation is actively developing as a stage succeeding
architectural visualization. Interesting vistas of architectural ideas
were discovered by artists of Japanese animation, in which there are
traditional spirits, kami, and imaginary spaces relating to them.
Anime art should be considered a form of abstract painting and another kind of
architectural workshop, in which new architectural ideas are generated.
Abstract: The present work is motivated by the idea that the
layer deformation in anisotropic elasticity can be estimated from the
theory of interfacial dislocations. In effect, this work, which is an
extension of a previous approach given by one of the authors,
determines the anisotropic displacement fields and the critical
thickness due to a complex biperiodic network of misfit dislocations
(MDs) lying just below the free surface, in view of the arrangement of
the dislocations.
The elastic fields of such arrangements observed along interfaces
play a crucial part in the improvement of the physical properties of
epitaxial systems. New results are proposed in anisotropic elasticity
for hexagonal networks of MDs which contain intrinsic and extrinsic
stacking faults. We developed, using a previous approach based on
the relative interfacial displacement and a Fourier series formulation
of the displacement fields, the expressions of elastic fields when
there is a possible dissociation of MDs. The numerical investigations
in the case of the observed system Si/(111)Si with low twist angles
show clearly the effect of the anisotropy and thickness when the
misfit networks are dissociated.
Abstract: This paper presents a novel combined cycle for air separation and natural gas liquefaction. The idea is that natural gas can be liquefied while gaseous or liquid nitrogen and oxygen are produced in one combined cryogenic system. Cycle simulation and exergy analysis were performed to evaluate the process and thereby reveal the influence of the crucial parameter, namely the flow rate ratio through the two-stage expanders, β, on the heat transfer temperature difference, its distribution, and the consequent exergy loss. Composite curves for the combined hot streams (feed natural gas and recycled nitrogen) and the cold stream showed the degree of optimization available in this process if an appropriate β is chosen. The results indicated that increasing β reduces the temperature difference and the exergy loss in the heat exchange process. However, the maximum value of β should be confined in terms of the minimum temperature difference proposed in heat exchanger design standards and the heat exchanger size. The optimal value β_opt under different operating conditions, corresponding to the required minimum temperature differences, was investigated.
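The link between the heat-transfer temperature difference and exergy loss can be illustrated with a constant-heat-capacity sketch: the exergy destroyed in an exchanger is the ambient temperature times the entropy generated by the two streams. The stream values below are invented for illustration, not the cycle's data:

```python
import numpy as np

T0 = 293.15  # ambient reference temperature, K

def exergy_loss(mcp_h, th_in, th_out, mcp_c, tc_in, tc_out):
    """Exergy destroyed between two single-phase streams with constant
    heat-capacity rates mcp (kW/K): T0 times total entropy generation.
    Stream numbers used below are illustrative only."""
    q_h = mcp_h * (th_in - th_out)                  # duty released, kW
    q_c = mcp_c * (tc_out - tc_in)                  # duty absorbed, kW
    assert abs(q_h - q_c) < 1e-9, "energy balance must close"
    s_gen = mcp_h * np.log(th_out / th_in) + mcp_c * np.log(tc_out / tc_in)
    return T0 * s_gen                               # kW

# smaller approach temperature difference -> less exergy destroyed
tight = exergy_loss(2.0, 300.0, 120.0, 2.0, 110.0, 290.0)   # ~10 K approach
loose = exergy_loss(2.0, 300.0, 120.0, 2.0,  90.0, 270.0)   # ~30 K approach
```

The comparison reproduces the qualitative result in the abstract: narrowing the composite-curve gap (here, the 10 K case versus the 30 K case) cuts the exergy destroyed in the heat exchange.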
Abstract: Many states are now committed to implementing
international human rights standards domestically. In terms of
practical governance, how might effectiveness be measured? A face-value
answer can be found in domestic laws and institutions relating
to human rights. However, this article provides two further tools to
help states assess their status on the spectrum of robust to fragile
human rights governance. The first recognises that each state has its
own 'human rights history' whose ideal end stage is robust human
rights governance; the second is a set of criteria for assessing
robustness. Although a New Zealand case study is used to illustrate
these tools, the widespread adoption of human rights standards by
many states inevitably means that the issues are relevant to other
countries. This is so even though there will always be varying degrees of
similarity and difference in constitutional backgrounds and in developed or
emerging human rights systems.
Abstract: A manufacturing feature can be defined simply as a
geometric shape and its manufacturing information to create the shape.
In a feature-based process planning system, a feature library, which
consists of pre-defined manufacturing features and the manufacturing
information to create their shapes, plays an important role
in the extraction of manufacturing features with their proper
manufacturing information. However, to manage the manufacturing
information flexibly, it is important to build a feature library that can
be easily modified. In this paper, the implementation of Semantic Wiki
for the development of the feature library is proposed.
Abstract: Many Thai movies have been very popular
domestically and internationally. Some movies have been box office hits
and have received awards. However, there has not yet been research
into how Thai movies can sell in international markets.
The objectives of the research were 1) to analyze the
characteristics of Thai movies that can sell to world audiences; and 2) to
investigate the factors that bring Thai movies into foreign markets. Thai
film professionals were interviewed, and their ideas were analyzed to
find out which factors contribute to Thai movies being widely seen in
worldwide markets. Nine foreign audience members were also interviewed to
reveal which characteristics of Thai movies would be well accepted by
the markets.
The results showed that the major characteristics of Thai movies
that prove successful worldwide were cultural and exotic content,
outstanding genres, well-known actors, and music and songs. Factors
contributing to global market success were marketing, the quality of Thai
movies, and financial support from the government.
Abstract: Proof-of-concept experiments were conducted to
determine the feasibility of using small amounts of Dissolved
Sulphur (DS) from the gaseous phase to precipitate platinum ions in
chloride media. Two sets of precipitation experiments were
performed in which the source of sulphur atoms was either a
thiosulphate solution (Na2S2O3) or sulphur dioxide gas (SO2). In the
liquid-liquid (L-L) system, complete precipitation of Pt was achieved
at small dosages of Na2S2O3 (0.01 – 1.0 M) in a time interval of 3-5
minutes. On the basis of this result, gas absorption tests were carried
out mainly to achieve sulphur solubility equivalent to 0.018 M. The
idea that large amounts of precious metals could be recovered
selectively from their dilute solutions by utilizing waste SO2
streams at low pressure seemed attractive from economic and
environmental points of view. Therefore, the mass transfer characteristics
of SO2 gas associated with reactive absorption across the gas-liquid
(G-L) interface were evaluated under different conditions of pressure
(0.5 – 2 bar), solution temperature (20 – 50 °C) and acid
strength (1 – 4 M HCl). This paper concludes with information about
selective precipitation of Pt in the presence of cations (Fe2+, Co2+,
and Cr3+) in a CSTR, and recommendations for scaling up laboratory data
to industrial pilot-scale operations.
Abstract: The aeration process via injectors is used to combat
the lack of oxygen in lakes due to eutrophication. A 3D numerical
simulation of the resulting flow using a simplified model is presented.
In order to generate the best flow dynamics in the fluid with respect to
the aeration purpose, the optimization of the injector locations is
considered. We propose to adapt to this problem the topological
sensitivity analysis method which gives the variation of a criterion
with respect to the creation of a small hole in the domain. The main
idea is to derive the topological sensitivity analysis of the physical
model with respect to the insertion of an injector in the fluid flow
domain. We propose in this work a topological optimization algorithm
based on the studied asymptotic expansion. Finally, we present some
numerical results showing the efficiency of our approach.
Abstract: A mathematical model for the hydrodynamics of a
surface water treatment pilot plant was developed and validated by
the determination of the residence time distribution (RTD) for the
main equipment of the unit. The well-known models of ideal/real
mixing, ideal displacement (plug flow) and (one-dimensional axial)
dispersion were combined in order to identify the structure
that gives the best fit to the experimental data for each piece of
equipment of the pilot plant. The RTD experimental results have shown
that the pilot plant hydrodynamics can be quite well approximated by a
combination of simple mathematical models, a structure which is
suitable for engineering applications. The validated hydrodynamic
models will be further used in the evaluation and selection of the
most suitable coagulation-flocculation reagents, optimum operating
conditions (injection point, reaction times, etc.), in order to improve
the quality of the drinking water.
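One standard building block for such combined structures is the tanks-in-series model, which interpolates between ideal mixing (n = 1) and plug flow (n → ∞). The sketch below is generic, not fitted to the pilot-plant tracer data:

```python
import numpy as np
from math import factorial

def tanks_in_series_rtd(t, n, tau):
    """Residence-time distribution E(t) for n identical ideal mixers in
    series with total mean residence time tau.  n = 1 is one ideal
    mixer; large n approaches plug flow, so the model spans the
    ideal-mixing/plug-flow range combined in such pilot-plant
    structures (illustrative sketch)."""
    ti = tau / n                                  # per-tank residence time
    return t**(n - 1) * np.exp(-t / ti) / (factorial(n - 1) * ti**n)
```

E(t) integrates to one and has mean tau, which is what a fitting procedure exploits when matching the measured tracer response of each unit.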
Abstract: The great majority of electric installations belong
to the first and second categories. In order to ensure a high level of
reliability of their electric feeder system, two power supply sources
are envisaged: one principal, the other a reserve, generally a cold
reserve (a diesel generator set).
While the principal source is in operation, its condition can be
monitored reliably; the reserve source, however, is on standby, so
preventive maintenance proceeding at fixed time intervals (the
periodicity) and for well-defined lengths of time is envisaged, so that
this source will always be available in case of a principal source failure.
The choice of the periodicity of the preventive maintenance of the
reserve source directly influences the reliability of the electric
feeder system. On the basis of semi-Markov processes, the
influence of the periodicity of the preventive maintenance of the
reserve source is studied and the optimal periodicity is given.
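The trade-off behind an optimal periodicity can be shown with a deliberately simple renewal sketch (not the paper's semi-Markov model): maintain too rarely and the idle reserve silently degrades; maintain too often and it is unavailable during maintenance itself. All rates and durations below are invented for illustration:

```python
import numpy as np

def standby_availability(T, lam, d):
    """Probability the cold-reserve source is available at a random
    instant, under a toy renewal model: while idle it fails silently at
    rate lam; preventive maintenance every T hours restores it as good
    as new and makes it unavailable for d hours (assume d < T).
    Availability = mean of exp(-lam*t) over one period, times the
    fraction of the period not spent in maintenance."""
    survive = (1.0 - np.exp(-lam * T)) / (lam * T)
    return survive * (1.0 - d / T)

# scan candidate periodicities for the availability-maximizing one
T = np.linspace(5.0, 500.0, 2000)
A = standby_availability(T, lam=1e-3, d=2.0)
T_opt = T[np.argmax(A)]
```

The availability curve falls off at both ends of the scan, so an interior optimal periodicity exists, which is the qualitative phenomenon the semi-Markov analysis quantifies rigorously.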
Abstract: Applying the idea of soft set theory to lattice implication algebras, the novel concept of (implicative) filteristic soft lattice implication algebras, which are related to (implicative) filters (for short, (IF-)F-soft lattice implication algebras), is introduced. Basic properties of (IF-)F-soft lattice implication algebras are derived. Two kinds of fuzzy filters of L (i.e. (∈, ∈∨q_k)-fuzzy (implicative) filters) are introduced, which are generalizations of fuzzy (implicative) filters. Some characterizations for a soft set to be an (IF-)F-soft lattice implication algebra are provided. Analogously, this idea can be used in other types of filteristic lattice implication algebras (such as fantastic (positive implicative) filteristic soft lattice implication algebras).
Abstract: There exists an injective, information-preserving function
that maps a semantic network (i.e. a directed labeled network)
to a directed network (i.e. a directed unlabeled network). The edge
label in the semantic network is represented as a topological feature
of the directed network. Also, there exists an injective function that
maps a directed network to an undirected network (i.e. an undirected
unlabeled network). The edge directionality in the directed network
is represented as a topological feature of the undirected network.
Through function composition, there exists an injective function that
maps a semantic network to an undirected network. Thus, aside from
space constraints, the semantic network construct does not have any
modeling functionality that is not possible with either a directed
or undirected network representation. Two proofs of this idea will
be presented. The first is a proof of the aforementioned function
composition concept. The second is a simpler proof involving an
undirected binary encoding of a semantic network.
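The labeled-to-unlabeled direction can be illustrated with one simple injective encoding (not the construction proved in the paper): each labeled edge becomes a fresh edge-node pointing at its target and at a shared per-label tag node, so the label survives as pure topology. The reserved `_e`/`_label:` name prefixes are an assumption of this sketch and must not occur among the input nodes:

```python
def encode(semantic_edges):
    """Map a directed labeled network to a directed unlabeled one.
    Each labeled edge (u, label, v) becomes three plain edges:
    u -> e, e -> v, e -> tag(label), where e is a fresh edge-node and
    tag(label) is one shared node per label.  Injective provided the
    reserved '_e'/'_label:' names do not clash with input node names.
    (An illustrative encoding, not the paper's construction.)"""
    edges, tags, counter = set(), {}, 0
    for u, label, v in semantic_edges:
        tag = tags.setdefault(label, "_label:" + label)
        counter += 1
        e = "_e%d" % counter
        edges.update({(u, e), (e, v), (e, tag)})
    return edges
```

Because the original triples can be read back off the edge-nodes and tag nodes, no information is lost, matching the abstract's point that edge labels are representable as topological features.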
Abstract: Recent scientific investigations indicate that
multimodal biometrics overcome the technical limitations of
unimodal biometrics, making them ideally suited for everyday life
applications that require a reliable authentication system. However,
for a successful adoption of multimodal biometrics, such systems
would require large heterogeneous datasets with complex multimodal
fusion and privacy schemes spanning various distributed
environments. From experimental investigations of current
multimodal systems, this paper reports the various issues related to
speed, error recovery and privacy that impede the diffusion of such
systems in real life. This calls for a robust mechanism that caters to
the desired real-time performance, robust fusion schemes,
interoperability and adaptable privacy policies.
The main objective of this paper is to present a framework that
addresses the abovementioned issues by leveraging on the
heterogeneous resource sharing capacities of Grid services and the
efficient machine learning capabilities of artificial neural networks
(ANN). Hence, this paper proposes a Grid-based neural network
framework for adopting multimodal biometrics with the view of
overcoming the barriers of performance, privacy and risk issues that
are associated with shared heterogeneous multimodal data centres.
The framework combines the concept of Grid services for reliable
brokering and privacy policy management of shared biometric
resources along with a momentum back-propagation ANN (MBP-ANN)
model of machine learning for efficient multimodal fusion and
authentication schemes. Real-life applications would be able to adopt
the proposed framework to cater to the varying business requirements
and user privacy needs for a successful diffusion of multimodal
biometrics in various day-to-day transactions.
Abstract: In this work, we present a novel active learning approach
for learning a visual object detection system. Our system
is composed of an active learning mechanism as a wrapper around
a sub-algorithm which implements an online boosting-based learning
object detector. At the core is a combination of a bootstrap procedure
and a semi-automatic learning process based on the online boosting
procedure. The idea is to exploit the availability of the classifier during
learning to automatically label training samples and incrementally
improve the classifier. This addresses the issue of reducing labeling
effort while obtaining better performance. In addition, we propose
a verification process for further improvement of the classifier.
The idea is to allow re-updating on seen data during learning to
stabilize the detector. The main contribution of this empirical study
is a demonstration that active learning based on an online boosting
approach trained in this manner can achieve results comparable to, or
even outperforming, a framework trained in the conventional manner using
much more labeling effort. Empirical experiments on challenging data
sets for specific object detection problems show the effectiveness of
our approach.
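The semi-automatic labeling loop can be sketched generically: a classifier labels its own most confident unlabeled samples each round, adds them to the training set and retrains. A nearest-centroid classifier stands in for the online-boosting detector here, so this is a sketch of the self-training idea only, not the paper's system:

```python
import numpy as np

def self_training(X_lab, y_lab, X_unlab, rounds=5, margin=1.0):
    """Minimal self-training loop: each round, a nearest-centroid
    classifier auto-labels unlabeled samples whose distance margin
    between the two class centroids exceeds `margin`, then retrains on
    the enlarged set.  Returns the final class centroids."""
    X, y = X_lab.copy(), y_lab.copy()
    pool = X_unlab.copy()
    for _ in range(rounds):
        if len(pool) == 0:
            break
        c0 = X[y == 0].mean(axis=0)            # centroid of class 0
        c1 = X[y == 1].mean(axis=0)            # centroid of class 1
        d0 = np.linalg.norm(pool - c0, axis=1)
        d1 = np.linalg.norm(pool - c1, axis=1)
        confident = np.abs(d0 - d1) > margin   # clear margin -> auto-label
        if not confident.any():
            break
        X = np.vstack([X, pool[confident]])
        y = np.concatenate([y, (d1[confident] < d0[confident]).astype(int)])
        pool = pool[~confident]
    return X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
```

Starting from only one labeled example per class, the loop absorbs the confidently separable pool samples, which is the labeling-effort reduction the abstract describes.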
Abstract: One of the common problems encountered in software
engineering is addressing and responding to the changing nature of
requirements. While several approaches have been devised to address
this issue, ranging from instilling resistance to changing requirements
in order to mitigate impact to project schedules, to developing an
agile mindset towards requirements, the approach discussed in this
paper is one of conceptualizing the delta in requirements and
modeling it, in order to plan a response to it. To provide some
context here, change is first formally identified and categorized as
either formal change or informal change. While agile methodology
facilitates informal change, the approach discussed in this paper
seeks to develop the idea of facilitating formal change. Collecting and
documenting meta-requirements that represent the phenomenon of change
would be a proactive measure towards building a realistic cognition
of the requirements entity that can further be harnessed in the
software engineering process.
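One minimal way to make the requirements delta a first-class, documentable entity is a small record type. Every field name here is a hypothetical illustration, not a schema from the paper:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RequirementDelta:
    """A meta-requirement: one formally captured change to a
    requirement.  Field names are invented for illustration."""
    req_id: str                 # identifier of the affected requirement
    change_kind: str            # e.g. "added", "modified", "withdrawn"
    rationale: str              # why the change occurred
    raised_on: date             # when the change was formally recorded
    impacted: list = field(default_factory=list)  # downstream artifacts
```

Recording formal changes as such objects gives the planning process something concrete to query, aggregate and trace, which is the proactive cognition of change the abstract argues for.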