Abstract: The Taiwan government has promoted the "Plain Landscape Afforestation and Greening Program" since 2002. A key task of the program is a payment for environmental services (PES) scheme, the "Plain Landscape Afforestation Policy" (PLAP), which was approved by the Executive Yuan on August 31, 2001 and enacted on January 1, 2002. Under the policy, the total afforested area was expected to reach 25,100 hectares by December 31, 2007. By the end of 2007, after six years of implementation, the actual afforested area was 8,919.18 hectares. Of this, Taiwan Sugar Corporation (TSC) accounted for 7,960 hectares (including 2,450.83 hectares of public service area), or 86.22% of the total afforestation area, while private farmland promoted by local governments accounted for 869.18 hectares, or 9.75%. These figures show that most of the afforestation under the policy is carried out by TSC and that TSC's achievement ratio is higher than that of other participants, implying that the success of the PLAP depends heavily on TSC's execution. The objective of this study is to analyze the policy planning relevant to TSC's participation in the PLAP, suggest complementary measures, and draw up effective adjustment mechanisms so as to improve the effectiveness of policy execution. Our main conclusions and suggestions are summarized as follows: 1. The main reason for TSC's participation in the PLAP is passive cooperation with the central government or with company policy; prior to participating, TSC's lands were mainly used for growing sugarcane. 2. The main factors TSC considers in selecting tree species are the suitability of the land and of the species; the largest proportion of tree species is allocated to economic forests, and the lack of technical instruction was the main problem during afforestation. How TSC can develop its leisure agriculture and landscape business in the future therefore becomes a key topic. 3. TSC has developed short- and long-term plans for future participation in the PLAP; however, there is little willingness or incentive to budget for such detailed planning. 4. Most TSC interviewees consider the PLAP requirements unreasonable, with the requirement on the number of trees cited most often; furthermore, most interviewees suggested that the government should continue to provide incentives even after 20 years. 5. Since the government shares the same goals as TSC, there should be sufficient cooperation and communication to support technical instruction and reduce afforestation costs, which will also help to improve the effectiveness of the policy.
Abstract: Today's Information and Knowledge Society has
placed new demands on education and a new paradigm of education
is required. Learning, facilitated by educational systems and the
pedagogic process, is globally undergoing dramatic changes. The aim
of this paper is the development of a simple Instructional Design tool
for E-Learning, named IDEL (Instructional Design for Electronic
Learning), which provides educators with facilities to create their
own courses with the essential educational material and manage
communication with students. It offers flexibility in the way of
learning and makes educational resources easy to employ and reuse.
IDEL is a web-based instructional system designed to facilitate the
course design process in accordance with the ADDIE model and
instructional design principles, with emphasis placed on
technology-enhanced learning. An example case of
using the ADDIE model to systematically develop a course and its
implementation with the aid of IDEL is given and some results from
student evaluation of the tool and the course are reported.
Abstract: This paper presents a new method for reading out piezoresistive accelerometer sensors. The circuit is based on an instrumentation amplifier and is useful for reducing offset in the Wheatstone bridge. The obtained gain is 645 with 1 μV/°C equivalent drift and 1.58 mW power consumption. A Schmitt trigger and a multiplexer circuit control the output node, and a high-speed counter is also designed in this work. The proposed circuit is designed and simulated in 0.18 μm CMOS technology with a 1.8 V power supply.
Abstract: This paper investigates the application of the Particle Swarm Optimization (PSO) technique to the coordinated design of a Power System Stabilizer (PSS) and a Thyristor Controlled Series Compensator (TCSC)-based controller to enhance power system stability. The design problem of the PSS and TCSC-based controllers is formulated as a time-domain optimization problem, and the PSO algorithm is employed to search for optimal controller parameters. By minimizing a time-domain objective function involving the deviation in the oscillatory rotor speed of the generator, the stability performance of the system is improved. To compare the capabilities of the PSS and the TCSC-based controller, both are first designed independently and then in a coordinated manner. The proposed controllers are tested on a weakly connected power system. Eigenvalue analysis and non-linear simulation results are presented to show the effectiveness of the coordinated design approach over individual designs. The simulation results show that the proposed controllers are effective in damping low-frequency oscillations resulting from various small disturbances such as changes in mechanical power input and reference voltage setting.
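To make the optimization step concrete, the following minimal PSO sketch in Python illustrates the search procedure. It is an editorial illustration, not the authors' implementation: the placeholder `objective`, the parameter bounds, the swarm size, and the inertia/acceleration coefficients are all assumptions; in the actual design the objective would be evaluated by a time-domain simulation of the power system returning the rotor-speed-deviation index, and the resulting vector would be mapped onto the PSS and TCSC controller gains and time constants.

```python
import numpy as np

# Minimal particle swarm optimization (PSO) sketch for tuning controller
# parameters.  The true objective would be a time-domain simulation of the
# power system; `objective` below is only a placeholder.

def objective(params):
    # Placeholder: replace with a simulation returning the time-domain
    # performance index for the candidate PSS/TCSC parameters.
    return np.sum((params - 0.5) ** 2)

def pso(obj, dim, n_particles=30, iters=100, bounds=(0.0, 1.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # positions
    v = np.zeros_like(x)                               # velocities
    pbest, pbest_f = x.copy(), np.array([obj(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()               # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([obj(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

best_params, best_cost = pso(objective, dim=4)
print(best_params, best_cost)
```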
Abstract: A novel idea presented in this paper is to combine
multihop routing with single-frequency networks (SFNs) for a
broadcasting scenario. An SFN is a set of multiple nodes that transmit
the same data simultaneously, resulting in transmitter macrodiversity.
Two of the most important performance factors of multihop
networks, node reachability and routing robustness, are analyzed.
Simulation results show that our proposed SFN-D routing algorithm
improves the node reachability by 37 percentage points as compared
to non-SFN multihop routing. It shows a diversity gain of 3.7 dB,
meaning that 3.7 dB lower transmission powers are required for the
same reachability. Even better results are possible for larger
networks. If an important node becomes inactive, this algorithm can
find new routes that a non-SFN scheme would not be able to find.
Thus, two of the major problems in multihopping are addressed:
achieving robust routing as well as improving node reachability or
reducing transmission power.
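The macrodiversity idea can be illustrated with a toy flooding simulation, sketched below: a node is considered reachable if the combined power from all nodes already holding the message exceeds a detection threshold, whereas the non-SFN baseline requires a single sufficiently strong link. The node count, path-loss exponent, and threshold are illustrative assumptions and do not reproduce the paper's simulation setup or its 37-percentage-point / 3.7 dB figures.

```python
import numpy as np

# Toy illustration of SFN macrodiversity: with SFN, received powers from all
# simultaneous transmitters are summed; without SFN, only the strongest single
# link counts.  All parameters below are illustrative assumptions.

rng = np.random.default_rng(1)
n, area, alpha, thr = 50, 1000.0, 3.5, 1e-9   # nodes, m, path-loss exp., W
pos = rng.uniform(0, area, (n, 2))
tx_power = 1.0                                 # W, identical for all nodes

def rx_power(src, dst):
    d = np.linalg.norm(pos[src] - pos[dst]) + 1.0
    return tx_power / d ** alpha

def flood(sfn: bool):
    have = {0}                                 # source node 0 holds the data
    changed = True
    while changed:
        changed = False
        for dst in range(n):
            if dst in have:
                continue
            powers = [rx_power(src, dst) for src in have]
            ok = (sum(powers) >= thr) if sfn else (max(powers) >= thr)
            if ok:
                have.add(dst)
                changed = True
    return len(have) / n

print("reachability without SFN:", flood(sfn=False))
print("reachability with SFN   :", flood(sfn=True))
```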
Abstract: The public sector holds large amounts of data in
various areas such as social affairs, economy, and tourism. Various
initiatives such as Open Government Data or the EU Directive on
public sector information aim to make these data available for public
and private service providers. Requirements for the provision of
public sector data are defined by legal and organizational
frameworks. Surprisingly, the defined requirements hardly cover
security aspects such as integrity or authenticity.
In this paper we discuss the importance of these missing
requirements and present a concept to assure the integrity and
authenticity of provided data based on electronic signatures. We
show that our concept is well suited to the provisioning of
unaltered data and that it can also be extended to data that needs
to be anonymized before provisioning by incorporating redactable
signatures. Our proposed concept enhances
trust and reliability of provided public sector data.
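For the unaltered-data case, the integrity and authenticity guarantee can be obtained with a conventional electronic signature; the following minimal sketch uses Ed25519 from the Python `cryptography` package. The key handling and the sample dataset are illustrative assumptions, and the redactable-signature extension for anonymized data discussed in the paper is not shown here.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Minimal sketch of assuring integrity/authenticity of published data with a
# conventional electronic signature (Ed25519).  The redactable-signature
# extension for anonymized data is not covered here.

private_key = Ed25519PrivateKey.generate()      # held by the publishing agency
public_key = private_key.public_key()           # distributed to data consumers

dataset = b"region;year;visitors\nA;2011;123456\n"   # illustrative sample data
signature = private_key.sign(dataset)           # published alongside the data

try:
    public_key.verify(signature, dataset)       # consumer-side check
    print("data is authentic and unaltered")
except InvalidSignature:
    print("data was modified after signing")
```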
Abstract: In this paper sensitivity analysis is performed for
reliability evaluation of power systems. When examining the
reliability of a system, it is useful to recognize how results
change as component parameters are varied. This knowledge
helps engineers to understand the impact of poor data, and
gives insight on how reliability can be improved. For these
reasons, a sensitivity analysis can be performed. Finally, a real
network is used to test the presented method.
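As a minimal illustration of such a sensitivity analysis, the sketch below perturbs each component reliability of a small series-parallel system and reports the finite-difference sensitivity of the system reliability. The system structure and the component values are illustrative assumptions, not the real network used for testing.

```python
import numpy as np

# Finite-difference sensitivity of system reliability with respect to each
# component reliability, for a small series-parallel example system.

def system_reliability(r):
    # Component 0 in series with the parallel pair (1, 2).
    parallel = 1.0 - (1.0 - r[1]) * (1.0 - r[2])
    return r[0] * parallel

r = np.array([0.95, 0.90, 0.85])        # assumed component reliabilities
base = system_reliability(r)
eps = 1e-6
for i in range(len(r)):
    perturbed = r.copy()
    perturbed[i] += eps
    sensitivity = (system_reliability(perturbed) - base) / eps
    print(f"dR_sys/dr_{i} = {sensitivity:.4f}")
```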
Abstract: This paper introduces a novel design for a boring bar with enhanced damping capability. The principle followed in the design phase was to enhance the damping capability while minimizing the loss in static stiffness through the implementation of composite material interfaces. The newly designed tool has been compared to a conventional tool. The evaluation criteria were the dynamic characteristics (frequency and damping ratio) of the machining system, as well as the surface roughness of the machined workpieces. The use of composite material in the design of the damped tool has been demonstrated to be effective. Furthermore, the autoregressive moving average (ARMA) models presented in this paper take into consideration the interaction between the elastic structure of the machine tool and the cutting process and can therefore be used to characterize the machining system under operational conditions.
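As a rough illustration of such operational ARMA modelling, the sketch below fits an ARMA(2,2) model to a synthetic vibration signal using the statsmodels `ARIMA` class (with differencing order d = 0). The signal, sampling rate, and model orders are assumptions for illustration only and are not taken from the paper.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Identify an ARMA model from a vibration-like signal, in the spirit of the
# operational characterization described above.  The signal is synthetic and
# the model orders (2, 2) are an illustrative assumption.

rng = np.random.default_rng(0)
t = np.arange(2048) / 10_000.0                  # assumed 10 kHz sampling
signal = np.sin(2 * np.pi * 600 * t) + 0.3 * rng.standard_normal(t.size)

model = ARIMA(signal, order=(2, 0, 2))          # ARMA(2, 2): d = 0
result = model.fit()
print(result.summary())                         # estimated AR/MA coefficients
```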
Abstract: The Indian subcontinent has a plethora of traditional
medicine systems that provide promising solutions to lifestyle
disorders in an 'all natural way'. Spices and oilseeds hold
prominence in Indian cuisine; hence the focus of the current study
was to evaluate the bioactive molecules from Linum usitatissimum
(LU), Lepidium sativum (LS), Nigella sativa (NS) and Guizotia
abyssinica (GA) seeds. The seeds were characterized for functional
lipids such as omega-3 fatty acids, antioxidant capacity, phenolic
compounds, dietary fiber and anti-nutritional factors. Analysis of the
seeds revealed LU and LS to be rich sources of α-linolenic acid
(41.85 ± 0.33% and 26.71 ± 0.63%, respectively), an omega-3 fatty
acid, as determined by GC-MS. In the study of antioxidant potential,
NS seeds demonstrated the highest antioxidant ability (61.68 ± 0.21
TEAC/100 g DW) due to the presence of phenolics and terpenes, as
assayed by mass spectral analysis. When screened for the
anti-nutritional factor cyanogenic glycoside, LS seeds showed a
content as high as 1674 ± 54 mg HCN/kg. GA is probably a good
source of a stable vegetable oil (SFA:PUFA ratio of 1:2.3). The seeds
showed a diversified bioactive profile, and further studies to use the
different biomolecules in tandem for the development of a possible
'nutraceutical cocktail' have therefore been initiated.
Abstract: Taking into account that many problems of the natural
sciences and engineering reduce to solving initial-value problems
for ordinary differential equations (ODEs), scientists beginning with
Newton have investigated approximate solutions of ODEs, and many
authors have devoted papers to the solution of the initial-value
problem. Euler's method, later developed further by the famous
scientists Adams, Runge and Kutta, is the most popular among these
methods.
Recently, researchers have begun to construct methods that preserve
some properties of the Adams and Runge-Kutta methods, called
hybrid methods. The construction of such methods has been
investigated since the middle of the twentieth century. Here we
investigate one generalization of multistep and hybrid methods and,
on this basis, construct specific methods of accuracy order p = 5 and
p = 6 for k = 1 (where k is the order of the difference method).
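The order-5 and order-6 methods themselves are not stated in the abstract, so the sketch below only illustrates the general idea behind hybrid methods, namely the use of values at off-grid (hybrid) points: the explicit midpoint rule evaluates the right-hand side at t + h/2. The test problem and step size are illustrative assumptions.

```python
import numpy as np

# Toy illustration of a hybrid-type method that uses an off-step point:
# the explicit midpoint rule (order 2), applied to y' = -y, y(0) = 1.

def midpoint_step(f, t, y, h):
    y_half = y + 0.5 * h * f(t, y)          # predictor at the hybrid point t + h/2
    return y + h * f(t + 0.5 * h, y_half)   # corrector using the hybrid value

def solve(f, t0, y0, t_end, h):
    ts, ys = [t0], [y0]
    t, y = t0, y0
    while t < t_end - 1e-12:
        y = midpoint_step(f, t, y, h)
        t += h
        ts.append(t)
        ys.append(y)
    return np.array(ts), np.array(ys)

ts, ys = solve(lambda t, y: -y, 0.0, 1.0, 2.0, 0.01)
print("max error vs exp(-t):", np.max(np.abs(ys - np.exp(-ts))))
```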
Abstract: This paper presents an implementation of an object tracking system for video sequences. Object tracking is an important task in many vision applications. Video analysis has two main steps: detection of interesting moving objects and tracking of such objects from frame to frame. Most tracking algorithms use pre-specified methods for preprocessing. In our work, we have implemented several object tracking algorithms (Meanshift, Camshift, Kalman filter) with different preprocessing methods and evaluated the performance of these algorithms on different video sequences. The obtained results show good performance according to the degree of applicability and the evaluation criteria.
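As an example of one of the trackers listed above, the following is a minimal constant-velocity Kalman filter written with NumPy; it tracks a 2-D position from noisy measurements such as object centroids produced by a preprocessing step. The state model, noise covariances, and synthetic trajectory are illustrative assumptions and not the parameters used in the paper.

```python
import numpy as np

# Minimal constant-velocity Kalman filter for 2-D object tracking.
dt = 1.0
F = np.array([[1, 0, dt, 0],      # state: [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0],       # only position is measured
              [0, 1, 0, 0]], float)
Q = 0.01 * np.eye(4)              # process noise (assumed)
R = 4.0 * np.eye(2)               # measurement noise (assumed)

x = np.zeros(4)                   # initial state
P = np.eye(4)                     # initial covariance

def kalman_step(x, P, z):
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update with measurement z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
for t in range(30):
    true_pos = np.array([2.0 * t, 1.5 * t])          # synthetic trajectory
    z = true_pos + rng.normal(0, 2.0, size=2)        # noisy centroid
    x, P = kalman_step(x, P, z)

print("estimated position:", x[:2], "estimated velocity:", x[2:])
```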
Abstract: A new approach is adopted in this paper based
on Turk and Pentland's eigenface method. It was found that the
probability density function of the distance between the projection
vector of the input face image and the average projection vector of
the subject in the face database follows a Rayleigh distribution. In
order to decrease the false acceptance rate and increase the
recognition rate, the input face image is recognized using two
thresholds: an acceptance threshold and a rejection threshold. We
also find that the values of the two thresholds approach each other
as the number of trials increases. During training, in order to reduce
the number of trials, the projection vectors for each subject are
averaged. Recognition experiments using the proposed algorithm
show that the recognition rate reaches 92.875% while the average
number of judgments is only 2.56.
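A minimal sketch of the two-threshold decision rule is given below: an input image is projected onto a small set of eigenfaces, its distance to each subject's averaged projection vector is computed, and the result is accepted, rejected, or deferred depending on an acceptance and a rejection threshold. The random stand-in "images", the PCA dimensionality, and the threshold values are illustrative assumptions, not the values from the paper.

```python
import numpy as np

# Eigenface-style recognition with an acceptance and a rejection threshold.
rng = np.random.default_rng(0)
n_subjects, imgs_per_subject, dim, k = 5, 8, 256, 20
X = rng.normal(size=(n_subjects * imgs_per_subject, dim))   # stand-in "images"
labels = np.repeat(np.arange(n_subjects), imgs_per_subject)

mean_face = X.mean(axis=0)
U, _, _ = np.linalg.svd((X - mean_face).T, full_matrices=False)
eigenfaces = U[:, :k]                                        # top-k eigenfaces

def project(img):
    return eigenfaces.T @ (img - mean_face)

# Average projection vector per subject (reduces the number of comparisons).
class_means = np.array([project(X[labels == c].mean(axis=0))
                        for c in range(n_subjects)])

T_ACCEPT, T_REJECT = 5.0, 12.0                               # assumed thresholds

def recognize(img):
    w = project(img)
    d = np.linalg.norm(class_means - w, axis=1)
    best = int(np.argmin(d))
    if d[best] <= T_ACCEPT:
        return best                 # accepted as a known subject
    if d[best] >= T_REJECT:
        return None                 # rejected as unknown
    return ("uncertain", best)      # between thresholds: another trial needed

print(recognize(X[0]))
```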
Abstract: Requirements management is critical to software
delivery success throughout the project lifecycle. Requirements
management and traceability provide assistance for many software
engineering activities such as impact analysis, coverage analysis,
requirements validation and regression testing. In addition,
requirements traceability is a recognized component of many
software process improvement initiatives, and it also helps to
control and manage the evolution of a software system. This paper
aims to provide an evaluation of current requirements management
and traceability tools. Management and test managers require an
appropriate tool for the software under test, and we hope the
evaluation presented here will help in selecting an efficient and
effective tool.
Abstract: In this paper a functional interpretation of quantum
theory (QT), with emphasis on quantum field theory (QFT), is
proposed. Besides the usual statements on relations between a
function's initial state and final state, a functional interpretation
also contains a description of the dynamic evolution of the function;
that is, it describes how things function. The proposed functional
interpretation of QT/QFT has been developed in the context of the
author's work towards a computer model of QT with the goal of
supporting the largest possible scope of QT concepts. In the course
of this work, the author encountered a number of problems inherent
in the translation of quantum physics into a computer program. He
came to the conclusion that the goal of supporting the major QT
concepts can only be satisfied if the present model of QT is
supplemented by a "functional interpretation" of QT/QFT. The paper
describes a proposal for such an interpretation.
Abstract: Neural networks are well known for their ability to
model non-linear functions, but, as statistical methods usually do,
they take a non-parametric approach; thus, neither a priori nor a
posteriori knowledge is easily taken into account. To deal with this
problem, an original way to encode knowledge inside the network
architecture is proposed. This method is applied to the problem of
evapotranspiration in a karstic aquifer, a problem of great practical
importance for water resource management.
Abstract: Morphogenesis is the process that underpins the self-organised development and regeneration of biological systems. The ability to mimic morphogenesis in artificial systems has great potential for many engineering applications, including the production of biological tissue, the design of robust electronic systems and the coordination of parallel computing. Previous attempts to mimic these complex dynamics within artificial systems have relied upon the use of evolutionary algorithms, which has limited their size and complexity. This paper presents some insight into the underlying dynamics of morphogenesis and then shows how to design, without the assistance of evolutionary algorithms, cellular architectures that converge to complex patterns.
Abstract: The dynamics of a Bertrand duopoly game are analyzed, in which the players use different production methods and choose their prices with bounded rationality. The equilibria of the corresponding discrete dynamical system are investigated, and the stability conditions of the Nash equilibrium under a local adjustment process are studied. As some parameters of the model are varied, the Nash equilibrium loses stability, giving rise to complex dynamics such as higher-order cycles and chaos. On this basis, we find that an increase in the adjustment speed of a boundedly rational player can drive the Bertrand market into a chaotic state. Finally, the complex dynamics, bifurcations and chaos are illustrated by numerical simulation.
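A toy version of such a boundedly rational price-adjustment map is sketched below: each player moves its price in proportion to its marginal profit. The linear demand form, the asymmetric costs (standing in for the different production methods), and the adjustment speeds are illustrative assumptions, not the paper's exact model; increasing the adjustment speeds in this sketch is what eventually destabilizes the equilibrium toward cycles and chaos.

```python
# Bertrand duopoly with boundedly rational price adjustment (toy model).
a, b = 10.0, 0.5            # demand q_i = a - p_i + b*p_j (assumed)
c1, c2 = 1.0, 2.0           # different marginal costs (different methods)
alpha1, alpha2 = 0.05, 0.06 # adjustment speeds; larger values destabilize

def step(p1, p2):
    # marginal profits d(pi_i)/d(p_i) for profit (p_i - c_i)(a - p_i + b*p_j)
    m1 = a - 2 * p1 + b * p2 + c1
    m2 = a - 2 * p2 + b * p1 + c2
    # bounded-rationality rule: adjust price in proportion to marginal profit
    return p1 + alpha1 * p1 * m1, p2 + alpha2 * p2 * m2

p1, p2 = 3.0, 3.0
trajectory = []
for _ in range(200):
    p1, p2 = step(p1, p2)
    trajectory.append((p1, p2))

print("last few states:", trajectory[-3:])   # converges for small alphas
```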
Abstract: Today, design requirements are extending more and
more from electronic (analogue and digital) to multidisciplinary
design. These needs call for methodologies that make the CAD
product reliable in order to improve time to market, study costs,
and the reusability and reliability of the design process.
This paper proposes a high-level design approach applied to the
characterization and optimization of Switched-Current Sigma-Delta
Modulators. It uses the new hardware description language
VHDL-AMS to help designers optimize the characteristics of the
modulator at a high level, with considerably reduced CPU time,
before passing to a transistor-level characterization.
Abstract: The Scale Invariant Feature Transform (SIFT) has been
widely applied, but extracting SIFT features is complicated and
time-consuming. In this paper, to meet the demands of real-time
applications, SIFT is parallelized and optimized on a cluster system;
the resulting implementation is named pSIFT. Redundant storage
and communication are used for boundary data to improve
performance, and before the feature descriptors are computed, data
reallocation is adopted to maintain load balance in pSIFT.
Experimental results show that pSIFT achieves good speedup and
scalability.
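The redundant-boundary idea can be illustrated with a small Python sketch: the image is split into strips, each strip is extended by an overlapping halo so that features near tile borders are not lost, and the strips are processed in parallel. The placeholder detector, the halo width, and the use of `multiprocessing` instead of a cluster/MPI setup are illustrative assumptions and do not represent the actual pSIFT implementation.

```python
import numpy as np
from multiprocessing import Pool

# Parallel feature extraction with redundant (overlapping) boundary data.
# The detector below is a trivial placeholder, not SIFT.

HALO = 16   # overlap in pixels; should exceed the detector's support radius

def detect_features(args):
    strip, y_offset = args
    # placeholder "detector": local intensities above a threshold
    ys, xs = np.where(strip > strip.mean() + 2 * strip.std())
    # report keypoints in global image coordinates
    return [(int(y) + y_offset, int(x)) for y, x in zip(ys, xs)]

def parallel_detect(image, n_workers=4):
    h = image.shape[0]
    bounds = np.linspace(0, h, n_workers + 1).astype(int)
    tasks = []
    for i in range(n_workers):
        top = max(bounds[i] - HALO, 0)          # redundant boundary rows
        bottom = min(bounds[i + 1] + HALO, h)
        tasks.append((image[top:bottom], top))
    with Pool(n_workers) as pool:
        results = pool.map(detect_features, tasks)
    # de-duplicate keypoints found twice in the overlapping halos
    return sorted(set(kp for part in results for kp in part))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((1024, 768))
    print(len(parallel_detect(img)), "keypoints found")
```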
Abstract: The system development life cycle (SDLC) is a
process used during the development of any system. The SDLC
consists of four main phases: analysis, design, implementation and
testing. During the analysis phase, a context diagram and data flow
diagrams are used to produce the process model of a system.
Consistency between the context diagram and the lower-level data
flow diagrams is very important in smoothing the development
process of a system. However, manually checking the consistency
of the context diagram against lower-level data flow diagrams using
a checklist is a time-consuming process. At the same time, the
limited human ability to detect errors is one of the factors that
influence the correctness and balancing of the diagrams. This paper
presents a tool that automates the consistency check between Data
Flow Diagrams (DFDs) based on the rules of DFDs. The tool serves
two purposes: as an editor to draw the diagrams and as a checker to
check the correctness of the diagrams drawn. The consistency check
from the context diagram to the lower-level data flow diagrams is
embedded inside the tool to overcome the manual checking
problem.
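A minimal sketch of one such consistency (balancing) rule is shown below: every data flow crossing the boundary of the context diagram must reappear as an external inflow or outflow in the level-1 diagram. The data structures and flow names are illustrative assumptions, not the tool's internal representation.

```python
# DFD balancing check: context-diagram inputs/outputs must match the external
# flows of the level-1 diagram.  Flow names and structures are illustrative.

context_diagram = {
    "inputs":  {"customer order", "payment details"},
    "outputs": {"invoice", "shipping notice"},
}

level1_diagram = {
    # each entry: (source, destination, flow name); "EXTERNAL" marks an
    # external entity outside the system boundary
    "flows": [
        ("EXTERNAL", "P1 Take Order", "customer order"),
        ("EXTERNAL", "P2 Billing", "payment details"),
        ("P2 Billing", "EXTERNAL", "invoice"),
        ("P3 Shipping", "EXTERNAL", "shipping notice"),
    ]
}

def check_balance(context, level1):
    ext_in = {name for src, dst, name in level1["flows"] if src == "EXTERNAL"}
    ext_out = {name for src, dst, name in level1["flows"] if dst == "EXTERNAL"}
    problems = []
    problems += [f"input '{f}' missing in level 1" for f in context["inputs"] - ext_in]
    problems += [f"output '{f}' missing in level 1" for f in context["outputs"] - ext_out]
    problems += [f"extra input '{f}' in level 1" for f in ext_in - context["inputs"]]
    problems += [f"extra output '{f}' in level 1" for f in ext_out - context["outputs"]]
    return problems or ["diagrams are balanced"]

for msg in check_balance(context_diagram, level1_diagram):
    print(msg)
```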