Abstract: The biodiversity crisis is one of many crises that emerged
at the turn of the millennium. Its concrete form of expression is
still disputed, but there is relatively broad consensus on the high
rate of degradation and the urgent need for action. The strategy of
action outlines a strong economic component, together with the
recognition of market mechanisms as among the most effective policies
to protect biodiversity. In this context, biodiversity and ecosystem
services are natural assets that play a key role in economic
strategies and technological development aimed at promoting growth
and prosperity. Developing and strengthening policies for the
transition to an economy based on the efficient use of resources is
the way forward. To emphasize the co-viability specific to the
economy-ecosystem services connection, the scientific approach
addresses, on the one hand, how to implement policies for nature
conservation and, on the other, the concepts underlying the economic
expression of ecosystem-services value in the context of current
technology. The analysis of business opportunities associated with
changes in ecosystem services leads to the conclusion that the
development of market mechanisms for nature conservation is a trend
that has become increasingly distinct in recent years. Although there
are still many controversial issues that have already given rise to
an obvious bias, international organizations and national governments
have initiated and implemented such mechanisms, in cooperation or
independently. Consequently, they have created the conditions for
convergence between private interests and the social interest in
nature conservation, so there are opportunities for ongoing business
development that leads, among other things, to positive effects on
biodiversity. Finally, the paper points out that markets fail to
quantify the value of most ecosystem services: existing price signals
reflect, at best, only a proportion of the total value of the
corresponding provision of food, water or fuel.
Abstract: Aggressive scaling of MOS devices requires the use of ultra-thin gate oxides to maintain a reasonable short-channel effect and to take advantage of higher density, higher speed, lower cost, etc. Such thin oxides give rise to high electric fields, resulting in considerable gate tunneling current through the gate oxide in the nanometer regime. Consequently, accurate analysis of gate tunneling current is very important, especially in the context of low-power applications. In this paper, a simple and efficient analytical model is developed for the channel and source/drain overlap region gate tunneling current through the ultra-thin gate oxide of an n-channel MOSFET with the inevitable deep sub-micron effect (DSME). The results obtained have been verified against simulated and reported experimental results for validation. It is shown that the calculated tunnel current fits the measured one well over the entire oxide thickness range. The proposed model is simple enough to be used in circuit simulators. It is observed that neglecting the deep sub-micron effect may lead to large errors in the calculated gate tunneling current. It is found that temperature has an almost negligible effect on gate tunneling current. It is also reported that gate tunneling current decreases as gate oxide thickness increases. The impact of source/drain overlap length on gate tunneling current is also assessed.
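The paper's analytical model is not reproduced in the abstract, but its qualitative claim that tunneling current falls steeply as oxide thickness grows can be illustrated with the textbook Fowler-Nordheim expression; the barrier height and effective-mass ratio below are illustrative assumptions, not the paper's fitted parameters.

```python
import math

def fn_current_density(v_ox, t_ox_nm, phi_b=3.1, m_ratio=0.4):
    """Textbook Fowler-Nordheim tunneling current density (A/cm^2)
    through an oxide of thickness t_ox_nm (nm) with v_ox volts across it.
    phi_b (Si/SiO2 barrier, eV) and m_ratio (oxide effective-mass ratio)
    are illustrative assumptions, not the paper's fitted parameters."""
    e_ox = v_ox / (t_ox_nm * 1e-7)                    # oxide field, V/cm
    a = 1.54e-6 / (m_ratio * phi_b)                   # pre-factor, A/V^2
    b = 6.83e7 * math.sqrt(m_ratio) * phi_b ** 1.5    # exponent, V/cm
    return a * e_ox ** 2 * math.exp(-b / e_ox)

# at fixed bias, a thinner oxide means a higher field and far more current
j_thin = fn_current_density(1.0, 1.5)
j_thick = fn_current_density(1.0, 2.0)
```

Because the oxide field enters the exponent, even a 0.5 nm thickness change shifts the current by several orders of magnitude, which is why the abstract stresses accurate modeling in the nanometer regime.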
Abstract: In the context of large volume Big Divisor (nearly)
SLagy D3/D7 μ-Split SUSY [1], after an explicit identification
of first generation of SM leptons and quarks with fermionic superpartners
of four Wilson line moduli, we discuss the identification of
gravitino as a potential dark matter candidate: we explicitly
calculate the decay lifetime of the gravitino (LSP) to be greater
than the age of the universe, and the lifetimes of decays of the
co-NLSPs (the first-generation squark/slepton and a neutralino) to
the LSP (the gravitino) to be small enough to respect BBN
constraints. Interested in the non-thermal production mechanism of
the gravitino, we evaluate the relic abundance of the gravitino LSP
in terms of those of the co-NLSPs by evaluating their
(co-)annihilation cross sections, and hence show that the former
satisfies the requirement for a potential dark matter candidate. We
also show that it is possible to obtain a 125 GeV light Higgs in our
setup.
Abstract: The hybridization of an artificial immune system with
cellular automata (CA-AIS) is a novel method. In this hybrid model,
the cellular automaton within each cell deploys the artificial immune
system algorithm in an optimization context in order to increase its
fitness by using its neighbors' efforts. The hybrid CA-AIS model is
introduced to address the weaknesses of the standard artificial
immune system. The credibility of the proposed approach is evaluated
by simulations, which show that it achieves better results than the
standard artificial immune system.
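As a rough illustration of the hybrid idea (not the authors' algorithm), the toy sketch below places one antibody per cell of a 2-D lattice; each step a cell clones and mutates its own antibody and also competes against its von Neumann neighbors' antibodies, so good solutions diffuse across the grid. All parameters are arbitrary assumptions.

```python
import random

def ca_ais_minimize(f, dim=2, grid=5, iters=30, clones=5, seed=1):
    """Toy CA-AIS: one antibody (candidate solution) per cell of a 2-D
    lattice. Each step a cell clones and mutates its own antibody and
    competes against its von Neumann neighbours' antibodies, so good
    solutions diffuse across the grid. All parameters are arbitrary."""
    rng = random.Random(seed)
    pop = [[[rng.uniform(-5.0, 5.0) for _ in range(dim)]
            for _ in range(grid)] for _ in range(grid)]
    for _ in range(iters):
        nxt = [[None] * grid for _ in range(grid)]
        for i in range(grid):
            for j in range(grid):
                cands = [pop[i][j]]
                # clonal expansion with Gaussian hypermutation
                for _ in range(clones):
                    cands.append([x + rng.gauss(0.0, 0.3)
                                  for x in pop[i][j]])
                # neighbours' efforts (torus wrap-around)
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    cands.append(pop[(i + di) % grid][(j + dj) % grid])
                nxt[i][j] = min(cands, key=f)   # elitist selection
        pop = nxt
    return min((pop[i][j] for i in range(grid) for j in range(grid)), key=f)

sphere = lambda x: sum(v * v for v in x)
best = ca_ais_minimize(sphere)
```

The elitist selection (the current antibody always competes) guarantees that no cell ever gets worse, while the neighbor exchange is what distinguishes CA-AIS from running independent immune algorithms per cell.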
Abstract: The efficient knowledge management system (KMS)
is one of the important strategies to help firms to achieve sustainable
competitive advantages, but little research has been conducted to
understand what contributes to KMS success. This study thus set out
to investigate the determinants of KMS success in the context of the Thai
banking industry. A questionnaire survey was conducted in four
major Thai Banks to test the proposed KMS Success model.
The results of this study show that KMS use and user satisfaction
relate significantly to the success of the KMS; that knowledge
quality, service quality and trust lead to system use; and that
knowledge quality, system quality and trust lead to user
satisfaction. However, this
research focuses only on system and user-related factors. Future
research thus can extend to study factors such as management support
and organization readiness.
Abstract: Encryption protects communication partners from
disclosure of their secret messages but cannot prevent traffic analysis
and the leakage of information about “who communicates with
whom". In the presence of collaborating adversaries, this linkability
of actions can danger anonymity. However, reliably providing
anonymity is crucial in many applications. Especially in contextaware
mobile business, where mobile users equipped with PDAs
request and receive services from service providers, providing
anonymous communication is mission-critical and challenging at the
same time. Firstly, the limited performance of mobile devices does
not allow for heavy use of expensive public-key operations which are
commonly used in anonymity protocols. Moreover, the demands for
security depend on the application (e.g., mobile dating vs. pizza
delivery service), but different users (e.g., a celebrity vs. a normal
person) may even require different security levels for the same
application. Considering both hardware limitations of mobile devices
and the different sensitivities of users, we propose an anonymity
framework that is dynamically configurable according to user and
application preferences. Our framework is based on Chaum's mix-net.
We explain the proposed framework, its configuration
parameters for the dynamic behavior and the algorithm to enforce
dynamic anonymity.
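The core linkability-breaking step of a Chaum mix cascade can be illustrated with a deliberately insecure toy: each mix strips one layer of encryption and shuffles the batch, so an observer cannot match inputs to outputs. The XOR keystream below is purely illustrative and not part of the proposed framework.

```python
import hashlib
import random

def xor_layer(data: bytes, key: bytes) -> bytes:
    """Apply/remove one 'encryption' layer with a SHA-256-derived XOR
    keystream. Deliberately toy: illustrative only, NOT secure."""
    stream = bytearray()
    ctr = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return bytes(b ^ k for b, k in zip(data, stream))

def mix_cascade(batch, mix_keys, seed=0):
    """Each mix strips its layer and re-orders the batch, so an observer
    watching the cascade cannot link inputs to outputs."""
    rng = random.Random(seed)
    for key in mix_keys:
        batch = [xor_layer(m, key) for m in batch]  # strip one layer
        rng.shuffle(batch)                          # break the ordering
    return batch

keys = [b"mix-1", b"mix-2"]
msgs = [b"hello", b"world"]
# the sender wraps the layers in reverse cascade order
onions = [xor_layer(xor_layer(m, keys[1]), keys[0]) for m in msgs]
delivered = mix_cascade(onions, keys)
```

A real mix-net uses public-key cryptography per layer, which is exactly the cost that the paper's configurable framework tries to balance against the limited performance of mobile devices.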
Abstract: In H.264/AVC video encoding, rate-distortion
optimization for mode selection plays a significant role to achieve
outstanding performance in compression efficiency and video quality.
However, this mode selection process also makes the encoding
process extremely complex, especially in the computation of the
rate-distortion cost function, which includes the computations of the sum
of squared difference (SSD) between the original and reconstructed
image blocks and context-based entropy coding of the block. In this
paper, a transform-domain rate-distortion optimization accelerator
based on fast SSD (FSSD) and VLC-based rate estimation algorithm
is proposed. This algorithm significantly simplifies the hardware
architecture for the rate-distortion cost computation with only
negligible performance degradation. An efficient hardware structure
for implementing the proposed transform-domain rate-distortion
optimization accelerator is also proposed. Simulation results
demonstrate that the proposed algorithm reduces total encoding time
by about 47% with negligible degradation of coding performance.
The proposed method can be easily applied to many mobile video
application areas such as a digital camera and a DMB (Digital
Multimedia Broadcasting) phone.
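The identity a transform-domain SSD exploits is Parseval's relation for an orthonormal transform: the SSD between original and reconstructed blocks equals the SSD between their DCT coefficients, so distortion can be measured without an inverse transform. A minimal 1-D sketch (not the paper's FSSD algorithm; the sample values and lambda are made up):

```python
import math

def dct_ii_ortho(x):
    """Orthonormal 1-D DCT-II; orthonormality gives Parseval's relation,
    so SSD can be computed directly on transform coefficients."""
    n = len(x)
    scale = lambda k: math.sqrt((1 if k == 0 else 2) / n)
    return [scale(k) * sum(x[i] * math.cos(math.pi * (i + 0.5) * k / n)
                           for i in range(n))
            for k in range(n)]

def ssd(a, b):
    """Sum of squared differences."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def rd_cost(distortion, rate_bits, lam):
    """Lagrangian mode-decision cost J = D + lambda * R."""
    return distortion + lam * rate_bits

# made-up original / reconstructed samples
orig = [12.0, 15.0, 11.0, 9.0]
recon = [11.0, 16.0, 10.0, 9.5]
spatial = ssd(orig, recon)
transform = ssd(dct_ii_ortho(orig), dct_ii_ortho(recon))
# spatial and transform-domain SSD agree up to floating-point error
```

Because the encoder already holds the transform coefficients, measuring distortion there avoids the inverse transform and reconstruction path, which is where the hardware savings come from.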
Abstract: The ever-increasing product diversity and competition on the market of goods and services has dictated the pace of growth in the number of advertisements. Despite their admittedly diminished effectiveness over recent years, advertisements remain the favored method of sales promotion. Consequently, the challenge for an advertiser is to explore every possible avenue of making an advertisement more noticeable, attractive and compelling for consumers. One way to achieve this is through celebrity endorsements. On the one hand, the use of a celebrity to endorse a product involves substantial costs; on the other hand, it does not by itself guarantee the success of an advertisement. The question of how celebrities can be used in advertising to the best advantage is therefore of utmost importance. Celebrity endorsements have become commonplace: empirical evidence indicates that approximately 20 to 25 per cent of advertisements feature some famous person as a product endorser. The popularity of celebrity endorsements demonstrates the relevance of the topic, especially in the context of the current global economic downturn, when companies are forced to save in order to survive, yet simultaneously to invest heavily in advertising and sales promotion. The issue of the effective use of celebrity endorsements also figures prominently in the academic discourse. The study presented below is thus aimed at exploring which qualities (characteristics) of a celebrity endorser have an impact on the effectiveness of the advertisement in which he/she appears, and how.
Abstract: The visualization of geographic information on mobile devices has become popular with the widespread use of the mobile Internet. The mobility of these devices brings much convenience to people's lives. Through the devices' add-on location-based services, people can access timely information relevant to their tasks. However, visual analysis of geographic data on mobile devices presents several challenges due to the small display and restricted computing resources. These limitations on screen size and resources may impair the usability of visualization applications. In this paper, a variable-scale visualization method is proposed to handle the challenge of the small mobile display. By merging multiple scales of information into a single image, the viewer is able to focus on the region of interest while keeping a good grasp of the surrounding context. This is essentially visualizing the map through a fisheye lens. However, the fisheye lens induces undesirable geometric distortion in the periphery, which renders the information there meaningless. The proposed solution is to apply map generalization, which removes excessive information around the periphery, and an automatic smoothing process to correct the distortion while keeping the local topology consistent. The proposed method is applied to both artificial and real geographical data for evaluation.
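The focus-plus-context idea can be sketched with a standard graphical fisheye transform (a common formula in the visualization literature, not necessarily the one used in the paper): a normalised distance x from the focus is remapped so that the focus region is magnified while the periphery is compressed.

```python
def fisheye(x, d=3.0):
    """Graphical fisheye transform on a normalised distance x in [0, 1]
    from the focus point: g(0) = 0, g(1) = 1, and the slope at the focus
    is d + 1, so the focus region is magnified while the periphery is
    compressed. d is the distortion factor (d = 0 is the identity)."""
    return (d + 1) * x / (d * x + 1)

# points near the focus are pushed outward (magnified) ...
near = fisheye(0.2)
# ... while the map boundary stays fixed
edge = fisheye(1.0)
```

The compression near x = 1 is precisely the geometric distortion the abstract's generalization and smoothing steps are designed to tame.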
Abstract: In online context, the design and implementation of
effective remote laboratories environment is highly challenging on
account of hardware and software needs. This paper presents the
remote laboratory software framework modified from the iLab Shared
Architecture (ISA). The ISA is a framework which enables students to
remotely access and control experimental hardware using internet
infrastructure. The need for remote laboratories arose from the
problems imposed by traditional laboratories: the high cost of
laboratory equipment, scarcity of space, and scarcity of technical
personnel, along with restricted university budgets, create a
significant bottleneck in building the required laboratory
experiments. The solution to these problems is to build
web-accessible laboratories. Remote laboratories allow students and
educators to interact with real laboratory equipment located
anywhere in the world at any time. Recently, many universities and
other educational institutions, especially in developing countries,
rely on simulations because they cannot afford the experimental
equipment their students require. Remote laboratories enable users
to obtain real data from real-time, hands-on experiments. To
implement many remote laboratories, the system architecture should
be flexible, understandable and easy to implement, so that different
laboratories with different hardware can be deployed easily. The
modifications were made to enable developers to add more equipment
to the ISA framework and to attract new developers to build more
online laboratories.
Abstract: Transportation is one of the most fundamental
challenges of urban development in the contemporary world. On the
other hand, sustainable urban development has received tremendous
public attention in the last few years. This trend in addition to other
factors such as energy cost, environmental concerns, traffic
congestion and the feeling of lack of belonging have contributed to
the development of pedestrian areas. The purpose of this paper is to
study the role of walkable streets in sustainable development of
cities. Accordingly, a documentary research through valid sources
has been utilized to substantiate this study. The findings demonstrate
that walking can lead to sustainable urban development from
physical, social, environmental, cultural, economic and political
aspects. Also, pedestrian areas, which are the main context of
walking, act as focal points of development in cities and have a
great effect on modifying and stimulating their adjacent urban
spaces.
Abstract: Software maintenance is an extremely important activity in the software development life cycle. It involves a great deal of human effort, cost and time. Software maintenance may be subdivided into different activities such as fault prediction, fault detection, fault prevention and fault correction. This topic has gained substantial attention due to sophisticated and complex applications, commercial hardware, clustered architectures and artificial intelligence. In this paper we survey the work done in the field of software maintenance. Software fault prediction has been studied in the context of fault-prone modules, self-healing systems, developer information, maintenance models, etc. Much remains to be explored in the field of fault severity, such as modeling and weighting the impact of different kinds of faults in various types of software systems.
Abstract: To reduce accidents in industry, sensor data from WSNs
(Wireless Sensor Networks) is used. WSN sensor data requires
persistence and continuity; therefore, we design and exploit a buffer
management system with these properties to avoid data-delivery
conflicts. To develop the modules, we use multiple buffers and design
buffer management modules that transfer sensor data through
context-aware methods.
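The abstract does not specify the buffer design, so the following is only a hypothetical sketch of how a per-sensor module could combine continuity and persistence: a bounded ring buffer keeps the newest readings, while a journal records everything handed to the delivery layer.

```python
from collections import deque

class SensorBuffer:
    """Hypothetical per-sensor buffer module: a bounded ring buffer keeps
    only the newest readings (continuity) while a journal records every
    reading handed to the delivery layer (persistence). Producers append
    and consumers drain distinct structures, avoiding delivery conflicts."""

    def __init__(self, capacity=8):
        self.ring = deque(maxlen=capacity)  # newest readings only
        self.journal = []                   # delivered history

    def push(self, reading):
        """Producer side: the oldest reading is dropped when full."""
        self.ring.append(reading)

    def drain(self):
        """Consumer side: hand over all buffered readings in order."""
        out = list(self.ring)
        self.ring.clear()
        self.journal.extend(out)
        return out

buf = SensorBuffer(capacity=8)
for reading in range(1, 11):
    buf.push(reading)
```

Separating the producer-facing ring from the consumer-facing journal is one simple way to keep readers and writers off the same slots, which is the conflict the abstract alludes to.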
Abstract: Most systems deal with time-varying signals. Power
efficiency can be achieved by adapting the system activity to the
input signal variations. In this context,
an adaptive-rate filtering technique based on level-crossing sampling
is devised. It adapts the sampling frequency and the filter order
by following the input signal local variations. Thus, it correlates the
processing activity with the signal variations. Interpolation is required
in the proposed technique. A drastic reduction in the interpolation
error is achieved by employing the symmetry during the interpolation
process. Processing error of the proposed technique is
calculated. The computational complexity of the proposed filtering
technique is deduced and compared to the classical one. Results
promise a significant gain of the computational efficiency and hence
of the power consumption.
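The sampling side of the technique can be illustrated with a basic level-crossing sampler, in which a sample is captured only when the signal departs from the last captured level by a quantum delta; this toy omits the adaptive filter-order selection and interpolation described in the abstract.

```python
def level_crossing_sample(signal, delta=0.5):
    """Toy level-crossing sampler: a new sample is captured only when the
    signal moves at least `delta` away from the last captured level, so
    the sampling activity follows the signal's local variations."""
    samples = [(0, signal[0])]
    last = signal[0]
    for n, x in enumerate(signal[1:], start=1):
        if abs(x - last) >= delta:
            samples.append((n, x))
            last = x
    return samples

flat = [0.0] * 20                     # an idle signal ...
ramp = [0.3 * n for n in range(20)]   # ... and a varying one
idle_samples = level_crossing_sample(flat)
busy_samples = level_crossing_sample(ramp)
```

An idle stretch produces no new samples at all, which is exactly how processing activity, and hence power, tracks the signal's local variations.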
Abstract: The values of managers and employees in organizations are phenomena that have captured the interest of researchers at large. Despite this attention, there continues to be a lack of agreement on what values are, how they influence individuals, and how they are constituted in individuals' minds. In this article, a content-based approach is presented as an alternative frame of reference for exploring values. In the content-based approach, human thinking in different contexts is the focal point. Differences in valuations can be explained through the information contents of mental representations. In addition to the information contents, attention is devoted to the cognitive processes through which mental representations of values are constructed. Such informational contents play a decisive role in understanding human behavior. By applying content-based analysis to an examination of values as mental representations, it is possible to reach deeper into the motivational foundation of behaviors, such as decision making in organizational procedures, through understanding the structure and meanings of the specific values at play.
Abstract: The purpose of this paper is to investigate the
influence of a number of variables on the conditional mean and
conditional variance of credit spread changes. The empirical analysis
in this paper is conducted within the context of bivariate GARCH-in-
Mean models, using the so-called BEKK parameterization. We show
that credit spread changes are determined by interest-rate and
equity-return variables, which is in line with theory as provided by the
structural models of default. We also identify the credit spread
change volatility as an important determinant of credit spread
changes, and provide evidence on the transmission of volatility
between the variables under study.
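The covariance recursion behind such models is the BEKK(1,1) update H_t = C'C + A'e_{t-1}e'_{t-1}A + B'H_{t-1}B. A bivariate sketch with made-up parameter matrices (not the paper's estimates) shows one step of the recursion:

```python
def mmul(a, b):
    """2x2 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def tr(a):
    """2x2 transpose."""
    return [[a[j][i] for j in range(2)] for i in range(2)]

def madd(*ms):
    """Entry-wise sum of 2x2 matrices."""
    return [[sum(m[i][j] for m in ms) for j in range(2)] for i in range(2)]

def bekk_step(h_prev, eps, c, a, b):
    """One step of the bivariate BEKK(1,1) covariance recursion
        H_t = C'C + A' e_{t-1} e_{t-1}' A + B' H_{t-1} B,
    with C lower triangular. The quadratic forms keep H_t symmetric
    positive definite by construction."""
    ee = [[eps[i] * eps[j] for j in range(2)] for i in range(2)]
    return madd(mmul(tr(c), c),
                mmul(mmul(tr(a), ee), a),
                mmul(mmul(tr(b), h_prev), b))

# made-up parameters: C lower triangular, diagonal ARCH/GARCH loadings
C = [[0.1, 0.0], [0.05, 0.1]]
A = [[0.3, 0.0], [0.0, 0.3]]
B = [[0.9, 0.0], [0.0, 0.9]]
H1 = bekk_step([[0.01, 0.0], [0.0, 0.01]], [0.1, -0.2], C, A, B)
```

The appeal of the BEKK parameterization, as used in the paper, is exactly this built-in positive definiteness of the conditional covariance matrix, with the off-diagonal terms capturing volatility transmission between the variables.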
Abstract: This paper presents an investigation of how exploiting multiple transmit antennas in OFDM-based wireless LAN subscribers can mitigate the physical-layer error rate. By comparing wireless LANs that utilize spatial diversity techniques with conventional ones, it reveals how PHY and TCP throughput behaviors are improved. It then assesses the same issues in a cellular operation context, which is introduced as an innovative solution that, besides a multi-cell operation scenario, benefits from spatio-temporal signaling schemes as well. The presented simulations shed light on the improved performance of the wide-range, high-quality wireless LAN services provided by the proposed approach.
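One concrete member of the spatio-temporal signaling family is the Alamouti 2x1 space-time block code; the sketch below (an illustration, not necessarily the scheme simulated in the paper) shows how linear combining recovers both symbols scaled by |h1|^2 + |h2|^2, which is the source of the diversity gain.

```python
def alamouti_detect(s1, s2, h1, h2, n1=0j, n2=0j):
    """Alamouti 2x1 space-time block code: two symbols are sent from two
    antennas over two slots (slot 1: s1, s2; slot 2: -s2*, s1*) and
    linearly combined so each symbol sees both channel paths, giving the
    diversity gain |h1|^2 + |h2|^2. Channels h1, h2 are assumed flat and
    constant over the two slots."""
    r1 = h1 * s1 + h2 * s2 + n1                           # slot-1 receive
    r2 = -h1 * s2.conjugate() + h2 * s1.conjugate() + n2  # slot-2 receive
    g = abs(h1) ** 2 + abs(h2) ** 2
    s1_hat = (h1.conjugate() * r1 + h2 * r2.conjugate()) / g
    s2_hat = (h2.conjugate() * r1 - h1 * r2.conjugate()) / g
    return s1_hat, s2_hat

# noiseless check with made-up channel gains and QPSK-like symbols
e1, e2 = alamouti_detect(1 + 1j, -1 + 1j, 0.8 + 0.3j, -0.5 + 0.9j)
```

Because a deep fade on one antenna path is compensated by the other, the combined SNR fluctuates far less, which is what lowers the PHY error rate and, in turn, improves TCP throughput.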
Abstract: Speckled images arise when coherent microwave,
optical, and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar
systems, and medical ultrasound systems. Speckle noise is a form of object or target induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted
by speckle noise is complicated by the nature of the noise and is not
as straightforward as detection and estimation in additive noise. In
this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The
motivation for this work is the problem of segmentation and tissue
classification using ultrasound imaging. Modeling of speckle in this
context involves a partially developed speckle model in which an
underlying Poisson point process modulates a Gram-Charlier series
of Laguerre-weighted exponential functions, resulting in a doubly
stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form.
It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an
exponential distribution. This is consistent with fully developed speckle noise as demonstrated by the Central Limit theorem.
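The limiting behavior stated at the end, intensity statistics approaching an exponential law as the mean number of scatterers grows, can be checked with a simple random-phasor-sum simulation (a generic speckle sketch, not the paper's doubly stochastic filtered Poisson model).

```python
import math
import random

def speckle_intensity(n_scatterers, trials=20000, seed=7):
    """Random-phasor-sum sketch of speckle: each resolution cell sums
    n_scatterers unit phasors with uniform random phases; intensity is
    the squared magnitude of the sum."""
    rng = random.Random(seed)
    intensities = []
    for _ in range(trials):
        re = im = 0.0
        for _ in range(n_scatterers):
            phi = rng.uniform(0.0, 2.0 * math.pi)
            re += math.cos(phi)
            im += math.sin(phi)
        intensities.append(re * re + im * im)
    return intensities

I = speckle_intensity(50)
mean = sum(I) / len(I)
std = (sum((x - mean) ** 2 for x in I) / len(I)) ** 0.5
# for an exponential intensity law (fully developed speckle) std == mean
```

The unit standard-deviation-to-mean ratio is the familiar "speckle SNR of 1" of fully developed speckle, consistent with the Central Limit Theorem argument in the abstract.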
Abstract: Environmental decision making, particularly about
hazardous waste management, is inherently exposed to a high
potential conflict, principally because of the trade-off between sociopolitical,
environmental, health and economic factors. The need to
plan complex contexts has led to an increasing request for decision
analytic techniques as support for the decision process. In this work,
alternative systems of asbestos-containing waste management
(ACW) in Puglia (Southern Italy) were explored by a multi-criteria
decision analysis. In particular, five management alternatives were
compared and ranked through the Analytic Hierarchy Process according
to their performance and efficiency, taking into account
environmental, health and socio-economic aspects. Separate valuations
were performed for different temporal scales. For the short term, the
results showed a narrow deviation between the disposal alternatives
“mono-material landfill in public quarry” and “dedicated cells in
existing landfill”, with the first showing the best performance. For
the long term, “treatment plant to eliminate hazard from
asbestos-containing waste” prevailed, despite the high energy demand
required to achieve the change of crystalline structure. A comparison
with results from a participative approach to the valuation process
might be considered as a future development of the method's
application to ACW management.
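The AHP ranking step can be sketched with the standard row geometric-mean approximation to the principal eigenvector of a reciprocal pairwise-comparison matrix; the matrix below is a hypothetical 3-criteria example, not the study's judgments.

```python
import math

def ahp_priorities(m):
    """Row geometric-mean approximation to the AHP priority vector
    (a standard stand-in for the principal eigenvector of a reciprocal
    pairwise-comparison matrix). Returns weights that sum to 1."""
    n = len(m)
    gm = [math.prod(row) ** (1.0 / n) for row in m]
    total = sum(gm)
    return [g / total for g in gm]

# hypothetical 3-criteria example (not the study's data):
# criterion 1 weakly preferred to 2 and strongly preferred to 3
pairwise = [[1.0, 2.0, 5.0],
            [0.5, 1.0, 3.0],
            [0.2, 1.0 / 3.0, 1.0]]
w = ahp_priorities(pairwise)
```

In a full AHP study such as this one, a weight vector is derived per criterion and per alternative and then aggregated up the hierarchy, with a consistency check on each judgment matrix.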
Abstract: With the proliferation of Weblog (blog) use in
educational contexts, gaining a better understanding of why
students are willing to utilize blog systems has become an
important topic for practitioners and academics. While perceived
enjoyment has been found to have a significant influence on
behavioral intentions to use blogs or hedonic systems, few studies
have investigated the antecedents of perceived enjoyment in the
acceptance of blogging. The main purpose of the present study is to
explore the individual difference antecedents of perceived
enjoyment and examine how they influence behavioral intention to
blog through the mediation of perceived enjoyment. Based on the
previous literature, the Big Five personality traits (i.e.,
extraversion, agreeableness, conscientiousness, neuroticism, and
openness to experience), as well as computer self-efficacy and
personal innovativeness in information technology (PIIT), are
hypothesized as potential antecedents of perceived enjoyment in
the acceptance of blogging. Data collected from 358 respondents in
Taiwan are tested against the research model using the structural
equation modeling approach. The results indicate that extraversion,
agreeableness, conscientiousness, and PIIT have a significant
influence on perceived enjoyment, which in turn significantly
influences the behavioral intention to blog. These findings lead to
several important implications for future research.