Abstract: Collateralized Debt Obligations are not as widely used
nowadays as they were before the 2007 subprime crisis. Nonetheless,
there remains an enthralling challenge to optimize cash flows
associated with synthetic CDOs. A Gaussian-based model is used
here in which default correlation and unconditional probabilities of
default are highlighted. Then numerous simulations are performed
based on this model for different scenarios in order to evaluate the
associated cash flows given a specific number of defaults at different
periods of time. Cash flows are not solely calculated on a single
bought or sold tranche but rather on a combination of bought and
sold tranches. Under some assumptions, the simplex algorithm gives
a way to find the maximum cash flow as a function of default
correlation and maturities. The Gaussian model used is not realistic
in crisis situations. Moreover, the present system does not handle
buying or selling a portion of a tranche, only whole tranches.
Nevertheless, the work provides the investor with relevant guidance
on what to buy or sell, and when.
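As an illustration of the optimization step, the following is a minimal sketch (with made-up cash flow figures and a hypothetical budget constraint, not the paper's data) of selecting tranches to maximize the simulated expected cash flow with the simplex method, here via SciPy's linear programming routine:

    from scipy.optimize import linprog

    # Hypothetical expected cash flows per tranche (equity, mezzanine,
    # senior), as would be estimated by simulation.
    expected_cf = [0.8, 0.3, 0.1]
    c = [-v for v in expected_cf]          # linprog minimizes, so negate
    # Assumed budget constraint: hold at most two whole tranches; each
    # x_i in [0, 1], where the paper's whole-tranche rule makes x_i binary.
    res = linprog(c, A_ub=[[1, 1, 1]], b_ub=[2],
                  bounds=[(0, 1)] * 3, method="highs")
    print(res.x, -res.fun)                 # chosen tranches, max cash flow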
Abstract: Semiconductor detector arrays are widely used in
high-temperature plasma diagnostics. They have a fast response,
which allows observation of many processes and instabilities in
tokamaks. This paper reviews several diagnostics based on
semiconductor arrays recently installed on the COMPASS tokamak:
cameras, AXUV photodiodes (often referred to as fast "bolometers"),
and detectors of both soft X-rays and visible light. First results
from the spring and summer 2012 campaigns are presented. Examples of
the detectors' use are shown for plasma shape determination, fast
calculation of the radiation center, two-dimensional plasma radiation
tomography in different spectral ranges, observation of impurity
inflow, and investigation of MHD activity in COMPASS tokamak
discharges.
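As a toy illustration of the "fast calculation of the radiation center" mentioned above, a signal-weighted centroid over detector channels can be computed in a few lines (channel signals and viewing positions below are made up):

    import numpy as np

    signals = np.array([0.1, 0.4, 1.2, 0.9, 0.3])    # AXUV channel signals (made up)
    chord_z = np.array([-0.2, -0.1, 0.0, 0.1, 0.2])  # vertical positions viewed (m)
    z_center = np.sum(signals * chord_z) / np.sum(signals)
    print(f"radiation center z ≈ {z_center:.3f} m")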
Abstract: Variations in the growth rate constant of the Listeria
monocytogenes bacterial species were determined at 37°C in
irradiated environments and compared with those in a non-irradiated
environment. The bacterial cells, contained in a suspension
made of a Brain Heart Infusion nutrient solution, were made to
grow at different frequency (2.30–2.60 GHz) and power (0–400
mW) values in a plug flow reactor positioned in the irradiated
environment. The reacting suspension was then made to pass into a
cylindrical cuvette where its optical density was read every 2.5
minutes at a wavelength of 600 nm. The obtained experimental data
of optical density vs. time allowed the bacterial growth rate constant
to be derived; this was found to be slightly influenced by microwave
power, but not by microwave frequency; in particular, a minimum
value was found for powers in the 50–150 mW range.
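For clarity, here is a minimal sketch (with synthetic readings, not the paper's data) of how a growth rate constant is derived from optical density vs. time: during exponential growth OD(t) = OD0·exp(k·t), so k is the slope of ln(OD) against time:

    import numpy as np

    t = np.arange(0, 30, 2.5)                 # minutes, one reading per 2.5 min
    od = 0.05 * np.exp(0.08 * t)              # hypothetical OD600 readings
    k, ln_od0 = np.polyfit(t, np.log(od), 1)  # linear fit of ln(OD) vs t
    print(f"growth rate constant k ≈ {k:.3f} per minute")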
Abstract: A number of studies have highlighted problems related to
ERP systems, yet most of these studies focus on problems during
the project and implementation stages rather than during the
post-implementation use process. Problems encountered in the process of
using ERP would hinder the effective exploitation and the extended
and continued use of ERP systems and their value to organisations.
This paper investigates the different types of problems that users
(operational, supervisory, and managerial) face in using ERP and
how 'feral systems' are used as coping mechanisms. The paper
adopts a qualitative method and uses data collected from two cases
and 26 interviews to inductively develop a causal network model of
ERP usage problems and their coping mechanisms. The model classifies
post-implementation ERP usage problems into data quality, system
quality, interface, and infrastructure problems. It also categorises
the different coping mechanisms involving 'feral systems', including
feral information systems, feral data, and feral use of technology.
Abstract: Aspect Oriented Programming promises many
advantages at the programming level by encapsulating crosscutting
concerns in separate units called aspects. Join points are
distinguishing features of Aspect Oriented Programming, as they
define the points where core requirements and crosscutting concerns
are (inter)connected. Currently, there is a problem with composing
multiple aspects at the same join point, which introduces issues such
as ordering and control of these superimposed aspects. Dynamic
strategies are required to handle these issues as early as possible.
The state chart is an effective modeling tool for capturing dynamic
behavior at the high-level design stage. This paper provides a
methodology for formulating strategies for multiple-aspect composition
at a high level, which helps to implement these strategies better at
the coding level. It also highlights the need to design shared join
points at a high level by providing solutions to these issues using
state chart diagrams in UML 2.0. A high-level design representation of
shared join points also helps to implement the designed strategy in a
systematic way.
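To make the ordering issue concrete, here is a minimal sketch in Python (an illustration only; the paper works at the UML/state chart level, not in a particular language) of two aspects superimposed on one join point, where the composition order is exactly the strategy being designed:

    import functools

    def logging_aspect(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            print(f"log: entering {fn.__name__}")
            return fn(*args, **kwargs)
        return wrapper

    def auth_aspect(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            print("auth: checking permissions")
            return fn(*args, **kwargs)
        return wrapper

    # The decorator order is the ordering strategy: auth runs before
    # logging here; swapping the two lines changes the behavior at the
    # shared join point.
    @auth_aspect
    @logging_aspect
    def transfer_funds(amount):
        print(f"transferring {amount}")

    transfer_funds(100)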
Abstract: Visible Light Communication (VLC) offers the advantages of low energy consumption and licence-free, RF-interference-free operation. One application area for VLC is the provision of health-centred services, circumventing issues of interference with biomedical devices within the environment. VLC performance is affected by natural light, restricting system availability and reliability. This paper presents an analysis of the performance of VLC systems under different meteorological conditions. The evaluation considers the impact of natural light as a function of different reflection surfaces in different room sizes.
Abstract: This paper proposes a new approach to the problem
of real-time face detection. The proposed method combines the
primitive Haar-like feature with a variance value to construct a new
feature, the so-called Variance-based Haar-like feature. A face in an
image can be represented with a small number of features using this
new feature. We use SVM instead of AdaBoost for training and
classification. We built a database containing 5,000 face samples
and 10,000 non-face samples extracted from real images for learning
purposes. The 5,000 face samples include many images taken under
widely differing lighting conditions. Experiments showed that a
face detection system using the Variance-based Haar-like feature and
SVM can be much more efficient than one using the
primitive Haar-like feature and AdaBoost. We tested our method on
two face databases and one non-face database, obtaining a correct
detection rate of 96.17% on the YaleB face database, 4.21% higher
than that achieved with the primitive Haar-like feature and AdaBoost.
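As one reading of the mechanics involved (the details here are our assumptions, not the authors' code), window variance can be computed in constant time per window from two integral images, which is what makes a variance-based Haar-like feature cheap enough for real-time detection: Var = E[x²] − E[x]².

    import numpy as np

    def integral(img):
        return img.cumsum(axis=0).cumsum(axis=1)

    def window_sum(ii, r0, c0, r1, c1):
        """Sum of img[r0:r1, c0:c1] from an integral image, O(1) per window."""
        total = ii[r1 - 1, c1 - 1]
        if r0 > 0: total -= ii[r0 - 1, c1 - 1]
        if c0 > 0: total -= ii[r1 - 1, c0 - 1]
        if r0 > 0 and c0 > 0: total += ii[r0 - 1, c0 - 1]
        return total

    img = np.random.rand(24, 24)            # hypothetical 24x24 detection window
    ii, ii2 = integral(img), integral(img ** 2)
    n = 24 * 24
    mean = window_sum(ii, 0, 0, 24, 24) / n
    var = window_sum(ii2, 0, 0, 24, 24) / n - mean ** 2
    print(f"window variance ≈ {var:.4f}")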
Abstract: The importance of nurturing, accumulating, and efficiently deploying knowledge resources through formal structures and organisational mechanisms is well understood. Recent trends in knowledge management (KM) highlight that the effective creation and transfer of knowledge can also rely upon extra-organisational channels, such as informal networks. A perception exists that the role of informal networks in knowledge creation and performance has been underestimated in the organisational context. The literature indicates that many managers fail to comprehend and successfully exploit the potential of informal networks to create value for their organisations. This paper investigates: 1) whether managers share work-specific knowledge with informal contacts within and outside organisational boundaries; and 2) how important they consider this knowledge collaboration to be for their learning and work outcomes.
Abstract: Nowadays, people are becoming more and more mobile, both in terms of devices and associated applications. Moreover, the services these devices offer are becoming broader and much more complex. Even though today's handheld devices have considerable computing power, their contexts of use differ. These contexts are affected by connection availability, the high latency of wireless networks, battery life, screen size, on-screen or hardware keyboards, etc. Consequently, the development of mobile applications and their associated mobile Web services, if any, should follow a concise methodology so that they provide a high quality of service. The aim of this paper is to highlight and discuss the main issues to consider when developing mobile applications and mobile Web services, and then to propose a framework that leads developers through different steps and modules toward the development of efficient and secure mobile applications. First, the different challenges in developing such applications are elicited and discussed in depth. Second, a development framework is presented, with different modules addressing each of these challenges. Third, the paper presents an example of a mobile application, Eivom Cinema Guide, which benefits from following our development framework.
Abstract: The increasing complexity of software development based on peer-to-peer networks makes the creation of new frameworks necessary in order to simplify the developer's task. Additionally, some applications, e.g. fire detection or security alarms, may require real-time constraints, and a high-level definition of these features eases application development. In this paper, a service model based on a component model with real-time features is proposed. The high-level model abstracts developers from implementation tasks such as discovery, communication, security, and real-time requirements. The model is oriented towards deploying services on small mobile devices, such as sensors, mobile phones, and PDAs, where computation is lightweight. Services can be composed by means of the port concept to form complex ad-hoc systems, and their implementation is carried out using a component language called UM-RTCOM. To illustrate our proposals, a fire detection application is described.
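A minimal sketch of the port concept (illustrative only; UM-RTCOM syntax is not shown in the abstract): components expose ports, and composition connects an output port to an input port, here in the spirit of the fire detection example:

    class Port:
        def __init__(self):
            self.target = None
        def connect(self, handler):
            self.target = handler
        def send(self, msg):
            if self.target:
                self.target(msg)

    class SmokeSensor:
        def __init__(self):
            self.alarm_out = Port()           # output port
        def read(self, level):
            if level > 0.7:                   # threshold is an assumption
                self.alarm_out.send(f"smoke level {level}")

    class AlarmService:
        def alarm_in(self, msg):              # input port handler
            print(f"ALARM: {msg}")

    sensor, alarm = SmokeSensor(), AlarmService()
    sensor.alarm_out.connect(alarm.alarm_in)  # composition via ports
    sensor.read(0.9)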
Abstract: Complexity, as a theoretical background has made it
easier to understand and explain the features and dynamic behavior
of various complex systems. As the common theoretical background
has confirmed, borrowing the terminology for design from the
natural sciences has helped to control and understand urban
complexity. Phenomena like self-organization, evolution and
adaptation are appropriate to describe the formerly inaccessible
characteristics of the complex environment in unpredictable bottom-up
systems. Increased computing capacity has been a key element in
capturing the chaotic nature of these systems.
A paradigm shift in urban planning and architectural design has
forced us to give up the illusion of total control in the urban
environment, and consequently to seek novel methods for
steering development. New methods using dynamic modeling
have offered a real option for more thorough understanding of
complexity and urban processes. At best, new approaches may renew
design processes so that we get a better grip on the complex
world via more flexible processes, support urban environmental
diversity, and respond to our needs beyond basic welfare by
liberating ourselves from standardized minimalism.
A complex system and its features are as such beyond human
ethics. Self-organization or evolution is neither good nor bad. Their
mechanisms are by nature devoid of reason. They are common in
urban dynamics and natural processes alike. They are features
of a complex system, and they cannot be prevented. Yet their
dynamics can be studied and supported.
The paradigm of complexity and new design approaches have been
criticized for a lack of humanity and morality, but the ethical
implications of scientific or computational design processes have not
been much discussed. It is important to distinguish the (unexciting)
ethics of the theory and tools from the ethics of computer-aided
processes based on ethical decisions. Urban planning and architecture
cannot be based on the survival of the fittest; however, the natural
dynamics of the system cannot be impeded on the grounds of being
"non-human".
In this paper the ethical challenges of using dynamic models
are contemplated in light of a few examples of new architecture,
dynamic urban models, and the literature. It is suggested that the
ethical challenges in computational design processes could be
reframed under the concepts of responsibility and transparency.
Abstract: To improve the classification rate of face
recognition, a combination of features and a novel non-linear kernel
are proposed. The feature vector concatenates local binary patterns
at three different radii and Gabor wavelet features. The Gabor
features are the mean, standard deviation, and skew for each scale
and orientation parameter. The aim of the new kernel is to combine
the power of kernel methods with an optimal balance between the
features. To verify the effectiveness of the proposed method,
numerous methods are tested on four datasets covering various
emotions, orientations, configurations, expressions, and lighting
conditions. Empirical results show the superiority of the proposed
technique compared with other methods.
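A minimal sketch (radii, scales, and orientations below are assumptions, not the paper's exact settings) of building the combined LBP-plus-Gabor feature vector described above:

    import numpy as np
    from scipy.stats import skew
    from skimage.feature import local_binary_pattern
    from skimage.filters import gabor

    def combined_features(gray):
        feats = []
        # LBP histograms at three radii (radii/points are assumptions)
        for radius in (1, 2, 3):
            n_points = 8 * radius
            lbp = local_binary_pattern(gray, n_points, radius, method="uniform")
            hist, _ = np.histogram(lbp, bins=n_points + 2,
                                   range=(0, n_points + 2), density=True)
            feats.append(hist)
        # Gabor statistics (mean, std, skew) per scale and orientation
        for frequency in (0.1, 0.2, 0.3, 0.4):           # scales (assumed)
            for theta in np.arange(0, np.pi, np.pi / 8):  # 8 orientations (assumed)
                real, imag = gabor(gray, frequency=frequency, theta=theta)
                mag = np.hypot(real, imag)
                feats.append([mag.mean(), mag.std(), skew(mag.ravel())])
        return np.concatenate([np.asarray(f).ravel() for f in feats])

    vec = combined_features((np.random.rand(64, 64) * 255).astype(np.uint8))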
Abstract: The technical realization of data transmission over
glass fiber began after the development of the diode laser in 1962.
Erbium-doped fiber amplifiers (EDFAs) in high-speed networks
allow information to be transmitted over longer distances without
the use of signal-amplifying repeaters. These fibers are
doped with erbium atoms, whose atomic energy levels are suited to
amplifying light at 1550 nm. When a signal wave at 1550 nm enters
the erbium fiber, the light stimulates the excited erbium atoms,
which are pumped with an additional laser beam at 980 nm. The
wavelength and intensity of semiconductor lasers depend on the
temperature of the active zone and on the injection current.
The present paper shows the effect of the diode laser temperature
and injection current on the optical amplification. From the
measured input and output powers one may calculate the maximum
optical gain of the erbium-doped fiber amplifier.
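The gain calculation named in the last sentence reduces to G[dB] = 10·log10(P_out / P_in); a minimal sketch with made-up powers:

    import math

    p_in_mw, p_out_mw = 0.01, 1.5   # made-up input/output signal powers (mW)
    gain_db = 10 * math.log10(p_out_mw / p_in_mw)
    print(f"optical gain ≈ {gain_db:.1f} dB")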
Abstract: The need to update the inputs of numerical models, owing to geometrical and resistive variations in rivers subject to solid transport phenomena, requires detailed control and monitoring activities. The human and financial resources these activities demand move the research towards the development of expeditious methodologies able to evaluate the outflows through the measurement of more easily acquired quantities. Recent studies have highlighted the dependence of the entropic parameter on the kinematic and geometric flow conditions, showing meaningful variability according to section shape, dimension, and slope. Such dependences, even if not yet well defined, could reduce the difficulties of field activities as well as the data elaboration time. On the basis of such evidence, the relationships between the entropic parameter and the geometrical and resistive quantities, obtained through a large and detailed laboratory campaign on steady free-surface flows under conditions of macro and intermediate homogeneous roughness, are analyzed and discussed.
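For context, a minimal sketch of the entropy-based velocity relation this literature usually builds on (Chiu's formulation; that this is the paper's exact relation is an assumption): the entropic parameter M links mean and maximum velocity via u_mean/u_max = e^M/(e^M − 1) − 1/M.

    import math

    def mean_velocity(u_max, M):
        """Mean cross-sectional velocity from Chiu's entropic parameter M."""
        return u_max * (math.exp(M) / (math.exp(M) - 1.0) - 1.0 / M)

    print(mean_velocity(u_max=1.2, M=2.1))   # hypothetical flume values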
Abstract: As the research conducted for the project and article
"European Ecological Network Natura 2000 – opportunities and
threats" indicates, since Natura 2000 sites constitute a form of
environmental protection, several legal problems are likely to result. Most
controversially, certain sites will be subject to two regimes of
protection: as national parks and as Natura 2000 sites. This dualism
of the legal regulation makes it difficult to perform certain legal
obligations related to the regimes envisaged under each form of
environmental protection. Which regime and which obligations
resulting from the particular form of environmental protection have
priority and should prevail? What should be done if these obligations
are contradictory? Furthermore, there is an institutional problem:
no public administration authority has the power to resolve legal
conflicts concerning the application of a particular regime on a given
site. There are also no criteria for deciding the priority and superiority of
one form of environmental protection over the other. Which
regulations are more important, those that pertain to national parks or
to Natura 2000 sites? In the light of the current regulations, it is
impossible to give a decisive answer to these questions. The internal
hierarchy of forms of environmental protection has not been
determined, and all such forms should be treated equally.
Abstract: The distribution, enrichment, and accumulation of zinc
(Zn) in the sediments of Kaohsiung Ocean Disposal Site (KODS),
Taiwan, were investigated. Sediment samples from two stations outside
the disposal site and nine stations within the disposal area of the
KODS were collected quarterly in 2009 and characterized for Zn,
aluminum, organic matter, and grain size. Results showed that the mean
Zn concentrations varied from 48 mg/kg to 456 mg/kg. Results from the
enrichment factor (EF) and geo-accumulation index (Igeo) analyses
imply that the sediments collected from the KODS exhibit moderate to
moderately severe enrichment and none to medium accumulation of Zn,
respectively. However, the potential ecological risk index indicates
that the sediment poses a low potential ecological risk. The EF, Igeo,
and Zn concentrations at the disposal stations were slightly higher
than those outside the disposal site, indicating that the centers of
the disposal area may be subject to the impact of disposed
harbor-dredged sediments.
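For reference, a minimal sketch of the two indices used (the aluminum and background values below are assumptions, not the paper's data): EF = (Zn/Al)_sample / (Zn/Al)_background, and Igeo = log2(C / (1.5·B)).

    import math

    zn_sample, al_sample = 456.0, 66000.0  # mg/kg (Zn from the abstract; Al assumed)
    zn_bg, al_bg = 80.0, 72000.0           # hypothetical background values

    ef = (zn_sample / al_sample) / (zn_bg / al_bg)     # enrichment factor
    igeo = math.log2(zn_sample / (1.5 * zn_bg))        # geo-accumulation index
    print(f"EF ≈ {ef:.2f}, Igeo ≈ {igeo:.2f}")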
Abstract: HSDPA is a new feature introduced in the
Release 5 specifications of the 3GPP WCDMA/UTRA standard to
realize higher data rates together with lower round-trip times.
Moreover, the HSDPA concept offers an outstanding improvement in
packet throughput and also significantly reduces the packet call
transfer delay compared with the Release 99 DSCH. Until now, the
HSDPA system has used turbo coding, a coding technique that closely
approaches the Shannon limit. However, the main drawbacks of turbo
coding are high decoding complexity and high latency, which make
it unsuitable for some applications like satellite communications,
where the transmission distance itself introduces latency due to the
limited speed of light. Hence, this paper proposes to use LDPC
coding in place of turbo coding for the HSDPA system, which decreases
the latency and decoding complexity. LDPC coding does, however,
increase the encoding complexity: although the complexity of the
transmitter at the Node B increases, the end user benefits in terms of
receiver complexity and bit error rate. In this paper, the LDPC
encoder is implemented using a sparse parity-check matrix H to
generate the codeword, and the belief propagation algorithm is used
for LDPC decoding. Simulation results show that with LDPC coding the
BER drops sharply as the number of iterations increases, for a small
increase in Eb/No, which is not possible with turbo coding. The same
BER was also achieved with fewer iterations, so the latency and
receiver complexity are reduced with LDPC coding. HSDPA increases the
downlink data rate within a cell to a theoretical maximum of 14 Mbps,
with 2 Mbps on the uplink. The changes that HSDPA enables include
better-quality, more reliable, and more robust data services. In other
words, while realistic data rates are only a few Mbps, the actual
quality and number of users achieved will improve significantly.
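As a toy illustration of iterative LDPC decoding with a sparse parity-check matrix H (the paper uses belief propagation; for brevity this sketch substitutes the simpler hard-decision bit-flipping decoder, and H is a toy matrix, not an HSDPA-sized one):

    import numpy as np

    H = np.array([[1, 1, 0, 1, 0, 0],
                  [0, 1, 1, 0, 1, 0],
                  [1, 0, 0, 0, 1, 1],
                  [0, 0, 1, 1, 0, 1]])

    def bit_flip_decode(r, H, max_iter=20):
        """Each iteration, flip the bits involved in the most failed checks."""
        c = r.copy()
        for _ in range(max_iter):
            syndrome = H @ c % 2
            if not syndrome.any():
                return c                    # all parity checks satisfied
            fails = H.T @ syndrome          # failed-check count per bit
            c[fails == fails.max()] ^= 1    # flip the worst offenders
        return c

    received = np.array([1, 0, 1, 1, 0, 1])  # toy received hard decisions
    print(bit_flip_decode(received, H))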
Abstract: In this work we study the reflection of circularly
polarised light from a nano-structured biological material found in
the exocuticle of scarab beetles. This material consists of a
left-handed helicoidal stack of ultra-thin (~5 nm) uniaxial layers,
which resonantly reflects circularly polarized light.
A chirp in the layer thickness combined with a finite absorption
coefficient produces a broad, smooth reflectance spectrum. By
comparing model calculations and electron microscopy with the
measured spectra, we can explain our observations and quantify the
most relevant structural parameters.
Abstract: The hydrolysis kinetics of polycrystalline lithium hydride (LiH) in argon at various low humidities were measured by gravimetry and Raman spectroscopy, with the ambient water concentration ranging from 200 to 1200 ppm. The results showed that the LiH hydrolysis curve has a paralinear shape, attributed to two different reaction stages forming different products, as explained by the 'Layer Diffusion Control' model. Based on this model, a novel two-stage rate equation for LiH hydrolysis reactions was developed and used to fit the experimental data in order to determine the steady-state Li2O thickness Hs and the ultimate hydrolysis rate vs. The fitted data showed a rise in Hs as the ambient water concentration cw increased. However, in spite of the negative effect of the increase in Hs, the upward trend of vs remained, which implies that the water concentration, rather than the Li2O thickness, plays the predominant role in LiH hydrolysis kinetics. In addition, the proportional relationship between vs·Hs and cw, predicted by the rate equation and confirmed by the gravimetric data, validates the model under such conditions.
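A minimal sketch (an assumed Tedmon-type functional form, not the authors' exact rate equation) of the paralinear behavior described: the Li2O layer grows diffusion-limited while being consumed at a constant rate, dH/dt = kp/H − vs, so it approaches a steady thickness Hs = kp/vs while the overall reaction settles to the ultimate linear rate vs.

    import numpy as np
    from scipy.integrate import solve_ivp

    kp, vs = 0.5, 0.05                        # hypothetical rate constants
    rhs = lambda _, y: [kp / y[0] - vs]       # layer-thickness ODE
    sol = solve_ivp(rhs, (0.0, 400.0), [0.01],
                    t_eval=np.linspace(0.1, 400, 200))
    print(f"steady thickness Hs = kp/vs = {kp / vs:.1f}; "
          f"final simulated H = {sol.y[0][-1]:.2f}")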
Abstract: Decreases in hardware costs and advances in computer
networking technologies have led to increased interest in the use of
large-scale parallel and distributed computing systems. One of the
biggest issues in such systems is the development of effective
techniques/algorithms for the distribution of the processes/load of a
parallel program on multiple hosts to achieve goal(s) such as
minimizing execution time, minimizing communication delays,
maximizing resource utilization and maximizing throughput.
Substantial research using queuing analysis, and assuming job
arrivals follow a Poisson pattern, has shown that in a multi-host
system the probability of one host being idle while another host
has multiple jobs queued up can be very high. Such imbalances in
system load suggest that performance can be improved either by
transferring jobs from the currently heavily loaded hosts to the
lightly loaded ones or by distributing load evenly/fairly among the
hosts. The algorithms that achieve these goals are known as load
balancing algorithms. They come in two basic categories:
static and dynamic. Whereas static load balancing (SLB) algorithms
take decisions regarding the assignment of tasks to processors at
compile time, based on average estimated process execution times and
communication delays, dynamic load balancing (DLB) algorithms are
adaptive to changing situations and take decisions at run time.
The objective of this paper is to identify qualitative
parameters for the comparison of the above algorithms. In future, this
work can be extended to develop an experimental environment in which
to study these load balancing algorithms quantitatively, based on the
comparative parameters.
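As a minimal sketch of the dynamic strategy described above (host count and job costs are made up), a run-time balancer can simply route each incoming job to the currently least-loaded host:

    import heapq

    def balance(jobs, n_hosts):
        heap = [(0.0, h) for h in range(n_hosts)]   # (current load, host id)
        heapq.heapify(heap)
        assignment = {}
        for job, cost in jobs:
            load, host = heapq.heappop(heap)        # least-loaded host
            assignment[job] = host
            heapq.heappush(heap, (load + cost, host))
        return assignment

    jobs = [("j1", 3.0), ("j2", 1.0), ("j3", 2.5), ("j4", 0.5)]
    print(balance(jobs, n_hosts=2))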