Abstract: Available Bit Rate (ABR) is a lower-priority service that is
well suited to the transmission of data. On wireline ATM networks, an
ABR source continuously receives feedback from the switches telling it
to increase or decrease its rate as network conditions change, and a
minimum bandwidth is guaranteed. In wireless networks, guaranteeing
this minimum bandwidth is a challenging task because the source is
mobile, traveling from one cell to another. Re-establishing the virtual
circuits end-to-end at every handoff delays the transmission. Our
proposed solution provides more available bandwidth to the ABR source
by re-using part of the old Virtual Channels and establishing new ones.
We want the ABR source to transmit data continuously (non-stop) in
order to avoid this delay; in the worst case, at least the minimum
bandwidth is allocated. To keep the data flowing continuously, a
handoff ABR call is given priority over a new ABR call.
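The admission policy sketched above can be made concrete. The following is an illustrative sketch only, not the paper's implementation: the class name, the per-cell bookkeeping, and the 90% admission threshold for new calls (leaving a reserve that only handoff calls may use) are all assumptions made for the example.

```python
# Illustrative sketch: a cell's bandwidth manager that admits ABR calls,
# guaranteeing each call its Minimum Cell Rate (MCR) and giving handoff
# calls priority over new calls. The 90% threshold is an assumed policy.

class CellBandwidthManager:
    def __init__(self, capacity):
        self.capacity = capacity      # total bandwidth of the cell
        self.allocated = 0            # sum of guaranteed MCRs
        self.calls = {}               # call_id -> mcr

    def admit(self, call_id, mcr, is_handoff):
        """Admit a call if its MCR fits; handoff calls may use a
        reserve that new calls cannot touch."""
        limit = self.capacity if is_handoff else 0.9 * self.capacity
        if self.allocated + mcr <= limit:
            self.calls[call_id] = mcr
            self.allocated += mcr
            return True
        return False

    def release(self, call_id):
        """Free the bandwidth guaranteed to a departing call."""
        self.allocated -= self.calls.pop(call_id, 0)

mgr = CellBandwidthManager(capacity=100)
mgr.admit("new-1", mcr=85, is_handoff=False)                  # fits
new_ok = mgr.admit("new-2", mcr=10, is_handoff=False)         # over 90% limit
handoff_ok = mgr.admit("handoff-1", mcr=10, is_handoff=True)  # uses reserve
```

With these numbers, the second new call is rejected while the handoff call is still admitted from the reserve, which is the prioritization the abstract describes.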
Abstract: Logistics outsourcing is a growing trend and measuring its performance, a challenge. It must be consistent with the objectives set for logistics outsourcing, but we have found no objective-based performance measurement system. We have conducted a comprehensive review of the specialist literature to cover this gap, which has led us to identify and define these objectives. The outcome is that we have obtained a list of the most relevant objectives and their descriptions. This will enable us to analyse in a future study whether the indicators used for measuring logistics outsourcing performance are consistent with the objectives pursued with the outsourcing. If this is not the case, a proposal will be made for a set of financial and operational indicators to measure performance in logistics outsourcing that take the goals being pursued into account.
Abstract: Project-based pedagogy has proven to be an active
learning method used to develop learners' skills and knowledge.
The use of technology in education has filled several gaps in the
implementation of teaching methods and in the online evaluation of
learners. However, the project methodology presents challenges for
the online assessment of learners. Indeed, interoperability between
e-learning platforms (LMS) is one of the major challenges of
project-based learning assessment. Firstly, we review the
characteristics of online assessment in the context of project-based
teaching and address the constraints encountered during the peer
evaluation process. Our approach is to propose a meta-model
describing a language dedicated to the design of peer assessment
scenarios in project-based learning. We then illustrate our proposal
by instantiating the meta-model through a business process in a
collaborative online assessment scenario.
Abstract: Pulp and paper mills are among the biggest production
plants worldwide, and environmental pollution is the biggest
challenge facing pulp manufacturing operations. The concern among
these industries is to produce a high volume of paper to a high
quality standard and at low cost without harming the environment.
The results obtained from this bleaching study show that the
activation of peroxide is an effective method of reducing the total
applied charge of chlorine dioxide, which is harmful to the
environment, and that softwood and hardwood Kraft pulps respond
linearly to the peroxide treatments. During the bleaching process,
the production plant generates chlorine compounds. In the trial
stages, the chlorine dioxide charge was reduced by 3 kg/ton,
reducing the pulp brightness from 65% ISO to 60% ISO, and the dosing
point was returned to the E-stage charges by pre-treating the Kraft
pulps with hydrogen peroxide. In its quest to be environmentally
friendly, the pulp and paper industry has developed elemental
chlorine free (ECF) and totally chlorine free (TCF) bleaching, and
has been looking for ways to turn ECF processes into TCF processes
while remaining competitive. This prompted the present research,
which investigates the capability of hydrogen peroxide as a catalyst
to reduce chlorine dioxide.
Abstract: In the globalization process, when a struggle for people's minds and values is taking place, the impact of virtual space can cause unexpected effects and consequences in young people's adjustment to this world. Its special significance lies in its unconscious influence on the underlying processes of meaning-making; the values it preaches are therefore much more effective and affect both personal characteristics and the peculiarities of the adjustment process. The challenge, then, is to identify the factors influencing the reflection characteristics of virtual subjects and to measure their impact on the personal characteristics of students.
Abstract: Grid computing provides an effective infrastructure for massive computation among a flexible and dynamic collection of individual systems for resource discovery. The major challenge for grid computing is to prevent breaches and to secure the data from trespassers. To overcome such conflicts, a semantic approach can be designed that filters the access requests of peers by checking the resource description, which specifies the data and the metadata as factual statements. A semantic firewall will be present as a middleware between every node in the grid. A requesting peer will be required to present an application specifying their needs to the firewall, and the system will accordingly grant or deny the request.
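The filtering rule described above, checking a request's stated needs against a resource description held as factual statements, can be sketched as follows. The triple encoding, the wildcard convention, and the matching rule are illustrative assumptions, not the paper's actual design.

```python
# Minimal sketch of the semantic-firewall idea: each resource carries
# factual statements (subject-predicate-object triples), and a request
# is granted only if every requirement it states is matched by the
# resource description. Encoding and matching rule are assumptions.

def matches(requirement, description):
    """A requirement triple matches if the description contains it;
    '*' in the requirement acts as a wildcard for that position."""
    return any(all(r in ("*", d) for r, d in zip(requirement, fact))
               for fact in description)

def semantic_firewall(request, description):
    """Grant the request only if all its stated needs are satisfied
    by the resource's factual statements."""
    return all(matches(req, description) for req in request["needs"])

# A made-up resource description and two example requests:
resource = [("dataset", "format", "csv"),
            ("dataset", "access", "read-only")]
granted = semantic_firewall({"needs": [("dataset", "access", "read-only")]},
                            resource)
denied = semantic_firewall({"needs": [("dataset", "access", "write")]},
                           resource)
```

A real middleware would express the factual statements in an RDF-style vocabulary rather than plain tuples; the grant/deny decision logic is the part being illustrated here.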
Abstract: Many existing studies use Markov decision processes
(MDPs) to model optimal route choice in stochastic, time-varying
networks. However, transforming large amounts of variable traffic
data into optimal route decisions is a computational challenge when
employing MDPs in real transportation networks. In this paper we
model finite-horizon MDPs using directed hypergraphs. It is shown
that the problem of route choice in stochastic, time-varying
networks can be formulated as a minimum cost hyperpath problem,
which can also be solved in linear time. We finally demonstrate the
significant computational advantages of the introduced methods.
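The computation a minimum-cost-hyperpath formulation performs can be illustrated by plain backward recursion on a toy stochastic network. The network, its arc-cost distributions, and the recursive implementation below are invented for illustration; the hypergraph machinery and the linear-time algorithm themselves are not shown.

```python
# Toy sketch: expected-cost route choice by backward recursion, which is
# what the hyperpath formulation computes on the time-expanded network.
# arcs[node] = list of (successor, [(probability, cost), ...]).
arcs = {
    "A": [("B", [(0.5, 2.0), (0.5, 6.0)]),   # A->B: cost 2 or 6, 50/50
          ("C", [(1.0, 5.0)])],              # A->C: deterministic cost 5
    "B": [("D", [(1.0, 1.0)])],
    "C": [("D", [(1.0, 1.0)])],
    "D": [],
}

def min_expected_cost(node, dest):
    """Minimum expected cost-to-go from node to dest (network is a DAG)."""
    if node == dest:
        return 0.0
    best = float("inf")
    for succ, outcomes in arcs[node]:
        exp_arc = sum(p * c for p, c in outcomes)   # expected arc cost
        best = min(best, exp_arc + min_expected_cost(succ, dest))
    return best

cost = min_expected_cost("A", "D")
```

Here the route via B has expected cost 0.5·2 + 0.5·6 + 1 = 5, beating the deterministic route via C (cost 6), so the optimal policy chooses B despite its variability.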
Abstract: This paper focuses upon three such painters working in
France from this time and their representations both of their host
country, in which they found themselves displaced, and of their
homeland, which they represent through refracted memories from their
new perspective in Europe. What is their representation of France and
of China/Taiwan? Is it Otherness or an origin?
This paper also attempts to explore the three artists' diasporic lives
and to redefine their transnational identities. For Hou Chin-lang, the
significance of his multiple-split images serves to highlight the
intricate relationships between his work and his surrounding family,
and to reveal his identification with his Taiwan "homeland". Yin Xin
takes paintings from the Western canon and subjects them to a process
of transformation through Chinese imagery. In the same period, Lin
Li-ling transforms the transnational spirit of Yin Xin into symbolic
codes with neutered female bodies and tattoos, thus creating images
that challenge the boundaries of both gender and nationality.
Abstract: Injection molding is a very complicated process to
monitor and control. Given its high complexity and many process
parameters, the optimization of these systems is a very challenging
problem. To meet the requirements and costs demanded by the market,
there has been intense development and research aimed at keeping the
process under control. This paper outlines the latest advances in the
algorithms needed for monitoring the plastic injection process,
together with a flexible data acquisition system that allows rapid
implementation of complex algorithms, assessment of their
performance, and integration into the quality control process.
Finally, to demonstrate the performance achieved by this combination,
a real case of use is presented.
Abstract: The new framework in which Higher Education is
immersed involves a complete change in the way lecturers must
teach and students must learn. Whereas the lecturer was the main
character in traditional education, the essential goal now is to
increase the students' participation in the process. Thus, one of the
main tasks of lecturers in this new context is to design activities of
different nature in order to encourage such participation. Seminars
are one of the activities included in this environment. They are active
sessions that enable going in depth into specific topics as support of
other activities. They are characterized by some features such as
favoring interaction between students and lecturers or improving
their communication skills. Hence, planning and organizing strategic
seminars is indeed a great challenge for lecturers with the aim of
acquiring knowledge and abilities. This paper proposes a method
using Artificial Intelligence techniques to obtain student profiles
from their marks and preferences. The goal of building such profiles
is twofold. First, it facilitates the task of splitting the students into
different groups, each group with similar preferences and learning
difficulties. Second, it makes it easy to select adequate topics as
candidates for the seminars. The results obtained can either confirm
what the lecturers observed during the development of the course or
provide a clue for reconsidering the methodological strategies used in
certain topics.
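The grouping step described above, splitting students into groups with similar preferences and learning difficulties, can be sketched with a simple clustering pass. The abstract does not specify which AI technique is used, so plain k-means with made-up (mark, preference) data serves purely as an example here.

```python
# Illustrative sketch: cluster students by (mark, preference score) so
# each seminar group has a similar profile. Data and k are invented.
import random

def kmeans(points, k, iters=20, seed=0):
    """Basic k-means on tuples of floats; keeps an empty cluster's
    old centroid so the group count stays fixed."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda j: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[j])))
            groups[nearest].append(p)
        centroids = [tuple(sum(vals) / len(g) for vals in zip(*g)) if g
                     else centroids[i] for i, g in enumerate(groups)]
    return groups

# (mark, preference-for-topic) pairs, invented for the example:
students = [(9.0, 0.9), (8.5, 0.8), (3.0, 0.2), (2.5, 0.1)]
groups = kmeans(students, k=2)
```

On this toy data the two high-mark students end up in one group and the two struggling students in the other, which is the kind of split the abstract uses to assign seminar topics.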
Abstract: The main objective of this paper is to contribute to the
existing knowledge transfer and IT outsourcing literature,
specifically in the context of Malaysia, by reviewing the current
practices of e-government IT outsourcing in Malaysia including the
issues and challenges faced by the public agencies in transferring the
knowledge during the engagement. This paper discusses various
factors and different theoretical model of knowledge transfer starting
from the traditional model to the recent model suggested by the
scholars. The present paper attempts to align organizational
knowledge from the knowledge-based view (KBV) and
organizational learning (OL) lens. This review could help shape the
direction of both future theoretical and empirical studies on inter-firm
knowledge transfer specifically on how KBV and OL perspectives
could play a significant role in explaining the complex relationships
between the client and vendor in inter-firm knowledge transfer and
the role of organizational management information system and
Transactive Memory System (TMS) to facilitate the organizational
knowledge transferring process. Conclusion is drawn and further
research is suggested.
Abstract: Long term rainfall analysis and prediction is a
challenging task especially in the modern world where the impact of
global warming is creating complications in environmental issues.
These factors which are data intensive require high performance
computational modeling for accurate prediction. This research paper
describes a prototype which is designed and developed on grid
environment using a number of coupled software infrastructural
building blocks. This grid enabled system provides the demanding
computational power, efficiency, resources, user-friendly interface,
secured job submission and high throughput. The results obtained
using sequential execution and grid-enabled execution show that
computational performance improved by 36% to 75% for a decade of
climate parameters. The large variation in performance can be
attributed to the varying degree of computational resources available
for job execution.
Grid Computing enables the dynamic runtime selection, sharing
and aggregation of distributed and autonomous resources which plays
an important role not only in business, but also in scientific
implications and social surroundings. This research paper attempts to
explore the grid enabled computing capabilities on weather indices
from HOAPS data for climate impact modeling and change
detection.
Abstract: Knowledge sharing in general and the contextual
access to knowledge in particular, still represent a key challenge in
the knowledge management framework. Researchers on semantic
web and human machine interface study techniques to enhance this
access. For instance, in semantic web, the information retrieval is
based on domain ontology. In human machine interface, keeping
track of user's activity provides some elements of the context that can
guide the access to information. We suggest an approach based on
these two key guidelines, whilst avoiding some of their weaknesses.
The approach permits a representation of both the context and the
design rationale of a project for an efficient access to knowledge. In
fact, the method consists of an information retrieval environment
that, on the one hand, can infer knowledge modeled as a semantic
network and, on the other hand, is based on the context and the
objectives of a specific activity (the design). The environment we
defined can also be used to gather similar project elements in order to
build classifications of tasks, problems, arguments, etc. produced in a
company. These classifications can show the evolution of design
strategies in the company.
Abstract: Grid computing is a group of clusters connected over
high-speed networks that involves coordinating and sharing
computational power, data storage and network resources operating
across dynamic and geographically dispersed locations. Resource
management and job scheduling are critical tasks in grid computing.
Resource selection becomes challenging due to heterogeneity and
dynamic availability of resources. Job scheduling is an NP-complete
problem, and different heuristics may be used to reach an optimal or
near-optimal solution. This paper proposes a model for resource and
job scheduling in a dynamic grid environment. The main focus is to
maximize resource utilization and minimize the processing time of
jobs. The grid resource selection strategy is based on a Max Heap
Tree (MHT), which is well suited to large-scale applications; the
root node of the MHT is selected for job submission. A job grouping
concept is used to
maximize resource utilization for scheduling of jobs in grid
computing. Proposed resource selection model and job grouping
concept are used to enhance scalability, robustness, efficiency and
load balancing ability of the grid.
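The two mechanisms above can be sketched briefly. This is a sketch under assumed data structures, not the paper's exact model: resources are kept in a max-heap keyed on an assumed processing-power figure (so the "root" is the best candidate, as with the MHT), and small jobs are greedily grouped up to an assumed capacity of the selected resource.

```python
# Sketch: max-heap resource selection plus greedy job grouping.
# Field names, the MIPS key, and the capacity rule are assumptions.
import heapq

def select_resource(resources):
    """Pick the resource at the heap root. heapq is a min-heap, so the
    key is negated to get max-heap behavior."""
    heap = [(-mips, name) for name, mips in resources.items()]
    heapq.heapify(heap)
    neg_mips, name = heap[0]          # root = most powerful resource
    return name, -neg_mips

def group_jobs(job_lengths, capacity):
    """Greedily pack jobs into groups whose total length <= capacity,
    so each group is submitted as one unit to the selected resource."""
    groups, current, total = [], [], 0
    for length in job_lengths:
        if total + length > capacity and current:
            groups.append(current)
            current, total = [], 0
        current.append(length)
        total += length
    if current:
        groups.append(current)
    return groups

name, power = select_resource({"r1": 500, "r2": 1200, "r3": 800})
groups = group_jobs([100, 300, 200, 400, 150], capacity=600)
```

Grouping five small jobs into two submissions, rather than five, is what reduces scheduling overhead and improves utilization in the abstract's scheme.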
Abstract: Nanophotocatalysts such as titanium (TiO2), zinc (ZnO), and iron (Fe2O3) oxides can be used in the oxidation of organic pollutants and in many other applications. However, among the challenges for the technological application (scale-up) of nanotechnology developments, two aspects are still little explored: research on the environmental risk of nanomaterial preparation methods, and the study of variability in nanomaterial properties and/or performance. An environmental analysis was performed for six different methods of ZnO nanoparticle synthesis, and showed that it is possible to identify the most environmentally compatible process even in laboratory-scale research. The obtained ZnO nanoparticles were tested as photocatalysts, and increased the degradation rate of the Rhodamine B dye up to 30 times.
Abstract: The use of hard and brittle materials has become
increasingly extensive in recent years, so processing these materials
for parts fabrication has become a challenging problem. It is
time-consuming to machine hard, brittle materials with the
traditional metal-cutting technique that uses abrasive wheels, and
the tool suffers excessive wear as well. However, if ultrasonic
energy is applied to the machining process and coupled with the use
of hard abrasive grits, hard and brittle materials can be machined
effectively. The ultrasonic machining process is mostly used for
brittle materials. The present research work has developed models
using a finite element approach to predict the mechanical stresses
and strains produced in the tool during the ultrasonic machining
process. The flow behavior of the abrasive slurry coming out of the
nozzle has also been studied in simulations using the ANSYS CFX
module. Abrasives of different grit sizes have been used for the
experimental work.
Abstract: Sickness absence represents a major economic and
social issue. Analysis of sick leave data is a recurrent challenge to analysts because of the complexity of the data structure which is
often time dependent, highly skewed and clumped at zero. Ignoring these features to make statistical inference is likely to be inefficient
and misguided. Traditional approaches do not address these problems. In this study, we discuss model methodologies in terms of statistical techniques for addressing the difficulties with sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism using
a large registration dataset as a working example available from the Helsinki Health Study for municipal employees from Finland during the period of 1990-1999. We present a comparative study on model
selection and a critical analysis of the temporal trends, the occurrence
and degree of long-term sickness absences among municipal employees.
The strengths of this working example include the large sample size
over a long follow-up period, providing strong evidence in support of
the new model. Our main goal is to propose a way to select an
appropriate model and to introduce a new methodology for analysing
sickness absence data, as well as to demonstrate the model's
applicability to complicated longitudinal data.
Abstract: In this paper, we apply and compare two generalized estimating equation approaches to the analysis of car breakdown data in Mauritius. The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim to estimate the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over a year. We note that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using the Com-Poisson regression model. We use two types of quasi-likelihood estimation approaches to estimate the parameters of the model: the marginal and joint generalized quasi-likelihood estimating equation approaches. The under-dispersion parameter is estimated to be around 2.14, justifying the appropriateness of the Com-Poisson distribution for modelling the under-dispersed count responses recorded in this study.
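The Com-Poisson (Conway-Maxwell-Poisson) distribution handles under-dispersion through its dispersion parameter ν: ν > 1 yields variance below the mean. A minimal sketch of its probability mass function is shown below; the rate value λ = 4 and the series truncation point are illustrative choices, while ν ≈ 2.14 echoes the estimate reported in the abstract.

```python
# Sketch of the Com-Poisson pmf: P(Y=y) = lam^y / (y!)^nu / Z(lam, nu),
# with the normalizing series Z truncated at an illustrative cutoff.
from math import factorial

def com_poisson_pmf(y, lam, nu, cutoff=60):
    z = sum(lam ** j / factorial(j) ** nu for j in range(cutoff))
    return (lam ** y / factorial(y) ** nu) / z

def mean_var(lam, nu, cutoff=60):
    """Mean and variance of the (truncated) Com-Poisson distribution."""
    probs = [com_poisson_pmf(y, lam, nu, cutoff) for y in range(cutoff)]
    mean = sum(y * p for y, p in enumerate(probs))
    var = sum((y - mean) ** 2 * p for y, p in enumerate(probs))
    return mean, var

# With nu ~ 2.14 (the dispersion estimated in the study), variance < mean,
# i.e. the distribution is under-dispersed; nu = 1 recovers the Poisson.
m, v = mean_var(lam=4.0, nu=2.14)
```

Checking that the variance falls below the mean at ν ≈ 2.14 is exactly the under-dispersion property that motivates choosing Com-Poisson over Poisson for the breakdown counts.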
Abstract: Nowadays it is a trend for electronic circuit designers to
integrate all system components on a single chip. This paper proposes
the design of a single-chip proportional-to-absolute-temperature
(PTAT) sensor, including a voltage reference circuit, using CEDEC
0.18 µm CMOS technology. It is a challenge to design a single-chip
temperature sensor with a wide-range linear response for many
applications. The channel widths of the compensation transistor and
the reference transistor are critical to the design of the PTAT
temperature sensor circuit. The designed temperature sensor shows
excellent linearity between -100°C and 200°C, and its sensitivity is
about 0.05 mV/°C. The chip is designed to operate with a single
voltage source of 1.6 V.
Abstract: For over a decade, the Pulse Coupled Neural Network
(PCNN) based algorithms have been successfully used in image
interpretation applications including image segmentation. There are
several versions of the PCNN based image segmentation methods,
and the segmentation accuracy of all of them is very sensitive to the
values of the network parameters. Most methods treat PCNN
parameters such as the linking coefficient and the primary firing
threshold as global parameters, and determine them by trial and
error. The
automatic determination of appropriate values for linking coefficient,
and primary firing threshold is a challenging problem and deserves
further research. This paper presents a method for obtaining global as
well as local values for the linking coefficient and the primary firing
threshold for neurons directly from the image statistics. Extensive
simulation results show that the proposed approach achieves
excellent segmentation accuracy comparable to the best accuracy
obtainable by trial-and-error for a variety of images.
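The role of the two parameters discussed above can be made concrete with a much-simplified, single-pass PCNN pulse: the linking coefficient β couples a neuron to its firing neighbors, and the firing threshold θ gates the pulse. The 1-D "image", the parameter values, and the reduction to one linking term are illustrative simplifications, not the full PCNN model.

```python
# Simplified PCNN pulse wave: internal activity U = F * (1 + beta * L)
# is compared against threshold theta, where L counts firing neighbors
# (1-D neighbors here instead of a 2-D image for brevity).

def pcnn_step(feed, fired, beta, theta):
    """One pulse wave over a 1-D line of neurons."""
    new_fired = []
    for i, f in enumerate(feed):
        link = sum(fired[j] for j in (i - 1, i + 1) if 0 <= j < len(feed))
        u = f * (1 + beta * link)     # linking boosts neighbors of firers
        new_fired.append(1 if u > theta else 0)
    return new_fired

image = [0.2, 0.9, 0.8, 0.3, 0.1]    # normalized intensities
wave1 = pcnn_step(image, [0] * 5, beta=1.0, theta=0.5)   # seeds: bright pixels
wave2 = pcnn_step(image, wave1, beta=1.0, theta=0.5)     # linking spreads pulse
```

In the first wave only the bright pixels (0.9, 0.8) fire; in the second, linking pulls the moderately bright neighbor (0.3) into the same segment. This is why segmentation accuracy is so sensitive to β and θ, and why deriving them from image statistics, as the paper proposes, matters.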