Abstract: Access to relevant information that is adapted to the user's needs, preferences and environment is a challenge in many running applications. This has led to the emergence of context-aware systems. To facilitate the development of this class of applications, these applications must share a common context metamodel. In this article, we present our context metamodel, defined using the OMG Meta Object Facility (MOF). This metamodel is based on the analysis and synthesis of context concepts proposed in the literature.
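As a purely illustrative sketch (not the authors' MOF metamodel; all class and attribute names below are hypothetical), a context metamodel of this kind might be instantiated as a small set of classes:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ContextAttribute:
        # A single piece of context (e.g., location, preference).
        name: str
        value: object
        source: str  # sensor or profile that produced the value

    @dataclass
    class ContextEntity:
        # Any subject of context: a user, a device, an environment.
        name: str
        attributes: List[ContextAttribute] = field(default_factory=list)

    # Example: a user entity carrying environment and preference context.
    user = ContextEntity("user-1", [
        ContextAttribute("location", "office", "GPS"),
        ContextAttribute("preferred_language", "fr", "profile"),
    ])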
Abstract: This survey paper presents the recent state of model comparison as it applies to Model-Driven Engineering. In Model-Driven Engineering, calculating the difference between models is an important and challenging task. Model differencing involves a number of subtasks, starting with identifying and matching the elements of the models. In this paper, we discuss how model matching is accomplished, the strategies and techniques used, and the types of models involved; we also discuss future directions. We found that many of the latest model comparison strategies are geared towards enabling metamodel-based and similarity-based matching, and that model versioning is the most dominant application of model comparison. Recently, work on comparison for versioning has begun to decline, giving way to other applications. Finally, there is wide variation among the tools in the amount of user effort needed to perform model comparisons, as some require more effort in exchange for greater generality and expressive power.
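As a rough illustration of similarity-based matching (a generic sketch, not any surveyed tool's algorithm; the scoring weights and threshold are assumptions), candidate element pairs are typically scored on name and type features and paired when the score exceeds a threshold:

    from difflib import SequenceMatcher

    def similarity(a, b):
        # Combine name similarity with a type-equality bonus.
        name_sim = SequenceMatcher(None, a["name"], b["name"]).ratio()
        type_sim = 1.0 if a["type"] == b["type"] else 0.0
        return 0.7 * name_sim + 0.3 * type_sim

    def match_elements(model_a, model_b, threshold=0.8):
        # Greedily pair the most similar elements across the two models.
        pairs, unmatched = [], list(model_b)
        for elem in model_a:
            best = max(unmatched, key=lambda e: similarity(elem, e), default=None)
            if best and similarity(elem, best) >= threshold:
                pairs.append((elem["name"], best["name"]))
                unmatched.remove(best)
        return pairs

    a = [{"name": "Customer", "type": "Class"}, {"name": "Order", "type": "Class"}]
    b = [{"name": "Customer", "type": "Class"}, {"name": "Orders", "type": "Class"}]
    print(match_elements(a, b))  # [('Customer', 'Customer'), ('Order', 'Orders')]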
Abstract: The teaching of computer programming to beginners has generally been considered a difficult and challenging task. Several methodologies and research tools have been developed; however, the difficulty of teaching remains. Our work integrates the state of the art in teaching programming with game software and further provides metrics for evaluating student performance in the collaborative activity of playing games. This paper presents a multi-agent system architecture to be incorporated into educational collaborative game software for teaching programming, which monitors, evaluates and encourages collaboration among the participants. A literature review was conducted on the concepts of collaborative learning, multi-agent systems, collaborative games and techniques for teaching programming using these concepts simultaneously.
Abstract: The right to basic sanitation was elevated to the category of fundamental right by the Constitution of 1988, protecting the ecologically balanced environment, ensuring the social rights to health and adequate housing, and establishing the dignity of the human person as the foundation of the Brazilian Democratic State. Given its essentiality to human beings, this article seeks to understand why universal access to basic sanitation is such a difficult goal to achieve in Brazil. To this end, the research uses the deductive and analytical method. Given its bibliographic nature, the research techniques centered on specialized books on the subject, journals, theses and dissertations, legislation, relevant case law and social indicators relating to the theme. The relevance of the topic stems, among other things, from the fact that sanitation services are essential for a dignified life, i.e., everyone is entitled to the maintenance of the conditions necessary for existence. However, the effectiveness of this right is undermined in society, since Brazil has a huge deficit in sanitation services, thus denying a dignified life to most of the population. It can be seen that the provision of water and sewage services in Brazil is still characterized by a large imbalance, since the municipalities with smaller populations show the greatest deficiencies in sanitation services. The precariousness of water and sewage services in Brazil remains heavily concentrated in the North and Northeast regions, limiting the effective implementation of Law 11.445/2007 in the country. There is therefore an urgent need for positive action by the State in the provision of sanitation services, in order to prevent and control disease, improve quality of life and individual productivity, and prevent the contamination of water resources. More than a social and economic necessity, there is a governmental duty to implement such services. In this sense, given the current scenario, achieving universal access to basic sanitation faces many hurdles. These lie mainly in the field of properly formulated and implemented public policies; that is, it requires excellent institutional organization, service management, strategic planning and social control in order to respond to complex challenges.
Abstract: As smartphones are equipped with various sensors, many studies have focused on using these sensors to create valuable applications. Human activity recognition is one such application, motivated by various welfare applications such as support for the elderly, measurement of calorie consumption, analysis of lifestyle and exercise patterns, and so on. One of the challenges in using smartphone sensors for activity recognition is that the number of sensors should be minimized to save battery power. In this paper, we show that a fairly accurate classifier distinguishing ten different activities can be built using data from only a single sensor, namely the smartphone accelerometer. The approach we adopt to deal with this ten-class problem uses various methods. The features used for classifying these activities include not only the magnitude of the acceleration vector at each time point, but also the maximum, the minimum, and the standard deviation of the vector magnitude within a time window. The experiments compared the performance of four kinds of basic multi-class classifiers and of four kinds of ensemble learning methods based on three kinds of basic multi-class classifiers. The results show that the method with the highest accuracy is ECOC based on random forest.
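A minimal sketch of computing the windowed accelerometer features described above (the window length and the use of the mean as a summary of the per-point magnitudes are assumptions, not the authors' exact choices):

    import numpy as np

    def window_features(acc, window=50):
        # acc: (N, 3) array of accelerometer samples (x, y, z).
        mag = np.linalg.norm(acc, axis=1)  # vector magnitude at each time point
        feats = []
        for start in range(0, len(mag) - window + 1, window):
            w = mag[start:start + window]
            # Per-window features: max, min, std of the magnitude,
            # plus its mean as a summary of the window.
            feats.append([w.max(), w.min(), w.std(), w.mean()])
        return np.array(feats)

    # Example: 200 synthetic samples -> 4 windows of 4 features each.
    print(window_features(np.random.randn(200, 3)).shape)  # (4, 4)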
Abstract: In this article, drawing on the experience of the modernization project of the metropolis of Constantine (PMMC) in Algeria, we highlight the importance of management in an urban project at both the strategic and operational levels. Our aim is to evaluate the modernization project of the metropolis of Constantine in the light of management and to demonstrate the relationship between good urban management and the success of an urban project.
Abstract: Sewer deposits have been identified as a major cause of dysfunction in combined sewer systems, with negative consequences for sewer management that include poor hydraulic conveyance, environmental damage and risks to workers' health. To overcome the problem of sedimentation, flushing is considered the most practical and cost-effective way to minimize sediment impacts and prevent such problems. Flushing, by inducing turbulent wave effects, can modify the bed form depending on the hydraulic properties and geometrical characteristics of the conduit. So far, the dynamics of the bed load during high-flow events in combined sewer systems, a complex environment, are not well understood, mostly due to the lack of measuring devices capable of working correctly in the "hostile" combined sewer environment. In this regard, a single flushing event from an opening gate valve with a weir function was carried out in a trunk sewer in Paris to assess its cleansing efficiency on the sediments (thickness: 0-30 cm). During more than one hour of flushing, a maximum flow rate of 4.1 m3/s and a maximum water level of 2.1 m were recorded 5 m downstream of the gate. This paper aims to evaluate the efficiency of this type of gate over about 1.1 km (from 50 m upstream to 1050 m downstream of the gate) by (i) determining the bed grain-size distribution and the evolution of the sediments along the sewer channel, as well as their organic matter content, and (ii) identifying the sections whose texture changed most after the flush. For the first objective, two series of samples were taken along the sewer and analyzed in the laboratory, one before and one after the flush, at the same points along the channel. A non-intrusive sampling instrument was used to extract the sediments finer than fine gravel. Comparing the sediment texture after the flush with the initial state revealed the zones most modified by the flush, in relation to the sewer invert slope and the hydraulic parameters, within the zone up to 400 m from the gate. At this distance, the range of sediment grain sizes increased: D50 (median grain size) varied between 0.6 mm and 1.1 mm before the flush, compared with 0.8 mm and 10 mm after it. Overall, taking the channel invert slope into account, the results indicate that grains finer than sand (< 2 mm) were preferentially transported downstream along roughly 400 m from the gate: on average 69% before the flush against 38% after it, with greater dispersion of the grain-size distributions. Furthermore, a strong effect of channel bed irregularities on the evolution of the bed material was observed after the flush.
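For reference, D50 is the grain diameter at which 50% of the sample mass is finer; a minimal sketch (a generic sieve-analysis interpolation, not the authors' procedure, with hypothetical sieve data) of computing it from a grain-size distribution:

    import numpy as np

    def d50(sieve_mm, percent_finer):
        # Interpolate the diameter at 50% passing on a log scale,
        # as is conventional for grain-size distribution curves.
        sieve_mm = np.asarray(sieve_mm, dtype=float)
        percent_finer = np.asarray(percent_finer, dtype=float)
        return 10 ** np.interp(50.0, percent_finer, np.log10(sieve_mm))

    # Hypothetical sieve data: diameters (mm) and cumulative % finer.
    print(round(d50([0.063, 0.2, 0.6, 2.0, 6.3], [5, 20, 48, 80, 97]), 2))  # 0.65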
Abstract: Over the past few years, a lot of research has been
conducted to bring Automatic Speech Recognition (ASR) into various
areas of Air Traffic Control (ATC), such as air traffic control simulation and training, monitoring live operators with the aim of improving safety, measuring air traffic controller workload, and analyzing large quantities of controller-pilot speech.
Due to the high accuracy requirements of the ATC context and its
unique challenges, automatic speech recognition has not been widely
adopted in this field. With the aim of providing a good starting
point for researchers who are interested in bringing automatic speech recognition into ATC, this paper gives an overview of the possibilities
and challenges of applying automatic speech recognition in air traffic
control. To provide this overview, we present an updated literature
review of speech recognition technologies in general, as well as
specific approaches relevant to the ATC context. Based on this
literature review, criteria for selecting speech recognition approaches
for the ATC domain are presented, and remaining challenges and
possible solutions are discussed.
Abstract: Discursive practices enacted by educators in
kindergarten create a blueprint for how the educational trajectories of
students with disabilities are constructed. This two-year ethnographic
case study critically examines educators' relationships with students considered to present challenging behaviors in one kindergarten classroom located in a predominantly White, middle-class school district in the Northeast of the United States. Focusing on the language and practices used by one special education teacher and three teaching assistants, this paper analyzes how teacher responses to students' behaviors construct and position students over one year of kindergarten education. Using critical discourse analysis, it shows that educators understand students' behaviors as deficits needing consequences. This study highlights how educators' responses reflect students' individual characteristics, including family background, socioeconomic status and ability status. This paper offers an in-depth analysis of two students' stories, which evidenced that the language used by educators amplifies the social positioning of students within the classroom and creates a foundation for who they are constructed to be. Through exploring routine language and practices, this paper demonstrates that educators outlined a blueprint of kindergartners that positioned students as learners in ways that became the basis for either a limited or a promising educational pathway for them.
Abstract: This study analyzes the critical gaps in the
architecture of European stability and the expected role of the
banking union as the new important step towards completing the
Economic and Monetary Union, one that should enable the creation of a safe and sound financial sector for the euro area market. The single rulebook, together with the Single Supervisory Mechanism and the Single Resolution Mechanism as the two main pillars of the banking union, should provide consistent application of common rules and administrative standards for the supervision, recovery and resolution of banks, with the final aim of replacing the former bail-out practice with a bail-in system through which possible future bank failures would be resolved with banks' own funds, i.e. with minimal costs to taxpayers and the real economy. In this way, the vicious circle between
banks and sovereigns would be broken. It would also reduce the
financial fragmentation recorded in the years of crisis as the result of
divergent behaviors in risk premium, lending activities and interest
rates between the core and the periphery. In addition, it should
strengthen the effectiveness of the monetary transmission channels, in particular the credit channel and liquidity flows on the money market, which, due to the fragmentation of the common financial market, were significantly impaired during the crisis. However,
contrary to all the positive expectations related to the future
functioning of the banking union, major findings of this study
indicate that characteristics of the economic system in which the
banking union will operate should not be ignored. The euro area is an
integration of strong and weak entities with large differences in
economic development, wealth, assets of banking systems, growth
rates and accountability of fiscal policy. The analysis indicates that
low and unbalanced economic growth remains a challenge for the
maintenance of financial stability and this problem cannot be
resolved by single supervision alone. In many countries, bank assets are several times larger than GDP, and large banks are still a matter
of concern, because of their systemic importance for individual
countries and the euro zone as a whole. The creation of the Single
Supervisory Mechanism and the Single Resolution Mechanism is a
response to the European crisis, which has particularly affected
peripheral countries and caused the associated loop between the
banking crisis and the sovereign debt crisis, but has also influenced
banks' balance sheets in the core countries, as a result of cross-border capital flows. The creation of the SSM and the SRM should prevent similar episodes from happening again and should also provide
a new opportunity for strengthening of economic and financial
systems of the peripheral countries. On the other hand, there is a
potential threat that the future focus of the ECB, the resolution mechanism and other relevant institutions will be oriented predominantly towards large and significant banks (half of which operate in the core and most important euro area countries), and it therefore remains questionable to what extent the common resolution funds will be used to rescue less important institutions. Recent geopolitical developments will be a telling indicator of whether the previously established mechanisms are sufficient to maintain adequate financial stability in the euro area market.
Abstract: A Mobile Ad hoc Network (MANET) is a set of self-governing nodes that communicate through wireless links. The dynamic topology of MANETs makes routing a challenging task. Various routing protocols exist, but due to fundamental characteristics such as the open medium, changing topology, distributed collaboration and constrained capabilities, these protocols are prone to various types of security attacks. The black hole attack is one of them: a malicious node presents itself as having the shortest path to the destination, even though that path does not exist. In this paper, we develop a routing protocol for the detection and prevention of the black hole attack by modifying the AODV routing protocol. Simulation in NS-2 shows the resulting improvement in network performance.
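The abstract does not detail the modification; as a rough illustration only, a common heuristic in the black-hole literature (not necessarily the authors' method; the threshold and message fields below are assumptions) rejects route replies whose destination sequence number is implausibly inflated:

    # Hypothetical sketch of a sequence-number sanity check on AODV
    # route replies (RREPs); threshold and field names are assumed.
    SEQ_JUMP_THRESHOLD = 100  # max plausible jump in dest. sequence number

    def is_suspicious_rrep(rrep_seq_num, last_known_seq_num):
        # A black hole node typically advertises a "fresh" route by
        # inflating the destination sequence number far beyond the
        # last value the requesting node has seen.
        return rrep_seq_num - last_known_seq_num > SEQ_JUMP_THRESHOLD

    def handle_rrep(rrep, route_table, blacklist):
        last = route_table.get(rrep["dest"], {}).get("seq_num", 0)
        if is_suspicious_rrep(rrep["seq_num"], last):
            blacklist.add(rrep["sender"])   # isolate the suspected node
            return False                    # discard the reply
        return True                         # accept and use the route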
Abstract: This paper applies the factor conditions from Porter's Diamond Model (1990) to understand the various challenges facing the AMISA. The factor conditions highlighted in Porter's model are grouped into two categories, namely basic and advanced factors. Two AMISA associations representing over 10 000 employees were interviewed. The largest Clothing, Textiles and Leather (CTL) apparel retail group was also interviewed, along with a government department implementing the industrialization policy. The paper points out that the AMISA have the basic factor conditions necessary for competitive advantage in the apparel industries. However, advanced factor creation has proven to be a challenge for the AMISA, Higher Education Institutions (HEIs) and government. Poor infrastructure maintenance has contributed to high manufacturing costs and poor quick-response technologies. The use of Porter's factor conditions as a tool to analyze the sector's competitive advantage challenges and opportunities has increased knowledge of the factors that limit the AMISA's competitiveness. It is therefore argued that studies of the other Porter's Diamond factors, namely demand conditions; firm strategy, structure and rivalry; and related and supporting industries, can be used to analyze the situation of the AMISA for the purpose of improving competitive advantage.
Abstract: Workflow scheduling is an important part of cloud computing; based on different criteria, it determines cost, execution time, and performance. A cloud workflow system is a platform service facilitating the automation of distributed applications on the new cloud infrastructure. One aspect that differentiates cloud workflow systems from others is the market-oriented business model, an innovation that challenges conventional workflow scheduling strategies. The Time and Cost optimization algorithm for scheduling Hybrid Clouds (TCHC), which decides which resources should be chartered from public providers, is combined with a new De-De algorithm so that every instance of single and multiple workflows runs without deadlocks. To this end, two new concepts - the De-De Dodging Algorithm and the Priority Based Decisive Algorithm - are combined with conventional deadlock avoidance into one algorithm that maximizes active (not just allocated) resource use and reduces makespan.
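For illustration, makespan is the completion time of the last task in a schedule; a minimal greedy list-scheduling sketch (a generic baseline, not the TCHC or De-De algorithms) that assigns each task to the earliest-free resource:

    import heapq

    def greedy_schedule(task_durations, n_resources):
        # Assign each task to the resource that frees up earliest;
        # the makespan is the latest finish time across resources.
        free_at = [0.0] * n_resources
        heapq.heapify(free_at)
        for d in task_durations:
            earliest = heapq.heappop(free_at)
            heapq.heappush(free_at, earliest + d)
        return max(free_at)  # makespan

    print(greedy_schedule([4, 2, 7, 3, 5], n_resources=2))  # 12.0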
Abstract: The design of reverse logistics networks has attracted growing attention under the stringent pressures of environmental awareness and business sustainability. Reverse logistics activities, which include the return, remanufacturing, disassembly and disposal of products, can be quite complex to manage. In addition, demand can be difficult to predict, and decision making is one of the challenging tasks in such networks. This complexity has amplified the need to develop an integrated architecture for product returns as an enterprise system. The main purpose of this paper is to design a Multi-Agent System (MAS) architecture, using the Prometheus methodology, to efficiently manage reverse logistics processes. The proposed MAS architecture includes five types of agents: the Gatekeeping Agent, Collection Agent, Sorting Agent, Processing Agent and Disposal Agent, which act respectively during the five steps of the reverse logistics network.
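A minimal sketch (illustrative only; the class and method names are assumptions, not the paper's Prometheus design) of how the five agent types might hand a returned product down the chain:

    class Agent:
        def __init__(self, name, next_agent=None):
            self.name, self.next_agent = name, next_agent

        def handle(self, item):
            print(f"{self.name} processing {item}")
            if self.next_agent:
                self.next_agent.handle(item)

    # Chain the five agent roles in the order of the reverse
    # logistics steps described in the abstract.
    disposal   = Agent("Disposal Agent")
    processing = Agent("Processing Agent", disposal)
    sorting    = Agent("Sorting Agent", processing)
    collection = Agent("Collection Agent", sorting)
    gatekeeper = Agent("Gatekeeping Agent", collection)

    gatekeeper.handle("returned product #42")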
Abstract: Depending on application requirements, nodes in Wireless Sensor Networks (WSNs) are static or mobile. Mobility poses challenges in protocol design, especially at the link layer, requiring mobility-adaptation algorithms to localize mobile nodes and to predict the quality of the links to be established with them. This study implements the XMAC and Berkeley Media Access Control (BMAC) protocols to evaluate their performance under static and mobile WSN conditions, providing a comparative study of mobility-aware MAC protocols. Protocol performance is evaluated in terms of average end-to-end delay, average packet delivery ratio, average number of hops, and jitter.
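A minimal sketch (the trace format and field names are hypothetical) of computing two of these metrics, packet delivery ratio and average end-to-end delay, from send/receive records:

    def pdr_and_delay(sent, received):
        # sent: {packet_id: send_time}; received: {packet_id: recv_time}
        delivered = [pid for pid in sent if pid in received]
        pdr = len(delivered) / len(sent) if sent else 0.0
        delays = [received[pid] - sent[pid] for pid in delivered]
        avg_delay = sum(delays) / len(delays) if delays else 0.0
        return pdr, avg_delay

    sent = {1: 0.00, 2: 0.10, 3: 0.20}
    received = {1: 0.35, 3: 0.52}          # packet 2 was lost
    print(pdr_and_delay(sent, received))   # approx. (0.667, 0.335)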
Abstract: Live video streaming is one of the most widely used services among end users, yet it is a big challenge for network operators in terms of quality. The only way to provide excellent Quality of Experience (QoE) to end users is continuous monitoring of live video streams. For this purpose, several objective algorithms are available that monitor the quality of the video in a live stream. Subjective tests play a very important role in fine-tuning the results of objective algorithms. As human perception is considered the most reliable source for assessing the quality of a video stream, subjective tests are conducted in order to develop more reliable objective algorithms. Temporal impairments in a live video stream can have a negative impact on end users. In this paper, we conduct subjective evaluation tests on a set of video sequences containing a temporal impairment known as frame freezing. Frame freezing can result from transmission errors as well as hardware errors, which cause the loss of video frames at the receiving side of a transmission system. In our subjective tests, we evaluate videos that contain a single freezing event as well as videos that contain multiple freezing events. We record our subjective test results for all the videos in order to compare the available No-Reference (NR) objective algorithms. Finally, we report the performance of the no-reference algorithms used for the objective evaluation of the videos and suggest the algorithm that works best. The outcome of this study shows the importance of QoE and its effect on human perception. The results of the subjective evaluation can serve to validate objective algorithms.
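A minimal sketch (a generic no-reference heuristic, not one of the algorithms evaluated in the paper; the threshold is an assumption) of flagging frame-freeze events by thresholding the difference between consecutive frames:

    import numpy as np

    def detect_freezes(frames, threshold=1e-3):
        # frames: list of grayscale frames as 2-D arrays in [0, 1].
        # A near-zero mean absolute difference between consecutive
        # frames indicates a frozen (repeated) frame.
        frozen = []
        for i in range(1, len(frames)):
            mad = np.mean(np.abs(frames[i] - frames[i - 1]))
            if mad < threshold:
                frozen.append(i)
        return frozen

    # Example: frame 2 repeats frame 1, simulating a freeze event.
    f0, f1 = np.random.rand(48, 64), np.random.rand(48, 64)
    print(detect_freezes([f0, f1, f1.copy()]))  # [2]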
Abstract: Superabsorbent polymers have received much attention and are used in many fields because of their characteristics, which are superior to those of traditional absorbents such as sponge and cotton. It is therefore very important, but challenging, to prepare superabsorbents that swell strongly and quickly. A reliable, efficient and low-cost technique for removing heavy metal ions from wastewater is adsorption using bio-adsorbents obtained from biological materials, such as polysaccharide-based superabsorbent hydrogels. In this study, novel multi-functional superabsorbent composites of the semi-interpenetrating polymer network (semi-IPN) type, CTS-g-PAAm/Ge, were prepared via graft polymerization of acrylamide onto a chitosan backbone in the presence of gelatin, using potassium persulfate and N,N'-methylene bisacrylamide as initiator and crosslinker, respectively. These hydrogels were also partially hydrolyzed to obtain superabsorbents with ampholytic properties and maximal swelling capacity. The formation of the grafted network was evidenced by Fourier Transform Infrared Spectroscopy (ATR-FTIR) and Thermogravimetric Analysis (TGA). The porous structures were observed by Scanning Electron Microscopy (SEM). From the TGA analysis, it was concluded that the incorporation of Ge into the CTS-g-PAAm network only marginally affected its thermal stability. The effect of gelatin content on the swelling capacities of these superabsorbent composites was examined in various media (distilled water, saline and pH solutions). The water absorbency was enhanced by adding Ge to the network, with the optimum value reached at 2 wt.% Ge. Hydrolysis not only greatly improved the absorption capacity but also the swelling kinetics. These materials also showed reswelling ability. We believe that these superabsorbent materials would be very effective for the adsorption of harmful metal ions from wastewater.
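For reference, swelling capacity in such studies is commonly reported as the gravimetric swelling ratio (presumably the measure used here; the symbols are the conventional ones, not taken from the paper):

    Q = (W_s - W_d) / W_d

where W_s is the weight of the swollen hydrogel and W_d the weight of the dry sample, giving grams of absorbed water per gram of dry gel.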
Abstract: Accurate forecasting of fresh produce demand is one of the challenges faced by Small and Medium Enterprise (SME) wholesalers. This paper attempts to understand the causes of the high variability in the demand faced by SME wholesalers, such as weather, holidays, etc.; understanding the significance of as-yet unidentified factors may improve forecasting accuracy. The paper presents the current literature on the factors used to predict demand and the existing forecasting techniques for short-shelf-life products. It then investigates a variety of possible internal and external factors, some of which have not been used by other researchers in the demand prediction process. The results presented in this paper are further analysed using a number of techniques to minimize noise in the data. For the analysis, past sales data (January 2009 to May 2014) from a UK-based SME wholesaler are used, and the results presented are limited to the product 'Milk' sold to cafés in Derby. Correlation analysis is performed to check the dependence of the actual demand on the variability factors. Principal Component Analysis (PCA) is then applied to understand the significance of the factors identified by correlation. The PCA results suggest that cloud cover, weather summary and temperature are the most significant factors that can be used in forecasting demand. The correlation of these three factors increases at the monthly level and is more stable than for the weekly and daily demand.
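A minimal sketch (synthetic data; the column names and coefficients are hypothetical) of the correlation-then-PCA workflow described above, using pandas and scikit-learn:

    import numpy as np
    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "cloud_cover":  rng.random(100),
        "temperature":  rng.normal(15, 5, 100),
        "holiday_flag": rng.integers(0, 2, 100),
    })
    # Demand loosely driven by weather, to mimic the studied factors.
    df["demand"] = (50 - 10 * df["cloud_cover"] + 2 * df["temperature"]
                    + rng.normal(0, 3, 100))

    # Step 1: correlation of each candidate factor with demand.
    print(df.corr()["demand"].drop("demand"))

    # Step 2: PCA on the standardized factors to gauge how much
    # variance each component explains.
    X = StandardScaler().fit_transform(df.drop(columns="demand"))
    pca = PCA().fit(X)
    print(pca.explained_variance_ratio_)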
Abstract: In the present global scenario, aluminum alloys are attracting the attention of many innovators as competing structural materials for automotive and space applications. Compared with other competing alloys, the 7xxx series aluminum alloys in particular have been studied intensively because of benefits such as moderate strength, better forming characteristics and affordable cost. It is expected that substituting aluminum alloys for steels will result in great improvements in energy economy, durability and recyclability. However, it is necessary to improve the strength and formability of aluminum alloys at low temperatures for still better applications. Aluminum–zinc–magnesium alloys, with or without other minor additions, denoted as the 7xxx series, are medium-strength heat-treatable alloys. In addition to Zn and Mg as the major alloying additions, Cu, Mn and Si are the other solute elements that contribute to the improvement of mechanical properties through a suitable heat treatment process. By applying suitable treatments, such as age hardening or cold-deformation-assisted heat treatments known as low-temperature thermomechanical treatments (LTMT), the desired properties can be incorporated. T6 is the age-hardening or precipitation-hardening process with an artificial aging cycle, whereas T8 comprises the LTMT treatment, aged artificially after X% cold deformation. When cold deformation is applied after solution treatment, hardness-related properties such as wear resistance, yield and ultimate strength, and toughness increase at the expense of ductility. During precipitation hardening, both the hardness and the strength of the samples increase. The hardness value may improve further when room-temperature deformation is combined with age hardening, a combination known as thermomechanical treatment. It is intended to perform heat treatment and to evaluate hardness, tensile strength, wear resistance and the distribution pattern of the reinforcement in the matrix. Increases in hardness of 2 to 2.5 times for age hardening and 3 to 3.5 times for LTMT treatment, compared with the as-cast composite, are reported. Better distribution of the reinforcements in the matrix, a nearly two-fold increase in strength levels and up to a five-fold increase in wear resistance are also observed in the present study.
Abstract: A knowledge base stores facts and rules about the world that applications can use for the purpose of reasoning. Applying the concept of granular computing to a knowledge base yields several advantages, which applications can harness to improve their capabilities and performance. In this paper, the concept behind such a construct, called a granular knowledge cube, is defined, and its intended use as an instrument that copes with different data types and detects knowledge domains is elaborated. Furthermore, the underlying architecture, consisting of three layers for the storing, representing, and structuring of knowledge, is described. Finally, the benefits as well as the challenges of deploying it are listed, alongside application types that could profit from having such an enhanced knowledge base.
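A minimal sketch (purely illustrative; the three layer names follow the abstract, everything else is an assumption) of a granular knowledge cube as a three-layer structure over facts and rules:

    from collections import defaultdict

    class GranularKnowledgeCube:
        def __init__(self):
            self.store = []                    # storing layer: raw facts/rules
            self.index = defaultdict(list)     # representing layer: by data type
            self.granules = defaultdict(list)  # structuring layer: by domain

        def add(self, item, data_type, domain):
            self.store.append(item)
            self.index[data_type].append(item)
            self.granules[domain].append(item)

        def query(self, domain):
            # Retrieve all knowledge grouped under one domain granule.
            return self.granules[domain]

    cube = GranularKnowledgeCube()
    cube.add("water boils at 100 C", "fact", "physics")
    cube.add("IF wet THEN slippery", "rule", "physics")
    print(cube.query("physics"))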