Abstract: The model-based approach to user interface design
relies on developing separate models capturing various aspects of
users, tasks, the application domain, and presentation and dialog structures.
This paper presents a task modeling approach for user interface
design and aims at exploring mappings between task, domain and
presentation models. The basic idea of our approach is to identify
typical configurations in task and domain models and to investigate
how they relate to each other. A special emphasis is put on
application-specific functions and mappings between domain objects and
operational task structures. In this respect, we will address two
layers in task decomposition: a functional (planning) layer and an
operational layer.
Abstract: In an urban traffic network, the intersections are the
"bottleneck points" of road network capacity, and the arterials are
the main body of the road network and the key factor guaranteeing the
normal operation of the city's social and economic activities. The
rapid increase in vehicles leads to serious traffic jams and increases
vehicle delay. Most cities in our country still use traditional
single-intersection control systems, which can no longer meet the
needs of city traffic. In this paper, Synchro 6.0 is used as a
platform to minimize intersection delay by optimizing the signal
cycle and splits of single intersections along Zhonghua Street in
Handan City. Meanwhile, a linear (coordinated) control system is used
to optimize the phases of the arterial road in this system. Comparing
conditions before and after the control is applied, the capacities and
service levels of this road and the adjacent roads improve
significantly.
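The abstract does not spell out the delay-minimization calculation that Synchro performs; as a rough illustration of the kind of signal timing optimization involved, Webster's classic optimal cycle length formula can be sketched as follows (this is an assumption for illustration, not necessarily what Synchro 6.0 implements internally):

```python
def webster_optimal_cycle(total_lost_time, flow_ratios):
    """Webster's optimal cycle length C0 = (1.5*L + 5) / (1 - Y),
    where L is the total lost time per cycle (s) and Y is the sum of
    the critical flow ratios of the signal phases."""
    Y = sum(flow_ratios)
    if Y >= 1:
        raise ValueError("intersection is oversaturated (Y >= 1)")
    return (1.5 * total_lost_time + 5) / (1 - Y)

def green_splits(cycle, total_lost_time, flow_ratios):
    """Distribute the effective green time of a cycle among phases in
    proportion to each phase's critical flow ratio."""
    Y = sum(flow_ratios)
    effective_green = cycle - total_lost_time
    return [effective_green * y / Y for y in flow_ratios]
```

For example, with 12 s of lost time and critical flow ratios of 0.3 and 0.4, the formula yields a cycle of roughly 77 s, with the busier phase receiving the larger split.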
Abstract: The mineral having the chemical composition MgAl2O4 is called "spinel". Ferrites that crystallize in the spinel structure are known as spinel ferrites or ferro-spinels. The spinel structure has an fcc cage of oxygen ions, and the metallic cations are distributed among tetrahedral (A) and octahedral (B) interstitial voids (sites). The X-ray diffraction (XRD) intensity of each Bragg plane is sensitive to the distribution of cations in the interstitial voids of the spinel lattice. This leads to a method for determining the distribution of cations in spinel oxides through XRD intensity analysis. A computer program for XRD intensity analysis has been developed in the C language and tested against a real experimental situation by synthesizing the spinel ferrite materials Mg0.6Zn0.4AlxFe2-xO4 and characterizing them by X-ray diffractometry. The compositions Mg0.6Zn0.4AlxFe2-xO4 (x = 0.0 to 0.6) were prepared by the ceramic method, and powder X-ray diffraction patterns were recorded. The authenticity of the program was thus checked by comparing the theoretically calculated data from the computer simulation with the experimental ones. Further, the deduced cation distributions were used to fit the magnetization data using the localized canting of spins approach, to explain the "recovery" of the collinear spin structure due to Al3+ substitution in Mg-Zn ferrites, which otherwise exhibit A-site magnetic dilution and a non-collinear spin structure. Since the distribution of cations in spinel ferrites plays a very important role in their electrical and magnetic properties, it is essential to determine the cation distribution in the spinel lattice.
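The abstract does not list the internals of the C program; its central comparison step, scoring a trial cation distribution by how well the calculated Bragg intensities match the observed ones, might be sketched (in Python rather than the paper's C, with a simple linear reliability factor assumed) as:

```python
def reliability_factor(observed, calculated):
    """Reliability (R) factor comparing observed and calculated XRD
    intensities for a trial cation distribution; the distribution
    minimizing R is taken as the actual one.  A single overall scale
    factor puts both intensity sets on the same footing."""
    scale = sum(observed) / sum(calculated)
    return sum(abs(o - scale * c)
               for o, c in zip(observed, calculated)) / sum(observed)
```

A distribution whose calculated intensities are proportional to the observed ones gives R = 0; swapping cations between A and B sites changes the calculated plane intensities and raises R.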
Abstract: This paper proposes a prototype of a lower-limb
rehabilitation system for recovering and strengthening patients'
injured lower limbs. The system is composed of traction motors for
each leg position, a treadmill as a walking base, tension sensors,
microcontrollers controlling motor functions and a main system with
graphic user interface. To derive reference (normal) velocity
profiles of body segment points, a kinematic method is applied based
on a humanoid robot model using reference joint angle data from
normal walking.
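The kinematic derivation of the reference velocity profiles is not detailed in the abstract; a minimal sketch, assuming a planar two-link leg with hypothetical link lengths and finite-difference velocities, might look like:

```python
import math

def ankle_position(hip_angle, knee_angle, thigh=0.45, shank=0.43):
    """Planar forward kinematics of a two-link leg (hypothetical link
    lengths in metres); angles are measured from the vertical."""
    x = thigh * math.sin(hip_angle) + shank * math.sin(hip_angle + knee_angle)
    y = -thigh * math.cos(hip_angle) - shank * math.cos(hip_angle + knee_angle)
    return x, y

def velocity_profile(angle_series, dt):
    """Finite-difference velocity of the ankle point from sampled
    reference joint angles, one velocity per consecutive sample pair."""
    positions = [ankle_position(h, k) for h, k in angle_series]
    return [((x2 - x1) / dt, (y2 - y1) / dt)
            for (x1, y1), (x2, y2) in zip(positions, positions[1:])]
```

Feeding in the reference joint angle trajectory of normal walking yields the target segment-point velocities the traction motors would track.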
Abstract: How to coordinate the behaviors of the agents through
learning is a challenging problem within multi-agent domains.
Because of its complexity, recent work has focused on how
coordinated strategies can be learned. Here we are interested in using
reinforcement learning techniques to learn the coordinated actions of a
group of agents, without requiring explicit communication among
them. However, traditional reinforcement learning methods are based
on the assumption that the environment can be modeled as a Markov
Decision Process, which usually cannot be satisfied when multiple
agents coexist in the same environment. Moreover, to effectively
coordinate each agent's behavior so as to achieve the goal, it is
necessary to augment the state of each agent with the information
about other existing agents. However, as the number of agents in a
multi-agent environment increases, the state space of each agent grows
exponentially, which causes a combinatorial explosion problem.
Profit sharing is one of the reinforcement learning methods that allow
agents to learn effective behaviors from their experiences even within
non-Markovian environments. In this paper, to remedy the drawback
of the original profit sharing approach, which needs much memory to
store each state-action pair during the learning process, we first
present an on-line rational profit sharing algorithm. Then, we
integrate the advantages of modular learning architecture with on-line
rational profit sharing algorithm, and propose a new modular
reinforcement learning model. The effectiveness of the technique is
demonstrated using the pursuit problem.
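As a minimal sketch of the credit assignment idea behind profit sharing (the paper's on-line rational variant and modular architecture are not reproduced here; a simple geometric credit function is assumed):

```python
def profit_sharing_update(q, episode, reward, decay=0.5):
    """Profit sharing credit assignment: when an episode ends with a
    reward, reinforce every (state, action) pair on the visited path,
    with geometrically decreasing credit toward earlier steps.  No
    Markov assumption is needed, since credit comes only from the
    agent's own episode history."""
    credit = reward
    for state, action in reversed(episode):
        q[(state, action)] = q.get((state, action), 0.0) + credit
        credit *= decay
    return q
```

In a pursuit-style task, each hunter agent would accumulate such weights over repeated episodes and act greedily with respect to them.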
Abstract: A microchannel with two inlets and two outlets was tested as a potential reactor for carrying out a two-phase catalytic phase transfer reaction with phase separation at the exit of the microchannel. The catalytic phase transfer reaction between benzyl chloride and sodium sulfide was chosen as a model reaction. The effect of operation time on the conversion was studied. By utilizing multiphase parallel flow inside the microchannel reactor with the aid of a guideline structure, the catalytic phase transfer reaction followed by phase separation could be ensured. The organic phase could be separated completely at one exit, and part of the aqueous phase was separated in pure form and could be reused, with only a slight effect on the catalytic phase transfer reaction.
Abstract: Long Term Evolution (LTE) is a 4G wireless
broadband technology developed by the Third Generation
Partnership Project (3GPP) in release 8, and it represents the
continued competitiveness of the Universal Mobile Telecommunications
System (UMTS) for the next 10 years and beyond. The concepts for LTE
systems were introduced in 3GPP release 8, with the objectives of a
high-data-rate, low-latency and packet-optimized radio access
technology. In this paper, the performance of different TCP variants
over an LTE network is investigated. The performance of TCP over
LTE is affected mostly by the links of the wired network and the total
bandwidth available at the serving base station. This paper describes
an NS-2 based simulation analysis of TCP-Vegas, TCP-Tahoe, TCP-Reno,
TCP-NewReno, TCP-SACK, and TCP-FACK, with full
modeling of all traffic in the LTE system. The evaluation of the
network performance with all TCP variants is based mainly on
throughput, average delay and packet loss. The analysis of TCP
performance over LTE shows that all TCP variants achieve similar
throughput, with TCP-Vegas performing best among the
variants.
Abstract: Analysis of the elastic scattering of protons on 6,7Li
nuclei has been done in the framework of the optical model at the
beam energies up to 50 MeV. Differential cross sections for the 6,7Li +
p scattering were measured over the proton laboratory–energy range
from 400 to 1050 keV. The 6,7Li + p elastic scattering data at
different proton incident energies have been analyzed using the
single-folding model. In each case the real potential obtained from the
folding model was supplemented by a phenomenological imaginary
potential, and during the fitting process the real potential was
normalized and the imaginary potential optimized. The normalization
factor NR is found to lie in the range 0.70 to 0.84.
Abstract: Banishing hunger from the face of the earth has been a
goal frequently expressed in various international, national and
regional level conferences since 1974. Providing food security has
become an important issue across the world, particularly in developing countries.
In a developing country like India, where growth rate of population is
more than that of the food grains production, food security is a
question of great concern. According to the International Food Policy
Research Institute's Global Hunger Index, 2011, India ranks 67th among
the 81 countries of the world with the worst food security status. After the
Green Revolution, India became a food surplus country. Its
production has increased from 74.23 million tonnes in 1966-67 to
257.44 million tonnes in 2011-12. But after achieving
self-sufficiency in food during the last three decades, the country is now
facing new challenges due to increasing population, climate change,
and stagnation in farm productivity. Therefore, the main objective of the
present paper is to examine the food security situation at national
level in the country and further to explain the paradox of food
insecurity in a food surplus state of India, i.e. Punjab, at the micro
level. In order to achieve these objectives, secondary data collected
from the Ministry of Agriculture and the Agriculture Department of
Punjab State were analyzed. The results of the study showed that despite
having surplus food production the country is still facing food
insecurity problem at micro level. Within the Kandi belt of Punjab
state, the area adjacent to plains is food secure while the area along
the hills falls in food insecure zone.
The present paper is divided into the following three sections: (i)
Introduction, (ii) Analysis of the food security situation at the
national level as well as the micro level (Kandi belt of Punjab
State), and (iii) Concluding Observations.
Abstract: Gene expression profiling is rapidly evolving into a
powerful technique for investigating tumor malignancies. The
researchers are overwhelmed with the microarray-based platforms
and methods that confer them the freedom to conduct large-scale
gene expression profiling measurements. Simultaneously,
investigations into cross-platform integration methods have started
gaining momentum due to their underlying potential to help
comprehend a myriad of broad biological issues in tumor diagnosis,
prognosis, and therapy. However, comparing results from different
platforms remains a challenging task, as various inherent
technical differences exist between the microarray platforms. In this
paper, we explain a simple ratio-transformation method that can
provide common ground between the cDNA and Affymetrix platforms
for cross-platform integration. The method is based on the
characteristic data attributes of the Affymetrix and cDNA platforms.
In this work, we considered seven childhood leukemia patients and
their gene expression levels on both platforms. With a dataset of 822
differentially expressed genes from both platforms, we applied a
specific ratio treatment to the Affymetrix data, which subsequently
improved its relationship with the cDNA data.
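The exact ratio treatment is not specified in the abstract; one plausible sketch (an assumption for illustration) converts absolute Affymetrix-style intensities into per-gene log2 ratios, mirroring the relative log-ratio form of cDNA data:

```python
import math

def ratio_transform(intensities):
    """Hypothetical ratio treatment: divide each gene's Affymetrix
    intensity by its mean across samples, then take log2, so values
    become relative ratios comparable in form to cDNA log-ratios."""
    transformed = {}
    for gene, values in intensities.items():
        mean = sum(values) / len(values)
        transformed[gene] = [math.log2(v / mean) for v in values]
    return transformed
```

After such a transformation, both platforms express each gene relative to a common baseline rather than in platform-specific absolute units.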
Abstract: With the development of virtual communities, there is
an increase in the number of members in Virtual Communities (VCs).
Many join VCs with the objective of sharing their knowledge and
seeking knowledge from others. Despite the eagerness to share
knowledge and receive knowledge through VCs, there is no standard
way of assessing one's knowledge sharing capabilities and prospects
of knowledge sharing. This paper develops a vector space model to
assess the knowledge sharing prospects of VC users.
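The paper's specific model is not detailed in the abstract; the core operation of any vector space model, cosine similarity between term-frequency vectors (here, hypothetically, between a user's contributions and a community topic), can be sketched as:

```python
import math
from collections import Counter

def cosine_similarity(doc_a, doc_b):
    """Cosine similarity between the term-frequency vectors of two
    texts: 1.0 for identical term distributions, 0.0 for disjoint
    vocabularies."""
    va, vb = Counter(doc_a.split()), Counter(doc_b.split())
    dot = sum(va[t] * vb[t] for t in va)
    norm_a = math.sqrt(sum(c * c for c in va.values()))
    norm_b = math.sqrt(sum(c * c for c in vb.values()))
    norm = norm_a * norm_b
    return dot / norm if norm else 0.0
```

A user whose postings score high similarity against a community's topic vector would, under such a model, be a strong knowledge sharing prospect for that community.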
Abstract: This paper concerns the experimental and
numerical investigation of the energy absorption and axial tearing
behaviour of aluminium 6060 circular thin walled tubes under static
axial compression. The tubes were received in the T66 heat treatment
condition, with a fixed outer diameter of 42 mm, thickness of 1.5 mm
and length of 120 mm. The primary variables are the conical die
angles (15°, 20° and 25°). Numerical simulations are carried out with
the ANSYS/LS-DYNA software tool to investigate the effect of
friction between the tube and the die.
Abstract: In modern telecommunications industry, demand &
supply chain management (DSCM) needs reliable design and
versatile tools to control the material flow. The objective for efficient
DSCM is reducing inventory, lead times and related costs in order to
assure reliable and on-time deliveries from manufacturing units
towards customers. In this paper, a multi-rate, expert-system-based
methodology for developing simulation tools that would enable
optimal DSCM for a multi-region, high-volume and high-complexity
manufacturing environment is proposed.
Abstract: This article analyzes the thermal buckling of a functionally graded beam with piezoelectric layers, based on third-order shear deformation theory and considering various boundary conditions. The beam properties are assumed to vary continuously from the lower surface to the upper surface of the beam. The equilibrium equations are derived using the total potential energy equations, the Euler equations, the piezoelectric material constitutive equations and the assumptions of third-order shear deformation theory. To this end, the functionally graded beam with piezoelectric layers is first analyzed thoroughly under clamped-clamped boundary conditions using third-order shear deformation theory; then, after verifying the correctness of all the equations, the same beam with piezoelectric layers is analyzed under simply supported-simply supported and simply supported-clamped boundary conditions. In this article, the critical buckling temperature of the functionally graded beam is derived in two different ways, without the piezoelectric layers and with them, and the results are compared. Finally, all the conclusions obtained are compared and contrasted with the same samples under identical and distinct conditions through tables and charts. It is noteworthy that in this article the software MAPLE has been used to perform the numerical calculations.
Abstract: This paper describes a method for modeling shadow
play puppets using sophisticated computer graphics techniques
available in OpenGL in order to allow interactive play in a real-time
environment as well as producing realistic animation. A novel
real-time method is proposed for modeling the puppet
and its shadow image that allows interactive play of virtual shadow
play using texture mapping and blending techniques. Special effects
such as lighting and blurring effects for virtual shadow play
environment are also developed. Moreover, the use of geometric
transformations and hierarchical modeling facilitates interaction
among the different parts of the puppet during animation. Based on the
experiments and the survey that were carried out, the respondents
involved were very satisfied with the outcomes of these techniques.
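The essence of the hierarchical modeling mentioned above is that each puppet part is positioned by composing its local transform with its parent's; a minimal sketch with plain 3x3 homogeneous 2D matrices (hypothetical part names and offsets, independent of the paper's OpenGL implementation) is:

```python
import math

def compose(parent, local):
    """Multiply two 3x3 homogeneous 2D transforms (row-major), so a
    child part inherits its parent's motion."""
    return [[sum(parent[i][k] * local[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

def rotation(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def translation(tx, ty):
    return [[1, 0, tx], [0, 1, ty], [0, 0, 1]]

# The arm pivots at the shoulder; the hand, defined relative to the
# arm, automatically inherits the arm's rotation and position.
arm = compose(translation(0.0, 2.0), rotation(math.pi / 2))
hand = compose(arm, translation(1.0, 0.0))
```

Rotating only the arm matrix moves the hand along with it, which is exactly what makes articulated puppet animation manageable.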
Abstract: Network management techniques have long been of
interest to the networking research community. Queue size plays
a critical role in network performance. An adequate queue size
maintains Quality of Service (QoS) requirements within
limited network capacity for as many users as possible. The
appropriate estimation of the queuing model parameters is crucial for
both initial size estimation and during the process of resource
allocation. An accurate resource allocation model for the
management system increases network utilization. The present
paper demonstrates the results of empirical observation of memory
allocation for packet-based services.
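The abstract does not name the queuing model whose parameters are estimated; as a hedged illustration of how queue sizing follows from such parameters, the steady-state metrics of the textbook M/M/1 queue (an assumption, not necessarily the paper's model) can be computed as:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics of an M/M/1 queue: utilization rho, mean
    number in the system L = rho/(1 - rho), and mean time in the
    system W = L/lambda (Little's law)."""
    rho = arrival_rate / service_rate
    if rho >= 1:
        raise ValueError("queue is unstable (rho >= 1)")
    L = rho / (1 - rho)
    W = L / arrival_rate
    return rho, L, W
```

At 50 packets/s against a 100 packets/s service rate, utilization is 0.5 and the mean occupancy is one packet; as the arrival rate approaches the service rate, L grows without bound, which is why accurate parameter estimates matter for memory allocation.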
Abstract: The traditional software product and process metrics
are neither suitable nor sufficient in measuring the complexity of
software components, which ultimately is necessary for quality and
productivity improvement within organizations adopting CBSE.
Researchers have proposed a wide range of complexity metrics for
software systems. However, these metrics are not sufficient for
components and component-based system and are restricted to the
module-oriented systems and object-oriented systems. This study
proposes to measure the complexity of JavaBean software
components as a reflection of their quality, so that a component
can be adapted accordingly to make it more reusable. The proposed
metric involves only the design issues of the component and does not
consider packaging and deployment complexity. In this way, the
complexity of software components can be kept within certain limits,
which in turn helps enhance quality and productivity.
Abstract: In 2002, an amendment to SOLAS opened the way for
lightweight material constructions in vessels if the same fire safety as
in steel constructions could be obtained. FISPAT (FIreSPread
Analysis Tool) is a computer application that simulates fire spread
and fault injection in cruise vessels and identifies fire sensitive areas.
It was developed to analyze cruise vessel designs and provides a
method to evaluate network layout and safety of cruise vessels. It
allows fast, reliable and deterministic exhaustive simulations and
presents the result in a graphical vessel model. By performing the
analysis iteratively while altering the cruise vessel design it can be
used along with fire chamber experiments to show that the
lightweight design can be as safe as a steel construction and that
SOLAS regulations are fulfilled.
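FISPAT's internals are not described in the abstract; a toy abstraction of the kind of graph analysis it performs (propagating fire over a compartment adjacency graph to find the area sensitive to a given ignition point; compartment names and the breadth-first model are assumptions) might look like:

```python
from collections import deque

def fire_spread(adjacency, ignition, barriers=frozenset()):
    """Breadth-first propagation over a compartment adjacency graph:
    returns the set of compartments a fire starting at `ignition` can
    reach, skipping compartments designated as barriers."""
    reached, frontier = {ignition}, deque([ignition])
    while frontier:
        room = frontier.popleft()
        for nxt in adjacency.get(room, ()):
            if nxt not in reached and nxt not in barriers:
                reached.add(nxt)
                frontier.append(nxt)
    return reached
```

Running such a propagation exhaustively from every compartment, and repeating it while altering the design (adding barriers, rerouting connections), mirrors the iterative analysis described above.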
Abstract: Domineering is a classic two-player combinatorial
game usually played on a rectangular board. Three-player Domineering
is the three-player version of Domineering, played here on a
three-dimensional board. Experimental results are presented for x × y × z
boards with x + y + z < 10 and x, y, z ≥ 2. Also, some theoretical
results are shown for the 2 × 2 × n board with n even and n ≥ 4.
Abstract: While compressing text files is useful, compressing
still image files is almost a necessity. A typical image takes up much
more storage than a typical text message and without compression
images would be extremely clumsy to store and distribute. The
amount of information required to store pictures on modern
computers is quite large relative to the amount of bandwidth
commonly available to transmit them over the Internet. Image
compression addresses the problem of reducing the amount of data
required to represent a digital image. The performance of any image
compression method can be evaluated by measuring the root-mean-square
error (RMSE) and the peak signal-to-noise ratio (PSNR). The method of
image compression that will be analyzed in this paper is based on the
lossy JPEG image compression technique, the most popular
compression technique for color images. JPEG compression is able to
greatly reduce file size with minimal image degradation by throwing
away the least "important" information. In JPEG, both chroma
components are normally downsampled simultaneously; in this paper we
compare the results when the compression is done by
downsampling a single chroma component. We demonstrate
that a higher compression ratio is achieved when the
chrominance blue (Cb) is downsampled than when the
chrominance red (Cr) is downsampled in JPEG compression. But the
peak signal-to-noise ratio is higher when the chrominance red is
downsampled than when the chrominance blue is downsampled. In
particular, we use hats.jpg as a demonstration of JPEG
compression using a low-pass filter and show that the image is
compressed with barely any visual differences with both methods.
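The PSNR figure used to compare the two downsampling choices is a standard computation; a minimal sketch over flat pixel sequences (8-bit peak value assumed) is:

```python
import math

def psnr(original, reconstructed, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length
    pixel sequences: 10*log10(max^2 / MSE).  Identical images give
    infinite PSNR; larger values mean less degradation."""
    mse = sum((o - r) ** 2
              for o, r in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(max_value ** 2 / mse)
```

Computing this once with only Cb downsampled and once with only Cr downsampled is exactly the comparison the paper reports.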