Abstract: Adapting wireless devices to communicate within grid networks provides a wide range of possibilities. These devices create a mechanism for consumers and publishers to build modern networks with or without peer device utilization. Emerging mobile networks create new challenges in the areas of reliability, security, and adaptability. In this paper, we propose a system encompassing mobility management using AAA context transfer for mobile grid networks. This system ultimately results in seamless task processing and reduced packet loss, communication delays, bandwidth consumption, and errors.
Abstract: We introduce an extended resource leveling model that abstracts real-life projects by considering a specific work range for each resource. Contrary to traditional resource leveling problems, this model considers scarce resources and multiple objectives: the minimization of the project makespan and the leveling of each resource's usage over time. We formulate this model as a multiobjective optimization problem and propose a multiobjective genetic algorithm-based solver to optimize it. This solver consists of a two-stage process: a main stage where we obtain non-dominated solutions for all the objectives, and a postprocessing stage where we seek to specifically improve the resource leveling of these solutions. We propose an intelligent encoding for the solver that allows domain-specific knowledge to be included in the solving mechanism. The chosen encoding proves effective for solving leveling problems with scarce resources and multiple objectives. The outcomes of the proposed solver represent optimized trade-offs (alternatives) that can later be evaluated by a decision maker; this multi-solution approach is an advantage over the traditional single-solution approach. We compare the proposed solver with state-of-the-art resource leveling methods and report competitive results.
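The non-dominated trade-off set mentioned above rests on a standard Pareto-dominance test. A minimal Python sketch (the function names and the sample (makespan, leveling) pairs are illustrative, not taken from the paper):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated solutions."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# Hypothetical (makespan, resource-leveling) objective pairs, both minimized.
candidates = [(10, 5.0), (12, 3.0), (11, 6.0), (10, 3.5)]
front = pareto_front(candidates)
```

The front retains only solutions for which no other candidate is at least as good in every objective and strictly better in one; the decision maker then chooses among these alternatives.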
Abstract: The OTOP entrepreneurship that used to create a substantial source of income for local Thai communities is now in a state of exigency that requires assistance from the public sector, owing to an oversupply of duplicative ideas, an inability to adjust costs and prices, a lack of innovation, and inadequate quality control. Moreover, there is a recurring problem of middlemen who constantly corner the OTOP market. Local OTOP producers become easy prey since they do not know how to add value, how to create and maintain their own brand name, or how to create proper packaging and labeling. The suggested solutions for local OTOP producers are to adopt modern management techniques, to find the know-how to add value to products, and to resolve other marketing problems. The objectives of this research are to study the prevalent OTOP product management and to discover directions for managing OTOP products so as to enhance the effectiveness of OTOP entrepreneurship in Nonthaburi Province, Thailand. There were 113 participants in this study. The research tools can be divided into two parts: the first is a questionnaire eliciting responses on the prevalent OTOP entrepreneurship management; the second is a focus group conducted to encapsulate ideas and local wisdom. Data analysis is performed using frequency, percentage, mean, and standard deviation, as well as the synthesis of several small group discussions. The findings reveal that 1) Business resources: product quality is most important and product marketing is least important. 2) Business management: leadership is most important and raw material planning is least important. 3) Business readiness: communication is most important and packaging is least important. 4) Support from the public sector: certification from the government is most important and sourcing of raw materials is least important.
Abstract: In this paper a computer system for electromagnetic property measurements is designed. The system employs an Agilent 4294A precision impedance analyzer to measure the amplitude and the phase of a signal applied across a tested biological tissue sample. Data measured by the developed computer system can be used for tissue characterization over a wide frequency range, from 40 Hz to 110 MHz. The computer system can interface with output devices, providing a flexible testing process.
Abstract: This paper is devoted to predicting laminar and turbulent heating rates around blunt re-entry spacecraft at hypersonic conditions. Heating calculations for a hypersonic body are normally performed during the critical part of its flight trajectory. The procedure is an inverse method, in which a shock wave is assumed, and the body shape that supports this shock, as well as the flowfield between the shock and the body, are calculated. For simplicity, the normal momentum equation is replaced with a second-order pressure relation; this simplification significantly reduces computation time. The geometries specified in this research are paraboloids and ellipsoids, which may have conical afterbodies. Excellent agreement is observed between the results obtained in this paper and those reported by other researchers. Since this method is much faster than Navier-Stokes solutions, it can be used in the preliminary design and parametric study of hypersonic vehicles.
Abstract: Development of a Robust Supply Chain for a Dynamic Operating Environment. As we move further into the twenty-first century, organisations are under increasing pressure to deliver high product variation at a reasonable cost without compromising quality. In a number of cases this will take the form of a customised or high-variety, low-volume manufacturing system that requires prudent management of resources, among a number of functions, to achieve competitive advantage. Purchasing and supply chain management is one such function and, due to its substantial interaction with external elements, needs to be strategically managed. This requires a number of primary and supporting tools that enable the appropriate decisions to be made rapidly. This capability is especially vital in a dynamic environment, as it plays a pivotal role in increasing the profit margin of the product. The management of this function can be challenging by itself, and even more so for Small and Medium Enterprises (SMEs), due to the limited resources and expertise at their disposal.
This paper discusses the development of tools and concepts
towards effectively managing the purchasing and supply chain
function. The developed tools and concepts will provide a cost
effective way of managing this function within SMEs. The paper
further shows the use of these tools within Contechs, a manufacturer
of luxury boat interiors, and the associated benefits achieved as a
result of this implementation. Finally, a generic framework for use in such environments is presented.
Abstract: In this paper, a novel road extraction method using the Stationary Wavelet Transform is proposed. To detect road features from color aerial satellite imagery, Mexican hat wavelet filters are used by applying the Stationary Wavelet Transform in a multiresolution, multi-scale sense and forming the products of wavelet coefficients at different scales to locate and identify road features across a few scales. In addition, the shifting of road feature locations through multiple scales is taken into account for robust road extraction from asymmetric road feature profiles. The experimental results show that the proposed method provides a useful basis for road feature extraction. Moreover, the method is general and can be applied to other features in imagery.
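The multiscale-product idea can be illustrated in one dimension with plain NumPy: a discrete Mexican-hat filter is applied at several widths without downsampling (as in the stationary transform), and the pointwise product of the responses reinforces features present at all scales. All names and the synthetic profile below are illustrative; the paper itself works on 2-D color imagery:

```python
import numpy as np

def ricker(width, length=101):
    """Discrete Mexican-hat (Ricker) wavelet of the given width."""
    t = np.arange(length) - length // 2
    a = t / width
    return (1 - a ** 2) * np.exp(-a ** 2 / 2)

def multiscale_product(signal, widths=(2, 4, 8)):
    """Filter at several scales without downsampling (stationary-transform
    style); the pointwise product reinforces features present at all scales."""
    responses = [np.convolve(signal, ricker(w), mode="same") for w in widths]
    return np.prod(responses, axis=0)

# Synthetic 1-D intensity profile: a narrow bright "road" on a dark background.
profile = np.zeros(256)
profile[120:126] = 1.0
score = multiscale_product(profile)
```

The product score peaks where the ridge-like feature is supported at every scale, which is the mechanism exploited for road centerlines.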
Abstract: This paper describes a computer model of Quantum Field Theory (QFT), referred to in this paper as QTModel. After the initial configuration for a QFT process (e.g. scattering) is specified, the model generates the possible applicable processes in terms of Feynman diagrams and the equations for the scattering matrix, and evaluates probability amplitudes for the scattering matrix and cross sections. The computations of probability amplitudes are performed numerically. The equations generated by QTModel are provided for demonstration purposes only; they are not directly used as the basis for the computations of probability amplitudes. The computer model supports two modes for the computation of the probability amplitudes: (1) computation according to standard QFT, and (2) computation according to a proposed functional interpretation of quantum theory.
Abstract: This paper proposes and implements a core transform architecture; the core transform is one of the major processes in the HEVC video compression standard. The proposed core transform architecture is implemented with only adders and shifters instead of area-consuming multipliers. The shifters in the proposed architecture are implemented as wires and multiplexers, which significantly reduces chip area. In addition, the architecture can process blocks from 4×4 to 16×16 with common hardware by reusing processing elements. The designed core transform architecture, implemented in 0.13 µm technology, can process a 16×16 block with a 2-D transform in 130 cycles, and its gate count is 101,015 gates.
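The adder/shifter idea can be sketched for the 4-point case: the HEVC 4-point core transform matrix uses only the constants 64, 83, and 36, each of which decomposes into a few shifts and adds. A hedged Python sketch of that decomposition (the paper's hardware additionally reuses processing elements for the 8×8 and 16×16 cases, which is not shown here):

```python
def mul64(x): return x << 6                              # 64·x
def mul83(x): return (x << 6) + (x << 4) + (x << 1) + x  # (64+16+2+1)·x
def mul36(x): return (x << 5) + (x << 2)                 # (32+4)·x

def hevc_forward_4pt(v):
    """4-point HEVC core transform of one column using only shifts and adds."""
    x0, x1, x2, x3 = v
    e0, e1 = x0 + x3, x1 + x2   # even-part butterfly
    o0, o1 = x0 - x3, x1 - x2   # odd-part butterfly
    return [mul64(e0) + mul64(e1),
            mul83(o0) + mul36(o1),
            mul64(e0) - mul64(e1),
            mul36(o0) - mul83(o1)]
```

The butterfly halves the number of constant multiplications, and each remaining constant costs at most three additions of shifted operands, which is why no hardware multiplier is needed.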
Abstract: This paper provides a scheme to improve the read efficiency of the anti-collision algorithm in the EPCglobal UHF Class-1 Generation-2 RFID standard. In this standard, dynamic frame slotted ALOHA is specified to solve the anti-collision problem, and the Q-algorithm, with a key parameter C, is adopted to dynamically adjust the frame sizes. In this paper, we split the C parameter into two parameters to increase the read speed and derive the optimal values of the two parameters through simulations. The results indicate that our method outperforms the original Q-algorithm.
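As a rough sketch of the mechanism being tuned, a simplified Gen-2 inventory with the Q-algorithm's floating-point counter Qfp, and with the single constant C split into separate collision and idle weights as the paper proposes, might look like the following. The parameter values and the per-slot redraw are illustrative simplifications of the Gen-2 specification, not the paper's simulator:

```python
import random

def inventory(n_tags, qfp, c_coll, c_idle, q_max=15.0, max_slots=10000):
    """Simplified Gen-2 inventory round: in each slot every unread tag replies
    with probability 1/2^Q; Qfp grows by c_coll on a collision and shrinks by
    c_idle on an idle slot, and Q = round(Qfp) sets the next frame size."""
    unread = n_tags
    slots_used = 0
    while unread > 0 and slots_used < max_slots:
        q = int(round(qfp))
        replies = sum(1 for _ in range(unread) if random.randrange(2 ** q) == 0)
        slots_used += 1
        if replies == 1:
            unread -= 1                       # successful singleton read
        elif replies > 1:
            qfp = min(q_max, qfp + c_coll)    # collision: enlarge the frame
        else:
            qfp = max(0.0, qfp - c_idle)      # idle: shrink the frame
    return slots_used

random.seed(7)
slots = inventory(n_tags=40, qfp=4.0, c_coll=0.35, c_idle=0.15)
```

Splitting C lets the collision and idle reactions be weighted differently, which is the degree of freedom the paper optimizes through simulation.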
Abstract: This study develops a model to explore the factors influencing management and technology capabilities in strategic alliances. Alliances between firms are increasingly recognized as a popular vehicle to create and extract greater value from the market. A firm's alliance can be described as a collaborative process for solving problems jointly. This study starts from the research question of which characteristics of a firm's management and technology affect the performance of firms that form alliances. In this study, we investigated the effect of strategic alliances on company performance; that is, we try to identify whether firms that have formed alliances with other organizations differ in their management and technology characteristics. We also test whether alliance type and alliance experience moderate the relationship between a firm's capabilities and its performance. We employ the problem-solving perspective and the resource-based view to shed light on these research questions. The empirical work is based on the Survey of Business Activities conducted from 2006 to 2008 by Statistics Korea. We verify the relevant correlations and point out that these results contribute new empirical evidence on the effect of strategic alliances on company performance.
Abstract: Ant colony optimization (ACO) is an ant algorithm framework that takes inspiration from the foraging behavior of ant colonies. Indeed, ACO algorithms use a form of chemical communication, represented by pheromone trails, to build good solutions. However, real ants use several communication channels to interact. This paper therefore introduces acoustic communication between ants while they are foraging. This process allows fine, local exploration of the search space and permits the best solution found to be improved.
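For reference, the baseline pheromone mechanism that the acoustic channel would extend can be sketched as one ACO iteration on a tiny TSP instance. All parameter values are illustrative, and the paper's acoustic channel is not modeled here:

```python
import random

def aco_iteration(dist, pher, n_ants=8, alpha=1.0, beta=2.0, rho=0.5):
    """One ACO iteration on a small TSP: ants build tours guided by pheromone
    (weight alpha) and inverse distance (weight beta); trails then evaporate
    and each ant deposits pheromone proportional to 1/tour_length."""
    n = len(dist)
    tours = []
    for _ in range(n_ants):
        tour = [0]
        while len(tour) < n:
            i = tour[-1]
            choices = [j for j in range(n) if j not in tour]
            weights = [pher[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                       for j in choices]
            tour.append(random.choices(choices, weights)[0])
        length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
        tours.append((length, tour))
    for i in range(n):
        for j in range(n):
            pher[i][j] *= 1.0 - rho              # evaporation
    for length, tour in tours:
        for k in range(n):
            a, b = tour[k], tour[(k + 1) % n]
            pher[a][b] += 1.0 / length           # symmetric deposit
            pher[b][a] += 1.0 / length
    return min(tours)[0]

random.seed(3)
dist = [[0, 1, 2, 3], [1, 0, 1, 2], [2, 1, 0, 1], [3, 2, 1, 0]]
pher = [[1.0] * 4 for _ in range(4)]
best = min(aco_iteration(dist, pher) for _ in range(5))
```

An acoustic channel, as proposed, would add a second, local signal on top of the pheromone matrix to refine solutions in the neighborhood of good tours.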
Abstract: The aim of the current work is to present a comparison among three popular optimization methods for the inverse elastostatics problem (IESP) of flaw detection within a solid. In more detail, the performance of a simulated annealing, a Hooke & Jeeves, and a sequential quadratic programming algorithm was studied in the test case of one circular flaw in a plate, solved by both the boundary element method (BEM) and the finite element method (FEM). The optimization methods use a cost function based on the displacements of the static response. The methods were ranked according to the number of iterations required to converge and their ability to locate the global optimum. Hence, a clear impression of the performance of the aforementioned algorithms in flaw identification problems was obtained. Furthermore, the coupling of BEM or FEM with these optimization methods was investigated in order to track differences in their performance.
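Of the three methods compared, the Hooke & Jeeves pattern search is simple enough to sketch in a few lines. A minimal derivative-free version on a toy cost function (illustrative only, not the authors' implementation or their displacement-based cost):

```python
def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6):
    """Minimal Hooke & Jeeves pattern search (derivative-free minimization)."""
    def explore(base):
        x = list(base)
        for i in range(len(x)):
            for d in (step, -step):      # probe each coordinate both ways
                trial = list(x)
                trial[i] += d
                if f(trial) < f(x):
                    x = trial
                    break
        return x

    base = list(x0)
    while step > tol:
        new = explore(base)
        if f(new) < f(base):
            # pattern move: extrapolate along the successful direction
            cand = explore([2 * n - b for n, b in zip(new, base)])
            base = cand if f(cand) < f(new) else new
        else:
            step *= shrink               # no improvement: refine the mesh
    return base

# Toy quadratic standing in for a displacement-mismatch cost function.
x = hooke_jeeves(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2, [0.0, 0.0])
```

In the flaw-detection setting the design variables would be the flaw's position and radius, and `f` would compare computed and measured displacements.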
Abstract: The industrial process of sugar cane crystallization produces a residual that still contains a great deal of soluble sucrose, and the objective of the factory is to improve its extraction. The substantial losses involved justify the search for an optimization of the process. The crystallization process studied on the industrial site is based on the "three-massecuite process". The third step of this process constitutes the final stage of exhaustion of the sucrose dissolved in the mother liquor. Within this third step (C-crystallization), the phase that is studied, and whose control is to be improved, is the crystal growth phase. The study of this process on the industrial site is a problem in its own right. A control scheme is proposed to improve on the standard PID control law used in the factory: an auto-tuning PID controller based on the instantaneous linearization of a neural network.
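The baseline being improved, a discrete PID loop, can be sketched as follows. The first-order plant below is a stand-in for the crystal-growth dynamics, and all gains are illustrative rather than the factory's tuning:

```python
def pid_step(state, setpoint, y, kp, ki, kd, dt):
    """One update of a textbook discrete PID controller.
    `state` carries (integral, previous_error) between calls."""
    integral, prev_err = state
    err = setpoint - y
    integral += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * deriv
    return u, (integral, err)

# Hypothetical first-order plant y' = (u - y)/tau standing in for the
# crystal-growth dynamics; gains are illustrative, not the factory tuning.
tau, dt = 5.0, 0.1
y, state = 0.0, (0.0, 0.0)
for _ in range(2000):
    u, state = pid_step(state, setpoint=1.0, y=y, kp=2.0, ki=0.5, kd=0.1, dt=dt)
    y += (u - y) / tau * dt
```

The auto-tuning scheme in the paper would replace the fixed gains with gains recomputed at each step from an instantaneous linearization of a neural network model of the plant.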
Abstract: Bumpers play an important role in preventing impact energy from being transferred to the automobile and its passengers. Storing the impact energy in the bumper and then releasing it to the environment reduces damage to the automobile and injury to the passengers.
The goal of this paper is to design a bumper of minimum weight employing Glass Mat Thermoplastic (GMT) materials. This bumper either absorbs the impact energy through its deformation or transfers it perpendicular to the impact direction.
To reach this aim, a mechanism is designed to convert about 80% of the kinetic impact energy into spring potential energy and release it to the environment at low impact velocities, in accordance with the relevant American standard. In addition, since the residual kinetic energy is damped by the infinitesimal elastic deformation of the bumper elements, the passengers will not sense any impact. It should be noted that in this paper, the modeling, solving, and results analysis are done in CATIA, LS-DYNA, and ANSYS V8.0 software, respectively.
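The 80% energy split can be illustrated with a quick back-of-the-envelope calculation. The mass, velocity, and spring travel below are assumed values for illustration, not figures from the paper:

```python
m_car = 1200.0   # vehicle mass, kg (assumed)
v = 1.1          # low-speed impact velocity, m/s (assumed)
eta = 0.80       # fraction of kinetic energy converted to spring potential

e_kinetic = 0.5 * m_car * v ** 2   # impact kinetic energy, J
e_spring = eta * e_kinetic         # energy stored in the spring, J
x = 0.05                           # assumed spring compression, m
k = 2.0 * e_spring / x ** 2        # stiffness from E = (1/2) k x^2, N/m
```

The remaining 20% of the kinetic energy is what the elastic deformation of the bumper elements must damp.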
Abstract: The true stress-strain curve of railhead steel is required to investigate the behaviour of the railhead under wheel loading through elasto-plastic Finite Element (FE) analysis. To reduce the rate of wear, the railhead material is hardened through annealing and quenching. The Australian standard rail sections are not fully hardened and hence suffer from a non-uniform distribution of material properties; using average properties in the FE modelling can potentially induce error in the predicted plastic strains. Coupons obtained at varying depths of the railhead were, therefore, tested under axial tension, and the strains were measured using strain gauges as well as an image analysis technique known as Particle Image Velocimetry (PIV). The head-hardened steel exhibits three distinct zones of yield strength; expressed as ratios of the average yield strength provided in the standard (σyr = 780 MPa), with the corresponding depths as fractions of the head-hardened zone along the axis of symmetry, these are: (1.17 σyr, 20%), (1.06 σyr, 20%-80%) and (0.71 σyr, > 80%). The stress-strain curves exhibit a limited plastic zone, with fracture occurring at strains less than 0.1.
Abstract: This paper presents a wavelet transform and Support Vector Machine (SVM) based algorithm for estimating fault location on transmission lines. The Discrete Wavelet Transform (DWT) is used for data pre-processing, and these data are used for training and testing the SVM. Five types of mother wavelet are used for signal processing to identify the wavelet family most appropriate for use in estimating fault location. The results demonstrate the ability of the SVM to generalize from the provided patterns and to accurately estimate the location of faults with varying fault resistance.
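The kind of pre-processing described, per-level detail energies from a DWT, can be sketched with the Haar wavelet, the simplest mother-wavelet family (the paper compares five families, and the signal below is synthetic):

```python
import math

def haar_dwt(signal):
    """One level of the Haar DWT: approximation and detail coefficients."""
    s = 1.0 / math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def detail_energy(signal, levels=3):
    """Per-level detail energies, a common feature vector for fault estimators."""
    feats, a = [], list(signal)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append(sum(x * x for x in d))
    return feats

# Impulse-like transient standing in for a fault-generated travelling wave.
sig = [0.0] * 8 + [5.0] + [0.0] * 7
features = detail_energy(sig)
```

Feature vectors of this form, computed from fault-recorded voltage or current signals, are what an SVM regressor would be trained on to map transients to fault distances.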
Abstract: Modeling complex dynamic systems, for which mathematical models are very complicated to establish, requires new and modern methodologies that exploit existing expert knowledge, human experience, and historical data. Fuzzy cognitive maps are very suitable, simple, and powerful tools for the simulation and analysis of such dynamic systems. However, human experts are subjective and can handle only relatively simple fuzzy cognitive maps; therefore, there is a need to develop new approaches for the automated generation of fuzzy cognitive maps from historical data. In this study, a new learning algorithm, called Big Bang-Big Crunch, is proposed for the first time in the literature for the automated generation of fuzzy cognitive maps from data. Two real-world examples, namely a process control system and a radiation therapy process, and one synthetic model are used to demonstrate the effectiveness and usefulness of the proposed methodology.
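The Big Bang-Big Crunch loop itself alternates random scattering with contraction to a fitness-weighted center of mass. A minimal sketch on a toy minimization problem (population size, radius schedule, and the inverse-fitness weighting, which assumes non-negative fitness, are illustrative choices, not the paper's FCM-learning setup):

```python
import random

def big_bang_big_crunch(f, dim, bounds=(-5.0, 5.0), pop=30, iters=100):
    """Minimal Big Bang-Big Crunch optimizer: scatter candidates around the
    current center (big bang), contract to a fitness-weighted center of mass
    (big crunch), and shrink the scatter radius each iteration."""
    lo, hi = bounds
    center = [random.uniform(lo, hi) for _ in range(dim)]
    best, best_fit = list(center), f(center)
    for it in range(1, iters + 1):
        pts = []
        for _ in range(pop):
            # big bang: normal scatter whose radius decays as 1/iteration
            p = [min(hi, max(lo, c + random.gauss(0, 1) * (hi - lo) / (2 * it)))
                 for c in center]
            pts.append((f(p), p))
        # big crunch: center of mass weighted by inverse fitness (f >= 0 assumed)
        w = [1.0 / (fit + 1e-12) for fit, _ in pts]
        total = sum(w)
        center = [sum(wi * p[i] for wi, (_, p) in zip(w, pts)) / total
                  for i in range(dim)]
        fit, p = min(pts)
        if fit < best_fit:
            best_fit, best = fit, list(p)
    return best, best_fit

random.seed(42)
best, best_fit = big_bang_big_crunch(lambda x: sum(v * v for v in x), dim=2)
```

For FCM learning, the candidate vector would hold the map's weight-matrix entries and `f` would measure the mismatch between simulated and historical concept trajectories.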
Abstract: The log periodogram regression is widely used in empirical applications because of its simplicity (only a least squares regression is required to estimate the memory parameter d), its good asymptotic properties, and its robustness to misspecification of the short-term behavior of the series. However, the asymptotic distribution is a poor approximation of the (unknown) finite sample distribution if the sample size is small. Here, the finite sample performance of different nonparametric residual bootstrap procedures is analyzed when applied to construct confidence intervals. In particular, in addition to the basic residual bootstrap, the local and block bootstraps, which may adequately replicate the structure that can arise in the errors of the regression when the series shows weak dependence in addition to the long memory component, are considered. A bias-correcting bootstrap to adjust for the bias caused by that structure is also considered. Finally, the performance of the bootstrap-based confidence intervals in the log periodogram regression is assessed for different types of models, examining how performance changes as the sample size increases.
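The estimator and the basic residual bootstrap discussed above can be sketched in NumPy: regress the log periodogram on −2 log(2 sin(λ/2)) over the first m Fourier frequencies, then resample the centered residuals to build a percentile interval for d. The bandwidth m = √n and all other choices below are illustrative, not the paper's designs:

```python
import numpy as np

def gph_fit(x, m=None):
    """GPH log-periodogram regression: slope estimate of the memory
    parameter d over the first m Fourier frequencies."""
    n = len(x)
    m = m or int(n ** 0.5)
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    periodogram = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
    y = np.log(periodogram)
    reg_c = -2 * np.log(2 * np.sin(lam / 2))   # regressor for d
    reg_c = reg_c - reg_c.mean()
    d = np.sum(reg_c * (y - y.mean())) / np.sum(reg_c ** 2)
    resid = y - y.mean() - d * reg_c
    return d, reg_c, y, resid

def residual_bootstrap_ci(x, b=500, alpha=0.05, seed=0):
    """Basic residual bootstrap: resample centered residuals, refit the
    regression, and take percentile quantiles as a confidence interval."""
    rng = np.random.default_rng(seed)
    d, reg_c, y, resid = gph_fit(x)
    resid = resid - resid.mean()
    denom = np.sum(reg_c ** 2)
    ds = []
    for _ in range(b):
        y_star = (y.mean() + d * reg_c) + rng.choice(resid, size=len(resid))
        ds.append(np.sum(reg_c * (y_star - y_star.mean())) / denom)
    lo, hi = np.quantile(ds, [alpha / 2, 1 - alpha / 2])
    return d, (lo, hi)

# White noise has true memory parameter d = 0.
x = np.random.default_rng(1).standard_normal(512)
d_hat, (lo, hi) = residual_bootstrap_ci(x)
```

The local and block variants studied in the paper differ only in how the residuals are resampled (from nearby frequencies, or in blocks) so as to mimic dependence in the regression errors.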
Abstract: This paper presents a distributed intrusion detection system (IDS) based on the concept of a community of specialized distributed agents, that is, agents sharing the same purpose of detecting distributed attacks. The semantics of intrusion events occurring in a predetermined network are defined. Correlation rules describe the process by which the proposed IDS combines captured events that are distributed both spatially and temporally; the IDS then tries to extract significant and broad patterns for a set of well-known attacks. The primary goal of our work is to provide intrusion detection and real-time prevention capability against insider attacks in distributed and fully automated environments.