Abstract: In the 3D-wavelet video coding framework, temporal
filtering is performed along the trajectory of motion using Motion
Compensated Temporal Filtering (MCTF). A computationally
efficient motion estimation technique is therefore essential for MCTF. In this
paper, a predictive technique is proposed to reduce the
computational complexity of the MCTF framework by exploiting
the high correlation among the frames in a Group of Pictures (GOP).
The proposed technique applies the coarse and fine searches of any fast
block-based motion estimation algorithm only to the first pair of frames in a
GOP. The generated motion vectors are supplied to the subsequent
pairs of frames, even at subsequent temporal levels, and only a fine
search is carried out around those predicted motion vectors. Hence,
the coarse search is skipped for all motion estimations in a GOP
except for the first pair of frames. The technique has been tested with
different fast block-based motion estimation algorithms over different
standard test sequences using MC-EZBC, a state-of-the-art scalable
video coder. The simulation results reveal a substantial reduction
(20.75% to 38.24%) in the number of search points during motion
estimation, without compromising the quality of the reconstructed
video compared to non-predictive techniques. Since the motion
vectors of every pair of frames in a GOP except the first pair lie
within ±1 of the motion vectors of the previous pair of
frames, the number of bits required for motion vectors is also
reduced by 50%.
Abstract: Research on standard product models and the
development of neutral manufacturing interfaces for numerically
controlled machines has been a significant topic for the last 25 years.
In this paper, a detailed description of STEP implementation for
turn-mill manufacturing is discussed. It shows the information
content requirements of the ISO 14649 data model and describes
the design of a STEP-NC framework applicable to turn-mill
manufacturing. In the framework, the EXPRESS-G and UML modeling
tools are used to depict the information contents of the system and
to establish the basis of the information model requirements, yielding
a product and manufacturing data model applicable to STEP-compliant
manufacturing. The requirements of next-generation turn-mill
operations have been represented by a UML diagram. Object-oriented
classes of ISO 14649 have been developed on the Visual Basic .NET
platform to bind the static information model represented
by the UML diagram. An architecture of the proposed system
implementation is given on the basis of the established design and
manufacturing modules of the STEP-NC interface. Finally, a
Part 21 file process plan is generated to illustrate turn-mill
components.
Abstract: Elliptic curve-based certificateless signatures are slowly
gaining attention due to their ability to retain the efficiency of
identity-based signatures in eliminating the need for certificate
management, while not suffering from the inherent private
key escrow problem. Generally, cryptosystems based on elliptic
curves offer equivalent security strength at smaller key sizes
compared to conventional cryptosystems such as RSA, which
results in faster computations and more efficient use of computing
power, bandwidth, and storage. This paper proposes implementing
a certificateless signature based on bilinear pairing to
structure the framework of IKE authentication. In this paper,
we perform a comparative analysis of the certificateless signature
scheme with the well-known RSA scheme and also present
experimental results in terms of signing and verification
execution times. By generalizing our observations, we discuss the
different trade-offs involved in implementing IKE authentication
using certificateless signatures.
Abstract: Buckling-Restrained Braced Frames (BRBFs)
are a new type of steel seismic-load-resisting system that has found
use in several countries because of its efficiency and its promise of
seismic performance far superior to that of conventional braced
frames. The system is addressed in the 2005 edition of the AISC
Seismic Provisions for Structural Steel Buildings, and a set of design
provisions has also been developed by NEHRP. This report illustrates the
seismic design of buckling-restrained braced frames and compares
the design results under earthquake loading for ordinary
bracing systems and buckling-restrained bracing systems, in order to assess the
advantages and disadvantages of this new type of seismic-resisting
system in comparison with the older Ordinary Concentric Braced
Frame (OCBF) systems; both are defined by the provisions
governing their design.
Abstract: The current study presents a modeling framework to determine the torsional strength of an induction-hardened splined shaft by considering geometry and material aspects, with the aim of optimizing the static torsional strength through the selection of spline geometry and hardness depth. Six different spline geometries and seven different hardness profiles, including non-hardened and through-hardened shafts, have been considered. The results reveal that the torque causing initial yielding of the induction-hardened splined shaft is strongly dependent on the hardness depth and the geometry of the spline teeth. Guidelines for selecting the appropriate hardness depth and spline geometry are given such that an optimum static torsional strength of the component can be achieved.
Abstract: Wireless Capsule Endoscopy (WCE) has rapidly
found wide application in the medical domain over the last ten years,
thanks to its noninvasiveness for patients and its support for thorough
inspection through a patient's entire digestive system, including the
small intestine. However, one of the main barriers to an efficient
clinical inspection procedure is that it requires a large amount of
effort for clinicians to inspect the huge volume of data collected during the
examination, i.e., over 55,000 frames per video. In this paper, we
propose a method to compute meaningful motion changes of the
WCE by analyzing the obtained video frames based on regional
optical flow estimations. The computed motion vectors are used to
remove duplicate video frames caused by the imaging nature of the WCE,
such as repetitive forward-backward motions from peristaltic
movements. The motion vectors are derived by calculating
directional component vectors in four local regions. Our
experiments are performed on the small intestine area, which is of
main interest to clinical experts when using WCEs, and our
experimental results show significant frame reductions compared
with a simple frame-to-frame similarity-based image reduction
method.
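The four-region directional summary can be sketched as follows; the quadrant layout, the sparse flow representation, and the duplicate threshold are assumptions for illustration, not the paper's exact parameters:

```python
# Hedged sketch of regional motion summarization: average the optical-flow
# vectors in four quadrants and flag a frame as a near-duplicate when all
# regional mean motions stay below a threshold.

def regional_motion(flow, width, height):
    """flow: dict mapping (x, y) -> (dx, dy) sparse optical-flow vectors.
    Returns the mean motion vector of each of the four quadrants."""
    sums = {q: [0.0, 0.0, 0] for q in range(4)}  # dx sum, dy sum, count
    for (x, y), (dx, dy) in flow.items():
        q = (1 if x >= width // 2 else 0) + (2 if y >= height // 2 else 0)
        sums[q][0] += dx
        sums[q][1] += dy
        sums[q][2] += 1
    return [(s[0] / s[2], s[1] / s[2]) if s[2] else (0.0, 0.0)
            for s in (sums[q] for q in range(4))]

def is_duplicate(regional, threshold=0.5):
    """A frame adds little information when every region's mean motion
    magnitude stays below the (assumed) threshold."""
    return all((dx * dx + dy * dy) ** 0.5 < threshold for dx, dy in regional)

# Toy flow field: almost no motion in any quadrant -> flagged as duplicate.
flow = {(10, 10): (0.1, 0.0), (50, 10): (0.0, 0.2),
        (10, 50): (-0.1, 0.1), (50, 50): (0.2, -0.1)}
print(is_duplicate(regional_motion(flow, 64, 64)))  # True
```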
Abstract: An original Direct Numerical Simulation (DNS) method to tackle the problem of particulate flows at moderate to high concentration and finite Reynolds number is presented. Our method is built on the framework established by Glowinski and his coworkers [1] in the sense that we use their Distributed Lagrange Multiplier/Fictitious Domain (DLM/FD) formulation and their operator-splitting idea, but differs in the treatment of particle collisions. The novelty of our contribution lies in replacing the simple artificial-repulsive-force collision model usually employed in the literature with an efficient Discrete Element Method (DEM) granular solver. The use of our DEM solver enables us to consider particles of arbitrary (at least convex) shape and to account for actual contacts, in the sense that particles actually touch each other, in contrast with the simple repulsive-force collision model. We recently upgraded our serial code, GRIFF [2], to full MPI capabilities. Our new code, PeliGRIFF, is developed within the framework of the fully MPI open source platform PELICANS [3]. The new MPI capabilities of PeliGRIFF open new perspectives in the study of particulate flows and significantly increase the number of particles that can be considered in a full DNS approach: O(100000) in 2D and O(10000) in 3D. Results on the 2D/3D sedimentation/fluidization of isometric polygonal/polyhedral particles with collisions are presented.
Abstract: With the rapid popularization of internet services, it is apparent that the next generation of terrestrial communication systems must be capable of supporting various applications such as voice, video, and data. This paper presents the performance evaluation of turbo-coded mobile terrestrial communication systems, which are capable of providing high-quality services for delay-sensitive (voice or video) and delay-tolerant (text transmission) multimedia applications in urban and suburban areas. Different types of multimedia information require different service qualities, which are generally expressed in terms of a maximum acceptable bit-error-rate (BER) and a maximum tolerable latency. The breakthrough discovery of turbo codes allows us to significantly reduce the probability of bit errors with feasible latency. In a turbo-coded system, a trade-off between latency and BER results from the choice of convolutional component codes, interleaver type and size, decoding algorithm, and the number of decoding iterations. This trade-off can be exploited for multimedia applications by using optimal and suboptimal combinations of performance parameters to achieve different service qualities. These results therefore suggest an adaptive framework for turbo-coded wireless multimedia communications that incorporates a set of performance parameters achieving an appropriate set of service qualities, depending on the application's requirements.
Abstract: Influence Diagrams (IDs) are a kind of probabilistic belief network for graphical modeling. The use of IDs can improve communication among field experts, modelers, and decision makers by presenting the issue under discussion from a high-level point of view. This paper enhances the Time-Sliced Influence Diagrams (TSIDs, also called Dynamic IDs) formalism from a Discrete Event Systems Modeling and Simulation (DES M&S) perspective, for Exploring Analysis (EA) modeling. The enhancements enable a modeler to specify the occurrence times of endogenous events dynamically, with stochastic sampling as the model runs, and to describe the inter-influences among them with variable nodes in dynamic situations that the existing TSIDs fail to capture. The new class of model is named Dynamic-Stochastic Influence Diagrams (DSIDs). The paper includes a description of the modeling formalism and the hierarchical simulators implementing its simulation algorithm, and presents a case study to illustrate the enhancements.
Abstract: Corner detection and optical flow are common techniques for feature-based video stabilization. However, these algorithms are computationally expensive and must be performed at a reasonable rate. This paper presents an algorithm for discarding irrelevant feature points and maintaining the remainder for future use so as to reduce the computational cost. The algorithm starts by initializing a maintained set. The feature points in the maintained set are examined for their accuracy for modeling. Corner detection is required only when the feature points are insufficiently accurate for further modeling. Then, optical flows are computed from the maintained feature points toward the consecutive frame. After that, a motion model is estimated based on the simplified affine motion model and the least-squares method, with outliers belonging to moving objects present. Studentized residuals are used to eliminate such outliers. The model estimation and elimination processes repeat until no more outliers are identified. Finally, the entire algorithm repeats along the video sequence, with the points remaining from the previous iteration used as the maintained set. As a practical application, efficient video stabilization can be achieved by exploiting the computed motion models. Our study shows that the number of times corner detection needs to be performed is greatly reduced, thus significantly reducing the computational cost. Moreover, optical flow vectors are computed only for the maintained feature points, not for outliers, which also reduces the computational cost. In addition, the reduced set of feature points is sufficient for tracking background objects, as demonstrated in a simple video stabilizer based on our proposed algorithm.
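The estimate-then-eliminate loop can be sketched as follows; for brevity the sketch uses a one-dimensional translation-only model rather than the simplified affine model, and the studentized-residual cutoff is an assumed value:

```python
# Hedged sketch of iterative least-squares fitting with studentized-residual
# outlier elimination: fit a constant displacement, drop points whose
# studentized residual exceeds a cutoff, and repeat until none remain.

def estimate_translation(displacements, cutoff=1.5):
    """Least-squares fit of a constant displacement model (assumes at
    least two points). Points whose studentized residual exceeds
    `cutoff` (an illustrative choice) are removed; the fit repeats
    until no outliers remain. Returns (model, inliers)."""
    pts = list(displacements)
    while True:
        n = len(pts)
        mean = sum(pts) / n                       # least-squares solution
        resid = [d - mean for d in pts]
        s = (sum(r * r for r in resid) / (n - 1)) ** 0.5
        if s == 0:
            return mean, pts                      # perfect fit
        keep = [d for d, r in zip(pts, resid) if abs(r) / s <= cutoff]
        if len(keep) == len(pts):                 # no outliers left
            return mean, pts
        pts = keep

# Background points move by ~1 px; two points on a moving object are
# outliers and get eliminated over the iterations.
disp = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 8.0, 7.5]
model, inliers = estimate_translation(disp)
print(round(model, 2), len(inliers))  # 1.0 6
```

The same loop structure carries over to the affine case, with the mean replaced by an affine least-squares solve over point correspondences.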
Abstract: Dynamic spectrum allocation solutions such as
cognitive radio networks have been proposed as a key technology to
exploit frequency segments that are spectrally underutilized.
Cognitive radio users operate as secondary users who must
constantly and rapidly sense the presence of primary users, or
licensees, in order to utilize their frequency bands when they are inactive. Short
sensing cycles should be run by the secondary users to achieve
higher throughput rates as well as to cause a low level of interference
to the primary users by immediately vacating their channels once
they are detected. In this paper, the throughput-sensing time
relationship in local and cooperative spectrum sensing is
investigated under two distinct scenarios, namely, constant primary
user protection (CPUP) and constant secondary user spectrum
usability (CSUSU). The simulation results show that the
design of the sensing slot duration is very critical and depends on the
number of cooperating users under the CPUP scenario, whereas under
CSUSU, adding more cooperating users has no effect once the sensing
time exceeds 5% of the total frame duration.
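The throughput-sensing time trade-off can be illustrated with a toy model; the exponential false-alarm decay and the constant k are assumptions, not the paper's equations, but they reproduce the qualitative behaviour that more cooperating users shift the optimum toward shorter sensing slots:

```python
# Illustrative sensing-throughput trade-off: the secondary user transmits
# for the fraction of the frame left after sensing, discounted by the
# false-alarm probability, which is assumed to decay exponentially with
# the total sensing effort (sensing time x number of cooperating users).
import math

def normalized_throughput(t_sense, t_frame, n_users, k=50.0):
    """Fraction of the frame usable for secondary transmission."""
    p_false_alarm = math.exp(-k * t_sense * n_users)  # assumed decay model
    return (t_frame - t_sense) / t_frame * (1.0 - p_false_alarm)

t_frame = 0.1  # 100 ms frame (assumed)
for n in (1, 4, 8):
    # Grid search for the sensing time that maximizes throughput.
    best = max((normalized_throughput(ts, t_frame, n), ts)
               for ts in (t_frame * i / 1000 for i in range(1, 1000)))
    print(n, round(best[1] / t_frame * 100, 1))  # optimal sensing slot, %
```

Running the sketch shows the optimal sensing slot shrinking as cooperating users are added, mirroring the CPUP-scenario dependence reported above.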
Abstract: Structural Integrity Management (SIM) is
important for the protection of offshore crews, the environment, business assets, and company and industry reputation. API RP 2A contains guidelines for the assessment of existing platforms, mostly for the Gulf
of Mexico (GOM). The ISO 19902 SIM framework also does not
specifically cater for Malaysia. There are about 200 platforms in
Malaysia, with 90 exceeding their design life. Petronas Carigali
Sdn Bhd (PCSB) uses the Asset Integrity Management System and
a rather subjective risk-based inspection program for these
platforms. Petronas currently does not have a standalone Petronas
Technical Standard, PTS-SIM. This study proposes a recommended
practice for the SIM process for offshore structures in Malaysia,
incorporating studies by API and ISO together with local elements such as the
number of platforms, types of facilities, age, and risk ranking. A case
study on the SMG-A platform in Sabah reveals missing or scattered
platform data and a gap in the inspection history; the platform is to undergo a Level
3 underwater inspection in 2015.
Abstract: This paper provides a key-driver-based conceptual framework that can be used to improve a firm's success in commercializing technology and in new product innovation resulting from collaboration with other organizations through strategic alliances. Based on a qualitative study using an interview approach, strategic alliances of entrepreneurs in the food processing industry in Thailand are explored. This paper describes factors affecting decisions to collaborate through alliances. It identifies four issues: maintaining the efficiency of the value chain for production capability, adapting to present and future competition, careful assessment of the value of outcomes, and management of innovation. We consider five driving factors: resource orientation, assessment of risk, business opportunity, sharing of benefits, and confidence in alliance partners. These factors will be of interest to entrepreneurs and policy makers seeking further understanding of the direction of business strategies.
Abstract: In recent years, sustainable supply chain management
(SSCM) has been widely researched in the academic domain. However,
due to the traditional operational role and the complexity of supply
chain management in the cement industry, relatively little
research has been conducted on cement supply chain simulation
integrated with sustainability criteria. This paper analyses cement
supply chain operations using the push-pull supply chain
framework and the Life Cycle Assessment (LCA) methodology in an
integrated approach, and proposes three supply chain scenarios
based on Make-To-Stock (MTS), Pack-To-Order (PTO), and Grind-
To-Order (GTO) strategies. A Discrete-Event Simulation (DES)
model of the SSCM is constructed using Arena software to implement
the three target scenarios. We conclude from the simulation results
that GTO is the optimal supply chain strategy, demonstrating the
best economic, ecological, and social performance in the cement
industry.
Abstract: This paper presents an optimization of the hull
separation, i.e., the transverse clearance. The main objective is to identify
the feasible speed ranges and find the optimum transverse clearance
with respect to minimum wave-making resistance. The dimensions
and weight of the hardware systems installed in the catamaran-structured,
fuel-cell-powered USV (Unmanned Surface Vehicle) were
considered as constraints. FRIENDSHIP-Framework was used as the
CAE (Computer Aided Engineering) platform. Hull surface
modeling, DoE (Design of Experiments), tangent search optimization,
tool integration, and process automation were performed in
FRIENDSHIP-Framework. The hydrodynamic results were evaluated
by XPAN, the potential-flow solver of SHIPFLOW.
Abstract: Knowledge is renowned as a significant component
for sustaining competitive advantage and provides a leading edge in
business. This study emphasizes the proper and effective
utilization of internal and external knowledge (both explicit and tacit)
coming from stakeholders, which strongly supports organizations in
combating challenges and enhancing productivity.
Furthermore, it proposes a model in the context of the IRSA framework
that facilitates the flow of knowledge and
experience sharing among employees in the organization. The discussion
section presents an innovative model that incorporates all the functionality
identified in the analysis section.
Abstract: A coordinated supply chain presents major challenges
for the different actors involved, because each agent responds to
individual interests. This paper presents a framework based on the
reviewed literature regarding the system's decision structure and the
nature of demand. It then characterizes an agri-food supply chain in
the Central Region of Colombia, which corresponds to a decentralized
distribution system with stochastic demand. Finally, the paper
recommends coordinating the chain based on shared information and
on mechanisms for each agent, such as a VMI (vendor-managed inventory)
strategy for the farmer-buyer relationship, an information system for
farmers, and contracts for transportation service providers.
Abstract: Nowadays scientific data is inevitably digital and
stored in a wide variety of formats in heterogeneous systems.
Scientists need to access an integrated view of remote or local
heterogeneous data sources with advanced data accessing, analyzing,
and visualization tools. This research suggests the use of Service
Oriented Architecture (SOA) to integrate biological data from
different data sources. This work shows SOA will solve the problems
that facing integration process and if the biologist scientists can
access the biological data in easier way. There are several methods to
implement SOA but web service is the most popular method. The
Microsoft .Net Framework used to implement proposed architecture.
Abstract: In this paper, we present the region-based hidden Markov random field model (RBHMRF), which encodes the characteristics of different brain regions into a probabilistic framework for brain MR image segmentation. The recently proposed TV+L1 model is used for region extraction. By utilizing different spatial characteristics in different brain regions, the RBHMRF model outperforms the current state-of-the-art method, the hidden Markov random field model (HMRF), which uses identical spatial information throughout the whole brain. Experiments on both real and synthetic 3D MR images show that the segmentation results of the proposed method have higher accuracy than existing algorithms.
Abstract: One of the popular methods for the recognition of facial
expressions such as happiness, sadness, and surprise is based on the
deformation of facial features. Motion vectors that describe these
deformations can be obtained by optical flow. In this method, to
detect emotions, the resulting set of motion vectors is compared
with standard deformation templates caused by facial expressions.
In this paper, a new method is introduced to compute a measure of
likeness in order to make a decision based on the importance of the
vectors obtained from an optical flow approach. To find the
vectors, an efficient optical flow method developed by
Gautama and Van Hulle [17] is used. The suggested method has been
evaluated on the Cohn-Kanade AU-Coded Facial Expression Database,
one of the most comprehensive collections of test images available.
The experimental results show that our method correctly
recognizes the facial expressions in 94% of the case studies. The results
also show that only a small number of image frames (three frames) is
sufficient to detect facial expressions, with a success rate of about
83.3%. This is a significant improvement over the available methods.
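The template-comparison step can be sketched as follows; the templates and the cosine-similarity likeness measure here are invented for illustration and differ from the paper's proposed measure:

```python
# Hedged sketch of template-based expression decision making: compare the
# observed motion vectors against per-expression deformation templates via
# cosine similarity and pick the best match.

def cosine(u, v):
    """Cosine similarity between two equal-length flat vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv) if nu and nv else 0.0

def classify(motion, templates):
    """motion: list of (dx, dy) components measured at facial feature
    points; templates: dict expression -> same-length vector list."""
    flat = [c for vec in motion for c in vec]
    scores = {name: cosine(flat, [c for vec in t for c in vec])
              for name, t in templates.items()}
    return max(scores, key=scores.get), scores

# Toy templates (invented): happiness pulls the mouth corners up and
# outward, surprise raises the brows and opens the mouth downward.
templates = {
    "happiness": [(-1, -1), (1, -1), (0, 0)],
    "surprise":  [(0, 1), (0, 1), (0, -1)],
}
observed = [(-0.8, -1.1), (1.2, -0.9), (0.1, 0.0)]
label, scores = classify(observed, templates)
print(label)  # happiness
```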