Abstract: OTOP entrepreneurship, which once created a substantial source of income for local Thai communities, is now in urgent need of public-sector assistance because of an oversupply of duplicative business ideas, an inability to adjust costs and prices, a lack of innovation, and inadequate quality control. Moreover, there is a persistent problem of middlemen who constantly corner the OTOP market. Local OTOP producers become easy prey because they do not know how to add value to their products, how to create and maintain their own brand names, or how to design proper packaging and labeling. The solutions suggested to local OTOP producers are to adopt modern management techniques, to acquire the know-how to add value to their products, and to resolve other marketing problems. The objectives of this research are to study the prevalent management of OTOP products and to identify directions for managing OTOP products that enhance the effectiveness of OTOP entrepreneurship in Nonthaburi Province, Thailand. There were 113 participants in this study. The research tools comprise two parts: the first is a questionnaire on the prevalent management of OTOP entrepreneurship; the second is a focus group conducted to capture ideas and local wisdom. Data analysis uses frequency, percentage, mean, and standard deviation, together with a synthesis of several small-group discussions. The findings reveal that 1) Business resources: product quality is the most important factor and product marketing the least important; 2) Business management: leadership is the most important factor and raw-material planning the least important; 3) Business readiness: communication is the most important factor and packaging the least important; 4) Public-sector support: government certification is the most important factor and the source of raw materials the least important.
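The descriptive analysis described above (frequency, percentage, mean, and standard deviation of questionnaire ratings) can be sketched in a few lines of Python; the Likert responses below are hypothetical, not data from the study:

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical 5-point Likert responses for one questionnaire item
# (illustrative only; not data from the study).
responses = [5, 4, 4, 3, 5, 4, 2, 5, 4, 3]

freq = Counter(responses)                                     # frequency of each rating
pct = {k: 100 * v / len(responses) for k, v in freq.items()}  # percentage
item_mean = mean(responses)                                   # mean rating
item_sd = stdev(responses)                                    # sample standard deviation

print(freq[4], round(pct[4], 1), round(item_mean, 2), round(item_sd, 2))
```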
Abstract: This paper describes a computer model of Quantum Field Theory (QFT), referred to here as QTModel. After the initial configuration for a QFT process (e.g. scattering) is specified, the model generates the applicable processes in terms of Feynman diagrams and the equations for the scattering matrix, and evaluates probability amplitudes for the scattering matrix and cross sections. The computation of probability amplitudes is performed numerically. The equations generated by QTModel are provided for demonstration purposes only; they are not directly used as the basis for the computation of probability amplitudes. The computer model supports two modes for the computation of probability amplitudes: (1) computation according to standard QFT, and (2) computation according to a proposed functional interpretation of quantum theory.
Abstract: This paper proposes and implements a core transform architecture, the core transform being one of the major processes in the HEVC video compression standard. The proposed core transform architecture is implemented with only adders and shifters instead of area-consuming multipliers. The shifters in the proposed architecture are implemented as wires and multiplexers, which significantly reduces chip area. The architecture can also process 4×4 to 16×16 blocks with common hardware by reusing processing elements. Implemented in 0.13 µm technology, the designed core transform architecture can process a 16×16 block with a 2-D transform in 130 cycles, and its gate count is 101,015 gates.
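The adder-and-shifter substitution described above can be illustrated with the HEVC 4-point transform coefficients (64, 83, 36); this particular power-of-two decomposition is a sketch, not necessarily the one used in the proposed hardware:

```python
# Multiplierless constant multiplication: each HEVC 4-point transform
# coefficient is decomposed into a sum of powers of two, so products
# become shifts and adds (this decomposition is illustrative).

def mul64(x):  # 64 = 2^6
    return x << 6

def mul83(x):  # 83 = 64 + 16 + 2 + 1
    return (x << 6) + (x << 4) + (x << 1) + x

def mul36(x):  # 36 = 32 + 4
    return (x << 5) + (x << 2)

# Sanity check against plain multiplication
for x in range(-100, 101):
    assert mul64(x) == 64 * x
    assert mul83(x) == 83 * x
    assert mul36(x) == 36 * x
```

In hardware, each fixed shift is just wiring, so only the adders consume logic area.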
Abstract: This paper provides a scheme to improve the read efficiency of the anti-collision algorithm in the EPCglobal UHF Class-1 Generation-2 RFID standard. In this standard, dynamic frame slotted ALOHA is specified to solve the anti-collision problem, and the Q-algorithm, with a key parameter C, is adopted to dynamically adjust the frame sizes. In this paper, we split the C parameter into two parameters to increase the read speed and derive the optimal values of the two parameters through simulations. The results indicate that our method outperforms the original Q-algorithm.
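A minimal simulation of one frame of frame slotted ALOHA (the mechanism whose frame size the Q-algorithm adjusts) can be sketched as follows; this does not implement the paper's two-parameter modification:

```python
import random

def simulate_frame(num_tags, q, seed=1):
    """One frame of frame slotted ALOHA: each tag picks a slot in
    [0, 2**q - 1]. Returns counts of empty, singleton (successful read),
    and collided slots."""
    rng = random.Random(seed)
    frame_size = 2 ** q
    slots = [0] * frame_size
    for _ in range(num_tags):
        slots[rng.randrange(frame_size)] += 1
    empty = sum(1 for s in slots if s == 0)
    success = sum(1 for s in slots if s == 1)
    collision = frame_size - empty - success
    return empty, success, collision

e, s, c = simulate_frame(num_tags=100, q=7)  # Q = 7 gives a 128-slot frame
print(e + s + c)
```

The Q-algorithm grows or shrinks Q depending on the observed mix of empty and collided slots.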
Abstract: Orthogonal Frequency Division Multiplexing (OFDM) is one of the main techniques under consideration for high-speed data communication in 4G and 5G systems. In OFDM, there are several mapping schemes that provide a way of parallel transmission. In this paper, the mapping schemes used by several standards are compared, and the performance of a non-conventional modulation technique is discussed. Bit Error Rate (BER) performance is compared for conventional and non-conventional modulation schemes using MATLAB software. The schemes considered for the OFDM system can be selected on the basis of power or spectral efficiency requirements and BER analysis.
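As a minimal illustration of the BER comparison described above (here in Python rather than MATLAB), the following sketch compares simulated and theoretical BER for BPSK over AWGN, the simplest of the conventional mapping schemes:

```python
import math
import random

def bpsk_ber_sim(ebn0_db, n_bits=200_000, seed=0):
    """Monte Carlo BER of BPSK over AWGN (a minimal stand-in for one
    subcarrier mapping; the paper's MATLAB study covers more schemes)."""
    rng = random.Random(seed)
    ebn0 = 10 ** (ebn0_db / 10)
    sigma = math.sqrt(1 / (2 * ebn0))   # noise std for unit-energy bits
    errors = 0
    for _ in range(n_bits):
        bit = rng.randrange(2)
        tx = 1.0 if bit else -1.0
        rx = tx + rng.gauss(0, sigma)
        if (rx > 0) != bool(bit):
            errors += 1
    return errors / n_bits

def bpsk_ber_theory(ebn0_db):
    """Closed-form BPSK BER: 0.5 * erfc(sqrt(Eb/N0))."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

print(bpsk_ber_sim(6), bpsk_ber_theory(6))
```

The same loop generalizes to other mappings by changing the symbol constellation and decision rule.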
Abstract: In this paper we propose a novel Run Time Interface
(RTI) technique to provide an efficient environment for MPI jobs on
the heterogeneous architecture of PARAM Padma. It suggests an
innovative, unified framework for the job management interface
system in parallel and distributed computing. This approach employs a proxy scheme. The implementation shows that the proposed RTI is highly scalable and stable. Moreover, the RTI provides storage access for MPI jobs on various operating system platforms and improves data-access performance through the high-performance C-DAC Parallel File System (C-PFS). The performance of the RTI is evaluated using standard HPC benchmark suites, and the simulation results show that the proposed RTI performs well on large-scale supercomputing systems.
Abstract: The industrial process of sugar cane crystallization produces a residue that still contains a large amount of soluble sucrose, and the objective of the factory is to improve its extraction. There are therefore substantial losses, justifying the search for optimization of the process. The crystallization process studied on the industrial site is based on the "three massecuites" process. The third step of this process constitutes the final stage of exhaustion of the sucrose dissolved in the mother liquor. During this third crystallization step (C-crystallization), the phase that is studied, and whose control is to be improved, is the crystal growth phase. Studying this process on the industrial site is a problem in its own right. A control scheme is proposed to improve on the standard PID control law used in the factory. An auto-tuning PID controller based on instantaneous linearization of a neural network is then proposed.
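The standard discrete PID law that the proposed auto-tuning scheme builds on can be sketched as follows; the gains, sampling time, and first-order plant are illustrative placeholders, and the neural-network linearization itself is not reproduced:

```python
class PID:
    """Discrete PID: u(k) = Kp*e + Ki*integral(e) + Kd*de/dt.
    Gains here are placeholders; the paper auto-tunes them via
    instantaneous linearization of a neural-network plant model."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        if self.prev_error is None:
            derivative = 0.0
        else:
            derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a simple first-order plant dy/dt = u - y toward a setpoint of 1.0
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
y = 0.0
for _ in range(200):
    u = pid.update(setpoint=1.0, measurement=y)
    y += 0.1 * (u - y)   # forward-Euler step of the illustrative plant
print(round(y, 3))
```

The integral term removes the steady-state error that a proportional-only controller would leave on this plant.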
Abstract: Bumpers play an important role in preventing impact energy from being transferred to the automobile and its passengers. Storing the impact energy in the bumper and releasing it to the environment reduces damage to the automobile and injuries to passengers.
The goal of this paper is to design a bumper with minimum weight by employing Glass Mat Thermoplastic (GMT) materials. This bumper either absorbs the impact energy through its deformation or transfers it perpendicular to the impact direction.
To reach this aim, a mechanism is designed that converts about 80% of the kinetic impact energy into spring potential energy and releases it to the environment at low impact velocities, in accordance with the American standard. In addition, since the residual kinetic energy is damped by the infinitesimal elastic deformation of the bumper elements, the passengers will not sense any impact. It should be noted that in this paper, modeling, solving, and results analysis are performed in CATIA, LS-DYNA, and ANSYS V8.0, respectively.
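The energy bookkeeping described above (about 80% of the kinetic impact energy converted to spring potential energy, the remainder damped elastically) can be sketched as follows; the mass, velocity, and spring stiffness are illustrative values, not figures from the paper:

```python
import math

# Hypothetical low-speed impact parameters (illustrative only).
m = 1200.0   # vehicle mass, kg
v = 2.24     # low impact velocity, m/s (~8 km/h)
k = 4.0e5    # equivalent spring stiffness, N/m
eta = 0.80   # fraction of kinetic energy stored in the spring

e_kinetic = 0.5 * m * v**2          # total impact energy, J
e_spring = eta * e_kinetic          # energy stored in the spring, J
x = math.sqrt(2 * e_spring / k)     # spring compression, m (E = 1/2 k x^2)
e_residual = (1 - eta) * e_kinetic  # damped by elastic deformation, J

print(round(e_kinetic, 1), round(x * 1000, 1), round(e_residual, 1))
```

With these numbers the spring must accommodate roughly 110 mm of travel, which is the kind of constraint that drives the mechanism design.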
Abstract: The true stress-strain curve of railhead steel is required to investigate the behaviour of the railhead under wheel loading through elasto-plastic Finite Element (FE) analysis. To reduce the rate of wear, the railhead material is hardened through annealing and quenching. Australian standard rail sections are not fully hardened and hence suffer from a non-uniform distribution of material properties; using average properties in FE modelling can potentially induce error in the predicted plastic strains. Coupons obtained at varying depths of the railhead were therefore tested under axial tension, and the strains were measured using strain gauges as well as an image analysis technique known as Particle Image Velocimetry (PIV). The head-hardened steel exhibits three distinct zones of yield strength; the yield strengths, as ratios of the average yield strength provided in the standard (σyr = 780 MPa), and the corresponding depths, as fractions of the head-hardened zone along the axis of symmetry, are as follows: (1.17σyr, 0-20%), (1.06σyr, 20%-80%), and (0.71σyr, >80%). The stress-strain curves exhibit a limited plastic zone, with fracture occurring at strains less than 0.1.
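The zone ratios reported above translate into absolute yield strengths as a simple check against the standard's average value of σyr = 780 MPa:

```python
# Absolute yield strengths implied by the reported ratios, relative to
# the average yield strength from the standard (780 MPa).
sigma_yr = 780.0  # MPa

zones = {
    "0-20% depth": 1.17,
    "20%-80% depth": 1.06,
    ">80% depth": 0.71,
}
for zone, ratio in zones.items():
    print(zone, round(ratio * sigma_yr, 1), "MPa")
```

The deepest zone thus yields at roughly 554 MPa, well below the nominal average, which is why average properties mislead the FE model.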
Abstract: The hard clam (Meretrix lusoria) cultivation industry has developed vigorously in Taiwan in recent years, and seawater quality determines the cultivation environment. Variations in pH directly affect the survival rate of Meretrix lusoria. In order to monitor seawater quality, a solid-state sensing electrode of ruthenium-doped titanium dioxide (TiO2:Ru) is developed to measure hydrogen ion concentration in different cultivation solutions. Because the TiO2:Ru sensing electrode has high chemical stability and superior sensing characteristics, it is applied as a pH sensor. Response voltages of the TiO2:Ru sensing electrode are read out by an instrumentation amplifier in different sample solutions. The mean sensitivity and linearity of the TiO2:Ru sensing electrode are 55.20 mV/pH and 0.999, respectively, from pH 1 to pH 13. We expect that the TiO2:Ru sensing electrode can be applied to real-environment measurement; we therefore collected two sample solutions from different Meretrix lusoria cultivation ponds in Yunlin, Taiwan. The two sample solutions were each measured for 200 seconds after calibration with standard pH buffer solutions (pH 7, pH 8, and pH 9). The mean response voltages of sample 1 and sample 2 are -178.758 mV (standard deviation = 0.427 mV) and -180.206 mV (standard deviation = 0.399 mV), respectively. The response voltages of the two sample solutions correspond to values between pH 8 and pH 9, which fall in the weakly alkaline range suitable for Meretrix lusoria growth. For long-term monitoring, the drifts of the cultivation solutions (sample 1 and sample 2) are 1.16 mV/hour and 1.03 mV/hour, respectively.
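The conversion from a measured response voltage to a pH value via the buffer calibration described above can be sketched as follows; the calibration voltages are hypothetical, with only the ~55.20 mV/pH sensitivity scale taken from the text:

```python
# Linear calibration V = a*pH + b fitted to buffer readings, then
# inverted to convert a measured voltage to pH. The calibration
# voltages below are hypothetical (exactly -55.2 mV/pH by construction).
cal = {7.0: -118.0, 8.0: -173.2, 9.0: -228.4}   # pH -> mV

# Least-squares line through the calibration points
n = len(cal)
sx = sum(cal)
sy = sum(cal.values())
sxy = sum(p * v for p, v in cal.items())
sxx = sum(p * p for p in cal)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope, mV/pH
b = (sy - a * sx) / n                           # intercept, mV

def voltage_to_ph(v_mv):
    return (v_mv - b) / a

print(round(a, 2), round(voltage_to_ph(-178.758), 2))
```

With this (hypothetical) calibration, the sample-1 mean voltage of -178.758 mV maps to a pH just above 8, consistent with the weakly alkaline range reported.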
Abstract: This paper analyzes the effect of a single uniform accounting rule on reporting quality by investigating the influence of IFRS on earnings management. This paper examines whether earnings management is reduced after IFRS adoption through the use of “loss avoidance thresholds”, a method that has been verified in earlier studies. This paper concentrates on two European countries: one that represents the continental code law tradition with weak protection of investors (France) and one that represents the Anglo-American common law tradition, which typically implies a strong enforcement system (the United Kingdom).
The research investigates a sample of 526 companies (6,822 firm-year observations) during the years 2000-2013. The results differ for the two jurisdictions. This study demonstrates that a single set of accounting standards contributes to better reporting quality and reduces the pervasiveness of earnings management in France. In contrast, there is no evidence that a reduction in earnings management followed the implementation of IFRS in the United Kingdom. Because IFRS benefit France but not the United Kingdom, other political and economic factors, such as the legal system or capital market strength, must play a significant role in influencing the comparability and transparency of cross-border companies' financial statements. Overall, the results suggest that IFRS moderately contribute to the accounting quality of reported financial statements and benefit stakeholders, though the role played by other economic factors cannot be discounted.
Abstract: A plausible architecture of an ancient genetic code is derived from an extended base-triplet vector space over the Galois field of the extended base alphabet {D, G, A, U, C}, where the letter D represents one or more hypothetical bases with unspecific pairing. We hypothesize that the high degeneracy of a primeval genetic code with five bases, together with the gradual origin and improvement of a primitive DNA repair system, could have made possible the transition from the ancient to the modern genetic code. Our results suggest that Watson-Crick base pairing and the non-specific base pairing of the hypothetical ancestral base D, used to define the sum and product operations, are sufficient to determine the coding constraints of the primeval and the modern genetic codes, as well as the transition from the former to the latter. Geometrical and algebraic properties of this vector space reveal that the present codon assignment of the standard genetic code could be induced from a primeval codon assignment. Moreover, the Fourier spectrum of the extended DNA genome sequences derived from multiple sequence alignment suggests that the so-called period-3 property of present coding DNA sequences could also have existed in ancient coding DNA sequences.
Abstract: IPsec has become a standard information security technology throughout the Internet society. It provides a well-defined architecture that takes into account confidentiality, authentication, integrity, secure key exchange, and protection against replay attacks. For connectionless security services on a per-packet basis, the IETF IPsec Working Group has standardized two extension headers (AH and ESP) as well as key exchange and authentication protocols. It is also working on a lightweight key exchange protocol and MIBs for security management. IPsec technology has been implemented on various platforms in IPv4 and IPv6, gradually replacing old application-specific security mechanisms. IPv4 and IPv6 are not directly compatible, so programs and systems designed for one standard cannot communicate with those designed for the other. We propose the design and implementation of a controlled Internet security system, an IPsec-based Internet information security system for IPv4/IPv6 networks, and we also present performance measurement data. With features such as the improved scalability and routing, security, ease of configuration, and higher performance of IPv6, the controlled Internet security system provides a consistent security policy and integrated security management on an IPsec-based Internet security system.
Abstract: An important structuring mechanism for knowledge bases is building clusters based on the content of their knowledge objects. The objects are clustered based on the principle of maximizing intraclass similarity and minimizing interclass similarity. Clustering can also facilitate taxonomy formation, that is, the organization of observations into a hierarchy of classes that groups similar events together. Hierarchical representation allows us to easily manage the complexity of knowledge, to view the knowledge at different levels of detail, and to focus our attention only on the interesting aspects. One such efficient and easy-to-understand system is the Hierarchical Production Rule (HPR) system. An HPR, a standard production rule augmented with generality and specificity information, is of the following form: Decision If <condition> Generality Specificity. HPR systems are capable of handling the taxonomical structures inherent in knowledge about the real world. In this paper, a set of related HPRs is called a cluster and is represented by an HPR-tree. The paper discusses an algorithm based on a cumulative learning scenario for the dynamic structuring of clusters. The proposed scheme incrementally incorporates new knowledge into the set of clusters from previous episodes and also maintains a summary of the clusters as a synopsis to be used in future episodes. Examples are given to demonstrate the behaviour of the proposed scheme. The suggested incremental structuring of clusters would be useful in mining data streams.
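The generality/specificity links of an HPR and the HPR-tree representing a cluster can be sketched as a minimal data structure; the field names and example rules are illustrative assumptions, and the paper's incremental clustering algorithm is not reproduced:

```python
class HPR:
    """A minimal Hierarchical Production Rule node: a decision with its
    condition, a link to a more general HPR, and links to more specific
    HPRs. An HPR-tree (cluster) is a root HPR and its descendants."""
    def __init__(self, decision, condition, generality=None):
        self.decision = decision      # conclusion of the rule
        self.condition = condition    # "If <condition>" part
        self.generality = generality  # parent: more general HPR
        self.specificity = []         # children: more specific HPRs

    def add_specific(self, child):
        child.generality = self
        self.specificity.append(child)
        return child

    def leaves(self):
        """Most specific rules reachable from this HPR."""
        if not self.specificity:
            return [self]
        out = []
        for child in self.specificity:
            out.extend(child.leaves())
        return out

# Illustrative cluster: a small taxonomy of vehicles
root = HPR("vehicle", "moves on roads")
car = root.add_specific(HPR("car", "has four wheels"))
car.add_specific(HPR("sports car", "has high power-to-weight ratio"))
root.add_specific(HPR("motorbike", "has two wheels"))
print([h.decision for h in root.leaves()])
```

Incremental structuring would insert each new rule under its most general matching ancestor rather than rebuilding the tree.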
Abstract: In recent years, response surface methodology (RSM) has attracted the attention of many quality engineers in different industries. Most of the published literature on robust design methodology is concerned with the optimization of a single response or quality characteristic that is often most critical to consumers. For most products, however, quality is multidimensional, so it is common to observe multiple responses in an experimental situation. This paper familiarizes the interested reader with this methodology through a survey of the most cited technical papers.
It is believed that the proposed procedure in this study can resolve
a complex parameter design problem with more than two responses.
It can be applied to those areas where there are large data sets and a
number of responses are to be optimized simultaneously. In addition,
the proposed procedure is relatively simple and can be implemented
easily by using ready-made standard statistical packages.
Abstract: Digital Video Broadcasting - Terrestrial (DVB-T) allows broadcasting, telephone, and data services to be combined in one network, and it has facilitated mobile TV broadcasting. Mobile TV broadcasting is dominated by a fragmentation of standards across continents: in Asia, T-DMB and ISDB-T are used; Europe uses mainly DVB-H; and in the USA it is MediaFLO. Issues of royalties for the developers of these different, incompatible technologies, the investments already made, and differing local conditions will make it difficult to agree on a unified standard in the near future. Despite this shortcoming, mobile TV has shown very good market potential. A number of challenges still exist for regulators, investors, and technology developers, but the future looks bright. There is a need for mobile telephone operators to cooperate with content providers and the operators of terrestrial digital broadcasting infrastructure for mutual benefit.
Abstract: Overloading is a technique to accommodate more users than the spreading factor N; it is a bandwidth-efficient scheme for increasing the number of users in a fixed bandwidth. One efficient way to overload a CDMA system is to use two sets of orthogonal signal waveforms (O/O): the first set is assigned to N users and the second set to M additional users, and an iterative interference cancellation technique is used to cancel the interference between the two sets of users. In this paper, the performance of an overloading scheme is evaluated in which the first N users are assigned Walsh-Hadamard (WH) orthogonal codes and the extra users are assigned the same WH codes overlaid by a fixed (quasi) bent sequence [11]. This particular scheme is called the Quasi-Orthogonal Sequence (QOS) O/O scheme and is part of the cdma2000 standard [12] for providing overloading in the downlink using a single-user detector. The QOS scheme is a balanced O/O scheme, in which the correlations between any set-1 and set-2 users are equalized. The allowable overload of this scheme is investigated in the uplink on AWGN and Rayleigh fading channels, so that the uncoded performance with an iterative multistage interference cancellation detector remains close to the single-user bound. It is shown that this scheme provides 19% and 11% overloading with the SDIC technique for N = 16 and 64, respectively, with an SNR degradation of less than 0.35 dB compared to the single-user bound at a BER of 10^-5. On a Rayleigh fading channel, however, the channel overloading is 45% (29 extra users) at a BER of 5×10^-4, with an SNR degradation of about 1 dB compared to single-user performance for N = 64. This is a significant amount of channel overloading on a Rayleigh fading channel.
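The Walsh-Hadamard codes assigned to the first N users can be generated with the Sylvester construction and checked for orthogonality; this sketch does not include the quasi-bent overlay sequence of the QOS scheme:

```python
def walsh_hadamard(n):
    """Return the n x n Walsh-Hadamard matrix (entries +1/-1) via the
    Sylvester construction; n must be a power of two."""
    h = [[1]]
    while len(h) < n:
        # H_{2k} = [[H_k, H_k], [H_k, -H_k]]
        h = [row + row for row in h] + [row + [-x for x in row] for row in h]
    return h

h = walsh_hadamard(16)

# Any two distinct rows (spreading codes) are orthogonal.
dot = lambda a, b: sum(x * y for x, y in zip(a, b))
assert all(dot(h[i], h[j]) == 0 for i in range(16) for j in range(16) if i != j)
assert all(dot(h[i], h[i]) == 16 for i in range(16))
```

In the QOS scheme, the set-2 codes are these same rows chip-multiplied by a fixed overlay sequence, which spreads the cross-correlation evenly over the set-1 users.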
Abstract: In today's economy, plant engineering faces many challenges. For instance, intensifying competition in this business is leading to cost competition and the need for a shorter time-to-market. To remain competitive, companies need to make their businesses more profitable by implementing improvement programs such as standardization projects, but they have difficulty tapping their full economic potential for various reasons. One of these is the non-holistic planning and implementation of standardization projects. This paper describes a new conceptual framework, the layer model. The model combines and expands existing, proven approaches in order to improve the design, implementation, and management of standardization projects. Based on a holistic approach, it helps to systematically analyze the effects of standardization projects on different business layers and enables companies to better seize the opportunities offered by standardization.
Abstract: Low-power wireless sensor devices, which usually use the low-power wireless personal area network (IEEE 802.15.4) standard, are being widely deployed for various purposes and in different scenarios. IPv6 over low-power wireless personal area networks (6LoWPAN) was adopted as part of the IETF standards for wireless sensor devices so that it would become an open standard, in contrast to the dominant proprietary standards available in the market. 6LoWPAN also makes the integration and communication of sensor nodes with the Internet more viable. This paper presents a comparative study of the different available IPv6 stacks for wireless sensor networks, both open and closed source. It also discusses the platforms used by these stacks. Finally, it evaluates the stacks and provides suggestions for selecting the required IPv6 stack for low-power devices.
Abstract: In this paper, channel estimation techniques are considered as supporting methods for OFDM transmission systems based on non-binary LDPC (Low-Density Parity-Check) codes. Standard frequency-domain pilot-aided LS (Least Squares) and LMMSE (Linear Minimum Mean Square Error) estimators are investigated. Furthermore, an iterative algorithm is proposed that exploits the NB-LDPC channel decoder to improve the performance of the LMMSE estimator. Simulation results for signals transmitted through mobile fading channels are presented to compare the performance of the proposed channel estimators.
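The pilot-aided LS estimate investigated above reduces, per pilot subcarrier, to dividing the received symbol by the known pilot symbol; the pilot pattern and channel values in this sketch are illustrative:

```python
import cmath

# Known pilot symbols X at a few subcarriers (illustrative pattern).
pilots = {0: 1 + 0j, 4: -1 + 0j, 8: 1 + 0j, 12: -1 + 0j}

# Hypothetical channel frequency response at those subcarriers.
true_h = {k: cmath.exp(1j * 0.1 * k) for k in pilots}

# Received pilot symbols Y = H * X (noise omitted for clarity).
received = {k: true_h[k] * x for k, x in pilots.items()}

# Per-pilot LS estimate: H_LS = Y / X.
h_ls = {k: received[k] / pilots[k] for k in pilots}

for k in pilots:
    assert abs(h_ls[k] - true_h[k]) < 1e-12   # exact in the noiseless case
```

With noise, H_LS equals H plus amplified noise, which is what the LMMSE estimator (and the decoder-aided iteration) then smooths across subcarriers.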