Abstract: As German companies roll out their standardized
production systems to offshore manufacturing plants, they face the
challenge of implementing them in different cultural environments.
Studies show that local adaptation is one of the key factors for a
successful implementation. Thus, the question arises of where the line
between standardization and adaptation can be drawn. To answer
this question, the influence of culture on production systems is
analysed in this paper. The culturally contingent components of
production systems are identified, and the contingency factors are
classified according to their impact on the necessary adaptation
changes and implementation effort. Culturally specific decision-making,
coordination, communication and motivation patterns
require one-time changes in organizational and process design. The
attitude towards rules requires more intense coaching and controlling.
Lastly, a framework is developed to depict standardization and
adaptation needs when transplanting production systems into different
cultural environments.
Abstract: Nowadays there is growing environmental concern,
and business communities have slowly started incorporating
environmental protection and the sustainable utilization of natural
resources into their marketing strategies. This paper discusses the
various ecolabeling and certification systems developed worldwide
to regulate and introduce fair trade in the ornamental fish
industry. Ecolabeling and green certification are considered part
of these strategies, implemented partly out of compulsion from
national and international regulatory bodies and environmental
movements. All the major markets for ornamental fishes, such as the
European Union, the USA and Japan, have started putting restrictions on
the trade to impose ecolabeling as a non-tariff barrier, like the one
imposed on seafood and aquacultured products. A review was done
of the ecolabeling and green certification schemes
available at local, national and international levels for fisheries,
including aquaculture and the ornamental fish trade, to examine the
successes and constraints faced by these schemes during their
implementation. The primary downside of certification is the
multiplicity of ecolabels and the cost incurred by applicants for
certification, costs which may in turn be passed on to consumers.
The studies reveal serious inadequacies in a number of ecolabels
and cast doubt on their overall contribution to effective fisheries
management and sustainability. The paper also discusses the
initiative taken in India to develop guidelines for green certification
of freshwater ornamental fishes.
Abstract: The pool of Internet Protocol version 4 (IPv4) addresses is being depleted, and a rapid transition method to the next-generation IP address (IPv6) should be established. This study aims to evaluate and select the best-performing IPv6 network transition mechanisms, such as IPv4/IPv6 dual stack, Transport Relay Translation (TRT) and Reverse Proxy with additional features. It also aims to prove that faster access can be achieved while ensuring optimal usage of the resources available during testing and actual implementation. This study used two test methods, Internet Control Message Protocol (ICMP) ping and Apache Benchmark (AB), to evaluate the performance. Performance metrics for this study include the average number of accesses in one second, the time taken for a single access, the data transfer speed, and the cost of additional requirements. Reverse Proxy with the caching feature is the most efficient mechanism because of its simpler configuration and the best performance in the tests conducted.
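The AB-style metrics named above (average accesses per second, time per single access) can be illustrated with a self-contained sketch: a local HTTP server is benchmarked with sequential GET requests using only the Python standard library. This is an illustrative stand-in, not the study's actual ICMP/ApacheBench setup; the handler, metric names and request count are all assumptions.

```python
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from http.client import HTTPConnection

class Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"        # keep-alive, so one connection is reused

    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):        # silence per-request logging
        pass

def benchmark(host, port, n_requests=50):
    """Issue sequential GETs and report AB-style summary metrics."""
    conn = HTTPConnection(host, port)
    start = time.perf_counter()
    for _ in range(n_requests):
        conn.request("GET", "/")
        conn.getresponse().read()
    elapsed = time.perf_counter() - start
    conn.close()
    return {"requests_per_second": n_requests / elapsed,
            "time_per_request_ms": 1000.0 * elapsed / n_requests}

if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 0), Handler)   # port 0: pick any free port
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(benchmark("127.0.0.1", server.server_address[1]))
    server.shutdown()
```

A real comparison of the transition mechanisms would point this client at hosts behind the dual-stack, TRT and reverse-proxy configurations rather than a local loopback server.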
Abstract: The performance of a sensorless controlled induction
motor drive depends on the accuracy of the estimated speed.
Conventional estimation techniques are mathematically complex
and require long execution times, resulting in poor dynamic response. The
nonlinear mapping capability and powerful learning algorithms of
neural networks provide a promising alternative for on-line speed
estimation. The on-line speed estimator requires the NN model to be
accurate, simple in design, structurally compact and computationally
inexpensive to ensure fast execution and effective control in real-time
implementation. This in turn depends to a large extent on the
type of neural architecture. This paper investigates three types of
neural architectures for on-line speed estimation and their
performance is compared in terms of accuracy, structural
compactness, computational complexity and execution time. The
suitable neural architecture for on-line speed estimation is identified
and the promising results obtained are presented.
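The abstract does not specify the three architectures compared; as an illustrative sketch of the general idea, the following trains a small one-hidden-layer feedforward network (structurally compact, as required for on-line use) on synthetic data standing in for measured stator quantities mapped to rotor speed. The data, dimensions and hyperparameters are assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (illustrative): a smooth nonlinear map stands in for the
# relation between measured stator quantities and normalized rotor speed.
X = rng.uniform(-1, 1, size=(200, 2))        # e.g. normalized voltage, current
y = np.tanh(1.5 * X[:, 0] - 0.8 * X[:, 1])   # stand-in for normalized speed

# One small hidden layer keeps the model structurally compact,
# as required for fast on-line execution.
W1 = rng.normal(0.0, 0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, (h @ W2 + b2).ravel()

def mse(pred):
    return float(np.mean((pred - y) ** 2))

lr = 0.2
loss0 = mse(forward(X)[1])
for _ in range(500):                         # plain batch gradient descent
    h, pred = forward(X)
    err = 2.0 * (pred - y)[:, None] / len(X) # d(MSE)/d(pred)
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = err @ W2.T * (1 - h ** 2)
    gW1 = X.T @ dh; gb1 = dh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
print(loss0, mse(forward(X)[1]))
```

The execution-time and compactness comparison in the paper would then weigh such a multilayer network against the other candidate architectures.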
Abstract: This paper proposes a new CAD tool for microwave amplifier design. The proposed tool is based on a survey of broadband amplifier design methods, such as feedback amplifiers, balanced amplifiers and compensated matching networks. The proposed tool is developed for broadband amplifier design using a compensated matching network for an unconditionally stable amplifier. The developed program is based on analytical procedures with the ability to display Smith chart explanations. The C# language is used to implement the proposed tool. The program is applied to a broadband amplifier as a test example. The designed amplifier is a broadband amplifier for the 300-700 MHz range. The results are in close agreement with the expected results. Finally, these methods can be extended to wideband amplifier design.
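Unconditional stability, which the tool targets, is conventionally checked with Rollett's stability factor K together with |Δ| < 1, computed directly from the S-parameters. A minimal sketch (the sample S-parameters are illustrative, not taken from the paper):

```python
def stability(s11, s12, s21, s22):
    """Rollett stability factor K and |Delta| of the S-matrix.
    K > 1 together with |Delta| < 1 implies unconditional stability."""
    delta = s11 * s22 - s12 * s21
    k = (1 - abs(s11) ** 2 - abs(s22) ** 2 + abs(delta) ** 2) / (2 * abs(s12 * s21))
    return k, abs(delta)

# Illustrative S-parameters for one frequency point in the band.
k, mag_delta = stability(0.3 - 0.1j, 0.05 + 0.02j, 2.0 + 0.5j, 0.4 - 0.2j)
print(k, mag_delta, k > 1 and mag_delta < 1)
```

A broadband design would evaluate this criterion at every frequency point across the 300-700 MHz band before synthesizing the compensated matching network.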
Abstract: Process improvements have drawn much attention in
practical software engineering. The capability maturity levels from
CMMI have become an important index for assessing a software company's
software engineering capability. However, in countries like
Taiwan, customers often have no choice but to deal with vendors that
are not CMMI prepared or qualified. We call these vendors
maturity-level-1 (ML1) vendors. In this paper, we describe our experience
from consulting on an e-school project. We propose an approach to help
our client tackle ML1 vendors. Through our system analysis, we
produce a design. This design is suggested for use as part of the
contract and as a blueprint to guide the implementation.
Abstract: Production engineering is currently characterized by the
integration of industrial automation and robotics and by a very rapid
pace of product manufacturing. The production range is continuously
changing and expanding, and producers have to be flexible in this
regard: they need to offer production possibilities that can respond
to quick changes. Engineering product development is supported by
CAD software; such systems are mainly used for product design. For
manufacturers to stay competitive, the machines they procure should
be capable of responding flexibly to changes in output. A response to
this problem is the development of flexible manufacturing systems
consisting of various automated subsystems. The integration of flexible
manufacturing systems and their subunits with product design and
engineering is a possible solution to this issue, and such integration
is possible through the implementation of CIM systems. Finding such a
link between CAD systems and the ICIM 3000 production system from
Festo Co. is the subject of the research project described in this
contribution. Products can then be designed in CAD systems and the
manufacturing process monitored from order to shipping. The developed
methods and processes of integration will improve support for product
design parameters through monitoring of the production process and
through the creation of production programs using CAD data, and
therefore accelerate the overall process from design to
implementation.
Abstract: The objective of this paper is twofold: (1) to discuss and
analyze successful case studies worldwide, and (2) to identify the
similarities and differences among these case studies.
Design/methodology/approach: The nature of this research is mainly
qualitative (multi-case studies, literature review). This investigation
uses ten case studies, and the data were mainly collected from
organizational documents from the countries involved. Findings:
The findings of this research can help incubator managers, policy
makers and government parties achieve successful implementation.
Originality/value: This paper contributes to the current literature
on best practices worldwide. Additionally, it presents future
perspectives for academicians and practitioners.
Abstract: The time-interleaved sigma-delta (TIΣΔ) architecture is a
potential candidate for the high-bandwidth analog-to-digital converters
(ADCs) that remain a bottleneck for software and cognitive radio
receivers. However, the performance of the TIΣΔ architecture is
limited by the unavoidable gain and offset mismatches resulting
from the manufacturing process. This paper presents a novel digital
calibration method to compensate the gain and offset mismatch
effect. The proposed method takes advantage of the reconstruction
digital signal processing on each channel and requires only a few logic
components for implementation. The run-time calibration is estimated
at 10 and 15 clock cycles for offset cancellation and gain mismatch
calibration, respectively.
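The paper's calibration operates inside the reconstruction DSP; as a behavioral sketch only, gain and offset mismatch in a two-channel time-interleaved converter can be estimated from a zero-mean calibration tone (channel mean gives the offset, channel RMS the relative gain) and corrected digitally. All signal parameters below are illustrative.

```python
import math

M = 2                        # two interleaved channels (illustrative)
gains = [1.00, 1.03]         # per-channel gain mismatch
offsets = [0.00, 0.02]       # per-channel offset mismatch

# Interleaved samples of a zero-mean calibration sine (37 cycles per frame).
n = 4096
raw = [gains[i % M] * math.sin(2 * math.pi * 37 * i / n) + offsets[i % M]
       for i in range(n)]

# Per-channel statistics: the mean estimates the offset,
# the RMS the relative gain.
chans = [raw[c::M] for c in range(M)]
off_est = [sum(ch) / len(ch) for ch in chans]
rms = [math.sqrt(sum((x - o) ** 2 for x in ch) / len(ch))
       for ch, o in zip(chans, off_est)]
gain_corr = [rms[0] / r for r in rms]   # align every channel to channel 0

calibrated = [(raw[i] - off_est[i % M]) * gain_corr[i % M] for i in range(n)]
```

After correction the interleaved stream behaves like a single uniform channel, which is what suppresses the mismatch spurs that limit TIΣΔ performance.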
Abstract: Optical flow has been a research topic of interest for many
years. It has, until recently, been largely inapplicable to real-time
applications due to its computationally expensive nature. This paper
presents a new reliable flow technique which is combined with a
motion detection algorithm, from stationary camera image streams,
to allow flow-based analyses of moving entities, such as rigidity, in
real-time. The combination of the optical flow analysis with motion
detection technique greatly reduces the expensive computation of
flow vectors as compared with standard approaches, rendering the
method applicable in real-time implementation. This paper also
describes the hardware implementation of a proposed pipelined
system to estimate the flow vectors from image sequences in real
time. This design can process 768 x 576 images at a very high frame
rate that reaches 156 fps in a single low-cost FPGA chip, which is
adequate for most real-time vision applications.
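The combination described above can be sketched in software: a thresholded frame difference forms a motion mask (dilated by one pixel), and block-matching flow is evaluated only at masked pixels, so most of the frame is never processed. This toy example is not the paper's hardware pipeline; the frames, block size and search range are illustrative.

```python
H = W = 16

# Two synthetic frames: a bright 3x3 block moves one pixel to the right.
f1 = [[0.0] * W for _ in range(H)]
f2 = [[0.0] * W for _ in range(H)]
for r in range(6, 9):
    for c in range(5, 8):
        f1[r][c] = 1.0
    for c in range(6, 9):
        f2[r][c] = 1.0

def motion_mask(a, b, thresh=0.5):
    """Thresholded frame difference, dilated by one pixel so flow is also
    estimated just inside moving objects."""
    raw = [[abs(b[r][c] - a[r][c]) > thresh for c in range(W)] for r in range(H)]
    out = [[False] * W for _ in range(H)]
    for r in range(H):
        for c in range(W):
            if raw[r][c]:
                for dr in (-1, 0, 1):
                    for dc in (-1, 0, 1):
                        if 0 <= r + dr < H and 0 <= c + dc < W:
                            out[r + dr][c + dc] = True
    return out

def sad(a, b, r1, c1, r2, c2, k=1):
    """Sum of absolute differences between two (2k+1)^2 patches."""
    return sum(abs(a[r1 + dr][c1 + dc] - b[r2 + dr][c2 + dc])
               for dr in range(-k, k + 1) for dc in range(-k, k + 1))

def flow(a, b, mask, search=2, k=1):
    """Block-matching flow, evaluated only where the motion mask is set."""
    vectors, processed = {}, 0
    for r in range(search + k, H - search - k):
        for c in range(search + k, W - search - k):
            if not mask[r][c]:
                continue
            processed += 1
            best = min((sad(a, b, r, c, r + dy, c + dx, k), dy, dx)
                       for dy in range(-search, search + 1)
                       for dx in range(-search, search + 1))
            vectors[(r, c)] = (best[1], best[2])
    return vectors, processed

vecs, n_processed = flow(f1, f2, motion_mask(f1, f2))
print(vecs[(7, 6)], n_processed)
```

Only a few dozen of the 256 pixels are ever considered, which is the saving that makes a pipelined real-time implementation feasible.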
Abstract: A master plan is a tool to guide and manage the growth of cities in a planned manner. The soul of a master plan lies in its implementation framework. If it is not implemented, people are trapped in a mess of urban problems and laissez-faire development with serious long-term repercussions. Unfortunately, master plans prepared for several major cities of Pakistan could not be fully implemented due to a host of reasons, and Lahore is no exception. Being the second largest city of Pakistan with a population of over 7 million people, Lahore holds the distinction that the first ever master plan in the country was prepared for this city in 1966. Recently, in 2004, a new plan titled 'Integrated Master Plan for Lahore-2021' was approved for implementation. This paper provides a comprehensive account of the weaknesses and constraints in the plan preparation process and implementation strategies of the master plans prepared for Lahore. It also critically reviews the new master plan, particularly with respect to the proposed implementation framework. The paper discusses the prospects and pre-conditions for successful implementation of the new plan in the light of historical analysis, interviews with stakeholders and the new institutional context under the devolution plan.
Abstract: The paper proposes and validates a new method of solving instances of the vehicle routing problem (VRP). The approach is based on a multiple agent system paradigm. The paper contains the VRP formulation, an overview of the multiple agent environment used and a description of the proposed implementation. The approach is validated experimentally. The experiment plan and the discussion of experiment results follow.
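The abstract does not detail the agents' negotiation protocol; as a minimal sketch of the underlying capacitated VRP, each vehicle "agent" below greedily claims the nearest unserved customer that fits its remaining capacity. The instance data are illustrative.

```python
import math

depot = (0.0, 0.0)
customers = {                      # id: (x, y, demand), illustrative instance
    1: (2, 1, 3), 2: (5, 2, 4), 3: (1, 5, 2),
    4: (6, 6, 5), 5: (3, 4, 3), 6: (8, 1, 2),
}
CAPACITY = 8

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def solve(customers, capacity):
    unserved = dict(customers)
    routes = []
    while unserved:
        pos, load, route = depot, 0, []  # a new vehicle agent starts at the depot
        while True:
            feasible = [(dist(pos, (x, y)), cid)
                        for cid, (x, y, d) in unserved.items()
                        if load + d <= capacity]
            if not feasible:
                break
            _, cid = min(feasible)       # claim the nearest feasible customer
            x, y, d = unserved.pop(cid)
            route.append(cid)
            load += d
            pos = (x, y)
        routes.append(route)
    return routes

routes = solve(customers, CAPACITY)
print(routes)
```

In a multi-agent realization the claiming step becomes a negotiation among concurrently running vehicle agents rather than a sequential greedy loop.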
Abstract: The application of synchronous dynamic random
access memory (SDRAM) has long gone beyond the scope of personal
computers. It comes in handy whenever a large
amount of low-price and still high-speed memory is needed. Most
newly developed stand-alone embedded devices in the field of
image, video and sound processing make more and more use of it. The
large amount of low-price memory has its trade-off: the speed. In
order to exploit the full potential of the memory, an efficient
controller is needed. Efficient here stands for maximum random accesses
to the memory, both for reading and writing, and low area after
implementation. This paper proposes a target-device-independent
pipelined DDR SDRAM controller and provides a performance
comparison with available solutions.
Abstract: Compaction testing methods allow at-speed detection
of errors while possessing a low cost of implementation. Owing to this
distinctive feature, compaction methods have been widely used for
built-in testing as well as external testing. In the latter case, the
bandwidth requirements on the automated test equipment employed
are relaxed, which reduces the overall cost of testing. Concurrent
compaction testing methods use operational signals to detect
misbehavior of the device under test and do not require input test
stimuli. These methods have been employed for digital systems only.
In the present work, we extend the use of compaction methods for
concurrent testing of analog-to-digital converters. We estimate
tolerance bounds for the result of compaction and evaluate the
aliasing rate.
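A classic compactor of the kind discussed is the LFSR-based signature analyzer: the response stream is folded into a short signature, and any single-bit error changes it (aliasing can only occur for certain multi-bit errors, which is the aliasing rate estimated above). A sketch of the general technique, not the paper's construction for ADCs:

```python
import random

def signature(bits, width=16, taps=(16, 14, 13, 11)):
    """Serial LFSR signature register (Fibonacci form). Because tap `width`
    feeds back, the zero-input transition is invertible, so a single-bit
    error in the stream always changes the final signature."""
    state = 0
    for bit in bits:
        fb = bit
        for t in taps:
            fb ^= (state >> (t - 1)) & 1
        state = ((state << 1) | fb) & ((1 << width) - 1)
    return state

random.seed(1)
stream = [random.randint(0, 1) for _ in range(1000)]  # fault-free responses
faulty = list(stream)
faulty[500] ^= 1                                      # a single-bit error
print(hex(signature(stream)), hex(signature(faulty)))
```

For concurrent testing of ADCs, the stream being compacted would come from the converter's operational output rather than from applied test stimuli, and tolerance bounds on the signature replace exact comparison.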
Abstract: In this paper, we evaluate the choice of suitable
quantization characteristics for both the decoder messages and the
received samples in Low Density Parity Check (LDPC) coded
systems using M-QAM (Quadrature Amplitude Modulation)
schemes. The analysis involves the demapper block that provides
initial likelihood values for the decoder, relating its quantization
strategy to that of the decoder. A mapping strategy refers to the grouping of
bits within a codeword, where each m-bit group is used to select a
2^m-ary signal in accordance with the signal labels. Further, we
evaluate the system with mapping strategies like Consecutive-Bit
(CB) and Bit-Reliability (BR). A new demapper version, based on
approximate expressions, is also presented to yield a low complexity
hardware implementation.
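As a sketch of the quantization question studied here: for a Gray-mapped QPSK/4-QAM dimension the demapper LLR reduces to 2y/σ², and it can be uniformly quantized with saturation to q bits before entering the decoder. The quantizer parameters below are illustrative assumptions, not the values evaluated in the paper.

```python
import random

def quantize(llr, q=4, max_llr=8.0):
    """Uniform mid-rise quantization of an LLR to q bits with saturation."""
    levels = 1 << (q - 1)                 # magnitude levels per sign
    step = max_llr / levels
    idx = min(int(abs(llr) / step), levels - 1)
    value = (idx + 0.5) * step
    return value if llr >= 0 else -value

random.seed(0)
sigma2 = 0.5
# Gray-mapped QPSK: each dimension carries one bit, and the exact LLR
# reduces to 2*y/sigma^2 for a received sample y.
tx = [random.choice((-1.0, 1.0)) for _ in range(1000)]
rx = [s + random.gauss(0, sigma2 ** 0.5) for s in tx]
llrs = [2 * y / sigma2 for y in rx]
qllrs = [quantize(l) for l in llrs]
```

Since the quantizer preserves the LLR sign, hard decisions are unaffected; the design question the paper addresses is how few bits the soft magnitudes can carry before decoding performance degrades, especially for higher-order M-QAM.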
Abstract: Various intelligences and inspirations have been
adopted into iterative searching processes called meta-heuristics.
They intelligently perform exploration and exploitation in the
solution domain space, aiming to efficiently seek near-optimal
solutions. In this work, the bee algorithm, inspired by the natural
foraging behaviour of honey bees, was adapted to find near-optimal
solutions to a transportation management problem, dynamic
multi-zone dispatching. This problem involves uncertain and changing
customer demand. In striving to remain competitive, a
transportation system should therefore be flexible in order to cope
with changes in customer demand in terms of in-bound and out-bound
goods and technological innovations. To maintain a high service
level at a low management cost via a minimal-imbalance scenario,
rearrangement penalties for the areas in each zone, across time
periods, are also included. However, the performance of the algorithm
depends on appropriate parameter settings, which need to be
determined and analysed before its implementation. BEE parameters
are determined through the linear constrained response surface
optimisation or LCRSOM and weighted centroid modified simplex
methods or WCMSM. Experimental results were analysed in terms
of best solutions found so far, mean and standard deviation on the
imbalance values including the convergence of the solutions
obtained. It was found that the results obtained from the LCRSOM
were better than those using the WCMSM. However, the average
execution time of experimental run using the LCRSOM was longer
than those using the WCMSM. Finally, a recommendation of proper
level settings of the BEE parameters for some selected problem sizes is
given as a guideline for future applications.
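The core bee-algorithm loop (scout bees, recruitment around selected sites, shrinking neighbourhoods) can be sketched on a surrogate objective; the dispatching cost itself is not reproducible from the abstract, so a simple quadratic stands in, and every parameter below is illustrative (these are exactly the kinds of settings LCRSOM/WCMSM would tune).

```python
import random
random.seed(42)

def cost(x):
    # Surrogate "imbalance" objective (illustrative): minimum at x_i = 3.
    return sum((xi - 3.0) ** 2 for xi in x)

DIM, LO, HI = 4, -10.0, 10.0
N_SCOUTS, N_BEST, N_RECRUITS, PATCH, ITERS = 20, 5, 8, 2.0, 60

def rand_point():
    return [random.uniform(LO, HI) for _ in range(DIM)]

def neighbour(x, size):
    # Local search within a patch around a selected site.
    return [min(HI, max(LO, xi + random.uniform(-size, size))) for xi in x]

sites = sorted((rand_point() for _ in range(N_SCOUTS)), key=cost)
patch = PATCH
for _ in range(ITERS):
    new_sites = []
    for s in sites[:N_BEST]:                 # recruit bees around the best sites
        trial = min((neighbour(s, patch) for _ in range(N_RECRUITS)), key=cost)
        new_sites.append(min((s, trial), key=cost))
    # The remaining bees scout fresh regions of the search space.
    new_sites += [rand_point() for _ in range(N_SCOUTS - N_BEST)]
    sites = sorted(new_sites, key=cost)
    patch *= 0.95                            # shrink neighbourhoods over time
best = sites[0]
print(best, cost(best))
```

The counts of scouts, selected sites and recruits, the patch size and the shrink rate are precisely the parameters whose levels the paper determines experimentally.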
Abstract: Pressures for urban redevelopment are intensifying in
all large cities. A new logic for urban development is required –
green urbanism – that provides a spatial framework for directing
population and investment inwards to brownfields and greyfields
precincts, rather than outwards to the greenfields. This represents
both a major opportunity and a major challenge for city planners in
pluralist liberal democracies. However, plans for more compact
forms of urban redevelopment are stalling in the face of community
resistance. A new paradigm and spatial planning platform is required
that will support timely multi-level and multi-actor stakeholder
engagement, resulting in the emergence of consensus plans for
precinct-level urban regeneration capable of more rapid
implementation. Using Melbourne, Australia as a case study, this
paper addresses two of the urban intervention challenges – where and
how – via the application of ENVISION, a 21st-century planning tool
created for this purpose.
Abstract: This paper presents a novel genetic algorithm, termed
the Optimum Individual Monogenetic Algorithm (OIMGA) and
describes its hardware implementation. As the monogenetic strategy
retains only the optimum individual, the memory requirement is
dramatically reduced and no crossover circuitry is needed, thereby
ensuring the requisite silicon area is kept to a minimum.
Consequently, depending on application requirements, OIMGA
allows the investigation of solutions that warrant either larger GA
populations or individuals of greater length. The results given in this
paper demonstrate that both the performance of OIMGA and its
convergence time are superior to those of existing hardware GA
implementations. Local convergence is achieved in OIMGA by
retaining elite individuals, while population diversity is ensured by
continually searching for the best individuals in fresh regions of the
search space.
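In software terms (the paper's contribution is the hardware design), the monogenetic strategy reduces to mutation-only search around a single retained individual, with periodic scouting in fresh regions of the search space. A sketch with an illustrative bit-matching fitness:

```python
import random
random.seed(7)

LENGTH = 16
TARGET = 0b1011011101101011          # illustrative target bit pattern

def fitness(ind):
    # Number of bits agreeing with the target (higher is better).
    return LENGTH - bin(ind ^ TARGET).count("1")

def mutate(ind, rate=0.1):
    for b in range(LENGTH):
        if random.random() < rate:
            ind ^= 1 << b
    return ind

best = random.getrandbits(LENGTH)    # the single retained individual
for gen in range(400):
    if gen % 100 == 0 and fitness(best) < LENGTH:
        cand = random.getrandbits(LENGTH)   # scout a fresh region of the space
    else:
        cand = mutate(best)                 # explore around the retained optimum
    if fitness(cand) >= fitness(best):      # keep only the better individual
        best = cand
print(bin(best), fitness(best))
```

Because only one individual is ever stored and no crossover is performed, the memory and silicon-area savings claimed for the hardware version follow directly from this structure.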
Abstract: The goals of the present research are to assess Six Sigma implementation in Latvian commercial banks and to identify the perceived benefits of its implementation. To achieve these goals, the authors used a sequential explanatory method. To obtain empirical data, the authors developed a questionnaire and adapted it for the employees of Latvian commercial banks. The questions are related to Six Sigma implementation and its perceived benefits. The questionnaire mainly consists of closed questions, the evaluation of which is based on a 5-point Likert scale. The empirical data obtained have shown that, of the two hypotheses put forward in the present research, Hypothesis 1 has to be rejected, while Hypothesis 2 has been partially confirmed. The authors have also faced some research limitations related to the fact that the participants in the questionnaire belong to different ranks of the organizational hierarchy.
Abstract: Pattern matching based on regular tree grammars has been widely used in many areas of computer science. In this paper, we propose a pattern matcher within the framework of code generation, based on a generic and formalized approach. According to this approach, parsers for regular tree grammars are adapted to a general pattern matching solution, rather than adapting the pattern matching to their parsing behavior. Hence, we first formalize the construction of the pattern matches for input trees drawn from a regular tree grammar in the form of so-called match trees. Then, we adopt a recently developed generic parser and tightly couple its parsing behavior with this construction. In addition to its generality, the resulting pattern matcher is characterized by its soundness and efficient implementation. This is demonstrated by the proposed theory and by the algorithms derived for its implementation. A comparison with similar and well-known approaches, such as those based on tree automata and LR parsers, has shown that our pattern matcher can be applied to a broader class of grammars and achieves a better approximation of pattern matches in one pass. Furthermore, its use as a machine code selector incurs minimal overhead, due to the balanced distribution of the cost computations into static ones, during parser generation time, and dynamic ones, during parsing time.
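A much-simplified illustration of tree pattern matching (wildcard patterns matched in one bottom-up pass, as a code selector would collect candidate rules); the paper's grammar- and parser-based construction is considerably more general:

```python
# Trees are tuples: (operator, child, ...); leaves are strings.
# Patterns may contain the wildcard "_", which matches any subtree
# (a strong simplification of grammar-based matching).

def matches(pattern, tree):
    """Does the pattern (with '_' wildcards) match this subtree exactly?"""
    if pattern == "_":
        return True
    if isinstance(pattern, str) or isinstance(tree, str):
        return pattern == tree
    return (pattern[0] == tree[0] and len(pattern) == len(tree)
            and all(matches(p, t) for p, t in zip(pattern[1:], tree[1:])))

def match_all(patterns, tree):
    """Collect every (pattern, subtree) match in one bottom-up pass."""
    found = []
    def walk(t):
        if not isinstance(t, str):
            for child in t[1:]:
                walk(child)
        found.extend((p, t) for p in patterns if matches(p, t))
    walk(tree)
    return found

expr = ("add", ("mul", "a", "b"), "c")           # a * b + c
patterns = [("mul", "_", "_"),                    # could select a multiply
            ("add", ("mul", "_", "_"), "_")]      # could select a multiply-add
print(match_all(patterns, expr))
```

A real code selector would attach costs to each pattern and, as in the paper, split those cost computations between parser generation time and parsing time.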