Abstract: Considering today's increasing speed of change, radical
and innovative improvement (Kaikaku) is a necessity alongside
continuous incremental improvement (Kaizen), especially for
SMEs seeking the competitive edge needed to remain profitable.
During 2011, a qualitative single case study with the objective of
realizing a kaikaku in production was conducted. The case study
was run as a one-year project using a collaborative approach
including both researchers and company representatives, with the
purpose of gaining further knowledge about kaikaku realization
as well as its implications. The empirical results provide
insights into the considerable productivity gains achieved by
applying a specific kaikaku realization approach. They also shed
light on the difficulty and contradiction of combining innovation
management and production system development.
Abstract: Wrist pulse analysis for identification of health status
is found in ancient Indian as well as Chinese literature.
Preprocessing of the wrist pulse signal is necessary to remove
outlier pulses and fluctuations prior to the analysis of the pulse
pressure signal. This paper discusses the identification of
irregular pulses present in the pulse series and the intricacies
associated with the extraction of time-domain pulse features. A
Dynamic Time Warping (DTW) approach is utilized for the
identification of outlier pulses in the wrist pulse series. The
ambiguity present in the identification of pulse features is
resolved with the help of the first derivative of the ensemble
average of the wrist pulse series. An algorithm for detecting the
tidal wave and dicrotic notch in individual wrist pulse segments
is proposed.
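The DTW-based outlier screening described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the resampling to a common length, the mean + k·std distance threshold, and the function names are all assumptions.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def flag_outlier_pulses(pulses, k=2.0):
    """Flag pulses whose DTW distance to an ensemble-average template
    exceeds mean + k * std of all distances (threshold is assumed)."""
    # Resample every pulse to a common length for the ensemble average.
    L = min(len(p) for p in pulses)
    resampled = [np.interp(np.linspace(0, len(p) - 1, L),
                           np.arange(len(p)), p) for p in pulses]
    template = np.mean(resampled, axis=0)
    d = np.array([dtw_distance(p, template) for p in pulses])
    return d > d.mean() + k * d.std()
```

DTW is used here (rather than a pointwise distance) because individual pulses vary in duration, and warping aligns corresponding waveform landmarks before comparison.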
Abstract: In this study, three subtypes of influenza A virus (pH1N1, H5N1 and H3N2) that naturally infect humans were analyzed by bioinformatic approaches to find candidate human cellular miRNAs targeting the viral genomes. There were 76 miRNAs targeting influenza A viruses. Among these candidates, 70 miRNAs were subtype-specific: 21 targeted subtype H1N1, 27 targeted subtype H5N1 and 22 targeted subtype H3N2. The remaining 6 miRNAs targeted multiple subtypes of influenza A virus. Uniquely, hsa-miR-3145 is the only candidate miRNA targeting the PB1 gene of all three subtypes. Notably, most of the candidate miRNAs target the polymerase complex genes (PB2, PB1 and PA) of influenza A viruses. This study predicted potential human miRNAs targeting different subtypes of influenza A virus, which might be useful for inhibition of viral replication and for better understanding of the interaction between virus and host cell.
Abstract: The use of wind energy for electricity generation is
growing rapidly across the world and in Portugal. However, the
geographical characteristics of the country, along with the average
wind regime and the environmental restrictions imposed on these
projects, limit the exploitation of the onshore wind resource. The
best onshore wind spots are already committed, and the possibility
of offshore wind farms off the Portuguese coast is now being
considered. This paper aims to contribute to the evaluation of
offshore wind power projects in Portugal. The technical
restrictions are addressed, and the strategic, environmental and
financial interest of the project is analysed from both the private
company and public points of view. The results suggest that
additional support schemes are required to ensure private
investors' interest in these projects. Assuming an approach of
direct substitution of energy sources for electricity generation,
the avoided CO2-equivalent emissions for an offshore wind power
project were quantified. Based on the conclusions, future research
is proposed to address the environmental and social impacts of
these projects.
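Under the direct-substitution assumption stated above, the avoided-emissions calculation reduces to a simple product of annual generation and the emission factor of the displaced mix. The sketch below is purely illustrative; the function name and every numeric value are assumptions, not figures from the study.

```python
def avoided_co2_tonnes(capacity_mw, capacity_factor, grid_ef_t_per_mwh,
                       hours=8760):
    """Avoided CO2-eq under direct substitution: each MWh of offshore
    wind is assumed to displace one MWh of the current generation mix."""
    annual_mwh = capacity_mw * hours * capacity_factor
    return annual_mwh * grid_ef_t_per_mwh

# Hypothetical 200 MW farm, 35% capacity factor, displaced mix at
# 0.37 tCO2-eq/MWh (all values illustrative only).
print(avoided_co2_tonnes(200, 0.35, 0.37))
```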
Abstract: The flow-shop scheduling problem (FSP) deals with the
scheduling of a set of jobs that visit a set of machines in the
same order. The FSP is NP-hard, which means that no efficient
algorithm for solving the problem to optimality is known. To meet
time requirements and to minimize the makespan of large permutation
flow-shop scheduling problems with sequence-dependent setup times
on each machine, this paper develops a hybrid genetic algorithm
(HGA). The proposed HGA applies a modified approach to generate the
initial population of chromosomes and uses an improved heuristic,
called the iterated swap procedure, to improve the initial
solutions. Three genetic operators are also used to produce good
new offspring. The results are compared to some recently developed
heuristics, and the computational experiments show that the
proposed HGA performs very competitively with respect to accuracy
and efficiency of solution.
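The two building blocks named above can be sketched as follows: a makespan evaluation for a permutation flow shop with sequence-dependent setups, and a swap-based local improvement step. This is a generic illustration under stated assumptions (non-anticipatory setups, zero setup before the first job, a full pairwise-swap first-improvement loop), not the paper's exact iterated swap procedure.

```python
import itertools

def makespan(seq, proc, setup):
    """Makespan of a permutation flow shop.
    proc[m][j]: processing time of job j on machine m.
    setup[m][i][j]: setup on machine m when job j follows job i
    (assumed zero before the first job in the sequence)."""
    n_machines = len(proc)
    completion = [0.0] * n_machines
    prev = None
    for j in seq:
        for m in range(n_machines):
            s = setup[m][prev][j] if prev is not None else 0.0
            # Job j cannot start on machine m before the machine is free
            # and before j has finished on the previous machine.
            start = max(completion[m], completion[m - 1] if m else 0.0)
            completion[m] = start + s + proc[m][j]
        prev = j
    return completion[-1]

def iterated_swap(seq, proc, setup):
    """Repeatedly try all pairwise swaps, keeping any improving move,
    until no swap improves the makespan (a swap local optimum)."""
    best = list(seq)
    best_val = makespan(best, proc, setup)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(best)), 2):
            cand = list(best)
            cand[i], cand[j] = cand[j], cand[i]
            val = makespan(cand, proc, setup)
            if val < best_val:
                best, best_val, improved = cand, val, True
    return best, best_val
```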
Abstract: This study adopts a qualitative approach that engages in
a dialectical discussion on two levels of dyadic opposite views.
The first level is the Western strategic perspective versus Eastern
Tai-Chi thinking. The second level is the resource-based view
versus resource dependence theory. This study frames
resource-oriented actions for competitive advantage through the
metaphor of Tai-Chi, consisting of yin and yang. It argues that the
focal firm should adopt a bridging strategy during its core
competence development period, because its core competence
development is likely to meet its competitor's need for an
exploring strategy during the competitor's external resource
development stage. In addition, the focal firm should adopt a
buffering strategy during its external resource development period
to protect itself from the competitor's exploiting strategy during
the competitor's core competence development stage. Consequently,
this study takes a significant first step toward a novel
contextualized understanding of resource development based on the
strategic perspective and Tai-Chi thinking, providing a more fully
sustainable strategy for competitive advantage.
Abstract: This paper proposes a method for modeling the control laws of manufacturing systems with temporal and non-temporal constraints. A methodology for constructing robust control, generating margins of passive and active robustness, is elaborated. Two principal models are presented in this paper. The first uses P-time Petri nets to manage flow-type disturbances. The second, the quality model, exploits the Intervals Constrained Petri Nets (ICPN) tool, which allows the system to preserve its quality specifications. The redundancy of robustness of the elementary parameters between the passive and active forms is also used. The final model allows the correlation of temporal and non-temporal criteria by putting the two principal models in interaction. To this end, a set of definitions and theorems are employed and illustrated by application examples.
Abstract: Thousands of masters athletes participate
quadrennially in the World Masters Games (WMG), yet this cohort
of athletes remains proportionately under-investigated. Given the
growing global obesity pandemic and the benefits of physical
activity across the lifespan, the BMI trends of this unique
population were of particular interest. The nexus between health,
physical activity and aging is complex and has raised much
interest in recent times due to the realization that a
multifaceted approach is necessary to counteract the obesity
pandemic. By investigating age-based trends within a population
adhering to competitive sport at older ages, further insight might
be gleaned to assist in understanding one of the many factors
influencing this relationship. BMI was derived using data gathered
on a total of 6,071 masters athletes (51.9% male, 48.1% female)
aged 25 to 91 years (mean = 51.5, s = 9.7) competing at the Sydney
World Masters Games (2009). Using linear and loess regression, it
was demonstrated that the usual tendency for the prevalence of
higher BMI to increase with age was reversed in the sample. This
reversal was repeated for both the male-only and female-only
subsets of the sample, indicating the possibility of an improved
BMI profile with increasing age for the sample as a whole and for
these individual subgroups. This evidence of improved
classification in one index of health (reduced BMI) for masters
athletes, when compared to the general population, implies either
that this index of health improves with aging due to adherence to
sport, or that reduced BMI is advantageous and contributes to this
cohort adhering (or being attracted) to masters sport at older
ages.
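The core quantities above (BMI, and an age-based linear trend) can be sketched as follows. This is an illustration of the method only, on made-up numbers; it is not the study's data or analysis, and the loess step is omitted.

```python
import numpy as np

def bmi(mass_kg, height_m):
    """Body mass index: mass in kilograms over height in metres squared."""
    return mass_kg / height_m ** 2

def bmi_age_slope(ages, bmis):
    """Ordinary least-squares slope of BMI regressed on age.
    A negative slope would mirror the reversal reported in the abstract."""
    slope, _intercept = np.polyfit(ages, bmis, 1)
    return slope
```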
Abstract: The ubiquitous computing era is expected to arrive soon. A ubiquitous environment has peer-to-peer and nomadic features, which can be represented by peer-to-peer (P2P) systems and mobile ad-hoc networks (MANETs). The similarity between P2P systems and MANETs makes it appealing to implement P2P systems in MANET environments. It has been shown, however, that P2P systems designed for wired networks do not perform satisfactorily in mobile ad-hoc environments. This paper therefore proposes a method to improve P2P performance using a cross-layer design and the goodness of a node as a peer. The proposed method uses a routing metric as well as a P2P metric to choose favorable peers to connect to, and utilizes a proactive approach for distributing peer information. According to the simulation results, the proposed method provides a higher query success rate, shorter query response time and lower energy consumption by constructing an efficient overlay network.
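The idea of combining a routing metric with a P2P metric to rank candidate peers could be sketched as below. The specific metrics, normalization, and weighting here are illustrative assumptions, not the paper's actual scoring function.

```python
def peer_score(hop_count, link_quality, query_hit_rate, alpha=0.5):
    """Combine a routing metric (hop count, link quality) with a P2P
    metric (past query hit rate) into one goodness score. The weighting
    alpha and the 1/(1+hops) normalization are illustrative choices."""
    routing = link_quality / (1 + hop_count)  # nearer, better links score higher
    return alpha * routing + (1 - alpha) * query_hit_rate

def choose_peers(candidates, k=2):
    """candidates: dict peer_id -> (hop_count, link_quality, hit_rate).
    Returns the k highest-scoring peers to connect to."""
    ranked = sorted(candidates,
                    key=lambda p: peer_score(*candidates[p]),
                    reverse=True)
    return ranked[:k]
```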
Abstract: This research investigates the impact of a virtual learning environment (VLE) on mathematical concept acquisition by special education needs (SEN) students at the KS4 secondary education level in England. The overall aim of the study is to establish possible areas of difficulty, relative to the knowledge standard requirements for KS4 students, in the acquisition and validation of basic mathematical concepts. A teaching period in which a virtual learning environment (Fronter) was used to emphasise different mathematical perceptions and symbolic representations was carried out, and a task-based survey was administered to 20 special education needs students [14 actually took part]. The results show that students were able to process information and consider images, objects and numbers within the VLE at early stages of the acquisition process. They were also able to carry out perceptual tasks, though to differing degrees, and needed the teacher's guidance to connect them to symbolic representations and sometimes to coach them through. The pilot study further indicates that the school's VLE curriculum approaches were only minimally aligned with mathematics teaching, which does not emphasise the integration of the VLE into the existing curriculum and current teaching practice. There was also poor alignment of vision by the management regarding the use of the VLE in realising the objectives of teaching mathematics. In terms of teacher training, little was done to develop teachers' skills in the technical and pedagogical aspects of the VLE in use at the school. The classroom observation confirmed that teaching practice could rely on the VLE as an enhancer of mathematical skills, providing interaction and personalisation of learning to SEN students.
Abstract: Business-IT alignment has remained a top concern for
business and IT executives for almost three decades. Many
researchers have conducted empirical studies on the relationship
between business-IT alignment and performance. Yet these
approaches, lacking a social perspective, have had little impact
on sustaining performance and competitive advantage. In addition,
the alignment literature that explores organisational learning, as
represented in shared understanding, communication, cognitive maps
and experiences, is limited. Hence, this paper proposes an
integrated process that enables the social and intellectual
dimensions through the concept of organisational learning, in
particular a feedback and feedforward process that provides value
creation across dynamic, multilevel learning. This mechanism
enables ongoing effectiveness through the development of
individuals, groups and organisations, which improves the quality
of business and IT strategies and drives performance.
Abstract: In the traditional concept of product life cycle management, the activities of design, manufacturing, and assembly are performed sequentially. The drawback is that considerations in design may contradict considerations in manufacturing and assembly. Different designs of components can lead to different assembly sequences; therefore, in some cases, a good design may result in high costs in the downstream assembly activities. In this research, an integrated design evaluation and assembly sequence planning model is presented. Given a product requirement, there may be several alternative design cases for the components of the same product, and if a different design case is selected, the assembly sequence for constructing the product can differ. In this paper, the designed components are first represented using graph-based models, which are transformed into assembly precedence constraints and assembly costs. A particle swarm optimization (PSO) approach is presented, encoding a particle as a position matrix defined by the design cases and the assembly sequences. The PSO algorithm simultaneously performs design evaluation and assembly sequence planning with the objective of minimizing the total assembly cost. As a result, the design cases and the assembly sequences can both be optimized. The main contribution lies in the new concept of an integrated design evaluation and assembly sequence planning model and the new PSO solution method. An example product is tested, and the results show that the presented method is feasible and efficient for solving the integrated design evaluation and assembly planning problem.
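Applying PSO, a continuous-position method, to a sequencing problem requires a decoding step. The paper uses a position-matrix encoding over design cases and sequences; the sketch below substitutes the common random-key decoding (sort a particle's continuous values to obtain a task order) as an illustrative simplification, with the cost function, swarm parameters, and function name all being assumptions.

```python
import numpy as np

def pso_sequence(cost_fn, n_tasks, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Random-key PSO sketch: each particle holds one continuous value
    per task; np.argsort decodes those values into a task sequence,
    and cost_fn(seq) returns the assembly cost of that sequence."""
    rng = np.random.default_rng(seed)
    x = rng.random((n_particles, n_tasks))        # continuous positions
    v = np.zeros_like(x)                          # velocities
    pbest = x.copy()
    pbest_val = np.array([cost_fn(np.argsort(p)) for p in x])
    g_idx = pbest_val.argmin()
    gbest, gbest_val = pbest[g_idx].copy(), pbest_val[g_idx]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Standard velocity update: inertia + cognitive + social terms.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([cost_fn(np.argsort(p)) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        if vals.min() < gbest_val:
            gbest_val = vals.min()
            gbest = x[vals.argmin()].copy()
    return np.argsort(gbest), gbest_val
```

The decoding guarantees every particle always represents a valid permutation, which is why random keys are a popular bridge between continuous PSO and discrete sequencing problems.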
Abstract: An original Direct Numerical Simulation (DNS) method to tackle the problem of particulate flows at moderate to high concentration and finite Reynolds number is presented. Our method is built on the framework established by Glowinski and his coworkers [1], in the sense that we use their Distributed Lagrange Multiplier/Fictitious Domain (DLM/FD) formulation and their operator-splitting idea, but differs in the treatment of particle collisions. The novelty of our contribution lies in replacing the simple artificial repulsive-force collision model usually employed in the literature with an efficient Discrete Element Method (DEM) granular solver. The DEM solver enables us to consider particles of arbitrary (at least convex) shape and to account for actual contacts, in the sense that particles actually touch each other, in contrast with the simple repulsive-force collision model. We recently upgraded our serial code, GRIFF [2], to full MPI capabilities. Our new code, PeliGRIFF, is developed within the framework of the full MPI open source platform PELICANS [3]. The new MPI capabilities of PeliGRIFF open new perspectives in the study of particulate flows and significantly increase the number of particles that can be considered in a full DNS approach: O(100000) in 2D and O(10000) in 3D. Results on the 2D/3D sedimentation/fluidization of isometric polygonal/polyhedral particles with collisions are presented.
Abstract: This paper provides a key-driver-based conceptual framework that can be used to improve a firm's success in commercializing technology and in new product innovation resulting from collaboration with other organizations through strategic alliances. Based on a qualitative study using an interview approach, strategic alliances of entrepreneurs in the food processing industry in Thailand are explored. This paper describes factors affecting decisions to collaborate through alliances. It identifies four issues: maintaining the efficiency of the value chain for production capability, adapting to present and future competition, careful assessment of the value of outcomes, and management of innovation. We consider five driving factors: resource orientation, assessment of risk, business opportunity, sharing of benefits and confidence in alliance partners. These factors will be of interest to entrepreneurs and policy makers seeking further understanding of the direction of business strategies.
Abstract: The objectives of this research were to study the
factors that contributed to the success of electronic commerce
(e-commerce) and to study approaches to enhancing the standard of
e-commerce for small and medium enterprises (SMEs). The study
focused only on sole-proprietorship SMEs in Bangkok, Thailand.
The factors contributing to the success of SMEs included business
management, learning in the organization, business collaboration,
and the quality of the website. A mixed quantitative and
qualitative research methodology was used. For the quantitative
part, a questionnaire was used to collect data from 251 sole
proprietorships, and a Structural Equation Model (SEM) was
utilized as the tool for data analysis. For the qualitative part,
in-depth interviews, a dialogue with experts in the field of
e-commerce for SMEs, and content analysis were used.
Using the adjusted causal relationship structure model, the
factors affecting the success of e-commerce for SMEs were found to
be congruent with the empirical data. The hypothesis testing
indicated that business management influenced learning in the
organization; learning in the organization influenced business
collaboration and the quality of the website; and these factors,
in turn, influenced the success of the SMEs. Moreover, regarding
approaches to enhancing the standard of SMEs, the majority of
respondents wanted to raise the standard of SMEs to a high level
in the categories of e-commerce system safety, basic e-commerce
infrastructure, development of staff potential, budget assistance
and tax reduction, and improvement of e-commerce law,
respectively.
Abstract: This research paper deals with the implementation of face recognition using a neural network (recognition classifier) on low-resolution images. The proposed system contains two parts: preprocessing and face classification. The preprocessing part converts the original images into blurred images using an average filter and equalizes their histograms (lighting normalization). A bi-cubic interpolation function is applied to the equalized image to obtain a resized, low-resolution image, which provides faster processing for training and testing. The preprocessed image becomes the input to the neural network classifier, which uses the back-propagation algorithm to recognize familiar faces. The crux of the proposed algorithm is its use of a single neural network as the classifier, which yields a straightforward approach to face recognition. The single neural network consists of three layers with log-sigmoid, hyperbolic tangent sigmoid and linear transfer functions, respectively. The training function incorporated in our work is gradient descent with momentum and adaptive learning rate back-propagation. The proposed algorithm was trained on the ORL (Olivetti Research Laboratory) database with 5 training images. The empirical results give accuracies of 94.50%, 93.00% and 90.25% for 20, 30 and 40 subjects respectively, with a time delay of 0.0934 s per image.
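The first two preprocessing steps described above (average-filter blurring and histogram equalization) can be sketched in plain NumPy as below. This is an illustrative sketch, not the authors' code; the bi-cubic resizing step is omitted, and the kernel size and padding mode are assumptions.

```python
import numpy as np

def average_filter(img, k=3):
    """Blur with a k x k mean filter (edges handled by edge padding)."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    # Sum the k*k shifted copies, then divide: equivalent to a box filter.
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def equalize_histogram(img):
    """Lighting normalization: map grey levels through the normalized
    cumulative histogram so intensities span the full 0-255 range."""
    img = img.astype(np.uint8)
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min()) * 255
    return cdf[img].astype(np.uint8)
```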
Abstract: With the explosive growth of information sources available on the World Wide Web, it has become increasingly difficult to identify relevant pieces of information, since web pages are often cluttered with irrelevant content, such as advertisements, navigation panels and copyright notices, surrounding the main content of the page. Hence, tools for mining data regions, data records and data items need to be developed in order to provide value-added services. Currently available automatic techniques for mining data regions from web pages are still unsatisfactory because of their poor performance and tag dependence. In this paper a novel method to extract data items from web pages automatically is proposed. It comprises two steps: (1) identification and extraction of data regions based on visual-clue information; (2) identification of data records and extraction of data items from a data region. For step 1, a novel and more effective method is proposed that finds data regions formed by all types of tags using visual clues. For step 2, a more effective method, namely Extraction of Data Items from web Pages (EDIP), is adopted to mine data items. The EDIP technique is a list-based approach in which the list is a linear data structure. The proposed technique is able to mine non-contiguous data records and can correctly identify data regions irrespective of the type of tag in which they are bound. Our experimental results show that the proposed technique performs better than existing techniques.
Abstract: Early detection of lung cancer through chest radiography
is a widely used method due to its relatively affordable cost. In
this paper, an approach to improving lung nodule visualization on
chest radiographs is presented. The approach makes use of a
linear-phase high-frequency emphasis filter for digital filtering
and histogram equalization for contrast enhancement. The results
obtained indicate that a filtered image can reveal sharper edges
and provide more detail. In addition, contrast enhancement offers
a way to further enhance the global (or local) visualization by
equalizing the histogram of the pixel values within the whole
image (or a region of interest). The work aims to improve lung
nodule visualization on chest radiographs to aid the detection of
lung cancer, currently the leading cause of cancer deaths
worldwide.
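A high-frequency emphasis filter can be sketched in the frequency domain as H = a + b * Hhp, where Hhp is a high-pass transfer function: the offset a retains some low-frequency content so the image is not hollowed out, while b boosts edges and fine detail. The sketch below uses a Gaussian high-pass and zero-phase (hence linear-phase) filtering; the filter shape and all parameter values are illustrative assumptions, not the paper's design.

```python
import numpy as np

def high_freq_emphasis(img, d0=30.0, a=0.5, b=1.5):
    """Frequency-domain high-frequency emphasis: multiply the 2-D
    spectrum by H = a + b * Hhp with a Gaussian high-pass Hhp of
    cutoff d0. H is real and symmetric, so the filter is zero-phase."""
    rows, cols = img.shape
    # Frequency coordinates matching np.fft.fft2's layout.
    u = np.fft.fftfreq(rows)[:, None] * rows
    v = np.fft.fftfreq(cols)[None, :] * cols
    d2 = u ** 2 + v ** 2
    hhp = 1.0 - np.exp(-d2 / (2.0 * d0 ** 2))   # Gaussian high-pass
    H = a + b * hhp
    F = np.fft.fft2(img.astype(float))
    return np.real(np.fft.ifft2(F * H))
```

Histogram equalization (as sketched for the face-recognition abstract above) would then be applied to the filtered image for the contrast-enhancement stage.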
Abstract: This paper addresses the problem of blind source
separation (BSS). To recover the original signals from linear
instantaneous mixtures, we propose a new contrast function based
on the use of a double-referenced system. Our approach assumes
statistically independent sources. Reference vectors are embedded
in the cumulants to evaluate independence. The estimation of the
separating matrix is performed in two steps: whitening the
observations and joint diagonalization of a set of referenced
cumulant matrices. Computer simulations are presented to
demonstrate the effectiveness of the suggested approach.
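The first of the two estimation steps, whitening, is standard and can be sketched as follows: transform the zero-mean observations so their covariance becomes the identity, after which the remaining separating transform is an orthogonal matrix found by the joint diagonalization step (not shown here).

```python
import numpy as np

def whiten(X):
    """Whiten observations X (n_signals x n_samples): return Z = W @ Xc
    with identity sample covariance, plus the whitening matrix W,
    computed from the eigendecomposition of the sample covariance."""
    Xc = X - X.mean(axis=1, keepdims=True)        # remove the mean
    cov = Xc @ Xc.T / Xc.shape[1]                 # sample covariance
    eigval, eigvec = np.linalg.eigh(cov)
    W = eigvec @ np.diag(1.0 / np.sqrt(eigval)) @ eigvec.T
    return W @ Xc, W
```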
Abstract: In recent years, sustainable supply chain management
(SSCM) has been widely researched in the academic domain. However,
due to the traditional operational role and the complexity of
supply chain management in the cement industry, relatively little
research has been conducted on cement supply chain simulation
integrated with sustainability criteria. This paper analyses
cement supply chain operations using the push-pull supply chain
framework, the Life Cycle Assessment (LCA) methodology and a
proposed integration approach, and proposes three supply chain
scenarios based on Make-To-Stock (MTS), Pack-To-Order (PTO) and
Grind-To-Order (GTO) strategies. A Discrete-Event Simulation (DES)
model of the SSCM is constructed using the Arena software to
implement the three target scenarios. The simulation results
indicate that GTO is the optimal supply chain strategy,
demonstrating the best economic, ecological and social performance
in the cement industry.
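The study's model is built in Arena, a commercial DES package, but the underlying mechanics of any discrete-event simulation are a time-ordered event queue and handlers that schedule further events. The minimal skeleton below illustrates only that mechanism; the event names and the grind-to-order-style example are purely hypothetical.

```python
import heapq
import itertools

def simulate(initial_events, horizon):
    """Minimal discrete-event loop. Each event is (time, handler); a
    handler is called with the current time and returns zero or more
    new (time, handler) events. Runs until the queue empties or the
    next event lies beyond the horizon."""
    counter = itertools.count()   # tie-breaker so the heap never compares handlers
    queue = [(t, next(counter), h) for t, h in initial_events]
    heapq.heapify(queue)
    trace = []
    while queue:
        t, _, handler = heapq.heappop(queue)
        if t > horizon:
            break
        trace.append(t)
        for nt, nh in handler(t):
            heapq.heappush(queue, (nt, next(counter), nh))
    return trace
```

As an illustrative grind-to-order-style flow, an order-arrival handler can schedule both the next arrival and a grinding-completion event a fixed processing time later.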