Abstract: This study explores the factors influencing management and technology capabilities in strategic alliances. Alliances between firms are increasingly recognized as a popular vehicle for creating and extracting greater value from the market. A firm's alliance can be described as a collaborative process for solving problems jointly. This study starts from the research question of which characteristics of a firm's management and technology affect the performance of firms that have formed alliances. We investigate the effect of strategic alliances on company performance; that is, we try to identify whether firms that have formed alliances with other organizations differ in their management and technology characteristics, and we test whether alliance type and alliance experience moderate the relationship between a firm's capabilities and its performance. We employ the problem-solving perspective and the resource-based view to shed light on these questions. The empirical work is based on the Survey of Business Activities conducted from 2006 to 2008 by Statistics Korea. The results contribute new empirical evidence on the effect of strategic alliances on company performance.
Abstract: Modeling complex dynamic systems for which mathematical models are very complicated to establish requires new and modern methodologies that exploit existing expert knowledge, human experience and historical data. Fuzzy cognitive maps are very suitable, simple, and powerful tools for the simulation and analysis of such dynamic systems. However, human experts are subjective and can handle only relatively simple fuzzy cognitive maps; there is therefore a need to develop new approaches for the automated generation of fuzzy cognitive maps from historical data. In this study, a new learning algorithm, called Big Bang-Big Crunch, is proposed for the first time in the literature for the automated generation of fuzzy cognitive maps from data. Two real-world examples, namely a process control system and a radiation therapy process, and one synthetic model are used to demonstrate the effectiveness and usefulness of the proposed methodology.
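The optimization loop behind such a learner can be sketched as follows: candidate weight matrices are scattered at random (the Big Bang), then contracted toward a fitness-weighted centre of mass (the Big Crunch), with a shrinking search radius. The sketch below is a minimal illustration on made-up data, not the paper's implementation; the FCM update rule, population size and radius schedule are all assumptions.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def simulate(weights, state, steps):
    # Common FCM update rule: A_i(t+1) = sigmoid( sum_j A_j(t) * w_ji )
    n = len(state)
    history = [list(state)]
    for _ in range(steps):
        state = [sigmoid(sum(state[j] * weights[j][i] for j in range(n)))
                 for i in range(n)]
        history.append(state)
    return history

def error(weights, target):
    # Squared error between simulated and observed concept time series.
    sim = simulate(weights, target[0], len(target) - 1)
    return sum((a - b) ** 2
               for srow, trow in zip(sim, target)
               for a, b in zip(srow, trow))

def bb_bc_learn(target, n, pop=30, iters=60):
    # Big Bang: scatter random candidate weight matrices around the centre.
    # Big Crunch: contract to the fitness-weighted centre of mass, then
    # repeat with a shrinking search radius.
    centre = [[0.0] * n for _ in range(n)]
    best, best_err = None, float("inf")
    for it in range(1, iters + 1):
        radius = 1.0 / it
        cands = [[[max(-1.0, min(1.0, centre[i][j] + random.uniform(-radius, radius)))
                   for j in range(n)] for i in range(n)]
                 for _ in range(pop)]
        errs = [error(c, target) for c in cands]
        masses = [1.0 / (e + 1e-9) for e in errs]
        total = sum(masses)
        centre = [[sum(m * c[i][j] for m, c in zip(masses, cands)) / total
                   for j in range(n)] for i in range(n)]
        k = min(range(pop), key=errs.__getitem__)
        if errs[k] < best_err:
            best, best_err = cands[k], errs[k]
    return best, best_err

# Hypothetical target data: concept activations produced by a known weight matrix.
true_w = [[0.0, 0.7], [-0.4, 0.0]]
target = simulate(true_w, [0.5, 0.5], 6)
learned, err = bb_bc_learn(target, n=2)
```

The weights stay clamped to [-1, 1], the usual range for FCM causal influences.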
Abstract: We propose an information filtering system that selects index words from a document set based on the topics included in that set. The method narrows the vocabulary down to the particularly characteristic words of the document set; the topics are obtained by Sparse Non-negative Matrix Factorization. In information filtering, a document is often represented as a vector whose elements correspond to the weights of the index words, and the dimension of this vector grows as the number of documents increases. It is therefore possible that words useless as index words for filtering are included. To address this problem, the dimension needs to be reduced. Our proposal reduces the dimension by selecting index words based on the topics included in the document set, which we obtain by applying Sparse Non-negative Matrix Factorization to that set. Filtering is carried out based on the centroid of the learning document set, which is regarded as the user's interest; the centroid is represented as a document vector whose elements consist of the weights of the selected index words. Using the English test collection MEDLINE, we confirm the effectiveness of our proposal: when the appropriate number of index words is selected, it improves recommendation accuracy over previous methods. In addition, examining the selected index words, we found that our proposal was able to select index words covering some minor topics included in the document set.
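As a rough illustration of the selection step, the sketch below factorizes a toy document-term matrix with plain multiplicative-update NMF and keeps the top-weighted words of each topic as index words. The vocabulary and counts are invented, and the paper's sparse NMF variant and weighting scheme are replaced by the plainest possible stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def nmf(V, k, iters=300):
    # Multiplicative-update NMF: V ~ W @ H with non-negative factors.
    # (The paper uses a *sparse* NMF variant; plain NMF is shown for brevity.)
    n, m = V.shape
    W = rng.random((n, k)) + 1e-3
    H = rng.random((k, m)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

def select_index_words(V, vocab, k, per_topic=2):
    # Keep the most characteristic words of each topic as index words.
    _, H = nmf(V, k)                       # H[t, j]: weight of word j in topic t
    keep = set()
    for t in range(k):
        keep.update(int(j) for j in np.argsort(H[t])[::-1][:per_topic])
    return sorted(vocab[j] for j in keep)

# Toy document-term matrix (rows: documents, columns: word counts) with two
# obvious topics; the vocabulary names are purely illustrative.
vocab = ["cell", "gene", "protein", "stock", "market", "trade"]
V = np.array([[3, 2, 2, 0, 0, 0],
              [2, 3, 1, 0, 0, 0],
              [0, 0, 0, 3, 2, 2],
              [0, 0, 0, 2, 2, 3]], dtype=float)
selected = select_index_words(V, vocab, k=2)
```

Only the retained columns would then be used when building document vectors and the centroid.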
Abstract: Cryo-electron microscopy (CEM) in combination with single particle analysis (SPA) is a widely used technique for elucidating structural details of macromolecular assemblies at close-to-atomic resolution. However, the development of automated software for SPA processing remains vital, since thousands to millions of individual particle images need to be processed. Here, we present our workflow for automated particle picking. Our approach integrates peak-shape analysis with the classical correlation approach and an iterative classification step to separate macromolecules from background. This particle selection workflow thus provides a robust means for SPA with little user interaction. The performance of the presented tools is assessed by processing simulated and experimental data.
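A minimal stand-in for correlation-based particle picking might look like the following: compute a normalized cross-correlation map against a template and greedily accept well-separated peaks. The peak-shape analysis and iterative classification of the actual workflow are not reproduced here; the synthetic "micrograph", template and thresholds are assumptions.

```python
import numpy as np

def ncc_map(image, template):
    # Normalized cross-correlation of the template over the image (valid area).
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-9)
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            out[y, x] = (p * t).mean()
    return out

def pick_particles(corr, threshold, min_dist):
    # Greedy picking: take the strongest peaks first and suppress anything
    # closer than min_dist to an already accepted peak.
    peaks = []
    for flat in np.argsort(corr, axis=None)[::-1]:
        y, x = np.unravel_index(flat, corr.shape)
        if corr[y, x] < threshold:
            break
        if all((y - py) ** 2 + (x - px) ** 2 >= min_dist ** 2 for py, px in peaks):
            peaks.append((int(y), int(x)))
    return peaks

# Synthetic micrograph: two copies of a blob-shaped "particle" on an empty field.
template = np.array([[0, 1, 0], [1, 2, 1], [0, 1, 0]], dtype=float)
image = np.zeros((20, 20))
image[3:6, 3:6] += template
image[12:15, 10:13] += template
picks = pick_particles(ncc_map(image, template), threshold=0.9, min_dist=4)
```

In a real pipeline the picked coordinates would then be windowed out and passed to classification.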
Abstract: Efficient storage, transmission and use of video information are key requirements in many multimedia applications currently being addressed by MPEG-4. To fulfill these requirements, a new approach for representing video information, which relies on an object-based representation, has been adopted. Object-based watermarking schemes are therefore needed for copyright protection. This paper proposes a novel blind object watermarking scheme for images and video using the in-place lifting shape-adaptive discrete wavelet transform (SA-DWT). In order to make the watermark robust and transparent, it is embedded in the average of wavelet blocks using a visual model based on the human visual system; the n least significant bits (LSBs) of the wavelet coefficients are adjusted in concert with the average. Simulation results show that the proposed watermarking scheme is perceptually invisible and robust against many attacks such as lossy image/video compression (e.g. JPEG, JPEG2000 and MPEG-4), scaling, noise addition, filtering, etc.
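To illustrate the general idea of embedding a bit in the average of a coefficient block, the sketch below uses quantization-index modulation on the block mean. This is a generic stand-in, not the paper's scheme: the SA-DWT, the visual model and the LSB adjustment are omitted, and the block values and step size are invented.

```python
def embed_bit(block, bit, step=8.0):
    # Shift every coefficient in the block equally so that the block average
    # lands on a lattice point whose offset encodes the bit (QIM on the mean).
    mean = sum(block) / len(block)
    offset = bit * step / 2.0
    target = round((mean - offset) / step) * step + offset
    delta = target - mean
    return [c + delta for c in block]

def extract_bit(block, step=8.0):
    # Decode by checking which lattice (offset 0 or step/2) the mean is nearer to.
    mean = sum(block) / len(block)
    d0 = abs(mean - round(mean / step) * step)
    d1 = abs((mean - step / 2.0) - round((mean - step / 2.0) / step) * step)
    return 0 if d0 <= d1 else 1

# Hypothetical wavelet-coefficient blocks; values are illustrative only.
blocks = [[10.3, 12.1, 9.8, 11.0], [3.2, 4.4, 2.9, 3.6]]
bits = [1, 0]
marked = [embed_bit(b, bit) for b, bit in zip(blocks, bits)]
# A mild attack: every coefficient perturbed by +0.7 (well under step/4).
attacked = [[c + 0.7 for c in b] for b in marked]
recovered = [extract_bit(b) for b in attacked]
```

Any mean perturbation smaller than step/4 still decodes correctly, which is the sense in which embedding in a block average buys robustness.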
Abstract: Mobile agents are a powerful approach to developing distributed systems, since they migrate to hosts on which they have the resources to execute individual tasks. In a dynamic environment like a peer-to-peer network, agents have to be generated frequently and dispatched into the network, so they inevitably consume a certain amount of bandwidth on each link. If too many agents migrate through one or several links at the same time, they introduce excessive transfer overhead; eventually these links become congested and indirectly block network traffic. There is therefore a need for routing algorithms that take traffic load into account. In this paper we seek to create cooperation between a probabilistic quality measure of the network traffic situation and the agent's migration decision, choosing the next hop using decision tree learning algorithms.
Abstract: This paper presents a comparison between Spectrum-Sliced Wavelength Division Multiplexing (SS-WDM) and Spectrum Amplitude Coding Optical Code Division Multiple Access (SAC Optical CDMA) systems for different light sources. The performance of both systems is shown in simulated results for the bit error rate (BER) and the eye diagram. The comparison indicates that Multiple Access Interference (MAI) effects have a more significant impact on SS-WDM than on SAC Optical CDMA systems. Finally, in terms of spectral efficiency at a constant BER of 10^-12, SS-WDM offers higher spectral efficiency than optical CDMA, since no bandwidth expansion is needed.
Abstract: Digital Video Broadcasting - Terrestrial (DVB-T) allows broadcasting, telephone and data services to be combined in one network, and it has facilitated mobile TV broadcasting. Mobile TV broadcasting is dominated by fragmentation of the standards in use on different continents: in Asia T-DMB and ISDB-T are used, Europe uses mainly DVB-H, and in the USA it is MediaFLO. Royalty issues for the developers of these incompatible technologies, the investments already made and differing local conditions will make it difficult to agree on a unified standard in the very near future. Despite this shortcoming, mobile TV has shown very good market potential. A number of challenges still exist for regulators, investors and technology developers, but the future looks bright. Mobile telephone operators need to cooperate with content providers and operators of terrestrial digital broadcasting infrastructure for mutual benefit.
Abstract: We present an approach for integrating a CMOS biosensor into a polymer-based microfluidic environment suitable for mass production. It consists of a wafer-level package for the silicon die and a laser bonding process, promoted by an intermediate hot-melt foil, to attach the sensor package to the microfluidic chip without the need for dispensing glues or underfill. A very good condition of the sensing area was obtained after introducing a protection layer during packaging. A microfluidic flow cell was fabricated and shown to withstand pressures up to Δp = 780 kPa without leakage. The employed biosensors were electrically characterized in a dry environment.
Abstract: In today's economy, plant engineering faces many challenges. For instance, intensifying competition in this business leads to cost competition and the need for a shorter time-to-market. To remain competitive, companies need to make their businesses more profitable by implementing improvement programs such as standardization projects, but for various reasons they have difficulty tapping their full economic potential. One of these reasons is non-holistic planning and implementation of standardization projects. This paper describes a new conceptual framework, the layer model, which combines and expands existing proven approaches in order to improve the design, implementation and management of standardization projects. Based on a holistic approach, it helps to systematically analyze the effects of standardization projects on different business layers and enables companies to better seize the opportunities offered by standardization.
Abstract: The significance of psychology in studying politics is embedded in philosophical issues as well as behavioural pursuits. The former is often associated with Sigmund Freud and his followers; the latter is inspired by the writings of Harold Lasswell. Political psychology, or psychopolitics, has made its own impression on political thought ever since it deciphered the concepts of human nature and political propaganda. More importantly, psychoanalysis views political thought as textual content whose latent meaning needs to be explored from its manifest content. In other words, it reads the text symptomatically and interprets the hidden truth. This paper explains the paradigm of dream interpretation applied by Freud. The dream work is a process comprising four successive activities: condensation, displacement, representation and secondary revision. Texts dealing with political thought can also be interpreted on these principles. Freud's method of dream interpretation draws on the hermeneutic model of philological research: it provides a theoretical perspective and technical rules for the interpretation of symbolic structures. The task of interpretation remains a discovery of the equivalence of symbols and actions through perpetual analogies. Psychoanalysis can help in studying political thought in two ways: to study textual distortion, Freud's dream interpretation is used as a paradigm for exploring the latent text from its manifest text; and Freud's psychoanalytic concepts and theories, ranging from the individual mind to civilization, religion, war and politics, can be applied.
Abstract: This paper provides new ways to explore the old
problem of failure of information systems development in an
organisation. Based on the theory of cognitive dissonance,
information systems (IS) failure is defined as a gap between what the
users expect from an information system and how well these
expectations are met by the perceived performance of the delivered
system. Bridging the expectation-perception gap requires that IS
professionals make a radical change from being the proprietor of
information systems and products to being service providers. In order
to deliver systems and services that IS users perceive as valuable, IS
people must become expert at determining and assessing users'
expectations and perceptions. It is also suggested that the IS
community, in general, has given relatively little attention to the
front-end process of requirements specification for IS development.
There is a simplistic belief that requirements are obtainable from
users and then translatable into a formal specification. The
process of information needs analysis is problematic and worthy of
investigation.
Abstract: Speeding is one of the main concerns for road safety and remains a subject of research. The need to address this problem and to understand why drivers speed is especially pressing in Romania, where speed was the main cause of car accidents in 2011. This article addresses the problem using the theory of planned behaviour. A questionnaire was administered to a sample of young Romanian drivers (18 to 25 years old), and several path analyses were conducted to verify whether the model proposed by the theory of planned behaviour fits the data. One interesting result is that perceived behavioural control does not predict the intention to speed or self-reported driving speed, but subjective norms do. This implies that peers and the social environment have a greater impact on young Romanian drivers than previously thought.
Abstract: Recently, vehicular ad-hoc networks (VANETs) for Intelligent Transport Systems (ITS) have become able to provide safety and convenience services surpassing simple services such as electronic toll collection. To provide these services properly, a VANET needs infrastructure covering the whole country, which would require a huge expenditure of human resources. For this reason, several studies have examined using existing cellular networks instead of deploying new protocols and infrastructure. The aim of this study is to assess the performance of cellular networks for VANETs. In this paper we present the results of an experiment on the suitability of cellular networks for VANETs; in the experiment, LTE (Long Term Evolution) was found to be the most suitable among the cellular networks considered.
Abstract: A circular knitting machine produces fabric with more than two knitting tools. Variation of yarn tension between the different knitting tools causes different loop lengths of stitches during the knitting process. In this research, a new intelligent method is applied to control the loop length of stitches in the various tools, based on the ideal shape of stitches and the real angle of stitch direction, since different loop lengths cause stitch deformation and deviation of that angle. To measure the deviation of stitch direction against variation in tension, image processing techniques were applied to pictures of different fabrics under constant front light. The rate of deformation is then translated into the loop-length cam-degree compensation needed to correct the stitch deformation. A fuzzy control algorithm was applied to modify the loop length in the knitting tools. The presented method was tested on knitted fabrics of various structures and yarns. The results show that the presented method is usable for controlling loop-length variation between different knitting tools, based on stitch deformation, for knitted fabrics with different structures, densities and yarn types.
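The final control step can be illustrated with a tiny Sugeno-style fuzzy rule base mapping stitch-angle deviation to a cam-degree correction. The membership functions and consequent values below are illustrative placeholders, not the calibrated rules of the presented method.

```python
def tri(x, a, b, c):
    # Triangular membership function rising from a, peaking at b, falling to c.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_cam_correction(deviation_deg):
    # Sugeno-style rules: stitch-angle deviation (deg) -> cam correction (deg).
    # The consequents are illustrative singletons, not machine-calibrated values.
    rules = [
        (tri(deviation_deg, -10.0, -5.0, 0.0), -3.0),  # negative deviation -> loosen
        (tri(deviation_deg,  -5.0,  0.0, 5.0),  0.0),  # near zero -> no change
        (tri(deviation_deg,   0.0,  5.0, 10.0), 3.0),  # positive deviation -> tighten
    ]
    num = sum(mu * out for mu, out in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.0
```

A deviation halfway between two rule peaks blends the two corrections proportionally, which is what smooths the cam adjustment.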
Abstract: The objectives of this research were to explore factors
influencing knowledge management process in the manufacturing
industry and develop a model to support knowledge management
processes. The studied factors were technology infrastructure, human
resource, knowledge sharing, and the culture of the organization. The
knowledge management processes included discovery, capture,
sharing, and application. Data were collected through questionnaires
and analyzed using multiple linear regression and multiple
correlation. The results showed that technology infrastructure, human resource, knowledge sharing, and the culture of the organization influenced the discovery and capture processes. However, knowledge sharing had no influence on the sharing and application processes. A model to support knowledge management processes was developed, which indicated that knowledge sharing needs further improvement in the organization.
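The kind of analysis described, multiple linear regression together with a multiple correlation coefficient, can be sketched as follows on fabricated survey scores; the factor names follow the abstract, but every number is invented.

```python
import numpy as np

# Hypothetical Likert-scale survey data (rows: respondents); the columns stand
# for technology infrastructure, human resource, knowledge sharing and culture.
X = np.array([[4, 3, 4, 5],
              [3, 3, 3, 4],
              [5, 4, 4, 5],
              [2, 2, 3, 3],
              [4, 4, 5, 4],
              [3, 2, 2, 3]], dtype=float)
# Score of one knowledge-management process (e.g. discovery) per respondent.
y = np.array([3.9, 3.4, 4.7, 2.6, 4.2, 3.1])

X1 = np.column_stack([np.ones(len(X)), X])      # add an intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)   # least-squares fit
pred = X1 @ coef
R = np.corrcoef(y, pred)[0, 1]                  # multiple correlation R
```

The fitted coefficients indicate which factors contribute to the process score, and R summarizes how well the factors jointly explain it.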
Abstract: Combined therapy using Interferon and Ribavirin is the standard treatment for patients with chronic hepatitis C. However, the number of responders to this treatment is low, whereas its cost and side effects are high. There is therefore a clear need to predict a patient's response to the treatment from clinical information, to protect patients from drawbacks such as intolerable side effects and wasted money. Different machine learning techniques have been developed for this purpose, among them Associative Classification (AC) and Decision Tree (DT) learning. The aim of this research is to compare the performance of these two techniques in predicting the virological response to the standard treatment of HCV from clinical information. Data from 200 patients treated with Interferon and Ribavirin were analyzed using AC and DT; 150 cases were used to train the classifiers and 50 cases to test them. The experimental results showed that both techniques gave acceptable results; however, the best accuracy reached 92% for AC, versus 80% for DT.
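As a sketch of the decision-tree side of such a comparison, the following builds a tiny ID3-style tree on fabricated records and classifies held-out cases. The feature names, values and labels are illustrative only and bear no relation to the study's clinical data.

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def build_tree(rows, labels, features):
    # ID3: recursively split on the feature with the highest information gain.
    if len(set(labels)) == 1 or not features:
        return Counter(labels).most_common(1)[0][0]
    def remainder(f):
        total = 0.0
        for v in set(r[f] for r in rows):
            idx = [i for i, r in enumerate(rows) if r[f] == v]
            total += len(idx) / len(rows) * entropy([labels[i] for i in idx])
        return total
    best = min(features, key=remainder)   # lowest remainder = highest gain
    node = {"feature": best,
            "default": Counter(labels).most_common(1)[0][0],
            "branches": {}}
    for v in set(r[best] for r in rows):
        idx = [i for i, r in enumerate(rows) if r[best] == v]
        node["branches"][v] = build_tree([rows[i] for i in idx],
                                         [labels[i] for i in idx],
                                         [f for f in features if f != best])
    return node

def classify(tree, row):
    while isinstance(tree, dict):
        tree = tree["branches"].get(row[tree["feature"]], tree["default"])
    return tree

# Entirely fabricated toy records -- feature names and values are illustrative,
# not real clinical data.
train = [({"viral_load": "low",  "genotype": "2"}, "responder"),
         ({"viral_load": "low",  "genotype": "1"}, "responder"),
         ({"viral_load": "high", "genotype": "1"}, "non-responder"),
         ({"viral_load": "high", "genotype": "2"}, "non-responder"),
         ({"viral_load": "low",  "genotype": "1"}, "responder"),
         ({"viral_load": "high", "genotype": "1"}, "non-responder")]
rows = [r for r, _ in train]
labels = [l for _, l in train]
tree = build_tree(rows, labels, ["viral_load", "genotype"])
```

Accuracy would then be measured by running `classify` over a held-out test set, as the study does with its 50 test cases.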
Abstract: The internet is constantly expanding. Identifying web links of interest from a web browser requires users to visit each of the listed links individually until a satisfactory one is found, so users need to evaluate a considerable number of links before finding their link of interest; this can be tedious and even unproductive. With web assistance, web users could benefit from reduced time searching for relevant websites. In this paper, a rough set approach is presented that facilitates the classification of the unlimited available e-vocabulary to assist web users in reducing the time spent looking for relevant web sites. This approach includes two methods for identifying relevant data on web links, based on the priority and the percentage of relevance. As a result of these methods, a list of web sites is generated in priority sequence, with emphasis on the search criteria.
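The rough-set machinery behind such relevance classification can be sketched via lower and upper approximations of a "relevant" set under an indiscernibility relation. The link attributes below are invented for illustration.

```python
def partition(objects, attrs):
    # Indiscernibility classes: objects with identical values on chosen attributes.
    classes = {}
    for name, row in objects.items():
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(name)
    return list(classes.values())

def approximations(objects, attrs, target):
    # Rough-set lower approximation: classes certainly inside the target set.
    # Upper approximation: classes that possibly intersect the target set.
    lower, upper = set(), set()
    for cls in partition(objects, attrs):
        if cls <= target:
            lower |= cls
        if cls & target:
            upper |= cls
    return lower, upper

# Hypothetical web links described by two illustrative attributes.
links = {"l1": {"hits": "high", "recent": "yes"},
         "l2": {"hits": "high", "recent": "yes"},
         "l3": {"hits": "low",  "recent": "yes"},
         "l4": {"hits": "low",  "recent": "no"}}
relevant = {"l1", "l3"}      # user-judged relevance; l1 and l2 are indiscernible
lower, upper = approximations(links, ["hits", "recent"], relevant)
```

Links in the lower approximation are certainly relevant given the attributes; those only in the upper approximation are possibly relevant and would rank lower in the generated priority sequence.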
Abstract: In view of growing competition in the service sector,
services are as much in need of modeling, analysis and improvement
as business or working processes. Graphical process models are
important means to capture process-related know-how for an
effective management of the service process. In this contribution, a human performance analysis of process model development was conducted, paying special attention to model development time and the working method. It was found that modelers with higher application experience needed significantly less time for mental activities than modelers with lower application experience, spent more time labeling graphical elements, and achieved higher process model quality in terms of activity label quality.
Abstract: Estimates of temperature values at a specific time of day, derived from daytime and daily profiles, are needed for a number of environmental, ecological, agricultural and technical applications, ranging from natural hazard assessment and crop growth forecasting to the design of solar energy systems. The scope of this research is to investigate the efficiency of data mining techniques in estimating minimum, maximum and mean temperature values. For this purpose, a number of experiments have been conducted with well-known regression algorithms using temperature data from the city of Patras in Greece. The performance of these algorithms has been evaluated using standard statistical indicators such as the Correlation Coefficient and the Root Mean Squared Error.
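A minimal version of such an experiment, fitting a simple least-squares regression and scoring it with the Correlation Coefficient and the Root Mean Squared Error, might look like this; the temperature values are made up, not the Patras data.

```python
import math

def fit_line(x, y):
    # Ordinary least squares for y ~ a + b*x.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def rmse(y, p):
    # Root Mean Squared Error between observations y and predictions p.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, p)) / len(y))

def corr(y, p):
    # Pearson correlation coefficient between observations and predictions.
    n = len(y)
    my, mp = sum(y) / n, sum(p) / n
    num = sum((a - my) * (b - mp) for a, b in zip(y, p))
    den = math.sqrt(sum((a - my) ** 2 for a in y) *
                    sum((b - mp) ** 2 for b in p))
    return num / den

# Illustrative (made-up) daily mean vs. daily maximum temperatures in deg C.
t_mean = [12.0, 15.0, 18.0, 21.0, 24.0, 27.0]
t_max = [17.0, 21.0, 24.0, 27.0, 31.0, 33.0]
a, b = fit_line(t_mean, t_max)
pred = [a + b * x for x in t_mean]
```

A real experiment would hold out test data and compare several regression algorithms under the same two indicators.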