Abstract: This paper introduces and discusses definitions and concepts from the field of supplier relationship management (SRM). The review aims to give readers the background needed to understand the market mechanisms and technological developments of the SRM market. It then portrays the current business environment in which SRM vendors operate and the main trends in the field, organized around the core SRM functionalities, i.e. e-Procurement, e-Sourcing and Supplier Enablement, indicating to users and software providers the technological developments and practices likely to take place in this area over the next few years.
Abstract: In this paper, the decomposition-aggregation method
is used to derive connective stability criteria for general linear
composite systems via aggregation. The large-scale system is
decomposed into a number of subsystems. By associating directed
graphs with dynamic systems in an essential way, we define the
relation between system structure and stability in the sense of
Lyapunov. The stability criteria are then expressed in terms of the
stability and system matrices of the subsystems, as well as the
interconnection terms among subsystems, using the concepts of
vector differential inequalities and vector Lyapunov functions. We
then show that the stability of each subsystem together with the
stability of the aggregate model implies connective stability of the
overall system. An example is reported, showing the efficiency of
the proposed technique.
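A sketch of the aggregation step described above, using notation that is standard in the decomposition-aggregation literature (the paper's own symbols may differ):

```latex
% Vector Lyapunov function v = (v_1, ..., v_N)^T, one component per
% subsystem. Along trajectories, each v_i satisfies a differential
% inequality, and the aggregate (comparison) model bounds the whole
% interconnected system:
\[
  \dot{v} \le \bar{W} v, \qquad
  \bar{w}_{ij} =
  \begin{cases}
    -\pi_i + \xi_{ii}, & i = j,\\[2pt]
    \xi_{ij}, & i \ne j,
  \end{cases}
\]
% where -\pi_i bounds the decay rate of the isolated i-th subsystem and
% \xi_{ij} >= 0 bounds the interconnection from subsystem j. Connective
% stability then follows when \bar{W} satisfies the M-matrix conditions,
% i.e. (-1)^k \det \bar{W}_k > 0 for every leading principal minor.
```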
Abstract: This paper presents a study of the parameters affecting
environmental protection in the printing industry. The paper also
compares LCA studies performed within the printing industry in
order to identify common practices, limitations, areas for
improvement, and opportunities for standardization. This comparison
focuses on the data sources and methodologies used in the printing
pollutants register. The presented concepts, methodology and results
contribute to sustainable development management. Furthermore, the
paper analyzes the results of the quantitative identification of
hazardous substances emitted by the printing industry of Novi Sad.
Abstract: Complexity, as a theoretical background, has made it
easier to understand and explain the features and dynamic behavior
of various complex systems. As the common theoretical background
has confirmed, borrowing the terminology for design from the
natural sciences has helped to control and understand urban
complexity. Phenomena like self-organization, evolution and
adaptation are appropriate to describe the formerly inaccessible
characteristics of the complex environment in unpredictable
bottom-up systems. Increased computing capacity has been a key element in
capturing the chaotic nature of these systems.
A paradigm shift in urban planning and architectural design has
forced us to give up the illusion of total control in urban
environment, and consequently to seek for novel methods for
steering the development. New methods using dynamic modeling
have offered a real option for more thorough understanding of
complexity and urban processes. At best new approaches may renew
the design processes so that we get a better grip on the complex
world via more flexible processes, support urban environmental
diversity and respond to our needs beyond basic welfare by liberating
ourselves from the standardized minimalism.
A complex system and its features are as such beyond human
ethics. Self-organization or evolution is neither good nor bad. Their
mechanisms are by nature devoid of reason. They are common in
urban dynamics and natural processes alike. They are features
of a complex system, and they cannot be prevented. Yet their
dynamics can be studied and supported.
The paradigm of complexity and new design approaches has been
criticized for a lack of humanity and morality, but the ethical
implications of scientific or computational design processes have not
been much discussed. It is important to distinguish the (unexciting)
ethics of the theory and tools from the ethics of computer aided
processes based on ethical decisions. Urban planning and architecture
cannot be based on the survival of the fittest; however, the natural
dynamics of the system cannot be impeded on grounds of being
“non-human”.
In this paper, the ethical challenges of using dynamic models
are contemplated in light of a few examples of new architecture,
dynamic urban models, and the literature. It is suggested that ethical
challenges in computational design processes could be reframed
under the concepts of responsibility and transparency.
Abstract: Information is growing in volume; companies are so overloaded with information that they may lose track of what they actually need. Scanning through every lengthy document is time-consuming, and a shorter version containing only the gist is preferable for most information seekers. In this paper, we therefore implement a text summarization system that produces summaries conveying the gist of oil and gas news articles. The summaries are intended to give oil and gas companies important information for monitoring their competitors' behaviour, supporting them in formulating business strategies. The system integrates a statistical approach with three underlying concepts: keyword occurrences, the title of the news article, and the location of the sentence. The generated summaries were compared with human-generated summaries from an oil and gas company, and precision and recall ratios are used to evaluate the accuracy of the generated summaries. Based on the experimental results, the system produces effective summaries with an average recall of 83% at a compression rate of 25%.
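The three-feature sentence scoring described in this abstract can be sketched as follows. This is a minimal illustration with hypothetical equal weighting of the features, not the paper's actual system:

```python
import re
from collections import Counter

def summarize(title, sentences, compression=0.25):
    """Score sentences by keyword frequency, title-word overlap, and
    sentence position; keep the top fraction (hypothetical weighting)."""
    words = [w for s in sentences for w in re.findall(r"[a-z']+", s.lower())]
    freq = Counter(words)
    title_words = set(re.findall(r"[a-z']+", title.lower()))
    scores = []
    for i, s in enumerate(sentences):
        toks = re.findall(r"[a-z']+", s.lower())
        kw = sum(freq[t] for t in toks) / max(len(toks), 1)  # keyword occurrences
        tw = len(title_words & set(toks))                    # title overlap
        loc = 1.0 / (i + 1)                                  # earlier sentences score higher
        scores.append((kw + tw + loc, i))
    keep = max(1, int(len(sentences) * compression))
    chosen = sorted(sorted(scores, reverse=True)[:keep], key=lambda x: x[1])
    return [sentences[i] for _, i in chosen]

def precision_recall(system, reference):
    """Precision/recall of selected sentences against a reference summary."""
    sys_set, ref_set = set(system), set(reference)
    hits = len(sys_set & ref_set)
    return hits / len(sys_set), hits / len(ref_set)
```

At a compression rate of 0.25, a four-sentence article yields a one-sentence summary, which is then scored against the human-written reference exactly as the abstract's evaluation describes.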
Abstract: The growing volume of information on the internet
creates an increasing need for new (semi-)automatic methods for
retrieving documents and ranking them according to their relevance
to the user query. In this paper, after a brief review of ranking
models, a new ontology-based approach for ranking HTML documents
is proposed and evaluated under various circumstances. Our approach
is a combination of conceptual, statistical and linguistic methods.
This combination preserves ranking precision without losing speed.
Our approach exploits natural language processing techniques to
extract phrases from documents and from the query, and to stem
words. An ontology-based conceptual method is then used to annotate
documents and expand the query. To expand a query, the spread
activation algorithm is improved so that the expansion can be done
flexibly and along various aspects. The annotated documents and the
expanded query are processed to compute the relevance degree using
statistical methods. The outstanding features of our approach are:
(1) combining conceptual, statistical and linguistic features of
documents; (2) expanding the query with its related concepts before
comparing it to documents; (3) extracting and using both words and
phrases to compute the relevance degree; (4) improving the spread
activation algorithm to perform the expansion based on a weighted
combination of different conceptual relationships; and (5) allowing
variable document vector dimensions. A ranking system called ORank
has been developed to implement and test the proposed model. The
test results are included at the end of the paper.
Abstract: Ontology matching is a task needed in various applications, for example for comparison or merging purposes. In the literature, many algorithms solving the matching problem can be found, but most of them do not consider instances at all. Mappings are determined by calculating the string similarity of labels, by recognizing linguistic word relations (synonyms, subsumptions, etc.) or by analyzing the (graph) structure. Since instances are often modeled within the ontology, and since the set of instances describes the meaning of the concepts better than their meta-information, instances should definitely be incorporated into the matching process. In this paper, several novel instance-based matching algorithms are presented which enhance the quality of matching results obtained with common concept-based methods. Different kinds of formalisms are used to classify concepts on account of their instances and finally to compare the concepts directly.
Keywords: Instances, Ontology Matching, Semantic Web
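One simple formalism for comparing concepts via their instances is set overlap; the sketch below uses Jaccard similarity as a stand-in for the paper's richer instance-based classifiers:

```python
def jaccard(a, b):
    """Instance-set overlap between two concepts."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def match_concepts(onto1, onto2, threshold=0.5):
    """Propose mappings between concepts (given as dicts mapping concept
    name -> instance set) whose instance sets overlap enough. A toy
    illustration of instance-based matching, not the paper's algorithms."""
    return [(c1, c2, jaccard(i1, i2))
            for c1, i1 in onto1.items()
            for c2, i2 in onto2.items()
            if jaccard(i1, i2) >= threshold]
```

Two concepts with different labels ("Auto" vs. "Car") that share most of their instances are mapped to each other, which a purely label- or structure-based matcher would miss.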
Abstract: TUSAT is a prospective Turkish
communication satellite designed mainly to provide data
communication and broadcasting services through Ku-Band
and C-Band channels. Thermal control is a vital issue in the
satellite design process: all satellite subsystems and
equipment must be maintained within the desired temperature
range from launch to the end of maneuvering life. The main
function of thermal control is to keep the equipment and
the satellite structures within a given temperature range
throughout the various phases and operating modes of the
spacecraft during its lifetime. This paper describes a thermal
control design that uses both passive and active thermal control
concepts. The active thermal control is based on heaters
regulated by software via thermistors. The passive thermal
control, in turn, consists of heat pipes, multilayer insulation
(MLI) blankets, radiators, paints and surface finishes that
maintain the temperature of the overall carrier components
within acceptable limits. The thermal control design is supported
by thermal analysis using thermal mathematical models (TMM).
Abstract: One of the main processes in supply chain
management is supplier selection, whose accurate implementation
can dramatically increase a company's competitiveness.
In the presented article, a model is developed based on the features
of second-tier suppliers, and four scenarios are predicted in order to
help the decision maker (DM) make up his or her mind. In addition,
two tiers of suppliers have been considered as a chain of suppliers.
The proposed approach is then solved by a method combining
concepts of fuzzy set theory (FST) and linear programming (LP),
nourished by real data extracted from an engineering design and
parts supplying company. In the end, the results reveal the
high importance of considering the features of second-tier suppliers
as criteria for selecting the best supplier.
Abstract: Business rules and data warehouse are concepts and
technologies that impact a wide variety of organizational tasks. In
general, each area has evolved independently, impacting application
development and decision-making. Generating knowledge from data
warehouse is a complex process. This paper outlines an approach to
ease the import of information and knowledge from a data warehouse
star schema through an inference class of business rules. The paper
uses the Oracle database to illustrate the working of the
concepts. The star schema structure and the business rules are stored
within a relational database. The approach is explained through a
prototype in Oracle's PL/SQL Server Pages.
Abstract: The wavelet transform provides several important
characteristics which can be used in texture analysis and
classification. In this work, an efficient texture classification method,
which combines concepts from wavelets and co-occurrence matrices,
is presented. A Euclidean distance classifier is used to evaluate the
various classification methods. A comparative study is essential to
determine the ideal method. Using this conjecture, we developed a
novel feature set for texture classification and demonstrate its
effectiveness.
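The Euclidean distance classifier mentioned above is a minimum-distance rule: a texture is assigned to the class whose mean feature vector is closest. A minimal sketch, assuming the wavelet/co-occurrence feature extraction has already produced the vectors:

```python
import math

def euclidean(u, v):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def classify(feature_vector, class_means):
    """Minimum-distance (Euclidean) classifier: assign the texture class
    whose mean feature vector is closest. `class_means` maps class name
    -> mean feature vector; names here are illustrative."""
    return min(class_means, key=lambda c: euclidean(feature_vector, class_means[c]))
```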
Abstract: This paper deals with the design, development and implementation of a temperature sensor using ZigBee. The main aim of the work is to sense the temperature and display the result on an LCD using ZigBee technology. ZigBee operates in the industrial, scientific and medical (ISM) radio bands: 868 MHz in Europe, 915 MHz in the USA and 2.4 GHz in most jurisdictions worldwide. The technology is intended to be simpler and cheaper than other WPANs such as Bluetooth. The most capable ZigBee node type is said to require only about 10% of the software of a typical Bluetooth or wireless internet node, while the simplest nodes require about 2%. However, actual code sizes are much higher, more like 50% of the Bluetooth code size; ZigBee chip vendors have announced 128-kilobyte devices. In the design and development of the temperature sensor, the sensed temperature is amplified and fed to the microcontroller, which is connected to a ZigBee module that transmits the data; at the other end, a second ZigBee module receives the data and displays it on the LCD. The software developed is highly accurate and works at very high speed. The method developed shows the effectiveness of the scheme employed.
Abstract: The primary education system in Indonesia involves the community, recognized as the school committee, taking part in the process of achieving educational quality via school facility performance. The low level of school committee involvement in the education system has become an issue in the development of education and is reflected in the quality of education. This paper discusses a conceptual framework and methodology for assessing the performance of school committees in the management of school facilities in the Batubara district of Indonesia. The concepts of Community-based Facility Management (CbFM) and Logometrix are used as a basis to measure school committee performance in order to address the needs of quality school management. The data will be taken from questionnaires distributed to those who work in and use school facilities across the seven sub-districts of Batubara, Indonesia. The result of this study is expected to provide a guide for evaluating the performance of existing school committees in improving the quality of education in Indonesia.
Abstract: Context awareness is a capability whereby mobile
computing devices can sense their physical environment and adapt
their behavior accordingly. The term context-awareness, in
ubiquitous computing, was introduced by Schilit in 1994 and has
become one of the most exciting concepts in early 21st-century
computing, fueled by recent developments in pervasive computing
(i.e. mobile and ubiquitous computing). These include computing
devices worn by users, embedded devices, smart appliances, sensors
surrounding users and a variety of wireless networking technologies.
Context-aware applications use context information to adapt
interfaces, tailor the set of application-relevant data, increase the
precision of information retrieval, discover services, make the user
interaction implicit, or build smart environments. For example, a
context-aware mobile phone will know that the user is currently in a
meeting room and reject any unimportant calls. One of the major
challenges in providing users with context-aware services lies in
continuously monitoring their contexts based on numerous sensors
connected to the context aware system through wireless
communication. A number of context-aware frameworks based on
sensors have been proposed, but many of them have neglected the
fact that monitoring with sensors imposes heavy workloads on
ubiquitous devices with limited computing power and battery life. In
this paper, we present CALEEF, a lightweight and energy-efficient
context-aware framework for resource-limited ubiquitous devices.
Abstract: Very few studies have examined performance
implications of strategic alliance announcements in the information
technologies industry from a resource-based view. Furthermore, none
of these studies have investigated resource congruence and alliance
motive as potential sources of abnormal firm performance. This paper
extends upon current resource-based literature to discover and explore
linkages between these concepts and the practical performance of
strategic alliances. This study finds that strategic alliance
announcements have provided overall abnormal positive returns, and
that marketing alliances with marketing resource incongruence have
also contributed to significant firm performance.
Abstract: In this paper, we first introduce the concepts of weakly prime and weakly quasi-prime fuzzy left ideals of an ordered semigroup S. Furthermore, we give some characterizations of weakly prime and weakly quasi-prime fuzzy left ideals of an ordered semigroup S by the ordered fuzzy points and fuzzy subsets of S.
Abstract: We consider different types of aggregation operators
such as the heavy ordered weighted averaging (HOWA) operator and
the fuzzy ordered weighted averaging (FOWA) operator. We
introduce a new extension of the OWA operator called the fuzzy
heavy ordered weighted averaging (FHOWA) operator. The main
characteristic of this aggregation operator is that it deals with
uncertain information represented in the form of fuzzy numbers (FN)
in the HOWA operator. We develop the basic concepts of this
operator and study some of its properties. We also develop a wide
range of families of FHOWA operators such as the fuzzy push up
allocation, the fuzzy push down allocation, the fuzzy median
allocation and the fuzzy uniform allocation.
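A rough sketch of the FHOWA idea: arguments are fuzzy numbers aggregated OWA-style, with the "heavy" relaxation letting the weights sum anywhere in [1, n] rather than to exactly 1. The triangular representation and centroid ordering below are illustrative assumptions, not necessarily the paper's exact definition:

```python
def fhowa(fuzzy_args, weights):
    """Fuzzy heavy OWA sketch: arguments are triangular fuzzy numbers
    (l, m, u), reordered descending by centroid before the weighted sum.
    The 'heavy' property: weights may sum anywhere in [1, n]."""
    n = len(fuzzy_args)
    assert 1.0 <= sum(weights) <= n, "heavy OWA weight condition"
    ordered = sorted(fuzzy_args, key=lambda t: sum(t) / 3.0, reverse=True)
    # Aggregate each of the three components of the triangular numbers.
    return tuple(sum(w * t[k] for w, t in zip(weights, ordered))
                 for k in range(3))
```

With weights summing to 1 this reduces to an ordinary OWA over fuzzy numbers; letting the sum grow toward n moves the operator toward a total (rather than averaged) aggregation.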
Abstract: Tacit knowledge has been one of the most discussed
and contradictory concepts in the field of knowledge management
since the mid-1990s. The concept is used relatively vaguely to refer
to any type of information that is difficult to articulate, which has led
to discussions about the original meaning of the concept (adopted
from Polanyi's philosophy) and the nature of tacit knowing. It is
proposed that the subject should be approached from the perspective
of cognitive science in order to connect tacit knowledge to
empirically studied cognitive phenomena. Some of the most
important examples of tacit knowing presented by Polanyi are
analyzed in order to trace the cognitive mechanisms of tacit knowing
and to promote a better understanding of the nature of tacit knowledge.
The cognitive approach to Polanyi's theory reveals that the
tacit/explicit typology of knowledge often presented in the
knowledge management literature is not only artificial but an
approach totally opposite to Polanyi's thinking.
Abstract: To fight the economic crisis, the French
Government, like many others in Europe, has decided to give a boost
to high-speed line projects. This paper explores the implementation
and decision-making process of TGV projects and their evolution,
especially since the Mediterranean TGV line. This project was
probably the most controversial, but paradoxically it represents today
a huge success for all the actors involved.
What lessons can we learn from this experience? How can we
evaluate the impact of this project on TGV line planning? How can
we characterize this implementation and decision-making process
with regard to the sustainability challenges?
The construction of the Mediterranean TGV line was the occasion
for several innovations: introducing more dialogue into the
decision-making process, taking the environment into account, and
introducing new project management and technological innovations.
That is why this project appears today as an example in terms of the
integration of sustainable development.
In this paper we examine the different kinds of innovations
developed in this project, using concepts from the sociology of
innovation to understand how these solutions emerged in a
controversial situation. We then analyze the lessons drawn from this
decision-making process (both immediately and a posteriori) and the
way in which procedures evolved: the creation of new tools and
devices (public consultation, project management...).
Finally, we try to highlight the impact of this evolution on the
governance of TGV projects. In particular, new methods of
implementation and financing involve a reconfiguration of the
system of actors. The aim of this paper is to define the impact of
this reconfiguration on negotiations between stakeholders.
Abstract: This paper is mainly concerned with the application of
a novel technique of data interpretation for classifying measurements
of plasma columns in Tokamak reactors for nuclear fusion
applications. The proposed method exploits several concepts derived
from soft computing theory. In particular, Artificial Neural Networks
and Multi-Class Support Vector Machines have been exploited to
classify magnetic variables useful to determine shape and position of
the plasma with a reduced computational complexity. The proposed
technique is used to analyze simulated databases of plasma equilibria
based on ITER geometry configuration. As well as demonstrating the
successful recovery of scalar equilibrium parameters, we show that
the technique can yield practical advantages compared with earlier
methods.