Abstract: This paper proposes a new solution to the string matching problem. The solution constructs an inverted list representing the string pattern to be searched for, then uses a new algorithm to process an input string in a single pass. The preprocessing phase has O(m) time complexity and O(1) space complexity, where m is the length of the pattern. The searching phase takes O(m+α) time in the average case, O(n/m) in the best case, and O(n) in the worst case, where α is the number of comparisons leading to a mismatch and n is the length of the input text.
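As an illustration of the general idea only (not the paper's exact algorithm), the Python sketch below builds an inverted list mapping each pattern character to its positions, then scans the text in a single left-to-right pass while extending live partial matches; all names are ours.

    from collections import defaultdict

    def build_inverted_list(pattern):
        # Preprocessing: map each pattern character to its positions, O(m) time.
        inv = defaultdict(list)
        for i, ch in enumerate(pattern):
            inv[ch].append(i)
        return inv

    def single_pass_search(text, pattern):
        # Scan the text once, extending live partial matches character by character.
        inv = build_inverted_list(pattern)
        m = len(pattern)
        active = set()                    # pattern offsets of live partial matches
        hits = []
        for j, ch in enumerate(text):
            nxt = {k + 1 for k in active if pattern[k] == ch}
            if 0 in inv.get(ch, ()):      # a new match may start at this position
                nxt.add(1)
            if m in nxt:                  # a full match ends at position j
                hits.append(j - m + 1)
                nxt.discard(m)
            active = nxt
        return hits

    print(single_pass_search("abcabc", "abc"))   # [0, 3]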
Abstract: The paper presents an approach for handling uncertain
information in deductive databases using multivalued logics. Uncertainty
means that database facts may be assigned logical values other
than the conventional ones, true and false. The logical values represent
various degrees of truth, which may be combined and propagated
by applying the database rules. A corresponding multivalued database
semantics is defined. We show that it extends successful conventional
semantics, such as the well-founded semantics, and has polynomial time
data complexity.
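The paper's exact multivalued semantics is not reproduced here; as a minimal sketch, assuming truth degrees in [0, 1] combined with min across a rule body and max across alternative derivations, a naive bottom-up fixpoint evaluation might look like this:

    # Facts carry truth degrees in [0, 1]; a rule propagates the minimum of its
    # body degrees to its head, and alternative derivations combine by maximum.
    facts = {"a": 0.9, "b": 0.6}
    rules = [("c", ["a", "b"]),   # c <- a, b
             ("d", ["c"])]        # d <- c

    changed = True
    while changed:                # naive fixpoint iteration
        changed = False
        for head, body in rules:
            degree = min(facts.get(atom, 0.0) for atom in body)
            if degree > facts.get(head, 0.0):
                facts[head] = degree
                changed = True

    print(facts)   # {'a': 0.9, 'b': 0.6, 'c': 0.6, 'd': 0.6}

Each iteration can only raise degrees, which is one intuition for the polynomial-time data complexity the abstract claims.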
Abstract: Knowledge is attributed to humans, whose problem-solving
behavior is subjective and complex. In today's knowledge
economy, the need to manage knowledge produced by a community
of actors cannot be overemphasized. This is due to the fact that
actors possess some level of tacit knowledge which is generally
difficult to articulate. Problem-solving requires searching and sharing
of knowledge among a group of actors in a particular context.
Knowledge expressed within the context of a problem resolution
must be capitalized for future reuse. In this paper, an approach that
permits dynamic capitalization of relevant and reliable actors'
knowledge in solving a decision problem following the Economic
Intelligence process is proposed. A knowledge annotation method and
temporal attributes are used for handling the complexity in the
communication among actors and in contextualizing expressed
knowledge. A prototype is built to demonstrate the functionalities of
a collaborative Knowledge Management system based on this
approach. It was tested with sample cases and the results showed that
dynamic capitalization leads to knowledge validation, hence
increasing the reliability of captured knowledge for reuse. The system
can be adapted to various domains.
Abstract: The direct implementation of interleaver functions
in WiMAX is not hardware efficient due to the presence of complex
functions. The conventional method, i.e. using memories for
storing the permutation tables, is also silicon-consuming. This work
presents a 2-D transformation of the WiMAX channel interleaver
functions which reduces the overall hardware complexity of
computing the interleaver addresses on the fly. A fully reconfigurable
architecture for address generation in the WiMAX
channel interleaver is presented, which consumes 1.1 k-gates in
total. It can be configured for any block size and any modulation
scheme in WiMAX. The presented architecture can run at a
frequency of 200 MHz, thus fully supporting the high bandwidth
requirements of WiMAX.
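For context, the addresses being generated follow the two-step permutation commonly cited for the IEEE 802.16 channel interleaver; a Python sketch of on-the-fly generation is given below (the constants and formulas should be checked against the standard, and this is not the paper's 2-D hardware architecture):

    def interleaver_addresses(n_cbps, n_cpc):
        # n_cbps: coded bits per block; n_cpc: coded bits per subcarrier
        # (1 BPSK, 2 QPSK, 4 16-QAM, 6 64-QAM).
        d = 12                      # column count of the first permutation
        s = max(n_cpc // 2, 1)
        for k in range(n_cbps):
            m = (n_cbps // d) * (k % d) + k // d                       # step 1
            yield s * (m // s) + (m + n_cbps - (d * m) // n_cbps) % s  # step 2

    print(list(interleaver_addresses(96, 2))[:6])   # first addresses for QPSK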
Abstract: The objective of our work is to develop a new approach for discovering knowledge from a large mass of data; the result of applying this approach will be an expert system serving as a diagnostic tool for a phenomenon related to a huge information system. We first recall the general problem of learning Bayesian network structure from data and suggest a solution for optimizing the complexity by using organizational and optimization methods of data. Afterwards, we propose a new heuristic for learning Multi-Entity Bayesian Network structures. We have applied our approach to biological facts concerning hereditary complex illnesses, where the biological literature identifies the variables responsible for those diseases. Finally, we conclude on the limits reached by this work.
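The proposed heuristic itself is not detailed in the abstract; for orientation, score-based structure learning typically optimizes a decomposable score such as BIC. A hedged sketch of the local BIC term for discrete data follows (our own illustration, with hypothetical helper names):

    import math
    import pandas as pd

    def bic_local(df, child, parents):
        # BIC contribution of one node given a candidate parent set
        # (discrete data): log-likelihood minus a complexity penalty.
        n = len(df)
        if parents:
            joint = df.groupby(list(parents) + [child]).size()
            marg = df.groupby(list(parents)).size()
            ll = sum(c * math.log(c / marg[k[:-1] if len(parents) > 1 else k[0]])
                     for k, c in joint.items())
        else:
            ll = sum(c * math.log(c / n) for c in df.groupby(child).size())
        q = math.prod(df[p].nunique() for p in parents) if parents else 1
        penalty = 0.5 * math.log(n) * (df[child].nunique() - 1) * q
        return ll - penalty

A greedy search would then add, remove or reverse edges to maximize the sum of these local scores over all nodes.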
Abstract: The modern telecommunication industry demands
higher capacity networks with high data rates. Orthogonal frequency
division multiplexing (OFDM) is a promising technique for high data
rate wireless communications at reasonable complexity in wireless
channels. OFDM has been adopted for many types of wireless
systems like wireless local area networks such as IEEE 802.11a, and
digital audio/video broadcasting (DAB/DVB). The proposed research
focuses on a concatenated coding scheme that improves the
performance of OFDM-based wireless communications. It uses a
Redundant Residue Number System (RRNS) code as the outer code
and a convolutional code as the inner code. Here, a direct conversion
of the analog signal to the residue domain is performed to reduce the
conversion complexity, using a sigma-delta based parallel analog-to-residue
converter. The bit error rate (BER) performances of the proposed
system under different channel conditions are investigated. These
include the effect of additive white Gaussian noise (AWGN),
multipath delay spread, peak power clipping and frame start
synchronization error. The simulation results show that the proposed
RRNS-Convolutional concatenated coding (RCCC) scheme provides
significant improvement in the system performance by exploiting the
inherent properties of RRNS.
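To make the RRNS side concrete, here is a minimal sketch (ours, not the paper's converter) of residue encoding and Chinese Remainder Theorem decoding, where the extra moduli provide the redundancy used for error detection:

    from math import prod

    def to_residues(x, moduli):
        # Forward conversion: represent x by its residue modulo each modulus.
        return [x % m for m in moduli]

    def from_residues(residues, moduli):
        # CRT reconstruction (moduli pairwise coprime).
        M = prod(moduli)
        x = 0
        for r, m in zip(residues, moduli):
            Mi = M // m
            x += r * Mi * pow(Mi, -1, m)   # pow(..., -1, m): modular inverse
        return x % M

    # Information moduli (3, 5, 7) plus redundant moduli (11, 13): legitimate
    # values live in [0, 105); a decoded value outside that range flags an error.
    moduli = [3, 5, 7, 11, 13]
    print(from_residues(to_residues(52, moduli), moduli))   # 52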
Abstract: Most of the Decision Support Systems (DSS) constructed for waste
management (WM) are not widely marketed and lack
practical applications. This is due to the number of variables and the
complexity of the mathematical models which include the
assumptions and constraints required in decision making. The
approach taken by many researchers in DSS modelling is to isolate a
few key factors that have a significant influence on the DSS. This
segmented approach does not provide a thorough understanding of
the complex relationships of the many elements involved. The
various elements in constructing the DSS must be integrated and
optimized in order to produce a viable model that is marketable and
has practical application. The DSS model used in assisting decision
makers should be integrated with GIS, be able to give robust predictions
despite the inherent uncertainties of waste generation and the plethora
of waste characteristics, and give optimal allocation of the waste stream
for recycling, incineration, landfill and composting.
Abstract: Using a vision-based solution in intelligent vehicle applications often requires large memory to handle the video stream and image processing, which increases the complexity of hardware and software. In this paper, we present an FPGA implementation of a vision-based lane departure warning system. For each video frame, line gradients are estimated and the lane marks are found. By analysing the positions of the lane marks, departure of the vehicle is detected in time. This idea has been implemented in a Xilinx Spartan6 FPGA. The lane departure warning system uses 39% of the logic resources and none of the memory of the device. The average availability is 92.5%. The frame rate is more than 30 frames per second (fps).
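A much-simplified software analogue of the gradient-based lane-mark step (our NumPy sketch; the paper's FPGA pipeline and thresholds are not reproduced): compute Sobel gradient magnitudes, then locate the strongest edges left and right of the image centre in the bottom rows and report the lane-centre offset.

    import numpy as np

    def gradient_magnitude(gray):
        # 3x3 Sobel gradients on a 2-D grayscale array (valid region only).
        a = gray.astype(float)
        gx = ((a[:-2, 2:] + 2 * a[1:-1, 2:] + a[2:, 2:])
              - (a[:-2, :-2] + 2 * a[1:-1, :-2] + a[2:, :-2]))
        gy = ((a[2:, :-2] + 2 * a[2:, 1:-1] + a[2:, 2:])
              - (a[:-2, :-2] + 2 * a[:-2, 1:-1] + a[:-2, 2:]))
        return np.hypot(gx, gy)

    def departure_offset(gray, band=20):
        # Column-wise edge energy over the bottom rows; the midpoint of the
        # strongest left/right edges approximates the lane centre.
        strip = gradient_magnitude(gray)[-band:, :].sum(axis=0)
        mid = strip.size // 2
        left = int(np.argmax(strip[:mid]))
        right = mid + int(np.argmax(strip[mid:]))
        return (left + right) / 2.0 - mid   # large |offset| suggests departure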
Abstract: This paper presents a very simple and efficient
algorithm for codebook search, which reduces a great deal of
computation as compared to the full codebook search. The algorithm
is based on a sorting and centroid technique for the search. The results
table shows the effectiveness of the proposed algorithm in terms of
computational complexity. In this paper we also introduce a new
performance parameter, named Average Fractional Change in Pixel
Value, as we feel it gives a better understanding of the closeness of
the coded image to the original, since it is related to perception. This
new performance parameter takes into consideration the average
fractional change in each pixel value.
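One plausible reading of the new parameter, as a hedged sketch (the paper's exact formula is not given in the abstract): the mean over all pixels of |x − y| / x between the original image x and the coded image y.

    import numpy as np

    def avg_fractional_change(original, coded, eps=1e-9):
        # Mean of |x - y| / x over all pixels; eps guards zero-valued pixels.
        x = original.astype(float)
        y = coded.astype(float)
        return float(np.mean(np.abs(x - y) / np.maximum(x, eps)))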
Abstract: Injection molding is a very complicated process to
monitor and control. With its high complexity and many process
parameters, the optimization of these systems is a very challenging
problem. To meet the requirements and costs demanded by the
market, there has been intense development and research with the
aim of keeping the process under control. This paper outlines the
latest advances in the algorithms needed for plastic injection process
control and monitoring, together with a flexible data acquisition system,
the main topic of this paper, that allows rapid implementation of complex
algorithms to assess their correct performance and can be integrated
into the quality control process. Finally, to demonstrate the performance
achieved by this combination, a real case of use is presented.
Abstract: It is well recognized that one feature that makes a
successful company is its ability to align its business goals with its information and communication technology platform.
Enterprise Resource Planning (ERP) systems contribute to better performance by integrating various business functions and
providing support for information flows. However, the complexity of these
technological systems is known to prevent business users from exploiting Enterprise Resource Planning (ERP) systems efficiently.
This paper aims to investigate the role of training in improving the
usage of ERP systems. To this end, we designed a survey instrument
administered to employees of a Norwegian multinational global provider of
technology solutions. Based on the analysis of the collected data, we have delineated a training model that could be of high relevance for
both researchers and practitioners as a step towards a better
understanding of ERP system implementation.
Abstract: Knowledge development in companies relies on
knowledge-intensive business processes, which are characterized by
a high complexity in their execution, weak structuring,
communication-oriented tasks and high decision autonomy, and often the need for creativity and innovation. A foundation of knowledge development is provided, which is based on a new conception of
knowledge and knowledge dynamics. This conception consists of a three-dimensional model of knowledge with types, kinds and qualities. Built on this knowledge conception, knowledge dynamics is
modeled with the help of general knowledge conversions between
knowledge assets. Here knowledge dynamics is understood to cover
all of acquisition, conversion, transfer, development and usage of
knowledge. Through this conception we gain a sound basis for
knowledge management and development in an enterprise. Especially
the type dimension of knowledge, which categorizes it according to
its internality and externality with respect to the human being, is crucial for enterprise knowledge management and development,
because knowledge should be made available by converting it to
more external types.
Built on this conception, a modeling approach for knowledge-intensive
business processes is introduced, be they human-driven, e-driven or task-driven processes. As an example of this approach, a model of the creative activity for the renewal planning of
a product is given.
Abstract: The highly nonlinear characteristics of drying
processes have prompted researchers to seek new nonlinear control
solutions. However, the relation between implementation
complexity, on-line processing complexity, control structure reliability
and controller performance is not well established. The
present paper proposes high performance nonlinear fuzzy controllers
for real-time operation of a drying machine, developed with
a consistent match among those issues. A PCI-6025E data
acquisition device from National Instruments® was used, and the
control system was fully designed with MATLAB® / SIMULINK
language. Drying parameters, namely relative humidity and
temperature, were controlled through MIMO hybrid Bang-bang+PI
(BPI) and four-dimensional Fuzzy Logic (FLC) real-time
controllers to perform drying tests on biological materials. The
performance of the drying strategies was compared through several
criteria, which are reported without controller retuning. The controllers'
performance analysis showed much better performance of the FLC
than the BPI controller. The absolute errors were lower than 8.85% for
the Fuzzy Logic Controller, about three times lower than
the experimental results with BPI control.
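For illustration only, a minimal hybrid Bang-bang+PI law of the kind the abstract names (the gains, switching band and actuator range are our assumptions, not the paper's tuning):

    class BangBangPI:
        # Bang-bang far from the set-point, PI inside a switching band.
        def __init__(self, kp, ki, band, dt):
            self.kp, self.ki, self.band, self.dt = kp, ki, band, dt
            self.integral = 0.0

        def update(self, setpoint, measurement):
            e = setpoint - measurement
            if abs(e) > self.band:           # bang-bang region
                self.integral = 0.0          # avoid integrator wind-up
                return 1.0 if e > 0 else 0.0
            self.integral += e * self.dt     # PI region
            u = self.kp * e + self.ki * self.integral
            return min(max(u, 0.0), 1.0)     # clamp to actuator range [0, 1]

    ctrl = BangBangPI(kp=0.8, ki=0.05, band=5.0, dt=1.0)
    print(ctrl.update(setpoint=60.0, measurement=40.0))   # 1.0 (full drive)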
Abstract: The performance and complexity of QoS routing depend on the complex interaction between a large set of parameters. This paper investigates the scaling properties of source-directed link-state routing in large core networks. The simulation results show that the routing algorithm, network topology, and link cost function each have a significant impact on the probability of successfully routing new connections. The experiments confirm and extend the findings of other studies, and also lend new insight into designing efficient quality-of-service routing policies in large networks.
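To fix ideas, source-directed link-state QoS routing of this kind typically prunes links with insufficient residual bandwidth and runs a shortest-path search under a load-sensitive link cost; a hedged Python sketch follows (the graph format and cost function are our choices, not the paper's):

    import heapq

    def qos_route(graph, src, dst, demand):
        # graph: {u: [(v, residual_bw, base_cost), ...]}; links that cannot
        # carry the demand are pruned, and cost grows as residual bandwidth shrinks.
        dist, prev = {src: 0.0}, {}
        heap = [(0.0, src)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == dst:
                break
            if d > dist.get(u, float("inf")):
                continue
            for v, bw, cost in graph.get(u, ()):
                if bw < demand:
                    continue                       # pruning step
                nd = d + cost / bw                 # load-sensitive link cost
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))
        if dst not in dist:
            return None                            # connection blocked
        path = [dst]
        while path[-1] != src:
            path.append(prev[path[-1]])
        return path[::-1]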
Abstract: This paper presents a new true RMS-to-DC converter
circuit based on a square-root-domain squarer/divider. The circuit is
designed by employing an up-down translinear loop and MOSFET
transistors that operate in the strong-inversion saturation
region. The converter offers the advantages of two-quadrant input current,
low circuit complexity, low supply voltage (1.2 V) and immunity
from the body effect. The circuit has been simulated with HSPICE.
The simulation results conform to the theoretical analysis
and show the benefits of the proposed circuit.
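The underlying principle, in the standard implicit-computation form (a sketch of the generic square-root-domain RMS-to-DC loop, not the paper's exact circuit): the squarer/divider output is low-pass filtered and fed back as the divisor, so the loop settles at the true RMS value,

    I_{\mathrm{out}} \;=\; \overline{\left( \frac{I_{\mathrm{in}}^{2}}{I_{\mathrm{out}}} \right)}
    \quad\Longrightarrow\quad
    I_{\mathrm{out}} \;=\; \sqrt{\overline{I_{\mathrm{in}}^{2}}} \;=\; I_{\mathrm{rms}},

where the overbar denotes the low-pass (averaging) operation.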
Abstract: We have devised a thermal carpet cloak theoretically
and implemented it in silicon using a layered metamaterial. The layered
metamaterial is composed of single-crystalline silicon and its phononic
crystal. The design is based on a coordinate transformation. We
demonstrate the result with numerical simulation. Great cloaking
performance is achieved, as a thermal insulator is well hidden under the
thermal carpet cloak. We also show that the thermal carpet cloak can
even out the temperature on an irregular surface. Using the thermal carpet
cloak to manipulate heat conduction is effective because of its low
complexity.
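The coordinate-transformation design rests on the form invariance of the heat conduction equation: under a mapping x → x'(x) with Jacobian A, the required conductivity is the standard transformed tensor

    \kappa'(x') \;=\; \frac{A\,\kappa\,A^{\mathsf{T}}}{\det A},
    \qquad A_{ij} \;=\; \frac{\partial x'_i}{\partial x_j},

which the layered silicon/phononic-crystal metamaterial approximates (the specific carpet-cloak mapping used in the paper is not reproduced here).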
Abstract: Sickness absence represents a major economic and
social issue. Analysis of sick leave data is a recurrent challenge to analysts because of the complexity of the data structure, which is
often time dependent, highly skewed and clumped at zero. Ignoring these features when making statistical inference is likely to be inefficient
and misguided. Traditional approaches do not address these problems. In this study, we discuss model methodologies in terms of statistical techniques for addressing the difficulties with sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism using
a large registration dataset as a working example, available from the Helsinki Health Study for municipal employees in Finland during the period 1990-1999. We present a comparative study on model
selection and a critical analysis of the temporal trends, the occurrence
and degree of long-term sickness absences among municipal employees. The strengths of this working example include the large
sample size over a long follow-up period, providing strong evidence in support of the new model. Our main goal is to propose a way to
select an appropriate model and to introduce a new methodology for analysing sickness absence data, as well as to demonstrate the model's
applicability to complicated longitudinal data.
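The new methodology itself is not specified in the abstract; as one standard way to handle zero-clumped, skewed count data of this kind, here is a hedged sketch with a zero-inflated negative binomial model (synthetic stand-in data, statsmodels):

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

    # Synthetic stand-in for yearly sick-leave day counts, clumped at zero.
    rng = np.random.default_rng(0)
    n = 500
    age = rng.uniform(25, 60, n)
    X = sm.add_constant(age)
    y = np.where(rng.random(n) < 0.4, 0, rng.negative_binomial(2, 0.3, n))

    # The count part and the zero-inflation part each get their own covariates.
    model = ZeroInflatedNegativeBinomialP(y, X, exog_infl=X)
    print(model.fit(maxiter=200, disp=False).summary())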
Abstract: Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper we study the novelty of discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses objective and subjective measures to quantify the novelty of discovered rules in terms of their deviations from known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with some public datasets. The experimental results are quite promising.
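As a hedged illustration of quantifying deviation between rules (our own measure, not necessarily the paper's): treat each rule as an (antecedent, consequent) pair of item sets and score deviation by Jaccard distance; novelty is then the distance to the closest known rule.

    def deviation(rule_a, rule_b):
        # Each rule is (antecedent_set, consequent_set); 0 = identical, 1 = disjoint.
        def jaccard_distance(s, t):
            return 1 - len(s & t) / len(s | t) if s | t else 0.0
        return 0.5 * (jaccard_distance(rule_a[0], rule_b[0])
                      + jaccard_distance(rule_a[1], rule_b[1]))

    def novelty(rule, known_rules):
        # Novelty = deviation from the closest known rule; a user-specified
        # threshold on this value would categorize the discovered rules.
        return min(deviation(rule, k) for k in known_rules)

    known = [({"bread"}, {"butter"}), ({"beer"}, {"chips"})]
    print(novelty(({"bread", "milk"}, {"butter"}), known))   # 0.25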
Abstract: Metal stamping die design is a complex, experience-based
and time-consuming task. Various artificial intelligence (AI)
techniques are being used by worldwide researchers for stamping die
design to reduce complexity, dependence on human expertise and
time taken in design process as well as to improve design efficiency.
In this paper a comprehensive review of applications of AI
techniques in manufacturability evaluation of sheet metal parts, die
design and process planning of metal stamping die is presented.
Further, the salient features of major research work published in the
area of metal stamping are presented in tabular form and the scope of
future research work is identified.
Abstract: Software reliability prediction provides a great opportunity to measure the software failure rate at any point throughout system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. In this article we focus on a software reliability model, assuming that there is time redundancy, whose value (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures, but also failures that can be treated as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence consisting of a random number of basic blocks. We consider the system software unreliable; the time between adjacent failures has an exponential distribution.
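In the notation these assumptions suggest, with failure rate λ the inter-failure time has DF F(t) = 1 − e^{−λt}, and the transmission time T of a sequence of a random number N of basic blocks has the general compound form

    F(t) = 1 - e^{-\lambda t}, \qquad
    F_T(t) \;=\; \sum_{n=1}^{\infty} \Pr(N = n)\, F_B^{*n}(t),

where F_B is the DF of a single block's transmission time (including any retransmissions) and F_B^{*n} its n-fold convolution; the paper's specifics, such as self-repairing failures and the time-redundancy parameter, would enter through F_B.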