Abstract: Finding minimal forms of logical functions has important applications in the design of logic circuits. Many different methods solve this task but, frequently, they are not suitable for computer implementation. We briefly summarise the well-known Quine-McCluskey method, which provides a deterministic computing procedure and thus can be simply implemented but, even for simple examples, does not guarantee an optimal solution. Since the Petrick extension of the Quine-McCluskey method does not give a generally usable method for finding an optimum for logical functions with a high number of variables, we focus on the interpretation of the result of the Quine-McCluskey method and show that it represents a set covering problem, which, unfortunately, is an NP-hard combinatorial problem. Therefore it must be solved by heuristic or approximation methods. We propose an approach based on genetic algorithms and show suitable parameter settings.
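The covering of minterms by prime implicants described above is a set covering instance; a minimal genetic-algorithm sketch for it (illustrative encoding and parameters, not the paper's settings) might look like:

```python
import random

def ga_set_cover(universe, subsets, pop_size=40, generations=200, seed=0):
    """Tiny genetic-algorithm sketch for the set covering problem:
    pick a small selection of `subsets` whose union equals `universe`.
    A chromosome is a bit list; bit i == 1 means subset i is selected."""
    rng = random.Random(seed)
    n = len(subsets)

    def fitness(chrom):
        covered = set()
        for i, bit in enumerate(chrom):
            if bit:
                covered |= subsets[i]
        uncovered = len(universe - covered)
        # Heavy penalty for uncovered elements, light cost per chosen subset.
        return uncovered * len(subsets) + sum(chrom)

    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n) if n > 1 else 0
            child = a[:cut] + b[cut:]             # one-point crossover
            j = rng.randrange(n)
            child[j] ^= 1                         # point mutation
            children.append(child)
        pop = survivors + children
    best = min(pop, key=fitness)
    return [i for i, bit in enumerate(best) if bit]
```

In the Quine-McCluskey setting, `universe` would be the set of minterms and each entry of `subsets` the minterms covered by one prime implicant.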
Abstract: All available algorithms for blind estimation, namely the constant modulus algorithm (CMA) and the decision-directed algorithm (DDA/DFE), suffer from the problem of convergence to local minima. Also, if the channel drifts considerably, any DDA loses track of the channel, so their usage is limited in varying channel conditions. The primary limitation in such cases is the requirement of certain overhead bits in the transmit framework, which leads to wasteful use of the bandwidth. Such arrangements also fail to use channel state information (CSI), which is an important aid in improving the quality of reception. In this work, the main objective is to reduce the overhead imposed by the pilot symbols, which in effect degrades the system throughput. We also formulate an arrangement based on certain dynamic Artificial Neural Network (ANN) topologies which not only contributes towards lowering the overhead but also facilitates the use of the CSI. A 2×2 Multiple Input Multiple Output (MIMO) system is simulated and the performance variation with different channel estimation schemes is evaluated. A new semi-blind approach based on dynamic ANN is proposed for channel tracking in varying channel conditions, and its performance is compared with perfectly known CSI and least squares (LS) based estimation.
Abstract: In molecular biology, microarray technology is widely and successfully utilized to efficiently measure gene activity. When working with less studied organisms, methods to design custom-made microarray probes are available. One design criterion is to select probes with minimal melting-temperature variance, thus ensuring similar hybridization properties. If the microarray application focuses on the investigation of metabolic pathways, it is not necessary to cover the whole genome; it is more efficient to cover each metabolic pathway with a limited number of genes. First, an approach is presented which minimizes the overall melting-temperature variance of the selected probes for all genes of interest. Second, the approach is extended to include the additional constraint of covering all pathways with a limited number of genes while minimizing the overall variance. The new optimization problem is solved by a bottom-up programming approach which reduces the complexity to make it computationally feasible. As an example, the new method is applied to the selection of microarray probes covering all fungal secondary metabolite gene clusters of Aspergillus terreus.
Abstract: The load frequency control problem of power systems has attracted a lot of attention from engineers and researchers over the years. Increasing and quickly changing load demand, coupled with the inclusion of more generators with high variability (solar and wind power generators) on the network, is making power systems more difficult to regulate. Frequency changes are unavoidable, but regulatory authorities require that these changes remain within a certain bound. Engineers must perform the tricky task of adjusting the control system to maintain the frequency within tolerated bounds. It is well known that, to minimize frequency variations, a large proportional feedback gain (speed regulation constant) is desirable. However, this improvement in performance using proportional feedback comes at the expense of a reduced stability margin and also allows some steady-state error. A conventional PI controller is then included as a secondary control loop to drive the steady-state error to zero. In this paper, we propose a robust controller to replace the conventional PI controller which guarantees performance and stability of the power system over the range of variation of the speed regulation constant. Simulation results are shown to validate the superiority of the proposed approach on a simple single-area power system model.
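The two-loop structure described above (droop-based primary control plus a secondary PI loop removing the steady-state error) can be sketched as a toy single-area simulation; all parameter values are illustrative assumptions, not the paper's model:

```python
def simulate_single_area(Kp=0.5, Ki=0.3, R=0.05, M=10.0, D=1.0,
                         dPL=0.1, dt=0.01, T=60.0):
    """Euler simulation of a toy single-area frequency loop:
    primary droop (gain 1/R) plus a secondary PI controller that
    drives the steady-state frequency deviation toward zero.
    Returns the frequency deviation df at time T."""
    df, integ = 0.0, 0.0
    for _ in range(int(T / dt)):
        integ += -df * dt                    # PI integrator on the error (-df)
        dPc = Kp * (-df) + Ki * integ        # secondary (PI) control action
        dPm = dPc - df / R                   # mechanical power: PI + droop
        df += dt * (dPm - dPL - D * df) / M  # swing-equation dynamics
    return df
```

Setting `Ki=0` disables the integral action and leaves the residual steady-state error that the abstract attributes to purely proportional feedback.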
Abstract: This paper concerns a formal model to support the
simulation of agent societies where institutional roles and
institutional links can be specified operationally. That is, this paper
concerns institutional roles that can be specified in terms of a minimal behavioral capability that an agent should have in order to
enact that role and, thus, to perform the set of institutional functions that role is responsible for. Correspondingly, the paper concerns
institutional links that can be specified in terms of a minimal
interactional capability that two agents should have in order to, while
enacting the two institutional roles that are linked by that institutional
link, perform for each other the institutional functions supported by
that institutional link. The paper proposes a cognitive architecture
approach to institutional roles and institutional links, that is, an approach in which an institutional role is seen as an abstract cognitive
architecture that should be implemented by any concrete agent (or set of concrete agents) that enacts the institutional role, and in which
institutional links are seen as interactions between the two abstract
cognitive agents that model the two linked institutional roles. We
introduce a cognitive architecture for such purpose, called the
Institutional BCC (IBCC) model, which lifts Yoav Shoham's BCC
(Beliefs-Capabilities-Commitments) agent architecture to social
contexts. We show how the resulting model can be taken as a means
for a cognitive architecture account of institutional roles and
institutional links of agent societies. Finally, we present an example
of a generic scheme for certain fragments of the social organization
of agent societies, where institutional roles and institutional links are
given in terms of the model.
Abstract: In conventional reliability assessment, the reliability data of system components are treated as crisp values. The collected data have some uncertainties due to errors by human beings/machines or other sources. These uncertainty factors limit the understanding of system component failure because the data are incomplete. In these situations, we need to generalize classical methods to a fuzzy environment for studying and analyzing the systems of interest. Fuzzy set theory has been proposed to handle such vagueness by generalizing the notion of membership in a set. Essentially, in a Fuzzy Set (FS) each element is associated with a point value selected from the unit interval [0, 1], which is termed the grade of membership in the set. A Vague Set (VS), as well as an Intuitionistic Fuzzy Set (IFS), is a further generalization of an FS. Instead of the point-based membership used in an FS, interval-based membership is used in a VS. The interval-based membership in a VS is more expressive in capturing the vagueness of data. In the present paper, vague set theory coupled with the conventional Lambda-Tau method is presented for the reliability analysis of repairable systems. The methodology uses Petri nets (PN) to model the system instead of a fault tree because it allows efficient simultaneous generation of minimal cut and path sets. The presented method is illustrated with the press unit of a paper mill.
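The interval-based membership that distinguishes a vague set from an ordinary fuzzy set can be sketched as follows (class name and API are illustrative, not from the paper):

```python
class VagueElement:
    """Sketch of interval-based membership in a vague set: an element
    carries a truth-membership t and a false-membership f with
    t + f <= 1, so its grade of membership lies in [t, 1 - f]."""

    def __init__(self, t, f):
        assert 0.0 <= t <= 1.0 and 0.0 <= f <= 1.0 and t + f <= 1.0
        self.t = t        # evidence for membership
        self.f = f        # evidence against membership

    def interval(self):
        # The membership interval [t, 1 - f] replacing an FS point value.
        return (self.t, 1.0 - self.f)

    def hesitancy(self):
        # Width of the interval: the unresolved (vague) part.
        return 1.0 - self.t - self.f
```

A crisp fuzzy grade is the special case `f == 1 - t`, where the interval collapses to a point and the hesitancy is zero.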
Abstract: Developing a stable early warning system (EWS)
model capable of giving an accurate prediction is a challenging
task. This paper introduces the k-nearest neighbour (k-NN) method,
which has never before been applied to predicting currency crises,
with the aim of increasing the prediction accuracy. The performance
of the proposed k-NN depends on the choice of distance; in our
analysis we consider the Euclidean and Manhattan distances. For
comparison, we employ three other methods
which are logistic regression analysis (logit), back-propagation neural
network (NN) and sequential minimal optimization (SMO). The
analysis using datasets from 8 countries and 13 macro-economic
indicators for each country shows that the proposed k-NN method
with k = 4 and Manhattan distance performs better than the other
methods.
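The distance choice compared above can be sketched with a minimal k-NN classifier (illustrative code with a hypothetical toy dataset, not the study's implementation):

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=4, metric="manhattan"):
    """Minimal k-NN classifier sketch: rank training points by the
    chosen distance (Euclidean or Manhattan) and take a majority
    vote among the k nearest labels."""
    def dist(a, b):
        if metric == "manhattan":
            return sum(abs(ai - bi) for ai, bi in zip(a, b))
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

    neighbours = sorted(zip(train_X, train_y), key=lambda p: dist(p[0], x))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]
```

In the study's setting, each training point would be a vector of the 13 macro-economic indicators and the label would mark crisis versus non-crisis periods.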
Abstract: In this study, a minimal submaximal element of LIT(X) (the lattice of all intuitionistic topologies for X, ordered by inclusion) is determined. Afterwards, a new contractive property, intuitionistic mega-connectedness, is defined. We show that the submaximality and mega-connectedness are not complementary intuitionistic topological invariants by identifying those members of LIT(X) which are intuitionistic mega-connected.
Abstract: Cryptographic algorithms play a crucial role in the
information society by providing protection from unauthorized
access to sensitive data. It is clear that information technology will
become increasingly pervasive; hence we can expect the emergence
of ubiquitous or pervasive computing and ambient intelligence. These
new environments and applications will present new security
challenges, and there is no doubt that cryptographic algorithms and
protocols will form a part of the solution. The efficiency of a public
key cryptosystem is mainly measured in computational overheads,
key size and bandwidth. In particular, the RSA algorithm is used in
many applications for providing security. Although the security
of RSA is beyond doubt, the evolution in computing power has
caused a growth in the necessary key length. The fact that most chips
on smart cards cannot process keys exceeding 1024 bits shows that
there is a need for an alternative. NTRU is such an alternative: a
collection of mathematical algorithms based on manipulating lists of
very small integers and polynomials. This allows NTRU to achieve
high speeds with the use of minimal computing power. NTRU (Nth degree
Truncated Polynomial Ring Unit) is the first secure public key
cryptosystem not based on factorization or discrete logarithm
problem. This means that, even given substantial computational
resources and time, an adversary should not be able to break the key.
Multi-party communication and the requirement of optimal resource
utilization have created a present-day demand for applications that
need security enforcement and can be enhanced with high-end
computing. This has prompted us to develop high-performance NTRU
schemes using approaches such as high-end computing hardware.
Peer-to-peer (P2P) and enterprise grids are proven approaches for
building high-end computing systems; by utilizing them, one can
improve the performance of NTRU through parallel execution. In this paper we
propose and develop an application for NTRU using enterprise grid
middleware called Alchemi. An analysis and comparison of its
performance for various text files is presented.
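The core NTRU arithmetic the abstract refers to, multiplication of polynomials with small integer coefficients in the truncated ring Z_q[x]/(x^N − 1), can be sketched as follows. This is only the ring operation, not the full key generation or encryption scheme; the independent iterations of the outer loop are the kind of work a grid middleware such as Alchemi could distribute:

```python
def ring_convolve(a, b, N, q):
    """Cyclic (star) multiplication in the truncated polynomial ring
    Z_q[x]/(x^N - 1), the core arithmetic operation NTRU is built on.
    Polynomials are coefficient lists of length N; all arithmetic uses
    small integers, which keeps the computing-power needs minimal."""
    c = [0] * N
    for i in range(N):
        if a[i] == 0:          # sparse coefficients are common in NTRU
            continue
        for j in range(N):
            # x^(i+j) wraps around modulo x^N - 1.
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c
```

For example, (1 + x)(1 + x²) in Z₇[x]/(x³ − 1) gives 1 + x + x² + x³ ≡ 2 + x + x².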
Abstract: The challenge for software development houses in
Bangladesh is to find a path using a minimal process rather than gigantic CMMI- or ISO-type practice and process areas. Small and medium size organizations in Bangladesh want to ensure minimal
basic Software Process Improvement (SPI) in day-to-day operational
activities, in the expectation that these basic practices will help them realize their companies' improvement goals. This paper focuses on the key issues in basic software practices for small and medium size software
organizations that are unable to afford CMMI, ISO, ITIL, etc. compliance certifications. This research also suggests a basic software process practice model for Bangladesh and shows the mapping of our suggestions to international best practice. In this IT-
competitive world of software process improvement, small and medium size software companies require collaboration and
strengthening to transform their current perspective into an inseparable part of the global IT scenario. This research performed investigations and analyses of several projects' life cycles, current good practices, effective approaches, and the realities and pain areas of practitioners. We carried out
reasoning, root cause analysis, and comparative analysis of various
approaches, methods and practices, with justifications against CMMI and real life. We avoided reinventing the wheel; our focus is on a minimal
practice that will ensure a dignified satisfaction between
organizations and software customers.
Abstract: This paper presents three models which enable the
customisation of Universal Description, Discovery and Integration
(UDDI) query results, based on some pre-defined and/or real-time
changing parameters. These proposed models detail the requirements,
design and techniques which make ranking of Web service discovery
results from a service registry possible. Our contribution is twofold:
First, we present an extension to the UDDI inquiry capabilities. This
enables a private UDDI registry owner to customise or rank the query
results, based on its business requirements. Second, our proposal
utilises existing technologies and standards which require minimal
changes to existing UDDI interfaces or its data structures. We believe
these models will serve as a valuable reference for enhancing the
service discovery methods within a private UDDI registry
environment.
Abstract: In this article, we propose a new surgical device for
circumferential excision of high anal fistulas in a minimally
invasive manner. The new apparatus works on the basis of axially
rotating and moving a tubular blade along a fistulous tract
straightened using a rigid straight guidewire. As the blade moves
along the tract, its sharp circular cutting edge circumferentially
separates approximately 2.25 mm thickness of tract encircling the
rigid guidewire. We used the new device to excise two anal fistulas in
a 62-year-old male patient: an extrasphincteric type and a long tract
with no internal opening. Based on the results of this test, the
new device can be considered as a sphincter preserving mechanism
for treatment of high anal fistulas. Consequently, a major reduction
in the risk of fecal incontinence, recurrence rate, convalescence
period and patient morbidity may be achieved using the new device
for treatment of fistula-in-ano.
Abstract: Ultra-low-power (ULP) circuits have received
widespread attention due to the rapid growth of biomedical
applications and battery-less electronics. The subthreshold region of
transistor operation is used in ULP circuits. A major research
challenge in the subthreshold operating region is to extract the ULP
benefits with minimal degradation in speed and robustness. Process, Voltage
and Temperature (PVT) variations significantly affect the
performance of subthreshold circuits. The designed performance
parameters of ULP circuits may vary widely due to temperature
variations. Hence, this paper investigates the effect of temperature
variation on device and circuit performance parameters at different
biasing voltages in the subthreshold region. Simulation results clearly
demonstrate that in deep subthreshold and near threshold voltage
regions, performance parameters are significantly affected whereas in
moderate subthreshold region, subthreshold circuits are more
immune to temperature variations. This establishes that the moderate
subthreshold region is ideal for temperature-immune circuits.
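The temperature sensitivity discussed above follows from the textbook subthreshold drain-current model (a standard form, not reproduced from the paper):

```latex
I_D = I_0 \, e^{\frac{V_{GS}-V_{th}(T)}{n V_T}}\left(1 - e^{-\frac{V_{DS}}{V_T}}\right), \qquad V_T = \frac{kT}{q}
```

Because temperature enters exponentially through the thermal voltage V_T and also shifts the threshold voltage V_th(T), small temperature changes can move the drain current by orders of magnitude, which is why the biasing point determines how temperature-immune a subthreshold circuit is.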
Abstract: In this paper, the processing of sonar signals has been
carried out using Minimal Resource Allocation Network (MRAN)
and a Probabilistic Neural Network (PNN) in differentiation of
commonly encountered features in indoor environments. The
stability-plasticity behaviors of both networks have been
investigated. The experimental results show that MRAN possesses
lower network complexity but experiences higher plasticity than
PNN. An enhanced version called parallel MRAN (pMRAN) is
proposed to solve this problem; it proves stable in prediction and
outperforms the original MRAN.
Abstract: It is suggested to evaluate the environmental performance
of the energy sector using Data Envelopment Analysis with non-discretionary
factors (DEA-ND), with relative indicators as inputs and
outputs. The latter allows for comparison of objects that differ
essentially in size. Inclusion of non-discretionary factors serves to
separate out the indicators that are beyond the control of the objects.
A virtual perfect object, comprised of maximal outputs and minimal
inputs, was added to the group of actual ones. In this setting, an explicit
solution of the DEA-ND problem was obtained. The energy sector of the
United States was analyzed using the suggested approach for the period
1980–2006, with expected values of economic indicators for 2030
used for forming the perfect object. It was found that
environmental performance increased steadily over the
period, from 7.7% to 50.0%, but still remains well below the
projected level.
Abstract: Compensating physiological motion in the context
of minimally invasive cardiac surgery has become an attractive
issue, since such surgery outperforms traditional cardiac procedures,
offering remarkable benefits. Owing to space restrictions, computer vision
techniques have proven to be the most practical and suitable solution.
However, the lack of robustness and efficiency of existing methods
makes physiological motion compensation an open and challenging
problem. This work focuses on increasing robustness and efficiency
via exploration of the classes of ℓ1- and ℓ2-regularized optimization,
emphasizing the use of explicit regularization. Both approaches are
based on natural features of the heart, using intensity information.
Results point to the ℓ1-regularized optimization class as the best,
since it offered the lowest computational cost and the smallest average
error, and it proved to work even under complex deformations.
Abstract: The goal of this paper is to find the Wardrop equilibrium
in transport networks under uncertainty, where the
uncertainty comes from lack of information. We use a simulation tool
to find the equilibrium, which gives only an approximate solution, but
this is sufficient even for large networks. In order to take the
uncertainty into account, we have developed an interval-based
procedure for finding the paths with minimal cost using
Dempster-Shafer theory. Furthermore, we have investigated the users'
behavior using a game theory approach, because their path choices
influence the costs of other users' paths.
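The interval-based minimal-cost path search can be sketched as follows; the graph encoding is hypothetical, interval costs add component-wise, and the midpoint ordering used here is one simple illustrative choice, not the paper's Dempster-Shafer combination rule:

```python
import heapq

def interval_shortest_path(graph, source, target):
    """Dijkstra-like search with interval-valued edge costs [lo, hi],
    representing cost uncertainty. Path intervals add component-wise;
    paths are ranked by interval midpoint (a simple total order)."""
    best = {source: (0.0, 0.0)}
    heap = [(0.0, source, (0.0, 0.0))]
    while heap:
        _, node, (lo, hi) = heapq.heappop(heap)
        if node == target:
            return (lo, hi)             # minimal-midpoint cost interval
        for nxt, (elo, ehi) in graph.get(node, []):
            cand = (lo + elo, hi + ehi)
            if nxt not in best or sum(cand) < sum(best[nxt]):
                best[nxt] = cand
                heapq.heappush(heap, (sum(cand) / 2, nxt, cand))
    return None
```

Because the midpoint of a sum of intervals equals the sum of midpoints, this reduces to ordinary Dijkstra on midpoint costs while still reporting the full uncertainty interval of the chosen path.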
Abstract: Mining sequential patterns from large customer transaction databases has been recognized as a key research topic in database systems. However, previous work focused mainly on mining sequential patterns at a single concept level. In this study, we introduce concept hierarchies into this problem and present several algorithms for discovering multiple-level sequential patterns based on the hierarchies. An experiment was conducted to assess the performance of the proposed algorithms, measured by the relative time spent on completing the mining tasks on two different datasets. The experimental results show that the performance depends on the characteristics of the datasets and the pre-defined threshold of minimal support for each level of the concept hierarchy. Based on the experimental results, suggestions are also given on how to select the appropriate algorithm for a given dataset.
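The idea of evaluating support at a concept-hierarchy level can be sketched as follows (the `ancestor` map and the data are illustrative stand-ins, not the paper's algorithms):

```python
def level_support(sequences, pattern, ancestor):
    """Sketch of support counting for multiple-level sequential patterns:
    each item is first generalized to its concept-hierarchy level via the
    `ancestor` map, then the (generalized) pattern is matched as a
    subsequence. Returns relative support, to be compared against the
    minimal-support threshold of that level."""
    def generalize(seq):
        return [ancestor.get(item, item) for item in seq]

    def is_subsequence(pat, seq):
        it = iter(seq)
        # `p in it` consumes the iterator, enforcing left-to-right order.
        return all(p in it for p in pat)

    hits = sum(1 for s in sequences if is_subsequence(pattern, generalize(s)))
    return hits / len(sequences)
```

A pattern too rare at the leaf level (e.g. `['cola', 'chips']`) may still clear the threshold once items are generalized (e.g. `['drink', 'snack']`), which is the point of mining across hierarchy levels.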
Abstract: In this paper, we propose an efficient data
compression strategy exploiting the multi-resolution characteristic of
the wavelet transform. We have developed a sensor node called the
"Smart Sensor Node" (SSN). The main goals of the SSN design are
light weight, minimal power consumption, modular design and robust
circuitry. The SSN is made up of four basic components: a
sensing unit, a processing unit, a transceiver unit and a power unit.
The FiOStd evaluation board is chosen as the main controller of the SSN
for its low cost and high performance. The software coding of the
implementation was done using a Simulink model and the MATLAB
programming language. The experimental results show that the
proposed data compression technique yields a recovered signal of good
quality. This technique can be applied to compress the collected data
to reduce the data communication as well as the energy consumption
of the sensor, so the lifetime of the sensor node can be extended.
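The multi-resolution compression idea can be sketched with one level of a Haar wavelet transform (Haar is used here for brevity; the paper's exact wavelet and the node firmware are not reproduced):

```python
def haar_compress(signal, keep_ratio=0.5):
    """Sketch of wavelet-based compression: one level of the Haar
    transform splits the signal into averages (coarse resolution) and
    details (fine resolution); the smallest detail coefficients are
    dropped before reconstruction, trading fidelity for fewer values
    to transmit."""
    n = len(signal) - len(signal) % 2
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, n, 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, n, 2)]
    # Keep only the largest-magnitude detail coefficients.
    k = int(len(det) * keep_ratio)
    cutoff = sorted(map(abs, det), reverse=True)[k - 1] if k > 0 else float("inf")
    det = [d if abs(d) >= cutoff else 0.0 for d in det]
    # Inverse transform: recover an approximation of the signal.
    rec = []
    for a, d in zip(avg, det):
        rec.extend([a + d, a - d])
    return rec
```

On a sensor node, only the averages and the surviving detail coefficients would be transmitted, cutting radio traffic, which typically dominates the node's energy budget.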
Abstract: Cloud computing is the innovative and leading
information technology model for enabling convenient, on-demand
network access to a shared pool of configurable computing resources
that can be rapidly provisioned and released with minimal
management effort. This paper presents our development on enabling
an individual user's desktop in a virtualized environment, which is
stored on a remote virtual machine rather than locally. We present the
initial work on the integration of virtual desktop and application
sharing with virtualization technology. Given the development of
remote desktop virtualization, this proposed effort has the potential to
provide an efficient, resilient and elastic environment for
online cloud services. Users no longer need to bear the cost of
software licenses and platform maintenance. Moreover, this
development also helps boost user productivity by promoting a
flexible model that lets users access their desktop environments from
virtually anywhere.