Abstract: This paper presents the development of a Bayesian
belief network classifier for prediction of graft status and survival
period in renal transplantation using the patient profile information
prior to transplantation. The objective was to explore the feasibility
of developing a decision-making tool for identifying the most suitable
recipient among the candidate pool members. The dataset was
compiled from the University of Toledo Medical Center Hospital
patients as reported to the United Network for Organ Sharing (UNOS), and
contained 1228 patient records covering the period 1987 through 2009. The
Bayes net classifiers were developed using the Weka machine
learning software workbench. Two separate classifiers were induced
from the data set, one to predict the status of the graft as either failed
or living, and a second classifier to predict the graft survival period.
The classifier for graft status prediction performed very well, with a
prediction accuracy of 97.8% and true positive rates of 0.967 and
0.988 for the living and failed classes, respectively. The second
classifier to predict the graft survival period yielded a prediction
accuracy of 68.2% and a true positive rate of 0.85 for the class
representing those instances with kidneys failing during the first year
following transplantation. Simulation results indicated that it is
feasible to develop a successful Bayesian belief network classifier for
prediction of graft status, but not of the graft survival period, using the
information in the UNOS database.
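The induction-and-evaluation loop described above can be sketched outside Weka; below is a minimal stand-in using a plain categorical naive Bayes classifier (the simplest Bayesian network structure) on synthetic data, since the UNOS records themselves are not public. The feature count, value range, and labeling rule are illustrative assumptions, not the paper's setup.

```python
import random
from collections import Counter, defaultdict

# Synthetic stand-in for pre-transplant patient profiles:
# each record has 5 categorical features; label 0 = living, 1 = failed.
random.seed(0)
data = []
for _ in range(1228):
    x = [random.randrange(4) for _ in range(5)]
    y = int(x[0] + x[1] > 3)        # arbitrary rule, for illustration only
    data.append((x, y))

train, test = data[:900], data[900:]

# Fit the model: class priors plus per-feature conditional value counts.
priors = Counter(y for _, y in train)
cond = defaultdict(Counter)         # (feature index, class) -> value counts
for x, y in train:
    for i, v in enumerate(x):
        cond[(i, y)][v] += 1

def predict(x):
    def score(y):
        p = priors[y] / len(train)
        for i, v in enumerate(x):   # Laplace smoothing avoids zero counts
            p *= (cond[(i, y)][v] + 1) / (priors[y] + 4)
        return p
    return max(priors, key=score)

correct = sum(predict(x) == y for x, y in test)
print(f"accuracy: {correct / len(test):.3f}")
```

A per-class true positive rate, as reported in the abstract, would be computed the same way restricted to test records of each class.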
Abstract: Langmuir–Blodgett (LB) films of polyaniline (PANI) grown onto ITO-coated glass substrates were utilized for the fabrication of a uric acid biosensor for efficient detection of uric acid, by immobilizing uricase via EDC–NHS coupling. The modified electrodes were characterized by atomic force microscopy (AFM). The response characteristics after immobilization of uricase were studied using cyclic voltammetry (CV) and electrochemical impedance spectroscopy (EIS). The uricase/PANI/ITO/glass bioelectrode studied by CV and EIS revealed detection of uric acid over a wide range of 0.05 mM to 1.0 mM, covering the physiological range in blood. A low Michaelis–Menten constant (Km) of 0.21 mM indicates the high affinity of the immobilized uricase towards its analyte (uric acid). The fabricated uric acid biosensor based on PANI LB films exhibits an excellent sensitivity of 0.21 mA/mM, with a response time of 4 s, good reproducibility, a long shelf life (8 weeks) and high selectivity.
Abstract: In Orthogonal Frequency Division Multiplexing (OFDM) systems, the peak-to-average power ratio (PAR) is very high. Clipping the signal is a useful method to reduce the PAR; clipping the OFDM signal, however, increases the overall noise level by introducing clipping noise. It is therefore necessary to recover the shape of the original signal at the receiver in order to reduce the clipping noise. Considering the continuity of the signal and the shape of the peak, we fit a conic function curve to replace the clipped portion of the signal within the clipping interval. Simulation results show that the proposed scheme can reduce the system's BER (bit-error rate) by a factor of ten when the signal-to-interference-and-noise ratio (SINR) equals 12 dB. The BER performance of the proposed scheme is also superior to that of Kim's scheme.
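The quantity being reduced can be illustrated directly. The sketch below computes the PAR of a randomly generated OFDM symbol and applies plain hard clipping; the conic-curve reconstruction at the receiver is the paper's contribution and is not reproduced here, and the subcarrier count and clipping threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# One OFDM symbol: IFFT of random QPSK subcarriers.
n_sub = 256
symbols = (rng.choice([-1, 1], n_sub) + 1j * rng.choice([-1, 1], n_sub)) / np.sqrt(2)
x = np.fft.ifft(symbols) * np.sqrt(n_sub)     # time-domain OFDM signal

def par_db(sig):
    """Peak-to-average power ratio in dB."""
    p = np.abs(sig) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Hard amplitude clipping at an (assumed) 1.5x the RMS level.
clip_level = 1.5 * np.sqrt(np.mean(np.abs(x) ** 2))
mag = np.abs(x)
x_clipped = np.where(mag > clip_level, x * clip_level / mag, x)

print(f"PAR before clipping: {par_db(x):.2f} dB")
print(f"PAR after clipping:  {par_db(x_clipped):.2f} dB")
```

The clipped samples are exactly the segments the proposed scheme would replace with a conic curve at the receiver.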
Abstract: Adopting Zakowski's upper and lower approximation
operators, this paper investigates granularity-wise separations in
covering approximation spaces. Some characterizations of
granularity-wise separations are obtained by means of Pawlak rough
sets, and some relations among granularity-wise separations are
established, which makes it possible to study covering approximation
spaces by logical and mathematical methods in computer science. The
results of this paper give further applications of Pawlak rough set
theory in pattern recognition and artificial intelligence.
Abstract: Nowadays, many organizations use systems that
support business processes as a whole or partially. However, in some
application domains, such as software development and health care
processes, a normative Process Aware System (PAS) is not suitable,
because flexible support is needed to respond rapidly to new
process models. On the other hand, a flexible Process Aware System
may be vulnerable to undesirable and fraudulent executions, which
imposes a tradeoff between flexibility and security. To manage
this tradeoff, a genetic-based anomaly detection model for
logs of Process Aware Systems is presented in this paper. The
detection of an anomalous trace is based on discovering an
appropriate process model by genetic process mining and
flagging traces that do not fit the discovered model as
anomalous; therefore, when used in a PAS, this model is an automated
solution that can support the coexistence of flexibility and security.
Abstract: A perfect secret-sharing scheme is a method to distribute a secret among a set of participants in such a way that only qualified subsets of participants can recover the secret, while the joint shares of the participants in any unqualified subset are statistically independent of the secret. The collection of all qualified subsets is called the access structure of the perfect secret-sharing scheme. In a graph-based access structure, each vertex of a graph G represents a participant and each edge of G represents a minimal qualified subset. The average information ratio of a perfect secret-sharing scheme realizing the access structure based on G is defined as AR = (Σ_{v∈V(G)} H(s_v)) / (|V(G)| · H(s)), where s is the secret and s_v is the share of participant v, both random variables, and H is the Shannon entropy. The infimum of the average information ratio over all perfect secret-sharing schemes realizing a given access structure is called the optimal average information ratio of that access structure. Most known results about the optimal average information ratio give upper or lower bounds on it. In this paper, we study access structures based on bipartite graphs and determine the exact values of the optimal average information ratio for some infinite classes of them.
Abstract: Rough set theory is a very effective tool for dealing with granularity and vagueness in information systems. Covering-based rough set theory is an extension of classical rough set theory. In this paper, we first present characterizations of the reducible element and the minimal description in covering-based rough sets through down-sets. Then we establish lattices and topological spaces in covering-based rough sets through down-sets and up-sets. In this way, one can investigate covering-based rough sets from algebraic and topological points of view.
Abstract: In this paper, a simple microfluidic device for monitoring algal cell behavior is proposed. An array of algal microwells is fabricated by PDMS soft lithography using an X-ray LIGA mold and placed on a glass substrate. The two layers, replicated PDMS and substrate, are attached by oxygen plasma bonding, creating a microchannel for the microfluidic system. Algal cells are loaded into the microfluidic device, which provides a positive charge on the bottom surface of the wells. The algal cells, which are negatively charged, are attracted to the bottom of the wells via electrostatic interaction. By varying the concentration of algal cells in the loading suspension, it is possible to obtain wells with a single cell. Liquid medium for cell monitoring flows continuously over the wells, providing nutrient and waste exchange between the well and the main flow. This device could lead to uncovering the quantitative biology of algae, which is key to effective and extensive algal utilization in biotechnology, the food industry and bioenergy research and development.
Abstract: Given a large sparse signal, the goal is to
reconstruct the signal precisely and accurately from as few
measurements as possible. Although this is possible in theory, the
difficulty lies in building an algorithm that reconstructs with both
accuracy and efficiency. This paper proposes a new, proven method to
reconstruct sparse signals, called Least Support Orthogonal Matching
Pursuit (LS-OMP), and merges it with the theory of partially known
support (PKS) to obtain a further method, Partially Known Least
Support Orthogonal Matching Pursuit (PKLS-OMP).
Both methods rely on a greedy algorithm to compute the support,
which depends on the number of iterations. To make it faster,
PKLS-OMP incorporates partial knowledge of the support into its
algorithm. The methods recover the original signal efficiently,
simply, and accurately provided the sampling matrix satisfies the
Restricted Isometry Property (RIP).
Simulation results also show that they outperform many algorithms,
especially for compressible signals.
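For context, standard Orthogonal Matching Pursuit, the baseline that LS-OMP and PKLS-OMP refine, can be sketched in a few lines; the problem sizes and the Gaussian sampling matrix below are illustrative assumptions.

```python
import numpy as np

def omp(A, y, k):
    """Standard Orthogonal Matching Pursuit: greedily pick the column of A
    most correlated with the residual, then least-squares refit on the
    selected support. (This is the baseline, not the paper's LS-OMP.)"""
    m, n = A.shape
    residual = y.copy()
    support = []
    for _ in range(k):
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(n)
    x[support] = x_s
    return x

rng = np.random.default_rng(2)
n, m, k = 128, 48, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian matrices satisfy RIP w.h.p.
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

x_hat = omp(A, y, k)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

A PKS variant would simply seed `support` with the indices already known before entering the loop, which is the speed-up idea the abstract describes.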
Abstract: Drought is one of the most damaging climate-related
hazards and is generally defined as a prolonged absence of
precipitation. This normal and recurring climate phenomenon has
plagued civilization throughout history because of its negative
impacts on the economic, environmental and social sectors. Drought
characteristics are thus recognized as important factors in water
resources planning and management. The purpose of this study is to
detect changes in drought frequency, persistence and severity
in the Ruhr river basin. The frequency of drought events was
calculated using the Standardized Precipitation Index (SPI). The
data used are daily precipitation records from seven meteorological
stations covering the period 1961-2007. The main benefit of this
index is its versatility: only rainfall data are required
to deliver five major dimensions of a drought, namely duration,
intensity, severity, magnitude, and frequency. Furthermore, the SPI
can be calculated at different time scales; in this study it was
calculated for 1, 3, 6, 9, 12, and 24 months. Several drought events
were detected in the period covered, including mild, moderate and
severe droughts. Both positive and negative trends in the SPI values
were also observed.
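A simplified version of the SPI computation over the multiple time scales mentioned above can be sketched as follows. Note that the standard SPI fits a gamma distribution to the aggregated precipitation and maps it to the standard normal; this dependency-free sketch substitutes plain standardization, and the synthetic rainfall series is an assumption.

```python
import numpy as np

def spi_simplified(precip, scale):
    """Simplified SPI: aggregate monthly precipitation over `scale` months
    with a moving sum, then standardize to zero mean and unit variance.
    (The standard SPI instead fits a gamma distribution per calendar month
    and maps the CDF to the standard normal.)"""
    agg = np.convolve(precip, np.ones(scale), mode="valid")
    return (agg - agg.mean()) / agg.std()

rng = np.random.default_rng(3)
monthly_precip = rng.gamma(shape=2.0, scale=40.0, size=12 * 47)  # 1961-2007

for scale in (1, 3, 6, 9, 12, 24):
    spi = spi_simplified(monthly_precip, scale)
    # SPI <= -1 is conventionally a moderate drought; <= -1.5 severe.
    print(f"SPI-{scale}: {np.mean(spi <= -1):.1%} of months in drought")
```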
Abstract: Cloud Computing has recently emerged as a
compelling paradigm for managing and delivering services over the
Internet. The rise of Cloud Computing is rapidly changing the
landscape of information technology, ultimately turning the long-held
promise of utility computing into a reality. As the Cloud Computing
paradigm progresses rapidly, concepts and terminologies are becoming
imprecise and ambiguous, and different technologies are becoming
intermixed. Thus, it is crucial to clarify the key concepts and
definitions. In this paper, we present the anatomy of Cloud Computing,
covering its essential concepts, prominent characteristics, effects,
architectural design and key technologies. We differentiate various
service and deployment models. Significant challenges and risks must
also be tackled in order to guarantee the long-term success of Cloud
Computing. The
aim of this paper is to provide a better understanding of the anatomy
of Cloud Computing and pave the way for further research in this
area.
Abstract: An induced acyclic graphoidal cover of a graph G is a
collection ψ of open paths in G such that every path in ψ has at least
two vertices, every vertex of G is an internal vertex of at most one
path in ψ, every edge of G is in exactly one path in ψ and every
member of ψ is an induced path. The minimum cardinality of an
induced acyclic graphoidal cover of G is called the induced acyclic
graphoidal covering number of G and is denoted by ηia(G) or ηia.
Here we find induced acyclic graphoidal covers for some classes of
graphs.
Abstract: The world of wireless telecommunications is rapidly evolving. Technologies under research and development promise to deliver more services to more users in less time. This paper presents the emerging technologies helping wireless systems grow from where we are today into our visions of the future. This paper covers the applications and characteristics of emerging wireless technologies: Wireless Local Area Networks (WiFi 802.11n), Wireless Personal Area Networks (ZigBee) and Wireless Metropolitan Area Networks (WiMAX). The purpose of this paper is to explain the impending 802.11n standard and how it will enable WLANs to support emerging media-rich applications. The paper also details how 802.11n compares with existing WLAN standards and offers strategies for users considering higher-bandwidth alternatives. The emerging IEEE 802.15.4 (ZigBee) standard aims to provide low data rate wireless communications with high-precision ranging and localization, by employing UWB technologies for a low-power, low-cost solution. WiMAX (Worldwide Interoperability for Microwave Access) is a standard for wireless data transmission covering a range similar to cellular phone towers. With high performance in both distance and throughput, WiMAX technology could be a boon to current Internet providers seeking to become the leader of next-generation wireless Internet access. This paper also explores how these emerging technologies differ from one another.
Abstract: A new metaheuristic approach called the Randomized Gravitational Emulation Search algorithm (RGES) has been designed for solving vertex covering problems. The algorithm is founded on introducing the concept of randomization along with two of the four primary parameters in physics, velocity and gravity. A new heuristic operator is introduced in the domain of RGES to maintain feasibility specifically for the vertex covering problem and to yield the best solutions. The performance of this algorithm has been evaluated on a large set of benchmark problems from the OR-Library. Computational results show that the heuristic based on the randomized gravitational emulation search algorithm is capable of producing high-quality solutions. Its performance, when compared with other existing heuristic algorithms, is found to be excellent in terms of solution quality.
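For orientation, the classic greedy 2-approximation for vertex cover, a simple baseline for the kind of problem RGES targets (and not the RGES algorithm itself), looks like this; the random graph is an illustrative assumption.

```python
import random

def greedy_vertex_cover(edges):
    """Classic 2-approximation for vertex cover: repeatedly take an
    uncovered edge and add both of its endpoints to the cover."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

random.seed(4)
edges = [(random.randrange(30), random.randrange(30)) for _ in range(60)]
edges = [(u, v) for u, v in edges if u != v]     # drop self-loops

cover = greedy_vertex_cover(edges)
assert all(u in cover or v in cover for u, v in edges)
print("cover size:", len(cover))
```

Metaheuristics such as RGES aim to beat this kind of baseline by searching for smaller feasible covers.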
Abstract: With the explosive growth of data available on the
Internet, personalization of this information space has become a
necessity. At present, with the rapidly increasing popularity of the
WWW, websites play a crucial role in conveying knowledge and
information to end users. Discovering hidden and meaningful
information about Web users' usage patterns is critical for
determining effective marketing strategies and for optimizing Web
server usage to accommodate future growth. The task of mining useful
information becomes more challenging when the Web traffic volume is
enormous and keeps growing. In this paper, we propose an intelligent
model to discover and analyze useful knowledge from the available Web
log data.
Abstract: Fraud in the insurance industry is one of the major
sources of operational risk for insurance companies and constitutes a
significant portion of their losses. Every reasonable company on the
market aims to improve its processes for uncovering fraud and
invests resources to reduce it. This article addresses the fraud
management area from the view of extending an existing Business
Intelligence solution. We describe the frame of such a solution and
share with readers the benefits brought to insurance companies by
adopting this approach in their fight against insurance fraud.
Abstract: Suspended cable structures are preferable for covering large spans due to the rational use of structural materials, but their drawback is the change of initial shape under the action of non-symmetrical load. The problem can be solved by increasing the ratio of dead weight to imposed load, but this method increases material consumption. Using a prestressed cable truss is another way to fix the problem of shape change under non-symmetrical load. Better results can be achieved if the top chord is replaced with a cable truss with a cross web. A rational structure of the cable truss for the top chord of the prestressed cable truss was developed using optimization realized in the FEM program ANSYS 12. The behavior of single-cable and cable-truss models was investigated. Analytical and model testing results indicate that using a cable truss with a cross web as the top chord of a prestressed cable truss, instead of a single cable, reduces total displacements by 13-16% in the case of non-symmetrical load. In the case of uniformly distributed load, a single cable is preferable.
Abstract: The stairway village of Ushtobin is one of the five villages with original and sustainable architecture in the northwest of Iran along the border with Armenia, which has been able to maintain its environment and sustainable ecosystem. The circulation, function and scale (grand, medium and minor) of spaces, the ratio of full to empty spaces, the number and height of stairs, the ratio of compound volume to luxury spaces, openings, the type of local masonry (stone, mud, wood) and the form of covering elements have been studied comparatively in four sample houses of this village, and this article further analyzes how the architectural shapes and organic texture of the village meet the needs of its cold and dry climate. Finally, some efficient plans suiting the present needs of the village are offered to achieve a sustainable architecture.
Abstract: A big organization may have multiple branches spread across different locations. Processing data from these branches becomes a huge task when innumerable transactions take place. Moreover, branches may be reluctant to forward their data for centralized processing but are ready to pass on their association rules. Local mining may also generate a large number of rules, and it is not practically possible for all local data sources to be of the same size. A model is proposed for discovering valid rules from different-sized data sources, where the valid rules are high-weighted rules. These rules can be obtained from the high-frequency rules generated by each of the data sources. A data source selection procedure is employed in order to synthesize rules efficiently. Support Equalization is another proposed method, which focuses on eliminating low-frequency rules at the local sites themselves, thus reducing the number of rules by a significant amount.
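The size-weighted synthesis idea can be sketched as follows; the rule names, source sizes, and threshold are illustrative assumptions, not the paper's exact procedure.

```python
# Sketch of weighted rule synthesis across different-sized data sources:
# each branch reports (rule, local support); sources are weighted by size,
# and rules whose weighted support clears a threshold are kept as "valid".
from collections import defaultdict

sources = {             # source -> (num transactions, {rule: local support})
    "branch_A": (5000, {"bread->butter": 0.40, "milk->eggs": 0.05}),
    "branch_B": (2000, {"bread->butter": 0.35, "tea->sugar": 0.30}),
    "branch_C": (500,  {"milk->eggs": 0.60, "tea->sugar": 0.25}),
}

total = sum(n for n, _ in sources.values())
weighted = defaultdict(float)
for n, rules in sources.values():
    for rule, supp in rules.items():
        weighted[rule] += (n / total) * supp   # size-weighted support

min_weighted_support = 0.20                    # assumed threshold
valid = {r: s for r, s in weighted.items() if s >= min_weighted_support}
print(valid)
```

Support Equalization, as described in the abstract, would instead prune the low-support rules inside each branch's dictionary before they are ever reported.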
Abstract: The inherent flexibility of XML in both structure
and semantics makes mining XML data a complex task, with
more challenges compared to traditional association rule mining in
relational databases. In this paper, we propose a new model for the
effective extraction of generalized association rules from an XML
document collection. We directly use frequent subtree mining
techniques in the discovery process and do not ignore the tree
structure of the data in the final rules. The frequent subtrees, found
under a user-provided support threshold, are split into complement
subtrees to form the rules. We explain our model in multiple steps,
from data preparation to rule generation.