Abstract: General requirements for knowledge representation in
the form of logic rules, applicable to design and control of industrial
processes, are formulated. Characteristic behavior of decision trees
(DTs) and rough set theory (RST) in rule extraction from recorded
data is discussed and illustrated with simple examples. The
significance of the models' drawbacks was evaluated using
simulated and industrial data sets. It is concluded that the performance of DTs may be considerably poorer than that of RST in several important aspects, particularly when not only a characterization of a problem is required but also detailed and precise rules, suited to the actual, specific problems to be solved, are needed.
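Decision trees extract rules by repeatedly choosing the attribute with the highest information gain. A minimal ID3-style gain computation is sketched below; the toy process data and the attribute names are assumptions for illustration, not data from the paper.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Reduction in label entropy from splitting on attribute index `attr`."""
    n = len(labels)
    split = {}
    for row, lab in zip(rows, labels):
        split.setdefault(row[attr], []).append(lab)
    remainder = sum(len(part) / n * entropy(part) for part in split.values())
    return entropy(labels) - remainder

# Toy data: (temperature, pressure) -> process state (hypothetical)
rows = [("high", "low"), ("high", "high"), ("low", "low"), ("low", "high")]
labels = ["fault", "fault", "ok", "ok"]
gain_temp = information_gain(rows, labels, 0)  # temperature separates classes -> 1.0
gain_pres = information_gain(rows, labels, 1)  # pressure is uninformative -> 0.0
```

A tree built greedily on such gains yields one rule per leaf; the abstract's point is that such rules may be coarser than those a rough-set analysis produces.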
Abstract: Mobile IP has been developed to provide continuous information network access to mobile users. In IP-based
mobile networks, location management is an important component of
mobility management. This management enables the system to track
the location of a mobile node between consecutive communications. It includes two important tasks: location update and call delivery.
Location update is associated with signaling load. Frequent updates
lead to degradation in the overall performance of the network and the
underutilization of resources. It is therefore necessary to devise mechanisms to minimize the update rate. Mobile IPv6 (MIPv6)
and Hierarchical MIPv6 (HMIPv6) have been the potential
candidates for deployments in mobile IP networks for mobility
management. Studies have shown that HMIPv6 performs better than MIPv6. It reduces signaling overhead traffic by making the registration process local. In this paper,
we present performance analysis of MIPv6 and HMIPv6 using an
analytical model. A location update cost function is formulated based on the fluid-flow mobility model. The impact of cell residence time, cell
residence probability, and user's mobility is investigated. Numerical results are obtained and presented in graphical form. It is shown that HMIPv6 outperforms MIPv6 only for high-mobility users; for low-mobility users, the performance of the two schemes is almost equivalent.
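The fluid-flow mobility model relates a user's boundary-crossing rate to speed and region geometry: rate = vL/(πA) for perimeter L and area A. The sketch below is a simplified, hypothetical cost comparison; circular cells, the unit costs, and the treatment of a MAP domain as one larger circle are assumptions, not the paper's exact model.

```python
from math import pi, sqrt

def crossing_rate(v, perimeter, area):
    """Fluid-flow model: mean boundary crossings per unit time, v*L/(pi*A)."""
    return v * perimeter / (pi * area)

def circle(r):
    """Perimeter and area of a circular region of radius r (an assumption;
    real cell layouts are usually hexagonal)."""
    return 2 * pi * r, pi * r * r

def mipv6_update_cost(v, r_cell, c_home):
    """MIPv6: every cell crossing triggers a home registration of cost c_home."""
    perim, area = circle(r_cell)
    return crossing_rate(v, perim, area) * c_home

def hmipv6_update_cost(v, r_cell, n_cells, c_local, c_home):
    """HMIPv6: cell crossings inside a MAP domain cost only c_local;
    home registrations occur only on crossings of the whole domain
    (n_cells cells, approximated here as one larger circle)."""
    perim, area = circle(r_cell)
    perim_d, area_d = circle(r_cell * sqrt(n_cells))
    return (crossing_rate(v, perim, area) * c_local
            + crossing_rate(v, perim_d, area_d) * c_home)

# A fast user in a 16-cell domain with cheap local registrations
fast_mip = mipv6_update_cost(20.0, 1.0, 10.0)
fast_hmip = hmipv6_update_cost(20.0, 1.0, 16, 1.0, 10.0)
```

With a cheap local registration, HMIPv6's update cost stays well below MIPv6's for fast users, in line with the abstract's conclusion.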
Abstract: Recently, X. Ge and J. Qian investigated some relations between higher mathematics scores and calculus scores (resp. linear algebra scores, probability and statistics scores) for Chinese university students. Based on rough set theory, they established an information system S = (U, C ∪ D, V, f). In this information system, the higher mathematics score was taken as the decision attribute, and the calculus, linear algebra, and probability and statistics scores were taken as condition attributes. They investigated the importance of each condition attribute with respect to the decision attribute and the strength of each condition attribute in supporting the decision attribute. In this paper, we give further investigations of this issue. Based on the above information system S = (U, C ∪ D, V, f), we analyze the decision rules between condition and decision granules. For each x ∈ U, we obtain the support (resp. strength, certainty factor, coverage factor) of the decision rule C →x D, where C →x D is the decision rule induced by x in S = (U, C ∪ D, V, f). The results of this paper give a new analysis of higher mathematics scores for Chinese university students, which may further help Chinese university students raise their higher mathematics scores in the Chinese graduate student entrance examination.
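Pawlak's standard measures for a decision rule C →x D can be computed directly from an information table: with Cx and Dx the condition and decision equivalence classes of x, support = |Cx ∩ Dx|, strength = support/|U|, certainty = support/|Cx|, and coverage = support/|Dx|. A minimal sketch follows; the score table is invented for illustration and is not the paper's data.

```python
def equivalence_class(universe, attrs, x, table):
    """Objects indiscernible from x on the given attributes."""
    return {y for y in universe
            if all(table[y][a] == table[x][a] for a in attrs)}

def rule_measures(universe, cond, dec, x, table):
    """Pawlak's support, strength, certainty and coverage of C ->x D."""
    Cx = equivalence_class(universe, cond, x, table)
    Dx = equivalence_class(universe, dec, x, table)
    support = len(Cx & Dx)
    return {
        "support": support,
        "strength": support / len(universe),
        "certainty": support / len(Cx),
        "coverage": support / len(Dx),
    }

# Hypothetical grade table (condition attributes -> decision attribute)
table = {
    1: {"calculus": "A", "linalg": "A", "prob": "B", "higher": "A"},
    2: {"calculus": "A", "linalg": "A", "prob": "B", "higher": "A"},
    3: {"calculus": "A", "linalg": "A", "prob": "B", "higher": "B"},
    4: {"calculus": "B", "linalg": "B", "prob": "B", "higher": "B"},
}
U = set(table)
m = rule_measures(U, ["calculus", "linalg", "prob"], ["higher"], 1, table)
```

Here the rule induced by object 1 has certainty 2/3 (one indiscernible object decides differently) but coverage 1.0 (it covers everyone with decision "A").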
Abstract: Enzymatic saccharification of biomass for reducing
sugar production is one of the crucial processes in biofuel production
through biochemical conversion. In this study, enzymatic
saccharification of dilute potassium hydroxide (KOH) pre-treated
Tetraselmis suecica biomass was carried out by using cellulase
enzyme obtained from Trichoderma longibrachiatum. Initially, the
pre-treatment conditions were optimised by changing alkali reagent
concentration, retention time for reaction, and temperature. The T.
suecica biomass after pre-treatment was also characterized using
Fourier Transform Infrared Spectra and Scanning Electron
Microscope. These analyses revealed that functional groups such as acetyl and hydroxyl groups, as well as the structure and surface of the T. suecica biomass, were changed by the pre-treatment, which is favourable for the enzymatic saccharification process. Comparison of enzymatic
saccharification of untreated and pre-treated microalgal biomass
indicated that a higher level of reducing sugar can be obtained from
pre-treated T. suecica. Enzymatic saccharification of pre-treated T.
suecica biomass was optimised by changing temperature, pH, and
enzyme concentration to solid ratio ([E]/[S]). The highest conversion of carbohydrate into reducing sugar, 95%, corresponding to a reducing sugar yield of 20 wt% from pre-treated T. suecica, was obtained at 40°C, pH 4.5, and an [E]/[S] of 0.1 after 72 h of incubation. Hydrolysate obtained from enzymatic
saccharification of pretreated T. suecica biomass was further
fermented into biobutanol using Clostridium saccharoperbutyliticum
as biocatalyst. The results from this study demonstrate a positive
prospect of application of dilute alkaline pre-treatment to enhance
enzymatic saccharification and biobutanol production from
microalgal biomass.
Abstract: In the automotive industry, test drives are conducted during the development of new vehicle models or as part of the quality assurance of series-production vehicles. The communication
on the in-vehicle network, data from external sensors, or internal
data from the electronic control units is recorded by automotive
data loggers during the test drives. The recordings are used for fault
analysis. Since the resulting data volume is tremendous, manually
analysing each recording in great detail is not feasible.
This paper proposes to use machine learning to support domain experts by sparing them from contemplating irrelevant data and instead pointing them to the relevant parts in the recordings. The
underlying idea is to learn the normal behaviour from available
recordings, i.e. a training set, and then to autonomously detect
unexpected deviations and report them as anomalies.
The one-class support vector machine “support vector data description”
is utilised to calculate distances of feature vectors. SVDDSUBSEQ is proposed as a novel approach that classifies subsequences in multivariate time series data. The approach makes it possible to detect unexpected faults without modelling effort, as is shown with experimental results on recordings from test drives.
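The idea of learning normal behaviour from training recordings and flagging deviating subsequences can be illustrated with a heavily simplified stand-in: real SVDD solves a quadratic program for a minimal enclosing hypersphere, which is replaced here by a centroid-and-radius rule over flattened sliding windows. The signals below are synthetic, not test-drive data.

```python
def windows(series, w):
    """Flatten all length-w subsequences of a multivariate series
    (a list of per-timestep tuples) into feature vectors."""
    return [[v for t in series[i:i + w] for v in t]
            for i in range(len(series) - w + 1)]

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def fit(train, w):
    """Learn a centre and radius enclosing all training windows
    (a crude stand-in for the SVDD hypersphere)."""
    feats = windows(train, w)
    center = [sum(col) / len(feats) for col in zip(*feats)]
    radius = max(dist(f, center) for f in feats)
    return center, radius

def anomalies(test, w, center, radius):
    """Indices of subsequences falling outside the learned sphere."""
    return [i for i, f in enumerate(windows(test, w))
            if dist(f, center) > radius]

normal = [(0.0, 1.0)] * 50  # training drive: two constant channels
faulty = [(0.0, 1.0)] * 20 + [(5.0, 1.0)] * 3 + [(0.0, 1.0)] * 20
center, radius = fit(normal, 4)
flagged = anomalies(faulty, 4, center, radius)
```

Only the windows overlapping the injected spike are reported, which is the behaviour the paper aims for: pointing the expert at the relevant parts of a recording.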
Abstract: The design and development of information systems has undergone a variety of phases and stages. These variations have evolved in response to brisk changes in user requirements and business needs. To meet these requirements and needs, flexible and agile business solutions were required to keep up with the latest business trends and styles. Another obstacle to the agility of information systems has been the disparate treatment given to two halves of the same whole: business processes and information services. Since the emergence of information technology, business processes and information systems have become counterparts, yet these two halves have been treated under entirely different standards. There is a need to streamline the boundaries of both of these pillars, which equally share an information system's burdens and liabilities. In the last decade, object orientation has evolved into one of the major solutions for modern business needs, and now SOA is the solution for shifting business onto an electronic platform. BPM is another modern business solution that helps to regularize the optimization of business processes. This paper discusses how object orientation can be used to incorporate or embed SOA in BPM for improved information systems.
Abstract: This paper investigates the implementation of security mechanisms in object-oriented database systems. Formal methods play an essential role in computer security due to their powerful expressiveness and concise syntax and semantics. In this paper, both the specification and the implementation of a database security environment are considered; database security is achieved through the development of an efficient implementation of the specification without compromising its originality and expressiveness.
Abstract: The aim of this paper is to adopt a compromise ratio (CR) methodology for the fuzzy multi-attribute single-expert decision making problem. In this paper, the rating of each alternative is described by linguistic terms, which can be expressed as triangular fuzzy numbers. The compromise ratio method for fuzzy multi-attribute single-expert decision making is considered here by taking a ranking index based on the concept that the chosen alternative should be as close as possible to the ideal solution and, simultaneously, as far away as possible from the negative-ideal solution. From a logical point of view, the distance between two triangular fuzzy numbers is also a fuzzy number, not a crisp value. Therefore a fuzzy distance measure, which is itself a fuzzy number, is used here to calculate the difference between two triangular fuzzy numbers. With the help of this fuzzy distance measure, it is shown that the compromise ratio is a fuzzy number, which makes it easier for the decision maker to reach a decision. The computation principle and the procedure of the compromise ratio method are described in detail in this paper. A comparative analysis of the previously proposed compromise ratio method [1] and the newly adopted method is illustrated with two numerical examples.
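The claim that the distance between two triangular fuzzy numbers is itself fuzzy can be sketched as follows. The particular distance definition below (fuzzy subtraction followed by sorting the absolute end points) is illustrative only and need not match the paper's measure; the linguistic ratings are also invented.

```python
class TFN:
    """Triangular fuzzy number (l, m, u) with l <= m <= u."""
    def __init__(self, l, m, u):
        self.l, self.m, self.u = l, m, u

    def __sub__(self, other):
        # Standard fuzzy-arithmetic subtraction of triangular numbers
        return TFN(self.l - other.u, self.m - other.m, self.u - other.l)

    def centroid(self):
        """Defuzzified (crisp) value of the triangular number."""
        return (self.l + self.m + self.u) / 3

def fuzzy_distance(a, b):
    """A simple fuzzy distance: the end points of a - b, made
    non-negative and sorted so the result is again a TFN.
    (Illustrative; the paper's measure may differ.)"""
    d = a - b
    return TFN(*sorted(abs(x) for x in (d.l, d.m, d.u)))

good = TFN(7, 8, 9)  # hypothetical linguistic rating "good"
fair = TFN(4, 5, 6)  # hypothetical linguistic rating "fair"
d = fuzzy_distance(good, fair)
```

The result is a full triangular number rather than a single crisp value; a ranking index can then defuzzify it (e.g. via the centroid) only at the final comparison step.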
Abstract: The purpose of this Classifying Bird Sounds (chip notes) project is to reduce unwanted noise in recorded bird-sound chip notes, to design a scheme to detect differences and similarities between recorded chip notes, and to classify bird-sound chip notes. Technologies for determining the similarity of sound waves have been used in communication, sound engineering, and wireless sound applications for many years. Our research is focused on the similarity of chip notes, which are the sounds from different birds. The program we use is written in Microsoft C#.
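A standard building block for such waveform similarity comparisons is normalized cross-correlation, which is amplitude-invariant and bounded in [-1, 1]. A minimal sketch follows (shown here in Python for brevity; the synthetic tones are assumptions, not recorded chip notes):

```python
import math

def ncc(x, y):
    """Normalized cross-correlation of two equal-length sample sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

t = [i / 100 for i in range(200)]
note_a = [math.sin(2 * math.pi * 5 * s) for s in t]         # a 5 Hz tone
note_b = [0.5 * math.sin(2 * math.pi * 5 * s) for s in t]   # same tone, quieter
other  = [math.sin(2 * math.pi * 13 * s + 1.0) for s in t]  # a different tone
```

A quieter rendition of the same note still correlates at 1.0, while an unrelated tone correlates near 0, which is the property a similarity-based classifier of chip notes relies on.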
Abstract: The present study was designed to investigate the
cardioprotective role of chronic oral administration of an alcoholic
extract of Terminalia arjuna in in-vivo ischemic reperfusion injury
and the induction of HSP72. Rabbits, divided into three groups, were administered the alcoholic extract of the bark powder of
Terminalia arjuna (TAAE) by oral gavage [6.75mg/kg: (T1) and
9.75mg/kg: (T2), 6 days /week for 12 weeks]. In open-chest
Ketamine pentobarbitone anaesthetized rabbits, the left anterior
descending coronary artery was occluded for 15 min of ischemia
followed by 60 min of reperfusion. In the vehicle-treated group,
ischemic-reperfusion injury (IRI) was evidenced by depression of
global hemodynamic function (MAP, HR, LVEDP, and peak LV (+) and (−) dP/dt) along with depletion of HEP compounds. Oxidative stress in IRI was evidenced by raised levels of myocardial TBARS and
depletion of endogenous myocardial antioxidants GSH, SOD and
catalase. Western blot analysis showed a single band corresponding
to 72 kDa in homogenates of hearts from rabbits treated with both the
doses. In the Terminalia arjuna bark powder alcoholic extract treatment groups, both doses showed better recovery of myocardial hemodynamic function, with a significant reduction in TBARS and a rise in SOD, GSH, and catalase. The results
of the present study suggest that the alcoholic extract of the bark
powder of Terminalia arjuna in rabbit induces myocardial HSP 72
and augments myocardial endogenous antioxidants without causing any cellular injury, offering better cardioprotection against the oxidative stress associated with myocardial IR injury.
Abstract: It is well recognized that one feature of a successful company is its ability to align its business goals with its information and communication technology platform. Enterprise Resource Planning (ERP) systems contribute to better performance by integrating various business functions and providing support for information flows. However, the complexity of these technological systems is known to prevent business users from exploiting ERP systems efficiently.
This paper aims to investigate the role of training in improving the
usage of ERP systems. To this end, we have designed a survey instrument and administered it to employees of a Norwegian multinational global provider of technology solutions. Based on the analysis of the collected data, we have delineated a training model that could be of high relevance for
both researchers and practitioners as a step towards a better
understanding of ERP system implementation.
Abstract: Transport and land use are two systems that are
mutually influenced. Their interaction is a complex process
associated with continuous feedback. The paper examines the
existing land use around an under-construction metro station of the
new metro network of Thessaloniki, Greece, through the use of field
investigations around the station's predefined location. Moreover, beyond the analytical land use recording, a sampling
questionnaire survey is addressed to several selected enterprises of
the study area. The survey aims to specify the characteristics of the
enterprises, the trip patterns of their employees and clients, as well as
the stated preferences towards the changes the new metro station is
considered to bring to the area. The interpretation of the interrelationships
among selected data from the questionnaire survey takes
place using the method of Principal Components Analysis for
Categorical Data. The methodology followed and the survey's results contribute to the enrichment of the relevant literature concerning
the way the creation of a new metro station can have an impact on the
land use pattern of an area, by examining the situation before the
operation of the station.
Abstract: Knowledge development in companies relies on
knowledge-intensive business processes, which are characterized by
a high complexity in their execution, weak structuring,
communication-oriented tasks and high decision autonomy, and often the need for creativity and innovation. A foundation of knowledge development is provided, which is based on a new conception of
knowledge and knowledge dynamics. This conception consists of a three-dimensional model of knowledge with types, kinds and qualities. Built on this knowledge conception, knowledge dynamics is
modeled with the help of general knowledge conversions between
knowledge assets. Here knowledge dynamics is understood to cover
all of acquisition, conversion, transfer, development and usage of
knowledge. Through this conception we gain a sound basis for
knowledge management and development in an enterprise. The type dimension of knowledge in particular, which categorizes it according to its internality and externality with respect to the human being, is crucial for enterprise knowledge management and development,
because knowledge should be made available by converting it to
more external types.
Built on this conception, a modeling approach for knowledge-intensive business processes is introduced, be they human-driven, e-driven, or task-driven processes. As an example of this approach, a model of the creative activity for the renewal planning of a product is given.
Abstract: To illustrate the diversity of methods used to extract relevant visual data (where the concept of relevance can be defined differently for different applications), the paper discusses three groups of such methods. They have been selected from a range of alternatives to highlight how hardware and software tools can be used complementarily to achieve various functionalities for different specifications of “relevant data”. First, principles of gated imaging are presented (where relevance is determined by range). The second methodology is intended for intelligent intrusion detection, while the last one is used for content-based image matching and retrieval. All methods have been developed within projects supervised by the author.
Abstract: The aim of this article is to explain how features of attacks can be extracted from packets, and how vectors can be built and then applied to the input of any analysis stage. For the analysis, the work deploys a feedforward back-propagation neural network acting as a misuse intrusion detection system. It uses ten types of attacks as examples for training and testing the neural network, and explains how the packets are analyzed to extract features. The work shows how selecting the right features, building correct vectors, and correctly choosing the training method and the number of nodes in the hidden layer of a neural network affect the accuracy of the system. In addition, the work shows how to obtain optimal weight values and use them to initialize the artificial neural network.
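The feedforward/back-propagation mechanism itself can be sketched in pure Python: forward pass through one sigmoid hidden layer, then gradient descent on the squared error. The three-feature "packet" vectors and labels below are invented placeholders, not the paper's dataset of ten attack types.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class MLP:
    """Minimal one-hidden-layer feedforward net trained by backpropagation."""
    def __init__(self, n_in, n_hid, seed=0):
        r = random.Random(seed)
        # Weight rows include a trailing bias weight
        self.w1 = [[r.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
                   for _ in range(n_hid)]
        self.w2 = [r.uniform(-0.5, 0.5) for _ in range(n_hid + 1)]

    def forward(self, x):
        h = [sigmoid(sum(w * v for w, v in zip(ws, x + [1.0])))
             for ws in self.w1]
        o = sigmoid(sum(w * v for w, v in zip(self.w2, h + [1.0])))
        return h, o

    def train(self, data, epochs=3000, lr=0.5):
        for _ in range(epochs):
            for x, t in data:
                h, o = self.forward(x)
                do = (o - t) * o * (1 - o)  # output-layer delta
                dh = [do * self.w2[j] * h[j] * (1 - h[j])
                      for j in range(len(h))]
                for j, v in enumerate(h + [1.0]):
                    self.w2[j] -= lr * do * v
                for j in range(len(h)):
                    for k, v in enumerate(x + [1.0]):
                        self.w1[j][k] -= lr * dh[j] * v

# Hypothetical 3-feature packet vectors (e.g. scaled rate/size/flag counts)
data = [([0.9, 0.8, 0.7], 1), ([0.8, 0.9, 0.9], 1),
        ([0.1, 0.2, 0.1], 0), ([0.2, 0.1, 0.2], 0)]
net = MLP(3, 4)
net.train(data)
preds = [round(net.forward(x)[1]) for x, _ in data]
```

The hidden-layer width (4 here) is exactly the kind of choice the abstract says affects accuracy; too few nodes underfit, too many slow training and overfit.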
Abstract: Electrocardiogram (ECG) segmentation is necessary to help reduce the time-consuming task of manually annotating ECGs. Several algorithms have been developed to segment the ECG automatically. We first review several such methods, and then present a new single-lead segmentation method based on adaptive piecewise constant approximation (APCA) and piecewise derivative dynamic time warping (PDDTW). The results are tested on the QT database. We compared our results to Laguna's two-lead method. Our proposed approach has a comparable mean error, but yields a slightly higher standard deviation than Laguna's method.
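The dynamic-time-warping core that PDDTW builds on aligns two sequences by a monotone warping path of minimal cumulative cost. Below is plain DTW on raw samples as an illustrative sketch; the paper's method applies the warping to derivatives of an APCA approximation, and the toy "beats" here are idealized, not real ECG data.

```python
def dtw(a, b):
    """Dynamic time warping distance between two 1-D sequences
    (absolute-difference local cost, full DP table)."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

beat      = [0, 0, 1, 5, 1, 0, 0]        # idealized QRS-like peak
stretched = [0, 0, 0, 1, 5, 5, 1, 0, 0]  # same shape, shifted and stretched
other     = [0, 1, 0, 1, 0, 1, 0]
```

Because the stretched beat is a pure time-warping of the original, its DTW distance is zero even though the sequences differ sample by sample, which is why warping-based alignment suits waveforms whose segments vary in duration.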
Abstract: In this paper, a watermarking algorithm that uses the wavelet transform with Multiple Description Coding (MDC) and Quantization Index Modulation (QIM) concepts is introduced. The paper also investigates the role of the Contourlet Transform (CT) versus the Wavelet Transform (WT) in providing robust image watermarking. Two measures are utilized in the comparison between the wavelet-based and the contourlet-based methods: Peak Signal-to-Noise Ratio (PSNR) and Normalized Cross-Correlation (NCC). Experimental results reveal that the introduced algorithm is robust against different attacks and compares favourably with the contourlet-based algorithm.
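QIM embeds each watermark bit by quantizing a transform coefficient onto one of two interleaved lattices (step Δ, the bit-1 lattice shifted by Δ/2); as long as an attack perturbs a coefficient by less than Δ/4, the nearer lattice still identifies the bit. A minimal sketch follows; the coefficients and step size are illustrative, not the paper's wavelet coefficients.

```python
def qim_embed(coeff, bit, delta=8.0):
    """Quantize a coefficient onto the base lattice (bit 0) or the
    lattice shifted by delta/2 (bit 1)."""
    shift = 0.0 if bit == 0 else delta / 2
    return round((coeff - shift) / delta) * delta + shift

def qim_extract(coeff, delta=8.0):
    """Recover the bit by choosing the nearer of the two lattices."""
    d0 = abs(coeff - round(coeff / delta) * delta)
    d1 = abs(coeff - (round((coeff - delta / 2) / delta) * delta
                      + delta / 2))
    return 0 if d0 <= d1 else 1

coeffs = [13.2, -7.9, 40.4, 3.1]  # stand-in transform coefficients
bits = [1, 0, 1, 0]
marked = [qim_embed(c, b) for c, b in zip(coeffs, bits)]
attacked = [c + 1.5 for c in marked]  # additive distortion < delta/4
recovered = [qim_extract(c) for c in attacked]
```

In the watermarking setting this embed/extract step is applied to selected wavelet (or contourlet) subband coefficients; Δ trades imperceptibility (PSNR) against robustness (NCC after attacks).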
Abstract: In the present study, the anti-inflammatory and antinociceptive effects of a Vitex hydroalcoholic extract were evaluated in male mice. In the inflammation test, mice were divided into 7 groups: the first group was the control. The second group, the positive control, received dexamethasone (15 mg/kg), and the other five groups received different doses of the hydroalcoholic extract of Vitex fruit (265, 365, 465, 565, and 665 mg/kg). Inflammation was induced by xylene-induced ear edema. The formalin test was used to evaluate the antinociceptive effect of the extract. In this test, mice were divided into 7 groups: control, morphine (10 mg/kg) as the positive control, and Vitex extract groups (265, 365, 465, 565, and 665 mg/kg). All drugs were administered intraperitoneally, 30 min before each test. The data were analyzed using one-way ANOVA followed by the Tukey-Kramer multiple comparison test. Results have shown significant anti-inflammatory effects of the extract at all doses compared with the control (P
Abstract: “Garbage enzyme”, a fermentation product of kitchen waste, water, and brown sugar, is claimed in the media to be a multipurpose solution for household and agricultural uses. This study assesses the effects of dilutions (5% to 75%) of garbage enzyme in reducing pollutants in domestic wastewater. The pH of the garbage enzyme was found to be 3.5, with a BOD concentration of about 150 mg/L. Test results showed that the garbage enzyme raised the wastewater's BOD in proportion to its dilution due to its high organic content. For mixtures with more than 10% garbage enzyme, the pH remained acidic after the 5-day digestion period. However, it appears that ammonia nitrogen and phosphorus can be removed by the addition of the garbage enzyme. The most economical dosage for removal of ammonia nitrogen and phosphorus was found to be 9%. Further tests are required to understand the removal mechanisms for ammonia nitrogen and phosphorus.
Abstract: We have devised a thermal carpet cloak theoretically and implemented it in silicon using a layered metamaterial. The layered
metamaterial is composed of single crystalline silicon and its phononic
crystal. The design is based on a coordinate transformation. We
demonstrate the result with numerical simulation. Great cloaking
performance is achieved as a thermal insulator is well hidden under the
thermal carpet cloak. We also show that the thermal carpet cloak can even out the temperature on an irregular surface. Using the thermal carpet cloak to manipulate heat conduction is effective because of its low complexity.
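A layered metamaterial obtains the direction-dependent conductivity that a coordinate transformation demands from its stack: heat flowing along the layers sees the thickness-weighted arithmetic mean of the layer conductivities, while heat flowing across them sees the harmonic mean. A sketch of this standard effective-medium bound, with hypothetical conductivity values rather than the paper's parameters:

```python
def effective_conductivity(layers):
    """Effective thermal conductivity of a layer stack given as
    [(k, thickness), ...]: the parallel (along-layer) arithmetic
    mean and the series (across-layer) harmonic mean."""
    total = sum(d for _, d in layers)
    k_par = sum(k * d for k, d in layers) / total
    k_ser = total / sum(d / k for k, d in layers)
    return k_par, k_ser

# Hypothetical values: bulk silicon vs a lower-conductivity
# silicon phononic crystal, equal thicknesses (assumed numbers)
layers = [(148.0, 1.0), (10.0, 1.0)]
k_par, k_ser = effective_conductivity(layers)
```

The large gap between the two effective values (k_par far above k_ser) is the anisotropy a transformation-based cloak exploits to steer heat around the hidden region.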