Abstract: Writer identification is one of the areas in pattern
recognition that attracts many researchers, particularly in
forensic and biometric applications, where writing style can be
used as a biometric feature for authenticating an identity. The
challenging task in writer identification is the extraction of unique
features, through which the individuality of handwriting styles
can be captured in a bio-inspired generalized global shape for
writer identification. In this paper, the feasibility of the generalized
global shape concept of complementary binding in the Artificial
Immune System (AIS) for writer identification is explored. An
experiment based on the proposed framework has been conducted
to prove the validity and feasibility of the proposed approach for
off-line writer identification.
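The abstract does not detail the binding rule itself; a common affinity measure in AIS work is r-contiguous-bits matching, sketched below on binarized feature strings. The rule choice and the data are illustrative assumptions, not the paper's method.

```python
# Sketch of r-contiguous-bits matching, a common affinity rule in
# Artificial Immune Systems. A detector "binds" a binarized handwriting
# feature vector if the two strings agree on at least r contiguous bits.
def r_contiguous_match(detector: str, pattern: str, r: int) -> bool:
    """True if detector and pattern agree on at least r contiguous bits."""
    run = best = 0
    for d, p in zip(detector, pattern):
        run = run + 1 if d == p else 0
        best = max(best, run)
    return best >= r

print(r_contiguous_match("101101", "101001", 3))  # True: "101" prefix matches
```

In an AIS identification loop, a writer's feature strings would be compared against a repertoire of such detectors and the strongest-binding detector's label returned.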
Abstract: The need for an appropriate system of evaluating students'
educational development is a key problem in achieving the predefined
educational goals. The number of related papers in recent years that
try to prove or disprove the necessity and adequacy of student
assessment corroborates this point. Some of these studies have
tried to increase the precision of determining question weights in
scientific examinations. But in all of them there has been an attempt
to adjust the initial question weights, while the accuracy and precision
of those initial question weights remain in question. Thus, in
order to increase the precision of the assessment process of students'
educational development, the present study proposes a new
method for determining the initial question weights by considering
factors of the questions such as difficulty, importance, and complexity,
and by implementing a combined method of PROMETHEE and fuzzy
analytic network process using a data mining approach to improve
the model's inputs. The result of the implemented case study demonstrates
the improved performance and precision of the proposed
model.
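The PROMETHEE half of the proposed combination can be sketched as a net-outranking-flow computation. The "usual" 0/1 preference function, the three-question scores, and the criterion weights below are illustrative assumptions, not the paper's data.

```python
# Minimal PROMETHEE II sketch with the "usual" preference function
# (P = 1 if alternative a beats b on a criterion, else 0).
def promethee_ii(scores, weights):
    n = len(scores)

    def pi(a, b):  # aggregated preference of alternative a over b
        return sum(w for s_a, s_b, w in zip(scores[a], scores[b], weights)
                   if s_a > s_b)

    phi = []
    for a in range(n):
        plus = sum(pi(a, b) for b in range(n) if b != a) / (n - 1)
        minus = sum(pi(b, a) for b in range(n) if b != a) / (n - 1)
        phi.append(plus - minus)
    return phi  # net outranking flow; higher = better ranked

# Three questions scored on difficulty, importance, complexity:
flows = promethee_ii([[3, 5, 2], [4, 4, 4], [2, 3, 1]], [0.4, 0.4, 0.2])
print(flows)
```

The net flows always sum to zero; their ordering gives the question ranking from which initial weights can be derived.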
Abstract: This paper presents an investigation of how exploiting multiple transmit antennas in OFDM-based wireless LAN subscribers can reduce the physical layer error rate. By comparing wireless LANs that utilize spatial diversity techniques with conventional ones, it reveals how PHY and TCP throughput behaviors are improved. It then assesses the same issues in a cellular operation context, which is introduced as an innovative solution that, besides a multi-cell operation scenario, benefits from spatio-temporal signaling schemes as well. The presented simulations shed light on the improved performance of the wide-range, high-quality wireless LAN services provided by the proposed approach.
Abstract: The building life cycle is never free of defects and deterioration. They are common problems in buildings, present in newly built as well as in aged buildings. Buildings constructed from wood are particularly affected by deterioration agents, and serious defects and damage can reduce a building's value. In repair works, it is important to identify the causes and the repair techniques that best suit the condition. This paper reviews the conservation of traditional timber mosques in Malaysia, comprising the concept, principles, and approaches of mosque conservation in general. As in conservation practice, wood in historic buildings can be conserved using various restoration and conservation techniques, which can be grouped as Full and Partial Replacement, Mechanical Reinforcement, Consolidation by Impregnation and Reinforcement, Removing Paint, and Preservation of Wood and Control of Insect Invasion, so as to prolong and extend the function of timber in a building. The review found that the techniques commonly adopted in timber mosque conservation are conventional ones, and that sound repair practice requires the use of only preserved wood to prevent premature future defects.
Abstract: With the aim of improving the nutritional profile and antioxidant capacity of gluten-free cookies, blueberry pomace, a by-product of juice production, was processed into a new food ingredient by drying and grinding and used in a gluten-free cookie formulation. Since the quality of a baked product is highly influenced by the baking conditions, the objective of this work was to optimize the baking time and thickness of dough pieces by applying Response Surface Methodology (RSM) in order to obtain the best technological quality of the cookies. The experiments were carried out according to a Central Composite Design (CCD) by selecting the dough thickness and baking time as independent variables, while hardness, color parameters (L*, a* and b* values), water activity, diameter, and short/long ratio were response variables. According to the results of the RSM analysis, a baking time of 13.74 min and a dough thickness of 4.08 mm were found to be optimal for a baking temperature of 170°C. As similar optimal parameters were obtained in a previously conducted experiment based on sensory analysis, RSM can be considered a suitable approach to optimizing the baking process.
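The RSM step described above can be sketched as a least-squares fit of a second-order model over CCD runs, followed by solving for the stationary point of the fitted surface. The response data below are synthetic (true optimum placed at coded x1 = 0.5, x2 = -0.25), not the paper's measurements.

```python
# Fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 to CCD
# runs by ordinary least squares, then read off the stationary point.
import numpy as np

# Central composite design in coded units: factorial + axial + centre points
a = np.sqrt(2)
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-a, 0], [a, 0], [0, -a], [0, a], [0, 0]])
y = 10 - (X[:, 0] - 0.5) ** 2 - (X[:, 1] + 0.25) ** 2  # synthetic response

x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones(len(X)), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
b = np.linalg.lstsq(A, y, rcond=None)[0]

# Stationary point of the fitted quadratic: solve gradient = 0
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
x_opt = np.linalg.solve(H, -b[1:3])
print(x_opt)  # ~ [0.5, -0.25] in coded units
```

Decoding the stationary point from coded units back to minutes and millimetres gives the optimal baking time and dough thickness.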
Abstract: This paper focuses on operational risk measurement
techniques and on economic capital estimation methods. A data
sample of operational losses provided by an anonymous Central
European bank is analyzed using several approaches. Loss
Distribution Approach and scenario analysis method are considered.
Custom plausible loss events defined in a particular scenario are
merged with the original data sample and their impact on capital
estimates and on the financial institution is evaluated. Two main
questions are assessed: what is the most appropriate statistical
method to measure and model the operational loss data distribution, and
what is the impact of hypothetical plausible events on the financial
institution? The g-and-h distribution was found to be the most
suitable one for operational risk modeling. The method based on the
combination of historical loss events modeling and scenario analysis
provides reasonable capital estimates and allows for the measurement
of the impact of extreme events on banking operations.
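Tukey's g-and-h distribution named above is defined by a transform of a standard normal variate; a minimal sampling sketch (with illustrative parameters, not the bank's fitted values) shows the right skew that makes it attractive for operational loss data.

```python
# Sample from the g-and-h distribution: X = A + B * (exp(g*Z)-1)/g * exp(h*Z^2/2)
# where Z is standard normal; g controls skewness, h controls tail heaviness.
import random, math

def g_and_h(z, A=0.0, B=1.0, g=0.5, h=0.2):
    """Transform a standard normal draw z into a g-and-h draw."""
    core = (math.exp(g * z) - 1) / g if g != 0 else z
    return A + B * core * math.exp(h * z * z / 2)

random.seed(7)
losses = [g_and_h(random.gauss(0, 1)) for _ in range(50_000)]

mean = sum(losses) / len(losses)
median = sorted(losses)[len(losses) // 2]
# Positive g skews the distribution to the right (heavy upper tail),
# which is what operational loss data typically exhibit.
print(mean > median)  # True: mean exceeds median under right skew
```

Capital estimates would then be read off high quantiles (e.g. the 99.9th percentile) of the aggregate loss distribution built from such draws.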
Abstract: Speckled images arise when coherent microwave,
optical, and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar
systems, and medical ultrasound systems. Speckle noise is a form of object or target induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted
by speckle noise is complicated by the nature of the noise and is not
as straightforward as detection and estimation in additive noise. In
this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The
motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this
context involves a partially developed speckle model in which an underlying Poisson point process modulates a Gram-Charlier series
of Laguerre weighted exponential functions, resulting in a doubly
stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form.
It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an
exponential distribution. This is consistent with fully developed speckle noise as demonstrated by the Central Limit theorem.
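The limiting behaviour stated in the last two sentences can be checked numerically: summing many random phasors per resolution cell and taking the intensity should yield a distribution whose standard deviation equals its mean, as for an exponential law. The scatterer model below is an illustrative simplification.

```python
# Simulate fully developed speckle: each resolution cell sums the complex
# returns of N scatterers with random amplitude and uniform random phase;
# intensity is the squared magnitude of the sum.
import random, math

def cell_intensity(n_scatterers):
    re = im = 0.0
    for _ in range(n_scatterers):
        amp = random.random()                    # random scatterer strength
        phase = random.uniform(0, 2 * math.pi)   # uniform phase (rough surface)
        re += amp * math.cos(phase)
        im += amp * math.sin(phase)
    return re * re + im * im

random.seed(1)
samples = [cell_intensity(100) for _ in range(20_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
ratio = math.sqrt(var) / mean
print(ratio)  # ~ 1: std/mean = 1 characterizes the exponential distribution
```

With few scatterers per cell the ratio departs from 1, which is the partially developed regime the derived model addresses.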
Abstract: Knowledge-based e-mail systems focus on
incorporating knowledge management approach in order to enhance
the traditional e-mail systems. In this paper, we present a knowledge-based
e-mail system called KS-Mail, where users not only send
and receive e-mail conventionally but are also able to create a sense
of knowledge flow. We introduce semantic processing on the e-mail
contents by automatically assigning categories and providing links to
semantically related e-mails. This is done to enrich the knowledge
value of each e-mail as well as to ease the organization of the e-mails
and their contents. At the application level, we have also built
components like the service manager, evaluation engine and search
engine to handle the e-mail processes efficiently by providing the
means to share and reuse knowledge. For this purpose, we present the
KS-Mail architecture, and elaborate on the details of the e-mail
server and the application server. We present the ontology mapping
technique used to achieve the categorization of e-mail contents as well
as the protocols that we have developed to handle the transactions in
the e-mail system. Finally, we discuss further on the implementation
of the modules presented in the KS-Mail architecture.
Abstract: A mobile ad hoc network (MANET) is a collection of
mobile devices which form a communication network with no pre-existing
wiring or infrastructure. Multiple routing protocols have
been developed for MANETs. As MANETs gain popularity, their
need to support real time applications is growing as well. Such
applications have stringent quality of service (QoS) requirements
such as throughput, end-to-end delay, and energy. Due to dynamic
topology and bandwidth constraints, supporting QoS is a challenging
task. QoS-aware routing is an important building block for QoS
support. The primary goal of a QoS-aware protocol is to determine
the path from source to destination that satisfies the QoS
requirements. This paper proposes a new energy- and delay-aware
protocol called energy and delay aware TORA (EDTORA), based on an
extension of the Temporally Ordered Routing Algorithm (TORA). Energy
and delay verification of the query packet is performed at each node.
Simulation results show that the proposed protocol has a higher
performance than TORA in terms of network lifetime, packet
delivery ratio and end-to-end delay.
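The per-node energy and delay verification of the query packet can be sketched as a simple admission test; the packet fields and thresholds below are illustrative assumptions, not the paper's packet format.

```python
# Sketch of the per-node check on an EDTORA-style route query: a node
# forwards the query only if its residual energy and the accumulated
# path delay still satisfy the QoS bounds carried in the packet.
def admit_query(node_energy, link_delay, packet):
    """Return the updated packet if the node can stay on the path, else None."""
    delay_so_far = packet["delay"] + link_delay
    if node_energy < packet["min_energy"] or delay_so_far > packet["max_delay"]:
        return None  # drop the query: QoS cannot be met through this node
    return {**packet, "delay": delay_so_far}

query = {"min_energy": 20.0, "max_delay": 100.0, "delay": 30.0}
print(admit_query(50.0, 40.0, query))  # forwarded, accumulated delay now 70.0
print(admit_query(10.0, 40.0, query))  # None: residual energy too low
```

Only paths whose every hop passes this test reach the destination, so the destination's route reply already satisfies the end-to-end QoS constraints.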
Abstract: To understand life as a biological system, an evolutionary
perspective is indispensable. Protein interaction data are rapidly
accumulating and are suitable for system-level evolutionary analysis.
We have analyzed the yeast protein interaction network by both
mathematical and biological approaches. In this poster presentation,
we inferred the evolutionary birth periods of yeast proteins by
reconstructing phylogenetic profiles. It has been thought that hub
proteins, which have a high connection degree, are evolutionarily old. But
our analysis showed that hub proteins are entirely evolutionarily new.
We also examined evolutionary processes of protein complexes. It
showed that member proteins of complexes tend to have
appeared in the same evolutionary period. Our results suggest that
the protein interaction network evolved by modules that form
functional units. We also reconstructed standardized phylogenetic trees
and calculated evolutionary rates of yeast proteins. It showed that
there is no obvious correlation between evolutionary rates and
connection degrees of yeast proteins.
Abstract: Response Surface Methodology (RSM) is a powerful
and efficient mathematical approach widely applied in the
optimization of cultivation processes. Cellulase enzyme production by
Trichoderma reesei RutC30 using the agricultural wastes rice straw and
banana fiber as carbon sources was investigated. In this work,
a sequential optimization strategy based on statistical design was
employed to enhance the production of cellulase enzyme through
submerged cultivation. A fractional factorial design (2^(6-2)) was applied
to elucidate the process parameters that significantly affect cellulase
production. Temperature, substrate concentration, inducer
concentration, pH, inoculum age, and agitation speed were identified
as important process parameters affecting cellulase enzyme synthesis.
The concentration of lignocelluloses and lactose (inducer) in the
cultivation medium were found to be most significant factors. The
steepest ascent method was used to locate the optimal domain and a
Central Composite Design (CCD) was used to estimate the quadratic
response surface from which the factor levels for maximum
production of cellulase were determined.
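The 2^(6-2) fractional factorial screening design can be generated from two generators; E = ABC and F = BCD is a common textbook resolution-IV choice (the paper does not state which generators it used).

```python
# Generate a 2^(6-2) fractional factorial design: a full 2^4 factorial in
# factors A-D, with E and F constructed from the generators E = ABC, F = BCD.
from itertools import product

runs = []
for a, b, c, d in product([-1, 1], repeat=4):
    e = a * b * c      # generator E = ABC
    f = b * c * d      # generator F = BCD
    runs.append((a, b, c, d, e, f))

print(len(runs))  # 16 runs instead of the 64 of a full 2^6 design
# Balance check: each factor sits at each level in exactly half the runs
print(all(sum(r[i] for r in runs) == 0 for i in range(6)))  # True
```

Main effects estimated from these 16 runs identify the significant factors, which the steepest-ascent and CCD stages then optimize.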
Abstract: For the past decade, biclustering has become a popular data mining technique, not only in the field of biological data analysis but also in other applications such as text mining and market data analysis with high-dimensional two-way datasets. Biclustering clusters both rows and columns of a dataset simultaneously, as opposed to traditional clustering, which clusters either rows or columns. It retrieves subgroups of objects that are similar in one subgroup of variables and different in the remaining variables. The Firefly Algorithm (FA) is a recently proposed metaheuristic inspired by the collective behavior of fireflies. This paper provides a preliminary assessment of a discrete version of FA (DFA) in coping with the task of mining coherent, large-volume biclusters from web usage data. The experiments were conducted on two web usage datasets from a public dataset repository, whereby the performance of DFA was compared with that exhibited by another population-based metaheuristic, binary Particle Swarm Optimization (PSO). The results achieved demonstrate the usefulness of DFA in tackling the biclustering problem.
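Bicluster coherence is typically scored with the Cheng-Church mean squared residue (MSR); a sketch of that fitness ingredient follows, assuming (as is common in metaheuristic biclustering) that the algorithm minimizes MSR while maximizing volume.

```python
# Mean squared residue (MSR) of a submatrix: a perfectly coherent
# bicluster (rows differing only by constant offsets) has MSR = 0.
def mean_squared_residue(sub):
    rows, cols = len(sub), len(sub[0])
    row_mean = [sum(r) / cols for r in sub]
    col_mean = [sum(sub[i][j] for i in range(rows)) / rows for j in range(cols)]
    total = sum(row_mean) / rows
    return sum(
        (sub[i][j] - row_mean[i] - col_mean[j] + total) ** 2
        for i in range(rows) for j in range(cols)
    ) / (rows * cols)

coherent = [[1, 2, 3], [2, 3, 4], [5, 6, 7]]   # additive (coherent) pattern
noisy = [[1, 9, 3], [2, 3, 4], [5, 6, 0]]
print(mean_squared_residue(coherent) < 1e-12)  # True: fully coherent
print(mean_squared_residue(noisy) > 1.0)       # True: incoherent entries
```

Each firefly (or particle) encodes a row/column selection, and its brightness is derived from the MSR and volume of the selected submatrix.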
Abstract: Diabetes mellitus is a chronic metabolic disorder in which improper management of blood glucose levels in diabetic patients leads to the risk of heart attack, kidney disease, and renal failure. This paper attempts to enhance the accuracy of predicting the advancing blood glucose levels of diabetic patients by combining principal component analysis and a wavelet neural network. The proposed system makes separate blood glucose predictions for the morning, afternoon, evening, and night intervals, using a dataset from one patient covering a period of 77 days. Comparisons of the diagnostic accuracy with other neural network models that use the same dataset are made. The comparison results showed overall improved accuracy, which indicates the effectiveness of the proposed system.
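The PCA stage can be sketched as an eigendecomposition of the feature covariance matrix, with the leading components feeding the wavelet neural network. The 77-by-6 feature matrix below is synthetic, not the patient's data.

```python
# PCA preprocessing sketch: centre the features, eigendecompose the
# covariance matrix, and keep the leading principal components.
import numpy as np

rng = np.random.default_rng(3)
# 77 days x 6 illustrative features (prior readings, insulin dose, ...)
X = rng.normal(size=(77, 6))
X[:, 1] = X[:, 0] * 2 + rng.normal(scale=0.1, size=77)  # a correlated pair

Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order[:3]]   # keep 3 principal components
scores = Xc @ components             # reduced inputs for the network

explained = eigvals[order[:3]].sum() / eigvals.sum()
print(scores.shape, round(explained, 2))
```

Reducing correlated inputs this way shrinks the network and tends to stabilize training on a dataset as small as 77 days.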
Abstract: The objective of this work is the probabilistic performance evaluation of safety instrumented systems (SIS), i.e. the average probability of dangerous failure on demand (PFDavg) and the average frequency of dangerous failure (PFH), taking into account the uncertainties related to the different parameters that come into play: failure rate (λ), common cause failure proportion (β), diagnostic coverage (DC)... This leads to an accurate and safe assessment of the safety integrity level (SIL) inherent to the safety function performed by such systems. This aim is in keeping with the requirements of the IEC 61508 standard with respect to handling uncertainty. To do this, we propose an approach that combines (1) Monte Carlo simulation and (2) fuzzy sets. Indeed, the first method is appropriate where representative statistical data are available (using probability density functions of the relevant parameters), while the latter applies in cases characterized by vague and subjective information (using membership functions). The proposed approach is fully supported by a suitable computer code.
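The Monte Carlo half of the approach can be sketched on the simplest 1oo1 architecture, where PFDavg ≈ λ_DU·TI/2; the lognormal uncertainty placed on λ_DU and all numeric values below are illustrative assumptions, not the paper's data.

```python
# Propagate uncertainty in the dangerous undetected failure rate
# (lambda_DU, lognormal here) through the standard 1oo1 approximation
# PFDavg ~ lambda_DU * TI / 2, then read the SIL band from the result.
import random, math

random.seed(42)
TI = 8760.0  # proof-test interval: one year, in hours
pfd_samples = []
for _ in range(100_000):
    lam = math.exp(random.gauss(math.log(1e-7), 0.3))  # lognormal lambda_DU
    pfd_samples.append(lam * TI / 2)

pfd_samples.sort()
median = pfd_samples[len(pfd_samples) // 2]
p95 = pfd_samples[int(0.95 * len(pfd_samples))]
print(median, p95)
# SIL 3 corresponds to 1e-4 <= PFDavg < 1e-3 (IEC 61508, low-demand mode)
print(1e-4 <= p95 < 1e-3)  # True: SIL 3 holds even at the 95th percentile
```

Using an upper percentile rather than the point estimate is what makes the resulting SIL claim "safe" with respect to parameter uncertainty; the fuzzy-set branch plays the analogous role when only expert judgment is available.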
Abstract: This paper focuses on the data-driven generation
of fuzzy IF...THEN rules. The resulting fuzzy rule base can be
applied to build a classifier, a model used for prediction, or
it can be applied to form a decision support system. Among
the wide range of possible approaches, the decision tree and
the association rule based algorithms are overviewed, and two
new approaches are presented based on an a-priori, fuzzy-clustering-based
partitioning of the continuous input variables.
An application study is also presented, where the developed
methods are tested on the well-known Wisconsin Breast Cancer
classification problem.
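The a-priori clustering-based partitioning can be sketched by turning cluster centres on a continuous input into a Ruspini partition of triangular membership functions; the centres below are illustrative stand-ins for, e.g., fuzzy c-means output.

```python
# Build one triangular membership function per cluster centre, with full
# overlap between neighbours so memberships sum to 1 (Ruspini partition).
def triangular_partition(centres):
    mfs = []
    for i, c in enumerate(centres):
        left = centres[i - 1] if i > 0 else c
        right = centres[i + 1] if i < len(centres) - 1 else c
        def mf(x, l=left, c=c, r=right):
            if x <= l or x >= r:
                return 1.0 if x == c else 0.0
            if x <= c:
                return (x - l) / (c - l) if c != l else 1.0
            return (r - x) / (r - c) if r != c else 1.0
        mfs.append(mf)
    return mfs

low, med, high = triangular_partition([0.0, 5.0, 10.0])
print(med(5.0), med(2.5), low(2.5))  # 1.0 0.5 0.5
```

Rule antecedents such as "IF x1 is low AND x2 is high THEN class = benign" are then expressed directly over these membership functions.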
Abstract: The method described in this paper deals with the problem of T-wave detection in an ECG. Determining the position of a T-wave is complicated by its low amplitude and its ambiguous, changing form. A wavelet transform approach handles these complications; therefore, a method based on this concept was developed. The resulting method is able to detect T-waves with a sensitivity of 93% and a correct-detection ratio of 93%, even in the presence of serious baseline drift and noise.
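The wavelet idea can be sketched by correlating the trace with a Mexican-hat (Ricker) wavelet at a scale near the T-wave width: the zero-mean kernel suppresses slow baseline drift while averaging out fast noise, so the low-amplitude T-wave emerges as a clear extremum. The signal and scale below are synthetic, not the paper's detector.

```python
# Correlate a drifting synthetic trace with a Ricker wavelet and locate
# the T-wave as the maximum of the wavelet response.
import math

def ricker(t, scale):
    x = t / scale
    return (1 - x * x) * math.exp(-x * x / 2)

def wavelet_response(signal, scale):
    half = 3 * scale
    kernel = [ricker(t, scale) for t in range(-half, half + 1)]
    n = len(signal)
    out = []
    for i in range(n):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < n:
                acc += w * signal[j]
        out.append(acc)
    return out

# Slow baseline drift + a small Gaussian "T-wave" centred at sample 300
signal = [0.5 * math.sin(i / 150.0) + 0.3 * math.exp(-((i - 300) / 20.0) ** 2)
          for i in range(600)]
resp = wavelet_response(signal, 20)
peak = max(range(len(resp)), key=lambda i: resp[i])
print(peak)  # near 300 despite the drift dwarfing the T-wave amplitude
```

A practical detector would search several scales and restrict the search window to the interval after each QRS complex.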
Abstract: In this paper, the generalized (2+1)-dimensional Calogero-Bogoyavlenskii-Schiff (CBS) equations are investigated. We employ Hirota's bilinear method to obtain the bilinear form of the CBS equations. Then, using the idea of the extended homoclinic test approach (EHTA), some exact soliton solutions, including breather-type solutions, are presented.
Abstract: In this paper, a sliding-mode torque and flux control is
designed for an encoderless synchronous reluctance motor drive. The
sliding-mode plus PI controllers are designed in the stator-flux
field-oriented reference frame and are able to track the
reference signals with minimum pulsations in the steady state. In
addition, with these controllers a fast dynamic response is also
achieved for the drive system. The proposed control scheme is robust
to parameter variations except for the stator resistance. To solve
this problem, a simple estimator is used for on-line estimation of this
parameter. Moreover, the rotor position and speed are estimated from
the on-line obtained stator-flux space vector. The effectiveness
and capability of the proposed control approach are verified by both
simulation and experimental results.
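The switching principle of sliding-mode control can be sketched on a toy first-order plant (not the motor model): the control acts only on the sign of the sliding surface, driving the tracking error to zero and then chattering tightly around the reference.

```python
# Sliding-mode sketch: surface s = T_ref - T, switching control u = k*sign(s),
# applied to an illustrative first-order plant dT/dt = u.
def simulate(T_ref=1.0, k=5.0, dt=1e-3, steps=2000):
    T = 0.0  # torque state
    history = []
    for _ in range(steps):
        s = T_ref - T                                 # sliding surface
        u = k * (1 if s > 0 else -1 if s < 0 else 0)  # switching control
        T += u * dt                                   # plant update
        history.append(T)
    return history

hist = simulate()
print(abs(hist[-1] - 1.0) < 0.01)  # True: state held tightly at T_ref
```

The chattering visible at this level is why practical drives pair the switching term with PI action and smooth it with a boundary layer, as the paper's combined sliding-mode-plus-PI structure does.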
Abstract: Starting from the basic pillars of supportability
analysis, this paper examines its characteristics in an LCI (Life Cycle
Integration) environment. The research methodology comprises a
review of modern logistics engineering literature with the objective of
collecting and synthesizing the knowledge relating to standards of
supportability design in an e-logistics environment. The results show
that the LCI framework has properties which are fully compatible
with the requirements of simultaneous logistics support and
product-service bundle design. The proposed approach is a contribution to a
more comprehensive and efficient supportability design process.
Also, contributions are reflected in a greater consistency of
collected data, automated creation of reports suitable for different
analyses, and the possibility of customizing them according
to customer needs. A further convenience of this approach
is its practical use in real time. In a broader sense, LCI allows
integration of enterprises on a worldwide basis, facilitating electronic
business.
Abstract: This contribution deals with current and potential approaches to the modeling and optimization of tactical activities. This issue has taken on importance in recent times, particularly with the increasing trend toward the digitized battlefield, the development of C4ISR systems, and the intention to streamline the command and control process at the lowest levels of command. From a fundamental and philosophical point of view, these new approaches seek to significantly upgrade and enhance the decision-making process of tactical commanders.