Abstract: This study proposes a materials procurement contract model
that applies the zero-cost collar option to hedge price fluctuation
risks in construction. The contract model based on the collar option
consists of the call option strike zone of the construction company
(the buyer), covering a materials price increase, and the put option
strike zone of the material vendor (the supplier), covering a
materials price decrease. The study first determined the call option
strike price Xc of the construction company by a simple approach
using the profit predicted at the project starting point, and then
determined the put option strike price Xp with an identical option
value, which completes the zero-cost material contract. The analysis
results indicate that the cost saving of the construction company
increased as Xc decreased, because the critical level of the steel
materials price increase was set at a low level. However, as Xc
decreased, the Xp of a put option with an identical option value
gradually increased, so the construction company's risk of loss from
a steel materials price decrease also increased. Meanwhile, the cost
saving of the construction company was not affected by volatility.
This result originates in the zero-cost feature of the two-way collar
option contract. With a regular one-way option, the transaction cost,
an option value that fluctuates with volatility, must be subtracted
from the cost saving; the cost saving of a one-way option is
therefore affected by volatility. The collar option, with zero
transaction cost, cuts the connection between volatility and cost
saving, but carries the risk that the put option will be exercised.
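As an illustration of the zero-cost construction, a minimal sketch: price the buyer's call at a chosen Xc under Black-Scholes (an assumed pricing model; the paper's own valuation approach may differ), then bisect for the put strike Xp whose premium matches, making the net contract cost zero. All parameter values below are hypothetical.

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    # Black-Scholes value of a European call.
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def bs_put(S, K, r, sigma, T):
    # Black-Scholes value of a European put.
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return K * exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)

def zero_cost_put_strike(S, Xc, r, sigma, T, tol=1e-10):
    # Bisect for the put strike Xp whose premium equals the call premium
    # at Xc, so the two-way collar contract has zero net cost.  Assumes an
    # out-of-the-money call (Xc > S), so a put struck at the spot price
    # brackets the target from above.
    target = bs_call(S, Xc, r, sigma, T)
    lo, hi = 1e-8, S
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_put(S, mid, r, sigma, T) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

With hypothetical inputs S = 100, Xc = 110, r = 0.02, sigma = 0.2, T = 1, the matching put strike Xp lands below the spot price; lowering Xc raises the call premium and hence pushes Xp upward, mirroring the trade-off the abstract describes.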
Abstract: In this paper, a new approach for target recognition based on the Empirical mode decomposition (EMD) algorithm of Huang et al. [11] and the energy tracking operator of Teager [13]-[14] is introduced. The conjunction of these two methods is called Teager-Huang analysis. This approach is well suited for nonstationary signal analysis. The impulse response (IR) of the target is first band-pass filtered into subsignals (components) called Intrinsic mode functions (IMFs) with well-defined Instantaneous frequency (IF) and Instantaneous amplitude (IA). Each IMF is a zero-mean AM-FM component. In the second step, the energy of each IMF is tracked using the Teager energy operator (TEO). The IF and IA, useful for describing the time-varying characteristics of the signal, are estimated using the Energy separation algorithm (ESA) of Maragos et al. [16]-[17]. In the third step, a set of features such as skewness and kurtosis is extracted from the IF, IA, and IMF energy functions. The Teager-Huang analysis is tested on a set of synthetic IRs of Sonar targets with different physical characteristics (density, velocity, shape, ...). PCA is first applied to the features to discriminate between manufactured and natural targets. The manufactured patterns are then classified into spheres and cylinders. One hundred percent correct recognition is achieved with twenty-three echoes, where the sixteen IRs used for training are noise free and the seven IRs used for the testing phase are corrupted with white Gaussian noise.
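The core operators are compact enough to sketch. Below are the discrete Teager energy operator and a DESA-2-style energy separation step (one common discrete variant; the cited algorithm of Maragos et al. may differ in its differencing scheme). For a pure sinusoid A*cos(Omega*n), the operator returns A^2*sin^2(Omega), and the separation step recovers A and Omega.

```python
import math

def teo(x):
    # Discrete Teager energy operator: Psi[x](n) = x(n)^2 - x(n-1)*x(n+1)
    return [x[n]**2 - x[n - 1] * x[n + 1] for n in range(1, len(x) - 1)]

def desa2(x):
    # DESA-2-style energy separation: estimate instantaneous frequency
    # (rad/sample) and amplitude from the Teager energies of the signal
    # and its central difference.
    y = lambda m: x[m + 1] - x[m - 1]          # central difference (no 1/2 factor)
    freqs, amps = [], []
    for n in range(2, len(x) - 2):
        px = x[n]**2 - x[n - 1] * x[n + 1]     # Psi[x](n)
        py = y(n)**2 - y(n - 1) * y(n + 1)     # Psi[y](n)
        c = max(-1.0, min(1.0, 1.0 - py / (2.0 * px)))
        freqs.append(0.5 * math.acos(c))       # instantaneous frequency
        amps.append(2.0 * px / math.sqrt(py))  # instantaneous amplitude
    return freqs, amps
```

On each zero-mean AM-FM IMF, the TEO output and the IF/IA tracks from the separation step are the time-varying functions from which features such as skewness and kurtosis would then be computed.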
Abstract: This study introduces a new method for detecting,
sorting, and localizing spikes from multiunit EEG recordings. The
method combines the wavelet transform, which localizes distinctive
spike features, with the Super-Paramagnetic Clustering (SPC) algorithm,
which allows automatic classification of the data without assumptions
such as low variance or Gaussian distributions. Moreover, the method
is capable of setting amplitude thresholds for spike detection. The
method was applied to several real EEG data sets, in which spikes were
detected, clustered, and their spike times determined.
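A common way to set such an amplitude threshold automatically (a technique widely used in wavelet-based spike sorting; the paper's exact rule is not given here) is a robust noise estimate from the median absolute deviation: sigma_n = median(|x|)/0.6745, threshold = k * sigma_n. The median keeps the estimate insensitive to the spikes themselves.

```python
import statistics

def detection_threshold(x, k=4.0):
    # Robust amplitude threshold: sigma_n = median(|x|)/0.6745 estimates the
    # noise standard deviation without being inflated by large spikes.
    sigma_n = statistics.median(abs(v) for v in x) / 0.6745
    return k * sigma_n

def detect_spikes(x, thr):
    # Indices where |x| first crosses the threshold (rising edge only),
    # so each spike is reported once.
    return [n for n in range(1, len(x))
            if abs(x[n]) >= thr and abs(x[n - 1]) < thr]
```

The detected crossings would then be windowed, wavelet-transformed, and passed to the clustering stage.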
Abstract: The main goal of this paper is to quantify the quality of
different radiation treatment planning techniques. A back-propagation
artificial neural network (ANN) combined with biomedicine theory
was used to model thirteen dosimetric parameters and to calculate
two dosimetric indices. The correlations between dosimetric indices
and quality of life were extracted as the features and used in the ANN
model to make decisions in the clinic. The simulation results show
that a trained multilayer back-propagation neural network model can
help a doctor accept or reject a plan efficiently. In addition, the
models are flexible and whenever a new treatment technique enters
the market, the feature variables simply need to be imported and the
model re-trained for it to be ready for use.
Abstract: The aim of this study is to emphasize the opportunities of space design as a performance area under the aspect of HCI. HCI is a multidisciplinary approach that can be identified in many different areas. The aesthetic reflections of HCI through virtual reality in space design are high-tech solutions in which new innovations serve as computational facilities with artistic features. The method of this paper is to treat the subject in three main parts: the first part gives a general approach to and definition of interactivity on the basis of space design; the second part discusses the concept of multimedia interactive theater through selected examples from around the world and interactive design aspects; and the third part identifies examples from Turkey in terms of stage design principles. The results indicate that the multimedia database is the virtual approach to theater stage design with regard to interactive means, using computational facilities according to aesthetic aspects. HCI is mostly identified in theater stages as computational intelligence under the effect of interactivity.
Abstract: This paper considers the influence of promotion
instruments for renewable energy sources (RES) on a multi-energy
modeling framework. In Europe, so-called Feed-in Tariffs are
successfully used as incentive structures to increase the amount of
energy produced by RES. Because of the stochastic nature of large
scale integration of distributed generation, many problems have
occurred regarding the quality and stability of supply. Hence, a
macroscopic model was developed in order to optimize the power
supply of the local energy infrastructure, which includes electricity,
natural gas, fuel oil and district heating as energy carriers. Unique
features of the model are the integration of RES and the adoption of
Feed-in Tariffs into one optimization stage. Sensitivity studies are
carried out to examine the system behavior under changing profits
for the feed-in of RES. With a setup of three energy exchanging
regions and a multi-period optimization, the impacts of costs and
profits are determined.
Abstract: A hybrid knowledge model is suggested as an underlying
framework for product development management. It can support such
hybrid features as ontologies and rules. Effective collaboration in a
product development environment depends on sharing and reasoning
about product information as well as engineering knowledge. Many
studies have considered product information and engineering
knowledge. However, most previous research has focused either on
building the ontology of product information or on rule-based systems
of engineering knowledge. This paper shows that an F-logic-based
knowledge model can support these desirable features in a hybrid way.
Abstract: The new idea of this research is the application of a new fault detection and isolation (FDI) technique for the supervision of sensor networks in transportation systems. In measurement systems, it is necessary to detect all types of faults and failures based on a predefined algorithm. Recent improvements in artificial neural network (ANN) studies have led to their use for some FDI purposes. In this paper, the application of new probabilistic neural network features for data approximation and data classification is considered for plausibility checks in temperature measurement. For this purpose, a two-phase FDI mechanism was considered for residual generation and evaluation.
Abstract: The application of Neural Network for disease
diagnosis has made great progress and is widely used by physicians.
An electrocardiogram carries vital information about heart activity, and physicians use this signal for cardiac disease diagnosis, which
was the main motivation for our study. In our work, tachycardia
features obtained are used for the training and testing of a Neural
Network. In this study we are using Fuzzy Probabilistic Neural
Networks as an automatic technique for ECG signal analysis. As
every real signal recorded by the equipment can have different
artifacts, we needed to do some preprocessing steps before feeding it
to our system. Wavelet transform is used for extracting the
morphological parameters of the ECG signal. The outcome of the
approach for a variety of arrhythmias shows that the presented
approach is superior to previously presented algorithms, with an
average accuracy of about 95% for more than 7 tachyarrhythmias.
Abstract: Object Relational Databases (ORDB) are more complex in
nature than traditional relational databases because they combine the
characteristics of both object oriented concepts and relational
features of conventional databases. Design of an ORDB demands
efficient and quality schema considering the structural, functional
and componential traits. This internal quality of the schema is
assured by metrics that measure the relevant attributes. This is
extended to substantiate the understandability, usability and
reliability of the schema, thus assuring external quality of the
schema. This work institutes a formalization of ORDB metrics;
metric definition, evaluation methodology and the calibration of the
metric. Three ORDB schemas were used to conduct the evaluation
and the formalization of the metrics. The metrics are calibrated using
content and criteria related validity based on the measurability,
consistency and reliability of the metrics. Nominal and summative
scales are derived based on the evaluated metric values and are
standardized. Future work pertaining to ORDB metrics forms the
concluding note.
Abstract: A theory for optimal filtering of infinite sets of random
signals is presented. There are several new distinctive features of the
proposed approach. First, a single optimal filter for processing any
signal from a given infinite signal set is provided. Second, the filter is
presented in the special form of a sum with p terms where each term
is represented as a combination of three operations. Each operation
is a special stage of the filtering aimed at facilitating the associated
numerical work. Third, an iterative scheme is implemented into the
filter structure to provide an improvement in the filter performance at
each step of the scheme. The final step of the scheme concerns signal
compression and decompression. This step is based on the solution of
a new rank-constrained matrix approximation problem. The solution
to the matrix problem is described in this paper. A rigorous error
analysis is given for the new filter.
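The compression step rests on low-rank matrix approximation. A minimal illustration (the paper's rank-constrained problem is more general): the best rank-1 approximation in the Frobenius norm, per the Eckart-Young theorem, computed here by power iteration in plain Python.

```python
import math

def rank1_approx(A, iters=200):
    # Best rank-1 approximation of matrix A (list of rows): power iteration
    # on A^T A finds the leading right singular vector v, from which the
    # singular value sigma and left vector u follow; A ~ sigma * u v^T.
    m, n = len(A), len(A[0])
    v = [1.0 / math.sqrt(n)] * n
    for _ in range(iters):
        Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]
        w = [sum(A[i][j] * Av[i] for i in range(m)) for j in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]
    sigma = math.sqrt(sum(x * x for x in Av))
    u = [x / sigma for x in Av]
    return [[sigma * u[i] * v[j] for j in range(n)] for i in range(m)]
```

Truncating to rank p in this way is what makes the compression/decompression step optimal among all rank-p reconstructions.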
Abstract: Microtomographic images and thin section (TS)
images were analyzed and compared against some parameters of
geological interest such as porosity and its distribution along the
samples. The results show that microtomography (CT) analysis,
although limited by its resolution, provides interesting information
about the distribution of porosity (homogeneous or not) and can also
quantify the connected and non-connected pores, i.e., total porosity.
TS analysis has no limitations concerning resolution, but is limited
by the experimental data available (only a few glass slides for
analysis) and can give information only about the connected pores,
i.e., effective porosity. The two methods have their own virtues and
flaws, but when paired they complement one another, making for a
more reliable and complete analysis.
Abstract: This is applied research proposing a price quotation
method for a contract electronics manufacturer. The company had a
precise price quoting method, but that method could not provide a
result as quickly as the customer required, which reduced the
company's ability to compete in this kind of business. In this case,
the cause of the long quotation process was analyzed; a large number
of product features are demanded by customers. By checking the
routine processes, it was found that a high fraction of the quoting
time was used for estimating production time, which affects the
manufacturing or production cost. Historical product data, including
types, number of components, assembly method, and assembly time,
were then used to analyze the key components affecting production
time, and a price quoting model was proposed. Implementation of the
proposed model remarkably reduced quoting time with an acceptable
required precision.
Abstract: Monitoring lightning electromagnetic pulses (sferics) and other terrestrial as well as extraterrestrial transient radiation signals is of considerable interest for practical and theoretical purposes in astro- and geophysics as well as meteorology. To manage a continuous flow of data, automation of the analysis and classification process is important. Features based on a combination of wavelet and statistical methods proved efficient for this task and serve as input into a radial basis function network that is trained to discriminate transient shapes from pulse-like to wave-like. We concentrate on signals in the Very Low Frequency (VLF, 3-30 kHz) range in this paper, but the developed methods are independent of this specific choice.
Abstract: Automatic reading of handwritten cheques is a computationally
complex process and plays an important role in financial
risk management. Machine vision and learning provide a viable
solution to this problem. Research effort has mostly been focused
on recognizing diverse pitches of cheques and demand drafts with an
identical outline. However, most of these methods employ template
matching to localize the pitches, and such schemes could potentially
fail when applied to different types of outline maintained by the
bank. In this paper, the so-called outline problem is resolved by
a cheque information tree (CIT), which generalizes the localizing
method to extract active-region-of-entities. In addition, the weight
based density plot (WBDP) is performed to isolate text entities and
read complete pitches. Recognition is based on texture features using
neural classifiers. Legal amount is subsequently recognized by both
texture and perceptual features. A post-processing phase is invoked
to detect the incorrect readings by Type-2 grammar using the Turing
machine. The performance of the proposed system was evaluated
using cheques and demand drafts of 22 different banks. The test data
consists of a collection of 1540 leaves obtained from 10 different
account holders from each bank. Results show that this approach
can easily be deployed without significant design amendments.
Abstract: This article considers the main features of party
construction in the course of the political modernization of
Kazakhstan. Along with party construction, the author analyzes how
the transformation of the party system was carried out in Kazakhstan.
The basic stages in the course of party construction are also
explained, and statistical data are cited.
Abstract: The goal of this research is to discover the
determinants of the success or failure of external cooperation in small
and medium enterprises (SMEs). For this, a survey was given to 190
SMEs that experienced external cooperation within the last 3 years. A
logistic regression model was used to derive organizational or strategic
characteristics that significantly influence whether external
collaboration of domestic SMEs is successful. Results suggest that
R&D features among the general characteristics (both idea creation
and discovering market opportunities) and, among the innovative
strategic characteristics, strategies that focus on and emphasize
indirect-market stakeholders (such as complementary companies and
affiliates) raise the probability of successful external cooperation.
This can be used meaningfully to
build a policy or strategy for inducing successful external cooperation
or to understand the innovation of SMEs.
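The estimation step can be sketched with a plain logistic regression fit by gradient descent (illustrative only; the study's actual covariates, coding, and estimation software are not specified here). Each row encodes an SME's characteristics; the label is whether its external cooperation succeeded.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    # Stochastic gradient descent on the logistic log-likelihood.
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi                      # gradient of the log-loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    # Classify success (1) vs. failure (0) at the 0.5 probability cutoff.
    return 1 if sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) >= 0.5 else 0
```

The fitted coefficients play the role of the organizational and strategic characteristics' influence on the success probability in the study.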
Abstract: Feature selection is gaining importance due to its contribution to saving classification cost in terms of time and computational load. One of the methods for finding essential features is via the decision tree, which acts as an intermediate feature-space inducer for choosing essential features. In decision-tree-based feature selection, some studies used the decision tree as a feature ranker with a direct threshold measure, while others kept the decision tree but utilized a pruning condition that acts as a threshold mechanism for choosing features. This paper proposes a threshold measure using the Manhattan hierarchical cluster distance for feature ranking, in order to choose relevant features as part of the feature selection process. The results are promising, and the method can be improved in the future by including test cases with a higher number of attributes.
Abstract: Classification of video sequences based on their contents is a vital process for adaptation techniques. It helps decide which adaptation technique best fits the resource reduction requested by the client. In this paper we used the principal feature analysis algorithm to select a reduced subset of video features. The main idea is to select only one feature from each class based on the similarities between the features within that class. Our results showed that using this feature reduction technique the source video features can be completely omitted from future classification of video sequences.
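A simplified stand-in for the idea (principal feature analysis proper clusters PCA loading vectors; here pairwise correlation is used directly as the similarity measure, purely for illustration): group features that are highly similar and keep one representative per group, so the rest can be omitted from classification.

```python
import math

def pearson(a, b):
    # Pearson correlation between two equal-length feature vectors.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def select_representatives(features, thresh=0.9):
    # Greedy grouping: a feature highly correlated (|r| >= thresh) with an
    # already selected one is considered redundant and dropped; otherwise
    # it becomes the representative of a new group.
    selected = []
    for i, f in enumerate(features):
        if all(abs(pearson(f, features[j])) < thresh for j in selected):
            selected.append(i)
    return selected
```

The surviving indices play the role of the reduced feature subset the abstract describes: one feature per similarity class, with the redundant ones omitted.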
Abstract: The use of machine vision to inspect the outcome of
surgical tasks is investigated, with the aim of incorporating this
approach in robotic surgery systems. Machine vision is a non-contact
form of inspection, i.e., no part of the vision system is in direct
contact with the patient, and is therefore well suited for surgery,
where sterility is an important consideration. As a proof of concept, three
primary surgical tasks for a common neurosurgical procedure were
inspected using machine vision. Experiments were performed on
cadaveric pig heads to simulate the two possible outcomes, i.e.,
satisfactory or unsatisfactory, for the tasks involved in making a burr
hole, namely incision, retraction, and drilling. We identify low level
image features to distinguish the two outcomes, as well as report on
results that validate our proposed approach. The potential of using
machine vision in a surgical environment, and the challenges that
must be addressed, are identified and discussed.