Abstract: A Decision Support System/Expert System for stock portfolio selection is presented. In the first phase, both technical and fundamental data are used to estimate technical and fundamental return and risk; in the second phase, the estimated values are aggregated with the investor's preferences to produce a suitable stock portfolio.
In the first phase, there are two expert systems, each responsible for either technical or fundamental estimation. In the technical expert system, twenty-seven candidate variables are identified for each stock, and the effective variables are selected using a rough-sets-based clustering method (RC). Next, for each stock, two fuzzy rule-bases are developed with the fuzzy C-means method and the Takagi-Sugeno-Kang (TSK) approach: one for return estimation and the other for risk. Thereafter, the parameters of the rule-bases are tuned with the backpropagation method. In parallel, for the fundamental expert system, fuzzy rule-bases have been identified in the form of "IF-THEN" rules through brainstorming with stock market experts, with the input data derived from financial statements; as a result, two fuzzy rule-bases have been generated for all the stocks, one for return and the other for risk.
In the second phase, user preferences are represented by four criteria and obtained by questionnaire. Using an expert system, the four estimated values of return and risk are aggregated with the respective user-preference values. Finally, a fuzzy rule-base of four rules processes these values and produces a ranking score for each stock, leading to a satisfactory portfolio for the user. The stocks of six manufacturing companies over the period 2003-2006 were selected for data gathering.
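As a self-contained illustration of the kind of inference the rule-bases perform, the sketch below implements a generic first-order Takagi-Sugeno-Kang step with Gaussian memberships. The rule count, membership parameters, and consequents are toy assumptions for the demo, not the paper's tuned rule-bases.

```python
import numpy as np

def tsk_estimate(x, centers, sigmas, consequents):
    """First-order Takagi-Sugeno-Kang inference (illustrative sketch).

    x           : input feature vector, shape (d,)
    centers     : Gaussian membership centers per rule, shape (r, d)
    sigmas      : Gaussian membership widths per rule, shape (r, d)
    consequents : linear consequent parameters [a_1..a_d, b] per rule, shape (r, d+1)
    """
    # Firing strength of each rule: product of Gaussian memberships.
    w = np.exp(-0.5 * ((x - centers) / sigmas) ** 2).prod(axis=1)
    # Linear consequent output of each rule.
    y = consequents[:, :-1] @ x + consequents[:, -1]
    # Defuzzified output: firing-strength-weighted average of rule outputs.
    return float(np.dot(w, y) / w.sum())

# Toy example: two rules over a single feature (made-up values).
centers = np.array([[0.0], [1.0]])
sigmas = np.array([[0.5], [0.5]])
consequents = np.array([[0.0, 0.0],   # near x=0 -> estimated return ~0
                        [0.0, 1.0]])  # near x=1 -> estimated return ~1
est = tsk_estimate(np.array([0.5]), centers, sigmas, consequents)
```

Halfway between the two rule centers, both rules fire equally and the output is the midpoint of the two consequents.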
Abstract: In this paper, the energetic features of loaded gait are analyzed, for the first time, as a function of the change in trunk flexion. To investigate loaded gait, walking experiments were performed on five subjects, and the ground reaction forces and kinematic data were measured. Based on this information, we compute the impulse, momentum, and mechanical work done on the center of body mass as the trunk flexion changes. The results show that the trunk flexion change does not affect the impulses and momenta during the step-to-step transition. However, the direction of the pre-collision momentum does change with trunk flexion, and this difference disappears just after (or during) the collision period.
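The impulse quantities above follow directly from integrating the measured ground reaction force over time; a minimal sketch with a made-up constant force and an assumed subject mass:

```python
import numpy as np

def impulse(force, t):
    """Impulse of a sampled force trace: trapezoidal integral of F over t."""
    return float(np.sum((force[1:] + force[:-1]) * np.diff(t)) / 2.0)

# Hypothetical numbers: a constant 700 N vertical GRF over a 0.5 s stance phase.
t = np.linspace(0.0, 0.5, 51)
f = np.full_like(t, 700.0)
J = impulse(f, t)            # 350 N*s
# By the impulse-momentum theorem, the change in momentum equals the impulse:
mass = 70.0                  # kg (assumed subject mass)
dv = J / mass                # velocity change of the center of mass, m/s
```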
Abstract: In this paper, we present a novel approach to accurately detect text regions, including shop names, in signboard images with complex backgrounds for mobile system applications. The proposed method combines text detection using edge profiles with region segmentation using the fuzzy c-means method. In the first step, we apply the Canny edge operator to extract all possible object edges. Then, edge-profile analysis in the vertical and horizontal directions is performed on these edge pixels to detect potential text regions containing the shop name in a signboard. The edge profile and geometrical characteristics of each object contour are carefully examined to construct candidate text regions and to separate the main text region from the background. Finally, the fuzzy c-means algorithm is applied to segment and binarize the detected text region. Experimental results show that the proposed method is robust in detecting text of different character sizes and colors and provides reliable text binarization.
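The fuzzy c-means step used for binarization can be sketched on 1-D pixel intensities. This is a plain FCM implementation on toy data, not the paper's full detection pipeline:

```python
import numpy as np

def fuzzy_cmeans_1d(x, c=2, m=2.0, n_iter=50, seed=0):
    """Plain fuzzy c-means on 1-D data (e.g. pixel intensities); illustrative sketch."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                      # memberships sum to 1 per point
    for _ in range(n_iter):
        um = u ** m
        centers = um @ x / um.sum(axis=1)   # fuzzily weighted cluster centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))      # update memberships from distances
        u /= u.sum(axis=0)
    return centers, u

# Binarize toy "pixel" intensities: dark text on a bright background.
pixels = np.array([10., 12., 11., 200., 205., 198., 9., 202.])
centers, u = fuzzy_cmeans_1d(pixels)
text_cluster = int(np.argmin(centers))           # darker cluster taken as text
binary = (np.argmax(u, axis=0) == text_cluster)  # True where pixel is "text"
```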
Abstract: The aim of this study is to identify the conditions for implementing reconfigurability by summarizing past flexible manufacturing systems (FMS) research and drawing overall conclusions from many separate High Performance Manufacturing (HPM) studies. Meta-analysis will be applied to the links between HPM programs and their practices related to FMS and manufacturing performance, with particular reference to responsiveness performance. More specifically, meta-analysis will be applied to two of the main steps towards the development of an empirically tested theory: testing the adequacy of the measurement of variables and testing the linkages between the variables.
Abstract: Soy polyol, obtained from hydroxylation of soy epoxide with ethylene glycol, was prepared as a pre-polyurethane. A two-step process was applied in the polyurethane synthesis. The blend of soy polyol and synthetic polyol was then reacted simultaneously with TDI (2,4) : MDI (4,4') (80:20), a blowing agent, and a surfactant. Ethylene glycol did not take part in the polyurethane synthesis; a formulation including ethylene glycol was used as a control. Characterization of the polyurethane foam through impact resilience, indentation deflection, and density can establish the polyurethane classification.
Abstract: The applications of VLSI circuits in high-performance computing, telecommunications, and consumer electronics have been expanding progressively, and at a very rapid pace. This paper describes a new model for partitioning a circuit using DBSCAN and a fuzzy ARTMAP neural network. The first step is feature extraction, for which we make use of the DBSCAN algorithm. The second step is classification, performed by a fuzzy ARTMAP neural network. The performance of both approaches is compared using benchmark data from the MCNC standard cell placement benchmark netlists. Analysis of the experimental results shows that the fuzzy ARTMAP with DBSCAN model achieves better performance than fuzzy ARTMAP alone in recognizing sub-circuits with the lowest number of interconnections between them. The recognition rate using fuzzy ARTMAP with DBSCAN is 97.7%, compared to fuzzy ARTMAP alone.
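For readers unfamiliar with DBSCAN, the clustering step can be sketched minimally as below; the circuit features themselves are replaced by toy 2-D points, and this is not the paper's feature-extraction pipeline:

```python
import numpy as np

def dbscan(X, eps=0.5, min_pts=3):
    """Minimal DBSCAN sketch. Returns one label per point: 0, 1, ... for
    clusters, -1 for noise."""
    n = len(X)
    # Pairwise distances and epsilon-neighborhoods (a point is its own neighbor).
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    neigh = [np.flatnonzero(d[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neigh[i]) < min_pts:
            continue                      # already clustered, or not a core point
        labels[i] = cluster               # seed a new cluster from core point i
        stack = list(neigh[i])
        while stack:                      # expand the cluster density-reachably
            j = stack.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neigh[j]) >= min_pts:
                    stack.extend(neigh[j])
        cluster += 1
    return labels

# Two dense groups of points and one isolated outlier.
X = np.array([[0, 0], [0, .1], [.1, 0], [.1, .1],
              [5, 5], [5, 5.1], [5.1, 5], [5.1, 5.1],
              [10, 10.]])
labels = dbscan(X)   # two clusters, last point labeled as noise
```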
Abstract: In this paper we propose a method which improves the efficiency of video coding. Our method combines an adaptive GOP (group of pictures) structure with shot cut detection. We have analyzed different approaches to shot cut detection with the aim of choosing the most appropriate one. The next step is to place N frames at the positions of the detected cuts during the video encoding process. Finally, the efficiency of the proposed method is confirmed by simulations, and the obtained results are compared with fixed GOP structures of sizes 4, 8, 12, 16, 32, 64, and 128 and with a GOP structure spanning the entire video. The proposed method achieved a bit-rate gain from 0.37% to 50.59%, while providing a PSNR (Peak Signal-to-Noise Ratio) gain from 1.33% to 0.26% in comparison with the simulated fixed GOP structures.
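One common baseline for the shot-cut detection step is thresholding the histogram difference between consecutive frames; the sketch below uses that baseline on synthetic frames (the abstract compares several detectors, and this is not necessarily the one it selects):

```python
import numpy as np

def detect_cuts(frames, bins=16, threshold=0.5):
    """Detect hard cuts by thresholding the normalized L1 histogram difference
    between consecutive frames (illustrative baseline detector)."""
    hists = []
    for f in frames:
        h, _ = np.histogram(f, bins=bins, range=(0, 256))
        hists.append(h / h.sum())           # normalized grey-level histogram
    cuts = []
    for i in range(1, len(hists)):
        # Half the L1 distance lies in [0, 1]; large values indicate a cut.
        if np.abs(hists[i] - hists[i - 1]).sum() / 2 > threshold:
            cuts.append(i)                   # a new shot starts at frame i
    return cuts

# Synthetic video: 5 dark frames, then 5 bright frames -> one cut at frame 5.
dark = np.full((8, 8), 20, dtype=np.uint8)
bright = np.full((8, 8), 220, dtype=np.uint8)
cuts = detect_cuts([dark] * 5 + [bright] * 5)
```

An adaptive-GOP encoder would then start a new GOP at each detected index.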
Abstract: When an assignable cause manifests itself in a multivariate process and the process shifts to an out-of-control condition, a root-cause analysis should be initiated by quality engineers to identify and eliminate the assignable cause(s) affecting the process. A root-cause analysis in a multivariate process is more complex than in a univariate process. For a process involving several correlated variables, an effective root-cause analysis is only possible when all the required knowledge can be identified simultaneously: the out-of-control condition, the change point, and the variable(s) responsible for the out-of-control condition. Although the literature addresses different schemes for monitoring multivariate processes, few scientific reports cover all of the required knowledge. To the best of the author's knowledge, this is the first time that a multi-task model based on an artificial neural network (ANN) has been reported to provide all the required knowledge at the same time for a multivariate process with more than two correlated quality characteristics. The performance of the proposed scheme is evaluated numerically for different step shifts in the mean vector. The average run length is used to assess the performance of the proposed multi-task model. The simulation results indicate that the multi-task scheme provides all the required knowledge effectively.
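For context, the average-run-length (ARL) criterion can be illustrated with a classical Hotelling T² chart in place of the ANN scheme; the covariance structure, shift size, and control limit below are assumptions for the demo:

```python
import numpy as np

def run_length(shift, p=3, rho=0.5, ucl=14.16, rng=None):
    """One simulated run length of a Hotelling T^2 chart under a step mean shift.

    ucl is roughly the 0.9973 chi-square quantile for p=3, and rho is an
    assumed equicorrelation between the quality characteristics."""
    rng = rng or np.random.default_rng(0)
    sigma = np.full((p, p), rho) + (1 - rho) * np.eye(p)   # equicorrelated covariance
    sigma_inv = np.linalg.inv(sigma)
    L = np.linalg.cholesky(sigma)
    t = 0
    while True:
        t += 1
        x = L @ rng.standard_normal(p) + shift             # shifted observation
        if x @ sigma_inv @ x > ucl:                        # out-of-control signal
            return t

# Average run length: large shifts are detected much faster than no shift.
rng = np.random.default_rng(1)
arl_shift = np.mean([run_length(np.full(3, 3.0), rng=rng) for _ in range(200)])
arl_ic = np.mean([run_length(np.zeros(3), rng=rng) for _ in range(50)])
```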
Abstract: We propose a new normalized LMS (NLMS) algorithm, which gives satisfactory performance in certain applications in comparison with the conventional NLMS recursion. The new algorithm can be treated as a block-based simplification of the NLMS algorithm with a significantly reduced number of multiply-and-accumulate as well as division operations. It is also shown that such a recursion can be easily implemented in block floating point (BFP) arithmetic, handling the implementation issues efficiently. In particular, the core challenges of a BFP realization of such adaptive filters are considered in this regard. A global upper bound on the step-size control parameter of the new algorithm under BFP implementation is also proposed to prevent overflow in the filtering as well as the weight-updating operations.
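The conventional NLMS recursion that the proposed block-based variant simplifies can be sketched as follows (baseline only, not the proposed algorithm; the filter order, step size, and test system are illustrative):

```python
import numpy as np

def nlms_identify(x, d, order=4, mu=0.5, eps=1e-8):
    """Conventional sample-by-sample NLMS recursion, system-identification form."""
    w = np.zeros(order)
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]      # regressor: newest sample first
        e = d[n] - w @ u                      # a-priori error
        w += (mu / (eps + u @ u)) * e * u     # normalized weight update
    return w

# Identify a known 4-tap FIR system from noise-free data.
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])           # "unknown" system (assumed for the demo)
x = rng.standard_normal(2000)
d = np.convolve(x, h)[:len(x)]                # desired signal = system output
w = nlms_identify(x, d)                       # w converges towards h
```

The block-based simplification in the abstract amortizes the normalization (and hence the division) over a block of samples rather than performing it per sample.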
Abstract: Noise in a micro stepping motor is investigated in this article. Because of the trend towards higher precision and ever smaller 3C (Computer, Communication, and Consumer Electronics) products, the micro stepping motor is frequently used to drive micro systems and other 3C products. Unfortunately, the noise in a micro stepping motor is too large to be accepted by customers. To suppress the noise of a micro stepping motor, the dynamic characteristics of the system must be studied. In this article, a micro stepping motor in a digital camera, speed-controlled by a Visual Basic (VB) computer program, is investigated. A Kaman KD2300-2S non-contact eddy-current displacement sensor, a probe microphone, and an HP 35670A analyzer are employed to analyze the dynamic characteristics of vibration and noise in the motor. The vibration and noise measurements for different types of bearings and different treatments of the coils are compared. The rotating components of the motor, such as the bearings and coil, play important roles in producing vibration and noise. It is found that the noise is reduced by about 3~4 dB when the copper bearing is replaced with a plastic one, and by about 6~7 dB when the motor coil is coated with paraffin wax.
Abstract: Document image processing has become an increasingly important technology in the automation of office documentation tasks. During document scanning, skew is inevitably introduced into the incoming document image. Since algorithms for layout analysis and character recognition are generally very sensitive to page skew, skew detection and correction in document images are critical steps before layout analysis. In this paper, a novel skew detection method is presented for binary document images. The method considers selected characters of the text, which are subjected to thinning and the Hough transform to estimate the skew angle accurately. Several experiments have been conducted on various types of documents, such as English documents, journals, textbooks, documents in different languages, documents with different fonts, and documents with different resolutions, to demonstrate the robustness of the proposed method. The experimental results show that the proposed method is accurate compared with well-known existing methods.
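The Hough-flavored idea behind skew estimation can be sketched as a projection-profile voting scheme over candidate angles; this is an illustrative stand-in, not the paper's thinning + Hough-transform implementation:

```python
import numpy as np

def estimate_skew(ys, xs, angles_deg):
    """Estimate page skew from foreground pixel coordinates: try candidate
    angles and keep the one whose projection profile is most peaked
    (text lines aligned with the projection direction give sharp peaks)."""
    best_angle, best_score = 0.0, -np.inf
    for a in angles_deg:
        t = np.deg2rad(a)
        r = xs * np.sin(t) + ys * np.cos(t)        # project pixels at angle a
        hist, _ = np.histogram(r, bins=64)
        score = np.sum(hist.astype(float) ** 2)    # peaked profile -> high score
        if score > best_score:
            best_angle, best_score = a, score
    return best_angle

# Synthetic page: five horizontal "text lines" of pixels, rotated by 5 degrees.
xs = np.tile(np.arange(0, 200, 2.0), 5)
ys = np.repeat(np.arange(0.0, 100.0, 20.0), 100)
theta = np.deg2rad(5.0)
xr = xs * np.cos(theta) - ys * np.sin(theta)
yr = xs * np.sin(theta) + ys * np.cos(theta)
# The detector returns the angle that re-aligns the lines, i.e. -5 degrees here.
skew = estimate_skew(yr, xr, np.arange(-10.0, 10.5, 0.5))
```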
Abstract: Transpedicular screw fixation in spinal fractures, degenerative changes, or deformities is a well-established procedure. However, a considerable rate of fixation failure due to screw bending, loosening, or pullout is still reported, particularly in the weak bone stock of osteoporosis. To overcome the problem, the mechanism of failure has to be fully investigated in vitro. Post-mortem human subjects are not readily accessible, and animal cadavers have limitations due to their different geometry and mechanical properties. Therefore, the development of a synthetic model mimicking the realistic human vertebra is in high demand. A bone surrogate composed of polyurethane (PU) foam, analogous to the porous structure of cancellous bone, was tested at three different densities in this study. The mechanical properties were investigated under uniaxial compression testing while minimizing the end artifacts on the specimens. The results indicate that PU foam of 0.32 g/cm3 density has mechanical properties comparable to human cancellous bone in terms of Young's modulus and yield strength. Therefore, the obtained information can be considered a primary step towards developing a realistic cancellous bone model of the human vertebral body. Further evaluations are also recommended for other density groups.
Abstract: The rapid development of manufacturing and information systems has caused significant changes in manufacturing environments in recent decades. Mass production has given way to flexible manufacturing systems, an important characteristic of which is customized or "on demand" production. In this scenario, a seamless, gap-free information flow becomes a key factor for the success of enterprises. In this paper we present a framework to support the mapping of features into machining workingsteps compliant with the ISO 14649 standard (known as STEP-NC). The system determines how the features can be made with the available manufacturing resources. Examples of the mapping method are presented for features such as a pocket with a general surface.
Abstract: The aim of this study was to compare the sensitometric properties of commonly used radiographic films processed with chemical solutions in hospitals with different workloads. The effect of different processing conditions on the densities induced on radiographic films was investigated. Two readily available double-emulsion films, Fuji and Kodak, were exposed with an 11-step wedge and processed with Champion and CPAC processing solutions. The films were obtained from both high- and low-workload centers. Our findings show that the speed and contrast of the Kodak film-screen in both workloads (high and low) are higher than those of the Fuji film-screen for both processing solutions. However, there were significant differences in film contrast for both workloads when the CPAC solution was used (p=0.000 and 0.028). The results also show that the base-plus-fog density of the Kodak film was lower than that of the Fuji film. In general, the Champion processing solution produced higher speed and contrast for the investigated films under different conditions, and there were significant differences at the 95% confidence level between the two processing solutions (p=0.01). The low base-plus-fog density of Kodak films provides more visibility and accuracy, and their higher contrast allows lower exposure factors to be used to obtain better-quality radiographs. In this study we found an economic advantage in using the Champion solution with Kodak film, which also results in a lower patient dose. Thus, in a radiologic facility, any change in the film processor, processing cycle, or chemistry should be carefully investigated before patient radiographs are acquired.
Abstract: The plastic forming of sheet plate occupies an important place in metal forming. The traditional tool-design techniques used in industry for sheet forming operations are experimental and expensive. Predicting the forming results and determining the punching force, the blank holder forces, and the thickness distribution of the sheet metal will decrease the production cost and time of the material to be formed. In this paper, a multi-stage deep drawing simulation of an industrial part is presented using the finite element method. All the production steps, together with additional operations such as intermediate annealing and springback, have been simulated with ABAQUS software under axisymmetric conditions. Simulation results such as the sheet thickness distribution, punch force, and residual stresses have been extracted at each stage, and the sheet thickness distribution was compared with experimental results. The comparison shows that the FE model is in close agreement with the experiments.
Abstract: The cable tower of Liede Bridge is a double-column, curved-lever, arched-beam portal frame structure. Being novel and unique in structure, the cable tower differs in complexity from traditional ones. This paper analyzes the ultimate load capacity of the cable tower using finite element calculations and model tests, which indicate that the constitutive relations applied here give a good simulation of the actual failure process of prestressed reinforced concrete. In the vertical-load, horizontal-load, and overloading tests, the stepped loading of the tower model exhibits a linear relationship, and the test data show good repeatability. All of this suggests that the cable tower has good bearing capacity, a rational design, and high emergency capacity.
Abstract: In this paper we describe the design and implementation of a parallel algorithm for data assimilation with the ensemble Kalman filter (EnKF) for the oil reservoir history matching problem. The use of a large number of observations from time-lapse seismic data leads to a long turnaround time for the analysis step, in addition to the time-consuming simulations of the realizations. For efficient parallelization, it is important to parallelize the computation at the analysis step. Our experiments show that parallelizing the analysis step in addition to the forecast step has good scalability, exploiting the same set of resources with some additional effort.
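The analysis step being parallelized is the standard stochastic EnKF update; a serial, small-scale sketch follows (toy state and observation operator, not a reservoir model):

```python
import numpy as np

def enkf_analysis(E, y, H, r_std, rng):
    """Stochastic EnKF analysis step in textbook form.

    E: forecast ensemble (n_state, n_ens); y: observations (n_obs,);
    H: observation operator (n_obs, n_state); r_std: obs error std
    (uncorrelated observation errors assumed)."""
    n_obs, n_ens = len(y), E.shape[1]
    A = E - E.mean(axis=1, keepdims=True)             # ensemble anomalies
    HA = H @ A
    # Kalman gain from ensemble covariances: K = P H^T (H P H^T + R)^-1
    P_HT = A @ HA.T / (n_ens - 1)
    S = HA @ HA.T / (n_ens - 1) + r_std**2 * np.eye(n_obs)
    K = P_HT @ np.linalg.inv(S)
    # Perturbed observations, one copy per ensemble member.
    Y = y[:, None] + r_std * rng.standard_normal((n_obs, n_ens))
    return E + K @ (Y - H @ E)

# Toy check: observing state component 0 as 0.0 pulls its ensemble mean
# from about 5 towards 0.
rng = np.random.default_rng(0)
E = rng.standard_normal((3, 100)) + 5.0
H = np.array([[1.0, 0.0, 0.0]])
Ea = enkf_analysis(E, np.array([0.0]), H, 0.1, rng)
```

With many seismic observations, the matrix products and the solve above dominate the cost, which is why the abstract targets this step for parallelization.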
Abstract: Simulation is a very powerful method for high-performance and high-quality design in distributed systems, and at present perhaps the only one, considering the heterogeneity, complexity, and cost of distributed systems. In Grid environments, for example, it is hard and even impossible to evaluate scheduler performance in a repeatable and controllable manner, as resources and users are distributed across multiple organizations with their own policies. In addition, Grid test-beds are limited, and creating an adequately sized test-bed is expensive and time consuming. Scalability, reliability, and fault tolerance are important requirements for distributed systems supporting distributed computation. A distributed system with such characteristics is called dependable. Large environments, like the Cloud, offer unique advantages, such as low cost and dependability, and satisfy QoS for all users. Resource management in large environments requires high-performance scheduling algorithms guided by QoS constraints. This paper presents the performance evaluation of scheduling heuristics guided by different optimization criteria. The algorithms for distributed scheduling are analyzed in order to satisfy user constraints while at the same time considering the independent capabilities of resources. This analysis acts as a profiling step for algorithm calibration. The performance evaluation is based on simulation. The simulator is MONARC, a powerful tool for large-scale distributed systems simulation. The novelty of this paper consists in synthetic analysis results that offer guidelines for scheduler service configuration and support empirically based decisions. The results can be used in decisions regarding optimizations to existing Grid DAG scheduling and for selecting the proper algorithm for DAG scheduling in various practical situations.
Abstract: This study describes a micro device integrated with multiple chambers for polymerase chain reaction (PCR) at different annealing temperatures. The device consists of a reaction polydimethylsiloxane (PDMS) chip and a cover glass chip, and is equipped with cartridge heaters, fans, and thermocouples for temperature control. In this prototype, commercial software is used to determine the geometric and operational parameters that are responsible for creating the denaturation, annealing, and extension temperatures within the chip. Two cartridge heaters are placed at the two sides of the chip and maintained at two different temperatures to achieve a thermal gradient on the chip during the annealing step. The temperatures on the chip surface are measured with an infrared imager. Thermocouples inserted into the reaction chambers are used to obtain the transient temperature profiles of the reaction chambers during several thermal cycles. The measured temperatures show a trend similar to the simulated results. This work should be of interest to those working on high-temperature reactions and on genomics or cell analysis.
Abstract: The classification of protein structure is commonly performed not for the whole protein but for structural domains, i.e., compact functional units preserved during evolution. Hence, a first step towards a protein structure classification is the separation of the protein into its domains. We approach the problem of protein domain identification by proposing a novel graph-theoretical algorithm. We represent the protein structure as an undirected, unweighted, and unlabeled graph whose nodes correspond to the secondary structure elements of the protein. This graph is called the protein graph. The domains are then identified as partitions of the graph corresponding to vertex sets obtained by the maximization of an objective function, which mutually maximizes the cycle distributions found in the partitions of the graph. Our algorithm does not utilize any information besides the cycle distribution to find the partitions. If a partition is found, the algorithm is iteratively applied to each of the resulting subgraphs. As a stopping criterion, we numerically calculate a significance level which indicates the stability of the predicted partition against a random rewiring of the protein graph. Hence, our algorithm terminates its iterative application automatically. We present results for one- and two-domain proteins, compare our results with the manually assigned domains in the SCOP database, and discuss the differences.
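The graph bookkeeping underlying such a cycle-based objective can be illustrated on a toy protein graph: counting the independent cycles (the cyclomatic number, a standard graph-theory quantity) available in each part before and after a candidate partition. The objective function itself is the paper's; the sketch below shows only this bookkeeping:

```python
def components(adj):
    """Connected components of an undirected graph {node: set(neighbors)}."""
    seen, comps = set(), []
    for start in adj:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:                     # depth-first traversal from start
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def n_independent_cycles(adj, nodes):
    """Cyclomatic number |E| - |V| + 1 of one connected vertex set: the number
    of independent cycles a cycle-based objective has to work with."""
    edges = sum(len(adj[v] & nodes) for v in nodes) // 2
    return edges - len(nodes) + 1

# Toy protein graph: two triangles (cyclic "domains") joined by a bridge edge.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
       3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
c_whole = n_independent_cycles(adj, set(adj))      # 2 independent cycles in total
adj[2].discard(3); adj[3].discard(2)               # candidate partition: cut the bridge
parts = components(adj)                            # two 3-node cyclic parts
```

Cutting the bridge preserves both cycles, one per part, which is the kind of partition a cycle-distribution objective would favor.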