Abstract: Tumor classification is a key area of research in the
field of bioinformatics. Microarray technology is commonly used in
the study of disease diagnosis using gene expression levels. The
main drawback of gene expression data is that it contains thousands
of genes but very few samples. Feature selection methods are used
to select the informative genes from the microarray. These methods
considerably improve the classification accuracy. In the proposed
method, Genetic Algorithm (GA) is used for effective feature
selection. Informative genes are identified based on the T-Statistics,
Signal-to-Noise Ratio (SNR) and F-Test values. The initial candidate
solutions of GA are obtained from top-m informative genes. The
classification accuracy of k-Nearest Neighbor (kNN) method is used
as the fitness function for GA. In this work, kNN and Support Vector
Machine (SVM) are used as the classifiers. The experimental results
show that the proposed work is suitable for effective feature
selection. With the help of the selected genes, the GA-kNN method
achieves 100% accuracy on 4 of the 10 datasets and the GA-SVM
method on 5. GA combined with kNN or SVM is thus demonstrated
to be an accurate method for microarray-based tumor classification.
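A minimal sketch of the pipeline described above, on synthetic data: rank genes by a t-statistic, seed the GA population from the top-m informative genes, and use leave-one-out kNN accuracy as the GA fitness. All data, sizes, and GA settings below are illustrative assumptions, not the paper's configuration.

```python
import math, random

random.seed(0)
N_GENES, N_SAMPLES, TOP_M = 20, 16, 8

# Synthetic expression matrix: genes 0-2 are informative, the rest noise.
labels = [i % 2 for i in range(N_SAMPLES)]
data = [[random.gauss(3.0 * labels[s] if g < 3 else 0.0, 1.0)
         for g in range(N_GENES)] for s in range(N_SAMPLES)]

def t_statistic(g):
    """Two-sample t-statistic of gene g between the two classes."""
    a = [data[s][g] for s in range(N_SAMPLES) if labels[s] == 0]
    b = [data[s][g] for s in range(N_SAMPLES) if labels[s] == 1]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return abs(ma - mb) / math.sqrt(va / len(a) + vb / len(b) + 1e-12)

top_m = sorted(range(N_GENES), key=t_statistic, reverse=True)[:TOP_M]

def knn_loo_accuracy(genes, k=3):
    """GA fitness: leave-one-out kNN accuracy using the chosen genes."""
    correct = 0
    for i in range(N_SAMPLES):
        dists = []
        for j in range(N_SAMPLES):
            if j == i:
                continue
            d = sum((data[i][g] - data[j][g]) ** 2 for g in genes)
            dists.append((d, labels[j]))
        votes = [lab for _, lab in sorted(dists)[:k]]
        correct += max(set(votes), key=votes.count) == labels[i]
    return correct / N_SAMPLES

def evolve(pop_size=10, gens=15):
    # Chromosome: a subset of the top-m informative genes.
    pop = [random.sample(top_m, 3) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=knn_loo_accuracy, reverse=True)
        survivors = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(survivors)):
            p1, p2 = random.sample(survivors, 2)
            child = list(set(p1[:2] + p2[1:]))      # one-point-style crossover
            if random.random() < 0.3:               # mutation: swap in a top-m gene
                child[random.randrange(len(child))] = random.choice(top_m)
            children.append(list(set(child)))
        pop = survivors + children
    return max(pop, key=knn_loo_accuracy)

best = evolve()
print(sorted(best), knn_loo_accuracy(best))
```

In the paper, the same fitness drives selection over real microarray genes; here the informative genes are planted so the effect of the t-statistic pre-filter is visible.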
Abstract: In both developed and developing countries,
governments play a basic role in making policies, programs and
instruments which support the development of micro, small and
medium enterprises. One of the mechanisms employed to nurture
small firms for more than two decades is technology business
incubation. The main aim of this
research was to establish influencing factors in Technology Business
Incubator's effectiveness and their explanatory model. Therefore,
among 56 Technology Business Incubators in Iran, 32 active
incubators were selected and by stratified random sampling, 528
start-ups were chosen. The validity of the research questionnaires
was determined by expert consensus, item analysis and factor
analysis, and their reliability was calculated by Cronbach's alpha.
Data analysis was then performed with the SPSS and LISREL software packages.
Both organizational procedures and entrepreneurial behaviors were
the meaningful mediators. Organizational procedures (P < .01,
β = 0.45) were a stronger mediator of the improvement of Technology
Business Incubator effectiveness than entrepreneurial
behaviors (P < .01, β = 0.36).
Abstract: COSMED K4b2 is a portable electrical device designed to test pulmonary function. It is ideal for many applications that need measurement of the cardio-respiratory response either in the field or in the lab, and it is capable of delivering real-time data to a sink node or a PC base station while simultaneously storing the data in memory. However, the actual sensor outputs and the data received may contain errors, such as impulsive noise, which can be related to the sensors, low batteries, the environment or disturbances in the data acquisition process. These abnormal outputs might cause misinterpretation of the exercise or living activities of the persons being monitored. In this paper we propose an effective and feasible method to detect and identify such errors using principal component analysis (PCA) and a back propagation (BP) neural network.
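A hedged sketch (not the authors' implementation) of PCA-based detection of impulsive sensor noise: project each sample onto the first principal component and flag samples whose reconstruction error is far above the typical level. The two channels, the injected spikes, and the threshold are synthetic stand-ins for K4b2-style cardio-respiratory readings.

```python
import math, random

random.seed(1)

# Two correlated channels (e.g., ventilation and heart rate), plus spikes.
samples = []
for t in range(200):
    base = math.sin(t / 20.0)
    samples.append([50 + 20 * base + random.gauss(0, 1),
                    80 + 30 * base + random.gauss(0, 1.5)])
for idx in (40, 120):                      # inject impulsive noise
    samples[idx][1] += 200

def first_pc(data, iters=100):
    """Leading eigenvector of the 2x2 covariance matrix via power iteration."""
    n = len(data)
    means = [sum(col) / n for col in zip(*data)]
    c = [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in data) / (n - 1)
          for j in range(2)] for i in range(2)]
    v = [1.0, 1.0]
    for _ in range(iters):
        w = [c[0][0] * v[0] + c[0][1] * v[1], c[1][0] * v[0] + c[1][1] * v[1]]
        norm = math.hypot(*w)
        v = [w[0] / norm, w[1] / norm]
    return means, v

means, pc = first_pc(samples)

def residual(row):
    """Distance from a sample to its projection on the first PC."""
    d = [row[0] - means[0], row[1] - means[1]]
    t = d[0] * pc[0] + d[1] * pc[1]
    return math.hypot(d[0] - t * pc[0], d[1] - t * pc[1])

errors = [residual(r) for r in samples]
cutoff = sorted(errors)[len(errors) // 2] * 6   # 6x the median residual (assumed)
flagged = [i for i, e in enumerate(errors) if e > cutoff]
print(flagged)
```

In the paper a BP neural network then classifies the flagged segments; the sketch stops at the PCA detection step.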
Abstract: Petri Nets being one of the most useful graphical tools for modelling complex asynchronous systems, we have used a Petri Net to model a multi-track railway level crossing system. The roadway has been augmented with four half-size barriers. For better control, a three-stage control mechanism has been introduced to ensure that no road vehicle is trapped on the level crossing. A Timed Petri Net is used to capture the temporal nature of the signalling system. A safeness analysis has also been included in the discussion section.
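An illustrative sketch, not the paper's actual net: a minimal Petri net interpreter with a two-transition barrier-down / barrier-up cycle. Firing a transition consumes one token from each input place and deposits one in each output place; a net is safe (1-bounded) when no reachable marking puts more than one token in any place. Place and transition names are invented for the example.

```python
# Marking: token count per place (a toy fragment of a level-crossing net).
marking = {"barrier_up": 1, "barrier_down": 0, "train_near": 1}
transitions = {
    "lower": {"in": ["barrier_up", "train_near"], "out": ["barrier_down"]},
    "raise": {"in": ["barrier_down"], "out": ["barrier_up"]},
}

def enabled(t):
    """A transition is enabled when every input place holds a token."""
    return all(marking[p] >= 1 for p in transitions[t]["in"])

def fire(t):
    """Firing rule: consume one token per input place, produce one per output."""
    assert enabled(t), f"transition {t} is not enabled"
    for p in transitions[t]["in"]:
        marking[p] -= 1
    for p in transitions[t]["out"]:
        marking[p] += 1

fire("lower")           # train approaches: consume barrier_up + train_near
fire("raise")           # train has passed: barrier goes back up
print(marking)
# 1-boundedness (safeness) check on the current marking:
print(all(v <= 1 for v in marking.values()))
```

A Timed Petri Net additionally attaches firing delays to transitions; the firing rule itself is unchanged.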
Abstract: Starting with an analysis of the financial and
operational indicators that can be found in the specialised literature,
this study aims to contribute to improvements in the performance
measurement systems used when the unit of analysis is the
manufacturing plant. To this end, a search was conducted in the
highest-impact journals of Production and Operations Management and
Management Accounting, with the aim of determining the financial
and operational indicators used to evaluate performance when
Advanced Production Practices have been implemented, more
specifically when the practices implemented are Total Quality
Management, JIT/Lean Manufacturing and Total Productive
Maintenance. This has enabled us to obtain a classification of the two
types of indicators based on how much each is used. For the financial
indicators we have also prepared a proposal that can be adapted to
the accounting features of manufacturing plants. In the near future we
will propose a model that links practice implementation with financial
and operational indicators, and links these last two with each other.
We aim to test this model empirically with the data obtained in the
High Performance Manufacturing Project.
Abstract: In this paper, a bio-mechanical analysis of human joints is carried out and the study is extended to a robot manipulator. The study first focuses on the kinematics of the human arm, which includes the movement of each joint in the shoulder, wrist, elbow and finger complexes. Those analyses are then extended to the design of a humanoid robot manipulator. A simulator is built for the Direct Kinematics and Inverse Kinematics of the human arm. In the Direct Kinematics simulation, the human joint angles can be entered, and the position and orientation of each fingertip (end-effector) are shown. Inverse Kinematics does the reverse of Direct Kinematics. Based on the material obtained from the kinematics analysis, the manipulator joints can be designed to follow prescribed position trajectories.
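A minimal sketch of the Direct/Inverse Kinematics pair described above, reduced to a two-link planar arm (the paper's simulator covers the full shoulder-elbow-wrist-finger chain). The link lengths and joint angles are illustrative assumptions.

```python
import math

L1, L2 = 0.30, 0.25   # upper-arm and forearm lengths in metres (assumed)

def direct_kinematics(theta1, theta2):
    """Joint angles (radians) -> end-effector position (x, y)."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(x, y):
    """End-effector position -> one (elbow-down) joint-angle solution."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    k1, k2 = L1 + L2 * math.cos(theta2), L2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2

# Round trip: IK recovers angles whose DK lands back on the target.
target = direct_kinematics(0.6, 0.8)
angles = inverse_kinematics(*target)
print(target, angles)
```

The round trip at the end mirrors the abstract's statement that Inverse Kinematics does the reverse of Direct Kinematics.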
Abstract: On July 1, 2007, the Taiwan Stock Exchange (TWSE)
added a new "Financial reference database" to the Market
Observation Post System (MOPS) for investors to use as an
investment reference. This database serves as a warning based on
the publicly disclosed financial information of public offering
companies and originally comprised eight indicators. In this paper,
the indicators provided by this database are applied to a financial
crisis early warning model to verify whether the database's
indicators forecast financial crises for companies with a high
accuracy rate, in line with the positive results obtained by domestic
and foreign scholars. A Logistic Regression Model is used for the
financial early warning model: the first model does not include the
back-conditions, while the second model does. Companies in which a
financial crisis occurred were taken as research samples, and sample
data from the T-1 and T-2 periods before the crisis took place were
used for the empirical analysis. The results show that, of the
variables provided by this database, the debt ratio and net worth per
share are the best forecast variables.
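A hedged sketch of the kind of logistic-regression early-warning model the abstract describes, fitted by batch gradient descent on fabricated toy data: x1 = debt ratio, x2 = net worth per share, y = 1 if a financial crisis occurred. The data, coefficients, and learning settings are all illustrative, not the paper's.

```python
import math, random

random.seed(2)
# Toy sample: crisis firms tend to have a high debt ratio and low net worth/share.
data = [(random.uniform(0.6, 0.95), random.uniform(0.0, 8.0), 1) for _ in range(30)] + \
       [(random.uniform(0.1, 0.5), random.uniform(6.0, 20.0), 0) for _ in range(30)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w = [0.0, 0.0, 0.0]                      # intercept, debt ratio, net worth/share
for _ in range(2000):                    # batch gradient descent
    grad = [0.0, 0.0, 0.0]
    for x1, x2, y in data:
        p = sigmoid(w[0] + w[1] * x1 + w[2] * x2)
        for i, xi in enumerate((1.0, x1, x2)):
            grad[i] += (p - y) * xi
    w = [wi - 0.05 * g / len(data) for wi, g in zip(w, grad)]

def predict(x1, x2):
    """Estimated crisis probability for one firm."""
    return sigmoid(w[0] + w[1] * x1 + w[2] * x2)

accuracy = sum((predict(x1, x2) > 0.5) == bool(y) for x1, x2, y in data) / len(data)
print(w, accuracy)
```

The signs of the fitted coefficients mirror the abstract's finding: a higher debt ratio raises the estimated crisis odds, a higher net worth per share lowers them.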
Abstract: This paper mainly investigates the environmental and
economic impacts of worldwide use of electric vehicles. It can be
concluded that governments have good reason to promote the use of
electric vehicles. First, the global vehicle population is evaluated with
the help of a grey forecasting model, and the amount of oil saved is
estimated through approximate calculation. After that, based on
game theory, the amount and types of electricity generation needed by
electric vehicles are established. Finally, some conclusions on
governments' attitudes are drawn.
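A minimal GM(1,1) grey forecasting sketch of the kind the abstract applies to the global vehicle population. The input series is a made-up illustration, not the paper's data.

```python
import math

def gm11(x0, steps=1):
    """Fit GM(1,1) to series x0 and forecast `steps` further values."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]               # accumulated series
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    # Least squares for a, b in x0[k] + a*z1[k] = b (2x2 normal equations).
    szz = sum(z * z for z in z1)
    sz = sum(z1)
    sy = sum(x0[1:])
    szy = sum(z * y for z, y in zip(z1, x0[1:]))
    m = n - 1
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det

    def x1_hat(k):
        # Solution of the whitened equation dx1/dt + a*x1 = b.
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    # Recover the original series by differencing the accumulated forecast.
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]

# Illustrative vehicle-population series (arbitrary units, assumed data).
series = [820, 853, 889, 927, 968]
forecast = gm11(series, steps=2)
print(forecast)
```

GM(1,1) fits an exponential trend through the accumulated series, which is why it works with the very short series typical of grey forecasting applications.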
Abstract: In this paper, we first introduce the stable distribution, the stable process and their characteristics. The α-stable distribution family has received great interest in the last decade due to its success in modeling data that are too impulsive to be accommodated by the Gaussian distribution. In the second part, we present major applications of the α-stable distribution in telecommunications, computer science (such as network delays), signal processing and financial markets. At the end, we focus on using the stable distribution to estimate measures of risk in stock markets and show simulated data with statistical software.
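A small sketch of why α-stable models suit impulsive data: draw symmetric α-stable samples with the Chambers-Mallows-Stuck method (β = 0 case) and compare tail behaviour with a Gaussian. The sample size and the choice α = 1.5 are illustrative.

```python
import math, random

random.seed(3)

def sym_stable(alpha):
    """One symmetric alpha-stable draw (Chambers-Mallows-Stuck, beta = 0)."""
    v = random.uniform(-math.pi / 2, math.pi / 2)
    w = random.expovariate(1.0)
    if alpha == 1.0:
        return math.tan(v)                      # Cauchy special case
    return (math.sin(alpha * v) / math.cos(v) ** (1 / alpha)
            * (math.cos(v * (1 - alpha)) / w) ** ((1 - alpha) / alpha))

n = 20000
stable = [sym_stable(1.5) for _ in range(n)]
gauss = [random.gauss(0, 1) for _ in range(n)]

# Heavy tails: far more |x| > 10 events than under the Gaussian.
tail_stable = sum(abs(x) > 10 for x in stable)
tail_gauss = sum(abs(x) > 10 for x in gauss)
print(tail_stable, tail_gauss)
```

The power-law tails visible here (P(|X| > x) decaying like x^(-α)) are what makes stable models attractive for network delays and stock-market risk measures.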
Abstract: Simulation is a very powerful method used for
high-performance and high-quality design in distributed systems, and
is now perhaps the only viable one, considering the heterogeneity,
complexity and cost of distributed systems. In Grid environments, for
example, it is hard or even impossible to perform scheduler performance
evaluation in a repeatable and controllable manner as resources and
users are distributed across multiple organizations with their own
policies. In addition, Grid test-beds are limited and creating an
adequately-sized test-bed is expensive and time consuming.
Scalability, reliability and fault-tolerance become important
requirements for distributed systems in order to support distributed
computation. A distributed system with such characteristics is called
dependable. Large environments, like the Cloud, offer unique
advantages, such as low cost, dependability and satisfaction of QoS
for all users. Resource management in large environments requires
performant scheduling algorithms guided by QoS constraints. This
paper presents the performance evaluation of scheduling heuristics
guided by different optimization criteria. The algorithms for
distributed scheduling are analyzed in order to satisfy users'
constraints while at the same time considering the independent
capabilities of resources. This analysis acts as a profiling step for algorithm
calibration. The performance evaluation is based on simulation. The
simulator is MONARC, a powerful tool for large scale distributed
systems simulation. The novelty of this paper consists in synthetic
analysis results that offer guidelines for scheduler service
configuration and support empirically based decisions. The results
could be used in decisions regarding optimizations to existing Grid
DAG Scheduling and for selecting the proper algorithm for DAG
scheduling in various actual situations.
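The heuristics the paper compares run inside MONARC; the toy below (not MONARC code) only illustrates how criteria-guided heuristics can diverge: min-min, which greedily favours makespan, versus plain round-robin dispatch. Task lengths and resource speeds are invented.

```python
tasks = [40, 7, 30, 5, 25, 12, 3, 18]          # task lengths (arbitrary units)
speeds = [1.0, 2.0, 4.0]                       # relative resource speeds

def makespan(finish_times):
    return max(finish_times)

def round_robin(tasks, speeds):
    """Dispatch task i to resource i mod R, ignoring load and speed."""
    ready = [0.0] * len(speeds)
    for i, t in enumerate(tasks):
        r = i % len(speeds)
        ready[r] += t / speeds[r]
    return ready

def min_min(tasks, speeds):
    """Repeatedly schedule the task with the smallest earliest completion time."""
    ready = [0.0] * len(speeds)
    pending = list(tasks)
    while pending:
        finish, t, r = min((ready[r] + t / speeds[r], t, r)
                           for t in pending for r in range(len(speeds)))
        ready[r] = finish
        pending.remove(t)
    return ready

print(makespan(round_robin(tasks, speeds)), makespan(min_min(tasks, speeds)))
```

In a simulator like MONARC the same comparison is made under realistic network and background-load models, which is exactly the calibration step the abstract argues for.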
Abstract: This study describes a micro device integrated with
multi-chamber for polymerase chain reaction (PCR) with different
annealing temperatures. The device consists of the reaction
polydimethylsiloxane (PDMS) chip, a cover glass chip, and is
equipped with cartridge heaters, fans, and thermocouples for
temperature control. In this prototype, commercial software is utilized
to determine the geometric and operational parameters that are
responsible for creating the denaturation, annealing, and extension
temperatures within the chip. Two cartridge heaters are placed at two
sides of the chip and maintained at two different temperatures to
achieve a thermal gradient on the chip during the annealing step. The
temperatures on the chip surface are measured via an infrared imager.
Some thermocouples inserted into the reaction chambers are used to
obtain the transient temperature profiles of the reaction chambers
during several thermal cycles. The experimental temperatures
compared to the simulated results show a similar trend. This work
should be of interest to those involved in high-temperature-based
reactions and in genomics or cell analysis.
Abstract: Consumer behaviour analysis represents an important
field of study in marketing. Particularly strategy development for
marketing and communications will be more focused and effective
when marketers have an understanding of the motivations, behaviour
and psychology of consumers. While materialism has been found to
be one of the important elements in consumer behaviour, compulsive
consumption represents another aspect that has recently attracted
more attention. This is because of the growing prevalence of
dysfunctional buying that has raised concern in consumer societies.
Present studies and analyses on origins and motivations of
compulsive buying have mainly focused on either individual factors
or groups of related factors and hence a need for a holistic view
exists. This paper provides a comprehensive perspective on
compulsive consumption and establishes relevant propositions
keeping the family life cycle stages as a reference for the incidence of
chronic consumer states and their influence on compulsive
consumption.
Abstract: The increasing use of the cell phone as a medium of human interaction is playing a vital role in solving riddles of crime as well. A young girl went missing from her home late one evening in August 2008, when her enraged relatives and villagers physically assaulted and chased away her fiancé, who often frequented her home. Two years later, her mother lodged a complaint against the relatives and the villagers, alleging that after abduction her daughter had either been sold or killed, as she had failed to trace her. On investigation, a rusted cell phone with a partially visible IMEI number, clothes, bangles, a human skeleton, etc., recovered from an abandoned well in May 2011, were examined in the lab. All hopes were pinned on the identity of the cell phone, the only linking evidence available to fix the scene of occurrence, supported by the call detail record (CDR), and to dispel doubts about the mode of the sudden disappearance or death, as DNA technology did not help in establishing the identity of the deceased. The conventional scientific methods were used without success, and the International Mobile Equipment Identity (IMEI) number of the cell phone could only be generated by using statistical analysis followed by online verification.
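The abstract does not detail the reconstruction procedure; one standard constraint that narrows candidates for a partially visible IMEI is the Luhn check digit (the 15th IMEI digit). The sketch below enumerates completions of an IMEI with unknown digits (marked '?') and keeps only those that satisfy Luhn, candidates that would then be verified online, as the abstract describes. The IMEI shown is a well-known textbook example, not the case exhibit.

```python
from itertools import product

def luhn_valid(digits):
    """Luhn checksum over a full 15-digit IMEI: working from the right,
    double every second digit (subtracting 9 when the result exceeds 9);
    the number is valid when the total is divisible by 10."""
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def candidates(partial):
    """All digit completions of a '?'-masked 15-digit IMEI that pass Luhn."""
    unknown = [i for i, ch in enumerate(partial) if ch == "?"]
    out = []
    for combo in product(range(10), repeat=len(unknown)):
        digits = [int(ch) if ch != "?" else 0 for ch in partial]
        for pos, val in zip(unknown, combo):
            digits[pos] = val
        if luhn_valid(digits):
            out.append("".join(map(str, digits)))
    return out

print(candidates("49015420323751?"))   # one unknown digit
```

With one digit missing exactly one completion passes Luhn; with several digits missing the check prunes nine of every ten completions, leaving a short list for online verification against the manufacturer's database.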
Abstract: Biochemical and molecular analysis of some
antioxidant enzyme genes revealed different levels of gene expression
in oilseed rape (Brassica napus). For molecular and biochemical
analysis, leaf tissues were harvested from plants at eight different
developmental stages, from young to senescence. The levels of total
protein and chlorophyll increased during the maturity stages of the
plant, while they decreased during the last stages of plant
growth. Structural analysis (nucleotide and deduced amino acid
sequence, and phylogenic tree) of a complementary DNA revealed a
high level of similarity for a family of Catalase genes. The
expression of the genes encoding different Catalase isoforms was
assessed during different plant growth phases. No significant
difference between samples was observed when Catalase activity
was statistically analyzed at different developmental stages. EST
analysis exhibited different transcript levels for a number of other
relevant antioxidant genes (different isoforms of SOD and
glutathione). The high level of transcription of these genes at
senescence stages indicated that they are senescence-induced
genes.
Abstract: This paper presents a Reliability-Based Topology
Optimization (RBTO) based on Evolutionary Structural Optimization
(ESO). An actual design involves uncertain conditions such as
material property, operational load and dimensional variation.
Deterministic Topology Optimization (DTO) is performed without
considering the uncertainties in these parameters.
However, RBTO involves evaluation of probabilistic constraints,
which can be done in two different ways, the reliability index
approach (RIA) and the performance measure approach (PMA). The limit
state function is approximated using Monte Carlo Simulation and
Central Composite Design for reliability analysis. ESO, one of the
topology optimization techniques, is adopted for topology
optimization. Numerical examples are presented to compare the DTO
with RBTO.
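A hedged illustration (not the paper's code) of the Monte Carlo step used to evaluate a probabilistic constraint: sample an uncertain strength R and load S, estimate P[g < 0] for the limit state g = R - S, and convert it to a reliability index. The distributions and their parameters are assumed for the example.

```python
import random
from statistics import NormalDist

random.seed(4)

def monte_carlo_pf(n=200_000):
    """Estimate the failure probability P[R - S < 0] by sampling."""
    failures = 0
    for _ in range(n):
        r = random.gauss(mu=300.0, sigma=30.0)   # material strength (assumed)
        s = random.gauss(mu=200.0, sigma=20.0)   # operational load (assumed)
        failures += (r - s) < 0.0
    return failures / n

pf = monte_carlo_pf()
beta = -NormalDist().inv_cdf(pf)                 # reliability index
print(pf, beta)
```

For this linear-Gaussian case the exact answer is Φ(-100/√(30² + 20²)) ≈ 0.0028, so the estimate can be checked; in RBTO the same sampling is applied to limit states without a closed form, which is why the paper combines it with a Central Composite Design surrogate.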
Abstract: Distributed Generation (DG) has received increasing attention
in recent years. The impact of DG on various aspects of distribution system
operation, such as reliability and energy loss, depends highly on the DG
location in the distribution feeder. Optimal DG placement is an important
subject which has not yet been fully explored.
This paper presents an optimization method to determine optimal DG
placement, based on a cost/worth analysis approach. This method
considers technical and economical factors such as energy loss, load point
reliability indices and DG costs, and particularly, portability of DG. The
proposed method is applied to a test system and the impacts of different
parameters such as load growth rate and load forecast uncertainty (LFU)
on optimum DG location are studied.
Abstract: With the rapid growth and development of information and communication technology, the Internet has come to play a definite and irreplaceable role in people's social lives in Taiwan, as in other countries. In July 2008, on a general social website, an unexpected phenomenon was noticed: more than one hundred users had started forming clubs voluntarily and holding face-to-face gatherings for specific purposes. This study examines whether teenagers' social contact on the Internet is involved in their life context, and tries to reveal the social preferences, values and needs that merge with and influence teenagers' social activities. The study therefore employs multiple user-experience research methods, including practical observation and qualitative analysis through contextual inquiries and in-depth interviews. Based on the findings, several design implications for software related to social interaction and cultural inheritance are offered. It is concluded that the inherent values of social behaviors might be a key issue in developing computer-mediated communication or interaction designs in the future.
Abstract: Various formal and informal brand alliances are being formed in professional service firms. A professional service corporate brand is heavily dependent on the brands of the professional employees who comprise it, and professional employee brands are in turn dependent on the corporate brand. Prior work provides limited scientific evidence of brand alliance effects in the professional service area, i.e., how professional service corporate-employee brand allies are affected by an alliance, what the brand attitude effects are after alliance formation, and how these effects vary with different strengths of an ally. Scientific literature analysis and theoretical modeling are the main methods of the current study. As a result, a theoretical model is constructed for estimating spillover effects of professional service corporate-employee brand alliances and for comparison among different professional service firm expertise practice models, from the "brains" to the "procedure" model. The resulting theoretical model lays the basis for future experimental studies.
Abstract: In face recognition, feature extraction techniques
attempt to find an appropriate representation of the data. However,
when the feature dimension is larger than the sample size, performance
degradation results. Hence, we propose a method called
Normalization Discriminant Independent Component Analysis
(NDICA). The input data will be regularized to obtain the most
reliable features from the data and processed using Independent
Component Analysis (ICA). The proposed method is evaluated on
three face databases, Olivetti Research Ltd (ORL), Face Recognition
Technology (FERET) and Face Recognition Grand Challenge
(FRGC). NDICA showed its effectiveness compared with other
unsupervised and supervised techniques.
Abstract: Measuring the complexity of software has been an
insoluble problem in software engineering. Complexity measures can
be used to predict critical information about testability, reliability,
and maintainability of software systems from automatic analysis of
the source code. During the past few years, many complexity
measures have been invented based on the emerging Cognitive
Informatics discipline. These software complexity measures,
including cognitive functional size, lend themselves to the approach
of the total cognitive weights of basic control structures such as loops
and branches. This paper shows that the existing calculation
method can generate different results that are algebraically
equivalent. However, analysis of the combinatorial meaning of this
calculation method reveals a significant flaw in the measure, which also
explains why it does not satisfy Weyuker's properties. Based on the
findings, improvement directions, such as measure fusion and a
cumulative variable counting scheme, are suggested to enhance the
effectiveness of cognitive complexity measures.
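A hedged sketch of the cognitive-weight calculation the abstract critiques: each basic control structure carries a weight (the values below follow the commonly cited scale: sequence 1, branch 2, iteration 3), sequential structures add, and nested structures multiply. The two example structures are different arrangements that this rule maps to the same total, illustrating the kind of ambiguity the paper analyzes.

```python
# Assumed weights for basic control structures (commonly cited scale).
WEIGHTS = {"sequence": 1, "branch": 2, "iteration": 3}

def cognitive_weight(node):
    """node = (kind, [children]); children are structures nested inside."""
    kind, children = node
    w = WEIGHTS[kind]
    if not children:
        return w
    # Nesting multiplies; sibling structures at the same level add.
    return w * sum(cognitive_weight(c) for c in children)

# A loop containing a branch then a sequence: 3 * (2 + 1) = 9.
nested = ("iteration", [("branch", []), ("sequence", [])])
# A structurally different program, three sequential loops: 3 + 3 + 3 = 9.
flat = [("iteration", []), ("iteration", []), ("iteration", [])]

print(cognitive_weight(nested), sum(cognitive_weight(n) for n in flat))
```

That two structurally different programs share the same total weight is a toy instance of the combinatorial ambiguity the paper examines when explaining why the measure fails Weyuker's properties.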