Abstract: This paper presents recent work on the improvement
of a vision-based robotic control strategy for an underwater pipeline
tracking system. The study focuses on developing image processing
algorithms and a fuzzy inference system for terrain analysis. The
main goal is to implement a supervisory fuzzy learning control
technique to reduce navigation-decision errors caused by the
pipeline occlusion problem. The system developed is capable of
interpreting underwater images containing an occluded pipeline, the
seabed and other unwanted noise. The algorithm proposed in
previous work does not explore the cooperation between fuzzy
controllers, knowledge and learnt data to improve the outputs for
underwater pipeline tracking. Computer simulations and prototype
simulations demonstrate the effectiveness of this approach. The
accuracy level of the system is also discussed.
Abstract: Early supplier involvement (ESI) benefits new
product development projects in several ways. Nevertheless, many
cast-user companies do not know the advantages of ESI and
therefore do not utilize it. This paper presents reasons to utilize ESI
in the casting industry and how that can be done. Further, this paper
presents advantages and challenges related to ESI in the casting
industry, and introduces a Casting-Network Collaboration Model.
The model presents practices for companies to build advantageous
collaborative relationships. In more detail, the model describes three
levels of company-network relationships in the casting industry with
different degrees of collaboration, and requirements for operating at
each level. In our research, ESI was found to influence, for example,
project time, component cost, and quality. In addition, challenges
related to ESI, such as a lack of mutual trust and unawareness of its
advantages, were identified. Our research approach was a case study
including four cases.
Abstract: In this paper we define Business Case Management and its characteristics, and highlight its link to knowledge workers. Business Case Management combines knowledge and process effectively, supports the ad hoc and unpredictable nature of cases, and coordinates a range of other technologies to appropriately support knowledge-intensive processes. We emphasize the growing importance of knowledge workers and the currently poor support for knowledge work automation. We also discuss the challenges in supporting this kind of knowledge work and propose a novel approach to overcome them.
Abstract: Economic dispatch (ED) is considered one of the key functions in electric power system operation, helping to build effective generation management plans. The practical ED problem has a non-smooth cost function with nonlinear constraints, which makes it difficult to solve effectively. This paper presents a novel, efficient heuristic optimization approach based on the new Bat algorithm (BA) to solve the practical non-smooth economic dispatch problem. The proposed algorithm easily handles different constraints. In addition, two newly introduced modification methods are developed to improve the diversity of the bat population while simultaneously increasing the convergence speed. The simulation results obtained by the proposed algorithms are compared with results obtained using other recently developed methods available in the literature.
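The core of a standard (unmodified) Bat algorithm applied to a small ED instance can be sketched as follows. This is a minimal illustration, not the paper's modified BA: the three-unit cost coefficients, demand, and BA parameters (fixed loudness A and pulse rate r) are all assumed for the example, and the power-balance constraint is handled with a simple penalty term.

```python
import random

random.seed(0)

# Hypothetical 3-unit system: quadratic fuel cost a + b*P + c*P^2 ($/h).
# Coefficients, limits and demand are illustrative, not from the paper.
UNITS = [
    (100.0, 2.00, 0.0020, 50.0, 300.0),   # (a, b, c, Pmin, Pmax)
    (120.0, 1.80, 0.0025, 40.0, 250.0),
    (150.0, 2.10, 0.0018, 30.0, 200.0),
]
DEMAND = 450.0  # MW

def cost(p):
    """Total fuel cost plus a penalty for violating the power balance."""
    fuel = sum(a + b * pi + c * pi * pi for (a, b, c, _, _), pi in zip(UNITS, p))
    return fuel + 1000.0 * abs(sum(p) - DEMAND)

def clip(p):
    """Keep each unit's output within its generation limits."""
    return [min(max(pi, u[3]), u[4]) for u, pi in zip(UNITS, p)]

def bat_algorithm(n_bats=20, n_iter=200, fmin=0.0, fmax=2.0, A=0.9, r=0.5):
    dim = len(UNITS)
    pop = [[random.uniform(u[3], u[4]) for u in UNITS] for _ in range(n_bats)]
    vel = [[0.0] * dim for _ in range(n_bats)]
    best = min(pop, key=cost)
    for _ in range(n_iter):
        for i in range(n_bats):
            freq = fmin + (fmax - fmin) * random.random()  # random pulse frequency
            vel[i] = [v + (x - b) * freq for v, x, b in zip(vel[i], pop[i], best)]
            cand = clip([x + v for x, v in zip(pop[i], vel[i])])
            if random.random() > r:  # local random walk around the current best
                cand = clip([b + random.gauss(0.0, 1.0) for b in best])
            if cost(cand) < cost(pop[i]) and random.random() < A:
                pop[i] = cand  # accept if better and "loudness" allows
            if cost(pop[i]) < cost(best):
                best = pop[i]
    return best, cost(best)

dispatch, total_cost = bat_algorithm()
```

A full implementation would also adapt each bat's loudness and pulse rate over time, which is the behaviour the paper's two modifications refine.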
Abstract: This article addresses feature selection for breast
cancer diagnosis. The proposed process is a wrapper approach
based on a Genetic Algorithm (GA) and case-based reasoning
(CBR). The GA searches the problem space for candidate subsets
of features, and CBR is employed to evaluate each subset. The
experimental results show that the proposed model is comparable
to other models on the Wisconsin Diagnostic Breast Cancer
(WDBC) dataset.
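A minimal sketch of such a GA wrapper is shown below. Since the WDBC data and the paper's CBR engine are not available here, a tiny synthetic dataset is assumed, and leave-one-out 1-NN accuracy stands in for the CBR evaluation of each feature subset.

```python
import random

random.seed(1)

# Tiny synthetic dataset: 4 features, of which only the first two are informative.
def make_data(n=60):
    data = []
    for _ in range(n):
        label = random.randint(0, 1)
        x = [label + random.gauss(0, 0.3),   # informative
             label + random.gauss(0, 0.3),   # informative
             random.gauss(0, 1.0),           # noise
             random.gauss(0, 1.0)]           # noise
        data.append((x, label))
    return data

DATA = make_data()
N_FEATURES = 4

def fitness(mask):
    """Leave-one-out 1-NN accuracy using only the selected features
    (nearest-neighbour retrieval plays the role of the CBR evaluation)."""
    if not any(mask):
        return 0.0
    correct = 0
    for i, (xi, yi) in enumerate(DATA):
        best_d, best_y = float("inf"), None
        for j, (xj, yj) in enumerate(DATA):
            if i == j:
                continue
            d = sum((a - b) ** 2 for k, (a, b) in enumerate(zip(xi, xj)) if mask[k])
            if d < best_d:
                best_d, best_y = d, yj
        correct += (best_y == yi)
    return correct / len(DATA)

def ga_select(pop_size=12, gens=15, p_mut=0.2):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randint(1, N_FEATURES - 1)    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:                # bit-flip mutation
                k = random.randrange(N_FEATURES)
                child[k] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best_mask = ga_select()
best_acc = fitness(best_mask)
```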
Abstract: With the rapid growth in business size, today's businesses orient towards electronic technologies. Amazon.com and e-bay.com are some of the major stakeholders in this regard. Unfortunately, the enormous size and hugely unstructured nature of data on the web, even for a single commodity, have become a cause of ambiguity for consumers. Extracting valuable information from such ever-increasing data is an extremely tedious task and is fast becoming critical to the success of businesses. Web content mining can play a major role in solving these issues. It involves using efficient algorithmic techniques to search and retrieve the desired information from seemingly impossible-to-search unstructured data on the Internet. Application of web content mining can be very encouraging in the areas of customer relations modeling, billing records, logistics investigations, product cataloguing and quality management. In this paper we present a review of some very interesting, efficient yet implementable techniques from the field of web content mining and study their impact in areas specific to business user needs, focusing both on the customer and on the producer. The techniques we review include mining by developing a knowledge-base repository of the domain, iterative refinement of user queries for personalized search, using a graph-based approach for the development of a web crawler, and filtering information for personalized search using website captions. These techniques have been analyzed and compared on the basis of their execution time and the relevance of the results they produce against a particular search.
Abstract: This paper describes an approach to predicting
incoming and outgoing data rates in a network system by using
association rule discovery, which is one of the data mining
techniques. Information on incoming and outgoing data at each
time and on network bandwidth constitutes the network performance
parameters needed to address the traffic problem, since congestion
and data loss are important network problems. The results of this
technique can predict future network traffic. In addition, this
research is useful for network routing selection and network
performance improvement.
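As an illustration of the underlying technique, the sketch below mines association rules from hypothetical discretized traffic observations with a plain Apriori pass; the item names, thresholds and data are assumed for the example, not taken from the paper.

```python
from itertools import combinations

# Hypothetical discretized traffic observations (one item set per interval).
transactions = [
    {"high_in", "high_out", "congestion"},
    {"high_in", "high_out", "congestion"},
    {"high_in", "low_out"},
    {"low_in", "low_out"},
    {"high_in", "high_out"},
]

def apriori(transactions, min_support):
    """Return {itemset: support} for all itemsets meeting min_support."""
    n = len(transactions)
    items = {i for t in transactions for i in t}
    frequent = {}
    candidates = [frozenset([i]) for i in items]
    while candidates:
        counted = {}
        for c in candidates:
            sup = sum(1 for t in transactions if c <= t) / n
            if sup >= min_support:
                counted[c] = sup
        frequent.update(counted)
        # Candidate generation: join frequent k-itemsets into (k+1)-itemsets.
        next_level = set()
        for a, b in combinations(list(counted), 2):
            u = a | b
            if len(u) == len(a) + 1:
                next_level.add(u)
        candidates = list(next_level)
    return frequent

def rules(frequent, min_conf):
    """Derive rules X -> Y with confidence support(X ∪ Y) / support(X)."""
    out = []
    for itemset, sup in frequent.items():
        if len(itemset) < 2:
            continue
        for r in range(1, len(itemset)):
            for lhs in map(frozenset, combinations(itemset, r)):
                conf = sup / frequent[lhs]
                if conf >= min_conf:
                    out.append((lhs, itemset - lhs, conf))
    return out

freq = apriori(transactions, min_support=0.4)
rls = rules(freq, min_conf=0.6)
# e.g. the rule {congestion} -> {high_in, high_out} holds with confidence 1.0
```

In a prediction setting, rules of this form (traffic state now implies traffic state next) are what drive the forecast of future incoming/outgoing rates.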
Abstract: Recent advancements in sensor technologies and
Wireless Body Area Networks (WBANs) have led to the
development of cost-effective healthcare devices which can be used
to monitor and analyse a person's physiological parameters from
remote locations. These advancements provide a unique opportunity
to overcome current healthcare challenges of low-quality service
provisioning, lack of easy access to a variety of services, high costs
of services, and the globally increasing elderly population. This
paper reports on a prototype implementation of an architecture that
seamlessly integrates a Wireless Body Area Network (WBAN) with
Web services (WS) to proactively collect physiological data of
remote patients and recommend diagnostic services. Technologies
based upon WBAN and WS can provide ubiquitous access to a
variety of services by allowing distributed healthcare resources to be
massively reused to provide cost-effective services without
individuals physically moving to the locations of those resources. In
addition, these technologies can reduce the costs of healthcare
services by allowing individuals to access services that support their
healthcare. The prototype uses WBAN body sensors implemented on
Arduino Fio platforms to be worn by the patient and an Android
smartphone as a personal server. The physiological data are collected
and uploaded through GPRS/Internet to the Medical Health Server
(MHS) to be analysed. The prototype monitors the activities,
location and physiological parameters such as SpO2 and heart rate
of the elderly and patients in rehabilitation. Medical practitioners
have real-time access to the uploaded information through a web
application.
Abstract: The importance of the machining process in today's
industry requires the establishment of more practical approaches to
clearly represent the intimate and severe contact at the
tool-chip-workpiece interfaces. Mathematical models are developed
using the measured force signals to relate each of the tool-chip
friction components on the rake face to the operating cutting
parameters in rough turning operations using multilayer coated
carbide inserts. Nonlinear modeling proved to have a high capability
to detect the nonlinear functional variability embedded in the
experimental data. While feed rate is found to be the most influential
parameter on the friction coefficient and its related force
components, both cutting speed and depth of cut are found to have
slight influence. A greater deformed chip thickness is found to lower
the value of the friction coefficient as the sliding length on the
tool-chip interface is reduced.
Abstract: PAX6, a transcription factor, is essential for the morphogenesis of the eyes, brain, pituitary and pancreatic islets. In rodents, the loss of Pax6 function leads to central nervous system defects, anophthalmia, and nasal hypoplasia. Haplo-insufficiency of Pax6 causes microphthalmia, aggression and other behavioral abnormalities. It is also required in brain patterning and neuronal plasticity. In humans, heterozygous mutation of Pax6 causes loss of the iris (aniridia), mental retardation and glucose intolerance. The 3' deletion in Pax6 leads to autism and aniridia. The phenotypes are variable in penetrance and expressivity. However, the mechanism of function and the interactions of PAX6 with other proteins during development and associated disease are not clear. We intend to explore interactors of PAX6 to elucidate the biology of PAX6 function in the tissues where it is expressed and also in the central regulatory pathway. This report describes in silico approaches to explore interacting proteins of PAX6. The models show several possible proteins interacting with PAX6, such as MITF, SIX3, SOX2, SOX3, IPO13, TRIM, and OGT. Since Pax6 is a critical transcriptional regulator and master control gene of eye and brain development, it might interact with other proteins involved in morphogenesis (TGIF, TGF, Ras, etc.). It is also presumed that matricellular proteins (SPARC, thrombospondin-1, osteonectin, etc.) are likely to interact during the transport and processing of PAX6 and lie somewhere in its cascade. Proteins involved in cell survival and cell proliferation also cannot be ignored.
Abstract: An application framework provides a reusable design
and implementation for a family of software systems. If the
framework contains defects, the defects will be passed on to the
applications developed from the framework. Framework defects are
hard to discover at the time the framework is instantiated. Therefore,
it is important to remove all defects before instantiating the
framework. In this paper, two measures for the adequacy of an
object-oriented system-based testing technique are introduced. The
measures assess the usefulness and uniqueness of the testing
technique. The two measures are applied to experimentally compare
the adequacy of two testing techniques introduced to test
object-oriented frameworks at the system level. The two considered testing
techniques are the New Framework Test Approach and Testing
Frameworks Through Hooks (TFTH). The techniques are also
compared analytically in terms of their coverage power of
object-oriented aspects. The comparison study results show that the TFTH
technique is better than the New Framework Test Approach in terms
of usefulness degree, uniqueness degree, and coverage power.
Abstract: Software complexity metrics are used to predict
critical information about reliability and maintainability of software
systems. Object oriented software development requires a different
approach to software complexity metrics. Object Oriented Software
Metrics can be broadly classified into static and dynamic metrics.
Static metrics give information at the code level, whereas dynamic
metrics provide information about actual runtime behavior. In this
paper we discuss various complexity metrics and compare static and
dynamic complexity.
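The static/dynamic distinction can be illustrated with a small sketch: a static metric computed from source code alone (a simplified, unweighted methods-per-class count) versus a dynamic metric gathered at runtime (per-method call counts). Both the Stack class and the counting decorator are hypothetical examples, not metrics defined in the paper.

```python
import ast

SOURCE = '''
class Stack:
    def push(self, x):
        self.items.append(x)
    def pop(self):
        if not self.items:
            raise IndexError
        return self.items.pop()
'''

def methods_per_class(source):
    """Static metric: count methods per class from the source text alone
    (a simplified, unweighted form of the WMC metric)."""
    tree = ast.parse(source)
    return {
        node.name: sum(isinstance(n, ast.FunctionDef) for n in node.body)
        for node in ast.walk(tree)
        if isinstance(node, ast.ClassDef)
    }

# Dynamic metric: count how often each method actually runs.
class CallCounter:
    def __init__(self):
        self.counts = {}
    def track(self, fn):
        def wrapper(*args, **kwargs):
            self.counts[fn.__name__] = self.counts.get(fn.__name__, 0) + 1
            return fn(*args, **kwargs)
        return wrapper

counter = CallCounter()

class Stack:
    def __init__(self):
        self.items = []
    @counter.track
    def push(self, x):
        self.items.append(x)
    @counter.track
    def pop(self):
        return self.items.pop()

s = Stack()
s.push(1); s.push(2); s.pop()

static_metric = methods_per_class(SOURCE)   # {'Stack': 2}
dynamic_metric = dict(counter.counts)       # {'push': 2, 'pop': 1}
```

The static figure is the same for every execution, while the dynamic figure depends on the workload that actually exercised the object.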
Abstract: Chemical industry project management involves
complex decision-making situations that require discerning abilities
and methods to make sound decisions. Project managers are faced
with complex decision environments and problems in projects. In
this work, the case study is Research and Development (R&D)
project selection. R&D is an ongoing process for forward-thinking,
technology-based chemical industries, and R&D project selection is
an important task for organizations with R&D project management.
It is a multi-criteria problem which includes both tangible and
intangible factors. The ability to make sound decisions is very
important to the success of R&D projects. Multiple-criteria
decision-making (MCDM) approaches are major parts of decision
theory and analysis. This paper presents MCDM approaches for use
in R&D project selection. It is hoped that this work will provide a
ready reference on MCDM and encourage the application of MCDM
by chemical engineering management.
Abstract: Natural disasters, including earthquakes, kill many people around the world every year. Post-earthquake rescue actions, abbreviated LAST, include locating, access, stabilization and transportation. In the present article, we have studied the process of gaining local access to the injured and transporting them to health care centers. Given the heavy traffic load caused by an earthquake, the destruction of connecting roads and bridges, and the heavy debris in alleys and streets, all of which endanger the lives of the injured and the people buried under the debris, accelerating rescue actions and facilitating access are obviously of great importance. Tehran, the capital of Iran, is among the most crowded cities in the world and is the center of extensive economic, political, cultural and social activities. Tehran has a population of about 9.5 million, which continues to grow because of immigration from the surrounding cities. Furthermore, considering that Tehran is located on two important and large faults, a magnitude-6 earthquake in this city could lead to the greatest catastrophe in human history. The present study is a review, and a major part of the required information was obtained from libraries. Rescue vehicles from around the world, including rescue helicopters, ambulances, firefighting vehicles and rescue boats, their applied technology, and the robots specifically designed for rescue systems, along with their advantages and disadvantages, have been investigated. The studies show a significant relationship between the rescue team's arrival time at the incident zone and the number of people saved: if the duration of burial under debris is 30 minutes, the probability of survival is 99.3%; after one day it is 81%, after two days 19%, and after five days 7.4%. The existing transport systems all have some defects.
If these defects are removed, more people can be saved each hour and preparedness against natural disasters is increased. In this study, a transport system has been designed for the rescue team and the injured, which can carry the rescue team to the incident zone and the injured to health care centers. In addition, this system is able to fly in the air as well as move on the ground, so that destroyed roads and heavy traffic cannot prevent the rescue team from arriving early at the incident zone. The system also has the equipment required for debris removal, optimum transport of the injured and first aid.
Abstract: In this paper we propose mixtures of two different
distributions, namely Exponential-Gamma, Exponential-Weibull and
Gamma-Weibull, to model heterogeneous survival data. Various
properties of the proposed mixtures of two different distributions are
discussed. Maximum likelihood estimates of the parameters are
obtained using the EM algorithm. An illustrative example based on
real data is also given.
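The EM scheme for one of the named mixtures, Exponential-Gamma, can be sketched as follows. The synthetic data, initial values, and the weighted method-of-moments update for the Gamma component (the exact Gamma MLE has no closed form) are simplifications assumed for illustration and may differ from the paper's estimators.

```python
import math
import random

random.seed(2)

# Synthetic heterogeneous survival data: an Exponential and a Gamma subpopulation.
data = ([random.expovariate(1.0) for _ in range(150)] +
        [random.gammavariate(4.0, 2.0) for _ in range(150)])

def exp_pdf(x, lam):
    return lam * math.exp(-lam * x)

def gamma_pdf(x, k, theta):
    return x ** (k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)

def em_exp_gamma(data, n_iter=50):
    pi, lam, k, theta = 0.5, 1.0, 2.0, 1.0   # initial guesses (assumed)
    for _ in range(n_iter):
        # E-step: responsibility of the Exponential component for each point.
        r = []
        for x in data:
            f1 = pi * exp_pdf(x, lam)
            f2 = (1 - pi) * gamma_pdf(x, k, theta)
            r.append(f1 / (f1 + f2))
        # M-step: weighted MLE for the Exponential rate; weighted method of
        # moments for the Gamma shape/scale (a common simplification).
        w1 = sum(r)
        w2 = len(data) - w1
        pi = w1 / len(data)
        lam = w1 / sum(ri * x for ri, x in zip(r, data))
        m = sum((1 - ri) * x for ri, x in zip(r, data)) / w2
        v = sum((1 - ri) * (x - m) ** 2 for ri, x in zip(r, data)) / w2
        k, theta = m * m / v, v / m
    return pi, lam, k, theta

pi, lam, k, theta = em_exp_gamma(data)
```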
Abstract: The literature reports a large number of approaches for
measuring the similarity between protein sequences. Most of these
approaches estimate this similarity using alignment-based techniques
that do not necessarily yield biologically plausible results, for two
reasons.
First, for the case of non-alignable (i.e., not yet definitively aligned
and biologically approved) sequences such as multi-domain, circular
permutation and tandem repeat protein sequences, alignment-based
approaches do not succeed in producing biologically plausible results.
This is due to the nature of the alignment, which is based on the
matching of subsequences in equivalent positions, while non-alignable
proteins often have similar and conserved domains in non-equivalent
positions.
Second, the alignment-based approaches lead to similarity measures
that depend heavily on the parameters set by the user for the alignment
(e.g., gap penalties and substitution matrices). For easily alignable
protein sequences, it is possible to supply a suitable combination of
input parameters that allows such an approach to yield biologically
plausible results. However, for difficult-to-align protein sequences,
supplying different combinations of input parameters yields different
results. Such variable results create ambiguities and complicate the
similarity measurement task.
To overcome these drawbacks, this paper describes a novel and
effective approach for measuring the similarity between protein
sequences, called SAF for Substitution and Alignment Free. Without
resorting either to the alignment of protein sequences or to substitution
relations between amino acids, SAF is able to efficiently detect the
significant subsequences that best represent the intrinsic properties of
protein sequences, those underlying the chronological dependencies of
structural features and biochemical activities of protein sequences.
Moreover, by using a new efficient subsequence matching scheme,
SAF more efficiently handles protein sequences that contain similar
structural features with significant meaning in chronologically
non-equivalent positions. To show the effectiveness of SAF, extensive
experiments were performed on protein datasets from different
databases, and the results were compared with those obtained by
several mainstream algorithms.
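SAF's exact subsequence-matching scheme is not reproduced here, but the general idea of alignment-free comparison can be illustrated with a simple k-mer frequency profile and cosine similarity, which scores shared subsequences regardless of their positions; the sequences below are hypothetical.

```python
import math
from collections import Counter

def kmer_profile(seq, k=3):
    """Frequency profile of all overlapping k-mers in the sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine_similarity(p, q):
    dot = sum(p[w] * q[w] for w in p if w in q)
    na = math.sqrt(sum(v * v for v in p.values()))
    nb = math.sqrt(sum(v * v for v in q.values()))
    return dot / (na * nb)

# Two hypothetical sequences sharing the same domains in swapped order,
# as in a circular permutation -- an alignment would penalize this heavily,
# while the position-free k-mer profiles remain nearly identical.
a = "MKTAYIAKQR" + "GDVEKGKKIF"
b = "GDVEKGKKIF" + "MKTAYIAKQR"
c = "PPPPPPPPPPPPPPPPPPPP"   # unrelated low-complexity sequence

sim_ab = cosine_similarity(kmer_profile(a), kmer_profile(b))
sim_ac = cosine_similarity(kmer_profile(a), kmer_profile(c))
```

Here sim_ab stays high despite the permuted domain order, while sim_ac is zero because no 3-mer is shared.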
Abstract: The MATCH project [1] entails the development of an
automatic diagnosis system that aims to support the treatment of
colon cancer by discovering mutations that occur in tumour
suppressor genes (TSGs) and contribute to the development of
cancerous tumours. The system is based on a) colon cancer clinical
data and b) biological information derived by data mining
techniques from genomic and proteomic sources. The core mining
module will consist of popular, well-tested hybrid feature extraction
methods and new combined algorithms designed especially for the
project. Elements of rough sets, evolutionary computing, cluster
analysis, self-organizing maps and association rules will be used to
discover the associations between genes and their influence on
tumours [2]-[11].
The methods used to process the data have to address its high
complexity, potential inconsistency and the problem of missing
values. They must integrate all the useful information necessary to
answer the expert's question. For this purpose, the system has to
learn from data, or allow a domain specialist to interactively specify
the part of the knowledge structure it needs to answer a given query.
The program should also take into account the importance/rank of
the particular parts of the data it analyses, and adjust the algorithms
used accordingly.
Abstract: In this day and age, one of the important topics in
information security is authentication. There are several alternatives
to text-based authentication, among them the Graphical Password
(GP), also known as Graphical User Authentication (GUA). These
methods stem from the fact that humans recognize and remember
images better than alphanumeric text. This paper focuses on the
security aspects of GP algorithms and on what most researchers
have been doing to define these security features and attributes. The
goal of this study is to develop a fuzzy decision model that allows
automatic selection among available GP algorithms by taking into
consideration the subjective judgments of the decision makers, in
this case more than 50 postgraduate students of computer science.
The proposed approach is based on the Fuzzy Analytic Hierarchy
Process (FAHP), which determines the criteria weights as a linear
formula.
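One common FAHP variant, Buckley's geometric-mean method, can be sketched as follows; the triangular fuzzy pairwise judgments and the three criteria are illustrative assumptions, not the study's survey data, and the study's linear weight formula may differ.

```python
import math

# Triangular fuzzy numbers (l, m, u). Hypothetical pairwise comparisons of
# three GP security criteria (e.g. shoulder-surfing, guessing and brute-force
# resistance) -- illustrative judgments, not the study's survey data.
M = [
    [(1, 1, 1),         (1, 2, 3),       (2, 3, 4)],
    [(1/3, 1/2, 1),     (1, 1, 1),       (1, 2, 3)],
    [(1/4, 1/3, 1/2),   (1/3, 1/2, 1),   (1, 1, 1)],
]

def fuzzy_geometric_mean(row):
    """Component-wise geometric mean of a row of triangular fuzzy numbers."""
    n = len(row)
    return tuple(math.prod(t[i] for t in row) ** (1.0 / n) for i in range(3))

def fahp_weights(matrix):
    """Buckley's geometric-mean FAHP: fuzzy row means, defuzzified by the
    centroid (l + m + u) / 3, then normalized into crisp weights."""
    gm = [fuzzy_geometric_mean(row) for row in matrix]
    crisp = [(l + m + u) / 3.0 for (l, m, u) in gm]
    total = sum(crisp)
    return [c / total for c in crisp]

weights = fahp_weights(M)
```

The resulting crisp weights would then rank the GP algorithms against the weighted criteria.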
Abstract: Palestinian cities face the challenges of land scarcity,
high population growth rates, rapid urbanization, uneven
development and territorial fragmentation. Due to geopolitical
constraints and the absence of an effective Palestinian planning
institution, urban development in Palestinian cities has not followed
any discernible planning scheme. This has led to a number of
internal contradictions in the structure of cities, and adversely
affected land use, the provision of urban services, and the quality of
the living environment.
This paper explores these challenges, and the potential that exists
for introducing a more sustainable urban development pattern in
Palestinian cities. It assesses alternative development approaches
with a particular focus on sustainable development, promoting
eco-development imperatives, limiting random urbanization, and meeting
present and future challenges, including fulfilling the needs of the
people and conserving the scarce land and limited natural resources.
This paper concludes by offering conceptual proposals and guidelines
for promoting sustainable physical development in Palestinian cities.
Abstract: Power loss reduction is one of the main targets in the power industry, and so in this paper the problem of finding the optimal configuration of a radial distribution system for loss reduction is considered. Optimal reconfiguration involves selecting the best set of branches to be opened, one from each loop, to reduce resistive line losses and relieve overloads on feeders by shifting load to adjacent feeders. However, since there are many candidate switching combinations in the system, feeder reconfiguration is a complicated problem. In this paper a new approach is proposed based on a simple optimum loss calculation by determining optimal trees of the given network. From graph theory, a distribution network can be represented by a graph consisting of a set of nodes and branches; the problem can thus be viewed as determining an optimal tree of the graph that simultaneously ensures the radial structure of each candidate topology. In this method a refined genetic algorithm is also set up, and some improvements are made to the chromosome coding. In this paper an implementation of the algorithm presented in [7] is applied, with modifications to the load flow program, and this method is compared with the proposed method. In [7] an algorithm is proposed in which the choice of the switches to be opened is based on simple heuristic rules; it reduces the number of load flow runs, reduces the switching combinations to a smaller number, and gives the optimum solution. To demonstrate the validity of these methods, computer simulations with PSAT and MATLAB programs are carried out on a 33-bus test system. The results show that the performance of the proposed method is better than that of the method in [7] and other methods.
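The radiality constraint that each candidate configuration must satisfy (open one switch per loop and keep a spanning tree) can be checked as in the sketch below; the six-node network is a toy example, not the 33-bus system, and no load flow is performed.

```python
# A small meshed test network: 6 nodes, 7 branches, hence 7 - 6 + 1 = 2 loops.
# Node numbering and branch list are a toy example, not the 33-bus system.
BRANCHES = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (1, 4)]

def is_radial(branches, n_nodes):
    """A configuration is radial iff it is a spanning tree of the graph:
    exactly n_nodes - 1 branches and no cycle (checked with union-find)."""
    if len(branches) != n_nodes - 1:
        return False
    parent = list(range(n_nodes))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for a, b in branches:
        ra, rb = find(a), find(b)
        if ra == rb:          # this branch would close a loop
            return False
        parent[ra] = rb
    return True

# Open one switch per loop: (5,0) and (1,4) -> a valid radial configuration.
candidate = [br for br in BRANCHES if br not in {(5, 0), (1, 4)}]
# Opening (1,2) and (3,4) instead leaves the loop 0-1-4-5-0 and isolates 2-3.
bad = [br for br in BRANCHES if br not in {(1, 2), (3, 4)}]
```

In a GA such as the one the paper refines, this check filters chromosome decodings before the (expensive) load flow evaluates their losses.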