Abstract: Pipeline infrastructure normally represents a high investment cost, and pipelines must be kept free from risks that could cause environmental hazards and threats to personnel safety. Pipeline integrity monitoring and management are therefore crucial to providing unimpeded transportation and avoiding unnecessary production deferment. Proper cleaning and inspection are thus key to safe and reliable pipeline operation; they play an important role in any pipeline integrity management program and have become a standard industry procedure. In view of this, understanding the motion (dynamic behavior) of the PIG and predicting and controlling its speed are important in executing a pigging operation, as they offer significant benefits such as estimating the PIG's arrival time at the receiving station, planning a suitable pigging operation, and improving the efficiency of pigging tasks. The objective of this paper is to review recent developments in speed control systems for pipeline PIGs. The review is intended to serve industrial practice as a quick reference on recent developments in pipeline PIG speed control, to encourage others to extend and update the list in the future toward a knowledge base, and to attract the active interest of others in sharing their viewpoints.
Abstract: One of the basic concepts in marketing is meeting customers' needs. Since customer satisfaction is essential
for the lasting survival and development of a business, screening and
monitoring customer satisfaction and recognizing its underlying
factors must be one of the key activities of every business.
The purpose of this study is to identify the drivers that affect
customer satisfaction in a business-to-business setting in order to
improve marketing activities. We conducted a survey in which 93
business customers of a diesel generator manufacturer in Iran
participated and expressed their views on, and satisfaction with, the
supplier's services related to its products. We first developed the measures
for the drivers of satisfaction through exploratory research (by means
of feedback from executives and customers of the sponsoring firm). Then,
based on these measures, we created a mail survey and asked the
respondents to give their opinion of the sponsoring firm, which
was a supplier of diesel generators and similar products. Furthermore,
the survey asked the participants to state their functional areas
and their company characteristics.
In conclusion, we found three drivers of customer
satisfaction: reliability, information about the product, and
commercial features. Buyers and users from different functional areas
attribute different degrees of importance to the last two drivers. For
instance, people from purchasing and management areas believe that
commercial features are more important than information about
products, whereas people in engineering, maintenance and production
areas believe that having information about products is more
important than commercial aspects. Marketing experts should
consider customers' attitudes toward product information and
commercial features to improve market share.
Abstract: In recent years, computers have increased their computing capacity, and the networks that interconnect these machines have improved to the point of reaching today's high data-transfer rates. Programs that try to take advantage of these new technologies cannot be written using traditional programming techniques, since most algorithms were designed to be executed on a single processor in a non-concurrent form, rather than concurrently on a set of processors working and communicating through a network. This paper presents the ongoing development of a new system for the reconfiguration of clusters of computers that takes these new technologies into account.
Abstract: The intelligent fuzzy input estimator is used to estimate
the input force of the rigid bar structural system in this study. The
fuzzy Kalman filter without the input term and the fuzzy weighting
recursive least square estimator are two main portions of this method.
The practicability and accuracy of the proposed method were verified
with numerical simulations from which the input forces of a rigid bar
structural system were estimated from the output responses. In order to
examine the accuracy of the proposed method, a rigid bar structural
system is subjected to periodic sinusoidal dynamic loading. The
excellent performance of this estimator is demonstrated by comparing
it with the use of different weighting functions and an improper
initial process noise covariance. The estimated results show good
agreement with the true values in all cases tested.
Abstract: The aim of this contribution is to present a new
approach in modeling the electrical activity of the human heart. A
recurrent artificial neural network is used in order to exhibit a
subset of the dynamics of the electrical behavior of the human heart.
The proposed model can also be used, when integrated, as a
diagnostic tool of the human heart system.
What makes this approach unique is that every model is
developed from physiological measurements of an individual.
This kind of approach is very difficult to apply successfully in many
modeling problems, because of the complexity and entropy of the
free variables describing the complex system. Differences between
the modeled variables and the variables of an individual, measured at
specific moments, can be used for diagnostic purposes. The sensor
fusion used in order to optimize the utilization of biomedical sensors
is another point that this paper focuses on. Sensor fusion has been
known for its advantages in applications such as control and
diagnostics of mechanical and chemical processes.
Abstract: This paper presents a supervised clustering algorithm,
namely Grid-Based Supervised Clustering (GBSC), which is able to
identify clusters of any shapes and sizes without presuming any
canonical form for data distribution. The GBSC needs no prespecified
number of clusters, is insensitive to the order of the input
data objects, and is capable of handling outliers. Built on the
combination of grid-based clustering and density-based clustering,
under the assistance of the downward closure property of density
used in bottom-up subspace clustering, the GBSC can notably reduce
its search space to avoid the memory confinement situation during its
execution. On two-dimension synthetic datasets, the GBSC can
identify clusters with different shapes and sizes correctly. The GBSC
also outperforms five other supervised clustering algorithms in
experiments performed on several UCI datasets.
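The grid-and-density idea behind such algorithms can be illustrated with a minimal sketch (this is not the authors' GBSC itself; the cell size, density threshold, and 8-neighbour merge rule are assumptions made here for illustration):

```python
from collections import defaultdict, deque

def grid_cluster(points, cell_size, density_threshold):
    """Bin 2-D points into grid cells, keep cells whose point count
    meets the density threshold, and merge neighbouring dense cells
    into clusters with a breadth-first search."""
    cells = defaultdict(list)
    for x, y in points:
        cells[(int(x // cell_size), int(y // cell_size))].append((x, y))
    dense = {c for c, pts in cells.items() if len(pts) >= density_threshold}
    clusters, seen = [], set()
    for start in dense:
        if start in seen:
            continue
        seen.add(start)
        queue, members = deque([start]), []
        while queue:
            cx, cy = queue.popleft()
            members.extend(cells[(cx, cy)])
            for dx in (-1, 0, 1):          # visit the 8 neighbouring cells
                for dy in (-1, 0, 1):
                    nb = (cx + dx, cy + dy)
                    if nb in dense and nb not in seen:
                        seen.add(nb)
                        queue.append(nb)
        clusters.append(members)
    return clusters
```

Restricting the merge search to dense cells is what keeps the search space, and hence memory use, small, which is the property the abstract emphasizes.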
Abstract: The health record in an Electronic Health Record
(EHR) system is more sensitive than demographic data. This raises
important requirements for the EHR in privacy, security, audit
trail, patient access, and archiving and data retention. Studies
of EHR system security are scarce. The aim of this study is to
build a secure environment for the EHR system using the Integrating the
Healthcare Enterprise (IHE) Audit Trail and Node Authentication
(ATNA) profile. The CDA documents can then be accessed in a secure EHR
environment.
Abstract: Trihalomethanes are the most significant by-products of the reaction of the disinfection agent with organic precursors naturally present in ground and surface waters. Their occurrence negatively affects the quality of drinking water owing to their nephrotoxic, hepatotoxic and genotoxic effects on human health. Taking into consideration the considerable volatility of the monitored contaminants, it could be assumed that their occurrence in drinking water depends on the distance of the sampling point from the site of disinfection. Based on the concentrations of trihalomethanes determined by gas chromatography with a mass spectrometric detector and on an analysis of variance (ANOVA), this dependence has been shown to be statistically significant. The acquired results will be used for assessing the non-carcinogenic and genotoxic risks to consumers.
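The statistical test in question can be reproduced in a few lines; the following one-way ANOVA F statistic (written from the standard textbook formula, with invented illustrative data rather than the paper's measurements) would compare THM concentrations grouped by sampling distance:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: ratio of between-group to
    within-group mean squares for k groups of measurements."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

An F value exceeding the critical value of the F distribution with k-1 and n-k degrees of freedom indicates that the mean concentration differs significantly between sampling distances.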
Abstract: This paper presents a mark-up approach to service creation in Next Generation Networks. The approach allows deriving added value from network functions exposed through Parlay/OSA (Open Service Access) interfaces. With OSA interfaces, service logic scripts can be executed on both call-related and call-unrelated events. To illustrate the approach, XML-based language constructions for data and method definitions, flow control, time measurement and supervision, and database access are given, and an example OSA application is considered.
Abstract: Composite steel-concrete slabs using thin-walled
corrugated steel sheets with embossments represent a modern and
effective combination of steel and concrete. However, the design
of new types of sheeting is conditional on the execution of expensive
and time-consuming laboratory testing. The effort to develop
a cheaper and faster method has led to many investigations all over
the world. In our paper we compare the results from our experiments
involving vacuum loading, four-point bending and small-scale shear
tests.
Abstract: Restructured electricity markets may provide
opportunities for producers to exercise market power, maintaining
prices in excess of competitive levels. In this paper an oligopolistic
market is presented in which all Generation Companies (GenCos) bid
according to a Cournot model. A genetic algorithm (GA) is applied to obtain
the generation schedule of each GenCo as well as the hourly market
clearing prices (MCP). In order to consider network constraints, a
multiperiod framework is presented to simulate the market clearing
mechanism, in which the behaviors of market participants are
modelled through piecewise block curves. Mixed integer linear
programming (MILP) is employed to solve the problem. The impacts of
the market clearing process on participants' characteristics and on final
market prices are presented. Consequently, a novel multi-objective
model is proposed for the security-constrained optimal bidding strategy
of GenCos. The capability of price-maker GenCos to alter the MCP is
evaluated by introducing an effective-supply curve. In addition,
the impact of exercising market power on the variation of market
characteristics as well as on GenCo scheduling is studied.
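For intuition, the Cournot equilibrium that such a GA searches for can be computed directly in the simplest symmetric case (linear inverse demand P = a - bQ and constant marginal cost c are illustrative assumptions; the paper's model additionally handles network constraints and block bids):

```python
def cournot_duopoly(a, b, c, iterations=200):
    """Iterated best response for two symmetric Cournot firms with
    inverse demand P = a - b*(q1 + q2) and marginal cost c.
    Firm i's best response to q_j is q_i = (a - c - b*q_j) / (2*b)."""
    q1 = q2 = 0.0
    for _ in range(iterations):
        q1 = (a - c - b * q2) / (2 * b)
        q2 = (a - c - b * q1) / (2 * b)
    return q1, q2

# The analytical equilibrium quantity for each firm is (a - c) / (3*b).
```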
Abstract: In this paper a PID control strategy using a neural
network with adaptive RASP1 wavelets for the control of wind energy
conversion systems (WECSs) is proposed. It is based on a single-layer
feedforward neural network whose hidden nodes use adaptive RASP1
wavelet functions, combined with an infinite impulse response (IIR)
recurrent structure. The IIR filter is cascaded with the network to
provide a doubly local structure, which improves the speed of learning.
This neuro-PID controller assumes a certain model structure to
approximately identify the system dynamics of the unknown plant
(the WECS) and to generate the control signal. The results are applied
to a typical turbine/generator pair, showing the feasibility of the
proposed solution.
Abstract: The Requirements Abstraction Model (RAM) helps in managing abstraction in requirements by organizing them at four levels (product, feature, function and component). The RAM is adaptable and can be tailored to meet the needs of various organizations. Because software requirements are an important source of information for developing high-level tests, organizations willing to adopt the RAM need to know how suitable RAM requirements are for developing such tests. To investigate this suitability, test cases for twenty randomly selected requirements were developed, analyzed and graded. The requirements were selected from the requirements document of a Course Management System, a web-based software system that supports teachers and students in performing course-related tasks. This paper describes the results of the requirements document analysis. The results show that requirements at the lower levels of the RAM are suitable for developing executable tests, whereas it is hard to develop such tests from requirements at the higher levels.
Abstract: The lack of any centralized infrastructure in mobile ad
hoc networks (MANET) is one of the greatest security concerns in
the deployment of wireless networks. Thus communication in
MANET functions properly only if the participating nodes cooperate
in routing without any malicious intention. However, some of the
nodes may be malicious in their behavior, by indulging in flooding
attacks on their neighbors. Some others may act malicious by
launching active security attacks such as denial of service. This paper
reviews related work on trust evaluation and establishment in ad hoc
networks, as well as related work on flooding attack prevention. A new
trust approach based on the extent of friendship between nodes is
proposed, which makes the nodes cooperate and prevents flooding
attacks in an ad hoc environment.
The performance of the trust algorithm is tested in an ad hoc network
implementing the Ad hoc On-demand Distance Vector (AODV)
protocol.
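A friendship-based trust scheme of this kind might maintain, per neighbour, a trust value updated on observed forwarding behaviour; the exponential update rule and the 0.7 friendship threshold below are illustrative assumptions, not the paper's exact mechanism:

```python
def update_trust(trust, cooperated, alpha=0.1):
    """Move trust toward 1 after cooperative forwarding and toward 0
    after misbehaviour (e.g. flooding); trust stays within [0, 1]."""
    target = 1.0 if cooperated else 0.0
    return (1 - alpha) * trust + alpha * target

def is_friend(trust, threshold=0.7):
    """Only sufficiently trusted neighbours are treated as friends
    whose packets and route requests are forwarded."""
    return trust >= threshold
```

Because trust decays quickly under repeated misbehaviour, a node that floods its neighbours soon drops below the friendship threshold and is isolated from routing.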
Abstract: In this paper we discuss a security module for car
appliances intended to prevent theft and illegal use in other cars. We
propose an open structure including authentication and encryption,
embedding a security module in each appliance to protect it. Illegally
moving or using a car appliance with the security module, without
permission, renders the appliance useless. This paper also presents
the component identification and the relevant procedures. Recovery
from damage caused by a burglar is possible at low cost. We expect
this paper to offer new business opportunities to the automotive and
technology industries.
Abstract: Key management is a vital component of any modern security protocol. Due to scalability and practical implementation considerations, automatic key management seems a natural choice in significantly large virtual private networks (VPNs). In this context the IETF Internet Key Exchange (IKE) is the most promising protocol under permanent review. We have made a humble effort to pinpoint the net gain of IKEv2 over IKEv1 due to recent modifications in its original structure, along with a brief overview of the salient improvements between the two versions. We have used the US National Institute of Standards and Technology (NIST) VPN simulator to compare some important performance metrics.
Abstract: Grid computing provides a virtual framework for
controlled sharing of resources across institutional boundaries.
Recently, trust has been recognised as an important factor for
selection of optimal resources in a grid. We introduce a new method
that provides a quantitative trust value, based on the past interactions
and present environment characteristics. This quantitative trust value
is used to select a suitable resource for a job and eliminates run time
failures arising from incompatible user-resource pairs. The proposed
work will act as a tool to calculate the trust values of the various
components of the grid and thereby improve the success rate of the
jobs submitted to resources on the grid. Access to a resource
depends not only on the identity and behaviour of the resource but
also on its context of transaction, time of transaction, connectivity
bandwidth, availability of the resource and load on the resource. The
quality of the recommender is also evaluated based on the accuracy
of the feedback provided about a resource. The jobs are submitted for
execution to the selected resource after finding the overall trust value
of the resource. The overall trust value is computed with respect to
the subjective and objective parameters.
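One plausible way to combine the subjective and objective parameters mentioned above into a single trust value (the particular weights and the equal blend of the two components are assumptions made for illustration) is a weighted average:

```python
def overall_trust(past_interactions, context_weights, context_scores):
    """Blend a subjective score (fraction of successful past jobs)
    with a weighted sum of objective environment scores such as
    bandwidth, availability and load, each normalised to [0, 1]."""
    successes, total = past_interactions
    subjective = successes / total if total else 0.5  # neutral prior
    objective = sum(w * s for w, s in zip(context_weights, context_scores))
    # equal blend of the two components (an assumption of this sketch)
    return 0.5 * subjective + 0.5 * objective
```

A job would then be dispatched to the candidate resource with the highest overall trust value, avoiding run-time failures from poorly matched user-resource pairs.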
Abstract: The emerging Semantic Web has attracted many
researchers and developers. New applications have been developed on top of the Semantic Web, and many supporting tools have been introduced to improve its software development process. Metadata modeling is one part of the development process for which supporting tools exist. The existing
tools lack the readability and ease of use that a domain knowledge expert needs to graphically model a problem as a semantic model. In this paper, a metadata modeling tool called RDFGraph is proposed
to solve these problems. RDFGraph is also designed to work with modern database management systems that support RDF and to improve the performance of query execution. The
testing results show that the rules used in RDFGraph follow the W3C standard and that the graphical models produced by the tool are properly translated and correct.
Abstract: Stock portfolio selection is a classic problem in finance;
it involves deciding how to allocate an institution's or an individual's
wealth to a number of stocks, with certain investment objectives
(return and risk). In this paper, we adopt the classical Markowitz
mean-variance model and consider an additional common realistic
constraint, namely the cardinality constraint. Stock portfolio
optimization thus becomes a mixed-integer quadratic programming problem
that is difficult to solve with exact optimization algorithms.
Chemical Reaction Optimization (CRO), which mimics the molecular
interactions in a chemical reaction process, is a population-based
metaheuristic method. Two different types of CRO, named canonical
CRO and Super Molecule-based CRO (S-CRO), are proposed to solve
the stock portfolio selection problem. We test both canonical CRO
and S-CRO on a benchmark and compare their performance under
two criteria: Markowitz efficient frontier (Pareto frontier) and Sharpe
ratio. Computational experiments suggest that S-CRO is promising
in handling the stock portfolio optimization problem.
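The objective any such metaheuristic must evaluate is the standard Markowitz one; a candidate portfolio's statistics and its cardinality feasibility can be checked as follows (the numeric values in the accompanying test are invented, and the risk-free rate defaults to zero):

```python
def portfolio_stats(weights, mu, cov, rf=0.0):
    """Markowitz statistics for a candidate portfolio: expected
    return w'mu, variance w'Cov w, and the Sharpe ratio."""
    n = len(weights)
    ret = sum(w * m for w, m in zip(weights, mu))
    var = sum(weights[i] * cov[i][j] * weights[j]
              for i in range(n) for j in range(n))
    return ret, var, (ret - rf) / var ** 0.5

def satisfies_cardinality(weights, k, eps=1e-12):
    """Cardinality constraint: the portfolio holds at most k assets."""
    return sum(1 for w in weights if abs(w) > eps) <= k
```

In a CRO-style search, infeasible molecules (portfolios violating the cardinality constraint) would be repaired or penalised before their energy (objective value) is compared.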
Abstract: With the latest technological improvements, digital systems have become more popular than in the past. Despite this growing demand for digital systems, content copying and attacks against digital cinema content have become a serious problem. To solve this security problem, we propose traceable watermarking using hash functions for a digital cinema system. Digital cinema is a natural application for traceable watermarking since it uses watermarking technology during content playback as well as content transmission. The watermark is embedded into randomly selected movie frames using the CRC-32 technique, with CRC-32 serving as a hash function: the embedding positions are distributed by the hash function so that other parties cannot locate, remove, or alter the watermark. Finally, our experimental results show that the proposed DWT watermarking method using CRC-32 performs much better than conventional watermarking techniques in terms of robustness and image quality, with a simple but hard-to-break algorithm.
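The key-dependent selection of embedding frames can be sketched with the standard library's CRC-32 (the key-and-counter input format below is an assumption rather than the paper's exact scheme, and CRC-32 is a checksum rather than a cryptographic hash, so the positions are obscured rather than provably secret):

```python
import zlib

def embedding_frames(key, num_frames, num_positions):
    """Derive a deterministic, key-dependent list of distinct frame
    indices for watermark embedding from CRC-32 digests."""
    positions, counter = [], 0
    while len(positions) < num_positions:
        digest = zlib.crc32(f"{key}:{counter}".encode())
        idx = digest % num_frames
        if idx not in positions:   # embed at most once per frame
            positions.append(idx)
        counter += 1
    return positions
```

Both the embedder and the detector can regenerate the same positions from the shared key, so the positions never need to be transmitted alongside the content.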