Abstract: Producing IT products and services requires careful design. The IT development process is intangible and labour-intensive. Making optimal use of available resources, both soft (knowledge, skill-set, etc.) and hard (computer systems, ancillary equipment, etc.), is vital if IT development is to achieve a sensible economic advantage. Apart from the norms of the Project Life Cycle and the System Development Life Cycle (SDLC), there is an urgent need to establish a general yet widely acceptable guideline on the most effective and efficient way to proceed with an IT project in the broader view of the Product Life Cycle. The current paper proposes such a framework with two major areas of concern: (1) an integration of IT Products and IT Services within an existing IT Process architecture; and (2) how IT Products and IT Services are built into the framework of the Product Life Cycle, Project Life Cycle and SDLC.
Abstract: The main objective of this work is to provide a fault detection and isolation scheme based on Markov parameters for residual generation and a neural network for fault classification. The diagnostic approach is accomplished in two steps. In step 1, the system is identified from a series of input/output variables through an identification algorithm. In step 2, the fault is diagnosed by comparing the Markov parameters of the faulty and non-faulty systems. An artificial neural network, trained on predetermined faulty conditions, serves to classify the unknown fault. In step 1, the identification is done by first formulating a Hankel matrix out of the input/output variables and then decomposing the matrix via the singular value decomposition technique. For identifying the system online, a sliding-window approach is adopted wherein an open slit slides over a subset of 'n' input/output variables. The faults are introduced at arbitrary instances and the identification is carried out online. Fault residues are extracted by comparing the first five Markov parameters of the faulty and non-faulty systems. The proposed diagnostic approach is illustrated on benchmark problems with encouraging results.
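The residual-generation idea can be sketched minimally as follows, assuming a SISO system and plain least-squares estimation (a simplification of the abstract's Hankel/SVD identification): estimate the first few Markov parameters from input/output records, then compare them with the nominal set.

```python
import numpy as np

def estimate_markov_parameters(u, y, n=5):
    """Estimate the first n Markov parameters (impulse-response
    samples) of a SISO system from input/output records by least
    squares on y[k] ~ sum_{i<n} h_i * u[k-i]."""
    N = len(y)
    # Toeplitz regressor of delayed inputs (zero initial conditions)
    T = np.column_stack([np.r_[np.zeros(i), u[:N - i]] for i in range(n)])
    h, *_ = np.linalg.lstsq(T, y, rcond=None)
    return h

def fault_residual(h_nominal, h_current):
    """Residual: deviation of the current Markov parameters from the
    nominal (non-faulty) ones; a clearly nonzero value flags a fault."""
    return np.linalg.norm(np.asarray(h_nominal) - np.asarray(h_current))
```

In a sliding-window setting, `estimate_markov_parameters` would be re-run on each window of 'n' input/output samples and the residual tracked over time.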
Abstract: This paper presents a new vision technique for robotic manipulation of randomly oriented objects in industrial applications. The proposed approach uses 2D and 3D vision to efficiently extract the 3D pose of an object in the presence of multiple randomly positioned objects. 2D vision allows quick selection of the objects of interest for 3D processing with a new modified ICP algorithm (FaR-ICP), thus significantly reducing the processing time. The extracted 3D pose is then sent to the robot manipulator for picking. The tests show that the proposed system achieves high performance.
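At the core of any ICP variant, including the FaR-ICP modification above, is the least-squares rigid alignment of two corresponded point sets; a minimal sketch of that step (the standard Kabsch/SVD solution, not the paper's filtering logic):

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping point set P
    (rows are points) onto Q -- the alignment step at the core of a
    standard ICP iteration (Kabsch/SVD)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

A full ICP loop alternates this step with nearest-neighbour correspondence search until the pose converges.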
Abstract: Two completely different approaches to Gigabit-Ethernet-compliant stream transmission over 50 m of 1 mm PMMA SI-POF have been experimentally demonstrated and are compared in this paper. The first solution is based on a commercial RC-LED transmitter and a careful optimization of the physical-layer architecture, realized during the POF-PLUS EU Project. The second solution exploits the performance of an edge-emitting laser at the transmitter side in order to avoid any sort of electrical equalization at the receiver side.
Abstract: In this work, we present a novel active learning approach for learning a visual object detection system. Our system is composed of an active learning mechanism as a wrapper around a sub-algorithm which implements an online boosting-based learning object detector. At the core is a combination of a bootstrap procedure and a semi-automatic learning process based on the online boosting procedure. The idea is to exploit the availability of the classifier during learning to automatically label training samples and incrementally improve the classifier. This reduces the labeling effort while obtaining better performance. In addition, we propose a verification process for further improvement of the classifier. The idea is to allow re-updating on previously seen data during learning to stabilize the detector. The main contribution of this empirical study is a demonstration that active learning based on an online boosting approach trained in this manner can achieve results comparable to, or even better than, a framework trained in the conventional manner using much more labeling effort. Empirical experiments on challenging data sets for specific object detection problems show the effectiveness of our approach.
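The bootstrap-plus-auto-labeling loop can be sketched with a deliberately simplified stand-in for the online boosted detector: an incremental nearest-centroid classifier and a hypothetical confidence margin (both are illustrative assumptions, not the paper's components).

```python
import numpy as np

class OnlineCentroidClassifier:
    """Tiny stand-in for an online-trained detector: an incremental
    nearest-centroid classifier updated one sample at a time."""
    def __init__(self, n_features, n_classes=2):
        self.centroids = np.zeros((n_classes, n_features))
        self.counts = np.zeros(n_classes)

    def update(self, x, y):
        self.counts[y] += 1
        self.centroids[y] += (x - self.centroids[y]) / self.counts[y]

    def predict(self, x):
        d = np.linalg.norm(self.centroids - x, axis=1)
        return int(np.argmin(d)), float(d.min())

def active_learning_loop(seed_data, unlabeled, clf, margin=1.0):
    """Bootstrap on hand-labelled seeds, then auto-label confident
    unlabeled samples and collect the uncertain ones as queries."""
    for x, y in seed_data:                 # 1) bootstrap
        clf.update(x, y)
    queries = []
    for x in unlabeled:                    # 2) semi-automatic learning
        y_hat, dist = clf.predict(x)
        if dist < margin:                  # confident: self-label, update
            clf.update(x, y_hat)
        else:                              # uncertain: ask the oracle
            queries.append(x)
    return queries
```

The confident samples grow the training set for free; only the returned queries would consume human labeling effort.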
Abstract: Glazing is a process used to reduce undesirable drying or dehydration of fish during frozen or cold storage. To evaluate the effect of the time/temperature binomial of the cryogenic freezing tunnel on the amount of glazing water, a Central Composite Rotatable Design was used, with application of the Response Surface Methodology. The results reveal that the time/temperature obtained for pink cusk-eel under the experimental glazing-water conditions is similar to the industrial process, but for red fish and merluza the industrial process needs some adjustments. Control charts were established and implemented to control the amount of glazing water on sardine and merluza. They show that the freezing process was statistically controlled, but there were some tendencies that must be analyzed, since the trend of sample mean values approached either of the limits, mainly in merluza. Thus, appropriate actions must be taken in order to improve the process.
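The control charts referred to above rest on Shewhart-style limits; a minimal X-bar computation, assuming rational subgroups and a simple pooled standard-deviation estimate (the usual c4 bias correction is omitted for brevity), looks like:

```python
import numpy as np

def xbar_limits(samples):
    """Shewhart X-bar control limits from rational subgroups.
    samples: (k, n) array -- k subgroups of size n. Uses a simple
    pooled standard-deviation estimate (c4 bias correction omitted)."""
    n = samples.shape[1]
    center = samples.mean()                          # grand mean
    sigma_hat = samples.std(axis=1, ddof=1).mean()   # pooled sigma
    half_width = 3.0 * sigma_hat / np.sqrt(n)        # 3-sigma limits
    return center - half_width, center, center + half_width
```

Subgroup means drifting towards either limit, as observed for merluza, would then be the trigger for corrective action.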
Abstract: In this paper, a novel method for a biometric system based on the ECG signal is proposed, using spectral coefficients computed through linear predictive coding (LPC). ECG biometric systems have traditionally used characteristics of fiducial points of the ECG signal as the feature set. Such systems have been shown to contain loopholes, so a non-fiducial system allows for tighter security. In the proposed system, incorporating non-fiducial features from the LPC spectrum produced segment and subject recognition rates of 99.52% and 100%, respectively. The proposed system outperformed a biometric system based on the wavelet packet decomposition (WPD) algorithm in terms of both recognition rate and computation time. This allows LPC to be used in a practical ECG biometric system that requires fast, stringent and accurate recognition.
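A minimal version of the LPC feature extraction is the autocorrelation method, solving the Yule-Walker normal equations for the predictor coefficients (the abstract does not state the order or windowing used, so those below are illustrative):

```python
import numpy as np

def lpc_coefficients(x, order=10):
    """LPC via the autocorrelation method: solve the Yule-Walker
    (Toeplitz) normal equations R a = r for the coefficients of the
    linear predictor x[n] ~ sum_k a_k * x[n-k]."""
    x = np.asarray(x, dtype=float)
    r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])
    return a   # feature vector for the biometric matcher
```

Each ECG segment would yield one such coefficient vector, and recognition reduces to comparing vectors across segments and subjects.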
Abstract: A new approach to determining the machine layout in a flexible manufacturing cell and finding the feasible configuration of the robot that achieves minimum cycle time is presented in this paper. The input/output locations and the optimal robot configuration are obtained for all sequences of the robot's work tasks within a specified period of time. A more realistic approach is presented by modeling the problem in the robot's joint space. The problem is formulated as a nonlinear optimization problem and solved using a Sequential Quadratic Programming algorithm.
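The SQP formulation can be illustrated with SciPy's SLSQP solver on a toy quadratic stand-in for the cycle-time objective; the two-variable cost and linear constraints below are purely illustrative assumptions, not the paper's robot model.

```python
import numpy as np
from scipy.optimize import minimize

def cycle_time(q):
    """Hypothetical quadratic stand-in for the cycle-time cost over
    two joint variables q[0], q[1]."""
    return (q[0] - 1.0) ** 2 + (q[1] - 2.5) ** 2

# Illustrative linear "reachability" constraints, g(q) >= 0
constraints = [
    {"type": "ineq", "fun": lambda q: q[0] - 2 * q[1] + 2},
    {"type": "ineq", "fun": lambda q: -q[0] - 2 * q[1] + 6},
    {"type": "ineq", "fun": lambda q: -q[0] + 2 * q[1] + 2},
]
result = minimize(cycle_time, x0=np.array([2.0, 0.0]), method="SLSQP",
                  bounds=[(0, None), (0, None)], constraints=constraints)
print(result.x)   # constrained optimum
```

In the paper's setting the decision variables would be the robot's joint coordinates and I/O locations, with the nonlinear kinematics entering the objective and constraints.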
Abstract: The world wide web, coupled with the ever-increasing sophistication of online technologies and software applications, puts greater emphasis on the need for even more sophisticated and consistent quality-requirements modeling than traditional software applications do. Web sites and Web applications (WebApps) are becoming more information-driven and content-oriented, raising concerns about their information quality (InQ). The consistent and consolidated modeling of InQ requirements for WebApps at different stages of the life cycle still poses a challenge. This paper proposes an approach to specifying InQ requirements for WebApps by reusing and extending the ISO 25012:2008(E) data quality model. We also discuss the learnability aspect of information quality for WebApps. The proposed ISO 25012-based InQ framework is a step towards a standardized approach to evaluating WebApps' InQ.
Abstract: One of the common problems encountered in software engineering is addressing and responding to the changing nature of requirements. Several approaches have been devised to address this issue, ranging from instilling resistance to changing requirements in order to mitigate impact on project schedules, to developing an agile mindset towards requirements. The approach discussed in this paper is one of conceptualizing the delta in requirements and modeling it, in order to plan a response to it. To provide some context, change is first formally identified and categorized as either formal change or informal change. While agile methodology facilitates informal change, the approach discussed in this paper seeks to develop the idea of facilitating formal change. Collecting and documenting meta-requirements that represent the phenomenon of change would be a proactive measure towards building a realistic cognition of the requirements entity, which can further be harnessed in the software engineering process.
Abstract: Location-aware computing is a type of pervasive computing that utilizes the user's location as a dominant factor for providing urban services and related applications. One of the important urban services is navigation instruction for wayfinders in a city, especially when the user is a tourist. The services presented to tourists should provide adapted, location-aware instructions. In order to achieve this goal, the main challenge is to find spatially relevant objects and location-dependent information. The aim of this paper is the development of a reusable location-aware model to handle spatial relevancy parameters in urban location-aware systems. To this end, we utilized ontology as an approach that can manage spatial relevancy by defining a generic model. Our contribution is the introduction of an ontological model based on the principles of directed interval algebra. Indeed, it is assumed that the basic elements of our ontology are the spatial intervals for the user and his/her related contexts. The relationships between them model the spatial relevancy parameters. The implementation language for the model is OWL, a web ontology language. The achieved results show that our proposed location-aware model and the application adaptation strategies provide appropriate services for the user.
Abstract: Determining the presence and location of faults on a transmission line through the adaptation of a protective distance relay, based on measurements of fixed settings such as line impedance, is achieved by several different techniques. Moreover, a fast, accurate and robust technique is required for real-time purposes in modern power systems. The application of a radial basis function neural network to transmission line protection is demonstrated in this paper. The method feeds the power system's voltage and current signals to the network to learn the hidden relationships present in the input patterns. It is observed that the proposed technique is able to identify the particular fault direction more quickly. Simulation studies show that the proposed approach is able to distinguish the direction of a fault on a transmission line swiftly and correctly, and is therefore suitable for real-time purposes.
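The radial basis function network at the heart of the approach can be sketched generically: Gaussian hidden units at fixed centers with output weights fitted by linear least squares (shown here on a scalar function; the relay's actual features and fault-direction labels are not specified in the abstract).

```python
import numpy as np

def rbf_design(X, centers, sigma):
    """Gaussian RBF hidden-layer activations for inputs X of shape (n, d)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_rbf(X, y, centers, sigma):
    """Fit the output weights by linear least squares (centers fixed)."""
    Phi = rbf_design(X, centers, sigma)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict_rbf(X, centers, sigma, w):
    return rbf_design(X, centers, sigma) @ w
```

Because only the linear output layer is trained, the fit is fast, which is part of the appeal of RBF networks for real-time protection.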
Abstract: The identification and classification of weeds are of major technical and economical importance in the agricultural industry. To automate these activities, a weed control system based on features such as shape, color and texture is feasible. The goal of this paper is to build a real-time, machine-vision weed control system that can detect weed locations. In order to accomplish this objective, a real-time robotic system is developed to identify and locate outdoor plants using machine vision technology and pattern recognition. The algorithm is developed to classify images into broad and narrow classes for real-time selective herbicide application. The developed algorithm has been tested on weeds at various locations, and the tests have shown the algorithm to be very effective in weed identification. Furthermore, the results show a very reliable performance on weeds under varying field conditions. The analysis of the results shows over 90 percent classification accuracy over 140 sample images (broad and narrow) with 70 samples from each category of weeds.
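The broad/narrow split can be illustrated with a toy shape feature computed on a binary plant mask; both the bounding-box elongation feature and the 3.0 threshold below are hypothetical stand-ins, not the paper's trained decision rule.

```python
import numpy as np

def leaf_class(mask):
    """Classify a binary plant mask as broad- or narrow-leaved using a
    toy elongation feature of its bounding box (threshold hypothetical)."""
    ys, xs = np.nonzero(mask)            # assumes a non-empty mask
    h, w = ys.ptp() + 1, xs.ptp() + 1    # bounding-box height / width
    elongation = max(h, w) / min(h, w)
    return "narrow" if elongation > 3.0 else "broad"
```

A real pipeline would combine several such shape, color and texture features before deciding where to spray.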
Abstract: Distributed denial-of-service (DDoS) attacks pose a
serious threat to network security. Many methodologies and tools have been devised to detect DDoS attacks and reduce the damage they cause. Still, most methods cannot simultaneously achieve (1) efficient detection with a small number of false alarms and (2) real-time transfer of packets. Here, we introduce
a method for proactive detection of DDoS attacks, by classifying the
network status, to be utilized in the detection stage of the proposed
anti-DDoS framework. Initially, we analyse the DDoS architecture
and obtain details of its phases. Then, we investigate the procedures
of DDoS attacks and select variables based on these features. Finally,
we apply the k-nearest neighbour (k-NN) method to classify the network status into each phase of a DDoS attack. The simulation results showed that each phase of the attack scenario is classified well and that a DDoS attack can be detected in its early stage.
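The classification step above reduces to a plain k-NN majority vote over network-status feature vectors; the two-dimensional features in the sketch are abstract placeholders for whatever variables were selected from the attack phases.

```python
import numpy as np

def knn_classify(X_train, y_train, x, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training samples (Euclidean distance)."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]
```

In the anti-DDoS framework, each class label would correspond to one phase of the attack, so an early-phase label on live traffic triggers proactive detection.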
Abstract: This paper investigates the performance of Multiple-Input Multiple-Output (MIMO) feedback system combined with Orthogonal Frequency Division Multiplexing (OFDM). Two types of codebook based channel feedback techniques are used in this work. The first feedback technique uses a combination of both the long-term and short-term channel state information (CSI) at the transmitter, whereas the second technique uses only the short term CSI. The long-term and short-term CSI at the transmitter is used for efficient channel utilization. OFDM is a powerful technique employed in communication systems suffering from frequency selectivity. Combined with multiple antennas at the transmitter and receiver, OFDM proves to be robust against delay spread. Moreover, it leads to significant data rates with improved bit error performance over links having only a single antenna at both the transmitter and receiver. The effectiveness of these techniques has been demonstrated through the simulation of a MIMO-OFDM feedback system. The results have been evaluated for 4x4 MIMO channels. Simulation results indicate the benefits of the MIMO-OFDM channel feedback system over the one without incorporating OFDM. Performance gain of about 3 dB is observed for MIMO-OFDM feedback system as compared to the one without employing OFDM. Hence MIMO-OFDM becomes an attractive approach for future high-speed wireless communication systems.
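OFDM's robustness to delay spread comes from the cyclic prefix turning a multipath channel into one complex gain per subcarrier; a minimal single-antenna sketch (the 64-subcarrier numerology and 3-tap channel are illustrative assumptions):

```python
import numpy as np

def ofdm_modulate(symbols, cp_len):
    """Map frequency-domain symbols onto orthogonal subcarriers (IFFT)
    and prepend a cyclic prefix of cp_len samples."""
    t = np.fft.ifft(symbols)
    return np.concatenate([t[-cp_len:], t])

def ofdm_demodulate(signal, cp_len):
    """Drop the cyclic prefix and return to the frequency domain."""
    return np.fft.fft(signal[cp_len:])
```

As long as the cyclic prefix is at least as long as the channel memory, equalization after the FFT reduces to a single division per subcarrier, which is what makes the frequency-selective channel easy to handle.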
Abstract: This paper examines the most common model of in-pipe robots and derives its degrees of freedom, in order to compare them with the degrees of freedom required for a system to move freely inside pipelines, and thereby derive an analytical reason for the loss of control of in-pipe robots at branched pipes. The DOF of the most common mechanism in in-pipe robots can be calculated by considering the robot as a parallel manipulator. A new design based on the previously researched in-pipe robot PAROYS has been suggested, and its ability to overcome branched sections has been simulated.
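The parallel-manipulator DOF count follows the spatial Kutzbach-Grübler criterion; a quick sketch, using the 6-UPS Stewart platform as a textbook check rather than PAROYS's actual topology:

```python
def kutzbach_mobility(n_links, joint_dofs):
    """Spatial Kutzbach-Gruebler mobility criterion:
    M = 6*(n - 1 - j) + sum(f_i), where n counts links (including the
    fixed base), j is the number of joints, and f_i each joint's
    freedoms (joint_dofs is the list of the f_i)."""
    j = len(joint_dofs)
    return 6 * (n_links - 1 - j) + sum(joint_dofs)
```

Comparing the computed mobility of the in-pipe mechanism against the freedoms needed to steer through a branch is exactly the kind of analysis the abstract describes.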
Abstract: The incorporation of computational fluid dynamics in the design of modern hydraulic turbines appears to be necessary in order to improve their efficiency and cost-effectiveness beyond the traditional design practices. A numerical optimization methodology is developed and applied in the present work to a Turgo water turbine. The fluid is simulated by a Lagrangian mesh-free approach that can provide detailed information on the energy transfer and enhance the understanding of the complex, unsteady flow field, at very small computing cost. The runner blades are initially shaped according to hydrodynamics theory, and parameterized using Bezier polynomials and interpolation techniques. The use of a limited number of free design variables allows for various modifications of the standard blade shape, while stochastic optimization using evolutionary algorithms is implemented to find the best blade that maximizes the attainable hydraulic efficiency of the runner. The obtained optimal runner design achieves considerably higher efficiency than the standard one, and its numerically predicted performance is comparable to a real Turgo turbine, verifying the reliability and the prospects of the new methodology.
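The blade parameterization above rests on Bezier polynomials; evaluating one by de Casteljau's recursion is a compact illustration (the control points in the test are arbitrary, not a blade profile):

```python
import numpy as np

def bezier_point(ctrl, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] by de
    Casteljau's algorithm; ctrl is an (n+1, d) array of control points."""
    pts = np.asarray(ctrl, dtype=float)
    while len(pts) > 1:
        # repeated linear interpolation between consecutive points
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]
```

In the optimization, the control-point coordinates are the free design variables that the evolutionary algorithm perturbs to reshape the runner blade.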
Abstract: Information sharing and gathering are important in this era of rapid technological advancement. The existence of the WWW has caused a rapid growth in the information explosion. Readers are overloaded with too many lengthy text documents when they are more interested in shorter versions. The oil and gas industry could not escape this predicament. In this paper, we develop an Automated Text Summarization System, known as AutoTextSumm, to extract the salient points of oil and gas drilling articles by incorporating a statistical approach, keyword identification, synonym words and sentence position. In this study, we have conducted interviews with Petroleum Engineering experts and English Language experts to identify the list of most commonly used keywords in the oil and gas drilling domain. The system performance of AutoTextSumm is evaluated using the precision, recall and F-score formulae. Based on the experimental results, AutoTextSumm has produced a satisfactory performance with an F-score of 0.81.
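The statistical sentence-scoring idea can be sketched as follows; the scoring weights and the positional bonus are illustrative assumptions, not AutoTextSumm's tuned values, and synonym expansion is omitted.

```python
import re

def summarize(text, keywords, n=2):
    """Score sentences by domain-keyword hits plus a positional bonus
    (earlier sentences score higher), then return the top-n sentences
    in document order."""
    sents = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    kw = {k.lower() for k in keywords}
    scores = []
    for i, s in enumerate(sents):
        words = re.findall(r'\w+', s.lower())
        hits = sum(w in kw for w in words)       # keyword frequency
        scores.append(hits + 1.0 / (1 + i))      # + positional bonus
    top = sorted(range(len(sents)), key=lambda i: -scores[i])[:n]
    return ' '.join(sents[i] for i in sorted(top))
```

The expert-provided keyword list from the interviews would be what populates `keywords` in practice.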
Abstract: In this paper we propose a segmentation approach based on the Vector Quantization technique. Here we have used Kekre's fast codebook generation algorithm for segmenting low-altitude aerial images. This is used as a preprocessing step to form segmented homogeneous regions. Further, to merge adjacent regions, color similarity and volume difference criteria are used. Experiments performed with real aerial images of varied nature demonstrate that this approach does not result in over-segmentation or under-segmentation. Vector quantization seems to give far better results than the conventional on-the-fly watershed algorithm.
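The vector-quantization preprocessing can be sketched with plain k-means codebook training on pixel colors; this is a generic stand-in for Kekre's fast codebook generation algorithm, which the paper uses for speed, and the deterministic initialization is our simplification.

```python
import numpy as np

def vq_codebook(pixels, k=4, iters=10):
    """Plain k-means codebook training (a simple stand-in for Kekre's
    fast codebook generation algorithm). pixels: (N, 3) color array.
    Returns the codebook and each pixel's codevector label."""
    # deterministic spread initialization over the pixel list
    idx = np.linspace(0, len(pixels) - 1, k).astype(int)
    codebook = pixels[idx].astype(float)
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(iters):
        d = np.linalg.norm(pixels[:, None, :] - codebook[None, :, :], axis=2)
        labels = d.argmin(axis=1)                 # nearest codevector
        for j in range(k):
            if np.any(labels == j):
                codebook[j] = pixels[labels == j].mean(axis=0)
    return codebook, labels
```

Reshaping the label map back to image dimensions yields the homogeneous regions that the color-similarity and volume-difference merging then operates on.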
Abstract: Sensitive and predictive DILI (Drug Induced Liver
Injury) biomarkers are needed in drug R&D to improve early
detection of hepatotoxicity. The discovery of DILI biomarkers that
demonstrate the predictive power to identify individuals at risk to
DILI would represent a major advance in the development of
personalized healthcare approaches. In this healthy volunteer
acetaminophen study (4g/day for 7 days, with 3 monitored nontreatment
days before and 4 after), 450 serum samples from 32
subjects were analyzed using protein profiling by antibody
suspension bead arrays. Multiparallel protein profiles were generated
using a DILI target protein array with 300 antibodies, where the
antibodies were selected based on previous literature findings of
putative DILI biomarkers and a screening process using pre-dose samples from the same cohort. Of the 32 subjects, 16 were found to develop an elevated ALT value (2× baseline; responders). Using the plasma profiling approach together with multivariate statistical analysis, some novel findings linked to lipid metabolism were made and, more importantly, endogenous protein profiles in baseline samples (prior to treatment) with predictive power for ALT elevations were identified.