Abstract: This paper investigates areas not covered by prominent
Agent-Oriented Software Engineering (AOSE) methodologies. An
extensive literature review led to the identification of two issues:
first, most of these methodologies largely neglect the semantic web
and ontologies; second, as expected, each has its strengths and
weaknesses and may focus on some phases of the development
lifecycle but not all of them. The work presented here builds
extensions to a highly regarded AOSE methodology (MaSE) in order
to cover the areas on which this methodology does not concentrate.
The extensions include introducing an ontology stage for semantic
representation and integrating early requirement specification from a
methodology that focuses mainly on that phase. The integration
involved developing transformation rules (with the necessary
handling of non-matching notions) between the two sets of
representations and building the software that automates the
transformation. The application of this integration to a case study is
also presented in the paper. The main flow of MaSE stages was
changed to smoothly accommodate the new additions.
Abstract: The prediction of long-term deformations of concrete and reinforced concrete structures has been a field of extensive research, and several different creep models have been developed so far. Most of these models were developed for constant concrete stresses; thus, in the case of varying stresses, a specific superposition principle or time-integration scheme is necessary. Nowadays, when modeling concrete creep, the engineering focus is rather on applying sophisticated time-integration methods than on choosing the more appropriate creep model. For this reason, this paper presents a method to quantify the uncertainties of creep prediction originating from the selection of the creep model or of the time-integration method. By adapting variance-based global sensitivity analysis, a methodology is developed to quantify the influence of creep model selection and the choice of time-integration method. Applying the developed method, general recommendations on how to model creep behavior for varying stresses are given.
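The superposition principle for varying stresses mentioned above can be sketched numerically; the compliance function J below is a toy aging compliance chosen for illustration, not one of the creep models compared in the paper.

```python
# Hedged sketch of Boltzmann superposition for a stepwise stress history:
# eps(t) = sum_i dsigma_i * J(t, t_i), where J(t, t_i) is the compliance
# at time t for a unit stress applied at time t_i.

def creep_strain(t, stress_steps, J):
    """stress_steps: list of (t_i, dsigma_i) stress increments applied at t_i."""
    return sum(dsig * J(t, ti) for ti, dsig in stress_steps if ti <= t)

# toy compliance: elastic part plus an aging power-law creep term
def J(t, t0):
    return 1.0 / 30000.0 + 1e-5 * (t - t0) ** 0.3 / (1.0 + t0 ** 0.2)
```

Loading at t = 0 and full unloading at t = 50 leaves a positive residual creep strain at t = 100, since the earlier increment has crept for longer than the compensating negative one.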
Abstract: The pollution of sediments sampled from the North
Port by polycyclic aromatic hydrocarbons (PAHs) was investigated.
Concentrations of PAHs estimated in the port sediments ranged from
199 to 2851.2 μg/kg dw. The highest concentrations were found
close to the berth line; these locations are affected by intensive
shipping activities and land-based runoff and were dominated by
high-molecular-weight PAHs (4-6 rings). Source identification
showed that PAHs originated mostly from pyrogenic sources, i.e.
the combustion of fossil fuels, grass, wood and coal (in the majority
of the samples). Ecological risk assessment of the port sediments
indicated that slightly adverse ecological effects on the biological
community are expected to occur in the vicinity of stations 1 and 4.
Thus, PAHs are not considered pollutants of concern in the North
Port.
Abstract: A healthcare monitoring system is presented in this
paper. This system is based on ultra-low power sensor nodes and a
personal server, which is based on hardware and software extensions
to a Personal Digital Assistant (PDA)/Smartphone. The sensor node
collects data from the body of a patient and sends it to the personal
server where the data is processed, displayed and made ready to be
sent to a healthcare network, if necessary. The personal server
consists of a compact low-power receiver module and is equipped
with Smartphone software. The receiver module occupies a board
size of less than 30 × 30 mm and consumes approximately 25 mA in
active mode.
Abstract: This paper presents an improved image segmentation
model with edge-preserving regularization based on the
piecewise-smooth Mumford-Shah functional. A level set formulation
is considered for minimizing the Mumford-Shah functional in
segmentation, and the corresponding partial differential equations are
solved by the backward Euler discretization. To encourage
edge-preserving regularization, a new edge indicator function is
introduced in the level set framework, in which all the grid points
used to locate the level set curve are considered so as to avoid
blurring the edges, and a nonlinear smooth constraint function is
applied as the regularization term to smooth the image in the
isophote direction instead of the gradient direction. In
implementation, strategies such as a new scheme for extending the
computation of u+ and u- to the grid points and for speeding up
convergence are studied to improve the efficiency of the algorithm.
The resulting algorithm has been implemented, compared with
previous methods, and shown to be efficient in several cases.
Abstract: This paper presents a new steganography approach suitable for Arabic texts. It can be classified under steganography feature-coding methods. The approach hides secret information bits within the letters, benefiting from their inherent points. To mark the specific letters holding secret bits, the scheme considers two features: the existence of points in the letters and the redundant Arabic extension character. We use pointed letters with an extension to hold the secret bit 'one' and un-pointed letters with an extension to hold 'zero'. This steganography technique is also attractive for other languages whose scripts are similar to Arabic, such as Persian and Urdu.
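As an illustration, the pointed-letter/extension-character encoding described above can be sketched as follows; the letter sets and the simplified insertion rule are assumptions of this sketch (real Arabic typography restricts where the extension character may appear), not the paper's exact scheme.

```python
# Hedged sketch: hide bits by placing the Arabic extension character
# (tatweel, U+0640) after a pointed letter to encode '1' and after an
# un-pointed letter to encode '0'. Letter-connectivity rules of real
# Arabic typography are ignored here for brevity.
POINTED = set("بتثجخذزشضظغفقنية")
UNPOINTED = set("احدرسصطعكلمهو")
KASHIDA = "\u0640"  # Arabic tatweel (extension character)

def embed(cover: str, bits: str) -> str:
    out, queue = [], list(bits)
    for ch in cover:
        out.append(ch)
        if queue:
            if queue[0] == "1" and ch in POINTED:
                out.append(KASHIDA); queue.pop(0)
            elif queue[0] == "0" and ch in UNPOINTED:
                out.append(KASHIDA); queue.pop(0)
    if queue:
        raise ValueError("cover text too short for the message")
    return "".join(out)

def extract(stego: str) -> str:
    # A tatweel reveals a hidden bit: '1' after a pointed letter, else '0'.
    return "".join("1" if prev in POINTED else "0"
                   for prev, ch in zip(stego, stego[1:]) if ch == KASHIDA)
```

The stego text remains readable Arabic because the extension character only elongates the letter shape without altering the words.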
Abstract: Conventionally, the selection of parameters depends
heavily on the operator's experience or on conservative technological
data provided by the EDM equipment manufacturers, which yield
inconsistent machining performance. The parameter settings given by
the manufacturers are only relevant to common steel grades. A
single parameter change influences the process in a complex way.
Hence, the present research proposes artificial neural network (ANN)
models for the prediction of surface roughness of Ti-15-3 alloy,
machined here for the first time, in the electrical discharge machining
(EDM) process. The proposed models use peak current, pulse-on
time, pulse-off time and servo voltage as input parameters.
Multilayer perceptron (MLP) feedforward networks with three
hidden layers are applied, and an assessment is carried out among
models with distinct hidden layers. Training of the models is
performed with data from an extensive series of experiments
utilizing a copper electrode with positive polarity. The predictions of
the developed models have been verified with another set of
experiments and are found to be in good agreement with the
experimental results. Besides this, they can be used as valuable tools
for process planning in EDM.
Abstract: Different numerical methods are employed and developed for simulating interfacial flows. A wide range of applications belongs to this group, e.g. two-phase flows of air bubbles in water or water drops in air. In such problems, surface tension effects often play a dominant role. In this paper, various models of the surface tension force for interfacial flows (the CSF, CSS, PCIL and SGIP models) have been applied to simulate the motion of small air bubbles in water, and the results were compared and reviewed. It has been pointed out that by using the SGIP or PCIL models, we are able to simulate bubble rise and obtain results in close agreement with experimental data.
Abstract: Information is a critical asset and an important source for gaining competitive advantage in firms. The effective maintenance of IT therefore becomes an important task. In order to better understand the determinants of IT effectiveness, this study employs the Industrial Organization (I/O) and Resource-Based View (RBV) theories and investigates the industry effect and several major firm-specific factors in relation to their impact on firms' IT effectiveness. The data consist of ten-year panel observations of firms whose IT excellence had been recognized by CIO Magazine. Non-profit organizations were deliberately excluded, as explained later. The results showed that the effectiveness of IT management varied significantly across industries. Industry also moderated the effects of firm demographic factors, such as size and age, on IT effectiveness. Surprisingly, R&D investment intensity had a negative correlation with IT effectiveness. For managers and practitioners, this study offers some insights into evaluation criteria and expectations for IT project success. Finally, the empirical results indicate that the sustainability of IT effectiveness appears to be short in duration.
Abstract: This paper reports the fatigue crack growth behaviour
of gas tungsten arc, electron beam and laser beam welded Ti-6Al-4V
titanium alloy. Centre cracked tensile specimens were prepared to
evaluate the fatigue crack growth behaviour. A 100 kN
servo-hydraulic fatigue testing machine was used under
constant-amplitude uniaxial tensile loading (stress ratio of 0.1 and
frequency of 10 Hz). Crack growth curves were plotted and crack growth
parameters (exponent and intercept) were evaluated. Critical and
threshold stress intensity factor ranges were also evaluated. Fatigue
crack growth behaviour of welds was correlated with mechanical
properties and microstructural characteristics of welds. Of the three
joints, the joint fabricated by laser beam welding exhibited higher
fatigue crack growth resistance due to the presence of fine lamellar
microstructure in the weld metal.
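The crack growth "exponent and intercept" mentioned above are conventionally the Paris-law constants m and C in da/dN = C(ΔK)^m, recovered by a linear fit in log-log coordinates; the sketch below uses synthetic data, not the paper's measurements.

```python
# Hedged sketch: fit Paris-law parameters (exponent m, intercept C) from
# crack growth rate da/dN versus stress intensity factor range dK by
# least-squares regression in log-log space.
import math

def fit_paris(delta_k, dadn):
    xs = [math.log10(k) for k in delta_k]   # log10(dK)
    ys = [math.log10(r) for r in dadn]      # log10(da/dN)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope of the log-log fit is the Paris exponent m
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    # intercept gives log10(C)
    c = 10.0 ** (my - m * mx)
    return m, c
```

For data generated exactly by da/dN = 1e-11 (ΔK)^3, the fit recovers m = 3 and C = 1e-11 up to floating-point error.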
Abstract: This paper presents an algebraic approach to optimizing
queries in a domain-specific database management system
for protein structure data. The approach involves the introduction of
several protein-structure-specific algebraic operators to query the
complex data stored in an object-oriented database system. The
Protein Algebra provides an extensible set of high-level Genomic
Data Types and Protein Data Types along with a comprehensive
collection of appropriate genomic and protein functions. The paper
also presents a query translator that converts high-level query
specifications in the algebra into low-level query specifications in
Protein-QL, a query language designed to query protein structure
data. The query transformation process uses a Protein Ontology that
serves as a dictionary.
Abstract: This paper presents the averaging model of a buck
converter derived from the generalized state-space averaging method.
Sliding mode control is used to regulate the output voltage of the
converter and is taken into account in the model. The proposed model
requires less computational time than the full topology model.
Intensive time-domain simulations via the exact topology model are
used as the reference for comparison. The results show that good
agreement between the proposed model and the switching model is
achieved in both transient and steady-state responses. The reported
model is suitable for optimal controller design using artificial
intelligence techniques.
Abstract: Current advancements in nanotechnology depend on capabilities that enable nano-scientists to extend their eyes and hands into the nano-world. For this purpose, a haptics-based (devices capable of recreating tactile or force sensations) system for the AFM (Atomic Force Microscope) is proposed. The system enables nano-scientists to touch and feel the sample surfaces viewed through the AFM, in order to provide them with a better understanding of the physical properties of the surface, such as roughness, stiffness and the shape of the molecular architecture. At this stage, the proposed work uses offline images produced using the AFM and performs image analysis to create virtual surfaces suitable for haptic force analysis. The research work is in the process of extension from an offline to an online process, where interaction will be performed directly on the material surface for realistic analysis.
Abstract: Multi-dimensional principal component analysis
(PCA) is the extension to multi-dimensional data of PCA, which is
widely used as a dimensionality reduction technique in multivariate
data analysis. To calculate the PCA, the singular value
decomposition (SVD) is commonly employed because of its
numerical stability. The multi-dimensional PCA can be calculated by
using the higher-order SVD (HOSVD), proposed by Lathauwer et
al., analogously to the case of ordinary PCA. In this paper, we apply
multi-dimensional PCA to multi-dimensional medical data including
the functional independence measure (FIM) score, and describe the
results of an experimental analysis.
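A minimal numerical sketch of the HOSVD computation described above (function names and tensor shapes are illustrative assumptions): each factor matrix is taken from the SVD of the corresponding mode-n unfolding, mirroring how ordinary PCA takes the left singular vectors of the data matrix.

```python
# Hedged sketch of HOSVD (Lathauwer et al.): factor matrices are the
# left singular vectors of each mode-n unfolding; the core tensor is the
# data tensor multiplied by the transposed factors along every mode.
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: mode-n fibers become the columns of a matrix."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def hosvd(tensor):
    # one SVD per mode, analogous to the single SVD of ordinary PCA
    factors = [np.linalg.svd(unfold(tensor, n), full_matrices=False)[0]
               for n in range(tensor.ndim)]
    core = tensor
    for n, u in enumerate(factors):
        # multiply mode n by u.T to project onto the mode-n basis
        core = np.moveaxis(
            np.tensordot(u.T, np.moveaxis(core, n, 0), axes=1), 0, n)
    return core, factors
```

Multiplying the core back by the factor matrices along every mode reconstructs the original tensor, which serves as a quick correctness check.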
Abstract: The study of the effect of laser scanning speed on
material efficiency in Ti6Al4V applications is very important
because unspent powder is not reusable, owing to high-temperature
oxygen pick-up and contamination. This work carried out an
extensive study of the effect of scanning speed on material efficiency
by varying the speed between 0.01 and 0.1 m/s. The samples are
wire-brushed and cleaned with acetone after each deposition to
remove un-melted particles from the surface of the deposit. The
substrate is weighed before and after deposition. A formula was
developed to calculate the material efficiency, and the scanning speed
was compared with the powder efficiency obtained. The results are
presented and discussed. The study revealed that an optimum
scanning speed exists for this study at 0.01 m/s, above and below
which the powder efficiency will drop.
Abstract: Coronary artery bypass grafts (CABG) are widely
studied with respect to the hemodynamic conditions which play an
important role in the presence of restenosis. However, papers
concerned with the constitutive modeling of CABG are lacking in the
literature. The purpose of this study is to find a constitutive model for
CABG tissue. A sample of CABG obtained during an autopsy
underwent an inflation-extension test. Displacements were
recorded by CCD cameras and subsequently evaluated by digital
image correlation. Pressure-radius and axial force-elongation
data were used to fit the material model. The tissue was modeled as a
one-layered composite reinforced by two families of helical fibers.
The material is assumed to be locally orthotropic, nonlinear,
incompressible and hyperelastic. Material parameters are estimated
for two strain energy functions (SEF). The first is the classical
exponential. The second SEF is logarithmic, which allows
interpretation by means of limiting (finite) strain extensibility.
The presented material parameters are estimated by optimization
based on the radial and axial equilibrium equations of a thick-walled
tube. Both material models fit the experimental data successfully.
The exponential model fits the relationship between axial force and
axial strain significantly better than the logarithmic one.
Abstract: The purpose of this study was to understand the main
sources of copper (Cu) accumulation in the target organs of tilapia
(Oreochromis mossambicus) and to investigate how the organism
mediates the process of Cu accumulation under prolonged exposure
conditions. By measuring both dietary and waterborne Cu
accumulation and total concentrations in tilapia with a biokinetic
modeling approach, we were able to clarify the biokinetic coping
mechanisms for long-term Cu accumulation. This study showed that
water and food are both major sources of Cu for the muscle and liver
of tilapia, implying that controlling the Cu concentration in these two
routes will be correlated with the Cu bioavailability for tilapia. We
found that the exposure duration and the level of waterborne Cu
drove the Cu accumulation in tilapia. The capacity for Cu biouptake
and depuration in the organs of tilapia was actively mediated under
prolonged exposure conditions. Generally, the uptake rate,
depuration rate and net bioaccumulation capacity in all selected
organs decreased with increasing levels of waterborne Cu and
extension of the exposure duration. Muscle tissues accounted for
over 50% of the total accumulated Cu and played a key role in
buffering the Cu burden in the initial period of exposure, while the
liver played a more important role in the storage of Cu as exposure
was extended. We concluded that the assumption of constant
biokinetic rates could lead to incorrect predictions, overestimating
the long-term Cu accumulation in ecotoxicological risk assessments.
Abstract: More recent satellite projects/programs make
extensive use of real-time embedded systems. 16-bit processors
which meet the MIL-STD-1750 standard architecture have been used
in on-board systems, and most space applications have been written
in Ada. From a futuristic point of view, 32-bit/64-bit processors are
needed in the area of spacecraft computing, and therefore an effort is
desirable in the study and survey of 64-bit architectures for space
applications. This will also result in significant technology
development in terms of VLSI and software tools for Ada (as the
legacy code is in Ada).
There are several basic requirements for a special processor for
this purpose. They include radiation-hardened (RadHard) devices,
very low power dissipation, compatibility with existing operational
systems, scalable architectures for higher computational needs,
reliability, higher memory and I/O bandwidth, predictability, a
real-time operating system and the manufacturability of such
processors. Further considerations include the selection of FPGA
devices, the selection of EDA tool chains, design flow, partitioning
of the design, pin count, performance evaluation, timing analysis,
etc.
This project comprises a brief study of the 32- and 64-bit
processors readily available in the market and the design/fabrication
of a 64-bit RISC processor, named RISC MicroProcessor, with the
added functionalities of an extended double-precision floating-point
unit and a 32-bit signal processing unit acting as co-processors. In
this paper, we emphasize the ease and importance of using Open
Core designs (the OpenSparc T1 Verilog RTL) and open-source
EDA tools such as Icarus to develop FPGA-based prototypes
quickly. Commercial tools such as Xilinx ISE are also used for
synthesis when appropriate.
Abstract: We report on a high-speed quantum cryptography
system that utilizes simultaneous entanglement in polarization and in
"time-bins". With multiple degrees of freedom contributing to the
secret key, we can achieve over ten bits of random entropy per
detected coincidence. In addition, we collect from multiple spots of
the downconversion cone to further amplify the data rate, allowing
us to achieve over 10 Mbits of secure key per second.
Abstract: In the present work, the behavior of inoxydable
(stainless) steel as a reinforcement bar in composite concrete is
investigated. The bar-concrete adherence in a reinforced concrete
(RC) beam is studied, with a focus on the tension stiffening
parameter. This study highlights an approach to observing this
interaction behavior in a bending test instead of the direct tension
test reported in many references. The approach resembles the actual
loading condition of a structural RC beam. The tension stiffening
properties are then applied in a numerical finite element analysis
(FEA) to verify their correlation with the laboratory results. The
comparison shows a good correlation between the two. The
experimental setting is able to determine the tension stiffening
parameters in an RC beam, and the modeling strategies adopted in
ABAQUS can closely represent the actual condition. The tension
stiffening model used can represent the interaction properties
between inoxydable steel and concrete.