Abstract: Current advancements in nanotechnology depend on capabilities that enable nano-scientists to extend their eyes and hands into the nano-world. For this purpose, a haptics-based (devices capable of recreating tactile or force sensations) system for the AFM (Atomic Force Microscope) is proposed. The system enables nano-scientists to touch and feel sample surfaces viewed through the AFM, in order to give them a better understanding of the physical properties of the surface, such as roughness, stiffness and the shape of the molecular architecture. At this stage, the proposed work uses offline images produced using the AFM and performs image analysis to create virtual surfaces suitable for haptic force analysis. The work is being extended from an offline to an online process, where interaction will be done directly on the material surface for realistic analysis.
Abstract: In this work, we present, to the best of our knowledge for the first time, an efficient digital watermarking scheme for MPEG audio layer 3 files that operates directly in the compressed data domain, manipulating the time and subband/channel domain. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented with special care for the efficient usage of the two limited resources of computer systems: time and space. It offers the industrial user watermark embedding and detection in time directly comparable to the real playing time of the original audio file, depending on the MPEG compression, while the end user/audience faces no artifacts or delays when hearing the watermarked audio file. Furthermore, it overcomes the vulnerability of algorithms operating in the PCM data domain to compression/recompression attacks, as it places the watermark in the scale-factor domain and not in the digitized sound data. The strength of our scheme, which allows it to be used successfully in both authentication and copyright protection, relies on the fact that users establish ownership of the audio file not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.
Abstract: In this paper, five options for Iran's gas flare recovery have been compared via an MCDM method. To develop the model, the weighting factor of each indicator is determined with the AHP method using the Expert Choice software. Several cases were considered in this analysis. They are defined so that the priorities always keep one criterion in first position, while the priorities of the other criteria are defined by ordinal information describing the mutual relations of the criteria and the respective indicators. The results show that, among these cases, priority is obtained for CHP usage when the availability indicator is highly weighted, for pipeline usage when the environmental indicator is highly weighted, and for injection when the economic indicator is highly weighted; injection also obtains priority when the weighting factors of all the criteria are the same.
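As an illustrative aside, the AHP weighting step mentioned above can be sketched as follows. The code derives criterion weights from a pairwise comparison matrix using the common row-geometric-mean approximation of the principal eigenvector, together with Saaty's consistency ratio. The 4×4 matrix and its judgments are invented for illustration; they are not the paper's Expert Choice inputs.

```python
import numpy as np

# Hypothetical 4x4 pairwise comparison matrix for four criteria
# (e.g. economic, environmental, availability, technical); the
# Saaty-scale judgments below are illustrative assumptions.
A = np.array([
    [1.0, 3.0, 5.0, 2.0],
    [1/3, 1.0, 2.0, 1/2],
    [1/5, 1/2, 1.0, 1/3],
    [1/2, 2.0, 3.0, 1.0],
])

def ahp_weights(A):
    """Approximate the principal eigenvector by row geometric means."""
    gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
    return gm / gm.sum()

def consistency_ratio(A, w):
    """Saaty consistency ratio; random indices RI for n = 1..4."""
    n = A.shape[0]
    lam = np.mean((A @ w) / w)        # estimate of lambda_max
    ci = (lam - n) / (n - 1)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90}[n]
    return ci / ri if ri else 0.0

w = ahp_weights(A)
print("weights:", np.round(w, 3))
print("CR:", round(consistency_ratio(A, w), 3))
```

A CR below 0.1 is conventionally taken to mean the judgments are consistent enough to use the weights.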
Abstract: Several works on facial recognition have dealt with methods that identify isolated characteristics of the face or with templates that encompass several regions of it. In this paper, a new technique is introduced which approaches the problem holistically, dispensing with the need to identify geometrical characteristics or regions of the face. The characterization of a face is achieved by randomly sampling selected attributes of the pixels of its image. From this information we construct a data set corresponding to the values of low frequencies, gradient, entropy and several other pixel characteristics of the image, generating a set of p variables. The multivariate data set is approximated with different polynomials minimizing the data-fitting error in the minimax sense (L∞ norm). The use of a Genetic Algorithm (GA) circumvents the problem of dimensionality inherent in higher-degree polynomial approximations. The GA yields the degree and the values of a set of coefficients of the polynomials approximating the image of a face. The system is trained by finding a family of characteristic polynomials in several variables (pixel characteristics) for each face (say Fi) in the database through a resampling process. A face (say F) is recognized by finding its characteristic polynomials and applying an AdaBoost classifier to compare F's polynomials to each of the Fi's polynomials. The winner is the polynomial family closest to F's, corresponding to the target face in the database.
Abstract: In this paper, we propose a novel frequency offset
estimation scheme for orthogonal frequency division multiplexing
(OFDM) systems. By correlating the OFDM signals within the coherence
phase bandwidth and employing a threshold in the frequency
offset estimation process, the proposed scheme is not only robust to
the timing offset but also has a reduced complexity compared with
that of the conventional scheme. Moreover, a timing offset estimation
scheme is also proposed as the next stage of the proposed frequency
offset estimation. Numerical results show that the proposed scheme can estimate the frequency offset with lower computational complexity and without additional memory, while maintaining the same level of estimation performance.
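The abstract does not give the estimator's exact form, so as a hedged illustration the sketch below uses the standard cyclic-prefix correlation estimator for the normalized carrier frequency offset, the family of correlation-based schemes this work belongs to; the coherence-bandwidth correlation and threshold of the proposed scheme are not reproduced. The FFT size, CP length and offset value are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, CP = 64, 16            # FFT size and cyclic-prefix length (assumed)
eps_true = 0.12           # normalized carrier frequency offset to recover

# One OFDM symbol: random QPSK subcarriers, IFFT, then cyclic prefix.
X = (rng.choice([1.0, -1.0], N) + 1j * rng.choice([1.0, -1.0], N)) / np.sqrt(2)
x = np.fft.ifft(X)
tx = np.concatenate([x[-CP:], x])

# Apply the frequency offset (noiseless here, so the estimator is exact).
n = np.arange(N + CP)
rx = tx * np.exp(2j * np.pi * eps_true * n / N)

# Samples one FFT-length apart are identical over the cyclic prefix, so
# the phase of their correlation reveals the offset: angle = -2*pi*eps.
corr = np.sum(rx[:CP] * np.conj(rx[N:N + CP]))
eps_hat = -np.angle(corr) / (2 * np.pi)
print(f"estimated offset: {eps_hat:.4f}")
```

With noise added, the correlation would be averaged over several symbols before taking the phase.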
Abstract: One of the essential requirements of a realistic surgical simulator is to reproduce the haptic sensations due to the interactions in the virtual environment. However, the interactions need to be performed in real time, since a delay between the user action and the system reaction reduces the sensation of immersion. In this paper, a prototype of a coronary stent implant simulator is presented; this system allows real-time interactions with an artery by means of a specific haptic device. To improve the realism of the simulation, the virtual environment is built from real patients' images, and a Web Portal is used to search, in geographically remote medical centres, for a virtual environment with specific features in terms of pathology or anatomy. The functional architecture of the system defines several Medical Centres in which virtual environments built from real patients' images, and the related metadata with specific features in terms of pathology or anatomy, are stored. The retrieved data are downloaded from the Medical Centre to the Training Centre, which is provided with a specific haptic device and with the software necessary to manage the interaction in the virtual environment. After the integration of the virtual environment into the simulation system, it is possible to perform training on the specific surgical procedure.
Abstract: Eukaryotic protein-coding genes are interrupted by spliceosomal introns, which are removed from the RNA transcripts before translation into a protein. The exon-intron structures of different eukaryotic species are quite different from each other, and the evolution of such structures raises many questions. We try to address some of these questions using statistical analysis of whole genomes. We go through all the protein-coding genes in a genome and study correlations between the net length of all the exons in a gene, the number of the exons, and the average length of an exon. We also take average values of these features for each chromosome and study correlations between those averages on the chromosomal level. Our data show universal features of exon-intron structures common to animals, plants, and protists (specifically, Arabidopsis thaliana, Caenorhabditis elegans, Drosophila melanogaster, Cryptococcus neoformans, Homo sapiens, Mus musculus, Oryza sativa, and Plasmodium falciparum). We have verified the linear correlation between the number of exons in a gene and the length of the protein coded by the gene: the protein length increases in proportion to the number of exons. On the other hand, the average length of an exon always decreases with the number of exons. Finally, chromosome clustering based on average chromosome properties and parameters of linear regression between the number of exons in a gene and the net length of those exons demonstrates that these average chromosome properties are genome-specific features.
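The per-gene correlations described above can be sketched on toy data. The gene table below is invented: the mean exon length is made to shrink as the exon count grows, mimicking the reported trend, so the numbers are assumptions standing in for parsed genome annotations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy gene table: exon counts and per-gene exon lengths (illustrative
# assumptions, not measurements from any of the listed genomes).
n_exons = rng.integers(1, 30, size=500)
mean_len = 100.0 + 1000.0 / n_exons          # mean exon length shrinks with count
net_len = np.array([rng.gamma(4.0, m / 4.0, k).sum()
                    for m, k in zip(mean_len, n_exons)])
avg_len = net_len / n_exons

# Pearson correlations analogous to the two relationships in the text.
r_net = np.corrcoef(n_exons, net_len)[0, 1]
r_avg = np.corrcoef(n_exons, avg_len)[0, 1]
print(f"exon count vs net exon length:  r = {r_net:.2f}")
print(f"exon count vs mean exon length: r = {r_avg:.2f}")
```

On real data the same two statistics would be computed per genome and per chromosome from the annotation files.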
Abstract: Webcam systems now function as the new privileged vantage points from which to view the city. This transformation of CCTV technology from surveillance to promotional tool is significant because its 'scopic regime' presents, back to the public, a new virtual 'site' that sits alongside its real-time counterpart. Significantly, this raw 'image' data can, in fact, be co-opted and processed so as to disrupt its original purpose. This paper will demonstrate this disruptive capacity through an architectural project. It will reveal how the adaptation of the webcam image offers a technical springboard by which to initiate alternative urban form-making decisions and subvert the disciplinary reliance on the 'flat' orthographic plan. In so doing, the paper will show how this 'digital material' exceeds the imagistic function of the image, shifting it from being a vehicle of signification to a site of affect.
Abstract: Coronary artery bypass grafts (CABG) are widely studied with respect to the hemodynamic conditions which play an important role in the presence of a restenosis. However, papers concerned with the constitutive modeling of CABG are lacking in the literature. The purpose of this study is to find a constitutive model for CABG tissue. A sample of the CABG obtained during an autopsy underwent an inflation–extension test. Displacements were recorded by CCD cameras and subsequently evaluated by digital image correlation. Pressure–radius and axial force–elongation data were used to fit the material model. The tissue was modeled as a one-layered composite reinforced by two families of helical fibers. The material is assumed to be locally orthotropic, nonlinear, incompressible and hyperelastic. Material parameters are estimated for two strain energy functions (SEF). The first is the classical exponential. The second SEF is logarithmic, which allows an interpretation in terms of limiting (finite) strain extensibility. The presented material parameters are estimated by optimization based on the radial and axial equilibrium equations in a thick-walled tube. Both material models fit the experimental data successfully. The exponential model fits the relationship between axial force and axial strain significantly better than the logarithmic one.
Abstract: In this paper, acoustic techniques are used to detect hidden insect infestations of date palm trees (Phoenix dactylifera L.). In particular, we use an acoustic instrument for early discovery of the presence of a destructive insect pest commonly known as the Red Date Palm Weevil (RDPW) and scientifically as Rhynchophorus ferrugineus (Olivier). This type of insect attacks date palm trees and causes irreversible damage at late stages; as a result, the infected trees must be destroyed. Therefore, early detection of its presence is a major part of controlling the spread and economic damage caused by this type of infestation. Furthermore, monitoring and early detection of the disease can assist in taking appropriate measures such as isolating or treating the infected trees. The acoustic system is evaluated in terms of its ability for early discovery of hidden pests inside the tested tree. When signal acquisition is completed for a number of date palms, a signal processing technique known as time-frequency analysis is evaluated in terms of providing an estimate that can be visually used to recognize the acoustic signature of the RDPW. The instrument was tested in the laboratory first; then it was used on suspected or infested trees in the field. The final results indicate that the acoustic monitoring approach, along with signal processing techniques, is very promising for the early detection of the presence of the larvae as well as the adult pest in date palms.
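The time-frequency analysis mentioned above can be sketched with a plain short-time Fourier transform. The signal below is synthetic: a low-frequency background plus a short 2 kHz burst standing in for feeding sounds; the sample rate, window sizes and burst parameters are all assumptions, not the instrument's actual settings.

```python
import numpy as np

def spectrogram(x, fs, win=256, hop=128):
    """Magnitude STFT: rows are frequency bins, columns are time frames."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win + 1, hop)]
    S = np.abs(np.fft.rfft(frames, axis=1)).T
    return S, np.fft.rfftfreq(win, 1.0 / fs)

# Synthetic stand-in for a probe recording (illustrative values only).
fs = 16000
t = np.arange(fs) / fs                       # one second of signal
x = 0.1 * np.sin(2 * np.pi * 50 * t)         # background hum
burst = (t > 0.4) & (t < 0.5)
x[burst] += np.sin(2 * np.pi * 2000 * t[burst])   # 2 kHz "feeding" burst

S, freqs = spectrogram(x, fs)
bin_idx, frame_idx = np.unravel_index(np.argmax(S), S.shape)
print(f"strongest component near {freqs[bin_idx]:.0f} Hz in frame {frame_idx}")
```

In practice the spectrogram would be inspected, or compared against a reference signature, to decide whether the burst pattern of the pest is present.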
Abstract: Statistics indicate that more than 1000 phishing attacks are launched every month. With 57 million people hit by the fraud so far in America alone, how do we combat phishing? This publication aims to discuss strategies in the war against phishing. This study examines and critiques the measures adopted at various levels to counter the crescendo of phishing attacks, as well as the new techniques being adopted for the same. An analysis of the measures taken up by various popular mail servers and popular browsers is carried out in this study. This work intends to increase the understanding and awareness of internet users across the globe and also discusses plausible countermeasures at the user's as well as the developer's end. This conceptual paper will contribute to future research on similar topics.
Abstract: Main Memory Database systems (MMDB) store their
data in main physical memory and provide very high-speed access.
Conventional database systems are optimized for the particular
characteristics of disk storage mechanisms. Memory resident
systems, on the other hand, use different optimizations to structure
and organize data, as well as to make it reliable.
This paper provides a brief overview of MMDBs and of one of the memory-resident systems, named FastDB, and compares the processing time of this system with that of a typical disk-resident database, based on the results of implementing a TPC benchmark environment on both.
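The memory-resident versus disk-resident contrast can be illustrated in miniature with SQLite (not FastDB, and far from a TPC benchmark): the same table and query are run against an in-memory database and an on-disk file. The row count and query are toy assumptions.

```python
import os
import sqlite3
import tempfile
import time

def build_and_query(uri):
    """Create a small table and time a simple aggregate query on it."""
    con = sqlite3.connect(uri)
    con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v REAL)")
    con.executemany("INSERT INTO t VALUES (?, ?)",
                    ((i, i * 0.5) for i in range(50_000)))
    con.commit()
    t0 = time.perf_counter()
    (total,) = con.execute("SELECT SUM(v) FROM t").fetchone()
    dt = time.perf_counter() - t0
    con.close()
    return total, dt

mem_total, mem_dt = build_and_query(":memory:")       # memory-resident
path = os.path.join(tempfile.mkdtemp(), "disk.db")
disk_total, disk_dt = build_and_query(path)           # disk-resident
print(f"in-memory: {mem_dt * 1e3:.1f} ms, on-disk: {disk_dt * 1e3:.1f} ms")
```

Note that once the operating system has cached the file's pages, the on-disk query may be nearly as fast; real MMDBs gain mainly from index and layout optimizations designed for memory, which is the point the abstract makes.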
Abstract: The purpose of this study was to understand the main sources of copper (Cu) accumulation in the target organs of tilapia (Oreochromis mossambicus) and to investigate how the organism mediates the process of Cu accumulation under prolonged exposure conditions. By measuring both dietary and waterborne Cu accumulation and total concentrations in tilapia with a biokinetic modeling approach, we were able to clarify the biokinetic coping mechanisms for long-term Cu accumulation. This study showed that water and food are both major sources of Cu for the muscle and liver of tilapia. This implies that controlling the Cu concentration in these two routes will be correlated with the Cu bioavailability for tilapia. We found that the exposure duration and the level of waterborne Cu drove the Cu accumulation in tilapia. The abilities for Cu biouptake and depuration in the organs of tilapia were actively mediated under prolonged exposure conditions. Generally, the uptake rate, depuration rate and net bioaccumulation ability in all selected organs decreased with increasing levels of waterborne Cu and extension of the exposure duration. Muscle tissues accounted for over 50% of the total accumulated Cu and played a key role in buffering the Cu burden in the initial period of exposure, while the liver played a more important role in the storage of Cu as exposure was extended. We concluded that the assumption of constant biokinetic rates could lead to incorrect predictions, overestimating the long-term Cu accumulation in ecotoxicological risk assessments.
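The biokinetic modeling approach referred to above is typically built on a first-order uptake/depuration balance. The sketch below integrates the generic textbook form with both waterborne and dietary routes; every rate constant and concentration is an illustrative assumption, not one of the paper's fitted values.

```python
import numpy as np

# Generic first-order biokinetic model: dC/dt = ku*Cw + AE*IR*Cf - ke*C
# (textbook form; all parameter values below are illustrative).
ku = 0.05   # waterborne uptake rate constant (L/g/d)
AE = 0.3    # dietary assimilation efficiency (-)
IR = 0.02   # ingestion rate (g/g/d)
ke = 0.01   # depuration rate constant (1/d)
Cw = 10.0   # waterborne Cu concentration (ug/L)
Cf = 50.0   # dietary Cu concentration (ug/g)

dt, days = 0.1, 365.0
steps = int(days / dt)
C = np.zeros(steps + 1)                  # tissue concentration (ug/g)
for i in range(steps):                   # forward-Euler integration
    dC = ku * Cw + AE * IR * Cf - ke * C[i]
    C[i + 1] = C[i] + dC * dt

steady = (ku * Cw + AE * IR * Cf) / ke   # analytical steady state
print(f"C(365 d) = {C[-1]:.1f} ug/g, steady state = {steady:.1f} ug/g")
```

The abstract's conclusion corresponds to letting ku and ke vary with exposure level and duration instead of holding them constant as this sketch does.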
Abstract: Dust storms are among the most costly and destructive events in many desert regions. They can cause massive damage both to natural environments and to human lives. This paper presents a preliminary study on dust storms as a major natural hazard in arid and semi-arid regions. As a case study, dust storm events that occurred in Zabol city, located in the Sistan Region of Iran, were analyzed to diagnose and predict dust storms. The identification and prediction of dust storm events could have a significant impact on damage reduction. Present models for this purpose are complicated and not appropriate for many areas with data-poor environments. The present study explores the Gamma test for identifying the inputs of an ANN model for dust storm prediction. Results indicate that more work must be carried out on dust storm identification and on distinguishing between the various dust storm types.
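The Gamma test used above for input selection can be sketched in its basic form: for each candidate input set, near-neighbour output half-variances (gamma) are regressed on near-neighbour input distances (delta), and the intercept estimates the variance of the noise that no smooth model of those inputs could explain. The data below are synthetic with a known noise variance; the function and sizes are assumptions.

```python
import numpy as np

def gamma_test(X, y, p=10):
    """Basic Gamma test: the intercept of the (delta_k, gamma_k)
    regression estimates the output noise variance."""
    n = len(y)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d2, np.inf)               # exclude self-matches
    order = np.argsort(d2, axis=1)[:, :p]      # p nearest neighbours
    idx = np.arange(n)
    deltas = np.array([d2[idx, order[:, k]].mean() for k in range(p)])
    gammas = np.array([((y[order[:, k]] - y) ** 2).mean() / 2
                       for k in range(p)])
    slope, intercept = np.polyfit(deltas, gammas, 1)
    return intercept

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (400, 2))               # two candidate inputs
true_noise_var = 0.05
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 \
    + rng.normal(0.0, true_noise_var ** 0.5, 400)
print(f"estimated noise variance: {gamma_test(X, y):.3f}")
```

For input selection, the test is repeated over subsets of candidate meteorological inputs, and the subset with the lowest intercept is fed to the ANN.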
Abstract: The more recent satellite projects/programs make extensive use of real-time embedded systems. 16-bit processors which meet the Mil-Std-1750 standard architecture have been used in on-board systems. Most space applications have been written in Ada. From a forward-looking point of view, 32-bit/64-bit processors are needed in the area of spacecraft computing, and therefore an effort is desirable in the study and survey of 64-bit architectures for space applications. This will also result in significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada).
There are several basic requirements for a special processor for this purpose. They include radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, higher memory and I/O bandwidth, predictability, a real-time operating system and manufacturability of such processors. Further considerations include the selection of FPGA devices, selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, timing analysis, etc.
This project deals with a brief study of 32- and 64-bit processors readily available in the market and with designing/fabricating a 64-bit RISC processor, named RISC MicroProcessor, with the added functionalities of an extended double-precision floating-point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using an open core (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus Verilog to develop FPGA-based prototypes quickly. Commercial tools such as Xilinx ISE are also used for synthesis when appropriate.
Abstract: In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), the capabilities of programming languages such as symbolic and intuitive programming, program portability and geometrical portfolio have special importance. They allow time to be saved and errors to be avoided during part programming, and permit code re-use. Our updated literature review indicates that the current state of the art presents gaps in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit re-use of the programs. Future work includes allowing the programmer to define their own functions in terms of EGCL, in contrast to the current status of having them as built-in library functions.
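The compile-to-elementary-G-code idea can be illustrated with a toy expander. The surface syntax below ($name variables and a non-nested `repeat N ... end` loop) is invented for illustration and is not EGCL's actual grammar; a real compiler would also handle if-then-else, while and geometrical functions.

```python
def compile_egcl(src, variables):
    """Toy expansion of an EGCL-like dialect into plain ISO G-code lines.
    Supports $name substitution and non-nested 'repeat N ... end' blocks;
    the surface syntax here is an assumption, not EGCL's real grammar."""
    out = []
    repeat_count, buffer = None, []
    for raw in src.strip().splitlines():
        line = raw.strip()
        for name, value in variables.items():     # $name substitution
            line = line.replace("$" + name, str(value))
        if line.startswith("repeat "):
            repeat_count, buffer = int(line.split()[1]), []
        elif line == "end":
            out.extend(buffer * repeat_count)     # unroll the loop
            repeat_count = None
        elif repeat_count is not None:
            buffer.append(line)
        else:
            out.append(line)
    return out

program = """
G00 X$safe_x Y0
repeat 3
G01 X10 F$feed
G01 X0 F$feed
end
M30
"""
gcode = compile_egcl(program, {"safe_x": -5, "feed": 200})
print("\n".join(gcode))
```

Unrolling loops and substituting variables at compile time is what keeps the emitted G-code "elementary", so any ISO-compliant controller can run it.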
Abstract: Crosstalk is the major limiting issue in very high bit-rate digital subscriber line (VDSL) systems in terms of bit-rate and service coverage. At the central office side, joint signal processing accompanied by appropriate power allocation enables complex multiuser processors to provide near-capacity rates. Unfortunately, complexity grows with the square of the number of lines within a binder; however, since only a few dominant crosstalkers contribute the main part of the crosstalk power, the canceller structure can be simplified, resulting in a much lower run-time complexity. In this paper, a multiuser power control scheme, namely iterative waterfilling, is combined with previously proposed partial crosstalk cancellation approaches to demonstrate the best performance achieved so far, which is verified by simulation results.
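Iterative waterfilling, the power control scheme named above, can be sketched in its textbook form: each user repeatedly runs single-user waterfilling treating the others' crosstalk as noise, until the allocations settle. The two-user, three-tone gains and noise levels below are invented toy numbers, not a VDSL binder model.

```python
import numpy as np

def waterfill(inv_gain, budget):
    """Classic waterfilling: p_k = max(0, mu - inv_gain[k]), with the
    water level mu found by bisection so that sum(p) == budget."""
    lo, hi = 0.0, budget + max(inv_gain)
    for _ in range(100):
        mu = (lo + hi) / 2
        if np.maximum(0.0, mu - inv_gain).sum() > budget:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.0, mu - inv_gain)

# Toy two-user, three-tone system (illustrative numbers only).
G = np.array([[1.0, 0.1], [0.05, 1.0]])    # G[i, j]: gain from user j into i
noise = np.array([[0.1, 0.2, 0.4],
                  [0.3, 0.1, 0.2]])        # per-tone background noise
P = np.zeros((2, 3))
for _ in range(50):                        # iterate until powers settle
    for u in range(2):
        other = 1 - u
        interference = noise[u] + G[u, other] * P[other]
        P[u] = waterfill(interference / G[u, u], 1.0)  # unit power budget
print(np.round(P, 3))
```

In the paper this power control is combined with partial cancellation of the few dominant crosstalkers, which this sketch does not model.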
Abstract: Effective evaluation of software development effort is an important aspect of successful project management. Based on a large database of 4106 completed projects, this study statistically examines the factors that influence development effort. The factors found to be significant for effort are project size, the average number of developers that worked on the project, the type of development, the development language, the development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Unsurprisingly, this study found that the use of CASE tools does not necessarily reduce development effort, which adds support to the claim that the benefit of such tools is subtle. As many of the current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for the prediction of effort which is both simple and more accurate than previous models.
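Parsimonious parametric effort models are commonly of the power-law form effort = a * size^b, fitted by linear regression in log-log space. The sketch below fits that generic form to synthetic project data with known exponent 1.1; the data and coefficients are assumptions standing in for the 4106-project database, not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic project data: effort grows as a power of size with
# lognormal scatter (all numbers are illustrative assumptions).
size = rng.uniform(10, 2000, 300)                       # e.g. function points
effort = 5.0 * size ** 1.1 * rng.lognormal(0.0, 0.3, 300)  # person-hours

# Fit log(effort) = b*log(size) + log(a) by ordinary least squares.
b, log_a = np.polyfit(np.log(size), np.log(effort), 1)
a = np.exp(log_a)
print(f"fitted model: effort = {a:.2f} * size^{b:.2f}")
```

Additional significant factors (language, platform, team size) would enter such a model as extra multiplicative terms, i.e. extra columns in the log-space regression.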
Abstract: Software reliability, defined as the probability of a
software system or application functioning without failure or errors
over a defined period of time, has been an important area of research
for over three decades. Several research efforts aimed at developing
models to improve reliability are currently underway. One of the
most popular approaches to software reliability adopted by some of
these research efforts involves the use of operational profiles to
predict how software applications will be used. Operational profiles
are a quantification of usage patterns for a software application. The
research presented in this paper investigates an innovative multi-agent framework for the automatic creation and management of
operational profiles for generic distributed systems after their release
into the market. The architecture of the proposed Operational Profile
MAS (Multi-Agent System) is presented along with detailed
descriptions of the various models arrived at following the analysis
and design phases of the proposed system. The operational profile in
this paper is extended to comprise seven different profiles. Further,
the criticality of operations is defined using a new composite metric in order to organize the testing process as well as to decrease the time
and cost involved in this process. A prototype implementation of the
proposed MAS is included as proof-of-concept and the framework is
considered as a step towards making distributed systems intelligent
and self-managing.
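At its core, an operational profile is a table of usage probabilities per operation. The minimal recorder below illustrates that core idea only; it is a toy sketch, not the paper's multi-agent system, and the operation names are invented.

```python
from collections import Counter

class OperationalProfile:
    """Minimal operational-profile recorder: counts how often each
    operation of a deployed system is invoked and reports usage
    probabilities (a toy sketch, not the paper's MAS design)."""

    def __init__(self):
        self.counts = Counter()

    def record(self, operation):
        self.counts[operation] += 1

    def profile(self):
        """Usage probability per operation."""
        total = sum(self.counts.values())
        return {op: n / total for op, n in self.counts.items()}

    def most_critical(self, k=1):
        """Most frequently used operations: prime targets for testing."""
        return [op for op, _ in self.counts.most_common(k)]

# Hypothetical usage stream from a deployed system.
op = OperationalProfile()
for name in ["login"] * 60 + ["search"] * 30 + ["export"] * 10:
    op.record(name)
print(op.profile())
```

In the paper's framework, agents would collect such counts across distributed nodes and combine them with other criticality signals into the composite metric.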
Abstract: Optimization is often a critical issue for most system
design problems. Evolutionary Algorithms are population-based,
stochastic search techniques, widely used as efficient global
optimizers. However, finding the optimal solution to complex high-dimensional, multimodal problems often requires highly computationally expensive function evaluations and hence is practically prohibitive. The Dynamic Approximate Fitness based
Hybrid EA (DAFHEA) model presented in our earlier work [14]
reduced computation time by controlled use of meta-models to
partially replace the actual function evaluation by approximate
function evaluation. However, the underlying assumption in
DAFHEA is that the training samples for the meta-model are
generated from a single uniform model. Situations like model
formation involving variable input dimensions and noisy data
certainly cannot be covered by this assumption. In this paper we
present an enhanced version of DAFHEA that incorporates a
multiple-model based learning approach for the SVM approximator.
DAFHEA-II (the enhanced version of the DAFHEA framework) also
overcomes the high computational expense involved with additional
clustering requirements of the original DAFHEA framework. The
proposed framework has been tested on several benchmark functions
and the empirical results illustrate the advantages of the proposed
technique.
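The general mechanism behind approximate-fitness EAs like DAFHEA can be sketched generically: a cheap surrogate ranks candidate offspring, and only the most promising few receive true (expensive) evaluations. The sketch uses a k-nearest-neighbour surrogate and the sphere function, not DAFHEA's SVM approximator or its multiple-model learning; all parameters are assumptions.

```python
import numpy as np

def expensive_f(x):
    """Stand-in for a costly objective (here just the sphere function)."""
    return float((x ** 2).sum())

def knn_surrogate(ax, ay, x, k=3):
    """Approximate fitness: mean of the k nearest archived true values."""
    d = np.linalg.norm(ax - x, axis=1)
    return ay[np.argsort(d)[:k]].mean()

rng = np.random.default_rng(4)
dim, pop, keep = 5, 20, 5
X = rng.uniform(-5, 5, (pop, dim))         # initial population (all truly evaluated)
y = np.array([expensive_f(p) for p in X])
true_evals = pop

for gen in range(30):
    parents = X[np.argsort(y)[:keep]]
    children = parents[rng.integers(0, keep, pop)] \
        + rng.normal(0.0, 0.5, (pop, dim))
    # Surrogate pre-screening: rank all children cheaply, then spend
    # true (expensive) evaluations only on the most promising few.
    approx = np.array([knn_surrogate(X, y, c) for c in children])
    promising = children[np.argsort(approx)[:keep]]
    y_new = np.array([expensive_f(c) for c in promising])
    true_evals += keep
    X = np.vstack([X, promising])          # archive grows with true data
    y = np.concatenate([y, y_new])

print(f"best f = {y.min():.3f} using {true_evals} true evaluations")
```

Without pre-screening, the same 30 generations would cost 600 true evaluations instead of 170; replacing the k-NN predictor with clustered or multiple SVM models is where DAFHEA and DAFHEA-II differ from this sketch.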