Abstract: This paper proposes a novel methodology for enabling
debugging and tracing of production web applications without
affecting their normal flow and functionality. This method of debugging
enables developers and maintenance engineers to replace a set of
existing resources, such as images, server-side scripts, and cascading
style sheets, with another set of resources per web session. The new
resources will only be active in the debug session and other sessions
will not be affected. This methodology will help developers in tracing
defects, especially those that appear only in production environments
and in exploring the behaviour of the system. A realization of the
proposed methodology has been implemented in Java.
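The per-session replacement described above can be sketched as a small resolver (the names and structure here are assumptions for illustration; the abstract does not detail its Java realization):

```python
# Minimal sketch of the per-session idea: a resolver returns a
# replacement resource only for the session flagged for debugging,
# so every other session sees the original resources untouched.

class SessionResourceResolver:
    def __init__(self):
        # session_id -> {original resource path: replacement path}
        self.overrides = {}

    def enable_debug(self, session_id, mapping):
        """Activate a set of replacement resources for one session only."""
        self.overrides[session_id] = dict(mapping)

    def disable_debug(self, session_id):
        """Restore normal behaviour for the session."""
        self.overrides.pop(session_id, None)

    def resolve(self, session_id, resource_path):
        """Replacement for the debug session, the original for all others."""
        return self.overrides.get(session_id, {}).get(resource_path, resource_path)

resolver = SessionResourceResolver()
resolver.enable_debug("sess-42", {"/js/app.js": "/debug/js/app-traced.js"})
print(resolver.resolve("sess-42", "/js/app.js"))  # /debug/js/app-traced.js
print(resolver.resolve("sess-99", "/js/app.js"))  # /js/app.js
```

In a real deployment such a resolver would sit in the request-handling path (for instance a servlet filter), but the session-keyed lookup above is the core of the mechanism.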
Abstract: The aim of this paper is the analysis and preservation of lime kilns, focusing on the structure, construction, and functionality of vertical shaft lime kilns of the Cap Corse in Corsica. Plans and sections of two lime kilns are presented in detail, providing an overall picture of this specific industrial heritage. The potential damage areas are identified by performing structural analysis of a lime kiln using the finite element method. A restoration and strengthening technique that satisfies the directions of the Charter of Venice, using post-tensioning tendons, is presented. Recommendations are given to preserve and promote these important historical structures by integrating them into a custom footpath.
Abstract: An Early Intervention Program (EIP) is required to
improve the overall development of children with Trisomy 21 (Down
syndrome). To help trainers and parents in the implementation
of EIP, a support system has been developed. The support system is
able to screen data automatically, store and analyze data, generate
individual EIP (curriculum) with optimal training duration and to
generate training automatically. The system consists of hardware and
software, where the software has been implemented using the Java
language on Fedora Linux. The software has been tested to ensure its
functionality and reliability. The prototype has also been tested in
Down syndrome centers. Test results show that the system can reliably
generate an individual curriculum which includes
the training program to improve the motor, cognitive, and combination
abilities of Down syndrome children under 6 years.
Abstract: Resource Discovery in Grids is critical for efficient
resource allocation and management. Heterogeneous nature and
dynamic availability of resources make resource discovery a
challenging task. As the number of nodes grows from tens to
thousands, scalability becomes essential. Peer-to-Peer (P2P)
techniques, on the other hand, provide effective implementation of
scalable services and applications. In this paper we propose a model
for resource discovery in the Condor middleware using the four-axis
framework defined in the P2P approach. The proposed model enhances
Condor to incorporate the functionality of a P2P system, thus aiming
to make Condor more scalable, flexible, reliable, and robust.
Abstract: Evolvable hardware (EHW) is a developing field that
applies evolutionary algorithms (EAs) to automatically design circuits,
antennas, robot controllers, etc. A great deal of research has been done in this
area, and several different EAs have been introduced to tackle
numerous problems, such as scalability and evolvability. However, every
time a specific EA is chosen for solving a particular task, all its
components, such as population size, initialization, selection
mechanism, mutation rate, and genetic operators, should be selected
in order to achieve the best results. Over the last three decades, the
selection of the right parameters for the EA's components for solving
different "test problems" has been investigated. In this paper the
behaviour of the mutation rate for designing logic circuits, which has not
been studied before, is deeply analyzed. The mutation rate in an
EHW system modifies the number of inputs of each logic gate, its
functionality (for example, from AND to NOR), and the connectivity
between logic gates. The behaviour of the mutation has been
analyzed based on the number of generations, genotype redundancy
and the number of logic gates of the evolved circuits. The experimental
results characterize the behaviour of the mutation rate during
evolution for the design and optimization of simple logic circuits,
and suggest the best mutation rate to use for
designing combinational logic circuits. The research presented is
particularly important for those who would like to implement a
dynamic mutation rate inside an evolutionary algorithm for evolving
digital circuits. Research on the mutation rate over the last 40
years is also summarized.
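The gate-level mutation the abstract describes can be sketched as follows (the genotype encoding and the primitive gate set here are illustrative assumptions, not the paper's exact representation):

```python
import random

# Hedged sketch: a feed-forward genotype of logic gates, where the
# mutation rate controls how often a gate's function, or either of its
# input connections, is altered.

FUNCTIONS = ["AND", "OR", "NAND", "NOR", "XOR"]

def mutate(genotype, n_primary_inputs, rate, rng):
    """Each gate attribute mutates independently with probability `rate`."""
    child = []
    for i, (func, a, b) in enumerate(genotype):
        max_src = n_primary_inputs + i   # feed-forward: earlier sources only
        if rng.random() < rate:
            func = rng.choice(FUNCTIONS)
        if rng.random() < rate:
            a = rng.randrange(max_src)
        if rng.random() < rate:
            b = rng.randrange(max_src)
        child.append((func, a, b))
    return child

rng = random.Random(1)
parent = [("AND", 0, 1), ("OR", 1, 2), ("XOR", 2, 3)]
child = mutate(parent, 2, 0.3, rng)
```

A dynamic mutation rate, as mentioned above, would simply vary the `rate` argument across generations.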
Abstract: Wi-Fi zones provided by mobile carriers have expanded,
and the use of wireless access points at home has grown with the rise
in wireless Internet usage driven by smartphones. This
paper surveys the status of wireless local area networks, the security threats to
WLANs, and the functionality of major wireless access points in Korea. We
propose security countermeasures covering the life cycle of an access
point, from manufacturing through installation and use to disposal.
Access points should be shipped with a secure configuration,
as this is more cost-effective than securing them at the installation
stage or later in the access point's life cycle.
Abstract: The goal of this paper is to segment countries
based on the value of exports from Iran over the 14 years ending in 2005. To measure the dissimilarity among the export baskets of different countries, we define the Dissimilarity Export Basket (DEB) function and
use this distance function in the K-means algorithm. The DEB function
is defined based on the concepts of association rules and the
value of export commodity groups. In this paper, a clustering quality
function and the clusters' intraclass inertia are defined to, respectively,
calculate the optimum number of clusters and compare the
functionality of DEB versus the Euclidean distance. We have also studied
the effects of the importance weight in the DEB function on improving
clustering quality. Lastly, once segmentation is completed, a
designated RFM model is used to analyze the relative profitability of
each cluster.
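The mechanism of plugging a custom dissimilarity into K-means can be sketched as below; since the abstract does not give the DEB formula, a toy weighted distance stands in for it (an assumption, not the authors' definition):

```python
import random

# Lloyd-style K-means where the assignment step uses a pluggable
# dissimilarity and the update step averages the baskets per cluster.

def dissim(x, y, w):
    """Illustrative weighted dissimilarity between two export baskets."""
    return sum(wi * abs(xi - yi) for wi, xi, yi in zip(w, x, y))

def kmeans(baskets, k, w, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(baskets, k)
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for b in baskets:                       # assignment step
            j = min(range(k), key=lambda c: dissim(b, centers[c], w))
            clusters[j].append(b)
        centers = [                             # update step (mean basket)
            [sum(col) / len(cl) for col in zip(*cl)] if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
    return centers, clusters

baskets = [[1.0, 0.0], [1.1, 0.0], [0.0, 1.0], [0.0, 1.2]]
centers, clusters = kmeans(baskets, 2, [1.0, 1.0])
```

Raising a coordinate's importance weight in `w` makes differences in that commodity group dominate the clustering, which is the effect the paper studies.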
Abstract: A new OTA-based logarithmic-control variable gain
current amplifier (LCCA) is presented. It consists of two Operational
Transconductance Amplifiers (OTAs) and two PMOS transistors
biased in the weak inversion region. The circuit operates from a 0.6V DC
power supply and consumes 0.6 μW. The linear-in-dB controllable
output range is 43 dB with a maximum error of less than 0.5 dB. The
functionality of the proposed design was confirmed using HSPICE in
0.35μm CMOS process technology.
Abstract: A computational platform is presented in this
contribution. It has been designed as a virtual laboratory to be used
for exploring optimization algorithms in biological problems. This
platform is built on a blackboard-based agent architecture. As a test
case, the version of the platform presented here is devoted to the
study of protein folding, initially with a bead-like description of the
chain and with the widely used model of hydrophobic and polar
residues (HP model). Some details of the platform design are
presented along with its capabilities, and some explorations of the
protein folding problem in different types of discrete space are also
reviewed. The capability of the platform to incorporate specific
tools for the structural analysis of the runs, in order to understand
and improve the optimization process, is also shown.
Accordingly, the results obtained demonstrate that the ensemble of
computational tools into a single platform is worthwhile by itself,
since experiments developed on it can be designed to fulfill different
levels of information in a self-consistent fashion. It is currently
being explored how an experimental design can be used to create a
computational agent to be included within the platform. The inclusion
of such designed agents, or software pieces, helps the platform better
accomplish its tasks. As the number of agents increases, the new
version of the virtual laboratory gains in robustness and functionality.
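The HP model mentioned above has a very compact formulation, which a minimal concrete instance makes clear: the chain is a string over {H, P} folded as a self-avoiding walk on a lattice, and the energy counts H-H contacts.

```python
# Energy in the 2D square-lattice HP model: -1 for every pair of H
# residues that are lattice neighbours but not consecutive in the chain.

def hp_energy(sequence, coords):
    """coords[i] is the (x, y) lattice position of residue i."""
    assert len(set(coords)) == len(coords), "fold must be self-avoiding"
    energy = 0
    for i in range(len(sequence)):
        for j in range(i + 2, len(sequence)):   # skip chain neighbours
            if sequence[i] == sequence[j] == "H":
                (xi, yi), (xj, yj) = coords[i], coords[j]
                if abs(xi - xj) + abs(yi - yj) == 1:
                    energy -= 1
    return energy

# Folding HPPH into a square brings its two H ends into contact:
print(hp_energy("HPPH", [(0, 0), (1, 0), (1, 1), (0, 1)]))  # -1
```

An optimization algorithm on this model searches the space of self-avoiding walks for the fold of minimum energy.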
Abstract: Topology optimization is defined as the method of
determining the optimal distribution of material for an assumed design
space given the functionality, loads, and boundary conditions [1].
Topology optimization can be used to optimize shape for the
purposes of weight reduction, minimizing material requirements or
selecting cost effective materials [2]. Topology optimization has been
implemented through the use of finite element methods for the
analysis, and optimization techniques based on the method of moving
asymptotes, genetic algorithms, optimality criteria method, level sets
and topological derivatives. A case study of a typical fuselage design
is considered in this paper to explain the benefits of topology
optimization in the design cycle. A cylindrical shell is assumed as
the design space, and standard aerospace payloads are applied to the
fuselage, with the wing attachments as constraints. Topology
optimization is then performed using Finite Element (FE) based software. This
optimization results in the structural concept design which satisfies
all the design constraints using minimum material.
Abstract: This paper presents the functionality of negotiation agents
in value-based design decisions. The functionality is based on the
characteristics of the system and the goal specification. A Prometheus
Design Tool model was used for developing the system. Group
functionality will be the attribute for negotiation agents, which
comprises a coordinator agent and a decision-maker agent. The results
of testing the system on a building system selection in a value-based
decision environment are also presented.
Abstract: There exists an injective, information-preserving function
that maps a semantic network (i.e. a directed labeled network)
to a directed network (i.e. a directed unlabeled network). The edge
label in the semantic network is represented as a topological feature
of the directed network. Also, there exists an injective function that
maps a directed network to an undirected network (i.e. an undirected
unlabeled network). The edge directionality in the directed network
is represented as a topological feature of the undirected network.
Through function composition, there exists an injective function that
maps a semantic network to an undirected network. Thus, aside from
space constraints, the semantic network construct does not have any
modeling functionality that is not possible with either a directed
or undirected network representation. Two proofs of this idea will
be presented. The first is a proof of the aforementioned function
composition concept. The second is a simpler proof involving an
undirected binary encoding of a semantic network.
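The flavour of the first mapping can be illustrated with a toy gadget encoding. This is an assumed construction for illustration, not the paper's proof: each labeled edge is routed through a per-label hub so the label survives as topology, and the tuple node identifiers are readable stand-ins for the anonymous nodes a strict unlabeled encoding would use.

```python
# Toy illustration (assumed construction): each labeled edge
# (u, label, v) becomes a small directed gadget through a shared
# per-label hub node.

def to_directed(semantic_edges):
    """semantic_edges: iterable of (u, label, v) triples."""
    edges = set()
    for u, label, v in semantic_edges:
        mid = ("edge", u, label, v)   # one fresh gadget node per labeled edge
        hub = ("label", label)        # one shared hub node per distinct label
        edges.add((u, mid))           # u -> gadget
        edges.add((mid, v))           # gadget -> v
        edges.add((mid, hub))         # gadget -> hub: the label as topology
    return edges

g = to_directed([("alice", "knows", "bob"), ("alice", "likes", "bob")])
```

A genuinely injective encoding must additionally make hub nodes structurally distinguishable from ordinary nodes (for example by attaching a distinctive motif); the paper's proofs handle such details.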
Abstract: Cloud Computing is an approach that provides computation and storage services on demand to clients over the network, independent of device and location. In the last few years, cloud computing has become a trend in information technology, with many companies transferring their business processes and applications to the cloud. Cloud computing with service-oriented architecture has contributed to the rapid development of Geographic Information Systems (GIS). The Open Geospatial Consortium (OGC), through its standards, provides the interfaces for hosted spatial data and GIS functionality to integrate GIS applications. Furthermore, with their enormous processing power, clouds provide an efficient environment for data-intensive applications, which can be performed efficiently, with higher precision and greater reliability. This paper presents our work on geospatial data services within the cloud computing environment and its technology. A cloud computing environment for geographic information systems, with its strengths and weaknesses, is introduced. The OGC standards that address our application's interoperability are highlighted. Finally, we outline our system architecture with utilities for requesting and invoking our developed data-intensive applications as a web service.
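The standardized OGC interfaces referred to above can be illustrated with a WMS 1.3.0 GetMap request; the endpoint and layer name below are hypothetical, while the parameter names follow the WMS 1.3.0 specification.

```python
from urllib.parse import urlencode

# Assembling a WMS 1.3.0 GetMap request as a query string.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "roads",                  # hypothetical layer name
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "41.0,20.4,42.4,23.0",      # lat/lon axis order in WMS 1.3.0
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
url = "https://example.org/geoserver/wms?" + urlencode(params)
```

Because every conforming server accepts the same request, a GIS client built in the cloud can swap spatial data providers without code changes, which is the interoperability benefit the abstract highlights.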
Abstract: Web services are pieces of software that can be invoked via a standardized protocol. They can be combined via formalized taskflow languages. The Open Knowledge system is a fully distributed system using P2P technology that allows users to publish these taskflows, and programmers to register their web services, or publish implementations of them, for the roles described in these workflows. Besides this, the system offers the functionality to select a peer that can coordinate such an interaction model and inform web services when it is their 'turn'. In this paper we describe the architecture and implementation of the Open Knowledge Kernel, which provides the core functionality of the Open Knowledge system.
Abstract: SeqWord Gene Island Sniffer, a new program for
the identification of mobile genetic elements in sequences of bacterial chromosomes, is presented. This program is based on the
analysis of oligonucleotide usage variations in DNA sequences. 3,518 mobile genetic elements were identified in 637 bacterial
genomes and further analyzed by sequence similarity and the
functionality of encoded proteins. The results of this study are stored in an open database (http://anjie.bi.up.ac.za/geidb/geidbhome.php).
The developed computer program and the database provide information valuable for further investigation of the
distribution of mobile genetic elements and virulence factors among bacteria. The program is available for download at www.bi.up.ac.za/SeqWord/sniffer/index.html.
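The style of analysis the abstract describes, comparing local oligonucleotide usage against the genome-wide pattern, can be sketched as follows; the distance used here (sum of absolute frequency differences) is an illustrative stand-in, not SeqWord's actual statistic.

```python
from collections import Counter

# Windows whose k-mer usage deviates strongly from the whole sequence
# are candidate mobile genetic elements.

def oligo_freqs(seq, k=4):
    """Relative frequencies of all overlapping k-mers in `seq`."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {word: c / total for word, c in counts.items()}

def usage_deviation(window, genome, k=4):
    """How far the window's k-mer usage is from the genome-wide usage."""
    fw, fg = oligo_freqs(window, k), oligo_freqs(genome, k)
    return sum(abs(fw.get(w, 0.0) - fg.get(w, 0.0)) for w in set(fw) | set(fg))

genome = "ATGC" * 200        # toy "chromosome"
native = genome[:80]         # window with typical usage
foreign = "GGGG" * 20        # window with alien usage
```

Here `usage_deviation(foreign, genome)` far exceeds `usage_deviation(native, genome)`, which is the kind of signal a genomic-island detector thresholds on.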
Abstract: Testing is an activity required both in the
development and maintenance phases of the software development life cycle,
in which Integration Testing (IT) is an important activity. Integration
testing is based on the specification and functionality of the software
and can thus be called a black-box testing technique. The purpose of
integration testing is to test the integration between software
components. In function or system testing, the concern is with the overall
behavior and whether the software meets its functional specifications
or performance characteristics or how well the software and
hardware work together. This explains the importance and necessity
of IT, in which the emphasis is on the interactions between modules and
their interfaces. Software errors should be discovered early during
IT to reduce the costs of correction. This paper introduces a new type
of integration error, presenting an overview of Integration Testing
techniques with comparison of each technique and also identifying
which technique detects what type of error.
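An illustrative (invented here, not from the paper) example of the class of fault integration testing targets: both modules pass their own unit tests, but the caller passes kilometres where the callee expects metres, so the defect appears only at the interface.

```python
def fuel_needed_litres(distance_m):
    """Callee: expects distance in metres; burns 0.08 L per km."""
    return distance_m / 1000 * 0.08

def plan_trip(distance_km):
    """Caller bug: forwards kilometres without converting to metres."""
    return fuel_needed_litres(distance_km)   # should be distance_km * 1000

# Unit view: fuel_needed_litres(250_000) correctly yields 20 litres.
# Integration view: plan_trip(250) yields 0.02 litres, 1000x too little.
```

No unit test of either function alone can expose this; only a test that exercises the pair together, which is precisely the role of integration testing.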
Abstract: In this work, I present a review on Sparse Distributed
Memory for Small Cues (SDMSCue), a variant of Sparse Distributed
Memory (SDM) that is capable of handling small cues. I then conduct
and show some cognitive experiments on SDMSCue to test its
cognitive soundness compared to SDM. Small cues refer to input
cues that are presented to memory for reading associations but have
many missing parts or fields. The original SDM failed to
handle such a problem; SDMSCue handles and overcomes this
pitfall. The main idea in SDMSCue is the repeated projection of the
semantic space onto smaller subspaces that are selected based on the
input cue length and pattern. This process allows for Read/Write
operations using an input cue that is missing a large portion.
SDMSCue is augmented with the use of genetic algorithms for
memory allocation and initialization. I claim that SDM functionality
is a subset of SDMSCue functionality.
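The projection idea alone (not the full SDMSCue read/write machinery or its genetic-algorithm initialization) can be sketched as follows, with the encoding and matching rule as illustrative assumptions:

```python
# A small cue leaves most fields unknown, so matching is done after
# projecting every stored pattern onto exactly the known positions
# (None marks a missing field).

def project_match(cue, memory):
    """Return the stored pattern closest to the cue on its known fields."""
    known = [i for i, v in enumerate(cue) if v is not None]
    def subspace_dist(pattern):
        return sum(pattern[i] != cue[i] for i in known)
    return min(memory, key=subspace_dist)

memory = [[1, 0, 1, 1, 0], [0, 0, 1, 0, 1], [1, 1, 1, 0, 0]]
cue = [1, None, None, 1, None]     # only 2 of 5 fields present
print(project_match(cue, memory))  # [1, 0, 1, 1, 0]
```

A full cue (no `None` fields) reduces this to ordinary whole-vector matching, which is the sense in which SDM behaviour is a special case.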
Abstract: Computer technology and the Internet have brought about
a breakthrough in data communication. This has
opened a whole new way of implementing steganography to ensure
secure data transfer. Steganography is the fine art of hiding the
information. Hiding the message in the carrier file enables the
deniability of the existence of any message at all. This paper designs
a stego machine to develop a steganographic application to hide data
containing text in a computer video file and to retrieve the hidden
information. This is achieved by embedding a text file in a video
file, using the Least Significant Bit (LSB) modification method, in
such a way that the video does not lose its functionality. This method
applies imperceptible modifications. The proposed method strives
for high security, relying on an eavesdropper's inability to detect the
hidden information.
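The LSB mechanism can be sketched on raw bytes; a bytearray stands in for decoded frame data here (an assumption — a real video pipeline must also avoid recompression destroying the hidden bits):

```python
# Each message bit replaces the least significant bit of one carrier
# byte, so no byte changes by more than 1 and the carrier stays
# functionally intact.

def embed(carrier, message):
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    assert len(bits) <= len(carrier), "carrier too small for message"
    stego = bytearray(carrier)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # overwrite the LSB only
    return stego

def extract(carrier, n_chars):
    bits = [carrier[i] & 1 for i in range(n_chars * 8)]
    return bytes(
        sum(bit << (7 - j) for j, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )

frames = bytearray(range(256)) * 4   # stand-in for raw frame bytes
stego = embed(frames, b"hi")
print(extract(stego, 2))             # b'hi'
```

Because each byte changes by at most one quantization level, the modification is imperceptible in the rendered video, which is the property the abstract relies on.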
Abstract: Delivering streaming video over wireless is an
important component of many interactive multimedia applications
running on personal wireless handset devices. Such personal devices
have to be inexpensive, compact, and lightweight. But wireless
channels have a high channel bit error rate and limited bandwidth.
Delay variation of packets due to network congestion and the high bit
error rate greatly degrade the quality of video at the handheld
device. Mobile access to multimedia content therefore requires
video transcoding functionality at the edge of the mobile network for
interworking with heterogeneous networks and services. Thus,
to guarantee the quality of service (QoS) delivered to the mobile user, a
robust and efficient transcoding scheme should be deployed in the
mobile multimedia transport network. This paper
examines the challenges and limitations that video transcoding
schemes face in mobile multimedia transport networks. A mobile
and wireless video transcoding scheme based on handheld resources,
network conditions, and content is then proposed to provide high-QoS
applications. Exceptional performance is demonstrated in the
experimental results. These experiments were designed to verify and
prove the robustness of the proposed approach. Extensive
experiments have been conducted, and results for various video
clips with different bit rates and frame rates are provided.
Abstract: Existing process models for the development of mechatronic systems provide for largely parallel work during detailed development. This parallel work also proceeds largely independently in the various disciplines involved. A new process model further develops existing models for use in the development of adaptronic systems. This approach is based on an intermediate integration and an abstract modeling of the adaptronic system. Based on this system model, a simulation of the global system behavior, due to external and internal factors or forces, is developed. For the intermediate integration, a special data management system is used. According to the presented approach, this data management system has a number of functions that are not part of the "normal" PDM functionality. Therefore, a concept for a new data management system for the development of adaptronic systems is presented in this paper. This concept divides the functions into six layers. In the first layer, a system model is created which divides the adaptronic system based on its components and the various technical disciplines. Moreover, the parameters and properties of the system are modeled and linked together with the requirements and the system model. The modeled parameters and properties form a network, which is analyzed in the second layer. From this analysis, the adjustments to individual components necessary for targeted manipulation of the system behavior can be determined. The third layer contains an automatic abstract simulation of the system behavior. This simulation is a precursor to the network analysis and serves as a filter. Through the network analysis and simulation, changes to system components are examined and the necessary adjustments to other components are calculated.
The other layers of the concept cover the automatic calculation of system reliability, the "normal" PDM functionality, and the integration of discipline-specific data into the system model. A prototypical implementation of such a data management system, extended with automatic system development, is being realized using the data management system ENOVIA SmarTeam V5 and the simulation system MATLAB.