Abstract: Reverse engineering of full-genomic interaction networks based on compendia of expression data has been successfully applied to a number of model organisms. This study adapts these approaches to an important non-model organism: the major human fungal pathogen Candida albicans. During the infection process, the pathogen can adapt to a wide range of environmental niches and reversibly changes its growth form. Given the importance of these processes, it is essential to know how they are regulated. This study presents a reverse engineering strategy able to infer full-genomic interaction networks for C. albicans based on linear regression with a sparseness criterion (LASSO). To overcome the limited amount of expression data and the small number of known interactions, we utilize different prior-knowledge sources to guide the network inference towards a knowledge-driven solution. Since no database of known interactions exists for C. albicans, we use a text-mining system that identifies known regulatory interactions in full-text research papers. By comparing against these known regulatory interactions, we find optimal values for the global modelling parameters weighting the influence of the sparseness criterion and the prior knowledge. Furthermore, we show that soft integration of prior knowledge further improves performance. Finally, we compare the performance of our approach to state-of-the-art network inference approaches.
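To illustrate this kind of prior-weighted sparse regression, the following minimal sketch infers regulators for each gene with a per-gene LASSO in which known interactions receive a reduced penalty (an adaptive-LASSO reformulation). The function, variable names, and the parameters lam and kappa are assumptions for illustration only, not the study's actual implementation.

    # Minimal sketch (not the study's code): per-gene LASSO network inference with
    # soft prior-knowledge integration via per-feature penalty rescaling.
    # Assumed inputs: expr (samples x genes expression matrix), prior (genes x genes
    # 0/1 matrix of known regulator->target interactions); lam and kappa are the
    # global parameters weighting sparseness and prior knowledge.
    import numpy as np
    from sklearn.linear_model import Lasso

    def infer_network(expr, prior, lam=0.1, kappa=0.5):
        n_samples, n_genes = expr.shape
        coeffs = np.zeros((n_genes, n_genes))            # regulator -> target weights
        for target in range(n_genes):
            y = expr[:, target]
            X = np.delete(expr, target, axis=1)          # candidate regulators
            regulators = np.delete(np.arange(n_genes), target)
            # soft prior integration: known interactions get a smaller penalty (kappa < 1)
            w = np.where(prior[regulators, target] > 0, kappa, 1.0)
            model = Lasso(alpha=lam, max_iter=10000)
            model.fit(X / w, y)                          # column rescaling = per-feature penalties
            coeffs[regulators, target] = model.coef_ / w
        return coeffs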
Abstract: One important problem in today's organizations is the existence of non-integrated information systems and the inconsistency and lack of suitable correlation between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, we need to extract the data structures from the legacy systems and integrate them with the new-technology systems. In legacy systems, huge amounts of data are stored in legacy databases. These require particular attention, since they need more effort to be normalized, reformatted, and moved to modern database environments. Designing the new integrated (global) database architecture and applying reverse engineering require data normalization. This paper proposes the use of database reverse engineering to integrate legacy and modern databases in organizations. The suggested approach consists of methods and techniques for generating the data transformation rules needed for data structure normalization.
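As a generic illustration of what such a transformation rule can look like, the sketch below splits a denormalized legacy record into two normalized tables; the table and field names are hypothetical and not taken from the paper.

    # Minimal illustrative sketch (not the paper's method): a transformation rule
    # that moves repeated customer attributes out of a flat legacy order table
    # into a separate table keyed by customer_id.
    legacy_rows = [
        {"order_id": 1, "customer_id": 10, "customer_name": "Acme", "item": "bolt"},
        {"order_id": 2, "customer_id": 10, "customer_name": "Acme", "item": "nut"},
    ]

    customers, orders = {}, []
    for row in legacy_rows:
        # rule: repeated customer attributes go into their own normalized table
        customers[row["customer_id"]] = {"customer_name": row["customer_name"]}
        orders.append({"order_id": row["order_id"],
                       "customer_id": row["customer_id"],
                       "item": row["item"]})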
Abstract: Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused its attention on Web application design, development, analysis, and testing by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. Traditional static source code analysis may be very difficult because of the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation-based source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.
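As a generic illustration of a mutational technique of this kind, the sketch below generates source-level mutants by swapping relational operators; it is not the WAAT project's code, and the operator set is an assumption.

    # Minimal sketch (hypothetical mutation operator, not the WAAT implementation):
    # produce one mutant per mutated operator occurrence in a server-side script,
    # so that executing each mutant can expose additional behavior for model building.
    import re

    MUTATIONS = {"==": "!=", "!=": "=="}

    def mutate(source):
        # yield one mutant string per occurrence of a mutable operator
        for op, repl in MUTATIONS.items():
            for match in re.finditer(re.escape(op), source):
                yield source[:match.start()] + repl + source[match.end():]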
Abstract: Three-dimensional reconstruction of small objects has been one of the most challenging problems over the last decade. Computer graphics researchers and photography professionals have been working on improving 3D reconstruction algorithms to fit the high demands of various real-life applications. Medical sciences, the animation industry, virtual reality, pattern recognition, the tourism industry, and reverse engineering are common fields where 3D reconstruction of objects plays a vital role. Lack of accuracy and high computational cost are the major challenges facing successful 3D reconstruction. Fringe projection has emerged as a promising 3D reconstruction approach that combines low computational cost with both high precision and high resolution. It employs digital projection, structured light systems, and phase analysis of fringe images. Research studies have shown that the system has acceptable performance and, moreover, is insensitive to ambient light. This paper presents an overview of fringe projection approaches. It also presents an experimental study and implementation of a simple fringe projection system. We tested our system using two objects with different materials and levels of detail. Experimental results have shown that, while our system is simple, it produces acceptable results.
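For reference, the following minimal sketch shows the core phase analysis step under an assumed standard four-step phase-shifting scheme (the abstract does not state which scheme the implemented system uses).

    # Minimal sketch (assumed four-step phase shifting, not necessarily the paper's
    # setup): per-pixel wrapped phase from four images of the object under fringes
    # shifted by 0, 90, 180, and 270 degrees. Phase unwrapping and phase-to-height
    # calibration are still required before 3D coordinates are obtained.
    import numpy as np

    def wrapped_phase(I1, I2, I3, I4):
        return np.arctan2(I4 - I2, I1 - I3)   # wrapped phase in (-pi, pi]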
Abstract: The last is regarded as the critical foundation of shoe design and development. A computer-aided methodology for various last form designs is proposed in this study. Reverse engineering is mainly applied in the process of scanning the last form. Then, with minimum-energy revision of surface continuity, the last surface is reconstructed from the feature curves of the scanned last. Once the surface reconstruction of the last is completed, the weighted arithmetic mean method is applied to compute the shape morphing of the last's control mesh, so that 3D last forms of different sizes are generated from the original form while its functional features are retained. Finally, the result of this study is applied in a 3D last reconstruction system, and the practicability of the proposed methodology is verified through case studies.
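The sketch below shows one plausible reading of weighted-arithmetic-mean morphing on control meshes; the function name, inputs, and example weights are assumptions for illustration, not the study's implementation.

    # Minimal sketch (assumed interpretation): blend control meshes of reference
    # lasts by a weighted arithmetic mean of corresponding control vertices.
    import numpy as np

    def morph_control_mesh(meshes, weights):
        # meshes: list of (n_vertices, 3) arrays sharing the same connectivity
        # weights: non-negative weights, one per reference mesh, normalized to sum to 1
        weights = np.asarray(weights, dtype=float)
        weights /= weights.sum()
        return sum(w * m for w, m in zip(weights, meshes))

    # e.g. an intermediate size between two reference lasts (hypothetical data):
    # new_mesh = morph_control_mesh([mesh_size_24, mesh_size_26], [0.5, 0.5])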
Abstract: This project describes the modeling of various mechatronic architectures, specifically robot morphologies, in an educational environment. Each structure, developed by pre-school, primary, and secondary students, was created using the concept of reverse engineering in a constructivist environment and was later integrated into educational software that promotes the teaching of educational robotics in a virtual and economical environment.
Abstract: The most reliable and accurate description of the actual behavior of a software system is its source code. However, not all questions about the system can be answered directly by resorting to this repository of information. What the reverse engineering methodology aims at is the extraction of abstract, goal-oriented "views" of the system, able to summarize relevant properties of the computation performed by the program. While concentrating on reverse engineering, we modeled the C++ files by designing a translator.
Abstract: Obfuscation is a low-cost software protection methodology for avoiding reverse engineering and re-engineering of applications. Source code obfuscation aims at obscuring the source code to hide its functionality. This paper proposes an array data transformation to obfuscate source code that uses arrays. Applications using the proposed data structures force the programmer to obscure the logic manually. This makes the obscured code hard to reverse engineer while also protecting its functionality.
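As a generic example of an array data transformation used for obfuscation (the paper's specific transformation is not given in the abstract), the sketch below permutes the physical storage order of array elements so that the logical access pattern is hidden while program behavior is preserved.

    # Minimal sketch (generic index-permutation transformation, not the paper's
    # technique): the original access a[i] becomes stored[perm[i]].
    import random

    def obfuscate_array(values, seed=42):
        rng = random.Random(seed)
        perm = list(range(len(values)))
        rng.shuffle(perm)
        stored = [None] * len(values)
        for i, v in enumerate(values):
            stored[perm[i]] = v        # element i is physically stored at position perm[i]
        return stored, perm

    def access(stored, perm, i):
        return stored[perm[i]]         # semantics of the original a[i] are preserved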
Abstract: Appropriate description of business processes through standard notations has become one of the most important assets for organizations. Organizations must therefore deal with quality faults in business process models, such as a lack of understandability and modifiability. These quality faults may be exacerbated when business process models are mined by reverse engineering, e.g., from existing information systems that support those business processes. Hence, business process refactoring is often used, which changes the internal structure of business processes while preserving their external behavior. This paper aims to choose the most appropriate set of refactoring operators through quality assessment concerning understandability and modifiability. These quality features are assessed through well-proven measures proposed in the literature. Additionally, a set of measure thresholds is heuristically established for applying the most promising refactoring operators, i.e., those that achieve the highest quality improvement according to the selected measures in each case.
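The sketch below illustrates the general idea of threshold-driven operator selection; the measure names, thresholds, and operator mapping are hypothetical and do not reproduce the paper's actual measures or values.

    # Minimal illustrative sketch (hypothetical measures and thresholds): select the
    # refactoring operators whose associated quality measure exceeds its threshold
    # for a given business process model.
    THRESHOLDS = {"num_nodes": 50, "gateway_mismatch": 3, "nesting_depth": 4}

    OPERATORS = {
        "num_nodes": "extract_subprocess",
        "gateway_mismatch": "balance_gateways",
        "nesting_depth": "flatten_nesting",
    }

    def select_refactorings(measures):
        # measures: dict mapping measure name -> value computed on the model
        return [OPERATORS[m] for m, t in THRESHOLDS.items() if measures.get(m, 0) > t]

    # e.g. select_refactorings({"num_nodes": 72, "nesting_depth": 2})
    # -> ["extract_subprocess"]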