Abstract: Colored Petri Nets (CPNs) are a well-known class of
high-level Petri nets. Rewriting logic, with its sound and complete
semantics, is a powerful logic for the description and verification of
non-deterministic concurrent systems. Recently, CPN semantics have
been defined in terms of rewriting logic, allowing models to be built
by formal reasoning. In this paper, we propose an automatic
translation of CPNs into the rewriting logic language Maude. The
resulting tool supports graphical editing and simulation of CPNs: the
user draws a CPN graphically, the tool automatically translates the
drawn CPN into a Maude specification, and the Maude language is then
used to simulate the resulting specification. To our knowledge, this
is the first rewriting-logic-based environment for this category of
Petri nets.
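The rewriting view of CPN semantics described above can be sketched in a few lines: a transition is a rewrite rule over multiset markings of colored tokens. This is a minimal illustration, not the authors' tool; the net, places, and colors are invented for the example.

```python
# Toy sketch (not the authors' tool): a colored Petri net transition
# viewed as a rewrite rule over multiset markings.
from collections import Counter

def fire(marking, consume, produce):
    """Apply one rewrite step: remove 'consume' tokens, add 'produce'.

    marking/consume/produce map (place, color) -> token count.
    Returns the new marking, or None if the transition is not enabled.
    """
    if any(marking[k] < n for k, n in consume.items()):
        return None  # not enough tokens: transition not enabled
    out = marking - consume  # multiset difference
    out.update(produce)      # multiset union with produced tokens
    return out

# A tiny example net: place "buf" holds colored tokens; firing the
# transition moves one red token from "buf" to "done".
m0 = Counter({("buf", "red"): 2, ("buf", "blue"): 1})
t_consume = Counter({("buf", "red"): 1})
t_produce = Counter({("done", "red"): 1})

m1 = fire(m0, t_consume, t_produce)
```

In Maude the same step would be a labelled rewrite rule over a marking sort; the dictionary-of-multisets encoding here only mirrors that idea.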
Abstract: The area of Project Risk Management (PRM) has
been extensively researched, and the use of various tools and
techniques for managing risk in several industries has been
widely reported. Formal and systematic PRM practices are
available for the construction industry. Building on this body of
knowledge, this paper seeks a global picture of PRM practices and
approaches through a survey examining the usage of PRM
techniques, the diffusion of software tools, their level of maturity,
and their usefulness in the construction sector. Results show that,
despite the existing techniques and tools, their usage is limited:
software tools are used by only a minority of respondents, and their
cost is one of the largest hurdles to adoption. Finally, the paper
provides guidelines for future research on quantitative risk analysis
techniques and suggestions for the development and improvement of
PRM software tools.
Abstract: This paper investigates the zeros of the Bargmann analytic representation. A brief introduction to the harmonic oscillator formalism is given, and the Bargmann analytic representation is reviewed. The zeros of Bargmann analytic functions are considered, and the Q (Husimi) functions are introduced. The Bargmann functions and the Husimi functions have the same zeros. The Bargmann functions f(z) have exactly q zeros. The time evolution of the zeros μn is discussed, and various examples are given.
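The stated coincidence of Bargmann and Husimi zeros follows from a standard identity; a sketch under the usual coherent-state conventions (with |z⟩ a coherent state and |ψ⟩ = Σ c_n |n⟩), not taken from the paper itself:

```latex
f(z) \;=\; \sum_{n} \frac{c_n}{\sqrt{n!}}\, z^{n},
\qquad
Q(z) \;=\; \frac{1}{\pi}\,\bigl|\langle z|\psi\rangle\bigr|^{2}
      \;=\; \frac{1}{\pi}\, e^{-|z|^{2}}\,\bigl|f(z^{*})\bigr|^{2}.
```

Since the Gaussian factor never vanishes, Q(z) vanishes exactly where the Bargmann function does, so the two representations share the same zero set.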
Abstract: Verification of real-time software systems can be
expensive in terms of time and resources. Testing is the main method
of establishing correctness but has been shown to be a long and
time-consuming process. Practising engineers are usually unwilling to
adopt formal approaches to correctness because of the overhead of
developing their knowledge of such techniques.
Performance modelling techniques allow systems to be evaluated
with respect to timing constraints. This paper describes PARTES, a
framework which guides the extraction of performance models from
programs written in an annotated subset of C.
Abstract: Medical Decision Support Systems (MDSSs) are sophisticated, intelligent systems that can provide inference in the presence of missing information and uncertainty. In such systems, uncertainty is modeled using various soft computing methods such as Bayesian networks, rough sets, artificial neural networks, fuzzy logic, inductive logic programming, and genetic algorithms, as well as hybrid methods formed by combining several of these. In this study, symptom-disease relationships are represented in a framework modeled with formal concept analysis, with diseases as objects and symptoms as attributes. After a concept lattice is formed, Bayes' theorem can be used to determine the relationships between attributes and objects. A discernibility relation, which forms the basis of rough sets, can be applied to the attribute data sets in order to reduce the attributes and decrease the complexity of computation.
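The discernibility-based attribute reduction mentioned above can be sketched as follows. This is a minimal illustration of the rough-set idea, not the paper's system; the toy patient table and greedy reduct heuristic are invented for the example.

```python
# Minimal sketch of rough-set attribute reduction via the
# discernibility relation: keep only the attributes needed to
# discern objects (patients) belonging to different decision classes.
def discernibility_reduct(rows, attrs, decision):
    """rows: list of dicts; attrs: condition attributes; decision: class key."""
    # Discernibility sets: attributes differing for each pair of
    # objects with different decision values.
    disc = []
    for i in range(len(rows)):
        for j in range(i + 1, len(rows)):
            if rows[i][decision] != rows[j][decision]:
                d = {a for a in attrs if rows[i][a] != rows[j][a]}
                if d:
                    disc.append(d)
    # Greedy hitting set: repeatedly pick the attribute covering the
    # most remaining discernibility sets (an approximate reduct).
    reduct = set()
    while disc:
        best = max(attrs, key=lambda a: sum(a in d for d in disc))
        reduct.add(best)
        disc = [d for d in disc if best not in d]
    return reduct

# Hypothetical symptom/disease table for illustration only.
patients = [
    {"fever": 1, "cough": 1, "rash": 0, "disease": "flu"},
    {"fever": 1, "cough": 0, "rash": 1, "disease": "measles"},
    {"fever": 0, "cough": 1, "rash": 0, "disease": "cold"},
]
red = discernibility_reduct(patients, ["fever", "cough", "rash"], "disease")
```

Every pair of patients with different diagnoses remains discernible using only the reduced attribute set, which is the defining property of a reduct.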
Abstract: There is significant interest in achieving technology
innovation through new product development activities. It is
recognized, however, that traditional project management practices,
focused only on performance, cost, and schedule attributes, can often
lead to risk mitigation strategies that limit new technology
innovation. In this paper, a new approach is proposed for formally
managing and quantifying technology innovation. This approach uses
a risk-based framework that simultaneously optimizes innovation
attributes along with traditional project management and system
engineering attributes. To demonstrate the efficacy of the new risk-based
approach, a comprehensive product development experiment
was conducted. This experiment simultaneously managed the
innovation risks and the product delivery risks through the proposed
risk-based framework. Quantitative metrics for technology
innovation were tracked and the experimental results indicate that the
risk-based approach can simultaneously achieve both project
deliverable and innovation objectives.
Abstract: It is widely acknowledged that there is a shortage of software developers, not only in South Africa, but also worldwide. Despite reports on a gap between industry needs and software education, the gap has mostly been explored in quantitative studies. This paper reports on the qualitative data of a mixed method study of the perceptions of professional software developers regarding what topics they learned from their formal education and the importance of these topics to their actual work. The analysis suggests that there is a gap between industry’s needs and software development education and the following recommendations are made: 1) Real-life projects must be included in students’ education; 2) Soft skills and business skills must be included in curricula; 3) Universities must keep the curriculum up to date; 4) Software development education must be made accessible to a diverse range of students.
Abstract: As more people from non-technical backgrounds
are becoming directly involved with large-scale ontology
development, the focal point of ontology research has shifted
from the more theoretical ontology issues to problems
associated with the actual use of ontologies in real-world,
large-scale collaborative applications. Recently the National
Science Foundation funded a large collaborative ontology
development project for which a new formal ontology model,
the Ontology Abstract Machine (OAM), was developed to
satisfy some unique functional and data representation
requirements. This paper introduces the OAM model and the
related algorithms that enable maintenance of an ontology that
supports node-based user access. The successful software
implementation of the OAM model and its subsequent
acceptance by a large research community proves its validity
and its real-world application value.
Abstract: Taking into account the link between the efficiency of
a detector and the complexity of a stealth mechanism, we propose in
this paper a new formalism for stealth using graph theory.
Abstract: Decisions are made regularly during a project or in
daily life. Some decisions are critical and have a direct impact on
project or human success. Formal evaluation is thus required,
especially for crucial decisions, to arrive at the optimal solution
among the alternatives. According to microeconomic theory, all of
people's decisions can be modeled as indifference curves. The
proposed approach supports formal analysis and decision-making by
constructing an indifference curve model from previous experts'
decision criteria. The knowledge embedded in the system can be
reused, and can help naïve users select an alternative solution to a
similar problem. Moreover, the method is flexible enough to cope
with an unlimited number of factors influencing the decision-making.
In the preliminary experiments, the selected alternatives accurately
matched the experts' decisions.
Abstract: A computer model of Quantum Theory (QT) has been
developed by the author. A major goal of the computer model was the
support and demonstration of as large a scope of QT as possible.
This includes simulations of the major QT (Gedanken-) experiments
such as, for example, the famous double-slit experiment.
Besides the anticipated difficulties with (1) transforming exacting
mathematics into a computer program, two further types of problems
showed up, namely (2) areas where QT provides a complete
mathematical formalism, but where, in concrete applications, the
equations are not solvable at all, or only with extremely high effort;
and (3) QT rules which are formulated in natural language and which
do not seem to be translatable into precise mathematical expressions,
nor into a computer program.
The paper lists problems in all three categories and also describes
the possible solutions or circumventions developed for the computer
model.
Abstract: The aim of this study is to evaluate the antinociceptive
and anti-inflammatory activity of Geum kokanicum. After
determination of the LD50 of the total extract, different doses of
extract were chosen for intraperitoneal injection. In the
inflammation test, male NMRI mice were divided into 6 groups:
control (normal saline), positive control (dexamethasone 15 mg/kg),
and total extract (0.025, 0.05, 0.1, and 0.2 g/kg). The inflammation
was produced by xylene-induced edema. The formalin test was used to
evaluate the antinociceptive effect of the total extract. Mice were
divided into 6 groups: control, positive control (morphine 10 mg/kg),
and 4 groups which received the total extract followed by formalin.
The animals were observed for their reaction to pain. Data were
analyzed using one-way ANOVA followed by the Tukey-Kramer multiple
comparison test. The LD50 was 1 g/kg. The data indicated that the
0.5, 0.1 and 0.2 g/kg doses of the total extract have particular
antinociceptive and anti-inflammatory effects in comparison with the
control (P
Abstract: Specification-based testing enables us to detect errors
in the implementation of functions defined in given specifications.
Its effectiveness in achieving high path coverage and its efficiency
in generating test cases are always major concerns of testers. The
automatic test case generation approach based on formal
specifications proposed by Liu and Nakajima aims to ensure high
effectiveness and efficiency, but this approach has not been
empirically assessed. In this paper, we present an experiment for
assessing Liu's testing approach. The result indicates that this
testing approach may not be effective in some circumstances. We
discuss the result, analyse the specific causes of the
ineffectiveness, and describe some suggestions for improvement.
Abstract: One of the main advantages of the Learning Object (LO)
paradigm is the availability of good-quality, shareable learning
material through the Web. The effectiveness of the retrieval process
requires a formal description of the resources (metadata) that
closely fits the user's search criteria; in spite of the huge
international efforts in this field, educational metadata schemata
often fail to fulfil this requirement. This work aims to improve the
situation through the definition of a metadata model capturing
specific didactic features of shareable learning resources. It
classifies LOs into "teacher-oriented" and "student-oriented"
categories, in order to describe the role an LO is to play when it is
integrated into the educational process. This article describes the
model and a first experimental validation process that has been
carried out in a controlled environment.
Abstract: In this paper we propose a computational model for the representation and processing of morpho-phonological phenomena in a natural language, such as Modern Greek. We aim at a unified treatment of inflection, compounding, and word-internal phonological changes, in a model that is used for both analysis and generation. After discussing certain difficulties caused by well-known finite-state approaches, such as Koskenniemi's two-level model [7], when applied to a computational treatment of compounding, we argue that a morphology-based model provides a more adequate account of word-internal phenomena. Contrary to the finite-state approaches, which cannot handle hierarchical word constituency in a satisfactory way, we propose a unification-based word grammar, as the nucleus of our strategy, which takes into consideration word representations that are based on affixation and [stem stem] or [stem word] compounds. In our formalism, feature-passing operations are formulated with the use of the unification device, and phonological rules modeling the correspondence between lexical and surface forms apply at morpheme boundaries. Throughout the paper, examples from Modern Greek illustrate our approach. Morpheme structures, stress, and morphologically conditioned phoneme changes are analyzed and generated in a principled way.
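The unification device used for feature passing can be sketched as follows. This is a minimal, flat-feature illustration of the general mechanism, not the authors' grammar; the feature names and values are invented for the example (a full word grammar would unify nested feature structures recursively).

```python
# Minimal sketch of the unification device for feature passing:
# two feature structures unify if all shared features agree.
def unify(fs1, fs2):
    """Unify two flat feature structures (dicts); None on a clash."""
    out = dict(fs1)
    for feat, val in fs2.items():
        if feat in out and out[feat] != val:
            return None  # feature clash: unification fails
        out[feat] = val
    return out

# [stem stem] compounding, schematically: the head stem contributes
# category and gender, while both stems must agree on shared features.
stem1 = {"register": "common"}                                   # non-head stem
stem2 = {"cat": "noun", "gender": "neut", "register": "common"}  # head stem
compound = unify(stem1, stem2)
```

A clash on a shared feature (e.g. conflicting register values) makes unification fail, which is how ill-formed compounds are ruled out in unification-based grammars.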
Abstract: This paper investigates the implementation of security
mechanisms in object-oriented database systems. Formal methods
play an essential role in computer security due to their powerful
expressiveness and concise syntax and semantics. In this paper, both
the specification and the implementation of a database security
environment are considered, and database security is achieved through
the development of an efficient implementation of the specification
without compromising its originality and expressiveness.
Abstract: In the present study, the anti-inflammatory and
antinociceptive effects of Vitex hydro-alcoholic extract were
evaluated in male mice. In the inflammation test, mice were divided
into 7 groups: the first group was the control; the second, a
positive control group, received dexamethasone (15 mg/kg); and the
other five groups received different doses of the hydro-alcoholic
extract of Vitex fruit (265, 365, 465, 565, and 665 mg/kg). The
inflammation was caused by xylene-induced ear edema. The formalin
test was used to evaluate the antinociceptive effect of the extract.
In this test, mice were divided into 7 groups: control, morphine
(10 mg/kg) as the positive control group, and Vitex extract groups
(265, 365, 465, 565, and 665 mg/kg). All drugs were administered
intraperitoneally, 30 min before each test. The data were analyzed
using one-way ANOVA followed by the Tukey-Kramer multiple comparison
test. The results showed significant anti-inflammatory effects of the
extract at all doses compared with the control (P
Abstract: The propagation of elastic waves in arbitrary
anisotropic plates is investigated. Commencing with a formal
analysis of waves in a layered plate of arbitrary anisotropic media,
the dispersion relations of elastic waves are obtained by invoking
continuity at the interfaces and the boundary conditions on the
surfaces of the layered plate. The obtained solutions can be used for
material systems of higher symmetry, such as monoclinic, orthotropic,
transversely isotropic, cubic, and isotropic, as these are contained
implicitly in the analysis. The cases of a free layered plate and a
layered half-space are considered separately. Some special cases have
also been deduced and discussed. Finally, a numerical solution of the
frequency equations for an aluminum-epoxy system is carried out, and
the dispersion curves for the first few lower modes are presented.
The results obtained theoretically have been verified numerically and
illustrated graphically.
Abstract: This paper introduces a tool that is being developed for the expression of the information security policy controls that govern electronic healthcare records. By reference to published findings, the paper introduces the theory behind the use of knowledge management for automatic and consistent security policy assertion, using the formalism called the Secutype; the development of the tool and its functionality are discussed; some examples of Secutypes generated by the tool are provided; and proposed integration with existing medical record systems is described. The paper concludes with a section on further work and a critique of the work achieved to date.
Abstract: Traditionally, terror groups have been formed by ideologically aligned actors who perceive a lack of options for achieving political or social change. However, terrorist attacks have increasingly been carried out by small groups of actors or lone individuals who may be only ideologically affiliated with larger, formal terrorist organizations. The formation of these groups represents the inverse of traditional organizational growth, whereby structural de-evolution within issue-based organizations leads to the formation of small, independent terror cells. Ideological franchising, the bypassing of formal affiliation with the "parent" organization, represents the de-evolution of traditional concepts of organizational structure in favor of an organic, independent, and focused unit. Traditional definitions of issue-based dark networks include a focus on an identified goal, commitment to achieving this goal through unrestrained actions, and the selection of symbolic targets. The next step in the de-evolution of small dark networks is the mini-organization, consisting of only a handful of actors working toward a common, violent goal. Information-sharing through social media platforms, coupled with the civil liberties of democratic nations, provides the communication systems, access to information, and freedom of movement necessary for small dark networks to flourish without the aid of a parent organization. As attacks such as the 7/7 bombings demonstrate the effectiveness of small dark networks, terrorist actors will feel increasingly comfortable aligning with an ideology only, without formally organizing. The natural result of this de-evolving organization is the single-actor event, where an individual seems to subscribe to a larger organization's violent ideology with little or no formal ties.