Abstract: The success of an e-learning system depends heavily on the quality of its educational content and on how effective, complete, and simple its design tool is for teachers. Educational modeling languages (EMLs) are design languages intended for teachers, for modeling diverse teaching-learning experiences independently of the pedagogical approach and in different contexts. However, most existing EMLs are criticized as too abstract and too complex for teachers to understand and manipulate. In this paper, we present a visual EML that simplifies the process of designing learning scenarios for teachers with no programming background. Grounded in the conceptual framework of activity theory, our visual EML uses domain-specific modeling techniques to provide a pedagogical level of abstraction in the design process.
Abstract: Knowledge is attributed to humans, whose problem-solving behavior is subjective and complex. In today's knowledge economy, the need to manage knowledge produced by a community of actors cannot be overemphasized, because actors possess some level of tacit knowledge that is generally difficult to articulate. Problem-solving requires searching and sharing of knowledge among a group of actors in a particular context, and knowledge expressed within the context of a problem resolution must be capitalized for future reuse. In this paper, we propose an approach that permits dynamic capitalization of relevant and reliable actors' knowledge in solving decision problems following the Economic Intelligence process. A knowledge annotation method and temporal attributes are used to handle the complexity of communication among actors and to contextualize expressed knowledge. A prototype was built to demonstrate the functionalities of a collaborative knowledge management system based on this approach. Tests on sample cases showed that dynamic capitalization leads to knowledge validation, hence increasing the reliability of captured knowledge for reuse. The system can be adapted to various domains.
Abstract: The aim of the present work is to study the effect of annealing on the vibration damping capacity of high-chromium (16%) ferromagnetic steel. The alloys were prepared from raw materials of 99.9% purity melted in a high-frequency induction furnace under high vacuum. The samples were heat-treated in vacuum at various temperatures (800 to 1200°C) for 1 hour, followed by slow cooling (120°C/h). The inverted torsional pendulum method was used to evaluate the vibration damping capacity. The results indicated that the vibration damping capacity of the alloys is influenced by annealing and that a critical annealing temperature exists at around 1000°C. The damping capacity increases quickly below the critical temperature, since the magnetic domains move more easily.
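In the torsional pendulum method, damping capacity is commonly quantified via the logarithmic decrement of successive free-vibration amplitudes. A minimal sketch of that computation (the amplitude values and the small-damping relations below are textbook conventions for illustration, not data or formulas taken from the study):

```python
import math

def log_decrement(amplitudes):
    """Mean logarithmic decrement delta = ln(A_n / A_{n+1}),
    averaged over successive free-vibration amplitude pairs."""
    pairs = zip(amplitudes, amplitudes[1:])
    return sum(math.log(a / b) for a, b in pairs) / (len(amplitudes) - 1)

def damping_capacity(delta):
    """Specific damping capacity psi ~ 2*delta in the small-damping
    approximation; the internal friction is Q^-1 = delta / pi."""
    return 2.0 * delta

# Illustrative decaying amplitudes from a pendulum trace
amps = [1.00, 0.85, 0.7225, 0.614125]
delta = log_decrement(amps)
print(round(delta, 4))                 # 0.1625 (constant-ratio decay)
print(round(damping_capacity(delta), 4))
```

Comparing the decrement of samples annealed at different temperatures is then a direct way to expose the critical-temperature behaviour described above.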
Abstract: Corporate Social Responsibility (CSR) performance has garnered significant interest during the last two decades, as numerous methodologies have been proposed by Socially Responsible Investment (SRI) indexes. The weight of each indicator is a crucial component of CSR measurement procedures. Based on a previous study, the appropriate weight of each proposed indicator for the Greek telecommunication sector is specified using rank reciprocal weighting. Kendall's Coefficient of Concordance and Spearman's Correlation Coefficient non-parametric tests are adopted to determine the level of consensus among the experts concerning the importance rank of the indicators. The results show that there is no consensus regarding the rank of indicators in most of the stakeholders' domains. Equal weights for all indicators could be proposed as a solution for the lack of consensus among the experts. The study recommends three different equations concerning the adopted weighting approach.
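The two statistical ingredients named above reduce to short closed forms; a minimal sketch with illustrative ranks (not the study's survey data):

```python
def rank_reciprocal_weights(ranks):
    """Rank reciprocal weighting: w_i = (1/r_i) / sum_j (1/r_j),
    where r_i is the importance rank of indicator i (1 = most important)."""
    inv = [1.0 / r for r in ranks]
    total = sum(inv)
    return [v / total for v in inv]

def kendalls_w(rankings):
    """Kendall's coefficient of concordance W for m experts ranking
    n indicators: W = 12*S / (m^2 * (n^3 - n)), where S is the sum of
    squared deviations of rank totals. W = 1 means full consensus."""
    m, n = len(rankings), len(rankings[0])
    totals = [sum(r[i] for r in rankings) for i in range(n)]
    mean = m * (n + 1) / 2.0
    s = sum((t - mean) ** 2 for t in totals)
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Illustrative: 4 indicators ranked 1..4 -> weights 0.48, 0.24, 0.16, 0.12
print(rank_reciprocal_weights([1, 2, 3, 4]))
# Two experts in perfect agreement -> W = 1
print(kendalls_w([[1, 2, 3, 4], [1, 2, 3, 4]]))
```

A low W on the experts' rankings is what motivates falling back to equal weights, as the abstract concludes.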
Abstract: The Norwegian Military Academy (Army) has
initiated a project with the main ambition to explore possible avenues
to enhancing operational effectiveness through an increased use of
simulation-based training and exercises. Within a cost/benefit
framework, we discuss opportunities and limitations of vertical and
horizontal integration of the existing tactical training system. Vertical
integration implies expanding the existing training system to span the
full range of training from tactical level (platoon, company) to
command and staff level (battalion, brigade). Horizontal integration
means including other domains than army tactics and staff
procedures in the training, such as military ethics, foreign languages,
leadership, and decision making. We discuss each of the integration options with respect to the purpose and content of training and "best practice" for organising and conducting simulation-based training, and we suggest how to evaluate training procedures and measure learning outcomes. We conclude by giving guidelines for further
explorative work and possible implementation.
Abstract: The connection between solar activity and adverse phenomena in the Earth's environment that can affect space- and ground-based technologies has spurred interest in Space Weather (SW) research. A great effort has been put into the development of suitable models that can provide advance forecasts of SW events. With progress in computational technology, it is becoming possible to develop operational large-scale physics-based models that incorporate the most important physical processes and domains of the Sun-Earth system. To enhance our SW prediction capabilities, we are developing advanced numerical tools. With operational requirements in mind, our goal is to develop a modular simulation framework for the propagation of disturbances from the Sun through interplanetary space to the Earth. Here, we report on and discuss the development of the coronal field and solar wind components of a large-scale MHD code. The model for these components is based on a potential field source surface model and an empirical Wang-Sheeley-Arge solar wind relation.
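The Wang-Sheeley-Arge relation maps the magnetic flux-tube expansion factor from the coronal field model to a solar wind speed. A minimal sketch of one published form of that empirical relation; the coefficients shown are a fit reported in the literature (Arge & Pizzo, 2000) and in practice are retuned per model configuration, so treat them as illustrative:

```python
def wsa_wind_speed(f_s, v0=267.5, v1=410.0, alpha=2.0 / 5.0):
    """Empirical WSA solar wind speed (km/s) from the flux-tube
    expansion factor f_s: v = v0 + v1 / f_s**alpha. Small expansion
    (coronal-hole centres) -> fast wind; large expansion -> slow wind."""
    return v0 + v1 / f_s ** alpha

print(round(wsa_wind_speed(1.0), 1))      # fast wind (~678 km/s)
print(round(wsa_wind_speed(1000.0), 1))   # slow wind (~293 km/s)
```

In a coupled framework such as the one described, this relation supplies the inner-boundary wind speed that the MHD code then propagates outward.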
Abstract: The overriding goal of software engineering is to provide a high-quality system, application, or product. To achieve this goal, software engineers must apply effective methods coupled with modern tools within the context of a mature software process [2]. In addition, it must also be assured that high quality is realized. Although many quality measures can be collected at the project level, the most important measures are errors and defects. Deriving a quality measure for reusable components has proven to be a challenging task nowadays. The results obtained from this study are based on empirical evidence of reuse practices, as emerged from the analysis of industrial projects. Both large and small companies, working in a variety of business domains and using object-oriented and procedural development approaches, contributed to this study. This paper proposes a quality metric that provides benefits at both the project and process level, namely defect removal efficiency (DRE).
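DRE has a standard closed form; a minimal sketch with illustrative counts (not data from the study):

```python
def defect_removal_efficiency(errors_before, defects_after):
    """DRE = E / (E + D), where E is the number of errors found
    before delivery and D the number of defects found after
    delivery; DRE = 1.0 means every problem was caught in-house."""
    return errors_before / (errors_before + defects_after)

print(defect_removal_efficiency(45, 5))   # 0.9
```

Tracked per development phase, the same ratio also measures how well each review or test activity filters problems before they escape downstream, which is what makes it useful at both the project and process level.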
Abstract: In this paper we present a new approach to detecting flaws in T.O.F.D (Time Of Flight Diffraction) ultrasonic images based on texture features. Texture is one of the most important features used in recognizing patterns in an image. The paper describes texture features based on 2D Gabor functions, i.e., Gaussian-shaped band-pass filters with dyadic treatment of the radial spatial frequency range and multiple orientations, which represent an appropriate choice for tasks requiring simultaneous measurement in both the space and frequency domains. The most relevant features are used as input data to a Fuzzy c-means clustering classifier. There are only two classes: 'defects' or 'no defects'. The proposed approach is tested on T.O.F.D images acquired in the laboratory and in the industrial field.
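A Gabor filter bank of the kind described, with octave-spaced (dyadic) radial frequencies at several orientations, can be sketched as follows; the kernel size, bandwidth, and frequency values are illustrative choices, not the paper's parameters:

```python
import numpy as np

def gabor_kernel(size, freq, theta, sigma, gamma=1.0):
    """Real 2D Gabor kernel: a Gaussian-shaped band-pass filter tuned
    to radial spatial frequency `freq` (cycles/pixel) at orientation
    `theta` (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def gabor_bank(size=15, sigma=3.0, n_orient=4, freqs=(0.125, 0.25, 0.5)):
    """Dyadic (octave-spaced) radial frequencies at several orientations."""
    thetas = [k * np.pi / n_orient for k in range(n_orient)]
    return [gabor_kernel(size, f, t, sigma) for f in freqs for t in thetas]

bank = gabor_bank()   # 3 frequencies x 4 orientations = 12 filters
```

Convolving the T.O.F.D image with each kernel and taking local response energies yields the texture feature vectors that a fuzzy c-means classifier can then partition into the 'defects' / 'no defects' classes.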
Abstract: The PRAF family of proteins is a plant-specific family with a distinct domain architecture and various unique sequence/structure traits. We have carried out an extensive search of the Arabidopsis genome, using an automated pipeline and manual methods, to verify previously known and identify unknown instances of PRAF proteins, characterize their sequences, and build 3D structures of their individual domains. Integrating the sequence, the structure, and the few experimental details known for each of these proteins and their domains, we present a comprehensive characterization of the different domains in these proteins and their variant properties.
Abstract: Cluster analysis is the name given to a diverse collection of techniques that can be used to classify objects (e.g. individuals, quadrats, species, etc.). While Kohonen's Self-Organizing Feature Map (SOFM) or Self-Organizing Map (SOM) networks have been successfully applied as a classification tool to various problem domains, including speech recognition, image data compression, image or character recognition, robot control, and medical diagnosis, their potential as a robust substitute for cluster analysis remains relatively unresearched. SOM networks combine competitive learning with dimensionality reduction by smoothing the clusters with respect to an a priori grid, and they provide a powerful tool for data visualization. In this paper, SOM is used to create a toroidal mapping of a two-dimensional lattice to perform cluster analysis on the results of a chemical analysis of wines produced in the same region in Italy but derived from three different cultivars, referred to as the "wine recognition data" located in the University of California-Irvine database. The results are encouraging, and it is believed that SOM would make an appealing and powerful decision-support tool for clustering tasks and for data visualization.
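The core SOM training loop described above (competitive learning plus a shrinking neighbourhood over an a priori grid) can be sketched in a few lines; the grid size, schedules, and the synthetic three-cluster data standing in for the 13-attribute wine data are illustrative assumptions:

```python
import numpy as np

def train_som(data, grid=(6, 6), epochs=30, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Self-Organizing Map: competitive learning on a 2-D lattice
    with a Gaussian neighbourhood that shrinks over time. (A toroidal
    variant, as used in the paper, would wrap the lattice distances;
    this sketch keeps a flat grid for brevity.)"""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    w = rng.random((rows * cols, data.shape[1]))
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)               # decaying learning rate
        sigma = sigma0 * (1.0 - t / epochs) + 0.5   # shrinking neighbourhood
        for x in data:
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))   # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            w += lr * np.exp(-d2 / (2 * sigma ** 2))[:, None] * (x - w)
    return w, coords

# Three well-separated synthetic clusters in [0, 1] (a stand-in for the
# 13-attribute wine recognition data)
rng = np.random.default_rng(1)
data = np.clip(np.vstack([rng.normal(m, 0.05, (20, 2))
                          for m in (0.2, 0.5, 0.8)]), 0.0, 1.0)
w, coords = train_som(data)
bmus = [int(np.argmin(((w - x) ** 2).sum(axis=1))) for x in data]
```

After training, plotting each sample at the lattice coordinates of its best-matching unit gives the cluster visualization the abstract refers to.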
Abstract: This paper presents a concept for a multidisciplinary process supporting effective task transitions between different technical domains during the architectural design stage.
A challenge in system configuration is the increased solution space driven by multifunctionality. As a consequence, more iterations are needed to find a global optimum, i.e. a compromise between the involved disciplines, without a negative impact on development time. Since state-of-the-art standards such as ISO 15288 and VDI 2206 do not provide a detailed methodology for the multidisciplinary design process, higher uncertainties regarding final specifications arise. This leads to the need for more detailed and standardized concepts or processes that could mitigate such risks.
The work performed is based on an analysis of multidisciplinary interaction and of modeling and simulation techniques. To demonstrate and prove the applicability of the presented concept, it is applied to the design of aircraft high-lift systems, in the context of the engineering disciplines of kinematics, actuation, monitoring, installation, and structure design.
Abstract: Genetic Folding (GF), a new class of evolutionary algorithm (EA), is introduced for the first time. It is based on chromosomes composed of floating genes structurally organized in a parent form and separated by dots. The genotype/phenotype system of GF generates a kernel expression, which serves as the objective function of a superior classifier. In this work, the question of satisfying the mapping rules in evolving populations is addressed by analyzing populations that either obey or do not obey Mercer's rule. The results presented here show that populations obeying Mercer's rule practically improve model selection for Support Vector Machines (SVM). The experiment is trained on a multi-classification problem and tested on the nonlinear Ionosphere dataset. The aim of this paper is to determine whether kernels evolved by Genetic Folding satisfy Mercer's rule when applied to complicated domains and problems.
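Whether a candidate kernel expression obeys Mercer's rule can be probed empirically on a finite sample: a valid kernel must yield a symmetric positive semi-definite Gram matrix. A minimal sketch (the sample points and the two example kernels are illustrative, not GF-evolved expressions):

```python
import numpy as np

def satisfies_mercer(kernel, xs, tol=1e-9):
    """Empirical Mercer check: a valid SVM kernel must produce a
    symmetric positive semi-definite Gram matrix on any finite sample.
    Failure on a sample proves the candidate is not a Mercer kernel;
    passing is only evidence, not proof."""
    k = np.array([[kernel(a, b) for b in xs] for a in xs])
    if not np.allclose(k, k.T):
        return False
    return bool(np.linalg.eigvalsh(k).min() >= -tol)

xs = [np.array(v, float) for v in [(0, 1), (1, 0), (1, 1), (2, 1)]]
rbf = lambda a, b: float(np.exp(-((a - b) ** 2).sum()))   # Gaussian: Mercer
bad = lambda a, b: -float(((a - b) ** 2).sum())           # not Mercer

print(satisfies_mercer(rbf, xs))   # True
print(satisfies_mercer(bad, xs))   # False
```

A check of this kind can act as a fitness filter, discarding evolved expressions whose Gram matrices are indefinite before SVM training.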
Abstract: The need for standards has always been a priority in every discipline. Today, standards such as XML and USB are trying to create a universal interface for their respective areas. The information describing every family in a given discipline has much in common; this shared information is known as metadata. A lot of work has been done in specific domains, such as IEEE LOM and MPEG-7, but these efforts do not address the universality of creating metadata for all entities, where an entity (object) is not restricted to software terms. This paper tries to address this problem of universal metadata definition, which may lead to increased search precision.
Abstract: In this paper, a decision aid method for pre-optimization is presented. The method is called "negotiation", and it is based on the identification, formulation, modeling, and use of indicators defined as "negotiation indicators". These negotiation indicators are used to explore the solution space by means of a class-based approach. The classes are subdomains of the negotiation indicators' domain; they represent cognitively equivalent solutions in terms of the negotiation indicators being used. By this method, we reduce the size of the solution space and of the criteria, thus aiding the optimization methods. An example is presented to illustrate the method.
Abstract: In distributed resource allocation, a set of agents must assign their resources to a set of tasks. This problem arises in many real-world domains such as distributed sensor networks, disaster rescue, hospital scheduling, and others. Despite the variety of approaches proposed for distributed resource allocation, a systematic formalization of the problem, explaining the different sources of difficulty, and a formal explanation of the strengths and limitations of key approaches are missing. We take a step towards this goal by using a formalization of distributed resource allocation that represents both the dynamic and the distributed aspects of the problem. In this paper, we present a new idea for target tracking in sensor networks and compare it with previous approaches. The central contribution of the paper is a generalized mapping from distributed resource allocation to DDCSP. This mapping is proven to correctly solve resource allocation problems of specified difficulty. This theoretical result is verified in practice by a simulation on a real-world distributed sensor network.
Abstract: Process-oriented software development is a new
software development paradigm in which software design is modeled
by a business process which is in turn translated into a process
execution language for execution. The building blocks of this
paradigm are software units that are composed together to work
according to the flow of the business process. This new paradigm still exhibits the characteristics of applications built with traditional software component technology. This paper discusses an
approach to apply a traditional technique for software component
fabrication to the design of process-oriented software units, called
process components. These process components result from
decomposing a business process of a particular application domain
into subprocesses, and these process components can be reused to
design the business processes of other application domains. The
decomposition considers five managerial goals, namely cost
effectiveness, ease of assembly, customization, reusability, and
maintainability. The paper presents how to design or decompose
process components from a business process model and measure
some technical features of the design that would affect the
managerial goals. A comparison between the measurement values
from different designs can tell which process component design is
more appropriate for the managerial goals that have been set. The
proposed approach can be applied in Web Services environment
which accommodates process-oriented software development.
Abstract: In a particular case of behavioural model reduction by ANNs, a shortening of the validity domain has been found. In mechanics, as in other domains, the notion of a validity domain allows the engineer to choose a valid model for a particular analysis or simulation. In a study of the mechanical behaviour of a cantilever beam (using linear and non-linear models), Multi-Layer Perceptron (MLP) Backpropagation (BP) networks have been applied as a model reduction technique. The reduced model is constructed to be more efficient than the non-reduced model: within a less extended domain, the ANN reduced model correctly estimates the non-linear response, at a lower computational cost. It has been found that the neural network model is not able to approximate the linear behaviour, while it approximates the non-linear behaviour very well. The details of the case are provided with an example of modelling the cantilever beam behaviour.
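The reduction idea can be sketched with a one-hidden-layer MLP trained by plain batch backpropagation to reproduce a non-linear response curve; everything below (the cubic "load/deflection" data, network size, learning rate) is an illustrative assumption, not the paper's cantilever model:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)[:, None]
y = x + 0.8 * x ** 3                      # illustrative non-linear behaviour

w1, b1 = rng.normal(0, 1, (1, 8)), np.zeros(8)
w2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)

def forward(x):
    h = np.tanh(x @ w1 + b1)              # single hidden layer
    return h, h @ w2 + b2

_, p = forward(x)
loss_before = float(((p - y) ** 2).mean())
for _ in range(2000):                     # plain batch backpropagation
    h, p = forward(x)
    g = 2.0 * (p - y) / len(x)            # dLoss/dOutput
    gh = (g @ w2.T) * (1 - h ** 2)        # backprop through tanh
    w2 -= 0.1 * h.T @ g; b2 -= 0.1 * g.sum(0)
    w1 -= 0.1 * x.T @ gh; b1 -= 0.1 * gh.sum(0)
_, p = forward(x)
loss_after = float(((p - y) ** 2).mean())
```

Once trained, evaluating the small network is far cheaper than the full model, but, as the abstract notes, the substitute is only trustworthy inside the (shortened) domain it was trained on.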
Abstract: β-Glucosidase is an important enzyme for the production of ethanol from lignocellulose. With hydrolytic activity on cellooligosaccharides, especially cellobiose, β-glucosidase removes the product inhibition of cellulases and forms fermentable sugars. In this study, the β-glucosidase-encoding gene (BGL1) from the traditional starter yeast Saccharomycopsis fibuligera BMQ908 was cloned and expressed in Pichia pastoris. BGL1 of S. fibuligera BMQ908 shared 98% nucleotide homology with the closest GenBank sequence (M22475) but was identical in the amino-acid sequence of the catalytic domains. The recombinant plasmid pPICZαA/BGL1, containing the sequence encoding the BGL1 mature protein and the α-factor secretion signal, was constructed and transformed into the methylotrophic yeast P. pastoris by electroporation. The recombinant strain produced a single extracellular protein with a molecular weight of 120 kDa and a cellobiase activity of 60 IU/ml. The optimum pH of the recombinant β-glucosidase was 5.0 and the optimum temperature was 50°C.
Abstract: The purpose of this article is to identify the practical strategies of R&D (research and development) entities for developing converging technology in an organizational context. Based on the multi-assignation technological domains of patents derived from all government-supported R&D projects over 13 years, we find that technology convergence is likely to occur when a university develops technology alone or as one of several collaborators. These results reflect the important role of universities in developing converging technology.
Abstract: This paper deals with the helical flow of a Newtonian
fluid in an infinite circular cylinder, due to both longitudinal and
rotational shear stress. The velocity field and the resulting shear
stress are determined by means of the Laplace and finite Hankel
transforms and satisfy all imposed initial and boundary conditions.
For large times, these solutions reduce to the well-known steady-state
solutions.