Abstract: In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), programming-language capabilities such as symbolic and intuitive programming, program portability and a rich geometrical portfolio have special importance. They save time, help avoid errors during part programming, and permit code reuse. Our updated literature review indicates that the current state of the art presents voids in parametric programming, program portability and programming flexibility. In response, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language that allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and in portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit program reuse. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
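To illustrate the kind of expansion such a compiler performs, the following is a minimal sketch of how a parametric loop with named variables might be flattened into elementary ISO G-code. The EGCL syntax shown in the comments and the `compile_drill_row` helper are illustrative assumptions, not the actual EGCL grammar or compiler interface.

```python
def compile_drill_row(start_x, hole_spacing, hole_count, depth):
    """Expand a parametric 'while' loop into flat, machine-portable G-code."""
    lines = ["G90", "G00 Z5.000"]               # absolute mode, retract to safe height
    x = start_x
    i = 0
    while i < hole_count:                        # EGCL (assumed): WHILE i < hole_count DO ...
        lines.append(f"G00 X{x:.3f} Y0.000")     # rapid to hole position
        lines.append(f"G01 Z{-depth:.3f} F100")  # feed down to drill depth
        lines.append("G00 Z5.000")               # retract before next hole
        x += hole_spacing
        i += 1
    lines.append("M30")                          # end of program
    return lines

gcode = compile_drill_row(start_x=10.0, hole_spacing=8.0, hole_count=3, depth=4.0)
```

The output contains only elementary G00/G01 moves, which is what makes the result portable across ISO-compliant controllers: the loop and the named parameters exist only at the EGCL level.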
Abstract: Prior research has not effectively investigated how the profitability of Chinese branches affects FDI in China [1, 2], so this study for the first time incorporates realistic earnings information to systematically investigate the effects of innovation, imitation, and profit factors on FDI diffusion from Taiwan to China. Our nonlinear least squares (NLS) model, which incorporates earnings factors, forms a nonlinear ordinary differential equation (ODE) in numerical simulation programs. The model parameters are obtained through a genetic algorithm (GA) and then optimized against the collected data for the best accuracy. In particular, Taiwanese regulatory FDI restrictions are also considered in our modified model to reflect realistic conditions. To validate the model's effectiveness, this investigation compares the prediction accuracy of the modified model with that of the conventional diffusion model, which does not take the profitability factors into account.
The results clearly demonstrate the internal influence to be positive: early FDI adopters' consistent praise of FDI attracts potential firms to make the same move, the former providing a behavioral model whose foreign investment decision the latter imitate. In particular, the results of the modified diffusion models show that the earnings of Chinese branches are positively related to the internal influence. In general, the imitating tendency of potential adopters is substantially hindered by losses in the Chinese branches, and such firms invest less in China. The extension of FDI inflows depends on the earnings of Chinese branches, and companies adjust their FDI strategies based on the returns. Since this research shows that earnings are an influential factor in FDI dynamics, our revised model clearly outperforms the conventional diffusion model in prediction ability.
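A diffusion model of this family can be sketched as a Bass-type ODE in which the imitation (internal-influence) term is scaled by an earnings signal. The specific functional form, the `beta` sensitivity parameter, and the forward-Euler integration below are illustrative assumptions, not the authors' exact NLS specification or the GA fitting procedure.

```python
# Assumed form: dN/dt = (p + q * (N/m) * (1 + beta*E)) * (m - N)
# p: external (innovation) influence, q: internal (imitation) influence,
# m: adoption potential, E: branch-earnings signal, beta: assumed sensitivity.

def simulate_fdi_diffusion(p, q, m, beta, earnings, steps=200, dt=0.1):
    """Integrate the modified diffusion ODE by forward Euler; returns the path of N."""
    N = 0.0
    path = [N]
    for _ in range(steps):
        internal = q * (N / m) * (1.0 + beta * earnings)  # imitation, scaled by earnings
        dN = (p + max(internal, 0.0)) * (m - N)           # losses damp, never reverse, adoption
        N = min(N + dN * dt, m)
        path.append(N)
    return path

profit = simulate_fdi_diffusion(p=0.01, q=0.4, m=100.0, beta=0.5, earnings=1.0)
loss   = simulate_fdi_diffusion(p=0.01, q=0.4, m=100.0, beta=0.5, earnings=-1.0)
```

With positive earnings the imitation term is amplified and diffusion saturates faster; with losses it is damped, matching the qualitative finding that losses in Chinese branches hinder the imitating tendency.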
Abstract: A method has been developed for preparing load models for power flow and stability studies. The load modeling (LOADMOD) computer software transforms data on load class mix, composition, and characteristics into the form required by commonly used power flow and transient stability simulation programs. Typical default data have been developed for load composition and characteristics. This paper describes the LOADMOD software, the dynamic and static load modeling techniques it uses, and the results of initial testing on the BAKHTAR power system.
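A common static load representation of the kind such tools emit for power-flow programs is the polynomial (ZIP) model, which splits the load into constant-impedance, constant-current, and constant-power fractions. This is a standard textbook form, offered here as a sketch; the coefficients below are illustrative defaults, not LOADMOD output or BAKHTAR system data.

```python
def zip_load_power(p0, v, v0=1.0, a_z=0.4, a_i=0.3, a_p=0.3):
    """Active power drawn at per-unit voltage v.

    p0: power at nominal voltage v0; a_z, a_i, a_p: constant-impedance,
    constant-current, and constant-power fractions (assumed; they sum to 1).
    """
    r = v / v0
    return p0 * (a_z * r ** 2 + a_i * r + a_p)

nominal = zip_load_power(p0=100.0, v=1.0)   # at nominal voltage
sagged  = zip_load_power(p0=100.0, v=0.95)  # during a voltage dip
```

The voltage sensitivity of the aggregate load is what distinguishes the load classes a tool like LOADMOD mixes together: a purely constant-power load (a_p = 1) would draw 100 MW regardless of the dip.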
Abstract: The process of constructing a scale measuring the attitudes of youth toward violence on television is reported. A 30-item draft attitude scale was administered to a study group of 232 students attending the Faculty of Educational Sciences at Ankara University during the 2005-2006 academic year. To establish the construct validity and dimensionality of the scale, exploratory and confirmatory factor analyses were applied to the data. Results of the exploratory factor analysis showed that the scale had three factors accounting for 58.44% of the common variance (22.46% for the first, 22.15% for the second and 13.83% for the third factor). The first factor concerned issues related to the individual effects of violence on television, the second factor concerned its social effects, and the third factor concerned violence in television programs. Results of the confirmatory factor analysis showed that all items fit the structure of their corresponding factors. An alpha reliability of 0.90 was estimated for the whole scale. It is concluded that the scale is valid and reliable.
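The alpha reliability reported above is Cronbach's alpha, computed from the item variances and the variance of the total score: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A minimal sketch of the computation, on made-up toy scores rather than the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha. items: list of equal-length score lists, one per scale item."""
    k = len(items)

    def var(xs):  # sample variance (n - 1 denominator)
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col) for col in zip(*items)]        # each respondent's total score
    item_var_sum = sum(var(it) for it in items)
    return (k / (k - 1)) * (1.0 - item_var_sum / var(totals))

# Toy data: 3 items, 5 respondents (illustrative, not the study's responses).
alpha_perfect = cronbach_alpha([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]])
alpha_noisy   = cronbach_alpha([[1, 2, 3, 4, 5], [2, 2, 3, 4, 4], [1, 3, 3, 4, 5]])
```

Perfectly parallel items give alpha = 1; highly correlated but noisy items give a value just below it, and a scale-level value of 0.90 as reported indicates high internal consistency.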
Abstract: In this presentation, we discuss the use of information technologies in the area of special education for teaching individuals with learning disabilities. Application software which was developed for this purpose is used to demonstrate the applicability of a database integrated information processing system to alleviate the burden of educators. The software allows the preparation of individualized education programs based on the predefined objectives, goals and behaviors.
Abstract: This paper proposes an analytical method for the dynamics of firms' alliance networks as they form along business phases. Dynamics in network development have previously been discussed in the research area of organizational strategy rather than in that of regional clusters, where the static properties of networks are more often discussed. The analytical method introduces the concept of business phases into innovation processes and uses relationships called prior experiences, an idea developed in organizational strategy, to investigate the state of networks from the viewpoint of the tradeoff between link stabilization and node exploration. This paper also discusses the results of applying the analytical method to five cases of firms' network development. The idea of embeddedness helps interpret the background of the analytical results. The analytical method is useful for policymakers of regional clusters in establishing concrete evaluation targets and a viewpoint for comparing policy programs.
Abstract: Most scientific programs have large input and output data sets that require out-of-core programming or use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious; as a result, it is generally avoided. However, in many instances VMM is not an effective approach because it often results in substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
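The tiling idea can be sketched as follows: the data set is fetched in fixed-size blocks on demand rather than mapped whole into virtual memory. The block size, the in-memory stand-in for the on-disk data set, and the checksum workload are illustrative assumptions, not Comanche's actual runtime interface.

```python
import io

def process_in_tiles(stream, tile_size, consume):
    """Read a large stream tile by tile, applying `consume` to each block.

    Only one tile is resident at a time, so peak memory is bounded by
    tile_size regardless of the total data set size.
    """
    while True:
        tile = stream.read(tile_size)
        if not tile:
            break
        consume(tile)

data = io.BytesIO(bytes(range(256)) * 64)   # stand-in for a large on-disk data set
total = 0

def checksum(tile):
    global total
    total += sum(tile)                       # toy per-tile computation

process_in_tiles(data, tile_size=1024, consume=checksum)
```

Doing this by hand for every array access is exactly the tedium the abstract describes; the point of a compiler-managed cache is that the compiler inserts such tile fetches automatically, at user level, without kernel support.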
Abstract: Increasingly sophisticated technologies can now assist surgeons in improving surgical performance through various training programs. Equally important to learning skills is the assessment method, as it determines the learning and technical proficiency of a trainee. A consistent and rigorous assessment system will ensure that trainees acquire the specified level of competency prior to certification. This paper reviews the methods currently in use for the assessment of surgical skill, as well as some modern techniques using computer-based measurements and virtual reality systems for more quantitative measurement.
Abstract: This paper examines two policy spaces, the ARC and TVA, and their spatialized politics. The research observes that the regional concept informs public policy and can contribute to the formation of stable policy initiatives. Using the subsystem framework to understand the political viability of policy regimes, the authors conclude that policy geographies appealing to traditional definitions of regions are more stable over time. In contrast, geographies that fail to reflect pre-existing representations of space are engaged in more competitive subsystem politics. The paper demonstrates that the spatial practices of policy regions and their directional politics influence the political viability of programs. The paper concludes that policy spaces should institutionalize pre-existing geographies, not manufacture new ones.
Abstract: Teaching and learning about sustainability is a pedagogical endeavour with various innate difficulties and increased demands. Higher education has a dual role to play in addressing this challenge: to identify and explore innovative approaches and tools for addressing the complex and value-laden nature of sustainability in more meaningful ways, and to help teachers integrate these approaches into their practice through appropriate professional development programs. The study reported here was designed and carried out within the context of a Master's course in Environmental Education. Eight teachers were collaboratively engaged in reconstructing a digital game microworld that was deliberately designed by the researchers to be questioned and to evoke critical discussion of the idea of a 'sustainable city'. The study was based on the design-based research method. The findings indicate that the teachers' involvement in co-constructing the microworld initiated discussion and reflection upon the concepts of sustainability and sustainable lifestyles.
Abstract: Probabilistic techniques are becoming more and more widely used in computer programs. There is therefore considerable interest in the formal specification, verification, and development of probabilistic programs. In our work-in-progress project, we are attempting to build a constructive framework for developing probabilistic programs formally. The main contribution of this paper is to introduce an intermediate artifact of our work, a Z-based formalism called PZ, with which one can build set-theoretical models of probabilistic programs. We propose to use a constructive set theory, called CZ set theory, to interpret specifications written in PZ. Since CZ has an interpretation in Martin-Löf's theory of types, this idea enables us to derive probabilistic programs from correctness proofs of their PZ specifications.
Abstract: The ever-growing use of the aspect-oriented development methodology in software engineering requires tool support for both research environments and industry. So far, tool support has been proposed for many activities in aspect-oriented software development, to automate and facilitate them. For instance, AJaTS provides a transformation system to support aspect-oriented development and refactoring. In particular, it is well established that the abstract interpretation of programs, in any paradigm, as pursued in static analysis is best served by a high-level program representation such as the control flow graph (CFG): with it, the analysis can more easily locate common programmatic idioms for which helpful transformations are already known, and the association between the input program and the intermediate representation can be maintained more closely. However, although current research defines, to some extent, sound concepts and foundations for the control flow analysis of aspect-oriented programs, it does not provide a concrete tool that can by itself construct the CFG of such programs. Furthermore, most of these works focus on other issues in Aspect-Oriented Software Development (AOSD), such as testing or data flow analysis, rather than on the CFG itself. Therefore, this study is dedicated to building an aspect-oriented control flow graph construction tool, called AJcFgraph Builder. The tool can be applied in many software engineering tasks in the context of AOSD, such as software testing and software metrics.
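The core difficulty such a tool addresses can be shown in miniature: weaving aspect advice changes the base program's control flow graph. The toy example below (not AJcFgraph Builder's actual algorithm or data structures) models "before" advice on a join point by splicing an advice node in front of the advised statement, with the CFG represented as a successor map.

```python
def splice_before_advice(cfg, join_point, advice):
    """Redirect every edge into `join_point` through a new `advice` node."""
    woven = {node: [advice if succ == join_point else succ for succ in succs]
             for node, succs in cfg.items()}
    woven[advice] = [join_point]   # advice falls through to the advised statement
    return woven

# Base-program CFG: entry -> call_service -> exit (illustrative node names).
base_cfg = {
    "entry": ["call_service"],
    "call_service": ["exit"],
    "exit": [],
}
woven = splice_before_advice(base_cfg, "call_service", "log_advice")
```

Any analysis run on the base CFG alone would miss the `log_advice` node entirely, which is why a CFG builder aware of the woven program is needed for testing and metrics.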
Abstract: Many difficulties are faced in the process of learning computer programming. This paper proposes a system framework intended to reduce cognitive load in learning programming. The first section focuses on the process of learning and the shortcomings of current approaches to learning programming. The proposed prototype is then presented along with its justification. In the proposed prototype the concept map is used as a visualization metaphor. Concept maps are similar to the mental schemas in long-term memory and hence can reduce cognitive load effectively. In addition, another method, the part-code method, is also proposed in this framework to reduce cognitive load.