Abstract: In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), programming-language capabilities such as symbolic and intuitive programming, program portability and a rich geometrical portfolio are of special importance. They allow programmers to save time, avoid errors during part programming and re-use code. Our updated literature review indicates that the current state of the art presents voids in parametric programming, program portability and programming flexibility. In response, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language that allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and in portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit re-use of programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
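EGCL's syntax is not reproduced in the abstract, but the effect of compiling a flow-control statement down to elementary ISO G-code can be sketched. The example below is purely illustrative (a hypothetical hole-drilling loop; function name, coordinates and feed rate are all assumptions, not taken from the paper):

```python
def unroll_drill_loop(x_start, x_step, count, depth):
    """Sketch: expand a high-level counted loop (as an EGCL-like 'while'
    over hole positions) into elementary ISO G-code lines."""
    gcode = []
    x = x_start
    i = 0
    while i < count:                             # flow control resolved at compile time
        gcode.append(f"G00 X{x:.3f} Y0.000")     # rapid move to hole position
        gcode.append(f"G01 Z{-depth:.3f} F100")  # feed down to drilling depth
        gcode.append("G00 Z5.000")               # retract
        x += x_step
        i += 1
    return gcode

program = unroll_drill_loop(x_start=10.0, x_step=5.0, count=3, depth=2.5)
```

A single symbolic loop thus replaces repetitive hand-written blocks, which is the kind of re-use the abstract describes.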
Abstract: In the present essay, a model of choice by actors is analysed by utilizing chaos theory to explain how change comes about. Then, using ancient and modern sources of literature, the theory of the social contract is analysed as a historical phenomenon that first appeared during the period of Classical Greece. Based on the findings of this analysis, the practice of direct democracy and public choice in ancient Athens is examined through two historical cases: the political programs of Eubulus and Lycurgus in the second half of the 4th century BC. The main finding of this research is that these policies can be interpreted as the implementation of a social contract, through which citizens took decisions based on rational choice according to economic considerations.
Abstract: We address the balancing problem of transfer lines in this paper, seeking the optimal line balancing that minimizes non-productive time. We focus on tool change time and face-orientation change time, both of which influence the makespan. We consider machine capacity limitations and technological constraints associated with the manufacturing process of automotive cylinder heads. The problem is represented by a mixed-integer programming model that aims at distributing the design features to workstations and sequencing the machining processes at minimum non-productive time. The proposed model is solved by an algorithm built on linearization schemes and a Benders decomposition approach. The experiments show the efficiency of the algorithm in reaching the exact solution of small and medium problem instances in reasonable time.
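The non-productive time being minimized can be illustrated with a toy single-station instance. The sketch below brute-forces sequences rather than using the paper's Benders decomposition, and the feature data and change times are invented for illustration:

```python
from itertools import permutations

# Toy instance: each design feature requires a (tool, face orientation) pair.
features = {
    "F1": ("T1", "A"), "F2": ("T1", "B"),
    "F3": ("T2", "B"), "F4": ("T2", "A"), "F5": ("T1", "A"),
}

def nonproductive_time(seq, tool_change=2.0, face_change=3.0):
    """Sum of tool-change and face-orientation-change times incurred
    between consecutive operations (the cost the model minimizes)."""
    cost = 0.0
    for a, b in zip(seq, seq[1:]):
        ta, fa = features[a]
        tb, fb = features[b]
        if ta != tb:
            cost += tool_change
        if fa != fb:
            cost += face_change
    return cost

best_seq = min(permutations(features), key=nonproductive_time)
best_cost = nonproductive_time(best_seq)
```

Grouping operations by tool and face orientation is what drives the cost down; the exact model additionally distributes features across workstations under capacity and precedence constraints.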
Abstract: This paper proposes a new method for analyzing textual data. The method deals with items of textual data, where each item is described from various viewpoints. It acquires 2-class classification models of the viewpoints by applying an inductive learning method to items with multiple viewpoints, and infers whether the viewpoints apply to new items by using these models. It then extracts expressions from the new items classified under each viewpoint and identifies characteristic expressions corresponding to the viewpoints by comparing expression frequencies among them. This paper also applies the method to questionnaire data given by guests at a hotel and verifies its effect through numerical experiments.
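The frequency-comparison step can be sketched with a toy example. Everything below is illustrative (the paper's actual scoring and learning method are not specified in the abstract); the score here is simply a word's frequency under one viewpoint relative to its total frequency:

```python
from collections import Counter

def characteristic_expressions(docs_by_viewpoint, top_n=2):
    """Toy sketch: expressions whose frequency under one viewpoint is
    high relative to their frequency across all viewpoints."""
    totals = Counter()
    per_vp = {}
    for vp, docs in docs_by_viewpoint.items():
        counts = Counter(w for d in docs for w in d.split())
        per_vp[vp] = counts
        totals.update(counts)
    result = {}
    for vp, counts in per_vp.items():
        scored = {w: counts[w] / totals[w] for w in counts}
        # rank by score, breaking ties alphabetically for determinism
        result[vp] = sorted(scored, key=lambda w: (-scored[w], w))[:top_n]
    return result

# Hypothetical hotel-questionnaire fragments grouped by viewpoint.
reviews = {
    "service": ["friendly staff", "helpful staff"],
    "room": ["clean room", "quiet room"],
}
chars = characteristic_expressions(reviews)
```

Words concentrated under one viewpoint score near 1.0, while words shared across viewpoints are discounted.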
Abstract: Long-term variation of solar insolation has been widely studied; however, parallel observations on short time scales are rather lacking. This paper aims to investigate the short-time-scale evolution of the solar radiation spectrum (UV, PAR and NIR bands) due to atmospheric aerosols and water vapor. A total of 25 days of global and diffuse solar spectra, ranging from air mass 2 to 6, were collected using a ground-based spectrometer with the shadowband technique. The results show that the variation of solar radiation is least in the UV fraction, followed by PAR, and greatest in NIR. The broader variations in PAR and NIR are associated with short-time-scale fluctuations of aerosol and water vapor. The corresponding daily evolution of the UV, PAR and NIR fractions implies that aerosol and water vapor variation could also be responsible for the deviation pattern in the Langley-plot analysis.
Abstract: In this article, after describing the problem and its importance, transformational leadership is studied in light of leadership theories. Issues such as the definition of transformational leadership and its aspects are compared on the basis of the ideas of various experts, and transformational leadership is then examined in successful and unsuccessful companies. Following the methodology, the method of research, hypotheses, population and statistical sample are described, and the research findings are analyzed using descriptive and inferential statistical methods in the framework of analytical tables. Finally, our conclusion is drawn from the results of the statistical tests. The final result shows that transformational leadership is significantly higher in successful companies than in unsuccessful ones.
Abstract: We report on a high-speed quantum cryptography system that utilizes simultaneous entanglement in polarization and in "time bins". With multiple degrees of freedom contributing to the secret key, we can achieve over ten bits of random entropy per detected coincidence. In addition, we collect from multiple spots on the downconversion cone to further amplify the data rate, allowing us to achieve over 10 Mbits of secure key per second.
Abstract: Effective evaluation of software development effort is an important aspect of successful project management. Based on a large database of 4,106 developed projects, this study statistically examines the factors that influence development effort. The factors found to be significant are project size, the average number of developers that worked on the project, type of development, development language, development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Unsurprisingly, this study found that the use of CASE tools does not necessarily reduce development effort, which adds support to the claim that the effect of tool use is subtle. As many of the current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for effort prediction that is both simple and more accurate than previous models.
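Parsimonious parametric effort models with size as the dominant driver are often power laws, effort = a * size^b, fitted by least squares on log-log axes. The sketch below illustrates that fitting procedure on synthetic data; it is not the paper's model, and the coefficients are invented:

```python
import math

def fit_power_law(sizes, efforts):
    """Least-squares fit of effort = a * size**b on log-log axes,
    a common shape for parsimonious effort models (illustrative only)."""
    xs = [math.log(s) for s in sizes]
    ys = [math.log(e) for e in efforts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope of the log-log regression line gives the exponent b
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)    # intercept gives the scale factor a
    return a, b

# Synthetic data generated from effort = 3 * size**1.1 (not real projects).
sizes = [10, 50, 100, 500, 1000]
efforts = [3 * s ** 1.1 for s in sizes]
a, b = fit_power_law(sizes, efforts)
```

An exponent b above 1 would indicate diseconomies of scale; on the synthetic data the fit recovers the generating coefficients exactly.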
Abstract: Software reliability, defined as the probability of a software system or application functioning without failure or error over a defined period of time, has been an important area of research for over three decades. Several research efforts aimed at developing models to improve reliability are currently underway. One of the most popular approaches to software reliability adopted by some of these efforts involves the use of operational profiles to predict how software applications will be used. Operational profiles are a quantification of usage patterns for a software application. The research presented in this paper investigates an innovative multi-agent framework for the automatic creation and management of operational profiles for generic distributed systems after their release into the market. The architecture of the proposed Operational Profile MAS (Multi-Agent System) is presented, along with detailed descriptions of the various models arrived at following the analysis and design phases of the proposed system. The operational profile in this paper is extended to comprise seven different profiles. Further, the criticality of operations is defined using a new composite metric in order to organize the testing process and to decrease the time and cost it involves. A prototype implementation of the proposed MAS is included as proof of concept, and the framework is considered a step towards making distributed systems intelligent and self-managing.
Abstract: Optimization is often a critical issue in system design problems. Evolutionary Algorithms are population-based, stochastic search techniques widely used as efficient global optimizers. However, finding the optimal solution to complex, high-dimensional, multimodal problems often requires highly computationally expensive function evaluations and hence is often practically prohibitive. The Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model presented in our earlier work [14] reduced computation time through the controlled use of meta-models to partially replace actual function evaluations with approximate ones. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. Situations such as model formation involving variable input dimensions and noisy data certainly cannot be covered by this assumption. In this paper we present an enhanced version of DAFHEA that incorporates a multiple-model based learning approach for the SVM approximator. DAFHEA-II (the enhanced version of the DAFHEA framework) also avoids the high computational expense of the additional clustering required by the original DAFHEA framework. The proposed framework has been tested on several benchmark functions, and the empirical results illustrate the advantages of the proposed technique.
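The core idea of partially replacing expensive evaluations with a meta-model can be sketched with a much simpler surrogate than DAFHEA's SVM: here inverse-distance weighting over an archive of true evaluations, with surrogate and true generations alternating. The objective, schedule and all parameters are illustrative assumptions, not the paper's configuration:

```python
import random

def true_fitness(x):
    """Expensive objective; the sphere function stands in for a costly simulation."""
    return sum(v * v for v in x)

def surrogate(x, archive):
    """Inverse-distance-weighted estimate from archived true evaluations
    (a crude stand-in for an SVM meta-model)."""
    num = den = 0.0
    for xa, fa in archive:
        d = sum((a - b) ** 2 for a, b in zip(x, xa)) ** 0.5
        if d < 1e-12:
            return fa
        num += fa / d
        den += 1.0 / d
    return num / den

def evolve(dim=3, pop_size=20, generations=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    fits = [true_fitness(p) for p in pop]
    archive = list(zip(pop, fits))           # seeded with true evaluations
    initial_best = min(fits)
    for g in range(generations):
        children = [[v + rng.gauss(0, 0.3) for v in rng.choice(pop)]
                    for _ in range(pop_size)]
        if g % 2 == 0:                       # surrogate generation: no expensive calls
            cfits = [surrogate(c, archive) for c in children]
        else:                                # true generation: evaluate and archive
            cfits = [true_fitness(c) for c in children]
            archive += list(zip(children, cfits))
        ranked = sorted(zip(fits + cfits, pop + children), key=lambda t: t[0])
        fits = [f for f, _ in ranked[:pop_size]]
        pop = [p for _, p in ranked[:pop_size]]
    return initial_best, pop

initial_best, final_pop = evolve()
final_best = min(true_fitness(p) for p in final_pop)
```

Only half the generations pay for true evaluations, which is the computational saving the controlled use of meta-models is meant to deliver.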
Abstract: Mean-velocity measurements have been taken in the wake of a single circular cylinder, for three different diameters and two different velocities. The effects of changes in diameter and in velocity are studied in a self-similar coordinate system. The spatial variations of the velocity defect and of the half-width have been investigated, and the results are compared with those published by H. Schlichting. In the normalized coordinates, all cases except the first station are observed to be self-similar. Examination of the self-similar mean-velocity profiles shows that, for all cases, the curves at each station tend to zero at the same point.
Abstract: The use of electronic sensors in the electronics industry has become increasingly popular over the past few years, and they have become highly competitive products. The frequency adjustment process is regarded as one of the most important processes in electronic sensor manufacturing. Inaccuracies in the frequency adjustment process can cause up to 80% waste through rework; therefore, this study aims to provide a preliminary understanding of the role of the parameters used in the frequency adjustment process and to make suggestions for further improving performance. Four parameters are considered: air pressure, dispensing time, vacuum force, and the distance between the needle tip and the product. A 2^k full factorial design was used to determine the parameters that significantly affect the accuracy of the frequency adjustment process, where the deviation between the frequency after adjustment and the target frequency is expected to be 0 kHz. The experiment was conducted on two levels, using two replications and with five center points added; in total, 37 experiments were carried out. The results reveal that air pressure and dispensing time significantly affect the frequency adjustment process. The mathematical relationship between these two parameters was formulated, and the optimal settings for air pressure and dispensing time were found to be 0.45 MPa and 458 ms, respectively. The optimal parameters were examined in a confirmation experiment, in which an average deviation of 0.082 kHz was achieved.
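The run count follows directly from the layout: 2^4 corner runs x 2 replications + 5 center points = 37. A minimal sketch of generating the coded design matrix (the factor names are the study's four parameters; coded levels -1/+1, center points at 0):

```python
from itertools import product

def full_factorial_2k(factors, replications=2, center_points=5):
    """Generate a coded 2**k full factorial design: every -1/+1 corner
    combination, replicated, plus center-point runs at level 0."""
    corners = list(product((-1, 1), repeat=len(factors)))
    runs = [dict(zip(factors, levels)) for levels in corners] * replications
    runs += [dict.fromkeys(factors, 0) for _ in range(center_points)]
    return runs

factors = ["air_pressure", "dispensing_time", "vacuum_force", "needle_distance"]
design = full_factorial_2k(factors)   # 2**4 * 2 + 5 = 37 runs
```

The center points allow a check for curvature in the response, which the two-level corner runs alone cannot detect.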
Abstract: This research aims to study the quality of surface water for consumers in Samut Songkram province. Water samples were collected from 217 sampling sites, comprising 72 sites in Amphawa, 67 in Bangkhonthee and 65 in Muang. Samples were collected in December 2011 for winter, March 2012 for summer and August 2012 for the rainy season. From the investigation of surface water quality in the Mae Klong River and the main and tributary canals in Samut Songkram province, we found that the water quality meets Type III of the surface water quality standard issued by the National Environmental Quality Act B.E. 2535 (1992). Seasonal variations of pH, temperature, nitrate, lead and cadmium show statistically significant differences among the three seasons.
Abstract: The number of electronic participation (eParticipation) projects introduced by different governments and international organisations is considerable and increasing. In order to gain an overview of the development of these projects, various evaluation frameworks have been proposed. In this paper, a five-level participation model that takes into account the advantages of the Social Web, or Web 2.0, is presented, together with a quantitative approach for the evaluation of eParticipation projects. Each participation level is evaluated independently, taking into account three main components: Web evolution, media richness, and communication channels. This paper presents the evaluation of a number of existing Voting Advice Applications (VAAs). The results provide an overview of the main features implemented by each project, their strengths and weaknesses, and the participation levels reached.
Abstract: Intrusion Detection Systems are increasingly a key part of system defense. Various approaches to intrusion detection are currently being used, but they are relatively ineffective. Artificial Intelligence plays a driving role in security services. This paper proposes a dynamic Intelligent Intrusion Detection System model based on a specific AI approach to intrusion detection. The techniques investigated include neural networks and fuzzy logic with network profiling, using simple data mining techniques to process the network data. The proposed system is a hybrid that combines anomaly, misuse and host-based detection. Simple fuzzy rules allow us to construct if-then rules that reflect common ways of describing security attacks. For host-based intrusion detection we use neural networks along with self-organizing maps. Suspicious intrusions can be traced back to their original source path, and any future traffic from that source will be redirected back to it. Both network traffic and system audit data are used as inputs.
Abstract: The removal efficiency of 4-chlorophenol with different advanced oxidation processes has been studied. Oxidation experiments were carried out using two 4-chlorophenol concentrations (100 mg L-1 and 250 mg L-1) and UV generated from a KrCl excilamp, with (molar ratio H2O2:4-chlorophenol = 25:1) and without H2O2, and with the Fenton process (molar ratio H2O2:4-chlorophenol of 25:1 and Fe2+ concentration of 5 mg L-1). The results show that there is no significant difference in 4-chlorophenol conversion among the three assayed methods. However, significant concentrations of the photoproducts remained in the media when the chosen treatment involved UV without hydrogen peroxide. The Fenton process removed all the intermediate photoproducts except hydroquinone and 1,2,4-trihydroxybenzene; in the case of UV with hydrogen peroxide, all the intermediate photoproducts were removed. Microbial bioassays were carried out utilising the naturally luminescent bacterium Vibrio fischeri and a genetically modified Pseudomonas putida isolated from a waste treatment plant receiving phenolic waste. The results using V. fischeri show that, among the samples after degradation, only the UV treatment exhibited toxicity (IC50 = 38), whereas with the H2O2 and Fenton reactions the samples exhibited no toxicity after treatment in the range of concentrations studied. Using the Pseudomonas putida biosensor, no toxicity could be detected in any of the samples following treatment, owing to the organism's higher tolerance of the phenol concentrations encountered.
Abstract: In the present work, the behavior of stainless steel as reinforcement bar in composite concrete is investigated. The bar-concrete adherence in a reinforced concrete (RC) beam is studied, with a focus on the tension stiffening parameter. This study highlights an approach for observing this interaction behavior in a bending test instead of the direct tension test reported in many references; the approach resembles the actual loading condition of a structural RC beam. The tension stiffening properties are then applied in a numerical finite element analysis (FEA) to verify their correlation with laboratory results, and the comparison shows good agreement between the two. The experimental setup is able to determine tension stiffening parameters in an RC beam, and the modeling strategies adopted in ABAQUS can closely represent the actual condition. The tension stiffening model used can represent the interaction properties between stainless steel and concrete.
Abstract: Prior to the use of the detectors, a characteristics comparison study was performed and a baseline established. In patient-specific QA, the portal dosimetry mean values of area gamma, average gamma and maximum gamma were 1.02, 0.31 and 1.31, with standard deviations of 0.33, 0.03 and 0.14, for IMRT; the corresponding values were 1.58, 0.48 and 1.73, with standard deviations of 0.31, 0.06 and 0.66, for VMAT. With the ImatriXX 2-D array system, on average 99.35% of the pixels passed the 3%-3 mm gamma criteria, with a standard deviation of 0.24, for dynamic IMRT; for VMAT, the average value was 98.16% with a standard deviation of 0.86. The results showed that both systems can be used in patient-specific QA measurements for IMRT and VMAT. The values obtained with the portal dosimetry system were found to be relatively more consistent than those obtained with the ImatriXX 2-D array system.
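The 3%-3 mm gamma criterion behind the quoted pass rates can be sketched in one dimension. This is a simplified global-gamma search over all evaluated points (real analyses interpolate between detector positions and operate in 2-D); profile values and spacing below are illustrative:

```python
def gamma_pass_rate(ref, eval_, spacing_mm=1.0, dose_tol=0.03, dist_mm=3.0):
    """1-D gamma analysis sketch (3% dose difference / 3 mm distance to
    agreement). ref and eval_ are dose profiles sampled every spacing_mm."""
    dmax = max(ref)                          # global normalization dose
    passed = 0
    for i, dr in enumerate(ref):
        # gamma^2 = min over evaluated points of the combined criterion
        gamma2 = min(
            ((de - dr) / (dose_tol * dmax)) ** 2
            + ((j - i) * spacing_mm / dist_mm) ** 2
            for j, de in enumerate(eval_)
        )
        if gamma2 <= 1.0:
            passed += 1
    return 100.0 * passed / len(ref)

# Identical measured and planned profiles should pass everywhere.
ref = [0.0, 10.0, 40.0, 80.0, 100.0, 80.0, 40.0, 10.0, 0.0]
rate = gamma_pass_rate(ref, ref)
```

A point passes if some evaluated point lies within the combined dose-difference and distance-to-agreement ellipsoid, which is why gamma tolerates small spatial shifts in high-gradient regions.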
Abstract: In this paper, the action-research-driven design of a context-relevant, developmental peer review of teaching model, its implementation strategy and its impact at an Australian university are presented. PRO-Teaching realizes an innovative process that triangulates contemporaneous teaching quality data from a range of stakeholders, including students, discipline academics, learning and teaching expert academics, and teacher reflection, to create reliable evidence of teaching quality. Data collected over multiple classroom observations allow objective reporting on development differentials in constructive alignment and in peer and student evaluations. Further innovation is realized in applying this highly structured developmental process to provide summative evidence of sufficient validity to support claims for professional advancement and learning and teaching awards. Design decision points and contextual triggers are described within the operating domain. Academics and developers seeking to introduce structured peer review of teaching into their organization will find this paper a useful reference.
Abstract: As part of the development of a numerical method for close-capture exhaust systems for machining devices, a test rig recreating a situation similar to a grinding operation, but in a perfectly controlled environment, is used. The properties of the resulting spray of solid particles are first characterized using particle tracking velocimetry (PTV), in order to obtain input and validation parameters for numerical simulations. The dispersion of a tracer gas (SF6) emitted simultaneously with the particle jet is then studied experimentally, as the dispersion of such a gas is representative of that of finer particles, whose aerodynamic response time is negligible. Finally, complete modeling of the test rig is carried out to allow comparison with experimental results and thus to progress towards validation of the models used to describe a two-phase flow generated by a machining operation.