Abstract: The study deals with the modelling of gas flow during heliox therapy. A dedicated model has been developed to study the effect of helium on gas flow in the airways during spontaneous breathing. The lower density of helium compared with air decreases the Reynolds number, improving the flow during spontaneous breathing. In cases where the flow becomes turbulent when the patient inspires air, it remains laminar when the patient inspires heliox. The use of heliox decreases the work of breathing and improves ventilation, and in some cases makes it possible to avoid intubation of the patient.
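The density argument above can be illustrated numerically. The sketch below compares Reynolds numbers for air and an 80/20 heliox mixture in an airway; the gas properties, velocity, and airway diameter are rough illustrative figures, not values from the study.

```python
# Illustrative comparison of Reynolds numbers for air vs. heliox (80% He / 20% O2).
# Property values and airway dimensions are rough textbook-style figures chosen
# for illustration, not data from the study.

def reynolds(density, velocity, diameter, viscosity):
    """Re = rho * v * D / mu (dimensionless)."""
    return density * velocity * diameter / viscosity

v = 1.0      # m/s, assumed mean gas velocity in the trachea
d = 0.018    # m, assumed tracheal diameter

re_air = reynolds(1.20, v, d, 1.8e-5)     # air: ~1.20 kg/m^3, ~1.8e-5 Pa*s
re_heliox = reynolds(0.48, v, d, 2.1e-5)  # heliox: ~0.48 kg/m^3, ~2.1e-5 Pa*s

print(f"Re(air)    = {re_air:.0f}")
print(f"Re(heliox) = {re_heliox:.0f}")
```

With these assumed figures, heliox lowers the Reynolds number roughly threefold, which is why a flow that is turbulent on air can stay laminar on heliox.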
Abstract: Recent satellite projects and programs make extensive use of real-time embedded systems. 16-bit processors that meet the Mil-Std-1750 standard architecture have been used in on-board systems, and most space applications have been written in Ada. Looking ahead, 32-bit and 64-bit processors are needed in the area of spacecraft computing, so an effort to study and survey 64-bit architectures for space applications is desirable. This will also result in significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada).
There are several basic requirements for a special processor for this purpose. They include radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, higher memory and I/O bandwidth, predictability, a real-time operating system, and manufacturability. Further considerations include the selection of FPGA devices, selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, timing analysis, etc.
This project deals with a brief study of 32- and 64-bit processors readily available in the market and with designing and fabricating a 64-bit RISC processor, named RISC MicroProcessor, with the added functionality of an extended double-precision floating-point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using an open core (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools such as Xilinx ISE are also used for synthesis where appropriate.
Abstract: In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), the capabilities of programming languages, such as symbolic and intuitive programming, program portability and geometrical portfolio, have special importance. They save time, help avoid errors during part programming, and permit code reuse. Our updated literature review indicates that the current state of the art presents voids in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language that allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit reuse of the programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
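The kind of translation such a compiler performs can be sketched as follows: a named variable and a while loop are compiled away into elementary, ISO-compliant G-code moves. The function name, parameters, and loop structure here are hypothetical illustrations, not actual EGCL syntax.

```python
# Minimal sketch of high-level-to-G-code expansion, in the spirit of EGCL:
# flow control and variables exist only at compile time; the output is a
# flat sequence of elementary G-code commands. All names are hypothetical.

def drill_row(start_x, y, spacing, count, depth):
    """Expand a parametric row of drilling cycles into plain G-code lines."""
    lines = []
    x, i = start_x, 0
    while i < count:                            # loop is unrolled at compile time
        lines.append(f"G0 X{x:.3f} Y{y:.3f}")   # rapid move to hole position
        lines.append(f"G1 Z{-depth:.3f} F100")  # feed down to drilling depth
        lines.append("G0 Z5.000")               # retract
        x += spacing
        i += 1
    return lines

program = drill_row(start_x=10.0, y=20.0, spacing=5.0, count=3, depth=2.0)
print("\n".join(program))
```

The generated program contains only G0/G1 moves with literal coordinates, so any ISO-compliant controller can execute it regardless of whether it supports variables or loops itself.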
Abstract: In the present essay, a model of choice by actors is analysed by utilizing chaos theory to explain how change comes about. Then, using ancient and modern sources of literature, the theory of the social contract is analysed as a historical phenomenon that first appeared during the period of Classical Greece. Based on the findings of this analysis, the practice of direct democracy and public choice in ancient Athens is then analysed through two historical cases: the political programs of Eubulus and Lycurgus in the second half of the 4th century BC. The main finding of this research is that these policies can be interpreted as an implementation of a social contract through which citizens took decisions based on rational choice according to economic considerations.
Abstract: In this paper we address the balancing problem of transfer lines, seeking the line balance that minimizes non-productive time. We focus on the tool change time and the face orientation change time, both of which influence the makespan. We consider machine capacity limitations and technological constraints associated with the manufacturing process of automotive cylinder heads. The problem is represented by a mixed-integer programming model that distributes the design features to workstations and sequences the machining processes at minimum non-productive time. The proposed model is solved by an algorithm based on linearization schemes and a Benders decomposition approach. Experiments show the efficiency of the algorithm in reaching the exact solution of small and medium problem instances in reasonable time.
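The sequencing aspect of the problem can be illustrated on a toy instance: order machining operations so that the number of tool changes, a proxy for non-productive time, is minimized. The feature/tool data are invented, and exhaustive search stands in for the paper's mixed-integer program with linearization and Benders decomposition.

```python
from itertools import permutations

# Toy illustration: each feature needs one tool; every switch between
# consecutive tools costs non-productive time. Exhaustive search over
# sequences finds the minimum-change order (feasible only at toy scale).

ops = {"f1": "T1", "f2": "T2", "f3": "T1", "f4": "T3", "f5": "T2"}

def tool_changes(seq):
    """Count tool switches incurred by processing features in this order."""
    tools = [ops[f] for f in seq]
    return sum(1 for a, b in zip(tools, tools[1:]) if a != b)

best = min(permutations(ops), key=tool_changes)
print(best, tool_changes(best))  # grouping same-tool features gives 2 changes
```

With two T1 features, two T2 features, and one T3 feature, any order that groups features by tool incurs exactly two changes, which is the optimum this search recovers.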
Abstract: This paper proposes a new method for analyzing textual data. The method deals with items of textual data, each of which is described from various viewpoints. It acquires two-class classification models of the viewpoints by applying an inductive learning method to items with multiple viewpoints, and uses these models to infer whether the viewpoints apply to new items. It then extracts expressions from the new items classified into each viewpoint and identifies characteristic expressions by comparing the frequency of expressions among the viewpoints. The paper also applies the method to questionnaire data given by guests at a hotel and verifies its effectiveness through numerical experiments.
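The frequency-comparison step can be sketched with toy data: expressions that occur markedly more often in items assigned to one viewpoint than in the others are taken as characteristic of that viewpoint. The toy "reviews", viewpoint names, and the ratio threshold below are invented for illustration.

```python
from collections import Counter

# Sketch of characteristic-expression extraction by frequency comparison.
# A word is characteristic of a viewpoint if its relative frequency there
# exceeds its (smoothed) relative frequency in all other viewpoints combined.

docs = {
    "service": ["friendly helpful staff", "staff were friendly", "helpful front desk"],
    "room":    ["clean quiet room", "room was clean", "quiet comfortable room"],
}

counts = {v: Counter(w for d in items for w in d.split()) for v, items in docs.items()}

def characteristic(view, min_ratio=2.0):
    """Words whose relative frequency in `view` exceeds that elsewhere by `min_ratio`."""
    other = Counter()
    for v, c in counts.items():
        if v != view:
            other.update(c)
    total, other_total = sum(counts[view].values()), sum(other.values())
    return sorted(w for w, n in counts[view].items()
                  if (n / total) >= min_ratio * ((other[w] + 1) / (other_total + 1)))

print(characteristic("room"))  # words typical of "room" items
```

The add-one smoothing in the denominator keeps words that never appear in the other viewpoints from producing a division-related blow-up while still letting them qualify.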
Abstract: A high-energy dual-wavelength extracavity KTA optical parametric oscillator (OPO) with excellent stability and beam quality, pumped by a Q-switched single-longitudinal-mode Nd:YAG laser, has been demonstrated based on a type II noncritical phase matching (NCPM) KTA crystal. A maximum pulse energy of 10.2 mJ with output stability better than 4.1% rms at 3.467 μm is obtained at a repetition rate of 10 Hz with a pulse width of 2 ns, and 11.9 mJ of 1.535 μm radiation is obtained simultaneously. This extracavity NCPM KTA OPO is very useful when high energy, high beam quality and a smooth temporal profile are needed.
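The two reported wavelengths are consistent with photon-energy conservation in an OPO, 1/λ_pump = 1/λ_signal + 1/λ_idler. The quick check below assumes the standard 1.064 μm line of the Nd:YAG pump, which the abstract implies but does not state.

```python
# Photon-energy conservation in an OPO: 1/lambda_pump = 1/lambda_signal + 1/lambda_idler.
# With a 1.064 um Nd:YAG pump and the reported 1.535 um signal, the implied
# idler wavelength can be compared with the reported 3.467 um.

def idler_wavelength(pump_um, signal_um):
    """Idler wavelength (um) implied by energy conservation."""
    return 1.0 / (1.0 / pump_um - 1.0 / signal_um)

idler = idler_wavelength(1.064, 1.535)
print(f"expected idler: {idler:.3f} um")  # agrees with the reported 3.467 um to within rounding
```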
Abstract: Long-term variations of solar insolation have been widely studied; however, parallel observations on short time scales are rather lacking. This paper investigates the short-time-scale evolution of the solar radiation spectrum (UV, PAR, and NIR bands) due to atmospheric aerosols and water vapor. A total of 25 days of global and diffuse solar spectra, ranging from air mass 2 to 6, were collected using a ground-based spectrometer with the shadowband technique. The results show that the variation of solar radiation is least in the UV fraction, followed by PAR, and greatest in NIR. The broader variations in PAR and NIR are associated with short-time-scale fluctuations of aerosols and water vapor. The corresponding daily evolution of the UV, PAR, and NIR fractions implies that aerosol and water vapor variations could also be responsible for the deviation pattern in the Langley-plot analysis.
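The Langley-plot analysis referred to above regresses ln(I) against air mass m (Beer-Lambert attenuation, ln I = ln I0 − τ·m) and extrapolates the intercept to the top-of-atmosphere signal I0. The sketch below uses synthetic data generated with known τ and I0; a τ that drifts during the measurement, as aerosol and water vapor fluctuations cause, would bend this line and bias the extrapolated I0.

```python
import math

# Langley-plot sketch: fit ln(I) vs. air mass by ordinary least squares and
# recover the optical depth tau (slope) and extraterrestrial signal I0
# (intercept). Data are synthetic, generated with tau = 0.3 and I0 = 1000.

m_values = [2.0, 3.0, 4.0, 5.0, 6.0]
intensities = [1000.0 * math.exp(-0.3 * m) for m in m_values]

n = len(m_values)
xbar = sum(m_values) / n
ybar = sum(math.log(i) for i in intensities) / n
slope = sum((m - xbar) * (math.log(i) - ybar)
            for m, i in zip(m_values, intensities)) / sum((m - xbar) ** 2 for m in m_values)
i0 = math.exp(ybar - slope * xbar)

print(f"tau = {-slope:.3f}, I0 = {i0:.1f}")  # recovers tau = 0.3, I0 = 1000
```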
Abstract: Crosstalk is the major limiting issue in very-high-bit-rate digital subscriber line (VDSL) systems in terms of bit rate and service coverage. At the central office, joint signal processing combined with appropriate power allocation enables complex multiuser processors to provide near-capacity rates. Unfortunately, complexity grows with the square of the number of lines within a binder. Since only a few dominant crosstalkers contribute the main part of the crosstalk power, the canceller structure can be simplified, resulting in much lower run-time complexity. In this paper, a multiuser power control scheme, namely iterative waterfilling, is combined with previously proposed partial crosstalk cancellation approaches, and the resulting performance, the best achieved so far, is verified by simulation results.
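Iterative waterfilling can be sketched for two users sharing a few tones: each user in turn waterfills its power budget against the background noise plus the crosstalk from the other user's current allocation, and iterating this converges to a competitive equilibrium. The noise levels, coupling gain, and power budget below are invented illustration values.

```python
# Two-user iterative waterfilling sketch. Each user repeatedly solves a
# single-user waterfilling problem treating the other's crosstalk as noise.

K = 4
noise = [1.0, 0.5, 0.2, 0.1]           # background noise per tone (assumed)
xtalk = 0.05                           # crosstalk coupling gain (assumed)
budget = 4.0                           # per-user total transmit power

def waterfill(cost, budget):
    """Allocate `budget` so that p_k = max(0, mu - cost_k) sums to `budget`."""
    lo, hi = 0.0, max(cost) + budget
    for _ in range(100):               # bisection on the water level mu
        mu = (lo + hi) / 2
        if sum(max(0.0, mu - c) for c in cost) > budget:
            hi = mu
        else:
            lo = mu
    return [max(0.0, mu - c) for c in cost]

p = [[budget / K] * K for _ in range(2)]
for _ in range(20):                    # round-robin best responses
    for u in range(2):
        cost = [noise[k] + xtalk * p[1 - u][k] for k in range(K)]
        p[u] = waterfill(cost, budget)

print([round(x, 3) for x in p[0]])     # more power goes to the cleanest tones
```

With weak coupling the iteration settles quickly; each user ends up pouring most power into its low-noise tones, which is the behavior the combined power-control-plus-partial-cancellation scheme exploits.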
Abstract: Clustering algorithms help to uncover the hidden information present in datasets. A dataset may contain intrinsic and nested clusters, the detection of which is of utmost importance. This paper presents a distributed grid-based density clustering algorithm capable of identifying arbitrarily shaped embedded clusters as well as multi-density clusters over large spatial datasets. For handling massive datasets, we implemented our method using a shared-nothing architecture in which multiple computers are interconnected over a network. Experimental results are reported to establish the superiority of the technique in terms of scale-up and speed-up as well as cluster quality.
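The core grid-based density idea can be sketched on a single machine: points are hashed into cells of a uniform grid, cells holding at least a threshold number of points are marked dense, and adjacent dense cells are merged into clusters by flood fill. The cell size, threshold, and toy points are assumptions for illustration; the paper's contribution is distributing this grid across the nodes of a shared-nothing cluster.

```python
from collections import deque

# Minimal single-machine sketch of grid-based density clustering.

def grid_cluster(points, cell=1.0, min_pts=2):
    cells = {}
    for x, y in points:                            # hash points into grid cells
        cells.setdefault((int(x // cell), int(y // cell)), []).append((x, y))
    dense = {c for c, pts in cells.items() if len(pts) >= min_pts}
    clusters, seen = [], set()
    for start in dense:
        if start in seen:
            continue
        comp, queue = [], deque([start])
        seen.add(start)
        while queue:                               # flood-fill adjacent dense cells
            cx, cy = queue.popleft()
            comp.extend(cells[(cx, cy)])
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    nb = (cx + dx, cy + dy)
                    if nb in dense and nb not in seen:
                        seen.add(nb)
                        queue.append(nb)
        clusters.append(comp)
    return clusters

pts = [(0.1, 0.2), (0.4, 0.3), (1.2, 0.4), (5.0, 5.1), (5.2, 5.3), (9.9, 9.9)]
print(len(grid_cluster(pts)))  # two dense regions; sparse cells are treated as noise
```

Because cluster membership depends only on a cell and its neighbors, the grid partitions naturally across machines, with only boundary cells needing exchange, which is what makes the shared-nothing distribution workable.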
Abstract: In this paper, by using the continuation theorem of coincidence degree theory and M-matrix theory, and by constructing suitable Lyapunov functions, some sufficient conditions are obtained for the existence and global exponential stability of periodic solutions of recurrent neural networks with distributed delays and impulses on time scales. Without assuming the boundedness of the activation functions g_j, h_j, these results are less restrictive than those given in earlier references.
Abstract: Effective estimation of software development effort is an important aspect of successful project management. Based on a large database of 4106 completed projects, this study statistically examines the factors that influence development effort. The factors found to be significant are project size, the average number of developers who worked on the project, type of development, development language, development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Unsurprisingly, this study found that the use of CASE tools does not necessarily reduce development effort, which adds support to the claim that the effect of tool use is subtle. As many current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for effort prediction that is both simple and more accurate than previous models.
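Parsimonious parametric effort models of this kind are commonly of power-law form, effort = a · size^b, fitted by least squares in log-log space. The sketch below assumes that form and uses synthetic data; it is not the actual model or dataset of this study.

```python
import math

# Fit effort = a * size^b by ordinary least squares on (log size, log effort).
# Sizes (e.g. function points) and efforts (person-hours) are synthetic.

sizes = [100, 200, 400, 800, 1600]
efforts = [500, 950, 1800, 3400, 6500]

xs = [math.log(s) for s in sizes]
ys = [math.log(e) for e in efforts]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
a = math.exp(ybar - b * xbar)

print(f"effort ~ {a:.2f} * size^{b:.3f}")  # exponent near 1 for this toy data
```

An exponent b below 1 would indicate economies of scale and above 1 diseconomies; which regime holds is exactly the kind of question the paper's statistical analysis addresses.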
Abstract: Software reliability, defined as the probability of a
software system or application functioning without failure or errors
over a defined period of time, has been an important area of research
for over three decades. Several research efforts aimed at developing
models to improve reliability are currently underway. One of the
most popular approaches to software reliability adopted by some of
these research efforts involves the use of operational profiles to
predict how software applications will be used. Operational profiles
are a quantification of usage patterns for a software application. The
research presented in this paper investigates an innovative multi-agent framework for the automatic creation and management of operational profiles for generic distributed systems after their release
into the market. The architecture of the proposed Operational Profile
MAS (Multi-Agent System) is presented along with detailed
descriptions of the various models arrived at following the analysis
and design phases of the proposed system. The operational profile in
this paper is extended to comprise seven different profiles. Further, the criticality of operations is defined using a new composite metric, in order to organize the testing process and to reduce its time and cost. A prototype implementation of the
proposed MAS is included as proof-of-concept and the framework is
considered as a step towards making distributed systems intelligent
and self-managing.
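Since the abstract defines an operational profile as a quantification of usage patterns, the basic artifact the agents maintain can be sketched directly: normalize operation counts from a usage log into occurrence probabilities, against which testing effort is then allocated. The log entries below are invented for illustration.

```python
from collections import Counter

# Minimal operational-profile sketch: the relative frequency of each
# operation observed in field usage.

log = ["login", "search", "search", "checkout", "search", "login", "browse"]

def operational_profile(events):
    """Map each operation to its fraction of total observed usage."""
    counts = Counter(events)
    total = len(events)
    return {op: n / total for op, n in counts.items()}

profile = operational_profile(log)
print(profile)  # 'search' dominates this toy log
```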
Abstract: Optimization is often a critical issue for most system
design problems. Evolutionary Algorithms are population-based,
stochastic search techniques, widely used as efficient global
optimizers. However, finding the optimal solution to complex, high-dimensional, multimodal problems often requires computationally expensive function evaluations and can be practically prohibitive. The Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model presented in our earlier work [14] reduced computation time through the controlled use of meta-models that partially replace actual function evaluations with approximate ones. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. Situations such as model formation involving variable input dimensions and noisy data cannot be covered by this assumption. In this paper we
present an enhanced version of DAFHEA that incorporates a
multiple-model based learning approach for the SVM approximator.
DAFHEA-II (the enhanced version of the DAFHEA framework) also
overcomes the high computational expense involved with additional
clustering requirements of the original DAFHEA framework. The
proposed framework has been tested on several benchmark functions
and the empirical results illustrate the advantages of the proposed
technique.
Abstract: The article aims to provide information on the scope and level of use of talent management by organizations in one of the Czech Republic's regions, the Moravian-Silesian Region. On the basis of data acquired through a questionnaire survey, it was found that organizations in this region implement talent management only on a small scale: the approach is used by just 3.8% of organizations, i.e. 9 of the 237 respondents approached. The main reasons the approach is not used are either that organizations have no knowledge of it or that they lack financial and personnel resources. The article concludes with the author's recommendations for a wider application of talent management in Czech practice.
Abstract: The use of electronic sensors in the electronics industry has become increasingly popular in recent years, and sensors have become highly competitive products. The frequency adjustment process is regarded as one of the most important processes in electronic sensor manufacturing. Inaccuracies in the frequency adjustment process can cause up to 80% waste through rework; this study therefore aims to provide a preliminary understanding of the role of the parameters used in the frequency adjustment process and to make suggestions for further improving performance. Four parameters are considered: air pressure, dispensing time, vacuum force, and the distance between the needle tip and the product. A 2^k full factorial design was used to determine the parameters that significantly affect the accuracy of the frequency adjustment process, where the deviation between the adjusted frequency and the target frequency is expected to be 0 kHz. The experiment was conducted at two levels, with two replications and five added center points.
In total, 37 experiments were carried out. The results reveal that air
pressure and dispensing time significantly affect the frequency
adjustment process. The mathematical relationship between these
two parameters was formulated, and the optimal parameters for air
pressure and dispensing time were found to be 0.45 MPa and 458 ms,
respectively. The optimal parameters were examined by carrying out
a confirmation experiment in which an average deviation of 0.082
kHz was achieved.
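The effect estimation in a 2^k full factorial design can be sketched with coded −1/+1 levels. The two factors mirror the significant ones found in the study (air pressure and dispensing time), but the response values below are invented for illustration.

```python
# Main-effect and interaction estimation for a 2^2 factorial design with
# coded levels. Each effect is a contrast-weighted average of the responses.

runs = {(-1, -1): 10.0, (+1, -1): 14.0, (-1, +1): 11.0, (+1, +1): 15.0}

def effect(contrast):
    """Average response difference along a -1/+1 contrast over all runs."""
    return sum(contrast(levels) * y for levels, y in runs.items()) / (len(runs) / 2)

effect_a = effect(lambda l: l[0])           # factor A main effect (e.g. air pressure)
effect_b = effect(lambda l: l[1])           # factor B main effect (e.g. dispensing time)
effect_ab = effect(lambda l: l[0] * l[1])   # A x B interaction

print(effect_a, effect_b, effect_ab)  # 4.0 1.0 0.0
```

Effects much larger than the run-to-run noise (judged against the replications and center points) are the ones declared significant, as air pressure and dispensing time were here.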
Abstract: This paper studies ruin probabilities in two discrete-time risk models in which premiums, claims and rates of interest are modelled by three autoregressive moving-average processes. Generalized Lundberg inequalities for the ruin probabilities are derived using a recursive technique. A numerical example illustrates the application of these probability inequalities.
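For context, the classical Lundberg inequality, which bounds like these generalize to dependent premiums, claims and interest, is an exponential bound in the initial surplus:

```latex
\psi(u) \;\le\; e^{-R u}, \qquad u \ge 0,
```

where $\psi(u)$ is the ruin probability with initial surplus $u$ and the adjustment coefficient $R > 0$ is the positive root of $\mathbb{E}\!\left[e^{r(X - c)}\right] = 1$, with $X$ the aggregate claim in one period and $c$ the premium. The recursive technique of the paper yields analogous exponential bounds under the ARMA dependence structure.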
Abstract: The Proton Exchange Membrane Fuel Cell (PEMFC) control system has an important effect on cell operation. Traditional controllers cannot achieve acceptable responses because of the time-varying, long-hysteresis, uncertain, strongly coupled and nonlinear characteristics of PEMFCs, so an intelligent or adaptive controller is needed. In this paper a neural-network predictive controller has been designed to control the cell voltage in the presence of temperature fluctuations. The designed NN predictive controller has been applied to a dynamic electrochemical model of a small 5 kW PEM fuel cell and simulated in MATLAB/Simulink.
Abstract: This research studies the quality of surface water for consumers in Samut Songkram province. Water samples were collected from 217 sampling sites: 72 in Amphawa, 67 in Bangkhonthee and 65 in Muang. Samples were collected in December 2011 for winter, March 2012 for summer and August 2012 for the rainy season. From the investigation of surface water quality in the Mae Klong River and the main and tributary canals of Samut Songkram province, we found that the water quality meets the Type III surface water quality standard issued under the National Environmental Quality Act B.E. 2535 (1992). Seasonal variations of pH, temperature, nitrate, lead and cadmium show statistically significant differences among the three seasons.
Abstract: The number of electronic participation (eParticipation) projects introduced by different governments and international organisations is considerable and increasing. In order to gain an overview of the development of these projects, various evaluation frameworks have been proposed. In this paper, a five-level participation model that takes into account the advantages of the Social Web, or Web 2.0, is presented together with a quantitative approach for the evaluation of eParticipation projects. Each participation level is evaluated independently, taking into account three main components: Web evolution, media richness, and communication channels. This paper presents the evaluation of a number of existing Voting Advice Applications (VAAs). The results provide an overview of the main features implemented by each project, their strengths and weaknesses, and the participation levels reached.