Abstract: Although many methods for ranking fuzzy numbers have been discussed broadly, most of them suffer from shortcomings such as complicated calculations, inconsistency with human intuition, and a lack of discriminative power. The motivation of this study is to develop a model for ranking fuzzy numbers based on lexicographical ordering, which provides decision-makers with a simple and efficient algorithm to generate an ordering founded on a precedence. The main emphasis here is put on ease of use and reliability. The effectiveness of the proposed method is finally demonstrated by a comprehensive comparison of different ranking methods with the present one.
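The idea of an ordering founded on a precedence of features can be sketched in a few lines. The particular features below (centroid first, then mode, then spread) are an illustrative assumption, not necessarily the precedence used in the paper.

```python
# Sketch of lexicographical ranking for triangular fuzzy numbers (a, b, c),
# where b is the modal point and [a, c] the support. The feature order
# (centroid, then mode, then spread) is an illustrative assumption.

def features(tfn):
    a, b, c = tfn
    centroid = (a + b + c) / 3.0   # first-priority feature
    mode = b                        # tie-breaker 1
    spread = c - a                  # tie-breaker 2 (smaller = less vague)
    return (centroid, mode, -spread)

def rank(fuzzy_numbers):
    # Python's tuple comparison is exactly lexicographical ordering,
    # so sorting by the feature tuple yields the desired precedence.
    return sorted(fuzzy_numbers, key=features, reverse=True)

A = (1, 2, 3)
B = (0, 2, 4)   # same centroid and mode as A, but larger spread
print(rank([B, A]))  # A ranks above B: the tie is broken by smaller spread
```

Because the comparison is lexicographical, no weighted aggregation of features is needed: the first feature that differs decides the order, which keeps the calculation simple.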
Abstract: In the last two decades radiofrequency ablation (RFA)
has been considered a promising medical procedure for the treatment
of primary and secondary malignancies. However, the needle-based
electrodes so far developed for this kind of treatment are not suitable
for the thermal ablation of tumors located in hollow organs like
esophagus, colon or bile duct. In this work a tubular electrode
solution is presented. Numerical and experimental analyses were
performed to characterize the volume of the lesion induced. Results
show that this kind of electrode is a feasible solution and that numerical
simulation might provide a tool for planning RFA procedures with
some accuracy.
Abstract: The study deals with the modelling of gas flow during heliox therapy. A special model has been developed to study the effect of helium upon the gas flow in the airways during spontaneous breathing. The lower density of helium compared with air decreases the Reynolds number, which improves the flow during spontaneous breathing. In cases where the flow becomes turbulent when the patient inspires air, it remains laminar when the patient inspires heliox. The use of heliox decreases the work of breathing and improves ventilation, and in some cases makes it possible to avoid intubating the patient.
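The Reynolds-number argument can be checked with a back-of-envelope calculation. The gas properties are textbook approximations at room temperature, and the airway diameter and velocity are illustrative values, not patient data.

```python
# Quick check of the Reynolds-number argument: Re = rho * v * D / mu.
# Gas properties are textbook approximations at ~20 C; the airway
# diameter and flow velocity are assumed, trachea-scale values.

def reynolds(rho, v, D, mu):
    return rho * v * D / mu

v, D = 3.0, 0.015                        # m/s, m (assumed)
air    = dict(rho=1.20, mu=1.8e-5)       # kg/m^3, Pa*s
heliox = dict(rho=0.42, mu=2.1e-5)       # ~80% He / 20% O2 mixture

re_air    = reynolds(air["rho"], v, D, air["mu"])
re_heliox = reynolds(heliox["rho"], v, D, heliox["mu"])
print(round(re_air), round(re_heliox))   # -> 3000 900
# Heliox cuts Re by roughly a factor of 3, which can keep laminar
# (Re below ~2300 in a pipe) a flow that would be transitional in air.
```

The factor-of-three reduction comes mostly from the density ratio; helium's slightly higher viscosity helps as well, since Re scales with rho/mu.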
Abstract: In the context of computer numerical control (CNC) and computer aided manufacturing (CAM), the capabilities of programming languages such as symbolic and intuitive programming, program portability and geometrical portfolio have special importance. They save time, help avoid errors during part programming, and permit code reuse. Our updated literature review indicates that the current state of the art presents gaps in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit reuse of the programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
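The compilation idea can be illustrated with a toy translator that expands named variables into plain ISO G-code. The input syntax below is invented for the sketch and is not the real EGCL grammar.

```python
# Toy illustration of the idea behind EGCL: a high-level program with
# named variables is compiled down to elementary ISO G-code. The
# `name = value` and `X{name}` syntax is hypothetical, not EGCL's.
import re

def compile_to_gcode(src):
    env, out = {}, []
    for line in src.strip().splitlines():
        line = line.strip()
        m = re.match(r"(\w+)\s*=\s*([-\d.]+)$", line)
        if m:                                  # variable assignment
            env[m.group(1)] = float(m.group(2))
            continue
        # substitute variable names appearing after an axis/feed letter
        def sub(match):
            axis, name = match.groups()
            return "%s%g" % (axis, env[name])
        out.append(re.sub(r"([XYZF])\{(\w+)\}", sub, line))
    return out

program = """
plate_width = 80
feed = 300
G0 X0 Y0
G1 X{plate_width} F{feed}
"""
print(compile_to_gcode(program))
# -> ['G0 X0 Y0', 'G1 X80 F300']
```

The output contains only numeric, machine-agnostic G-code words, which is what makes the compiled program portable across executing CNC machines.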
Abstract: In the present essay, a model of choice by actors is analysed by utilizing the theory of chaos to explain how change comes about. Then, by using ancient and modern sources of literature, the theory of the social contract is analysed as a historical phenomenon that first appeared during the period of Classical Greece. Based on the findings of this analysis, the practice of direct democracy and public choice in ancient Athens is analysed through two historical cases: the political programs of Eubulus and Lycurgus in the second half of the 4th century BC. The main finding of this research is that these policies can be interpreted as an implementation of a social contract through which citizens took decisions based on rational choice according to economic considerations.
Abstract: We report on a high-speed quantum cryptography
system that utilizes simultaneous entanglement in polarization and in
“time-bins”. With multiple degrees of freedom contributing to the
secret key, we can achieve over ten bits of random entropy per detected
coincidence. In addition, we collect from multiple spots of
the downconversion cone to further amplify the data rate, allowing us to achieve over 10 Mbits of secure key per second.
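The entropy budget works out as a simple logarithm of the number of distinguishable outcomes per pair. The outcome counts below are illustrative assumptions chosen to reach the ~10 bits/coincidence figure; the paper's actual alphabet sizes may differ.

```python
# Back-of-envelope entropy budget for hyperentangled photon pairs.
# Outcome counts per degree of freedom are assumptions for illustration.
import math

polarization_outcomes = 2     # e.g. H/V basis
time_bin_outcomes = 512       # assumed number of resolvable time bins

bits_per_coincidence = math.log2(polarization_outcomes * time_bin_outcomes)
print(bits_per_coincidence)   # -> 10.0

# At that entropy, the coincidence rate needed for 10 Mbit/s of key:
coincidences_per_second = 10e6 / bits_per_coincidence
print(int(coincidences_per_second))  # -> 1000000
```

The point of packing entropy into each coincidence is visible in the last line: at 10 bits per detection, a 1 MHz coincidence rate already yields 10 Mbit/s, whereas a one-bit-per-coincidence system would need ten times the detection rate.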
Abstract: We report the size dependence of 1D superconductivity in ultrathin (10-130 nm) nanowires produced by coating suspended carbon nanotubes with a superconducting NbN thin film. The resistance-temperature characteristic curves for samples with ≥25 nm wire width show the superconducting transition. On the other hand, the samples with 10-nm width do not exhibit the superconducting transition, owing to the quantum size effect. The differential resistance vs. current density characteristic curves show peaks, indicating that Josephson junctions are formed in the nanowires. The presence of the Josephson junctions is well explained by measurements of the magnetic field dependence of the critical current. This understanding allows for further expansion of the potential applications of NbN, which is used, for example, in single-photon detectors.
Abstract: The colors of the human skin represent a special
category of colors, because they are distinctive from the colors of
other natural objects. This category is found as a cluster in color
spaces, and the skin color variations between people are mostly due
to differences in intensity. Moreover, face detection based on
skin color detection is faster than other techniques. In this work,
we present a system to track faces by carrying out skin color
detection in four different color spaces: HSI, YCbCr, YES and RGB.
Once skin color regions have been detected in each color space, we
label each one and extract characteristics such as size and position,
assuming that a face is located in one of the detected regions. Next,
we compare the labeled regions and employ a polling strategy between
them to determine the final region where the face has effectively
been detected and located.
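The polling step can be sketched as a vote among the per-color-space detections. The overlap test and the example bounding boxes below are illustrative, not the paper's exact rule.

```python
# Sketch of the polling strategy: each color space proposes a labeled
# skin region (here just a bounding box), and the final face location
# is the candidate that the most other color spaces agree with.

def iou(a, b):
    # intersection-over-union of (x1, y1, x2, y2) boxes
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / float(area(a) + area(b) - inter)

def poll(candidates, agree=0.5):
    # each candidate scores one vote per other color space whose
    # region overlaps it sufficiently; the highest vote count wins
    votes = [sum(iou(c, o) >= agree for o in candidates if o is not c)
             for c in candidates]
    return candidates[votes.index(max(votes))]

# hypothetical detections from HSI, YCbCr, YES and RGB respectively
regions = [(10, 10, 50, 60), (12, 11, 52, 62), (11, 9, 49, 58), (200, 5, 230, 40)]
print(poll(regions))  # three spaces agree near (10,10,50,60); the outlier is outvoted
```

Voting across color spaces makes the final localization robust to one space producing a spurious skin region, at the cost of running the (cheap) skin detection four times.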
Abstract: Understanding how and where NOx formation
occurs in industrial burners is very important for the efficient and
clean operation of utility burners. The importance of this problem is
largely due to the pollutants produced by the burners widely used in
gas turbines in thermal power plants and in the glass and steel
industries.
In this article, a numerical model of an industrial burner operating
in MILD combustion is validated with experimental data. Then the
influence of air flow rate and air temperature on combustor
temperature profiles and NOx production is investigated. In addition,
this study reports on the effects of fuel and air dilution
(with the inert gases H2O, CO2 and N2), and of lean premixing
of the fuel, on the temperature profiles and NOx emission.
Conservation equations of mass, momentum and energy, and
transport equations of species concentrations, turbulence, combustion
and radiation modeling, in addition to NO modeling equations, were
solved together to obtain the temperature and NO distributions inside
the burner.
The results show that dilution causes a reduction in temperature and
NOx emission, suppresses flame propagation inside the furnace, and
makes the flame inside the furnace invisible. Dilution with H2O
decreases NOx further than dilution with N2 or CO2. Also, as the
lean-premix level rises, the local temperature of the burner and the
NOx production decrease, because premixing prevents local “hot spots”
within the combustor volume that can lead to significant NOx
formation. Moreover, lean premixing of fuel with air causes the amount
of air in the reaction zone to exceed what is actually needed to burn
the fuel, which limits NOx formation.
Abstract: In this paper we propose the first two non-generic constructions
of multisignature schemes based on coding theory. The
first system makes use of the CFS signature scheme and is secure
in the random oracle model, while the second scheme is based on the KKS
construction and is a few-time scheme. The security of our constructions relies
on a difficult problem in coding theory: the Syndrome Decoding
problem, which has been proved NP-complete [4].
Abstract: The main objective of this study was to remove and recover Ni, Cu and Fe from a mixed metal system using sodium hypophosphite as a reducing agent and nickel powder as seeding material. The metal systems studied consisted of Ni-Cu, Ni-Fe and Ni-Cu-Fe solutions. A 5 L batch reactor was used to conduct experiments in which 100 mg/l of each respective metal was used. It was found that the metals were reduced to their elemental form with removal efficiencies of over 80%. The removal efficiency decreased in the order Fe>Ni>Cu. The metal powder obtained contained 97-99% Ni and was almost spherical and porous. Size enlargement by aggregation was the dominant particulate process.
Abstract: In the closed quantum system, if the control system is
strongly regular and all other eigenstates are directly coupled to the
target state, the control system can be asymptotically stabilized at the
target eigenstate by the Lyapunov control based on the state error.
However, if the control system is not strongly regular or if some
eigenstate is not directly coupled to the target state, the situation
becomes complicated. In this paper, we propose an implicit Lyapunov
control method based on the state error to solve the convergence
problems for these two degenerate cases. At the same time, we extend
the target state from an eigenstate to an arbitrary pure state. In
particular, the proposed method is also applicable to control systems
with multiple control Hamiltonians. On this basis, the
convergence of the control systems is analyzed using the LaSalle
invariance principle. Furthermore, the relation between the implicit
Lyapunov functions of the state distance and the state error is
investigated. Finally, numerical simulations are carried out to verify
the effectiveness of the proposed implicit Lyapunov control method.
Comparisons of the control effect of the implicit Lyapunov control
method based on the state distance with that based on the state error
are given.
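The relation between the two candidate Lyapunov functions can be checked numerically: for unit vectors, the state error satisfies the identity ||psi - phi||^2 = 2 - 2 Re<phi|psi>, while the state distance 1 - |<phi|psi>|^2 is insensitive to global phase. The example qubit states are arbitrary; only the algebraic identity is being demonstrated.

```python
# Numerical check of the state-error identity for pure states.
import math, cmath

def inner(phi, psi):                     # <phi|psi>, conjugating phi
    return sum(p.conjugate() * q for p, q in zip(phi, psi))

def state_error(phi, psi):               # ||psi - phi||^2
    d = [q - p for p, q in zip(phi, psi)]
    return inner(d, d).real

def state_distance(phi, psi):            # 1 - |<phi|psi>|^2
    return 1 - abs(inner(phi, psi)) ** 2

phi = [1, 0]                                            # target state
t = 0.3
psi = [math.cos(t), cmath.exp(1j * 0.5) * math.sin(t)]  # arbitrary pure state

lhs = state_error(phi, psi)
rhs = 2 - 2 * inner(phi, psi).real
print(abs(lhs - rhs) < 1e-12)   # -> True: the identity holds
print(state_distance(phi, psi)) # phase-insensitive distance to the target
```

Because the state error depends on Re<phi|psi> while the distance depends only on |<phi|psi>|, the two functions generally drive the system differently, which is exactly the comparison the simulations address.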
Abstract: Scheduling algorithms are used in operating systems
to optimize the usage of processors. One of the most efficient
algorithms for scheduling is Multi-Layer Feedback Queue (MLFQ)
algorithm which uses several queues with different quanta. The most
important weakness of this method is the inability to determine the
optimal number of queues and the quantum of each queue. This
weakness has been addressed in the IMLFQ scheduling algorithm.
The number of queues and the quantum of each queue directly affect
the response time. In this paper, we review the IMLFQ algorithm for
solving these problems and minimizing the response time. In this
algorithm, a recurrent neural network is utilized to find both
the number of queues and the optimized quantum of each queue.
Also, in order to prevent probable faults in the computation of
processes' response times, a new fault-tolerant approach is presented,
in which we use combinational software redundancy. The experimental
results show that using the IMLFQ algorithm results in better response
times than other scheduling algorithms, and that the fault-tolerant
mechanism further improves IMLFQ performance.
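A minimal multilevel-feedback-queue simulator makes the tunable parameters concrete: the number of queues and each queue's quantum, which IMLFQ optimizes (via a recurrent neural network in the paper), are here just fixed inputs. The burst times are hypothetical, and all jobs are assumed to arrive at t = 0.

```python
# Minimal MLFQ simulator: jobs start in the highest-priority queue and
# are demoted to longer-quantum queues when they exhaust their slice.
from collections import deque

def mlfq(bursts, quanta):
    queues = [deque() for _ in quanta]
    remaining = dict(enumerate(bursts))
    for pid in remaining:
        queues[0].append(pid)
    t, finish = 0, {}
    while any(queues):
        level = next(i for i, q in enumerate(queues) if q)
        pid = queues[level].popleft()
        run = min(quanta[level], remaining[pid])
        t += run
        remaining[pid] -= run
        if remaining[pid] == 0:
            finish[pid] = t
        else:  # demote to the next (longer-quantum) queue
            queues[min(level + 1, len(quanta) - 1)].append(pid)
    # mean completion time, used here as a proxy for response time
    return sum(finish.values()) / len(finish)

print(mlfq([3, 6, 9], quanta=[2, 4, 8]))  # -> 12.0
```

Re-running the simulator with different `quanta` lists shows how strongly the queue count and quantum sizes affect the mean time, which is the search space the optimization in the paper explores.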
Abstract: This paper presents a numerical investigation of the
unsteady flow around an American 19th century vertical-axis
windmill: the Stevens & Jolly rotor, patented on April 16, 1895. The
computational approach used is based on solving the complete
transient Reynolds-Averaged Navier-Stokes (t-RANS) equations: a
full campaign of numerical simulation has been performed using the
k-ω SST turbulence model. Flow field characteristics have been
investigated for several values of tip speed ratio and for a constant
unperturbed free-stream wind velocity of 6 m/s, enabling the study of
some unsteady flow phenomena in the rotor wake. Finally, the global
power generated by the windmill has been determined for each
simulated angular velocity, allowing the calculation of the rotor
power curve.
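One point of such a power curve follows from the quantities the study varies: tip speed ratio lambda = omega R / V, shaft power P = torque * omega, and power coefficient Cp = P / (0.5 rho A V^3). The rotor radius, height and torque below are made-up numbers; only V = 6 m/s comes from the abstract.

```python
# One operating point of a vertical-axis rotor power curve (assumed data).

rho, V = 1.225, 6.0        # air density (kg/m^3), free-stream wind (m/s)
R, H = 1.0, 2.0            # assumed rotor radius and height (m)
A = 2 * R * H              # swept area of a vertical-axis rotor

omega = 12.0               # one simulated angular velocity (rad/s)
torque = 4.0               # assumed mean aerodynamic torque (N*m)

lam = omega * R / V        # tip speed ratio
P = torque * omega         # shaft power (W)
Cp = P / (0.5 * rho * A * V ** 3)
print(lam, P, round(Cp, 3))  # -> 2.0 48.0 0.091
```

Sweeping `omega` (i.e. lambda) while extracting the simulated mean torque at each value is what builds the full power curve described in the abstract.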
Abstract: The world has entered the 21st century. Computer graphics
and digital camera technology are prevalent, and high-resolution
displays and printers are available. High-resolution images are
therefore needed to produce high-quality displayed images and
high-quality prints. However, since high-resolution images are not
usually available, there is a need to magnify the original images. One
common difficulty in previous magnification techniques is preserving
details, i.e. edges, while at the same time smoothing the data so as
not to introduce spurious artefacts; a definitive solution to this is
still an open issue. In this paper, an image magnification method using
adaptive interpolation by pixel-level data-dependent geometrical
shapes is proposed, which tries to take into account information about
the edges (sharp luminance variations) and the smoothness of the image.
It calculates a threshold, classifies interpolation regions in the
form of geometrical shapes, and then assigns suitable values to the
undefined pixels inside each interpolation region while preserving the
sharp luminance variations and smoothness at the same time.
The results of the proposed technique have been compared qualitatively
and quantitatively with five other techniques. The qualitative
results show that the proposed method clearly outperforms Nearest
Neighbour (NN), bilinear (BL) and bicubic (BC) interpolation, while
the quantitative results are competitive and consistent with NN, BL,
BC and the others.
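For reference, the BL baseline against which the adaptive method is compared works as follows: each magnified pixel is a distance-weighted average of its four nearest source pixels. This is a pure-Python sketch on a grayscale image stored as a list of rows; the 2x factor is arbitrary.

```python
# Bilinear (BL) magnification baseline. Note how the output blends
# across a sharp edge instead of preserving it, which is exactly the
# detail-loss problem the adaptive method targets.

def magnify_bilinear(img, factor):
    h, w = len(img), len(img[0])
    H, W = h * factor, w * factor
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            fy, fx = y / factor, x / factor
            y0, x0 = min(int(fy), h - 1), min(int(fx), w - 1)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            dy, dx = fy - y0, fx - x0
            top = img[y0][x0] * (1 - dx) + img[y0][x1] * dx
            bot = img[y1][x0] * (1 - dx) + img[y1][x1] * dx
            out[y][x] = top * (1 - dy) + bot * dy
    return out

img = [[0, 100],
       [100, 200]]
big = magnify_bilinear(img, 2)
print(big[1][1])  # -> 100.0, a blend of all four source pixels
```

An edge-aware scheme would instead classify the region around a sharp luminance variation and interpolate only along the edge direction, avoiding this blurring.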
Abstract: The essence of the 21st century is the knowledge economy. Knowledge has become the key resource of economic growth and social development, and the construction industry is no exception. Because of the complexity of construction projects, project managers cannot depend on information management alone. The only way to improve the level of construction project management is to set up an effective knowledge accumulation mechanism. This paper first introduces the IFC standard and the concept of ontology. It then puts forward a construction method for an architectural engineering domain ontology based on IFC, and finally builds up the concepts, the properties and the relationships between the concepts of the ontology. The deficiencies of this paper are also pointed out.
Abstract: The city of Suceava, one of the most important
medieval capitals of Moldova, owes its urban genesis to the power
center established in its territory at the turn of the thirteenth and
fourteenth centuries. Freed from the effective control exercised by
the Emir Nogai through the Alans, the local center of power evolved
as the main representative of the interests of the indigenous people
in relation to the Hungarian Angevin dynasty and to their
representatives from Maramures. From this perspective, the political
and military role of the settlement of Suceava is archeologically
proved by the discovery of extensive fortifications, unrivaled in the
Moldavia of the first half of the fourteenth century. At the end of
that century, voivode Peter I decided to move the capital of the state
from Siret to Suceava. That option stimulated the development of the
settlement along specifically urban coordinates.
Abstract: A straightforward and intuitive combination of single simulations into an aggregated master simulation is not trivial. There are many problems which trigger specific difficulties during the modeling and execution of such a simulation. In this paper we identify these problems and aim to solve them by mapping the task to the field of multi-agent systems. The solution is a new meta-model named AGENTMAP, which is able to mitigate most of the problems and to support intuitive modeling at the same time. This meta-model is introduced and explained on the basis of an example from the e-commerce domain.
Abstract: Today, global warming, climate change and energy supply are of great concern, as it is widely realized that planet Earth does not provide an infinite capacity for absorbing human industrialization in the 21st century. The aim of this paper is to analyze upstream and downstream electricity production and consumption in selected case studies: a coal power plant, a pump system and a microwave oven, in order to explore the position of energy efficiency in engineering sustainability. Collectively, the analysis presents energy efficiency as a major pathway towards sustainability that requires an inclusive and holistic supply-chain response in the engineering design process.
Abstract: A systematic and exhaustive method based on the group
structure of a unitary Lie algebra is proposed to generate an enormous
number of quantum codes. With respect to the algebraic structure,
the orthogonality condition, which is the central rule of generating
quantum codes, is proved to be fully equivalent to the distinguishability
of the elements in this structure. In addition, four types of
quantum codes are classified according to the relation between the
codeword operators and some initial quantum state. By linking the
unitary Lie algebra with the additive group, the classical
correspondences of some of these quantum codes can be rendered.