Abstract: There is significant interest in achieving technology
innovation through new product development activities. It is
recognized, however, that traditional project management practices,
focused only on performance, cost, and schedule attributes, can often
lead to risk mitigation strategies that limit new technology
innovation. In this paper, a new approach is proposed for formally
managing and quantifying technology innovation. This approach uses
a risk-based framework that simultaneously optimizes innovation
attributes along with traditional project management and system
engineering attributes. To demonstrate the efficacy of the new riskbased
approach, a comprehensive product development experiment
was conducted. This experiment simultaneously managed the
innovation risks and the product delivery risks through the proposed
risk-based framework. Quantitative metrics for technology
innovation were tracked and the experimental results indicate that the
risk-based approach can simultaneously achieve both project
deliverable and innovation objectives.
Abstract: Virtual Reality Modelling Language (VRML) is a description language belonging to the Window-on-World class of virtual reality systems. A file in VRML format can be interpreted by a VRML browser as a three-dimensional scene. VRML was created to make it easier to represent virtual reality on the Internet. The development of 3D graphics is closely connected with Silicon Graphics Corporation. VRML 2.0 is the file format for describing interactive 3D scenes and objects. It can be used in collaboration with the WWW to create complex 3D representations of scenes, products, or VR applications, and it can represent both static and animated objects. An interesting application of VRML lies in the presentation of manufacturing systems.
Abstract: Today, Higher Education on a global scale is subordinated to greater institutional controls through the policies of the Quality of Education. These include processes of over-evaluation of all academic activities: students' and professors' performance, educational logistics, managerial standards for the administration of institutions of higher education, as well as the establishment of imaginaries of excellence and prestige as the foundations on which universities of the XXI century will focus their present and future goals and interests. But at the same time, higher education systems worldwide are facing a profound crisis of sense and meaning and are undergoing enormous mutations in their identity. Based on a qualitative research approach, this paper shows the social configurations that scholars at universities in Mexico build around the discourse of the Quality of Education, and how these policies put at risk the social recognition of these individuals.
Abstract: Optical Burst Switching (OBS) is a relatively new
optical switching paradigm. Contention and burst loss in OBS
networks are major concerns. To resolve contentions, an interesting
alternative to discarding the entire data burst is to partially drop the
burst. Partial burst dropping is based on the burst segmentation
concept, whose implementation is constrained by some technical
challenges, besides the complexity added to the algorithms and
protocols at both edge and core nodes. In this paper, the burst
segmentation concept is
investigated, and an implementation scheme is proposed and
evaluated. An appropriate dropping policy that effectively manages
the size of the segmented data bursts is presented. The dropping
policy is further supported by a new control packet format that
provides constant transmission overhead.
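As a concrete illustration, the tail-dropping variant of burst segmentation can be sketched as follows. The function name and the fixed-length-segment assumption are ours for illustration, not the exact scheme proposed in the paper.

```python
import math

def tail_drop(scheduled_start, n_segments, seg_dur, contending_start):
    """Tail-dropping burst segmentation: return how many tail segments of
    an already-scheduled burst must be dropped so that it no longer
    overlaps a contending burst arriving at contending_start.
    Assumes equal-length segments (an illustrative simplification)."""
    scheduled_end = scheduled_start + n_segments * seg_dur
    if contending_start >= scheduled_end:
        return 0  # no contention: bursts do not overlap
    overlap = scheduled_end - contending_start
    # only whole segments can be dropped, so round the overlap up
    return min(n_segments, math.ceil(overlap / seg_dur))
```

For example, a ten-segment burst of unit-length segments starting at t = 0 that contends with a burst arriving at t = 7.5 loses its last three segments, while the head of the burst is still delivered.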
Abstract: Marketing is an essential issue to the survival of any
real estate company in Turkey. Several factors constrain the
achievement of marketing and sales strategies in the Turkish real
estate industry. This study aims to identify and prioritise the most
significant constraints to marketing in the real estate sector and to
propose new strategies based on those constraints. This study is
based on survey method, where the respondents such as credit
counsellors, real estate investors, consultants, academicians and
marketing representatives in Turkey were asked to rank forty-seven
sub-factors according to their levels of impact. The results of the
multi-attribute analytical technique indicated that the main
sub-components having an impact on marketing in the real estate
sector are interest rates, real
estate credit availability, accessibility, company image and consumer
real income, respectively. The identified constraints are expected to
guide the marketing team in a sales-effective way.
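A weighted-sum scoring rule is one simple way to realise such a multi-attribute ranking. The sketch below is a generic stand-in (constraint names, attribute scores, and weights are illustrative assumptions), not the paper's exact technique.

```python
def rank_constraints(ratings, weights):
    """Weighted-sum multi-attribute scoring: ratings maps each constraint
    to a list of attribute scores; weights holds one weight per attribute.
    Returns the constraints ordered by weighted impact, highest first."""
    total = sum(weights)
    score = {c: sum(w * s for w, s in zip(weights, r)) / total
             for c, r in ratings.items()}
    return sorted(score, key=score.get, reverse=True)
```

With toy data, `rank_constraints({'interest rates': [9, 8], 'company image': [5, 6]}, [2, 1])` places interest rates ahead of company image, mirroring the ordering reported above.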
Abstract: Monitoring lightning electromagnetic pulses (sferics)
and other terrestrial as well as extraterrestrial transient radiation signals
is of considerable interest for practical and theoretical purposes
in astro- and geophysics as well as meteorology. To manage a
continuous flow of data, automation of the detection and classification
process is important. Features based on a combination of wavelet and
statistical methods proved efficient for the analysis and characterisation
of transients, and as input to a radial basis function network that is
trained to discriminate transients ranging from pulse-like to wave-like.
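The discrimination step can be sketched as the forward pass of a small radial basis function network. The centres, weights, and width parameter below are illustrative assumptions, not the trained network from the study.

```python
import math

def rbf_score(x, centers, weights, gamma=1.0):
    """Forward pass of a toy radial basis function network: one Gaussian
    unit per centre, combined linearly by the output weights. A positive
    score is read here as 'wave-like', a negative one as 'pulse-like'."""
    acts = [math.exp(-gamma * sum((xi - ci) ** 2 for xi, ci in zip(x, c)))
            for c in centers]
    return sum(w * a for w, a in zip(weights, acts))
```

With a centre near each class prototype in feature space (e.g. one for pulse-like and one for wave-like transients), the sign of the score separates the two regimes.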
Abstract: Renewed interest in propeller propulsion for aircraft
configurations, combined with higher propeller loads, leads to the question of how the effects of the propulsion on model support disturbances
should be accounted for. In this paper, the determination of engine power effects on the support interference of sting-mounted models is
demonstrated by a measurement on a four-engine turboprop aircraft.
CFD results on a more generic model are presented in order to clarify
the possible mechanism behind engine power effects on support
interference. The engine slipstream induces a local change in angle
of sideslip at the model sting thereby influencing the sting near-field and far-field effects. Whether or not the net result of these changes
in the disturbance pattern leads to a significant engine power effect depends on the configuration of the wind tunnel model and the test
setup.
Abstract: We study the performance of the compressed beamforming
weights feedback technique in generalized triangular decomposition
(GTD) based MIMO system. GTD is a beamforming technique that
enjoys QoS flexibility. The technique, however, will perform at its
optimum only when the full knowledge of channel state information
(CSI) is available at the transmitter. This would be impossible in
a real system, where there is channel estimation error and limited
feedback. We suggest a way to implement the quantized beamforming
weights feedback, which can significantly reduce the feedback data,
on GTD-based MIMO system and investigate the performance of
the system. Interestingly, we found that compressed beamforming
weights feedback does not degrade the BER performance of the
system at low input power, while the channel estimation error
and quantization do. For comparison, GTD is more sensitive to
compression and quantization, while SVD is more sensitive to
channel estimation error. We also explore the performance of the
GTD-based MU-MIMO system, and find that the BER performance
starts to degrade significantly at around -20 dB channel estimation error.
Abstract: Ontologies are broadly used in the context of networked home environments. With ontologies it is possible to define and store context information, as well as to model different kinds of physical environments. Ontologies are central to networked home environments as they carry the meaning. However, ontologies and the OWL language are complex. Several ontology visualization approaches have been developed to enhance the understanding of ontologies. The domain of networked home environments sets some special requirements for an ontology visualization approach. The visualization tool presented here visualizes ontologies in a domain-specific way. It effectively represents the physical structures and spatial relationships of networked home environments. In addition, it provides extensive interaction possibilities for editing and manipulating the visualization. The tool narrows the gap from beginner to intermediate OWL ontology reader by visualizing instances in their actual locations and making OWL ontologies more interesting and concrete, and above all easier to comprehend.
Abstract: In this paper we are interested in Moufang-Klingenberg
planes M(A) defined over a local alternative ring A of dual numbers.
We show that a collineation of M(A) preserves the cross-ratio. Also, we
obtain some results about harmonic points.
Abstract: The present study investigates numerically the
phenomenon of vortex-shedding and its suppression in two-dimensional
mixed convective flow past a square cylinder under the
joint influence of buoyancy and free-stream orientation with respect
to gravity. The numerical experiments have been conducted at a
fixed Reynolds number (Re) of 100 and Prandtl number (Pr) of 0.71,
while the Richardson number (Ri) is varied from 0 to 1.6 and the
free-stream orientation, α, is kept in the range 0° ≤ α ≤ 90°, with 0°
corresponding to an upward flow and 90° representing a cross-flow
scenario, respectively. The continuity, momentum and energy
equations, subject to Boussinesq approximation, are discretized using
a finite difference method and are solved by a semi-explicit pressure
correction scheme. The critical Richardson number, leading to the
suppression of the vortex-shedding (Ric), is estimated by using
Stuart-Landau theory at various free-stream orientations and the
neutral curve is obtained in the Ri-α plane. The neutral curve
exhibits an interesting non-monotonic behavior, with Ric first
increasing with increasing values of α up to 45° and then decreasing
till 70°. Beyond 70°, the neutral curve again exhibits a sharply
increasing asymptotic trend, with Ric approaching very large values
as α approaches 90°. The suppression of vortex shedding is not
observed at α = 90° (cross-flow). In the unsteady flow regime, the
Strouhal number (St) increases with the increase in Richardson
number.
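In the Stuart-Landau framework, Ric can be estimated from the linear growth rate of the shedding amplitude: near threshold the growth rate varies roughly linearly with Ri, and Ric is its zero crossing. A minimal sketch of that extrapolation (the function name, sample data, and linear-fit assumption are ours):

```python
def critical_richardson(ri_values, growth_rates):
    """Fit sigma(Ri) = a + b*Ri by least squares to measured Stuart-Landau
    growth rates and return the zero crossing Ric = -a/b, where the
    vortex-shedding instability is suppressed."""
    n = len(ri_values)
    mx = sum(ri_values) / n
    my = sum(growth_rates) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(ri_values, growth_rates))
         / sum((x - mx) ** 2 for x in ri_values))
    a = my - b * mx
    return -a / b  # Ri at which the growth rate vanishes
```

Repeating this fit at each free-stream orientation α yields one point Ric(α) of the neutral curve in the Ri-α plane.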
Abstract: The purpose of this study is to introduce a new
interface program to calculate a dose distribution with Monte Carlo method in complex heterogeneous systems such as organs or tissues
in proton therapy. This interface program was developed under
MATLAB software and includes a friendly graphical user interface
with several tools such as image properties adjustment or results display. Quadtree decomposition technique was used as an image
segmentation algorithm to create optimum geometries from Computed Tomography (CT) images for dose calculations of proton
beam. The result of this technique is a number of non-overlapping
squares of different sizes in every image. In this way,
the resolution of the image segmentation is high enough in and near
heterogeneous areas to preserve the precision of dose calculations
and is low enough in homogeneous areas to reduce the number of
cells directly. Furthermore, a cell reduction algorithm can be used to combine neighboring cells of the same material. The validation of this method has been done in two ways: first, in comparison with experimental data obtained with an 80 MeV proton beam at the Cyclotron
and Radioisotope Center (CYRIC) at Tohoku University, and second, in comparison with data based on the polybinary tissue calibration method, performed at CYRIC. These results are presented in this paper. This program can read the output file of the Monte Carlo code while the region of interest is selected manually, and gives a plot of the dose distribution of the proton beam superimposed onto the CT images.
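The quadtree decomposition step can be sketched as a recursive split of each square region until it is homogeneous. This is a generic sketch on a 2D intensity grid (the homogeneity test and threshold are illustrative assumptions, not the program's exact criterion):

```python
def quadtree(img, x, y, size, threshold=0):
    """Recursively split a size-by-size square of a 2D grid (img[row][col])
    into four quadrants until each leaf is homogeneous, i.e. its value
    range (max - min) does not exceed threshold. Returns the leaves as
    (x, y, size) tuples; power-of-two sizes are assumed."""
    vals = [img[j][i] for j in range(y, y + size)
                      for i in range(x, x + size)]
    if size == 1 or max(vals) - min(vals) <= threshold:
        return [(x, y, size)]  # homogeneous: keep one large cell
    h = size // 2
    leaves = []
    for dx, dy in ((0, 0), (h, 0), (0, h), (h, h)):
        leaves += quadtree(img, x + dx, y + dy, h, threshold)
    return leaves
```

A uniform region collapses to a single cell, while a region containing a heterogeneity is refined only locally, which is exactly the behavior described above: fine cells near tissue boundaries, coarse cells in homogeneous areas.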
Abstract: Artificial atoms are a growing field of interest due to their physical and optoelectronic applications. The absorption spectra of the proposed artificial atom in the presence of a terahertz field are investigated theoretically. We use the non-perturbative Floquet theory and the finite difference method to study the electronic structure of the artificial atom. The effect of a static electric field on the energy levels of the artificial atom is studied. The effect of the orientation of the static electric field on the energy levels and dipole matrix elements is also highlighted.
Abstract: This article describes Uruk, the virtual museum of
Iraq that we developed for visual exploration and retrieval of image
collections. The system largely exploits the loosely-structured
hierarchy of XML documents that provides a useful representation
method to store semi-structured or unstructured data, which does not
easily fit into existing databases. The system offers users the
capability to mine and manage the XML-based image collections
through a web-based Graphical User Interface (GUI). Typically, at an
interactive session with the system, the user can browse a visual
structural summary of the XML database in order to select interesting
elements. Using this intermediate result, queries combining structure
and textual references can be composed and presented to the system.
After query evaluation, the full set of answers is presented in a visual
and structured way.
Abstract: Both the minimum energy consumption and
smoothness, which is quantified as a function of jerk, are generally
needed in many dynamic systems, such as the automobile and the
pick-and-place robot manipulator that handles fragile equipment.
Nevertheless, many researchers have focused solely on either
minimum energy consumption or the minimum-jerk trajectory. This
research paper proposes a simple yet interesting approach that
combines the minimum-energy and indirect minimum-jerk criteria in
designing the time-dependent system, yielding an alternative optimal
solution. Extremal solutions for the cost functions of the minimum
energy, the minimum jerk, and their combination are found using the
dynamic optimization methods together
with the numerical approximation. This is to allow us to simulate
and compare visually and statistically the time history of state inputs
employed by the combined minimum energy and jerk designs. The
numerical solutions of the minimum direct-jerk problem and of the
combined energy-jerk problem are exactly the same; the solutions of
the minimum-energy problem are similar, especially in terms of their
overall tendency.
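For reference, the pure minimum-jerk benchmark has a well-known closed form for a rest-to-rest motion: the quintic x(t) = x0 + (xf - x0)(10τ³ - 15τ⁴ + 6τ⁵) with τ = t/T is the extremal of the integral of squared jerk. The combined energy-jerk problem of this paper is solved numerically; this sketch only illustrates the jerk-only extremal.

```python
def min_jerk(x0, xf, t, T):
    """Rest-to-rest minimum-jerk trajectory over duration T: position at
    time t of the quintic extremal of the integral of squared jerk,
    with zero velocity and acceleration at both endpoints."""
    tau = t / T  # normalized time in [0, 1]
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
```

The trajectory starts and ends at rest, passes through the midpoint of the motion at t = T/2, and its smoothness is what makes it attractive for handling fragile payloads.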
Abstract: The theory of Groebner Bases, which has recently been
honored with the ACM Paris Kanellakis Theory and Practice Award,
has become a crucial building block to computer algebra, and is
widely used in science, engineering, and computer science. It is well-known
that Groebner bases computation is EXP-SPACE in a general
setting. In this paper, we give an algorithm to show that Groebner
bases computation is P-SPACE in Boolean rings. We also show that
with this discovery, the Groebner bases method can theoretically be
as efficient as other methods for automated verification of hardware
and software. Additionally, Groebner bases have many useful and
interesting properties, including the ability to efficiently convert
bases between different orders of variables, which makes Groebner
bases a promising method in automated verification.
Abstract: Every commercial bank optimises its asset portfolio
depending on the profitability of assets and chosen or imposed
constraints. This paper proposes and applies a stylized model for
optimising banks' asset and liability structure, reflecting profitability
of different asset categories and their risks as well as costs associated
with different liability categories and reserve requirements. The level
of detail for asset and liability categories is chosen to create a
suitably parsimonious model and to include the most important
categories in the model. It is shown that the most appropriate
optimisation criterion for the model is the maximisation of the ratio
of net interest income to assets. The maximisation of this ratio is
subject to several constraints. Some are accounting identities or
dictated by legislative requirements; others vary depending on the
market objectives for a particular bank. The model predicts a variable
amount of assets allocated to loan provision.
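When total assets are normalised to one, maximising the ratio of net interest income to assets reduces to maximising net interest income per unit of assets. The grid-search sketch below is a toy stand-in for the stylized model: the three asset categories, the rates, and the single reserve constraint are illustrative assumptions.

```python
def optimise_allocation(yields, funding_cost, reserve_ratio):
    """Toy balance-sheet optimisation on a percent grid: choose shares of
    cash, securities and loans (summing to 1) that maximise net interest
    income per unit of assets, subject to a minimum cash reserve."""
    best_nii, best_mix = None, None
    for i in range(101):                # cash share, in percent
        cash = i / 100
        if cash < reserve_ratio:
            continue                    # reserve requirement binds
        for j in range(101 - i):        # securities share, in percent
            sec = j / 100
            loans = 1 - cash - sec
            nii = (yields['cash'] * cash + yields['securities'] * sec
                   + yields['loans'] * loans - funding_cost)
            if best_nii is None or nii > best_nii:
                best_nii, best_mix = nii, (cash, sec, round(loans, 2))
    return best_mix
```

With loans yielding more than securities and securities more than cash, the optimum holds exactly the required reserve in cash and allocates the remainder to loans, the pattern one expects from such a profitability-driven model.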
Abstract: The paper describes a new approach for fingerprint
classification, based on the distribution of local features (minute
details or minutiae) of the fingerprints. The main advantage is that
fingerprint classification provides an indexing scheme to facilitate
efficient matching in a large fingerprint database. A set of rules based
on heuristic approach has been proposed. The area around the core
point is treated as the area of interest for extracting the minutiae
features as there are substantial variations around the core point as
compared to the areas away from the core point. The core point in a
fingerprint is located at the point of maximum curvature. The
experimental results report an overall average accuracy of 86.57% in
fingerprint classification.
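Given a curvature map of the fingerprint's ridge orientation field, the core-point rule described above reduces to locating the maximum of that map. The sketch below assumes such a 2D map is already available (computing it from the orientation field is outside this sketch).

```python
def core_point(curvature):
    """Locate the core point as the (row, col) position of the maximum
    value in a precomputed 2D curvature map of the ridge orientation
    field, following the maximum-curvature rule."""
    rows, cols = len(curvature), len(curvature[0])
    return max(((r, c) for r in range(rows) for c in range(cols)),
               key=lambda rc: curvature[rc[0]][rc[1]])
```

The region of interest for minutiae extraction is then a window centred on the returned position.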
Abstract: How to coordinate the behaviors of the agents through
learning is a challenging problem within multi-agent domains.
Because of its complexity, recent work has focused on how
coordinated strategies can be learned. Here we are interested in using
reinforcement learning techniques to learn the coordinated actions of a
group of agents, without requiring explicit communication among
them. However, traditional reinforcement learning methods are based
on the assumption that the environment can be modeled as Markov
Decision Process, which usually cannot be satisfied when multiple
agents coexist in the same environment. Moreover, to effectively
coordinate each agent's behavior so as to achieve the goal, it is
necessary to augment the state of each agent with information
about the other existing agents. However, as the number of agents in a
multi-agent environment increases, the state space of each agent grows
exponentially, which causes a combinatorial explosion problem.
Profit sharing is one of the reinforcement learning methods that allow
agents to learn effective behaviors from their experiences even within
non-Markovian environments. In this paper, to remedy the drawback
of the original profit sharing approach, which needs much memory to
store each state-action pair during the learning process, we first
present an on-line rational profit sharing algorithm. Then, we
integrate the advantages of modular learning architecture with on-line
rational profit sharing algorithm, and propose a new modular
reinforcement learning model. The effectiveness of the technique is
demonstrated using the pursuit problem.
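The core credit-assignment rule of profit sharing can be sketched as follows: when an episode reaches the goal, the reward is propagated backwards along the episode's state-action pairs with geometrically decreasing credit. The geometric rate is a common choice satisfying the rationality condition when the decay does not exceed 1/(number of actions); the exact on-line variant proposed in the paper may differ.

```python
def profit_sharing_update(q, episode, reward, decay=0.5):
    """Profit sharing credit assignment: reinforce the episode's
    state-action pairs, walking backwards from the goal, with credit
    reward, reward*decay, reward*decay**2, ... q maps (state, action)
    pairs to accumulated weights."""
    credit = reward
    for state, action in reversed(episode):
        q[(state, action)] = q.get((state, action), 0.0) + credit
        credit *= decay
    return q
```

Because the update needs no transition model and no Markov assumption, it remains applicable in the non-Markovian multi-agent settings discussed above.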
Abstract: Term Extraction, a key data preparation step in Text
Mining, extracts the terms, i.e. relevant collocation of words,
attached to specific concepts (e.g., genetic algorithms and decision
trees are terms associated with the concept “Machine Learning”). In
this paper, the task of extracting interesting collocations is achieved
through a supervised learning algorithm, exploiting a few
collocations manually labelled as interesting/not interesting. From
these examples, the ROGER algorithm learns a numerical function,
inducing some ranking on the collocations. This ranking is optimized
using genetic algorithms, maximizing the trade-off between the false
positive and true positive rates (Area Under the ROC curve). This
approach uses a particular representation for the word collocations,
namely the vector of values corresponding to the standard statistical
interestingness measures attached to this collocation. As this
representation is general (across corpora and natural languages),
generality tests were performed by applying the ranking
function learned from an English corpus in Biology to a French
corpus of Curricula Vitae, and vice versa, showing good
robustness of the approach compared to the state-of-the-art Support
Vector Machine (SVM).
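The fitness driving the genetic search, the Area Under the ROC curve, can be computed directly as the Wilcoxon-Mann-Whitney statistic: the probability that a positive (interesting) collocation is ranked above a negative one. A minimal sketch of that fitness evaluation (the genetic algorithm itself is omitted):

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve of a ranking function, computed as the
    fraction of positive-negative pairs where the positive example is
    scored higher (ties count one half): the Wilcoxon-Mann-Whitney
    statistic."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))
```

A candidate ranking function is scored by applying it to the labelled collocations and passing the resulting scores for interesting and not-interesting examples to this function; the genetic algorithm then maximises this value over candidate functions.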