Abstract: This paper presents a new version of the SVM mixture algorithm initially proposed by Kwok for classification and regression problems. In both cases, a slight modification of the mixture model leads to a standard SVM training problem, guarantees the existence of an exact solution, and allows the direct use of well-known decomposition and working-set selection algorithms. Only the regression case is considered in this paper, but classification has been addressed in a very similar way. The method has been successfully applied to modeling engine pollutant emissions.
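For reference, the standard ε-insensitive SVM regression training problem that such a modification reduces to can be sketched in conventional notation (the symbols below are the usual ones, not taken from the paper):

```latex
\min_{w,\,b,\,\xi,\,\xi^*} \;\; \frac{1}{2}\lVert w\rVert^2
  + C\sum_{i=1}^{\ell}\left(\xi_i + \xi_i^*\right)
\quad \text{subject to} \quad
\begin{cases}
  y_i - \langle w, \phi(x_i)\rangle - b \le \varepsilon + \xi_i,\\
  \langle w, \phi(x_i)\rangle + b - y_i \le \varepsilon + \xi_i^*,\\
  \xi_i,\; \xi_i^* \ge 0, \qquad i = 1,\dots,\ell .
\end{cases}
```

This is the form whose dual is solved by the decomposition and working-set selection algorithms mentioned above.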
Abstract: The construction of a civil structure inside an urban area inevitably modifies the outdoor microclimate at the building site. Wind speed, wind direction, air pollution, driving rain, radiation and daylight are some of the main physical aspects subject to major changes. The magnitude of these modifications depends on the shape, size and orientation of the building and on its interaction with the surrounding environment. The flow field over a flat-roof model building has been numerically investigated in order to determine two-dimensional CFD guidelines for the calculation of the turbulent flow over a structure immersed in an atmospheric boundary layer. To this purpose, a complete validation campaign has been performed through a systematic comparison of numerical simulations with wind tunnel experimental data. Several turbulence models and spatial node distributions have been tested at five different vertical positions, from the upstream leading edge to the downstream bottom edge of the analyzed model. Flow field characteristics in the neighborhood of the building model have been numerically investigated, allowing a quantification of the capability of the CFD code to predict the flow separation and the extent of the recirculation regions. The proposed calculations have allowed the development of a preliminary procedure to be used as guidance in selecting the appropriate grid configuration and corresponding turbulence model for the prediction of the flow field over a two-dimensional roof architecture dominated by flow separation.
Abstract: Nowadays companies strive to survive in a competitive global environment. To speed up product development and modification, it is suggested to adopt a collaborative product development approach. However, despite the advantages of new IT improvements, many CAx systems still work separately and locally. Collaborative design and manufacture requires a product information model that supports the related CAx product data models. Many solutions have been proposed to this problem, of which the most successful is adopting the STEP standard as the product data model on which to build a collaborative CAx platform. However, several factors usually slow down the implementation of the STEP standard in collaborative data exchange, management and integration and must be considered: the evolution of STEP Application Protocols (APs) over time, the huge number of STEP APs and conformance classes (CCs), the high costs of implementation, the costly process of converting older CAx software files to the STEP neutral file format, and the lack of STEP knowledge. In this paper the requirements for a successful collaborative CAx system are discussed. The STEP standard's capability for product data integration and its shortcomings, as well as the dominant platforms supporting CAx collaboration management and product data integration, are reviewed. Finally, a platform named LAYMOD is proposed to fulfil the requirements of a CAx collaborative environment and to integrate product data. LAYMOD is a layered platform that enables global collaboration among different CAx software packages and developers. It adopts the STEP modular architecture and XML data structures to enable collaboration between CAx software packages while overcoming the limitations of the STEP standard. The architecture and procedures by which the LAYMOD platform manages collaboration and avoids conflicts in product data integration are introduced.
Abstract: Indium-tin oxide (ITO) films are deposited by low-temperature RF plasma sputtering on highly flexible glycol-modified polyethylene terephthalate substrates. The produced layers exhibit transparency above 82 % and a sheet resistance of 86.9 Ω/square. The films' conductivity was further improved by additional UV illumination from a 365 nm light source with a power of 250 W. The influence of the UV exposure dose on the structural and electro-optical properties of the ITO was investigated. It was established that the optimum illumination time is 10 minutes, and that further UV treatment leads to degradation of the polymer substrate. Structural and bond-type analyses show that at longer treatment times carbon atoms are released and diffuse into the ITO films, which worsens their electrical behavior. For the optimum UV dose the minimum sheet resistance was measured to be 19.2 Ω/square, while the maximum transparency remained almost unchanged, above 82 %.
Abstract: Nowadays, a significant number of commercial and governmental organisations such as museums, cultural organisations, libraries and commercial enterprises invest intensively in new technologies for image digitisation, digital libraries, image archiving and retrieval. Hence image authorisation, authentication and security have become a prime need. In this paper, we present a semi-fragile watermarking scheme for color images. The method converts the host image into the YIQ color space, followed by application of the orthogonal dual domains of the DCT and DWT transforms. The DCT helps to separate relevant from irrelevant image content to generate salient image features. The DWT has excellent spatial localisation, which aids in spatial tamper characterisation. Thus an image-adaptive watermark is generated based on image features, allowing the sharp detection of microscopic changes and the localisation of modifications in the image. Further, the scheme utilises a multipurpose watermark consisting of a soft authenticator watermark and a chrominance watermark. The scheme has been proved fragile to certain predefined processing, such as intentional fabrication or forgery of the image, and robust to other incidental attacks caused in the communication channel.
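The first step of such a scheme, moving the host image from RGB to YIQ so that luminance (Y) and chrominance (I, Q) can be watermarked separately, can be sketched with the standard-library conversion (a minimal illustration; the pixel layout is an assumption, not the paper's data structure):

```python
import colorsys

def rgb_image_to_yiq(pixels):
    """Convert an RGB image to YIQ, pixel by pixel.
    pixels: list of rows of (r, g, b) floats in [0, 1].
    Returns the same layout with (y, i, q) tuples."""
    return [[colorsys.rgb_to_yiq(r, g, b) for (r, g, b) in row]
            for row in pixels]

# a 1x2 toy image: one white pixel, one pure-red pixel
img = [[(1.0, 1.0, 1.0), (1.0, 0.0, 0.0)]]
yiq = rgb_image_to_yiq(img)
y_white = yiq[0][0][0]   # luminance of white
y_red = yiq[0][1][0]     # luminance of red (0.30 under the YIQ weights)
```

The DCT/DWT feature extraction would then operate on the Y (and I, Q) planes produced here.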
Abstract: Variational methods for optical flow estimation are known for their excellent performance. The method proposed by Brox et al. [5] exemplifies the strength of that framework. It combines several concepts into a single energy functional that is then minimized according to a clear numerical procedure. In this paper we propose a modification of that algorithm starting from the spatiotemporal gradient constancy assumption. The numerical scheme allows us to establish the connection between our model and the CLG(H) method introduced in [18]. Experimental evaluation carried out on synthetic sequences shows the significant superiority of the spatial variant of the proposed method. A comparison between the methods on a real-world sequence is also included.
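For context, the energy functional of Brox et al., combining brightness constancy, gradient constancy and a smoothness term, is commonly written as follows (conventional notation with flow field w = (u, v, 1)^T and robust penalizer Ψ; the symbols are the usual ones, not this paper's):

```latex
E(u,v) = \int_{\Omega}
  \Psi\!\left( \left| I(\mathbf{x}+\mathbf{w}) - I(\mathbf{x}) \right|^2
  + \gamma \left| \nabla I(\mathbf{x}+\mathbf{w}) - \nabla I(\mathbf{x}) \right|^2 \right)
  d\mathbf{x}
  \;+\; \alpha \int_{\Omega}
  \Psi\!\left( |\nabla u|^2 + |\nabla v|^2 \right) d\mathbf{x},
\qquad
\Psi(s^2) = \sqrt{s^2 + \epsilon^2}.
```

The spatiotemporal gradient constancy assumption mentioned above modifies the second data term inside the first integral.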
Abstract: This paper proposes a new version of Particle Swarm Optimization (PSO), namely Modified PSO (MPSO), for model order formulation of Single Input Single Output (SISO) linear time-invariant continuous systems. In standard PSO, the movement of a particle is governed by three components, namely inertia, cognitive and social. The cognitive component helps the particle to remember its previously visited best position. The Modified PSO technique splits the cognitive component into two parts: the previously visited best position and the previously visited worst position. This modification helps the particle to search for the target very effectively. The MPSO approach is proposed to formulate the reduced-order model of a higher-order system. The method is based on minimizing the error between the transient responses of the original higher-order model and the reduced-order model for a unit step input. The results obtained are compared with earlier techniques to validate the ease of computation. The proposed method is illustrated through a numerical example from the literature.
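One plausible reading of the split cognitive component is an update with an attraction toward the personal best and a repulsion from the personal worst; the following sketch illustrates that idea on a toy objective (all coefficients, the repulsion form, and the test function are illustrative assumptions, not taken from the paper):

```python
import random

def mpso(f, dim, n_particles=20, iters=200,
         w=0.7, c1=1.5, c1w=0.5, c2=1.5, seed=0):
    """PSO variant whose cognitive term is split into a best-position
    attraction and a worst-position repulsion (illustrative sketch)."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]            # personal best positions
    pworst = [x[:] for x in X]           # personal worst positions
    fbest = [f(x) for x in X]
    fworst = fbest[:]
    g = min(range(n_particles), key=lambda i: fbest[i])
    gbest, gval = pbest[g][:], fbest[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2, r3 = rng.random(), rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])    # pull toward best
                           + c1w * r2 * (X[i][d] - pworst[i][d])  # push away from worst
                           + c2 * r3 * (gbest[d] - X[i][d]))      # social term
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < fbest[i]:
                fbest[i], pbest[i] = fx, X[i][:]
                if fx < gval:
                    gval, gbest = fx, X[i][:]
            if fx > fworst[i]:
                fworst[i], pworst[i] = fx, X[i][:]
    return gbest, gval

sphere = lambda x: sum(xi * xi for xi in x)   # toy objective, not the model-reduction error
best, val = mpso(sphere, dim=3)
```

In the paper's setting, the objective would instead be the step-response error between the original and reduced-order models.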
Abstract: In the process of upgrading enterprise information systems, whether the new systems succeed and whether their development is efficient depend on how the legacy systems are handled and utilized. We propose an evaluation system that comprehensively describes the capacity of legacy information systems in five aspects. A practical evaluation method for legacy systems is then described. Based on the evaluation result, we put forward four kinds of migration strategy: elimination, maintenance, modification and encapsulation. The methods and strategies play important roles in practice.
Abstract: This paper presents an analytical method for solving the governing parabolic partial differential equation (PDE) of consolidation for an inelastic porous medium (soil), taking into account the variation of the equation's coefficient under cyclic loading. Since soil skeleton parameters change under cyclic loads, the parabolic PDE acquires a variable coefficient, and classical theory cannot describe the consolidation phenomenon under such conditions. In this research, a method based on mapping the time domain onto a virtual time domain, together with the superposition rule, is employed to solve the consolidation of inelastic soils under cyclic conditions. Changes in the consolidation coefficient are incorporated into the solution by modifying the loading and unloading durations through the introduction of a virtual time. The mapping function is calculated from the results of the consolidation partial differential equation. Based on the superposition rule, a set of continuous static loads applied at specified times is used instead of the cyclic load. A set of laboratory consolidation tests under cyclic load, along with numerical calculations, was performed in order to verify the presented method. The numerical solutions and laboratory test results confirmed the accuracy of the presented method.
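As background, the classical one-dimensional consolidation equation, and one common way a virtual time restores a constant-coefficient form when the consolidation coefficient varies, can be sketched as follows (notation is conventional, not the paper's):

```latex
\frac{\partial u}{\partial t} = c_v(t)\, \frac{\partial^2 u}{\partial z^2},
\qquad
\tau(t) = \frac{1}{c_{v,\mathrm{ref}}} \int_0^t c_v(s)\, ds
\;\;\Longrightarrow\;\;
\frac{\partial u}{\partial \tau} = c_{v,\mathrm{ref}}\, \frac{\partial^2 u}{\partial z^2},
```

where u is the excess pore pressure and c_v the consolidation coefficient; since dτ/dt = c_v(t)/c_{v,ref}, the chain rule removes the variable coefficient. Stretching or shrinking the loading and unloading durations in this way is the sense in which a virtual time absorbs the coefficient changes.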
Abstract: This paper proposes an easy-to-use instruction hiding method to protect software from malicious reverse engineering attacks. Given a source program (the original) to be protected, the proposed method (1) takes a modified version of it (the fake) as input, (2) analyzes the differences in assembly code instructions between the original and the fake, and (3) introduces self-modification routines so that the fake instructions are restored to the correct (i.e., original) instructions just before they are executed and reverted to the fake ones after they are executed. The proposed method can add a certain amount of security to a program, since the fake instructions in the resultant program confuse attackers, and significant effort is required to discover and remove all the fake instructions and self-modification routines. The method is also easy to use, because all a user has to do is prepare a fake source code by modifying the original source code.
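The restore-before-execute, revert-after-execute cycle in steps (2)-(3) can be modeled in miniature; in a real deployment the routines patch machine code in memory, whereas here "instructions" are plain Python callables and the diff bookkeeping is the illustrative part (everything below is a toy assumption, not the paper's implementation):

```python
def make_self_modifying(original, fake):
    """original, fake: equal-length lists of 0-argument callables.
    Returns the stored program (holding the fake instructions at rest)
    and an executor that restores each original instruction just before
    running it, then re-installs the fake one afterwards."""
    # step (2): record which slots differ between original and fake
    diffs = {i: original[i] for i in range(len(original))
             if original[i] is not fake[i]}
    program = list(fake)      # what an attacker inspecting the binary would see

    def run():
        results = []
        for i in range(len(program)):
            if i in diffs:
                program[i] = diffs[i]      # self-modify: restore the original
            results.append(program[i]())   # execute the (now correct) instruction
            if i in diffs:
                program[i] = fake[i]       # revert to the fake after execution
        return results

    return program, run

original = [lambda: 1 + 1, lambda: 6 * 7, lambda: "ok"]
fake = [original[0], lambda: -1, lambda: "junk"]   # slots 1 and 2 are faked
program, run = make_self_modifying(original, fake)
```

At rest the stored program contains the fake instructions, yet executing it yields the original behavior, which mirrors the confusion the method aims to create.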
Abstract: Many studies have applied the Theory of Planned Behavior (TPB) to predicting health behaviors among unique populations. However, a new paradigm is emerging in which focus is directed to modification and expansion of the TPB model rather than utilization of the traditional theory. This review proposes new models modified from the Theory of Planned Behavior and suggests an appropriate study design that can be used to test the models within the physical activity and dietary practice domains among Type 2 diabetics in Kenya. The review was conducted by means of a literature search in the fields of nutrition behavior, health psychology and mixed methods, using predetermined key words. The results identify pre-intention and post-intention gaps within the TPB model that need to be filled. Additional psychosocial factors are proposed for inclusion in the TPB model to generate new models, whose efficacy is to be tested using a mixed-methods design.
Abstract: Bentonitic material from South Aswan, Egypt, was evaluated in terms of mineralogy and chemical composition as a bleaching clay for refining transformer oil, before and after acid activation and thermal treatment followed by acid leaching using HCl and H2SO4 for different contact times. The structural modification and refining power of the bentonite were investigated by means of X-ray diffraction and infrared spectroscopy. The results revealed that the activated bentonite can be used for refining transformer oil. Oil parameters such as dielectric strength, viscosity and flash point were improved. The dielectric breakdown strength of the used oil increased from 29 kV after treatment with unactivated bentonite to 74 kV after treatment with activated bentonite. The kinematic viscosity changed from 19 to 11 mm²/s after treatment with activated bentonite, and the flash point reached 149 °C.
Abstract: Cry j 1 is a causative substance of Japanese cedar pollinosis, which may worsen when Cry j 1 invades the lower respiratory tract. We observed airborne particles containing Cry j 1 by an immunofluorescence technique using a fluorescence microscope, and clarified that Cry j 1 exists as aggregates of airborne fine particles (< 1.1 μm) in the urban atmosphere. Airborne Cry j 1 may react with air pollutants and be denatured into a substance that aggravates Japanese cedar pollinosis. Therefore, we applied sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) to evaluate Cry j 1 after liquid-phase reaction with various air pollutants, and calculated the kinetic constants of Cry j 1 extracted from pollens collected at various sites and from airborne fine particles containing Cry j 1, using the surface plasmon resonance (SPR) method. The results suggest that Cry j 1 may be denatured by air pollutants during transport through the urban atmosphere.
Abstract: Regression testing is a maintenance activity applied to modified software to provide confidence that the changed parts are correct and that the unchanged parts have not been adversely affected by the modifications. Regression test selection techniques reduce the cost of regression testing by selecting a subset of an existing test suite to use in retesting modified programs. This paper presents the first general regression-test-selection technique, which is based on code analysis and allows selecting test cases for programs written in any programming language; it also handles incomplete programs. We also describe RTSDiff, a regression-test-selection system that implements the proposed technique. The results of the empirical studies performed on four programming languages (Java, C#, Cµ and Visual Basic) show that the technique is efficient and effective in reducing the size of the test suite.
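The core of code-based selection, keeping only the tests that exercise changed code, can be sketched language-agnostically (a simplified illustration under assumed data structures, not the RTSDiff implementation; the entity names below are hypothetical):

```python
def select_regression_tests(coverage, changed):
    """coverage: {test_name: set of covered entities (e.g. functions)};
    changed: set of entities whose code differs between the two versions.
    Returns, sorted, the subset of tests touching at least one changed
    entity; all other tests can safely be skipped under this model."""
    return sorted(t for t, covered in coverage.items() if covered & changed)

coverage = {
    "test_login":    {"auth.check", "db.query"},
    "test_report":   {"report.render", "db.query"},
    "test_settings": {"config.load"},
}
changed = {"db.query"}            # entity modified between versions
selected = select_regression_tests(coverage, changed)
```

Here only two of the three tests are re-run, which is the cost reduction the abstract describes.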
Abstract: In this paper, an efficient technique is proposed to manage the cache memory. The proposed technique introduces some modifications to the well-known set-associative mapping technique. This modification requires a small alteration in the structure of the cache memory and in the way it is referenced. The proposed alteration virtually increases the set size and consequently improves the performance and utilization of the cache memory. The current mapping techniques have accomplished good results; in practice, however, there are still cases in which cache lines are left empty and unused while two or more processes overwrite each other's lines instead of using those empty lines. The proposed algorithm aims at finding an efficient way to deal with this problem.
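One way to picture a "virtually" enlarged set is a cache that, on a miss in a full set, borrows an empty way from a partner set instead of evicting; the partner rule (set_index XOR 1) and all bookkeeping below are illustrative assumptions, not the paper's exact scheme:

```python
class FlexSetCache:
    """Set-associative cache sketch with virtual set enlargement:
    a block that misses in a full home set may occupy an empty way
    in a partner set rather than force an eviction."""
    def __init__(self, n_sets=4, ways=2):
        self.n_sets, self.ways = n_sets, ways
        self.sets = [[] for _ in range(n_sets)]  # each set holds block addresses, LRU first
        self.hits = self.misses = 0

    def access(self, addr):
        home = addr % self.n_sets
        partner = home ^ 1
        for s in (home, partner):                # probe home set, then partner
            if addr in self.sets[s]:
                self.hits += 1
                self.sets[s].remove(addr)
                self.sets[s].append(addr)        # move to MRU position
                return "hit"
        self.misses += 1
        if len(self.sets[home]) < self.ways:
            self.sets[home].append(addr)
        elif len(self.sets[partner]) < self.ways:
            self.sets[partner].append(addr)      # borrow an empty way: virtual enlargement
        else:
            self.sets[home].pop(0)               # evict the LRU block of the home set
            self.sets[home].append(addr)
        return "miss"

cache = FlexSetCache(n_sets=4, ways=2)
for addr in (0, 4, 8):      # all three map to set 0; a plain 2-way set would evict one
    cache.access(addr)
```

After this sequence all three blocks still hit, because the third one was parked in the otherwise empty partner set, exactly the empty-lines waste the abstract targets.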
Abstract: Dietary macro- and micronutrients, in their respective proportions and fractions, offer a practical tool for tailoring milk constituents, since the cells of the lactating mammary gland obtain about 80 % of the nutrients for milk synthesis from blood, reflecting the existence of an isotonic equilibrium between blood and milk. Directing milk biosynthesis through nutrient manipulation toward producing milk that not only retains its significance as a natural food but also prevents or dilutes the adverse effects of some diseases (such as the cardiovascular problems associated with saturated milk fat intake) has been an area of interest in the last decade. Nutritional modification and supplementation have been reported to enhance conjugated linoleic acid, fatty acid type and concentration, essential fatty acid concentration, and vitamins B12 and C, as well as Se, Cu, I and Fe, all of which help counter threats to human health and well-being. Synchronizing dietary nutrients to modify rumen dynamics toward the synthesis of nutrients, or their precursors, and to drive them toward formulated milk constituents presents a practical option. Formulating dietary constituents to design milk constituents will let farmers, consumers and investors know the real potential and profit margins associated with this enterprise. This article briefly recapitulates the ways and means of modifying milk constituents with an eye on human health and well-being, allowing milk to serve as more than a food item.
Abstract: Selecting the routes and assigning link flows in computer communication networks are extremely complex combinatorial optimization problems. Metaheuristics, such as genetic or simulated annealing algorithms, are widely applicable heuristic optimization strategies that have shown encouraging results for a large number of difficult combinatorial optimization problems. This paper considers the route selection and hence the flow assignment problem. A genetic algorithm and a simulated annealing algorithm are used to solve this problem. A new hybrid algorithm combining the genetic algorithm with simulated annealing is introduced, along with a modification of the genetic algorithm. Computational experiments with sample networks are reported. The results show that the proposed modified genetic algorithm is efficient in finding good solutions of the flow assignment problem compared with other techniques.
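One common way to hybridize a genetic algorithm with simulated annealing is to keep the GA's selection, crossover and mutation but accept each mutated child under the SA Metropolis rule; the sketch below illustrates that pattern on a toy bitstring objective (the encoding, operators, cooling schedule and objective are all illustrative assumptions, not the paper's algorithm):

```python
import math, random

def hybrid_ga_sa(cost, n_bits, pop_size=20, gens=100, t0=1.0, alpha=0.95, seed=1):
    """GA/SA hybrid sketch: tournament selection, one-point crossover,
    point mutation, then SA-style acceptance of the child against its
    better parent under a geometric cooling schedule."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    temp = t0
    for _ in range(gens):
        new_pop = []
        for _ in range(pop_size):
            p1 = min(rng.sample(pop, 2), key=cost)   # tournament selection
            p2 = min(rng.sample(pop, 2), key=cost)
            cut = rng.randrange(1, n_bits)           # one-point crossover
            child = p1[:cut] + p2[cut:]
            child[rng.randrange(n_bits)] ^= 1        # point mutation
            delta = cost(child) - cost(p1)
            # Metropolis acceptance: always take improvements, sometimes take worse
            if delta <= 0 or rng.random() < math.exp(-delta / temp):
                new_pop.append(child)
            else:
                new_pop.append(p1)
        pop = new_pop
        temp *= alpha                                # cooling schedule
    return min(pop, key=cost)

# toy stand-in for a flow-assignment objective: match a target route pattern
target = [1, 0, 1, 1, 0, 0, 1, 0]
cost = lambda bits: sum(b != t for b, t in zip(bits, target))
best = hybrid_ga_sa(cost, n_bits=8)
```

In the networking setting, the bitstring would encode a route/flow choice and the cost would be the network objective rather than a pattern-matching count.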
Abstract: This paper presents an interactive modeling system for uniform polyhedra using isomorphic graphs. In particular, Kepler-Poinsot solids are formed by modifications of the dodecahedron and the icosahedron.
Abstract: Tracing methods determine the contribution that power system sources make to supplying loads. The methods can be used to assess transmission prices, but also to recover the fixed transmission cost. This paper presents the influence that modification of the commons structure has on the specific price of transfer. The operator must make use of a few basic principles about allocation. Most tracing methods are based on the proportional sharing principle; in this paper the Kirschen method is used. In order to illustrate the method, a 25-bus test system is used, developed within the Electrical Power Engineering Department in Timisoara, Romania.
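The proportional sharing principle states that the power leaving a bus carries the same source mix, pro rata, as the power entering it; a generic trace over a lossless network can be sketched as follows (an illustration of the principle only, not the Kirschen commons implementation, and the three-bus data are made up):

```python
def trace_sources(inflows, injections, order):
    """Proportional-sharing trace on a lossless flow network.
    inflows: {bus: {from_bus: MW}} line flows into each bus;
    injections: {bus: MW} generation injected at each bus;
    order: buses in topological (upstream-to-downstream) order.
    Returns {bus: {source_bus: fraction}}: the share of the power
    transiting each bus that originates from each source."""
    mix = {}
    for bus in order:
        total = injections.get(bus, 0.0) + sum(inflows.get(bus, {}).values())
        shares = {}
        if injections.get(bus, 0.0) > 0:
            shares[bus] = injections[bus] / total    # locally injected share
        for up, mw in inflows.get(bus, {}).items():
            for src, frac in mix[up].items():
                # each inflow carries its upstream bus's source mix, pro rata
                shares[src] = shares.get(src, 0.0) + frac * mw / total
        mix[bus] = shares
    return mix

# two generators feed bus C: 60 MW injected at A, 40 MW injected at B
inflows = {"A": {}, "B": {}, "C": {"A": 60.0, "B": 40.0}}
injections = {"A": 60.0, "B": 40.0}
mix = trace_sources(inflows, injections, order=["A", "B", "C"])
```

The resulting fractions at each bus are what a usage-based transmission pricing scheme would then multiply by the line costs.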
Abstract: In this paper, we compare the performance of Turbo and Trellis coded optical code division multiple access (OCDMA) systems. The comparison of the two codes has been carried out by employing optical orthogonal codes (OOCs). The Bit Error Rate (BER) performances have been compared by varying the code weights of the address codes employed by the system. We have considered the effects of optical multiple access interference (OMAI), thermal noise and avalanche photodiode (APD) detector noise. The analysis has been carried out for the system with and without a double optical hard limiter (DHL). The simulation results show that a clearer and more distinct comparison can be drawn between the Trellis and Turbo coded systems at lower code weights of the optical orthogonal codes for a fixed number of users. The BER performance of the Turbo coded system is found to be better than that of the Trellis coded system for all code weights considered in the simulation. Nevertheless, the Trellis coded OCDMA system is found to be better than the uncoded OCDMA system. Trellis coded OCDMA can be used in systems where decoding time has to be kept low, bandwidth is limited and high reliability is not a crucial factor, as in local area networks; its hardware is also less complex than that of the Turbo coded system, and it can be used without significant modification of existing chipsets. Turbo coded OCDMA can, however, be employed in systems where high reliability is needed and bandwidth is not a limiting factor.