Abstract: As the data-driven economy grows faster than ever and the demand for energy rises with it, we face unprecedented challenges in improving the energy efficiency of data centers. Maximizing energy efficiency and minimizing cooling energy demand have become pervasive concerns for data centers. This paper investigates the overall energy consumption and the cooling-system energy efficiency of a data center in Finland as a case study. The power, cooling, and energy consumption characteristics and the operating conditions of the facilities are examined and analyzed. Potential energy- and cooling-saving opportunities are identified, and further suggestions for improving the performance of the cooling system are put forward. Results are presented as a comprehensive evaluation of both the energy performance and good practices for energy-efficient cooling operation of the data center. The utilization of an energy recovery concept for the cooling system is proposed. We conclude that even though the analyzed data center demonstrated relatively high energy efficiency, based on its power usage effectiveness value, there is still significant potential for energy saving in its cooling systems.
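The power usage effectiveness metric cited above has a standard definition: total facility energy divided by IT equipment energy. A minimal sketch, using hypothetical illustrative figures rather than values from the paper:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT energy.
    A value of 1.0 would mean all energy goes to IT equipment."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual figures for illustration only (not from the paper):
total = 5_000_000.0   # kWh consumed by the whole facility
it = 4_000_000.0      # kWh consumed by IT equipment alone
print(pue(total, it))  # 1.25 -> 0.25 kWh of overhead (largely cooling) per IT kWh
```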
Abstract: Earthquakes produce some of the most violent loading situations that a structure can be subjected to, and if a structure fails under these loads, human life is inevitably put at risk. One of the most common ways a structure fails under seismic loading is at the connections between structural elements. The research presented in this paper investigates interlock systems as a novel method for building structures. The main objective of this experimental study was to determine the dynamic characteristics and the seismic behaviour of the proposed structures compared to conventional structural systems during seismic motions. Results of this study indicate that the interlock mechanism of the panels influences the behaviour of the lateral load-resisting systems of the structures during earthquakes, contributing to better structural flexibility and easier maintenance.
Abstract: Grid computing is a form of distributed computing that involves coordinating and sharing computational power, data storage, and network resources across dynamic and geographically dispersed organizations. Scheduling onto the Grid is NP-complete, so there is no single best scheduling algorithm for all grid computing systems. An alternative is to select an appropriate scheduling algorithm for a given grid environment based on the characteristics of the tasks, machines, and network connectivity. Job and resource scheduling is one of the key research areas in grid computing. The goal of scheduling is to achieve the highest possible system throughput and to match application needs with the available computing resources. The motivation of this survey is to help novice researchers in the field of grid computing understand the concepts of scheduling easily and contribute to developing more efficient scheduling algorithms. This will benefit interested researchers in carrying out further work in this thrust area of research.
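As one concrete illustration of the kind of heuristic such surveys cover (a classic example, not an algorithm proposed by this survey), the min-min heuristic repeatedly assigns the task with the smallest earliest completion time to the machine that achieves it:

```python
def min_min(etc):
    """Min-min grid scheduling heuristic.
    etc[t][m] = estimated time to compute task t on machine m."""
    unscheduled = set(range(len(etc)))
    ready = [0.0] * len(etc[0])       # machine ready times
    assignment = {}
    while unscheduled:
        # For every unscheduled task, find its minimum completion time,
        # then schedule the task whose minimum is smallest overall.
        finish, task, machine = min(
            (ready[m] + etc[t][m], t, m)
            for t in unscheduled for m in range(len(ready)))
        assignment[task] = machine
        ready[machine] = finish
        unscheduled.remove(task)
    return assignment, max(ready)     # task-to-machine mapping and makespan

etc = [[3, 5],   # task 0 on machines 0 and 1
       [4, 1],   # task 1
       [2, 6]]   # task 2
mapping, makespan = min_min(etc)
print(mapping, makespan)  # {1: 1, 2: 0, 0: 0} 5.0
```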
Abstract: Before performing polymerase chain reactions (PCR), a feasible primer set is required. Many primer design methods have been proposed for designing a feasible primer set. However, the majority of these methods require a relatively long time to obtain an optimal solution, since large quantities of template DNA need to be analyzed. Furthermore, the designed primer sets usually do not provide a specific PCR product. In recent years, evolutionary computation has been applied to PCR primer design and has yielded promising results. In this paper, a particle swarm optimization (PSO) algorithm is proposed to solve primer design problems associated with providing a specific product for PCR experiments. A test set of the gene CYP1A1, associated with a heightened lung cancer risk, was analyzed, and a comparison of accuracy and running time with the genetic algorithm (GA) and memetic algorithm (MA) was performed. The comparison of results indicated that the proposed PSO method for primer design finds optimal or near-optimal primer sets and effective PCR products in a relatively short time.
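The PSO machinery referred to above can be sketched generically. The fitness function below is a toy stand-in (the real primer fitness would score melting temperature, GC content, specificity, and so on); the swarm parameters are common textbook choices, not the paper's:

```python
import random

def pso(fitness, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimization sketch (minimization).
    Standard velocity update with inertia w and coefficients c1, c2."""
    random.seed(0)
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for a primer fitness function:
best, best_val = pso(lambda p: sum(x * x for x in p), dim=3)
```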
Abstract: The deficit of power relative to electricity demand has reached almost 30% for consumers in the last few years. This is reflected in the continually increasing price of electricity; today the price for small industry is almost 110 Euro/MWh. The high price is an additional problem for owners during the economic crisis and is reflected in higher prices of goods.
The paper analyses the energy needs of a real agro-complex in Macedonia: a private winery with a capacity of over 2 million liters per year and its own grape and fruit fields. The existing power supply is from the grid through a 10/0.4 kV transformer. The geographical and meteorological conditions of the winery's location give the opportunity to include renewables as a power supply option for the winery complex.
After observation of the monthly energy needs of the winery, the base scenario is the existing power supply from the distribution grid. The electricity bill for small industry has three factors: electricity in high and low tariffs in kWh, and the power engaged for the technological process of production in kW. These three factors make up the total electricity bill, which comes to over 110 Euro/MWh, a price nearly competitive with the renewable option. On the other side, investment costs for renewables (especially photovoltaics (PV)) tend to decrease and are now near 1.5 Euro/W. This means that renewable PV can be a real power supply option for small industrial capacities (under 500 kW installed power).
Therefore, the other scenarios give the option with PV, and the last one also includes a wind option. The paper presents the following scenarios for the power supply of the winery:
• Base scenario of the existing conventional power supply from the grid
• Scenario with implementation of photovoltaic renewables
• Scenario with implementation of photovoltaic and wind power renewables
The total power installed in the winery is near 570 kW, but the maximum need is around 250 kW. At the end of the full paper, some of the results from the scenarios are presented. The paper also includes the environmental impacts of the renewable scenarios, as well as the financial needs for investments and the revenues from renewables.
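Using only the figures quoted above (1.5 Euro/W investment cost, 110 Euro/MWh grid price, 250 kW maximum need), a rough payback estimate can be sketched. The annual yield of 1,400 kWh per installed kW is an illustrative assumption for the region, not a figure from the paper:

```python
installed_kw = 250.0            # sized to the winery's maximum need (from the paper)
capex_eur_per_w = 1.5           # PV investment cost quoted in the paper
grid_price_eur_per_mwh = 110.0  # grid electricity price quoted in the paper

yield_kwh_per_kw_year = 1400.0  # ASSUMPTION: illustrative solar yield, not from the paper

capex = installed_kw * 1000 * capex_eur_per_w              # 375,000 Euro
annual_mwh = installed_kw * yield_kwh_per_kw_year / 1000   # 350 MWh/year
annual_savings = annual_mwh * grid_price_eur_per_mwh       # 38,500 Euro/year
payback_years = capex / annual_savings                     # roughly 9.7 years
print(round(payback_years, 1))
```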
Abstract: The mathematical modeling of storm surge in sea and coastal regions such as the South China Sea (SCS) and the Gulf of Thailand (GoT) is important for studying typhoon characteristics. Storm surge causes inundation at the lateral boundary in the coastal zones, particularly in the GoT and some parts of the SCS. Model simulations using the three-dimensional primitive equations with a high-resolution model are important for protecting local property and human life from typhoon surges. In the present study, mathematical modeling is used to simulate the typhoon-induced surges in three case studies of Typhoon Linda (1997). The results of the model simulations at the tide gauge stations can describe the characteristics of storm surges in the coastal zones.
Abstract: This paper deals with a delayed single-population model on time scales. With the assistance of coincidence degree theory, sufficient conditions for the existence of periodic solutions are obtained. Furthermore, improved estimates for the bounds of the periodic solutions are established.
Abstract: This paper presents the use of the Legendre pseudospectral method for the optimization of finite-thrust orbital transfers for spacecraft. In order to obtain an accurate solution, the system's dynamics equations were normalized through a nondimensionalization method. The Legendre pseudospectral method is based on interpolating functions on Legendre-Gauss-Lobatto (LGL) quadrature nodes. This is used to transform the optimal control problem into a constrained parameter optimization problem. The developed optimization algorithm can be used to solve similar optimization problems of spacecraft finite-thrust orbital transfer. The results of a numerical simulation verified the validity of the proposed optimization method. The simulation results reveal that the pseudospectral optimization method is a promising method for real-time trajectory optimization, providing good accuracy and fast convergence.
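The LGL nodes the method collocates on are the endpoints ±1 together with the roots of P'_N(x), the derivative of the Nth Legendre polynomial. A minimal sketch of computing them (a standalone illustration, not code from the paper):

```python
import numpy as np
from numpy.polynomial import legendre

def lgl_nodes(n):
    """Legendre-Gauss-Lobatto nodes: the n+1 points -1, roots of P_n'(x), +1."""
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0                      # P_n expressed in the Legendre basis
    interior = legendre.legroots(legendre.legder(coeffs))
    return np.concatenate(([-1.0], np.sort(interior), [1.0]))

nodes = lgl_nodes(4)
print(nodes)  # [-1, -sqrt(3/7), 0, sqrt(3/7), 1] ~ [-1, -0.6547, 0, 0.6547, 1]
```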
Abstract: In this paper, we propose a robust face relighting technique using spherical space properties. The proposed method is designed to reduce the effects of illumination on face recognition. Given a single 2D face image, we relight the face object by extracting the nine spherical harmonic bases and the face's spherical illumination coefficients. First, an internal training illumination database is generated by computing face albedo and face normals from 2D images under different lighting conditions. Based on the generated database, we analyze the target face pixels and compare them with the training bootstrap using pre-generated tiles. In this work, practical real-time processing speed and small image size were considered when designing the framework. In contrast to other works, our technique requires no 3D face models for the training process and takes a single 2D image as input. Experimental results on publicly available databases show that the proposed technique works well under severe lighting conditions, with significant improvements in face recognition rates.
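The nine spherical harmonic bases referred to above are the standard real spherical harmonics up to second order, evaluated on a unit surface normal (x, y, z). A minimal sketch with the usual normalization constants (an illustration of the standard bases, not the authors' implementation):

```python
def sh9(x, y, z):
    """First nine real spherical harmonic basis functions at unit normal (x, y, z)."""
    return [
        0.282095,                       # Y_0^0
        0.488603 * y,                   # Y_1^-1
        0.488603 * z,                   # Y_1^0
        0.488603 * x,                   # Y_1^1
        1.092548 * x * y,               # Y_2^-2
        1.092548 * y * z,               # Y_2^-1
        0.315392 * (3 * z * z - 1),     # Y_2^0
        1.092548 * x * z,               # Y_2^1
        0.546274 * (x * x - y * y),     # Y_2^2
    ]

# At the normal (0, 0, 1) only the constant and z-dependent terms survive:
basis = sh9(0.0, 0.0, 1.0)
```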
Abstract: The continuously growing needs of Internet applications that transmit massive amounts of data have led to the emergence of high-speed networks. Data transfer must take place without congestion, and hence feedback parameters must be transferred from the receiver end to the sender end so as to restrict the sending rate and thereby avoid congestion. Even though TCP tries to avoid congestion by restricting the sending rate and window size, it never informs the sender of the capacity of the data to be sent, and it halves the window size at the time of congestion, resulting in decreased throughput, low utilization of the bandwidth, and high delay. In this paper, the XCP protocol is used, and feedback parameters are calculated based on arrival rate, service rate, traffic rate, and queue size. The receiver thus informs the sender about the throughput, the capacity of the data to be sent, and the window size adjustment, so there is no drastic decrease in window size and the sending rate increases smoothly, giving a continuous flow of data without congestion. As a result, there is a maximum increase in throughput, high utilization of the bandwidth, and minimum delay. The results of the proposed work are presented as graphs of throughput, delay, and window size. Thus, in this paper, the XCP protocol is illustrated and the various parameters are thoroughly analyzed and presented.
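The kind of explicit feedback XCP computes can be sketched with the efficiency controller of the original XCP proposal (Katabi et al.); this is the standard formulation, not necessarily the exact feedback computation used in this paper:

```python
def xcp_aggregate_feedback(capacity_bps, input_rate_bps, avg_rtt_s, queue_bytes,
                           alpha=0.4, beta=0.226):
    """Aggregate feedback of the XCP efficiency controller:
    phi = alpha * rtt * spare_bandwidth - beta * persistent_queue.
    Positive phi tells senders to speed up; negative phi tells them to slow down."""
    spare = (capacity_bps - input_rate_bps) / 8.0   # spare bandwidth in bytes/s
    return alpha * avg_rtt_s * spare - beta * queue_bytes

# Underutilized link, empty queue -> positive feedback (senders speed up):
speed_up = xcp_aggregate_feedback(10e6, 6e6, 0.1, 0)
# Saturated link with a standing queue -> negative feedback (senders slow down):
slow_down = xcp_aggregate_feedback(10e6, 10e6, 0.1, 50_000)
print(speed_up > 0, slow_down < 0)  # True True
```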
Abstract: The measurement of the angular distribution for the elastic scattering of 16O, 14N, and 12C on 27Al has been carried out at an energy of 1.75 MeV/nucleon. The optical potential code SPIVAL was used in this work to analyze the experimental results. Good agreement between the experimental and theoretical results was obtained.
Abstract: A device analysis of the photoconductive semiconductor switch is carried out to investigate the distribution of the electric field and carrier concentrations, as well as the current density distribution. The operation of this device as a switch in the X band was then investigated. It is shown that, despite the geometric symmetry of the device, the switch current density in the on-state steady-state mode is distributed asymmetrically throughout the device.
Abstract: In this paper, we focus on the fusion of images from different sources using multiresolution wavelet transforms. Based on reviews of popular image fusion techniques used in data analysis, different pixel- and energy-based methods are evaluated experimentally. A novel architecture with a hybrid algorithm is proposed, which applies a pixel-based maximum selection rule to the low-frequency approximations and filter-mask-based fusion to the high-frequency details of the wavelet decomposition. The key feature of the hybrid architecture is the combination of the advantages of pixel- and region-based fusion in a single image, which can help the development of sophisticated algorithms enhancing the edges and structural details. A graphical user interface (GUI) is developed for image fusion to make the research outcomes available to the end user. To make the GUI capabilities usable for medical, industrial, and commercial activities without a MATLAB installation, a standalone executable application is also developed using the MATLAB Compiler Runtime.
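The hybrid rule described above can be sketched with a one-level Haar decomposition: maximum-absolute selection on the approximation subband and local-energy (3x3 mask) selection on the detail subbands. This is a simplified stand-in for the paper's wavelet and mask choices:

```python
import numpy as np

def haar2(x):
    """One-level 2D Haar transform: approximation plus three detail subbands."""
    p, q = x[0::2, 0::2], x[0::2, 1::2]
    r, s = x[1::2, 0::2], x[1::2, 1::2]
    return (p + q + r + s) / 4, ((p + q - r - s) / 4,
                                 (p - q + r - s) / 4,
                                 (p - q - r + s) / 4)

def ihaar2(a, details):
    """Invert haar2 exactly."""
    h, v, d = details
    x = np.empty((2 * a.shape[0], 2 * a.shape[1]))
    x[0::2, 0::2] = a + h + v + d
    x[0::2, 1::2] = a + h - v - d
    x[1::2, 0::2] = a - h + v - d
    x[1::2, 1::2] = a - h - v + d
    return x

def local_energy(x):
    """Energy of x summed over a 3x3 mask around each coefficient."""
    p = np.pad(x * x, 1, mode='edge')
    return sum(p[i:i + x.shape[0], j:j + x.shape[1]]
               for i in range(3) for j in range(3))

def hybrid_fuse(img1, img2):
    a1, det1 = haar2(img1)
    a2, det2 = haar2(img2)
    a = np.where(np.abs(a1) >= np.abs(a2), a1, a2)      # pixel-based max selection
    det = tuple(np.where(local_energy(d1) >= local_energy(d2), d1, d2)
                for d1, d2 in zip(det1, det2))          # mask-based energy selection
    return ihaar2(a, det)

# Fusing an image with a zero image should return the image itself:
img = np.arange(16.0).reshape(4, 4)
fused = hybrid_fuse(img, np.zeros((4, 4)))
```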
Abstract: An application framework provides a reusable design and implementation for a family of software systems. Frameworks are introduced to reduce the cost of a product line (i.e., a family of products that shares common features). Software testing is a time-consuming and costly ongoing activity during the application software development process. Generating reusable test cases for framework applications during the framework development stage, and providing and using those test cases to test part of a framework application whenever the framework is used, reduces application development time and cost considerably. This paper introduces the Framework Interface State Transition Tester (FIST2), a tool for automated unit testing of Java framework applications. During the framework development stage, given the formal descriptions of the framework hooks, the specifications of the methods of the framework's extensible classes, and the illegal behavior descriptions of the Framework Interface Classes (FICs), FIST2 generates unit-level test cases for the classes. At the framework application development stage, given the customized method specifications of the implemented FICs, FIST2 automates the use, execution, and evaluation of the already generated test cases to test the implemented FICs. The paper illustrates the use of the FIST2 tool by testing several applications that use the SalesPoint framework.
Abstract: The fast development of technologies, economic globalization, and many other external circumstances stimulate companies' competitiveness. One of the major trends in today's business is the shift to the exploitation of the Internet and the electronic environment for entrepreneurial needs. The latest research confirms that the e-environment provides a range of possibilities and opportunities for companies, especially for micro-, small-, and medium-sized companies, which have limited resources. The usage of e-tools raises the effectiveness and profitability of an organization, as well as its competitiveness.
In the electronic market, as in the classic one, there are factors that influence entrepreneurship, such as globalization, the development of new technology, price-sensitive consumers, the Internet, and new distribution and communication channels. As a result of e-environment development, e-commerce and e-marketing grow as well.
Objective of the paper: To describe and identify factors influencing a company's competitiveness in the e-environment.
Research methodology: The authors employ well-established quantitative and qualitative methods of research: grouping, analysis, statistical methods, factor analysis in the SPSS 20 environment, etc. The theoretical and methodological background of the research is formed using scientific research and publications, such as those from mass media and professional literature, statistical information from legal institutions, and information collected by the authors during the surveying process.
Research result: The authors detected and classified factors influencing competitiveness in the e-environment.
In this paper, the authors present their findings based on theoretical, scientific, and field research. The authors conducted research on e-environment utilization among Latvian enterprises.
Abstract: This paper argues that increased uncertainty may, in certain situations, actually encourage investment. Since earlier studies mostly base their arguments on the assumption of geometric Brownian motion, this study extends the assumption to alternative stochastic processes, such as mixed diffusion-jump, mean-reverting, and jump-amplitude processes. A general Monte Carlo simulation approach is developed to derive the optimal investment trigger for situations in which a closed-form solution cannot be readily obtained under an alternative process. The main finding is that the overall effect of uncertainty on investment is captured by the probability of investing, and the relationship between uncertainty and investment appears to be an inverted U-shaped curve. The implication is that uncertainty does not always discourage investment, even under several sources of uncertainty. Furthermore, high-risk projects are not always dominated by low-risk projects, because high-risk projects may have a positive realization effect that encourages investment.
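A minimal sketch of the Monte Carlo ingredient: simulate geometric Brownian motion paths of the project value and estimate the probability that the value reaches an investment trigger within the horizon. The trigger, drift, and volatility values below are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def prob_of_investing(v0, trigger, mu, sigma, years,
                      steps=252, n_paths=20_000, seed=0):
    """Fraction of simulated GBM paths whose project value hits the
    investment trigger at least once within the horizon."""
    rng = np.random.default_rng(seed)
    dt = years / steps
    z = rng.standard_normal((n_paths, steps))
    log_v = np.log(v0) + np.cumsum((mu - 0.5 * sigma**2) * dt
                                   + sigma * np.sqrt(dt) * z, axis=1)
    return float(np.mean(log_v.max(axis=1) >= np.log(trigger)))

# Illustrative parameters (assumed, not from the paper): a trigger 30%
# above the current value, 5% drift, 30% volatility, 2-year horizon.
p = prob_of_investing(v0=100.0, trigger=130.0, mu=0.05, sigma=0.3, years=2.0)
```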
Abstract: On-line (near-infrared) spectroscopy is widely used to support the operation of complex process systems. Information extracted from a spectral database can be used to estimate unmeasured product properties and to monitor the operation of the process. These techniques are based on looking for similar spectra with nearest-neighbor algorithms and distance-based searching methods. The search for nearest neighbors in the spectral space is computationally expensive: the cost grows with the number of points in the discrete spectrum and the number of samples in the database. To reduce the calculation time, some kind of indexing can be used. The main idea presented in this paper is to combine indexing and visualization techniques to reduce the computational requirements of estimation algorithms by providing a two-dimensional indexing that can also be used to visualize the structure of the spectral database. This 2D visualization of the spectral database not only supports the application of distance- and similarity-based techniques but also enables the utilization of advanced clustering and prediction algorithms based on the Delaunay tessellation of the mapped spectral space. This means the prediction need not use the high-dimensional space but can be based on the mapped space instead. The results illustrate that the proposed method is able to segment (cluster) spectral databases and detect outliers that are not suitable for instance-based learning algorithms.
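The 2D mapping step described above can be sketched as a projection of each discrete spectrum onto its two leading principal components; the mapped points could then be triangulated (e.g. with `scipy.spatial.Delaunay`) for the tessellation-based prediction. A numpy-only sketch with randomly generated stand-in spectra (the paper's actual mapping technique may differ):

```python
import numpy as np

def map_spectra_2d(spectra):
    """Project high-dimensional spectra onto their two leading principal
    components, giving a 2D index/visualization of the database."""
    x = spectra - spectra.mean(axis=0)          # center the database
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    return x @ vt[:2].T                         # n_samples x 2 coordinates

# Stand-in spectral database: 100 "spectra" with 500 discrete points each.
rng = np.random.default_rng(1)
spectra = rng.normal(size=(100, 500))
coords = map_spectra_2d(spectra)
# coords could now feed scipy.spatial.Delaunay(coords) for the tessellation.
```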
Abstract: This paper discusses the evolution of image retrieval techniques, focusing on their development, challenges, and trends. It highlights both the issues that have already been addressed and those that remain outstanding. The explosive growth of image data leads to the need for research and development in image retrieval. Image retrieval research is moving from keywords, to low-level features, and on to semantic features. The drive towards semantic features is due to the problem that keywords can be very subjective and time-consuming, while low-level features cannot always describe the high-level concepts in users' minds.
Abstract: Silver nanoparticles were prepared by the chemical reduction method. Silver nitrate was taken as the metal precursor and hydrazine hydrate as the reducing agent. The formation of the silver nanoparticles was monitored using UV-Vis absorption spectroscopy, which revealed their formation by exhibiting the typical surface plasmon absorption maximum at 418-420 nm. Comparison of theoretical (Mie light scattering theory) and experimental results showed that the diameter of the silver nanoparticles in colloidal solution is about 60 nm. We used energy-dispersive X-ray spectroscopy (EDX), X-ray diffraction (XRD), transmission electron microscopy (TEM), and UV-Vis spectroscopy to characterize the nanoparticles obtained. EDX of the nanoparticle dispersion confirmed the presence of the elemental silver signal; no peaks of other impurities were detected. The average size and morphology of the silver nanoparticles were determined by TEM. TEM photographs indicate that the nanopowders consist of well-dispersed agglomerates of grains with a narrow size distribution (40-60 nm), whereas the radii of the individual particles are between 10 and 20 nm. The synthesized nanoparticles were also structurally characterized by X-ray diffraction and transmission high-energy electron diffraction (HEED). The peaks in the XRD pattern are in good agreement with the standard values of the face-centered-cubic form of metallic silver (ICDD-JCPDS card no. 4-0787), and no peaks of other impurity crystalline phases were detected. Additionally, the antibacterial activity of the nanoparticle dispersion was measured by the Kirby-Bauer method. The silver nanoparticles showed high antimicrobial and bactericidal activity against Gram-negative bacteria such as Escherichia coli and Pseudomonas aeruginosa, as well as against Staphylococcus aureus, a highly methicillin-resistant Gram-positive strain.
Abstract: In the present work, a scheme for evaluating the transition probability in a quantum system is considered. It is based on the path integral representation of the transition probability amplitude and its evaluation by means of a saddle point method applied to part of the integration variables. The whole integration process is reduced to solving initial value problems for the Hamilton equations with a random initial phase point. The scheme is related to semiclassical initial value representation approaches that use a great number of trajectories. In contrast to them, from the total set of generated phase paths only one path for each initial coordinate value is selected in a Monte Carlo process.
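The core numerical ingredient, solving the Hamilton equations as an initial value problem from a randomly drawn phase point, can be sketched with a symplectic integrator. The harmonic oscillator Hamiltonian here is an illustrative stand-in for the system treated in the paper:

```python
import random, math

def hamilton_trajectory(q0, p0, t_final, dt=1e-3):
    """Integrate dq/dt = p, dp/dt = -q (harmonic oscillator, H = (p^2 + q^2)/2)
    with the symplectic Euler method from the initial phase point (q0, p0)."""
    q, p = q0, p0
    for _ in range(int(t_final / dt)):
        p -= q * dt          # kick:  dp/dt = -dH/dq = -q
        q += p * dt          # drift: dq/dt =  dH/dp =  p
    return q, p

random.seed(0)
# Random initial phase point, as in the scheme described above:
q0, p0 = random.gauss(0, 1), random.gauss(0, 1)
q, p = hamilton_trajectory(q0, p0, t_final=2 * math.pi)
# After one full period the trajectory returns close to its starting point.
```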