Abstract: In recent years, attention to 'Green Computing' has moved
research on energy-saving techniques from home computers to the
client and server machines of enterprise systems. Saving energy and
reducing carbon footprints are aspects of Green Computing, but
research in this direction encompasses more than energy savings and
carbon footprint reduction alone. This study provides a brief account
of Green Computing, with emphasis on current trends, challenges in
the field, and future directions of Green Computing.
Abstract: This paper presents a model of case-based corporate
memory named ReCaRo (REsource, CAse, ROle). The approach
suggested in ReCaRo decomposes the domain to be modelled into a set
of components. These components represent the objects developed by
the company during its activity. They are reused, sometimes with
adaptations, and are enriched with knowledge after each reuse.
ReCaRo builds the corporate memory on the basis of these
components. It models two types of knowledge: 1) Business
Knowledge, which constitutes the main knowledge capital of the
company and refers to its basic skills, thus directly to the
components, and 2) Experience Knowledge, which is specialised
knowledge representing the experience gained while handling
business knowledge. ReCaRo builds corporate memories which
are made up of five communicating memories.
Abstract: This paper introduces a new signal denoising method based on the Empirical Mode Decomposition (EMD) framework. The method is a fully data-driven approach. The noisy signal is decomposed adaptively into oscillatory components called Intrinsic Mode Functions (IMFs) by means of a process called sifting. EMD denoising involves filtering or thresholding each IMF and reconstructing the estimated signal from the processed IMFs. EMD can be combined with a filtering approach or with a nonlinear transformation; in this work, the Savitzky-Golay filter and soft thresholding are investigated. For thresholding, IMF samples below a threshold value are shrunk or scaled. The standard deviation of the noise is estimated for every IMF, and the threshold is derived for Gaussian white noise. The method is tested on simulated and real data and compared with averaging, median, and wavelet approaches.
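As a rough illustration of the thresholding step, the following Python sketch applies soft thresholding to a set of already-extracted IMFs. The MAD-based noise estimate and the universal threshold are standard stand-ins for the threshold derived in the paper, which is not reproduced here; `denoise_imfs` assumes the IMFs are given as equal-length lists.

```python
import math

def soft_threshold(samples, t):
    """Shrink each sample toward zero by t; samples with |x| < t become 0."""
    return [math.copysign(max(abs(x) - t, 0.0), x) for x in samples]

def mad_sigma(samples):
    """Robust noise-std estimate via the median absolute deviation."""
    s = sorted(abs(x) for x in samples)
    n = len(s)
    med = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    return med / 0.6745

def denoise_imfs(imfs):
    """Threshold each IMF, then sum the processed IMFs to reconstruct."""
    out = None
    for imf in imfs:
        sigma = mad_sigma(imf)
        # Universal threshold (an assumed stand-in for the paper's rule).
        t = sigma * math.sqrt(2.0 * math.log(len(imf)))
        d = soft_threshold(imf, t)
        out = d if out is None else [a + b for a, b in zip(out, d)]
    return out
```

In practice the first few IMFs carry most of the noise, so some EMD-denoising variants threshold only those and keep the remaining IMFs untouched.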
Abstract: In this paper, we consider a multi-user multiple-input
multiple-output (MU-MIMO) based cooperative reporting system for a
cognitive radio network. In the reporting network, the secondary
users forward the primary user data to a common fusion center
(FC). The FC is equipped with linear equalizers and an energy
detector to make the decision about the spectrum. The primary user
data are taken to be a digital video broadcasting - terrestrial
(DVB-T) signal. The sensing channel and the reporting channel are
assumed to be additive white Gaussian noise and independent
identically distributed Rayleigh fading, respectively. We analyze the
detection probability of the MU-MIMO system with linear equalizers
and arrive at a closed-form expression for the average detection
probability. The system performance is also investigated under
various MIMO scenarios through Monte Carlo simulations.
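A minimal Monte Carlo sketch of the energy detection step at the FC might look like the following Python; the sample count, threshold, and SNR values are illustrative assumptions, and the MU-MIMO channel and equalization stages are omitted.

```python
import math
import random

def energy_detect(samples, threshold):
    """Declare 'signal present' when average energy exceeds the threshold."""
    return sum(x * x for x in samples) / len(samples) > threshold

def monte_carlo_pd(snr_db, n_samples=64, trials=2000, threshold=1.5, seed=1):
    """Estimate detection probability over an AWGN channel (unit noise power)."""
    rng = random.Random(seed)
    amp = math.sqrt(10 ** (snr_db / 10))  # signal amplitude for the given SNR
    hits = 0
    for _ in range(trials):
        y = [amp + rng.gauss(0.0, 1.0) for _ in range(n_samples)]
        if energy_detect(y, threshold):
            hits += 1
    return hits / trials
```

At high SNR the estimated detection probability approaches one, while at very low SNR it falls toward the false-alarm probability set by the threshold.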
Abstract: Speckle noise affects all coherent imaging systems,
including medical ultrasound. In medical images, noise suppression
is a particularly delicate and difficult task: a tradeoff between noise
reduction and the preservation of actual image features has to be made
in a way that enhances the diagnostically relevant image content.
Even though wavelets have been extensively used for denoising
speckle images, we have found that denoising using contourlets gives
much better performance in terms of SNR, PSNR, MSE, variance, and
correlation coefficient. The objective of the paper is to determine the
number of levels of Laplacian pyramidal decomposition, the number
of directional decompositions to perform on each pyramidal level, and
the thresholding schemes that yield optimal despeckling of medical
ultrasound images in particular. The proposed method subjects the
log-transformed original ultrasound image to the contourlet transform
to obtain contourlet coefficients, then denoises the transformed
image by applying thresholding techniques to the individual
bandpass subbands using a Bayes shrinkage rule. We quantify the
achieved performance improvement.
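The Bayes shrinkage (BayesShrink) rule mentioned above is commonly computed as T = sigma_n^2 / sigma_x, where sigma_x is the signal standard deviation estimated from the observed subband variance. A minimal sketch, assuming zero-mean subband coefficients passed as a flat list:

```python
import math

def bayes_shrink_threshold(subband, sigma_noise):
    """BayesShrink threshold T = sigma_n^2 / sigma_x for one subband.

    sigma_x is estimated as sqrt(max(var(Y) - sigma_n^2, 0)); when the
    observed variance is all noise, the threshold kills the subband.
    """
    n = len(subband)
    var_y = sum(c * c for c in subband) / n  # coefficients assumed zero-mean
    sigma_x = math.sqrt(max(var_y - sigma_noise ** 2, 0.0))
    if sigma_x == 0.0:
        return max(abs(c) for c in subband)  # threshold everything away
    return sigma_noise ** 2 / sigma_x
```

The threshold would then be applied per directional subband (e.g. by soft thresholding), leaving the lowpass band untouched.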
Abstract: In recent years, the number of applications of multi-robot
systems (MRS) has been growing in various areas. In practice,
however, their design is often difficult: algorithms are proposed
against a theoretical background and do not account for the errors
and noise present in real conditions, so they are not usable in a real
environment. These errors are also clearly visible in the task of
target localization, where robots try to find and estimate the position
of a target using their sensors. Target localization is possible with a
single robot, but as has been shown, finding and localizing a target
with a group of mobile robots estimates the target position more
accurately and faster. Here, the accuracy of the target position
estimate is achieved by combining the cooperation of the MRS with
particle filtering. The advantage of using an MRS with particle
filtering was tested on the task of fixed-target localization by a group
of mobile robots.
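A one-shot particle-filter sketch of fixed-target localization from several robots' noisy range measurements is given below. The robot positions, the 10x10 search area, and the Gaussian range-noise model are illustrative assumptions, not the paper's experimental setup.

```python
import math
import random

def localize_target(robots, ranges, sigma, n_particles=3000, seed=7):
    """Estimate a fixed target position from noisy range readings taken by
    several robots, via importance weighting of uniformly spread particles."""
    rng = random.Random(seed)
    # Spread particle hypotheses uniformly over an assumed 10x10 search area.
    particles = [(rng.uniform(0, 10), rng.uniform(0, 10))
                 for _ in range(n_particles)]
    weights = []
    for (px, py) in particles:
        w = 1.0
        for (rx, ry), z in zip(robots, ranges):
            d = math.hypot(px - rx, py - ry)
            # Gaussian likelihood of the measured range given this hypothesis.
            w *= math.exp(-((z - d) ** 2) / (2 * sigma ** 2))
        weights.append(w)
    total = sum(weights)
    # The weighted mean of the particles is the position estimate.
    x = sum(w * p[0] for w, p in zip(weights, particles)) / total
    y = sum(w * p[1] for w, p in zip(weights, particles)) / total
    return x, y
```

With several robots observing from different bearings, the range likelihoods intersect in a single compact region, which is why the group estimate is tighter than what any single robot's measurement allows.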
Abstract: Cryptography, image watermarking, and e-banking are
filled with apparent oxymora and paradoxes. Random sequences are
used as keys to encrypt information, to serve as the watermark during
embedding, and to extract the watermark during detection. The keys
are also heavily used in 24x7x365 banking operations. A deterministic
random sequence is therefore very useful for online applications: to
obtain the same random sequence again, we need only supply the
same seed to the generator. Many researchers have used
Deterministic Random Number Generators (DRNGs) for
cryptographic applications and Pseudo-Noise (PN) random sequences
for watermarking. Even though PN sequences have some weaknesses
under attack, the research community has mostly used them in digital
watermarking. On the other hand, DRNGs have not been widely used
in online watermarking due to their computational complexity and
lack of robustness. We therefore propose a new design that generates
a DRNG using a Pi-series, making it useful for online cryptographic,
digital watermarking, and banking applications.
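The paper's Pi-series construction is not reproduced here; the following hash-counter sketch only illustrates the reproducibility property a DRNG needs for online use, namely that the same seed regenerates exactly the same bit sequence at the embedder and the detector.

```python
import hashlib

def drng_bits(seed, n_bits):
    """Hash-counter DRNG sketch (not the paper's Pi-series design).

    Hashing seed:counter with SHA-256 yields a deterministic bit stream:
    any party holding the seed can regenerate the identical key stream
    or watermark sequence on demand.
    """
    bits = []
    counter = 0
    while len(bits) < n_bits:
        block = hashlib.sha256(f"{seed}:{counter}".encode()).digest()
        for byte in block:
            for k in range(8):
                bits.append((byte >> k) & 1)
        counter += 1
    return bits[:n_bits]
```

Two runs with the same seed agree bit-for-bit, while different seeds produce unrelated streams, which is the behavior watermark detection and key regeneration both rely on.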
Abstract: Frequent patterns are patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a dataset. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages such as C, C++, and Java. The imperative paradigm is significantly inefficient when the itemset is large and the frequent pattern is long. We suggest a high-level declarative style of programming using a functional language. Our supposition is that the problem of frequent pattern discovery can be efficiently and concisely implemented via a functional paradigm, since pattern matching is a fundamental feature supported by most functional languages. Our frequent pattern mining implementation in the Haskell language confirms our hypothesis about the conciseness of the program, and performance studies on speed and memory usage support our intuition about the efficiency of functional languages.
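The paper's implementation is in Haskell; as a language-neutral illustration of the level-wise mining it describes, here is a compact Apriori-style sketch in Python (candidate generation without subset pruning, for brevity).

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Level-wise (Apriori-style) frequent itemset mining sketch.

    Returns a dict mapping each frequent itemset (frozenset) to its
    absolute support count.
    """
    transactions = [frozenset(t) for t in transactions]

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    items = {i for t in transactions for i in t}
    frequent = {}
    level = [frozenset([i]) for i in sorted(items)]
    k = 1
    while level:
        # Keep only candidates meeting the support threshold at this level.
        current = {s: c for s in level
                   if (c := support(s)) >= min_support}
        frequent.update(current)
        # Join frequent k-itemsets into (k+1)-itemset candidates.
        keys = list(current)
        level = list({a | b for a, b in combinations(keys, 2)
                      if len(a | b) == k + 1})
        k += 1
    return frequent
```

A production Apriori would also prune candidates whose k-subsets are infrequent before counting, which is where most of the efficiency comes from on long patterns.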
Abstract: The Norwegian Military Academy (Army) has
initiated a project whose main ambition is to explore possible avenues
for enhancing operational effectiveness through increased use of
simulation-based training and exercises. Within a cost/benefit
framework, we discuss opportunities and limitations of vertical and
horizontal integration of the existing tactical training system. Vertical
integration implies expanding the existing training system to span the
full range of training from tactical level (platoon, company) to
command and staff level (battalion, brigade). Horizontal integration
means including other domains than army tactics and staff
procedures in the training, such as military ethics, foreign languages,
leadership and decision making. We discuss each of the integration
options with respect to purpose and content of training, "best
practice" for organising and conducting simulation-based training,
and suggest how to evaluate training procedures and measure
learning outcomes. We conclude by giving guidelines towards further
explorative work and possible implementation.
Abstract: The mathematical modeling of storm surge in sea and
coastal regions such as the South China Sea (SCS) and the Gulf of
Thailand (GoT) is important for studying typhoon characteristics.
Storm surge causes inundation at the lateral boundary in the
coastal zones, particularly in the GoT and some parts of the SCS.
Model simulations of the three-dimensional primitive equations
with a high-resolution model are important for protecting local
property and human life from typhoon surges. In the present study,
mathematical modeling is used to simulate the typhoon-induced
surges in three case studies of Typhoon Linda (1997). The results
of the model simulations at the tide gauge stations can describe the
characteristics of storm surges in the coastal zones.
Abstract: An experiment was conducted to determine the effect
of the rearing system on growth performance, carcass yield,
hematological parameters, and feather pecking damage of Thai
indigenous chickens. Three hundred and sixty 1-d-old chicks were
randomly assigned to 2 treatments: an indoor treatment and an
outdoor access treatment. In the indoor treatment, the chickens were
housed in floor pens (5 birds/m2). In the outdoor access treatment,
the chickens were housed in a similar indoor house but also had
access to an outdoor grass paddock (1 bird/m2). All birds were
provided with the same diet and were raised to 16 wk of age. The
results showed that growth performance and carcass yield were not
different between treatments (P>0.05). Outdoor access had no effect
on hematological parameters (P>0.05). However, the feather pecking
damage of the chickens in the outdoor access treatment was lower
than that of the chickens in the indoor treatment (P<0.05).
Abstract: We analyze the effectiveness of different pseudo-noise (PN) and orthogonal sequences for encrypting speech signals in terms of perceptual intelligibility. A speech signal can be viewed as a sequence of correlated samples, and each sample as a sequence of bits. The residual intelligibility of the speech signal can be reduced by removing the correlation among the speech samples. PN sequences have random-like properties that help in reducing the correlation among speech samples. The mean square aperiodic auto-correlation (MSAAC) and the mean square aperiodic cross-correlation (MSACC) measures are used to test the randomness of the PN sequences. Results of the investigation show the effectiveness of large Kasami sequences for this purpose among the many PN sequences considered.
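As an illustration of the MSAAC measure, the sketch below generates a short +/-1 m-sequence with a Fibonacci LFSR and computes its mean square aperiodic auto-correlation. The tap positions and the exact MSAAC normalization (averaging the squared normalized aperiodic correlations over all nonzero shifts) are assumptions, since conventions vary in the literature.

```python
def lfsr_sequence(taps, state, length):
    """Generate a +/-1 PN sequence from a Fibonacci LFSR.

    taps are 0-based register indices XORed into the feedback bit;
    state is the initial bit list. Primitive taps give a maximal-length
    (period 2^n - 1) sequence.
    """
    seq = []
    reg = list(state)
    for _ in range(length):
        seq.append(1 if reg[-1] else -1)
        fb = 0
        for t in taps:
            fb ^= reg[t]
        reg = [fb] + reg[:-1]
    return seq

def msaac(seq):
    """Mean square aperiodic auto-correlation of a +/-1 sequence
    (assumed normalization: average of squared correlations over all
    nonzero shifts)."""
    n = len(seq)
    total = 0.0
    for k in range(1, n):
        r = sum(seq[i] * seq[i + k] for i in range(n - k)) / n
        total += r * r
    return total / (n - 1)
```

A good PN sequence has a small MSAAC; a constant sequence, whose shifted copies overlap almost perfectly, scores much worse than an m-sequence of the same length.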
Abstract: The objective of this research was to study the themes
of alcoholic beverage advertisements in Thailand after the enactment
of the 2008 Alcoholic Beverage Control Act. Data was collected
through textual analysis of 35 television and cinema advertisements
for alcoholic beverage products broadcast in Thailand. Nine themes
were identified: seven had already been used before the new law
(power, competition, friendship, Thainess, success, romance, and
safety), while two were new themes (volunteerism and conservation)
introduced as a form of adaptation and negotiation in response to the
new law.
Abstract: The optical properties of InGaN/GaN laser diodes based on quaternary alloy stopper and superlattice layers are numerically studied using the ISE TCAD (Integrated System Engineering) simulation program. Improvements in laser optical performance have been achieved by using a quaternary alloy in the superlattice layers of InGaN/GaN laser diodes. A lower threshold current of 18 mA and a higher output power and slope efficiency of 22 mW and 1.6 W/A, respectively, have been obtained at room temperature. The laser structure with an InAlGaN quaternary alloy as the electron blocking layer was found to provide better laser performance than the ternary AlxGa1-xN blocking layer.
Abstract: Imprecision is a long-standing problem in CAD design
and high-accuracy image-based reconstruction applications. The
visual hull, which is the closed silhouette-equivalent shape of the
objects of interest, is an important concept in image-based
reconstruction. We extend the domain-theoretic framework, a robust
geometric model that captures imprecision, to analyze the imprecision
in the output shape when the input vertices are given imprecisely.
Under this framework, we present an efficient algorithm to generate
the 2D partial visual hull, which represents the exact information of
the visual hull under only basic imprecision assumptions. We also
show how the visual-hull-from-polyhedra problem can be efficiently
solved in the context of imprecise input.
Abstract: The fast development of technologies, economic globalization, and many other external circumstances stimulate competitiveness among companies. One of the major trends in today's business is the shift toward exploiting the Internet and the electronic environment for entrepreneurial needs. Recent research confirms that the e-environment provides a range of possibilities and opportunities for companies, especially for micro-, small-, and medium-sized companies, which have limited resources. The use of e-tools raises the effectiveness and profitability of an organization, as well as its competitiveness.
In the electronic market, as in the classic one, there are factors that influence entrepreneurship, such as globalization, the development of new technology, price-sensitive consumers, the Internet, and new distribution and communication channels. As the e-environment develops, e-commerce and e-marketing grow as well.
Objective of the paper: To describe and identify factors influencing a company's competitiveness in the e-environment.
Research methodology: The authors employ well-established quantitative and qualitative research methods: grouping, analysis, statistical methods, and factor analysis in the SPSS 20 environment. The theoretical and methodological background of the research is formed using scientific research and publications, including those from mass media and professional literature, statistical information from legal institutions, and information collected by the authors during the survey process.
Research result: The authors detected and classified factors influencing competitiveness in the e-environment.
In this paper, the authors present findings based on theoretical, scientific, and field research, including a study of e-environment utilization among Latvian enterprises.
Abstract: In this article, a formal specification and verification of the Rabin public-key scheme in a formal proof system is presented. The idea is to combine the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. A major objective of this article is the presentation of the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Moreover, we explicate a computer-proven formalization of correctness as well as a computer verification of security properties using a straightforward computation model in Isabelle/HOL. The analysis uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness, as well as efficient computer proofs of security properties, is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. Consequently, we obtain reliable proofs with a minimal error rate that augment the used database, which provides a formal basis for further computer proof constructions in this area.
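The Rabin scheme being verified can be sketched in a few lines. This is the textbook construction (primes p, q both congruent to 3 mod 4, decryption via square roots mod p and q combined with the CRT), not the Isabelle/HOL formalization itself.

```python
def rabin_encrypt(m, n):
    """Rabin encryption: c = m^2 mod n, with public modulus n = p*q."""
    return (m * m) % n

def rabin_decrypt(c, p, q):
    """Return the four square roots of c mod n for primes p, q = 3 (mod 4).

    When p, q = 3 (mod 4), a square root mod p is c^((p+1)/4) mod p;
    the CRT then combines the roots mod p and mod q into roots mod n.
    """
    n = p * q
    mp = pow(c, (p + 1) // 4, p)   # square root of c modulo p
    mq = pow(c, (q + 1) // 4, q)   # square root of c modulo q
    yp = pow(p, -1, q)             # p^{-1} mod q  (Python 3.8+)
    yq = pow(q, -1, p)             # q^{-1} mod p
    r = (yq * q * mp + yp * p * mq) % n
    s = (yq * q * mp - yp * p * mq) % n
    return {r, n - r, s, n - s}
```

The decryption ambiguity (four candidate plaintexts) is exactly the kind of correctness subtlety that a formalization must state precisely: decryption returns a set containing the original message, not the message alone.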
Abstract: In this paper, an analytical approach for the free vibration
analysis of rectangular and circular membranes is presented. The
method is based on the wave approach: from the wave standpoint,
vibrations propagate, reflect, and transmit in a structure. Firstly, the
propagation and reflection matrices for rectangular and circular
membranes are derived. Then, these matrices are combined to provide
a concise and systematic approach to the free vibration analysis of
membranes. Subsequently, the eigenvalue problem for the free
vibration of a membrane is formulated and the equation for the
membrane natural frequencies is constructed. Finally, the
effectiveness of the approach is shown by comparing the results with
the existing classical solution.
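For the rectangular case, the classical natural frequencies that such a wave-based approach is typically compared against are omega_mn = c*pi*sqrt((m/a)^2 + (n/b)^2) for a fixed-edge membrane of side lengths a, b and wave speed c = sqrt(T/rho). A one-line check of that classical formula:

```python
import math

def rect_membrane_freq(m, n, a, b, c):
    """Classical natural angular frequency of mode (m, n) of a fixed-edge
    rectangular membrane with side lengths a, b and wave speed c."""
    return c * math.pi * math.sqrt((m / a) ** 2 + (n / b) ** 2)
```

For a unit square membrane with unit wave speed, the fundamental mode (1, 1) has angular frequency pi*sqrt(2), and frequencies grow with the mode indices.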
Abstract: The (sub-)optimal solution of the linear filtering problem
with correlated noises is considered. The special recursive form of
the class of filters and the criteria for selecting the best estimator are
the essential elements of the design method. The properties of the
proposed filter are studied. In particular, for Markovian observation
noise, the approximate filter becomes the optimal Gevers-Kailath
filter subject to a special choice of the parameter in the given class of
linear recursive filters.
Abstract: In this paper, for the first time, a two-dimensional
(2D) analytical drain current model for a sub-100 nm multi-layered
gate material engineered trapezoidal recessed channel (MLGMETRC)
MOSFET, a novel design, is presented and investigated using the
ATLAS and DEVEDIT device simulators, to mitigate the large gate
leakage and increased standby power consumption that arise due to
the continued scaling of SiO2-based gate dielectrics. The 2D
analytical model, based on a solution of Poisson's equation in
cylindrical coordinates utilizing the cylindrical approximation, has
been developed to evaluate the surface potential, electric field, drain
current, the switching metric ION/IOFF ratio, and the
transconductance for the proposed design. A good agreement
between the model predictions and device simulation results is
obtained, verifying the accuracy of the proposed analytical model.