Abstract: This study investigates the underlying causes of the customer security concerns that emerged when WHSmith transformed its physical system into a virtualized business model through NetSuite, a fully integrated software suite that supports this transformation. Modern organisations are moving away from traditional business models towards cloud-based models, which are expected to offer customers a better, more secure, and more innovative environment. Security is a vital issue in this transition, and designers of interactive systems often misunderstand, or even ignore, privacy, causing concerns for users. A content analysis approach was used to collect qualitative data from 120 online reviewers, including posts on Trustpilot. The results provide useful new insights into the nature and form of the security concerns of online users after using the WHSmith services offered through its website. The findings have theoretical as well as practical implications for the successful adoption of cloud computing in business-to-business models and similar systems.
Abstract: Cloud computing is a business model which provides
easier management of computing resources. Cloud users can request
a virtual machine and install and configure additional software as
needed. However, a user can also request a virtual appliance,
which offers a better solution for deploying an application much
faster, as it is a ready-built operating system image with the
necessary software installed and configured. Large numbers of
virtual appliances are available in different image formats. Users
can download available appliances from a public marketplace and
start using them. However, the information published about a
virtual appliance differs across providers, making it difficult to
choose the required appliance, as each is composed of a specific
OS with standard software versions. Moreover, even after choosing
an appliance from a provider, the user has no flexibility to
select their own set of software together with the required OS and
application. In this paper, we propose a reference architecture
for dynamically customizing virtual appliances and provisioning
them more easily. We also describe our experience integrating the
proposed architecture with a public marketplace and Mi-Cloud, a
cloud management software.
Abstract: Cloud computing can reduce the start-up expenses of implementing EHR (Electronic Health Records). However, many healthcare institutions have yet to adopt cloud computing due to the associated privacy and security issues. In this paper, we analyze the challenges and opportunities of implementing cloud computing in healthcare. We also analyze data from over 5,000 US hospitals that use telemedicine applications. This analysis helps to understand the importance of smartphones over desktop systems in different departments of healthcare institutions. The wide usage of smartphones and cloud computing allows ubiquitous and affordable access to health data by authorized persons, including patients and doctors. Cloud computing will prove beneficial to a majority of departments in healthcare. Through this analysis, we attempt to identify the healthcare departments that may benefit most from the implementation of cloud computing.
Abstract: Fractal-based digital image compression is a specific
technique in the field of color image compression. The method is
best suited for irregularly shaped image content such as snow,
clouds, flames of fire, and tree leaves, and depends on the fact
that parts of an image often resemble other parts of the same
image. The technique has drawn much attention in recent years
because of the very high compression ratio that can be achieved.
Hybrid schemes incorporating fractal compression and speed-up
techniques achieve higher compression ratios than pure fractal
compression. Fractal image compression is a lossy compression
method that exploits the self-similar nature of an image,
providing a high compression ratio, short encoding time, and a
fast decoding process. In this paper, fractal compression with a
quadtree and the DCT is proposed to compress color images. The
proposed hybrid scheme requires four phases. First, the image is
segmented and the Discrete Cosine Transform is applied to each
block of the segmented image. Second, the block values are scanned
in a zigzag manner so that the zero coefficients are grouped
together. Third, the resulting image is partitioned into fractals
by the quadtree approach. Fourth, the image is compressed using
the run-length encoding technique.
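The second and fourth phases can be sketched in a few lines. The following Python snippet is an illustrative sketch rather than the paper's implementation: it traverses a square block of DCT coefficients in zigzag order (so the trailing zeros cluster at the end of the sequence) and then run-length encodes the result.

```python
def zigzag(block):
    # Traverse an N x N block along its anti-diagonals in
    # alternating directions, as in JPEG, so low-frequency DCT
    # coefficients come first and zeros are grouped at the end.
    n = len(block)
    out = []
    for s in range(2 * n - 1):
        idx = range(max(0, s - n + 1), min(s, n - 1) + 1)
        diag = [(i, s - i) for i in idx]
        if s % 2 == 0:
            diag.reverse()  # alternate the traversal direction
        out.extend(block[i][j] for i, j in diag)
    return out

def rle(seq):
    # Run-length encode a sequence into (value, count) pairs,
    # which is compact when long zero runs trail the zigzag scan.
    runs = []
    for v in seq:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return [tuple(r) for r in runs]
```

For example, `rle(zigzag(block))` applied to a quantized DCT block typically ends with a single long zero run instead of many scattered zeros.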
Abstract: The Common Platform for Automated Programming
(CPAP) is defined in detail. Two versions of CPAP are described: a
cloud-based version (including a set of components for classic
programming and a set of components for combined programming), and
a Knowledge-Based Automated Software Engineering (KBASE) based
version (including a set of components for automated programming
and a set of components for ontology programming). Four KBASE
products (Module for Automated Programming of Robots, Intelligent
Product Manual, Intelligent Document Display, and Intelligent Form
Generator) are analyzed, and CPAP's contributions to automated
programming are presented.
Abstract: A solar receiver is designed for operation under
extremely uneven heat flux distribution, cyclic weather, and cloud
transient conditions, which can induce large thermal stresses and
even receiver failure. In this study, the effect of oil velocity
on the convection coefficient and the impact of wind velocity on
the local Nusselt number are analyzed by the Finite Volume Method.
The study gives an overview of numerical modeling using MATLAB as
an accurate, time-efficient, and economical way of analyzing heat
transfer trends over a stationary receiver tube for different
Reynolds numbers. The results reveal that when the oil velocity is
below 0.33 m/s, the convection coefficient is negligible at low
temperature. The numerical graphs indicate that when the oil
velocity increases up to 1.2 m/s, the heat convection coefficient
increases significantly. In fact, a reduction in oil velocity
causes a reduction in heat conduction through the glass envelope.
In addition, the local Nusselt number is reduced when the wind
blows toward the concave side of the collector, which has a
significant effect on reducing heat losses through the glass
envelope.
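As an illustration of the qualitative trend reported above (higher oil velocity giving a higher convection coefficient), the following Python sketch uses the classical Dittus-Boelter pipe-flow correlation, Nu = 0.023 Re^0.8 Pr^0.4 with h = Nu k / D. This is a textbook correlation, not the paper's Finite Volume model, and the fluid properties used below are assumed placeholder values for a thermal oil.

```python
def nusselt_dittus_boelter(re, pr):
    # Dittus-Boelter correlation for turbulent pipe flow (heating)
    return 0.023 * re ** 0.8 * pr ** 0.4

def convection_coefficient(velocity, diameter, rho, mu, k, cp):
    # velocity [m/s], diameter [m], rho [kg/m^3], mu [Pa s],
    # k [W/m K], cp [J/kg K]; returns h [W/m^2 K]
    re = rho * velocity * diameter / mu   # Reynolds number
    pr = cp * mu / k                      # Prandtl number
    nu = nusselt_dittus_boelter(re, pr)
    return nu * k / diameter              # h = Nu k / D

# Assumed placeholder oil properties in a 66 mm receiver tube:
h_low = convection_coefficient(0.33, 0.066, 800.0, 2e-3, 0.12, 2300.0)
h_high = convection_coefficient(1.2, 0.066, 800.0, 2e-3, 0.12, 2300.0)
```

With any fixed property set, `h_high > h_low`, matching the abstract's observation that h rises significantly as oil velocity increases from 0.33 m/s to 1.2 m/s.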
Abstract: This paper evaluates accrual-based scheduling for
the cloud in single- and multi-resource systems. Numerous
organizations benefit from cloud computing by hosting their
applications. The cloud model provides on-demand access to
computing with potentially unlimited resources. Scheduling is the
mapping of tasks to resources according to an optimality
criterion: it assigns tasks to virtual machines in sequence,
within adaptable time windows and under transaction logic
constraints. A good scheduling algorithm improves CPU utilization,
turnaround time, and throughput. In this paper, three real-time
cloud service scheduling algorithms for single and multiple
resources are investigated. Experimental results show that the
resource matching algorithm outperforms the benefit-first
scheduling, migration, and checkpoint algorithms for both single-
and multi-resource scheduling.
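The abstract does not detail the resource matching algorithm, but a minimal greedy best-fit matcher conveys the general idea of mapping tasks to resources: each task is placed on the virtual machine whose remaining capacity it fits most tightly. The Python sketch below is a generic illustration, not the algorithm evaluated in the paper.

```python
def best_fit(tasks, vms):
    # tasks: list of (name, demand); vms: dict vm -> capacity.
    # Greedy resource matching: place each task on the VM whose
    # remaining capacity it fits most tightly (smallest leftover).
    free = dict(vms)                  # vm -> remaining capacity
    placement = {}
    for task, demand in tasks:
        candidates = [v for v, c in free.items() if c >= demand]
        if not candidates:
            placement[task] = None    # no VM can host this task
            continue
        vm = min(candidates, key=lambda v: free[v] - demand)
        free[vm] -= demand
        placement[task] = vm
    return placement
```

Best-fit tends to keep large contiguous capacity available, which improves utilization relative to placing tasks arbitrarily.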
Abstract: Cloud computing refers to applications delivered as
services over the internet, and to the datacenters that provide
those services with hardware and systems software; the services
themselves were earlier referred to as Software as a Service
(SaaS). Scheduling is complicated by the lack of information about
job components (called tasks). In fact, for a large fraction of
jobs from the machine learning, bio-computing, and image
processing domains, it is possible to estimate the maximum time
required for a task in the job. This study focuses on trust-based
scheduling to improve cloud security by modifying the
Heterogeneous Earliest Finish Time (HEFT) algorithm. It also
proposes TR-HEFT (Trust Reputation HEFT), which is then compared
to Dynamic Load Scheduling.
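HEFT orders tasks by their upward rank: a task's average computation cost plus the maximum, over its successors, of the communication cost and the successor's rank. A minimal Python sketch of this ranking phase is shown below; it illustrates standard HEFT only, and does not model the trust/reputation weighting that TR-HEFT adds.

```python
def upward_rank(task, comp_cost, comm_cost, succs, memo=None):
    # rank_u(t) = w(t) + max over successors s of (c(t,s) + rank_u(s))
    if memo is None:
        memo = {}
    if task in memo:
        return memo[task]
    best = 0.0
    for s in succs.get(task, []):
        best = max(best,
                   comm_cost.get((task, s), 0.0)
                   + upward_rank(s, comp_cost, comm_cost, succs, memo))
    memo[task] = comp_cost[task] + best
    return memo[task]

def heft_order(tasks, comp_cost, comm_cost, succs):
    # HEFT schedules tasks in decreasing order of upward rank.
    memo = {}
    for t in tasks:
        upward_rank(t, comp_cost, comm_cost, succs, memo)
    return sorted(tasks, key=lambda t: -memo[t])
```

Each task is then assigned to the processor giving it the earliest finish time; a trust-based variant could additionally penalize untrusted processors during that assignment step.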
Abstract: E-retailing is the sale of goods and services over
the Internet, which has shrunk the entire world. Worldwide,
e-retailing is growing at an exponential rate in the Americas,
Europe, and Asia. However, e-retailing requires expensive
investment in hardware, software, and security systems. Cloud
computing technology is internet-based computing for the
management and delivery of applications and services. Cloud-based
e-retailing application models allow enterprises to lower their
costs through the effective implementation of e-retailing
activities. In this paper, we describe the concept of cloud
computing and present an architecture for cloud computing that
combines the features of e-retailing. In addition, we propose a
strategy for implementing cloud computing with e-retailing.
Finally, we explain the benefits of the architecture.
Abstract: Photovoltaic (PV) power generation systems, mainly
small scale, are rapidly being deployed in Jordan, yet their
impact on the grid has not been studied or analyzed. These systems
can cause many technical problems, such as reverse power flows and
voltage rises in distribution feeders, and real and reactive power
transients that affect the operation of the transmission system.
To fully understand and address these problems, extensive
research, simulation, and case studies are required. To this end,
this paper studies the cloud shadow effect on the power generation
of a ground-mounted PV system installed at the test field of the
Renewable Energy Center at the Applied Science University.
Abstract: Workflow scheduling is an important part of cloud
computing; based on different criteria, it determines cost,
execution time, and performance. A cloud workflow system is a
platform service facilitating the automation of distributed
applications based on the new cloud infrastructure. An aspect that
differentiates the cloud workflow system from others is its
market-oriented business model, an innovation that challenges
conventional workflow scheduling strategies. The Time and Cost
optimization algorithm for scheduling Hybrid Clouds (TCHC), which
decides which resources should be chartered from public providers,
is combined with a new De-De algorithm that ensures every instance
of single and multiple workflows runs without deadlocks. To this
end, two new concepts - the De-De Dodging Algorithm and the
Priority Based Decisive Algorithm - are combined with conventional
deadlock avoidance to propose one algorithm that maximizes active
(not just allocated) resource use and reduces makespan.
Abstract: Particle size distribution, the most important
characteristic of aerosols, is obtained through electrical
characterization techniques. The dynamics of charged nanoparticles
under the influence of the electric field in an Electrical
Mobility Spectrometer (EMS) reveal the size distribution of these
particles. The accuracy of this measurement is influenced by the
flow conditions, geometry, electric field, and particle charging
process, and therefore by the transfer function (transfer matrix)
of the instrument. In this work, a wire-cylinder corona charger
was designed, and the combined field-diffusion charging process of
injected poly-disperse aerosol particles was numerically simulated
as a prerequisite for the study of a multichannel EMS. The result,
a cloud of particles with a non-uniform charge distribution, was
introduced to the EMS. The flow pattern and electric field in the
EMS were simulated using Computational Fluid Dynamics (CFD) to
obtain the particle trajectories in the device and therefore to
calculate the signal reported by each electrometer. According to
the output signals (resulting from the bombardment of particles
and the transfer of their charges as currents), we propose a
modification to the size of the detecting rings (which are
connected to the electrometers) in order to evaluate particle size
distributions more accurately. Based on the capability of the
system to transfer information about the size distribution of the
injected particles, we propose a benchmark for assessing the
optimality of the design. This method applies the concept of Von
Neumann entropy and borrows the definition of entropy from
information theory (Shannon entropy) to measure optimality.
Entropy, according to Shannon, is the "average amount of
information contained in an event, sample, or character extracted
from a data stream". Evaluating the responses (signals) obtained
via various configurations of detecting rings, the configuration
that gave the best predictions of the size distributions of the
injected particles was the modified one; it was also the one with
the maximum entropy. A reasonable consistency was also observed
between the accuracy of the predictions and the entropy content of
each configuration. In this method, the entropy is extracted from
the transfer matrix of the instrument for each configuration.
Finally, various clouds of particles were introduced to the
simulations, and the predicted size distributions were compared to
the exact size distributions.
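The Shannon entropy used as the optimality benchmark follows directly from its definition, H = -Σ p log2 p. The Python sketch below normalizes a transfer matrix into a probability distribution and scores it; this is an illustrative simplification of the paper's procedure, and the matrices in the usage example are made up.

```python
import math

def shannon_entropy(probs):
    # H = -sum p log2 p: the average information per event.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def matrix_entropy(transfer_matrix):
    # Normalize the (non-negative) transfer matrix into a
    # probability distribution, then score it by its entropy.
    total = sum(sum(row) for row in transfer_matrix)
    probs = [v / total for row in transfer_matrix for v in row]
    return shannon_entropy(probs)

# A uniform response spreads information across channels (high
# entropy); a response concentrated in one channel carries less.
uniform = matrix_entropy([[1, 1], [1, 1]])
peaked = matrix_entropy([[4, 0], [0, 0]])
```

Under this benchmark, a ring configuration whose transfer matrix yields higher entropy transfers more information about the injected size distribution, consistent with the comparison reported above.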
Abstract: In this paper, a secure BioSemantic scheme is
presented to bridge biological/biomedical research problems and
computational solutions via semantic computing. Due to the
diversity of problems in various research fields, the semantic
capability description language (SCDL) plays an important role as
a common language and generic form for problem formalization. SCDL
is expected to be essential for future semantic and logical
computing in the biosemantic field, and we show several examples
of biomedical problems in this paper. Moreover, in the coming age
of cloud computing, the security problem is considered a crucial
issue, and we present a practical scheme to cope with it.
Abstract: Accurate forecasting of fresh produce demand is one
of the challenges faced by Small and Medium Enterprise (SME)
wholesalers. This paper is an attempt to understand the causes of
the high variability in SME wholesalers' demand, such as weather
and holidays; understanding the significance of such unidentified
factors may improve forecasting accuracy. The paper reviews the
current literature on the factors used to predict demand and the
existing forecasting techniques for short shelf-life products. It
then investigates a variety of possible internal and external
factors, some of which have not been used by other researchers in
the demand prediction process. The results presented in this paper
are further analysed using a number of techniques to minimize
noise in the data. For the analysis, past sales data (January 2009
to May 2014) from a UK-based SME wholesaler are used, and the
results presented are limited to the product 'Milk', focused on
cafés in Derby. A correlation analysis is performed to check the
dependence of the variability factors on actual demand, and a
further PCA is performed to understand the significance of the
factors identified by correlation. The PCA results suggest that
cloud cover, weather summary, and temperature are the most
significant factors for forecasting demand. The correlation of
these three factors increases for monthly demand and is more
stable than for weekly and daily demand.
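The correlation analysis above rests on the Pearson coefficient, r = cov(x, y) / (σx σy). A minimal pure-Python version is shown for illustration only; the study's sales and weather data are not reproduced here.

```python
def pearson(xs, ys):
    # Pearson correlation between two equal-length series:
    # covariance divided by the product of standard deviations.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

Applied to, say, monthly cloud cover against monthly milk demand, r near +1 or -1 indicates a strong linear dependence worth carrying into the PCA step, while r near 0 suggests the factor adds little.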
Abstract: This study suggests an estimation method of the
stress distribution of beam structures based on TLS (Terrestrial
Laser Scanning). The main components of the method are the
creation of lattices from the raw TLS data to satisfy a suitable
condition, and the application of CSSI (Cubic Smoothing Spline
Interpolation) for estimating the stress distribution. Estimating
the stress distribution of a structural member or a whole
structure is one of the important factors in the safety evaluation
of the structure. Existing sensors, including the ESG (electric
strain gauge) and LVDT (Linear Variable Differential Transformer),
are contact-type sensors that must be installed on the structural
members; they also have various limitations, such as the need for
separate space for network cables and the difficulty of access for
sensor installation in real buildings. To overcome these problems
inherent in contact-type sensors, the TLS system of LiDAR (light
detection and ranging), which can measure the displacement of a
target over a long range without the influence of the surrounding
environment and can also capture the whole shape of the structure,
has been applied to the field of structural health monitoring. The
important characteristic of TLS measurement is the formation of
point clouds consisting of many points with local coordinates.
Point clouds are not linearly distributed but dispersed; thus,
interpolation is vital for their analysis. Through the formation
of averaged lattices and CSSI on the raw data, a method that can
estimate the displacement of a simple beam was developed. The
developed method can also be extended to calculate the strain and,
finally, applied to estimate the stress distribution of a
structural member. To verify the validity of the method, a loading
test on a simple beam was conducted and measured by TLS. Through a
comparison of the estimated stress and the reference stress, the
validity of the method is confirmed.
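The final steps, from interpolated deflections to stress, follow Euler-Bernoulli beam theory: strain ε = -y w''(x) and stress σ = E ε, where w(x) is the deflection curve and y is the distance from the neutral axis. The Python sketch below replaces the CSSI step with simple central finite differences on an assumed uniform lattice, so it is a simplified stand-in for the paper's method.

```python
def second_derivative(xs, ws):
    # Approximate w''(x) by central finite differences on an
    # assumed uniform lattice of averaged point-cloud deflections.
    h = xs[1] - xs[0]
    return [(ws[i - 1] - 2 * ws[i] + ws[i + 1]) / h ** 2
            for i in range(1, len(ws) - 1)]

def bending_stress(xs, ws, y, modulus):
    # Euler-Bernoulli: strain = -y * w''(x), stress = E * strain,
    # evaluated at distance y from the neutral axis.
    return [-modulus * y * k for k in second_derivative(xs, ws)]
```

In the full method, the smoothing spline supplies the curvature analytically; the finite-difference version above only illustrates how a stress profile emerges from the lattice of deflections.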
Abstract: Cloud computing has emerged as a promising
direction for cost-efficient and reliable service delivery across
data communication networks. The dynamic location of service
facilities and the virtualization of hardware and software
elements are stressing communication networks and protocols,
especially when data centres are interconnected through the
internet. Although the computing aspects of cloud technologies
have been largely investigated, less attention has been devoted to
the networking aspects. Cloud computing has enabled elastic and
transparent access to infrastructure services without IT operating
overhead, and virtualization has been a key enabler for it. While
resource virtualization and service abstraction have been widely
investigated, networking in the cloud remains a difficult puzzle.
Even though the network plays a significant role in facilitating
hybrid cloud scenarios, it has not received much attention in the
research community until recently. We propose Network as a Service
(NaaS), which forms the basis for unifying public and private
clouds. In this paper, we identify various challenges in the
adoption of the hybrid cloud and discuss the design and
implementation of a cloud platform.
Abstract: Cyberspace has become a viable arena for budding
artists to share musical acts in digital form. The increasing
relevance of online communities has attracted scholars from
various fields to demonstrate their influence on social capital.
This paper extends this understanding to social capital among
Filipino music artists belonging to the SoundCloud Philippines
Facebook group.
The study makes use of qualitative data obtained from
key-informant interviews and participant observation of online and
physical encounters, analyzed using the case study approach.
SoundCloud Philippines has over seven hundred members and is
composed of Filipino singers, instrumentalists, composers,
arrangers, producers, multimedia artists, and event managers.
Group interactions are a mix of online encounters on Facebook and
SoundCloud and physical encounters through meet-ups and events.
The benefits reaped from the community are informational,
technical, instrumental, promotional, motivational, and social
support. Under the guidance of the online group administrators,
collaborative activities such as music productions, concerts, and
events transpire. Most conflicts and problems that arise are
resolved peacefully. Social capital in SoundCloud Philippines is
mobilized through recognition, respect, and reciprocity.
Abstract: Nowadays, cloud environments are becoming a necessity for companies; this technology gives them the opportunity to access their data anywhere and at any time. It also provides optimized and secured access to resources and more security for the data stored on the platform. However, some companies do not trust cloud providers, believing that providers can access and modify confidential data such as bank accounts. Many works have been done in this context; they conclude that the encryption methods applied by providers ensure confidentiality, but they overlook the fact that cloud providers can decrypt the confidential resources. The best solution is to apply some operations to the data before sending them to the cloud provider in order to make them unreadable, the principal idea being to allow users to protect their data with their own methods. In this paper, we demonstrate our approach and show that it is more efficient, in terms of execution time, than some existing methods. This work aims at enhancing the quality of service of providers and ensuring the trust of customers.
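The idea of making data unreadable before it leaves the client can be sketched as follows. This Python snippet derives a keystream from SHA-256 in counter mode and XORs it with the data; it is a toy illustration of client-side encryption, not the authors' method, and a production system should use a vetted cipher such as AES-GCM instead.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from SHA-256 in counter
    # mode. Toy construction for illustration only: use a vetted
    # AEAD cipher (e.g. AES-GCM) in real systems.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def protect(data: bytes, key: bytes) -> bytes:
    # XOR with the keystream on the client before uploading, so
    # the provider only ever stores unreadable bytes. Applying
    # protect() again with the same key recovers the plaintext.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))
```

Because the key never leaves the client, the provider can store and serve the ciphertext but cannot decrypt it, which is exactly the trust gap the abstract targets.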
Abstract: The security of cloud services is a central concern
of cloud service providers. In this paper, we review different
classifications of cloud attacks proposed by specialized
organizations; each agency has its own classification with
well-defined properties. The purpose is to present a high-level
classification of current research in cloud computing security,
organized around attack strategies and the corresponding defenses.
Abstract: Distributed applications deployed on LEO satellites
and ground stations require substantial communication between the
different members of a constellation to overcome the earth
coverage barriers imposed by GEOs. Applications running on LEO
constellations suffer from the earth line-of-sight blockage effect
and need adequate lab testing before being launched into space. We
propose a scalable cloud-based network simulation framework to
simulate the problems created by earth line-of-sight blockage. The
framework utilizes cloud IaaS virtual machines to simulate the
distributed software of LEO satellites and ground stations. A
factorial ANOVA statistical analysis is conducted to measure the
simulator's overhead on overall communication performance. The
results show a very low simulator communication overhead;
consequently, the simulation framework is proposed as a candidate
for testing LEO constellations with distributed software in the
lab before space launch.
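The earth line-of-sight blockage that the framework simulates reduces to a geometric test: two nodes can communicate only if the segment between them does not pass through the Earth. The following Python sketch (with assumed Earth-centred coordinates in km and a spherical Earth) illustrates the check; it is a generic geometric model, not the framework's implementation.

```python
import math

def line_of_sight(p1, p2, earth_radius=6371.0):
    # True if the segment between two node positions (km, with the
    # Earth's centre at the origin) stays outside the Earth sphere,
    # i.e. the nodes can see each other.
    dx = [b - a for a, b in zip(p1, p2)]
    denom = sum(d * d for d in dx)
    # Parameter t of the point on the segment closest to the origin
    t = 0.0 if denom == 0 else -sum(a * d for a, d in zip(p1, dx)) / denom
    t = max(0.0, min(1.0, t))           # clamp to the segment
    closest = [a + t * d for a, d in zip(p1, dx)]
    return math.dist(closest, (0.0, 0.0, 0.0)) > earth_radius
```

Evaluating this test for every satellite pair at each simulation step yields the time-varying connectivity graph that the distributed software under test must tolerate.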