Abstract: Grid computing is a group of clusters connected over
high-speed networks; it involves coordinating and sharing
computational power, data storage, and network resources
across dynamic and geographically dispersed locations. Resource
management and job scheduling are critical tasks in grid computing.
Resource selection becomes challenging due to the heterogeneity and
dynamic availability of resources. Job scheduling is an NP-complete
problem, and different heuristics may be used to reach an optimal or
near-optimal solution. This paper proposes a model for resource and
job scheduling in a dynamic grid environment. The main focus is to
maximize resource utilization and minimize the processing time of
jobs. The grid resource selection strategy is based on a Max Heap Tree
(MHT), which is well suited to large-scale applications; the root node
of the MHT is selected for job submission. A job-grouping concept is
used to maximize resource utilization when scheduling jobs in grid
computing. The proposed resource selection model and job-grouping
concept enhance the scalability, robustness, efficiency, and
load-balancing ability of the grid.
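The root-selection step of such an MHT can be sketched as follows; this is a minimal sketch, and the capacity metric, job cost, and resource names are illustrative assumptions, not details from the abstract.

```python
import heapq

def build_resource_heap(resources):
    # resources: list of (name, capacity) pairs; capacity is a
    # hypothetical score (e.g. MIPS) -- the abstract fixes no metric.
    # heapq is a min-heap, so capacities are negated to get a max-heap.
    heap = [(-capacity, name) for name, capacity in resources]
    heapq.heapify(heap)
    return heap

def submit_job(heap, job_cost):
    # Pop the root (most capable resource), charge the job against it,
    # and push it back so the heap property is restored.
    neg_capacity, name = heapq.heappop(heap)
    heapq.heappush(heap, (-(-neg_capacity - job_cost), name))
    return name

heap = build_resource_heap([("R1", 50), ("R2", 80), ("R3", 30)])
print(submit_job(heap, 10))  # R2: the root holds the highest capacity
```

Re-heapifying after each submission keeps the most capable resource at the root, which is what makes root-node selection a constant-time decision.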
Abstract: When acid is pumped into damaged reservoirs for
damage removal/stimulation, the acid flows unevenly into the
formation, preferentially traveling into highly permeable regions
rather than into less permeable ones, i.e., into the path of least
resistance. This can lead to poor zonal coverage and
hence warrants diversion to achieve effective placement of the acid.
Diversion is desirably a reversible technique of temporarily reducing
the permeability of high perm zones, thereby forcing the acid into
lower perm zones.
The uniqueness of each reservoir can pose several challenges to
engineers attempting to devise optimum and effective diversion
strategies. Diversion techniques include mechanical placement and/or
chemical diversion of treatment fluids, further sub-classified into ball
sealers, bridge plugs, packers, particulate diverters, viscous gels,
crosslinked gels, relative permeability modifiers (RPMs), foams,
and/or the use of placement techniques, such as coiled tubing (CT)
and the maximum pressure difference and injection rate (MAPDIR)
methodology.
It is not always realized that the effectiveness of diverters greatly
depends on reservoir properties, such as formation type, temperature,
reservoir permeability, heterogeneity, and physical well
characteristics (e.g., completion type, well deviation, length of
treatment interval, multiple intervals, etc.). This paper reviews the
mechanisms by which each variety of diverter functions and
discusses the effect of various reservoir properties on the efficiency
of diversion techniques. Guidelines are recommended to help
enhance productivity from zones of interest by choosing the best
methods of diversion while pumping an optimized amount of
treatment fluid. The success of an overall acid treatment often
depends on the effectiveness of the diverting agents.
Abstract: A model of a (4, 4) single-walled boron-nitride nanotube, as a representative of armchair boron-nitride nanotubes, was studied. First, the structure optimization was performed, and then the Nuclear Magnetic Resonance (NMR) parameters were calculated at the 11B and 15N nuclei using the Density Functional Theory (DFT) method. Evaluation of the resulting parameters reveals electrostatic heterogeneity along the nanotube, especially at its ends, while the nuclei within a given layer experience the same electrostatic environment. All calculations were carried out using the Gaussian 98 software package.
Abstract: Double heterogeneity, arising from randomly located pebbles in
the core and Coated Fuel Particles (CFPs) in the pebbles, is a specific
feature of pebble bed reactors and is usually neglected because it is
difficult to model with the capabilities of the MCNP code. In this study,
the characteristics of HTR-10, the Tsinghua University research reactor,
are used, and not only double heterogeneity but also truncated CFPs and
pebbles are considered. First, 8,335 CFPs are distributed randomly in a
pebble, and then the core of the reactor is filled with those pebbles and
graphite pebbles as moderator such that a 57:43 ratio of fuel to
moderator pebbles is established. Finally, four different core
configurations are modeled: a Simple Cubic (SC) structure with truncated
pebbles, an SC structure without truncated pebbles, a Simple Hexagonal
(SH) structure without truncated pebbles, and an SH structure with
truncated pebbles. Results such as the effective multiplication factor
(Keff) and the critical height are compared with available data.
Abstract: Heterogeneity has to be taken into account when
integrating a set of existing information sources into a distributed
information system; such systems are nowadays often based on Service-
Oriented Architectures (SOA). This is particularly applicable to
distributed services such as event monitoring, which are useful in the
context of Event Driven Architectures (EDA) and Complex Event
Processing (CEP). Web services deal with this heterogeneity at a
technical level but provide little support for event processing. Our
central thesis is that such a fully generic solution cannot provide
complete support for event monitoring; instead, source specific
semantics such as certain event types or support for certain event
monitoring techniques have to be taken into account. Our core result
is the design of a configurable event monitoring (Web) service that
allows us to trade genericity for the exploitation of source specific
characteristics. It thus delivers results for the areas of SOA, Web
services, CEP and EDA.
Abstract: The heterogeneity of solid waste characteristics, as well as the complex processes taking place within the landfill ecosystem, motivated the implementation of soft computing methodologies such as artificial neural networks (ANN), fuzzy logic (FL), and their combination. The present work uses a hybrid ANN-FL model that employs knowledge-based FL to describe the process qualitatively and implements the learning algorithm of ANN to optimize the model parameters. The model was developed to simulate and predict landfill gas production at a given time based on operational parameters. The experimental data used were compiled from a lab-scale experiment that involved various operating scenarios. The developed model was validated and statistically analyzed using the F-test, linear regression between actual and predicted data, and mean squared error measures. Overall, the simulated landfill gas production rates demonstrated reasonable agreement with the actual data. The discussion focuses on the effect of the size of the training datasets and the number of training epochs.
Abstract: Semantic Web services will enable the semiautomatic
and automatic annotation, advertisement, discovery,
selection, composition, and execution of inter-organization business
logic, turning the Internet into a common global platform where
organizations and individuals communicate with each other to carry
out various commercial activities and to provide value-added
services. There is a growing consensus that Web services alone will
not be sufficient to develop valuable solutions due to the degree of
heterogeneity, autonomy, and distribution of the Web. This paper
deals with two of the hottest R&D and technology areas currently
associated with the Web – Web services and the Semantic Web. It
presents the synergies that can be created between Web Services and
Semantic Web technologies to provide a new generation of e-services.
Abstract: The growing influence of service industries has
prompted greater attention being paid to service operations
management. However, service managers often have difficulty
articulating the veritable effects of their service innovation. In
particular, the performance evaluation of service innovation
generally involves uncertain and imprecise data. This paper presents a
2-tuple fuzzy linguistic computing approach to dealing with
heterogeneous information and information loss during the
integration of subjective evaluations. The proposed method, built on
a group decision-making scenario to assist business managers in
measuring the performance of service innovation, handles the
integration of heterogeneous information and effectively avoids
information loss.
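The core of 2-tuple fuzzy linguistic computing can be sketched as follows; this is a minimal sketch in the Herrera-Martinez style, and the five-label scale and function names are our assumptions, not taken from the paper.

```python
# Hypothetical linguistic term set s0..s4 for subjective assessments.
LABELS = ["very poor", "poor", "fair", "good", "very good"]

def to_two_tuple(beta):
    # Delta: map a value beta in [0, 4] to a pair
    # (label index i, symbolic translation alpha in [-0.5, 0.5)).
    i = min(max(int(round(beta)), 0), len(LABELS) - 1)
    return i, beta - i

def from_two_tuple(i, alpha):
    # Delta^-1: recover the underlying numeric value.
    return i + alpha

def aggregate(assessments):
    # Average the numeric values, then convert back to a 2-tuple;
    # carrying alpha along is what avoids information loss.
    mean = sum(from_two_tuple(i, a) for i, a in assessments) / len(assessments)
    return to_two_tuple(mean)

i, alpha = aggregate([(3, 0.0), (4, -0.2), (2, 0.4)])
print(LABELS[i], round(alpha, 2))
```

Because the symbolic translation alpha is kept alongside the label, the aggregated result remains in the original linguistic domain without rounding information away.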
Abstract: Over the past decade, mobile technology has undergone a
revolution that will ultimately change the way we communicate. All
these technologies have a common denominator: the exploitation of
computer information systems. However, their operation can be tedious
because of problems with heterogeneous data sources. To overcome
these problems, we propose adding an extra layer that interfaces
management and supervision applications with the different data
sources. This layer is materialized by a mediator between the various
host applications and the information systems, which frequently use
hierarchical and relational models, such that the heterogeneity is
completely transparent to the VoIP platform.
Abstract: The service sector continues to grow and the percentage
of GDP accounted for by service industries keeps increasing. The
growth and importance of service to an economy is not just a
phenomenon of advanced economies; services now account for a
majority of world gross domestic product. However, the performance
evaluation of new service development generally involves uncertain
and imprecise data. This paper presents a 2-tuple fuzzy linguistic
computing approach to dealing with heterogeneous information and
information loss during the integration of subjective evaluations.
The proposed method, built on a group decision-making scenario to
assist business managers in measuring the performance of new service
development, handles the integration of heterogeneous information
and effectively avoids information loss.
Abstract: Workload and resource management are two essential functions provided at the service level of the grid software infrastructure. To improve the global throughput of these software environments, workloads have to be evenly scheduled among the available resources. To realize this goal, several load balancing strategies and algorithms have been proposed. Most strategies were developed with a homogeneous set of sites in mind, linked by homogeneous and fast networks. For computational grids, however, we must address new issues, namely heterogeneity, scalability, and adaptability. In this paper, we propose a layered algorithm that achieves dynamic load balancing in grid computing. Based on a tree model, our algorithm presents the following main features: (i) it is layered; (ii) it supports heterogeneity and scalability; and (iii) it is totally independent of any physical architecture of a grid.
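The tree-based balancing idea can be sketched as follows; the proportional redistribution rule and the node layout are our assumptions, since the abstract specifies only a layered, architecture-independent tree model.

```python
# Leaves model computing sites; inner nodes are logical layers
# (e.g. clusters, then the whole grid).
def collect_leaves(node):
    if not node["children"]:
        return [node]
    return [leaf for child in node["children"]
            for leaf in collect_leaves(child)]

def balance(node):
    # Redistribute the subtree's total load among its sites in
    # proportion to each site's capacity (assumed heterogeneity model).
    leaves = collect_leaves(node)
    total_load = sum(leaf["load"] for leaf in leaves)
    total_cap = sum(leaf["capacity"] for leaf in leaves)
    for leaf in leaves:
        leaf["load"] = total_load * leaf["capacity"] / total_cap

site_a = {"capacity": 2, "load": 9, "children": []}
site_b = {"capacity": 1, "load": 0, "children": []}
cluster = {"capacity": 0, "load": 0, "children": [site_a, site_b]}
balance(cluster)
print(site_a["load"], site_b["load"])  # 6.0 3.0
```

Calling balance at an inner node rebalances only that layer's subtree, which is what makes a layered scheme independent of the grid's physical architecture.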
Abstract: Zero-inflated models are commonly used in modeling
count data with excess zeros, where the excess zeros may be
structural or may occur by chance. Such data are commonly found in
various disciplines, such as finance, insurance, biomedicine,
econometrics, ecology, and the health sciences, including sex and
dental health epidemiology. The most popular zero-inflated models
used by many researchers are the zero-inflated Poisson and
zero-inflated negative binomial models. In addition, the
zero-inflated generalized Poisson and zero-inflated double Poisson
models are also discussed in some of the literature. Recently, the
zero-inflated inverse trinomial and zero-inflated strict arcsine
models have been advocated and shown to serve as alternative models
for overdispersed count data caused by excessive zeros and
unobserved heterogeneity. The purpose of this paper is to review
some related literature and provide a variety of examples from
different disciplines of the application of zero-inflated models.
Different model selection methods used in model comparison are also
discussed.
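The zero-inflated Poisson model mentioned above has a simple closed-form probability mass function; a minimal sketch (the parameter names are ours):

```python
from math import exp, factorial

def zip_pmf(k, lam, pi):
    # Zero-inflated Poisson probability mass:
    #   P(K = 0) = pi + (1 - pi) * exp(-lam)
    #   P(K = k) = (1 - pi) * lam**k * exp(-lam) / k!,  for k >= 1
    # pi is the structural-zero probability, lam the Poisson rate.
    poisson = lam ** k * exp(-lam) / factorial(k)
    return pi + (1 - pi) * poisson if k == 0 else (1 - pi) * poisson

# Excess zeros: with pi = 0.3 the zero probability far exceeds that
# of a plain Poisson(2) model (exp(-2) is about 0.135).
print(zip_pmf(0, 2.0, 0.3))
```

The mixture structure makes the extra mass at zero explicit: a fraction pi of observations are structural zeros, while the rest follow an ordinary Poisson law.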
Abstract: After the accounting scandals and the financial crisis, regulators have stressed the need for more financial experts on boards. Several studies conducted in countries with developed capital markets report positive effects of board financial competencies. As each country offers a different context and specific institutional factors, this paper addresses the subject in the context of Romania. The Romanian capital market offers an interesting research field because of the heterogeneity of listed firms. After analyzing board members' education based on public information posted on listed companies' websites and in their annual reports, we found a positive association between the proportion of board members holding a postgraduate degree in a financial field and market-based performance measured by Tobin's Q. We also found that the proportion of board members holding degrees in financial fields is higher in bigger firms and in firms with more concentrated ownership.
Abstract: Mobile devices, which increasingly pervade our everyday
lives, have created a new paradigm in which they interconnect,
interact, and collaborate with each other. This network can be used
for flexible and secure coordinated sharing. On the other hand, Grid
computing provides dependable, consistent, pervasive, and
inexpensive access to high-end computational capabilities. In this
paper, efforts are made to map the concepts of the Grid onto Ad-Hoc
networks, because both exhibit similar characteristics, such as
scalability, dynamism, and heterogeneity. In this context, we
propose the "Mobile Ad-Hoc Services Grid" (MASGRID).
Abstract: Wireless LAN (WLAN) access in public hotspot areas
has become popular in recent years. Since more and more multimedia
information is available in the Internet, there is an increasing demand
for accessing multimedia information through WLAN hotspots.
Currently, the bandwidth offered by an IEEE 802.11 WLAN cannot
support many simultaneous real-time video accesses. A possible way to
increase the offered bandwidth in a hotspot is the use of multiple access
points (APs). However, a mobile station is usually connected to the
WLAN AP with the strongest received signal strength indicator (RSSI).
The total consumed bandwidth cannot be fairly allocated among those
APs. In this paper, we will propose an effective load-balancing scheme
via the support of the IAPP and SNMP in APs. The proposed scheme is
an open solution and doesn't need any changes in either wireless
stations or APs. This makes load balancing possible in WLAN hotspots,
where a variety of heterogeneous mobile devices are employed.
Abstract: If an organization like Mellat Bank wants to identify its
customer market completely in order to reach its specified goals, it
can segment the market to offer the right product package to the
right segment. Our objective is to offer a segmentation model for the
Iranian banking market from Mellat Bank's point of view. The
methodology of this project combines "segmentation on the basis of
four-part quality variables" and "segmentation on the basis of
differences in means". The required data are gathered from E-Systems
and the researcher's personal observations. The research recommends
that the organization first form a four-dimensional matrix with 756
segments using four variables, namely value-based, behavioral,
activity style, and activity level; second, calculate the mean profit
for every cell of the matrix at two distinct work levels (α1: normal
condition and α2: high-pressure condition) and compare the segments
by checking two conditions, namely (1) the homogeneity of every
segment with its sub-segments and (2) its heterogeneity with other
segments, so that the necessary segmentation can be carried out.
Finally, the last recommendation (further explained by an operational
example and a feedback algorithm) is to test and update the model
continually because of the dynamic environment, technology, and
banking system.
Abstract: Recurrent event data are a special type of multivariate
survival data. Dynamic models and frailty models are two approaches
for dealing with this kind of data. A comparison between these two
models is studied using the empirical standard deviation of the
standardized martingale residual processes as a way of assessing the
fit of the two models, based on the Aalen additive regression model.
We found that both approaches take heterogeneity into account and
produce residual standard deviations close to each other, both in the
simulation study and in the real data set.
Abstract: A scalable QoS aware multicast deployment in
DiffServ networks has become an important research dimension in
recent years. Although multicasting and differentiated services are
two complementary technologies, the integration of the two
technologies is a non-trivial task due to architectural conflicts
between them. A popular solution proposed is to extend the
functionality of the DiffServ components to support multicasting. In
this paper, we propose an algorithm to construct an efficient
QoS-driven multicast tree, taking into account the available bandwidth per
service class. We also present an efficient way to provision the
limited available bandwidth for supporting heterogeneous users. The
proposed mechanism is evaluated using simulated tests. The
simulation results reveal that our algorithm can effectively minimize
bandwidth use and transmission cost.
Abstract: The financial crisis has decreased the opportunities of
small businesses to acquire financing through conventional financial
actors, such as commercial banks. This credit constraint is partly the
reason for the emergence of new alternatives of financing, in addition
to the growing opportunities for communication and secure
financial transfers over the Internet. One of the most interesting
avenues for financing is termed "crowdfunding". As the term suggests,
crowdfunding is an appeal to prospective customers and investors to
form a crowd that will finance projects that would otherwise find it
hard to generate support through the most common financial actors.
In this paper, crowdfunding is divided into different models: the
threshold model, the microfinance model, the micro loan model and
the equity model. All these models add to the financial possibilities of
emerging entrepreneurs.