Abstract: Solar power plants (SPPs) have shown promising results in
providing various functions to meet industrial expectations by
deploying ad-hoc networks of lightweight, battery-powered sensor
nodes. In particular, an algorithm is strongly required to deliver
sensing data from the end nodes of solar power plants to the sink
node on time. In this paper, based on the above observation, we
propose an IEEE 802.15.4-based self-routing scheme for solar power
plants. The proposed Beacon-based Priority Routing Algorithm (BPRA)
utilizes beacon periods to send messages with embedded high-priority
data and thus provides a high quality of service (QoS) under the
given criteria. The performance measures are packet throughput,
delivery ratio, latency, and total energy consumption. Simulation
results under the TinyOS Simulator (TOSSIM) show that the proposed
scheme outperforms conventional Ad hoc On-Demand Distance Vector
(AODV) routing in solar power plants.
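The abstract does not detail BPRA's internals; as a rough
illustration of the general idea it names (serving high-priority
sensing data first within a limited beacon period), a minimal sketch
follows. All names, the priority encoding, and the slot-capacity
parameter are hypothetical, not taken from the paper:

```python
import heapq

def fill_beacon_period(packets, slots):
    """Pick up to `slots` packets for the beacon period, highest
    priority first.

    `packets` is a list of (priority, payload) pairs, where a smaller
    number means higher priority (0 = urgent sensing data).
    """
    heap = list(packets)
    heapq.heapify(heap)  # min-heap keyed on the priority value
    return [heapq.heappop(heap)[1] for _ in range(min(slots, len(heap)))]

# Example: two beacon slots, urgent data goes out first.
queue = [(2, "status"), (0, "fault-alarm"), (1, "power-reading")]
print(fill_beacon_period(queue, 2))  # ['fault-alarm', 'power-reading']
```

In a real beacon-enabled IEEE 802.15.4 schedule the slot budget would
come from the superframe configuration rather than a constant.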
Abstract: Climate change has severe effects on natural
habitats, especially wetlands. These challenges require adapting
their management to the probable effects of climate change.
Necessary changes in land management were compiled for a Hungarian
area that is both a national park and a Natura 2000 SAC and SCI
site, with the aim of increasing resilience and reducing
vulnerability. Several factors, such as ecological aspects, nature
conservation and climatic adaptation, should be combined with social
and economic factors when developing climate-change-adapted
management of vulnerable wetlands. Planning of adaptive management
should be determined by a priority order of conservation aims and an
evaluation of factors at the chosen planning unit. Mowing
techniques, frequency and exact dates should be considered, as
should grazing species and their breeds, due to their different
grazing, group-forming and trampling habits. Integrating landscape
history and historical land development into the planning process is
essential.
Abstract: For a spatiotemporal database management system,
the I/O cost of queries and other operations is an important
performance criterion. To optimize this cost, intensive research on
designing robust index structures has been conducted in the past
decade. Beyond these major considerations, other design issues still
deserve attention due to their direct impact on I/O cost. In
particular, an efficient buffer management strategy plays a key role
in reducing redundant disk accesses. In this paper, we propose an
efficient buffer strategy, named MONPAR, for a spatiotemporal
database index structure, specifically one indexing objects moving
over a road network. The proposed strategy is based on the data type
(i.e., spatiotemporal data) and the structure of the index. For the
experimental evaluation, we set up a simulation environment that
counts the number of disk accesses while executing a number of
spatiotemporal range queries over the index. We repeated the
simulations with query sets of different distributions, such as
uniform and skewed query distributions. Based on a comparison of our
strategy with well-known page-replacement techniques, such as
LRU-based and priority-based buffers, we conclude that MONPAR
outperforms its competitors for small and medium buffer sizes under
all tested query distributions.
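MONPAR itself is not specified in the abstract, but the LRU baseline
it is compared against, and the miss-counting methodology described
(each buffer miss costs one disk access), can be sketched as
follows. The page-request trace and buffer size are illustrative:

```python
from collections import OrderedDict

def lru_disk_accesses(page_requests, buffer_size):
    """Replay a page-request trace through an LRU buffer and count
    misses; each miss corresponds to one disk access."""
    buffer = OrderedDict()  # insertion/access order tracks recency
    misses = 0
    for page in page_requests:
        if page in buffer:
            buffer.move_to_end(page)        # hit: mark most recently used
        else:
            misses += 1                     # miss: fetch page from disk
            if len(buffer) >= buffer_size:
                buffer.popitem(last=False)  # evict least recently used
            buffer[page] = True
    return misses

trace = [1, 2, 3, 1, 4, 5]  # index pages touched by range queries
print(lru_disk_accesses(trace, 3))  # → 5
```

A priority-based buffer would differ only in the eviction rule,
choosing a victim by an assigned page priority instead of recency.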
Abstract: In the current context of globalization, accountability has become a key subject of real interest for both national and international business, due to the need for comparability and transparency of the economic situation, so we can speak of the harmonization and convergence of international accounting. The paper presents qualitative research through a content analysis of several reports concerning the roadmap for convergence. First, we develop a conceptual framework for the evolution of standards' convergence, and then we discuss the degree of harmonization and convergence between US GAAP and IAS/IFRS as of October 2012. We find that most topics did not follow the expected progress. Furthermore, there are still differences in the long-term projects that are in the process of being completed, and others that were reassessed as lower-priority projects.
Abstract: The internet is constantly expanding. Identifying web
links of interest from a web browser requires users to visit each of
the listed links individually until a satisfactory link is found;
users therefore need to evaluate a considerable number of links
before finding their link of interest, which can be tedious and even
unproductive. By incorporating web assistance, web users could
benefit from reduced time searching for relevant websites. In this
paper, a rough set approach is presented that facilitates
classification of the unlimited available e-vocabulary to assist web
users in reducing the time spent looking for relevant websites. This
approach includes two methods for identifying relevant data on web
links, based on priority and percentage of relevance. As a result of
these methods, a list of websites is generated in priority sequence
with an emphasis on the search criteria.
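The abstract names its two ranking criteria (priority and percentage
of relevance) but gives no formulas; one plausible way to combine
them into the priority-sequenced list it describes is sketched
below. The tie-breaking rule and the sample data are assumptions:

```python
def rank_links(links):
    """Order candidate links so the most promising sites come first.

    `links` maps a URL to a (priority, relevance_pct) pair; a smaller
    priority number ranks earlier, ties broken by higher relevance.
    """
    return sorted(links, key=lambda url: (links[url][0], -links[url][1]))

candidates = {
    "site-a": (2, 80.0),
    "site-b": (1, 55.0),
    "site-c": (1, 90.0),
}
print(rank_links(candidates))  # ['site-c', 'site-b', 'site-a']
```

The rough-set step in the paper would supply the (priority,
relevance) pairs; only the final ordering is shown here.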
Abstract: Our goal is to effectively increase the number of boats on the river during a six-month period. The main factors determining the number of boats are trip duration and the "select the priority trip" policy. In the microcosmic simulation model, the best result is obtained for 4-to-24-night trips with DSCF: 812 boats, 9.0% more than the second-best result. By contrast, for 6-to-18-night trips with FCFS, the number of boats is 31.6% less than the best result. In the discrete duration model, for 6-to-18-night trips the number of boats increases to 848, a 29.7% improvement over the best result of Model I for the same time range. Moreover, for 4-to-24-night trips the number of boats increases to 1194, a 47.0% improvement over the best result of Model I for the same time range.
Abstract: Grid environments consist of the volatile integration
of discrete heterogeneous resources. The notion of the Grid is to
unite different users and organisations and pool their resources into
one large computing platform where they can harness, inter-operate,
collaborate and interact. If the Grid community is to achieve this
objective, then participants (users and organisations) need to be
willing to donate or share their resources and permit other
participants to use them. Resources do not have to be shared at all
times, since constant sharing may result in users not having access
to their own resources. The idea of reward-based computing was
developed to address the sharing problem in a pragmatic manner:
participants are offered a reward to donate their resources to the
Grid. A reward may include monetary recompense or a pro rata share
of available resources when these are constrained. This latter point
may imply a quality of service, which in turn may require some
globally agreed reservation mechanism. This paper presents a
platform for economy-based computing using the WebCom Grid
middleware. Using this middleware, participants can configure their
resources with times and priority levels that suit their local usage
policy. The WebCom system accounts for processing done on individual
participants' resources and rewards them accordingly.
Abstract: Dynamic bandwidth allocation in EPONs can generally be
separated into inter-ONU scheduling and intra-ONU scheduling. In our
previous work, active intra-ONU scheduling (AS) utilizes multiple
queue reports (QRs) in each report message to cooperate with the
inter-ONU scheduling, making the granted bandwidth fully utilized
without leaving an unused slot remainder (USR). This scheme
successfully solves the USR problem originating from the
inseparability of Ethernet frames. However, without a proper
threshold setting in AS, the number of QRs allowed by the IEEE
802.3ah standard is not sufficient, especially in unbalanced traffic
environments. This limitation may be overcome by enlarging the
threshold value, but a large threshold implies a large gap between
adjacent QRs, resulting in a large difference between the ideal
granted bandwidth and the actual granted bandwidth. In this paper,
we integrate AS with a cooperative prediction mechanism and
distribute multiple QRs to reduce the penalty introduced by
prediction error. Furthermore, to improve QoS and economize the
usage of queue reports, the highest-priority (EF) traffic that
arrives during the waiting time is granted automatically by the OLT
and is not considered in the requested bandwidth of the ONU.
Simulation results show that the proposed scheme achieves better
bandwidth utilization and average delay for different classes of
packets.
Abstract: A conventional GA combined with a local search
algorithm, such as 2-OPT, forms a hybrid genetic algorithm (HGA) for
the traveling salesman problem (TSP). However, geometric properties,
which are problem-specific knowledge, can be used to improve the
search process of the HGA. Some tour segments (edges) of a TSP are
good, while others may be too long to appear in a short tour. This
knowledge can constrain the GA to work with good tour segments and
consider long tour segments less often. Consequently, a new
algorithm, called the intelligent-OPT hybrid genetic algorithm
(IOHGA), is proposed to improve the GA and the 2-OPT algorithm and
reduce the search time for the optimal solution. Based on the
geometric properties, all tour segments are assigned one of two
priority levels to distinguish good from bad genes. A simulation
study was conducted to evaluate the performance of the IOHGA. The
experimental results indicate that, in general, the IOHGA obtains
near-optimal solutions in less time and with better accuracy than
the hybrid genetic algorithm with simulated annealing (HGA(SA)).
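The abstract's 2-level edge priorities can be illustrated with a
simple length-threshold rule; the actual geometric criterion used by
the IOHGA is not given in the abstract, so the mean-length factor
below is a stand-in assumption:

```python
import math

def edge_priorities(cities, factor=1.5):
    """Label each TSP edge 'good' or 'bad' by comparing its length
    with `factor` times the mean edge length (a stand-in for the
    paper's geometric criterion, which the abstract does not give)."""
    n = len(cities)
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)]
    lengths = {e: math.dist(cities[e[0]], cities[e[1]]) for e in edges}
    mean_len = sum(lengths.values()) / len(lengths)
    return {e: ("good" if lengths[e] <= factor * mean_len else "bad")
            for e in edges}

# Three clustered cities plus one distant outlier.
cities = [(0, 0), (1, 0), (0, 1), (10, 10)]
labels = edge_priorities(cities)
print(labels[(0, 1)], labels[(0, 3)])  # good bad
```

A GA would then bias crossover and 2-OPT moves toward edges labeled
'good', visiting 'bad' edges less often, as the abstract describes.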
Abstract: WiMAX is defined as Worldwide Interoperability for
Microwave Access by the WiMAX Forum, formed in June 2001 to
promote conformance and interoperability of the IEEE 802.16
standard, officially known as WirelessMAN. The attractive features
of WiMAX technology are very high throughput and broadband
wireless access over long distances. A detailed simulation
environment is demonstrated with the UGS, nrtPS and ertPS service
classes, measuring throughput, delay and packet delivery ratio for a
mixed environment of fixed and mobile WiMAX. A simple mobility
aspect is considered for mobile WiMAX, and the PMP mode of
transmission is considered in TDD mode. Network Simulator 2 (NS-2)
is the tool used to simulate the WiMAX network scenario. A simple
priority scheduler and a weighted round robin scheduler are the
WiMAX schedulers used in this work.
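A weighted round robin scheduler of the kind named above can be
sketched in a few lines; the per-class weights and packet names are
illustrative, not values from the simulation:

```python
from collections import deque

def weighted_round_robin(queues, weights, rounds):
    """Serve per-class packet queues in weighted round-robin order:
    in each round, class i may send up to weights[i] packets."""
    served = []
    for _ in range(rounds):
        for name, w in weights.items():
            for _ in range(w):
                if queues[name]:
                    served.append(queues[name].popleft())
    return served

queues = {"UGS": deque(["u1", "u2"]),
          "ertPS": deque(["e1"]),
          "nrtPS": deque(["n1", "n2"])}
weights = {"UGS": 2, "ertPS": 1, "nrtPS": 1}  # illustrative weights
print(weighted_round_robin(queues, weights, 2))
# ['u1', 'u2', 'e1', 'n1', 'n2']
```

A strict priority scheduler is the degenerate case: always drain the
highest-priority non-empty queue before touching the next class.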
Abstract: With the increase in traffic, noise emitted by
motor vehicles increases as well, which subsequently adds to the
stress of the modern city. It is therefore necessary to identify the
most critical areas in terms of the environmental and social impact
of noise. There are several critical situations for noise emitted by
motor vehicles, such as stop-and-go conditions, which usually occur
near junctions or at-grade intersections. This study was conducted
at two locations representing the most common types of
intersections: crossroads and T-junctions. The highest average noise
levels were recorded during the Go phase for the T-junction, 64.4
dB, and the Drive phase for the crossroad, 64 dB. This implies that
the existence of an intersection causes the noise level to increase:
vehicles starting to move produce more sound than when they travel
at a constant speed through the intersection. It is suggested that
special consideration and priority in allocating funds should be
given to these critical spots.
Abstract: In this paper, we deal with the Steiner tree problem
(STP) on a graph in which a fuzzy number, instead of a real number,
is assigned to each edge. We propose a modification of the
shortest-paths approximation based on fuzzy shortest path (FSP)
evaluations. Since a fuzzy min operation using the extension
principle leads to non-dominated solutions, we propose another
approach to solving the FSP using Cheng's centroid-point fuzzy
ranking method.
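The centroid-ranking idea can be sketched for triangular fuzzy
numbers: the centroid of a triangle with vertices (a,0), (b,1),
(c,0) is ((a+b+c)/3, 1/3), and Cheng-style ranking orders fuzzy
numbers by the distance of this centroid from the origin. This is a
simplification (the paper may use general fuzzy numbers and the full
centroid formula), and the graph below is illustrative:

```python
import heapq, math

def cheng_rank(tfn):
    """Centroid ranking index for a triangular fuzzy number
    (a, b, c): distance of the centroid ((a+b+c)/3, 1/3) from the
    origin. A simplified form of Cheng's centroid-point ranking."""
    a, b, c = tfn
    return math.hypot((a + b + c) / 3.0, 1.0 / 3.0)

def fuzzy_shortest_path(graph, src, dst):
    """Dijkstra over defuzzified edge weights: each fuzzy edge weight
    is replaced by its crisp ranking index."""
    pq, seen = [(0.0, src)], set()
    while pq:
        cost, node = heapq.heappop(pq)
        if node == dst:
            return round(cost, 3)
        if node in seen:
            continue
        seen.add(node)
        for nxt, tfn in graph.get(node, []):
            heapq.heappush(pq, (cost + cheng_rank(tfn), nxt))
    return None

# Edge weights as triangular fuzzy numbers (a, b, c); toy graph.
graph = {
    "s": [("a", (1, 2, 3)), ("b", (4, 5, 6))],
    "a": [("t", (1, 1, 1))],
    "b": [("t", (1, 1, 1))],
}
print(fuzzy_shortest_path(graph, "s", "t"))  # path s-a-t wins
```

The shortest-paths approximation for the STP would then repeatedly
call such an FSP routine between terminal pairs.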
Abstract: University websites are considered one of a brand's primary touch points for multiple stakeholders, but most of them do not have designs that create favorable impressions. Among the elements that web designers should carefully consider are appearance, content, functionality, usability and search engine optimization; however, priority should be placed on website simplicity and negative space. In terms of content, previous research suggests that universities should include reputation, learning environment, graduate career prospects, destination image, cultural integration, and a virtual tour on their websites. The study examines how the top 200 world-ranking science and technology-based universities present their brands online and whether their websites capture these content dimensions. A content analysis of the websites revealed that the top-ranking universities captured the dimensions to varying degrees. Moreover, the UK-based university gave better priority to website simplicity and negative space than the Malaysian-based university.
Abstract: Wireless sensor networks consist of hundreds or
thousands of small sensors that have limited resources.
Energy-efficient techniques are a main issue for wireless sensor
networks. This paper proposes an energy-efficient agent-based
framework for wireless sensor networks, adopting biologically
inspired approaches. Agents operate autonomously, with their
behavior policies acting as genes. An agent aggregates other agents
to reduce communication and gives high priority to nodes that have
enough energy to communicate. Each agent selects a next-hop node
using neighbor information and its behavior policies, and the
behavior policies are optimized by genetic operations at the base
station. Simulation results show that the proposed framework
increases the lifetime of each node. The framework provides
self-healing, self-configuration and self-optimization properties to
sensor nodes.
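The energy-prioritized next-hop selection described above can be
illustrated as follows; the energy threshold and the tie-breaking
rule (fewest hops to the sink, then highest residual energy) are
assumptions, not the paper's evolved behavior policies:

```python
def select_next_hop(neighbors, energy_threshold=0.2):
    """Pick a next hop among neighbor nodes, preferring those whose
    residual energy is above `energy_threshold` (fraction of a full
    battery), and among those the one closest to the sink.

    `neighbors` maps node id -> (residual_energy, hops_to_sink)."""
    viable = {n: v for n, v in neighbors.items()
              if v[0] >= energy_threshold}
    pool = viable or neighbors  # fall back if every neighbor is depleted
    return min(pool, key=lambda n: (pool[n][1], -pool[n][0]))

neighbors = {"n1": (0.9, 3), "n2": (0.1, 2), "n3": (0.5, 2)}
print(select_next_hop(neighbors))  # n3: enough energy, fewest hops
```

In the paper's framework, such a rule would itself be encoded in the
agent's gene and tuned by the genetic operations at the base station.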
Abstract: Safety, the river environment, and sediment utilization are the elements targeted by sediment management. A change in one element caused by sediment management may affect the other two, and the priority among the three elements depends on the stakeholders. It is therefore necessary to develop a method to evaluate the effect of sediment management on each element, as well as an integrated method for evaluating the socio-economic effect. In this study, taking the Mount Merapi basin as the investigation field, such a method for an active volcanic basin was developed. An integrated evaluation method for sediment management was discussed from a socio-economic point of view covering safety, environment, and sediment utilization, and a case study of sediment management was evaluated by means of this method. To evaluate the effect of sediment management, several parameters for safety, utilization, and environment were introduced. From the utilization point of view, job opportunities, additional income for local people, and tax income for the local government were used to evaluate the effectiveness of sediment management. The risk degree of river infrastructure was used to describe the effect of sediment management on the safety aspect. To evaluate the effects of sediment management on the environment, the mean diameter of the grain size distribution of the riverbed surface was used. On a coordinate system designating these elements, the direction of change in basin condition due to sediment management can be predicted, so that the most preferable sediment management can be selected. The results indicate that the studied cases of sediment management tend to have negative impacts on sediment utilization, but positive impacts on safety and environmental conditions. The evaluation from a socio-economic point of view shows that the case study of sediment management reduces job opportunities and additional income for inhabitants, as well as tax income for the government.
Therefore, it is necessary to introduce another policy for creating job opportunities for inhabitants to support these sediment management measures.
Abstract: Image compression is one of the most important
applications of digital image processing. Advanced medical imaging
requires the storage of large quantities of digitized clinical data.
Due to constrained bandwidth and storage capacity, however, a
medical image must be compressed before transmission and storage.
There are two types of compression methods, lossless and lossy. In
lossless compression, the original image is retrieved without any
distortion; in lossy compression, the reconstructed image contains
some distortion. The Discrete Cosine Transform (DCT) and Fractal
Image Compression (FIC) are lossy compression methods. This work
shows that lossy compression methods can be chosen for medical image
compression without significant degradation of image quality. In
this work, DCT and fractal compression using Partitioned Iterated
Function Systems (PIFS) are applied to different image modalities,
such as CT scan, ultrasound, angiogram, X-ray and mammogram.
Approximately 20 images are considered in each modality, and the
average values of the compression ratio and Peak Signal-to-Noise
Ratio (PSNR) are computed and studied. The quality of the
reconstructed image is assessed by its PSNR value. Based on the
results, it can be concluded that DCT achieves higher PSNR values
and FIC achieves higher compression ratios. Hence, in medical image
compression, DCT can be used wherever picture quality is preferred,
and FIC wherever compression for storage and transmission is the
priority, without diagnostically significant loss of picture
quality.
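The PSNR quality measure used above is a standard formula: for 8-bit
images, PSNR = 10·log10(255²/MSE). A minimal sketch with toy pixel
values (not the paper's data) follows:

```python
import math

def psnr(original, reconstructed, max_val=255):
    """Peak Signal-to-Noise Ratio between two equally sized images
    given as flat pixel lists: 10*log10(max_val^2 / MSE)."""
    mse = sum((o - r) ** 2
              for o, r in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images: no distortion at all
    return 10 * math.log10(max_val ** 2 / mse)

orig = [52, 55, 61, 66]
recon = [54, 55, 60, 64]  # toy "decompressed" pixels
print(round(psnr(orig, recon), 2))  # → 44.61
```

Higher PSNR means the reconstruction is closer to the original,
which is why DCT's higher PSNR values indicate better picture
quality than FIC's.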
Abstract: Waiting times and queues are a daily problem for theme parks. Fast lines, or priority queues, appear as a solution for a specific segment of customers: tourists who are willing to pay to avoid waiting. This paper analyzes the fast line system and explores the factors that affect the decision to purchase a fast line pass. A greater understanding of these factors may help companies design appropriate products and services. This conceptual paper is based on a literature review in marketing and consumer behavior; additional research was identified in related disciplines such as leisure studies, psychology, and sociology. A conceptual framework of the factors influencing the decision to purchase a fast line pass is presented.
Abstract: In recent years, global warming has become a
worldwide problem. The reduction of carbon dioxide emissions is a
top priority for many companies in the manufacturing industry. In the
automobile industry as well, the reduction of carbon dioxide emissions
is one of the most important issues. Technology to reduce the weight
of automotive parts improves the fuel economy of automobiles, and is
an important technology for reducing carbon dioxide. Also, even if
this weight reduction technology is applied to electric automobiles
rather than gasoline automobiles, reducing energy consumption
remains an important issue. Plastic processing of hollow pipes is one
important technology for realizing the weight reduction of automotive
parts. Ohashi et al. [1],[2] present an example of research on pipe
formation in which a process was carried out to enlarge a pipe
diameter using a lost core, achieving the suppression of wall thickness
reduction and greater pipe expansion than hydroforming.
In this study, we investigated a method to increase the wall
thickness of a pipe through pipe compression using planetary rolls.
A technology whereby the wall thickness of a pipe can be controlled
without buckling the pipe is important for the weight reduction of
products. Using finite element analysis, we predicted that it would
be possible to compress an aluminum pipe with a 3 mm wall thickness
axially by approximately 20%, increasing its wall thickness by
approximately 20%, by pressing the hollow pipe with planetary rolls.
Abstract: A synchronous network-on-chip using wormhole packet switching
and supporting guaranteed-completion best-effort delivery with
low-priority (LP) and high-priority (HP) wormhole packet services is
presented in this paper. Both the proposed LP and HP message services
deliver a good quality of service in terms of lossless packet
completion and in-order message data delivery. However, the LP message
service does not guarantee a minimal completion bound. HP packets will
use 100% of the bandwidth of their reserved links if they are injected
from the source node at the maximum injection rate; hence, the HP
service is suitable for small messages (less than a hundred bytes).
Otherwise, other HP and LP messages that also require those links will
experience relatively high latency, depending on the size of the HP
message. LP packets are routed using a minimal adaptive routing
algorithm, while HP packets are routed using a non-minimal adaptive
routing algorithm. Therefore, an additional 3-bit field identifying
the packet type is introduced in the packet headers to classify each
packet and determine the type of service committed to it. Our NoC
prototypes have also been synthesized using a 180-nm CMOS
standard-cell technology to evaluate the cost of implementing the
combination of both services.
Abstract: Unlike the best-effort service provided by the internet
today, next-generation wireless networks will support real-time
applications. This paper proposes an adaptive early packet discard
(AEPD) policy to improve the performance of real-time TCP traffic
over ATM networks and avoid the fragmentation problem. Three main
aspects are incorporated in the proposed policy. First, quality of
service (QoS) is guaranteed for real-time applications by
implementing priority scheduling. Second, the partially corrupted
packet problem is resolved by differentiating the buffered cells of
one packet from another. Third, a threshold is adapted dynamically
using fuzzy logic, based on traffic behavior, to maintain high
throughput under a variety of load conditions. The simulation is run
with two priority classes of input traffic: real-time and
non-real-time. Simulation results show that the proposed AEPD policy
improves throughput and fairness over a static threshold under the
same traffic conditions.
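The fuzzy controller itself is not specified in the abstract; the
general idea of adapting a discard threshold to buffer occupancy can
be sketched with three coarse rules. The rule boundaries, step size,
and bounds below are made-up illustrations, not the paper's
membership functions:

```python
def adapt_threshold(threshold, occupancy):
    """Adjust the early-packet-discard threshold (as a fraction of
    buffer size) from the current buffer occupancy, using three
    coarse fuzzy-style rules:
      - light load  (occupancy < 0.4): raise threshold, admit more
      - medium load                  : leave threshold unchanged
      - heavy load  (occupancy > 0.8): lower threshold, discard earlier
    The threshold is clamped to [0.1, 0.9]."""
    if occupancy < 0.4:
        threshold += 0.05
    elif occupancy > 0.8:
        threshold -= 0.05
    return min(0.9, max(0.1, round(threshold, 2)))

t = 0.5
for load in [0.2, 0.5, 0.9, 0.9]:  # observed buffer occupancies
    t = adapt_threshold(t, load)
print(t)  # threshold has drifted down under sustained heavy load
```

A genuine fuzzy controller would replace the hard if/elif boundaries
with overlapping membership functions and defuzzify the rule outputs.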