Abstract: As the feature sizes of recent Complementary Metal Oxide Semiconductor (CMOS) devices decrease, static power increasingly dominates their energy consumption. Thus, the power savings obtainable from Dynamic Voltage and Frequency Scaling (DVFS) are diminishing, and the temporary shutdown of cores or other microchip components becomes more worthwhile. A consequence of powering off unused parts of a chip is that the relative difference between idle and fully loaded power consumption increases. This means that future chips, and whole server systems, gain more power-saving potential through power-aware load balancing, whereas in former times this approach had only a limited effect and thus was not widely adopted. While powering off complete servers has been used to save energy, it will become superfluous in many cases once individual cores can be powered down. An important advantage of this is a greatly reduced time to respond to increased computational demand. We include the above developments in a server power model and quantify the advantage. Our conclusion is that strategies from datacenters for deciding when to power off server systems might in the future be applied at the core level, while load-balancing mechanisms previously used at the core level might in the future be applied at the server level.
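The effect described above can be illustrated with a minimal linear power-model sketch (not the paper's actual model; the 200 W idle floor, 400 W peak, and eight-core count are invented for illustration):

```python
import math

def server_power(util, p_idle=200.0, p_peak=400.0, cores=8, core_gating=False):
    """Power draw (W) at utilization `util` in [0, 1].

    Without gating, the full idle floor p_idle is always paid.
    With per-core power gating, unused cores are switched off,
    so the idle share scales with the number of active cores.
    """
    dynamic = (p_peak - p_idle) * util
    if not core_gating:
        return p_idle + dynamic
    active = math.ceil(util * cores)  # cores without work are powered off
    return p_idle * active / cores + dynamic
```

With gating enabled, the idle-to-peak spread widens, which is exactly why power-aware load balancing gains potential under this model.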
Abstract: The increasing availability of information about Earth surface elevation (Digital Elevation Models, DEMs) generated from different sources (remote sensing, aerial images, LiDAR) raises the question of how to integrate this huge amount of data and make it available to the widest possible audience. In order to exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to the high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the Earth. Hence the need to merge large-scale height maps, which are typically made freely available at the worldwide level, with very specific high-resolution datasets. On the other hand, the third dimension improves the user experience and the data representation quality, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environmental monitoring, etc. Open-source 3D virtual globes, which are a trending topic in Geovisual Analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. Typically, however, 3D virtual globes do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called “Terrain Builder”. This tool is able to merge heterogeneous-resolution datasets and to provide a multi-resolution worldwide terrain service fully compatible with CesiumJS and therefore accessible via the web using a traditional browser without any additional plug-in.
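As a rough sketch of the merging idea (the finest dataset covering a query point wins), the following is an illustration of the concept only, not the Terrain Builder's implementation; the layer bounds, resolutions, and elevations are invented:

```python
def merge_query(lon, lat, layers):
    """Answer an elevation query from the finest-resolution covering layer.

    layers: list of (resolution_m, bounds, sample_fn) where
    bounds = (min_lon, min_lat, max_lon, max_lat).
    """
    covering = [l for l in layers
                if l[1][0] <= lon <= l[1][2] and l[1][1] <= lat <= l[1][3]]
    if not covering:
        return None
    best = min(covering, key=lambda l: l[0])  # smallest cell size = finest
    return best[2](lon, lat)

# A worldwide coarse layer overlaid by a local high-resolution survey.
world = (30.0, (-180, -90, 180, 90), lambda lon, lat: 100.0)
local = (1.0, (10, 45, 11, 46), lambda lon, lat: 101.5)
```

Inside the local survey's footprint the 1 m layer answers; everywhere else the query falls back to the worldwide layer.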
Abstract: Accounting for 40% of total world energy consumption, building systems are developing into technically complex large energy consumers suitable for the application of sophisticated power management approaches that can greatly increase energy efficiency and even make buildings active energy market participants. Centralized control of building heating and cooling managed by economically optimal model predictive control shows promising results, with an estimated 30% increase in energy efficiency. The research focuses on the implementation of such a method in a case study performed on two floors of our faculty building, with corresponding wireless sensor data acquisition, remote heating/cooling units, and a central climate controller. Building walls are mathematically modeled with their corresponding material types, surface shapes, and sizes. The models are then exploited to predict thermal characteristics and changes in different building zones. Exterior influences such as environmental conditions and weather forecasts, occupant behavior, and comfort demands are all taken into account in deriving the price-optimal climate control. Finally, a DC microgrid with photovoltaics, a wind turbine, a supercapacitor, batteries, and fuel cell stacks is added to make the building capable of active participation in a price-varying energy market. The computational burden of applying model predictive control to such a complex system is relaxed through a hierarchical decomposition of the microgrid and climate control: the former is designed as the higher hierarchical level with pre-calculated price-optimal power flow control, and the latter as the lower-level control responsible for ensuring thermal comfort while exploiting the optimal supply conditions enabled by microgrid energy flow management. Such an approach is expected to enable the inclusion of more complex building subsystems in order to further increase energy efficiency.
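A predictive controller of this kind needs a zone model to forecast thermal behavior. The following is a minimal first-order sketch of such a model (the dynamics form and the coefficients a, b, c are assumptions for illustration, not values identified from the faculty building):

```python
def zone_step(T, u, T_out, a=0.95, b=0.3, c=0.05):
    """One step of the assumed zone dynamics T[k+1] = a*T[k] + b*u[k] + c*T_out[k].

    T: zone temperature (degC), u: heating power (kW), T_out: outdoor temp (degC).
    """
    return a * T + b * u + c * T_out

def simulate(T0, u_seq, T_out_seq):
    """Roll the zone model forward over a heating schedule and weather forecast."""
    T = T0
    for u, T_out in zip(u_seq, T_out_seq):
        T = zone_step(T, u, T_out)
    return T
```

An MPC layer would optimize the heating sequence `u_seq` against energy prices subject to comfort bounds on the predicted temperatures.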
Abstract: This paper describes a simple way to control the speed of a PMBLDC motor using a fuzzy logic control (FLC) method. In the conventional approach, the performance of the motor system is simulated and the speed is regulated using a PI controller. Such methods improve the performance of PMSM drives, but under some operating conditions the dynamics of the system vary over time, with changes in reference speed, parameter variations, and load disturbances. The simulation is carried out in MATLAB to obtain a reliable and flexible setup. To highlight the effectiveness of the speed control method, the FLC method is used. The proposed method targets improved dynamic performance and avoids the variations of the motor drive. The drive has high accuracy and robust operation from near zero to high speed. The effectiveness and flexibility of the individual speed control techniques are thoroughly discussed, their merits and demerits weighed, and finally verified through simulation and experimental results for comparative analysis.
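To make the FLC idea concrete, here is a deliberately tiny fuzzy speed-controller sketch (Sugeno-style, three rules on the speed error). The membership breakpoints, rule outputs, and the 100 rpm scale are invented for illustration and are not the paper's tuned controller:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def flc_voltage(error, e_max=100.0):
    """Map a speed error (rpm) to a normalized control-voltage correction.

    Rules: negative error -> decrease, zero error -> hold, positive -> increase.
    Output is the weighted average of the rule consequents (-1, 0, +1).
    """
    neg  = tri(error, -2 * e_max, -e_max, 0.0)
    zero = tri(error, -e_max, 0.0, e_max)
    pos  = tri(error, 0.0, e_max, 2 * e_max)
    num = neg * (-1.0) + zero * 0.0 + pos * (+1.0)
    den = neg + zero + pos
    return num / den if den else 0.0
```

Unlike a fixed-gain PI controller, the rule base and membership shapes can be reshaped to handle different operating regions without retuning a single gain pair.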
Abstract: In this paper, the regression dependence of dancing intensity on wind speed and span length was established from statistical data obtained in multi-year observations of line-wire dancing accumulated by the power systems of Kazakhstan and the Russian Federation. The lower and upper bounds of the equation parameters were estimated, as was the adequacy of the regression model. The constructed model will be used in research on dancing phenomena, for the development of methods and means of protection against dancing, and for zoning maps of territories prone to line-wire dancing.
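A model of the stated form, intensity = b0 + b1*wind_speed + b2*span_length, can be fitted by ordinary least squares via the normal equations. The sketch below uses synthetic observations, not the Kazakhstan/Russia field data, and the three-parameter linear form is an assumption about the regression's shape:

```python
def lstsq_3(X, y):
    """Solve the 3-parameter normal equations (X^T X) b = X^T y.

    Rows of X are (1, wind_speed, span_length); Gaussian elimination
    with partial pivoting, then back substitution.
    """
    n = 3
    A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(n)]
         for i in range(n)]
    b = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(n)]
    for i in range(n):                       # forward elimination
        p = max(range(i, n), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    beta = [0.0] * n
    for i in reversed(range(n)):             # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, n))) / A[i][i]
    return beta
```

Confidence bounds on the fitted parameters (the "lower and upper bounds" the abstract estimates) would then come from the residual variance and the inverse of X^T X.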
Abstract: Seeking and sharing knowledge on online forums has made them popular in recent years. Although online forums are valuable sources of information, the variety of message sources makes retrieving reliable threads with high-quality content an issue. The majority of existing information retrieval systems ignore the quality of retrieved documents, particularly in the field of thread retrieval. In this research, we present an approach that employs various quality features in order to assess the quality of retrieved threads. Different aspects of content quality, including completeness, comprehensiveness, and politeness, are assessed using these features, which leads to finding not only textually but also conceptually relevant threads for a user query within a forum. To analyse the influence of the features, we used an adapted version of the voting model for thread search as the retrieval system. Across multiple runs, we equipped it with each feature individually and with various combinations of features. The results show that incorporating the quality features significantly enhances the effectiveness of the retrieval system.
Abstract: In this study, the potential benefits of playing action video games among congenitally deaf and dumb subjects are reported in terms of EEG ratio indices. The frontal and occipital lobes are associated with the development of motor skills, cognition, visual information processing, and color recognition. Sixteen hours of first-person shooter action video game play resulted in an increase of the ratios β/(α+θ) and β/θ in the frontal and occipital lobes. This can be attributed to the enhancement of certain aspects of cognition among deaf and dumb subjects.
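The two ratio indices are straightforward to compute once the band powers are available. In a sketch like the one below, theta, alpha, and beta would come from an EEG power spectrum (conventionally about 4-8 Hz, 8-13 Hz, and 13-30 Hz); the numeric values in the test are invented for illustration:

```python
def eeg_ratios(theta, alpha, beta):
    """Return the two indices from the abstract: beta/(alpha+theta) and beta/theta.

    Inputs are band powers for one electrode site (e.g., frontal or occipital).
    """
    return beta / (alpha + theta), beta / theta
```

An increase in either index after training corresponds to relatively more beta power, the pattern the study reports.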
Abstract: The cities of Johannesburg and Pretoria, both located in the Gauteng province, are separated by a distance of 58 km. The traffic queues on the Ben Schoeman freeway, which connects these two cities, can stretch for almost 1.5 km. Vehicle traffic congestion impacts negatively on business and on commuters’ quality of life. The goal of this paper is to identify variables that influence the flow of traffic and to design a vehicle traffic prediction model that will predict the traffic flow pattern in advance. The model will enable motorists to make appropriate travel decisions ahead of time. The data used was collected by Mikro’s Traffic Monitoring (MTM). A multi-layer perceptron (MLP) was used on its own to construct the model, and the MLP was also combined with the bagging ensemble method to train on the data. Cross-validation was used for evaluating the models. The results obtained from the techniques were compared using predictive performance and prediction cost. The cost was computed by combining the loss matrix with the confusion matrix. The prediction models designed show that the status of the traffic flow on the freeway can be predicted using the following parameters: travel time, average speed, traffic volume, and day of month. The implication of this work is that commuters will be able to spend less time travelling on the route and more time with their families. The logistics industry will save more than twice what it is currently spending.
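The cost computation mentioned above amounts to weighting each cell of the confusion matrix by the corresponding loss. A minimal sketch (the example matrices are invented, not the MTM results):

```python
def prediction_cost(confusion, loss):
    """Total cost of a classifier's mistakes.

    confusion[i][j]: count of cases with actual class i predicted as j.
    loss[i][j]: penalty for predicting j when the truth is i
    (diagonal entries, i.e. correct predictions, are typically zero).
    """
    return sum(confusion[i][j] * loss[i][j]
               for i in range(len(confusion))
               for j in range(len(confusion[i])))
```

An asymmetric loss matrix lets the evaluation penalize, say, predicting free-flowing traffic during actual congestion more heavily than the reverse.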
Abstract: Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision-tree paradigm that use the AUC as the metric for selecting an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
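One reading of AUC-based attribute selection is sketched below (this is an illustration of the criterion, not the authors' code): score each candidate attribute by the AUC obtained when its values alone are used to rank the cases, and select the attribute with the highest score.

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney rank statistic; labels are 0/1."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def best_attribute(rows, labels):
    """Pick the attribute index whose values give the highest
    single-attribute AUC against the outcome labels."""
    n_attr = len(rows[0])
    return max(range(n_attr),
               key=lambda a: auc([r[a] for r in rows], labels))
```

An uninformative attribute scores near 0.5; a perfectly separating one scores 1.0, so the selection naturally favors attributes that discriminate the outcome.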
Abstract: Recently, traffic monitoring has attracted the attention of computer vision researchers. Many algorithms have been developed to detect and track moving vehicles. In fact, vehicle tracking in daytime and in nighttime cannot be approached with the same techniques, due to the extremely different illumination conditions. Consequently, traffic-monitoring systems need a component to differentiate between daytime and nighttime scenes. In this paper, an HSV-based day/night detector is proposed for traffic monitoring scenes. The detector employs the hue histogram and the value histogram of the top half of the image frame. Experimental results show that extracting the brightness features along with the color features within the top region of the image is effective for classifying traffic scenes. In addition, the detector achieves high precision and recall rates and is feasible for real-time applications.
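A stripped-down sketch of the idea follows: restrict attention to the top half of the frame (the sky region) and threshold a simple brightness statistic from HSV space. The 0.35 threshold and the reduction to the mean value channel are assumptions for illustration; the paper's detector uses full hue and value histograms:

```python
def day_or_night(frame, v_threshold=0.35):
    """Classify a scene as 'day' or 'night' from the top half of the frame.

    frame: 2-D list of rows of RGB tuples with channels in [0, 1].
    Uses the HSV value channel, V = max(R, G, B), averaged over the sky region.
    """
    top = frame[: max(1, len(frame) // 2)]        # top half only
    values = [max(px) for row in top for px in row]
    mean_v = sum(values) / len(values)
    return "day" if mean_v >= v_threshold else "night"
```

Looking only at the upper region avoids headlights and reflective road surfaces in the lower half, which would otherwise brighten nighttime frames.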
Abstract: Data fusion technology can be the best way to extract useful information from multiple sources of data, and it has been widely applied in various applications. This paper presents a multimedia data fusion approach for event detection on Twitter using Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. There are two types of data in the fusion. The first is features extracted from text using the bag-of-words method, weighted by term frequency-inverse document frequency (TF-IDF). The second is visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied in order to fuse the information from these two sources. Our experiments indicate that, compared to approaches using an individual data source, the proposed data fusion approach can increase the prediction accuracy for event detection. The experimental results show that the proposed method achieved a high accuracy of 0.97, compared with 0.93 using text only and 0.86 using images only.
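The fusion step named above is Dempster's rule of combination. The sketch below is a generic textbook implementation (not the paper's code); mass functions are dictionaries from focal elements (frozensets of hypotheses) to masses:

```python
def combine(m1, m2):
    """Dempster's rule: fuse two mass functions over the same frame.

    Products of masses on intersecting focal elements are accumulated;
    mass assigned to conflicting (disjoint) pairs is discarded and the
    result is renormalized by 1 - conflict.
    """
    fused, conflict = {}, 0.0
    for a, v1 in m1.items():
        for b, v2 in m2.items():
            inter = a & b
            if inter:
                fused[inter] = fused.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    k = 1.0 - conflict
    return {s: v / k for s, v in fused.items()}
```

Here one mass function would come from the TF-IDF text classifier and the other from the SIFT-based visual classifier, with the fused masses deciding the event label.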
Abstract: Nowadays, education cannot be imagined without digital technologies, which broaden the horizons of teaching and learning processes. Several universities offer online courses, and for evaluation purposes e-examination systems are being widely adopted in academic environments; multiple-choice tests are extremely popular. In the move away from traditional examinations to e-examination, Moodle is used as the Learning Management System (LMS). Moodle logs every click that students make for answering and navigational purposes in an e-examination. Data mining has been applied in various domains, including retail sales and bioinformatics, and in recent years there has been increasing interest in its use in e-learning environments, where it has been applied to discover, extract, and evaluate parameters related to students' learning performance. The combination of data mining and e-learning is still in its infancy. Log data generated by students during an online examination can be used to discover knowledge with the help of data mining techniques. In web-based applications, the number of right and wrong answers in a test result is not sufficient to assess and evaluate a student's performance, so assessment techniques must be intelligent. If a student cannot answer the question asked by the instructor, an easier question can be asked; otherwise, a more difficult question on a similar topic can be posed. To do so, it is necessary to identify the difficulty level of the questions, and the proposed work concentrates on exactly this issue. Data mining techniques, specifically clustering, are used in this work. The method decides the difficulty level of each question, categorizing it as tough, easy, or moderate, and questions are later served to students based on their performance. The proposed experiment categorizes the question set and also groups the students based on their performance in the examination, which will help the instructor to guide the students more specifically. In short, the mined knowledge helps to support, guide, facilitate, and enhance learning as a whole.
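One way the clustering idea can be realized is sketched below (an illustration, not the paper's implementation): cluster questions by the fraction of students answering them correctly with a 1-D k-means (k=3), then label the clusters by their centers, lowest fraction-correct meaning toughest. The sample fractions in the test are invented:

```python
def kmeans_1d(xs, k=3, iters=50):
    """Plain 1-D k-means; returns the k cluster centers."""
    xs = sorted(xs)
    centers = [xs[i * len(xs) // k] for i in range(k)]   # spread-out init
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:
            i = min(range(k), key=lambda c: abs(x - centers[c]))
            groups[i].append(x)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

def difficulty(frac_correct, centers):
    """Label a question by its nearest cluster: low fraction correct = tough."""
    labels = ["tough", "moderate", "easy"]
    order = sorted(range(len(centers)), key=lambda i: centers[i])
    i = min(range(len(centers)), key=lambda c: abs(frac_correct - centers[c]))
    return labels[order.index(i)]
```

Because the difficulty boundaries emerge from the data rather than from fixed thresholds, the categorization adapts to each examination's score distribution.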
Abstract: Many cluster-based routing protocols have been proposed in the field of wireless sensor networks, in which groups of nodes are formed into clusters. A cluster head is selected from among those nodes based on residual energy, coverage area, and number of hops; the cluster head gathers data from the various sensor nodes and forwards the aggregated data to the base station or to a relay node (another cluster head), which forwards the packet, along with its own data packet, to the base station. Here, a Game Theory based Diligent Energy Utilization Algorithm (GTDEA) for routing is proposed. In GTDEA, cluster head selection is done with the help of game theory, a decision-making process, which selects a cluster head based on three parameters: residual energy (RE), Received Signal Strength Index (RSSI), and Packet Reception Rate (PRR). Finding a feasible path to the destination with minimum use of the available energy improves the network lifetime, and this is achieved by the proposed approach. In GTDEA, packets are forwarded toward the base station using an inter-cluster routing technique. Simulation results reveal that GTDEA improves the network performance in terms of throughput, lifetime, and power consumption.
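As a much-simplified sketch of selection on the three parameters the abstract names, the snippet below scores each node with an assumed weighted payoff and elects the highest-payoff node. The weights and the reduction of the game-theoretic decision process to a single score are illustrative simplifications, not the GTDEA mechanism itself:

```python
def select_cluster_head(nodes, w_re=0.5, w_rssi=0.2, w_prr=0.3):
    """Elect the node with the highest payoff over RE, RSSI, and PRR.

    nodes: list of dicts with an 'id' and normalized 're', 'rssi', 'prr'
    values in [0, 1]; weights are assumed, favoring residual energy.
    """
    payoff = lambda n: (w_re * n["re"] + w_rssi * n["rssi"]
                        + w_prr * n["prr"])
    return max(nodes, key=payoff)["id"]
```

Weighting residual energy most heavily steers the head role away from nearly depleted nodes, which is the lifetime-preserving behavior the protocol targets.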
Abstract: Although Mobile Wireless Sensor Networks (MWSNs), which consist of mobile sensor nodes (MSNs), can cover a wide observation region using a small number of sensor nodes, they need to construct a network that collects the sensing data at the base station by moving the MSNs. As an effective method, a network construction method based on Virtual Rails (VRs), referred to as the VR method, has been proposed. In this paper, we propose two effective techniques for the VR method. They can prolong the operation time of the network, which is limited by the battery capacities and energy consumption of the MSNs. The first technique, an effective arrangement of VRs, almost equalizes the number of MSNs belonging to each VR. The second technique, an adaptive movement method for MSNs, takes into account the residual battery energy. In simulations, we demonstrate that each technique can improve the network lifetime and that the combination of both techniques is the most effective.
Abstract: Energy consumption data, in particular for public buildings, are impacted by many factors: the building structure, climate and environmental parameters, construction, system operating conditions, and user behavior patterns. Traditional methods for data analysis are insufficient. This paper delves into data mining technology to determine its application to the analysis of building energy consumption data, including energy consumption prediction, fault diagnosis, and optimal operation. Recent literature is reviewed and summarized, the problems faced by data mining technology in the area of energy consumption data analysis are enumerated, and research directions for future studies are given.
Abstract: In the deep south of Thailand, checkpoints for person verification are necessary for the security management of risk zones, such as official buildings in the conflict area. In this paper, we propose an automatic checkpoint system that verifies persons using information from ID cards and facial features. Methods for extracting and verifying a person's information are introduced, based on useful information such as the ID number and name extracted from official cards, and facial images from videos. The proposed system shows promising results and has a real impact on the local society.
Abstract: Introduction: The aim is to update ourselves on, and understand the concept of, the latest electronic formats available to health care providers, and how they can be used and developed in accordance with standards. The idea is to compare keeping patients' manual medical records with maintaining patients' electronic information in a health care setup, and to adopt the right technology for the organization so as to improve the quality and quantity of our health care skills. Objective: The aim is to explain the terms Electronic Medical Record (EMR), Electronic Health Record (EHR), and Personal Health Record (PHR), and to select the best technology from among the available electronic sources and software before implementation; to guide the end users so that the technology is used without any doubts or difficulties; and to evaluate the uses and barriers of EMR, EHR, and PHR. Aim and Scope: The target is to enable health care providers such as physicians, nurses, therapists, medical billing, insurance companies, and government to access the patient's information in an easy and systematic manner without diluting the confidentiality of the patient's information. Method: Health information technology can be implemented with the help of organisations that provide legal guidelines and stand by the health care provider. The main objective is to select the correct embedded and affordable database management software for generating large-scale data. A parallel need is to know the latest software available in the market. Conclusion: The question lies in implementing the electronic information system with health care providers and organizations. Clinicians are the main users of the technology and can lead us to "go paperless". Technology changes from day to day and remains sound and up to date. Basically, the idea is to explain how to store data electronically in a safe and secure way. All three formats exemplify the fact that an electronic format has its own benefits as well as barriers.
Abstract: This paper proposes a method of learning topics for broadcasting contents. There are two kinds of text related to broadcasting contents. One is the broadcasting script, a series of texts including directions and dialogue. The other is blog posts, which contain relatively abstracted content, stories, and diverse information about the broadcasting contents. Although the two kinds of text cover similar broadcasting contents, the words in blog posts and in the broadcasting script differ. When unseen words appear, a method is needed to reflect them in the existing topics. In this paper, we introduce a semantic vocabulary expansion method to reflect unseen words. We expand the topics of the broadcasting script by incorporating the words in blog posts: each word in a blog post is added to the most semantically correlated topic. We use word2vec to obtain the semantic correlation between words in blog posts and topics of scripts. The vocabularies of the topics are updated, and posterior inference is then performed to rearrange the topics. In experiments, we verified that the proposed method can discover more salient topics for broadcasting contents.
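The assignment step can be sketched as follows: an unseen blog word goes to the topic whose words it is most similar to on average, by cosine similarity between embedding vectors. The toy 2-D vectors and topic lists below are invented; a real pipeline would load trained word2vec embeddings (e.g., via gensim) and the learned script topics:

```python
import math

def cos(u, v):
    """Cosine similarity of two dense vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = (math.sqrt(sum(a * a for a in u))
           * math.sqrt(sum(b * b for b in v)))
    return num / den

def assign_topic(word, topics, vec):
    """Assign `word` to the topic with the highest mean similarity.

    topics: {topic_name: [topic words]}; vec: {word: embedding vector}.
    """
    sim = lambda t: sum(cos(vec[word], vec[w])
                        for w in topics[t]) / len(topics[t])
    return max(topics, key=sim)
```

After all unseen words are assigned, the expanded vocabularies feed back into posterior inference to rearrange the topics, as the abstract describes.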
Abstract: In IEEE 802.11 networks, it is well known that the traditional time-domain contention often leads to low channel utilization. The first frequency-domain contention scheme, time-to-frequency (T2F), has recently been proposed to improve the channel utilization and has attracted a great deal of attention. In this paper, we present the latest research progress on weighted frequency-domain contention. We compare the basic ideas and working principles of the related schemes and point out their differences. This paper is very useful for further study of frequency-domain contention.
Abstract: Fractal-based digital image compression is a specific technique in the field of color image compression. The method is best suited for irregularly shaped image content such as snow, clouds, flames of fire, and tree leaves, relying on the fact that parts of an image often resemble other parts of the same image. This technique has drawn much attention in recent years because of the very high compression ratio that can be achieved. Hybrid schemes incorporating fractal compression and speed-up techniques have achieved higher compression ratios than pure fractal compression alone. Fractal image compression is a lossy compression method in which the self-similar nature of an image is exploited. The technique provides a high compression ratio, short encoding time, and a fast decoding process. In this paper, fractal compression with a quadtree and the DCT is proposed to compress color images. The proposed hybrid scheme requires four phases. First, the image is segmented and the Discrete Cosine Transform is applied to each block of the segmented image. Second, the block values are scanned in a zigzag manner so that the zero coefficients are grouped together. Third, the resulting image is partitioned into fractals using the quadtree approach. Fourth, the image is compressed using the run-length encoding technique.
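Phases two and four can be sketched with generic textbook implementations (not the paper's code): a zigzag scan pushes the high-frequency zero coefficients of a DCT block to the end of the sequence, which run-length encoding then collapses into a single run.

```python
def zigzag(block):
    """Scan an n x n block along antidiagonals, alternating direction
    (even-indexed diagonals are traversed bottom-left to top-right)."""
    n = len(block)
    out = []
    for s in range(2 * n - 1):
        cells = [(i, s - i) for i in range(n) if 0 <= s - i < n]
        if s % 2 == 0:
            cells.reverse()
        out.extend(block[i][j] for i, j in cells)
    return out

def rle(seq):
    """Run-length encode a sequence as (value, count) pairs."""
    runs = []
    for x in seq:
        if runs and runs[-1][0] == x:
            runs[-1][1] += 1
        else:
            runs.append([x, 1])
    return [tuple(r) for r in runs]
```

On a typical quantized DCT block, the few nonzero low-frequency coefficients come first and the trailing zeros compress into one (0, count) pair, which is where most of the coding gain comes from.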