Abstract: Mining big data represents a big challenge nowadays. Much research is concerned with mining massive amounts of data and big data streams. Mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance and privacy. In the telecommunication industry, mining big data is like mining for gold: it represents a big opportunity for maximizing revenue streams in this industry. This paper discusses the characteristics of big data (volume, variety, velocity and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunication, and the benefits and opportunities to be gained from it.
Abstract: Healthcare waste management continues to present an
array of challenges for developing countries, and Liberia is no
exception. There is insufficient information available regarding
the generation, handling, and disposal of healthcare waste. This
fact serves as an impediment to healthcare waste management
schemes. The specific objective of this study is to present an
evaluation of the current healthcare waste management practices
in Liberia. It also presents the procedures and techniques used,
the methods of handling, transportation, and disposal of waste,
as well as the quantity and composition of healthcare waste.
This study was conducted as an investigative case study,
covering three different healthcare facilities: a hospital, a
health center, and a clinic in Monrovia, Montserrado County. The
average waste generation was found to be 0-7 kg per day at the
clinic and health center and 8-15 kg per day at the hospital.
The waste comprised hazardous and non-hazardous materials, i.e.
plastics, paper, sharps, and pathological elements. Nevertheless,
the investigation showed that the healthcare waste generated by
the surveyed facilities was not properly handled because of
insufficient guidelines for separate collection and
classification, and the lack of adequate methods for the storage
and proper disposal of the generated waste. This indicates a
need for improvement within the healthcare waste management
system.
Abstract: This paper is a qualitative research report. A group
of students from a public university in a small town in Colombia
participated in this study, which aimed at describing to what
extent the use of social ads published on the internet helped to
develop their critical thinking skills. Students' productions,
field notes, video recordings and direct observation were the
instruments and techniques used by the researchers to gather the
data, which were analyzed under the principles of grounded
theory and triangulation. The implementation of social ads in
the classroom evidenced a noticeable improvement in the
students' ability to interpret and argue social issues, as well
as their improvement in oral and written production in English
as a foreign language.
Abstract: Hydrologic models are increasingly used as tools to
predict stormwater quantity and quality from urban catchments.
However, due to a range of practical issues, most models produce
gross errors in simulating complex hydraulic and hydrologic systems.
Difficulty in finding a robust approach for model calibration is one of
the main issues. Though automatic calibration techniques are
available, they are rarely used in common commercial hydraulic and
hydrologic modelling software, e.g. MIKE URBAN. This is partly
due to the need for a large number of parameters and large datasets in
the calibration process. To overcome this practical issue, a
framework for automatic calibration of a hydrologic model was
developed on the R platform and is presented in this paper. The model was
developed based on the time-area conceptualization. Four calibration
parameters, including initial loss, reduction factor, time of
concentration and time-lag were considered as the primary set of
parameters. Using these parameters, automatic calibration was
performed using Approximate Bayesian Computation (ABC). ABC is
a simulation-based technique for performing Bayesian inference
when the likelihood is intractable or computationally expensive to
compute. To test the performance and usefulness, the technique was
used to simulate three small catchments in Gold Coast. For
comparison, simulation outcomes from the same three catchments
using the commercial modelling software MIKE URBAN were used.
The graphical comparison shows strong agreement, with the MIKE
URBAN results falling within the upper and lower 95% credible
intervals of the posterior predictions obtained via ABC.
Statistical validation of the posterior runoff predictions using
the coefficient of determination (CD), root mean square error
(RMSE) and maximum error (ME) gave reasonable results for the
three study catchments. The main benefit of using
ABC over MIKE URBAN is that ABC provides a posterior
distribution for runoff flow prediction, and therefore associated
uncertainty in predictions can be obtained. In contrast, MIKE
URBAN provides just a point estimate. Based on the results of
the analysis, the developed ABC framework performs well for
automatic calibration.
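The rejection flavour of ABC can be illustrated in a few lines. The sketch below is a hypothetical one-parameter toy (a single initial-loss depth with a fixed runoff coefficient), written in Python rather than the paper's R framework; the rainfall series, prior range and tolerance are all invented for illustration.

```python
import numpy as np

# Hypothetical rejection-ABC sketch for calibrating one runoff
# parameter (an initial-loss depth, mm); the paper's framework
# (time-area model, four parameters) is more elaborate.
rng = np.random.default_rng(0)

rain = np.array([0.0, 2.0, 5.0, 8.0, 3.0, 1.0, 0.0])  # mm per step

def simulate_runoff(initial_loss):
    """Toy loss model: rainfall becomes runoff once cumulative
    rain exceeds the initial loss."""
    cum = np.cumsum(rain)
    effective = np.where(cum > initial_loss, rain, 0.0)
    return effective * 0.8  # fixed, assumed runoff coefficient

true_loss = 4.0
observed = simulate_runoff(true_loss) + rng.normal(0, 0.05, rain.size)

# Rejection ABC: draw from the prior, keep draws whose simulated
# hydrograph is close (in RMSE) to the observations.
prior_draws = rng.uniform(0.0, 10.0, 20000)
tol = 0.5
posterior = np.array(
    [p for p in prior_draws
     if np.sqrt(np.mean((simulate_runoff(p) - observed) ** 2)) < tol])

lo, hi = np.percentile(posterior, [2.5, 97.5])  # 95% credible interval
print(round(posterior.mean(), 2), round(lo, 2), round(hi, 2))
```

The accepted draws form an approximate posterior, from which a 95% credible interval for the prediction follows directly, mirroring the uncertainty bands discussed above.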
Abstract: In recent years, a wide variety of applications have been developed with Support Vector Machine (SVM) methods and Artificial Neural Networks (ANN). In general, these methods depend on intrusion knowledge databases such as KDD99, ISCX, and CAIDA, among others. New classes of detectors are generated by machine learning techniques, trained and tested on network databases. Thereafter, the detectors are employed to detect anomalies in network communication scenarios according to users' connection behavior. The first detector, based on the training dataset, is deployed in different real-world networks with mobile and non-mobile devices to analyze the performance and accuracy of static detection. The vulnerabilities are based on previous work on telemedicine apps developed by the research group. This paper presents the differences in detection results between several network scenarios, obtained by applying traditional detectors built with artificial neural networks and support vector machines.
Abstract: Flue gas desulfurization (FGD) gypsum is a waste
material arising from coal power plants. Hydroxyapatite (HAP) is
a biomaterial with a porous structure. In this study, FGD gypsum
retrieved from a coal power plant in Turkey was characterized,
and HAP particles that can be used as an adsorbent in wastewater
treatment applications were synthesized from the FGD gypsum. The
raw materials were characterized by using X-ray diffraction
(XRD) and Fourier transform infrared spectroscopy (FT-IR)
techniques, and the produced HAP was characterized by using XRD.
As a result, HAP particles were synthesized at molar ratios of
5:10, 5:15, 5:20 and 5:24, at room temperature, in an alkaline
medium (pH=11) and with a 1-hour reaction time. Among these
conditions, the 5:20 ratio gave the best result.
Abstract: The combination of the properties of graphene oxide
(OG) and the PVDF homopolymer makes their composites
multifunctional systems with great potential. Knowledge of the
molecular structure is essential for their better use. In this work, the
degradation of PVDF polymer exposed to gamma irradiation in
oxygen atmosphere in high dose rate has been studied and compared
to degradation of PVDF/OG composites. The samples were irradiated
with a Co-60 source at constant dose rate, with doses ranging from
100 kGy to 1,000 kGy. The FTIR data showed the formation of
oxidation products in both samples, with carbonyl and hydroxyl
groups among the most prevalent products in the pure PVDF
samples. On the other hand, the composite samples exhibited
fewer degradation products, with predominant formation of
carbonyl groups; these results were also seen in the UV-Vis
analysis. The results show that the composite samples may have
greater resistance to the irradiation process, since they show
fewer degradation products than the pure PVDF samples, as seen
by spectroscopic techniques.
Abstract: The irradiation of polymeric materials has received
much attention because it can produce diverse changes in chemical
structure and physical properties. Thus, studying the chemical and
structural changes of polymers is important in practice to achieve
optimal conditions for the modification of polymers. The effect of
gamma irradiation on the crystalline structure of poly(vinylidene
fluoride) (PVDF) has been investigated using differential scanning
calorimetry (DSC) and X-ray diffraction techniques (XRD). Gamma
irradiation was carried out in an air atmosphere with doses
between 100 kGy and 3,000 kGy using a Co-60 source. In the
melting thermograms of the irradiated samples, a bimodal melting
endotherm with two melting temperatures is detected. The lower melting
temperature is attributed to melting of crystals originally present and
the higher melting peak due to melting of crystals reorganized upon
heat treatment. These results are consistent with those obtained by
XRD technique showing increasing crystallinity with increasing
irradiation dose, although the melting latent heat is decreasing.
Abstract: The combination of multi–walled carbon nanotubes
(MWCNTs) with polymers offers an attractive route to reinforce the
macromolecular compounds as well as the introduction of new
properties based on morphological modifications or electronic
interactions between the two constituents. Since the nanotubes
are only a few nanometers in dimension, they offer an
ultra-large interfacial area per volume between the nano-element
and the polymer matrix. Nevertheless, the use of MWCNTs as a raw
material in different applications has been largely limited by
their poor processability, insolubility, and infusibility.
Studies concerning nanofiller-reinforced polymer
composites are justified in an attempt to overcome these limitations.
This work presents a preliminary study of MWCNT dispersion
into the PVDF homopolymer. For preparation, the composite
components were diluted in N,N-dimethylacetamide (DMAc) with the
assistance of mechanical agitation. After complete dilution, followed by
slow evaporation of the solvent at 60°C, the samples were dried.
Films of about 80 μm were obtained. FTIR and UV-Vis
spectroscopic techniques were used to characterize the
nanocomposites. The appearance of absorption bands in the FTIR
spectra of the nanofilled samples, when compared to the spectrum
of the pristine PVDF samples, is discussed and compared with the
UV-Vis measurements.
Abstract: This paper analyzes the institutionalization of social
protest in Spain. In the current crisis, the Podemos party seems
to represent the political positions of the citizens most
affected by the economic situation. Using quantitative
techniques (bivariate statistical analysis) and drawing on
several statistical databases from the Spanish Government's
Center for Sociological Research, the paper studies the
characterization of the 15M movement up to its
institutionalization in the Podemos party, comparing the profile
of 15M participants with the social bases of Podemos voters.
Data on the transformation of the socio-demographic profile of
the sympathizers, connoisseurs and participants of 15M, and of
its voters, are given.
Abstract: Nowadays, food safety is a great public concern;
therefore, robust and effective techniques are required for detecting
the safety of food products. Hyperspectral imaging (HSI) is an
attractive tool for researchers inspecting food quality and
safety, with applications such as meat quality assessment, automated poultry
carcass inspection, quality evaluation of fish, bruise detection of
apples, quality analysis and grading of citrus fruits, bruise detection
of strawberry, visualization of sugar distribution of melons,
measuring ripening of tomatoes, defect detection of pickling
cucumber, and classification of wheat kernels. HSI can be used to
concurrently collect large amounts of spatial and spectral data on the
objects being observed. This technique yields exceptional
detection capabilities, which otherwise cannot be achieved with
either imaging or spectroscopy alone. This paper presents a nonlinear
technique based on kernel Fukunaga-Koontz transform (KFKT) for
detection of fat content in ground meat using HSI. The KFKT which
is the nonlinear version of FKT, is one of the most effective
techniques for solving problems of a two-pattern nature. The
conventional FKT method has been improved with kernel machines
to increase its nonlinear discrimination ability and capture
higher-order statistics of the data. The approach proposed in
this paper aims to segment the fat content of ground meat by
regarding fat as the target class to be separated from the
remaining classes (clutter). We applied the KFKT to visible and near-infrared
(VNIR) hyperspectral images of ground meat to determine
fat percentage. The experimental studies indicate that the
proposed technique achieves high detection performance for the
fat ratio in ground meat.
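As a rough illustration of the two-pattern idea behind FKT, the sketch below computes the linear transform on synthetic target/clutter data; the paper's kernel version (KFKT) applies the same construction in a kernel-induced feature space. The data, dimensions and score threshold here are hypothetical.

```python
import numpy as np

# Illustrative (linear) Fukunaga-Koontz transform on toy two-class
# data standing in for "fat" (target) and "non-fat" (clutter) pixels.
rng = np.random.default_rng(1)

target  = rng.normal(0, 1, (200, 3)) * np.array([3.0, 1.0, 0.3])
clutter = rng.normal(0, 1, (200, 3)) * np.array([0.3, 1.0, 3.0])

R1 = target.T @ target / len(target)     # class correlation matrices
R2 = clutter.T @ clutter / len(clutter)

# Whiten R1 + R2 so the transformed class matrices share eigenvectors
# and their eigenvalues sum to one (the FKT property).
vals, vecs = np.linalg.eigh(R1 + R2)
W = vecs @ np.diag(vals ** -0.5)

S1 = W.T @ R1 @ W
S2 = W.T @ R2 @ W        # S1 + S2 == identity

# Eigenvectors of S1 with large eigenvalues capture the target class;
# those with small eigenvalues capture the clutter.
e1, V = np.linalg.eigh(S1)

def target_score(x):
    # energy of the whitened sample along the target-dominant FKT axes
    z = W.T @ x
    return np.sum((V[:, e1 > 0.5].T @ z) ** 2)

print(np.round(e1, 3))
```

Classification then reduces to thresholding `target_score` per pixel; the kernel version replaces the correlation matrices with their kernel-matrix counterparts.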
Abstract: Voltage level must be raised in order to deliver the
produced energy to the consumption zones with less loss and less
cost. Power transformers used to raise or lower voltage are important
parts of the energy transmission system. Power transformers used
in switchgear and power generation plants increasingly lie
within densely inhabited zones as a result of expanding cities.
Accordingly, the noise levels produced by power transformers
have become more and more important, and transformer noise has
established itself as a research field. In this research, the
causes of transformer noise have been investigated and noise
measurement techniques have been introduced. Examples of
transformer noise test results are presented, and precautions to
be taken to reduce the noise produced by transformers are
discussed.
Abstract: The UK has had its fair share of the shale gas
revolutionary waves blowing across the global oil and gas industry at
present. Although its exploitation is widely agreed to have been
delayed, shale gas was looked upon favorably by the UK
Parliament when it recognized it as a genuine energy source and
granted licenses to industry to search for and extract the
resource. Although this is significant progress by industry,
there remains another test the UK fracking resource must pass in
order to render shale gas extraction feasible – it must be
economically extractible and
sustainably so. Developing unconventional resources is much more
expensive and risky, and for shale gas wells, producing in
commercial volumes is conditional upon drilling horizontal wells and
hydraulic fracturing, techniques which increase CAPEX. Meanwhile,
investment in shale gas development projects is sensitive to gas price
and technical and geological risks. Using a Two-Factor Model,
the economics of the Bowland shale wells were analyzed and the
operational conditions under which fracking is profitable in the
UK were characterized. We find that there is a great degree of
flexibility in Opex spending; hence Opex does not pose much
threat to the fracking industry in the UK. However, we find that
Bowland shale gas wells fail to add value at a gas price of
$8/MMBtu. A minimum gas price of $12/MMBtu, at an Opex of no
more than $2/Mcf and a Capex of no more than $14.95M, is
required to create value within the present petroleum tax regime
in the UK fracking industry.
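The mechanics of such a value test can be sketched with a plain discounted cash flow. The figures below (production profile, discount rate, MMBtu-to-Mcf conversion) are hypothetical illustrations, not the paper's stochastic Two-Factor Model or its tax treatment, though with these invented inputs the toy well is likewise uneconomic at the lower price.

```python
# Illustrative single-well discounted cash flow; all figures are
# hypothetical and much simpler than the paper's Two-Factor Model
# (no stochastic gas price, no UK tax detail).

def well_npv(gas_price, capex, opex_per_mcf, production_mcf, rate=0.10):
    """NPV in $ of a well given a yearly production profile (Mcf)
    and a flat gas price ($/Mcf)."""
    npv = -capex
    for year, q in enumerate(production_mcf, start=1):
        cash = q * (gas_price - opex_per_mcf)   # netback per Mcf
        npv += cash / (1 + rate) ** year        # discount to present
    return npv

# Hypothetical declining production profile over five years (Mcf/yr).
profile = [900_000, 540_000, 380_000, 290_000, 230_000]

# Convert $/MMBtu to $/Mcf (~1.037 MMBtu per Mcf of typical gas).
for price_mmbtu in (8, 12):
    price_mcf = price_mmbtu * 1.037
    print(price_mmbtu, round(well_npv(price_mcf, 14_950_000, 2.0, profile)))
```

The Two-Factor Model extends this by treating the gas price as a stochastic process (short-term deviations plus a long-term equilibrium), so the NPV becomes a distribution rather than a single number.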
Abstract: This paper describes a simple way to control the speed
of a PMBLDC motor using the fuzzy logic control method. In the
conventional approach, the performance of the motor system is
simulated and the speed is regulated by using a PI controller.
Such methods are used to improve the performance of PMSM drives,
but under different operating conditions the dynamics of the
system vary over time with changes in the reference speed,
parameter variations and load disturbances. The simulation is
implemented in MATLAB to obtain a reliable and flexible
simulation. In order to highlight the effectiveness of the speed
control method, the FLC method is used. The proposed method
targets improved dynamic performance and avoids the variations
of the motor drive. This drive has high accuracy and robust
operation from near zero to high speed. The effectiveness and
flexibility of the individual techniques of the speed control method
will be thoroughly discussed for merits and demerits and finally
verified through simulation and experimental results for comparative
analysis.
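A minimal sketch of how a fuzzy logic controller maps speed error to a control action is given below. The membership functions, universe and rule outputs are hypothetical and far simpler than a tuned two-input (error and change-of-error) FLC for a PMBLDC drive.

```python
# Minimal sketch of a Mamdani-style fuzzy speed controller: one input
# (normalized speed error) and one output (a control increment).
# Membership ranges and rule outputs are hypothetical, not the
# paper's tuned FLC.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_speed_control(error):
    """Map a normalized speed error in [-1, 1] to a control increment."""
    rules = [  # (firing strength, crisp output of the rule)
        (tri(error, -1.5, -1.0, 0.0), -1.0),  # negative error -> decrease
        (tri(error, -1.0,  0.0, 1.0),  0.0),  # zero error     -> hold
        (tri(error,  0.0,  1.0, 1.5),  1.0),  # positive error -> increase
    ]
    num = sum(w * out for w, out in rules)    # weighted-average
    den = sum(w for w, _ in rules)            # (centroid) defuzzification
    return num / den if den else 0.0

for e in (-0.8, 0.0, 0.5):
    print(e, round(fuzzy_speed_control(e), 3))
```

Unlike a fixed-gain PI controller, the rule outputs and membership shapes can be reshaped to cope with the parameter variations and load disturbances mentioned above.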
Abstract: Seeking and sharing knowledge on online forums
have made them popular in recent years. Although online forums are
valuable sources of information, due to the variety of message
sources, retrieving reliable threads with high-quality content
is an issue. The majority of existing information retrieval
systems ignore the quality of retrieved documents, particularly
in the field of thread retrieval. In this research, we present
an approach that employs various quality features in order to
investigate the quality of retrieved threads. Different aspects
of content quality, including completeness, comprehensiveness,
and politeness, are assessed using these features, which leads
to finding not only textually but also conceptually relevant
threads for a user query within a forum. To analyse the
influence of the features, we used an adapted version of the
voting model for thread search as a retrieval system. We
equipped it with each feature individually, and with various
combinations of features in turn, during multiple runs. The
results show that incorporating the quality features
enhances the effectiveness of the utilised retrieval system
significantly.
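One simple way to fold such quality features into a retrieval score is linear interpolation with the base relevance score. The sketch below is hypothetical: the feature values, weights and interpolation constant are invented, and the paper's voting model is more involved.

```python
# Hypothetical re-scoring of retrieved forum threads with quality
# features; thread scores, feature values and weights are illustrative.

threads = [  # (thread id, base retrieval score, quality features)
    ("t1", 0.82, {"completeness": 0.3, "comprehensiveness": 0.4, "politeness": 0.9}),
    ("t2", 0.78, {"completeness": 0.9, "comprehensiveness": 0.8, "politeness": 0.8}),
    ("t3", 0.80, {"completeness": 0.5, "comprehensiveness": 0.5, "politeness": 0.2}),
]

WEIGHTS = {"completeness": 0.4, "comprehensiveness": 0.4, "politeness": 0.2}

def quality_score(features):
    # weighted sum of the normalized quality features
    return sum(WEIGHTS[name] * value for name, value in features.items())

def rescore(base, features, alpha=0.6):
    # interpolate textual relevance with content quality
    return alpha * base + (1 - alpha) * quality_score(features)

ranking = sorted(threads, key=lambda t: rescore(t[1], t[2]), reverse=True)
print([tid for tid, _, _ in ranking])  # → ['t2', 't1', 't3']
```

Here the most complete and comprehensive thread (t2) overtakes a slightly more textually relevant but lower-quality one (t1), which is the behaviour the quality features are meant to induce.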
Abstract: The cities of Johannesburg and Pretoria, both located in the Gauteng province, are separated by a distance of 58 km. The traffic queues on the Ben Schoeman freeway, which connects these two cities, can stretch for almost 1.5 km. Vehicle traffic congestion impacts negatively on business and on commuters' quality of life. The goal of this paper is to identify variables that influence the flow of traffic and to design a vehicle traffic prediction model, which will predict the traffic flow pattern in advance. The model will enable motorists to make appropriate travel decisions ahead of time. The data used were collected by Mikro's Traffic Monitoring (MTM). A multi-layer perceptron (MLP) was used individually to construct the model, and the MLP was also combined with the bagging ensemble method to train on the data. The cross-validation method was used for evaluating the models. The results obtained from the techniques were compared using predictive performance and prediction costs. The cost was computed using a combination of the loss matrix and the confusion matrix. The prediction models designed show that the status of the traffic flow on the freeway can be predicted using the following parameters: travel time, average speed, traffic volume and day of month. The implication of this work is that commuters will be able to spend less time travelling on the route and more time with their families. The logistics industry will save more than twice what it is currently spending.
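The loss-matrix/confusion-matrix cost computation mentioned above can be sketched as follows; the counts and per-error costs are hypothetical.

```python
import numpy as np

# Illustrative prediction cost from a confusion matrix combined with a
# loss matrix, as used to compare the traffic models; all numbers here
# are hypothetical.

# Rows: actual state, columns: predicted state ("free", "congested").
confusion = np.array([[420,  35],
                      [ 50, 295]])

# Cost of each (actual, predicted) outcome; misclassifying congestion
# as free flow is assumed to be the costlier error.
loss = np.array([[0.0, 1.0],
                 [4.0, 0.0]])

expected_cost = (confusion * loss).sum() / confusion.sum()
accuracy = np.trace(confusion) / confusion.sum()
print(round(accuracy, 3), round(expected_cost, 3))  # → 0.894 0.294
```

Weighting errors this way lets two models with similar accuracy be separated by how often they make the expensive mistake (predicting free flow when the road is congested).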
Abstract: Recently, traffic monitoring has attracted the attention
of computer vision researchers. Many algorithms have been
developed to detect and track moving vehicles. In fact, vehicle
tracking in daytime and in nighttime cannot be approached with the
same techniques, due to the extremely different illumination conditions.
Consequently, traffic-monitoring systems are in need of having a
component to differentiate between daytime and nighttime scenes. In
this paper, a HSV-based day/night detector is proposed for traffic
monitoring scenes. The detector employs the hue-histogram and the
value-histogram on the top half of the image frame. Experimental
results show that the extraction of the brightness features along with
the color features within the top region of the image is effective for
classifying traffic scenes. In addition, the detector achieves high
precision and recall rates, and is feasible for real-time
applications.
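The brightness half of such a detector can be sketched in a few lines of numpy; the threshold and synthetic frames below are hypothetical, and the actual detector additionally uses the hue histogram.

```python
import numpy as np

# Simplified sketch of the HSV-based day/night cue: examine brightness
# (the V channel) over the top half of the frame, where sky dominates.

def is_daytime(frame_rgb, v_threshold=0.5):
    top = frame_rgb[: frame_rgb.shape[0] // 2]   # top half of the image
    v = top.max(axis=2) / 255.0                  # HSV value channel
    return v.mean() > v_threshold                # bright sky -> daytime

# Synthetic stand-ins for a bright daytime and a dark nighttime frame.
rng = np.random.default_rng(2)
day   = rng.integers(150, 256, (120, 160, 3), dtype=np.uint8)
night = rng.integers(0,   60,  (120, 160, 3), dtype=np.uint8)
print(is_daytime(day), is_daytime(night))
```

Restricting the statistics to the top region avoids headlight glare and road-surface reflections in the lower half of traffic scenes, which is the design point the abstract highlights.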
Abstract: Nowadays, education cannot be imagined without digital technologies, which broaden the horizons of teaching and learning processes. Several universities are offering online courses. For evaluation purposes, e-examination systems are being widely adopted in academic environments. Multiple-choice tests are extremely popular. In the move from traditional examinations to e-examinations, Moodle, a Learning Management System (LMS), is being used. Moodle logs every click that students make for attempting and navigational purposes in an e-examination. Data mining has been applied in various domains, including retail sales and bioinformatics. In recent years, there has been increasing interest in the use of data mining in e-learning environments. It has been applied to discover, extract, and evaluate parameters related to students' learning performance. The combination of data mining and e-learning is still in its infancy. Log data generated by students during an online examination can be used to discover knowledge with the help of data mining techniques. In web-based applications, the number of right and wrong answers in a test result is not sufficient to assess and evaluate a student's performance, so assessment techniques must be intelligent enough. If a student cannot answer the question asked by the instructor, an easier question can be asked; otherwise, a more difficult question on a similar topic can be posed. To do so, it is necessary to identify the difficulty level of the questions. The proposed work concentrates on this issue. A specific data mining technique, clustering, is used in this work. This method decides the difficulty level of each question and categorizes it as tough, moderate or easy, and the questions are later served to the desired students based on their performance. The proposed experiment categorizes the question set and also groups the students based on their performance in the examination. This will help the instructor to guide the students more specifically. In short, the mined knowledge helps to support, guide, facilitate and enhance learning as a whole.
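The difficulty-level clustering can be sketched as a one-dimensional k-means over per-question error rates; the rates, the choice of k=3 and the easy/moderate/tough mapping below are hypothetical.

```python
import numpy as np

# Hypothetical sketch: cluster questions into easy/moderate/tough from
# the fraction of students answering each question wrongly, using a
# simple 1-D k-means (Lloyd's algorithm).

wrong_rate = np.array([0.05, 0.10, 0.12, 0.40, 0.45, 0.52, 0.80, 0.85, 0.90])

centers = np.array([0.1, 0.5, 0.9])          # initial guesses
for _ in range(20):                          # Lloyd's iterations
    # assign each question to its nearest center ...
    labels = np.argmin(np.abs(wrong_rate[:, None] - centers), axis=1)
    # ... then move each center to the mean of its cluster
    centers = np.array([wrong_rate[labels == k].mean() for k in range(3)])

names = np.array(["easy", "moderate", "tough"])
for q, lab in zip(wrong_rate, labels):
    print(q, names[lab])
```

In practice the feature vector per question would also include Moodle log signals such as time spent and number of answer changes, but the clustering step is the same.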
Abstract: Although Mobile Wireless Sensor Networks (MWSNs),
which consist of mobile sensor nodes (MSNs), can cover a wide
observation region by using a small number of sensor nodes, they
need to construct a network to collect the sensing data on the base
station by moving the MSNs. As an effective method, the network
construction method based on Virtual Rails (VRs), which is referred
to as VR method, has been proposed. In this paper, we propose two
types of effective techniques for the VR method. They can prolong
the operation time of the network, which is limited by the
battery capacities and energy consumption of the MSNs. The
first technique, an effective arrangement of VRs, almost equalizes
the number of MSNs belonging to each VR. The second technique,
an adaptive movement method of MSNs, takes into account the
residual energy of battery. In the simulation, we demonstrate that each
technique can improve the network lifetime and the combination of
both techniques is the most effective.
Abstract: This paper outlines the development of an
experimental technique in quantifying supersonic jet flows, in an
attempt to avoid seeding particle problems frequently associated with
particle-image velocimetry (PIV) techniques at high Mach numbers.
Based on optical flow algorithms, the idea behind the technique
involves using high speed cameras to capture Schlieren images of the
supersonic jet shear layers, before they are subjected to an adapted
optical flow algorithm based on the Horn-Schunck method to
determine the associated flow fields. The proposed method is capable
of offering full-field unsteady flow information with potentially
higher accuracy and resolution than existing point-measurements or
PIV techniques. Preliminary study via numerical simulations of a
circular de Laval jet nozzle successfully reveals flow and shock
structures typically associated with supersonic jet flows, which serve
as useful data for subsequent validation of the optical flow based
experimental results. For the experimental technique, a Z-type Schlieren
setup is proposed with supersonic jet operated in cold mode,
stagnation pressure of 4 bar and an exit Mach number of 1.5.
High-speed single-frame or double-frame cameras are used to capture successive
Schlieren images. As the implementation of optical flow
techniques for supersonic flows remains rare, the current focus revolves around
methodology validation through synthetic images. The results of
the validation test offer valuable insight into how the optical
flow algorithm can be further refined for better robustness and
accuracy. Despite these challenges, however, this supersonic flow
measurement technique may potentially offer a simpler way to
identify and quantify the fine spatial structures within the shock shear
layer.
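The Horn-Schunck scheme at the heart of the technique can be sketched in a few lines. The test pattern below (a linear ramp translated by one pixel, standing in for successive Schlieren frames) and the smoothing weight are illustrative; real Schlieren imagery needs pyramids, boundary handling and parameter tuning.

```python
import numpy as np

# Minimal Horn-Schunck optical flow: brightness-constancy data term
# plus a smoothness term enforced via four-neighbour averaging.
def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
    Ix = 0.5 * (np.gradient(im1, axis=1) + np.gradient(im2, axis=1))
    Iy = 0.5 * (np.gradient(im1, axis=0) + np.gradient(im2, axis=0))
    It = im2 - im1
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    for _ in range(n_iter):
        u_avg = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                 np.roll(u, 1, 1) + np.roll(u, -1, 1)) / 4.0
        v_avg = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
                 np.roll(v, 1, 1) + np.roll(v, -1, 1)) / 4.0
        # jointly update u, v toward the brightness-constancy solution
        t = (Ix * u_avg + Iy * v_avg + It) / (alpha ** 2 + Ix ** 2 + Iy ** 2)
        u = u_avg - Ix * t
        v = v_avg - Iy * t
    return u, v

# Synthetic pair: a linear ramp translated by +1 pixel in x, so the
# true flow is u = 1, v = 0 everywhere.
im1 = np.tile(np.arange(64.0), (64, 1))
im2 = im1 - 1.0
u, v = horn_schunck(im1, im2)
print(round(u.mean(), 3), round(v.mean(), 3))  # → 1.0 0.0
```

On this uniform-gradient pattern the iteration recovers the imposed shift exactly, which is the kind of synthetic-image validation the abstract describes before moving to real Schlieren data.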