Abstract: Frequent, continuous speech training has proven to be
a necessary part of a successful speech therapy process, but the
constraints of travel time and time away from work become key
obstacles, especially for individuals living in remote areas or for
dependent children of working parents. In order to ameliorate
speech difficulties with ample guidance from speech therapists, a
website has been developed that supports speech therapy and training
for people with articulation disorders in the standard Thai language.
This web-based program has the ability to record speech training
exercises for each speech trainee. The records will be stored in a
database for the speech therapist to investigate, evaluate, compare
and keep track of all trainees’ progress in detail. Speech trainees can
request live discussions via video conference call when needed.
Communication through this web-based program facilitates training and
reduces training time in comparison to walk-in training or
appointments. This type of training also allows people with
articulation disorders to practice speech lessons whenever or
wherever is convenient for them, which can lead to a more regular
training process.
Abstract: Frequent pattern mining is the process of finding a
pattern (a set of items, subsequences, substructures, etc.) that occurs
frequently in a data set. It was proposed in the context of frequent
itemsets and association rule mining. Frequent pattern mining is used
to find inherent regularities in data: for example, what products are
often purchased together? Its applications include basket data analysis,
cross-marketing, catalog design, sale campaign analysis, Web log
(click stream) analysis, and DNA sequence analysis. However, one of
the bottlenecks of frequent itemset mining is that as the data grow,
the amount of time and resources required to mine the data
increases at an exponential rate. In this investigation a new algorithm
is proposed which can be used as a pre-processor for frequent itemset
mining. FASTER (FeAture SelecTion using Entropy and Rough sets)
is a hybrid pre-processor algorithm which utilizes entropy and rough
sets to carry out record reduction and feature (attribute) selection,
respectively. For frequent itemset mining, FASTER can produce a
speed-up of 3.1 times compared to the original algorithm while
maintaining an accuracy of 71%.
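The abstract does not spell out FASTER's internals; as a rough illustration of the entropy-driven feature-selection half of such a pre-processor (the rough-set record-reduction step is omitted, and the toy data below is hypothetical), a minimal Python sketch might look like this:

    import numpy as np

    def entropy(labels):
        # Shannon entropy of a discrete label array.
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def information_gain(feature, labels):
        # Reduction in label entropy after splitting on a discrete feature.
        total = entropy(labels)
        remainder = 0.0
        for v in np.unique(feature):
            mask = feature == v
            remainder += mask.mean() * entropy(labels[mask])
        return total - remainder

    # Rank attributes by information gain and keep the top k (here k=2).
    X = np.array([[1, 0, 1], [1, 1, 0], [0, 0, 1], [0, 1, 0]])  # toy records
    y = np.array([1, 1, 0, 0])                                   # toy classes
    gains = [information_gain(X[:, j], y) for j in range(X.shape[1])]
    selected = np.argsort(gains)[::-1][:2]
    print("information gains:", gains, "selected attributes:", selected)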
Abstract: The color histogram is considered the oldest method
used by CBIR systems for indexing images. However, global
histograms do not include spatial information, which is why later
techniques have attempted to overcome this limitation by involving
segmentation as a preprocessing step. Local histograms employ weak
segmentation, while other methods such as CCV (Color Coherence
Vector) are based on strong segmentation. The indexation based on
local histograms consists of
splitting the image into N overlapping blocks or sub-regions, and
then the histogram of each block is computed. Computing the
dissimilarity between two images is thus reduced to computing the
distances between the N local histograms of both images, resulting
in N*N values; generally, the lowest value is used to rank images,
which means that the lowest value designates the sub-region pairing
used to index the images of the queried collection. In this paper,
we examine the local histogram
indexation method and compare the results obtained against
those given by the global histogram. We also address another
noteworthy issue when relying on local histograms, namely which of
the N*N values to trust when comparing images, in other words,
which sub-region pairing to use as the basis for indexing images.
Based on the results achieved here, it seems
that relying on local histograms, which imposes extra overhead on
the system through an additional preprocessing step, namely
segmentation, does not necessarily produce better results. In
addition, we propose some ideas for selecting the local histogram
used to encode the image, rather than relying on the local histogram
having the lowest distance to the query histograms.
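As an illustration of the scheme described above, the following Python sketch computes the N local histograms of two images, forms the N*N distance matrix, and keeps the lowest value; grayscale images, non-overlapping blocks, and L1 distance are simplifying assumptions:

    import numpy as np

    def local_histograms(img, grid=2, bins=16):
        # Split the image into grid*grid blocks and return one histogram per block.
        h, w = img.shape
        hists = []
        for i in range(grid):
            for j in range(grid):
                block = img[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid]
                hist, _ = np.histogram(block, bins=bins, range=(0, 256), density=True)
                hists.append(hist)
        return np.array(hists)               # shape (N, bins), N = grid*grid

    def min_local_distance(img_a, img_b, grid=2):
        # N*N matrix of L1 distances between all block-histogram pairs;
        # the lowest value ranks the image pair, per the rule described above.
        ha, hb = local_histograms(img_a, grid), local_histograms(img_b, grid)
        d = np.abs(ha[:, None, :] - hb[None, :, :]).sum(axis=2)  # (N, N)
        return d.min()

    a = np.random.randint(0, 256, (64, 64))   # stand-in images
    b = np.random.randint(0, 256, (64, 64))
    print(min_local_distance(a, b))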
Abstract: The article presents the results of the application of
artificial neural networks to separate the fluorescent contribution of
nanodiamonds used as biomarkers, adsorbents and carriers of drugs
in biomedicine, from the fluorescent background of intrinsic biological
fluorophores. The fundamental feasibility of solving this problem is
demonstrated. The use of a neural network architecture makes it
possible to detect the fluorescence of nanodiamonds against the
background autofluorescence of egg white with high accuracy, better
than 3 µg/ml.
Abstract: In this study, a comparative analysis of the approaches
associated with the use of neural network algorithms for effective
solution of a complex inverse problem – the problem of identifying
and determining the individual concentrations of inorganic salts in
multicomponent aqueous solutions by the spectra of Raman
scattering of light – is performed. It is shown that the application of
artificial neural networks provides an average accuracy of
determination of the concentration of each salt no worse than 0.025 M.
The results of comparative analysis of input data compression
methods are presented. It is demonstrated that the use of uniform
aggregation of input features decreases the error of determination of
the individual concentrations of the components by 16-18% on
average.
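The paper's exact compression scheme is not given in the abstract; reading "uniform aggregation of input features" as averaging equal-width groups of adjacent spectral channels, a minimal numpy sketch would be:

    import numpy as np

    def aggregate_uniform(spectrum, group=8):
        # Average every `group` adjacent spectral channels into one input
        # feature, truncating any leftover channels at the end.
        n = len(spectrum) // group * group
        return spectrum[:n].reshape(-1, group).mean(axis=1)

    raman = np.random.rand(1024)          # stand-in for a 1024-channel Raman spectrum
    features = aggregate_uniform(raman)   # 128 compressed network inputs
    print(features.shape)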
Abstract: This research proposes a novel reconstruction protocol
for restoring missing surfaces and low-quality edges and shapes in
photos of artifacts at historical sites. The protocol starts with the
extraction of a point cloud. This extraction process is based on
four subordinate algorithms, which differ in their robustness and in
the amount of resulting data. Moreover, they apply different, but
complementary, degrees of accuracy to related features and to the
way they build a quality mesh. The performance of our proposed protocol
is compared with other state-of-the-art algorithms and toolkits. The
statistical analysis shows that our algorithm significantly outperforms
its rivals in the resultant quality of its object files used to reconstruct
the desired model.
Abstract: The paper presents the results of clustering by
Kohonen self-organizing maps (SOM) applied to the analysis of arrays
of Raman spectra of multi-component solutions of inorganic salts, for
the determination of the types of salts present in the solution. It is
demonstrated that the use of SOM is a promising method for solving
clustering and classification problems in the spectroscopy of
multi-component objects, as attributing a pattern to some cluster may
be used to recognize the component composition of the object.
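One way to reproduce this workflow is the third-party minisom package (an assumption of convenience, not necessarily the authors' implementation): train a small map on the spectra, then treat each pattern's winning node as its cluster.

    import numpy as np
    from minisom import MiniSom   # third-party package: pip install minisom

    # Stand-in data: 200 "spectra" of 64 channels each.
    spectra = np.random.rand(200, 64)

    som = MiniSom(5, 5, 64, sigma=1.0, learning_rate=0.5, random_seed=0)
    som.random_weights_init(spectra)
    som.train_random(spectra, 2000)       # unsupervised training

    # The winning node of each spectrum serves as its cluster label;
    # attributing a pattern to a cluster then hints at the salt composition.
    clusters = [som.winner(s) for s in spectra]
    print(clusters[:5])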
Abstract: The system is designed to retrieve images related to a
query image. Extracting color, texture, and shape features from an
image plays a vital role in content-based image retrieval (CBIR).
Initially, the RGB image is converted into the HSV color space due to
its perceptual uniformity. From the HSV image, color features are
extracted using a block color histogram, texture features using the
Haar transform, and shape features using the Fuzzy C-means
algorithm. Then, global and local color histograms, texture features
obtained through the co-occurrence matrix and the Haar wavelet
transform, and shape features are compared and analyzed for CBIR.
Finally, the best method for each feature is fused during similarity
measurement to improve image retrieval effectiveness and accuracy.
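The first two steps, HSV conversion and a block color histogram, can be sketched with OpenCV as follows; the grid size, bin count, and the use of the hue channel alone are illustrative choices, and the Haar and Fuzzy C-means stages are not shown:

    import cv2
    import numpy as np

    def block_color_histogram(bgr_img, grid=4, bins=8):
        # Convert to HSV (perceptually more uniform than RGB), then concatenate
        # one hue histogram per block into a single feature vector.
        hsv = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2HSV)
        h, w = hsv.shape[:2]
        feats = []
        for i in range(grid):
            for j in range(grid):
                block = hsv[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid]
                hist = cv2.calcHist([block], [0], None, [bins], [0, 180])  # hue
                feats.append(cv2.normalize(hist, hist).flatten())
        return np.concatenate(feats)

    img = np.random.randint(0, 256, (128, 128, 3), dtype=np.uint8)  # stand-in image
    print(block_color_histogram(img).shape)  # grid*grid*bins = 128 values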
Abstract: Opportunistic routing is used where the network exhibits
features such as dynamic topology changes and intermittent
connectivity. Opportunistic forwarding techniques are widely used in
delay-tolerant and disruption-tolerant networks. The key idea of
opportunistic routing is selecting forwarding nodes to forward data
packets and coordinating among these nodes to avoid duplicate
transmissions. This paper analyzes the pros and cons of various
opportunistic routing techniques used in MANETs.
Abstract: Over the years, many efforts and studies have been
carried out to develop proficient tools for performing various tasks
in big data. Recently, big data has received a great deal of publicity,
and for good reason. Large and complex collections of datasets are
difficult to process with traditional data processing applications, a
concern that makes the development of dedicated big data tools all
the more necessary. Moreover, the main aim of big data analytics is
to apply advanced analytic techniques to very large, heterogeneous
datasets, which range in size from terabytes to zettabytes and in type
from structured to unstructured and from batch to streaming. Big data
techniques are useful for datasets whose size or type is beyond the
capability of traditional relational databases to capture, manage, and
process with low latency. These challenges have led to the emergence
of powerful big data tools. In this survey, a varied collection of big
data tools is illustrated and compared with respect to their salient
features.
Abstract: This research aims to develop an online class-scheduling
management system that addresses class scheduling as a complex
problem whose solution must take various conditions and factors into
consideration. In addition to the number of courses, the number of
students, and the study timetable, the physical characteristics of each
classroom and the regulations governing class scheduling must also
be taken into account. The system is developed to assist management
in class scheduling for convenience and efficiency. It allows several
instructors to schedule simultaneously, and both lecturers and
students can immediately check and publish timetables and other
documents associated with the system online. The system is
developed as a web-based application, with PHP as the development
tool and MySQL as the database management system. The efficiency
of the system was assessed with a questionnaire, and the system was
evaluated using black-box testing. The sample was composed of two
groups: 5 experts and 100 general users. The average and standard
deviation of the results from the experts were 3.50 and 0.67; those
from the general users were 3.54 and 0.54. In summary, the results
indicate that user satisfaction was at a good level. Therefore, this
system could be implemented in an actual workplace and satisfy
users' requirements effectively.
Abstract: Access control is one of the most challenging issues
facing information security. Access control is defined as the ability to
permit or deny access to a particular computational resource or piece
of digital information by a given user or subject. The concept of usage
control (UCON) has been introduced as a unified approach to capture a
number of extensions for access control models and systems. In
UCON, an access decision is determined by three factors:
authorizations, obligations and conditions. Attribute mutability and
decision continuity are two distinct characteristics introduced by
UCON for the first time. An observation of the UCON components
indicates that they are predefined and static. In this paper, we
propose a new and flexible model of usage control that supports the
creation and elimination of some of these components, for example
new objects, subjects, and attributes, and integrates these with the
original UCON model. We also propose a model for concurrent usage
scenarios in UCON.
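As a purely illustrative sketch of the three-factor UCON decision (not the model proposed here, and without attribute mutability or decision continuity), usage is granted only when all three predicates hold; all names below are hypothetical:

    # Illustrative sketch only: hypothetical predicates for the three UCON
    # decision factors; attribute mutability and decision continuity are omitted.

    def authorized(subject, obj, right):
        # Authorization: compare subject and object attributes.
        return (right in subject.get("rights", set())
                and subject["clearance"] >= obj["sensitivity"])

    def obligations_fulfilled(subject):
        # Obligation: an action the subject must have performed,
        # e.g. accepting a license agreement.
        return subject.get("license_accepted", False)

    def conditions_hold(env):
        # Condition: environmental requirements, e.g. access during work hours.
        return 8 <= env["hour"] < 18

    def usage_decision(subject, obj, right, env):
        return (authorized(subject, obj, right)
                and obligations_fulfilled(subject)
                and conditions_hold(env))

    subject = {"rights": {"read"}, "clearance": 2, "license_accepted": True}
    obj = {"sensitivity": 1}
    print(usage_decision(subject, obj, "read", {"hour": 10}))  # True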
Abstract: This paper presents general results on the Java source
code snippet detection problem. We propose the tool which uses
graph and subgraph isomorphism detection. A number of solutions
for all of these tasks have been proposed in the literature. However,
although all these solutions are very fast, they compare only
constant, static trees. Our solution allows an input sample to be
entered dynamically with the Scripthon language while preserving
acceptable speed. We used several optimizations to achieve a very
low number of comparisons during the matching algorithm.
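The tool itself and the Scripthon language are not reproduced here; as a generic illustration of subgraph-isomorphism-based snippet matching, the networkx library can test whether a small "snippet graph" occurs inside a "program graph":

    import networkx as nx
    from networkx.algorithms import isomorphism

    # Toy stand-ins: a "program graph" and a smaller "snippet graph".
    program = nx.DiGraph([("decl", "assign"), ("assign", "call"), ("call", "return")])
    snippet = nx.DiGraph([("assign", "call"), ("call", "return")])

    # Does the snippet occur as an (induced) subgraph of the program graph?
    matcher = isomorphism.DiGraphMatcher(program, snippet)
    print(matcher.subgraph_is_isomorphic())          # True
    for mapping in matcher.subgraph_isomorphisms_iter():
        print(mapping)                               # program-node -> snippet-node map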
Abstract: This paper addresses the reduction of peak to average
power ratio (PAPR) for the OFDM in Mobile-WiMAX physical layer
(PHY) standard. In the process, the best achievable PAPR of 0 dB is
obtained for the OFDM spectrum using a phase modulation technique
that avoids nonlinear distortion. The performance of the WiMAX
PHY standard is evaluated on a software defined radio (SDR)
prototype in which GNU Radio and the USRP N210 are employed as
the software and hardware platforms, respectively. BER performance
is also shown for different coding and modulation schemes. To
characterize wireless propagation in specific environments, a sliding
correlator wireless channel sounding system is designed using the
SDR testbed.
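For reference, PAPR is the ratio of peak to average instantaneous power of the complex baseband signal. A numpy sketch for one randomly loaded OFDM symbol (the subcarrier count and oversampling factor are illustrative, not the Mobile-WiMAX parameters):

    import numpy as np

    def papr_db(x):
        # Peak-to-average power ratio of a complex baseband signal, in dB.
        power = np.abs(x) ** 2
        return 10 * np.log10(power.max() / power.mean())

    # One OFDM symbol: 256 random QPSK subcarriers, 4x oversampled IFFT.
    n_sc, oversample = 256, 4
    qpsk = (np.random.choice([-1, 1], n_sc)
            + 1j * np.random.choice([-1, 1], n_sc)) / np.sqrt(2)
    spectrum = np.concatenate([qpsk, np.zeros((oversample - 1) * n_sc)])
    symbol = np.fft.ifft(spectrum)
    print(f"PAPR = {papr_db(symbol):.1f} dB")   # typically 8-12 dB before reduction

A constant-envelope (purely phase modulated) waveform makes the peak power equal to the mean power, which is the 0 dB figure quoted above.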
Abstract: Cognitive radio is an emerging technology that enables
effective usage of the spectrum. Energy detector-based sensing is the
most widely used spectrum sensing strategy. Moreover, it is highly
generic, as receivers do not need any information about the primary
user's signals, channel data, or even the type of modulation. This
paper presents the implementation of energy detection sensing for an
AM (amplitude modulated) signal at 710 kHz, an FM (frequency
modulated) signal at 103.45 MHz (a local station frequency), a Wi-Fi
signal at 2.4 GHz, and WiMAX signals at 6 GHz. The
OFDM/OFDMA-based WiMAX physical layer with convolutional
channel coding is implemented using the USRP N210 (Universal
Software Radio Peripheral) and GNU Radio based Software Defined
Radio (SDR). Test results demonstrate that the BER (bit error rate)
increases with channel noise, and BER performance is analyzed for
different Eb/N0 (energy per bit to noise power spectral density ratio)
values.
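The core of an energy detector is a test statistic, the average power of N received samples, compared against a threshold. A minimal numpy sketch with a toy signal (in practice the threshold would be derived from a target false-alarm rate):

    import numpy as np

    def energy_detect(x, threshold):
        # Test statistic: average power of the N received samples.
        return (np.abs(x) ** 2).mean() > threshold

    n, snr_linear = 1024, 0.5
    noise = (np.random.randn(n) + 1j * np.random.randn(n)) / np.sqrt(2)  # unit power
    signal = np.sqrt(snr_linear) * np.exp(2j * np.pi * 0.1 * np.arange(n))  # toy tone

    threshold = 1.2   # illustrative; set from the false-alarm rate in practice
    print("signal present:", energy_detect(noise + signal, threshold))
    print("noise only:    ", energy_detect(noise, threshold))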
Abstract: A circularly polarized fractal boundary microstrip
antenna is presented. The sides of a square patch along the x-axis and
y-axis are replaced with Minkowski and Koch curves, respectively.
By using the fractal curves as edges, asymmetry in the structure is
created to excite two orthogonal modes for circular polarization (CP)
operation. The indentation factors of the fractal curves are optimized
for pure CP. Simulated results of the novel polyfractal antenna are
presented.
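The authors' exact curve parametrization is not given in the abstract; as an illustration of generating a fractal patch edge with an adjustable indentation factor, a standard Koch-style recursive construction in Python:

    import numpy as np

    def koch_edge(p0, p1, depth, indent=0.3):
        # Recursively replace a segment with four segments; `indent` is the
        # indentation factor controlling the depth of the triangular bump.
        if depth == 0:
            return [p0, p1]
        p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
        d = p1 - p0
        a = p0 + d / 3
        b = p0 + 2 * d / 3
        normal = np.array([-d[1], d[0]])        # perpendicular to the edge
        tip = (p0 + p1) / 2 + indent * normal   # bump height scales with indent
        pts = []
        for q0, q1 in [(p0, a), (a, tip), (tip, b), (b, p1)]:
            pts.extend(koch_edge(q0, q1, depth - 1, indent)[:-1])
        pts.append(p1)
        return pts

    edge = koch_edge((0, 0), (1, 0), depth=2, indent=0.2)
    print(len(edge), "points describing one fractal edge of the patch")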
Abstract: The success of any retail business is predisposed by its
swift response and its knack in understanding the constraints and the
requirements of customers. In this paper a conceptual design model
of an automated customer-friendly supermarket has been proposed.
In this model, 10-sided, space-efficient, regular-polygon-shaped
gravity shelves have been designed for goods storage, and effective
customer-specific algorithms have been built in for quick automatic
delivery of randomly listed goods. The algorithm is developed
with two main objectives, viz., delivery time and priority. To meet
these objectives, the randomly listed items are reorganized according
to the critical path of the robotic arm specific to the identified shop
and its layout, and the items are categorized according to demand,
shape, size, similarity, and the nature of the product for an efficient
pick-up, packing, and delivery process. We conjecture that the
proposed automated supermarket model reduces business operating
costs while providing high customer satisfaction, warranting a
win-win situation.
Abstract: This paper describes a method to produce a stable and
accurate constant output pulse width regardless of variations in the
amplitude, period, and pulse width of the input signal source. The
generated pulse is used in numerous applications as the reference
input source for other circuits in the system. Therefore, it is crucial
to produce a clean and constant pulse width to make sure the system
works accurately as expected.
Abstract: Class cohesion is a key object-oriented software
quality attribute that is used to evaluate the degree of relatedness of
class attributes and methods. Researchers have proposed several class
cohesion measures. However, the effect of considering special
methods (i.e., constructors, destructors, and access and delegation
methods) in cohesion calculation has not been thoroughly studied
theoretically for most of them. In this paper, we address this issue for three
popular connectivity-based class cohesion measures. For each of the
considered measures we theoretically study the impact of including
or excluding special methods on the values that are obtained by
applying the measure. This study is based on analyzing the
definitions and formulas that are proposed for the measures. The
results show that including/excluding special methods has a
considerable effect on the obtained cohesion values and that this
effect varies from one measure to another. For each of the three
connectivity-based measures, the theoretical study recommends
excluding the special methods from cohesion measurement.
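The three measures studied are not named in the abstract; as a generic illustration of a connectivity-based cohesion measure and of the effect of special methods, the sketch below scores the fraction of method pairs sharing an attribute, with a naive name-based filter standing in for special-method detection:

    from itertools import combinations

    def cohesion(methods, exclude_special=True):
        # methods: name -> set of attributes referenced by that method.
        # A generic connectivity-style score: the fraction of method pairs
        # sharing at least one attribute (not any specific published metric).
        if exclude_special:
            methods = {m: a for m, a in methods.items()
                       if not m.startswith(("__init__", "get_", "set_"))}
        pairs = list(combinations(methods.values(), 2))
        if not pairs:
            return 1.0
        connected = sum(1 for a, b in pairs if a & b)
        return connected / len(pairs)

    cls = {"__init__": {"x", "y", "z"},     # constructor touches everything
           "get_x": {"x"},                  # access method
           "area": {"x", "y"},
           "scale": {"x", "y"},
           "label": {"z"}}
    print(cohesion(cls, exclude_special=False))  # 0.7: special methods inflate it
    print(cohesion(cls, exclude_special=True))   # ~0.33: regular methods only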
Abstract: Load Forecasting plays a key role in making today's
and tomorrow's Smart Energy Grids sustainable and reliable. Accurate
power consumption prediction allows utilities to organize their
resources in advance and to execute Demand Response strategies more
effectively, enabling benefits such as higher sustainability, better
quality of service, and affordable electricity tariffs. While Load
Forecasting is comparatively easy and effective at larger geographic
scales, in Smart Micro Grids the lower available grid flexibility
makes accurate prediction more critical for Demand Response
applications. This paper analyses the application of
short-term load forecasting in a concrete scenario, proposed within the
EU-funded GreenCom project, which collects load data from single
loads and households belonging to a Smart Micro Grid. Three
short-term load forecasting techniques, i.e. linear regression, artificial
neural networks, and radial basis function network, are considered,
compared, and evaluated through absolute forecast errors and training
time. The influence of weather conditions in Load Forecasting is also
evaluated. A new definition of Gain is introduced in this paper, which
serves as a novel indicator of short-term prediction capability and
time-span consistency. Two models, 24- and 1-hour-ahead
forecasting, are built to comprehensively compare these
three techniques.
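A rough way to reproduce such a comparison with scikit-learn on stand-in data; the RBF network is approximated here by kernel ridge regression with an RBF kernel (a substitute, not the paper's implementation), and the proposed Gain metric is not reproduced:

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.metrics import mean_absolute_error

    # Stand-in dataset: predict the next hour's load from the previous 24 hours.
    rng = np.random.default_rng(0)
    load = 10 + np.sin(np.arange(3000) * 2 * np.pi / 24) \
              + 0.3 * rng.standard_normal(3000)
    X = np.lib.stride_tricks.sliding_window_view(load[:-1], 24).copy()
    y = load[24:]
    X_train, X_test, y_train, y_test = X[:2000], X[2000:], y[:2000], y[2000:]

    models = {
        "linear regression": LinearRegression(),
        "neural network": MLPRegressor(hidden_layer_sizes=(32,),
                                       max_iter=2000, random_state=0),
        "RBF (kernel ridge stand-in)": KernelRidge(kernel="rbf", alpha=1.0),
    }
    for name, model in models.items():
        model.fit(X_train, y_train)
        err = mean_absolute_error(y_test, model.predict(X_test))
        print(f"{name}: MAE = {err:.3f}")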