Abstract: In recent years, there has been a decline in physical
activity among adults. Motivation has been shown to be a crucial
factor in maintaining physical activity. The purpose of this study was
to examine whether physical activity (PA) motives measured by the Physical Activity and Leisure Motivation Scale (PALMS) predicted the actual amount of PA at a later time, to provide evidence for the construct validity of the
PALMS. A quantitative, cross-sectional descriptive research design
was employed. The Demographic Form, PALMS, and International
Physical Activity Questionnaire Short form (IPAQ-S) questionnaires
were used to assess motives for and amount of physical activity in
adults on two occasions. A sample of 489 male undergraduate
students aged 18 to 25 years (mean ±SD; 22.30±8.13 years) took part
in the study. Participants were divided among three types of activity, namely exercise, racquet sports, and team sports; female participants took part in only one type of activity, namely team sports. After 14 weeks, all 489 undergraduate students who had filled
in the initial questionnaire (Occasion 1) received the questionnaire
via email (Occasion 2). Of the 489 students, 378 males emailed back
the completed questionnaire. The results showed that not only were
pertinent sub-scales of PALMS positively related to amount of
physical activity, but separate regression analyses showed the
positive predictive effect of PALMS motives for amount of physical
activity for each type of physical activity among participants. This
study supported the construct validity of the PALMS by showing that
the motives measured by PALMS did predict amount of PA. This
information can be used to match people with a specific sport or activity, which in turn could potentially promote longer adherence to
the specific activity.
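The predictive analyses described above rest on simple regression. As a rough illustration (not the study's data or model), an ordinary-least-squares fit of activity amount on a motive score can be sketched as:

```python
# Minimal ordinary-least-squares sketch: does a motive score predict
# later activity? The variable names and values (motive_score,
# activity_minutes) are illustrative, not the study's dataset.

def ols_fit(x, y):
    """Return (slope, intercept) of the least-squares line y ~ a*x + b."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var_x = sum((xi - mean_x) ** 2 for xi in x)
    slope = cov_xy / var_x
    intercept = mean_y - slope * mean_x
    return slope, intercept

motive_score = [2.0, 3.0, 4.0, 5.0]    # hypothetical PALMS sub-scale means
activity_minutes = [90, 150, 180, 240]  # hypothetical weekly PA minutes
slope, intercept = ols_fit(motive_score, activity_minutes)
print(round(slope, 1), round(intercept, 1))  # a positive slope = positive prediction
```

A positive fitted slope corresponds to the positive predictive effect the abstract reports.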
Abstract: This paper presents an approach for the classification of
an unstructured format description for identification of file formats.
The main contribution of this work is the employment of data mining
techniques to support file format selection with just the unstructured
text description that comprises the most important format features for
a particular organisation. Subsequently, the file format identification method employs a file format classifier and associated configurations to
support digital preservation experts with an estimation of required file
format. Our goal is to make use of a format specification knowledge
base aggregated from different Web sources in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends a file format for the expert's institution. The proposed methods facilitate the selection of a file format and improve the quality of the digital preservation process. The
presented approach is meant to facilitate decision making for the
preservation of digital content in libraries and archives using domain
expert knowledge and specifications of file formats. To facilitate
decision-making, the aggregated information about the file formats is
presented as a file format vocabulary that comprises the most common terms characteristic of all researched formats. The goal is to
suggest a particular file format based on this vocabulary for analysis
by an expert. The sample file format calculation and the calculation
results including probabilities are presented in the evaluation section.
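The naive Bayes recommendation step described above can be sketched as follows; the toy knowledge base, vocabulary, and format names are invented for illustration and are not the paper's aggregated Web sources:

```python
# Tiny multinomial naive Bayes over format-description terms, sketching
# how an unstructured text description could be scored against candidate
# file formats. The "knowledge base" below is made up for illustration.
import math
from collections import Counter

knowledge_base = {
    "PDF/A": "archival document embedded fonts self contained long term",
    "TIFF":  "raster image lossless scan archival uncompressed",
    "PNG":   "raster image lossless web transparency compressed",
}

def train(kb):
    models, vocab = {}, set()
    for fmt, text in kb.items():
        counts = Counter(text.split())
        vocab |= set(counts)
        models[fmt] = counts
    return models, vocab

def classify(description, models, vocab):
    """Return the format with the highest log-posterior
    (uniform prior, Laplace smoothing)."""
    words = description.lower().split()
    scores = {}
    for fmt, counts in models.items():
        total = sum(counts.values())
        score = 0.0
        for w in words:
            score += math.log((counts.get(w, 0) + 1) / (total + len(vocab)))
        scores[fmt] = score
    return max(scores, key=scores.get)

models, vocab = train(knowledge_base)
print(classify("lossless archival image from a scan", models, vocab))  # TIFF
```

The recommended format is simply the one whose term model best explains the description, which mirrors the decision-support role described in the abstract.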
Abstract: In this work, the estimated physical habitat available to the species Ictalurus punctatus is compared with the estimated physical habitat for the same river reach after a proposed modification aimed at creating a linear park along a length of 5 500 m.
To determine the effect of ecological park construction on the physical habitat of the studied stretch of the Lerma river, first, the habitat available to Ictalurus punctatus was estimated by simulating the physical habitat using surveying, hydraulic, and habitat information obtained at the river reach in its current condition. Second, the habitat available to the same species was estimated by simulating the physical habitat under the proposed modification for the ecological park. Third, a comparison between both scenarios is presented in terms of the available habitat estimated for Ictalurus punctatus, concluding that for the
adult and spawning life stages, changes in the channel to create an
ecological park would produce a considerable loss of potentially
usable habitat (PUH), while for the juvenile life stage the PUH remains virtually unchanged, and for the fry life stage the PUH would increase owing to lower velocities and shallower depths, which result from smaller flow rates and a lower volume of the wetted channel.
It is expected that habitat modification for the construction of the linear park may compromise the conservation of Ictalurus punctatus at the studied river reach.
Abstract: Any signal transmitted over a channel is corrupted by noise and interference. A host of channel coding techniques has been proposed to alleviate the effect of such noise and interference. Among these, Turbo codes are recommended because of their increased capacity at higher transmission rates and superior performance over convolutional codes. Multimedia elements, which involve ample amounts of data, are best protected by Turbo codes. A Turbo decoder employs the Maximum A-posteriori Probability (MAP) and Soft-Output Viterbi Algorithm (SOVA) decoding algorithms. Conventional Turbo coded systems employ Equal Error Protection (EEP), in which the protection of all the data in an information message is uniform. Some applications involve Unequal Error Protection (UEP), in which the level of protection is higher for important information bits than for other bits. In this work, the traditional Log MAP decoding algorithm is enhanced by using optimized scaling factors for both decoders. The error-correcting performance in the presence of UEP in Additive White Gaussian Noise (AWGN) and Rayleigh fading channels is analyzed for the transmission of an image with the Discrete Cosine Transform (DCT) as the source coding technique. This paper compares the performance of the Log MAP, Modified Log MAP (MlogMAP), and Enhanced Log MAP (ElogMAP) algorithms for image transmission. The MlogMAP algorithm is found to be best for lower Eb/N0 values, but for higher Eb/N0 values ElogMAP performs better with optimized scaling factors. The performance comparison of the AWGN and fading channels indicates the robustness of the proposed algorithm. According to the performance of the three different message classes, class 3 is more strongly protected than the other two classes. From the performance analysis, it is observed that the ElogMAP algorithm with UEP is best for the transmission of an image compared to the Log MAP and MlogMAP decoding algorithms.
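The log-MAP enhancement described above centres on the max* (Jacobian logarithm) operation and on scaling the extrinsic information exchanged between the two constituent decoders. A minimal sketch follows; the 0.7 scaling factor is a commonly used value in the literature, not the paper's optimized one:

```python
# Core operations behind log-MAP / max-log-MAP decoding and extrinsic
# scaling. This is a generic sketch, not the paper's ElogMAP decoder.
import math

def max_star(a, b):
    """Exact Jacobian logarithm used by log-MAP: ln(e^a + e^b)."""
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def max_star_approx(a, b):
    """max-log-MAP approximation: drops the correction term."""
    return max(a, b)

def scale_extrinsic(llr_ext, s=0.7):
    """Scaled extrinsic LLR passed between the two decoders; s = 0.7 is
    a commonly used scaling factor, not the paper's tuned value."""
    return s * llr_ext

print(round(max_star(0.0, 0.0), 4))  # ln 2
```

The correction term `log1p(exp(-|a-b|))` is exactly what max-log-MAP omits; extrinsic scaling partially compensates for the resulting overconfidence.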
Abstract: A sensor network consists of multiple detection locations called sensor nodes, each of which is tiny, lightweight, and portable. Single-path routing protocols in wireless sensor networks (WSNs) can lead to holes in the network, since only the nodes present in the single path are used for data transmission. Apart from advantages such as reduced computation, complexity, and resource utilization, they have drawbacks such as reduced throughput, increased traffic load, and delay in data delivery. Therefore, multipath routing protocols are preferred for WSNs. Distributing the traffic among multiple paths increases the network lifetime. We propose a scheme in which data is transmitted through a dominant path to save energy. In order to obtain a high delivery ratio, a basic route
reconstruction protocol is utilized to reconstruct the path whenever a
failure is detected. A basic reconstruction routing (BRR) algorithm is
proposed, in which a node can leap over path failure by using the
already existing routing information from its neighbourhood while
the collected data is transmitted from the source to the sink. In order to save energy and attain a high data delivery ratio, data is transmitted along multiple paths, which is achieved by the BRR algorithm whenever a failure is detected. Further, an analysis of how the proposed protocol overcomes the drawbacks of the existing protocols is presented. The performance of our protocol is compared to AOMDV and the energy-efficient node-disjoint multipath routing protocol (EENDMRP). The system is implemented using NS-2.34. The simulation results show that the proposed protocol has a high delivery ratio with low energy consumption.
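The idea of rebuilding a route around a failed link using existing topology information can be sketched as follows; the graph and the BFS routine are illustrative, not the BRR protocol itself:

```python
# Sketch of route reconstruction: compute a path, and when a link fails,
# recompute a path that avoids it. The topology is a toy example.
from collections import deque

def shortest_path(adj, src, dst, failed=frozenset()):
    """BFS shortest path from src to dst that avoids failed links
    (each failed link is a frozenset of its two endpoints)."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj[u]:
            if v not in prev and frozenset((u, v)) not in failed:
                prev[v] = u
                q.append(v)
    return None  # destination unreachable with current failures

adj = {"S": ["A", "B"], "A": ["S", "D"], "B": ["S", "D"], "D": ["A", "B"]}
primary = shortest_path(adj, "S", "D")
backup = shortest_path(adj, "S", "D", failed={frozenset(("A", "D"))})
print(primary, backup)  # ['S', 'A', 'D'] ['S', 'B', 'D']
```

Here the backup path is found from the same neighbourhood information, mirroring how a node can route around a detected failure without global recomputation.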
Abstract: Surfing is an increasingly popular sport, and its performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for the detection of wave rides, computing the number of waves ridden in a surfing session, the starting time of each wave, and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that allow identifying the start and end of each wave ride. The second approach adds information from the Inertial Measurement Unit (IMU) of the smartphone to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated against similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, their start times, and durations. This paper shows that it is feasible to use smartphones for the quantification of performance metrics during surfing. In particular, the waves ridden and their durations can be accurately determined using the smartphone GPS and IMU.
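The first, GPS-only approach can be sketched as a velocity-threshold segmentation; the thresholds and the speed trace below are placeholders, not the values fitted in the paper:

```python
# Velocity-threshold wave-ride detection, sketched from the abstract's
# GPS-only approach. Thresholds are illustrative, not the fitted values.
def detect_rides(speeds, start_thr=2.5, end_thr=1.0):
    """Segment a 1 Hz speed trace (m/s) into rides: a ride starts when
    speed rises above start_thr and ends when it drops below end_thr.
    Returns (start_index, duration_seconds) pairs."""
    rides, start = [], None
    for i, v in enumerate(speeds):
        if start is None and v > start_thr:
            start = i
        elif start is not None and v < end_thr:
            rides.append((start, i - start))
            start = None
    if start is not None:                       # ride still open at end of trace
        rides.append((start, len(speeds) - start))
    return rides

trace = [0.5, 0.8, 3.0, 4.2, 3.8, 0.6, 0.4, 2.9, 3.1, 0.2]
print(detect_rides(trace))  # [(2, 3), (7, 2)]
```

The second approach in the paper would additionally gate these candidate segments with IMU evidence before accepting them as rides.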
Abstract: Image segmentation and color identification are important processes used in various emerging fields like intelligent robotics. A method is proposed for a manipulator to grasp a colored object and place it in the correct location. Existing methods such as PSO have problems such as slow convergence and convergence to a local minimum, leading to suboptimal performance. To improve the performance, we use the watershed algorithm for segmentation and EPSO for color identification. The EPSO method is used to reduce the probability of being stuck in a local minimum. The proposed method offers the particles a more powerful global exploration capability. EPSO can determine which particles are stuck in a local minimum and can also enhance the learning speed, as the particle movement becomes faster.
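The escape idea behind EPSO, re-seeding particles that have stalled near a local minimum, can be sketched as follows; this is a generic PSO variant under assumed parameters, not the paper's exact EPSO formulation:

```python
# One PSO velocity/position update plus a simple "escape" step that
# re-seeds particles stuck at the current best without improving it.
# Parameters (w, c1, c2, stall_eps, bounds) are assumed, not the paper's.
import random

def pso_step(positions, velocities, pbest, gbest, f,
             w=0.7, c1=1.5, c2=1.5, stall_eps=1e-3, bounds=(-5.0, 5.0)):
    for i in range(len(positions)):
        r1, r2 = random.random(), random.random()
        velocities[i] = (w * velocities[i]
                         + c1 * r1 * (pbest[i] - positions[i])
                         + c2 * r2 * (gbest - positions[i]))
        positions[i] += velocities[i]
        # Escape: a particle that has collapsed onto gbest without
        # improving it is re-randomized to keep exploring.
        if abs(positions[i] - gbest) < stall_eps and f(positions[i]) >= f(gbest):
            positions[i] = random.uniform(*bounds)
        if f(positions[i]) < f(pbest[i]):
            pbest[i] = positions[i]
        if f(pbest[i]) < f(gbest):
            gbest = pbest[i]
    return gbest

random.seed(1)
f = lambda x: x * x                      # toy objective to minimize
positions = [random.uniform(-5, 5) for _ in range(10)]
velocities = [0.0] * 10
pbest = positions[:]
gbest = min(pbest, key=f)
for _ in range(40):
    gbest = pso_step(positions, velocities, pbest, gbest, f)
print(f(gbest))
```

Because `gbest` is only ever replaced by a better point, the best objective value is non-increasing across iterations; the re-seeding step only affects exploration.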
Abstract: Total Quality Management (TQM) is a managerial
approach that improves the competitiveness of the industry,
meanwhile, Information Technology (IT) has been introduced alongside TQM for handling technical issues, supported by quality experts to fulfil customers' requirements. The present paper aims to utilise the Analytic Hierarchy Process (AHP) methodology to prioritise and rank the hierarchy levels of TQM enablers and IT resources together for their successful implementation in the Information and Communication Technology (ICT) industry. A total of 17 TQM enablers (nine) and IT resources (eight) were identified, partitioned into three categories, and prioritised by the AHP approach.
The findings indicate that the 17 sub-criteria can be grouped into three main categories, namely organizing, tools and techniques, and culture and people. Further, out of the 17 sub-criteria, three (top management commitment and support, total employee involvement, and continuous improvement) received the highest priority, whereas three others (structural equation modelling, culture change, and customer satisfaction) received the lowest priority. The results suggest a hierarchy model for the ICT industry to prioritise the enablers and resources as well as to improve TQM and IT performance. This paper has managerial implications, suggesting that managers in the ICT industry implement TQM and IT together in their organizations to obtain maximum benefits, and showing how to utilize available resources. At the end, conclusions, limitations, and the future scope of the study are presented.
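The AHP prioritisation step can be sketched with the geometric-mean method; the pairwise-comparison matrix below is a made-up example, not the experts' judgments from the study:

```python
# AHP priority weights via the geometric-mean (approximate eigenvector)
# method. The 3x3 Saaty-scale matrix is invented for illustration.
def prod(xs):
    r = 1.0
    for x in xs:
        r *= x
    return r

def ahp_weights(matrix):
    """Normalize the geometric means of the rows of a pairwise matrix."""
    n = len(matrix)
    geo = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical judgments among three criteria:
# organizing vs tools-and-techniques vs culture-and-people.
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
w = ahp_weights(pairwise)
print([round(x, 3) for x in w])  # [0.637, 0.258, 0.105]
```

The resulting weight vector is what ranks criteria (and, one level down, the 17 sub-criteria) from highest to lowest priority.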
Abstract: Due to rapid advancement of powerful image
processing software, digital images are easy to manipulate and
modify by ordinary people. Many digital images are edited for a specific purpose, making them difficult to distinguish from their originals. We propose a clustering method to detect copy-move image forgery in JPEG, BMP, TIFF, and PNG files. The process starts with reducing the colors of the images. Then, we use a clustering technique that groups measured data by Hausdorff distance. The results show that the proposed method is capable of inspecting image files and correctly identifying forgeries.
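The Hausdorff distance used to compare groups of measured data can be computed as follows (the point sets are toy data, not features from the paper):

```python
# Directed and symmetric Hausdorff distance between two 2-D point sets,
# the dissimilarity measure the clustering step relies on.
import math

def euclid(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def directed_hausdorff(A, B):
    """max over a in A of the distance from a to its nearest b in B."""
    return max(min(euclid(a, b) for b in B) for a in A)

def hausdorff(A, B):
    """Symmetric Hausdorff distance: max of the two directed distances."""
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

A = [(0, 0), (1, 0)]
B = [(0, 0), (1, 0), (1, 3)]
print(hausdorff(A, B))  # 3.0
```

Two image regions whose feature sets have a small Hausdorff distance are near-duplicates, which is the signature of a copy-move forgery.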
Abstract: A trustworthy link failure recovery algorithm is introduced in this paper to provide forwarding continuity even under compound link failures. Transient failures are common in IP networks, and several existing proposals address them through local rerouting. To ensure forwarding continuity even under compound link failures, we introduce a compound link failure recovery algorithm.
For forwarding the information, each packet carries a blacklist, which
is a minimal set of failed links encountered along its path, and the next
hop is chosen by excluding the blacklisted links. Our proposed
method describes how it can be applied to ensure forwarding to all reachable destinations in the case of two or more link or node failures in the network. Simulations in NS2 over a large number of samples show that the proposed protocol achieves excellent performance even under high node mobility using the trustworthy link failure recovery algorithm.
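The blacklist mechanism can be sketched as follows; the topology and the per-node preference lists are illustrative, not the paper's protocol state:

```python
# Sketch of blacklist-based forwarding: each packet carries the set of
# failed links met so far, and a node forwards to the first preferred
# neighbour whose link is not blacklisted. Toy topology for illustration.
def forward(node, dst, neighbours, blacklist):
    """Pick the first usable next hop whose link is not blacklisted;
    neighbours lists next hops in order of routing preference."""
    for nxt in neighbours[node]:
        if frozenset((node, nxt)) not in blacklist:
            return nxt
    return None  # no usable next hop: packet cannot be forwarded

neighbours = {"A": ["B", "C"], "B": ["D"], "C": ["D"]}
blacklist = {frozenset(("A", "B"))}   # link A-B was found failed en route
print(forward("A", "D", neighbours, blacklist))  # C
```

Because the blacklist travels with the packet, a downstream node can avoid a failed link it has not itself detected, which is what sustains forwarding under compound failures.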
Abstract: This paper discusses the forensic investigation of a
fatality-involved catastrophic structure collapse and the special
challenges faced when tasked with directing such an effort. While
this paper discusses the investigation’s findings and the outcome of the event, its primary focus is on the challenges faced
directing a forensic investigation that requires coordinating with
governmental oversight while also having to accommodate multiple
parties’ investigative teams. In particular, the challenges discussed within this paper include maintaining on-site safety and operations while accommodating outside investigators’ interests. In addition, this paper discusses unique challenges that one may face, such as what to do about unethical conduct of interested parties’ investigative teams,
“off the record” sharing of information, and clandestinely transmitted
evidence.
Abstract: The substantial development of the construction
industry has forced the cement industry, its major support, to focus
on achieving maximum productivity to meet the growing demand for
this material. This means that the reliability of a cement production
system needs to be at the highest level that can be achieved by good
maintenance. This paper studies the extent to which Reliability-Centred Maintenance (RCM) needs to be implemented as a strategy for increasing the reliability of production system components, thus ensuring continuous productivity. In a case study of four Libyan
cement factories, 80 employees were surveyed and 12 top and middle
managers interviewed. It is evident that these factories usually break down more often than once per month, which has led to a decline in productivity. Often they cannot achieve the minimum required level of production. This has resulted from the
poor reliability of their production systems as a result of poor or
insufficient maintenance. It has been found that most of the factories’
employees misunderstand maintenance and its importance. The main
cause of this problem is the lack of qualified and trained staff, but in
addition most employees were found to be unmotivated as a result of a lack of management support and interest. In
response to these findings, it has been suggested that the RCM
strategy should be implemented in the four factories. The results
show the importance of the development of maintenance strategies
through the implementation of RCM in these factories. Its purpose would be to overcome the identified problems and thereby secure the reliability of the production systems. This study could be a useful
source of information for academic researchers and the industrial
organizations which are still experiencing problems in maintenance
practices.
Abstract: Microbes have been used to solve environmental
problems for many years. The role of microorganisms in sequestering, precipitating, or altering the oxidation state of various heavy metals has been extensively studied. Treatments in which microorganisms interact with toxic metals are very diverse. The purpose of this research is to remove mercury using a pure culture of Pseudomonas putida (P. putida), ATCC 49128, at optimum growth parameters such as culture technique, acclimatization time, and incubator shaker speed. Thus, in this study, the optimum growth parameters of P. putida were obtained to achieve maximum mercury removal. Based on the optimum parameters of P. putida for specific growth rate, the removal of two different mercury concentrations, 1 ppm and 4 ppm, was studied. A mercury-resistant bacterial strain able to reduce ionic mercury to metallic mercury was used to treat the mercury nitrate solution. The overall levels of mercury removal in this study were between 80% and 89%. The information obtained in this study is fundamental to understanding the survival of P. putida ATCC 49128 in mercury solution. Thus, microbial mercury removal is a potential bioremediation approach for wastewater, especially in the petrochemical industries in Malaysia.
Abstract: Mammography is a widely used technique for breast cancer screening. There are various other techniques for breast cancer screening, but mammography is the most reliable and effective. The images obtained through mammography are of low contrast, which makes them difficult for radiologists to interpret. Hence, a high-quality image is mandatory for processing and for extracting any kind of information from it. Many contrast enhancement algorithms have
been developed over the years. In the present work, an efficient
morphology based technique is proposed for contrast enhancement of
masses in mammographic images. The proposed method is based on
Multiscale Morphology and it takes into consideration the scale of the
structuring element. The proposed method is compared with other state-of-the-art techniques. The experimental results show that the proposed
method is better both qualitatively and quantitatively than the other
standard contrast enhancement techniques.
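The multiscale-morphology idea can be illustrated on a 1-D intensity profile with flat structuring elements of growing size; this is a simplified sketch, not the paper's 2-D mammogram method:

```python
# White top-hat contrast enhancement on a 1-D intensity profile, a
# simplified stand-in for multiscale morphology on 2-D mammograms
# (flat structuring elements of growing size, results added back).
def erode(sig, k):
    half = k // 2
    return [min(sig[max(0, i - half):i + half + 1]) for i in range(len(sig))]

def dilate(sig, k):
    half = k // 2
    return [max(sig[max(0, i - half):i + half + 1]) for i in range(len(sig))]

def white_tophat(sig, k):
    """Signal minus its morphological opening: isolates bright details
    narrower than the structuring element."""
    opened = dilate(erode(sig, k), k)
    return [s - o for s, o in zip(sig, opened)]

def enhance(sig, scales=(3, 5)):
    """Add top-hat responses at several scales back onto the signal."""
    out = sig[:]
    for k in scales:
        out = [o + t for o, t in zip(out, white_tophat(sig, k))]
    return out

profile = [10, 10, 10, 30, 10, 10, 10]  # faint bright "mass" on background
print(enhance(profile))  # [10, 10, 10, 70, 10, 10, 10]
```

The narrow bright feature is amplified at every scale that cannot contain it, while the flat background is left untouched, which is the contrast-enhancement effect the abstract describes.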
Abstract: Image processing is a form of signal processing in which the input is an image and the output is either an image or a set of image parameters. Image resolution is frequently referred to as an important aspect of an image. In image resolution enhancement, images are processed in order to obtain a higher resolution; the goal is to generate a high-resolution image with a high PSNR value from a low-resolution input image. Downsampling in each of the Discrete Wavelet Transform (DWT) subbands causes information loss in the respective subbands; the Stationary Wavelet Transform (SWT) is employed for edge detection and to minimize this loss. The Inverse Discrete Wavelet Transform (IDWT) is then used to reconstruct a high-resolution image from the downsampled subbands. A noisy input, however, produces output with a low PSNR value, so a noise-robust resolution enhancement technique based on adaptive sub-band thresholding is used. The combined image denoising and resolution enhancement techniques generate an image with a high PSNR value. The proposed method improves image resolution and reaches the optimized threshold.
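The subband decomposition and reconstruction at the core of the pipeline can be illustrated with a one-level 1-D Haar DWT and its inverse (a toy transform, not the full enhancement method):

```python
# One-level 1-D Haar DWT and its inverse: split a signal into
# approximation and detail subbands, then reconstruct it exactly.
def haar_dwt(x):
    """Decompose a signal of even length into approximation/detail bands."""
    approx = [(x[2 * i] + x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Perfectly reconstruct the signal from the two bands."""
    x = []
    for a, d in zip(approx, detail):
        x.extend([a + d, a - d])
    return x

sig = [4.0, 2.0, 5.0, 7.0]
a, d = haar_dwt(sig)
print(a, d, haar_idwt(a, d))  # [3.0, 6.0] [1.0, -1.0] [4.0, 2.0, 5.0, 7.0]
```

In the enhancement pipeline, the detail bands are where edges live (and where downsampling loses information), which is why the SWT and adaptive sub-band thresholding operate on them before the inverse transform.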
Abstract: The rapid growth of Information Technologies (IT) has had a huge influence on enterprises and has contributed to the promotion and increasingly extensive use of IT within them. Information
Technologies have to a large extent determined the processes taking
place in an enterprise; what is more, IT development has brought the
need to adopt a brand new approach to human resources management
in an enterprise. The use of IT in human resource management
(HRM) is of high importance due to the growing role of information
and information technologies. The aim of this paper is to evaluate the
use of information technologies in human resources management in
enterprises. These practices will be presented in the following areas:
recruitment and selection, development and training, employee
assessment, motivation, talent management, and personnel services. Results of the conducted survey show the diversity of solutions applied in particular areas of human resource management. In the future, further development in this area should be expected, as well as the integration of individual HRM areas, growth of mobile-enabled HR processes, and their transfer into the cloud. The presented IT solutions applied in HRM
are highly innovative, which is of great significance due to their
possible implementation in other enterprises.
Abstract: In educational technology, the idea of innovation is
usually tethered to contemporary technological inventions and
emerging technologies. Yet, using long-known technologies in ways that are pedagogically or experientially new can reposition them as
emerging educational technologies. In this study we explore how a
subtle pivot in pedagogical thinking led to an innovative education
technology. We describe the design and implementation of an online
writing tool that scaffolds students in the evaluation of their own
informational texts. We consider how pathways to innovation can emerge from such pivots: leveraging longstanding practices in novel ways has the potential to cultivate new opportunities for learning. We first unpack Infowriter in terms of its design, then we
describe some results of a study in which we implemented an
intervention which included our designed application.
Abstract: The proposed article intends to analyze the possibility (and conditions) of a media regulation law in a democratic rule of law in the twenty-first century. To do so, the idea of the public sphere (from Jürgen Habermas) is presented first, showing how it serves as an interface between the citizen and the state (or the private and the public) and how important it is in a deliberative
democracy. Based on this paradigm, the traditional perception of the
role of public information (as a functional element of the system) and the possibility of media regulation, owing to the public nature of the media's activity, will be presented. A critical argument is then developed
from two different perspectives: a) the formal function of the current
media information, considering that the digital age has fragmented
the information access; b) the concept of a constructive democracy,
which reduces the need for representation, changing the strategic
importance of the public sphere. The question to be addressed (based on comparative law) is whether regulation is justified in a polycentric democracy, especially one operating in the digital age (with immediate and virtual communication). The argument advanced is that even in the twenty-first century the media in a democratic rule of law still has an extremely important role and may be subject to regulation, but on terms very different from (and narrower than) those usually defended.
Abstract: Thousands of organisations store important and
confidential information related to them, their customers, and their
business partners in databases all across the world. The stored data
ranges from less sensitive (e.g. first name, last name, date of birth) to
more sensitive data (e.g. password, pin code, and credit card
information). Losing data, disclosing confidential information or
even changing the value of data are the severe damages that
Structured Query Language injection (SQLi) attack can cause on a
given database. It is a code injection technique where malicious SQL
statements are inserted into a given SQL database by simply using a
web browser. In this paper, we propose an effective pattern
recognition neural network model for detection and classification of
SQLi attacks. The proposed model is built from three main elements: a Uniform Resource Locator (URL) generator in order to generate
thousands of malicious and benign URLs, a URL classifier in order
to: 1) classify each generated URL to either a benign URL or a
malicious URL and 2) classify the malicious URLs into different
SQLi attack categories, and a neural network (NN) model in order to: 1) detect whether a given URL is malicious or benign and 2) identify the type of SQLi attack for each malicious URL. The model is first
trained and then evaluated by employing thousands of benign and
malicious URLs. The results of the experiments are presented in
order to demonstrate the effectiveness of the proposed approach.
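A crude, keyword-based stand-in for the URL classification step might look as follows; the token list and the rule are illustrative and are not the trained neural network model from the paper:

```python
# Keyword features over the decoded query string, sketching how a URL
# can be screened for SQLi-associated content. The token list is a
# made-up illustration, not the paper's feature set or NN classifier.
import urllib.parse

SQLI_TOKENS = ["union", "select", "' or", "--", ";", "drop", "1=1"]

def sqli_features(url):
    """Count SQLi-associated tokens in the percent-decoded query string."""
    query = urllib.parse.urlparse(url).query
    decoded = urllib.parse.unquote_plus(query).lower()
    return [decoded.count(tok) for tok in SQLI_TOKENS]

def looks_malicious(url):
    return sum(sqli_features(url)) > 0

benign = "http://example.com/page?id=42"
attack = "http://example.com/page?id=42%27%20OR%201=1--"
print(looks_malicious(benign), looks_malicious(attack))  # False True
```

In the paper's pipeline, feature vectors like these would feed the NN, which both flags malicious URLs and assigns them to an SQLi attack category.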
Abstract: The paper presents new results concerning the selection of an optimal information fusion formula for ensembles of C-OTDR channels. The goal of information fusion is to create an integral classifier designed for effective classification of seismoacoustic target events. The LPBoost (LP-β and LP-B variants), Multiple Kernel Learning, and Weighting Inversely as Lipschitz Constants (WILC) approaches were compared. WILC is a brand-new approach to the optimal fusion of Lipschitz classifier ensembles.
Results of practical usage are presented.
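The WILC idea, weighting each classifier inversely to its Lipschitz constant, can be sketched as follows (the scores and constants are toy values, not measurements from the C-OTDR system):

```python
# Fusing channel classifiers with weights inversely proportional to
# their Lipschitz constants: smoother (smaller-constant) classifiers
# get more say. Values below are invented for illustration.
def wilc_weights(lipschitz_constants):
    """Normalized weights proportional to 1 / L_i."""
    inv = [1.0 / c for c in lipschitz_constants]
    total = sum(inv)
    return [w / total for w in inv]

def fuse(scores, weights):
    """Weighted sum of per-channel classifier scores for one event."""
    return sum(s * w for s, w in zip(scores, weights))

L = [2.0, 4.0, 8.0]             # hypothetical Lipschitz constants
w = wilc_weights(L)             # -> proportions 4/7, 2/7, 1/7
print([round(x, 3) for x in w], round(fuse([0.9, 0.4, 0.1], w), 3))
```

The fused score is then thresholded (or compared across classes) to produce the integral classification of a seismoacoustic event.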