Abstract: Patients with diabetes are susceptible to chronic foot
wounds which may be difficult to manage and slow to heal.
Diagnosis and treatment currently rely on the subjective judgement of
experienced professionals. An objective method of tissue assessment
is required. In this paper, a data fusion approach was taken to wound
tissue classification. The supervised Maximum Likelihood and
unsupervised Multi-Modal Expectation Maximisation algorithms
were used to classify tissues within simulated wound models by
weighting the contributions of both colour and 3D depth information.
It was found that, at low weightings, depth information could yield
significant improvements in classification accuracy compared with
classification by colour alone, particularly when using the
maximum likelihood method. Larger weightings, however, were
found to have an entirely negative effect on accuracy.
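As a minimal sketch of the weighted-fusion idea, consider a Gaussian maximum likelihood classifier in which the depth channel is scaled by a weight w before per-class models are fitted; the function names and the exact fusion rule here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fuse(colour, depth, w):
    # Scale the depth channel by w before concatenation; w = 0 would
    # make the depth column degenerate, so use a small w > 0 instead.
    return np.hstack([colour, w * depth])

def fit_gaussians(X, y):
    # Per-class mean and (regularised) covariance for an ML classifier.
    return {c: (X[y == c].mean(0),
                np.cov(X[y == c], rowvar=False) + 1e-6 * np.eye(X.shape[1]))
            for c in np.unique(y)}

def predict(X, models):
    # Assign each sample to the class with the highest log-likelihood.
    def loglik(x, mean, cov):
        diff = x - mean
        _, logdet = np.linalg.slogdet(cov)
        return -0.5 * (diff @ np.linalg.solve(cov, diff) + logdet)
    labels = list(models)
    scores = np.array([[loglik(x, *models[c]) for c in labels] for x in X])
    return np.array(labels)[scores.argmax(1)]

# e.g. colour: (n, 3) RGB values, depth: (n, 1) depths, y: tissue labels
# models = fit_gaussians(fuse(colour, depth, w=0.1), y)
# preds  = predict(fuse(colour_test, depth_test, w=0.1), models)
```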
Abstract: Traditionally, the Internet has provided best-effort service to every user regardless of their requirements. However, as the Internet becomes universally available, users demand more bandwidth, applications require more and more resources, and interest has developed in having the Internet provide some degree of Quality of Service (QoS). Although QoS is an important issue, the question of how it will be brought into the Internet has not yet been solved. Researchers, driven by rapid advances in technology, are proposing new and more desirable capabilities for the next generation of IP infrastructures. But not all applications demand the same amount of resources, nor are all users service providers. This paper is therefore the first in a series presenting an architecture as a first step towards the optimization of QoS in the Internet environment, as a solution to the problem of an SMSE whose objective is to provide public Internet service with certain Quality of Service expectations. The service provides new business opportunities, but also presents new challenges. We have designed and implemented a scalable service framework that supports adaptive bandwidth based on user demands, and billing based on usage and QoS. The developed application has been evaluated, and the results show that traffic limiting performs optimally, as does the distribution of excess bandwidth. Research is currently under way in two areas: (i) developing and testing new transfer protocols, and (ii) developing new strategies for traffic improvement based on service differentiation.
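The abstract does not specify the traffic-limiting mechanism; a token bucket is one common way such per-user bandwidth caps are enforced, sketched below purely for illustration:

```python
import time

class TokenBucket:
    """Sketch of a token-bucket traffic limiter, one common way to
    enforce a per-user bandwidth cap (rate in bytes/s, burst in bytes).
    Illustrative only; not the paper's implementation."""
    def __init__(self, rate, burst):
        self.rate, self.burst = rate, burst
        self.tokens, self.last = burst, time.monotonic()

    def allow(self, nbytes):
        # Refill tokens at the configured rate, capped at the burst size.
        now = time.monotonic()
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True          # forward the packet
        return False             # drop or queue the packet
```

Unused capacity accumulates up to the burst size, so a conforming flow can briefly exceed its average rate without being penalised.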
Abstract: Anti-money laundering is commonly recognized as a
set of procedures, laws or regulations designed to reduce the practice
of generating income through illegal actions. In Malaysia, the
government and law enforcement agencies have stepped up their
capacities and efforts to curb money laundering since 2001. One of
these measures was the enactment of the Anti-Money Laundering
Act (AMLA) in 2001. The cost of implementing anti-money
laundering requirements (AMLR) can be burdensome to those who
are involved in enforcing them. The objective of this paper is to
explore the perceived effectiveness of the AMLR from the enforcement
agencies' perspective. This is a preliminary study whose findings
will help give direction to further AML research in Malaysia. In
addition, the results provide empirical evidence on the perceived
effectiveness of the AMLR prior to further investigation of barriers
to, and improvements in, the implementation of the anti-money
laundering regime in Malaysia.
Abstract: A plausible architecture of an ancient genetic code is derived from an extended base-triplet vector space over the Galois field of the extended base alphabet {D, G, A, U, C}, where the letter D represents one or more hypothetical bases with unspecific pairing. We hypothesize that the high degeneracy of a primeval genetic code with five bases, together with the gradual origin and improvement of a primitive DNA repair system, could have made possible the transition from the ancient to the modern genetic code. Our results suggest that the Watson-Crick base pairing and the non-specific base pairing of the hypothetical ancestral base D, used to define the sum and product operations, are sufficient to determine the coding constraints of the primeval and the modern genetic codes, as well as the transition from the former to the latter. Geometrical and algebraic properties of this vector space reveal that the present codon assignment of the standard genetic code could be induced from a primeval codon assignment. Furthermore, the Fourier spectrum of the extended DNA genome sequences derived from multiple sequence alignment suggests that the so-called period-3 property of present coding DNA sequences could also have existed in ancient coding DNA sequences.
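For illustration, the field structure can be sketched as arithmetic modulo 5 on the extended alphabet; the ordering D=0, G=1, A=2, U=3, C=4 below is an assumption made for the example, since the paper derives its sum and product operations from base-pairing properties:

```python
# Minimal sketch: arithmetic over the extended alphabet mapped onto
# GF(5) (the integers mod 5, a field since 5 is prime). The ordering
# D=0, G=1, A=2, U=3, C=4 is assumed for illustration only.
ALPHABET = "DGAUC"
TO_INT = {b: i for i, b in enumerate(ALPHABET)}

def base_add(x, y):
    return ALPHABET[(TO_INT[x] + TO_INT[y]) % 5]

def base_mul(x, y):
    return ALPHABET[(TO_INT[x] * TO_INT[y]) % 5]

def triplet_add(t1, t2):
    """Componentwise sum of codons viewed as vectors in GF(5)^3."""
    return "".join(base_add(a, b) for a, b in zip(t1, t2))

print(triplet_add("GAU", "DCA"))  # -> "GGD"
```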
Abstract: The study aimed to identify the logical structure of
data and particularities of developing and testing a website designed
for selling farm products through online auctions.
The research is based on a short literature review of the field and
exploratory trials of some successful models from other industries, in
order to identify the advantages of using such a tool, as well as the
optimal structure and functionality of an auction portal. In the last
part, the study focuses on the results of testing the website with
potential beneficiaries.
The conclusions of the study underline that the particularities of some
agricultural products could raise difficulties in selling them
through online auctions, but the use of such a system is perceived
to bring significant improvements in the supply chain.
The results call for a more detailed study of the importance of
using quality standards for agricultural products sold via online
auction, the impact that implementing an online payment system
could have on trade in agricultural products, and the problems that
could arise in using the website in different countries.
Abstract: The success of an electronic system in a System-on-Chip is highly dependent on the efficiency of its interconnection network, which is constructed from routers and channels (the routers move data across the channels between nodes). Since neither classical bus-based nor point-to-point architectures can provide scalable solutions and satisfy the tight power and performance requirements of future applications, the Network-on-Chip (NoC) approach has recently been proposed as a promising solution. Indeed, in contrast to the traditional solutions, the NoC approach can provide large bandwidth with moderate area overhead. The selected topology of the component interconnects plays a prime role in the performance of an NoC architecture, as do the routing and switching techniques that can be used. In this paper, we present two generic NoC architectures that can be customized to the specific communication needs of an application in order to reduce area with minimal degradation of system latency. An experimental study is performed to compare these structures with basic NoC topologies represented by the 2D mesh, Butterfly-Fat Tree (BFT) and SPIN. It is shown that the Cluster Mesh (CMesh) and MinRoot schemes achieve significant improvements in network latency and energy consumption with only negligible area overhead and complexity over existing architectures. In fact, compared with the basic NoC topologies, CMesh and MinRoot provide substantial savings in area as well, because they require fewer routers. The simulation results show that CMesh and MinRoot networks outperform MESH, BFT and SPIN in the main performance metrics.
Abstract: This paper presents the hardware design of a unified
architecture to compute the efficient 4x4, 8x8 and 16x16
two-dimensional (2-D) transforms for the HEVC standard. The
architecture is based on fast integer transform algorithms and is
designed with adders and shifts only, in order to reduce the hardware
cost significantly. The goal is to ensure maximum circuit reuse
during computation while saving 40% of the number of operations.
The architecture uses FIFOs to compute the second dimension. The
proposed hardware was implemented in VHDL, and the RTL code runs at
240 MHz on an Altera Stratix III FPGA. The number of cycles varies
from 33 for the 4-point 2-D DCT to 172 for the 16-point 2-D DCT.
Results show frequency improvements reaching 96% when compared to an
architecture described as a direct transcription of the algorithm.
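The paper's architecture is written in VHDL; as a language-neutral sketch of the adder-and-shift decomposition it relies on, the 4-point 1-D stage of the HEVC core transform (coefficients 64, 83, 36) can be expressed as below, using 83x = (x<<6)+(x<<4)+(x<<1)+x and 36x = (x<<5)+(x<<2):

```python
def mul83(x):  # 83 = 64 + 16 + 2 + 1
    return (x << 6) + (x << 4) + (x << 1) + x

def mul36(x):  # 36 = 32 + 4
    return (x << 5) + (x << 2)

def dct4_hevc(x):
    """1-D 4-point HEVC core transform using only adds and shifts
    (partial butterfly; rounding/scaling stages not shown)."""
    e0, e1 = x[0] + x[3], x[1] + x[2]   # even part
    o0, o1 = x[0] - x[3], x[1] - x[2]   # odd part
    return [(e0 + e1) << 6,             # 64*(e0 + e1)
            mul83(o0) + mul36(o1),
            (e0 - e1) << 6,             # 64*(e0 - e1)
            mul36(o0) - mul83(o1)]
```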
Abstract: This paper describes trends in ergonomics problems among
students in classroom B5101, Building 2, Suranaree University of
Technology. The objective was to survey ergonomics problems and the
effects of the chairs used for sitting in the classroom. 100 students
who use the lecture chairs for more than 2 hours/day were assessed
with RULA [1] and a body discomfort survey [2]. The body discomfort
survey indicated fatigue problems at the neck, lower back, upper back
and right shoulder (scores of 2.93, 2.91, 2.33 and 1.75,
respectively), while RULA indicated fatigue problems at the neck,
body and right upper arm (4.00, 3.75 and 3.00, respectively); the two
results are consistent. The researchers then prepared an improvement
plan to design a new chair to reduce student fatigue, collecting
anthropometric data from the sample and building 3 ergonomic chair
prototypes. The 100 students then trialled the new chair and were
evaluated again by RULA, the body discomfort survey and a
satisfaction survey. After the improvement, RULA showed average
fatigue reductions for the head and neck from 4.00 to 2.25, for the
body and trunk from 3.75 to 2.00, and for arm force from 1.00 to
0.25. The body discomfort survey showed average fatigue reductions
for the lower back from 2.91 to 0.87, the neck from 2.93 to 1.24, the
upper back from 2.33 to 0.84, and the right upper arm from 1.75 to
0.74. Statistically, both RULA and the body discomfort survey showed
significant fatigue reduction after the improvement at a 95%
confidence level (p-value < 0.05). A chi-square test of the
relationship of fatigue by body part between RULA and the body
discomfort survey showed that the before and after results were
consistent at the 95% confidence level (p-value < 0.05). Moreover,
after trialling the new chair for 30 minutes [3], the students'
satisfaction results were: 72% very satisfied with the simple folding
of the side writing tablet, 66% with the width of the writing plate,
64% with the suitability of the writing plate, 62% with the soft seat
cushion, and 61% with the ease of sitting in the chair.
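For reference, a chi-square test of this kind can be run as sketched below; the counts are hypothetical, since the abstract reports only aggregate scores:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts (not the study's data): students reporting
# fatigue vs. no fatigue at the neck, before and after the new chair.
table = [[81, 19],   # before: fatigue, no fatigue
         [34, 66]]   # after:  fatigue, no fatigue
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}")  # significant at 95% if p < 0.05
```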
Abstract: The results of a study on knowledge management systems in businesses showed that most of these businesses provide Internet access to their employees so that they can acquire new knowledge via the Internet, the corporate website, electronic mail, and an electronic learning system. These business organizations use information technology applications for knowledge management because of convenience, time saving, ease of use, accuracy of information, and knowledge usefulness. The results indicated the following prominent improvements for corporate knowledge management systems: 1) management must support the corporate knowledge management system; 2) the goal of corporate knowledge management must be clear; 3) corporate culture should facilitate the exchange and sharing of knowledge within the organization; 4) the cooperation of personnel at all levels must be obtained; 5) an information technology infrastructure must be provided; and 6) the system must be developed regularly and constantly.
Abstract: Applying a rigorous process to optimize the elements
of a supply-chain network resulted in reduced waiting time for a
service provider and its customers. Different sources of downtime
of the hydraulic pressure controller/calibrator (HPC) were causing
interruptions in operations. The process examined all the issues to
drive greater efficiencies. The issues included inherent design issues
with the HPC pump, contamination of the HPC with impurities, and the
lead time required for annual calibration in the USA.
The HPC is used for mandatory testing/verification of formation
tester, pressure measurement and logging-while-drilling tools by
oilfield service providers, including Halliburton.
After market study and analysis, it was concluded that the current
HPC model is best suited to the oilfield industry. To use the existing
HPC model effectively, design and contamination issues were
addressed through design and process improvements. An optimum
network is proposed after comparing different supply-chain models
for calibration lead-time reduction.
Abstract: The ideal sinc filter, which ignores the noise statistics, is
often applied to generate an arbitrary sample of a bandlimited signal
from uniformly sampled data. In this article, an optimal interpolator
is proposed that reaches the minimum mean square error (MMSE)
at its output in the presence of noise. The resulting interpolator is
thus a Wiener filter, and both the optimal infinite impulse response
(IIR) and finite impulse response (FIR) filters are presented. The
mean square errors (MSEs) of interpolators with impulse responses of
different lengths are obtained by computer simulations; these show
that the MSEs of the proposed interpolators of reasonable length are
improved by about 0.4 dB under flat power spectra in a noisy
environment with a signal-to-noise power ratio (SNR) of 10 dB. As
expected, the results also demonstrate the improvement in MSE of the
optimal interpolator over the ideal sinc filter at various fractional
delays, for a fixed-length impulse response.
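A minimal sketch of the FIR Wiener design follows, assuming a unit-power flat-spectrum signal occupying a fraction B of the Nyquist band and observed in white noise; this is an illustrative model, not necessarily the paper's exact one:

```python
import numpy as np

def wiener_frac_delay(N, d, snr_db, B=0.8):
    """MMSE (Wiener) FIR fractional-delay interpolator sketch.

    Assumes a unit-power signal with a flat spectrum over a fraction B
    of the Nyquist band, so its autocorrelation is sinc(B*k), observed
    in white noise at the given SNR (illustrative assumptions).
    """
    n = np.arange(N)
    sigma2 = 10 ** (-snr_db / 10)               # noise power (unit signal)
    # Autocorrelation matrix of the noisy samples: signal + white noise.
    R = np.sinc(B * (n[:, None] - n[None, :])) + sigma2 * np.eye(N)
    # Cross-correlation between the samples and the desired value at delay d.
    p = np.sinc(B * (n - d))
    return np.linalg.solve(R, p)                # Wiener solution h = R^{-1} p

h = wiener_frac_delay(N=16, d=7.3, snr_db=10)   # e.g. a delay of 7.3 samples
```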
Abstract: The automatic transmission (AT) is one of the most
important components of many automobile transmission systems. The
shift quality has a significant influence on the ride comfort of the
vehicle. During the AT shift process, the joint elements such as the
clutch and bands engage or disengage, linking sets of gears to create a
fixed gear ratio. Since these ratios differ between gears in a fixed gear
ratio transmission, the motion of the vehicle could change suddenly
during the shift process if the joint elements are engaged or disengaged
inappropriately, additionally impacting the entire transmission system
and increasing the temperature of the connecting elements. The objective
was to establish a system model for an AT powertrain using
Matlab/Simulink. This paper further analyses the effect of varying
hydraulic pressure and its associated impact on shift quality during
both engagement and disengagement of the joint elements, showing that
shift quality improvements can be achieved with appropriate
hydraulic pressure control.
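As a much-reduced illustration of the kind of model involved (the paper's full AT powertrain model is built in Matlab/Simulink), a two-inertia clutch-engagement sketch with hypothetical parameters shows how hydraulic pressure sets the friction torque and hence the slip behaviour:

```python
import numpy as np

# Hypothetical two-inertia clutch-engagement sketch; all values are
# made up for illustration and are not the paper's parameters.
J_e, J_v = 0.2, 2.0          # engine-side / vehicle-side inertias (kg*m^2)
T_e, T_load = 120.0, 60.0    # engine torque and road-load torque (N*m)
k_c = 0.01                   # clutch torque per unit pressure (N*m/kPa)

def engage(p_hydraulic, dt=1e-4, t_end=2.0):
    """Integrate slip speed during engagement at a given pressure."""
    w_e, w_v = 250.0, 150.0  # initial engine / output speeds (rad/s)
    for step in range(int(t_end / dt)):
        slip = w_e - w_v
        if abs(slip) < 1e-2:
            return step * dt                      # time to lock-up (s)
        T_c = k_c * p_hydraulic * np.sign(slip)   # clutch friction torque
        w_e += dt * (T_e - T_c) / J_e
        w_v += dt * (T_c - T_load) / J_v
    return None                                   # did not lock within t_end

for p in (10e3, 20e3, 40e3):                      # pressures in kPa
    print(p, engage(p))   # higher pressure -> faster but harsher lock-up
```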
Abstract: Biclustering is a very useful data mining technique for
identifying patterns in which different genes are correlated over a
subset of conditions in gene expression analysis. Association rule
mining is an efficient approach to biclustering, as in the BIMODULE
algorithm, but it is sensitive to the values of its input parameters
and to the discretization procedure used in the preprocessing step;
moreover, when noise is present, classical association rule miners
discover multiple small fragments of the true bicluster but miss the
true bicluster itself. This paper formally presents a generalized
noise-tolerant bicluster model, termed μBicluster. An iterative
algorithm based on the proposed model, termed BIDENS, is introduced
that can discover a set of k possibly overlapping biclusters
simultaneously. Our model uses a more flexible method of partitioning
the dimensions to preserve meaningful and significant biclusters. The
proposed algorithm allows the discovery of biclusters that are hard
to discover with BIMODULE. An experimental study on yeast and human
gene expression data and several artificial datasets shows that our
algorithm offers substantial improvements over several previously
proposed biclustering algorithms.
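The BIDENS algorithm itself is not reproduced here, but the noise-tolerant idea can be sketched generically: on a binarized expression matrix, a submatrix is accepted as a bicluster if the fraction of 1s stays above 1 - ε, and rows and columns are added greedily from a seed while that density constraint holds. The function names below are hypothetical, for illustration only:

```python
import numpy as np

def density(B, rows, cols):
    """Fraction of 1s in the submatrix: a perfect bicluster has
    density 1; noise tolerance accepts density >= 1 - eps."""
    return B[np.ix_(rows, cols)].mean()

def grow_bicluster(B, rows, cols, eps=0.2):
    """Greedy sketch (not BIDENS itself): starting from a seed
    (rows, cols), add rows/columns while the noise-tolerant density
    constraint stays satisfied."""
    rows, cols = list(rows), list(cols)
    changed = True
    while changed:
        changed = False
        for r in range(B.shape[0]):
            if r not in rows and density(B, rows + [r], cols) >= 1 - eps:
                rows.append(r); changed = True
        for c in range(B.shape[1]):
            if c not in cols and density(B, rows, cols + [c]) >= 1 - eps:
                cols.append(c); changed = True
    return rows, cols
```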
Abstract: The development of information and communication
technology, the increased use of the Internet, and the effects of
the recession in recent years have led to the increased use of
cloud-computing-based solutions, also called on-demand solutions.
These solutions offer a large number of benefits to organizations, as
well as challenges and risks, mainly determined by the placement of
data in different geographic locations on the Internet. As far as the
specific risks of the cloud environment are concerned, data security
is still considered the main barrier to adopting cloud computing. The
present study offers an approach to ensuring the security of cloud
data, oriented towards the whole data life cycle. The final part of
the study focuses on the assessment of data security in the cloud,
this being the basis for determining potential losses and the
premise for subsequent improvements and continuous learning.
Abstract: The success of IT projects concerning the
implementation of business application software depends strongly
on the application of efficient requirements management, to
understand the business requirements and to realize them in IT. In
fact, however, the potential of requirements management is not fully
exploited by small and medium-sized enterprises (SMEs) of the IT
sector. To work out recommendations for action, and furthermore a
possible solution allowing better exploitation of this potential, a
scientific research project examines which problems occur and from
which causes. At the same time, the storage of knowledge from
requirements management, and its later reuse, are important for
achieving sustainable improvements in the competitiveness of IT
SMEs. Requirements engineering is one of the most important topics
in product management for software in pursuit of the goal of
optimizing the success of the software product.
Abstract: Several fast exact algorithms for the maximum weight clique problem have been proposed, Östergård's algorithm being one of them. Kumlander claims that his algorithm is faster, but we confirmed that a straightforward implementation of Kumlander's algorithm is slower than Östergård's algorithm. We propose several improvements to Kumlander's algorithm.
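The specific improvements are not detailed in the abstract; for orientation, the family of exact algorithms being compared is branch-and-bound over candidate vertex sets with a weight-based bound, in the spirit of the following simplified sketch (positive weights assumed; this is not Östergård's or Kumlander's exact algorithm):

```python
def max_weight_clique(adj, w):
    """Branch-and-bound sketch for maximum weight clique.
    adj[v] is the set of neighbours of v; w[v] its positive weight."""
    best = [0, []]

    def expand(cand, clique, weight):
        if not cand:
            if weight > best[0]:
                best[0], best[1] = weight, clique[:]
            return
        # Bound: even taking every remaining candidate cannot beat the best.
        if weight + sum(w[v] for v in cand) <= best[0]:
            return
        for v in sorted(cand, key=lambda u: -w[u]):
            cand = cand - {v}   # branches after this one exclude v
            expand(cand & adj[v], clique + [v], weight + w[v])

    expand(set(w), [], 0)
    return best[0], best[1]

# Triangle {1,2,3} has weight 9; isolated vertex 4 outweighs it alone.
adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}, 4: set()}
w = {1: 2, 2: 3, 3: 4, 4: 10}
print(max_weight_clique(adj, w))   # -> (10, [4])
```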
Abstract: In this work, we consider a deterministic model for
the transmission of leptospirosis which is currently spreading in the
Thai population. The SIR model which incorporates the features of
this disease is applied to the epidemiological data in Thailand. It is
seen that the numerical solutions of the SIR equations are in good
agreement with real empirical data. Further improvements are
discussed.
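A minimal sketch of the SIR machinery the paper applies follows; the β, γ values and initial conditions below are placeholders, not the fitted Thai parameters:

```python
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    """Standard SIR equations with S, I, R as population fractions:
    dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I."""
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

# Hypothetical parameter values for illustration only; the paper fits
# its model to Thai leptospirosis surveillance data.
beta, gamma = 0.4, 0.1          # transmission and recovery rates (1/day)
t = np.linspace(0, 365, 366)
S, I, R = odeint(sir, [0.99, 0.01, 0.0], t, args=(beta, gamma)).T
print(f"epidemic peak: I = {I.max():.3f} on day {t[I.argmax()]:.0f}")
```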
Abstract: Nowadays, companies strive to survive in a competitive
global environment. To speed up product development and
modification, the adoption of a collaborative product development
approach is suggested. However, despite advances in IT, many CAx
systems still work separately and locally. Collaborative design and
manufacture require a product information model that supports the
related CAx product data models. Many solutions have been proposed to
this problem, the most successful of which is adopting the STEP
standard as the product data model for developing a collaborative
CAx platform. However, several factors usually slow the
implementation of the STEP standard in collaborative data exchange,
management and integration and should be considered: the evolution of
STEP's Application Protocols (APs) over time, the huge number of STEP
APs and conformance classes (CCs), the high cost of implementation,
the costly process of converting older CAx software files to the STEP
neutral file format, and the lack of STEP knowledge. In this paper,
the requirements for a successful collaborative CAx system are
discussed. The capability of the STEP standard for product data
integration and its shortcomings, as well as the dominant platforms
supporting CAx collaboration management and product data integration,
are reviewed. Finally, a platform named LAYMOD is proposed to fulfil
the requirements of a collaborative CAx environment and to integrate
the product data. The platform is layered, enabling global
collaboration among different CAx software packages/developers. It
also adopts the STEP modular architecture and XML data structures to
enable collaboration between CAx software packages and to overcome
the limitations of the STEP standard. The architecture and procedures
of the LAYMOD platform for managing collaboration and avoiding
contradictions in product data integration are introduced.
Abstract: State-of-the-art methods for secondary structure prediction (Porter, Psi-PRED, SAM-T99sec, Sable) and solvent accessibility prediction (Sable, ACCpro) use evolutionary profiles represented by the position-specific scoring matrix (PSSM). It has been demonstrated that evolutionary profiles are the most important features in the feature space for these predictions. Unfortunately, applying the PSSM matrix leads to high-dimensional feature spaces that may create problems with parameter optimization and generalization. Several recently published studies suggested that applying feature extraction to the PSSM matrix may result in improvements in secondary structure predictions. However, none of the top-performing methods considered here utilizes dimensionality reduction to improve generalization. In the present study, we used simple and fast feature selection methods (t-statistics, information gain) that allow us to decrease the dimensionality of the PSSM matrix by 75% and improve generalization in the case of secondary structure prediction compared to the Sable server.
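As a sketch of this kind of filter-based selection (shown for a two-class version of the problem; the paper's actual pipeline and thresholds may differ), features can be ranked by absolute t-statistic and the top 25% kept, matching the 75% reduction reported:

```python
import numpy as np
from scipy.stats import ttest_ind

def select_by_tstat(X, y, keep_frac=0.25):
    """Rank features by the absolute two-sample t-statistic between
    two classes and keep the top fraction (25% here, i.e. a 75%
    dimensionality reduction). Illustrative sketch of the general
    technique, not the paper's exact pipeline."""
    t, _ = ttest_ind(X[y == 0], X[y == 1], axis=0)
    k = max(1, int(keep_frac * X.shape[1]))
    idx = np.argsort(-np.abs(t))[:k]      # indices of the top-k features
    return X[:, idx], idx
```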
Abstract: This paper deals with environmental metrics and assessment systems devoted to Small and Medium-Sized Enterprises (SMEs). The authors present a proposed assessment model that is able to discover the current environmental strengths and weaknesses of a Small and Medium-Sized Enterprise. The suggested model also has the ambition to become a sustainability decision tool. The model is able to identify the "best environmental decision" in the company and to quantify how this decision contributed to the overall environmental improvement. The authors understand environmental improvements as environmental innovations (product, process and organizational). The suggested model is based on its own concept; however, the authors also utilize existing environmental assessment tools.