Abstract: Corner detection and optical flow are common techniques for feature-based video stabilization. However, these algorithms are computationally expensive and must therefore be performed at a reasonable rate. This paper presents an algorithm that discards irrelevant feature points and maintains the remainder for future use so as to reduce the computational cost. The algorithm starts by initializing a maintained set. The feature points in the maintained set are examined for their accuracy for modeling. Corner detection is required only when the feature points are insufficiently accurate for future modeling. Then, optical flows are computed from the maintained feature points toward the consecutive frame. After that, a motion model is estimated based on the simplified affine motion model and the least-squares method, with outliers belonging to moving objects present. Studentized residuals are used to eliminate such outliers. The model estimation and elimination processes repeat until no more outliers are identified. Finally, the entire algorithm repeats along the video sequence, with the points remaining from the previous iteration used as the maintained set. As a practical application, efficient video stabilization can be achieved by exploiting the computed motion models. Our study shows that the number of times corner detection needs to be performed is greatly reduced, thus significantly reducing the computational cost. Moreover, optical flow vectors are computed only for the maintained feature points, not for outliers, which further reduces the computational cost. In addition, the feature points remaining after reduction are sufficient for background object tracking, as demonstrated by the simple video stabilizer based on our proposed algorithm.
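The core estimation loop described above, least-squares fitting of a simplified affine model followed by outlier elimination via studentized residuals, can be sketched as follows. This is a minimal NumPy sketch, not the paper's implementation: the 4-parameter form of the model, the threshold value, and all names are assumptions.

```python
import numpy as np

def fit_simplified_affine(src, dst, t_thresh=3.0, max_iter=10):
    """Iteratively fit a 4-parameter (simplified affine) motion model
    [x'; y'] = [a -b; b a] [x; y] + [tx; ty]
    by least squares, removing outliers via studentized residuals."""
    keep = np.ones(len(src), dtype=bool)
    beta = np.zeros(4)
    for _ in range(max_iter):
        s, d = src[keep], dst[keep]
        n = len(s)
        # Two stacked rows per point; unknowns are (a, b, tx, ty).
        X = np.zeros((2 * n, 4))
        X[0::2] = np.column_stack([s[:, 0], -s[:, 1], np.ones(n), np.zeros(n)])
        X[1::2] = np.column_stack([s[:, 1],  s[:, 0], np.zeros(n), np.ones(n)])
        y = d.reshape(-1)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta                                  # raw residuals
        dof = max(len(y) - 4, 1)
        sigma = np.sqrt(r @ r / dof)
        # Leverages h_ii = diag(X (X^T X)^-1 X^T) for studentization.
        h = np.sum((X @ np.linalg.inv(X.T @ X)) * X, axis=1)
        h = np.clip(h, 0.0, 0.999)
        t = r / (sigma * np.sqrt(1.0 - h) + 1e-12)        # studentized residuals
        bad_rows = np.abs(t) > t_thresh
        bad_pts = bad_rows[0::2] | bad_rows[1::2]
        if not bad_pts.any():
            break                                         # no more outliers
        idx = np.flatnonzero(keep)
        keep[idx[bad_pts]] = False
    return beta, keep
```

The surviving `keep` mask plays the role of the maintained set carried to the next frame, so corner re-detection is needed only when too few points survive.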
Abstract: Municipal solid waste (MSW) management in general, as well as in Thailand, is reviewed in this paper. Topics include MSW generation, sources, composition, and trends. The review then moves to sustainable solutions for MSW management and sustainable alternative approaches, with an emphasis on integrated MSW management. Information on waste in Thailand is also given at the beginning of this paper for a better understanding of the later contents. It is clear that no single method of MSW disposal can deal with all materials in an environmentally sustainable way. As such, a suitable approach to MSW management should be an integrated approach that can deliver both environmental and economic sustainability. With increasing environmental concerns, an integrated MSW management system has the potential to maximize the usable waste materials as well as to produce energy as a by-product. In Thailand, the waste is composed mainly (86%) of organic waste, paper, plastic, glass, and metal. As a result, the waste in Thailand is suitable for integrated MSW management. Currently, the Thai national waste management policy has started to encourage local administrations to gather into clusters to establish central MSW disposal facilities with suitable technologies, reducing the disposal cost based on the amount of MSW generated.
Abstract: The well-known NP-complete Traveling Salesman Problem (TSP) is coded in genetic form. A software system is proposed to determine the optimum route for a Traveling Salesman Problem using the Genetic Algorithm technique. The system starts from a matrix of the calculated Euclidean distances between the cities to be visited by the traveling salesman and a randomly chosen city order as the initial population. New generations are then created repeatedly until the proper path is reached upon meeting a stopping criterion. This search is guided by a solution evaluation function.
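A minimal genetic algorithm of the kind described, with random initial tours, a distance-based evaluation function, and a fixed-generation stopping criterion, might look as follows. The specific operators (tournament selection, order crossover, swap mutation, elitism) and the parameter values are illustrative assumptions, since the abstract does not specify them.

```python
import random, math

def tour_length(tour, dist):
    """Evaluation function: total length of the closed tour."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def order_crossover(p1, p2, rng):
    """OX: copy a random slice from p1, fill the rest in p2's order."""
    n = len(p1)
    i, j = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[i:j + 1] = p1[i:j + 1]
    fill = [c for c in p2 if c not in child]
    k = 0
    for idx in range(n):
        if child[idx] is None:
            child[idx] = fill[k]; k += 1
    return child

def ga_tsp(dist, pop_size=60, generations=300, mut_rate=0.2, seed=1):
    rng = random.Random(seed)
    n = len(dist)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]  # random city orders
    best = min(pop, key=lambda t: tour_length(t, dist))
    for _ in range(generations):                               # stopping criterion
        new_pop = [best]                                       # elitism
        while len(new_pop) < pop_size:
            p1 = min(rng.sample(pop, 3), key=lambda t: tour_length(t, dist))
            p2 = min(rng.sample(pop, 3), key=lambda t: tour_length(t, dist))
            child = order_crossover(p1, p2, rng)
            if rng.random() < mut_rate:                        # swap mutation
                a, b = rng.sample(range(n), 2)
                child[a], child[b] = child[b], child[a]
            new_pop.append(child)
        pop = new_pop
        best = min(pop, key=lambda t: tour_length(t, dist))
    return best, tour_length(best, dist)
```

For a handful of cities this tiny search space is solved almost immediately; the distance matrix is the only problem-specific input, exactly as in the system described.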
Abstract: This paper provides an introduction to the evolution
of information and communication technology and illustrates its
usage in the work domain. The paper is subdivided into two parts.
The first part gives an overview of the different phases of
information processing in the work domain. It starts by charting the
past and present usage of computers in work environments and shows
current technological trends that are likely to influence future
business applications. The second part starts by briefly describing
how the usage of computers has changed business processes in the past,
and presents the first Ambient Intelligence applications based on
identification and localization information, which are already used in
the production and retail sectors. Based on current systems and
prototype applications, the paper gives an outlook on how Ambient
Intelligence technologies could change business processes in the
future.
Abstract: Text Mining is an important step of the Knowledge
Discovery process. It is used to extract hidden information from
non-structured or semi-structured data. This aspect is fundamental
because much of the Web's information is semi-structured due to the
nested structure of HTML code, much of the Web's information is
linked, and much of the Web's information is redundant. Web Text
Mining supports the whole knowledge mining process in the mining,
extraction and integration of useful data, information and knowledge
from Web page contents.
In this paper, we present a Web Text Mining process able to
discover knowledge in a distributed and heterogeneous
multi-organization environment. The Web Text Mining process is based
on a flexible architecture and is implemented in four steps able to
examine web content and to extract useful hidden information
through mining techniques. Our Web Text Mining prototype starts
from the retrieval of Web job offers, from which, through a Text
Mining process, useful information for their fast classification is
drawn out; this information essentially comprises the job offer
place and the required skills.
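The final classification step, pulling the place and skills out of a job-offer text, can be illustrated with a toy keyword-matching sketch. The lexicons and the tokenization are hypothetical stand-ins; the paper's actual four-step mining process and dictionaries are not given in the abstract.

```python
import re

# Hypothetical keyword lexicons (illustrative only).
SKILLS = {"java", "sql", "python", "html", "css"}
PLACES = {"milan", "rome", "turin", "naples"}

def classify_job_offer(text):
    """Extract place and skill keywords from a job-offer text,
    a toy stand-in for the paper's Text Mining classification step."""
    tokens = set(re.findall(r"[a-z+#]+", text.lower()))
    return {"place": sorted(tokens & PLACES),
            "skills": sorted(tokens & SKILLS)}
```

A real pipeline would add the earlier steps (retrieval, cleaning of the semi-structured HTML, term weighting) before this lookup.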
Abstract: The following paper presents an interactive tool whose
main purpose is to teach how to play a flute. It consists of three
stages: the first is the instruction and teaching process through a
software application; the second is the practice part, in which the
user starts to play the flute (hardware specially designed for this
application), which is capable of capturing how it is being played;
and the final stage is the one in which the captured data are sent
to the software and the user is evaluated in order to give him or
her a correction or an acceptance.
Abstract: This is an application research presenting the
improvement of production quality using six sigma solutions and
benefit-cost ratio analysis. The case of interest is the production
of tile-concrete. This production had faced the problems of a high
rate of nonconforming products caused by inappropriate surface
coating and of low process capability with respect to the strength
property of the tile. Surface coating and tile strength are the
characteristics most critical to the quality of this product. The
improvements followed the five stages of six sigma solutions. After
the improvement, the production yield increased to the required 80%
target, and the defective products from the coating process were
remarkably reduced from 29.40% to 4.09%. The process capability
based on strength quality increased from 0.87 to 1.08, as customers
required. The improvement saved material losses of 3.24 million baht,
or 0.11 million dollars. The benefits from the improvement were
analyzed from (1) the reduction of the number of nonconforming
tiles, using the factory price, for the surface coating improvement
and (2) the materials saved from the increase in process capability.
The benefit-cost ratio of the overall improvement was as high as
7.03. The investment did not pay off during the define, measure and
analyze stages and the beginning of the improve stage, after which
the ratio kept increasing. This is because there were no benefits in
the define, measure and analyze stages of six sigma, since these
three stages mainly determine the cause of the problem and its
effects rather than improve the process. The benefit-cost ratio
starts to appear in the improve stage and grows from there on.
Within each stage, the individual benefit-cost ratio was much higher
than the cumulative one, since costs had accumulated from the first
stage of six sigma onward. Considering the benefit-cost ratio during
the improvement project helps in making decisions for cost saving in
similar activities during the improvement and in new projects. In
conclusion, the determination of benefit-cost ratio behavior
throughout the six sigma implementation period provides useful data
for managing quality improvement for optimal effectiveness. This is
an additional outcome beyond the regular proceeding of six sigma.
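The distinction drawn above between the individual (per-stage) and cumulative benefit-cost ratios can be made concrete with a small sketch. The per-stage costs and benefits below are hypothetical numbers, not the paper's data; the point is only that benefits appear from the improve stage on while costs accumulate from the first stage.

```python
# Hypothetical per-stage costs and benefits (thousand baht, illustrative).
stages  = ["define", "measure", "analyze", "improve", "control"]
cost    = [50.0, 80.0, 60.0, 200.0, 40.0]
benefit = [0.0, 0.0, 0.0, 900.0, 400.0]  # benefits only from "improve" on

def bcr(benefits, costs):
    """Benefit-cost ratio."""
    return benefits / costs if costs > 0 else None

# Cumulative BCR accumulates cost from the first stage, so it lags the
# individual per-stage BCR, as the abstract observes.
cum_cost = cum_benefit = 0.0
rows = []
for name, c, b in zip(stages, cost, benefit):
    cum_cost += c
    cum_benefit += b
    rows.append((name, bcr(b, c), bcr(cum_benefit, cum_cost)))
```

With these toy figures the cumulative ratio stays at zero through define, measure and analyze, jumps in the improve stage, and keeps rising, mirroring the behavior reported for the real project.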
Abstract: This study was conducted to investigate the incidence
of the pathogenic bacteria Salmonella, Shigella, Escherichia coli
O157 and Staphylococcus aureus in cakes and tarts collected from
thirty-five confectionery producing and selling premises located
within Tripoli city, Libya. The results revealed an incidence of
S. aureus of 94.4 and 48.0%, of E. coli O157 of 14.7 and 4.0% and of
Salmonella sp. of 5.9 and 8.0% in cake and tart samples respectively,
while Shigella was not detected in any sample. In order to determine
the source of these pathogenic bacteria, cotton swabs were taken
from the hands of workers on the production line, the surfaces of
preparation tables and cream-whipping instruments. The results
showed that the cotton swabs obtained from the hands of workers
contained S. aureus and Salmonella sp. with incidences of 42.9 and
2.9%, those obtained from the surfaces of preparation tables with
22.9 and 2.9%, and those obtained from the cream-whipping
instruments with 14.3 and 0.0% respectively, while E. coli O157 and
Shigella sp. were not detected in any swab. Additionally, other
bacteria were isolated from the hands of workers and the surfaces
of production equipment, including Aeromonas sp., Pseudomonas sp.,
E. coli, Klebsiella sp., Enterobacter sp., Citrobacter sp.,
Proteus sp., Serratia sp. and Acinetobacter sp. These results
indicate that some of the cakes and tarts might pose a threat to
consumers' health. Meanwhile, the occurrence of pathogenic bacteria
on the hands of those working on the production line and on the
surfaces of the equipment reflects poor hygienic practices at most
of the confectionery premises examined in this study. Thus, firm and
continuous surveillance of these premises is needed to ensure
consumers' health and safety.
Abstract: Mycophenolic acid (MPA) is a secondary metabolite
of Penicillium brevicompactum with antibiotic and
immunosuppressive properties. In this study, a fermentation process
was established for the production of mycophenolic acid by
Penicillium brevicompactum MUCL 19011 in shake flasks. The maximum
MPA production, product yield and productivity were 1.379 g/L,
18.6 mg/g glucose and 4.9 mg/L.h respectively. The glucose
consumption, biomass and MPA production profiles were investigated
over the fermentation time. It was found that MPA production starts
approximately after 180 hours and reaches a maximum at 280 h. In the
next step, the effects of methionine and acetate concentrations on
MPA production were evaluated. Maximum MPA production, product yield
and productivity (1.763 g/L, 23.8 mg/g glucose and 6.30 mg/L.h
respectively) were obtained using 2.5 g/L methionine in the culture
medium. Further addition of methionine had no additional positive
effect on MPA production. Finally, the results showed that the
addition of acetate to the culture medium had no observable effect
on MPA production.
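Assuming the volumetric productivity is computed as final titer over the elapsed time to the 280 h production maximum (the usual convention, though the abstract does not state it), the reported figures are internally consistent, as this small check shows.

```python
def volumetric_productivity(titer_g_per_l, time_h):
    """Volumetric productivity in mg/(L*h) from final titer and elapsed time."""
    return titer_g_per_l * 1000.0 / time_h

# Titers reported in the abstract at the 280 h production maximum.
base     = volumetric_productivity(1.379, 280.0)  # about 4.9 mg/L.h
with_met = volumetric_productivity(1.763, 280.0)  # about 6.3 mg/L.h
```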
Abstract: The present work demonstrates the design and simulation of fuzzy control of an air-conditioning system at different pressures. A first-order Sugeno fuzzy inference system is utilized to model the system and create the controller. In addition, an estimate of the heat transfer rate and of the mass flow rate of water injected into or withdrawn from the air-conditioning system is determined by the fuzzy IF-THEN rules. The approach starts by generating the input/output data. Then, the subtractive clustering algorithm, along with least-squares estimation (LSE), generates the fuzzy rules that describe the relationship between the input and output data. The fuzzy rules are tuned by an Adaptive Neuro-Fuzzy Inference System (ANFIS). The results show that, as the pressure increases, the water flow rate and heat transfer rate decrease within the lower range of inlet dry-bulb temperatures. On the other hand, as the pressure increases, the water flow rate and heat transfer rate increase within the higher range of inlet dry-bulb temperatures. The inflection in the pressure-effect trend occurs at lower temperatures as the inlet air humidity increases.
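The first-order Sugeno inference used here combines linear consequents through a firing-strength-weighted average. The sketch below shows that mechanism for a single input; the Gaussian premises and linear consequents are illustrative assumptions, since the paper's actual rules come from subtractive clustering plus LSE and are not reproduced in the abstract.

```python
import math

def gauss(x, c, s):
    """Gaussian membership function centered at c with width s."""
    return math.exp(-((x - c) ** 2) / (2.0 * s ** 2))

def sugeno1(x, rules):
    """First-order Sugeno inference for one input x.
    Each rule is ((c, s), (p, q)): a Gaussian premise and a linear
    consequent y = p*x + q; the output is the weighted average of the
    consequents, weighted by the rule firing strengths."""
    w = [gauss(x, c, s) for (c, s), _ in rules]
    y = [p * x + q for _, (p, q) in rules]
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

# Illustrative rules only (e.g. x = inlet dry-bulb temperature in C).
rules = [((20.0, 5.0), (0.10, 1.0)),   # "low inlet temperature" rule
         ((35.0, 5.0), (-0.05, 6.0))]  # "high inlet temperature" rule
```

ANFIS tuning then adjusts the premise parameters (c, s) and consequent parameters (p, q) against the generated input/output data.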
Abstract: Real-time 3D applications have to guarantee
interactive rendering speed, yet there is a restriction on the
number of polygons that can be rendered, due to the performance of
the graphics hardware and graphics algorithms. In general, rendering
performance increases drastically when handling only the dynamic 3D
models, which are much fewer than the static ones. Since the shapes
and colors of the static objects do not change while the viewing
direction is fixed, this information can be reused. We render huge
numbers of polygons that cannot be handled by conventional rendering
techniques in real time by using a static-object image and merging
it with the rendering result of the dynamic objects. Performance
necessarily decreases when the static-object image must be updated,
which involves removing a static object that starts to move and
re-rendering the other static objects overlapped by the moving ones.
Based on the visibility of the object beginning to move, we can skip
this updating process. As a result, we enhance rendering performance
and reduce the differences in rendering speed between frames. The
proposed method renders a total of 200,000,000 polygons, consisting
of 500,000 dynamic polygons with the rest static, at about 100
frames per second.
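The caching scheme described above can be reduced to a toy 2D sketch: keep the static layer as a cached image, composite dynamic sprites on top each frame, and rebuild the cache only when an object that started moving is actually visible. Everything here (the 8x8 "image", sprite format, visibility sets) is an illustrative assumption, not the paper's renderer.

```python
import numpy as np

def composite(static_layer, dynamic_sprites):
    """Overlay dynamic sprites on a copy of the cached static layer."""
    frame = static_layer.copy()
    for (r, c), sprite in dynamic_sprites:
        h, w = sprite.shape
        frame[r:r + h, c:c + w] = sprite
    return frame

def render_frame(static_objs, dynamic_sprites, cache, moved_ids, visible_ids):
    """Rebuild the cached static image only when a static object that
    started moving is visible; otherwise reuse the cache unchanged."""
    if cache is None or any(i in visible_ids for i in moved_ids):
        cache = np.zeros((8, 8), dtype=np.uint8)  # toy 8x8 static image
        for (r, c), sprite in static_objs:
            h, w = sprite.shape
            cache[r:r + h, c:c + w] = sprite
    return composite(cache, dynamic_sprites), cache
```

The skipped rebuild when the mover is invisible is exactly what evens out the per-frame rendering times.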
Abstract: Research in quantum computation investigates the consequences of having information encoding, processing and communication exploit the laws of quantum physics, i.e. the laws which govern the ultimate knowledge that we have, today, of the foreign world of elementary particles, as described by quantum mechanics. This paper starts with a short survey of the principles which underlie quantum computing, and of some of the major breakthroughs brought by the first ten to fifteen years of research in this domain; quantum algorithms and quantum teleportation are very briefly presented. The next sections are devoted to one among the many directions of current research in the quantum computation paradigm, namely quantum programming languages and their semantics. A few other hot topics and open problems in quantum information processing and communication are briefly mentioned in the concluding remarks, the most difficult of them being the physical implementation of a quantum computer. The interested reader will find a list of useful references at the end of the paper.
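Quantum teleportation, one of the breakthroughs the survey presents, can be checked with a small statevector simulation. This is a textbook sketch of the standard protocol (it is not taken from the paper): qubit 0 holds the state to teleport, qubits 1 and 2 share a Bell pair, and after a Bell measurement on qubits 0 and 1 the classical corrections X^m1 then Z^m0 recover the state on qubit 2.

```python
import numpy as np

I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]])

def teleport(psi):
    """Teleport the 1-qubit state psi (on qubit 0) through a Bell pair
    (qubits 1 and 2); return, for every measurement outcome (m0, m1),
    the corrected state left on qubit 2."""
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)     # (|00> + |11>)/sqrt(2)
    state = np.kron(psi, bell)                     # qubit 0 is most significant
    state = np.kron(CNOT, I2) @ state              # CNOT: qubit 0 -> qubit 1
    state = np.kron(H, np.eye(4)) @ state          # H on qubit 0
    results = {}
    for m0 in (0, 1):
        for m1 in (0, 1):
            # Amplitudes of qubit 2 in the (m0, m1) measurement branch.
            branch = state.reshape(2, 2, 2)[m0, m1]
            branch = branch / np.linalg.norm(branch)
            # Classical correction: apply X^m1 then Z^m0.
            fix = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1)
            results[(m0, m1)] = fix @ branch
    return results
```

All four measurement branches yield the original state, which is the whole point of the protocol: only two classical bits travel, yet the quantum state is reconstructed exactly.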
Abstract: This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify the structure of the model completely. Two different types of neural networks were used for the application to the Pulping of Sugar Maple problem. Three-layer feed-forward neural networks trained with Preconditioned Conjugate Gradient (PCG) methods were used in this investigation. Preconditioning is a method to improve convergence by lowering the condition number and increasing the clustering of the eigenvalues. The idea is to solve the modified problem M^-1 A x = M^-1 b, where M is a positive-definite preconditioner that is closely related to A. We mainly focused on Preconditioned Conjugate Gradient-based training methods which originated from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves Update (PCGF), Preconditioned Conjugate Gradient with Polak-Ribiere Update (PCGP) and Preconditioned Conjugate Gradient with Powell-Beale Restarts (PCGB). The behavior of the PCG methods in the simulations proved to be robust against phenomena such as oscillations due to large step sizes.
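The preconditioning idea, solving M^-1 A x = M^-1 b instead of A x = b, is easiest to see in the linear PCG iteration. The sketch below uses a Jacobi (diagonal) preconditioner on a symmetric positive-definite system; the choice of M and all parameter values are illustrative, and in the linear case the beta update shown coincides with the Fletcher-Reeves form.

```python
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradient for SPD A, with a diagonal
    (Jacobi) preconditioner M: solves A x = b via M^-1 A x = M^-1 b."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv_diag * r            # z = M^-1 r (preconditioned residual)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)     # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        beta = rz_new / rz        # Fletcher-Reeves-style conjugacy update
        rz = rz_new
        p = z + beta * p
    return x
```

In the training context, A and b come from the local quadratic model of the network's error surface; the better M clusters the eigenvalues of M^-1 A, the fewer iterations are needed.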
Abstract: This paper proposes a methodology for mitigating the occurrence of cascading failures in stressed power systems. The methodology is essentially based on predicting voltage instability in the power system using a voltage stability index and then devising a corrective action in order to increase the voltage stability margin. The paper starts with a brief description of the cascading failure mechanism, which is a probable root cause of severe blackouts. Then, voltage instability indices are introduced in order to evaluate the stability limit. The aim of the analysis is to ensure that the coordination of protection, by adopting a load shedding scheme, is capable of enhancing the performance of the system after the major location of instability is determined. Finally, the proposed method for generating instability predictions is introduced.
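One widely used line index of the kind described is the Fast Voltage Stability Index (FVSI); the abstract does not name the specific index used, so the formula, the 0.85 shedding margin, and the data below are assumptions for illustration only.

```python
def fvsi(v_send, q_recv, r_line, x_line):
    """Fast Voltage Stability Index for a single line:
    FVSI = 4 Z^2 Q_r / (V_s^2 X), with Z the line impedance magnitude.
    Values approaching 1.0 indicate proximity to voltage collapse."""
    z2 = r_line ** 2 + x_line ** 2
    return 4.0 * z2 * q_recv / (v_send ** 2 * x_line)

def needs_load_shedding(lines, margin=0.85):
    """Flag lines whose index exceeds a margin as candidates for the
    corrective load shedding scheme (an assumed trigger rule)."""
    return [name for name, args in lines if fvsi(*args) > margin]
```

Arguments are per-unit quantities: sending-end voltage, receiving-end reactive power, and line resistance and reactance.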
Abstract: Unified Modeling Language (UML) extensions for real-time embedded systems (RTES) co-design are attracting growing interest from a great number of industrial and research communities. The extension mechanism is provided by UML profiles for RTES. It aims at providing an easily understood method of system design for non-experts. On the other hand, one of the key items of co-design methods is hardware/software partitioning and task scheduling. Indeed, it is mandatory to define where and when tasks are implemented and run. Unfortunately, the main goals of co-design are not included in the usual practice of UML profiles. So, there exists a need for mapping the models used to an execution platform for both schedulability testing and HW/SW partitioning. In the present work, schedulability testing and design space exploration are performed at an early stage. The proposed approach adopts Model Driven Engineering (MDE). It starts from a UML specification annotated with the recent profile for the Modeling and Analysis of Real Time Embedded systems (MARTE). Following a refinement strategy, transformation rules make it possible to find a feasible schedule that satisfies the timing constraints and to define where tasks will be implemented. The overall approach is experimented with for the design of a football-player robot application.
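A concrete form of the schedulability test that such transformation rules could apply to the MARTE-annotated tasks is standard response-time analysis for fixed-priority (rate-monotonic) scheduling. This is a textbook test, offered as an illustration rather than the paper's actual rule set; tasks are (WCET, period) pairs with deadlines equal to periods.

```python
import math

def rta_schedulable(tasks):
    """Exact response-time analysis for fixed-priority tasks given as
    (C, T) pairs (worst-case execution time, period), with deadline = T.
    Priorities are assigned rate-monotonically: shorter period first."""
    tasks = sorted(tasks, key=lambda ct: ct[1])
    for i, (c, t) in enumerate(tasks):
        r = c
        while True:
            # Interference from all higher-priority tasks released in [0, r).
            interference = sum(math.ceil(r / tj) * cj for cj, tj in tasks[:i])
            r_next = c + interference
            if r_next == r:
                break                # fixed point: worst-case response time
            r = r_next
            if r > t:
                return False         # deadline miss
        if r > t:
            return False
    return True
```

A feasible schedule exists exactly when every task's fixed-point response time fits within its period; an infeasible result would push the exploration toward a different HW/SW partitioning.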
Abstract: This paper is a survey of current component-based
software technologies and a description of the promotion and
inhibition factors in CBSE. The features that software components
inherit are also discussed, and quality assurance issues in
component-based software are also catered to. The research on the
quality model of component-based systems starts with the study of
what components are, CBSE, its development life cycle and the pros
and cons of CBSE. Various attributes are studied and compared,
keeping in view the study of various existing models for general
systems and CBS. When illustrating the quality of a software
component, an apt set of quality attributes for the description of
the system (or components) should be selected. Finally, the research
issues that can be extended are tabularized.
Abstract: This paper describes a methodology for the remote
performance monitoring of retail refrigeration systems. The proposed
framework starts with monitoring of the whole refrigeration circuit,
which allows detecting deviations from expected behavior caused by
various faults and degradations. The subsequent diagnostic methods
drill down deeper in the equipment hierarchy to determine root
causes more specifically. An important feature of the proposed
concept is that it does not require any additional sensors, and
thus the performance monitoring solution can be deployed at a low
installation cost. Moreover, only a minimum of contextual
information is required, which also substantially reduces the time
and cost of the deployment process.
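The two-level scheme above, circuit-level deviation detection followed by a drill-down over the equipment hierarchy, can be reduced to a toy residual check. The relative tolerance and the measured/expected pairs are hypothetical; the paper's actual models are not described in the abstract.

```python
def deviates(measured, expected, rel_tol=0.15):
    """True if a measurement deviates from its model-predicted value
    by more than rel_tol (a toy circuit-level detection rule)."""
    return abs(measured - expected) > rel_tol * abs(expected)

def drill_down(component_readings, rel_tol=0.15):
    """Given per-component (measured, expected) pairs, return the
    components most likely responsible for a circuit-level deviation."""
    return [name for name, (m, e) in component_readings.items()
            if deviates(m, e, rel_tol)]
```

Because both levels reuse the sensors the system already has, no additional instrumentation is implied by this structure.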
Abstract: In the present work, we have developed a symmetric electrochemical capacitor based on nanostructured iron oxide (Fe3O4)-activated carbon (AC) nanocomposite materials. The physical properties of the nanocomposites were characterized by Scanning Electron Microscopy (SEM) and Brunauer-Emmett-Teller (BET) analysis. The electrochemical performance of the composite electrode in 1.0 M Na2SO3 and 1.0 M Na2SO4 aqueous solutions was evaluated using cyclic voltammetry (CV) and electrochemical impedance spectroscopy (EIS). The composite electrode with 4 wt% iron oxide nanomaterials exhibits the highest capacitance, 86 F/g. The experimental results clearly indicate that the incorporation of iron oxide nanomaterials at low concentration into the composite can improve the capacitive performance, mainly owing to the contribution of the pseudocapacitive charge storage mechanism and the enhancement of the effective surface area of the electrode. Nevertheless, there is an optimum threshold on the amount of iron oxide that can be incorporated into the composite system. When this optimum threshold is exceeded, the capacitive performance of the electrode starts to deteriorate as a result of undesired particle aggregation, which is clearly indicated by the SEM analysis. The electrochemical performance of the composite electrode is found to be superior when Na2SO3 is used as the electrolyte, compared to the Na2SO4 solution. It is believed that Fe3O4 nanoparticles can provide favourable surface adsorption sites for sulphite (SO3^2-) anions, which act as catalysts for subsequent redox and intercalation reactions.
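A specific capacitance such as the 86 F/g reported here is conventionally extracted from a CV cycle as C = (integral of |i| dt over the full cycle) / (2 m dV). The sketch below applies that standard formula to synthetic data for an ideal capacitor; the mass, scan rate, and current values are made up for illustration, not the paper's measurements.

```python
import numpy as np

def specific_capacitance(time_s, current_a, mass_g, v_window):
    """Specific capacitance (F/g) from one full CV cycle:
    C = (integral |i| dt) / (2 * m * dV). The factor 2 accounts for the
    charge passed in both the anodic and cathodic sweeps."""
    charge = np.sum(np.abs(current_a[:-1]) * np.diff(time_s))  # coulombs
    return charge / (2.0 * mass_g * v_window)
```

For an ideal capacitor cycled at scan rate v over window dV, |i| is constant at C_total * v and the formula returns exactly C_total / m, which is the sanity check used below.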
Abstract: The memristor is also known as the fourth fundamental
passive circuit element. When current flows in one direction through
the device, the electrical resistance increases, and when current
flows in the opposite direction, the resistance decreases. When the
current is stopped, the component retains the last resistance that
it had, and when the flow of charge starts again, the resistance of
the circuit will be what it was when it was last active. It behaves
as a nonlinear resistor with memory. Recently, memristors have
generated wide research interest and have found many applications.
In this paper we survey the various applications of memristors,
which include non-volatile memory, nanoelectronic memories, computer
logic, neuromorphic computer architectures, low-power remote sensing
applications, crossbar latches as transistor replacements, and
analog computations and switches.
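The behavior described in the opening sentences, resistance drifting with the direction of current and being retained when the current stops, is captured by the HP linear ion-drift memristor model (Strukov et al.). The sketch below is a minimal simulation of that model; the parameter values are illustrative only.

```python
# HP linear ion-drift memristor model; illustrative parameters.
R_ON, R_OFF = 100.0, 16000.0   # fully doped / undoped resistances (ohm)
MU_D = 1e-14                   # dopant mobility (m^2 / (s*V))
D = 1e-8                       # device thickness (m)

def step(w, i, dt):
    """Advance the doped-region width w under current i for time dt:
    dw/dt = MU_D * R_ON / D * i (linear drift)."""
    w += MU_D * R_ON / D * i * dt
    return min(max(w, 0.0), D)          # w stays within the device

def resistance(w):
    """Memristance for doped fraction x = w / D:
    M = R_ON * x + R_OFF * (1 - x)."""
    x = w / D
    return R_ON * x + R_OFF * (1.0 - x)
```

With i = 0 the state variable w, and hence the resistance, is unchanged, which is exactly the memory property exploited by the non-volatile-memory and crossbar-latch applications surveyed in the paper.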