Abstract: Pretreatment is an essential step in the conversion of
lignocellulosic biomass to the fermentable sugars used for biobutanol
production. Among pretreatment processes, microwave pretreatment is
considered to improve efficiency owing to its high heating efficiency,
easy operation, and ready combination with chemical reactions. The
main objectives of this work are to investigate the feasibility of
microwave pretreatment for enhancing the enzymatic hydrolysis of
corncobs and to determine the optimal conditions using response
surface methodology. Corncobs were pretreated in two stages: dilute
sodium hydroxide (2 %) followed by dilute sulfuric acid (1 %). The
pretreated corncobs were subjected to enzymatic hydrolysis to produce
reducing sugars. A statistical experimental design was used to optimize
the pretreatment parameters, including temperature, residence time and
solid-to-liquid ratio, to achieve the highest amount of glucose. The
results revealed that the solid-to-liquid ratio and temperature had a
significant effect on the amount of glucose.
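The abstract does not state the fitted model; response surface methodology of this kind typically fits a second-order polynomial to the measured response. A generic sketch, in which the coded variables x1 (temperature), x2 (residence time) and x3 (solid-to-liquid ratio) are an assumed labelling rather than the paper's own notation:

```latex
y = \beta_0 + \sum_{i=1}^{3} \beta_i x_i + \sum_{i=1}^{3} \beta_{ii} x_i^2
    + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon
```

Here y is the glucose yield, the \beta coefficients are estimated by least squares from the designed experiments, and the stationary point of the fitted surface gives the optimal condition estimate.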
Abstract: WiMAX (Worldwide Interoperability for Microwave Access) is
defined by the WiMAX Forum, formed in June 2001 to promote
conformance and interoperability of the IEEE 802.16 standard,
officially known as WirelessMAN. The attractive features of WiMAX
technology are its very high throughput and broadband wireless access
over long distances. A detailed simulation environment is demonstrated
with the UGS, nrtPS and ertPS service classes for throughput, delay
and packet delivery ratio in a mixed environment of fixed and mobile
WiMAX. A simple mobility model is considered for mobile WiMAX, and
the PMP mode of transmission is used in TDD mode. Network Simulator 2
(NS-2) is the tool used to simulate the WiMAX network scenario. A
simple Priority Scheduler and a Weighted Round Robin Scheduler are
the WiMAX schedulers used in this research.
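The abstract names a Weighted Round Robin scheduler but gives no implementation details; a minimal sketch of the WRR idea follows. The queue names and weights are illustrative, not taken from the simulation:

```python
from collections import deque

def weighted_round_robin(queues, weights):
    """Serve each queue in proportion to its weight, round by round.
    queues: dict name -> deque of packets; weights: dict name -> int."""
    served = []
    while any(queues.values()):            # stop when every queue is empty
        for name, weight in weights.items():
            for _ in range(weight):        # up to `weight` packets per round
                if queues[name]:
                    served.append(queues[name].popleft())
    return served

# Hypothetical example: the ertPS class gets twice the weight of nrtPS.
flows = {"ertPS": deque(["e1", "e2", "e3"]), "nrtPS": deque(["n1", "n2"])}
print(weighted_round_robin(flows, {"ertPS": 2, "nrtPS": 1}))
# -> ['e1', 'e2', 'n1', 'e3', 'n2']
```

In each round the higher-weight class drains more packets, which is how WRR trades throughput between service classes without starving any of them.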
Abstract: The design criteria of most hydraulic structures are
essentially based on runoff or water discharge. Two of the important
criteria are runoff and return period. These measures are mostly
calculated or estimated from stochastic data.
Another feature of hydrological data is its imprecision.
Therefore, in order to deal with uncertainty and imprecision, a new
fuzzy method of evaluating hydrological measures, based on Buckley's
estimation method, is developed. The method introduces
triangular fuzzy numbers for the different measures, in which both
the uncertainty and the imprecision concepts are considered. Besides,
since another important consideration in most hydrological studies is
the comparison of a measure over different months or years, a new
fuzzy comparison method, consistent with the special form of the
proposed fuzzy numbers, is also developed. Finally, to illustrate the
methods more explicitly, the two algorithms are tested on a simple
example and a real case study.
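The abstract's Buckley-based construction is not spelled out; as a minimal sketch of the triangular fuzzy numbers it relies on, the class below implements the membership function, fuzzy addition via interval arithmetic, and a simple centroid defuzzification that could be used when comparing measures. All numeric values are illustrative:

```python
class TriFuzzy:
    """Triangular fuzzy number (a, b, c): support [a, c], peak at b."""

    def __init__(self, a, b, c):
        assert a <= b <= c
        self.a, self.b, self.c = a, b, c

    def membership(self, x):
        """Degree to which x belongs to the fuzzy number, in [0, 1]."""
        if self.a < x <= self.b:
            return (x - self.a) / (self.b - self.a)
        if self.b < x < self.c:
            return (self.c - x) / (self.c - self.b)
        return 1.0 if x == self.b else 0.0

    def __add__(self, other):
        # Fuzzy addition: add the three defining points component-wise.
        return TriFuzzy(self.a + other.a, self.b + other.b, self.c + other.c)

    def centroid(self):
        # One common defuzzified value, usable for ranking two measures.
        return (self.a + self.b + self.c) / 3.0
```

For example, a monthly discharge estimated as "about 15, certainly between 10 and 22" becomes TriFuzzy(10, 15, 22), and two such measures can be added or compared through their centroids.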
Abstract: In this paper we propose a method for the recognition of
adult videos based on the support vector machine (SVM). Different
kernel features are proposed to classify adult videos. The SVM has the
advantage of being insensitive to the relative number of training
examples in the positive (adult video) and negative (non-adult video)
classes. This advantage is illustrated by comparing the performance of
different SVM kernels for the identification of adult videos.
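The abstract compares SVM kernels without naming them; the three most commonly compared kernels are sketched below in pure Python. The hyper-parameter defaults (degree, coef0, gamma) are illustrative, not values from the paper:

```python
import math

def linear_kernel(x, y):
    """Inner product <x, y>."""
    return sum(a * b for a, b in zip(x, y))

def poly_kernel(x, y, degree=3, coef0=1.0):
    """Polynomial kernel (<x, y> + coef0)^degree."""
    return (linear_kernel(x, y) + coef0) ** degree

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian RBF kernel exp(-gamma * ||x - y||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)
```

Swapping the kernel changes only the similarity measure the SVM optimizes over, so the same feature vectors extracted from a video can be classified under each kernel and the results compared.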
Abstract: One very interesting field of research in Pattern Recognition that has gained much attention in recent times is Gesture Recognition. In this paper, we consider a form of dynamic hand gestures characterized by the total movement of the hand (arm) in space. For these types of gestures, the shape of the hand (palm) during gesturing does not bear any significance. In our work, we propose a model-based method for tracking hand motion in space, thereby estimating the hand motion trajectory. We employ the dynamic time warping (DTW) algorithm for time alignment and for normalization of the spatio-temporal variations that exist among samples belonging to the same gesture class. During training, one template trajectory and one prototype feature vector are generated for every gesture class. The features used in our work include static and dynamic motion trajectory features. Recognition is accomplished in two stages. In the first stage, all unlikely gesture classes are eliminated by comparing the input gesture trajectory to all the template trajectories. In the next stage, the feature vector extracted from the input gesture is compared to all the class prototype feature vectors using a distance classifier. Experimental results demonstrate that our proposed trajectory estimator and classifier are suitable for a Human-Computer Interaction (HCI) platform.
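The DTW alignment used for the template comparison stage is a standard dynamic programme; a minimal sketch on scalar sequences follows (the paper's trajectories would be sequences of 2-D/3-D points, for which the `dist` argument would be a Euclidean distance):

```python
def dtw_distance(s, t, dist=lambda a, b: abs(a - b)):
    """Dynamic time warping distance between two sequences, aligning
    them non-linearly in time (classic O(len(s)*len(t)) table fill)."""
    INF = float("inf")
    n, m = len(s), len(t)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(s[i - 1], t[j - 1])
            # extend the cheapest of the three predecessor alignments
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# A time-stretched copy of a trajectory still matches perfectly:
print(dtw_distance([1, 2, 3], [1, 2, 2, 3]))   # -> 0.0
```

This tolerance to time stretching is exactly what lets one template per gesture class absorb the spatio-temporal variation among samples of that class.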
Abstract: The approach based on the wavelet transform has
been widely used for image denoising due to its multi-resolution
nature, its ability to produce high levels of noise reduction and the
low level of distortion introduced. However, by removing noise, high-
frequency components belonging to edges are also removed, which
leads to blurring of the signal features. This paper proposes a new
method of image noise reduction based on local variance and edge
analysis. The analysis is performed by dividing an image into 32 x 32
pixel blocks and transforming the data into the wavelet domain. A fast
lifting wavelet spatial-frequency decomposition and reconstruction is
developed, with the advantages of computational efficiency and
minimized boundary effects. Adaptive thresholding based on local
variance estimation and edge strength measurement can effectively
reduce image noise while preserving the features of the original image
corresponding to the boundaries of objects. Experimental results
demonstrate that the method performs well for images contaminated
by natural and artificial noise, and that it can be adapted to different
classes of images and types of noise. The proposed algorithm offers a
potential solution, with parallel computation, for real-time and
embedded-system applications.
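The paper's method is two-dimensional and also uses an edge-strength measure; as a much reduced sketch of the two core ingredients, the following shows a one-level Haar transform in lifting form and a hard threshold on the detail coefficients scaled by the block's local standard deviation. The scale factor k is an assumed tuning parameter:

```python
def haar_lift_forward(x):
    """One level of the Haar wavelet transform in lifting form
    (x must have even length): averages s and details d."""
    s = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    d = [x[2 * i + 1] - x[2 * i] for i in range(len(x) // 2)]
    return s, d

def haar_lift_inverse(s, d):
    """Exact reconstruction from the lifting coefficients."""
    out = []
    for si, di in zip(s, d):
        out.extend([si - di / 2, si + di / 2])
    return out

def denoise_block(x, k=1.0):
    """Zero small detail coefficients, with the threshold proportional
    to the block's local standard deviation (1-D hard-threshold sketch)."""
    s, d = haar_lift_forward(x)
    mean = sum(d) / len(d)
    sigma = (sum((v - mean) ** 2 for v in d) / len(d)) ** 0.5
    d = [0.0 if abs(v) < k * sigma else v for v in d]
    return haar_lift_inverse(s, d)
```

Because the lifting scheme is computed in place from neighbouring samples, forward and inverse transforms are cheap and parallelize per block, which is the property the abstract highlights for real-time use.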
Abstract: Classification is an interesting problem in functional
data analysis (FDA), because many scientific and applied problems
end up as classification problems, such as recognition, prediction,
control, decision making, management, etc. Owing to the high
dimensionality and high correlation of functional data (FD), a key
problem is to extract features from FD while keeping its global
characteristics, which strongly affects classification efficiency and
precision. In this paper, a novel automatic method that combines a
Genetic Algorithm (GA) with a classification algorithm to extract
classification features is proposed. In this method, the optimal
features and the classification model are approached step by step
through evolutionary search. Theoretical analysis and experimental
tests show that this method improves classification efficiency,
precision and robustness while using fewer features, and that the
dimension of the extracted classification features can be controlled.
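The GA-plus-classifier loop can be sketched as a GA over binary feature masks whose fitness is supplied by the classification stage. Everything below is a minimal illustration, not the paper's algorithm: the toy fitness, the selection scheme and all parameter values are assumptions.

```python
import random

def ga_feature_select(fitness, n_features, pop_size=20, generations=40,
                      mut_rate=0.05, seed=0):
    """Minimal GA over binary feature masks; fitness(mask) -> higher is better."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                  # keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)             # pick two parents
            cut = rng.randrange(1, n_features)        # one-point crossover
            child = [b ^ (rng.random() < mut_rate)    # bit-flip mutation
                     for b in p1[:cut] + p2[cut:]]
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

# Toy fitness standing in for a classifier score: reward two hypothetical
# informative features (indices 0 and 3), penalise the number selected --
# mirroring "better precision with fewer features".
def toy_fitness(mask):
    return mask[0] + mask[3] - 0.1 * sum(mask)

best = ga_feature_select(toy_fitness, n_features=6)
```

In the paper's setting the fitness would instead be the validation accuracy of the classifier trained on the masked functional features, and the penalty term is how the dimension of the extracted feature set can be controlled.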
Abstract: Water Sensitive Urban Design (WSUD) features are
increasingly used to treat and manage polluted stormwater runoff in
urbanised areas. It is important to monitor and evaluate the
effectiveness of this infrastructure in achieving its intended
performance targets after these features have been constructed and
operated over time. The paper presents the various methods of analysis
used to assess the effectiveness of in-situ WSUD features, such as
on-site visual inspections during operational and non-operational
periods, maintenance audits, and periodic water quality testing. The
results will contribute to a better understanding of the operational
and maintenance needs of in-situ WSUD features and assist in providing
recommendations to better manage life cycle performance.
Abstract: In this paper we focus on event extraction from Tamil
news articles. The system utilizes a scoring scheme for extracting and
grouping event-specific sentences. Using this scoring scheme,
event-specific clustering is performed across multiple documents.
Events are extracted from each document using a scoring scheme based
on a feature score and a condition score. Similarly, event-specific
sentences are clustered from multiple documents using this scoring
scheme.
The proposed system builds the Event Template based on user
specified query. The templates are filled with event specific details
like person, location and timeline extracted from the formed clusters.
The proposed system applies these methodologies to Tamil news
articles that have been enconverted into UNL graphs using a
Tamil-to-UNL enconverter. The main intention of this work is to
generate an event-based template.
Abstract: This paper explores the features of political economy in the dynamics of representative politics in India. Politics is seen as enhancing economic benefits through the acquisition and maintenance of power within a democratic set-up. The system of representation is riddled with competitive populism. Emerging leaders and parties are forced to accommodate their ideologies in coping with competitive politics. Electoral politics and voting behaviour reflect a series of influences mooted by the politicians. Voters are accustomed to expecting benefits out of the state exchequer. The electoral competitors show a changing phase of investment-and-return policy. Every elected representative has to spend and then recover his costs during his tenure. For defeated candidates, even direct cost recovery is not possible; there are indirect means by which they recover their costs. The series of case studies shows the methods of party funding, campaign financing, electoral expenditure, and cost recovery. Regulations could not restrict the level of spending. Several cases of disproportionate accumulation of wealth by politicians reveal that money plays a major part in the electoral process. The political economy of representative politics has hitherto ignored how a politician spends and recovers his costs and multiplies his wealth. To be sure, the acquisition and maintenance of power serves to enhance the wealth of those elected.
Abstract: In this study, three commercial semiconductor devices
were characterized in the laboratory for computed tomography
dosimetry: one photodiode and two phototransistors. Four responses
to irradiation were evaluated: dose linearity, energy dependence,
angular dependence and loss of sensitivity after X-ray exposure. The
results showed that the three devices have a response proportional to
the air kerma; the energy dependence displayed by each device
suggests that individual calibration factors should be applied to each
one; and the angular dependence showed a similar pattern among the
three electronic components. With respect to the fourth parameter
analyzed, one phototransistor had the highest sensitivity; however, it
also showed the greatest loss of sensitivity with accumulated dose.
The photodiode was the device with the lowest sensitivity to
radiation; on the other hand, its loss of sensitivity after irradiation
is negligible. Since high accuracy is a desired feature of a dosimeter,
the photodiode may be the most suitable of the three devices for
dosimetry in tomography. The phototransistors can also be used for
CT dosimetry, but a correction factor would be necessary owing to the
loss of sensitivity with accumulated dose.
Abstract: An application framework provides a reusable
design and implementation for a family of software systems.
Frameworks are introduced to reduce the cost of a product line
(i.e., a family of products that share common features). Software
testing is a time-consuming and costly ongoing activity during the
application software development process. Generating reusable test
cases for framework applications at the framework development stage,
and using those test cases to test parts of a framework application
whenever the framework is reused, reduces application development
time and cost considerably.
Framework Interface Classes (FICs) are classes introduced by
the framework hooks to be implemented at the application
development stage. They can have reusable test cases generated at
the framework development stage and provided with the
framework to test the implementations of the FICs at the
application development stage. In this paper, we conduct a case
study using thirteen applications developed using three
frameworks; one domain oriented and two application oriented.
The results show that, in general, the percentage of the number of
FICs in the applications developed using domain frameworks is, on
average, greater than the percentage of the number of FICs in the
applications developed using application frameworks.
Consequently, the reduction of the application unit testing time
using the reusable test cases generated for domain frameworks is,
in general, greater than the reduction of the application unit testing
time using the reusable test cases generated for application
frameworks.
Abstract: One of the most important issues in multi-criteria decision analysis (MCDA) is to determine the weights of the criteria so that all alternatives can be compared based on the collective performance of the criteria. In this paper, one of the popular methods in data envelopment analysis (DEA), known as common weights (CWs), is used to determine the weights in MCDA. Two frontiers, named the ideal and anti-ideal frontiers, are defined based on two newly proposed CWs models, instead of ideal and anti-ideal alternatives. Ideal and anti-ideal frontiers are more flexible than ideal and anti-ideal alternatives. From the optimal solutions of these two models, the distances of an alternative from the ideal and anti-ideal frontiers are derived. Then, a relative distance is introduced to measure the value of each alternative. The suggested models are linear and remain feasible despite the weight restrictions. An example is presented to explain the method and to compare it with the existing literature.
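The abstract does not give the closed form of the relative distance; a common construction of this kind (an assumption here, in the spirit of TOPSIS-style measures, not necessarily the paper's definition) combines the two frontier distances of alternative i as

```latex
RD_i = \frac{d_i^{-}}{d_i^{+} + d_i^{-}}, \qquad 0 \le RD_i \le 1,
```

where d_i^{+} and d_i^{-} denote the distances of alternative i from the ideal and anti-ideal frontiers respectively, so that an alternative close to the ideal frontier and far from the anti-ideal one receives a value near 1.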
Abstract: After Apple first introduced its smartphone, the iPhone,
in Korea at the end of 2009, the number of Korean smartphone users
increased so rapidly that half of the Korean population had become
smartphone users by February 2012. Currently, smartphones are
positioned as a major digital medium with powerful influence in
Korea, and Koreans are now learning new information, enjoying games
and communicating with other people anytime and anywhere. As
smartphone performance increased, the number of usable services grew,
and adequate GUI development was required to implement the various
functions of smartphones. The strategy of providing familiar
experiences on smartphones, through features based on the functions
of existing media, contributed greatly to the popularization of
smartphones in connection with their iconic GUIs.
The spread of smartphones increased mobile web access. Therefore,
attempts to implement the PC web on the smartphone web are
continuously being made. The mobile web GUI provides familiar
experiences to users through designs that adequately utilize the
smartphone's GUIs. As users have become familiar with smartphone and
mobile web GUIs, PCs, in a reversal of the usual direction of
remediation, are starting to adopt smartphone GUIs.
This study defines this phenomenon as reversed remediation and
reviews cases in which PCs have adopted the characteristics of
smartphone GUIs. For this purpose, the established study issues
are as follows:
· What is reversed remediation?
· What are the characteristics of the smartphone GUI?
· What kind of interrelationship exists between the smartphone and
the PC web site?
This study is meaningful for forecasting future GUI changes through
an understanding of the characteristics of the paradigm shift in PC
and smartphone GUI design. It will also be helpful in establishing
strategies for the development and design of digital devices.
Abstract: Due to its special data structure and manipulation principles, an Object-Oriented Database (OODB) has particular security protection and authorization methods. This paper first introduces the features of the OODB security mechanism and then discusses the authorization checking process of an OODB. The implicit authorization mechanism is based on the subject, object and access hierarchies of the security authorization modes, and it simplifies the authorization mode. In addition, combined with other authorization mechanisms, implicit authorization can protect OODB authorization conveniently and effectively.
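The core of implicit authorization along one of the hierarchies can be sketched in a few lines: an explicit grant on a node of the object hierarchy (e.g. a class) implicitly covers the node's descendants (subclasses and instances), so far fewer entries need to be stored and checked. The schema, node names and rights below are hypothetical illustrations:

```python
class ImplicitAuth:
    """Sketch of implicit authorization over the object hierarchy only
    (the subject and access hierarchies would be handled analogously)."""

    def __init__(self):
        self.parent = {}      # object hierarchy: node -> parent (None = root)
        self.grants = set()   # explicit grants: (subject, node, right)

    def add_node(self, node, parent=None):
        self.parent[node] = parent

    def grant(self, subject, node, right):
        self.grants.add((subject, node, right))

    def check(self, subject, node, right):
        # Climb the object hierarchy: a grant on any ancestor implies access.
        while node is not None:
            if (subject, node, right) in self.grants:
                return True
            node = self.parent[node]
        return False

# Hypothetical schema: class Vehicle -> subclass Car -> instance car_42.
db = ImplicitAuth()
db.add_node("Vehicle")
db.add_node("Car", "Vehicle")
db.add_node("car_42", "Car")
db.grant("alice", "Vehicle", "read")   # one explicit grant covers the subtree
```

A single grant of read on "Vehicle" then implicitly authorizes reads on "Car" and "car_42", which is the simplification of the authorization mode the abstract describes.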
Abstract: In order to take the effects of the higher modes into
account in pushover analysis, several multi-modal pushover procedures
have been presented in recent years. In these methods the responses of
the considered modes are combined by the square-root-of-sum-of-squares
(SRSS) rule, although the application of elastic modal combination
rules in the inelastic phase is no longer valid. In this research the
feasibility of defining an efficient alternative combination method is
investigated. Two steel moment-frame buildings, denoted SAC-9 and
SAC-20, under ten earthquake records are considered. The nonlinear
responses of the structures are estimated by the direct algebraic
combination of the weighted responses of the separate modes. The
weight of each mode is defined so that the resulting combined response
has minimum error with respect to the nonlinear time-history analysis.
A genetic algorithm (GA) is used to minimize the error and optimize
the weight factors. The optimal factors obtained for each mode in the
different cases are compared in order to find unique appropriate
weight factors for each mode in all cases.
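In symbols, with R_n the pushover response of mode n, the abstract contrasts the elastic SRSS rule with the proposed weighted algebraic combination; the squared-error objective written here is an assumed form of the stated minimum-error criterion, not the paper's exact formula:

```latex
R_{\mathrm{SRSS}} = \sqrt{\sum_n R_n^2}, \qquad
R_{\mathrm{comb}} = \sum_n w_n R_n, \qquad
\min_{w}\; e(w) = \bigl( R_{\mathrm{comb}} - R_{\mathrm{NTHA}} \bigr)^2
```

where R_NTHA is the benchmark nonlinear time-history result and the weight factors w_n are the variables optimized by the GA.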
Abstract: Nowadays viruses use polymorphic techniques to mutate their code on each replication, thus evading detection by antivirus software. However, detection by emulation can defeat simple polymorphism; thus metamorphic techniques are used, which thoroughly change the viral code even after decryption. We briefly detail this evolution of virus protection techniques against detection and then study the METAPHOR virus, today's most advanced metamorphic virus.
Abstract: This paper presents the use of a predictive fuzzy logic controller (PFLC) applied to the attitude control system of an agile micro-satellite. In order to reduce the effect of unpredictable time delays and large uncertainties, the algorithm employs predictive control to predict the attitude of the satellite. A comparison of the PFLC and a conventional fuzzy logic controller (FLC) is presented to evaluate the performance of the control system during attitude maneuvers. The two proposed models have been analyzed with the same level of noise and external disturbances. Simulation results demonstrate the feasibility and advantages of the PFLC for the attitude determination and control system (ADCS) of an agile satellite.
Abstract: This paper presents a comparison of two metaheuristic
algorithms, the Genetic Algorithm (GA) and Ant Colony Optimization
(ACO), in producing the Freeman chain code (FCC). The main problem
in representing characters using the FCC is that the length of the
FCC depends on the starting point. Isolated characters, especially
upper-case characters, usually have branches that make the traversal
process difficult. FCC construction using one continuous route has
not been widely explored, which is our motivation for using
population-based metaheuristics. The experimental results show that
the route length obtained with the GA is better than with ACO;
however, ACO requires less computation time than the GA.
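For readers unfamiliar with the representation being optimized, the Freeman chain code itself is easy to sketch: each step between neighbouring pixels is encoded as one of eight direction digits, so the code depends on where along the contour the traversal starts, which is the dependence the metaheuristics exploit. The example path is illustrative:

```python
# 8-connected Freeman directions (image convention: y grows downward):
# 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE
DIRS = {(1, 0): 0, (1, -1): 1, (0, -1): 2, (-1, -1): 3,
        (-1, 0): 4, (-1, 1): 5, (0, 1): 6, (1, 1): 7}

def freeman_chain_code(points):
    """Encode a pixel path as a Freeman chain code. Consecutive points
    must be 8-neighbours; the resulting code (and, for a full character
    route, its length) depends on the chosen starting point."""
    return [DIRS[(x2 - x1, y2 - y1)]
            for (x1, y1), (x2, y2) in zip(points, points[1:])]

# Closed path around a unit square, starting at the top-left pixel:
print(freeman_chain_code([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]))
# -> [0, 6, 4, 2]
```

Starting the same closed path one pixel later yields a rotated code ([6, 4, 2, 0]); for branched characters the choice of starting point and traversal route also changes the total code length, which is the objective the GA and ACO minimize.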
Abstract: Research has suggested that implicit learning tasks
may rely on episodic processing to generate above-chance
performance on standard classification tasks. The current
research examines the invariant features task (McGeorge and Burton,
1990) and argues that such episodic processing is indeed important.
The results of the experiment suggest that both rejection and
similarity strategies are used by participants in this task, to
reject unfamiliar items and to (falsely) accept familiar items.
These decisions are based primarily on the presence of low- or
high-frequency goal-based features of the stimuli presented in the
incidental learning phase. It is proposed that a goal-based analysis
of the incidental learning task provides a simple step towards
understanding which features of episodic processing are most
important for explaining the match between incidental, implicit
learning and test performance.