Abstract: pH-sensitive drug targeting using nanoparticles for
cancer chemotherapy has attracted attention in recent decades. A
graft copolymer composed of poly(L-histidine) (PHS) and dextran
(DexPHS) was synthesized, and pH-sensitive nanoparticles were
fabricated for pH-responsive delivery of doxorubicin (DOX).
DexPHS nanoparticles showed pH-sensitive changes in particle
size and drug release behavior: particle size and drug release rate
increased at acidic pH, indicating that DexPHS nanoparticles
have potential for pH-sensitive drug delivery. The antitumor
activity of DOX-incorporated DexPHS nanoparticles was studied
using CT26 colorectal carcinoma cells. Fluorescence intensity was
higher at acidic pH than at basic pH. These results indicate that
DexPHS nanoparticles are capable of pH-responsive drug
targeting.
Abstract: In this paper, we present optimal control for
movement and trajectory planning of a four degree-of-freedom
(4-DOF) robot arm using Fuzzy Logic (FL) and Genetic Algorithms
(GAs). Uncertainties such as movement, friction, and settling
time in the arm's motion have been compensated using FL and GAs.
The development of a fuzzy-genetic optimization algorithm is
presented and discussed, and results obtained with a GA alone
are compared with those of the fuzzy GA. The genetic algorithm
described here is designed to optimize robot movement and
trajectory. Although presented for a 4-DOF arm, the model is a
general one for redundant structures and could represent any
n-link structure. The result is a complete trajectory-planning
scheme combining fuzzy logic and genetic algorithms,
demonstrating the flexibility of this artificial intelligence
technique.
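The abstract does not detail the GA itself; as a hedged, minimal sketch of the kind of genetic algorithm that could optimize a 4-DOF arm configuration, the toy below evolves four joint angles of a planar unit-link chain toward a target end-effector position. The chain, target, and GA settings are illustrative assumptions, not the paper's setup.

```python
import math
import random

random.seed(0)

LINKS = [1.0, 1.0, 1.0, 1.0]   # assumed planar 4-link arm, unit link lengths
TARGET = (2.0, 2.0)            # illustrative end-effector target

def end_effector(angles):
    """Forward kinematics of the planar chain (cumulative joint angles)."""
    x = y = theta = 0.0
    for link, a in zip(LINKS, angles):
        theta += a
        x += link * math.cos(theta)
        y += link * math.sin(theta)
    return x, y

def fitness(angles):
    """Distance from the end effector to the target; lower is better."""
    x, y = end_effector(angles)
    return math.hypot(x - TARGET[0], y - TARGET[1])

def genetic_algorithm(pop_size=40, generations=120, mutation=0.2):
    pop = [[random.uniform(-math.pi, math.pi) for _ in range(4)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 4)       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation:     # Gaussian mutation of one gene
                i = random.randrange(4)
                child[i] += random.gauss(0.0, 0.1)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = genetic_algorithm()
```

A real controller would add joint limits and the fuzzy compensation terms from the paper; those are omitted here.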
Abstract: With the rapid growth in business size, today's businesses are orienting towards electronic technologies. Amazon.com and e-bay.com are some of the major stakeholders in this regard. Unfortunately, the enormous amount of largely unstructured data on the web, even for a single commodity, has become a cause of ambiguity for consumers. Extracting valuable information from such ever-increasing data is an extremely tedious task and is fast becoming critical to the success of businesses. Web content mining can play a major role in solving these issues. It involves using efficient algorithmic techniques to search and retrieve the desired information from seemingly impossible-to-search unstructured data on the Internet. Applications of web content mining are very promising in the areas of customer relations modeling, billing records, logistics investigations, product cataloguing, and quality management. In this paper we present a review of some interesting, efficient, yet implementable techniques from the field of web content mining and study their impact in areas specific to business user needs, focusing on both the customer and the producer. The techniques reviewed include mining by developing a knowledge-base repository of the domain, iterative refinement of user queries for personalized search, a graph-based approach to the development of a web crawler, and filtering information for personalized search using website captions. These techniques have been analyzed and compared on the basis of their execution time and the relevance of the results they produced for a particular search.
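Among the techniques reviewed above, the graph-based crawler is the most directly sketchable. In this hedged toy version the link structure and page text are held in an in-memory graph rather than fetched from the Internet; all page names and contents are invented for illustration.

```python
from collections import deque

# Toy web: pages, their out-links, and their (tiny) textual content.
LINK_GRAPH = {
    "home": ["catalog", "about"],
    "catalog": ["camera", "laptop"],
    "camera": ["reviews"],
    "laptop": [],
    "about": [],
    "reviews": [],
}
CONTENT = {
    "home": "welcome to the store",
    "catalog": "browse products",
    "camera": "digital camera best price",
    "laptop": "laptop deals",
    "about": "company history",
    "reviews": "camera reviews by customers",
}

def crawl(start, query):
    """Breadth-first traversal of the link graph, collecting pages
    whose content matches the query, in the order they are reached."""
    seen, hits = {start}, []
    queue = deque([start])
    while queue:
        page = queue.popleft()
        if query in CONTENT[page]:
            hits.append(page)
        for nxt in LINK_GRAPH[page]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return hits
```

A production crawler would add politeness delays, URL normalization, and relevance ranking; the graph traversal core is the part the review highlights.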
Abstract: Cognizant of the fact that enterprise systems involve
organizational change and that their implementation is overshadowed
by a high failure rate, it is argued that attention needs to be
focused on employees' perceptions of such organizational change
when explaining adoption behavior of enterprise systems. For this
purpose, the research incorporates a conceptual construct of
attitude toward change that captures views about the need for
organizational change. Centered on this conceptual construct, the
research model includes beliefs regarding the system and behavioral
intention as its consequences, and the personal characteristics of
organizational commitment and perceived personal competence as its
antecedents. Structural equation analysis using LISREL provides
significant support for the proposed relationships. Theoretical and
practical implications are discussed along with limitations.
Abstract: A mathematical model of surface roughness has been
developed using response surface methodology (RSM) for the
grinding of AISI D2 cold work tool steel. Analysis of variance
(ANOVA) was used to check the validity of the model. Low and
high values for work speed and feed rate were chosen from the
design of experiments. The influence of all machining parameters
on surface roughness has been analyzed based on the developed
mathematical model. The developed prediction equation shows that
feed rate and work speed are the most important factors
influencing surface roughness. Surface roughness was found to be
lowest with the use of low feed rate and low work speed. The
accuracy of the best model was verified with the testing data.
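As a hedged illustration of how such a prediction equation is obtained, the sketch below fits a first-order response-surface model Ra = b0 + b1·feed + b2·speed by ordinary least squares (normal equations). The data points are invented for illustration, not the paper's measurements, and the paper's model may include higher-order terms.

```python
DATA = [  # (feed rate, work speed, measured roughness Ra) -- illustrative
    (0.01, 10, 0.42), (0.03, 10, 0.55), (0.01, 20, 0.48),
    (0.03, 20, 0.62), (0.02, 15, 0.52), (0.02, 15, 0.51),
]

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0] * 3
    for i in range(2, -1, -1):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

# Normal equations X^T X b = X^T y for the model matrix X = [1, feed, speed].
rows = [(1.0, f, v) for f, v, _ in DATA]
y = [ra for _, _, ra in DATA]
XtX = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(3)]
b0, b1, b2 = solve3(XtX, Xty)

def predict(feed, speed):
    return b0 + b1 * feed + b2 * speed
```

Positive fitted coefficients for feed and speed mirror the abstract's conclusion that low feed rate and low work speed give the lowest roughness.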
Abstract: Sputter deposition processes, especially sputtering from
metal targets, are well investigated. For practical reasons, i.e.
for industrial processes, energetic considerations for sputter
deposition are useful in order to optimize the sputtering process.
In particular, for substrates at floating conditions it is
necessary to obtain energetic conditions during film growth that
enable sufficiently dense metal films of good quality. The
influence of ion energy, energy density, and momentum transfer is
thus examined both for sputtering at the target and during film
growth. Different regimes dominated by ion energy, energy density,
and momentum transfer were identified by using different plasma
sources and by varying power input, pressure, and bias voltage.
Abstract: As the world moves towards the adoption of Performance Based Engineering philosophies in the seismic design of civil engineering structures, new seismic design provisions require structural engineers to perform both static and dynamic analysis for the design of structures. While linear Equivalent Static Analysis is performed for regular buildings up to 90 m in height in zones I and II, Dynamic Analysis should be performed for regular and irregular buildings in zones IV and V. Dynamic Analysis can take the form of a dynamic Time History Analysis or a linear Response Spectrum Analysis. In the present study, multi-storey irregular buildings with 20 stories have been modeled using the software packages ETABS and SAP2000 v15 for seismic zone V in India. This paper also deals with the effect of the variation of building height on the structural response of the shear wall building. Dynamic responses of the building under actual earthquakes, EL-CENTRO 1949 and CHI-CHI Taiwan 1999, have been investigated. This paper highlights the accuracy and exactness of Time History Analysis in comparison with the more commonly adopted Response Spectrum Analysis and Equivalent Static Analysis.
Abstract: Considering the toxicity of heavy metals and their
accumulation in domestic wastes, the immobilization of lead and
cadmium inside glass-ceramics is envisaged. This work focuses in
particular on calcium-rich phases embedded in a glassy matrix.
Glass-ceramics were synthesized from glasses doped with
12 wt% and 16 wt% of PbO or CdO. They were observed and
analyzed by Electron MicroProbe Analysis (EMPA) and
Analytical Scanning Electron Microscopy (ASEM). Structural
characterization of the samples was performed by powder X-ray
diffraction.
Diopside crystals of CaMgSi2O6 composition are shown to
incorporate significant amounts of cadmium (up to 9 wt% of
CdO). Two new crystalline phases are observed with very
high Cd or Pb contents: about 40 wt% CdO for the cadmium-rich
phase and nearly 60 wt% PbO for the lead-rich phase. We
present a complete chemical and structural characterization of
these phases. They represent a promising route for the
immobilization of toxic elements such as Cd or Pb, since
glass-ceramics are known to provide a "double barrier" protection
(metal-rich crystals embedded in a glass matrix) against metal
release into the environment.
Abstract: Gas turbine inlet air cooling is a useful method for
increasing output in regions where significant power demand and
the highest electricity prices occur during the warm months. Inlet
air cooling increases the power output by taking advantage of the
gas turbine's higher mass flow rate when the compressor inlet
temperature decreases. Different methods are available for
reducing gas turbine inlet temperature, and two basic systems are
currently available for inlet cooling. The first and most
cost-effective system is evaporative cooling, in which evaporative
coolers make use of the evaporation of water to reduce the gas
turbine's inlet air temperature. The second system employs various
ways to chill the inlet air; in this method, the cooling medium
flows through a heat exchanger located in the inlet duct to remove
heat from the inlet air. However, evaporative cooling is limited
by the wet-bulb temperature, whereas chilling can cool the inlet
air to temperatures lower than the wet-bulb temperature. In the
present work, a thermodynamic model of a gas turbine is built to
calculate heat rate, power output, and thermal efficiency at
different inlet air temperature conditions. Computational results
are compared with ISO conditions, herein called the "base case".
The two cooling methods are then implemented and solved for
different inlet conditions (inlet temperature and relative
humidity). Results for the evaporative cooler and absorption
chiller systems show that when the ambient temperature is
extremely high with low relative humidity (requiring a large
temperature reduction), the chiller is the more suitable cooling
solution. The net increase in power output as a function of the
temperature decrease is also obtained for each cooling method.
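The wet-bulb limit mentioned above can be sketched with the standard evaporative-cooler effectiveness relation, T_out = T_in − ε·(T_in − T_wb). The 0.85 effectiveness, the temperatures, and the simple density-based power estimate below are illustrative assumptions, not values from the paper's thermodynamic model.

```python
def evap_cooler_exit_temp(t_dry, t_wet, effectiveness=0.85):
    """Exit temperature of an evaporative cooler: the air approaches the
    wet-bulb temperature according to the cooler effectiveness."""
    return t_dry - effectiveness * (t_dry - t_wet)

def relative_power(t_in_c, t_ref_c=15.0):
    """Rough power ratio vs. the ISO reference inlet temperature: at fixed
    pressure, inlet air density (and hence mass flow) scales with 1/T."""
    return (t_ref_c + 273.15) / (t_in_c + 273.15)

# Hot, fairly dry day: 40 C dry-bulb, 22 C wet-bulb.
t_out = evap_cooler_exit_temp(40.0, 22.0)   # 24.7 C
gain = relative_power(t_out) / relative_power(40.0)
```

The cooler cannot go below 22 °C here, which is why the abstract notes that a chiller suits cases needing a larger temperature reduction.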
Abstract: In recent years, tuned mass damper (TMD) control systems for civil engineering structures have attracted considerable attention. This paper emphasizes the application of particle swarm optimization (PSO) to design and optimize the parameters of the TMD control scheme for achieving the best results in the reduction of the building response under earthquake excitations. The Integral of the Time multiplied Absolute value of the Error (ITAE), based on the relative displacement of all floors in the building, is taken as the performance index of the optimization criterion. The problem of robust TMD controller design is formulated as an optimization problem based on the ITAE performance index, to be solved using the PSO technique, which has a strong ability to find near-optimal results. An 11-story realistic building, located in the city of Rasht, Iran, is considered as a test system to demonstrate the effectiveness of the proposed method. Analysis of the results through time-domain simulation and several performance indices reveals that the designed PSO-based TMD controller has an excellent capability to reduce the response of the seismically excited example building.
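A hedged sketch of the PSO loop used to tune TMD parameters follows. In the paper the objective is the ITAE index computed from a time-domain simulation of the building; here a toy quadratic with a known optimum (illustrative mass ratio, frequency ratio, and damping ratio) stands in for that expensive simulation.

```python
import random

random.seed(1)

OPT = (0.02, 0.98, 0.10)  # assumed "true best" (mass, freq, damping ratios)

def itae_surrogate(p):
    """Stand-in for the ITAE index: minimal at OPT."""
    return sum((a - b) ** 2 for a, b in zip(p, OPT))

def pso(obj, dim=3, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Standard global-best PSO with inertia weight w and
    cognitive/social coefficients c1, c2."""
    pos = [[random.uniform(0, 1.2) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=obj)
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if obj(pos[i]) < obj(pbest[i]):
                pbest[i] = pos[i][:]
                if obj(pbest[i]) < obj(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso(itae_surrogate)
```

Swapping `itae_surrogate` for a structural simulation returning the ITAE of floor displacements recovers the paper's setup.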
Abstract: Palm shell obtained from the coastal part of southern
India was studied for the removal of Hg(II) ions by adsorption.
Batch adsorption experiments were carried out as a function of pH,
concentration of Hg(II) ions, time, temperature, and adsorbent
dose. Maximum removal was seen in the range pH 4.0-7.0. The palm
shell powder used as adsorbent was characterized for its surface
area, SEM, PXRD, FTIR, ion exchange capacity, moisture content,
bulk density, soluble content in water and acid, and pH. The
experimental results were analyzed using Langmuir I, II, III, and
IV and Freundlich adsorption isotherms. The batch sorption
kinetics were studied for the first-order reversible reaction, the
pseudo-first-order and pseudo-second-order reactions, and the
intra-particle diffusion model. The biomass was successfully used
for the removal of Hg(II) from synthetic and industrial effluents,
and the technique appears industrially applicable and viable.
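As a hedged illustration of the isotherm analysis mentioned above, the sketch below fits the Langmuir isotherm in its common linear form Ce/qe = 1/(qm·KL) + Ce/qm. The equilibrium data are synthetic, generated with qm = 50 mg/g and KL = 0.1 L/mg; they are not the paper's measurements.

```python
# Synthetic equilibrium data from a known Langmuir model (illustrative).
CE = [5.0, 10.0, 20.0, 40.0, 80.0]                 # equilibrium conc., mg/L
QE = [50 * 0.1 * c / (1 + 0.1 * c) for c in CE]    # uptake, mg/g

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Linearized Langmuir plot: Ce/qe versus Ce.
slope, intercept = linear_fit(CE, [c / q for c, q in zip(CE, QE)])
qm = 1 / slope            # maximum adsorption capacity (mg/g)
kl = slope / intercept    # Langmuir constant (L/mg)
```

The same `linear_fit` applied to log-log data would yield the Freundlich parameters.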
Abstract: In this study, a new root-finding method for solving nonlinear equations is proposed. This method requires two starting values that do not necessarily bracket a root. However, when the starting values are selected to be close to a root, the proposed method converges to the root more quickly than the secant method. Another advantage over other iterative methods is that the proposed method usually converges to two distinct roots when the given function has more than one root; that is, the odd iterations of this new technique converge to one root and the even iterations converge to another. Some numerical examples, including a sine-polynomial equation, are solved using the proposed method and compared with results obtained by the secant method; excellent agreement is found.
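The abstract specifies the benchmark but not the new method's iteration formula, so only the secant baseline can be sketched here; the test equation sin(x) = x/2 is an illustrative sine-polynomial, not necessarily the paper's.

```python
import math

def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Secant method: the baseline the paper compares against.
    Uses two starting values that need not bracket a root."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 - f0 == 0:           # flat secant line; stop
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# A sine-polynomial equation: sin(x) - x/2 = 0, nonzero root near 1.895.
root = secant(lambda x: math.sin(x) - 0.5 * x, 1.5, 2.5)
```

Starting values on either side of the root are not required, matching the property the abstract claims for both methods.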
Abstract: A hybrid-feature-based adaptive particle filter algorithm is presented for object tracking in real scenarios with a static camera.
The hybrid feature combines two effective features: the Grayscale Arranging Pairs (GAP) feature and the color histogram feature. The GAP feature has high discriminative ability even under conditions of severe illumination variation and dynamic background
elements, while the color histogram feature has high reliability in identifying detected objects. The combination of the two features compensates for the shortcomings of each single feature. Furthermore, we adopt a
target model updating scheme so that external problems such as changes in visual angle can be overcome. An automatic initialization algorithm is introduced which provides precise initial positions of objects. The experimental
results show the good performance of the proposed method.
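As a hedged toy of the particle filter core, the sketch below tracks a 1-D position from noisy observations with a bootstrap filter. The paper's likelihood combines GAP and color-histogram scores over image patches; a simple Gaussian observation likelihood stands in for that hybrid feature here.

```python
import math
import random

random.seed(2)

def particle_filter(observations, n=500, motion_std=1.0, obs_std=1.0):
    """Bootstrap particle filter: predict, weight, estimate, resample."""
    particles = [random.uniform(-10, 10) for _ in range(n)]
    estimates = []
    for z in observations:
        # Predict: diffuse particles with a random-walk motion model.
        particles = [p + random.gauss(0, motion_std) for p in particles]
        # Weight: likelihood of the observation under each particle.
        weights = [math.exp(-0.5 * ((z - p) / obs_std) ** 2)
                   for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Estimate: posterior mean.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # Resample (multinomial) to concentrate particles on likely states.
        particles = random.choices(particles, weights=weights, k=n)
    return estimates

true_path = [0.1 * t for t in range(30)]                 # slowly moving target
obs = [x + random.gauss(0, 0.5) for x in true_path]       # noisy measurements
est = particle_filter(obs)
```

The paper's 2-D image tracker replaces the scalar state with a bounding-box state and the Gaussian likelihood with the hybrid GAP/color score.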
Abstract: The literature reports a large number of approaches for
measuring the similarity between protein sequences. Most of these
approaches estimate this similarity using alignment-based techniques
that do not necessarily yield biologically plausible results, for two
reasons.
First, for the case of non-alignable (i.e., not yet definitively aligned
and biologically approved) sequences such as multi-domain, circular
permutation and tandem repeat protein sequences, alignment-based
approaches do not succeed in producing biologically plausible results.
This is due to the nature of the alignment, which is based on the
matching of subsequences in equivalent positions, while non-alignable
proteins often have similar and conserved domains in non-equivalent
positions.
Second, the alignment-based approaches lead to similarity measures
that depend heavily on the parameters set by the user for the alignment
(e.g., gap penalties and substitution matrices). For easily alignable
protein sequences, it is possible to supply a suitable combination of
input parameters that allows such an approach to yield biologically
plausible results. However, for difficult-to-align protein sequences,
supplying different combinations of input parameters yields different
results. Such variable results create ambiguities and complicate the
similarity measurement task.
To overcome these drawbacks, this paper describes a novel and
effective approach for measuring the similarity between protein
sequences, called SAF for Substitution and Alignment Free. Without
resorting either to the alignment of protein sequences or to substitution
relations between amino acids, SAF is able to efficiently detect the
significant subsequences that best represent the intrinsic properties of
protein sequences, those underlying the chronological dependencies of
structural features and biochemical activities of protein sequences.
Moreover, by using a new efficient subsequence matching scheme,
SAF more efficiently handles protein sequences that contain similar
structural features with significant meaning in chronologically
non-equivalent positions. To show the effectiveness of SAF, extensive
experiments were performed on protein datasets from different
databases, and the results were compared with those obtained by
several mainstream algorithms.
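The abstract does not give SAF's internal matching scheme; as a hedged illustration of the general alignment-free idea it describes, here is a classic k-mer (subsequence) spectrum similarity. Sequences sharing conserved subsequences score highly regardless of the positions at which those subsequences occur, which is exactly the non-equivalent-position case where alignment struggles. The toy sequences are invented.

```python
import math
from collections import Counter

def kmer_counts(seq, k=3):
    """Count every length-k subsequence of the sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine_similarity(a, b, k=3):
    """Cosine similarity between k-mer count vectors: no alignment,
    no substitution matrix, no gap penalties."""
    ca, cb = kmer_counts(a, k), kmer_counts(b, k)
    dot = sum(ca[m] * cb[m] for m in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb)

# The same two "domains" in swapped (non-equivalent) positions
# still yield a high similarity score.
s1 = "MKTAYIAKQR" + "GDSFGHIK"
s2 = "GDSFGHIK" + "MKTAYIAKQR"
```

SAF's actual subsequence selection is more sophisticated; the point of the sketch is only the position-independence of subsequence-based measures.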
Abstract: E-travel refers to travel agency companies employing the Internet and websites in an e-commerce context. This study presents a number of initial key factors of an electronic travel model from the perspective of small travel agencies. A review of previous studies related to travel website activities was conducted, and five small travel agencies in Indonesia were interviewed in depth as case studies. The findings of this research identify a number of characteristics and dimension factors of travel website operations, including owner-manager roles, business experience, business characteristics, and technological aspects. This study is preliminary research on travel website adoption in Indonesia; a further study would be conducted using questionnaires in quantitative research in the context of Indonesia as a developing country.
Abstract: With data centers, end-users can realize the pervasiveness of services that will one day be the cornerstone of our lives. However, data centers are often classified as the computing systems that consume the largest amounts of power. To circumvent this problem, we propose a self-adaptive weighted sum methodology that jointly optimizes the performance and power consumption of any given data center. Compared to traditional methodologies for multi-objective optimization problems, the proposed self-adaptive weighted sum technique does not rely on a systematic change of weights during the optimization procedure. The proposed technique is compared with the greedy and LR heuristics for large-scale problems, and with the optimal solution, implemented in LINDO, for small-scale problems. The experimental results reveal that the proposed self-adaptive weighted sum technique outperforms both heuristics and achieves competitive performance compared to the optimal solution.
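A hedged sketch of the underlying weighted-sum scalarization follows: two objectives (performance and power) are normalized and combined into one score, and the best candidate configuration is picked. The candidate configurations and their numbers are invented for illustration, and the paper's self-adaptive weight update is not specified in the abstract, so fixed weights are used here.

```python
# Illustrative data-center configurations: name -> (performance, power in W).
CONFIGS = {
    "all-on":    (100.0, 900.0),
    "balanced":  ( 80.0, 500.0),
    "low-power": ( 40.0, 250.0),
}

def weighted_sum_choice(configs, w_perf=0.5, w_power=0.5):
    """Scalarize the two objectives (maximize performance, minimize power)
    into one normalized cost and return the best configuration."""
    max_perf = max(p for p, _ in configs.values())
    max_power = max(q for _, q in configs.values())
    def score(item):                       # lower is better
        perf, power = item[1]
        return (w_perf * (1 - perf / max_perf)
                + w_power * (power / max_power))
    return min(configs.items(), key=score)[0]
```

Sweeping the weights traces out different trade-off points; the paper's contribution is adapting them automatically rather than by systematic sweep.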
Abstract: In this paper we study the effect of scene changes on an
image sequence coding system using the Embedded Zerotree Wavelet
(EZW) algorithm. The scene change considered here is the full
motion that may occur. A special image sequence is generated in
which scene changes occur randomly. Two scenarios are considered.
In the first scenario, the system must provide the best possible
reconstruction quality by managing the bit rate (BR) when a scene
change occurs. In the second scenario, the system must keep the
bit rate as constant as possible by managing the reconstruction
quality. The first scenario may be motivated by the availability
of a wideband transmission channel, where an increase in bit rate
is possible to keep the reconstruction quality above a given
threshold. The second scenario concerns narrowband transmission
channels, where an increase in bit rate is not possible; in this
case, applications for which reconstruction quality is not a
constraint may be considered. The simulations are performed with a
five-scale wavelet decomposition using the biorthogonal 9/7-tap
filter bank. The entropy coding is performed using a specifically
defined binary codebook and the EZW algorithm. Experimental
results are presented and compared to LEAD H263 EVAL. It is shown
that if reconstruction quality is the constraint, the system
increases the bit rate to obtain the required quality. When the
bit rate must be constant, the system is unable to provide the
required quality if a scene change occurs; however, it is able to
improve the quality once the scene change has passed.
Abstract: The MATCH project [1] entails the development of an
automatic diagnosis system that aims to support the treatment of
colon cancer by discovering mutations that occur in tumour
suppressor genes (TSGs) and contribute to the development of
cancerous tumours. The system is based on a) colon cancer clinical
data and b) biological information derived by data mining
techniques from genomic and proteomic sources. The core mining
module will consist of popular, well-tested hybrid feature
extraction methods and new combined algorithms designed especially
for the project. Elements of rough sets, evolutionary computing,
cluster analysis, self-organizing maps, and association rules will
be used to discover the associations between genes and their
influence on tumours [2]-[11].
The methods used to process the data have to address its high
complexity, potential inconsistency, and the problem of missing
values. They must integrate all the useful information necessary
to answer the expert's question. For this purpose, the system has
to learn from data, or allow a domain specialist to interactively
specify, the part of the knowledge structure it needs to answer a
given query. The program should also take into account the
importance/rank of the particular parts of the data it analyses,
and adjust the algorithms used accordingly.
Abstract: The literature reveals that many investors rely on technical trading rules when making investment decisions. If stock markets are efficient, one cannot achieve superior results by using these trading rules. However, if market inefficiencies are present, profitable opportunities may arise. The aim of this study is to investigate the effectiveness of technical trading rules in 34 emerging stock markets. The performance of the rules is evaluated by utilizing White's Reality Check and the Superior Predictive Ability test of Hansen, along with an adjustment for transaction costs. These tests evaluate whether the best model performs better than a buy-and-hold benchmark, and they address data snooping problems, which is essential to obtain unbiased outcomes. Based on our results, we conclude that technical trading rules are not able to outperform a naïve buy-and-hold benchmark on a consistent basis. However, we do find significant trading rule profits in 4 of the 34 investigated markets. We also present evidence that technical analysis is more profitable in crisis situations, although this result is relatively weak.
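As a hedged toy of one common technical rule, the sketch below backtests a short/long moving-average crossover against buy-and-hold on a synthetic price series. Transaction costs and the data-snooping corrections that are central to the paper are deliberately omitted, and the invented prices carry no evidence about real markets.

```python
def sma(prices, n, t):
    """Simple moving average of the n prices ending at index t."""
    return sum(prices[t - n + 1:t + 1]) / n

def ma_crossover_return(prices, short=3, long=6):
    """Hold the asset on day t only when yesterday's short MA was above
    yesterday's long MA; otherwise stay in cash (zero return)."""
    wealth = 1.0
    for t in range(long, len(prices)):
        if sma(prices, short, t - 1) > sma(prices, long, t - 1):
            wealth *= prices[t] / prices[t - 1]
    return wealth

def buy_and_hold_return(prices):
    return prices[-1] / prices[0]

prices = [100, 102, 101, 105, 107, 110, 108, 112, 115, 113, 118, 120]
```

On this particular invented series buy-and-hold comes out ahead, mirroring the paper's headline finding, but that is an artifact of the toy data rather than evidence.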
Abstract: The cancelable palmprint authentication system proposed
in this paper is specifically designed to overcome the limitations
of contemporary biometric authentication systems. In the proposed
system, geometric and pseudo-Zernike moments are employed as
feature extractors to transform the palmprint image into a
lower-dimensional, compact feature representation. Before moment
computation, a wavelet transform is adopted to decompose the
palmprint image into lower-resolution, lower-dimensional frequency
subbands, which drastically reduces the computational load of
moment calculation. The generated wavelet-moment-based feature
representation is used with a set of random data to generate a
cancelable verification key. This private binary key can be
canceled and replaced. In addition, the key possesses high
tolerance to data capture offsets, with highly correlated bit
strings for the intra-class population. This property allows a
clear separation of the genuine and imposter populations, as well
as the achievement of zero Equal Error Rate, which is hardly
attained in conventional biometric authentication systems.
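Of the two moment families named above, the geometric (raw) moments are simple enough to sketch: m_pq = Σ_x Σ_y x^p y^q I(x, y). Pseudo-Zernike moments need considerably more machinery and are not shown. The tiny "image" below is illustrative.

```python
def raw_moment(img, p, q):
    """Raw geometric moment m_pq of a grayscale image given as a
    list of rows: sum of x^p * y^q * intensity over all pixels."""
    return sum((x ** p) * (y ** q) * val
               for y, row in enumerate(img)
               for x, val in enumerate(row))

def centroid(img):
    """Image centroid (m10/m00, m01/m00), the basis for the
    translation-invariant central moments."""
    m00 = raw_moment(img, 0, 0)
    return raw_moment(img, 1, 0) / m00, raw_moment(img, 0, 1) / m00

IMG = [
    [0, 1, 0],
    [1, 2, 1],
    [0, 1, 0],
]
cx, cy = centroid(IMG)  # symmetric image, so the centroid is the center
```

In the paper these moments are computed not on the raw palmprint but on its wavelet subbands, which is what cuts the computational load.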