Abstract: Chinese idioms are a type of traditional Chinese
idiomatic expression with specific meanings and a stereotyped
structure; they are widely used in classical Chinese and are
still common in vernacular written and spoken Chinese today.
Currently, Chinese idioms are retrieved from glossaries by key
character or key word through morphology or pronunciation
indexes, which cannot meet the need for semantic search. OCIRS
is proposed to find a desired idiom when the user knows only
its meaning, without any key character or key word. The user's
request, given as a sentence or phrase, is first analyzed
grammatically by word segmentation, key word extraction and
semantic similarity computation, and can thus be mapped to the
idiom domain ontology, which is constructed to provide ample
semantic relations and to facilitate description logics-based
reasoning for idiom retrieval. The experimental evaluation
shows that OCIRS realizes the function of searching for idioms
via semantics, obtaining preliminary achievements as requested
by users.
Abstract: A new reverse-phase high-performance liquid chromatography (RP-HPLC) method with a fluorescence detector (FLD) was developed and optimized for norfloxacin determination in human plasma. The mobile phase composition, extraction method, and excitation and emission wavelengths were varied for optimization. The HPLC system contained a reverse-phase C18 (5 μm, 4.6 mm×150 mm) column with the FLD operated at 330 nm excitation and 440 nm emission. The optimized mobile phase consisted of 14% acetonitrile in buffer solution. The aqueous phase, prepared by mixing 2 g of citric acid, 2 g of sodium acetate and 1 mL of triethylamine in 1 L of Milli-Q water, was run at a flow rate of 1.2 mL/min. The standard curve was linear over the range tested (0.156–20 μg/mL), with a coefficient of determination of 0.9978. Aceclofenac sodium was used as the internal standard. A detection limit of 0.078 μg/mL was achieved. The run time was set at 10 minutes; the retention time of norfloxacin was 0.99 min, which shows the rapidity of this method of analysis. The present assay showed good accuracy, precision and sensitivity for norfloxacin determination in human plasma with a new internal standard, and can be applied to the pharmacokinetic evaluation of norfloxacin tablets after oral administration in humans.
Abstract: This paper presents a particle swarm optimization
algorithm with particle reduction for global optimization
problems. Particle swarm optimization is an algorithm inspired
by collective motion, such as that of bird flocks or fish
schools, and a multi-point search algorithm which finds a best
solution using multiple particles. Particle swarm optimization
is flexible enough to adapt to a wide range of optimization
problems. When an objective function has many intricate local
minima, a particle may fall into one of them. To avoid local
minima, a large number of particles is initially prepared and
their positions are updated by particle swarm optimization.
The particles are then sequentially reduced, based on their
evaluation values, until a predetermined number remains, and
particle swarm optimization continues until the termination
condition is met. To show the effectiveness of the proposed
algorithm, we examine the minima found on test functions in
comparison with existing algorithms. Furthermore, the
influence of the initial number of particles on the best value
found by our algorithm is discussed.
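The update-then-reduce loop described above can be sketched as follows. This is a minimal illustration under assumed parameter values (inertia weight, acceleration coefficients, one particle discarded per iteration), not the authors' exact implementation:

```python
import random

def pso_with_reduction(f, dim, n_init=40, n_final=10, iters=100,
                       w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0, seed=0):
    """Sketch: start with many particles, update them with standard PSO,
    and drop the worst particle each iteration until n_final remain."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_init)]
    vel = [[0.0] * dim for _ in range(n_init)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_init), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(len(pos)):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
        # reduction step: remove the worst particle by evaluation value
        if len(pos) > n_final:
            worst = max(range(len(pos)), key=lambda i: pbest_val[i])
            for lst in (pos, vel, pbest, pbest_val):
                lst.pop(worst)
    return gbest, gbest_val
```

On a simple benchmark such as the sphere function, the swarm shrinks from `n_init` to `n_final` particles while the global best keeps improving.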
Abstract: The image segmentation method described in this
paper has been developed as a pre-processing stage to be used in
methodologies and tools for video/image indexing and retrieval by
content. This method solves the problem of extracting whole objects
from the background and produces images of single complete objects
from videos or photos. The extracted images are used for calculating
the object visual features necessary for both indexing and retrieval
processes.
The segmentation algorithm is based on the cooperation among an
optical flow evaluation method, edge detection and region growing
procedures. The optical flow estimator belongs to the class of
differential methods. It can detect motions ranging from a
fraction of a pixel to a few pixels per frame, achieves good
results in the presence of noise without the need for a
pre-filtering stage, and includes a specialised model for
moving object detection.
The first task of the presented method exploits cues from
motion analysis to detect moving areas. Objects and the
background are then refined using edge detection and seeded
region growing procedures, respectively. All the tasks are iteratively performed until
objects and background are completely resolved.
The method has been applied to a variety of indoor and outdoor
scenes where objects of different type and shape are represented on
variously textured background.
Abstract: This article describes the development of a
controlled-release system of the NSAID drug diclofenac
sodium employing different ratios of ethyl cellulose.
Diclofenac sodium and ethyl cellulose in different proportions
were processed by microencapsulation based on a phase
separation technique to formulate microcapsules. The
prepared microcapsules were then compressed into tablets to
obtain controlled-release oral formulations. In-vitro
evaluation was performed by a dissolution test of each
preparation, conducted in 900 ml of phosphate buffer solution
of pH 7.2 maintained at 37 ± 0.5 °C and stirred at 50 rpm,
with samples collected at predetermined time intervals (0,
0.5, 1.0, 1.5, 2, 3, 4, 6, 8, 10, 12, 16, 20 and 24 hrs).
The drug concentration in the collected samples was
determined by UV spectrophotometry at 276 nm. The physical
characteristics of the diclofenac sodium microcapsules were
within the accepted range: they were off-white, free flowing
and spherical in shape. The release of diclofenac sodium from
the microcapsules was governed by the proportion of
ethylcellulose and the coat thickness. The in-vitro release
pattern showed that with drug:polymer ratios of 1:1 and 1:2,
the percentage of drug released in the first hour was 16.91%
and 11.52%, respectively, compared with only 6.87% for the
1:3 ratio within the same time. The release followed the
Higuchi model. The tablet formulation (F2) of the present
study was found comparable in release profile to the marketed
brand Phlogin-SR, and the microcapsules showed extended
release beyond 24 h. Further, a good correlation was found
between drug release and the proportion of ethylcellulose in
the microcapsules. Microencapsulation based on coacervation
was found to be a good technique for controlling the release
of diclofenac sodium in controlled-release formulations.
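The Higuchi model mentioned above relates cumulative release to the square root of time, Q(t) = kH·√t. A minimal sketch of fitting kH by least squares through the origin is given below; the function name and the synthetic data in the usage example are illustrative, not taken from the study:

```python
import math

def fit_higuchi(times, release):
    """Least-squares fit of the Higuchi model Q(t) = kH * sqrt(t)
    through the origin; returns kH and the coefficient of
    determination of the fit."""
    roots = [math.sqrt(t) for t in times]
    k = (sum(q * r for q, r in zip(release, roots))
         / sum(r * r for r in roots))
    mean_q = sum(release) / len(release)
    ss_tot = sum((q - mean_q) ** 2 for q in release)
    ss_res = sum((q - k * r) ** 2 for q, r in zip(release, roots))
    return k, 1.0 - ss_res / ss_tot
```

A near-unity coefficient of determination from such a fit is what supports the claim that a release profile "follows the Higuchi model".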
Abstract: The drastic increase in the usage of SMS technology
has led service providers to seek a solution that enables users
of mobile devices to access services through SMSs. This has
resulted in proposals for SMS-based service invocation in
service-oriented environments. However, the dynamic nature of
service-oriented environments, coupled with the sudden load
peaks generated by service requests, poses performance
challenges to infrastructures supporting SMS-based service
invocation. To address this problem we adopt load balancing
techniques. A load balancing model with adaptive load balancing
and load monitoring mechanisms as its key constructs is
proposed. This model led to the realization of the Least Loaded
Load Balancing Framework (LLLBF). An evaluation of LLLBF
benchmarked against the round robin (RR) scheme on the queuing
approach showed that LLLBF outperformed RR in terms of response
time and throughput. However, LLLBF achieved this better result
at the cost of higher processing power.
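The least-loaded dispatch rule that LLLBF embodies, with round robin as the comparison baseline, can be sketched as follows. The class and method names are hypothetical, and the real framework's adaptive load monitoring is far richer than this counter:

```python
import itertools

class LeastLoadedBalancer:
    """Sketch: dispatch each request to the node reporting the lowest
    current load; a round-robin dispatcher is included for comparison."""

    def __init__(self, nodes):
        self.load = {n: 0 for n in nodes}  # outstanding requests per node
        self._rr = itertools.cycle(nodes)

    def dispatch_least_loaded(self):
        node = min(self.load, key=self.load.get)
        self.load[node] += 1
        return node

    def dispatch_round_robin(self):
        node = next(self._rr)
        self.load[node] += 1
        return node

    def complete(self, node):
        """Called when a node finishes serving a request."""
        self.load[node] -= 1
```

Least-loaded dispatch needs up-to-date load figures for every node (the extra processing cost noted in the abstract), whereas round robin needs no monitoring at all.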
Abstract: A serious problem on the WWW is finding reliable
information. Not everything found on the Web is true and the
Semantic Web does not change that in any way. The problem will be
even more crucial for the Semantic Web, where agents will be
integrating and using information from multiple sources. Thus, if an
incorrect premise is used due to a single faulty source, then any
conclusions drawn may be in error. Therefore, statements published on
the Semantic Web have to be seen as claims rather than as facts, and
there should be a way to decide which among many possibly
inconsistent sources is most reliable. In this work, we propose a trust
model for the Semantic Web. The proposed model is inspired by the
use of trust in human society. Trust is a type of social knowledge and
encodes evaluations about which agents can be taken as reliable
sources of information or services. Our proposed model allows
agents to decide which among different sources of information to
trust, and thus to act rationally on the Semantic Web.
Abstract: Recently, business environments and customer needs
have been changing rapidly, so it is very difficult to fulfill
sophisticated customer needs by product or service innovation alone. In
practice, to cope with this problem, various manufacturing companies
have developed services to combine with their products. Along with
this, many academic studies on PSS (Product Service System) which is
the integrated system of products and services have been conducted
from the viewpoint of manufacturers. On the other hand, service
providers are also attempting to develop service-supporting products
to increase their service competitiveness and provide differentiated
value. However, there is a lack of research based on the service-centric
point of view. Accordingly, this paper proposes a concept generation
method for service-supporting product development from the
service-centric point of view. This method is designed to be executed
in five consecutive steps: situation analysis, problem definition,
problem resolution, solution evaluation, and concept generation. In
the proposed approach, some tools of TRIZ (Theory of
Inventive Problem Solving), such as the ISQ (Innovative Situation
Questionnaire) and the 40 inventive principles, are employed to define problems
of the current services and solve them by generating
service-supporting product concepts. This research contributes to the
development of service-supporting products and service-centric PSSs.
Abstract: This paper proposes a system to extract images from web pages and then detect the skin color regions of these images. As part of the proposed system, using the BandObject control, we built a toolbar named the 'Filter Tool Bar (FTB)' by modifying the Pavel Zolnikov implementation. The Yahoo! team provides the Yahoo! SDK API, which also supports image search and proved useful here. In the proposed system, we introduce three new methods for extracting images from web pages (after loading the web page by using the proposed FTB, before loading the web page physically from the localhost, and before loading the web page from any server). These methods overcome the drawbacks of the regular-expression method for extracting images suggested by Ilan Assayag. The second part of the proposed system is concerned with the detection of skin color regions in the extracted images, for which we studied two well-known skin color detection techniques. The first technique is based on the RGB color space and the second on the YUV and YIQ color spaces. We modified the second technique, using the saturation parameter, to overcome its failure on images with complex backgrounds and to obtain accurate skin detection results. A performance evaluation of the efficiency of the proposed system in extracting images before and after loading the web page from the localhost or any server, in terms of the number of extracted images, is presented. Finally, the results of comparing the two skin detection techniques in terms of the number of pixels detected are presented.
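The RGB-based technique can be illustrated with the classic explicit RGB skin rule. The thresholds below are the commonly cited heuristic values for uniform daylight, which may differ from the exact thresholds used in the paper:

```python
def is_skin_rgb(r, g, b):
    """Classic explicit RGB skin-color rule (uniform-daylight variant):
    a pixel is skin if it is bright enough, not grayscale, and red-dominant.
    Threshold values are the common heuristic, not the paper's own."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15  # reject grayscale pixels
            and abs(r - g) > 15 and r > g and r > b)

def skin_mask(pixels):
    """Map an iterable of (R, G, B) tuples to a boolean skin mask."""
    return [is_skin_rgb(*p) for p in pixels]
```

Applied per pixel, this yields the kind of binary skin mask whose detected-pixel counts the abstract uses to compare the two techniques.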
Abstract: Speech enhancement is the process of eliminating
noise and increasing the quality of a speech signal that is
contaminated with noise and other kinds of distortion. This
paper develops an optimum cascaded system for speech
enhancement. This aim is attained without discarding any
relevant speech information and without much computational or
time complexity. The LMS algorithm, spectral subtraction and
the Kalman filter are deployed as the main de-noising
algorithms in this work. Since these algorithms suffer from
their respective shortcomings, this work designs cascaded
systems in different combinations and evaluates such cascades
by qualitative (listening) and quantitative (SNR) tests.
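As one example of the de-noising stages mentioned, an LMS adaptive noise canceller can be sketched as follows. This is the generic textbook formulation, with step size and filter length chosen for illustration, not the cascade tuned in this work:

```python
def lms_denoise(noisy, reference, mu=0.05, taps=8):
    """Sketch of LMS adaptive noise cancellation: adapt an FIR filter on
    a noise reference so its output tracks the noise component of the
    primary signal; the error signal is the enhanced speech estimate."""
    w = [0.0] * taps
    enhanced = []
    for n in range(len(noisy)):
        # tap-delay line of the reference signal
        x = [reference[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        y = sum(wk * xk for wk, xk in zip(w, x))  # noise estimate
        e = noisy[n] - y                          # enhanced sample
        w = [wk + 2 * mu * e * xk for wk, xk in zip(w, x)]  # LMS update
        enhanced.append(e)
    return enhanced
```

In a cascade, the output of one such stage would be fed to the next de-noising algorithm (e.g. spectral subtraction or a Kalman filter).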
Abstract: Estimating the reliability of a computer network has been a subject of great interest. It is well known that this problem is NP-hard. In this paper we present a very efficient combinatorial approach for the Monte Carlo reliability estimation of a network with unreliable nodes and unreliable edges. Its core is the computation of certain network combinatorial invariants. These invariants, once computed, directly provide a pure and simple framework for the computation of network reliability. As a specific case of this approach we obtain tight lower and upper bounds for distributed network reliability (the so-called residual connectedness reliability). We also present some simulation results.
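For orientation, the crude Monte Carlo baseline that such combinatorial methods improve upon can be sketched as follows. The function and its parameters are illustrative; the paper's invariant-based estimator is more sophisticated than this direct sampling:

```python
import random

def mc_reliability(nodes, edges, p_node, p_edge, terminals,
                   trials=20000, seed=1):
    """Crude Monte Carlo estimate of the probability that all terminal
    nodes are connected, where each node is up with probability p_node
    and each edge with probability p_edge, independently."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        up = {n for n in nodes if rng.random() < p_node}
        if not set(terminals) <= up:
            continue  # a terminal node failed: network is down
        adj = {n: [] for n in up}
        for u, v in edges:
            if u in up and v in up and rng.random() < p_edge:
                adj[u].append(v)
                adj[v].append(u)
        # depth-first search from one terminal over surviving components
        start = terminals[0]
        seen, stack = {start}, [start]
        while stack:
            x = stack.pop()
            for y in adj[x]:
                if y not in seen:
                    seen.add(y)
                    stack.append(y)
        if set(terminals) <= seen:
            hits += 1
    return hits / trials
```

For a two-node network joined by one edge with reliable nodes and a half-reliable edge, the estimate converges to 0.5, the exact reliability.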
Abstract: Accurate modeling of high speed RLC interconnects
has become a necessity to address signal integrity issues in current
VLSI design. To accurately model a dispersive system of interconnects
at higher frequencies, a full-wave analysis is required.
However, conventional circuit simulation of interconnects with full
wave models is extremely CPU expensive. We present an algorithm
for reducing large VLSI circuits to much smaller ones with similar
input-output behavior. A key feature of our method, called Frequency
Shift Technique, is that it is capable of reducing linear time-varying
systems. This enables it to capture frequency-translation and sampling
behavior, important in communication subsystems such as mixers,
RF components and switched-capacitor filters. Reduction is obtained
by projecting the original system described by linear differential
equations onto a lower dimension. Experiments have been carried out
using the Cadence Design Simulator, which indicate that the proposed
technique achieves a greater percentage reduction with less CPU time
than other model order reduction techniques in the literature. We
also present applications to RF circuit subsystems, obtaining size
reductions and evaluation speedups of orders of magnitude with
insignificant loss of accuracy.
Abstract: Climate change causes severe effects on natural
habitats, especially wetlands. These challenges require the adaptation
of their management to the probable effects of climate change. A
compilation of necessary changes in land management was prepared
for a Hungarian area that is both a national park and a Natura 2000
SAC and SCI site, with the aim of increasing resilience and reducing
vulnerability. Several factors, such as ecological aspects, nature
conservation and climatic adaptation should be combined with social
and economic factors during the process of developing climate
change adapted management on vulnerable wetlands. Planning
adaptive management should be determined by a priority order of
conservation aims and evaluation of factors at the determined
planning unit. Mowing techniques, frequency and exact dates should
be considered, as well as the grazing species and their breeds, due to
their different grazing, group-forming and trampling habits. Integrating
landscape history and historical land development into the planning
process is essential.
Abstract: This article deals with the popularity of candidates for the presidency of the United States of America. The popularity is assessed according to public comments on the Web 2.0. Social networking, blogging and online forums (collectively, Web 2.0) are for common Internet users the easiest way to share their personal opinions, thoughts, and ideas with the entire world. However, the diversity of web content, the variety of technologies and differences in website structure all make the Web 2.0 a network of heterogeneous data, where things are difficult for common users to find. The introductory part of the article describes a methodology for gathering and processing data from the Web 2.0. The next part of the article focuses on the evaluation and content analysis of the obtained information concerning the presidential candidates.
Abstract: Green home rating has emerged as an important
agenda to practice the principles of sustainability. In Malaysia, the
establishment of the 'Green Building Index – Residential New
Construction' (GBI-RNC) has brought this agenda closer to the
stakeholders of the local green building industry. The GBI-RNC focuses
on evaluating the environmental impacts posed by houses
rather than assessing the Triple Bottom Line (TBL) of sustainability,
which also includes socio-economic factors. Therefore, as part of a
wider study, a survey was conducted to gather the backgrounds of
green building stakeholders in Malaysia and their responses to a
number of exploratory questions regarding the setting up of a
framework to rate green homes against the TBL. This paper reports
the findings from Sections A and B of this survey and discusses
them accordingly with a conclusion that forms part of the basis for a
new generation green home rating framework specifically for use in
Malaysia.
Abstract: For a spatiotemporal database management system,
I/O cost of queries and other operations is an important performance
criterion. In order to optimize this cost, an intense research on
designing robust index structures has been done in the past decade.
Beyond these major considerations, there are still other design issues
that deserve attention due to their direct impact on the I/O cost.
In particular, an efficient buffer management strategy plays a key
role in reducing redundant disk accesses. In this paper, we propose an
efficient buffer strategy for a spatiotemporal database index
structure, specifically one indexing objects moving over a network of
roads. The proposed strategy, namely MONPAR, is based on the data
type (i.e. spatiotemporal data) and the structure of the index.
For the experimental evaluation, we set up a
simulation environment that counts the number of disk accesses
made while executing a number of spatiotemporal range queries over
the index. We reiterated the simulations with query sets of different
distributions, such as a uniform and a skewed query
distribution. Based on a comparison of our strategy with well-known
page-replacement techniques, such as LRU-based and priority-based
buffers, we conclude that MONPAR behaves better than its
competitors for small and medium-size buffers under all tested query distributions.
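The LRU page-replacement baseline against which such a buffer strategy is compared can be sketched as follows; the class name and the disk-access counter are illustrative, standing in for the simulation environment described above:

```python
from collections import OrderedDict

class LRUBuffer:
    """Sketch of an LRU page-replacement buffer: on a miss, the least
    recently used page is evicted and a disk access is counted."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()  # insertion order = recency order
        self.disk_accesses = 0

    def access(self, page_id):
        """Return True on a buffer hit, False on a miss."""
        if page_id in self.pages:
            self.pages.move_to_end(page_id)  # hit: mark most recently used
            return True
        self.disk_accesses += 1              # miss: fetch page from disk
        if len(self.pages) >= self.capacity:
            self.pages.popitem(last=False)   # evict least recently used
        self.pages[page_id] = True
        return False
```

A strategy like MONPAR would replace the recency-only eviction rule with one informed by the spatiotemporal query workload; counting `disk_accesses` over a query trace is exactly the comparison metric the abstract describes.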
Abstract: Removing noise from processed images is very important, and noise should be removed in such a way that the important information in the image is preserved. A decision-based nonlinear algorithm for the elimination of band lines, drop lines, marks, band loss and impulses in images is presented in this paper. The algorithm performs two simultaneous operations, namely, detection of corrupted pixels and evaluation of new pixels to replace the corrupted ones. Removal of these artifacts is achieved without damaging edges and details. However, the restricted window size renders the median operation less effective whenever noise is excessive; in that case the proposed algorithm automatically switches to mean filtering. The performance of the algorithm is analyzed in terms of Mean Square Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Signal-to-Noise Ratio Improved (SNRI), Percentage of Noise Attenuated (PONA), and Percentage of Spoiled Pixels (POSP), and is compared with standard algorithms already in use, demonstrating the improved performance of the proposed algorithm. The advantage of the proposed algorithm is that a single algorithm can replace the several independent algorithms otherwise required for the removal of different artifacts.
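The detect-and-replace logic, including the switch from median to mean filtering under excessive noise, can be sketched as follows. The extreme-value corruption test and the 3x3 window are illustrative assumptions, not the paper's exact detector:

```python
def decision_based_filter(img, noise_min=0, noise_max=255):
    """Sketch of a decision-based filter: pixels at the extreme values
    are treated as corrupted and replaced by the median of the
    uncorrupted neighbours in a 3x3 window; if every neighbour is also
    corrupted, fall back to the mean of the whole window."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            p = img[i][j]
            if noise_min < p < noise_max:
                continue  # pixel looks clean; leave it untouched
            win = [img[a][b]
                   for a in range(max(0, i - 1), min(h, i + 2))
                   for b in range(max(0, j - 1), min(w, j + 2))]
            good = [q for q in win if noise_min < q < noise_max]
            if good:
                good.sort()
                out[i][j] = good[len(good) // 2]  # median of clean pixels
            else:
                out[i][j] = sum(win) // len(win)  # excessive noise: mean
    return out
```

Because only pixels flagged as corrupted are rewritten, edges and fine detail carried by clean pixels pass through unchanged, which is the property the abstract emphasizes.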
Abstract: Intelligent Transportation Systems integrate various modern advanced technologies into the ground transportation system, and they will be the goal of urban transport systems in the future because of their comprehensive effects. However, they also bring some problems, such as project performance assessment, fairness among benefiting groups, and fund management, which are directly related to their operation and implementation. Wuhan has difficulties in organizing transportation because of its natural features (rivers and lakes); therefore, the taxi-calling service plays an important role in its transportation. This paper studies the taxi-calling service in Wuhan, based on quantitative and qualitative analysis. It systematically analyzes the operations management of the service, including the business model, finance, usage analysis and user evaluation. As for the business model, the government leads the operation at the initial stage and a third party dominates the operation at the mature stage, which not only eases the pressure on the third party and benefits the spread of the calling service at the initial stage, but also alleviates the financial pressure on the government and improves the efficiency of the operation at the mature stage. As for finance, the analysis shows that this service initially brings a heavy financial burden for equipment, but this will be alleviated in the future as the service spreads. As for usage, data comparison shows that the service can bring benefits for taxi drivers, and the temporal and spatial distributions of usage have certain features. As for user evaluation, the paper analyzes the user groups and their reasons for choosing the service. Finally, according to the analysis above, the paper puts forward the potentials, limitations, and future development strategies for the service.
Abstract: The objective of this study was to develop vaginal
suppository containing lactobacillus. Four kinds of vaginal
suppositories containing Lactobacillus paracasei HL32 were
formulated: 1) a conventional suppository with Witepsol H-15 as a
base, 2) a conventional suppository with mixed polyethylene glycols
(PEGs) as a base, 3) a hollow-type suppository with Witepsol H-15
as a base and 4) a hollow-type suppository with mixed PEGs as a
base. The release studies demonstrated that the hollow-type
suppository with mixed PEGs as the base gave the highest release of
L. paracasei HL32 and was microbiologically stable after storage at 2-
8°C over a period of 3 months.
Abstract: Today's Information and Knowledge Society has
placed new demands on education and a new paradigm of education
is required. Learning, facilitated by educational systems and the
pedagogic process, is globally undergoing dramatic changes. The aim
of this paper is the development of a simple Instructional Design tool
for E-Learning, named IDEL (Instructional Design for Electronic
Learning), that provides the educators with facilities to create their
own courses with the essential educational material and manage
communication with students. It offers flexibility in the way of
learning and provides ease of use and reusability of
resources. IDEL is a web-based instructional system designed
to facilitate course design process in accordance with the ADDIE
model and the instructional design principles with emphasis placed
on the use of technology enhanced learning. An example case of
using the ADDIE model to systematically develop a course and its
implementation with the aid of IDEL is given and some results from
student evaluation of the tool and the course are reported.