Abstract: In this study, stress distributions on dental implants
made of functionally graded biomaterials (FGBM) are investigated
numerically. The implant body is considered to be subjected to axial
compression loads. The numerical problem is assumed to be 2D, and
the ANSYS commercial software is used for the analysis. The cross
section of the implant thread varies with the height (H) and the
width (t) of the thread. According to the thread dimensions of the implant
and material properties of FGBM, equivalent stress distribution on
the implant is determined and presented with contour plots along
with the maximum equivalent stress values. As a result, with
increasing material gradient parameter (n), the equivalent stress
decreases, but the minimum stress distribution increases. Maximum
stress values decrease with decreasing implant radius (r). Maximum
von Mises stresses increase with decreasing H when t is constant.
On the other hand, the stress values are not affected by variation of t
when H is held constant.
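For reference, the equivalent (von Mises) stress reported above can be computed from a 2D plane-stress state with the textbook formula; this is only the closed-form expression, not the authors' ANSYS finite-element model:

```python
import numpy as np

def von_mises_2d(sx, sy, txy):
    """Equivalent (von Mises) stress for a 2D plane-stress state
    (normal stresses sx, sy and shear stress txy)."""
    return np.sqrt(sx**2 - sx * sy + sy**2 + 3.0 * txy**2)

# Uniaxial compression: equivalent stress equals the axial stress magnitude.
print(von_mises_2d(-100.0, 0.0, 0.0))  # 100.0
```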
Abstract: The design of weights is an important part of
fuzzy decision making, as it has a profound effect on the evaluation
results. Entropy is one of the weight measures based on objective
evaluation. Non-probabilistic entropy measures for fuzzy sets
and interval type-2 fuzzy sets (IT2FS) have been developed and applied
to weight measurement. Since entropy for IT2FS in decision
making has yet to be explored, this paper proposes a new objective
weighting method using the entropy weight method for multiple attribute
decision making (MADM). This paper utilizes the nature of the IT2FS
concept in the evaluation process to assess the attribute weights based
on the credibility of data. An example is presented to demonstrate
the feasibility of the new method in decision making. The entropy
measure of interval type-2 fuzzy sets yields flexible judgment and
could be applied in decision making environments.
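As context for the proposed IT2FS extension, the classical crisp entropy weight method it generalizes can be sketched as follows (the decision matrix is illustrative, not the paper's example):

```python
import numpy as np

def entropy_weights(X):
    """Classical Shannon entropy weights for an m x n decision matrix
    (rows = alternatives, columns = attributes). Attributes with more
    dispersed values carry more information and so get larger weights.
    The paper's IT2FS method generalizes this crisp form."""
    X = np.asarray(X, dtype=float)
    m, n = X.shape
    P = X / X.sum(axis=0)                      # column-normalized proportions
    plogp = np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0)
    e = -plogp.sum(axis=0) / np.log(m)         # entropy per attribute, in [0, 1]
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()

# hypothetical 4-alternative, 3-attribute ratings
X = [[7, 9, 9], [8, 7, 8], [9, 6, 8], [6, 7, 8]]
w = entropy_weights(X)
print(w, w.sum())  # weights sum to 1; the more dispersed column 0 outweighs column 2
```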
Abstract: This paper demonstrates how the soft systems
methodology can be used to improve the delivery of a module in data warehousing for fourth-year information technology students.
Graduates in information technology need to have academic skills
but also need good practical skills to meet the skills requirements of the information technology industry. In developing
and improving current data warehousing education modules, one has to find a balance in meeting the expectations of various role players such as the students themselves, industry, and academia. The soft
systems methodology, developed by Peter Checkland, provides a
methodology for facilitating problem understanding from different world views. In this paper, it is demonstrated how the soft systems methodology can be used to plan the improvement of data
warehousing education for fourth-year information technology students.
Abstract: Corner detection and optical flow are common techniques for feature-based video stabilization. However, these algorithms are computationally expensive and should be performed at a reasonable rate. This paper presents an algorithm for discarding irrelevant feature points and maintaining the rest for future use so as to reduce the computational cost. The algorithm starts by initializing a maintained set. The feature points in the maintained set are examined for their accuracy for modeling. Corner detection is required only when the feature points are insufficiently accurate for further modeling. Then, optical flows are computed from the maintained feature points toward the consecutive frame. After that, a motion model is estimated based on the simplified affine motion model and the least squares method, with outliers belonging to moving objects present. Studentized residuals are used to eliminate such outliers. The model estimation and elimination processes repeat until no more outliers are identified. Finally, the entire algorithm repeats along the video sequence, with the points remaining from the previous iteration used as the maintained set. As a practical application, efficient video stabilization can be achieved by exploiting the computed motion models. Our study shows that the number of times corner detection needs to be performed is greatly reduced, thus significantly reducing the computational cost. Moreover, optical flow vectors are computed only for the maintained feature points, not for outliers, further reducing the computational cost. In addition, the feature points after reduction are sufficient for background object tracking, as demonstrated in the simple video stabilizer based on our proposed algorithm.
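The estimate-then-eliminate loop described above (a simplified affine fit by least squares, outliers removed via studentized residuals) can be sketched as follows; the 4-parameter similarity-type model and the crude studentization are our assumptions, not the authors' exact formulation:

```python
import numpy as np

def fit_simplified_affine(src, dst, thresh=2.5, min_pts=4):
    """Fit dst ~ [a -b; b a] @ src + (tx, ty) by least squares,
    iteratively discarding points whose (approximately) studentized
    residuals exceed `thresh` -- a sketch of the estimate/eliminate
    loop, with outliers taken to belong to moving objects."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    keep = np.ones(len(src), bool)
    while True:
        x, y = src[keep, 0], src[keep, 1]
        A = np.zeros((2 * keep.sum(), 4))
        A[0::2] = np.c_[x, -y, np.ones_like(x), np.zeros_like(x)]  # x' equations
        A[1::2] = np.c_[y,  x, np.zeros_like(x), np.ones_like(x)]  # y' equations
        b = dst[keep].ravel()
        p, *_ = np.linalg.lstsq(A, b, rcond=None)
        r = np.linalg.norm((A @ p - b).reshape(-1, 2), axis=1)  # per-point residuals
        s = r.std()
        if s < 1e-12:
            break                              # model fits all kept points
        z = (r - r.mean()) / s                 # crude studentization
        worst = np.zeros(len(src), bool)
        worst[np.flatnonzero(keep)[z > thresh]] = True
        if not worst.any() or keep.sum() - worst.sum() < min_pts:
            break
        keep &= ~worst
    return p, keep   # parameters (a, b, tx, ty) and inlier mask

# synthetic frame pair: camera shake is a pure translation; one outlier
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, (20, 2))             # background feature points
dst = src + np.array([5.0, -3.0])              # shifted by camera motion
dst[0] += 40.0                                 # one point on a moving object
p, keep = fit_simplified_affine(src, dst)
```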
Abstract: This article proposes a voltage-mode
multifunction filter using a differential voltage current
controllable current conveyor transconductance amplifier
(DV-CCCCTA). The features of the circuit are that the
quality factor and pole frequency can be tuned independently
via the values of the capacitors, and that the circuit description is very
simple, consisting of merely one DV-CCCCTA and two
capacitors. Without any component matching conditions, the
proposed circuit is very appropriate for further development into
an integrated circuit. Additionally, each function response
can be selected by suitably choosing the input signals with a
digital method. The PSpice simulation results are presented.
The given results agree well with the theoretical anticipation.
Abstract: In this article, we aim to discuss the formulation of two explicit group iterative finite difference methods for the time-dependent two-dimensional Burgers' problem on a variable mesh. For the non-linear problem, the discretization leads to a non-linear system whose Jacobian is a tridiagonal matrix. We discuss Newton's explicit group iterative methods for a general Burgers' equation. The proposed explicit group methods are derived from the standard point and rotated point Crank-Nicolson finite difference schemes. Their computational complexity analysis is discussed. Numerical results are given to justify the feasibility of these two proposed iterative methods.
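As a minimal illustration of the nonlinear solve the abstract refers to, a single Crank-Nicolson step for the 1D viscous Burgers' equation with Newton's method (finite-difference Jacobian) might look like the following; the paper itself treats the 2D problem with explicit group schemes on a variable mesh, so this is only a simplified sketch:

```python
import numpy as np

def burgers_cn_step(u, dt, dx, nu, n_newton=10, tol=1e-10):
    """One Crank-Nicolson step for 1D viscous Burgers u_t + u u_x = nu u_xx
    with fixed Dirichlet ends, solved by Newton's method with a
    finite-difference Jacobian (dense, for clarity)."""
    def N(v):                      # spatial operator on interior points
        out = np.zeros_like(v)
        out[1:-1] = (v[1:-1] * (v[2:] - v[:-2]) / (2 * dx)
                     - nu * (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2)
        return out

    def F(v):                      # Crank-Nicolson residual, interior only
        return (v - u + 0.5 * dt * (N(v) + N(u)))[1:-1]

    v = u.copy()
    for _ in range(n_newton):
        r = F(v)
        if np.linalg.norm(r) < tol:
            break
        m, eps = len(r), 1e-7
        J = np.zeros((m, m))       # finite-difference Jacobian column by column
        for j in range(m):
            vp = v.copy()
            vp[1 + j] += eps
            J[:, j] = (F(vp) - r) / eps
        v[1:-1] -= np.linalg.solve(J, r)
    return v

x = np.linspace(0.0, 1.0, 41)
u = np.sin(np.pi * x)              # initial condition, u = 0 at both ends
u1 = burgers_cn_step(u, dt=0.01, dx=x[1] - x[0], nu=0.1)
```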
Abstract: Xanthan gum is one of the major commercial
biopolymers. Due to its excellent rheological properties, xanthan gum
is used in many applications, mainly in the food industry. Commercial
production of xanthan gum uses glucose as the carbon substrate;
consequently, the price of xanthan production is high. One of the
ways to decrease the price of xanthan is to use cheaper substrates such
as agricultural wastes. Iran is one of the largest date-producing countries.
However, approximately 50% of date production is wasted annually.
The goal of this study is to produce xanthan gum from waste dates
using Xanthomonas campestris PTCC1473 by submerged
fermentation. In this study, the effects of three variables, namely
phosphorus amount, nitrogen amount, and agitation rate, were studied
at three levels using response surface methodology (RSM). Statistical
analysis with the Design Expert 7.0.0 software
showed that xanthan production increased with an increasing level of phosphorus.
A low level of nitrogen led to higher xanthan production.
Increasing agitation had a positive influence on the xanthan amount. The statistical
model identified the optimum conditions for xanthan as a nitrogen amount of 3.15 g/l,
a phosphorus amount of 5.03 g/l, and agitation of 394.8 rpm. To
validate the model, experiments under the optimum conditions for xanthan
gum were carried out. The mean result for xanthan was 6.72±0.26.
The result was close to the value predicted by RSM.
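The RSM workflow above (fit a second-order response surface, then solve for the stationary point) can be sketched in a two-factor toy version; the synthetic data, the two-factor restriction, and the placement of the optimum are assumptions for illustration only, since the paper's model was built in Design Expert with three factors:

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Fit y ~ b0 + b.x + x.B.x for two factors by least squares and
    return the coefficients and stationary point (candidate optimum)."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.c_[np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2]
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    b0, b1, b2, b11, b22, b12 = c
    B = np.array([[2 * b11, b12], [b12, 2 * b22]])   # Hessian of the surface
    g = np.array([b1, b2])
    x_star = np.linalg.solve(B, -g)                   # where the gradient vanishes
    return c, x_star

# synthetic response with a known maximum at (3.15, 5.03) -- hypothetical data
rng = np.random.default_rng(1)
X = rng.uniform([0, 0], [6, 10], (30, 2))
y = 6.7 - 0.5 * (X[:, 0] - 3.15) ** 2 - 0.2 * (X[:, 1] - 5.03) ** 2
c, x_star = fit_quadratic_surface(X, y)
print(x_star)  # ≈ [3.15, 5.03]
```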
Abstract: In this paper, we propose a block-wise watermarking scheme for color image authentication to resist malicious tampering of digital media. The thresholding technique is incorporated into the scheme such that the tampered region of the color image can be recovered with high quality while the proofing result is obtained. The watermark for each block consists of its dual authentication data and the corresponding feature information. The feature information for recovery is computed by the thresholding technique. In the proofing process, we propose a dual-option parity check method to prove the validity of image blocks. In the recovery process, the feature information of each block embedded into the color image is rebuilt for high-quality recovery. The simulation results show that the proposed watermarking scheme can effectively detect the tampered region with a high detection rate and can recover the tampered region with high quality.
Abstract: Based on an analysis of current research on and applications of road maintenance, web geographic information systems (WebGIS), and ArcGIS Server, the overall construction of a platform for road maintenance development is studied and the key issues are presented, including the organization and design of spatial data on the basis of geodatabase technology, middleware technology, tile cache index technology, and the dynamic segmentation of WebGIS. A road maintenance geographic information platform is put forward through the analysis of the system design. The design and application of the WebGIS system are discussed on the basis of a case study of highway maintenance management in the Banan district of Chongqing. The feasibility of the theories and methods is validated through the system.
Abstract: When acid is pumped into damaged reservoirs for
damage removal/stimulation, distorted inflow of acid into the
formation occurs, caused by acid preferentially traveling into highly
permeable regions over less permeable regions, or (in general) into
the path of least resistance. This can lead to poor zonal coverage and
hence warrants diversion to carry out an effective placement of acid.
Diversion is desirably a reversible technique of temporarily reducing
the permeability of high perm zones, thereby forcing the acid into
lower perm zones.
The uniqueness of each reservoir can pose several challenges to
engineers attempting to devise optimum and effective diversion
strategies. Diversion techniques include mechanical placement and/or
chemical diversion of treatment fluids, further sub-classified into ball
sealers, bridge plugs, packers, particulate diverters, viscous gels,
crosslinked gels, relative permeability modifiers (RPMs), foams,
and/or the use of placement techniques, such as coiled tubing (CT)
and the maximum pressure difference and injection rate (MAPDIR)
methodology.
It is not always realized that the effectiveness of diverters greatly
depends on reservoir properties, such as formation type, temperature,
reservoir permeability, heterogeneity, and physical well
characteristics (e.g., completion type, well deviation, length of
treatment interval, multiple intervals, etc.). This paper reviews the
mechanisms by which each variety of diverter functions and
discusses the effect of various reservoir properties on the efficiency
of diversion techniques. Guidelines are recommended to help
enhance productivity from zones of interest by choosing the best
methods of diversion while pumping an optimized amount of
treatment fluid. The success of an overall acid treatment often
depends on the effectiveness of the diverting agents.
Abstract: The objective of the present work is to determine the
potential of the parabolic trough solar collector (PTC) for use in the
design of a solar thermal power plant in Algeria. The study is based
on a mathematical model of the PTC. Heat balances have been
established on the heat transfer fluid (HTF), the absorber
tube, and the glass envelope using the principle of energy conservation
at each surface of the HCE cross-section. The modified Euler
method is used to solve the resulting differential equations. First,
the thermal behavior of the HTF, the absorber, and the envelope is
obtained for typical days of two seasons. Then, to determine
the thermal performance of the heat transfer fluid, different oils are
considered and their temperature and heat gain evolutions are compared.
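The modified Euler (Heun predictor-corrector) integrator mentioned above can be sketched generically; the toy right-hand side below stands in for the coupled HTF/absorber/envelope energy balances, which are not reproduced here:

```python
import numpy as np

def modified_euler(f, t0, y0, h, n):
    """Modified Euler (Heun) method for y' = f(t, y): an explicit Euler
    predictor followed by a trapezoidal corrector, second-order accurate."""
    t, y = t0, np.asarray(y0, float)
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)          # predictor (explicit Euler)
        y = y + 0.5 * h * (k1 + k2)        # corrector (average of slopes)
        t += h
    return y

# toy check on y' = -y, y(0) = 1: exact solution at t = 1 is exp(-1)
y1 = modified_euler(lambda t, y: -y, 0.0, [1.0], 0.01, 100)
print(y1)  # ≈ 0.3679
```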
Abstract: The objectives of this research paper were to study the
influencing factors that contributed to the success of electronic
commerce (e-commerce) and to study the approach to enhance the
standard of e-commerce for small and medium enterprises (SME).
The research focused only on sole proprietorship
SMEs in Bangkok, Thailand. The factors contributing to the success
of SMEs included business management, learning in the organization,
business collaboration, and the quality of the website. A mixed quantitative and
qualitative research methodology was used. In terms of the
quantitative method, a questionnaire was used to collect data from
251 sole proprietorships. Structural Equation Modeling (SEM) was
utilized as the tool for data analysis. In terms of the qualitative method,
an in-depth interview, a dialogue with experts in the field of e-commerce
for SMEs, and content analysis were used.
By using the adjusted causal relationship structure model, it was
revealed that the factors affecting the success of e-commerce for
SMEs were found to be congruent with the empirical data. The
hypothesis testing indicated that business management influenced the
learning in the organization, the learning in the organization
influenced business collaboration and the quality of the website, and
these factors, in turn, influenced the success of SMEs. Moreover, the
approach to enhance the standard of SMEs revealed that the majority
of respondents wanted to enhance the standard of SMEs to a high
level in the category of safety of e-commerce system, basic structure
of e-commerce, development of staff potentials, assistance of budget
and tax reduction, and law improvement regarding the e-commerce
respectively.
Abstract: X-ray mammography is the most effective method for
the early detection of breast diseases. However, the typical diagnostic
signs such as microcalcifications and masses are difficult to detect
because mammograms are of low contrast and noisy. In this paper, a
new algorithm for image denoising and enhancement in the Orthogonal
Polynomials Transformation (OPT) is proposed to help radiologists
screen mammograms. In this method, a set of OPT edge coefficients
is scaled to a new set by a scale factor called the OPT scale factor. The
new set of coefficients is then inverse transformed, resulting in a
contrast-improved image. Applications of the proposed method to
mammograms with subtle lesions are shown. To validate the
effectiveness of the proposed method, we compare the results to
those obtained by the Histogram Equalization (HE) and the Unsharp
Masking (UM) methods. Our preliminary results strongly suggest
that the proposed method offers considerably improved enhancement
capability over the HE and UM methods.
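The transform-domain idea described above (scale the edge/detail coefficients, then inverse transform) can be illustrated with the FFT as a stand-in for the OPT, whose polynomial basis is not reproduced here; the cutoff and scale factor are arbitrary illustration values:

```python
import numpy as np

def scale_highfreq(img, k, cutoff=0.1):
    """Transform-domain enhancement sketch: scale high-frequency
    coefficients by factor k and inverse-transform. The FFT is used
    here only as a stand-in for the paper's orthogonal polynomials
    transformation; the DC/low band (overall brightness) is untouched."""
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    high = np.sqrt(fx**2 + fy**2) > cutoff     # edge/detail band
    F = np.where(high, k * F, F)
    return np.real(np.fft.ifft2(F))

img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0                        # synthetic low-contrast blob
out = scale_highfreq(img, k=2.0)               # edges amplified, mean preserved
```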
Abstract: With the explosive growth of information sources available on the World Wide Web, it has become increasingly difficult to identify the relevant pieces of information, since web pages are often cluttered with irrelevant content such as advertisements, navigation panels, and copyright notices surrounding the main content of the web page. Hence, tools for the mining of data regions, data records, and data items need to be developed in order to provide value-added services. Currently available automatic techniques to mine data regions from web pages are still unsatisfactory because of their poor performance and tag-dependence. In this paper, a novel method to extract data items from web pages automatically is proposed. It comprises two steps: (1) identification and extraction of the data regions based on visual clue information; (2) identification of data records and extraction of data items from a data region. For step 1, a novel and more effective method is proposed that finds the data regions formed by all types of tags using visual clues. For step 2, a more effective method, namely Extraction of Data Items from web Pages (EDIP), is adopted to mine data items. The EDIP technique is a list-based approach in which the list is a linear data structure. The proposed technique is able to mine non-contiguous data records and can correctly identify data regions irrespective of the type of tag in which they are bound. Our experimental results show that the proposed technique performs better than the existing techniques.
Abstract: Interactions among proteins are the basis of various
life events, so it is important to recognize and research protein
interaction sites. A control set containing 149 protein molecules
was used here. Ten features were extracted, and 4 sample sets
containing 9 sliding windows were constructed according to the features.
These 4 sample sets were processed by Radial Basis Function neural
networks optimized by Particle Swarm Optimization,
respectively, yielding 4 groups of results. Finally, these 4
groups of results were integrated by decision fusion (DF) and Genetic
Algorithm based Selected Ensemble (GASEN). Better accuracy was
obtained by DF and GASEN, so the integrated methods proved to
be effective.
Abstract: Self-efficacy, self-reliance, and motivation were
examined in a quasi-experimental study with 178 sophomore
university students. Participants used an interactive cardiovascular
anatomy and physiology CD-ROM, and completed a 15-item
questionnaire. Reliability of the questionnaire was established using
Cronbach's alpha. Post-tests and course grades were examined using
a t-test, demonstrating no significant differences. Results of an item-by-item
analysis of the questionnaire showed overall satisfaction with the
teaching methodology and varied results for self-efficacy, self-reliance,
and motivation. Kendall's tau was calculated for all items
in the questionnaire.
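For reference, the Cronbach's alpha used above is computed as k/(k-1) * (1 - sum of item variances / variance of the total score); a sketch with hypothetical response data (not the study's 15-item questionnaire):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / var(total score))."""
    scores = np.asarray(scores, float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()     # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)      # variance of row totals
    return k / (k - 1) * (1 - item_var / total_var)

# hypothetical 5-respondent, 3-item Likert data
data = [[4, 5, 4], [3, 3, 3], [5, 5, 4], [2, 2, 3], [4, 4, 5]]
print(round(cronbach_alpha(data), 3))  # 0.897
```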
Abstract: With respect to the dissipation of energy through
plastic deformation of joints of prefabricated wall units, the paper
points out the principal importance of efficient reinforcement of the
prefabricated system at its joints. The method, quality and amount of
reinforcement are essential for reaching the necessary degree of joint
ductility. The paper presents partial results of experimental research
of vertical joints of prefabricated units exposed to monotonically
increasing loading and repetitive shear force, and formulates the conclusion
that the limit state of the structure as a whole is preceded by the
disintegration of joints, or that the structure tends to pass from
linearly elastic behaviour to non-linearly elastic to plastic behaviour
by exceeding the proportional elastic limit in the joints. Experimental
verification on a model of a 7-storey prefabricated structure revealed
weak points in its load-bearing systems, mainly at places of critical
points around openings situated in close proximity to vertical joints
of mutually perpendicularly oriented walls.
Abstract: The aim of this paper is to propose a mathematical
model to determine invariant sets, set covering, orbits and, in
particular, attractors in the set of tourism variables. Analysis was
carried out based on a pre-designed algorithm and applying our
interpretation of chaos theory developed in the context of General
Systems Theory. This article sets out the causal relationships
associated with tourist flows in order to enable the formulation of
appropriate strategies. Our results can be applied to numerous cases.
For example, in the analysis of tourist flows, these findings can be
used to determine whether the behaviour of certain groups affects that
of other groups and to analyse tourist behaviour in terms of the most
relevant variables. Unlike statistical analyses that merely provide
information on current data, our method uses orbit analysis to
forecast, if attractors are found, the behaviour of tourist variables in
the immediate future.
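The orbit analysis described above can be illustrated with a generic sketch: iterate a discrete map of system variables and report a fixed-point attractor if the orbit settles. The contracting toy map is an assumption for illustration, not the authors' algorithm or tourism model:

```python
import numpy as np

def orbit(f, x0, n=200, tol=1e-8):
    """Iterate a discrete map of system variables and report whether the
    orbit settles on a fixed-point attractor (successive iterates closer
    than `tol`). Returns the trajectory and the attractor, or None if the
    orbit does not converge within n steps."""
    x = np.asarray(x0, float)
    traj = [x]
    for _ in range(n):
        x_next = f(x)
        traj.append(x_next)
        if np.linalg.norm(x_next - x) < tol:
            return np.array(traj), x_next     # attractor found
        x = x_next
    return np.array(traj), None               # no convergence within n steps

# toy contracting map of two "tourism variables" with attractor at (2, 1)
f = lambda x: 0.5 * (x - np.array([2.0, 1.0])) + np.array([2.0, 1.0])
traj, attractor = orbit(f, [10.0, -4.0])
```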
Abstract: A fuel cell system requires a regulating circuit for
voltage and current in order to control power when connected to
other generating devices or loads. In this paper, the fuel cell system and
converter, which together form a multi-variable system, are controlled using the
sliding mode method. The use of a weighting matrix in the design procedure
makes it possible to regulate the speed of control. Simulation results show
the robustness and accuracy of the proposed controller in controlling
the desired outputs.
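A generic sliding mode loop (a sliding surface plus a switching control, here on a double integrator with a bounded disturbance) can be sketched as below; this illustrates the method only and is not the paper's fuel-cell/converter model or its weighting-matrix design:

```python
import math

def simulate_smc(c=2.0, k=5.0, dt=1e-3, steps=5000):
    """Sliding mode regulation sketch for a double integrator x'' = u + d
    with bounded disturbance d: surface s = c*x + x', switching law
    u = -k*sign(s) with k larger than the disturbance bound. Once on the
    surface, the state slides along s = 0, so x decays like exp(-c*t)."""
    x, v = 1.0, 0.0                       # initial state error
    for i in range(steps):
        d = 0.5 * math.sin(0.01 * i)      # bounded, unknown disturbance
        s = c * x + v                     # sliding surface
        u = -k * math.copysign(1.0, s)    # switching control
        v += (u + d) * dt
        x += v * dt
    return x, v

x, v = simulate_smc()
print(abs(x), abs(v))  # both driven near zero despite the disturbance
```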
Abstract: Early detection of lung cancer through chest radiography is a widely used method due to its relatively affordable cost. In this paper, an approach to improve lung nodule visualization on chest radiographs is presented. The approach makes use of a linear-phase high-frequency emphasis filter for digital filtering and
histogram equalization for contrast enhancement. Results obtained indicate that a filtered image can
reveal sharper edges and provide more details. Also, contrast enhancement offers a way to further enhance the global (or local) visualization by equalizing the histogram of the pixel values within
the whole image (or a region of interest). The work aims to improve lung nodule visualization on chest radiographs to aid the detection of lung cancer, which is currently the leading cause of cancer deaths worldwide.
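The histogram equalization stage can be sketched as follows, mapping each gray level through the normalized cumulative histogram; the input is a synthetic low-contrast image, not a real radiograph:

```python
import numpy as np

def hist_equalize(img, levels=256):
    """Global histogram equalization for an 8-bit grayscale image:
    build a lookup table from the normalized cumulative histogram so
    the output gray levels spread across the full dynamic range."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum().astype(float)
    m = cdf[cdf > 0].min()                 # CDF value of the lowest occupied level
    cdf = (cdf - m) / (cdf[-1] - m + 1e-12)
    lut = np.clip(np.round(cdf * (levels - 1)), 0, levels - 1).astype(np.uint8)
    return lut[img]

# synthetic low-contrast image: values squeezed into [100, 140]
img = np.clip(np.random.default_rng(2).normal(120, 8, (64, 64)),
              100, 140).astype(np.uint8)
out = hist_equalize(img)                   # now spans the full 0..255 range
```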