Abstract: This work discusses two formulations of the boundary element method (BEM) for the linear bending analysis of plates reinforced by beams. Both formulations are based on Kirchhoff's hypothesis and are obtained from the reciprocity theorem applied to zoned plates, where each sub-region defines a beam or a slab. In the first model the problem values are defined along the interfaces and the external boundary. Then, in order to reduce the number of degrees of freedom, kinematic hypotheses are assumed along the beam cross-section, leading to a second formulation in which the collocation points are defined along the beam skeleton instead of on the interfaces. In these formulations no approximation of the generalized forces along the interface is required. Moreover, compatibility and equilibrium conditions along the interface are automatically imposed by the integral equation. Thus, these formulations require fewer approximations and the total number of degrees of freedom is reduced. The numerical examples discuss the differences between the two BEM formulations and compare the results with a well-known finite element code.
Abstract: In the proposed method for Web page ranking, a
novel theoretical model is introduced and tested on examples of order
relationships among IP addresses. Ranking is induced using a
convexity feature, which is learned from these examples
using a self-organizing procedure. We consider the problem of
self-organizing learning from IP data to be represented by a semi-random
convex polygon procedure, in which the vertices correspond to IP
addresses. Based on recent developments in our regularization
theory for convex polygons and corresponding Euclidean distance
based methods for classification, we develop an algorithmic
framework for learning ranking functions based on a Computational
Geometric Theory. We show that our algorithm is generic, and
present experimental results explaining the potential of our approach.
In addition, we explain the generality of our approach by showing its
possible use as a visualization tool for data obtained from diverse
domains, such as Public Administration and Education.
Abstract: The purpose of this paper is to study database models
in order to use them efficiently in e-commerce websites. We seek a
method for saving and retrieving information in e-commerce websites
that semantic web applications can work with, and we also survey
the different database technologies used in e-commerce. Since one of
the most important deficits of the semantic web is the shortage of
semantic data, with most information still stored in relational
databases, we present an approach to map legacy data stored in
relational databases into the semantic web using virtually any
modern RDF query language, as long as it is closed within RDF. To
achieve this goal we study XML structures for the relational
databases of older websites, and we then move one level above XML
to look for a mapping from the relational data model (RDM) to RDF.
Given that a large number of semantic web applications take
advantage of the relational model, opening up ways to convert it to
XML and RDF in modern (semantic web) systems is important.
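The row-to-triple idea behind such a mapping can be sketched as follows (a minimal illustration only; the base URI, table name, and helper `rows_to_triples` are hypothetical, and a real mapping language such as R2RML handles datatypes and foreign keys far more carefully):

```python
def rows_to_triples(table, primary_key, rows, base="http://example.org/"):
    """Map relational rows to RDF triples: each row becomes a subject
    URI built from its primary key; every other column becomes a
    predicate with the cell value as a literal object."""
    triples = []
    for row in rows:
        subject = f"<{base}{table}/{row[primary_key]}>"
        for column, value in row.items():
            if column != primary_key:
                triples.append((subject, f"<{base}{table}#{column}>", f'"{value}"'))
    return triples
```

Each non-key column thus yields one triple per row, which is the basic shape a relational-to-RDF mapping produces before any vocabulary alignment.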
Abstract: A large amount of valuable information is available in
plain text clinical reports. New techniques and technologies are
applied to extract information from these reports. In this study, we
developed a domain-based software system to transform 600
otorhinolaryngology discharge notes into a structured form for
extracting clinical data. To decrease the system's processing time,
the discharge notes were transformed into a data table after
preprocessing. Several word lists were constructed to identify the
common sections in the discharge notes, including patient history,
age, problems, and diagnosis. The n-gram method was used to discover
term co-occurrences within each section. Using this method a dataset
of concept candidates was generated for the validation step, and the
Predictive Apriori algorithm for association rule mining (ARM) was
then applied to validate the candidate concepts.
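The n-gram co-occurrence counting step can be sketched as follows (a minimal illustration assuming tokenized section text; `ngram_counts` is a hypothetical helper, not the authors' system):

```python
from collections import Counter

def ngram_counts(tokens, n):
    """Count contiguous n-grams in a token sequence; frequent n-grams
    within a section become concept candidates for the validation step."""
    return Counter(tuple(tokens[i:i + n])
                   for i in range(len(tokens) - n + 1))
```

Applied to a section's token stream, recurring bigrams such as a repeated diagnosis phrase surface with higher counts than incidental word pairs.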
Abstract: Microarray data profile gene expression on a whole-genome
scale and therefore provide a good way to study associations between
gene expression and the occurrence or progression of cancer. More
and more researchers have realized that microarray data are helpful
for predicting cancer samples. However, the number of gene
expression features is much larger than the sample size, which makes
this task very difficult. Identifying the significant genes that
cause cancer has therefore become an urgent, popular, and hard
research topic. Many feature selection algorithms proposed in the
past focus on improving cancer predictive accuracy at the expense of
ignoring the correlations between features. In this work, a novel
framework (named SGS) is presented for stable gene selection and
efficient cancer prediction. The proposed framework first runs a
clustering algorithm to find gene groups in which the genes have
high correlation coefficients, then selects the significant genes in
each group with the Bayesian Lasso and the important gene groups
with the group Lasso, and finally builds a prediction model on the
shrunken gene space with an efficient classification algorithm (such
as SVM, 1-NN, or regression). Experimental results on real-world
data show that the proposed framework often outperforms existing
feature selection and prediction methods such as SAM, IG, and
Lasso-type prediction models.
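The grouping-then-selection idea can be sketched as follows (a simplified stand-in: greedy correlation grouping plus a per-group label-correlation filter in place of the paper's clustering, Bayesian Lasso, and group Lasso steps; `group_and_select` and the 0.8 threshold are illustrative assumptions):

```python
import numpy as np

def group_and_select(X, y, threshold=0.8):
    """Greedily group genes whose correlation with a seed gene exceeds
    `threshold`, then keep from each group the gene most correlated
    with the class label y. X is samples x genes."""
    corr = np.corrcoef(X, rowvar=False)
    unassigned = set(range(X.shape[1]))
    selected = []
    while unassigned:
        seed = min(unassigned)
        group = [g for g in sorted(unassigned) if abs(corr[seed, g]) >= threshold]
        # per-group filter: keep the member most correlated with the label
        scores = [abs(np.corrcoef(X[:, g], y)[0, 1]) for g in group]
        selected.append(group[int(np.argmax(scores))])
        unassigned -= set(group)
    return sorted(selected)
```

Two perfectly correlated genes collapse into one representative, while an uncorrelated gene survives as its own group, which is the stability property the framework targets.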
Abstract: In this paper, we address the problem of reducing the
switching activity (SA) in on-chip buses through the use of a bus
binding technique in high-level synthesis. While many binding
techniques to reduce the SA exist, we present yet another technique for
further reducing the switching activity. Our proposed method
combines bus binding and data sequence reordering to explore a wider
solution space. The problem is formulated as a multiple traveling
salesman problem and solved using simulated annealing technique.
The experimental results show that a binding solution obtained
with the proposed method reduces the switching activity by 5.6-27.2%
(18.0% on average) and 2.6-12.7% (6.8% on average) compared
with conventional binding-only and hybrid binding-encoding
methods, respectively.
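The reordering component can be sketched with a simulated-annealing search over data orderings (a simplified single-bus stand-in for the paper's multiple-traveling-salesman formulation; `anneal_order` and its parameters are illustrative assumptions):

```python
import math
import random

def transitions(seq):
    """Total number of bit flips between consecutive words on the bus."""
    return sum(bin(a ^ b).count("1") for a, b in zip(seq, seq[1:]))

def anneal_order(words, steps=2000, temp=4.0, cooling=0.995, seed=0):
    """Reorder the data sequence by simulated annealing to reduce
    switching activity: random swaps, always accepting improvements and
    occasionally accepting worse orders while the temperature is high."""
    rng = random.Random(seed)
    order = list(words)
    cost = transitions(order)
    for _ in range(steps):
        i, j = rng.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
        new_cost = transitions(order)
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost                          # keep the swap
        else:
            order[i], order[j] = order[j], order[i]  # undo the swap
        temp *= cooling
    return order, cost
```

For the four words in the test below, the given ordering costs 11 bit flips and the best ordering costs 5; annealing searches that gap.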
Abstract: The paper presents an investigation into the effect of neural network predictive control of a UPFC on the transient stability performance of a multimachine power system. The proposed controller consists of a neural network model of the test system. This model is used to predict the future control inputs using the damped Gauss-Newton method, which employs 'backtracking' as the line search method for step selection. The benchmark two-area, four-machine system, which mimics the behavior of large power systems, is taken as the test system and is subjected to three-phase short circuit faults at different locations over a wide range of operating conditions. The simulation results clearly establish the robustness of the proposed controller to the fault location, an increase in the critical clearing time for the circuit breakers, and improved damping of the power oscillations as compared with a conventional PI controller.
Abstract: Determining the length of thread engagement of a bolt installed in a tapped part so as to avoid thread stripping remains a very current problem in the design of threaded assemblies. No formalized calculation method exists for the cases where the bolt is screwed directly into a ductile material. In this article, we study the thread-stripping behavior of a loaded assembly using finite element modelling and a damage-based rupture criterion. This modelling enables us to study the different parameters likely to influence the behavior of the bolted connection. We study in particular the influence of the pair of materials constituting the connection, of the bolt diameter, and of the geometrical characteristics of the tapped part, such as its external diameter and the length of thread engagement. We established a design of experiments to identify the most significant parameters, which enables us to propose a simple expression for calculating the resistance of the threads whatever the metallic materials of the bolt and the tapped part. We carried out stripping tests in order to validate our model. The estimated results are very close to those obtained in the tests.
Abstract: Names are important in many societies, even in technologically oriented ones which use, e.g., ID systems to identify individual people. Names such as surnames are the most important, as they are used in many processes such as identifying people and genealogical research. On the other hand, variation in names can be a major problem in identifying and searching for people, e.g., in web search or for security reasons. Name matching presumes a priori that a recorded name written in one alphabet reflects the phonetic identity of two samples, or some transcription error in copying a previously recorded name, with the added assumption that the two names refer to the same person. This paper describes name variations and gives a basic description of various name matching algorithms developed to overcome name variation and to find reasonable variants of names, which can be used to reduce mismatches in record linkage and name search. The implementation contains algorithms for computing a range of fuzzy matches based on different types of algorithms, e.g., composite and hybrid methods, and allows us to test and measure the algorithms for accuracy. NYSIIS, LIG2 and Phonex have been shown to perform well and provide sufficient flexibility to be included in the linkage/matching process for optimising name searching.
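As a concrete example of a phonetic code in the same family as Phonex and NYSIIS, a basic Soundex encoder can be sketched as follows (simplified rules; not one of the paper's implementations):

```python
def soundex(name):
    """Basic Soundex code: keep the first letter, map the remaining
    consonants to digit classes, collapse adjacent equal digits
    (h and w do not separate equal digits), drop vowels, pad to 4."""
    codes = {}
    for group, digit in (("bfpv", "1"), ("cgjkqsxz", "2"), ("dt", "3"),
                         ("l", "4"), ("mn", "5"), ("r", "6")):
        for ch in group:
            codes[ch] = digit
    name = name.lower()
    digits = []
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        if ch in "hw":
            continue                  # h/w are transparent separators
        code = codes.get(ch, "")      # vowels map to "" and reset prev
        if code and code != prev:
            digits.append(code)
        prev = code
    return (name[0].upper() + "".join(digits) + "000")[:4]
```

With these rules, the spelling variants "Robert" and "Rupert" both encode to R163, so a phonetic index would treat them as candidate matches.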
Abstract: Reliable results for an insulated oval duct
considering heat radiation are obtained based on an accurate oval
perimeter computed by an integral method, together with a
one-dimensional Plane Wedge Thermal Resistance (PWTR) model. This
extends a former study of insulated oval ducts that neglected heat radiation.
It is found that in practical situations with long-to-short-axis ratio a/b
4.5% while t/R2
Abstract: Recently, grid computing has attracted wide attention in
science, industry, and business, fields that require vast
amounts of computing. Grid computing provides an environment in
which many nodes (i.e., many computers) are connected to each
other through a local/global network and made available to many
users. In this environment, to achieve data processing among nodes
for any application, each node performs mutual authentication using
certificates issued by the Certificate Authority
(CA for short). However, if a failure or fault occurs in the
CA, no new certificates can be issued by it, and as
a result a new node cannot participate in the grid environment.
In this paper, an off-the-shelf scheme for dependable grid systems
using virtualization techniques is proposed and its implementation is
verified. The proposed approach uses virtualization techniques
to restart an application, e.g., the CA, if it has failed, so the
system can tolerate a failure or fault occurring in the CA. Since
the proposed scheme is easily implemented at the application level,
its implementation cost for the system builder is small
compared with other methods. Simulation results show that the
CA in the system can recover from a failure or fault.
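The restart-on-failure behavior can be modeled in miniature as follows (an application-level sketch only; `supervise`, `flaky_ca`, and the failure counts are hypothetical and stand in for restarting a virtualized CA instance):

```python
def supervise(service, max_restarts=3):
    """Application-level fault tolerance: call the service and, if it
    fails, restart it (here: call it again) up to max_restarts times."""
    restarts = 0
    while True:
        try:
            return service(), restarts
        except RuntimeError:
            if restarts == max_restarts:
                raise                 # give up after too many faults
            restarts += 1             # models restarting the failed CA

# A toy CA that faults twice before issuing a certificate.
state = {"failures_left": 2}

def flaky_ca():
    if state["failures_left"] > 0:
        state["failures_left"] -= 1
        raise RuntimeError("CA fault")
    return "certificate"
```

The supervisor hides transient CA faults from the rest of the grid: callers see only the eventual certificate plus a restart count.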
Abstract: The curriculum of the primary school science course was redesigned on the basis of constructivism in the 2005-2006 academic year in Turkey. In this context, the course was renamed "Science and Technology", and its content, course books, and student workbooks were redesigned in the light of constructivism. The aim of this study is to determine whether the Science and Technology course books and student workbooks for the 5th grade of primary school are consistent with constructivism, by evaluating them in terms of its fundamental principles. In this study the documentation technique (i.e., document analysis), a qualitative research method, is applied; for sample selection, criterion sampling, a purposeful sampling technique, is used. When the 5th-grade Science and Technology course book and workbook are examined, it is seen that the two books complement each other in certain areas. Consequently, it can be claimed that, in spite of some inadequate and missing points, an attempt has been made to design the course book and workbook in line with the principles of constructivism. To overcome the inadequacies in the books, it can be suggested that they be redesigned. In addition, so as not to ignore the technology dimension of the course, activities that encourage the students to prepare projects using the technology cycle should be included.
Abstract: Developing an accurate classifier for high-dimensional microarray datasets is a challenging task due to the small available sample size. It is therefore important to determine a set of relevant genes that classify the data well. Traditionally, gene selection methods often select the top-ranked genes according to their discriminatory power, but these genes are often correlated with each other, resulting in redundancy. In this paper, we propose a hybrid method using feature ranking and a wrapper method (a genetic algorithm with a multiclass SVM) to identify a set of relevant genes that classify the data more accurately. A new fitness function for the genetic algorithm is defined that focuses on selecting the smallest set of genes providing maximum accuracy. Experiments have been carried out on four well-known datasets. The proposed method provides better results, in terms of both classification accuracy and the number of genes selected, than those found in the literature.
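A fitness function of the kind described, where accuracy dominates and smaller gene sets break ties, can be sketched as follows (the 0.1 weight and the linear form are illustrative assumptions, not the paper's exact definition):

```python
def fitness(accuracy, n_selected, n_total, weight=0.1):
    """GA fitness sketch: classification accuracy dominates, while a
    small bonus rewards compact gene subsets, so among equally
    accurate chromosomes the one with fewer genes wins."""
    return accuracy + weight * (1.0 - n_selected / n_total)
```

Keeping the size bonus strictly smaller than typical accuracy differences ensures the GA never trades real accuracy for compactness.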
Abstract: This paper proposes a re-modification of
the minimum moment approach to resource leveling, which is itself a modification of the traditional method by
Harris. The method is based on the critical path method. The new approach differs from the earlier ones in the
criterion used to select the activity to be shifted for leveling the resource histogram. In the traditional method, the improvement factor
is found first to select the activity for each possible day of shifting. In the
modified method, the maximum value of the product of resource rate
and free float is found first, and the improvement factor is then
calculated for the activity to be shifted. In the proposed
method, the activity to be shifted first is selected by the largest resource rate. The process is repeated for all the
remaining activities, shifting where possible to obtain the updated histogram.
The proposed method significantly reduces the number of iterations
and is easier for manual computation.
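The shifting step common to these approaches can be sketched as follows (a minimal illustration assuming activities given as (start, duration, resource rate) tuples; the helpers and the sum-of-squares "moment" measure are illustrative, not the paper's exact formulas):

```python
def daily_totals(activities, horizon):
    """Resource histogram for activities given as (start, duration, rate)."""
    totals = [0] * horizon
    for start, dur, rate in activities:
        for day in range(start, start + dur):
            totals[day] += rate
    return totals

def moment(totals):
    """Sum of squared daily totals: the quantity minimum-moment leveling minimizes."""
    return sum(r * r for r in totals)

def best_shift(activities, idx, free_float, horizon):
    """Try every shift of activity idx within its free float and return
    the shift that minimizes the histogram's moment."""
    start, dur, rate = activities[idx]
    best = (moment(daily_totals(activities, horizon)), 0)
    for shift in range(1, free_float + 1):
        trial = list(activities)
        trial[idx] = (start + shift, dur, rate)
        best = min(best, (moment(daily_totals(trial, horizon)), shift))
    return best[1]
```

Shifting the chosen activity to the flattest position lowers the sum of squares, which is exactly what leveling the histogram means under the minimum-moment criterion.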
Abstract: By means of the three-wave method, we obtain some analytic solutions for the highly nonlinear form of the Benjamin-Bona-Mahony-Burgers (BBMB) equations in their bilinear form.
Abstract: Financial forecasting is an example of a signal processing problem. A number of ways to train the network are available. We have used the Levenberg-Marquardt algorithm for error back-propagation weight adjustment. Pre-processing of the data reduced much of the large-scale variation to a smaller scale, reducing the variation in the training data.
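A single Levenberg-Marquardt weight update has the form w ← w − (JᵀJ + λI)⁻¹Jᵀr, which can be sketched as follows (a generic illustration, not the paper's network code; `lm_step` is a hypothetical helper):

```python
import numpy as np

def lm_step(jacobian, residuals, weights, damping):
    """One damped Gauss-Newton / Levenberg-Marquardt update:
    w <- w - (J^T J + damping * I)^(-1) J^T r. Small damping approaches
    Gauss-Newton; large damping approaches gradient descent."""
    JtJ = jacobian.T @ jacobian
    step = np.linalg.solve(JtJ + damping * np.eye(JtJ.shape[0]),
                           jacobian.T @ residuals)
    return weights - step
```

On a linear least-squares problem, one step with tiny damping recovers the exact solution, which is why the method converges quickly near a minimum.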
Abstract: The control design for unmanned underwater vehicles (UUVs) is challenging due to the uncertainties in the complex dynamic modeling of the vehicle as well as its unstructured operational environment. To cope with these difficulties, a practical robust control is therefore desirable. The paper deals with the application of coefficient diagram method (CDM) for a robust control design of an autonomous underwater vehicle. The CDM is an algebraic approach in which the characteristic polynomial and the controller are synthesized simultaneously. Particularly, a coefficient diagram (comparable to Bode diagram) is used effectively to convey pertinent design information and as a measure of trade-off between stability, response speed and robustness. In the polynomial ring, Kharitonov polynomials are employed to analyze the robustness of the controller due to parametric uncertainties.
Abstract: The features of the HIV genome vary over a wide range
because the genome is highly heterogeneous. Hence, the infectivity of the virus changes with the chemokine receptor used. Specifically,
R5 and X4 HIV viruses use the CCR5 and CXCR4 coreceptors respectively, while R5X4 viruses can utilize both coreceptors. Recently, bioinformatics studies have attempted to
classify R5X4 viruses by the coreceptor usage of the HIV genome.
The aim of this study is to develop an optimal multilayer
perceptron (MLP) for high classification accuracy on HIV sub-type viruses. For this purpose, the number of units in the hidden layer
was incremented one by one, from one up to a chosen maximum. The statistical data of the R5X4, R5 and X4 viruses were preprocessed with
signal processing methods. Accessible residues of these virus sequences were extracted and modeled with an auto-regressive (AR) model,
because the residue sequences are long and differ in length from each other. Finally, the preprocessed dataset was used to train MLPs with various numbers of hidden units to identify R5X4
viruses. Furthermore, ROC analysis was used to determine the optimal MLP structure.
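The AR-modelling step can be sketched as follows (a least-squares AR fit as a generic illustration; `fit_ar` is a hypothetical helper, and the paper's exact estimator may differ):

```python
import numpy as np

def fit_ar(signal, order):
    """Least-squares AR fit: x[t] ~ a1*x[t-1] + ... + ap*x[t-p].
    The fixed-length coefficient vector gives variable-length signals
    a common-size feature representation."""
    rows = [signal[t - order:t][::-1] for t in range(order, len(signal))]
    coeffs, *_ = np.linalg.lstsq(np.array(rows),
                                 np.array(signal[order:]), rcond=None)
    return coeffs
```

Whatever the length of the residue-derived signal, the fit returns exactly `order` coefficients, which is what lets sequences of different lengths feed the same MLP input layer.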
Abstract: Chemical and physical functionalization of multiwalled
carbon nanotubes (MWCNT) has been commonly practiced to
achieve better dispersion of carbon nanotubes (CNTs) in polymer
matrix. This work describes various functionalization methods (acid
treatment, and non-ionic surfactant treatment with Triton X-100),
fabrication of MWCNT/PP nanocomposites via melt blending and
characterization of mechanical properties. Microscopy analysis
(FESEM, TEM, XPS) showed effective purification of MWCNTs
under acid treatment, and better dispersion under both chemical and
physical functionalization techniques combined, in their respective
order. Tensile tests showed an increase in tensile strength for
nanocomposites containing up to 2 wt% MWCNTs. A decrease in
tensile strength was seen in samples containing 4 wt% MWCNTs,
both raw and Triton X-100 functionalized, signifying MWCNT
degradation/rebundling at higher MWCNT contents.
For the acid-treated MWCNTs, however, the tensile
results showed a slight improvement even at 4 wt%, indicating
effective dispersion of the MWCNTs.
Abstract: In this paper we address the issue of classifying the fluorescence intensity of a sample in indirect immunofluorescence (IIF). Since IIF is by its very nature a subjective, semi-quantitative test, we discuss a strategy for reliably labeling the image data set using diagnoses performed by different physicians. We then discuss image pre-processing, feature extraction and selection. Finally, we propose two ANN-based classifiers that can separate intrinsically dubious samples and whose error tolerance can be flexibly set. Measured performance shows error rates of less than 1%, which qualifies the method for use in daily medical practice, either to pre-select the cases to be examined or to act as a second reader.