Abstract: The Generalized Center String (GCS) problem generalizes the Common Approximate Substring and Common Substring problems. GCS is known to be NP-hard; the difficulty lies in the explosion of potential candidates. The longest center string cannot be known in advance for any particular biological process, since some sequences may not contain any motif. GCS can be solved by frequent-pattern-mining techniques and is known to be fixed-parameter tractable with respect to input sequence length and symbol set size. Efficient methods known as the Bpriori algorithms solve GCS with reasonable time/space complexity; the Bpriori 2 and Bpriori 3-2 algorithms find center strings of any length together with the positions of all their instances in the input sequences. In this paper, we reduce the time/space complexity of the Bpriori algorithm with a Constraint-Based Frequent Pattern mining (CBFP) technique that integrates the ideas of constraint-based mining and FP-tree mining. CBFP solves the GCS problem not only for center strings of any length, but also for the positions of all their mutated copies in the input sequences. The CBFP technique constructs a trie-like FP-tree to represent the mutated copies of center strings of any length, and applies constraints to restrain the growth of the consensus tree. The complexity of the CBFP technique and of the Bpriori algorithm is analyzed for both the worst and average cases, and the correctness of the algorithm is compared with that of the Bpriori algorithm on artificial data.
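To make the problem concrete, a minimal sketch of the center-string test underlying GCS (this is only the brute-force problem definition, not the Bpriori or CBFP algorithms; the exponential candidate enumeration illustrates why pruned pattern mining is needed):

```python
from itertools import product

def hamming(a, b):
    """Hamming distance between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def is_center(c, seqs, d):
    """c is a center string iff every input sequence contains a
    substring within Hamming distance d of c (a 'mutated copy')."""
    L = len(c)
    return all(
        any(hamming(c, s[i:i + L]) <= d for i in range(len(s) - L + 1))
        for s in seqs
    )

def naive_centers(seqs, length, d, alphabet="ACGT"):
    """Brute-force search over all strings of the given length:
    exponential in the length, which is exactly the candidate
    explosion that FP-tree-style mining prunes away."""
    return [
        "".join(c) for c in product(alphabet, repeat=length)
        if is_center("".join(c), seqs, d)
    ]
```

For example, with three short DNA sequences that each contain a copy of "ACGT" with at most one mutation, `naive_centers(seqs, 4, 1)` recovers it.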
Abstract: A virtualized and virtual approach is presented for academically preparing students to engage successfully, from a strategic perspective, with both structured and unstructured concerns and measures in the area of cyber security and information assurance. The Master of Science in Cyber Security and Information Assurance (MSCSIA) is a professional degree for those who endeavor, through technical and managerial measures, to ensure the security, confidentiality, integrity, authenticity, control, availability and utility of the world's computing and information systems infrastructure. The National University Cyber Security and Information Assurance program is offered as a Master's degree. The MSCSIA program uniquely emphasizes hands-on academic instruction using virtual computers. This past year, 2011, the NU facility became fully operational, using a system architecture that provides a Virtual Education Laboratory (VEL) accessible to both onsite and online students. The first student cohort completed their MSCSIA training on March 2, 2012, after fulfilling 12 courses, for a total of 54 units of college credit. The rapid-pace scheduling of one course per month is immensely challenging, perpetually changing, and virtually multifaceted. This paper analyses these descriptive terms in consideration of the globalization-driven penetration breaches present in today's world of cyber security. In addition, we present current NU practices to mitigate risks.
Abstract: This paper presents a comparison between two Pulse Width Modulation (PWM) algorithms applied to a three-level Neutral Point Clamped (NPC) Voltage Source Inverter (VSI). The first algorithm is the triangular-sinusoidal strategy; the second is the Space Vector Pulse Width Modulation (SVPWM) strategy. In the first part, we present the topology of the three-level NPC VSI. After that, we develop the two PWM strategies to control this converter. Finally, the experimental results are presented.
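A minimal sketch of the first strategy for one inverter leg, assuming the common phase-disposition variant of carrier-based (triangular-sinusoidal) PWM; the modulation index, reference frequency, and carrier frequency below are illustrative values, not the paper's experimental settings:

```python
import math

def triangle(t, f_c):
    """Unit triangular carrier in [0, 1] at frequency f_c."""
    x = (t * f_c) % 1.0
    return 2 * x if x < 0.5 else 2 * (1 - x)

def npc_level(t, m=0.8, f_ref=50.0, f_c=2000.0):
    """Phase-disposition sinusoidal PWM for one leg of a three-level
    NPC inverter: the sinusoidal reference is compared against two
    vertically stacked carriers, giving output levels -1, 0, +1
    (in units of Vdc/2)."""
    ref = m * math.sin(2 * math.pi * f_ref * t)
    upper = triangle(t, f_c)       # carrier spanning [0, 1]
    lower = upper - 1.0            # carrier spanning [-1, 0]
    if ref >= upper:
        return 1
    if ref <= lower:
        return -1
    return 0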
Abstract: The paper deals with an analysis of visibility records collected from 210 European airports to obtain a realistic estimation of the availability of Free Space Optical (FSO) data links. Commercially available optical links usually operate in the 850 nm waveband. Thus the influence of the atmosphere on the optical beam and on the visible light is similar. Long-term visibility records represent an invaluable source of data for the estimation of the quality of service of FSO links. The model used characterizes both the statistical properties of fade depths and the statistical properties of individual fade durations. Results are presented for Italy, France, and Germany.
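The visibility-to-attenuation step can be sketched as follows; the abstract does not state which model the paper uses, so this assumes the widely used Kruse/Kim formulation relating meteorological visibility to specific attenuation at the link wavelength:

```python
def kim_q(v_km):
    """Particle-size-distribution exponent q of the Kim model,
    as a function of visibility in km."""
    if v_km > 50:
        return 1.6
    if v_km > 6:
        return 1.3
    if v_km > 1:
        return 0.16 * v_km + 0.34
    if v_km > 0.5:
        return v_km - 0.5
    return 0.0

def fso_attenuation_db_per_km(v_km, wavelength_nm=850.0):
    """Specific attenuation (dB/km) from visibility, Kruse/Kim
    formulation; 550 nm is the visibility reference wavelength."""
    return (3.91 / v_km) * (wavelength_nm / 550.0) ** (-kim_q(v_km))
```

As expected, attenuation grows sharply as visibility drops, which is why long-term visibility statistics translate directly into fade statistics for an 850 nm link.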
Abstract: A numerical simulation of the vortex-induced vibration of a two-dimensional elastic circular cylinder with two degrees of freedom in uniform flow is carried out at a Reynolds number of 200. The two-dimensional incompressible Navier-Stokes equations are solved with the space-time finite element method, the equation of cylinder motion is solved with a new explicit integration method, and mesh updating is achieved with a spring-based moving-mesh technique. Considering vortex-induced vibration with a low reduced damping parameter, the trends of the lift coefficient, the drag coefficient, and the cylinder displacement are analyzed under different cylinder oscillation frequencies. The phenomena of lock-in, beating, and phase switching were captured successfully. The evolution of vortex shedding from the cylinder over time is discussed. The characteristics of the one-degree-of-freedom cylinder model and of the two-degree-of-freedom cylinder model show very similar trends. The streamwise vibrations have a certain effect on the lateral vibrations and their characteristics.
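The structural side of such a simulation reduces to a forced mass-spring-damper equation per degree of freedom. A minimal sketch of explicit time integration of the cross-flow equation m*x'' + c*x' + k*x = F(t), with the lift treated as a prescribed harmonic force rather than coupled CFD output, and with illustrative (not the paper's) parameter values tuned so the forcing frequency matches the natural frequency, i.e. lock-in resonance:

```python
import math

def viv_response(m=1.0, c=0.02, k=39.48, rho=1.0, U=1.0, D=1.0,
                 CL=0.3, fs=1.0, dt=1e-3, steps=20000):
    """Semi-implicit (symplectic) Euler integration of
    m*x'' + c*x' + k*x = 0.5*rho*U^2*D*CL*sin(2*pi*fs*t),
    standing in for the explicit scheme used in the paper.
    With k = 39.48 and m = 1 the natural frequency is ~1 Hz,
    matching fs, so the response grows toward resonance."""
    x, v = 0.0, 0.0
    history = []
    for n in range(steps):
        t = n * dt
        force = 0.5 * rho * U * U * D * CL * math.sin(2 * math.pi * fs * t)
        a = (force - c * v - k * x) / m
        v += a * dt          # update velocity first,
        x += v * dt          # then position with the updated velocity
        history.append(x)
    return history
```

With low damping the amplitude envelope grows over many cycles before saturating, the qualitative signature of lock-in noted in the abstract.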
Abstract: Segmentation of a color image composed of different kinds of regions can be a hard problem, in particular computing exact texture fields and deciding the optimum number of segmentation areas when the image contains similar and/or non-stationary texture fields. A novel neighborhood-based segmentation approach is proposed. A genetic algorithm is used in the proposed segment-pass optimization process; in this pass, an energy function defined on Markov Random Fields is minimized. In this paper we use an adaptive threshold estimation method for image thresholding in the wavelet domain, based on generalized Gaussian distribution (GGD) modeling of subband coefficients. This method, called NormalShrink, is computationally efficient and adaptive because the parameters required for estimating the threshold depend on the subband data energy used in the pre-segmentation stage. A quadtree is employed to implement the multiresolution framework, which enables the use of different strategies at different resolution levels and hence accelerates the computation. The experimental results obtained with the proposed segmentation approach are very encouraging.
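The NormalShrink threshold mentioned above has a standard closed form, sketched here on a flat list of subband coefficients (a simplification; real use operates on 2-D wavelet subbands, and the robust noise estimate conventionally comes from the finest diagonal subband):

```python
import math

def estimate_noise_sigma(hh_subband):
    """Robust noise estimate: median(|coefficients|) / 0.6745,
    conventionally computed on the finest diagonal (HH) subband."""
    s = sorted(abs(c) for c in hh_subband)
    n = len(s)
    med = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    return med / 0.6745

def normal_shrink_threshold(subband, noise_sigma, levels):
    """NormalShrink threshold T = beta * sigma_n^2 / sigma_y, where
    beta = sqrt(log(L_k / J)), L_k is the subband size, J the number
    of decomposition levels, sigma_y the subband std. deviation."""
    L_k = len(subband)
    beta = math.sqrt(math.log(L_k / levels))
    mean = sum(subband) / L_k
    var = sum((c - mean) ** 2 for c in subband) / L_k
    sigma_y = max(math.sqrt(var), 1e-12)   # guard against flat subbands
    return beta * noise_sigma ** 2 / sigma_y

def soft_threshold(c, t):
    """Soft shrinkage applied coefficient-wise."""
    return math.copysign(max(abs(c) - t, 0.0), c)
```

The appeal noted in the abstract is visible here: the threshold depends only on per-subband statistics, so it adapts without any training step.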
Abstract: In this note, we consider a family of iterative formulas for computing the weighted Minkowski inverse A_M,N in Minkowski space, and give two kinds of iterations together with the necessary and sufficient conditions for their convergence.
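As a flavor of such iterative schemes, here is the classical Newton-Schulz iteration for a matrix inverse in pure Python; this is not the paper's weighted Minkowski iteration, only the best-known member of the same family, with the same kind of convergence condition (spectral radius of I - A*X0 below 1):

```python
def matmul(A, B):
    """Dense matrix product on nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def newton_schulz(A, X0, iters=30):
    """Newton-Schulz iteration X_{k+1} = X_k (2I - A X_k); converges
    quadratically to A^{-1} when the spectral radius of I - A X0
    is below 1 (analogous conditions govern the weighted iterations)."""
    n = len(A)
    twoI = [[2.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    X = X0
    for _ in range(iters):
        AX = matmul(A, X)
        T = [[twoI[i][j] - AX[i][j] for j in range(n)] for i in range(n)]
        X = matmul(X, T)
    return X
```

Starting from a small multiple of the identity for a diagonal test matrix, the iterates converge to the exact inverse in a handful of steps.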
Abstract: A model-based fault detection and diagnosis technique for a DC motor is proposed in this paper. Fault detection using the Kalman filter and its different variants is compared. Only incipient faults are considered in the study. The Kalman filter iterations and all the related computations required for fault detection and fault confirmation are presented. A second-order linear state-space model of the DC motor is used for this work. A comparative assessment of the estimates computed from four different observers and of their relative performance is presented.
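The core mechanism can be sketched with a scalar Kalman filter and an innovation (residual) test; the random-walk state model and threshold below are illustrative assumptions, not the paper's second-order motor model:

```python
def kalman_fault_detect(measurements, q=1e-4, r=0.1, gate=3.0):
    """Scalar Kalman filter (random-walk state model) with an
    innovation test: sample k is flagged as faulty when the squared
    normalized innovation nu^2 / S exceeds gate^2."""
    x, p = measurements[0], 1.0     # state estimate and its variance
    alarms = []
    for k, z in enumerate(measurements[1:], start=1):
        p = p + q                   # time update of error covariance
        nu = z - x                  # innovation (measurement residual)
        s = p + r                   # innovation covariance
        if nu * nu / s > gate * gate:
            alarms.append(k)
        K = p / s                   # Kalman gain
        x = x + K * nu              # measurement update
        p = (1 - K) * p
    return alarms
```

Feeding a signal with an abrupt bias (a crude stand-in for an incipient fault) triggers the alarm exactly at the fault onset, since the innovation suddenly exceeds its predicted covariance.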
Abstract: In this paper we propose a new approach to constructing the Delaunay triangulation, and an optimal algorithm for the case of multidimensional spaces (d ≥ 2). Analyzing the current state of the art, one may conclude that the ideas behind the existing efficient algorithms developed for d ≥ 2 are not easy to generalize to the multidimensional case without loss of efficiency. To solve this problem, we offer an effective algorithm that satisfies all the given requirements. The theoretical complexity of the problem, however, cannot be improved, since worst-case optimality has been proved for algorithms solving it.
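The geometric primitive at the heart of every Delaunay construction, in any dimension, is the in-sphere predicate; a sketch of its 2-D form (the abstract does not give the paper's algorithm, so this only illustrates the shared building block):

```python
def incircle(a, b, c, d):
    """Sign of the incircle determinant: positive iff point d lies
    strictly inside the circumcircle of triangle (a, b, c), assuming
    a, b, c are given in counter-clockwise order.  This predicate
    drives edge flips in incremental Delaunay construction; its
    d-dimensional analogue is a (d+2)x(d+2) determinant."""
    ax, ay = a[0] - d[0], a[1] - d[1]
    bx, by = b[0] - d[0], b[1] - d[1]
    cx, cy = c[0] - d[0], c[1] - d[1]
    return (
        (ax * ax + ay * ay) * (bx * cy - cx * by)
      - (bx * bx + by * by) * (ax * cy - cx * ay)
      + (cx * cx + cy * cy) * (ax * by - bx * ay)
    )
```

In production code this test is done with exact or adaptive-precision arithmetic, since floating-point sign errors here corrupt the triangulation.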
Abstract: High-purity hydrogen and the valuable by-product carbon nanotubes (CNTs) can be produced by catalytic decomposition of methane. The methane conversion and the properties of the CNTs are determined by the choice of catalyst and the conditions of the decomposition reaction. In this paper, Ni/MgO and Ni/O-D (oxidized diamond) catalysts were prepared by the wetness impregnation method. The effects of reaction temperature and methane space velocity on the methane conversion were investigated in a fixed bed. The surface area, structure, and micrography were characterized with BET, XPS, SEM, and EDS techniques. The results showed that the conversion of methane was above 8% within 150 min (T = 500 °C) for the 33Ni/O-D catalyst and higher than 25% within 120 min (T = 650 °C) for the 41Ni/MgO catalyst. The initial conversion increased with increasing decomposition temperature, but the catalytic activity decreased rapidly at too high a temperature. Decreasing the space velocity of methane promoted the methane conversion, but did not favor the hydrogen yield. The morphology of the carbon resulting from methane decomposition depended on the support type and the catalytic reaction conditions: it appeared as fibers on the surface of Ni/O-D at relatively low temperatures such as 500 and 550 °C, but as grains stacked on and overlaying the surface of the metallic nickel at 650 °C. Carbon fibers could form on the Ni/MgO surface at 650 °C, and the diameter of the carbon fibers increased with decreasing space velocity.
Abstract: One of the aims of the paper is to compare experimental results with numerical simulation for a side cooler. Specifically, the compared quantity was the amount of air delivered by the side cooler with its fans running at 100%. This integral value was measured and evaluated within a plane parallel to the front side of the side cooler, at a distance of 20 mm from it. The flow field extending from the side cooler into the space was also evaluated. Another objective was to assess the contribution of the evaluated values to the increase of data center energy consumption.
Abstract: This paper studies a methodology for building the knowledge needed to plan adequate punches in order to complete the task of strip layout for shearing processes using progressive dies. The proposed methodology uses die design rules and the characteristics of different types of punches to classify them into five groups: prior use (the punches must be used first), posterior use (must be used last), compatible use (may be used together), sequential use (certain punches must precede some others), and simultaneous use (must be used together). With these five groups of punches, the search space of feasible designs is greatly reduced, and superimposition becomes a more effective method of punch layout. Since the superimposition scheme generates many feasible solutions, an evaluation function based on the number of stages, moment balancing, and strip stability is developed to help designers find better solutions.
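The prior-use, posterior-use, and sequential-use groups above are precedence constraints, so one way to obtain a feasible stage order is a topological sort; a sketch under that assumption (the punch names are hypothetical, and the paper's actual superimposition scheme is richer than this):

```python
from collections import deque

def stage_order(punches, prior, posterior, sequential):
    """Kahn topological sort over stage-order constraints: every
    prior-use punch precedes all non-prior punches, every
    posterior-use punch follows all non-posterior punches, and each
    pair (a, b) in `sequential` forces a before b.  Returns a
    feasible ordering, or None if the constraints conflict."""
    edges = set(sequential)
    for p in prior:
        edges |= {(p, q) for q in punches if q != p and q not in prior}
    for p in posterior:
        edges |= {(q, p) for q in punches if q != p and q not in posterior}
    indeg = {p: 0 for p in punches}
    out = {p: [] for p in punches}
    for a, b in edges:
        out[a].append(b)
        indeg[b] += 1
    queue = deque(sorted(p for p in punches if indeg[p] == 0))
    order = []
    while queue:
        p = queue.popleft()
        order.append(p)
        for q in sorted(out[p]):
            indeg[q] -= 1
            if indeg[q] == 0:
                queue.append(q)
    return order if len(order) == len(punches) else None
```

Conflicting constraints (a cycle) are reported as infeasible rather than silently producing a bad layout.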
Abstract: The explosion of interest in online gaming and
virtual worlds is leading many universities to investigate
possible educational applications of the new environments.
In this paper we explore the possibilities of 3D online worlds
for teacher education, particularly the field experience
component. Drawing upon two pedagogical examples, we
suggest that virtual simulations may, with certain limitations,
create safe spaces that allow preservice teachers to adopt
alternate identities and interact safely with the “other.” In so
doing they may become aware of the constructed nature of
social categories and gain the essential pedagogical skill of
perspective-taking. We suggest that, ultimately, preservice teachers' ability to be the principal creators of themselves in virtual environments can increase their ability to do the same in the real world.
Abstract: There is a lack of understanding of the indoor climate of Malaysian residential buildings. The assumption that a traditional house provides the best indoor environment is too good to be true. This research investigates the indoor environment of three types of Malaysian residential buildings; thermo recorders (TR72Ui) were placed in indoor spaces for measurement. There are large differences in indoor environment between the housing types, and the building material helps to control the indoor climate. The indoor climate of the traditional house was similar to the outdoor climate. The temperature in the bedroom of the terrace and town houses was slightly higher than in the living room. The indoor temperature was 2 °C lower in the rainy season than in the hot season. Indoor humidity was harder to control in the traditional house than in the terrace and town houses. In conclusion, the town house provides the best thermal environment to the building occupants, and it can be improved further with good roof insulation.
Abstract: This study searched for a desirable direction for sidewalk planning in Korea by establishing the concepts of walking and pedestrian space and analyzing advanced precedents at home and abroad. Based on the precedent studies and the relevant laws, regulations, and systems, it followed this sequential process: first, design elements were derived from the functions and characteristics of sidewalks, similar elements were clustered by their characteristics, and representative characteristics were sampled and arranged hierarchically; then, their significance was analyzed via a first questionnaire survey, and the relative weights and priorities of the elements were analyzed via the Analytic Hierarchy Process (AHP); finally, based on the analysis results, a framework was established for suggesting policy directions to improve the pedestrian environment of sidewalks in urban commercial districts, for the future planning and design of pedestrian space.
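The AHP weighting step mentioned above is commonly approximated by the geometric mean of each row of the pairwise-comparison matrix; a sketch under that assumption (the paper may instead use the exact principal-eigenvector method, and the sample matrix is illustrative):

```python
import math

def ahp_weights(M):
    """Relative weights from an AHP pairwise-comparison matrix using
    the row geometric-mean (logarithmic least squares) approximation
    of the principal eigenvector; M[i][j] states how strongly
    criterion i is preferred over criterion j."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]
```

For a 2x2 matrix the approximation is exact: judging one element three times as important as the other yields weights of 0.75 and 0.25.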
Abstract: Automatic reusability appraisal can help in evaluating the quality of developed or developing reusable software components and in identifying reusable components in existing legacy systems, which can save the cost of developing software from scratch. However, the issue of how to identify reusable components in existing systems has remained relatively unexplored. In this paper, we present a two-tier approach that studies the structural attributes of a component as well as its usability or relevancy to a particular domain. Latent semantic analysis is used for the feature-vector representation of various software domains. It exploits the fact that FeatureVector codes can be seen as documents containing terms (the identifiers present in the components), so text modeling methods that capture co-occurrence information in low-dimensional spaces can be used. Further, we devised a Neuro-Fuzzy hybrid Inference System, which takes structural metric values as input and calculates the reusability of the software component. A decision tree algorithm is used to decide the initial set of fuzzy rules for the Neuro-Fuzzy system. The results obtained are convincing enough to propose the system for economical identification and retrieval of reusable software components.
Abstract: The number of documents being created increases at an accelerating pace, while most of them belong to already known topics and few introduce new concepts. This fact has started a new era in the information retrieval discipline, in which the requirements have their own specialties: digging into topics and concepts and finding subtopics or relations between topics. Until now, IR research has focused on retrieving documents about a general topic or on clustering documents under generic subjects. These conventional approaches, however, cannot go deep into the content of documents, which makes it difficult for people to reach the right documents they are searching for. So we need new ways of mining document sets, where the critical point is to know much about the content of the documents. As a solution, we propose to enhance LSI, one of the proven IR techniques, by supporting its vector space with n-gram forms of words. The positive results we have obtained are shown in two different application areas of the IR domain: querying a document database and clustering the documents in the document database.
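The n-gram enrichment step can be sketched as follows; this shows character n-gram extraction and a plain cosine similarity on raw counts, an assumption standing in for the full LSI pipeline (which would additionally build the term-document matrix and apply SVD):

```python
from collections import Counter
import math

def char_ngrams(text, n=3):
    """Character n-grams per word; '#' padding makes word boundaries
    visible, so morphological variants share most of their grams."""
    grams = []
    for word in text.lower().split():
        padded = f"#{word}#"
        grams += [padded[i:i + n] for i in range(len(padded) - n + 1)]
    return grams

def cosine(a, b):
    """Cosine similarity between two bags of n-grams."""
    va, vb = Counter(a), Counter(b)
    dot = sum(va[g] * vb[g] for g in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0
```

The benefit over whole-word terms is that inflected forms overlap heavily in n-gram space, so a query can match documents even when the exact word form differs.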
Abstract: Recent changes in food retailing structure have led to the development of large supercenters in suburban areas of the United States. These changes have led some authors to suggest that there are food deserts in some urban areas, where food is difficult to access, especially for disadvantaged consumers. This study tests the food desert hypothesis by comparing the distance from food retailers to food secure and food insecure households in one urban, Midwest neighborhood. This study utilizes GIS to compare household survey respondent locations against the location of various types of area food retailers. Results of this study indicate no apparent difference between food secure and insecure households in the reported importance of distance on the decision to shop at various retailers. However, there were differences in the spatial relationship between households and retailers. Food insecure households tended to be located slightly farther from large food retailers and slightly closer to convenience stores. Furthermore, food insecure households reported traveling slightly farther to their primary food retailer. The differences between the two groups were, however, relatively small.
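The household-to-retailer distance comparison in such GIS analyses typically rests on a great-circle distance primitive; a sketch assuming the standard haversine formula (the study's actual GIS toolchain and any network-distance refinements are not specified in the abstract):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points
    (haversine formula), the usual primitive behind point-to-point
    household-to-retailer distance comparisons."""
    r = 6371.0                       # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

Straight-line distance understates real travel distance along street networks, which is one reason such studies often complement it with network analysis.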
Abstract: Logic-based methods for learning from structured data are limited with respect to handling large search spaces, preventing large-sized substructures from being considered by the resulting classifiers. A
novel approach to learning from structured data is introduced that
employs a structure transformation method, called finger printing, for
addressing these limitations. The method, which generates features
corresponding to arbitrarily complex substructures, is implemented in
a system, called DIFFER. The method is demonstrated to perform
comparably to an existing state-of-the-art method on some benchmark
data sets without requiring restrictions on the search space.
Furthermore, learning from the union of features generated by finger
printing and the previous method outperforms learning from each
individual set of features on all benchmark data sets, demonstrating
the benefit of developing complementary, rather than competing,
methods for structure classification.
Abstract: The roll center is one of the key parameters for designing a suspension. Several driving characteristics are affected significantly by the migration of the roll center during the suspension's motion. The strut/SLA (strut/short-long-arm) suspension, which is widely used in production cars, combines the space-saving characteristics of a MacPherson strut suspension with some of the preferred handling characteristics of an SLA suspension. In this study, a front strut/SLA suspension is modeled in ADAMS/Car software. Kinematic roll analysis is then employed to investigate how the rolling characteristics change under wheel travel and steering input. The related parameters, including the roll center height, roll camber gain, toe change, scrub radius, and wheel track width change, are analyzed and discussed. It is found that the strut/SLA suspension clearly has a higher roll center than the strut and SLA suspensions do. The variations in the roll center height under roll analysis are very different when wheel travel displacement and steering angle are added. The results for the roll camber gain, scrub radius, and wheel track width change are considered satisfactory. However, the toe change is too large and needs fine-tuning through a sensitivity analysis.