Abstract: In this paper, we study finite projective Hjelmslev planes M(Zq) coordinatized by the Hjelmslev ring Zq (where q = p^k is a prime power). We obtain finite hyperbolic Klingenberg planes from these planes under certain conditions. We also give a combinatorial result on M(Zq) concerning the deletion of a line from the lines in the same neighbourhood.
Abstract: In this paper we propose a novel Run Time Interface
(RTI) technique to provide an efficient environment for MPI jobs on
the heterogeneous architecture of PARAM Padma. It suggests an
innovative, unified framework for the job management interface
system in parallel and distributed computing. This approach employs
a proxy scheme. The implementation shows that the proposed RTI is
highly scalable and stable. Moreover, the RTI provides storage access
for MPI jobs on various operating system platforms and improves
data access performance through the high performance C-DAC
Parallel File System (C-PFS). The performance of the RTI is
evaluated using standard HPC benchmark suites, and the
simulation results show that the proposed RTI gives good
performance on large-scale supercomputing systems.
Abstract: Bumpers play an important role in preventing the
impact energy from being transferred to the automobile and
passengers. Absorbing the impact energy in the bumper and releasing
it to the environment reduces damage to the automobile and its
passengers.
The goal of this paper is to design a bumper with minimum weight
by employing Glass Mat Thermoplastic (GMT) materials.
This bumper either absorbs the impact energy with its deformation or
transfers it perpendicular to the impact direction.
To reach this aim, a mechanism is designed to convert about 80%
of the kinetic impact energy to the spring potential energy and
release it to the environment at low impact velocities, in accordance
with the American standard. In addition, since the residual kinetic
energy is damped by the infinitesimal elastic deformation of the
bumper elements, the passengers will not sense any impact. It should
be noted that in this paper the modeling, solving and results analysis
are done in the CATIA, LS-DYNA and ANSYS V8.0 software,
respectively.
Abstract: True stress-strain curve of railhead steel is required to
investigate the behaviour of railhead under wheel loading through elasto-plastic Finite Element (FE) analysis. To reduce the rate of wear, the railhead material is hardened through annealing and
quenching. The Australian standard rail sections are not fully hardened and hence suffer from a non-uniform distribution of
material properties; the use of average properties in the FE modelling can potentially induce error in the predicted plastic strains. Coupons
obtained at varying depths of the railhead were, therefore, tested under axial tension and the strains were measured using strain gauges as well as an image analysis technique known as Particle Image Velocimetry (PIV). The head-hardened steel exhibits three distinct zones of yield strength; the yield strength, as a ratio of the average yield strength provided in the standard (σyr = 780 MPa), and
the corresponding depth, as a ratio of the head-hardened zone along
the axis of symmetry, are as follows: (1.17 σyr, 20%), (1.06 σyr, 20%-80%) and (0.71 σyr, > 80%). The stress-strain curves exhibit a limited plastic zone, with fracture occurring at strains less than 0.1.
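As an illustration, the three reported yield-strength zones can be encoded as a simple lookup. This is a sketch based only on the figures quoted above; the function name and boundary handling are our own assumptions:

```python
def yield_strength_ratio(depth_fraction):
    """Yield strength as a multiple of the standard average value
    (sigma_yr = 780 MPa), by normalized depth through the head-hardened
    zone, following the three zones reported in the abstract.

    Note: treatment of the exact zone boundaries is an assumption."""
    if depth_fraction <= 0.20:
        return 1.17
    elif depth_fraction <= 0.80:
        return 1.06
    else:
        return 0.71

SIGMA_YR = 780.0  # MPa, average yield strength from the standard
# Near-surface material is strongest: 1.17 * 780 = 912.6 MPa
surface_strength = yield_strength_ratio(0.1) * SIGMA_YR
```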
Abstract: This paper presents a wavelet transform and Support
Vector Machine (SVM) based algorithm for estimating fault location
on transmission lines. The discrete wavelet transform (DWT) is used
for data pre-processing, and the processed data are used for training
and testing the SVM. Five mother wavelets are used for signal
processing to identify the wavelet family most appropriate for
estimating fault location. The results demonstrate the ability of the SVM
to generalize the situation from the provided patterns and to
accurately estimate the location of faults with varying fault resistance.
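As an illustrative sketch of the pre-processing step, the following numpy-only code computes a single-level Haar DWT; the Haar family is one typical candidate mother wavelet, and the toy signal and feature choice are assumptions, not the paper's data:

```python
import numpy as np

def haar_dwt(signal):
    """Single-level Haar discrete wavelet transform.

    Returns (approximation, detail) coefficient arrays; the detail
    coefficients highlight abrupt changes such as fault transients."""
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:                     # pad to even length
        x = np.append(x, x[-1])
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

# Toy signal with a step change mimicking a fault transient
sig = np.array([1.0, 1.0, 1.0, 5.0, 5.0, 5.0, 5.0, 5.0])
a, d = haar_dwt(sig)
# The detail-coefficient energy concentrates where the signal changes
# abruptly; such features could then feed an SVM for fault location
features = np.abs(d)
```

In the paper's pipeline these wavelet coefficients would form the input patterns for SVM training and testing.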
Abstract: Modeling complex dynamic systems, for which mathematical
models are very complicated to establish, requires new and
modern methodologies that will exploit the existing expert
knowledge, human experience and historical data. Fuzzy cognitive
maps are very suitable, simple, and powerful tools for simulation and
analysis of these kinds of dynamic systems. However, human experts
are subjective and can handle only relatively simple fuzzy cognitive
maps; therefore, there is a need to develop new approaches for the
automated generation of fuzzy cognitive maps using historical data.
In this study, a new learning algorithm, called Big Bang-Big
Crunch, is proposed for the first time in the literature for the
automated generation of fuzzy cognitive maps from data. Two real-world
examples, namely a process control system and a radiation therapy
process, and one synthetic model are used to emphasize the
effectiveness and usefulness of the proposed methodology.
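A minimal sketch of how a fuzzy cognitive map is simulated may clarify what such a learning algorithm must produce. The weight matrix below is hypothetical and the sigmoid update rule is one common choice, not necessarily the paper's exact formulation:

```python
import numpy as np

def fcm_step(state, weights):
    """One synchronous update of a fuzzy cognitive map.

    state   : concept activation vector, values in [0, 1]
    weights : weights[i, j] = causal influence of concept i on concept j
    Uses the common sigmoid threshold f(x) = 1 / (1 + exp(-x)) and keeps
    the previous activation as a memory term in the argument."""
    raw = state + state @ weights
    return 1.0 / (1.0 + np.exp(-raw))

def simulate(state, weights, steps=20, tol=1e-5):
    """Iterate the map until it settles into a fixed point or steps run out."""
    for _ in range(steps):
        new = fcm_step(state, weights)
        if np.max(np.abs(new - state)) < tol:
            break
        state = new
    return state

# Hypothetical 3-concept map: concept 0 promotes concept 1, concept 1
# inhibits concept 2. A learning algorithm such as Big Bang-Big Crunch
# would estimate these weights automatically from historical data.
W = np.array([[0.0, 0.6,  0.0],
              [0.0, 0.0, -0.8],
              [0.0, 0.0,  0.0]])
final = simulate(np.array([0.8, 0.3, 0.5]), W)
```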
Abstract: The log periodogram regression is widely used in empirical
applications because of its simplicity (only a least squares
regression is required to estimate the memory parameter d), its good
asymptotic properties, and its robustness to misspecification of the
short-term behavior of the series. However, the asymptotic distribution
is a poor approximation of the (unknown) finite sample distribution
if the sample size is small. Here the finite sample performance of different
nonparametric residual bootstrap procedures is analyzed when
applied to construct confidence intervals. In particular, in addition to
the basic residual bootstrap, the local and block bootstrap that might
adequately replicate the structure that may arise in the errors of the
regression are considered when the series shows weak dependence in
addition to the long memory component. A bias-correcting bootstrap
to adjust for the bias caused by that structure is also considered.
Finally, the performance of bootstrap-based confidence intervals for
the log periodogram regression is assessed in different types of
models, and how this performance changes as the sample size increases
is examined.
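For concreteness, the log periodogram (GPH) regression itself can be sketched as follows; the bandwidth choice m = sqrt(n) and the white-noise sanity check are our own illustrative assumptions:

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log periodogram (GPH) regression estimate of the memory parameter d.

    Regresses log I(lambda_j) on log(4 sin^2(lambda_j / 2)) over the
    first m Fourier frequencies; d is minus the OLS slope."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(np.sqrt(n))            # a common bandwidth choice
    j = np.arange(1, m + 1)
    lam = 2.0 * np.pi * j / n          # Fourier frequencies
    dft = np.fft.fft(x - x.mean())
    periodogram = np.abs(dft[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    X = np.log(4.0 * np.sin(lam / 2.0) ** 2)
    Y = np.log(periodogram)
    slope = np.polyfit(X, Y, 1)[0]     # least squares regression
    return -slope

# Sanity check on white noise, which has no long memory (true d = 0)
rng = np.random.default_rng(0)
d_hat = gph_estimate(rng.standard_normal(2048))
```

A residual bootstrap of the kind analyzed in the abstract would resample the residuals of this regression to build finite-sample confidence intervals for d.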
Abstract: The vertex configuration of a vertex in an orthogonal
pseudo-polyhedron is a characterization of the vertex determined by
the number of edges, dihedral angles, and non-manifold properties
meeting at the vertex. There are up to sixteen vertex configurations
for any orthogonal pseudo-polyhedron (OPP). Understanding the
relationship between these vertex configurations will give us insight
into the structure of an OPP and help us design better algorithms for
many 3-dimensional geometric problems. In this paper, 16 vertex
configurations for OPP are described first. This is followed by a
number of formulas giving insight into the relationship between
different vertex configurations in an OPP. These formulas
will be useful in extending the applicability of orthogonal polyhedra
to pattern analysis in 3D digital images.
Abstract: This paper presents a distributed intrusion
detection system (IDS) based on the concept of a community of
specialized distributed agents, i.e., agents sharing the same purpose
of detecting distributed attacks. The semantics of intrusion events
occurring in a predetermined network are defined. Correlation rules
describe the process by which the proposed IDS combines captured
events that are distributed both spatially and temporally; the
proposed IDS then tries to extract significant and broad patterns for
a set of well-known attacks. The primary goal of our work is to
provide intrusion
detection and real-time prevention capability against insider
attacks in distributed and fully automated environments.
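A toy sketch of spatio-temporal event correlation may make the idea concrete; the event fields, signature names, and time window below are hypothetical, not the system's actual rule language:

```python
from collections import defaultdict

def correlate(events, window=5.0):
    """Toy spatio-temporal correlation: group captured events that share
    an attack signature and fall within `window` seconds of each other,
    regardless of which sensor host reported them."""
    by_sig = defaultdict(list)
    for ev in sorted(events, key=lambda e: e["time"]):
        by_sig[ev["sig"]].append(ev)
    groups = []
    for evs in by_sig.values():
        current = [evs[0]]
        for ev in evs[1:]:
            if ev["time"] - current[-1]["time"] <= window:
                current.append(ev)      # temporally close: same incident
            else:
                groups.append(current)  # gap too large: new incident
                current = [ev]
        groups.append(current)
    return groups

# Hypothetical events reported by three different sensor agents
events = [
    {"time": 0.0,  "host": "a", "sig": "portscan"},
    {"time": 2.0,  "host": "b", "sig": "portscan"},
    {"time": 3.0,  "host": "c", "sig": "portscan"},
    {"time": 60.0, "host": "a", "sig": "portscan"},
]
clusters = correlate(events)
```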
Abstract: The purpose of this work is the development of an
automatic classification system that could assist radiologists
in the investigation of breast cancer. The software has been designed
in the framework of the MAGIC-5 collaboration.
In the automatic classification system the suspicious regions with
high probability to include a lesion are extracted from the image as
regions of interest (ROIs). Each ROI is characterized by some
features based on morphological lesion differences.
Several classifiers, namely a Feed-Forward Neural Network, a
K-Nearest Neighbours classifier and a Support Vector Machine, are
used to distinguish the
pathological records from the healthy ones.
The results obtained in terms of sensitivity (percentage of
pathological ROIs correctly classified) and specificity (percentage of
non-pathological ROIs correctly classified) are presented through
the Receiver Operating Characteristic (ROC) curve. In particular, the
best performance, an area under the ROC curve of 88% ± 1, is obtained
with the Feed-Forward Neural Network.
Abstract: Ultrafast doped zinc oxide crystals offer a good
opportunity to build new instruments for ICF fusion neutron
measurement. Two pulsed neutron detectors based on ZnO crystal
wafer have been conceptually designed, the superfast ZnO timing
detector and the scintillation recoil proton neutron detection system.
The structure of these detectors is presented, and some of their
characteristics are studied as well. The new detectors could be much
faster than existing systems and would be better suited to ICF
neutron diagnostics.
Abstract: A wireless sensor network is a multi-hop, self-configuring
wireless network consisting of sensor nodes. The deployment of
wireless sensor networks in many application areas, e.g., aggregation
services, requires self-organization of the network nodes into clusters.
An efficient way to enhance the lifetime of the system is to partition
the network into distinct clusters with a high-energy node as cluster
head. Various node clustering techniques have appeared in the
literature and roughly fall into two families: those based on the
construction of a dominating set and those based solely on
energy considerations. Energy-optimized cluster formation for a set
of randomly scattered wireless sensors is presented. Sensors within a
cluster are expected to communicate with the cluster head only. The
energy constraint and limited computing resources of the sensor nodes
present the major challenges in gathering the data. In this paper we
propose a framework to study how partially correlated data affect the
performance of clustering algorithms. The total energy consumption
and network lifetime can be analyzed by combining random geometry
techniques and rate distortion theory. We also present the relation
between compression distortion and data correlation.
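A minimal sketch of cluster formation in the second family (cluster heads chosen purely by residual energy, members joining the nearest head) might look as follows; the node data and head-selection rule are illustrative assumptions:

```python
import math

def form_clusters(nodes, k):
    """Energy-aware clustering sketch: the k highest-energy nodes become
    cluster heads and every other sensor joins its nearest head.

    nodes: list of dicts {"pos": (x, y), "energy": float}
    Returns (heads, assignment) where assignment[i] is the index of the
    head chosen for node i."""
    # Choose cluster heads by residual energy (ties broken by index)
    order = sorted(range(len(nodes)), key=lambda i: -nodes[i]["energy"])
    heads = sorted(order[:k])

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    assignment = []
    for node in nodes:
        best = min(heads, key=lambda h: dist(node["pos"], nodes[h]["pos"]))
        assignment.append(best)
    return heads, assignment

# Hypothetical field: four sensors, two of them energy-rich
field = [{"pos": (0, 0),  "energy": 5.0},
         {"pos": (1, 0),  "energy": 1.0},
         {"pos": (10, 0), "energy": 4.0},
         {"pos": (11, 0), "energy": 0.5}]
heads, members = form_clusters(field, k=2)
```

Total energy consumption under such an assignment is what the random-geometry and rate-distortion analysis in the abstract would then evaluate.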
Abstract: Value engineering is an effective tool for managerial
decision-making. Value studies offer managers a suitable instrument
to reduce life-cycle costs, improve quality, improve structure,
shorten the construction schedule, prolong service life, or achieve a
combination of these aims. The pressures placed on planners, their
accountability within their fields, and the inherent risks and
uncertainties of the alternatives place some decision-makers in a
dilemma. Given the complexity of implementing projects, the combined
use of risk management and value engineering in project control can
serve as a tool to identify and eliminate every item that causes
unnecessary cost and wasted time without harming the essential
functions of the project. It should of course be noted that applying
risk management and value engineering to improve efficiency and
function may lengthen the project implementation schedule; improving
the schedule does not always mean shortening it. This article first
deals with the concepts of risk management and value engineering. The
effects of their implementation at Iran Khodro Corporation are then
considered, together with the common features and integration of the
two approaches; finally, the proposed framework is submitted for use
in engineering and industrial projects, including those of Iran
Khodro Corporation.
Abstract: Today's children, who are born into a more colorful,
more creative, more abstract and more accessible communication
environment than their ancestors as a result of dizzying advances in
technology, have an interesting capacity to perceive and make sense
of the world. Millennium children, who live in an environment where
marketing communication efforts of all kinds are more intensive than
ever, are subject, from their early childhood on, to all kinds of
persuasive messages. As regards advertising communication, it
outperforms all the other marketing communication efforts in
creating little consumer individuals and, as a result of processing of
codes and signs, plays a significant part in building a world of seeing,
thinking and understanding for children. Children who are raised with
metaphorical expressions such as tales and riddles also encounter
this fast and effective communication of meaning in advertisements.
Children's perception of metaphors, which help them grasp the
"product and its promise" both verbally and visually and facilitate
association between them, is the subject of this study. Stimulating
and activating
imagination, metaphors have unique advantages in promoting the
product and its promise especially in regard to print advertisements,
which have certain limitations. This study deals comparatively with
both literal and metaphoric versions of print advertisements
belonging to various product groups and attempts to discover to what
extent the advertisements are liked, recalled, perceived and found
persuasive. The sample group of the study, which was conducted in
two elementary schools situated in areas that had different socioeconomic
features, consisted of children aged 12.
Abstract: Recently the Organic Rankine Cycle (ORC) has attracted
much attention due to its potential for reducing consumption of
fossil fuels and its favorable characteristics for exploiting
low-grade heat sources.
In this work the thermodynamic performance of the ORC with
superheating of the vapor is comparatively assessed for various
working fluids. Special
attention is paid to the effects of system parameters such as the evaporating
temperature and the turbine inlet temperature on the characteristics
of the system such as maximum possible work extraction from
the given source, volumetric flow rate per 1 kW of net work and
quality of the working fluid at turbine exit as well as thermal and
exergy efficiencies. Results show that for a given source the thermal
efficiency increases with decrease of the superheating but exergy
efficiency may have a maximum value with respect to the superheating
of the working fluid. Results also show that in selecting a working
fluid it is necessary to consider various performance criteria as
well as thermal efficiency.
Abstract: The aim of the present study was to analyze and
distinguish playing patterns between winning and losing field hockey
teams in the Delhi 2012 tournament. The analysis focuses on D
penetration (right, center, left) and links D penetration to the end
shots made from it. The data were recorded and analyzed using
Sportscode elite computer software. 12 matches from the tournament
were analyzed. Two groups of performance indicators were used in the
analysis, namely D penetration right, center, and left. The types of
shot considered were hit, push, flick, drag, drag flick, deflect
sweep, deflect push, scoop, sweep, and reverse hit. In distinguishing
the pattern of play between winning and losing, only 2 performance
indicators showed highly significant differences from the right
(Z=-2.87, p=.004,
p
Abstract: We have proposed an information filtering system
using index word selection from a document set based on the
topics included in a set of documents. This method narrows
down the particularly characteristic words in a document set
and the topics are obtained by Sparse Non-negative Matrix
Factorization. In information filtering, a document is often
represented with the vector in which the elements correspond
to the weight of the index words, and the dimension of the
vector becomes larger as the number of documents is
increased. Therefore, it is possible that words that are useless as
index words for the information filtering are included. In order to
address the problem, the dimension needs to be reduced. Our
proposal reduces the dimension by selecting index words
based on the topics included in a document set. We have
applied the Sparse Non-negative Matrix Factorization to the
document set to obtain these topics. The filtering is carried out
based on a centroid of the learning document set. The centroid
is regarded as the user's interest. In addition, the centroid is
represented with a document vector whose elements consist of
the weight of the selected index words. Using the English test
collection MEDLINE, we confirm the effectiveness of our proposal.
Our proposed selection improves recommendation accuracy over
previous methods when an appropriate number of index words is
selected. In addition, we examined the selected index words and
found that our proposal was able to select index words covering
some minor topics included in the document set.
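A minimal numpy sketch of sparse NMF with an L1 penalty, and of selecting index words from the resulting topic matrix, is given below; the toy term-document matrix, penalty weight, and update rule are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def sparse_nmf(V, k, sparsity=0.1, iters=200, seed=0):
    """Minimal sparse NMF: V (terms x docs) ~ W (terms x topics) @ H
    (topics x docs), via multiplicative updates with an L1 penalty on H
    to encourage sparse topic usage."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 1e-3
    H = rng.random((k, m)) + 1e-3
    eps = 1e-9
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + sparsity + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def top_index_words(W, topic, n_words=2):
    """Indices of the highest-weight terms for one topic; these would be
    kept as index words, reducing the document-vector dimension."""
    return list(np.argsort(W[:, topic])[::-1][:n_words])

# Toy term-document matrix with two obvious topics (terms 0-1 vs 2-3)
V = np.array([[3, 3, 0, 0],
              [2, 2, 0, 0],
              [0, 0, 4, 4],
              [0, 0, 1, 1]], dtype=float)
W, H = sparse_nmf(V, k=2)
```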
Abstract: This paper presents a 2-D hydrodynamic model of the plasma ablated when irradiating a 50 μm Al solid target with a single pulsed ion beam. The Lagrange method is used to solve the moving fluid for the ablated plasma production and formation mechanism. In the calculations, a single 10-ns ion beam pulse with a total energy density of 120 J/cm2 is used. The results show that the ablated plasma is formed after 2 ns of ion beam irradiation and starts to expand after 4-6 ns. In addition, the 2-D model gives a clearer understanding of the production and expansion of the ablated plasma in pulsed ion beam-solid target interaction.
Abstract: Text similarity measurement is a fundamental issue in
many textual applications such as document clustering, classification,
summarization and question answering. However, prevailing approaches
based on the Vector Space Model (VSM) more or less suffer
from the limitation of Bag of Words (BOW), which ignores the semantic
relationship among words. Enriching document representation
with background knowledge from Wikipedia is proven to be an effective
way to solve this problem, but most existing methods still
cannot avoid similar flaws of BOW in a new vector space. In this
paper, we propose a novel text similarity measurement which goes
beyond VSM and can find semantic affinity between documents.
Specifically, it is a unified graph model that exploits Wikipedia as
background knowledge and synthesizes both document representation
and similarity computation. The experimental results on two different
datasets show that our approach significantly improves VSM-based
methods in both text clustering and classification.
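The BOW limitation the paper addresses can be made concrete with a small sketch of the VSM baseline it improves upon; the example sentences are our own:

```python
import math
from collections import Counter

def cosine_bow(doc_a, doc_b):
    """Cosine similarity under the Bag-of-Words model: each document is
    a term-frequency vector, so semantically related words that share
    no surface form contribute nothing to the score."""
    va, vb = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    norm = math.sqrt(sum(c * c for c in va.values())) * \
           math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

# BOW's blind spot: related sentences with disjoint vocabularies
s1 = "the automobile runs on gasoline"
s2 = "a car consumes petrol"
baseline = cosine_bow(s1, s2)   # 0.0 despite clear semantic affinity
```

A graph model over Wikipedia background knowledge, as proposed in the abstract, is intended to recover exactly the semantic affinity that this baseline misses.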
Abstract: Customer-supplier collaboration enables firms to
achieve greater success than acting independently. Nevertheless, not
many firms have fully utilized the potential of collaboration. This
paper presents organizational and human related success factors for
collaboration in manufacturing supply chains in casting industry. Our
research approach was a case study including multiple cases. Data
was gathered by interviews and group discussions in two different
research projects. In the first research project we studied seven firms
and in the second five. It was found that the success factors are
interrelated, in other words, organizational and human factors
together enable success, but none of them does so alone. Among the
success factors found are a culture of honoring agreements and speed
in informing the partner about changes affecting the product or the
delivery chain.