Abstract: In this paper, a new time-discontinuous expanded mixed finite element method is proposed and analyzed for second-order convection-dominated diffusion problems. Proofs of the stability of the proposed scheme and of the uniqueness of the discrete solution are given. Moreover, error estimates for the scalar unknown, its gradient, and its flux in the L1(J̄, L2(Ω))-norm are obtained.
Abstract: Distributed denial-of-service (DDoS) attacks pose a
serious threat to network security. Many methodologies and tools have been devised to detect DDoS attacks and reduce the damage they cause. Still, most of these methods cannot simultaneously achieve (1) efficient detection with few false alarms and (2) real-time transfer of packets. Here, we introduce
a method for proactive detection of DDoS attacks, by classifying the
network status, to be utilized in the detection stage of the proposed
anti-DDoS framework. Initially, we analyse the DDoS architecture
and obtain details of its phases. Then, we investigate the procedures
of DDoS attacks and select variables based on these features. Finally,
we apply the k-nearest neighbour (k-NN) method to classify the
network status into each phase of a DDoS attack. The simulation results showed that each phase of the attack scenario was classified well, and the DDoS attack could be detected at an early stage.
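The classification step described above can be sketched with a minimal majority-vote k-NN. The feature set (packet rate, source-IP entropy) and the phase labels below are hypothetical illustrations, not the paper's actual variables:

```python
import numpy as np

def knn_classify(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    d = np.linalg.norm(train_X - x, axis=1)       # Euclidean distances
    nearest = np.argsort(d)[:k]                   # indices of the k closest
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy training data: rows are (packet rate, source-IP entropy) features;
# labels are assumed DDoS phases: 0 = normal, 1 = scanning, 2 = flooding.
train_X = np.array([[1.0, 0.2], [1.2, 0.3],    # normal traffic
                    [3.0, 2.5], [3.2, 2.7],    # scanning phase
                    [9.0, 4.0], [9.5, 4.2]])   # flooding phase
train_y = np.array([0, 0, 1, 1, 2, 2])

print(knn_classify(train_X, train_y, np.array([9.2, 4.1])))  # → 2
```

A query near the flooding cluster is assigned phase 2, mirroring how the network status would be mapped to an attack phase.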
Abstract: In this paper, the optimum weight and cost of a laminated composite plate are sought while it sustains the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), an optimization technique operating directly on real variables, was employed. However, since optimization via GAs is a lengthy process, most of which is consumed by the analysis, a Radial Basis Function Neural Network (RBFNN) was employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
Abstract: Global sustainability has become a subject of international attention, chiefly because of global climate change. Abnormal climate events and disasters have multiplied in frequency, and catastrophes continue to occur throughout the world. Many important international conferences and policies now take "global environmental sustainability" and "human health" as development goals, including the APEC 2007 Sydney Declaration on climate and clean energy, the 2008 World Economic Forum's "Cool Earth" energy-efficiency promotion project, the EU's "Green Idea" program, Japan's annual policy for a low-carbon society and sustainable eco-cities (Eco City), the "Eco-Point" program promoted from 2009 to 2010 to encourage green-energy and carbon-reduction products, and the 2010 United Nations Climate Change Conference (COP16). Worldwide, the agenda has shifted from the defensive, conservative themes of "environmental protection" and "saving energy" toward the positive responses of "sustainability" and "LOHAS". Taiwan has actively put forward eco-cities, green buildings, green building materials, and other related environmental measures; green building materials in particular are the foundation of the built environment, have the widest level of application, are in direct contact with humans, and are key to health and a sustainable planet.
"Sustainable development" is a necessary condition for the continuation of the Earth, "health and comfort" is a necessary condition for the continuation of life, and improved "quality" is a necessary condition for economic development; the balance among the three is eco-efficiency. According to the World Business Council for Sustainable Development (WBCSD), environmental efficiency (Eco-Efficiency) is achieved by providing competitively priced goods and services that satisfy human needs and improve quality of life, while the environmental impact and resource intensity of those goods and services, throughout their life cycle, are progressively reduced to a level the Earth can sustain; it unites the "Economic" and the "Ecologic". This research takes products carrying the Taiwan Green Building Material Label as its scope and, through investigation and weight analysis, explores the green building material environmental load (Ln) factors and green building material quality (Qn) factors in order to establish a green building material eco-efficiency assessment model (GBM Eco-Efficiency). Healthy green-label building materials are the priority assessment objects, with flooring products, which respond directly to environmental load, as the initial material set; explicit feedback then corrects the GBM Eco-Efficiency assessment model. Taking "efficiency" as the starting point, the aim is a win-win strategy balancing human "health" and the Earth's "sustainable development". The study is expected to (1) establish an assessment system for the quality and environmental impact of green building materials, (2) establish the GBM Eco-Efficiency value model, and (3) establish feedback mechanisms for applying the GBM Eco-Efficiency model to green building materials.
Abstract: The authors have been developing several models
based on artificial neural networks, linear regression models, Box-
Jenkins methodology and ARIMA models to predict the time series
of tourism. The time series consists of the "Monthly Number of Guest Nights in the Hotels" of one region. Several comparisons between the different types of models have been carried out, as well as of the features used as inputs to the models. The Artificial Neural Network (ANN) models have consistently performed among the best. Usually the feed-forward architecture was used, owing to its widespread application and good results. In this paper the authors compare different ANN architectures using exactly the same inputs. Thus, the traditional feed-forward architecture, the cascade-forward architecture, a recurrent Elman architecture, and a radial basis architecture are discussed and compared on the task of predicting the mentioned time series.
Abstract: The Navier–Stokes equations for unsteady, incompressible, viscous fluids in the axisymmetric coordinate system are solved using a control volume method. The volume-of-fluid (VOF) technique is used to track the free surface of the liquid. Model predictions are in good agreement with experimental measurements. It is found that the dynamic processes after impact are sensitive to the initial droplet velocity and the liquid pool depth. The time evolution of the crown height and diameter is obtained by numerical simulation. The critical Weber number for splashing (Wecr) is studied for Ohnesorge (Oh) numbers in the range of 0.01 to 0.1; the results compare well with those of the experiments.
Abstract: This paper takes the actual scene of Aletheia
University campus – the Class 2 national monument, the first
educational institute in northern Taiwan as an example, to present a
3D virtual navigation system which supports user positioning and
pre-download mechanism. The proposed system was designed based on the principle of Voronoi diagrams to divide the virtual scenes and their multimedia information, combining outdoor GPS positioning with indoor RFID location detection. When users carrying mobile devices such as notebook computers, UMPCs, or EeePCs walk around the indoor and outdoor areas of the campus, the system automatically detects their moving path and pre-downloads the needed data, so that users experience smooth, seamless navigation without waiting.
Abstract: We consider a single-echelon, single-item inventory
system where both demand and lead-time are stochastic. Continuous
review policy is used to control the inventory system. The objective
is to calculate the reorder point under stochastic parameters. A case study from a Neonatal Intensive Care Unit is presented.
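The reorder-point calculation under independent stochastic demand and lead time admits a standard textbook form, sketched below. The supply-item figures and service level are hypothetical, not taken from the case study:

```python
import math

def reorder_point(mu_d, sigma_d, mu_l, sigma_l, z):
    """Reorder point when both daily demand and lead time are stochastic.

    Demand during lead time has mean mu_d * mu_l and variance
    mu_l * sigma_d**2 + mu_d**2 * sigma_l**2 (independent demand and
    lead time); z sets the safety-stock service level.
    """
    mean_dl = mu_d * mu_l
    sd_dl = math.sqrt(mu_l * sigma_d**2 + mu_d**2 * sigma_l**2)
    return mean_dl + z * sd_dl

# Hypothetical NICU supply item: demand 5 units/day (sd 2),
# lead time 4 days (sd 1), 95% service level (z ≈ 1.645).
rop = reorder_point(mu_d=5, sigma_d=2, mu_l=4, sigma_l=1, z=1.645)
print(round(rop, 1))  # → 30.5
```

Under continuous review, an order would be placed whenever the inventory position drops to this level.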
Abstract: This paper aims to describe a delay-based end-to-end (e2e) congestion control algorithm, called Very FAST TCP
(VFAST), which is an enhanced version of FAST TCP. The main
idea behind this enhancement is to smoothly estimate the Round-Trip
Time (RTT) based on a nonlinear filter, which eliminates throughput
and queue oscillation when RTT fluctuates. In this context, an evaluation
of the suggested scheme through simulation is introduced, by
comparing our VFAST prototype with FAST in terms of throughput,
queue behavior, fairness, stability, RTT and adaptivity to changes in
the network. The achieved simulation results indicate that the suggested protocol offers better performance than FAST TCP in terms of RTT estimation and throughput.
Abstract: The paper deals with the determination of the electromagnetic and temperature field distribution of an induction heating system used for pipe brazing. The problem is treated as coupled: a time-harmonic electromagnetic field and a transient thermal field. It has been
solved using finite element method. The detailed maps of
electromagnetic and thermal field distribution have been obtained.
A good understanding of the processes in the considered system opens possibilities for control, management, and increasing the efficiency of the brazing process.
Abstract: Developing techniques for mobile robot navigation constitutes one of the major trends in the current
research on mobile robotics. This paper develops a local
model network (LMN) for mobile robot navigation. The
LMN represents the mobile robot by a set of locally valid
submodels that are Multi-Layer Perceptrons (MLPs).
Training these submodels employs the Back Propagation (BP) algorithm. The paper proposes using fuzzy C-means (FCM) clustering in this scheme to divide the input space into subregions; a submodel (MLP) is then identified to represent each particular region. The submodels are then combined in a unified structure. In the run-time phase, Radial Basis Functions (RBFs) are employed as validity windows for the activated submodels. The proposed structure overcomes the problem of the changing operating regions of mobile robots. Real data are used in all experiments. Results for mobile robot navigation using the proposed LMN reflect the soundness of the proposed scheme.
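The run-time blending of submodels through RBF windows can be sketched as follows. The Gaussian window width, the cluster centers standing in for FCM output, and the constant submodels standing in for the trained MLPs are illustrative assumptions only:

```python
import numpy as np

def rbf_weights(x, centers, width):
    """Gaussian validity windows centered on the cluster centers."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    w = np.exp(-d2 / (2 * width**2))
    return w / w.sum()                 # normalize so weights sum to 1

def lmn_output(x, centers, submodels, width=1.0):
    """Blend locally valid submodels with their RBF window weights."""
    w = rbf_weights(x, centers, width)
    outputs = np.array([m(x) for m in submodels])
    return float(w @ outputs)

# Two toy submodels standing in for trained MLPs, each valid near its
# own cluster center (as FCM would provide).
centers = np.array([[0.0, 0.0], [5.0, 5.0]])
submodels = [lambda x: 1.0,            # behavior in region A
             lambda x: -1.0]           # behavior in region B

print(lmn_output(np.array([0.1, 0.0]), centers, submodels))
```

Near a region's center, its submodel dominates the output; between regions, the normalized windows interpolate smoothly, which is what lets the structure cope with changing operating regions.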
Abstract: The aim of this study was to compare the effects
of an altitude training camp on heart rate variability and
performance in elite triathletes. Ten athletes completed 20 days of live-high, train-low training at 1650m. Athletes
underwent pre and post 800-m swim time trials at sea-level, and two heart rate variability tests at 1650m on the first and
last day of the training camp. Based on their time trial results,
athletes were divided into responders and non-responders. Relative to the non-responders, the responders' sympathetic-to-parasympathetic ratio decreased substantially after 20 days of altitude training (-0.68 ± 1.08 and -1.2 ± 0.96, mean ± 90% confidence interval for supine and standing respectively). In
addition, sympathetic activity while standing was also
substantially lower post-altitude in the responders compared to the non-responders (-1869 ± 4764 ms2). Results indicate that
responders demonstrated a change to more vagal
predominance compared to non-responders.
Abstract: The evaluation of unit-cell neutronic parameters and lifetime for some innovative reactors without on-site refueling is carried out in this work, covering the behavior of some small and medium reactors without on-site refueling with TRISO and CERMET fuel. For a long-life FBNR, we propose changing the enrichment of the CERMET MFE to 9%. For the AFPR reactor, the use of the CERMET MFE can extend the life of the reactor, but to maintain the same life period for the AFPR-SC we must use burnable poison to obtain the same slope for Kinf(burnup). The PFPWR50 cell behaves in almost the same way with both CERMET and TRISO fuels. We can therefore conclude that the PFPWR50 reactor with CERMET fuel remains among the long-cycle reactors, and with the new configuration subcriticality at the beginning of the cycle is avoided. The evaluation of unit-cell neutronic parameters reveals good agreement with the goal of the BWR-PB concept. It is found that the TRISO fuel assembly lifetime can be extended for a reasonably long period without refueling, up to approximately 48 GWd/t burnup. Using coated-particle fuels with the CERMET composition can extend the fuel assembly lifetime further, to approximately 52 GWd/t.
Abstract: Many digital signal processing techniques have been used to automatically distinguish protein-coding regions (exons) from non-coding regions (introns) in DNA sequences. In this work, we have characterized these sequences according to their nonlinear dynamical features, such as moment invariants, correlation dimension, and largest Lyapunov exponent estimates. We have applied our model to a number of real sequences encoded into time series using EIIP sequence indicators. In order to discriminate between coding and non-coding DNA regions, the phase-space trajectory was first reconstructed for coding and non-coding regions. Nonlinear dynamical features were then extracted from those regions and used to investigate the differences between them. Our results indicate that the nonlinear dynamical characteristics yield significant differences between coding regions (CR) and non-coding regions (NCR) in DNA sequences. Finally, the classifier is tested on real genes whose coding and non-coding regions are well known.
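The encoding and reconstruction steps can be sketched as below. The EIIP values used are the commonly cited nucleotide indicators from the literature, and the embedding dimension and delay are illustrative choices rather than the paper's settings:

```python
import numpy as np

# Commonly cited EIIP (electron-ion interaction potential) indicator
# values per nucleotide; treat these as assumptions to be checked
# against the original EIIP literature.
EIIP = {'A': 0.1260, 'C': 0.1340, 'G': 0.0806, 'T': 0.1335}

def dna_to_series(seq):
    """Encode a DNA string into a numeric time series via EIIP indicators."""
    return np.array([EIIP[b] for b in seq.upper()])

def delay_embed(x, dim=3, tau=1):
    """Reconstruct a phase-space trajectory by time-delay embedding."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

series = dna_to_series("ATGCGTAC")
traj = delay_embed(series, dim=3, tau=1)
print(traj.shape)  # → (6, 3)
```

Nonlinear features such as the correlation dimension or largest Lyapunov exponent would then be estimated from trajectories like `traj`.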
Abstract: The accelerated growth of the aircraft industry demands effective schemes, programs, and innovative designs of advanced systems to meet the growing need for trouble-free air transportation. In this paper, a contemporary conceptual design of an airplane without landing gear systems is proposed, exploiting the superconducting levitation phenomenon in order to reduce accidents and time consumption and to eliminate related drawbacks. Coating the solar plexus region of the airplane with superconductive material helps reduce weight by approximately 4% of the total take-off weight and is cost-effective. Moreover, we conjecture that the superconducting landing system reduces ground friction, mission fuel, total drag, and take-off and landing distance.
Abstract: The purpose of this paper is to develop models that enable predicting student success. These models could improve the allocation of students among colleges and optimize the newly introduced model of government subsidies for higher education. To collect data, an anonymous survey of the final-year undergraduate student population was carried out using a random sampling method. Decision trees were created, of which the two most successful at predicting student success were chosen, based on two criteria: Grade Point Average (GPA) and the time a student needs to finish the undergraduate program (time-to-degree). Decision trees have been shown to be a good method of classifying student success, and they could be improved further by increasing the survey sample and developing specialized decision trees for each type of college. These methods have great potential for use in decision support systems.
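The core operation of such a decision tree learner, choosing one split that best separates successful from unsuccessful students, can be sketched with a Gini-impurity search. The survey features and labels below are invented for illustration and are not the paper's data:

```python
import numpy as np

def gini(y):
    """Gini impurity of a label array."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Find the (feature, threshold) split minimizing weighted Gini impurity."""
    best = (None, None, float('inf'))
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if score < best[2]:
                best = (j, t, score)
    return best

# Hypothetical survey rows: (weekly study hours, entrance exam score),
# label = 1 if the student finished the program on time, else 0.
X = np.array([[5, 60], [8, 65], [20, 80], [25, 85], [22, 70], [4, 55]])
y = np.array([0, 0, 1, 1, 1, 0])

feat, thresh, _ = best_split(X, y)
print(feat, thresh)  # best split: study hours <= 8
```

A full tree repeats this search recursively on each side of the split; here a single split already separates the toy classes perfectly.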
Abstract: Over the last two decades, owing to the hostile environment of the Internet, concerns about the confidentiality of information have increased at a phenomenal rate. To safeguard information from attacks, a number of data/information hiding methods have therefore evolved, mostly in the spatial and transform domains. In spatial-domain data hiding techniques, the information is embedded directly on the image plane itself. In transform-domain data hiding techniques, the image is first changed from the spatial domain to some other domain, and the secret information is then embedded so that it remains more secure against attack. Information hiding algorithms in the time or spatial domain have high capacity and relatively low robustness. In contrast, algorithms in transform domains such as the DCT and DWT have a certain robustness against some multimedia processing. In this work the authors propose a novel steganographic method for hiding information in the transform domain of a gray-scale image. The proposed approach works by converting the gray-level image to the transform domain using a discrete integer wavelet technique through the lifting scheme. This approach performs a 2-D lifting wavelet decomposition with the lifted Haar wavelet of the cover image and computes the approximation coefficients matrix CA and the detail coefficients matrices CH, CV, and CD. The next step is to apply the pixel mapping method (PMM) to those coefficients to form the stego image. The aim of this paper is to propose a high-capacity image steganography technique that uses the pixel mapping method in the integer wavelet domain with acceptable levels of imperceptibility and distortion in the cover image and a high level of overall security. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
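The decomposition step can be sketched with an integer Haar lifting step applied along rows and then columns. The subband naming follows one common convention, the toy 4x4 image is an assumption, and the PMM embedding itself is omitted:

```python
import numpy as np

def haar_lift_1d(x):
    """One level of integer Haar lifting along the last axis."""
    e, o = x[..., ::2].astype(int), x[..., 1::2].astype(int)
    d = o - e                    # predict step: detail coefficients
    a = e + (d >> 1)             # update step: integer approximation
    return a, d

def haar_lift_2d(img):
    """2-D decomposition: rows first, then columns, yielding CA, CH, CV, CD."""
    lo, hi = haar_lift_1d(img)                 # transform along rows
    ca, cv = haar_lift_1d(lo.T)                # columns of the low band
    ch, cd = haar_lift_1d(hi.T)                # columns of the high band
    return ca.T, ch.T, cv.T, cd.T

img = np.arange(16).reshape(4, 4)              # toy 4x4 "gray-scale image"
ca, ch, cv, cd = haar_lift_2d(img)
print(ca.shape, cd.shape)  # → (2, 2) (2, 2)
```

Because lifting uses only integer additions and shifts, the transform is exactly invertible, which is what makes the integer wavelet domain attractive for lossless embedding.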
Abstract: ZnO nanocrystals with a mean diameter of 14 nm have been prepared by a precipitation method and examined as a photocatalyst for the UV-induced degradation of the insecticide diazinon as a representative organic pollutant in aqueous solution. The effects of various parameters, such as illumination time, the amount of photocatalyst, the initial pH, and the initial concentration of the insecticide on the photocatalytic degradation of diazinon were investigated to find the desired conditions. The desired parameters were also tested for the treatment of real water containing the insecticide. The photodegradation efficiency of diazinon was compared between commercial and prepared ZnO nanocrystals. The results indicated that the UV/ZnO process applying the prepared nanocrystalline ZnO offered better electrical energy efficiency and quantum yield than commercial ZnO. Based on the Langmuir-Hinshelwood mechanism, the present study established a pseudo-first-order kinetic model with a surface reaction rate constant of 0.209 mg l-1 min-1 and an adsorption equilibrium constant of 0.124 l mg-1.
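The pseudo-first-order kinetics referred to above can be written out as follows; the symbols k_r and K_ad for the surface reaction rate constant and the adsorption equilibrium constant are introduced here for illustration:

```latex
r = -\frac{dC}{dt} = \frac{k_r K_{ad} C}{1 + K_{ad} C}
\quad\xrightarrow{\;K_{ad} C \,\ll\, 1\;}\quad
r \approx k_r K_{ad} C
```

with k_r = 0.209 mg l-1 min-1 and K_ad = 0.124 l mg-1 as reported in the abstract; at low pollutant concentration the rate becomes linear in C, i.e. pseudo-first-order with apparent constant k_r K_ad.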
Abstract: A new approach for the protection of power transformers is presented using a time-frequency transform known as the wavelet transform. Different operating conditions, such as inrush, normal load, external fault, and internal fault currents, are sampled and processed to obtain wavelet coefficients. The different operating conditions produce variation in the wavelet coefficients. Features such as energy and standard deviation are calculated using Parseval's theorem. These features are used as inputs to a PNN (probabilistic neural network) for fault classification. The proposed algorithm provides accurate results even in the presence of noisy inputs and accurately distinguishes inrush currents from fault currents. The overall classification accuracy of the proposed method is found to be 96.45%. Simulation of the faults (with and without noise) was done using MATLAB and Simulink, taking a data window of 2 cycles (40 ms) containing 800 samples. The algorithm was evaluated using 10% Gaussian white noise.
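The feature-extraction step (subband energy and standard deviation) can be sketched with an orthonormal Haar DWT, for which Parseval's theorem guarantees that subband energies partition the signal energy. The toy 50 Hz waveform is an assumption mirroring the 800-sample, 40 ms window mentioned above:

```python
import numpy as np

def haar_dwt(x):
    """Single-level orthonormal Haar DWT: approximation and detail parts."""
    e, o = x[::2], x[1::2]
    return (e + o) / np.sqrt(2), (e - o) / np.sqrt(2)

def wavelet_features(x, levels=3):
    """(energy, std) of detail coefficients per decomposition level.

    Because the Haar transform here is orthonormal, Parseval's theorem
    ensures the subband energies sum to the original signal energy.
    """
    feats = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append((np.sum(d ** 2), np.std(d)))
    return feats

# Toy current waveform: 800 samples over two 50 Hz cycles (40 ms window),
# mimicking the sampling described in the abstract.
t = np.linspace(0, 0.04, 800, endpoint=False)
signal = np.sin(2 * np.pi * 50 * t)
features = wavelet_features(signal)
print([round(e, 4) for e, _ in features])
```

Feature vectors of this form, one (energy, std) pair per level, would then be fed to the PNN classifier.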
Abstract: To meet the demands of wireless sensor networks (WSNs), where data are usually aggregated at a single source prior to transmission to any distant user, there is a need to establish a tree structure inside any given event region. In this paper, a novel technique to create one such tree is proposed. This tree preserves energy and maximizes the lifetime of event sources while they are constantly transmitting for data aggregation. The term Decentralized Lifetime Maximizing Tree (DLMT) is used to denote this tree. In DLMT, nodes with higher energy tend to be chosen as data-aggregating parents, so that the time to detect the first broken tree link can be extended and less energy is involved in tree maintenance. By constructing the tree in this way, the protocol is able to reduce the frequency of tree reconstruction, minimize the amount of data loss, minimize the delay during data collection, and preserve energy.