Abstract: Since its independence in 1962, Algeria has struggled
to establish an educational system tailored to the needs of its
population. Given its historical connection with France, Algeria
regarded the French language as a cultural imperative until the late
seventies. After the Arabization policy of 1971 and the
socioeconomic changes taking place worldwide, the use of English as
a vehicle of communication started to gain ground in a globalized
Algeria. Consequently, the predominance of French began to fade,
leaving more room for the teaching of English as a second foreign language.
Moreover, the introduction of the Bologna Process and the
European Credit Transfer System in Higher Education has
necessitated some innovations in the design and development of new
curricula adapted to the socioeconomic market. In this paper, I will
highlight the major historical steps Algeria has taken toward the
implementation of an English language methodology, and trace the
status English has acquired, from second foreign language to first
foreign language to "the language of knowledge and sciences". I will
also propose new pedagogical perspectives for a better treatment of
the English language, in order to encourage independent and
autonomous learning.
Abstract: Kansei engineering is a technology that
converts human feelings into quantitative terms and helps designers
develop new products that meet customers' expectations. The
standard Kansei engineering procedure involves finding relationships
between human feelings and design elements; many researchers have
modeled both the forward and backward relationships through
various soft computing techniques. In this paper, we propose a
Kansei engineering framework that links not only human feelings and
design elements, but also the product as a whole, by constructing
association rules. In this experiment, we obtain input from the
emotion scores that subjects give when they see the whole product,
using semantic differentials. Then, association rules are constructed
to discover the combinations of design elements which affect human
feelings. The results of our experiment suggest patterns of
relationships among design elements according to human feelings,
which can be derived from the product as a whole.
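As a hedged illustration (not the authors' implementation), rules of the form design element → feeling can be mined from rated products with plain support/confidence counting; the product data, element tags, and thresholds below are all hypothetical:

```python
from itertools import combinations

# Hypothetical toy data: each product is a set of design-element tags
# plus the dominant Kansei feeling reported on a semantic-differential scale.
products = [
    ({"round_edges", "matte"}, "soft"),
    ({"round_edges", "glossy"}, "soft"),
    ({"sharp_edges", "glossy"}, "sporty"),
    ({"sharp_edges", "matte"}, "sporty"),
    ({"round_edges", "matte"}, "soft"),
]

def rules(products, min_support=0.3, min_conf=0.8):
    # Enumerate small antecedents of design elements and keep the
    # element(s) -> feeling rules with enough support and confidence.
    n = len(products)
    found = []
    elements = set().union(*(p for p, _ in products))
    for r in (1, 2):
        for antecedent in combinations(sorted(elements), r):
            a = set(antecedent)
            covered = [f for p, f in products if a <= p]
            support = len(covered) / n
            if support < min_support:
                continue
            for feeling in set(covered):
                conf = covered.count(feeling) / len(covered)
                if conf >= min_conf:
                    found.append((antecedent, feeling, support, conf))
    return found

for ant, feeling, sup, conf in rules(products):
    print(ant, "->", feeling, f"support={sup:.2f} conf={conf:.2f}")
```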
Abstract: The diagnostic goal for in-service transformers is to detect faults in the winding or the core. Transformers are valuable equipment that make a major contribution to the supply security of a power system. Consequently, it is of great importance to minimize the frequency and duration of unwanted outages of power transformers. Frequency Response Analysis (FRA) has proved to be a useful tool for the reliable detection of incipient mechanical faults in a transformer, by identifying winding or core defects. In the first part of this article, the authors propose the coupled-circuits method, because it gives the most exhaustive possible modelling of the transformer. The second part applies FRA at low frequency in order to improve and simplify the reading of the response. This study can serve as baseline data for other transformers of the same category intended for the distribution grid.
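The idea of an FRA signature can be sketched with a toy ladder model of a winding: cascade per-section ABCD matrices and observe how the transfer function changes when an element value is perturbed. This is only a sketch of the measurement principle; the element values and ladder structure below are illustrative assumptions, not the authors' coupled-circuits model:

```python
import numpy as np

def fra_response(freqs, n_sections=5, R=0.1, L=1e-3, C=1e-9):
    # Toy FRA signature: model the winding as a ladder of identical
    # sections (series R-L impedance, shunt C admittance) and cascade
    # their ABCD two-port matrices. With an open-circuit output, the
    # voltage transfer is 1/A, where A is the (0, 0) entry.
    out = []
    for f in freqs:
        w = 2 * np.pi * f
        series = np.array([[1.0, R + 1j * w * L], [0.0, 1.0]])
        shunt = np.array([[1.0, 0.0], [1j * w * C, 1.0]])
        M = np.eye(2, dtype=complex)
        for _ in range(n_sections):
            M = M @ series @ shunt
        out.append(1.0 / M[0, 0])
    return np.array(out)

freqs = np.logspace(0, 6, 61)
healthy = np.abs(fra_response(freqs))
# A winding deformation changes the section capacitance, shifting the
# resonances and hence the frequency-response signature.
faulty = np.abs(fra_response(freqs, C=2e-9))
```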
Abstract: Elementary particles are created in pairs of equal and opposite momenta in a reference frame at the speed of light. The speed-of-light reference frame is viewed as a point in space by an observer at rest. This point in space is the bang location of the big bang theory. The bang in the big bang theory is no more than a sustained flow of pairs of positive and negative elementary particles. Electrons and negatively charged elementary particles are ejected from this point in space at velocities faster than light, while protons and positively charged particles obtain velocities lower than light. Subsonic masses are found to have real and positive charge, while supersonic masses are found to be negative and imaginary, indicating that the two masses are of different entities. The electron's supersonic speed, as viewed by the rest observer, was calculated and found to be less than the speed of light and slightly higher than the electron speed in the Bohr orbit. The temperature of the newly formed hydrogen gas was found to be in agreement with temperatures found on newly formed stars. The expansion of the universe was also found to be in agreement with this picture. Partial-mass and partial-charge elementary particles, and particles with momentum only, are explained in the context of this theoretical approach.
Abstract: In this paper, we deal with the Steiner tree problem
(STP) on a graph in which a fuzzy number, instead of a real number,
is assigned to each edge. We propose a modification of the shortest
paths approximation based on the fuzzy shortest paths (FSP)
evaluations. Since a fuzzy min operation using the extension
principle leads to nondominated solutions, we propose another
approach to solving the FSP using Cheng's centroid point fuzzy
ranking method.
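As an illustrative sketch (with made-up triangular fuzzy edge weights, not the paper's instance), fuzzy path lengths can be compared by centroid defuzzification, which is the idea behind a centroid-point ranking:

```python
def tri_add(u, v):
    # Sum of two triangular fuzzy numbers (a, b, c): componentwise addition.
    return tuple(x + y for x, y in zip(u, v))

def centroid(t):
    # Centroid abscissa of a triangular fuzzy number (a, b, c).
    a, b, c = t
    return (a + b + c) / 3.0

# Hypothetical fuzzy edge lengths along two candidate paths.
path1 = [(2, 3, 4), (1, 2, 5)]
path2 = [(1, 4, 6), (2, 2, 3)]

def fuzzy_length(path):
    total = (0, 0, 0)
    for w in path:
        total = tri_add(total, w)
    return total

# Rank paths by the centroid of their total fuzzy length (smaller = shorter).
best = min([path1, path2], key=lambda p: centroid(fuzzy_length(p)))
```

Here the ranking avoids the nondominated-solution issue of a fuzzy min: every path gets a single crisp score, so any two paths are comparable.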
Abstract: Iodine radionuclides in releases under severe
accident conditions at NPPs with VVER reactors are the most
radiologically important contributors to the population dose at the
beginning of the accident. To decrease the radiation consequences of
severe accidents, technical solutions for severe accident management
have been proposed in the MIR.1200 project, including measures for
suppressing the generation of volatile iodine forms in the
containment. The behavior dynamics of the different iodine forms in
the containment under severe accident conditions have been analyzed
in order to justify these technical solutions.
Abstract: Carbon fibers have specific characteristics in
comparison with industrial and structural materials used in different
applications. Special properties of carbon fibers make them attractive
for reinforcing and fabrication of composites. These fibers have been
utilized for composites of metals, ceramics, and plastics. However,
they are mainly used in various forms to reinforce lightweight polymer
materials such as epoxy resins, polyesters, or polyamides. Carbon
fiber composites are stronger than steel, stiffer than
titanium, and lighter than aluminum, and nowadays they are used in a
variety of applications. This study surveys applications of carbon
fibers in different fields such as space, aviation, transportation,
medicine, construction, energy, sporting goods, electronics, and
other commercial and industrial applications. The latest findings on
composites with polymer, metal, and ceramic matrices containing
carbon fibers, and their applications worldwide, are investigated.
Research shows that, owing to their unique properties (including high
specific strength and specific modulus, low thermal expansion
coefficient, high fatigue strength, and high thermal stability),
carbon fiber-reinforced composites can replace common industrial
and structural materials.
Abstract: The aim of this study was to investigate the effect
of running classification (sprint, middle, and long distance) and two
distances on blood lactate (BLa), heart rate (HR), and rating of
perceived exertion (RPE) Borg scale ratings in collegiate athletes. On
different days, runners (n = 15) ran 400 m and 1600 m at a five-min-mile
pace, followed by a two-min 6 mph jog and a two-min 3 mph
walk as part of the cool-down. BLa, HR, and RPE were taken at
baseline, post-run, plus 2 and 4 min recovery times. The middle and
long distance runners exhibited lower BLa concentrations than sprint
runners after two min of recovery post 400 m runs, immediately after,
and two and four min recovery periods post 1600 m runs. When
compared to sprint runners, distance runners may have exhibited the
ability to clear BLa more quickly, particularly after running 1600 m.
Abstract: The approach based on the wavelet transform has
been widely used for image denoising due to its multi-resolution
nature, its ability to produce high levels of noise reduction and the
low level of distortion introduced. However, by removing noise, high
frequency components belonging to edges are also removed, which
leads to blurring the signal features. This paper proposes a new
method of image noise reduction based on local variance and edge
analysis. The analysis is performed by dividing an image into 32 x 32
pixel blocks, and transforming the data into wavelet domain. Fast
lifting wavelet spatial-frequency decomposition and reconstruction is
developed, with the advantages of computational efficiency and
minimized boundary effects. Adaptive thresholding based on local
variance estimation and edge strength measurement can effectively
reduce image noise while preserving the features of the original image
corresponding to the boundaries of the objects. Experimental results
demonstrate that the method performs well for images contaminated
by natural and artificial noise, and that it can be adapted to different
classes of images and types of noise. The proposed algorithm
provides a potential solution, with parallel computation, for real-time
or embedded system applications.
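The fast lifting decomposition can be sketched for the one-dimensional Haar case; the universal threshold below stands in for the paper's local-variance and edge-strength rule and is purely an assumption:

```python
import numpy as np

def haar_lift(x):
    # One level of the lifting-scheme Haar transform:
    # predict step gives the detail d, update step gives the approximation s.
    even, odd = x[0::2], x[1::2]
    d = odd - even          # predict
    s = even + d / 2.0      # update: s = (even + odd) / 2
    return s, d

def haar_unlift(s, d):
    # Invert the lifting steps in reverse order and re-interleave.
    even = s - d / 2.0
    odd = d + even
    x = np.empty(s.size + d.size)
    x[0::2], x[1::2] = even, odd
    return x

def denoise(x, sigma):
    # Soft-threshold the detail coefficients; the universal threshold
    # sigma * sqrt(2 ln N) is a stand-in for the adaptive local-variance
    # rule described in the paper.
    s, d = haar_lift(x)
    lam = sigma * np.sqrt(2.0 * np.log(x.size))
    d = np.sign(d) * np.maximum(np.abs(d) - lam, 0.0)
    return haar_unlift(s, d)
```

Without thresholding the lifting scheme reconstructs the signal exactly, which is what makes it attractive for the block-wise processing described above.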
Abstract: Classification is an interesting problem in functional
data analysis (FDA), because many science and application problems
end up as classification problems, such as recognition, prediction,
control, decision making, and management. Owing to the high
dimensionality and high correlation of functional data (FD), a key
problem is to extract features from FD while keeping its global
characteristics, which strongly affects classification efficiency and
precision. In this paper, a novel automatic method that combines a
Genetic Algorithm (GA) with a classification algorithm to extract
classification features is proposed. In this method, the optimal
features and the classification model are approached step by step
through evolutionary search. Theoretical analysis and experiments
show that this method improves classification efficiency, precision,
and robustness while using fewer features, and that the dimension of
the extracted classification features can be controlled.
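A minimal sketch of GA-based feature selection follows; the Fisher-style fitness criterion, the feature-count penalty, and all parameters are assumptions for illustration, not the authors' combined GA-plus-classifier design:

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(mask, X0, X1):
    # Between-class centroid distance on the selected features, penalized
    # by the number of features used (stand-in for a classifier score).
    if not mask.any():
        return -np.inf
    d = np.abs(X0[:, mask].mean(0) - X1[:, mask].mean(0)).sum()
    return d - 0.1 * mask.sum()

def ga_select(X0, X1, pop=20, gens=30, p_mut=0.1):
    # Binary chromosomes: bit j = 1 means feature j is selected.
    n = X0.shape[1]
    popu = rng.random((pop, n)) < 0.5
    for _ in range(gens):
        scores = np.array([fitness(m, X0, X1) for m in popu])
        popu = popu[np.argsort(scores)[::-1]]
        elite = popu[: pop // 2]                 # elitist survival
        pa = elite[rng.integers(0, len(elite), pop - len(elite))]
        pb = elite[rng.integers(0, len(elite), pop - len(elite))]
        cross = np.where(rng.random(pa.shape) < 0.5, pa, pb)  # uniform crossover
        cross ^= rng.random(cross.shape) < p_mut              # bit-flip mutation
        popu = np.vstack([elite, cross])
    return popu[0]
```

Elitism keeps the best mask across generations, so the returned chromosome is the best-scoring feature subset encountered.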
Abstract: Frequency domain independent component analysis has
a scaling indeterminacy and a permutation problem. The scaling
indeterminacy can be solved by use of a decomposed spectrum. For
the permutation problem, we have proposed the rules in terms of gain
ratio and phase difference derived from the decomposed spectra and
the sources' coarse directions.
The present paper experimentally clarifies that the gain ratio and
the phase difference work effectively in a real environment but their
performance depends on frequency bands, a microphone-space and
a source-microphone distance. These facts show that it is difficult to
attain a perfect solution to the permutation problem in a real
environment using either the gain ratio or the phase difference alone.
Toward such a solution, this paper proposes a method that addresses
these problems in a real environment. The proposed method is simple
and the amount of calculation is small, and it has high correction
performance without depending on the frequency bands or the
distances from the source signals to the microphones. Furthermore, it
can be applied in a real environment. Several experiments in a real
room verify the proposed method.
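For two sources, a bin-wise permutation correction can be sketched by correlating power envelopes with a running reference. This envelope-correlation rule is a common heuristic used here only for illustration; the authors' actual rules are based on gain ratios, phase differences, and source directions:

```python
import numpy as np

def align_permutations(Y):
    # Y: (freq bins, 2 channels, frames) separated spectrograms with an
    # unknown per-bin channel permutation. For each bin, keep or swap the
    # two channels, whichever makes the power envelopes correlate better
    # with the reference envelopes accumulated from already-aligned bins.
    def corr(a, b):
        a, b = a - a.mean(), b - b.mean()
        return (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    out = Y.copy()
    ref = np.abs(out[0]) ** 2
    for f in range(1, out.shape[0]):
        e = np.abs(out[f]) ** 2
        keep = corr(e[0], ref[0]) + corr(e[1], ref[1])
        swap = corr(e[0], ref[1]) + corr(e[1], ref[0])
        if swap > keep:
            out[f] = out[f, ::-1].copy()
        ref = ref + np.abs(out[f]) ** 2
    return out
```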
Abstract: This paper proposes a Wavelength Division
Multiplexing (WDM) technology based Storage Area Network
(SAN) for all types of disaster recovery operations. It considers
recovery when all paths in the network fail, as well as failure of the
main SAN site and of all backup sites, caused by natural disasters
such as earthquakes, fires, floods, power outages, and terrorist
attacks, since SANs were initially designed to work within
distance-limited environments [2]. The paper also presents a NEW
PATH algorithm for use when a path failure occurs. Simulation
results and analysis are presented for the proposed architecture, with
performance considerations.
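The abstract does not specify the NEW PATH algorithm, so the following is only a generic reroute-on-failure sketch, not the paper's method: drop the failed links and recompute a least-cost lightpath with Dijkstra's algorithm (the topology and costs are hypothetical):

```python
import heapq

def dijkstra(graph, src, dst):
    # graph: {node: {neighbor: cost}}. Returns (cost, path) or (inf, []).
    dist, prev, seen = {src: 0.0}, {}, set()
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in seen:
            continue
        seen.add(u)
        if u == dst:
            path = [u]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return d, path[::-1]
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    return float("inf"), []

def reroute_on_failure(graph, src, dst, failed_links):
    # Generic failover: remove failed directed links, then recompute.
    g = {u: {v: w for v, w in nbrs.items() if (u, v) not in failed_links}
         for u, nbrs in graph.items()}
    return dijkstra(g, src, dst)
```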
Abstract: Different pseudo-random or pseudo-noise (PN) as well as orthogonal sequences that can be used as spreading codes for code division multiple access (CDMA) cellular networks, or for encrypting speech signals to reduce the residual intelligibility, are investigated. We briefly review the theoretical background of direct sequence CDMA systems and describe the main characteristics of the maximal length, Gold, Barker, and Kasami sequences. We also discuss variable- and fixed-length orthogonal codes such as Walsh-Hadamard codes. The equivalence of PN and orthogonal codes is also derived. Finally, a new PN sequence is proposed which is shown to have certain better properties than the existing codes.
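As a concrete toy-sized illustration (not the paper's proposed sequence), a maximal-length sequence comes from a Fibonacci LFSR with a primitive feedback polynomial, here x^3 + x + 1, and fixed-length Walsh-Hadamard codes from the Sylvester construction:

```python
import numpy as np

def lfsr_mseq(taps, n, seed=1):
    # Fibonacci LFSR: output the last stage, feed back the XOR of the
    # tapped stages (1-based indices). With a primitive feedback
    # polynomial the output has maximal period 2**n - 1.
    state = [(seed >> i) & 1 for i in range(n)]
    out = []
    for _ in range(2 ** n - 1):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return out

# m-sequence for x^3 + x + 1 (taps at stages 3 and 1), period 7.
seq = lfsr_mseq((3, 1), 3)

# Walsh-Hadamard codes of length 8 via the Sylvester construction:
# the rows are mutually orthogonal fixed-length spreading codes.
H = np.array([[1]])
for _ in range(3):
    H = np.block([[H, H], [H, -H]])
```

The m-sequence exhibits the classic balance property (2^(n-1) ones) and a two-valued cyclic autocorrelation, which is what makes such codes useful for spreading.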
Abstract: In this paper, the dam-reservoir interaction is
analyzed using a finite element approach. The fluid is assumed to be
incompressible, irrotational and inviscid. The assumed boundary
conditions are that the interface of the dam and reservoir is vertical
and the bottom of reservoir is rigid and horizontal. The governing
equation for these boundary conditions is implemented in the
developed finite element code considering the horizontal and vertical
earthquake components. The weighted residual standard Galerkin
finite element technique with 8-node elements is used to discretize
the equation, which produces a symmetric matrix equation for the
dam-reservoir system. A new boundary condition is proposed for the
truncating surface of the unbounded fluid domain to capture the
energy dissipation in the reservoir through radiation in the infinite
upstream direction. The Sommerfeld and perfect damping boundary
conditions are also implemented at a truncated boundary for
comparison with the proposed far-end boundary. The results are compared with
an analytical solution to demonstrate the accuracy of the proposed
formulation and other truncated boundary conditions in modeling the
hydrodynamic response of an infinite reservoir.
Abstract: In this work, the effects of uniaxial mechanical stress on a pixel readout circuit are theoretically analyzed. It is shown that the effects of mechanical stress on the in-pixel transistors do not appear at the output when a correlated double sampling circuit is used. However, mechanical stress effects on the photodiode will appear directly at the output of the readout chain. Therefore, compensation techniques are needed to overcome this situation. Moreover, a simulation technique for mechanical stress is proposed, and various layout and design recommendations are put forward in order to minimize stress-related effects on the output of a circuit.
Abstract: Background: measuring an individual's health
literacy is gaining attention, yet no appropriate instrument is
available in Taiwan. Measurement tools that were developed and used
in western countries may not be appropriate for use in Taiwan due to
a different language system. The purpose of this research was to
develop a health literacy measurement instrument specific to
Taiwanese adults. Methods: several experts, including clinical
physicians, healthcare administrators, and scholars, identified 125
commonly used health-related Chinese phrases from major medical
knowledge sources easily accessible to the public. A five-point
Likert scale is used to measure the understanding level of the target
population. These measurements are then compared with the
correctness of the respondents' answers to a health knowledge test for
validation. Samples: samples under study were purposefully taken
from four groups of people in northern Pingtung: OPD patients,
university students, community residents, and casual visitors to the
central park. A health knowledge index with 10 questions is used to
screen out false responses. A sample of 686 valid cases out of 776
was then included to construct the scale. An independent t-test was
used to examine each individual phrase. The phrases with the highest
significance were then identified and retained to compose the scale.
Results: a Taiwan Health Literacy Scale (THLS) was finalized with
66 health-related phrases under nine divisions. Cronbach's alpha for
each division is at a satisfactory level of 89% and above.
Conclusions: in this initial application, the factors that significantly
differentiate levels of health literacy are education, female gender,
age, being a family member of a stroke victim, experience with
patient care, and being a healthcare professional.
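For reference, the Cronbach's alpha statistic used above to assess internal consistency can be computed from a respondents-by-items score matrix (the data here are toy values, not the THLS data):

```python
import numpy as np

def cronbach_alpha(scores):
    # scores: (respondents, items) matrix of Likert ratings.
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Three perfectly parallel items yield alpha = 1.
demo = np.column_stack([np.arange(10.0)] * 3)
alpha = cronbach_alpha(demo)
```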
Abstract: The optimal operation of a proton exchange membrane fuel cell (PEMFC) requires good water management, where water is present in two forms: vapor and liquid. Moreover, fuel cells that have to reach higher output require the integration of accessories which need electrical power. In order to analyze fuel cell operation and the transport phenomena of the different species, a biphasic mathematical model is presented through a set of governing equations. The numerical solution of these conservation equations is computed with a Matlab program. A multi-criteria optimization with weighting between two opposing objectives is used to determine the compromise solutions between maximum output and minimal stack size. The obtained results are in good agreement with available literature data.
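The weighted compromise between maximum output and minimal stack size can be sketched as a normalized weighted sum over candidate designs; the designs and the weighting scheme below are hypothetical, not the paper's model:

```python
import numpy as np

# Hypothetical candidate designs: (net power in W, stack volume in L).
designs = np.array([
    [1000, 12.0],
    [1200, 18.0],
    [ 900,  9.0],
    [1150, 14.0],
])

def best_compromise(designs, w_power):
    # Weighted sum of normalized objectives: maximize power (+), minimize
    # size (-). Sweeping w_power in [0, 1] traces out compromise solutions.
    p = designs[:, 0] / designs[:, 0].max()
    v = designs[:, 1] / designs[:, 1].max()
    score = w_power * p - (1 - w_power) * v
    return int(np.argmax(score))
```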
Abstract: Outlier detection in streaming data is very challenging, because streaming data cannot be scanned multiple times and new concepts may keep evolving. Irrelevant attributes can be termed noisy attributes, and such attributes further magnify the challenge of working with data streams. In this paper, we propose an unsupervised outlier detection scheme for streaming data. The scheme is based on clustering, since clustering is an unsupervised data mining task that does not require labeled data; both density-based and partitioning clustering are combined for outlier detection. In this scheme, partitioning clustering is also used to assign weights to attributes depending upon their respective relevance, and the weights are adaptive. Weighted attributes are helpful to reduce or remove the effect of noisy attributes. Keeping in view the challenges of streaming data, the proposed scheme is incremental and adaptive to concept evolution. Experimental results on synthetic and real-world data sets show that our proposed approach outperforms an existing approach (CORM) in terms of outlier detection rate and false alarm rate, for increasing percentages of outliers.
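A minimal sketch of the clustering-based idea follows: plain k-means plus a distance-to-centroid cutoff. The batch setting, the median-based threshold, and all parameters are simplifying assumptions relative to the paper's incremental, attribute-weighted scheme:

```python
import numpy as np

def kmeans(X, k, iters=10):
    # Plain Lloyd iterations, initialized from the first k points.
    C = X[:k].copy()
    for _ in range(iters):
        labels = np.linalg.norm(X[:, None] - C[None], axis=2).argmin(1)
        for j in range(k):
            if (labels == j).any():
                C[j] = X[labels == j].mean(0)
    return C

def flag_outliers(X, k=2, factor=3.0):
    # Flag points whose distance to the nearest centroid is much larger
    # than the typical (median) distance over the data set.
    C = kmeans(X, k)
    d = np.linalg.norm(X[:, None] - C[None], axis=2).min(1)
    return d > factor * np.median(d)
```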
Abstract: In this paper, a new Joint Adaptive Block Matching
Search (JABMS) algorithm is proposed to generate motion vectors
and search for the best-matching macroblock by classifying the
motion vector movement based on the prediction error. The Diamond
Search (DS) algorithm achieves high estimation accuracy when the
motion vector is small, and the Adaptive Rood Pattern Search
(ARPS) algorithm can handle large motion vectors but is not very
accurate. The proposed JABMS algorithm, which is capable of
considering both small and large motions, gives improved estimation
accuracy; the computational cost is reduced by a factor of 15.2
compared with the Exhaustive Search (ES) algorithm and by a factor
of 1.3 compared with the DS algorithm.
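For reference, the Exhaustive Search baseline that block-matching algorithms such as JABMS are compared against can be sketched as minimum-SAD matching over a search window (the block size and search range here are arbitrary illustrative choices):

```python
import numpy as np

def sad(a, b):
    # Sum of absolute differences between two equal-sized blocks.
    return np.abs(a.astype(int) - b.astype(int)).sum()

def full_search(ref, cur, bx, by, bsize=8, srange=4):
    # Exhaustive search: scan every candidate offset in a +/- srange
    # window of the reference frame, keep the minimum-SAD motion vector.
    block = cur[by:by + bsize, bx:bx + bsize]
    best, best_mv = None, (0, 0)
    for dy in range(-srange, srange + 1):
        for dx in range(-srange, srange + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + bsize > ref.shape[0] or x + bsize > ref.shape[1]:
                continue
            cost = sad(ref[y:y + bsize, x:x + bsize], block)
            if best is None or cost < best:
                best, best_mv = cost, (dx, dy)
    return best_mv, best
```

Fast methods like DS, ARPS, and the proposed JABMS visit only a subset of these candidate offsets, which is where the reported speed-ups come from.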
Abstract: Scarcity of resources for biodiversity conservation gives rise to the need for strategic investment, with priorities given to the cost of conservation. While the literature provides abundant methodological options for biodiversity conservation, estimating the true cost of conservation remains abstract and simplistic, without recognizing the dynamic nature of that cost. Some recent works demonstrate the value of economic theory for informing biodiversity decisions, particularly on the costs and benefits of biodiversity; however, the integration of the concept of true cost into biodiversity actions and planning has been very slow to come, especially at the farm level. Conservation planning studies often use area as a proxy for costs, neglecting differences in land values as well as protected areas. This literature considers only heterogeneous benefits while treating land costs as homogeneous. Analysis under the assumption of cost homogeneity results in biased estimation, since it not only fails to capture the true total cost of biodiversity actions and plans, but also fails to screen out lands that are more (or less) expensive and/or difficult (or more suitable) for biodiversity conservation purposes, hindering the validity and comparability of the results. "Economies of scope" is another of the most neglected aspects in the conservation literature. The concept of economies of scope refers to the existence of cost complementarities within a multiple-output production system, and it implies a lower cost when a given farm produces multiple outputs concurrently. If there are, indeed, economies of scope, then a simplistic representation of costs will tend to overestimate the true cost of conservation, leading to suboptimal outcomes. The aim of this paper, therefore, is to provide a first broad review of the various theoretical ways in which economies of scope are likely to occur in conservation.
Consequently, the paper addresses gaps that have to be filled in future analysis.