Abstract: WiMAX (Worldwide Interoperability for Microwave Access) is defined by the WiMAX Forum, formed in June 2001 to promote conformance and interoperability of the IEEE 802.16 standard, officially known as WirelessMAN. The attractive features of WiMAX technology are very high throughput and broadband wireless access over long distances. A detailed simulation environment is demonstrated for the UGS, nrtPS and ertPS service classes, evaluating throughput, delay and packet delivery ratio in a mixed environment of fixed and mobile WiMAX. A simple mobility model is considered for mobile WiMAX, and the PMP mode of transmission is used in TDD mode. Network Simulator 2 (NS-2) is the tool used to simulate the WiMAX network scenario. A simple Priority Scheduler and a Weighted Round Robin scheduler are the WiMAX schedulers used in this work.
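The Weighted Round Robin discipline mentioned above can be sketched in a few lines; the service-class names match the abstract, but the weights and queue contents are illustrative assumptions, not the simulated configuration.

```python
from collections import deque

# Minimal Weighted Round Robin sketch over WiMAX service-class queues.
# Weights are hypothetical; in a real BS scheduler they derive from QoS parameters.
def wrr_schedule(queues, weights, rounds):
    """queues: {class: deque of packets}; weights: {class: packets served per round}."""
    served = []
    for _ in range(rounds):
        for cls, w in weights.items():
            for _ in range(w):
                if queues[cls]:
                    served.append((cls, queues[cls].popleft()))
    return served

queues = {'UGS': deque(range(4)), 'ertPS': deque(range(3)), 'nrtPS': deque(range(3))}
weights = {'UGS': 3, 'ertPS': 2, 'nrtPS': 1}   # UGS given the highest weight
order = wrr_schedule(queues, weights, rounds=2)
```

Within each round the higher-weight class drains more packets, which is how the scheduler trades delay between UGS and the lower-priority classes.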
Abstract: Coal tar is a liquid by-product of coal gasification and carbonization. This liquid oil mixture contains various
kinds of useful compounds such as phenol, o-cresol, and p-cresol.
These compounds are widely used as raw material for insecticides,
dyes, medicines, perfumes, coloring matters, and many others.
This research was conducted to determine the optimum conditions for the separation of phenol, o-cresol, and p-cresol from coal tar by a solvent extraction process. The aim of the present work was to study the effect of two aqueous solvents, methanol and acetone solutions, of temperature (298, 306, and 313 K), and of mixing speed (30, 35, and 40 rpm) on the separation of phenol, o-cresol, and p-cresol from coal tar by solvent extraction.
Results indicated that phenol, o-cresol, and p-cresol in coal tar were selectively extracted into the solvent phase, so these components can be separated by solvent extraction. An aqueous solution of methanol, a solvent-to-feed mass ratio of Eo/Ro = 1, an extraction temperature of 306 K, and mixing at 35 rpm were the most efficient conditions for the extraction of phenol, o-cresol, and p-cresol from coal tar.
Abstract: How to experimentally simulate microgravity air flow and heat transfer on the ground is an important problem that has not yet been completely solved. The influence of gravity on natural air convection makes convective heat transfer on the ground differ from that in orbit. In order to obtain the air temperature and velocity deviations of a manned spacecraft during a terrestrial thermal test, dimensionless number analysis and numerical simulation are performed. The calculated temperature and velocity distributions of the horizontal test cases are compared with those of the vertical cases. The results show that the influence of gravity is negligible for facility drawer racks and more pronounced for vertical cabins.
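The dimensionless-number analysis referred to above can be sketched numerically; the air properties and geometry below are illustrative assumptions, not the paper's actual test parameters.

```python
# Dimensionless groups governing buoyancy-driven air convection.
# Property values are illustrative assumptions for air near room temperature.
g = 9.81          # gravitational acceleration on the ground, m/s^2 (~0 on orbit)
beta = 1.0 / 300  # thermal expansion coefficient of air, 1/K
nu = 1.5e-5       # kinematic viscosity of air, m^2/s
alpha = 2.2e-5    # thermal diffusivity of air, m^2/s
dT, L = 10.0, 0.5 # temperature difference (K) and characteristic length (m)

Gr = g * beta * dT * L**3 / nu**2  # Grashof number: buoyancy vs. viscous forces
Pr = nu / alpha                    # Prandtl number
Ra = Gr * Pr                       # Rayleigh number: strength of natural convection
```

On the ground Gr is of order 10^8 for this geometry, so natural convection matters; with g near zero the Grashof number collapses toward zero, which is why terrestrial tests deviate from on-orbit behavior.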
Abstract: The significant effects of the interactions between the
system boundaries and the near wall molecules in miniaturized
gaseous devices lead to the formation of the Knudsen layer in which
the Navier-Stokes-Fourier (NSF) equations fail to predict the correct
associated phenomena. In this paper, the well-known lattice
Boltzmann method (LBM) is employed to simulate the fluid flow and
heat transfer processes in rarefied gaseous micro media. Motivated by the deficiency of the LBM in capturing Knudsen layer phenomena, the present study concentrates on the effective molecular mean free path concept, the essence of which is to compensate for the inability of this mesoscopic method to deal with momentum and energy transport within the above-mentioned kinetic boundary layer. The results show qualitative and quantitative accuracy comparable to solutions of the linearized Boltzmann equation and to DSMC data for Knudsen numbers of O(1).
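As a minimal numerical sketch of the effective mean free path idea, a Bosanquet-type correction is commonly used in LBM microflow models; this particular form is an assumption for illustration, not necessarily the paper's formulation.

```python
import numpy as np

def effective_mfp(lam, kn):
    # Bosanquet-type correction (an assumed model form): wall confinement
    # shortens the free path, so lam_eff ~ lam in the bulk (kn -> 0)
    # and is substantially reduced for kn = O(1).
    return lam / (1.0 + kn)

kn = np.array([0.01, 0.1, 1.0, 10.0])
lam_eff = effective_mfp(1.0, kn)  # fractions of the bulk mean free path
```

Rescaling the LBM relaxation time with the effective rather than the bulk mean free path is what lets the method mimic the reduced momentum and energy transport inside the Knudsen layer.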
Abstract: The present study was conducted to observe the effect
of Plantago psyllium on blood glucose and cholesterol levels in
normal and alloxan-induced diabetic rats. To investigate this effect, 40 rats were included in the study, divided into four groups of ten rats each. Group A was normal, group B was diabetic, group C was non-diabetic and hypercholesterolemic, and group D was diabetic and hypercholesterolemic. Groups B and D were made diabetic by intraperitoneal injection of alloxan dissolved in 1 mL distilled water at a dose of 125 mg/kg of body weight. Groups C and D were made hypercholesterolemic by oral administration of powdered cholesterol (1 g/kg of body weight). Blood samples from all the rats were collected from the coccygeal vein on the 1st, 21st, and 42nd days. All the samples were analyzed for blood
glucose and cholesterol level by using enzymatic kits. The blood
glucose and cholesterol levels of treated groups of rats showed
significant reduction after 7 weeks of treatment with Plantago
psyllium. Statistical analysis of the results showed that Plantago psyllium has anti-diabetic and hypocholesterolemic activity in diabetic and hypercholesterolemic albino rats.
Abstract: This paper proposes a method which reduces power consumption in single-error-correcting, double-error-detecting (SEC-DED) checker circuits that perform memory error correction. Power is minimized with little or no impact on area and delay, using the degrees of freedom in selecting the parity check matrix of the error correcting code. A genetic algorithm is employed to solve the nonlinear power optimization problem. The method is applied to two commonly used SEC-DED codes: standard Hamming and odd-column-weight Hsiao codes. Experiments were performed to show the performance of the proposed method.
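To make the SEC-DED setting concrete, a toy (8,4) Hsiao-style code can be sketched as follows; the parity check matrix and decoder are illustrative, not the paper's power-optimized design. Every column of H is distinct and of odd weight, so a single error yields an odd-weight syndrome (correctable) and a double error an even-weight nonzero syndrome (detectable).

```python
import numpy as np

# Hsiao-style (8,4) SEC-DED code: H = [A | I], all columns distinct, odd weight.
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [1, 1, 1, 0]])                 # parity equations for the 4 data bits
H = np.hstack([A, np.eye(4, dtype=int)])

def encode(d):
    d = np.asarray(d)
    return np.concatenate([d, A @ d % 2])    # codeword = [data | parity]

def decode(c):
    """Return (word, status) with status in {'ok', 'corrected', 'double'}."""
    s = H @ c % 2
    if not s.any():
        return c, 'ok'
    if s.sum() % 2 == 1:                     # odd-weight syndrome: single error
        j = int(np.argmax((H == s[:, None]).all(axis=0)))  # matching column
        c = c.copy()
        c[j] ^= 1
        return c, 'corrected'
    return c, 'double'                       # even-weight syndrome: double error
```

The number of ones in H (and hence the checker's switching activity, a proxy for power) varies with the choice of columns, which is the degree of freedom the genetic algorithm exploits.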
Abstract: The main objective of this paper is to provide an efficient tool for delineating brain tumors in three-dimensional magnetic resonance images. To achieve this goal, we use a level-set approach to delineate three-dimensional brain tumors. We then introduce a compression scheme for 3D brain structures based on mesh simplification, adapted to the specific needs of telemedicine and to the restricted capacities of network communication. We present the main stages of our system and preliminary results, which are very encouraging for clinical practice.
Abstract: This paper discusses the implementation of the Kalman Filter along with the Global Positioning System (GPS) for indoor robot navigation. Two-dimensional coordinates are used for map building, referred to a global coordinate frame attached to a reference landmark, from which the robot obtains position and direction information. The Discrete Kalman Filter is used to estimate the robot position: it projects the estimated current state ahead in time through the time update, and adjusts the projected estimate by an actual measurement at that time via the measurement update. The navigation test has been performed and the method has been found to be robust.
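The time-update/measurement-update cycle described above can be sketched in a few lines; the state model and the noise covariances below are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

# Minimal discrete Kalman filter sketch for 2-D position estimation.
F = np.eye(2)          # state transition (position assumed static between steps)
Hm = np.eye(2)         # measurement model: position observed directly
Q = 0.01 * np.eye(2)   # process noise covariance (assumed)
R = 0.25 * np.eye(2)   # measurement noise covariance (assumed)

def kalman_step(x, P, z):
    # Time update: project the state and covariance ahead
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Measurement update: correct the projection with the actual measurement z
    K = P_pred @ Hm.T @ np.linalg.inv(Hm @ P_pred @ Hm.T + R)
    x_new = x_pred + K @ (z - Hm @ x_pred)
    P_new = (np.eye(2) - K @ Hm) @ P_pred
    return x_new, P_new
```

Iterating the step with repeated measurements of a fixed landmark position drives the estimate toward that position while the covariance shrinks.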
Abstract: It is believed that language diversity must be taken into account in learning, especially in learning using ICT. This paper's objective is to exhibit language and communication barriers in learning, to approach the topic from sociocultural and cognitivist perspectives, and to give exploratory solutions for handling such barriers. The review is mainly conducted through the journal Computers & Education, though an initially broader search was also conducted. The results show that not much attention is paid to language and communication barriers in immediate relation to learning using ICT. The results also show, inter alia, that language and communication barriers arise because insufficient account is taken of both the individual's background and the technology.
Abstract: This study investigated the feasibility of producing fiberboard from durian rind using latex with phenolic resin as the binding agent. The durian rind was boiled with NaOH [7], [8], and the resulting fiber was formed into fiberboard by heat pressing. Durian rind could thus replace plywood in the plywood industry, with durian fiber serving as a composite material combined with an adhesive. First, the durian rind was split, exposed to light, boiled, and steamed to obtain durian fiber. Fiberboard was then tested at densities of 600 kg/m3 and 800 kg/m3 in order to find a suitable ratio of durian fiber to latex. Afterwards, mechanical properties were tested according to ASTM and JIS A 5905-1994 standards. Once the suitable ratio was known, the test results were compared with medium density fiberboard (MDF) and other related studies. According to the results, fiberboard made of durian rind with latex and phenolic resin at a density of 800 kg/m3 and a ratio of 1:1 had a moisture content of 5.05%, a specific gravity (ASTM D 2395-07a) of 0.81, and a density (JIS A 5905-1994) of 0.88 g/cm3; its tensile strength, hardness (ASTM D2240), and flexibility (elongation at break) yielded values similar to those of medium density fiberboard (MDF).
Abstract: In this paper, we propose a new modular approach, called neuroglial, consisting of two neural networks, one slow and one fast, which emulates a recently discovered biological reality. The implementation is based on complex multi-time-scale systems; validation is performed on the model of the asynchronous machine. We applied the geometric approach based on Gerschgorin circles for the decoupling of fast and slow variables, and the method of singular perturbations for the development of reduced models.
This new architecture allows for smaller networks with less complexity and better performance in terms of mean square error and convergence than the single network model.
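The Gerschgorin-circle decoupling step can be illustrated numerically; the two-time-scale matrix below is a hypothetical example, not the asynchronous machine model itself.

```python
import numpy as np

def gerschgorin_discs(A):
    """Return (center, radius) for each Gerschgorin disc of a square matrix A."""
    A = np.asarray(A, dtype=float)
    centers = np.diag(A)
    radii = np.sum(np.abs(A), axis=1) - np.abs(centers)
    return list(zip(centers, radii))

# Hypothetical two-time-scale system: the row with the large |diagonal|
# corresponds to the fast variable, the other to the slow one.
A = np.array([[-100.0, 2.0],
              [1.0, -1.0]])
discs = gerschgorin_discs(A)
# The disc around -100 (radius 2) is disjoint from the disc around -1
# (radius 1), so the eigenvalues, and hence the fast and slow modes, separate.
```

When the discs are disjoint like this, each eigenvalue lies in its own disc, which justifies decoupling the variables before applying the singular perturbation reduction.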
Abstract: Elementary particles are created in pairs of equal and opposite momenta in a reference frame at the speed of light. The speed-of-light reference frame is viewed as a point in space by an observer at rest. This point in space is the bang location of the big bang theory. The bang in the big bang theory is no more than a sustained flow of pairs of positive and negative elementary particles. Electrons and negatively charged elementary particles are ejected from this point in space at velocities faster than light, while protons and positively charged particles obtain velocities lower than light. Subsonic masses are found to be real and positively charged, while supersonic masses are found to be negative and imaginary, indicating that the two masses are different entities. The electron's supersonic speed, as viewed by the rest observer, was calculated and found to be less than the speed of light and slightly higher than the electron speed in the Bohr orbit. The temperature of the newly formed hydrogen gas was found to agree with temperatures found on newly formed stars, and the expansion of the universe was likewise found to be in agreement. Partial-mass and partial-charge elementary particles, and particles with momentum only, were explained in the context of this theoretical approach.
Abstract: Carbon fibers have specific characteristics in
comparison with industrial and structural materials used in different
applications. Special properties of carbon fibers make them attractive
for reinforcing and fabrication of composites. These fibers have been
utilized in composites with metal, ceramic and plastic matrices. However, they are mainly used in various forms to reinforce lightweight polymer materials such as epoxy resins, polyesters or polyamides. Carbon fiber composites are stronger than steel, stiffer than titanium, and lighter than aluminum, and nowadays they are used in a variety of applications. This study surveys applications of carbon fibers in fields such as space, aviation, transportation, medicine, construction, energy, sporting goods, electronics, and other commercial and industrial applications. The latest findings on composites with polymer, metal and ceramic matrices containing carbon fibers, and their applications worldwide, are reviewed. Research shows that, owing to their unique properties (including high specific strength and specific modulus, low thermal expansion coefficient, high fatigue strength, and high thermal stability), carbon fiber-reinforced composites can replace common industrial and structural materials.
Abstract: The aim of this study was to investigate the effect of running classification (sprint, middle, and long distance) and two distances on blood lactate (BLa), heart rate (HR), and rating of perceived exertion (RPE, Borg scale) in collegiate athletes. On
different days, runners (n = 15) ran 400m and 1600m at a five min
mile pace, followed by a two min 6mph jog, and a two min 3mph
walk as part of the cool down. BLa, HR, and RPE were taken at
baseline, post-run, plus 2 and 4 min recovery times. The middle and
long distance runners exhibited lower BLa concentrations than sprint
runners after two min of recovery post 400 m runs, immediately after,
and two and four min recovery periods post 1600 m runs. When
compared to sprint runners, distance runners may have exhibited the
ability to clear BLa more quickly, particularly after running 1600 m.
Abstract: The approach based on the wavelet transform has
been widely used for image denoising due to its multi-resolution
nature, its ability to produce high levels of noise reduction and the
low level of distortion introduced. However, by removing noise, high
frequency components belonging to edges are also removed, which
leads to blurring the signal features. This paper proposes a new
method of image noise reduction based on local variance and edge
analysis. The analysis is performed by dividing an image into 32 x 32
pixel blocks and transforming the data into the wavelet domain. A fast lifting wavelet spatial-frequency decomposition and reconstruction is developed, with the advantages of being computationally efficient and of minimizing boundary effects. Adaptive thresholding based on local variance estimation and edge strength measurement can effectively reduce image noise while preserving the features of the original image that correspond to object boundaries. Experimental results demonstrate that the method performs well for images contaminated by natural and artificial noise, and can be adapted to different classes of images and types of noise. The proposed algorithm offers a potential solution, with parallel computation, for real-time and embedded applications.
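A one-level lifting step of the kind the method builds on can be sketched as follows; the single global soft threshold here stands in for the paper's local-variance rule, and is an assumption made for brevity.

```python
import numpy as np

# One-level Haar lifting transform with soft thresholding of the detail band.
def haar_lift(x):
    s, d = x[::2].astype(float), x[1::2].astype(float)
    d = d - s          # predict: detail = odd - even
    s = s + d / 2      # update: smooth = (even + odd) / 2
    return s, d

def haar_unlift(s, d):
    s = s - d / 2      # undo update
    d = d + s          # undo predict
    x = np.empty(s.size + d.size)
    x[::2], x[1::2] = s, d
    return x

def soft_threshold(d, t):
    # shrink detail coefficients toward zero; small (noise-like) ones vanish
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

def denoise(x, t):
    s, d = haar_lift(x)
    return haar_unlift(s, soft_threshold(d, t))
```

Because each lifting step is in-place arithmetic on the even/odd samples, the decomposition and reconstruction are cheap, which is the computational-efficiency argument made in the abstract.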
Abstract: Heart tissue is an excitable medium. A cellular automaton is a type of model that can be used to simulate cardiac action potential propagation. One advantage of this approach over methods based on differential equations is its high speed in large-scale simulations. Recent cellular automata models either cannot avoid flat edges in the resulting patterns or require large neighborhoods. In this paper, we present a new model that eliminates flat edges with a minimum number of neighbors.
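A standard excitable-medium cellular automaton (Greenberg-Hastings type) illustrates the class of models discussed; the rule below uses a Moore neighborhood with periodic boundaries and is a generic stand-in, not the paper's flat-edge-free model.

```python
import numpy as np

# Greenberg-Hastings-style excitable medium on a grid.
# States: 0 = resting, 1 = excited, 2 = refractory.
def step(grid):
    new = np.zeros_like(grid)
    excited = (grid == 1)
    # count excited Moore neighbors of each cell (np.roll wraps the edges)
    n = sum(np.roll(np.roll(excited, i, 0), j, 1)
            for i in (-1, 0, 1) for j in (-1, 0, 1)
            if (i, j) != (0, 0))
    new[(grid == 0) & (n > 0)] = 1   # resting cell with an excited neighbor fires
    new[grid == 1] = 2               # excited -> refractory
    new[grid == 2] = 0               # refractory -> resting
    return new
```

A single excited seed produces an expanding ring of excitation trailed by a refractory band; with this square Moore neighborhood the front develops the flat edges the paper aims to eliminate.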
Abstract: In this paper, the dam-reservoir interaction is
analyzed using a finite element approach. The fluid is assumed to be
incompressible, irrotational and inviscid. The assumed boundary
conditions are that the interface of the dam and reservoir is vertical
and the bottom of reservoir is rigid and horizontal. The governing
equation for these boundary conditions is implemented in the
developed finite element code considering the horizontal and vertical
earthquake components. The weighted residual standard Galerkin
finite element technique with 8-node elements is used to discretize
the equation, producing a symmetric matrix equation for the dam-reservoir system. A new boundary condition is proposed for the truncating surface of the unbounded fluid domain to represent the energy dissipation in the reservoir through radiation in the infinite upstream direction. The Sommerfeld and perfect damping boundary conditions are also implemented at the truncated boundary for comparison with the proposed far-end boundary. The results are compared with
an analytical solution to demonstrate the accuracy of the proposed
formulation and other truncated boundary conditions in modeling the
hydrodynamic response of an infinite reservoir.
Abstract: Background: measuring an individual's health literacy is gaining attention, yet no appropriate instrument is available in Taiwan. Measurement tools developed and used in Western countries may not be appropriate for use in Taiwan due to the different language system. The purpose of this research was to develop a health literacy measurement instrument specific to Taiwanese adults. Methods: several experts, including clinical physicians, healthcare administrators and scholars, identified 125 commonly used health-related Chinese phrases from major medical knowledge sources easily accessible to the public. A five-point Likert scale was used to measure the understanding level of the target population, and this measurement was then compared with the correctness of answers to a health knowledge test for validation. Samples: samples were purposefully taken from four groups of people in northern Pingtung: OPD patients, university students, community residents, and casual visitors to the central park. A set of ten health knowledge index questions was used to screen out false responses. A sample of 686 valid cases out of 776 was then used to construct the scale. An independent t-test was used to examine each individual phrase, and the phrases with the highest significance were identified and retained to compose the scale. Results: a Taiwan Health Literacy Scale (THLS) was finalized with 66 health-related phrases under nine divisions. Cronbach's alpha for each division is at a satisfactory level of 0.89 or above. Conclusions: in this initial application, the factors that significantly differentiate levels of health literacy are education, female gender, age, being a family member of a stroke victim, experience with patient care, and being a healthcare professional.
Abstract: The present study was performed in Musa Bay (northern part of the Persian Gulf) around the coastal area of the Bandar-e Imam Khomeini and Razi petrochemical companies. Sediment and effluent samples were collected from the selected stations from June 2009 to June 2010 and analyzed to determine the degree of hydrocarbon contamination. The average TPH concentration in the study area was above the natural background value at all stations, especially at station BI1, the main effluent outlet of the Bandar-e Imam Khomeini petrochemical company. The concentration of total petroleum hydrocarbons was also monitored in the effluents of the aforementioned companies, and the results showed that the TPH concentration in the effluents of the Bandar-e Imam Khomeini petrochemical company was greater than that of the Razi petrochemical company, which may be related to the products of the Bandar-e Imam Khomeini petrochemical company (aromatics, polymers, chemicals, fuel).
Abstract: Outlier detection in streaming data is very challenging because streaming data cannot be scanned multiple times and new concepts may keep evolving. Irrelevant attributes can be termed noisy attributes, and such attributes further magnify the challenge of working with data streams. In this paper, we propose an unsupervised outlier detection scheme for streaming data. The scheme is based on clustering, since clustering is an unsupervised data mining task that does not require labeled data; both density-based and partitioning clustering are combined for outlier detection. Partitioning clustering is also used to assign adaptive weights to attributes according to their respective relevance; weighted attributes help reduce or remove the effect of noisy attributes. Keeping in view the challenges of streaming data, the proposed scheme is incremental and adaptive to concept evolution. Experimental results on synthetic and real-world data sets show that the proposed approach outperforms an existing approach (CORM) in terms of outlier detection rate and false alarm rate across increasing percentages of outliers.
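The combination of partitioning clustering and attribute weighting can be sketched as follows; the weighting rule (inverse within-cluster variance) and the data are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

# Sketch of clustering-based outlier scoring with adaptive attribute weights.
def assign(X, centers):
    """Label each point with its nearest centroid (partitioning step)."""
    return np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)

def attribute_weights(X, labels, k):
    # Assumed rule: attributes with low within-cluster variance are treated
    # as more relevant, so noisy attributes receive small weights.
    var = np.zeros(X.shape[1])
    for c in range(k):
        pts = X[labels == c]
        if len(pts) > 1:
            var += pts.var(axis=0)
    w = 1.0 / (var + 1e-9)
    return w / w.sum()

def outlier_scores(X, centers):
    labels = assign(X, centers)
    w = attribute_weights(X, labels, len(centers))
    d = X - centers[labels]
    return np.sqrt((w * d ** 2).sum(axis=1))  # weighted distance to own centroid
```

Points far from every centroid in the weighted metric receive the highest scores and are flagged as outliers; recomputing the weights as new points arrive is what makes the weighting adaptive in a streaming setting.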