Abstract: This paper deals with stability analysis for synchronous reluctance motor drives. Special attention is paid to the transient performance under variations in motor parameters such as Ld and Rs. A study of the dynamic control using the d-q model is presented first in order to clarify the stability of the motor drive system. Based on the experimental parameters of the synchronous reluctance motor, this paper gives some simulation results using the MATLAB/SIMULINK software package. It is concluded that the motor parameters, especially Ld, affect the estimator stability and hence the whole drive system.
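The parameter sensitivity described above can be illustrated by checking the poles of the linearized d-q current dynamics while sweeping Ld. This is a minimal sketch with hypothetical machine parameters (Rs, Lq, speed), not the paper's experimental values:

```python
import numpy as np

# Hypothetical SynRM parameters (illustrative only): Rs in ohms, Lq in H.
Rs, Lq, omega = 1.2, 0.08, 2 * np.pi * 50  # electrical speed in rad/s

def dq_state_matrix(Rs, Ld, Lq, omega):
    """Linearized d-q current dynamics of a synchronous reluctance motor:
    d/dt [id, iq]^T = A [id, iq]^T + (input terms)."""
    return np.array([[-Rs / Ld,          omega * Lq / Ld],
                     [-omega * Ld / Lq,  -Rs / Lq       ]])

# Sweep Ld to see how a parameter variation shifts the system poles.
for Ld in (0.2, 0.3, 0.4):
    eigs = np.linalg.eigvals(dq_state_matrix(Rs, Ld, Lq, omega))
    print(f"Ld={Ld:.2f} H -> max Re(eig) = {eigs.real.max():.2f}")
```

The real parts of the eigenvalues quantify how strongly an Ld mismatch moves the drive toward the stability boundary.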
Abstract: A new reversed-phase high-performance liquid chromatography (RP-HPLC) method with a fluorescence detector (FLD) was developed and optimized for norfloxacin determination in human plasma. Mobile-phase composition, extraction method, and excitation and emission wavelengths were varied for optimization. The HPLC system contained a reversed-phase C18 (5 μm, 4.6 mm×150 mm) column with the FLD operated at 330 nm excitation and 440 nm emission. The optimized mobile phase consisted of 14% acetonitrile in buffer solution. The aqueous phase, prepared by mixing 2 g of citric acid, 2 g of sodium acetate and 1 mL of triethylamine in 1 L of Milli-Q water, was run at a flow rate of 1.2 mL/min. The standard curve was linear over the range tested (0.156–20 μg/mL), with a coefficient of determination of 0.9978. Aceclofenac sodium was used as the internal standard. A detection limit of 0.078 μg/mL was achieved. The run time was set at 10 minutes; the retention time of norfloxacin was 0.99 min, which shows the rapidity of this method of analysis. The present assay showed good accuracy, precision and sensitivity for norfloxacin determination in human plasma with a new internal standard, and it can be applied to pharmacokinetic evaluation of norfloxacin tablets after oral administration in humans.
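The standard-curve step above is an ordinary linear calibration. The sketch below uses hypothetical peak-area ratios (the concentrations match the reported linear range, but the response values are assumed for illustration, not taken from the study):

```python
import numpy as np

# Hypothetical calibration points: concentrations within the reported
# linear range (0.156-20 ug/mL) vs assumed peak-area ratios
# (norfloxacin / internal standard).
conc = np.array([0.156, 0.625, 2.5, 5.0, 10.0, 20.0])          # ug/mL
area_ratio = np.array([0.031, 0.124, 0.502, 1.003, 2.010, 4.019])

slope, intercept = np.polyfit(conc, area_ratio, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area_ratio - pred) ** 2) / np.sum((area_ratio - area_ratio.mean()) ** 2)
print(f"slope={slope:.4f}, intercept={intercept:.4f}, R^2={r2:.4f}")

# An unknown plasma sample is quantified by inverting the line:
unknown_ratio = 1.5
print(f"estimated concentration: {(unknown_ratio - intercept) / slope:.2f} ug/mL")
```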
Abstract: The Time-Domain Boundary Element Method (TD-BEM) is a
well-known numerical technique that properly handles dynamic analyses
involving media of infinite extent. However, when such analyses also
involve nonlinear behavior, very complex numerical procedures arise in
the TD-BEM, which may make its application prohibitive. In order to
avoid this drawback and still model nonlinear infinite media, the
present work couples two BEM formulations, aiming to achieve the best
of both worlds. In this context, the regions expected to behave
nonlinearly are discretized by the Domain Boundary Element Method
(D-BEM), which has a simpler mathematical formulation but is unable to
deal with infinite-domain analyses; the TD-BEM is employed in the
sense of an effective non-reflecting boundary. An iterative procedure,
based on a relaxed update of the variables at the common interfaces,
is considered for the coupling of the TD-BEM and D-BEM. The focus is
on elastoplastic models, and each BEM formulation is allowed to use a
different time step in the coupled analysis.
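The relaxed interface update at the heart of such iterative coupling schemes is a weighted average of the values each subdomain solver computes independently. A minimal sketch, with both solvers stood in for by simple linear maps (an assumption purely to show the fixed-point iteration converging, not the actual BEM solves):

```python
import numpy as np

def relaxed_interface_update(u_dbem, u_tdbem, theta=0.5):
    """One relaxed renewal of the interface variables shared by the
    D-BEM and TD-BEM subdomains: a weighted average of the two
    independently computed values (theta is the relaxation parameter)."""
    return theta * u_dbem + (1.0 - theta) * u_tdbem

# Toy fixed-point iteration; the two lambdas below merely stand in for
# the subdomain solves and are assumptions for illustration.
u = np.zeros(3)
target = np.array([1.0, 2.0, 3.0])
for k in range(200):
    u_dbem  = 0.5 * u + 0.5 * target   # stand-in for the D-BEM solve
    u_tdbem = 0.8 * u + 0.2 * target   # stand-in for the TD-BEM solve
    u_new = relaxed_interface_update(u_dbem, u_tdbem, theta=0.6)
    if np.max(np.abs(u_new - u)) < 1e-12:
        break
    u = u_new
print(u)  # converges to the common interface solution
```

In practice theta is tuned (or adapted) so that the interface iteration converges even when the two formulations use different time steps.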
Abstract: The development of wireless communication technologies has changed our way of living at a global level. After the international success of mobile telephony standards, location- and time-independent voice connection has become the default method in daily telecommunications. Today, highly advanced multimedia messaging plays a key role in value-added service handling. Along with evolving data services, the need for more complex applications can be seen, including the mobile use of broadcast technologies. Here, the performance of a system design for terrestrial multimedia content is examined with emphasis on mobile reception. This review paper covers the role of the physical layer and the effects of the terrestrial channel on terrestrial multimedia transmission using OFDM, with DVB-H as the benchmark standard.
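The OFDM physical layer mentioned above reduces to a simple pipeline per symbol: map bits to subcarrier constellation points, inverse-FFT to the time domain, and prepend a cyclic prefix. A generic sketch (QPSK over 64 subcarriers, an ideal channel; these sizes are illustrative, not DVB-H's actual mode parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
n_sub, cp_len = 64, 16  # illustrative sizes, not a DVB-H mode

bits = rng.integers(0, 2, size=2 * n_sub)
# QPSK mapping: pairs of bits -> {+-1 +- 1j}/sqrt(2)
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

time_signal = np.fft.ifft(symbols) * np.sqrt(n_sub)        # one OFDM symbol
tx = np.concatenate([time_signal[-cp_len:], time_signal])  # cyclic prefix

# Receiver (ideal channel): strip the prefix, FFT back to subcarriers.
rx = np.fft.fft(tx[cp_len:]) / np.sqrt(n_sub)
print(np.max(np.abs(rx - symbols)))  # ~0: symbols recovered exactly
```

The cyclic prefix is what makes the terrestrial multipath channel look like a per-subcarrier complex gain, which is why OFDM suits mobile reception.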
Abstract: In this paper, we introduce a mobile agent framework with
proactive load balancing for ambient intelligence (AmI) environments.
One of the main obstacles to AmI is scalability: the openness of an
AmI environment introduces dynamic resource requirements on agencies.
To mitigate this scalability problem, our framework proposes a load
balancing module that proactively analyzes the resource consumption of
network bandwidth and preferred agencies in order to suggest the
optimal communication method to its user. The framework formulates an
AmI environment that consists of three main components: (1) mobile
devices, (2) hosts or agencies, and (3) a directory service center
(DSC). A preliminary implementation was conducted with NetLogo, and
the experimental results show that the proposed approach enhances
system performance by minimizing network utilization, providing users
with responsive services.
Abstract: This paper presents a sensor-based motion planning algorithm for 3-DOF car-like robots with a nonholonomic constraint. Like the classic Bug family of algorithms, the proposed algorithm enables the car-like robot to navigate a completely unknown environment using only range sensor information. The robot uses its local range sensor view to determine a local path that moves it towards the goal. To guarantee that the robot can approach the goal, two modes of motion are alternated, termed motion-to-goal and wall-following. The motion-to-goal behavior lets the robot move directly toward the goal, and the wall-following behavior makes the robot circumnavigate the obstacle boundary until it meets the leaving condition. For each behavior, the nonholonomic motion of the car-like robot is planned in terms of the instantaneous turning radius. The proposed algorithm is implemented on a real robot, and the experimental results demonstrate its performance.
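Planning "in terms of the instantaneous turning radius" typically means fitting a circular arc from the current pose to a local target. A pure-pursuit-style sketch of that geometric step (illustrative of arc-based local planning, not the paper's exact planner):

```python
import math

def turning_radius(pose, goal):
    """Radius of the circular arc that starts at the robot's pose
    (x, y, heading), is tangent to its heading, and passes through the
    goal; math.inf means 'drive straight'. Sign encodes turn direction
    (positive = left)."""
    x, y, th = pose
    dx, dy = goal[0] - x, goal[1] - y
    # Goal expressed in the robot frame (x forward, y to the left).
    xl = math.cos(th) * dx + math.sin(th) * dy
    yl = -math.sin(th) * dx + math.cos(th) * dy
    if abs(yl) < 1e-12:
        return math.inf                 # goal dead ahead (or behind)
    return (xl * xl + yl * yl) / (2.0 * yl)

print(turning_radius((0, 0, 0), (10, 0)))  # straight ahead -> inf
print(turning_radius((0, 0, 0), (0, 4)))   # goal 4 m to the left -> radius 2
```

Clamping this radius to the car's minimum turning radius is what enforces the nonholonomic constraint during both behaviors.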
Abstract: This paper presents a cold-flow simulation study of a small gas turbine combustor performed using a laboratory-scale test rig. The main objective of this investigation is to obtain physical insight into the main vortex, which is responsible for the efficient mixing of fuel and air. Such models are necessary for prediction and optimization of real gas turbine combustors. An air swirler can control combustor performance by assisting the fuel-air mixing process and by producing recirculation regions which act as flame holders and influence residence time. Thus, proper selection of a swirler is needed to enhance combustor performance and to reduce NOx emissions. Three different axial air swirlers were used, based on their vane angles, i.e., 30°, 45°, and 60°. Three-dimensional, viscous, turbulent, isothermal flow characteristics of the combustor model operating at room temperature were simulated via a Reynolds-Averaged Navier-Stokes (RANS) code. The model geometry was created using a solid modeler, and the meshing was done using the GAMBIT preprocessing package. Finally, the solution and analysis were carried out in the FLUENT solver. This serves to demonstrate the capability of the code for design and analysis of real combustors. The effects of the swirlers and of the mass flow rate were examined. Details of the complex flow structure, such as vortices and recirculation zones, were obtained from the simulation model. The computational model predicts a major recirculation zone in the central region immediately downstream of the fuel nozzle and a second recirculation zone in the upstream corner of the combustion chamber. It is also shown that changes in swirler angle have significant effects on the combustor flowfield as well as on pressure losses.
Abstract: Reducing the energy consumption of embedded systems requires careful memory management. It has been shown that Scratch-Pad Memories (SPMs) are small, low-cost, energy-efficient memories managed directly at the software level. In this paper, the focus is on heuristic methods for SPM management. A method is efficient if the number of accesses served by the SPM is as large as possible and if all available space (i.e. bits) is used. A Tabu Search (TS) approach for memory management is proposed which is, to the best of our knowledge, a new alternative to the best known existing heuristic (BEH). Experiments performed on benchmarks show that the Tabu Search method is as efficient as BEH (in terms of energy consumption), but BEH requires a sorting step which can be computationally expensive for large amounts of data. TS is easy to implement, and since no sorting is necessary, unlike in BEH, the corresponding sorting time is saved. In addition, in a dynamic setting where the maximum capacity of the SPM is not known in advance, the TS heuristic performs better than BEH.
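SPM allocation as framed above is a knapsack-like choice: place data blocks so that SPM-served accesses are maximized within the capacity. The sketch below is a generic Tabu Search with a bit-flip neighborhood and a short tabu tenure (a textbook TS, not the paper's exact procedure; block sizes and access counts are made up):

```python
import random

def tabu_spm(blocks, capacity, iters=300, tenure=5, seed=1):
    """Tabu Search sketch for SPM allocation: choose (size, accesses)
    blocks so total size <= capacity and placed accesses are maximized.
    Recently flipped blocks are tabu unless they beat the best (aspiration)."""
    rng = random.Random(seed)
    n = len(blocks)
    def value(sol): return sum(b[1] for b, s in zip(blocks, sol) if s)
    def size(sol):  return sum(b[0] for b, s in zip(blocks, sol) if s)
    cur = [False] * n
    best, best_val = cur[:], value(cur)
    tabu = {}
    for it in range(iters):
        cand, cand_i = None, None
        for i in rng.sample(range(n), n):       # scan bit-flip neighborhood
            nb = cur[:]; nb[i] = not nb[i]
            if size(nb) > capacity:
                continue
            if tabu.get(i, -1) >= it and value(nb) <= best_val:
                continue                        # tabu move, no aspiration
            if cand is None or value(nb) > value(cand):
                cand, cand_i = nb, i
        if cand is None:
            continue
        cur = cand
        tabu[cand_i] = it + tenure
        if value(cur) > best_val:
            best, best_val = cur[:], value(cur)
    return best, best_val

blocks = [(4, 30), (3, 14), (2, 16), (5, 9), (1, 8)]  # (size, #accesses), illustrative
sol, val = tabu_spm(blocks, capacity=7)
print(sol, val)  # fills the 7-unit SPM with the most-accessed blocks
```

Note that, unlike BEH, nothing here sorts the blocks; the neighborhood scan replaces the sort entirely.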
Abstract: Textile structures are engineered and fabricated to meet
structural applications worldwide. Nevertheless, research on varying
the textile structure of natural-fibre composite reinforcement is very
limited. Most research focuses on short fibres and randomly oriented
discontinuous reinforcement structures. Recognizing that natural fibre
(NF) composites have been widely developed as replacements for
synthetic-fibre composites, this research examines the influence of
woven and cross-ply laminated structures on their mechanical
performance. Laminated natural-fibre composites were fabricated using
hand lay-up and vacuum bagging techniques. Impact and flexural
strength were investigated as functions of fibre type (coir and kenaf)
and reinforcement structure (imbalanced plain woven, 0°/90° cross-ply
and +45°/-45° cross-ply). A multi-level full factorial design of
experiments (DOE) and analysis of variance (ANOVA) were employed to
determine how the fibre type and reinforcement structure parameters
affect the mechanical properties of the composites. This systematic
experimentation identified the significant factors that predominantly
influence the impact and flexural properties of the textile
composites. Both fibre type and reinforcement structure produced
significantly different results. Overall, the coir composites and the
woven structure exhibited better impact and flexural strength, while
the cross-ply composite structure demonstrated better fracture
resistance.
Abstract: The image segmentation method described in this paper has
been developed as a pre-processing stage for methodologies and tools
for content-based video/image indexing and retrieval. The method
solves the problem of extracting whole objects from the background and
produces images of single complete objects from videos or photos. The
extracted images are used to calculate the object visual features
necessary for both the indexing and retrieval processes.
The segmentation algorithm is based on the cooperation of an optical
flow evaluation method, edge detection and region growing procedures.
The optical flow estimator belongs to the class of differential
methods. It can detect motions ranging from a fraction of a pixel to a
few pixels per frame, achieves good results in the presence of noise
without the need for a pre-filtering stage, and includes a specialised
model for moving object detection.
The first task of the presented method exploits cues from motion
analysis to detect moving areas. Objects and background are then
refined using edge detection and seeded region growing procedures,
respectively. All tasks are performed iteratively until objects and
background are completely resolved.
The method has been applied to a variety of indoor and outdoor scenes
in which objects of different types and shapes appear against
variously textured backgrounds.
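The seeded region growing step used for refinement can be sketched in a few lines: flood outward from a seed pixel, absorbing neighbours whose intensity stays close to the seed's. A minimal 4-connected version (generic SRG, not the paper's tuned procedure; the image and tolerance are illustrative):

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=10):
    """Seeded region growing sketch: flood out from `seed`, absorbing
    4-connected pixels whose intensity is within `tol` of the seed value.
    Returns a boolean mask of the grown region."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    base = int(img[seed])
    q = deque([seed])
    mask[seed] = True
    while q:
        y, x = q.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(int(img[ny, nx]) - base) <= tol):
                mask[ny, nx] = True
                q.append((ny, nx))
    return mask

# A bright object on a dark background:
img = np.zeros((6, 6), dtype=np.uint8)
img[1:4, 1:5] = 200
mask = region_grow(img, seed=(2, 2), tol=10)
print(mask.sum())  # 12 pixels: exactly the 3x4 bright block
```

In the full method the seeds come from the motion analysis, and the grown regions are reconciled with the edge map on each iteration.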
Abstract: Fine alignment of main ship power plant mechanisms and
shaft lines provides long-term, failure-free performance of the
propulsion system, while fast, high-quality installation of mechanisms
and shaft lines decreases overall labor intensity. Checking allowable
shaft-line stresses and setting the alignment requires calculations
covering various stages of the life cycle. In 2012, JSC SSTC developed
the special software complex “Shaftline” for shaft-line alignment
calculations, with its own I/O interface and a 3D display of the shaft
line. Aligning a shaft line by bearing loads is a rather
labor-intensive procedure. To shorten it, JSC SSTC developed an
automated alignment system for ship power plant mechanisms. The
system's operating principle is based on automatic simulation of the
design loads on the bearings. Initial data for shaft-line alignment
can be exported to the automated alignment system from the “Shaftline”
software complex.
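Alignment by bearing loads is commonly formulated with influence coefficients: small vertical bearing offsets change the loads approximately linearly, L = L0 + A·x, so the offsets that restore the design loads solve a least-squares problem. This is a generic sketch of that idea with illustrative numbers, not the internals of "Shaftline":

```python
import numpy as np

# Illustrative values only: measured loads, design loads (kN) and an
# assumed influence-coefficient matrix (kN per mm of bearing offset).
L0 = np.array([120.0, 80.0, 150.0])
L_design = np.array([100.0, 100.0, 140.0])
A = np.array([[-40.0,  25.0,   5.0],
              [ 25.0, -50.0,  25.0],
              [  5.0,  25.0, -40.0]])

# Solve A @ x = L_design - L0 for the vertical offsets x (mm).
x, *_ = np.linalg.lstsq(A, L_design - L0, rcond=None)
print(np.round(x, 3))            # suggested bearing offsets, mm
print(np.round(L0 + A @ x, 1))   # predicted loads after correction
```

An automated system can iterate this: measure loads, compute offsets, adjust, and re-measure until the design loads are met.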
Abstract: This article demonstrates the development of a
controlled-release system for the NSAID drug diclofenac sodium
employing different ratios of ethyl cellulose. Diclofenac sodium and
ethyl cellulose in different proportions were processed into
microcapsules by microencapsulation based on a phase-separation
technique. The prepared microcapsules were then compressed into
tablets to obtain controlled-release oral formulations. In-vitro
evaluation was performed by a dissolution test of each preparation in
900 mL of phosphate buffer solution of pH 7.2, maintained at
37 ± 0.5 °C and stirred at 50 rpm, with samples withdrawn at
predetermined time intervals (0, 0.5, 1.0, 1.5, 2, 3, 4, 6, 8, 10, 12,
16, 20 and 24 h). The drug concentration in the collected samples was
determined by UV spectrophotometry at 276 nm. The physical
characteristics of the diclofenac sodium microcapsules were within the
accepted range: they were off-white, free flowing and spherical in
shape. The release profile of diclofenac sodium from the microcapsules
was found to depend directly on the proportion of ethyl cellulose and
the coat thickness. The in-vitro release pattern showed that at
drug:polymer ratios of 1:1 and 1:2, the percentage of drug released in
the first hour was 16.91% and 11.52%, respectively, compared with only
6.87% for the 1:3 ratio within the same time. The release mechanism
followed the Higuchi model. The tablet formulation (F2) of the present
study was found comparable in release profile to the marketed brand
Phlogin-SR, and the microcapsules showed extended release beyond 24 h.
Further, a good correlation was found between drug release and the
proportion of ethyl cellulose in the microcapsules.
Microencapsulation based on coacervation was found to be a good
technique for controlling the release of diclofenac sodium in
controlled-release formulations.
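Fitting dissolution data to the Higuchi model means regressing cumulative release against the square root of time, Q(t) = k_H·√t. A sketch with hypothetical release data (the points below are invented for illustration and are not the measured values from this study):

```python
import numpy as np

# Hypothetical dissolution profile: time (h) vs cumulative % released.
t = np.array([0.5, 1, 2, 4, 8, 12, 24])
Q = np.array([8.1, 11.4, 16.3, 22.9, 32.5, 39.7, 56.2])

# Higuchi model Q = k_H * sqrt(t): linear least squares through the origin.
k_H, *_ = np.linalg.lstsq(np.sqrt(t)[:, None], Q, rcond=None)
pred = k_H[0] * np.sqrt(t)
r2 = 1 - np.sum((Q - pred) ** 2) / np.sum((Q - Q.mean()) ** 2)
print(f"k_H = {k_H[0]:.2f} %/sqrt(h), R^2 = {r2:.4f}")
```

A high R² on the √t axis is the usual evidence that diffusion through the ethyl cellulose coat, rather than erosion, controls the release.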
Abstract: Two geometrically nonlinear plate theories, based on either first- or third-order transverse shear deformation theory, are used for finite element modeling and simulation of the transient response of smart structures incorporating piezoelectric layers. In particular, the time histories of the nonlinear vibrations and of the sensor voltage output of a thin beam with a piezoelectric patch bonded to its surface, subjected to an applied step force, are studied.
Abstract: One of the significant factors in improving the accuracy of
Land Surface Temperature (LST) retrieval is a correct understanding of
the directional anisotropy of thermal radiance. In this paper, the
multiple scattering effect between heterogeneous non-isothermal
surfaces is described rigorously using the concept of the
configuration factor; on this basis a directional thermal radiance
model is built, and the directional radiative character of the urban
canopy is analyzed. The model is applied to a simple urban canopy with
row structure to simulate the change of Directional Brightness
Temperature (DBT). The results show that the DBT is increased by the
multiple scattering effects, whereas its range of variation is
smoothed. The temperature difference, spatial distribution and
emissivity of the components can all lead to changes in DBT. A “hot
spot” phenomenon occurs when the proportion of the high-temperature
component in the field of view reaches its maximum; conversely, a
“cool spot” phenomenon occurs when the low-temperature proportion
reaches its maximum. The “spot” effect disappears only when the
proportion of every component remains constant. The model built in
this paper can be used to study the directional effect on emissivity,
LST retrieval over urban areas, and the adjacency effect of thermal
remote sensing pixels.
Abstract: In this study, an inland metropolitan area, Gwangju, in Korea, was selected to assess the amplification potential of earthquake motion and to provide information for regional seismic countermeasures. A geographic information system-based expert system was implemented for reliably predicting the spatial geotechnical layers across the entire region of interest by building a geo-knowledge database. The database consists of existing boring data gathered from prior geotechnical projects and surface geo-knowledge data acquired from site visits. For practical application of the geo-knowledge database to estimating the earthquake hazard potential related to site amplification effects in the study area, seismic zoning maps of geotechnical parameters, such as the bedrock depth and the site period, were created within the GIS framework. In addition, seismic zonation of site classification was performed to determine the site amplification coefficients for seismic design at any site in the study area.
Keywords: Earthquake hazard, geo-knowledge, geographic information system, seismic zonation, site period.
Abstract: Here we consider a non-uniform microstrip leaky-wave
antenna implemented on a dielectric waveguide by a sinusoidal profile
of periodic metallic grating. The non-uniform distribution of the
attenuation constant α along the propagation axis optimizes the
radiating characteristics and performance of such antennas. The method
developed here is based on an integral method in which the
admittance-operator formalism is combined with a BKW approximation.
First, the effect of the modeling on the modal analysis of complex
waves is studied in detail. Then, the BKW model is used for the
dispersion analysis of the antenna of interest. Following antenna
theory, continuity of the leaky-wave magnitude is enforced at the
discontinuities of the non-uniform structure. To test the validity of
our dispersion analysis, computed radiation patterns are presented and
compared in the millimeter band.
Abstract: Structured catalysts formed by the growth of zeolites on
substrates are an area of increasing interest due to the increased
efficiency of the catalytic process and the ability to provide
superior heat transfer and thermal conductivity for both exothermic
and endothermic processes.
However, the generation of structured catalysts represents a
significant challenge when balancing the relationship between
materials properties and catalytic performance, with the Na2O, H2O and
Al2O3 gel composition playing a significant role in this dynamic,
thereby affecting both the type and range of application.
The structured catalyst films generated in this investigation have
been characterised using a range of techniques, including X-ray
diffraction (XRD), scanning electron microscopy (SEM),
energy-dispersive X-ray analysis (EDX) and thermogravimetric analysis
(TGA), with the transition from oxide-on-alloy wires to hydrothermally
synthesised, uniformly zeolite-coated surfaces demonstrated using both
SEM and XRD. The robustness of the coatings was ascertained by
subjecting them to thermal cycling (ambient to 550 °C), with the
results indicating that the synthesis time and gel composition have a
crucial effect on the quality of zeolite growth on the FeCrAlloy
wires.
Finally, the activity of the structured catalyst was verified by a
series of comparison experiments with standard zeolite Y catalysts in
powdered and pelleted forms.
Abstract: Solar power plants (SPPs) have shown good outcomes in
providing various functions according to industrial expectations by
deploying ad-hoc networks of lightweight, battery-powered sensor
nodes. In particular, an algorithm is strongly needed that delivers
sensing data from the end nodes of a solar power plant to the sink
node on time. In this paper, based on this observation, we propose an
IEEE 802.15.4-based self-routing scheme for solar power plants. The
proposed Beacon-based Priority Routing Algorithm (BPRA) utilizes
beacon periods to send messages embedding high-priority data, thus
providing a high quality of service (QoS) under the given criteria.
The performance measures are packet throughput, delivery ratio,
latency and total energy consumption. Simulation results under the
TinyOS Simulator (TOSSIM) show that the proposed scheme outperforms
conventional Ad hoc On-Demand Distance Vector (AODV) routing in solar
power plants.
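The priority idea behind BPRA, draining urgent readings before routine ones during a beacon period, can be sketched as a simple priority queue. This shows only the queue discipline, not the full routing protocol; the message names are invented for illustration:

```python
import heapq

class BeaconQueue:
    """Sketch of beacon-period message ordering: a node drains its queue
    highest-priority-first so urgent plant readings reach the sink ahead
    of routine ones (lower number = higher priority)."""
    def __init__(self):
        self._heap, self._seq = [], 0
    def push(self, priority, payload):
        # seq preserves FIFO order among messages of equal priority.
        heapq.heappush(self._heap, (priority, self._seq, payload))
        self._seq += 1
    def drain(self):
        out = []
        while self._heap:
            out.append(heapq.heappop(self._heap)[2])
        return out

q = BeaconQueue()
q.push(2, "routine-temp")
q.push(0, "alarm-overvoltage")
q.push(1, "panel-current")
order = q.drain()
print(order)  # ['alarm-overvoltage', 'panel-current', 'routine-temp']
```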
Abstract: In this study, we developed an algorithm for detecting
seam cracks in a steel plate. Seam cracks are generated in the edge
region of a steel plate. We used the Gabor filter and an adaptive double
threshold method to detect them. To reduce the number of pseudo
defects, features based on the shape of seam cracks were used. To
evaluate the performance of the proposed algorithm, we tested 989
images with seam cracks and 9470 defect-free images. Experimental
results show that the proposed algorithm is suitable for detecting seam
cracks. However, it should be improved to increase the true positive
rate.
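The adaptive double threshold step above is a hysteresis-style rule: pixels well above the local statistics are strong detections, and moderately high pixels survive only if connected to a strong one. A minimal sketch applied to a synthetic filter response (generic thresholds and a toy image, not the paper's tuned parameters or its Gabor front end):

```python
import numpy as np
from collections import deque

def adaptive_double_threshold(resp, k_high=3.0, k_low=1.5):
    """Pixels above mu + k_high*sigma are strong; pixels above
    mu + k_low*sigma are kept only if 8-connected to a strong pixel.
    `resp` would be a filter response (e.g. a Gabor magnitude image)."""
    mu, sigma = resp.mean(), resp.std()
    strong = resp > mu + k_high * sigma
    weak = resp > mu + k_low * sigma
    out = strong.copy()
    q = deque(zip(*np.nonzero(strong)))
    h, w = resp.shape
    while q:                       # grow strong seeds through weak pixels
        y, x = q.popleft()
        for ny in (y - 1, y, y + 1):
            for nx in (x - 1, x, x + 1):
                if 0 <= ny < h and 0 <= nx < w and weak[ny, nx] and not out[ny, nx]:
                    out[ny, nx] = True
                    q.append((ny, nx))
    return out

# Synthetic response: a faint crack-like ridge with one strong core pixel.
resp = np.zeros((10, 10))
resp[5, 2:8] = 5.0    # weak ridge
resp[5, 4] = 20.0     # strong core
mask = adaptive_double_threshold(resp)
print(mask.sum())     # the whole 6-pixel ridge survives via connectivity
```

This is why the method keeps faint but connected crack segments while discarding isolated noise of the same intensity.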
Abstract: Deoxyribonucleic acid (DNA) computing has emerged as an
interdisciplinary field that draws together chemistry, molecular
biology, computer science and mathematics. In this paper, the
possibility of DNA-based computing to solve the absolute 1-center
problem by molecular manipulations is presented. This is truly the
first attempt to solve such a problem by a DNA-based computing
approach. Since part of the procedure involves shortest-path
computation, research on DNA computing for the shortest-path Traveling
Salesman Problem (TSP) is reviewed. These approaches are studied, and
the most appropriate one is adapted in designing the computation
procedures. The DNA-based computation is designed in such a way that
every path is encoded by oligonucleotides and the path's length is
directly proportional to the length of the oligonucleotides. Using
these properties, gel electrophoresis is performed in order to
separate the respective DNA molecules according to their length. One
expectation arising from this paper is that it will be possible to
verify instances of the absolute 1-center problem using DNA computing
in laboratory experiments.
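The computation can be mirrored in silicon: compute every vertex-to-vertex shortest-path length (the role of the encoded oligonucleotides), separate the lengths per candidate center (the role gel electrophoresis plays), and pick the vertex whose longest strand is shortest. A sketch restricted to vertex candidates, with an illustrative graph (the full absolute 1-center also admits points interior to edges):

```python
import numpy as np

INF = float("inf")
# Illustrative weighted graph: edges 0-1:4, 0-3:1, 1-2:2, 1-3:5, 2-3:3.
W = np.array([[0,   4, INF, 1],
              [4,   0,   2, 5],
              [INF, 2,   0, 3],
              [1,   5,   3, 0]], dtype=float)

# Floyd-Warshall all-pairs shortest paths (every path "strand length").
D = W.copy()
n = len(D)
for k in range(n):
    for i in range(n):
        for j in range(n):
            D[i, j] = min(D[i, j], D[i, k] + D[k, j])

# Per candidate center, the longest surviving strand is its eccentricity;
# the 1-center minimizes it.
ecc = D.max(axis=1)
center = int(ecc.argmin())
print(D.astype(int))
print(f"vertex {center}, eccentricity {ecc[center]:.0f}")
```

In the wet-lab version, length-based separation on the gel performs the max/min comparisons implicitly: the winning center is readable from the band pattern.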