Abstract: Reducing the energy consumption of embedded systems requires careful memory management. Scratch-Pad Memories (SPMs) have been shown to be small, low-cost, energy-efficient memories managed directly at the software level. This paper focuses on heuristic methods for SPM management. A method is efficient if the number of accesses to the SPM is as large as possible and if all available space (i.e. bits) is used. A Tabu Search (TS) approach to memory management is proposed which is, to the best of our knowledge, an original alternative to the best known existing heuristic (BEH). Experiments performed on benchmarks show that the Tabu Search method is as efficient as BEH in terms of energy consumption, but BEH requires a sorting step that can be computationally expensive for large amounts of data. TS is easy to implement and, since no sorting is necessary, the corresponding sorting time is saved. In addition, in a dynamic setting where the maximum capacity of the SPM is not known in advance, the TS heuristic performs better than BEH.
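The SPM allocation problem above can be viewed as a knapsack-style search: choose which data objects to place in the scratch-pad so that total accesses are maximized without exceeding capacity. The following is a minimal sketch of a Tabu Search for that formulation; the single-flip neighborhood, tenure, and function names are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch of a Tabu Search for SPM allocation (illustrative, not the paper's code).
import random

def tabu_search_spm(sizes, accesses, capacity, iters=200, tenure=7, seed=0):
    """Maximize total accesses of objects placed in the SPM without
    exceeding `capacity`. Neighborhood: flip one object in/out."""
    rng = random.Random(seed)
    n = len(sizes)
    current = [False] * n           # start with an empty SPM
    best, best_val = current[:], 0
    tabu = {}                       # object index -> iteration until which it is tabu

    def value(sol):
        used = sum(s for s, b in zip(sizes, sol) if b)
        if used > capacity:
            return -1               # infeasible solution
        return sum(a for a, b in zip(accesses, sol) if b)

    for it in range(iters):
        cand, cand_val, cand_i = None, -1, None
        for i in rng.sample(range(n), n):
            neigh = current[:]
            neigh[i] = not neigh[i]
            v = value(neigh)
            # aspiration criterion: a tabu move is allowed if it beats the best
            if tabu.get(i, -1) >= it and v <= best_val:
                continue
            if v > cand_val:
                cand, cand_val, cand_i = neigh, v, i
        if cand is None:
            break                   # no admissible move left
        current = cand
        tabu[cand_i] = it + tenure
        if cand_val > best_val:
            best, best_val = cand[:], cand_val
    return best, best_val
```

Unlike BEH, no sorting of the objects by access density is needed; the search explores flips directly, which is what makes the approach attractive when the SPM capacity is not known in advance.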
Abstract: The image segmentation method described in this
paper has been developed as a pre-processing stage for
methodologies and tools for content-based video/image indexing and
retrieval. The method extracts whole objects from the background,
producing images of single complete objects from videos or photos.
The extracted images are used to calculate the object visual features
needed for both the indexing and the retrieval processes.
The segmentation algorithm is based on the cooperation among an
optical flow estimation method, edge detection and region growing
procedures. The optical flow estimator belongs to the class of
differential methods. It can detect motions ranging from a fraction of
a pixel to a few pixels per frame, achieves good results in the
presence of noise without requiring a filtering pre-processing stage,
and includes a specialised model for moving object detection.
The first task of the presented method exploits the cues from
motion analysis to detect moving areas. Objects and background
are then refined using edge detection and seeded region growing
procedures, respectively. All the tasks are performed iteratively until
objects and background are completely resolved.
The method has been applied to a variety of indoor and outdoor
scenes in which objects of different types and shapes appear on
variously textured backgrounds.
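One of the cooperating procedures named above, seeded region growing, can be sketched minimally as follows. The 4-neighbour flood with a fixed intensity threshold is an illustrative simplification; the paper's actual growing criterion and its interaction with the edge and motion cues are not specified here.

```python
# Minimal seeded region growing on a grayscale image (illustrative sketch).
from collections import deque

def region_grow(img, seed, thresh):
    """Grow a region from `seed` (row, col): add 4-neighbours whose
    intensity differs from the seed intensity by at most `thresh`."""
    rows, cols = len(img), len(img[0])
    seed_val = img[seed[0]][seed[1]]
    region = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(img[nr][nc] - seed_val) <= thresh):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region
```

In the method described above, the seeds would come from the moving areas detected by the optical flow stage, and edge detection would bound the growth.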
Abstract: This paper explores the development of an optimized method and apparatus for retrieving an extended high dynamic range from a digital negative image. Architectural photo imaging can benefit from the high dynamic range imaging (HDRI) technique for preserving and presenting sufficient luminance in shadow and highlight clipping areas of an image. HDRI techniques that require multiple exposure images as the source of HDRI rendering may not be time-efficient during the acquisition process and post-processing stage, given the numerous potential imaging variables and technical limitations of the multiple exposure process. This paper explores an experimental method and apparatus that aim to expand the dynamic range of a digital negative image in an HDRI environment. The method and apparatus are based on a single RAW image acquisition used for HDRI post-processing. This optimization avoids or minimizes the conventional HDRI photographic errors caused by differing physical conditions during the photographing process and by the misalignment of multiple exposed image sequences. The study observes the characteristics and capabilities of the RAW image format, used as a digital negative, for retrieving an extended high dynamic range in an HDRI environment.
Abstract: Fine alignment of the main ship power plant mechanisms
and shaft lines provides long-term, failure-free performance of the
propulsion system, while fast, high-quality installation of
mechanisms and shaft lines decreases overall labor intensity.
Checking the allowable shaft line stresses and setting its alignment
require calculations that consider various stages of the life
cycle. In 2012 JSC SSTC developed the special software complex
“Shaftline” for shaft line alignment calculations; it has its own I/O
interface and displays a 3D model of the shaft line. Aligning a shaft
line according to bearing loads is a rather labor-intensive procedure.
To shorten it, JSC SSTC developed an automated alignment
system for ship power plant mechanisms. The system's operating
principle is based on automatic simulation of the design loads on the
bearings. Initial data for shaft line alignment can be exported to the
automated alignment system from PC “Shaftline”.
Abstract: This article describes the development of a
controlled-release system for the NSAID drug diclofenac
sodium, employing different ratios of ethyl cellulose.
Diclofenac sodium and ethyl cellulose in different proportions
were processed into microcapsules by microencapsulation
based on a phase separation technique. The prepared
microcapsules were then compressed into tablets to obtain
controlled-release oral formulations. For in-vitro evaluation,
a dissolution test of each preparation was conducted in 900 ml
of phosphate buffer solution of pH 7.2, maintained at
37 ± 0.5 °C and stirred at 50 rpm, with samples collected at
predetermined time intervals (0, 0.5, 1.0, 1.5, 2, 3, 4, 6, 8, 10,
12, 16, 20 and 24 h). The drug concentration in the collected
samples was determined by UV spectrophotometry at 276 nm.
The physical characteristics of the diclofenac sodium
microcapsules were within the accepted range: they were
off-white, free flowing and spherical in shape. The release
profile of diclofenac sodium from the microcapsules was found
to depend directly on the proportion of ethyl cellulose and the
coat thickness. The in-vitro release pattern showed that, at
drug:polymer ratios of 1:1 and 1:2, the percentage of drug
released in the first hour was 16.91% and 11.52%, respectively,
compared with only 6.87% at a 1:3 ratio. The release
followed the Higuchi model. The tablet formulation (F2) of the
present study was found comparable in release profile to the
marketed brand Phlogin-SR, and the microcapsules showed
extended release beyond 24 h. Furthermore, a good correlation
was found between drug release and the proportion of ethyl
cellulose in the microcapsules. Microencapsulation based on
coacervation was found to be a good technique for controlling
the release of diclofenac sodium in controlled-release
formulations.
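The Higuchi model mentioned above describes cumulative release as proportional to the square root of time, Q(t) = kH·√t. A minimal least-squares fit of kH through the origin is sketched below; the data points are illustrative placeholders, not the study's measured values.

```python
# Least-squares fit of the Higuchi model Q = kH * sqrt(t) (illustrative sketch).
import math

def fit_higuchi(times_h, release_pct):
    """Fit Q = kH * sqrt(t) through the origin by least squares:
    kH = sum(sqrt(t) * Q) / sum(t), since (sqrt(t))^2 = t."""
    num = sum(math.sqrt(t) * q for t, q in zip(times_h, release_pct))
    den = sum(times_h)
    return num / den

# Hypothetical data following a perfect sqrt(t) profile with kH = 10:
kH = fit_higuchi([1, 4, 9], [10.0, 20.0, 30.0])
```

Fitting kH separately for each drug:polymer ratio would quantify the inverse relationship between release rate and ethyl cellulose proportion reported above.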
Abstract: One of the significant factors for improving the
accuracy of Land Surface Temperature (LST) retrieval is a correct
understanding of the directional anisotropy of thermal radiance. In
this paper, the multiple scattering effect between heterogeneous
non-isothermal surfaces is described rigorously using the concept of
the configuration factor; on this basis a directional thermal radiance
model is built, and the directional radiant character of an urban
canopy is analyzed. The model is applied to a simple urban canopy
with a row structure to simulate changes in Directional Brightness
Temperature (DBT). The results show that the DBT is increased by
the multiple scattering effects, whereas its range of variation is
smoothed. The temperature differences, spatial distribution and
emissivities of the components can all change the DBT. The
“hot spot” phenomenon occurs when the proportion of the
high-temperature component in the field of view reaches its maximum;
conversely, the “cool spot” phenomenon occurs when the
low-temperature proportion reaches its maximum. The “spot” effect
disappears only when the proportion of every component remains
constant. The model built in this paper can be used to study the
directional effect on emissivity, LST retrieval over urban areas, and
the adjacency effect of thermal remote sensing pixels.
Abstract: Three-dimensional geometric models have been used
to present architectural and engineering works, showing their final
configuration. When the clarification of a detail or of a construction
step is needed, these models are not appropriate, as they do not allow
observation of the construction progress of a building. Models that
can dynamically present changes in the building geometry are a good
support for the elaboration of projects. Techniques of geometric
modeling and virtual reality were used to obtain models that visually
simulate the construction activity. The applications illustrate the
construction of a cavity wall and of a bridge. These models allow the
visualization of the physical progression of the work following a
planned construction sequence and the observation of details of the
shape of every component of the works, and they support the study
of the type and method of operation of the equipment used in the
construction. The models have shown distinct advantages as
educational aids in first-degree courses in Civil Engineering. The use
of Virtual Reality techniques in the development of educational
applications brings new perspectives to the teaching of subjects
related to the field of civil construction.
Abstract: The ability of agricultural and decorative plants to
absorb and detoxify TNT and RDX has been studied. All eight tested
plants, grown hydroponically, were able to absorb these explosives
from water solutions: alfalfa > soybean > chickpea > chickling vetch
> ryegrass > mung bean > china bean > maize. Unlike TNT, RDX
did not exhibit a negative influence on seed germination and plant
growth; moreover, some plants exposed to an RDX-containing
solution increased their biomass by 20%. A study of the fate of
absorbed [1-14C]-TNT revealed distribution of the label in low- and
high-molecular-mass compounds, both in roots and in above-ground
parts of the plants, prevailing in the latter. The content of 14C in
low-molecular-mass compounds is much higher in plant roots than in
above-ground parts; conversely, high-molecular-mass compounds are
more intensively labeled in the above-ground parts of soybean. Most
(up to 70%) of the TNT metabolites, formed by either enzymatic
reduction or oxidation, are found in high-molecular-mass insoluble
conjugates. Activation of the enzymes responsible for reduction,
oxidation and conjugation of TNT, such as nitroreductase,
peroxidase, phenoloxidase and glutathione S-transferase, has been
demonstrated. Among these enzymes, only nitroreductase was shown
to be induced in alfalfa exposed to RDX. The increase in malate
dehydrogenase activity in plants exposed to both explosives indicates
intensification of the tricarboxylic acid cycle, which generates the
reduced equivalents of NAD(P)H necessary for the functioning of the
nitroreductase. A hypothetical scheme of TNT metabolism in plants
is proposed.
Abstract: Stipples are desirable for pattern fills and
transparency effects, but some graphics standards, including
OpenGL ES 1.1 and 2.0, omit this feature. We present the details of
providing line stipples and polygon stipples by combining
texture mapping and alpha blending functions. We start from the
OpenGL-specified stipple-related API functions. The mathematical
transformations needed to obtain the correct texture coordinates are
explained. The overall algorithm is then presented, followed by its
implementation results. We accomplished both line and polygon
stipples, and verified the results with conformance test routines.
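The core idea, emulating a stipple with a texture, can be sketched by expanding OpenGL's 16-bit line-stipple pattern (as passed to glLineStipple, LSB first) into a 1-D alpha texture, where fully opaque texels are drawn and fully transparent texels are discarded by blending or alpha test. The expansion below is an illustrative sketch, not the paper's implementation.

```python
# Expand a glLineStipple-style pattern into 1-D alpha texels (illustrative sketch).
def stipple_to_alpha_texture(pattern, factor):
    """pattern: 16-bit int, consumed LSB first as in glLineStipple;
    factor: repeat count for each bit. Returns alpha texels (255 = draw)."""
    texels = []
    for bit in range(16):
        on = (pattern >> bit) & 1
        texels.extend([255 if on else 0] * factor)
    return texels
```

Mapping the line's screen-space length onto this texture's s coordinate (the transformation derived in the paper) then reproduces the dashed appearance under GL_REPEAT wrapping.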
Abstract: In any trust model, the two information sources that a peer relies on to predict the trustworthiness of another peer are direct experience and reputation. These two vital components evolve over time. Trust evolution is an important issue, in which the objective is to observe a sequence of past values of a trust parameter and determine future estimates. Unfortunately, trust evolution algorithms have received little attention, and the algorithms proposed in the literature do not comply with the conditions and the nature of trust. This paper contributes to this important problem in the following ways: (a) it presents an algorithm that manages and models trust evolution in a P2P environment, (b) it devises new mechanisms for effectively maintaining trust values based on the conditions that influence trust evolution, and (c) it introduces a new methodology for incorporating trust-nurturing incentives into the trust evolution algorithm. Simulation experiments are carried out to evaluate our trust evolution algorithm.
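A common baseline for the kind of update the abstract describes, blending direct experience with reputation while decaying older evidence, is an exponentially weighted average. The rule below is a generic illustration of that baseline, not the paper's algorithm; the weights are assumptions.

```python
# Generic exponentially weighted trust update (illustrative, not the paper's rule).
def update_trust(old_trust, direct, reputation, w_direct=0.7, decay=0.9):
    """Blend direct experience and reputation into new evidence, then
    decay the old trust estimate toward it. All values are in [0, 1]."""
    evidence = w_direct * direct + (1 - w_direct) * reputation
    return decay * old_trust + (1 - decay) * evidence
```

Because both terms are convex combinations, the updated value stays in [0, 1] whenever the inputs do, one of the "conditions of trust" any evolution rule must respect.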
Abstract: This paper explores the sense of place in the Vredefort Dome World Heritage Site, South Africa, as an essential input to the formulation of spatial planning proposals for the area. Intangible aspects such as the personal and symbolic meanings of sites are currently not integrated into spatial planning in South Africa. This may have a detrimental effect on local inhabitants who have a long history with the site and have built up a strong place identity. Involving local inhabitants at an early stage of the planning process and incorporating their attitudes and opinions into future interventions in the area may also contribute to acceptance of the legitimacy of future policy. An interdisciplinary, mixed-method research approach was followed in this study to identify possible ways to anchor spatial planning proposals in the identity of the place. In essence, the qualitative study revealed that inhabitants have a deep and personal relationship with and within the area, which contributes significantly to their sense of emotional security and self-identity. The results include a strong conservation-orientated attitude with regard to the natural rural character of the site, especially in the inner core.
Abstract: In the highly competitive and rapidly changing global
marketplace, independent organizations and enterprises often come
together to form a temporary virtual enterprise alignment in a
supply chain to better provide products or services. As firms adopt
the systems approach implicit in supply chain management, they
must manage quality through both internal process control and
external control of supplier quality and customer requirements. How
to incorporate the quality management of upstream and downstream
supply chain partners into their own quality management systems has
recently received a great deal of attention from both academics and
practitioners. This paper investigates the collaborative features and
entity relationships in a supply chain, and presents an ontology of
the collaborative supply chain based on aligning a service-oriented
framework with service-dominant logic. This perspective facilitates
the separation of material flow management from manufacturing
capability management, which provides a foundation for the
coordination and integration of business processes to measure,
analyze, and continually improve the quality of products, services,
and processes. Further, this approach characterizes the different
interests of supply chain partners, providing an innovative way to
analyze the collaborative features of a supply chain. This ontology is
also the foundation for developing a quality management system that
internalizes the quality management of upstream and downstream
supply chain partners and manages quality in the supply chain
systematically.
Abstract: In this paper we develop a sequential life test approach applied to a modified low-alloy high-strength steel part used in highway overpasses in Brazil. We consider two possible underlying sampling distributions: the Normal and the Inverse Weibull models. The minimum life is considered equal to zero. We use the two underlying models to analyze a fatigue life test situation, comparing the results obtained from both. Since a major chemical component of this low-alloy high-strength steel part has been changed, there is little information available about the possible values of the parameters of the corresponding Normal and Inverse Weibull underlying sampling distributions. To estimate the shape and scale parameters of these two sampling models we use a maximum likelihood approach for censored failure data. We also develop a truncation mechanism for the Inverse Weibull and Normal models, and provide rules for truncating a sequential life testing situation while making one of the two possible decisions at the moment of truncation, that is, accepting or rejecting the null hypothesis H0. An example illustrates the proposed truncated sequential life testing approach for the Inverse Weibull and Normal models.
Abstract: In this paper, we propose the use of convolutional
codes for file dispersal. The proposed method is comparable in
complexity to the Information Dispersal Algorithm proposed by
M. Rabin and, for particular choices of (non-binary) convolutional
codes, is almost as efficient as that algorithm in terms of controlling
the expansion in total storage. Further, our proposed dispersal
method allows string search.
Abstract: This paper presents a framework for organizational
knowledge management that seeks to deploy a standardized structure
for the integrated management of knowledge and a common language
based on domains, processes and global indicators inspired by the
COBIT 5 framework (ISACA, 2012), and that supports the
integration of three technologies: enterprise information architecture
(EIA), business process modeling (BPM) and service-oriented
architecture (SOA). The Gomak Framework is a management
platform that seeks to integrate the information technology
infrastructure, the application structure, the information
infrastructure, and the business logic and business model, to support
a sound organizational knowledge management strategy following a
process-based approach and concurrent engineering. Concurrent
engineering (CE) is a systematic approach to integrated product
development that responds to customer expectations by involving all
perspectives in parallel, from the beginning of the product life cycle
(European Space Agency, 2000).
Abstract: This paper presents a new technique for compensating
the effect of parameter variations in the direct field-oriented control
of an induction motor. The proposed method uses adaptive tuning of
the synchronous speed value to obtain robustness of the
field-oriented control. We show that this adaptive tuning makes
direct field-oriented control robust to changes in rotor resistance,
load torque and rotational speed. The effectiveness of the proposed
control scheme is verified by numerical simulations. The numerical
validation results of the proposed scheme show good performance
compared with the usual direct field-oriented control.
Abstract: This paper describes the development of fully
automated measurement software for antenna radiation pattern
measurements in a Compact Antenna Test Range (CATR). The
CATR has a frequency range of 2-40 GHz, and the measurement
hardware includes a network analyzer for transmitting and
receiving the microwave signal and a positioner controller to control
the motion of the Styrofoam column. The measurement process
includes calibration of the CATR with a Standard Gain Horn (SGH)
antenna, followed by gain-versus-angle measurement of the Antenna
Under Test (AUT). The software is designed to control a variety of
microwave transmitters/receivers and two-axis positioner controllers
through the standard General Purpose Interface Bus (GPIB). New
network analyzers can be added through a slight modification of the
hardware control module. Time-domain gating is implemented to
remove unwanted signals and obtain the isolated response of the
AUT. The gated response of the AUT is compared with the
calibration data in the frequency domain to obtain the desired
results. Data acquisition and processing are implemented in
Agilent VEE and Matlab. A variety of experimental measurements
with SGH antennas were performed to validate the accuracy of the
software. A comparison of results with existing commercial software
packages is presented, and the measured results are found to agree
within 0.2 dB.
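Time-domain gating, as used above, transforms the measured frequency response to the time domain, zeroes everything outside a gate around the direct AUT response (removing reflections and leakage), and transforms back. The sketch below uses a naive DFT and a rectangular gate for clarity; real implementations use FFTs and smooth gate windows, and the function names are illustrative.

```python
# Simplified time-domain gating of a frequency response (illustrative sketch).
import cmath

def dft(x, inverse=False):
    """Naive O(n^2) discrete Fourier transform; inverse divides by n."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * i * k / n)
               for k in range(n)) for i in range(n)]
    return [v / n for v in out] if inverse else out

def time_gate(freq_response, gate_start, gate_stop):
    """Keep only time-domain samples inside [gate_start, gate_stop]."""
    impulse = dft(freq_response, inverse=True)       # to time domain
    gated = [v if gate_start <= i <= gate_stop else 0j
             for i, v in enumerate(impulse)]          # rectangular gate
    return dft(gated)                                 # back to frequency domain
```

Gating a response containing a direct path plus a late reflection removes the reflection's contribution, which is exactly what isolates the AUT response before comparison with the calibration data.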
Abstract: In this work we present a solution for DAGC (Digital
Automatic Gain Control) in WLAN receivers compatible with the
IEEE 802.11a/g standards. These standards define communication in
the 5/2.4 GHz bands using the Orthogonal Frequency Division
Multiplexing (OFDM) modulation scheme. The WLAN transceiver
that we used enables gain control over a Low Noise Amplifier (LNA)
and a Variable Gain Amplifier (VGA). These signals are controlled
in our digital baseband processor by a dedicated hardware block, the
DAGC, which automatically controls the VGA and LNA to achieve
a better signal-to-noise ratio, decrease the FER (Frame Error Rate)
and hold the average power of the baseband signal close to the
desired set point. The DAGC function in the baseband processor is
performed in a few steps: measuring the power levels of baseband
samples of the RF signal, accumulating the differences between the
measured power level and the actual gain setting, adjusting a gain
factor from the accumulation, and applying the adjusted gain factor
to the baseband values. Based on measurements of the dependence
of the RSSI signal on input power, we concluded that this digital
AGC can be implemented using a simple linearization of the RSSI.
This solution is very simple yet effective, and it reduces the
complexity and power consumption of the DAGC. The DAGC has
been implemented and tested both in FPGA and in ASIC as part of
our WLAN baseband processor. Finally, we integrated the circuit
into a compact WLAN PCMCIA board based on MAC and baseband
ASIC chips of our own design.
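The steps listed above (measure baseband power, accumulate the difference against the set point, adjust and apply the gain) can be sketched as one iteration of a digital AGC loop. The loop constant and dB arithmetic below are illustrative assumptions, not the hardware block's actual fixed-point design.

```python
# One iteration of a digital AGC loop (illustrative sketch, values in dB).
import math

def dagc_step(samples, set_point_db, gain_db, acc, loop_gain=0.5):
    """Measure mean baseband power, accumulate the error against the
    set point, and update the gain setting for the VGA/LNA."""
    power = sum(s * s for s in samples) / len(samples)   # mean power
    power_db = 10 * math.log10(power) if power > 0 else -100.0
    error_db = set_point_db - power_db    # positive => signal too weak
    acc += error_db                       # accumulated difference
    gain_db += loop_gain * error_db       # adjusted gain factor
    return gain_db, acc
```

Iterating this step drives the average baseband power toward the set point, which is the behavior the dedicated hardware block provides.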
Abstract: The flow field around hypersonic vehicles is very
complex and difficult to simulate. The boundary layers are squeezed
between the shock layer and the body surface, and resolving the
boundary layer, shock wave and turbulent regions, where the flow
field has high gradients, is difficult. Detached Eddy Simulation
(DES) is a modification of a RANS model in which the model
switches to a subgrid-scale formulation in regions fine enough for
LES calculations. Regions near solid boundaries, where the turbulent
length scale is less than the maximum grid dimension, are assigned
the RANS mode of solution; where the turbulent length scale
exceeds the grid dimension, the regions are solved in LES mode.
The grid resolution is therefore not as demanding as in pure LES,
considerably cutting the cost of the computation. In this study,
hypersonic flow is simulated at Mach 8 and different angles of attack
to resolve the boundary layers and discontinuities properly; the flow
is also simulated in the long wake regions. The mesh differs slightly
from that of RANS simulations: it is made dense near the boundary
layers and in the wake regions to resolve them properly. Hypersonic
blunt cone-cylinder bodies with frustum angles of 5° and 10° are
simulated, and an aerodynamic study is performed to calculate the
aerodynamic characteristics of the different geometries. The results
are then compared with experimental data as well as with a
turbulence model (the SA model). The results achieved with the DES
simulation have very good resolution and excellent agreement with
the experimental and available data. Unsteady DES calculations are
performed using dual time stepping (implicit time stepping). The
simulations are performed at Mach 8 and angles of attack from 0° to
10° for all these cases. The results and resolution of the DES model
were found to be much better than those of the SA turbulence model.
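The RANS/LES switching described above is governed by the standard DES length scale, l_DES = min(l_RANS, C_DES·Δ), where l_RANS is the model's length scale (the wall distance for Spalart-Allmaras-based DES), Δ is the maximum cell dimension, and C_DES ≈ 0.65 is the commonly used constant. A one-line sketch, with these conventions as stated assumptions:

```python
# DES length-scale switch (illustrative sketch of the standard SA-DES rule).
def des_length_scale(d_wall, delta_max, c_des=0.65):
    """d_wall: RANS (wall-distance) length scale; delta_max: largest cell
    dimension. Returns the RANS scale near walls, the LES subgrid scale
    (c_des * delta_max) where the grid is fine enough."""
    return min(d_wall, c_des * delta_max)
```

Far from the wall, where d_wall is large relative to the grid, the grid-based scale wins and the model acts as an LES subgrid model; in the thin near-wall layer the wall distance wins and the RANS branch is retained.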
Abstract: The development of power electronics has made it
possible to increase the precision and reliability of electrical drives,
thanks to adjustable inverters such as the Pulse Width Modulation
(PWM) five-level inverter, which is the object of study in this
article. The authors treat the relation between the control law
adopted for a given system and the oscillations of the electrical and
mechanical parameters, whose tolerance depends on the process into
which they are integrated (paper factory, lifting of heavy loads,
etc.). Thus, the best choice of the regulation indexes allows stable
and safe drive operation without additional investment (management
of existing equipment).