Abstract: This paper presents an auto-regressive network called the Auto-Regressive Multi-Context Recurrent Neural Network (AR-MCRN), which forecasts the daily peak load for two large power plant systems. The auto-regressive network is a combination of recurrent and non-recurrent networks. Weather variables are key elements in forecasting because any change in them affects the demand for energy. The AR-MCRN is therefore used to learn the relationship between past and future exogenous and endogenous variables. Experimental results show that using the change in weather components and the change in past load as inputs to the AR-MCRN, rather than the raw weather parameters and the past load itself, produces more accurate load predictions. The results also show that using both exogenous and endogenous variables as inputs is better than using the exogenous variables alone.
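The abstract's central claim, that deltas of weather and load make better network inputs than the raw values, amounts to a simple feature transform. A minimal sketch in Python; the series values below are invented for illustration, not from the paper:

```python
def first_differences(series):
    """Convert a raw series into its successive changes (deltas)."""
    return [b - a for a, b in zip(series, series[1:])]

# Hypothetical daily peak load (MW) and temperature (deg C) histories.
load = [980, 1010, 995, 1040, 1060]
temp = [21.0, 23.5, 22.0, 26.0, 27.5]

# Inputs to the network: changes instead of raw values.
d_load = first_differences(load)   # [30, -15, 45, 20]
d_temp = first_differences(temp)   # [2.5, -1.5, 4.0, 1.5]
```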
Abstract: The occurrence and removal of trace organic contaminants in the aquatic environment have become a focus of environmental concern. For the selective removal of carbamazepine from loaded waters, molecularly imprinted polymers (MIPs) were synthesized with carbamazepine as the template. The parameters varied were the type of monomer, crosslinker and porogen, the ratio of starting materials, and the synthesis temperature. The best results were obtained with a template-to-crosslinker ratio of 1:20, toluene as porogen, and methacrylic acid (MAA) as monomer. The MIPs were then capable of recovering 93% of the carbamazepine from a 10⁻⁵ M landfill leachate solution also containing caffeine and salicylic acid. By comparison, carbamazepine recoveries of 75% were achieved using a non-imprinted polymer (NIP) synthesized under the same conditions but without template. In solutions containing landfill leachate, 93-96% of the carbamazepine was adsorbed, compared with an uptake of 73% by activated carbon. The best solvent for desorption was acetonitrile, for which the amount of solvent necessary and the effect of dilution with water were tested. Selected MIPs were tested for their reusability and showed good results for at least five cycles. Adsorption isotherms were prepared with carbamazepine solutions in the concentration range of 5×10⁻⁶ M to 0.01 M. The heterogeneity index indicated a relatively homogeneous binding-site distribution.
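The heterogeneity index reported from such isotherms is commonly obtained as the exponent of a Freundlich fit, q = K_F·C^m, where m close to 1 indicates a homogeneous binding-site distribution. A minimal fitting sketch with synthetic data (the abstract does not give its raw isotherm points, so the values and parameters below are invented):

```python
import math

def freundlich_fit(conc, uptake):
    """Fit q = Kf * C**m by least squares on log q = log Kf + m * log C."""
    xs = [math.log(c) for c in conc]
    ys = [math.log(q) for q in uptake]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Slope of the log-log regression is the heterogeneity index m.
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    kf = math.exp(my - m * mx)
    return kf, m

# Synthetic isotherm generated with Kf = 2.0, m = 0.9 (nearly homogeneous sites).
conc = [5e-6, 1e-5, 1e-4, 1e-3, 1e-2]
uptake = [2.0 * c ** 0.9 for c in conc]
kf, m = freundlich_fit(conc, uptake)
```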
Abstract: This article demonstrates the development of a controlled-release system for the NSAID drug diclofenac sodium employing different ratios of ethyl cellulose. Diclofenac sodium and ethyl cellulose in different proportions were processed into microcapsules by microencapsulation based on a phase-separation technique. The prepared microcapsules were then compressed into tablets to obtain controlled-release oral formulations. For in-vitro evaluation, a dissolution test of each preparation was conducted in 900 ml of phosphate buffer solution of pH 7.2, maintained at 37 ± 0.5 °C and stirred at 50 rpm, with samples collected at predetermined time intervals (0, 0.5, 1.0, 1.5, 2, 3, 4, 6, 8, 10, 12, 16, 20 and 24 hrs). The drug concentration in the collected samples was determined by UV spectrophotometer at 276 nm. The physical characteristics of the diclofenac sodium microcapsules were within the accepted range: they were off-white, free flowing and spherical in shape. The release profile of diclofenac sodium from the microcapsules was found to be directly proportional to the proportion of ethyl cellulose and the coat thickness. The in-vitro release pattern showed that at drug:polymer ratios of 1:1 and 1:2, the percentage of drug released in the first hour was 16.91% and 11.52%, respectively, compared with only 6.87% at a ratio of 1:3. The release mechanism followed the Higuchi model. The tablet formulation (F2) of the present study was found comparable in release profile to the marketed brand Phlogin-SR, and the microcapsules showed extended release beyond 24 h. Further, a good correlation was found between drug release and the proportion of ethyl cellulose in the microcapsules. Microencapsulation based on coacervation was found to be a good technique for controlling the release of diclofenac sodium in controlled-release formulations.
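The Higuchi model mentioned above states that cumulative release Q grows with the square root of time, Q = k_H·√t. A minimal least-squares sketch; the release values below are invented to follow the model exactly (the only real data points in the abstract are the first-hour percentages):

```python
import math

def higuchi_k(times_h, release_pct):
    """Least-squares Higuchi constant: slope of cumulative release
    vs sqrt(time), with the line forced through the origin."""
    num = sum(math.sqrt(t) * q for t, q in zip(times_h, release_pct))
    den = sum(times_h)  # sum of (sqrt(t))**2
    return num / den

# Hypothetical release data following Q = 11.5 * sqrt(t) exactly.
times = [0.5, 1.0, 2.0, 4.0, 8.0]
release = [11.5 * math.sqrt(t) for t in times]
k = higuchi_k(times, release)
```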
Abstract: Program slicing is the task of finding all statements in
a program that directly or indirectly influence the value of a variable
occurrence. The set of statements that can affect the value of a
variable at some point in a program is called a program backward
slice. In several software engineering applications, such as program
debugging and measuring program cohesion and parallelism, several
slices are computed at different program points. Existing algorithms for computing program slices compute a slice at a single program point: the program, or the model that represents it, is traversed completely or partially once. To compute more than one slice, the same algorithm is applied at every point of interest, so the same program, or program representation, is traversed several times.
In this paper, an algorithm is introduced to compute all forward
static slices of a computer program by traversing the program
representation graph once. Therefore, the introduced algorithm is
useful for software engineering applications that require computing
program slices at different points of a program. The program
representation graph used in this paper is called Program Dependence
Graph (PDG).
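A backward slice on a PDG is simply reverse reachability over data- and control-dependence edges. A minimal single-slice sketch (the paper's contribution, computing all slices in one traversal, is not reproduced here); the tiny graph is hypothetical:

```python
from collections import deque

def backward_slice(pdg, node):
    """All statements that can influence `node`: reverse reachability
    over the dependence edges of a Program Dependence Graph."""
    # pdg maps each node to its dependence predecessors.
    seen = {node}
    work = deque([node])
    while work:
        for dep in pdg.get(work.popleft(), ()):
            if dep not in seen:
                seen.add(dep)
                work.append(dep)
    return seen

# Tiny hypothetical PDG: statement 4 depends on 2 and 3; 3 on 1; 2 on 1.
pdg = {4: [2, 3], 3: [1], 2: [1], 1: []}
```

For example, `backward_slice(pdg, 4)` collects every statement reachable backwards from statement 4, i.e. {1, 2, 3, 4}.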
Abstract: Three-dimensional geometric models have been used to present architectural and engineering works, showing their final configuration. When the clarification of a detail or the constitution of a construction step is needed, these models are not appropriate: they do not allow the observation of the construction progress of a building. Models that can dynamically present changes of the building geometry are a good support for the elaboration of projects. Techniques of geometric modeling and virtual reality were used to obtain models that visually simulate the construction activity. The applications illustrate the construction of a cavity wall and a bridge. These models allow the visualization of the physical progression of the work following a planned construction sequence and the observation of details of the form of every component of the works, and they support the study of the type and method of operation of the equipment applied in the construction. The models presented distinct advantages as educational aids in first-degree courses in Civil Engineering. The use of Virtual Reality techniques in the development of educational applications brings new perspectives to the teaching of subjects related to the field of civil construction.
Abstract: The design of the landing gear is one of the fundamental aspects of aircraft design. The need for light weight, high strength and stiffness, coupled with techno-economic feasibility, is key to the acceptability of any landing gear construction. In this paper, an approach for analyzing two differently designed landing gears for an unmanned aerial vehicle (UAV) using advanced CAE techniques is applied. Different landing conditions have been considered for both models. The maximum principal stresses for each model, along with the factor of safety, are calculated for every loading condition. A conclusion is drawn about the better geometry.
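The factor of safety reported per loading condition is the material yield strength divided by the maximum principal stress. A one-line sketch with illustrative numbers (the abstract does not state material properties, so the values below are assumptions):

```python
def factor_of_safety(yield_strength_mpa, max_principal_stress_mpa):
    """Factor of safety for one loading condition: yield over peak stress."""
    return yield_strength_mpa / max_principal_stress_mpa

# Illustrative values only: a 7075-T6-like yield of ~503 MPa and a
# hypothetical peak principal stress of 215 MPa from one landing case.
fos = factor_of_safety(503.0, 215.0)
```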
Abstract: An overview of the important aspects of managing and controlling industrial effluent discharges to public sewers, namely sampling, characterization, quantification and legislative controls, is presented. The findings have been validated by means of a case study covering three industrial sectors, namely the tanning, textile finishing and food processing industries. Industrial effluent discharges were found to be best monitored by systematic and automatic sampling, and best quantified using water meter readings corrected for evaporative and consumptive losses. Based on the treatment processes employed in the publicly owned treatment works and the chemical oxygen demand and biochemical oxygen demand levels obtained, the effluent from all three industrial sectors studied was found to lie in the toxic zone. Thus, physico-chemical treatment of these effluents is required to bring them into the biodegradable zone. KL values (quoted to base e) were greater than 0.50 day⁻¹, compared with 0.39 day⁻¹ for typical municipal wastewater.
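The KL values quoted are base-e rate constants of the standard first-order BOD model, BOD_t = L₀(1 − e^(−k·t)). A sketch comparing the two quoted rates; the ultimate demand L₀ is an invented illustrative figure:

```python
import math

def bod_exerted(l0, k_per_day, t_days):
    """First-order BOD model: oxygen demand exerted after t days."""
    return l0 * (1.0 - math.exp(-k_per_day * t_days))

# With k = 0.50/day (industrial effluent) vs 0.39/day (typical municipal
# wastewater), the industrial effluent exerts its demand faster.
# L0 = 300 mg/L is an assumed ultimate BOD, not a figure from the paper.
fast = bod_exerted(300.0, 0.50, 5)  # 5-day BOD at the industrial rate
slow = bod_exerted(300.0, 0.39, 5)  # 5-day BOD at the municipal rate
```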
Abstract: In this paper we develop a sequential life test approach applied to a modified low alloy-high strength steel part used in highway overpasses in Brazil. We consider two possible underlying sampling distributions: the Normal and the Inverse Weibull models. The minimum life is considered equal to zero. We use the two underlying models to analyze a fatigue life test situation, comparing the results obtained from both. Since a major chemical component of this low alloy-high strength steel part has been changed, there is little information available about the possible values of the parameters of the corresponding Normal and Inverse Weibull underlying sampling distributions. To estimate the shape and scale parameters of these two sampling models we use a maximum likelihood approach for censored failure data. We also develop a truncation mechanism for the Inverse Weibull and Normal models, and provide rules for truncating a sequential life testing situation while making one of the two possible decisions at the moment of truncation, that is, accepting or rejecting the null hypothesis H0. An example develops the proposed truncated sequential life testing approach for the Inverse Weibull and Normal models.
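The accept/reject decision in a sequential life test of this kind follows Wald's probability-ratio bounds. A generic sketch of the decision rule (the Inverse Weibull and Normal likelihood computations are omitted, and the α, β risks are illustrative):

```python
import math

def sprt_decision(log_likelihood_ratio, alpha=0.05, beta=0.10):
    """Wald's sequential test: compare the cumulative log-likelihood
    ratio log(L1/L0) against the two stopping bounds."""
    upper = math.log((1 - beta) / alpha)   # crossing it rejects H0
    lower = math.log(beta / (1 - alpha))   # crossing it accepts H0
    if log_likelihood_ratio >= upper:
        return "reject H0"
    if log_likelihood_ratio <= lower:
        return "accept H0"
    return "continue sampling"
```

At truncation, a rule such as the one the paper develops forces one of the two terminal decisions instead of returning "continue sampling".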
Abstract: Over the years there has been a growing trend towards quality-based specifications in highway construction. In many Quality Control/Quality Assurance (QC/QA) specifications, the contractor is primarily responsible for quality control of the process, whereas the highway agency is responsible for acceptance testing of the product. A cooperative investigation was conducted in Illinois over several years to develop a prototype End-Result Specification (ERS) for asphalt pavement construction. The final characteristics of the product are stipulated in the ERS and the contractor is given considerable freedom in achieving those characteristics. The risk for the contractor or agency depends on how the acceptance limits and processes are specified. Stochastic simulation models are very useful for estimating and analyzing payment risk in ERS systems, and they form an integral part of the Illinois prototype ERS system. This paper describes the development of an innovative methodology to estimate the variability components in in-situ density, air voids and asphalt content data from ERS projects. The information gained would be crucial in simulating these ERS projects for the estimation and analysis of payment risks associated with asphalt pavement construction. However, these methods require at least two parties to conduct tests on all the split samples obtained according to the sampling scheme prescribed in the ERS currently implemented in Illinois.
Abstract: Deoxyribonucleic acid (DNA) computing has emerged as an interdisciplinary field that draws together chemistry, molecular biology, computer science and mathematics. In this paper, the possibility of using DNA-based computing to solve the absolute 1-center problem by molecular manipulations is presented. This is truly the first attempt to solve such a problem by a DNA-based computing approach. Since part of the procedure involves shortest-path computation, research works on DNA computing for the shortest-path and Traveling Salesman Problem (TSP) are reviewed. These approaches are studied, and the most appropriate one is adapted in designing the computation procedures. The DNA-based computation is designed in such a way that every path is encoded by oligonucleotides and the path's length is directly proportional to the length of the oligonucleotides. Using these properties, gel electrophoresis is performed in order to separate the respective DNA molecules according to their length. One expectation arising from this paper is that it is possible to verify instances of the absolute 1-center problem using DNA computing in laboratory experiments.
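For comparison, a conventional (silicon) sketch of the 1-center selection the DNA procedure addresses: compute all shortest paths, then pick the center minimizing the farthest distance. The absolute 1-center also allows centers on edge interiors; this sketch restricts candidates to vertices for brevity, and the example graph is made up:

```python
def one_center(n, edges):
    """Vertex 1-center: the vertex minimizing its farthest shortest-path
    distance. All-pairs distances via Floyd-Warshall."""
    INF = float("inf")
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in edges:           # undirected weighted edges
        d[u][v] = min(d[u][v], w)
        d[v][u] = min(d[v][u], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    # Eccentricity of v is max(d[v]); the center minimizes it.
    return min(range(n), key=lambda v: max(d[v]))

# Tiny hypothetical graph: a path 0-1-2-3 with unit edge weights.
center = one_center(4, [(0, 1, 1), (1, 2, 1), (2, 3, 1)])
```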
Abstract: In this work we present a solution for DAGC (Digital Automatic Gain Control) in WLAN receivers compatible with the IEEE 802.11a/g standards. These standards define communication in the 5/2.4 GHz bands using the Orthogonal Frequency Division Multiplexing (OFDM) modulation scheme. The WLAN transceiver we used enables gain control over a Low Noise Amplifier (LNA) and a Variable Gain Amplifier (VGA). The control over these signals is performed in our digital baseband processor using a dedicated hardware block, the DAGC. The DAGC automatically controls the VGA and LNA in order to achieve a better signal-to-noise ratio, decrease the FER (Frame Error Rate), and hold the average power of the baseband signal close to the desired set point. The DAGC function in the baseband processor is carried out in a few steps: measuring the power levels of baseband samples of an RF signal, accumulating the differences between the measured power level and the actual gain setting, adjusting a gain factor of the accumulation, and applying the adjusted gain factor to the baseband values. Based on measurements of the dependence of the RSSI signal on input power, we concluded that this digital AGC can be implemented using a simple linearization of the RSSI. This solution is very simple yet effective, and it reduces the complexity and power consumption of the DAGC. The DAGC was implemented and tested both in FPGA and in ASIC as a part of our WLAN baseband processor. Finally, we integrated this circuit in a compact WLAN PCMCIA board based on MAC and baseband ASIC chips designed by us.
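The measure/accumulate/adjust steps listed above can be sketched as one control-loop iteration. This is a generic linear-domain sketch with an assumed step size, not the paper's fixed-point hardware:

```python
def dagc_step(samples, target_power, gain, step=0.05):
    """One DAGC iteration: measure the average baseband power,
    take the error against the set point, and nudge the gain factor."""
    measured = sum(s * s for s in samples) / len(samples)
    error = target_power - measured
    return gain + step * error   # drives average power toward the set point
```

Repeated over successive sample blocks, the accumulated corrections hold the average baseband power near the set point, as the abstract describes.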
Abstract: The development of power electronics has allowed increasing the precision and reliability of electric drives, thanks to adjustable inverters such as the Pulse Width Modulation (PWM) five-level inverter, which is the object of study in this article. The authors treat the relation between the control-law order adopted for a given system and the oscillations of the electrical and mechanical parameters, whose tolerance depends on the process in which they are integrated (paper factory, lifting of heavy loads, etc.). Thus the best choice of the regulation indexes allows us to achieve stable and safe drive operation without additional investment (management of existing equipment).
Abstract: Needless to say, a majority of road accidents are due to drunk driving, and as yet there is no effective mechanism to prevent it. Here we have designed an integrated system for this purpose. Alcohol content in the driver's body is detected by means of an infrared breath analyzer placed at the steering wheel. An infrared cell directs infrared energy through the sample, and any unabsorbed energy at the other side is detected. The higher the concentration of ethanol, the more infrared absorption occurs (in much the same way that a sunglass lens absorbs visible light, alcohol absorbs infrared light). Thus the alcohol level of the driver is continuously monitored and calibrated on a scale, and when it exceeds a particular limit the fuel supply is cut off. If the device is removed, the fuel supply is likewise automatically cut off, or an alarm is sounded, depending upon the requirement. This does not happen abruptly, and special indicators are fitted at the back to avoid inconvenience to other drivers on the highway. A framework for the integration of the sensors and the control module in a scalable multi-agent system is provided. An SMS containing the current GPS location of the vehicle is sent via a GSM module to the police control room to alert the police. The system is foolproof and the driver cannot tamper with it easily. Thus it provides an effective and cost-effective solution to the problem of drunk driving.
Abstract: The concepts of knowledge creation and innovation have a strong relationship, but this relationship has not been examined systematically. This study examines the utilization of the knowledge creation processes of the Theory of Knowledge Creation in Higher Education Institutions. These processes consist of socialization, externalization, combination and internalization. This study suggests that the utilization of these processes will have an impact on innovation in academic performance. A cross-sectional study was conducted using survey questionnaires to collect data on the utilization of knowledge creation processes and classroom innovation. The sample consisted of Business Management students of a Malaysian Higher Education Institution. The results of this study could help Higher Education Institutions to enrich the learning process of students through knowledge creation and innovation.
Abstract: This research aimed to develop and determine the quality of an online learning activities kit, as well as to examine the learning achievement of students and their satisfaction with the kit through authentic assessment. The tools in this research comprised an online learning activities kit on plants in Thai literature, in compliance with the School Botanical Garden of the Plant Genetic Conservation Project under the Royal Initiative of Her Royal Highness Princess Maha Chakri Sirindhorn, together with an assessment form, a learning achievement test, a satisfaction form and an authentic assessment form. The population consisted of 40 students in the second range of primary years (Prathomsuksa 4 to 6) at Ban Khao Rak School, Suratthani Province, Thailand. The results showed that the content quality of the developed online learning activities kit, as assessed by the experts, was 4.70 on average, a very high level. A pre-test and post-test comparison was made to examine learning achievement, and it revealed that the post-test score was higher than the pre-test score with statistical significance at the .01 level. The satisfaction of the sampling group with the online learning activities kit was 4.74 on average, the highest level. The authentic assessment showed an average of 1.69, a good level. Therefore, the online learning activities kit on plants in Thai literature could be used in real classroom situations.
Abstract: Super-resolution nowadays refers to producing a high-resolution image from several low-resolution noisy frames. In this work, we consider the problem of high-quality interpolation of a single noise-free image. Such images may come from different sources, e.g., they may be frames of videos, individual pictures, etc. In the encoder we apply downsampling via bidimensional interpolation of each frame, and in the decoder we apply upsampling to restore the original size of the image. If the compression ratio is very high, we additionally use a convolutive mask that restores the edges, eliminating the blur. Finally, both the encoder and the complete decoder are implemented on General-Purpose computation on Graphics Processing Units (GPGPU) cards. In fact, the mentioned mask is coded inside the texture memory of a GPGPU.
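A minimal sketch of the encoder/decoder pair described above, using 2×2 block averaging for the downsampling and pixel replication for the upsampling. This is a deliberately simplified stand-in: the paper's bidimensional interpolation and edge-restoring mask are more elaborate, and the tiny image is invented:

```python
def downsample2(img):
    """Encoder side: halve each dimension by 2x2 block averaging."""
    h, w = len(img), len(img[0])
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(w // 2)] for y in range(h // 2)]

def upsample2(img):
    """Decoder side: restore the original size by pixel replication;
    a real decoder would interpolate and then sharpen edges with a mask."""
    return [[img[y // 2][x // 2] for x in range(2 * len(img[0]))]
            for y in range(2 * len(img))]

tiny = [[0, 0, 8, 8],
        [0, 0, 8, 8],
        [4, 4, 4, 4],
        [4, 4, 4, 4]]
small = downsample2(tiny)      # [[0.0, 8.0], [4.0, 4.0]]
restored = upsample2(small)    # back to the original 4x4 size
```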
Abstract: This paper presents the design, analysis and comparison of permanent magnet machines with different rotor types. The presented machines are designed with the same geometrical dimensions and the same materials to allow comparison. The main machine parameters of interior and exterior rotor type machines, including the eddy current effect, torque-speed characteristics and magnetic analysis, are investigated using the MAXWELL program. With this program, the components of the permanent magnet machines can be calculated with high accuracy. Six types of permanent magnet machines are compared with respect to their topology, size, magnetic field, air gap flux, voltage, torque, loss and efficiency. The analysis results demonstrate the effectiveness of the proposed machine design methodology. We believe that this study will be a helpful resource for the examination and comparison of the basic structure and magnetic features of PM (permanent magnet) machines with different rotor structures.
Abstract: In this paper a public key cryptosystem is proposed using number theoretic transforms (NTT) over a ring of integers modulo a composite number. The key agreement is similar to the ElGamal public key algorithm. The security of the system is based on the solution of multivariate linear congruence equations and the discrete logarithm problem. In the proposed cryptosystem only a fixed number of multiplications is carried out (constant complexity), and hence encryption and decryption can be done easily. At the same time, it is very difficult to attack the cryptosystem, since the cipher text is a sequence of interrelated integers. The system also provides authentication. Using Mathematica version 5.0, the proposed algorithm is justified with a numerical example.
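The "similar to ElGamal" key agreement boils down to Diffie-Hellman-style modular exponentiation, which rests on the same discrete logarithm problem the abstract cites. A toy sketch with deliberately small illustrative parameters (the NTT layer and the paper's composite-modulus details are omitted):

```python
# Toy parameters only; a real system uses large, carefully chosen moduli.
p, g = 467, 2            # public modulus and generator

a, b = 153, 197          # private keys of the two parties
A = pow(g, a, p)         # public keys, exchanged over the open channel
B = pow(g, b, p)

shared_a = pow(B, a, p)  # each side raises the other's public key to
shared_b = pow(A, b, p)  # its own private key: both get g**(a*b) mod p
```

Recovering `a` from `A` (the discrete logarithm) is what an eavesdropper would have to do; at realistic sizes this is infeasible.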
Abstract: Most Question Answering (QA) systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems: if this module does not work properly, it creates problems for the other sections. Moreover, answer processing is an emerging topic in Question Answering, where systems are often required to rank and validate candidate answers. These techniques, aiming at finding short and precise answers, are often based on semantic classification. This paper discusses a new model for question answering which improves the two main modules, question processing and answer processing. Two important components form the basis of question processing. The first is question classification, which specifies the types of the question and of the answer. The second is reformulation, which converts the user's question into a question understandable by the QA system in a specific domain. The answer processing module consists of candidate answer filtering and candidate answer ordering components, and also has a validation section for interacting with the user. This module makes it easier to find the exact answer. In this paper we describe the question and answer processing modules by modeling, implementing and evaluating the system. The system was implemented in two versions.
Results show that Version No. 1 gave correct answers to 60% of the questions (30 correct answers to 50 asked questions) and Version No. 2 gave correct answers to 94% of the questions (47 correct answers to 50 asked questions).
Abstract: To help overcome the density limits of conventional SRAMs and the leakage current of the SRAM cell in nanoscaled CMOS technology, we have developed a four-transistor SRAM cell. The newly developed CMOS four-transistor SRAM cell uses one word-line and one bit-line during read/write operation. The cell retains its data through leakage current and positive feedback, without a refresh cycle. The new cell is 19% smaller than a conventional six-transistor cell using the same design rules, and its leakage current is 60% smaller than that of a conventional six-transistor SRAM cell. Simulation results in 65 nm CMOS technology show that the new cell operates correctly during read/write operation and in idle mode.