Abstract: This paper analyzes the patterns of the Monte Carlo
data for a large number of variables and minterms, in order to
characterize the circuit path length behavior. We propose models of
the shortest path length, trained on data derived from a wide range
of binary decision diagram (BDD) simulations. The models were built
using a feed-forward neural network (NN) modeling methodology.
Experimental results
for ISCAS benchmark circuits show an RMS error of 0.102 for the
shortest path length complexity estimation predicted by the NN
model (NNM). Use of such a model can help reduce the time
complexity of very large scale integration (VLSI) circuits and
related computer-aided design (CAD) tools that use BDDs.
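The estimation idea above can be sketched with a toy feed-forward network trained by gradient descent. This is a minimal illustration only, not the authors' model: the two input features (normalized variable and minterm counts), the training data, and the network size are assumptions made up for the example.

```python
import math
import random

random.seed(0)

def init_net(n_in=2, n_hidden=4):
    # Small random weights for one hidden tanh layer and a linear output.
    w1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
    b1 = [0.0] * n_hidden
    w2 = [random.uniform(-0.5, 0.5) for _ in range(n_hidden)]
    b2 = 0.0
    return w1, b1, w2, b2

def forward(net, x):
    w1, b1, w2, b2 = net
    h = [math.tanh(sum(wij * xj for wij, xj in zip(wi, x)) + bi)
         for wi, bi in zip(w1, b1)]
    y = sum(wj * hj for wj, hj in zip(w2, h)) + b2
    return h, y

def train(net, data, lr=0.05, epochs=2000):
    w1, b1, w2, b2 = net
    for _ in range(epochs):
        for x, t in data:
            h, y = forward((w1, b1, w2, b2), x)
            err = y - t
            for j in range(len(w2)):
                grad_h = err * w2[j] * (1 - h[j] ** 2)  # backprop through tanh
                w2[j] -= lr * err * h[j]
                for i in range(len(x)):
                    w1[j][i] -= lr * grad_h * x[i]
                b1[j] -= lr * grad_h
            b2 -= lr * err
    return w1, b1, w2, b2

def rms_error(net, data):
    return math.sqrt(sum((forward(net, x)[1] - t) ** 2 for x, t in data) / len(data))

# Toy normalized (variables, minterms) -> path length pairs (illustrative only).
data = [([0.1, 0.2], 0.15), ([0.4, 0.3], 0.35),
        ([0.6, 0.7], 0.62), ([0.9, 0.8], 0.88)]
net = train(init_net(), data)
```

Once trained, `rms_error(net, data)` plays the role of the RMS-error figure of merit quoted in the abstract, here computed on the toy training set itself.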
Abstract: A new current-mode multifunction filter using a minimum number of passive elements is proposed. The proposed filter has a single input and four high-impedance outputs. It uses four passive elements (two capacitors and two resistors) and four dual-output second-generation current conveyors. Each output provides a different filter response, namely low-pass, high-pass, band-pass and band-reject. A sensitivity analysis is also carried out on both the ideal and non-ideal filter configurations. The validity of the proposed filter is verified through PSPICE simulations.
Abstract: One of the main concerns about parallel mechanisms
is the presence of singular points within their workspaces. In singular
positions the mechanism gains or loses one or several degrees of
freedom. It is impossible to control the mechanism in singular
positions. Therefore, these positions have to be avoided. This is a
vital need especially in computer controlled machine tools designed
and manufactured on the basis of parallel mechanisms. This need has
to be taken into consideration when selecting design parameters. A
prerequisite to this is a thorough knowledge about the effect of
design parameters and constraints on singularity. In this paper, a
quality condition index was introduced as a criterion for evaluating
singularities of the different configurations of a hexapod mechanism
obtainable with different design parameters. It was illustrated that this
method can effectively be employed to obtain the optimum
configuration of a hexapod mechanism with the aim of avoiding
singularity within the workspace. The method was then employed to
design the hexapod table of a CNC milling machine.
Abstract: Wireless sensor networks (WSNs) consist of a number
of tiny, low-cost and low-power sensor nodes that monitor some physical phenomenon. The major limitation of these networks is their reliance on non-rechargeable batteries with a limited power supply, and the
main source of energy consumption in such networks is
the communication subsystem. This paper presents an energy-efficient
Cluster Cooperative Caching at Sensor (C3S) scheme based upon grid-type clustering. Sensor nodes belonging to the same cluster/grid form a
cooperative cache system for each node, since the cost of
communicating with them is low both in terms of energy
consumption and message exchanges. The proposed scheme uses
cache admission control and utility based data replacement policy to
ensure that more useful data is retained in the local cache of a node.
Simulation results demonstrate that the C3S scheme outperforms
NICoCa, an existing cooperative caching protocol for WSNs, on
various performance metrics.
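The admission-control and utility-based replacement ideas above can be sketched as follows. This is a minimal illustration, not the paper's C3S protocol: the admission threshold and the utility formula (popularity times re-fetch distance) are assumptions made up for the example.

```python
class NodeCache:
    """Per-node cache with admission control and utility-based eviction."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = {}  # data_id -> {"access_count": ..., "distance_to_source": ...}

    def admit(self, data_id, distance_to_source, min_distance=2):
        # Admission control: only cache items whose source is far away,
        # since nearby cluster members can already serve them cheaply.
        return distance_to_source >= min_distance and data_id not in self.items

    def utility(self, meta):
        # Utility grows with popularity and with the cost of re-fetching.
        return meta["access_count"] * meta["distance_to_source"]

    def put(self, data_id, distance_to_source):
        if not self.admit(data_id, distance_to_source):
            return False
        if len(self.items) >= self.capacity:
            # Replacement policy: evict the lowest-utility item.
            victim = min(self.items, key=lambda k: self.utility(self.items[k]))
            del self.items[victim]
        self.items[data_id] = {"access_count": 1,
                               "distance_to_source": distance_to_source}
        return True

    def get(self, data_id):
        meta = self.items.get(data_id)
        if meta:
            meta["access_count"] += 1
            return True
        return False

cache = NodeCache(capacity=2)
cache.put("d1", distance_to_source=5)
cache.put("d2", distance_to_source=3)
cache.get("d1")                        # d1 becomes more popular
cache.put("d3", distance_to_source=4)  # cache full: lowest-utility item (d2) is evicted
```

The intent is that "more useful data" (popular, expensive to re-fetch) survives in the local cache, matching the goal stated in the abstract.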
Abstract: Discourse pronominal anaphora resolution must be part of any efficient information processing system, since the reference of a pronoun depends on an antecedent located in the discourse. In contrast to knowledge-poor approaches, this paper shows that syntactic-semantic relations are fundamental to pronominal anaphora resolution. The identification of quantified expressions to which pronouns can be anaphorically related provides further evidence that pronominal anaphora is based on domains of interpretation where asymmetric agreement holds.
Abstract: A wireless sensor network is formed from a combination of sensor nodes and sink nodes, and has recently attracted the attention of the research community. The main application of wireless sensor networks is security, both for the mass public and for the military. However, securing these networks is itself a critical issue due to many constraints, such as limited energy, computational power and memory. Researchers working in this area have proposed a number of security techniques for this purpose; still, more work needs to be done. In this paper we provide a detailed discussion of security in wireless sensor networks. This paper will help to identify the different obstacles and requirements for the security of wireless sensor networks, as well as highlight the weaknesses of existing techniques.
Abstract: Retinal vascularity assessment plays an important role in the diagnosis of ophthalmic pathologies. The use of digital images for this purpose makes a computerized approach possible and has motivated the development of many methods for automated vascular tree segmentation. Metrics based on contingency tables for binary classification have been widely used for evaluating the performance of these algorithms; in particular, accuracy has mostly been used as the measure of global performance in this field. However, this metric matches human perception very poorly and shows other notable deficiencies. Here, a new similarity function for measuring the quality of retinal vessel segmentations is proposed. It is based on characterizing the vascular tree as a connected structure with a measurable area and length. Tests indicate that this new approach behaves better than the current one. More generally, this concept of measuring descriptive properties may be used to design functions that measure the segmentation quality of other complex structures more successfully.
Abstract: Extracting in-play scenes in sport videos is essential for
quantitative analysis and effective video browsing of the sport
activities. Game analysis of badminton, as with other racket sports,
requires detecting the start and end of each rally period in an
automated manner. This paper describes an automatic serve scene
detection method employing cubic higher-order local auto-correlation
(CHLAC) and multiple regression analysis (MRA). CHLAC can
extract features of the postures and motions of multiple persons,
without segmenting and tracking each person, by virtue of its
shift-invariance and additivity, and requires no prior knowledge.
Specific scenes, such as serves, are then detected from the CHLAC
features by MRA. To demonstrate the effectiveness of our method,
the experiment was conducted on video sequences of five badminton
matches captured by a single ceiling camera. The averaged precision
and recall rates for the serve scene detection were 95.1% and 96.3%,
respectively.
Abstract: The captured gel electrophoresis image represents
the output of a DNA computing algorithm. Before this image is
captured, DNA computing involves parallel overlap assembly (POA)
and polymerase chain reaction (PCR), which are the core of the
computing algorithm. However, designing the DNA oligonucleotides
that represent a problem is quite complicated and prone to errors.
In order to reduce these errors during the design stage, before the
actual in-vitro experiment is carried out, a simulation program
capable of simulating the POA and PCR processes was developed.
The software can simulate problems of any size and complexity,
thus saving the cost of possible errors during the design process.
Information regarding the DNA sequences during the computing
process, as well as the computing output, can be extracted at the
same time using the simulation software.
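The POA step described above can be sketched in simplified form: strands are modelled as plain strings, and two strands merge when the end of one overlaps the start of another. Real POA involves complementary double strands and polymerase extension; the sequences and the minimum overlap length here are illustrative assumptions.

```python
def overlap(a, b, min_len=4):
    """Length of the longest suffix of `a` that is a prefix of `b`."""
    for k in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def poa_round(pool, min_len=4):
    """One assembly round: merge the first overlapping pair found."""
    for i, a in enumerate(pool):
        for j, b in enumerate(pool):
            if i != j:
                k = overlap(a, b, min_len)
                if k:
                    merged = a + b[k:]  # anneal at the overlap, extend
                    rest = [s for idx, s in enumerate(pool) if idx not in (i, j)]
                    return rest + [merged]
    return pool  # no further overlaps: assembly complete

# Three illustrative oligonucleotides that chain into one strand.
pool = ["ATGCCGTA", "CGTATTGC", "TTGCAAAT"]
while True:
    nxt = poa_round(pool)
    if nxt == pool:
        break
    pool = nxt
```

Running the rounds to a fixed point assembles the three fragments into the single strand `ATGCCGTATTGCAAAT`, mimicking how POA iteratively builds longer sequences from overlapping oligos.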
Abstract: An embedded system for SEU (single event upset) testing
must be designed to prevent system failure caused by high-energy
particles while SEUs are being measured. An SEU is a phenomenon in which the data in a semiconductor device is changed temporarily by high-energy particles. In this paper, we present an embedded system for
SRAM (static random access memory) SEU testing. The SRAMs are on the DUT (device under test), which is separated from the control board that
manages the DUT and measures the occurrence of SEUs. Care must be
taken to prevent system failure while managing the
DUT and making accurate measurements of SEUs. We measured the occurrence of SEUs in five different SRAMs at three different
cyclotron beam energies: 30, 35, and 40 MeV. The number of SEUs per SRAM ranges from 3.75 to 261.00 on average.
Abstract: In text categorization, the most widely used method
for document representation is based on word-frequency vectors,
called the Vector Space Model (VSM). This representation is based
only on the words in the documents and thus loses any "word context"
information found in them. In this article we compare the
classical method of document representation with a method called
the Suffix Tree Document Model (STDM), which represents
documents in suffix-tree form. For the STDM we propose a new
approach to document representation and a new formula for
computing the similarity between two documents: the suffix tree is
built only for two documents at a time. This approach is faster, has
lower memory consumption, and uses the entire document
representation without requiring methods for discarding nodes. The
proposed similarity formula substantially improves clustering
quality. The representation method was validated using Hierarchical
Agglomerative Clustering (HAC). In this context we also experiment
with the influence of stemming in the document preprocessing step,
and highlight the difference between similarity and dissimilarity
measures for finding "closer" documents.
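The pairwise idea above can be sketched as follows. A word-level suffix set stands in for a full suffix tree, and the Jaccard similarity over shared tree paths is an assumption made up for the example, not the paper's formula.

```python
def word_suffixes(doc):
    """All word-level suffixes of a document, as tuples of words."""
    words = doc.lower().split()
    return {tuple(words[i:]) for i in range(len(words))}

def prefixes(suffixes):
    # Every prefix of every suffix corresponds to a root-to-node path
    # that a suffix tree built for the document would contain.
    nodes = set()
    for s in suffixes:
        for k in range(1, len(s) + 1):
            nodes.add(s[:k])
    return nodes

def stdm_similarity(doc_a, doc_b):
    """Similarity of two documents via their shared suffix-tree paths."""
    na = prefixes(word_suffixes(doc_a))
    nb = prefixes(word_suffixes(doc_b))
    return len(na & nb) / len(na | nb)  # Jaccard index over paths

sim_close = stdm_similarity("the cat sat on the mat", "the cat sat on the rug")
sim_far = stdm_similarity("the cat sat on the mat", "stock prices fell sharply")
```

Because only two documents are considered at a time, the structure stays small regardless of corpus size, which is the memory advantage claimed in the abstract.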
Abstract: The use of polypropylene mesh devices for Pelvic
Organ Prolapse (POP) spread rapidly during the last decade, yet our
knowledge of the mesh-tissue interaction is far from complete. We
aimed to perform a thorough pathological examination of explanted
POP meshes and describe findings that may explain mechanisms of
complications resulting in product excision. We report a spectrum of
important findings, including nerve ingrowth, mesh deformation,
involvement of detrusor muscle with neural ganglia, and
polypropylene degradation. Analysis of these findings may improve
and guide future treatment strategies.
Abstract: Recently, distributed generation technologies have received much attention for the potential energy savings and reliability assurances that might be achieved as a result of their widespread adoption. Fueling the attention have been the possibilities of international agreements to reduce greenhouse gas emissions, electricity sector restructuring, high power-reliability requirements for certain activities, and concern about easing transmission and distribution capacity bottlenecks and congestion. It is therefore necessary to investigate the impact of these kinds of generators on distribution feeder reconfiguration. This paper presents an approach to distribution reconfiguration considering Distributed Generators (DGs). The objective function is the summation of electrical power losses, and a Tabu search optimization is used to solve the optimal operation problem. The approach is tested on a real distribution feeder.
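The Tabu search step can be sketched on a toy version of the problem: switch states are bits, and a made-up loss function replaces the power-flow computation. The move rules, tabu tenure and aspiration criterion shown are generic Tabu-search elements, not the paper's exact implementation, and a real feeder study would also enforce radiality constraints.

```python
def losses(config):
    # Toy loss surface: lowest when config equals a target switch pattern.
    target = (1, 0, 0, 1, 0)
    return sum(1 for c, t in zip(config, target) if c != t)

def neighbours(config):
    # Each move toggles one switch (open <-> closed).
    for i in range(len(config)):
        yield i, tuple(c ^ (1 if j == i else 0) for j, c in enumerate(config))

def tabu_search(start, iterations=20, tenure=3):
    current = best = start
    tabu = {}  # move index -> iteration until which reversing it is forbidden
    for it in range(iterations):
        candidates = [(losses(cfg), i, cfg) for i, cfg in neighbours(current)
                      if tabu.get(i, -1) < it          # move not tabu, or
                      or losses(cfg) < losses(best)]   # aspiration: beats best
        if not candidates:
            break
        cost, move, current = min(candidates)
        tabu[move] = it + tenure
        if losses(current) < losses(best):
            best = current
    return best

best = tabu_search((0, 1, 1, 0, 1))
```

The tabu list lets the search escape local minima by forbidding immediate move reversals, which is why it is a common choice for combinatorial reconfiguration problems like this one.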
Abstract: Partitions can play a significant role in minimising
co-channel interference in Wireless LANs by attenuating signals across
room boundaries. This could pave the way towards higher density
deployments in home and office environments through spatial
channel reuse. Yet, due to protocol limitations, the latest incarnation
of the IEEE 802.11 standard is still unable to take advantage of this fact:
Despite having clearly adequate Signal to Interference Ratio (SIR)
over co-channel neighbouring networks in other rooms, its goodput
falls significantly below its maximum in the absence of co-channel
interferers. In this paper, we describe how this situation can
be remedied via modest modifications to the standard.
Abstract: The fast-growing accessibility and capability of emerging technologies have created enormous possibilities for designing, developing and implementing innovative teaching methods in the classroom. The global technological scenario has paved the way to new pedagogies in the teaching-learning process, focusing on technology-based learning environments and their impact on student achievement. The present experimental study was conducted to determine the effectiveness of a technology-based learning environment on student achievement in English as a foreign language. The sample of the study was 90 students of 10th grade of a public school located in Islamabad. A pretest-posttest equivalent-group design was used to compare the achievement of the two groups. A pretest and a posttest containing 50 items each from the English textbook were developed and administered. The collected data were statistically analyzed. The results showed a significant difference between the mean scores of the experimental group and the control group. The performance of the experimental group on posttest scores was better, indicating that teaching through a technology-based learning environment enhanced the achievement level of the students. On the basis of the results, it was recommended that teaching and learning through information and communication technologies be adopted to enhance the language-learning capability of students.
Abstract: In this paper we consider a nonlinear feedback control
called augmented automatic choosing control (AACC) using the
gradient optimization automatic choosing functions for nonlinear
systems. Constant terms which arise from sectionwise linearization
of a given nonlinear system are treated as coefficients of a stable
zero dynamics. Parameters included in the control are suboptimally
selected by expanding a stable region in the sense of Lyapunov
with the aid of a genetic algorithm. This approach is applied to
a field excitation control problem of a power system to demonstrate
the effectiveness of the AACC. Simulation results show that the new
controller can improve performance remarkably well.
Abstract: This study proposes a multi-response surface
optimization problem (MRSOP) for determining the proper choices
of a process parameter design (PPD) decision problem in a noisy
environment of a grease position process in an electronic industry.
The proposed model attempts to maximize dual process responses
on the mean of parts between failure on left and right processes. The
conventional modified simplex method and its hybridization of the
stochastic operator from the hunting search algorithm are applied to
determine the proper levels of controllable design parameters
affecting the quality performances. A numerical example
demonstrates the feasibility of applying the proposed model to the
PPD problem via two iterative methods. Its advantages are also
discussed. Numerical results demonstrate that the hybridization is
superior to the use of the conventional method. In this study, the
mean of parts between failure on the left and right lines improves
by approximately 39.51%. All experimental data presented in this
research have been normalized to disguise actual performance
measures as raw data are considered to be confidential.
Abstract: The purpose of this study is to introduce a new
interface program to calculate dose distributions with the Monte Carlo method in complex heterogeneous systems, such as organs or tissues,
in proton therapy. This interface program was developed under
MATLAB software and includes a friendly graphical user interface
with several tools, such as image-property adjustment and results display. A quadtree decomposition technique was used as the image
segmentation algorithm to create optimum geometries from Computed Tomography (CT) images for proton beam dose
calculations. The result of this technique is a number of
non-overlapping squares of different sizes in every image. In this way,
the resolution of image segmentation is high enough in and near
heterogeneous areas to preserve the precision of dose calculations
and is low enough in homogeneous areas to reduce the number of
cells directly. Furthermore, a cell-reduction algorithm can be used to combine neighboring cells with the same material. The validation of this method was done in two ways: first, in comparison with experimental data obtained with an 80 MeV proton beam at the Cyclotron
and Radioisotope Center (CYRIC) at Tohoku University, and second, in comparison with data based on the polybinary tissue calibration method performed at CYRIC. These results are presented in this paper. The program can read the output file of the Monte Carlo code while a region of interest is selected manually, and gives a plot of the proton beam dose distribution superimposed onto the CT images.
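The quadtree decomposition step described above can be sketched as follows. The strict equal-values homogeneity test and the toy 4x4 "image" are illustrative assumptions; a real implementation would threshold the variation of CT numbers within a block instead.

```python
def quadtree(img, x=0, y=0, size=None):
    """Return a list of (x, y, size, value) homogeneous blocks."""
    if size is None:
        size = len(img)  # assume a square image whose side is a power of two
    block = [img[y + j][x + i] for j in range(size) for i in range(size)]
    if max(block) == min(block) or size == 1:
        return [(x, y, size, block[0])]  # homogeneous: keep as one cell
    half = size // 2
    # Heterogeneous: split into four quadrants and recurse.
    return (quadtree(img, x, y, half) +
            quadtree(img, x + half, y, half) +
            quadtree(img, x, y + half, half) +
            quadtree(img, x + half, y + half, half))

# Toy 4x4 "CT slice": left half homogeneous, upper right heterogeneous.
img = [
    [1, 1, 2, 2],
    [1, 1, 2, 3],
    [1, 1, 4, 4],
    [1, 1, 4, 4],
]
blocks = quadtree(img)
```

On this example the homogeneous regions stay as a few large cells while only the mixed upper-right quadrant is refined to single pixels, which is exactly the resolution behaviour the abstract describes.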
Abstract: As the number of mobile service subscribers increases,
mobile content services are becoming more and more varied. Mobile
content development therefore needs not only content design but also
guidelines specific to mobile devices. When mobile content is
developed, it is important to overcome the limits and restrictions of
the mobile platform: a small browser and screen size, a limited
download size, and awkward navigation. Guidelines for each type of
mobile content are therefore presented with respect to usability,
ease of development, and consistency of rules. This paper proposes a
methodology consisting of such guidelines, and a mobile web site is
developed according to the proposed guidelines.
Abstract: Drilling is the most common machining operation, and it forms the highest machining cost in many manufacturing activities, including automotive engine production. The outcome of this operation depends upon many factors, including the use of proper cutting tool geometry, cutting tool material and the type of coating used to improve hardness and resistance to wear, as well as the cutting parameters. With the availability of a large array of tool geometries, materials and coatings, it has become a challenging task to select the best tool and cutting parameters that would result in the lowest machining cost or highest profit rate. This paper describes an algorithm developed to help achieve good performance in drilling operations by automatically determining proper cutting tools and cutting parameters. It also helps determine machining sequences resulting in minimum tool changes, which would eventually reduce machining time and cost where multiple tools are used.
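The tool-change minimisation idea can be sketched as a simple grouping heuristic. This is an illustration only, not the paper's algorithm: operation and tool names are made up, and grouping operations by the first appearance of each tool is a simple heuristic that is valid only when operations carry no precedence constraints.

```python
def tool_changes(sequence):
    """Count tool swaps along an ordered list of (operation, tool) pairs."""
    changes = 0
    for (_, prev), (_, cur) in zip(sequence, sequence[1:]):
        if prev != cur:
            changes += 1
    return changes

def minimise_tool_changes(operations):
    """Group operations by tool, preserving order within each group."""
    order = []   # tools in order of first appearance
    groups = {}  # tool -> its operations
    for op, tool in operations:
        if tool not in groups:
            order.append(tool)
            groups[tool] = []
        groups[tool].append((op, tool))
    return [item for tool in order for item in groups[tool]]

# Hypothetical drilling operations, each requiring a particular tool.
ops = [("hole1", "T1"), ("hole2", "T2"), ("hole3", "T1"),
       ("hole4", "T3"), ("hole5", "T2")]
plan = minimise_tool_changes(ops)
```

Here the naive sequence performs four tool swaps, while the grouped plan needs only two (one per extra tool), which is the kind of saving in machining time the abstract refers to.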