Abstract: In this work, multilayer perceptron (MLP) neural network
methods were applied to a database from an array of six sensors for the
detection of three toxic gases. The choice of the number of hidden
layers and of the weight values influences the convergence of the
learning algorithm. We propose, in this article, a mathematical formula
to determine the optimal number of hidden layers and good weight
values based on the method of back propagation of errors. The results
of this modeling have improved discrimination of these gases and
optimized the computation time. The model presented here has
proven to be an effective application for the fast identification of
toxic gases.
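The abstract does not reproduce the proposed formula, so the following is only a generic sketch of an MLP trained by error back-propagation on an invented six-sensor, three-gas stand-in problem; the layer sizes, learning rate, and data are assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented stand-in for the six-sensor / three-gas setting:
# 6 input features, one hidden layer of 8 units, 3 output classes.
X = rng.normal(size=(60, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(int) + (X[:, 2] > 0).astype(int)
T = np.eye(3)[y]                          # one-hot targets

W1 = rng.normal(scale=0.5, size=(6, 8))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(8, 3))   # hidden -> output weights
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    H = sigmoid(X @ W1)                   # forward pass
    O = sigmoid(H @ W2)
    dO = O - T                            # output delta (sigmoid + cross-entropy)
    dH = (dO @ W2.T) * H * (1 - H)        # back-propagated hidden-layer delta
    W2 -= lr * H.T @ dO / len(X)          # gradient descent on mean gradient
    W1 -= lr * X.T @ dH / len(X)

pred = sigmoid(sigmoid(X @ W1) @ W2).argmax(axis=1)
acc = (pred == y).mean()                  # training accuracy
```

With more hidden layers, the same delta recursion is applied once per layer, which is where the choice of layer count and initial weights affects convergence speed.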
Abstract: The continuous decline of petroleum and natural gas
reserves and the nonlinear rise of oil prices have brought about a
realisation of the need for a change in our perpetual dependence on
fossil fuels. The day-to-day increase in the consumption of crude and
petroleum products has made a considerable impact on our foreign
exchange reserves. Hence, an alternative resource for energy
conversion (both liquid and gaseous) is essential as a substitute for
conventional fuels. Biomass is an alternative solution for the present
scenario. Biomass can be converted into both liquid and gaseous
fuels and into other feedstocks for industry.
Abstract: The exact theoretical expression describing the
probability distribution of nonlinear sea-surface elevations derived
from the second-order narrowband model has a cumbersome form
that requires numerical computation and is ill-suited to theoretical
or practical applications. Here, the same narrowband model is reexamined
to develop a simpler closed-form approximation suitable
for theoretical and practical applications. The salient features of the
approximate form are explored, and its relative validity is verified
with comparisons to other readily available approximations and to
oceanic data.
Abstract: In this paper, we present a robust algorithm to recognize extracted text from grocery product images captured by mobile phone cameras. Recognition of such text is challenging since text in grocery product images varies in its size, orientation,
style, illumination, and can suffer from perspective distortion.
Pre-processing is performed to make the characters scale and
rotation invariant. Since text degradations cannot be appropriately
described by well-known geometric transformations such as
translation, rotation, affine transformation, and shearing, we
use all of the character's black pixels as our feature vector.
Classification is performed with minimum distance classifier
using the maximum likelihood criterion, which delivers a very
promising Character Recognition Rate (CRR) of 89%. We
achieve a considerably higher Word Recognition Rate (WRR) of
99% when using lower-level linguistic knowledge about product
words during the recognition process.
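As a toy illustration of a minimum distance classifier over raw binary pixel feature vectors, the sketch below uses invented 3x3 glyph templates as class centroids; the paper's actual feature dimensions and training data are not given in the abstract.

```python
import numpy as np

# Invented 3x3 binary glyph templates standing in for per-class centroids.
templates = {
    "I": np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]], float).ravel(),
    "L": np.array([[1, 0, 0], [1, 0, 0], [1, 1, 1]], float).ravel(),
    "T": np.array([[1, 1, 1], [0, 1, 0], [0, 1, 0]], float).ravel(),
}

def classify(sample):
    """Return the label whose template is nearest in Euclidean distance."""
    flat = np.asarray(sample, float).ravel()
    return min(templates, key=lambda k: np.linalg.norm(flat - templates[k]))

noisy_L = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 1]])  # "L" with one flipped pixel
label = classify(noisy_L)
```

Under an equal-variance Gaussian model of pixel noise, the maximum likelihood decision reduces to exactly this nearest-centroid rule, which is presumably why the abstract pairs the two terms.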
Abstract: Taking the design tolerance into account, this paper
presents a novel, efficient approach to generating iso-scallop tool paths
for five-axis strip machining with a barrel cutter. The cutter location is
first determined on the scallop surface instead of the design surface,
and then the cutter is adjusted to the optimal tool position based on a
differential rotation of the tool axis while simultaneously satisfying
the design tolerance. The machining strip width and error are
calculated with the aid of the grazing curve of the cutter. Based on the
proposed tool positioning algorithm, the tool paths are generated by
keeping the scallop height formed by adjacent tool paths constant. An
example is conducted to confirm the validity of the proposed method.
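The paper's barrel-cutter positioning is not detailed in the abstract; the sketch below only illustrates the underlying constant-scallop idea with the textbook ball-end approximation relating scallop height, effective cutter radius, and path interval on a locally flat surface, which is a stand-in, not the paper's method.

```python
import math

# Textbook approximation for a ball-end cutter on a locally flat surface:
#     h ~ w**2 / (8 * R)   =>   w ~ sqrt(8 * R * h)
# where h is the scallop height, R the effective cutter radius, and
# w the path interval (step-over) between adjacent tool paths.

def step_over(R_mm, h_mm):
    """Path interval that keeps the scallop height near h for radius R."""
    return math.sqrt(8.0 * R_mm * h_mm)

w = step_over(R_mm=5.0, h_mm=0.01)  # 5 mm effective radius, 10 um scallop
```

Iso-scallop path generation repeatedly offsets each path by the locally computed step-over so that h stays constant, which is the property the proposed algorithm preserves for the barrel cutter.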
Abstract: The use of eXtensible Markup Language (XML) in
web, business, and scientific databases has led to the development of
methods, techniques, and systems to manage and analyze XML data.
Semi-structured documents suffer from heterogeneity and high
dimensionality. XML structure and content mining represent a
convergence of research in semi-structured data and text mining. As
the information available on the internet grows drastically, extracting
knowledge from XML documents becomes a harder task. Indeed,
documents are often so large that the data set returned as the answer to
a query may itself be too big to convey the required information. To
improve query answering, a Semantic Tree Based Association
Rule (STAR) mining method is proposed. This method provides
intensional information by considering the structure, the content, and
the semantics of the content. The method is applied to the Reuters
dataset, and the results show that the proposed method performs well.
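The tree and semantic machinery of STAR mining is not specified in the abstract; the sketch below shows only the generic support/confidence computation that underlies any association-rule method, on invented term sets standing in for XML document fragments.

```python
# Invented "documents" as sets of extracted terms (not the Reuters data).
docs = [
    {"trade", "oil", "price"},
    {"trade", "oil"},
    {"trade", "grain"},
    {"oil", "price"},
]

def support(itemset):
    """Fraction of documents containing every item in the itemset."""
    return sum(itemset <= d for d in docs) / len(docs)

def confidence(antecedent, consequent):
    """Estimated P(consequent | antecedent) over the document set."""
    return support(antecedent | consequent) / support(antecedent)

s = support({"oil", "price"})       # both terms co-occur in 2 of 4 docs
c = confidence({"oil"}, {"price"})  # "price" appears in 2 of 3 "oil" docs
```

Rules above chosen support and confidence thresholds form the intensional summary returned instead of the raw (and possibly huge) query answer.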
Abstract: At present, voltage stability assessment is a major
concern in the safe operation of power systems, owing to the
complications of a stressed power system. With power demand from
consumers growing and the available power sources restricted, the
system has to perform at its maximum proficiency. Consequently,
determining the maximum loading boundary prior to voltage collapse
is an important task, so that a preliminary warning can be issued to
avoid an interruption of the power system's capacity. The
effectiveness of line voltage stability indices (LVSI) is compared
in this paper. The main purpose of the indices used is to predict the
proximity of the electric power system to voltage instability. The
indices are also able to identify the weakest load buses, those
closest to voltage collapse in the power system. The line
stability indices are assessed using the IEEE 14-bus test system to
validate their practicability. The results demonstrate that the
implemented indices are practically relevant in predicting the
manifestation of voltage collapse in the system, so that essential
actions can be taken to prevent such incidents from arising.
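The abstract does not list which LVSI are compared; as one widely used representative of the family, the Fast Voltage Stability Index (FVSI) is sketched below with illustrative per-unit numbers, not values from the paper's IEEE 14-bus results.

```python
# Fast Voltage Stability Index for a line from sending bus s to
# receiving bus r:
#     FVSI = 4 * Z**2 * Q_r / (V_s**2 * X)
# where Z is the line impedance magnitude, X its reactance, Q_r the
# reactive power at the receiving end, and V_s the sending-end voltage.
# Values approaching 1.0 indicate proximity to voltage collapse.

def fvsi(Z, X, Q_r, V_s):
    return 4.0 * Z**2 * Q_r / (V_s**2 * X)

# Illustrative per-unit numbers (assumed, not from the paper):
index = fvsi(Z=0.06, X=0.05, Q_r=0.4, V_s=1.0)
```

Ranking lines by such an index is what lets these methods flag the weakest load buses before collapse.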
Abstract: E-learning has become an efficient and widespread
means of education at all levels of human activity, and statistics is no
exception. Unfortunately, the main focus in statistics teaching is
usually on substituting values into formulas. Suitable websites can
simplify and automate the calculations and free more attention and
time for the basic principles of statistics, the mathematization of real-life
situations, and the subsequent interpretation of results. We introduce
our own website for hypothesis testing. Its didactic aspects, the
technical possibilities of the individual tools, the experience of its use,
and its advantages and disadvantages are discussed in this paper. This
website is not a substitute for common statistical software, but it should
significantly improve the teaching of statistics at universities.
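A minimal sketch of the kind of formula substitution such a website automates: a one-sample t-test of H0: mu = 50, with invented data (the site itself would additionally report the p-value and a verbal conclusion).

```python
import math
import statistics

# Invented sample for illustration; H0: population mean equals 50.
data = [48.2, 51.1, 49.5, 50.3, 47.9, 52.0, 49.1, 50.8]
mu0 = 50.0

n = len(data)
mean = statistics.fmean(data)
s = statistics.stdev(data)              # sample standard deviation
t = (mean - mu0) / (s / math.sqrt(n))   # test statistic, df = n - 1
```

Automating this arithmetic leaves classroom time for the harder questions: checking assumptions, choosing the test, and interpreting the result.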
Abstract: Securing the confidential data transferred via wireless
network remains a challenging problem. It is paramount to ensure
that data are accessible only by the legitimate users rather than by the
attackers. One of the most serious threats to an organization is
jamming, which disrupts the communication between any pair of nodes.
Therefore, designing an attack-defending scheme without packet
loss in data transmission is an important challenge. In this paper,
a Dependence-based Malicious Route Defending (DMRD) scheme is
proposed in a multipath routing environment to prevent jamming
attacks. The key idea is to defend against malicious routes to ensure
transparent transmission. The scheme develops a two-layered
architecture and it operates in two different steps. In the first step,
possible routes are captured and their agent dependence values are
marked using triple agents. In the second step, the dependence values
are compared by performing comparator filtering to detect malicious
route as well as to identify a reliable route for secured data
transmission. Simulation studies show that the proposed
scheme identifies malicious routes effectively, attaining lower
delay and route discovery times while achieving higher
throughput.
Abstract: Femtocells are regarded as a milestone for next
generation cellular networks. As femtocells are deployed in an
unplanned manner, there is a chance of assigning the same resource to
neighboring femtocells. This scenario may induce co-channel
interference and may seriously affect the service quality of
neighboring femtocells. In addition, the dominant transmit power of a
femtocell will induce co-tier interference to neighboring femtocells.
Thus to jointly handle co-tier and co-channel interference, we
propose an interference-free power and resource block allocation
(IFPRBA) algorithm for closely located, closed access femtocells.
Based on the neighboring list, the inter-femto-base-station distance,
and the uplink noise power, the IFPRBA algorithm assigns non-interfering
power and resources to femtocells. It also guarantees quality of service
to femtousers based on knowledge of the resource requirement, the
connection type, and the tolerable delay budget. Simulation results show
that the interference power experienced under the IFPRBA algorithm stays
below the tolerable interference power, and hence the overall service
success ratio, PRB efficiency, and network throughput are maximal
compared to the conventional resource allocation framework for
femtocells (RAFF) algorithm.
Abstract: The main purpose of this research is to
comprehensively explore and identify the problems in the attestation of
public servants and to propose solutions to these issues through a
deep analysis of the laws and the legal theoretical literature. For the
detailed analysis of the above-mentioned problems, we use several
research methods whose implementation aims to ensure
the objectivity and clarity of the research and its results.
Abstract: The focal aim of e-Government (eGovt) is to offer
citizen-centered service delivery. Accordingly, the citizenry
consumes services from multiple government agencies through a
national portal. Thus, eGovt is an enterprise whose primary
business motive is transparent, efficient, and effective public services
for its citizenry, and its logical structure is the eGovernment Enterprise
Architecture (eGEA). Since eGovt is an IT-oriented, multifaceted,
service-centric system, EA does not do much for an automated
enterprise beyond the business artifacts. The emergence of
Service-Oriented Architecture (SOA) has led some governments to apply
it in their eGovts, but SOA alone limits the source of business artifacts.
The concurrent use of EA and SOA in eGovt provides interoperability and
integration and leads to the Service-Oriented e-Government Enterprise
(SOeGE). Consequently, an agile eGovt system becomes a reality. From an
IT perspective, eGovt comprises centralized public service artifacts
together with existing application logic belonging to various departments
at the central, state, and local levels. eGovt is being renovated into
SOeGE by applying Service-Orientation (SO) principles across the entire
system. This paper explores the IT perspective of SOeGE in India,
covering the public service models, and illustrates it with a case
study of the Passport service of India.
Abstract: Real-time image and video processing is in demand in
many computer vision applications, e.g., video surveillance, traffic
management, and medical imaging. These video applications require
high computational power; thus, the optimal solution is a
collaboration of the CPU and hardware accelerators. In
this paper, a Canny edge detection hardware accelerator is proposed.
Edge detection is one of the basic building blocks of video and image
processing applications. It is a common block in the pre-processing
phase of the image and video processing pipeline. Our presented
approach targets offloading the Canny edge detection algorithm from
the processing system (PS) to the programmable logic (PL), taking
advantage of the High-Level Synthesis (HLS) tool flow to accelerate the
implementation on the Zynq platform. The resulting implementation
enables up to a 100x performance improvement through hardware
acceleration: CPU utilization drops, and the frame rate reaches 60 fps
for a 1080p full-HD input video stream.
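The paper's contribution is the HLS hardware mapping, which is not reproduced here; the sketch below is only a simplified software reference of the Canny stages (Gaussian smoothing, Sobel gradients, double threshold with hysteresis), with non-maximum suppression omitted for brevity and a synthetic test image.

```python
import numpy as np

def conv2d(img, k):
    """Naive 2-D correlation with edge padding (reference, not fast)."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    pad = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(pad[i:i + kh, j:j + kw] * k)
    return out

def simplified_canny(img, lo=50.0, hi=100.0):
    g = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 16.0  # Gaussian
    sx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)     # Sobel x
    sy = sx.T                                                      # Sobel y
    smooth = conv2d(img.astype(float), g)
    mag = np.hypot(conv2d(smooth, sx), conv2d(smooth, sy))
    strong = mag >= hi
    weak = mag >= lo
    # One-pass hysteresis: keep weak pixels adjacent to a strong pixel.
    grown = conv2d(strong.astype(float), np.ones((3, 3))) > 0
    return strong | (weak & grown)

# Synthetic input: a bright square on a dark background.
img = np.zeros((20, 20))
img[5:15, 5:15] = 255.0
edges = simplified_canny(img)
```

Each stage is a fixed small-kernel stencil over the image, which is exactly the structure that maps well onto programmable logic via HLS pipelining.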
Abstract: This paper investigates the viability of using carbon
fiber reinforced epoxy composites modified with carbon nanotubes
(CNTs) for strengthening reinforced concrete (RC) columns. Six RC
columns were designed and constructed according to ASCE standards. The
columns were wrapped using carbon fiber sheets impregnated with
either neat epoxy or CNT-modified epoxy and were then
tested under concentric axial loading. Test results show that,
compared to the unwrapped specimens, wrapping concrete columns
with carbon fiber sheet embedded in CNT-modified epoxy increased
their axial load resistance, maximum displacement,
and toughness values by 24%, 109%, and 232%, respectively. These
results reveal that adding CNTs to the epoxy resin enhanced the
confinement effect; specifically, it increased the axial load resistance,
maximum displacement, and toughness values by 11%, 6%, and
19%, respectively, compared with columns strengthened with carbon
fiber sheet embedded in neat epoxy.
Abstract: Software architecture is the basic structure of
software and guides the development and evolution of a software
system. Software architecture is also considered a significant tool
for the construction of high-quality software systems. A clean design
leads to control, value, and beauty in software, resulting in a
longer life, while a bad design causes architectural erosion,
whereby software evolution completely fails. This paper discusses the
occurrence of software architecture erosion and presents a set of
methods for the detection, declaration, and prevention of architecture
erosion. The causes and symptoms of architecture erosion are
examined with examples of prescriptive and descriptive
architectures, and the practices used to stop this erosion are also
discussed by considering different types of software erosion and their
effects. Finally, the most suitable approaches for fighting software
architecture erosion, and in some measure reducing its effect, are
evaluated and tested on different scenarios.
Abstract: This study investigates how AlGaAs/GaAs thin-film
solar cells perform under a varying global solar spectrum due to
changes in environmental parameters such as air mass and
atmospheric turbidity. The solar irradiance striking the solar cell is
simulated using the spectral irradiance model SMARTS2 (Simple
Model of the Atmospheric Radiative Transfer of Sunshine) for clear
skies at the site of Setif (Algeria). The results show a reduction in
the short-circuit current with increasing atmospheric turbidity,
reaching 63.09% under global radiation, while increasing air mass
leads to a reduction in the short-circuit current of 81.73%. The
efficiency decreases with increasing atmospheric turbidity and air mass.
Abstract: The star network is one of the promising
interconnection networks for future high-speed parallel computers and
is expected to be one of the future-generation networks. The star
network is both edge- and vertex-symmetric, has been shown to have
many attractive topological properties, and possesses a hierarchical
structure. Although much research has been done on this promising
network in the literature, it still lacks algorithms for the load
balancing problem. In this paper we address this issue by investigating
and proposing an efficient load balancing algorithm for the star
network. The proposed algorithm, called the Star Clustered Dimension
Exchange Method (SCDEM), is based on the Clustered Dimension Exchange
Method (CDEM). The SCDEM algorithm is shown to be efficient in
redistributing the load as evenly as possible among all
nodes of the different factor networks.
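The star-graph-specific clustering of SCDEM is not described in the abstract; the sketch below shows only the classical dimension-exchange idea it builds on, illustrated on a 3-dimensional hypercube (an assumed stand-in topology, since dimension exchange is most simply stated there).

```python
# Classical dimension exchange on a d-dimensional hypercube: in round d,
# every node averages its load with the neighbor whose node ID differs
# in bit d. After all rounds the load is perfectly balanced.

def dimension_exchange(load, dims):
    load = list(load)
    for d in range(dims):
        for node in range(len(load)):
            nbr = node ^ (1 << d)          # neighbor differs in bit d
            if node < nbr:                 # process each pair once
                avg = (load[node] + load[nbr]) / 2.0
                load[node] = load[nbr] = avg
    return load

balanced = dimension_exchange([8, 0, 4, 0, 0, 0, 4, 0], dims=3)
```

SCDEM adapts this pairwise-averaging pattern to the star graph by exchanging load between clusters of its factor networks rather than along hypercube dimensions.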
Abstract: This work was one of the tasks of the
Manufacturing2Client project, whose objective was to develop a
frontal deflector to be commercialized in the automotive industry
using new design and manufacturing methods. In this task, in
particular, it was proposed to develop the ability to predict
computationally the aerodynamic influence of the flow around vehicles,
in an effort to reduce fuel consumption in vehicles from class 3 to 8. With
this aim, two deflector models were developed and their aerodynamic
performance analyzed. The aerodynamic study was done using the
Computational Fluid Dynamics (CFD) software Ansys CFX and
allowed the calculation of the drag coefficient caused by the vehicle
motion for the different configurations considered. Moreover, the
reduction of diesel consumption and carbon dioxide (CO2) emissions
associated with the optimized deflector geometry could be assessed.
Abstract: Axial flow fans, while incapable of developing high
pressures, are well suited to handling large volumes of air at
relatively low pressures. In general, they are low in cost, possess
good efficiency, can have blades of airfoil shape, and can operate at
high static pressures if such operation is necessary. Our objective is
to model and analyze the flow through axial fans using CFD software
and to draw inferences from the obtained results so as to reach maximum
efficiency. The performance of an axial fan was simulated using CFD,
and the effect of variation of different parameters, such as the blade
number, noise level, velocity, and the temperature and pressure
distribution on the blade surface, was studied. This paper presents a
final 3D CAD model of an axial flow fan. Adapting this model to the
components available on the market, a first optimization was done.
After this step, the CFX flow solver was used to perform the necessary
numerical analyses of the aerodynamic performance of this model. This
analysis results in a final optimization of the proposed 3D model,
which is presented in this article.
Abstract: The final step to complete the “Analytical Systems
Engineering Process” is the “Allocated Architecture” in which all
Functional Requirements (FRs) of an engineering system must be
allocated into their corresponding Physical Components (PCs). At
this step, any design for developing the system’s allocated
architecture in which no clear pattern of assigning the exclusive
“responsibility” of each PC for fulfilling the allocated FR(s) can be
found is considered a poor design that may cause difficulties in
determining the specific PC(s) which has (have) failed to satisfy a
given FR successfully. The present study utilizes the Axiomatic
Design method principles to mathematically address this problem and
establishes an “Axiomatic Model” as a solution for reaching good
alternatives for developing the allocated architecture. The study
proposes a "Loss Function" as a quantitative criterion to monetarily
compare non-ideal designs for developing the allocated architecture
and to choose the one that imposes a relatively lower cost on the
system's stakeholders. For the case study, we use the existing design
of the U.S. electricity marketing subsystem, based on data provided by
the U.S. Energy Information Administration (EIA). The result for
2012 shows the symptoms of a poor design and of ineffectiveness due to
coupling among the FRs of this subsystem.
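The paper's loss function itself is not given in the abstract; the sketch below shows only the standard Axiomatic Design classification it relies on, where the nonzero pattern of the design matrix A (FRs = A x DPs) determines whether a design is uncoupled, decoupled, or coupled. The matrices used are invented examples.

```python
import numpy as np

def classify_design(A):
    """Classify a square design matrix by its nonzero pattern:
    diagonal -> uncoupled (ideal), triangular -> decoupled,
    anything else -> coupled (the poor-design case in the abstract)."""
    A = np.asarray(A, float) != 0
    off = A & ~np.eye(len(A), dtype=bool)      # off-diagonal nonzeros
    if not off.any():
        return "uncoupled"
    if not np.triu(off, 1).any() or not np.tril(off, -1).any():
        return "decoupled"
    return "coupled"

kind = classify_design([[1, 0], [1, 1]])       # lower-triangular pattern
```

A coupled matrix is exactly the situation where no exclusive PC "responsibility" for each FR can be identified, so a failed FR cannot be traced to a specific physical component.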