Abstract: The earth today faces a serious air pollution problem. It began with the industrial revolution and has accelerated in recent years, driving the planet toward ecological and environmental disaster; one consequence is global warming and the associated rise in global temperature. In urban environments in particular, the most important contributors to air pollution are automobiles and residential buildings, the largest consumers of fossil energy, so if residential buildings, as a major share of that consumption, reduce their energy use, air pollution will decrease accordingly. Since metropolises are the main centers of air pollution in the world, assessing and analyzing effective strategies for reducing air pollution in such cities can yield desirable results and address the problem at least at its most critical level. Tabriz, with about two million inhabitants, is one of the most important metropolises of northwestern Iran; because of its location in a cold, dry climate, it has a high rate of fossil energy consumption, which pollutes its urban environment. These two factors, its status as a metropolis and its cold, dry climate, motivate this article to analyze the climatic design strategies of the city's old districts and apply them to the new districts of the future. These strategies can be used in Tabriz and in similar cities, paving the way to reduce energy consumption and the related air pollution worldwide.
Abstract: This paper proposes a novel solution for optimizing
the size and communication overhead of a distributed multiagent
system without compromising the performance. The proposed approach
addresses the challenges of scalability especially when the
multiagent system is large. A modified spectral clustering technique
is used to partition a large network into logically related clusters.
Agents are assigned to monitor dedicated clusters rather than monitor
each device or node. The proposed scalable multiagent system is
implemented using JADE (Java Agent DEvelopment Framework)
for a large power system. The performance of the proposed topology-independent
decentralized multiagent system and the scalable multiagent
system is compared by comprehensively simulating different
fault scenarios. The time taken for reconfiguration, the overall computational
complexity, and the communication overhead incurred are
computed. The results of these simulations show that the proposed
scalable multiagent system uses fewer agents efficiently, makes faster
decisions to reconfigure when a fault occurs, and incurs significantly
less communication overhead.
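The abstract does not spell out the modified spectral clustering technique, but the core idea of spectral partitioning can be sketched with plain power iteration and no external libraries. Everything below (the graph, the function name, the iteration count) is illustrative, not the paper's implementation: two triangles joined by a bridge edge are bisected by the sign of the Fiedler vector of the graph Laplacian, so each agent can be assigned one cluster instead of individual nodes.

```python
def fiedler_bisect(adj):
    """Split a graph in two by the sign of its Fiedler vector,
    computed with plain power iteration on M = c*I - L."""
    n = len(adj)
    deg = [sum(row) for row in adj]
    c = 2 * max(deg) + 1  # shift so M = c*I - L is positive definite
    v = [float(i + 1) for i in range(n)]  # arbitrary non-constant start
    for _ in range(500):
        mean = sum(v) / n
        v = [x - mean for x in v]  # project out the trivial all-ones eigenvector
        w = []
        for i in range(n):
            Lv = deg[i] * v[i] - sum(adj[i][j] * v[j] for j in range(n))
            w.append(c * v[i] - Lv)  # w = (c*I - L) v, where L = D - A
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return [0 if x < 0 else 1 for x in v]

# Two triangles {0,1,2} and {3,4,5} joined by the single edge (2,3)
adj = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]
clusters = fiedler_bisect(adj)
```

An agent would then be created per cluster label rather than per node, which is what reduces the system size and the communication overhead.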
Abstract: A numerical study has been carried out to investigate natural-convection heat transfer of a nanofluid, with Cu nanoparticles and water as the base fluid, in a three-dimensional annular enclosure filled with porous media (silica sand) between two horizontal concentric cylinders, with 12 annular fins of 2.4 mm thickness attached to the inner cylinder, under steady-state conditions. The governing continuity, momentum, and energy equations, under the assumptions of Darcy's law and the Boussinesq approximation, are transformed into dimensionless form. A finite difference approach, implemented in MATLAB 7, is used to obtain all the computational results. The parameters affecting the system are the modified Rayleigh number (10 ≤ Ra* ≤ 1000), fin length Hf (3, 7, and 11 mm), radius ratio Rr (0.293, 0.365, and 0.435), and the volume fraction (0 ≤ φ ≤ 0.35). The average Nusselt number was found to depend on Ra*, Hf, Rr, and φ. The results show that increasing the fin length decreases the heat transfer rate; for low values of Ra*, decreasing Rr decreases Nu, while for Ra* greater than 100, decreasing Rr increases Nu; and adding Cu nanoparticles at a volume fraction of 0.35 yields a 27.9% enhancement in heat transfer. A correlation for Nu in terms of Ra*, Hf, and φ has been developed for the inner hot cylinder.
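The paper's own model (Darcy flow in a finned porous annulus) is far beyond an abstract-sized sketch, but the finite difference approach it relies on can be illustrated on the simplest steady-state conduction problem. This is a generic example, not the authors' code: a Jacobi iteration for the Laplace equation on a square grid with one hot wall, the same discretize-and-iterate pattern used for the dimensionless governing equations.

```python
def solve_laplace(n=20, iters=500):
    """Jacobi iteration for the 2-D Laplace equation on an n x n grid.
    Dirichlet boundaries: left wall hot (T = 1), other walls cold (T = 0)."""
    T = [[0.0] * n for _ in range(n)]
    for i in range(n):
        T[i][0] = 1.0  # hot wall
    for _ in range(iters):
        Tn = [row[:] for row in T]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                # central-difference stencil: average of the four neighbors
                Tn[i][j] = 0.25 * (T[i + 1][j] + T[i - 1][j]
                                   + T[i][j + 1] + T[i][j - 1])
        T = Tn
    return T

T = solve_laplace()
```

The converged field decreases monotonically from the hot wall toward the cold walls, and wall-normal gradients of such a field are what a Nusselt-number evaluation integrates.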
Abstract: This paper presents the results of an analytical study
on the seismic response of a Multi-Span-Simply-Supported precast
bridge in Washington State. The bridge was built in the early 1960s
along Interstate 5 and was widened first in 1979 and again in 2001.
The primary objective of this research project
is to determine the seismic vulnerability of the bridge in order to
develop the required retrofit measure. The seismic vulnerability of
the bridge is evaluated using two seismic evaluation methods
presented in the FHWA Seismic Retrofitting Manual for Highway
Bridges, Method C and Method D2. The results of the seismic
analyses demonstrate that Method C and Method D2 vary markedly
in terms of the information they provide to the bridge designer
regarding the vulnerability of the bridge columns.
Abstract: Business rules and data warehouse are concepts and
technologies that impact a wide variety of organizational tasks. In
general, each area has evolved independently, impacting application
development and decision-making. Generating knowledge from a data warehouse is a complex process. This paper outlines an approach to ease the extraction of information and knowledge from a data warehouse star schema through an inference class of business rules. The Oracle database is used to illustrate how the concepts work: the star schema structure and the business rules are stored within a relational database, and the approach is demonstrated through a prototype in Oracle's PL/SQL Server Pages.
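The paper's prototype uses Oracle PL/SQL Server Pages; as a portable illustration of the same idea, storing both the star schema and a business rule inside the relational database and inferring new facts from them, here is a hypothetical sketch in SQLite. All table names, columns, and the "top_seller" rule are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# A tiny star schema: one fact table referencing two dimension tables
cur.executescript("""
CREATE TABLE dim_product(product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_region(region_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales(product_id INTEGER, region_id INTEGER, amount REAL);
""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "widget"), (2, "gadget")])
cur.executemany("INSERT INTO dim_region VALUES (?, ?)",
                [(1, "north"), (2, "south")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 900.0), (1, 2, 400.0), (2, 1, 150.0)])

# The business rule is stored as data, not hard-coded in the application
cur.execute("CREATE TABLE rules(name TEXT, threshold REAL)")
cur.execute("INSERT INTO rules VALUES ('top_seller', 1000.0)")

# Inference: apply the stored rule against the aggregated fact table
threshold = cur.execute(
    "SELECT threshold FROM rules WHERE name = 'top_seller'").fetchone()[0]
top = cur.execute("""
    SELECT p.name FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name HAVING SUM(f.amount) >= ?
""", (threshold,)).fetchall()
```

Changing the rule is then a data update, not a code change, which is the point of keeping the inference class of rules inside the database.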
Abstract: This paper describes a code clone visualization method, called FC graph, and its implementation issues. Code clone detection tools usually show their results in a textual representation; when the results are large, software maintainers have trouble understanding them. One approach to overcoming this is to visualize the detection results. A scatter plot is a popular visualization, but it represents only one-to-one correspondence, making it difficult to trace code clones across multiple files. FC graph represents the correspondence among files, code clones, and packages in Java. All nodes in an FC graph are positioned using a force-directed graph layout, which dynamically adjusts the distances between nodes until they stabilize. We applied FC graph to several open source programs and visualized the results. In the authors' experience, FC graph helps in grasping the correspondence of code clones across multiple files as well as code clones within a single file.
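A force-directed layout of the kind FC graph relies on can be sketched in a few lines. The following is a minimal Fruchterman-Reingold-style iteration with illustrative constants, not the FC graph implementation itself: pairwise repulsion, attraction along edges, and a cooling temperature that caps per-step movement so the layout stabilizes.

```python
import math
import random

def layout(edges, n, iters=200, width=10.0):
    """Minimal force-directed (Fruchterman-Reingold-style) layout sketch."""
    random.seed(0)
    pos = [(random.uniform(0, width), random.uniform(0, width)) for _ in range(n)]
    k = width / math.sqrt(n)   # ideal pairwise distance
    t = width / 10.0           # temperature caps per-step movement
    for _ in range(iters):
        disp = [[0.0, 0.0] for _ in range(n)]
        for i in range(n):     # repulsion between every pair of nodes
            for j in range(n):
                if i == j:
                    continue
                dx = pos[i][0] - pos[j][0]
                dy = pos[i][1] - pos[j][1]
                d = math.hypot(dx, dy) or 1e-9
                f = k * k / d
                disp[i][0] += dx / d * f
                disp[i][1] += dy / d * f
        for a, b in edges:     # attraction along edges
            dx = pos[a][0] - pos[b][0]
            dy = pos[a][1] - pos[b][1]
            d = math.hypot(dx, dy) or 1e-9
            f = d * d / k
            disp[a][0] -= dx / d * f
            disp[a][1] -= dy / d * f
            disp[b][0] += dx / d * f
            disp[b][1] += dy / d * f
        for i in range(n):     # move each node, limited by the temperature
            d = math.hypot(*disp[i]) or 1e-9
            step = min(d, t)
            pos[i] = (pos[i][0] + disp[i][0] / d * step,
                      pos[i][1] + disp[i][1] / d * step)
        t *= 0.95              # cool down so positions settle
    return pos

pos = layout([(0, 1), (1, 2), (2, 0), (2, 3)], 4)
```

In FC graph the nodes would be files, clones, and packages, and the settled distances are what make related clones appear near each other.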
Abstract: This paper proposes a simple model of economic geography within the Dixit-Stiglitz-Iceberg framework that may be used to analyze migration patterns among three cities. The cost–benefit tradeoffs affecting incentives for three types of migration, including echelon migration, are discussed. The paper develops a tractable, heterogeneous-agent, general equilibrium model, where agents share constant human capital, and explores the relationship between the benefits of echelon migration and gross human capital. Using numerical solutions calibrated to China, we study how echelon migration manifests and how it responds to changes in transportation cost and the elasticity of substitution. The numerical results demonstrate that (i) there is a positive relationship between a migration's benefit and the wage ratio, (ii) there are positive relationships between the gross human capital ratios and wage ratios of origin and destination, and (iii) there are 13 varieties of human capital convergence among cities. In particular, the model predicts population shocks resulting from the processes of migration choice and echelon migration.
Abstract: The Kowsar dam, located near the city of Dehdasht in Kohgiluye and Boyerahmad province in southern Iran, supplies water for drinking, industrial, agricultural, and aquaculture uses. Several towns and villages lie on the Kowsar dam watershed, of which Dehdasht and Choram are the most important and most populated. This study was undertaken to assess the status of water quality in the urban areas of the Kowsar dam. A total of 28 water samples were collected, from six surface water stations and one groundwater station on the watershed. All samples were analyzed for Cd concentration using standard procedures, and the results were compared with national and international standards. Although the maximum cadmium value (1.131 μg/L) was observed at station 2 in winter 2009, all samples analyzed were within the maximum admissible limits of the United States Environmental Protection Agency, EU, WHO, New Zealand, Australian, Iranian, and Indian standards. In general, the results of the present study show that the mean Cd values of stations 4, 1, and 2, at 0.5135, 0.4733, and 0.4573 μg/L respectively, are higher than those of the other stations. Although the Cd levels of all samples and stations were within normal values, they indicate a pollution potential and hazard arising from human activity and municipal wastewater in the area, which could affect human health in the future. This research therefore recommends that the government and other responsible authorities take suitable remedial measures in the Kowsar dam watershed.
Abstract: Data stream analysis is the process of computing
various summaries and derived values from large amounts of data
which are continuously generated at a rapid rate. The nature of a
stream does not allow revisiting each data element. Furthermore,
data processing must be fast to produce timely analysis results. These
requirements impose constraints on the design of the algorithms to
balance correctness against timely responses. Several techniques
have been proposed over the past few years to address these
challenges. These techniques can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of the data or a smaller transformed representation, whereas the task-oriented scheme solves the problem directly via approximation techniques. We propose a hybrid approach to tackle the data stream analysis problem: the data stream is both statistically transformed to a smaller size and computationally approximated in its characteristics. We adopt a Monte Carlo method in the approximation step, and the data reduction is performed horizontally and vertically through our EMR sampling method. The proposed method
is analyzed by a series of experiments. We apply our algorithm on
clustering and classification tasks to evaluate the utility of our
approach.
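The abstract does not define EMR sampling, so the sketch below substitutes the standard one-pass technique, reservoir sampling, for the (horizontal) data-reduction step and a simple Monte Carlo mean estimate for the approximation step. It illustrates the hybrid reduce-then-approximate idea, not the authors' algorithm.

```python
import random

def reservoir_sample(stream, k, seed=42):
    """Keep a uniform random sample of size k from a single pass
    over a stream, without storing or revisiting the full data."""
    rng = random.Random(seed)
    sample = []
    for i, x in enumerate(stream):
        if len(sample) < k:
            sample.append(x)          # fill the reservoir first
        else:
            j = rng.randrange(i + 1)  # replace with decreasing probability
            if j < k:
                sample[j] = x
    return sample

# Monte Carlo estimate of the stream mean from the retained sample only
stream = range(100000)
sample = reservoir_sample(stream, 1000)
estimate = sum(sample) / len(sample)
```

Because the reservoir is uniform, summaries computed on it (here the mean, true value about 49999.5) are unbiased estimates of the stream's summaries, which is what lets the analysis keep up with the arrival rate.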
Abstract: Sorting has received the most attention among all computational tasks over the past years because sorted data is at the heart of many computations. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. Many parallel sorting algorithms have been investigated for a variety of parallel computer architectures. In this paper, three parallel sorting algorithms have been implemented and compared in terms of their overall execution time: odd-even transposition sort, parallel merge sort, and parallel rank sort. A cluster of workstations running Windows Compute Cluster has been used to compare the implemented algorithms. The C# programming language is used to develop the sorting algorithms, and the MPI (Message Passing Interface) library has been selected to establish communication and synchronization between processors. The time complexity of each parallel sorting algorithm is also presented and analyzed.
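Of the three algorithms, odd-even transposition sort maps most directly onto the compare-exchange structure of a parallel machine. The paper's implementations are in C# with MPI, so the sequential Python sketch below is purely illustrative; it shows the phase structure in which each phase's swaps touch disjoint neighbor pairs and could therefore all run concurrently.

```python
def odd_even_transposition_sort(a):
    """Sequential simulation of parallel odd-even transposition sort:
    n phases; even phases compare-exchange pairs (0,1),(2,3),...,
    odd phases pairs (1,2),(3,4),... All swaps in one phase are
    on disjoint pairs, so a parallel machine runs them simultaneously."""
    a = list(a)
    n = len(a)
    for phase in range(n):
        start = phase % 2
        for i in range(start, n - 1, 2):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

result = odd_even_transposition_sort([5, 2, 9, 1, 5, 6])
```

With one element per processor this gives n phases of O(1) parallel work, which is the O(n) parallel time usually quoted for the algorithm.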
Abstract: The main mission of Ezilla is to provide a friendly interface for accessing virtual machines and quickly deploying a high performance computing environment. Ezilla has been developed by the Pervasive Computing Team at the National Center for High-performance Computing (NCHC). It integrates Cloud middleware, virtualization technology, and a Web-based Operating System (WebOS) to form a virtual computer in a distributed computing environment. To scale up the dataset and speed up processing, we propose a sensor observation system that handles a huge amount of data in a Cassandra database. The sensor observation system builds on Ezilla to store raw sensor data in a distributed database. We adopt the Ezilla Cloud service to create virtual machines, log in to them, and deploy the sensor observation system. Integrating the sensor observation system with Ezilla makes it possible to quickly deploy an experimental environment and to access a huge amount of data in a distributed database whose replication mechanism protects data security.
Abstract: This paper presents a Faults Forecasting System (FFS) that utilizes statistical forecasting techniques to analyze process variable data in order to forecast fault occurrences. FFS proposes a new approach to fault detection: current techniques analyze the present state of the system variables to decide whether a fault exists, whereas FFS uses forecasting techniques to predict the timing of future faults before they happen. The proposed model applies a subset modeling strategy and a Bayesian approach in order to reduce the dimensionality of the process variables and improve fault forecasting accuracy. A practical experiment was designed and implemented at Okayama University, Japan, and the comparison shows that the proposed model achieves high forecasting accuracy ahead of fault occurrence.
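The subset-modeling and Bayesian details are not given in the abstract; as a minimal illustration of forecasting a fault before it happens, the sketch below fits a least-squares trend line to recent readings of a single process variable and predicts how many steps remain before a fault threshold is crossed. The function name and threshold are hypothetical, not FFS's.

```python
def forecast_fault_time(values, threshold):
    """Fit a least-squares line to the readings and predict how many
    steps ahead the trend crosses the fault threshold.
    Returns None when the trend is not rising toward the threshold."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    if slope <= 0:
        return None                       # no upward trend toward a fault
    t_cross = (threshold - intercept) / slope
    return max(t_cross - (n - 1), 0.0)    # steps beyond the last reading
```

For readings 1, 2, 3, 4, 5 and a threshold of 10, the fitted line y = x + 1 crosses 10 at x = 9, i.e. five steps after the last observation, which is the "before-time" warning a forecaster provides and a state-based detector cannot.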
Abstract: The experiment was conducted to evaluate the protein digestibility of canola meals (CMs) in caecectomised versus intact adult Rhode Island Red (RIR) cockerels, using the conventional addition method (CAM) over 7 d: a 4-d adaptation period and a 3-d experimental period, on the basis of a completely randomized design with 4 replicates. Results indicated that caecectomy decreased (P
Abstract: In this paper, two buildings were first modeled and then analyzed using the nonlinear static analysis method under two different conditions in the nonlinear SAP2000 software. In the first condition, the interaction of the soil adjacent to the basement walls is ignored, while in the second it is modeled using the Gap elements of nonlinear SAP2000. Finally, by comparing the results of the two models, the effects of soil-structure interaction on the period, target point displacement, internal forces, deformed shapes, and base shear were studied. According to the results, this interaction always increased the base shear of the buildings, decreased the period of the structure and the target point displacement, and often decreased the internal forces and displacements.
Abstract: The draw solute separation process in Forward
Osmosis desalination was simulated in Aspen Plus chemical process
modeling software, to estimate the energy consumption and compare
it with other desalination processes, mainly the Reverse Osmosis
process which is currently most prevalent. The electrolytic chemistry
for the system was retrieved using the ELECNRTL property method
in the Aspen Plus database. The electrical equivalent of the energy required
in the Forward Osmosis desalination technique was estimated and
compared with the prevalent desalination techniques.
Abstract: In this paper, the vibration behavior of a structure equipped with a tuned liquid column damper (TLCD) under harmonic earthquake loading is studied. Because of the inherent nonlinearity of the liquid damping, a great deal of computational effort would be required to search numerically for the optimum parameters of the TLCD. Therefore, by linearizing the equation of motion of the single-degree-of-freedom structure equipped with the TLCD, closed-form solutions of the TLCD-structure system are derived. To verify the analytical method, the results were compared with those of other researchers and show good agreement. Further, the effects of optimal design parameters such as the length ratio and mass ratio on the performance of the TLCD in controlling the responses of a structure are investigated under harmonic earthquake excitation. Finally, the Citicorp Center, a very flexible structure, is used as an example to illustrate the design procedure for the TLCD under earthquake excitation.
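The payoff of linearization is that the steady-state response under harmonic loading has a closed form. As a generic illustration (not the paper's TLCD-structure equations), for a linear SDOF oscillator m x'' + c x' + k x = F0 sin(ωt) the steady-state amplitude is F0 / sqrt((k - mω²)² + (cω)²), so the benefit of the damper's added equivalent damping can be read off directly instead of simulated. All numbers below are illustrative.

```python
import math

def steady_state_amplitude(m, c, k, F0, omega):
    """Closed-form steady-state amplitude of a linear SDOF oscillator
    m*x'' + c*x' + k*x = F0*sin(omega*t)."""
    return F0 / math.sqrt((k - m * omega ** 2) ** 2 + (c * omega) ** 2)

# Compare the response at resonance with and without extra
# (linearized, equivalent) damping supplied by a damper
m, k, F0 = 1.0, 100.0, 1.0
omega_n = math.sqrt(k / m)                 # natural frequency, 10 rad/s
bare = steady_state_amplitude(m, 0.5, k, F0, omega_n)
damped = steady_state_amplitude(m, 2.0, k, F0, omega_n)
```

At resonance the formula reduces to F0/(cω), so quadrupling the equivalent damping cuts the amplitude to a quarter, a parameter study that a closed form makes immediate.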
Abstract: Wavelet transform provides several important
characteristics which can be used in a texture analysis and
classification. In this work, an efficient texture classification method,
which combines concepts from wavelet and co-occurrence matrices,
is presented. A Euclidean distance classifier is used to evaluate the various classification methods. A comparative study is essential to determine the ideal method. Using this conjecture, we developed a novel feature set for texture classification and demonstrate its effectiveness.
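As a sketch of the co-occurrence half of such a feature set (the wavelet stage is omitted, and all parameters are illustrative rather than the paper's), the following computes contrast and energy features from a gray-level co-occurrence matrix and classifies a texture by Euclidean distance to labeled prototypes.

```python
def glcm_horizontal(img, levels):
    """Normalized gray-level co-occurrence matrix for
    horizontally adjacent pixel pairs."""
    m = [[0] * levels for _ in range(levels)]
    for row in img:
        for a, b in zip(row, row[1:]):
            m[a][b] += 1
    total = sum(sum(r) for r in m) or 1
    return [[v / total for v in r] for r in m]

def features(img, levels=4):
    """Two classic GLCM features: contrast and energy."""
    p = glcm_horizontal(img, levels)
    contrast = sum(p[i][j] * (i - j) ** 2
                   for i in range(levels) for j in range(levels))
    energy = sum(p[i][j] ** 2
                 for i in range(levels) for j in range(levels))
    return (contrast, energy)

def classify(sample, prototypes):
    """Nearest labeled prototype by Euclidean distance in feature space."""
    fs = features(sample)
    def dist(label):
        fp = prototypes[label]
        return sum((a - b) ** 2 for a, b in zip(fs, fp)) ** 0.5
    return min(prototypes, key=dist)

# Two toy texture prototypes: a flat patch and a high-contrast stripe patch
smooth = [[0, 0, 0, 0]] * 4
stripes = [[0, 3, 0, 3]] * 4
prototypes = {"smooth": features(smooth), "stripes": features(stripes)}
```

In the paper's method the feature vector would also include wavelet subband statistics, but the Euclidean decision rule is the same.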
Abstract: In this paper, we present a vertical nanowire thin film transistor with a gate-all-around architecture, fabricated using CMOS-compatible processes. A novel method of fabricating poly-silicon vertical nanowires with diameters as small as 30 nm using wet etching is presented. Both n-type and p-type vertical poly-silicon nanowire transistors exhibit electrical characteristics superior to those of planar devices. On a poly-crystalline nanowire of 30 nm diameter, a high Ion/Ioff ratio of 10^6, a low drain-induced barrier lowering (DIBL) of 50 mV/V, and a low sub-threshold slope of SS ≈ 100 mV/dec are demonstrated for a device with a channel length of 100 nm.
Abstract: This paper deals with the design, development, and implementation of a temperature sensor using ZigBee. The main aim of the work is to sense the temperature and display the result on an LCD using ZigBee technology. ZigBee operates in the industrial, scientific, and medical (ISM) radio bands: 868 MHz in Europe, 915 MHz in the USA, and 2.4 GHz in most jurisdictions worldwide. The technology is intended to be simpler and cheaper than other WPANs such as Bluetooth. The most capable ZigBee node type is said to require only about 10% of the software of a typical Bluetooth or wireless Internet node, and the simplest nodes about 2%; however, actual code sizes are much higher, closer to 50% of the Bluetooth code size, and ZigBee chip vendors have announced 128-kilobyte devices. In the system developed here, the sensed temperature is amplified and fed to a microcontroller, which is connected to a ZigBee module that transmits the data; at the other end, another ZigBee module receives the data and displays it on the LCD. The software developed is highly accurate and works at very high speed, demonstrating the effectiveness of the scheme employed.
Abstract: In wireless sensor networks (WSNs), the use of mobile sinks has been attracting increasing attention in recent times. Mobile sinks are an effective means of balancing load, reducing the hotspot problem, and prolonging network lifetime. The sensor nodes in a WSN have limited power supply, computational capability, and storage, so reliability becomes a high priority for continuous data delivery in these networks. In this paper, we propose a Reliable
Energy-efficient Data Dissemination (REDD) scheme for WSNs with
multiple mobile sinks. In this strategy, the sink first determines the location of the source and then communicates with it directly using geographical forwarding. Every forwarding node (FN) creates a local zone comprising some sensor nodes that can act as representatives of the FN when it fails. Analytical and simulation studies reveal significant improvements in energy conservation and reliable data delivery in comparison to existing schemes.
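The geographical forwarding step can be sketched as greedy forwarding: each hop hands the packet to the neighbor geographically closest to the sink. REDD's local-zone failover for failed forwarding nodes is not modeled here, and the node names, coordinates, and neighbor lists are invented for the example.

```python
import math

def greedy_forward(nodes, neighbors, source, sink):
    """Greedy geographic forwarding: each hop picks the neighbor
    closest to the sink; returns None on a routing void (no neighbor
    makes progress toward the sink)."""
    def d(a, b):
        (x1, y1), (x2, y2) = nodes[a], nodes[b]
        return math.hypot(x1 - x2, y1 - y2)
    path = [source]
    current = source
    while current != sink:
        best = min(neighbors[current], key=lambda n: d(n, sink))
        if d(best, sink) >= d(current, sink):
            return None          # void: no neighbor is closer to the sink
        path.append(best)
        current = best
    return path

# A four-node toy topology: S forwards via A to the sink T
nodes = {"S": (0, 0), "A": (1, 0), "B": (1, 1), "T": (2, 0)}
neighbors = {"S": ["A", "B"], "A": ["S", "B", "T"],
             "B": ["S", "A"], "T": ["A"]}
path = greedy_forward(nodes, neighbors, "S", "T")
```

Because each decision is purely local (a node needs only its neighbors' positions and the sink's), the scheme scales without routing tables, which is why geographic forwarding suits energy-constrained WSNs.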