Abstract: Building inspection is one of the key components of building maintenance. The primary purpose of performing a building inspection is to evaluate the building's condition. Without inspection, it is difficult to determine a built asset's current condition, so failure to inspect can contribute to the asset's future failure. Traditionally, a longhand survey description has been widely used for property condition reports. Surveys that employ ratings instead of descriptions are gaining wide acceptance in the industry because they cater to the need for numerical analysis output. These kinds of surveys are also in keeping with the new RICS HomeBuyer Report 2009. In this paper, we propose a new assessment method, derived from current rating systems, for assessing the condition of smart school buildings specifically and rating the seriousness of each defect identified. These two assessment criteria are then multiplied to find the building's score, which we call the Condition Survey Protocol (CSP) 1 Matrix. Instead of a longhand description of a building's defects, this matrix requires concise explanations of the defects identified, thus saving on-site time during a smart school building inspection. The full score is used to give the building an overall rating: Good, Fair or Dilapidated.
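The multiply-then-band scheme described above can be sketched as follows. The rating scales and the band thresholds used here are illustrative assumptions, not the paper's actual CSP1 Matrix values:

```python
# Illustrative sketch of the CSP1 Matrix scoring idea: each defect's
# condition rating is multiplied by its seriousness rating, and the
# summed score is mapped to an overall band. Scales and thresholds
# below are assumptions for illustration only.

def defect_score(condition: int, seriousness: int) -> int:
    """Multiply the two assessment criteria to obtain a defect's score."""
    return condition * seriousness

def overall_rating(scores, good_max=30, fair_max=70):
    """Map the summed building score onto an overall rating band."""
    total = sum(scores)
    if total <= good_max:
        return "Good"
    if total <= fair_max:
        return "Fair"
    return "Dilapidated"

# Three hypothetical defects: (condition, seriousness) pairs
scores = [defect_score(c, s) for c, s in [(1, 2), (3, 4), (2, 5)]]
print(overall_rating(scores))
```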
Abstract: The purpose of this paper is to provide a practical
example of the Linear Quadratic Gaussian (LQG) controller. This
method includes a description and some discussion of the discrete
Kalman state estimator. One aspect of the estimator's optimality is that the
estimator incorporates all information that can be provided to it. It
processes all available measurements, regardless of their precision, to
estimate the current value of the variables of interest, using
knowledge of the system and measurement-device dynamics, the
statistical description of the system noises, measurement errors, and
uncertainty in the dynamics models.
Since the time of its introduction, the Kalman filter has been the
subject of extensive research and application, particularly in the area
of autonomous or assisted navigation. For example, to determine the
velocity of an aircraft or sideslip angle, one could use a Doppler
radar, the velocity indications of an inertial navigation system, or the
relative wind information in the air data system. Rather than ignore
any of these outputs, a Kalman filter could be built to combine all of
this data and knowledge of the various systems' dynamics to
generate an overall best estimate of velocity and sideslip angle.
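The measurement-combining behaviour described above can be illustrated with a minimal scalar Kalman filter that estimates a constant velocity from noisy sensor readings. The noise variances and the constant-state model are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Minimal sketch of a discrete (scalar) Kalman filter: a constant
# velocity is estimated from a stream of noisy measurements. The
# variances (Q, R) and the true value are illustrative assumptions.

def kalman_update(x, P, z, R, Q=1e-5):
    P = P + Q              # predict: constant state plus process noise Q
    K = P / (P + R)        # Kalman gain weights estimate vs measurement
    x = x + K * (z - x)    # correct the estimate with the innovation
    P = (1 - K) * P        # update the estimate uncertainty
    return x, P

rng = np.random.default_rng(0)
true_v = 50.0
x, P = 0.0, 100.0          # diffuse initial guess and uncertainty
for z in true_v + rng.normal(0, 2.0, size=200):  # noisy sensor readings
    x, P = kalman_update(x, P, z, R=4.0)
print(round(x, 1))         # final estimate of the constant velocity
```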
Abstract: In this paper, an approach to reduce the computation steps required by fast neural networks for the searching process is presented. The principle of the divide-and-conquer strategy is applied through image decomposition. Each image is divided into small sub-images, and then each one is tested separately using a fast neural network. The operation of fast neural networks is based on applying cross correlation in the frequency domain between the input image and the weights of the hidden neurons. Compared to conventional and fast neural networks, experimental results show that a speed-up ratio is achieved when applying this technique to locate human faces automatically in cluttered scenes. Furthermore, faster face detection is obtained by using parallel processing techniques to test the resulting sub-images at the same time using the same number of fast neural networks. In contrast to using only fast neural networks, the speed-up ratio increases with the size of the input image when using fast neural networks and image decomposition.
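The frequency-domain cross correlation underlying the fast neural network can be sketched as follows: a neuron's weight window is correlated over an image via the FFT instead of sliding-window testing. The image and window sizes are illustrative assumptions:

```python
import numpy as np

# Sketch of frequency-domain cross correlation: correlating a weight
# window over an image via the FFT (correlation = product with the
# conjugated spectrum). Sizes and contents are illustrative.

def fft_cross_correlation(image, weights):
    """Cross-correlate `weights` over `image` using the FFT."""
    F = np.fft.fft2(image)
    W = np.fft.fft2(weights, s=image.shape)  # zero-pad to the image size
    return np.real(np.fft.ifft2(F * np.conj(W)))

image = np.zeros((64, 64))
weights = np.random.default_rng(1).random((8, 8))
image[20:28, 30:38] = weights                # embed the pattern itself
corr = fft_cross_correlation(image, weights)
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(peak)  # correlation peaks at the embedded pattern's corner
```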
Abstract: A plasma plume is produced and arrives at the spacecraft when an electric thruster operates in orbit. It is important to characterize the thruster plasma parameters because the plume has significant effects on, or poses hazards to, spacecraft sub-systems and parts. From ground-test data of the desired parameters, the major characteristics of the thruster plume can be obtained, which is also very important for optimizing the design of an ion thruster. The Retarding Potential Analyzer (RPA) is an effective instrument for measuring the plasma ion energy-per-unit-charge distribution. A special RPA should be designed according to the expected range and features of the plume plasma parameters. In this paper, the major principles of good RPA design are discussed carefully. Conforming to these principles, a four-grid planar electrostatic energy analyzer RPA was designed to avoid false data, and its details, including construction, materials and aperture diameter, are discussed. At the same time, it was designed to be suitable for credible, long-duration measurements in the laboratory. Finally, laboratory RPA measurement results are given and discussed.
Abstract: Moral decisions are considered an intuitive process,
while conscious reasoning is mostly used only to justify those
intuitions. This problem is described in several dual-process theories
of mind being developed, e.g., by Frederick and Kahneman, Stanovich,
and Evans. Those theories have recently evolved into tri-process
theories, with a proposed process that makes the ultimate decision
or allows paraformal processing with focal bias.
The presented experiment compares the decision patterns to the
implications of those models.
In the presented study, participants (n = 179) considered different
aspects of the trolley dilemma or its footbridge version and then
made their decision.
Results show that in the control group 70% of people decided to
use the lever to change tracks for the running trolley, and 20% chose
to push the fat man down the tracks. In contrast, after the
experimental manipulation almost no one decided to act. The
decision-time difference between the dilemmas also disappeared
after the manipulation.
The result supports the idea of three co-working processes:
intuitive (TASS), paraformal (reflective mind) and algorithmic
process.
Abstract: The EEG signal is one of the oldest measures of brain
activity and has been used widely for clinical diagnosis and
biomedical research. However, EEG signals are highly
contaminated with various artifacts, both from the subject and from
equipment interferences. Among these various kinds of artifacts,
ocular noise is the most important one. Since many applications such
as BCI require online and real-time processing of EEG signal, it is
ideal if the removal of artifacts is performed in an online fashion.
Recently, some methods for online ocular artifact removal have
been proposed. One of these methods is ARMAX modeling of the EEG
signal. This method assumes that the recorded EEG signal is a
combination of EOG artifacts and the background EEG. Then the
background EEG is estimated via estimation of ARMAX parameters.
The other recently proposed method is based on adaptive filtering.
This method uses EOG signal as the reference input and subtracts
EOG artifacts from recorded EEG signals. In this paper we
investigate the efficiency of each method at removing EOG
artifacts and make a comparison between the two. Our
conclusion from this comparison is that the adaptive filtering
method gives better results than those achieved by ARMAX
modeling.
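The adaptive-filtering method can be sketched with a basic LMS filter: the EOG channel is the reference input, and its filtered version is subtracted from the recorded EEG. The synthetic signals, filter order and step size below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Minimal LMS adaptive-noise-cancellation sketch: the EOG reference is
# filtered and subtracted from the contaminated EEG, leaving an
# estimate of the background EEG. All signals are synthetic.

def lms_artifact_removal(eeg, eog, order=4, mu=0.01):
    w = np.zeros(order)
    clean = np.zeros_like(eeg)
    for n in range(order - 1, len(eeg)):
        x = eog[n - order + 1:n + 1][::-1]  # current + past EOG samples
        e = eeg[n] - w @ x                  # error = estimated background EEG
        w += 2 * mu * e * x                 # LMS weight update
        clean[n] = e
    return clean

rng = np.random.default_rng(2)
background = 0.1 * rng.standard_normal(2000)  # "true" EEG (synthetic)
eog = rng.standard_normal(2000)               # ocular reference (synthetic)
recorded = background + 0.8 * eog             # contaminated EEG
clean = lms_artifact_removal(recorded, eog)
```

After the filter converges, the residual `clean` tracks the background EEG rather than the contaminated recording.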
Abstract: Many matching algorithms with different characteristics have been introduced in recent years. For real-time systems these algorithms are usually based on minutiae features. In this paper we introduce a novel approach to feature extraction in which the extracted features are independent of shift and rotation of the fingerprint, while the matching operation is performed much more easily and with higher speed and accuracy. In this new approach, a reference point and a reference orientation are first determined for each fingerprint, and then, based on this information, the features are converted into polar coordinates. Due to the high speed and accuracy of this approach, the small volume of extracted features, and the easy execution of the matching operation, this approach is highly appropriate for real-time applications.
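The polar conversion step can be sketched as follows: each minutia (x, y, theta) is re-expressed relative to a reference point and reference orientation, which makes the resulting features shift- and rotation-invariant. How the reference itself is detected is not shown; the minutiae below are illustrative values:

```python
import math

# Sketch of the polar-coordinate feature conversion: minutiae become
# (radius, angle, relative direction) tuples measured from a reference
# point and reference orientation. Reference detection is not shown.

def to_polar(minutiae, ref_point, ref_angle):
    rx, ry = ref_point
    features = []
    for x, y, theta in minutiae:
        dx, dy = x - rx, y - ry
        r = math.hypot(dx, dy)                       # radial distance
        phi = (math.atan2(dy, dx) - ref_angle) % (2 * math.pi)
        local = (theta - ref_angle) % (2 * math.pi)  # relative direction
        features.append((r, phi, local))
    return features

orig = [(10.0, 5.0, 0.3), (-4.0, 8.0, 1.2)]
f1 = to_polar(orig, (0.0, 0.0), 0.0)

# Shift and rotate the whole print; the reference moves with it,
# so the polar features stay the same.
a, tx, ty = 0.7, 3.0, -2.0
moved = [(x * math.cos(a) - y * math.sin(a) + tx,
          x * math.sin(a) + y * math.cos(a) + ty,
          (t + a) % (2 * math.pi)) for x, y, t in orig]
f2 = to_polar(moved, (tx, ty), a)
```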
Abstract: For effective collaboration, asynchronous tools, and particularly discussion forums, are the most used thanks to their flexibility in terms of time. To convey only the messages that belong to a theme of interest to the tutor, and so to help him during his tutoring work, a tool for classifying these messages is indispensable. For this purpose we have proposed a semantic classification tool for discussion-forum messages based on LSA (Latent Semantic Analysis), which includes a thesaurus to organize the vocabulary. The benefits offered by a formal ontology can overcome the insufficiencies that a thesaurus generates during its use, which encourages us to use one in our semantic classifier. In this work we propose the use of some of the constructs that an OWL ontology offers. We then explain how constructs such as "ObjectProperty", "SubClassOf" and "Datatype" properties make our classification more intelligent by integrating new terms. The new terms found are generated based on the first terms introduced by the tutor and the semantic relations described by the OWL formalism.
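The LSA core of such a classifier can be sketched as follows: a term-by-message matrix is reduced by truncated SVD, and messages are compared by cosine similarity in the latent space. The tiny matrix and rank are illustrative assumptions; the thesaurus and OWL layers of the proposed tool are not shown:

```python
import numpy as np

# Sketch of the LSA step only: truncated SVD of a term-document matrix
# and cosine similarity between messages in the latent space. The data
# and rank k are illustrative; thesaurus/ontology layers are omitted.

def lsa_project(term_doc, k=2):
    """Return each message's coordinates in the k-dimensional LSA space."""
    U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
    return (np.diag(s[:k]) @ Vt[:k]).T

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

term_doc = np.array([[2., 0., 1.],    # rows: terms, columns: messages
                     [1., 0., 1.],
                     [0., 3., 0.],
                     [0., 2., 1.]])
docs = lsa_project(term_doc)
sim_related = cosine(docs[0], docs[2])    # messages sharing vocabulary
sim_unrelated = cosine(docs[0], docs[1])  # messages with none in common
```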
Abstract: This paper evaluates the performance of a novel
algorithm for tracking a mobile node, in terms of execution time
and root mean square error (RMSE). The particle filter algorithm is
used to track the mobile node; in addition, a new technique in the
particle filter algorithm is proposed to reduce the execution time.
The stationary points were calculated through trilateration and then
by averaging the points collected over a specific time, whereas
tracking is done through trilateration as well as the particle filter
algorithm. The Wi-Fi signal is used to get an initial guess of the
position of the mobile node in the x-y coordinate system. The
commercially available software "Wireless Mon" was used to read
the Wi-Fi signal strength
from the Wi-Fi card. Visual C++ version 6 was used to interact with
this software and read only the required data from the log file
generated by the "Wireless Mon" software. Results are evaluated through
mathematical modeling and MATLAB simulation.
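The trilateration step can be sketched by solving for (x, y) from distances to three known anchors, linearizing the circle equations by subtraction. The anchor positions and distances below are illustrative, not from the paper's setup:

```python
import numpy as np

# Sketch of trilateration: position from distances to three anchors.
# Subtracting the first circle equation from the other two yields a
# linear 2x2 system. Anchors and distances are illustrative values.

def trilaterate(anchors, dists):
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                  [2 * (x3 - x1), 2 * (y3 - y1)]])
    b = np.array([d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                  d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2])
    return np.linalg.solve(A, b)

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([3.0, 4.0])
dists = [np.hypot(*(true_pos - a)) for a in anchors]
pos = trilaterate(anchors, dists)
print(pos)  # recovers the true position [3. 4.]
```

In practice the measured distances are noisy, which is why the abstract averages repeated points and applies a particle filter on top of this step.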
Abstract: Although humans have long known about the
importance of the environment in life, it was in the last decade of
the 20th century that environmental challenges became the subject
of heated scientific, academic and political debate, so much so that
these problems not only disturb the peace and security of life but
have also threatened human existence. One problem of recent years
that is significant for the authorities is the unsatisfactory results
achieved despite the huge costs spent on grand environmental
projects. This leads thinkers to the thought that solving
environmental problems requires new methods, including
sociological, ethical and philosophical ones, apart from technical
measures. Environmental ethics is a new branch of philosophical
ethics that discusses the ethical relationship between humans and
the universe around them. In view of the above, the necessity of
environmental ethics for environmental management in today's
world is redoubled. In what follows, the article focuses on the role
of environmental ethics and on environmental management
methods and techniques for developing it.
Abstract: The quality improvements of the environmental
elements could increase the recreational opportunities in a certain
area (destination). The need-for-recreation technique focuses on
choosing certain destinations for recreational purposes. The basic
exchange taken into consideration is the one between the satisfaction
gained after staying in that area and the value expressed in money
and time allocated. The number of tourists in the respective area, the
duration of staying and the money spent including transportation
provide information on how individuals rank the place or certain
aspects of the area (such as the quality of the environmental
elements).
For the statistical analysis of the environmental benefits offered by
an area through the need of recreation technique, the following stages
are suggested:
- characterization of the reference area based on the
statistical variables considered;
- estimation of the environmental benefit through
comparing the reference area with other similar areas
(having the same environmental characteristics), from
the perspective of the statistical variables considered.
The comparison model used in the recreation technique faces a
series of difficulties concerning the reference area and the correct
conversion of time into money.
Abstract: Understanding knee movement during the swing is
important for improving the golf swing and preventing injury. Thirty
male professional and amateur golfers each performed three
successive swings. Data from a video-based motion capture system
were used to compute knee joint movement variables. The results
showed that professional and amateur golfers differed significantly
in left knee flexion angle at the impact point and the mid
follow-through phase. The difference in left knee external rotation
between the two groups was also significant. The right knee showed
no significant differences in any variable. However, the patterns of
knee joint movement were broadly similar between professional and
amateur golfers.
Abstract: Alpinia galanga, a rhizomatous plant generally known
as greater galangal, was selected for the isolation of new
constituents accountable for its various therapeutic activities. The
present study was intended to isolate a glycoside from Alpinia
galanga rhizomes. The methanolic extract of Alpinia galanga was
subjected to column chromatography and eluted with ethyl
acetate-methanol (99:1) to isolate the compound β-Sitosterol
Diarabinoside. Herein, the isolation and structural elucidation of the
new compound are described. Chemical investigation of the
methanolic extract of the rhizomes of Alpinia galanga furnished a
new compound, β-Sitosterol Diarabinoside. IR, NMR and mass
spectral investigations of the isolated compound confirmed its
structure as β-Sitosterol Diarabinoside, which is isolated for the
first time from a medicinal plant or any synthetic source.
Abstract: Shirvan is located on a plain in North Khorasan Province, northeastern Iran, and has a semiarid to temperate climate. To investigate the annual changes in some qualitative parameters, such as electrical conductivity, total dissolved solids and chloride concentration, over ten consecutive years, fourteen groundwater sources, including deep as well as semi-deep wells, were sampled and analyzed using standard methods. The trends in the obtained data over these years were analyzed, and the effects of different factors on the changes in electrical conductivity, chloride concentration and total dissolved solids were clarified. The results showed that the values of some qualitative parameters increased over the 10-year period, leading to a decrease in water quality. The results also showed that the increase in urban population as well as extensive industrialization in the studied area are the most important factors influencing the underground water quality. Furthermore, a decrease in water quantity is also evident, due to greater water utilization and the occurrence of droughts in the region in recent years.
Abstract: Hazard rate estimation is one of the important topics
in forecasting earthquake occurrence. Forecasting earthquake
occurrence is a part of statistical seismology, where the main
subject is the point process. Generally, the earthquake hazard rate is
estimated based on the point-process likelihood equation, called the
Hazard Rate Likelihood of Point Process (HRLPP). In this research,
we have developed an estimation method called hazard rate single
decrement (HRSD), adapted from estimation methods used in
actuarial studies. Here, each individual is associated with an
earthquake whose inter-event time is exponentially distributed. The
epicenter and the time of earthquake occurrence are used to
estimate the hazard rate. Finally, a case study of earthquake hazard
rate is given, and we compare the hazard rates obtained with the
HRLPP and HRSD methods.
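Under the exponential inter-event-time assumption stated above, the hazard rate is constant, and its maximum-likelihood estimate is the number of events divided by the total observed exposure time. The catalogue below is synthetic, purely for illustration:

```python
import numpy as np

# Sketch of constant-hazard estimation under the exponential
# assumption: lambda_hat = (number of events) / (total waiting time).
# The "catalogue" here is synthetic data, not real seismicity.

def exponential_hazard_rate(inter_event_times):
    """MLE of a constant hazard rate: events per unit exposure."""
    times = np.asarray(inter_event_times, dtype=float)
    return len(times) / times.sum()

rng = np.random.default_rng(3)
true_rate = 0.2                                  # events per year (assumed)
waits = rng.exponential(1 / true_rate, size=500) # synthetic inter-event times
rate = exponential_hazard_rate(waits)
```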
Abstract: This contribution aims to outline some issues surrounding the introduction of the compulsory electronic exchange of documents (so-called e-Boxes) in public administration. The research was conducted in order to gauge the difference between the expectations of those using internal e-mail and their experience in reality. Both qualitative and quantitative research are employed, leading also to an estimate of the willingness and readiness of government bodies, business units and citizens to adopt new technologies. At the same time, the most potent barriers to successful e-communication through the e-Boxes are identified.
Abstract: The network for delivering commodities has been an important design problem in our daily lives and many transportation applications. The delivery performance is evaluated based on the system reliability of delivering commodities from a source node to a sink node in the network. The system reliability is thus maximized to find the optimal routing. However, the design problem is not simple because (1) each path segment has randomly distributed attributes; (2) there are multiple commodities that consume various path capacities; (3) the optimal routing must successfully complete the delivery process within the allowable time constraints. In this paper, we focus on the design optimization of the Multi-State Flow Network (MSFN) for multiple commodities. We propose an efficient approach to evaluate the system reliability in the MSFN with respect to randomly distributed path attributes and to find the optimal routing subject to the allowable time constraints. The delivery rates, also known as delivery currents, of the path segments are evaluated, and the minimal-current arcs are eliminated to reduce the complexity of the MSFN. Accordingly, the correct optimal routing is found and the worst-case reliability is evaluated. It has been shown that the reliability of the optimal routing is at least as high as the worst-case measure. Two benchmark examples are utilized to demonstrate the proposed method. The comparisons between the original and the reduced networks show that the proposed method is very efficient.
Abstract: This paper studies the pth moment exponential synchronization of a class of stochastic neural networks with mixed delays. Based on Lyapunov stability theory, by establishing a new integrodifferential inequality with mixed delays, several sufficient conditions have been derived to ensure the pth moment exponential stability for the error system. The criteria extend and improve some earlier results. One numerical example is presented to illustrate the validity of the main results.
Abstract: The knee bracing steel frame (KBF) is a new kind of energy-dissipating frame, which combines excellent ductility and lateral stiffness. In this framing system, a special form of diagonal brace connected to a knee element, instead of to the beam-column joint, is investigated. Recently, a similar system was proposed and named the chevron knee bracing system (CKB), which in comparison with the former system has better energy-absorption characteristics while retaining the elastic nature of the structure. Knee bracing can provide a stiffer bracing system but reduces the ductility of the steel frame. Chevron knee bracing can be employed to provide the desired ductility level for a design. In this article, the relationship between seismic performance and structural parameters of the two above-mentioned systems is investigated and compared. Frames with similar dimensions but various heights in both systems are designed according to the Iranian code of practice for seismic-resistant design of buildings, and then, based on a nonlinear static pushover analysis, seismic parameters such as the behavior factor and performance levels are compared.
Abstract: Locality Sensitive Hashing (LSH) is one of the most
promising techniques for solving nearest neighbour search problem in
high dimensional space. Euclidean LSH is the most popular variation
of LSH that has been successfully applied in many multimedia
applications. However, the Euclidean LSH presents limitations that
affect its structural and query performance. The main limitation of the
Euclidean LSH is the large memory consumption. In order to achieve
a good accuracy, a large number of hash tables is required. In this
paper, we propose a new hashing algorithm to overcome the storage
space problem and improve query time, while keeping a good
accuracy similar to that achieved by the original Euclidean LSH.
Experimental results on a real large-scale dataset show that the
proposed approach achieves good performance and consumes less
memory than the Euclidean LSH.
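The baseline the paper builds on, the original Euclidean (p-stable) LSH scheme, hashes a vector as h(v) = floor((a·v + b) / w) with Gaussian a and uniform b. The sketch below shows that baseline only, not the proposed algorithm; dimension, bucket width w and hash count are illustrative assumptions:

```python
import numpy as np

# Minimal sketch of the original Euclidean (p-stable) LSH baseline:
# h(v) = floor((a.v + b) / w). Parameters are illustrative; this is
# the scheme the paper improves on, not the proposed algorithm.

class EuclideanLSH:
    def __init__(self, dim, w=4.0, n_hashes=8, seed=0):
        rng = np.random.default_rng(seed)
        self.a = rng.standard_normal((n_hashes, dim))  # p-stable projections
        self.b = rng.uniform(0, w, n_hashes)           # random offsets
        self.w = w                                     # bucket width

    def hash(self, v):
        """Return the tuple of bucket indices used as a table key."""
        return tuple(np.floor((self.a @ v + self.b) / self.w).astype(int))

lsh = EuclideanLSH(dim=16)
rng = np.random.default_rng(1)
v = rng.standard_normal(16)
key = lsh.hash(v)
print(len(key))  # one bucket index per hash function
```

Nearby vectors collide on such keys with high probability, and the accuracy/memory trade-off comes from how many of these tables are maintained.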