Abstract: Shear walls are used in most tall buildings to carry lateral loads. When openings for doors or windows must be provided in a shear wall, a special type of shear wall called a "coupled shear wall" is used, which in some cases is stiffened by specific beams and is then called a "stiffened coupled shear wall".
In this paper, a mathematical method for the geometrically nonlinear analysis of stiffened coupled shear walls is presented. A suitable formulation for determining the critical load of stiffened coupled shear walls under gravity loading is then proposed. The governing differential equations for the equilibrium and deformation of stiffened coupled shear walls are obtained by setting up the equilibrium equations and the moment-curvature relationships for each wall. Because of the complexity of these differential equations, the energy method is adopted for their approximate solution.
Abstract: Rainfall data at fine resolution, and knowledge of its characteristics, play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control, and water quality control systems. This paper studies the statistical distribution of hourly rainfall depth at 12 representative stations spread across Peninsular Malaysia. Hourly rainfall data spanning 10 to 22 years were collected and their statistical characteristics estimated. Three probability distributions, namely the Generalized Pareto, Exponential and Gamma distributions, were proposed to model the hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling and Chi-Squared tests, were used to evaluate their fit. The results indicate that the east coast of the Peninsula receives greater rainfall depths than the west coast; the rainfall frequency, however, is found to be irregular. The goodness-of-fit tests also show that all three models fit the rainfall data at the 1% level of significance. However, the Generalized Pareto distribution fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.
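The fitting-and-testing procedure described above can be sketched in a few lines with SciPy. The rainfall values below are synthetic (drawn from an exponential distribution) purely to keep the example self-contained; the paper works with observed hourly depths, and only the Kolmogorov-Smirnov test of the three goodness-of-fit tests is shown here.

```python
# Sketch: fit the three candidate distributions and compare KS statistics.
# Synthetic hourly rainfall depths stand in for the observed data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
rain = rng.exponential(scale=3.0, size=500)  # hypothetical depths (mm)

candidates = {
    "genpareto": stats.genpareto,
    "expon": stats.expon,
    "gamma": stats.gamma,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(rain)                        # maximum-likelihood fit
    d, p = stats.kstest(rain, name, args=params)   # Kolmogorov-Smirnov test
    results[name] = (d, p)

# The best-fitting candidate is the one with the smallest KS statistic.
best = min(results, key=lambda k: results[k][0])
```

The same loop extends naturally to the Anderson-Darling and Chi-Squared tests used in the paper.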
Abstract: This paper simulates the gradients of mathematical functions and scalar fields using MATLAB. Scalar fields, their gradients, contours and meshes/surfaces are rendered using the relevant MATLAB tools and commands for convenient presentation and understanding. Several mathematical functions and scalar fields are examined by taking their gradients and visualizing the results in 3D with different colour shadings. The resulting plots support analysis and understanding of the gradient better than a purely theoretical study.
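The core computation described above, sampling a scalar field on a grid and taking its numerical gradient, can be illustrated with NumPy, whose numpy.gradient plays the same role as MATLAB's gradient command. The field f(x, y) = x^2 + y^2 is an arbitrary example, not one from the paper.

```python
# Sample a scalar field on a grid and compute its gradient numerically.
import numpy as np

x = np.linspace(-2.0, 2.0, 81)
y = np.linspace(-2.0, 2.0, 81)
X, Y = np.meshgrid(x, y)
F = X**2 + Y**2                  # scalar field f(x, y) = x^2 + y^2

# numpy.gradient returns one derivative array per axis: with meshgrid's
# default layout, axis 0 varies with y and axis 1 with x.
dF_dy, dF_dx = np.gradient(F, y, x)

# Analytically grad f = (2x, 2y); pick an interior grid point to compare.
i, j = 60, 20                    # corresponds to y = 1.0, x = -1.0
```

In MATLAB the equivalent step would be `[dF_dx, dF_dy] = gradient(F, x, y)`, typically followed by `quiver`, `contour` or `surf` for the visualizations the paper discusses.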
Abstract: Petri nets are among the most useful graphical tools for modelling complex asynchronous systems, and we have used them to model a multi-track railway level crossing system. The roadway has been augmented with four half-size barriers. For better control, a three-stage control mechanism has been introduced to ensure that no road vehicle is trapped on the level crossing. A timed Petri net is used to capture the temporal nature of the signalling system. A safeness analysis is also included in the discussion section.
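The firing rule that underlies any such Petri-net model can be sketched briefly: a transition is enabled when every input place holds a token, and firing it moves tokens from input to output places. The place and transition names below are illustrative, not taken from the paper's level-crossing model.

```python
# Minimal marked Petri net: transitions are (input_places, output_places).
def enabled(marking, transition):
    """A transition is enabled if every input place holds a token."""
    inputs, _ = transition
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, transition):
    """Fire an enabled transition: consume input tokens, produce outputs."""
    inputs, outputs = transition
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

# Hypothetical fragment: an approaching train lowers the barriers.
t_lower = (["approaching", "barriers_up"], ["on_crossing", "barriers_down"])
m0 = {"approaching": 1, "barriers_up": 1}
m1 = fire(m0, t_lower)
```

A timed Petri net, as used in the paper, additionally attaches firing delays to transitions; safeness then amounts to checking that no reachable marking puts more than one token in any place.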
Abstract: Database management systems that integrate user preferences promise better solutions for personalization, greater flexibility and higher-quality query responses. This paper presents preliminary work investigating approaches to expressing user preferences in queries. We sketch an extension of the capabilities of the SQLf language, which uses fuzzy set theory to define user preferences. Two essential points are considered: the first concerns expressing user preferences in SQLf through so-called fuzzy commensurable predicate sets; the second concerns the bipolar way in which these preferences are expressed, as mandatory and/or optional preferences.
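The fuzzy-set idea behind SQLf-style predicates can be sketched simply: instead of the crisp true/false of ordinary SQL conditions, a fuzzy predicate grades how well a tuple satisfies a condition in [0, 1]. The trapezoidal membership function and the "cheap" predicate below are generic illustrations, not definitions from the paper.

```python
# A fuzzy predicate as a trapezoidal membership function in [0, 1].
def trapezoid(x, a, b, c, d):
    """Membership: 0 outside (a, d), 1 on [b, c], linear in between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical predicate "price is cheap": fully satisfied up to 100,
# degrading linearly to unsatisfied at 200.
def cheap(price):
    return trapezoid(price, -1, 0, 100, 200)
```

In an SQLf-like query, such degrees would rank the answers, with mandatory and optional (bipolar) preferences combined into an overall satisfaction score per tuple.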
Abstract: It has become crucial over the years for nations to improve their credit scoring methods and techniques in light of the increasing volatility of the global economy. Statistical methods and tools have been the favoured means for this; however, artificial intelligence and soft computing techniques are becoming increasingly preferred due to their proficiency, precision and relative simplicity. This work presents a comparison between Support Vector Machines and Artificial Neural Networks, two popular soft computing models, when applied to credit scoring. Among the different criteria that can be used for comparison, accuracy, computational complexity and processing time were selected to evaluate both models. Furthermore, the German credit scoring dataset, a real-world dataset, is used to train and test both models. The experimental results of our study suggest that although both soft computing models can be used with a high degree of accuracy, Artificial Neural Networks deliver better results than Support Vector Machines.
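The accuracy side of such a comparison can be sketched with scikit-learn: train an SVM and a small neural network on the same binary data and score both on a held-out split. Synthetic data stands in here to keep the sketch self-contained; the paper uses the German credit dataset, and its specific kernel and network settings are not reproduced.

```python
# Sketch: compare SVM and ANN test accuracy on the same synthetic split.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Hypothetical binary "credit-style" data (the paper uses German credit).
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

svm = SVC(kernel="rbf").fit(X_tr, y_tr)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)

acc_svm = svm.score(X_te, y_te)   # fraction of correct test predictions
acc_ann = ann.score(X_te, y_te)
```

Computational complexity and processing time, the paper's other two criteria, would be measured around the `fit` and `predict` calls.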
Abstract: LSP routing is among the prominent issues in MPLS network traffic engineering. The objective of this routing is to increase the number of accepted requests while guaranteeing quality of service (QoS). Requested bandwidth is the most important QoS criterion considered in the literature, and various heuristic algorithms have been presented in this regard. Many of these algorithms keep flows away from network bottlenecks in order to perform load balancing, which impedes optimal operation of the network. Here, a new routing algorithm, MIRAD, is proposed: using only limited information about the network topology and the links' residual bandwidth, and without knowledge of prospective requests, it provides every request with maximum bandwidth as well as minimum end-to-end delay via uniform load distribution across the network. Simulation results show that the proposed algorithm is more efficient than similar algorithms.
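One building block of bandwidth-aware LSP routing of the kind discussed above is widest-path selection: among the candidate paths, prefer the one whose bottleneck residual bandwidth is largest. The sketch below shows only that building block on a toy topology; MIRAD itself additionally accounts for end-to-end delay and load distribution.

```python
# Widest-path (maximum bottleneck bandwidth) search, Dijkstra-style.
import heapq

def widest_path(graph, src, dst):
    """graph: {u: {v: residual_bandwidth}}. Returns (bottleneck, path)."""
    # Max-heap on bottleneck bandwidth (negated for heapq's min-heap).
    heap = [(-float("inf"), src, [src])]
    best = {}
    while heap:
        neg_bw, node, path = heapq.heappop(heap)
        bw = -neg_bw
        if node == dst:
            return bw, path
        if best.get(node, -1.0) >= bw:
            continue                      # already reached with wider path
        best[node] = bw
        for nxt, link_bw in graph.get(node, {}).items():
            heapq.heappush(heap, (-min(bw, link_bw), nxt, path + [nxt]))
    return 0, []

# Toy topology with residual bandwidths on each directed link.
net = {"A": {"B": 10, "C": 4}, "B": {"D": 6}, "C": {"D": 9}, "D": {}}
bw, path = widest_path(net, "A", "D")
```

Here the path A-B-D wins with bottleneck 6 over A-C-D with bottleneck 4; a real LSP router would combine this metric with delay and admission control.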
Abstract: For many high-speed networks, resilience against failures is an essential requirement. A key consideration in designing next-generation optical networks is protecting and restoring high-capacity WDM networks from failures. Quick detection, identification and restoration make networks more robust and reliable even though failures cannot be avoided. Hence, it is necessary to develop fast, efficient and dependable fault localization and detection mechanisms. In this paper we propose a new fault localization algorithm for WDM networks which can identify the location of a failure on a failed lightpath. Our algorithm detects the failed connection and then attempts to reroute the data stream through an alternate path. In addition, we develop an algorithm to analyze the alarms generated by the components of an optical network in the presence of a fault. It uses alarm correlation to reduce the list of suspected components shown to the network operators. Our simulation results show that the proposed algorithms achieve lower blocking probability and delay while attaining higher throughput.
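The alarm-correlation idea can be sketched in its simplest form: if each alarm implicates the set of components that could have caused it, intersecting those sets across alarms shrinks the suspect list shown to the operator. The component names below are invented for illustration and the paper's actual correlation rules are certainly richer than a plain intersection.

```python
# Sketch: intersect the component sets implicated by concurrent alarms.
def correlate(alarms):
    """alarms: list of sets of suspected components; returns the overlap."""
    suspects = set(alarms[0])
    for implicated in alarms[1:]:
        suspects &= set(implicated)
    return suspects

# Hypothetical: two monitors downstream of a fault implicate overlapping
# spans of the lightpath; only the shared span remains suspect.
alarm_1 = {"tx1", "fiber_AB", "amp_B", "fiber_BC"}
alarm_2 = {"fiber_BC", "amp_C", "rx1"}
suspects = correlate([alarm_1, alarm_2])
```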
Abstract: In face recognition, feature extraction techniques attempt to find an appropriate representation of the data. However, when the feature dimension is larger than the sample size, performance degrades. Hence, we propose a method called Normalization Discriminant Independent Component Analysis (NDICA). The input data are regularized to obtain the most reliable features and then processed using Independent Component Analysis (ICA). The proposed method is evaluated on three face databases: Olivetti Research Ltd (ORL), Face Recognition Technology (FERET) and Face Recognition Grand Challenge (FRGC). NDICA shows its effectiveness compared with other unsupervised and supervised techniques.
Abstract: Within the realm of e-government, development has moved towards testing new means for democratic decision-making, such as e-panels, electronic discussion forums, and polls. Although such new developments seem promising, they are not problem-free, and their outcomes are seldom used in the subsequent formal political procedures. Nevertheless, process models offer promising potential when it comes to structuring and supporting the transparency of decision processes, in order to integrate the public into decision-making procedures in a reasonable and manageable way. Based on real-life cases of urban planning processes in Sweden, we present an outline of an integrated framework for public decision making to: a) provide tools for citizens to organize discussion and form opinions; b) enable governments, authorities, and institutions to better analyse these opinions; and c) enable governments to account for this information in planning and societal decision making by employing a process model for structured public decision making.
Abstract: In this paper, we have compared the performance of a Turbo and Trellis coded optical code division multiple access (OCDMA) system. The comparison of the two codes has been accomplished by employing optical orthogonal codes (OOCs). The Bit Error Rate (BER) performances have been compared by varying the code weights of address codes employed by the system. We have considered the effects of optical multiple access interference (OMAI), thermal noise and avalanche photodiode (APD) detector noise. Analysis has been carried out for the system with and without double optical hard limiter (DHL). From the simulation results it is observed that a better and distinct comparison can be drawn between the performance of Trellis and Turbo coded systems, at lower code weights of optical orthogonal codes for a fixed number of users. The BER performance of the Turbo coded system is found to be better than the Trellis coded system for all code weights that have been considered for the simulation. Nevertheless, the Trellis coded OCDMA system is found to be better than the uncoded OCDMA system. Trellis coded OCDMA can be used in systems where decoding time has to be kept low, bandwidth is limited and high reliability is not a crucial factor as in local area networks. Also the system hardware is less complex in comparison to the Turbo coded system. Trellis coded OCDMA system can be used without significant modification of the existing chipsets. Turbo-coded OCDMA can however be employed in systems where high reliability is needed and bandwidth is not a limiting factor.
Abstract: Asymmetric traffic between the uplink and downlink of recent mobile communication systems has become conspicuous with the provision of new communication services. This paper proposes an asymmetric traffic accommodation scheme adopting a multihop cooperative transmission technique for CDMA/FDD cellular networks. The proposed scheme employs the cooperative transmission technique in previously proposed downlink multihop transmissions for the accommodation of asymmetric traffic, utilizing the vacant uplink band for the downlink relay transmissions. The proposed scheme reduces the transmission power of the downlink relay transmissions and thereby suppresses interference with the uplink communications, thus improving uplink performance. The proposed scheme is evaluated by computer simulation, and the results show that it achieves better throughput performance.
Abstract: In this paper, optimum adaptive loading algorithms are applied to a multicarrier system with a Space-Time Block Coding (STBC) scheme, associated with space-time processing based on singular value decomposition (SVD) of the channel matrix, over Rayleigh fading channels. The SVD method is employed in the MIMO-OFDM system in order to overcome subchannel interference. Chow's and Campello's algorithms are implemented to obtain a bit and power allocation for each subcarrier, assuming instantaneous channel knowledge. The adaptive loaded SVD-STBC scheme is capable of providing both full rate and full diversity for any number of transmit antennas. The effectiveness of these techniques is demonstrated through the simulation of an adaptive loaded SVD-STBC system, and the comparison shows that the proposed algorithms ensure better performance in the MIMO case.
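The SVD step described above can be sketched in a few lines: decomposing a MIMO channel matrix H = U diag(s) V^H turns the channel into parallel, interference-free subchannels whose gains are the singular values, which is what the per-subcarrier bit and power loading then exploits. The 2x2 H below is an arbitrary example, not a channel realization from the paper.

```python
# SVD of a MIMO channel matrix: precoding with V and combining with U^H
# diagonalizes the channel into independent subchannels.
import numpy as np

H = np.array([[1.0, 0.5],
              [0.2, 0.9]])          # arbitrary 2x2 channel matrix
U, s, Vh = np.linalg.svd(H)

# Effective channel after precoding/combining: should equal diag(s).
D = U.conj().T @ H @ Vh.conj().T
```

Each diagonal entry of D is a subchannel gain; a bit-loading algorithm such as Chow's or Campello's would then allocate more bits and power to the stronger subchannels.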
Abstract: To provide a better understanding of the fair share policies supported by current production schedulers and their impact on scheduling performance, a relative fair share policy supported in four well-known production job schedulers is evaluated in this study. The experimental results show that fair share indeed prevents heavy-demand users from dominating the system resources. However, the detailed per-user performance analysis shows that some types of users may suffer unfairness under fair share, possibly due to the priority mechanisms used by current production schedulers. These users are typically not heavy-demand users, but they have a mixture of jobs that are not spread out.
Abstract: As part of an evaluation system for R&D programs, the Korean Government has applied the preliminary feasibility study to new government R&D program plans. Fundamentally, the purpose of the preliminary feasibility study is to decide whether the government will invest in a new R&D program. Additionally, the preliminary feasibility study can contribute to the improvement of R&D program plans. As examples, two cases of new R&D program plans subjected to the study are explained in this paper, and these R&D programs are expected to yield better performance than they would have without the study. The important point of the preliminary feasibility study is thus not only an effective decision-making process for R&D programs but also the opportunity to actually improve R&D program plans.
Abstract: The influence of pulsed electric fields on early physiological development in Arabidopsis thaliana was studied. Inside a 4-mm electroporation cuvette, pre-germination seeds were subjected to high-intensity, nanosecond electrical pulses generated using a laboratory-assembled pulsed electric field system. The field strength was varied from 5 to 20 kV/cm, while the pulse width and pulse number were maintained at 10 ns and 100, respectively, corresponding to specific treatment energies from 300 J/kg to 4.5 kJ/kg. Statistical analyses of the average leaf area 5 and 15 days after the pulsed electric field treatment showed that the effects become significant in the second week after treatment, with a maximum increase of 80% compared to the control (P < 0.01).
Abstract: Deep cold rolling (DCR) is a cold working process which readily produces a smooth and work-hardened surface by plastic deformation of surface irregularities. In the present study, the influence of the main deep cold rolling process parameters on the surface roughness and hardness of AISI 4140 steel was studied using a fractional factorial design of experiments. The surface integrity of the work material was assessed in terms of identifying the predominant factor among the selected parameters, their order of significance, and the factor levels that minimize surface roughness and/or maximize surface hardness. It was found that the ball diameter, rolling force, initial surface roughness and number of tool passes are the most pronounced parameters, with large effects on the workpiece's surface during the deep cold rolling process. A simple, inexpensive and newly developed DCR tool, with an interchangeable collet for using different ball diameters, was used throughout the experimental work presented in this paper.
Abstract: Nowadays, the pace of business change is such that, increasingly, new functionality has to be realized and reliably installed in a matter of days, or even hours. Consequently, more and more business processes are subject to continuous change. The objective of this research in progress is to use the MAP model in a conceptual modeling method for flexible and adaptive business processes. This method can be used to capture the flexibility dimensions of a business process; it takes inspiration from the modularity concept of the object-oriented paradigm to establish a hierarchical construction of the BP model. Its intent is to provide flexible modeling that allows companies to quickly adapt their business processes.
Abstract: Humans have always tried to create suitable living conditions according to their environment; indeed, geography plays an important role in shaping our living areas. Iran, a four-season country, has several climate types: hot and humid, hot and dry, mild and humid, and cold; therefore, different architectural styles can be found across Iran. Gilan's traditional architecture is a good example of sustainable construction in Iran, because the main factors shaping every dwelling are the climatic, social, economic and cultural conditions that demonstrate the interaction between the environment and human settlement. This paper examines the interaction between environmental factors and rural dwellings in Gilan province. The traditional village (city) of Masouleh, a rare example of rural and sustainable architecture, is also introduced.
Abstract: The aim of this paper is to provide a better understanding of the implementation of project management practices by UiTM contractors to ensure project success. A questionnaire survey was administered to 120 UiTM contractors in Malaysia to gather information on the contractors' project backgrounds and project management skills. It was found that all of the contractors had a basic knowledge and understanding of project management skills. The findings suggest that a reasonable project plan and an appropriate organizational structure are influential factors for project success, and it is recommended that contractors maintain an effective programme of work and an up-to-date information system.