Abstract: This contribution deals with the relationship between
communication effectiveness and the extent of communication
among organizational units. To facilitate communication between
employees and to increase the level of understanding, the knowledge
of communication tools is necessary. Recent experience has shown
that personal communication is critical for the smooth running of
companies and cannot be fully replaced by any form of technical
communication device.
The outcomes of research on the relationship between the extent
of communication among organizational units and its efficiency are
presented below.
Abstract: To support user mobility in a wireless network, new fundamental mechanisms are needed, such as paging, location updating, routing, and handover. Another key feature is the mobile QoS offered by wireless ATM (WATM). Several ATM network protocols must be updated to implement mobility management and to maintain the existing ATM QoS over wireless ATM networks. A survey of the various handover schemes and types is provided. The handover procedure guarantees re-establishment of a terminal's connection when it moves between areas covered by different base stations; it allows the user's radio link to be transferred without interrupting a connection. However, failure to offer efficient solutions will result in significant packet loss during handover, severe delays, and degradation of the QoS offered to applications. This paper reviews the requirements, characteristics and open issues of wireless ATM, particularly with regard to handover. It introduces key aspects of WATM and the mobility extensions added to the fixed ATM network. We propose a flexible approach to handover management that minimizes QoS deterioration. The functional entities of this flexible approach are discussed in order to achieve minimum impact on connection quality when a mobile terminal (MT) crosses between base stations (BSs).
Abstract: Recent evidence on the liquidity and valuation of securities in capital markets clearly shows the importance of stock market liquidity and firm valuation. In this paper, the relationship between transparency, liquidity, and valuation is studied using data obtained from 70 companies listed on the Tehran Stock Exchange during 2003-2012. Discretionary earnings management was used as a sign of lack of transparency, and Tobin's Q as the criterion of valuation. The results indicate a significant inverse relationship between earnings management and liquidity. On the other hand, there is a relationship between liquidity and transparency. The results also indicate a significant relationship between transparency and valuation. Transparency has an indirect effect on firm valuation alone or through the liquidity channel. Although the effect of transparency on firm value was reduced by adding the liquidity variable, the cumulative effect of transparency and liquidity increased.
Abstract: This paper describes the design of a voltage-based maximum power point tracker (MPPT) for photovoltaic (PV) applications. Of the various MPPT methods, the voltage-based method is considered the simplest and most cost-effective. Its major disadvantage is that the PV array is disconnected from the load for the sampling of its open-circuit voltage, which inevitably results in power loss. Another disadvantage, in the case of rapid irradiance variation, is that if the duration between two successive samplings, called the sampling period, is too long, there is a considerable loss: the output voltage of the PV array follows an unchanged reference during one sampling period, so once a maximum power point (MPP) is tracked and a change in irradiation occurs between two successive samplings, the new MPP is not tracked until the next sampling of the PV array voltage. This paper proposes an MPPT circuit in which the sampling interval of the PV array voltage and the sampling period have been shortened. The sample-and-hold circuit has also been simplified. The proposed circuit does not utilize a microcontroller or a digital signal processor and is thus suitable for low-cost and low-power applications.
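The voltage-based method rests on the observation that the MPP voltage of a PV array is roughly a fixed fraction k of its open-circuit voltage (commonly quoted as k ≈ 0.71-0.80). As a hedged illustration only, assuming a single-diode cell model with made-up parameters (not the circuit of this paper), the sketch below samples Voc and sets the operating reference to k·Voc:

```python
import math

# Single-diode PV cell model (hypothetical parameters, not from the paper).
IPH = 5.0      # photogenerated current [A]
I0 = 1e-10     # diode saturation current [A]
NVT = 0.0338   # n * thermal voltage [V]

def pv_current(v):
    """Cell current at terminal voltage v (single-diode model)."""
    return IPH - I0 * (math.exp(v / NVT) - 1.0)

def open_circuit_voltage():
    """Voc from I(Voc) = 0; in hardware this is the value sampled by
    briefly disconnecting the array from the load."""
    return NVT * math.log(IPH / I0 + 1.0)

def fractional_voc_reference(k=0.8):
    """Voltage-based MPPT: hold the array at Vref = k * Voc."""
    return k * open_circuit_voltage()

voc = open_circuit_voltage()
vref = fractional_voc_reference()
p_ref = vref * pv_current(vref)
# True MPP by brute-force scan, for comparison only.
p_max = max(v * pv_current(v) for v in [i * voc / 1000 for i in range(1001)])
print(f"Vref = {vref:.3f} V captures {100 * p_ref / p_max:.1f}% of P_max")
```

The power lost by holding k·Voc instead of the true MPP voltage is exactly the trade-off the abstract describes: shortening the sampling period limits how stale Vref can become under changing irradiance.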
Abstract: QoS routing aims to find paths between senders and
receivers that satisfy the QoS requirements of the application while
efficiently using network resources; the underlying routing
algorithm must be able to find low-cost paths that satisfy the given
QoS constraints. The problem of finding least-cost constrained routes
is known to be NP-complete, and several algorithms have been proposed
to find near-optimal solutions. However, these heuristics either
impose relationships among the link metrics to reduce the complexity
of the problem, which may limit their general applicability, or are
too costly in terms of execution time to be applicable to large
networks. In this paper, we analyze two algorithms, namely
Characterized Delay Constrained Routing (CDCR) and Optimized
Delay Constrained Routing (ODCR). The CDCR algorithm takes an
approach to delay-constrained routing that captures the trade-off
between cost minimization and the risk level regarding the delay
constraint. ODCR uses an adaptive path weight function together
with an additional constraint imposed on the path cost to restrict
the search space, and hence finds a near-optimal solution much
more quickly.
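The underlying delay-constrained least-cost (DCLC) problem can be stated concretely with a brute-force search. This sketch is not CDCR or ODCR (those heuristics exist precisely to avoid exhaustive enumeration); the topology, costs, delays and bound below are all hypothetical:

```python
# Brute-force delay-constrained least-cost (DCLC) path search.
# Each edge carries (cost, delay); we enumerate simple paths and keep
# the cheapest one whose total delay stays within the bound.
# NOTE: exact but exponential in the worst case.

def dclc_brute_force(graph, src, dst, delay_bound):
    best = (float("inf"), None)  # (cost, path)

    def dfs(node, path, cost, delay):
        nonlocal best
        if delay > delay_bound or cost >= best[0]:
            return  # prune infeasible or already-worse branches
        if node == dst:
            best = (cost, path[:])
            return
        for nxt, (c, d) in graph.get(node, {}).items():
            if nxt not in path:  # simple paths only
                path.append(nxt)
                dfs(nxt, path, cost + c, delay + d)
                path.pop()

    dfs(src, [src], 0, 0)
    return best

# Hypothetical 4-node topology: the cheap path A-B-D violates the delay
# bound, so the search falls back to the pricier but faster A-C-D.
net = {
    "A": {"B": (1, 5), "C": (3, 2), "D": (10, 1)},
    "B": {"D": (1, 5)},
    "C": {"D": (3, 2)},
}
cost, path = dclc_brute_force(net, "A", "D", delay_bound=6)
print(cost, path)
```

An adaptive-weight heuristic in the spirit of ODCR would instead run a polynomial shortest-path search on a combined weight of cost and delay, adjusting the weighting when the returned path misses the constraint.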
Abstract: The mixing behaviors of dry and wet granular
materials in gas fluidized bed systems were investigated
computationally using the combined Computational Fluid Dynamics
and Discrete Element Method (CFD-DEM). Dry particles were
observed to mix fairly rapidly during the fluidization process due to
vigorous relative motions between particles induced by the flow of
gas. In contrast, due to the presence of strong cohesive forces arising
from capillary liquid bridges between wet particles, the mixing
efficiencies of wet granular materials under similar operating
conditions were observed to be reduced significantly.
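Mixing in such particle simulations is often quantified with a segregation-based index. One common choice, shown here purely as an illustration and not necessarily the metric used in this study, is the Lacey mixing index, which compares the variance of local concentrations against its fully segregated and fully random limits:

```python
import random

def lacey_index(x_positions, types, n_cells=10):
    """Lacey mixing index M = (S0^2 - S^2) / (S0^2 - SR^2), where
    S^2 is the variance of the type-1 fraction over spatial cells,
    S0^2 = p(1-p) is the fully segregated limit and SR^2 = p(1-p)/n
    the fully random limit. M -> 0 segregated, M -> 1 well mixed."""
    cells = [[] for _ in range(n_cells)]
    for x, t in zip(x_positions, types):
        cells[min(int(x * n_cells), n_cells - 1)].append(t)
    fracs = [sum(c) / len(c) for c in cells if c]
    p = sum(types) / len(types)
    n_per_cell = len(types) / len(fracs)     # average sample size per cell
    mean = sum(fracs) / len(fracs)
    s2 = sum((f - mean) ** 2 for f in fracs) / len(fracs)
    s0_2 = p * (1 - p)
    sr_2 = s0_2 / n_per_cell
    return (s0_2 - s2) / (s0_2 - sr_2)

random.seed(1)
xs = [random.random() for _ in range(2000)]
segregated = [1 if x < 0.5 else 0 for x in xs]  # all type-1 on the left
mixed = [random.randint(0, 1) for _ in xs]      # types assigned at random
print(f"segregated: {lacey_index(xs, segregated):.2f}")
print(f"mixed:      {lacey_index(xs, mixed):.2f}")
```

Applied to particle positions from a CFD-DEM run at successive times, an index of this kind yields the mixing-rate curves that distinguish the dry (fast-mixing) and wet (cohesion-hindered) cases.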
Abstract: Ground-level tropospheric ozone is one of the air
pollutants of most concern. It is mainly produced by photochemical
processes involving nitrogen oxides and volatile organic compounds
in the lower parts of the atmosphere. Ozone levels become
particularly high in regions close to high ozone precursor emissions
and during summer, when stagnant meteorological conditions with
high insolation and high temperatures are common.
In this work, some results of a study of urban ozone
distribution patterns in the city of Badajoz, the largest and
most industrialized city in the Extremadura region (southwest Spain),
are presented. Fourteen sampling campaigns, at least one per month,
were carried out with an automatic portable analyzer to measure
ambient air ozone concentrations during periods selected for
conditions favourable to ozone production.
Later, to evaluate the ozone distribution across the city, the
measured ozone data were analyzed using geostatistical techniques.
First, the exploratory analysis revealed that the data were
normally distributed, a desirable property for the subsequent
stages of the geostatistical study. Secondly, during the structural
analysis, theoretical spherical models provided the best fit for
all monthly experimental variograms. The parameters of these
variograms (sill, range and nugget) revealed that the maximum
distance of spatial dependence is between 302 and 790 m and that
air ozone concentration is not evenly distributed over short
distances. Finally, predictive ozone maps were derived for all points
of the experimental study area, by use of geostatistical algorithms
(kriging). High prediction accuracy was obtained in all cases as
cross-validation showed. Useful information for hazard assessment
was also provided when probability maps, based on kriging
interpolation and kriging standard deviation, were produced.
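The workflow described above (a spherical variogram model followed by ordinary kriging) can be sketched as follows. The variogram parameters and monitoring sites here are invented for illustration, and a zero nugget is assumed, so the predictor reproduces the data exactly at the sample locations:

```python
import numpy as np

def spherical_gamma(h, sill=1.0, rng=500.0, nugget=0.0):
    """Spherical semivariogram: rises as 1.5(h/a) - 0.5(h/a)^3 up to
    the range a, then levels off at nugget + sill."""
    h = np.asarray(h, dtype=float)
    inside = nugget + sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    gamma = np.where(h < rng, inside, nugget + sill)
    return np.where(h == 0, 0.0, gamma)

def ordinary_kriging(coords, values, target, **vg):
    """Ordinary kriging: solve the semivariogram system with a
    Lagrange multiplier enforcing that the weights sum to one."""
    n = len(coords)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_gamma(d, **vg)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical_gamma(np.linalg.norm(coords - target, axis=1), **vg)
    w = np.linalg.solve(A, b)[:n]    # kriging weights
    return float(w @ values)

# Hypothetical monitoring sites (metres) and ozone values (ug/m3).
coords = np.array([[0.0, 0.0], [400.0, 100.0], [150.0, 600.0], [700.0, 700.0]])
values = np.array([62.0, 71.0, 58.0, 66.0])
print(ordinary_kriging(coords, values, np.array([300.0, 300.0])))
```

Evaluating the predictor on a grid of targets yields the kind of predictive map described above; the kriging variance from the same system is what drives the probability maps.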
Abstract: Software testability is proposed to address the problems of the increasing cost of testing and the quality of software. A testability measure provides a quantified way to denote the testability of software. Since the 1990s, many testability measure models have been proposed for this purpose. By discussing the contradiction between domain testability and the domain range ratio (DRR), a new testability measure, semantic fault distance, is proposed, and its validity is discussed.
Abstract: Simulation of occlusal function during laboratory
materials testing is essential for predicting long-term
performance before clinical use. The aim of the study was to assess
the influence of chamfer preparation depth on failure risk of heat
pressed ceramic crowns with and without zirconia framework by
means of finite element analysis. 3D models of maxillary central
incisor, prepared for full ceramic crowns with different depths of the
chamfer margin (between 0.8 and 1.2 mm) and 6-degree tapered
walls together with the overlying crowns were generated using
literature data (Fig. 1, 2). The crowns were designed with and
without a zirconia framework with a thickness of 0.4 mm. For all
preparations and crowns, stresses in the pressed ceramic crown,
zirconia framework, pressed ceramic veneer, and dentin were
evaluated separately. The highest stresses were registered in the
dentin. The depth of the preparations had no significant influence on
the stress values in the teeth and pressed ceramics for the studied
cases; only the zirconia framework was affected. The zirconia
framework decreases the stress values in the veneer.
Abstract: In this paper we canvass three case studies of unique
research partnerships between universities and schools in the wider
community. In doing so, we consider those areas of indeterminate
zones of professional practice explored by academics in their
research activities within the wider community. We discuss three
cases: an artist-in-residence program designed to engage primary
school children with new understandings about local Indigenous
Australian issues in their pedagogical and physical landscapes; an
assessment of pedagogical concerns in relation to the use of physical
space in classrooms; and the pedagogical underpinnings of a
costumed museum school program. In doing so, we treat research
as playing an integral part in the development, implementation
and maintenance of academic engagements with wider community
issues.
Abstract: In hypersonic environments, the aerothermal effect
makes it difficult for the optical side windows of optical guided
missiles to withstand high heat. This produces cracking or breaking,
resulting in an inability to function. This study used computational
fluid dynamics to investigate the external cooling jet conditions of
optical side windows. The k-ε and k-ω turbulence models were
simulated. To accord better with actual aerothermal
environments, a thermal radiation model was added to examine
suitable amounts of external coolants and the optical window
problems of aero-thermodynamics. The simulation results indicate that
when there are no external cooling jets, because airflow on the optical
window and the tail groove produce vortices, the temperatures in these
two locations reach a peak of approximately 1600 K. When the
external cooling jets worked at 0.15 kg/s, the surface temperature of
the optical windows dropped to approximately 280 K. When adding
thermal radiation conditions, because heat flux dissipation was faster,
the surface temperature of the optical windows fell from 280 K to
approximately 260 K. The difference in influence of the different
turbulence models k-ε and k-ω on optical window surface temperature
was not significant.
Abstract: Through the 1980s, management accounting researchers
described the increasing irrelevance of traditional control and
performance measurement systems. The Balanced Scorecard (BSC)
is a critical business tool for many organizations. It is a
performance measurement system which translates mission and
strategy into objectives. Strategy map approach is a development
variant of BSC in which some necessary causal relations must be
established. To recognize these relations, experts usually use
experience. It is also possible to utilize regression for the same
purpose. Structural Equation Modeling (SEM), which is one of the
most powerful methods of multivariate data analysis, obtains more
appropriate results than traditional methods such as regression. In the
present paper, we propose SEM for the first time to identify the
relations between objectives in the strategy map, and a test to
measure the importance of relations. In SEM, factor analysis and test
of hypotheses are done in the same analysis. SEM is known to be
better than other techniques at supporting analysis and reporting. Our
approach provides a framework which permits the experts to design
the strategy map by applying a comprehensive and scientific method
together with their experience. Therefore, this scheme is a more
reliable method than those previously established.
Abstract: Mobile users with laptops need efficient access to,
for example, their home personal data or the Internet from any
place in the world, regardless of their location or point of
attachment, especially while roaming outside the home subnet. An
efficient interpretation of the packet-loss problem encountered
during such roaming is central to all aspects of this work. The
main previous works considered to be in conjunction with the
problem under study, such as BER-systems, Amigos, and an ns-2
implementation, are reviewed and discussed. Their drawbacks and
limitations, namely stopping at monitoring and not providing an
actual solution for eliminating or even restricting these losses,
are pointed out. We then present the framework around which we
built a Triple-R sequence as a cost-effective solution to eliminate
the packet losses and bridge the gap between subnets, an area that
until now has been largely neglected. The results show that, in
addition to the high bit error rate of wireless mobile networks,
the low efficiency of the mobile-IP registration procedure is the
main direct cause of these packet losses. Furthermore, the
interpretation of the packet losses resulted in an illustrated
triangle of the registration process. This triangle should be
further researched and analyzed in our future work.
Abstract: The effect of the blade tip geometry of a high pressure
gas turbine is studied experimentally and computationally for high
speed leakage flows. For this purpose, two simplified models are
constructed: one models a flat blade tip and the second a cavity
blade tip. Experimental results are obtained from a
the tip wall and to provide flow visualization. RANS computations
were carried out to provide further insight into the mean flow
behavior and to calculate the discharge coefficient, which is a
measure of the flow leaking over the tip. It is shown that for both
tip geometries the flow separates over the tip to form a separation
bubble. The
bubble is higher for the cavity tip while a complete shock wave
system of oblique waves ending with a normal wave can be seen for
the flat tip. The discharge coefficient for the flat tip shows less
dependence on the pressure ratio over the blade tip than the cavity
tip. However, the discharge coefficient for the cavity tip is lower than
that of the flat tip, showing a better ability to reduce the leakage flow
and thus increase the turbine efficiency.
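The discharge coefficient referred to above is the ratio of the measured leakage mass flow to the ideal (isentropic) mass flow through the tip gap. A hedged sketch, assuming a simple 1-D isentropic orifice model with hypothetical gap area and conditions (not the geometry or data of this study):

```python
import math

def ideal_mdot(area, p0, t0, p_exit, gamma=1.4, r_gas=287.0):
    """Isentropic 1-D mass flow through an orifice of given area,
    from stagnation conditions (p0, t0) to exit pressure p_exit.
    Below the critical pressure ratio the flow chokes."""
    r_crit = (2.0 / (gamma + 1.0)) ** (gamma / (gamma - 1.0))
    pr = max(p_exit / p0, r_crit)  # clamp: choking caps the flux
    flux = math.sqrt(2.0 * gamma / ((gamma - 1.0) * r_gas * t0)
                     * (pr ** (2.0 / gamma) - pr ** ((gamma + 1.0) / gamma)))
    return area * p0 * flux

def discharge_coefficient(mdot_measured, area, p0, t0, p_exit):
    """Cd = measured leakage flow / ideal isentropic flow."""
    return mdot_measured / ideal_mdot(area, p0, t0, p_exit)

# Hypothetical tip gap: 1 cm^2 area, 1 bar / 300 K stagnation conditions.
m_choked = ideal_mdot(1e-4, 1e5, 300.0, 0.4e5)  # choked (pr < 0.528)
m_sub = ideal_mdot(1e-4, 1e5, 300.0, 0.9e5)     # subsonic
print(m_choked, m_sub)
print(discharge_coefficient(0.018, 1e-4, 1e5, 300.0, 0.4e5))
```

A Cd below unity quantifies how much the separation bubble and shock system throttle the leakage relative to an ideal nozzle, which is why the cavity tip's lower Cd implies less leakage.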
Abstract: The objective of this paper is the design of a pattern
classification model based on the back-propagation (BP) algorithm for
a decision support system. The standard BP model uses full connections
between the nodes of successive layers, from the input layer to the
output layer. It therefore requires considerable computing time and
many training iterations to reach good performance and an acceptable
error rate when generating patterns or training the network.
The proposed model instead uses exclusive connections between the
hidden-layer nodes and the output nodes. Its advantages are a smaller
number of iterations and better performance compared with the standard
back-propagation model. We simulated several classification cases with
different network settings (e.g., the number of hidden layers and
nodes, the number of classes, and the number of iterations). In our
simulations, most cases were handled better by the exclusive-connection
BP network model than by standard BP. We expect this algorithm to be
applicable to the identification of user faces, data analysis, and
mapping between environmental data and information.
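As a point of reference, standard back-propagation with a fully connected hidden layer can be sketched in a few lines; the exclusive-connection variant described above would additionally restrict (mask) the hidden-to-output links, a detail the abstract does not specify, so this sketch shows plain BP on a toy XOR task with invented settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR task -- the classic demonstration that a hidden layer is needed.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)   # hidden -> output (full)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward():
    H = sigmoid(X @ W1 + b1)
    return H, sigmoid(H @ W2 + b2)

loss_before = float(np.mean((forward()[1] - T) ** 2))
for _ in range(5000):                  # full-batch gradient descent
    H, Y = forward()
    dY = (Y - T) * Y * (1 - Y)         # output delta (sigmoid derivative)
    dH = (dY @ W2.T) * H * (1 - H)     # hidden delta, back-propagated
    W2 -= H.T @ dY                     # learning rate 1.0 folded in
    b2 -= dY.sum(0)
    W1 -= X.T @ dH
    b1 -= dH.sum(0)
loss_after = float(np.mean((forward()[1] - T) ** 2))
print(loss_before, "->", loss_after)
```

An exclusive-connection scheme could be emulated here by elementwise-multiplying W2 (and its gradient) with a fixed 0/1 mask so that each output node sees only its own subset of hidden nodes; fewer active weights generally means fewer multiplications per iteration.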
Abstract: This paper presents an adaptive feedback linearization approach to controlling a helicopter. Ideal feedback linearization is defined for the case when the system model is known. Adaptive feedback linearization is employed to obtain asymptotically exact cancellation of the inherent uncertainty in the given system parameters. The control algorithm is implemented using the feedback linearization technique combined with an adaptive method. The controller parameters are unknown, and an adaptive control law drives them towards their ideal values to provide perfect model matching between the reference model and the closed-loop plant. The converged controller parameters then provide good estimates of the unknown plant parameters.
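The adaptive idea (drive unknown controller parameters toward values that make the closed loop match a reference model) can be illustrated on a first-order toy plant. This is a textbook Lyapunov-based model-reference sketch, not the helicopter dynamics or control law of the paper; all values are hypothetical:

```python
import math

# Plant:      x' = a*x + u, with a unknown to the controller.
# Reference:  xm' = -xm + r  (the desired closed-loop behaviour).
# Control:    u  = -(a_hat + 1)*x + r  cancels a*x exactly when a_hat = a.
# Adaptation: a_hat' = gamma * x * e, e = x - xm (Lyapunov-derived law).
a_true = 2.0      # unknown, unstable plant pole (hypothetical value)
gamma = 5.0       # adaptation gain
dt, steps = 0.001, 30000

x = xm = a_hat = 0.0
for k in range(steps):
    r = math.sin(k * dt)              # persistently exciting reference
    e = x - xm
    u = -(a_hat + 1.0) * x + r
    x += dt * (a_true * x + u)        # Euler integration of the plant
    xm += dt * (-xm + r)              # ... and of the reference model
    a_hat += dt * gamma * x * e       # parameter update law
print(f"tracking error e = {x - xm:.4f}, a_hat = {a_hat:.3f}")
```

With the Lyapunov function V = e²/2 + (â - a)²/(2γ), this update law gives V̇ = -e², so the tracking error vanishes and, under persistent excitation, â converges to the true parameter, mirroring the claim that the converged controller parameters estimate the plant parameters.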
Abstract: Atmospheric stability plays the most important role in
the transport and dispersion of air pollutants. Different methods are
used for stability determination with varying degrees of complexity.
Most of these methods are based on the relative magnitude of
convective and mechanical turbulence in atmospheric motions.
Richardson number, Monin-Obukhov length, Pasquill-Gifford
stability classification and Pasquill–Turner stability classification are
the most common parameters and methods. The Pasquill–Turner
Method (PTM), which is employed in this study, makes use of
observations of wind speed, insolation and the time of day to classify
atmospheric stability with distinguishable indices. In this study, a
model is presented to determine atmospheric stability
conditions using the PTM. As a case study, meteorological data from
Mehrabad station in Tehran from 2000 to 2005 are applied to the model.
Three different categories are considered to deduce the pattern
of stability conditions. First, the overall pattern of stability
classification is obtained; the results show that the atmosphere is in
stable, neutral and unstable conditions 38.77%, 27.26% and
33.97% of the time, respectively. It is
also observed that days are mostly unstable (66.50%) while nights are
mostly stable (72.55%). Second, monthly and seasonal patterns are
derived; the results indicate that the relative frequency of stable
conditions decreases from January to June and increases from June
to December, while the results for unstable conditions behave in
exactly the opposite manner. Autumn is the most stable season, with a
relative frequency of 50.69% for stable conditions, compared with
42.79%, 34.38% and 27.08% for winter, summer and spring, respectively.
The hourly stability pattern is the third category; it shows that
unstable conditions are dominant from approximately 03-15 GMT and
04-12 GMT in the warm and cold seasons, respectively. Finally,
correlation between atmospheric stability and CO concentration is
achieved.
Abstract: The present study investigated the relationship
between the personality characteristics of drivers and the number and
amount of fines they receive in a year. The study was carried out on
120 male taxi drivers who worked at least seven hours a day in
Lamerd, a city in the south of Iran. Subjects were chosen
voluntarily among those available. The predictive variables were the
NEO five-factor personality traits (1. Conscientiousness, 2. Openness
to Experience, 3. Neuroticism, 4. Extraversion, 5. Agreeableness);
the criterion variables were the number and amount of fines the
drivers had received in the last three years. The results of
regression analysis showed that the conscientiousness factor
negatively predicted the number and amount of financial fines the
drivers had received during the last three years. The openness factor
positively predicted the number of fines in the last three years and
the amount of financial fines during the last year. The extraversion
factor meaningfully and positively predicted only the amount of
financial fines during the last year. Increasing age was associated
with decreasing driving offenses as well as financial loss. The
findings can be useful in recognizing high-risk drivers and referring
them to counseling centers. They can also be used to inform drivers
about their personality and its relation to their accident rate. Such
criteria would be of great importance when employing drivers in
different settings such as companies and offices.
Abstract: With the approach of the digital era, various interactive
service platforms and systems support human beings' needs in daily
life through different contents and measures. Design strategies have
gradually turned from function-based to user-oriented, and are often
customized. In other words, the goal becomes how designers include
users' value reactions in creation. Creative design service for
interior design requires positive interaction and communication to
allow users to obtain full design information, recognize the style
and process of personal needs, develop creative service design,
lower communication time and cost, and satisfy users' sense of
achievement. Thus, by constructing a co-design method based on the
communication between interior designers and users, this study
recognizes users' real needs and provides a measure of co-design
for designers and users.
Abstract: An energy and exergy study of an air-water combined solar collector, called a dual-purpose solar collector (DPSC), is presented. The ε-NTU method is used, and the analysis is performed for triangular channels. Parameters such as the air flow rate and the water inlet temperature are studied. The results show that the DPSC has better energy and exergy efficiency than a single-purpose collector. In addition, the triangular passage with a water inlet temperature of 60 °C showed better exergy and energy efficiency.
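The ε-NTU calculation referred to above uses the standard heat-exchanger effectiveness relations. As a generic illustration only (the collector's actual flow arrangement, capacity rates and temperatures are not reproduced here), the counter-flow relation is:

```python
import math

def effectiveness_counterflow(ntu, cr):
    """Counter-flow heat-exchanger effectiveness from the standard
    epsilon-NTU relation; cr = C_min / C_max (cr = 1 is a special case)."""
    if cr == 1.0:
        return ntu / (1.0 + ntu)
    return ((1.0 - math.exp(-ntu * (1.0 - cr)))
            / (1.0 - cr * math.exp(-ntu * (1.0 - cr))))

def heat_transfer(ntu, c_min, cr, t_hot_in, t_cold_in):
    """Actual duty q = eps * C_min * (Th,in - Tc,in)."""
    return effectiveness_counterflow(ntu, cr) * c_min * (t_hot_in - t_cold_in)

# Limiting case cr = 0 (one stream at effectively constant temperature)
# reduces to eps = 1 - exp(-NTU).
print(effectiveness_counterflow(1.0, 0.0))
# Hypothetical duty: NTU = 1.5, C_min = 120 W/K, cr = 0.5, 60 C vs 20 C.
print(heat_transfer(1.5, 120.0, 0.5, 60.0, 20.0))
```

In a DPSC analysis of this kind, separate ε-NTU balances for the air and water streams give the energy efficiency, and the same duties evaluated against ambient temperature give the exergy efficiency.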