Abstract: One effective way of storing thermal energy in buildings is the integration of bricks with phase change materials (PCMs). This paper presents a two-dimensional model for simulating and analyzing PCM behavior in order to minimize energy consumption in buildings. The numerical approach uses real weather data for a selected city in Iran (Tehran). Two kinds of PCM-integrated brick are investigated and compared based on outdoor weather conditions and the resulting energy consumption. The results show a significant reduction, of about 32.8% depending on the PCM quantity, in the maximum heat flux entering the building. The results are analyzed with various temperature contour plots, which illustrate the time-dependent mechanism of the entering heat flux for a brick integrated with PCM. Further analysis investigates the effect of PCM location on the entering heat flux. The results demonstrate that, to achieve the maximum performance of the PCM, it is better to locate it near the outdoor side of the wall.
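The wall-with-PCM mechanism can be sketched with a one-dimensional explicit finite-difference model in which the PCM layer is represented by an effective heat capacity. This is only a minimal illustration, not the paper's two-dimensional model: the material properties, layer position, latent heat, and boundary temperatures below are all assumed values.

```python
# Minimal 1D sketch of PCM-in-wall conduction via the effective heat
# capacity method. All parameter values are illustrative assumptions.

def simulate_wall(n=50, dx=0.005, dt=1.0, steps=3600,
                  t_out=40.0, t_in=24.0,
                  pcm_start=10, pcm_end=20, t_melt=26.0, melt_range=2.0):
    k, rho, cp = 0.7, 1800.0, 840.0   # assumed brick conductivity, density, heat capacity
    latent = 180e3                    # assumed PCM latent heat [J/kg]
    T = [t_in] * n
    for _ in range(steps):
        new = T[:]
        for i in range(1, n - 1):
            # effective heat capacity: add a latent term inside the melting range
            c_eff = cp
            if pcm_start <= i < pcm_end and abs(T[i] - t_melt) < melt_range:
                c_eff = cp + latent / (2.0 * melt_range)
            alpha = k / (rho * c_eff)
            new[i] = T[i] + alpha * dt / dx**2 * (T[i-1] - 2*T[i] + T[i+1])
        new[0], new[-1] = t_out, t_in  # fixed outdoor/indoor surface temperatures
        T = new
    # heat flux entering at the indoor surface (Fourier's law, one-sided difference)
    q_in = k * (T[-2] - T[-1]) / dx
    return T, q_in
```

Moving the `pcm_start`/`pcm_end` indices toward the outdoor boundary is the 1D analogue of the location study described in the abstract.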
Abstract: A novel calibration approach that aims to reduce ASM2d parameter subsets and decrease model complexity is presented. This approach does not require high computational demand; it reduces the number of modeling parameters required to calibrate the ASMs by employing a sensitivity and iteration methodology. Parameter sensitivity is the crucial factor, and the iteration methodology enables refinement of the simulated parameter values. During the iteration process, parameter values are determined in descending order of their sensitivities. The number of iterations required equals the number of model parameters in the parameter significance ranking. This approach was applied to the ASM2d model to evaluate enhanced biological phosphorus removal (EBPR), and it was successful. The simulation results provide the calibrated parameters, which include YPAO, YPO4, YPHA, qPHA, qPP, μPAO, bPAO, bPP, bPHA, KPS, YA, μAUT, bAUT, KO2 AUT, and KNH4 AUT. These parameters corresponded well to the available experimental data.
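The sensitivity-plus-iteration idea can be sketched generically: rank parameters by the finite-difference sensitivity of the fit error, then adjust one parameter per iteration in descending order of sensitivity. The model, data, and line-search scheme below are illustrative placeholders, not ASM2d or the paper's exact procedure.

```python
# Generic sketch of sensitivity-ranked, one-parameter-per-iteration calibration.
# model(params, x) -> prediction; data is a list of (x, observed) pairs.

def sensitivity_ranking(model, params, data, eps=1e-4):
    """Rank parameter names by finite-difference sensitivity of the fit error."""
    def sse(p):
        return sum((model(p, x) - y) ** 2 for x, y in data)
    base = sse(params)
    sens = {}
    for name in params:
        bumped = dict(params)
        bumped[name] *= (1 + eps)
        sens[name] = abs(sse(bumped) - base) / (params[name] * eps)
    return sorted(sens, key=sens.get, reverse=True)

def calibrate(model, params, data, n_grid=41, span=0.5):
    """One iteration per parameter, taken in descending sensitivity order."""
    def sse(p):
        return sum((model(p, x) - y) ** 2 for x, y in data)
    params = dict(params)
    for name in sensitivity_ranking(model, params, data):
        center = params[name]
        # simple grid line-search around the current value (assumed scheme)
        candidates = [center * (1 - span + 2 * span * i / (n_grid - 1))
                      for i in range(n_grid)]
        params[name] = min(candidates, key=lambda v: sse({**params, name: v}))
    return params
```

The number of outer iterations equals the number of ranked parameters, matching the counting rule stated in the abstract.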
Abstract: Governments around the world are expending
considerable time and resources framing strategies and policies to
deliver energy security. The term 'energy security' has quietly
slipped into the energy lexicon without any meaningful discourse
about its meaning or assumptions. An examination of explicit and
inferred definitions finds that the concept is inherently slippery
because it is polysemic in nature, having multiple dimensions and
taking on different specificities depending on the country (or
continent), timeframe or energy source to which it is applied. But
what does this mean for policymakers? Can traditional policy
approaches be used to address the problem of energy security, or do
its polysemic qualities mean that it should be treated as a 'wicked'
problem? To answer this question, the paper assesses energy security
against nine commonly cited characteristics of wicked policy
problems and finds strong evidence of 'wickedness'.
Abstract: This article presents a numerical study of double-diffusive mixed convection in a vertical channel filled with a porous medium, using a non-equilibrium model. The flow is assumed fully developed, unidirectional, and steady. The controlling parameters are the thermal Rayleigh number (RaT), Darcy number (Da), Forchheimer number (F), buoyancy ratio (N), interphase heat transfer coefficient (H), and porosity-scaled thermal conductivity ratio (γ). The Brinkman-extended non-Darcy model is considered. The governing equations are solved by a spectral collocation method. The main emphasis is on the flow profiles as well as the heat and solute transfer rates when the two diffusive components, expressed through the buoyancy ratio, aid (or oppose) each other and the solid matrix and fluid are not in thermal equilibrium. The results show that, for aiding flow (RaT = 1000), the heat transfer rate of the fluid (Nuf) increases up to a certain value of H, beyond which it decreases smoothly and converges to a constant, whereas for opposing flow (RaT = -1000) the result is the same for N = 0 and 1. The variation of Nuf in the (N, Nuf)-plane shows a sinusoidal pattern for RaT = -1000. In both cases (aiding and opposing), the flow destabilizes as N increases, with a point of inflection or flow separation appearing in the velocity profile. Overall, the buoyancy force has a significant impact on non-Darcy mixed convection under LTNE conditions.
Abstract: There is significant interest in achieving technology
innovation through new product development activities. It is
recognized, however, that traditional project management practices,
focused only on performance, cost, and schedule attributes, can often
lead to risk mitigation strategies that limit new technology
innovation. In this paper, a new approach is proposed for formally
managing and quantifying technology innovation. This approach uses
a risk-based framework that simultaneously optimizes innovation
attributes along with traditional project management and system
engineering attributes. To demonstrate the efficacy of the new
risk-based approach, a comprehensive product development experiment
was conducted. This experiment simultaneously managed the
innovation risks and the product delivery risks through the proposed
risk-based framework. Quantitative metrics for technology
innovation were tracked and the experimental results indicate that the
risk-based approach can simultaneously achieve both project
deliverable and innovation objectives.
Abstract: This paper focuses on the probabilistic numerical solution of problems in biomechanics and mining. Applications of the Simulation-Based Reliability Assessment (SBRA) method are presented in the design of external fixators applied in traumatology and orthopaedics (these fixators can be used for the treatment of open and unstable fractures, etc.) and in the solution of a hard rock (ore) disintegration process (i.e. the bit moves into the ore and subsequently disintegrates it). The results are compared with experiments, and a new design of the excavation tool is proposed.
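At its core, the SBRA method estimates the probability that a load effect exceeds a resistance by Monte Carlo simulation over random input variables. The sketch below uses assumed normal distributions purely for illustration; the paper's fixator and rock-bit inputs (and their actual distributions) would replace them.

```python
# Monte Carlo sketch of the SBRA idea: P(failure) = P(R - S < 0),
# with illustrative (assumed) distributions for load S and resistance R.
import random

def sbra_failure_probability(n=100_000, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        load = rng.gauss(mu=100.0, sigma=15.0)        # load effect S (assumed)
        resistance = rng.gauss(mu=160.0, sigma=20.0)  # resistance R (assumed)
        if resistance - load < 0:                     # reliability margin RF = R - S
            failures += 1
    return failures / n
```

With these assumed distributions the margin R - S is normal with mean 60 and standard deviation 25, so the estimate should land near Φ(-2.4) ≈ 0.008.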
Abstract: In this paper multi-objective genetic algorithms are
employed for Pareto approach optimization of ideal Turboshaft
engines. In the multi-objective optimization a number of conflicting
objective functions are to be optimized simultaneously. The
important objective functions considered for optimization are the
specific thrust (F/ṁ0), power-specific fuel consumption (SP),
specific output shaft power (Ẇshaft/ṁ0), and overall efficiency (ηO).
These objectives usually conflict with each other. The design
variables consist of thermodynamic parameters (compressor pressure
ratio, turbine temperature ratio, and Mach number).
In the first stage, single-objective optimization is investigated, and then the NSGA-II method is used for multi-objective optimization. Optimization procedures are performed for two and four objective functions, and the results are compared for the ideal turboshaft engine. To investigate the optimal thermodynamic behavior of two objectives, different sets, each including two of the output objectives, are considered individually. For each set, the Pareto front is depicted. The sets of decision variables selected from these Pareto fronts yield the best possible combinations of the corresponding objective functions. No point on a Pareto front is superior to any other point on the front, but every such point is superior to any point off the front. In the case of four-objective optimization, the results are given in tables.
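The Pareto-front property stated above (no front point dominates another, but front points dominate off-front points) reduces to non-dominated filtering. A minimal sketch for minimization objectives follows; NSGA-II builds on exactly this dominance test, adding non-dominated sorting into ranks, crowding distance, and genetic operators.

```python
# Pareto dominance and non-dominated filtering for minimization objectives.

def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]
```

For example, among the objective vectors (1,5), (2,3), (3,4), (4,1), (5,5), the point (3,4) is dominated by (2,3) and (5,5) by (1,5), leaving the front [(1,5), (2,3), (4,1)].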
Abstract: The six sigma method is a project-driven management approach to improve an organization's products, services, and processes by continually reducing defects. Understanding the key features, obstacles, and shortcomings of the six sigma method allows organizations to better support their strategic directions and their increasing needs for coaching, mentoring, and training. It also provides opportunities to better implement six sigma projects. The purpose of this paper is to survey the six sigma process and its impact on organizational productivity. Accordingly, key concepts and the problem-solving process of six sigma are studied, along with important areas such as DMAIC, applied six sigma and productivity programmes, and other advantages of six sigma. The paper closes with the research conclusion: there is a direct and positive relation between six sigma and productivity.
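The defect-reduction core of six sigma can be made concrete with the standard DPMO (defects per million opportunities) and sigma-level metrics. The figures in the example are illustrative, not taken from the survey, and the 1.5-sigma shift is the conventional industry assumption.

```python
# Standard six sigma metrics: DPMO and short-term sigma level.
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value):
    """Short-term sigma level, using the conventional 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + 1.5
```

For instance, 25 defects over 1,000 units with 10 opportunities each gives a DPMO of 2,500, and the canonical 3.4 DPMO corresponds to a sigma level of about 6.0.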
Abstract: Arbitrarily shaped video objects are an important
concept in modern video coding methods. The techniques presently
used are based not on image elements but on video objects having
an arbitrary shape. In this paper, spatial shape error concealment
techniques for object-based video in error-prone environments
are proposed. We consider a geometric shape
representation consisting of the object boundary, which can be
extracted from the α-plane. Three different approaches are used to
replace a missing boundary segment: Bézier interpolation, Bézier
approximation, and NURBS approximation. Experimental results on
object shapes of varying concealment difficulty demonstrate the
performance of the proposed methods. Comparisons among the proposed
methods are also presented.
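The Bézier-interpolation variant can be sketched as follows: bridge a lost boundary segment with a cubic Bézier whose control points extend the tangents of the received boundary samples adjacent to the gap. This is only one plausible control-point choice, assumed for illustration; the paper's exact construction may differ, and the Bézier/NURBS approximation variants are not shown.

```python
# Concealing a missing boundary segment with a cubic Bézier curve.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bézier at parameter t in [0, 1]."""
    s = 1.0 - t
    x = s**3 * p0[0] + 3*s*s*t * p1[0] + 3*s*t*t * p2[0] + t**3 * p3[0]
    y = s**3 * p0[1] + 3*s*s*t * p1[1] + 3*s*t*t * p2[1] + t**3 * p3[1]
    return (x, y)

def conceal_gap(before, after, n=20):
    """Bridge a lost segment between the last two received points 'before'
    the gap and the first two received points 'after' it."""
    a, b = before[-2], before[-1]   # b: gap start; tangent from a -> b
    c, d = after[0], after[1]       # c: gap end;   tangent from d -> c
    ctrl1 = (b[0] + (b[0] - a[0]), b[1] + (b[1] - a[1]))  # extend entry tangent
    ctrl2 = (c[0] + (c[0] - d[0]), c[1] + (c[1] - d[1]))  # extend exit tangent
    return [cubic_bezier(b, ctrl1, ctrl2, c, i / (n - 1)) for i in range(n)]
```

Because the curve interpolates the gap endpoints and matches the entry and exit tangent directions, the concealed segment joins the surviving boundary smoothly.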
Abstract: Today, higher education worldwide is subordinated to greater institutional control through Quality of Education policies. These include processes of over-evaluation of all academic activities: students' and professors' performance, educational logistics, and managerial standards for the administration of institutions of higher education, as well as the establishment of imaginaries of excellence and prestige as the foundations on which universities of the 21st century will focus their present and future goals and interests. At the same time, higher education systems worldwide are facing a profound crisis of sense and meaning and are undergoing enormous mutations in their identity. Based on a qualitative research approach, this paper shows the social configurations that scholars at universities in Mexico build around the discourse of the Quality of Education, and how these policies put at risk the social recognition of these individuals.
Abstract: Despite many success stories of manufacturing safety, many organizations are still reluctant, perceiving it as cost-increasing and time-consuming. A clear contributor may be the use of lagging indicators rather than leading indicator measures. This study therefore proposes a combinatorial model for determining the best safety strategy. Combination theory and cost-benefit analysis were employed to develop a monetary saving/loss function in terms of the value of preventions and the cost of the prevention strategy. Documentation, interviews, and a structured questionnaire were employed to collect before-and-after safety programme records from a tobacco company for the periods 1993-2001 (pre-safety) and 2002-2008 (safety period) for the model application. Three combinatorial alternatives A, B, and C were obtained, resulting in 4, 6, and 4 strategies respectively, with PPE and training predominant. A total of 728 accidents were recorded over the 9-year pre-safety period, and 163 accidents over the 7-year safety period. Six prevention activities (alternative B) yielded the best results, and this held in all years of operation except 2004. The study provides a leading resource for planning a successful safety programme.
Abstract: In this paper an analytical crack propagation scenario
is proposed which assumes that a crack propagates in the tooth root in
both the crack depth direction and the tooth width direction, and
which is more reasonable and realistic for non-uniform load
distribution cases than the other presented scenarios. An analytical
approach is used to quantify the loss of time-varying gear mesh
stiffness in the presence of crack propagation in the gear tooth root.
The proposed crack propagation scenario can be applied for crack
propagation modelling and monitoring simulation, but further
research is required for comparison and evaluation of all the
presented crack propagation scenarios from the condition monitoring
point of view.
Abstract: In recent decades, to meet the varied demands of clients, many manufacturers have adopted the mixed-model assembly line (MMAL) in their production systems, since this policy makes it possible to assemble various models of equivalent goods on the same line under a make-to-order (MTO) approach.
In this article, we determine the sequence of the MMAL while applying the kitting approach and planning rest time for general workers, in order to reduce waste, increase worker effectiveness, and apply part of the lean production approach.
This multi-objective sequencing problem is solved at small sizes with GAMS 22.2 and the PSO meta-heuristic on 10 test problems; comparison shows that their results are very similar. We then determine the important factors in computing the cost, whose improvement reduces it. Since this problem is NP-hard at large sizes, we use the particle swarm optimization (PSO) meta-heuristic to solve it there. At large sizes we define test problems to assess its performance and determine the important factors in calculating the cost, so that by changing or improving them, production at minimum cost becomes possible.
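The PSO meta-heuristic referred to above can be sketched in its generic continuous form, shown here on a simple test function. The paper's discrete sequencing problem additionally requires encoding sequences as particles, which is not shown; the inertia and acceleration coefficients below are common textbook values, not the paper's settings.

```python
# Generic particle swarm optimization minimizing a continuous cost function.
import random

def pso(cost, dim=2, n_particles=20, iters=200, seed=0,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]                 # personal best positions
    gbest = min(pbest, key=cost)[:]            # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # velocity: inertia + cognitive pull + social pull
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            if cost(xs[i]) < cost(pbest[i]):
                pbest[i] = xs[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest, cost(gbest)
```

On the 2D sphere function the swarm converges toward the origin within a few hundred iterations.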
Abstract: Web applications have become very complex and crucial, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). Consequently, the scientific community has focused its attention on Web application design, development, analysis, and testing, studying and proposing methodologies and tools. This paper proposes an approach to automatic multi-dimensional concern mining for Web applications, based on concept analysis, impact analysis, and token-based concern identification. This approach lets the user analyse and traverse the Web software relevant to a particular concern (concept, goal, purpose, etc.) via multi-dimensional separation of concerns, in order to document, understand, and test Web
applications. This technique was developed in the context of WAAT
(Web Applications Analysis and Testing) project. A semi-automatic
tool to support this technique is currently under development.
Abstract: Induction machine models used for steady-state and
transient analysis require machine parameters that are usually
considered design parameters or data. The knowledge of induction
machine parameters is very important for Indirect Field Oriented
Control (IFOC). A mismatched set of parameters will degrade the
response of the speed and torque control. This paper presents an
improved approach to rotor time constant adaptation in IFOC for
induction machines (IM). Our approach aims to improve the
estimation accuracy of the fundamental model for flux estimation.
Based on a reduced-order IM model, the rotor fluxes and rotor time
constant are estimated using only the stator currents and voltages.
This reduced-order model offers many advantages for real-time
parameter identification of the IM.
Abstract: Reverse engineering of full-genomic interaction networks based on compendia of expression data has been successfully applied to a number of model organisms. This study adapts these approaches for an important non-model organism: the major human fungal pathogen Candida albicans. During the infection process, the pathogen can adapt to a wide range of environmental niches and reversibly change its growth form. Given the importance of these processes, it is important to know how they are regulated. This study presents a reverse engineering strategy able to infer full-genomic interaction networks for C. albicans based on linear regression, utilizing the sparseness criterion (LASSO). To overcome the limited amount of expression data and the small number of known interactions, we utilize different prior-knowledge sources to guide the network inference towards a knowledge-driven solution. Since no database of known interactions for C. albicans exists, we use a text-mining system that utilizes full-text research papers to identify known regulatory interactions. By comparing with these known regulatory interactions, we find an optimal value for the global modelling parameters weighting the influence of the sparseness criterion and the prior knowledge. Furthermore, we show that soft integration of prior knowledge additionally improves the performance. Finally, we compare the performance of our approach to state-of-the-art network inference approaches.
Abstract: One of the determinants of a firm's prosperity is the customers' perceived service quality and satisfaction. While service quality is wide in scope and consists of various dimensions, these dimensions may differ in their relative importance in shaping customers' overall satisfaction with service quality. Identifying the relative rank of the different dimensions of service quality is very important, since it can help managers find out which service dimensions have a greater effect on customers' overall satisfaction. Such insight consequently leads to more effective resource allocation and, finally, to higher levels of customer satisfaction. This issue, despite its criticality, has not received enough attention so far. Therefore, using a sample of 240 bank customers in Iran, an artificial neural network is developed to address this gap in the literature. As customers' evaluation of service quality is a subjective process, artificial neural networks, as a brain metaphor, appear to have the potential to model such a complicated process. Proposing a neural network that can predict customers' overall satisfaction with service quality at a promising level of accuracy is the first contribution of this study. In addition, prioritizing the service quality dimensions by their effect on customers' overall satisfaction, using sensitivity analysis of the neural network, is the second important finding of this paper.
Abstract: In the current economy of increasing global
competition, many organizations are attempting to use knowledge as
one of the means to gain sustainable competitive advantage. Besides
large organizations, the success of SMEs can be linked to how well
they manage their knowledge. Despite the profusion of research on
knowledge management within large organizations, fewer studies have
analyzed KM in SMEs.
This research proposes a new framework showing the determinant
role of organizational dimensions in KM approaches. The paper
and its propositions are based on a literature review and analysis.
In this research, personalization versus codification,
individualization versus institutionalization, and IT-based versus
non-IT-based are highlighted as three distinct dimensions of
knowledge management approaches.
The study contributes to research by providing a more nuanced
classification of KM approaches and provides guidance to managers
about the types of KM approaches that should be adopted based on
the size, geographical dispersion and task nature of SMEs.
To the author's knowledge, this paper is the first of its kind to
examine whether there are suitable configurations of KM approaches
for SMEs with different dimensions. It provides valuable information
which will hopefully help the SME sector accomplish KM.
Abstract: Since the pioneering work of Zadeh, fuzzy set theory has been applied to a myriad of areas. Song and Chissom introduced the concept of fuzzy time series and applied some methods to the enrollments of the University of Alabama. In recent years, a number of techniques have been proposed for forecasting based on fuzzy set theory methods. These methods have used either enrollment numbers or differences of enrollments as the universe of discourse. In this communication, the approach of Jilani, Burney, and Ardil is modified by using the year-to-year percentage change as the universe of discourse instead. We use enrollment figures for the University of Alabama to illustrate the proposed method, which yields better forecasting accuracy than existing models.
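The percentage-change idea can be sketched end to end: take year-to-year percentage changes as the universe of discourse, partition it into equal-width intervals (fuzzy sets), and forecast with first-order fuzzy logical relations. The enrollment series in the example is illustrative, not the actual University of Alabama data, and defuzzifying by averaging successor-interval midpoints is an assumed simplification of the modified method.

```python
# Minimal first-order fuzzy time series model over percentage changes.

def pct_changes(series):
    """Year-to-year percentage changes: the universe of discourse."""
    return [100.0 * (b - a) / a for a, b in zip(series, series[1:])]

def fuzzify(changes, n_sets=5):
    """Partition the universe into equal intervals; label each change."""
    lo, hi = min(changes), max(changes)
    width = (hi - lo) / n_sets or 1.0
    labels = [min(int((c - lo) / width), n_sets - 1) for c in changes]
    mids = [lo + (i + 0.5) * width for i in range(n_sets)]
    return labels, mids

def forecast_next(series, n_sets=5):
    changes = pct_changes(series)
    labels, mids = fuzzify(changes, n_sets)
    # first-order fuzzy logical relations: label[t] -> label[t+1]
    successors = {}
    for a, b in zip(labels, labels[1:]):
        successors.setdefault(a, []).append(b)
    nxt = successors.get(labels[-1], [labels[-1]])
    predicted_change = sum(mids[s] for s in nxt) / len(nxt)  # midpoint average
    return series[-1] * (1 + predicted_change / 100.0)
```

Working in percentage changes keeps the universe of discourse stable even when the raw enrollment level drifts, which is the motivation for the modification.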
Abstract: The characteristics of ad hoc networks, and even their existence, depend on the nodes forming them. Thus, services and applications designed for ad hoc networks should adapt to this dynamic and distributed environment. In particular, multicast algorithms with reliability and scalability requirements should abstain from centralized approaches. We aim to define a reliable and scalable multicast protocol for ad hoc networks, utilizing epidemic techniques for this purpose. In this paper, we present a brief survey of epidemic algorithms for reliable multicasting in ad hoc networks and describe formulations and analytical results for simple epidemics. Then, a P2P anti-entropy algorithm for content distribution and our prototype simulation model are described, together with initial results demonstrating the behavior of the algorithm.
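The anti-entropy (simple epidemic) behavior can be sketched with a push-pull reconciliation loop in which every node gossips with one uniformly chosen peer per round; the ad hoc topology, message loss, and timing of the paper's simulation model are abstracted away in this illustration.

```python
# Push-pull anti-entropy sketch: an update injected at one node spreads
# until all nodes hold it; typically within O(log n) rounds.
import random

def anti_entropy(n_nodes=20, rounds=25, seed=3):
    rng = random.Random(seed)
    state = [set() for _ in range(n_nodes)]
    state[0] = {"update"}                 # one node starts with the update
    history = []
    for _ in range(rounds):
        for i in range(n_nodes):
            j = rng.randrange(n_nodes)    # pick a random gossip partner
            if j != i:
                # push-pull exchange: both nodes end up with the union
                merged = state[i] | state[j]
                state[i], state[j] = merged, set(merged)
        history.append(sum(1 for s in state if "update" in s))
    return history
```

The returned history is non-decreasing (content only grows under union), which is the monotone convergence property that makes anti-entropy attractive for reliable multicast.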