Abstract: This paper introduces an approach to constructing a set of criteria for evaluating alternative options. Content analysis was used to collect criterion elements; the elements were then classified and organized, yielding a hierarchical structure. The reliability of the constructed criteria was evaluated in an experiment. Finally, the criteria were used to evaluate alternative options in decision-making.
Abstract: Bangalore City is facing an acute problem of
atmospheric pollution due to the heavy increase in traffic
and developmental activities in recent years. The present study is an
attempt to assess the trend of the ambient air quality
status at three stations, viz., AMCO Batteries Factory, Mysore Road;
Graphite India Factory, KHB Industrial Area, Whitefield;
and Ananda Rao Circle, Gandhinagar, with respect to some of the
major criteria pollutants, namely Suspended Particulate Matter
(SPM), oxides of nitrogen (NOx), and sulphur dioxide (SO2). The
sites are representative of the various kinds of growth (commercial,
residential and industrial) prevailing in Bangalore, which are
contributing to air pollution. The concentration of sulphur dioxide
(SO2) at all locations showed a falling trend due to the use of refined
petrol and diesel in recent years. The concentration of oxides of
nitrogen (NOx) showed an increasing trend but remained within the
permissible limits. The concentration of Suspended Particulate
Matter (SPM) showed a mixed trend. The correlation between
model and observed values is found to vary from 0.4 to 0.7 for SO2,
0.45 to 0.65 for NOx and 0.4 to 0.6 for SPM. About 80% of the data
is observed to fall within the error band of ±50%. Forecast tests for the
best-fit models showed the same trend as the actual values in most
cases. However, the deviation observed in a few cases could be
attributed to changes in the quality of petroleum products, increase in
the volume of traffic, introduction of LPG as a fuel in many types of
automobiles, poor condition of roads, prevailing meteorological
conditions, etc.
Abstract: The dramatic increase in sea-freight container
transportation and the trend toward using containers in
multimodal handling systems across sea, rail and road in
today's market force general managers of container terminals to
face challenges such as increasing demand, a competitive situation,
new investments and expansion of activities, and to seek new
methods for running operations effectively both along the quayside
and within the yard. Among these issues, minimizing the turnaround
time of vessels is considered the primary aim of every container port
system. Given the complex structure of container ports, this
paper presents a simulation model that calculates the number of
trucks needed in the Iranian Shahid Rajaee Container Port for
handling containers between the berth and the yard. In this research,
some important criteria such as vessel turnaround time, gantry crane
utilization and truck utilization have been considered. By analyzing
the results of the model, it has been shown that increasing the number
of trucks to 66 units has a significant effect on the performance
indices of the port and can increase loading and unloading capacity
by up to 10.8%.
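The berth-yard truck sizing question can be illustrated with a deliberately simplified capacity model (not the paper's simulation; the crane rate, number of cranes and truck cycle time below are all hypothetical):

```python
# Deliberately simplified berth-yard capacity model; all parameters
# (crane rate, number of cranes, truck cycle time) are hypothetical.

def port_throughput(n_trucks, crane_moves_per_h=30, n_cranes=2,
                    truck_cycle_min=12.0):
    """Containers per hour the berth-yard loop can sustain."""
    crane_cap = n_cranes * crane_moves_per_h        # quayside limit
    truck_cap = n_trucks * 60.0 / truck_cycle_min   # one container per cycle
    return min(crane_cap, truck_cap)

def turnaround_hours(n_containers, n_trucks, **kw):
    """Time to exchange a vessel's containers at that throughput."""
    return n_containers / port_throughput(n_trucks, **kw)
```

In such a model, adding trucks raises throughput only until the crane side becomes the bottleneck; this saturation effect is the kind of behaviour the simulation study quantifies when it evaluates the 66-truck fleet.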
Abstract: The growing influence of service industries has
prompted greater attention being paid to service operations
management. However, service managers often have difficulty
articulating the veritable effects of their service innovations. In
particular, the performance evaluation of service innovation
generally involves uncertain and imprecise data. This paper presents a
2-tuple fuzzy linguistic computing approach to dealing with
heterogeneous information and information-loss problems during
the integration of subjective evaluations. The proposed method,
built on a group decision-making scenario to assist business managers
in measuring the performance of service innovation, handles the
integration of heterogeneous information and effectively avoids
information loss.
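The 2-tuple linguistic model represents an assessment as a pair (label index, symbolic translation), so aggregation can stay on a continuous scale without rounding away information. A minimal sketch, assuming a five-label scale (the label set and equal weights are illustrative, not from the paper):

```python
import math

# Illustrative five-label linguistic scale (an assumption, not the paper's).
S = ["very poor", "poor", "fair", "good", "very good"]

def to_2tuple(beta):
    """Delta: map beta in [0, len(S)-1] to (label index, symbolic translation)."""
    i = int(math.floor(beta + 0.5))      # nearest label
    return i, beta - i                   # translation in [-0.5, 0.5)

def from_2tuple(i, alpha):
    """Delta^-1: recover the underlying numeric value."""
    return i + alpha

def aggregate(tuples, weights=None):
    """Weighted mean of 2-tuples; nothing is rounded away mid-computation."""
    if weights is None:
        weights = [1.0 / len(tuples)] * len(tuples)
    beta = sum(w * from_2tuple(i, a) for (i, a), w in zip(tuples, weights))
    return to_2tuple(beta)
```

For example, aggregating "fair", "good" and "very good" with equal weights yields "good" with a near-zero translation, whereas rounding to a plain label after every step would lose exactly the fractional information the 2-tuple carries.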
Abstract: Ren et al. presented an efficient carrier frequency offset
(CFO) estimation method for orthogonal frequency division multiplexing
(OFDM), which has an estimation range as large as the
bandwidth of the OFDM signal and achieves high accuracy without
any constraint on the structure of the training sequence. However,
its detection probability of the integer frequency offset (IFO) rapidly
varies according to the fractional frequency offset (FFO) change. In
this paper, we first analyze Ren's method and define two criteria
suitable for detection of IFO. Then, we propose a novel method for
the IFO estimation based on the maximum-likelihood (ML) principle
and the detection criteria defined in this paper. The simulation results
demonstrate that the proposed method outperforms Ren's method
in terms of the IFO detection probability irrespective of the value of
the FFO.
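As a generic illustration of ML-style IFO detection (not Ren's method or the proposed method themselves), an integer offset cyclically shifts the subcarriers of a known training symbol, so it can be detected by correlating the received spectrum against all cyclic shifts of the reference. The training sequence and FFT size below are arbitrary choices:

```python
import cmath

N = 8                                   # illustrative FFT size
X = [1, -1, 1, 2, -1, 1, -1, -1]        # arbitrary known training spectrum

def dft(x):
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def idft(X):
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)) / N for n in range(N)]

def estimate_ifo(y):
    """Pick the cyclic shift whose shifted reference best matches DFT(y)."""
    Y = dft(y)
    def score(d):
        # X is real here, so no conjugation of the reference is needed.
        return abs(sum(Y[k] * X[(k - d) % N] for k in range(N)))
    return max(range(N), key=score)
```

An IFO of d subcarriers multiplies the time-domain samples by exp(j2πdn/N), which is exactly a cyclic shift of the spectrum; the correlator above undoes it.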
Abstract: One of the criteria in production scheduling is the
makespan; minimizing this criterion leads to more efficient use of
resources, especially machinery and manpower. By assigning some
budget to some of the operations, the operation times of these
activities are reduced, which affects the total completion time of all
operations (the makespan). In this paper this issue is studied for
parallel flow shops. We first convert the parallel flow shop to a
network model; then, using a linear programming approach, we
identify which activities (operations) should absorb the
predetermined and limited budget in order to minimize the makespan
(the completion time of the network). Minimizing the total
completion time of all activities in the network is equivalent to
minimizing the makespan in production scheduling.
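The budget-allocation idea, spending a limited budget on crashing the operations that shorten the longest path, can be sketched on a toy two-path network. The instance data are invented, and exhaustive search stands in for the paper's linear programming formulation:

```python
from itertools import product

# Invented two-path network: path A->B in parallel with path C->D.
# activity: (normal duration, max crash units, cost per crashed unit)
acts = {"A": (6, 2, 1.0), "B": (5, 1, 2.0), "C": (7, 3, 1.5), "D": (4, 0, 0.0)}
paths = [["A", "B"], ["C", "D"]]
budget = 4.0

def makespan(crash):
    """Length of the longest path after crashing each activity."""
    dur = {a: acts[a][0] - crash[a] for a in acts}
    return max(sum(dur[a] for a in p) for p in paths)

# Exhaustive search over feasible crash allocations (stands in for the LP).
best = None
for units in product(*(range(acts[a][1] + 1) for a in acts)):
    crash = dict(zip(acts, units))
    cost = sum(acts[a][2] * crash[a] for a in acts)
    if cost <= budget and (best is None or makespan(crash) < best[0]):
        best = (makespan(crash), crash, cost)
```

Here both paths start at length 11; the budget of 4.0 buys the makespan down to 10, since cutting both paths to 9 would cost 5.0. The LP in the paper answers the same "which activities absorb the budget" question at scale.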
Abstract: As the research project and article "European
Ecological Network Natura 2000 – opportunities and threats"
shows, because Natura 2000 sites constitute a form of environmental
protection, several legal problems are likely to result. Most
controversially, certain sites will be subject to two regimes of
protection: as national parks and as Natura 2000 sites. This dualism
of the legal regulation makes it difficult to perform certain legal
obligations related to the regimes envisaged under each form of
environmental protection. Which regime and which obligations
resulting from the particular form of environmental protection have
priority and should prevail? What should be done if these obligations
are contradictory? Furthermore, an institutional problem consists in
that no public administration authority has the power to resolve legal
conflicts concerning the application of a particular regime on a given
site. There are also no criteria to decide priority and superiority of
one form of environmental protection over the other. Which
regulations are more important, those that pertain to national parks or
to Natura 2000 sites? In the light of the current regulations, it is
impossible to give a decisive answer to these questions. The internal
hierarchy of forms of environmental protection has not been
determined, and all such forms should be treated equally.
Abstract: This paper presents the significant factors and gives
some suggestions that one should know before design. Its main
objective is to guide the first steps of someone who intends to design
a grounding system, before studying the details later. A well-designed
grounding system protects against damage from faults; in particular,
it can save human lives and power-system equipment. Unsafe
conditions fall into three cases. Case 1: the maximum touch voltage
exceeds the safety criteria. Here, the conductor compression ratio of
the ground grid should first be adjusted to obtain optimal spacing of
the ground grid conductors; if the voltage is still over the limit, the
earth resistivity should be considered next. Case 2: the maximum
step voltage exceeds the safety criteria. Here, increasing the number
of ground grid conductors around the boundary can solve the
problem. Case 3: both the maximum touch and step voltages exceed
the safety criteria. Here, the solutions of cases 1 and 2 should be
followed; another option is to vary the depth of the ground grid until
the maximum step and touch voltages no longer exceed the safety
criteria.
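The safety criteria referred to in the three cases are typically the tolerable touch and step voltages of IEEE Std 80 (here the 50 kg body-weight criterion). A small checker, with illustrative surface-layer parameters, might look like:

```python
import math

def touch_limit_50kg(rho_s, Cs, ts):
    """Tolerable touch voltage (V), IEEE Std 80, 50 kg criterion."""
    return (1000 + 1.5 * Cs * rho_s) * 0.116 / math.sqrt(ts)

def step_limit_50kg(rho_s, Cs, ts):
    """Tolerable step voltage (V), IEEE Std 80, 50 kg criterion."""
    return (1000 + 6.0 * Cs * rho_s) * 0.116 / math.sqrt(ts)

def classify(e_touch, e_step, rho_s=3000.0, Cs=0.7, ts=0.5):
    """Map computed grid voltages to the three unsafe cases above."""
    over_t = e_touch > touch_limit_50kg(rho_s, Cs, ts)
    over_s = e_step > step_limit_50kg(rho_s, Cs, ts)
    return {(False, False): "safe",
            (True, False): "case 1: touch voltage over limit",
            (False, True): "case 2: step voltage over limit",
            (True, True): "case 3: both over limit"}[(over_t, over_s)]
```

Here rho_s is the surface-layer resistivity, Cs its derating factor and ts the fault clearing time; the default values are illustrative only and would come from the actual site and protection settings.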
Abstract: The influence of octane and benzene on plant cell
ultrastructure and on enzymes of basic metabolism, such as nitrogen
assimilation and energy generation, has been studied. Different
plants: perennial ryegrass (Lolium perenne) and alfalfa (Medicago
sativa); crops- maize (Zea mays L.) and bean (Phaseolus vulgaris);
shrubs – privet (Ligustrum sempervirens) and trifoliate orange
(Poncirus trifoliata); trees - poplar (Populus deltoides) and white
mulberry (Morus alba L.) were exposed to hydrocarbons of different
concentrations (1, 10 and 100 mM). Destructive changes in the
ultrastructure of bean and maize leaf cells under the influence of
benzene vapour were revealed at the level of the photosynthetic and
energy-generating subcellular organelles. Various deviations in the
structure and distribution of subcellular organelles were observed in
alfalfa and ryegrass root cells under the influence of benzene and
octane absorbed through the roots. The level of destructive changes is
concentration dependent. Benzene at low concentrations (1 and
10 mM) caused an increase in glutamate dehydrogenase (GDH)
activity in maize roots and leaves and in poplar and mulberry shoots,
to a higher extent at the lower, 1 mM concentration. The induction
was more intensive in plant roots. The highest tested concentration of
benzene, 100 mM, was inhibitory to the enzyme in all plants. Octane
caused induction of GDH in all grassy plants at all tested
concentrations; however, the rate of induction decreased as the
hydrocarbon concentration increased. Octane at a concentration of
1 mM caused induction of GDH in privet, trifoliate orange and white
mulberry shoots. The highest octane concentration, 100 mM, had an
inhibitory effect on GDH activity in all plants. Octane had an
inductive effect on malate dehydrogenase in almost all plants and
tested concentrations, indicating an intensification of the
Tricarboxylic Acid Cycle.
These data could inform the elaboration of criteria for selecting
plants for the phytoremediation of soils contaminated with oil
hydrocarbons.
Abstract: The effective machine-job assignment of injection
molding machines is very important for industry because it directly
affects not only the quality of the product but also the performance
and lifetime of the machine. Machine selection has mostly been done
by professionals or experienced planners, so a job may be matched
with an inappropriate machine when the selection is made by an
inexperienced person, leading to an uneconomical plan and defects.
This research aimed to develop a machine selection system for
plastic injection machines as a decision-support tool for the user,
usable both in normal times and in times of emergency. Fuzzy logic
is applied to deal with uncertainty and mechanical factors in the
selection, covering both quantitative and qualitative criteria. Six
criteria were obtained from a plastic manufacturer's case study to
construct a system based on fuzzy logic theory using MATLAB. The
results showed that the system was able to reduce the Short Shot and
Sink Mark defects to 24.0% and 8.0%, respectively, and total defects
were reduced by around 8.7% per month.
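The fuzzy part of such a selection system can be sketched with triangular membership functions combined by a min (AND) rule. The two criteria, their membership shapes and all numbers below are hypothetical stand-ins for the paper's six case-study criteria:

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical criteria: clamping-force margin (tons) and barrel usage ratio.
def clamp_force_fit(required, machine_force):
    return tri(machine_force - required, 0, 50, 150)          # "adequate margin"

def shot_size_fit(part_weight, barrel_capacity):
    return tri(part_weight / barrel_capacity, 0.2, 0.5, 0.8)  # "healthy usage"

def machine_score(required_force, part_weight, machine):
    """AND (min) of criterion memberships, as in a Mamdani-style rule."""
    force, capacity = machine
    return min(clamp_force_fit(required_force, force),
               shot_size_fit(part_weight, capacity))
```

The machine with the highest score is recommended; note that an oversized machine scores low on the margin criterion just as an undersized one does on capacity, which is how fuzzy criteria capture "fit" rather than "bigger is better".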
Abstract: Consumers' motivations for sharing viral
advertisements and the impact of these advertisements on brand
perceptions will be examined in this study. Three fundamental
questions are answered: individuals' motivations for watching and
sharing advertisements, the criteria for liking a viral advertisement,
and the impact of individuals' attitudes toward viral advertisements
on brand perception. The study will be carried out via a viral
advertisement that was run in Turkey. The data will be collected by
an online survey; the sample consists of individuals who experienced
the sample advertisement, and the data will be analyzed using the
SPSS statistical package.
Recently, the traditional advertising mindset has been changing,
and new advertising approaches with significant impacts on
consumers have been discussed. Viral advertising is a modern
advertising approach that offers brands significant advantages over
traditional advertising channels such as television, radio and
magazines. Viral advertising, also known as Electronic Word-of-
Mouth (eWOM), consists of the free spread of convincing messages
sent by brands through interpersonal communication. Compared to
traditional advertising, it takes a more provocative thematic
approach.
The foundation of this approach is to create advertisements that
consumers consider worth sharing with others. In this light, it can be
said that viral advertising is, in a manner of speaking, media
engineering.
Content worth sharing makes people voluntary spokespersons for
a brand and strengthens the emotional bond between brand and
consumer. Especially in sectors and countries with limitations on
traditional advertising channels, viral advertising creates vital
advantages.
Abstract: This research deals with a flexible flowshop
scheduling problem with arrival and delivery of jobs in groups and
processing them individually. Due to the special characteristics of
each job, only a subset of machines in each stage is eligible to
process that job. The objective function minimizes, on the one hand,
the sum of the completion times of the groups and, on the other
hand, the sum of the differences between the completion time of
each job and the delivery time of the group containing that job (the
waiting period). The problem can be stated as FFc / rj , Mj / irreg,
which has many applications in production and service industries. A
mathematical model is proposed, the problem is proved to be
NP-complete, and an effective heuristic method is presented to
schedule the jobs efficiently. This algorithm can then be used within
the body of any metaheuristic algorithm for solving the problem.
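A heuristic of the kind described must respect machine eligibility at each stage. A minimal single-stage sketch (not the paper's algorithm) dispatches each job, in release order, to the eligible machine with the earliest completion time:

```python
def list_schedule(jobs, eligible):
    """Greedy dispatch: each job, in release order, goes to the eligible
    machine that can finish it earliest. jobs: list of (job_id, proc_time);
    eligible: job_id -> list of machine ids allowed to process it."""
    free_at = {}      # machine -> time it becomes free
    completion = {}
    for job, p in jobs:
        m = min(eligible[job], key=lambda k: free_at.get(k, 0.0))
        start = free_at.get(m, 0.0)
        free_at[m] = start + p
        completion[job] = start + p
    return completion
```

Group release and delivery times are ignored here; the sketch only illustrates the eligibility-aware dispatch step that a metaheuristic would invoke repeatedly while searching over job orders.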
Abstract: The prevalence of non-organic constipation differs
from country to country, and the reliability of the estimated rates is
uncertain. Moreover, the clinical relevance of subdividing the
heterogeneous functional constipation disorders into pre-defined
subgroups is largely unknown. Aim: to estimate the prevalence of
constipation in a population-based sample and determine whether
clinical subgroups can be identified. An age and gender stratified
sample population from 5 Italian cities was evaluated using a
previously validated questionnaire. Data mining by cluster analysis
was used to determine constipation subgroups. Results: 1,500
complete interviews were obtained from 2,083 contacted households
(72%). Self-reported constipation correlated poorly with symptom-
based constipation, which was found in 496 subjects (33.1%). Cluster
analysis
identified four constipation subgroups which correlated to subgroups
identified according to pre-defined symptom criteria. Significant
differences in socio-demographics and lifestyle were observed
among subgroups.
Abstract: One of the main processes of supply chain
management is supplier selection, whose accurate implementation
can dramatically increase a company's competitiveness. In the
presented article, a model is developed based on the features of
second-tier suppliers, and four scenarios are predicted in order to
help the decision maker (DM) make up his or her mind. In addition,
two tiers of suppliers have been considered as a chain of suppliers.
The proposed approach is then solved by a method combining
concepts of fuzzy set theory (FST) and linear programming (LP),
nourished by real data extracted from an engineering design and
parts supplying company. Finally, the results reveal the high
importance of considering the features of second-tier suppliers as
criteria for selecting the best supplier.
Abstract: The majority of Business Software Systems (BSS)
Development and Enhancement Projects (D&EP) fail to meet their
effectiveness criteria, which leads to considerable financial losses.
One of the fundamental reasons for such projects' exceptionally low
success rate is improperly derived estimates of their costs and time.
In the case of BSS D&EP these attributes are determined by the work
effort, yet reliable and objective effort estimation still appears
to be a great challenge to software engineering. This paper is
therefore aimed at presenting the most important synthetic
conclusions coming from the author's own studies concerning the
main factors of effective BSS D&EP work effort estimation. Thanks
to rational investment decisions made on the basis of reliable and
objective criteria, it is possible to reduce the losses caused not only
by abandoned projects but also by large overruns of the time and
costs of BSS D&EP execution.
Abstract: This paper presents an experimental and numerical study of the airflow characteristics of vortex, round and square ceiling diffusers and their effect on thermal comfort in a ventilated room. Three thermal comfort criteria, namely the Mean Age of the Air (MAA), ventilation effectiveness (E), and Effective Draft Temperature (EDT), have been used to predict the thermal comfort zone inside the room. In the experimental work, a sub-scale room was set up to measure the temperature field in the room. In the numerical analysis, unstructured grids were used to discretize the numerical domain, and the conservation equations were solved using the FLUENT commercial flow solver. The code was validated by comparing the numerical results obtained from three different turbulence models with the available experimental data. The comparison between the models shows that the standard k-ε turbulence model can simulate these cases successfully. After validation of the code, the effect of supply air velocity on the flow and thermal fields, and hence on thermal comfort, was investigated. The results show that the pressure coefficient created by the square diffuser is 1.5 times greater than that created by the vortex diffuser. The velocity decay coefficient is nearly the same for the square and round diffusers and is 2.6 times greater than that for the vortex diffuser.
Abstract: This paper considers a multi criteria cell formation
problem in Cellular Manufacturing System (CMS). Minimizing the
number of voids and exceptional elements in cells simultaneously are
two proposed objective functions. According to the literature this
problem is NP-hard, and therefore we cannot find the optimal
solution by an exact method. In this paper we develop two ant
algorithms, Ant Colony Optimization (ACO) and Max-Min Ant
System (MMAS), based on Data Envelopment Analysis (DEA). Both
try to find efficient solutions based on the efficiency concept in
DEA: each artificial ant is considered a Decision Making Unit
(DMU) with two inputs, the values of the objective functions, and
one output, with the value of one for all DMUs. To evaluate the
performance of the proposed methods, we provide an experimental
design with empirical problems of three different sizes (small,
medium and large) and define three criteria that show which
algorithm performs best.
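With one identical unit output per DMU, DEA efficiency of an ant reduces to non-domination of its two inputs (the two objective values): a DMU can be efficient only if no other DMU is at least as good on both inputs and strictly better on one. A sketch of that filter:

```python
def dea_efficient(points):
    """points: (objective1, objective2) input pairs, one per DMU/ant.
    With equal unit outputs, a DMU is DEA-efficient iff no other DMU
    weakly dominates it (<= on both inputs, strict on at least one)."""
    eff = []
    for i, (v, e) in enumerate(points):
        dominated = any(v2 <= v and e2 <= e and (v2, e2) != (v, e)
                        for j, (v2, e2) in enumerate(points) if j != i)
        if not dominated:
            eff.append((v, e))
    return eff
```

In the cell-formation setting the two inputs would be the number of voids and the number of exceptional elements of a candidate cell configuration; the ants whose (voids, exceptionals) pairs survive this filter form the efficient frontier the colony reinforces.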
Abstract: Owing to the fact that optimization of business
processes is a crucial requirement to navigate, survive and even
thrive in today's volatile business environment, this paper presents a
framework for selecting a best-fit optimization package for solving
complex business problems. The complexity level of the problem
and/or using incorrect optimization software can lead to biased
solutions of the optimization problem. Accordingly, the proposed
framework identifies a number of relevant factors (e.g. decision
variables, objective functions, and modeling approach) to be
considered during the evaluation and selection process. The
application domain, problem specifications, and available accredited
optimization approaches are also regarded. The output of the
framework is a recommendation of one or two optimization
packages believed to provide the best results for the underlying
problem. In addition, a set of guidelines and recommendations on
how managers can conduct an effective optimization exercise is
discussed.
Abstract: This work investigates the possibility of constructing
classifiers for face-recognition systems based on conjugation
criteria. The linkage between bipartite conjugation, conjugation with
a subspace and conjugation with the null space is shown. A unified
decision rule is investigated, which assigns a face to a class by
considering the linkage between conjugation values. The described
recognition method can be successfully applied to distributed
systems of video control and observation.
Abstract: Trust is essential for further and wider acceptance of
contemporary e-services. It was first addressed almost thirty years
ago, in the Trusted Computer System Evaluation Criteria standard
of the US DoD. But this and other approaches proposed in that
period were actually addressing security. Roughly ten years ago,
methodologies followed that addressed the trust phenomenon at its
core; they were based on Bayesian statistics and its derivatives, while
some approaches were based on game theory. However, trust is a
manifestation of judgment and reasoning processes; it has to be dealt
with accordingly and adequately supported in the cyber
environment. On the basis of results in the field of psychology and
our own findings, a methodology called qualitative algebra has been
developed, which deals with so-far overlooked elements of the trust
phenomenon. It complements existing methodologies and provides a
basis for a practical technical solution that supports the management
of trust in contemporary computing environments. Such a solution is
presented at the end of this paper.