Abstract: This paper deals with the gasification of biomass and sorted
municipal waste and with cogeneration using a hot-air turbo-set. It
describes a designed pilot plant with an electrical output of 80 kWe.
The generated gas is burned in a secondary combustion chamber located
beyond the gas generator. The flue gas flows through a heat exchanger
where compressed air is heated and subsequently delivered to a micro
turbine. Beyond this description, the paper presents our basic
experience from operating the pilot plant (operating parameters,
benefits, problems during operation, etc.). The principal advantage of
the given cycle is that there is no contact between the generated gas
and the turbine, so there is no need for the costly and complicated gas
cleaning that is the main source of operating problems when the gas is
used directly in combustion engines: impurities in the gas clog and tar
the working surfaces of engines and turbines, which may lead to serious
damage to the equipment in operation. Another merit is the compact
container package, which makes installation of the facility easier and
the unit relatively mobile. We believe this solution for cogeneration
from biomass or waste can be suitable for small industrial or communal
applications requiring low-output cogeneration.
Abstract: Over recent years, many efforts and studies have been devoted
to developing proficient tools for performing various tasks on big
data. Big data has recently received a great deal of publicity, and for
good reason: large and complex collections of datasets are difficult to
process with traditional data processing applications, which makes the
development of dedicated big data tools all the more necessary. The
main aim of big data analytics is to apply advanced analytic techniques
to very large, heterogeneous datasets whose sizes range from terabytes
to zettabytes and whose types may be structured or unstructured and
batch or streaming. Big data techniques are useful for data sets whose
size or type is beyond the capability of traditional relational
databases to capture, manage, and process with low latency. The
resulting challenges have led to the emergence of powerful big data
tools. In this survey, a varied collection of big data tools is
described and compared with respect to their salient features.
Abstract: In this study, a three-dimensional numerical heat transfer
model has been used to simulate the laser structuring of the polymer
substrate material in Three-Dimensional Molded Interconnect Devices
(3D MID), which are used in advanced multifunctional applications. A
finite element method (FEM) transient thermal analysis is performed
using APDL (ANSYS Parametric Design Language) provided by ANSYS. In
this model, the surface heat source is modeled with a Gaussian
distribution, and mixed boundary conditions consisting of convection
and radiation heat transfer are also considered in the analysis. The
model provides a full description of the temperature distribution and
calculates the depth and width of the groove produced by material
removal at different sets of laser parameters, such as laser power and
laser speed. The study also includes an experimental procedure to
examine the effect of the laser parameters on the depth and width of
the removed groove, as verification of the modeled results. Good
agreement between the experimental and model results is achieved over
a wide range of laser powers. It is found that the quality of the laser
structuring process is affected by the laser scan speed and laser
power; for high structuring quality, it is suggested to use a high
scan speed and moderate to high laser power.
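As an aside, the Gaussian surface heat source used in such laser models is typically written q(r) = (2P/(πw²))·exp(−2r²/w²), where P is the laser power and w the beam radius. A minimal sketch of this profile follows; the numerical values for P and w are illustrative assumptions, not parameters taken from this study:

```python
import math

def gaussian_heat_flux(r, power, beam_radius):
    """Surface heat flux (W/m^2) of a Gaussian laser beam at radial distance r (m)."""
    peak = 2.0 * power / (math.pi * beam_radius ** 2)
    return peak * math.exp(-2.0 * (r / beam_radius) ** 2)

# Illustrative values (not from the paper): a 10 W laser with a 25 µm beam radius.
P, w = 10.0, 25e-6
print(gaussian_heat_flux(0.0, P, w))  # peak flux at the beam centre
print(gaussian_heat_flux(w, P, w))    # flux one beam radius away
```

Integrating q(r)·2πr over the surface recovers the total power P, which is a quick sanity check on any implementation of this source term.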
Abstract: The estimation of a proportion has many applications in
economics and social studies. A common application is the estimation
of the low income proportion, which gives the proportion of people
classified as poor within a population. In this paper, we present this
poverty indicator and propose the logistic regression estimator for
the problem of estimating the low income proportion. Various sampling
designs are presented. Using a real data set obtained from the
European Survey on Income and Living Conditions, Monte Carlo
simulation studies are carried out to analyze the empirical
performance of the logistic regression estimator under the various
sampling designs considered. The results indicate that the logistic
regression estimator can be more accurate than the customary estimator
under these sampling designs, and that the stratified sampling design
can provide additional gains in accuracy.
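To illustrate the idea behind a logistic regression estimator of a proportion, the sketch below fits a logistic model of the poverty indicator on an auxiliary variable using only a sample, then averages the predicted probabilities over the whole population. All data here are synthetic, the auxiliary variable and simple random sampling design are assumptions for illustration, and this is not the paper's simulation setup:

```python
import math
import random

def fit_logistic(x, y, lr=0.5, iters=3000):
    """Fit P(y=1|x) = 1/(1+exp(-(a + b*x))) by plain gradient descent."""
    a = b = 0.0
    n = len(x)
    for _ in range(iters):
        ga = gb = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(a + b * xi)))
            ga += p - yi
            gb += (p - yi) * xi
        a -= lr * ga / n
        b -= lr * gb / n
    return a, b

random.seed(1)
N = 2000
# Hypothetical population: "poor" means income below 60% of the median.
incomes = [random.lognormvariate(10.0, 0.5) for _ in range(N)]
threshold = 0.6 * sorted(incomes)[N // 2]
poor = [1 if inc < threshold else 0 for inc in incomes]
# Auxiliary variable known for every unit (noisy log-income), standardised.
aux = [math.log(inc) + random.gauss(0.0, 0.2) for inc in incomes]
mu = sum(aux) / N
sd = (sum((v - mu) ** 2 for v in aux) / N) ** 0.5
z = [(v - mu) / sd for v in aux]

# Fit the logistic model on a simple random sample only...
sample = random.sample(range(N), 200)
a, b = fit_logistic([z[i] for i in sample], [poor[i] for i in sample])

# ...then average the predicted probabilities over the whole population.
p_lr = sum(1.0 / (1.0 + math.exp(-(a + b * zi))) for zi in z) / N
p_true = sum(poor) / N
print(p_lr, p_true)
```

The estimator borrows strength from the auxiliary variable observed on every population unit, which is what makes it competitive with the customary (direct) sample proportion.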
Abstract: This paper aims at introducing finite automata theory and
the different ways to describe regular languages, and at creating a
program that implements the subset construction algorithm to convert
nondeterministic finite automata (NFA) to deterministic finite
automata (DFA). The program is written in the C++ programming
language. It reads an FA 5-tuple from a text file and classifies it as
either a DFA or an NFA. For a DFA, the program reads a string w and
decides whether it is accepted or not; if accepted, the program saves
and reports the tracking path. When the automaton is an NFA, the
program converts it to a DFA so that it is easy to track and can
decide whether w belongs to the regular language or not.
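The subset construction at the heart of such a program can be sketched as follows. The example NFA is hypothetical, and the sketch is given in Python rather than the paper's C++ for brevity; epsilon moves are omitted for simplicity:

```python
from collections import deque

def nfa_to_dfa(alphabet, delta, start, accept):
    """Subset construction: convert an NFA (without epsilon moves) to a DFA.
    delta maps (state, symbol) -> set of successor states."""
    start_set = frozenset([start])
    dfa_delta, accept_sets = {}, set()
    queue, seen = deque([start_set]), {start_set}
    while queue:
        current = queue.popleft()
        if current & accept:          # any NFA accept state makes the set accepting
            accept_sets.add(current)
        for sym in alphabet:
            nxt = frozenset(t for s in current for t in delta.get((s, sym), set()))
            dfa_delta[(current, sym)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return start_set, dfa_delta, accept_sets

def dfa_accepts(start, dfa_delta, accept_sets, w):
    state = start
    for sym in w:                     # deterministic: one transition per symbol
        state = dfa_delta[(state, sym)]
    return state in accept_sets

# Hypothetical example NFA over {a, b}: accepts strings ending in "ab".
delta = {('q0', 'a'): {'q0', 'q1'}, ('q0', 'b'): {'q0'}, ('q1', 'b'): {'q2'}}
start, d, acc = nfa_to_dfa(['a', 'b'], delta, 'q0', {'q2'})
print(dfa_accepts(start, d, acc, 'aab'))  # True
print(dfa_accepts(start, d, acc, 'aba'))  # False
```

Each DFA state is a set of NFA states, so at most 2^n DFA states can arise from an n-state NFA, though far fewer are usually reachable.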
Abstract: This paper aims at finding a suitable neural network
for monitoring congestion level in electrical power systems. In this
paper, the input data has been framed properly to meet the target
objective through supervised learning mechanism by defining normal
and abnormal operating conditions for the system under study. The
congestion level, expressed as line congestion index (LCI), is
evaluated for each operating condition and is presented to the NN
along with the bus voltages to represent the input and target data.
Once training is successful, the NN learns how to deal with newly
presented data through the validation and testing mechanisms. The crux
of the results presented in this paper rests on a performance
comparison of a multi-layered feed-forward neural network with eleven
types of backpropagation techniques, so as to determine the best
training criterion. The proposed methodology has been tested on the
standard IEEE 14-bus test system with the support of the MATLAB NN
toolbox. The results show that the Levenberg-Marquardt backpropagation
algorithm gives the best training performance of all eleven cases
considered, thus validating the proposed methodology.
Abstract: The European Union Survey on Income and Living
Conditions (EU-SILC) is a popular survey which provides
information on income, poverty, social exclusion and living
conditions of households and individuals in the European Union.
The EU-SILC contains variables which may include outliers, and the
presence of outliers can affect the measures and indicators derived
from the EU-SILC. In this paper, we use data sets from various
countries to analyze the presence of outliers. In addition, we compute
several indicators after removing these outliers, so that both
situations can be compared. Finally, some conclusions are drawn.
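One common way to flag outliers in income-type variables is Tukey's interquartile-range rule. The sketch below is illustrative only (a crude quartile computation and toy data), not necessarily the detection procedure used on the EU-SILC:

```python
def remove_iqr_outliers(values, k=1.5):
    """Drop observations outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's rule).
    Quartiles are taken at crude rank positions, sufficient for a sketch."""
    s = sorted(values)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if lo <= v <= hi]

# Hypothetical income sample (thousands) with one extreme value.
incomes = [18, 21, 22, 24, 25, 26, 27, 29, 31, 900]
clean = remove_iqr_outliers(incomes)
print(sum(incomes) / len(incomes))  # mean with the outlier
print(sum(clean) / len(clean))      # mean after outlier removal
```

Comparing an indicator (here, simply the mean) before and after removal shows how strongly a single extreme observation can distort it.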
Abstract: This research paper aims to identify, analyze and rank
factors affecting labor productivity in Spain with respect to their
relative importance. Using a selected set of 35 factors, a structured
questionnaire survey was utilized as the method to collect data from
companies. The target population comprises a random, representative
sample of practitioners in the Spanish construction industry. Findings
reveal that the top five ranked factors are
as follows: (1) shortage or late supply of materials; (2) clarity of the
drawings and project documents; (3) clear and daily task assignment;
(4) tools or equipment shortages; (5) level of skill and experience of
laborers. Additionally, this research aims to provide simple and
comprehensive recommendations that construction managers can
implement for the effective management of construction labor forces.
Abstract: An ultra-low-power, capacitorless low-dropout (LDO) voltage
regulator with improved transient response using gain-enhanced
feed-forward path compensation is presented in this paper. It is based
on a cascade of a voltage amplifier and a transconductor stage in the
feed-forward path with the regular error amplifier, forming a
composite gain-enhanced feed-forward stage. This broadens the gain
bandwidth and thus improves the transient response without a
substantial increase in power consumption. The proposed LDO, designed
for a maximum output current of 100 mA in UMC 180 nm, requires a
quiescent current of 69 µA. It exhibits an undershoot of 153.79 mV for
a load current change from 0 mA to 100 mA and an overshoot of
196.24 mV for a change from 100 mA to 0 mA. The settling time is
approximately 1.1 µs for the output voltage undershoot case. The load
regulation is 2.77 µV/mA at a load current of 100 mA. The reference
voltage is generated by an accurate 0.8 V bandgap reference circuit.
The costly features of an SOC, total chip area and power consumption,
are drastically reduced by using only 6 pF of total compensation
capacitance while consuming 0.096 mW.
Abstract: Composite materials have significant advantages over
traditional materials, bringing functional benefits such as lightness
and mechanical and chemical resistance. In the present study we
examine the effect of a circular central notch and a precrack on the
tensile fracture of two woven composite materials. Tensile tests were
applied to standardized specimens, notched and precracked (crack
orientations of 0°, 45° and 90°). These tensile tests followed an
experimental design of type 2³·3¹, requiring 24 experiments with three
repetitions. By regression analysis, we obtained a mathematical model
describing the maximum load as a function of the influential
parameters (hole diameter, precrack length, and precrack orientation
angle). The specimens precracked at 90° behave better than those
precracked at 45°, and better still than those with precracks oriented
at 0°. In addition, the maximum load is inversely proportional to the
notch size.
Abstract: Frequent pattern mining is the process of finding a
pattern (a set of items, subsequences, substructures, etc.) that occurs
frequently in a data set. It was proposed in the context of frequent
itemsets and association rule mining. Frequent pattern mining is used
to find inherent regularities in data. What products were often
purchased together? Its applications include basket data analysis,
cross-marketing, catalog design, sale campaign analysis, Web log
(click stream) analysis, and DNA sequence analysis. However, one of
the bottlenecks of frequent itemset mining is that, as the data grow,
the amount of time and resources required to mine them increases at an
exponential rate. In this investigation a new algorithm is proposed
which can be used as a pre-processor for frequent itemset mining.
FASTER (FeAture SelecTion using Entropy and Rough sets) is a hybrid
pre-processing algorithm which uses entropy and rough sets to carry
out record reduction and feature (attribute) selection, respectively.
As a pre-processor for frequent itemset mining, FASTER can produce a
speedup of 3.1 times compared with the original algorithm while
maintaining an accuracy of 71%.
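The entropy half of such a pre-processor can be sketched as follows. This is only an illustration of entropy-based attribute ranking on a toy table, not the authors' FASTER implementation (which also employs rough sets for record reduction):

```python
import math
from collections import Counter

def attribute_entropy(column):
    """Shannon entropy (bits) of the value distribution of one attribute."""
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def select_features(records, keep):
    """Keep the `keep` attributes with the highest entropy (most informative)."""
    cols = list(zip(*records))
    ranked = sorted(range(len(cols)),
                    key=lambda i: attribute_entropy(cols[i]), reverse=True)
    return sorted(ranked[:keep])

# Hypothetical transaction table: attribute 0 is constant (entropy 0, carries
# no information), attributes 1 and 2 vary across records.
records = [(1, 'a', 'x'), (1, 'b', 'y'), (1, 'a', 'z'), (1, 'c', 'x')]
print(select_features(records, 2))  # keeps the two varying attributes
```

Dropping low-entropy attributes before mining shrinks the itemset lattice, which is where the reported speedup over the unmodified miner would come from.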
Abstract: A generalized vortex lattice method for complex
lifting surfaces with flap and aileron deflection is formulated. The
method is not restricted by the linearized theory assumption and
accounts for all standard geometric lifting surface parameters:
camber, taper, sweep, washout, dihedral, in addition to flap and
aileron deflection. Thickness is not accounted for since the physical
lifting body is replaced by a lattice of panels located on the mean
camber surface. This panel lattice setup and the treatment of
different wake geometries are what distinguish the present work from
the overwhelming majority of previous solutions based on the vortex
lattice method. A MATLAB code implementing the proposed
formulation is developed and validated by comparing our results to
existing experimental and numerical ones and good agreement is
demonstrated. It is then used to study the accuracy of the widely used
classical vortex-lattice method. It is shown that the classical approach
gives good agreement in the clean configuration but is off by as much
as 30% when a flap or aileron deflection of 30° is imposed. This
discrepancy is mainly due to the linearized theory assumption
associated with the conventional method. A comparison of the effect of
four different wake geometries on the values of the aerodynamic
coefficients was also carried out, and it was found that the choice of
wake shape had very little effect on the results.
Abstract: This paper clarifies the role of ICT capital in economic
growth. Although ICT contributes remarkably to economic growth, there
are few theoretical studies of ICT capital in the ICT sector. In this
paper, a production function for ICT, which is used as an intermediate
input in both the final-good and ICT sectors, is incorporated into our
model. In this setting, we analyze the role of ICT on the balanced
growth path and show the possibility of general equilibrium solutions
for this model. Through simulation of the equilibrium solutions, we
find that for ICT to impact the economy and increase economic growth,
gains in efficiency in the ICT sector and in the accumulation of both
non-ICT and ICT capital must occur simultaneously.
Abstract: From an organizational perspective, leaders are a
variation of the same talent pool in that they all score a larger than
average value on the bell curve that maps leadership behaviors and
characteristics, namely competence, vision, communication,
confidence, cultural sensibility, stewardship, empowerment,
authenticity, reinforcement, and creativity. The question that remains
unanswered and essentially unresolved is how to explain the irony
that leaders are so much alike yet their organizations diverge so
noticeably in their ability to innovate. Leadership intersects with
innovation at the point where human interactions get exceedingly
complex and where certain paradoxical forces cohabit: conflict with
conciliation, sovereignty with interdependence, and imagination with
realism. Rather than accepting that leadership is without context, we
argue that leaders are specialists of their domain and that those
effective at leading for innovation are distinct within the broader pool
of leaders. Keeping in view the extensive literature on leadership and
innovation, we carried out a quantitative study with data collected
over a five-year period involving 240 participants from across five
dissimilar companies based in the United States. We found that while
innovation and leadership are, in general, strongly interrelated (r =
.89, p = 0.0), there are five qualities that set leaders apart on
innovation. These qualities include a large radius of trust, a restless
curiosity with a low need for acceptance, an honest sense of self and
other, a sense for knowledge and creativity as the yin and yang of
innovation, and an ability to use multiple senses in the engagement
with followers. When these particular behaviors and characteristics
are present in leaders, organizations out-innovate their rivals by a
margin of 29.3 per cent to gain an unassailable edge in a business
environment that is regularly disruptive. A strategic outcome of this
study is a psychometric scale named iLeadership, proposed with the
underlying evidence, limitations, and potential for leadership and
innovation in organizations.
Abstract: With the rapid progress of modern cities, railway
construction has been developing quickly in China. In a typically
high-density country, the shopping center on the subway is an
important factor in the process of urban development. This paper
discusses the influence of the layout of subway shopping centers,
placing it on the temporal and spatial axes of Shanghai's urban
development. We use digital technology to establish a database of the
relevant information, and then derive the changing role of subway
shopping centers in Shanghai by kernel density estimation. The results
show that the development of subway shopping centers is related to
local economic strength, population size, policy support, and city
construction, and that the suburbanization trend of shopping centers
will become increasingly significant. Through this case study, we can
see that kernel density estimation is an efficient method for
analyzing spatial layout: it reveals the essential characteristics of
the layout of subway shopping centers, and it can also be applied to
other research on spatial form.
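Kernel density estimation, the method applied above, can be sketched in one dimension as follows. The positions and bandwidth are hypothetical, purely to show how the estimator turns discrete locations into a smooth density surface:

```python
import math

def gaussian_kde(points, bandwidth):
    """Return a 1-D Gaussian kernel density estimator over the given points."""
    norm = 1.0 / (len(points) * bandwidth * math.sqrt(2.0 * math.pi))
    def density(x):
        return norm * sum(math.exp(-0.5 * ((x - p) / bandwidth) ** 2)
                          for p in points)
    return density

# Hypothetical positions (km along a metro line) of shopping centres:
# a dense downtown cluster around km 2 and one isolated suburban centre at km 9.
positions = [1.8, 2.0, 2.1, 2.3, 9.0]
density = gaussian_kde(positions, bandwidth=0.5)
print(density(2.0))  # high: inside the downtown cluster
print(density(9.0))  # moderate: the lone suburban centre
print(density(6.0))  # low: the gap between them
```

In two dimensions the same idea, applied over map coordinates, produces the density surfaces used to read off clustering and suburbanization trends.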
Abstract: Consumer-to-Consumer (C2C) E-commerce has been growing at a
very high speed in recent years. Since identical or nearly identical
kinds of products compete with one another through keyword search in
C2C E-commerce, some sellers describe their products with spam
keywords that are popular but unrelated to their products. Though such
products get more chances to be retrieved and selected by consumers
than those without spam keywords, the spam keywords mislead the
consumers and waste their time. This problem has been reported in many
commercial services such as eBay and Taobao, but there has been little
research to solve it. As a solution, this paper proposes a method
to classify whether keywords of a product are spam or not. The
proposed method assumes that a keyword for a given product is
more reliable if the keyword is observed commonly in specifications
of products which are the same or the same kind as the given
product. This is because the hierarchical category of a product is, in
general, determined precisely by the seller of the product, and so is
the specification of the product. Since higher layers of the
hierarchical category represent more general kinds of products, a
reliability degree is determined separately for each layer. Hence,
reliability degrees from the different layers of a hierarchical
category become features for keywords, and they are used together with
features derived only from specifications for classification of the
keywords. Support Vector Machines are adopted as the basic classifier
using these features, since they are powerful and widely used in many
classification tasks. In the experiments, the proposed method is
evaluated with a gold-standard dataset from Yi-han-wang, a Chinese C2C
E-commerce site, and is compared with a baseline method that does not
consider the hierarchical category. The experimental results show that
the proposed method outperforms the baseline in F1-measure, which
demonstrates that spam keywords are effectively identified using the
hierarchical category in C2C E-commerce.
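The layer-wise reliability degrees described above can be sketched as follows. The toy catalog and keywords are hypothetical; the paper's full method feeds such degrees, together with specification-only features, into an SVM classifier:

```python
def keyword_reliability(keyword, category_path, catalog):
    """For each layer of the hierarchical category (general -> specific),
    compute the fraction of products in that (sub)category whose
    specification mentions the keyword.
    catalog: list of (category_path_tuple, specification_text)."""
    degrees = []
    for depth in range(1, len(category_path) + 1):
        prefix = category_path[:depth]
        group = [spec for path, spec in catalog if path[:depth] == prefix]
        hits = sum(1 for spec in group if keyword in spec)
        degrees.append(hits / len(group) if group else 0.0)
    return degrees  # one reliability degree per category layer

# Hypothetical toy catalog: (category path, specification).
catalog = [
    (('electronics', 'phone'), 'smartphone 6.1 inch screen dual sim'),
    (('electronics', 'phone'), 'phone with large screen and long battery'),
    (('electronics', 'laptop'), 'laptop 15 inch screen 16gb ram'),
    (('clothing', 'shirt'), 'cotton shirt slim fit'),
]
print(keyword_reliability('screen', ('electronics', 'phone'), catalog))  # genuine
print(keyword_reliability('gucci', ('electronics', 'phone'), catalog))   # spam-like
```

A keyword common in the specifications of same-category products scores high at every layer, while a spam keyword borrowed from an unrelated category scores near zero.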
Abstract: In this paper the issue of dimensionality reduction is
investigated in finger vein recognition systems using kernel Principal
Component Analysis (KPCA). One aspect of KPCA is finding the most
appropriate kernel function for finger vein recognition, as several
kernel functions can be used within PCA-based algorithms. In this
paper, however, another side of PCA-based algorithms, particularly
KPCA, is investigated: the dimension of the feature vector, which is
of particular importance in real-world applications of such
algorithms. A fixed dimension of the feature vector has to be set to
reduce the dimension of the input and output data and to extract
features from them; a classifier is then applied to classify the data
and make the final decision. We analyze KPCA with Polynomial,
Gaussian, and Laplacian kernels in detail and investigate the optimal
feature extraction dimension for finger vein recognition using KPCA.
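A minimal KPCA sketch makes the role of the fixed feature-vector dimension concrete: the number of retained components directly sets the reduced dimension. The data, kernel width, and component count below are illustrative assumptions (only the Gaussian/RBF kernel is shown; the paper also considers Polynomial and Laplacian kernels):

```python
import numpy as np

def kernel_pca(X, n_components, gamma=1.0):
    """Gaussian (RBF) kernel PCA: project X onto the top n_components
    principal directions of the kernel-induced feature space."""
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # centre the kernel matrix
    vals, vecs = np.linalg.eigh(Kc)              # eigh returns ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1]       # sort descending
    alphas = vecs[:, :n_components] / np.sqrt(np.maximum(vals[:n_components], 1e-12))
    return Kc @ alphas                           # projected feature vectors

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 8))       # hypothetical 8-dimensional feature vectors
Z = kernel_pca(X, n_components=3)  # reduced to a fixed dimension of 3
print(Z.shape)
```

The leading component carries the most variance in feature space, so increasing `n_components` trades recognition accuracy against feature-vector size, which is exactly the dimension question the paper studies.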
Abstract: This paper presents the variation of the dynamic
characteristics of a spindle with the change of bearing preload. The
correlations between the variation of bearing preload and fundamental
modal parameters were first examined by conducting vibration tests on
physical spindle units. Experimental measurements show that the
dynamic compliance and damping ratio associated with the dominant
modes vary with the bearing preload. When the bearing preload deviates
slightly from the standard value, the modal frequency and damping
ability also vary to different extents, which in turn makes the
spindle perform with different compliance. For the spindle used in
this study, the standard preload value set on the bearings enables the
spindle to exhibit higher stiffness than preloads deviating from it.
This characteristic can serve as a reference for examining the
variation of bearing preload of a spindle during assembly or
operation.
Abstract: In order to evaluate the performance of a unified power
flow controller (UPFC), mathematical models for steady state and
dynamic analysis are to be developed. The steady state model is
mainly concerned with the incorporation of the UPFC in load flow
studies. Several load flow models for UPFC have been introduced
in literature, and one of the most reliable models is the decoupled
UPFC model. Despite its simplicity, the decoupled UPFC load flow model
is more robust than other UPFC load flow models and offers unique
capabilities. It has some shortcomings, however, such as an additional
set of nonlinear equations that must be solved separately after the
load flow solution is obtained. The aim of this study is to
investigate the different control strategies that can be realized in
the decoupled load flow model (individual control and combined
control), and the impact of the location of the UPFC in the network on
its control parameters.
Abstract: The high Peak-to-Average Power Ratio (PAPR) in Filter Bank
Multicarrier with Offset Quadrature Amplitude Modulation (FBMC-OQAM)
can significantly reduce power efficiency and performance. In this
paper, we address the problem of PAPR reduction for FBMC-OQAM systems
using the Tone Reservation (TR) technique. Due to the overlapping
structure of FBMC-OQAM signals, directly applying the TR schemes of
OFDM systems to FBMC-OQAM systems is not effective. We improve the TR
technique by employing a sliding window with Active Constellation
Extension for the PAPR reduction of FBMC-OQAM signals, called the
sliding window tone reservation Active Constellation Extension
(SW-TRACE) technique. The proposed SW-TRACE technique uses the peak
reduction tones (PRTs) of several consecutive data blocks to cancel
the peaks of the FBMC-OQAM signal inside a window, while dynamically
extending the outer constellation points of the active (data-carrying)
channels within margin-preserving constraints, in order to minimize
the peak magnitude. Analysis and simulation results are compared with
the existing TR technique for FBMC-OQAM systems; the proposed
SW-TRACE method achieves better PAPR performance and lower
computational complexity.
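The PAPR metric targeted by SW-TRACE is simply the ratio of peak to average signal power. The sketch below contrasts a single carrier with a naive 16-carrier sum to show why multicarrier waveforms need PAPR reduction at all; it is illustrative only and does not implement FBMC-OQAM or the TR/ACE processing:

```python
import math

def papr_db(samples):
    """Peak-to-average power ratio of a real-valued signal, in dB."""
    powers = [s * s for s in samples]
    return 10.0 * math.log10(max(powers) / (sum(powers) / len(powers)))

N = 1024
t = [i / N for i in range(N)]
# Single carrier: a plain cosine, whose peak power is only 2x its average.
single = [math.cos(2 * math.pi * 8 * ti) for ti in t]
# Multicarrier: 16 subcarriers add coherently at t = 0, producing a large peak.
multi = [sum(math.cos(2 * math.pi * k * ti) for k in range(1, 17)) for ti in t]
print(round(papr_db(single), 2))  # about 3 dB
print(round(papr_db(multi), 2))   # far higher: the multicarrier PAPR problem
```

High peaks force the power amplifier to back off or clip, which is the efficiency and performance penalty that TR-based schemes such as SW-TRACE aim to avoid.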