Abstract: In recent years, the field of manufacturing has seen significant advances, with special efforts devoted to the implementation of new technologies and of management and control systems, among many others, all of which have evolved the field. Following this trend, and driven by the scope of new projects and the need to turn existing flexible systems into more autonomous and intelligent ones, i.e., to move toward more intelligent manufacturing, this paper contributes to the analysis and to several customization issues of a new iCIM 3000 system at the IPSAM. Special emphasis is placed on the material flow problem. Besides a description and analysis of the system and its main parts, the paper offers guidance on defining other possible alternative material flow scenarios, together with a partial analysis of the combinatorial nature of the problem. These elements are related to the use of simulation tools, which are briefly reviewed with a special focus on the Witness simulation package. For better comprehension, the discussion is supported by figures and expressions that help obtain the necessary data. Such data will be used in future work, when the scenarios are simulated in search of the best material flow configurations.
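The combinatorial nature of the material flow problem can be illustrated with a minimal sketch (the station names below are hypothetical, not the actual iCIM 3000 layout): if every visiting order of the stations is a candidate material flow scenario, the scenario count grows factorially with the number of stations, which is why simulation-based comparison of scenarios becomes necessary.

```python
from itertools import permutations

# Illustrative only: each ordering of the stations is treated as one
# candidate material flow scenario to be evaluated by simulation.
stations = ["storage", "machining", "assembly", "inspection"]
scenarios = list(permutations(stations))
print(len(scenarios))  # 4! = 24 candidate flow sequences
```

With only four stations there are already 24 orderings; adding stations or routing constraints quickly pushes exhaustive manual evaluation out of reach.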
Abstract: The objective of this paper is to construct a creativity
composite index designed to capture the growing role of creativity in
driving economic and social development for the 27 European Union
countries.
The paper proposes a new approach for the measurement of EU-27
creative potential and for determining its capacity to attract and
develop creative human capital. We apply a modified version of the
3T model developed by Richard Florida and Irene Tinagli for
constructing a Euro-Creativity Index. The resulting indexes establish
a quantitative base for policy makers, supporting their efforts to
determine the contribution of creativity to economic development.
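The mechanics of composite-index construction can be sketched as follows (the country scores below are invented; the actual Euro-Creativity Index aggregates Technology, Talent and Tolerance sub-indices following Florida and Tinagli's 3T model): each dimension is min-max-normalised and the normalised scores are averaged with equal weights.

```python
def minmax(values):
    """Rescale a list of raw scores to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def composite_index(sub_indices):
    """Equal-weight average of min-max-normalised sub-indices;
    each element of sub_indices is one dimension's country scores."""
    normed = [minmax(dim) for dim in sub_indices]
    return [sum(col) / len(col) for col in zip(*normed)]

# Three hypothetical countries, three dimensions
# (technology, talent, tolerance).
tech, talent, tol = [1.0, 2.0, 3.0], [10.0, 30.0, 20.0], [0.2, 0.4, 0.6]
idx = composite_index([tech, talent, tol])
```

Equal weighting is only one design choice; weighted or rank-based aggregation schemes are common alternatives in composite-index work.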
Abstract: Appropriate description of business processes through
standard notations has become one of the most important assets for
organizations. Organizations must therefore deal with quality faults
in business process models such as the lack of understandability and
modifiability. These quality faults may be exacerbated if business
process models are mined by reverse engineering, e.g., from existing
information systems that support those business processes. Hence,
business process refactoring is often applied: it changes the internal
structure of business processes while preserving their external
behavior. This paper aims to choose the most appropriate set of
refactoring operators through a quality assessment concerning
understandability and modifiability. These quality features are
assessed through well-proven measures proposed in the literature.
Additionally, a set of measure thresholds is heuristically established
for applying the most promising refactoring operators, i.e., those that
achieve the highest quality improvement according to the selected
measures in each case.
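A minimal sketch of the threshold-driven selection idea (the measure names, threshold values and operator names below are invented for illustration, not the paper's actual catalogue of measures and refactoring operators):

```python
# Hypothetical measure thresholds for a business process model.
THRESHOLDS = {"size": 30, "depth": 4, "gateway_mismatch": 2}

# Each rule: (measure, trigger predicate, refactoring operator to apply).
RULES = [
    ("size", lambda v, t: v > t, "extract subprocess"),
    ("depth", lambda v, t: v > t, "flatten nesting"),
    ("gateway_mismatch", lambda v, t: v > t, "balance gateways"),
]

def promising_operators(measures):
    """Return the refactoring operators whose triggering measure
    exceeds its heuristic threshold."""
    return [op for name, exceeds, op in RULES
            if exceeds(measures[name], THRESHOLDS[name])]

ops = promising_operators({"size": 45, "depth": 3, "gateway_mismatch": 5})
```

The point of the sketch is only the control flow: measures are compared against heuristic thresholds, and only the operators expected to improve understandability or modifiability are proposed.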
Abstract: This paper discusses the landscape design that could
increase energy efficiency in a house. When trees are planted in a
house compound, their shade prevents direct sunlight from heating up
the building and helps cool the surrounding air. The
requirement for air-conditioning could be minimized and the air
quality could be improved. Over the lifetime of a tree, the cost
savings from these benefits could amount to up to US$200 per
tree. The project aims to visually describe the landscape design in
a house compound that could enhance energy efficiency and
consequently lead to energy saving. The house compound model was
developed in three dimensions using AutoCAD 2005, and the
animation was programmed using the LightWave 3D modules, i.e.,
Modeler and Layout, to display the tree shading on the wall. The
visualization was executed on a VRML Pad platform and
implemented in a web environment.
Abstract: The approach of subset selection in polynomial
regression model building assumes that the chosen fixed full set of
predefined basis functions contains a subset that describes the
target relation sufficiently well. However, in most cases
the necessary set of basis functions is not known and needs to be
guessed – a potentially non-trivial (and long) trial and error process.
In our research we consider a potentially more efficient approach –
Adaptive Basis Function Construction (ABFC). It lets the model
building method itself construct the basis functions necessary for
creating a model of arbitrary complexity with adequate predictive
performance. However, there are two issues that to some extent
plague the methods of both the subset selection and the ABFC,
especially when working with relatively small data samples: the
selection bias and the selection instability. We try to correct these
issues by model post-evaluation using Cross-Validation and model
ensembling. To evaluate the proposed method, we empirically
compare it to ABFC methods without ensembling, to a widely used
method of subset selection, as well as to some other well-known
regression modeling methods, using publicly available data sets.
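The post-evaluation idea can be sketched on a generic polynomial model (plain degree selection stands in here for full ABFC basis construction, and the data are synthetic): cross-validation picks the model complexity, and an ensemble of leave-one-fold-out models damps selection instability.

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Split n sample indices into k roughly equal random folds."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n), k)

def cv_select_degree(x, y, degrees, k=5):
    """Pick the polynomial degree with the lowest cross-validated MSE."""
    folds = kfold_indices(len(x), k)
    best_deg, best_mse = None, np.inf
    for d in degrees:
        errs = []
        for i in range(k):
            train = np.concatenate([folds[j] for j in range(k) if j != i])
            resid = np.polyval(np.polyfit(x[train], y[train], d),
                               x[folds[i]]) - y[folds[i]]
            errs.append(np.mean(resid ** 2))
        if np.mean(errs) < best_mse:
            best_deg, best_mse = d, np.mean(errs)
    return best_deg

def ensemble_predict(x, y, degree, x_new, k=5):
    """Average k models, each trained with one fold left out,
    to reduce selection instability on small samples."""
    folds = kfold_indices(len(x), k)
    preds = []
    for i in range(k):
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        preds.append(np.polyval(np.polyfit(x[train], y[train], degree), x_new))
    return np.mean(preds, axis=0)

# Synthetic noisy cubic sample.
rng = np.random.default_rng(1)
x = np.linspace(-2.0, 2.0, 80)
y = x**3 - x + rng.normal(0.0, 0.2, x.size)
deg = cv_select_degree(x, y, degrees=range(1, 8))
y_hat = ensemble_predict(x, y, deg, x)
```

ABFC itself searches over constructed basis functions rather than a fixed degree ladder, but the bias-correction machinery (cross-validated selection plus model averaging) has the same shape.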
Abstract: For more than 120 years, gold mining formed the
backbone of South Africa's economy. The consequences of mine
closure include large-scale land degradation and widespread
pollution of surface water and groundwater. This paper investigates
the feasibility of using natural zeolite in removing heavy metals
contaminating the Wonderfonteinspruit Catchment Area (WCA), a
water stream with high levels of heavy metals and radionuclide
pollution. Batch experiments were conducted to study the adsorption
behavior of natural zeolite with respect to Fe2+, Mn2+, Ni2+, and Zn2+.
The data were analysed using the Langmuir and Freundlich isotherms.
The Langmuir isotherm correlated the adsorption of Fe2+, Mn2+, Ni2+,
and Zn2+ better, with adsorption capacities of 11.9 mg/g, 1.2 mg/g,
1.3 mg/g, and 14.7 mg/g, respectively. Two kinetic models, namely
the pseudo-first-order and the pseudo-second-order, were also fitted to the
data. The pseudo-second-order equation was found to be the best fit for
the adsorption of heavy metals by natural zeolite. Zeolite
functionalization with humic acid increased its uptake ability.
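For illustration, the linearised Langmuir fit can be sketched as follows (the equilibrium data below are hypothetical, generated to match a capacity of 11.9 mg/g rather than taken from the study's measurements):

```python
import numpy as np

def langmuir_fit(ce, qe):
    """Fit the linearised Langmuir isotherm
    Ce/qe = Ce/qmax + 1/(KL*qmax) by least squares; returns (qmax, KL)."""
    slope, intercept = np.polyfit(ce, ce / qe, 1)
    return 1.0 / slope, slope / intercept

def langmuir_q(ce, qmax, kl):
    """Langmuir equilibrium uptake qe = qmax*KL*Ce / (1 + KL*Ce)."""
    return qmax * kl * ce / (1.0 + kl * ce)

# Hypothetical equilibrium data consistent with a capacity of
# 11.9 mg/g and an affinity constant of 0.15 L/mg (noise-free).
ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])  # mg/L
qe = langmuir_q(ce, qmax=11.9, kl=0.15)            # mg/g
qmax_est, kl_est = langmuir_fit(ce, qe)
```

With real, noisy batch data the same linearised regression yields the reported capacities; goodness of fit (e.g. R^2) is what distinguishes the Langmuir from the Freundlich description.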
Abstract: Short Message Service (SMS) has grown in
popularity over the years and has become a common means of
communication. It is a service provided through the Global System
for Mobile Communications (GSM) that allows users to send text
messages to one another.
SMS is usually used to transport unclassified information, but
with the rise of mobile commerce it has become a popular tool for
transmitting sensitive information between a business and its
clients. By default, SMS guarantees neither the confidentiality nor the
integrity of the message content.
In mobile communication systems, the security (encryption)
offered by the network operator applies only to the wireless link.
Data delivered through the mobile core network may not be
protected. Existing end-to-end security mechanisms are provided
at the application level and are typically based on public-key
cryptosystems.
The main concern in a public-key setting is the authenticity of
the public key; this issue can be resolved by identity-based (ID-based)
cryptography, where the public key of a user can be derived
from public information that uniquely identifies the user.
This paper presents an encryption mechanism based on an ID-based
scheme using elliptic curves to provide end-to-end security
for SMS. This mechanism has been implemented over the standard
SMS network architecture and the encryption overhead has been
estimated and compared with the RSA scheme. The study indicates
that the ID-based mechanism has advantages over the RSA
mechanism in key distribution and in scaling up the
security level for mobile services.
Abstract: Quantum computation using qubits made of two-component Bose-Einstein condensates (BECs) is analyzed. We construct a general framework for quantum algorithms to be executed using the collective states of the BECs. The use of BECs allows for an increase of energy scales via bosonic enhancement, resulting in two-qubit gate operations that can be performed in a time reduced by a factor of N, where N is the number of bosons per qubit. We illustrate the scheme by an application to Deutsch's and Grover's algorithms, and discuss possible experimental implementations. Decoherence effects are analyzed both under general conditions and for the proposed experimental implementation.
Abstract: In the Enhanced Oil Recovery (EOR) method of carbon dioxide flooding, CO2 is injected into an oil reservoir to increase output when extracting oil, and this has resulted in significant recovery worldwide. The carbon dioxide acts as a pressurizing agent and, when mixed into the underground crude oil, reduces its viscosity and enables a rapid oil flow. Despite CO2's advantages for oil recovery, it may result in asphaltene precipitation, a problem that reduces the oil produced from oil wells. In severe cases, asphaltene precipitation can cause costly blockages in oil pipes and machinery. This paper presents a review of several studies on the mathematical modeling of asphaltene precipitation. The results synthesized from this research can serve as a guide to better understanding asphaltene precipitation, and as an initial reference for students and new researchers studying the phenomenon.
Abstract: In this paper, a stochastic scenario-based model predictive control applied to molten salt storage systems in a concentrated solar tower power plant is presented. The main goal of this study is to build a tool that analyzes current and expected future resources for evaluating the weekly power to be offered on the electricity secondary market. This tool will allow the plant operator to maximize profits while hedging the impact on the system of stochastic variables such as resource or sunlight shortages.
Solving the problem first requires a mixed logic dynamical model of the plant. The two stochastic variables, namely the incoming sunlight energy and the electricity demand from the secondary market, are modeled by least squares regression. Robustness is achieved by drawing a number of realizations of the random variables and applying the most restrictive one to the system. This scenario-based control technique provides the plant operator with a confidence interval containing a given percentage of the possible realizations of the stochastic variables, in such a way that robust control is always achieved within its bounds. The results obtained from many trajectory simulations show the existence of a 'reliable' interval, which experimentally confirms the robustness of the algorithm.
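The scenario step can be sketched as follows (the forecast values and the Gaussian error model are hypothetical): a number of realizations of the stochastic resource are drawn, and the most restrictive one is the profile the controller plans against.

```python
import numpy as np

def most_restrictive_scenario(sampler, n_scenarios, seed=0):
    """Draw n_scenarios realizations of the stochastic resource and
    keep, hour by hour, the most restrictive (lowest) one."""
    rng = np.random.default_rng(seed)
    scen = np.array([sampler(rng) for _ in range(n_scenarios)])
    return scen.min(axis=0), scen

# Hypothetical hourly incoming solar energy forecast (MWh) with
# Gaussian forecast error, clipped at zero.
forecast = np.array([0.0, 5.0, 20.0, 35.0, 30.0, 10.0, 0.0])

def sampler(rng):
    return np.clip(forecast + rng.normal(0.0, 3.0, forecast.size), 0.0, None)

worst, scen = most_restrictive_scenario(sampler, n_scenarios=200)
# Scheduling against `worst` keeps the plan feasible for every drawn
# scenario; the number of draws controls the confidence level.
```

In the scenario approach, the number of drawn realizations determines the probabilistic guarantee that an unseen realization also falls within the bounds.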
Abstract: This research analyzes the regenerative burner and the recuperative burner for different reheating furnaces in the steel industry. The warm air temperatures of the burners are determined to suit the sizes of the reheating furnaces by considering the air temperature, the fuel cost and the investment cost. Payback period and net present value calculations are used to compare the burners for the different reheating furnaces. An energy balance is used to calculate and compare the energy consumed by each burner in the different sizes of reheating furnace. It is found that the warm air temperature differs as the size of the reheating furnace varies. Based on the net present value and the payback period, the regenerative burner is suitable for all plants for the same burner lifetime. Finally, a sensitivity analysis of all factors is discussed.
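The two comparison criteria can be sketched as follows (the cash flow figures are hypothetical, not the study's data): the net present value discounts each year's cash flow, and the payback period is the first year in which the cumulative cash flow turns non-negative.

```python
def npv(rate, cashflows):
    """Net present value of cashflows[t] received at t = 0, 1, 2, ... years."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def payback_period(cashflows):
    """First year in which the cumulative cashflow becomes non-negative,
    or None if the investment is never recovered."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0.0:
            return t
    return None

# Hypothetical burner retrofit: investment of 100 (arbitrary currency
# units) followed by six years of fuel savings of 30 per year.
flows = [-100.0] + [30.0] * 6
years_to_payback = payback_period(flows)  # 4
value_at_10pct = npv(0.10, flows)
```

A burner with a positive NPV at the plant's discount rate and a payback period shorter than its service life is the economically preferable choice, which is the comparison the study performs across furnace sizes.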
Abstract: In order to evaluate the effects of soil organic
matter and biofertilizer on chickpea quality and biological
nitrogen fixation, field experiments were carried out in the 2007
and 2008 growing seasons. In this research the effects of
different strategies for soil fertilization were investigated on
grain yield and yield component, minerals, organic compounds
and cooking time of chickpea. Experimental units were
arranged in split-split plots based on randomized complete
blocks with three replications. Main plots consisted of (G1):
establishing a mixed vegetation of Vicia panunica and
Hordeum vulgare and (G2): control, as green manure levels.
Also, five strategies for obtaining the base fertilizer
requirement including (N1): 20 t.ha-1 farmyard manure; (N2):
10 t.ha-1 compost; (N3): 75 kg.ha-1 triple super phosphate;
(N4): 10 t.ha-1 farmyard manure + 5 t.ha-1 compost and (N5):
10 t.ha-1 farmyard manure + 5 t.ha-1 compost + 50 kg.ha-1
triple super phosphate were considered in sub plots.
Furthermore, four levels of biofertilizers consisting of (B1):
Bacillus lentus + Pseudomonas putida; (B2): Trichoderma
harzianum; (B3): Bacillus lentus + Pseudomonas putida +
Trichoderma harzianum; and (B4): control (without
biofertilizers) were arranged in sub-sub plots. Results showed
that integrating biofertilizers (B3) and green manure (G1)
produced the highest grain yield. The highest yields
were obtained with the G1×N5 interaction. Comparison of all 2-way
and 3-way interactions showed that G1N5B3 was
the superior treatment. The significant increase in N, P2O5,
K2O, Fe and Mg contents in leaves and grains underlined the
superiority of this treatment, because each of these
nutrients has an established role in chlorophyll synthesis and
in the photosynthetic ability of the crop. The combined application
of compost, farmyard manure and chemical phosphorus (N5)
in addition to having the highest yield, had the best grain
quality due to high protein, starch and total sugar contents, low
crude fiber and reduced cooking time.
Abstract: In this paper, we consider the problem of logic simplification for a special class of logic functions, namely complementary Boolean functions (CBF), targeting low-power implementation using the static CMOS logic style. The functions are uniquely characterized by the presence of terms where, for a canonical binary 2-tuple, D(mj) ∩ D(mk) = { } and therefore | D(mj) ∩ D(mk) | = 0 [19]. Similarly, D(Mj) ∩ D(Mk) = { } and hence | D(Mj) ∩ D(Mk) | = 0. Here, 'mk' and 'Mk' represent a minterm and a maxterm, respectively. We compare the circuits minimized with our proposed method with those corresponding to the factored Reed-Muller (f-RM) form, the factored Pseudo Kronecker Reed-Muller (f-PKRM) form, and the factored Generalized Reed-Muller (f-GRM) form. We have opted for algebraic factorization of the Reed-Muller (RM) form and its different variants, using the factorization rules of [1], as it is simple and requires much less CPU execution time than Boolean factorization. This technique has enabled us to greatly reduce the literal count as well as the gate count needed for such RM realizations, which are generally prone to consuming more cells and consequently more power. However, this leads to a drawback in terms of the design-for-test attribute associated with the various RM forms. Though we still preserve the definition of those forms, viz. realizing the functionality with only select types of logic gates (AND and XOR gates), the structural integrity of the logic levels is not preserved. This consequently alters the testability properties of such circuits, i.e., it may increase, decrease or maintain the number of test input vectors needed for their exhaustive testing, subsequently affecting their generalized test vector computation.
We do not consider the issue of design-for-testability here but instead focus on the power consumption of the final logic implementation, after realization with a conventional CMOS process technology (0.35 micron TSMC process). The quality of the resulting circuits, evaluated on the basis of an established cost metric, viz. power consumption, demonstrates average savings of 26.79% for the samples considered in this work, besides reductions in gate count and input literal count of 39.66% and 12.98%, respectively, in comparison with the other factored RM forms.
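For reference, the AND/XOR representation underlying all of these factored RM forms, the positive-polarity Reed-Muller (algebraic normal) form, can be computed from a truth table with the standard GF(2) butterfly (Moebius) transform; a small sketch on an example function of our own choosing:

```python
def reed_muller_anf(tt):
    """Positive-polarity Reed-Muller (ANF) coefficients of a Boolean
    function given as a 2^n truth table, via the GF(2) Moebius
    (butterfly) transform; coefficient k marks the product of the
    variables in the binary expansion of k."""
    coef = list(tt)
    step = 1
    while step < len(coef):
        for i in range(0, len(coef), 2 * step):
            for j in range(i, i + step):
                coef[j + step] ^= coef[j]
        step *= 2
    return coef

# Example: f(x2, x1, x0) = x0 XOR (x1 AND x2), truth table indexed by
# the integer (x2 x1 x0).
tt = [(i & 1) ^ (((i >> 1) & 1) & ((i >> 2) & 1)) for i in range(8)]
anf = reed_muller_anf(tt)
# anf has 1s exactly at indices 1 (term x0) and 6 (term x1*x2), i.e.
# the RM form of f is x0 XOR x1x2.
```

Each nonzero ANF coefficient corresponds to one AND product term XORed into the output, which is the gate structure whose literal and gate counts the factorization then reduces.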
Abstract: This policy participation action research explores the
roles of Thai government units during its 2010 fiscal year on how to
create value added to recycling business in the central part of
Thailand. The research aims a) to study how the government supports
the business and the problems and obstacles it faces in
supporting it, and b) to design strategic actions (short-,
medium-, and long-term plans) to create value added to the recycling
business, particularly in local full-loop companies/organizations
licensed by Wongpanit Waste Separation Plant as well as those
licensed by the Department of Provincial Administration. A mixed
method research design, i.e., a combination of quantitative and
qualitative methods, was utilized in both the data
collection and the analysis procedures. Quantitative data were analyzed
by frequency, percentage, mean scores, and standard deviation,
with the aim of identifying trends and generalizations. Qualitative data were
collected via semi-structured interviews/focus group interviews to
explore in-depth views of the operators. The sampling included 1,079
operators in eight provinces in the central part of Thailand.
Abstract: A proposal for a secure stream cipher based on Linear Feedback Shift Registers (LFSRs) is presented here. In this method, the shift register structure used for polynomial modular division is combined with an LFSR keystream generator to yield a new keystream generator with a much higher period. Security is brought into this structure by using a Boolean function to combine state bits of the LFSR keystream generator and taking the output through that Boolean function. This introduces non-linearity and security in a way similar to the non-linear filter generator. The security and throughput of the suggested stream cipher are found to be much greater than those of the known LFSR-based structures for the same key length.
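A toy sketch of the nonlinear-filter idea (the 5-bit register and the filter function below are chosen for illustration only and are far too small for real security; they are not the proposed cipher):

```python
def lfsr_keystream(seed, taps, filter_fn, n):
    """Nonlinear filter generator: clock a Fibonacci LFSR (bit list;
    feedback = XOR of the tapped positions) and pass state bits
    through a nonlinear Boolean function to form the keystream."""
    state = list(seed)
    stream = []
    for _ in range(n):
        stream.append(filter_fn(state))
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]  # shift, inserting the feedback bit
    return stream

# 5-bit register with taps [4, 1]: the feedback recurrence has the
# primitive characteristic polynomial x^5 + x^3 + 1, so the state
# sequence has the maximal period 2^5 - 1 = 31.
taps = [4, 1]
nonlin = lambda s: s[0] ^ (s[1] & s[3])  # nonlinear combining function
ks = lfsr_keystream([1, 0, 0, 1, 1], taps, nonlin, 62)
```

The keystream inherits the LFSR's period while the nonlinear filter breaks the linearity that would otherwise let the Berlekamp-Massey algorithm recover the register from a short output prefix.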
Abstract: There is an acute water problem, especially in the dry
season, in and around Perundurai (Erode district, Tamil Nadu, India),
where a large number of tannery units operate. Hence an attempt was
made to use the waste water from the tannery industry for construction
purpose. Mechanical properties such as compressive strength,
tensile strength and flexural strength were studied by casting various
concrete specimens in the form of cubes, cylinders and beams, and
were found to be satisfactory. In addition, durability aspects such as
chloride attack, sulphate attack and chemical attack were considered
and studied in comparison with conventional potable water. In
this experimental study the results of specimens prepared by using
treated and untreated tannery effluent were compared with the
concrete specimens prepared by using potable water. It was observed
that the concrete had some reduction in strength while subjected to
chloride attack, sulphate attack and chemical attack. Admixtures
were therefore selected and optimized in suitable proportions to counteract
the adverse effects, and the results were found to be satisfactory.
Abstract: In this paper we study the use of a new code called
Random Diagonal (RD) code for Spectral Amplitude Coding (SAC)
optical Code Division Multiple Access (CDMA) networks using
Fiber Bragg Gratings (FBGs). An FBG consists of a fiber segment whose
index of refraction varies periodically along its length. The RD code is
constructed using a code level and a data level; one of the important
properties of this code is that the cross correlation at the data level is
always zero, which means that Phase-Induced Intensity Noise (PIIN)
is reduced. We find that the performance of the RD code is
better than that of the Modified Frequency Hopping (MFH) and Hadamard codes.
It has been observed through experimental and theoretical simulation
that the BER of the RD code is significantly better than that of the other codes.
Proof-of-principle simulations of encoding with 3 channels and 10
Gbps data transmission have been successfully demonstrated, together
with an FBG decoding scheme for canceling the code level from the SAC signal.
Abstract: The aim of this research is to use artificial neural network computing technology to estimate the net heating value (NHV) of crude oil from its properties. The approach is based on training a neural network that uses back-propagation as its learning algorithm on a predefined range of analytically generated well test responses. A network with 8 neurons in one hidden layer was selected, and its predictions are in good agreement with the experimental data.
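A minimal numpy sketch of such a network, one hidden layer of 8 neurons trained by batch back-propagation (the two input "properties" and the target function below are synthetic stand-ins, not crude-oil data):

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=0.05, epochs=4000, seed=0):
    """Train a one-hidden-layer MLP (tanh units, linear output) with
    plain batch back-propagation on mean squared error."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0.0, 0.5, (d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, hidden)
    b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)                 # hidden activations
        err = H @ W2 + b2 - y                    # output-layer error
        dH = np.outer(err, W2) * (1.0 - H**2)    # back-propagated error
        W2 -= lr * (H.T @ err) / n
        b2 -= lr * err.mean()
        W1 -= lr * (X.T @ dH) / n
        b1 -= lr * dH.mean(axis=0)
    return W1, b1, W2, b2

def predict(params, X):
    W1, b1, W2, b2 = params
    return np.tanh(X @ W1 + b1) @ W2 + b2

# Synthetic stand-in target: a smooth function of two scaled inputs.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, (200, 2))
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 0] * X[:, 1]
params = train_mlp(X, y)
```

Inputs would normally be scaled crude-oil properties (e.g. density, sulphur content) and the output the NHV; input scaling to roughly [-1, 1] matters for the tanh units to train well.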
Abstract: Load forecasting has always been an essential part of
efficient power system operation and planning. A novel approach
based on support vector machines is proposed in this paper for annual
power load forecasting. Different kernel functions are selected to
construct a combinatorial algorithm. The performance of the new
model is evaluated with a real-world dataset, and compared with two
neural networks and some traditional forecasting techniques. The
results show that the proposed method exhibits superior performance.
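A compact sketch of the kernel-combination idea, using kernel ridge regression as a closely related stand-in for SVR (the annual load series below is invented): each kernel produces its own forecast and the forecasts are combined by averaging.

```python
import numpy as np

def rbf_kernel(A, B, gamma=5.0):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def poly_kernel(A, B, degree=2, c=1.0):
    """Polynomial kernel (x.y + c)^degree."""
    return (A @ B.T + c) ** degree

def krr_fit(K, y, lam=1e-4):
    """Kernel ridge regression dual coefficients: (K + lam*I)^-1 y."""
    return np.linalg.solve(K + lam * np.eye(len(K)), y)

# Hypothetical annual load series (GWh) with a smooth growth trend;
# two held-out years are forecast by each kernel and the forecasts
# are averaged as a simple combination scheme.
t = (np.arange(12, dtype=float) / 12.0).reshape(-1, 1)
load = 50.0 + 30.0 * t[:, 0] + 8.0 * t[:, 0] ** 2

train = np.array([0, 1, 2, 3, 4, 6, 7, 8, 10, 11])
test = np.array([5, 9])

preds = []
for kern in (rbf_kernel, poly_kernel):
    alpha = krr_fit(kern(t[train], t[train]), load[train])
    preds.append(kern(t[test], t[train]) @ alpha)
combined = np.mean(preds, axis=0)
```

An actual SVR replaces the ridge solve with an epsilon-insensitive loss and support vectors, but the role of the kernel choice, and the benefit of combining kernels with different inductive biases, is the same.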