Abstract: In this paper, some practical solid transportation models are formulated considering the per-trip capacity of each type of conveyance, with crisp and rough unit transportation costs. This is applicable to systems in which full vehicles, e.g. trucks or rail coaches, are booked for the transportation of products, so that the transportation cost is determined on the full capacity of the conveyances. The models with unit transportation costs as rough variables are transformed into deterministic forms using rough chance-constrained programming with the help of the trust measure. Numerical examples are provided to illustrate the proposed models in a crisp environment as well as with unit transportation costs as rough variables.
Abstract: A saturated liquid is warmed until boiling in a parallelepipedic boiler. The heat is supplied to the liquid through the horizontal bottom of the boiler, the other walls being adiabatic. During boiling, the liquid evaporates through its free surface, deforming it. This surface subdivides the boiler into two regions, occupied on either side by the boiled liquid (broth) and by the vapor which surmounts it, the broth occupying the lower region and its vapor the upper region. A two-fluid model is used to describe the dynamics of the broth, its vapor and their interface. In this model, the broth is treated as a monophasic fluid (homogeneous model) and forms with its vapor a diphasic pseudo-fluid (two-fluid model). Furthermore, the interface is treated as a mixture zone characterized by a superficial void fraction noted α*. The aim of this article is to describe the dynamics of the interface between the boiled fluid and its vapor within a boiler. The resolution of the problem allowed us to show the evolution of the broth and the level of the liquid.
Abstract: In a particular case of behavioural model reduction by ANNs, a shortening of the validity domain has been found. In mechanics, as in other domains, the notion of validity domain allows the engineer to choose a valid model for a particular analysis or simulation. In the study of the mechanical behaviour of a cantilever beam (using linear and non-linear models), Multi-Layer Perceptron (MLP) Backpropagation (BP) networks have been applied as a model reduction technique. This reduced model is constructed to be more efficient than the non-reduced model. Within a less extended domain, the ANN reduced model correctly estimates the non-linear response, at a lower computational cost. It has been found that the neural network model is not able to approximate the linear behaviour, while it approximates the non-linear behaviour very well. The details of the case are provided with an example of cantilever beam behaviour modelling.
Abstract: Mycophenolic acid (MPA) is a secondary metabolite of Penicillium brevicompactum with antibiotic and immunosuppressive properties. In this study, a fermentation process was established for the production of mycophenolic acid by Penicillium brevicompactum MUCL 19011 in shake flasks. The maximum MPA production, product yield and productivity were 1.379 g/L, 18.6 mg/g glucose and 4.9 mg/L.h, respectively. Glucose consumption, biomass and MPA production profiles were investigated over the fermentation time. It was found that MPA production starts approximately after 180 hours and reaches a maximum at 280 h. In the next step, the effects of methionine and acetate concentrations on MPA production were evaluated. Maximum MPA production, product yield and productivity (1.763 g/L, 23.8 mg/g glucose and 6.30 mg/L.h, respectively) were obtained using 2.5 g/L methionine in the culture medium. Further addition of methionine had no additional positive effect on MPA production. Finally, the results showed that the addition of acetate to the culture medium had no observable effect on MPA production.
Abstract: This paper presents the experimental results of silicone rubber polymer insulators for 22 kV systems under the salt water dip wheel test based on IEC 62217. Straight-shed silicone rubber polymer insulators with a leakage distance of 685 mm were tested continuously for 30,000 cycles. One test cycle comprises 4 positions: energized, de-energized, salt water dip and de-energized, respectively. In each cycle, each test specimen remains stationary for about 40 seconds in each position and takes 8 seconds to rotate to the next position. By visual observation, severe surface erosion was observed on the trunk near the energized end of the tested specimen. A puncture was observed on the upper shed near the energized end. In addition, a decrease in hydrophobicity and an increase in hardness were measured on the tested specimen compared with a new specimen. Furthermore, chemical analysis by ATR-FTIR was conducted in order to elucidate the chemical changes of the tested specimens compared with a new specimen.
Abstract: The evaluation of the residual reliability of large-sized parallel computer interconnection systems is not practicable with the existing methods. Under such conditions, one must resort to approximation techniques which provide upper and lower bounds on this reliability. In this context, a new approximation method for providing bounds on residual reliability is proposed here. The proposed method is well supported by two algorithms for simulation purposes. The bounds on the residual reliability of three different categories of interconnection topologies are efficiently found using the proposed method.
Abstract: This paper aims to discuss the influence of the resistance characteristics of highly conductive concrete, considering changes in voltage and environment. Highly conductive concrete with an appropriate mix proportion is produced, and its resistivity is measured by the press-electrode method. The curve of resistivity against the changes in voltage and environment is plotted, and the changes in resistivity are explored.
Abstract: A new concept for long-term reagent storage for Lab-on-a-Chip (LoC) devices is described. Here we present a polymer multilayer stack with integrated stick packs for the long-term storage of several liquid reagents, which are necessary for many diagnostic applications. Stick packs are widely used in the packaging industry for storing solids and liquids for long periods. The storage concept fulfills two main requirements: first, long-term storage of reagents in stick packs without significant losses or interaction with the surroundings; second, on-demand release of the liquids, which is realized by pushing a membrane against the stick pack through pneumatic pressure. This concept enables long-term on-chip storage of liquid reagents at room temperature and allows easy implementation in different LoC devices.
Abstract: If price and quantity are the fundamental building blocks of any theory of market interactions, the importance of trading volume in understanding the behavior of financial markets is clear. However, while many economic models of financial markets have been developed to explain the behavior of prices (predictability, variability, and information content), far less attention has been devoted to explaining the behavior of trading volume. In this article, we hope to expand our understanding of trading volume by developing a new measure of herding behavior based on the cross-sectional dispersion of volume betas. We apply our measure to the Toronto Stock Exchange using monthly data from January 2000 to December 2002. Our findings show that the herd phenomenon consists of three essential components: stationary herding, intentional herding and feedback herding.
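To make this type of measure concrete, here is a minimal sketch (with hypothetical data and a plain OLS slope; it is not the authors' exact estimator): each stock's volume beta is the regression slope of its trading volume on aggregate market volume, and the herding measure is the cross-sectional dispersion of those betas, with low dispersion indicating that volumes co-move.

```python
from statistics import mean, pstdev

def volume_beta(market, stock):
    # OLS slope of a stock's trading volume on aggregate market volume
    mm, ms = mean(market), mean(stock)
    cov = sum((a - mm) * (b - ms) for a, b in zip(market, stock))
    var = sum((a - mm) ** 2 for a in market)
    return cov / var

# hypothetical monthly volumes (millions of shares)
market = [10, 12, 11, 15, 14, 16]
stocks = {
    "A": [5.0, 6.0, 5.5, 7.5, 7.0, 8.0],
    "B": [3.0, 5.0, 4.0, 8.0, 7.0, 9.0],
    "C": [9.0, 9.5, 9.0, 10.0, 10.0, 10.5],
}

betas = [volume_beta(market, v) for v in stocks.values()]
dispersion = pstdev(betas)  # low dispersion -> betas cluster -> herding
```

A herding episode would show up as the dispersion shrinking over time as individual volume betas converge toward the market-wide value.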
Abstract: Given the motivation of the impact of maps in enhancing the perception of the quality of life in a region, this work examines the use of spatial analytical techniques in exploring the role of space in shaping human development patterns in the Assiut governorate. Variations in the human development index (HDI) of the governorate's villages, districts and cities are mapped using geographic information systems (GIS). Global and local spatial autocorrelation measures are employed to assess the levels of spatial dependency in the data and to map clusters of human development. Results show prominent disparities in HDI between regions of Assiut. Strong patterns of spatial association were found, proving the presence of clusters in the distribution of HDI. Finally, the study indicates several "hot spots" in the governorate as areas for further investigation to explore the attributes of such levels of human development. This is very important for accomplishing the development plan for the poorest regions currently adopted in Egypt.
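As an illustration of a global spatial autocorrelation measure, here is a self-contained sketch of Moran's I (a commonly used statistic; the abstract does not name the exact measure used, and the regions and HDI values below are hypothetical). A positive value indicates that similar HDI values cluster in space, as found in the study.

```python
def morans_i(values, weights):
    """Global Moran's I: positive when similar values cluster in space."""
    n = len(values)
    m = sum(values) / n
    dev = [v - m for v in values]
    s0 = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * (num / den)

# four hypothetical regions in a row; adjacent regions share a border
adjacency = [[1 if abs(i - j) == 1 else 0 for j in range(4)] for i in range(4)]
clustered_hdi = [0.50, 0.52, 0.80, 0.82]   # low-low and high-high clusters
dispersed_hdi = [0.50, 0.80, 0.50, 0.80]   # alternating values

print(morans_i(clustered_hdi, adjacency))  # positive: spatial clustering
print(morans_i(dispersed_hdi, adjacency))  # negative: spatial dispersion
```

Local variants of the same statistic, computed per region, are what flag individual "hot spots" on a map.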
Abstract: In most cases, natural disasters lead to the necessity of evacuating people. The quality of evacuation management is dramatically improved by the use of information provided by decision support systems, which become indispensable in large-scale evacuation operations. This paper presents a best-practice case study. In November 2007, officers from the Emergency Situations Inspectorate "Crisana" of Bihor County, Romania, participated in a cross-border evacuation exercise in which 700 people were evacuated from the Netherlands to Belgium. One of the main objectives of the exercise was to test four different decision support systems. Afterwards, based on that experience, a software system called TEVAC (Trans-Border Evacuation) was developed "in house" by the experts of this institution. This original software system was successfully tested in September 2008 during the international exercise EU-HUROMEX 2008, whose scenario involved the real evacuation of 200 persons from Hungary to Romania. Based on the lessons learned and the results, since April 2009 the TEVAC software has been used by all Emergency Situations Inspectorates across Romania.
Abstract: A major requirement for Grid application developers is ensuring the performance and scalability of their applications. Predicting the performance of an application demands an understanding of its specific features. This paper discusses performance modeling and prediction of multi-agent based simulation (MABS) applications on the Grid. An experiment conducted using a synthetic MABS workload explains the key features to be included in the performance model. The results obtained from the experiment show that the prediction model developed for the synthetic workload can be used as a guideline for estimating the performance characteristics of real-world simulation applications.
Abstract: The purposes of this paper are to (1) promote excellence in computer science by suggesting a cohesive, innovative approach to fill well-documented deficiencies in current computer science education, (2) justify (using the authors' and others' anecdotal evidence from both the classroom and the real world) why this approach holds great potential to successfully eliminate the deficiencies, and (3) invite other professionals to join the authors in proof-of-concept research. The authors' experiences, though anecdotal, strongly suggest that a new approach involving visual modeling technologies should allow computer science programs to retain a greater percentage of prospective and declared majors, as students become more engaged learners, more successful problem-solvers, and better prepared as programmers. In addition, the graduates of such computer science programs will make greater contributions to the profession as skilled problem-solvers. Instead of wearily re-memorizing code as they move to the next course, students will have the problem-solving skills to think and work in more sophisticated and creative ways.
Abstract: The measurement of competitiveness between countries or regions is an important topic of many economic analyses and scientific papers. In the European Union (EU), there is no mainstream approach to evaluating and measuring competitiveness. There are many opinions on, and methods for, the measurement and evaluation of competitiveness between states or regions at the national and European level. The methods differ in the structure of the competitiveness indicators used and in the ways they are processed. The aim of the paper is to analyze the main sources of competitive potential of the EU Member States with the help of Factor Analysis (FA) and to classify the EU Member States into homogeneous units (clusters) according to the similarity of selected indicators of competitiveness factors by Cluster Analysis (CA) in the reference years 2000 and 2011. The theoretical part of the paper is devoted to the fundamental bases of competitiveness and the methodology of the FA and CA methods. The empirical part deals with the evaluation of competitiveness factors in the EU Member States and a cluster comparison of the evaluated countries.
Abstract: Using a neural network, we try to model an unknown function f for given input-output data pairs. The connection strength of each neuron is updated through learning. Repeated simulations of the crisp neural network produce different values of the weight factors, which are directly affected by changes in different parameters. We propose the idea that, for each neuron in the network, we can obtain quasi-fuzzy weight sets (QFWS) using repeated simulation of the crisp neural network. This type of fuzzy weight function may be applied where we have multivariate crisp input that needs to be adjusted after iterative learning, as in claim amount distribution analysis. As real data is subject to noise and uncertainty, QFWS may be helpful in the simplification of such complex problems. Secondly, these QFWS provide a good initial solution for the training of fuzzy neural networks with reduced computational complexity.
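A minimal sketch of the underlying idea (using a hypothetical single-neuron "network" and synthetic data, not the authors' experimental setup): repeating the crisp training from different random initializations yields a sample of weight values per connection, whose spread can serve as the support of a quasi-fuzzy weight set.

```python
import random

def train_neuron(xs, ys, seed, epochs=500, lr=0.05):
    """One crisp 'network': a single neuron y = w*x + b trained by SGD."""
    rng = random.Random(seed)
    w, b = rng.uniform(-1, 1), rng.uniform(-1, 1)
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w

xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [2 * x + 1 for x in xs]  # target function f(x) = 2x + 1

# repeat the crisp simulation from 20 different random initializations
weights = [train_neuron(xs, ys, seed) for seed in range(20)]
support = (min(weights), max(weights))  # support of the quasi-fuzzy weight set
```

On noisy data or with fewer epochs the collected weights spread further apart, and a membership function fitted over that sample would express the resulting uncertainty in the weight.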
Abstract: Day by day, technology advances, and the problems associated with this technology also increase. Several studies were carried out to investigate the safe deployment of waste materials in geotechnical engineering in particular and civil engineering in general. Different types of waste material, such as cement dust, fly ash and slag, have been proven to be suitable in several applications. In this research, cement dust mixed with different percentages of sand is used in some civil engineering applications, as explained later in this paper, through field and laboratory tests. The mixture used (waste material with sand) proved to have high performance, durability under environmental conditions, low cost and high benefits. At a higher cement dust ratio, a small cement ratio is valuable for compressive strength and permeability; at a small cement dust ratio, a higher cement ratio is valuable for compressive strength.
Abstract: With the advances in wireless networking, IEEE 802.16 WiMAX technology has been widely deployed for several applications such as "last mile" broadband service, cellular backhaul, and high-speed enterprise connectivity. As a result, the military has employed WiMAX as a high-speed wireless connection for data links for many years because of its point-to-multipoint and non-line-of-sight (NLOS) capability. However, the risk of using WiMAX is a critical factor in some sensitive areas of military applications, especially in ammunition manufacturing such as solid propellant rocket production. US DoD policy states that the following certification requirements must be met for WiMAX: electromagnetic effects on the environment (E3) and Hazards of Electromagnetic Radiation to Ordnance (HERO). This paper discusses the recommended power densities and Safe Separation Distance (SSD) for HERO on WiMAX systems deployed in solid propellant rocket production. This research found that WiMAX is safe to operate at close proximity to the rocket production facility, based on the AF Guidance Memorandum immediately changing AFMAN 91-201.
Abstract: Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to automatically remove the artifacts, in particular from Electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for the artifact extraction and on some higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we try to enhance this technique by proposing a new method based on Renyi's entropy. The performance of our method was tested and compared to the performance of the method in the literature, and the former proved to outperform the latter.
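To illustrate why these statistics flag artifacts, here is a self-contained sketch on synthetic signals (not the paper's EEG pipeline; the histogram-based Renyi entropy estimator and the test signals are illustrative assumptions): an artifact-like transient has high kurtosis and low Renyi entropy compared with a clean oscillation.

```python
import math

def kurtosis(x):
    # excess kurtosis: large positive values indicate peaked, spiky signals
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    return sum((v - m) ** 4 for v in x) / (n * var ** 2) - 3

def renyi_entropy(x, alpha=2, bins=10):
    # Renyi entropy of order alpha from a simple histogram estimate
    lo, hi = min(x), max(x)
    hist = [0] * bins
    for v in x:
        hist[min(int((v - lo) / (hi - lo) * bins), bins - 1)] += 1
    p = [c / len(x) for c in hist if c]
    return math.log(sum(q ** alpha for q in p)) / (1 - alpha)

# a smooth oscillation vs. a blink-like transient (synthetic)
clean = [math.sin(0.1 * t) for t in range(500)]
spiky = [0.01] * 500
spiky[250] = 5.0  # single large transient

print(kurtosis(clean), renyi_entropy(clean))
print(kurtosis(spiky), renyi_entropy(spiky))  # high kurtosis, low entropy
```

In the ICA-based scheme, statistics like these are computed per independent component, and components whose values exceed a threshold are marked as artifacts and removed before reconstructing the signal.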
Abstract: The in-phase/quadrature (I/Q) amplitude and phase imbalance effects are studied in coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems. An analytical model for the I/Q imbalance is developed and supported by simulation results. The results indicate that the I/Q imbalance degrades the BER performance considerably.
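For reference, a commonly used baseband model for I/Q imbalance (one standard convention; the abstract does not give the paper's exact model) writes the impaired signal as a combination of the ideal signal and its conjugate image:

```latex
r(t) = \mu\, s(t) + \nu\, s^{*}(t), \qquad
\mu = \tfrac{1}{2}\left(1 + g\, e^{-j\phi}\right), \quad
\nu = \tfrac{1}{2}\left(1 - g\, e^{j\phi}\right),
```

where g is the amplitude (gain) imbalance and φ the phase imbalance; the ideal case g = 1, φ = 0 gives μ = 1 and ν = 0. The residual image term ν s*(t) causes inter-carrier crosstalk between mirror subcarriers in OFDM, which is what degrades the BER.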
Abstract: HSDPA is a new feature introduced in the Release-5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay compared to the Release-99 DSCH. Until now, the HSDPA system has used turbo coding, which is the best coding technique for approaching the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for some applications like satellite communications, since the transmission distance itself introduces latency due to the limited speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity, although LDPC coding increases the encoding complexity. Though the complexity of the transmitter increases at the NodeB, the end user gains in terms of receiver complexity and bit-error rate. In this paper, the LDPC encoder is implemented using a sparse parity check matrix H to generate a codeword, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops suddenly as the number of iterations increases with a small increase in Eb/No, which is not possible with turbo coding. Also, the same BER was achieved using fewer iterations, and hence the latency and receiver complexity are decreased with LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better quality and more reliable, more robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and number of users achieved will improve significantly.
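To make the decoding idea concrete, here is a toy sketch of iterative decoding over a small sparse parity check matrix H (a hard-decision bit-flipping decoder, a deliberately simplified stand-in for full soft-decision belief propagation, on an illustrative (7,4) code, not the HSDPA channel code): each iteration flips the bit involved in the most unsatisfied parity checks.

```python
# Toy sparse parity check matrix H of a (7,4) code (illustrative only)
H = [
    [1, 1, 1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0, 1, 0],
    [1, 0, 1, 1, 0, 0, 1],
]

def syndrome(H, word):
    # parity of each check equation; all zeros means a valid codeword
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

def bit_flip_decode(H, word, max_iters=20):
    word = list(word)
    for _ in range(max_iters):
        s = syndrome(H, word)
        if not any(s):
            break  # all parity checks satisfied
        # count the unsatisfied checks touching each bit, flip the worst bit
        counts = [sum(H[i][j] for i in range(len(H)) if s[i])
                  for j in range(len(word))]
        word[counts.index(max(counts))] ^= 1
    return word

received = [1, 0, 0, 0, 0, 0, 0]  # all-zero codeword with bit 0 flipped
print(bit_flip_decode(H, received))  # recovers the all-zero codeword
```

Belief propagation replaces these hard bit counts with probabilistic messages passed between bit and check nodes, which is why the BER drops sharply as the iteration count grows.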