Abstract: Hydrogen diffusion is the principal mechanism behind corrosion fatigue in corrosive environments. To analyze the phenomenon, the behavior of the diffusing species, hydrogen in particular, must be understood. Predicting hydrogen embrittlement, a major corrosive contributor to fracture, therefore requires solving combinations of different equations mathematically. The key to obtaining the governing equation is knowledge of the driving force, the source that causes diffusion and moves atoms into the material; it is produced by a gradient of either electrical or chemical potential. In this work, we consider the gradient of chemical potential to obtain the property equation. Some of the diffusing atoms may be trapped, but trapping can be neglected under certain conditions. In light of the hydrogen embrittlement phenomenon, the thermodynamic and chemical properties of hydrogen are considered and related to fracture mechanics. In particular, a stress intensity factor is obtained using fugacity as a property of hydrogen or other gases. Although the diffusive behavior and the embrittlement mechanism are common to other gases, we describe them for hydrogen for the sake of clarity; focusing on a specific gas makes the importance of this relation easier to appreciate.
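The chemical-potential driving force described above has a compact standard form. As a textbook relation (not taken from this paper), the diffusive flux J follows the gradient of the chemical potential \mu, and with \mu expressed through the fugacity f the flux can be written in terms of the fugacity gradient:

```latex
J = -\frac{D\,c}{RT}\,\nabla\mu,
\qquad
\mu = \mu_0 + RT\ln\frac{f}{f_0}
\quad\Longrightarrow\quad
J = -D\,c\,\nabla(\ln f),
```

where D is the diffusion coefficient, c the concentration, R the gas constant and T the temperature. For an ideal (dilute) solution f is proportional to c, and the expression reduces to Fick's first law, J = -D\nabla c.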
Abstract: In this paper, an extreme learning machine with an automatic segmentation algorithm is applied to heart disorder classification from heart sound signals. From continuous heart sound signals, the starting points of the first (S1) and second (S2) heart pulses are extracted and corrected by utilizing an inter-pulse histogram. From the corrected pulse positions, a single period of the heart sound signal is extracted and converted to a feature vector comprising the mel-scaled filter bank energy coefficients and the envelope coefficients of uniform-sized sub-segments. An extreme learning machine is used to classify the feature vector. In our cardiac disorder classification and detection experiments with 9 cardiac disorder categories, the proposed method shows significantly better performance than the multi-layer perceptron, support vector machine, and hidden Markov model; it achieves a classification accuracy of 81.6% and a detection accuracy of 96.9%.
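The extreme learning machine classifier at the core of the pipeline can be sketched in a few lines: random, untrained hidden-layer weights and a closed-form least-squares solve for the output weights. The feature dimension, hidden-layer size, class count and data below are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for the paper's feature vectors: 200 samples,
# 40-dimensional features (e.g. filter-bank + envelope coefficients),
# 3 classes. The real data and dimensions are assumptions here.
X = rng.normal(size=(200, 40))
y = rng.integers(0, 3, size=200)
T = np.eye(3)[y]                        # one-hot targets

# Extreme learning machine: random hidden layer, closed-form output weights.
n_hidden = 100
W = rng.normal(size=(40, n_hidden))     # input weights, never trained
b = rng.normal(size=n_hidden)           # hidden biases, never trained
H = 1.0 / (1.0 + np.exp(-(X @ W + b)))  # sigmoid hidden activations

beta = np.linalg.pinv(H) @ T            # least-squares output weights

pred = np.argmax(H @ beta, axis=1)      # classify by largest output
train_acc = np.mean(pred == y)
```

Because only `beta` is fitted, and in closed form, training is much faster than backpropagation, which is the usual motivation for choosing an ELM over a multi-layer perceptron.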
Abstract: New ways of working refers to non-traditional work practices, settings and locations supported by information and communication technologies (ICT) that supplement or replace traditional ways of working. It questions the conventional work practices and settings still widely used in knowledge-intensive organizations today. In this study, new ways of working is seen to consist of two elements: the work environment (physical, virtual and social) and work practices. This study aims to gather the scattered information together and deepen the understanding of new ways of working. Moreover, the objective is to provide some evidence of the still unclear productivity impacts of new ways of working using a case study approach.
Abstract: Signature amortization schemes have been introduced
for authenticating multicast streams, in which a single signature is
amortized over several packets. The hash value of each packet is
computed, and some hash values are appended to other packets, forming
what is known as a hash chain. These schemes divide the stream into
blocks, each block being a number of packets; the signature packet in
these schemes is either the first or the last packet of the block.
Amortization schemes are efficient solutions in terms of computation
and communication overhead, especially in real-time environments.
The main effective factor of an amortization scheme is its hash chain
construction. Some studies show that signing the first packet of each
block reduces the receiver's delay and prevents DoS attacks, while
other studies show that signing the last packet reduces the sender's
delay. To our knowledge, no study has shown which is better, signing
the first or the last packet, in terms of authentication probability
and resistance to packet loss.
In this paper we introduce another scheme for authenticating
multicast streams that is robust against packet loss, reduces the
overhead, and at the same time prevents the DoS attacks experienced
by the receiver. Our scheme, the Multiple Connected Chain signing
the First packet (MCF), appends the hash values of specific
packets to other packets, then appends some hashes to the signature
packet, which is sent as the first packet in the block. This scheme
is especially efficient in terms of the receiver's delay. We discuss
and evaluate the performance of our proposed scheme against those
that sign the last packet of the block.
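As a simplified illustration of hash-chain amortization with the signature packet sent first, the sketch below builds a single linear chain over one block. The actual MCF construction uses multiple connected chains for loss resistance, so this shows only the basic idea; the digital signature itself is elided (the first hash would be signed once per block).

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_block(payloads):
    """Build one block of a simple hash chain, signing the first packet.

    Each packet carries its payload plus the 32-byte hash of the next
    packet; the signature packet (sent first) carries the hash of
    packet 1 and is the only packet that needs a digital signature.
    """
    packets = []
    next_hash = b""                  # the last packet carries no hash
    for payload in reversed(payloads):
        pkt = payload + next_hash
        next_hash = sha256(pkt)
        packets.append(pkt)
    packets.reverse()
    signature_packet = next_hash     # this value would be signed once
    return signature_packet, packets

def verify_block(signature_packet, packets):
    """Receiver side: check each packet against the hash carried earlier."""
    expected = signature_packet
    for pkt in packets:
        if sha256(pkt) != expected:
            return False
        # The trailing 32 bytes are the hash of the next packet
        # (unused after the last packet).
        expected = pkt[-32:] if len(pkt) >= 32 else b""
    return True
```

With this layout the receiver can authenticate each packet as it arrives, which is why signing the first packet reduces the receiver's delay; the trade-off is that the sender must buffer the whole block before sending.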
Abstract: Problems of high complexity have long been a challenge in combinatorial optimization. Owing to their non-deterministic polynomial (NP) characteristics, these problems usually demand an unreasonable search budget, and combinatorial optimization has therefore attracted numerous researchers to develop better algorithms. Recent academic research mostly focuses on enhancing conventional evolutionary algorithms with local heuristics such as VNS, 2-opt and 3-opt. Although the performance gains from introducing such local strategies are significant, these improvements do not carry over to solving different problems. Therefore, this research proposes a meta-heuristic evolutionary algorithm that can be applied to several types of problems. The validation shows that BBEA is able to solve these problems even without the design of local strategies.
Abstract: Today advertising is actively penetrating many spheres of our lives; we cannot imagine the existence of many economic activities, trade and services above all, without it. Every one of us should look more closely at everyday communication and carefully consider the amount and quality of the information we receive, as well as its influence on our behaviour. Special attention should be paid to the young generation. Theoretical and practical research has proved the ever-growing influence of information (especially that contained in advertising) on society: on its economics, culture, religion, politics and even people's private lives and behaviour. Children have plenty of free time and therefore see a lot of different advertising. Though the education of children is in the hands of parents and schools, advertising makers and their customers should think responsibly about the selection of time slots and transmission channels for child-targeted advertising. The purpose of the present paper is to investigate the influence of advertising upon the consumer views and behaviour of children in different age groups. The present investigation clarifies the influence of advertising, as a means of information, on the group that is the most vulnerable in the modern information society: children. In this paper we assess children's perception and understanding of advertising.
Abstract: Devices in a pervasive computing system (PCS) are characterized by their context-awareness, which permits them to provide proactively adapted services to users and applications. To do so, context must be well understood and modeled in an appropriate form that enhances its sharing between devices and provides a high level of abstraction. The most interesting methods for modeling context are those based on ontology; however, the majority of the proposed methods fail to propose a generic ontology for context, which limits their usability and keeps them specific to a particular domain. The adaptation task must be done automatically, without explicit intervention by the user. Devices of a PCS must acquire some intelligence that permits them to sense the current context and trigger the appropriate service, or provide a service in a better suited form. In this paper we propose a generic service ontology for context modeling and a context-aware service adaptation based on a service-oriented definition of context.
Abstract: The grasslands of Iran face vast desertification and
destruction. Some legumes are plants of forage importance with high
palatability. The legumes studied in this project are Onobrychis,
Medicago sativa (alfalfa) and Trifolium repens. Seeds were cultivated
in the research field of Kaboutarabad (33 km east of Isfahan, Iran),
which has an average annual rainfall of 80 mm. Plants were cultivated
in a split plot design with three replicates and two water treatments
(weekly irrigation, and water stress with the same amount applied at
15-day intervals). Water entering each plot was measured by partial
flow. The project lasted 20 weeks. Destructive samplings (1 m2 each
time) were done weekly. At each sampling, plants were gathered and
weighed separately for each vegetative part. An area meter (Vista)
was used to measure root surface and leaf area. Total shoot and root
fresh and dry weight, leaf area index and soil coverage were
evaluated as well. Dry weight was obtained in a 75 °C oven after 24
hours. Statgraphics and Harvard Graphics software were used to
formulate and plot the parameter curves over time. Our results show
that Trifolium repens was affected 60% and Medicago sativa 18% by
water stress. Onobrychis total fresh weight was reduced by 45%. Dry
weight, or biomass, in alfalfa is not strongly affected by water
shortage; this means that in alfalfa fields the irrigation amount can
be decreased while roughly the same biomass is obtained. Onobrychis
shows a drastic decrease in biomass. The increase in total dry matter
over time in the studied plants is formulated. For Trifolium repens,
if removal or cattle entrance to the meadows does not occur at the
right time, the palatability and water content of the shoots will
decrease. Water stress over a short period can develop the root
system in Trifolium repens, but if it lasts longer, other ecological
and soil factors will affect the growth of this plant. A low level of
soil water is not very important for the studied legume forages, but
water shortage affects the palatability and water content of the
aerial parts. Leaf area over time in the studied legumes is
formulated; leaf area in fact decreases with a shortage of available
water, and higher leaf area means higher forage and biomass
production. Medicago and Onobrychis reach their maximum leaf area
sooner than Trifolium and are able to produce an optimum soil cover
and inhibit the evaporation of soil water from the meadows. The
correlation of root surface to total biomass in the studied plants is
formulated. Medicago under water stress shows a 40% decrease in crown
cover, while under optimum conditions this value reaches 100%. To
produce forage in areas without soil erosion, Medicago is the best
choice even with a shortage of water resources. The work attempts to
represent the growth simulation of three well-known forage legumes.
With growth simulation, farmers and range managers can better decide
on the plant best adapted to the available water without designing
time- and labor-consuming field experiments.
Abstract: Residential satisfaction has been vigorously debated in
the family house setting. Nonetheless, little attention has been
given to surveying student residential satisfaction in the campus
housing setting. This study tries to fill the gap by focusing on the
relationship between students' socio-economic backgrounds and their
residential satisfaction with on-campus student housing facilities.
A two-stage cluster sampling method was employed to classify the
respondents. Then, self-administered questionnaires were distributed
face-to-face to the students. In general, it was confirmed that the
students' socio-economic backgrounds significantly influence their
satisfaction with on-campus student housing facilities. The main
influential factors were revealed to be economic status, sense of
sharing, and the ethnicity of roommates. Furthermore, this study can
provide useful feedback for university administrations seeking to
improve their student housing facilities.
Abstract: Microscopic emission and fuel consumption models
have been widely recognized as an effective method to quantify real
traffic emission and energy consumption when they are applied with
microscopic traffic simulation models. This paper presents a
framework for developing the Microscopic Emission (HC, CO, NOx,
and CO2) and Fuel consumption (MEF) models for light-duty
vehicles. The variable of composite acceleration is introduced into
the MEF model with the purpose of capturing the effects of historical
accelerations interacting with current speed on emission and fuel
consumption. The MEF model is calibrated by multivariate
least-squares method for two types of light-duty vehicle using
on-board data collected in Beijing, China by a Portable Emission
Measurement System (PEMS). The instantaneous validation results
show that the MEF model performs better, with a lower Mean Absolute
Percentage Error (MAPE), than two other models. Moreover, the
aggregate validation results indicate that the MEF model produces
reasonable estimates compared to actual measurements, with
prediction errors within 12%, 10%, 19%, and 9% for HC, CO, and NOx
emissions and fuel consumption, respectively.
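The MAPE criterion used in the instantaneous validation can be sketched as follows; the sample fuel-rate values are hypothetical, not the paper's PEMS data.

```python
import numpy as np

def mape(actual, predicted):
    """Mean Absolute Percentage Error: mean of |actual - predicted| / actual,
    expressed in percent; used to score instantaneous model predictions
    against measured values."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

# Hypothetical second-by-second fuel rates (g/s); illustrative only.
measured  = [0.8, 1.2, 1.0, 1.5]
estimated = [0.9, 1.1, 1.0, 1.4]
err = mape(measured, estimated)   # percentage error across the samples
```

Note that MAPE divides by the measured value, so near-zero measurements (e.g. idling emissions) inflate the score; aggregate validation over trips, as used in the paper, avoids that sensitivity.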
Abstract: Email has become a fast and cheap means of online
communication. The main threat to email is Unsolicited Bulk Email
(UBE), commonly called spam email. The current work aims at
identification of unigrams in more than 2700 UBE that advertise
body-enhancement drugs. The identification is based on the
requirement that the unigram is neither present in dictionary, nor is a
slang term. The motives of the paper are manifold. This is an
attempt to analyze spamming behaviour and the employment of
word-mutation techniques. On the sidelines of the paper, we have
attempted to better understand spam, slang and their interplay.
The problem has been addressed by employing a tokenization
technique and a unigram bag-of-words (BOW) model. We found that
non-lexicon words constitute nearly 66% of the total lexis of the
corpus, whereas non-slang words constitute nearly 2.4% of
non-lexicon words. Further, non-lexicon non-slang unigrams composed
of two lexicon words form more than 71% of the total number of such
unigrams. To the best of our knowledge, this is the first attempt to
analyze usage of non-lexicon non-slang unigrams in any kind of
UBE.
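The tokenization and unigram-classification step can be sketched as below; the tiny lexicon and slang sets are stand-ins for the full dictionary and slang resources used in the study, and the sample text is invented.

```python
import re
from collections import Counter

# Toy stand-ins for the dictionary and slang list used in the study.
LEXICON = {"buy", "cheap", "pills", "now", "best", "price"}
SLANG = {"meds"}

def unigrams(text):
    """Tokenize into lowercase alphanumeric unigrams, keeping
    digit-mutated spam words (e.g. 'pi11s') as single tokens."""
    return re.findall(r"[a-z0-9]+", text.lower())

def classify_unigrams(text):
    """Build the unigram BOW and split it into non-lexicon words and
    non-lexicon non-slang words (candidate mutated unigrams)."""
    counts = Counter(unigrams(text))
    non_lexicon = {w for w in counts if w not in LEXICON}
    non_lexicon_non_slang = {w for w in non_lexicon if w not in SLANG}
    return counts, non_lexicon, non_lexicon_non_slang

text = "Buy CHEAP pi11s now, best meds, best priice"
counts, non_lex, mutated = classify_unigrams(text)
```

Here `mutated` holds the non-lexicon non-slang unigrams ("pi11s", "priice"), i.e. the word-mutation candidates the paper analyzes; on a real corpus these sets would be compared against full dictionary and slang resources.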
Abstract: In this work, the influence of temperature on the
different parameters of solar cells based on organic semiconductors
is studied. The short circuit current Isc increases monotonically
with temperature, saturates to a maximum value, and then decreases
at high temperatures. The open circuit voltage Voc decreases
linearly with temperature. The fill factor FF and the efficiency,
which are directly related to Isc and Voc, follow the variations of
the latter. These phenomena are explained by the behaviour of the
mobility, which is a thermally activated process.
Abstract: Defect prevention is the most vital but habitually
neglected facet of software quality assurance in any project. If
practiced at all stages of software development, it can reduce the
time, overhead and resources required to engineer a high quality
product. The key challenge of the IT industry is to engineer a
software product with minimum post-deployment defects.
This effort is an analysis based on data obtained for five selected
projects from leading software companies of varying software
production competence. The main aim of this paper is to provide
information on various methods and practices supporting defect
detection and prevention, leading to successful software
development. The defect prevention techniques unearth 99% of
defects. Inspection is found to be an essential technique for
generating ideal software through enhanced methodologies of aided
and unaided inspection schedules. On average, 13% to 15% of the
whole project effort time on inspection and 25% to 30% on testing
are required for 99% to 99.75% defect elimination.
A comparison of the end results for the five selected projects
across the companies is also presented, shedding light on the
possibility for a particular company to position itself with an
appropriate complementary ratio of inspection to testing.
Abstract: Calcium oxide (CaO) as a carbon dioxide (CO2)
adsorbent at elevated temperatures has been very well received
thus far. CaO can be synthesized from natural calcium carbonate
(CaCO3) sources through the reversible calcination-carbonation
process. In this study, cockle shell was selected as the CaO
precursor. The objectives of the study are to investigate the
performance of calcination and carbonation with respect to
temperature, heating rate, particle size and duration. Overall, the
best performance is obtained at a calcination temperature of 850 °C
for 40 minutes, a heating rate of 20 °C/min, a particle size of
< 0.125 mm and a carbonation temperature of 650 °C. The synthesized
materials were characterized by nitrogen physisorption and
surface morphology analysis. The effectiveness of the synthesized
cockle shell in capturing CO2 (0.72 kg CO2/kg adsorbent), which is
comparable to a commercial adsorbent (0.60 kg CO2/kg adsorbent),
makes it a most promising material for CO2 capture.
Abstract: To derive the fractional flow equation, oil
displacement is assumed to take place under the so-called
diffusive flow condition. The constraint is that fluid saturations
at any point in the linear displacement path are uniformly
distributed with respect to thickness; this allows the displacement
to be described mathematically in one dimension. The simultaneous
flow of oil and water can be modeled using thickness-averaged
relative permeabilities along the centerline of the reservoir. The
condition for fluid potential equilibrium is simply that of
hydrostatic equilibrium, for which the saturation distribution can
be determined as a function of capillary pressure and therefore of
height; that is, the fluids are distributed in accordance with
capillary-gravity equilibrium.
This paper focuses on the fractional flow of water versus
cumulative oil recovery using the Buckley-Leverett method. Several
field cases have been developed to aid in the analysis. The
producing water-cut (at surface conditions) is compared with the
cumulative oil recovery at breakthrough for the flowing fluid.
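Under the assumptions above, and additionally neglecting the capillary pressure gradient for horizontal flow, the Buckley-Leverett fractional flow of water takes the standard textbook form, reproduced here for reference (not quoted from the paper):

```latex
f_w \;=\; \frac{q_w}{q_w + q_o}
\;=\; \frac{1}{\,1 + \dfrac{k_{ro}\,\mu_w}{k_{rw}\,\mu_o}\,},
```

where q_w and q_o are the water and oil flow rates, k_{rw} and k_{ro} the relative permeabilities evaluated at the local water saturation, and \mu_w, \mu_o the fluid viscosities. Because k_{rw} and k_{ro} depend on saturation, f_w is an S-shaped function of water saturation, which is what the Buckley-Leverett construction exploits to relate water-cut to cumulative recovery.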
Abstract: Maximal length sequences (m-sequences) are also
known as pseudo random sequences or pseudo noise sequences
because they closely follow Golomb's popular randomness postulates:
(P1) balance, (P2) run, and (P3) ideal autocorrelation. Apart from
these, certain other less known properties of such sequences also
exist, all of which are discussed in this tutorial paper.
Comprehensive proofs of each of these properties are provided
towards a better understanding of such sequences. A simple test is
also proposed at the end of the paper to distinguish pseudo noise
sequences from truly random sequences such as Bernoulli sequences.
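An m-sequence can be generated with a linear feedback shift register (LFSR). The sketch below uses the primitive polynomial x^3 + x^2 + 1 as an illustrative choice (any primitive polynomial works) and exhibits the balance property (P1): in one period of length 2^n - 1, the number of ones exceeds the number of zeros by exactly one.

```python
def lfsr_msequence(taps, degree):
    """Generate one period of an m-sequence from a Fibonacci LFSR.

    taps: 1-indexed feedback stage positions, e.g. [3, 2] for the
    primitive polynomial x^3 + x^2 + 1, which yields a maximal-length
    sequence of period 2^3 - 1 = 7.
    """
    state = [1] * degree              # any nonzero seed works
    seq = []
    for _ in range(2 ** degree - 1):
        seq.append(state[-1])         # output the last stage
        fb = 0
        for t in taps:                # XOR the tapped stages
            fb ^= state[t - 1]
        state = [fb] + state[:-1]     # shift in the feedback bit
    return seq

seq = lfsr_msequence([3, 2], 3)       # one full period
ones, zeros = seq.count(1), seq.count(0)
```

Checking the run property (P2) and the two-valued periodic autocorrelation (P3) on `seq` follows the same pattern; a non-primitive tap choice would fail these checks by repeating before 2^n - 1 outputs.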
Abstract: This study uses a simulation to establish a realistic
environment for laboratory research on Accountable Care
Organizations. We study network attributes in order to gain insights
regarding healthcare providers' conduct and performance. Our
findings indicate how network structure creates significant
differences in organizational performance. We demonstrate that
healthcare providers who position themselves at the central, pivotal
point of the network while maintaining alliances with their
partners produce better outcomes.
Abstract: A Space Vector based Pulse Width Modulation
control technique for the three-phase PWM converter is proposed in
this paper. The proposed control scheme is based on a synchronous
reference frame model. High performance and efficiency is obtained
with regards to the DC bus voltage and the power factor
considerations of the PWM rectifier thus leading to low losses.
MATLAB/SIMULINK is used as the platform for the simulations, and
a SIMULINK model is presented in the paper. The results show that
the proposed model demonstrates better performance and properties
than the traditional SPWM method, and that it drastically improves
the dynamic performance of the closed loop.
In sine-triangle pulse width modulation, a sine signal is the
reference waveform and a triangle waveform is the carrier. When the
sine reference is larger than the triangle carrier, the output pulse
goes high; when the triangle carrier is higher than the sine
reference, the pulse goes low. The SPWM output is changed by varying
the modulation index and the frequency used in the system to produce
different pulse widths; the more pulse widths produced, the lower
the harmonic content of the output voltage and the higher the
resolution.
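The sine-triangle comparison described above can be sketched generically. This is a plain naturally-sampled SPWM comparator with assumed reference and carrier frequencies, not the paper's space-vector controller.

```python
import math

def spwm_samples(mod_index, f_ref, f_carrier, n=1000):
    """Naturally-sampled sine-triangle PWM over one reference period:
    the output is high whenever the sine reference exceeds the
    triangular carrier. mod_index scales the reference amplitude;
    frequencies are in Hz."""
    out = []
    for k in range(n):
        t = k / n * (1.0 / f_ref)                 # one reference period
        ref = mod_index * math.sin(2 * math.pi * f_ref * t)
        # Symmetric triangle carrier swinging between -1 and +1
        phase = (f_carrier * t) % 1.0
        tri = 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase
        out.append(1 if ref > tri else 0)
    return out

# Illustrative values: 50 Hz reference, 1 kHz carrier, 0.8 modulation index.
pulses = spwm_samples(mod_index=0.8, f_ref=50, f_carrier=1000)
duty = sum(pulses) / len(pulses)
```

Raising `f_carrier` produces more (narrower) pulses per reference period, which is the mechanism behind the lower harmonic content and higher resolution noted above.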
Abstract: Any use of energy in industrial productive activities is accompanied by various environmental impacts. Within transportation,
this fact is found not only in land transport, railways and maritime transport, but also in the air transport industry. Effective climate protection requires strategies and measures for reducing all
greenhouse gas emissions, in particular carbon dioxide, and must
take into account economic, ecological and social aspects. It now
seems imperative to develop and manufacture environmentally
friendly products and systems, to reduce consumption and use fewer
resources, and to save energy and power. Today's products could
better serve these requirements by integrating a power management
system into the electrical power system. This paper gives an
overview of an approach to power management with load prioritization
in modern aircraft. Load dimensioning and load management strategies
on current civil aircraft are presented and used as a basis for the
proposed approach.
Abstract: Decentralized Tuple Space (DTS) implements the tuple
space model among a series of decentralized hosts and provides a
logical global shared tuple repository. Replication has been
introduced to mitigate the performance problems incurred by remote
tuple access. In this paper, we propose a replication approach for
DTS that allows replication policies to self-adapt. Accesses from
users or other nodes are monitored and collected to contribute to
the decision making. The replication policy may be changed if better
performance is expected. Experiments show that this approach
suitably adjusts the replication policies while incurring negligible
overhead.