Abstract: The research was carried out on fresh wild cranberries
growing in Latvia and on cranberry cultivars. The aim of the study
was to evaluate the effect of pre-treatment method and drying
conditions on the composition of volatile compounds in cranberries.
The berry pre-treatment methods were perforation, halving, and
steam blanching. Before drying in a cabinet drier, the berries were
pre-treated using all three methods; before drying in a microwave
vacuum drier, steam blanching and halving were used. Volatile
compounds in cranberries were analysed by GC-MS of extracts
obtained by SPME. In the present research, 21 different volatile
compounds were detected in fresh cranberries: 15 in the cultivar
'Steven', 13 in 'Bergman' and 'Early black', 11 in 'Ben Lear' and
'Pilgrim', and 14 in wild cranberries. In dried cranberries, 20 volatile
compounds were detected. Statistical data processing supports the
conclusion that cranberry cultivar, pre-treatment method, and drying
conditions significantly influence the volatile compounds in the
berries and the formation of new volatile compounds.
Abstract: Pore water pressure normally arises from consolidation,
compaction, and water level fluctuation in the reservoir. Measuring,
controlling, and analyzing pore water pressure are of significant
importance during both the construction and operation periods.
Since the end of 2002 (dam start-up), the behavior of the KARKHEH
dam has been analyzed using information gathered from the dam's
instrumentation system. In this paper, the condition of the dam after
start-up is analyzed using data gathered from piezometers located in
the dam core. According to the Terzaghi equation and the piezometer
records, consolidation lasted around five years during the early years
of the construction stage, and the current pore water pressure in the
dam core is caused by water level fluctuation in the reservoir. There
is, however, a time lag between water level fluctuation and the
piezometer readings. These time lags have been examined, and the
results clearly show that one of their most important causes is the
distance between a piezometer and the reservoir.
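The Terzaghi relation referred to above can be illustrated with a minimal sketch. The code below evaluates the classical series solution for the average degree of consolidation U as a function of the dimensionless time factor Tv = cv·t/H²; the coefficient values and drainage length are illustrative assumptions for the example, not data from the KARKHEH dam.

```python
import math

def degree_of_consolidation(Tv, terms=50):
    """Average degree of consolidation U(Tv) from Terzaghi's
    one-dimensional consolidation theory (series solution)."""
    U = 1.0
    for m in range(terms):
        M = math.pi * (2 * m + 1) / 2
        U -= (2.0 / M ** 2) * math.exp(-M ** 2 * Tv)
    return U

def time_factor(cv, t, H):
    """Dimensionless time factor Tv = cv * t / H**2, where cv is the
    coefficient of consolidation and H the drainage path length."""
    return cv * t / H ** 2
```

At Tv ≈ 0.197 the series gives U ≈ 50%, the textbook benchmark; plotting U against elapsed time for an assumed cv would reproduce the kind of multi-year consolidation period inferred from the piezometer records.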
Abstract: In this paper, a comprehensive algorithm is presented to alleviate the undesired simultaneous effects of target maneuvering, glint noise distribution, and colored noise spectrum using online colored glint noise parameter estimation. The simulation results illustrate a significant reduction in the root mean square error (RMSE) produced by the proposed algorithm compared to algorithms that do not compensate for all of the above effects simultaneously.
Abstract: Steganography is the process of hiding one file inside another such that others can neither identify the meaning of the embedded object nor even recognize its existence. Current trends favor using digital image files as the cover file to hide another digital file that contains the secret message or information. One of the most common methods of implementation is Least Significant Bit insertion, in which the least significant bit of every byte is altered to form the bit-string representing the embedded file. Altering the LSB causes only minor changes in color and is thus usually not noticeable to the human eye. While this technique works well for 24-bit color image files, steganography has not been as successful with 8-bit color image files, due to limitations in color variation and the use of a colormap. This paper presents the results of research investigating the combination of image compression and steganography. The technique developed starts with a 24-bit color bitmap file, then compresses the file by organizing and optimizing an 8-bit colormap. After compression, a text message is hidden in the final, compressed image. Results indicate that the final technique has the potential to be useful in the steganographic world.
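The LSB insertion step described above can be sketched in a few lines. This is a minimal byte-level illustration, not the paper's colormap-based technique: it assumes a raw buffer of pixel bytes (e.g. the pixel array of a 24-bit bitmap) and a 16-bit length header, both of which are choices made here for the example.

```python
def embed(cover: bytes, message: bytes) -> bytes:
    """Hide `message` in the least significant bits of `cover`.
    A 16-bit big-endian length header precedes the payload."""
    payload = len(message).to_bytes(2, "big") + message
    # one LSB per cover byte, message bits taken MSB-first
    bits = [(byte >> (7 - i)) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for this message")
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the LSB
    return bytes(out)

def extract(stego: bytes) -> bytes:
    """Recover the hidden message from the LSBs of `stego`."""
    def read_bytes(start, n):
        data = bytearray()
        for b in range(n):
            val = 0
            for i in range(8):
                val = (val << 1) | (stego[start + b * 8 + i] & 1)
            data.append(val)
        return bytes(data)
    length = int.from_bytes(read_bytes(0, 2), "big")
    return read_bytes(16, length)
```

Because only the lowest bit of each byte changes, each altered pixel component moves by at most one intensity level, which is the "minor change in color" the abstract refers to.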
Abstract: Digital libraries are becoming increasingly necessary to
support users with powerful and easy-to-use tools for searching,
browsing, and retrieving media information. The starting point for
these tasks is the segmentation of video content into shots. In this
study, a fully automatic procedure is developed to segment MPEG
video streams into shots by detecting both abrupt and gradual
transitions (dissolves and fade groups) in real time with minimal
decoding. Each transition type is detected in two phases: analysis of
macro-block types in B-frames, and on-demand analysis of intensity
information. The experimental results show remarkable performance
in detecting gradual transitions for some kinds of input data and
comparable results for the rest of the examined video streams.
Almost all abrupt transitions could be detected with very few false
positive alarms.
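As a toy illustration of the abrupt-transition phase, the sketch below flags hard cuts from grey-level histogram differences between consecutive frames. This is a simplified pixel-domain analog chosen for the example; the procedure in the study works in the compressed domain on macro-block types and intensity information, which is not reproduced here.

```python
def histogram(frame, bins=16):
    """Grey-level histogram of a flat list of 8-bit pixel values."""
    h = [0] * bins
    for p in frame:
        h[p * bins // 256] += 1
    return h

def hist_diff(h1, h2):
    """Normalized L1 histogram distance in [0, 1]."""
    return sum(abs(a - b) for a, b in zip(h1, h2)) / (2 * sum(h1))

def detect_cuts(frames, threshold=0.5):
    """Indices i where frame i starts a new shot (abrupt transition)."""
    cuts = []
    hists = [histogram(f) for f in frames]
    for i in range(1, len(hists)):
        if hist_diff(hists[i - 1], hists[i]) > threshold:
            cuts.append(i)
    return cuts
```

Gradual transitions (dissolves, fades) spread the histogram change over many frames and so fall below a single-frame threshold, which is why the study needs a dedicated second analysis phase for them.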
Abstract: Provision of optical devices without proper instruction
and training may cause frustration, resulting in rejection or incorrect
use of magnifiers. However, training in the use of magnifiers
increases the cost of providing these devices. This study compared
the efficacy of providing instruction alone versus instruction plus
training in the use of magnifiers. Twenty-four participants were
randomly assigned to two groups: 15 received instruction and
training, and 9 received instruction only. Repeated measures of print
size and reading speed were performed before training, after training,
and at follow-up. Print size decreased in both groups between pre-
and post-training and was maintained at follow-up. Reading speed
increased in both groups over time, with the training group
demonstrating more rapid improvement. Whilst overall outcomes
were similar, training decreased the time required to increase reading
speed, supporting the use of training for increased efficiency. A
cost-effective form of training is suggested.
Abstract: Thousands of masters athletes participate
quadrennially in the World Masters Games (WMG), yet this cohort
of athletes remains proportionately under-investigated. Given the
growing global obesity pandemic and the benefits of physical
activity across the lifespan, the prevalence of obesity in this unique
population was of particular interest. Data gathered on a sub-sample
of 535 football code athletes, aged 31-72 yrs (mean = 47.4, s = ±7.1),
competing at the Sydney World Masters Games (2009) demonstrated
a significantly (p
Abstract: A gradient learning method to regulate the trajectories
of some nonlinear chaotic systems is proposed. The method is
motivated by the gradient descent learning algorithms for neural
networks. It is based on two systems: a dynamic optimization system
and a system for finding sensitivities. Numerical results for several
examples are presented, which convincingly illustrate the efficiency
of the method.
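A minimal sketch of the idea, under assumptions made for this example: the chaotic logistic map x' = r·x·(1−x) with an additive control u, where u is updated by gradient descent on the squared tracking error using the one-step sensitivity ∂x'/∂u = 1. The target 0.5 is chosen here because the closed loop is strongly contracting there; the paper's full method with a separate sensitivity system is not reproduced.

```python
def regulate_logistic(r=3.9, target=0.5, eta=0.5, steps=300):
    """Gradient learning of an additive control u that steers the
    chaotic logistic map x' = r*x*(1-x) + u toward `target`.
    The update u -= eta*(x - target) is gradient descent on
    E = (x - target)**2 / 2 with one-step sensitivity dx/du = 1."""
    x, u = 0.3, 0.0
    for _ in range(steps):
        x = min(max(r * x * (1 - x) + u, 0.0), 1.0)  # keep the state bounded
        u -= eta * (x - target)                      # gradient step on u
    return x, u
```

For these parameters the trajectory settles on the target, with u converging to target − r·target·(1−target) = −0.475; without the control term, r = 3.9 puts the map deep in its chaotic regime.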
Abstract: Part IV of the Civil Code of the Russian Federation, dedicated to the legal regulation of intellectual property rights, came into force in 2008. It is the first attempt at codification in the intellectual property sphere in Russia, and as a result many new norms appeared. The main problem of the Russian Civil Code (Part IV) is that many of its rules (norms of law) contradict the norms of international intellectual property law (i.e., protection of inventions, creations, ideas, know-how, trade secrets, and innovations). Intellectual property rights protect innovations and creations and reward innovative and creative activity. Intellectual property rights are international in character, and in that respect they fit rather well with the economic reality of the global economy. Inventors often prefer not to take out a patent for their inventions because it is a very difficult procedure that takes a lot of time and is very expensive. That is why they try to protect their inventions as ideas, know-how, or confidential information. An idea is the main element of any object of intellectual property (creation, invention, innovation, know-how, etc.), but ideas are not protected by the Civil Code of the Russian Federation. The aim of the paper is to reveal the main problems of the legal regulation of intellectual property in Russia and to suggest possible solutions. The authors have raised these essential issues through different activities: through a panel survey and questionnaires distributed among participants in intellectual activities, the main problems of implementing innovations and protecting ideas and know-how were identified. The implementation of the research results will help to solve economic and legal problems of innovation, the transfer of innovations, and intellectual property.
Abstract: When faced with stochastic networks with uncertain
activity durations, securing the network completion time becomes
problematic, not only because of the non-identical probability
density functions of duration at each node, but also because of the
interdependence of network paths. As evidenced by Adlakha &
Kulkarni [1], many methods and algorithms have been put forward
in an attempt to resolve this issue, but most have encountered the
same large-size network problem. Therefore, in this research, we
focus on network reduction through a combined series/parallel
mechanism. Our suggested algorithm, named the Activity Network
Reduction Algorithm (ANRA), can efficiently transform a large-size
network into a Series/Parallel Irreducible Network (SPIN). SPIN can
enhance stochastic network analysis, as well as serve as a judgment
of symmetry in graph theory.
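The series/parallel reduction itself can be sketched as follows. The representation (nested ('S', …)/('P', …) tuples) and the normal duration model are assumptions made for this illustration: series activities add their durations, parallel activities take the maximum, and a Monte Carlo loop estimates the expected completion time of an already-reduced S/P network.

```python
import random

def reduce_sp(node, sample):
    """Completion time of a series/parallel activity network.
    node: ('S', left, right) for series, ('P', left, right) for
    parallel, or an activity name whose duration is in `sample`."""
    if isinstance(node, str):
        return sample[node]
    kind, left, right = node
    a, b = reduce_sp(left, sample), reduce_sp(right, sample)
    return a + b if kind == "S" else max(a, b)

def completion_time_estimate(network, dists, n=10000, seed=0):
    """Monte Carlo estimate of the expected completion time; `dists`
    maps activity -> (mean, sd) of a normal duration (an assumption
    made for the example, not the paper's duration model)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        sample = {a: rng.gauss(m, s) for a, (m, s) in dists.items()}
        total += reduce_sp(network, sample)
    return total / n
```

Repeated series/parallel merging of this kind shrinks the node count without touching path interdependence, which is exactly why an irreducible core (the SPIN) remains once no S or P pattern is left.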
Abstract: In order to study the efficacy of green manure applied
before chickpea planting, field experiments were carried out in the
2007 and 2008 growing seasons. In this research, the effects of
different soil fertilization strategies were investigated on grain
yield and yield components, minerals, organic compounds, and
cooking time of chickpea. Experimental units were arranged in
split-split plots based on randomized complete blocks with three
replications. Main plots consisted of (G1): establishing a mixed
vegetation of Vicia panunica and Hordeum vulgare, and (G2):
control, as green manure levels. Also, five strategies for supplying
the base fertilizer requirement, including (N1): 20 t.ha-1 farmyard
manure; (N2): 10 t.ha-1 compost; (N3): 75 kg.ha-1 triple
superphosphate; (N4): 10 t.ha-1 farmyard manure + 5 t.ha-1 compost;
and (N5): 10 t.ha-1 farmyard manure + 5 t.ha-1 compost + 50 kg.ha-1
triple superphosphate, were considered in sub-plots. Furthermore,
four levels of biofertilizers, consisting of (B1): Bacillus lentus +
Pseudomonas putida; (B2): Trichoderma harzianum; (B3): Bacillus
lentus + Pseudomonas putida + Trichoderma harzianum; and (B4):
control (without biofertilizers), were arranged in sub-sub plots.
Results showed that integrating biofertilizers (B3) and green manure
(G1) produced the highest grain yield. The highest yields were
obtained in the G1×N5 interaction. Comparison of all 2-way and
3-way interactions showed that G1N5B3 was the superior treatment.
The significant increase of N, P2O5, K2O, Fe, and Mg content in
leaves and grains underlined the superiority of this treatment,
because each of these nutrients has an established role in chlorophyll
synthesis and the photosynthetic capability of crops. The combined
application of compost, farmyard manure, and chemical phosphorus
(N5), in addition to giving the highest yield, produced the best grain
quality due to high protein, starch, and total sugar contents, low
crude fiber, and reduced cooking time.
Abstract: Hepatitis B and hepatitis C are among the most
significant hepatic infections worldwide and may lead to
hepatocellular carcinoma. This study was performed for the first
time at the blood transfusion centre of Omar hospital, Lahore. It
aims to determine the sero-prevalence of these diseases by screening
apparently healthy blood donors, who might be carriers of HBV or
HCV and pose a high risk of transmission. It also aims to compare
the sensitivity of two diagnostic tests: a chromatographic
immunoassay (one-step test device) and the enzyme-linked
immunosorbent assay (ELISA). Blood serum of 855 apparently
healthy blood donors was screened for hepatitis B surface antigen
(HBsAg) and for anti-HCV antibodies. SPSS version 12.0 and the
χ2 (chi-square) test were used for statistical analysis. The
sero-prevalence of HCV was 8.07% by the device method and
9.12% by ELISA, and that of HBV was 5.6% by the device and
6.43% by ELISA. The unavailability of vaccination against HCV
makes it more prevalent. Comparing the two diagnostic methods,
ELISA proved to be more sensitive.
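The chi-square comparison reported above can be reproduced in outline. The sketch below computes the Pearson χ2 statistic for a 2×2 positive/negative contingency table; the example counts are illustrative assumptions (roughly the reported HCV percentages applied to 855 donors), not the study's raw data.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table:
    chi2 = n*(a*d - b*c)**2 / ((a+b)*(c+d)*(a+c)*(b+d))."""
    (a, b), (c, d) = table
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Illustrative HCV counts (assumed, rounded from the reported rates):
# device ~8.07% of 855 donors, ELISA ~9.12% of 855 donors.
device = (69, 855 - 69)   # (positive, negative)
elisa = (78, 855 - 78)
```

With 1 degree of freedom, a statistic above the 3.84 critical value (α = 0.05) would mark the difference between the two methods' detection rates as statistically significant.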
Abstract: Hydrogen diffusion is the main problem in corrosion fatigue in a corrosive environment. To analyze the phenomenon, it is necessary to understand the behavior of hydrogen during diffusion. Predicting hydrogen embrittlement, a main corrosive contributor to fracture, requires solving combinations of different equations mathematically. The key to obtaining the governing equation is knowledge of the driving force: the source that causes diffusion and drives atoms into the material. It is produced by a gradient of either electrical or chemical potential, and in this work we consider the gradient of chemical potential to obtain the property equation. During diffusion some atoms may be trapped, but under some conditions trapping can be neglected. Based on the phenomenon of hydrogen embrittlement, the thermodynamic and chemical properties of hydrogen are considered and related to fracture mechanics. It is very important to obtain a stress intensity factor by using fugacity as a property of hydrogen or other gases. Although the diffusive behavior and the embrittlement event are common to other gases as well, for clarity we describe them for hydrogen; focusing on a specific gas helps to better understand the importance of this relation.
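The driving-force argument above can be written out explicitly. As a hedged sketch of the standard relations (not the paper's own derivation), the flux driven by a chemical potential gradient and its link to fugacity are:

```latex
% Flux of hydrogen driven by the chemical potential gradient
J = -\frac{D\,c}{RT}\,\nabla \mu ,
\qquad
\mu = \mu_0 + RT \ln\frac{f}{f_0} .
% For an ideal dilute solution f \propto c, so
% \nabla \mu = RT\,\nabla c / c, and the flux reduces to
% Fick's first law:
J = -D\,\nabla c .
```

Here D is the diffusivity, c the hydrogen concentration, and f the fugacity; the fugacity form is what allows non-ideal (high-pressure) hydrogen to be tied to the stress intensity factor discussed in the abstract.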
Abstract: In this paper, an extreme learning machine with an automatic segmentation algorithm is applied to heart disorder classification from heart sound signals. From continuous heart sound signals, the starting points of the first (S1) and second (S2) heart pulses are extracted and corrected by utilizing an inter-pulse histogram. From the corrected pulse positions, a single period of the heart sound signal is extracted and converted to a feature vector comprising the mel-scaled filter bank energy coefficients and the envelope coefficients of uniform-sized sub-segments. An extreme learning machine is used to classify the feature vector. In our cardiac disorder classification and detection experiments with 9 cardiac disorder categories, the proposed method shows significantly better performance than the multi-layer perceptron, support vector machine, and hidden Markov model, achieving a classification accuracy of 81.6% and a detection accuracy of 96.9%.
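The extreme learning machine itself is simple enough to sketch: input weights are drawn at random and never trained, and only the output weights are fitted by (regularized) least squares. The toy task, network size, and ridge term below are assumptions made for this example; the paper's actual inputs are mel-scaled filter bank and envelope features.

```python
import math
import random

def elm_train(X, y, hidden=20, ridge=1e-6, seed=0):
    """Extreme learning machine: random input weights, sigmoid hidden
    layer, output weights by regularized least squares."""
    rng = random.Random(seed)
    d = len(X[0])
    W = [[rng.uniform(-1, 1) for _ in range(d)] for _ in range(hidden)]
    b = [rng.uniform(-1, 1) for _ in range(hidden)]
    # hidden-layer output matrix H (one row per training sample)
    H = [[1 / (1 + math.exp(-(sum(w[j] * x[j] for j in range(d)) + bi)))
          for w, bi in zip(W, b)] for x in X]
    # solve (H^T H + ridge*I) beta = H^T y by Gaussian elimination
    A = [[sum(H[k][i] * H[k][j] for k in range(len(H)))
          + (ridge if i == j else 0.0) for j in range(hidden)]
         for i in range(hidden)]
    v = [sum(H[k][i] * y[k] for k in range(len(H))) for i in range(hidden)]
    for i in range(hidden):
        p = max(range(i, hidden), key=lambda r: abs(A[r][i]))  # pivot
        A[i], A[p] = A[p], A[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, hidden):
            f = A[r][i] / A[i][i]
            for c in range(i, hidden):
                A[r][c] -= f * A[i][c]
            v[r] -= f * v[i]
    beta = [0.0] * hidden
    for i in reversed(range(hidden)):
        beta[i] = (v[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, hidden))) / A[i][i]
    return W, b, beta

def elm_predict(model, x):
    """Forward pass: sigmoid hidden layer, linear output."""
    W, b, beta = model
    h = [1 / (1 + math.exp(-(sum(w[j] * x[j] for j in range(len(x))) + bi)))
         for w, bi in zip(W, b)]
    return sum(hi * bi for hi, bi in zip(h, beta))
```

Because only a linear system is solved, training is a single pass with no iterative back-propagation, which is the speed advantage ELM has over the multi-layer perceptron it is compared against.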
Abstract: In an emergency, combining wireless sensor network data with knowledge gathered from various other information sources and with navigation algorithms could help guide people safely to a building exit while avoiding risky areas. This paper presents an emergency response and navigation support architecture for data gathering, knowledge manipulation, and navigational support in an emergency situation. In the normal state, the system monitors the environment. When an emergency event is detected, the system sends messages to first responders and immediately distinguishes risky areas from safe areas to establish escape paths. The main functionalities of the system include gathering data from a wireless sensor network deployed in a multi-story indoor environment, processing it with the information available in a knowledge base, and sharing the resulting decisions with first responders and people in the building. The proposed architecture acts to reduce the risk of losing human lives by evacuating people much faster and with the least congestion in an emergency environment.
Abstract: 'New ways of working' refers to non-traditional work practices, settings, and locations supported by information and communication technologies (ICT) that supplement or replace traditional ways of working. The concept questions the contemporary work practices and settings still very much in use in knowledge-intensive organizations today. In this study, new ways of working are seen to consist of two elements: the work environment (including physical, virtual, and social aspects) and work practices. This study aims to gather the scattered information together and deepen the understanding of new ways of working. Moreover, the objective is to provide some evidence of the still unclear productivity impacts of new ways of working using a case study approach.
Abstract: There are many classical algorithms for finding
routing in FPGAs, but using DNA computing the routing problem
can be solved efficiently and quickly. The run-time complexity of
DNA algorithms is much lower than that of the classical algorithms
used for solving FPGA routing. Research in DNA computing is still
at an early stage; the high information density of DNA molecules
and the massive parallelism involved in DNA reactions make DNA
computing a powerful tool. Many research accomplishments have
shown that any procedure that can be programmed on a silicon
computer can be realized as a DNA computing procedure. In this
paper we propose a two-tier approach to the FPGA routing solution.
First, the geometric FPGA detailed routing task is solved by
transforming it into a Boolean satisfiability equation with the
property that any assignment of input variables that satisfies the
equation specifies a valid routing: a satisfying assignment for a
particular route yields a valid routing, and the absence of a satisfying
assignment implies that the layout is unroutable. In the second step,
a DNA search algorithm is applied to this Boolean equation to solve
for routing alternatives, utilizing the properties of DNA computation.
The simulation results are satisfactory and indicate the applicability
of DNA computing to solving the FPGA routing problem.
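The first tier (routing as Boolean satisfiability) can be sketched with a toy instance. The encoding below is an illustrative assumption: one variable per net meaning "net i is assigned to track 1 (true) or track 2 (false)", with a clause pair forcing two nets that share a channel segment onto different tracks; an exhaustive search stands in for the DNA search step of the second tier.

```python
from itertools import product

def sat_solve(num_vars, clauses):
    """Brute-force SAT solver. Clauses are lists of DIMACS-style
    literals: +i for variable i, -i for its negation (1-indexed).
    Returns a satisfying assignment as a tuple of booleans, or None."""
    for bits in product([False, True], repeat=num_vars):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return bits
    return None

# Two nets sharing one channel segment must take different tracks.
# Variable i true <=> net i on track 1; false <=> net i on track 2.
conflict = [[1, 2], [-1, -2]]  # at least one on track 1, but not both
```

Any satisfying assignment specifies a valid routing; adding unit clauses that force both nets onto the same track makes the formula unsatisfiable, mirroring an unroutable layout.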
Abstract: This paper analyses the structural changes in the
education sector since the introduction of the liberalization policy in
India. It explains how the so-called non-profit trusts and societies
appropriated the liberalization policy and established themselves as
a new capitalist class in the higher education sector. Over the
decades, the policy assigned the private sector a role in maintaining
market equilibrium, while the state also witnessed the private
sector's incompatibility with inculcating the values of social justice.
The most important consequence of the policy is the rise of a new
capitalist class and of academic capitalism. When the state realized
that it could no longer cope with market demands, it opened higher
education to the private sector. Concessions and tax exemptions
were provided to trusts and societies to establish higher education
institutions. There is a basic difference between Western countries
and India in the provision of higher education by trusts and
societies: in Western countries, big business houses contributed
their surplus revenues to promote higher education and research as
a complementary service to society and nation, whereas in India
several entrepreneurs entered the education sector with a business
motive. Over the period, they accumulated wealth at the cost of
students and of concessions from the government. Four major
results can now be identified: production of manpower in view of
market demands; reduction of standards in higher education;
bypassing of the values of social justice; and the rise of a new
capitalist class from the business of education. This paper tries to
substantiate these issues with inputs from case studies.
Abstract: Signature amortization schemes have been introduced
for authenticating multicast streams, in which a single signature is
amortized over several packets. The hash value of each packet is
computed, and some hash values are appended to other packets,
forming what is known as a hash chain. These schemes divide the
stream into blocks, each block being a number of packets; the
signature packet in these schemes is either the first or the last packet
of the block. Amortization schemes are efficient solutions in terms
of computation and communication overhead, especially in real-time
environments. The main effective factor of an amortization scheme
is its hash chain construction. Some studies show that signing the
first packet of each block reduces the receiver's delay and prevents
DoS attacks; other studies show that signing the last packet reduces
the sender's delay. To our knowledge, there are no studies that show
which is better, signing the first or the last packet, in terms of
authentication probability and resistance to packet loss.
In this paper we introduce another scheme for authenticating
multicast streams that is robust against packet loss, reduces the
overhead, and at the same time prevents the DoS attacks experienced
by the receiver. Our scheme, the Multiple Connected Chain signing
the First packet (MCF), appends the hash values of specific packets
to other packets, then appends some hashes to the signature packet,
which is sent as the first packet in the block. This scheme is
especially efficient in terms of receiver's delay. We discuss and
evaluate the performance of our proposed scheme against schemes
that sign the last packet of the block.
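The chain construction signed at the first packet can be sketched as follows. This is a generic single hash chain in which each packet hash folds in the hash of its successor, so one signature over the head authenticates the whole block; an HMAC stands in for the public-key signature, and the single-chain layout is an assumption made for the example, not the multiple-chain layout of MCF.

```python
import hashlib
import hmac

def build_chain(packets):
    """h_i = H(P_i || h_{i+1}): each hash folds in the next one, so
    the head hash h_1 commits to every packet in the block."""
    h = b""
    heads = []
    for p in reversed(packets):
        h = hashlib.sha256(p + h).digest()
        heads.append(h)
    return list(reversed(heads))

def sign_block(packets, key):
    # HMAC stands in for a public-key signature (an assumption here)
    return hmac.new(key, build_chain(packets)[0], "sha256").digest()

def verify_stream(signature, packets, key):
    """Receiver recomputes the chain over the received block and
    checks the single signature carried by the first packet."""
    expected = hmac.new(key, build_chain(packets)[0], "sha256").digest()
    return hmac.compare_digest(signature, expected)
```

One signature operation thus covers the whole block, and because the signature travels in the first packet the receiver can begin verification as soon as the block starts arriving, which is the receiver-delay advantage the abstract describes.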
Abstract: The goal of this paper is to examine the effects of laser
radiation on skin wound healing, using infrared thermography as a
non-invasive method for monitoring skin temperature changes
during laser treatment. Thirty Wistar rats were used in this study. A
skin lesion was made on the leg of each rat. The animals were
exposed to laser radiation (λ = 670 nm, P = 15 mW, DP = 16.31
mW/cm2) for 600 s. Thermal images of the wound were acquired
before and after laser irradiation. The results demonstrated that the
tissue temperature decreases from 35.5±0.50°C on the first treatment
day to 31.3±0.42°C after the third treatment day. This value is close
to the normal skin temperature and indicates the end of the skin
repair process. In conclusion, the improvements in wound healing
following exposure to laser radiation have been revealed by infrared
thermography.