Abstract: We present in this paper a new approach to specific JPEG steganalysis, based on studying statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve statistics of the DCT and of the spatial domain, but they cannot preserve both while also controlling the alteration of the compressed data. We have observed a deviation of the entropy of the compressed data after a first embedding; this deviation is greater when the image is a cover medium than when it is a stego image. To capture this deviation, we introduce new statistical features and combine them with the Multiple Embedding Method. This approach is motivated by the avalanche criterion of the JPEG lossless compression step, which makes it possible to design detectors whose detection rates are independent of the payload. Finally, we designed a Fisher discriminant based classifier for the well-known steganographic algorithms Outguess, F5 and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work with low embedding rates (< 10^-5) and, owing to the avalanche criterion of the RLE and Huffman compression step, its efficiency is independent of the quantity of hidden information.
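A two-class Fisher linear discriminant of the kind this abstract trains can be sketched in a few lines. The sketch below uses only the standard library; the 2-D feature vectors are hypothetical stand-ins for the entropy-deviation statistics described above, not data from the paper.

```python
# Minimal two-class Fisher linear discriminant (stdlib only).
# Feature vectors are hypothetical (entropy deviation, auxiliary statistic).

def mean2(xs):
    n = len(xs)
    return (sum(x for x, _ in xs) / n, sum(y for _, y in xs) / n)

def scatter2(xs, m):
    # Within-class scatter matrix (2x2) for one class.
    sxx = sum((x - m[0]) ** 2 for x, y in xs)
    sxy = sum((x - m[0]) * (y - m[1]) for x, y in xs)
    syy = sum((y - m[1]) ** 2 for x, y in xs)
    return [[sxx, sxy], [sxy, syy]]

def fisher_direction(class_a, class_b):
    ma, mb = mean2(class_a), mean2(class_b)
    sa, sb = scatter2(class_a, ma), scatter2(class_b, mb)
    sw = [[sa[0][0] + sb[0][0], sa[0][1] + sb[0][1]],
          [sa[1][0] + sb[1][0], sa[1][1] + sb[1][1]]]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    dm = (ma[0] - mb[0], ma[1] - mb[1])
    # w = Sw^-1 (ma - mb), via the explicit 2x2 inverse.
    w = ((sw[1][1] * dm[0] - sw[0][1] * dm[1]) / det,
         (-sw[1][0] * dm[0] + sw[0][0] * dm[1]) / det)
    return w, ma, mb

cover = [(0.9, 0.8), (1.1, 1.0), (1.0, 1.2), (1.2, 0.9)]
stego = [(0.2, 0.3), (0.1, 0.1), (0.3, 0.2), (0.2, 0.4)]

w, ma, mb = fisher_direction(cover, stego)
# Decision threshold at the midpoint of the projected class means.
threshold = (w[0] * (ma[0] + mb[0]) + w[1] * (ma[1] + mb[1])) / 2

def classify(v):
    return "cover" if w[0] * v[0] + w[1] * v[1] > threshold else "stego"

print(classify((1.0, 1.1)))   # projects onto the cover side
print(classify((0.15, 0.25)))
```

The midpoint threshold is one simple choice; a trained detector would typically tune it for a desired false-alarm rate.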
Abstract: This study investigates gender differences in spatial navigation using a 2-D matrix navigation task and a recognition task based on real driving scenes. The results can be summarized as follows. First, female subjects responded faster than male subjects in the 2-D matrix navigation task when landmark instructions were provided. Second, in the recognition task, male subjects recognized the key elements of the previously viewed driving scene more accurately than female subjects; in particular, female subjects tended to miss peripheral information. These results suggest the possibility of gender differences in spatial navigation.
Abstract: The increased use of biodiesel implies changes in both greenhouse gas and air pollutant emissions. Some studies point out that blending biodiesel into diesel can help control air pollution and reduce CO2 emissions. Reductions in PM, SO2, VOC and CO emissions are also expected; however, NOx emissions may increase, which may promote O3 formation. This work aims to assess the impact of biodiesel use on air quality through a numerical modeling study, taking the Northern region of Portugal as a case study. The emission scenarios focus on 2008 (baseline year) and 2020 (target year of the Renewable Energy Directive, RED) and on three biodiesel blends (B0, B10 and B20). In general, the use of biodiesel by 2020 will reduce CO2 and air pollutant emissions in Northern Portugal and improve air quality, although only to a very small extent.
Abstract: In this paper, a semi-fragile watermarking scheme is proposed for color image authentication. In this scheme, the color image is first transformed from the RGB to the YST color space, which is suitable for watermarking color media. Each channel is divided into 4×4 non-overlapping blocks, and each of their 2×2 sub-blocks is selected. The embedding space, which will hold the authentication and recovery information, is created by setting the two LSBs of each selected sub-block to zero. For authentication, verification and parity bits, denoted by 'a' and 'p', are computed for each 2×2 sub-block. For recovery, the intensity mean of each 2×2 sub-block is computed and encoded with six to eight bits, depending on the channel selected. The size of the sub-block is important for correct localization and fast computation. For watermark distribution, a 2D Torus Automorphism is implemented using a private key to obtain a secure mapping of blocks. The perceptibility of the watermarked image is quite reasonable, both subjectively and objectively. Our scheme is oblivious, correctly localizes tampering, and is able to recover the original work with probability near one.
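The block scattering step can be illustrated with a small sketch of a 2D Torus Automorphism. One common matrix form with determinant 1 is (x', y') = (x + y, k·x + (k+1)·y) mod N, which makes the map a bijection on the N×N block grid; the grid size and key value below are illustrative assumptions, not the paper's parameters.

```python
# Sketch of a 2D Torus Automorphism for scattering block indices.
# Matrix [[1, 1], [k, k+1]] has determinant 1, so the map is a
# permutation of the N x N grid of block coordinates.

def torus_map(x, y, k, n):
    return (x + y) % n, (k * x + (k + 1) * y) % n

N = 8   # e.g. a 32x32-pixel channel -> an 8x8 grid of 4x4 blocks
K = 3   # hypothetical private key

mapping = {(x, y): torus_map(x, y, K, N)
           for x in range(N) for y in range(N)}
# Bijectivity check: every destination block occurs exactly once.
print(len(set(mapping.values())) == N * N)
```

Because the map is a permutation, recovery data for one block can be stored in a different, key-dependent block and still be located exactly during verification.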
Abstract: This paper looks into areas not covered by prominent Agent-Oriented Software Engineering (AOSE) methodologies. An extensive literature review led to the identification of two issues. First, most of these methodologies largely neglect the semantic web and ontologies. Second, as expected, each one has its strengths and weaknesses and may focus on some phases of the development lifecycle but not all of them. The work presented here builds extensions to a highly regarded AOSE methodology (MaSE) in order to cover the areas on which this methodology does not concentrate. The extensions include introducing an ontology stage for semantic representation and integrating early requirement specification from a methodology which focuses mainly on that phase. The integration involved developing transformation rules (with the necessary handling of non-matching notions) between the two sets of representations and building the software which automates the transformation. The application of this integration to a case study is also presented in the paper. The main flow of MaSE stages was changed to smoothly accommodate the new additions.
Abstract: In this paper, we introduce a novel algorithm for object tracking in video sequences. To represent the object to be tracked, we propose a spatial color histogram model which encodes both the color distribution and spatial information. Object tracking from frame to frame is accomplished via center voting and a back projection method. The center voting method has every pixel in the new frame cast a vote on the whereabouts of the object center. The back projection method segments the object from the background. The segmented foreground provides information on object size and orientation, removing the need to estimate them separately. We make no assumptions about camera motion; the proposed algorithm works equally well for object tracking in both static and moving camera videos.
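The center-voting idea can be sketched compactly: during modeling, each object pixel stores its offset from the object center, indexed by color bin; in a new frame, every pixel whose color falls in a known bin votes for candidate center positions. The toy "pixels", color-bin labels, and coordinates below are illustrative assumptions, not the paper's spatial color histogram.

```python
# Minimal sketch of center voting for object tracking (stdlib only).
from collections import defaultdict, Counter

def build_model(pixels, center):
    # pixels: {(x, y): color_bin}; store center-relative offsets per bin.
    model = defaultdict(list)
    for (x, y), b in pixels.items():
        model[b].append((center[0] - x, center[1] - y))
    return model

def vote_center(frame_pixels, model):
    # Each pixel votes for every center location its color bin supports.
    votes = Counter()
    for (x, y), b in frame_pixels.items():
        for dx, dy in model.get(b, []):
            votes[(x + dx, y + dy)] += 1
    return votes.most_common(1)[0][0]

# Object modeled around center (2, 2); in the new frame it sits at (5, 4).
obj = {(1, 1): "red", (2, 2): "red", (3, 3): "blue"}
model = build_model(obj, (2, 2))
new_frame = {(4, 3): "red", (5, 4): "red", (6, 5): "blue"}
print(vote_center(new_frame, model))  # (5, 4)
```

In a real tracker the votes would be accumulated over quantized color bins of the spatial histogram and smoothed before taking the peak.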
Abstract: The Pabdeh shaly formation (Paleocene-Oligomiocene) extends across Fars, Khuzestan and Lorestan. The lower lithostratigraphic limit of this formation in the Shiraz area is distinguished from the Gurpi formation by a purple shale. Its upper limit is gradational and conformable with the Asmari formation. In order to study the sequence stratigraphy and microfacies of the Pabdeh formation in the Shiraz area, one stratigraphic section was chosen (the Zanjiran section). Petrographic studies resulted in the identification of 9 pelagic and calciturbidite microfacies. The calciturbidite microfacies formed when the sea level was high and the rate of carbonate deposition was high, and the deposits slumped into the deep marine environment. Sequence stratigraphy studies show that the Pabdeh formation in the studied zone consists of two depositional sequences (DS) whose lower contact is erosional (purple shale; type one, SB1, or type two, SB2) and whose upper contact is a correlative conformity (type two, SB2).
Abstract: A power transformer consists of components which are under constant thermal and electrical stresses. The major component which degrades under these stresses is the paper insulation of the power transformer. On site, lightning impulses and cable faults may cause winding deformation. In addition, the winding may deform due to impacts during transportation. A deformed winding imposes additional stress on its insulating paper and thus degrades it. Insulation degradation shortens the life-span of the transformer. Currently there are two methods of detecting winding deformation: Sweep Frequency Response Analysis (SFRA) and the Low Voltage Impulse (LVI) test. The latter injects current pulses into the winding and captures the admittance plot. In this paper, a transformer which experienced overheating and arcing was identified, and both SFRA and LVI were performed. Next, the transformer was brought to the factory for untanking. The untanking results revealed that LVI is more accurate than SFRA for this case study.
Abstract: The cDNA encoding the 326 amino acids of a Class I basic chitinase gene from Leucaena leucocephala de Wit (KB3, GenBank accession AAM49597) was cloned under the control of the CaMV35S promoter in pCAMBIA 1300 and transferred to Koshihikari rice. Calli of Koshihikari were transformed with Agrobacterium carrying this construct, which expresses the chitinase and β-glucuronidase (GUS). A callus induction frequency of 90% was obtained from rice seedlings cultured on NB medium. A high regeneration frequency of 74% was obtained from calli cultured on regeneration medium containing 4 mg/l BAP and 7 g/l phytagel at 25°C. Various factors were studied in order to establish a procedure for the transformation of Koshihikari by Agrobacterium tumefaciens. Supplementation of the medium with 50 mM acetosyringone during cocultivation was important to enhance the frequency of transient transformation. Four-week-old scutellum-derived calli were excellent starting materials. A selection medium based on NB medium supplemented with 40 mg/l hygromycin and 400 mg/l cefotaxime was optimal for selecting transformed rice calli. A transformation rate of 70% was obtained. Recombinant calli and regenerated rice plants were checked for the expression of chitinase and gus by PCR, northern blot, southern blot, and GUS assay. Chitinase and gus were expressed in all parts of the recombinant rice. The rice line expressing the KB3 chitinase was more resistant to the blast fungus Fusarium moniliforme than the control line.
Abstract: The development of the Web has affected different aspects of our lives, such as communication, sharing knowledge, searching for jobs, and social activities. A web portal, as a gateway to the World Wide Web, is a starting point for people who are connecting to the Internet. As a type of knowledge management system, a web portal provides a rich space to share and search information as well as communication services, such as free email or content provision, for its users. This research aims to discover the university's needs for a web portal as a necessary tool that helps students obtain the information they require. A survey was conducted to gather students' requirements, which can be incorporated into the portal to be developed.
Abstract: While import-substituting industrialization policy constituted the basis of Turkey's industrialization strategies of the 1960s and 1970s, this policy was no longer sustainable by the 1980s. For this reason, an export-oriented industrialization policy was adopted with the decisions taken on January 24, 1980. In other words, in the post-1980 period Turkey's economy adopted an outward-oriented industrialization strategy.
This study aims to analyze the effect of the change in economic structure on foreign trade, following the transformation of foreign trade and industrialization policies in the post-1980 period. To this end, a Chow test was applied to variables for imports, exports and economic growth over the 1960-2011 period. The Chow test was used to determine whether there is any economic difference between the import-substituting industrialization policy applied in the 1960-1980 period and the export-oriented industrialization policy applied in the 1981-2011 period as a result of the structural transformation.
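The Chow test compares the residual sum of squares of a pooled regression with that of two regressions fitted on either side of the candidate break. A stdlib-only sketch with a single regressor plus intercept (k = 2 parameters) is given below; the data are synthetic stand-ins, not the 1960-2011 Turkish trade series.

```python
# Sketch of the Chow test for a structural break (one regressor + intercept).

def ols_rss(xs, ys):
    # Ordinary least squares fit; return the residual sum of squares.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def chow_f(x1, y1, x2, y2, k=2):
    # F = ((RSS_pooled - RSS_split) / k) / (RSS_split / (n - 2k))
    rss_pooled = ols_rss(x1 + x2, y1 + y2)
    rss_split = ols_rss(x1, y1) + ols_rss(x2, y2)
    n = len(x1) + len(x2)
    return ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))

# Synthetic regimes with different slopes (e.g. pre- and post-break).
x1 = [1, 2, 3, 4, 5]
y1 = [1.1, 2.0, 2.9, 4.2, 5.0]       # slope ~ 1
x2 = [6, 7, 8, 9, 10]
y2 = [12.1, 14.0, 15.8, 18.2, 20.0]  # slope ~ 2
print(chow_f(x1, y1, x2, y2))  # a large F value suggests a structural break
```

The statistic is compared against an F(k, n − 2k) critical value; in practice a statistics package would supply the p-value.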
Abstract: A neuron can emit spikes on an irregular time basis, and averaging over a certain time window would discard a lot of information. It is known that, in the context of fast information processing, there is not sufficient time to sample an average firing rate of the spiking neurons. The present work shows that spiking neurons are capable of computing radial basis functions by storing the relevant information in the neurons' delays. One of the fundamental findings of this research is that using overlapping receptive fields to encode the data patterns increases the network's clustering capacity. The clustering algorithm discussed here is interesting from both a computer science and a neuroscience point of view.
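The delay-coding idea can be illustrated with a small sketch: an input vector is encoded as spike times, each synapse adds a stored delay that plays the role of a center coordinate, and the output responds most strongly when the delayed spikes coincide. The coincidence measure and width constant below are illustrative assumptions, not the paper's neuron model.

```python
# Sketch: a radial basis function computed from spike-time coincidence.
import math

def rbf_via_delays(input_vec, delays, width=1.0):
    # Spike time of input i is proportional to its value; adding the
    # synaptic delay aligns spikes exactly when the input matches the
    # stored center.
    arrival = [x + d for x, d in zip(input_vec, delays)]
    mean_t = sum(arrival) / len(arrival)
    # Spread of arrival times -> Gaussian-shaped response.
    spread = sum((t - mean_t) ** 2 for t in arrival)
    return math.exp(-spread / (2 * width ** 2))

center = [3.0, 1.0, 2.0]
delays = [max(center) - c for c in center]  # delays complement the center

print(rbf_via_delays([3.0, 1.0, 2.0], delays))       # coincident -> 1.0
print(rbf_via_delays([0.0, 0.0, 0.0], delays) < 1.0)  # mismatched -> < 1.0
```

The response peaks at 1.0 exactly when the input equals the stored center and decays as the input moves away, which is the defining property of an RBF unit.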
Abstract: The purposes of this study were to evaluate the economic value of Phu Kradueng National Park by the travel cost method (TCM) and the contingent valuation method (CVM), and to estimate the demand for traveling and the willingness to pay. The data for this study were collected by conducting two large-scale surveys of users and non-users; a total of 1,016 users and 1,034 non-users were interviewed. The data were analyzed using multiple linear regression analysis and a logistic regression model, and the consumer surplus (CS) was computed as the integral of the demand function for trips. The findings were as follows:
1) Using the travel cost method, which provides an estimate of direct benefits to park users, we found that visitors' total willingness to pay per visit was 2,284.57 baht, of which 958.29 baht was travel cost, 1,129.82 baht was expenditure on accommodation, food, and services, and 166.66 baht was consumer surplus, i.e. the visitors' net gain or satisfaction from the visit (the integral of the demand function for trips).
2) Thai visitors to Phu Kradueng National Park were further willing to pay an average of 646.84 baht per head per year to ensure the continued existence of the park and to preserve their option to use it in the future.
3) Thai non-visitors, on the other hand, are willing to pay an average of 212.61 baht per head per year for the option and existence value provided by the park.
4) The total economic value of Phu Kradueng National Park to Thai visitors and non-visitors taken together stands today at 9,249.55 million baht per year.
5) The users' average willingness to pay for access to Phu Kradueng National Park rises from 40 baht to 84.66 baht per head per trip for improved services such as road improvement, increased cleanliness, and upgraded information.
Further investigation is needed into the potential market demand for bioprospecting in Phu Kradueng National Park and into how a larger share of the economic benefits of tourism could be distributed to local residents.
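The consumer-surplus computation the abstract describes (CS as the integral of the trip demand function above the price actually paid) can be sketched numerically. The linear demand function and prices below are illustrative assumptions, not the study's estimated demand curve.

```python
# Sketch: consumer surplus as the integral of a trip demand function
# from the current price up to the choke price (demand = 0).

def consumer_surplus(demand, price, price_max, steps=10000):
    # Midpoint-rule numerical integration of trips demanded over price.
    h = (price_max - price) / steps
    total = 0.0
    for i in range(steps):
        p = price + (i + 0.5) * h
        total += demand(p) * h
    return total

# Hypothetical linear trip demand: trips(p) = 5 - 0.002 p, choke at 2500.
demand = lambda p: max(0.0, 5 - 0.002 * p)
print(round(consumer_surplus(demand, 2000.0, 2500.0), 2))  # 250.0
```

With an estimated demand curve from the travel-cost regression, the same integral would yield the per-visit surplus (166.66 baht in the study).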
Abstract: This study examines the influence of information
transparency and corporate governance on decisions to purchase
directors' and officers' liability (D&O) insurance. The results show that
companies with greater information transparency have significant
demand for D&O insurance. Greater transparency in voluntary
disclosures is significantly and positively associated with demand for
insurance, indicating that increasing the degree of information
disclosure reduces information asymmetry for insurers, which
stimulates their willingness to provide greater protection.
Analysis of insured and uninsured subsamples indicates that
uninsured companies have superior corporate governance compared to
insured companies. Although insured companies tend to have weaker
corporate governance structures, they appoint Big 4 firms or industry
experts to compensate for the weakness of their corporate governance.
Empirical results indicate that purchasing D&O insurance can
strengthen external corporate governance and increase companies’
willingness to voluntarily provide more transparent information.
Abstract: As privacy becomes a major concern for consumers and enterprises, much research has focused on privacy-protecting technology in recent years. In this paper, we present a comprehensive approach to usage access control based on the notion of purpose. In our model, purpose information associated with a given data element specifies the intended use of the subjects and objects in the usage access control model. A key feature of our model is that, when an access is requested, the access purpose is checked against the intended purposes for the data item. We propose an approach to representing purpose information to support access control based on it. Our proposed solution relies on usage access control (UAC) models as well as on components based on the notions of purpose information used in subjects and objects. Finally, comparisons with related work are presented.
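The core purpose-compliance check can be sketched as follows: access is allowed only if the stated access purpose is among, or implied by, the intended purposes attached to the data item. The purpose hierarchy and labels below are hypothetical examples, not the paper's model.

```python
# Sketch of a purpose-compliance check for purpose-based access control.

# Hypothetical purpose hierarchy: child purpose -> parent purpose.
PURPOSE_TREE = {
    "direct-marketing": "marketing",
    "third-party-marketing": "marketing",
    "marketing": "general",
    "billing": "admin",
    "admin": "general",
}

def implied_purposes(purpose):
    # A purpose implies itself and all of its ancestors in the hierarchy.
    out = {purpose}
    while purpose in PURPOSE_TREE:
        purpose = PURPOSE_TREE[purpose]
        out.add(purpose)
    return out

def access_allowed(access_purpose, intended_purposes):
    # Allow only when the access purpose matches some intended purpose.
    return bool(implied_purposes(access_purpose) & set(intended_purposes))

print(access_allowed("direct-marketing", {"marketing"}))  # True
print(access_allowed("billing", {"marketing"}))           # False
```

A full UAC model would additionally attach purposes to subjects and enforce obligations and conditions at usage time; this sketch covers only the intended-purpose comparison.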
Abstract: In this paper, a new algorithm for codebook generation is proposed for vector quantization (VQ) in image coding. The significant features of the training image vectors are extracted using the proposed Orthogonal Polynomials based transformation. We propose to generate the codebook by partitioning these feature vectors into a binary tree. Each feature vector at a non-terminal node of the binary tree is directed to one of the two descendants by comparing a single feature associated with that node to a threshold. The binary tree codebook is used for encoding and decoding the feature vectors. In the decoding process, the feature vectors are subjected to the inverse transformation, with the help of the basis functions of the proposed Orthogonal Polynomials based transformation, to recover the approximated input image training vectors. The results of the proposed coding are compared with VQ using the Discrete Cosine Transform (DCT) and the Pairwise Nearest Neighbor (PNN) algorithm. The new algorithm yields a considerable reduction in computation time and provides better reconstructed picture quality.
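Encoding with such a binary-tree codebook is cheap because each non-terminal node compares just one feature component with a threshold. The tree below is a hand-built toy to show the traversal, not one trained by the paper's partitioning algorithm.

```python
# Sketch: encoding a feature vector with a binary-tree codebook.

class Node:
    def __init__(self, feature=None, threshold=None,
                 left=None, right=None, codeword=None):
        # Non-terminal nodes carry (feature, threshold); leaves a codeword.
        self.feature, self.threshold = feature, threshold
        self.left, self.right, self.codeword = left, right, codeword

def encode(vec, node):
    # Walk from the root to a leaf: one scalar comparison per level.
    while node.codeword is None:
        node = node.left if vec[node.feature] <= node.threshold else node.right
    return node.codeword

root = Node(feature=0, threshold=0.5,
            left=Node(feature=1, threshold=0.3,
                      left=Node(codeword=0), right=Node(codeword=1)),
            right=Node(codeword=2))

print(encode([0.2, 0.1], root))  # 0
print(encode([0.2, 0.9], root))  # 1
print(encode([0.8, 0.9], root))  # 2
```

For a balanced tree with M codewords this costs O(log M) comparisons per vector, versus O(M) distance computations for a flat codebook search, which is where the computation-time savings come from.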
Abstract: Understanding how and where NOx formation occurs in industrial burners is very important for the efficient and clean operation of utility burners. The importance of this problem also stems from its relation to the pollutants produced by burners widely used in gas turbines in thermal power plants and in the glass and steel industries.
In this article, a numerical model of an industrial burner operating in MILD combustion is validated with experimental data. The influence of air flow rate and air temperature on combustor temperature profiles and NOx production is then investigated. To extend the study, the effects of fuel and air dilution (with the inert gases H2O, CO2 and N2), as well as the influence of lean premixing of the fuel, on the temperature profiles and NOx emission are also reported.
Conservation equations of mass, momentum and energy, and transport equations for species concentrations, together with turbulence, combustion and radiation models and NO modeling equations, were solved to obtain the temperature and NO distributions inside the burner.
The results show that dilution reduces both the temperature and the NOx emission, suppresses flame propagation inside the furnace, and makes the flame inside the furnace invisible. Dilution with H2O decreases the NOx value further than dilution with N2 or CO2. Also, as the lean-premix level is raised, the local temperature of the burner and the amount of NOx produced decrease, because premixing prevents local "hot spots" within the combustor volume that can lead to significant NOx formation. In addition, lean premixing of the fuel with air supplies more air to the reaction zone than is actually needed to burn the fuel, which limits NOx formation.
Abstract: Medical studies often require different methods for parameter selection as a second step of processing, after the database's design and filling with information. One common task is the selection of fields that act as risk factors using well-known methods, in order to find the most relevant risk factors and to establish a possible hierarchy between them. Different methods are available for this purpose, one of the best known being binary logistic regression. We present the mathematical principles of this method and a practical example of using it in the analysis of the influence of 10 different psychiatric diagnoses on 4 different types of offences (in a database of 289 psychiatric patients involved in different types of offences). Finally, we make some observations about the relation between the risk factor hierarchy established through binary logistic regression and the individual risks, as well as the results of the Chi-squared test. We show that the hierarchy built using binary logistic regression does not agree with the direct ordering of risk factors, even though it would be natural to assume this hypothesis is always true.
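A minimal fit of a binary logistic regression can be sketched with stdlib-only gradient ascent on the log-likelihood; exp(b) then gives the odds ratio associated with a risk factor. The toy data below (one binary risk factor, one binary outcome) are illustrative, not the 289-patient database.

```python
# Sketch: binary logistic regression fitted by per-sample gradient ascent.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.05, epochs=5000):
    a, b = 0.0, 0.0  # intercept and risk-factor coefficient
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(a + b * x)
            a += lr * (y - p)       # gradient of log-likelihood w.r.t. a
            b += lr * (y - p) * x   # gradient of log-likelihood w.r.t. b
    return a, b

# Hypothetical 2x2 data: exposed subjects (x=1) offend more often
# (7/10 outcomes) than unexposed (2/10), so the odds ratio is ~9.3.
xs = [1] * 10 + [0] * 10
ys = [1] * 7 + [0] * 3 + [1] * 2 + [0] * 8
a, b = fit_logistic(xs, ys)
print(math.exp(b))  # estimated odds ratio for the risk factor
```

With several diagnoses as predictors, comparing the fitted coefficients (or their odds ratios) is exactly what produces the risk-factor hierarchy the abstract discusses.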
Abstract: Thailand's health system is challenged by the rising number of patients and a decreasing ratio of medical practitioners to patients, especially in rural areas. This may tempt inexperienced GPs to rush through the process of anamnesis, with the risk of incorrect diagnosis. Patients have to travel far to the hospital and wait for a long time to present their case. Many patients try to cure themselves with traditional Thai medicine. Many countries are making use of the Internet for medical information gathering, distribution and storage. Telemedicine applications are a relatively new field of study in Thailand; the ICT infrastructure had hampered widespread use of the Internet for medical information. With recent improvements, health and technology professionals can work out novel applications and systems to help advance telemedicine for the benefit of the people. Here we explore the use of telemedicine for people with health problems in rural areas of Thailand and present a Telemedicine Diagnosis System for Rural Thailand (TEDIST) for diagnosing certain conditions, which people with Internet access can use to establish contact with Community Health Centers, e.g. by mobile phone. The system uses a Web-based input method for individual patients' symptoms, which are passed to an expert system for the analysis of conditions and appropriate diseases. The analysis harnesses a knowledge base and a backward chaining component to find out which health professionals should be presented with the case. Doctors have the opportunity to exchange emails or chat with the patients they are responsible for, or with other specialists. Patients' data are then stored in a Personal Health Record.
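The backward-chaining step can be sketched in a few lines: starting from a goal (a candidate condition or referral), the engine works backwards through rules, trying to prove each premise from the patient's reported symptoms. The rules and symptom names below are hypothetical examples, not TEDIST's knowledge base.

```python
# Sketch: a tiny backward-chaining inference engine.

RULES = {
    # conclusion: list of alternative premise sets that would prove it
    "dengue_suspected": [{"fever", "rash", "joint_pain"}],
    "fever": [{"temp_above_38"}],
    "refer_to_doctor": [{"dengue_suspected"}],
}

def prove(goal, facts, rules=RULES):
    # A goal holds if it is a reported fact, or if every premise of
    # some rule concluding it can itself be proven (recursively).
    if goal in facts:
        return True
    for premises in rules.get(goal, []):
        if all(prove(p, facts, rules) for p in premises):
            return True
    return False

reported = {"temp_above_38", "rash", "joint_pain"}
print(prove("refer_to_doctor", reported))  # True
print(prove("refer_to_doctor", {"rash"}))  # False
```

Starting from the goal and chaining backwards means the engine only asks about (or looks up) symptoms that are actually relevant to the hypotheses under consideration, which suits a symptom-entry web form.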
Abstract: This paper presents a method to detect multiple cracks
based on frequency information. When a structure is subjected to
dynamic or static loads, cracks may develop and the modal
frequencies of the cracked structure may change. To detect cracks in a
structure, we construct a high-precision wavelet finite element (FE)
model of a certain structure using the B-spline wavelet on the interval
(BSWI). Cracks can be modeled by rotational springs and added to the
FE model. The crack detection database will be obtained by solving
that model. Then the crack locations and depths can be determined
based on the frequency information from the database. The
performance of the proposed method has been numerically verified by
a rotor example.