Abstract: This study proposes three methods to evaluate the Tokyo
Cap and Trade Program when emissions trading is performed
virtually among enterprises, focusing on carbon dioxide (CO2), the
only emitted greenhouse gas that tends to increase.
The first method clarifies the optimum reduction rate for the highest
cost benefit, the second discusses emissions trading among enterprises
through market trading, and the third verifies long-term emissions
trading during the term of the plan (2010-2019), checking its validity
partly using Geographic Information Systems (GIS). The findings of
this study can be summarized in the following three points.
1. Since the total cost benefit is greatest at a 44% reduction rate, the
rate can be set higher than that of the Tokyo Cap and Trade
Program to obtain a greater total cost benefit.
2. At a 44% reduction rate, among 320 enterprises, 8 purchasing
enterprises and 245 selling enterprises gain profits from emissions
trading, and 67 enterprises perform voluntary reduction without
conducting emissions trading. Therefore, to further promote
emissions trading, it is necessary to increase the trading volume by
increasing the number of purchasing enterprises in addition to the
number of selling enterprises.
3. Compared to short-term emissions trading, few enterprises benefit
each year under the long-term emissions trading of the Tokyo Cap
and Trade Program; at most 81 enterprises gain profits from
emissions trading in FY 2019. Therefore, it is necessary to set the
reduction rate higher to increase the number of enterprises that
participate in emissions trading and benefit from the restraint of
CO2 emissions.
Abstract: Groups in which the discrete logarithm problem (DLP) is believed to be intractable have proved to be invaluable building blocks for cryptographic applications. They are at the heart of numerous protocols such as key agreements, public-key cryptosystems, digital signatures, identification schemes, publicly verifiable secret sharings, hash functions and bit commitments. The search for new groups with intractable DLP is therefore of great importance. The goal of this article is to study elliptic curves over the ring Fq[ε], with Fq a finite field of order q and with the relation ε^n = 0, n ≥ 3. The motivation for this work came from the observation of several practical discrete logarithm-based cryptosystems, such as ElGamal and elliptic curve cryptosystems. We first describe these curves defined over a ring. Then, we study their algorithmic properties by proposing effective implementations for representing the elements and the group law. In another article we study their cryptographic properties, an attack on the elliptic discrete logarithm problem, and a new cryptosystem over these curves.
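Concretely, elements of Fq[ε] with ε^n = 0 can be represented as polynomials in ε truncated at degree n. A minimal Python sketch of this ring arithmetic, assuming for simplicity that q is a prime p (the class name and demo values are illustrative, not from the article):

```python
class TruncatedPoly:
    """Element of F_p[eps]/(eps^n): coefficients [a0, ..., a_{n-1}] mod p."""

    def __init__(self, coeffs, p, n):
        self.p, self.n = p, n
        c = [x % p for x in coeffs][:n]
        self.c = c + [0] * (n - len(c))

    def __add__(self, other):
        return TruncatedPoly([a + b for a, b in zip(self.c, other.c)],
                             self.p, self.n)

    def __mul__(self, other):
        prod = [0] * self.n
        for i, a in enumerate(self.c):
            for j, b in enumerate(other.c):
                if i + j < self.n:          # terms of degree >= n vanish
                    prod[i + j] += a * b
        return TruncatedPoly(prod, self.p, self.n)

# eps itself is nilpotent: with n = 3, eps^3 = 0 while eps^2 != 0.
p, n = 7, 3
eps = TruncatedPoly([0, 1], p, n)
print((eps * eps).c)        # [0, 0, 1]
print((eps * eps * eps).c)  # [0, 0, 0]
```

The nilpotent element ε is what distinguishes this ring from a field and drives the curve-theoretic questions studied in the article.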
Abstract: Performance of millimeter-wave (mm-wave) multiband
orthogonal frequency division multiplexing (MB-OFDM) ultrawideband
(UWB) signal generation using frequency quadrupling
technique and transmission over fiber is experimentally investigated.
The frequency quadrupling is achieved by using only one Mach-
Zehnder modulator (MZM) that is biased at the maximum transmission
(MATB) point. At the output, a frequency-quadrupled signal is
obtained and then sent to a second MZM, which is used for MB-OFDM
UWB signal modulation. In this work, we demonstrate a 30-
GHz mm-wave wireless signal that carries three-band OFDM UWB
signals, and error vector magnitude (EVM) is used to analyze the
transmission quality. It is found that our proposed technique leads to
an improvement of 3.5 dB in EVM at 40% local oscillator (LO)
modulation in comparison with the technique using two cascaded
MZMs biased at the minimum transmission (MITB) point.
Abstract: The aim of this study is to determine the effect of
strategic management implementations on institutionalization
levels. To this end, a field study was conducted on 31 stone quarry
enterprises in the cement-producing sector in Konya using the survey
method. In this study, the institutionalization levels of the enterprises
have been evaluated along three dimensions: professionalization,
management approach, and participation in decisions and delegation
of authority. According to the results of the survey, there is a highly
positive and statistically significant relationship between the strategic
management implementations and institutionalization levels of the
enterprises. Additionally, considering the results of the regression
analysis conducted to establish the relationship between strategic
management and institutionalization levels, it has been determined
that the strategic management implementations of the enterprises can
be used as a variable to explain their institutionalization levels, and
also that these implementations increase their institutionalization
levels.
Abstract: The least mean square (LMS) algorithm is one of the
most well-known algorithms for mobile communication systems
due to its implementation simplicity. However, the main limitation
is its relatively slow convergence rate. In this paper, a booster
using the concept of Markov chains is proposed to speed up the
convergence rate of LMS algorithms. The nature of Markov
chains makes it possible to exploit the past information in the
updating process. Moreover, since the transition matrix has a
smaller variance than that of the weight itself by the central limit
theorem, the weight transition matrix converges faster than the
weight itself. Accordingly, the proposed Markov-chain-based
booster can track variations in signal characteristics while
accelerating the convergence of LMS algorithms. Simulation results
show that, when the Markov-chain-based booster is applied, the
LMS algorithm converges faster and approaches the Wiener
solution more closely. The mean square error is also remarkably
reduced while the convergence rate is improved.
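For reference, the baseline being accelerated is the standard LMS update w ← w + μ·e·x. A minimal Python sketch of plain LMS (without the proposed Markov-chain booster) on a toy system-identification task; the channel taps and step size are illustrative assumptions:

```python
import numpy as np

def lms_identify(x, d, num_taps=4, mu=0.05):
    """Estimate an FIR channel with the standard LMS update w += mu * e * x."""
    w = np.zeros(num_taps)
    for k in range(num_taps - 1, len(x)):
        xk = x[k - num_taps + 1:k + 1][::-1]  # most recent samples first
        e = d[k] - w @ xk                     # instantaneous error
        w = w + mu * e * xk                   # stochastic-gradient step
    return w

# Toy identification: recover a known (assumed) 4-tap channel.
rng = np.random.default_rng(0)
h = np.array([0.8, -0.4, 0.2, 0.1])          # "unknown" channel for the demo
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)]               # desired signal = channel output
w = lms_identify(x, d)
print(np.round(w, 2))
```

In this noiseless setting the weights converge to the true taps; the slow convergence for small μ is exactly the limitation the abstract's booster targets.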
Abstract: Extensive use of the Internet coupled with the
marvelous growth in e-commerce and m-commerce has created a
huge demand for information security. The Secure Socket Layer
(SSL) protocol is the most widely used security protocol in the
Internet which meets this demand. It provides protection against
eaves droppings, tampering and forgery. The cryptographic
algorithms RC4 and HMAC have been in use for achieving security
services like confidentiality and authentication in the SSL. But recent
attacks against RC4 and HMAC have raised questions in the
confidence on these algorithms. Hence two novel cryptographic
algorithms MAJE4 and MACJER-320 have been proposed as
substitutes for them. The focus of this work is to demonstrate the
performance of these new algorithms and suggest them as dependable
alternatives to satisfy the need of security services in SSL. The
performance evaluation has been done by using practical
implementation method.
Abstract: The paper deals with an application of quantitative analysis, the Data Envelopment Analysis (DEA) method, to the performance evaluation of the European Union Member States in the reference years 2000 and 2011. The main aim of the paper is to measure efficiency changes over the reference years, to analyze the level of productivity in individual countries based on the DEA method, and to classify the EU Member States into homogeneous units (clusters) according to the efficiency results. The theoretical part is devoted to the fundamental basis of performance theory and the methodology of DEA. The empirical part is aimed at measuring the degree of productivity and the level of efficiency changes of the evaluated countries by the basic DEA model (the CCR CRS model) and a specialized DEA approach, the Malmquist Index, which measures the change of technical efficiency and the movement of the production possibility frontier. Here, the DEA method becomes a suitable tool for determining the competitive or uncompetitive position of each country because not only one factor is evaluated, but a set of different factors that determine the degree of economic development.
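The CCR CRS score mentioned above is, in its input-oriented envelopment form, a linear program: for each decision-making unit (DMU), minimize the input-contraction factor θ subject to the envelopment constraints. A minimal Python sketch using SciPy; the three-DMU data set is a made-up illustration, not EU data:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR (CRS) efficiency of DMU `o`.

    X: (m inputs x n DMUs), Y: (s outputs x n DMUs).
    Solves: min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]               # decision vars: [theta, lam_1..lam_n]
    A_in = np.c_[-X[:, [o]], X]               # X lam - theta * x_o <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]       # -Y lam <= -y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

# Hypothetical example: 3 DMUs, 1 input, 1 output.
X = np.array([[2.0, 1.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0]])
scores = [round(ccr_efficiency(X, Y, o), 3) for o in range(3)]
print(scores)   # DMU 1 uses the least input per output and scores 1.0
```

An efficient unit gets θ = 1; inefficient units get θ < 1, the proportion to which they could shrink inputs while staying on the frontier spanned by their peers.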
Abstract: The importance of ensuring safe meat handling and
processing practices has been demonstrated in global reports on food
safety scares and related illness and deaths. This necessitated stricter
meat safety control strategies. Today, many countries have regulated
towards preventative and systematic control over safe meat
processing at abattoirs utilizing the Hazard Analysis Critical Control
Point (HACCP) principles. HACCP systems have been reported as
effective in managing food safety risks, if correctly implemented.
South Africa has regulated the Hygiene Management System (HMS)
based on HACCP principles applicable to abattoirs. Regulators utilise
the Hygiene Assessment System (HAS) to audit compliance at
abattoirs. These systems were benchmarked from the United
Kingdom (UK). Little research has been done on them since their
inception in 2004. This paper presents a review of the two systems,
their implementation and a comparison with HACCP. Recommendations are
made for future research to demonstrate the utility of the HMS and
HAS in assuring safe meat to consumers.
Abstract: This paper presents the biotechnology used to obtain
collagen-based gels from the skins of shark (Squalus acanthias) and
brill, marine fish living in the Black Sea. Due to the structure of its
micro-fibres, collagen can be considered a nanomaterial; in order to
use collagen-based matrixes as biomaterials, rheological studies must
be performed first to determine whether they are stable. For the
triple-helix structure to remain stable within these gels at room or
human body temperature, they must be stabilized by reticulation.
Abstract: Intravitreal injection (IVI) is the most common treatment for posterior segment eye diseases such as endophthalmitis, retinitis, age-related macular degeneration, diabetic retinopathy, uveitis, and retinal detachment. Most of the drugs used to treat vitreoretinal diseases have a narrow concentration range in which they are effective and may be toxic at higher concentrations. Therefore, it is critical to know the drug distribution within the eye following intravitreal injection. With knowledge of the drug distribution, ophthalmologists can decide on the injection frequency while minimizing damage to tissues. The goal of this study was to develop a computer model to predict the intraocular concentrations and pharmacokinetics of intravitreally injected drugs. A finite volume model was created to predict the distribution of two drugs with different physicochemical properties in the rabbit eye. The model parameters were obtained from a literature review. To validate this numerical model, in vivo data of the spatial concentration profile from the lens to the retina were compared with the numerical data. The difference between the numerical and experimental data was less than 5%. This validation provides strong support for the numerical methodology and associated assumptions of the current study.
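Although the study's finite volume model is three-dimensional and drug-specific, the underlying balance — each cell's concentration changes by the net diffusive flux through its faces — can be illustrated in one dimension. A minimal Python sketch with illustrative dimensionless parameters (not the paper's rabbit-eye geometry or drug properties):

```python
import numpy as np

def diffuse_1d(c0, D, dx, dt, steps):
    """Explicit finite-volume update for 1D diffusion with no-flux walls.

    Face fluxes F = -D * dc/dx; each cell gains (F_in - F_out) * dt / dx.
    Stability requires D * dt / dx**2 <= 0.5.
    """
    c = c0.astype(float).copy()
    for _ in range(steps):
        # Fluxes at interior faces, padded with zero flux at both walls.
        F = np.concatenate(([0.0], -D * np.diff(c) / dx, [0.0]))
        c = c - (dt / dx) * (F[1:] - F[:-1])
    return c

# Injected bolus in the middle of a 50-cell domain (dimensionless demo).
c0 = np.zeros(50)
c0[25] = 1.0
c = diffuse_1d(c0, D=1.0, dx=1.0, dt=0.4, steps=100)
print(round(float(c.sum()), 6), round(float(c.max()), 3))
```

Because the update telescopes over face fluxes, total drug mass is conserved exactly while the peak concentration spreads out, the qualitative behaviour an ophthalmologist would read off such a model.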
Abstract: We propose a fast and robust hierarchical face detection system which finds and localizes face images with a cascade of classifiers. Three modules contribute to the efficiency of our detector. First, heterogeneous feature descriptors are exploited to enrich the feature types and feature numbers for face representation. Second, a PSO-Adaboost algorithm is proposed to efficiently select discriminative features from a large pool of available features and reinforce them into the final ensemble classifier. Compared with the standard exhaustive Adaboost for feature selection, the new PSO-Adaboost algorithm reduces the training time by up to a factor of 20. Finally, a three-stage hierarchical classifier framework is developed for rapid background removal. In particular, candidate face regions are detected more quickly by using a large window size in the first stage. Nonlinear SVM classifiers are used instead of decision stump functions in the last stage to remove those remaining complex non-face patterns that cannot be rejected in the previous two stages. Experimental results show our detector achieves superior performance on the CMU+MIT frontal face dataset.
Abstract: In this paper, we propose a new approach to query-by-humming, focusing on an MP3 song database. Since melody representation is much more difficult for MP3 songs than for symbolic performance data, we choose to extract feature descriptors from the vocal part of the songs. Our approach is based on signal filtering, sub-band spectral processing, MDCT coefficient analysis and peak energy detection, ignoring the background music as much as possible. Finally, we apply a dual dynamic programming algorithm for feature similarity matching. Experiments demonstrate its online performance in precision and efficiency.
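The similarity-matching step can be illustrated with the classic single dynamic-programming formulation (dynamic time warping); the paper's dual variant is not reproduced here, and the note sequences below are made up:

```python
def dtw_distance(a, b):
    """Standard dynamic-time-warping distance between two 1-D feature sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of: insertion, deletion, match.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# A hummed query matches a time-stretched version of the same contour.
melody = [60, 62, 64, 62, 60]
hummed = [60, 60, 62, 64, 64, 62, 60]   # same pitches, different timing
other = [60, 55, 70, 50, 65]
print(dtw_distance(melody, hummed) < dtw_distance(melody, other))  # True
```

Warping absorbs the tempo differences typical of hummed queries, which is why DP-style matching is the standard choice for this task.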
Abstract: This paper presents a new data-oriented model of images. A representation of it, ADBT, is then introduced. ADBT supports clustering, segmentation, measurement of image similarity, and related tasks, with the desired precision and corresponding speed.
Abstract: This paper presents a source extraction system which extracts only target signals, with constraints on source localization, in on-line systems. The proposed system is a method for enhancing a target signal while suppressing other interference signals. Its performance is superior to that of other methods, and the extraction of the target source is comparatively complete. The method has a beamforming concept and uses an improved time-frequency (TF) mask-based BSS algorithm to separate a target signal from multiple noise sources. The target sources are assumed to be in front, and the test data were recorded in a reverberant room. The experimental results of the proposed method were evaluated by the PESQ scores of real-recording sentences and showed a noticeable speech enhancement.
Abstract: The present study was carried out to calculate the coastal vulnerability index (CVI), to identify high- and low-sensitivity areas, and to estimate the area of inundation due to future sea-level rise (SLR). Both conventional and remotely sensed data were used and analyzed through a modelling technique. Of the total study area, 8.26% falls in the very high risk category, 14.21% high, 9.36% medium, 22.46% low and 7.35% very low, due to coastal components. Results of the inundation analysis indicate that 225.2 km² and 397 km² of the land area will be submerged by flooding at the 1 m and 10 m inundation levels, respectively. The most severely affected sectors are expected to be the residential, industrial and recreational areas. As this coast is planned for future coastal development activities, measures such as regulation of industrialization, building regulation, urban growth and agricultural planning, development of integrated coastal zone management, strict enforcement of the Coastal Regulation Zone (CRZ) Act, monitoring of impacts and further research in this regard are recommended for the study area.
Abstract: Deformable active contours are widely used in
computer vision and image processing applications for image
segmentation, especially in biomedical image analysis. The active
contour or “snake” deforms towards a target object by controlling the
internal, image and constraint forces. However, if the contour is
initialized with a small number of control points, there is a high
probability of bypassing the sharp corners of the object during
deformation of the contour. In this paper, a new technique is
proposed to construct the initial contour by incorporating prior
knowledge of significant corners of the object detected using the
Harris operator. This new reconstructed contour begins to deform, by
attracting the snake towards the targeted object, without missing the
corners. Experimental results with several synthetic images show the
ability of the new technique to deal with sharp corners with higher
accuracy than traditional methods.
Abstract: In this work a surgical simulator is produced which
enables a training otologist to conduct a virtual, real-time prosthetic
insertion. The simulator provides the Ear, Nose and Throat surgeon
with real-time visual and haptic responses during virtual cochlear
implantation into a 3D model of the human Scala Tympani (ST). The
parametric model is derived from measured data as published in the
literature and accounts for human morphological variance, such as
differences in cochlear shape, enabling patient-specific pre-operative
assessment. Haptic modeling techniques use real physical data and
insertion force measurements, to develop a force model which
mimics the physical behavior of an implant as it collides with the ST
walls during an insertion. Output force profiles are acquired from the
insertion studies conducted in the work, to validate the haptic model.
The simulator provides the user with real-time, quantitative insertion
force information and the associated electrode position as the user
inserts the virtual implant into the ST model. The information provided by this
study may also be of use to implant manufacturers for design
enhancements as well as for training specialists in optimal force
administration, using the simulator. The paper reports on the methods
for anatomical modeling and haptic algorithm development, with
focus on simulator design, development, optimization and validation.
The techniques may be transferable to other medical applications
that involve prosthetic device insertions where user vision is
obstructed.
Abstract: Undoubtedly, the chassis is one of the most important
parts of a vehicle. Vehicle chassis produced today are made up of
four parts, which are joined together by screwing; the transverse
parts are called cross members.
This study examines the stress generated by cyclic laboratory loads
in the front cross member of a Peugeot 405. The finite element
method is used to simulate the welding process and to determine the
physical response of the spot-welded joints. The analysis is
performed with the Abaqus software.
The stresses generated in the cross member structure fall into two
groups: the residual stresses remaining after the welding process
and the mechanical stresses generated by the cyclic load.
Accordingly, the total stress must be obtained by determining the
residual stress and the mechanical stress separately and then
summing them according to the superposition principle.
To improve accuracy, material properties, including physical,
thermal and mechanical properties, were assumed to be
temperature-dependent. Simulation shows that the maximum von
Mises stresses are located at particular points. The model results are
then compared to the experimental results reported by the
manufacturer, and good agreement is observed.
Abstract: The fundamental defect inherent to thermoforming
technology is wall-thickness variation of the products due to
inadequate thermal processing during production of the polymer. A
nonlinear viscoelastic rheological model is implemented to develop
the process model, which describes the deformation of a sheet during
the thermoforming process. A relaxation pause after the plug-assist
stage, together with the implementation of a two-stage
thermoforming process, yields smaller wall-thickness variation and
consequently better mechanical properties of the polymeric articles.
For model validation, a comparative analysis of the theoretical and
experimental data is presented.
Abstract: Color constancy algorithms are generally based on the
simplified assumption about the spectral distribution or the reflection
attributes of the scene surface. However, in reality, these assumptions
are too restrictive. A methodology is proposed to extend existing
algorithms by applying color constancy locally to image patches
rather than globally to the entire image.
In this paper, a method based on low-level image features using
superpixels is proposed. Superpixel segmentation partitions an image
into regions that are approximately uniform in size and shape. Instead
of using the entire pixel set for estimating the illuminant, only the
superpixels with the most valuable information are used. Based on
large-scale experiments on real-world scenes, the estimation is shown
to be more accurate using superpixels than using the entire image.
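The idea of estimating the illuminant from informative regions rather than the whole image can be sketched with a simple gray-world vote over square blocks (a stand-in for proper superpixel segmentation such as SLIC); the saliency proxy, parameters, and synthetic scene are illustrative assumptions, not the paper's method:

```python
import numpy as np

def grayworld_by_regions(img, block=8, keep_frac=0.5):
    """Estimate the illuminant from image regions rather than the whole image.

    Regions here are square blocks (a stand-in for superpixels); only the
    regions with the highest colour variation vote, and the votes are
    averaged gray-world style into a unit-norm illuminant estimate.
    """
    h, w, _ = img.shape
    means, scores = [], []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = img[y:y + block, x:x + block].reshape(-1, 3)
            means.append(patch.mean(axis=0))
            scores.append(patch.std())        # variation as a saliency proxy
    means, scores = np.array(means), np.array(scores)
    k = max(1, int(len(means) * keep_frac))
    top = np.argsort(scores)[-k:]             # keep the most informative regions
    est = means[top].mean(axis=0)
    return est / np.linalg.norm(est)

# Synthetic scene under a reddish illuminant (made-up demo data).
rng = np.random.default_rng(1)
scene = rng.random((64, 64, 3))
illum = np.array([1.0, 0.7, 0.5])
est = grayworld_by_regions(scene * illum)
print(np.round(est, 2))
```

On this synthetic scene the recovered direction closely matches the applied illuminant; real images need an actual superpixel segmentation and a better-justified region-selection criterion, which is what the abstract's method provides.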