Abstract: In this paper an effective approach for segmenting
human skin regions in images taken in different environments is
proposed. The proposed method uses a color distance map that is
flexible enough to reliably detect skin regions even when the
illumination conditions of the image vary. Local image conditions
are also taken into account, which helps the technique adaptively
detect differently illuminated skin regions within an image.
Moreover, the use of local information also helps the skin
detection process avoid picking up noisy pixels.
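As background, the color-distance idea can be illustrated with a minimal sketch. The reference chromaticity, the distance radius, and the use of normalized rg space here are illustrative assumptions, not the paper's actual map:

```python
import numpy as np

def skin_mask(rgb, ref=(0.45, 0.32), max_dist=0.08):
    """Label pixels whose chromaticity lies near a reference skin tone.

    rgb: HxWx3 float array in [0, 1].
    ref, max_dist: hypothetical reference (r, g) chromaticity and radius;
    an adaptive method would vary these with local image conditions.
    """
    s = rgb.sum(axis=2, keepdims=True) + 1e-8      # discard brightness
    chroma = rgb[..., :2] / s                      # (r, g) chromaticities
    dist = np.linalg.norm(chroma - np.asarray(ref), axis=2)
    return dist < max_dist                         # True where "skin-like"
```

Normalizing out brightness is what gives such a map some robustness to illumination changes, since only the hue-like components enter the distance.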
Abstract: Innovation, technology and knowledge are the trilogy
of impact supporting the challenges arising from uncertainty. The
evidence suggested an opportunity to ask how to manage in an
environment of constant innovation. In an attempt to obtain an
answer from the field of Management Sciences, grounded in
Contingency Theory, a research study was conducted with
phenomenological and descriptive approaches, using the Case Study
Method and the usual procedures for this task, involving a focus
group composed of managers and employees working in the
pharmaceutical field. The problem situation was raised, the state of
the art was interpreted, and the facts were dissected. Four
establishments were involved in this task. The results indicate that
the ventures studied have been managed empirically by their
founders and are experimenting with the agility described in this
work. The expectation of this study is to refine concepts for
stakeholders on creativity in business.
Abstract: Parsing is important in Linguistics and Natural
Language Processing for understanding the syntax and semantics of
a natural language grammar. Parsing natural language text is
challenging because of problems like ambiguity and inefficiency.
Moreover, the interpretation of natural language text depends on
context-based techniques. A probabilistic component is essential to
resolve ambiguity in both syntax and semantics, thereby increasing
the accuracy and efficiency of the parser. The Tamil language has
some inherent features which make it even more challenging. To
obtain solutions, a lexicalized and statistical approach is applied
to parsing with the aid of a language model. Statistical models
mainly focus on the semantics of the language and suit
large-vocabulary tasks, whereas structural methods focus on syntax
and model small-vocabulary tasks. A statistical language model
based on trigrams for the Tamil language, with a medium vocabulary
of 5000 words, has been built. Though statistical parsing gives
better performance through trigram probabilities and a large
vocabulary size, it has some disadvantages, such as a focus on
semantics rather than syntax and a lack of support for free word
order and long-term relationships. To overcome these disadvantages,
a structural component must be incorporated into statistical
language models, which leads to the implementation of hybrid
language models. This paper attempts to build a phrase-structured
hybrid language model that resolves the above-mentioned
disadvantages. In developing the hybrid language model, a new
part-of-speech tag set for the Tamil language has been created,
with more than 500 tags providing wider coverage. A
phrase-structured treebank has been developed with 326 Tamil
sentences covering more than 5000 words. A hybrid language model
has been trained on the phrase-structured treebank using the
immediate-head parsing technique. The lexicalized and statistical
parser employing this hybrid language model and the immediate-head
parsing technique gives better results than pure grammar- and
trigram-based models.
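The trigram component that the abstract builds on can be sketched as a minimal add-one-smoothed model. The class below is purely illustrative of the technique, not the paper's Tamil implementation or its smoothing choice:

```python
from collections import defaultdict

class TrigramLM:
    """Minimal add-one-smoothed trigram language model (illustrative)."""

    def __init__(self):
        self.tri = defaultdict(int)    # counts of (w1, w2, w3)
        self.bi = defaultdict(int)     # counts of (w1, w2) histories
        self.vocab = set()

    def train(self, sentences):
        for s in sentences:
            toks = ["<s>", "<s>"] + s + ["</s>"]   # pad sentence boundaries
            self.vocab.update(toks)
            for i in range(len(toks) - 2):
                self.bi[(toks[i], toks[i + 1])] += 1
                self.tri[(toks[i], toks[i + 1], toks[i + 2])] += 1

    def prob(self, w1, w2, w3):
        """P(w3 | w1, w2) with Laplace (add-one) smoothing."""
        v = len(self.vocab)
        return (self.tri[(w1, w2, w3)] + 1) / (self.bi[(w1, w2)] + v)
```

A hybrid model in the spirit of the abstract would combine such n-gram probabilities with scores from phrase-structure (treebank-derived) rules rather than use the trigram alone.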
Abstract: This paper provides an analysis of corporate income
tax (CIT) incentives in the Western Balkan countries: Slovenia,
Croatia, Serbia, Montenegro, Macedonia and Albania. Western
Balkan countries, like other transition and developing countries,
use a large number of CIT incentives to attract foreign investment
and to stimulate economic activity. The main goal of this paper is
to investigate how often these countries use CIT incentives and to
provide a review of the existing tax incentives in the Western
Balkan countries. The paper focuses on reduced CIT rates, tax
holidays, and other investment incentives such as accelerated
depreciation, tax allowances and tax credits.
Abstract: Proactive coping, directed at an upcoming as opposed
to an ongoing stressor, is a new focus in positive psychology. The
present study explored the effect of proactive coping on workplace
adaptation after the transition from college to the workplace. To
demonstrate the process of influence between them, we constructed a
model of proactive coping style affecting actual positive coping
efforts and outcomes, mediated by proactive competence, during the
year after the transition. Participants (n = 100) who started work
right after graduating from college completed all four surveys: one
month before (Time 0), one month after (Time 1), three months after
(Time 2), and one year after (Time 3) the transition. The Time 0
survey included measures of proactive coping style and competence.
The Time 1, 2 and 3 surveys included measures of challenge
cognitive appraisal, problem-solving coping strategy, and
subjective workplace adaptation. The results indicated that
proactive coping style affected newcomers' actual coping efforts
and outcomes through the mediation of proactive coping competence.
The results also showed that proactive coping competence directly
promoted Time 1's actual positive coping efforts and outcomes, and
indirectly promoted those of Time 2 and Time 3.
Abstract: A speech corpus is one of the major components of a
speech processing system, where one of the primary requirements
is to recognize an input sample. The quality of and detail captured
in the speech corpus directly affect the precision of recognition.
The current work proposes a platform for speech corpus generation
using an adaptive LMS filter and LPC cepstrum, as part of an
ANN-based speech recognition system which is exclusively designed
to recognize isolated numerals of the Assamese language, a major
language of the north-eastern part of India. The work focuses on
designing an optimal feature extraction block and a few ANN-based
cooperative architectures so that the performance of the speech
recognition system can be improved.
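The adaptive LMS filtering stage can be sketched in a few lines. This is a generic LMS noise canceller, with illustrative tap count and step size, not the paper's tuned front end:

```python
import numpy as np

def lms_filter(noisy, reference, taps=4, mu=0.05):
    """LMS noise-canceller sketch: adapt FIR weights so the reference
    input predicts the noise component of `noisy`; the error signal is
    the cleaned output. `taps` and `mu` are illustrative parameters."""
    w = np.zeros(taps)
    out = np.zeros(len(noisy))
    for n in range(taps, len(noisy)):
        x = reference[n - taps:n][::-1]     # most recent samples first
        y = w @ x                           # noise estimate
        e = noisy[n] - y                    # cleaned sample (error)
        w += 2 * mu * e * x                 # LMS weight update
        out[n] = e
    return out
```

In a speech front end the cleaned signal would then feed the LPC cepstrum feature extraction described in the abstract.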
Abstract: In this study, we experiment with precise control of
the outlet temperature of water from a water cooler using the
hot-gas bypass method based on PI control logic for machine tools.
Recently, the technical trend in machine tools has focused on
enhancing speed and accuracy. High-speed machining causes thermal
and structural deformation of objects processed by the machine
tools. A water cooler with an accurate temperature control system
must therefore be applied to machine tools to reduce the negative
thermal influence. The goal of this study is to minimize the
temperature error in the steady state. In addition, the control
period of the electronic expansion valve was considered, in order
to increase the lifetime of the machine tools and the quality of
products made with a water cooler.
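The PI logic at the heart of such a controller can be sketched as a single discrete update step. The gains, limits, and the simple anti-windup clamp below are illustrative assumptions, not the study's tuned values:

```python
def pi_step(setpoint, measured, state, kp=2.0, ki=0.5, dt=1.0,
            out_min=0.0, out_max=100.0):
    """One discrete PI update with simple anti-windup clamping.

    state: accumulated integral term from the previous step.
    Returns (actuator_command, new_integral_state).
    """
    error = setpoint - measured
    integral = state + error * dt
    u = kp * error + ki * integral
    if u > out_max:
        u, integral = out_max, state      # hold integral when saturated
    elif u < out_min:
        u, integral = out_min, state
    return u, integral
```

The integral term is what drives the steady-state temperature error to zero; calling `pi_step` at a longer control period reduces actuator (valve) cycling, which is the lifetime consideration the abstract mentions.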
Abstract: We report on the development of a model to
understand why the range of experience with respect to HIV
infection is so diverse, especially with respect to the latency period.
To investigate this, an agent-based approach is used to extract high-level
behaviour which cannot be described analytically from the set
of interaction rules at the cellular level. A network of independent
matrices mimics the chain of lymph nodes. Dealing with massively
multi-agent systems requires major computational effort. However,
parallelisation methods are a natural consequence and advantage of
the multi-agent approach and, using the MPI library, are here
implemented, tested and optimized. Our current focus is on the
various implementations of the data transfer across the network.
Three communications strategies are proposed and tested, showing
that the most efficient approach is communication based on the
natural lymph-network connectivity.
Abstract: The DS-CDMA system is a well-known wireless
technology. This system suffers from multiple access interference
(MAI) caused by direct-sequence users. Multi-user detection
schemes were introduced to detect users' data in the presence of
MAI. This paper focuses on linear multi-user detection schemes used
for data demodulation. Simulation results depict the performance of
three detectors, viz. the conventional detector, the decorrelating
detector and the subspace MMSE (minimum mean square error)
detector. It is seen that the performance of these detectors
depends on the number of paths and the length of the Gold code used.
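Of the three detectors, the decorrelating detector is the simplest to sketch: it applies the inverse of the spreading-code cross-correlation matrix to the matched-filter outputs, which removes MAI exactly in the absence of noise. The example below is a generic two-user illustration, not the paper's simulation setup:

```python
import numpy as np

def decorrelating_detector(y, R):
    """Decorrelating multi-user detector.

    y: matched-filter output vector (one entry per user).
    R: code cross-correlation matrix of the users' spreading codes.
    Returns the hard bit decisions sign(R^-1 y)."""
    return np.sign(np.linalg.inv(R) @ y)
```

The price of the exact MAI removal is noise enhancement, which is why the subspace MMSE detector compared in the paper can outperform it at low SNR.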
Abstract: The liberalization and privatization processes have
forced public utility companies to face new competitive challenges,
implementing strategies to gain market share and, at the same time,
keep the old customers. To this end, many companies have carried
out mergers, acquisitions and conglomerations in order to diversify
their business. This paper focuses on companies operating in the free
energy market in Italy. In the last decade, this sector has undergone
profound changes that have radically altered the competitive
scenario and have led companies to implement business
diversification strategies. Our work aims to evaluate the economic
and financial performance achieved by energy companies following
the beginning of the liberalization process, verifying the possible
relationship with the diversification strategies implemented.
Abstract: In this paper, we present an analytical framework for evaluating the uplink performance of multihop cellular networks based on dynamic time division duplex (TDD). New wireless broadband protocols, such as WiMAX, WiBro and 3G-LTE, apply TDD, and mobile communication protocols under standardization (e.g., IEEE 802.16j) are investigating mobile multihop relay (MMR) as a future technology. In this paper a novel MMR TDD scheme is presented, in which the dynamic range of the frame is shared between traffic resources of an asymmetric nature and multihop relaying. The mobile communication channel interference model comprises inner interference and co-channel interference (CCI). The performance analysis focuses on the uplink because the effects of dynamic resource allocation show significant performance degradation, due to CCI, only in the uplink compared to time division multiple access (TDMA) schemes [1-3], whereas the downlink turns out to be the same or better. The analysis is based on the signal-to-interference power ratio (SIR) outage probability of dynamic TDD (D-TDD) and TDMA systems, which are the most widespread mobile communication multi-user control techniques. This paper presents the uplink SIR outage probability with multihop results and shows that the dynamic TDD scheme applying MMR can provide a performance improvement compared to single-hop applications if executed properly.
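The SIR outage metric used in the analysis can be illustrated with a small Monte-Carlo sketch: count how often the desired signal's SIR falls below a threshold under log-normal shadowing. The power levels, shadowing spread, and absence of cell geometry here are all simplifying assumptions for illustration, not the paper's interference model:

```python
import numpy as np

def sir_outage(n_interferers, thr_db=6.0, sigma_db=8.0,
               mean_sig_db=20.0, trials=50000, seed=0):
    """Monte-Carlo estimate of P(SIR < threshold) with log-normal
    shadowing on the desired signal and each co-channel interferer.
    All parameters are illustrative, not the paper's geometry."""
    rng = np.random.default_rng(seed)
    # desired signal power (linear), median mean_sig_db above interferers
    sig = 10 ** ((mean_sig_db + sigma_db * rng.standard_normal(trials)) / 10)
    # independent shadowed interferer powers, summed per trial
    intf = 10 ** ((sigma_db * rng.standard_normal((trials, n_interferers))) / 10)
    sir_db = 10 * np.log10(sig / intf.sum(axis=1))
    return float(np.mean(sir_db < thr_db))
```

As expected, outage grows with the number of active co-channel interferers, which is the mechanism behind the uplink degradation the abstract describes.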
Abstract: The Korean government has applied preliminary feasibility studies to new, large R&D programs since 2008. The study is carried out from the viewpoints of technology, policy, and economics. The separate analyses are then integrated to finally arrive at a definite result: whether a program is feasible or unfeasible. This paper describes the concept and method of the feasibility analysis, focused on technological viability assessment for the technical analysis. It consists of technology trend assessment and technology level assessment. Through the analysis, we can determine the chance of schedule delay or cost overrun occurring in the proposed plan.
Abstract: The availability of high-dimensional biological datasets, such as those from gene expression, proteomic, and metabolic experiments, can be leveraged for the diagnosis and prognosis of diseases. Many classification methods in this area have been studied to predict disease states and to separate predefined classes, such as patients with a particular disease versus healthy controls. However, most of the existing research focuses only on a specific dataset. There is a lack of generic comparisons between classifiers, which could provide a guideline for biologists or bioinformaticians to select the proper algorithm for new datasets. In this study, we compare the performance of popular classifiers, namely Support Vector Machine (SVM), Logistic Regression, k-Nearest Neighbor (k-NN), Naive Bayes, Decision Tree, and Random Forest, on mock datasets. We mimic common biological scenarios, simulating various proportions of real discriminating biomarkers and different effect sizes thereof. The results show that SVM performs quite stably and reaches a higher AUC than the other methods. This may be explained by the ability of SVM to minimize the probability of error. Moreover, Decision Tree, with its good applicability for diagnosis and prognosis, shows good performance in our experimental setup. Logistic Regression and Random Forest, however, depend strongly on the ratio of discriminators and perform better with a higher number of discriminators.
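The mock-data setup and the AUC metric the abstract relies on can be sketched without any classifier library. The sample sizes, feature counts, and effect size below are illustrative assumptions, not the study's actual simulation parameters:

```python
import numpy as np

def mock_dataset(n=200, p=50, n_disc=5, effect=1.0, seed=0):
    """Mock high-dimensional two-class data: p features, of which
    n_disc discriminate between the classes with the given effect
    size (mean shift in units of the noise standard deviation)."""
    rng = np.random.default_rng(seed)
    y = np.repeat([0, 1], n // 2)
    X = rng.standard_normal((n, p))
    X[y == 1, :n_disc] += effect       # shift the discriminating features
    return X, y

def auc(scores, y):
    """Rank-based AUC: probability a positive outranks a negative
    (Mann-Whitney U statistic, assuming no tied scores)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = y == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    u = ranks[pos].sum() - n_pos * (n_pos + 1) / 2
    return u / (n_pos * n_neg)
```

With such a generator, any classifier's decision scores on a held-out split can be fed to `auc`, which is how stability across proportions of discriminators and effect sizes can be compared.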
Abstract: This paper presents Cost per Equivalent Wafer Out, which we find useful in monitoring and controlling wafer fab operational cost. It removes the loading and product-mix effects in the cost variance analysis. The operation heads can therefore immediately focus on identifying areas for cost improvement. Without it, they would have to measure the impact of the loading variance and product-mix variance between actual and budgeted figures prior to making any decision on cost improvement. Cost per Equivalent Wafer Out thereby increases efficiency in monitoring and controlling wafer fab operational cost.
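The abstract does not spell out the formula, so the sketch below is one plausible formulation, assumed for illustration only: weight each product's wafer outs by a process-complexity factor relative to a baseline product, then divide total cost by the equivalent total.

```python
def cost_per_equivalent_wafer(total_cost, outs_by_product):
    """Assumed (not the paper's exact) definition of Cost per
    Equivalent Wafer Out.

    outs_by_product: list of (wafer_outs, complexity_weight) pairs,
    where weight 1.0 is the baseline product."""
    equivalent = sum(outs * weight for outs, weight in outs_by_product)
    return total_cost / equivalent
```

For example, 10 baseline wafers plus 5 wafers of a product weighted 2.0 give 20 equivalent wafers, so a $1000 period cost yields $50 per equivalent wafer, regardless of how loading shifted between the two products.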
Abstract: With the exponential progress of technological
development comes a strong sense that events are moving too quickly
for our schools and that teachers may be losing control of them in the
process. This paper examines the impact of e-learning and e-teaching
in universities, from both the student and teacher perspective. In
particular, it is shown that e-teachers should focus not only on the
technical capacities and functions of IT materials and activities, but
must attempt to more fully understand how their e-learners perceive
the learning environment. From the e-learner perspective, this paper
indicates that simply having IT tools available does not automatically
translate into all students becoming effective learners. More
evidence-based evaluative research is needed to allow e-learning and
e-teaching to reach their full potential.
Abstract: This study aims to propose three evaluation methods to
evaluate the Tokyo Cap and Trade Program when emissions trading is
performed virtually among enterprises, focusing on carbon dioxide
(CO2), which is the only emitted greenhouse gas that tends to increase.
The first method clarifies the optimum reduction rate for the highest
cost benefit, the second discusses emissions trading among enterprises
through market trading, and the third verifies long-term emissions
trading during the term of the plan (2010-2019), checking the validity
of emissions trading partly using Geographic Information Systems
(GIS). The findings of this study can be summarized in the following
three points.
1. Since the total cost benefit is greatest at a 44% reduction rate, it
is possible to set the rate higher than that of the Tokyo Cap and
Trade Program to obtain a greater total cost benefit.
2. At a 44% reduction rate, among 320 enterprises, 8 purchasing
enterprises and 245 sales enterprises gain profits from emissions
trading, and 67 enterprises perform voluntary reduction without
conducting emissions trading. Therefore, to further promote
emissions trading, it is necessary to increase the sales volume of
emissions trading, in addition to the number of sales enterprises,
by increasing the number of purchasing enterprises.
3. Compared to short-term emissions trading, there are few enterprises
which benefit in each year through the long-term emissions trading
of the Tokyo Cap and Trade Program. At most, only 81 enterprises
can gain profits from emissions trading in FY 2019. Therefore, by
setting the reduction rate higher, it is necessary to increase the
number of enterprises that participate in emissions trading and
benefit from the restraint of CO2 emissions.
Abstract: Since the world printing industry must confront
globalization amid constant change, the Thai printing industry, as
a small but increasingly significant part of the world printing
industry, cannot escape but must face similar change, along with
the need to revamp its production processes, designs and technology
to make them more appealing to both international and domestic
markets. The essential question is: what is the Thai competitive
edge in the printing industry in this changing environment? This
research aims to study the Thai level of competitive edge in terms
of marketing, technology and environmental friendliness, and the
level of satisfaction with the process of using printing machines.
To assess the trends in the competitiveness of the Thai printing
industry, both quantitative and qualitative studies were conducted.
The quantitative analysis was restricted to 100 respondents. The
qualitative analysis was restricted to a focus group of 10
individuals from various backgrounds in the Thai printing industry.
The findings from the quantitative analysis revealed overall mean
scores of 4.53, 4.10, and 3.50 for the competitiveness of
marketing, the competitiveness of technology, and the
competitiveness of being environmentally friendly, respectively.
However, the level of satisfaction with the process of using
machines has a mean score of only 3.20. The findings from the
qualitative analysis revealed that target customers have
increasingly reordered owing to their contentment with both the low
prices and the acceptable quality of the products. Moreover, the
Thai printing industry has a tendency to convert to green
technology which is friendly to the environment, choosing to
produce, or substitute with, products that are less damaging to the
environment. It was also found that the Thai printing industry has
been transformed into a very competitive industry in which
bargaining power rests with consumers, who have a variety of
choices.
Abstract: Extensive use of the Internet, coupled with the
marvelous growth in e-commerce and m-commerce, has created a
huge demand for information security. The Secure Socket Layer
(SSL) protocol is the most widely used security protocol on the
Internet that meets this demand. It provides protection against
eavesdropping, tampering and forgery. The cryptographic
algorithms RC4 and HMAC have been in use for achieving security
services like confidentiality and authentication in SSL. But recent
attacks against RC4 and HMAC have raised questions about the
confidence in these algorithms. Hence two novel cryptographic
algorithms, MAJE4 and MACJER-320, have been proposed as
substitutes for them. The focus of this work is to demonstrate the
performance of these new algorithms and to suggest them as
dependable alternatives for satisfying the need for security
services in SSL. The performance evaluation has been done using a
practical implementation method.
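A performance evaluation of the kind the abstract describes can be sketched as a throughput harness. MAJE4 and MACJER-320 are not available in standard libraries, so the harness accepts any `(key, message)` MAC callable; HMAC-SHA1 is used below purely as a stand-in, not as the paper's algorithm:

```python
import hashlib
import hmac
import os
import time

def mac_throughput(mac_fn, data_mb=4, block=64 * 1024):
    """Measure the throughput of a MAC function in MB/s over random
    data. mac_fn: any callable taking (key, message) bytes."""
    key = os.urandom(32)
    chunks = [os.urandom(block) for _ in range(data_mb * 1024 * 1024 // block)]
    start = time.perf_counter()
    for chunk in chunks:
        mac_fn(key, chunk)
    elapsed = time.perf_counter() - start
    return data_mb / elapsed

def hmac_sha1(key, msg):
    """Stand-in MAC for the benchmark (20-byte HMAC-SHA1 tag)."""
    return hmac.new(key, msg, hashlib.sha1).digest()
```

Running `mac_throughput` for each candidate over identical data gives a like-for-like comparison of the kind a practical implementation study would report.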
Abstract: Software testing is an important stage of the software development cycle. The current testing process involves a tester and electronic documents with test case scenarios. In this paper we focus on a new approach to the testing process using automated test case generation and tester guidance through the system, based on a model of the system. Test case generation and model-based testing are not possible without a proper system model. We aim at providing better feedback from the testing process, thus eliminating unnecessary paperwork.
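The core idea of generating test cases from a system model can be illustrated with a toy finite-state model; the model format (a state-to-actions mapping) and the bounded-depth enumeration are illustrative stand-ins, not the paper's generation algorithm:

```python
def generate_test_cases(model, start, max_depth=3):
    """Enumerate action sequences (test cases) from a finite-state
    model of the system, up to a bounded length.

    model: dict mapping state -> {action: next_state}."""
    cases = []

    def walk(state, path):
        if path:
            cases.append(list(path))        # every prefix is a test case
        if len(path) == max_depth:
            return
        for action, nxt in model.get(state, {}).items():
            path.append(action)
            walk(nxt, path)
            path.pop()

    walk(start, [])
    return cases
```

For a two-state login model with actions `ok`, `fail` and `logout`, the enumeration yields sequences such as `["ok", "logout"]`, each of which can be presented to the tester as a guided scenario.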
Abstract: In this paper, we propose a new approach to query-by-humming, focusing on MP3 song databases. Since melody representation is much more difficult for MP3 songs than for symbolic performance data, we choose to extract feature descriptors from the vocal parts of the songs. Our approach is based on signal filtering, sub-band spectral processing, MDCT coefficient analysis and peak energy detection, ignoring the background music as much as possible. Finally, we apply a dual dynamic programming algorithm for feature similarity matching. Experiments demonstrate its online performance in precision and efficiency.
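The similarity-matching step can be illustrated with classic dynamic time warping over 1-D feature sequences; this is a simplified stand-in for the paper's dual dynamic programming matcher, which it does not fully specify:

```python
import math

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two feature sequences,
    allowing the hummed query to be sung faster or slower than the
    stored melody. Uses absolute difference as the local cost."""
    n, m = len(a), len(b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # best of: advance both, stretch a, or stretch b
            D[i][j] = cost + min(D[i - 1][j - 1], D[i - 1][j], D[i][j - 1])
    return D[n][m]
```

Ranking database songs by this distance against the query's extracted feature sequence gives a simple query-by-humming retrieval loop.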