Abstract: Some meta-schedulers query the information systems of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational Grid. However, this information can become outdated by the time a job starts due to changes in scheduling priorities. The MSR scheme, based on Multiple Simultaneous Requests, can take advantage of opportunities resulting from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and than the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal over the past year. Usage statistics are provided and demonstrate its capacity to reliably achieve a substantial reduction in execution time under production conditions.
Abstract: This study proposes three methods to evaluate the Tokyo Cap
and Trade Program when emissions trading is performed virtually
among enterprises, focusing on carbon dioxide (CO2), the only
emitted greenhouse gas that tends to increase. The first method
identifies the optimum reduction rate for the highest cost benefit,
the second examines emissions trading among enterprises through
market trading, and the third verifies long-term emissions trading
over the term of the plan (2010-2019), checking the validity of
emissions trading partly using Geographic Information Systems
(GIS). The findings of this study can be summarized in the following
three points.
1. Since the total cost benefit is greatest at a 44% reduction rate,
the rate can be set higher than that of the Tokyo Cap and Trade
Program to obtain a greater total cost benefit.
2. At a 44% reduction rate, among 320 enterprises, 8 purchasing
enterprises and 245 selling enterprises gain profits from emissions
trading, while 67 enterprises perform voluntary reduction without
conducting emissions trading. Therefore, to further promote
emissions trading, it is necessary to increase the trading volume,
not only the number of selling enterprises, by increasing the
number of purchasing enterprises.
3. Compared to short-term emissions trading, few enterprises benefit
each year from the long-term emissions trading of the Tokyo Cap
and Trade Program; at most 81 enterprises can gain profits from
emissions trading in FY 2019. Therefore, it is necessary to set the
reduction rate higher, so as to increase the number of enterprises
that participate in emissions trading and benefit from restraining
CO2 emissions.
Abstract: The purpose of this study was to explore the learning
effects of the dance domain in the Arts Curriculum at the junior and
senior high school levels. A total of 1,366 students from the 9th to
11th grades in different areas of Taiwan were administered a
self-designed dance achievement test. Data were analyzed through
descriptive analysis, independent-sample t tests, one-way ANOVA, and
post hoc comparison using the Scheffé test. The results showed (1) female students
Abstract: Information on weed distribution within a field is necessary to implement spatially variable herbicide application. Since hand labor is costly, an automated weed control system could be feasible. This paper deals with the development of an algorithm for a real-time weed recognition system based on the histogram maxima of an image with thresholding, used for weed classification. The algorithm is specifically developed to classify images into broad and narrow classes for real-time selective herbicide application. The developed system was tested on weeds in the lab, and the results show it to be very effective in weed identification. Furthermore, the results show very reliable performance on images of weeds taken under varying field conditions. The analysis shows over 95 percent classification accuracy on 140 sample images (broad and narrow), with 70 samples from each category of weeds.
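As an illustration of the histogram-maxima idea, the sketch below classifies a binary vegetation mask by the width of the peak in its column histogram. This is a hypothetical simplification, not the paper's algorithm: the function name `classify_weed`, the `width_threshold` parameter, and the half-maximum rule are all illustrative assumptions.

```python
import numpy as np

def classify_weed(mask, width_threshold=5):
    """Classify a binary vegetation mask as broad- or narrow-leaved.

    Hypothetical rule: a broad leaf yields a wide run of columns near the
    column-histogram maximum, while thin grass blades light up only a few
    isolated columns.
    """
    col_hist = mask.sum(axis=0)                     # vegetation pixels per column
    peak = col_hist.max()                           # histogram maximum
    if peak == 0:
        return "none"
    peak_width = int((col_hist >= peak / 2).sum())  # columns within half of peak
    return "broad" if peak_width >= width_threshold else "narrow"
```

For example, a solid 10x10 blob in a 20x20 mask is reported as "broad", while three 1-pixel-wide vertical blades are reported as "narrow".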
Abstract: Since the world printing industry has to confront
globalization and constant change, the Thai printing industry, as a
small but increasingly significant part of it, cannot escape but must
face similar changes, along with the need to revamp its production
processes, designs, and technology to make them more appealing to
both international and domestic markets. The essential question is:
what is the Thai competitive edge in the printing industry in this
changing environment? This research aims to study the Thai printing
industry's level of competitiveness in terms of marketing, technology,
and environmental friendliness, as well as the level of satisfaction
with the process of using printing machines. To assess trends in the
competitiveness of the Thai printing industry, both quantitative and
qualitative studies were conducted. The quantitative analysis was
restricted to 100 respondents; the qualitative analysis was restricted
to a focus group of 10 individuals from various backgrounds in the
Thai printing industry. The findings from the quantitative analysis
revealed overall mean scores of 4.53, 4.10, and 3.50 for the
competitiveness of marketing, technology, and environmental
friendliness respectively. However, the level of satisfaction with the
process of using machines had a mean score of only 3.20. The findings
from the qualitative analysis revealed that target customers have
increasingly reordered due to their contentment with both the low
prices and the acceptable quality of the products. Moreover, the Thai
printing industry tends to convert to green technology that is
friendly to the environment, choosing to produce or substitute
products that are less damaging to the environment. It was also found
that the Thai printing industry has been transformed into a very
competitive industry in which bargaining power rests with consumers,
who have a variety of choices.
Abstract: A four-lobe pressure dam bearing which is
produced by cutting two pressure dams on the upper two lobes and
two relief-tracks on the lower two lobes of an ordinary four-lobe
bearing is found to be more stable than a conventional four-lobe
bearing. In this paper, a four-lobe pressure dam bearing supporting
rigid and flexible rotors is analytically investigated to determine its
performance when the L/D ratio is varied in the range 0.75 to 1.5. The
static and dynamic characteristics are studied at various L/D ratios.
The results show that the stability of a four-lobe pressure dam
bearing increases with decreasing L/D ratio for both rigid and
flexible rotors.
Abstract: The aim of this study is to determine the effect of
strategic management implementations on institutionalization levels.
To this end, a field study was conducted on 31 stone quarry
enterprises in the cement-producing sector in Konya, using a survey
method. Institutionalization levels of the enterprises were evaluated
along three dimensions: professionalization, management approach, and
participation in decisions and delegation of authority. According to
the results of the survey, there is a highly positive and
statistically significant relationship between the strategic
management implementations and the institutionalization levels of the
enterprises. Additionally, the regression analysis conducted to
establish this relationship shows that the strategic management
implementations of the enterprises can be used as a variable to
explain their institutionalization levels, and that strategic
management implementations increase those levels.
Abstract: The least mean square (LMS) algorithm is one of the
most well-known algorithms for mobile communication systems
due to its implementation simplicity. However, its main limitation
is a relatively slow convergence rate. In this paper, a booster
using the concept of Markov chains is proposed to speed up the
convergence rate of LMS algorithms. The nature of Markov
chains makes it possible to exploit past information in the
updating process. Moreover, since the transition matrix has a
smaller variance than the weight itself by the central limit
theorem, the weight transition matrix converges faster than the
weight itself. Accordingly, the proposed Markov-chain based
booster can track variations in signal characteristics while
accelerating the rate of convergence of LMS algorithms.
Simulation results show that, with the Markov-chain based booster
applied, the LMS algorithm converges faster and approaches the
Wiener solution more closely; the mean square error is also
remarkably reduced while the convergence rate is improved.
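For context, the baseline LMS update that the proposed booster accelerates can be sketched as follows. Only the standard LMS recursion is shown; the Markov-chain booster itself is specific to the paper and is not reproduced, and the system-identification setup in the usage note is a generic illustration, not the paper's simulation.

```python
import numpy as np

def lms(x, d, n_taps=4, mu=0.01):
    """Standard LMS adaptive filter.

    x: input signal, d: desired signal, mu: step size.
    Returns the final tap-weight vector after one pass over the data.
    """
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1 : n + 1][::-1]  # regressor [x[n], x[n-1], ...]
        e = d[n] - w @ u                     # estimation error
        w = w + mu * e * u                   # LMS weight update
    return w
```

A typical use is noiseless system identification, where the weights converge to the unknown FIR response: with `h = [0.5, -0.3, 0.2, 0.1]`, white-noise input `x`, and `d = np.convolve(x, h)[:len(x)]`, the returned `w` approaches `h`.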
Abstract: Many agent-oriented software engineering
methodologies have been proposed for software development; however,
their application is still limited due to their lack of maturity.
Evaluating the strengths and weaknesses of these methodologies
plays an important role in improving them and in developing new,
stronger methodologies. This paper presents an evaluation framework
for agent-oriented methodologies which addresses six major areas:
concepts, notation, process, pragmatics, support for software
engineering, and marketability. The framework is then used to
evaluate the Gaia methodology, to identify its strengths and
weaknesses and to demonstrate the framework's ability to advance
agent-oriented methodologies by detecting their weaknesses in
detail.
Abstract: The effects of enzyme action and heat pretreatment on oil extraction yield from sunflower kernels were analysed using hexane extraction with Soxhlet apparatus and aqueous extraction with an incubator shaker. Ground raw and heat-treated kernels, each with and without Viscozyme treatment, were used. Microscopic images of the kernels were taken to analyse the visible effects of each treatment on the cotyledon cell structure. Heat pretreatment before both extraction processes produced higher oil extraction yields than the control, with steam explosion the most efficient. In hexane extraction, applying a combination of steam explosion and Viscozyme treatments to the kernels before extraction gave the maximum oil extractable in 1 hour, while for aqueous extraction, raw kernels treated with Viscozyme gave the highest oil extraction yield. Remarkable cotyledon cell disruption was evident in kernels treated with Viscozyme, whereas steam explosion and conventional heat treatment had similar effects.
Abstract: The problem of laminar fluid flow which results from
the shrinking of a permeable surface in a nanofluid has been
investigated numerically. The model used for the nanofluid
incorporates the effects of Brownian motion and thermophoresis. A
similarity solution is presented which depends on the mass suction
parameter S, Prandtl number Pr, Lewis number Le, Brownian motion
number Nb and thermophoresis number Nt. It was found that the
reduced Nusselt number is a decreasing function of each of these
dimensionless numbers.
Abstract: The paper deals with an application of quantitative analysis, the Data Envelopment Analysis (DEA) method, to performance evaluation of the European Union Member States in the reference years 2000 and 2011. The main aim of the paper is to measure efficiency changes over the reference years, to analyze the level of productivity in individual countries based on the DEA method, and to classify the EU Member States into homogeneous units (clusters) according to the efficiency results. The theoretical part is devoted to the fundamentals of performance theory and the methodology of DEA. The empirical part measures the degree of productivity and the level of efficiency changes of the evaluated countries using a basic DEA model, the CCR CRS model, and a specialized DEA approach, the Malmquist Index, which measures the change in technical efficiency and the movement of the production possibility frontier. Here, DEA becomes a suitable tool for establishing the competitive or uncompetitive position of each country, because not just one factor is evaluated but a set of different factors that determine the degree of economic development.
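The input-oriented CCR (constant returns to scale) envelopment model mentioned above can be sketched as a small linear program. This is the generic textbook formulation solved with `scipy.optimize.linprog`, not the paper's data or specific model variant; `ccr_efficiency` is an illustrative name.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR (CRS) efficiency of decision-making unit o.

    X: inputs, shape (m, n); Y: outputs, shape (s, n); n = number of DMUs.
    Solves: min theta  s.t.  X @ lam <= theta * X[:, o],
                             Y @ lam >= Y[:, o],  lam >= 0,
    with decision variables [theta, lam_1, ..., lam_n].
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                # objective: minimize theta
    A_in = np.hstack([-X[:, [o]], X])          # X@lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # -Y@lam <= -y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(0, None)] * (n + 1))
    return float(res.fun)
```

With one input and one output, a unit producing 2 from 2 is efficient (score 1.0), while a unit producing 2 from 4 scores 0.5, since halving its input would place it on the frontier.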
Abstract: In this study, spatial-temporal speckle correlation techniques have been applied for the first time to the quality evaluation of three different Indian fruits, namely apple, pear and tomato. The method is based on the analysis of variations of laser light scattered from biological samples. The results showed that cross-correlation coefficients of biospeckle patterns change with the freshness and storage conditions of the samples. The biospeckle activity was determined by means of the cross-correlation functions of the intensity fluctuations, and significant changes in activity were observed during the fruits' shelf lives. The study found that biospeckle activity decreases with shelf-life storage time, and further showed that biospeckle activity changes according to the fruits' respiration rates.
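The core quantity in such analyses, the cross-correlation coefficient between two speckle frames, can be sketched as below. This is a minimal Pearson-correlation sketch, not the paper's full spatial-temporal processing pipeline.

```python
import numpy as np

def speckle_correlation(frame_a, frame_b):
    """Pearson cross-correlation coefficient between two speckle frames.

    In biospeckle analysis, faster decorrelation between successive frames
    (a coefficient dropping toward zero) indicates higher biological
    activity, e.g. a fresher, more actively respiring sample.
    """
    a = frame_a.ravel().astype(float)
    b = frame_b.ravel().astype(float)
    return float(np.corrcoef(a, b)[0, 1])
```

Identical frames give a coefficient of 1.0; as the speckle pattern evolves over time, the coefficient against the first frame decays, and the decay rate serves as an activity measure.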
Abstract: This study sought to determine whether relationships existed among leisure satisfaction, self-esteem, and spiritual wellness. Four hundred survey instruments were distributed and 334 effective instruments were returned, an effective rate of 83.5%. The participants were recruited through purposive sampling; subjects were retirees at least 60 years of age in Tainan City, Taiwan. Three instruments were used in this research: the Leisure Satisfaction Scale (LSS), the Self-Esteem Scale (SES), and the Spirituality Assessment Scale (SAS). The collected data were analyzed statistically. The findings of this research were as follows: 1. Leisure satisfaction is significantly correlated with spiritual wellness. 2. Leisure satisfaction is significantly correlated with self-esteem. 3. Spiritual wellness is significantly correlated with self-esteem.
Abstract: Intravitreal injection (IVI) is the most common treatment for posterior eye segment diseases such as endophthalmitis, retinitis, age-related macular degeneration, diabetic retinopathy, uveitis, and retinal detachment. Most of the drugs used to treat vitreoretinal diseases have a narrow concentration range in which they are effective and may be toxic at higher concentrations. Therefore, it is critical to know the drug distribution within the eye following intravitreal injection. With knowledge of drug distribution, ophthalmologists can decide on drug injection frequency while minimizing damage to tissues. The goal of this study was to develop a computer model to predict intraocular concentrations and pharmacokinetics of intravitreally injected drugs. A finite volume model was created to predict the distribution of two drugs with different physicochemical properties in the rabbit eye, with model parameters obtained from a literature review. To validate this numerical model, in vivo data on the spatial concentration profile from the lens to the retina were compared with the numerical data; the difference between the numerical and experimental data was less than 5%. This validation provides strong support for the numerical methodology and the associated assumptions of the current study.
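A minimal sketch of the finite-volume idea underlying such a model is a 1D explicit diffusion step with zero-flux boundaries, shown below. The actual rabbit-eye model involves 3D geometry, drug-specific parameters, and more realistic boundary conditions, none of which are reproduced here; `diffuse_1d` and its arguments are illustrative.

```python
import numpy as np

def diffuse_1d(c0, D, dx, dt, steps):
    """Explicit finite-volume scheme for 1D diffusion, zero-flux boundaries.

    c0: initial cell-average concentrations; D: diffusivity.
    Each step computes Fickian fluxes at cell faces and updates cell
    averages from the flux difference, which conserves total mass exactly.
    """
    assert D * dt / dx**2 <= 0.5, "explicit scheme stability limit"
    c = np.asarray(c0, dtype=float).copy()
    flux = np.zeros(len(c) + 1)                   # faces, incl. boundaries
    for _ in range(steps):
        flux[1:-1] = -D * (c[1:] - c[:-1]) / dx   # Fick's law at interior faces
        c -= (dt / dx) * (flux[1:] - flux[:-1])   # finite-volume update
    return c
```

Starting from a concentrated bolus (one nonzero cell, mimicking an injection), the profile spreads over time while the total drug mass stays constant, which is the property that makes finite-volume schemes attractive for pharmacokinetic transport models.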
Abstract: Plasmodium vivax malaria differs from P. falciparum malaria in that a person suffering from P. vivax infection can suffer relapses of the disease. This is due to the parasite being able to remain dormant in the liver of the patient, from where it can re-infect the patient after a passage of time; during this stage, the patient is classified as being in the dormant class. The model describing the transmission of P. vivax malaria consists of a human population divided into four classes: the susceptible, the infected, the dormant, and the recovered. The effect of a time delay on the transmission of this disease is studied. The time delay is the period in which the P. vivax parasite develops inside the mosquito (vector) before the vector becomes infectious (i.e., able to pass on the infection). We analyze our model using standard dynamic modeling methods. Two stable equilibrium states, a disease-free state E0 and an endemic state E1, are found to be possible. The E0 state is stable when a newly defined basic reproduction number G is less than one; if G is greater than one, the endemic state E1 is stable. The conditions for the endemic equilibrium state E1 to be a stable spiral node are established. For realistic values of the parameters in the model, solutions in phase space are trajectories spiraling into the endemic state. It is shown that limit cycle and chaotic behaviors can only be achieved with unrealistic parameter values.
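The four-class structure described above can be illustrated with a toy compartmental ODE system that omits the vector population and the development delay. All rate constants below are arbitrary placeholders, not the paper's parameters, and the reproduction number G is not computed in this sketch.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder rates (illustrative only, not the paper's fitted values)
BETA, GAMMA, KAPPA, RHO, DELTA = 0.3, 0.1, 0.05, 0.02, 0.04

def vivax(t, y):
    """Toy susceptible/infected/dormant/recovered system for P. vivax.

    S -> I (infection), I -> D (liver-stage dormancy) or I -> R (recovery),
    D -> I (relapse) or D -> R (clearance). Vector dynamics and the
    delay for parasite development in the mosquito are omitted here.
    """
    S, I, D, R = y
    new_inf = BETA * S * I
    dS = -new_inf
    dI = new_inf + RHO * D - (GAMMA + KAPPA) * I
    dD = KAPPA * I - (RHO + DELTA) * D
    dR = GAMMA * I + DELTA * D
    return [dS, dI, dD, dR]

# Normalized population with 1% initially infected
sol = solve_ivp(vivax, (0.0, 200.0), [0.99, 0.01, 0.0, 0.0], rtol=1e-8)
```

Because the four derivatives sum to zero, the total population is conserved along every trajectory; the relapse term `RHO * D` is what distinguishes this structure from a plain SIR model.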
Abstract: This paper presents a new data-oriented model of images, and then introduces a representation of it, ADBT. ADBT supports clustering, segmentation, measurement of image similarity, and related tasks, with the desired precision and corresponding speed.
Abstract: Public health surveillance systems focus on outbreak detection and the data sources used. Variation or aberration in the frequency distribution of health data compared to historical data is often used to detect outbreaks. It is important that new techniques be developed to improve the detection rate, thereby reducing the wastage of resources in public health. Thus, the objective is to develop an outbreak detection technique by applying frequent mining and outlier mining techniques. Fourteen datasets from the UCI repository were tested with the proposed technique, and the effectiveness of each technique was measured by t-test. The overall performance shows that DTK can be used to detect outliers within frequent datasets. In conclusion, the anomaly-based outbreak detection technique using the frequent-outlier approach can be used to identify outliers within frequent datasets.
Abstract: The present study was carried out to calculate the coastal vulnerability index (CVI), in order to identify areas of high and low sensitivity and the area of inundation due to future sea-level rise (SLR). Both conventional and remotely sensed data were used and analyzed through a modelling technique. Of the total study area, 8.26% falls in the very high risk category, 14.21% high, 9.36% medium, 22.46% low and 7.35% very low, owing to coastal components. Results of the inundation analysis indicate that 225.2 km² and 397 km² of the land area will be submerged by flooding at the 1 m and 10 m inundation levels, respectively. The most severely affected sectors are expected to be the residential, industrial and recreational areas. As this coast is planned for future coastal development activities, measures such as regulation of industrialization, building regulations, urban growth and agricultural planning, development of integrated coastal zone management, strict enforcement of the Coastal Regulation Zone (CRZ) Act, monitoring of impacts, and further research are recommended for the study area.
Abstract: In this work a surgical simulator is produced which
enables a training otologist to conduct a virtual, real-time prosthetic
insertion. The simulator provides the Ear, Nose and Throat surgeon
with real-time visual and haptic responses during virtual cochlear
implantation into a 3D model of the human Scala Tympani (ST). The
parametric model is derived from measured data as published in the
literature and accounts for human morphological variance, such as
differences in cochlear shape, enabling patient-specific pre-operative
assessment. Haptic modeling techniques use real physical data and
insertion force measurements to develop a force model which
mimics the physical behavior of an implant as it collides with the ST
walls during an insertion. Output force profiles are acquired from the
insertion studies conducted in the work, to validate the haptic model.
The simulator provides the user with real-time, quantitative insertion
force information and the associated electrode position as the user inserts the
virtual implant into the ST model. The information provided by this
study may also be of use to implant manufacturers for design
enhancements as well as for training specialists in optimal force
administration, using the simulator. The paper reports on the methods
for anatomical modeling and haptic algorithm development, with
focus on simulator design, development, optimization and validation.
The techniques may be transferable to other medical applications
that involve prosthetic device insertions where user vision is
obstructed.