Abstract: In this article, a method is proposed to classify normal and
defective tiles using the wavelet transform and artificial neural
networks. The proposed algorithm calculates the max and min medians as
well as the standard deviation and average of the detail images
obtained from wavelet filters, constructs feature vectors from them,
and classifies the given tile using a perceptron neural network with a
single hidden layer. Along with proposing the median of optimum points
as the basic feature and comparing it with the other statistical
features in the wavelet domain, this study investigates the relative
advantages of the Haar wavelet. The method has been tested on a number
of different tile designs and, on average, classified over 90% of the
cases correctly. Among its other advantages, high speed and a low
computational load are prominent.
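As an illustration of the feature-extraction stage, the sketch below computes a one-level Haar decomposition with NumPy and derives the kind of statistics the abstract names. The exact feature definitions (e.g. taking medians over row maxima/minima) are assumptions, since the paper's code is not given:

```python
import numpy as np

def haar_dwt2(x):
    """One-level 2D Haar transform via pairwise averaging/differencing."""
    lo = (x[0::2] + x[1::2]) / 2.0          # row lowpass
    hi = (x[0::2] - x[1::2]) / 2.0          # row highpass
    LL = (lo[:, 0::2] + lo[:, 1::2]) / 2.0  # approximation
    LH = (lo[:, 0::2] - lo[:, 1::2]) / 2.0  # horizontal detail
    HL = (hi[:, 0::2] + hi[:, 1::2]) / 2.0  # vertical detail
    HH = (hi[:, 0::2] - hi[:, 1::2]) / 2.0  # diagonal detail
    return LL, (LH, HL, HH)

def tile_features(image):
    """Per-subband statistics: median of row maxima, median of row
    minima, standard deviation, and mean (12 features in total)."""
    _, details = haar_dwt2(image)
    feats = []
    for d in details:
        feats += [np.median(d.max(axis=1)), np.median(d.min(axis=1)),
                  d.std(), d.mean()]
    return np.array(feats)
```

The resulting 12-element vector would then be fed to the single-hidden-layer perceptron for classification.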
Abstract: The objectives of this research were 1) to study the
opinions of newspaper journalists about their trust in the National
Press Council of Thailand (NPCT) and the NPCT's success in regulating
professional ethics; and 2) to study the differences among mean
vectors of the variables of trust in the NPCT and opinions on the
NPCT's success in regulating professional ethics among samples working
at different work positions and from different affiliations of
newspaper organizations. The results showed that 1) interaction
effects between the variables of work position and affiliation were
not statistically significant at the confidence level of 0.05; 2)
there was a statistically significant difference (p
Abstract: This paper describes a simple implementation of a homotopy
(also called continuation) algorithm for determining the proper resistance of a resistor to dissipate energy at a specified rate in an electric circuit. The homotopy algorithm can be considered a development of classical methods in numerical computing such as the Newton-Raphson and fixed-point
methods. In homotopy methods, an embedding
parameter is used to control the convergence. The method proposed in this work utilizes a special homotopy called the Newton homotopy. A numerical example solved in MATLAB is given to show the effectiveness of the proposed method.
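The Newton homotopy H(x, t) = f(x) - (1 - t)·f(x0) deforms an easy problem (t = 0, solved trivially by the start point x0) into the target equation f(x) = 0 at t = 1. Below is a minimal Python sketch of this idea; the paper's actual implementation is in MATLAB, and the resistor equation used here is a hypothetical example, not the paper's circuit:

```python
import numpy as np

def newton_homotopy(f, df, x0, steps=50, newton_iters=5):
    """Trace H(x, t) = f(x) - (1 - t) * f(x0) from t = 0 (root x0)
    to t = 1 (the target f(x) = 0), applying a few Newton corrector
    iterations at each value of the embedding parameter t."""
    x = x0
    f0 = f(x0)
    for t in np.linspace(0.0, 1.0, steps + 1)[1:]:
        for _ in range(newton_iters):
            x -= (f(x) - (1.0 - t) * f0) / df(x)   # Newton step on H(., t)
    return x

# Hypothetical example: find the resistance R that dissipates P = 100 W
# at V = 20 V, i.e. f(R) = V**2 / R - P = 0, whose exact root is R = 4.
R = newton_homotopy(lambda r: 400.0 / r - 100.0,
                    lambda r: -400.0 / r ** 2,
                    x0=1.0)
```

Stepping t slowly keeps each Newton correction close to its root, which is how the embedding parameter controls convergence.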
Abstract: Fuzzy logic control (FLC) systems have been tested in
many technical and industrial applications as a useful modeling tool
that can handle the uncertainties and nonlinearities of modern control
systems. The main drawback of FLC methodologies in the industrial
environment is the difficulty of selecting the optimum number of
tuning parameters.
In this paper, a method is proposed for finding the optimum
membership functions of a fuzzy system using the particle swarm
optimization (PSO) algorithm. A hybrid algorithm combining fuzzy logic
control and the PSO algorithm is used to design a controller for a
continuous stirred tank reactor (CSTR) with the aim of achieving
accurate and acceptable results. To exhibit the effectiveness of the
proposed algorithm, it is used to optimize the Gaussian membership
functions of the fuzzy model of a nonlinear CSTR system as a case
study. The results show that the optimized membership functions (MFs)
provide better performance than a fuzzy model of the same system whose
MFs were defined heuristically.
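The core optimization loop can be sketched as a generic PSO. This is an illustrative minimal implementation, not the authors' tuned controller: the toy quadratic objective below merely stands in for the control-error cost that would be evaluated on the CSTR fuzzy model, and the swarm coefficients are common textbook values:

```python
import numpy as np

def pso(cost, dim, n_particles=30, iters=100, bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimizer: cost maps a parameter vector
    (e.g. Gaussian MF centers and widths) to a scalar to minimize."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))              # velocities
    pbest = x.copy()
    pbest_cost = np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = x[better], c[better]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    return gbest, pbest_cost.min()

# Toy stand-in objective: recover MF parameters (center, width) = (1, 2).
best, err = pso(lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2, dim=2)
```

In the paper's setting, `cost` would simulate the CSTR under the fuzzy controller parameterized by the candidate MFs and return the tracking error.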
Abstract: In this research, an attempt was made to identify people's diabetes condition (healthy, prediabetic, or diabetic) from noninvasive palm perspiration measurements. Data clusters gathered from 200 subjects were used (1. Individual Attributes Cluster and 2. Palm Perspiration Attributes Cluster). To decrease the dimensions of these data clusters, the Principal Component Analysis method was used. The data clusters prepared in this way were classified with Support Vector Machines. The most successful classifications reached 82% for the glucose parameters and 84% for the HbA1c parameters.
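The dimensionality-reduction step can be sketched in a few lines of NumPy (a generic PCA via SVD; the subsequent SVM classification stage and the actual attribute data are not reproduced here):

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the first k principal components.
    A generic sketch of the dimensionality-reduction step; the SVM
    classifier would then be trained on the reduced scores."""
    Xc = X - X.mean(axis=0)                     # center each attribute
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                        # scores in k dimensions
```

Because SVD returns singular values in descending order, the first reduced dimension carries the most variance, the second the next most, and so on.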
Abstract: This study investigated the number of Aedes larvae,
the key breeding sites of Aedes sp., and the relationship between
climatic factors and the incidence of DHF in Samui Islands. We
conducted our questionnaire and larval surveys in 105 randomly
selected households in Samui Islands in July-September 2006.
Pearson's correlation coefficient was used to explore the primary
association between the DHF incidence and all climatic factors.
Multiple stepwise regression technique was then used to fit the
statistical model. The results showed that the positive indoor
containers were small jars, cement tanks, and plastic tanks. The
positive outdoor containers were small jars, cement tanks, plastic
tanks, used cans, tires, plastic bottles, discarded objects, pot saucers,
plant pots, and areca husks. All Ae. albopictus larval indices (i.e., CI,
HI, and BI) were higher than Ae. aegypti larval indices in this area.
These larval indices were higher than the WHO standard. This indicated
a high risk of DHF transmission at Samui Islands. The multiple
stepwise regression model was y = –288.80 + 11.024x (x = mean temperature). The
mean temperature was positively associated with the DHF incidence
in this area.
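The correlation-then-fit procedure can be reproduced mechanically with an ordinary least-squares fit; the sketch below uses illustrative temperature/incidence values, not the study's data:

```python
import numpy as np

# Hypothetical monthly values for illustration only (not the study's data).
mean_temp = np.array([27.1, 27.8, 28.4, 28.9, 29.3])   # deg C
dhf_cases = np.array([10.0, 17.0, 24.0, 29.0, 34.0])   # incidence

r = np.corrcoef(mean_temp, dhf_cases)[0, 1]            # Pearson's correlation
slope, intercept = np.polyfit(mean_temp, dhf_cases, 1) # y = intercept + slope*x
```

A stepwise procedure would repeat such fits, adding or dropping climatic predictors by significance; with one retained predictor it reduces to the simple fit shown.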
Abstract: This paper examines the effect of corporate diversification on the profitability of the financial services sector in Nigeria. The study relied on historical accounting data generated from the financial (annual) reports and accounts of sampled banks over the period 1998 to 2007 (a ten-year period). A regression equation was formulated, in line with previous studies, to shed light on the effect of corporate diversification on the profitability of the financial services sector in Nigeria. The results of the regression analysis revealed that diversification impacts strongly on banks' profitability. In conclusion, the paper provides strong evidence that diversification impacts positively and significantly on banks' profitability because, among other things, diversified banks can pool their internally generated funds and allocate them properly.
Abstract: Web usage mining has become a popular research
area, as a huge amount of data is available online. These data can be
used for several purposes, such as web personalization, web structure
enhancement, web navigation prediction etc. However, the raw log
files are not directly usable; they have to be preprocessed in order to
transform them into a suitable format for different data mining tasks.
One of the key issues in the preprocessing phase is to identify web
users. Identifying users based on web log files is not a
straightforward problem, thus various methods have been developed.
There are several difficulties that have to be overcome, such as client
side caching, changing and shared IP addresses and so on. This paper
presents three different methods for identifying web users. Two of
them are the most commonly used methods in web log mining
systems, whereas the third on is our novel approach that uses a
complex cookie-based method to identify web users. Furthermore we
also take steps towards identifying the individuals behind the
impersonal web users. To demonstrate the efficiency of the new
method, we developed an implementation called the Web Activity
Tracking (WAT) system, which aims at a more precise distinction of
web users based on log data. We present some statistical analysis
created by the WAT on real data about the behavior of the Hungarian
web users and a comprehensive analysis and comparison of the three
methods.
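The contrast between the classical heuristics and a cookie-based scheme can be sketched as follows. The field names and fallback rule are assumptions for illustration, not the WAT system's actual design:

```python
from collections import defaultdict

def identify_users(log_entries):
    """Group web-log entries into users: entries carrying the same
    persistent cookie ID belong to one user even across IP changes;
    entries without a cookie fall back to the common
    IP-address + user-agent heuristic."""
    users = defaultdict(list)
    for entry in log_entries:
        if entry.get("cookie_id"):
            key = ("cookie", entry["cookie_id"])
        else:
            key = ("heuristic", entry["ip"], entry["user_agent"])
        users[key].append(entry)
    return users

log = [
    {"ip": "1.2.3.4", "user_agent": "Firefox", "cookie_id": "abc"},
    {"ip": "5.6.7.8", "user_agent": "Firefox", "cookie_id": "abc"},  # same user, new IP
    {"ip": "1.2.3.4", "user_agent": "Chrome", "cookie_id": None},    # no cookie
]
groups = identify_users(log)
```

Note how the cookie key survives the IP change that would split this user under a purely IP-based method, which is exactly the shared/changing-IP difficulty the abstract mentions.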
Abstract: Innovation, technology and knowledge are the trilogy
of impact for meeting the challenges arising from uncertainty.
The evidence suggested an opportunity to ask how to manage in an
environment of constant innovation. In an attempt to obtain an
answer from the field of Management Sciences, based on Contingency
Theory, a study was conducted with phenomenological and descriptive
approaches, using the Case Study Method and the usual procedures for
this task, involving a focus group composed of managers and employees
working in the pharmaceutical field. The problem situation was raised,
the state of the art was interpreted, and the facts were dissected.
Four establishments were involved in these tasks. The results indicate
that the ventures studied have been managed empirically by their
founders and are experiencing the agility described in this work. The
expectation of this study is to improve concepts for stakeholders on
creativity in business.
Abstract: Energy-efficient protocol design is the aim of current
research in the area of sensor networks, where limited power
resources impose energy conservation considerations. In this paper
we focus on Medium Access Control (MAC) protocols and, after an
extensive literature review, discuss two adaptive schemes. Of
these, adaptive-rate MACs, which were introduced for throughput
enhancement, show the potential to save energy, even more than
adaptive-power schemes. We then propose an allocation algorithm
for obtaining accurate and reliable results. Through a simulation
study we validated our claim and showed the power saving of
adaptive-rate protocols.
Abstract: KREISIG is a computer simulation program, first developed by Munawar (1994) in Germany, to optimize signalized roundabouts. The traffic movement is based on car-following theory, and the turbine method has been implemented for signal setting. The program has since been further developed to match Indonesian traffic characteristics by adjusting driver sensitivity, and a trial-and-error method has been implemented to adjust the saturation flow. The saturation flow output has also been compared with the calculation method of the 1997 Indonesian Highway Capacity Manual. The program was then applied to optimize the signalized Kleringan roundabout in the Malioboro area, Yogyakarta, Indonesia. It is found that this method can optimize the signal setting of this roundabout; therefore, the program is recommended for optimizing signalized roundabouts.
Abstract: Parsing is important in Linguistics and Natural
Language Processing to understand the syntax and semantics of a
natural language grammar. Parsing natural language text is
challenging because of problems like ambiguity and inefficiency.
Moreover, the interpretation of natural language text depends on
context-based techniques. A probabilistic component is essential to resolve
ambiguity in both syntax and semantics thereby increasing accuracy
and efficiency of the parser. Tamil language has some inherent
features which make it more challenging. In order to obtain solutions,
a lexicalized and statistical approach is to be applied in the parsing
with the aid of a language model. Statistical models mainly focus on
the semantics of the language and suit large-vocabulary tasks, whereas
structural methods focus on syntax and model small-vocabulary tasks. A
statistical trigram language model for Tamil with a medium vocabulary
of 5000 words has been built. Though statistical parsing gives better
performance through trigram probabilities and a large vocabulary size,
it has some disadvantages, such as a focus on semantics rather than
syntax and a lack of support for free word order and long-distance
relationships. To overcome these disadvantages, a structural component
is to be incorporated into statistical language models, which leads to
the implementation of hybrid language models. This paper attempts to
build a phrase-structured hybrid language model that resolves the
above-mentioned disadvantages. In the development of the hybrid
language model, a new part-of-speech tag set for Tamil with more than
500 tags has been developed for wider coverage. A phrase-structured
Treebank has been developed with 326 Tamil sentences covering more
than 5000 words. A hybrid language model has been trained on the
phrase-structured Treebank using the immediate-head parsing technique.
A lexicalized and statistical parser that employs this hybrid language
model and immediate-head parsing gives better results than pure
grammar-based and trigram-based models.
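The statistical component of such a model starts from maximum-likelihood trigram estimates over a corpus. A minimal generic sketch (the paper's hybrid model additionally layers a phrase-structure component on top, which is not shown; the English toy corpus is purely illustrative):

```python
from collections import defaultdict

def train_trigram(sentences):
    """Maximum-likelihood trigram model:
    P(w3 | w1, w2) = count(w1, w2, w3) / count(w1, w2)."""
    tri = defaultdict(int)
    bi = defaultdict(int)
    for sent in sentences:
        words = ["<s>", "<s>"] + sent + ["</s>"]  # pad sentence boundaries
        for i in range(2, len(words)):
            tri[(words[i - 2], words[i - 1], words[i])] += 1
            bi[(words[i - 2], words[i - 1])] += 1
    def prob(w1, w2, w3):
        c = bi[(w1, w2)]
        return tri[(w1, w2, w3)] / c if c else 0.0
    return prob

prob = train_trigram([["the", "dog", "barks"], ["the", "dog", "sleeps"]])
```

A real medium-vocabulary model would add smoothing or backoff for unseen trigrams; here unseen histories simply get probability 0.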
Abstract: To improve the material characteristics of single- and
poly-crystals of pure copper, the relationships between
crystallographic orientations and microstructures and the bending and
mechanical properties were examined, and the texture distribution was
also analyzed. A grain refinement procedure was performed to obtain a
grained structure. Furthermore, analytical results related to crystal
direction maps, inverse pole figures, and textures were obtained from
SEM-EBSD analyses. The results showed that these grained metallic
materials have peculiar springback characteristics at various bending
angles.
Abstract: One of the most important capabilities expected from ERP systems is the integration of the various operations existing in the administrative, financial, commercial, human resources, and production departments of the consuming organization. It is also often necessary to integrate the new ERP system with the organization's legacy systems when implementing the ERP package. Without relying on an appropriate software architecture to realize the required integration, ERP implementation processes become error-prone and time-consuming; in some cases, the implementation may even encounter serious risks. In this paper, we propose a new architecture that is based on the agent-oriented vision and supplies the integration expected from ERP systems using several independent but cooperating agents. Besides integration, which is the main issue of this paper, the presented architecture also addresses some of the intelligence and learning capabilities existing in ERP systems.
Abstract: Understanding the consumption and production of
various metabolites of fibroblast conditioned media is needed for its
proper and optimized use in expansion of pluripotent stem cells. For
this purpose, we have used the HPLC method to analyse the
consumption of glucose and the production of lactate over time by
mouse embryonic fibroblasts. The experimental data have also been
compared with mathematical model fits. A total of 0.025 moles of
lactate was produced after 72 hours, while the amount of glucose
decreased from 0.017 moles to 0.011 moles. The mathematical model was able to
predict the trends of glucose consumption and lactate production.
Abstract: A decentralized eco-sanitation system is a promising and sustainable alternative to the century-old centralized conventional sanitation system. The decentralized concept relies on an environmentally and economically sound management of water, nutrient and energy fluxes. Source-separation systems for urban waste management collect different solid waste and wastewater streams separately to facilitate the recovery of valuable resources (energy, nutrients) from wastewater. A resource recovery centre serving 20,000 people acts as the functional unit for the treatment of urban waste in a high-density population community such as Singapore. The decentralized system includes urine treatment, faeces and food waste co-digestion, and the treatment of horticultural waste and the organic fraction of municipal solid waste in composting plants. A design model is developed to estimate the inputs and outputs in terms of materials and energy. The inputs of urine (yellow water, YW) and faeces (brown water, BW) are calculated by considering the daily mean human production of urine and faeces and the water consumption of a no-mix vacuum toilet (0.2 and 1 L of flushing water for urine and faeces, respectively). The food waste (FW) production is estimated at 150 g wet weight/person/day. The YW is collected and discharged by gravity into a tank; it was found that two days are required for urine hydrolysis and struvite precipitation. The maximum nitrogen (N) and phosphorus (P) recoveries are 150-266 kg/day and 20-70 kg/day, respectively. In contrast, BW and FW are mixed for co-digestion in a thermophilic acidification tank, and a decentralized/centralized methanogenic reactor is later used for biogas production. It is determined that 6.16-15.67 m3/h of methane is produced, equivalent to 0.07-0.19 kWh/ca/day. The digestion residues are treated with horticultural waste and the organic fraction of municipal waste in co-composting plants.
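The reported per-capita energy figure can be cross-checked from the methane flow. The sketch below assumes a methane lower heating value of about 10 kWh/m3 and the 20,000-person functional unit; the LHV value and the arithmetic are ours, not the paper's:

```python
# Rough per-capita energy equivalence of the reported methane output.
PEOPLE = 20_000          # functional unit from the design model
LHV_KWH_PER_M3 = 10.0    # assumed lower heating value of methane

def energy_per_capita(methane_m3_per_h):
    """Convert a methane flow (m3/h) to kWh per capita per day."""
    daily_kwh = methane_m3_per_h * 24 * LHV_KWH_PER_M3
    return daily_kwh / PEOPLE

low = energy_per_capita(6.16)    # lower bound of the reported range
high = energy_per_capita(15.67)  # upper bound of the reported range
```

With these assumptions the range works out to roughly 0.07-0.19 kWh/ca/day, consistent with the figures quoted in the abstract.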
Abstract: In this study, workplace environmental monitoring
systems were established using USN (Ubiquitous Sensor Networks)
and LabVIEW. Although existing direct sampling methods yield
accurate values at the time points of measurement, they make
continuous management and supervision difficult and are costly,
so the efficiency and reliability of workplace management by
supervisors are relatively low. In this study, systems were
established so that information on workplace environmental factors
such as temperature, humidity and noise is measured and transmitted
to a PC in real time, enabling supervisors to monitor workplaces
through LabVIEW on the PC. When an accident occurs in a workplace,
supervisors can respond immediately through the monitoring system,
which enables integrated workplace management and the prevention of
safety accidents. By introducing these monitoring systems, safety
accidents due to harmful environmental factors in workplaces can be
prevented, and the systems will also be helpful in finding the
correlation between safety accidents and occupational diseases by
comparing and linking the databases they establish with existing
statistical data.
Abstract: The mosaicing technique has been employed in more and more application fields, from entertainment to scientific ones. In the latter case, the final evaluation is often still left to human beings, who visually assess the quality of the mosaic. A lack of objective measurements in microscopic mosaicing may often prevent the mosaic from being used as a starting image for further analysis. In this work we analyze three different metrics and indexes, in the domains of signal analysis, image analysis and visual quality, to measure the quality of different aspects of the mosaicing procedure, such as registration errors and visual quality. As the case study we consider the mosaicing algorithm we developed. The experiments have been carried out on mosaics with very different features: histological samples, which are made of detailed and contrasted images, and live stem cells, which show very low contrast and low detail levels.
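One common signal-analysis metric for registration error in mosaic overlap regions is the peak signal-to-noise ratio. This is a generic sketch of such a metric, not necessarily one of the three indexes the paper evaluates:

```python
import math

def psnr(img_a, img_b, peak=255.0):
    """Peak signal-to-noise ratio between two equal-size image regions
    (flattened pixel sequences); higher means better agreement."""
    mse = sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / len(img_a)
    if mse == 0:
        return float("inf")          # identical regions
    return 10.0 * math.log10(peak ** 2 / mse)
```

Applied to the overlapping strips of two registered tiles, a low PSNR flags a misregistration that a purely visual check might miss.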
Abstract: Recently, advanced technologies that offer high-precision
products and relatively easy, economical and rapid processing have
been needed to meet the high demand for ultra-precision micro parts.
In our research, micromanufacturing based on soft lithography and
nanopowder injection molding was investigated. The silicone metal
pattern with an ultra-thick, high-aspect-ratio structure was
successfully used to fabricate a polydimethylsiloxane (PDMS) micro
mold. The process was followed by nanopowder injection molding (PIM)
using a simple vacuum hot press. The 17-4PH nanopowder, with a
diameter of 100 nm, was successfully injected, forming a green
microbearing sample with a thickness of 700 μm, a microchannel of
60 μm, and an aspect ratio of 12. Sintering was done at 1200 °C for
2 hours with a heating rate of 0.83 °C/min. Since a low powder load
(45% PL) was applied to achieve green sample fabrication, ~15%
shrinkage occurred at 86% relative density. Several improvements
should be made to produce high-accuracy, fully dense sintered parts.
Abstract: Proactive coping, directed at an upcoming as opposed
to an ongoing stressor, is a new focus in positive psychology. The
present study explored the effect of proactive coping on workplace
adaptation after the transition from college to the workplace. To
demonstrate the influence process between them, we constructed a
model in which proactive coping style affects actual positive coping
efforts and outcomes, mediated by proactive competence, during the
first year after the transition. Participants (n = 100) who started
to work right after graduating from college completed surveys at all
four time points: one month before (Time 0), one month after (Time 1),
three months after (Time 2), and one year after (Time 3) the
transition. The Time 0 survey included the measurement of proactive
coping style and competence. The Time 1, 2 and 3 surveys included
measurements of challenge cognitive appraisal, problem-solving coping
strategy, and subjective workplace adaptation. The results indicated
that proactive coping style affected newcomers' actual coping efforts
and outcomes, mediated by proactive coping competence. The results
also showed that proactive coping competence directly promoted
Time 1's actual positive coping efforts and outcomes, and indirectly
promoted those of Time 2 and Time 3.