Abstract: In this paper, we present a comparative study of two computer vision systems for object recognition and tracking. The two algorithms describe different approaches based on regions, constituted by sets of pixels, which parameterize the objects in shot sequences. For image segmentation and object detection, the FCM technique is used; the overlap between cluster distributions is minimized by using a suitable color space (other than RGB). The first technique takes into account a priori probabilities governing the computation of the various clusters used to track objects. A Parzen kernel method is described that allows the players to be identified in each frame; we also show the importance of searching for the standard deviation value of the Gaussian probability density function. Region matching is carried out by an algorithm that operates on the Mahalanobis distance between region descriptors in two subsequent frames and uses singular value decomposition to compute a set of correspondences satisfying both the principle of proximity and the principle of exclusion.
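The matching step can be illustrated with a small sketch: Mahalanobis proximities between region descriptors form a correlation matrix whose singular values are replaced by ones (in the style of the Scott-Longuet-Higgins SVD method), and a pairing is accepted only when it dominates both its row and its column, enforcing proximity and exclusion at once. The descriptors, covariance and sigma below are illustrative, not taken from the paper.

```python
import numpy as np

def match_regions(desc_a, desc_b, cov, sigma=1.0):
    """Sketch of SVD-based correspondence: build a Gaussian proximity
    matrix from Mahalanobis distances, replace its singular values with
    ones, and accept pairings that dominate their row and column."""
    inv_cov = np.linalg.inv(cov)

    def mdist(x, y):
        d = x - y
        return np.sqrt(d @ inv_cov @ d)

    # Proximity matrix: close descriptor pairs get values near 1.
    G = np.array([[np.exp(-mdist(a, b) ** 2 / (2 * sigma ** 2))
                   for b in desc_b] for a in desc_a])
    U, _, Vt = np.linalg.svd(G)
    E = np.eye(G.shape[0], G.shape[1])  # singular values -> 1
    P = U @ E @ Vt
    # Exclusion: a match must be the maximum of its row AND its column.
    return [(i, j) for i in range(P.shape[0]) for j in range(P.shape[1])
            if P[i, j] == P[i, :].max() and P[i, j] == P[:, j].max()]
```

With well-separated regions the dominant entries of P single out a one-to-one assignment, which is the property the exclusion principle requires.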
Abstract: This research analyzes the regenerative burner and the recuperative burner for different reheating furnaces in the steel industry. The warm air temperatures of the burners are determined to suit the sizes of the reheating furnaces by considering the air temperature, the fuel cost and the investment cost. Calculations of the payback period and the net present value are used to compare the burners for the different reheating furnaces. An energy balance is used to calculate and compare the energy used in the different sizes of reheating furnaces for each burner. It is found that the warm air temperature differs as the size of the reheating furnace varies. Based on the net present value and the payback period, the regenerative burner is suitable for all plants at the same burner lifetime. Finally, a sensitivity analysis of all factors is discussed.
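The economic comparison rests on two standard quantities, the net present value and the simple payback period, which can be sketched as follows; the investment and fuel-saving figures are hypothetical illustrations, not values from the study.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the year-0 flow
    (the negative initial investment)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simple_payback(investment, annual_saving):
    """Undiscounted payback period in years."""
    return investment / annual_saving

# Hypothetical figures for illustration only (not from the paper):
# a burner retrofit costing 100,000 that saves 30,000 per year in fuel
# over a 5-year burner life.
flows = [-100_000] + [30_000] * 5
```

A burner is preferred when its NPV over the burner life is higher and its payback period shorter; at a 10% discount rate the hypothetical retrofit above has a positive NPV and pays back in about 3.3 years.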
Abstract: Based on Rayleigh beam theory, the sub-impacts of a free-free beam struck horizontally by a round-nosed rigid mass are simulated by the finite difference method together with impact-separation conditions. In order to obtain the sub-impact force, a uniaxial compression elastic-plastic contact model is employed to analyze the local deformation field in the contact zone. It is found that the horizontal impact is a complicated process comprising a sequence of elastic-plastic sub-impacts, and that there are two sub-zones of sub-impact. In addition, it is found that the elastic energy of the free-free beam is better suited to the Poisson collision hypothesis for explaining the compression and recovery processes.
Abstract: The Inter-feeder Power Flow Regulator (IFPFR) proposed in this paper consists of several voltage source inverters with a common dc bus; each inverter is connected in series with one of several independent distribution feeders in the power system. This paper is concerned with how to transfer power between the feeders for load-sharing purposes. The power controller of each inverter injects power (for the sending feeder) or absorbs power (for the receiving feeder) by injecting a suitable voltage; this voltage injection is modeled as the voltage drop across a series virtual impedance, whose value is selected to achieve power exchange between the feeders without perturbing the load voltage magnitude of each feeder. In this paper a new control scheme for load sharing using the IFPFR is proposed.
Abstract: There is an acute water problem, especially in the dry season, in and around Perundurai (Erode district, Tamil Nadu, India), where there are many tannery units. Hence an attempt was made to use the wastewater from the tannery industry for construction purposes. Mechanical properties such as compressive strength, tensile strength and flexural strength were studied by casting various concrete specimens in the form of cubes, cylinders and beams, and were found to be satisfactory. Hence some special properties, namely resistance to chloride attack, sulphate attack and chemical attack, were considered and studied in comparison with conventional potable water. In this experimental study the results for specimens prepared using treated and untreated tannery effluent were compared with those for concrete specimens prepared using potable water. It was observed that the concrete showed some reduction in strength when subjected to chloride attack, sulphate attack and chemical attack. Admixtures were therefore selected and optimized in suitable proportions to counteract the adverse effects, and the results were found to be satisfactory.
Abstract: The use of renewable energy sources is becoming increasingly necessary and attractive. Wider application of renewable energy devices at domestic, commercial and industrial levels has resulted not only in greater awareness but also in significant installed capacity. In addition, biomass, principally in the form of wood, has long been used by humans as a source of energy. Gasification is the conversion of solid carbonaceous fuel into combustible gas by partial combustion. Many gasifier models operate under various conditions, and the parameters kept in each model differ. This study used experimental data with three inputs, namely biomass consumption, temperature at the combustion zone and ash discharge rate; the single output is the gas flow rate. In this paper, a neural network was used to identify a gasifier system model suitable for the experimental data. The results show that a neural network is able to obtain the answer.
Abstract: This paper presents a new sensor-based online method for generating collision-free near-optimal paths for mobile robots pursuing a moving target amidst dynamic and static obstacles. At each iteration, the set of all collision-free directions is first calculated using the velocity vectors of the robot relative to each obstacle and to the target, forming the Directive Circle (DC), which is a novel concept. Then, a direction close to the shortest path to the target is selected from the feasible directions in the DC. The DC prevents the robot from being trapped in deadlocks or local minima. It is assumed that the target's velocity is known, while the speeds of the dynamic obstacles, as well as the locations of the static obstacles, are calculated online. Extensive simulations and experimental results demonstrate the efficiency of the proposed method and its success in coping with complex environments and obstacles.
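The selection rule (pick, among feasible directions, the one closest to the direct bearing to the target) can be illustrated with a simplified sketch: discretize headings on a circle, drop those inside blocked angular intervals (standing in for the cones derived from relative velocities), and keep the feasible heading closest to the target bearing. This is an illustration of the selection rule only, not the published DC construction.

```python
import math

def pick_direction(target_bearing, blocked_intervals, n=360):
    """Choose a heading (radians in [0, 2*pi)) outside all blocked
    angular intervals, minimizing angular distance to target_bearing.
    Returns None if every discretized heading is blocked."""
    best = None
    for k in range(n):
        theta = 2 * math.pi * k / n
        if any(lo <= theta <= hi for lo, hi in blocked_intervals):
            continue
        # Wrapped angular distance to the target bearing.
        d = abs((theta - target_bearing + math.pi) % (2 * math.pi) - math.pi)
        if best is None or d < best[1]:
            best = (theta, d)
    return best[0] if best else None
```

Because the whole circle of headings is examined, the robot always finds some feasible direction when one exists, which is the intuition behind the DC's deadlock avoidance.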
Abstract: Following the loss of NASA's Space Shuttle Columbia in 2003, it was determined that problems in the agency's organization created an environment that led to the accident. One component of the proposed solution was the formation of the NASA Engineering Network (NEN), a suite of information retrieval and knowledge-sharing tools. This paper describes the implementation of communities of practice, which are formed along engineering disciplines. Communities of practice enable engineers to leverage their knowledge and best practices to collaborate, and to take what they learn back to their jobs and embed it in the procedures of the agency. This case study offers insight into using traditional engineering disciplines for virtual collaboration, including lessons learned during the creation and establishment of NASA's communities.
Abstract: The common bean is the most important grain legume for direct human consumption in the world, and BCMV is one of the world's most serious bean diseases, capable of reducing the yield and quality of the harvested product. To determine the best tolerance index to BCMV and to identify tolerant genotypes, two experiments were conducted under field conditions. Twenty-five common bean genotypes were sown in two separate RCB designs with three replications under contamination and non-contamination conditions. On the basis of the correlations among indices, GMP, MP and HARM were determined to be the most suitable tolerance indices. Principal components analysis indicated that the first two components together explained 98.52% of the variation in the data; the first and second components were named potential yield and stress susceptibility, respectively. Based on the assessment of BCMV tolerance indices and biplot analysis, WA8563-4, WA8563-2 and Cardinal were the genotypes that exhibited potential seed yield under contamination and non-contamination conditions.
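The three indices retained above have standard yield-based definitions: with Yp the yield under non-contamination and Ys the yield under contamination, MP = (Yp + Ys)/2, GMP = sqrt(Yp * Ys) and HARM = 2 * Yp * Ys / (Yp + Ys). A minimal sketch:

```python
import math

def tolerance_indices(yp, ys):
    """Yield-based stress tolerance indices: yp is yield under
    non-stress (non-contamination), ys under stress (BCMV).
    Returns mean productivity (MP), geometric mean productivity (GMP)
    and harmonic mean (HARM); higher values indicate tolerance."""
    return {
        "MP": (yp + ys) / 2,
        "GMP": math.sqrt(yp * ys),
        "HARM": 2 * yp * ys / (yp + ys),
    }
```

A genotype ranking highly on all three indices combines high yield potential with stability under infection, which is the property used to single out tolerant genotypes.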
Abstract: Nanostructured iron oxide with rod-like and granular morphologies has been successfully prepared via a solid-state reaction in the presence of NaCl, NaBr, NaI and NaN3, respectively. The added salts not only prevent a drastic increase in the size of the products but also provide suitable conditions for the oriented growth of primary nanoparticles. Formation mechanisms for these materials by solid-state reaction at ambient temperature are proposed. Photocatalytic experiments on congo red (CR) demonstrated that the mixture of α-Fe2O3 and Fe3O4 nanostructures was more efficient than α-Fe2O3 nanostructures alone.
Abstract: A gold passbook is an investing tool that is especially suitable for investors making small investments in physical gold. The gold passbook carries lower risk than other ways of investing in gold, but its price is still affected by the gold price, and many factors influence the gold price. Therefore, building a model to predict the price of the gold passbook can both reduce the risk of investment and increase the benefits. This study investigates the important factors that influence the gold passbook price and utilizes the Group Method of Data Handling (GMDH) to build a predictive model. This method not only identifies the significant variables but also performs well in prediction. The significant variables for the gold passbook price identified by GMDH are the US dollar exchange rate, international petroleum price, unemployment rate, wholesale price index, rediscount rate, foreign exchange reserves, misery index, prosperity coincident index and industrial index.
Abstract: In this investigation, commercial and special polyacrylonitrile (PAN) fibers containing sodium 2-methyl-2-acrylamidopropane sulfonate (SAMPS) and itaconic acid (IA) comonomers were studied by Fourier transform infrared (FT-IR) spectroscopy. The FT-IR spectra of PAN fiber samples with different comonomers show that during stabilization of the PAN fibers, the peaks related to C≡N and CH2 bonds are reduced sharply. These reductions are related to the cyclization of nitrile groups during the stabilization procedure. The reduction is much more intense for PAN fibers containing the IA comonomer than for those containing the SAMPS comonomer, indicating that cyclization and stabilization proceeded more completely for the IA-containing sample. Therefore, the carbon fibers produced from this material have higher tensile strength owing to more suitable stabilization.
Abstract: Based on the ability of the auto-disturbance-rejection controller (ADRC) to compensate dynamically for model disturbances and uncertainty, a new ADRC-based method is proposed for the decoupling control of dispenser longitudinal movement over a large flight envelope. Developed directly from the nonlinear model, ADRC is especially suitable for dynamic models subject to large disturbances. Furthermore, since the structure and parameters of the controller need not be changed across the flight envelope, this scheme simplifies the design of the flight control system. Simulation results over the large flight envelope show that the system achieves high dynamic and steady-state performance and that the controller has strong robustness.
Abstract: Business transformation initiatives are required by any organization to jump from its normal mode of operation to one suited to change, whether external, such as competitive pressures, regulatory requirements and shifts in the labor market, or internal, such as changes in strategy or vision, capability and management. Recent advances in information technology for automating business processes have the potential to transform an organization and provide it with a sustained competitive advantage. Process constitutes the skeleton of a business; thus, for a business to exist and compete well, it is essential for the skeleton to be robust and agile. This paper details "transformation" from a business perspective, methodologies to bring about an effective transformation, process-based transformation, and the role of services computing in it. Further, it details the benefits that can be achieved through services computing.
Abstract: The coalescer process is one method of oily water treatment: it increases the oil droplet size in order to enhance the separation velocity and thus achieve effective separation. However, the presence of surfactants in an oily emulsion can limit these mechanisms, because surfactants stabilize the emulsion and keep the oil droplets small. The purpose of this research is therefore to improve the efficiency of the coalescer process for treating stabilized emulsions. The effects of bed type, bed height, liquid flow rate and staged coalescer (step-bed) configuration on the treatment efficiency, in terms of COD values, were studied. The treatment efficiency obtained experimentally was estimated using the COD values and the oil droplet size distribution. The study shows that the plastic media is more effective at attaching oil particles than the stainless steel media, owing to its hydrophobic properties. Furthermore, a suitable bed height (3.5 cm) and step bed (3.5 cm with 2 steps) were necessary to obtain good coalescer performance. The step-bed coalescer process provided higher treatment efficiencies, in terms of COD removal, than the classical process. The proposed model for predicting the area under the curve, and thus the treatment efficiency, based on the single collector efficiency (ηT) and the attachment efficiency (α), shows relatively good agreement between the experimental and predicted treatment efficiencies in this study.
Abstract: Web usage mining has become a popular research area, as a huge amount of data is available online. These data can be used for several purposes, such as web personalization, web structure enhancement and web navigation prediction. However, the raw log files are not directly usable; they have to be preprocessed in order to transform them into a suitable format for different data mining tasks. One of the key issues in the preprocessing phase is identifying web users. Identifying users based on web log files is not a straightforward problem, and thus various methods have been developed. Several difficulties have to be overcome, such as client-side caching and changing or shared IP addresses. This paper presents three different methods for identifying web users. Two of them are the most commonly used methods in web log mining systems, whereas the third one is our novel approach that uses a complex cookie-based method to identify web users. Furthermore, we also take steps towards identifying the individuals behind the impersonal web users. To demonstrate the efficiency of the new method we developed an implementation called the Web Activity Tracking (WAT) system, which aims at a more precise distinction of web users based on log data. We present statistical analyses created by WAT on real data about the behavior of Hungarian web users, together with a comprehensive analysis and comparison of the three methods.
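The user-identification step can be sketched roughly as follows, assuming a simplified log record: entries carrying a persistent cookie identifier are grouped by it, and entries without one fall back to the common IP + user-agent heuristic. The field names and fallback rule are illustrative and are not the WAT system's actual implementation.

```python
from collections import defaultdict

def identify_users(log_entries):
    """Group log entries into presumed users. Prefer a persistent
    cookie ID when present (robust against shared/changing IPs);
    otherwise fall back to the IP + user-agent pair."""
    users = defaultdict(list)
    for entry in log_entries:
        key = (("cookie", entry["cookie_id"])
               if entry.get("cookie_id")
               else ("ip_ua", entry["ip"], entry["user_agent"]))
        users[key].append(entry)
    return users
```

Note how two entries sharing an IP but carrying different cookies are correctly split into two users, which the IP-based heuristic alone would merge.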
Abstract: Parsing is important in linguistics and natural language processing for understanding the syntax and semantics of a natural language grammar. Parsing natural language text is challenging because of problems such as ambiguity and inefficiency. Moreover, the interpretation of natural language text depends on context-based techniques. A probabilistic component is essential to resolve ambiguity in both syntax and semantics, thereby increasing the accuracy and efficiency of the parser. The Tamil language has some inherent features that make it even more challenging. To address these issues, a lexicalized and statistical approach is applied to parsing with the aid of a language model. Statistical models mainly focus on the semantics of the language and suit large-vocabulary tasks, whereas structural methods focus on syntax and model small-vocabulary tasks. A statistical trigram language model for Tamil with a medium vocabulary of 5000 words has been built. Although statistical parsing gives better performance through trigram probabilities and a large vocabulary size, it has some disadvantages: a focus on semantics rather than syntax, and a lack of support for free word order and long-range relationships. To overcome these disadvantages, a structural component is incorporated into the statistical language model, leading to the implementation of a hybrid language model. This paper attempts to build a phrase-structured hybrid language model that resolves the disadvantages mentioned above. In developing the hybrid language model, a new part-of-speech tag set for Tamil with more than 500 tags and wide coverage has been developed. A phrase-structured treebank has been built with 326 Tamil sentences covering more than 5000 words. A hybrid language model has been trained on the phrase-structured treebank using the immediate-head parsing technique. A lexicalized and statistical parser employing this hybrid language model and the immediate-head parsing technique gives better results than pure grammar-based and trigram-based models.
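The statistical component can be illustrated with a toy maximum-likelihood trigram model over padded sentences, where P(w3 | w1, w2) is estimated as count(w1, w2, w3) / count(w1, w2). This is a generic sketch, not the authors' Tamil model, which additionally layers a structural, phrase-based component on top.

```python
from collections import defaultdict

def train_trigram(corpus):
    """Train a maximum-likelihood trigram model from a list of
    tokenized sentences; returns a conditional probability function
    P(w3 | w1, w2). Sentences are padded with <s> and </s> markers."""
    tri = defaultdict(int)  # counts of (w1, w2, w3)
    bi = defaultdict(int)   # counts of the contexts (w1, w2)
    for sent in corpus:
        toks = ["<s>", "<s>"] + sent + ["</s>"]
        for i in range(len(toks) - 2):
            tri[tuple(toks[i:i + 3])] += 1
            bi[tuple(toks[i:i + 2])] += 1

    def prob(w1, w2, w3):
        denom = bi[(w1, w2)]
        return tri[(w1, w2, w3)] / denom if denom else 0.0

    return prob
```

In practice a model over a 5000-word vocabulary would also need smoothing for unseen trigrams; the zero returned here for unseen contexts is exactly the sparsity problem smoothing addresses.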
Abstract: The Taiwan government has promoted the "Plain Landscape Afforestation and Greening Program" since 2002. A key task of the program was a payment for environmental services (PES) scheme, entitled the "Plain Landscape Afforestation Policy" (PLAP), which was certificated by the Executive Yuan on August 31, 2001 and enacted on January 1, 2002. According to the policy, the total area of afforestation was estimated to reach 25,100 hectares by December 31, 2007. By the end of 2007, the policy had been in force for six years and the actual afforested area was 8,919.18 hectares. Of this, Taiwan Sugar Corporation (TSC) accounted for 7,960 hectares (with 2,450.83 hectares as public service area), or 86.22% of the total afforestation area; private farmland promoted by local governments accounted for 869.18 hectares, or 9.75%. From this we observe that most of the afforestation area under the policy is executed by TSC, and that TSC's achievement ratio is better than that of others, implying that the success of the PLAP depends heavily on TSC's execution. The objective of this study is to analyze the policy planning relevant to TSC's participation in the PLAP, suggest complementary measures, and draw up effective adjustment mechanisms, so as to improve the effectiveness of executing the policy. Our main conclusions and suggestions are summarized as follows: 1. The main reason for TSC's participation in the PLAP is its passive cooperation with the central government or company policy; prior to participation, its lands were mainly used for growing sugarcane. 2. The main factors in TSC's selection of tree species are the suitability of the land and of the species. The largest proportion of tree species is allocated to economic forests, and the lack of technical instruction was the main problem during afforestation; moreover, how to improve TSC's future development in leisure agriculture and the landscape business becomes a key topic. 3. TSC has developed short- and long-term plans for future participation in the PLAP; however, there is little willingness or incentive to budget for such detailed planning. 4. Most of the TSC staff interviewed consider the requirements of the PLAP unreasonable; among these, an unreasonable requirement on the number of trees accounted for the greatest proportion, and most interviewees suggested that the government should continue to provide incentives even after 20 years. 5. Since the government shares the same goals as TSC, there should be sufficient cooperation and communication to support technical instruction and the reduction of afforestation costs, which will also help to improve the effectiveness of the policy.
Abstract: The stochastic nature of tool life using conventional discrete-wear data from experimental tests usually exists due to many individual and interacting parameters. It is common practice in batch production to continually use the same tool to machine different parts, using disparate machining parameters. In such an environment, the optimal points at which tools have to be changed, while achieving minimum production cost and maximum production rate within the surface roughness specifications, have not been adequately studied. In the current study, two relevant aspects are investigated using coated and uncoated inserts in turning operations: (i) the accuracy of using machinability information from fixed-parameter testing procedures when variable-parameter situations emerge, and (ii) the credibility of tool life machinability data from prior discrete testing procedures in non-stop machining. A novel technique is proposed and verified to normalize the conventional fixed-parameter machinability data to suit cases in which parameters have to be changed for the same tool. Also, an experimental investigation has been established to evaluate the error in the tool life assessment when machinability data from discrete testing procedures are employed in uninterrupted practical machining.
Abstract: The production of a plant can be measured in terms of seeds, and the generation of seeds plays a critical role in our social and daily life. Fruit production, which generates seeds, depends on various parameters of the plant, such as shoot length, leaf number, root length and root number. While the plant is growing, some leaves may be lost and new leaves may appear, so it is very difficult to use the number of leaves to measure the growth of the plant. It is also cumbersome to measure the number and length of the roots continuously at several time instances after the initial period, because the roots grow deeper and deeper underground over time. On the contrary, the shoot length grows over time and can be measured at different time instances, so the growth of the plant can be measured using shoot length data recorded at different time instances after plantation. Environmental parameters such as temperature, rainfall, humidity and pollution also play a role in yield production, and soil, crop and distance management are taken care of to produce the maximum yield. Data on the growth of the shoot length of some mustard plants at the initial stage (7, 14, 21 and 28 days after plantation) are available from a statistical survey by a group of scientists under the supervision of Prof. Dilip De. In this paper, the initial shoot length of Ken (one type of mustard plant) has been used as initial data. Statistical models and methods of fuzzy logic and neural networks have been tested on this mustard plant; based on an error analysis (calculation of the average error), the model with the minimum error has been selected and can be used for the assessment of shoot length at maturity. Finally, all these methods have been tested with other types of mustard plants, and the soft computing model with the minimum error across all types has been selected for calculating the predicted growth of shoot length. The shoot length at maturity of all types of mustard plants has been calculated by applying the statistical method to the predicted shoot length data.
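The selection criterion described above, choosing the model whose predictions have the minimum average error against the observed shoot lengths, can be sketched as follows; the model names and data are hypothetical placeholders, not the study's measurements.

```python
def average_error(predicted, observed):
    """Mean absolute error between predicted and observed values
    (e.g. shoot lengths at the same time instances)."""
    return sum(abs(p - o) for p, o in zip(predicted, observed)) / len(observed)

def select_model(predictions_by_model, observed):
    """Return the name of the model whose predictions have the
    smallest average error against the observations."""
    return min(predictions_by_model,
               key=lambda name: average_error(predictions_by_model[name],
                                              observed))
```

The winning model is then reused to extrapolate shoot length at maturity, as the abstract describes.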