Abstract: Many studies have compared the efficiency of goodness-of-fit procedures for identifying whether or not a particular distribution could adequately explain a data set. In this paper, a study is conducted to investigate the power of several goodness-of-fit tests: the Kolmogorov-Smirnov (KS), Anderson-Darling (AD), and Cramér-von Mises (CV) tests, along with a proposed modification of the Kolmogorov-Smirnov test that incorporates a variance-stabilizing transformation (FKS). The performance of these tests is studied under simple random sampling (SRS) and ranked set sampling (RSS). The study shows that, in general, the AD test performs better than the other GOF tests; however, in some cases the proposed test can perform as well as the AD test.
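As a rough illustration of how such a power study can be set up (a minimal sketch, not the authors' procedure: the critical value, sample size, and exponential alternative below are all illustrative assumptions), the power of a KS test under SRS can be estimated by Monte Carlo:

```python
import math
import random

def norm_cdf(x):
    # CDF of the standard normal distribution
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ks_statistic(sample, cdf):
    # Two-sided Kolmogorov-Smirnov statistic D_n against a fully
    # specified continuous CDF
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, (i + 1) / n - f, f - i / n)
    return d

def ks_power(draw, cdf, n=30, reps=2000, seed=1):
    # Monte Carlo power estimate: fraction of samples from `draw`
    # rejected at the (asymptotic) 5% level
    rng = random.Random(seed)
    crit = 1.36 / math.sqrt(n)
    rejected = sum(
        ks_statistic([draw(rng) for _ in range(n)], cdf) > crit
        for _ in range(reps)
    )
    return rejected / reps

# Power of the KS normality test against an exponential alternative
power = ks_power(lambda rng: rng.expovariate(1.0), norm_cdf)
```

Under the null the rejection rate sits near the nominal 5% level, while an exponential alternative at n = 30 is rejected essentially always.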
Abstract: Extensive information is required within an R&D environment, and a considerable amount of time and effort is spent finding the necessary information. An adaptive information-providing system would benefit such an environment, and a conceptual model of the resources, people, and context is a prerequisite for developing such applications. In this paper, an information model covering various contexts and resources is proposed, enabling effective applications for adaptive information systems within an R&D project and meeting environment.
Abstract: This research is designed to help a WAP-based mobile phone user analyze traffic logistics by designing accessible processes from the mobile user to server databases. The design comprises a MySQL 4.1.8-nt database system on the server, with three sub-databases: traffic-light times at intersections during periods of the day; distances along the roads of the area blocks into which the main sample area is divided; and speeds of sample vehicles (motorcycle, private car, and truck) during periods of the day. For the interconnection between server and user, PHP is used to calculate distances and travelling times from the starting point to the destination, while XHTML is applied for receiving, sending, and displaying data from PHP on the user's mobile phone. The main sample area is the Huakwang-Ratchada area of Bangkok, Thailand, which is usually congested; the 6.25 km² surrounding area is split into 25 blocks of 0.25 km² each. For simulating the results, the designed server database and all communication models of this research have been uploaded to www.utccengineering.com/m4tg and accessed with a mobile phone supporting a WAP 2.0 XHTML/HTML multimode browser to observe values and displayed pictures. According to the simulated results, the user can check route pictures from the requested point to the destination, along with analyzed travel times for the sample vehicles in various periods of the day.
Abstract: Quality costs are the costs associated with preventing,
finding, and correcting defective work. Since the main language of
corporate management is money, quality-related costs act as means of
communication between the staff of quality engineering departments
and the company managers. The objective of quality engineering is to
minimize the total quality cost across the life of the product. Quality
costs provide a benchmark against which improvement can be
measured over time. They provide a rupee-based report on quality improvement efforts and are an effective tool to identify, prioritize, and select quality improvement projects. A review of the literature showed that a simplified methodology for collecting quality cost data in a manufacturing industry was required.
The quantified standard methodology is proposed for collecting data
of various elements of quality cost categories for manufacturing
industry. In the light of research carried out so far, it is also felt necessary to standardise the cost elements in each of the prevention, appraisal, internal failure, and external failure cost categories. Here, an attempt is made to standardise the various cost elements applicable to the manufacturing industry, and data are collected using the proposed quantified methodology. The paper discusses a case study carried out in a luggage manufacturing industry.
Abstract: Electronic systems are at the core of everyday life. They form an integral part of financial networks, mass transit,
telephone systems, power plants and personal computers. Electronic
systems are increasingly based on complex VLSI (Very Large Scale
Integration) integrated circuits. Electronic design automation is concerned with the design and production of VLSI systems. The next
important step in creating a VLSI circuit is Physical Design. The
input to the physical design is a logical representation of the system
under design. The output of this step is the layout of a physical
package that optimally or near optimally realizes the logical
representation. Physical design problems are combinatorial in nature
and of large problem sizes. Darwin observed that, as variations are
introduced into a population with each new generation, the less-fit
individuals tend to extinct in the competition of basic necessities.
This survival of fittest principle leads to evolution in species. The
objective of the Genetic Algorithms (GA) is to find an optimal
solution to a problem .Since GA-s are heuristic procedures that can
function as optimizers, they are not guaranteed to find the optimum,
but are able to find acceptable solutions for a wide range of
problems. This survey paper aims at a study on Efficient Algorithms
for VLSI Physical design and observes the common traits of the
superior contributions.
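The GA loop alluded to above can be sketched as follows. This is a generic toy, not one of the surveyed VLSI algorithms: OneMax (counting 1-bits) stands in for a real placement or routing cost, and all parameter values are illustrative assumptions.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=40, gens=60,
                      p_mut=0.02, seed=0):
    # Toy GA: tournament selection, one-point crossover, bit-flip mutation,
    # with the best-so-far individual tracked across generations.
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(gens):
        def pick():
            # binary tournament: the fitter of two random individuals survives
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)           # one-point crossover
            child = [b ^ (rng.random() < p_mut)      # bit-flip mutation
                     for b in p1[:cut] + p2[cut:]]
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=fitness)
    return best

# OneMax: maximize the number of 1-bits in the chromosome
best = genetic_algorithm(sum)
```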
Abstract: Functional imaging procedures for the non-invasive assessment of tissue microcirculation are in high demand, but require a mathematical approach describing the trans- and intercapillary passage of tracer particles. Up to now, two distinct theoretical concepts have been established for tracer kinetic modeling of contrast agent transport in tissue: pharmacokinetic compartment models, which are usually written as coupled differential equations, and the indicator dilution theory, which can be generalized in accordance with the theory of linear time-invariant (LTI) systems by using a convolution approach. Based on mathematical considerations, it can be shown that, also in the case of the open two-compartment model well known from functional imaging, the concentration-time course in tissue is given by a convolution, which allows a separation of the arterial input function from a system function, the impulse response function, summarizing the available information on tissue microcirculation. For this reason, it is possible to integrate the open two-compartment model into the system-theoretic concept of indicator dilution theory (IDT), and thus results known from IDT remain valid for the compartment approach. Given the large number of applications of compartmental analysis, similar solutions of the so-called forward problem, even in a more general context, can already be found in the extensive literature of the seventies and early eighties. Nevertheless, to this day, within the field of biomedical imaging (though not from the mathematical point of view) there seems to be a gap between the two approaches, which the author would like to bridge by an exemplary analysis of the well-known model.
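In generic notation (the symbols below are illustrative, not necessarily the paper's), the convolution form described above reads:

```latex
C_\mathrm{tis}(t) = \bigl(c_\mathrm{a} * g\bigr)(t)
  = \int_0^{t} c_\mathrm{a}(\tau)\, g(t-\tau)\,\mathrm{d}\tau,
\qquad
g(t) = A_1 e^{-\lambda_1 t} + A_2 e^{-\lambda_2 t},
```

where $c_\mathrm{a}$ is the arterial input function and the bi-exponential impulse response $g$ carries the exchange parameters of the open two-compartment model.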
Abstract: The two significant overvoltages in power system,
switching overvoltage and lightning overvoltage, are investigated in
this paper. Firstly, the effect of various power system parameters on
Line Energization overvoltages is evaluated by simulation in ATP.
The dominant parameters include the line parameters, short-circuit impedance, and circuit breaker parameters. Solutions to reduce switching overvoltages are reviewed, and controlled closing using switchsync controllers is proposed as a suitable method.
This paper also investigates lightning overvoltages at the overhead-line-to-cable transition. Simulations are performed in PSCAD/EMTDC. Surge arresters are applied at both ends of the cable to fulfill the insulation coordination. The maximum amplitude of overvoltages inside the cable is surveyed, which should be of great concern in insulation coordination studies.
Abstract: This paper proposes a meta-heuristic, Ant Colony Optimization, to solve multi-objective production problems. The multi-objective function is to minimize lead time and work in process. The problem involves the decision variables of distance and process time. According to the decision criteria, a mathematical model is formulated, and an ant colony optimization approach is developed to solve it. The proposed algorithm is parameterized by the number of ant colonies and the number of pheromone trails. An example is given to illustrate the effectiveness of the proposed model. The proposed formulation, a Max-Min Ant System, is then used to solve the problem, and the results evaluate the performance and efficiency of the proposed algorithm through simulation.
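A Max-Min Ant System can be sketched as below. This is a toy that solves a four-city travelling-salesman instance rather than the paper's production problem, and every parameter value is an illustrative assumption; it shows the two MMAS hallmarks, best-ant-only deposit and clamped pheromone trails.

```python
import math
import random

def mmas_tsp(dist, n_ants=10, iters=100, rho=0.1, alpha=1.0, beta=2.0,
             tau_max=1.0, tau_min=0.01, seed=0):
    # Max-Min Ant System: only the best-so-far tour deposits pheromone,
    # and all trails are clamped to [tau_min, tau_max].
    rng = random.Random(seed)
    n = len(dist)
    tau = [[tau_max] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            tour = [rng.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                w = [tau[i][j] ** alpha / dist[i][j] ** beta for j in cand]
                tour.append(rng.choices(cand, weights=w)[0])
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            if length < best_len:
                best_tour, best_len = tour, length
        for i in range(n):                 # evaporation, clamped from below
            for j in range(n):
                tau[i][j] = max(tau_min, (1.0 - rho) * tau[i][j])
        for k in range(n):                 # best-ant deposit, clamped above
            i, j = best_tour[k], best_tour[(k + 1) % n]
            tau[i][j] = min(tau_max, tau[i][j] + 1.0 / best_len)
            tau[j][i] = tau[i][j]
    return best_tour, best_len

# Four cities on a unit square: the optimal tour is the perimeter, length 4.
pts = [(0, 0), (0, 1), (1, 1), (1, 0)]
dist = [[math.dist(a, b) for b in pts] for a in pts]
tour, length = mmas_tsp(dist)
```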
Abstract: In studying the possibility of using plants as rhizoremediators, barley and a grass mixture that showed resistance to various concentrations of oil were selected. The minimal inhibitory effect of oil on these plants was shown by morphological parameters such as plant survival and the length and biomass of shoot and root compared with the control. In determining physiological parameters, a slight decrease in the amount of chlorophyll a and b in the leaves of the plants was noted. In the study of adsorption at the root surface, differences in the ratio of the total root surface to the working surface were shown as the plants grew in oil-contaminated soil.
Abstract: This paper describes the evolution of language
politics and the part played by political leaders with reference to
the Dravidian parties in Tamil Nadu. It explores the interesting
evolution from separatism to coalition in sustaining the values of
parliamentary democracy and federalism. It seems that the
appropriation of language politics is fully ascribed to the DMK
leadership under Annadurai and Karunanidhi. For them, the Tamil
language is a self-determining power, a terrain of nationhood, and
a perennial source of social and political power. The DMK remains a symbolic Tamil nationalist party, playing language politics in the interest of the Tamils. Though electoral alliances largely determine success, language politics still has significant space in the politics of Tamil Nadu. Ironically, the DMK moves from the periphery to the centre to gain national recognition for the Tamils as well as to maximize its own power. The evolution can be seen in two major phases: language politics for party building, and language politics for state building, with three successive political processes, namely separatism, representative politics, and coalition. The much-pronounced Dravidian Movement is radical enough to democratize its party ideology and sustain the spirit of parliamentary democracy. This has secured its own rewards in terms of political power, which provides the means to achieve the social and political goals of the party. Language politics and the leadership pattern actualized this trend, even as the movement shifted from separatism to coalition.
Abstract: Despite the various methods that exist for software risk management, software projects have a high rate of failure. As the complexity and size of projects increase, managing software development becomes more difficult, and the need for deeper analysis and risk assessment becomes vital. In this paper, a classification of software risks is specified, and the relations between these risks are presented using a risk tree structure. These risks are analyzed and assessed using probabilistic calculations. The analysis supports both qualitative and quantitative assessment of the risk of failure and can aid the software risk management process. The classification and risk tree structure can be applied in software tools.
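The probabilistic roll-up through a risk tree can be sketched as follows, assuming independent risks combined through AND/OR gates; the tree, its gate types, and all probabilities below are hypothetical, not taken from the paper.

```python
def p_or(children):
    # OR gate: the parent risk occurs if any child occurs
    # (children assumed independent)
    miss = 1.0
    for p in children:
        miss *= (1.0 - p)
    return 1.0 - miss

def p_and(children):
    # AND gate: the parent risk occurs only if every child occurs
    hit = 1.0
    for p in children:
        hit *= p
    return hit

# Hypothetical tree: the project fails if the schedule slips OR
# both key developers leave.
p_schedule_slip = 0.10
p_both_leave = p_and([0.20, 0.15])
p_failure = p_or([p_schedule_slip, p_both_leave])
```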
Abstract: This paper focuses on cost and profit analysis of a single-server Markovian queuing system with two priority classes. In
this paper, functions of total expected cost, revenue and profit of the
system are constructed and optimized with respect to the service rates of the lower- and higher-priority classes. A computing algorithm has been developed on the basis of a fast-converging numerical method to solve the system of nonlinear equations formed from the mathematical analysis. A novel performance measure for cost and profit analysis, in view of its economic interpretation for a system with priority classes, is discussed. On the basis of the computed tables, observations are also drawn to illustrate the effect of varying the parameters involved in the model.
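As an illustrative sketch only: a non-preemptive M/M/1 priority queue with a common service rate, made-up cost weights, and a plain grid scan standing in for the paper's fast-converging numerical scheme. The mean-wait formulas are the standard priority-queue results; everything else is an assumption.

```python
def total_cost(lam1, lam2, mu, c_mu=5.0, h1=40.0, h2=10.0):
    # Non-preemptive M/M/1 priority queue with a common service rate mu.
    # lam1/lam2 are high/low priority arrival rates; c_mu is the cost per
    # unit service rate and h1 > h2 are illustrative holding costs.
    lam = lam1 + lam2
    rho, rho1 = lam / mu, lam1 / mu
    if rho >= 1.0:
        return float("inf")                       # unstable: infinite cost
    w0 = lam / mu ** 2                            # mean residual service work
    wq1 = w0 / (1.0 - rho1)                       # high-priority mean wait
    wq2 = w0 / ((1.0 - rho1) * (1.0 - rho))       # low-priority mean wait
    l1, l2 = lam1 * wq1, lam2 * wq2               # Little's law
    return c_mu * mu + h1 * l1 + h2 * l2

# Grid scan for the cost-minimizing service rate (a crude stand-in for
# a fast-converging root-finding method).
lam1, lam2 = 2.0, 3.0
grid = [5.0 + 0.1 * k for k in range(100)]
mu_best = min(grid, key=lambda mu: total_cost(lam1, lam2, mu))
```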
Abstract: The frequency content of non-stationary signals varies with time. For proper characterization of such
signals, a smart time-frequency representation is necessary.
Classically, the STFT (short-time Fourier transform) is employed for this purpose. Its limitation is a fixed time-frequency resolution. To overcome this drawback, an enhanced STFT version is devised. It is based on a signal-driven sampling scheme named cross-level sampling, which can adapt the sampling frequency and the window function (length plus shape) by following the local variations of the input signal. This adaptation gives the proposed technique its appealing features: adaptive time-frequency resolution and computational efficiency.
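For reference, the classical fixed-resolution STFT that such a technique improves on can be sketched as below (a naive per-frame DFT with an illustrative window length, not the adaptive scheme itself):

```python
import cmath
import math

def stft(signal, win_len=8, hop=4):
    # Classical fixed-window STFT: Hann-windowed frames, naive DFT.
    hann = [0.5 - 0.5 * math.cos(2.0 * math.pi * n / (win_len - 1))
            for n in range(win_len)]
    frames = []
    for start in range(0, len(signal) - win_len + 1, hop):
        seg = [signal[start + n] * hann[n] for n in range(win_len)]
        frames.append([
            sum(seg[n] * cmath.exp(-2j * math.pi * k * n / win_len)
                for n in range(win_len))
            for k in range(win_len)
        ])
    return frames

# A tone at 2 cycles per 8-sample window should peak at DFT bin 2
# (or its mirror image, bin 6).
sig = [math.cos(2.0 * math.pi * 2.0 * t / 8.0) for t in range(64)]
frames = stft(sig)
peak_bin = max(range(8), key=lambda k: abs(frames[0][k]))
```

Because `win_len` and `hop` are fixed, every frame gets the same time-frequency trade-off; the abstract's cross-level sampling would instead vary both with the local signal behaviour.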
Abstract: The effect of seed inoculation with VA-mycorrhiza and different levels of phosphorus fertilizer on the growth and yield of sunflower (Azargol cultivar) was studied at the experimental farm of Islamic Azad University, Karaj Branch, during the 2008 growing season. The treatments were arranged in a factorial experiment based on a randomized complete block design with three replications. Four phosphorus fertilizer levels (25%, 50%, 75%, and 100% of the recommended P) were combined factorially with two mycorrhiza levels: with and without mycorrhiza (control).
Results showed that head diameter, number of seeds in head, seed
yield and oil yield were significantly higher in inoculated plants than
in non-inoculated plants. Head diameter, number of seeds in head,
1000-seed weight, biological yield, seed yield and oil yield increased
with increasing P level up to 75% of the recommended P in non-inoculated plants, whereas no significant difference was observed between 75%
and 100% P recommended. The positive effect of mycorrhizal
inoculation decreased with increasing P levels due to decreased
percent root colonization at higher P levels. According to the results
of this experiment, the application of mycorrhiza in the presence of 50% of the recommended P performed well and could increase seed yield and oil production to an acceptable level, so it could be considered a suitable substitute for chemical phosphorus fertilizer in organic agricultural systems.
Abstract: Super-quadrics can represent a set of implicit surfaces,
which can be used furthermore as primitive surfaces to construct a
complex object via Boolean set operations in implicit surface
modeling. In fact, super-quadrics were developed to create a
parametric surface by performing spherical product on two parametric
curves and some of the resulting parametric surfaces were also
represented as implicit surfaces. However, because not every parametric curve can be redefined implicitly, only implicit super-elliptic and super-hyperbolic curves can be applied to perform the spherical product, and so only implicit super-ellipsoids and super-hyperboloids are developed in super-quadrics. To create implicit surfaces with more diverse shapes than super-quadrics, this paper proposes an implicit representation of the spherical product, which performs the spherical product on two implicit curves as super-quadrics do. By means of the implicit representation, many new implicit curves, such as polygonal, star-shaped, and rose-shaped curves, can be used to develop new implicit surfaces with a greater variety of shapes than super-quadrics, such as polyhedrons, hyper-ellipsoids, super-hyperboloids, and hyper-toroids containing star-shaped and rose-shaped major and minor circles. Besides, the newly developed implicit
surfaces can also be used to define new primitive implicit surfaces for
constructing a more complex implicit surface in implicit surface
modeling.
Abstract: This paper presents the results of an experimental
investigation carried out to evaluate the shrinkage of High Strength
Concrete. The High Strength Concrete is made by partial replacement of cement with fly ash and silica fume. Its shrinkage has been studied using different types of coarse and fine aggregates, i.e., sandstone and granite of 12.5 mm size, and Yamuna and Badarpur sand. The mix proportion of the concrete is 1:0.8:2.2 with a water-cement ratio of 0.30. A superplasticizer dose of 2% by weight of cement is added to achieve the required degree of workability in terms of compaction factor.
From the test results of the above investigation it can be concluded
that the shrinkage strain of High Strength Concrete increases with
age. The shrinkage strain of concrete with 10% of the cement replaced by fly ash and silica fume, respectively, is 6 to 10% higher at various ages than that of concrete without fly ash and silica fume. The shrinkage strain of concrete with Badarpur sand as fine aggregate at 90 days is slightly less (10%) than that of concrete with Yamuna sand. Further, the shrinkage strain of concrete with granite as coarse aggregate at 90 days is slightly less (6 to 7%) than that of concrete with sandstone aggregate of the same size. The
shrinkage strain of High Strength Concrete is also compared with that
of normal strength concrete. Test results show that the shrinkage
strain of high strength concrete is less than that of normal strength
concrete.
Abstract: Because the electrical metrics that describe photovoltaic cell performance are inherently multivariate in nature, the use of a univariate, or one-variable, statistical process control chart can have important limitations. Development of a comprehensive process control strategy is known to be significantly beneficial in reducing the process variability that ultimately drives up the manufacturing cost of photovoltaic cells. The multivariate moving average, or MMA, chart is applied to the electrical metrics of photovoltaic cells to illustrate the improved sensitivity to process variability this method of control charting offers. The results show that the ability of the MMA chart to expand to as many variables as needed suggests an application in which multiple photovoltaic electrical metrics are used in concert to determine the process's state of control.
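An MMA statistic can be sketched as below for the simplified case of a known diagonal covariance (avoiding matrix inversion); the metric names, targets, and variances are hypothetical, not the paper's data.

```python
def mma_statistics(data, mean, var, w=3):
    # Multivariate moving-average (MMA) chart statistic with a known,
    # diagonal covariance: T2_t = w * sum_j (zbar_j - mean_j)^2 / var_j,
    # where zbar is the mean of the last w observation vectors.
    stats = []
    p = len(mean)
    for t in range(w - 1, len(data)):
        window = data[t - w + 1: t + 1]
        zbar = [sum(x[j] for x in window) / w for j in range(p)]
        stats.append(w * sum((zbar[j] - mean[j]) ** 2 / var[j]
                             for j in range(p)))
    return stats

# Two hypothetical cell metrics (say, Voc in volts and Isc in amperes):
# the last two observations drift away from the targets together.
data = [(0.60, 8.0), (0.61, 8.1), (0.59, 7.9), (0.70, 9.0), (0.71, 9.1)]
t2 = mma_statistics(data, mean=[0.60, 8.0], var=[0.0001, 0.01])
```

A joint drift in both metrics, invisible to either univariate chart alone until it is large, pushes the T² statistic sharply upward.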
Abstract: We describe a novel method for removing noise of unknown variance from microarrays in the wavelet domain. The method is based on smoothing the coefficients of the highest subbands. Specifically, we decompose the noisy microarray into wavelet subbands, apply smoothing within each highest subband, and reconstruct the microarray from the modified wavelet coefficients. This process is applied a single time, and exclusively to the first level of decomposition; i.e., in most cases a multiresolution analysis is not necessary. Denoising results compare favorably to most methods currently in use.
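A single-level, highest-subband smoothing step of this general flavor can be sketched with a Haar transform. This is 1-D for brevity, and Haar plus a short moving average are stand-ins for whatever wavelet and smoother the method actually uses; a microarray image would need the 2-D analogue.

```python
import math
import random

def haar_smooth_denoise(x, smooth=3):
    # One-level Haar transform; the detail (highest) subband is smoothed
    # with a short moving average before reconstruction.
    n = len(x) - len(x) % 2
    s2 = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s2 for i in range(0, n, 2)]
    detail = [(x[i] - x[i + 1]) / s2 for i in range(0, n, 2)]
    half = smooth // 2
    smoothed = []
    for i in range(len(detail)):
        seg = detail[max(0, i - half): i + half + 1]
        smoothed.append(sum(seg) / len(seg))
    out = []
    for a, d in zip(approx, smoothed):        # inverse Haar step
        out += [(a + d) / s2, (a - d) / s2]
    return out

# A noisy constant signal: smoothing the detail subband should pull the
# samples back toward the true level 5.0.
random.seed(0)
noisy = [5.0 + random.gauss(0.0, 0.5) for _ in range(256)]
clean = haar_smooth_denoise(noisy)
mse_before = sum((v - 5.0) ** 2 for v in noisy) / len(noisy)
mse_after = sum((v - 5.0) ** 2 for v in clean) / len(clean)
```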
Abstract: The main aim of this study is to identify the most
influential variables that cause defects in the items produced by a
casting company located in Turkey. To this end, one of the items
produced by the company with high defective percentage rates is
selected. Two approaches, regression analysis and decision trees, are used to model the relationship between process parameters and defect types. Although the logistic regression models failed, the decision tree model gives meaningful results. Based on these results, it can be
claimed that the decision tree approach is a promising technique for
determining the most important process variables.
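The decision-tree idea can be illustrated with a one-level tree (a decision stump) on hypothetical casting data; the features, values, and labels below are invented for illustration and are not the company's data.

```python
def best_stump(X, y):
    # One-level decision tree ("stump"): choose the feature/threshold pair
    # whose split best separates defective (1) from good (0) castings;
    # min(err, n - err) also allows the inverted rule.
    best = None                                   # (error, feature, threshold)
    for f in range(len(X[0])):
        for thr in sorted({row[f] for row in X}):
            pred = [1 if row[f] > thr else 0 for row in X]
            err = sum(p != t for p, t in zip(pred, y))
            err = min(err, len(y) - err)
            if best is None or err < best[0]:
                best = (err, f, thr)
    return best

# Invented process data: (pour temperature, mold hardness) -> defective?
# In this toy set, defects are driven entirely by temperature.
X = [(700, 30), (710, 55), (720, 40), (760, 35), (770, 50), (780, 45)]
y = [0, 0, 0, 1, 1, 1]
err, feature, threshold = best_stump(X, y)
```

The chosen feature of the top split is exactly the "most important process variable" a full decision tree reports first.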
Abstract: Software maintenance, and mainly software comprehension, accounts for the largest costs in the software lifecycle. In
order to assess the cost of software comprehension, various
complexity measures have been proposed in the literature. This paper
proposes new cognitive-spatial complexity measures, which combine
the impact of spatial as well as architectural aspect of the software to
compute the software complexity. The spatial aspect of the software
complexity is taken into account using the lexical distances (in
number of lines of code) between different program elements and the
architectural aspect of the software complexity is taken into
consideration using the cognitive weights of control structures
present in control flow of the program. The proposed measures are
evaluated using standard axiomatic frameworks and then, the
proposed measures are compared with the corresponding existing
cognitive complexity measures as well as the spatial complexity
measures for object-oriented software. This study establishes that the
proposed measures are better indicators of the cognitive effort
required for software comprehension than the other existing
complexity measures for object-oriented software.
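One plausible shape for such a combined measure, purely illustrative (the paper's exact definition and weight table may differ), multiplies each control structure's cognitive weight by the lexical distance between definition and use:

```python
# Illustrative cognitive weights for control structures (values of this
# kind appear in the cognitive-complexity literature; the actual table
# used by the proposed measures may differ).
COGNITIVE_WEIGHT = {"sequence": 1, "branch": 2, "loop": 3, "call": 2}

def cognitive_spatial_complexity(uses):
    # uses: (structure_kind, def_line, use_line) triples for one module.
    # Each use contributes its structure's cognitive weight scaled by the
    # lexical distance (in lines of code) between definition and use.
    return sum(COGNITIVE_WEIGHT[kind] * abs(use_line - def_line)
               for kind, def_line, use_line in uses)

# Hypothetical module: a loop referencing a variable defined 12 lines
# earlier dominates the nearby sequential and branch references.
uses = [("sequence", 10, 12), ("loop", 3, 15), ("branch", 7, 9)]
score = cognitive_spatial_complexity(uses)
```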