Abstract: This paper presents an optimized model to investigate the effects of peak current, pulse-on time, and pulse-off time on EDM performance, specifically the material removal rate of a titanium alloy, using a copper-tungsten electrode with positive electrode polarity. The experiments are carried out on Ti6Al4V by varying the peak current, pulse-on time, and pulse-off time. A mathematical model is developed to correlate the influences of these variables with the material removal rate of the workpiece. Design of
experiments (DOE) method and response surface methodology
(RSM) techniques are implemented. The fit and adequacy of the proposed models are validated through analysis of variance (ANOVA). The obtained results show that the material removal rate increases as peak current and pulse-on time increase. The effect of pulse-off time on MRR changes with the peak current. The optimum machining conditions for material removal rate are estimated and verified against the proposed optimized results. The developed model agrees with the experimental results within an acceptable error of about 4%. Optimizing the input parameters in this way leads to a desirable material removal rate and economical industrial machining.
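The abstract does not report the fitted coefficients, so as a minimal sketch of how such a regression model is obtained, the example below fits a simple linear form MRR = b0 + b1*Ip + b2*Ton by solving the least-squares normal equations (a full second-order RSM model adds quadratic and interaction terms; the data and coefficients here are hypothetical):

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting for the normal equations.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_mrr_model(currents, on_times, mrr):
    # Least-squares fit of MRR = b0 + b1*Ip + b2*Ton (illustrative linear form).
    X = [[1.0, ip, ton] for ip, ton in zip(currents, on_times)]
    # Normal equations: (X^T X) b = X^T y
    XtX = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(3)]
           for i in range(3)]
    Xty = [sum(X[k][i] * mrr[k] for k in range(len(X))) for i in range(3)]
    return solve(XtX, Xty)

# Hypothetical noiseless data generated from MRR = 0.5 + 0.8*Ip + 0.05*Ton.
ip = [4, 8, 12, 16, 4, 16]
ton = [50, 100, 150, 200, 200, 50]
y = [0.5 + 0.8 * a + 0.05 * b for a, b in zip(ip, ton)]
b0, b1, b2 = fit_mrr_model(ip, ton, y)
```

With noiseless synthetic data the known coefficients are recovered exactly; on real DOE data, the ANOVA step described in the abstract would then test the significance of each fitted term.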
Abstract: The study of the geometric shape of the vortices enclosed by plunging breaking waves as a possible indicator of the breaking intensity of ocean waves has been ongoing for almost 50 years with limited success. This paper investigates the validity of using the vortex ratio and vortex angle as predictors of breaking intensity. Previously published works on vortex parameters, based on regular wave flume results or solitary wave theory, present contradictory results and conclusions. Through the first complete analysis of field-collected irregular breaking wave vortex parameters, it is shown that the vortex ratio and vortex angle cannot be accurately predicted using standard breaking wave characteristics and are therefore not recommended as indicators of breaking intensity.
Abstract: Traditionally, terror groups have been formed by ideologically aligned actors who perceive a lack of options for achieving political or social change. However, terrorist attacks have been increasingly carried out by small groups of actors or lone individuals who may be only ideologically affiliated with larger, formal terrorist organizations. The formation of these groups represents the inverse of traditional organizational growth, whereby structural de-evolution within issue-based organizations leads to the formation of small, independent terror cells. Ideological franchising, the bypassing of formal affiliation to the "parent" organization, represents the de-evolution of traditional concepts of organizational structure in favor of an organic, independent, and focused unit. Traditional definitions of issue-based dark networks include focus on an identified goal, commitment to achieving this goal through unrestrained actions, and selection of symbolic targets. The next step in the de-evolution of small dark networks is the mini-organization, consisting of only a handful of actors working toward a common, violent goal. Information-sharing through social media platforms, coupled with the civil liberties of democratic nations, provides the communication systems, access to information, and freedom of movement necessary for small dark networks to flourish without the aid of a parent organization. As attacks such as the 7/7 bombings demonstrate the effectiveness of small dark networks, terrorist actors will feel increasingly comfortable aligning with an ideology only, without formally organizing. The natural result of this de-evolving organization is the single-actor event, in which an individual subscribes to a larger organization's violent ideology with little or no formal ties.
Abstract: Software reliability prediction offers a great opportunity to measure the software failure rate at any point throughout system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. In this article, we focus on a software reliability model, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be regarded as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence that consists of a random number of basic blocks. We consider the system software unreliable; the time between adjacent failures has an exponential distribution.
Abstract: The world economic crises and budget constraints
have caused authorities, especially those in developing countries, to
rationalize water quality monitoring activities. Rationalization
consists of reducing the number of monitoring sites, the number of
samples, and/or the number of water quality variables measured. The
reduction in water quality variables is usually based on correlation. If
two variables exhibit high correlation, it is an indication that some of
the information produced may be redundant. Consequently, one
variable can be discontinued, and the other continues to be measured.
Later, the ordinary least squares (OLS) regression technique is employed to reconstitute information about the discontinued variable by using the continuously measured one as an explanatory variable. In
this paper, two record extension techniques are employed to
reconstitute information about discontinued water quality variables,
the OLS and the Line of Organic Correlation (LOC). An empirical
experiment is conducted using water quality records from the Nile
Delta water quality monitoring network in Egypt. The record
extension techniques are compared for their ability to predict
different statistical parameters of the discontinued variables. Results
show that the OLS is better at estimating individual water quality
records. However, results indicate an underestimation of the variance
in the extended records. The LOC technique is superior in preserving
characteristics of the entire distribution and avoids underestimation
of the variance. It is concluded from this study that the OLS can be
used for the substitution of missing values, while LOC is preferable
for inferring statements about the probability distribution.
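The contrast between the two techniques can be shown numerically. In this sketch (with synthetic, hypothetical data, not the Nile Delta records), the OLS slope is r·s_y/s_x while the LOC slope is sign(r)·s_y/s_x, so OLS predictions shrink the variance by a factor of r² while LOC preserves it:

```python
import math, random

def slopes(x, y):
    # Sample means, standard deviations, correlation, and the two slopes.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x) / n)
    sy = math.sqrt(sum((v - my) ** 2 for v in y) / n)
    r = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n * sx * sy)
    b_ols = r * sy / sx                  # ordinary least squares slope
    b_loc = math.copysign(sy / sx, r)    # line of organic correlation slope
    return mx, my, r, b_ols, b_loc

def variance(v):
    m = sum(v) / len(v)
    return sum((a - m) ** 2 for a in v) / len(v)

random.seed(1)
x = [random.gauss(0, 1) for _ in range(2000)]
y = [0.7 * a + random.gauss(0, 0.5) for a in x]   # imperfectly correlated pair

mx, my, r, b_ols, b_loc = slopes(x, y)
y_ols = [my + b_ols * (a - mx) for a in x]   # variance shrinks to r^2 * var(y)
y_loc = [my + b_loc * (a - mx) for a in x]   # variance equals var(y) exactly
```

This mirrors the abstract's conclusion: OLS minimizes individual prediction error but underestimates the variance of the extended record, whereas LOC reproduces the spread of the original distribution.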
Abstract: The fatty acid composition of the lipid fractions of 16 microalgae strains, isolated from different basins of Kazakhstan and characterized by stable, active growth in the laboratory, was analyzed. Three species of green microalgae (Oocystis
rhomboideus, Chlorococcum infusionum, Dictyochlorella globosa)
and three species of diatoms (Synedra sp., Nitzschia sp., Pleurosigma
attenuatum) are characterized by a high content of lipids and are
promising for further study as a source of polyunsaturated fatty acids.
Abstract: Complex networks have been intensively studied across
many fields, especially in Internet technology, biological engineering,
and nonlinear science. Software is built up out of many interacting
components at various levels of granularity, such as functions, classes,
and packages, representing another important class of complex networks.
It can also be studied using complex network theory. Over the
last decade, many papers on the interdisciplinary research between
software engineering and complex networks have been published.
This interdisciplinary research provides a different dimension to our understanding of software and is also very useful for the design and development of software systems. This paper explores how complex network theory can be used to analyze software structure and briefly reviews the main advances in this area.
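As a minimal sketch of the idea (with a hypothetical dependency list, not data from any of the reviewed papers), a software system can be represented as a directed graph whose nodes are components and whose edges are dependencies; standard network measures such as in-degree (a rough proxy for how widely a component is reused) then follow directly:

```python
from collections import Counter

# Hypothetical dependency edges between modules: (dependent, dependency).
edges = [
    ("ui", "core"), ("ui", "utils"), ("net", "core"), ("net", "utils"),
    ("core", "utils"), ("db", "core"), ("db", "utils"),
]

in_degree = Counter(callee for _, callee in edges)    # times depended upon
out_degree = Counter(caller for caller, _ in edges)   # dependencies held

# In-degree distribution: how many modules are depended on by k others.
distribution = Counter(in_degree.values())
most_reused = max(in_degree, key=in_degree.get)
```

On real software graphs, which are far larger, the same measures (degree distributions, clustering, path lengths) are what the complex-network studies surveyed in this paper compute.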
Abstract: Existing literature on design reasoning tends to give one-sided accounts of expert design behaviour based on internal processing. Likewise, ecological theories tend to focus one-sidedly on external elements, resulting in the lack of a unifying design cognition theory. Although current extended design cognition studies acknowledge the intellectual interaction between internal and external resources, there still seems to be insufficient understanding of the complexities involved in such interactive processes. This paper therefore proposes a novel multi-directional model for design researchers to map the complex and dynamic conduct-controlling behaviour in which both the computational and ecological perspectives are integrated in a vertical manner. A clear distinction between identified intentional and emerging physical drivers, and the relationships between them during the early phases of experts' design process, is demonstrated through a case study in which the model was employed.
Abstract: This paper presents a detailed investigation of the thermal-hydraulic characteristics of the flow field in a fuel rod model, especially near the spacer. The investigated area represents a source of information on the velocity field, vortices, and the amount of heat transfer into the coolant, all of which are critical for
the design and improvement of the fuel rod in nuclear power plants.
The flow field investigation uses three-dimensional Computational
Fluid Dynamics (CFD) with the Reynolds stresses turbulence model
(RSM). The fuel rod model incorporates a vertical annular channel
where three different shapes of spacers are used; each spacer shape is
addressed individually. These spacers are mutually compared in
consideration of heat transfer capabilities between the coolant and
the fuel rod model. The results are complemented with the calculated
heat transfer coefficient in the location of the spacer and along the
stainless-steel pipe.
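The abstract does not state how the CFD-derived heat transfer coefficient is checked; as a rough, hedged reference point, a standard single-phase correlation such as Dittus-Boelter is commonly used to sanity-check coefficients in turbulent channel flow (all property values below are hypothetical, not the paper's operating conditions):

```python
def dittus_boelter_h(re, pr, k_fluid, d_h):
    # Dittus-Boelter: Nu = 0.023 Re^0.8 Pr^0.4 (fluid being heated),
    # valid roughly for turbulent flow, Re > ~1e4, 0.6 < Pr < 160.
    nu = 0.023 * re ** 0.8 * pr ** 0.4
    return nu * k_fluid / d_h   # h = Nu * k / D_h  [W/(m^2 K)]

# Hypothetical water-like conditions in an annular channel.
h1 = dittus_boelter_h(re=5.0e4, pr=4.0, k_fluid=0.6, d_h=0.01)
h2 = dittus_boelter_h(re=1.0e5, pr=4.0, k_fluid=0.6, d_h=0.01)
```

Such a correlation captures only the smooth-channel baseline; the local enhancement near the spacer, which is the focus of the paper, must come from the CFD solution itself.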
Abstract: The recycling of concrete, bricks and masonry rubble
as concrete aggregates is an important way to contribute to a
sustainable material flow. However, there are still various
uncertainties limiting the widespread use of Recycled Concrete
Aggregates (RCA). The fluctuations in the composition of grade
recycled aggregates and their influence on the properties of fresh and
hardened concrete are of particular concern regarding the use of
RCA. Most of the problems that occur when using RCA are due to its higher porosity, and hence higher water absorption, lower mechanical strength, and residual impurities on the RCA surface that form a weaker bond between cement paste and aggregate. The reuse of RCA is therefore still limited. An efficient polymer-based treatment is proposed to make the reuse of RCA easier. Silicon-based polymer treatments of RCA were carried out and compared. Such treatment can improve the properties of RCA; for example, the rate of water absorption of treated RCA is significantly reduced.
Abstract: According to statistics, the prevalence of congenital hearing loss in Taiwan is approximately six per thousand; furthermore, one in a thousand infants has severe hearing impairment. Hearing ability during infancy has a significant impact on the development of children's oral expression, language maturity, cognitive performance, educational ability, and social behavior later in life. Although most children born with hearing impairment have sensorineural hearing loss, almost every child retains at least some residual hearing. If provided with a hearing aid or cochlear implant (a bionic ear) in time, in addition to hearing and speech training, even severely hearing-impaired children can still learn to talk. On the other hand, those who fail to be diagnosed, and are thus unable to begin hearing and speech rehabilitation in a timely manner, may lose an important opportunity to live a complete and healthy life. Eventually, the lack of hearing and speaking ability will affect the development of both mental and physical functions, intelligence, and social adaptability. Not only does this problem result in an irreparable loss for the hearing-impaired child for a lifetime, but it also creates a heavy burden for the family and society. Therefore, it is necessary to establish a computer-assisted predictive model that can accurately detect and help diagnose newborn hearing loss so that early interventions can be provided in time and waste of medical resources can be avoided. This study uses information from the neonatal database of the case hospital as its subjects, adopting two different analysis approaches: using a support vector machine (SVM) directly for model prediction, and using logistic regression to screen factors prior to SVM model prediction, and compares the results. The results indicate that prediction accuracy is as high as 96.43% when the factors are screened and selected through logistic regression. Hence, the model constructed in this study can provide real help to physicians in clinical diagnosis and genuinely benefit early intervention for newborn hearing impairment.
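The screening step can be sketched as follows: fit a logistic regression by gradient descent and keep only the factors with large coefficient magnitudes before training the final classifier. This is purely an illustration of the screening idea on synthetic, hypothetical data, not the study's actual pipeline, features, or records:

```python
import math, random

def fit_logistic(X, y, lr=0.5, iters=2000):
    # Plain batch gradient descent on the logistic loss; w[0] is the intercept.
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(iters):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        for j in range(len(w)):
            w[j] -= lr * grad[j] / len(X)
    return w

random.seed(0)
# Hypothetical data: factor 0 drives the outcome, factor 1 is pure noise.
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(300)]
y = [1 if xi[0] + random.gauss(0, 0.3) > 0 else 0 for xi in X]

w = fit_logistic(X, y)
# Screening: keep factors whose coefficient magnitude exceeds a threshold.
kept = [j for j, wj in enumerate(w[1:]) if abs(wj) > 1.0]
```

The surviving factors would then be passed to the SVM; the study's reported gain (96.43% accuracy after screening) reflects exactly this removal of uninformative inputs.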
Abstract: It is known that the heart interacts with and adapts to
its venous and arterial loading conditions. Various experimental
studies and modeling approaches have been developed to investigate
the underlying mechanisms. This paper presents a model of the left
ventricle derived based on nonlinear stress-length myocardial
characteristics integrated over truncated ellipsoidal geometry, and
a second-order dynamic mechanism for the excitation-contraction
coupling system. The results of the model presented here describe the
effects of the viscoelastic damping element of the electromechanical
coupling system on the hemodynamic response. Different heart rates
are considered to study the pacing effects on the performance of the
left-ventricle against constant preload and afterload conditions under
various damping conditions. The results indicate that the pacing
process of the left ventricle has to take into account, among other
things, the viscoelastic damping conditions of the myofilament
excitation-contraction process.
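The role of the damping element can be illustrated with a generic second-order system; this is a toy sketch in arbitrary units, not the paper's full ventricular model. Increasing the viscoelastic damping coefficient suppresses the overshoot of the response to a step activation:

```python
def step_response_peak(c, m=1.0, k=1.0, f=1.0, dt=0.001, t_end=20.0):
    # Semi-implicit Euler integration of m*x'' + c*x' + k*x = f (step input).
    x, v = 0.0, 0.0
    peak, t = 0.0, 0.0
    while t < t_end:
        a = (f - c * v - k * x) / m
        v += a * dt
        x += v * dt
        peak = max(peak, x)
        t += dt
    return peak

light = step_response_peak(c=0.5)   # underdamped: overshoots the steady state
heavy = step_response_peak(c=2.0)   # critically damped: no overshoot
```

For m = k = 1 the steady state is x = 1; the underdamped case peaks near 1.44, while critical damping removes the overshoot entirely, which is the qualitative effect the model attributes to the viscoelastic element of the coupling system.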
Abstract: This paper presents reliability evaluation techniques
which are applied in distribution system planning studies and
operation. Reliability of distribution systems is an important issue in
power engineering for both utilities and customers. Reliability is a
key issue in the design and operation of electric power distribution systems and their loads. Reliability evaluation of distribution systems has
been the subject of many recent papers and the modeling and
evaluation techniques have improved considerably.
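Although the abstract does not name them, distribution reliability evaluations typically report the standard IEEE 1366 customer-based indices. A small sketch with hypothetical outage data:

```python
def reliability_indices(outages, customers_served):
    # outages: list of (customers_interrupted, duration_hours) per event.
    # SAIFI: average number of interruptions per customer served.
    # SAIDI: average interruption duration (hours) per customer served.
    # CAIDI = SAIDI / SAIFI: average duration per interruption experienced.
    total_interruptions = sum(n for n, _ in outages)
    total_customer_hours = sum(n * d for n, d in outages)
    saifi = total_interruptions / customers_served
    saidi = total_customer_hours / customers_served
    caidi = saidi / saifi
    return saifi, saidi, caidi

# Hypothetical year: two feeder outages on a 1000-customer system.
saifi, saidi, caidi = reliability_indices([(400, 2.0), (100, 0.5)], 1000)
```

Planning studies compare such indices across candidate network configurations, which is the kind of evaluation the techniques reviewed here support.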
Abstract: In quality control of freeze-dried durian, crispiness is
a key quality index of the product. Generally, crispy testing has to be
done by a destructive method. A nondestructive test of crispiness is preferable because the samples can then be reused for other kinds of testing. This paper proposes a crispiness classification
method of freeze-dried durians using fuzzy logic for decision
making. The physical changes of a freeze-dried durian include the
pores appearing in the images. Three physical features including (1)
the diameters of pores, (2) the ratio of the pore area and the
remaining area, and (3) the distribution of the pores are considered to
contribute to the crispiness. The fuzzy logic is applied for making the
decision. The experimental results, compared with food expert opinion, show that the accuracy of the proposed classification method is 83.33 percent.
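The abstract does not give the membership functions or rule base; the decision step can nonetheless be sketched with hypothetical triangular memberships over the three pore features and a small max-min rule base:

```python
def tri(x, a, b, c):
    # Triangular membership function peaking at b over the support [a, c].
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def crispiness(diameter_mm, area_ratio, dispersion):
    # Hypothetical memberships: "large pores", "high pore-area ratio",
    # and "even pore distribution" each support the "crispy" conclusion.
    large = tri(diameter_mm, 0.5, 2.0, 3.5)
    high = tri(area_ratio, 0.2, 0.6, 1.0)
    even = tri(dispersion, 0.3, 0.7, 1.1)
    # Max-min inference with two illustrative rules:
    #   R1: large AND high -> crispy
    #   R2: high AND even  -> crispy
    crispy = max(min(large, high), min(high, even))
    return "crispy" if crispy >= 0.5 else "not crispy"

label = crispiness(diameter_mm=2.0, area_ratio=0.6, dispersion=0.7)
```

In the actual system, the three features are measured from pore images and the rule base would be tuned against expert-labelled samples, which is how the reported 83.33 percent agreement was assessed.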
Abstract: Data mining is the process of sifting through large
volumes of data, analyzing data from different perspectives and
summarizing it into useful information. One widely used desktop application for data mining is the Weka tool, a collection of machine learning algorithms implemented in Java and open-sourced under the General Public License (GPL). A web service is a software system designed to support interoperable machine-to-machine interaction over a network using SOAP messages. Unlike a desktop application, a web service is easy to upgrade, deliver, and access, and does not occupy any memory on the client system. Keeping in mind the advantages of a web service over a desktop application, this paper demonstrates how this Java-based desktop data mining application can be implemented as a web service to support data mining across the internet.
Abstract: This research aims to examine the key success factors
for the diffusion of mobile entertainment services in Malaysia. The
drivers and barriers observed in this research include perceived
benefit; concerns pertaining to pricing, product and technological
standardization, privacy and security; as well as influences from
peers and community. An analysis of a Malaysian survey of 384 respondents aged 18 to 25 shows that subscribers placed
greater importance on perceived benefit of mobile entertainment
services compared to other factors. Results of the survey also show
that there are strong positive correlations between all the factors,
with pricing issue–perceived benefit showing the strongest
relationship. This paper aims to provide an extensive study on the
drivers and barriers that could be used to derive architecture for
entertainment service provision to serve as a guide for telcos to
outline suitable approaches in order to encourage mass market
adoption of mobile entertainment services in Malaysia.
Abstract: This study was conducted on the Ismailoglu grape type (Vitis vinifera L.), whose 15-year-old vines were grown on their own roots, during the 2013 vegetation period in Nevşehir province, Turkey.
In this research, it was investigated whether the applications of
Control (C), 1/3 cluster tip reduction (1/3 CTR), shoot tip reduction
(STR), 1/3 CTR + STR, TKI-HUMAS (TKI-HM) (Soil) (S), TKI-HM (Foliar) (F), TKI-HM (S + F), 1/3 CTR + TKI-HM (S), 1/3 CTR
+ TKI-HM (F), 1/3 CTR + TKI-HM (S+F), STR + TKI-HM (S), STR
+ TKI-HM (F), STR + TKI-HM (S + F), 1/3 CTR + STR+TKI-HM
(S), 1/3 CTR + STR + TKI-HM (F), 1/3 CTR + STR + TKI-HM (S +
F) on the yield and yield components of the Ismailoglu grape type. The highest fresh grape yield (16.15 kg/vine) was obtained with TKI-HM (S); the highest cluster weight (652.39 g) with 1/3 CTR + STR; the highest 100-berry weight (419.07 g) with 1/3 CTR + STR + TKI-HM (F); the highest maturity index (44.06) with 1/3 CTR; the highest must yield (810.00 ml) with STR + TKI-HM (F); the highest L* color intensity (42.04) with TKI-HM (S + F); the highest a* color intensity (2.60) with 1/3 CTR + TKI-HM (S); and the highest b* color intensity (7.16) with 1/3 CTR + TKI-HM (S). To increase the fresh grape yield of the Ismailoglu grape type, the TKI-HM (S) application can be recommended.
Abstract: Research on damage of gears and gear pairs using
vibration signals remains very attractive, because vibration signals
from a gear pair are complex in nature and not easy to interpret.
Predicting gear pair defects by analyzing changes in the vibration signals of gear pairs in operation is a very reliable method. Therefore, a
suitable vibration signal processing technique is necessary to extract
defect information generally obscured by the noise from dynamic
factors of other gear pairs. This article presents the value of cepstrum
analysis in vehicle gearbox fault diagnosis. Cepstrum represents the
overall power content of a whole family of harmonics and sidebands
when more than one family of sidebands is present at the same time.
The concepts of measurement and analysis involved in using the technique are briefly outlined. Cepstrum analysis is used for the detection
of an artificial pitting defect in a vehicle gearbox loaded with
different speeds and torques. The test stand is equipped with three
dynamometers; the input dynamometer serves as the internal
combustion engine, the output dynamometers introduce the load on
the flanges of the output joint shafts. The pitting defect is
manufactured on the tooth side of a gear of the fifth speed on the
secondary shaft. Also, a method for gear fault diagnosis based on the order cepstrum is presented. The procedure is illustrated with the experimental vibration data of the vehicle gearbox. The results show the effectiveness of cepstrum analysis in the detection and diagnosis of the gear condition.
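The defining computation is compact: the real cepstrum is the inverse transform of the log magnitude spectrum, which collapses a family of evenly spaced harmonics into a single peak at the corresponding quefrency. A minimal pure-Python sketch on a synthetic signal (not the gearbox data):

```python
import cmath, math

def dft(x, inverse=False):
    # Naive O(N^2) discrete Fourier transform (adequate for a short demo).
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[t] * cmath.exp(sign * 2j * math.pi * k * t / n)
               for t in range(n)) for k in range(n)]
    return [v / n for v in out] if inverse else out

def real_cepstrum(x):
    spectrum = dft(x)
    log_mag = [math.log(abs(v) + 1e-12) for v in spectrum]  # floor avoids log(0)
    return [v.real for v in dft(log_mag, inverse=True)]

# Synthetic "gear mesh" signal: 4 harmonics of a 16-sample fundamental period.
N, P = 128, 16
x = [sum(math.cos(2 * math.pi * k * n / P) for k in range(1, 5))
     for n in range(N)]

c = real_cepstrum(x)
# The cepstral peak (away from quefrency 0) falls at the fundamental period.
peak_q = max(range(4, 25), key=lambda q: c[q])
```

On gearbox vibration, each family of sidebands around a mesh frequency contributes such a peak, which is why the cepstrum summarizes the power of a whole sideband family in one coefficient, as described above.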
Abstract: Project managers are ultimately responsible for the overall characteristics of a project, i.e., they should deliver the project on time with minimum cost and maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and they would benefit from any scientific decision support tool. Our work tries to determine a set of optimal solutions (rather
than a single optimal solution) from which the project manager will
select his desirable choice to run the project. In this paper, the
problem in project scheduling notated as
(1,T|cpm,disc,mu|curve:quality,time,cost) will be studied. The
problem is multi-objective and the purpose is finding the Pareto
optimal front of time, cost and quality of a project
(curve:quality,time,cost), whose activities belong to a start to finish
activity relationship network (cpm) and they can be done in different
possible modes (mu) which are non-continuous or discrete (disc), and
each mode has a different cost, time, and quality. The project is constrained to a non-renewable resource, i.e., money (1,T). Because
the problem is NP-Hard, to solve the problem, a meta-heuristic is
developed based on a version of genetic algorithm specially adapted
to solve multi-objective problems namely FastPGA. A sample project
with 30 activities is generated and then solved by the proposed
method.
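The notion of Pareto optimality that FastPGA searches for can be made concrete with a small sketch (hypothetical schedules, with quality negated so that all three objectives are minimized):

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly
    # better in at least one.
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    # Keep the points not dominated by any other point.
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical schedules as (time, cost, -quality) triples, all minimized.
schedules = [
    (10, 100, -0.9),   # slow, cheap, high quality
    (6, 180, -0.8),    # fast, expensive
    (8, 150, -0.85),   # balanced
    (9, 160, -0.7),    # dominated by (8, 150, -0.85)
]
front = pareto_front(schedules)
```

The genetic algorithm's job is to evolve a well-spread approximation of this front over the full mode-assignment space, from which the project manager then picks a preferred trade-off.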
Abstract: In the 3D-wavelet video coding framework, temporal filtering is done along the trajectory of motion using Motion Compensated Temporal Filtering (MCTF). Hence, a computationally efficient motion estimation technique is needed for MCTF. In this paper, a predictive technique is proposed in order to reduce the
computational complexity of the MCTF framework, by exploiting
the high correlation among the frames in a Group Of Picture (GOP).
The proposed technique applies coarse and fine searches of any fast
block based motion estimation, only to the first pair of frames in a
GOP. The generated motion vectors are supplied to the next
consecutive frames, even to subsequent temporal levels and only fine
search is carried out around those predicted motion vectors. Hence
coarse search is skipped for all the motion estimation in a GOP
except for the first pair of frames. The technique has been tested for
different fast block based motion estimation algorithms over different
standard test sequences using MC-EZBC, a state-of-the-art scalable
video coder. The simulation results reveal a substantial reduction (i.e., 20.75% to 38.24%) in the number of search points during motion
estimation, without compromising the quality of the reconstructed
video compared to non-predictive techniques. Since the motion vectors of all the pairs of frames in a GOP except the first lie within ±1 of the motion vectors of the previous pair, the number of bits required for motion vectors is also reduced by 50%.
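The refinement step can be sketched as follows, using hypothetical frames and block-matching code that illustrates the idea rather than the MC-EZBC implementation: a full search is run only on the first frame pair, and later pairs search just the ±1 neighbourhood of the predicted vector:

```python
import random

W, BS = 24, 4  # frame width/height and block size

def sad(cur, ref, bx, by, dx, dy):
    # Sum of absolute differences between a block in `cur` and a
    # displaced block in the reference frame `ref`.
    return sum(abs(cur[by + y][bx + x] - ref[by + y + dy][bx + x + dx])
               for y in range(BS) for x in range(BS))

def full_search(cur, ref, bx, by, rng=4):
    # Exhaustive coarse+fine search over a +/-rng window (first pair only).
    cands = [(dx, dy) for dy in range(-rng, rng + 1)
             for dx in range(-rng, rng + 1)]
    return min(cands, key=lambda d: sad(cur, ref, bx, by, *d))

def fine_search(cur, ref, bx, by, pred):
    # Later frame pairs: refine only within +/-1 of the predicted vector.
    cands = [(pred[0] + dx, pred[1] + dy)
             for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    return min(cands, key=lambda d: sad(cur, ref, bx, by, *d))

def shift(f, dx, dy):
    # Translate frame content by (dx, dy) with wrap-around.
    return [[f[(y - dy) % W][(x - dx) % W] for x in range(W)] for y in range(W)]

random.seed(7)
f0 = [[random.randrange(256) for _ in range(W)] for _ in range(W)]
f1 = shift(f0, 2, 1)   # content moves by (+2, +1) each frame, so the
f2 = shift(f1, 2, 1)   # best match in the previous frame lies at (-2, -1)

mv1 = full_search(f1, f0, 8, 8)        # full search on the first pair
mv2 = fine_search(f2, f1, 8, 8, mv1)   # +/-1 refinement around mv1
```

Because the second pair's vector falls inside the ±1 window around the first pair's vector, the coarse stage is skipped entirely, which is the source of the 20.75%-38.24% search-point reduction reported above.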