Abstract: The effect of the autofrettage process on strain-hardened
thick-walled pressure vessels has been investigated theoretically by
finite element modeling. The equivalent von Mises stress is used as the yield
criterion to evaluate the optimum autofrettage pressure and the
optimum radius of the elastic-plastic junction. It has been observed that
the optimum autofrettage pressure increases along with the working
pressure. For two different working pressures, the effect of the ratio
of outer to inner radius (b/a = k) on the optimum autofrettage
pressure is also examined. The optimum autofrettage pressure
depends solely on the k value rather than on the inner or outer radius individually.
Furthermore, the percentage reduction of von Mises stress is compared
for different working pressures and different k values. The maximum von
Mises stress developed at different autofrettage pressures is compared
for elastic perfectly plastic and elastic-plastic materials with different
slopes of the strain-hardening segment. A cylinder material with a higher
slope of the strain-hardening segment gains greater benefit from the
autofrettage process.
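For reference, the yield criterion named above can be written in its standard textbook form (these are the classical Lamé stresses and the von Mises equivalent stress for a thick-walled cylinder; they are not reproduced from the paper itself):

```latex
% Lamé elastic stresses in a thick-walled cylinder with inner radius a,
% outer radius b, under internal pressure p, at radius r:
\sigma_r(r)      = \frac{p\,a^2}{b^2 - a^2}\left(1 - \frac{b^2}{r^2}\right),
\qquad
\sigma_\theta(r) = \frac{p\,a^2}{b^2 - a^2}\left(1 + \frac{b^2}{r^2}\right)

% Equivalent von Mises stress (with \sigma_z the axial stress), used as
% the yield criterion:
\sigma_{vM} = \sqrt{\tfrac{1}{2}\left[(\sigma_r - \sigma_\theta)^2
            + (\sigma_\theta - \sigma_z)^2
            + (\sigma_z - \sigma_r)^2\right]}
```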
Abstract: Enterprise Architecture (EA) is a framework for the description, coordination and alignment of all activities across an organization in order to achieve strategic goals using ICT enablers. A number of EA-compatible frameworks have been developed. In this paper, we mainly focus on the Federal Enterprise Architecture Framework (FEAF), since its reference models are plentiful. Among these models, we are interested here in its Business Reference Model (BRM). The test process is an important part of an EA project which is somewhat overlooked. This lack of attention may cause drawbacks or even failure of an enterprise architecture project. To address this issue, we use the International Software Testing Qualifications Board (ISTQB) framework and standard test suites to present a method to improve the EA testing process. The main challenge is how to map between the concepts of EA and ISTQB. In this paper, we propose a method for integrating these concepts.
Abstract: A circular knitting machine produces fabric with more than two knitting tools. Variation of yarn tension between different knitting tools causes different loop lengths of stitches during the knitting process. In this research, a new intelligent method is applied to control the loop length of stitches across the tools, based on the ideal shape of stitches and the real angle of the stitch direction, since different loop lengths cause stitch deformation and deviation of the stitch angle. To measure the deviation of stitch direction under varying tensions, an image processing technique was applied to pictures of different fabrics under constant front light. The measured rate of deformation was then translated into the compensation of the loop length cam degree needed to correct the stitch deformation. A fuzzy control algorithm was applied to modify the loop length in the knitting tools. The presented method was tested on different knitted fabrics of various structures and yarns. The results show that the presented method is usable for controlling loop length variation between different knitting tools based on stitch deformation, for various knitted fabrics with different fabric structures, densities and yarn types.
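The fuzzy control step described above can be sketched as follows. This is a minimal illustration assuming triangular membership functions over the stitch-angle deviation and three hand-picked rules with weighted-average defuzzification; the actual membership functions, rule base and cam-degree scaling used in the paper are not specified here.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_cam_correction(angle_dev):
    """Map stitch-angle deviation (degrees) to a loop-length cam correction.
    Rules (illustrative assumptions, not the paper's tuned values):
    negative deviation -> lengthen loop, near zero -> hold,
    positive deviation -> shorten loop."""
    rules = [
        (tri(angle_dev, -10.0, -5.0, 0.0), +2.0),   # lengthen
        (tri(angle_dev,  -5.0,  0.0, 5.0),  0.0),   # hold
        (tri(angle_dev,   0.0,  5.0, 10.0), -2.0),  # shorten
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

For example, a deviation of -5 degrees fires the "lengthen" rule fully and yields a correction of +2.0, while a zero deviation yields no correction.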
Abstract: The tensile properties of Mg-3%Al nanocrystalline
alloys were investigated in different test environments. Bulk
nanocrystalline samples of this alloy were successfully prepared by
mechanical alloying (MA) followed by cold compaction, sintering,
and hot extrusion. The crystal size of the consolidated milled
sample was calculated by X-ray line profile analysis. At room
temperature, a relatively lower value of activation volume (AV) and
a higher value of strain rate sensitivity (SRS) suggest that a new
rate-controlling mechanism accommodates plastic flow in the present
nanocrystalline samples. The deformation behavior and
microstructural characteristics of the samples under different test
conditions are discussed in detail.
Abstract: In view of growing competition in the service sector,
services are as much in need of modeling, analysis and improvement
as business or working processes. Graphical process models are
important means to capture process-related know-how for an
effective management of the service process. In this contribution, a
human performance analysis of process model development was
conducted, paying special attention to model development time and
working method. It was found that modelers with higher application
experience needed significantly less time for mental activities than
modelers with lower application experience, spent more time on
labeling graphical elements, and achieved higher process model
quality in terms of activity label quality.
Abstract: Iris-based biometric authentication is gaining importance
in recent times. Iris biometric processing, however, is a complex
and computationally very expensive process. In the overall processing
of iris biometric in an iris-based biometric authentication system,
feature processing is an important task. In feature processing, we extract
iris features, which are ultimately used in matching. Since the number
of iris features is large and computational time increases with the
number of features, it is a challenge to develop an iris processing
system with as few features as possible without compromising
correctness.
In this paper, we address this issue and present an approach to feature
extraction and feature matching process. We apply Daubechies D4
wavelet with 4 levels to extract features from iris images. These
features are encoded with 2 bits by quantizing into 4 quantization
levels. With our proposed approach it is possible to represent an
iris template with only 304 bits, whereas existing approaches require
as many as 1024 bits. In addition, we assign different weights to
different iris regions when comparing two iris templates, which significantly
increases the accuracy. Further, we match iris templates based on
a weighted similarity measure. Experimental results on several iris
databases substantiate the efficacy of our approach.
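The 2-bit encoding and weighted matching can be sketched as follows. The quantization thresholds, region weights and toy feature values are illustrative assumptions, and the Daubechies D4 wavelet decomposition that produces the features is omitted.

```python
# Sketch: quantize real-valued features into 2-bit codes (4 levels)
# and compare two templates with a weighted similarity measure.

def quantize_2bit(features, thresholds=(-0.5, 0.0, 0.5)):
    """Map each feature to one of 4 quantization levels (2 bits each)."""
    return [sum(f > t for t in thresholds) for f in features]

def weighted_similarity(codes_a, codes_b, weights):
    """Weighted fraction of positions where the 2-bit codes agree."""
    score = sum(w for a, b, w in zip(codes_a, codes_b, weights) if a == b)
    return score / sum(weights)

t1 = quantize_2bit([-0.8, -0.2, 0.3, 0.9])   # -> [0, 1, 2, 3]
t2 = quantize_2bit([-0.7, 0.1, 0.2, 1.1])    # -> [0, 2, 2, 3]
sim = weighted_similarity(t1, t2, [2.0, 1.0, 1.0, 2.0])  # agrees at 3 of 4 positions
```

At 2 bits per feature, a 152-feature template occupies exactly the 304 bits mentioned in the abstract.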
Abstract: Skin color can provide a useful and robust cue
for human-related image analysis, such as face detection,
pornographic image filtering, hand detection and tracking,
people retrieval in databases and the Internet, etc. The major
problem with such skin color detection algorithms is that they are
time consuming and hence cannot be applied in a real-time system.
To overcome this problem, we introduce a new
fast technique for skin detection which can be applied in a real
fast technique for skin detection which can be applied in a real
time system. In this technique, instead of testing each image
pixel to label it as skin or non-skin (as in classic techniques),
we skip a set of pixels. The rationale for skipping is
the high probability that neighbors of skin pixels are
also skin pixels, especially in adult images, and vice versa. The
proposed method can rapidly detect skin and non-skin
pixels, which in turn dramatically reduces the CPU time
required for the detection process. Since many fast detection
techniques are based on image resizing, we apply our
proposed pixel skipping technique with image resizing to
obtain better results. The performance evaluation of the
proposed skipping and hybrid techniques in terms of the
measured CPU time is presented. Experimental results
demonstrate that the proposed methods achieve better results
than the relevant classic method.
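The pixel-skipping idea can be sketched as below. The RGB skin rule is a common heuristic chosen for illustration, not necessarily the paper's classifier, and propagating the tested pixel's label to its skipped neighbors is a simplifying assumption.

```python
# Sketch: test only every `step`-th pixel in a row and propagate its
# label to the skipped neighbors, exploiting the spatial coherence of
# skin regions.

def is_skin(px):
    # Illustrative RGB heuristic (not necessarily the paper's classifier).
    r, g, b = px
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

def detect_skin_skipping(image, step=4):
    """Classify pixels as skin/non-skin, testing one pixel per `step`
    and copying the result to the skipped pixels in the same run."""
    h, w = len(image), len(image[0])
    labels = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(0, w, step):
            lab = is_skin(image[y][x])
            for dx in range(step):          # propagate to skipped pixels
                if x + dx < w:
                    labels[y][x + dx] = lab
    return labels

# Toy 1x8 image: four skin-like pixels followed by four dark pixels.
img = [[(200, 120, 80)] * 4 + [(10, 10, 10)] * 4]
mask = detect_skin_skipping(img, step=4)
```

With `step=4`, only a quarter of the pixels are actually classified, which is the source of the CPU-time reduction discussed above.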
Abstract: Fingerprint-based identification is one of the well-known
biometric systems in the area of pattern recognition and has long
been studied for its important role in forensic
science, where it can help the government criminal justice community. In
this paper, we propose a framework for identifying individuals by
means of fingerprints. Unlike most conventional
fingerprint identification frameworks, the extracted geometrical
element features (GEFs) go through a discretization process.
The intention of discretization in this study is to obtain
individually unique features that reflect inter-individual variance in order
to discriminate one person from another. Previously, discretization
has shown particularly efficient identification on English
handwriting with an accuracy of 99.9% and on discrimination of twins'
handwriting with an accuracy of 98%. Due to its high discriminative
power, this method is adopted into the framework as an independent
method to assess the accuracy of fingerprint identification.
The experimental results show that the identification accuracy of
the proposed system using discretization is 100%
for FVC2000, 93% for FVC2002 and 89.7% for FVC2004, which is
much better than the conventional, existing fingerprint
identification system (72% for FVC2000, 26% for FVC2002 and
32.8% for FVC2004). The results indicate that the discretization
approach effectively boosts classification, and it may
therefore be suitable for other biometric features besides
handwriting and fingerprints.
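In its simplest equal-width form, the discretization step can be sketched as follows. The bin count, value range and feature values are illustrative assumptions; the paper's exact discretization scheme is not specified in the abstract.

```python
# Sketch: equal-width discretization of continuous feature values into
# integer bin indices, one simple instance of the discretization idea.

def discretize(values, vmin, vmax, n_bins):
    """Map each continuous feature value to a bin index in [0, n_bins - 1]."""
    width = (vmax - vmin) / n_bins
    out = []
    for v in values:
        idx = int((v - vmin) / width)
        out.append(min(max(idx, 0), n_bins - 1))  # clamp boundary values
    return out

codes = discretize([0.0, 2.5, 5.0, 9.99], vmin=0.0, vmax=10.0, n_bins=4)
# codes == [0, 1, 2, 3]
```

The resulting integer codes replace the raw continuous features in subsequent matching, which is what gives discretization its discriminative compactness.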
Abstract: This paper proposes a low-voltage and low-power
fully integrated digitally tuned continuous-time channel selection
filter for WiMAX applications. A 5th-order elliptic low-pass filter is
realized in a Gm-C topology. The bandwidth of the fully differential
filter is reconfigurable from 2.5MHz to 20MHz (8x) for different
requirements in WiMAX applications. The filter is simulated in a
standard 90nm CMOS process. Simulation results show the THD
(@Vout =100mVpp) is less than -66dB. The in-band ripple of the
filter is about 0.15dB. The filter consumes 1.5mW from a supply
voltage of 0.9V.
Abstract: In this research, forming limit diagrams (FLDs) for supertension
sheet metals used in the automobile industry have
been obtained. The strains exerted on the sheet metals have been
measured with four different methods, and the errors of each method
are also reported. These methods have been compared with
each other, and the most efficient and economical way of extracting the
exerted strains has been identified. In this paper, the total
error and uncertainty of the FLD extraction procedures are
derived. Determining the measurement uncertainty in extracting
the FLD is of great importance in the design and analysis of sheet
metal forming processes.
Abstract: This paper introduces a new instantaneous frequency
computation approach, Counting Instantaneous Frequency, for a
general class of signals called simple waves. The class of simple waves
contains a wide range of continuous signals for which the concept of
instantaneous frequency has a clear physical sense. The concept of
Counting Instantaneous Frequency also applies to all discrete data.
For all simple wave signals and discrete data, the Counting
Instantaneous Frequency can be computed directly, without a signal
decomposition process. The intrinsic mode functions obtained through
empirical mode decomposition belong to the class of simple waves, so
the Counting Instantaneous Frequency can be used together with
empirical mode decomposition.
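A classical crossing-counting frequency estimate, related in spirit to the counting approach described above (though not necessarily the paper's exact definition), can be sketched as below; the test frequency, sampling rate and phase are illustrative.

```python
import math

def zero_crossing_frequency(samples, fs):
    """Estimate the dominant frequency of a sampled signal by counting
    sign changes: f ~= crossings / (2 * duration)."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a >= 0) != (b >= 0)
    )
    duration = (len(samples) - 1) / fs
    return crossings / (2.0 * duration)

fs = 1000.0   # sampling rate in Hz
sig = [math.sin(2 * math.pi * 50.0 * n / fs + 0.1) for n in range(1000)]
est = zero_crossing_frequency(sig, fs)   # close to 50 Hz
```

Because the estimate works directly on the samples, no signal decomposition is required, mirroring the direct-computation property claimed above.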
Abstract: This research aims to study the value-creation process of
producing monk's bowls, a traditional Thai handicraft, which is facing
problems in adapting to a changing society. It also aims to identify
problems and obstacles to value creation. The research is based on a
case study of monk's bowl manufacturers from Ban-Baat Village,
Bangkok. The conceptual framework is based on the value chain model
to analyze the process. The research methodology is qualitative. The
research found that the value-creation process of monk's bowls
consists of eight activities that add value to the products and
increase profits to the producers in return. Five major problems and
obstacles were found. The research suggests that these problems and
obstacles limit the manufacturers' potential for creating more valued
products and lead to business stagnation. These problems should be
addressed and solved through collaboration among the government, the
private sector and the manufacturers.
Abstract: Each year, many people are reported missing in most countries of the world for various reasons. Arrangements have to be made to find these people after some time, so investigating agencies are compelled to trace them using manpower. In many cases, however, investigations to find a person who has been missing for a long time may not be successful. It may then be difficult to identify these people by examining their old photographs, because their facial appearance might have changed, mainly due to the natural aging process. On some occasions in forensic medicine, if a dead body is found, investigations must establish whether the corpse belongs to a person who disappeared some time ago. With the passage of time, the face of the person might have changed, and there should be a mechanism to reveal the person's identity. To make this process easier, we must estimate how the person would look now. To address this problem, this paper presents a way of synthesizing a facial image with aging effects.
Abstract: The aim of the study was to determine how different
ripening processes (traditional vs. industrial) influenced the
proteolysis in traditional Serbian dry-fermented sausage Petrovská
klobása. The obtained results indicated more intensive pH decline
(0.7 units after 9 days) in industrially ripened products (I), which had a
positive impact on the drying process and proteolytic changes in these
samples. Thus, the moisture content in I sausages was lower at each
sampling time, amounting to 24.7% at the end of the production period
(90 days). Likewise, the process of proteolysis was more pronounced
in I samples, resulting in higher contents of non-protein nitrogen
(NPN) and free amino acids nitrogen (FAAN), as well as in faster
and more intensive degradation of myosin (≈220 kDa), actin (≈45
kDa) and other polypeptides during processing. Consequently, the
appearance and accumulation of several protein fragments were
registered.
Abstract: Commercially available lipases (Candida antarctica lipase B, Novozyme 435, Thermomyces lanuginosus lipase, and Lipozyme TL IM), as well as sol-gel immobilized lipases, have been screened for their ability to acylate regioselectively xylitol, sorbitol, and mannitol with a phenolic ester in a binary mixture of t-butanol and dimethylsulfoxide. HPLC and MALDI-TOF MS analysis revealed the exclusive formation of monoesters for all studied sugar alcohols. The lipases immobilized by the sol-gel entrapment method proved to be efficient catalysts, leading to high conversions (up to 60%) in the investigated acylation reactions. From a sequence of silane precursors with different nonhydrolyzable groups in their structure, the presence of octyl and i-butyl group was most beneficial for the catalytic activity of sol-gel entrapped lipases in the studied process.
Abstract: The water 2H NMR signal on the surface of the nano-silica material MCM-41 consists of two overlapping resonances. The 2H water spectrum shows a superposition of a Lorentzian line shape and the familiar NMR powder pattern line shape, indicating the existence of two spin components. Chemical exchange occurs between these two groups. Decomposition of the two signals is a crucial starting point for studying the exchange process. In this article, we have determined these spin component populations along with other important parameters for the 2H water NMR signal over a temperature range between 223 K and 343 K.
Abstract: Applying knowledge discovery techniques to unstructured
text is termed knowledge discovery in text (KDT), text data mining,
or text mining. The decision tree approach is most useful in
classification problems. With this technique, a tree is constructed
to model the classification process. There are two basic steps in the
technique: building the tree and applying the tree to the database.
This paper describes a proposed C5.0 classifier that applies rulesets,
cross-validation and boosting to the original C5.0 in order to reduce
the error rate. The feasibility and the benefits of the proposed
approach are demonstrated on a medical data set (hypothyroid). The
performance of a classifier on the training cases from which it was
constructed gives a poor estimate of its accuracy; by sampling or by
using a separate test file, the classifier is instead evaluated on
cases that were not used to build it. If the cases in hypothyroid.data
and hypothyroid.test were shuffled and divided into a new 2772-case
training set and a 1000-case test set, C5.0 might construct a
different classifier with a lower or higher error rate on the test
cases. An important feature of See5 is its ability to generate
classifiers called rulesets; the ruleset has an error rate of 0.5%
on the test cases. The standard errors of the means provide an
estimate of the variability of results. One way to get a more reliable
estimate of predictive accuracy is f-fold cross-validation: the error
rate of a classifier produced from all the cases is estimated as the
ratio of the total number of errors on the hold-out cases to the total
number of cases. The Boost option with x trials instructs See5 to
construct up to x classifiers in this manner. Trials over numerous
datasets, large and small, show that on average 10-classifier boosting
reduces the error rate for test cases by about 25%.
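The f-fold cross-validation estimate described above can be sketched as follows, with a trivial majority-class learner standing in for C5.0; the learner and data are illustrative assumptions.

```python
from collections import Counter

def cross_validation_error(cases, labels, train_fn, predict_fn, f=10):
    """f-fold cross-validation: hold each fold out in turn, train on the
    rest, and pool hold-out errors. The error rate is the ratio of total
    hold-out errors to total cases."""
    n = len(cases)
    errors = 0
    for i in range(f):
        hold = set(range(i, n, f))          # every f-th case forms fold i
        train_x = [c for j, c in enumerate(cases) if j not in hold]
        train_y = [l for j, l in enumerate(labels) if j not in hold]
        model = train_fn(train_x, train_y)
        errors += sum(1 for j in hold if predict_fn(model, cases[j]) != labels[j])
    return errors / n

def train_majority(xs, ys):
    return Counter(ys).most_common(1)[0][0]   # learn the majority label

def predict_majority(model, x):
    return model                              # always predict that label

xs = list(range(20))
ys = [0] * 15 + [1] * 5
err = cross_validation_error(xs, ys, train_majority, predict_majority, f=5)
# each fold holds out one minority case, which the majority model misses
```

Every case serves as a hold-out exactly once, which is why the pooled error is a more reliable estimate than the resubstitution error on the training cases.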
Abstract: Improving performance measures in the construction
processes has been a major concern for managers and decision
makers in the industry. They seek ways to recognize the key
factors which have the largest effect on the process. Identifying such
factors can guide them to focus on the right parts of the process in
order to gain the best possible result. In the present study, design of
experiments (DOE) has been applied to a computer simulation model
of a bricklaying process to determine significant factors, with
productivity chosen as the response of the experiment. To
this end, four controllable factors and their interactions have been
examined, and the best level has been calculated for each
factor. The results indicate that three factors, namely brick labor,
mortar labor and mortar inter-arrival time, along with the interaction
of brick labor and mortar labor, are significant.
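The main-effect calculation underlying such a DOE analysis can be sketched for a two-level design. The factor names follow the abstract, but the design matrix and numeric productivity responses are invented for illustration.

```python
def main_effect(responses, levels):
    """Main effect of a factor in a 2-level design: mean response at the
    high level (+1) minus mean response at the low level (-1)."""
    hi = [r for r, l in zip(responses, levels) if l == +1]
    lo = [r for r, l in zip(responses, levels) if l == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# Hypothetical 2^2 design: factor A = brick labor, factor B = mortar labor
# (names from the abstract; the responses below are invented).
A = [-1, +1, -1, +1]
B = [-1, -1, +1, +1]
y = [10.0, 14.0, 11.0, 17.0]   # productivity responses (illustrative)

effect_A = main_effect(y, A)                              # 5.0
effect_AB = main_effect(y, [a * b for a, b in zip(A, B)]) # interaction A x B
```

An interaction effect is estimated the same way, using the elementwise product of the two factor columns as the "level" column.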
Abstract: This study was carried out to investigate the changes in
quality parameters of rye bread packaged in different polymer films
during a convection air-flow thermal treatment process. Whole loaves of
bread were placed in polymer pouches, which were sealed under reduced
air pressure. The bread was thermally treated at temperatures of
(130, 140 and 150) ± 5 °C for up to 40 min, until the core temperature
of the samples reached +80 ± 1 °C. For bread packaging, anti-fog
Mylar®OL12AF pouches and a thermo-resistant combined polymer material
were used. The main quality parameters were analysed using standard
methods: temperature in the bread core, bread crumb and crust firmness,
starch granule volume and microflora. The current research showed
that the polymer films significantly influence changes in rye bread
quality parameters during thermal treatment. The thermo-resistant
combined polymer material film can be recommended for pasteurization
of packaged rye bread, to best preserve the bread quality parameters.
Abstract: Rule Discovery is an important technique for mining
knowledge from large databases. Use of objective measures for
discovering interesting rules leads to another data mining problem,
although of reduced complexity. Data mining researchers have
studied subjective measures of interestingness to reduce the volume
of discovered rules and ultimately improve the overall efficiency of
the KDD process.
In this paper we study novelty of the discovered rules as a
subjective measure of interestingness. We propose a hybrid approach
based on both objective and subjective measures to quantify novelty
of the discovered rules in terms of their deviations from the known
rules (knowledge). We analyze the types of deviation that can arise
between two rules and categorize the discovered rules according to
the user specified threshold. We implement the proposed framework
and experiment with some public datasets. The experimental results
are promising.
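The deviation-based categorization can be sketched as below. The deviation measure and threshold are illustrative assumptions, since the paper's exact metric is not given in the abstract; a rule is modeled as a pair (antecedent item set, consequent).

```python
def deviation(rule, known):
    """Deviation of a discovered rule from a known rule: fraction of
    differing antecedent items plus a consequent mismatch, each weighted
    equally. (Illustrative measure, not the paper's exact metric.)"""
    ant_r, cons_r = rule
    ant_k, cons_k = known
    diff = len(ant_r ^ ant_k) / max(len(ant_r | ant_k), 1)
    return 0.5 * diff + 0.5 * (cons_r != cons_k)

def categorize(discovered, knowledge, threshold=0.5):
    """Label a discovered rule 'novel' if its smallest deviation from
    every known rule exceeds the user-specified threshold."""
    out = []
    for rule in discovered:
        d = min(deviation(rule, k) for k in knowledge)
        out.append((rule, 'novel' if d > threshold else 'known'))
    return out

knowledge = [({'a', 'b'}, 'x')]
discovered = [({'a', 'b'}, 'x'), ({'c', 'd'}, 'y')]
result = categorize(discovered, knowledge)
# the first rule matches known knowledge; the second deviates fully
```

Taking the minimum over the knowledge base ensures a rule is only called novel when it deviates from *every* known rule, matching the thresholding idea described above.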