Abstract: This paper presents the development of an ignition system using spark electrodes for application in a research explosion vessel.
The aim is to discharge a single spark with quantifiable ignition energy. The spark electrode system would enable the study of flame
propagation, the ignitability of fuel-air mixtures, and other fundamental characteristics of flames. The principle of the ASTM capacitive
spark circuit is studied: an appropriate capacitance connected across the spark gap is charged through a large resistor from a high-voltage
power supply until a spark is initiated. Different spark energies can be obtained mainly by varying the capacitance and the supply current.
The spark sizes produced are found to be affected by the spark gap, electrode size, input voltage, and capacitance value.
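As a hedged sketch of the energy relation behind such a capacitive circuit: the stored spark energy follows E = ½CV², so varying the capacitance and charging voltage spans a wide range of ignition energies. The component values below are illustrative assumptions, not values from the paper.

```python
# Spark energy of a capacitive discharge circuit: E = 1/2 * C * V^2.
# The capacitance and voltage values below are illustrative
# assumptions, not taken from the paper.

def spark_energy(capacitance_f, voltage_v):
    """Energy (joules) stored in a capacitor charged to voltage_v."""
    return 0.5 * capacitance_f * voltage_v ** 2

if __name__ == "__main__":
    for c in (100e-12, 1e-9, 10e-9):          # 100 pF .. 10 nF
        for v in (5e3, 10e3):                  # 5 kV, 10 kV
            e_mj = spark_energy(c, v) * 1e3    # convert J -> mJ
            print(f"C={c:.0e} F, V={v:.0e} V -> E={e_mj:.3f} mJ")
```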
Abstract: I/O workload is a critical factor in analyzing I/O
patterns and maximizing file system performance.
However, measuring the I/O workload of a running distributed parallel file
system is non-trivial due to collection overhead and the large volume of
data. In this paper, we measured and analyzed file system activities on
two large-scale cluster systems with TFlops-level high-performance
computation resources. By comparing the file system
activities of 2009 with those of 2006, we analyzed how I/O
workloads changed with developments in system performance and high-speed
network technology.
Abstract: Two commercial proteases from Bacillus
licheniformis (Alcalase 2.4 L FG and Alcalase 2.5 L, Type DX) were
screened for the production of Z-Ala-Phe-NH2 in batch reaction.
Alcalase 2.4 L FG was the most efficient enzyme for the C-terminal
amidation of Z-Ala-Phe-OMe using ammonium carbamate as
ammonium source. Immobilization of the protease was achieved by
the sol-gel method, using dimethyldimethoxysilane (DMDMOS) and
tetramethoxysilane (TMOS) as precursors (unpublished results). In
batch production, about 95% of Z-Ala-Phe-NH2 was obtained at
30°C after 24 hours of incubation. Reproducibility of different
batches of commercial Alcalase 2.4 L FG preparations was also
investigated by evaluating the amidation activity and the entrapment
yields in the case of immobilization. A packed-bed reactor (0.68 cm
ID, 15.0 cm long) was operated successfully for the continuous
synthesis of peptide amides. The immobilized enzyme retained its
initial activity over 10 cycles of repeated use in the continuous reactor at
ambient temperature. At a substrate-mixture flow rate of 0.75 mL/min,
total conversion of Z-Ala-Phe-OMe was achieved after 5
hours of substrate recycling. The product contained about 90%
peptide amide and 10% hydrolysis byproduct.
Abstract: A boundary layer wind tunnel facility was used to conduct experimental measurements of the flow field around a model of the
Panorama Giustinelli Building in Trieste (Italy). Information on the main flow structures was obtained by means of flow visualization
techniques and compared with numerical predictions of the vortical structures shed above the roof, in order to investigate the optimal
positioning of a vertical-axis wind energy conversion system; good agreement between experimental measurements and numerical
predictions was registered.
Abstract: A background estimation approach using a small-window
median filter is presented, on the basis of analyzing models of the IR
point target, noise, and clutter. After simplifying the two-dimensional
filter, a simple method adopting a one-dimensional median filter is
illustrated for estimating the background according to the
characteristics of the IR scanning system. An adaptive threshold is then
used to segment the background-cancelled image. Experimental results
show that the algorithm achieves good performance and satisfies the
requirement of real-time processing of large images.
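The one-dimensional estimation step described above can be sketched as follows; this is a minimal illustration assuming a sliding-window median along each scan line and a simple mean-plus-k-sigma threshold on the residual, with the window size and factor k chosen arbitrarily rather than taken from the paper.

```python
# Sketch: background estimation with a 1-D sliding median filter,
# then background cancellation and adaptive thresholding.
# Window size and threshold factor k are illustrative assumptions.
from statistics import median

def median_background(line, window=5):
    """Estimate the background of one scan line with a sliding median."""
    half = window // 2
    n = len(line)
    return [median(line[max(0, i - half):min(n, i + half + 1)])
            for i in range(n)]

def detect_targets(line, window=5, k=3.0):
    """Cancel the background, then keep samples above mean + k*std."""
    bg = median_background(line, window)
    residual = [x - b for x, b in zip(line, bg)]
    mu = sum(residual) / len(residual)
    var = sum((r - mu) ** 2 for r in residual) / len(residual)
    thresh = mu + k * var ** 0.5
    return [i for i, r in enumerate(residual) if r > thresh]
```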
Abstract: Spatial and mobile computing are evolving. This paper
describes a smart modeling platform called "GeoSEMA". The
approach aims to model multidimensional GeoSpatial Evolutionary
and Mobile Agents. Beyond 3D and location-based issues, there
are other dimensions that may characterize spatial agents, e.g.
discrete/continuous time and agent behaviors. GeoSEMA is conceived as a
dedicated design pattern motivating temporal geographic-based
applications; it is a firm foundation for multipurpose and
multidimensional spatial-based applications. It deals with
multipurpose smart objects (buildings, shapes, missiles, etc.) by
stimulating geospatial agents.
Formally, GeoSEMA refers to geospatial, spatio-evolutive and
mobile space constituents, and a conceptual geospatial space model
is given in this paper. In addition to modeling and categorizing
geospatial agents, the model incorporates the concept of inter-agent
event-based protocols. Finally, a rapid software-architecture
prototype of the GeoSEMA platform is also given; it will be
implemented and validated in the next phase of our work.
Abstract: In a competitive production environment, critical
decisions are based on data obtained by random sampling of
product units. The efficiency of these decisions depends on the quality
of the data and on their reliability, which leads to the need for a
reliable measurement system. The process of identifying and
analysing measurement errors is known as
Measurement System Analysis (MSA). The aim of this research is to
establish the necessity of, and assurance for, extensive development in
analysing measurement systems, particularly the use of
Gage Repeatability and Reproducibility (GR&R) studies to improve
physical measurements. Although gage repeatability and
reproducibility studies are now well established in manufacturing
industries, they are not applied as widely as other measurement
system analysis methods. To introduce this method and gain feedback
on improving measurement systems, this survey focuses on the
"ANOVA" method as the most widespread way of calculating
Repeatability and Reproducibility (R&R).
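The ANOVA calculation of R&R for a balanced, crossed gage study can be sketched as below; the variance-component formulas are the classical two-way ANOVA (with interaction) estimates, and the example data are hypothetical.

```python
# Sketch of the ANOVA method for a Gage R&R study on a balanced,
# crossed design (parts x operators, r replicates per cell).
# Variance components follow the classical two-way ANOVA with
# interaction; the measurement numbers are illustrative.

def grr_anova(data):
    """data[p][o] = list of r replicate measurements.
    Returns (repeatability variance, reproducibility variance)."""
    p, o = len(data), len(data[0])
    r = len(data[0][0])
    cells = [[sum(c) / r for c in row] for row in data]
    part_means = [sum(row) / o for row in cells]
    op_means = [sum(cells[i][j] for i in range(p)) / p for j in range(o)]
    grand = sum(part_means) / p

    # Mean squares: error (within-cell), operator, part*operator.
    ms_e = sum((x - cells[i][j]) ** 2
               for i in range(p) for j in range(o)
               for x in data[i][j]) / (p * o * (r - 1))
    ms_o = p * r * sum((m - grand) ** 2 for m in op_means) / (o - 1)
    ms_po = r * sum((cells[i][j] - part_means[i] - op_means[j] + grand) ** 2
                    for i in range(p) for j in range(o)) / ((p - 1) * (o - 1))

    repeatability = ms_e
    interaction = max((ms_po - ms_e) / r, 0.0)   # clamp negatives to 0
    operator = max((ms_o - ms_po) / (p * r), 0.0)
    return repeatability, operator + interaction
```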
Abstract: Many methods exist for measuring or estimating
evaporation from free water surfaces. Evaporation pans provide one
of the simplest, least expensive, and most widely used methods of
estimating evaporative losses. In this study, the rate of evaporation
from a water surface was calculated by modeling, with
application to dams in wet, arid, and semi-arid areas of Algeria.
We first calculate the evaporation rate from the pan using the energy
budget equation, which offers the advantage of ease of use, but
our results do not agree completely with the measurements taken by
the National Agency at dams located in areas
of different climates. We therefore develop a mathematical model to
simulate evaporation. This simulation combines an energy budget at the
level of a measurement pan with Computational Fluid Dynamics
software (Fluent). The evaporation rates calculated by the
two methods are then compared with the in-situ measurements.
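A minimal sketch of the energy-budget step, assuming the latent-heat flux is the residual of net radiation minus the ground and sensible heat fluxes, converted to an equivalent evaporated depth; the flux values are illustrative, not the paper's measurements.

```python
# Sketch of a pan energy-budget estimate of evaporation:
# the latent-heat flux LE = Rn - G - H (all in W/m^2) is converted
# to an equivalent depth of evaporated water. The flux values used
# in the test are illustrative assumptions.

LATENT_HEAT = 2.45e6    # J/kg, latent heat of vaporization (~20 degC)
WATER_DENSITY = 1000.0  # kg/m^3

def evaporation_mm_per_day(net_radiation, ground_flux, sensible_flux):
    """Depth of water evaporated per day from the energy budget."""
    latent_flux = net_radiation - ground_flux - sensible_flux  # W/m^2
    rate_m_per_s = latent_flux / (LATENT_HEAT * WATER_DENSITY)
    return rate_m_per_s * 1000.0 * 86400.0  # m/s -> mm/day
```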
Abstract: The acidity of different raw Jordanian clays
containing zeolite, bentonite, red and white kaolinite and diatomite
was characterized by means of temperature programmed desorption
(TPD) of ammonia, conversion of 2-methyl-3-butyn-2-ol (MBOH),
FTIR and BET measurements. FTIR spectra proved the presence of
silanol and bridged hydroxyl groups on the clay surface. The number of
acidic sites was calculated from the experimental TPD profiles. We
observed that the decrease in surface acidity correlates with the decrease
in Si/Al ratio, except for diatomite. On the TPD plot for zeolite, two
maxima were registered, owing to the different strengths of the surface
acidic sites. Values of MBOH conversion, product yields, and selectivity
were calculated for catalysis on the Jordanian clays. We found that
all clay samples are able to convert MBOH into a major product,
3-methyl-3-buten-1-yne (MBYNE), catalyzed by acidic
surface sites with a selectivity close to 70%. A correlation was found
between MBOH conversion and the acidity of the clays
determined by TPD-NH3, i.e. the higher the acidity, the higher the
conversion of MBOH. However, diatomite provided the lowest
conversion of MBOH as a result of the poor polarization of its silanol
groups. Comparison of surface areas and conversions revealed the highest
density of active sites for red kaolinite and the lowest for zeolite and
diatomite.
Abstract: The problem of frequent pattern discovery is defined
as the process of searching for patterns such as sets of features or items that appear in data frequently. Finding such frequent patterns
has become an important data mining task because it reveals associations, correlations, and many other interesting relationships
hidden in a database. Most of the proposed frequent pattern mining
algorithms have been implemented with imperative programming
languages. Such a paradigm is inefficient when the set of patterns is large
and the frequent patterns are long. We suggest applying a high-level
declarative style of programming to the problem of frequent pattern
discovery. We consider two languages: Haskell and Prolog. Our
intuitive idea is that the problem of finding frequent patterns should
be efficiently and concisely implemented via a declarative paradigm,
since pattern matching is a fundamental feature supported by most
functional languages and by Prolog. Our frequent pattern mining
implementations in Haskell and Prolog confirm our
hypothesis about the conciseness of the programs. Comparative
performance studies on lines of code, speed, and memory usage of
declarative versus imperative programming are reported in the
paper.
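The declarative flavor argued for above can be hinted at even in Python comprehensions (sketched here instead of Haskell or Prolog; the toy transactions and the naive Apriori-style enumeration are illustrative, not the paper's implementation):

```python
# A declarative-style frequent itemset miner: candidate itemsets are
# generated and filtered with comprehensions, in the spirit of the
# Haskell/Prolog formulations. Naive enumeration; toy data only.
from itertools import combinations

def frequent_patterns(transactions, min_support):
    """Return {itemset: support} for all itemsets meeting min_support."""
    items = sorted({i for t in transactions for i in t})

    def support(s):
        return sum(1 for t in transactions if s <= set(t))

    return {frozenset(c): support(set(c))
            for k in range(1, len(items) + 1)
            for c in combinations(items, k)
            if support(set(c)) >= min_support}
```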
Abstract: The feasibility of applying a simple and cost-effective sliding friction testing apparatus to study the friction behaviour of a clutch facing material, as affected by variation of temperature and contact pressure, was investigated. It was found that the method used in this work gives a convenient and cost-effective measurement of the friction coefficients of a clutch facing material and of their transitions. The obtained results will be useful in the development of new facing materials.
Abstract: Independent spanning trees (ISTs) provide a number of advantages for data broadcasting; one can cite their use in fault-tolerant network protocols for distributed computing and in increasing bandwidth. However, the problem of constructing multiple ISTs is considered hard for arbitrary graphs. In this paper we present an efficient algorithm that constructs ISTs on hypercubes using minimum resources.
Abstract: Chaiyaphum Starch Co. Ltd. is one of many starch
manufacturers that have introduced machinery to aid manufacturing.
Even though machinery has replaced many manual elements and is now a
significant part of the manufacturing process, problems remain that must be
solved with respect to the current process flow to increase efficiency.
The paper's aim is to increase productivity while maintaining the
desired starch quality by redesigning the flipping machine's
mechanical control system, which has a grossly low functional lifetime.
The problems stem from the mechanical control system's bearings,
as fluids and humidity can penetrate the bearings directly, in
tandem with vibrations from the machine's own operation. The wheel
used to sense starch thickness occasionally falls from its
shaft due to high-speed rotation during operation, while the shaft
may bend from impact when processing dried bread. Redesigning the
mechanical control system has increased its efficiency, allowing
quality thickness measurement while extending the functional lifetime
by an additional 62 days.
Abstract: The generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts in a subset of facts. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic (not probabilistic). The search results are trees, a form that is easy to comprehend and interpret. The generator of hypotheses uses a very effective algorithm based on the theory of monotone systems (MS), named MONSA (MONotone System Algorithm).
Abstract: In this study, Cu-mesoporous TiO2 is developed for the
removal of acid odor, assisted by ozone and an online regeneration
system, with and without UV irradiation (all-weather operation). The results
showed that Cu-mesoporous TiO2 presents desirable adsorption
efficiency for acid odor without UV irradiation, owing to its larger
surface area and pore size and the additional absorption ability provided by
Cu. In the photocatalysis process, the material structure also helps
Cu-mesoporous TiO2 achieve outstanding efficiency in
degrading acid odor. Cu also delays the recombination of
electron-hole pairs excited in TiO2, enhancing the photodegradation
ability. Ozone assistance gives Cu-mesoporous TiO2 a conspicuous
increase in photocatalytic ability, but no benefit
in adsorption. In addition, the online regeneration procedure could
treat the used Cu-mesoporous TiO2 to restore its adsorption
ability and maintain its photodegradation performance, depending on
scrubbing, desorbing acid odor, and reducing Cu to the metallic state.
Abstract: Quantitative investigation of the contribution of various factors towards measuring the reusability of software components could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components in existing legacy systems; this can save the cost of developing software from scratch. However, the relative significance of the contributing factors has remained relatively unexplored. In this paper, we use Taguchi's approach to analyze the significance of different structural attributes, or factors, in deciding the reusability level of a particular component. The results obtained show that complexity is the most important factor in deciding the reusability of function-oriented software. In the case of object-oriented software, coupling and complexity collectively play a significant role in high reusability.
Abstract: Given a parallel program to be executed on a heterogeneous
computing system, the overall execution time of the program
is determined by a schedule. In this paper, we analyze the worst-case
performance of the list scheduling algorithm for scheduling tasks
of a parallel program in a mixed-machine heterogeneous computing
system such that the total execution time of the program is minimized.
We prove tight lower and upper bounds for the worst-case
performance ratio of the list scheduling algorithm. We also examine
the average-case performance of the list scheduling algorithm. Our
experimental data reveal that the average-case performance of the list
scheduling algorithm is much better than the worst-case performance
and is very close to optimal, except for large systems with large
heterogeneity. Thus, the list scheduling algorithm is very useful in
real applications.
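A minimal sketch of list scheduling on a mixed-machine system, assuming tasks are taken in a given priority order and each is greedily assigned to the machine that completes it earliest; the task/machine times below are hypothetical.

```python
# Sketch of list scheduling on a mixed-machine heterogeneous system:
# tasks are taken in list (priority) order and each is assigned to
# the machine that finishes it earliest. Times are illustrative.

def list_schedule(task_times, n_machines):
    """task_times[t][m] = execution time of task t on machine m.
    Returns (makespan, machine assignment per task)."""
    finish = [0.0] * n_machines          # current finish time per machine
    assignment = []
    for times in task_times:             # tasks in list order
        m = min(range(n_machines),
                key=lambda j: finish[j] + times[j])
        finish[m] += times[m]
        assignment.append(m)
    return max(finish), assignment
```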
Abstract: Frequent sending of encrypted messages draws the attention
of third parties, perhaps causing attempts to break them and
reveal the original messages. Steganography is introduced to hide
the existence of the communication by concealing a secret message
in an appropriate carrier such as text, image, audio, or video. In quantum
steganography, the sender (Alice) embeds her steganographic
information into the cover and sends it to the receiver (Bob) over a
communication channel; Alice and Bob share an algorithm and hide
quantum information in the cover, and an eavesdropper (Eve) without
access to the algorithm cannot detect the existence of the quantum
message. In this paper, a text quantum steganography technique is proposed
based on the use of the indefinite articles (a) or (an) in conjunction with
nonspecific or non-particular nouns in the English language, together
with a quantum gate truth table. The authors also introduce a
new code representation technique (SSCE - Secret Steganography
Code for Embedding) at both ends in order to achieve a high level of
security. Before the embedding operation, each character of the secret
message is converted to its SSCE value and then embedded in the cover
text. Finally, the stego text is formed and transmitted to the receiver.
At the receiver side, the reverse operations are carried out
to recover the original information.
Abstract: Cryptography provides a secure manner of
information transmission over an insecure channel. It authenticates
messages based on the key, not on the user, and it requires a lengthy
key to encrypt and decrypt the messages being sent and received.
However, these keys can be guessed or cracked, and
maintaining and sharing lengthy random keys for enciphering and
deciphering is a critical problem in cryptographic
systems. A new approach is described for generating a cryptographic key
from a person's iris pattern. In the biometric field,
a template created by a biometric algorithm can only be
authenticated by the same person. Among biometric templates,
iris features can efficiently distinguish individuals and
produce fewer false positives in a large population. The iris
code distribution provides low intra-class variability, which helps
the cryptosystem confidently decrypt messages given an exact
match of the iris pattern. In the proposed approach, the iris features
are extracted using multi-resolution wavelets, producing a 135-bit iris
code from each subject that is used for encrypting and decrypting the
messages. Autocorrelators are used to recall the original messages
from the partially corrupted data produced by the decryption process.
The approach is intended to resolve the repudiation and key management
problems. Results were analyzed for both a conventional iris cryptography
system (CIC) and a non-repudiation iris cryptography system (NRIC),
showing that the new approach provides considerably high
authentication in the enciphering and deciphering processes.
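Binary iris codes are commonly compared by normalized Hamming distance (a standard technique; the abstract does not state the exact matcher, and the 0.3 threshold and sample codes below are illustrative assumptions):

```python
# Sketch: matching two binary iris codes by normalized Hamming
# distance. The 135-bit length follows the abstract; the threshold
# and the codes themselves are illustrative assumptions.
import random

def hamming_distance(code_a, code_b):
    """Fraction of differing bits between two equal-length bit lists."""
    assert len(code_a) == len(code_b)
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

def same_subject(code_a, code_b, threshold=0.3):
    """Same eye if the codes differ in fewer than threshold of bits."""
    return hamming_distance(code_a, code_b) < threshold

if __name__ == "__main__":
    rng = random.Random(0)
    enrolled = [rng.randint(0, 1) for _ in range(135)]
    # A fresh capture of the same eye: a few noisy bits flipped.
    probe = enrolled[:]
    for i in rng.sample(range(135), 10):
        probe[i] ^= 1
    print(same_subject(enrolled, probe))   # small distance -> match
```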
Abstract: Rainfall data at fine resolution, and knowledge of their
characteristics, play a major role in the efficient design and operation
of agricultural, telecommunication, runoff and erosion control, and
water quality control systems. This paper studies the
statistical distribution of hourly rainfall depth for 12 representative
stations spread across Peninsular Malaysia. Hourly rainfall data
covering periods of 10 to 22 years were collected and their statistical
characteristics estimated. Three probability distributions, namely the
Generalized Pareto, Exponential, and Gamma distributions, were proposed to
model the hourly rainfall depth, and three goodness-of-fit tests,
namely the Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared
tests, were used to evaluate their fitness. Results indicate that the east
coast of the Peninsula receives a greater depth of rainfall than the
west coast; however, the rainfall frequency is found to be
irregular. The goodness-of-fit tests show that all
three models fit the rainfall data at the 1% level of significance,
but the Generalized Pareto distribution fits better than the Exponential and
Gamma distributions and is therefore recommended as the best fit.
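As a hedged sketch of one of the fitness checks, the Kolmogorov-Smirnov statistic for an Exponential fit can be computed directly from the empirical CDF; the sample values and rate parameter below are illustrative, not the Malaysian rainfall data.

```python
# Sketch of the Kolmogorov-Smirnov statistic for an Exponential fit
# to hourly rainfall depths (one of the three candidate distributions
# tested). The sample used in the test is illustrative only.
import math

def ks_statistic_exponential(sample, rate):
    """Max gap between the empirical CDF and the Exp(rate) CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = 1.0 - math.exp(-rate * x)
        # Compare the model CDF with the ECDF just before and at x.
        d = max(d, abs(cdf - i / n), abs(cdf - (i + 1) / n))
    return d
```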