Abstract: Vinegar is a valued food additive and condiment as well as an effective preservative against food spoilage. Recently, traditional vinegar production has been extended to various natural substrates and fruits such as grape, palm, cherry, coconut, date, sugarcane, rice and balsam. These neoclassical fermentations have resulted in several vinegar types with different tastes, fragrances and nutritional values, because various acetic acid bacteria are applied as starters. According to the latest edition of Bergey's Manual of Systematic Bacteriology, which classifies genera on the basis of their 16S rRNA differences, the acetic acid bacteria include the genera Acetobacter, Gluconacetobacter and Gluconobacter. Acetobacter spp., the main vinegar starters, belong to the family Acetobacteraceae; they are Gram-negative, obligately aerobic, chemoorganotrophic, oxidase-negative bacilli that oxidize ethanol to acetic acid. In this research we isolated and identified, from Iranian peach, a native Acetobacter strain with high acetic acid productivity and tolerance of high ethanol concentrations; peach is a delicious summer fruit that is very susceptible to spoilage and decay. We used selective and specific laboratory culture media such as standard GYC, Frateur and Carr media, as well as a new industrial culture medium and a miniature fermentor with a new aeration system developed by Pars Yeema Biotechnologists Co., Isfahan Science and Technology Town (ISTT), Isfahan, Iran. The isolated strain was successfully cultivated in modified Carr media with 2.5% and 5% ethanol at high temperatures, 34-40 °C, after a 96-hour incubation period. We showed that increasing the ethanol concentration increased the strain's sensitivity to high temperature.
In conclusion, we isolated and characterized a new Acetobacter strain from Iranian peach that could be considered a potential starter for the production of a new vinegar type, peach vinegar, with a delicious taste and advantageous nutritional value in food biotechnology and industrial microbiology.
Abstract: In this paper we propose a method that improves the efficiency of video coding by combining an adaptive GOP (group of pictures) structure with shot cut detection. We analyzed different approaches to shot cut detection with the aim of choosing the most appropriate one. The next step is to adapt the GOP length so that new GOPs start at the positions of the detected cuts during the process of video encoding. Finally, the efficiency of the proposed method is confirmed by simulations, and the obtained results are compared with fixed GOP structures of sizes 4, 8, 12, 16, 32, 64 and 128, and with a GOP structure the length of the entire video. The proposed method achieved a bit-rate gain of 0.37% to 50.59% while providing a PSNR (Peak Signal-to-Noise Ratio) gain of 0.26% to 1.33% in comparison with the simulated fixed GOP structures.
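Histogram differencing is one common shot-cut detector of the kind the abstract compares; a minimal sketch follows. The bin count, threshold value and synthetic frames are assumptions for illustration, not the paper's actual detector or data.

```python
# Hedged sketch of histogram-difference shot-cut detection.
# The threshold of 0.5 is an assumed example value.
import numpy as np

def histogram_diff(frame_a, frame_b, bins=16):
    """Normalized L1 distance between grayscale histograms of two frames."""
    h_a, _ = np.histogram(frame_a, bins=bins, range=(0, 256))
    h_b, _ = np.histogram(frame_b, bins=bins, range=(0, 256))
    return np.abs(h_a - h_b).sum() / (2 * frame_a.size)  # in [0, 1]

def detect_cuts(frames, threshold=0.5):
    """Indices i where a cut occurs between frame i-1 and frame i."""
    return [i for i in range(1, len(frames))
            if histogram_diff(frames[i - 1], frames[i]) > threshold]

# Two synthetic "shots": dark frames followed by bright frames.
rng = np.random.default_rng(0)
shot1 = [rng.integers(0, 60, (32, 32)) for _ in range(5)]
shot2 = [rng.integers(180, 250, (32, 32)) for _ in range(5)]
cuts = detect_cuts(shot1 + shot2)
print(cuts)  # the cut is reported at the first frame of the second shot
```

An adaptive-GOP encoder would then start a new GOP (an intra-coded frame) at each detected index.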
Abstract: Real-world Speaker Identification (SI) applications differ from ideal or laboratory conditions: perturbations lead to a mismatch between the training and testing environments and degrade performance drastically. Many strategies have been adopted to cope with acoustical degradation; the wavelet-based Bayesian marginal model is one of them. But Bayesian marginal models cannot capture the inter-scale statistical dependencies between different wavelet scales: simple nonlinear estimators for wavelet-based denoising assume that the wavelet coefficients in different scales are independent, whereas in fact they show significant inter-scale dependency. This paper exploits this inter-scale dependency through a Circularly Symmetric Probability Density Function (CS-PDF) from the family of Spherically Invariant Random Processes (SIRPs) in the Log Gabor Wavelet (LGW) domain, and the corresponding joint shrinkage estimator is derived by Maximum a Posteriori (MAP) estimation. A framework based on these is proposed to denoise speech signals for automatic speaker identification. The robustness of the proposed framework is tested on a text-independent speaker identification task with 100 speakers of the POLYCOST and 100 speakers of the YOHO speech databases in three different noise environments. Experimental results show that the proposed estimator yields a greater improvement in identification accuracy than other estimators with the popular Gaussian Mixture Model (GMM) speaker model and Mel-Frequency Cepstral Coefficient (MFCC) features.
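For contrast with the paper's joint inter-scale MAP estimator, here is a minimal sketch of the scale-independent shrinkage the abstract argues against: each wavelet coefficient is soft-thresholded on its own. The one-level Haar transform, the universal threshold rule, and the test signal are assumptions for illustration only.

```python
# Hedged sketch: independent (non-joint) soft-threshold wavelet shrinkage,
# the baseline behavior the abstract contrasts with its CS-PDF estimator.
import numpy as np

def haar_level(x):
    """One level of the Haar wavelet transform (approximation, detail)."""
    x = x.reshape(-1, 2)
    return (x[:, 0] + x[:, 1]) / np.sqrt(2), (x[:, 0] - x[:, 1]) / np.sqrt(2)

def inv_haar_level(a, d):
    out = np.empty(a.size * 2)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

def soft_threshold(d, sigma):
    """Shrink each coefficient independently -- no inter-scale coupling."""
    t = sigma * np.sqrt(2 * np.log(d.size))  # universal threshold (assumed)
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + rng.normal(0, 0.3, 256)
a, d = haar_level(noisy)
denoised = inv_haar_level(a, soft_threshold(d, 0.3))
print(np.mean((noisy - clean) ** 2) > np.mean((denoised - clean) ** 2))
```

A joint estimator of the kind derived in the paper would instead shrink each coefficient as a function of coefficients in neighboring scales.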
Abstract: The purpose of this research was to study five vital
factors related to employees’ job performance. A total of 250
respondents were sampled from employees who worked at a public
warehouse organization, Bangkok, Thailand. Samples were divided
into two groups according to their work experience. The average
working experience was about 9 years for group one and 28 years for
group two. A questionnaire was utilized as a tool to collect data.
Statistics utilized in this research included frequency, percentage,
mean, standard deviation, t-test analysis, one way ANOVA, and
Pearson Product-moment correlation coefficient. Data were analyzed
by using the Statistical Package for the Social Sciences (SPSS). The findings disclosed that the majority of respondents were female, between 23 and 31 years old, single, and held an undergraduate degree. The average income of respondents was less than 30,900 baht. The findings also revealed that the factors of organization chart awareness, job process and technology, internal environment, employee loyalty, and policy and management were ranked at a medium level. The hypothesis testing revealed that differences in gender, age, and position were associated with differences in the awareness of the organization chart, job process and technology, internal environment, employee loyalty, and policy and management, in the same direction and at a low level.
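Among the statistics the abstract lists is the Pearson product-moment correlation coefficient; a minimal sketch of that coefficient follows. The Likert-style scores below are made up for illustration and are not the survey's data.

```python
# Hedged sketch of the Pearson product-moment correlation coefficient.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

loyalty = [3, 4, 2, 5, 4]      # hypothetical 5-point Likert scores
performance = [3, 5, 2, 4, 4]  # hypothetical scores, paired with the above
print(round(pearson_r(loyalty, performance), 3))  # 0.808
```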
Abstract: The main emphasis of metallurgists has been to process materials to obtain balanced mechanical properties for a given application. One of the processing routes used to alter the properties is heat treatment. Nearly 90% of structural applications involve medium-carbon and alloyed steels, which are hence regarded as structural steels. The major requirement for conventional steel is to improve workability, toughness, hardness and grain refinement. In this view, it is proposed to study the mechanical and tribological properties of an unalloyed structural (AISI 1140) steel after different thermal (heat) treatments, namely annealing, normalizing, tempering and hardening, compared with the as-received (cold-worked) specimen. All heat treatments were carried out under atmospheric conditions. The hardening treatment improves the hardness of the material; a marginal decrease in hardness with improved ductility is observed after tempering. Annealing and normalizing improve the ductility of the specimen, with the normalized specimen showing the highest ductility. The hardened specimen shows the highest wear resistance in the initial period of sliding wear, whereas above 25 km of sliding distance the as-received steel outperforms the hardened specimen. Both mild and severe wear regions are observed. Microstructural analysis shows a pearlitic structure in the normalized specimen, a lath martensitic structure in the hardened specimen, and a pearlitic-ferritic structure in the annealed specimen.
Abstract: A new, combinatorial model for analyzing and interpreting an electrocardiogram (ECG) is presented. An application of
the model is QRS peak detection. This is demonstrated with an
online algorithm, which is shown to be space as well as time efficient.
Experimental results on the MIT-BIH Arrhythmia database show that
this novel approach is promising. Further uses for this approach are
discussed, such as taking advantage of its small memory requirements
and interpreting large amounts of pre-recorded ECG data.
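The paper's combinatorial model is not reproduced here, but the flavor of an online, small-memory peak detector like the one the abstract describes can be sketched as follows: a sample is reported as a peak when it exceeds a threshold and both neighbors, so only the last two samples are ever held in memory. The threshold and toy signal are assumptions.

```python
# Hedged sketch of a constant-memory online peak detector (not the paper's
# algorithm): O(1) space, one pass over the stream.
def online_peaks(samples, threshold):
    prev2, prev1 = None, None
    for i, s in enumerate(samples):
        if (prev2 is not None and prev1 > threshold
                and prev1 >= prev2 and prev1 > s):
            yield i - 1, prev1           # index and height of the peak
        prev2, prev1 = prev1, s

ecg_like = [0, 1, 0, 2, 9, 3, 0, 1, 8, 2, 0]  # toy signal, two "QRS" spikes
print(list(online_peaks(ecg_like, threshold=5)))  # [(4, 9), (8, 8)]
```

Because the detector keeps only two samples, it matches the abstract's point about processing large amounts of pre-recorded ECG data with small memory requirements.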
Abstract: The IFRS for Small and Medium-sized Entities
(SMEs) was issued in July 2009 and currently regulators are
considering various implementation strategies of this standard.
Romania has been a member of the European Union since 2007, and accounting regulations were issued in order to ensure compliance with the European Accounting Directives. As the European Commission recently rejected the mandatory use of IFRS for SMEs, regulatory bodies in the Member States have to decide whether the standard will affect the accounting practices of SMEs in their countries. Recently the IASB invited stakeholders to discuss the revision of IFRS for SMEs. Empirical studies on the differences and similarities between national standards and IFRS for SMEs could inform decision makers on the actual level of convergence in different countries. The purpose of this paper is to provide empirical evidence on the convergence of the Romanian regulations with IFRS for SMEs, analyzing the results in the context of the latest revisions proposed to the EU Accounting Directives.
Abstract: This paper shows how unsteady effects can be used to manage two-phase flows. In the first case, we consider conditions under which fragmentation of the interface between the two components leads to intensified mixing. The problem is solved when the temporal and linear scales are too small for a developed mixing layer to appear. We show that there exist conditions on the unsteady flow velocity at the channel surface that lead to the creation and fragmentation of vortices at Reynolds numbers of order unity. We also show that Re alone is not a similarity criterion for this type of flow, but that a criterion can be introduced that depends on both Re and the vortex-splitting frequency. A feature of this situation is that the streamlines behave stably, while the interface between the components exhibits all the properties of unstable flows. In the second problem we consider the behavior of solid impurities in an extensive system of channels. An unsteady periodic flow modeling breathing is simulated, and the behavior of the particles is followed along their trajectories. It is shown that, depending on their mass and diameter, the particles can collect in a caustic on the channel walls, stop at a certain place, or fly back. The frequency dependence of the particle velocity distribution is also of interest: it turns out that by choosing the behavior of the velocity field of the carrier gas one can affect the trajectories of individual particles, including forcing them to fly back.
Abstract: The objective of this research was to study the factors related to the satisfaction of consumers who purchased a Toyota Fortuner SUV. The survey collected 400 samples from 65 car dealerships, mainly in Bangkok, Thailand. The statistics utilized in this paper included percentage, mean, standard deviation, and the Pearson product-moment correlation. The findings revealed that the majority of respondents were male, held an undergraduate degree, and were married and living together. The average income of the respondents was between 20,001 and 30,000 baht. Most of them worked for private companies and had a family with an average of 4 members. The hypothesis testing revealed that the marketing-mix factors of product (ability, gas mileage, and safety) were related to overall satisfaction at a medium level. However, the findings also revealed that the marketing-mix factors of product (image), price, promotion, and service center were related to overall satisfaction at a low level.
Abstract: Food consumption surveys have found that potato consumption in Latvia is among the highest in Europe. Hence the population's acrylamide (AA) intake from fried potatoes might be high as well. The aim of the research was to determine the acrylamide content of, and estimate the acrylamide intake from, roasted potatoes bred and cultivated in Latvia. Five common Latvian potato varieties were selected: Lenora, Brasla, Imanta, Zile, and Madara. A two-year study was conducted over two periods: just after harvesting and after six months of storage. Time and temperature (210 ± 5 °C) were recorded during frying. AA was extracted from the potatoes by solid phase extraction and the AA content was determined by LC-MS/MS. The estimated intake of acrylamide ranges from 0.012 to 0.496 μg kg-1 BW per day.
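The per-body-weight intake figure reported above follows from a standard back-calculation: intake (μg per kg body weight per day) = AA content of the food (μg/kg) × daily portion (kg) / body weight (kg). The content, portion and body weight below are assumed example values, not data from the study.

```python
# Illustrative intake calculation; all numeric inputs are assumptions.
def aa_intake(aa_content_ug_per_kg, portion_kg, body_weight_kg):
    """Daily acrylamide intake in μg per kg body weight."""
    return aa_content_ug_per_kg * portion_kg / body_weight_kg

# e.g. 350 μg/kg AA in roasted potatoes, 100 g daily portion, 70 kg adult:
print(round(aa_intake(350, 0.100, 70), 3))  # 0.5 μg/kg BW per day
```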
Abstract: Most of the losses in a power system arise in the distribution sector, which has therefore always received attention. One of the important factors that increases losses in the distribution system is the presence of reactive power flows. The most common way to compensate reactive power in the system is to use shunt (parallel) capacitors. In addition to reducing losses, the advantages of capacitor placement are the release of network capacity at peak load and the improvement of the voltage profile. The key issue in capacitor placement is the optimal location and sizing of the capacitors so as to maximize these advantages.
In this paper, a new technique is offered for the placement and sizing of fixed capacitors in a radial distribution network on the basis of a Genetic Algorithm (GA). Most existing optimization methods for capacitor placement reduce the losses and improve the voltage profile simultaneously, but the capacitor cost and load changes have not been considered as influences on the objective function. In this article, a holistic approach is taken to the optimal solution of this problem, which includes all the parameters of the distribution network: the price of the phase voltage and load changes. Such a formulation requires a vast search over all possible solutions, so we use the Genetic Algorithm (GA) as a powerful method for this optimization.
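The GA search described above can be sketched in miniature: a chromosome assigns one of a few standard capacitor sizes to each candidate bus, and fitness trades loss reduction against capacitor cost. The three-bus network, the quadratic loss model and all cost coefficients below are toy assumptions, not the paper's system or objective function.

```python
# Hedged GA sketch for capacitor sizing; every numeric value is illustrative.
import random

SIZES = [0, 150, 300, 450]          # available capacitor banks, kvar
REACTIVE_LOAD = [200, 350, 500]     # uncompensated kvar at three buses

def fitness(chrom):
    # Losses grow with the square of the residual reactive flow; cost is
    # proportional to installed kvar (coefficients are assumed).
    loss = sum((q - c) ** 2 for q, c in zip(REACTIVE_LOAD, chrom))
    cost = 0.05 * sum(chrom)
    return -(1e-3 * loss + cost)    # the GA maximizes this

def evolve(pop_size=30, generations=60, seed=7):
    rng = random.Random(seed)
    pop = [[rng.choice(SIZES) for _ in REACTIVE_LOAD] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(a))
            child = a[:cut] + b[cut:]            # one-point crossover
            if rng.random() < 0.2:               # mutation
                child[rng.randrange(len(child))] = rng.choice(SIZES)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)  # capacitor size chosen per bus
```

Because the parents are carried over unchanged each generation, the best fitness found never decreases.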
Abstract: Two commercial proteases from Bacillus
licheniformis (Alcalase 2.4 L FG and Alcalase 2.5 L, Type DX) were
screened for the production of Z-Ala-Phe-NH2 in batch reaction.
Alcalase 2.4 L FG was the most efficient enzyme for the C-terminal
amidation of Z-Ala-Phe-OMe using ammonium carbamate as
ammonium source. Immobilization of protease has been achieved by
the sol-gel method, using dimethyldimethoxysilane (DMDMOS) and
tetramethoxysilane (TMOS) as precursors (unpublished results). In
batch production, about 95% of Z-Ala-Phe-NH2 was obtained at
30°C after 24 hours of incubation. Reproducibility of different
batches of commercial Alcalase 2.4 L FG preparations was also
investigated by evaluating the amidation activity and the entrapment
yields in the case of immobilization. A packed-bed reactor (0.68 cm
ID, 15.0 cm long) was operated successfully for the continuous
synthesis of peptide amides. The immobilized enzyme retained its initial activity over 10 cycles of repeated use in the continuous reactor at ambient temperature. At a substrate-mixture flow rate of 0.75 mL/min, total conversion of Z-Ala-Phe-OMe was achieved after 5
hours of substrate recycling. The product contained about 90%
peptide amide and 10% hydrolysis byproduct.
Abstract: In a competitive production environment, critical decisions are based on data obtained by random sampling of product units. The efficiency of these decisions depends on the quality of the data and on their reliability. This points to the necessity of a reliable measurement system. The process of assessing a measurement system and analysing its errors is known as Measurement System Analysis (MSA). The aim of this research is to demonstrate the necessity of, and support the further development of, measurement system analysis, particularly the use of Gauge Repeatability and Reproducibility (GR&R) studies to improve physical measurements. Nowadays in manufacturing industries, gauge repeatability and reproducibility studies are well known, but they are not applied as widely as other measurement system analysis methods. To introduce this method and provide feedback for improving measurement systems, this survey focuses on the ANOVA method as the most widespread way of calculating Repeatability and Reproducibility (R&R).
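The ANOVA approach to GR&R can be sketched for a crossed parts × operators × trials study using the standard variance-component formulas; the small measurement data set below is made up for illustration.

```python
# Hedged sketch of ANOVA-based Gauge R&R on a crossed study design.
# data[part, operator, trial] -- invented example measurements.
import numpy as np

data = np.array([[[10.0, 10.2], [10.1, 10.3]],
                 [[12.0, 12.2], [12.1, 12.3]]])
p, o, r = data.shape
grand = data.mean()
part_m = data.mean(axis=(1, 2))
op_m = data.mean(axis=(0, 2))
cell_m = data.mean(axis=2)

# Mean squares of the two-way crossed ANOVA.
ms_part = o * r * ((part_m - grand) ** 2).sum() / (p - 1)
ms_op = p * r * ((op_m - grand) ** 2).sum() / (o - 1)
ms_int = (r * ((cell_m - part_m[:, None] - op_m[None, :] + grand) ** 2).sum()
          / ((p - 1) * (o - 1)))
ms_err = ((data - cell_m[:, :, None]) ** 2).sum() / (p * o * (r - 1))

# Variance components (negative estimates clipped to zero).
repeatability = ms_err                              # equipment variation
operator_var = max((ms_op - ms_int) / (p * r), 0.0)
interaction_var = max((ms_int - ms_err) / r, 0.0)
reproducibility = operator_var + interaction_var    # appraiser variation
grr = repeatability + reproducibility
part_var = max((ms_part - ms_int) / (o * r), 0.0)
print(repeatability, reproducibility, part_var)
```

A study is then judged by comparing the GR&R variance with the part-to-part variance.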
Abstract: The problem of frequent pattern discovery is defined
as the process of searching for patterns such as sets of features or items that appear in data frequently. Finding such frequent patterns
has become an important data mining task because it reveals associations, correlations, and many other interesting relationships
hidden in a database. Most of the proposed frequent pattern mining
algorithms have been implemented with imperative programming
languages. Such a paradigm becomes inefficient when the set of patterns is large and the frequent patterns are long. We suggest applying a high-level declarative style of programming to the problem of frequent pattern discovery. We consider two languages: Haskell and Prolog. Our intuitive idea is that the problem of finding frequent patterns should be implementable efficiently and concisely in a declarative paradigm, since pattern matching is a fundamental feature supported by most functional languages and by Prolog. Our frequent pattern mining implementations in Haskell and Prolog confirm our hypothesis about the conciseness of the programs. Comparative studies of lines of code, speed, and memory usage of declarative versus imperative programming are reported in the paper.
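To make the task concrete, here is a compact Apriori-style frequent-itemset miner written in Python's declarative/functional subset; it illustrates the style of solution the abstract argues for, but it is not the paper's Haskell or Prolog code, and the toy database is an assumption.

```python
# Hedged sketch of level-wise frequent-itemset mining (Apriori style).
from itertools import combinations

def frequent_patterns(transactions, min_support):
    """All itemsets appearing in at least min_support transactions."""
    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)
    items = sorted(set().union(*transactions))
    result, level = {}, [frozenset([i]) for i in items]
    while level:
        level = [s for s in level if support(s) >= min_support]
        result.update((s, support(s)) for s in level)
        # Candidate generation: join k-itemsets sharing k-1 items.
        level = list({a | b for a, b in combinations(level, 2)
                      if len(a | b) == len(a) + 1})
    return result

db = [frozenset("abc"), frozenset("ab"), frozenset("ac"), frozenset("bc")]
pats = frequent_patterns(db, min_support=2)
print(sorted("".join(sorted(s)) for s in pats))
# ['a', 'ab', 'ac', 'b', 'bc', 'c']
```

The whole miner is a handful of comprehensions, which is the conciseness argument the abstract makes for declarative languages.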
Abstract: Nowadays, computer worms, viruses and Trojan horses, collectively called malware, have become widespread. A decade ago, such malware merely damaged computers by deleting or rewriting important files; recent malware, however, seems to be written to earn money. Some malware collects personal information so that malicious people can obtain secrets such as online banking passwords, evidence of a scandal, or contact addresses related to a target. Moreover, the relationship between money and malware has become more complex: many kinds of malware create bots to serve as springboards for further attacks. Meanwhile, the countermeasures available to ordinary Internet users have hit a wall. Pattern matching has become too wasteful of computer resources, since matching tools have to deal with huge numbers of patterns derived from subspecies, and virus construction kits can generate subspecies of malware automatically; metamorphic and polymorphic malware are no longer unusual. Recently, malware-checking sites have appeared that scan content in place of the user's PC; however, a new type of malicious site has also appeared that evades such checks. In this paper, existing web protocols and methods are reconsidered in terms of protection from current attacks, and a new protocol and method are proposed for securing the web.
Abstract: The cable tower of the Liede Bridge is a double-column, curved-lever, arched-beam portal frame structure. Being novel and unique in structure, the cable tower differs in complexity from traditional ones. This paper analyzes the ultimate load capacity of the cable tower using finite element calculations and model tests, which indicate that the constitutive relations applied here give a good simulation of the actual failure process of prestressed reinforced concrete. In the vertical load, horizontal load and overloading tests, the stepped loading of the tower model shows a linear load-response relationship, and the test data have good repeatability. All of this suggests that the cable tower has good bearing capacity, a rational design and a high reserve capacity.
Abstract: Experimental investigations were made of the instability of supercritical kerosene flowing in active cooling channels. Two approaches were used to control the pressure in the channel: a back-pressure valve and a venturi. Under both conditions, a low-frequency oscillation of pressure and temperature is observed, and the oscillation periods are calculated. Comparison with the flow time indicates that the instability occurring in the active cooling channels is probably a kind of density wave instability. Its period has no relationship with the cooling channel geometry or the pressure, but depends only on the flow time of the kerosene in the active cooling channels. When the mass flow rate, density, and pressure drop couple with each other, the density wave instability appears.
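The flow (residence) time that the abstract ties the oscillation period to is, for a channel of uniform cross-section, just channel volume times density divided by mass flow rate. The numeric values below are assumed for illustration, not the experiment's conditions.

```python
# Illustrative residence-time calculation; all inputs are assumed values.
def residence_time(length_m, area_m2, density_kg_m3, mass_flow_kg_s):
    """Time for the fluid to traverse the channel: rho * L * A / mdot."""
    return length_m * area_m2 * density_kg_m3 / mass_flow_kg_s

# e.g. a 1 m channel of 4 mm^2 cross-section, kerosene at 400 kg/m^3,
# 2 g/s mass flow:
print(residence_time(1.0, 4e-6, 400.0, 2e-3))  # ≈ 0.8 s
```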
Abstract: Simulation is a very powerful method for high-performance and high-quality design in distributed systems, and is perhaps currently the only practical one, considering the heterogeneity, complexity and cost of distributed systems. In Grid environments, for example, it is hard and even impossible to perform scheduler performance evaluation in a repeatable and controllable manner, as resources and users are distributed across multiple organizations with their own policies. In addition, Grid test-beds are limited, and creating an adequately sized test-bed is expensive and time-consuming. Scalability, reliability and fault tolerance have become important requirements for distributed systems in order to support distributed computation; a distributed system with these characteristics is called dependable. Large environments, like the Cloud, offer unique advantages such as low cost, dependability and QoS satisfaction for all users. Resource management in large environments requires efficient scheduling algorithms guided by QoS constraints. This paper presents the performance evaluation of scheduling heuristics guided by different optimization criteria. The algorithms for distributed scheduling are analyzed with respect to satisfying user constraints while at the same time considering the independent capabilities of resources. This analysis acts as a profiling step for algorithm calibration. The performance evaluation is based on simulation; the simulator is MONARC, a powerful tool for large-scale distributed systems simulation. The novelty of this paper consists in synthetic analysis results that offer guidelines for scheduler service configuration and support empirically based decisions. The results could be used in decisions regarding optimizations to existing Grid DAG scheduling and for selecting the proper algorithm for DAG scheduling in various practical situations.
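For a flavor of the heuristics being evaluated, here is a minimal sketch of one classic list-scheduling heuristic, min-min, which repeatedly maps the task with the earliest achievable completion time; it is a generic example, not necessarily one of the paper's algorithms, and the task lengths and resource count are assumed.

```python
# Hedged sketch of the min-min scheduling heuristic on identical resources.
def min_min(task_lengths, n_resources):
    """Greedily assign each task to the resource giving the earliest finish."""
    ready = [0.0] * n_resources           # next free time per resource
    schedule = {}
    tasks = dict(enumerate(task_lengths))
    while tasks:
        # Pick the (task, resource) pair with minimum completion time.
        t, r = min(((t, r) for t in tasks for r in range(n_resources)),
                   key=lambda p: ready[p[1]] + tasks[p[0]])
        schedule[t] = r
        ready[r] += tasks.pop(t)
    return schedule, max(ready)           # task-to-resource map and makespan

sched, makespan = min_min([4, 2, 7, 1], n_resources=2)
print(makespan)  # 9.0
```

Heuristics guided by other criteria (e.g. max-min, or QoS-weighted completion) differ only in the selection key.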
Abstract: Open and distance learning is a fairly new concept in
Malawi. The major public provider, the Malawi College of Distance
Education, rolled out its activities only about 40 years ago. Over the
years, the demand for distance education has tremendously increased.
The present government has displayed positive political will to uplift
ODL as outlined in the Malawi Growth and Development Strategy as
well as the National Education Sector Plan. A growing national
interest in education coupled with political stability and a booming
ICT industry also raises hope for success. However, a fragile economy with a GNI per capita of around US$200 over the last decade, poor public
funding, erratic power supply and lack of expertise put strain on
efforts towards the promotion of ODL initiatives. Despite the
challenges, the nation appears determined to go flat out and explore
all possible avenues that could revolutionise education access and
equity through ODL.
Abstract: One astonishing capability of humans is to recognize thousands of different objects visually, and to learn the semantic association between those objects and the words referring to them. This work is an attempt to build a computational model of that capacity, simulating the process by which infants learn to recognize objects and words through exposure to visual stimuli and vocal sounds. One of the main facts shaping the brain of a newborn is that lights and colors come from entities in the world. Gradually the visual system learns which light sensations belong to the same entities, despite large changes in appearance. This experience is common to humans and several other mammals, such as non-human primates. But only humans can recognize a huge variety of objects, many of them manufactured by humans themselves, and make use of sounds to identify and categorize them. The aim of this model is to reproduce these processes in a biologically plausible way, by reconstructing the essential hierarchy of cortical circuits along the visual and auditory neural paths.