Abstract: This article is devoted to the development of technologies
for medicine and agroecology using a plant organelle, the spherosome.
A technological method for the purification and isolation of this organelle
using a novel nanostructured carbon sorbent, "nanocarbosorb"
of the ARK type, is presented. Methods for preparing
nanocontainers based on spherosomes loaded with isosorbide
dinitrate, piroxicam, or diclofenac are also described. We found that the
spherosome can be applied for ecological purposes as a bioregulator and
also as a biosensor for the determination of ammonium ions in water
reservoirs in the concentration range of 1 mM to 100 mM.
Abstract: There is growing interest in biodiesel (fatty acid
methyl ester or FAME) because of the similarity in its properties
when compared to those of diesel fuels. Diesel engines operated on
biodiesel have lower emissions of carbon monoxide, unburned
hydrocarbons, particulate matter, and air toxics than when operated
on petroleum-based diesel fuel. The production of fatty acid methyl ester
(FAME) from rapeseed (a non-edible oil) fatty acid distillate with a
high content of free fatty acids (FFA) was investigated in this work.
The conditions for the esterification of the rapeseed oil were 1.8 % H2SO4 as
the catalyst, a MeOH/oil molar ratio of 2 : 0.1, and a reaction temperature of
65 °C for a period of 3 h. The yield of methyl ester was > 90 % within 1 h.
The amount of FFA was reduced from 93 wt % to less than 2 wt %
by the end of the esterification process. The FAME was purified by
neutralization with a 1 M aqueous sodium hydroxide solution at a
reaction temperature of 62 °C. The final FAME product met the
biodiesel quality standard ASTM D 6751.
Abstract: Excilamps are new UV sources with great potential
for application in wastewater treatment. In the present work, a XeBr
excilamp emitting radiation at 283 nm has been used for the
photodegradation of 4-chlorophenol within a range of concentrations
from 50 to 500 mg L-1. Total removal of 4-chlorophenol was
achieved for all the concentrations assayed. The two main photoproduct
intermediates formed along the photodegradation process,
benzoquinone and hydroquinone, although not completely
removed, remain at very low residual concentrations. These
concentrations are insignificant compared with the initial 4-chlorophenol
concentrations and are non-toxic. In order to simulate the process and to
scale it up, a kinetic model has been developed and validated against the
experimental data.
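The abstract does not specify the form of the kinetic model; purely as an illustration, a minimal sketch assuming simple pseudo-first-order photolysis (a common starting point for such models; the rate constant below is hypothetical, not a fitted value) might look like:

```python
import math

def photolysis_concentration(c0, k, t):
    """First-order decay c(t) = c0 * exp(-k * t).

    c0: initial 4-chlorophenol concentration (mg/L)
    k:  hypothetical pseudo-first-order rate constant (1/min)
    t:  irradiation time (min)
    """
    return c0 * math.exp(-k * t)

# Illustrative values only; the real constants would be fitted to the data.
c0, k = 100.0, 0.05
profile = [photolysis_concentration(c0, k, t) for t in (0, 10, 30, 60)]
```

In a real validation, k would be regressed from the measured concentration-time curves at each initial concentration.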
Abstract: We study the typical domain size and configuration
character of a randomly perturbed system exhibiting continuous
symmetry breaking. As a model system we use rod-like objects
within a cubic lattice interacting via a Lebwohl–Lasher-type
interaction. We describe their local direction with a headless unit
director field. Examples of such systems are nematic liquid crystals (LCs) or
nanotubes. We further introduce impurities of concentration p, which
impose random-anisotropy-field-type disorder on the directors. We
study the domain-type pattern of the molecules as a function of p,
the anchoring strength w between a neighboring director and an impurity,
the temperature, and the history of the samples. In the simulations we quenched the
directors from either a random or a homogeneous initial
configuration. Our results show that the history of the system strongly
influences: i) the average domain coherence length; and ii) the range
of ordering in the system. In the random case the obtained order is
always short-range (SR). On the contrary, in the homogeneous case,
SR order is obtained only for strong enough anchoring and a large enough
concentration p. In the other cases, the ordering is either quasi-long-range
(QLR) or long-range (LR). We further studied memory
effects for the random initial configuration. With an increasing external
ordering field B, either QLR or LR order is realized.
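As a minimal sketch of the interaction named above, the Lebwohl-Lasher pair energy between two headless unit directors is E = -ε P2(n1·n2); the coupling constant ε and the directors below are illustrative:

```python
def ll_pair_energy(n1, n2, eps=1.0):
    """Lebwohl-Lasher pair energy E = -eps * P2(n1 . n2),
    with P2(x) = (3x^2 - 1) / 2.  Because the dot product enters
    squared, the energy is invariant under n -> -n (headless symmetry)."""
    dot = sum(a * b for a, b in zip(n1, n2))
    return -eps * (3.0 * dot * dot - 1.0) / 2.0

# Parallel directors minimize the energy; perpendicular ones maximize it.
parallel = ll_pair_energy((1.0, 0.0, 0.0), (1.0, 0.0, 0.0))
perpendicular = ll_pair_energy((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
```

A lattice simulation sums this energy over neighboring director pairs, plus the anchoring term of strength w at impurity sites.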
Abstract: In this paper we introduce a novel kernel classifier
based on an iterative shrinkage algorithm developed for compressive
sensing. We have adopted Bregman iteration with soft and hard
shrinkage functions and a generalized hinge loss for solving the l1-norm
minimization problem for classification. Our experimental results
on face recognition and digit classification, using SVM as the
benchmark, show that our method achieves an error rate close to that
of SVM but does not outperform it. We have found that the soft
shrinkage method gives higher accuracy and, in some situations, more
sparsity than the hard shrinkage method.
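The exact iteration used in the paper is not reproduced here; as a minimal sketch, the standard soft and hard shrinkage (thresholding) operators on which such l1-minimization schemes rely are:

```python
def soft_shrink(x, lam):
    """Soft thresholding: sign(x) * max(|x| - lam, 0)."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

def hard_shrink(x, lam):
    """Hard thresholding: keep x unchanged if |x| > lam, else zero it."""
    return x if abs(x) > lam else 0.0
```

Soft shrinkage both zeroes small coefficients and shrinks the surviving ones toward zero, which is consistent with the observation that it can yield sparser solutions than hard shrinkage.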
Abstract: This paper presents an optimal design of linear phase
digital high pass finite impulse response (FIR) filter using Improved
Particle Swarm Optimization (IPSO). In the design process, the filter
length, pass band and stop band frequencies, feasible pass band and
stop band ripple sizes are specified. FIR filter design is a multi-modal
optimization problem. An iterative method is introduced to find the
the optimal solution of the FIR filter design problem. Evolutionary
algorithms such as the real-coded genetic algorithm (RGA), particle swarm
optimization (PSO), and improved particle swarm optimization (IPSO)
have been used in this work for the design of linear phase high pass
FIR filters. IPSO is an improved PSO that proposes a new definition
for the velocity vector and the swarm update, and hence the solution
quality is improved. A comparison of the simulation results reveals the
optimization efficacy of the algorithm over the prevailing
optimization techniques for the solution of multimodal,
non-differentiable, highly non-linear, and constrained FIR filter design
problems.
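The IPSO velocity update itself is not reproduced in the abstract; as a minimal sketch of the baseline global-best PSO loop it improves upon, applied to a toy objective standing in for the filter-design error, one might write:

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=200,
                 w=0.7, c1=1.5, c2=1.5, lo=-1.0, hi=1.0):
    """Plain global-best PSO; the paper's IPSO modifies the velocity
    and swarm update rules (not reproduced here)."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for the error surface; a real design would evaluate the
# frequency-response error of a candidate FIR coefficient vector.
random.seed(0)
coeffs, err = pso_minimize(lambda x: sum(t * t for t in x), dim=2)
```

In the filter-design setting, each particle position would be a vector of FIR coefficients and f would measure deviation from the desired pass band / stop band response.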
Abstract: Modeling a manufacturing system enables one to
identify the effects of key design parameters on the system's performance and, as a result, to make correct decisions. This paper
proposes a manufacturing system modeling approach using a spreadsheet model based on queuing network theory, in which a
static capacity planning model and a stochastic queuing model are integrated. The model was used to improve the utilization of the existing system in relation to product design. The model incorporates a few
parameters such as utilization, cycle time, throughput, and batch size.
The study also showed that the validity of the developed model is good enough to apply: the maximum relative error is 10%, far
below the limit value of 32%. Therefore, the model developed in this
study is a valuable alternative for evaluating a manufacturing system.
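The spreadsheet model itself is not given in the abstract; as a minimal sketch of the kind of queuing-network arithmetic such a model integrates (here a single M/M/1 station and Little's law, which are our assumptions, not the paper's exact formulation), one might compute:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Utilization, average WIP, and cycle time of an M/M/1 station.

    rho        = lambda / mu       (utilization)
    wip (L)    = rho / (1 - rho)   (average jobs in system)
    cycle time = L / lambda        (Little's law: W = L / lambda)
    """
    rho = arrival_rate / service_rate
    if rho >= 1.0:
        raise ValueError("station is unstable (utilization >= 1)")
    wip = rho / (1.0 - rho)
    cycle_time = wip / arrival_rate
    return rho, wip, cycle_time

# Example: 0.5 jobs/hour arriving at a station that serves 1 job/hour.
rho, wip, ct = mm1_metrics(0.5, 1.0)
```

Chaining such station formulas across a routing network, with batch sizes scaling the effective arrival rates, gives exactly the utilization / cycle time / throughput quantities the abstract lists.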
Abstract: We succeeded in producing a high-performance, flexible graphene/manganese dioxide (G/MnO2) electrode coated on a flexible polyethylene terephthalate (PET) substrate. The graphene film is initially synthesized by drop-casting a graphene oxide (GO) solution onto the PET substrate, followed by simultaneous reduction and patterning of the dried film using a carbon dioxide (CO2) laser beam with a power of 1.8 W. A potentiostatic anodic deposition method was used to deposit thin films of MnO2 with different loading masses (10 – 50 and 100 μg.cm-2) on the pre-prepared graphene film. The electrodes were fully characterized in terms of structure, morphology, and electrochemical performance. A maximum specific capacitance of 973 F.g-1 was obtained when depositing 50 μg.cm-2 of MnO2 on the laser-reduced graphene oxide (rGO), i.e. G/50MnO2, and over 92% of its initial capacitance was retained after 1000 cycles. The good electrochemical performance and long-term cycling stability make our proposed approach a promising candidate for supercapacitor applications.
Abstract: Grey mold on grape is caused by the fungus Botrytis
cinerea Pers. Trichodex WP, a new biofungicide that contains fungal
spores of Trichoderma harzianum Rifai, was used for the biological
control of grey mold on grape. The efficacy of Trichodex WP has
been reported from many experiments. The experiments were carried out
in the locality of Banatski Karlovac, on the grapevine variety
Talijanski Rizling. The trials were set up according to the instructions of methods
PP1/152(2) and PP1/17(3), following a fully randomized block
design. Phytotoxicity was estimated by PP method 1/135(2), the
intensity of infection according to Townsend–Heuberger, the
efficacy according to Abbott, and the analysis of variance with Duncan's test and
PP/181(2). The application of Trichodex WP is limited to the first two
treatments. The other treatments are performed with fungicides based
on the a.i. procymidone, vinclozoline, and iprodione.
Abstract: Customer satisfaction is as important in the textile
sector as it is in other sectors. In particular, considering that gaining
new customers costs four times as much as keeping existing customers from
leaving, it can be seen that customer satisfaction plays a great
role for firms. In this study, the independent variables affecting
customer satisfaction are chosen as brand image, perceived service
quality, and perceived product quality. Using these independent
variables, we investigate whether any differences exist in the perception
of customer satisfaction among Turkish textile consumers
with respect to gender. The SPSS program is used for the data
analysis of this research.
Abstract: This paper proposes a method that discovers time series event patterns from textual data with time information. The patterns are composed of sequences of events, and each event is extracted from the textual data, where an event is characteristic content included in the textual data, such as a company name, an action, or an impression of a customer. The method introduces seven types of time constraints based on an analysis of the textual data. The method also evaluates these constraints when the frequency of a time series event pattern is calculated. We can flexibly define the time constraints for interesting combinations of events and can discover valid time series event patterns which satisfy these conditions. The paper applies the method to daily business reports collected by a sales force automation system and verifies its effectiveness through numerical experiments.
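The seven constraint types are not enumerated in the abstract; as a minimal sketch of one plausible kind (a minimum/maximum gap between consecutive events, which is our assumption), a pattern-frequency count might check:

```python
def within_gap(t_prev, t_next, min_gap, max_gap):
    """One hypothetical time-constraint type: the gap between two
    consecutive events must lie in [min_gap, max_gap]."""
    return min_gap <= t_next - t_prev <= max_gap

def pattern_occurs(events, pattern, min_gap, max_gap):
    """Greedy check that the labels in `pattern` occur in order in the
    time-sorted (timestamp, label) list `events`, with every consecutive
    pair satisfying the gap constraint.  A greedy scan can miss some
    matches; it only illustrates constraint evaluation."""
    idx, t_prev = 0, None
    for t, label in events:
        if idx < len(pattern) and label == pattern[idx]:
            if t_prev is None or within_gap(t_prev, t, min_gap, max_gap):
                t_prev = t
                idx += 1
    return idx == len(pattern)

# Events extracted from hypothetical daily reports; timestamps in days.
events = [(0, "visit"), (3, "quote"), (40, "order")]
```

A frequency count would run such a check over every report sequence and keep only patterns whose count exceeds a support threshold.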
Abstract: Risk management is an essential part of project management, which plays a significant role in project success. Many failures associated with Web projects are the consequence of poor awareness of the risks involved and the lack of process models that can serve as a guideline for the development of Web-based applications. To circumvent this problem, contemporary process models have been devised for the development of conventional software. This paper introduces WPRiMA (Web Project Risk Management Assessment), a tool used to implement RIAP, the risk identification architecture pattern model, which focuses on data from the proprietor's and vendor's perspectives. The paper also illustrates how the WPRiMA tool works and how it can be used to calculate the risk level for a given Web project, to generate recommendations in order to facilitate risk avoidance in a project, and to improve the prospects of early risk management.
Abstract: Experiments have been carried out at the Latvia
University of Agriculture Department of Food Technology. The aim
of this work was to assess the effect of thermal treatment in flexible
retort pouch packaging on the quality of potato products during
storage. Samples were evaluated immediately after retort
thermal treatment and after 1, 2, 3, and 4 months of storage at an
ambient temperature of +18±2 ºC in vacuum packaging made from
polyamide/polyethylene (PA/PE) and aluminum/polyethylene
(Al/PE) film pouches with barrier properties. The quality of the
potato products in dry butter and mushroom dressings was
characterized experimentally by measuring pH, hardness, color, and
microbiological properties, and by sensory evaluation. The sterilization
was effective in protecting the products from physical, chemical, and
microbial quality degradation. According to the obtained data, it can
be argued that the selected product processing technology
and packaging materials can be applied to provide safety and
security during a four-month storage period.
Abstract: This article is an extension and a practical application
of Wheeler's NEBIC theory (Net Enabled Business
Innovation Cycle). NEBIC theory is a new approach in IS research
and can be used for dynamic environments related to new technology.
Firms can follow market changes rapidly with the support of IT
resources. Flexible firms adapt their market strategies and respond
more quickly to customers' changing behaviors. When every leading
firm in an industry has access to the same IT resources, the way
these IT resources are managed will determine the competitive
advantages or disadvantages of a firm. From the dynamic capabilities
perspective and from the newly introduced NEBIC theory by Wheeler,
we know that IT resources alone cannot deliver customer value, but a
good configuration of those resources can guarantee customer value
by choosing the right emerging technology and grasping the right
economic opportunities through business innovation and growth. We
found evidence in the literature that SOA (Service Oriented
Architecture) is a promising emerging technology which can deliver
the desired economic opportunity through modularity, flexibility, and
loose coupling. SOA can also help firms connect in networks, which
can open a new window of opportunity for collaboration in innovation
and the right kind of outsourcing. Many articles and research
reports indicate that the failure rate in outsourcing is very high, but at the
same time research indicates that successful outsourcing projects
add tangible and intangible benefits to the service consumer.
Business executives and policy makers in the West should not be afraid
of outsourcing; rather, they should choose the right strategy through the
use of emerging technology to significantly reduce the failure rate in
outsourcing.
Abstract: In this paper we introduce a novel method for
the characterization of synchronization and coupling effects
in multivariate time series that can be used for the analysis
of EEG or ECoG signals recorded during epileptic seizures.
The method makes it possible to visualize the spatio-temporal evolution
of the synchronization and coupling effects that are characteristic
of epileptic seizures. Similar to other methods proposed for
this purpose, our method is based on regression analysis.
However, a more general definition of the regression, together
with an effective channel selection procedure, allows the
method to be used even for time series that are highly correlated, which
is commonly the case in EEG/ECoG recordings with large
numbers of electrodes. The method was experimentally tested
on ECoG recordings of epileptic seizures from patients with
temporal lobe epilepsy. A comparison with the results of
an independent visual inspection by clinical experts showed
excellent agreement with the patterns obtained by the
proposed method.
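The paper's generalized regression measure is not specified in the abstract; as a simple stand-in illustrating the basic idea of regression-based coupling between two channels, the fraction of variance of one signal explained by a linear fit on another is:

```python
def coupling_strength(x, y):
    """Squared correlation r^2: the fraction of the variance of y
    explained by an ordinary least-squares regression of y on x.
    A stand-in for the more general regression measure in the paper."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    if sxx == 0.0 or syy == 0.0:
        return 0.0
    return (sxy * sxy) / (sxx * syy)  # lies in [0, 1]
```

Evaluating such a measure over sliding windows and all channel pairs yields the kind of spatio-temporal coupling maps the abstract describes; the channel selection step addresses the high mutual correlation of dense electrode grids.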
Abstract: In this paper, a newly proposed system for the recognition
of Persian printed numeral characters, with emphasis on the
representation and recognition stages, is introduced. For the first time
in Persian optical character recognition, geometrical central moments
as the character image descriptor and a fuzzy min-max neural network
have been used for Persian numeral character recognition. A set of different
experiments on binary images of regular, translated, rotated, and
scaled Persian numeral characters was carried out, and a variety of
results is presented. The best result was 99.16% correct
recognition, demonstrating that geometrical central moments and the fuzzy
min-max neural network are adequate for the recognition of Persian printed
numeral characters.
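As a minimal sketch of the descriptor named above, the geometrical central moment μ_pq of a binary character image (represented here as a list of 0/1 rows) is:

```python
def central_moment(img, p, q):
    """Central moment mu_pq = sum_xy (x - xbar)^p * (y - ybar)^q * I(x, y)
    for a binary image `img` given as a list of rows of 0/1 pixels.
    Centering on the centroid (xbar, ybar) makes the moments invariant
    under translation of the character."""
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            m00 += v
            m10 += x * v
            m01 += y * v
    xbar, ybar = m10 / m00, m01 / m00
    return sum((x - xbar) ** p * (y - ybar) ** q * v
               for y, row in enumerate(img)
               for x, v in enumerate(row))

# The same 2x2 blob at two positions has identical central moments.
blob = [[1, 1],
        [1, 1]]
shifted = [[0, 0, 0],
           [0, 1, 1],
           [0, 1, 1]]
```

A feature vector of several low-order moments (e.g. μ20, μ02, μ11, μ30, ...) per character image would then be fed to the classifier.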
Abstract: In this paper, we implement a three-dimensional pursuit guidance law with the feedback linearization control method and study the effects of its parameters. First, we introduce guidance laws and the equations of motion of a missile. The pursuit guidance law is our focus. We apply the feedback linearization control method to obtain the accelerations that implement the pursuit guidance law. The solution makes the warhead direction follow the line of sight. Finally, the simulation results show that the exact solution derived in this paper is correct and that some factors, e.g. the control gain and the time delay, are important for implementing the pursuit guidance law.
Abstract: A physically based, spatially distributed water quality model is being developed to simulate spatial and temporal distributions of material transport in the Great Lakes Watersheds of the U.S. Multiple databases of meteorology, land use, topography, hydrography, soils, agricultural statistics, and water quality were used to estimate nonpoint source loading potential in the study watersheds. Animal manure production was computed from tabulations of animals by zip code area for the census years of 1987, 1992, 1997, and 2002. Relative chemical loadings for agricultural land use were calculated from fertilizer and pesticide estimates by crop for the same periods. Comparison of these estimates with the monitored total phosphorus load indicates that both point and nonpoint sources are major contributors to the total nutrient loads in the study watersheds, with nonpoint sources being the largest contributor, particularly in the rural watersheds. These estimates are used as input to the distributed water quality model for simulating pollutant transport through surface and subsurface processes to Great Lakes waters. Visualization and GIS interfaces are developed to display the spatial and temporal distribution of the pollutant transport in support of water management programs.
Abstract: With optimized bandwidth and latency discrepancy ratios, Node Gain Scores (NGSs) are determined and used as the basis for shaping the max-heap overlay. The NGSs, determined as the respective bandwidth-latency products, govern the construction of max-heap-form overlays. Each NGS is earned as a synergy of the discrepancy ratio of the bandwidth requested with respect to the estimated available bandwidth, and the latency discrepancy ratio between the node and the source node. The tree leads to enhanced-delivery overlay multicasting, increasing packet delivery which could otherwise be hindered by the induced packet loss occurring in schemes that do not consider the synergy of these parameters when placing the nodes on the overlays. The NGS is a function of four main parameters: the estimated available bandwidth, Ba; the individual node's requested bandwidth, Br; the proposed node latency to its prospective parent, Lp; and the suggested best latency as advised by the source node, Lb. The bandwidth discrepancy ratio (BDR) and the latency discrepancy ratio (LDR) carry weights of α and (1,000 - α), respectively, with an arbitrarily chosen α ranging between 0 and 1,000 to ensure that the NGS values, used as node IDs, maintain a good possibility of uniqueness and a balance between the BDR and the LDR. A max-heap-form tree is constructed under the assumption that all nodes possess an NGS less than that of the source node. To maintain a sense of load balance, the children of each level's siblings are evenly distributed, such that a node cannot accept a second child until all of its siblings able to do so have already acquired the same number of children; this is done logically from left to right in a conceptual overlay tree. Records of the pair-wise approximate available bandwidths, as measured by a pathChirp scheme at the individual nodes, are maintained.
Evaluations comparing the scheme to other schemes – Bandwidth Aware multicaSt architecturE (BASE), Tree Building Control Protocol (TBCP), and Host Multicast Tree Protocol (HMTP) – have been conducted. The new scheme generally performs better in terms of the trade-off between packet delivery ratio, link stress, control overhead, and end-to-end delays.
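The abstract describes the NGS both as a bandwidth-latency product and as a weighted combination of the two discrepancy ratios; one hypothetical reading of the weighted form (the exact ratio definitions below are our assumptions, not the paper's) is:

```python
def node_gain_score(Ba, Br, Lp, Lb, alpha=500.0):
    """Hypothetical NGS = alpha * BDR + (1000 - alpha) * LDR, with
    alpha chosen in [0, 1000] as stated in the abstract.  The ratio
    definitions are assumed here for illustration:
      BDR = Br / Ba  (requested vs. estimated available bandwidth)
      LDR = Lp / Lb  (proposed latency vs. suggested best latency)."""
    bdr = Br / Ba
    ldr = Lp / Lb
    return alpha * bdr + (1000.0 - alpha) * ldr

# Example: plenty of spare bandwidth but twice the suggested latency.
score = node_gain_score(Ba=10.0, Br=5.0, Lp=2.0, Lb=1.0)
```

With scores of this form used as node IDs, the max-heap property (every parent's NGS exceeds its children's) can be enforced during tree construction.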
Abstract: The unsatisfactory effectiveness of software system
development and enhancement projects is one of the main reasons
why, in software engineering, attempts are being made to use
experience coming from other engineering disciplines. In spite of the
specificity of software products and processes, a belief has emerged that
the execution of software projects could be more effective if these objects
were subject to measurement, as is the case in other engineering
disciplines, for which measurement is an inherent feature. Thus
objective and reliable approaches to the measurement of software
processes and products have been sought in software engineering for
several decades already. This may be proved, among other things,
by the current version of the CMMI for Development model. This paper
is aimed at analyzing the approach to the measurement of software
processes and products proposed in the latest version of this
model, indicating the growing acceptance of this issue in software
engineering.