Abstract: The study of proteomics has attracted unexpected levels of interest as a direct consequence of its discovered influence over complex biological phenomena, such as problematic diseases like cancer. This paper presents a new technique for the accurate analysis of the human interactome network. It is a two-step analysis process: first, each protein's absolute importance is detected through betweenness centrality computation; second, the functionally related communities of proteins are determined. For this purpose, we use a community detection technique based on edge betweenness calculation. The new technique was thoroughly tested on real biological data, and the results reveal some interesting properties of the proteins involved in the carcinogenesis process. Apart from its experimental usefulness, the novel technique is also computationally efficient in terms of execution time. Based on the analysis results, some topological features of cancer-mutated proteins are presented and a possible optimization solution for cancer drug design is suggested.
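The centrality step described above can be illustrated with Brandes' algorithm, the standard method for computing betweenness in unweighted graphs; this is a minimal sketch on a toy adjacency list, not the authors' implementation:

```python
from collections import deque

def betweenness(adj):
    """Brandes' algorithm: vertex betweenness in an undirected, unweighted graph
    given as an adjacency-list dict {node: [neighbours]}."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        # single-source shortest-path counts via BFS
        stack = []
        pred = {v: [] for v in adj}          # shortest-path predecessors
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        dist = {v: -1 for v in adj}; dist[s] = 0
        q = deque([s])
        while q:
            v = q.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        # back-propagate pair dependencies
        delta = {v: 0.0 for v in adj}
        while stack:
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    # each unordered pair was counted from both endpoints
    return {v: c / 2 for v, c in bc.items()}
```

On a simple path a-b-c, only b lies on a shortest path between the other two nodes, so its betweenness is 1 while the endpoints score 0.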
Abstract: Project managers are ultimately responsible for the overall characteristics of a project, i.e. they should deliver the project on time with minimum cost and maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and any scientific decision-support tool would benefit them. Our work determines a set of optimal solutions (rather than a single optimal solution) from which the project manager can select the preferred choice for running the project. In this paper, the project scheduling problem notated as (1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem is multi-objective, and the purpose is to find the Pareto-optimal front of time, cost and quality of a project (curve:quality,time,cost) whose activities belong to a start-to-finish activity relationship network (cpm) and can be performed in different possible modes (mu) that are non-continuous or discrete (disc), each mode having a different cost, time and quality. The project is constrained by a non-renewable resource, i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic is developed to solve it, based on a version of the genetic algorithm specially adapted to multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.
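The notion of a Pareto-optimal front over the three conflicting objectives can be sketched as follows (minimizing time and cost, maximizing quality); the dominance test is standard and independent of FastPGA itself:

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b, where each solution is a
    (time, cost, quality) tuple: time and cost are minimized, quality maximized."""
    no_worse = a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2]
    strictly = a[0] < b[0] or a[1] < b[1] or a[2] > b[2]
    return no_worse and strictly

def pareto_front(solutions):
    """Keep only the non-dominated (time, cost, quality) vectors."""
    return [s for i, s in enumerate(solutions)
            if not any(dominates(o, s) for j, o in enumerate(solutions) if j != i)]
```

A multi-objective GA such as FastPGA evolves a population toward this non-dominated set rather than toward a single scalar optimum.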
Abstract: In this paper, the real money demand function is analyzed within a multivariate time-series framework. A cointegration approach (the Johansen procedure) is used, assuming interdependence between the money demand determinants, which are nonstationary variables. This helps us to understand the behavior of money demand in Croatia, revealing the significant influence between endogenous variables in a vector autoregression (VAR) system, i.e. a vector error correction model (VECM). Exogeneity of the explanatory variables is tested. The long-run money demand function is estimated, indicating a slow speed of adjustment in removing the disequilibrium. Empirical results provide evidence that real industrial production and the exchange rate explain most of the variation in money demand in the long run, while the interest rate is significant only in the short run.
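As an illustration of the error-correction idea, the following sketch uses the simpler two-step Engle-Granger procedure, not the Johansen procedure the abstract employs; it estimates a long-run relation and the speed of adjustment from the lagged residual:

```python
import numpy as np

def ecm_sketch(y, x):
    """Two-step Engle-Granger sketch (NOT the Johansen procedure):
    (1) long-run levels regression y_t = a + b*x_t,
    (2) regress dy_t on the lagged residual; its coefficient is the
    speed of adjustment back toward long-run equilibrium."""
    X = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    ect = y - X @ beta                      # error-correction term
    dy = np.diff(y)
    Z = np.column_stack([np.ones(len(dy)), ect[:-1]])
    gamma, *_ = np.linalg.lstsq(Z, dy, rcond=None)
    return beta, gamma[1]                   # long-run coefficients, adjustment speed
```

A negative adjustment coefficient close to zero corresponds to the "slow speed of adjustment" reported in the abstract; the Johansen procedure generalizes this to a full VECM with possibly several cointegrating vectors.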
Abstract: Solid waste can be considered an urban burden or a valuable resource, depending on how it is managed. To meet the rising demand for energy and to address environmental concerns, a conversion from conventional energy systems to renewable resources is essential. For the sustainability of human civilization, an environmentally sound and techno-economically feasible treatment method for recyclable waste is very important. Several technologies are available for realizing the potential of solid waste as an energy source, ranging from very simple systems for disposing of dry waste to more complex technologies capable of dealing with large amounts of industrial waste. There are three main pathways for the conversion of waste material to energy: thermochemical, biochemical and physicochemical. This paper investigates the thermochemical conversion of solid waste for energy recovery. The processes, advantages and disadvantages of various thermochemical conversion processes are discussed and compared. Special attention is given to the gasification process, as it provides better solutions regarding public acceptance, feedstock flexibility, near-zero emissions, efficiency and security. Finally, this paper presents comparative statements of thermochemical processes and introduces an integrated waste management system.
Abstract: There are two common types of operational research techniques: optimisation and metaheuristic methods. The latter may be defined as a sequential process that intelligently performs exploration and exploitation, adopted from natural intelligence, to form several iterative searches. The aim is to effectively determine near-optimal solutions in a solution space. In this work, a type of metaheuristic called Ant Colony Optimisation (ACO), inspired by the foraging behaviour of ants, was adapted to find optimal solutions of eight non-linear continuous mathematical models. Considering a solution space in a specified region of each model, sub-solutions may contain a global or multiple local optima. Moreover, the algorithm has several common parameters (number of ants, moves and iterations) which act as the algorithm's drivers. A series of computational experiments for initialising the parameters was conducted through the methods of Rigid Simplex (RS) and Modified Simplex (MSM). Experimental results were analysed in terms of the best-so-far solutions, mean and standard deviation. Finally, a recommendation of proper level settings of the ACO parameters is stated for all eight functions. These parameter settings can be applied as a guideline for future uses of ACO, promoting its ease of use in real industrial processes. It was found that the results obtained from MSM were very similar to those gained from RS. However, when results with noise standard deviations of 1 and 3 are compared, MSM reaches optimal solutions more efficiently than RS in terms of speed of convergence.
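An archive-based continuous ACO (in the ACOR style) can be sketched as below; the parameter names and levels are illustrative and are not the settings recommended in the paper:

```python
import random

def aco_continuous(f, bounds, n_ants=20, n_iters=150, seed=42):
    """ACOR-style continuous Ant Colony Optimisation sketch: keep an archive
    of good solutions and sample new candidates from Gaussians whose spread
    follows the archive's per-dimension dispersion."""
    rng = random.Random(seed)
    dim = len(bounds)
    archive = sorted(
        ([rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_ants)),
        key=f)
    for _ in range(n_iters):
        best = archive[0]
        # per-dimension spread: mean absolute deviation from the best ant
        sigma = [sum(abs(a[d] - best[d]) for a in archive) / n_ants + 1e-12
                 for d in range(dim)]
        for _ in range(n_ants):
            guide = archive[rng.randrange(max(1, n_ants // 2))]  # bias toward best half
            cand = [min(max(guide[d] + rng.gauss(0.0, sigma[d]),
                            bounds[d][0]), bounds[d][1])
                    for d in range(dim)]
            archive.append(cand)
        archive = sorted(archive, key=f)[:n_ants]
    return archive[0], f(archive[0])
```

On a 2-D sphere function the archive contracts toward the origin, which mirrors the convergence behaviour the experiments in the abstract measure.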
Abstract: The article deals with the classification of alternative water resources in terms of potential risks, which is a prerequisite for incorporating these water resources into emergency plans. The classification is based on the quantification of risks resulting from possible damage, disruption or total destruction of a water resource caused by natural and anthropogenic hazards, the assessment of water quality and availability, the traffic accessibility of the assessed resource, and finally its water yield. The aim is to achieve the development of an integrated rescue system capable of supplying the population with drinking water across the whole stricken territory during states of emergency.
Abstract: The purpose of this study is to determine the circumstances affecting elementary school students in their family and school lives and what kinds of emotions children may feel because of these circumstances. The study was carried out according to the survey model. Four Turkish elementary schools provided 123 fourth-grade students for participation in the study. The study's data were collected using worksheets for the activity titled "Important Days in Our Lives", which was part of the Elementary School Social Sciences Course 4th Grade Education Program. Data analysis was carried out according to the content analysis technique used in qualitative research. The study found that circumstances of their family and school lives caused children to feel emotions such as happiness, sadness, anger, fear and jealousy. These circumstances and the emotions they caused were analyzed according to gender and interpreted by presenting them with their frequencies.
Abstract: In this paper, a novel copyright protection scheme for digital images based on Visual Cryptography and Statistics is proposed. In our scheme, the theories and properties of the sampling distribution of means and of visual cryptography are employed to achieve the requirements of robustness and security. Our method does not need to alter the original image and can identify ownership without resorting to the original image. In addition, our method allows multiple watermarks to be registered for a single host image without causing any damage to the other hidden watermarks. Moreover, our scheme also makes it possible to cast a larger watermark into a smaller host image. Finally, experimental results show the robustness of our scheme against several common attacks.
Abstract: One of the purposes of robust estimation is to reduce the influence of outliers in the data on the estimates. Outliers arise from gross errors or from contamination by long-tailed distributions. The trimmed mean is a robust estimate, meaning it is not sensitive to violations of the distributional assumptions of the data. It is called an adaptive estimate when the trimming proportion is determined from the data rather than being fixed a priori.
The main objective of this study is to determine the robustness properties of adaptive trimmed means in terms of efficiency, high breakdown point and influence function. Specifically, it seeks the magnitude of the trimming proportion of the adaptive trimmed mean that yields efficient and robust estimates of the parameter for data following a modified Weibull distribution with parameter λ = 1/2, where the trimming proportion is determined by a ratio of two trimmed means defined as the tail length. Secondly, the asymptotic properties of the tail length and the trimmed means are also investigated. Finally, a comparison is made of the efficiency of the adaptive trimmed means, in terms of the standard deviation, for data-determined trimming proportions and for proportions fixed a priori.
The asymptotic tail lengths, defined as the ratio of two trimmed means, and the asymptotic variances were computed using the formulas derived. The standard deviations of the derived tail lengths, for data of size 40 simulated from a Weibull distribution, were computed over 100 iterations using a computer program written in the Pascal language.
The findings of the study revealed that the tail lengths of the Weibull distribution increase in magnitude as the trimming proportions increase; that the measure of the tail length and the adaptive trimmed mean are asymptotically independent as the number of observations n approaches infinity; that the tail length is asymptotically distributed as the ratio of two independent normal random variables; and that the asymptotic variances decrease as the trimming proportions increase. The simulation study revealed empirically that the standard error of the adaptive trimmed mean using the ratio of tail lengths is relatively smaller, for different values of the trimming proportions, than its counterpart with trimming proportions fixed a priori.
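The adaptive trimming idea can be sketched as follows; since the abstract does not give the exact form of its ratio-of-trimmed-means tail length, Hogg's Q measure is used here as an illustrative stand-in, and the thresholds for choosing the trimming proportion are hypothetical:

```python
def trimmed_mean(xs, alpha):
    """Symmetric alpha-trimmed mean: drop floor(alpha*n) points from each tail."""
    xs = sorted(xs)
    k = int(alpha * len(xs))
    core = xs[k:len(xs) - k] if k else xs
    return sum(core) / len(core)

def tail_mean(xs, frac, upper):
    """Mean of the most extreme frac*n observations in one tail."""
    xs = sorted(xs)
    k = max(1, int(frac * len(xs)))
    part = xs[-k:] if upper else xs[:k]
    return sum(part) / k

def hogg_q(xs):
    """Hogg's tail-length measure: spread of the extreme-5% averages over
    the spread of the 50% tail averages (illustrative stand-in for the
    abstract's tail-length ratio)."""
    return ((tail_mean(xs, 0.05, True) - tail_mean(xs, 0.05, False)) /
            (tail_mean(xs, 0.50, True) - tail_mean(xs, 0.50, False)))

def adaptive_trimmed_mean(xs):
    # heavier measured tails -> larger trimming proportion (thresholds illustrative)
    alpha = 0.20 if hogg_q(xs) >= 2.0 else 0.05
    return trimmed_mean(xs, alpha)
```

With a single gross outlier present, the measured tail length exceeds the threshold, the larger trimming proportion is chosen automatically, and the adaptive estimate stays near the bulk of the data while the ordinary mean is dragged toward the outlier.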
Abstract: A novel file splitting technique for the reduction of the nth-order entropy of text files is proposed. The technique is based on mapping the original text file into a non-ASCII binary file using a new codeword assignment method; the resulting binary file is then split into several subfiles, each containing one or more bits from each codeword of the mapped binary file. The statistical properties of the subfiles are studied, and it is found that they reflect the statistical properties of the original text file, which is not the case when the ASCII code is used as a mapper. The nth-order entropies of these subfiles are determined, and it is found that the sum of their entropies is less than that of the original text file for the same values of extensions. These interesting statistical properties of the resulting subfiles can be used to achieve better compression ratios when conventional compression techniques are applied to the subfiles individually and on a bit-wise rather than character-wise basis.
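The entropy bookkeeping behind this claim can be checked with a simple block-entropy estimator; the codeword-assignment mapping itself is not reproduced here:

```python
import math
from collections import Counter

def block_entropy(data: bytes, n: int = 1) -> float:
    """Shannon entropy in bits per n-byte block, estimated from overlapping
    block frequencies; n > 1 approximates the nth-order statistics the
    abstract refers to."""
    blocks = [data[i:i + n] for i in range(len(data) - n + 1)]
    total = len(blocks)
    return -sum(c / total * math.log2(c / total)
                for c in Counter(blocks).values())
```

Splitting a mapped file into subfiles and summing their block entropies allows the paper's central claim, that the sum can fall below the original file's entropy, to be tested empirically on any corpus.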
Abstract: Pattern matching based on regular tree grammars has been widely used in many areas of computer science. In this paper, we propose a pattern matcher within the framework of code generation, based on a generic and formalized approach. According to this approach, parsers for regular tree grammars are adapted to a general pattern matching solution, rather than adapting the pattern matching to their parsing behavior. Hence, we first formalize the construction of the pattern matches corresponding to input trees drawn from a regular tree grammar, in the form of so-called match trees. Then, we adopt a recently developed generic parser and tightly couple its parsing behavior with this construction. In addition to its generality, the resulting pattern matcher is characterized by its soundness and efficient implementation. This is demonstrated by the proposed theory and by the derived algorithms for its implementation. A comparison with similar and well-known approaches, such as those based on tree automata and LR parsers, has shown that our pattern matcher can be applied to a broader class of grammars and achieves a better approximation of pattern matches in one pass. Furthermore, its use as a machine code selector incurs minimal overhead, due to the balanced distribution of the cost computations into static ones, during parser generation time, and dynamic ones, during parsing time.
Abstract: A combination of photosynthetic bacteria along with anaerobic acidogenic bacteria is an ideal option for efficient hydrogen production. In the present study, the optimum concentration of substrates for the growth of Rhodobacter sphaeroides was found by response surface methodology. The optimum combination of three individual fatty acids was determined by a Box-Behnken design. An increase in volatile fatty acid concentration decreased the growth. The combination of sodium acetate and sodium propionate was most significant for the growth of the organism. The results showed that a maximum biomass concentration of 0.916 g/l was obtained when the concentrations of acetate, propionate and butyrate were 0.73 g/l, 0.99 g/l and 0.799 g/l, respectively. The growth was studied under the optimum concentration of volatile fatty acids at a light intensity of 3000 lux, an initial pH of 7 and a temperature of 35 °C. The maximum biomass concentration of 0.92 g/l was obtained, which verified the practicability of this optimization.
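A Box-Behnken design for k factors, as used above to combine the three fatty acids, places a two-level factorial on each pair of factors with the remaining factors at the centre, plus centre runs; a generic sketch in coded units (not the authors' concentration levels):

```python
from itertools import combinations

def box_behnken(k, n_center=1):
    """Box-Behnken design in coded units (-1, 0, +1) for k factors:
    a 2x2 factorial on every factor pair, others at the centre,
    plus n_center centre runs."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
    runs.extend([0] * k for _ in range(n_center))
    return runs
```

For three factors this yields the familiar 12 edge-midpoint runs plus centre points; mapping the coded levels to actual acetate, propionate and butyrate concentrations gives the experimental plan a response surface model is then fitted to.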
Abstract: In this paper, the process of obtaining the Q and R matrices for an optimal pitch aircraft control system is described. Since the introduction of the optimal control method, the determination of the Q and R matrices for such a system has not been fully specified. The values of Q and R for the optimal pitch aircraft control application have been simulated and calculated. Suitable results for Q and R are identified through the performance index (PI): if the PI is small enough, the Q and R values are considered suitable for that type of optimal control system. Moreover, for the same value of the PI, different Q and R sets are possible. Because there is no fixed rule for determining the Q and R matrices, a specific method is introduced to find rough values of Q and R corresponding to a rather small value of the PI.
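The interplay of Q, R and the performance index can be sketched for a generic discrete-time system; this is not the abstract's pitch-dynamics model, and the Riccati value iteration used here is one standard way to obtain the optimal gain:

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Discrete-time LQR gain via fixed-point iteration of the Riccati equation."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K, P

def performance_index(A, B, K, Q, R, x0, steps=400):
    """PI = sum of x'Qx + u'Ru along the closed-loop trajectory from x0."""
    x, pi = x0.astype(float), 0.0
    for _ in range(steps):
        u = -K @ x
        pi += float(x @ Q @ x + u @ R @ u)
        x = A @ x + B @ u
    return pi
```

Trying different Q and R sets and comparing the resulting PI values reproduces, in miniature, the search the abstract describes; for the optimal gain the PI equals x0'Px0, so distinct (Q, R) pairs can indeed yield the same index.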
Abstract: The machining performance is determined by the frequency characteristics of the machine-tool structure and the dynamics of the cutting process. Therefore, the prediction of the dynamic vibration behavior of the spindle tool system is of great importance for the design of a machine tool capable of high-precision and high-speed machining. The aim of this study is to develop a finite element model to predict the dynamic characteristics of a milling machine tool and hence evaluate the influence of the preload of the spindle bearings. For this purpose, a three-dimensional spindle bearing model of a high-speed engraving spindle tool was created. In this model, the rolling interfaces, with contact stiffness defined by the Harris model, were used to simulate the spindle bearing components. A full finite element model of a vertical milling machine was then established by coupling the spindle tool unit with the machine frame structure. Using this model, the vibration mode with a dominant influence on the dynamic stiffness was determined. The results of the finite element simulations reveal that spindle bearings with different preloads greatly affect the dynamic behavior of the spindle tool unit and hence the dynamic responses of the vertical column milling system. These results were validated by performing vibration tests on the individual spindle tool unit and on the milling machine prototype, respectively. We conclude that the preload of the spindle bearings is an important factor affecting the dynamic characteristics and machining performance of the entire vertical column structure of the milling machine.
Abstract: To support mobility in ATM networks, a number of technical challenges need to be resolved. The impact of handoff schemes in terms of service disruption, handoff latency, cost implications and the excess resources required during handoffs needs to be addressed. In this paper, a one-phase handoff and route optimization solution, using reserved PVCs between adjacent ATM switches to reroute connections during inter-switch handoff, is studied. In the second phase, a distributed optimization process is initiated to optimally reroute handoff connections. The main objective is to find the optimal operating point at which to perform optimization, subject to a cost constraint, with the purpose of reducing the blocking probability of inter-switch handoff calls for delay-tolerant traffic. We examine the relation between the required bandwidth resources and the optimization rate. We also calculate and study the handoff blocking probability due to lack of bandwidth for the resources reserved to facilitate the rapid rerouting.
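For intuition about handoff-call blocking, the classical Erlang-B recurrence gives the blocking probability of a link with a given offered traffic and channel count; this is a generic teletraffic illustration, much simpler than the paper's PVC rerouting model:

```python
def erlang_b(traffic, channels):
    """Erlang-B blocking probability via the numerically stable recurrence
    B(0) = 1,  B(n) = a*B(n-1) / (n + a*B(n-1)),
    where a is the offered traffic in Erlangs and n the number of channels."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic * b / (n + traffic * b)
    return b
```

Adding reserved capacity (more channels) lowers the blocking probability, which is the basic trade-off between reserved bandwidth and handoff blocking that the abstract's optimization explores.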
Abstract: By applying an improved back-propagation neural network (BPNN), a model of the current densities for a solid oxide fuel cell (SOFC) with 10 layers is established in this study. To build the learning data for the BPNN, a Taguchi orthogonal array is applied to arrange the conditions of the operating parameters, of which a total of 7 factors act as the inputs of the BPNN. The average current densities obtained by a numerical method act as the outputs of the BPNN. Compared with the direct solution, the learning errors for all learning data are smaller than 0.117%, and the prediction errors for 27 forecasting cases are less than 0.231%. The results show that the presented model effectively provides a mathematical algorithm to predict the performance of a SOFC stack immediately, in real time.
The calculating algorithms are also applied to carry out the optimization of the average current density for a SOFC stack. The operating performance window of a SOFC stack is found to be between 41137.11 and 53907.89. Furthermore, an inverse prediction model of the operating parameters of a SOFC stack is developed here using the calculating algorithms of the improved BPNN, which is shown to effectively predict the operating parameters needed to achieve a desired performance output of a SOFC stack.
Abstract: The effect of time-periodic oscillations of the Rayleigh-Bénard system on heat transport in dielectric liquids is investigated by weakly nonlinear analysis. We focus on stationary convection using the slow time scale and arrive at the real Ginzburg-Landau equation. The classical fourth-order Runge-Kutta method is used to solve the Ginzburg-Landau equation, which gives the amplitude of convection and helps in quantifying the heat transfer in dielectric liquids in terms of the Nusselt number. The effects of the electrical Rayleigh number and the amplitude of modulation on heat transport are studied.
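The amplitude computation can be sketched with a classical RK4 integrator applied to a real Ginzburg-Landau equation with illustrative unit coefficients (in the paper the coefficients depend on the electrical Rayleigh number and the modulation amplitude):

```python
def rk4(f, y, t, h, steps):
    """Classical fourth-order Runge-Kutta integrator for dy/dt = f(t, y)."""
    for _ in range(steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# Real Ginzburg-Landau amplitude equation dA/dt = c1*A - c3*A^3 with
# illustrative coefficients c1 = c3 = 1; the amplitude settles at sqrt(c1/c3).
gl = lambda t, A: A - A ** 3
A = rk4(gl, 0.1, 0.0, 0.01, 2000)
```

The Nusselt number is then evaluated from the settled amplitude, for instance through a relation of the form Nu = 1 + c·A² arising from the weakly nonlinear expansion, with c depending on the problem parameters.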
Abstract: Agricultural products are in increasing demand in today's market. Automation of production would be very helpful in increasing productivity. The purpose of this work is to measure and determine the ripeness and quality of watermelon. The textures on the watermelon skin are captured using a digital camera. These images are filtered using image processing techniques. The gathered information is then used to train an ANN to determine watermelon ripeness. Initial results showed that the best model produced an accuracy of 86.51%, measured at 32 hidden units with a balanced percentage rate of the training dataset.
Abstract: Fibers of pure cellulose can be made by some bacteria such as Acetobacter xylinum. Bacterial cellulose fibers are very pure, tens of nm across and about 0.5 micron long. The fibers are very stiff, with a stiffness of up to 70 GPa, although nobody seems to have measured the strength of individual fibers. Fundamental strengths should be at least greater than those of the best commercial polymers, but the best bulk strength seems to be about the same as that of steel. These fibers can potentially be produced in industrial quantities, at greatly lowered cost and water content and with triple the yield, by a new process. This article presents a critical review of the available information on bacterial cellulose as a biological nonwoven fabric, with special emphasis on its fermentative production and applications. Characteristics of the bacterial cellulose biofabric with respect to its structure and physicochemical properties are discussed. Current and potential applications of bacterial cellulose in the textile, nonwoven cloth, paper, film, synthetic fiber coating, food, pharmaceutical and other industries are also presented.
Abstract: The task of object localization is one of the major challenges in creating intelligent transportation. Unfortunately, in densely built-up urban areas, localization based on GPS alone produces a large error, or simply becomes impossible. New opportunities for localization arise from the rapidly emerging concept of the wireless ad-hoc network. Such a network allows estimating the potential distance between objects by measuring the received signal level, and constructing a graph of distances in which the nodes are the objects to be localized and the edges are estimates of the distances between pairs of nodes. Given the known coordinates of individual nodes (anchors), it is possible to determine the location of all (or part) of the remaining nodes of the graph. Moreover, a road map available in digital format can provide localization routines with valuable additional information to narrow the node location search. However, despite the abundance of well-known algorithms for solving the localization problem and significant research efforts, many issues are currently addressed only partially. In this paper, we propose a localization approach based on mapping the distance graph onto digital road map data. In effect, the problem is reduced to embedding the distance graph into the graph representing the area's geolocation data. This makes it possible to localize objects, in some cases even if only one reference point is available. We propose a simple embedding algorithm and a sample implementation as spatial queries over sensor network data stored in a spatial database, allowing effective use of spatial indexing, optimized spatial search routines and geometry functions.
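The anchor-based localization step can be illustrated by linearized least-squares multilateration; this is a generic building block, not the paper's graph-to-road-map embedding algorithm:

```python
import numpy as np

def trilaterate(anchors, dists):
    """Least-squares position estimate from >= 3 anchors with known
    coordinates and measured distances.  Subtracting the first sphere
    equation from the others linearizes the system:
        2*(a_i - a_0) . p = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2"""
    anchors = np.asarray(anchors, float)
    dists = np.asarray(dists, float)
    a0, d0 = anchors[0], dists[0]
    A = 2.0 * (anchors[1:] - a0)
    b = d0 ** 2 - dists[1:] ** 2 + (anchors[1:] ** 2).sum(1) - (a0 ** 2).sum()
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With noisy signal-level distance estimates the same least-squares formulation still applies, and road map constraints of the kind the paper exploits can then be used to snap the estimate onto the admissible road geometry.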