Abstract: The human genome is not only the evolutionary summation of all advantageous events; it also houses the lesions of deleterious footprints. A single gene mutation may sometimes express multiple consequences in numerous tissues, and a linear relationship between genotype and phenotype may often be obscure. ß Thalassemia minor, a transfusion-independent mild anaemia, coupled with environment among other factors, may articulate into phenotypic pleiotropy with hypocholesterolemia, vitamin D deficiency, tissue hypoxia, hyperparathyroidism and psychological alterations. The occurrence of pancreatic insufficiency, resultant steatorrhoea, and vitamin D (25-OH) deficiency (13.86 ng/ml) with hypocholesterolemia (85 mg/dl) in a 30-year-old male ß Thal-minor patient (hemoglobin 11 mg/dl with fetal hemoglobin 2.10%, Hb A2 4.60% and Hb Adult 84.80%, and an altered hemogram), together with increased parathyroid hormone (62 pg/ml) and moderate serum Ca+2 (9.5 mg/ml), points toward a cascade of phenotypic pleiotropy in which the ß Thalassemia mutation, be it in the 5' cap site of the mRNA, in differential splicing, etc., in the heterozygous state is affecting several metabolic pathways. Compensatory extramedullary hematopoiesis may not have coped well with the stressful lifestyle of the young individual, and increased erythropoietic stress, with a high demand for cholesterol for RBC membrane synthesis, may have resulted in hypocholesterolemia. Oxidative stress and tissue hypoxia may have caused the pancreatic insufficiency, leading to vitamin D deficiency. This in turn may have caused secondary hyperparathyroidism to sustain the serum calcium level. The irritability and stress intolerance of the patient were a cumulative effect of this vicious cycle of metabolic compromises. From these findings we propose that the metabolic deficiencies accompanying ß Thalassemia mutations may be considered the phenotypic display of pleiotropy, to explain the genetic epidemiology.
According to the recommendations from the NIH Workshop on Gene-Environment Interplay in Common Complex Diseases: Forging an Integrative Model, the design of observational studies should be informed by gene-environment hypotheses, and the results of a study (of genetic diseases) should be published to inform future hypotheses. A variety of approaches is needed to capture data on all possible aspects, each of which is likely to contribute to the etiology of disease. Speakers also agreed that there is a need for the development of new statistical methods and measurement tools to appraise information that may be missed by conventional methods, where a large sample size is needed to resolve a considerable effect. A meta-analytic cohort study may in future bring significant insight into the title comment.
Abstract: This paper studies the decomposition behavior, in a pyrolytic environment, of four lignocellulosic biomasses (oil palm shell, oil palm frond, rice husk and paddy straw) and two commercial components of biomass (pure cellulose and lignin), performed in a thermogravimetric analyzer (TGA). The unit consists of a microbalance and a furnace purged with 100 cc (STP) min-1 of nitrogen (N2) as inert gas. The heating rate was set at 20 °C min-1 and the temperature was ramped from 50 to 900 °C. Hydrogen gas production during pyrolysis was observed using an Agilent 7890A gas chromatography analyzer. Oil palm shell, oil palm frond, paddy straw and rice husk were found to be reactive enough in a pyrolytic environment of up to 900 °C, since pyrolysis of these biomasses starts at temperatures as low as 200 °C and the maximum weight loss is achieved at about 500 °C. Since there is not much difference in the cellulose, hemicellulose and lignin fractions between oil palm shell, oil palm frond, paddy straw and rice husk, the T-50 and R-50 values obtained are almost similar. H2 production also started rapidly at this temperature, due to the decomposition of the biomass inside the TGA. Biomass with higher lignin content, such as oil palm shell, was found to have a longer duration of H2 production compared to materials of high cellulose and hemicellulose content.
Abstract: A different concept for the design and detailing of reinforced concrete precast frame structures is analyzed in this paper. The new detailing of the joints derives from the special hybrid moment frame joints. The special reinforcements of this alternative detailing, named the modified special hybrid joint, are unbonded with respect to both columns and beams. Full-scale tests were performed on a plane model representing part of a 5-story structure, cropped at the mid-spans of the beams and columns. A theoretical approach was developed, based on the test results on the twice-repaired model subjected to lateral seismic-type loading. A discussion of the behavior of the modified special hybrid joint, and of the further research needed, concludes the presentation.
Abstract: In this paper, a new recursive strategy is proposed for determining the $\frac{(n-1)!}{2}$ diagrams of $n$th order. The generalization of the $n$th-order diagram for the cross-multiplication method was proposed by Pavlovic and Bankier, but a specific rule for determining the $\frac{(n-1)!}{2}$ $n$th-order diagrams for a square matrix is yet to be discovered. Thus, using a combinatorial approach, the $\frac{(n-1)!}{2}$ $n$th-order diagrams are presented as $\frac{(n-1)!}{2}$ starter sets. These starter sets are generated by exchanging one element. The advantages of this new strategy are that the discarding process is eliminated and that the signs of the starter sets alternate.
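The abstract does not spell out the exchange rule itself, but the count $\frac{(n-1)!}{2}$ matches the number of cyclic arrangements of $n$ elements up to rotation and reversal. The sketch below is not the authors' recursive strategy (it still uses the discarding their method eliminates); it only illustrates why exactly $\frac{(n-1)!}{2}$ order-$n$ diagrams exist, by fixing the first element and keeping one representative of each reversal pair:

```python
from itertools import permutations
from math import factorial

def starter_sets(n):
    """One representative per cyclic diagram of order n: fix element 0
    and keep only permutations that start below their own reversal."""
    reps = []
    for p in permutations(range(1, n)):
        if p[0] < p[-1]:                  # canonical under reversal
            reps.append((0,) + p)
    return reps

for n in range(3, 8):
    # exactly (n-1)!/2 diagrams survive
    assert len(starter_sets(n)) == factorial(n - 1) // 2
```

Each permutation of the remaining $n-1$ elements is paired with its reversal, and exactly one member of each pair satisfies the `p[0] < p[-1]` test, which is what halves $(n-1)!$.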
Abstract: Through a proper analysis of residual strain and stress
distributions obtained at the surface of high speed milled specimens
of AA 6082–T6 aluminium alloy, the performance of an improved
indentation method is evaluated. This method integrates a special indentation device into a universal measuring machine. This device allows elongated indents to be introduced, which diminishes the absolute error of measurement. It must be noted that the
present method offers the great advantage of avoiding both the
specific equipment and highly qualified personnel, and their inherent
high costs. In this work, the cutting tool geometry and high speed
parameters are selected to introduce reduced plastic damage.
Through the variation of the depth of cut, the stability of the shapes
adopted by the residual strain and stress distributions is evaluated.
The results show that the strain and stress distributions remain
unchanged, compressive and small. Moreover, these distributions
reveal a similar asymmetry when the gradients corresponding to
conventional and climb cutting zones are compared.
Abstract: In this paper, we study the application of Extreme
Learning Machine (ELM) algorithm for single layered feedforward
neural networks to non-linear chaotic time series problems. In this
algorithm the input weights and the hidden layer bias are randomly
chosen. The ELM formulation leads to solving a system of linear
equations in terms of the unknown weights connecting the hidden
layer to the output layer. The solution of this general system of
linear equations is obtained using the Moore-Penrose generalized pseudoinverse. To study the application of the method, we consider the time series generated by the Mackey-Glass delay differential equation with different time delays, the Santa Fe A series, and the UCR heartbeat rate ECG time series. For the choice of sigmoid, sin and hardlim activation functions, the optimal values for the
memory order and the number of hidden neurons which give the
best prediction performance in terms of root mean square error are
determined. It is observed that the results obtained are in close
agreement with the exact solution of the problems considered
which clearly shows that ELM is a very promising alternative
method for time series prediction.
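The ELM formulation described above fits in a few lines of NumPy. In this sketch a logistic map merely stands in for the Mackey-Glass series, and the memory order and hidden-layer size are illustrative choices, not the paper's tuned values:

```python
import numpy as np

def elm_train(X, y, n_hidden=40, seed=0):
    """ELM: random input weights and biases; output weights solved in
    closed form via the Moore-Penrose pseudoinverse."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))      # random input weights
    b = rng.normal(size=n_hidden)                    # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid hidden layer
    beta = np.linalg.pinv(H) @ y                     # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# toy chaotic series (logistic map) standing in for Mackey-Glass
s = [0.4]
for _ in range(500):
    s.append(3.9 * s[-1] * (1.0 - s[-1]))
s = np.array(s)

m = 3                                                # memory order (illustrative)
X = np.column_stack([s[i:len(s) - m + i] for i in range(m)])
y = s[m:]
W, b, beta = elm_train(X[:400], y[:400])
pred = elm_predict(X[400:], W, b, beta)
rmse = float(np.sqrt(np.mean((pred - y[400:]) ** 2)))
```

Because only `beta` is learned, training reduces to a single linear least-squares solve, which is the source of ELM's speed advantage over gradient-based training.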
Abstract: The physical methods for RNA secondary structure prediction are time-consuming and expensive; thus, methods for computational prediction are a proper alternative. Various algorithms have been used for RNA structure prediction, including dynamic programming and metaheuristic algorithms. Harmony search, inspired by musicians' behavior, is a recently developed metaheuristic algorithm which has been successful in a wide variety of complex optimization problems. This paper proposes a harmony search algorithm (HSRNAFold) to find the RNA secondary structure with minimum free energy that is similar to the native structure. HSRNAFold is compared with the dynamic programming benchmark mfold and with metaheuristic algorithms (RnaPredict, SetPSO and HelixPSO). The results show that HSRNAFold is comparable to mfold and better than the other metaheuristic algorithms in finding the minimum free energies and the number of correct base pairs.
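HSRNAFold's energy model and RNA encoding are not reproduced here; the sketch below shows only the core harmony search loop, with a sphere function standing in for the free-energy objective and illustrative (not the paper's) HMCR/PAR/bandwidth parameters:

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                   iters=5000, seed=1):
    """Generic harmony search: improvise new candidates from a memory of
    good solutions, with pitch adjustment and random consideration."""
    random.seed(seed)
    hm = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    costs = [f(v) for v in hm]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:            # memory consideration
                v = random.choice(hm)[d]
                if random.random() < par:         # pitch adjustment
                    v += random.uniform(-bw, bw)
            else:                                 # random consideration
                v = random.uniform(lo, hi)
            new.append(min(max(v, lo), hi))
        c = f(new)
        worst = max(range(hms), key=costs.__getitem__)
        if c < costs[worst]:                      # replace the worst harmony
            hm[worst], costs[worst] = new, c
    best = min(range(hms), key=costs.__getitem__)
    return hm[best], costs[best]

# sphere function as a stand-in for the RNA free-energy model
x, fx = harmony_search(lambda v: sum(t * t for t in v), [(-5.0, 5.0)] * 3)
```

In an RNA setting the continuous vector would be replaced by a helix selection and `f` by the thermodynamic free-energy evaluation; the improvisation loop itself is unchanged.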
Abstract: This paper presents work on signal discrimination, specifically for the electrocardiogram (ECG) waveform. The ECG signal comprises P, QRS and T waves in each normal heartbeat, describing the pattern of heart rhythm corresponding to a specific individual. Further medical diagnosis can be done to determine any heart-related disease using ECG information. The emphasis on QRS complex classification is further discussed to illustrate its importance. The Pan-Tompkins algorithm, a widely known technique, has been adapted to realize the QRS complex classification process. Eight steps are involved, namely sampling, normalization, low-pass filtering, high-pass filtering (together forming a band-pass filter), differentiation, squaring, averaging and, lastly, QRS detection. The simulation results obtained are presented in a graphical user interface (GUI) developed using MATLAB.
Abstract: This paper presents a controller design technique for
Synchronous Reluctance Motor to improve its dynamic performance
with fast response and high accuracy. Sliding mode control is the most attractive and suitable method for this purpose, since it is simple in design and insensitive to parameter variations and external disturbances. When implemented, this method yields a fast dynamic response without overshoot and zero steady-state error.
The current loop control with decentralized sliding mode is presented
in this paper. The mathematical model for the synchronous machine,
the inverter and the controller is developed. The stability of the
sliding mode controller is analyzed. Simulation of synchronous
reluctance motor and the controller with a PWM inverter has been carried out using the SIMULINK software package of MATLAB.
Simulation results are presented to show the effectiveness of the
approach.
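The full motor and inverter models are not reproduced here; the following minimal sketch applies the sliding-mode idea to an assumed first-order surrogate plant with illustrative gains, showing the characteristic insensitivity to a bounded disturbance:

```python
import numpy as np

a, b = 1.0, 2.0                 # surrogate plant dx/dt = -a*x + b*u + d (assumed)
dt, T = 1e-3, 5.0
x, x_ref = 0.0, 1.0             # state and constant reference
k = 5.0                         # switching gain, chosen above the disturbance bound
hist = []
for i in range(int(T / dt)):
    d = 0.5 * np.sin(2 * np.pi * i * dt)   # bounded external disturbance
    s = x - x_ref                          # sliding surface: tracking error
    u = -(k / b) * np.sign(s)              # discontinuous switching control law
    x += dt * (-a * x + b * u + d)         # Euler step of the plant
    hist.append(x)
err = abs(hist[-1] - x_ref)     # residual error, of chattering amplitude
```

Once the trajectory reaches the surface `s = 0`, the switching term dominates the disturbance and the state stays within a chattering band around the reference; in practice a boundary-layer or higher-order scheme is used to soften the chattering.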
Abstract: This paper presents a new approach for probability density function estimation using the Support Vector Machines (SVM) and Expectation Maximization (EM) algorithms. In the proposed approach, an advanced algorithm for SVM density estimation, which incorporates Mean Field theory in the learning process, is used. Instead of using ad-hoc values for the parameters of the kernel function used by the SVM algorithm, the proposed approach uses the EM algorithm for automatic optimization of the kernel. Experimental evaluation using a simulated data set shows encouraging results.
Abstract: In this paper, we consider the control of time-delay systems by a Proportional-Integral (PI) controller. Using the Hermite-Biehler theorem, which is applicable to quasi-polynomials, we seek the stability region of the controller for first-order delay systems. The essence of this work resides in the extension of this approach to second-order delay systems, in the determination of their stability region and in the computation of the optimum PI parameters. We use genetic algorithms to handle the complexity of the optimization problem.
Abstract: Caching has been suggested as a solution for reducing bandwidth utilization and minimizing query latency in mobile environments. Over the years, different caching approaches have been proposed, some relying on the server to periodically broadcast reports informing clients of updated data, while others allow the clients to request data whenever needed. Recently, a hybrid cache consistency scheme, the Scalable Asynchronous Cache Consistency Scheme (SACCS), was proposed, which combines the benefits of the two approaches and has proved more efficient and scalable. Nevertheless, caching has its limitations too: the limited cache size and the limited bandwidth make the implementation of a cache replacement strategy an important aspect of improving cache consistency algorithms. In this paper, we propose a new cache replacement strategy, the Least Unified Value (LUV) strategy, to replace the Least Recently Used (LRU) strategy that SACCS was based on. This paper studies the advantages and drawbacks of the newly proposed strategy, comparing it with different categories of cache replacement strategies.
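The value function of LUV is not specified in the abstract, so the sketch below shows only the LRU baseline that SACCS originally used, built on Python's `OrderedDict`; a LUV-style policy would replace the eviction rule at the end of `put` with one based on a computed value per entry:

```python
from collections import OrderedDict

class LRUCache:
    """Least-Recently-Used replacement: evict the entry that has gone
    unused the longest once capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)         # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used

c = LRUCache(2)
c.put("a", 1)
c.put("b", 2)
c.get("a")                                 # "a" becomes most recent
c.put("c", 3)                              # capacity exceeded: "b" is evicted
```

The `OrderedDict` keeps entries in recency order, so both lookup and eviction are O(1); a value-based policy like LUV instead scores entries (e.g., by access frequency, update rate, and fetch cost) and evicts the lowest-scoring one.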
Abstract: Domain-specific languages describe specific solutions to problems in the application domain. Traditionally they form a solution by composing black-box abstractions together. This usually involves non-deep transformations over the target model. In this paper we argue that it is potentially powerful to operate with grey-box abstractions to build a domain-specific software system. We present parametric code templates as grey-box abstractions, and conceptual tools to encapsulate and manipulate these templates. Manipulations introduce template-merging routines and can be defined in a generic way. This involves reasoning mechanisms at the code-template level. We introduce the concept of the Neurath Modelling Language (NML), which operates with parametric code templates and specifies a visualisation mapping mechanism for target models. Finally we provide an example of calculating a domain-specific software system with predefined NML elements.
Abstract: This paper introduces a framework that aims to support the design and development of mobile services. The traditional innovation process and its supporting instruments, in the form of creativity tools, acceptance research and user-generated content analysis, are screened for potential for improvement. The result is a reshaped innovation process in which acceptance research and user-generated content analysis are fully integrated within a creativity tool. The advantages of this method are the enrichment of design-relevant information for developers and designers, and the possibility of forecasting market success.
Abstract: The proper design of RF pulses in magnetic resonance imaging (MRI) has a direct impact on the quality of acquired images and is needed for many applications. Several techniques have been proposed to obtain the RF pulse envelope given the desired slice profile. Unfortunately, these techniques do not take into account the limitations of practical implementation, such as limited amplitude resolution. Moreover, implementing constraints for special RF pulses is not possible with most techniques. In this work, we propose an approach for designing optimal RF pulses under theoretically arbitrary constraints. The new technique poses the RF pulse design problem as a combinatorial optimization problem and uses efficient techniques from this area, such as genetic algorithms (GA), to solve it. In particular, an objective function is proposed as the norm of the difference between the desired profile and the one obtained by solving the Bloch equations for the current RF pulse design values. The proposed approach is verified using RF simulations based on analytical solutions and compared to previous methods such as the Shinnar-Le Roux (SLR) method; the options and parameters that control the GA, which can significantly affect its performance, are analyzed, selected and tested to obtain the best results. The results show a significant improvement over conventional design techniques, identify the best GA options and parameters, and suggest the practicality of using the new technique for important applications such as slice selection for large flip angles, unconventional spatial encoding, and other clinical uses.
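A full Bloch-equation solver is beyond an abstract-level sketch, so the toy GA below substitutes the small-tip-angle approximation (slice profile roughly the magnitude of the Fourier transform of the RF envelope) for the Bloch simulation named in the objective; the population size, genetic operators, and target profile are all illustrative assumptions, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32                                          # RF envelope samples
target = np.zeros(N)
target[12:20] = 1.0                             # desired boxcar slice profile

def cost(pulse):
    """Norm of the difference between desired and achieved profile,
    using |FFT| as a small-tip-angle stand-in for the Bloch equations."""
    prof = np.abs(np.fft.fft(pulse))
    prof /= prof.max() + 1e-12
    return float(np.linalg.norm(prof - target))

pop = rng.uniform(-1.0, 1.0, size=(60, N))      # initial random envelopes
history = []
for gen in range(300):
    costs = np.array([cost(p) for p in pop])
    order = np.argsort(costs)
    history.append(float(costs[order[0]]))
    elite = pop[order[:20]]                     # elitism: keep the best 20
    children = []
    while len(children) < 40:
        p1, p2 = elite[rng.integers(20)], elite[rng.integers(20)]
        cut = int(rng.integers(1, N))
        child = np.concatenate([p1[:cut], p2[cut:]])   # one-point crossover
        child = child + 0.05 * rng.standard_normal(N)  # Gaussian mutation
        children.append(child)
    pop = np.vstack([elite] + children)
best_cost = history[-1]
```

Replacing `cost` with a true Bloch simulation, and quantizing the genes to the hardware's amplitude resolution, turns this skeleton into the constrained combinatorial search the abstract describes.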
Abstract: The current trend of increasing quality of, and demands on, the final product is affected by the time analysis of the entire manufacturing process. The primary requirement of manufacturing is to produce as many products as possible, as quickly as possible, at the lowest possible cost, but of course with the highest quality. Such requirements can be satisfied only if all the elements entering and affecting the production cycle are in a fully functional condition. These elements consist of sensory equipment and intelligent control elements that are essential for building intelligent manufacturing systems. The intelligent manufacturing paradigm includes a new approach to the design of the production system structure. Intelligent behavior is based on monitoring the important parameters of the system and its environment, and on flexible reaction to changes. The realization and utilization of this design paradigm as an "intelligent manufacturing system" enables the system to react flexibly to production requirements as well as to environmental changes. The results of these flexible reactions are a smaller layout space, decreased production and investment costs, and increased productivity. An intelligent manufacturing system should itself be a system that can respond flexibly to changes entering and exiting the process, in interaction with its surroundings.
Abstract: This study aimed to investigate the manufacturing of high-aluminum-content Mg alloys using a horizontal twin roll caster. Recently, weight saving has been a key issue for lighter transport equipment as well as for electronic component parts. As alternative materials to aluminum alloys, the development of magnesium alloys with higher strength is expected. Normally, a high-aluminum-content Mg alloy has poor ductility and is difficult to roll because of its high strength. However, the twin roll casting process is suitable for manufacturing wrought Mg alloys because the material can be cast directly from the molten metal. In this study, the manufacturing of high-aluminum-content magnesium alloy sheet using the roll casting process was carried out. The effects of manufacturing parameters, such as roll velocity, pouring temperature and roll gap, on casting were investigated. Microscopic observation of the crystals in the cross-sections of the as-cast strip as well as the rolled strip was conducted.
Abstract: The shortest path routing problem is a multiobjective nonlinear optimization problem with constraints. This problem has been addressed by considering the quality-of-service parameters, delay and cost, as objectives treated separately or as a weighted sum of both. Multiobjective evolutionary algorithms can find multiple Pareto-optimal solutions in a single run, and this ability makes them attractive for solving problems with multiple and conflicting objectives. This paper uses an elitist multiobjective evolutionary algorithm based on the Non-dominated Sorting Genetic Algorithm (NSGA) for solving the dynamic shortest path routing problem in computer networks. A priority-based encoding scheme is proposed for population initialization. Elitism ensures that the best solution does not deteriorate in subsequent generations. Results for a sample test network are presented to demonstrate the capability of the proposed approach to generate well-distributed Pareto-optimal solutions of the dynamic routing problem in a single run. The results obtained by NSGA are compared with a single-objective weighting-factor method, for which a Genetic Algorithm (GA) was applied.
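The ranking step at the heart of NSGA can be sketched as follows; the five (delay, cost) pairs are made-up route evaluations for illustration, not values from the paper's test network:

```python
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated_sort(points):
    """Peel off successive Pareto fronts, as in NSGA's ranking step."""
    fronts, remaining = [], list(range(len(points)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# made-up (delay, cost) evaluations for five candidate routes
routes = [(2, 9), (3, 7), (5, 4), (4, 6), (6, 8)]
fronts = non_dominated_sort(routes)   # first front = Pareto-optimal routes
```

Here the first four routes are mutually non-dominated (each trades delay against cost), while the last is dominated and lands in the second front; NSGA assigns fitness by front rank, so the whole first front survives selection together.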
Abstract: The success rate of data warehousing is not high enough. User dissatisfaction and failure to adhere to time frames and budgets are
too common. Most traditional information systems practices are
rooted in hard systems thinking. Today, the great systems thinkers
are forgotten by information systems developers. A data warehouse
is still a system and it is worth investigating whether systems
thinkers such as Churchman can enhance our practices today. This
paper investigates data warehouse development practices from a
systems thinking perspective. An empirical investigation is done in
order to understand the everyday practices of data warehousing
professionals from a systems perspective. The paper presents a
model for the application of Churchman's systems approach in data
warehouse development.
Abstract: In communication networks where communication nodes are connected with finite capacity transmission links, the packet inter-arrival times are strongly correlated with the packet length and the link capacity (or the packet service time). Such correlation affects the system performance significantly, but little attention has been paid to this issue. In this paper, we propose a mathematical framework to study the impact of the correlation between the packet service times and the packet inter-arrival times on system performance. With our mathematical model, we analyze the system performance, e.g., the unfinished work of the system, and show that the correlation affects the system performance significantly. Some numerical examples are also provided.
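The paper's mathematical framework is not reproduced here, but the effect it studies is easy to see in a toy simulation (assumed link capacities, exponential packet lengths, Lindley recursion for the unfinished work seen at arrivals): when each inter-arrival gap includes the packet's own transmission time on a finite-capacity input link, service times and gaps become positively correlated and the mean unfinished work drops relative to an independent arrival stream with the same mean:

```python
import random

random.seed(42)
C_in, C_out = 1.0, 1.2                 # input/output link capacities (assumed)
n = 200_000                            # simulated packets per scenario

def mean_wait(correlated):
    """Lindley recursion: unfinished work seen by successive arrivals."""
    w, total = 0.0, 0.0
    for _ in range(n):
        length = random.expovariate(1.0)        # exponential packet length
        service = length / C_out                # service time on output link
        if correlated:
            # the next packet cannot finish arriving before this packet's
            # own transmission time on the finite-capacity input link
            gap = length / C_in + random.expovariate(2.0)
        else:
            gap = random.expovariate(1.0 / 1.5)  # same mean gap, independent
        w = max(0.0, w + service - gap)
        total += w
    return total / n

w_corr = mean_wait(True)
w_ind = mean_wait(False)
```

Both scenarios have identical mean service time and mean inter-arrival time; only the correlation differs, which is exactly the point the abstract makes about performance models that ignore it.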