Abstract: Vernonia divergens Benth., commonly known as the "Insulin Plant" (Fam: Asteraceae), is a potent hypoglycemic agent. Locally, the leaves of the plant, boiled in water, are successfully administered to large numbers of diabetic patients. The present study evaluates the putative anti-diabetic ingredients, isolated from in vivo and in vitro grown plantlets of V. divergens, for their antimicrobial and anticancer activities. Sterilized nodal-segment explants were cultured on MS (Murashige and Skoog, 1962) medium in the presence of different combinations of hormones. Multiple shoots along with bunches of roots were regenerated at 1 mg l-1 BAP and 0.5 mg l-1 NAA. Micro-plantlets were separated and sub-cultured on double strength (2X) of the above hormone combination, leading to increased root and shoot length. These plantlets were successfully transferred to soil and survived well in nature. Ethanol extracts of plantlets from both in vivo and in vitro sources were prepared in a Soxhlet extractor and then concentrated to dryness under reduced pressure in a rotary evaporator. The concentrated extracts thus obtained showed significant inhibitory activity against Gram-negative bacteria such as Escherichia coli and Pseudomonas aeruginosa, but no inhibition was found against Gram-positive bacteria. Further, these ethanol extracts were screened for in vitro percentage cytotoxicity at different time periods (24 h, 48 h and 72 h) and at different dilutions. The in vivo plant extract inhibited the growth of EAC mouse cell lines by 65, 66, 78, and 88% at 100, 50, 25 and 12.5 μg mL-1, respectively, but only at 72 h of treatment. For the extract of in vitro origin, inhibition of EAC cell lines was found even at 48 h. During spectrophotometric scanning, the extracts exhibited different absorption maxima (λmax): four peaks in the in vitro extract as against a single peak in the in vivo preparation, suggesting a possible change in the nature of the ingredients during micropropagation through tissue culture techniques.
Abstract: The purpose of this paper is twofold. First, it explains the major problems causing stagnation in brownfield redevelopment; in the context of today's multi-actor built environment, these problems are becoming increasingly complex to observe. Second, it proposes a prospective decision-making approach most appropriate for observing and reacting to these stagnation problems. Such an approach should be regarded as a prescriptive-interactive decision-making approach, a barely established branch. It should offer models with both a prescriptive and an interactive component, enabling them to cope successfully with the multi-actor environment. Overall, this paper provides up-to-date insight into brownfield stagnation by gradually introducing today's major problems, and offers a prospective decision-making approach by which these problems could be tackled.
Abstract: Recordings from recent earthquakes have provided evidence that ground motions in the near field of a rupturing fault differ from ordinary ground motions, as they can contain a large-energy, or "directivity", pulse. This pulse can cause considerable damage during an earthquake, especially to structures with natural periods close to those of the pulse. Failures of modern engineered structures observed within the near-fault region in recent earthquakes have revealed the vulnerability of existing RC buildings to pulse-type ground motions. This may be because these modern structures were designed primarily using the design spectra of available standards, which were developed using stochastic processes with the relatively long duration that characterizes more distant ground motions. Many recently designed and constructed buildings may therefore require strengthening in order to perform well when subjected to near-fault ground motions. Fiber-reinforced polymers (FRP) are considered a viable strengthening alternative, due to their relatively easy and quick installation, low life-cycle costs and zero maintenance requirements. The objective of this paper is to investigate the adequacy of artificial neural networks (ANN) to determine the three-dimensional dynamic response of FRP-strengthened RC buildings under near-fault ground motions. For this purpose, one ANN model is proposed to estimate the base shear force, base bending moments and roof displacement of buildings in two directions. A training set of 168 buildings and a validation set of 21 buildings are produced from finite element analysis results of the dynamic response of RC buildings under near-fault earthquakes. It is demonstrated that the neural-network-based approach is highly successful in determining the response.
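As an illustration of the kind of surrogate model described above, the following is a minimal sketch, not the authors' implementation: a small multilayer-perceptron regressor is trained to map assumed building/ground-motion features to the five response quantities named in the abstract. The feature names and the randomly generated stand-in data are assumptions for demonstration only.

```python
# Minimal sketch (not the paper's model): an MLP surrogate that maps
# assumed building/ground-motion features to five response quantities.
# The synthetic data below is a stand-in for the FE-analysis results.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_train, n_val, n_features = 168, 21, 6   # set sizes taken from the abstract

# Hypothetical inputs, e.g. storey count, FRP ratio, PGA, pulse period, ...
X_train = rng.uniform(0.0, 1.0, size=(n_train, n_features))
X_val = rng.uniform(0.0, 1.0, size=(n_val, n_features))

# Five outputs: base shear, two base bending moments, two roof displacements.
def toy_response(X):
    # Placeholder nonlinear mapping standing in for the FE analyses.
    return np.column_stack([
        X[:, 0] * X[:, 2], X[:, 1] ** 2, X[:, 2] + X[:, 3],
        np.sin(X[:, 4]), X[:, 5] * X[:, 0],
    ])

y_train, y_val = toy_response(X_train), toy_response(X_val)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)
print("validation R^2:", model.score(X_val, y_val))
```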
Abstract: Object-oriented simulation is considered one of the most sophisticated techniques widely used in planning, designing, executing and maintaining construction projects. This technique enables the modeler to focus on objects, which is extremely important for a thorough understanding of a system. Thus, identifying an object is an essential step in building a successful simulation model. In a maintenance process, an object is a maintenance work order (MWO). This study demonstrates a maintenance simulation model for the building maintenance division of the Saudi Consolidated Electric Company (SCECO) in Dammam, Saudi Arabia. The model focuses on both types of maintenance process, namely: (1) preventive maintenance (PM) and (2) corrective maintenance (CM). It is apparent from the findings that object-oriented simulation is a good diagnostic and experimental tool, because problems, limitations, bottlenecks and so forth are easily identified; such features are very difficult to obtain with other tools.
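To make the object-oriented idea concrete, here is a minimal, hypothetical sketch (not SCECO's model): the maintenance work order is the central object, tagged as PM or CM and processed by a simple event loop. The names, durations and crew count are illustrative assumptions.

```python
# Hypothetical sketch of an object-oriented maintenance simulation:
# the maintenance work order (MWO) is the central object.
import heapq
from dataclasses import dataclass

@dataclass
class MWO:
    name: str
    kind: str        # "PM" (preventive) or "CM" (corrective)
    due_time: float  # simulated hour the order becomes due
    duration: float  # hours of crew work required

def simulate(orders, crews=2):
    """Process work orders earliest-due-first with a fixed number of crews."""
    queue = [(wo.due_time, i, wo) for i, wo in enumerate(orders)]
    heapq.heapify(queue)
    crew_free = [0.0] * crews          # time each crew becomes available
    while queue:
        _, _, wo = heapq.heappop(queue)
        c = min(range(crews), key=lambda k: crew_free[k])
        start = max(crew_free[c], wo.due_time)
        crew_free[c] = start + wo.duration
        print(f"[{wo.kind}] {wo.name}: start {start:4.1f} h, "
              f"finish {crew_free[c]:4.1f} h")

simulate([
    MWO("pump inspection", "PM", 0.0, 2.0),
    MWO("breaker failure", "CM", 1.0, 4.0),
    MWO("filter change", "PM", 8.0, 1.0),
])
```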
Abstract: Weak signal detection has been of crucial importance in various engineering and scientific applications for the past couple of decades. It finds application in areas such as wireless communication, radar, aerospace engineering and control systems, among many others. Weak signal detection usually requires a phase-sensitive detector and a demodulation module to detect and analyze the signal. This article gives a preamble to an intrusion detection system which can effectively detect a weak signal from a multiplexed signal. By carefully inspecting and analyzing the respective signal, this system can successfully indicate any perimeter intrusion. The intrusion detection system (IDS) is a comprehensive and easy approach to detecting and analyzing any signal that is weakened and garbled due to a low signal-to-noise ratio (SNR). This approach is of significant importance in applications such as perimeter security systems.
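Since the abstract mentions phase-sensitive detection, the following is a generic, minimal sketch of that standard technique (not this system's implementation): a weak sinusoid buried in noise is recovered by multiplying with in-phase and quadrature references and low-pass filtering. The signal frequency, amplitude and noise level are illustrative assumptions.

```python
# Generic phase-sensitive (lock-in) detection sketch: recover the amplitude
# and phase of a weak sinusoid buried in noise at a known reference frequency.
import numpy as np

rng = np.random.default_rng(1)
fs, f_ref, T = 10_000.0, 50.0, 10.0      # sample rate, reference freq, duration
t = np.arange(0.0, T, 1.0 / fs)

a_true, phi_true = 0.05, 0.6             # weak signal buried in unit-variance noise
x = a_true * np.sin(2 * np.pi * f_ref * t + phi_true) + rng.normal(0.0, 1.0, t.size)

# Multiply by quadrature references and average (a crude low-pass filter);
# averaging over many cycles suppresses the broadband noise.
i_comp = 2.0 * np.mean(x * np.sin(2 * np.pi * f_ref * t))   # in-phase
q_comp = 2.0 * np.mean(x * np.cos(2 * np.pi * f_ref * t))   # quadrature

amplitude = np.hypot(i_comp, q_comp)
phase = np.arctan2(q_comp, i_comp)
print(f"estimated amplitude {amplitude:.4f} (true {a_true}), "
      f"phase {phase:.2f} rad (true {phi_true})")
```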
Abstract: This paper presents a heuristic approach to solve the Generalized Assignment Problem (GAP), which is NP-hard. It is worth mentioning that many researchers have developed algorithms for identifying the redundant constraints and variables in linear programming models. Some of these algorithms use the intercept matrix of the constraints to identify redundant constraints and variables prior to the start of the solution process. Here, a new heuristic approach based on the dominance property of the intercept matrix is proposed to find optimal or near-optimal solutions of the GAP. In this heuristic, redundant variables of the GAP are identified by applying the dominance property of the intercept matrix repeatedly. The heuristic is tested on 90 benchmark problems of sizes up to 4000, taken from the OR-Library, and the results are compared with the optimum solutions. The computational complexity of solving the GAP using this approach is shown to be O(mn^2). The performance of the heuristic is also compared with the best state-of-the-art heuristic algorithms with respect to the quality of the solutions. The encouraging results, especially for relatively large test problems, indicate that this heuristic approach can successfully be used for finding good solutions to highly constrained NP-hard problems.
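The following is a minimal, hypothetical sketch of an intercept-based greedy heuristic for the GAP, written to illustrate the flavour of the approach rather than to reproduce the paper's dominance rules: each job is assigned to the feasible agent whose capacity-to-requirement intercept is largest, i.e. the agent with the most remaining room per unit of resource the job consumes. All data are illustrative.

```python
# Hypothetical greedy sketch for the Generalized Assignment Problem (GAP):
# assign each job to a feasible agent using the "intercept" b_i / r_ij
# (remaining capacity over resource requirement) as a dominance score.
# This illustrates the flavour of intercept-based heuristics; it is not
# the paper's exact dominance procedure.
import numpy as np

cost = np.array([[4, 1, 3], [2, 2, 5], [3, 4, 2], [5, 3, 4]])  # c[j, i]
req = np.array([[3, 2, 4], [2, 3, 3], [4, 2, 2], [3, 4, 3]])   # r[j, i]
cap = np.array([7.0, 6.0, 7.0])                                 # b[i]

n_jobs, n_agents = cost.shape
remaining = cap.copy()
assignment = [-1] * n_jobs

# Handle jobs with the largest average requirement (least slack) first.
for j in sorted(range(n_jobs), key=lambda j: -req[j].mean()):
    feasible = [i for i in range(n_agents) if req[j, i] <= remaining[i]]
    if not feasible:
        raise RuntimeError(f"no feasible agent for job {j}")
    # Intercept score: remaining capacity per unit of resource consumed;
    # break ties in favour of cheaper assignments.
    i_best = max(feasible, key=lambda i: (remaining[i] / req[j, i], -cost[j, i]))
    assignment[j] = i_best
    remaining[i_best] -= req[j, i_best]

total = sum(cost[j, assignment[j]] for j in range(n_jobs))
print("assignment (job -> agent):", assignment, "total cost:", total)
```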
Abstract: Eigenvector methods are gaining increasing acceptance in the area of spectrum estimation. This paper presents a successful attempt at testing and evaluating the performance of two of the most popular subspace techniques in determining the parameters of multiexponential signals with real decay constants buried in noise. In particular, the MUSIC (Multiple Signal Classification) and minimum-norm techniques are examined. It is shown that these methods perform almost equally well on multiexponential signals, with MUSIC displaying better-defined peaks.
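As a minimal illustration of the MUSIC idea applied to damped real exponentials (a sketch under stated assumptions, not the paper's experimental setup): the noise subspace of a Hankel-structured covariance estimate is used to build a pseudospectrum over candidate decay constants, whose peaks indicate the true decays. The sampling step, decay values and model order are illustrative.

```python
# Minimal MUSIC sketch for damped real exponentials (illustrative only):
# peaks of the pseudospectrum over candidate decay constants indicate
# the decays present in the noisy signal.
import numpy as np

rng = np.random.default_rng(0)
dt, n = 0.02, 400
t = np.arange(n) * dt
decays_true = [1.5, 6.0]                       # decay constants (1/s)
x = sum(np.exp(-d * t) for d in decays_true) + rng.normal(0.0, 0.01, n)

# Hankel data matrix and covariance estimate.
L = 60
X = np.lib.stride_tricks.sliding_window_view(x, L)   # (n-L+1, L)
R = X.T @ X / X.shape[0]

# Eigendecomposition: p signal eigenvectors; the rest span the noise subspace.
p = len(decays_true)
eigvals, eigvecs = np.linalg.eigh(R)                 # ascending eigenvalues
En = eigvecs[:, : L - p]                             # noise subspace

# MUSIC pseudospectrum over a grid of candidate decay constants.
grid = np.linspace(0.1, 10.0, 2000)
pseudo = np.empty_like(grid)
for k, d in enumerate(grid):
    a = np.exp(-d * dt * np.arange(L))               # "steering" vector
    a /= np.linalg.norm(a)
    pseudo[k] = 1.0 / np.sum((En.T @ a) ** 2)

# Report the two largest local maxima as decay estimates.
peaks = [k for k in range(1, len(grid) - 1)
         if pseudo[k] > pseudo[k - 1] and pseudo[k] > pseudo[k + 1]]
best = sorted(peaks, key=lambda k: -pseudo[k])[:p]
print("estimated decays:", sorted(grid[k] for k in best), "true:", decays_true)
```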
Abstract: Manufacturing, production and service industries within Libya have struggled with many problems during the past two decades. These problems have had a negative impact on the productivity and utilization of many industries around the country. This paper studies the implementation levels of the manufacturing control system known as Manufacturing Resource Planning (MRPII) adopted within some Libyan industries. A survey methodology was applied in this research. Based on the survey analysis, the results indicate that these industries have only a modest strategy towards most of the areas considered crucial to implementing these systems successfully. The findings also show variation within these implementation levels with respect to the key elements related to MRPII, with the highest levels found in the emphasis on financial data accuracy. The paper also identifies limitations within the investigated manufacturing and managerial areas and points to where senior managers should take immediate action in order to achieve effective implementation of MRPII within their business areas.
Abstract: A novel calibration approach that aims to reduce ASM2d parameter subsets and decrease model complexity is presented. This approach does not require high computational demand and reduces the number of modeling parameters required to achieve ASM calibration by employing a sensitivity and iteration methodology. Parameter sensitivity is a crucial factor, and the iteration methodology enables refinement of the simulated parameter values. During the iteration process, parameter values are determined in descending order of their sensitivities. The number of iterations required is equal to the number of model parameters in the parameter significance ranking. This approach was successfully applied to the ASM2d model to evaluate enhanced biological phosphorus removal (EBPR). The simulation results provide the calibration parameters, which included Y_PAO, Y_PO4, Y_PHA, q_PHA, q_PP, μ_PAO, b_PAO, b_PP, b_PHA, K_PS, Y_A, μ_AUT, b_AUT, K_O2,AUT, and K_NH4,AUT. These parameters corresponded well to the available experimental data.
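A minimal, generic sketch of the sensitivity-and-iteration idea described above (not the authors' ASM2d code): parameters are ranked by a local sensitivity measure, then calibrated one at a time in descending order of sensitivity, one iteration per ranked parameter. The toy model, parameter names and data are illustrative assumptions.

```python
# Generic sketch of sensitivity-ranked, one-parameter-at-a-time calibration
# (illustrative; not the ASM2d implementation).
import numpy as np
from scipy.optimize import minimize_scalar

def model(params, t):
    """Toy stand-in for the process model: params = (rate, yield_coeff)."""
    rate, yield_coeff = params
    return yield_coeff * (1.0 - np.exp(-rate * t))

t_obs = np.linspace(0.0, 5.0, 30)
true_params = np.array([0.8, 2.0])
observed = model(true_params, t_obs)          # pretend these are measurements

def sse(params):
    return float(np.sum((model(params, t_obs) - observed) ** 2))

names = ["rate", "yield_coeff"]
p = np.array([0.3, 1.2])                      # initial guesses

# 1) Local sensitivity of the objective to a +1% change in each parameter.
sens = np.empty(len(p))
for i in range(len(p)):
    q = p.copy(); q[i] *= 1.01
    sens[i] = abs(sse(q) - sse(p)) / 0.01
ranking = np.argsort(-sens)                   # descending sensitivity

# 2) One iteration per ranked parameter: refine each in turn.
for i in ranking:
    def obj(v, i=i):
        q = p.copy(); q[i] = v
        return sse(q)
    res = minimize_scalar(obj, bounds=(1e-3, 10.0), method="bounded")
    p[i] = res.x
    print(f"calibrated {names[i]}: {p[i]:.3f}")

print("final SSE:", sse(p))
```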
Abstract: Meshing is the process of discretizing the problem domain into many subdomains before numerical calculation can be performed. One of the most popular mesh types is the tetrahedral mesh, due to its flexibility to fit almost any domain shape. In both 2D and 3D domains, triangular and tetrahedral meshes can be generated using Delaunay triangulation. Mesh quality is an important factor in performing any Computational Fluid Dynamics (CFD) simulation, as the results are highly affected by it, and much effort has been devoted to improving mesh quality. This paper describes a mesh generation routine that has been developed to be capable of generating high-quality tetrahedral cells in arbitrarily complex geometries. A few CFD test cases are used to test the mesh generator, and the resulting meshes are compared with those generated by a commercial software package. The results show that no slivers exist in the generated meshes, and the overall quality is acceptable, since the percentage of bad tetrahedra is relatively small. Boundary recovery was also completed successfully, with all missing faces rebuilt.
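To illustrate the kind of quality check discussed above, the following is a minimal sketch (not the paper's routine): scipy's Delaunay triangulation tetrahedralizes a random point cloud, and each tetrahedron is scored with one common shape measure, the volume normalized by the RMS edge length cubed (1 for a regular tetrahedron, near 0 for a sliver). The point cloud and the sliver threshold are illustrative assumptions.

```python
# Minimal sketch: Delaunay tetrahedralization plus a simple shape-quality
# measure to flag slivers (illustrative; not the paper's mesh generator).
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
points = rng.uniform(0.0, 1.0, size=(200, 3))
mesh = Delaunay(points)                       # tetrahedra in mesh.simplices

def tet_quality(p0, p1, p2, p3):
    """6*sqrt(2)*V / l_rms^3: 1 for a regular tetrahedron, ~0 for a sliver."""
    vol = abs(np.linalg.det(np.column_stack([p1 - p0, p2 - p0, p3 - p0]))) / 6.0
    edges = [p1 - p0, p2 - p0, p3 - p0, p2 - p1, p3 - p1, p3 - p2]
    l_rms = np.sqrt(np.mean([e @ e for e in edges]))
    return 6.0 * np.sqrt(2.0) * vol / l_rms ** 3

quality = np.array([tet_quality(*points[s]) for s in mesh.simplices])
threshold = 0.05                              # assumed sliver cut-off
print(f"{len(quality)} tets, worst quality {quality.min():.4f}, "
      f"slivers (<{threshold}): {(quality < threshold).sum()} "
      f"({100.0 * np.mean(quality < threshold):.1f}%)")
```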
Abstract: A unique combination of adsorption and electrochemical regeneration with a proprietary adsorbent material called Nyex 100 was introduced at the University of Manchester for wastewater treatment applications. Nyex 100 is based on a graphite intercalation compound; it is a non-porous, electrically conducting adsorbent material. This material exhibited a very small BET surface area (2.75 m2 g-1); in consequence, only small adsorptive capacities were obtained for the adsorption of various organic pollutants. This work aims to develop a composite adsorbent material capable of electrochemical regeneration coupled with improved adsorption characteristics. An organic dye, Acid Violet 17, was used as the standard organic pollutant. The developed composite material was successfully regenerated electrochemically using a DC current of 1 A for 60 minutes. Regeneration efficiency was maintained at around 100% over five adsorption-regeneration cycles.
Abstract: Despite many success stories of manufacturing safety, many organizations are still reluctant to adopt it, perceiving it as cost-increasing and time-consuming. A likely contributor is the use of lagging indicators rather than leading-indicator measures. The study therefore proposes a combinatorial model for determining the best safety strategy. Combination theory and cost-benefit analysis were employed to develop a monetary saving/loss function in terms of the value of prevention and the cost of the prevention strategy. Documentation, interviews and a structured questionnaire were employed to collect information on before-and-after safety programme records from a tobacco company for the periods 1993-2001 (pre-safety) and 2002-2008 (safety period) for the model application. Three combinatorial alternatives A, B and C were obtained, resulting in 4, 6 and 4 strategies respectively, with PPE and training predominant. A total of 728 accidents were recorded over the 9-year pre-safety period and 163 accidents over the 7-year safety period. Six prevention activities (alternative B) yielded the best results; savings were experienced in all years of operation except 2004. The study provides a leading resource for planning a successful safety programme.
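The abstract does not give the saving/loss function explicitly; a minimal generic form consistent with the description (value of prevention minus cost of the prevention strategy, summed over the activities in a combinatorial alternative) might be written as follows. The symbols are assumptions for illustration.

```latex
% Hypothetical generic form of the monetary saving/loss function:
% S(A) > 0 indicates a net saving for prevention-strategy alternative A.
\[
  S(A) \;=\; \underbrace{\sum_{k \in A} V_k}_{\text{value of preventions}}
        \;-\; \underbrace{\sum_{k \in A} C_k}_{\text{cost of prevention strategy}}
\]
% where A is a combination of prevention activities (e.g. PPE, training),
% V_k the monetary value of accidents prevented by activity k, and
% C_k the cost of implementing activity k.
```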
Abstract: Reverse engineering of full-genomic interaction networks based on compendia of expression data has been successfully applied to a number of model organisms. This study adapts these approaches to an important non-model organism: the major human fungal pathogen Candida albicans. During the infection process, the pathogen can adapt to a wide range of environmental niches and reversibly change its growth form. Given the importance of these processes, it is important to know how they are regulated. This study presents a reverse engineering strategy able to infer full-genomic interaction networks for C. albicans based on linear regression, utilizing the sparseness criterion (LASSO). To overcome the limited amount of expression data and the small number of known interactions, we utilize different prior-knowledge sources to guide the network inference towards a knowledge-driven solution. Since no database of known interactions for C. albicans exists, we use a text-mining system that utilizes full-text research papers to identify known regulatory interactions. By comparing with these known regulatory interactions, we find optimal values for the global modelling parameters weighting the influence of the sparseness criterion and the prior knowledge. Furthermore, we show that soft integration of prior knowledge additionally improves the performance. Finally, we compare the performance of our approach to state-of-the-art network inference approaches.
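A minimal sketch of LASSO-based network inference with soft prior integration (a generic illustration, not the study's pipeline): each gene's expression is regressed on candidate regulators, and prior-supported edges receive a smaller effective L1 penalty by rescaling the corresponding columns. The regulator indices, the prior weights and the synthetic data are assumptions.

```python
# Generic sketch: LASSO network inference with soft prior-knowledge
# integration via per-regulator penalty weights (illustrative only).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_samples, n_regulators = 50, 8
X = rng.normal(size=(n_samples, n_regulators))      # regulator expression
beta_true = np.array([1.5, 0.0, -0.8, 0.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(0.0, 0.1, n_samples) # target-gene expression

# Soft prior: weight < 1 lowers the penalty on an edge the literature
# supports (here, regulators 0 and 2); 1.0 means no prior knowledge.
prior_weight = np.array([0.3, 1.0, 0.3, 1.0, 1.0, 1.0, 1.0, 1.0])

# Dividing column j by w_j makes the L1 penalty on beta_j effectively
# alpha * w_j, so prior-supported edges are shrunk less.
X_scaled = X / prior_weight
model = Lasso(alpha=0.1).fit(X_scaled, y)
beta_hat = model.coef_ / prior_weight               # back to original scale

for j, b in enumerate(beta_hat):
    if abs(b) > 1e-6:
        print(f"edge regulator_{j} -> target: {b:+.3f}")
```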
Abstract: An autonomous environmental monitoring system (Smart Landfill) has been constructed for the quantitative measurement of the components of landfill gas found at borehole wells at the perimeter of landfill sites. The main components of landfill gas, the greenhouse gases methane and carbon dioxide, have been monitored in the range 0-5% by volume. This monitoring system has not only been tested in the laboratory but has also been deployed in multiple field trials, where the data collected compared successfully with those from on-site monitors. This success shows the potential of this system for application in environments where reliable gas monitoring is crucial.
Abstract: The forces driving international markets are changing continuously, so companies need to gain a competitive edge in such markets. Improving a company's products, processes and practices is no longer a secondary concern. Lean production is a production management philosophy that consolidates work tasks with minimum waste, resulting in improved productivity. Lean production practices can be mapped onto many production areas, one of which is Manufacturing Equipment and Technology (MET). Many lean production practices can be implemented in MET, namely: specific equipment configurations, total preventive maintenance, visual control, new equipment/technologies, production process reengineering, and a shared vision of perfection. The purpose of this paper is to investigate the implementation level of these six practices in Jordanian industries. To achieve this, a questionnaire survey was designed using a five-point Likert scale. The questionnaire was validated through a pilot study and expert review. A sample of 350 Jordanian companies was surveyed; the response rate was 83%. The respondents were asked to rate the extent of implementation of each practice. A conceptual relationship model is developed, hypotheses are proposed, and the essential statistical analyses are then performed. An assessment tool that enables management to monitor the progress and effectiveness of lean practice implementation is designed and presented. The results show that the average implementation level of lean practices in MET is 77%, that Jordanian companies are successfully implementing the considered lean production practices, and that the presented model has a Cronbach's alpha value of 0.87, which is good evidence of model consistency and validates the results.
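For reference, Cronbach's alpha, the consistency measure reported above, can be computed from the item scores as in this minimal sketch; the Likert data below are randomly generated stand-ins, not the survey's responses.

```python
# Cronbach's alpha for a set of Likert-scale items (synthetic stand-in data).
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total score
    return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(3.0, 1.0, size=(290, 1))      # shared underlying trait
items = np.clip(np.rint(latent + rng.normal(0, 0.7, (290, 6))), 1, 5)
print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
```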
Abstract: As more people from non-technical backgrounds become directly involved with large-scale ontology development, the focal point of ontology research has shifted from the more theoretical ontology issues to problems associated with the actual use of ontologies in real-world, large-scale collaborative applications. Recently, the National Science Foundation funded a large collaborative ontology development project for which a new formal ontology model, the Ontology Abstract Machine (OAM), was developed to satisfy some unique functional and data representation requirements. This paper introduces the OAM model and the related algorithms that enable maintenance of an ontology supporting node-based user access. The successful software implementation of the OAM model and its subsequent acceptance by a large research community prove its validity and its real-world application value.
Abstract: This paper presents the transient population dynamics of phase singularities in the 2D Beeler-Reuter model. Two stochastic modeling approaches are examined: (i) the master equation approach with transition rates λ(n, t) = λ(t)n and μ(n, t) = μ(t)n, and (ii) the nonlinear Langevin equation approach with multiplicative noise. The exact general solution of the master equation with arbitrary time-dependent transition rates is given, followed by the exact solution of the mean-field equation for the nonlinear Langevin equation. It is demonstrated that the transient population dynamics is successfully described by a generalized logistic equation with a fractional higher-order nonlinear term. The necessity of introducing a time-dependent transition rate in the master equation approach in order to incorporate the effect of nonlinearity is also demonstrated.
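For orientation, the linear birth-death master equation with the rates quoted above, its mean-field consequence, and a generalized logistic form with a fractional-order nonlinearity (the exponent ν is introduced here purely for illustration) can be sketched as:

```latex
% Birth-death master equation with rates lambda(n,t) = lambda(t) n and
% mu(n,t) = mu(t) n for the number of phase singularities n:
\[
  \frac{\partial P(n,t)}{\partial t}
    = \lambda(t)\,(n-1)\,P(n-1,t) + \mu(t)\,(n+1)\,P(n+1,t)
      - \bigl[\lambda(t)+\mu(t)\bigr]\,n\,P(n,t),
\]
% whose mean obeys the linear equation
\[
  \frac{d\langle n\rangle}{dt} = \bigl[\lambda(t)-\mu(t)\bigr]\,\langle n\rangle .
\]
% A generalized logistic equation with a fractional higher-order term
% (exponent \nu > 0 is an illustrative assumption):
\[
  \frac{dN}{dt} = r\,N\left[1-\left(\frac{N}{K}\right)^{\nu}\right].
\]
```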
Abstract: This paper presents a novel approach for representing the spatio-temporal topology of a camera network with overlapping and non-overlapping fields of view (FOVs). The topology is determined by tracking moving objects and establishing object correspondence across multiple cameras. To track people successfully in multiple camera views, we use the Merge-Split (MS) approach to handle object occlusion in a single camera and a grid-based approach for extracting accurate object features. In addition, we consider the appearance of people and the transition time between entry and exit zones for tracking objects across the blind regions of multiple cameras with non-overlapping FOVs. The main contribution of this paper is to estimate transition times between the various entry and exit zones, and to represent the camera topology graphically as an undirected weighted graph using the transition probabilities.
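A minimal sketch of the final representation step (generic, not the paper's code): observed transitions between entry/exit zones are counted, normalized into transition probabilities, and stored as an undirected weighted graph using networkx. Zone names and counts are illustrative.

```python
# Minimal sketch: build an undirected weighted graph of camera zones from
# observed inter-zone transitions (illustrative zone names and counts).
from collections import Counter
import networkx as nx

# (exit_zone, entry_zone) pairs observed while tracking objects across views.
observed = [("cam1_exit", "cam2_entry")] * 8 + \
           [("cam1_exit", "cam3_entry")] * 2 + \
           [("cam2_exit", "cam3_entry")] * 5

counts = Counter(observed)
total_from = Counter()
for (src, dst), c in counts.items():
    total_from[src] += c

G = nx.Graph()
for (src, dst), c in counts.items():
    # Edge weight = estimated transition probability from src to dst.
    G.add_edge(src, dst, weight=c / total_from[src])

for u, v, data in G.edges(data=True):
    print(f"{u} -- {v}: p = {data['weight']:.2f}")
```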
Abstract: Retinal vascularity assessment plays an important role in the diagnosis of ophthalmic pathologies. The use of digital images for this purpose makes a computerized approach possible and has motivated the development of many methods for automated vascular tree segmentation. Metrics based on contingency tables for binary classification have been widely used for evaluating the performance of these algorithms; in particular, accuracy has mostly been used as the measure of global performance in this topic. However, this metric matches human perception very poorly and has other notable deficiencies. Here, a new similarity function for measuring the quality of retinal vessel segmentations is proposed. It is based on characterizing the vascular tree as a connected structure with a measurable area and length. Tests indicate that this new approach behaves better than the current one. Generalizing, this concept of measuring descriptive properties may be used to design functions that measure the segmentation quality of other complex structures more successfully.
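The exact similarity function is not given in the abstract; the following is a hypothetical sketch of the idea, comparing a segmentation to a reference via the descriptive properties named above: area (pixel count) and length (skeleton pixel count) of the binary vessel mask, using scikit-image's skeletonize. The combination rule is an assumption for illustration.

```python
# Hypothetical sketch: compare two binary vessel masks through descriptive
# properties (area and skeleton length) rather than pixel-wise accuracy.
import numpy as np
from skimage.morphology import skeletonize

def area_and_length(mask):
    """Area = vessel pixel count; length = pixels of the morphological skeleton."""
    return mask.sum(), skeletonize(mask).sum()

def similarity(seg, ref):
    """1 when area and length agree; decreases with relative disagreement."""
    a_s, l_s = area_and_length(seg)
    a_r, l_r = area_and_length(ref)
    area_term = min(a_s, a_r) / max(a_s, a_r)
    length_term = min(l_s, l_r) / max(l_s, l_r)
    return 0.5 * (area_term + length_term)   # assumed equal weighting

# Toy example: a reference "vessel" and a slightly thicker segmentation.
ref = np.zeros((64, 64), dtype=bool)
ref[30:33, 5:60] = True                      # 3-pixel-wide horizontal vessel
seg = np.zeros_like(ref)
seg[29:34, 5:60] = True                      # 5-pixel-wide version

print(f"similarity: {similarity(seg, ref):.3f}")
```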
Abstract: User-based collaborative filtering (CF), one of the most prevalent and efficient recommendation techniques, provides personalized recommendations to users based on the opinions of other users. Although the CF technique has been successfully applied in various applications, it suffers from serious sparsity problems. The cloud-model approach addresses these sparsity problems by constructing the user's global preference, represented by a cloud eigenvector. The user-based CF approach works well with dense datasets, while the cloud-model CF approach performs better when the dataset is sparse. In this paper, we present a hybrid approach that integrates the predictions from both the user-based CF and the cloud-model CF approaches. The experimental results show that the proposed hybrid approach can ameliorate the sparsity problem and provide improved prediction quality.
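A minimal sketch of the hybrid idea (a generic illustration, not the paper's system): a user-based CF prediction and a cloud-model-style prediction, the latter using each user's global rating statistics as a stand-in for the cloud eigenvector, are blended with a mixing weight. The ratings matrix, the statistics chosen and the weight are assumptions.

```python
# Generic sketch of a hybrid CF prediction: blend a user-based CF estimate
# with a "cloud-model"-style estimate built from users' global rating
# statistics (a stand-in for the cloud eigenvector). Illustrative only.
import numpy as np

R = np.array([                      # rows = users, cols = items, 0 = unrated
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

def predict_user_cf(R, u, i):
    """Mean-centred, similarity-weighted average over users who rated item i."""
    mean_u = R[u][R[u] > 0].mean()
    num = den = 0.0
    for v in range(R.shape[0]):
        if v == u or R[v, i] == 0:
            continue
        common = (R[u] > 0) & (R[v] > 0)
        if common.sum() < 2:
            continue
        sim = np.corrcoef(R[u][common], R[v][common])[0, 1]
        if np.isnan(sim):
            continue
        mean_v = R[v][R[v] > 0].mean()
        num += sim * (R[v, i] - mean_v)
        den += abs(sim)
    return mean_u if den == 0 else mean_u + num / den

def predict_cloud(R, u, i):
    """Stand-in for cloud-model CF: similarity of users' global (mean, std)
    rating statistics instead of co-rated items."""
    stats = np.array([[r[r > 0].mean(), r[r > 0].std()] for r in R])
    num = den = 0.0
    for v in range(R.shape[0]):
        if v == u or R[v, i] == 0:
            continue
        sim = 1.0 / (1.0 + np.linalg.norm(stats[u] - stats[v]))
        num += sim * R[v, i]
        den += sim
    return stats[u, 0] if den == 0 else num / den

def predict_hybrid(R, u, i, lam=0.5):        # lam: assumed mixing weight
    return lam * predict_user_cf(R, u, i) + (1 - lam) * predict_cloud(R, u, i)

print(f"hybrid prediction for user 1, item 2: {predict_hybrid(R, 1, 2):.2f}")
```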