Abstract: This paper examines how mobile learning supports sustainable e-education for multicultural groups of students, reporting its impact on distance education in a multicultural environment. Learning technologies delivered through CDs, the internet, and mobile devices are increasingly adopted by distance-education institutes for quick, cost-effective delivery, but their sustainability depends on the composition of the learners as well as the teaching community. The experimental study was conducted among the distance learners of Vinayaka Missions University at Salem, India. Students were drawn from a multicultural environment spanning different languages, religions, classes, and communities. During the mobile learning sessions, these students were driven by the play impulse rather than by study anxiety or cultural inhibitions. The study confirmed that mobile learning improved student performance despite divisions of region, language, and culture; in other words, technology was able to transcend relative deprivation within multicultural groups. It also confirms that mobile learning provides sustainable e-education through a cost-effective system of instruction, harnessing the self-motivation and play impulse of young learners.
Abstract: This article extends and practically applies Wheeler's NEBIC (Net-Enabled Business Innovation Cycle) theory. NEBIC is a new approach in IS research suited to dynamic environments shaped by new technology. With the support of IT resources, firms can follow market changes rapidly; flexible firms adapt their market strategies and respond more quickly to customers' changing behaviors. When every leading firm in an industry has access to the same IT resources, the way those resources are managed determines a firm's competitive advantage or disadvantage. From the dynamic-capabilities perspective and from Wheeler's newly introduced NEBIC theory, we know that IT resources alone cannot deliver customer value; only a good configuration of those resources can guarantee it, by choosing the right emerging technology and grasping the right economic opportunities through business innovation and growth. We found evidence in the literature that SOA (Service-Oriented Architecture) is a promising emerging technology that can deliver the desired economic opportunity through modularity, flexibility, and loose coupling. SOA can also help firms connect in networks, opening a new window of opportunity for collaborative innovation and the right kind of outsourcing. Many articles and research reports indicate that the failure rate in outsourcing is very high, but research also shows that successful outsourcing projects add tangible and intangible benefits for the service consumer. Business executives and policy makers in the West should not fear outsourcing; rather, they should choose the right strategy, using emerging technology to significantly reduce the outsourcing failure rate.
Abstract: This study describes a capillary-based device with integrated heating and cooling modules for the polymerase chain reaction (PCR). The device consists of a polytetrafluoroethylene (PTFE) reaction capillary and aluminum blocks, and is equipped with two cartridge heaters, a thermoelectric (TE) cooler, a fan, and thermocouples for temperature control. The cartridge heaters are placed in the heating blocks and maintained at two different temperatures to achieve the denaturation and extension steps. Thermocouples inserted into the capillary record the transient temperature profiles of the reaction sample during thermal cycling. A 483-bp DNA template is amplified successfully both in the designed system and in a traditional thermal cycler. This work should be of interest to researchers working on high-temperature reactions and on genomics or cell analysis.
Abstract: Vernonia divergens Benth. (Fam.: Asteraceae), commonly known as the “Insulin Plant,” is locally reputed to be a potent antidiabetic. Leaves of the plant boiled in water are administered successfully to large numbers of diabetic patients. The present study evaluates putative anti-diabetic ingredients, isolated from in vivo and in vitro grown plantlets of V. divergens, for their antimicrobial and anticancer activities. Sterilized nodal-segment explants were cultured on MS (Murashige and Skoog, 1962) medium with different combinations of hormones. Multiple shoots along with bunches of roots were regenerated at 1 mg l-1 BAP and 0.5 mg l-1 NAA. Micro-plantlets were separated and sub-cultured on double strength (2X) of the above hormone combination, leading to increased root and shoot length. These plantlets were transferred successfully to soil and survived well in nature. Ethanol extracts of plantlets from both in vivo and in vitro sources were prepared in a Soxhlet extractor and then concentrated to dryness under reduced pressure in a rotary evaporator. The concentrated extracts thus obtained showed significant inhibitory activity against gram-negative bacteria such as Escherichia coli and Pseudomonas aeruginosa, but no inhibition was found against gram-positive bacteria. These ethanol extracts were further screened for in vitro percentage cytotoxicity at different dilutions over different time periods (24 h, 48 h, and 72 h). The in vivo plant extract inhibited the growth of EAC mouse cell lines in the range of 65, 66, 78, and 88% at 100, 50, 25, and 12.5 μg mL-1, but only at 72 h of treatment. For the extract of in vitro origin, inhibition of the EAC cell lines was found even at 48 h. On spectrophotometric scanning, the extracts exhibited different maxima (λ): four peaks in the in vitro extract as against a single peak in the in vivo preparation, suggesting a possible change in the nature of the ingredients during micropropagation through tissue-culture techniques.
Abstract: The purpose of this paper is twofold. First, it explains the major problems causing stagnation in brownfield redevelopment; given the context of today's multi-actor built environment, these problems are becoming more complex to observe. The paper therefore also suggests a prospective decision-making approach best suited to observing and reacting to the given stagnation problems. Such an approach should be regarded as a prescriptive-interactive decision-making approach, a barely established branch. It should offer models with both a prescriptive and an interactive component, enabling them to cope successfully with the multi-actor environment. Overall, the paper provides up-to-date insight into brownfield stagnation by gradually introducing today's major problems and offers a prospective decision-making approach for tackling them.
Abstract: Recordings from recent earthquakes have provided evidence that ground motions in the near field of a rupturing fault differ from ordinary ground motions, as they can contain a large-energy “directivity” pulse. This pulse can cause considerable damage during an earthquake, especially to structures with natural periods close to those of the pulse. Failures of modern engineered structures observed within the near-fault region in recent earthquakes have revealed the vulnerability of existing RC buildings to pulse-type ground motions. This may be because such structures were designed primarily using the design spectra of available standards, which were developed from stochastic processes of relatively long duration that characterize more distant ground motions. Many recently designed and constructed buildings may therefore require strengthening in order to perform well when subjected to near-fault ground motions. Fiber-reinforced polymers (FRP) are considered a viable strengthening alternative, due to their relatively easy and quick installation, low life-cycle costs, and zero maintenance requirements. The objective of this paper is to investigate the adequacy of artificial neural networks (ANN) for determining the three-dimensional dynamic response of FRP-strengthened RC buildings under near-fault ground motions. For this purpose, an ANN model is proposed to estimate the base shear force, base bending moments, and roof displacement of buildings in two directions. A training set of 168 buildings and a validation set of 21 buildings were produced from finite-element analyses of the dynamic response of RC buildings under near-fault earthquakes. It is demonstrated that the neural-network-based approach is highly successful in determining the response.
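The mapping the abstract describes, from building and ground-motion characteristics to six response quantities (base shear, base moment, and roof displacement in two directions), can be sketched as a small feedforward network. This is a minimal illustrative forward pass only; the feature names, layer sizes, and weights are assumptions, not the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical input features for one building/record pair (names assumed):
# e.g. storey count, FRP ratio, pulse period, PGA, ...
N_FEATURES = 8     # assumed input dimension
N_HIDDEN = 16      # assumed hidden-layer width
N_OUTPUTS = 6      # shear, moment, roof displacement -- each in two directions

# One hidden layer with tanh activation, linear output layer.
# Weights are random here; training against the FEA data set would fit them.
W1 = rng.normal(scale=0.5, size=(N_HIDDEN, N_FEATURES))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.5, size=(N_OUTPUTS, N_HIDDEN))
b2 = np.zeros(N_OUTPUTS)

def predict(x):
    """Forward pass: normalized feature vector -> six response quantities."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

x = rng.normal(size=N_FEATURES)   # one normalized feature vector
y = predict(x)
print(y.shape)  # (6,)
```

In practice the 168 training buildings would be used to fit W1, b1, W2, b2 by backpropagation, with the 21-building set held out for validation.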
Abstract: Object-oriented simulation is considered one of the most sophisticated techniques widely used in planning, designing, executing, and maintaining construction projects. The technique enables the modeler to focus on objects, which is extremely important for a thorough understanding of a system; identifying an object is thus an essential step in building a successful simulation model. In a maintenance process, the object is a maintenance work order (MWO). This study demonstrates a maintenance simulation model for the building maintenance division of the Saudi Consolidated Electric Company (SCECO) in Dammam, Saudi Arabia. The model covers both types of maintenance process: (1) preventive maintenance (PM) and (2) corrective maintenance (CM). The findings make it apparent that object-oriented simulation is a good diagnostic and experimental tool, because problems, limitations, bottlenecks, and so forth are easily identified, features that are very difficult to obtain with other tools.
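The MWO-as-object idea can be sketched as a tiny discrete-event loop in which each work order is an object carrying its own type and duration. The class names, durations, and single-crew assumption are illustrative only, not SCECO's actual model.

```python
import heapq
from dataclasses import dataclass, field

# A minimal object-oriented sketch of the maintenance-work-order (MWO) idea.
@dataclass(order=True)
class WorkOrder:
    due: float                        # time the order enters the system
    kind: str = field(compare=False)  # "PM" (preventive) or "CM" (corrective)
    duration: float = field(compare=False)

def simulate(orders):
    """Single maintenance crew serving orders in arrival-time order."""
    heapq.heapify(orders)             # earliest-due order first
    clock, log = 0.0, []
    while orders:
        wo = heapq.heappop(orders)
        start = max(clock, wo.due)    # crew may idle until the order arrives
        clock = start + wo.duration
        log.append((wo.kind, start, clock))
    return log

log = simulate([WorkOrder(0.0, "PM", 2.0),
                WorkOrder(1.0, "CM", 3.0),
                WorkOrder(1.5, "PM", 1.0)])
print(log)  # [('PM', 0.0, 2.0), ('CM', 2.0, 5.0), ('PM', 5.0, 6.0)]
```

From such a log, crew utilization, waiting times, and queue bottlenecks can be read off directly, which is the diagnostic use the abstract describes.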
Abstract: For the past couple of decades, weak-signal detection has been of crucial importance in various engineering and scientific applications, including wireless communication, radar, aerospace engineering, and control systems. Weak-signal detection usually requires a phase-sensitive detector and a demodulation module to detect and analyze the signal. This article introduces an intrusion detection system (IDS) that can effectively detect a weak signal within a multiplexed signal. By carefully inspecting and analyzing the respective signal, the system can successfully indicate any perimeter intrusion. The IDS is a comprehensive and straightforward approach to detecting and analyzing signals that are weakened and garbled by a low signal-to-noise ratio (SNR), and it is of significant importance in applications such as perimeter security systems.
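One standard way to pull a known weak waveform out of noise, as in the low-SNR setting above, is matched filtering: correlate the received signal against the known template and threshold the peak. This is a generic sketch, not the paper's IDS design; the template, amplitude, and embedding position are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Known signature waveform (assumed): a short sinusoidal burst.
template = np.sin(2 * np.pi * 0.05 * np.arange(128))

# Received signal: unit-variance noise with the weak template buried at
# sample 400 (SNR is low: signal amplitude comparable to the noise floor).
n = 1024
signal = rng.normal(scale=1.0, size=n)
signal[400:528] += template

# Matched filter: correlation peaks where the template is present.
corr = np.correlate(signal, template, mode="valid")
detect_at = int(np.argmax(np.abs(corr)))
print(detect_at)  # near 400
```

Correlation gains roughly the template length in SNR, which is why a signal invisible sample-by-sample still produces a clear detection peak.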
Abstract: Research papers are usually evaluated via peer review. However, peer review has limitations in evaluating research papers. In this paper, Scienstein and the new idea of 'collaborative document evaluation' are presented. Scienstein is a project to evaluate scientific papers collaboratively, based on ratings, links, annotations, and classifications by the scientific community using the internet. The critical success factors of collaborative document evaluation are analyzed: scientists' motivation to participate as reviewers, reviewers' competence, and reviewers' trustworthiness. It is shown that, if these factors are ensured, collaborative document evaluation may prove to be a more objective, faster, and less resource-intensive approach to scientific document evaluation than the classical peer review process. Additional advantages exist: collaborative document evaluation supports interdisciplinary work, allows continuous post-publication quality assessment, and enables the implementation of academic recommendation engines. In the long term, it seems possible that collaborative document evaluation will successively substitute for peer review and decrease the need for journals.
Abstract: This paper describes the results of an extensive study and comparison of the popular hash functions SHA-1, SHA-256, RIPEMD-160, and RIPEMD-320 with JERIM-320, a 320-bit hash function. The compression functions of hash functions like SHA-1 and SHA-256 are designed using serial successive iteration, whereas those of RIPEMD-160 and RIPEMD-320 use two parallel lines of message processing. JERIM-320 uses four parallel lines of message processing, resulting in a higher level of security than the other hash functions at comparable speed and memory requirements. The performance of these methods was evaluated through practical implementation and through step-computation methods. JERIM-320 proves to be secure and ensures message integrity to a higher degree. The focus of this work is to establish JERIM-320 as an alternative to present-day hash functions for fast-growing internet applications.
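The kind of practical speed/digest-size comparison the abstract mentions can be reproduced for the serially iterated SHA family using Python's standard library. JERIM-320 and the RIPEMD variants are not available in `hashlib` on all builds, so this sketch covers only SHA-1 and SHA-256.

```python
import hashlib
import timeit

# Compare digest size and hashing speed on a 1 MiB test message.
msg = b"x" * (1 << 20)

for name in ("sha1", "sha256"):
    digest_bits = len(hashlib.new(name, msg).hexdigest()) * 4
    t = timeit.timeit(lambda: hashlib.new(name, msg).digest(), number=20)
    print(f"{name}: {digest_bits}-bit digest, {t:.3f}s for 20 hashes")
```

The same harness extends to any additional compression function once an implementation is registered, which is how a JERIM-320 candidate would be benchmarked against the incumbents.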
Abstract: The use of buffer thresholds, blocking, and adequate service strategies are well-known techniques for congestion control in computer networks. This motivates the study of series queues with blocking, feedback (service under a head-of-line (HoL) priority discipline), and finite-capacity buffers with thresholds. In this paper, the external traffic is modelled using a Poisson process and the service times using the exponential distribution. We consider a three-station network with two finite buffers, for which a set of thresholds (tm1 and tm2) is defined. The network behaves as follows. A task that finishes its service at station B is sent back to station A for re-processing with probability o. When the number of tasks in the second buffer exceeds the threshold tm2 and the number of tasks in the first buffer is less than tm1, the fed-back task is served under the HoL priority discipline. In the opposite case, a “no two priority services in succession” procedure (preventing a possible overflow in the first buffer) is applied to fed-back tasks. Using an open Markovian queuing scheme with blocking, priority feedback service, and thresholds, a closed-form, cost-effective analytical solution is obtained. The model of servers linked in series is very accurate: it is derived directly from a two-dimensional state graph and a set of steady-state equations, followed by calculation of the main measures of effectiveness. Consequently, efficient expressions of low computational cost are determined. Based on numerical experiments and the collected results, we conclude that the proposed model with blocking, feedback, and thresholds can provide accurate performance estimates of series-linked networks.
Abstract: A therapeutic success is the aim of any therapeutic intervention, but therapeutic failure is the other side of the same coin. The purpose of this study is to present the activity of a personal development group of 14 participants (psychologists, doctors, and a priest) registered for a two-day course of integrative psychotherapy. The objectives of the study centre on: the therapist/trainer's management of the group's breaking moment; the analysis of the personal situations of the trainer and of some group participants; and a brief presentation of the main working methods applied to participants in repairing the therapeutic relationship and managing countertransference. The therapist's orientation is integrative, and the approach taken includes transactional analysis (T.A.) techniques, role play, Gestalt, and family systemic psychotherapy. The conclusions obtained represent landmarks for future activity within the group and strengthen the therapeutic relationship with it.
Abstract: This paper presents a heuristic approach to solve the Generalized Assignment Problem (GAP), which is NP-hard. Many researchers have developed algorithms for identifying redundant constraints and variables in linear programming models; some of these algorithms use the intercept matrix of the constraints to identify redundant constraints and variables before the solution process starts. Here, a new heuristic based on the dominance property of the intercept matrix is proposed to find optimal or near-optimal solutions of the GAP. In this heuristic, redundant variables of the GAP are identified by applying the dominance property of the intercept matrix repeatedly. The heuristic is tested on 90 benchmark problems of sizes up to 4000 taken from the OR-Library, and the results are compared with optimum solutions. The computational complexity of solving the GAP with this approach is proved to be O(mn²). The performance of the heuristic is also compared with the best state-of-the-art heuristic algorithms with respect to the quality of the solutions. The encouraging results, especially for relatively large test problems, indicate that this heuristic can successfully be used to find good solutions for highly constrained NP-hard problems.
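To make the GAP structure concrete (each job assigned to exactly one agent, subject to agent capacities), here is a tiny generic greedy sketch on made-up data. It is illustrative only and is not the authors' intercept-matrix dominance heuristic.

```python
# Generalized Assignment Problem toy instance (data assumed for illustration).
cost = [[4, 1, 3],    # cost[i][j]: cost of agent i doing job j
        [2, 2, 2]]
need = [[2, 2, 2],    # need[i][j]: capacity agent i spends on job j
        [3, 3, 3]]
cap = [4, 6]          # agent capacities

jobs, agents = range(3), range(2)
load = [0, 0]
assign = {}

# Greedy: take jobs in order of their cheapest available cost, and give each
# to the cheapest agent that still has capacity for it.
for j in sorted(jobs, key=lambda j: min(cost[i][j] for i in agents)):
    best = min((i for i in agents if load[i] + need[i][j] <= cap[i]),
               key=lambda i: cost[i][j], default=None)
    if best is None:
        raise ValueError(f"no feasible agent for job {j}")
    assign[j] = best
    load[best] += need[best][j]

total = sum(cost[assign[j]][j] for j in jobs)
print(assign, total)  # {1: 0, 0: 1, 2: 1} 5
```

A dominance-based heuristic like the paper's would instead prune assignments (variables) shown to be redundant before or during such a construction, shrinking the search without losing good solutions.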
Abstract: Eigenvector methods are gaining increasing acceptance in the area of spectrum estimation. This paper presents a successful attempt at testing and evaluating the performance of two of the most popular subspace techniques in determining the parameters of multiexponential signals with real decay constants buried in noise. In particular, the MUSIC (Multiple Signal Classification) and minimum-norm techniques are examined. It is shown that these methods perform almost equally well on multiexponential signals, with MUSIC displaying better-defined peaks.
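The MUSIC idea for this problem can be sketched in a few lines: build a data matrix from sliding windows of the signal, split the covariance eigenvectors into signal and noise subspaces, and scan candidate decay constants for those whose "steering" vector is orthogonal to the noise subspace. This is a noise-free toy case with assumed decay constants (0.10 and 0.30), not the paper's evaluation.

```python
import numpy as np

# Toy multiexponential signal: two real decay constants, 0.10 and 0.30.
n = np.arange(60)
x = np.exp(-0.10 * n) + 0.5 * np.exp(-0.30 * n)

m = 12                                     # subspace window length (assumed)
# Forward data matrix: each column is a length-m window of the signal.
X = np.column_stack([x[k:k + m] for k in range(len(x) - m + 1)])
R = X @ X.T                                # sample covariance (unnormalized)
w, V = np.linalg.eigh(R)                   # eigenvalues in ascending order
En = V[:, :-2]                             # noise subspace (signal rank = 2)

def pseudo(lam):
    """MUSIC pseudospectrum for a candidate real decay constant lam."""
    a = np.exp(-lam * np.arange(m))        # "steering" vector for decay lam
    a /= np.linalg.norm(a)
    return 1.0 / (np.linalg.norm(En.T @ a) ** 2 + 1e-30)

lams = np.arange(0.02, 0.50, 0.02)         # candidate decay-constant grid
spectrum = np.array([pseudo(l) for l in lams])
best = lams[np.argmax(spectrum)]
print(round(best, 2))                      # one of the true decays: 0.1 or 0.3
```

With noise added, the pseudospectrum peaks broaden, and the relative sharpness of MUSIC versus minimum-norm peaks is exactly the comparison the abstract reports.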
Abstract: Manufacturing, production, and service industries within Libya have struggled with many problems during the past two decades. These problems have had a negative impact on the productivity and utilization of many industries around the country. This paper studies the implementation levels of the manufacturing control system known as Manufacturing Resource Planning (MRPII) as adopted within some Libyan industries. A survey methodology was applied, and the analysis indicates that these industries have only a modest strategy towards most of the areas considered crucial to implementing these systems successfully. The findings also show variation in implementation levels with respect to the key elements of MRPII, with the highest levels in the emphasis on financial data accuracy. The paper also identifies limitations within the investigated manufacturing and managerial areas and points to where senior managers should take immediate action in order to achieve effective implementation of MRPII within their business areas.
Abstract: A novel calibration approach is presented that aims to reduce ASM2d parameter subsets and decrease model complexity. The approach does not require high computational demand and reduces the number of modeling parameters required for ASM calibration by employing a sensitivity and iteration methodology. Parameter sensitivity is a crucial factor, and the iteration methodology enables refinement of the simulated parameter values. During the iteration process, parameter values are determined in descending order of their sensitivities, and the number of iterations required equals the number of model parameters in the parameter-significance ranking. The approach was applied to the ASM2d model to evaluate EBPR phosphorus removal, and it was successful. The simulation yielded the calibration parameters YPAO, YPO4, YPHA, qPHA, qPP, μPAO, bPAO, bPP, bPHA, KPS, YA, μAUT, bAUT, KO2,AUT, and KNH4,AUT, which corresponded to the available experimental data.
Abstract: Meshing is the process of discretizing the problem domain into many subdomains before numerical calculation can be performed. Among the many types of meshes, the tetrahedral mesh is one of the most popular, due to its flexibility in fitting almost any domain shape. In both 2D and 3D domains, triangular and tetrahedral meshes can be generated using Delaunay triangulation. Mesh quality is an important factor in any Computational Fluid Dynamics (CFD) simulation, as the results are highly affected by it, and many efforts have been made to improve it. This paper describes a mesh-generation routine that has been developed to produce high-quality tetrahedral cells in arbitrarily complex geometry. A few CFD test cases are used to test the mesh generator, and the resulting meshes are compared with those generated by a commercial software package. The results show that no slivers exist in the generated meshes and that the overall quality is acceptable, since the percentage of bad tetrahedra is relatively small. Boundary recovery was also completed successfully, with all missing faces rebuilt.
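Sliver detection of the kind reported above needs a per-element quality measure. A common one (assumed here; the paper does not state its exact metric) is Q = 6√2·V / l_rms³, which is 1 for a regular tetrahedron and approaches 0 for a flat sliver.

```python
import numpy as np

def tet_quality(p0, p1, p2, p3):
    """Shape quality of a tetrahedron: 1 = regular, near 0 = sliver."""
    p0, p1, p2, p3 = map(np.asarray, (p0, p1, p2, p3))
    # Volume from the scalar triple product.
    vol = abs(np.dot(p1 - p0, np.cross(p2 - p0, p3 - p0))) / 6.0
    # RMS edge length over all six edges.
    edges = [p1 - p0, p2 - p0, p3 - p0, p2 - p1, p3 - p1, p3 - p2]
    l_rms = np.sqrt(np.mean([np.dot(e, e) for e in edges]))
    return 6.0 * np.sqrt(2.0) * vol / l_rms ** 3

# Regular tetrahedron -> quality 1; nearly flat "sliver" -> quality near 0.
reg = tet_quality((1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1))
sliver = tet_quality((0, 0, 0), (1, 0, 0), (0, 1, 0), (0.5, 0.5, 0.001))
print(round(reg, 3), round(sliver, 3))  # 1.0 0.002
```

A mesh generator can sweep this metric over all cells and report the fraction below some threshold, which is how a claim like "the percentage of bad tetrahedra is relatively small" is quantified.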
Abstract: The nature of wireless ad hoc and sensor networks makes them very attractive to attackers. One of the most popular and serious attacks in wireless ad hoc networks is the wormhole attack, and most protocols proposed to defend against it rely on positioning devices, synchronized clocks, or directional antennas. This paper analyzes the nature of the wormhole attack and existing defence mechanisms, and then proposes a wormhole detection mechanism based on round-trip time (RTT) and neighbor counts. The mechanism considers the RTT between two successive nodes and those nodes' neighbor counts, which are compared with the corresponding values of other successive nodes. Wormhole attacks are identified on two grounds. The first is that the transmission time between two nodes affected by a wormhole attack is considerably higher than that between two normal neighbor nodes. The second is that, by introducing new links into the network, the adversary increases the number of neighbors of the nodes within its radius. The system does not require any special hardware, has good performance and little overhead, and does not consume extra energy. The proposed system is designed on top of the ad hoc on-demand distance vector (AODV) routing protocol, and analysis and simulations are performed in the ns-2 network simulator.
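The two checks described above, anomalously high link RTT and inflated neighbor counts, can be sketched as simple comparisons against network-wide medians. The thresholds and data here are assumptions for illustration, not values from the paper.

```python
import statistics

def wormhole_suspects(link_rtts, neighbor_counts,
                      rtt_factor=3.0, nbr_factor=2.0):
    """Flag suspect links/nodes by the paper's two criteria (thresholds assumed).

    link_rtts: {(u, v): measured RTT between successive nodes u and v}
    neighbor_counts: {node: number of observed neighbors}
    """
    med_rtt = statistics.median(link_rtts.values())
    med_nbr = statistics.median(neighbor_counts.values())
    # Check 1: RTT far above the typical one-hop RTT -> possible wormhole link.
    bad_links = {l for l, rtt in link_rtts.items() if rtt > rtt_factor * med_rtt}
    # Check 2: neighbor count inflated by adversary-introduced links.
    bad_nodes = {n for n, c in neighbor_counts.items() if c > nbr_factor * med_nbr}
    return bad_links, bad_nodes

links = {("a", "b"): 2.1, ("b", "c"): 1.9, ("c", "d"): 2.0, ("d", "e"): 9.5}
nbrs = {"a": 3, "b": 4, "c": 3, "d": 9, "e": 3}
print(wormhole_suspects(links, nbrs))  # ({('d', 'e')}, {'d'})
```

In the AODV setting, such checks would run on RTT and HELLO-message statistics gathered during route discovery, flagging links for exclusion from routes.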
Abstract: A unique combination of adsorption and electrochemical regeneration with a proprietary adsorbent material called Nyex 100 was introduced at the University of Manchester for wastewater treatment applications. Nyex 100 is based on a graphite intercalation compound and is a non-porous, electrically conducting adsorbent material. It exhibits a very small BET surface area (2.75 m2 g-1); consequently, small adsorptive capacities were obtained for the adsorption of various organic pollutants. This work aims to develop a composite adsorbent material capable of electrochemical regeneration coupled with improved adsorption characteristics. An organic dye, Acid Violet 17, was used as the standard organic pollutant. The developed composite material was successfully regenerated electrochemically using a DC current of 1 A for 60 minutes, and regeneration efficiency was maintained at around 100% over five adsorption-regeneration cycles.
Abstract: This study focuses on teamwork in Finnish working
life. Through a wide cross-section of teams the study examines the
causes to which team members attribute the outcomes of their teams.
Qualitative data was collected from 314 respondents. They wrote 616
stories to describe memorable experiences of success and failure in
teamwork. The stories revealed 1930 explanations. The findings
indicate that both favorable and unfavorable team outcomes are
perceived as being caused by the characteristics of team members,
relationships between members, team communication, team
structure, team goals, team leadership, and external forces. These types of causes represent different attribution levels in the context of organizational teamwork.