Abstract: One of the main concerns in the Information Technology field is the adoption of new technologies in organizations, which may increase the pace of usage of these technologies. This study looks at the role of culture in the acceptance and use of new technologies in organizations, examining the effect of culture on the acceptance of and intention to use new technology. Studies show that culture is one of the most important barriers to adopting new technologies. The model used for accepting and using new technology is the Technology Acceptance Model (TAM), while for culture and its dimensions the well-known theory by Hofstede was used. The results of the study show a significant effect of culture on the intention to use new technologies. All four dimensions of culture were tested to find the strength of their relationship with the behavioral intention to use new technologies. The findings indicate the important role of culture in the level of intention to use new technologies and the different role each dimension plays in improving the adaptation process. The study suggests that technology transfer efforts are most likely to be successful if the parties are culturally aligned.
Abstract: Subjective loneliness describes people who feel a
disagreeable or unacceptable lack of meaningful social relationships,
both at the quantitative and qualitative level. The studies to be
presented tested an Italian 18-item self-report loneliness measure,
that included items adapted from scales previously developed,
namely a short version of the UCLA (Russell, Peplau and Cutrona,
1980), and the 11-item Loneliness scale by De Jong-Gierveld &
Kamphuis (JGLS; 1985). The studies aimed at testing the developed
scale and at verifying whether loneliness is better conceptualized as a
unidimensional (so-called 'general loneliness') or a bidimensional
construct, namely comprising the distinct facets of social and
emotional loneliness. The loneliness questionnaire included two single-item criterion measures of sad mood and social contact, and asked
participants to supply information on a number of socio-demographic
variables. Factorial analyses of responses obtained in two
preliminary studies, with 59 and 143 Italian participants respectively,
showed good factor loadings and subscale reliability and confirmed
that perceived loneliness clearly has two components, a social and an
emotional one, the latter measured by two subscales, a 7-item
'general' loneliness subscale derived from UCLA, and a 6-item
'emotional' scale included in the JGLS. Results further showed that
type and amount of loneliness are related, negatively, to frequency of
social contacts, and, positively, to sad mood. In a third study data
were obtained from a nation-wide sample of 9,097 Italian subjects, aged 12 to about 70, who completed the test online, on the Italian
web site of a large-audience magazine, Focus. The results again
confirmed the reliability of the component subscales, namely social,
emotional, and 'general' loneliness, and showed that they were
highly correlated with each other, especially the latter two.
Loneliness scores were significantly predicted by sex, age, education
level, sad mood and social contact, and, less so, by other variables –
e.g., geographical area and profession. The scale validity was
confirmed by the results of a fourth study, with elderly men and
women (N = 105) living at home or in residential care units. The three
subscales were significantly related, among others, to depression, and
to various measures of the extension of, and satisfaction with, social
contacts with relatives and friends. Finally, a fifth study with 315
career-starters showed that social and emotional loneliness correlate
with life satisfaction, and with measures of emotional intelligence.
Altogether the results showed a good validity and reliability in the
tested samples of the entire scale, and of its components.
Abstract: Arguments on a popular microblogging site were analysed by means of a methodological approach to business rhetoric focusing on the logos communication technique. The focus of the analysis was the 100-day countdown to the 2011 Rugby World Cup as advanced by the organisers. Big sporting events provide an attractive medium for sport event marketers in that they have become important strategic communication tools directed at sport consumers. Sport event marketing is understood here as using a microblogging site as a communication tool whose purpose is to disseminate a company's marketing messages by involving the target audience in experiential activities. Sport creates a universal language in that it excites and increases the spread of information by word of mouth and other means. The findings highlight the limitations of a microblogging site for marketing messages, which can assist in developing better practices. This study can also serve as a heuristic tool for other researchers analysing sports marketing messages in social network environments.
Abstract: The detection of arrhythmias and life-threatening conditions is highly essential in today's world, and this analysis can be accomplished by advanced non-linear processing methods for accurate analysis of the complex signals of heartbeat dynamics. In this perspective, recent developments in the field of multiscale information content have led to the Microcanonical Multiscale Formalism (MMF). We show that this framework provides several signal analysis techniques that are especially adapted to the study of heartbeat dynamics. In this paper, we present first results on whether the considered heartbeat dynamics signals have multiscale properties, by computing Local Predictability Exponents (LPEs) and the Unpredictable Points Manifold (UPM), and thereby computing the singularity spectrum.
Abstract: Compliance requires effective communication within an enterprise as well as with a company's external environment. This requirement begins with the implementation of compliance within large-scale compliance projects and persists in the compliance reporting within
standard operations. On the one hand the understanding of
compliance necessities within the organization is promoted.
On the other hand reduction of asymmetric information with
compliance stakeholders is achieved. To reach this goal, central reporting must provide a consolidated view of the statuses of the different compliance efforts. A concept that could be adapted for this purpose is the balanced scorecard by Kaplan and Norton. This concept has not yet been analyzed in detail concerning its adequacy for holistic compliance reporting, from compliance projects through to regular compliance operations.
At first, this paper evaluates if a holistic compliance
reporting can be designed by using the balanced scorecard
concept. The current status of compliance reporting clearly
shows that scorecards are generally accepted as a compliance
reporting tool and are already used for corporate governance
reporting. Additional specialized compliance IT solutions exist in the market. After the scorecard's adequacy is thoroughly examined and proven, an example strategy map is defined as the basis for deriving a compliance balanced scorecard. This definition answers the question of how to proceed in designing a compliance reporting tool.
Abstract: High-voltage power transmission lines are the backbone of electrical power utilities. The stability and continuous monitoring of this critical infrastructure is pivotal. Nine-Sigma, representing Eskom Holdings SOC Limited, South Africa, has a major problem with the proactive detection of fallen power lines and the real-time measurement of sagging and slipping of such conductors. The main objective of this research is to apply RFID technology to solve this challenge. Various options and technologies, such as GPS, PLC, image processing, and MR sensors, have been reviewed and their drawbacks identified. The potential of RFID to give precision measurements will be examined and presented. Future research will look at magnetic and electrical interference as well as the corona effect on the technology.
Abstract: Users of computer systems may often require the
private transfer of messages/communications between parties across
a network. Information warfare and the protection and dominance of
information in the military context is a prime example of an
application area in which the confidentiality of data needs to be
maintained. The safe transportation of critical data is therefore often
a vital requirement for many private communications. However,
unwanted interception/sniffing of communications is also a
possibility. An elementary stealthy transfer scheme is therefore
proposed by the authors. This scheme makes use of encoding,
splitting of a message and the use of a hashing algorithm to verify the
correctness of the reconstructed message. For this proof-of-concept, the authors have experimented with randomly sending encoded parts of a message and reconstructing them, to demonstrate how data can stealthily be transferred across a network so as to prevent the obvious retrieval of the data.
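The split-and-verify idea above can be sketched as follows; the chunking scheme, the part indexing, and the choice of SHA-256 are assumptions for illustration, not the authors' exact encoding.

```python
import hashlib
import random

def split_message(message: bytes, n_parts: int):
    """Split a message into indexed parts; the index lets the
    receiver reorder parts that arrive out of sequence."""
    size = -(-len(message) // n_parts)  # ceiling division
    parts = [(i, message[i * size:(i + 1) * size]) for i in range(n_parts)]
    digest = hashlib.sha256(message).hexdigest()
    return parts, digest

def reconstruct(parts, digest):
    """Reassemble parts by index and verify correctness via the hash."""
    message = b"".join(data for _, data in sorted(parts))
    if hashlib.sha256(message).hexdigest() != digest:
        raise ValueError("reconstructed message failed hash check")
    return message

parts, digest = split_message(b"attack at dawn", 4)
random.shuffle(parts)            # simulate random, out-of-order sending
print(reconstruct(parts, digest))  # → b'attack at dawn'
```

An eavesdropper who sniffs only some parts sees fragments without order, while the hash lets the legitimate receiver confirm a complete, correct reconstruction.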
Abstract: It is necessary to incorporate technological advances
achieved in the field of engineering into dentistry in order to enhance
the process of diagnosis, treatment planning and enable the doctors to
render better treatment to their patients. To achieve this ultimate goal
long-distance collaborations are often necessary. This paper discusses various collaborative tools and their applications to solve a few pressing problems confronted by dentists. Customization is often the solution to most of these problems, but rapid design, development, and cost-effective manufacturing are difficult tasks to achieve. This problem can be solved using the technique of digital manufacturing. Cases from six major branches of dentistry are discussed, and possible solutions using state-of-the-art rapid digital manufacturing technology are proposed in the present paper. The paper also outlines the usage of existing tools in the collaborative and digital manufacturing area.
Abstract: Empirical force fields and density functional theory (DFT) were used to study the binding energies and structures of methylamine on the surface of activated carbons (ACs). This is a first step in studying the adsorption of alkyl amines on the surface of functionalized ACs. The force fields used were the Dreiding (DFF), Universal (UFF), and COMPASS (CFF) models. The generalized gradient approximation with the Perdew-Wang 91 (PW91) functional was used for the DFT calculations. In addition to obtaining the amine-carboxylic acid adsorption energies, the results were used to establish the reliability of the empirical models for these systems. CFF predicted a binding energy of -9.227 kcal/mol, which agreed with PW91 at -13.17 kcal/mol, compared to DFF at 0 kcal/mol and UFF at -0.72 kcal/mol. However, the CFF binding energies for the amine to ester and ketone disagreed with the PW91 results. The structures obtained from all models agreed with the PW91 results.
Abstract: The deterministic quantum transfer-matrix (QTM)
technique and its mathematical background are presented. This
important tool in computational physics can be applied to a class of
the real physical low-dimensional magnetic systems described by the
Heisenberg Hamiltonian, which includes the macroscopic molecular-based spin chains, small-size magnetic clusters embedded in some
supramolecules and other interesting compounds. Using QTM, the
spin degrees of freedom are accurately taken into account, yielding
the thermodynamical functions at finite temperatures.
In order to test the application for susceptibility calculations in a parallel environment, the speed-up and efficiency of parallelization are analyzed on our platform, an SGI Origin 3800 with p = 128 processor units. Using the Message Passing Interface (MPI) system libraries, we find a code efficiency of 94% for p = 128, which makes our application highly scalable.
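The reported 94% figure follows from the standard definitions of parallel speed-up and efficiency; a minimal sketch, with hypothetical timings chosen only so that they reproduce that number:

```python
# Standard definitions: speed-up S(p) = T(1) / T(p),
# parallel efficiency E(p) = S(p) / p.
def speedup(t_serial: float, t_parallel: float) -> float:
    return t_serial / t_parallel

def efficiency(t_serial: float, t_parallel: float, p: int) -> float:
    return speedup(t_serial, t_parallel) / p

# Hypothetical timings (seconds) picked so that E = 0.94 at p = 128,
# matching the 94% efficiency reported for the QTM code.
t1 = 1000.0
t128 = t1 / (0.94 * 128)
print(round(efficiency(t1, t128, 128), 2))  # → 0.94
```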
Abstract: This paper presents a recognition system for isolated words such as robot commands, carried out by Time Delay Neural Networks (TDNN), to teleoperate a robot for specific tasks such as "turn" and "close" in an industrial environment, taking into account the noise coming from the machines. The choice of TDNN is based on its generalization accuracy; moreover, it acts as a filter that allows the passage of certain desirable frequency characteristics of speech. The goal is to determine the parameters of this filter so as to make the system adaptable to the variability of the speech signal and especially to noise; for this, the backpropagation technique was used in the learning phase. The approach was applied to commands pronounced separately in two languages, French and Arabic. The recognition rates for two test sets of 300 spoken words each are 87% and 97.6% in a noise-free environment, and 77.67% and 92.67% when white Gaussian noise was added at an SNR of 35 dB.
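As a rough illustration of the filter-like behaviour mentioned above, a single time-delay layer applies the same weights to a sliding window of consecutive input frames. This is a minimal sketch with hypothetical dimensions, not the authors' architecture:

```python
import numpy as np

def tdnn_layer(frames: np.ndarray, weights: np.ndarray, bias: np.ndarray):
    """One time-delay layer: each output frame is computed from a
    window of `delay` consecutive input frames, with the weights
    shared across time (hence the filtering behaviour).
    frames: (T, d_in); weights: (delay, d_in, d_out); bias: (d_out,)."""
    delay, d_in, d_out = weights.shape
    T_out = frames.shape[0] - delay + 1
    out = np.empty((T_out, d_out))
    for t in range(T_out):
        window = frames[t:t + delay]                       # (delay, d_in)
        out[t] = np.tensordot(window, weights, axes=([0, 1], [0, 1])) + bias
    return np.tanh(out)                                    # nonlinearity

rng = np.random.default_rng(0)
x = rng.standard_normal((50, 13))          # 50 frames of 13 MFCC-like features
w = rng.standard_normal((3, 13, 8)) * 0.1  # 3-frame context -> 8 hidden units
h = tdnn_layer(x, w, np.zeros(8))
print(h.shape)  # → (48, 8)
```

Training such a layer with backpropagation, as the abstract describes, amounts to learning the shared window weights.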
Abstract: Mature landfill leachates contain macromolecular organic substances that are resistant to biological degradation. Recently, Fenton's oxidation has been investigated for the chemical treatment or pre-treatment of mature landfill leachates. The aim of this study was to reduce the recalcitrant organic load still remaining after the complete treatment of a mature landfill leachate by means of a Fenton's oxidation post-treatment. The effect of various parameters that play an important role in the oxidation, such as the H2O2 to Fe2+ molar ratio, the dosage of Fe2+ reagent, the initial pH, the reaction time, and the initial chemical oxygen demand (COD) strength, was analysed. A molar ratio of H2O2/Fe2+ = 3, a Fe2+ dosage of 4 mmol·L-1, pH 3, and a reaction time of 40 min were found to achieve the best oxidation performance. Under these favorable conditions, the COD removal efficiency was 60.9% and 31.1% for initial CODs of 93 and 743 mg·L-1, respectively (diluted and non-diluted leachate). Fenton's oxidation also gave good results for color removal. Although this leachate is extremely difficult to treat, the above results seem rather encouraging for the application of Fenton's oxidation.
Abstract: This research analyzes factors affecting the success of Litecoin value within Thailand and develops a guideline for self-reliance for effective business implementation. The sample in this study included 119 people surveyed. The results revealed four main findings concerning success: 1) Future career training should be pursued in applied Litecoin development. 2) Many respondents did not grasp the concept of a digital currency or see the benefit of one. 3) There is a great need to educate the next generation of learners on the benefits of Litecoin within the community. 4) A great majority did not know what Litecoin was.
The guideline for self-reliance planning consisted of four aspects: 1) Development planning: arranging meet-up groups to conduct further education on Litecoin and share solutions on adopting it into everyday usage; local communities need to develop awareness of the usefulness of Litecoin and share its value among friends and family. 2) Computer Science and Business Management staff should develop skills to expand on the benefits of Litecoin within their departments. 3) Further research should be pursued on how Litecoin value can improve business and tourism within Thailand. 4) Local communities should focus on developing Litecoin awareness by encouraging street vendors to accept Litecoin as another form of payment for services rendered.
Abstract: Information about node locations is an important requirement for many applications in Wireless Sensor Networks. In hop-based range-free localization methods, anchors transmit localization messages containing a hop-count value to the whole network. Each node receives these messages, calculates its distance to each anchor in hops, and then approximates its own position. However, the estimated distances can introduce large errors and affect the localization precision. To solve this problem, this paper proposes an algorithm in which each unknown node fixes the nearest anchor as a reference and selects the two other most accurate anchors to estimate its location. Compared to the DV-Hop algorithm, experimental results illustrate that the proposed algorithm has a lower average localization error and is more effective.
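The hop-to-distance conversion and position estimation common to DV-Hop-style methods can be sketched as follows; the anchor positions, hop counts, and average hop size are hypothetical illustration values, and the linearised least-squares solve is one standard choice, not necessarily the authors' exact procedure.

```python
import numpy as np

def dv_hop_estimate(anchors, hop_counts, avg_hop_size):
    """Range-free position estimate in the DV-Hop style: each hop
    count is converted to a distance via the average hop size, then
    the position is solved by linearised least squares.
    anchors: (n, 2) positions; hop_counts: (n,) hops to each anchor."""
    d = hop_counts * avg_hop_size            # estimated distances
    # Subtract the last anchor's circle equation to linearise:
    # 2 (a_n - a_i) . x = d_i^2 - d_n^2 - ||a_i||^2 + ||a_n||^2
    A = 2 * (anchors[-1] - anchors[:-1])
    b = (d[:-1] ** 2 - d[-1] ** 2
         - np.sum(anchors[:-1] ** 2, axis=1) + np.sum(anchors[-1] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
hops = np.array([2, 3, 4])                   # hypothetical hop counts
print(dv_hop_estimate(anchors, hops, avg_hop_size=25.0))  # → [34.375 12.5]
```

With these coarse hop-derived distances the estimate lands at (34.375, 12.5) even though the hypothetical true position is near (40, 30), illustrating the kind of error that the proposed anchor-selection step aims to reduce.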
Abstract: A study was conducted to formally characterize
notebook computer performance under various environmental and
usage conditions. Software was developed to collect data from the
operating system of the computer. An experiment was conducted to
evaluate the performance parameters' variations, trends, and correlations, as well as the extreme values they can attain under various usage and environmental conditions. An automated software script
was written to simulate user activity. The variability of each
performance parameter was addressed by establishing the empirical
relationship between performance parameters. These equations were
presented as baseline estimates for performance parameters, which
can be used to detect system deviations from normal operation and
for prognostic assessment. The effect of environmental factors,
including different power sources, ambient temperatures, humidity,
and usage, on performance parameters of notebooks was studied.
Abstract: A Learning Management System (LMS) presents a learning environment offering a collection of e-learning tools in a package that allows a common interface and information sharing among the tools. South East European University's initial experience with LMSs was with the commercial LMS ANGEL. After three years of using ANGEL, it was decided, because of the very high expenses, to develop our own software. As part
of the research project team for the in-house design and development
of the new LMS, we primarily had to select the features that would
cover our needs and also comply with the actual trends in the area of
software development, and then design and develop the system. In
this paper we present the process of LMS in-house development for
South East European University, its architecture, conception and
strengths with a special accent on the process of migration and
integration with other enterprise applications.
Abstract: In this paper, we provide complete end-to-end delay analyses, including the relay nodes, for instant messages. The Message Session Relay Protocol (MSRP) is used to provide congestion control for large messages in the Instant Messaging (IM) service. Large messages are broken into several chunks. These chunks may traverse at most two relay nodes before reaching the destination, according to the IETF specification of the MSRP relay extensions. We discuss the current solutions for sending large instant messages and introduce a proposal to reduce message flows in the IM service. We consider a virtual traffic parameter, i.e., the relay nodes are stateless and non-blocking for scalability purposes. This type of relay node is also assumed to have a constant-bit-rate input. We provide a new scheduling policy that schedules chunks according to the delivery time stamp tags of their previous node. Validation and analysis are shown for this scheduling policy. The performance analysis with the model introduced in this paper is simple and straightforward, which leads to reduced message flows in the IM service.
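The proposed policy, scheduling chunks by the delivery time stamp tagged by the previous node, can be sketched with a priority queue; the field names and the tie-breaking by arrival index are assumptions for illustration, not the MSRP wire format.

```python
import heapq

def schedule_chunks(chunks):
    """Forward chunks in order of the delivery time stamp that the
    previous node tagged on each chunk (earliest first). The arrival
    index breaks ties so the heap never compares dicts."""
    heap = [(c["prev_delivery_ts"], i, c) for i, c in enumerate(chunks)]
    heapq.heapify(heap)
    while heap:
        _, _, chunk = heapq.heappop(heap)
        yield chunk

# Chunks as they arrived at the relay, out of time-stamp order:
arrived = [
    {"id": "c2", "prev_delivery_ts": 12.5},
    {"id": "c1", "prev_delivery_ts": 10.1},
    {"id": "c3", "prev_delivery_ts": 15.0},
]
print([c["id"] for c in schedule_chunks(arrived)])  # → ['c1', 'c2', 'c3']
```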
Abstract: Market based models are frequently used in the resource
allocation on the computational grid. However, as the size of
the grid grows, it becomes difficult for the customer to negotiate
directly with all the providers. Middle agents are introduced to
mediate between the providers and customers and facilitate the
resource allocation process. The most frequently deployed middle
agents are the matchmakers and the brokers. The matchmaking agent
finds possible candidate providers who can satisfy the requirements
of the consumers, after which the customer directly negotiates with
the candidates. The broker agents mediate the negotiation with the providers in real time.
In this paper we present a new type of middle agent, the marketmaker. Its operation is based on two parallel operations: through
the investment process the marketmaker is acquiring resources and
resource reservations in large quantities, while through the resale process
it sells them to the customers. The operation of the marketmaker
is based on the fact that through its global view of the grid it can
perform a more efficient resource allocation than the one possible in
one-to-one negotiations between the customers and providers.
We present the operation and algorithms governing the operation
of the marketmaker agent, contrasting it with the matchmaker and
broker agents. Through a series of simulations in the task-oriented domain we compare the operation of the three agent types. We find that the use of the marketmaker agent leads to better performance in the allocation of large tasks and a significant reduction of the messaging overhead.
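A toy sketch of the marketmaker's two parallel operations, bulk investment and resale; the greedy purchasing rule, the prices, and the markup are purely hypothetical illustration choices, not the paper's algorithms.

```python
class Marketmaker:
    def __init__(self):
        self.inventory = []            # acquired reservations (unit costs)

    def invest(self, provider_offers, budget):
        """Investment process: buy the cheapest resource reservations
        in bulk while the budget lasts."""
        for cost in sorted(provider_offers):
            if cost > budget:
                break
            self.inventory.append(cost)
            budget -= cost

    def resell(self, demand, markup=1.1):
        """Resale process: sell up to `demand` reservations to
        customers at a marked-up price; returns (units sold, revenue)."""
        sold, revenue = 0, 0.0
        while self.inventory and sold < demand:
            revenue += self.inventory.pop(0) * markup
            sold += 1
        return sold, round(revenue, 2)

mm = Marketmaker()
mm.invest([5.0, 3.0, 8.0, 4.0], budget=12.0)  # buys the 3.0, 4.0, 5.0 offers
print(mm.resell(demand=2))                     # → (2, 7.7)
```

The global view described above corresponds to the marketmaker seeing all provider offers at once, which is what lets it buy more cheaply than a customer negotiating one-to-one.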
Abstract: Characterized by rich mineral content, low temperature, few bacteria, and stability, with numerous applications in aquaculture, food, drinking water, and leisure, deep sea water (DSW) development has become a new industry in the world. It has been reported that marine algae contain various biologically active compounds. This research focused on the effects of cultivating Sargassum cristaefolium with different concentrations of deep sea water (DSW) and surface sea water (SSW). After two and four weeks, the total phenolic contents of Sargassum cristaefolium cultured under the different conditions were compared, and their reductive activity was also tested with potassium ferricyanide. The fresh seaweeds were oven-dried and ground to powder. The cultured marine algae were then extracted with water at 90 °C for 1 h. The total phenolic contents were determined using the Folin-Ciocalteu method. The results are as follows: the highest total phenolic content and the best reductive ability were observed for the 1/4 proportion of DSW to SSW after two weeks of culture. Furthermore, the 1/2 proportion of DSW to SSW also showed good reductive ability and plentiful phenolic composition. Finally, we confirmed that the proportion of DSW to SSW is the major factor relating to both the total phenolic components and the reductive ability of Sargassum cristaefolium. In the future, we will use this approach for the mass production of this marine alga or other micro-algae in industrial applications.
Abstract: Malaysia is aggressive in promoting the usage of ICT to its mass population through government policies and programs targeting the general population. However, with the uneven distribution of basic telecommunication infrastructure between urban and rural areas, the cost of being "interconnected" that is considered high among the poorer rural population, and the lack of local content that suits rural needs or lifestyles, it is still a challenge for Malaysia to achieve its Vision 2020 agenda of moving the nation towards an information society by the year 2020. Among the existing programs carried out by the government to encourage the usage of ICT by the rural population is "Kedaikom", a community telecenter whose general aim is to engage the community to become exposed to and use ICT, encouraging the diffusion of ICT to the rural population. The research investigated, by means of a questionnaire survey, how Kedaikom, as a community telecenter, could play a role in encouraging the rural or underserved community to use ICT. The results from the survey show that the community telecenter could bridge the digital divide between the underserved rural population and the well-connected urban population in Malaysia. More of the rural population, especially the younger generation and those with a higher educational background, are using the community telecenter to connect to ICT.