Abstract: User-Centered Design (UCD), Usability Engineering (UE) and Participatory Design (PD) are the common Human-Computer Interaction (HCI) approaches practiced in the software development process, focusing on issues and matters concerning user involvement. These approaches, however, overlook the organizational perspective of HCI integration within the software development organization. The Management Information Systems (MIS) perspective of HCI takes a managerial and organizational context to view the effectiveness of integrating HCI in the software development process. Human-Centered Design (HCD), which encompasses all of the human aspects, including the aesthetic and the ergonomic, is claimed to provide a better approach to strengthening the HCI approaches and, in turn, the software development process. To determine the effectiveness of HCD in the software development process, this paper presents the findings of a content analysis of HCI approaches, viewing those approaches as a technology that integrates user requirements, ranging from top management to the other stakeholders in the software development process. The findings show that the HCD approach is a technology that emphasizes humans, tools and knowledge in strengthening the HCI approaches, and thereby the software development process, in the quest to produce sustainable, usable and useful software products.
Abstract: One of the most important areas of knowledge management studies is knowledge sharing. Measured in terms of the number of scientific articles and organizations' applications, knowledge sharing stands as an example of success in the field. This paper reviews the related papers in the context of the underlying individual behavioral variables to provide a framework of directions for future research and writing.
Abstract: Various models have been derived by studying large numbers of completed software projects from various organizations and applications to explore how project size maps into project effort. However, there is still a need to improve the prediction accuracy of these models. Since a Neuro-Fuzzy based system is able to approximate non-linear functions with greater precision, a Neuro-Fuzzy system is used as a soft computing approach to generate a model by formulating the relationship based on its training. In this paper, the Neuro-Fuzzy technique is used for software estimation modeling on NASA software project data, and the performance of the developed models is compared with the Halstead, Walston-Felix, Bailey-Basili and Doty models mentioned in the literature.
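For reference, the four baseline models are classical static single-variable estimators. The sketch below uses the coefficient values usually quoted in the estimation literature (effort in person-months, size in KLOC), not values taken from this paper, together with an MMRE-style accuracy criterion of the kind typically used in such comparisons:

```python
# Classical static effort models (effort in person-months, size in KLOC).
# Coefficients are the commonly cited literature values, an assumption here.
def halstead(kloc):       return 0.7 * kloc ** 1.50
def walston_felix(kloc):  return 5.2 * kloc ** 0.91
def bailey_basili(kloc):  return 5.5 + 0.73 * kloc ** 1.16
def doty(kloc):           return 5.288 * kloc ** 1.047  # variant for KLOC > 9

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error, a standard accuracy criterion."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)
```

A Neuro-Fuzzy model would be trained on (size, effort) pairs and then compared against these closed-form baselines on the same held-out projects via MMRE.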
Abstract: The number of intrusions and attacks against critical
infrastructures and other information networks is increasing rapidly.
While there is no identified evidence that terrorist organizations are
currently planning a coordinated attack against the vulnerabilities of
computer systems and networks connected to critical infrastructure,
the origins of the indiscriminate cyber attacks that infect computers
on networks remain largely unknown. The growing trend toward the
use of more automated and menacing attack tools has also
overwhelmed some of the current methodologies used for tracking
cyber attacks. There is an ample possibility that such cyber
attacks could be transformed into cyberterrorism pursued for illegal purposes.
Cyberterrorism is a matter of vital importance to national welfare.
Therefore, every country and organization has to take proper
measures to meet the situation and consider effective legislation on
cyberterrorism.
Abstract: The backpropagation algorithm in general employs a quadratic error function; in fact, most problems that involve minimization employ the quadratic error function. With alternative error functions, the performance of the optimization scheme can be improved. The new error functions help in suppressing the ill effects of outliers and have shown good robustness to noise. In this paper we evaluate and compare the relative performance of a complex-valued neural network using different error functions. In the first simulation, for the complex XOR gate, it is observed that some error functions, such as the absolute error and the Cauchy error function, can replace the quadratic error function. In the second simulation it is observed that for some error functions the performance of the complex-valued neural network depends on the architecture of the network, whereas with a few other error functions the convergence speed of the network is independent of the architecture of the neural network.
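The three error functions compared can be sketched for a complex error e = target - output, applied to the modulus |e| (the Cauchy scale parameter c = 1 below is an assumed value, not one stated in the abstract):

```python
import numpy as np

# Candidate error functions for a complex-valued network, evaluated on
# the modulus of the complex error e = target - output.
def quadratic(e):      return 0.5 * np.abs(e) ** 2
def absolute(e):       return np.abs(e)
def cauchy(e, c=1.0):  return (c ** 2 / 2) * np.log(1 + (np.abs(e) / c) ** 2)
```

For large |e| the Cauchy function grows only logarithmically, which is what suppresses the influence of outliers compared with the quadratic error.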
Abstract: Sending encrypted messages frequently draws the attention
of third parties, perhaps causing attempts to break and
reveal the original messages. Steganography is introduced to hide
the existence of the communication by concealing a secret message
in an appropriate carrier like text, image, audio or video. In quantum
steganography, the sender (Alice) embeds her steganographic
information into the cover and sends it to the receiver (Bob) over a
communication channel. Alice and Bob share an algorithm and hide
quantum information in the cover. An eavesdropper (Eve) without
access to the algorithm cannot find out the existence of the quantum
message. In this paper, a text quantum steganography technique based
on the use of the indefinite articles (a) or (an) in conjunction with the
non-specific or non-particular nouns of the English language and a quantum
gate truth table is proposed. The authors also introduce a
new code representation technique (SSCE - Secret Steganography
Code for Embedding) at both ends in order to achieve a high level of
security. Before the embedding operation, each character of the secret
message is converted to its SSCE value and then embedded into the cover
text. Finally, the stego text is formed and transmitted to the receiver side.
At the receiver side, the reverse operations are carried out
to recover the original information.
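As a toy illustration of the a/an channel alone (not the authors' full SSCE and quantum-gate scheme), one hidden bit per indefinite article could be realized as below; note that this sketch deliberately ignores grammatical agreement with the following noun, which the actual technique manages by pairing articles with non-specific nouns:

```python
import re

# Toy a/an steganographic channel: each occurrence of an indefinite
# article carries one hidden bit (0 -> "a", 1 -> "an").
def embed(cover, bits):
    it = iter(bits)
    def sub(m):
        b = next(it, None)                 # leftover articles stay unchanged
        return m.group(0) if b is None else ("an" if b else "a")
    return re.sub(r"\ban?\b", sub, cover)

def extract(stego):
    return [1 if m.group(0) == "an" else 0
            for m in re.finditer(r"\ban?\b", stego)]
```

An eavesdropper reading the stego text sees only slightly odd article usage, while a receiver who knows the convention recovers the bit stream exactly.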
Abstract: Sediment and mangrove root samples from Iko River
Estuary, Nigeria were analyzed for microbial and polycyclic
aromatic hydrocarbon (PAH) content. The total heterotrophic
bacterial (THB) count ranged from 1.1×10^7 to 5.1×10^7 cfu/g, the total
fungal (TF) count from 1.0×10^6 to 2.7×10^6 cfu/g and the total
coliform (TC) count from 2.0×10^4 to 8.0×10^4 cfu/g, while the
hydrocarbon utilizing bacterial (HUB) count ranged from 1.0×10^5 to
5.0×10^5 cfu/g. There was a positive correlation (r = 0.72 to
0.93) between the THB count and the total HUB count. The
organisms were Staphylococcus aureus, Bacillus cereus,
Flavobacterium breve, Pseudomonas aeruginosa, Erwinia
amylovora, Escherichia coli, Enterobacter sp, Desulfovibrio sp,
Acinetobacter lwoffii, Chromobacterium violaceum, Micrococcus
sedentarius, Corynebacterium sp, and Pseudomonas putrefaciens.
The PAHs were Naphthalene, 2-Methylnaphthalene, Acenaphthylene,
Acenaphthene, Fluorene, Phenanthrene, Anthracene, Fluoranthene,
Pyrene, Benzo(a)anthracene, Chrysene, Benzo(b)fluoranthene,
Benzo(k)fluoranthene, Benzo(a)pyrene, Dibenzo(a,h)anthracene,
Benzo(g,h,i)perylene and Indeno(1,2,3-cd)pyrene, with individual PAH
concentrations that ranged from 0.20 mg/kg to 1.02 mg/kg, 0.20 mg/kg
to 1.07 mg/kg and 0.20 mg/kg to 4.43 mg/kg in the benthic sediment,
epipellic sediment and mangrove roots, respectively. Total PAH
ranged from 6.30 to 9.93 mg/kg, 6.30 to 9.13 mg/kg and 9.66 to
16.68 mg/kg in the benthic sediment, epipellic sediment and
mangrove roots, respectively. The high concentrations in the
mangrove roots are indicative of bioaccumulation of the pollutant in
the plant tissue. The microorganisms are of ecological significance
and the detectable quantities of polycyclic aromatic hydrocarbon
could be partitioned and accumulated in tissues of infaunal and
epifaunal organisms in the study area.
Abstract: The purpose of this study was to elucidate the factors affecting the antimicrobial effectiveness of essential oils against food spoilage and pathogenic bacteria. The minimum inhibitory concentrations (MIC) of the essential oils were determined by a turbidimetric technique using a Bioscreen C analyzer. The effects of pH ranging from 7.3 to 5.5, in the absence and presence of essential oils and/or NaCl, on the lag time and mean generation time of the bacteria at 37°C were investigated. The results showed that the combination of low pH and essential oil at 37°C had additive effects against the test micro-organisms. The combination of 1.2% (w/v) NaCl and clove essential oil at 0.0325% (v/v) was effective against E. coli. The use of concentrations below the MIC in combination with low pH and/or NaCl has the potential of being used as an alternative to traditional food preservatives.
Abstract: In both developed and developing countries,
governments play a basic role in making policies, programs and
instruments which support the development of micro, small and
medium enterprises. One of the mechanisms employed to nurture
small firms for more than two decades is technology business
incubation. The main aim of this
research was to establish influencing factors in Technology Business
Incubator's effectiveness and their explanatory model. Therefore,
among 56 Technology Business Incubators in Iran, 32 active
incubators were selected and by stratified random sampling, 528
start-ups were chosen. The validity of the research questionnaires
was determined by expert consensus, item analysis and factor
analysis, and their reliability was calculated by Cronbach's alpha.
Data analysis was then carried out with the SPSS and LISREL software packages.
Both organizational procedures and entrepreneurial behaviors were
meaningful mediators. Organizational procedures (P < .01, β = 0.45)
were a stronger mediator of the improvement of Technology
Business Incubator effectiveness than entrepreneurial
behavior (P < .01, β = 0.36).
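The reliability measure used, Cronbach's alpha, follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the respondents' totals); a minimal sketch:

```python
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = questionnaire items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Values of alpha approaching 1 indicate that the items measure the same underlying construct consistently, which is what the questionnaire validation step checks.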
Abstract: Applying external loads to beams inevitably creates
bending. In I-beams, this bending puts one of
the flanges in tension and the other in compression. As the bending increases, the compression flange buckles and the beam twists
out of its plane; this twisting is known as lateral torsional buckling. When the bending moment varies along the
beam, the critical moment is greater than under pure bending; in other words, the value of the bending gradient coefficient is
always greater than unity. In this article, using the ANSYS 10.0 software, nearly 80 3-D finite element models were developed for the
purpose of analyzing lateral torsional buckling of beams and surveying the influence of slenderness on the bending gradient coefficient.
Results show that the Cb coefficient presented by AISC is not correct for some beams, where the actual value of this coefficient is smaller than that proposed by AISC. Therefore, instead of using a constant Cb for each
case of loading, a function with two criteria for the calculation of the Cb coefficient is proposed for some cases.
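For reference, the AISC moment-gradient (bending gradient) coefficient the study benchmarks against is commonly given in AISC 360 (Eq. F1-1) as Cb = 12.5·Mmax / (2.5·Mmax + 3·MA + 4·MB + 3·MC); a direct sketch:

```python
# AISC moment-gradient coefficient Cb (AISC 360 Eq. F1-1), the baseline
# the finite element results are compared against.
def cb_aisc(m_max, m_a, m_b, m_c):
    """m_max: largest |M| in the unbraced segment; m_a, m_b, m_c:
    |M| at the quarter, mid and three-quarter points of the segment."""
    m_max, m_a, m_b, m_c = map(abs, (m_max, m_a, m_b, m_c))
    return 12.5 * m_max / (2.5 * m_max + 3 * m_a + 4 * m_b + 3 * m_c)
```

Under pure (uniform) bending all four moments are equal and Cb = 1, the lower bound; the study's point is that for some loadings and slendernesses the true gradient coefficient falls below the value this formula predicts.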
Abstract: The relevance of the study of everyday life in Almaty
and Kyzylorda is associated with the emergence of modern
trends in historiography and with the social dimensions of government reform.
It also stems from the fact that in the early twentieth century
Kyzylorda and Almaty began to develop as cities, and this period has
a special place in the life of the state. An interesting aspect of the
everyday life of the inhabitants of the new city, which was built in the
era of Stalin's Five-Year Plans, can be examined through the eyes of
the Soviet people living in a specific environment, reflecting the life
of the citizens. In studying the industrialization of the Soviet Union,
attention is paid to the everyday aspects of the first Five-Year Plans,
since the modernization of the 1930s was
one of the decisive factors in the lives of residents. Among these
factors, we would like to highlight the medical field, which underlies
all human life, focusing specifically on the state of medicine
in Alma-Ata in the 1920s-30s, and to
analyze the different aspects of human life that determined the quality of
medical care provided to the population during this period.
Abstract: Data mining has been integrated into application systems to enhance the quality of the decision-making process. This study focuses on the integration of data mining technology and Knowledge Management Systems (KMS), owing to the ability of data mining technology to create useful knowledge from large volumes of data, while a KMS vitally supports the creation and use of knowledge. The integration of data mining technology and KMS is popularly used in business for enhancing and sustaining organizational performance. However, there is a lack of studies applying data mining technology and KMS in the education sector, particularly to students' academic performance, even though this could reflect the performance of an Institution of Higher Learning (IHL). Realizing its importance, this study seeks to integrate data mining technology and KMS to promote effective management of knowledge within IHLs. Several concepts from the literature are adapted to propose a new integrative data mining technology and KMS framework for an IHL.
Abstract: Simulation is a very powerful method for high-performance,
high-quality design in distributed systems, and is currently
perhaps the only feasible one, considering the heterogeneity, complexity and
cost of distributed systems. In Grid environments, for example, it is
hard and even impossible to perform scheduler performance
evaluation in a repeatable and controllable manner, as resources and
users are distributed across multiple organizations with their own
policies. In addition, Grid test-beds are limited, and creating an
adequately sized test-bed is expensive and time consuming.
Scalability, reliability and fault-tolerance become important
requirements for distributed systems in order to support distributed
computation. A distributed system with such characteristics is called
dependable. Large environments, like the Cloud, offer unique
advantages, such as low cost and dependability, and satisfy QoS for all
users. Resource management in large environments requires
performant scheduling algorithms guided by QoS constraints. This
paper presents the performance evaluation of scheduling heuristics
guided by different optimization criteria. The algorithms for
distributed scheduling are analyzed in order to satisfy user
constraints while, at the same time, considering the independent capabilities of
resources. This analysis acts as a profiling step for algorithm
calibration. The performance evaluation is based on simulation. The
simulator is MONARC, a powerful tool for large scale distributed
systems simulation. The novelty of this paper consists in a synthetic
analysis whose results offer guidelines for scheduler service
configuration and support empirically based decision making. The results
could be used in decisions regarding optimizations to the existing Grid
DAG scheduling and for selecting the proper algorithm for DAG
scheduling in various practical situations.
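The abstract does not name the heuristics evaluated; as one illustrative example of the kind of criterion-guided list heuristic typically profiled in such studies, here is a minimal min-min sketch (the `eet` matrix layout and all names are assumptions for illustration, not taken from the paper):

```python
# Min-min list scheduling sketch: repeatedly pick the (task, resource)
# pair with the smallest completion time, minimizing overall makespan.
# eet[t][r] = estimated execution time of task t on resource r.
def min_min(eet):
    n_tasks, n_res = len(eet), len(eet[0])
    ready = [0.0] * n_res                 # time each resource becomes free
    schedule = {}                         # task -> chosen resource
    unscheduled = set(range(n_tasks))
    while unscheduled:
        # smallest completion time over all remaining (task, resource) pairs
        ct, t, r = min((ready[r] + eet[t][r], t, r)
                       for t in unscheduled for r in range(n_res))
        schedule[t] = r
        ready[r] = ct
        unscheduled.remove(t)
    return schedule, max(ready)           # mapping and resulting makespan
```

Swapping the selection criterion (e.g. earliest deadline, cost, or a QoS-weighted combination) in the `min(...)` line yields the family of heuristics such a simulation study compares.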
Abstract: One astonishing capability of humans is to recognize thousands of different objects visually, and to learn the semantic association between those objects and the words referring to them. This work is an attempt to build a computational model of such a capacity, simulating the process by which infants learn how to recognize objects and words through exposure to visual stimuli and vocal sounds. One of the main facts shaping the brain of a newborn is that lights and colors come from entities of the world. Gradually the visual system learns which light sensations belong to the same entities, despite large changes in appearance. This experience is common to humans and several other mammals, such as non-human primates. But only humans can recognize a huge variety of objects, most of them manufactured by humans themselves, and make use of sounds to identify and categorize them. The aim of this model is to reproduce these processes in a biologically plausible way, by reconstructing the essential hierarchy of cortical circuits along the visual and auditory neural paths.
Abstract: Environmental micro-organisms include a large number of taxa and some species that are generally considered non-pathogenic, but can represent a risk in certain conditions, especially for elderly people and immunocompromised individuals. Chemotaxonomic identification techniques are powerful tools for environmental micro-organisms, and cellular fatty acid methyl ester (FAME) content is a powerful fingerprinting identification technique. A system based on an unsupervised artificial neural network (ANN) was set up using the fatty acid profiles of standard bacterial strains, obtained by gas chromatography, as learning data. We analysed 45 certified strains belonging to the Acinetobacter, Aeromonas, Alcaligenes, Aquaspirillum, Arthrobacter, Bacillus, Brevundimonas, Enterobacter, Flavobacterium, Micrococcus, Pseudomonas, Serratia, Shewanella and Vibrio genera. A set of 79 bacteria isolated from a drinking water line (AMGA, the major water supply system in Genoa) was used as a test case for identification, compared against the standard MIDI method. The resulting ANN output map was found to be a very powerful tool to identify these fresh isolates.
Abstract: Porcelain specimens were fired at 6°C/min to 1250°C (dwell time 0.5-3 h) and cooled at 6°C/min to room temperature. Additionally, three different slower firing/cooling cycles were tried. The sintering profile and the effects on MOR, crystalline phase content and morphology were investigated using dilatometry, 4-point bending strength, XRD and FEG-SEM, respectively. Industrial-sized specimens prepared using the most promising cycle were tested based on the ANSI standards. Increasing the dwell time from 1 h to 3 h at a peak temperature of 1250°C had no significant effect on the quartz and mullite content or on MOR. Reducing the firing/cooling rate below 6°C/min, for a peak temperature of 1250°C (dwell time of 1 h), does not result in an improvement of the strength of the porcelain. The industrial-sized specimen exhibited flashover voltages of 20.3 kV (dry) and 9.3 kV (wet), a transverse strength of 12.5 kN and a bulk density of 2.27 g/cm3, which are satisfactory. There was, however, dye penetration during the porosity test. Keywords: dwell time, microstructure, porcelain, strength.
Abstract: Within the realm of e-government, development has moved towards testing new means for democratic decision-making, like e-panels, electronic discussion forums, and polls. Although such new developments seem promising, they are not problem-free, and the outcomes are seldom used in the subsequent formal political procedures. Nevertheless, process models offer promising potential when it comes to structuring and supporting transparency of decision processes in order to facilitate the integration of the public into decision-making procedures in a reasonable and manageable way. Based on real-life cases of urban planning processes in Sweden, we present an outline for an integrated framework for public decision making to: a) provide tools for citizens to organize discussion and create opinions; b) enable governments, authorities, and institutions to better analyse these opinions; and c) enable governments to account for this information in planning and societal decision making by employing a process model for structured public decision making.
Abstract: We present a non-standard Euclidean vehicle
routing problem adding a level of clustering, and we revisit the use
of self-organizing maps as a tool which naturally handles such
problems. We present how they can be used as a main operator
into an evolutionary algorithm to address the two conflicting
objectives of minimizing route length and the distance from customers
to bus stops, and to deal with capacity constraints. We apply the
approach to a real-life case of combined clustering and vehicle
routing for the transportation of the 780 employees of an
enterprise. Based on a geographic information system, we
discuss the influence of road infrastructures on the solutions
generated.
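The core self-organizing map operator can be sketched as a single adaptation step of a ring of neurons (a candidate tour) toward a presented customer location; the learning rate and neighbourhood radius below are illustrative values, not the paper's parameters:

```python
import numpy as np

# One SOM adaptation step for routing: the winning neuron is pulled
# strongly toward the presented customer, its ring neighbours less so.
def som_step(neurons, customer, lr=0.8, radius=2.0):
    d = np.linalg.norm(neurons - customer, axis=1)
    w = int(np.argmin(d))                        # winning neuron index
    n = len(neurons)
    idx = np.arange(n)
    ring = np.minimum(np.abs(idx - w), n - np.abs(idx - w))  # ring distance
    h = np.exp(-(ring / radius) ** 2)            # neighbourhood function
    return neurons + lr * h[:, None] * (customer - neurons)
```

Repeating this step over randomly presented customers deforms the ring into a short tour passing near all of them, which is why the SOM naturally handles the combined clustering-and-routing structure; the evolutionary algorithm then uses it as a variation operator.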
Abstract: The purpose of this paper was to study motivation
factors affecting job performance effectiveness. This paper drew
upon data collected from the internal audit staff of the Internal Audit
Line at the Head Office of Krung Thai Public Company Limited.
Statistics used included frequency, percentage, mean and standard
deviation, t-test, and one-way ANOVA tests. The findings revealed that
the majority of the respondents were female, 46 years of age or
over, married and living together, holding a bachelor's degree, with an
average monthly income over 70,001 Baht. The majority of
respondents had over 15 years of work experience. They generally
had high working motivation as well as high job performance
effectiveness.
The hypothesis testing disclosed that employees with different
working statuses had different levels of job performance effectiveness at
a 0.01 level of significance. Working motivation factors affected
job performance in the same direction at a high level. Individual
working motivation factors included work completion, recognition,
work progression, job characteristics, opportunity,
responsibility, management policy, supervision, relationships with
superiors, relationships with co-workers, position,
job stability, safety, privacy, working conditions, and payment.
All of these factors were related to job performance effectiveness in the
same direction at a medium level.
Abstract: Proteomics is one of the largest areas of research for
bioinformatics and medical science. An ambitious goal of proteomics
is to elucidate the structure, interactions and functions of all proteins
within cells and organisms. Predicting Protein-Protein Interaction
(PPI) is one of the crucial and decisive problems in current research.
Genomic data offer a great opportunity and at the same time a lot of
challenges for the identification of these interactions. Many methods
have already been proposed in this regard. In the case of in silico
identification, most of the methods require both positive and negative
examples of protein interaction, and the quality of these examples
is crucial for the final prediction accuracy. Positive
examples are relatively easy to obtain from well known databases. But
the generation of negative examples is not a trivial task. Current PPI
identification methods generate negative examples based on some
assumptions, which are likely to affect their prediction accuracy.
Hence, if more reliable negative examples are used, the PPI prediction
methods may achieve even higher accuracy. Focusing on this issue, a
graph-based negative example generation method is proposed, which
is simple and more accurate than the existing approaches. An
interaction graph of the protein sequences is created. The basic
assumption is that the longer the shortest path between two
protein-sequences in the interaction graph, the less is the possibility of
their interaction. A well-established PPI detection algorithm is
employed with our negative examples, and in most cases it increases
the accuracy by more than 10% in comparison with the negative pair
selection method used in the original work.
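The generation step described above can be sketched as follows (the distance threshold of four hops is an assumed value for illustration; the abstract does not state one):

```python
from collections import deque

# Build an interaction graph from known positive pairs, then treat
# protein pairs whose shortest path is at least min_dist hops apart
# as reliable negative examples (long path => unlikely interaction).
def negative_pairs(positive_pairs, min_dist=4):
    graph = {}
    for a, b in positive_pairs:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)

    def bfs_dist(src):                      # single-source shortest paths
        dist, q = {src: 0}, deque([src])
        while q:
            u = q.popleft()
            for v in graph[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    proteins = sorted(graph)
    negatives = []
    for i, a in enumerate(proteins):
        d = bfs_dist(a)
        for b in proteins[i + 1:]:
            if d.get(b, float("inf")) >= min_dist:   # unreachable also counts
                negatives.append((a, b))
    return negatives
```

On a chain of known interactions a-b-c-d-e, only the pair (a, e) lies four hops apart, so it is the single negative example produced at this threshold.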