Abstract: Effective knowledge support relies on providing
operation-relevant knowledge to workers promptly and accurately. A
knowledge flow represents an individual's or a group's
knowledge needs and referencing behavior of codified knowledge
during operation performance. The flow has been utilized to facilitate
organizational knowledge support by illustrating workers'
knowledge needs systematically and precisely. However,
conventional knowledge-flow models do not work well in cooperative
teams, in which team members usually have diverse knowledge needs
depending on their roles. The reason is that those models provide only a single
view to all participants and do not reflect individual knowledge needs
in flows. Hence, we propose a role-based knowledge-flow view model
in this work. The model builds knowledge-flow views (or virtual
knowledge flows) by creating appropriate virtual knowledge nodes
and generalizing knowledge concepts to the required concept levels. The
customized views can represent an individual role's knowledge needs
in a teamwork context. The novel model indicates knowledge needs in
a condensed representation from a role's perspective and enhances the
efficiency of cooperative knowledge support in organizations.
Abstract: It is impossible to think about democracy without elections. The litmus test of any electoral process in any country is the possibility of a one-time minority becoming a majority at another time and a peaceful transition of power. In many countries in Sub-Saharan Africa, though the multi-party elections appeared to be competitive, they failed the acid test of democracy: peaceful regime change in a free and fair election. Failure to solve electoral disputes can lead to bloody electoral conflicts, as witnessed in many emerging democracies in Africa. The aim of this paper is to investigate electoral conflicts in Africa since the end of the Cold War, using the 2005 post-election violence in Ethiopia as a case study. In Ethiopia, the coming to power of the EPRDF in 1991 marked the fall of the Derg dictatorial military government and the beginning of a multi-party democracy. The country held multi-party parliamentary elections in 1995, 2000, and 2005, where the ruling EPRDF party “won” the elections through violence, involving intimidation, manipulation, detention of political opponents, torture, and political assassinations. The 2005 electoral violence was the worst in the country's political history, leading to the death of 193 protestors and the imprisonment of more than 40,000 people. It is found that the major causes of the 2005 Ethiopian electoral violence were the defeat of the ruling party in the election and its attempt to reverse the poll results by force; the Opposition's lack of decisive leadership; the absence of independent courts and an independent electoral management body; and the ruling party's direct control over the army and police.
Abstract: In association with path dependence, researchers often
talk of institutional “lock-in”, thereby indicating that far-reaching
path deviation or path departure is to be regarded as an exceptional
case. This article submits the alleged general inclination for stability
of path-dependent processes to a critical review. The different
reasons for path dependence found in the literature indicate that
different continuity-ensuring mechanisms are at work when people
talk about path dependence (“increasing returns”, complementarity,
sequences, etc.). As these mechanisms are susceptible to fundamental
change in different ways and to different degrees, the path
dependence concept alone is of only limited explanatory value. It is
therefore indispensable to identify the underlying continuity-ensuring
mechanism as well if a statement's empirical value is to go beyond
the trivial, always true “history matters”.
Abstract: Road crashes not only claim lives and inflict injuries but also create an economic burden on society due to the loss of productivity. The problem of deaths and injuries resulting from road traffic crashes is now acknowledged to be a global phenomenon, with authorities in virtually all countries of the world concerned about the growth in the number of people killed and seriously injured on their roads. However, the road crash situation in a developing country like Bangladesh is much worse than that of developed countries. For developing proper countermeasures, it is necessary to identify the factors affecting crash occurrence. The objective of the study is to examine the effect of district-wise road infrastructure, socioeconomic and demographic features on crash occurrence. The unit of analysis is the individual district, which has not been explored much in the past. Reported crash data obtained from the Bangladesh Road Transport Authority (BRTA) for the years 2004 to 2010 are utilized to develop a negative binomial model. The model results will reveal the effect of road length (both paved and unpaved), road infrastructure, and several socioeconomic characteristics on district-level crash frequency in Bangladesh.
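The negative binomial specification mentioned in the abstract above is usually the NB2 form, in which the variance grows quadratically with the mean. A minimal sketch of its probability mass function, assuming mean `mu` and overdispersion `alpha`; this is a generic illustration, not the fitted model from the BRTA data:

```python
import math

def nb2_pmf(y, mu, alpha):
    """P(Y = y) for a negative binomial in the NB2 parameterization
    common in crash-frequency modelling: Var(Y) = mu + alpha * mu**2."""
    r = 1.0 / alpha                  # dispersion ("size") parameter
    p = r / (r + mu)                 # success probability
    log_pmf = (math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
               + r * math.log(p) + y * math.log(1.0 - p))
    return math.exp(log_pmf)
```

In a fitted district-level model, the mean would typically come from a log link such as `mu = exp(b0 + b1 * paved_road_km + ...)`, where the coefficient and covariate names here are hypothetical, with parameters estimated by maximum likelihood.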
Abstract: This study examines the impact of working capital
management on the performance and market value of firms in
Nigeria. A sample of fifty-four non-financial quoted firms listed on
the Nigeria Stock Exchange was used for this study. Data
were collected from the annual reports of the sampled firms for the
period 1995-2009. The results show a significant negative
relationship between the cash conversion cycle and both market
valuation and firm performance. They also show that the debt ratio is
positively related to market valuation and negatively related to firm
performance. The findings confirm that there is a significant
relationship between market valuation, profitability and the working
capital components, in line with previous studies. This means that
Nigerian firms should ensure adequate management of working
capital, especially the cash conversion cycle components of accounts
receivable, accounts payable and inventories, as efficient working
capital management is expected to contribute positively to the firms'
market value.
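The cash conversion cycle that the abstract above relates to market valuation is a simple accounting identity: CCC = days inventory outstanding + days sales outstanding - days payables outstanding. A minimal sketch, assuming annual statement figures:

```python
def cash_conversion_cycle(inventory, cogs, receivables, revenue, payables, days=365):
    """CCC = DIO + DSO - DPO, the working-capital metric the study
    relates to market valuation and performance. Inputs are annual
    balance-sheet and income-statement figures."""
    dio = days * inventory / cogs        # days inventory outstanding
    dso = days * receivables / revenue   # days sales outstanding
    dpo = days * payables / cogs         # days payables outstanding
    return dio + dso - dpo
```

A shorter CCC means cash is tied up in operations for less time, which is consistent with the negative relationship between CCC and performance reported in the abstract.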
Abstract: In this paper, the implementation of a rule-based
intuitive reasoner is presented. The implementation included two
parts: the rule induction module and the intuitive reasoner. A large
weather database was acquired as the data source. Twelve weather
variables from those data were chosen as the “target variables”
whose values were predicted by the intuitive reasoner. A “complex”
situation was simulated by making only subsets of the data available
to the rule induction module. As a result, the rules induced were
based on incomplete information with variable levels of certainty.
The certainty level was modeled by a metric called "Strength of
Belief", which was assigned to each rule or datum as ancillary
information about the confidence in its accuracy. Two techniques
were employed to induce rules from the data subsets: decision tree
and multi-polynomial regression, respectively for the discrete and the
continuous type of target variables. The intuitive reasoner was tested
for its ability to use the induced rules to predict the classes of the
discrete target variables and the values of the continuous target
variables. The intuitive reasoner implemented two types of
reasoning, fast and broad, where, by analogy to human thought, the
former corresponds to fast decision making and the latter to deeper
contemplation. For reference, a weather data analysis approach
which had been applied on similar tasks was adopted to analyze the
complete database and create predictive models for the same 12
target variables. The values predicted by the intuitive reasoner and
the reference approach were compared with actual data. The intuitive
reasoner reached near-100% accuracy for two continuous target
variables. For the discrete target variables, the intuitive reasoner
predicted at least 70% as accurately as the reference reasoner. Since
the intuitive reasoner operated on rules derived from only about 10%
of the total data, it demonstrated the potential advantages in dealing
with sparse data sets as compared with conventional methods.
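The paper's exact rule for combining "Strength of Belief" values is not given in the abstract; one plausible reading, shown here purely as an assumption, is weighted voting among the rules that fire on a case:

```python
from collections import defaultdict

def combine_rule_predictions(firing_rules):
    """Combine the class predictions of the rules that fire on one case,
    weighting each vote by its Strength of Belief. The combination rule
    (weighted voting) is an illustrative assumption, not the paper's
    published mechanism."""
    votes = defaultdict(float)
    for predicted_class, strength in firing_rules:
        votes[predicted_class] += strength
    best = max(votes, key=votes.get)
    # Normalized vote share doubles as a confidence estimate.
    return best, votes[best] / sum(votes.values())
```

For example, two weak "rain" rules can jointly outvote one moderately strong "dry" rule, mirroring how rules induced from incomplete data subsets carry variable certainty.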
Abstract: Direct search (DS) methods are derivative-free optimization algorithms: they do not require any information about the gradient of the objective function at hand while searching for an optimum solution. One such method is the Pattern Search (PS) algorithm. This paper presents a new approach based on a constrained pattern search algorithm to solve the security-constrained power system economic dispatch (SCED) problem. The operation of power systems demands a high degree of security to keep the system operating satisfactorily when subjected to disturbances, while at the same time paying attention to economic aspects. A pattern recognition technique is first used to assess dynamic security. Linear classifiers that determine the stability of the electric power system are presented and added to the other system stability and operational constraints. The problem is formulated as a constrained optimization problem in a way that ensures secure and economic system operation. The pattern search method is then applied to solve the constrained formulation. The method is tested on one test system, and simulation results of the proposed approach are compared with those reported in the literature. The outcome is very encouraging and shows that pattern search (PS) is well suited to solving the SCED problem.
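The abstract above names pattern search plus constraints but gives no algorithmic detail. A generic compass-style pattern search with a quadratic penalty for the constraints is one minimal sketch; this is not the paper's specific SCED formulation or its dynamic-security classifiers:

```python
def pattern_search(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=10000):
    """Compass-style pattern search: probe +/- step along each coordinate,
    accept any improving point, and shrink the step when a full sweep fails."""
    x = list(x0)
    fx = f(x)
    iters = 0
    while step > tol and iters < max_iter:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    improved = True
        if not improved:
            step *= shrink
        iters += 1
    return x, fx

def penalized(cost, constraints, rho=1e6):
    """Fold constraints g(x) <= 0 into the objective via a quadratic penalty."""
    def f(x):
        return cost(x) + rho * sum(max(0.0, g(x)) ** 2 for g in constraints)
    return f
```

For an economic dispatch instance, `cost` would be the fuel-cost polynomial and `constraints` the security and operational limits expressed as g(x) <= 0. Note that a pure penalty approach can stall at constraint boundaries where no single coordinate move improves, which is one reason a dedicated constrained formulation, as the paper proposes, matters.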
Abstract: This paper presents a new method for
computing the reliability of a system arranged in a series or
parallel configuration. In this method, we estimate the life distribution
function of the whole structure using the asymptotic Extreme Value (EV)
distribution of Type I, or Gumbel distribution. We use the EV distribution in
its minimal form to estimate the life distribution function of a series
structure, and in its maximal form for a parallel system. All parameters
are estimated by the method of moments. The reliability function, the failure
(hazard) rate and the p-th percentile point of each function are
determined. Other important indexes, such as the Mean Time To Failure
(MTTF) and the Mean Time To Repair (MTTR), are also computed for
non-repairable and renewal systems in both series and parallel
structures.
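The quantities listed in the abstract above have closed forms for the Type I (Gumbel) extreme value distribution. A sketch for the minimal form used for series structures, together with the method-of-moments fit (the maximal form for parallel systems is symmetric); the sample mean and standard deviation in the test are illustrative:

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_min_fit_moments(mean, std):
    """Method-of-moments fit of the minimal (smallest-extreme-value)
    Gumbel distribution: its mean is mu - gamma*sigma and its
    standard deviation is sigma*pi/sqrt(6)."""
    sigma = std * math.sqrt(6.0) / math.pi
    mu = mean + EULER_GAMMA * sigma
    return mu, sigma

def reliability(t, mu, sigma):
    # R(t) = P(T > t) = exp(-exp((t - mu) / sigma))
    return math.exp(-math.exp((t - mu) / sigma))

def hazard(t, mu, sigma):
    # h(t) = f(t) / R(t) = (1 / sigma) * exp((t - mu) / sigma)
    return math.exp((t - mu) / sigma) / sigma

def percentile(p, mu, sigma):
    # t_p solves F(t_p) = p for the minimal Gumbel CDF
    return mu + sigma * math.log(-math.log(1.0 - p))

def mttf(mu, sigma):
    # Mean of the minimal Gumbel distribution
    return mu - EULER_GAMMA * sigma
```

The minimal form arises because a series system fails at the minimum of its component lifetimes; for a parallel system one would use the maximal Gumbel form, whose reliability is 1 - exp(-exp(-(t - mu)/sigma)).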
Abstract: Network management systems play an important role in information systems. Management functions include configuration management, fault management, performance management, security management, accounting management, and so on. Among them, configuration, fault and security management are the most important, because they are essential and useful in any field. Configuration management monitors and maintains the whole system or LAN; fault management detects and troubleshoots faults in the system; security management controls access to the whole system. This paper intends to extend network management functionality covering configuration, fault and security management. In the configuration management system, this work supports USB ports and devices, detecting and reading device configurations and detecting both hardware and software ports. In the security management system, it provides security features for user account settings, user management and a proxy server. The complete security history, such as the user account and proxy server history, is kept in a Java standard serializable file, so the user can view the security and proxy server history at any time. In the fault management system, the user can ping clients on the network and view the resulting messages; the system can also check the network card and show the NIC card settings. The system uses RMI (Remote Method Invocation) and JNI (Java Native Interface) technology, and the client/server network management system is implemented in Java 2 Standard Edition (J2SE). The system supports more than 10 clients. Finally, the paper shows the data and message structures of the client/server exchange and how they work over the TCP/IP protocol.
Abstract: In this paper, we present a novel statistical approach to
corpus-based speech synthesis. Classically, phonetic information is
defined and considered as the acoustic reference to be respected. To this
end, many studies have been devoted to acoustic unit classification.
This type of classification allows units to be separated according to their
symbolic characteristics. Indeed, a target cost and a concatenation cost
are classically defined for unit selection.
In corpus-based speech synthesis systems using large text
corpora, cost functions have been limited to a juxtaposition of symbolic
criteria, and the acoustic information of units is not exploited in the
definition of the target cost.
In this manuscript, we take into consideration the phonetic
information of units corresponding to their acoustic information. This is
realized by defining a probabilistic linguistic bigram model
used for unit selection. The selected units are extracted from
the English TIMIT corpus.
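As an illustration of a probabilistic linguistic bigram model for unit selection, a minimal add-k smoothed estimator over symbolic unit labels; the toy labels and the smoothing constant are assumptions for illustration, not the paper's setup:

```python
from collections import Counter

def train_bigram_model(unit_sequences, k=1.0):
    """Estimate add-k smoothed bigram probabilities P(u_i | u_{i-1})
    over symbolic unit labels (e.g. phone classes). Returns a function
    prob(prev, cur)."""
    unigrams, bigrams = Counter(), Counter()
    vocab = set()
    for seq in unit_sequences:
        vocab.update(seq)
        for prev, cur in zip(seq, seq[1:]):
            unigrams[prev] += 1
            bigrams[(prev, cur)] += 1
    V = len(vocab)

    def prob(prev, cur):
        # Add-k smoothing keeps unseen transitions selectable.
        return (bigrams[(prev, cur)] + k) / (unigrams[prev] + k * V)

    return prob
```

In a unit-selection context, such transition probabilities can score candidate unit sequences alongside (or inside) the target and concatenation costs.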
Abstract: This paper presents a new method for the
implementation of direct rotor flux oriented control (DRFOC) of induction
motor (IM) drives. It is based on the regulation of the rotor flux
components. The d- and q-axis rotor flux components feed
proportional-integral (PI) controllers, the outputs of which are the target
stator voltages (vdsref and vqsref), while the synchronous speed is
obtained at the output of the rotor speed controller. To accomplish
variable-speed operation, a conventional PI-type controller is commonly
used, but such controllers provide only limited performance over a wide
range of operating conditions, even under ideal field-oriented conditions.
An alternative approach is to use the so-called fuzzy logic controller. The
overall investigated system is implemented using a dSpace system
based on a digital signal processor (DSP). Simulation and experimental
results are presented for a 1 kW IM drive to confirm the
validity of the proposed algorithms.
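A generic sketch of the kind of discrete PI loop used for the d- and q-axis rotor-flux regulators, with output clamping and simple anti-windup; the gains and limits below are illustrative, not the paper's tuned values:

```python
class PIController:
    """Discrete PI controller with output clamping and anti-windup."""

    def __init__(self, kp, ki, dt, out_min, out_max):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0

    def update(self, reference, measurement):
        error = reference - measurement
        u = self.kp * error + self.ki * self.integral
        # Anti-windup: freeze the integrator while the output saturates.
        if self.out_min < u < self.out_max:
            self.integral += error * self.dt
        return min(max(u, self.out_min), self.out_max)
```

Driving a first-order plant toward a unit flux reference illustrates the loop; a fuzzy logic controller, as investigated in the paper, would replace the `update` law while keeping the same interface.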
Abstract: This paper focuses on testing the database of an existing
information system. At the beginning, we describe the basic problems
of implemented databases, such as data redundancy, poor design of
the database logical structure, or inappropriate data types in the columns
of database tables. These problems are often the result of an incorrect
understanding of the primary requirements for the database of an
information system. We then propose an algorithm to compare the
conceptual model created from vague requirements for a database
with a conceptual model reconstructed from the implemented database.
The algorithm also suggests steps leading to the optimization of the
implemented database, and it is verified by an
implemented prototype. The paper also describes a fuzzy system
that works with the vague requirements for a database of an
information system, a procedure for creating a conceptual model from vague
requirements, and an algorithm for reconstructing a conceptual model
from an implemented database.
Abstract: Communication has been a significant tool for engaging stakeholders for half a century. In recent years, there has been rapid growth in new technology developments. In tandem with such developments, there has been a growing emphasis on communication strategies and management, especially in determining the level of influence and the management strategies among the said stakeholders in a particular field. This paper presents a research conceptual framework focusing on stakeholder theories, communication and management strategies to be applied to the engagement of stakeholders in new technology developments in the fertilizer industry in Malaysia. The framework espoused in this paper will provide insights into the various stakeholder theories and engagement strategies from different principals necessary for a successful introduction of new technology development in the above-stated industry. The proposed framework has theoretical significance in filling the gap in the body of knowledge on the implementation of communication strategies in the Malaysian fertilizer industry.
Abstract: This paper proposes a novel stereo vision technique
for top-view book scanners which provides dense 3D point
clouds of page surfaces. This is a precondition for dewarping bound
volumes independently of 2D information on the page. Our method is
based on algorithms which normally require the projection of pattern
sequences with structured light. Instead of an additional light
projection, we use image sequences of the moving stripe lighting of
the top-view scanner. Thus the stereo vision setup is simplified without
losing measurement accuracy. Furthermore, we improve a surface-model
dewarping method by introducing a difference vector
based on real measurements. Although our proposed method is
inexpensive in both computation time and hardware requirements, it
delivers good dewarping results even for difficult examples.
Abstract: Saudi Arabia has seen a drastic increase in
traffic-related crashes in recent years. With a population of over 29
million, Saudi Arabia is considered a fast-growing and emerging economy.
The rapid population increase and economic growth have resulted in rapid
expansion of the transportation infrastructure, which has led to an increase
in road crashes. The Saudi Ministry of Interior reported more than 7,000
people killed and 68,000 injured in 2011, ranking Saudi Arabia among
the worst countries worldwide in traffic safety. The traffic safety issues
in the country also cause distress to road users and an
economic loss exceeding 3.7 billion Euros annually. Keeping this in
view, researchers in Saudi Arabia are investigating ways to
improve traffic safety conditions in the country. This paper presents a
multilevel approach to collecting the traffic safety related data required
for traffic safety studies in the region. Two highway corridors, the
39-kilometre King Fahd Highway and the 42-kilometre Gulf Cooperation
Council Highway, connecting the cities of Dammam and Khobar, were
selected as the study area. The traffic data collected included
traffic counts, crash data, travel time data, and speed data. The
collected data were analysed using a geographic information system to
evaluate any correlations. Further research is needed to investigate the
effectiveness of traffic safety related data when collected in a
concerted effort.
Abstract: Dedicated Short Range Communication (DSRC) is a
key enabling technology for the next generation of
communication-based safety applications. One of the important
problems for DSRC deployment is maintaining high performance
under heavy channel load. Many studies therefore focus on congestion
control mechanisms, simulating hundreds of physical radios deployed on
vehicles. The U.S. Department of Transportation's (DOT) Intelligent
Transportation Systems (ITS) division plans to choose prototype
on-board devices capable of transmitting basic “Here I am” safety
messages to other vehicles. The devices will be used in an IntelliDrive
safety pilot deployment of up to 3,000 vehicles, and it is hard to log the
information of 3,000 vehicles. In this paper, we present the designs and
issues related to the DSRC Radio Testbed under heavy channel load.
The details not only include the architecture of the DSRC Radio Testbed,
but also describe how the radio interference system is used to help
emulate a congested radio environment.
Abstract: In this paper, the relative performance of spectral
classification of short exon and intron sequences of the human and
eleven model organisms is studied. In the simulations, all
combinations of sixteen one-sequence numerical representations, four
threshold values, and four window lengths are considered. Sequences
of 150-base length are chosen and for each organism, a total of
16,000 sequences are used for training and testing. Results indicate
that an appropriate combination of one-sequence numerical
representation, threshold value, and window length is essential for
arriving at top spectral classification results. For fixed-length
sequences, the precisions on exon and intron classification obtained
for different organisms are not the same because of their genomic
differences. In general, precision increases as sequence length
increases.
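As one concrete instance of the ingredients the abstract above enumerates, a sketch using the classic binary-indicator (Voss) representation and the period-3 spectral peak widely used to separate coding (exon) from non-coding sequence; this particular representation and SNR feature are illustrative assumptions, not necessarily the paper's best-performing combination:

```python
import cmath

def indicator_power(seq, k):
    """Summed DFT power at frequency index k of the four binary
    indicator sequences (Voss representation) of a DNA string."""
    N = len(seq)
    total = 0.0
    for base in "ACGT":
        u = [1.0 if c == base else 0.0 for c in seq]
        X = sum(u[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
        total += abs(X) ** 2
    return total

def period3_snr(seq):
    """Ratio of the N/3 spectral peak to the average spectral power,
    a standard period-3 feature for exon/intron discrimination."""
    N = len(seq)
    peak = indicator_power(seq, N // 3)
    avg = sum(indicator_power(seq, k) for k in range(1, N // 2)) / (N // 2 - 1)
    return peak / avg
```

For a 150-base sequence, as in the study, the peak sits at frequency index 50; coding regions tend to show an elevated peak because of codon-structure periodicity, while introns do not.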
Abstract: Several models of vulnerability assessment have been proposed. The selection of one of these models depends on the objectives of the study. The classical methodologies for seismic vulnerability analysis, as a part of seismic risk analysis, have been formulated with statistical criteria based on rapid observation. The information relating to the buildings' performance is statistically elaborated. In this paper, we use the European Macroseismic Scale EMS-98 to define the relationship between damage and macroseismic intensity in order to assess seismic vulnerability. In an application to the Algiers area, the first step is to identify building typologies and to assign vulnerability classes; in the second step, damage is investigated according to EMS-98.
Abstract: Project-based pedagogy has proven to be an active
learning method used to develop learners' skills and knowledge.
The use of technology in the learning world has filled
several gaps in the implementation of teaching methods and the online
evaluation of learners. However, the project methodology presents
challenges in the online assessment of learners.
Indeed, interoperability between e-learning platforms (LMS) is
one of the major challenges of project-based learning assessment.
Firstly, we review the characteristics of online assessment
in the context of project-based teaching and address the
constraints encountered during the peer evaluation process.
Our approach is to propose a meta-model that describes a
language dedicated to the design of peer assessment scenarios in
project-based learning. We then illustrate our proposal by an
instantiation of the meta-model through a business process in a
scenario of collaborative online assessment.
Abstract: The struggle between modern and postmodern
understanding is also displayed in terms of the superiorities of
quantitative and qualitative methods to each other which are
evaluated within the scope of these understandings. By way of
assuming that the quantitative researches (modern) are able to
account for structure while the qualitative researches (postmodern)
explain the process, these methods are turned into a means for
worldviews specific to a period. In fact, the process does not function
independently of the structure. In addition to this issue, the ability of
quantitative methods to provide scientific knowledge is also
controversial so long as they exclude the dialectical method. For this
reason, the critiques charged against modernism in terms of
quantitative methods are, in a sense, legitimate. Nevertheless, the
main issue is by which parameters the postmodernist critique tries to
legitimize itself and whether these parameters represent a point
of view enabling democratic solutions.
In this respect, the scientific knowledge covered in the Turkish media,
as a means through which ordinary people gain access to scientific
knowledge, will be evaluated by means of content analysis within a
new conception of objectivity.