Abstract: In this competitive age, knowledge management is one of the key tools of most successful organizations. Today, some organizations measure their current knowledge and use it as an indicator for rating the organization in their reports. Since universities and colleges of medical science play a great role in the public health of societies, their access to the newest scientific research and the establishment of organizational knowledge management systems are very important. In order to explore the application of knowledge management factors, a national study was undertaken. The main purpose of this study was to determine the extent of the application of knowledge management factors and to identify ways to further establish a knowledge management system in Esfahan University's Medical College (EUMC). Esfahan is the second largest city after Tehran, the capital of Iran, and the EUMC is the biggest medical college in Esfahan. To rate the application of knowledge management, this study uses a quantitative research methodology based on the Probst, Raub and Romhardt model of knowledge management. A group of 267 faculty members and staff of the EUMC were surveyed via questionnaire. Findings showed that the rate of application of knowledge management factors in the EUMC was lower than average. As a result, interviews with ten faculty members were conducted to find guidelines for establishing broader application of a knowledge management system in the EUMC.
Abstract: Since 1984, many schemes have been proposed for digital signature protocols, among them schemes based on the discrete logarithm and factorization problems. However, a new identification scheme based on iterated function systems (IFS) has been proposed and proved to be more efficient. In this study, the proposed identification scheme is transformed into a digital signature scheme by using a one-way hash function. It is a generalization of the GQ signature scheme. The attractor of the IFS is used to obtain the public key from the private one, and in the encryption and decryption of a hash function. Our aim is to provide techniques and tools which may be useful towards developing cryptographic protocols. Comparisons between the proposed scheme and the fractal digital signature scheme based on the RSA setting, as well as the conventional Guillou-Quisquater and RSA signature schemes, are performed to show that the proposed scheme is efficient and offers high performance.
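As a hedged illustration of the generic transformation the abstract refers to (an identification scheme turned into a signature scheme through a one-way hash, the same construction that underlies GQ signatures), the toy Schnorr-style sketch below shows the structure; it does not reproduce the paper's IFS-based operations, and the parameters are deliberately tiny and insecure:

```python
# Toy sketch of the "identification scheme + one-way hash -> signature"
# construction (Fiat-Shamir style). This is NOT the paper's IFS scheme.
import hashlib
import secrets

p, q, g = 23, 11, 2            # toy group: g has prime order q in Z_p*

def H(*parts: bytes) -> int:
    """One-way hash mapped to a challenge modulo q."""
    return int.from_bytes(hashlib.sha256(b"".join(parts)).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q)   # private key
    y = pow(g, x, p)           # public key
    return x, y

def sign(message: bytes, x: int):
    k = secrets.randbelow(q)               # nonce of the identification protocol
    r = pow(g, k, p)                       # commitment
    e = H(r.to_bytes(4, "big"), message)   # hash replaces the verifier's challenge
    s = (k + x * e) % q                    # response
    return e, s

def verify(message: bytes, signature, y: int) -> bool:
    e, s = signature
    # Recompute the commitment from the response and the public key
    # (pow with a negative exponent needs Python 3.8+).
    r = (pow(g, s, p) * pow(y, -e, p)) % p
    return e == H(r.to_bytes(4, "big"), message)

x, y = keygen()
sig = sign(b"sample message", x)
assert verify(b"sample message", sig, y)
```

In the paper's scheme, the commitment and verification algebra would instead be carried out with the attractor of the IFS rather than modular exponentiation.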
Abstract: Reservoirs with high pressures and temperatures
(HPHT) that were considered to be atypical in the past are now
frequent targets for exploration. For downhole oilfield drilling tools
and components, the temperature and pressure affect the mechanical
strength. To address this issue, a finite element analysis (FEA) for
206.84 MPa (30 ksi) pressure and 165°C has been performed on the
pressure housing of the measurement-while-drilling/logging-while-drilling (MWD/LWD) density tool.
The density tool is an MWD/LWD sensor that measures the density
of the formation. One of the components of the density tool is the
pressure housing that is positioned in the tool. The FEA results are
compared with the experimental test performed on the pressure
housing of the density tool. The results show a close match between the numerical results and the experimental test. This FEA model can
be used for extreme HPHT and ultra HPHT analyses, and/or optimal
design changes.
Abstract: General requirements for knowledge representation in
the form of logic rules, applicable to design and control of industrial
processes, are formulated. Characteristic behavior of decision trees
(DTs) and rough set theory (RST) in rule extraction from recorded data is discussed and illustrated with simple examples. The significance of the models' drawbacks was evaluated using
simulated and industrial data sets. It is concluded that performance of
DTs may be considerably poorer in several important aspects,
compared to RST, particularly when not only a characterization of a
problem is required, but also detailed and precise rules are needed,
according to actual, specific problems to be solved.
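To make the contrast concrete, the following minimal sketch (a generic illustration on a standard data set, not the simulated or industrial data used in the study) shows how if-then rules can be read off a fitted decision tree; rough-set rule induction would instead derive rules from attribute reducts:

```python
# Minimal, generic sketch of rule extraction from a decision tree.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(data.data, data.target)

# export_text walks the tree and prints one "if feature <= threshold" branch
# per line, i.e. the rule set induced by the tree.
print(export_text(tree, feature_names=list(data.feature_names)))
```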
Abstract: Every organization is continually subject to new damages and threats, which can result from its operations or the pursuit of its goals. Methods of securing the working environment and its tools have changed considerably with the increasing application and development of information technology (IT). From this viewpoint, information security management systems evolved to systematize established methods and avoid repeating past experience. In general, a correct response in information security management systems requires correct decision making, which in turn requires the comprehensive effort of managers and everyone involved in each plan or decision. Obviously, not all aspects of a task or decision are defined under every decision-making condition; therefore, the possible or certain risks should be considered when making decisions. This is the subject of risk management, and it can influence decisions. Investigation of different approaches in the field of risk management demonstrates their progress from quantitative to qualitative methods with a process approach.
Abstract: This paper presents the harmonic elimination of hybrid multilevel inverters (HMI), which can increase the number of output voltage levels. Total Harmonic Distortion (THD) is one of the most important performance indices. Because an HMI has many output levels, the set of nonlinear equations for eliminating undesired individual harmonics and minimizing THD contains numerous unknown variables. The optimized harmonic stepped waveform (OHSW) is the conventional method for solving the switching angles, but it becomes increasingly complicated to solve as levels are added. Artificial intelligence techniques are considered to solve this problem. This paper presents the Particle Swarm Optimization (PSO) technique for solving the switching angles to obtain minimum THD and eliminate undesired individual harmonics of a 15-level hybrid multilevel inverter. Consequently, it has many variables and can eliminate numerous harmonics. Both advantages, the high number of inverter levels and PSO, are used as powerful tools for harmonic elimination.
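A worked form of the nonlinear equation set may clarify what PSO searches over; the following assumes the standard stepped-waveform formulation with s switching angles per quarter cycle and equal DC sources, which the abstract does not spell out:

$$V_n = \frac{4V_{dc}}{n\pi}\sum_{k=1}^{s}\cos(n\theta_k), \qquad 0 < \theta_1 < \theta_2 < \dots < \theta_s < \frac{\pi}{2},$$

$$\mathrm{THD} = \frac{1}{V_1}\sqrt{\sum_{n=3,5,7,\dots} V_n^2},$$

where $V_n$ is the amplitude of the n-th odd harmonic (triplen harmonics are often excluded in three-phase systems). PSO then searches the angle vector $(\theta_1,\dots,\theta_s)$ that minimizes a fitness combining the THD with penalties on the targeted low-order harmonics.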
Abstract: To illustrate the diversity of methods used to extract relevant visual data (where the concept of relevance can be defined differently for different applications), the paper discusses three groups of such methods. They have been selected from a range of alternatives to highlight how hardware and software tools can be used in a complementary way in order to achieve various functionalities for different specifications of "relevant data". First, the principles of gated imaging are presented (where relevance is determined by range). The second methodology is intended for intelligent intrusion detection, while the last one is used for content-based image matching and retrieval. All methods have been developed within projects supervised by the author.
Abstract: An advanced Monte Carlo simulation method, called Subset Simulation (SS), is presented in this paper for the time-dependent reliability prediction of underground pipelines. SS provides better resolution at low failure probability levels by efficiently investigating the rare failure events that are commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used to compute the probabilistic performance through statistical variables. SS gains its efficiency by expressing a small failure probability as a product of conditional probabilities of a sequence of intermediate events, each of which is larger and easier to estimate. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process of underground pipeline network reliability prediction.
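The factorization that gives SS its efficiency can be written explicitly (this is the standard Subset Simulation decomposition; the intermediate thresholds are chosen adaptively and are not specified in the abstract). Writing the failure event $F$ as the last of a nested sequence $F_1 \supset F_2 \supset \dots \supset F_m = F$,

$$P(F) = P(F_1)\prod_{i=2}^{m} P(F_i \mid F_{i-1}),$$

so each conditional probability (commonly fixed near $p_0 \approx 0.1$) is estimated with a moderate number of conditional Markov chain Monte Carlo samples, instead of estimating the rare event directly by crude Monte Carlo.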
Abstract: The dynamic contouring error is a critical element of the accuracy of machine tools. The contouring error is defined as the difference between the actual machining path and the commanded path, which is realized by following the command curves of the feed drive system in machine tools. The contouring error results from various factors, such as external loads, friction, moments of inertia, feed rate, speed control and servo control. Thus, this study proposes a 2D compensating system for the contouring accuracy of machine tools. An optical method is adopted, using a stable-frequency laser diode and a high-precision position sensing detector (PSD) to perform non-contact measurement. Results show that the accuracy of the PSD in the 2D contouring accuracy compensating system was ±1.5 μm for a range of ±3 mm, and the accuracy improvement is over 80% at high-speed feed rates.
Abstract: Retention in the IT profession is critical for
organizations to stay competitive and operate reliably in the dynamic
business environment. Most organizations rely on compensation and
rewards as primary tools to enhance retention of employees. In this
quantitative survey-based study conducted at a large global bank, we
analyze the perceptions of 575 information technology (IT) software
professionals in India and Malaysia and find that fairness of rewards
has very little impact on retention likelihood. It is far more important
to actively involve employees in organizational activities. In
addition, our findings indicate that involvement is far more important
than information flow, i.e., the typical organizational communication used to keep employees informed.
Abstract: Research related to standard product models and the development of neutral manufacturing interfaces for numerical control machines has been a significant topic for the last 25 years. In this paper, a detailed description of a STEP implementation for turn-mill manufacturing is discussed. It presents the information content requirements drawn from the ISO 14649 data model and describes the design of a STEP-NC framework applicable to turn-mill manufacturing. In the framework, EXPRESS-G and UML modelling tools are used to depict the information contents of the system and to establish the basis of the information model requirements, yielding a product and manufacturing data model applicable to STEP-compliant manufacturing. The requirements of next-generation turn-mill operations have been represented by a UML diagram. Object-oriented classes of ISO 14649 have been developed on the Visual Basic .NET platform for binding the static information model represented by the UML diagram. An architecture of the proposed system implementation is given on the basis of the design and manufacturing modules of the established STEP-NC interface. Finally, a Part 21 file process plan is generated as an illustration for turn-mill components.
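As an illustration of what binding the static information model to object-oriented classes can look like, the sketch below uses Python data classes (the paper uses Visual Basic .NET); entity and attribute names are simplified stand-ins, not the normative ISO 14649 EXPRESS definitions:

```python
# Simplified, illustrative object model in the spirit of ISO 14649 entities.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MachiningOperation:
    name: str             # e.g. "contour turning", "drilling"
    tool_id: str
    spindle_speed: float  # rev/min
    feed_rate: float      # mm/min

@dataclass
class MachiningWorkingstep:
    name: str
    feature: str                  # manufacturing feature machined by this step
    operation: MachiningOperation

@dataclass
class Workplan:
    name: str
    workingsteps: List[MachiningWorkingstep] = field(default_factory=list)

@dataclass
class Project:
    name: str
    main_workplan: Workplan

# Example: a tiny turn-mill process plan that could later be serialized
# to an ISO 10303 Part 21 file.
op = MachiningOperation("contour turning", "T01", 2000.0, 150.0)
ws = MachiningWorkingstep("WS1", "outer_diameter", op)
plan = Project("turn-mill demo", Workplan("MAIN", [ws]))
```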
Abstract: Since the conception of JML, many tools, applications and implementations have been developed. In this context, users or developers who want to use JML may feel surrounded by a multitude of these tools and applications. Looking for a common infrastructure and an independent language to provide a bridge between these tools and JML, we developed an approach to embed contracts in XML for Java: XJML. This approach offers the ability to separate preconditions, postconditions and class invariants using JML and XML, so we built a front-end which can perform Runtime Assertion Checking, Extended Static Checking and Full Static Program Verification. Moreover, the capabilities of this front-end can be easily extended and implemented thanks to XML. We believe that XJML is an easy way to start building a Graphic User Interface, thereby delivering a friendly and IDE-independent environment to the developer community that wants to work with JML.
Abstract: The recent drive for the use of performance-based methodologies in the design and assessment of structures in seismic areas has significantly increased the demand for the development of reliable nonlinear inelastic static pushover analysis tools. As a result, adaptive pushover methods have been developed during the last decade which, unlike their conventional pushover counterparts, can account for the effect that higher modes of vibration and progressive stiffness degradation may have on the distribution of seismic storey forces. Even in advanced pushover methods, little attention has been paid to unsymmetric structures. This study evaluates the seismic demands of three-dimensional unsymmetric-plan buildings determined by the Displacement-based Adaptive Pushover (DAP) analysis introduced by Antoniou and Pinho [2004]. The capability of the DAP procedure in capturing the torsional effects due to the irregularities of the structures is investigated by comparing its estimates to the exact results obtained from Incremental Dynamic Analysis (IDA). The capability of the procedure in predicting the seismic behaviour of the structure is also discussed.
Abstract: Virtually all existing networked system management
tools use a Manager/Agent paradigm. That is, distributed agents are
deployed on managed devices to collect local information and report
it back to some management unit. Even those that use standard
protocols such as SNMP fall into this model. Using a standard protocol
has the advantage of interoperability among devices from different
vendors. However, it may not be able to provide customized
information that is of interest to satisfy specific management needs.
In this dissertation work, different approaches are used to
collect information regarding the devices attached to a Local Area
Network. An SNMP-aware application is being developed that will manage the discovery procedure and will be used as a data collector.
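A hedged sketch of the kind of query such an SNMP-aware collector could issue is shown below; the abstract does not name a library, so pysnmp is used here as one common choice, and the address and community string are placeholders:

```python
# Query a device's sysDescr and sysName over SNMPv2c (placeholder target).
from pysnmp.hlapi import (
    getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity,
)

error_indication, error_status, error_index, var_binds = next(
    getCmd(
        SnmpEngine(),
        CommunityData("public", mpModel=1),       # SNMPv2c community
        UdpTransportTarget(("192.0.2.1", 161)),   # managed device (placeholder)
        ContextData(),
        ObjectType(ObjectIdentity("SNMPv2-MIB", "sysDescr", 0)),
        ObjectType(ObjectIdentity("SNMPv2-MIB", "sysName", 0)),
    )
)

if error_indication:
    print(error_indication)
else:
    for name, value in var_binds:   # e.g. SNMPv2-MIB::sysDescr.0 = "..."
        print(f"{name} = {value}")
```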
Abstract: This article considers the basic features of students' information culture and its wide use in the implementation of new information technologies in the educational process, which determines the search for ways of relating the content, aims and objectives of the study through interdisciplinary connections. In this regard, the article raises questions about students' information culture and presents information about the aims and objectives of the process of forming information culture among students. Forming a professional interest in relevant information offers an opportunity to support professional activities through the effective use of interactive methods and innovative technologies in the learning process. The results of the experiment prove the effectiveness of the process of forming students' information culture in a higher education system based on credit technology. The main purpose of this paper is a comprehensive review of students' information culture.
Abstract: Natural Language Understanding (NLU) systems will not be widely deployed unless they are technically mature and cost-effective to develop. Cost-effective development hinges on the availability of tools and techniques enabling the rapid production of NLU applications with minimal human resources. Further, these tools and techniques should allow quick development of applications in a user-friendly way and should be easy to upgrade in order to continuously follow evolving technologies and standards. This paper presents a visual tool for the structuring and editing of dialog forms, the key element driving conversation in NLU applications based on IBM technology. The main focus is on the basic component used to describe human-machine interactions of that kind, the Dialogue Manager. In essence, a tool that enables the visual representation of the Dialogue Manager, mainly during the implementation phase, is described.
Abstract: The wavelet transform is a very powerful tool for image compression. One of its advantages is the provision of both spatial and frequency localization of image energy. However, wavelet transform coefficients are defined by both a magnitude and a sign. While algorithms exist for efficiently coding the magnitude of the transform coefficients, they are not efficient for the coding of their sign. It is generally assumed that there is no compression gain to be obtained from the coding of the sign. Only recently have some authors begun to investigate the sign of wavelet coefficients in image coding. Some authors have assumed that the sign information bit of wavelet coefficients may be encoded with an estimated probability of 0.5; the same assumption concerns the refinement information bit. In this paper, we propose a new method for Separate Sign Coding (SSC) of wavelet image coefficients. The sign and the magnitude of wavelet image coefficients are examined to obtain their online probabilities. We use scalar quantization, in which the information on whether the wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also examined. We show that the sign information and the refinement information may be encoded with a probability of approximately 0.5 only after about five bit planes. Two maps are separately entropy encoded: the sign map and the magnitude map. The refinement information on whether the wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also entropy encoded. An algorithm is developed and simulations are performed on three standard grey-scale images: Lena, Barbara and Cameraman. A five-scale decomposition is performed using the biorthogonal 9/7 wavelet filter bank. The obtained results are compared to the JPEG2000 standard in terms of peak signal-to-noise ratio (PSNR) for the three images and in terms of subjective (visual) quality. It is shown that the proposed method outperforms JPEG2000. The proposed method is also compared to other codecs in the literature, and it is shown to be very successful in terms of PSNR.
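The following sketch (using PyWavelets' 'bior4.4', i.e. the biorthogonal 9/7 filter bank, on a placeholder image) illustrates the separation of sign and magnitude and the estimation of the empirical sign probability that Separate Sign Coding exploits; the paper's actual codec is not reproduced:

```python
# Separate the sign and magnitude of wavelet detail coefficients and estimate
# the empirical probability of a positive sign at each decomposition level.
import numpy as np
import pywt

image = np.random.rand(512, 512)          # placeholder for a grey-scale image
coeffs = pywt.wavedec2(image, "bior4.4", level=5)

# coeffs[1:] are the detail subbands, ordered from coarsest to finest level.
for level, (ch, cv, cd) in enumerate(coeffs[1:], start=1):
    details = np.concatenate([c.ravel() for c in (ch, cv, cd)])
    signs = np.sign(details)
    magnitudes = np.abs(details)
    nonzero = signs[signs != 0]
    p_positive = (nonzero > 0).mean() if nonzero.size else 0.5
    # Deviations of P(sign>0) from 0.5 are what a separate sign coder can
    # exploit; near 0.5 the sign carries essentially one uncompressible bit.
    print(f"level {level}: P(sign>0) = {p_positive:.3f}, "
          f"mean |c| = {magnitudes.mean():.4f}")
```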
Abstract: With the explosive growth of information sources available on the World Wide Web, it has become increasingly difficult to identify the relevant pieces of information, since web pages are often cluttered with irrelevant content such as advertisements, navigation panels and copyright notices surrounding the main content of the page. Hence, tools for the mining of data regions, data records and data items need to be developed in order to provide value-added services. Currently available automatic techniques to mine data regions from web pages are still unsatisfactory because of their poor performance and tag-dependence. In this paper, a novel method to extract data items from web pages automatically is proposed. It comprises two steps: (1) identification and extraction of the data regions based on visual clues, and (2) identification of data records and extraction of data items from a data region. For step 1, a novel and more effective method is proposed that finds the data regions formed by all types of tags using visual clues. For step 2, a more effective method, namely Extraction of Data Items from web Pages (EDIP), is adopted to mine data items. The EDIP technique is a list-based approach in which the list is a linear data structure. The proposed technique is able to mine non-contiguous data records and can correctly identify data regions, irrespective of the type of tag in which they are bound. Our experimental results show that the proposed technique performs better than existing techniques.
Abstract: The reliability of the tools developed to identify learning styles is essential for determining students' learning styles trustworthily. For this purpose, the psychometric features of the Grasha-Riechman Student Learning Style Inventory developed by Grasha were studied in order to contribute to this field. The study was carried out on 6th, 7th and 8th graders of 10 primary education schools in Konya. The inventory was applied twice with an interval of one month, and according to the data of this application, the reliability coefficients of the six sub-dimensions posited in the theory of the inventory were found to be medium. Moreover, it was found that the inventory does not have the six-factor structure described in the theory for either the Mathematics or the English course.
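The abstract does not state which reliability coefficient was computed; for orientation only, a commonly reported internal-consistency coefficient for such sub-dimensions is Cronbach's alpha,

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^{2}}{\sigma_t^{2}}\right),$$

where $k$ is the number of items in a sub-dimension, $\sigma_i^2$ is the variance of item $i$, and $\sigma_t^2$ is the variance of the total score.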
Abstract: There is no doubt that Internet technology is widely used by hotels and that demand for it is constantly booming. Hotels have largely adopted website information services, using different interactive tools, dimensions and attributes to achieve excellence in functionality and usability, but these do not necessarily equate with website effectiveness. One way to investigate the effectiveness of hotel websites is from the perspective of e-consumers. This exploratory research investigates the perceived importance of website effectiveness of selected independent small and medium-sized hotels (SMHs) located in Dubai, United Arab Emirates, from the perspective of Omani e-consumers, using a non-random sampling method. Of 400 questionnaires addressed to respondents in 27 organizations in Muscat, the capital city of Oman, 173 were valid. The findings of this study assist SMH management in Dubai with the reallocation of their resources and efforts in order to support e-business development and to sustain a competitive advantage.