Abstract: This paper describes a method to measure and
compensate a four-axis ultra-precision machine tool that generates
micro patterns on large surfaces. Such a grooving machine is typically
used to make micro molds for electrical parts such as light guide
plates for LCDs and fuel cells. The ultra-precision machine tool has
three linear axes and one rotary table. Shaping is usually used to
generate the micro patterns. For machining a pyramid pattern of 50 μm
pitch and 25 μm height with a 90° wedge-angle bite, one linear axis
provides the long-stroke motion needed for high cutting speed, and the
other linear axes are used for feeding. Triangular patterns are
generated by many long strokes of that one axis; a 90° rotation of the
workpiece is then needed to form pyramid patterns by superposing the
two machined triangular patterns.
For two-dimensional positioning accuracy, the positioning error of
each axis, the out-of-plane straightness of the two axes, and the
squareness between the axes are important. Positioning errors,
straightness, and squareness were measured with a laser interferometer
system, compensated, and verified according to ISO 230-6. One of the
difficult error motions to measure is the squareness, or axis
parallelism, between the rotary table and the linear axes. It was
investigated by moving the rotary table and the XY axes
simultaneously. This compensation method is introduced in this paper.
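As a quick sanity check on the geometry stated above, the relation between groove pitch and pattern height for a symmetric V-shaped tool can be sketched (our own illustrative calculation; the function name and formula are ours, not the paper's):

```python
import math

# For a symmetric V-groove cut by a tool of included (wedge) angle W,
# the groove depth h relates to the groove width (pitch) p by
#   h = (p / 2) / tan(W / 2).
# With a 90-degree wedge tool, tan(45 deg) = 1, so h = p / 2.

def groove_depth_um(pitch_um: float, wedge_angle_deg: float) -> float:
    """Depth of a symmetric V-groove of width `pitch_um` cut by a
    tool whose included angle is `wedge_angle_deg` (hypothetical helper)."""
    half_angle = math.radians(wedge_angle_deg / 2.0)
    return (pitch_um / 2.0) / math.tan(half_angle)

# The abstract's case: 50 um pitch with a 90-degree bite -> 25 um height.
print(groove_depth_um(50.0, 90.0))
```

The printed value matches the 25 μm height quoted in the abstract, up to floating-point error.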
Abstract: Since the conception of JML, many tools, applications, and implementations have been produced. Users and developers who want to adopt JML can feel surrounded by this multitude of tools and applications. Looking for a common infrastructure and a tool-independent language to bridge these tools and JML, we developed an approach for embedding contracts in XML for Java: XJML. This approach lets us separate preconditions, postconditions, and class invariants using JML and XML, so we built a front-end that can drive Runtime Assertion Checking, Extended Static Checking, and Full Static Program Verification. Moreover, the capabilities of this front-end can be extended and easily implemented thanks to XML. We believe XJML is a good starting point for building a graphical user interface, delivering a friendly, IDE-independent environment to the developer community that wants to work with JML.
Abstract: One of the main research directions in the CAD/CAM
machining area is the reduction of machining time.
Feedrate scheduling is one of the advanced techniques that keeps
the uncut chip area constant and, as a consequence, keeps the main
cutting force constant. There are two main ways to optimize the
feedrate. The first consists of monitoring the cutting force, which
requires complex equipment for the force measurement, and then
setting the feedrate according to the cutting force variation. The
second is to optimize the feedrate by keeping the material removal
rate constant with respect to the cutting conditions.
This paper proposes a new approach using an extended database
that replaces the system model.
The feedrate schedule is determined based on the identification of
the reconfigurable machine tool, with the feed value determined from
the uncut chip section area, the contact length between tool and
blank, and the geometrical roughness.
The first stage consists of monitoring the blank and the tool to
determine their actual profiles. The next stage is the determination
of the programmed tool path that yields the target profile of the
piece.
The graphic representation environment models the tool and blank
regions; the tool model is then positioned relative to the blank model
according to the programmed tool path. For each of these positions,
the geometrical roughness value, the uncut chip area, and the contact
length between tool and blank are calculated. Each of these parameters
is compared with its admissible value, and the feed value is set
according to the result.
This approach has the following advantages: the cutting force can
be predicted even for complex cutting processes; the real cutting
profile, which deviates from the theoretical profile, is taken into
account; the blank-tool contact length can be limited; and the
programmed tool path can be corrected so that the target profile is
obtained.
Applying this method yields data sets that allow the feedrate to be
scheduled so that the uncut chip area, and thus the cutting force, is
constant, which allows the machine tool to be used more efficiently
and the machining time to be reduced.
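The constant-chip-area rule above can be sketched in a few lines (a hypothetical illustration; the variable names, units, and limit values are our assumptions, not the paper's database-driven method):

```python
# Keep the uncut chip area A = f * a (feed x depth of engagement) at a
# target value by adjusting the feed at each programmed tool position.
# All names and numbers below are illustrative placeholders.

A_TARGET = 0.02   # target uncut chip area, mm^2 (assumed)
F_MAX = 0.5       # machine feed limit, mm/tooth (assumed)

def scheduled_feed(depth_of_engagement_mm: float) -> float:
    """Feed that keeps the uncut chip area constant, clamped to the limit."""
    if depth_of_engagement_mm <= 0:
        return F_MAX                      # no engagement: run at the limit
    return min(F_MAX, A_TARGET / depth_of_engagement_mm)

# Deeper engagement -> lower feed, so chip area (and cutting force) stays flat.
for a in (0.05, 0.10, 0.20, 0.40):
    print(a, round(scheduled_feed(a), 4))
```

The clamp to `F_MAX` mirrors the comparison against admissible values described in the abstract: the computed feed is only applied when it is within machine limits.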
Abstract: This work presents a neural network model for the
clustering analysis of data based on Self-Organizing Maps (SOM).
During training, the model evolves towards a hierarchical structure
according to the input requirements. The hierarchical structure acts
as a specialization tool that provides refinements of the
classification process: the structure behaves like a single map whose
resolution varies with the region being analyzed. The benefits and
performance of the algorithm are discussed in application to the Iris
dataset, a classical benchmark for pattern recognition.
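A plain (non-hierarchical) SOM training loop of the kind the model builds on can be sketched as follows (our own minimal illustration on toy 2-D data, not the paper's hierarchical algorithm; all parameter values are assumptions):

```python
import math
import random

# Minimal 1-D self-organizing map trained on toy 2-D points: find the
# best-matching unit for each sample, then pull it and its grid
# neighbours toward the sample with decaying strength.

random.seed(0)

def train_som(data, n_nodes=4, epochs=200, lr0=0.5, sigma0=2.0):
    # node weights start as random 2-D points
    w = [[random.random(), random.random()] for _ in range(n_nodes)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                   # decaying learning rate
        sigma = max(0.5, sigma0 * (1 - t / epochs))   # shrinking neighbourhood
        for x in data:
            # best-matching unit = node closest to the sample
            bmu = min(range(n_nodes),
                      key=lambda i: (w[i][0]-x[0])**2 + (w[i][1]-x[1])**2)
            for i in range(n_nodes):
                # Gaussian neighbourhood on the 1-D grid-index distance
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                w[i][0] += lr * h * (x[0] - w[i][0])
                w[i][1] += lr * h * (x[1] - w[i][1])
    return w

# two well-separated toy clusters; nodes should spread across them
data = [(0.1, 0.1), (0.15, 0.05), (0.9, 0.9), (0.85, 0.95)]
weights = train_som(data)
print(weights)
```

The hierarchical variant described in the abstract would, roughly, spawn a child map of this kind for any region whose quantization error stays too high.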
Abstract: The recent drive toward performance-based methodologies for the design and assessment of structures in seismic areas has significantly increased the demand for reliable nonlinear inelastic static pushover analysis tools. As a result, adaptive pushover methods have been developed during the last decade; unlike their conventional counterparts, they can account for the effect that higher modes of vibration and progressive stiffness degradation have on the distribution of seismic storey forces. Even in these advanced pushover methods, little attention has been paid to unsymmetric structures. This study evaluates the seismic demands of three-dimensional unsymmetric-plan buildings determined by Displacement-based Adaptive Pushover (DAP) analysis, introduced by Antoniou and Pinho [2004]. The capability of the DAP procedure to capture the torsional effects due to structural irregularities is investigated by comparing its estimates with the exact results obtained from Incremental Dynamic Analysis (IDA). The capability of the procedure to predict the seismic behaviour of the structure is also discussed.
Abstract: Virtually all existing networked system management
tools use a manager/agent paradigm: distributed agents are deployed
on managed devices to collect local information and report it back to
some management unit. Even tools that use standard protocols such as
SNMP fall into this model. Using a standard protocol has the
advantage of interoperability among devices from different vendors;
however, it may not be able to provide the customized information
needed to satisfy specific management needs.
In this dissertation work, different approaches are used to collect
information about the devices attached to a Local Area Network. An
SNMP-aware application is being developed that will manage the
discovery procedure and serve as a data collector.
Abstract: The wide use of new information technologies in the
educational process motivates the study of students' information
culture and of the interdisciplinary connections among its content,
aims, and objectives. In this regard, the article addresses questions
about students' information culture and presents the aims and
objectives of building an information culture among students. In
forming a professional interest in relevant information, the effective
use of interactive methods and innovative technologies in the learning
process can assist students in understanding the essence of their
professional activities. The results of the experiment demonstrate
the effectiveness of developing students' information culture within
a higher education system based on credit technology. The main
purpose of this paper is a comprehensive review of students'
information culture.
Abstract: The seismic vulnerability of an urban area is of great
concern to local authorities, especially those facing earthquakes, so
it is important to have an efficient tool for assessing the
vulnerability of existing buildings. Using the VIP (Vulnerability
Index Program) and a GIS (Geographic Information System) lets us
identify the most vulnerable districts of an urban area.
The vulnerability index method is used to assess the vulnerability
of the town center of Blida (Algeria), a historical town that has
grown enormously during the last decades. In this method, three
levels of vulnerability are defined. The GIS was used to build a
database in order to perform different thematic analyses, which show
the seismic vulnerability of Blida.
Abstract: This paper proposes a method for modeling the laws controlling manufacturing systems with temporal and non-temporal constraints, and elaborates a methodology for constructing robust control that generates margins of passive and active robustness. Two principal models are presented. The first uses P-time Petri nets to manage flow-type disturbances. The second, the quality model, exploits the Intervals Constrained Petri Nets (ICPN) tool, which allows the system to preserve its quality specifications. The redundancy of robustness of the elementary parameters between the passive and active forms is also used. The final model correlates temporal and non-temporal criteria by putting the two models in interaction. To this end, a set of definitions and theorems is stated and illustrated by application examples.
Abstract: Natural Language Understanding (NLU) systems will not be widely deployed unless they are technically mature and cost-effective to develop. Cost-effective development hinges on the availability of tools and techniques enabling the rapid production of NLU applications with minimal human resources. Further, these tools and techniques should allow quick, user-friendly development of applications and should be easy to upgrade in order to continuously follow evolving technologies and standards. This paper presents a visual tool for structuring and editing dialog forms, the key element driving conversation in NLU applications based on IBM technology. The main focus is on the basic component used to describe human-machine interactions of this kind, the Dialogue Manager. In essence, we describe a tool that enables the visual representation of the Dialogue Manager, mainly during the implementation phase.
Abstract: The wavelet transform is a very powerful tool for image compression. One of its advantages is that it localizes image energy in both space and frequency. However, wavelet transform coefficients are defined by both a magnitude and a sign. While algorithms exist for efficiently coding the magnitude of the transform coefficients, they are not efficient for coding the sign, and it is generally assumed that no compression gain can be obtained from sign coding. Only recently have some authors begun to investigate the sign of wavelet coefficients in image coding. Some authors have assumed that the sign bit of a wavelet coefficient may be encoded with an estimated probability of 0.5; the same assumption is made for the refinement bit. In this paper, we propose a new method for Separate Sign Coding (SSC) of wavelet image coefficients. The sign and the magnitude of the wavelet coefficients are examined to obtain their online probabilities. We use scalar quantization, in which the information of whether a wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also examined. We show that the sign information and the refinement information may be encoded with a probability of approximately 0.5 only after about five bit planes. Two maps are entropy encoded separately: the sign map and the magnitude map. The refinement information indicating whether a wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also entropy encoded. An algorithm is developed, and simulations are performed on three standard grey-scale images: Lena, Barbara, and Cameraman. Five scales are computed using the biorthogonal 9/7 wavelet filter bank. The results are compared to the JPEG2000 standard in terms of peak signal-to-noise ratio (PSNR) and subjective (visual) quality for the three images.
It is shown that the proposed method outperforms JPEG2000. The proposed method is also compared to other codecs in the literature and shows good performance in terms of PSNR.
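The sign statistics that separate sign coding exploits can be illustrated with a toy computation (our own sketch, not the paper's SSC codec; the one-level Haar transform and the i.i.d. test signal are our assumptions):

```python
import math
import random

# Take one level of a Haar wavelet transform of a random signal and
# estimate the empirical probability that a detail coefficient is
# positive -- the quantity whose deviation from 0.5 sign coding
# tries to exploit.  For i.i.d. noise the signs are near 0.5; natural
# images can deviate, which is where a coding gain would come from.

random.seed(1)
signal = [random.gauss(0.0, 1.0) for _ in range(1024)]

# one level of the Haar transform: sums (approximation) and
# differences (detail), each scaled by 1/sqrt(2)
s2 = 2 ** 0.5
approx = [(signal[2*i] + signal[2*i+1]) / s2 for i in range(len(signal)//2)]
detail = [(signal[2*i] - signal[2*i+1]) / s2 for i in range(len(signal)//2)]

# empirical sign statistics of the detail band
positives = sum(1 for c in detail if c > 0)
p_positive = positives / len(detail)
print(round(p_positive, 3))

# entropy of the sign map under this probability (bits per coefficient):
# strictly below 1 bit whenever p deviates from 0.5
h = -(p_positive * math.log2(p_positive)
      + (1 - p_positive) * math.log2(1 - p_positive))
print(round(h, 4))
```

Entropy-coding the sign map with its measured probability, rather than assuming 0.5, is what turns any such deviation into a rate saving.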
Abstract: The main objective of this paper is to present a tool
that we have developed to characterize and model indoor radio channel
propagation at millimeter waves. The tool is based on the ray tracing
technique (RTT). In a realistic environment, we cannot neglect the
significant impact of human body shadowing and of other objects in
motion on the indoor 60 GHz propagation channel; hence, our model
allows the simulation of propagation in a dynamic indoor environment.
First, we describe a model of the human body. Second, RTT combined
with this model is used to simulate the propagation of millimeter
waves in the presence of persons in motion. Simulation results show
that the tool produces results in agreement with those reported in
the literature, especially regarding the effects of people's motion
on the temporal properties of the channel.
Abstract: Using activity theory, organisational theory and
didactics as theoretical foundations, a comprehensive model of the
organisational dimensions relevant for learning and knowledge
transfer will be developed. In a second step, a Learning Assessment
Guideline will be elaborated. This guideline will be designed to
permit a targeted analysis of organisations to identify the status quo
in those areas crucial to the implementation of learning and
knowledge transfer. In addition, this self-analysis tool will enable
learning managers to select adequate didactic models for e- and
blended learning. As part of the European Integrated Project
"Process-oriented Learning and Information Exchange" (PROLIX),
this model of organisational prerequisites for learning and knowledge
transfer will be empirically tested in four profit and non-profit
organisations in Great Britain, Germany and France (to be finalized
in autumn 2006). The findings concern not only the capability of the
model of organisational dimensions, but also the predominant
perceptions of and obstacles to learning in organisations.
Abstract: The objectives of this research were to study the
factors that contribute to the success of electronic commerce
(e-commerce) and to study approaches for raising the e-commerce
standard of small and medium enterprises (SMEs). The study focused
only on sole-proprietorship SMEs in Bangkok, Thailand. The factors
contributing to SME success included business management,
organizational learning, business collaboration, and website quality.
A mixed quantitative and qualitative research methodology was used.
For the quantitative part, a questionnaire was used to collect data
from 251 sole proprietorships, and a Structural Equation Model (SEM)
was used as the data-analysis tool. For the qualitative part,
in-depth interviews, a dialogue with experts in the field of
e-commerce for SMEs, and content analysis were used.
Using the adjusted causal relationship structure model, the factors
affecting e-commerce success for SMEs were found to be congruent with
the empirical data. Hypothesis testing indicated that business
management influenced organizational learning; organizational
learning influenced business collaboration and website quality; and
these factors, in turn, influenced the success of the SMEs. Moreover,
regarding approaches to raising the standard of SMEs, the majority of
respondents wanted improvements, in order of priority, in the safety
of the e-commerce system, the basic e-commerce infrastructure, staff
development, budget assistance and tax reduction, and e-commerce law.
Abstract: With the explosive growth of information sources on the World Wide Web, it has become increasingly difficult to identify relevant pieces of information, since web pages are often cluttered with irrelevant content such as advertisements, navigation panels, and copyright notices surrounding the main content. Hence, tools for mining data regions, data records, and data items need to be developed in order to provide value-added services. Currently available automatic techniques for mining data regions from web pages are still unsatisfactory because of their poor performance and tag dependence. In this paper, a novel method to extract data items from web pages automatically is proposed. It comprises two steps: (1) identification and extraction of data regions based on visual-clue information; (2) identification of data records and extraction of data items from a data region. For step 1, a novel and more effective method is proposed that finds the data regions formed by all types of tags using visual clues. For step 2, a more effective method, Extraction of Data Items from web Pages (EDIP), is adopted to mine data items. EDIP is a list-based approach in which the list is a linear data structure. The proposed technique is able to mine non-contiguous data records and can correctly identify data regions irrespective of the type of tag in which they are bound. Our experimental results show that the proposed technique performs better than existing techniques.
Abstract: The reliability of the tools developed to determine
learning styles is essential for identifying students' learning
styles trustworthily. For this purpose, the psychometric features of
the Grasha-Riechmann Student Learning Style Inventory developed by
Grasha were studied as a contribution to this field. The study was
carried out on 6th, 7th, and 8th graders of 10 primary schools in
Konya. The inventory was administered twice with an interval of one
month, and according to the data from this application, the
reliability coefficients of the 6 sub-dimensions posited in the
theory of the inventory were found to be medium. Moreover, it was
found that the inventory does not exhibit the 6-factor structure
represented in the theory for either the Mathematics or the English
course.
Abstract: End milling is one of the common metal cutting
operations used for machining parts in the manufacturing industry. It
is usually performed at the final stage of manufacturing a product,
so the surface roughness of the produced job plays an important role.
In general, surface roughness affects the wear resistance, ductility,
and tensile and fatigue strength of machined parts and cannot be
neglected in design. In the present work, an experimental
investigation of end milling of an aluminium alloy with a carbide
tool is carried out, and the effect of the different cutting
parameters on the response is studied with three-dimensional surface
plots. An artificial neural network (ANN) is used to establish the
relationship between the surface roughness and the input cutting
parameters (spindle speed, feed, and depth of cut). The MATLAB ANN
toolbox, which implements a feed-forward back-propagation algorithm,
is used for the modeling. A 3-12-1 network structure, having the
minimum average prediction error, was found to be the best
architecture for predicting the surface roughness value; the network
predicts surface roughness well for unseen data. For a desired
surface finish of the component, many different combinations of
cutting parameters are available; the optimum cutting parameters for
obtaining the desired surface finish while maximizing tool life are
predicted. The methodology is demonstrated, a number of problems are
solved, and the algorithm is coded in MATLAB®.
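The 3-12-1 feed-forward structure mentioned above can be sketched as a plain forward pass (our own illustration in place of the MATLAB toolbox; the random weights and normalized inputs are placeholders, not trained values):

```python
import math
import random

# 3 inputs (spindle speed, feed, depth of cut), 12 tanh hidden units,
# 1 linear output for surface roughness.  In practice the weights come
# from back-propagation training on experimental data; here they are
# random placeholders so the structure itself can be seen.

random.seed(42)

N_IN, N_HID = 3, 12
w1 = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(N_HID)]
b1 = [random.uniform(-1, 1) for _ in range(N_HID)]
w2 = [random.uniform(-1, 1) for _ in range(N_HID)]
b2 = random.uniform(-1, 1)

def predict_roughness(x):
    """Forward pass of the 3-12-1 network on one normalized input vector."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2

# normalized (speed, feed, depth) sample -- purely illustrative values
print(predict_roughness([0.5, 0.3, 0.7]))
```

Training would adjust `w1`, `b1`, `w2`, `b2` to minimize the prediction error over the measured roughness values, which is what the MATLAB toolbox automates.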
Abstract: There is no doubt that Internet technology is widely used by hotels and that demand for it is constantly booming. Hotels have largely adopted website information services, using different interactive tools, dimensions, and attributes to achieve excellence in functionality and usability, but these do not necessarily equate with website effectiveness. One way to investigate the effectiveness of hotel websites is from the perspective of e-consumers. This exploratory research investigates the perceived importance of website effectiveness for selected independent small and medium-sized hotels (SMHs) located in Dubai, United Arab Emirates, from the perspective of Omani e-consumers, using a non-random sampling method. Of 400 questionnaires addressed to respondents in 27 organizations in Muscat, the capital city of Oman, 173 were valid. The findings of this study assist SMH management in Dubai in reallocating their resources and efforts in order to support e-business development and sustain a competitive advantage.
Abstract: This paper presents an optimization of the hull
separation, i.e. the transverse clearance, of a catamaran. The main
objective is to identify the feasible speed ranges and find the
optimum transverse clearance for minimum wave-making resistance. The
dimensions and weight of the hardware systems installed in the
catamaran-structured, fuel-cell-powered USV (Unmanned Surface
Vehicle) were considered as constraints. The CAE (Computer Aided
Engineering) platform FRIENDSHIP-Framework was used: hull surface
modeling, DoE (Design of Experiments), Tangent search optimization,
tool integration, and process automation were all performed with
FRIENDSHIP-Framework. The hydrodynamic result was evaluated by XPAN,
the potential-flow solver of SHIPFLOW.
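The single-variable clearance optimization can be sketched with a generic 1-D search (our own illustration using golden-section search on a placeholder resistance curve; FRIENDSHIP-Framework's actual Tangent search method and the XPAN solver are not reproduced here):

```python
# Locate the transverse clearance that minimizes a hypothetical
# wave-making resistance curve R(clearance).  In the real workflow each
# evaluation of R would be a potential-flow solver run; here it is a
# simple placeholder function with a single minimum (assumed shape).

GOLDEN = (5 ** 0.5 - 1) / 2  # golden ratio conjugate, ~0.618

def resistance(clearance_m: float) -> float:
    """Placeholder resistance model, minimum at 2.4 m (assumed)."""
    return (clearance_m - 2.4) ** 2 + 1.0

def golden_section_min(f, lo, hi, tol=1e-6):
    """Shrink [lo, hi] around the minimizer of a unimodal function f."""
    a, b = lo, hi
    while b - a > tol:
        c = b - GOLDEN * (b - a)
        d = a + GOLDEN * (b - a)
        if f(c) < f(d):
            b = d   # minimum lies in [a, d]
        else:
            a = c   # minimum lies in [c, b]
    return (a + b) / 2

best = golden_section_min(resistance, 1.0, 4.0)
print(round(best, 3))
```

With an expensive solver in place of the placeholder, a derivative-free bracketing search of this kind keeps the number of hydrodynamic evaluations small.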