Abstract: Management systems are powerful tools for businesses to manage quality, environmental and occupational health and safety requirements. Where once these systems were considered stand-alone control mechanisms, industry is now opting to increase the efficiency of these documented systems through a more integrated approach. System integration offers a significant step forward where there are similarities between system components, reducing duplication and administration costs and increasing efficiency. The first part of this paper reviews integrated management system structure and its benefits. The second part focuses on an example implementation of such a system at Imam Khomeini Hospital, and the final part discusses the outcomes of that process.
Abstract: The internet has become an attractive avenue for
global e-business, e-learning, knowledge sharing, etc. Due to the continuous increase in the volume of web content, it is not practically possible for a user to extract information by browsing and integrating data from the huge number of web sources retrieved by existing search engines. Semantic web technology enables advancements in information extraction by providing a suite of tools to integrate data from different sources. To take full advantage of the semantic web, it is necessary to annotate existing web pages into semantic web pages. This research develops a tool, named OWIE (Ontology-based Web Information Extraction), for semantic web annotation using domain-specific ontologies. The tool automatically extracts information from HTML pages with the help of predefined ontologies and gives it a semantic representation. Two case studies have been conducted to analyze the accuracy of OWIE.
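The abstract does not detail OWIE's internal algorithm; purely as an illustrative sketch of ontology-driven annotation of an HTML page (Python, using rdflib and BeautifulSoup; the ontology file, namespace and simple label-matching strategy are assumptions, not OWIE's actual design):

```python
# Minimal sketch of ontology-driven annotation of an HTML page.
# Hypothetical ontology file and namespace; simple label matching only.
from rdflib import Graph, Literal, Namespace, RDF, RDFS
from bs4 import BeautifulSoup

EX = Namespace("http://example.org/domain#")   # assumed domain namespace

def annotate(html_text: str, ontology_path: str) -> Graph:
    onto = Graph()
    onto.parse(ontology_path, format="xml")    # assumes an RDF/XML ontology

    # Collect class labels defined in the domain ontology.
    labels = {str(lbl).lower(): cls
              for cls, lbl in onto.subject_objects(RDFS.label)}

    text = BeautifulSoup(html_text, "html.parser").get_text(" ")

    # Emit RDF triples for every ontology label found in the page text.
    out = Graph()
    for label, cls in labels.items():
        if label in text.lower():
            node = EX[label.replace(" ", "_")]
            out.add((node, RDF.type, cls))
            out.add((node, RDFS.label, Literal(label)))
    return out
```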
Abstract: Lean, which was initially developed by Toyota, is
widely implemented in other companies to improve competitiveness.
This research is an attempt to identify the adoption of lean in the production system of the Malaysian car manufacturer Proton using a case study approach. To gain in-depth information regarding lean implementation, an activity on the assembly line called Set Parts Supply (SPS) was studied. The result indicates that using lean principles, tools and techniques in the implementation of SPS enabled the goals on safety, quality, cost, delivery and morale to be achieved. The implementation increased the size of the workspace, improved the quality of assembly and the delivery of parts supply, reduced the manpower required, achieved cost savings on electricity and also increased workers' motivation with respect to attendance at work. A framework for SPS implementation is suggested as a contribution to lean practices in production systems.
Abstract: Modular fixtures (MFs) are very important tools in manufacturing processes in terms of reducing cost and production time. This paper introduces an automated approach for assembling MF elements by employing SolidWorks, a powerful 3D CAD package. The Visual Basic (VB) programming language was applied in integration with SolidWorks API (Application Programming Interface) functions. This integration allowed a plug-in file to be created and new menus to be generated in the SolidWorks environment. The menus allow the user to select, insert, and assemble MF elements.
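The plug-in described is written in VB against the SolidWorks API; as a rough, hedged illustration of the same automation idea, a script can drive SolidWorks over COM on Windows. The sketch below only connects to a running SolidWorks instance; the menu creation and modular-fixture insertion calls are deliberately omitted because their exact API signatures are not given in the abstract.

```python
# Rough illustration only: driving SolidWorks over COM from a script.
# Requires Windows, an installed SolidWorks, and the pywin32 package.
import win32com.client

def connect_to_solidworks():
    # "SldWorks.Application" is the standard COM ProgID for SolidWorks.
    sw = win32com.client.Dispatch("SldWorks.Application")
    sw.Visible = True
    return sw

if __name__ == "__main__":
    sw = connect_to_solidworks()
    doc = sw.ActiveDoc          # currently open assembly document, if any
    print("Connected to SolidWorks; active document present:", doc is not None)
    # A plug-in like the one described would add menus via the SolidWorks
    # API and insert/assemble modular-fixture components from a library;
    # those calls are not reproduced here.
```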
Abstract: CIM is the standard formalism for modeling management
information developed by the Distributed Management Task
Force (DMTF) in the context of its WBEM proposal, designed to
provide a conceptual view of the managed environment. In this
paper, we propose the inclusion of formal knowledge representation
techniques, based on Description Logics (DLs) and the Web Ontology
Language (OWL), in CIM-based conceptual modeling, and then we
examine the benefits of such a decision. The proposal is specified as a
CIM metamodel level mapping to a highly expressive subset of DLs
capable of capturing all the semantics of the models. The paper shows
how the proposed mapping can be used for automatic reasoning
about the management information models, as a design aid, by means
of new-generation CASE tools, thanks to the use of state-of-the-art
automatic reasoning systems that support the proposed logic and use
algorithms that are sound and complete with respect to the semantics.
Such a CASE tool framework has been developed by the authors and
its architecture is also introduced. The proposed formalization is not
only useful at design time, but also at run time through the use of
rational autonomous agents, in response to a need recently recognized
by the DMTF.
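The mapping rules themselves are not reproduced in the abstract; as a small, hand-waved illustration of the flavor of a CIM-to-OWL translation, a CIM-style class and a subclass relation can be expressed as OWL axioms (Python with rdflib; the namespace is a placeholder, and only two well-known CIM class names are used):

```python
# Illustrative only: express a CIM-style class hierarchy as OWL axioms.
from rdflib import Graph, Namespace, RDF, RDFS, Literal
from rdflib.namespace import OWL

CIM = Namespace("http://example.org/cim#")    # placeholder namespace

g = Graph()
g.bind("owl", OWL)
g.bind("cim", CIM)

# CIM_ManagedElement and CIM_LogicalDevice are real CIM class names;
# modelling them simply as owl:Class with rdfs:subClassOf is only the
# flavor of the metamodel-level mapping the paper formalizes in detail.
g.add((CIM.CIM_ManagedElement, RDF.type, OWL.Class))
g.add((CIM.CIM_LogicalDevice, RDF.type, OWL.Class))
g.add((CIM.CIM_LogicalDevice, RDFS.subClassOf, CIM.CIM_ManagedElement))
g.add((CIM.CIM_LogicalDevice, RDFS.label, Literal("CIM_LogicalDevice")))

print(g.serialize(format="turtle"))
```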
Abstract: Sharing a manufacturing facility through remote operation and monitoring of a machining process is a challenge for effective use of the production facility. Several automation tools, in terms of hardware and software, are necessary for successful remote operation of a machine. This paper presents a prototype workpiece-holding attachment for remote operation of a milling process through self-configuration of the workpiece setup. The prototype is designed with a mechanism to reorient the work surface toward the machining spindle direction with high positioning accuracy. A variety of part geometries can be held by the attachment to perform single-setup machining. Pins arranged in an array pattern additionally clamp the workpiece surface from two opposite directions to increase machining rigidity. The optimum pin configuration for conforming to the workpiece geometry with minimum deformation is determined through hybrid algorithms combining Genetic Algorithms (GA) and Particle Swarm Optimization (PSO). The prototype with this intelligent optimization technique can hold several varieties of workpiece geometry, which makes it suitable for machining low-volume repetitive production in remote operation.
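The abstract names the hybrid GA/PSO optimization but not its encoding or objective function; the following sketch shows only how such a hybrid loop is commonly structured, over a binary pin-activation vector with an invented deformation surrogate as the objective (Python/NumPy; every detail here is an assumption for illustration):

```python
# Illustrative hybrid GA + PSO over a binary pin-activation vector.
# The objective below is a made-up surrogate, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)
N_PINS, POP, ITERS = 16, 30, 100

def deformation(x):
    # Dummy objective: penalize too few active pins and uneven spacing.
    active = x.sum()
    spread = np.var(np.flatnonzero(x)) if active > 1 else 0.0
    return (N_PINS - active) + 0.01 * spread

pop = rng.integers(0, 2, (POP, N_PINS)).astype(float)
vel = rng.normal(0, 0.1, (POP, N_PINS))
best = pop.copy()
gbest = pop[np.argmin([deformation(p.round()) for p in pop])].copy()

for _ in range(ITERS):
    # GA step: one-point crossover and bit-flip mutation.
    idx = rng.permutation(POP)
    for a, b in zip(idx[::2], idx[1::2]):
        cut = rng.integers(1, N_PINS)
        pop[a, cut:], pop[b, cut:] = pop[b, cut:].copy(), pop[a, cut:].copy()
    mut = rng.random(pop.shape) < 0.02
    pop[mut] = 1 - pop[mut]

    # PSO step: velocity update toward personal and global bests.
    r1, r2 = rng.random(pop.shape), rng.random(pop.shape)
    vel = 0.7 * vel + 1.5 * r1 * (best - pop) + 1.5 * r2 * (gbest - pop)
    pop = np.clip(pop + vel, 0, 1)

    # Update personal and global bests on the binarized positions.
    for i in range(POP):
        if deformation(pop[i].round()) < deformation(best[i].round()):
            best[i] = pop[i]
    gbest = best[np.argmin([deformation(b.round()) for b in best])].copy()

print("best pin layout:", gbest.round().astype(int))
```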
Abstract: Statistical learning theory, developed by Vapnik, is a learning theory based on the Vapnik-Chervonenkis dimension, and it has been used in learning models as a good analytical tool. In general, learning theories suffer from several problems, among them local optima and over-fitting. Statistical learning theory has the same problems because the kernel type, kernel parameters, and regularization constant C are determined subjectively by the researcher's art. We therefore propose an evolutionary statistical learning theory to settle the problems of the original statistical learning theory. Our approach is constructed by combining evolutionary computing with statistical learning theory. We verify the improved performance of evolutionary statistical learning theory using data sets from the KDD Cup.
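The evolutionary operators used are not specified in the abstract; a minimal sketch of the general idea, searching over kernel type, gamma and C with a simple evolutionary loop and cross-validated accuracy as the fitness, might look as follows (Python with scikit-learn; the dataset and ranges are placeholders, not the KDD Cup setup):

```python
# Minimal sketch: evolutionary search over SVM kernel, gamma and C.
# Placeholder dataset and ranges; not the paper's KDD Cup experiments.
import random
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

def random_individual():
    return {"kernel": random.choice(["rbf", "poly", "sigmoid"]),
            "C": 10 ** random.uniform(-2, 3),
            "gamma": 10 ** random.uniform(-4, 1)}

def fitness(ind):
    clf = SVC(kernel=ind["kernel"], C=ind["C"], gamma=ind["gamma"])
    return cross_val_score(clf, X, y, cv=3).mean()

def mutate(ind):
    child = dict(ind)
    key = random.choice(["kernel", "C", "gamma"])
    child[key] = random_individual()[key]
    return child

random.seed(0)
population = [random_individual() for _ in range(12)]
for generation in range(10):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[:4]                       # truncation selection
    population = parents + [mutate(random.choice(parents))
                            for _ in range(8)]

best = max(population, key=fitness)
print("best hyperparameters:", best, "accuracy:", round(fitness(best), 3))
```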
Abstract: Conventionally, the selection of parameters depends heavily on the operator's experience or on conservative technological data provided by the EDM equipment manufacturers, which yields inconsistent machining performance. The parameter settings given by the manufacturers are relevant only to common steel grades. A single parameter change influences the process in a complex way. Hence, the present research proposes artificial neural network (ANN) models for the prediction of surface roughness of Ti-15-3 alloy in the electrical discharge machining (EDM) process. The proposed models use peak current, pulse-on time, pulse-off time and servo voltage as input parameters. Multilayer perceptron (MLP) feedforward networks with three hidden layers are applied. An assessment is carried out among models with distinct hidden layer configurations. Training of the models is performed with data from an extensive series of experiments utilizing a copper electrode as the positive polarity. The predictions based on the developed models have been verified with another set of experiments and are found to be in good agreement with the experimental results. Besides this, they can be used as valuable tools for EDM process planning.
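The exact architecture and experimental data are not given in the abstract; a minimal, hedged sketch of such a model, a three-hidden-layer MLP regressor over the four named inputs trained on synthetic placeholder data, could look like this (Python with scikit-learn; layer sizes and data are illustrative only):

```python
# Hedged sketch: three-hidden-layer MLP predicting surface roughness
# from peak current, pulse-on time, pulse-off time and servo voltage.
# Synthetic placeholder data; layer sizes are illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# Columns: peak current [A], pulse-on [us], pulse-off [us], servo voltage [V]
X = rng.uniform([1, 10, 10, 30], [20, 400, 400, 90], size=(200, 4))
# Placeholder roughness response, loosely increasing with discharge energy.
y = 0.5 * X[:, 0] ** 0.6 * (X[:, 1] / 100) ** 0.4 + rng.normal(0, 0.2, 200)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 12, 8),  # three hidden layers
                 activation="relu", max_iter=5000, random_state=0),
)
model.fit(X, y)

sample = [[12.0, 200.0, 50.0, 60.0]]          # one hypothetical setting
print("predicted Ra (placeholder units):", model.predict(sample)[0])
```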
Abstract: The inherent iterative nature of product design and development poses a significant challenge to reducing product design and development (PD) time. In order to shorten the time to market, organizations have adopted concurrent development, where multiple specialized tasks and design activities are carried out in parallel. The iterative nature of the work, coupled with the overlap of activities, can result in unpredictable time to completion and significant rework. Many products have missed the time-to-market window due to unanticipated, or rather unplanned, iteration and rework. The iterative and often overlapped processes introduce greater amounts of ambiguity in design and development, where the traditional methods and tools of project management provide less value. In this context, identifying critical metrics to understand iteration probability is an open research area where a significant contribution can be made, given that iteration has been the key driver of cost and schedule risk in PD projects. Two important questions that the proposed study attempts to address are: Can we predict and identify the number of iterations in a product development flow? Can we provide managerial insights for better control over iteration? The proposal introduces the concept of decision points and, using this concept, intends to develop metrics that can provide managerial insights into iteration predictability. By characterizing the product development flow as a network of decision points, the proposed research intends to delve further into iteration probability and attempts to provide more clarity.
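The proposal only outlines the decision-point concept; one deliberately simple, assumption-laden way to illustrate the kind of metric it aims at is a Monte Carlo walk over a chain of decision points, where each point sends the flow back for rework with some probability and the quantity of interest is the expected number of iterations (Python; the probabilities are invented):

```python
# Illustrative Monte Carlo over a chain of decision points.
# Rework probabilities are invented; the metric is expected iterations.
import random

REWORK_PROB = [0.3, 0.2, 0.1]   # hypothetical probability of looping back
                                # at each of three decision points

def simulate_one_pass():
    iterations = 0
    stage = 0
    while stage < len(REWORK_PROB):
        if random.random() < REWORK_PROB[stage]:
            iterations += 1     # decision point rejects: rework this stage
        else:
            stage += 1          # decision point accepts: move forward
    return iterations

random.seed(42)
runs = [simulate_one_pass() for _ in range(100_000)]
print("expected iterations:", sum(runs) / len(runs))
# Analytically, each stage contributes p/(1-p) expected rework loops,
# e.g. 0.3/0.7 + 0.2/0.8 + 0.1/0.9 ≈ 0.79 for the probabilities above.
```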
Abstract: With the advent of social web initiatives, some argued
that these new emerging tools might be useful in tacit knowledge
sharing through providing interactive and collaborative technologies.
However, there is still a paucity of literature on how and what the contributions of social media might be in facilitating tacit knowledge sharing. Therefore, this paper is intended to theoretically investigate and map social media concepts and characteristics against tacit knowledge creation and sharing requirements. Through a systematic literature review, five major requirements were found that need to be present in an environment that involves tacit knowledge sharing. These requirements have been analyzed against social media concepts and characteristics to see how they map together. The results showed that social media have the ability to comply with some of the main requirements of tacit knowledge sharing. The relationships have been illustrated in a conceptual framework, and further empirical studies are suggested to corroborate the findings of this study.
Abstract: This research proposes an algorithm for the simulation of time-periodic unsteady problems via the solution of the unsteady Euler and Navier-Stokes equations. This algorithm, called the Time Spectral method, uses a Fourier representation in time and hence solves for the periodic state directly without resolving transients (which consume most of the resources in a time-accurate scheme). The mathematical tools used here are discrete Fourier transformations. The method has shown tremendous potential for reducing the computational cost compared to conventional time-accurate methods, by enforcing periodicity and using a Fourier representation in time, leading to spectral accuracy. The accuracy and efficiency of this technique are verified by Euler and Navier-Stokes calculations for pitching airfoils. Because of the turbulent nature of the flow, the Baldwin-Lomax turbulence model has been used in the viscous flow analysis. The results obtained with the Time Spectral method are compared with experimental data and verify that only a small number of time intervals per pitching cycle is required to capture the flow physics.
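The core ingredient, the spectral time derivative over one period, can be illustrated in a few lines: the sketch below differentiates a sampled periodic signal via the discrete Fourier transform, which is the basic operation the Time Spectral method applies to the flow state at each collocation point (Python/NumPy; the test signal is arbitrary):

```python
# Spectral time derivative of a periodic signal via the DFT.
# This is the basic operation the Time Spectral method applies to the
# flow state at each time instance over one pitching period.
import numpy as np

N = 9                                  # time instances per period (odd)
T = 2.0                                # period
t = np.arange(N) * T / N               # collocation points in time
u = np.sin(2 * np.pi * t / T) + 0.3 * np.cos(4 * np.pi * t / T)

k = np.fft.fftfreq(N, d=T / N)         # frequencies in cycles per unit time
du_dt = np.real(np.fft.ifft(2j * np.pi * k * np.fft.fft(u)))

exact = (2 * np.pi / T) * np.cos(2 * np.pi * t / T) \
        - 0.3 * (4 * np.pi / T) * np.sin(4 * np.pi * t / T)
print("max error:", np.max(np.abs(du_dt - exact)))   # spectrally small
```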
Abstract: Statistical process control (SPC) is one of the most powerful tools developed to assist in the effective control of quality; it involves collecting, organizing and interpreting data during production. This article aims to show how, by using SPC, industries can control and continuously improve product quality through monitoring of production, detecting deviations in the parameters that represent the process, and thereby reducing the amount of off-specification products and the costs of production. This study also aimed to conduct a technological forecast in order to characterize the research being done related to SPC. The survey was conducted in the Espacenet and WIPO databases and that of the National Institute of Industrial Property (INPI). The largest depositors are the United States and deposits via the PCT, and the classification section presented in greatest abundance was F.
Abstract: More recent satellite projects/programs make extensive use of real-time embedded systems. 16-bit processors which meet the MIL-STD-1750 standard architecture have been used in on-board systems. Most space applications have been written in Ada. From a futuristic point of view, 32-bit/64-bit processors are needed in the area of spacecraft computing, and therefore an effort is desirable in the study and survey of 64-bit architectures for space applications. This will also result in significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada).
There are several basic requirements for a special processor for
this purpose. They include Radiation Hardened (RadHard) devices,
very low power dissipation, compatibility with existing operational
systems, scalable architectures for higher computational needs,
reliability, higher memory and I/O bandwidth, predictability, a real-time operating system and manufacturability of such processors.
Further on, these may include selection of FPGA devices, selection
of EDA tool chains, design flow, partitioning of the design, pin
count, performance evaluation, timing analysis etc.
This project deals with a brief study of 32- and 64-bit processors readily available on the market and with designing/fabricating a 64-bit RISC processor named RISC MicroProcessor, with the added functionalities of an extended double-precision floating-point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using Open Core designs (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools such as Xilinx ISE for synthesis are also used when appropriate.
Abstract: In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), the capabilities of programming languages such as symbolic and intuitive programming, program portability and geometrical portfolio are of special importance. They allow time to be saved and errors to be avoided during part programming, and they permit code re-use. Our updated literature review indicates that the current state of the art presents voids in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit re-use of the programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
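No EGCL listing is included in the abstract; to make the idea concrete, the following toy translation expands a parametric, named-variable drilling loop into elementary ISO G-code, in the spirit of what such a compiler produces (Python; the high-level interface shown is invented for illustration and is not the actual EGCL grammar):

```python
# Toy illustration: expand a parametric drilling loop with named variables
# into elementary ISO G-code.  The high-level "source" below is an
# invented EGCL-like description, not the actual EGCL grammar.

def drill_row(x_start: float, spacing: float, count: int,
              depth: float, safe_z: float = 5.0) -> list[str]:
    """Emit one G-code block per hole of an evenly spaced row."""
    lines = ["G90", "G21"]                     # absolute mode, millimetres
    x = x_start
    hole = 0
    while hole < count:                        # flow control in the source
        lines.append(f"G00 X{x:.3f} Y0.000 Z{safe_z:.3f}")  # rapid above hole
        lines.append(f"G01 Z{-depth:.3f} F100")             # feed down
        lines.append(f"G00 Z{safe_z:.3f}")                  # retract
        x += spacing
        hole += 1
    lines.append("M30")                        # end of program
    return lines

print("\n".join(drill_row(x_start=10.0, spacing=15.0, count=4, depth=3.0)))
```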
Abstract: Effective evaluation of software development effort is an important aspect of successful project management. Based on a large database of 4106 developed projects, this study statistically examines the factors that influence development effort. The factors found to be significant for effort are project size, the average number of developers that worked on the project, type of development, development language, development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Unsurprisingly, this study found that the use of CASE tools does not necessarily reduce development effort, which adds support to the claim that the effect of tool use is subtle. As many of the current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for the prediction of effort which is both simple and more accurate than previous models.
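The fitted model and its coefficients are not reproduced in the abstract; the sketch below only illustrates the general kind of parsimonious parametric form argued for, a log-linear regression of effort on size plus a few drivers, fitted to synthetic data (Python with scikit-learn; none of the numbers correspond to the study's dataset or results):

```python
# Generic sketch of a parsimonious log-linear effort model:
# log(effort) ~ log(size) + log(team size) + categorical driver.
# Synthetic data; not the paper's dataset or fitted coefficients.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 500
size = rng.lognormal(5, 1, n)                  # project size (e.g. FP)
team = rng.integers(1, 20, n)                  # average number of developers
rad = rng.integers(0, 2, n)                    # rapid application development?

# Placeholder "true" relation used only to generate the example data.
effort = 3.0 * size ** 0.9 * team ** 0.4 * np.exp(-0.2 * rad) \
         * rng.lognormal(0, 0.3, n)

X = np.column_stack([np.log(size), np.log(team), rad])
model = LinearRegression().fit(X, np.log(effort))
print("exponents / coefficients:", model.coef_, "intercept:", model.intercept_)

new = np.array([[np.log(400.0), np.log(5), 1]])
print("predicted effort:", float(np.exp(model.predict(new)[0])))
```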
Abstract: Osteoarthritis (OA) is the most prevalent and common debilitating form of arthritis and can be defined as a degenerative condition affecting the synovial joint. Patients suffering from osteoarthritis often complain of a dull aching pain on movement. Physical agents such as heat or cold therapy can fight the painful process when correctly indicated and used. Aim: This study was carried out to compare the effect of cold, warm and contrast therapy on controlling knee osteoarthritis-associated problems. Setting: The study was carried out in the orthopedic outpatient clinics of Menoufia University and Teaching Hospitals, Egypt. Sample: A convenience sample of 60 adult patients with unilateral knee osteoarthritis. Tools: Three tools were utilized to collect the data. Tool I: an interviewing questionnaire, comprising three parts covering sociodemographic data, medical data and adverse effects of the treatment protocol. Tool II: the Knee Injury and Osteoarthritis Outcome Score (KOOS), which consists of five main parts. Tool III: a 0-10 numeric pain rating scale. Results: The total knee symptoms score decreased from moderate symptoms pre-intervention to mild symptoms after the warm and contrast methods of therapy, but contrast therapy had a more significant effect in reducing knee symptoms and pain than the other methods. Conclusions: All three methods of therapy resulted in improvement in all knee symptoms and pain, but the most appropriate treatment protocol to relieve symptoms and pain was contrast therapy.
Abstract: In the recent past, the Unified Modeling Language (UML) has become the de facto industry standard for object-oriented modeling of software systems. The syntax- and semantics-rich UML has encouraged industry to develop several supporting tools, including those capable of generating deployable product (code) from UML models. As a consequence, ensuring the correctness of the model/design has become a challenging and extremely important task. In this paper, we present an approach for automatic verification of protocol models/designs. As a case study, the Session Initiation Protocol (SIP) design is verified for the property "the CALLER will not converse with the CALLEE before the connection is established between them". The SIP is modeled using UML statechart diagrams and the desired properties are expressed in temporal logic. Our prototype verifier "UML-SMV" is used to carry out the verification. We subjected an erroneous SIP model to UML-SMV; the verifier could successfully detect the error (in 76.26 ms) and generate the error trace.
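The abstract states the property only informally; one plausible way to formalize such a "not before" requirement in CTL (a standard absence-before pattern, not necessarily the exact specification checked by UML-SMV) is:

```latex
% One plausible CTL rendering of "the CALLER will not converse with the
% CALLEE before the connection is established": no path exists along
% which conversing becomes true while connected has never yet held.
\[
  \neg \, \mathbf{E}\big[\, \neg\mathit{connected} \;\; \mathbf{U} \;\;
        (\mathit{conversing} \wedge \neg\mathit{connected}) \,\big]
\]
```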
Abstract: In order to study floristic and molecular classification
of common wild wheat (Triticum boeoticum Boiss.), an analysis was
conducted on populations of the Triticum boeoticum collected from
different regions of Iran. Considering all floristic compositions of
habitats, six floristic groups (syntaxa) within the populations were
identified. A high level of variation in T. boeoticum was also detected using SSR markers. Our results showed that the molecular method confirmed the grouping of the floristic method. In other words, the results from our study indicate that floristic classification is still a useful, efficient, and economical tool for characterizing the amount and distribution of genetic variation in natural populations of T. boeoticum. Nevertheless, molecular markers appear to be useful and complementary techniques for the identification and evaluation of genetic diversity in the studied populations.
Abstract: Risk Assessment Tool (RAT) is an expert system that
assesses, monitors, and gives preliminary treatments automatically
based on the project plan. In this paper, a review of current project time management risk assessment tools for SME software development projects is carried out; risk assessment parameters, conditions and scenarios are analyzed; and finally a risk assessment tool (RAT) model is proposed to assess, treat, and monitor risks. An implementation prototype system is developed to validate the model.
Abstract: This work proposes an accurate crosstalk noise estimation method in the presence of multiple RLC lines for use in design automation tools. The method correctly models the loading effects of non-switching aggressors and aggressor tree branches using the resistive shielding effect and realistic exponential input waveforms. Noise peak and width expressions have been derived. The results obtained are in good agreement with SPICE results. Results show that the average error for the noise peak is 4.7% and for the width 6.15%, while allowing a very fast analysis.