Abstract: Existing process models for the development of mechatronic systems prescribe largely parallel work in the detailed design phase, carried out mostly independently within the various disciplines involved. An approach for a new process model extends these existing models for the development of adaptronic systems. The approach is based on an intermediate integration step and an abstract model of the adaptronic system. Based on this system model, a simulation of the global system behavior under external and internal influences and forces is developed. A special data management system is used for the intermediate integration. According to the presented approach, this data management system provides a number of functions that go beyond the usual PDM functionality. This paper therefore presents a concept for a new data management system for the development of adaptronic systems. The concept divides the functions into six layers. In the first layer, a system model is created that decomposes the adaptronic system according to its components and the various technical disciplines involved. In addition, the parameters and properties of the system are modeled and linked with the requirements and the system model. The modeled parameters and properties form a network, which is analyzed in the second layer. From this analysis, the adjustments to individual components that are necessary to manipulate the system behavior in a specific way can be determined. The third layer contains an automatic, abstract simulation of the system behavior. This simulation is a precursor to the network analysis and serves as a filter. Through the network analysis and simulation, changes to system components are examined and the necessary adjustments to other components are calculated.
The remaining layers of the concept cover the automatic calculation of system reliability, the usual PDM functionality, and the integration of discipline-specific data into the system model. A prototype of such a data management system, extended with automatic system development support, is being implemented using the data management system ENOVIA SmarTeam V5 and the simulation system MATLAB.
Abstract: Frequent machine breakdowns, low plant availability and increased overtime are a great threat to a manufacturing plant, as they increase operating costs. The main aim of this study was to improve Overall Equipment Effectiveness (OEE) at a manufacturing company through the implementation of innovative maintenance strategies. A case study approach was used. The paper focuses on improving maintenance in a manufacturing set-up using an innovative mix of maintenance regimes to improve overall equipment effectiveness. Interviews, reviews of documentation and historical records, and direct and participatory observation were used as data collection methods during the research. Production is usually measured by the total kilowatts of motors produced per day; the target at 91% availability is 75 kilowatts a day. Reduced demand and a lack of raw materials, particularly imported items, are adversely affecting the manufacturing operations. The company had to reset its target from the usual figure of 250 kilowatts per day to a mere 75 kilowatts per day owing to the lower availability of machines, a result of breakdowns as well as the lack of raw materials. Price reductions and uncertainties, together with general machine breakdowns, lowered production further. Several recommendations were made. For instance, employee empowerment would give staff the responsibility and authority to improve operations and eliminate the six big losses. If the maintenance department is to realise its proper function in a progressive, innovative industrial society, its personnel must be continuously trained to meet current needs as well as future requirements. To make the maintenance planning system effective, it is essential to keep track of all corrective maintenance jobs and preventive maintenance inspections; for large processing plants these cannot be handled manually. It was therefore recommended that the company implement a Computerised Maintenance Management System (CMMS).
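OEE is conventionally computed as the product of the availability, performance and quality rates. A minimal sketch follows; the 91% availability figure comes from the abstract, while the performance and quality rates are illustrative assumptions, not figures from the study:

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """Overall Equipment Effectiveness as the product of the three rates."""
    return availability * performance * quality

# Illustrative figures: 91% availability (from the abstract), and assumed
# 80% performance and 95% quality rates.
print(round(oee(0.91, 0.80, 0.95), 3))
```

The six big losses mentioned in the abstract map onto these three rates: breakdowns and setup losses lower availability, minor stops and reduced speed lower performance, and defects and startup rejects lower quality.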
Abstract: Web services provide significant new benefits for SOA-based applications, but they also expose significant new security risks, and there is a huge number of WS security standards and processes. At present, there is still a lack of a comprehensive approach that offers a methodical way to construct secure WS-based SOAs. The main objective of this paper is therefore to address this need by presenting a comprehensive method for guaranteeing Web Services security in SOA. The proposed method defines three stages: Initial Security Analysis, Architectural Security Guaranty and WS Security Standards Identification. These facilitate, respectively, the definition and analysis of WS-specific security requirements, the development of a WS-based security architecture, and the identification of the related WS security standards that the security architecture must articulate in order to implement the security services.
Abstract: Talk of technological convergence has been around for almost twenty years; today the Internet has made it a reality. This is not only a technical evolution: the way it has changed our lives is reflected in the variety of applications, services and technologies used in day-to-day life. These benefits, in turn, impose ever more requirements on heterogeneous and unreliable IP networks.
This paper outlines the QoS management system developed in the NetQoS [1] project. It describes the overall architecture of a management system for heterogeneous networks and proposes automated multi-layer QoS management. The paper focuses on the structure of the most crucial modules of the system, which enable autonomous, multi-layer provisioning and dynamic adaptation.
Abstract: It is widely acknowledged that there is a shortage of software developers, not only in South Africa, but also worldwide. Despite reports on a gap between industry needs and software education, the gap has mostly been explored in quantitative studies. This paper reports on the qualitative data of a mixed method study of the perceptions of professional software developers regarding what topics they learned from their formal education and the importance of these topics to their actual work. The analysis suggests that there is a gap between industry’s needs and software development education and the following recommendations are made: 1) Real-life projects must be included in students’ education; 2) Soft skills and business skills must be included in curricula; 3) Universities must keep the curriculum up to date; 4) Software development education must be made accessible to a diverse range of students.
Abstract: This paper proposes the requirements and design of an RFID-based system for Shop Floor Control (SFC) that achieves real-time controllability of the factory, allowing an E-Manufacturing system to be developed. The detailed logical specifications of the core functions and the design diagrams of the RFID-based system are developed. RFID deployment in E-Manufacturing systems is then investigated.
Abstract: The remarkable development of information technology, the expansion of communications and the Internet, city managers' need for new ways to run the city, and demands for greater citizen participation all encourage us to complete the electronic city as soon as possible. The foundations of this electronic city lie in information technology. Public participation in metropolitan management is a crucial topic, and information technology does not impede it; on the contrary, it can improve participation and the interaction between citizens and city managers. Citizens can offer their ideas, opinions and votes on topical matters through Internet-based digital mass media and computer networks, and receive appropriate replies and services. They can take part in urban projects by becoming aware of the city's plans. The most significant challenges are information and communication management, changing citizens' views, and legal and administrative documentation.
This research identifies the obstacles to the electronic city. The required data were gathered through questionnaires in order to identify the barriers, from a statistical community comprising specialists and practitioners of the ministry of information technology and communication and the municipal information technology organization.
The conclusions demonstrate that the prioritized barriers to electronic city deployment in Iran are as follows: support problems (non-financial ones); behavioural, cultural and educational difficulties; security, legal and licensing problems; hardware, terminological and infrastructural constraints; and software and financial problems.
Abstract: The characteristics of ad hoc networks, and even their existence, depend on the nodes that form them. Thus, services and applications designed for ad hoc networks should adapt to this dynamic and distributed environment. In particular, multicast algorithms with reliability and scalability requirements should abstain from centralized approaches. We aim to define a reliable and scalable multicast protocol for ad hoc networks, utilizing epidemic techniques for this purpose. In this paper, we present a brief survey of epidemic algorithms for reliable multicasting in ad hoc networks, and describe formulations and analytical results for simple epidemics. We then describe the P2P anti-entropy algorithm for content distribution and our prototype simulation model, together with initial results demonstrating the behavior of the algorithm.
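To illustrate the anti-entropy idea in its simplest form (this is a generic push-pull gossip sketch, not the paper's exact protocol or simulation model): in each round every node picks a random peer and the pair reconcile their message sets, so a message spreads epidemically without any central coordinator.

```python
import random

def anti_entropy_round(stores):
    """One gossip round: every node reconciles its message set with a random peer."""
    for i in range(len(stores)):
        j = random.randrange(len(stores))
        if j != i:
            merged = stores[i] | stores[j]   # push-pull: both ends get the union
            stores[i], stores[j] = set(merged), set(merged)

random.seed(1)
stores = [set() for _ in range(8)]   # 8 nodes, each holding a set of messages
stores[0].add("m1")                  # one node initially holds the message
rounds = 0
while any("m1" not in s for s in stores):
    anti_entropy_round(stores)
    rounds += 1
print("message reached all 8 nodes after", rounds, "rounds")
```

With push-pull reconciliation the number of infected nodes roughly doubles per round, which is the source of the logarithmic convergence that makes epidemic multicast scalable.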
Abstract: This paper presents a new fingerprint coding technique
based on contourlet transform and multistage vector quantization.
Wavelets have shown their ability in representing natural images that
contain smooth areas separated with edges. However, wavelets
cannot efficiently take advantage of the fact that the edges usually
found in fingerprints are smooth curves. This issue is addressed by
directional transforms, known as contourlets, which have the
property of preserving edges. The contourlet transform is a new
extension to the wavelet transform in two dimensions using
nonseparable and directional filter banks. The computation and
storage requirements are the major difficulty in implementing a
vector quantizer. In the full-search algorithm, the computation and
storage complexity is an exponential function of the number of bits
used in quantizing each frame of spectral information. The storage
requirement in multistage vector quantization is less when compared
to full search vector quantization. The coefficients of contourlet
transform are quantized by multistage vector quantization. The
quantized coefficients are encoded by Huffman coding. The results
obtained are tabulated and compared with existing wavelet-based methods.
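The storage argument above can be made concrete with a toy multistage VQ: each stage quantizes the residual left by the previous stage, so two 16-entry codebooks cover an 8-bit budget with 32 stored codewords instead of the 256 a single-stage full-search quantizer would need. A minimal sketch follows; the codebooks here are random for illustration (in practice they would be trained, and the inputs would be contourlet coefficients):

```python
import numpy as np

def nearest(codebook, v):
    """Index of the codeword closest to v (full search within one stage)."""
    return int(np.argmin(np.sum((codebook - v) ** 2, axis=1)))

def msvq_encode(v, codebooks):
    """Encode v stage by stage; each stage quantizes the previous residual."""
    indices, residual = [], v.copy()
    for cb in codebooks:
        idx = nearest(cb, residual)
        indices.append(idx)
        residual = residual - cb[idx]
    return indices

def msvq_decode(indices, codebooks):
    """Reconstruction is the sum of the selected codewords from all stages."""
    return sum(cb[idx] for cb, idx in zip(codebooks, indices))

rng = np.random.default_rng(0)
dim = 8
# Two stages of 16 codewords each (4 + 4 bits): 32 stored vectors, versus
# 2**8 = 256 for a single-stage quantizer with the same bit budget.
codebooks = [rng.normal(size=(16, dim)), 0.1 * rng.normal(size=(16, dim))]
v = rng.normal(size=dim)
idx = msvq_encode(v, codebooks)
v_hat = msvq_decode(idx, codebooks)
print("stage indices:", idx, "squared error:", float(np.sum((v - v_hat) ** 2)))
```

The stage indices are what would subsequently be entropy-coded, e.g. with the Huffman coding mentioned in the abstract.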
Abstract: Meeting users' requirements is one of the predictors of project success. There should be a match between the expectations of the users and the perceptions of key project personnel with respect to usability and functionality. The aim of this study is to compare key project personnel's and potential users' (customer representatives') evaluations of the relative importance of usability and functionality factors in a software design project. The Analytic Network Process (ANP) was used to analyze the relative importance of the factors. The results show that navigation and interaction are the most significant factors, and satisfaction and efficiency the least important, for both groups. Further, the similar orderings and scores of the usability and functionality factors across the two groups indicate that the key project personnel have captured the expectations and requirements of the potential users accurately.
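ANP derives importance weights from pairwise-comparison matrices; in the simpler AHP special case (which ANP generalizes by embedding such matrices in a supermatrix to capture dependencies), the priority vector is the principal eigenvector of the comparison matrix. A sketch with a hypothetical 3x3 matrix, not data from the study:

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparisons for three factors
# (say navigation vs. interaction vs. efficiency); values are illustrative.
A = np.array([[1.0, 2.0, 5.0],
              [0.5, 1.0, 3.0],
              [0.2, 1 / 3, 1.0]])

# The principal eigenvector of A, normalized to sum to 1, gives the
# priority weights of the three factors.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()
print("priority weights:", np.round(w, 3))
```

Comparing the weight vectors obtained from the two groups (key project personnel vs. potential users) is what supports the abstract's conclusion that their factor orderings agree.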
Abstract: As more people from non-technical backgrounds
are becoming directly involved with large-scale ontology
development, the focal point of ontology research has shifted
from the more theoretical ontology issues to problems
associated with the actual use of ontologies in real-world,
large-scale collaborative applications. Recently the National
Science Foundation funded a large collaborative ontology
development project for which a new formal ontology model,
the Ontology Abstract Machine (OAM), was developed to
satisfy some unique functional and data representation
requirements. This paper introduces the OAM model and the
related algorithms that enable maintenance of an ontology that
supports node-based user access. The successful software
implementation of the OAM model and its subsequent
acceptance by a large research community proves its validity
and its real-world application value.
Abstract: Resource-constrained project scheduling is an NP-hard optimisation problem. There are many different heuristic strategies for shifting activities in time when resource requirements exceed the available amounts. These strategies are frequently based
on priorities of activities. In this paper, we assume that a suitable
heuristic has been chosen to decide which activities should be
performed immediately and which should be postponed and
investigate the resource-constrained project scheduling problem
(RCPSP) from the implementation point of view. We propose an
efficient routine that, instead of shifting the activities, extends their
duration. It makes it possible to break down their duration into active
and sleeping subintervals. Then we can apply the classical Critical
Path Method that needs only polynomial running time. This
algorithm can simply be adapted for multiproject scheduling with
limited resources.
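The classical Critical Path Method invoked above runs in time polynomial in the size of the precedence graph: a single forward pass over the activities in topological order yields the earliest start and finish times. A minimal sketch with hypothetical durations and precedences (the paper's extension would additionally split each duration into active and sleeping subintervals):

```python
def cpm_earliest_times(durations, predecessors):
    """CPM forward pass: earliest start/finish per activity.
    Activities must be iterated in topological order."""
    es, ef = {}, {}
    for a, d in durations.items():
        # Earliest start is the latest finish among all predecessors (0 if none).
        es[a] = max((ef[p] for p in predecessors.get(a, [])), default=0)
        ef[a] = es[a] + d
    return es, ef

# Hypothetical project: C waits for A and B, D waits for C.
durations = {"A": 3, "B": 2, "C": 4, "D": 1}
predecessors = {"C": ["A", "B"], "D": ["C"]}
es, ef = cpm_earliest_times(durations, predecessors)
print("project makespan:", max(ef.values()))  # A (3) -> C (4) -> D (1) = 8
```

A symmetric backward pass gives latest times and hence slack; an activity whose duration has been extended with sleeping subintervals is handled by the same machinery, which is what keeps the overall routine polynomial.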
Abstract: Ontologies are broadly used in the context of networked home environments. With ontologies it is possible to define and store context information, as well as to model different kinds of physical environments. Ontologies are central to networked home environments because they carry the meaning. However, ontologies and the OWL language are complex. Several ontology visualization approaches have been developed to enhance the understanding of ontologies, but the domain of networked home environments sets some special requirements for an ontology visualization approach. The visualization tool presented here visualizes ontologies in a domain-specific way: it effectively represents the physical structures and spatial relationships of networked home environments, and it provides extensive interaction possibilities for editing and manipulating the visualization. The tool shortens the path from beginner to intermediate OWL ontology reader by visualizing instances in their actual locations, making OWL ontologies more interesting and concrete and, above all, easier to comprehend.
Abstract: The use of magnetic and magnetic/gold core/shell
nanoparticles in biotechnology or medicine has shown good promise
due to their hybrid nature which possesses superior magnetic and
optical properties. Some of these potential applications include
hyperthermia treatment, bio-separations, diagnostics, drug delivery
and toxin removal. Synthesis refinement to control geometric and
magnetic/optical properties, and finding functional surfactants for
biomolecular attachment, are requirements to meet application
specifics.
Various high-temperature preparative methods were used for the
synthesis of iron oxide and gold-coated iron oxide nanoparticles.
Different surface functionalities, such as 11-aminoundecanoic and
11-mercaptoundecanoic acid, were introduced on the surface of the
particles to facilitate further attachment of biomolecular functionality
and drug-like molecules. Nanoparticle thermal stability, composition,
state of aggregation, size and morphology were investigated and the
results from techniques such as Fourier transform infrared spectroscopy (FT-IR), ultraviolet-visible spectroscopy (UV-Vis),
Transmission Electron Microscopy (TEM) and thermal analysis are
discussed.
Abstract: A three-dimensional thermal analysis of full-penetration Nd:YAG laser welding in transparent mode is presented for DP600 alloy steel of 1.25 mm thickness with a gap of 0.1 mm. Three models studied the influence on the transient temperature of temperature-dependent thermal properties, temperature-independent thermal properties, and the peak value of the specific heat at the phase-transformation temperature AC1. Another seven models studied the influence of the discretization, i.e. the mesh, on the temperature distribution in the welded plate. It is shown that the thermal-property assumptions introduce errors of less than 4% in the maximum temperature in the fusion zone (FZ) and heat-affected zone (HAZ). The minimum discretization is at least one-third increment per radius for the temporal discretization, while the spatial discretization requires two elements per radius and four elements through the thickness of the assembled plate. These therefore represent the minimum modeling requirements for laser welding in order to keep the errors below 5% compared with the fine mesh.
Abstract: A wireless sensor network is formed from a combination of sensor nodes and sink nodes, and has recently attracted the attention of the research community. Its main applications are in security, for both the general public and the military, against different kinds of attacks. However, securing these networks is itself a critical issue due to constraints such as limited energy, computational power and memory. Researchers working in this area have proposed a number of security techniques for this purpose, but more work still needs to be done. In this paper we provide a detailed discussion of security in wireless sensor networks. The paper helps to identify the different obstacles and requirements for securing wireless sensor networks, and highlights the weaknesses of existing techniques.
Abstract: Recently, distributed generation technologies have received much attention for the potential energy savings and reliability assurances that might be achieved as a result of their widespread adoption. Fueling the attention have been the possibilities of international agreements to reduce greenhouse gas emissions, electricity sector restructuring, high power reliability requirements for certain activities, and concern about easing transmission and distribution capacity bottlenecks and congestion. It is therefore necessary to investigate the impact of these kinds of generators on distribution feeder reconfiguration. This paper presents an approach to distribution reconfiguration considering Distributed Generators (DGs). The objective function is the summation of electrical power losses, and Tabu search optimization is used to solve the optimal operation problem. The approach is tested on a real distribution feeder.
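Feeder reconfiguration is a combinatorial search over switch states, which is why a metaheuristic like tabu search fits. The skeleton below shows the generic loop over binary switch configurations; the loss function is a deliberately simple stand-in (distance to a known target configuration), not the paper's power-flow-based feeder model:

```python
import random

def tabu_search(loss, start, n_iter=200, tenure=7):
    """Minimise loss over bit-vectors via single-bit moves with a tabu list."""
    current = list(start)
    best, best_loss = list(current), loss(current)
    tabu = {}  # bit index -> iteration until which flipping it stays forbidden
    for it in range(n_iter):
        moves = []
        for i in range(len(current)):
            neighbour = list(current)
            neighbour[i] ^= 1           # flip one switch
            l = loss(neighbour)
            # Allow the move if it is not tabu, or if it beats the best
            # solution seen so far (aspiration criterion).
            if tabu.get(i, -1) < it or l < best_loss:
                moves.append((l, i, neighbour))
        if not moves:
            continue
        l, i, neighbour = min(moves)    # best admissible neighbour
        current = neighbour
        tabu[i] = it + tenure           # recently flipped bit becomes tabu
        if l < best_loss:
            best, best_loss = list(neighbour), l
    return best, best_loss

# Stand-in objective: loss is the Hamming distance to a target configuration.
target = [1, 0, 1, 1, 0, 0, 1, 0]
loss = lambda x: sum(a != b for a, b in zip(x, target))
best, best_loss = tabu_search(loss, [0] * 8)
print(best, best_loss)
```

In the actual application, evaluating a candidate configuration would mean running a power flow on the feeder (subject to radiality constraints) and summing the branch losses.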
Abstract: System testing exercises the entire system against the Functional Requirement Specification and/or the System Requirement Specification. Moreover, it is an investigatory testing phase in which the focus is to take an almost destructive attitude and to test not only the design, but also the behavior, and even the believed expectations of the customer. It is also intended to test up to and beyond the bounds defined in the software/hardware requirements specifications. At Motorola®, automated testing is one of the testing methodologies used by GSG-iSGT (Global Software Group - iDEN™ Subscriber Group-Test) to increase testing volume and productivity and to reduce the test cycle-time in iDEN™ phone testing, producing more robust products before release to the market. In this paper, iHopper is proposed as a tool to perform stress tests on iDEN™ phones. We discuss the value that automation has brought to iDEN™ phone testing, such as improved software quality in the iDEN™ phone, together with some metrics. We also look at the advantages of the proposed system and discuss future work.
Abstract: The aim of this research is to design a collaborative
framework that integrates risk analysis activities into the geospatial
database design (GDD) process. Risk analysis is rarely undertaken
iteratively as part of the present GDD methods in conformance to
requirement engineering (RE) guidelines and risk standards.
Accordingly, when risk analysis is performed during the GDD, some
foreseeable risks may be overlooked and not reach the output
specifications especially when user intentions are not systematically
collected. This may lead to ill-defined requirements and ultimately in
higher risks of geospatial data misuse. The adopted approach consists
of 1) reviewing risk analysis process within the scope of RE and
GDD, 2) analyzing the challenges of risk analysis within the context
of GDD, and 3) presenting the components of a risk-based
collaborative framework that improves the collection of the
intended/forbidden usages of the data and helps geo-IT experts to
discover implicit requirements and risks.
Abstract: This paper is intended to assist anyone with some general technical experience but perhaps limited specific knowledge of heat transfer equipment. A characteristic of heat exchanger design is the procedure of specifying a design, with its heat transfer area and pressure drops, and checking whether the assumed design satisfies all requirements. The purpose of this paper is to show how to design an oil cooler, in particular a shell-and-tube heat exchanger, which is the most common type of liquid-to-liquid heat exchanger. General design considerations and the design procedure are also illustrated, and a flow diagram is provided as an aid to the design procedure. MATLAB and AutoCAD are used in the design calculations. Fundamental heat transfer concepts and the complex relationships involved in such an exchanger are also presented. The primary aim of the design is to obtain a high heat transfer rate without exceeding the allowable pressure drop. The resulting computer program is highly useful for designing shell-and-tube heat exchangers and for modifying existing designs.
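The core sizing relation in such a design procedure is Q = U * A * dT_lm, where dT_lm is the log-mean temperature difference. A sketch of that step in Python (the paper uses MATLAB); the oil and water temperatures, duty and overall coefficient below are illustrative assumptions, not figures from the paper:

```python
import math

def lmtd(t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    """Log-mean temperature difference for a counter-current exchanger."""
    dt1 = t_hot_in - t_cold_out     # terminal difference at the hot inlet
    dt2 = t_hot_out - t_cold_in     # terminal difference at the hot outlet
    if math.isclose(dt1, dt2):
        return dt1                  # limit of the formula when dt1 == dt2
    return (dt1 - dt2) / math.log(dt1 / dt2)

def required_area(duty_w, u_w_m2k, dtlm_k):
    """Heat-transfer area from Q = U * A * LMTD."""
    return duty_w / (u_w_m2k * dtlm_k)

# Illustrative oil-cooler figures: oil 90 -> 50 degC, cooling water
# 25 -> 40 degC, duty 120 kW, assumed overall coefficient U = 350 W/m^2.K.
dtlm = lmtd(90, 50, 25, 40)
area = required_area(120e3, 350, dtlm)
print(f"LMTD = {dtlm:.1f} K, required area = {area:.1f} m^2")
```

In a full shell-and-tube design this area would then be converted into a tube count and shell diameter, after which the tube-side and shell-side pressure drops are checked against the allowable values, iterating if necessary.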