Abstract: Virtual engineering technology has progressed rapidly in recent years and is increasingly being adopted by manufacturing companies across many engineering disciplines. There is growing demand from industry for qualified virtual engineers, who must be able to apply engineering principles and mechanical design methods within commercial software package environments. This poses a challenge to university engineering education, which traditionally tends to lack the integration of knowledge and skills required for solving real-world problems. In this paper, a case study presents recent developments in an MSc Mechanical Engineering course at the Department of Engineering and Technology at MMU, and in particular two units, Simulation of Mechanical Systems (SMS) and Computer Aided Fatigue Analysis (CAFA), that emphasize virtual engineering education and promote the integration of knowledge acquisition, skill training and industrial application.
Abstract: Apparel product development is an important stage in the life cycle of a product, and shortening it helps to reduce the cost of a garment. The aim of this study is to examine the production parameters in knitwear apparel companies by defining unit costs, and to develop software that calculates the unit costs of garments and produces cost estimates. With the help of a questionnaire, we analyzed the unit cost estimating and cost calculating systems of different companies. Within the scope of the questionnaire, the importance of the cost estimating process for apparel companies and the expectations from a new cost estimating programme were investigated. According to the results, the majority of participating companies use manual cost calculating methods or simple Microsoft Excel spreadsheets to make cost estimates. Furthermore, many companies reported difficulties in archiving cost data for future use; as a solution, the sub-units of garment cost, namely fabric, accessory and labor costs, are analyzed and added to the programme's database before a cost estimate is made. Another feature of the cost estimating tool prepared in this study is that the programme consists of two main units, one producing the product specification and the other performing the cost calculation. The programme is implemented as a web-based application so that the supplier, the manufacturer and the customer can communicate through the same platform.
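As a minimal illustration of the kind of calculation such a programme automates, a garment's unit cost can be built up from its fabric, accessory and labor sub-units. The formula, field names and markup scheme below are assumptions for illustration, not the paper's actual implementation:

```python
# Hypothetical unit-cost breakdown; field names and markup percentages are
# invented for illustration and are not taken from the paper's programme.
def unit_cost(fabric_kg, fabric_price_per_kg, accessory_costs,
              labor_minutes, labor_rate_per_minute,
              overhead_pct=0.0, profit_pct=0.0):
    base = (fabric_kg * fabric_price_per_kg
            + sum(accessory_costs)
            + labor_minutes * labor_rate_per_minute)
    # Overhead and profit are applied as successive percentage markups.
    return base * (1 + overhead_pct) * (1 + profit_pct)

# A knitted garment: 0.35 kg of fabric, two accessories, 12 minutes of labor.
cost = unit_cost(0.35, 8.0, [0.2, 0.5], 12, 0.25,
                 overhead_pct=0.10, profit_pct=0.15)
print(round(cost, 4))  # base is 6.5, then marked up twice
```

Storing each sub-unit (fabric, accessory, labor) separately in a database, as the paper proposes, lets later estimates reuse the same entries rather than re-entering them.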
Abstract: Conventional approaches to implementing logic programming applications on embedded systems are purely software based. As a consequence, a compiler is needed to transform the initial declarative logic program into an equivalent procedural one to be programmed onto the microprocessor. This approach increases the complexity of the final implementation and reduces the overall system's performance. Conversely, hardware implementations that are only capable of supporting logic programs cannot be used in applications where logic programs must be intertwined with traditional procedural ones. We exploit HW/SW codesign methods to present a microprocessor capable of supporting hybrid applications that use both programming approaches. We take advantage of the close relationship between attribute grammar (AG) evaluation and knowledge engineering methods to present a programmable hardware parser that performs logic derivations, and combine it with an extension of a conventional RISC microprocessor that performs the unification process to report the success or failure of those derivations. The extended RISC microprocessor is still capable of executing conventional procedural programs, so hybrid applications can be implemented. The presented implementation is programmable, supports the execution of hybrid applications, increases the performance of logic derivations (experimental analysis yields an approximate 1000% increase in performance) and reduces the complexity of the final implemented code. The proposed hardware design is supported by a proposed extended C language called C-AG.
Abstract: The implementation of new software and hardware technologies in tritium processing nuclear plants, especially those of an experimental character or involving new technology developments, is complicated by the difficulty of integrating high-performance instrumentation and equipment into a unitary monitoring system for the nuclear technological process of tritium removal. Preserving system flexibility is a requirement of experimental nuclear plants, for which changes of configuration, process and parameters are routine. The large amount of data that must be processed, stored and accessed for real-time simulation and optimization demands a virtual technological platform in which the data acquisition, control and analysis systems of the technological process are integrated with a dedicated monitoring system. Thus, the integrated computing and monitoring systems needed for supervising the technological process will be implemented, followed by an optimization system based on new, high-performance methods suited to the technological processes of tritium removal plants. The software applications are developed with program packages dedicated to industrial processes and include acquisition and monitoring sub-modules, termed "virtual", as well as a storage sub-module for the process data later required by the optimization and simulation software for the tritium removal process. The system plays an important role in environmental protection and sustainable development through new technologies, namely the reduction of, and the fight against, industrial accidents at tritium processing nuclear plants. Research into the monitoring and optimization of nuclear processes is also a major driving force for economic and social development.
Abstract: The advent of modern technology casts its repercussions on successful legacy systems, making them obsolete with time. These systems confront large organizations with major problems in terms of new business requirements, response time, financial depreciation and maintenance. The main difficulty is due to constant system evolution and the incomplete, inconsistent and obsolete documentation that a legacy system tends to have. The myriad dimensions of these systems can only be explored through reverse engineering, which in this context is the best method to extract useful artifacts and to exploit those artifacts for reengineering existing legacy systems to meet the new requirements of organizations. A case study is conducted on six different types of software systems, with source code in different programming languages, using an architectural recovery framework.
Abstract: In this paper, test generation methods and appropriate fault models for the testing and analysis of embedded systems described as (extended) finite state machines ((E)FSMs) are presented. Compared to simple FSMs, EFSMs specify not only the control flow but also the data flow. Thus, we define a two-level fault model to cover both aspects. The goal of this paper is to reuse well-known FSM-based test generation methods for the automation of embedded system testing. These methods have been widely used in the testing and validation of protocols and communicating systems. (E)FSM-based specification and testing is particularly advantageous because (E)FSMs underlie the formal semantics of already standardised formal description techniques (FDTs), which are popular in the design of hardware and software systems.
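One of the classical FSM-based generation methods alluded to above is the transition tour, which derives an input sequence covering every transition at least once. The sketch below runs it over a hypothetical two-state Mealy machine; both the machine and the greedy strategy are illustrative assumptions, not the paper's algorithm:

```python
# A toy Mealy machine: (state, input) -> (next_state, output). The machine
# itself is a hypothetical example, not one from the paper.
fsm = {
    ("S0", "a"): ("S1", "x"),
    ("S0", "b"): ("S0", "y"),
    ("S1", "a"): ("S0", "y"),
    ("S1", "b"): ("S1", "x"),
}

def transition_tour(fsm, start="S0"):
    """Greedy transition tour: an input sequence covering every transition.

    The greedy fallback is adequate for small strongly connected machines
    like this one; industrial generators use more careful path search.
    """
    seq, state, unvisited = [], start, set(fsm)
    while unvisited:
        choices = [key for key in fsm if key[0] == state]
        # Prefer an unvisited outgoing transition; otherwise keep moving.
        key = next((k for k in choices if k in unvisited), choices[0])
        unvisited.discard(key)
        seq.append(key[1])
        state = fsm[key][0]
    return seq

tour = transition_tour(fsm)
print(tour)  # an input sequence exercising all four transitions
```

Applying the same tour to an implementation under test and comparing the observed outputs against the specification's outputs detects output and transfer faults at the control-flow level; the paper's two-level fault model additionally covers the EFSM data flow.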
Abstract: Automatic reusability appraisal could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components in existing legacy systems, which can save the cost of developing software from scratch. But the issue of how to identify reusable components in existing systems has remained relatively unexplored. In this paper, we present a two-tier approach that studies both the structural attributes of a component and its usability or relevancy to a particular domain. Latent semantic analysis is used for the feature-vector representation of various software domains. It exploits the fact that feature-vector codes can be seen as documents containing terms (the identifiers present in the components), so text modeling methods that capture co-occurrence information in low-dimensional spaces can be used. Further, we devised a neuro-fuzzy hybrid inference system that takes structural metric values as input and calculates the reusability of the software component. A decision tree algorithm is used to decide the initial set of fuzzy rules for the neuro-fuzzy system. The results obtained are convincing enough to propose the system for the economical identification and retrieval of reusable software components.
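The latent semantic analysis step can be sketched concretely: treat each component's identifier list as a document, build a term-document matrix, and project it into a low-dimensional latent space with a truncated SVD. The identifier "documents" below are invented for illustration and are not from the paper's corpus:

```python
import numpy as np

# Hypothetical identifier "documents" for three components; the names are
# invented for illustration, not taken from the paper.
docs = [
    "open read close buffer file",         # file-handling component A
    "read write file stream buffer",       # file-handling component B
    "matrix multiply invert determinant",  # numeric component C
]

# Term-document count matrix: rows are identifiers, columns are components.
vocab = sorted({t for d in docs for t in d.split()})
A = np.array([[d.split().count(t) for d in docs] for t in vocab], dtype=float)

# The LSA step: a truncated SVD projects documents into a k-dimensional
# latent space that captures identifier co-occurrence.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # one k-dim vector per component

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The two file-handling components land closer to each other than to C.
print(cosine(doc_vecs[0], doc_vecs[1]) > cosine(doc_vecs[0], doc_vecs[2]))
```

Components whose latent vectors lie near a domain's vectors are candidates for that domain; the structural-metric tier then scores their reusability.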
Abstract: As network-based technologies become omnipresent, the demand to secure networks and systems against threats increases. One of the effective ways to achieve higher security is through the use of intrusion detection systems (IDS): software tools that detect anomalies in a computer or network. In this paper, an IDS has been developed using an improved machine-learning-based algorithm, the Locally Linear Neuro-Fuzzy Model (LLNF), for classification; this model was originally used for system identification. A key technical challenge in IDS and LLNF learning is the curse of high dimensionality. Therefore, a feature selection phase is proposed that is applicable to any IDS. By investigating the use of three feature selection algorithms in this model, it is shown that adding a feature selection phase reduces the computational complexity of our model. Feature selection algorithms require a feature goodness measure; the use of both a linear and a non-linear measure (the linear correlation coefficient and mutual information, respectively) is investigated.
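The two feature goodness measures named above are standard and easy to state. A minimal sketch (the sample data is invented) shows why both are worth investigating: the linear correlation coefficient captures only linear dependence, while mutual information also detects non-linear relationships between a feature and the class label:

```python
import math
from collections import Counter

def pearson(xs, ys):
    # Linear correlation coefficient: captures only linear dependence.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def mutual_information(xs, ys):
    # Non-linear measure over discrete feature/label values, in bits.
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(
        (c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

# A purely quadratic relation: the linear measure sees nothing, MI does.
xs, ys = [-2, -1, 1, 2], [4, 1, 1, 4]
print(pearson(xs, ys), mutual_information(xs, ys))  # 0.0 versus 1.0 bit
```

In a feature selection phase, features would be ranked by such a goodness score and the lowest-ranked ones dropped before training the classifier.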
Abstract: A virtualized and virtual approach is presented for academically preparing students to engage successfully, at a strategic level, with both structured and unstructured concerns and measures in the area of cyber security and information assurance. The Master of Science in Cyber Security and Information Assurance (MSCSIA) is a professional degree for those who endeavor, through technical and managerial measures, to ensure the security, confidentiality, integrity, authenticity, control, availability and utility of the world's computing and information systems infrastructure. The National University Cyber Security and Information Assurance program is offered as a Master's degree. The MSCSIA program is unique in its emphasis on hands-on academic instruction using virtual computers. In 2011, the NU facility became fully operational, using its system architecture to provide a Virtual Education Laboratory (VEL) accessible to both onsite and online students. The first student cohort completed their MSCSIA training on March 2, 2012, after fulfilling 12 courses, for a total of 54 units of college credit. The rapid-pace scheduling of one course per month is immensely challenging, perpetually changing, and virtually multifaceted. This paper analyses these descriptive terms in consideration of the globalized penetration breaches present in today's world of cyber security. In addition, we present current NU practices to mitigate risks.
Abstract: In most fields of study, a phenomenon cannot be studied directly but is examined indirectly through a model of it. By making an accurate model of a system, new information can be obtained from the modeled phenomenon without cost or danger. Many approaches have been developed for describing and analyzing today's complicated systems, but few of them can analyze performance within the same framework used for system description. Petri nets are among the few formalisms that achieve this union, and they are widely applied to problems of system modeling and design. Petri net theory allows a system to be modeled mathematically as a Petri net; analysis of the Petri net can then determine key information about the modeled system's structure and dynamics. This information can be used to assess the performance of the system and to suggest corrections to it. In this paper, besides an introduction to Petri nets, a real case study is presented to show the application of generalized stochastic Petri nets (GSPNs) in modeling a resource-sharing production system and evaluating the efficiency of its machines and robots. The modeling tool used here is the SHARP software, which calculates specific indicators that support decision making.
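The token-game semantics underlying such models can be sketched in a few lines. The net below is a toy resource-sharing cell (one shared robot serving two machines) invented for illustration, and it is untimed; a GSPN would additionally attach firing rates to the timed transitions:

```python
# Places and their token counts for a toy resource-sharing cell.
marking = {"robot_idle": 1, "m1_wait": 1, "m2_wait": 1,
           "m1_busy": 0, "m2_busy": 0}

# Transitions: (tokens consumed from input places, tokens produced).
transitions = {
    "load_m1": ({"robot_idle": 1, "m1_wait": 1}, {"m1_busy": 1}),
    "done_m1": ({"m1_busy": 1}, {"robot_idle": 1, "m1_wait": 1}),
    "load_m2": ({"robot_idle": 1, "m2_wait": 1}, {"m2_busy": 1}),
    "done_m2": ({"m2_busy": 1}, {"robot_idle": 1, "m2_wait": 1}),
}

def enabled(t):
    ins, _ = transitions[t]
    return all(marking[p] >= n for p, n in ins.items())

def fire(t):
    assert enabled(t), f"{t} is not enabled"
    ins, outs = transitions[t]
    for p, n in ins.items():
        marking[p] -= n
    for p, n in outs.items():
        marking[p] += n

fire("load_m1")            # the robot loads machine 1
print(enabled("load_m2"))  # False: the shared robot is now busy
```

Resource contention (the robot place) is exactly the structural feature whose performance impact a GSPN analysis quantifies, e.g. as machine and robot utilization indicators.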
Abstract: Intelligent tutoring systems constitute an evolution of computer-aided educational software. We present here the modules of an intelligent tutoring system for Automatic Control, developed in our department. Through the software application developed, students can perform complete automatic control laboratory experiments, either over the departmental local area network or over the Internet. Monitoring of access to the system (local as well as international), along with student performance statistics, has yielded strongly encouraging results (as of fall 2004), despite the advanced technical content of the presented paradigm, thus showing the potential of the system developed for education and for training.
Abstract: The emerging Semantic Web has attracted many researchers and developers. New applications have been developed on top of the Semantic Web, and many supporting tools have been introduced to improve its software development process. Metadata modeling is one part of the development process for which supporting tools exist. However, existing tools lack the readability and ease of use that a domain expert needs to model a problem graphically as a semantic model. In this paper, a metadata modeling tool called RDFGraph is proposed to solve these problems. RDFGraph is also designed to work with modern database management systems that support RDF and to improve the performance of the query execution process. The testing results show that the rules used in RDFGraph follow the W3C standard and that the graphical model produced by the tool is translated properly and correctly.
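For readers unfamiliar with the data model such tools target, RDF represents knowledge as subject-predicate-object triples, and queries are patterns with variables matched against them. The toy store below is purely illustrative and is not RDFGraph's actual data model or query engine:

```python
# A minimal RDF-style triple store with one-pattern matching; the data and
# prefixes are invented for illustration.
triples = {
    ("ex:Alice", "rdf:type", "ex:Person"),
    ("ex:Alice", "ex:knows", "ex:Bob"),
    ("ex:Bob", "rdf:type", "ex:Person"),
}

def match(pattern):
    """Terms starting with '?' are variables; return one binding per match."""
    results = []
    for triple in triples:
        if all(p.startswith("?") or p == t for p, t in zip(pattern, triple)):
            results.append({p: t for p, t in zip(pattern, triple)
                            if p.startswith("?")})
    return results

people = sorted(b["?who"] for b in match(("?who", "rdf:type", "ex:Person")))
print(people)  # both typed ex:Person resources are bound to ?who
```

A graphical modeling tool in this space essentially lets a domain expert draw the nodes and labeled edges of such a triple graph instead of writing the triples by hand.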
Abstract: The Ministry of Defense (MoD) spends hundreds of millions of dollars on software to support its infrastructure, operate its weapons and provide command, control, communications, computing, intelligence, surveillance, and reconnaissance (C4ISR) functions. These and other new advanced systems share a common critical component: information technology. The defense and aerospace environment continuously strives to keep up with increasingly sophisticated Information Technology (IT) in order to remain effective in today's dynamic and unpredictable threat environment. This makes IT one of the largest and fastest growing expenses of defense. Hundreds of millions of dollars are spent each year on IT projects, but too many of those millions are wasted on costly mistakes: systems that do not work properly, new components that are not compatible with old ones, trendy new applications that do not really satisfy defense needs, or money lost through poorly managed contracts.

This paper investigates and compiles effective strategies that aim to end the exasperation with the low returns and high costs of information technology acquisition for defense; it tries to show how to maximize value while reducing time and expenditure.
Abstract: Measuring the complexity of software has been an insoluble problem in software engineering. Complexity measures can be used to predict critical information about the testability, reliability, and maintainability of software systems from automatic analysis of the source code. During the past few years, many complexity measures have been invented based on the emerging discipline of Cognitive Informatics. These software complexity measures, including cognitive functional size, rely on the total cognitive weights of basic control structures such as loops and branches. This paper shows that the current calculation method can generate different results for programs that are algebraically equivalent. Moreover, analysis of the combinatorial meaning of this calculation method reveals a significant flaw in the measure, which also explains why it does not satisfy Weyuker's properties. Based on these findings, improvement directions, such as measure fusion and a cumulative variable counting scheme, are suggested to enhance the effectiveness of cognitive complexity measures.
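To make the scheme under discussion concrete: in the standard calculation, cognitive weights of sequential basic control structures add, while nesting multiplies them. The sketch below uses weight values common in the cognitive-informatics literature; the tree encoding of a program is an assumption for illustration, not the paper's notation:

```python
# Cognitive weights of basic control structures (values as commonly used in
# the cognitive-informatics literature; the encoding below is illustrative).
WEIGHTS = {"sequence": 1, "branch": 2, "iteration": 3, "call": 2}

def cognitive_weight(node):
    """Nodes are (kind, *children): sibling weights add, nesting multiplies."""
    kind, *children = node
    w = WEIGHTS[kind]
    if not children:
        return w
    return w * sum(cognitive_weight(c) for c in children)

# A loop whose body is a branch around one statement, followed by a statement:
program = ("sequence",
           ("iteration", ("branch", ("sequence",))),
           ("sequence",))
print(cognitive_weight(program))  # 1 * (3 * 2 * 1 + 1) = 7
```

The paper's critique is precisely about this add-versus-multiply composition: two decompositions of an algebraically equivalent program can yield different totals, which is why the measure fails some of Weyuker's properties.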
Abstract: Quantitative investigation of the contribution of individual factors towards measuring the reusability of software components could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components in existing legacy systems, which can save the cost of developing software from scratch. But the issue of the relative significance of the contributing factors has remained relatively unexplored. In this paper, we use Taguchi's approach to analyze the significance of different structural attributes, or factors, in deciding the reusability level of a particular component. The results obtained show that complexity is the most important factor in deciding the reusability of function-oriented software. In the case of object-oriented software, coupling and complexity collectively play a significant role in high reusability.
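Taguchi's approach ranks factors by comparing signal-to-noise (S/N) ratios across factor levels; a factor whose levels change the S/N ratio most is the most significant. A minimal larger-the-better S/N computation is sketched below; the reusability scores are invented for illustration, not the paper's data:

```python
import math

def sn_larger_is_better(values):
    # Taguchi larger-the-better S/N ratio, in decibels: higher is better,
    # rewarding both high and consistent responses.
    n = len(values)
    return -10 * math.log10(sum(1 / (v * v) for v in values) / n)

# Hypothetical reusability scores observed at two levels of a "complexity"
# factor in an orthogonal-array experiment.
low_complexity = [0.80, 0.85, 0.90]
high_complexity = [0.40, 0.50, 0.45]

print(sn_larger_is_better(low_complexity)
      > sn_larger_is_better(high_complexity))  # low complexity wins
```

In a full Taguchi analysis, the per-level S/N means for every factor (coupling, complexity, and so on) are compared, and the factor with the largest S/N range is judged most significant.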
Abstract: The number of intrusions and attacks against critical infrastructures and other information networks is increasing rapidly. While there is no identified evidence that terrorist organizations are currently planning a coordinated attack against the vulnerabilities of computer systems and networks connected to critical infrastructure, the origins of the indiscriminate cyber attacks that infect computers on networks remain largely unknown. The growing trend toward the use of more automated and menacing attack tools has also overwhelmed some of the current methodologies used for tracking cyber attacks. There is an ample possibility that such cyber attacks could be transformed into cyberterrorism pursued for illegal purposes. Cyberterrorism is a matter of vital importance to national welfare. Therefore, countries and organizations have to take proper measures to meet the situation and consider effective legislation on cyberterrorism.
Abstract: Sediment load transfer in hydraulic installations and its consequences for the operation and maintenance (O&M) of modern canal systems are emerging as among the most important considerations in hydraulic engineering projects, particularly those intended to feed the irrigation and drainage schemes of large command areas such as the Dez and Moghan in Iran. The aim of this paper is to investigate the applicability of the vortex tube as a viable means of extracting the sediment loads entering canal systems in general and water intake structures in particular. The western conveyance canal of the Dez Diversion weir, which feeds the Karkheh Flood Plain in southwestern Dezful, has been used as the case study, with data from the Dastmashan Hydrometric Station. The SHARC software has been used as the analytical framework to interpret the data. The results show that, given the grain size D50 and the canal turbulence, the adaptation length from the beginning of the canal after the diversion dam is estimated at 477 m, a point which is suitable for laying the vortex tube.
Abstract: User-Centered Design (UCD), Usability Engineering (UE) and Participatory Design (PD) are the common Human-Computer Interaction (HCI) approaches practiced in the software development process, focusing on issues and matters concerning user involvement. These approaches, however, overlook the organizational perspective of HCI integration within the software development organization. The Management Information Systems (MIS) perspective of HCI takes a managerial and organizational context in viewing the effectiveness of integrating HCI into the software development process. Human-Centered Design (HCD), which encompasses all human aspects including the aesthetic and the ergonomic, is claimed to provide a better way of strengthening the HCI approaches and thereby the software development process. In determining the effectiveness of HCD in the software development process, this paper presents the findings of a content analysis of HCI approaches, viewing those approaches as a technology that integrates user requirements from top management down to the other stakeholders in the software development process. The findings show that HCD is a technology that emphasizes humans, tools and knowledge in strengthening the HCI approaches and, in turn, the software development process, in the quest to produce sustainable, usable and useful software products.
Abstract: A key aspect of the design of any software system is
its architecture. An architecture description provides a formal model
of the architecture in terms of components and connectors and how
they are composed together. COSA (Component-Object based Software Structures) is based on object-oriented modeling and component-based modeling. The model improves reusability by increasing the extensibility, evolvability, and compositionality of software systems. This paper presents the COSA modelling tool, which gives architects the ability to verify the structural coherence of a given system and to validate its semantics under the COSA approach.
Abstract: The development and use of mobile devices, as well as their integration within education systems to deliver electronic content and to support real-time communication, was the focus of this research. In order to investigate the software engineering issues in using mobile devices, research on electronic content was initiated. The developed MP3 mobile software solution, "M-Learn", was built as a prototype for testing the approach and for developing a strategy for designing a usable m-learning environment. The mobile software solution was evaluated on a mobile device via the link: http://projects.seeu.edu.mk/mlearn. The investigation also tested the correlation between two mobile learning indicators, electronic content and attention, based on the task-based learning instructional method. The proposed methodology addresses which learning modeling approach is more appropriate when developing mobile learning software.