Abstract: LSP routing is among the prominent issues in MPLS
network traffic engineering. The objective of this routing is to
increase the number of accepted requests while guaranteeing
quality of service (QoS). Requested bandwidth is the most important
QoS criterion considered in the literature, and various heuristic
algorithms have been presented in that regard. Many of these
algorithms divert flows away from bottlenecks of the network
in order to perform load balancing, which impedes optimum
operation of the network. Here, a new routing algorithm,
MIRAD, is proposed: using only limited information about the network
topology and the links' residual bandwidth, and without any knowledge
of prospective requests, it provides every request with maximum
bandwidth as well as minimum end-to-end delay via uniform load
distribution across the network. Simulation results of the proposed
algorithm show better efficiency in comparison with similar
algorithms.
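As a rough illustration of the bandwidth-maximizing idea behind such routing, the sketch below selects the path whose bottleneck residual bandwidth is largest (a "widest path" variant of Dijkstra). It is a generic building block under an assumed adjacency-map representation, not the MIRAD algorithm itself.

```python
import heapq

def widest_path(graph, src, dst):
    """Return (bottleneck_bandwidth, path) maximizing the minimum
    residual bandwidth along the path.

    `graph` maps node -> {neighbor: residual_bandwidth}; this data
    structure is an assumption for illustration only.
    """
    best = {src: float("inf")}
    # Max-heap on bottleneck bandwidth (negated for heapq's min-heap).
    heap = [(-float("inf"), src, [src])]
    while heap:
        neg_bw, node, path = heapq.heappop(heap)
        bw = -neg_bw
        if node == dst:
            return bw, path
        for nxt, link_bw in graph[node].items():
            cand = min(bw, link_bw)      # bottleneck if we extend the path
            if cand > best.get(nxt, 0):
                best[nxt] = cand
                heapq.heappush(heap, (-cand, nxt, path + [nxt]))
    return 0, []
```

A full online LSP-routing scheme would combine such a bandwidth criterion with hop count or delay, which is the kind of trade-off the abstract describes.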
Abstract: Measuring the complexity of software has been an
insoluble problem in software engineering. Complexity measures can
be used to predict critical information about the testability, reliability,
and maintainability of software systems from automatic analysis of
the source code. During the past few years, many complexity
measures have been invented based on the emerging Cognitive
Informatics discipline. These software complexity measures,
including cognitive functional size, lend themselves to the approach
of totaling the cognitive weights of basic control structures such as loops
and branches. This paper shows that the existing calculation
method can generate different results that are algebraically
equivalent. However, analysis of the combinatorial meaning of this
calculation method reveals a significant flaw in the measure, which also
explains why it does not satisfy Weyuker's properties. Based on these
findings, improvement directions, such as measure fusion and a
cumulative variable-counting scheme, are suggested to enhance the
effectiveness of cognitive complexity measures.
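A sketch of the cognitive-weight calculation the abstract refers to follows. It assumes the commonly cited convention that sequentially composed basic control structures (BCSs) add their weights while nested BCSs multiply them; the weight table and the tree encoding are illustrative assumptions, not the paper's exact definitions.

```python
# Illustrative weights for basic control structures (assumed values:
# sequence = 1, branch = 2, iteration = 3).
WEIGHTS = {"sequence": 1, "branch": 2, "iteration": 3}

def cognitive_weight(node):
    """node = (kind, [children]); children are BCSs nested inside it.
    Siblings in the same scope add; nesting multiplies."""
    kind, children = node
    w = WEIGHTS[kind]
    if not children:
        return w
    return w * sum(cognitive_weight(c) for c in children)

# A loop containing a branch and a plain sequence: 3 * (2 + 1) = 9
program = ("iteration", [("branch", []), ("sequence", [])])
```

The flaw the abstract analyzes concerns precisely how such additive and multiplicative compositions interact combinatorially when a program is partitioned in different ways.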
Abstract: One of the basic concepts in marketing is the concept
of meeting customers' needs. Since customer satisfaction is essential
for the lasting survival and development of a business, screening and
observing customer satisfaction and recognizing its underlying
factors must be one of the key activities of every business.
The purpose of this study is to recognize the drivers that affect
customer satisfaction in a business-to-business setting in order to
improve marketing activities. We conducted a survey in which 93
business customers of a diesel generator manufacturer in Iran
participated and shared their views on, and satisfaction with, the
supplier's services related to its products. We first developed the measures
for the drivers of satisfaction through investigative research (by means
of feedback from executives and customers of the sponsoring firm). Then,
based on these measures, we created a mail survey and asked the
respondents to state their opinion about the sponsoring firm, a
supplier of diesel generators and similar products. Furthermore,
the survey asked the participants to state their functional areas
and their company characteristics.
In conclusion, we found that there are three drivers of customer
satisfaction: reliability, information about the product, and
commercial features. Buyers/users from different functional areas
attribute different degrees of importance to the last two drivers. For
instance, people from the purchasing and management areas believe that
commercial features are more important than information about
products, whereas people in the engineering, maintenance, and production
areas believe that having information about products is more
important than commercial aspects. Marketing experts should
consider the attitudes of customers toward product information
and commercial features to improve market share.
Abstract: This paper describes an experience of research,
development, and innovation applied in the naval industry at
COTECMAR (the Science and Technology Corporation for the
Development of the Shipbuilding Industry in Colombia), particularly
through processes of research, innovation, and technological development
based on theoretical models related to organizational knowledge
management, technology management, management of human
talent, and the integration of technology platforms. It seeks ways to
facilitate the initial establishment of environments rich in
information, knowledge, and content, supported by collaborative
strategies for dynamic core mission processes, seeking the further
development of research, development, and innovation in naval
engineering in Colombia and making it a distinct basis for
the generation of knowledge assets at COTECMAR.
The integration of information and communication technologies,
supported by emerging technologies (mobile and wireless technologies,
digital content via PDA, and content delivery services on Web 2.0
and Web 3.0), viewed as one of the strategic thrusts of any organization,
facilitates the redefinition of processes for managing information and
knowledge. It enables the redesign of workflows and the adoption of
new forms of organization, preferably in networks, and supports the
creation of symbolic knowledge inside the organization, promoting the
development of new skills, knowledge, and attitudes in the knowledge
worker.
Abstract: The Requirements Abstraction Model (RAM) helps in managing abstraction in requirements by organizing them at four levels (product, feature, function, and component). The RAM is adaptable and can be tailored to meet the needs of various organizations. Because software requirements are an important source of information for developing high-level tests, organizations willing to adopt the RAM need to know how suitable RAM requirements are for developing such tests. To investigate this suitability, test cases from twenty randomly selected requirements were developed, analyzed, and graded. The requirements were selected from the requirements document of a Course Management System, a web-based software system that supports teachers and students in performing course-related tasks. This paper describes the results of the requirements document analysis. The results show that requirements at lower levels of the RAM are suitable for developing executable tests, whereas it is hard to develop such tests from requirements at higher levels.
Abstract: Modern highly automated production systems face
problems of reliability. The reliability of machine functioning affects
the productivity rate and the efficiency of use of expensive
industrial facilities. Predicting reliability has become an important
research area and involves complex mathematical methods and
calculations. The reliability of highly productive technological
automatic machines, which consist of complex mechanical, electrical,
and electronic components, is important, since the failure of these units
results in major economic losses for production systems. The
reliability of transport and feeding systems for automatic
technological machines is also important, because a transport failure
stops the technological machines. This paper presents
reliability engineering of the feeding system and its components for
transporting complex-shaped parts to automatic machines. It also
discusses the calculation of the reliability parameters of the
feeding unit by applying probability theory. Equations produced
for calculating the limits of the geometrical sizes of feeders and the
probability of the transported parts sticking in the chute represent
the reliability of feeders as a function of their geometrical parameters.
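As a minimal illustration of applying probability theory to such a feeding unit, the sketch below treats the feeder's components as reliability-wise in series (the line fails if any unit fails), so the system reliability is the product of the unit reliabilities. The component names and values are hypothetical, not the paper's data.

```python
import math

def series_reliability(unit_reliabilities):
    """Reliability of a series system: the product of unit reliabilities."""
    return math.prod(unit_reliabilities)

# Hopper, chute, and escapement with assumed per-cycle reliabilities:
system_r = series_reliability([0.999, 0.995, 0.998])
```

The paper's actual contribution ties such unit reliabilities to geometrical parameters (e.g., the probability of a part sticking in the chute), which this sketch does not model.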
Abstract: This study focuses on the development of triangular fuzzy numbers, their revision, and the construction of an HCFN (half-circle fuzzy number) model which can be used to perform a wider range of operations. These numbers are further transformed using trigonometric functions and polar coordinates. From half-circle fuzzy numbers we can conceive cylindrical fuzzy numbers, which work better in algebraic operations. An example of fuzzy control is given in a simulation to show the applicability of the proposed half-circle fuzzy numbers.
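For readers unfamiliar with the baseline being revised, here is a minimal sketch of standard triangular fuzzy-number (TFN) arithmetic and membership. It does not implement the half-circle model itself; the tuple encoding (a, b, c) with peak b is an assumption.

```python
def tfn_add(x, y):
    """Standard TFN addition is component-wise on (a, b, c)."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def tfn_membership(v, a, b, c):
    """Piecewise-linear membership of value v in the TFN (a, b, c)."""
    if a < v <= b:
        return (v - a) / (b - a)   # rising edge
    if b < v < c:
        return (c - v) / (c - b)   # falling edge
    return 1.0 if v == b else 0.0
```

The half-circle model replaces these piecewise-linear sides with a circular arc, which is what enables the trigonometric and polar-coordinate transformations mentioned above.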
Abstract: In many countries, digital city or ubiquitous city
(u-City) projects have been initiated to provide digitalized economic
environments to cities. Recently, in Korea, Kangwon Province has
started the u-Kangwon project to boost the local economy with digitalized
tourism services. We analyze the limitations of the ubiquitous IT
approach through the u-Kangwon case. We have found that travelers
are more interested in the quality of information access than in its speed.
To improve service quality, we are looking to develop an
IT-convergence service design framework (ISDF). The ISDF is based
on the service engineering technique and is composed of three parts:
Service Design, Service Simulation, and the Service Platform.
Abstract: CEMTool is a command-style design and analysis
package for scientific and technological algorithms, built on a matrix-based
computation language. In this paper, we present new 2D & 3D
finite element method (FEM) packages for CEMTool. We discuss
the detailed structures and the important features of the pre-processor,
solver, and post-processor of the CEMTool 2D & 3D FEM packages. In
contrast to the existing MATLAB PDE Toolbox, our proposed FEM
packages can deal with combinations of the reserved words. Also,
we can control the mesh in a very effective way. With the introduction
of a new mesh generation algorithm and a fast solving technique, our
FEM packages guarantee shorter computation times than the
MATLAB PDE Toolbox. Consequently, with our new FEM packages,
we can overcome some disadvantages and limitations of the existing
MATLAB PDE Toolbox.
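To indicate the kind of computation such an FEM package automates, here is a generic textbook sketch, not CEMTool's actual implementation: linear elements for -u'' = 1 on (0, 1) with u(0) = u(1) = 0, assembled into a tridiagonal system and solved by the Thomas algorithm.

```python
def fem_1d_poisson(n):
    """Linear-element FEM for -u'' = 1 on (0, 1), u(0) = u(1) = 0,
    with n uniform elements; returns the interior nodal values."""
    h = 1.0 / n
    m = n - 1                      # number of interior nodes
    # Stiffness matrix (1/h) * tridiag(-1, 2, -1); load vector h * 1.
    a = [-1.0 / h] * m             # sub-diagonal
    b = [2.0 / h] * m              # diagonal
    c = [-1.0 / h] * m             # super-diagonal
    d = [h] * m
    # Thomas algorithm: forward elimination, then back substitution.
    for i in range(1, m):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    u = [0.0] * m
    u[-1] = d[-1] / b[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return u                       # exact solution: u(x) = x(1 - x)/2
```

For this model problem the linear-element nodal values coincide with the exact solution, which makes the sketch easy to check.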
Abstract: Software engineering education not only embraces the
technical skills of software development but also necessitates
communication and interaction among learners. In this paper, an
adaptation of the PBL methodology is proposed that is specifically
designed to be integrated into the software engineering classroom in
order to promote a collaborative learning environment. This approach
helps students better understand the significance of social aspects and
provides a systematic framework for enhancing teamwork skills. The
adaptation of PBL facilitates the transition to an innovative software
development environment where cooperative learning can be actualized.
Abstract: Database reverse engineering problems and their
solving processes are maturing; even so, the academic
community faces the complex problem of knowledge transfer,
both in university and in industrial contexts. This paper presents a new
CASE tool developed at the University of Jordan, namely
UJ-CASE-TOOL, which provides efficient support for this transfer. It is a
small and self-contained application exhibiting representative
problems and appropriate solutions that can be understood in a
limited time. The paper presents an algorithm describing the developed
academic CASE tool, which has been used for several years both as
an illustration of the principles of database reverse engineering and
as an exercise aimed at academic and industrial students.
Abstract: Cements, which are intrinsically brittle materials, can
exhibit a degree of pseudo-ductility when reinforced with a sufficient
volume fraction of a fibrous phase. This class of materials, called
Engineered Cement Composites (ECC), has the potential to be used in
future tunneling applications where a level of pseudo-ductility is
required to avoid brittle failures. However, uncertainties remain
regarding mechanical performance. Previous work has focused on
comparatively thin specimens; for future civil engineering
applications, however, it is imperative that the behavior in tension of
thicker specimens is understood. In the present work, specimens containing
cement powder and admixtures have been manufactured following
two different processes and tested in tension. Multiple matrix
cracking has been observed during tensile testing, leading to a
"strain-hardening" behavior and confirming the possible suitability of
ECC material used as thick sections (greater than 50 mm) in
tunneling applications.
Abstract: The distillation process, in the general sense, is a
relatively simple technique from the standpoint of its principles.
When applying distillation to water treatment, and specifically to
producing fresh water from sea, ocean, or briny waters, it is
interesting to notice that distillation has no limitations or domains of
applicability regarding the nature or type of the feedstock water.
This is not the case, however, for other techniques that are
technologically quite complex, necessitate bigger capital investments,
and are limited in their usability. In a previous paper we
explored some of the effects of temperature on yield. In this paper,
we continue building on that knowledge base and focus on the
effects of several additional engineering and design variables on
productivity.
Abstract: Climate change and environmental pressures are
major international issues nowadays. It is time for governments,
businesses, and consumers to respond through more
environmentally friendly and aware practices, products, and policies.
This is the prime time to develop alternative sustainable construction
materials, reduce greenhouse gas emissions, save energy, look to
renewable energy sources and recycled materials, and reduce waste.
The utilization of waste materials (slag, fly ash, glass beads, plastic,
and so on) in concrete manufacturing is significant due to its
engineering, financial, environmental, and ecological benefits. Thus,
the utilization of waste materials in concrete production is very
helpful in reaching the goal of sustainable construction. Therefore,
this study intends to use glass beads in concrete production.
The paper reports on the performance of nine different concrete
mixes containing different ratios of glass crushed to 5 mm - 20 mm
maximum size and glass marbles of 20 mm size as coarse aggregate.
Ordinary Portland cement Type 1 and fine sand of less than 0.5 mm were
used to produce standard concrete cylinders. Compressive strength
tests were carried out on concrete specimens at various ages. Test
results indicated that the mix having a balanced ratio of glass beads
and round marbles possesses the maximum compressive strength,
3889 psi: glass beads perform better in bond formation but have
lower strength, whereas marbles are strong in themselves
but not good in bonding. These mixes were prepared following a
specific W/C and aggregate ratio; more strength can be expected
from different W/C and aggregate ratios and from adding admixtures
such as strength-increasing agents, ASR-inhibitor agents, etc.
Abstract: The trend in the world of Information Technology
(IT) is toward increasingly large and difficult projects rather than
smaller and easier ones. However, the data on large-scale IT project
success rates give cause for concern. This paper seeks to answer
why large-scale IT projects are different from, and more difficult than,
other typical engineering projects. Drawing on industrial
experience, a compilation of the conditions that influence failure is
presented. With a view to improving success rates, solutions are
suggested.
Abstract: The demand for taller structures is becoming imperative almost everywhere in the world, in addition to the challenges of material and labor cost, project timelines, etc. This paper presents a study conducted in view of the challenging nature of high-rise construction, for which there are no generic rules for deflection minimization and frequency control. The effects of cyclonic wind and the provision of outriggers on 28-storey, 42-storey, and 57-storey buildings are examined, and certain conclusions are drawn which would pave the way for researchers to conduct further study in this particular area of civil engineering. The results show that plan dimensions have a vital impact on structural height. Increasing the height while keeping the plan dimensions the same leads to a reduction in lateral rigidity. To achieve the required stiffness, an increase in bracing sizes as well as the introduction of an additional lateral resisting system, such as a belt truss and outriggers, is required.
Abstract: The advent of modern technology casts its repercussions on successful legacy systems, making them obsolete with time. These systems have left large organizations with major problems in terms of new business requirements, response time, financial depreciation, and maintenance. A major difficulty is the constant system evolution and the incomplete, inconsistent, and obsolete documentation that a legacy system tends to have. The myriad dimensions of these systems can only be explored through reverse engineering, which in this context is the best method to extract useful artifacts and to use those artifacts for re-engineering existing legacy systems to meet the new requirements of organizations. A case study is conducted on six different types of software systems, with source code in different programming languages, using an architectural recovery framework.
Abstract: Program slicing is the task of finding all statements in a program that directly or indirectly influence the value of a variable occurrence. The set of statements that can affect the value of a variable at some point in a program is called a program slice. In several software engineering applications, such as program debugging and measuring program cohesion and parallelism, several slices are computed at different program points. In this paper, algorithms are introduced to compute all backward and forward static slices of a computer program by traversing the program representation graph once. The program representation graph used in this paper is the Program Dependence Graph (PDG). We have conducted an experimental comparison study using 25 software modules to show the effectiveness of the introduced algorithm for computing all backward static slices over single-point slicing approaches in computing the parallelism and functional cohesion of program modules. The effectiveness of the algorithm is measured in terms of execution time and the number of traversed PDG edges. The comparison study results indicate that the introduced algorithm considerably reduces the slicing time and effort required to measure module parallelism and functional cohesion.
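For context, a backward static slice at a single criterion can be computed as reverse reachability over the dependence edges of a PDG. The sketch below shows that single-point baseline (the paper's contribution is computing all slices in one traversal rather than repeating this per criterion); the PDG encoding and node names are hypothetical.

```python
from collections import deque

def backward_slice(pdg, criterion):
    """`pdg` maps each node to the set of nodes it data- or
    control-depends on; the slice is everything reachable from the
    criterion along those dependences."""
    visited = {criterion}
    work = deque([criterion])
    while work:
        node = work.popleft()
        for dep in pdg.get(node, ()):
            if dep not in visited:
                visited.add(dep)
                work.append(dep)
    return visited
```

Computing one slice per program point this way repeats graph traversals, which is precisely the cost the single-traversal algorithm of the paper avoids.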
Abstract: Run-off is considered an important hydrological factor in feasibility studies of river engineering and irrigation-related projects under arid and semi-arid conditions. Flood control is one of the crucial factors: its management, while mitigating the destructive consequences of floods, abstracts a considerable volume of renewable water resources. The methodology applied here was based on Mizumura, applying a mathematical simple-tank model to simulate the rainfall-run-off process in a particular water basin using data from the observed hydrograph. The model was applied to the Dez River water basin adjacent to the Greater Dezful region, Iran, in order to simulate and estimate floods. Results indicated that the hydrographs calculated using the simple tank method, the SCS-CN model, and the observed hydrographs were in close proximity. It was also found that, on average, the flood times and discharge peaks of the simple tank model were closer to the observational data than those of the CN method. On the other hand, the flood volume calculated with the CN model was significantly closer to the observational data than that of the simple tank model.
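A minimal sketch of the simple-tank idea follows, assuming a single linear reservoir whose outflow is proportional to its storage; the outflow coefficient is illustrative, not a calibrated Dez basin value, and real tank models typically use several tanks and outlets.

```python
def simple_tank(rainfall, k=0.3, storage=0.0):
    """One linear tank: storage fills with rainfall each step and
    drains an outflow q = k * storage (illustrative outflow law)."""
    outflows = []
    for rain in rainfall:
        storage += rain
        q = k * storage
        storage -= q
        outflows.append(q)
    return outflows
```

Feeding an observed rainfall series through such a tank and comparing the resulting hydrograph with the observed one is the shape of the calibration exercise the abstract describes.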
Abstract: Clustering techniques have received attention in many areas, including engineering, medicine, biology, and data mining. The purpose of clustering is to group together data points which are close to one another. The K-means algorithm is one of the most widely used techniques for clustering. However, K-means has two shortcomings: it depends on the initial state and converges to local optima, and global solutions of large problems cannot be found with a reasonable amount of computational effort. Many studies have been carried out to overcome the local-optima problem in clustering. This paper presents an efficient hybrid evolutionary optimization algorithm based on combining Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO), called PSO-ACO, for optimally clustering N objects into K clusters. The new PSO-ACO algorithm is tested on several data sets, and its performance is compared with those of ACO, PSO, and K-means clustering. The simulation results show that the proposed evolutionary optimization algorithm is robust and suitable for handling data clustering.
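For reference, here is a minimal K-means sketch on 1-D points (the baseline the hybrid is compared against); its reliance on randomly chosen initial centroids is exactly the initial-state dependence that motivates the evolutionary approach.

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain K-means on 1-D points; returns sorted final centroids.
    The seed fixes the random initial centroids for reproducibility."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Recompute centroids; keep the old one if a cluster empties.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)
```

A PSO- or ACO-based clustering replaces this greedy centroid update with a population of candidate centroid sets searched globally, trading more evaluations for robustness to the initial state.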