Abstract: Construction projects can be implemented under various contractual and organizational systems. They can be divided into two groups: systems without a managing company, where the Client manages the process, and systems with a managing company, where management is entrusted to an external company. In the public sector of the Polish market there are two ways of delivering construction projects with the participation of a manager: one is to assign operations to another party, the so-called Project Supervisor, whilst the other results from the application of FIDIC conditions of contract, which entail the appointment of the Engineer. The decision is made by the Client and depends on various factors. On the public procurement market in Poland, the selection of a construction project manager boils down to awarding a contract for such a service. The selection can be done by one of eight public procurement procedures identified by the procurement law. The paper provides an analysis of 96 contracts for services awarded in 2011 which employed construction management. The study aimed to investigate the methods and criteria for selecting managers applied in practice by Polish public Clients.
Abstract: Today, modern simulation solutions in the wind turbine industry have achieved a high degree of complexity and detail in their results. Limitations arise when it is time to validate model results against measurements. For model validation it is of special interest to identify mode frequencies and to differentiate them from the various excitations. A wind turbine is a complex device, and measurements of any part of the assembly show a lot of noise. Input excitations are difficult or even impossible to measure due to the stochastic nature of the environment. Traditional techniques for frequency analysis or feature extraction are widely used to analyze wind turbine sensor signals, but they have several limitations, especially for non-stationary signals (events). A new technique based on autoregressive analysis is introduced here for a specific application; a comparison and examples related to different events in wind turbine operation are presented.
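The autoregressive idea can be illustrated in a few lines: fit AR coefficients to a sensor signal by least squares and read candidate mode frequencies off the roots of the AR polynomial. This is a generic AR sketch under assumed parameters (model order, sampling rate), not the paper's specific technique.

```python
import numpy as np

def ar_mode_frequencies(x, order, fs):
    """Fit an AR(order) model by least squares and return candidate mode
    frequencies (Hz) from the roots of the AR polynomial."""
    n = len(x)
    # Regression matrix for x[t] ~ sum_k a_k * x[t-k]
    X = np.column_stack([x[order - k - 1:n - k - 1] for k in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    # Roots of z^p - a_1 z^(p-1) - ... - a_p give the AR poles
    roots = np.roots(np.concatenate(([1.0], -a)))
    freqs = np.abs(np.angle(roots)) * fs / (2 * np.pi)
    return np.unique(np.round(freqs, 3))
```

For a clean sinusoid the AR(2) poles sit exactly on the unit circle at the signal frequency; for noisy turbine data higher orders and pole-damping inspection are needed.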
Abstract: Market-based models are frequently used for resource allocation on the computational grid. However, as the size of the grid grows, it becomes difficult for the customer to negotiate directly with all the providers. Middle agents are introduced to mediate between the providers and customers and to facilitate the resource allocation process. The most frequently deployed middle agents are matchmakers and brokers. The matchmaking agent finds possible candidate providers who can satisfy the requirements of the consumer, after which the customer negotiates directly with the candidates. Broker agents mediate the negotiation with the providers in real time.
In this paper we present a new type of middle agent, the marketmaker. Its operation is based on two parallel processes: through the investment process the marketmaker acquires resources and resource reservations in large quantities, while through the resale process it sells them to the customers. The operation of the marketmaker rests on the fact that, through its global view of the grid, it can perform a more efficient resource allocation than is possible in one-to-one negotiations between customers and providers.
We present the algorithms governing the operation of the marketmaker agent, contrasting it with the matchmaker and broker agents. Through a series of simulations in the task-oriented domain we compare the operation of the three agent types. We find that the use of the marketmaker agent leads to better performance in the allocation of large tasks and a significant reduction in messaging overhead.
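As a toy illustration of where the messaging-overhead reduction comes from, one can count messages per customer request under simple assumptions. The negotiation-round counts, candidate counts and amortization below are invented for illustration; they are not the paper's simulation parameters.

```python
# Toy message-count model contrasting the three middle-agent styles.
# All numbers are illustrative assumptions, not the paper's measurements.

def matchmaker_messages(candidates, rounds=2):
    # 1 query to the matchmaker, then direct negotiation with each candidate
    return 1 + candidates * rounds

def broker_messages(providers, rounds=2):
    # the broker negotiates with every provider on the customer's behalf
    return 1 + providers * rounds

def marketmaker_messages(requests, providers, bulk_batches=1, rounds=2):
    # bulk acquisition cost is amortized over many customer requests;
    # each resale then needs only a single customer-marketmaker exchange
    acquisition = bulk_batches * providers * rounds
    return acquisition / requests + 1
```

Under these assumptions the per-request cost of the marketmaker approaches a constant as the request volume grows, while matchmaker and broker costs scale with the number of counterparties.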
Abstract: In today's global and dynamically developing environment, strategic planning is fundamental. It is a complicated but, at the same time, important process from the point of view of continually maintaining a competitive advantage. The aim of the paper is the formulation of strategic goals for the needs of small enterprises. The Balanced Scorecard is used as a balanced system of indicators for clarifying and translating the vision into particular goals. Within its individual perspectives, the focus is on strategic goals. Finally, the IDINMOSU concept of competitiveness, which connects to the Balanced Scorecard, is mentioned.
Abstract: It has always been observed that the effectiveness of MIS as a support tool for management decisions degenerates over time after implementation, despite the substantial investments made. This is true for organizations at the initial stages of MIS implementation, whether manual or computerized. A survey of a sample of middle to top managers in business and government institutions was conducted. A large proportion indicated that the MIS had lost its impact on day-to-day operations, and that the response lag time sometimes expands indefinitely. The data indicate an infant-mortality phenomenon of the bathtub model. Reasons may include the monotonous nature of MIS delivery, irrelevance, untimeliness, and lack of adequate detail. All these reasons combine to create a degree of degeneracy. We investigate the phenomenon of MIS degeneracy that afflicts MIS systems and renders them ineffective, and model it as a bathtub model. A degeneracy index is developed to identify the status of the MIS and to suggest possible remedies to prevent the total collapse of the system to the point of uselessness.
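The bathtub model invoked above is conventionally built from Weibull hazard rates, where a shape parameter below one gives the decreasing "infant mortality" arm and a shape above one gives the increasing wear-out arm. This is a minimal sketch of that standard construction; the shape and scale parameters are illustrative assumptions, and the paper's degeneracy index is its own construction, not reproduced here.

```python
def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)^(beta-1).
    beta < 1: decreasing (infant-mortality) hazard; beta > 1: wear-out."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def bathtub_hazard(t, eta=12.0):
    # a bathtub curve as the sum of an infant-mortality term and a
    # wear-out term (illustrative parameter choices)
    return weibull_hazard(t, 0.5, eta) + weibull_hazard(t, 3.0, eta)
```

The high early hazard matches the surveyed pattern of MIS impact loss soon after implementation, before the flat "useful life" region.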
Abstract: Morphological operators transform the original image into another image through interaction with another image of a certain shape and size, known as the structuring element. Mathematical morphology provides a systematic approach to analyzing the geometric characteristics of signals or images, and has been applied widely to many applications such as edge detection, object segmentation, noise suppression and so on. Fuzzy mathematical morphology aims to extend the binary morphological operators to grey-level images. In order to define the basic morphological operations such as fuzzy erosion, dilation, opening and closing, a general method based upon fuzzy implication and inclusion grade operators is introduced. The fuzzy morphological operations extend the ordinary morphological operations by using fuzzy sets, where the union operation is replaced by a maximum operation and the intersection operation is replaced by a minimum operation.
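The max/min replacement described above can be sketched directly. This is a minimal NumPy implementation assuming membership images in [0, 1], a symmetric fuzzy structuring element, and the Kleene-Dienes implication for erosion (one of several implications the general method admits); on crisp 0/1 images it reduces to ordinary binary dilation and erosion.

```python
import numpy as np

def fuzzy_dilate(img, se):
    """Fuzzy dilation: (A + B)(x) = max_y min(A(x - y), B(y)).
    img and se hold membership values in [0, 1]."""
    h, w = se.shape
    ph, pw = h // 2, w // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), constant_values=0.0)
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = padded[i:i + h, j:j + w]
            out[i, j] = np.max(np.minimum(window, se))
    return out

def fuzzy_erode(img, se):
    """Fuzzy erosion with the Kleene-Dienes implication:
    (A - B)(x) = min_y max(A(x + y), 1 - B(y))."""
    h, w = se.shape
    ph, pw = h // 2, w // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), constant_values=1.0)
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = padded[i:i + h, j:j + w]
            out[i, j] = np.min(np.maximum(window, 1.0 - se))
    return out
```

Fuzzy opening is then erosion followed by dilation, and closing is dilation followed by erosion, exactly as in the binary case.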
This work consists of two parts. In the first, fuzzy set theory, fuzzy mathematical morphology (which is based on fuzzy logic and fuzzy set theory), and fuzzy morphological operations and their properties are studied in detail. In the second part, the application of fuzziness in mathematical morphology to practical work such as image processing is discussed with illustrative problems.
Abstract: Helical milling operations are used to generate or enlarge boreholes by means of a milling tool. The bore diameter can be adjusted through the diameter of the helical path. The kinematics of helical milling on a three-axis machine tool is analysed first. The relationships between the processing parameters and cutting tool geometry on the one hand and the machined hole features on the other are formulated. The feed motion of the cutting tool is decomposed into a planar circular feed and an axial linear motion. In this paper, the time-varying cutting forces acting on the side cutting edges and end cutting edges of the flat-end cylindrical mill are analysed separately using a discrete method. These two components are then combined to produce a cutting force model accounting for the complicated interaction between the cutter and the workpiece. The time-varying cutting force model describes the instantaneous cutting force during processing. This model can be used to predict cutting forces, to calculate the static deflection of the cutter and workpiece, and also as the foundation of a dynamics model for predicting the chatter limit of helical milling operations.
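The discrete-summation idea for the side cutting edges can be sketched with the classic mechanistic chip-thickness law h(phi) = fz * sin(phi). The cutting coefficients, the half-revolution engagement window and the omission of the end-edge and axial-feed contributions are all simplifying assumptions for illustration, not the paper's identified model.

```python
import math

def side_edge_forces(phi_spindle, n_teeth, fz, ap, ktc, krc):
    """Sum tangential/radial forces over the engaged side cutting edges
    at spindle angle phi_spindle, using h(phi) = fz * sin(phi).
    ktc, krc (N/mm^2) are illustrative cutting coefficients."""
    ft = fr = 0.0
    for tooth in range(n_teeth):
        phi = (phi_spindle + 2 * math.pi * tooth / n_teeth) % (2 * math.pi)
        if 0.0 < phi < math.pi:           # assumed engagement window
            h = fz * math.sin(phi)        # instantaneous chip thickness
            ft += ktc * h * ap            # tangential component
            fr += krc * h * ap            # radial component
    return ft, fr
```

Evaluating these sums over one spindle revolution yields the time-varying force signature that a static-deflection or chatter analysis would consume.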
Abstract: In this paper, we propose texture feature-based language identification using wavelet-domain BDIP (block difference of inverse probabilities) and BVLC (block variance of local correlation coefficients) features and an FFT (fast Fourier transform) feature. In the proposed method, wavelet subbands are first obtained by wavelet transform from a test image and denoised by Donoho's soft-thresholding. The BDIP and BVLC operators are next applied to the wavelet subbands. FFT blocks are also obtained by 2D (two-dimensional) FFT from the blocks into which the test image is partitioned. Some significant FFT coefficients in each block are selected and the magnitude operator is applied to them. Moments for each subband of BDIP and BVLC and for each magnitude of significant FFT coefficients are then computed and fused into a feature vector. In classification, a stabilized Bayesian classifier, which adopts variance thresholding, searches for the training feature vector most similar to the test feature vector. Experimental results show that the proposed method with the three operators yields excellent language identification even with a rather low feature dimension.
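As an illustration of the first stage, one common formulation of the BDIP operator and the moment fusion can be sketched as follows. The block size and normalization are assumptions; BVLC, the wavelet transform, the FFT feature and the stabilized Bayesian classifier are omitted.

```python
import numpy as np

def bdip_map(img, bsize=4, eps=1e-6):
    """Block Difference of Inverse Probabilities, in one common form:
    BDIP = M^2 - sum(block) / max(block) per non-overlapping M x M
    block, normalized to [0, 1] by M^2."""
    h, w = img.shape
    out = []
    for i in range(0, h - bsize + 1, bsize):
        row = []
        for j in range(0, w - bsize + 1, bsize):
            block = img[i:i + bsize, j:j + bsize].astype(float)
            row.append((bsize**2 - block.sum() / (block.max() + eps))
                       / bsize**2)
        out.append(row)
    return np.array(out)

def moment_features(resp):
    """First two moments of an operator response map, the kind of
    statistics fused into the final feature vector."""
    return np.array([resp.mean(), resp.std()])
```

Flat blocks give BDIP near 0 while high local contrast pushes it toward 1, which is why the operator responds to texture variation.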
Abstract: The product development process (PDP) in the Technology group plays a very important role in the launch of any product. While a manufacturing process encourages the use of certain measures to reduce health, safety and environmental (HSE) risks on the shop floor, the PDP concentrates on the use of Geometric Dimensioning and Tolerancing (GD&T) to develop a flawless design. Furthermore, the PDP distributes and coordinates activities between different departments such as marketing, purchasing, and
manufacturing. However, it is seldom realized that the PDP makes a significant contribution to developing a product that reduces HSE risks by encouraging the Technology group to use effective GD&T. GD&T is a precise communication tool that uses a set of symbols, rules, and definitions to mathematically define the parts to be manufactured. It is a quality assurance method widely used in the oil and gas sector. Traditionally it is used to ensure the interchangeability of a part without affecting its form, fit, and function. Parts that do not meet these requirements are rejected during quality audits.
This paper discusses how the Technology group integrates this quality assurance tool into the PDP and how the tool plays a major role in helping the HSE department in its goal of eliminating HSE incidents. The PDP involves a thorough risk assessment and establishes a method to address those risks during the design stage. An illustration shows how GD&T helped reduce safety risks by ergonomically improving assembly operations. A brief discussion explains how the tolerances provided on a part help prevent finger injury. This tool has equipped the Technology group to produce fixtures, which are used daily in operations as well as in manufacturing. By applying GD&T to create good fits, HSE risks are mitigated for operating personnel. Both customers and service providers benefit from reduced safety risks.
Abstract: The paper addresses a problem of optimal staffing in an open shop environment. The problem is to determine the optimal number of operators serving a given number of machines to fulfill a number of independent operations while minimizing staff idle time. Using a Gantt chart presentation, the problem is modeled as a two-dimensional cutting stock problem. A mixed-integer programming model is used to obtain the minimal job processing time (makespan) for a fixed number of machine operators. An algorithm for optimal open-shop staffing is developed based on iterative solving of the formulated optimization task. The execution of the developed algorithm provides the optimal number of machine operators in the sense of minimum staff idle time, and the optimal makespan for that number of operators. The proposed algorithm is tested numerically on a real-life staffing problem. The testing results show its practical applicability to similar open-shop staffing problems.
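The iterative structure of such an algorithm, i.e. resolving a makespan subproblem for each candidate operator count, can be sketched with a greedy longest-processing-time (LPT) bound standing in for the paper's MIP. The deadline-based stopping rule and the idle-time formula are illustrative assumptions.

```python
import heapq

def lpt_makespan(durations, k):
    """Longest-Processing-Time list scheduling: a fast stand-in for
    the MIP when estimating the makespan achievable with k operators."""
    loads = [0.0] * k
    heapq.heapify(loads)
    for d in sorted(durations, reverse=True):
        heapq.heappush(loads, heapq.heappop(loads) + d)
    return max(loads)

def staff_for_deadline(durations, deadline):
    """Smallest operator count whose (approximate) makespan meets the
    deadline, together with the resulting total staff idle time."""
    total = sum(durations)
    for k in range(1, len(durations) + 1):
        ms = lpt_makespan(durations, k)
        if ms <= deadline:
            return k, k * ms - total  # idle = paid time minus work content
    return len(durations), 0.0
```

Replacing `lpt_makespan` with an exact MIP solve recovers the structure described in the abstract: iterate over operator counts, solve for makespan, stop at the count with acceptable idle time.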
Abstract: The machining of Carbon Fiber Reinforced Plastics has come to constitute a significant challenge for many fields of industry. The resulting surface finish of machined parts is of primary concern for several reasons, including contact quality and impact on the assembly. Therefore, the characterization and prediction of roughness based on machining parameters are crucial for cost-effective operations. In this study, a PCD tool comprising two straight flutes was used to trim 32-ply carbon fiber laminates in order to analyze the effects of the feed rate and the cutting speed on the surface roughness. The results show that while the cutting speed has only a slight impact on the surface finish, the feed rate affects it strongly. A detailed study was also conducted on the effect of fiber orientation on surface roughness for the quasi-isotropic laminates used in aerospace. The resulting roughness profiles for the four ply orientations of the lay-up were compared, and it was found that the fiber angle is a critical parameter for surface roughness. One of the four orientations studied led to very poor surface finishes, and characteristic roughness profiles were identified and found to relate only to the ply orientations of multilayer carbon fiber laminates.
Abstract: A new and highly efficient architecture for elliptic curve scalar point multiplication, optimized for a binary field recommended by NIST and well-suited for elliptic curve cryptographic (ECC) applications, is presented. To achieve the maximum architectural and timing improvements we have reorganized and reordered the critical path of the Lopez-Dahab scalar point multiplication architecture such that logic structures are implemented in parallel and operations in the critical path are diverted to non-critical paths. With G=41, the proposed design is capable of performing a scalar point multiplication over the binary extension field of degree 163 in 11.92 µs at the maximum achievable frequency of 251 MHz on a Xilinx Virtex-4 (XC4VLX200), while occupying 22% of the chip area, where G is the digit size of the underlying digit-serial finite field multiplier.
Abstract: As the world changes ever more rapidly, the demand for updated information for resource management, environmental monitoring and planning is increasing exponentially. Integration of remote sensing with GIS technology can significantly improve our ability to address these concerns. This paper presents an alternative way of updating GIS applications using image processing and high-resolution images. We show a method of high-resolution image segmentation using graphs and morphological operations, in which a preprocessing step (a watershed operation) is required. A morphological process is then applied using the opening and closing operations. After this segmentation we can extract significant cartographic elements such as urban areas, streets or green areas. The result of this segmentation and extraction is then used to update GIS applications. Some examples are shown using aerial photography.
Abstract: Due to their high power-to-weight ratio and low cost, pneumatic actuators are attractive for robotics and automation applications; however, achieving fast and accurate control of their position has long been known to be a complex control problem. The paper presents a methodology for obtaining controllers that achieve high position accuracy and preserve the closed-loop characteristics over a broad operating range. Experimentation with a number of conventional (or "classical") three-term controllers shows that, as repeated operations accumulate, the characteristics of the pneumatic actuator change, requiring frequent re-tuning of the controller parameters (PID gains). Furthermore, three-term controllers are found to perform poorly in recovering the closed-loop system after the application of load or other external disturbances. The key reason for these problems lies in the non-linear exchange of energy inside the cylinder, relating in particular to the complex friction forces that develop on the piston-wall interface. In order to overcome this problem while remaining within the boundaries of classical control methods, we designed an auto-selective classical controller so that the system performance would benefit from all three control gains (Kp, Kd, Ki) according to the system requirements and the characteristics of each type of controller. This challenging experimentation aimed at consistent performance in the face of modelling imprecision and disturbances. In the work presented, a selective PID controller is presented for an experimental rig comprising an air cylinder driven by a variable-opening pneumatic valve and equipped with position and pressure sensors. The paper reports on tests carried out to investigate the capability of this specific controller to achieve consistent control performance under repeated operations and other changes in operating conditions.
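A minimal sketch of a gain-selective three-term controller follows. The switching rule (on error magnitude) and the two gain sets are illustrative assumptions, not the selection logic or tuning reported in the paper.

```python
class SelectivePID:
    """Sketch of an auto-selective three-term controller: the active
    (Kp, Ki, Kd) set is switched by the operating regime, here simply
    by error magnitude as an assumed switching rule."""
    def __init__(self, gains_far, gains_near, threshold):
        self.gains_far = gains_far    # (Kp, Ki, Kd) for large errors
        self.gains_near = gains_near  # (Kp, Ki, Kd) near the setpoint
        self.threshold = threshold
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        kp, ki, kd = (self.gains_far if abs(error) > self.threshold
                      else self.gains_near)
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return kp * error + ki * self.integral + kd * derivative
```

A real pneumatic rig would add integrator anti-windup and bumpless transfer between gain sets; those refinements are omitted to keep the switching idea visible.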
Abstract: In modern manufacturing systems, the use of thermal cutting techniques using oxyfuel, plasma and laser has become indispensable for the shape forming of high-quality complex components; however, conventional chip removal production techniques still have a widespread place in the manufacturing industry. Both these types of machining operations require the positioning of the end-effector tool at the edge where the cutting process commences. This repositioning of the cutting tool is repeated several times in every machining operation and is termed non-productive time or airtime motion. Minimization of this non-productive machining time plays an important role in mass production with high-speed machining. As the tool moves from one region to another by rapid movement and visits a particular region only once in the whole operation, the non-productive time can be minimized by synchronizing the tool movements. In this work, this problem is formulated as a general travelling salesman problem (TSP) and a genetic algorithm (GA) approach is applied to solve it. To improve the efficiency of the algorithm, the GA has been hybridized with a novel special heuristic and simulated annealing (SA). In the present work a novel heuristic in combination with the GA has been developed for the synchronization of toolpath movements during repositioning of the tool. A comparative analysis of the new metaheuristic techniques against the simple genetic algorithm has been performed. The proposed metaheuristic approach shows better performance than the simple genetic algorithm for minimization of non-productive toolpath length. The results obtained with the hybrid simulated annealing genetic algorithm (HSAGA) are also found to be better than those obtained using the simple genetic algorithm alone.
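The plain-GA baseline that the hybrid is compared against can be sketched in a few lines: order crossover, swap mutation and elitism applied to the airtime-minimization TSP. The operator choices and parameters are generic illustrative defaults, and the paper's special heuristic and SA hybridization are not reproduced here.

```python
import math, random

def tour_length(points, order):
    # closed rapid-traverse path visiting every cut start point once
    return sum(math.dist(points[order[i]],
                         points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def ga_toolpath(points, pop=40, gens=120, seed=0):
    """Plain GA (order crossover + swap mutation + elitism) for the
    airtime-minimization TSP; a stand-in baseline, not the HSAGA."""
    rng = random.Random(seed)
    n = len(points)
    population = [rng.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda o: tour_length(points, o))
        next_gen = population[:4]                       # elitism
        while len(next_gen) < pop:
            a, b = rng.sample(population[:pop // 2], 2)
            i, j = sorted(rng.sample(range(n), 2))
            # order crossover: keep a slice of parent a, fill from b
            child = a[i:j] + [c for c in b if c not in a[i:j]]
            if rng.random() < 0.3:                      # swap mutation
                p, q = rng.sample(range(n), 2)
                child[p], child[q] = child[q], child[p]
            next_gen.append(child)
        population = next_gen
    return min(population, key=lambda o: tour_length(points, o))
```

An SA hybridization would replace the unconditional acceptance of children with a temperature-controlled acceptance test, which is one way to escape the local optima that trap the simple GA.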
Abstract: With the widespread growth of applications of Wireless Sensor Networks (WSNs), the need for reliable security mechanisms in these networks has increased manifold. Many security solutions have been proposed in the domain of WSNs so far. These solutions are usually based on well-known cryptographic algorithms.
In this paper, we survey well-known security issues in WSNs and study the behavior of WSN nodes that perform public key cryptographic operations. We evaluate the time and power consumption of a public key cryptography algorithm for signature and key management by simulation.
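The kind of measurement involved can be illustrated with textbook RSA on toy primes: time the signing operation, then feed the average per-operation time into an energy model (energy is approximately power multiplied by time). The toy key below (p=61, q=53) is for illustration only; real evaluations use proper key sizes and production crypto libraries.

```python
import time

# Textbook RSA with tiny toy primes, purely to illustrate measurement.
N, E, D = 3233, 17, 2753  # n = 61*53, e*d = 1 (mod phi(n))

def sign(digest):
    # signing = modular exponentiation with the private exponent
    return pow(digest, D, N)

def verify(digest, sig):
    # verification uses the (cheaper) public exponent
    return pow(sig, E, N) == digest % N

def timed(fn, *args, reps=1000):
    """Average wall-clock cost of one operation; a node energy model
    would multiply this by the platform's active power draw."""
    t0 = time.perf_counter()
    for _ in range(reps):
        fn(*args)
    return (time.perf_counter() - t0) / reps
```

On real sensor-node hardware the asymmetry is stark: verification with a small public exponent is far cheaper than signing, which shapes where each operation is placed in a WSN protocol.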
Abstract: Cognitive science appeared about 40 years ago, subsequent to the challenge of artificial intelligence, as common territory for several scientific disciplines: IT, mathematics, psychology, neurology, philosophy, sociology, and linguistics. The newborn science was justified by the complexity of the problems related to human knowledge on the one hand, and on the other by the fact that none of the above-mentioned sciences could explain mental phenomena alone. Based on the data supplied by experimental sciences such as psychology or neurology, models of the operation of the human mind are built in cognitive science. These models are implemented in computer programs and/or electronic circuits (specific to artificial intelligence) – cognitive systems – whose competences and performances are compared to human ones, leading to the reinterpretation of psychological and neurological data and to the construction of new models. In these processes, psychology provides the experimental basis, while philosophy and mathematics provide the level of abstraction that is utterly necessary for the mediation between the mentioned sciences.
The general problematic of the cognitive approach comprises two important types of approach: the computational one, starting from the idea that mental phenomena can be reduced to calculus operations on 1s and 0s, and the connectionist one, which considers the products of thinking to be a result of the interaction between all the component (included) systems. In the field of psychology, measurements in the computational register use classical questionnaires and psychometric tests, generally based on calculus methods. Considering matters from both sides of cognitive science, we can notice a gap in the possibilities for measuring psychological products from the connectionist perspective, which requires a unitary understanding of the quality-quantity whole. In such an approach, measurement by calculus proves inefficient. Our research, carried out for more than 20 years, leads to the conclusion that measurement by forms properly fits the laws and principles of connectionism.
Abstract: In this paper, we propose a method to extract road signs. First, the grabbed image is converted into the HSV color space to detect the road signs. Second, morphological operations are used to reduce noise. Finally, the road sign is extracted using its geometric properties. The feature extraction of the road sign is done using color information. The proposed method has been tested in real situations. The experimental results show that the proposed method can extract road sign features effectively.
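The first two stages, HSV color thresholding followed by morphological noise reduction, can be sketched as below. The hue/saturation thresholds target red signs and are illustrative assumptions, as is the choice of a 3x3 opening (erosion then dilation) for cleanup.

```python
import colorsys
import numpy as np

def red_sign_mask(rgb):
    """Hypothetical red-sign detector: convert to HSV, threshold the
    red hue band with high saturation, then clean the binary mask with
    a 3x3 opening (erosion followed by dilation) to reduce noise."""
    h, w, _ = rgb.shape
    mask = np.zeros((h, w), dtype=bool)
    for i in range(h):
        for j in range(w):
            hue, sat, val = colorsys.rgb_to_hsv(*(rgb[i, j] / 255.0))
            mask[i, j] = ((hue < 0.05 or hue > 0.95)
                          and sat > 0.5 and val > 0.2)
    return _dilate(_erode(mask))

def _erode(m):
    out = np.zeros_like(m)
    out[1:-1, 1:-1] = (m[:-2, :-2] & m[:-2, 1:-1] & m[:-2, 2:] &
                       m[1:-1, :-2] & m[1:-1, 1:-1] & m[1:-1, 2:] &
                       m[2:, :-2] & m[2:, 1:-1] & m[2:, 2:])
    return out

def _dilate(m):
    out = np.zeros_like(m)
    out[1:-1, 1:-1] = (m[:-2, :-2] | m[:-2, 1:-1] | m[:-2, 2:] |
                       m[1:-1, :-2] | m[1:-1, 1:-1] | m[1:-1, 2:] |
                       m[2:, :-2] | m[2:, 1:-1] | m[2:, 2:])
    return out
```

The surviving connected regions would then be filtered by geometric properties (aspect ratio, circularity) as the abstract's final stage.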
Abstract: Crime is a major societal problem for most of the world's nations. Consequently, the police need to develop new methods to improve their efficiency in dealing with ever-increasing crime rates. Two of the common difficulties that the police face in crime control are crime investigation and the provision of crime information to the general public to help them protect themselves. Crime control in police operations involves the use of spatial data, crime data and related crime data from different organizations (depending on the nature of the analysis to be made). These types of data are collected from several heterogeneous sources in different formats and from different platforms, resulting in a lack of standardization. Moreover, there is no standard framework for crime data collection, integration and dissemination through mobile devices. An investigation into the current situation in crime control was carried out to identify what is needed to resolve these issues. This paper proposes and investigates the use of service-oriented architecture (SOA) and mobile spatial information services in crime control. SOA plays an important role in crime control as an appropriate way to support data exchange and model sharing from heterogeneous sources. Crime control also needs to facilitate mobile spatial information services in order to exchange, receive, share and release location-based information to mobile users anytime and anywhere.
Abstract: Air pollution is a major environmental health problem, affecting developed and developing countries around the world. Increasing amounts of potentially harmful gases and particulate matter are being emitted into the atmosphere on a global scale, resulting in damage to human health and the environment. Petroleum-related air pollutants can have a wide variety of adverse environmental impacts. In the crude oil production sector, there is a strong need for thorough knowledge of the gaseous emissions resulting from the flaring of associated gas of known composition, on a daily basis, through combustion activities under various operating conditions. This can help in the control of gaseous emissions from flares and thus in the protection of their immediate and distant surroundings against environmental degradation.
The impacts of methane and non-methane hydrocarbon emissions from flaring activities at oil production facilities in the Kuwait Oilfields have been assessed through a screening study using records of flaring operations taken at the gas and oil production sites, and by analyzing available meteorological and air quality data measured at stations located near anthropogenic sources. In the present study, the Industrial Source Complex (ISCST3) dispersion model is used to calculate the ground-level concentrations of methane and non-methane hydrocarbons emitted due to flaring throughout the Kuwait Oilfields.
The simulation of real hourly air quality in and around oil production facilities in the State of Kuwait for the year 2006, obtained by inserting the respective source emission data into the ISCST3 software, indicates that the levels of non-methane hydrocarbons from the flaring activities exceed the allowable ambient air standard set by the Kuwait EPA. There is therefore a strong need to address this acute problem and to minimize the impact of methane and non-methane hydrocarbons released from flaring activities over the urban areas of Kuwait.
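The core calculation behind such a dispersion run is the Gaussian plume equation for ground-level concentration. The sketch below uses the Briggs rural stability-D sigma curves as an assumed dispersion parameterization; ISCST3 itself implements a far more complete formulation (rural/urban dispersion, plume rise, building downwash, terrain and deposition), none of which is reproduced here.

```python
import math

def plume_glc(q, u, x, y, h_eff):
    """Ground-level concentration from a simplified Gaussian plume.
    q: emission rate (g/s), u: wind speed (m/s), x: downwind and
    y: crosswind distance (m), h_eff: effective stack height (m).
    Sigma curves: Briggs rural fits for neutral (D) stability,
    assumed here for illustration only."""
    sig_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)
    sig_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)
    return (q / (math.pi * u * sig_y * sig_z)
            * math.exp(-y**2 / (2 * sig_y**2))
            * math.exp(-h_eff**2 / (2 * sig_z**2)))
```

Summing this over every flare source and every modeled hour of meteorology is, in essence, what produces the hourly concentration fields compared against the ambient standard.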