Abstract: Information generated by various computerization processes is a potentially rich source of knowledge for its designated community. Passing this information from generation to generation without modifying its meaning is a challenging activity. To preserve and archive data for future generations, it is essential to prove the authenticity of the data. This can be achieved by extracting metadata from the data, which can prove authenticity and establish trust in the archived data. A subsequent challenge is technology obsolescence; metadata extraction and standardization can be used effectively to tackle this problem. Metadata can be broadly categorized at two levels, technical and domain. Technical metadata provides the information needed to understand and interpret a data record, but this level of metadata alone is not sufficient to establish trustworthiness. We have developed a tool that extracts and standardizes both technical and domain-level metadata. This paper describes the features of the tool and how we developed it.
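The abstract does not detail the tool's implementation. As a rough illustration of the technical-metadata level it describes (size, timestamps, format, and a fixity value that supports later authenticity checks), such fields can be collected with standard library calls; the function name and output keys here are hypothetical, not the tool's actual schema.

```python
import hashlib
import mimetypes
import os

def extract_technical_metadata(path):
    """Collect basic technical metadata for a file: size, timestamps,
    a guessed MIME type, and a SHA-256 fixity checksum."""
    stat = os.stat(path)
    with open(path, "rb") as fh:
        digest = hashlib.sha256(fh.read()).hexdigest()
    mime, _ = mimetypes.guess_type(path)
    return {
        "filename": os.path.basename(path),
        "size_bytes": stat.st_size,
        "modified_time": stat.st_mtime,
        "mime_type": mime or "application/octet-stream",
        # the checksum lets an archive later verify the record is unmodified
        "sha256": digest,
    }
```

Domain-level metadata, by contrast, depends on the designated community and cannot be derived from file-system calls alone.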
Abstract: Given the limited research on Small and Medium-sized
Enterprises' (SMEs) contribution to Corporate Social
Responsibility (CSR) and even scarcer research on Swiss SMEs, this
paper helps to fill these gaps by enabling the identification of supranational
SME parameters. Thus, the paper investigates the current
state of SME practices in Switzerland and across 15 other countries.
Combining the degree to which SMEs demonstrate an explicit (or
business case) approach or see CSR as an implicit moral activity with
the assessment of their attributes for “variety of capitalism” defines
the framework of this comparative analysis. To outline Swiss small
business CSR patterns in particular, 40 SME owner-managers were
interviewed. A secondary data analysis of studies from different
countries laid groundwork for this comparative overview of small
business CSR. The paper identifies Swiss small business CSR as
driven by norms, values, and by the aspiration to contribute to
society, thus, as an implicit part of the day-to-day business. Similar to
most Central European, Mediterranean, Nordic, and Asian countries,
explicit CSR is still very rare in Swiss SMEs. Astonishingly,
British and American SMEs also follow this pattern in spite of their strong
and distinctly liberal market economies. Though other findings show
that nationality matters, this research concludes that SME culture and
an informal CSR agenda are strongly formative, superseding even the
forces of market economies, national cultural patterns, and
language. Hence, classifications of countries by their market system,
as found in the comparative capitalism literature, do not match the
CSR practices in SMEs as they do not mirror the peculiarities of their
business. This raises questions on the universality and
generalisability of unmediated, explicit management concepts,
especially in the context of small firms.
Abstract: This paper proposes a novel heuristic algorithm that aims to determine the best size and location of distributed generators in unbalanced distribution networks. The proposed heuristic algorithm can handle planning cases in which power loss is to be optimized without violating the system's practical constraints. The distributed generation units in the proposed algorithm are modeled as voltage-controlled nodes, with the flexibility to be converted to constant-power-factor nodes in case of a reactive power limit violation. The proposed algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results obtained show the effectiveness of the proposed algorithm.
Abstract: Nowadays, cloud environments are becoming a necessity for companies, as this technology offers the opportunity to access data anywhere and at any time. It also provides optimized and secured access to resources and adds security for the data stored on the platform. However, some companies do not trust cloud providers, believing that providers can access and modify confidential data such as bank accounts. Much work has been done in this context, concluding that encryption performed by the provider ensures confidentiality; however, this overlooks the fact that cloud providers can decrypt the confidential resources. The best solution here is to apply operations on the data before sending them to the cloud provider, with the objective of making them unreadable. The principal idea is to allow users to protect their data with their own methods. In this paper, we demonstrate our approach and prove that it is more efficient, in terms of execution time, than some existing methods. This work aims at enhancing the providers' quality of service and ensuring the trust of the customers.
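The abstract does not specify which transformation the approach applies before upload. The sketch below only illustrates the principle of making data unreadable on the client side with a user-held key. The keystream construction is for illustration, not a vetted cipher; a real deployment would use an authenticated scheme such as AES-GCM, and all names here are hypothetical.

```python
import hashlib

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by hashing key||nonce||counter.
    Illustration only: production systems should use a vetted AEAD
    cipher (e.g. AES-GCM) rather than a hand-rolled construction."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def protect(data: bytes, key: bytes, nonce: bytes) -> bytes:
    """Make data unreadable before it is uploaded to the cloud provider.
    XOR with a keyed stream; the provider never sees the key."""
    return bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))

# XOR is its own inverse, so the same operation recovers the data on download.
recover = protect
```

Because the key stays with the user, the provider stores only the unreadable blob, which is the trust property the abstract argues for.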
Abstract: This paper reviews the model-based qualitative and
quantitative Operations Management research in the context of
Construction Supply Chain Management (CSCM). Construction
industry has been traditionally blamed for low productivity, cost and
time overruns, waste, high fragmentation and adversarial
relationships. The construction industry has been slower than other
industries to employ the Supply Chain Management (SCM) concept
and develop models that support the decision-making and planning.
However, over the last decade there has been a distinct shift from a
project-based to a supply-based approach to construction management.
CSCM has emerged as a promising management tool for construction
operations that improves the performance of construction projects in
terms of cost, time and quality. Modeling the Construction Supply
Chain (CSC) offers the means to reap the benefits of SCM, make
informed decisions and gain competitive advantage. Different
modeling approaches and methodologies have been applied in the
multi-disciplinary and heterogeneous research field of CSCM. The
literature review reveals that a considerable percentage of the CSC
modeling research accommodates conceptual or process models
which present general management frameworks and do not relate to
acknowledged soft Operations Research methods. We particularly
focus on the model-based quantitative research and categorize the
CSCM models depending on their scope, objectives, modeling
approach, solution methods and software used. Although over the last
few years there has clearly been an increase in research papers on
quantitative CSC models, we identify that the relevant literature is
very fragmented with limited applications of simulation,
mathematical programming and simulation-based optimization. Most
applications are project-specific or study only parts of the supply
system. Thus, some complex interdependencies within construction
are neglected and the implementation of the integrated supply chain
management is hindered. We conclude this paper by giving future
research directions and emphasizing the need to develop optimization
models for integrated CSCM. We stress that CSC modeling needs a
multi-dimensional, system-wide and long-term perspective. Finally,
prior applications of SCM to other industries have to be taken into
account in order to model CSCs, but not without translating the
generic concepts to the context of the construction industry.
Abstract: In this paper, we present a model-based regression test
suite reduction approach that uses EFSM model dependence analysis
and probability-driven greedy algorithm to reduce software regression
test suites. The approach automatically identifies the difference
between the original model and the modified model as a set of
elementary model modifications. The EFSM dependence analysis is
performed for each elementary modification to reduce the regression
test suite, and then the probability-driven greedy algorithm is adopted
to select the minimum set of test cases from the reduced regression test
suite that cover all interaction patterns. Our initial experience shows
that the approach may significantly reduce the size of regression test
suites.
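The abstract names a probability-driven greedy algorithm that selects a minimum set of test cases covering all interaction patterns. The paper's exact weighting is not given; the sketch below shows the underlying weighted greedy set-cover step with a hypothetical per-test priority score and data layout.

```python
def greedy_select(tests, patterns, weight):
    """Greedy cover: repeatedly pick the test whose weighted count of
    still-uncovered interaction patterns is largest, until every
    coverable pattern is covered.
    tests   : dict test_id -> set of interaction patterns it exercises
    weight  : dict test_id -> priority score (stands in for the
              probability-driven part, which the abstract leaves abstract)
    """
    # only patterns some test can actually cover
    uncovered = set(patterns) & set().union(*tests.values())
    selected = []
    while uncovered:
        best = max(tests, key=lambda t: weight[t] * len(tests[t] & uncovered))
        if not tests[best] & uncovered:
            break  # nothing left to gain
        selected.append(best)
        uncovered -= tests[best]
    return selected
```

Greedy set cover is not guaranteed to be minimal, but it gives the well-known logarithmic approximation, which is why greedy heuristics are standard for test-suite minimisation.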
Abstract: Ontology validation is an important part of web
applications’ development, where knowledge integration and
ontological reasoning play a fundamental role. It aims to ensure the
consistency and correctness of ontological knowledge and to
guarantee that ontological reasoning is carried out in a meaningful
way. Existing approaches to ontology validation address more or less
specific validation issues, but the overall process of validating web
ontologies has not been formally established yet. As the size and the
number of web ontologies continue to grow, more web applications’
developers will rely on the existing repository of ontologies rather
than develop ontologies from scratch. If an application utilizes
multiple independently created ontologies, their consistency must be
validated and, where necessary, adjusted to ensure proper interoperability
between them. This paper presents a validation technique intended to
test the consistency of independent ontologies utilized by a common
application.
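The validation technique itself is not detailed in the abstract. One concrete check of the kind described, detecting individuals that violate disjointness axioms only after independently created ontologies are merged, can be sketched as follows; the axiom representation and function names are simplified assumptions, not the paper's formalism.

```python
def superclasses(cls, subclass_of):
    """All ancestors of a class (including itself) under the subclass relation."""
    seen, stack = set(), [cls]
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(subclass_of.get(c, ()))
    return seen

def find_disjointness_violations(subclass_of, disjoint, instances):
    """Report individuals typed (directly or via inheritance) by two
    classes declared disjoint: an inconsistency that typically appears
    only when independently created ontologies are used together."""
    violations = []
    for ind, classes in instances.items():
        closure = set().union(*(superclasses(c, subclass_of) for c in classes))
        for a, b in disjoint:
            if a in closure and b in closure:
                violations.append((ind, a, b))
    return violations
```

Here the disjointness axiom may come from one ontology while the subclass chains come from another, which is exactly the cross-ontology case the abstract targets.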
Abstract: This paper presents an approach for the classification of
an unstructured format description for identification of file formats.
The main contribution of this work is the employment of data mining
techniques to support file format selection with just the unstructured
text description that comprises the most important format features for
a particular organisation. Subsequently, the file format identification
method employs a file format classifier and associated configurations to
support digital preservation experts with an estimation of the required
file format. Our goal is to make use of a format specification knowledge
base aggregated from different Web sources in order to select a file
format for a particular institution. Using the naive Bayes method,
the decision support system recommends a file format to an expert for
their institution. The proposed methods facilitate the selection of a
file format and improve the quality of the digital preservation process. The
presented approach is meant to facilitate decision making for the
preservation of digital content in libraries and archives using domain
expert knowledge and specifications of file formats. To facilitate
decision-making, the aggregated information about the file formats is
presented as a file format vocabulary that comprises the most common
terms that are characteristic of all researched formats. The goal is to
suggest a particular file format based on this vocabulary for analysis
by an expert. A sample file format calculation and the calculation
results, including probabilities, are presented in the evaluation section.
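The abstract names naive Bayes over unstructured format descriptions. A compact multinomial naive Bayes with Laplace smoothing, the standard form of the method, is sketched below; the format labels, descriptions, and function names are invented examples, not the paper's knowledge base.

```python
import math
from collections import Counter, defaultdict

def train(labelled_docs):
    """labelled_docs: list of (format_label, description_text).
    Returns priors, per-class word counts, and the vocabulary."""
    priors, counts, vocab = Counter(), defaultdict(Counter), set()
    for label, text in labelled_docs:
        words = text.lower().split()
        priors[label] += 1
        counts[label].update(words)
        vocab.update(words)
    return priors, counts, vocab

def recommend(description, priors, counts, vocab):
    """Score each candidate file format for a free-text requirement
    description (log-space, Laplace smoothing) and return the best."""
    words = description.lower().split()
    total = sum(priors.values())
    best, best_score = None, -math.inf
    for label in priors:
        n = sum(counts[label].values())
        score = math.log(priors[label] / total)
        for w in words:
            # +1 smoothing so unseen words do not zero out a class
            score += math.log((counts[label][w] + 1) / (n + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best
```

In the decision-support setting the scores would be shown to the expert as probabilities rather than only returning the top label.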
Abstract: Experts, enterprises and operators expect that the
bandwidth request will increase up to rates of 100 to 1,000 Mbps
within several years. Therefore, the most important question is which
technology shall satisfy the future consumer broadband demands.
Currently, the consensus is that fiber technology has the best
technical characteristics to achieve such high bandwidth rates.
However, fiber technology is still very cost-intensive and
resource-consuming. To avoid these investments, operators are
concentrating on upgrading the existing copper and hybrid fiber coax infrastructures.
This work presents a comparison of the copper and fiber
technologies, including an overview of the current German
broadband market. Both technologies are reviewed in terms of
demand, willingness to pay and economic efficiency in connection
with the technical characteristics.
Abstract: As technology-based service industries grow
rapidly worldwide, companies are recognizing the importance of
market preoccupancy and have made an effort to capture a large
market to gain the upper hand. To this end, a focus on patents can be
used to determine the properties of a technology, as well as to capture
advantages in technical skills, in comparison with the firm’s
competitors. However, technology-based services largely depend not
only on their technological value but also their economic value, due
to the recognized worth that is passed to a plurality of users. Thus, it
is important to determine whether there are any competitors in the
target areas and what services they provide in any field. Despite this
importance, little effort has been made to systematically benchmark
competitors in order to identify business opportunities. Thus, this
study aims to not only identify each position of technology-centered
service companies in complex market dynamics, but also to discover
new business opportunities. For this, we try to consider both
technology and market environments simultaneously by utilizing
patent data as a representative proxy for technology and trademark
data as an index for a firm’s target goods and services. Theoretically,
this is one of the earliest attempts to combine patent data and
trademark data to analyze corporate strategies. In practice, the
research results are expected to be used as a decision criterion to
diagnose the economic value that companies can obtain by entering
the market, as well as the technological value to be passed onto their
customers. Thus, the proposed approach can be useful to support
effective technology and business strategies in a firm.
Abstract: Biodiesel as an alternative diesel fuel is steadily gaining more attention and significance. However, there are some drawbacks to using biodiesel regarding its properties, which require it to be blended with petroleum-based diesel and/or additives to improve the fuel characteristics. This study analyses thermal cracking as an alternative technology to improve biodiesel characteristics, in which FAME-based biodiesel produced by transesterification of castor oil is fed into a continuous thermal cracking reactor at temperatures of 450-500°C and feed flow rates of 20-40 g/hr. Experiments designed by response surface methodology and subsequent statistical studies show that temperature and feed flow rate significantly affect the product yields. Response surfaces were used to study the impact of temperature and flow rate on the product properties. After each experiment, the produced crude bio-oil was distilled and the diesel cut was separated. As shorter-chain molecules are produced through thermal cracking, the distillation curve of the diesel cut fitted the petroleum-based diesel curve more closely than the biodiesel did. Moreover, the properties of the produced diesel cut fall adequately within the property ranges defined by the relevant standard for petroleum-based diesel. The cold flow properties and high heating value, the main drawbacks of biodiesel, are improved by this technology. Thermal cracking decreases the kinematic viscosity, flash point, and cetane number.
Abstract: There is little effective guidance on the selection of design parameters for springback of advanced high-strength steel sheet metal in the U-channel cold forming process. This paper presents the development of a predictive model for springback in the U-channel process on advanced high-strength steel sheet employing Response Surface Methodology (RSM). The experiments were performed on dual-phase steel sheet, DP590, in the U-channel forming process, while a design of experiments (DoE) approach was used to investigate the effects of four input factors, namely blank holder force (BHF), clearance (C), punch travel (Tp), and rolling direction (R), each at two levels, by applying a full factorial design (2⁴). An analysis of variance (ANOVA) showed that blank holder force (BHF), clearance (C), and punch travel (Tp) had a significant effect on the springback of the flange angle (β2) and wall opening angle (β1), while the rolling direction (R) factor was insignificant. The significant parameters were optimized in order to reduce the springback behavior using a Central Composite Design (CCD) in RSM, and the optimum parameters were determined. A regression model for springback was developed. The effect of the individual parameters and their responses was also evaluated. The results obtained from the optimum model are in agreement with the experimental values.
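The significance screening described above rests on the main effects of a two-level full factorial design: for each factor, the mean response at the high level minus the mean at the low level. A minimal sketch with synthetic responses follows; the coefficients are invented for illustration and are not the paper's DP590 measurements.

```python
from itertools import product

def main_effects(factors, response):
    """Main effect of each factor in a 2^k full factorial design:
    mean response at the high (+1) level minus mean at the low (-1) level.
    response maps each run tuple, e.g. (-1, 1, 1, -1), to its measured value."""
    runs = list(product([-1, 1], repeat=len(factors)))
    effects = {}
    for i, name in enumerate(factors):
        hi = [response[r] for r in runs if r[i] == 1]
        lo = [response[r] for r in runs if r[i] == -1]
        effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)
    return effects
```

A factor whose main effect is near zero relative to the experimental noise (here R, by construction) would be flagged as insignificant by the subsequent ANOVA.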
Abstract: This study addresses the concept of a Sustainable Building Environmental Model (SBEM) developed to optimize energy consumption in air conditioning and ventilation (ACV) systems without any deterioration of indoor environmental quality (IEQ). The SBEM incorporates two main components: an adaptive comfort temperature control module (ACT) and a new carbon dioxide demand control module (nDCV). These two modules take an innovative approach to maintaining satisfaction with the IEQ at optimum energy consumption, and they provide a rational basis for effective control. A total of 2133 sets of measurements of indoor air temperature (Ta), relative humidity (Rh), and carbon dioxide concentration (CO2) were collected in Hong Kong offices to investigate the potential of integrating the SBEM. A simulation was used to evaluate the dynamic performance of the energy and air conditioning system with the SBEM integrated in an air-conditioned building. It allowed a clear picture of the control strategies to be formed and the controllers to be pre-tuned before use in real systems. With the integration of the SBEM, the simulation showed savings of up to 12.3% in overall electricity consumption while maintaining the average carbon dioxide concentration below 1000 ppm and occupant dissatisfaction below 20%.
Abstract: The rapidly changing factors that affect daily life also affect the operational environment and the way military leaders fulfill their missions. With the help of technological developments, the traditional linearity of conflict and war has started to fade away. Furthermore, the mission domain has broadened to include traditional threats, hybrid threats, and the new challenges of cyber and space. Considering the future operational environment, future military leaders need to adapt themselves to the new challenges of the future battlefield. But how can one decide what leadership qualities are required to operate and accomplish missions on this new, complex battlefield? The main aim of this article is to provide answers to this question. To find the right answers, leadership and its components are first defined, and then the characteristics of the future operational environment are analyzed. Finally, the leadership qualities required to be successful on the redefined battlefield are explained.
Abstract: Surfing is an increasingly popular sport, and its performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for the detection of wave rides, computing the number of waves ridden in a surfing session, the starting time of each wave, and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that allow identifying the start and end of each wave ride. The second approach adds information from the Inertial Measurement Unit (IMU) of the smartphone to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated with similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, their start times, and durations. This paper shows that it is feasible to use smartphones for the quantification of performance metrics during surfing. In particular, the waves ridden and their duration can be accurately detected using the smartphone GPS and IMU.
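The first (GPS-only) approach described above amounts to segmenting the speed trace with separate start and end thresholds. A minimal sketch follows; the threshold values are illustrative placeholders, not the fitted values from the study.

```python
def detect_wave_rides(times, speeds, start_thr=2.0, end_thr=1.0):
    """Segment a GPS speed-over-time trace into wave rides: a ride
    starts when speed rises above start_thr (m/s) and ends when it
    falls below end_thr (hysteresis avoids flicker around one level).
    Returns a list of (start_time, duration) tuples."""
    rides, start = [], None
    for t, v in zip(times, speeds):
        if start is None and v >= start_thr:
            start = t                       # ride begins
        elif start is not None and v < end_thr:
            rides.append((start, t - start))  # ride ends
            start = None
    if start is not None:                   # ride still open at end of trace
        rides.append((start, times[-1] - start))
    return rides
```

The second approach in the paper additionally gates these segments with IMU features, which is what improved the wave counts against the video ground truth.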
Abstract: This paper presents a study of three algorithms: an
equalization algorithm to equalize the transmission channel under the
ZF and MMSE criteria, applied to the BRAN A channel, and the
adaptive filtering algorithms LMS and RLS to estimate the parameters
of the equalizer filter, i.e. to track the channel estimate and thereby
reflect the temporal variations of the channel and reduce the error in
the transmitted signal. We present the performance of the equalizer
under the ZF and MMSE criteria in the noiseless case, together with a
comparison of the performance of the LMS and RLS algorithms.
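The LMS update the abstract refers to is the standard rule w ← w + μ·e·x, where e is the error between the desired symbol and the equalizer output. The sketch below adapts an FIR equalizer to a trivially simple gain-only channel; the filter length, step size, and channel are illustrative, and the BRAN A channel model is not reproduced here.

```python
def lms_equalize(received, desired, n_taps=4, mu=0.05):
    """Adapt an FIR equalizer with the LMS rule.
    received : channel output samples
    desired  : known training symbols
    Returns the final tap weights and the per-sample squared error."""
    w = [0.0] * n_taps
    errors = []
    for n in range(len(received)):
        # most recent n_taps received samples (zero-padded at the start)
        x = [received[n - k] if n - k >= 0 else 0.0 for k in range(n_taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))  # equalizer output
        e = desired[n] - y                        # error signal
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]
        errors.append(e * e)
    return w, errors
```

RLS follows the same structure but replaces the scalar step μ with a recursively updated inverse-correlation matrix, which is what buys its faster convergence at higher cost.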
Abstract: To explore how the brain may recognise objects in its
general, accurate and energy-efficient manner, this paper proposes the
use of a neuromorphic hardware system formed from a Dynamic
Vision Sensor (DVS) silicon retina in concert with the SpiNNaker
real-time Spiking Neural Network (SNN) simulator. As a first step
in the exploration on this platform a recognition system for dynamic
hand postures is developed, enabling the study of the methods used
in the visual pathways of the brain. Inspired by the behaviours of
the primary visual cortex, Convolutional Neural Networks (CNNs)
are modelled using both linear perceptrons and spiking Leaky
Integrate-and-Fire (LIF) neurons.
In this study’s largest configuration using these approaches, a
network of 74,210 neurons and 15,216,512 synapses is created and
operated in real-time using 290 SpiNNaker processor cores in parallel
and with 93.0% accuracy. A smaller network using only 1/10th of the
resources is also created, again operating in real-time, and it is able
to recognise the postures with an accuracy of around 86.4% - only
6.6% lower than the much larger system. The recognition rate of the
smaller network developed on this neuromorphic system is sufficient
for a successful hand posture recognition system, and demonstrates
a much improved cost to performance trade-off in its approach.
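The LIF neurons mentioned above follow the standard leaky integrate-and-fire dynamics τ·dV/dt = −(V − V_rest) + R·I, with a reset on threshold crossing. A minimal Euler-integrated software sketch is given below; the parameter values are illustrative defaults, not the SpiNNaker network's configuration.

```python
def lif_simulate(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_reset=0.0, v_thresh=1.0, r_m=1.0):
    """Euler integration of a leaky integrate-and-fire neuron:
    tau * dV/dt = -(V - v_rest) + R*I.  On crossing v_thresh the
    membrane is reset and a spike time is recorded."""
    v, spikes, trace = v_rest, [], []
    for step, i_in in enumerate(input_current):
        v += dt / tau * (-(v - v_rest) + r_m * i_in)
        if v >= v_thresh:
            spikes.append(step * dt)  # record spike time
            v = v_reset
        trace.append(v)
    return spikes, trace
```

A constant supra-threshold drive produces regular firing, while a drive whose steady-state voltage stays below threshold never fires, which is the leak behaviour that distinguishes LIF units from plain integrators.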
Abstract: In this paper, we are interested in the problem of
finding similar images in a large database. For this purpose we
propose a new algorithm based on a combination of the 2-D
histogram intersection in the HSV space and statistical moments. The
proposed histogram is based on a 3x3 window and not only on the
intensity of the pixel. This approach overcomes the drawback of the
conventional 1-D histogram, which ignores the spatial distribution
of pixels in the image, while the statistical moments are used to
mitigate the effects of the discretisation of the color space which is
intrinsic to the use of histograms. We compare the performance of
our new algorithm to various methods of the state of the art and we
show that it has several advantages. It is fast, consumes little memory
and requires no learning. To validate our results, we apply this
algorithm to search for similar images in different image databases.
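The paper's descriptor is a 2-D histogram in HSV built from 3x3 windows. As a simplified single-channel sketch of the same idea, each pixel can be binned jointly with the mean of its 3x3 neighbourhood, so spatial context enters the descriptor, and images compared by histogram intersection; bin count and channel choice here are assumptions.

```python
def local_2d_histogram(img, bins=8, vmax=255):
    """2-D histogram over (pixel value, 3x3 neighbourhood mean) pairs.
    img is a list of rows of integers in [0, vmax] (a single channel,
    e.g. the V channel of HSV); border pixels are skipped."""
    h, w = len(img), len(img[0])
    hist = [[0] * bins for _ in range(bins)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            mean = sum(img[y + dy][x + dx]
                       for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
            bi = min(img[y][x] * bins // (vmax + 1), bins - 1)
            bj = min(int(mean) * bins // (vmax + 1), bins - 1)
            hist[bi][bj] += 1
    total = (h - 2) * (w - 2)
    return [[c / total for c in row] for row in hist]

def intersection(h1, h2):
    """Histogram intersection similarity in [0, 1] for normalised histograms."""
    return sum(min(a, b) for r1, r2 in zip(h1, h2) for a, b in zip(r1, r2))
```

Like the method in the paper, this needs no learning phase: descriptors are computed once per database image and compared by a cheap sum of minima.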
Abstract: Image segmentation and color identification are
important processes used in various emerging fields such as intelligent
robotics. A method is proposed for a manipulator to grasp and place
a colored object in the correct location. Existing methods, such as
PSO, have problems such as slow convergence and
convergence to a local minimum, leading to suboptimal performance.
To improve performance, we use the watershed algorithm for
segmentation and EPSO for color identification. The EPSO method is
used to reduce the probability of becoming stuck in a local minimum.
The proposed method offers the particles a more powerful global
exploration capability. EPSO can identify particles stuck in
a local minimum and can also enhance the learning speed, as
particle movement is faster.
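The abstract does not detail the EPSO enhancements, so the sketch below shows only the standard PSO baseline that EPSO extends (inertia plus cognitive and social pulls); the objective, bounds, and parameter values are illustrative.

```python
import random

def pso(objective, dim, n_particles=20, iters=200, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise `objective` with plain particle swarm optimisation.
    Each particle keeps its personal best; the swarm shares a global best."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

An enhancement of the kind the abstract describes would add a step here that detects particles whose personal best has stagnated and re-seeds or perturbs them to escape local minima.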
Abstract: The Economic Lot Scheduling Problem (ELSP) is a
valuable mathematical model that can support decision-makers to
make scheduling decisions. The basic period approach is effective for
solving the ELSP. The assumption for applying the basic period
approach is that a product must use its maximum production rate to be
produced. However, a product can lower its production rate to reduce
the average total cost when a facility has extra idle time. The past
researches discussed how a product adjusts its production rate under
the common cycle approach. To the best of our knowledge, no studies
have addressed how a product lowers its production rate under the
basic period approach. This research is the first paper to discuss this
topic. The research develops a simple fixed rate approach that adjusts
the production rate of a product under the basic period approach to
solve the ELSP. Our numerical example shows our approach can find a
better solution than the traditional basic period approach. Our
mathematical model that applies the fixed rate approach under the
basic period approach can serve as a reference for other related
research.
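The full multi-product basic period model is not given in the abstract. The single-product cost expression below only illustrates the mechanism the paper exploits: for a fixed cycle, lowering the production rate toward the demand rate shrinks the holding-cost term, so a facility with idle time can reduce its average total cost. All figures in the example are hypothetical.

```python
def avg_total_cost(demand, prod_rate, setup_cost, holding_cost, cycle):
    """Average cost per unit time for one product in a lot scheduling
    model: setup cost amortised over the cycle plus inventory holding
    cost, which scales with (1 - demand/prod_rate).  prod_rate must
    exceed demand, and d/p is the fraction of the cycle spent producing."""
    return (setup_cost / cycle
            + 0.5 * holding_cost * demand * (1 - demand / prod_rate) * cycle)
```

In the paper's setting the lowered (fixed) rate must additionally respect the basic period's capacity constraints across all products, which plain formulas like this one do not capture.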