Abstract: This paper proposes a novel heuristic algorithm to determine the optimal size and location of distributed generators in unbalanced distribution networks. The proposed algorithm handles planning cases in which power loss is to be minimized without violating practical system constraints. Each distributed generation unit is modeled as a voltage-controlled node, with the flexibility to be converted to a constant-power-factor node if its reactive power limit is violated. The algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results obtained demonstrate its effectiveness.
Abstract: Nowadays, cloud environments are becoming a necessity for companies: this technology offers access to data anywhere and anytime. It also provides optimized and secured access to resources and additional security for the data stored on the platform. However, some companies do not trust cloud providers, fearing that providers can access and modify confidential data such as bank accounts. Much work has been done in this context, concluding that encryption performed by the provider ensures confidentiality, but overlooking the fact that the cloud provider itself can decrypt the confidential resources. The better solution is to apply operations to the data before sending them to the cloud provider, so as to make them unreadable to it. The principal idea is to let users protect their data with their own methods. In this paper, we present our approach and show that it is more efficient in terms of execution time than some existing methods. This work aims at enhancing providers' quality of service and ensuring customer trust.
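A minimal sketch of the client-side idea: transform the data before upload so the provider only ever stores unreadable bytes. This is a toy XOR-stream construction for illustration only, not the paper's actual method and not cryptographically secure; a real deployment would use a vetted cipher such as AES.

```python
import hashlib
from itertools import count

def keystream(key, nonce):
    """Derive a pseudo-random byte stream from key+nonce (toy CTR-like construction)."""
    for counter in count():
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        yield from block

def xor_cipher(data, key, nonce):
    """Encrypt or decrypt by XOR with the keystream (the operation is symmetric)."""
    return bytes(b ^ k for b, k in zip(data, keystream(key, nonce)))

secret = b"IBAN: XX00 1234 5678"          # confidential data, e.g. a bank account
key, nonce = b"user-held-key", b"unique-nonce"
ciphertext = xor_cipher(secret, key, nonce)  # this is what the provider stores
recovered = xor_cipher(ciphertext, key, nonce)  # only the key holder can do this
```

Since the key never leaves the user, the provider cannot decrypt what it stores, which is exactly the trust property the approach targets.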
Abstract: This paper reviews the model-based qualitative and
quantitative Operations Management research in the context of
Construction Supply Chain Management (CSCM). The construction
industry has traditionally been blamed for low productivity, cost and
time overruns, waste, high fragmentation, and adversarial
relationships. The construction industry has been slower than other
industries to employ the Supply Chain Management (SCM) concept
and develop models that support the decision-making and planning.
However, over the last decade there has been a distinct shift from a
project-based to a supply-based approach to construction management.
CSCM has emerged as a promising new management tool for construction
operations and improves the performance of construction projects in
terms of cost, time and quality. Modeling the Construction Supply
Chain (CSC) offers the means to reap the benefits of SCM, make
informed decisions and gain competitive advantage. Different
modeling approaches and methodologies have been applied in the
multi-disciplinary and heterogeneous research field of CSCM. The
literature review reveals that a considerable percentage of the CSC
modeling research accommodates conceptual or process models
which present general management frameworks and do not relate to
acknowledged soft Operations Research methods. We particularly
focus on the model-based quantitative research and categorize the
CSCM models depending on their scope, objectives, modeling
approach, solution methods and software used. Although over the last
few years there has clearly been an increase in research papers on
quantitative CSC models, we find that the relevant literature is
very fragmented with limited applications of simulation,
mathematical programming and simulation-based optimization. Most
applications are project-specific or study only parts of the supply
system. Thus, some complex interdependencies within construction
are neglected and the implementation of the integrated supply chain
management is hindered. We conclude this paper by giving future
research directions and emphasizing the need to develop optimization
models for integrated CSCM. We stress that CSC modeling needs a
multi-dimensional, system-wide and long-term perspective. Finally,
prior applications of SCM to other industries have to be taken into
account in order to model CSCs, but not without translating the
generic concepts to the context of construction industry.
Abstract: In this paper, we present a model-based regression
test-suite reduction approach that uses EFSM model dependence analysis
and a probability-driven greedy algorithm to reduce software regression
test suites. The approach automatically identifies the difference
between the original model and the modified model as a set of
elementary model modifications. The EFSM dependence analysis is
performed for each elementary modification to reduce the regression
test suite, and then the probability-driven greedy algorithm is adopted
to select the minimum set of test cases from the reduced regression test
suite that cover all interaction patterns. Our initial experience shows
that the approach may significantly reduce the size of regression test
suites.
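The final selection step can be illustrated with a plain greedy set-cover sketch: repeatedly pick the test case covering the most still-uncovered interaction patterns. The paper's probability-driven variant additionally weights the choice, but the covering logic is the same; the test names and pattern sets below are hypothetical.

```python
def greedy_select(test_cases, patterns):
    """Greedy set cover: pick the test covering the most uncovered patterns
    until every pattern is covered (or no test adds coverage)."""
    selected, uncovered = [], set(patterns)
    while uncovered:
        best = max(test_cases, key=lambda t: len(test_cases[t] & uncovered))
        if not test_cases[best] & uncovered:
            break  # remaining patterns cannot be covered by any test
        selected.append(best)
        uncovered -= test_cases[best]
    return selected

# Hypothetical reduced suite: test case -> interaction patterns it exercises
suite = {"t1": {"p1", "p2"}, "t2": {"p2", "p3", "p4"}, "t3": {"p4"}}
reduced = greedy_select(suite, {"p1", "p2", "p3", "p4"})  # ['t2', 't1']
```

Greedy set cover is not guaranteed optimal, but it gives a small covering subset quickly, which is the trade-off such reduction approaches accept.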
Abstract: Ontology validation is an important part of web
applications’ development, where knowledge integration and
ontological reasoning play a fundamental role. It aims to ensure the
consistency and correctness of ontological knowledge and to
guarantee that ontological reasoning is carried out in a meaningful
way. Existing approaches to ontology validation address more or less
specific validation issues, but the overall process of validating web
ontologies has not been formally established yet. As the size and the
number of web ontologies continue to grow, more web applications’
developers will rely on the existing repository of ontologies rather
than develop ontologies from scratch. If an application utilizes
multiple independently created ontologies, their consistency must be
validated and, if necessary, adjusted to ensure proper interoperability
between them. This paper presents a validation technique intended to
test the consistency of independent ontologies utilized by a common
application.
Abstract: This paper presents an approach for the classification of
an unstructured format description for identification of file formats.
The main contribution of this work is the employment of data mining
techniques to support file format selection with just the unstructured
text description that comprises the most important format features for
a particular organisation. Subsequently, the file format identification
method employs a file format classifier and associated configurations to
support digital preservation experts with an estimation of the required
file format. Our goal is to make use of a format specification knowledge
base aggregated from different Web sources in order to select a file
format for a particular institution. Using the naive Bayes method,
the decision support system recommends a file format to an expert for
their institution. The proposed methods improve the selection of a
file format and the quality of the digital preservation process. The
presented approach is meant to facilitate decision making for the
preservation of digital content in libraries and archives using domain
expert knowledge and specifications of file formats. To facilitate
decision-making, the aggregated information about the file formats is
presented as a file format vocabulary that comprises the most common
terms that are characteristic for all researched formats. The goal is to
suggest a particular file format based on this vocabulary for analysis
by an expert. The sample file format calculation and the calculation
results including probabilities are presented in the evaluation section.
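A hand-rolled sketch of the naive Bayes recommendation step over a format vocabulary. The two-format knowledge base, its terms and the uniform prior are illustrative assumptions, not the paper's data.

```python
import math
from collections import Counter

# Hypothetical knowledge base: format label -> terms from its specification text
training = {
    "PDF/A": ["archival", "embedded", "fonts", "metadata", "archival"],
    "TIFF":  ["raster", "lossless", "scanning", "archival"],
}

def posteriors(query_terms, training, alpha=1.0):
    """Multinomial naive Bayes with Laplace smoothing and a uniform prior."""
    vocab = {t for terms in training.values() for t in terms}
    scores = {}
    for label, terms in training.items():
        counts = Counter(terms)
        log_p = math.log(1 / len(training))  # uniform prior over formats
        for t in query_terms:
            log_p += math.log((counts[t] + alpha) / (len(terms) + alpha * len(vocab)))
        scores[label] = log_p
    z = sum(math.exp(s) for s in scores.values())
    return {k: math.exp(s) / z for k, s in scores.items()}  # normalised probabilities

# Unstructured description reduced to vocabulary terms by the expert
probs = posteriors(["archival", "lossless"], training)
best = max(probs, key=probs.get)
```

The decision support system would present `probs` to the expert rather than deciding autonomously, matching the paper's human-in-the-loop intent.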
Abstract: As technology-based service industries grow
rapidly worldwide, companies are recognizing the importance of
early market entry and have made an effort to capture a large
market to gain the upper hand. To this end, a focus on patents can be
used to determine the properties of a technology, as well as to capture
advantages in technical skills, in comparison with the firm’s
competitors. However, technology-based services largely depend not
only on their technological value but also their economic value, due
to the recognized worth that is passed to a plurality of users. Thus, it
is important to determine whether there are any competitors in the
target areas and what services they provide in any field. Despite this
importance, little effort has been made to systematically benchmark
competitors in order to identify business opportunities. Thus, this
study aims to not only identify each position of technology-centered
service companies in complex market dynamics, but also to discover
new business opportunities. For this, we try to consider both
technology and market environments simultaneously by utilizing
patent data as a representative proxy for technology and trademark
data as an index for a firm’s target goods and services. Theoretically,
this is one of the earliest attempts to combine patent data and
trademark data to analyze corporate strategies. In practice, the
research results are expected to be used as a decision criterion to
diagnose the economic value that companies can obtain by entering
the market, as well as the technological value to be passed onto their
customers. Thus, the proposed approach can be useful to support
effective technology and business strategies in a firm.
Abstract: Biodiesel as an alternative diesel fuel is steadily gaining attention and significance. However, biodiesel has some drawbacks regarding its properties that require it to be blended with petroleum-based diesel and/or additives to improve its fuel characteristics. This study analyses thermal cracking as an alternative technology to improve biodiesel characteristics, in which FAME-based biodiesel produced by transesterification of castor oil is fed into a continuous thermal cracking reactor at temperatures of 450-500°C and feed flow rates of 20-40 g/hr. Experiments designed by response surface methodology and the subsequent statistical analysis show that temperature and feed flow rate significantly affect the product yields. Response surfaces were used to study the impact of temperature and flow rate on the product properties. After each experiment, the produced crude bio-oil was distilled and the diesel cut was separated. As shorter-chain molecules are produced through thermal cracking, the distillation curve of the diesel cut fitted the petroleum-based diesel curve more closely than the biodiesel curve did. Moreover, the properties of the produced diesel cut fall adequately within the ranges defined by the relevant petroleum-diesel standard. Cold flow properties and the higher heating value, the main drawbacks of biodiesel, are improved by this technology. Thermal cracking also decreases kinematic viscosity, flash point and cetane number.
Abstract: There is little effective guidance on the selection of design parameters for springback of advanced high-strength steel sheet in the U-channel cold forming process. This paper presents the development of a predictive model for springback in the U-channel process on advanced high-strength steel sheet employing Response Surface Methodology (RSM). The experiments were performed on dual-phase steel sheet, DP590, in the U-channel forming process, while a design of experiments (DoE) approach was used to investigate the effects of four input factors, namely blank holder force (BHF), clearance (C), punch travel (Tp) and rolling direction (R), each at two levels, using a full factorial design (2⁴). Analysis of variance (ANOVA) showed that blank holder force (BHF), clearance (C) and punch travel (Tp) had a significant effect on the springback of the flange angle (β2) and wall opening angle (β1), while the rolling direction (R) factor was insignificant. The significant parameters were optimized to reduce the springback behavior using a Central Composite Design (CCD) in RSM, and the optimum parameters were determined. A regression model for springback was developed. The effects of the individual parameters and their responses were also evaluated. The results obtained from the optimum model are in agreement with the experimental values.
Abstract: This study addresses the concept of a Sustainable Building Environmental Model (SBEM) developed to optimize energy consumption in air conditioning and ventilation (ACV) systems without any deterioration of indoor environmental quality (IEQ). The SBEM incorporates two main components: an adaptive comfort temperature control module (ACT) and a new carbon dioxide demand control module (nDCV). These two modules take an innovative approach to maintaining IEQ satisfaction with optimum energy consumption, and they provide a rational basis for effective control. A total of 2133 sets of measurements of indoor air temperature (Ta), relative humidity (Rh) and carbon dioxide concentration (CO2) were collected in Hong Kong offices to investigate the potential of integrating the SBEM. A simulation was used to evaluate the dynamic performance of the energy and air conditioning system with the SBEM integrated in an air-conditioned building. It gives a clear picture of the control strategies and allows controllers to be pre-tuned before use in real systems. With the integration of the SBEM, the simulation showed savings of up to 12.3% of overall electricity consumption while maintaining the average carbon dioxide concentration within 1000 ppm and occupant dissatisfaction within 20%.
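As a rough illustration of what a CO2 demand-control module does, the sketch below uses a generic proportional rule: fresh-air supply rises as CO2 approaches the limit and falls back to a minimum otherwise. This is not the paper's nDCV algorithm; the gain, target and flow limits are made-up values.

```python
def dcv_flow(co2_ppm, target=1000.0, kp=0.004, base=0.15, lo=0.15, hi=1.0):
    """Proportional demand-controlled ventilation sketch: the fresh-air flow
    fraction rises with CO2 above the target and is clamped to [lo, hi]."""
    u = base + kp * (co2_ppm - target)
    return min(hi, max(lo, u))

low_occupancy = dcv_flow(800.0)    # well below target -> minimum flow
high_occupancy = dcv_flow(1200.0)  # above target -> flow ramps up
```

Supplying only the fresh air the occupancy actually demands is what lets such a module cut fan and cooling energy without letting CO2 drift past the comfort limit.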
Abstract: Surfing is an increasingly popular sport, and its performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for the detection of wave rides, computing the number of waves ridden in a surfing session, the starting time of each wave and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that allow identifying the start and end of each wave ride. The second approach adds information from the smartphone's Inertial Measurement Unit (IMU) to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated against similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, their start times and durations. This paper shows that it is feasible to use smartphones for the quantification of performance metrics during surfing. In particular, the waves ridden and their durations can be accurately detected using the smartphone GPS and IMU.
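The GPS-only approach can be sketched as hysteresis thresholding on the speed signal: a ride starts when speed exceeds an upper threshold and ends when it drops below a lower one. The threshold values and sample data below are illustrative, not those found in the study.

```python
def detect_rides(times, speeds, start_thr=2.0, end_thr=1.0):
    """Hysteresis thresholding on speed: a ride starts when speed >= start_thr
    and ends when it drops below end_thr. Returns (start time, duration) pairs.
    A ride still open at the end of the data is discarded."""
    rides, start = [], None
    for t, v in zip(times, speeds):
        if start is None and v >= start_thr:
            start = t
        elif start is not None and v < end_thr:
            rides.append((start, t - start))
            start = None
    return rides

t = list(range(10))                                      # seconds
v = [0.5, 0.6, 2.5, 3.0, 2.8, 0.4, 0.5, 2.2, 2.4, 0.3]  # speed, m/s
rides = detect_rides(t, v)  # [(2, 3), (7, 2)] -> two rides of 3 s and 2 s
```

The two distinct thresholds prevent brief speed dips mid-wave from splitting one ride into two; the paper's second method refines these boundaries with IMU information.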
Abstract: To explore how the brain may recognise objects in its
general, accurate and energy-efficient manner, this paper proposes the
use of a neuromorphic hardware system formed from a Dynamic
Vision Sensor (DVS) silicon retina in concert with the SpiNNaker
real-time Spiking Neural Network (SNN) simulator. As a first step
in the exploration on this platform a recognition system for dynamic
hand postures is developed, enabling the study of the methods used
in the visual pathways of the brain. Inspired by the behaviours of
the primary visual cortex, Convolutional Neural Networks (CNNs)
are modelled using both linear perceptrons and spiking Leaky
Integrate-and-Fire (LIF) neurons.
In this study’s largest configuration using these approaches, a
network of 74,210 neurons and 15,216,512 synapses is created and
operated in real-time using 290 SpiNNaker processor cores in parallel
and with 93.0% accuracy. A smaller network using only 1/10th of the
resources is also created, again operating in real-time, and it is able
to recognise the postures with an accuracy of around 86.4%, only
6.6% lower than the much larger system. The recognition rate of the
smaller network developed on this neuromorphic system is sufficient
for a successful hand posture recognition system, and demonstrates
a much improved cost to performance trade-off in its approach.
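The spiking building block can be illustrated by a simple Euler-integrated leaky integrate-and-fire (LIF) neuron. The parameters below are generic textbook values, not the SpiNNaker configuration used in the paper.

```python
def lif_step(v, i_in, dt=1.0, tau=20.0, v_rest=-65.0,
             v_thresh=-50.0, v_reset=-65.0, r=1.0):
    """One Euler step of a leaky integrate-and-fire neuron.
    Returns the new membrane potential and whether the neuron spiked."""
    v = v + dt * (-(v - v_rest) + r * i_in) / tau
    if v >= v_thresh:
        return v_reset, True   # fire and reset
    return v, False

# Drive the neuron with a constant suprathreshold current for 200 steps
v, spikes = -65.0, 0
for _ in range(200):
    v, fired = lif_step(v, i_in=20.0)
    spikes += fired
```

The membrane leaks toward rest while input current charges it; crossing the threshold emits a spike and resets the potential, which is the event-driven primitive SpiNNaker simulates at scale.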
Abstract: In this paper, we are interested in the problem of
finding similar images in a large database. For this purpose we
propose a new algorithm based on a combination of the 2-D
histogram intersection in the HSV space and statistical moments. The
proposed histogram is based on a 3x3 window and not only on the
intensity of the pixel. This approach overcomes the drawback of the
conventional 1-D histogram, which ignores the spatial distribution
of pixels in the image, while the statistical moments are used to
escape the effects of the discretisation of the color space that is
intrinsic to the use of histograms. We compare the performance of
our new algorithm to various methods of the state of the art and we
show that it has several advantages. It is fast, consumes little memory
and requires no learning. To validate our results, we apply this
algorithm to search for similar images in different image databases.
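A grayscale, pure-Python sketch of the core idea: build a 2-D histogram from each pixel paired with its 3x3 neighbourhood mean, then compare images by histogram intersection. The paper works in the HSV space and adds statistical moments, which are omitted here; the bin count and sample images are arbitrary.

```python
def hist2d(img, bins=4, max_val=256):
    """2-D histogram over (pixel value, 3x3 neighbourhood mean) pairs,
    normalised so the bin values sum to 1. Border pixels are skipped."""
    h = [[0] * bins for _ in range(bins)]
    step = max_val / bins
    rows, cols = len(img), len(img[0])
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            mean = sum(img[i + di][j + dj]
                       for di in (-1, 0, 1) for dj in (-1, 0, 1)) / 9
            h[int(img[i][j] // step)][int(mean // step)] += 1
    total = (rows - 2) * (cols - 2)
    return [[c / total for c in row] for row in h]

def intersection(h1, h2):
    """Histogram intersection similarity in [0, 1]."""
    return sum(min(a, b) for r1, r2 in zip(h1, h2) for a, b in zip(r1, r2))

img_a = [[10, 20, 30], [40, 50, 60], [70, 80, 90]]
img_b = [[200, 210, 220], [230, 240, 250], [200, 210, 220]]
sim_same = intersection(hist2d(img_a), hist2d(img_a))  # identical images
sim_diff = intersection(hist2d(img_a), hist2d(img_b))  # very different images
```

Because each bin encodes both a pixel value and its local context, two images with the same global intensity distribution but different spatial arrangements no longer collide, which is the stated advantage over the 1-D histogram.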
Abstract: Image segmentation and color identification are
important processes used in various emerging fields such as intelligent
robotics. A method is proposed for a manipulator to grasp a colored
object and place it in the correct location. Existing methods such as
PSO suffer from slow convergence and from converging to a local
minimum, leading to suboptimal performance. To improve performance,
we use the watershed algorithm for segmentation and EPSO for color
identification. The EPSO method reduces the probability of becoming
stuck in a local minimum. The proposed method offers the particles a
more powerful global exploration capability. EPSO can detect particles
stuck in a local minimum and can also increase the learning speed,
as the particle movement becomes faster.
Abstract: The Economic Lot Scheduling Problem (ELSP) is a
valuable mathematical model that can support decision-makers to
make scheduling decisions. The basic period approach is effective for
solving the ELSP. The assumption for applying the basic period
approach is that a product must use its maximum production rate to be
produced. However, a product can lower its production rate to reduce
the average total cost when a facility has extra idle time. The past
researches discussed how a product adjusts its production rate under
the common cycle approach. To the best of our knowledge, no studies
have addressed how a product lowers its production rate under the
basic period approach. This research is the first paper to discuss this
topic. The research develops a simple fixed rate approach that adjusts
the production rate of a product under the basic period approach to
solve the ELSP. Our numerical example shows our approach can find a
better solution than the traditional basic period approach. Our
mathematical model that applies the fixed rate approach under the
basic period approach can serve as a reference for other related
research.
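For context, the mechanism can be seen in the classical average-cost expression for one product in lot-scheduling models (a standard textbook form, not an equation from this paper): with setup cost A_i, holding cost rate h_i, demand rate d_i, production rate p_i and cycle time T_i,

```latex
TC_i(T_i, p_i) = \frac{A_i}{T_i} + \frac{h_i d_i T_i}{2}\left(1 - \frac{d_i}{p_i}\right)
```

Lowering p_i toward d_i shrinks the factor (1 - d_i/p_i) and hence the holding cost, which is why trading extra idle time for a slower production rate can reduce the average total cost.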
Abstract: Current research is targeting new molecular
mechanisms that underlie non-alcoholic fatty liver disease (NAFLD)
and associated metabolic disorders like non-alcoholic steatohepatitis
(NASH). Forty New Zealand White rabbits were used and fed a
high-protein (HP), high-energy diet based on grains and containing
11.76 MJ/kg. Boron was added to the drinking water of three
experimental groups (30 mg boron/L) as boron compounds. Biochemical
analyses including boron levels, nuclear magnetic resonance (NMR)
based metabolomic evaluation, and mRNA expression of the peroxisome
proliferator-activated receptor (PPAR) family were performed.
LDL-cholesterol concentrations alone were decreased in all the
experimental groups. Boron levels in serum and feces were increased.
Acetate content was about 2x higher in the anhydrous borax group and
at least 3x higher in the boric acid group. PPARα mRNA expression
was significantly decreased in the boric acid group. Anhydrous borax
attenuated mRNA levels of PPARγ, which were further suppressed by
boric acid. Boron supplementation decreased the degenerative
alterations in hepatocytes. Except for the borax group, the boron
groups did not show pronounced changes in the tubular epithelium of
the kidney. In conclusion, a high-protein, high-energy diet leads to
degenerative changes in hepatocytes, which can be prevented by boron
supplementation. Boric acid seems to be more effective in this
situation.
Abstract: Total Quality Management (TQM) is a managerial
approach that improves the competitiveness of an industry; meanwhile,
Information Technology (IT) was introduced alongside TQM to handle
technical issues, supported by quality experts, in order to fulfil
customers' requirements. The present paper aims to utilise the AHP
(Analytic Hierarchy Process) methodology to prioritise and rank the
hierarchy levels of TQM enablers and IT resources together for their
successful implementation in the Information and Communication
Technology (ICT) industry. A total of 17 TQM enablers (nine) and IT
resources (eight) were identified, partitioned into three categories,
and prioritised by the AHP approach.
The findings indicate that the 17 sub-criteria can be grouped into
three main categories, namely organizing, tools and techniques, and
culture and people. Further, out of the 17 sub-criteria, three sub-criteria,
top management commitment and support, total employee
involvement, and continuous improvement, received the highest priority,
whereas three sub-criteria, structural equation modelling,
culture change, and customer satisfaction, received the lowest priority. The
results suggest a hierarchy model for the ICT industry to prioritise
the enablers and resources as well as to improve TQM and IT
performance in the ICT industry. This paper has managerial
implications, suggesting that managers in the ICT industry implement
TQM and IT together in their organizations to obtain maximum benefits
and showing how to utilize available resources. At the end,
conclusions, limitations and the future scope of the study are presented.
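The AHP prioritisation step can be sketched with the row geometric mean approximation of the priority vector. The 3x3 pairwise comparison matrix below is hypothetical, standing in for expert judgments over the three main categories; it is not the paper's data.

```python
import math

def ahp_priorities(M):
    """Approximate the AHP priority vector of a pairwise comparison matrix
    by the row geometric mean method, normalised to sum to 1."""
    n = len(M)
    gm = [math.prod(row) ** (1 / n) for row in M]
    s = sum(gm)
    return [g / s for g in gm]

# Hypothetical judgments: row criterion vs column criterion on Saaty's 1-9 scale
M = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w = ahp_priorities(M)  # priority weight per category, largest first here
```

In a full AHP study these weights would be checked for consistency (consistency ratio) and combined across hierarchy levels to rank the 17 sub-criteria.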
Abstract: This article proposes a new method for application in
communication circuit systems that increases efficiency, PAE, output
power and gain in the circuit. The proposed method is based on a
combination of switching class-E and class-J and has been termed
class-EJ. This method was investigated using both theory and
simulation to confirm ∼72% PAE and output power of >39dBm. The
combination and design of the proposed power amplifier accrues gain
of over 15dB in the 2.9 to 3.5GHz frequency bandwidth. This circuit
was designed using MOSFET and high power transistors. The load-
and source-pull method achieved the best input and output networks
using lumped elements. The proposed technique was investigated for
fundamental and second harmonics having desirable amplitudes for
the output signal.
Abstract: A novel design technique employing CMOS Current
Feedback Operational Amplifier (CFOA) is presented. The very low
power consumption of the pseudo-OTA is exploited to
decrease the total power consumption of the proposed CFOA. This
design approach applies a pseudo-OTA as the input stage cascaded with a
buffer stage. Moreover, the DC input offset voltage and harmonic
distortion (HD) of the proposed CFOA are very low compared
with the conventional CMOS CFOA, due to the symmetrical input
stage. PSpice simulation results are obtained using 0.18μm MIETEC
CMOS process parameters, a supply voltage of ±1.2V and a 50μA
biasing current. The PSpice simulation shows excellent improvement
of the proposed CFOA over the existing CMOS CFOA. Some of the
performance parameters, for example, are a DC gain of 62 dB, an
open-loop gain bandwidth product of 108 MHz, a slew rate (SR+) of
+71.2V/μS, a THD of -63dB and a DC power consumption (PC) of
2mW.
Abstract: We present a gas-liquid microfluidic system as a
reactor to obtain magnetite nanoparticles with an excellent degree of
control regarding their crystalline phase, shape and size. Several
types of microflow approaches were selected to prevent nanomaterial
aggregation and to promote a homogeneous size distribution. The
selected reactor consists of a mixer stage aided by ultrasound waves
and a reaction stage using a N2-liquid segmented flow to prevent
magnetite oxidation to non-magnetic phases. A milli-fluidic reactor
was developed to increase the production rate, achieving a magnetite
throughput close to 450 mg/h in continuous operation.