Abstract: Experimental investigations of the DC electric field effect on the thermal decomposition of biomass, the formation of the axial flow of volatiles (CO, H2, CxHy), the mixing of volatiles with a swirling airflow at low swirl intensity (S ≈ 0.2-0.35), their ignition, and the resulting combustion dynamics are carried out with the aim of understanding the mechanism of electric field influence on biomass gasification, combustion of volatiles, and heat energy production. The DC electric field effect on combustion dynamics was studied by varying the positive bias voltage of the central electrode from 0.6 kV to 3 kV, with the ion current limited to 2 mA. The experimental results confirm field-enhanced biomass gasification, with an enhanced release of volatiles and the development of endothermic processes at the primary stage of thermochemical conversion of biomass; these determine the field-enhanced heat energy consumption, with a correlating decrease of the flame temperature and heat energy production at this stage of flame formation. Furthermore, the field-enhanced radial expansion of the flame reaction zone correlates with more complete combustion of volatiles, increasing the combustion efficiency by 3% and decreasing the mass fraction of CO, H2 and CxHy in the products, while the average volume fraction of CO2 increases by 10% and the heat energy production downstream of the combustor increases by 5-10%.
Abstract: In this paper, we introduce an NLG application for the automatic creation of ready-to-publish texts from big data. The resulting fully automatically generated news stories closely resemble the style in which a human writer would draw up such a story. Topics include soccer games, stock exchange market reports, and weather forecasts. Each generated text is unique. Ready-to-publish stories written by a computer application can help humans to quickly grasp the outcomes of big data analyses, save time-consuming pre-formulations for journalists, and cater to rather small audiences by offering stories that would otherwise not exist.
Abstract: In the present study, response surface methodology has been used to optimize the turn-assisted deep cold rolling process of AISI 4140 steel. A regression model is developed to predict surface hardness and surface roughness using response surface methodology and a central composite design. In the development of the predictive model, deep cold rolling force, ball diameter, initial roughness of the workpiece, and number of tool passes are considered as model variables. The rolling force and the ball diameter are found to be the significant factors for surface hardness, while the ball diameter and the number of tool passes are found to be significant for surface roughness. The predicted surface hardness and surface roughness values and the subsequent verification experiments under the optimal operating conditions confirmed the validity of the predicted model. The absolute average error between the experimental and predicted values at the optimal combination of parameter settings for surface hardness and surface roughness is calculated as 0.16% and 1.58%, respectively. Using the optimal processing parameters, the surface hardness is improved from 225 to 306 HV, which resulted in an increase in the near-surface hardness by about 36%, and the surface roughness is improved from 4.84 µm to 0.252 µm, which resulted in a decrease in the surface roughness by about 95%. The depth of compression is found to be more than 300 µm from the microstructure analysis, and this is in correlation with the results obtained from the microhardness measurements. A Taylor Hobson Talysurf tester, a micro Vickers hardness tester, optical microscopy and X-ray diffractometry are used to characterize the modified surface layer.
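The second-order response-surface fit described above can be pictured as an ordinary least-squares regression over the model terms. The sketch below is an illustrative reconstruction with synthetic data in coded factor levels (-1, 0, +1), as used in central composite designs; the factor names F and D (for rolling force and ball diameter) and all coefficients are assumptions, not the paper's measurements.

```python
# Illustrative least-squares fit of a second-order response-surface model
#   y = b0 + b1*F + b2*D + b3*F*D + b4*F^2 + b5*D^2
# with F and D in coded factor levels; all data below are synthetic.

def design_row(F, D):
    """Expand one factor setting into the 6 model terms."""
    return [1.0, F, D, F * D, F * F, D * D]

def lstsq(X, y):
    """Solve the normal equations (X^T X) b = X^T y by Gaussian elimination."""
    m, n = len(X), len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(m)) for j in range(n)]
         for i in range(n)]
    b = [sum(X[r][i] * y[r] for r in range(m)) for i in range(n)]
    for col in range(n):                 # forward elimination, partial pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for r in range(n - 1, -1, -1):       # back substitution
        coef[r] = (b[r] - sum(A[r][c] * coef[c]
                              for c in range(r + 1, n))) / A[r][r]
    return coef

# Fit on a 3x3 factorial grid generated from known coefficients.
levels = (-1.0, 0.0, 1.0)
X = [design_row(F, D) for F in levels for D in levels]
true = [300.0, 25.0, 10.0, 4.0, -8.0, -3.0]
y = [sum(c * v for c, v in zip(true, row)) for row in X]
coef = lstsq(X, y)
```

With noise-free synthetic data the fitted coefficients recover the generating ones exactly, which is a useful sanity check before fitting real measurements.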
Abstract: Given the limited research on Small and Medium-sized
Enterprises’ (SMEs) contribution to Corporate Social
Responsibility (CSR) and even scarcer research on Swiss SMEs, this
paper helps to fill these gaps by enabling the identification of supranational
SME parameters. Thus, the paper investigates the current
state of SME practices in Switzerland and across 15 other countries.
Combining the degree to which SMEs demonstrate an explicit (or
business case) approach or see CSR as an implicit moral activity with
the assessment of their attributes for “variety of capitalism” defines
the framework of this comparative analysis. To outline Swiss small
business CSR patterns in particular, 40 SME owner-managers were
interviewed. A secondary data analysis of studies from different
countries laid groundwork for this comparative overview of small
business CSR. The paper identifies Swiss small business CSR as
driven by norms, values, and by the aspiration to contribute to
society, thus, as an implicit part of the day-to-day business. Similar to
most Central European, Mediterranean, Nordic, and Asian countries,
explicit CSR is still very rare in Swiss SMEs. Astonishingly, British and American SMEs also follow this pattern in spite of their strong and distinctly liberal market economies. Though other findings show that nationality matters, this research concludes that SME culture and
an informal CSR agenda are strongly formative and superseding even
forces of market economies, nationally cultural patterns, and
language. Hence, classifications of countries by their market system,
as found in the comparative capitalism literature, do not match the
CSR practices in SMEs as they do not mirror the peculiarities of their
business. This raises questions on the universality and
generalisability of unmediated, explicit management concepts,
especially in the context of small firms.
Abstract: This paper proposes a novel heuristic algorithm that aims to determine the best size and location of distributed generators in unbalanced distribution networks. The proposed heuristic algorithm can deal with planning cases where power loss is to be optimized without violating the system's practical constraints. The distributed generation units in the proposed algorithm are modeled as voltage-controlled nodes with the flexibility to be converted to constant power factor nodes in case of a reactive power limit violation. The proposed algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results obtained show the effectiveness of the proposed algorithm.
Abstract: This study was conducted to examine the effectiveness of Teaching Games For Understanding (TGFU) in improving the hockey tactical skills and state self-confidence among 16-year-old students. Two hundred fifty-nine (259) school students were selected for the study based on the intact sampling method. One class was used as the control group (Boys = 60, Girls = 70), while another class, the treatment group (Boys = 60, Girls = 69), underwent an intervention with TGFU in physical education classes conducted twice a week for four weeks. The Games Performance Assessment Instrument was used to observe the hockey tactical skills, and the State Self-Confidence Inventory was used to determine the state of self-confidence among the students. After four weeks, ANCOVA analysis indicated the treatment group had a significant improvement in hockey tactical skills, with F(1, 118) = 313.37, p
Abstract: Modeling the lung respiratory system, with its complex anatomy and biophysics, presents several challenges, including tissue-driven flow patterns and wall motion. Because the lungs stretch and recoil with each breath, the pulmonary system does not have static walls and structures. The direct relationship between airflow and tissue motion in the lung structures naturally favors a fluid-structure interaction (FSI) simulation technique. Therefore, the development of a coupled FSI computational model is an important step toward the realistic simulation of pulmonary breathing mechanics. A simple but physiologically relevant three-dimensional deep lung geometry is designed, and an FSI coupling technique is utilized to simulate the deformation of the lung parenchyma tissue that produces the airflow fields. The behavior of the respiratory tissue system as a complex phenomenon has been investigated with respect to respiratory patterns, fluid dynamics, tissue viscoelasticity, and the tidal breathing period.
Abstract: This paper reviews the model-based qualitative and
quantitative Operations Management research in the context of
Construction Supply Chain Management (CSCM). The construction industry has traditionally been blamed for low productivity, cost and
time overruns, waste, high fragmentation and adversarial
relationships. The construction industry has been slower than other
industries to employ the Supply Chain Management (SCM) concept
and develop models that support the decision-making and planning.
However, over the last decade there has been a distinct shift from a project-based
to a supply-based approach to construction management. CSCM has emerged as a promising management tool for construction operations and improves the performance of construction projects in
terms of cost, time and quality. Modeling the Construction Supply
Chain (CSC) offers the means to reap the benefits of SCM, make
informed decisions and gain competitive advantage. Different
modeling approaches and methodologies have been applied in the
multi-disciplinary and heterogeneous research field of CSCM. The
literature review reveals that a considerable percentage of the CSC
modeling research accommodates conceptual or process models
which present general management frameworks and do not relate to
acknowledged soft Operations Research methods. We particularly
focus on the model-based quantitative research and categorize the
CSCM models depending on their scope, objectives, modeling
approach, solution methods and software used. Although over the last
few years there has clearly been an increase in research papers on
quantitative CSC models, we identify that the relevant literature is
very fragmented with limited applications of simulation,
mathematical programming and simulation-based optimization. Most
applications are project-specific or study only parts of the supply
system. Thus, some complex interdependencies within construction
are neglected and the implementation of the integrated supply chain
management is hindered. We conclude this paper by giving future
research directions and emphasizing the need to develop optimization
models for integrated CSCM. We stress that CSC modeling needs a
multi-dimensional, system-wide and long-term perspective. Finally,
prior applications of SCM to other industries have to be taken into
account in order to model CSCs, but not without translating the
generic concepts to the context of construction industry.
Abstract: Supply chains are the backbone of trade and
commerce. Their logistics use different transport corridors on regular
basis for operational purpose. The international supply chain
transport corridors include different infrastructure elements (e.g.
weighbridge, package handling equipment, border clearance
authorities, and so on). This paper presents the use of multi-agent
systems (MAS) to model and simulate some aspects of transportation
corridors, and in particular the area of weighbridge resource
optimization for operational profit. An underlying multi-agent model
provides a means of modeling the relationships among stakeholders
in order to enable coordination in a transport corridor environment.
Simulations of the costs of container unloading, reloading, and
waiting time for trucks queuing up have been carried out using data sets. Results of the simulation provide potential guidance in
making decisions about optimal service resource allocation in a trade
corridor.
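The weighbridge waiting-time question lends itself to a small queueing sketch. The following is a hypothetical, simplified model (deterministic service time, FIFO discipline) for comparing total truck waiting time under different numbers of weighbridges; it is not the paper's multi-agent simulation, and all arrival and service figures are invented.

```python
import heapq

def total_wait(arrivals, service_time, n_bridges):
    """Total truck waiting time for a FIFO queue served by n weighbridges."""
    free = [0.0] * n_bridges          # times at which each weighbridge is free
    heapq.heapify(free)
    waited = 0.0
    for t in sorted(arrivals):
        earliest = heapq.heappop(free)    # next weighbridge to become free
        start = max(t, earliest)          # truck waits if none is free yet
        waited += start - t
        heapq.heappush(free, start + service_time)
    return waited
```

Comparing `total_wait` for one versus two weighbridges on the same arrival stream gives exactly the kind of resource-allocation trade-off the simulation explores.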
Abstract: The present study aimed to determine potential
agricultural lands (PALs) in Gokceada (Imroz) Island of Canakkale
province, Turkey. Seven-band Landsat 8 OLI images acquired on
July 12 and August 13, 2013, and their 14-band combination image
were used to identify current Land Use Land Cover (LULC) status.
Principal Component Analysis (PCA) was applied to three Landsat
datasets in order to reduce the correlation between the bands. A total
of six Original and PCA images were classified using supervised
classification method to obtain the LULC maps including 6 main
classes (“Forest”, “Agriculture”, “Water Surface”, “Residential Area-
Bare Soil”, “Reforestation” and “Other”). Accuracy assessment was
performed by checking the accuracy of 120 randomized points for each LULC map. The best overall accuracy and Kappa statistic values (90.83% and 0.8791, respectively) were found for the PCA images which were generated from the 14-band combined image called 3-B/JA.
Digital Elevation Model (DEM) with 15 m spatial resolution
(ASTER) was used to consider topographical characteristics. Soil
properties were obtained by digitizing 1:25000 scaled soil maps of
Rural Services Directorate General. Potential Agricultural Lands
(PALs) were determined using Geographic Information Systems
(GIS). The procedure was applied considering that the “Other” class of the LULC map may be used for agricultural purposes in the future. Overlay analysis was conducted using Slope (S), Land
Use Capability Class (LUCC), Other Soil Properties (OSP) and Land
Use Capability Sub-Class (SUBC) properties.
A total of 901.62 ha within the “Other” class (15798.2 ha) of the LULC map was determined as PALs. These lands were ranked as
“Very Suitable”, “Suitable”, “Moderate Suitable” and “Low
Suitable”. It was determined that 8.03 ha were classified as “Very Suitable”, 18.59 ha as “Suitable” and 11.44 ha as “Moderate Suitable”. In addition, 756.56 ha were found to be “Low
Suitable”. The results obtained from this preliminary study can serve
as basis for further studies.
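The band-decorrelation step that PCA performs can be illustrated on a two-band toy case, where the 2×2 covariance eigendecomposition has a closed form. This is only a schematic sketch of the principle; the pixel values are invented, and the study's actual multi-band Landsat processing chain is not reproduced here.

```python
import math

def pca_2band(pixels):
    """Project two correlated band values onto the leading principal component."""
    n = len(pixels)
    m1 = sum(p[0] for p in pixels) / n
    m2 = sum(p[1] for p in pixels) / n
    # 2x2 covariance matrix entries
    c11 = sum((p[0] - m1) ** 2 for p in pixels) / n
    c22 = sum((p[1] - m2) ** 2 for p in pixels) / n
    c12 = sum((p[0] - m1) * (p[1] - m2) for p in pixels) / n
    # closed-form leading eigenvalue/eigenvector of [[c11, c12], [c12, c22]]
    tr, det = c11 + c22, c11 * c22 - c12 * c12
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    vx, vy = c12, lam - c11               # eigenvector (assumes c12 != 0)
    norm = math.hypot(vx, vy)
    vx, vy = vx / norm, vy / norm
    scores = [(p[0] - m1) * vx + (p[1] - m2) * vy for p in pixels]
    return lam, (vx, vy), scores
```

For perfectly correlated bands the leading component absorbs the entire variance, which is why classifying PCA images rather than raw bands reduces inter-band redundancy.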
Abstract: In this paper, we present a model-based regression test
suite reducing approach that uses EFSM model dependence analysis
and probability-driven greedy algorithm to reduce software regression
test suites. The approach automatically identifies the difference
between the original model and the modified model as a set of
elementary model modifications. The EFSM dependence analysis is
performed for each elementary modification to reduce the regression
test suite, and then the probability-driven greedy algorithm is adopted
to select the minimum set of test cases from the reduced regression test
suite that cover all interaction patterns. Our initial experience shows
that the approach may significantly reduce the size of regression test
suites.
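The test-case selection step can be pictured as a weighted greedy set cover over interaction patterns. The sketch below is a generic reconstruction under assumed inputs (per-pattern probabilities used as weights); the paper's actual probability-driven algorithm and the EFSM dependence analysis are not reproduced.

```python
def reduce_suite(tests, pattern_prob):
    """Greedy pick of tests maximising covered interaction-pattern weight."""
    uncovered = set(pattern_prob)
    chosen = []
    while uncovered:
        # pick the test whose uncovered patterns carry the most probability mass
        best = max(tests, key=lambda t: sum(pattern_prob[p]
                                            for p in tests[t] & uncovered))
        gain = tests[best] & uncovered
        if not gain:
            break                      # remaining patterns are not coverable
        chosen.append(best)
        uncovered -= gain
    return chosen
```

The greedy criterion prefers tests that cover many high-probability patterns first, so the reduced suite stays small while all coverable patterns remain covered.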
Abstract: Academicians at the Arab Open University have
always voiced their concern about the efficacy of the blended
learning process. Based on 75% independent study and 25% face-to-face tutorials, it poses the challenge of the predisposition to
adjustment. Being used to the psychology of traditional educational
systems, AOU students cannot be easily weaned from being spoon-fed. Hence, they lack the motivation to plunge into self-study. For
better involvement of AOU students into the learning practices, it is
imperative to diagnose the factors that impede or increase their
motivation. This is conducted through an empirical study grounded
upon observations and tested hypotheses, and aimed at monitoring and
optimizing the students’ learning outcome. Recommendations of the
research will follow the findings.
Abstract: Ontology validation is an important part of web
applications’ development, where knowledge integration and
ontological reasoning play a fundamental role. It aims to ensure the
consistency and correctness of ontological knowledge and to
guarantee that ontological reasoning is carried out in a meaningful
way. Existing approaches to ontology validation address more or less
specific validation issues, but the overall process of validating web
ontologies has not been formally established yet. As the size and the
number of web ontologies continue to grow, more web applications’
developers will rely on the existing repository of ontologies rather
than develop ontologies from scratch. If an application utilizes
multiple independently created ontologies, their consistency must be
validated and eventually adjusted to ensure proper interoperability
between them. This paper presents a validation technique intended to
test the consistency of independent ontologies utilized by a common
application.
Abstract: Designing cost-efficient, secure network protocols for
Wireless Sensor Networks (WSNs) is a challenging problem because
sensors are resource-limited wireless devices. Security services such
as authentication and improved pairwise key establishment are
critical to highly efficient networks of sensor nodes. For sensor nodes to communicate securely and efficiently with each other, the usage of cryptographic techniques is necessary. In this paper, two key predistribution schemes are proposed that enable a mobile sink to establish a secure data-communication link, on the fly, with any sensor node.
The intermediate nodes along the path to the sink are able to verify
the authenticity and integrity of the incoming packets using a
predicted value of the key generated by the sender’s essential power.
The proposed schemes are based on the pairwise key with the mobile sink. Our analytical results clearly show that our schemes perform better in terms of network resilience to node capture than existing schemes when used in wireless sensor networks with mobile sinks.
Abstract: Under active stress conditions, a rigid cantilever
retaining wall tends to rotate about a pivot point located within the
embedded depth of the wall. For purely granular and cohesive soils, a
methodology was previously reported called minimization of moment
ratio to determine the location of the pivot point of rotation. This methodology is used to estimate the rotational stability safety factor. Moreover, the degree of improvement required in a
backfill to get a desired safety factor can be estimated by the concept
of the shear strength demand. In this article, the accuracy of this
method for another type of cantilever walls called Contiguous Bored
Pile (CBP) retaining wall is evaluated by using physical modeling
technique. Based on observations, the results of the moment ratio minimization method are in good agreement with the results of the physical modeling carried out.
Abstract: This paper presents an approach for the classification of
an unstructured format description for identification of file formats.
The main contribution of this work is the employment of data mining
techniques to support file format selection with just the unstructured
text description that comprises the most important format features for
a particular organisation. Subsequently, the file format identification method employs a file format classifier and associated configurations to
support digital preservation experts with an estimation of the required file format. Our goal is to make use of a format specification knowledge base aggregated from different Web sources in order to select a file format for a particular institution. Using the naive Bayes method,
the decision support system recommends to an expert the file format
for his institution. The proposed methods facilitate the selection of a file format and improve the quality of the digital preservation process. The
presented approach is meant to facilitate decision making for the
preservation of digital content in libraries and archives using domain
expert knowledge and specifications of file formats. To facilitate
decision-making, the aggregated information about the file formats is
presented as a file format vocabulary that comprises most common
terms that are characteristic for all researched formats. The goal is to
suggest a particular file format based on this vocabulary for analysis
by an expert. The sample file format calculation and the calculation
results including probabilities are presented in the evaluation section.
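The naive Bayes recommendation step can be sketched as a multinomial classifier over terms from format descriptions. The format labels, documents, and vocabulary below are invented for illustration; the paper's actual knowledge base and file format vocabulary are not reproduced.

```python
import math
from collections import Counter

def train_nb(docs):
    """Train a multinomial naive Bayes classifier; returns a predict function."""
    vocab = {w for _, words in docs for w in words}
    counts, totals, priors = {}, Counter(), Counter()
    for label, words in docs:
        priors[label] += 1
        counts.setdefault(label, Counter()).update(words)
        totals[label] += len(words)
    n_docs = len(docs)

    def predict(words):
        best, best_lp = None, -math.inf
        for label in priors:
            lp = math.log(priors[label] / n_docs)
            for w in words:
                # Laplace smoothing over the shared vocabulary
                lp += math.log((counts[label][w] + 1)
                               / (totals[label] + len(vocab)))
            if lp > best_lp:
                best, best_lp = label, lp
        return best

    return predict

# Hypothetical format descriptions reduced to token lists.
docs = [("PDF", ["portable", "document", "text", "vector"]),
        ("TIFF", ["raster", "image", "lossless", "scan"]),
        ("PDF", ["document", "print", "vector"]),
        ("TIFF", ["image", "photo", "raster"])]
predict = train_nb(docs)
```

Given an unstructured description of the desired features, the classifier returns the format whose description terms best explain it, which is the shape of the decision support described above.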
Abstract: With the evolution of technology, the expression of opinions has shifted to the digital world. The
domain of politics, as one of the hottest topics of opinion mining
research, merged together with the behavior analysis for affiliation
determination in texts, which constitutes the subject of this paper.
This study aims to classify the text in news/blogs either as
Republican or Democrat with the minimum number of features. As
an initial set, 68 features, 64 of which were Linguistic Inquiry and Word Count (LIWC) features, were tested against 14
benchmark classification algorithms. In the later experiments, the dimensions of the feature vector were reduced using 7 feature selection algorithms. The results show that the “Decision Tree”,
“Rule Induction” and “M5 Rule” classifiers when used with “SVM”
and “IGR” feature selection algorithms performed the best up to
82.5% accuracy on a given dataset. Further tests on a single feature
and the linguistic-based feature sets showed similar results. The
feature “Function”, as an aggregate feature of the linguistic category,
was found as the most differentiating feature among the 68 features
with the accuracy of 81% in classifying articles either as Republican
or Democrat.
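The feature-ranking idea underlying such selection can be sketched with a plain information-gain score (a relative of the IGR criterion named above). The toy labels and binary features below are illustrative only, not the LIWC data or the exact IGR variant used in the study.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a label sequence."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def info_gain(feature_values, labels):
    """Entropy reduction from splitting the labels on one feature."""
    gain = entropy(labels)
    n = len(labels)
    for v in set(feature_values):
        subset = [l for x, l in zip(feature_values, labels) if x == v]
        gain -= len(subset) / n * entropy(subset)
    return gain
```

Ranking features by this score and keeping the top ones is the dimensionality-reduction step; a perfectly discriminating feature scores 1 bit here, an uninformative one 0.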
Abstract: In this work, the estimated available physical habitat for the Ictalurus punctatus species is compared with the estimated physical habitat for the same river reach after modification, with the aim of creating a linear park along a length of 5,500 m.
To determine the effect of ecological park construction on the physical habitat of the studied stretch of the Lerma river, first, the available habitat for the Ictalurus punctatus species was estimated through simulation of the physical habitat, using surveying, hydraulic, and habitat information obtained at the river reach in its current condition. Second, the available habitat for the above species was estimated by simulating the physical habitat under the proposed modification for the ecological park creation. Third, a comparison between both scenarios is presented in terms of available habitat
estimated for Ictalurus punctatus species, concluding that in cases of
adult and spawning life stages, changes in the channel to create an
ecological park would produce a considerable loss of potentially
usable habitat (PUH), while in the case of the juvenile life stage PUH
remains virtually unchanged, and in the case of life stage fry the PUH
would increase due to the presence of velocities and depths of lesser
magnitude, due to the presence of minor flow rates and lower volume
of the wet channel.
It is expected that habitat modification for the linear park construction may compromise the conservation of the Ictalurus punctatus species at the river reach of the study.
Abstract: In this paper, the goal programming methodology has been applied to solve the multiple objective problem of technological variants and production plan optimization. The optimization
criteria are determined and the multiple objective linear programming
model for solving a problem of the technological variants and
production plan optimization is formed and solved. Then the obtained
results are analysed. The obtained results point out to the possibility
of efficient application of the goal programming methodology in
solving the problem of the technological variants and production plan
optimization. The paper points out the advantages of applying the goal programming methodology compared to the Surrogate Worth Trade-off method in solving this problem.
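The deviation-minimisation idea behind goal programming can be illustrated with a deliberately tiny sketch: one decision variable, two goals, and a grid search standing in for a linear-programming solver. The goal targets, coefficients, and weights are invented; a real application, like the one in the paper, would formulate an LP with explicit deviation variables.

```python
def goal_program(goals, weights, candidates):
    """Pick the candidate minimising the weighted sum of goal deviations.

    goals: (coefficient, target) pairs, one per goal constraint c*x -> t
    weights: penalty weight attached to each goal's deviation
    candidates: grid of feasible values for the single decision variable
    """
    def total_deviation(x):
        # weighted sum of absolute deviations from every goal target
        return sum(w * abs(c * x - t) for (c, t), w in zip(goals, weights))
    return min(candidates, key=total_deviation)

# Two conflicting goals: 2x should hit 10, x should hit 6.
goals = [(2.0, 10.0), (1.0, 6.0)]
best_x = goal_program(goals, [1.0, 1.0], [i * 0.5 for i in range(21)])
```

Because the two goals cannot both be met, the method settles on the compromise value that minimises the total weighted shortfall, which is exactly the trade-off logic goal programming formalises.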
Abstract: Experimental economics is subject to criticism with regard to the frequently discussed trade-off between internal and external validity requirements, which seems to be critically flawed.
This paper evaluates incompatibility of trade-off condition and
condition of internal validity as a prerequisite for external validity. In
addition, it outlines the imprecise concept of artificiality, which is found to rather improve the external validity and seems to strengthen the illusory status of the external versus internal validity tension. Internal validity is further analyzed with regard to the Duhem-
Quine problem, where the unpredictability argument is significantly weakened through the application of inductivism within the illustrative hypothetical-deductive model. Our discussion partially weakens
critical arguments related to the robustness of results in experimental
economics, if the perfectly controlled experimental environment is
secured.