Abstract: Background in music analysis: Traditionally, when we think about a composer's sketches, the chances are that we are thinking in terms of the working out of detail rather than the evolution of an overall concept. Since music is a "time art," it follows that questions of form cannot be entirely detached from considerations of time. One could say that composers tend either to regard time as a space filled gradually and partly intuitively, or to look for a specific strategy to occupy it. One thing that sheds light on Stockhausen's compositional thinking is his frequent use of "form schemas," that is, single-page representations of the entire structure of a piece.
Background in music technology: Sonic Visualiser (SV) is a program used to study musical recordings. It is an open-source application for viewing, analyzing, and annotating music audio files. It contains a number of visualisation tools designed with useful default parameters for musical analysis. Additionally, the Vamp plugin format supported by SV provides analyses such as structural segmentation.
Aims: The aim of this paper is to show how SV may be used to obtain a better understanding of a specific musical work, and how the compositional strategy impacts musical structures and musical surfaces. It is known that "traditional" music-analytic methods do not allow one to indicate the interrelationships between the musical surface (which is perceived) and the underlying musical/acoustical structure.
Main Contribution: Stockhausen dealt with the most diverse musical problems by the most varied methods. A characteristic that never ceased to be at the center of his thought and work was the quest for a new balance founded upon an acute connection between speculation and intuition. In the case of Mikrophonie I (1964) for tam-tam and six players, Stockhausen makes a distinction between the "connection scheme," which indicates the ground rules underlying all versions, and the "form scheme," which is associated with a particular version. The preface to the published score includes both the connection scheme and a single instance of a form scheme, which is what one can hear on the CD recording. In the current study, insight into the compositional strategy chosen by Stockhausen was compared with the auditory image, that is, with the perceived musical surface. Stockhausen's musical work is analyzed both in terms of melodic/voice evolution and timbre evolution.
Implications: The current study shows how musical structures determine the musical surface. The general assumption is that, while listening to music, we can extract basic kinds of musical information from musical surfaces. It is shown that interactive strategies of musical structure analysis can offer a very fruitful way of looking directly into certain structural features of music.
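As an illustration of the kind of tooling involved, the short Python sketch below runs a Vamp structural-segmentation plugin over a recording, much as SV does interactively. It assumes the open-source "vamp" host package, "librosa", and the Queen Mary plugin set are installed; the plugin key "qm-vamp-plugins:qm-segmenter" and the file name are assumptions for illustration, not taken from the study.

import librosa
import vamp

# Load the recording at its native sample rate (file name is a placeholder).
audio, rate = librosa.load("mikrophonie_I.wav", sr=None)

# Run the segmenter and print each detected section boundary and its label.
result = vamp.collect(audio, rate, "qm-vamp-plugins:qm-segmenter")
for feature in result["list"]:
    print(feature["timestamp"], feature.get("label", ""))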
Abstract: This study uses, as its research subjects, patients who had undergone total knee replacement surgery, drawn from the database of the National Health Insurance Administration. Through a review of the literature and interviews with physicians, important factors were selected after careful screening. Then, using the Cross Entropy Method, Genetic Algorithm Logistic Regression, and Particle Swarm Optimization, the weight of each factor was calculated. In the meantime, Excel VBA and Case-Based Reasoning were combined to build and evaluate the system. Results show no significant difference between Genetic Algorithm Logistic Regression and Particle Swarm Optimization, with over 97% accuracy for both methods. Both ROC areas are above 0.87. This study can provide a critical reference for medical personnel in clinical assessment, to effectively enhance the quality and efficiency of medical care, prevent unnecessary waste, and provide practical advantages in resource allocation for medical institutes.
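The following Python sketch illustrates the retrieval step of a weighted case-based reasoning system of the kind described above; it is not the authors' Excel VBA implementation, and the factor names, weights, and case values are invented for illustration.

import numpy as np

def retrieve(case, case_base, weights, k=3):
    """Return indices of the k most similar past cases.

    Similarity is ranked by a factor-weighted L1 distance over
    normalised factor values."""
    diffs = np.abs(case_base - case)          # per-factor differences
    dist = (diffs * weights).sum(axis=1)      # weighted L1 distance
    return np.argsort(dist)[:k]

# Toy case base: rows are past patients, columns are screened factors
# (e.g. age, BMI, comorbidity score), scaled to [0, 1].
case_base = np.array([[0.7, 0.4, 0.1],
                      [0.3, 0.9, 0.5],
                      [0.6, 0.5, 0.2]])
weights = np.array([0.5, 0.3, 0.2])           # factor weights (sum to 1)
print(retrieve(np.array([0.65, 0.45, 0.15]), case_base, weights, k=2))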
Abstract: This article aims to analyze the static and pseudostatic stability of a slope by using different methods, such as the Bishop, Janbu, Ordinary, Morgenstern-Price, and GLE methods. Two-dimensional modeling of slope stability was carried out under various loadings: the earthquake effect, the water level, and mobile road loads. The results show that the slope is stable in the static case without water, but in the other cases the slope loses its stability and becomes unstable. The safety factor is calculated to evaluate the stability of the slope using the limit equilibrium method, despite the differences between the results obtained by these methods, which do not rely on the same assumptions. In the end, the results of this study clearly illustrate the influence of the action of water, moving loads, and earthquakes on the stability of the slope.
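For illustration, the Python sketch below computes the safety factor with the Ordinary (Fellenius) method of slices, one of the limit equilibrium methods named above, using FS = sum(c*l + W*cos(a)*tan(phi)) / sum(W*sin(a)) over a circular slip surface; the slice geometry and soil parameters are invented, not taken from the study.

import math

def fellenius_fs(slices, c, phi_deg):
    """slices: list of (weight W [kN/m], base angle a [deg], base length l [m])."""
    tan_phi = math.tan(math.radians(phi_deg))
    resisting = sum(c * l + W * math.cos(math.radians(a)) * tan_phi
                    for W, a, l in slices)
    driving = sum(W * math.sin(math.radians(a)) for W, a, l in slices)
    return resisting / driving

# Three illustrative slices; c in kPa, friction angle phi in degrees.
slices = [(120, 10, 2.0), (180, 25, 2.1), (150, 40, 2.3)]
print(fellenius_fs(slices, c=15.0, phi_deg=22.0))  # FS > 1 indicates stability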
Abstract: In this paper, a new methodology for vendor selection and supply quotas determination (VSSQD) is proposed. The VSSQD problem is solved by a model that combines a revised weighting method for determining the objective function coefficients with a multiple objective linear programming (MOLP) method based on cooperative game theory. The criteria used for VSSQD are: (1) the purchase costs and (2) the product quality supplied by individual vendors. The proposed methodology has been tested on the example of flour purchases for a bakery with two decision makers.
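A minimal sketch of the scalarised linear program behind such a model is given below, assuming a simple weighted sum of the two criteria; the paper's revised weighting method and cooperative game-theoretic step are not reproduced, and the costs, quality scores, capacities, and demand figure are invented.

from scipy.optimize import linprog

cost = [2.0, 2.4]        # purchase cost per unit of flour, vendors 1 and 2
quality = [0.80, 0.95]   # quality score per unit (higher is better)
w_cost, w_qual = 0.6, 0.4

# Minimise w_cost*cost - w_qual*quality, subject to meeting total demand
# and respecting per-vendor capacities.
c = [w_cost * cost[i] - w_qual * quality[i] for i in range(2)]
res = linprog(c,
              A_eq=[[1, 1]], b_eq=[100],        # total demand: 100 units
              bounds=[(0, 70), (0, 70)])        # vendor capacities
print(res.x)  # supply quotas for the two vendors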
Abstract: A Figaro AM-1 sensor module employing the TGS 2600 gas sensor was used for air quality assessment. The system was coupled with a microprocessor that enables the sensor module to issue a warning message via telephone. This low-cost sensor system's performance was compared with that of a DiagNose II commercial electronic nose system. Both the air quality sensor and the electronic nose system employ metal oxide chemical gas sensors. In the study, the experimental setup, the data acquisition methods for the electronic nose system, and the performance of the low-cost air quality system were evaluated and explained.
Abstract: The construction and reconstruction of settlements and individual municipalities, environmental management, the deployment of productive forces, and the building of transport and technical infrastructure require large expenditures of material and human resources. That is why economic aspects come to the foreground in most decisions in these areas and are often decisive. More serious, however, is that the economic aspects of settlement creation and functioning remain, on the whole, unprocessed; one can speak only of a set of individual techniques and methods, traditional indicators, and experiments with new approaches. This is true both at the level of the national economy and in urban designs themselves. Few specific economic patterns shaping settlements have so far been identified, and it is even less possible to speak of controlling them. Moreover, in practical assessments of the economics of specific solutions, unsuitable indicators are often used; in addition, economy is usually identified with the lowest acquisition cost or with high-intensity land use, with little regard for functional efficiency and for the much higher, little-studied operating and maintenance costs.
Abstract: The riveting process is one of the important ways of fastening lap joints in aircraft structures. Failure of aircraft lap joints directly depends on the stress field in the joint. An important application of the riveting process is in the construction of aircraft fuselage structures. In this paper, a 3D finite element analysis is carried out in order to optimize the residual stress field in a riveted lap joint and also to estimate its fatigue life. Subsequently, a number of experiments are designed and analyzed using design of experiments (DOE). Then, the Taguchi method is used to select an optimized case from the different levels of each factor. Besides that, the factor with the greatest effect on the residual stress field is identified. The optimized case provides the maximum residual stress field. The fatigue life of the optimized joint is estimated by the Paris-Erdogan law. Stress intensity factors (SIFs) are calculated using both finite element analysis and empirical formulas. In addition, the effects of the residual stress field, geometry, and secondary bending are considered in the SIF calculation. Good agreement is found between the results of these methods. Comparison between the optimized fatigue life and the fatigue life of other joints shows an improvement in the joint's life.
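The Python sketch below shows the kind of Paris-Erdogan integration used for such fatigue life estimates, with the stress intensity range approximated as dK = Y*dsigma*sqrt(pi*a); the material constants C and m, the geometry factor Y, the stress range, and the crack sizes are illustrative assumptions, not the paper's values.

import math

def paris_life(a0, ac, dsigma, C, m, Y=1.12, steps=10000):
    """Integrate da/dN = C * (dK)^m from crack size a0 to ac [m]."""
    da = (ac - a0) / steps
    cycles = 0.0
    a = a0
    for _ in range(steps):
        dK = Y * dsigma * math.sqrt(math.pi * a)   # MPa*sqrt(m)
        cycles += da / (C * dK ** m)               # dN = da / (C * dK^m)
        a += da
    return cycles

# Crack growing from 0.5 mm to 10 mm under a 100 MPa stress range,
# with aluminium-like constants C and m (assumed values).
print(paris_life(a0=0.0005, ac=0.010, dsigma=100.0, C=1e-11, m=3.0))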
Abstract: This paper presents the results of experimental work to investigate the suitability of a waste material (WM) for soft soil stabilisation. In addition, the effect of the particle size distribution (PSD) of the waste material on its performance as a soil stabiliser was investigated. The WM used in this study is produced from incineration processes in a domestic energy power plant and is available in two different grades of fineness: coarse waste material (CWM) and fine waste material (FWM). An intermediate-plasticity silty clayey soil with medium organic matter content was used. The suitability of the CWM and FWM for improving the physical and engineering properties of the selected soil was evaluated based on the results obtained from the consistency limits, the compaction characteristics (optimum moisture content (OMC) and maximum dry density (MDD)), and the unconfined compressive strength (UCS) test. Different percentages of CWM were added to the soft soil (3, 6, 9, 12, and 15%) to produce various admixtures. The UCS test was then carried out on specimens under different curing periods (zero, 7, 14, and 28 days) to find the optimum percentage of CWM. The optimum percentage and the two percentages either side of it were used for the FWM to evaluate the effect of the fineness of the WM on the UCS of the stabilised soil. Results indicated that both types of WM used in this study improved the physical properties of the soft soil, with the plasticity index (IP) decreasing significantly: from 21 to 13.64 with 12% of CWM and to 13.10 with 15% of FWM. The results of the UCS test indicated that 12% of CWM was the optimum; this percentage increased the UCS value from 202 kPa to 500 kPa for samples cured for 28 days, approximately 2.5 times the UCS value of the untreated soil. Moreover, using FWM at this percentage provided 1.4 times the UCS value of the CWM-stabilised soil, reaching just under 700 kPa after 28 days of curing.
Abstract: Total Quality Management (TQM) refers to management methods used to enhance quality and productivity in business organizations, and it has become a frequently used term in discussions concerning quality. TQM has raised the demands on organizational policy, and customers have gained more importance in the organization's focus. TQM is considered an important management tool which helps organizations to satisfy their customers. In the present research, critical success factors including management commitment, customer satisfaction, continuous improvement, work culture and environment, supplier quality management, training and development, employee satisfaction, and product/process design are studied. A questionnaire was developed to assess these critical success factors in the implementation of total quality management in Indian industry. Questionnaires were completed in consultation with different industrial organizations. The data collected from the questionnaires were analyzed using descriptive statistics and importance indices.
Abstract: STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of the dataset needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets with contaminated attribute values created by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived from the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.
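As an illustration of the statistical core of such a method (though not the authors' exact test statistic), the sketch below asks whether the decision-class frequency among objects matching a rule's condition part departs significantly from the marginal class rate, using a one-sided binomial test; all counts are invented.

from scipy.stats import binomtest

n_condition = 120   # objects in the decision table matching the condition part
n_hit = 95          # of those, objects carrying the rule's decision class
p_marginal = 0.40   # overall frequency of that decision class in the table

result = binomtest(n_hit, n_condition, p_marginal, alternative="greater")
print(result.pvalue < 0.01)  # True: accept the rule as statistically significant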
Abstract: In this study, three robust prediction methods, namely the artificial neural network (ANN), the adaptive neuro-fuzzy inference system (ANFIS) and the support vector machine (SVM), were used for computing the resonant frequency of A-shaped compact microstrip antennas (ACMAs) operating in the UHF band. Firstly, the resonant frequencies of 144 ACMAs with various dimensions and electrical parameters were simulated with the help of IE3D™, based on the method of moments (MoM). The ANN, ANFIS and SVM models for computing the resonant frequency were then built from the simulation data. 124 simulated ACMAs were utilized for training and the remaining 20 ACMAs were used for testing the ANN, ANFIS and SVM models. The performance of the ANN, ANFIS and SVM models is compared for the training and test processes. The average percentage errors (APE) of the computed resonant frequencies in training were 0.457% for the ANN, 0.399% for the ANFIS and 0.600% for the SVM. The constructed models were then tested, and APE values of 0.601% for the ANN, 0.744% for the ANFIS and 0.623% for the SVM were achieved. These results show that the ANN, ANFIS and SVM methods can be successfully applied to compute the resonant frequency of ACMAs, since they are useful and versatile methods that yield accurate results.
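A hedged sketch of the SVM modelling step is given below using scikit-learn's SVR, with the same 124/20 train/test split and the APE measure; the five-column feature matrix and the target are random stand-ins for the IE3D simulation data, not the actual ACMA set.

import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.uniform(size=(144, 5))          # stand-in geometric/electrical params
y = 1.0 + 2.0 * X[:, 0] + X[:, 1]       # stand-in resonant frequency (GHz)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:124], y[:124])             # train on 124 antennas
pred = model.predict(X[124:])           # test on the remaining 20

ape = 100.0 * np.mean(np.abs(pred - y[124:]) / y[124:])  # average % error
print(f"test APE: {ape:.3f}%")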
Abstract: Nowadays, cloud environments are becoming a necessity for companies; this technology provides the opportunity to access data anywhere and at any time. It also provides optimized and secured access to resources and gives more security to the data stored in the platform. However, some companies do not trust cloud providers; they believe that providers can access and modify confidential data such as bank accounts. Much work has been done in this context, concluding that encryption methods applied by providers ensure confidentiality, while overlooking the fact that cloud providers can decrypt the confidential resources. The best solution here is to apply some operations on the data before sending them to the cloud provider, with the objective of making them unreadable. The principal idea is to allow users to protect their data with their own methods. In this paper, we demonstrate our approach and show that it is more efficient in terms of execution time than some existing methods. This work aims at enhancing providers' quality of service and ensuring the trust of customers.
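A minimal sketch of the client-side idea, encrypting data before it reaches the provider so that only the key holder can read it, is shown below; it uses the "cryptography" package's Fernet symmetric scheme as a stand-in, since the paper's own operations are not reproduced here.

from cryptography.fernet import Fernet

key = Fernet.generate_key()       # kept by the user, never sent to the cloud
cipher = Fernet(key)

record = b"bank account data"                 # confidential data (placeholder)
token = cipher.encrypt(record)                # what is uploaded to the provider
assert cipher.decrypt(token) == record        # only the key holder can read it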
Abstract: This paper reviews the model-based qualitative and
quantitative Operations Management research in the context of
Construction Supply Chain Management (CSCM). The construction industry has traditionally been blamed for low productivity, cost and time overruns, waste, high fragmentation and adversarial relationships. It has been slower than other industries to employ the Supply Chain Management (SCM) concept and to develop models that support decision-making and planning.
Over the last decade, however, there has been a distinct shift from a project-based to a supply-based approach to construction management. CSCM has emerged as a promising new management tool for construction operations, improving the performance of construction projects in terms of cost, time and quality. Modeling the Construction Supply
Chain (CSC) offers the means to reap the benefits of SCM, make
informed decisions and gain competitive advantage. Different
modeling approaches and methodologies have been applied in the
multi-disciplinary and heterogeneous research field of CSCM. The
literature review reveals that a considerable percentage of the CSC
modeling research accommodates conceptual or process models
which present general management frameworks and do not relate to
acknowledged soft Operations Research methods. We particularly
focus on the model-based quantitative research and categorize the
CSCM models depending on their scope, objectives, modeling
approach, solution methods and software used. Although over the last
few years there has been clearly an increase of research papers on
quantitative CSC models, we identify that the relevant literature is
very fragmented with limited applications of simulation,
mathematical programming and simulation-based optimization. Most
applications are project-specific or study only parts of the supply
system. Thus, some complex interdependencies within construction
are neglected and the implementation of integrated supply chain management is hindered. We conclude this paper by giving future
research directions and emphasizing the need to develop optimization
models for integrated CSCM. We stress that CSC modeling needs a
multi-dimensional, system-wide and long-term perspective. Finally,
prior applications of SCM to other industries have to be taken into
account in order to model CSCs, but not without translating the
generic concepts to the context of the construction industry.
Abstract: The aim of this research study is to investigate and establish the characteristics of brain dominance (BD) and multiple intelligences (MI). The experiment was conducted with a sample of 552 undergraduate computer-engineering students. In addition, a mathematical formulation was established to exhibit the relation between thinking and intelligence, and its correlation was analyzed. The correlation was measured statistically using Pearson's coefficient. Analysis of the results shows that there is a strong relation between thinking and intelligence. This research was carried out to improve didactic methods in engineering learning and also to improve e-learning strategies.
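A minimal sketch of the correlation step is given below using SciPy's Pearson coefficient; the two score vectors are invented stand-ins for the 552-student BD and MI data.

from scipy.stats import pearsonr

bd_scores = [12, 15, 9, 20, 14, 17, 11, 18]   # brain-dominance scores
mi_scores = [30, 34, 25, 42, 33, 39, 28, 40]  # multiple-intelligence scores

r, p_value = pearsonr(bd_scores, mi_scores)
print(f"r = {r:.2f}, p = {p_value:.4f}")      # r near 1 suggests a strong relation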
Abstract: The main objective of this study was to assess the annual concentration and seasonal variation of benzo(a)pyrene (BaP) associated with PM10 at an urban site in Győr and a rural site in Sarród over the sampling period 2008–2012. A total of 280 PM10 aerosol samples were collected at each sampling site and analyzed for BaP by a gas chromatography method. The BaP concentrations ranged from undetected to 8 ng/m³ with a mean value of 1.01 ng/m³ at the Győr sampling site, and from undetected to 4.07 ng/m³ with a mean value of 0.52 ng/m³ at the Sarród sampling site. Relatively higher BaP concentrations were detected in samples collected at both sampling sites during the heating seasons compared with the non-heating periods. The annual mean BaP concentrations were comparable with published data from other Hungarian sites.
Abstract: The main objective of this article is to examine the impact of interest rates on investments in Poland in the context of the financial crisis. The paper also investigates the dependence of bank loans to enterprises on interbank market rates. The article studies the impact of the interbank market rate on the level of investments in Poland. Moreover, this article focuses on the correlation between the level of corporate loans and the amount of investments in Poland, in order to determine the indirect impact of central bank interest rates on the real economy through the transmission mechanism of monetary policy. To achieve this objective, we have used econometric and statistical research methods, namely an econometric model and the Pearson correlation coefficient.
This analysis suggests that the central bank reference rate has an inverse effect on the level of investments in Poland and that this dependence is moderate. This is also an important issue because it is related to Poland's preparations for accession to the euro area. The research is important from both theoretical and empirical points of view. The formulated conclusions and recommendations determine the practical significance of the paper, which may be used in the decision-making process of the monetary and economic authorities of the country.
Abstract: This paper presents an approach to the classification of unstructured format descriptions for the identification of file formats. The main contribution of this work is the employment of data mining techniques to support file format selection using just the unstructured text description that comprises the most important format features for a particular organisation. Subsequently, the file format identification method employs a file format classifier and associated configurations to support digital preservation experts with an estimation of the required file format. Our goal is to make use of a format specification knowledge base aggregated from different Web sources in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends a file format to an expert for his or her institution. The proposed methods facilitate the selection of a file format and improve the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and specifications of file formats. To facilitate decision making, the aggregated information about the file formats is presented as a file format vocabulary that comprises the most common terms characteristic of all researched formats. The goal is to suggest a particular file format based on this vocabulary for analysis by an expert. A sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
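An illustrative sketch of the naive Bayes step is given below using scikit-learn; the training descriptions, format labels, and query are invented stand-ins for the aggregated knowledge base and its vocabulary.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy knowledge base: one short description per format (invented).
descriptions = [
    "lossless raster image with transparency and palette support",
    "page layout fixed fonts embedded printing portable document",
    "plain text markup headings links lightweight",
]
labels = ["PNG", "PDF", "Markdown"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(descriptions, labels)

# Classify an unstructured requirements description and report confidence.
query = "fixed layout document with embedded fonts for printing"
print(clf.predict([query])[0], clf.predict_proba([query]).max())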
Abstract: The aim of this investigation is to elaborate near-infrared methods for testing and recognizing the chemical components and quality of "Pannon wheat" allied (i.e. true-to-variety or variety-identified) milling fractions, as well as to develop spectroscopic methods for following the milling processes and to evaluate the stability of the milling technology across different types of milling products and sampling times, respectively. These wheat categories were produced under industrial conditions, and samples were collected according to sampling time and at maximum or minimum yields. The changes in the main chemical components (such as starch, protein, and lipid) and in the physical properties of the fractions (particle size) were analysed by dispersive spectrophotometers using the visible (VIS) and near-infrared (NIR) regions of electromagnetic radiation. Close correlations were obtained between the data of the spectroscopic measurement techniques, processed by various chemometric methods (e.g. principal component analysis [PCA] and cluster analysis [CA]), and the operating conditions of the milling technology. It is evident that NIR methods are able to detect deviations in the yield parameters and differences between sampling times across a wide variety of fractions. NIR technology can thus be used for sensitive monitoring of milling technology.
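A hedged sketch of the PCA step is shown below; the spectra are random stand-ins (real NIR spectra would be measured absorbance over many wavelength channels), and the offset applied to half the rows merely mimics two distinct milling fractions.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
spectra = rng.normal(size=(30, 200))   # 30 fraction samples x 200 wavelengths
spectra[:15] += 0.5                    # crude offset mimicking a second group

# Project onto the first two principal components; separated score clusters
# would reflect differences between fractions or sampling times.
scores = PCA(n_components=2).fit_transform(spectra)
print(scores[:3])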
Abstract: In this paper, the goal programming methodology is applied to the multiple objective problem of technological variants and production plan optimization. The optimization criteria are determined, and a multiple objective linear programming model for the problem of technological variants and production plan optimization is formed and solved. The obtained results are then analysed. They point to the possibility of efficient application of the goal programming methodology to the problem of technological variants and production plan optimization. The paper also points out the advantages of applying the goal programming methodology, compared to the Surrogate Worth Trade-off method, for this problem.
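A minimal sketch of the goal programming formulation, minimising deviations from target (goal) levels via deviation variables, is given below; the two goals, their coefficients, and the targets are invented for illustration, not taken from the paper's model.

from scipy.optimize import linprog

# Variables: x1, x2 (production levels), then under/over deviation variables
# d1m, d1p for goal 1 and d2m, d2p for goal 2.
# Goal 1 (profit):        3*x1 + 5*x2 + d1m - d1p = 60
# Goal 2 (machine hours): 2*x1 + 1*x2 + d2m - d2p = 20
c = [0, 0, 1, 0, 0, 1]     # penalise under-achieved profit, over-used hours
A_eq = [[3, 5, 1, -1, 0, 0],
        [2, 1, 0, 0, 1, -1]]
b_eq = [60, 20]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
print(res.x[:2])           # production plan closest to the stated goals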
Abstract: As technology-based service industries grow rapidly worldwide, companies are recognizing the importance of early market occupancy and have made efforts to capture a large market to gain the upper hand. To this end, a focus on patents can be used to determine the properties of a technology, as well as to capture advantages in technical skills in comparison with a firm's competitors. However, technology-based services depend largely not only on their technological value but also on their economic value, due to the recognized worth that is passed on to a plurality of users. Thus, it is important to determine whether there are any competitors in the target areas and what services they provide in each field. Despite this importance, little effort has been made to systematically benchmark competitors in order to identify business opportunities. Thus, this study aims not only to identify the position of each technology-centered service company in complex market dynamics, but also to discover new business opportunities. For this, we consider both the technology and market environments simultaneously by utilizing patent data as a representative proxy for technology and trademark data as an index of a firm's target goods and services. Theoretically, this is one of the earliest attempts to combine patent data and trademark data to analyze corporate strategies. In practice, the research results are expected to be used as a decision criterion to diagnose the economic value that companies can obtain by entering the market, as well as the technological value to be passed on to their customers. Thus, the proposed approach can be useful for supporting effective technology and business strategies in a firm.