Abstract: Numerous signal-processing-based speech enhancement systems have been proposed to improve intelligibility in the presence of noise. Traditionally, studies of neural vowel encoding have focused on the representation of formants (peaks in vowel spectra) in the discharge patterns of the population of auditory-nerve (AN) fibers. A method is presented for recoding high-frequency speech components into a low-frequency region to increase audibility for listeners with hearing loss. The purpose of this paper is to enhance the formants of speech based on the Kaiser window. The pitch and formants of the signal are estimated using autocorrelation, zero crossing, and the magnitude difference function. The formant enhancement stage aims to restore the representation of formants at the level of the midbrain. The system is implemented in MATLAB with low computational complexity.
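As a sketch of the windowing step, the Kaiser window mentioned above can be computed in pure Python from its closed form; the β value and window length below are illustrative, not taken from the paper:

```python
import math

def i0(x, terms=25):
    # Zeroth-order modified Bessel function of the first kind (series expansion).
    s, t = 1.0, 1.0
    for k in range(1, terms):
        t *= (x / (2.0 * k)) ** 2
        s += t
    return s

def kaiser_window(n_points, beta):
    # Kaiser window: w[n] = I0(beta * sqrt(1 - (2n/(N-1) - 1)^2)) / I0(beta)
    denom = i0(beta)
    return [i0(beta * math.sqrt(1.0 - (2.0 * n / (n_points - 1) - 1.0) ** 2)) / denom
            for n in range(n_points)]

w = kaiser_window(16, beta=6.0)  # symmetric taper, peak at the centre
```

Larger β values narrow the main lobe of the taper and suppress side lobes, which is why the window is a common choice for shaping formant-region filters.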
Abstract: The problem of construction material waste remains unresolved, as a significant percentage of the materials delivered to project sites end up as waste, which may result in additional project cost. Cost overrun is a problem that affects 90% of completed projects in the world. The debate on how to eliminate it has been ongoing for the past 70 years, yet there has been neither substantial improvement nor a significant solution for mitigating its detrimental effects. Research has proposed various approaches to managing construction cost overruns and material waste; nonetheless, these studies failed to give a clear indication of a framework and an equation for managing construction material waste and cost overruns. Hence, this research aims to develop a conceptual framework and a mathematical equation for managing material waste and cost overrun in the construction industry. The paper adopts a desktop methodological approach, comparing the causes of material waste with those of cost overruns reported in the literature to determine the possible relationship. The review revealed a relationship between material waste and cost overrun: an increase in material waste results in a corresponding increase in the amount of cost overrun at both the pre-contract and post-contract stages of a project. It was found from the equation that achieving effective construction material waste management requires a “Good Quality of Planning, Estimating, and Design Management” and a “Good Quality of Construction, Procurement and Site Management”, as well as a decrease in “Design Complexity”, which would reduce “Material Waste” and subsequently reduce the amount of cost overrun by 86.74%. The conceptual framework and the mathematical equation developed in this study are recommended to professionals of the construction industry.
Abstract: Orthopaedic surgeries are characterized by a high degree of complexity. This is reflected by four main groups of resources: 1) the surgical team, which consists of people with different competencies, educational backgrounds and positions; 2) information and knowledge about the medical and technical aspects of surgery; 3) medical equipment, including surgical tools and materials; 4) spatial infrastructure, which is important from an operating room layout point of view. All these components must be integrated into a homogeneous organism to achieve an efficient and ergonomically correct surgical workflow. Against this background, a concept for an international project was formulated, called “Online Vocational Training course on ergonomics for orthopaedic Minimally Invasive” (Train4OrthoMIS), whose aim is to develop an e-learning tool available in 4 languages (English, Spanish, Polish and German). The article presents the first project research outcomes, focused on three aspects: 1) the ergonomic needs of surgeons who work in hospitals across different European countries, 2) the concept and structure of the e-learning course, 3) the definition of tools and methods for knowledge assessment adjusted to users’ expectations. The methodology was based on expert panels and two types of surveys: 1) on training needs, 2) on evaluation and self-assessment preferences. The major findings of the study allowed the subjects of four training modules and learning sessions to be described. According to the respondents’ opinions, the most expected test methods were defined: single-choice tests, followed by “True or False” and “Link elements” quizzes. The first project outcomes confirmed the necessity of creating a universal training tool for orthopaedic surgeons regardless of the country in which they work. Because of surgeons’ limited time, the e-learning course should be strictly adjusted to their expectations in order to be useful.
Abstract: The article proposes maximum power point tracking (MPPT) without a mechanical sensor, using a Multilayer Perceptron Neural Network (MLPNN). The aim of the article is to reduce cost and complexity while retaining efficiency. In the experiment, the duty cycle that generates maximum power is produced when suitable conditions are met. The measured data from the DC generator, voltage (V), current (I), power (P), rate of change of power (dP), and rate of change of voltage (dV), are used as inputs to the MLPNN model. The output of the model is the duty cycle for driving the converter. The experiment was implemented using an Arduino Uno board. The proposed MLPNN-based MPPT is compared with Perturb and Observe (P&O) control. The experimental results show that the proposed MLPNN-based approach is more efficient than the P&O algorithm for this application.
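A minimal sketch of the MLPNN mapping described above, assuming a 5-3-1 feedforward network with sigmoid activations; the weights are illustrative placeholders, not trained values from the paper:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_duty_cycle(inputs, w_hidden, b_hidden, w_out, b_out):
    """One-hidden-layer MLP: 5 inputs (V, I, P, dP, dV) -> duty cycle in (0, 1)."""
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)

# Illustrative (untrained) weights for a 5-3-1 network.
w_h = [[0.2, -0.1, 0.05, 0.3, -0.2],
       [0.1, 0.4, -0.3, 0.2, 0.1],
       [-0.2, 0.1, 0.2, -0.1, 0.3]]
b_h = [0.1, -0.1, 0.0]
w_o = [0.5, -0.4, 0.3]

# Hypothetical sensor readings: V=12 V, I=1.5 A, P=18 W, dP=0.2, dV=-0.1.
duty = mlp_duty_cycle([12.0, 1.5, 18.0, 0.2, -0.1], w_h, b_h, w_o, 0.0)
```

The final sigmoid conveniently constrains the output to (0, 1), the valid range for a PWM duty cycle driving the converter.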
Abstract: Information security plays a major role in uplifting the standard of secured communications via global media. In this paper, we suggest a technique of encryption followed by insertion before transmission. We implement two different concepts to carry out these tasks. We use a two-point crossover technique from the genetic algorithm to facilitate the encryption process. For each uniquely identified row of pixels, different mathematical methodologies are applied under several condition checks in order to determine the parent pixels on which the crossover operation is performed. This is done by selecting two crossover points within the pixels, thereby producing the newly encrypted child pixels and hence the encrypted cover image. In the next stage, the first- and second-order derivative operators are evaluated to increase security and robustness. The last stage reapplies the crossover procedure to form the final stego-image. The complexity of this system as a whole is huge, thereby dissuading third-party interference. Also, the embedding capacity is very high, so a larger amount of secret image information can be hidden. The imperceptibility of the obtained stego-image clearly proves the proficiency of this approach.
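The two-point crossover at the heart of the encryption stage can be sketched as follows; the pixel rows and crossover points are illustrative:

```python
def two_point_crossover(parent_a, parent_b, p1, p2):
    """Swap the gene segment between crossover points p1 and p2 (p1 < p2)."""
    child_a = parent_a[:p1] + parent_b[p1:p2] + parent_a[p2:]
    child_b = parent_b[:p1] + parent_a[p1:p2] + parent_b[p2:]
    return child_a, child_b

# Example on two rows of 8-bit pixel values.
row_a = [10, 20, 30, 40, 50, 60]
row_b = [200, 210, 220, 230, 240, 250]
c_a, c_b = two_point_crossover(row_a, row_b, 2, 4)
# c_a == [10, 20, 220, 230, 50, 60]
# c_b == [200, 210, 30, 40, 240, 250]
```

In the scheme described above the parent rows and crossover points come from per-row condition checks, so the operation is reversible given the same parameters.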
Abstract: Natural gas sweetening is a controlled process that must be performed at maximum efficiency and with the highest quality. In this work, due to the complexity and non-linearity of the process, the H2S gas separation and the intelligent fuzzy controller used to enhance the process are simulated in MATLAB/Simulink. A new fuzzy control design for the gas separator is discussed in this paper. The design is based on the use of linear state estimation to generate the internal knowledge base that stores input-output pairs. The obtained input/output pairs are then used to design a feedback fuzzy controller. The proposed closed-loop fuzzy control system maintains asymptotic stability of the system while enhancing its time response to achieve better control of the concentration of the output gas from the tower. Simulation studies are carried out to illustrate the gas separator system performance.
Abstract: Interaction between mixing and crystallization is often
ignored despite the fact that it affects almost every aspect of the
operation including nucleation, growth, and maintenance of the
crystal slurry. This is especially pronounced in multiple impeller
systems where flow complexity is increased. By choosing proper
mixing parameters, which closely depends on knowledge of the
hydrodynamics in the mixing vessel, the process of batch cooling
crystallization may be considerably improved. The values that render
useful information when making this choice are mixing time and
power consumption. The predominant motivation for this work was
to investigate the extent to which radial dual impeller configuration
influences mixing time, power consumption and consequently the
values of metastable zone width and nucleation rate. In this research,
crystallization of borax was conducted in a 15 dm³ baffled batch
cooling crystallizer with an aspect ratio (H/T) of 1.3. Mixing was
performed using two straight blade turbines (4-SBT) mounted on the
same shaft that generated radial fluid flow. Experiments were
conducted at different values of N/NJS ratio (impeller speed/
minimum impeller speed for complete suspension), D/T ratio
(impeller diameter/crystallizer diameter), c/D ratio (lower impeller
off-bottom clearance/impeller diameter), and s/D ratio (spacing
between impellers/impeller diameter). Mother liquor was saturated at
30°C and was cooled at a rate of 6°C/h. Its concentration was
monitored in-line by a Na-ion-selective electrode. From the values of
supersaturation that was monitored continuously over process time, it
was possible to determine the metastable zone width and
subsequently the nucleation rate using Mersmann’s nucleation
criterion. For all applied dual impeller configurations, the mixing
time was determined by potentiometric method using a pulse
technique, while the power consumption was determined using a
torque meter produced by Himmelstein & Co. Results obtained in
this investigation show that dual impeller configuration significantly
influences the values of mixing time, power consumption as well as
the metastable zone width and nucleation rate. Special attention
should be paid to the impeller spacing, considering the flow
interaction that can be more or less pronounced depending on the
spacing value.
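For context on the power consumption measurements described above, impeller power draw in the turbulent regime follows the standard relation P = Po·ρ·N³·D⁵; a sketch with assumed values (the power number, density, speed and diameter below are illustrative, not the experimental values):

```python
def impeller_power(power_number, density, speed_rps, diameter_m):
    """Impeller power draw in the turbulent regime: P = Po * rho * N^3 * D^5 (watts)."""
    return power_number * density * speed_rps ** 3 * diameter_m ** 5

# Illustrative values: Po ~ 4 for a straight-blade turbine (assumed),
# saturated borax solution density ~ 1100 kg/m^3 (assumed),
# N = 5 rev/s, D = 0.08 m.
p_single = impeller_power(4.0, 1100.0, 5.0, 0.08)
```

The N³D⁵ dependence is why the D/T and N/NJS ratios studied in the paper have such a strong effect on measured power consumption; for dual impellers the total draw also depends on the spacing-dependent flow interaction, which a torque meter captures directly.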
Abstract: Orthogonal Frequency Division Multiplexing
(OFDM) has been used in many advanced wireless communication
systems due to its high spectral efficiency and robustness to
frequency selective fading channels. However, the major concern
with OFDM system is the high peak-to-average power ratio (PAPR)
of the transmitted signal. Some of the popular techniques used for
PAPR reduction in OFDM system are conventional partial transmit
sequences (CPTS) and clipping. In this paper, a parallel
combination/hybrid scheme of PAPR reduction using clipping and
CPTS algorithms is proposed. The proposed method intelligently
applies both the algorithms in order to reduce both PAPR as well as
computational complexity. The proposed scheme slightly degrades
bit error rate (BER) performance due to the clipping operation; this
degradation can be reduced by selecting an appropriate clipping ratio
(CR). The simulation results show that the proposed algorithm
achieves significant PAPR reduction with much reduced
computational complexity.
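The clipping stage of the hybrid scheme can be sketched in a few lines: compute the time-domain OFDM signal, measure its PAPR, and limit each sample's magnitude to CR times the RMS amplitude. The subcarrier values and clipping ratio below are illustrative:

```python
import cmath
import math

def idft(symbols):
    """Inverse DFT: maps N frequency-domain symbols to N time-domain samples."""
    n = len(symbols)
    return [sum(s * cmath.exp(2j * math.pi * k * t / n) for k, s in enumerate(symbols)) / n
            for t in range(n)]

def papr_db(samples):
    """Peak-to-average power ratio in dB."""
    powers = [abs(x) ** 2 for x in samples]
    return 10.0 * math.log10(max(powers) / (sum(powers) / len(powers)))

def clip(samples, clipping_ratio):
    """Limit each sample's magnitude to CR times the RMS amplitude."""
    rms = math.sqrt(sum(abs(x) ** 2 for x in samples) / len(samples))
    limit = clipping_ratio * rms
    return [x if abs(x) <= limit else x / abs(x) * limit for x in samples]

symbols = [1, -1, 1, 1, -1, 1, -1, -1]   # BPSK on 8 subcarriers (toy example)
time_sig = idft(symbols)
clipped = clip(time_sig, clipping_ratio=1.2)
```

Lowering CR clips more aggressively, trading PAPR reduction against the BER degradation noted above; the CPTS stage of the hybrid scheme then works on the already-clipped signal.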
Abstract: The study of aerodynamics is related to improving the
performance of airplanes and automobiles, with the objective of
reducing the effect of air friction on their structures, providing
higher speeds and lower fuel consumption. The application of
aerodynamic knowledge is no longer limited to the aeronautical and
automobile industries. Therefore, this research aims at the design
and construction of a wind tunnel to perform aerodynamic analysis
on car bodies, seeking greater efficiency. For this, a methodology
for selecting the type of wind tunnel to be built was designed,
taking into account the various existing configurations; an
open-circuit tunnel was chosen due to its lower construction and
installation complexity, operational simplicity, and low cost. The
guidelines for the project were didactic: boundary layer study and
the analysis of specimens with different geometries. The pressure
variation in the test section was measured using a Pitot tube
connected to a gauge. Thus, it was possible to obtain quantitative
and qualitative results, which proved satisfactory.
Abstract: In wireless sensor network, sensor node transmits the
sensed data to the sink node in multi-hop communication
periodically. This high traffic induces congestion at the nodes
located one hop from the sink node. The packet transmission
and reception rates of these nodes are very high when
compared to those of other sensor nodes in the network. Therefore, the energy
consumption of these nodes is very high, and this effect is known as the
“funneling effect”. The tree based-data aggregation technique
(TBDA) is used to reduce the energy consumption of these nodes.
The results show a considerable decrease in the number of packet
transmissions to the sink node. The proposed scheme, TBDA,
avoids the funneling effect and extends the lifetime of the wireless
sensor network. The average-case time complexity for inserting a
node in the tree is O(n log n) and the worst-case time complexity is
O(n²).
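The aggregation step of a tree-based scheme like TBDA can be sketched as a recursive fold over the routing tree, so that each node forwards a single aggregated packet instead of relaying every individual reading; the topology and readings below are hypothetical:

```python
def aggregate(tree, readings, node):
    """Sum readings up the aggregation tree: each node combines its own
    reading with the aggregates of its children, so only one packet
    leaves each subtree toward the sink."""
    return readings[node] + sum(aggregate(tree, readings, child)
                                for child in tree.get(node, []))

# Hypothetical topology: sink 0 with two subtrees.
tree = {0: [1, 2], 1: [3, 4], 2: [5]}
readings = {0: 0, 1: 2, 2: 3, 3: 5, 4: 7, 5: 11}
total = aggregate(tree, readings, 0)  # 28
```

Because each one-hop neighbour of the sink now sends one combined packet per round rather than relaying every descendant's packet, the traffic concentration that causes the funneling effect is avoided.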
Abstract: Companies face increasing challenges in research due
to higher costs and risks. The intensifying technology complexity and
interdisciplinarity require unique know-how. Therefore, companies
need to decide whether research shall be conducted internally or
externally with partners. On the other hand, research institutes meet
increasing efforts to achieve good financing and to maintain high
research reputation. Therefore, relevant research topics need to be
identified and specialization of competency is necessary. However,
additional competences for solving interdisciplinary research projects
are also often required. Secure financing can be achieved by
bonding with industry partners as well as through public funding. The realization
of faster and better research drives companies and research institutes
to cooperate in organized research networks, which are managed by
an administrative organization. For an effective and efficient
cooperation, necessary processes, roles, tools and a set of rules need
to be determined. The goal of this paper is to review state-of-the-art
research and to propose a governance framework for organized
research networks.
Abstract: The ASEAN Economic Community (AEC) is the goal
of regional economic integration by 2015. In the region, tourism is
an important activity, especially as a source of foreign currency,
employment creation, and income for the region. Given the
complexity of the issues entailed in the concept of sustainable
tourism, this paper tries to assess tourism sustainability within
ASEAN, based on a number of quantitative indicators for all
the ten economies, Thailand, Myanmar, Laos, Vietnam, Malaysia,
Singapore, Indonesia, Philippines, Cambodia, and Brunei. The
methodological framework will provide a number of benchmarks of
tourism activities in these countries. They include identification of the
dimensions; for example, economic, socio-ecologic, infrastructure
and indicators, method of scaling, chart representation and evaluation
of Asian countries. This specification shows that a similar level of
tourism activity might be implemented differently and might have
different consequences for the socio-ecological environment and
sustainability. The heterogeneity of
developing countries exposed briefly here would be useful to detect
and prepare for coping with the main problems of each country in
their tourism activities, as well as competitiveness and value creation
of tourism for ASEAN economic community, and will compare with
other parts of the world.
Abstract: For the last decade, researchers have started to focus
their interest on Multicast Group Key Management Framework. The
central research challenge is secure and efficient group key
distribution. The present paper presents a bit-model-based secure
multicast group key distribution scheme using the most popular
absolute encoder output code, the Gray code. The focus is twofold.
The first fold deals with the reduction of
computation complexity which is achieved in our scheme by
performing fewer multiplication operations during the key updating
process. To optimize the number of multiplication operations, an
O(1) time algorithm to multiply two N-bit binary numbers which
could be used in an N x N bit-model of reconfigurable mesh is used
in this proposed work. The second fold aims at reducing the amount
of information stored in the Group Center and group members while
performing the update operation in the key content. Comparative
analysis to illustrate the performance of various key distribution
schemes is shown in this paper and it has been observed that this
proposed algorithm reduces the computation and storage complexity
significantly. Our proposed algorithm is suitable for high
performance computing environment.
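The binary-reflected Gray code on which the scheme relies can be generated with a single XOR; a minimal sketch:

```python
def to_gray(n):
    """Binary-reflected Gray code of n: adjacent integers map to
    codewords differing in exactly one bit."""
    return n ^ (n >> 1)

def from_gray(g):
    """Inverse mapping: XOR-fold the shifted bits back into binary."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

codes = [to_gray(i) for i in range(8)]
# codes == [0, 1, 3, 2, 6, 7, 5, 4]; successive codes differ in one bit
```

The single-bit-change property is what makes the code attractive for key updates: moving between adjacent key indices flips only one bit of the codeword, minimizing the portion of the key material that must be recomputed.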
Abstract: As a result of the ambiguity and complexity
surrounding anaerobic digester foaming, efforts have been made by
various researchers to understand the process of anaerobic digester
foaming so as to proffer a solution that can be universally applied
rather than site-specific. All attempts, ranging from experimental
analysis to comparative reviews of other processes, have not fully
explained the conditions and process of foaming in anaerobic
digesters. Studying the currently available knowledge on foam
formation and relating it to anaerobic digester process and operating
condition, this piece of work presents a succinct and enhanced
understanding of foaming in anaerobic digesters as well as
introducing a simple method to identify the onset of anaerobic
digester foaming based on analysis of historical data from a field
scale system.
Abstract: Polymorphism is one of the main pillars of the object-
oriented paradigm. It induces hidden forms of class dependencies
which may impact software quality, resulting in higher cost factor for
comprehending, debugging, testing, and maintaining the software. In
this paper, a new cognitive complexity metric called Cognitive
Weighted Polymorphism Factor (CWPF) is proposed. Apart from the
software structural complexity, it includes the cognitive complexity
on the basis of type. The cognitive weights are calibrated based on 27
empirical studies with 120 persons. A case study and experimentation
of the new software metric shows positive results. Further, a
comparative study is made and the correlation test has proved that
CWPF complexity metric is a better, more comprehensive, and more
realistic indicator of the software complexity than Abreu’s
Polymorphism Factor (PF) complexity metric.
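Abreu's Polymorphism Factor, the baseline against which CWPF is compared, is the ratio of actual method overrides to the maximum possible number of overrides; a sketch under the standard MOOD definition, with a hypothetical class hierarchy (the CWPF cognitive weights themselves are not reproduced here):

```python
def polymorphism_factor(classes):
    """Abreu's PF: overridden methods divided by the maximum possible
    number of overrides (new methods times number of descendants)."""
    overridden = sum(c["overridden"] for c in classes)
    possible = sum(c["new"] * c["descendants"] for c in classes)
    return overridden / possible if possible else 0.0

# Hypothetical hierarchy: a base class declaring 4 new methods with
# 2 descendants, each descendant overriding some of them.
classes = [
    {"new": 4, "overridden": 0, "descendants": 2},  # base class
    {"new": 0, "overridden": 3, "descendants": 0},  # subclass A
    {"new": 1, "overridden": 2, "descendants": 0},  # subclass B
]
pf = polymorphism_factor(classes)  # 5 / 8 = 0.625
```

CWPF, as described above, departs from this by weighting each override with an empirically calibrated cognitive weight rather than counting all overrides equally.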
Abstract: The garment manufacturing industry involves
sequential processes that are subjected to uncontrollable variations.
The industry depends on the skill of labour in handling the varieties
of fabrics and accessories, machines, as well as complicated sewing
operation. Due to these reasons, garment manufacturers have created
systems to monitor and to control the quality of the products on a
regular basis by conducting quality approaches to minimize variation.
With that, the aim of this research has been to ascertain the quality
approaches deployed by Malaysian garment manufacturers in three
key areas - quality systems and tools; quality control and types of
inspection; as well as sampling procedures chosen for garment
inspection. Besides, the focus of this research was to distinguish the
quality approaches adopted by companies that supplied finished
garments to both domestic and international markets. Feedback from
each company representative was obtained via an online survey,
which comprised five sections and 44 questions on the
organizational profile and the quality approaches employed in the
garment industry. As a result, the response rate was 31%. The results
revealed that almost all companies have established their own
mechanism of process control by conducting a series of quality
inspections for daily production, whether formally set up or
otherwise. In addition, quality inspection has been the predominant
quality control activity in the garment manufacturing, while the level
of complexity of these activities was substantially dictated by the
customers. Moreover, AQL-based sampling was utilized by
companies dealing with exports, whilst almost all the companies that
only concentrated on the domestic market were comfortable using
their own sampling procedures for garment inspection. Hence, this
research has provided insights into the implementation of a number
of quality approaches that were perceived as important and useful in
the garment manufacturing sector, which is truly labour-intensive.
Abstract: Maintaining factory default battery endurance rate
over time in supporting huge amount of running applications on
energy-restricted mobile devices has created a new challenge for
mobile application developers. While striving to meet customers’
unlimited expectations, developers are barely aware of the efficient
use of energy by the application itself. Thus, developers need a set of
valid energy consumption indicators in assisting them to develop
energy saving applications. In this paper, we present a few software
product metrics that can be used as an indicator to measure energy
consumption of Android-based mobile applications in the early
design stage. In particular, Trepn Profiler (a power profiling tool for
Qualcomm processors) was used to collect mobile application
power consumption data, which were then analyzed against the 23
software metrics in this preliminary study. The results show that
McCabe cyclomatic complexity, number of parameters, nested block
depth, number of methods, weighted methods per class, number of
classes, total lines of code and method lines have direct relationship
with power consumption of mobile application.
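One of the metrics found to correlate with power consumption, McCabe cyclomatic complexity, can be approximated from source code with Python's standard ast module; this is an illustrative sketch, not the measurement tooling used in the study:

```python
import ast

# Decision points counted toward McCabe's metric; each adds one path.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source):
    """McCabe complexity approximated as 1 + number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

code = """
def classify(x):
    if x < 0:
        return "neg"
    for i in range(x):
        if i % 2:
            return "odd"
    return "even"
"""
cc = cyclomatic_complexity(code)  # 1 + if + for + if = 4
```

The study's subject applications are Java/Android rather than Python, so a comparable counter there would walk a Java parse tree, but the metric definition is language-independent.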
Abstract: In this paper, we try to demonstrate the importance of
the project approach in urban planning for dealing with
uncertainty, the importance of involving all stakeholders in the
urban project process (the absence of an actor can lead to project
failure), and the importance of urban project management. These
points are addressed through the following questions: Does the
urban field adhere to the theory of complexity? Does the project
approach bring hope and a solution for making urban planning
"sustainable"? How can the visions of the actors converge on the
same project? Is urban project management the solution to support
the urban project approach?
Abstract: This paper integrates Octagon and Square Search
pattern (OCTSS) motion estimation algorithm into H.264/AVC
(Advanced Video Coding) video codec in Adaptive Group of Pictures
(AGOP) mode. AGOP structure is computed based on scene change
in the video sequence. Octagon and square search pattern block-based
motion estimation method is implemented in inter-prediction process
of H.264/AVC. These methods reduce bit rate and computational
complexity, respectively, while maintaining the quality of the video
sequence. Experiments are conducted on different types of video
sequences. The results show that the bit rate,
computation time, and PSNR gain achieved by the proposed method
are better than those of the existing H.264/AVC with fixed GOP and AGOP.
With a marginal gain in quality of 0.28dB and average gain in bitrate
of 132.87kbps, the proposed method reduces the average computation
time by 27.31 minutes when compared to the existing state-of-art
H.264/AVC video codec.
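As a baseline for the block-based motion estimation that OCTSS accelerates, a full square-window search minimizing the sum of absolute differences (SAD) can be sketched as follows; OCTSS itself visits far fewer candidate points via its octagon and square patterns, and the frames below are toy data:

```python
def sad(frame_a, frame_b, ax, ay, bx, by, block):
    """Sum of absolute differences between two block positions."""
    return sum(abs(frame_a[ay + r][ax + c] - frame_b[by + r][bx + c])
               for r in range(block) for c in range(block))

def square_search(cur, ref, x, y, block, radius):
    """Exhaustively test candidate motion vectors in a square window
    around (x, y); returns the displacement with the lowest SAD."""
    best, best_mv = float("inf"), (0, 0)
    h, w = len(ref), len(ref[0])
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            rx, ry = x + dx, y + dy
            if 0 <= rx <= w - block and 0 <= ry <= h - block:
                cost = sad(cur, ref, x, y, rx, ry, block)
                if cost < best:
                    best, best_mv = cost, (dx, dy)
    return best_mv

# A bright 2x2 block shifted right by one pixel between frames.
ref = [[0, 9, 9, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0],
       [0, 0, 0, 0]]
cur = [[0, 0, 9, 9],
       [0, 0, 9, 9],
       [0, 0, 0, 0],
       [0, 0, 0, 0]]
mv = square_search(cur, ref, 2, 0, 2, 2)  # motion vector back to the ref block
```

A pattern search such as OCTSS replaces the O(radius²) candidate loop with a small, adaptive set of octagon- and square-shaped probe points, which is where the computation-time savings reported above come from.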
Abstract: Rapid prototyping is a new group of manufacturing
processes which allows the fabrication of physical models of any
complexity using a layer-by-layer deposition technique directly
from a computer system. The rapid prototyping process greatly reduces the time and
cost necessary to bring a new product to market. The prototypes
made by these systems are used in a range of industrial application
including design evaluation, verification, testing, and as patterns for
casting processes. These processes employ a variety of materials and
mechanisms to build up the layers that form the part. The aim of the
present work was to build an FDM prototyping machine that could
control the X-Y motion and material deposition, to generate two-
dimensional and three-dimensional complex shapes. This study
focused on the deposition of wax material and sought to determine
the properties of the wax used in order to enable better control of
the FDM process. The study also looked at the integration of
a computer controlled electro-mechanical system with the traditional
FDM additive prototyping process. The characteristics of the wax
were also analysed in order to optimise the model production process.
These included wax phase change temperature, wax viscosity and
wax droplet shape during processing.