Trabecular Bone Radiograph Characterization Using Fractal, Multifractal Analysis and SVM Classifier

Osteoporosis is a common disease characterized by low bone mass and deterioration of the micro-architecture of bone tissue, which leads to an increased risk of fracture. This work addresses the texture characterization of trabecular bone radiographs. The aim was to analyze, in accordance with clinical research, a group of 174 subjects: 87 osteoporotic patients (OP) with various bone fracture types and 87 control cases (CC). To characterize osteoporosis, fractal and multifractal (MF) methods were applied to the images for feature (attribute) extraction. To improve the results, a new MF spectrum method based on the q-structure function calculation was proposed, and a combination of fractal and MF attributes was used. A Support Vector Machine (SVM) was applied as a classifier to distinguish between OP patients and CC subjects. The fusion of fractal and MF features allowed good discrimination between the two groups, with an accuracy rate of 96.22%.
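
As a rough illustration of this kind of pipeline, the sketch below fuses a box-counting fractal dimension with crude q-moment scaling features and feeds them to an SVM. It is a minimal stand-in, not the authors' estimators: the images are assumed to be comparably pre-processed 2-D NumPy arrays, and the q-moment features only gesture at the q-structure-function spectrum described above.

```python
# Minimal sketch (not the authors' exact pipeline): fuse a box-counting
# fractal dimension with simple q-moment features and classify with an SVM.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def box_counting_dimension(img, threshold=0.5):
    """Estimate the fractal (box-counting) dimension of a binarized image."""
    binary = img > threshold * img.max()
    n = 2 ** int(np.floor(np.log2(min(binary.shape))))
    binary = binary[:n, :n]
    sizes, counts, size = [], [], n
    while size >= 2:
        blocks = binary.reshape(n // size, size, n // size, size)
        occupied = blocks.any(axis=(1, 3)).sum()
        sizes.append(size)
        counts.append(max(occupied, 1))
        size //= 2
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def q_moment_features(img, qs=(-2, -1, 2, 3), size=8):
    """Crude multifractal-style features: q-th order moments of coarse-grained
    intensity 'measures' (a stand-in for the q-structure-function spectrum)."""
    img = img.astype(float)
    img = img / (img.sum() + 1e-12)
    n = (min(img.shape) // size) * size
    blocks = img[:n, :n].reshape(n // size, size, n // size, size).sum(axis=(1, 3))
    p = blocks[blocks > 0].ravel()
    return [np.log((p ** q).sum()) for q in qs]

# X_imgs: list of 2-D radiograph ROIs, y: 0 = control (CC), 1 = osteoporotic (OP)
def extract_features(X_imgs):
    return np.array([[box_counting_dimension(im)] + q_moment_features(im)
                     for im in X_imgs])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
# print(cross_val_score(clf, extract_features(X_imgs), y, cv=5).mean())
```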

Optimization of Process Parameters Affecting Spring-Back in V-Bending Process for High Strength Low Alloy Steel HSLA 420 Using FEA (HyperForm) and Taguchi Technique

In this study, process parameters such as punch angle, die opening, grain direction, and pre-bend condition of the strip are investigated for deep drawing of high strength low alloy steel HSLA 420. The finite element method (FEM), in association with the Taguchi and analysis of variance (ANOVA) techniques, is used to investigate the degree of importance of the process parameters in the V-bending process for HSLA 420 and ST12 grade materials. The results show that punch angle has the major influence on spring-back, and die opening also plays a very significant role. On the other hand, grain direction has the least impact on spring-back; however, a strip taken from a flat sheet is less prone to spring-back than a strip taken from a sheet-metal coil. HyperForm software is used for the FEM simulations and the experiments are designed using the Taguchi method. The percentage contribution of each parameter is obtained through ANOVA.
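
A minimal sketch of the ANOVA percentage-contribution step is given below, assuming a Taguchi L9-style array; the factor levels and spring-back values are illustrative placeholders, not the paper's measurements.

```python
# Hedged sketch: percentage contribution of factors to spring-back from a
# Taguchi-style orthogonal array, via a simple ANOVA sum-of-squares split.
# The factor levels and spring-back angles below are illustrative only.
import numpy as np
import pandas as pd

# L9 (3^3) design: punch angle (deg), die opening (mm), grain direction (deg)
runs = pd.DataFrame({
    "punch_angle": [60, 60, 60, 75, 75, 75, 90, 90, 90],
    "die_opening": [6,  8, 10,  6,  8, 10,  6,  8, 10],
    "grain_dir":   [0, 45, 90, 45, 90,  0, 90,  0, 45],
    "springback":  [2.1, 2.4, 2.6, 1.7, 2.0, 2.3, 1.2, 1.5, 1.8],  # hypothetical (deg)
})

grand_mean = runs["springback"].mean()
sst = ((runs["springback"] - grand_mean) ** 2).sum()

contrib = {}
for factor in ["punch_angle", "die_opening", "grain_dir"]:
    level_means = runs.groupby(factor)["springback"].mean()
    level_counts = runs.groupby(factor)["springback"].count()
    ss_factor = (level_counts * (level_means - grand_mean) ** 2).sum()
    contrib[factor] = 100 * ss_factor / sst

print({k: round(v, 1) for k, v in contrib.items()})  # % contribution per factor
```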

12x12 MIMO Terminal Antennas Covering the Whole LTE and WiFi Spectrum

A broadband resonant terminal antenna has been developed. It can be used in different MIMO arrangements such as 2x2, 4x4, 8x8, or even 12x12 configurations. The antenna covers the whole LTE and WiFi bands as well as the existing 2G/3G bands (700-5800 MHz), without using any matching/tuning circuits. Matching circuits significantly reduce the efficiency of any antenna and shorten battery life; they also reduce the bandwidth because they are frequency dependent. The antenna can be implemented in smartphone handsets, tablets, laptops, notebooks or any other terminal, and it is also suitable for various IoT and vehicle applications. The antenna is manufactured from a flexible material and can be bent, folded or shaped in any form to fit the available space in any terminal. It is self-contained and does not need to use the ground plane, the chassis or any other component of the terminal. Hence, it can be mounted on any terminal at different positions and configurations. Its performance is not affected by the terminal, regardless of its type, shape or size, nor is it affected by the body of the terminal's user. Because of all these features, multiple such antennas can be used simultaneously for MIMO diversity coverage in any terminal device, with high isolation and a low correlation factor between them.

A Real-Time Simulation Environment for Avionics Software Development and Qualification

The development of guidance, navigation and control algorithms and avionic procedures requires the availability of suitable analysis and verification tools, such as simulation environments, which support the design process and allow potential problems to be detected prior to flight testing, in order to make new technologies available at reduced cost, time and risk. This paper presents a simulation environment for avionic software development and qualification, aimed especially at equipment for general aviation aircraft and unmanned aerial systems. The simulation environment includes models for short- and medium-range radio-navigation aids, flight assistance systems, and ground control stations. All the software modules are able to simulate the modeled systems in both fast-time and real-time tests, and were implemented following component-oriented modeling techniques and a requirement-based approach. The paper describes the specific model features, the architectures of the implemented software systems and their validation process. The validation tests highlighted the capability of the simulation environment to guarantee in real time the required functionalities and performance of the simulated avionics systems, as well as to reproduce the interaction between these systems, thus permitting a realistic and reliable simulation of a complete mission scenario.

Towards a Deconstructive Text: Beyond Language and the Politics of Absences in Samuel Beckett’s Waiting for Godot

The writing of Samuel Beckett is associated with meaning in meaninglessness and the production of what he calls a 'literature of the unword'. The casual escape from the world of words in the form of silences and pauses in his play Waiting for Godot prompts questions about their existence and ultimately leads to an investigation of the theory behind their use in the play. This paper proposes that these absences (silence and pause) in Beckett's play force us to think 'beyond' language, and it asks how silence and pause in Beckett's text speak for the emergence of the poststructuralist text. It aims to identify the significant features of the philosophy of deconstruction in Beckett's play in order to demystify the hostile complicity between literature and philosophy. Within the interpretive paradigm of poststructuralism, this research focuses on the text as research data and attempts to delineate the relationship between poststructuralist theoretical concerns and Beckett's text. Keeping in view the theoretical concerns of the poststructuralist theorist Jacques Derrida, the main concern of the discussion is directed towards the notion of going 'beyond' language into absences that aim to silence the existing discourse with the 'radical irony' of this anti-formal art, which contains its own denial and thus represents the idea of ceaseless questioning and radical contradiction in art and in any text. The article asks how Beckett's text vibrates with loud silence and disrupts language to demonstrate the emptiness of words, thereby exploring the limitless void of absences. Beckett's text resonates with a silence and pause that is neither negation nor affirmation but rather a poststructuralist suspension of reality that is ever changing with the undecidability of all meanings. Within the theoretical notion of Derrida's Différance, this study interprets silence and pause in Beckett's art: they behave like Derrida's Différance and question their own existence in the text, deconstructing any definiteness and finality of reality and extending the undecidable threshold of the poststructuralists that aims to evade the 'labyrinth of language'.

Anisotropic Total Fractional Order Variation Model in Seismic Data Denoising

In seismic data processing, attenuation of random noise is a basic step in improving data quality for further use in exploration and development in the gas and oil industry. The signal-to-noise ratio largely determines the quality of seismic data and affects the reliability and accuracy of the seismic signal during interpretation for different purposes. To use seismic data for further application and interpretation, the signal-to-noise ratio must be improved while random noise is attenuated effectively. To improve the signal-to-noise ratio and attenuate seismic random noise while preserving important features and information in the seismic signal, we introduce an anisotropic total fractional order denoising algorithm. The anisotropic total fractional order variation model, defined in the space of fractional order bounded variation, is proposed as a regularization term in seismic denoising. The split Bregman algorithm is employed to solve the minimization problem of the anisotropic total fractional order variation model, and the corresponding denoising algorithm for the proposed method is derived. We test the effectiveness of the proposed method on synthetic and real seismic data sets, and the denoised results are compared with the F-X deconvolution and non-local means denoising algorithms.
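
For readers who want a concrete starting point, the sketch below implements a simplified anisotropic fractional-order TV denoiser using Grünwald-Letnikov differences and a smoothed gradient iteration instead of the full split Bregman solver described above; the parameter values and periodic boundary handling are assumptions, not the paper's choices.

```python
# Illustrative sketch only: anisotropic fractional-order TV denoising with
# Grunwald-Letnikov differences and a smoothed gradient-descent iteration
# (a simplification of the split Bregman solver described in the abstract).
# Assumes a 2-D seismic section `f` as a NumPy array; periodic boundaries.
import numpy as np

def gl_coeffs(alpha, K):
    """First K Grunwald-Letnikov coefficients (-1)^k * C(alpha, k)."""
    c = np.zeros(K)
    c[0] = 1.0
    for k in range(1, K):
        c[k] = c[k - 1] * (k - 1 - alpha) / k
    return c

def frac_diff(u, c, axis):
    """Fractional difference D^alpha u = sum_k c_k * shift_k(u) (periodic)."""
    return sum(ck * np.roll(u, k, axis=axis) for k, ck in enumerate(c))

def frac_diff_adjoint(v, c, axis):
    """Adjoint of frac_diff under periodic boundaries."""
    return sum(ck * np.roll(v, -k, axis=axis) for k, ck in enumerate(c))

def frac_tv_denoise(f, alpha=1.6, lam=8.0, K=12, n_iter=300, tau=0.005, eps=1e-2):
    """Minimize lam/2 * ||u - f||^2 + |D_x^a u|_1 + |D_y^a u|_1 (smoothed l1)."""
    c = gl_coeffs(alpha, K)
    u = f.astype(float)
    for _ in range(n_iter):
        gx, gy = frac_diff(u, c, 0), frac_diff(u, c, 1)
        # gradient of the smoothed l1 terms: D^T (Du / sqrt((Du)^2 + eps))
        div = (frac_diff_adjoint(gx / np.sqrt(gx**2 + eps), c, 0)
               + frac_diff_adjoint(gy / np.sqrt(gy**2 + eps), c, 1))
        u -= tau * (lam * (u - f) + div)
    return u

# Example: noisy = clean + 0.1 * np.random.randn(*clean.shape)
# denoised = frac_tv_denoise(noisy)
```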

Generative Syntaxes: Macro-Heterophony and the Form of ‘Synchrony’

One of the most powerful language innovations in twentieth-century music was heterophony, a hypostasis of vertical syntax that entered the sphere of interest of many composers, such as George Enescu, Pierre Boulez, Mauricio Kagel, György Ligeti and others. The heterophonic syntax has a history of growth, that is, a succession of different concepts and writing techniques. The trajectory along which this phenomenon settled does not necessarily follow chronology: there are highly complex primary stages and advanced stages that return to simple forms of writing. In folklore, plurimelodic simultaneities are free or random and originate in the (unintentional) differences/'deviations' from the state of unison, through a variety of ornaments, melismas, imitations, elongations and abbreviations, all within a flexible, non-periodic/immeasurable rhythmic framework proper to parlando-rubato rhythmics. Within the general framework of multivocal organization, the heterophonic syntax in its elaborate (academic) version imposed itself relatively late compared with polyphony and homophony. The explanation is simple if we consider the causal relationship between the elements of the sound vocabulary, in this case modalism, and the typologies of vertical organization appropriate to it. Therefore, adding to the 'classic' pathway of writing typologies (monody - polyphony - homophony), heterophony, applied equally to structures of modal, serial or synthetic vocabulary, necessarily claims a macrotemporal form of its own, in the sense of the analogies enshrined by the evolution of musical styles and languages: polyphony→fugue, homophony→sonata. Concerned with the prospect of building a new musical ontology, the composer Ştefan Niculescu experimented, along with the mathematical organization of heterophony according to his own original methods, with the possibility of extrapolating this phenomenon to the macrostructural plane, thereby arriving at the unique form of 'synchrony'. Founded on the principle of coincidentia oppositorum (involving the 'one-multiple' pair), the sound architecture imagined by Ştefan Niculescu consists of a (temporal) model/algorithm articulating two sound states: 1. the monovocality state (principle of identity) and 2. the multivocality state (principle of difference). In this context, heterophony becomes an (auto)generative mechanism of macrotemporal amplitude, a strategy that the composer would develop practically throughout his creative output (see the works Ison I, Ison II, Unisonos I, Unisonos II, Duplum, Triplum, Psalmus, and Héterophonies pour Montreux (Homages to Enescu and Bartók), etc.). For the present demonstration, we selected one of the most edifying works of Ştefan Niculescu, Symphony II, Opus dacicum, in which the form of (heterophony-)synchrony acquires monumental-symphonic features, representing an emblematic case of the level of complexity achieved by this type of vertical syntax in twentieth-century music.

Definition and Core Components of the Role-Partner Allocation Problem in Collaborative Networks

In the current, constantly changing economic context, collaborative networks allow partners to undertake projects that would not be possible if attempted individually. These projects usually involve the performance of a group of tasks (named roles) that have to be distributed among the partners. Thus, an allocation/matching problem arises that will be referred to as the Role-Partner Allocation problem. In real life this situation is addressed by negotiation between partners in order to reach ad hoc agreements. Besides taking a long time and requiring hard work, such an approach is not recommended, as both historical evidence and economic analysis show. Instead, the allocation process should be automated by means of a centralized matching scheme. However, as a preliminary step in the search for such a matching mechanism (or even the development of a new one), the problem and its core components must be specified. To this end, this paper establishes (i) the definition of the problem and its constraints, (ii) the key features of the involved elements (i.e., roles and partners), and (iii) how to create preference lists both for roles and for partners. Only in this way will it be possible to conduct subsequent methodological research on the solution method.
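
As one example of what a centralized matching scheme over such preference lists could look like, the sketch below runs a role-proposing deferred-acceptance (Gale-Shapley style) procedure. The paper only defines the problem, so this mechanism, the one-role-per-partner capacity and the example preferences are assumptions for illustration.

```python
# Hedged illustration: one possible centralized matching scheme (role-proposing
# deferred acceptance) over role and partner preference lists; not the paper's
# mechanism, which is left open for future work.

def match_roles_to_partners(role_prefs, partner_prefs):
    """role_prefs / partner_prefs: dict mapping each role/partner to an ordered
    list of acceptable counterparts (most preferred first). One role per partner."""
    rank = {p: {r: i for i, r in enumerate(prefs)} for p, prefs in partner_prefs.items()}
    free_roles = list(role_prefs)
    next_choice = {r: 0 for r in role_prefs}
    engaged = {}  # partner -> role
    while free_roles:
        role = free_roles.pop()
        prefs = role_prefs[role]
        if next_choice[role] >= len(prefs):
            continue  # role exhausted its list, stays unassigned
        partner = prefs[next_choice[role]]
        next_choice[role] += 1
        if role not in rank.get(partner, {}):
            free_roles.append(role)              # partner finds this role unacceptable
        elif partner not in engaged:
            engaged[partner] = role
        elif rank[partner][role] < rank[partner][engaged[partner]]:
            free_roles.append(engaged[partner])  # displaced role becomes free again
            engaged[partner] = role
        else:
            free_roles.append(role)
    return {r: p for p, r in engaged.items()}

# Hypothetical example:
# role_prefs = {"logistics": ["P1", "P2"], "design": ["P2", "P1"]}
# partner_prefs = {"P1": ["design", "logistics"], "P2": ["logistics", "design"]}
# print(match_roles_to_partners(role_prefs, partner_prefs))
```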

Effect of Pole Weight on Nordic Walking

The purpose of this study was to investigate the effect of varying pole weights on energy expenditure and on upper and lower limb muscle activity, recorded as electromyograms (EMG), during Nordic walking (NW). Four healthy men [age = 22.5 (±1.0) years, body mass = 61.4 (±3.6) kg, height = 170.3 (±4.3) cm] and three healthy women [age = 22.7 (±2.9) years, body mass = 53.0 (±1.7) kg, height = 156.7 (±4.5) cm] participated in the experiments after giving informed consent. The seven subjects were tested on a treadmill under three conditions: normal walking (W), walking with Nordic poles (NW), and walking with Nordic poles weighted with 1 kg (NW+1). Walking speed was 6 km per hour in all trials. EMG activity was recorded with bipolar surface electrodes from eight muscles: biceps brachii, triceps brachii, trapezius, deltoideus, tibialis anterior, medial gastrocnemius, rectus femoris and biceps femoris. Heart rate (HR), oxygen uptake (VO2), and rating of perceived exertion (RPE) were also measured. The level of significance was set at α = 0.05, with p < 0.05 regarded as statistically significant. Our results confirmed that the use of NW poles increased HR for a given upper arm muscle activity but decreased lower limb EMG activity in comparison with W. Moreover, NW increased step length, with greater hip joint extension, compared with W. EMG also revealed higher activation of the upper limbs for almost all NW and NW+1 trials compared with W (p < 0.05). Therefore, both NW and NW+1 appear to offer benefits as safe, feasible, and readily accessible physical exercise for a wide range of ages, supporting quality of daily life. However, the 1 kg pole weight had no significant effect on leg muscle activity; it affected only upper arm muscle activity during Nordic pole walking.

A 3Y/3Y Pole-Changing Winding of High-Power Asynchronous Motors

The requirement for pole-changing motors emerged at the very early stages of asynchronous motor design. Different solutions have been elaborated and some of them are in general use. One alternative is the so-called 3Y/3Y pole-changing winding. This paper deals with the high-power application of this solution. A complete and comprehensive study is introduced, including its features and design guidelines. The method presented in this paper is especially suitable for pole numbers that are close to each other. The study also reveals that the method is more advantageous than the existing solutions for high-power motors with a 1:3 pole ratio. Using this motor, a new and complete drive supply system has been proposed as the most appropriate arrangement for a high-power main naval propulsion drive. Furthermore, the method makes it possible to extend the pole ratio to 1:6, 1:9, 1:12, etc. Finally, the proposal is extended to the so-far missing 1:4, 1:5, 1:7, etc. pole ratios. In this way, a complete proposal for the theoretically infinite range is given.

Investigation of Wave Atom Sub-Bands via Breast Cancer Classification

This paper investigates which sub-bands of the wave atom transform are successful for the classification of mammograms when the sub-band coefficients are used as features. A computer-aided diagnosis system is constructed using the wave atom transform together with support vector machine and k-nearest neighbor classifiers. Two-class classification is studied in detail on two data sets separately. The successful sub-bands are determined according to the accuracy rates, the number of coefficients, and the sensitivity rates.
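
A sketch of the sub-band evaluation loop is given below. Since a wave atom implementation is not assumed to be available in common Python libraries, a standard wavelet decomposition (PyWavelets) stands in for the transform; the per-band scoring only loosely mirrors the accuracy/coefficient-count criteria described above.

```python
# Sketch of the sub-band evaluation loop. The paper uses the wave atom
# transform; here a wavelet decomposition stands in for it.
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def subband_features(image, level=3, wavelet="db4"):
    """Return a list of (name, flattened coefficients) per sub-band."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    bands = [("A%d" % level, coeffs[0].ravel())]
    for i, (ch, cv, cd) in enumerate(coeffs[1:], start=1):
        lvl = level - i + 1
        bands += [("H%d" % lvl, ch.ravel()), ("V%d" % lvl, cv.ravel()),
                  ("D%d" % lvl, cd.ravel())]
    return bands

def rank_subbands(images, labels, cv=5):
    """Score each sub-band separately with SVM and k-NN classifiers."""
    per_band = {}
    for idx, (name, _) in enumerate(subband_features(images[0])):
        X = np.array([subband_features(img)[idx][1] for img in images])
        for clf_name, clf in [("SVM", SVC(kernel="rbf", gamma="scale")),
                              ("kNN", KNeighborsClassifier(n_neighbors=3))]:
            acc = cross_val_score(clf, X, labels, cv=cv).mean()
            per_band[(name, clf_name)] = (acc, X.shape[1])  # accuracy, #coefficients
    return per_band

# images: list of equally sized 2-D mammogram ROIs; labels: 0 = normal, 1 = abnormal
# for (band, clf), (acc, ncoef) in sorted(rank_subbands(images, labels).items()):
#     print(band, clf, round(acc, 3), ncoef)
```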

Towards the Integration of a Micro Pump in μTAS

The objective of this study is to present a micro mechanical pump that was fabricated using the SwIFT™ surface micromachining microfabrication process and to demonstrate the feasibility of integrating such a micro pump into a micro total analysis system (μTAS). The micro pump circulates the bio-sample and magnetic nanoparticles through different compartments to separate and purify the targeted bio-sample. This article reports the flow characteristics in the microchannels and in a crescent micro pump.

Energy Management Techniques in Mobile Robots

Today, the expanding capabilities of technological tools with limited energy resources have made it necessary to use energy efficiently, and energy management techniques have emerged for this purpose. As in every field, energy management is vital for robots, which are used in many areas from industry to daily life and are expected to occupy an even larger place in the future. In particular, effective power management in autonomous and multi-robot systems, which are becoming more complex and more numerous every day, will improve their performance and success. In this study, robot management algorithms, the use of renewable and hybrid energy sources, robot motion patterns, robot designs, workload sharing strategies in multi-robot systems, and path and mission planning algorithms are discussed with respect to the efficient use of energy resources by mobile robots. These techniques are evaluated in terms of the efficient use of existing energy resources and energy management in robots.

Analysis on the Feasibility of Landsat 8 Imagery for Water Quality Parameters Assessment in an Oligotrophic Mediterranean Lake

Lake water quality monitoring in combination with the use of earth observation products constitutes a major component of many water quality monitoring programs. Landsat 8 images of Trichonis Lake (Greece) acquired on 30/10/2013 and 30/08/2014 were used to explore the ability of Landsat 8 to estimate water quality parameters, particularly CDOM absorption at specific wavelengths, chlorophyll-a and nutrient concentrations, in this oligotrophic freshwater body, which is characterized by practically non-existent quantitative, temporal and spatial variability. Water samples were collected at 22 different stations in late August 2014, and the satellite image of the same date was used to statistically correlate the in-situ measurements with various combinations of Landsat 8 bands, in order to develop algorithms that best describe those relationships and accurately calculate the aforementioned water quality components. The optimal models were applied to the image of late October 2013, and the results were validated by comparison with the respective available in-situ data of 2013. Initial results indicated the limited ability of the Landsat 8 sensor to accurately estimate water quality components in an oligotrophic waterbody. The validation process showed that ammonium concentration was the most accurately estimated component (R = 0.7), followed by chl-a concentration (R = 0.5) and CDOM absorption at 420 nm (R = 0.3). In-situ nitrate, nitrite, phosphate and total nitrogen concentrations of 2014 were below the detection limit of the instrument used, so no statistical elaboration was conducted for them. On the other hand, multiple linear regression between reflectance measures and total phosphorus concentrations resulted in low and statistically insignificant correlations. Our results are consistent with other studies in the international literature, indicating that estimations for eutrophic and mesotrophic lakes are more accurate than for oligotrophic ones, owing to the lack of suspended particles detectable by satellite sensors. Nevertheless, although the predictive models developed and applied to the oligotrophic Trichonis Lake are less accurate, they may still be useful indicators of its water quality deterioration.
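
The model-building step can be sketched as follows, assuming band reflectances and in-situ values have already been extracted per station into a table; the column names, band ratios and leave-one-out validation are illustrative choices, not necessarily those used in the study.

```python
# Hedged sketch: regressing an in-situ water quality parameter against
# Landsat 8 band reflectances and simple band ratios, as in the abstract's
# model-building step. Column names below are illustrative placeholders.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# stations table: one row per sampling station, OLI band reflectances + in-situ value
# df = pd.DataFrame({"B2": ..., "B3": ..., "B4": ..., "B5": ..., "chl_a": ...})

def evaluate_band_combination(df, predictors, target="chl_a"):
    """Leave-one-out correlation between predicted and measured values."""
    X, y = df[predictors].values, df[target].values
    y_hat = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
    return np.corrcoef(y, y_hat)[0, 1]

# Candidate predictor sets: single bands, band ratios, multi-band combinations
# df["B3_B4"] = df["B3"] / df["B4"]
# for combo in [["B2"], ["B3"], ["B3_B4"], ["B2", "B3", "B4"]]:
#     print(combo, round(evaluate_band_combination(df, combo), 2))
```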

Integrated ACOR/IACOMV-R-SVM Algorithm

One research direction for Ant Colony Optimization (ACO) is the optimization of continuous and mixed (discrete and continuous) variables, so that problems with various types of data can be solved. The Support Vector Machine (SVM), which originates from statistical learning theory, is a widely used present-day classification technique. The main problems in applying SVM are selecting the feature subset and tuning the parameters. Discretizing the continuous values of the parameters is the most common approach to tuning SVM parameters; however, this process results in a loss of information, which affects classification accuracy. This paper presents two algorithms for tuning SVM parameters and selecting the feature subset. The first algorithm, ACOR-SVM, tunes the SVM parameters, while the second, IACOMV-R-SVM, simultaneously tunes the SVM parameters and selects the feature subset. Three benchmark UCI datasets were used in the experiments to validate the performance of the proposed algorithms. The results show that the proposed algorithms perform well compared with other approaches.
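
The continuous-ACO idea behind ACOR can be sketched as below for tuning the SVM hyper-parameters C and gamma; this is a simplified re-implementation of Socha and Dorigo's ACOR in log-parameter space, not the paper's ACOR-SVM or IACOMV-R-SVM code, and the archive size, q and xi values are assumptions.

```python
# Hedged sketch of continuous ACO (ACOR) used to tune SVM hyper-parameters
# C and gamma; search is done in log10 space, fitness is cross-validated accuracy.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def svm_accuracy(log_params, X, y):
    C, gamma = 10.0 ** log_params
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()

def acor_tune_svm(X, y, bounds=((-2, 3), (-4, 1)), k=10, n_ants=8,
                  q=0.2, xi=0.85, n_iter=30, rng=np.random.default_rng(0)):
    lo, hi = np.array(bounds).T
    archive = rng.uniform(lo, hi, size=(k, 2))            # solution archive
    fitness = np.array([svm_accuracy(s, X, y) for s in archive])
    for _ in range(n_iter):
        order = np.argsort(-fitness)                       # best first
        archive, fitness = archive[order], fitness[order]
        ranks = np.arange(1, k + 1)
        w = np.exp(-(ranks - 1) ** 2 / (2 * (q * k) ** 2))
        w /= w.sum()                                       # rank-based selection weights
        new_sols = []
        for _ in range(n_ants):
            l = rng.choice(k, p=w)                         # pick a guiding archive solution
            sigma = xi * np.abs(archive - archive[l]).sum(axis=0) / (k - 1)
            new_sols.append(np.clip(rng.normal(archive[l], sigma), lo, hi))
        new_fit = np.array([svm_accuracy(s, X, y) for s in new_sols])
        all_s = np.vstack([archive, new_sols])
        all_f = np.concatenate([fitness, new_fit])
        best = np.argsort(-all_f)[:k]                      # keep the best k solutions
        archive, fitness = all_s[best], all_f[best]
    return 10.0 ** archive[0], fitness[0]                  # (C, gamma), CV accuracy

# (C_best, gamma_best), acc = acor_tune_svm(X, y)   # X, y: a UCI benchmark dataset
```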

Islamic Education System: Implementation of Curriculum Kuttab Al-Fatih Semarang

The picture and pattern of Islamic education in the Prophet's period in Mecca and Medina form a part of history that we need to bring back. The basic educational institution of that time was called the Kuttab. Kuttab (or Maktab) comes from the word kataba, which means to write. The Kuttab, popular in the Prophet's period, aimed to resolve illiteracy in the Arab community. In Indonesia, this institution has 25 branches; one of them is located in Semarang (Kuttab Al-Fatih). Kuttab Al-Fatih, as a non-formal institution of Islamic education, is reserved for children aged 5-12 years. Its independently designed curriculum is a distinctive feature that distinguishes the Kuttab Al-Fatih curriculum from the formal institutional curriculum in Indonesia. The curriculum covers faith and the Qur'an. Kuttab Al-Fatih has been licensed as a Community Activity Learning Center under the direct supervision and guidance of the National Education Department. Here, we focus on describing the implementation of the Kuttab Al-Fatih Semarang curriculum (i.e. faith and the Qur'an). We then determine the relevance of the Kuttab Al-Fatih education system to the formal education system in Indonesia. This research uses qualitative methods, combining a literature review with field research. We obtained the data from the head of Kuttab Al-Fatih Semarang, the vice-head for curriculum, the faith coordinator, the al-Qur'an coordinator, as well as the guardians of the learners and the learners themselves. The result of this research clarifies the relevance of the education system of Kuttab Al-Fatih Semarang to the education system in Indonesia. Kuttab Al-Fatih Semarang emphasizes character building through a curriculum designed for this purpose and combines thematic learning models in modules.

An 8-Bit, 100-MSPS Fully Dynamic SAR ADC for Ultra-High Speed Image Sensor

In this paper, a dynamic and power-efficient 8-bit, 100-MSPS Successive Approximation Register (SAR) Analog-to-Digital Converter (ADC) is presented. The circuit uses a non-differential capacitive Digital-to-Analog Converter (DAC) architecture segmented by 2. The prototype is produced in a commercial 65-nm 1P7M CMOS technology with a 1.2-V supply voltage. The size of the core ADC is 208.6 x 103.6 µm². The post-layout noise simulation results show an SNR of 46.9 dB at the Nyquist frequency, which corresponds to an effective number of bits (ENOB) of 7.5. The total power consumption of this SAR ADC is only 1.55 mW at 100 MSPS. It thus achieves a figure of merit of 85.6 fJ/conversion-step.
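
The reported ENOB and figure of merit follow from the standard relations ENOB = (SNR - 1.76)/6.02 and FoM = P/(2^ENOB * f_s); a quick numerical check using the figures quoted above:

```python
# Quick check of the reported figures, using the standard relations
# ENOB = (SNR - 1.76) / 6.02 and the Walden figure of merit FoM = P / (2**ENOB * fs).
snr_db, power_w, fs = 46.9, 1.55e-3, 100e6

enob = (snr_db - 1.76) / 6.02
fom = power_w / (2 ** enob * fs)

print(f"ENOB ~= {enob:.2f} bits")           # ~7.5 bits
print(f"FoM  ~= {fom * 1e15:.1f} fJ/step")  # ~85-86 fJ/conversion-step
```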

Investigation of Combined use of MFCC and LPC Features in Speech Recognition Systems

The paper presents the statement of the automatic speech recognition problem, the purpose of speech recognition, and its application fields. For Azerbaijani speech, the principles of building a speech recognition system and the problems arising in such a system are investigated. The algorithms for computing speech features, which form the main part of a speech recognition system, are analyzed. From this point of view, algorithms for determining the Mel Frequency Cepstral Coefficients (MFCC) and Linear Predictive Coding (LPC) coefficients, which express the basic speech features, are developed. The combined use of MFCC and LPC cepstral features is suggested to improve the reliability of the speech recognition system. To this end, the recognition system is divided into MFCC-based and LPC-based recognition subsystems. The training and recognition processes are realized in both subsystems separately, and the recognition system accepts a decision only when the results of the two subsystems coincide. This decreases the error rate during recognition. The training and recognition processes are realized by artificial neural networks in the automatic speech recognition system, and the neural networks are trained by the conjugate gradient method. The paper also investigates the problems, related to the number of speech features, that are observed when training the neural networks of the MFCC- and LPC-based subsystems. The variability of the results of neural networks trained from different initial points is analyzed. A methodology for the combined use of neural networks trained from different initial points is suggested to improve the reliability of the recognition system and increase the recognition quality, and the practical results obtained are presented.
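
The two feature streams and the agreement rule can be sketched as follows; the MLP classifiers are stand-ins for the paper's conjugate-gradient-trained networks, and the frame length, hop size and LPC order are assumed values.

```python
# Hedged sketch of the two feature streams and the agreement rule described in
# the abstract: MFCC and LPC features are computed separately, two classifiers
# are trained on them, and a decision is accepted only when both agree.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def mfcc_features(y, sr, n_mfcc=13):
    """Utterance-level MFCC feature vector (mean over frames)."""
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).mean(axis=1)

def lpc_features(y, sr, order=12, frame_len=400, hop=160):
    """Utterance-level LPC feature vector (mean of frame-wise LPC coefficients)."""
    frames = librosa.util.frame(y, frame_length=frame_len, hop_length=hop)
    coeffs = np.array([librosa.lpc(f.astype(float), order=order)[1:]
                       for f in frames.T])
    return coeffs.mean(axis=0)

def combined_decision(clf_mfcc, clf_lpc, y, sr):
    """Accept a word label only if both subsystems output the same result."""
    m = clf_mfcc.predict([mfcc_features(y, sr)])[0]
    l = clf_lpc.predict([lpc_features(y, sr)])[0]
    return m if m == l else None   # None -> rejected / ask the speaker to repeat

# clf_mfcc = MLPClassifier(hidden_layer_sizes=(64,)).fit(X_mfcc_train, labels)
# clf_lpc  = MLPClassifier(hidden_layer_sizes=(64,)).fit(X_lpc_train, labels)
```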

Production, Characterisation and Assessment of Biomixture Fuels for Compression Ignition Engine Application

Hardly any neat biodiesel satisfies the European EN 14214 standard for compression ignition engine application. To satisfy the EN 14214 standard, various additives are doped into biodiesel; however, biodiesel additives might cause other problems, such as increased particulate emissions and increased specific fuel consumption. In addition, the additives can be expensive. Considering the increasing level of greenhouse gas (GHG) emissions and fossil fuel depletion, it is forecast that the use of biodiesel will be higher in the near future. Hence, the negative aspects of biodiesel additives are likely to gain much more importance and will need to be replaced with better solutions. This study aims to satisfy the European standard EN 14214 by blending biodiesels derived from sustainable feedstocks. Waste Cooking Oil (WCO) and Animal Fat Oil (AFO) are two sustainable feedstocks in the EU (including the UK) for producing biodiesels. In the first stage of the study, these oils were transesterified separately and neat biodiesels (W100 and A100) were produced. Secondly, the biodiesels were blended together in various ratios: 80% WCO biodiesel and 20% AFO biodiesel (W80A20), 60% WCO biodiesel and 40% AFO biodiesel (W60A40), 50% WCO biodiesel and 50% AFO biodiesel (W50A50), 30% WCO biodiesel and 70% AFO biodiesel (W30A70), and 10% WCO biodiesel and 90% AFO biodiesel (W10A90). The prepared samples were analysed using a Thermo Scientific Trace 1300 Gas Chromatograph and ISQ LT Mass Spectrometer (GC-MS). The GC-MS analysis gave the Fatty Acid Methyl Ester (FAME) breakdowns of the fuel samples. It was found that the total saturation degree of the samples increased linearly (from 15% for W100 to 54% for A100) as the percentage of AFO biodiesel was increased. Furthermore, it was found that WCO biodiesel was mainly (82%) composed of polyunsaturated FAMEs. Cetane numbers, iodine numbers, calorific values, lower heating values and densities (at 15 °C) of the samples were estimated using the mass percentage data of the FAMEs. In addition, kinematic viscosities (at 40 °C and 20 °C), densities (at 15 °C), heating values and flash point temperatures of the biomixture samples were measured in the laboratory. The estimated and measured characterisation results were found to be comparable. The current study concluded that the biomixture fuel samples W60A40 and W50A50 fully satisfied the European EN 14214 norms without any need for additives. Investigations of engine performance, exhaust emissions and combustion characteristics will be conducted to assess the full feasibility of the proposed biomixture fuels.
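
The property-estimation step can be illustrated with a simple mass-fraction-weighted average over the FAME profile; the pure-FAME values and the example profile below are indicative placeholders, not the paper's GC-MS data or its exact correlations.

```python
# Hedged sketch of the estimation step: blend-level properties approximated as
# mass-fraction-weighted averages of pure-FAME properties. The FAME profile and
# pure-component values are illustrative, not the paper's measured data.

# Pure-FAME properties: (cetane number, iodine value in g I2 / 100 g), indicative values
fame_props = {
    "methyl_palmitate": (74.0, 0.0),
    "methyl_stearate":  (81.0, 0.0),
    "methyl_oleate":    (59.0, 86.0),
    "methyl_linoleate": (38.0, 172.0),
}

def blend_properties(mass_fractions):
    """mass_fractions: dict FAME -> mass fraction (should sum to ~1)."""
    cn = sum(frac * fame_props[f][0] for f, frac in mass_fractions.items())
    iv = sum(frac * fame_props[f][1] for f, frac in mass_fractions.items())
    return cn, iv

# Hypothetical blend profile, loosely in the spirit of a W50A50-type mixture
profile = {"methyl_palmitate": 0.20, "methyl_stearate": 0.12,
           "methyl_oleate": 0.38, "methyl_linoleate": 0.30}
cn, iv = blend_properties(profile)
print(f"estimated cetane number ~ {cn:.1f}, iodine value ~ {iv:.0f}")
# EN 14214 requires a cetane number >= 51 and an iodine value <= 120 g I2 / 100 g
```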

A Comparative Study on Fuzzy and Neuro-Fuzzy Enabled Cluster Based Routing Protocols for Wireless Sensor Networks

Dynamic routing in Wireless Sensor Networks (WSNs) has been a significant research topic in recent years. Energy consumption and timely data delivery are the major criteria for these networks, and the locations of the sensor nodes need not be prearranged. Clustering in WSNs is a key methodology used to extend the lifetime of a sensor network, and WSNs support numerous real-time applications. A central design goal of WSNs is to minimize energy consumption, and soft computing techniques can be incorporated to achieve improved performance. This paper surveys modern trends in routing that employ fuzzy logic and neuro-fuzzy logic based clustering techniques and presents a comparative study of the numerous related methodologies.
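
As a concrete example of the kind of fuzzy-logic clustering the surveyed protocols use, the sketch below scores a node's chance of becoming cluster head from its residual energy and distance to the base station with a small hand-rolled rule base; the membership functions and rules are illustrative, not taken from any specific protocol.

```python
# Hedged illustration of fuzzy-logic cluster-head election: residual energy and
# distance to the base station are fuzzified, a small rule base yields a
# "chance" value, and the node with the highest chance in a neighborhood
# becomes cluster head. Membership functions and rules are illustrative only.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def cluster_head_chance(residual_energy, distance, e_max=1.0, d_max=100.0):
    e, d = residual_energy / e_max, distance / d_max
    energy = {"low": tri(e, -0.1, 0.0, 0.5), "med": tri(e, 0.0, 0.5, 1.0),
              "high": tri(e, 0.5, 1.0, 1.1)}
    dist = {"near": tri(d, -0.1, 0.0, 0.5), "mid": tri(d, 0.0, 0.5, 1.0),
            "far": tri(d, 0.5, 1.0, 1.1)}
    # rule base: output chance in [0, 1] per (energy, distance) combination
    rules = {("high", "near"): 0.9, ("high", "mid"): 0.8, ("high", "far"): 0.6,
             ("med", "near"): 0.6, ("med", "mid"): 0.5, ("med", "far"): 0.3,
             ("low", "near"): 0.3, ("low", "mid"): 0.2, ("low", "far"): 0.1}
    # weighted-average defuzzification over the firing strengths of the rules
    num = sum(min(energy[e_l], dist[d_l]) * out for (e_l, d_l), out in rules.items())
    den = sum(min(energy[e_l], dist[d_l]) for (e_l, d_l) in rules)
    return num / den if den else 0.0

# print(cluster_head_chance(residual_energy=0.8, distance=20.0))  # relatively high chance
```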