Analysis of Electrocardiograph (ECG) Signal for the Detection of Abnormalities Using MATLAB

The proposed method studies and analyzes the electrocardiogram (ECG) waveform to detect abnormalities with reference to the P, Q, R and S peaks. The first phase comprises the acquisition of real-time ECG data. The next phase covers signal generation followed by pre-processing. Thirdly, the acquired ECG signal is subjected to feature extraction. The extracted features are used to detect abnormal peaks present in the waveform; thus, normal and abnormal ECG signals can be differentiated on the basis of the extracted features. The work is implemented in the widely used multipurpose tool MATLAB. This software efficiently applies algorithms and techniques to detect any abnormalities present in the ECG signal. Proper use of MATLAB functions (both built-in and user-defined) allows ECG signals to be processed and analyzed in real-time applications. The simulation helps improve the accuracy, and the hardware can then be built conveniently.
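As an illustrative sketch only (the work itself is implemented in MATLAB, and the exact algorithm is not specified in this abstract), R-peak detection on a pre-processed ECG trace could be outlined as follows; the sampling rate and threshold values are assumptions.

```python
# Illustrative sketch: simple R-peak detection on a pre-processed ECG trace.
# The sampling rate (fs) and thresholds are assumed values, not from the paper.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_r_peaks(ecg, fs=360.0):
    """Band-pass filter the ECG and locate R peaks by amplitude and spacing."""
    # A 5-15 Hz band-pass roughly isolates the energy of the QRS complex.
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    # Require peaks to stand out and to be at least 0.25 s apart (~240 bpm ceiling).
    peaks, _ = find_peaks(filtered,
                          height=0.5 * np.max(filtered),
                          distance=int(0.25 * fs))
    return peaks

# Usage: the returned indices give R-peak locations, from which RR intervals
# and heart rate can be derived for abnormality checks.
# r_idx = detect_r_peaks(ecg_samples, fs=360.0)
```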

Extended Release System of Hypoglycemic Agent Containing Solid Dispersions: Strategies and Mechanisms

The present study aims at overcoming solubility problems by using the solid dispersion technique. Repaglinide is a BCS Class II drug with low aqueous solubility and, therefore, low bioavailability. Solid dispersions of repaglinide with different carriers, polyvinyl pyrrolidone (PVP) and ethyl cellulose (EC), in different ratios were prepared by the suspending and dissolving methods. In vitro release studies revealed that the F7 formulation showed extended drug release. The dissolution profile of the solid dispersion containing EC and PVP K30 (1:3) was therefore selected as the best formulation because of its extended drug release among all formulations. In conclusion, solid dispersions of repaglinide in PVP have been shown to be a promising approach to improving the bioavailability of repaglinide.

Design and Fabrication of Stent with Negative Poisson’s Ratio

Negative Poisson’s ratios can be described in terms of models based on the geometry of the system and the way this geometry changes under applied loads. Since the Poisson’s ratio does not depend on scale, the deformation can take place at any level from nano to macro; the only requirement is the right combination of geometry. Our thrust in this paper is to combine our knowledge of the tailored, enhanced mechanical properties of materials having a negative Poisson’s ratio with micromachining and electrospinning technology to develop a novel stent carrying a drug delivery system. The objectives of this paper therefore include: (i) fabrication of a micromachined metal sheet tailored with a negative-Poisson’s-ratio structure based on the rotating solid squares geometry, using femtosecond laser ablation; (ii) rolling the fabricated structure and welding it to make a tubular structure; (iii) wrapping it with nanofibers of the biocompatible polymer PCL (polycaprolactone) for drug delivery; and (iv) analysis of the functional and mechanical performance of the fabricated structure, both analytically and experimentally. As far as applications are concerned, tubular structures have potential in biomedicine: for example, hollow tubes called stents are placed inside a damaged artery or diseased region to provide mechanical support, or inside a blocked esophagus to open it, thus restoring feeding capacity and improving quality of life.
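For context, the rotating solid squares geometry mentioned above is, in its idealized rigid form, known to give an in-plane Poisson's ratio of minus one; the relation below is that textbook idealization, not a value measured on the fabricated stent.

```latex
% Idealized rigid rotating-squares model: both in-plane unit-cell dimensions
% change identically as the squares rotate, so the in-plane strains are equal and
\nu_{12} = -\frac{\varepsilon_2}{\varepsilon_1} = -1
% for any hinge angle, i.e. the structure is auxetic throughout its deformation.
```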

Mobile Cloud Middleware: A New Service for Mobile Users

Cloud computing (CC) and mobile cloud computing (MCC) have advanced rapidly over the last few years. Today, MCC undergoes fast improvement and progress in terms of hardware (memory, embedded sensors, power consumption, touch screens, etc.), software (increasingly sophisticated mobile applications) and transmission (higher data transmission rates achieved with different technologies such as 3G). This paper presents a review of the concepts of CC and MCC. It then discusses what has been done regarding middleware in cloud and mobile cloud computing. Finally, it presents the architecture of our proposed middleware along with the functionalities it provides to mobile clients in order to overcome well-known limitations such as low battery power, slow CPU speed and limited memory.

Broadcasting Mechanism with Less Flooding Packets by Optimally Constructing Forwarding and Non-Forwarding Nodes in Mobile Ad Hoc Networks

Conventional routing protocols designed for MANETs fail to handle the dynamic movement and self-starting behavior of nodes effectively. Every node in a MANET acts as both a forwarding and a receiving node, and all of them participate in routing packets from the source to the destination. Since the interconnection topology is highly dynamic, the performance of most routing protocols is not encouraging. In this paper, a reliable broadcast approach for MANETs is proposed for improving the transmission rate. The MANET is considered with asymmetric characteristics, and the properties of the source and destination nodes are different. The non-forwarding node list is generated with downstream nodes, and these nodes do not participate in the routing. When the forwarding and non-forwarding node lists are constructed in the conventional way, the number of nodes in the non-forwarding list is large and increases the load. In this work, we construct the forwarding and non-forwarding node lists optimally so that flooding and broadcasting are reduced to a certain extent. Forwarded packets are treated as acknowledgements, and the non-forwarding nodes explicitly send acknowledgements to the source. The performance of the proposed approach is evaluated in the NS2 environment. Since the proposed approach reduces flooding, we evaluate its functionality against AODV variants. The effect of network density on the overhead and the collision rate is considered for performance evaluation. The performance is compared with the AODV variants, and it is found that the proposed approach outperforms all of them.
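As an illustrative sketch only (the paper's own construction of the forwarding and non-forwarding lists is not detailed in this abstract), a generic greedy neighbor-coverage heuristic for choosing forwarding nodes could look as follows; the node names in the usage example are hypothetical.

```python
# Illustrative sketch: a generic greedy neighbor-coverage heuristic for choosing
# forwarding nodes in a broadcast; NOT the paper's exact optimization scheme.
def select_forwarders(one_hop, two_hop):
    """one_hop: set of direct neighbors; two_hop: dict neighbor -> set of its neighbors.
    Greedily pick neighbors until every two-hop node is covered; the remaining
    neighbors become non-forwarding nodes that only return acknowledgements."""
    uncovered = set().union(*two_hop.values()) - one_hop
    forwarders = set()
    while uncovered:
        # Pick the neighbor that covers the most still-uncovered two-hop nodes.
        best = max(one_hop - forwarders,
                   key=lambda n: len(two_hop[n] & uncovered),
                   default=None)
        if best is None or not (two_hop[best] & uncovered):
            break  # the remaining nodes cannot be reached through our neighbors
        forwarders.add(best)
        uncovered -= two_hop[best]
    return forwarders, one_hop - forwarders

# Usage (hypothetical topology):
# fwd, non_fwd = select_forwarders({"B", "C"}, {"B": {"D", "E"}, "C": {"E"}})
# -> fwd == {"B"}, non_fwd == {"C"}, since B alone covers both D and E.
```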

Design and Analysis of a Low Power High Speed 1 Bit Full Adder Cell Based On TSPC Logic with Multi-Threshold CMOS

An adder is one of the most integral components of a digital system such as a digital signal processor or a microprocessor. Being an extremely computationally intensive part of a system, optimization of the adder for speed and power consumption is of prime importance. In this paper we design a 1-bit full adder cell based on dynamic TSPC logic to achieve high-speed operation. A high-threshold-voltage sleep transistor is used to reduce the static power dissipation in standby mode. The circuit is designed and simulated in TSPICE using the TSMC 180 nm CMOS process. The average power consumption, delay and power-delay product are measured and show considerable improvement in performance over existing full adder designs.
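For reference, the Boolean function realized by any 1-bit full adder cell, independent of the TSPC/MTCMOS implementation details, and the figure of merit reported above are:

```latex
\mathrm{Sum} = A \oplus B \oplus C_{in}, \qquad
C_{out} = A\,B + C_{in}\,(A \oplus B), \qquad
\mathrm{PDP} = P_{avg} \cdot t_{delay}
```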

Transparency of Audit Firms in Croatia

The aim of this paper is to raise general awareness of the importance of transparency for audit firms and for users of audit services. The paper analyzes the transparency of audit firms that audited the financial statements of listed companies for the years 2011 and 2012. We use these two years because the Code of Ethics for Professional Accountants was adopted in the meantime. The paper investigates whether the transparency reports of audit firms are in accordance with the Croatian Audit Act and whether there is a difference in transparency between the observed years. A quality index of the transparency report and financial indicators of the audit firms are used to draw conclusions about the state of audit firms' transparency reporting. The results of our study indicate that audit firms were not fully transparent in either year. However, the transparency of audit firms improved significantly in 2012 compared with 2011.

Approach of Measuring System Analyses for Automotive Part Manufacturing

This work aims to introduce an efficient approach to, and to standardize, measurement system analysis for the automotive industry. The study started with a literature review on the management and analysis of measurement systems. An approach to measurement system management was then constructed. This approach was validated by collecting current measurement system data using the equipment of interest, including a vernier caliper and a micrometer. The accuracy and precision of their measurements were analyzed. Finally, the measurement system was improved and evaluated. The study showed that the vernier caliper did not meet its required measuring characteristics in terms of linearity, whereas all of the equipment lacked the required measurement precision. Consequently, the causes of measurement variation in the equipment of interest were identified. After the improvement, it was found that the measuring performance could be accepted according to the required standard. Finally, a standardized approach for analyzing automotive measurement systems was established.
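As a generic sketch of how bias and linearity are commonly evaluated in such a measurement system analysis (the reference values and readings below are hypothetical, not the study's data):

```python
# Illustrative sketch: typical bias and linearity evaluation in an MSA study.
# Reference values and repeated readings below are hypothetical.
import numpy as np

def bias_and_linearity(reference, readings):
    """reference: 1-D array of reference (master) part values.
    readings: 2-D array, one row of repeated measurements per reference part."""
    bias = readings.mean(axis=1) - reference            # average bias per part
    slope, intercept = np.polyfit(reference, bias, 1)   # linearity = trend of bias vs. size
    return bias, slope, intercept

# Usage with made-up caliper data (mm):
# ref  = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
# meas = np.array([[2.01, 2.02], [4.03, 4.02], [6.04, 6.05],
#                  [8.06, 8.07], [10.08, 10.09]])
# bias, slope, intercept = bias_and_linearity(ref, meas)
# A slope far from zero signals a linearity problem across the measuring range.
```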

Investigation of Passive Solutions of Thermal Comfort in Housing Aiming to Reduce Energy Consumption

The concern with sustainability has brought the need to optimize buildings in order to reduce the consumption of natural resources. Almost one third of the energy demanded by Brazilian housing is used to provide thermal comfort. The AEC sector may contribute by applying bioclimatic strategies to building design. The aim of this research is to investigate the viability of applying some alternative solutions in residential buildings. The research was developed with computational simulation of a single-family social housing unit, examining envelope type, absorptance and insolation. The analysis of the thermal performance applied both the Brazilian standard NBR 15575 and the degree-hour method, in the setting of Porto Alegre, a southern Brazilian city. We used BIM modeling through Autodesk Revit and EnergyPlus for the thermal simulation. The payback of the investment was calculated by comparing energy savings and building costs over a period of 50 years. The results show that with the increase of the envelope's insulation there is an improvement in thermal comfort and an energy saving, with a payback period of 24 to 36 years in some cases.
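For reference, the degree-hour criterion mentioned above is conventionally accumulated over the hours of the year as shown below, where T_op,i is the indoor operative temperature in hour i and T_base is the comfort limit adopted by the study (the exact limits are study-specific):

```latex
DH = \sum_{i=1}^{8760} \max\left(0,\; T_{op,i} - T_{base}\right)
```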

Monitoring of Water Pollution and Its Consequences: An Overview

Water, a vital component for all living forms, is derived from a variety of sources, including surface water (rivers, lakes, reservoirs and ponds) and groundwater (aquifers). Over time, water bodies have been regularly subjected to human interference, resulting in deterioration of water quality. Pollution of water bodies has therefore become a matter of global concern. As water quality closely relates to human health, analysis of water before usage is of immense importance. Improper management of water bodies can cause serious problems in the availability and quality of water. The quality of water may be described according to its physico-chemical and microbiological characteristics. For effective maintenance of water quality through appropriate control measures, continuous monitoring of metals and of physico-chemical and biological parameters is essential for the establishment of baseline data on water quality in any study area. The present study focuses on exploring the status of water pollution in various areas and on estimating the magnitude of its toxicity using different bioassays.

Comparative Study of EVA and Waste Polymer Modified Bitumen

Polymer-modified bitumen is used to combat different pavement distresses and to increase the life span of pavement. Unmodified bitumen cannot perform well over the range of extreme minimum and maximum pavement temperatures. The polymers commonly used to modify bitumen are ethylene vinyl acetate (EVA) and styrene butadiene styrene (SBS). The aim of this study is to compare the performance of EVA-modified bitumen with bitumen modified by waste low-density polyethylene (LDPE) and polypropylene (PP) obtained from waste carry bags, and by waste tyre rubber (CR), in order to encourage the use of waste polymers, whose disposal is a big problem today, in place of costly virgin polymer. From the experimental study, it was found that waste polymers are as effective in improving the properties of bitumen as virgin polymer.

Evaluation of Fitts’ Law Index of Difficulty Formulation for Screen Size Variations

It is well known, as Fitts’ law, that the time for a user to point at a target on a GUI screen can be modeled as a linear function of the “index of difficulty (ID).” In this paper, the authors investigate whether the traditional ID formulation is appropriate independently of device screen size. The result of our experiment reveals that the ID formulation may not consistently capture actual difficulty: users’ pointing performances are not consistent among target variations whose index of difficulty is the same. The term A/W may not be appropriate, because this term causes the observed inconsistency. Based on this finding, the authors then evaluate the applicability of possible models other than Fitts’. Multiple regression models are found to be able to appropriately represent the effects of target design variations. The authors next make an attempt to improve the definition of ID in Fitts’ model. Our idea is to raise the size or the distance values depending on the screen size. The modified model is found to fit the users’ pointing data well, which supports the idea.
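For reference, the traditional formulation investigated here models the movement time MT as a linear function of the index of difficulty, where A is the movement amplitude (distance to the target) and W the target width:

```latex
MT = a + b \cdot ID, \qquad
ID = \log_2\!\left(\frac{2A}{W}\right) \;\text{(Fitts)} \quad\text{or}\quad
ID = \log_2\!\left(\frac{A}{W} + 1\right) \;\text{(Shannon form)}
```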

Compressive Strength and Microstructure of Hybrid Alkaline Cements

Publications in the field of alkali-activated binders state that this new material is likely to have high potential to become an alternative to Portland cement. Classical alkali-activated cements could be made more eco-efficient if the use of sodium silicate were avoided. Besides, most alkali-activated cements suffer from severe efflorescence, which originates from the fact that the alkalis and/or soluble silicates added during processing cannot be totally consumed. This paper presents experimental results on hybrid alkaline cements. Compressive strength results and efflorescence observations show that the new mixes analyzed so far are promising. SEM results show that no traditional porous interfacial transition zone (ITZ) was detected in these binders.

Identification of Coauthors in Scientific Database

The analysis of scientific collaboration networks has contributed significantly to improving the understanding of how the process of collaboration between researchers takes place and of how the scientific production of researchers or research groups evolves. However, the identification of collaborations in large scientific databases is not a trivial task, given the high computational cost of the methods commonly used. This paper proposes a method for identifying collaborations in a large database of researchers' curricula. The proposed method has low computational cost and yields satisfactory results, proving to be an interesting alternative for the modeling and characterization of large scientific collaboration networks.
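As an illustrative sketch only (not the paper's algorithm), a low-cost way to identify coauthorships is to invert the records by publication so that no all-pairs comparison of researchers is needed; the record format and identifiers below are hypothetical.

```python
# Illustrative sketch: building a coauthorship network from an inverted index
# (publication -> authors), avoiding an all-pairs scan of researcher records.
from collections import defaultdict
from itertools import combinations

def coauthor_edges(records):
    """records: iterable of (author_id, publication_id) pairs taken from curricula.
    Returns a dict mapping (author_a, author_b) -> number of shared publications."""
    by_pub = defaultdict(set)
    for author, pub in records:
        by_pub[pub].add(author)              # group authors under each publication
    edges = defaultdict(int)
    for authors in by_pub.values():
        for a, b in combinations(sorted(authors), 2):
            edges[(a, b)] += 1               # one increment per shared publication
    return edges

# Usage with hypothetical records:
# recs = [("A", "p1"), ("B", "p1"), ("A", "p2"), ("B", "p2"), ("C", "p2")]
# coauthor_edges(recs) -> {("A", "B"): 2, ("A", "C"): 1, ("B", "C"): 1}
```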

Turbine Compressor Vibration Analysis and Rotor Movement Evaluation by Shaft Center Line Method (The Case History Related to Main Turbine Compressor of an Olefin Plant in Iran Oil Industries)

Vibration monitoring of the most critical equipment, such as main turbines and compressors, always plays an important role in preventive maintenance and management considerations in large industrial plants. There are a number of traditional methods, such as monitoring the overall vibration data from the Bently Nevada panel and monitoring the time waveform (TWF) or fast Fourier transform (FFT). In addition, the shaft centerline monitoring method has developed considerably in recent years. There are a number of arguments, both in favor of and against this method, among people who work in preventive maintenance and condition monitoring (vibration analysts). In this paper, the basic principles of turbine compressor vibration analysis and rotor movement evaluation by the shaft centerline method are discussed in detail through a case history. This case history relates to the main turbine compressor of an olefin plant in the Iranian oil industry. In addition, some common mistakes that vibration analysts may make during the process are discussed in detail. It is worth noting that these mistakes may be one of the reasons why this method sometimes appears to be ineffective. Furthermore, recent patents and innovations in shaft position and movement evaluation are discussed in this paper.
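As a rough sketch of the underlying arithmetic (the probe sensitivity and sign convention below are assumptions, not data from this case history), the shaft centerline position is obtained from the DC gap voltages of an X-Y proximity probe pair relative to a slow-roll reference:

```python
# Illustrative sketch: converting X-Y proximity probe DC gap voltages into a
# shaft centerline position. The sensitivity below is an assumed typical value.
PROBE_SENSITIVITY_V_PER_MIL = 0.2   # assumed eddy-current probe scale factor

def centerline_position(gap_x, gap_y, ref_x, ref_y):
    """Return (dx, dy) shaft movement in mils relative to the reference gaps.
    Under the assumed convention, gap voltages are negative and become less
    negative as the shaft approaches the probe, so a positive dx/dy means
    movement toward the X/Y probe."""
    dx = (gap_x - ref_x) / PROBE_SENSITIVITY_V_PER_MIL
    dy = (gap_y - ref_y) / PROBE_SENSITIVITY_V_PER_MIL
    return dx, dy

# Plotting (dx, dy) samples collected during run-up traces the centerline path,
# which the analyst compares against the bearing clearance circle.
```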

A Comprehensive Survey and Comparative Analysis of Black Hole Attack in Mobile Ad Hoc Network

A Mobile Ad-hoc Network (MANET) is a self-managing network consisting of versatile nodes that are capable of communicating with each other without any fixed infrastructure. These nodes may be routers and/or hosts. Due to the dynamic nature of the network, routing protocols are vulnerable to various kinds of attacks. The black hole attack is one of the most conspicuous security threats in MANETs. As the route discovery process is obligatory and customary, attackers make use of this loophole to succeed in their aim of disrupting the network. In a black hole attack, packets are redirected to a node that does not actually exist in the network. Many researchers have proposed different techniques to detect and prevent this type of attack. In this paper, we analyze various routing protocols in this context. Further, we present a critical comparison among the protocols and show which routing metrics are required for a proper and meaningful analysis of them.

Per Flow Packet Scheduling Scheme to Improve the End-to-End Fairness in Mobile Ad Hoc Wireless Network

Various fairness models and criteria proposed by academia and industry for wired networks can be applied to ad hoc wireless networks. Achieving end-to-end fairness in an ad hoc wireless network is a challenging task compared to wired networks, and it has not been addressed effectively. Most of the traffic in an ad hoc network consists of transport layer flows, and thus the fairness of transport layer flows has attracted the interest of researchers. Factors such as the MAC protocol, the routing protocol, the length of a route, the buffer size, the active queue management algorithm and the congestion control algorithm affect the fairness of transport layer flows. In this paper, we consider the rate of data transmission, queue management and the packet scheduling technique. The ad hoc network is dynamic in nature: parameters such as the transmission of control packets, the multihop nature of packet forwarding, changes in the source and destination nodes and changes in the routing path influence the throughput and the fairness among concurrent flows. In addition, the interaction between the protocols in the data link and transport layers also plays a role in determining the rate of data transmission. We maintain a queue for each flow, and the delay information of each flow is maintained accordingly. The pre-processing of a flow is done up to the network layer only: the source and destination address information is used to separate flows, and transport layer information is not used. This minimizes the delay in the network. Each flow is attached to a timer that is updated dynamically. A Finite State Machine (FSM) is proposed for the queue and transmission control mechanism. The performance of the proposed approach is evaluated in the ns-2 simulation environment, with throughput and fairness under mobility for different flows used as performance metrics. We compare the performance of the proposed approach with ATP and MC-MLAS, and the performance of the proposed approach is encouraging.
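As an illustrative sketch only (the paper's FSM-based control is not reproduced here), per-flow queueing keyed by network-layer addresses with round-robin service could be outlined as follows; the node names and packets in the usage example are hypothetical.

```python
# Illustrative sketch: per-flow queues keyed by (source, destination) addresses
# with round-robin dequeueing; a generic illustration, not the paper's FSM.
from collections import defaultdict, deque

class PerFlowScheduler:
    def __init__(self):
        self.queues = defaultdict(deque)      # (src, dst) -> FIFO of packets

    def enqueue(self, src, dst, packet):
        # Flows are separated using network-layer addresses only;
        # no transport-layer (port) information is inspected.
        self.queues[(src, dst)].append(packet)

    def dequeue_round(self):
        """Serve one packet from each backlogged flow, round-robin."""
        served = []
        for flow, queue in list(self.queues.items()):
            if queue:
                served.append((flow, queue.popleft()))
        return served

# Usage (hypothetical packets):
# sched = PerFlowScheduler()
# sched.enqueue("n1", "n5", "pkt-a"); sched.enqueue("n2", "n6", "pkt-b")
# sched.dequeue_round()  # -> [(("n1", "n5"), "pkt-a"), (("n2", "n6"), "pkt-b")]
```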

A TIPSO-SVM Expert System for Efficient Classification of TSTO Surrogates

Fully reusable spaceplanes do not yet exist. This implies that design qualification for an optimized, highly integrated forebody-inlet configuration of a booster-stage vehicle cannot be based on archival data from other spaceplanes. Therefore, this paper proposes a novel TIPSO-SVM expert system methodology. A non-trivial problem related to the optimization and classification of the hypersonic forebody-inlet configuration in conjunction with the mass model of the two-stage-to-orbit (TSTO) vehicle is solved. The hybrid-heuristic machine learning methodology is based on the two-step improved particle swarm optimizer (TIPSO) algorithm and a two-step support vector machine (SVM) data classification method. The efficacy of the method is tested by first evolving an optimal configuration for the hypersonic compression system using the TIPSO algorithm and then classifying the results using the two-step SVM method. In the first step, extensive but unclassified mass-model training data for multiple optimized configurations is segregated and pre-classified for training the SVM algorithm. In the second step, the TIPSO-optimized mass-model data is classified using the trained SVM. The results showed remarkable improvement in the configuration and mass model along with the sizing parameters.
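As an illustrative sketch only (the TIPSO optimizer and the actual mass-model features are not reproduced here), the "pre-classify the training data, then classify the optimizer output" flow of the SVM stage might look as follows; all feature values, labels and parameters are hypothetical.

```python
# Illustrative sketch: a "label the training data, then classify optimizer output"
# SVM stage. Features, labels and parameters are hypothetical placeholders.
import numpy as np
from sklearn.svm import SVC

# Step 1: pre-classified mass-model training data (made-up sizing-parameter rows
# for candidate forebody-inlet configurations; 0 = infeasible, 1 = feasible).
X_train = np.array([[0.2, 1.1], [0.3, 1.0], [0.8, 2.1], [0.9, 2.3]])
y_train = np.array([0, 0, 1, 1])

clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)

# Step 2: classify a configuration produced by the optimizer (made-up candidate).
candidate = np.array([[0.85, 2.2]])
print(clf.predict(candidate))   # -> [1], i.e. grouped with the feasible class
```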

On the Parameter of the Burr Type X under Bayesian Principles

A comprehensive Bayesian analysis has been carried out in the context of informative and non-informative priors for the shape parameter of the Burr Type X distribution under different symmetric and asymmetric loss functions. Elicitation of the hyperparameter through the prior predictive approach is also discussed. We also derive expressions for the posterior predictive distributions, the predictive intervals and the credible intervals. As an illustration, comparisons of these estimators are made through a simulation study.
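For reference, the one-parameter Burr Type X distribution with shape parameter θ analyzed above has density and distribution function:

```latex
f(x;\theta) = 2\theta\, x\, e^{-x^{2}} \left(1 - e^{-x^{2}}\right)^{\theta-1},
\qquad
F(x;\theta) = \left(1 - e^{-x^{2}}\right)^{\theta},
\qquad x > 0,\ \theta > 0.
```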

Stability of Square Plate with Concentric Cutout

The finite element method is used to obtain the elastic buckling load factor for a square isotropic plate containing circular, square and rectangular cutouts. The commercial finite element software ANSYS has been used in the study. The applied in-plane loads considered are uniaxial and biaxial compression. In all cases the load is distributed uniformly along the plate's outer edges. The effects of the size and shape of concentric cutouts for different plate thickness ratios, and the influence of plate edge conditions such as SSSS, CCCC and the mixed boundary condition SCSC, on the plate buckling strength have been considered in the analysis.
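For context, the buckling load factor obtained from such analyses is conventionally defined with respect to the classical elastic plate buckling expression, where b is the plate width, t its thickness, E the Young's modulus and ν the Poisson's ratio:

```latex
N_{cr} = k\,\frac{\pi^{2} D}{b^{2}}, \qquad
D = \frac{E\,t^{3}}{12\,(1-\nu^{2})}
```

Here k is the buckling coefficient, which depends on the boundary conditions, the loading and, in the present study, the size and shape of the cutout.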