Abstract: Electronic apex locators (EALs) have been widely used clinically to measure root canal working length with high accuracy, which is crucial for successful endodontic treatment. To maintain high accuracy across different measurement environments, this study presents a system for root canal length measurement based on the multifrequency impedance method. The system generates a swept current with frequencies from 100 Hz to 1 MHz through a direct digital synthesizer. Multiple impedance ratios for different combinations of frequencies were obtained and digitized by an analog-to-digital converter, and representative ratios were selected after data processing. By measuring a large number of teeth, the system statistically analyzed the functional relationship between these impedance ratios and the distance between the file and the apex. The position of the apical foramen can then be determined from these impedance ratios using the statistical model. The experimental results revealed that the system based on the multifrequency impedance-ratio method determined the position of the apical foramen more accurately than the dual-frequency impedance-ratio method. Moreover, the system performed more stably in more complex measurement environments.
Abstract: Constant monitoring of blood glucose levels is necessary to maintain patients' health and to alert medical specialists to take preemptive measures before the onset of any complication resulting from diabetes. Current clinical monitoring of blood glucose relies on repeated invasive sampling, which is uncomfortable and may cause infections in diabetic patients. Several attempts have been made to develop non-invasive techniques for blood glucose measurement. The existing methods, however, are unreliable and less accurate, while other approaches claiming high accuracy have not been tested on extended datasets, so their results are not statistically significant. It is well known that acetone concentration in breath is directly related to blood glucose level. In this paper, we develop a first-of-its-kind, reliable, high-accuracy breath analyzer for non-invasive blood glucose measurement. The acetone concentration in breath was measured using an MQ 138 sensor in samples collected from local hospitals in Pakistan involving one hundred patients. The blood glucose levels of these patients were determined using the conventional invasive clinical method. We propose a linear regression classifier trained to map breath acetone level to the collected blood glucose level, achieving high accuracy.
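As a rough illustration of the regression step (not the authors' implementation, and using synthetic values rather than the clinical data), a linear model mapping breath acetone concentration to blood glucose can be fitted by ordinary least squares:

```python
import numpy as np

def fit_linear_model(acetone_ppm, glucose_mgdl):
    """Fit glucose = a * acetone + b by ordinary least squares."""
    A = np.column_stack([acetone_ppm, np.ones_like(acetone_ppm)])
    coef, *_ = np.linalg.lstsq(A, glucose_mgdl, rcond=None)
    return coef  # (slope a, intercept b)

def predict(coef, acetone_ppm):
    a, b = coef
    return a * np.asarray(acetone_ppm) + b

# Synthetic illustration (NOT the paper's data): assume glucose rises
# roughly linearly with breath acetone, plus measurement noise.
rng = np.random.default_rng(0)
acetone = rng.uniform(0.5, 5.0, size=100)                 # ppm
glucose = 40.0 * acetone + 70.0 + rng.normal(0, 5, 100)   # mg/dL
coef = fit_linear_model(acetone, glucose)
```

The fitted slope and intercept then convert any new breath reading into a glucose estimate via `predict`.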
Abstract: Power distribution circuits undergo frequent network
topology changes that are often left undocumented. As a result, the
documentation of a circuit’s connectivity becomes inaccurate with
time. The lack of reliable circuit connectivity information is one of the
biggest obstacles to model, monitor, and control modern distribution
systems. To enhance the reliability and efficiency of electric power
distribution systems, the circuit’s connectivity information must be
updated periodically. This paper focuses on one critical component of
a distribution circuit’s topology - the secondary transformer to phase
association. This topology component describes the set of phase lines
that feed power to a given secondary transformer (and therefore a
given group of power consumers). Documenting this component is called Phase Identification, and it is typically performed with physical measurements. These measurements can take on the order of several months to collect, but supervised learning can reduce this time significantly. This paper compares
several such methods applied to Phase Identification for a large
range of real distribution circuits, describes a method of training
data selection, describes preprocessing steps unique to the Phase
Identification problem, and ultimately describes a method which
obtains high accuracy (> 96% in most cases, > 92% in the worst
case) using only 5% of the measurements typically used for Phase
Identification.
Abstract: DNA data have been used in forensics for decades. However, current research considers using DNA as a biometric identity verification modality, with the goal of improving the speed of identification. We aim to use gene data originally collected for autism detection to determine whether, and how accurately, these data can support identification applications. Our main goal is to find whether our data preprocessing technique yields data useful as a biometric identification tool. We experiment with the nearest neighbor classifier to identify subjects. Results show that the optimal classification rate is achieved when the test set is corrupted by normally distributed noise with zero mean and a standard deviation of 1, and that the classification rate remains close to optimal as the noise standard deviation increases to 3. This shows that the data can be used for identity verification with high accuracy using a simple classifier such as the k-nearest neighbor (k-NN).
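The noise-corruption experiment described above can be sketched as follows with a 1-NN classifier on synthetic prototype vectors; the data here are made up for illustration, not the study's gene dataset:

```python
import numpy as np

def knn_predict(train_X, train_y, test_X, k=1):
    """Classify each test vector by majority vote of its k nearest
    training vectors (Euclidean distance)."""
    preds = []
    for x in test_X:
        d = np.linalg.norm(train_X - x, axis=1)
        nearest = train_y[np.argsort(d)[:k]]
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])
    return np.array(preds)

# Hypothetical gene-expression-like templates: one prototype per subject;
# test samples are the prototypes corrupted by zero-mean Gaussian noise.
rng = np.random.default_rng(1)
prototypes = rng.normal(0, 10, size=(20, 50))  # 20 subjects, 50 features
labels = np.arange(20)
rates = {}
for sigma in (1.0, 3.0):
    noisy = prototypes + rng.normal(0, sigma, prototypes.shape)
    rates[sigma] = np.mean(knn_predict(prototypes, labels, noisy) == labels)
```

With well-separated prototypes, the identification rate stays high even as the noise standard deviation grows from 1 to 3, mirroring the robustness the abstract reports.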
Abstract: Numerical computation of wave propagation in a large
domain usually requires significant computational effort. Hence, the
considered domain must be truncated to a smaller domain of interest.
In addition, special boundary conditions, which absorb the outward
travelling waves, need to be implemented in order to describe the
system domains correctly. In this work, the linear one-dimensional wave equation is approximated using the Fourier Galerkin approach. Furthermore, the artificial boundaries are realized with absorbing boundary conditions. A systematic workflow for setting up the wave problem, including the absorbing boundary conditions, is proposed. As a result, a convenient modal
system description with an effective absorbing boundary formulation
is established. Moreover, the truncated model shows high accuracy
compared to the global domain.
Abstract: Forest inventories are essential to assess the composition, structure and distribution of forest vegetation, which can be used as baseline information for management decisions. Classical forest inventory is labor-intensive, time-consuming and sometimes even dangerous. The use of Light Detection and Ranging (LiDAR) in forest inventory can overcome these restrictions. This study was conducted to determine the possibility of using LiDAR-derived data to extract high-accuracy forest biophysical parameters and as a non-destructive method for forest status analysis of San Manuel, Pangasinan. Forest resource extraction was carried out using LAStools, GIS, ENVI and .bat scripts with the available LiDAR data. The process includes the generation of derivatives such as the Digital Terrain Model (DTM), Canopy Height Model (CHM) and Canopy Cover Model (CCM) in .bat scripts, followed by the generation of 17 composite bands used to extract forest cover classifications with ENVI 4.8 and GIS software. The Diameter at Breast Height (DBH), Above-Ground Biomass (AGB) and Carbon Stock (CS) were estimated for each classified forest cover, and Tree Count Extraction was carried out using GIS. Subsequently, field validation was conducted for accuracy assessment. Results showed that the forest of San Manuel has 73% forest cover, well above the 10% canopy cover requirement. Of the extracted canopy heights, 80% of the trees range from 12 m to 17 m. CS of the three forest covers based on the AGB were: 20819.59 kg/20x20 m for closed broadleaf, 8609.82 kg/20x20 m for broadleaf plantation and 15545.57 kg/20x20 m for open broadleaf. The average tree count for the forest plantation was 413 trees/ha. As such, the forest of San Manuel has high percent forest cover and high CS.
Abstract: This paper presents an analytical model for predicting the shear strength of reinforced concrete (RC) beams strengthened with CFRP composites as external reinforcement. The proposed model can predict the shear contribution of CFRP composites in RC beams with an acceptable coefficient of correlation with the test results. In a comparison of the proposed model with well-known published models (the ACI, Triantafillou, and Colotti models), the ACI model had the widest range, 0.16 to 10.08, for the ratio between the tested and predicted ultimate shears at failure. The Triantafillou model gave an acceptable range of 0.27 to 2.78 for this ratio. Finally, the best prediction of the ultimate shear capacity (the ratio between the tested and predicted values) was obtained with the Colotti model, with a range of 0.20 to 1.78. Thus, the contribution of CFRP composites as external reinforcement can be predicted with high accuracy using the proposed analytical model.
Abstract: The global coverage of broadband multimedia and internet-based services in terrestrial-satellite networks is of particular interest to satellite providers seeking to deliver services with low latency and high signal quality to diverse users. In particular, the delay of on-board processing is an inherent source of latency in satellite communication that is sometimes neglected in the end-to-end delay of the satellite link. The framework for this paper includes modelling an on-orbit satellite payload using an agent model that can reproduce the properties of processing delays. In essence, a comparison of different spatial interpolation methods is carried out to evaluate physical data obtained from a GEO satellite in order to define a discretization function for determining that delay. Furthermore, the performance of the proposed agent and the developed delay discretization function are validated together by simulating a hybrid satellite and terrestrial network. Simulation results show high accuracy with respect to the characteristics of the initial data points of processing delay for Ku bands.
Abstract: Transforaminal lumbar interbody fusion (TLIF) surgeries have become popular for the treatment of degenerative spinal disorders. An interbody fusion technique such as TLIF maintains the load-bearing capacity of the spine and a suitable disc height. Many techniques have been introduced to treat spondylolisthesis, and this surgery provides greater rehabilitation of degenerative spines. Existing TLIF methods use a guideway, a troublesome technique because two separate instruments are required to perform the surgery. This paper presents a concept that eliminates the use of the guideway and thereby avoids problems such as reverting the cage. The concept discussed in this paper also provides high accuracy while performing the surgery.
Abstract: This study optimized the design parameters of a cone loudspeaker as an example of highly flexible product design. We developed an acoustic analysis software program that considers the impact of damping caused by air viscosity. In sound reproduction, it is difficult to optimize each parameter of the loudspeaker design. To overcome this practical limitation, this study presents an acoustic analysis algorithm that optimizes the loudspeaker's design parameters. The material characteristics of the cone paper and the loudspeaker edge were the design parameters, and the vibration displacement of the cone paper was the objective function. The analysis showed that the optimized design had high accuracy compared with the predicted values. These results suggest that although parameter design is difficult even with experience and intuition, it can be performed easily using the optimized design found with the acoustic analysis software.
Abstract: Understanding the nature of wheel and rail wear in the railway field is of fundamental importance to the safe and cost-effective operation of the railways. Twin disc wear testing is used extensively for studying the wear of wheel and rail materials. In this paper, the University of Huddersfield twin disc rig was used to examine the effect of surface conditions on wheel and rail wear measurement under a range of wheel/rail contact conditions, with and without contaminants. This work focuses on an investigation of the effect of dry, wet, and lubricated conditions and the effect of contaminants such as sand on wheel and rail wear. The wheel and rail wear measurements were carried out using a replica material and an optical profilometer that allows measurement of wear in difficult locations with high accuracy. The results demonstrate the rates at which both water and oil reduce wheel and rail wear. Scratches and other damage were seen on the wheel and rail surfaces after the addition of sand, and consequently both wheel and rail wear damage rates increased under these conditions. This work introduced the replica material and an optical instrument as effective tools for studying the effect of surface conditions on wheel and rail wear.
Abstract: This paper presents a numerical model based on the finite-discrete element method for analyzing the structural response of dry stone masonry structures under static and dynamic loads. More precisely, each discrete stone block is discretized by finite elements. Material non-linearity, including fracture and fragmentation of the discrete elements as well as cyclic behavior during dynamic loading, is considered through contact elements implemented within the finite element mesh. The model was applied to several examples of these structures. The performed analyses show high accuracy of the numerical results in comparison with the experimental ones and demonstrate the potential of the finite-discrete element method for modelling the response of dry stone masonry structures.
Abstract: Human actions are recognized directly from video sequences. The objective of this work is to recognize various human actions such as running, jumping, and walking. Human action recognition requires some prior knowledge about actions, namely motion estimation and foreground and background estimation. A region of interest (ROI) is extracted to identify the human in the frame. Then, an optical flow technique is used to extract the motion vectors. Using the extracted features, similarity-measure-based classification is performed to recognize the action. From experiments on the Weizmann database, it is found that the proposed method offers high accuracy.
Abstract: As wireless communication services grow rapidly, the problem of spectrum utilization has become increasingly serious. An emerging technology, cognitive radio, has been proposed to solve today's spectrum scarcity problem. To support the spectrum reuse functionality, secondary users are required to sense the radio frequency environment, and once the primary users are found to be active, the secondary users must vacate the channel within a certain amount of time. Therefore, spectrum sensing is of significant importance. Once sensing is done, different prediction rules apply to classify the traffic pattern of the primary user. Primary users follow two types of traffic patterns: periodic and stochastic ON-OFF patterns. A cognitive radio can learn the patterns in different channels over time. Two classification methods are discussed in this paper: one based on edge detection and one based on the autocorrelation function. The edge detection method has high accuracy but cannot tolerate sensing errors. Autocorrelation-based classification is applicable in the real environment, as it can tolerate some amount of sensing error.
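The autocorrelation-based classification idea can be sketched as follows: a periodic ON-OFF occupancy pattern produces a strong secondary autocorrelation peak at its period, while a stochastic pattern does not. The sequences and threshold below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def autocorr(x):
    """Normalized autocorrelation of a zero-mean version of x."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    full = np.correlate(x, x, mode="full")[len(x) - 1:]
    return full / full[0]

def looks_periodic(occupancy, min_lag=2, threshold=0.5):
    """Classify an ON-OFF occupancy sequence as periodic if its
    autocorrelation has a strong secondary peak."""
    r = autocorr(occupancy)
    return bool(r[min_lag:len(r) // 2].max() > threshold)

# Illustrative channels (not from the paper): a strictly periodic
# pattern with period 8 vs an i.i.d. random occupancy pattern.
periodic = np.tile([1, 1, 1, 1, 0, 0, 0, 0], 40)
rng = np.random.default_rng(2)
random_occ = rng.integers(0, 2, size=320)
```

A few corrupted samples (sensing errors) only slightly lower the autocorrelation peak, which is why this approach tolerates imperfect sensing better than edge detection.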
Abstract: In the present scenario, cardiovascular problems are a growing challenge for researchers and physiologists. As heart disease has no geographic, gender, or socioeconomic boundaries, detecting cardiac irregularities at an early stage, followed by quick and correct treatment, is very important. The electrocardiogram is the finest tool for continuous monitoring of heart activity. Heart rate variability (HRV) measures the naturally occurring oscillations between consecutive cardiac cycles, and analysis of this variability is carried out using time-domain, frequency-domain, and non-linear parameters. This paper presents an HRV analysis of online datasets for normal sinus rhythm (taken as healthy subjects) and sudden cardiac death (SCD subjects) using all three methods, computing the standard deviation of normal-to-normal intervals (SDNN), the root mean square of successive differences between adjacent RR intervals (RMSSD), and the mean of R-to-R intervals (mean RR) in the time domain; the very-low-frequency (VLF), low-frequency (LF), and high-frequency (HF) components and the LF/HF ratio in the frequency domain; and the Poincare plot for non-linear analysis. To differentiate the HRV of healthy subjects from that of subjects who died of SCD, a k-nearest neighbor (k-NN) classifier was used because of its high accuracy. Results show highly reduced values of all stated parameters for SCD subjects compared with healthy ones. As the dataset used for SCD patients is a recording of their ECG signal one hour prior to death, it is verified with an accuracy of 95% that the proposed algorithm can identify the mortality risk of a patient one hour before death. Identifying a patient's mortality risk at such an early stage may prevent sudden death if timely and appropriate treatment is given by the doctor.
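The time-domain parameters named above (mean RR, SDNN, RMSSD) have standard definitions that can be sketched directly; the RR series below is a short made-up example, not data from the study:

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Time-domain HRV measures from a series of RR intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "mean_rr": rr.mean(),                  # mean R-to-R interval
        "sdnn": rr.std(ddof=1),                # SD of NN intervals
        "rmssd": np.sqrt(np.mean(diff ** 2)),  # RMS of successive differences
    }

# Example: a short synthetic RR series around 800 ms (~75 bpm)
rr = [790, 815, 800, 825, 795, 810, 805]
m = hrv_time_domain(rr)
```

A markedly reduced SDNN or RMSSD relative to healthy baselines is the kind of signal the k-NN classifier in the study uses to flag SCD risk.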
Abstract: In recent years, there has been an explosion in the use of technologies that help discover diseases. For example, DNA microarrays allow us for the first time to obtain a "global" view of the cell. They have great potential to provide accurate medical diagnosis and to help find the right treatment and cure for many diseases. Various classification algorithms can be applied to such microarray datasets to devise methods that predict the occurrence of leukemia. In this study, we compared the classification accuracy and response time of eleven decision tree methods and six rule classifier methods using five performance criteria. The experimental results show that Random Tree produces better results and takes the least time to build a model among the tree classifiers. Among the rule classifiers, the nearest-neighbor-like algorithm (NNge) is the best, owing to its high accuracy and lowest model-building time.
Abstract: In this paper, a low-voltage, high-performance current mirror is presented. Its most important specifications, which are improved in this work, are analyzed and formulated, proving that it has outstanding merits: a very low input resistance of 26 mΩ, a very wide current dynamic range of 8 decades from 10 pA to 1 mA (160 dB) together with an extremely low current copy error of less than 0.6 ppm, and very low input and output voltages. Furthermore, the proposed current mirror achieves a bandwidth of 944 MHz with very low power consumption (267 μW) and transistor count. HSPICE simulations using TSMC 0.18 μm CMOS technology with a 1.8 V single power supply confirm the theoretically proven outstanding performance of the proposed current mirror. Monte Carlo simulation of its most important parameters is also examined, showing sufficient robustness against technology process variations.
Abstract: This paper describes a simple way to control the speed of a PMBLDC motor using the fuzzy logic control (FLC) method. In the conventional approach, the performance of the motor system is simulated and the speed is regulated using a PI controller. Such methods improve the performance of PMSM drives, but under different operating conditions the dynamics of the system vary over time with changes in reference speed, parameter variations, and load disturbances. The simulation is carried out in MATLAB to obtain a reliable and flexible result. To highlight the effectiveness of the speed control method, the FLC method is used. The proposed method targets improved dynamic performance and avoids the variations of the motor drive. The drive offers high accuracy and robust operation from near zero to high speed. The effectiveness and flexibility of the individual speed control techniques are thoroughly discussed, including their merits and demerits, and finally verified through simulation and experimental results for comparative analysis.
Abstract: Data fusion technology can be the best way to extract
useful information from multiple sources of data. It has been widely
applied in various applications. This paper presents a data fusion
approach in multimedia data for event detection in twitter by using
Dempster-Shafer evidence theory. The methodology applies a mining
algorithm to detect the event. Two types of data are fused. The first is features extracted from text using the bag-of-words method, weighted by term frequency-inverse document frequency (TF-IDF). The second is visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied to fuse the information from these two sources. Our experiments indicate that, compared with approaches using an individual data source, the proposed data fusion approach increases the prediction accuracy of event detection. The experimental results show that the proposed method achieved a high accuracy of 0.97, compared with 0.93 using texts only and 0.86 using images only.
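The fusion step can be sketched with Dempster's rule of combination for two mass functions, here simplified to singleton hypotheses (the paper's full method and actual mass assignments are not reproduced; the example masses are hypothetical):

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over the same frame of discernment
    using Dempster's rule (singleton hypotheses only, for simplicity)."""
    hypotheses = set(m1) | set(m2)
    # Conflict K: mass assigned by the two sources to different hypotheses
    conflict = sum(m1[a] * m2[b] for a in m1 for b in m2 if a != b)
    if conflict >= 1.0:
        raise ValueError("total conflict; sources cannot be combined")
    return {h: m1.get(h, 0.0) * m2.get(h, 0.0) / (1.0 - conflict)
            for h in hypotheses}

# Hypothetical masses from the text and image classifiers for one tweet
text_mass = {"event": 0.7, "no_event": 0.3}
image_mass = {"event": 0.6, "no_event": 0.4}
fused = dempster_combine(text_mass, image_mass)
```

When both sources lean the same way, the combined belief in "event" exceeds either source's individual mass, which is the intuition behind the accuracy gain reported above.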
Abstract: Localization information is crucial for the operation of wireless sensor networks (WSNs). There are principally two types of localization algorithms. Range-based localization algorithms have strict hardware requirements and are therefore expensive to implement in practice. Range-free localization algorithms reduce the hardware cost but achieve high accuracy only in ideal scenarios. In this paper, we locate unknown nodes by combining the advantages of these two types of methods. The proposed algorithm has each unknown node select the nearest anchor using the Received Signal Strength Indicator (RSSI) and then choose the two other most accurate anchors to obtain the estimated location. Our algorithm improves localization accuracy compared with previous algorithms, as demonstrated by the simulation results.
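The nearest-anchor selection via RSSI can be sketched with a log-distance path-loss model; the calibration constants and readings below are illustrative assumptions, not values from the paper:

```python
def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exp=2.0):
    """Invert the log-distance path-loss model:
    RSSI(d) = RSSI(1 m) - 10 * n * log10(d)."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))

def nearest_anchor(readings):
    """Pick the anchor with the strongest RSSI (i.e. the smallest
    estimated distance). `readings` maps anchor id -> RSSI in dBm."""
    return max(readings, key=readings.get)

# Hypothetical readings observed at one unknown node
readings = {"A1": -55.0, "A2": -48.0, "A3": -62.0}
best = nearest_anchor(readings)           # anchor with strongest signal
d_est = rssi_to_distance(readings[best])  # rough range estimate, meters
```

In practice the path-loss exponent and the 1 m reference RSSI must be calibrated for the deployment environment; the RSSI reading here only ranks anchors, which is why the method avoids the hardware cost of true range measurement.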