A Critical Approach to Modern Conception in the Context of Objectivity and Quantitative Method

The struggle between modern and postmodern understandings is also played out in the debate over the relative superiority of quantitative and qualitative methods, each evaluated within the scope of these understandings. By assuming that quantitative research (modern) accounts for structure while qualitative research (postmodern) explains process, these methods are turned into instruments of worldviews specific to a period. In fact, process does not operate independently of structure. Moreover, the ability of quantitative methods to provide scientific knowledge remains controversial as long as they exclude the dialectical method. For this reason, the critiques directed at modernism with respect to quantitative methods are, in a sense, legitimate. Nevertheless, the main issue is which parameters postmodernist critique uses to legitimize its objections, and whether these parameters represent a point of view that enables democratic solutions. In this respect, the scientific knowledge covered in the Turkish media, as a means through which ordinary people gain access to scientific knowledge, will be evaluated by means of content analysis within a new conception of objectivity.

Fluorescence Spectroscopy of Lysozyme-Silver Nanoparticles Complex

Identifying the nature of protein-nanoparticle interactions and favored binding sites is an important issue in the functional characterization of biomolecules and their physiological responses. Herein, the interaction of silver nanoparticles with lysozyme as a model protein has been monitored via fluorescence spectroscopy. Formation of a complex between the biomolecule and silver nanoparticles (AgNPs) induced a steady-state reduction in the fluorescence intensity of the protein at different concentrations of nanoparticles. Tryptophan fluorescence quenching spectra suggested that silver nanoparticles act as a foreign quencher, approaching the protein via this residue. Analysis of the Stern-Volmer plot yielded a quenching constant of 3.73 μM−1. Moreover, a single binding site in lysozyme is suggested to play a role during interaction with AgNPs, with a lower binding affinity compared to gold nanoparticles. Unfolding studies showed that the lysozyme-AgNP complex did not undergo structural perturbations compared to the bare protein. The results of this effort will pave the way for the utilization of sensitive spectroscopic techniques for the rational design of nanobiomaterials in biomedical applications.
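
The quenching constant reported above is typically obtained from the Stern-Volmer relation F0/F = 1 + K_SV[Q]. A minimal sketch of that fitting step is shown below; the concentration and intensity values are hypothetical placeholders, not data from the study.

```python
import numpy as np

# Hypothetical quencher (AgNP) concentrations in micromolar and measured
# tryptophan fluorescence intensities; F0 is the intensity without quencher.
conc_uM = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
intensity = np.array([100.0, 73.0, 57.0, 47.0, 40.0])

F0 = intensity[0]
ratio = F0 / intensity            # Stern-Volmer ordinate F0/F

# Linear fit of F0/F = 1 + K_SV * [Q]; the slope is the quenching constant.
slope, intercept = np.polyfit(conc_uM, ratio, 1)
print(f"K_SV ≈ {slope:.2f} per µM (intercept ≈ {intercept:.2f}, expected ~1)")
```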

Study on Position Polarity Compensation for Permanent Magnet Synchronous Motor Based on High Frequency Signal Injection

The application of high-frequency signal injection as a speed and position observer in PMSM drives has been a research focus. At present, the precision of this method is nearly as good as that of a ten-bit encoder, but open questions remain about estimating the position polarity. Based on high-frequency signal injection, this paper presents a method to compensate the position polarity for a permanent magnet synchronous motor (PMSM). Experiments were performed to test the effectiveness of the proposed algorithm, and the results demonstrate good performance.
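
The abstract does not spell out the compensation scheme. A common way to resolve the ±π ambiguity of HF-injection estimates is to inject two short voltage pulses of opposite sign along the estimated d-axis and compare the resulting peak currents, since magnetic saturation makes the current response larger when the pulse points toward the magnet's north pole. The helper below is a hypothetical illustration of that decision logic only, not the authors' algorithm.

```python
import math

def compensate_polarity(theta_est, i_peak_pos, i_peak_neg):
    """Return a polarity-corrected rotor angle estimate.

    theta_est  : rotor angle from HF injection (ambiguous by pi), in radians
    i_peak_pos : peak d-axis current after the +d test pulse
    i_peak_neg : peak d-axis current after the -d test pulse
    """
    # Saturation effect: the pulse aligned with the magnet N pole yields the
    # larger current peak. If the negative pulse wins, flip the estimate by pi.
    if abs(i_peak_neg) > abs(i_peak_pos):
        theta_est += math.pi
    return math.atan2(math.sin(theta_est), math.cos(theta_est))  # wrap to (-pi, pi]

# Example: estimate of 0.4 rad, but the -d pulse produced the larger response,
# so the corrected angle is 0.4 - pi (after wrapping).
print(compensate_polarity(0.4, i_peak_pos=9.8, i_peak_neg=10.6))
```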

Numerical Analysis and Experimental Validation of Detector Pressure Housing Subject to HPHT

Reservoirs with high pressures and temperatures (HPHT) that were considered atypical in the past are now frequent exploration targets. For downhole oilfield drilling tools and components, temperature and pressure affect the mechanical strength. To address this issue, a finite element analysis (FEA) at 206.84 MPa (30 ksi) pressure and 165°C has been performed on the pressure housing of the measurement-while-drilling/logging-while-drilling (MWD/LWD) density tool. The density tool is an MWD/LWD sensor that measures the density of the formation; one of its components is the pressure housing positioned inside the tool. The FEA results are compared with an experimental test performed on the pressure housing of the density tool. The results show a close match between the numerical predictions and the experimental test. The FEA model can be used for extreme HPHT and ultra-HPHT analyses and/or for optimizing design changes.
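
Alongside a full FEA, the housing's hoop stress under external pressure can be cross-checked analytically with Lamé's thick-walled cylinder solution. The sketch below uses hypothetical housing dimensions, not values from the paper, and is only a sanity check of the load case, not the authors' FEA model.

```python
def lame_hoop_stress_external(p_o_mpa, a_mm, b_mm):
    """Hoop stress (MPa) at the inner surface of a thick-walled cylinder loaded
    by external pressure p_o only (Lame solution); compressive values are negative."""
    return -2.0 * p_o_mpa * b_mm**2 / (b_mm**2 - a_mm**2)

# Hypothetical housing geometry: 40 mm inner radius, 55 mm outer radius,
# loaded by the 206.84 MPa (30 ksi) wellbore pressure quoted in the abstract.
sigma_hoop = lame_hoop_stress_external(206.84, a_mm=40.0, b_mm=55.0)
print(f"Peak hoop stress ≈ {sigma_hoop:.0f} MPa "
      "(to be compared against the material yield strength with a safety factor)")
```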

Analysis of Acoustic Emission Signal for the Detection of Defective Manufactures in Press Process

Small cracks or chips in a product appear very frequently in the course of continuous production on an automatic press process system. These phenomena cause not only defective products but also damage to the press mold. To solve this problem, an acoustic emission (AE) system was introduced. The AE system was expected to be very effective for real-time detection of defective products and for preventing damage to the press molds. In this study, to acquire and analyze the AE signals generated by the press process, AE sensors, a pre-amplifier, and an analysis and processing board were used, as is common in similar applications. Specialized software called cdm8 was used to analyze and process the AE signals acquired in real time from good and bad products. As a result of this work, it was confirmed that the intensity and shape of the various AE signals differ depending on the weight and thickness of the metal sheet and the process type.
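
The cdm8 processing itself is not described in the abstract. As a generic sketch of the kind of waveform features usually compared between good and defective strokes (peak amplitude, RMS energy, time above threshold), the following assumes the sampled AE signal is available as a NumPy array; the burst and threshold values are hypothetical.

```python
import numpy as np

def ae_features(signal, fs_hz, threshold):
    """Extract simple acoustic-emission burst features from a sampled waveform.

    signal    : 1-D array of AE sensor samples (volts)
    fs_hz     : sampling rate in Hz
    threshold : amplitude threshold used to define the burst duration
    """
    peak = np.max(np.abs(signal))                            # peak amplitude
    rms = np.sqrt(np.mean(signal**2))                        # RMS (energy-related)
    above = np.abs(signal) > threshold
    duration_ms = 1000.0 * np.count_nonzero(above) / fs_hz   # time above threshold
    return {"peak": peak, "rms": rms, "duration_ms": duration_ms}

# Example with a synthetic decaying burst; features from good and defective
# strokes can then be compared (a defect typically raises the peak and RMS).
t = np.linspace(0, 0.01, 10_000)
burst = np.exp(-400 * t) * np.sin(2 * np.pi * 150e3 * t)
print(ae_features(burst, fs_hz=1e6, threshold=0.1))
```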

Using Malolactic Fermentation with an Acid- and Ethanol-Adapted Oenococcus oeni Strain to Improve the Quality of Wine from Champs Bourcin Grape in Sapa - Lao Cai

The Champs Bourcin black grape, which originated from Aquitaine, France, and is planted in Sapa, Lao Cai province, exhibits high total acidity (11.72 g/L). After 9 days of alcoholic fermentation at 25°C using the Saccharomyces cerevisiae UP3OY5 strain, the ethanol concentration of the wine was 11.5% v/v; however, the wine had a sharply sour taste. Malolactic fermentation (MLF) was carried out with the Oenococcus oeni ATCC BAA-1163 strain, which had been pre-adapted to acid (pH 3-4) and ethanol (8-12% v/v) conditions. The highest viability (83.2%) upon malolactic fermentation was obtained after 5 days at 22°C with early-stationary-phase O. oeni cells pre-adapted to pH 3.5 and 8% v/v ethanol in MRS medium. The malic acid content of the wine decreased from 5.82 g/L to 0.02 g/L after MLF (21 days at 22°C). The sensory quality of the wine was significantly improved.

Design of a Tube Vent to Enhance the Role of Roof Solar Collector

The objective of this paper was to design a ventilation system to enhance the performance of a roof solar collector (RSC) and reduce heat accumulation inside the house. The RSC has a surface area of 1.8 m2, made of CPAC Monier roof tiles on the upper part and gypsum board on the lower part; the gap between the CPAC Monier tiles and the gypsum board was fixed at 14 cm. The ventilation system of the modified roof solar collector (modified RSC) consists of 9 tubes of 0.15 m diameter installed in the lower part of the RSC. Experimental results showed reductions in both the room temperature and the attic temperature. The average room temperature reduction for the house using the modified RSC was about 2°C, and the percentage of room temperature reduction varied between 0 and 10%. Therefore, the modified RSC is an interesting option in the sense that it promotes the use of solar energy and conserves energy.

An Anomaly Detection Approach to Detect Unexpected Faults in Recordings from Test Drives

In the automotive industry, test drives are conducted during the development of new vehicle models or as part of the quality assurance of series-production vehicles. The communication on the in-vehicle network, data from external sensors, and internal data from the electronic control units are recorded by automotive data loggers during the test drives, and the recordings are used for fault analysis. Since the resulting data volume is tremendous, manually analysing each recording in great detail is not feasible. This paper proposes using machine learning to support domain experts by sparing them from contemplating irrelevant data and instead pointing them to the relevant parts of the recordings. The underlying idea is to learn the normal behaviour from available recordings, i.e. a training set, and then to autonomously detect unexpected deviations and report them as anomalies. The one-class support vector machine "support vector data description" (SVDD) is utilised to calculate distances of feature vectors. SVDDSUBSEQ is proposed as a novel approach that allows subsequences in multivariate time series data to be classified. As shown by experimental results on recordings from test drives, the approach detects unexpected faults without modelling effort.
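
SVDDSUBSEQ itself is the authors' contribution and is not specified in the abstract. As a hedged illustration of the general idea (learning normal behaviour from windows of a multivariate signal and flagging deviating windows), one could use scikit-learn's one-class SVM, which is closely related to SVDD when an RBF kernel is used. The signals, window sizes, and injected fault below are all hypothetical.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def sliding_windows(x, width, step):
    """Cut a multivariate time series (T x channels) into flattened subsequences."""
    return np.array([x[i:i + width].ravel()
                     for i in range(0, len(x) - width + 1, step)])

# Hypothetical training data: recordings that contain only normal behaviour.
rng = np.random.default_rng(0)
normal = np.column_stack([np.sin(np.linspace(0, 40, 2000)),
                          rng.normal(0, 0.1, 2000)])
train = sliding_windows(normal, width=50, step=10)

# A one-class SVM with an RBF kernel approximates SVDD: it encloses the normal
# windows and assigns negative scores to windows far from that region.
model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(train)

# A test recording with an injected fault (one signal stuck at a constant value).
test_signal = normal.copy()
test_signal[1200:1300, 0] = 2.5
scores = model.decision_function(sliding_windows(test_signal, 50, 10))
print("most anomalous window index:", int(np.argmin(scores)))
```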

Vehicle Position Estimation for Driver Assistance System

We present a system that finds road boundaries and constructs a virtual lane based on fused data from a laser scanner and a monocular camera, and detects the position of forward vehicles even where there are no lane markers or under bad environmental conditions. When the road environment is dark, or many vehicles are parked on both sides of the road, it is difficult to detect the lane and the road boundary. For this reason, we fuse the laser and vision sensors to extract the road boundary and acquire three-dimensional data. We use a parabolic road model, based on the vehicle and sensor state parameters, to calculate the road boundaries and construct the virtual lane, and then we determine the vehicle position in each lane.
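
The abstract does not give the exact form of the parabolic road model. A common choice is to express the lateral offset as a quadratic in the look-ahead distance, x(y) = c0 + c1*y + c2*y^2, fitted to boundary points extracted from the sensors. The sketch below uses hypothetical boundary points and is an illustration of that fit, not the authors' estimator.

```python
import numpy as np

# Hypothetical road-boundary points from the laser/vision fusion step:
# y is the longitudinal look-ahead distance (m), x the lateral offset (m).
y = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
x = np.array([1.9, 2.0, 2.2, 2.5, 2.9, 3.4])

# Least-squares fit of the parabolic road model x(y) = c0 + c1*y + c2*y^2.
c2, c1, c0 = np.polyfit(y, x, 2)
print(f"x(y) = {c0:.3f} + {c1:.4f}*y + {c2:.5f}*y^2")

# The fitted curve gives the boundary at any look-ahead distance, e.g. 12 m,
# which can then be shifted by half a lane width to build the virtual lane.
print("boundary offset at 12 m:", c0 + c1 * 12 + c2 * 12**2)
```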

Determinants of the U.S. Current Account

This article provides empirical evidence on the effects of domestic and international factors on the U.S. current account deficit. Linear dynamic regression and vector autoregression models are employed to estimate the relationships over the period from 1986 to 2011. The findings suggest that the current and lagged private saving rate and the foreign current account of the East Asian economies have played a vital role in affecting the U.S. current account. Additionally, using Granger causality tests and variance decompositions, changes in productivity growth and foreign domestic demand are found to significantly influence the change in the U.S. current account. In summary, the empirical relationship between the U.S. current account deficit and its determinants is sensitive to the choice of regression model and specification.
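
The exact specifications are not given in the abstract. As a hedged sketch of the general workflow named above (VAR estimation, Granger causality tests, variance decompositions), the function below uses statsmodels on a hypothetical DataFrame df with stationary series in columns such as 'current_account', 'saving_rate', and 'productivity'; the column names and lag settings are illustrative only.

```python
import pandas as pd
from statsmodels.tsa.api import VAR

def var_analysis(df: pd.DataFrame, max_lags: int = 4):
    """Estimate a VAR, test Granger causality, and report variance decompositions."""
    model = VAR(df)
    results = model.fit(maxlags=max_lags, ic="aic")   # lag order chosen by AIC

    # Granger causality: does 'productivity' help predict 'current_account'?
    gc = results.test_causality("current_account", ["productivity"], kind="f")
    print(gc.summary())

    # Forecast-error variance decomposition over 8 periods.
    fevd = results.fevd(8)
    fevd.summary()          # prints the decomposition tables
    return results
```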

Simulated Annealing Algorithm for Data Aggregation Trees in Wireless Sensor Networks and Comparison with Genetic Algorithm

In ad hoc networks, the main issue in protocol design is quality of service, whereas in wireless sensor networks the main constraint is the limited energy of the sensors. Protocols that minimize the power consumption of the sensors therefore receive the most attention in wireless sensor networks. One way of reducing energy consumption in wireless sensor networks is to reduce the number of packets transmitted in the network. Data aggregation, which combines related data and prevents the transmission of redundant packets, can be effective in reducing the number of transmitted packets. Since processing information consumes less power than transmitting it, data aggregation is of great importance and is used in many protocols [5]. One data aggregation technique is to use a data aggregation tree, but finding an optimal data aggregation tree for collecting data in a network with one sink is an NP-hard problem. In the data aggregation technique, related information packets are combined at intermediate nodes into a single packet, so the number of packets transmitted in the network decreases, less energy is consumed, and the lifetime of the network improves. Heuristic methods are used to tackle this NP-hard problem; one such optimization method is simulated annealing. In this article, we propose a new method for building the data collection tree in wireless sensor networks using a simulated annealing algorithm, and we evaluate its efficiency against a genetic algorithm.
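
The abstract does not define the cost function or the neighbourhood move used. The sketch below is a generic simulated-annealing construction of a data aggregation tree, assuming the cost is the total Euclidean length of the tree edges (a rough proxy for transmission energy) and the move is re-parenting a random node; it illustrates the technique rather than the authors' algorithm.

```python
import math, random

def tree_cost(parent, pos):
    """Total edge length of the aggregation tree (proxy for transmit energy)."""
    return sum(math.dist(pos[v], pos[p]) for v, p in parent.items())

def is_ancestor(parent, a, b):
    """True if node a is an ancestor of node b (used to avoid creating cycles)."""
    while b in parent:
        b = parent[b]
        if b == a:
            return True
    return False

def anneal_tree(pos, sink=0, iters=20_000, t0=10.0, cooling=0.9995):
    nodes = [n for n in pos if n != sink]
    parent = {n: sink for n in nodes}          # initial tree: all nodes attach to the sink
    cost, temp = tree_cost(parent, pos), t0
    for _ in range(iters):
        v = random.choice(nodes)
        new_p = random.choice([n for n in pos if n != v])
        if is_ancestor(parent, v, new_p):      # re-parenting would form a cycle
            continue
        old_p = parent[v]
        parent[v] = new_p
        new_cost = tree_cost(parent, pos)
        # Accept improvements always, worse trees with Boltzmann probability.
        if new_cost < cost or random.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost
        else:
            parent[v] = old_p                   # reject: undo the move
        temp *= cooling
    return parent, cost

# Hypothetical 2-D sensor field with the sink at node 0.
random.seed(1)
positions = {i: (random.uniform(0, 100), random.uniform(0, 100)) for i in range(20)}
tree, cost = anneal_tree(positions)
print(f"final tree cost ≈ {cost:.1f}")
```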

A Local Statistics Based Region Growing Segmentation Method for Ultrasound Medical Images

This paper presents a region-based segmentation method for ultrasound images using local statistics. In this segmentation approach, the homogeneous regions depend on the image granularity features, where the structures of interest, with dimensions comparable to the speckle size, are to be extracted. The method uses a look-up table comprising the local statistics of every pixel, which consist of the homogeneity and similarity bounds according to the kernel size. The shape and size of the growing regions depend on these look-up table entries. The algorithm is implemented using a connected seeded region growing procedure in which each pixel is taken as a seed point. Region merging after the region growing also suppresses high-frequency artifacts. The updated merged regions produce the output in the form of a segmented image. The algorithm produces results that are less sensitive to the pixel location and also allows accurate segmentation of homogeneous regions.
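
The homogeneity and similarity bounds used by the authors are not given in the abstract. The following is a minimal, generic seeded region-growing sketch in which a pixel joins a region if its intensity lies within a fixed tolerance of the region's running mean; this is the simplest form of the local-statistics test that the method generalises, and the image and tolerance below are synthetic.

```python
import numpy as np
from collections import deque

def region_grow(image, seed, tol):
    """Grow a 4-connected region from `seed`, accepting pixels whose intensity
    is within `tol` of the region's running mean intensity."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    total, count = float(image[seed]), 1
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if abs(float(image[nr, nc]) - total / count) <= tol:
                    mask[nr, nc] = True
                    total += float(image[nr, nc])
                    count += 1
                    queue.append((nr, nc))
    return mask

# Example on a synthetic image: a bright square on a darker background.
img = np.full((64, 64), 40.0)
img[20:40, 20:40] = 120.0
segmented = region_grow(img, seed=(30, 30), tol=15.0)
print("pixels in region:", int(segmented.sum()))   # expected 400 (the 20x20 square)
```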

Colorectal Cancer Screening by a CEACAM-6 Immunosensor

CEACAM-6 antigen (C6AG) is a biomarker for colorectal cancer (CRC). This study therefore aims to develop a novel, simple and low-cost CEACAM-6 antigen immunosensor (C6AG-IMS), based on electrical impedance measurement, for the precise determination of C6AG. A low-cost screen-printed graphite electrode was constructed and used as the sensor, with the CEACAM-6 antibody (C6AB) immobilized on it. The sensor fabrication and antibody immobilization procedures are simple and inexpensive. Measurement of the electrical impedance over a defined frequency range (0.43-1.26 MHz) showed that the C6AG-IMS has an excellent linear (r2 > 0.9) response range (8.125-65 pg/mL), covering the normal physiological and pathological ranges of blood C6AG levels. The C6AG-IMS also has excellent reliability and validity, with an intraclass correlation coefficient of 0.97. In conclusion, a novel, simple, low-cost and reliable C6AG-IMS was designed and developed that can accurately determine blood C6AG levels across the pathological and normal physiological ranges. The C6AG-IMS can provide point-of-care, immediate screening results to users at home.

pH-Responsive Properties of a Biodegradable Hydrogel Based on Carrageenan-g-poly(NaAA-co-NIPAM)

A novel thermo-sensitive superabsorbent hydrogel with salt- and pH-responsive properties was obtained by grafting mixtures of acrylic acid (AA) and N-isopropylacrylamide (NIPAM) monomers onto kappa-carrageenan (kC), using ammonium persulfate (APS) as a free-radical initiator in the presence of methylene bisacrylamide (MBA) as a crosslinker. Infrared spectroscopy was carried out to confirm the chemical structure of the hydrogel, and the morphology of the samples was examined by scanning electron microscopy (SEM). The effects of the MBA concentration and the AA/NIPAM weight ratio on the water absorbency have been investigated. The swelling variations of the hydrogels were explained by swelling theory based on the hydrogel's chemical structure. The hydrogels exhibited salt sensitivity and cation-exchange properties. The temperature- and pH-reversibility of the hydrogels makes these intelligent polymers good candidates as potential carriers for bioactive agents, e.g. drugs.

Efficient Detection Using Sequential Probability Ratio Test in Mobile Cognitive Radio Systems

This paper proposes a smart design strategy for a sequential detector that reliably detects the primary user's signal, especially in fast-fading environments. We study the computation of the log-likelihood ratio for coping with fast-changing received signal and noise sample variances, which are treated as random variables. First, we analyze the detectability of the conventional generalized log-likelihood ratio (GLLR) scheme when the statistics of the unknown parameters change rapidly due to fast-fading effects. Secondly, we propose an efficient sensing algorithm that performs the sequential probability ratio test in a robust and efficient manner when the channel statistics are unknown. Finally, the proposed scheme is compared to the conventional method via simulation, in terms of the average number of samples required to reach a detection decision.
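
The likelihood model used by the authors (with random sample variances) is not given in the abstract. The sketch below is the textbook Wald SPRT for deciding between two known Gaussian variances (noise only versus signal plus noise), with Wald's thresholds derived from target false-alarm and missed-detection probabilities; all parameter values are hypothetical and the scheme is only a baseline, not the proposed algorithm.

```python
import math
import numpy as np

def sprt_energy_detect(samples, var0, var1, alpha=0.01, beta=0.01):
    """Wald SPRT between H0: x ~ N(0, var0) (noise) and H1: x ~ N(0, var1) (signal+noise).
    Returns ('H0' or 'H1', number of samples used), or (None, n) if undecided."""
    upper = math.log((1 - beta) / alpha)      # accept H1 above this
    lower = math.log(beta / (1 - alpha))      # accept H0 below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Per-sample log-likelihood ratio for zero-mean Gaussians.
        llr += 0.5 * math.log(var0 / var1) + 0.5 * x * x * (1 / var0 - 1 / var1)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return None, len(samples)

# Hypothetical example: a primary-user signal raises the sample variance from 1.0 to 2.0.
rng = np.random.default_rng(0)
decision, n_used = sprt_energy_detect(rng.normal(0, math.sqrt(2.0), 5000), 1.0, 2.0)
print(decision, "after", n_used, "samples")
```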

Position Awareness Mechanisms for Wireless Sensor Networks

A wireless sensor network (WSN) consists of a set of battery-powered nodes that collaborate to perform sensing tasks in a given environment. Each node in a WSN should be capable of operating for long periods of time with little or no external management. One requirement for this independence is that, in the presence of adverse positions, the sensor nodes must be able to configure themselves. Hence, to determine the existence of unusual events in their surroundings, the nodes should make use of position awareness mechanisms. This work approaches the problem by treating possible unusual events as diseases, making it possible to diagnose them through their symptoms, namely their side effects. Considering these awareness mechanisms as a foundation for high-level monitoring services, this paper also shows how they are included in the preliminary design of an intrusion detection system.

A Single-chip Proportional to Absolute Temperature Sensor Using CMOS Technology

Nowadays it is a trend for electronic circuit designers to integrate all system components on a single chip. This paper proposes the design of a single-chip proportional-to-absolute-temperature (PTAT) sensor, including a voltage reference circuit, using the CEDEC 0.18 μm CMOS technology. It is a challenge to design a single-chip, wide-range, linear-response temperature sensor for many applications. The channel widths of the compensation transistor and the reference transistor are critical for the design of the PTAT temperature sensor circuit. The designed temperature sensor shows excellent linearity between -100°C and 200°C, and the sensitivity is about 0.05 mV/°C. The chip is designed to operate from a single 1.6 V supply.
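
For context, the classical PTAT relation (not necessarily the exact circuit used in this design, which is MOS-based) comes from the difference of two base-emitter voltages operating at a current-density ratio N, which is linear in absolute temperature:

```latex
\Delta V_{BE} = V_{BE1} - V_{BE2} = \frac{kT}{q}\ln N
\qquad\Rightarrow\qquad
\frac{\partial\,\Delta V_{BE}}{\partial T} = \frac{k}{q}\ln N \approx 0.0862\,\ln N \;\text{mV/}^{\circ}\text{C}
```

The reported sensitivity of about 0.05 mV/°C is of the same order as this textbook figure, although the exact value depends on the transistor sizing described above.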

Development of a Simple Laser-Based 2D Compensating System for the Contouring Accuracy of Machine Tools

Dynamic contouring error is a critical element of machine tool accuracy. The contouring error is defined as the difference between the actual machining path and the commanded path generated by the feed drive system of the machine tool as it follows the command curves. The contouring error results from various factors, such as external loads, friction, inertia, feed rate, speed control, and servo control. This study therefore proposes a 2D compensating system for the contouring accuracy of machine tools. An optical method is adopted, using a frequency-stabilized laser diode and a high-precision position sensor detector (PSD) to perform non-contact measurement. Results show that the accuracy of the position sensor detector (PSD) in the 2D contouring accuracy compensating system was ±1.5 μm over a measurement range of ±3 mm, and the accuracy improvement exceeds 80% at high feed rates.
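
Contouring error is usually evaluated as the shortest distance from each measured point on the actual path to the commanded path. The sketch below computes that point-to-polyline distance for hypothetical 2D data; this is the quantity a compensating system would try to drive toward zero, not the authors' measurement or compensation procedure.

```python
import numpy as np

def contour_error(actual_pts, commanded_pts):
    """For each actual 2-D point, return the shortest distance to the commanded
    path, treated as a polyline through `commanded_pts`."""
    errors = []
    for p in actual_pts:
        d_min = np.inf
        for a, b in zip(commanded_pts[:-1], commanded_pts[1:]):
            ab, ap = b - a, p - a
            t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)  # projection onto segment
            d_min = min(d_min, np.linalg.norm(p - (a + t * ab)))
        errors.append(d_min)
    return np.array(errors)

# Hypothetical commanded circular arc and a slightly distorted actual path (mm).
theta = np.linspace(0, np.pi / 2, 100)
commanded = np.column_stack([3.0 * np.cos(theta), 3.0 * np.sin(theta)])
actual = commanded + np.column_stack([0.002 * np.sin(5 * theta), 0.001 * np.cos(5 * theta)])
err = contour_error(actual, commanded)
print(f"max contouring error ≈ {1000 * err.max():.1f} µm")
```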

Wave Vortex Parameters as an Indicator of Breaking Intensity

The study of the geometric shape of the vortices enclosed by plunging waves as a possible indicator of the breaking intensity of ocean waves has been ongoing for almost 50 years with limited success. This paper investigates the validity of using the vortex ratio and vortex angle as predictors of breaking intensity. Previously published works on vortex parameters, based on regular wave flume results or solitary wave theory, present contradictory results and conclusions. Through the first complete analysis of field-collected irregular-wave breaking vortex parameters, it is shown that the vortex ratio and vortex angle cannot be accurately predicted using standard breaking wave characteristics and hence are not suggested as indicators of breaking intensity.

A Study on the Differential Diagnostic Model for Newborn Hearing Loss Screening

According to statistics, the prevalence of congenital hearing loss in Taiwan is approximately six per thousand; furthermore, about one per thousand of infants have severe hearing impairment. Hearing ability during infancy has a significant impact on the development of children's oral expression, language maturity, cognitive performance, educational ability and social behaviors later in life. Although most children born with hearing impairment have sensorineural hearing loss, almost every such child retains at least some residual hearing. If provided with a hearing aid or cochlear implant (a bionic ear) in time, together with hearing and speech training, even severely hearing-impaired children can still learn to talk. On the other hand, those who are not diagnosed and thus cannot begin hearing and speech rehabilitation in time may lose an important opportunity to live a complete and healthy life. Eventually, the lack of hearing and speaking ability will affect the development of both mental and physical functions, intelligence and social adaptability. Not only does this problem cause irreparable regret for the hearing-impaired child throughout life, it also creates a heavy burden for the family and society. Therefore, it is necessary to establish a computer-assisted predictive model that can accurately detect and help diagnose newborn hearing loss, so that early interventions can be provided in time and waste of medical resources can be avoided. This study uses data from the neonatal database of the case hospital as its subjects and adopts two different analysis approaches: using a support vector machine (SVM) directly for model prediction, and using logistic regression for factor screening prior to SVM model prediction, in order to compare the results. The results indicate that the prediction accuracy reaches 96.43% when the factors are screened and selected through logistic regression. Hence, the model constructed in this study can provide real help to physicians in clinical diagnosis and genuinely benefit early intervention for newborn hearing impairment.
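
The specific perinatal risk factors used are not listed in the abstract. As a hedged sketch of the two-stage workflow described (logistic-regression factor screening followed by SVM prediction), the code below uses statsmodels and scikit-learn on a hypothetical feature matrix X and binary pass/refer labels y; the factor names, screening threshold, and synthetic data are illustrative only.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def screen_and_predict(X, y, feature_names, p_threshold=0.05):
    """Stage 1: keep factors whose logistic-regression p-value < p_threshold.
    Stage 2: train an SVM on the screened factors and report test accuracy."""
    logit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
    pvals = logit.pvalues[1:]                      # drop the intercept's p-value
    keep = np.where(pvals < p_threshold)[0]
    print("screened factors:", [feature_names[i] for i in keep])

    X_tr, X_te, y_tr, y_te = train_test_split(X[:, keep], y, test_size=0.3, random_state=0)
    svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    svm.fit(X_tr, y_tr)
    print("test accuracy:", svm.score(X_te, y_te))
    return svm

# Hypothetical data: 500 newborns, 6 candidate risk factors, binary screening outcome.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = (X[:, 0] + 0.8 * X[:, 2] + rng.normal(0, 1, 500) > 0).astype(int)
screen_and_predict(X, y, [f"factor_{i}" for i in range(6)])
```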