Development of a Simple Laser-Based 2D Compensating System for the Contouring Accuracy of Machine Tools

The dynamic contouring error is a critical element in the accuracy of machine tools. The contouring error is defined as the difference between the actual machined path and the commanded path, the latter being produced as the feed drive system of the machine tool follows the commanded curves. The contouring error results from various factors, such as external loads, friction, moment of inertia, feed rate, speed control, and servo control. This study therefore proposes a 2D compensating system for the contouring accuracy of machine tools. An optical method is adopted, using a frequency-stabilized laser diode and a high-precision position sensing detector (PSD) to perform non-contact measurement. Results show that the PSD of the 2D contouring-accuracy compensating system achieved an accuracy of ±1.5 μm over a range of ±3 mm, and that the accuracy improvement exceeds 80% at high feed rates.
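As a hedged illustration of the quantity being compensated, the following Python sketch computes a simple 2D contouring error as the minimum distance from each measured (actual) position to the commanded path, approximated as a dense polyline; the sample paths and values are hypothetical and not taken from the paper.

```python
import numpy as np

def contouring_error(actual_pts, commanded_pts):
    """Minimum distance from each actual point to the commanded path,
    approximated here by point-to-segment distances along a polyline."""
    errors = []
    segments = list(zip(commanded_pts[:-1], commanded_pts[1:]))
    for p in actual_pts:
        dists = []
        for a, b in segments:
            ab = b - a
            t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
            dists.append(np.linalg.norm(p - (a + t * ab)))
        errors.append(min(dists))
    return np.array(errors)

# Hypothetical example: a commanded circular arc and a slightly distorted actual path (mm).
theta = np.linspace(0, np.pi / 2, 200)
commanded = np.column_stack([10 * np.cos(theta), 10 * np.sin(theta)])
actual = commanded + 0.002 * np.column_stack([np.sin(5 * theta), np.cos(5 * theta)])
err = contouring_error(actual, commanded)
print(f"max contouring error: {err.max() * 1000:.1f} um")
```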

Experiments and Modeling of Ion Exchange Resins for Nuclear Power Plants

Ion exchange resins are used in nuclear power plants for water ultrapurification. Two approaches are considered in this work: column experiments and simulations. A software tool called OPTIPUR was developed, tested and used. It simulates one-dimensional reactive transport in a porous medium, with convective-dispersive transport between particles and diffusive transport within the boundary layer around the particles. The transfer limitation in the boundary layer is characterized by the mass transfer coefficient (MTC). The factors influencing the MTC were measured experimentally. Varying the inlet concentration does not affect the MTC, whereas varying the Darcy velocity does. This is consistent with results obtained using the correlation of Dwivedi and Upadhyay. Given the MTC, the number of exchange sites, and the relative affinity, OPTIPUR can simulate the column outlet concentration versus time. The service life of the resins can then be predicted under binary-exchange conditions.
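For readers unfamiliar with the correlation mentioned above, the sketch below evaluates a commonly quoted form of the Dwivedi and Upadhyay packed-bed mass transfer correlation to obtain an MTC from the Darcy velocity; all property values are hypothetical placeholders, and the correlation form should be checked against the original reference and OPTIPUR's documentation.

```python
import numpy as np

def mtc_dwivedi_upadhyay(u, d_p, eps, D_m, nu):
    """Mass transfer coefficient k (m/s) from a commonly quoted form of the
    Dwivedi & Upadhyay correlation: eps * j_D = 0.765/Re**0.82 + 0.365/Re**0.386,
    with j_D = (k/u) * Sc**(2/3). Verify against the original reference."""
    Re = u * d_p / nu            # particle Reynolds number based on Darcy velocity
    Sc = nu / D_m                # Schmidt number
    j_D = (0.765 / Re**0.82 + 0.365 / Re**0.386) / eps
    return j_D * u / Sc**(2.0 / 3.0)

# Hypothetical resin-bed values (not from the paper):
u = 5e-3        # Darcy velocity, m/s
d_p = 0.6e-3    # resin bead diameter, m
eps = 0.38      # bed porosity
D_m = 1.5e-9    # ionic diffusion coefficient, m^2/s
nu = 1e-6       # kinematic viscosity of water, m^2/s
print(f"MTC ~ {mtc_dwivedi_upadhyay(u, d_p, eps, D_m, nu):.2e} m/s")
```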

Multiple Moving Talker Tracking by Integration of Two Successive Algorithms

In this paper, the estimation accuracy of multiple moving talker tracking using a microphone array is improved. Tracking is achieved by an adaptive method that integrates two algorithms, namely the PAST (Projection Approximation Subspace Tracking) algorithm and the IPLS (Interior Point Least Square) algorithm. When a talker begins to speak again after a silent period, an appropriate feasible region for the evaluation function of the IPLS algorithm may not be set, and the tracking then fails because of the incorrect update. Therefore, whenever an increase in the number of active talkers is detected, the feasible region must be reset. For this resetting, a low-cost realization is required for high-speed tracking and a high-accuracy realization is desired for precise tracking. In this paper, directions roughly estimated with the delay-and-sum array method are used for the resetting. Results of experiments performed in an actual room environment show the effectiveness of the proposed method.
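As a hedged sketch of the rough direction estimates used for the resetting step, the code below scans a narrowband conventional (delay-and-sum) beamformer over candidate angles for a uniform linear array; the array geometry, frequency, and signals are hypothetical, and the paper's actual array and processing may differ.

```python
import numpy as np

def delay_and_sum_spectrum(X, mic_spacing, freq, angles_deg, c=343.0):
    """Conventional (delay-and-sum) beamformer power over candidate directions.
    X: (num_mics, num_snapshots) complex narrowband snapshots."""
    R = X @ X.conj().T / X.shape[1]          # spatial covariance estimate
    m = np.arange(X.shape[0])
    power = []
    for ang in np.deg2rad(angles_deg):
        # steering vector of a uniform linear array for direction ang
        a = np.exp(-2j * np.pi * freq * mic_spacing * m * np.sin(ang) / c)
        power.append(np.real(a.conj() @ R @ a))
    return np.array(power)

# Hypothetical scene: two talkers at -20 and +40 degrees, 8 mics spaced 5 cm, 1 kHz band.
rng = np.random.default_rng(0)
mics, snaps, f, d = 8, 400, 1000.0, 0.05
m = np.arange(mics)
a1 = np.exp(-2j * np.pi * f * d * m * np.sin(np.deg2rad(-20)) / 343.0)
a2 = np.exp(-2j * np.pi * f * d * m * np.sin(np.deg2rad(40)) / 343.0)
s = rng.standard_normal((2, snaps)) + 1j * rng.standard_normal((2, snaps))
noise = 0.1 * (rng.standard_normal((mics, snaps)) + 1j * rng.standard_normal((mics, snaps)))
X = np.outer(a1, s[0]) + np.outer(a2, s[1]) + noise
angles = np.arange(-90, 91)
p = delay_and_sum_spectrum(X, d, f, angles)
print("rough DOA of strongest talker:", angles[np.argmax(p)], "degrees")
```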

A Study of the Damages to Historical Monuments due to Climatic Factors and Air Pollution and Offering Solutions

Historical monuments, as architectural heritage, are considered economically and culturally to be one of the key assets of modern communities. Cultural heritage represents a country's national identity and pride and maintains and enriches that country's culture. Therefore, conservation of the monuments left to us by our ancestors requires everybody's serious and unremitting effort. Conservation, renewal, restoration, and technical study of cultural and historical objects hold a special status among the various forms of art and science in the present century, for two reasons: firstly, human progress in this century has created a factor called environmental pollution, which has not only caused new destructive processes in cultural and historical monuments but has also accelerated the pre-existing destructive processes severalfold; and secondly, the rapid advance of various sciences, especially chemistry, has led to the contribution of new methods and materials to this significant issue.

Modeling of Material Removal on Machining of Ti-6Al-4V through EDM using Copper Tungsten Electrode and Positive Polarity

This paper presents an optimized model to investigate the effects of peak current, pulse-on time and pulse-off time on the material removal rate (MRR) in EDM of a titanium alloy, using a copper tungsten electrode with positive polarity. The experiments are carried out on Ti-6Al-4V by varying the peak current, pulse-on time and pulse-off time. A mathematical model is developed to correlate the influence of these variables with the material removal rate of the workpiece. The design of experiments (DOE) method and response surface methodology (RSM) techniques are employed, and the fit and adequacy of the proposed models are tested through analysis of variance (ANOVA). The results show that the material removal rate increases as peak current and pulse-on time increase, while the effect of pulse-off time on MRR changes with the peak current. The optimum machining conditions for material removal rate are estimated, verified against the proposed optimized results, and compared. The developed model agrees with the experimental results within an acceptable error of about 4%. This supports achieving a desirable material removal rate and economical industrial machining by optimizing the input parameters.
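To illustrate the kind of RSM model referred to above, the following hedged sketch fits a generic second-order response surface for MRR as a function of peak current, pulse-on time, and pulse-off time by least squares; the factor settings and responses are made up for demonstration and are not the paper's experimental values.

```python
import numpy as np

def quadratic_design_matrix(X):
    """Second-order RSM terms: intercept, linear, squared, and two-way interactions."""
    x1, x2, x3 = X.T
    return np.column_stack([
        np.ones(len(X)), x1, x2, x3,
        x1**2, x2**2, x3**2,
        x1 * x2, x1 * x3, x2 * x3,
    ])

# Hypothetical factor settings (peak current A, pulse-on us, pulse-off us) and MRR responses.
rng = np.random.default_rng(1)
X = np.array([[i, on, off] for i in (10, 20, 30)
                            for on in (50, 100, 150)
                            for off in (20, 40, 60)], dtype=float)
mrr = (0.05 * X[:, 0] + 0.002 * X[:, 1] - 0.004 * X[:, 2]
       + 0.0005 * X[:, 0] * X[:, 1] + rng.normal(0, 0.05, len(X)))

A = quadratic_design_matrix(X)
coef, *_ = np.linalg.lstsq(A, mrr, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((mrr - pred) ** 2) / np.sum((mrr - mrr.mean()) ** 2)
print("fitted coefficients:", np.round(coef, 4))
print(f"R^2 = {r2:.3f}")
```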

Edit Distance Algorithm to Increase Storage Efficiency of Javanese Corpora

Since a one-to-one word translator cannot translate the pragmatic aspects of Javanese, the parallel text alignment model described here uses phrase pair combination. The algorithm aligns the parallel text automatically from the beginning to the end of each sentence. Even though the results of the phrase pair combination outperform the previous algorithm, it is still inefficient: recording all possible combinations consumes more space in the database and is time-consuming. The original algorithm is therefore modified by applying an edit distance coefficient to improve the data-storage efficiency. As a result, the data-storage consumption is reduced by 90%, as is the learning period (42 s).
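For reference, the sketch below shows a standard Levenshtein edit distance and a normalized coefficient of the kind that could be used to decide whether a phrase-pair combination is similar enough to an existing entry to skip storing it; the threshold and storage policy are illustrative assumptions, not the paper's exact algorithm.

```python
def edit_distance(a, b):
    """Standard Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def edit_distance_coefficient(a, b):
    """Distance normalized to [0, 1]; 0 means identical strings."""
    longest = max(len(a), len(b)) or 1
    return edit_distance(a, b) / longest

# Illustrative use: store a new phrase pair only if no stored pair is nearly identical.
stored = ["kula badhe tindak", "sampun dhahar"]
candidate = "kula badhe tindhak"
if all(edit_distance_coefficient(candidate, s) > 0.1 for s in stored):
    stored.append(candidate)
print(stored)
```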

Wave Vortex Parameters as an Indicator of Breaking Intensity

The study of the geometric shape of the vortex enclosed by a plunging wave as a possible indicator of the breaking intensity of ocean waves has been ongoing for almost 50 years with limited success. This paper investigates the validity of using the vortex ratio and vortex angle as predictors of breaking intensity. Previously published work on vortex parameters, based on regular wave flume results or solitary wave theory, presents contradictory results and conclusions. Through the first complete analysis of field-collected irregular-wave breaking vortex parameters, it is shown that the vortex ratio and vortex angle cannot be accurately predicted using standard breaking wave characteristics and hence are not recommended as indicators of breaking intensity.

Organizational De-Evolution: The Small Group or Single Actor Terrorist

Traditionally, terror groups have been formed by ideologically aligned actors who perceive a lack of options for achieving political or social change. However, terrorist attacks have been increasingly carried out by small groups of actors or lone individuals who may be only ideologically affiliated with larger, formal terrorist organizations. The formation of these groups represents the inverse of traditional organizational growth, whereby structural de-evolution within issue-based organizations leads to the formation of small, independent terror cells. Ideological franchising – the bypassing of formal affiliation to the "parent" organization – represents the de-evolution of traditional concepts of organizational structure in favor of an organic, independent, and focused unit. Traditional definitions of issue-based dark networks include focus on an identified goal, commitment to achieving this goal through unrestrained actions, and selection of symbolic targets. The next step in the de-evolution of small dark networks is the mini-organization, consisting of only a handful of actors working toward a common, violent goal. Information-sharing through social media platforms, coupled with the civil liberties of democratic nations, provides the communication systems, access to information, and freedom of movement necessary for small dark networks to flourish without the aid of a parent organization. As attacks such as the 7/7 bombings demonstrate the effectiveness of small dark networks, terrorist actors will feel increasingly comfortable aligning with an ideology only, without formally organizing. The natural result of this de-evolving organization is the single actor event, where an individual appears to subscribe to a larger organization's violent ideology with little or no formal ties.

Software Reliability Prediction Model Analysis

Software reliability prediction gives a great opportunity to measure the software failure rate at any point during system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software, and various approaches can be used to improve its reliability. In this article we focus on a software reliability model, assuming that there is time redundancy whose value (the number of repeated transmissions of basic blocks) can serve as an optimization parameter. We consider the mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be treated as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence consisting of a random number of basic blocks. We consider the system software unreliable, with the time between adjacent failures exponentially distributed.
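Purely as a hedged numerical illustration (the model specifics here are assumptions, not the paper's derivation), the sketch below estimates by Monte Carlo the distribution function of the transmission time of an instruction sequence made of a random number of basic blocks, with exponentially distributed times between failures and a fixed number of allowed retransmissions per block as the time redundancy parameter.

```python
import numpy as np

rng = np.random.default_rng(42)

def sequence_time(block_time, failure_rate, max_retries, mean_blocks):
    """Total transmission time of one instruction sequence.
    A block transmission succeeds if no failure occurs during its duration;
    otherwise it is retransmitted, up to max_retries extra attempts."""
    n_blocks = rng.geometric(1.0 / mean_blocks)      # random number of basic blocks
    total = 0.0
    for _ in range(n_blocks):
        for _attempt in range(max_retries + 1):
            total += block_time
            if rng.exponential(1.0 / failure_rate) > block_time:
                break                                # no failure during this attempt
    return total

# Hypothetical parameters: block time 1 ms, failure rate 50 1/s, 2 retries, ~20 blocks.
samples = np.array([sequence_time(1e-3, 50.0, 2, 20) for _ in range(20000)])
t = np.quantile(samples, [0.5, 0.9, 0.99])
print("median / 90% / 99% transmission time (ms):", np.round(t * 1e3, 2))
```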

Reconstituting Information about Discontinued Water Quality Variables in the Nile Delta Monitoring Network Using Two Record Extension Techniques

The world economic crises and budget constraints have caused authorities, especially those in developing countries, to rationalize water quality monitoring activities. Rationalization consists of reducing the number of monitoring sites, the number of samples, and/or the number of water quality variables measured. The reduction in water quality variables is usually based on correlation. If two variables exhibit high correlation, it is an indication that some of the information produced may be redundant. Consequently, one variable can be discontinued while the other continues to be measured. Later, the ordinary least squares (OLS) regression technique is employed to reconstitute information about the discontinued variable, using the continuously measured one as an explanatory variable. In this paper, two record extension techniques are employed to reconstitute information about discontinued water quality variables: OLS and the line of organic correlation (LOC). An empirical experiment is conducted using water quality records from the Nile Delta water quality monitoring network in Egypt. The record extension techniques are compared for their ability to predict different statistical parameters of the discontinued variables. Results show that OLS is better at estimating individual water quality records; however, the results indicate an underestimation of the variance in the extended records. The LOC technique is superior in preserving the characteristics of the entire distribution and avoids underestimating the variance. It is concluded from this study that OLS can be used for the substitution of missing values, while LOC is preferable for inferring statements about the probability distribution.
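As a hedged sketch of the two record extension techniques compared above, the code below fits OLS and LOC (line of organic correlation, i.e. a line through the means with slope sign(r)·s_y/s_x) to a surrogate joint period and extends the discontinued variable; the synthetic data are only for illustration and do not come from the Nile Delta network.

```python
import numpy as np

def fit_ols(x, y):
    """OLS: slope r * s_y / s_x through the means."""
    r = np.corrcoef(x, y)[0, 1]
    slope = r * y.std(ddof=1) / x.std(ddof=1)
    return slope, y.mean() - slope * x.mean()

def fit_loc(x, y):
    """Line of organic correlation: slope magnitude s_y / s_x, sign taken from r."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
    return slope, y.mean() - slope * x.mean()

# Synthetic joint record (x kept, y discontinued after the joint period).
rng = np.random.default_rng(7)
x_joint = rng.normal(50, 10, 60)
y_joint = 0.8 * x_joint + rng.normal(0, 6, 60)
x_later = rng.normal(50, 10, 40)          # continued variable after discontinuation

for name, fit in (("OLS", fit_ols), ("LOC", fit_loc)):
    slope, intercept = fit(x_joint, y_joint)
    y_ext = intercept + slope * x_later
    print(f"{name}: extended-record std = {y_ext.std(ddof=1):.2f} "
          f"(joint-record std of y = {y_joint.std(ddof=1):.2f})")
```

Running this kind of comparison typically shows the OLS-extended record with a smaller standard deviation than the observed record, which is the variance underestimation the abstract refers to.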

Applying Complex Network Theory to Software Structure Analysis

Complex networks have been studied intensively across many fields, especially Internet technology, biological engineering, and nonlinear science. Software is built from many interacting components at various levels of granularity, such as functions, classes, and packages, and thus represents another important class of complex networks that can be studied using complex network theory. Over the last decade, many papers on interdisciplinary research between software engineering and complex networks have been published. This research provides a different dimension to our understanding of software and is also very useful for the design and development of software systems. This paper explores how complex network theory can be used to analyze software structure and briefly reviews the main advances in this area.
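As a hedged sketch of the kind of analysis discussed, the code below builds a small software dependency graph (modules as nodes, "imports/calls" as directed edges) with NetworkX and reports degree statistics commonly examined in this literature; the example graph is invented for illustration.

```python
import networkx as nx

# Hypothetical module-level dependency edges: (dependent, dependency).
edges = [
    ("ui", "core"), ("ui", "utils"), ("core", "utils"),
    ("core", "io"), ("io", "utils"), ("plugins", "core"),
    ("plugins", "ui"), ("tests", "core"), ("tests", "io"),
]

G = nx.DiGraph(edges)

# In-degree (how many modules depend on this one) is a common proxy for reuse;
# out-degree is a proxy for coupling to other modules.
for node in G.nodes:
    print(f"{node:8s} in={G.in_degree(node)}  out={G.out_degree(node)}")

print("most depended-upon module:", max(G.nodes, key=G.in_degree))
print("average clustering (undirected view):",
      round(nx.average_clustering(G.to_undirected()), 3))
```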

Synergy in Vertical Transformations of Expert Designers

Existing literature on design reasoning seems to give one-sided accounts of expert design behaviour based on internal processing. In the same way, ecological theories seem to focus one-sidedly on external elements, resulting in a lack of a unifying theory of design cognition. Although current extended design cognition studies acknowledge the intellectual interaction between internal and external resources, there still seems to be insufficient understanding of the complexities involved in such interactive processes. As such, this paper proposes a novel multi-directional model for design researchers to map the complex and dynamic conduct-controlling behaviour in which both the computational and ecological perspectives are integrated in a vertical manner. A clear distinction between identified intentional and emerging physical drivers, and the relationships between them during the early phases of experts' design processes, is demonstrated by presenting a case study in which the model was employed.

Treatment of Recycled Concrete Aggregates by Si-Based Polymers

The recycling of concrete, bricks and masonry rubble as concrete aggregates is an important way to contribute to a sustainable material flow. However, various uncertainties still limit the widespread use of Recycled Concrete Aggregates (RCA). The fluctuations in the composition of graded recycled aggregates and their influence on the properties of fresh and hardened concrete are of particular concern regarding the use of RCA. Most of the problems occurring when using RCA as aggregates are due to their higher porosity, and hence higher water absorption, lower mechanical strength, and residual impurities on the surface of the RCA, which form a weaker bond between the cement paste and the aggregate. Consequently, the reuse of RCA is still limited. An efficient polymer-based treatment is proposed to make the reuse of RCA easier. Silicon-based polymer treatments of RCA were carried out and compared. This kind of treatment can improve the properties of RCA; for example, the rate of water absorption of treated RCA is significantly reduced.

Device for 3D Analysis of Basic Movements of the Lower Extremity

This document details the process of developing a wireless device that captures the basic movements of the foot (plantar flexion, dorsal flexion, abduction, adduction) and of the knee (flexion). It implements a motion capture system using hardware based on optical fiber sensors, chosen for their advantages in terms of range, noise immunity, and speed of data transmission and reception. The operating principle of the system is the detection and transmission of joint movement by mechanical elements and its measurement by optical (in this case infrared) elements. Visual Basic software is used for the reception, analysis, and signal processing of the data acquired by the device, generating a real-time 3D graphical representation of each movement. The result is a boot that captures the movement, a transmission module (implementing XBee technology), and a receiver module that receives the information and sends it to the PC for processing. The main aim of this device is to support fields such as bioengineering and medicine by improving quality of life and movement analysis.
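As a hedged illustration of the reception side only (the paper uses Visual Basic; this sketch uses Python with pySerial, and the packet format, port name, and calibration constants are hypothetical), the code below reads lines of sensor counts from the XBee receiver over a serial port and converts them to joint angles.

```python
import serial  # pySerial

PORT, BAUD = "/dev/ttyUSB0", 9600           # hypothetical XBee receiver port
COUNTS_PER_DEGREE = 11.4                    # hypothetical calibration constant

def counts_to_angle(counts, zero_offset=512):
    """Convert a raw optical-sensor reading to a joint angle in degrees."""
    return (counts - zero_offset) / COUNTS_PER_DEGREE

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    for _ in range(100):                    # read a short burst of samples
        line = ser.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        # Hypothetical packet format: "ankle_counts,knee_counts"
        try:
            ankle_raw, knee_raw = (int(v) for v in line.split(","))
        except ValueError:
            continue                        # skip malformed packets
        print(f"ankle: {counts_to_angle(ankle_raw):6.1f} deg  "
              f"knee: {counts_to_angle(knee_raw):6.1f} deg")
```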

Evaluating Service Quality of Online Auction by Fuzzy MCDM

This paper applies fuzzy set theory to evaluate the service quality of online auctions. Service quality is a composite of various criteria, many of which are intangible attributes that are difficult to measure. This characteristic creates obstacles for respondents replying to the survey. To overcome this problem, we introduce fuzzy set theory into the measurement of performance. Using AHP to obtain the criteria and TOPSIS for ranking, we found that the service quality dimension of greatest concern is Transaction Safety Mechanism and the least is Charge Item. The attributes of greatest concern are information security, accuracy, and information.
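For concreteness, the sketch below implements a plain (crisp) TOPSIS ranking of the kind used after the AHP weighting step; the alternatives, criteria, weights, and scores are hypothetical, and the paper's fuzzy variant would replace the crisp scores with fuzzy numbers.

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Rank alternatives with TOPSIS.
    scores: (alternatives x criteria); weights sum to 1; benefit[i] True if larger is better."""
    norm = scores / np.linalg.norm(scores, axis=0)           # vector normalization
    v = norm * weights                                        # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)                       # closeness coefficient

# Hypothetical online-auction sites scored on safety, accuracy, and charge (cost).
scores = np.array([
    [9.0, 8.0, 3.0],
    [7.0, 9.0, 2.0],
    [8.0, 7.0, 4.0],
])
weights = np.array([0.5, 0.3, 0.2])            # e.g. derived from an AHP comparison
benefit = np.array([True, True, False])        # charge is a cost criterion
cc = topsis(scores, weights, benefit)
print("closeness coefficients:", np.round(cc, 3), " best alternative:", int(np.argmax(cc)))
```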

Evaluation of the Immunoregulatory Activity of rFip-gts Purified from Baculovirus-infected Insect Cells

Fip-gts, an immunomodulatory protein purified from Ganoderma tsugae, has been reported to possess therapeutic effects in the treatment of cancer and autoimmune disease. For medicinal application, recombinant Fip-gts (rFip-gts) was successfully expressed and purified in Sf21 insect cells in our previous work, and it is important to evaluate its immunomodulatory activity. To assess the immunomodulatory potential of rFip-gts, T lymphocytes from murine splenocytes were used in the present study. Results revealed that rFip-gts induced cellular aggregation. Additionally, the expression of IL-2 and IFN-γ was up-regulated after treatment with rFip-gts, with a corresponding increase in IL-2 and IFN-γ production in a dose-dependent manner. These results show that rFip-gts has immunomodulatory activity, inducing Th1 lymphocytes from murine splenocytes to release IL-2 and IFN-γ, and thus suggest that rFip-gts may have therapeutic potential in vivo as an immune modulator.

The Correlation between Peer Aggression and Peer Victimization: Are Aggressors Victims Too?

To investigate the possible correlation between peer aggression and peer victimization, 148 sixth-graders were asked to respond to the Reduced Aggression and Victimization Scales (RAVS). The RAVS measures the frequency of reported aggressive behaviors or of being victimized during the week prior to the survey. The scales are composed of six items each, and each point represents one instance of aggression or victimization. The Pearson product-moment correlation coefficient (PMCC) was used to determine the correlations between the sixth-graders' scores on the two scales, both for individual items and for total scores. Positive correlations were established and were significant at the 0.01 level.
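As a hedged sketch of the statistic used, the snippet below computes the Pearson product-moment correlation and its two-tailed p-value with SciPy on made-up aggression and victimization totals (not the study's data).

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)

# Hypothetical weekly totals of aggression and victimization instances for 148 students.
aggression = rng.poisson(2.0, 148)
victimization = aggression + rng.poisson(1.0, 148)   # constructed to correlate positively

r, p = pearsonr(aggression, victimization)
print(f"r = {r:.2f}, p = {p:.4f}, significant at 0.01: {p < 0.01}")
```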

Weka Based Desktop Data Mining as Web Service

Data mining is the process of sifting through large volumes of data, analyzing it from different perspectives, and summarizing it into useful information. One of the widely used desktop applications for data mining is the Weka tool, a collection of machine learning algorithms implemented in Java and released as open source under the General Public License (GPL). A web service is a software system designed to support interoperable machine-to-machine interaction over a network using SOAP messages. Unlike a desktop application, a web service is easy to upgrade, deliver, and access, and does not occupy memory on the client system. Keeping in mind the advantages of a web service over a desktop application, this paper demonstrates how this Java-based desktop data mining application can be implemented as a web service to support data mining across the Internet.
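As a hedged illustration of the wrapping idea only (the paper describes a SOAP service; this sketch exposes a REST-style endpoint instead, and the jar path, dataset handling, and default classifier are assumptions), the code below accepts an uploaded ARFF file and runs a Weka classifier from the command line, returning its text output.

```python
import subprocess
import tempfile
from flask import Flask, request

app = Flask(__name__)
WEKA_JAR = "/opt/weka/weka.jar"   # hypothetical install location

@app.route("/classify", methods=["POST"])
def classify():
    """Train and cross-validate a Weka classifier on an uploaded ARFF dataset."""
    with tempfile.NamedTemporaryFile(suffix=".arff", delete=False) as f:
        arff_path = f.name
    request.files["dataset"].save(arff_path)
    classifier = request.form.get("classifier", "weka.classifiers.trees.J48")
    # Invoke Weka's command-line interface; -t names the training file.
    result = subprocess.run(
        ["java", "-cp", WEKA_JAR, classifier, "-t", arff_path],
        capture_output=True, text=True,
    )
    return {"stdout": result.stdout, "stderr": result.stderr}

if __name__ == "__main__":
    app.run(port=8080)
```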

The Effects of Some Quality-Increasing Applications on the Yield and Yield Components of the Ismailoglu Grape Type in Turkey

This study was conducted on the Ismailoglu grape type (Vitis vinifera L.), whose 15-year-old vines were grown on their own roots, during the 2013 growing season in Nevşehir province, Turkey. The research investigated the effects of the following applications on the yield and yield components of the Ismailoglu grape type: Control (C), 1/3 cluster tip reduction (1/3 CTR), shoot tip reduction (STR), 1/3 CTR + STR, TKI-HUMAS (TKI-HM) soil (S), TKI-HM foliar (F), TKI-HM (S + F), 1/3 CTR + TKI-HM (S), 1/3 CTR + TKI-HM (F), 1/3 CTR + TKI-HM (S + F), STR + TKI-HM (S), STR + TKI-HM (F), STR + TKI-HM (S + F), 1/3 CTR + STR + TKI-HM (S), 1/3 CTR + STR + TKI-HM (F), and 1/3 CTR + STR + TKI-HM (S + F). The highest fresh grape yield (16.15 kg/vine) was obtained with TKI-HM (S), the highest cluster weight (652.39 g) with 1/3 CTR + STR, the highest 100-berry weight (419.07 g) with 1/3 CTR + STR + TKI-HM (F), the highest maturity index (44.06) with 1/3 CTR, the highest must yield (810.00 ml) with STR + TKI-HM (F), the highest L* color intensity (42.04) with TKI-HM (S + F), the highest a* color intensity (2.60) with 1/3 CTR + TKI-HM (S), and the highest b* color intensity (7.16) with 1/3 CTR + TKI-HM (S). To increase the fresh grape yield of the Ismailoglu grape type, the TKI-HM (S) application can be recommended.

Vehicle Gearbox Fault Diagnosis Based On Cepstrum Analysis

Research on damage of gears and gear pairs using vibration signals remains very attractive, because vibration signals from a gear pair are complex in nature and not easy to interpret. Predicting gear pair defects by analyzing changes in the vibration signals of gear pairs in operation is a very reliable method. Therefore, a suitable vibration signal processing technique is necessary to extract defect information generally obscured by the noise from dynamic factors of other gear pairs. This article presents the value of cepstrum analysis in vehicle gearbox fault diagnosis. The cepstrum represents the overall power content of a whole family of harmonics and sidebands when more than one family of sidebands is present at the same time. The concepts for the measurement and analysis involved in using the technique are briefly outlined. Cepstrum analysis is used for the detection of an artificial pitting defect in a vehicle gearbox loaded at different speeds and torques. The test stand is equipped with three dynamometers; the input dynamometer serves as the internal combustion engine, and the output dynamometers introduce the load on the flanges of the output joint shafts. The pitting defect is manufactured on the tooth side of a gear of the fifth speed on the secondary shaft. A method for the diagnosis of gear faults based on the order cepstrum is also presented, and the procedure is illustrated with experimental vibration data from the vehicle gearbox. The results show the effectiveness of cepstrum analysis in the detection and diagnosis of the gear condition.
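As a hedged sketch of the signal processing step described, the code below computes the real cepstrum of a synthetic vibration signal and looks for a peak at the quefrency of a hypothetical fault-related modulation period; the sampling rate, gear-mesh frequency, and modulation used here are invented, not the test-stand values.

```python
import numpy as np

def real_cepstrum(x):
    """Real cepstrum: inverse FFT of the log magnitude spectrum."""
    spectrum = np.fft.fft(x)
    return np.real(np.fft.ifft(np.log(np.abs(spectrum) + 1e-12)))

# Synthetic signal: a gear-mesh tone at 600 Hz, amplitude-modulated at a hypothetical
# faulty-shaft rotation frequency of 25 Hz, plus broadband noise.
fs = 10_000
t = np.arange(0, 2.0, 1 / fs)
mesh = np.sin(2 * np.pi * 600 * t)
modulation = 1 + 0.5 * np.sin(2 * np.pi * 25 * t)
x = modulation * mesh + 0.2 * np.random.default_rng(0).standard_normal(len(t))

ceps = real_cepstrum(x)
quefrency = np.arange(len(x)) / fs
# Search for a peak near the expected fault period (1/25 s = 40 ms).
band = (quefrency > 0.01) & (quefrency < 0.1)
peak_q = quefrency[band][np.argmax(ceps[band])]
print(f"dominant quefrency: {peak_q * 1000:.1f} ms  (expected ~40 ms for a 25 Hz modulation)")
```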