Performance of Membrane Bioreactor (MBR) in High Phosphate Wastewater

This study presents the performance of a membrane bioreactor (MBR) in treating high-phosphate wastewater. The laboratory-scale MBR was operated at a permeate flux of 25 L/m²·h with a hollow-fiber membrane (polypropylene, approximate pore size 0.01-0.2 μm) at a hydraulic retention time (HRT) of 12 h. Scanning electron microscopy (SEM) and energy dispersive X-ray (EDX) analysis were used to characterize the membrane foulants. Results showed that the removal efficiencies of COD, TSS, NH3-N and PO₄³⁻ were 93, 98, 80 and 30%, respectively. On average, 91% of influent soluble microbial products (SMP) were eliminated, with the elimination of polysaccharides mostly above 80%. The main fouling resistance was cake resistance. Notably, SMP made up a major portion of the mixed liquor and played a significant role in membrane fouling. SEM and EDX analyses indicated that the foulants covering the membrane surfaces comprised not only organic substances but also inorganic elements including Mg, Ca, Al, K and P.
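As a rough illustration of how fouling behaviour of this kind is typically quantified, the sketch below applies the standard resistance-in-series model (Darcy's law) to flux readings; the viscosity, TMP and flux values are placeholder assumptions, not data from this study.

```python
# Resistance-in-series sketch for MBR fouling analysis (illustrative numbers only,
# not data from this study). Darcy's law: J = TMP / (mu * R), so each resistance
# is back-calculated from a flux measured under a known transmembrane pressure.

MU = 1.0e-3      # permeate dynamic viscosity, Pa*s (assumed, ~20 degC water)
TMP = 30e3       # transmembrane pressure, Pa (assumed)

def resistance(flux_lmh):
    """Filtration resistance (1/m) from a flux given in L/(m^2*h)."""
    j = flux_lmh / 1000.0 / 3600.0   # L/(m^2*h) -> m^3/(m^2*s)
    return TMP / (MU * j)

r_total = resistance(25.0)         # fouled membrane at the operating flux
r_membrane = resistance(120.0)     # clean-water flux of the new membrane (assumed)
r_after_rinse = resistance(90.0)   # flux after rinsing off the cake layer (assumed)

r_cake = r_total - r_after_rinse       # cake layer resistance
r_pore = r_after_rinse - r_membrane    # pore blocking / irreversible fouling

for name, r in [("membrane", r_membrane), ("cake", r_cake), ("pore", r_pore)]:
    print(f"R_{name}: {r:.2e} 1/m ({100 * r / r_total:.1f}% of total)")
```

With numbers like these, the cake term dominates the total resistance, which is the kind of breakdown the abstract refers to.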

An Expert System for Car Failure Diagnosis

Car failure detection is a complicated process that requires a high level of expertise. Any attempt at developing an expert system dealing with car failure detection has to overcome various difficulties. This paper describes a proposed knowledge-based system for car failure detection. The paper explains the need for an expert system, some of the issues involved in developing knowledge-based systems, the car failure detection process, and the difficulties involved in developing the system. The system structure, its components, and their functions are described. The system has about 150 rules for different types of failures and causes, and it can detect over 100 types of failures. The system has been tested and gave promising results.
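A rule base of the kind described can be sketched as simple condition-conclusion pairs evaluated by forward chaining; the minimal Python fragment below only illustrates that idea, and the symptom names and rules in it are invented, not taken from the authors' knowledge base.

```python
# Minimal forward-chaining sketch of a car-failure rule base. The rules and symptom
# names are invented for illustration; the system described in the paper has about
# 150 rules covering more than 100 failure types.

RULES = [
    ({"engine_cranks": False, "lights_dim": True}, "Battery discharged or terminals corroded"),
    ({"engine_cranks": True, "engine_starts": False, "fuel_gauge_empty": True}, "Out of fuel"),
    ({"engine_overheats": True, "coolant_low": True}, "Coolant leak or low coolant level"),
    ({"brakes_squeal": True}, "Worn brake pads"),
]

def diagnose(observations):
    """Return every failure whose conditions are all satisfied by the observations."""
    findings = []
    for conditions, failure in RULES:
        if all(observations.get(symptom) == value for symptom, value in conditions.items()):
            findings.append(failure)
    return findings or ["No matching rule; refer to a mechanic"]

print(diagnose({"engine_cranks": False, "lights_dim": True}))
```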

A New Method in Short-Term Heart Rate Variability — Five-Class Density Histogram

A five-class density histogram with an index named cumulative density was proposed to analyze short-term heart rate variability (HRV). 150 subjects participated in the test, falling into three groups of equal size: the healthy young group (Young), the healthy old group (Old), and the group of patients with congestive heart failure (CHF). Results of multiple comparisons showed significant differences in cumulative density among the three groups, with values of 0.0238 for Young, 0.0406 for Old and 0.0732 for CHF (p
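The abstract does not reproduce the exact definition of the cumulative density index, so the sketch below only illustrates how a five-class density histogram of RR intervals might be built; the synthetic RR series and the equal-width binning are assumptions, and the index itself would be computed from such a histogram according to the authors' definition.

```python
import numpy as np

# Illustrative five-class density histogram of RR intervals (in seconds). The RR
# series is synthetic and the equal-width binning is an assumption; the paper's
# "cumulative density" index would be derived from a histogram like this one.

rr = np.random.normal(loc=0.85, scale=0.05, size=300)   # synthetic short-term RR series

counts, edges = np.histogram(rr, bins=5, density=True)  # five density classes
widths = np.diff(edges)
class_probability = counts * widths                     # probability mass per class

for lo, hi, p in zip(edges[:-1], edges[1:], class_probability):
    print(f"{lo:.3f}-{hi:.3f} s: {p:.3f}")
```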

A Study of Thai Muslims’ Way of Life through Their Clothes

The purpose of this research was to investigate Thai Muslims' way of life through their clothes. The data of this qualitative research were collected from related documents and research reports, ancient cloths and clothing, and in-depth interviews with clothes owners and weavers. The research found that in the 18th century Thai Muslims in the three southern border provinces used many types of clothing in their lives. At home, women wore plain clothes and used checked cloths to cover the upper part of the body from the breasts down to the waist. When going out, they used Lima cloth and So Kae with a piece of Pla-nging cloth as a head scarf. Men wore a checked sarong as a lower garment and no upper garment; when going out, however, they wore Puyo Potong. In addition, Thai Muslims used cloths in various religious rites, namely the rite of placing a baby in a cradle, the Masoyawi rite, the Nikah rite, and the burial rite. These types of cloths were related to the way of life of Thai Muslims from birth to death. They reflected race, gender, age, social status, values, and inherited beliefs and traditions. Practical implication: woven into these cloths is local wisdom that is being lost, and the aesthetics of the cloths are like mirrors reflecting the fading background of the people of this region. These cloths are pages of a local history book that is important and valuable, worth preserving and publicizing so that they are treasured. Government organizations can expand and materialize the knowledge gained from this study in accordance with government policy supporting the One Tambon, One Product project.

Lean Changeability – Evaluation and Design of Lean and Transformable Factories

In today's turbulent environment, companies are faced with two principal challenges. On the one hand, it is necessary to produce ever more cost-effectively to remain competitive. On the other hand, factories need to be transformable in order to manage unpredictable changes in the corporate environment. To deal with these different challenges, companies use the philosophy of lean production for the first and the philosophy of transformability for the second. To a certain extent, these two approaches pull in different directions, which can cause conflicts when designing factories. Therefore, the Institute of Production Systems and Logistics (IFA) of the Leibniz University of Hanover has developed a procedure that allows companies to evaluate and design their factories with respect to the requirements of both philosophies.

Some (v + 1, b + r + λ + 1, r + λ + 1, k, λ + 1) Balanced Incomplete Block Designs (BIBDs) from Lotto Designs (LDs)

The paper considered the construction of BIBDs using potential Lotto Designs (LDs) earlier derived from qualifying parent BIBDs. The study utilized Li's condition, ⌊pr/(t−1)⌋·C(t−1, 2) + C(pr − ⌊pr/(t−1)⌋·(t−1), 2) < C(p, 2)·λ, to determine whether a parent BIBD (v, b, r, k, λ) qualifies as an LD (n, k, p, t), subject to the constraints v ≥ k, v ≥ p and t ≤ min{k, p}, and then considered the case k = t, since t is the smallest number of tickets that can guarantee a win in a lottery. The (15, 140, 28, 3, 4) and (7, 7, 3, 3, 1) BIBDs were selected as parent BIBDs to illustrate the procedure. These BIBDs yielded three potential LDs each. Each of the LDs was completely generated and its properties studied. The three LDs from the (15, 140, 28, 3, 4) produced the (9, 84, 28, 3, 7), (10, 120, 36, 3, 8) and (11, 165, 45, 3, 9) BIBDs, while those from the (7, 7, 3, 3, 1) produced the (5, 10, 6, 3, 3), (6, 20, 10, 3, 4) and (7, 35, 15, 3, 5) BIBDs. The produced BIBDs follow the generalization (v + 1, b + r + λ + 1, r + λ + 1, k, λ + 1), where (v, b, r, k, λ) are the parameters of the (9, 84, 28, 3, 7) and (5, 10, 6, 3, 3) BIBDs. All the BIBDs produced are unreduced designs.
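The parameter chains reported above can be checked against the standard BIBD identities bk = vr and λ(v − 1) = r(k − 1); the short sketch below does this for the produced designs and applies one step of the stated (v + 1, b + r + λ + 1, r + λ + 1, k, λ + 1) generalization.

```python
# Check the standard BIBD identities b*k = v*r and lambda*(v-1) = r*(k-1) for the
# designs reported above, and verify one step of the stated generalization.

def is_consistent(v, b, r, k, lam):
    return b * k == v * r and lam * (v - 1) == r * (k - 1)

designs = [(9, 84, 28, 3, 7), (10, 120, 36, 3, 8), (11, 165, 45, 3, 9),
           (5, 10, 6, 3, 3), (6, 20, 10, 3, 4), (7, 35, 15, 3, 5)]
print(all(is_consistent(*d) for d in designs))   # True

def next_design(v, b, r, k, lam):
    """(v, b, r, k, lam) -> (v+1, b+r+lam+1, r+lam+1, k, lam+1)."""
    return (v + 1, b + r + lam + 1, r + lam + 1, k, lam + 1)

print(next_design(9, 84, 28, 3, 7))   # (10, 120, 36, 3, 8)
print(next_design(5, 10, 6, 3, 3))    # (6, 20, 10, 3, 4)
```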

Using Case-Based Reasoning for New Service Development from a User Innovation Community in Mobile Application Services

The emergence of mobile application services and the App Store has led to explosive growth in user innovation, to which users voluntarily contribute. User innovation communities, where end users freely reveal innovative ideas and needs to other community members, are becoming increasingly influential in this area. However, users' ideas in a user innovation community are not in themselves sufficient to constitute new service opportunities, because some of them may already have been developed as existing services in the App Store. Moreover, existing services similar to a new service opportunity can be significant references for applying analogy to develop a service concept. In response, this research proposes a Case-Based Reasoning approach to matching user needs with existing services, identifying unmet opportunistic user needs, and retrieving services similar to the opportunity. Owing to its intuitive and transparent algorithm, users involved in App Store innovation communities can easily apply the Case-Based Reasoning approach to their innovation.
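The retrieval step of such a Case-Based Reasoning cycle can be sketched as ranking existing service cases by similarity to a stated user need; the service catalogue, the plain string-similarity measure, and the example queries below are illustrative assumptions, not the feature representation used in the study.

```python
from difflib import SequenceMatcher

# Toy retrieval step of a Case-Based Reasoning cycle: rank existing App Store services
# by similarity to a need stated in an innovation community. The catalogue and the
# similarity measure (plain string similarity) are illustrative assumptions only.

services = {
    "bus timetable viewer": "shows local bus departure times",
    "expense tracker": "records daily spending and monthly budgets",
    "sleep logger": "tracks sleep hours with reminders",
}

def retrieve(user_need):
    """Rank existing services by similarity to the stated need (most similar first)."""
    scored = [(name, SequenceMatcher(None, user_need, desc).ratio())
              for name, desc in services.items()]
    return sorted(scored, key=lambda s: -s[1])

print(retrieve("an app that records what I spend every day"))
# A low best score would flag the need as unmet, i.e. a potential new service opportunity.
```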

A Follow-up Study on Elderly Survivors' Mental Health Two Years after the Wenchuan Earthquake

Background: This study investigated the mental health of elderly survivors six months, ten months and two years after the "5.12 Wenchuan" earthquake. Methods: Two hundred and thirty-two physically healthy older survivors from earthquake-affected Mianyang County were interviewed. The measures included the Revised Impact of Event Scale (IES-R, Chinese version, for PTSD) and a Chinese Mental Health Inventory for the Elderly (MHIE). A repeated-measures ANOVA was used for statistical analysis. Results: Ten months after the earthquake, the follow-up group had a statistically significantly lower IES-R score and lower MHIE score than the initial group. Two years later, the IES-R scores of the follow-up group were still lower than those of the non-follow-up group, but the difference in MHIE scores between groups was not significant. Furthermore, a negative relationship was found between IES-R and MHIE scores. Conclusion: The earthquake had a persistent negative impact on older survivors' mental health within the two-year period; although the PTSD level declined significantly with time, it did not disappear completely.

A Study on Algorithm Fusion for Recognition and Tracking of Moving Robot

This paper presents an algorithm for the recognition and tracking of moving objects; a 1/10-scale model car is used to verify the performance of the algorithm. The proposed algorithm merges the SURF algorithm with the Lucas-Kanade algorithm. SURF is robust to changes in contrast, size and rotation and can recognize objects, but it is slow because of its computational complexity. The Lucas-Kanade algorithm is fast but cannot recognize objects on its own; its optical flow compares the previous and current frames and can therefore track the movement of pixels. A Kalman filter is used to complement the problems that arise when the two algorithms are fused: it estimates the next location and compensates for the accumulated error. The resolution of the camera (vision sensor) is fixed at 640x480. To verify the performance of the fusion algorithm, it is compared with the SURF algorithm in three situations: driving straight, driving along a curve, and recognizing cars behind obstacles. The model vehicle makes it possible to reproduce situations similar to actual driving. The proposed fusion algorithm showed better performance and accuracy than existing object recognition and tracking algorithms. In future work, we will improve the algorithm so that it can be tested on images of actual road environments.
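A rough OpenCV sketch of this kind of fusion is given below: keypoints are detected on an initial frame, propagated with Lucas-Kanade optical flow, and their centroid is smoothed by a Kalman filter. It is only an outline of the idea using common OpenCV APIs (SURF requires the non-free opencv-contrib build, and any other detector could stand in; the video file name is hypothetical), not the authors' implementation.

```python
import cv2
import numpy as np

# Outline of the SURF + Lucas-Kanade + Kalman fusion idea (not the authors' code).
# SURF needs the non-free opencv-contrib build; another keypoint detector could stand in.

kalman = cv2.KalmanFilter(4, 2)                      # state: x, y, vx, vy; measurement: x, y
kalman.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
kalman.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
kalman.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)
kalman.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)

cap = cv2.VideoCapture("model_car.avi")              # hypothetical 640x480 test video
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

surf = cv2.xfeatures2d.SURF_create(400)              # robust but slow detector (non-free)
kps = surf.detect(prev_gray, None)
pts = np.float32([kp.pt for kp in kps]).reshape(-1, 1, 2)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Fast frame-to-frame tracking of the SURF keypoints with Lucas-Kanade optical flow.
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good = nxt[status.flatten() == 1]
    if len(good):
        centroid = good.reshape(-1, 2).mean(axis=0).astype(np.float32)
        kalman.predict()                             # estimate the next location
        kalman.correct(centroid.reshape(2, 1))       # compensate the accumulated error
    prev_gray, pts = gray, good.reshape(-1, 1, 2)
```

The expensive detector runs only once here; in practice it would be re-run periodically to re-acquire the object, which is the trade-off the fusion is meant to balance.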

On the Quantizer Design for Base Station Cooperation Systems with SC-FDE Techniques

By employing BS (Base Station) cooperation we can substantially increase the spectral efficiency and capacity of cellular systems. The signals received at each BS are sent to a central unit that separates the different MTs (Mobile Terminals) sharing the same physical channel. However, accurate sampling and quantization of those signals is needed so as to reduce the backhaul communication requirements. In this paper we consider the optimization of the quantizers for BS cooperation systems. Four different quantizer types are analyzed and optimized to achieve better SQNR (Signal-to-Quantization Noise Ratio) and BER (Bit Error Rate) performance.
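As a baseline for such a comparison, the sketch below measures the SQNR of a simple uniform mid-rise quantizer applied to a Gaussian-distributed received signal; the signal model, resolutions and clipping range are assumptions, and the optimized quantizers studied in the paper would replace the quantize() step.

```python
import numpy as np

# SQNR of a uniform mid-rise quantizer on a Gaussian "received signal" (illustrative
# baseline only; the paper's optimized quantizers would replace quantize()).

def quantize(x, bits, v_max):
    step = 2 * v_max / (2 ** bits)
    q = np.floor(x / step) * step + step / 2        # mid-rise uniform quantizer
    return np.clip(q, -v_max + step / 2, v_max - step / 2)

rng = np.random.default_rng(0)
signal = rng.normal(0.0, 1.0, 100_000)              # unit-power baseband component

for bits in (4, 6, 8):
    q = quantize(signal, bits, v_max=4.0)           # clip at 4 sigma (assumed)
    noise = signal - q
    sqnr_db = 10 * np.log10(np.mean(signal ** 2) / np.mean(noise ** 2))
    print(f"{bits} bits: SQNR = {sqnr_db:.1f} dB")
```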

Information Entropy of Isospectral Hydrogen Atom

The position- and momentum-space information entropies of the hydrogen atom are evaluated exactly. Using the isospectral Hamiltonian approach, a family of isospectral potentials is constructed that has the same energy eigenvalues as the original potential. The information entropy content is obtained in position space as well as in momentum space. It is shown that the information entropy content of each level can be re-arranged as a function of the deformation parameter.
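For the undeformed hydrogen ground state the position-space entropy has the closed form S_r = 3 + ln π (in atomic units), which makes a convenient numerical check; the sketch below reproduces it by direct integration. The deformed isospectral states treated in the paper would use the corresponding deformed wavefunctions in place of the density below.

```python
import numpy as np
from scipy.integrate import quad

# Position-space Shannon entropy of the hydrogen 1s state (atomic units):
# S_r = -int rho(r) ln rho(r) d^3r with rho(r) = |psi_100|^2 = exp(-2r)/pi.
# The closed form is 3 + ln(pi); isospectral (deformed) states would substitute
# their deformed densities for rho below.

def rho(r):
    return np.exp(-2.0 * r) / np.pi

def integrand(r):
    return -rho(r) * np.log(rho(r)) * 4.0 * np.pi * r ** 2

s_r, _ = quad(integrand, 0.0, 50.0)
print(s_r, 3.0 + np.log(np.pi))     # both approximately 4.1447
```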

Intellectual Capital Report for Universities

Intellectual capital reporting is becoming critical at universities, mainly because knowledge is the main output as well as the main input of these institutions. In addition, universities face continuous external demands for greater information and transparency about the use of public funds, and are increasingly granted greater autonomy regarding their organization, management, and budget allocation. This situation requires new management and reporting systems. The purpose of the present study is to provide a model for intellectual capital reporting in Spanish universities. To this end, a questionnaire was sent to every member of the Social Councils of Spanish public universities in order to identify which intangible elements university stakeholders demand most. Our proposal for an intellectual capital report aims to act as a guide that helps Spanish universities on the road to presenting information on intellectual capital, which can assist stakeholders in making the right decisions.

Quantitative Evaluation of Frameworks for Web Applications

An empirical study of web applications that use software frameworks is presented. The analysis is based on two approaches. In the first, developers using such frameworks assign weights, based on their experience, to parameters such as database connection. In the second approach, a performance testing tool, OpenSTA, is used to measure start time and other such quantities. From this analysis, it is concluded that open source software is superior to proprietary software. The motivation behind this research is to examine ways in which a quantitative assessment can be made of software in general and of frameworks in particular. Concepts such as metrics and architectural styles are discussed along with previously published research.
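One way the two approaches can be combined is to normalize the measured values and apply the developer-assigned weights in a weighted sum; the sketch below shows such a combination, with the framework names, weights and measurements invented for illustration rather than taken from the study.

```python
# Weighted-sum combination of developer-assigned weights and measured values.
# The framework names, parameter weights and measurements are invented examples.

weights = {"database_connection": 0.4, "start_time": 0.35, "page_render_time": 0.25}

# Lower is better for every measured parameter (e.g. seconds from a load-test run).
measurements = {
    "FrameworkA": {"database_connection": 0.08, "start_time": 1.9, "page_render_time": 0.30},
    "FrameworkB": {"database_connection": 0.05, "start_time": 2.6, "page_render_time": 0.45},
}

def score(framework):
    """Normalize each parameter against the best observed value, then weight."""
    total = 0.0
    for param, w in weights.items():
        best = min(m[param] for m in measurements.values())
        total += w * best / measurements[framework][param]   # 1.0 = best in class
    return total

for name in measurements:
    print(name, round(score(name), 3))
```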

A Neural-Network-Based Fault Diagnosis Approach for Analog Circuits by Using Wavelet Transformation and Fractal Dimension as a Preprocessor

This paper presents a new method of analog fault diagnosis based on back-propagation neural networks (BPNNs) using wavelet decomposition and fractal dimension as preprocessors. The proposed method can detect and identify faulty components in an analog electronic circuit with tolerance by analyzing its impulse response. Using wavelet decomposition to preprocess the impulse response drastically de-noises the inputs to the neural network. The second preprocessing step, fractal dimension, extracts unique features, which are then fed to the neural network as inputs for further classification. A comparison of our work with [1] and [6], which also employ back-propagation (BP) neural networks, reveals that our system requires a much smaller network and performs significantly better in the fault diagnosis of analog circuits, owing to the proposed preprocessing techniques.
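The preprocessing chain can be outlined as: wavelet-decompose the sampled impulse response, keep the low-frequency approximation as a de-noised compact representation, and append a fractal-dimension feature. The sketch below uses PyWavelets and a simple Katz fractal-dimension estimate on a synthetic signal; both the signal and the estimator are stand-ins and do not reproduce the authors' exact feature set.

```python
import numpy as np
import pywt

# Sketch of the preprocessing chain: wavelet decomposition de-noises the impulse
# response, a fractal-dimension feature is appended, and the result would be fed
# to a back-propagation neural network. The synthetic "impulse response" and the
# Katz fractal-dimension estimator are illustrative stand-ins only.

t = np.linspace(0.0, 1e-3, 1024)
impulse_response = np.exp(-4000 * t) * np.sin(2 * np.pi * 8e3 * t)   # synthetic response
impulse_response += 0.05 * np.random.randn(t.size)                   # measurement noise

# Keep the level-3 approximation coefficients as a compact, de-noised representation.
approx = pywt.wavedec(impulse_response, "db4", level=3)[0]

def katz_fd(x):
    """Katz estimate of the fractal dimension of a 1-D waveform."""
    dists = np.abs(np.diff(x))
    L = dists.sum()                      # total curve length
    d = np.abs(x - x[0]).max()           # maximum distance from the first sample
    n = x.size - 1
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

features = np.concatenate([approx, [katz_fd(impulse_response)]])
print(features.shape)    # feature vector for the neural-network classifier
```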

Multi-stage Directional Median Filter

The median filter is widely used to remove impulse noise without blurring sharp edges. However, when the noise level increases, or when edges are thin, the median filter may perform poorly. This paper proposes a new filter that detects edges along four possible directions and then replaces each noise-corrupted pixel with an estimated noise-free edge median value. Simulations show that the proposed multi-stage directional median filter provides excellent impulse noise suppression in all situations.
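A minimal version of the directional idea is sketched below: for each pixel, medians are taken along the four principal directions (horizontal, vertical and the two diagonals) and the direction with the smallest spread is assumed to follow the local edge. The single-stage detection rule and the 5-sample window are simplifying assumptions, not the paper's exact multi-stage scheme.

```python
import numpy as np

# Simplified single-stage sketch of a directional median filter: for each interior
# pixel, take the median along four directions and keep the one whose samples vary
# least (assumed to run along the local edge). The paper's multi-stage scheme is
# more elaborate than this illustration.

OFFSETS = {
    "horizontal": [(0, -2), (0, -1), (0, 0), (0, 1), (0, 2)],
    "vertical":   [(-2, 0), (-1, 0), (0, 0), (1, 0), (2, 0)],
    "diag_main":  [(-2, -2), (-1, -1), (0, 0), (1, 1), (2, 2)],
    "diag_anti":  [(-2, 2), (-1, 1), (0, 0), (1, -1), (2, -2)],
}

def directional_median(img):
    out = img.astype(float).copy()
    h, w = img.shape
    for y in range(2, h - 2):
        for x in range(2, w - 2):
            best = None
            for offs in OFFSETS.values():
                samples = np.array([img[y + dy, x + dx] for dy, dx in offs], dtype=float)
                spread = samples.max() - samples.min()
                if best is None or spread < best[0]:
                    best = (spread, np.median(samples))
            out[y, x] = best[1]
    return out

noisy = np.random.randint(0, 256, (64, 64))
print(directional_median(noisy).shape)
```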

Matrix-Interleaved Serially Concatenated Block Codes for Speech Transmission in Fixed Wireless Communication Systems

In this paper, we study a class of serially concatenated block codes (SCBC) based on matrix interleavers, to be employed in fixed wireless communication systems. The performance of SCBC-coded systems is investigated under various interleaver dimensions. Numerical results reveal that the matrix interleaver is a competitive candidate over the conventional block interleaver for frame lengths of 200 bits; hence, SCBC coding based on the matrix interleaver is a promising technique for speech transmission applications in many international standards, such as the pan-European Global System for Mobile communications (GSM), Digital Cellular System (DCS) 1800, and Joint Detection Code Division Multiple Access (JD-CDMA) mobile radio systems, where the speech frame contains around 200 bits.
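The matrix interleaver itself is simple: the frame is written into a rows-by-columns array row by row and read out column by column. The sketch below illustrates this for a 200-bit speech frame; the 10 x 20 dimension is an assumed example, not necessarily one of the dimensions evaluated in the paper.

```python
import numpy as np

# Matrix interleaver: write the frame row-wise into a rows x cols array and read it
# out column-wise. The 10 x 20 dimension for a 200-bit speech frame is an assumed example.

def interleave(bits, rows, cols):
    return np.asarray(bits).reshape(rows, cols).T.flatten()

def deinterleave(bits, rows, cols):
    return np.asarray(bits).reshape(cols, rows).T.flatten()

frame = np.random.randint(0, 2, 200)          # 200-bit speech frame
tx = interleave(frame, 10, 20)
assert np.array_equal(deinterleave(tx, 10, 20), frame)
```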

SWARM: A Meta-Scheduler to Minimize Job Queuing Times on Computational Grids

Some meta-schedulers query the information systems of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational Grid. However, this information can become outdated by the time a job starts because of changes in scheduling priorities. The MSR scheme, based on Multiple Simultaneous Requests, can take advantage of the opportunities that result from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal during the past year. Usage statistics are provided and demonstrate its capacity to reliably achieve a substantial reduction in execution time under production conditions.
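The Multiple Simultaneous Requests idea can be illustrated with a small Monte Carlo sketch: copies of a job are queued on several simulated clusters whose wait times fluctuate after submission, and whichever copy starts first wins while the others are cancelled. The queue model below is invented for illustration and says nothing about SWARM's or MSR's real implementation.

```python
import random

# Toy illustration of the Multiple Simultaneous Requests (MSR) idea: submit copies of
# a job to several clusters whose queuing times fluctuate after submission, start on
# whichever cluster frees up first, and cancel the rest. The model is invented.

random.seed(1)
clusters = {name: random.randint(5, 30) for name in ("clusterA", "clusterB", "clusterC")}

def queuing_time_single(cluster):
    """Submit only to the cluster that looked least busy at submission time."""
    return clusters[cluster] + random.randint(0, 20)   # priorities change after submission

def queuing_time_msr():
    """Submit copies everywhere; the job starts on the first cluster that frees up."""
    return min(clusters[c] + random.randint(0, 20) for c in clusters)

least_busy = min(clusters, key=clusters.get)
single = [queuing_time_single(least_busy) for _ in range(1000)]
msr = [queuing_time_msr() for _ in range(1000)]
print(sum(single) / len(single), sum(msr) / len(msr))   # MSR waits less on average
```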

Reciprocating Compressor Optimum Design and Manufacturing with Respect to Performance, Reliability and Cost

Reciprocating compressors are flexible in handling wide swings in capacity and operating conditions, offer a very efficient method of compressing almost any gas mixture over a wide range of pressures, can generate high head independent of gas density, and have numerous applications and wide power ratings. These features make them a vital component in various units of industrial plants. In this paper, the optimum reciprocating compressor configuration is presented with respect to interstage pressures, low suction pressure, non-lubricated cylinders, machine speed, the capacity control system, compressor valves, the lubrication system, piston rod coating, cylinder liner material, the barring device, pressure drops, rod load, pin reversal, discharge temperature, the cylinder coolant system, performance, flow, coupling, special tools, condition monitoring (including vibration, thermal and rod drop monitoring), commercial points, delivery and acoustic conditions.

Compiler-Based Architecture for Context Aware Frameworks

Computers are being integrated into various aspects of everyday human life in different shapes and with different abilities. This has intensified the requirement for software development technologies that are: 1) portable, 2) adaptable, and 3) simple to develop with. This problem is also known as the Pervasive Computing Problem (PCP); it can be addressed in different ways, each with its own pros and cons, and Context Oriented Programming (COP) is one of the methods to address it. In this paper, a design for a COP framework, a context-aware framework, is presented which eliminates the weak points of a previous design based on interpreted languages while introducing the power of compiled languages for implementing such frameworks. The key point of this improvement is combining COP and Dependency Injection (DI) techniques. Both the old and new frameworks are analyzed to show their advantages and disadvantages. Finally, a simulation of both designs is presented, indicating that the practical results agree with the theoretical analysis while the new design runs almost 8 times faster.
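The combination of Context Oriented Programming and Dependency Injection can be pictured as a container that resolves an abstract service to whichever implementation is registered for the currently active context; the sketch below is a generic illustration of that pattern, with invented class names, and is not the framework described in the paper.

```python
# Generic sketch of Context Oriented Programming combined with Dependency Injection:
# a container maps (service, context) pairs to implementations and resolves the one
# registered for the currently active context. Illustration only, not the paper's framework.

class Container:
    def __init__(self):
        self._bindings = {}
        self.context = "default"

    def register(self, service, context, implementation):
        self._bindings[(service, context)] = implementation

    def resolve(self, service):
        impl = self._bindings.get((service, self.context),
                                  self._bindings[(service, "default")])
        return impl()

class ScreenRenderer:            # abstract service
    def render(self, text): ...

class DesktopRenderer(ScreenRenderer):
    def render(self, text): return f"[window] {text}"

class MobileRenderer(ScreenRenderer):
    def render(self, text): return f"[compact] {text}"

container = Container()
container.register(ScreenRenderer, "default", DesktopRenderer)
container.register(ScreenRenderer, "mobile", MobileRenderer)

container.context = "mobile"     # a context switch picks a different injected implementation
print(container.resolve(ScreenRenderer).render("hello"))
```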

An Evaluation of Carbon Dioxide Emissions Trading among Enterprises: The Tokyo Cap and Trade Program

This study aims to propose three methods to evaluate the Tokyo Cap and Trade Program when emissions trading is performed virtually among enterprises, focusing on carbon dioxide (CO2), which is the only emitted greenhouse gas that tends to increase. The first method clarifies the optimum reduction rate for the highest cost benefit, the second discusses emissions trading among enterprises through market trading, and the third verifies long-term emissions trading during the term of the plan (2010-2019), partly using Geographic Information Systems (GIS) to check the validity of the trading. The findings of this study can be summarized in the following three points. 1. Since the total cost benefit is greatest at a 44% reduction rate, the rate can be set higher than that of the Tokyo Cap and Trade Program to obtain a greater total cost benefit. 2. At a 44% reduction rate, among 320 enterprises, 8 purchasing enterprises and 245 selling enterprises gain profits from emissions trading, and 67 enterprises perform voluntary reduction without trading. Therefore, to further promote emissions trading, it is necessary to increase the sales volume of emissions trading, and the number of selling enterprises, by increasing the number of purchasing enterprises. 3. Compared with short-term emissions trading, few enterprises benefit in each year under the long-term emissions trading of the Tokyo Cap and Trade Program; at most 81 enterprises can gain profits from emissions trading in FY 2019. Therefore, by setting the reduction rate higher, it is necessary to increase the number of enterprises that participate in emissions trading and benefit from the restraint of CO2 emissions.