Intermolecular Dynamics between Alcohols and Fatty Acid Ester Solvents

This work focused on the interactions that occur between ester solvents and alcohol solutes. The alcohols selected ranged from the simplest alcohol (methanol) to C10 alcohols, and solubility predictions in the form of infinite dilution activity coefficients were made using the Modified UNIFAC Dortmund group contribution model. The model computation was set up on a Microsoft Excel spreadsheet designed specifically for this purpose. It was found that alcohol/ester interactions yielded an increase in activity coefficients (i.e. the alcohols became less soluble) with an increase in the size of the ester solvent molecule. Furthermore, activity coefficients decreased with an increase in the size of the alcohol solute, and they also decreased with an increase in the degree of unsaturation of the ester hydrocarbon tail. Tertiary alcohols yielded lower activity coefficients than primary alcohols. Finally, cyclic alcohols yielded higher activity coefficients than straight-chain alcohols until a point was reached where the trend reversed, referred to as the ‘crossover’ point.
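
For reference, the quantity predicted here is the infinite dilution activity coefficient, which UNIFAC-type group contribution models split into a combinatorial and a residual part; the schematic form below is generic and omits the specific Modified UNIFAC (Dortmund) parameterization.

```latex
\gamma_i^{\infty} = \lim_{x_i \to 0} \gamma_i ,
\qquad
\ln \gamma_i = \ln \gamma_i^{\mathrm{C}} + \ln \gamma_i^{\mathrm{R}}
```

The combinatorial term accounts for size and shape differences between molecules, the residual term for group interaction energies; a larger infinite dilution activity coefficient corresponds to lower solubility of the alcohol in the ester.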

The Use of TV and the Internet in the Social Context

This study examines the media habits of young people in Saudi Arabia, in particular their use of the Internet and television in the domestic sphere, and how use of the Internet affects other activities. To address the research questions, focus group interviews were conducted with Saudi university students. The study found that television has become a central part of social life within the household, serving as a main source of family time, particularly during Ramadan, whereas the Internet is a solitary activity used in more private spaces. Furthermore, Saudi females were more likely to have their Internet access monitored and circumscribed by family members, with parents controlling both the location and the amount of time spent using the Internet.

A Statistical Prediction of Likely Distress in the Nigerian Banking Sector Using a Neural Network Approach

One of the most significant threats to the economy of a nation is the bankruptcy of its banks. This study evaluates the susceptibility of Nigerian banks to failure with a view to identifying the ratios and financial data that are sensitive to the solvency of a bank. Further, a predictive model is generated to guide all stakeholders in the industry. Thirty quoted banks that had published Annual Reports for the year preceding the consolidation (i.e. 2004) were selected and examined for distress using Multilayer Perceptron Neural Network Analysis. The model was then used to analyze further reforms by the Central Bank of Nigeria, using the published Annual Reports of twenty quoted banks for the years 2008 and 2011. The model can thus be used for future prediction of failure in the Nigerian banking system.
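
A minimal sketch of how a multilayer perceptron distress classifier of this kind could be set up, assuming scikit-learn is available; the ratio columns, synthetic data, labels and network size below are hypothetical placeholders, not the study's dataset or architecture.

```python
# Train an MLP on financial ratios and score a new bank's distress risk.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Columns (illustrative): capital adequacy, liquidity, non-performing-loan ratio, return on assets
X = rng.normal(size=(30, 4))                                              # 30 banks, 4 ratios
y = (X[:, 2] - X[:, 0] + rng.normal(scale=0.3, size=30) > 0).astype(int)  # 1 = distressed

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
model.fit(X, y)

new_bank = np.array([[0.2, -0.5, 1.1, -0.3]])     # hypothetical ratios for a new bank
print("probability of distress:", model.predict_proba(new_bank)[0, 1])
```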

An Integrated Operational Research and System Dynamics Approach for Planning Decisions in Container Terminals

This paper focuses on the operational and strategic planning decisions related to the quayside of container terminals. We introduce an integrated operational research (OR) and system dynamics (SD) approach to solve the Berth Allocation Problem (BAP) and the Quay Crane Assignment Problem (QCAP). A BAP-QCAP optimization modeling approach that considers practical aspects not previously studied in the integration of BAP and QCAP is discussed. A conceptual SD model is developed to determine the long-term effect of optimization on system behavior factors such as resource utilization, attractiveness of the port, the number of incoming vessels, and port profits. The framework can be used to improve the operational efficiency of container terminals and to provide a strategic view after applying optimization.
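
As a rough illustration of the OR side only, a toy berth allocation can be cast as an assignment problem; this is not the paper's BAP-QCAP formulation (which also couples berthing times and quay crane assignments), and the handling times below are hypothetical.

```python
# Toy berth allocation: assign each vessel to one berth so that total handling
# time is minimized. Time windows and quay crane assignment are ignored here.
import numpy as np
from scipy.optimize import linear_sum_assignment

# handling_time[i, j] = hours needed if vessel i is served at berth j (hypothetical data)
handling_time = np.array([
    [10, 12, 15],
    [ 9, 14, 11],
    [13,  8, 10],
])

vessels, berths = linear_sum_assignment(handling_time)
for v, b in zip(vessels, berths):
    print(f"vessel {v} -> berth {b} ({handling_time[v, b]} h)")
print("total handling time:", handling_time[vessels, berths].sum(), "h")
```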

Natural Disaster Impact on Annual Visitors to Recreation Areas: The Taiwan Case

This paper aims to quantify the impact of natural disasters on tourism through changes in the number of annual visitors to scenic spots. Data on visitors to Alishan, Sun Moon Lake, Sitou and the Palace Museum in Taiwan from 1986 to 2012 were collected, and trend analysis was used to predict the annual visitors to these scenic spots. The findings show that the 1999 Taiwan earthquake had a significant effect on visitor numbers at Alishan, Sun Moon Lake and Sitou, with an average impact of 55.75% during 1999 to 2000, but not at the Palace Museum. The impact was greater for spots closer to the epicenter of the 1999 earthquake, and the recovery period of visitor numbers is about 2 to 9 years. Further, the impact of heavy rainfall on Alishan, Taiwan is estimated: once the accumulated rainfall reaches 500 mm, the impact on visitor numbers can be predicted.
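
A minimal sketch of the trend-analysis idea: fit a linear trend to pre-disaster visitor counts, extrapolate it to the disaster year, and express the impact as the percentage shortfall of actual visitors from the trend. The numbers below are hypothetical placeholders, not the Taiwan visitor statistics.

```python
# Fit a pre-quake linear trend and estimate the 1999 impact as deviation from trend.
import numpy as np

years = np.arange(1986, 1999)
visitors = 1_000_000 + 20_000 * (years - 1986) \
    + np.random.default_rng(1).normal(0, 15_000, years.size)   # synthetic pre-quake series

slope, intercept = np.polyfit(years, visitors, 1)               # linear trend

actual_1999 = 600_000                                           # hypothetical post-quake count
expected_1999 = slope * 1999 + intercept
impact_pct = 100 * (expected_1999 - actual_1999) / expected_1999
print(f"estimated impact in 1999: {impact_pct:.1f}% below trend")
```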

Composition-Dependent Formation of Sputtered Co-Cu Films on a Cr Under-Layer

Sputtered CoxCu100-x films with compositions of x = 57.7, 45.8, 25.5, 13.8, 8.8, 7.5 and 1.8 were deposited on a Cr under-layer by RF sputtering. SEM results reveal that the average thicknesses of the Co-Cu films and the Cr under-layer are 92 nm and 22 nm, respectively. All Co-Cu films are composed of Co (FCC) and Cu (FCC) phases oriented in the (111) direction on the BCC-Cr (110) under-layer. The magnetic properties, surface roughness and morphology of the Co-Cu films depend on the film composition. The maximum and minimum surface roughnesses of 3.24 nm and 1.16 nm are observed for the Co7.5Cu92.5 and Co45.8Cu54.2 films, respectively. The variation in surface roughness can be attributed to differences in the agglomeration rates of Co and Cu atoms on the Cr under-layer. The Co57.7Cu42.3, Co45.8Cu54.2 and Co25.5Cu74.5 films show a ferromagnetic phase, whereas the remaining films exhibit a paramagnetic phase at room temperature. The saturation magnetization, remanent magnetization and coercive field of the Co-Cu films on the Cr under-layer increase slightly with increasing Co content. It can be concluded that the required magnetic properties and surface roughness of a Co-Cu film can be tailored by adjusting the film composition.

A Study of Priority Evaluation and Resource Allocation for the Revitalization of Cultural Heritage in Urban Development

Proper maintenance and preservation of significant cultural heritage sites and historic buildings is necessary. It not only enhances environmental benefits and a sense of community, but also preserves a city's history and people’s memory, allowing the next generation to glimpse the past and achieving the goal of sustainably preserved cultural assets. However, maintenance work has so far not been managed appropriately for many designated heritage sites and historic buildings, and the planning and implementation of reuse has yet to be clearly specified. As a result, heritage sites are often merely “reserved” as a formality rather than conserved in the true sense. The restoration and preservation of cultural heritage is an important research issue because historical significance, symbolism, and economic benefits must all be considered. However, decision makers such as public-sector officials often face the question of which heritage sites should be restored first under limited budgets. Very few techniques are available today for determining appropriate restoration priorities among diverse historical heritage sites, perhaps because few systematic decision-making aids have been proposed. In the past, discussions of the management and maintenance of cultural assets were limited to the selection of reuse alternatives rather than the allocation of resources. In view of this, this research adopts integrated research methods to solve the problems that decision makers encounter when allocating resources for the management and maintenance of heritage sites and historic buildings. The purpose of this study is to develop a sustainable decision-making model that local governments can use to resolve these problems. We propose an alternative decision support model to prioritize restoration needs within limited budgets. The model is constructed based on fuzzy Delphi, fuzzy analytic network process (FANP) and goal programming (GP) methods. To avoid misallocating resources, this research proposes a precise procedure that takes the views of multiple stakeholders, limited costs, and limited resources into consideration. The combination of multiple factors and goals is also taken into account to find the highest-priority feasible solution. To illustrate the proposed approach, seven cultural heritage sites in Taipei City are used as an empirical case study, and the results are analyzed in depth to explain the application of our approach.
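
As a simplified illustration of the resource-allocation step only (the fuzzy Delphi, FANP and GP stages that produce the priority weights are not reproduced here), restoration projects can be selected under a budget as a small knapsack-style problem; the site names, weights, costs and budget are hypothetical.

```python
# Select the subset of restoration projects that maximizes total priority weight
# without exceeding the budget, by exhaustive search (fine for a handful of sites).
# The priority weights would come from the fuzzy Delphi/FANP stage in practice;
# all values below are illustrative only.
from itertools import combinations

projects = {            # name: (priority weight, restoration cost)
    "Site A": (0.32, 40),
    "Site B": (0.24, 25),
    "Site C": (0.18, 30),
    "Site D": (0.14, 15),
    "Site E": (0.12, 20),
}
budget = 70

best_value, best_set = 0.0, ()
for r in range(1, len(projects) + 1):
    for subset in combinations(projects, r):
        cost = sum(projects[s][1] for s in subset)
        value = sum(projects[s][0] for s in subset)
        if cost <= budget and value > best_value:
            best_value, best_set = value, subset

print("selected:", best_set, "| total priority:", round(best_value, 2))
```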

On the Computation of a Common n-finger Robotic Grasp for a Set of Objects

Industrial robotic arms utilize multiple end-effectors, each for a specific part and a specific task. We propose a novel algorithm that defines a single end-effector configuration able to grasp a given set of objects with different geometries. The algorithm is of great benefit in production lines, allowing a single robot to grasp various parts and hence reducing the number of end-effectors needed. Moreover, it reduces end-effector design and manufacturing time and final product cost. The algorithm searches for a common grasp over the set of objects. It maps all possible grasps for each object that satisfy a quality criterion, taking into account possible external wrenches (forces and torques) applied to the object. The mapped grasps are represented by high-dimensional feature vectors that describe the shape of the gripper. We generate a database of all possible grasps for each object in this feature space, and then use a search and classification algorithm to intersect the possible grasps over all parts and find a single common grasp suitable for all objects. We present simulations of planar and spatial objects to validate the feasibility of the approach.
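
A highly simplified sketch of the final intersection step (the grasp-quality mapping and wrench analysis that generate each object's database are not reproduced): each object's feasible grasps are points in a common feature space, and a common grasp is a configuration that lies within a tolerance of some feasible grasp of every object. The feature vectors and tolerance below are random placeholders.

```python
# Find gripper configurations that are within a tolerance of at least one
# feasible grasp for every object, by testing candidates from the first
# object's database against all the others.
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical databases: each object has 200 feasible grasps described by
# 6-dimensional feature vectors (e.g. finger positions/orientations).
databases = [rng.uniform(0, 1, size=(200, 6)) for _ in range(3)]
tol = 0.15

def common_grasps(databases, tol):
    candidates = databases[0]
    for db in databases[1:]:
        # keep candidates that have a neighbour within `tol` in this object's database
        dists = np.linalg.norm(candidates[:, None, :] - db[None, :, :], axis=2)
        candidates = candidates[dists.min(axis=1) <= tol]
        if candidates.size == 0:
            break
    return candidates

common = common_grasps(databases, tol)
print(f"{len(common)} candidate common grasps found")
```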

Comparison of Two Types of Preconditioners for Stokes and Linearized Navier-Stokes Equations

Several preconditioners have been published to solve saddle point systems efficiently. There are many methods for constructing preconditioners for linear systems arising from saddle point problems; for instance, the relaxed dimensional factorization (RDF) preconditioner and the augmented Lagrangian (AL) preconditioner are used for both steady and unsteady Navier-Stokes equations. In this paper we compare the RDF preconditioner with the modified AL (MAL) preconditioner to show which is more effective for solving the Navier-Stokes equations. Numerical experiments indicate that the MAL preconditioner is more efficient and robust, especially for moderate viscosities and stretched grids in steady problems. For unsteady cases, the convergence rate of the RDF preconditioner is slightly faster than that of the MAL preconditioner in some circumstances, but the parameter of the RDF preconditioner is more sensitive than that of the MAL preconditioner. Moreover, the convergence rate of the MAL preconditioner is still quite acceptable. We therefore conclude that the MAL preconditioner is more competitive than the RDF preconditioner. The experiments are implemented with the IFISS package.
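
For context, both preconditioners target linear systems of saddle point form arising from Stokes and linearized Navier-Stokes discretizations. A generic block-triangular preconditioner of this kind is shown below; the specific RDF and MAL constructions differ in how the blocks are approximated and in the choice of their parameter.

```latex
\begin{pmatrix} A & B^{T} \\ B & 0 \end{pmatrix}
\begin{pmatrix} u \\ p \end{pmatrix}
=
\begin{pmatrix} f \\ g \end{pmatrix},
\qquad
\mathcal{P} =
\begin{pmatrix} \hat{A} & B^{T} \\ 0 & \hat{S} \end{pmatrix}
```

Here the (1,1) block approximation and the Schur complement approximation (of -B A^{-1} B^{T}) determine the quality and cost of the preconditioner.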

A Study of Removing SUVA and Trihalomethanes by Biological Activated Carbon

The SUVA value (equivalent to UV254/DOC) of raw water indicates the precursors for the formation of trihalomethanes during chlorination at a water treatment plant. This study collected rapidly filtered water from an advanced water treatment plant for use as the raw water in experiments. The removal of trihalomethane formation potential (THMFP) was evaluated using biological activated carbon. The hydraulic retention time and the SUVA loading were the major factors in the biological degradation tests. The results showed that biological powdered activated carbon (BPAC) lowered the average UV254 concentration and SUVA value of the raw water, and THMFP removal was observed in the treatment of the three primary organic carbon items. These results highlight that BPAC has an excellent treatment efficiency for THMFP.
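
For clarity, SUVA is the UV absorbance at 254 nm normalized by the dissolved organic carbon (DOC) concentration, commonly reported as:

```latex
\mathrm{SUVA}_{254} = \frac{\mathrm{UV}_{254}\ [\mathrm{cm^{-1}}]}{\mathrm{DOC}\ [\mathrm{mg\,L^{-1}}]} \times 100
\qquad [\mathrm{L\,mg^{-1}\,m^{-1}}]
```

The factor of 100 converts the absorbance path length from per centimetre to per metre.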

The Implementation of the Multi-Agent Classification System (MACS) in Compliance with FIPA Specifications

The paper discusses the implementation of the Multi-Agent Classification System (MACS) and its use to provide automated and accurate classification of end users developing applications in the spreadsheet domain. Different technologies have been brought together to build MACS. The strength of the system is the integration of agent technology and the FIPA specifications with other technologies, namely .NET Windows service-based agents, Windows Communication Foundation (WCF) services, Service Oriented Architecture (SOA), and Oracle Data Mining (ODM). Microsoft's .NET Windows service-based agents were used to develop the monitoring agents of MACS, while the .NET WCF services, together with the SOA approach, allowed the distribution of and communication between agents over the WWW. The Monitoring Agents (MAs) were configured to execute automatically to monitor Excel spreadsheet development activities by content. Data gathered by the Monitoring Agents from various resources over a period of time was collected and filtered by a Database Updater Agent (DUA) residing in the .NET client application of the system. This agent then transfers and stores the data in the Oracle server database via Oracle stored procedures for further processing that leads to the classification of the end-user developers.

Using Multi-Linguistic Techniques for Thai Herb and Traditional Medicine Registration Systems

Thailand has developed a unique culture and body of knowledge, foremost among which is Thai traditional medicine (TTM). Recently, a number of researchers have tried to preserve this indigenous knowledge, but systems for doing so are still scant. To preserve this ancient knowledge, we therefore integrated multi-linguistic techniques to create a system that collects all of the recipes. The application extracts medicinal recipes from antique scriptures and then normalizes their archaic words, primitive grammar and antiquated units of measurement into modern equivalents. We then apply ingredient-duplication calculation, proportion-similarity calculation and score ranking to detect duplicate recipes. We collected questionnaires from registrants and members of the public to investigate user satisfaction, and the results were satisfactory. This application assists not only registrants in validating copyright violations during the TTM registration process, but also people seeking treatment for their illnesses, helping both Thai people and the rest of mankind fight intractable diseases.
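
A minimal sketch of the duplicate-recipe check, assuming (since the exact formulas are not given here) that ingredient duplication is measured by set overlap and proportion similarity by comparing normalized ingredient fractions; the recipe contents, weights and combination rule are hypothetical.

```python
# Compare a newly submitted recipe against a registered one using
# (1) ingredient overlap (Jaccard index) and (2) similarity of ingredient
# proportions, then combine them into a duplication score.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def proportion_similarity(r1, r2):
    """1 - mean absolute difference of normalized proportions over shared ingredients."""
    shared = set(r1) & set(r2)
    if not shared:
        return 0.0
    n1, n2 = sum(r1.values()), sum(r2.values())
    diffs = [abs(r1[i] / n1 - r2[i] / n2) for i in shared]
    return 1.0 - sum(diffs) / len(diffs)

# Hypothetical recipes: ingredient -> parts by weight
recipe_new = {"ginger": 2, "turmeric": 1, "black pepper": 1}
recipe_registered = {"ginger": 2, "turmeric": 2, "licorice": 1}

score = (0.5 * jaccard(recipe_new, recipe_registered)
         + 0.5 * proportion_similarity(recipe_new, recipe_registered))
print(f"duplication score: {score:.2f}")   # flag for review if above a chosen threshold
```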

An Application of Data Mining Methods with Decision Rules

Rankings of the output of China's main agricultural commodities in the world for 1978, 1980, 1990, 2000, 2006, 2007 and 2008 have been released in the United Nations FAO Database. Unfortunately, the world ranking of the output of Chinese cotton lint for 2008 is missing. This paper uses sequential data mining methods with decision rules to fill this gap. This new data mining method will help to further improve the United Nations FAO Database.
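
A toy sketch of the gap-filling idea, assuming a simple decision rule; the paper's sequential data mining procedure is not reproduced, and the rank values below are hypothetical placeholders, not FAO figures.

```python
# Fill a missing ranking using simple decision rules over the observed sequence:
# if the recent ranks are stable, carry the last rank forward; otherwise
# extrapolate the recent trend and round to the nearest valid rank.
ranks = {1978: 3, 1980: 2, 1990: 2, 2000: 1, 2006: 1, 2007: 1}   # hypothetical
missing_year = 2008

recent_years = sorted(ranks)[-3:]
recent = [ranks[y] for y in recent_years]

if len(set(recent)) == 1:          # rule 1: stable rank -> carry forward
    predicted = recent[-1]
else:                              # rule 2: extrapolate the recent trend
    step = (recent[-1] - recent[0]) / (recent_years[-1] - recent_years[0])
    predicted = max(1, round(recent[-1] + step * (missing_year - recent_years[-1])))

print(f"predicted {missing_year} rank: {predicted}")
```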

A Car Parking Monitoring System Using Wireless Sensor Networks

This paper presents a car parking monitoring system using wireless sensor networks. Multiple sensor nodes, a sink node, a gateway, and a server constitute a wireless network for monitoring a parking lot. Each sensor node is equipped with a 3-axis AMR sensor and deployed in the center of a parking space. Each sensor node reads its sensor values periodically and transmits the data to the sink node if the current and immediately preceding sensor values differ by more than a threshold value. The sensor nodes and sink node use the 448 MHz band for wireless communication. Since RF transmission only occurs when sensor values show abrupt changes, the number of RF transmission operations is reduced and battery power can be conserved. The data from the sensor nodes reach the server via the sink node and gateway. The server determines which parking spaces are occupied by cars based on the received sensor data and reference values. The reference values are the average sensor values measured by each sensor node when the corresponding parking space is not occupied by a vehicle. Because the decision making is done by the server, the computational burden of the sensor node is relieved, which helps reduce its duty cycle.
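
A compact sketch of the reporting and decision logic described above, assuming a scalar magnitude for the 3-axis AMR reading; the threshold, margin and readings are hypothetical values.

```python
# Node side: transmit only when consecutive readings differ by more than a threshold.
# Server side: declare a space occupied when the reported value deviates from the
# node's vacant-space reference value by more than a margin.
THRESHOLD = 30      # hypothetical change threshold (sensor units)
MARGIN = 50         # hypothetical occupancy margin (sensor units)

def node_should_transmit(previous_reading, current_reading, threshold=THRESHOLD):
    return abs(current_reading - previous_reading) > threshold

def server_is_occupied(reported_reading, reference_value, margin=MARGIN):
    return abs(reported_reading - reference_value) > margin

# Example: reference measured while the space was empty, then a car arrives.
reference = 1200
readings = [1205, 1210, 1460]           # third reading jumps as a car parks
prev = readings[0]
for r in readings[1:]:
    if node_should_transmit(prev, r):
        status = "occupied" if server_is_occupied(r, reference) else "vacant"
        print("transmit:", r, "->", status)
    prev = r
```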

A Consideration of the Achievement of Productive Level Parallel Programming Skills

This paper considers the achievement of productive-level parallel programming skills, based on data from graduation studies at the Polytechnic University of Japan. The data show that most students can acquire parallel programming skills during the graduation study (about 600 to 700 hours) if the programming environment is limited to GPGPUs. However, the data also show that achieving productive-level parallel programming skills within the graduation study alone is a very demanding task. In addition, the data suggest that GPGPU parallel programming environments such as CUDA and OpenCL may be more suitable for parallel computing education than other environments such as MPI on a cluster system or the Cell.B.E. These results should be useful not only for software development but also for the development of hardware products using computer technologies.

Partial Purification of Cytotoxic Peptides against Gastric Cancer Cells from Protein Hydrolysate of Euphorbia hirta Linn.

Protein hydrolysates prepared from a number of medicinal plants are promising sources of various bioactive peptides. In this work, proteins from the dried whole plant of Euphorbia hirta Linn. were extracted and digested with pepsin for 12 h. Hydrolysates smaller than 3 kDa were fractionated using a cut-off membrane. The peptide hydrolysate was then purified by anion-exchange chromatography on a DEAE-Sephacel™ column and reverse-phase chromatography on a Sep-Pak C18 column. The cytotoxic effect of each peptide fraction against a gastric carcinoma cell line (KATO-III, ATCC No. HTB103) was investigated using the colorimetric MTT viability assay. A human liver cell line (Chang Liver, CLS No. 300139) was used as a normal control cell line. Two purified peptide peaks, peak I and peak II, at 100 µg peptides mL-1 reduced the viability of the gastric cancer cell line to 63.85±4.94% and 66.92±6.46%, respectively. Our results show for the first time that peptide fractions derived from the protein hydrolysate of Euphorbia hirta Linn. have anti-gastric cancer activity, offering a potential novel and natural anti-gastric cancer remedy.

Using Strategic CSR to Achieve the Hybrid Middle Ground in Social Entrepreneurship: The Case of Telenor Hungary

To be considered a socially entrepreneurial organization today requires achieving what can be termed a “hybrid middle ground” equilibrium, comprising economic as well as social sustainability. This middle ground requires some blend of both business and social commitments. In this paper, we use the case of Hungary's second-ranked mobile operator, Telenor Hungary, to illustrate an example of a company that is moving to the hybrid middle ground by transitioning from a for-profit company to a socially responsible business using the concept of strategic CSR. In this line of thinking, the organization explicitly supports programs and initiatives that have a direct link to the core business and bring operational and/or financial advantages for the company, while creating a positive social and/or environmental impact. The important lessons learned from the company's transition are also discussed.

Designing Software Quality Measurement System for Telecommunication Industry Using Object-Oriented Technique

A number of software quality measurement systems have been implemented over the past few years, but none of them focuses on the telecommunication industry. The software quality measurement system described here calculates the quality value of the measured software with a specific focus on the telecommunication industry. Before designing the system, quality factors, quality attributes and quality metrics were identified based on a literature review and a survey. Then, using the identified quality factors, attributes and metrics, a quality model for the telecommunication industry was constructed. Each identified quality metric has its own formula. The quality value of the software is measured from the quality metrics and aggregated by referring to the quality model, and the quality level of the software is then classified based on the Net Satisfaction Index (NSI). The system was designed using an object-oriented approach in a web-based environment. Such a software quality measurement system is important to both developers and users in order to produce high-quality software products for the telecommunication industry.
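
A minimal sketch of the aggregation step, assuming (since the exact metric formulas and NSI thresholds are not given here) normalized metric scores, a weighted quality model, and illustrative classification bands; all names, weights and thresholds are hypothetical.

```python
# Aggregate normalized quality metric scores (0-1) into a quality value using
# the weights of a quality model, then map the value to a quality level.
metrics = {"reliability": 0.82, "efficiency": 0.74, "maintainability": 0.66, "usability": 0.90}
weights = {"reliability": 0.35, "efficiency": 0.25, "maintainability": 0.20, "usability": 0.20}

quality_value = sum(metrics[m] * weights[m] for m in metrics)   # weighted aggregation
index = 100 * quality_value                                     # express as an index out of 100

if index >= 80:
    level = "high"
elif index >= 60:
    level = "medium"
else:
    level = "low"

print(f"quality value = {quality_value:.2f}, index = {index:.0f}, level = {level}")
```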

Some Preconditioners for Block Pentadiagonal Linear Systems Based on New Approximate Factorization Methods

In this paper, to obtain a high-efficiency parallel algorithm for solving sparse block pentadiagonal linear systems on vector and parallel processors, stair matrices are used to construct parallel polynomial approximate inverse preconditioners. These preconditioners are appropriate when the desired target is to maximize parallelism. Moreover, some theoretical results about these preconditioners are presented, and how to construct preconditioners effectively for any nonsingular block pentadiagonal H-matrix is also described. In addition, the effectiveness of these preconditioners is illustrated with numerical experiments arising from a two-dimensional biharmonic equation.
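
For context, a polynomial approximate inverse preconditioner is typically built from a splitting A = M - N with an easily invertible M (the stair matrices play this role in the paper); truncating the Neumann series gives a generic form such as

```latex
A^{-1} = \sum_{i=0}^{\infty} \left(M^{-1}N\right)^{i} M^{-1}
\;\approx\;
P_{k}^{-1} = \sum_{i=0}^{k} \left(M^{-1}N\right)^{i} M^{-1}
```

which converges when the spectral radius of M^{-1}N is less than one; each application of the preconditioner requires only matrix-vector products with M^{-1} and N, which is what makes such preconditioners attractive on vector and parallel processors.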

Reliability Approximation through the Discretization of Random Variables Using the Reversed Hazard Rate Function

Sometimes it is difficult to determine the exact reliability of complex systems by analytical procedures. An approximate solution to this problem can be obtained through the discretization of random variables. In this paper we describe the usefulness of discretizing a random variable using the reversed hazard rate function of its continuous version. Discretization of the exponential distribution is demonstrated, and applications of the approach are cited. Numerical calculations indicate that the proposed approach gives a very good approximation of the reliability of complex systems under a stress-strength set-up. The performance of the proposed approach is better than that of the existing discrete concentration method of discretization. The approach is conceptually simple, handles analytic intractability and reduces computational time. It can be applied in manufacturing industries for producing highly reliable items.
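
For reference, the reversed hazard rate of a continuous variable with density f and distribution function F is f(x)/F(x). The sketch below only illustrates the general idea of approximating a stress-strength reliability from discretized variables, using a simple midpoint lattice discretization rather than the paper's reversed hazard rate scheme; the rates and lattice settings are illustrative.

```python
# Discretize exponential stress and strength onto a lattice and approximate
# R = P(strength > stress), then compare with the exact exponential result
# R = rate_stress / (rate_strength + rate_stress).
import numpy as np

def discretize(rate, h, k_max):
    """Concentrate the Exp(rate) distribution on the lattice {0, h, 2h, ...}."""
    def F(x):
        return 1.0 - np.exp(-rate * np.maximum(x, 0.0))   # CDF, zero for x < 0
    x = h * np.arange(k_max + 1)
    pmf = F(x + h / 2) - F(x - h / 2)
    return pmf / pmf.sum()                                 # renormalize truncated support

h, k_max = 0.1, 1000
rate_strength, rate_stress = 0.2, 0.5                      # illustrative parameters
p_strength = discretize(rate_strength, h, k_max)
p_stress = discretize(rate_stress, h, k_max)

survival_strength = 1.0 - np.cumsum(p_strength)            # P(strength > k*h)
R_approx = float(np.sum(p_stress * survival_strength))
R_exact = rate_stress / (rate_strength + rate_stress)
print(f"approximate R = {R_approx:.4f}, exact R = {R_exact:.4f}")
```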