Beneficiation of Pyrolytic Carbon Black

This research investigated the treatment of crude carbon black produced from the pyrolysis of waste tyres in order to evaluate its quality and possible industrial applications. A representative sample of crude carbon black was dry screened to determine the initial particle size distribution. This was followed by pulverizing the crude carbon black and leaching it in hot concentrated sulphuric acid to remove heavy metals and other contaminants. Analysis of the refined carbon black showed a significant improvement in product quality compared with the crude carbon black. The refined carbon black can be further classified into multiple high-value products for various industrial applications such as filler, paint pigment, activated carbon, and fuel briquettes.
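
As a simple illustration of the dry-screening step, the sketch below tabulates a particle size distribution from sieve masses; the apertures and masses are hypothetical, not the values measured for this crude carbon black.

```python
# A minimal sketch of how a particle size distribution is tabulated from dry
# screening: mass retained on each sieve is converted to percent retained and
# cumulative percent passing. Sieve sizes and masses are hypothetical, not the
# values measured for the crude carbon black in this study.

sieve_data = [  # (aperture in microns, mass retained in grams)
    (850, 12.0), (425, 35.0), (212, 48.0), (106, 62.0), (75, 28.0),
    (0, 15.0),   # pan (fines)
]

total = sum(m for _, m in sieve_data)
cumulative_passing = 100.0
print(f"{'Sieve (um)':>10} {'% retained':>11} {'% passing':>10}")
for aperture, mass in sieve_data:
    pct_retained = 100.0 * mass / total
    cumulative_passing -= pct_retained
    print(f"{aperture:>10} {pct_retained:>11.1f} {max(cumulative_passing, 0.0):>10.1f}")
```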

The Use of TV and the Internet in the Social Context

This study examines the media habits of young people in Saudi Arabia, in particular their use of the Internet and television in the domestic sphere, and how use of the Internet affects other activities. To address the research questions, focus group interviews were conducted with Saudi university students. The study found that television has become a central part of social life within the household, serving as a main focus of family time, particularly during Ramadan, whereas Internet use is a solitary activity carried out in more private spaces. Furthermore, Saudi females were more likely to have their Internet access monitored and circumscribed by family members, with parents controlling both the location and the amount of time spent online.

Augmented Reality on Android

Augmented Reality combines a live view of a real-world environment with computer-generated imagery. This paper studies and demonstrates efficient Augmented Reality development in the mobile Android environment using the native Java language and the Android SDK. Major components include a Barcode Reader, a File Loader, a Marker Detector, a Transform Matrix Generator, and a cloud database.

An Effective Genetic Algorithm for a Complex Real-World Scheduling Problem

We address a complex scheduling problem arising in the wood panel industry with the objective of minimizing a quadratic function of job tardiness. The proposed solution strategy, based on an effective genetic algorithm, has been coded and implemented within a major Tunisian company that is a leader in wood panel manufacturing. Preliminary experimental results indicate a significant decrease in delivery times.
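
As an illustration of the solution strategy, the following is a minimal sketch of a genetic algorithm minimizing a quadratic tardiness sum on a simplified single-machine instance; the job data, operators, and parameters are invented and do not reproduce the company's real scheduling problem.

```python
import random

# Illustrative single-machine instance (hypothetical data, not from the paper):
# each job has a processing time and a due date; the objective is the
# quadratic tardiness sum  sum_j max(0, C_j - d_j)^2.
JOBS = [(4, 10), (3, 6), (7, 20), (2, 8), (5, 14), (6, 25)]  # (p_j, d_j)

def quadratic_tardiness(sequence):
    t, cost = 0, 0
    for j in sequence:
        p, d = JOBS[j]
        t += p
        cost += max(0, t - d) ** 2
    return cost

def crossover(a, b):
    # Order crossover (OX): keep a slice of parent a, fill the rest from b.
    i, k = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:k] = a[i:k]
    rest = [g for g in b if g not in child]
    for idx in range(len(a)):
        if child[idx] is None:
            child[idx] = rest.pop(0)
    return child

def mutate(seq, rate=0.2):
    if random.random() < rate:
        i, k = random.sample(range(len(seq)), 2)
        seq[i], seq[k] = seq[k], seq[i]
    return seq

def genetic_algorithm(pop_size=30, generations=200):
    pop = [random.sample(range(len(JOBS)), len(JOBS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=quadratic_tardiness)
        elite = pop[: pop_size // 2]           # truncation selection
        children = [mutate(crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    best = min(pop, key=quadratic_tardiness)
    return best, quadratic_tardiness(best)

print(genetic_algorithm())
```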

An Integrated Operational Research and System Dynamics Approach for Planning Decisions in Container Terminals

This paper focuses on the operational and strategic planning decisions related to the quayside of container terminals. We introduce an integrated operational research (OR) and system dynamics (SD) approach to solve the Berth Allocation Problem (BAP) and the Quay Crane Assignment Problem (QCAP). A BAP-QCAP optimization modeling approach is discussed that considers practical aspects not previously studied in the integration of BAP and QCAP. A conceptual SD model is developed to determine the long-term effect of the optimization on system behavior factors such as resource utilization, port attractiveness, the number of incoming vessels, and port profits. The framework can be used to improve the operational efficiency of container terminals and to provide a strategic view after applying the optimization.
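
To make the quayside decisions concrete, the following is a hedged sketch of a greedy berth allocation with a crude quay crane split, not the paper's BAP-QCAP optimization model; the vessels, berth count, and crane limits are invented.

```python
# A minimal, hypothetical sketch of combined berth allocation and quay crane
# assignment using a greedy rule (not the paper's optimization model): vessels
# are served in order of arrival at the earliest free berth, and the handling
# time shrinks with the number of cranes assigned.

from dataclasses import dataclass

@dataclass
class Vessel:
    name: str
    arrival: float      # hours
    workload: float     # crane-hours of container moves
    max_cranes: int     # physical limit along the vessel's length

BERTHS = 2
TOTAL_CRANES = 5

def schedule(vessels):
    berth_free = [0.0] * BERTHS
    plan = []
    for v in sorted(vessels, key=lambda x: x.arrival):
        b = min(range(BERTHS), key=lambda i: berth_free[i])
        start = max(v.arrival, berth_free[b])
        cranes = min(v.max_cranes, TOTAL_CRANES // BERTHS + 1)  # crude split
        handling = v.workload / cranes
        berth_free[b] = start + handling
        plan.append((v.name, b, start, cranes, start + handling))
    return plan

vessels = [Vessel("A", 0, 20, 3), Vessel("B", 1, 12, 2), Vessel("C", 4, 30, 4)]
for name, berth, start, cranes, end in schedule(vessels):
    print(f"{name}: berth {berth}, start {start:.1f} h, {cranes} cranes, depart {end:.1f} h")
```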

Back Analysis of Tehran Metro Tunnel Construction Using FLAC-3D

An important aspect of planning for shallow tunneling under urban areas is the determination of likely surface movements and of the interaction with existing structures. Back analysis of constructed tunnels whose settlement magnitudes are available can help designers achieve greater accuracy in future projects. In this paper, a single Tehran Metro tunnel (west of Hor Square, Jang University Street) was selected. First, the surface settlements of this tunnel were measured in situ. The tunnel was then modeled using the commercial finite difference software FLAC-3D. Finally, the modeling results and the in-situ measurements were compared for verification.
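
As a hedged illustration of how measured and modeled settlements can be compared, the sketch below uses Peck's empirical Gaussian settlement trough as a stand-in for the numerical results; all settlement values are hypothetical, not the Tehran measurements.

```python
import math

# A hedged, illustrative check often used alongside numerical models: Peck's
# empirical Gaussian settlement trough  S(x) = S_max * exp(-x^2 / (2 i^2)),
# where i is the trough width parameter. Values below are hypothetical and are
# not the measurements reported for the Tehran Metro tunnel.

def gaussian_trough(x, s_max, i):
    """Surface settlement at transverse offset x from the tunnel centreline."""
    return s_max * math.exp(-x**2 / (2 * i**2))

def rmse(measured, modelled):
    """Root-mean-square error between measured and modelled settlements."""
    return math.sqrt(sum((m - c) ** 2 for m, c in zip(measured, modelled)) / len(measured))

offsets = [0, 5, 10, 15, 20]                        # m
measured = [28.0, 22.5, 11.0, 4.0, 1.5]             # mm (hypothetical)
modelled = [gaussian_trough(x, s_max=27.0, i=9.0) for x in offsets]
print([round(s, 1) for s in modelled], "RMSE =", round(rmse(measured, modelled), 2), "mm")
```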

A Study of Priority Evaluation and Resource Allocation for Revitalization of Cultural Heritages in the Urban Development

Proper maintenance and preservation of significant cultural heritage sites and historic buildings is necessary. It not only enhances environmental benefits and a sense of community but also preserves a city's history and people's memory, allowing the next generation to glimpse our past and achieving the goal of sustainably preserved cultural assets. However, maintenance work has so far not been managed appropriately for many designated heritage sites and historic buildings, and the planning and implementation of reuse has yet to achieve a breakthrough; as a result, heritage sites are merely "reserved" rather than genuinely "conserved". The restoration and preservation of cultural heritage is an important research issue because of considerations of historical significance, symbolism, and economic benefit. However, decision makers, such as officials in the public sector, often face the question of which heritage site should be restored first under limited budgets. Very few techniques are available today to determine appropriate restoration priorities for diverse historical heritage sites, perhaps because no systematized decision-making aid has been proposed before. In the past, discussions of the management and maintenance of cultural assets were limited to the selection of reuse alternatives rather than the allocation of resources. In view of this, this research adopts integrated research methods to solve the problems that decision makers encounter when allocating resources for the management and maintenance of heritage sites and historic buildings. The purpose of this study is to develop a sustainable decision-making model for local governments to resolve these problems. We propose an alternative decision support model to prioritize restoration needs within limited budgets. The model is constructed using the fuzzy Delphi method, the fuzzy analytic network process (FANP), and goal programming (GP). To avoid misallocating resources, the proposed procedure takes the views of multiple stakeholders, limited costs, and available resources into consideration, and combines many factors and goals to find the highest-priority feasible solution. To illustrate the proposed approach, seven cultural heritage sites in Taipei City are used as an empirical study, and the results are analysed in depth to explain the application of the approach.
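
To illustrate the resource-allocation step in a simplified way, the sketch below selects heritage sites under a budget using a 0/1 knapsack formulation; the priority weights and costs are invented, and the fuzzy Delphi, FANP, and GP components of the actual model are not reproduced.

```python
# A simplified, hypothetical illustration of budget-constrained prioritization
# (a 0/1 knapsack formulation), standing in for the paper's fuzzy Delphi /
# FANP / goal programming model. Weights and costs are invented.

SITES = [  # (name, FANP-style priority weight, restoration cost in million NTD)
    ("Site A", 0.22, 40), ("Site B", 0.18, 25), ("Site C", 0.15, 30),
    ("Site D", 0.14, 20), ("Site E", 0.12, 35), ("Site F", 0.10, 15),
    ("Site G", 0.09, 10),
]
BUDGET = 80

def select_sites(sites, budget):
    # Dynamic programming over exact amounts spent; each state keeps the best
    # total priority weight achievable at that spend and the chosen sites.
    best = {0: (0.0, [])}
    for name, weight, cost in sites:
        for spent, (value, chosen) in sorted(best.items(), reverse=True):
            if spent + cost <= budget and best.get(spent + cost, (0,))[0] < value + weight:
                best[spent + cost] = (value + weight, chosen + [name])
    return max(best.values())

value, chosen = select_sites(SITES, BUDGET)
print("Selected:", chosen, "total priority weight:", round(value, 2))
```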

On the Computation of a Common n-finger Robotic Grasp for a Set of Objects

Industrial robotic arms utilize multiple end-effectors, each for a specific part and a specific task. We propose a novel algorithm that defines a single end-effector configuration able to grasp a given set of objects with different geometries. The algorithm will be of great benefit in production lines, allowing a single robot to grasp various parts and hence reducing the number of end-effectors needed. Moreover, the algorithm will reduce end-effector design and manufacturing time and final product cost. The algorithm searches for a common grasp over the set of objects. It maps all possible grasps for each object that satisfy a quality criterion, taking into account possible external wrenches (forces and torques) applied to the object. The mapped grasps are represented by high-dimensional feature vectors that describe the shape of the gripper. We generate a database of all possible grasps for each object in the feature space and then use a search and classification algorithm to intersect the possible grasps over all parts and find a single common grasp suitable for all objects. We present simulations of planar and spatial objects to validate the feasibility of the approach.
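
As a toy illustration of the common-grasp search, the sketch below intersects small sets of grasp feature vectors across objects; the vectors and tolerance are invented, and the quality criterion and wrench analysis of the full algorithm are omitted.

```python
# A hedged, illustrative sketch of finding a "common grasp": each object has a
# set of feasible grasp feature vectors (here 3-D, invented numbers); a grasp
# from the first object is accepted as common if every other object has a
# feasible grasp within a small distance in feature space. This is a toy
# stand-in for the paper's database search and classification step.

import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def common_grasp(grasp_sets, tol=0.1):
    for candidate in grasp_sets[0]:
        if all(any(dist(candidate, g) <= tol for g in other)
               for other in grasp_sets[1:]):
            return candidate
    return None

object_a = [(0.10, 0.30, 0.90), (0.50, 0.20, 0.40)]
object_b = [(0.12, 0.28, 0.88), (0.80, 0.10, 0.30)]
object_c = [(0.09, 0.33, 0.92)]
print(common_grasp([object_a, object_b, object_c]))
```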

Comparison of Two Types of Preconditioners for Stokes and Linearized Navier-Stokes Equations

Several preconditioners have been proposed for solving saddle point systems efficiently. There are many methods for constructing preconditioners for linear systems arising from saddle point problems; for instance, the relaxed dimensional factorization (RDF) preconditioner and the augmented Lagrangian (AL) preconditioner are used for both steady and unsteady Navier-Stokes equations. In this paper we compare the RDF preconditioner with the modified AL (MAL) preconditioner to show which is more effective for solving the Navier-Stokes equations. Numerical experiments indicate that the MAL preconditioner is more efficient and robust, especially for moderate viscosities and stretched grids in steady problems. For unsteady cases, the convergence rate of the RDF preconditioner is slightly faster than that of the MAL preconditioner in some circumstances, but the parameter of the RDF preconditioner is more sensitive than that of the MAL preconditioner. Moreover, the convergence rate of the MAL preconditioner remains quite acceptable. Therefore we conclude that the MAL preconditioner is more competitive than the RDF preconditioner. The experiments are implemented with the IFISS package.
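
For reference, the block below states the generic saddle point system and a generic block upper triangular preconditioner as commonly written in the literature; the exact RDF and MAL definitions used in the paper are not reproduced.

```latex
% Generic saddle point system arising from a discretized (Navier-)Stokes problem:
\begin{equation}
  \mathcal{A}x \equiv
  \begin{pmatrix} A & B^{T} \\ B & 0 \end{pmatrix}
  \begin{pmatrix} u \\ p \end{pmatrix}
  =
  \begin{pmatrix} f \\ g \end{pmatrix}.
\end{equation}
% Both families of preconditioners compared in the paper can be viewed as
% approximate block factorizations of $\mathcal{A}$; a generic block upper
% triangular preconditioner has the form
\begin{equation}
  \mathcal{P} =
  \begin{pmatrix} \hat{A} & B^{T} \\ 0 & \hat{S} \end{pmatrix},
\end{equation}
% where $\hat{A} \approx A$ and $\hat{S}$ approximates the Schur complement
% $S = -\,B A^{-1} B^{T}$. The RDF and (M)AL preconditioners differ in how
% $\hat{A}$ and $\hat{S}$ are constructed (e.g., the AL approach works with an
% augmented block of the form $A + \gamma B^{T} W^{-1} B$ for a parameter
% $\gamma > 0$ and a weight matrix $W$, often the pressure mass matrix).
```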

The Implementation of the Multi-Agent Classification System (MACS) in Compliance with FIPA Specifications

The paper discusses the implementation of the Multi-Agent Classification System (MACS) and its use to provide an automated and accurate classification of end users developing applications in the spreadsheet domain. Different technologies have been brought together to build MACS. The strength of the system is the integration of agent technology and the FIPA specifications with other technologies, namely .NET Windows service based agents, Windows Communication Foundation (WCF) services, the Service Oriented Architecture (SOA), and Oracle Data Mining (ODM). Microsoft's .NET Windows service based agents were utilized to develop the monitoring agents of MACS, while the .NET WCF services, together with the SOA approach, allowed distribution of and communication between agents over the WWW. The Monitoring Agents (MAs) were configured to execute automatically to monitor Excel spreadsheet development activities by content. Data gathered by the Monitoring Agents from various resources over a period of time is collected and filtered by a Database Updater Agent (DUA) residing in the .NET client application of the system. This agent then transfers and stores the data in the Oracle server database via Oracle stored procedures for further processing that leads to the classification of the end-user developers.
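
The sketch below is a purely conceptual rendering of the data flow from the monitoring agents to the Database Updater Agent, written in plain Python rather than the .NET Windows services, WCF, SOA, and Oracle stack actually used by MACS; all event fields and the filtering rule are invented.

```python
# A purely conceptual sketch of the data flow described above (monitoring
# agents -> database updater agent -> store -> classification), written in
# plain Python rather than the .NET Windows services, WCF, and Oracle stack
# actually used by MACS. Event fields and the filter rule are invented.

from dataclasses import dataclass

@dataclass
class SpreadsheetEvent:
    user: str
    workbook: str
    action: str        # e.g. "formula_edit", "macro_run"

class MonitoringAgent:
    """Stand-in for a MACS monitoring agent watching spreadsheet activity."""
    def __init__(self):
        self.buffer = []
    def observe(self, event: SpreadsheetEvent):
        self.buffer.append(event)
    def flush(self):
        events, self.buffer = self.buffer, []
        return events

class DatabaseUpdaterAgent:
    """Stand-in for the DUA: filters agent data and hands it to the store."""
    def __init__(self, store):
        self.store = store
    def collect(self, agents):
        for agent in agents:
            for e in agent.flush():
                if e.action != "noise":          # trivial filtering rule
                    self.store.append(e)

store = []
agents = [MonitoringAgent(), MonitoringAgent()]
agents[0].observe(SpreadsheetEvent("alice", "budget.xlsx", "formula_edit"))
agents[1].observe(SpreadsheetEvent("bob", "plan.xlsx", "noise"))
DatabaseUpdaterAgent(store).collect(agents)
print(store)     # events that would be stored for later classification
```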

Using Multi-Linguistic Techniques for Thailand Herb and Traditional Medicine Registration Systems

Thailand has evolved much unique culture and knowledge, foremost among which is Thai traditional medicine (TTM). Recently, a number of researchers have tried to preserve this indigenous knowledge; however, systems for doing so are still scant. To preserve this ancient knowledge, we therefore integrated multi-linguistic techniques to create a system that collects all the recipes. This application extracts medical recipes from antique scriptures and then normalizes their antiquarian words, archaic grammar, and antiquated measurements to modern equivalents. We then apply ingredient-duplication calculation, proportion-similarity calculation, and score ranking to detect duplicate recipes. Questionnaires were collected from registrants and the public to investigate user satisfaction, and satisfactory results were found. This application assists not only registrants in validating copyright violations in the TTM registration process but also the public in treating their illnesses, helping both Thai people and all mankind fight intractable diseases.
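
As a hedged illustration of the duplicate-recipe check, the sketch below combines an ingredient-duplication score with a proportion-similarity score and ranks the result; the weights and recipes are invented, and the registration system's exact formulas are not reproduced.

```python
# A hedged, illustrative sketch of the duplicate-recipe check: ingredient
# overlap (a Jaccard-style ingredient-duplication score) combined with a
# proportion-similarity score, then ranked. The exact formulas and weights
# used by the registration system are not reproduced; the recipes below are
# invented.

def ingredient_duplication(r1, r2):
    """Fraction of shared ingredients (Jaccard index on ingredient names)."""
    a, b = set(r1), set(r2)
    return len(a & b) / len(a | b)

def proportion_similarity(r1, r2):
    """1 minus the mean absolute difference of proportions over shared ingredients."""
    shared = set(r1) & set(r2)
    if not shared:
        return 0.0
    return 1.0 - sum(abs(r1[i] - r2[i]) for i in shared) / len(shared)

def duplication_score(r1, r2, w_ing=0.6, w_prop=0.4):
    return w_ing * ingredient_duplication(r1, r2) + w_prop * proportion_similarity(r1, r2)

registered = {"Recipe X": {"ginger": 0.5, "turmeric": 0.3, "honey": 0.2}}
candidate = {"ginger": 0.45, "turmeric": 0.35, "licorice": 0.2}
ranking = sorted(((duplication_score(candidate, r), name) for name, r in registered.items()),
                 reverse=True)
print(ranking)
```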

An Application of Data Mining Methods with Decision Rules

Rankings of the output of China's main agricultural commodities in the world for 1978, 1980, 1990, 2000, 2006, 2007 and 2008 have been released in the United Nations FAO database. Unfortunately, the ranking of the output of Chinese cotton lint in the world for 2008 is missing. This paper uses sequential data mining methods with decision rules to fill this gap. This new data mining method will help to further improve the United Nations FAO database.
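
As a minimal illustration of rule-based gap filling, the sketch below imputes a missing rank from a short history using simple decision rules; the year-rank pairs are invented, not FAO data, and the paper's sequential method is not reproduced.

```python
# A minimal, hypothetical sketch of rule-based imputation of a missing ranking:
# if the ranking has been stable (or monotone) over recent years, a decision
# rule carries the trend forward. The year/rank pairs below are invented and
# are not FAO data; the paper's actual sequential decision rules are not
# reproduced here.

def impute_missing_rank(history):
    """history: list of (year, rank) sorted by year, with the last rank unknown."""
    ranks = [r for _, r in history if r is not None]
    if len(set(ranks[-3:])) == 1:          # rule 1: stable over the last 3 known years
        return ranks[-1]
    diffs = [b - a for a, b in zip(ranks, ranks[1:])]
    if all(d <= 0 for d in diffs) or all(d >= 0 for d in diffs):
        return ranks[-1] + diffs[-1]       # rule 2: extend a monotone trend
    return round(sum(ranks[-3:]) / 3)      # fallback: local average

history = [(2000, 1), (2006, 1), (2007, 1), (2008, None)]
print(impute_missing_rank(history))        # -> 1 under rule 1
```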

A Car Parking Monitoring System Using Wireless Sensor Networks

This paper presents a car parking monitoring system using wireless sensor networks. Multiple sensor nodes, a sink node, a gateway, and a server constitute a wireless network for monitoring a parking lot. Each sensor node is equipped with a 3-axis AMR sensor and deployed in the center of a parking space. Each sensor node reads its sensor values periodically and transmits the data to the sink node if the current and immediately preceding sensor values differ by more than a threshold value. The sensor nodes and the sink node use the 448 MHz band for wireless communication. Since RF transmission occurs only when the sensor values show abrupt changes, the number of RF transmissions is reduced and battery power is conserved. The data from the sensor nodes reach the server via the sink node and the gateway. The server determines which parking spaces are occupied by cars based on the received sensor data and reference values. The reference values are the average sensor values measured by each sensor node when the corresponding parking space is not occupied by a vehicle. Because the decision making is done by the server, the computational burden of the sensor nodes is relieved, which helps reduce their duty cycle.
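
The sketch below illustrates the two decision steps described above: threshold-triggered transmission on the node and reference-based occupancy detection on the server. All thresholds and readings are hypothetical, not the deployed system's values.

```python
# A hedged sketch of the two decision steps described above: (i) a node
# transmits only when the change between consecutive magnetometer readings
# exceeds a threshold, and (ii) the server flags a space as occupied when the
# reading deviates from its empty-space reference. Thresholds and readings are
# hypothetical, not the values used in the deployed system.

THRESHOLD = 60        # transmit trigger on the node (arbitrary units)
OCCUPIED_MARGIN = 80  # server-side deviation from the empty reference

def node_should_transmit(prev, curr):
    """3-axis AMR readings: transmit only on an abrupt change."""
    return max(abs(c - p) for c, p in zip(curr, prev)) > THRESHOLD

def server_is_occupied(reading, reference):
    """Compare the received reading with the empty-space reference values."""
    return max(abs(r - ref) for r, ref in zip(reading, reference)) > OCCUPIED_MARGIN

reference = (510, 498, 520)          # averages measured with the space empty
prev, curr = (512, 500, 519), (430, 610, 505)
if node_should_transmit(prev, curr):
    print("occupied" if server_is_occupied(curr, reference) else "free")
```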

Effect of Processing Methods on Texture Evolution in AZ31 Mg Alloy Sheet

Textures of AZ31 Mg alloy sheets were evaluated using the neutron diffraction method in this study. The AZ31 sheets were fabricated either by conventional casting followed by hot rolling or by strip casting. The effect of warm rolling was investigated using the AZ31 Mg alloy sheet produced by conventional casting. Warm rolling with a 30% thickness reduction per pass was possible without any side cracking at temperatures as low as 200 °C at a roll speed of 30 m/min. The initial microstructure of the conventionally cast specimen was found to be partially recrystallized. Grain refinement was found to occur actively during warm rolling. The (0002), (10-10), (10-11), and (10-12) complete pole figures were measured using the HANARO FCD (Neutron Four Circle Diffractometer), and orientation distribution functions (ODFs) were calculated. The major texture of all specimens can be expressed as an ND//(0001) fiber texture. The texture of the hot-rolled specimen showed the strongest fiber component, while that of the strip-cast sheet was close to a random distribution.

A Consideration of the Achievement of Productive Level Parallel Programming Skills

This paper considers the achievement of productive-level parallel programming skills, based on data from graduation studies at the Polytechnic University of Japan. The data show that, if the programming environment is limited to GPGPUs, most students can acquire parallel programming skills during the graduation study (about 600 to 700 hours). However, the data also show that reaching a productive level of parallel programming skill within the graduation study alone is a very demanding task for a student. In addition, the results suggest that parallel programming environments for GPGPU, such as CUDA and OpenCL, may be more suitable for parallel computing education than other environments such as MPI on a cluster system and the Cell.B.E. These results should be useful not only for software development but also for the development of hardware products using computer technologies.

Partial Purification of Cytotoxic Peptides against Gastric Cancer Cells from Protein Hydrolysate of Euphorbia hirta Linn.

Protein hydrolysates prepared from a number of medicinal plants are promising sources of various bioactive peptides. In this work, proteins from the dried whole plant of Euphorbia hirta Linn. were extracted and digested with pepsin for 12 h. Hydrolysates smaller than 3 kDa were fractionated using a cut-off membrane. The peptide hydrolysate was then purified by anion-exchange chromatography on a DEAE-Sephacel™ column and reverse-phase chromatography on a Sep-Pak C18 column, respectively. The cytotoxic effect of each peptide fraction against a gastric carcinoma cell line (KATO-III, ATCC No. HTB103) was investigated using the colorimetric MTT viability assay. A human liver cell line (Chang Liver, CLS No. 300139) was used as a normal control cell line. Two purified peptide peaks, peak I and peak II, at 100 µg peptides mL-1 reduced the cell viability of the gastric cancer cell line to 63.85±4.94% and 66.92±6.46%, respectively. Our results show for the first time that peptide fractions derived from the protein hydrolysate of Euphorbia hirta Linn. have anti-gastric cancer activity, offering a potential novel and natural anti-gastric cancer remedy.

Designing Software Quality Measurement System for Telecommunication Industry Using Object-Oriented Technique

A number of software quality measurement systems have been implemented over the past few years, but none of them focuses on the telecommunication industry. The software quality measurement system proposed here calculates the quality value of measured software specifically for the telecommunication industry. Before designing the system, quality factors, quality attributes, and quality metrics were identified based on a literature review and a survey. Then, using the identified quality factors, attributes, and metrics, a quality model for the telecommunication industry was constructed. Each identified quality metric has its own formula. The quality value of the system under measurement is computed from the quality metrics and aggregated by referring to the quality model, and the quality level of the software is then classified based on the Net Satisfaction Index (NSI). The system was designed using an object-oriented approach in a web-based environment. The existence of such a software quality measurement system is important to both developers and users in producing high-quality software products for the telecommunication industry.
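
As a hedged illustration of the aggregation step, the sketch below rolls metric values up through a weighted quality model and maps the result to a quality level; the factor weights, metric names, and NSI-style thresholds are invented.

```python
# A hedged sketch of the aggregation step: metric values roll up to factor
# scores using weights from a quality model, and the overall score is mapped
# to a quality level. The weights, metric names, and NSI thresholds below are
# hypothetical; the paper's actual formulas are not reproduced.

QUALITY_MODEL = {
    "Reliability": {"weight": 0.4, "metrics": {"defect_density": 0.6, "mtbf_norm": 0.4}},
    "Usability":   {"weight": 0.3, "metrics": {"task_success": 0.7, "help_calls_norm": 0.3}},
    "Efficiency":  {"weight": 0.3, "metrics": {"response_time_norm": 1.0}},
}

def aggregate(metric_values):
    total = 0.0
    for factor in QUALITY_MODEL.values():
        score = sum(w * metric_values[m] for m, w in factor["metrics"].items())
        total += factor["weight"] * score
    return total  # in [0, 1] if all metric values are normalized to [0, 1]

def quality_level(score):
    # Hypothetical NSI-style banding.
    return "high" if score >= 0.8 else "medium" if score >= 0.6 else "low"

values = {"defect_density": 0.7, "mtbf_norm": 0.9, "task_success": 0.8,
          "help_calls_norm": 0.6, "response_time_norm": 0.75}
s = aggregate(values)
print(round(s, 3), quality_level(s))
```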

Application of Particle Swarm Optimization for Economic Load Dispatch and Loss Reduction

This paper proposes a particle swarm optimization (PSO) technique to solve economic load dispatch (ELD) problems. For the ELD problem in this work, the objective is to minimize the total fuel cost of all generating units for a given daily load pattern, while the main constraints are the power balance and the generation output limits of each unit. A case study on a test system of 40 generating units with 6 load patterns is presented to demonstrate the performance of PSO in solving the ELD problem. The optimal solution given by PSO provides the minimum total cost of generation while satisfying all the constraints and achieving substantial savings through power loss reduction.
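
The following is a minimal PSO sketch for a tiny ELD instance with quadratic fuel cost curves and a power balance penalty; the coefficients, limits, and demand are invented and far smaller than the 40-unit test system used in the paper.

```python
import random

# A minimal PSO sketch for a tiny ELD instance (3 units, quadratic fuel cost
# a_i + b_i*P_i + c_i*P_i^2), with the power balance constraint handled by a
# penalty term. Coefficients, limits, and the 300 MW demand are invented and
# much smaller than the paper's 40-unit system.

UNITS = [  # (a, b, c, Pmin, Pmax)
    (100, 2.00, 0.0080,  50, 200),
    (120, 1.80, 0.0090,  50, 200),
    ( 80, 2.20, 0.0070,  40, 150),
]
DEMAND, PENALTY = 300.0, 1000.0

def cost(p):
    fuel = sum(a + b * x + c * x * x for (a, b, c, _, _), x in zip(UNITS, p))
    return fuel + PENALTY * abs(sum(p) - DEMAND)

def pso(n_particles=30, iters=300, w=0.7, c1=1.5, c2=1.5):
    dim = len(UNITS)
    pos = [[random.uniform(u[3], u[4]) for u in UNITS] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], UNITS[d][3]), UNITS[d][4])
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=cost)
    return gbest, cost(gbest)

print(pso())
```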

Analysis of GI/M(n)/1/N Queue with Single Working Vacation and Vacation Interruption

This paper presents a finite-buffer renewal-input queue with a single working vacation and vacation interruption, with state-dependent services and state-dependent vacations, which has a wide range of applications in several areas including manufacturing and wireless communication systems. Service times during the busy period and the vacation period, as well as the vacation times, are exponentially distributed and state dependent. As a result of the finite waiting space, state-dependent services, and state-dependent vacation policies, the analysis of these queueing models needs special attention. We provide a recursive method using the supplementary variable technique to compute the stationary queue-length distributions at pre-arrival and arbitrary epochs. An efficient computational algorithm is presented that is fast, accurate, and easy to implement. Various performance measures are discussed. Finally, some special cases and numerical results are presented in the form of tables and graphs.
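
As a hedged numerical illustration, the sketch below solves only a simpler Markovian building block, a state-dependent M/M(n)/1/N queue, from its generator matrix; working vacations and vacation interruption would add a phase dimension and are omitted, and all rates are invented.

```python
import numpy as np

# A hedged numerical sketch: the paper computes stationary queue-length
# distributions recursively via the supplementary variable technique; here we
# only illustrate a simpler Markovian building block, a state-dependent
# M/M(n)/1/N queue solved from its generator matrix (pi Q = 0, sum pi = 1).
# Working vacations and vacation interruption would add a phase dimension to
# the state space and are omitted. All rates below are invented.

N = 5                     # finite buffer size
lam = 1.0                 # arrival rate
mu = [0.0, 0.8, 1.0, 1.2, 1.4, 1.5]   # state-dependent service rates mu_n

Q = np.zeros((N + 1, N + 1))
for n in range(N + 1):
    if n < N:
        Q[n, n + 1] = lam          # arrival
    if n > 0:
        Q[n, n - 1] = mu[n]        # state-dependent service completion
    Q[n, n] = -Q[n].sum()

# Solve pi Q = 0 together with the normalization condition sum(pi) = 1.
A = np.vstack([Q.T, np.ones(N + 1)])
b = np.zeros(N + 2)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(pi, 4), "blocking probability:", round(pi[-1], 4))
```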

Reliability Approximation through the Discretization of Random Variables using Reversed Hazard Rate Function

Sometimes it is difficult to determine the exact reliability of complex systems by analytical procedures. An approximate solution to this problem can be provided through discretization of random variables. In this paper we describe the usefulness of discretizing a random variable using the reversed hazard rate function of its continuous version. Discretization of the exponential distribution is demonstrated, and applications of this approach are also cited. Numerical calculations indicate that the proposed approach gives a very good approximation of the reliability of complex systems under a stress-strength set-up. The performance of the proposed approach is better than that of the existing discrete concentration method of discretization. The approach is conceptually simple, handles analytic intractability, and reduces computational time. It can be applied in manufacturing industries for producing highly reliable items.
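
As a hedged illustration of the stress-strength set-up, the sketch below approximates R = P(strength > stress) by discretizing two exponential variables on a grid using simple interval probabilities rather than the paper's reversed-hazard-rate scheme; the rates and grid step are invented, and the exact exponential value is printed for comparison.

```python
import math

# A hedged sketch of stress-strength reliability R = P(strength X > stress Y)
# approximated by discretizing both exponential random variables on a grid.
# For simplicity the sketch assigns each grid cell its interval probability
# rather than using the paper's reversed-hazard-rate-based discretization;
# the rates and grid step are invented.

def discretize_exponential(rate, h=0.05, n_cells=400):
    """P(X = k): probability mass of the interval [k*h, (k+1)*h)."""
    cdf = lambda x: 1.0 - math.exp(-rate * x)
    return [cdf((k + 1) * h) - cdf(k * h) for k in range(n_cells)]

def stress_strength_reliability(p_strength, p_stress):
    """R ~= sum_k P(Y in cell k) * P(X in a strictly higher cell)."""
    n = len(p_strength)
    tails = [0.0] * (n + 1)
    for k in range(n - 1, -1, -1):          # survival function of the strength
        tails[k] = tails[k + 1] + p_strength[k]
    return sum(py * tails[k + 1] for k, py in enumerate(p_stress))

lam_x, lam_y = 0.5, 1.5                      # strength and stress rates (hypothetical)
approx = stress_strength_reliability(discretize_exponential(lam_x),
                                     discretize_exponential(lam_y))
print("discrete approx:", round(approx, 4),
      "exact:", round(lam_y / (lam_x + lam_y), 4))   # exact value for exponentials
```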