Sulphur-Mediated Precipitation of Pt/Fe/Co/Cr Ions in Liquid-Liquid and Gas-Liquid Chloride Systems

Proof-of-concept experiments were conducted to determine the feasibility of using small amounts of dissolved sulphur (DS) from the gaseous phase to precipitate platinum ions in chloride media. Two sets of precipitation experiments were performed in which the source of sulphur atoms was either a thiosulphate solution (Na2S2O3) or sulphur dioxide gas (SO2). In the liquid-liquid (L-L) system, complete precipitation of Pt was achieved at small dosages of Na2S2O3 (0.01 – 1.0 M) within 3-5 minutes. On the basis of this result, gas absorption tests were carried out mainly to achieve a sulphur solubility equivalent to 0.018 M. The prospect of recovering large amounts of precious metals selectively from their dilute solutions by utilizing waste SO2 streams at low pressure is attractive from both economic and environmental points of view. Therefore, the mass transfer characteristics of SO2 gas associated with reactive absorption across the gas-liquid (G-L) interface were evaluated under different conditions of pressure (0.5 – 2 bar), solution temperature (20 – 50 °C) and acid strength (1 – 4 M HCl). This paper concludes with information on the selective precipitation of Pt in the presence of other cations (Fe2+, Co2+, and Cr3+) in a CSTR and recommendations for scaling up laboratory data to industrial pilot-scale operations.
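
The absorption-rate description that such an evaluation typically relies on is the two-film model; the expression below is only the standard textbook form, included as background and stated as an assumption, since the abstract does not say which mass transfer correlation the authors used.

```latex
% Two-film description of the SO2 absorption flux across the G-L interface
% (assumed background, not the authors' stated correlation):
\[
  N_{\mathrm{SO_2}} \;=\; k_L a\,\bigl(C^{*}_{\mathrm{SO_2}} - C_{\mathrm{SO_2,bulk}}\bigr),
  \qquad
  C^{*}_{\mathrm{SO_2}} \;=\; \frac{p_{\mathrm{SO_2}}}{H},
\]
% where $k_L a$ is the volumetric liquid-side mass transfer coefficient,
% $C^{*}_{\mathrm{SO_2}}$ the interfacial concentration given by Henry's law,
% and $H$ the Henry constant, which depends on temperature and acid strength.
```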

Generational Pipelined Genetic Algorithm (PLGA) Using Stochastic Selection

In this paper, a pipelined version of the genetic algorithm, called PLGA, and a corresponding hardware platform are described. The basic operations of the conventional GA (CGA) are pipelined using an appropriate selection scheme. The selection operator used here is stochastic in nature and is called SA-selection. This helps maintain the basic generational nature of the proposed pipelined GA (PLGA). A number of benchmark problems are used to compare the performance of conventional roulette-wheel selection and SA-selection. These include unimodal and multimodal functions with dimensionality varying from very small to very large. The SA-selection scheme gives performance comparable to the classical roulette-wheel selection scheme in all instances when quality of solutions and rate of convergence are considered. The speedups obtained by PLGA for different benchmarks are found to be significant. It is shown that a complete hardware pipeline can be developed using the proposed scheme if parallel evaluation of the fitness expression is possible. In this connection, a low-cost but very fast hardware evaluation unit is described. Results of simulation experiments show that in a pipelined hardware environment, PLGA will be much faster than CGA. In terms of efficiency, PLGA is also found to outperform the parallel GA (PGA).
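
The abstract does not reproduce the selection operator itself; the minimal Python sketch below contrasts classical roulette-wheel selection with a simulated-annealing-style stochastic (SA) acceptance rule in which a candidate replaces the current individual with a Boltzmann probability. The function names, the acceptance rule, and the temperature handling are illustrative assumptions, not the authors' exact formulation.

```python
import math
import random

def roulette_wheel(population, fitness):
    """Classical fitness-proportionate selection."""
    total = sum(fitness)
    pick = random.uniform(0.0, total)
    running = 0.0
    for individual, f in zip(population, fitness):
        running += f
        if running >= pick:
            return individual
    return population[-1]

def sa_selection(current_fit, candidate_fit, temperature):
    """SA-style acceptance (assumed form): always take a better candidate,
    take a worse one with probability exp(-delta / T).  Returns True if the
    candidate should replace the current individual (maximization)."""
    delta = current_fit - candidate_fit
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / max(temperature, 1e-9))
```

Because the SA acceptance test needs only the two fitness values being compared, it can be evaluated as soon as an individual leaves the fitness stage, which is what allows the selection stage to sit in a pipeline instead of waiting for the whole generation, as the paper describes.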

An Analysis of Economic Capital Allocation of Global Banks

There are three main ways of categorizing capital in banking operations: accounting, regulatory, and economic capital. However, the 2008-2009 global crisis showed that none of these categories adequately reflects the real risks of bank operations, especially in light of the failures of Bear Stearns, Lehman Brothers, and Northern Rock. This paper deals with the economic capital allocation of global banks. In theory, economic capital should reflect the real risks of a bank and should be publicly available. Yet, as discovered during the global financial crisis, even when economic capital information was publicly disclosed, the underlying assumptions rendered the information useless. Specifically, some global banks that reported relatively high levels of economic capital before the crisis went bankrupt or had to be bailed out by their governments. Moreover, only 15 out of 50 global banks reported their economic capital during the 2007-2010 period. In this paper, we analyze the changes in reported bank economic capital disclosure during this period. We conclude that the relative shares of credit and business risks increased in 2010 compared to 2007, while the shares of operational and market risks in the total economic capital of top-rated global banks decreased. Generally speaking, higher levels of disclosure and transparency of bank operations are required to gain more confidence from stakeholders. Moreover, additional risks such as liquidity risk should be included in these disclosures.

Maya Semantic Technique: A Mathematical Technique Used to Determine Partial Semantics for Declarative Sentences

This research uses computational linguistics, an area of study that employs a computer to process natural language, and aims at discerning the patterns that exist in declarative sentences used in technical texts. The approach is mathematical, and the focus is on instructional texts found on web pages. The technique, developed by the author and named the MAYA Semantic Technique, is organized into four stages. In the first stage, the parts of speech in each sentence are identified. In the second stage, the subject of the sentence is determined. In the third stage, MAYA performs a frequency analysis on the remaining words to determine the verb and its object. In the fourth stage, MAYA performs a statistical analysis to determine the content of the web page. The advantage of the MAYA Semantic Technique lies in its use of mathematical principles to represent grammatical operations, which aid processing and accuracy when applied to unambiguous text. The MAYA Semantic Technique is part of a proposed architecture for an entire web-based intelligent tutoring system. On a sample set of sentences, the partial semantics derived using the MAYA Semantic Technique were approximately 80% accurate. The system currently processes technical text in one domain, namely Cµ programming. In this domain all the keywords and programming concepts are known and understood.
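
As a rough illustration of stages one to three, the sketch below tags a declarative sentence with a toy part-of-speech lexicon, takes the first noun as the subject, and uses a frequency count over the remaining words to pick the verb and object. The lexicon, the tagging rule, and the scoring are placeholders for illustration only and do not reproduce the MAYA Semantic Technique itself.

```python
from collections import Counter

# Toy lexicon standing in for a real part-of-speech tagger (stage 1).
LEXICON = {
    "the": "DET", "compiler": "NOUN", "translates": "VERB",
    "source": "NOUN", "code": "NOUN", "into": "PREP", "machine": "NOUN",
}

def partial_semantics(sentence):
    words = sentence.lower().strip(".").split()
    tags = [(w, LEXICON.get(w, "NOUN")) for w in words]          # stage 1
    subject = next(w for w, t in tags if t == "NOUN")            # stage 2
    rest = [(w, t) for w, t in tags if w != subject]
    verb = next((w for w, t in rest if t == "VERB"), None)       # stage 3
    freq = Counter(w for w, t in rest if t == "NOUN")
    obj = freq.most_common(1)[0][0] if freq else None
    return {"subject": subject, "verb": verb, "object": obj}

print(partial_semantics("The compiler translates the source code into machine code."))
```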

Optimal External Merge Sorting Algorithm with Smart Block Merging

Like other external sorting algorithms, the presented algorithm is a two-step algorithm comprising an internal step and an external step. The first step is similar to that of other algorithms, but the second step introduces a new, easily implemented method that markedly reduces the large number of input-output operations. Since reducing processor time has little effect on the overall speed of the algorithm, any improvement must come from reducing the number of input-output operations. This paper proposes a simple algorithm for choosing the correct record location in the final list. This decreases the time complexity and makes the algorithm faster.
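
The abstract does not detail the smart block-merging step itself; the sketch below only shows the general two-step shape of an external sort in Python, with sorted runs spilled to disk and a heap-based k-way merge that streams records so that I/O stays sequential. The file handling and run size are illustrative assumptions, not the authors' method.

```python
import heapq
import tempfile

def external_sort(records, run_size=1000):
    """Two-step external sort: (1) sort fixed-size runs in memory and spill
    them to temporary files, (2) k-way merge the runs with a heap, streaming
    records instead of loading everything at once."""
    # Step 1: create sorted runs on disk.
    run_files = []
    for i in range(0, len(records), run_size):
        run = sorted(records[i:i + run_size])
        f = tempfile.TemporaryFile(mode="w+")
        f.writelines(f"{r}\n" for r in run)
        f.seek(0)
        run_files.append(f)

    # Step 2: k-way merge; heapq.merge pulls one record per run at a time,
    # keeping the memory footprint small and the I/O pattern sequential.
    streams = ((int(line) for line in f) for f in run_files)
    merged = list(heapq.merge(*streams))
    for f in run_files:
        f.close()
    return merged

print(external_sort([5, 3, 9, 1, 7, 2, 8], run_size=3))
```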

Identification and Monitoring of Power System Dynamics Based on PMUs and the Wavelet Technique

Low-frequency power oscillations may be triggered by many events in the system. Most oscillations are damped by the system, but undamped oscillations can lead to system collapse. Oscillations develop as a result of rotor acceleration or deceleration following a change in active power transfer from a generator. As with operating limits, the monitoring of power system oscillating modes is a relevant aspect of power system operation and control. Unchecked low-frequency power swings can cause cascading outages that rapidly spread over a wide region. In this regard, Wide Area Monitoring, Protection and Control Systems (WAMPCS) help detect such phenomena and assess power system dynamic security. The monitoring of power system electromechanical oscillations is very important in the frame of modern power system management and control. The first part of this paper compares different techniques for the identification of power system oscillations. The second part analyzes the identification of some power system dynamic behaviors using Wide Area Monitoring Systems (WAMS) based on Phasor Measurement Units (PMUs) and the wavelet technique.
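
As a simple numerical illustration of the wavelet-based identification idea, the sketch below builds a synthetic PMU-like signal containing a damped 0.5 Hz inter-area oscillation and correlates it with unit-energy complex Morlet wavelets to locate the dominant oscillating frequency. The signal parameters and the hand-rolled wavelet are assumptions made for illustration; they do not reproduce the authors' identification procedure.

```python
import numpy as np

np.random.seed(0)
fs = 50.0                                   # assumed PMU reporting rate, samples/s
t = np.arange(0, 40, 1 / fs)
# Synthetic rotor-angle-like signal: a damped 0.5 Hz inter-area swing plus noise.
signal = np.exp(-0.05 * t) * np.sin(2 * np.pi * 0.5 * t) + 0.05 * np.random.randn(t.size)

def morlet_power(x, freq, fs, cycles=6):
    """Average power of the wavelet coefficients of x at one frequency,
    using a unit-energy complex Morlet wavelet with a fixed cycle count."""
    dur = cycles / freq
    wt = np.arange(-dur / 2, dur / 2, 1 / fs)
    wavelet = np.exp(2j * np.pi * freq * wt) * np.exp(-(wt ** 2) / (2 * (dur / 6) ** 2))
    wavelet /= np.linalg.norm(wavelet)
    coeffs = np.convolve(x, wavelet, mode="same")
    return float(np.mean(np.abs(coeffs) ** 2))

freqs = np.arange(0.2, 2.05, 0.1)
powers = [morlet_power(signal, f, fs) for f in freqs]
print("dominant mode near %.1f Hz" % freqs[int(np.argmax(powers))])
```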

Improving University Operations with Data Mining: Predicting Student Performance

The purpose of this paper is to develop models for predicting student success. These models could improve the allocation of students among colleges and optimize the newly introduced model of government subsidies for higher education. To collect data, an anonymous survey was carried out among the final-year undergraduate student population using a random sampling method. Decision trees were created, of which the two most successful at predicting student success were chosen, based on two criteria: Grade Point Average (GPA) and the time a student needs to finish the undergraduate program (time-to-degree). Decision trees have proven to be a good method for classifying student success, and they could be further improved by increasing the survey sample and developing specialized decision trees for each type of college. Such methods have great potential for use in decision support systems.
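
For readers unfamiliar with the method, the following minimal scikit-learn sketch shows how a decision tree classifier of the kind described could be trained on survey features to predict a binarized GPA outcome. The feature names and the toy data are invented for illustration and are not the authors' survey variables.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy survey data: [hours studied per week, entrance exam score, employed (0/1)]
X = [[20, 85, 0], [5, 60, 1], [15, 75, 0], [2, 55, 1],
     [25, 90, 0], [8, 65, 1], [18, 80, 0], [4, 58, 1]]
# Target: 1 = GPA above a chosen threshold, 0 = below (hypothetical labels)
y = [1, 0, 1, 0, 1, 0, 1, 0]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["hours", "exam", "employed"]))
print(tree.predict([[12, 70, 0]]))
```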

Analysis of Bit-Error-Rate (BER) in Cognitive Radio Using the PSO Algorithm

The electromagnetic spectrum is a natural resource, and efficient usage of this limited resource is a necessity for better communication. The present static frequency allocation schemes cannot accommodate the demands of the rapidly increasing number of higher-data-rate services. Therefore, dynamic usage of the spectrum must be distinguished from static usage to increase the availability of the frequency spectrum. Cognitive radio is not a single piece of apparatus but a technology that can incorporate components spread across a network. It offers great promise for improving system efficiency, spectrum utilization, and application effectiveness, while reducing interference and the complexity of usage for users. A cognitive radio is aware of its environment, internal state, and location, and autonomously adjusts its operations to achieve designed objectives. It first senses its spectral environment over a wide frequency band and then adapts its parameters to maximize spectrum efficiency with high performance. This paper focuses on the analysis of Bit-Error-Rate in cognitive radio using the Particle Swarm Optimization algorithm. The analysis is carried out both theoretically and practically, and interpreted in terms of advantages, drawbacks, and how BER affects the efficiency and performance of the communication system.
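
The abstract does not give the optimization setup; as a hedged sketch, the Python code below uses a basic particle swarm to search over transmit power (and hence SNR) for the point where the theoretical BPSK bit-error rate Q(sqrt(2*Eb/N0)) meets a target, a common textbook proxy for BER-driven parameter adaptation. The cost function, bounds, and PSO constants are assumptions, not the authors' formulation.

```python
import math
import random

def ber_bpsk(snr_linear):
    """Theoretical BPSK bit-error rate, Q(sqrt(2*Eb/N0)) = 0.5*erfc(sqrt(Eb/N0))."""
    return 0.5 * math.erfc(math.sqrt(snr_linear))

def cost(power_db, target_ber=1e-4):
    """Penalize missing the BER target, with a small penalty on power."""
    ber = ber_bpsk(10 ** (power_db / 10.0))
    return 1e6 * max(ber - target_ber, 0.0) + 0.01 * power_db

def pso(n_particles=20, iters=60, lo=0.0, hi=20.0):
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    best = pos[:]                                # per-particle best position
    gbest = min(pos, key=cost)                   # global best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vel[i] = 0.7 * vel[i] + 1.5 * r1 * (best[i] - pos[i]) + 1.5 * r2 * (gbest - pos[i])
            pos[i] = min(max(pos[i] + vel[i], lo), hi)
            if cost(pos[i]) < cost(best[i]):
                best[i] = pos[i]
            if cost(best[i]) < cost(gbest):
                gbest = best[i]
    return gbest

p = pso()
print("power %.2f dB -> BER %.2e" % (p, ber_bpsk(10 ** (p / 10.0))))
```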

The Integration between Transportation Solutions, Economic Development and Community Development as an Approach for Sustainability – A Case Study of Curitiba, Brazil

Sustainability and sustainable development have been the main theme of many international conferences, such as the UN Rio de Janeiro 1992 Earth Summit. This was followed by global conferences in the late 1990s and early 2000s that confirmed the importance of sustainable development, with a focus on economic development as an effective tool in the operations of sustainable development. Industry plays a critical role in technological innovations and research and development activities, which are crucial for the economic and social development of any country. Transportation and mobility are an important part of urban economics and the quality of life. To analyze urban transportation and its environmental impacts, a comprehensive approach is needed. This research therefore aims to apply a new approach to the development of urban communities that ensures continuity and counters deterioration. The approach integrates sustainable transport solutions with economic development and community development. For that purpose, we concentrate on one of the most sustainable cities in the world, Curitiba, Brazil, which provides a model of how to integrate sustainable transport considerations into business development, road infrastructure development, and local community development.

Innovative Development of the Food Market of Kazakhstan

Currently, one of the main directions of development in Kazakhstan is based on the clustering of economic operations, providing for the organization and concentration of production capacity in a single region or the most suitable system. In the modern economic literature, clustering is regarded as one of the most effective tools for ensuring the competitiveness of businesses and improving their operations.

Hazardous Waste Management in Transmission Line Tower Manufacturing

The manufacturing of transmission line tower parts generates hazardous waste that requires proper disposal to protect the land from pollution. In the manufacturing process, steel angles, plates, pipes, and channels pass through conventional, semi-automatic, and CNC machines for cutting, marking, punching, drilling, notching, and bending operations. All fabricated material is coated with a thin layer of zinc in a galvanizing plant, where molten zinc is used for coating. Prior to galvanizing, chemicals such as 33% concentrated HCl acid, ammonium chloride, and d-oil are used for the pretreatment of the iron. A bath of water with sodium dichromate is used for cooling and protecting the galvanized steel. Furnace oil burners are used for heating. These processes generate zinc dross, zinc ash, ETP sludge, and waste pickling acid as hazardous waste. RPG has operated a captive secured landfill (SLF) site since 1997, which has since been used for the disposal of hazardous waste. After the SLF site reached capacity, its height was raised above ground level, and it continues to be used for waste disposal; with the increased height above ground level, it remains functional without leachate or adverse impacts on the environment.

An Implicit Representation of Spherical Product for Increasing the Shape Variety of Super-quadrics in Implicit Surface Modeling

Super-quadrics can represent a set of implicit surfaces, which can furthermore be used as primitive surfaces to construct a complex object via Boolean set operations in implicit surface modeling. In fact, super-quadrics were developed to create a parametric surface by performing the spherical product on two parametric curves, and some of the resulting parametric surfaces were also represented as implicit surfaces. However, because not every parametric curve can be redefined implicitly, only implicit super-elliptic and super-hyperbolic curves can be used to perform the spherical product, and so only implicit super-ellipsoids and super-hyperboloids are developed in super-quadrics. To create implicit surfaces with more diverse shapes than super-quadrics, this paper proposes an implicit representation of the spherical product, which performs the spherical product on two implicit curves as super-quadrics do. By means of the implicit representation, many new implicit curves such as polygonal, star-shaped, and rose-shaped curves can be used to develop new implicit surfaces with a greater variety of shapes than super-quadrics, such as polyhedra, hyper-ellipsoids, super-hyperboloids, and hyper-toroids containing star-shaped and rose-shaped major and minor circles. In addition, the newly developed implicit surfaces can also be used to define new primitive implicit surfaces for constructing a more complex implicit surface in implicit surface modeling.
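
For reference, a classical super-quadric arises from the spherical product of two planar curves, and the resulting super-ellipsoid admits the well-known implicit inside-outside form below; the paper's contribution is to carry out the spherical product directly on implicit curves, so that curves without such closed forms can also be used. The notation (a1, a2, a3 for the radii, ε1, ε2 for the shape exponents) follows the usual super-quadric convention and is included only as background.

```latex
% Spherical product of two planar curves m(\eta) and h(\omega):
\[
  \mathbf{S}(\eta,\omega) \;=\; \mathbf{m}(\eta)\otimes\mathbf{h}(\omega)
  \;=\; \bigl( m_1(\eta)\,h_1(\omega),\; m_1(\eta)\,h_2(\omega),\; m_2(\eta) \bigr),
\]
% and the classical super-ellipsoid it generates has the inside-outside function
\[
  F(x,y,z) \;=\;
  \left( \left(\frac{x}{a_1}\right)^{2/\varepsilon_2}
       + \left(\frac{y}{a_2}\right)^{2/\varepsilon_2} \right)^{\varepsilon_2/\varepsilon_1}
  + \left(\frac{z}{a_3}\right)^{2/\varepsilon_1},
  \qquad \text{with the surface given by } F(x,y,z)=1 .
\]
```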

Voice Driven Applications in Non-stationary and Chaotic Environment

Automated operations based on voice commands will become more and more important in many applications, including robotics and maintenance operations. However, voice command recognition rates drop considerably in non-stationary and chaotic noise environments. In this paper, we sought to significantly improve speech recognition rates under non-stationary noise environments. First, 298 Navy acronyms were selected for automatic speech recognition. Data sets were collected under four types of noisy environments: factory, buccaneer jet, babble noise in a canteen, and destroyer. Within each noisy environment, four levels (5 dB, 15 dB, 25 dB, and clean) of Signal-to-Noise Ratio (SNR) were introduced to corrupt the speech. Second, a new algorithm to estimate speech or no-speech regions was developed, implemented, and evaluated. Third, extensive simulations were carried out. It was found that the combination of the new algorithm, the proper selection of the language model, and a customized training of the speech recognizer based on clean speech yielded very high recognition rates, between 80% and 90% for the four different noisy conditions. Fourth, extensive comparative studies were also carried out.
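
The paper's new speech/no-speech estimation algorithm is not spelled out in the abstract; as a rough, assumption-laden stand-in, the sketch below marks speech regions by comparing short-term frame energy against a noise floor tracked from the quietest frames, which is the usual starting point for energy-based voice activity detection under varying SNR.

```python
import numpy as np

def speech_regions(samples, fs, frame_ms=25, margin_db=6.0):
    """Flag frames whose energy exceeds the estimated noise floor by a
    margin; returns a boolean array with one entry per frame."""
    frame_len = int(fs * frame_ms / 1000)
    n_frames = len(samples) // frame_len
    frames = np.reshape(samples[:n_frames * frame_len], (n_frames, frame_len))
    energy_db = 10 * np.log10(np.mean(frames ** 2, axis=1) + 1e-12)
    noise_floor = np.percentile(energy_db, 10)      # quietest 10% of frames
    return energy_db > noise_floor + margin_db

# Synthetic test: 1 s of noise with a 0.3 s burst of "speech" in the middle.
fs = 16000
rng = np.random.default_rng(0)
x = 0.01 * rng.standard_normal(fs)
x[6000:10800] += 0.2 * np.sin(2 * np.pi * 200 * np.arange(4800) / fs)
print(speech_regions(x, fs).astype(int))
```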

Multirate Neural Control for AUV's Increased Situational Awareness during Diving Tasks Using Stochastic Model

This paper focuses on a critical component of situational awareness (SA): the neural control of the depth flight of an autonomous underwater vehicle (AUV). Constant-depth flight is a challenging but important task for AUVs to achieve a high level of autonomy under adverse conditions. With the SA strategy, we propose a multirate neural control of an AUV trajectory for a nontrivial mid-small size AUV “r2D4” stochastic model. This control system has been demonstrated and evaluated by simulation of diving maneuvers using the software package Simulink. From the simulation results it can be seen that the chosen AUV model is stable in the presence of noise, and it can also be concluded that the proposed research technique will be useful for fast SA of similar AUV systems in real-time search-and-rescue operations.

Risk Monitoring through Traceability Information Model

This paper presents a traceability framework for supply risk monitoring, beginning with the identification, analysis, and evaluation of supply chain risk and focusing on the supply operations of health care institutions with oncology services in Bogota, Colombia. It includes a brief presentation of the state of the art of Supply Chain Risk Management (SCRM) and traceability systems in logistics operations, and it concludes with the methodology to integrate the SCRM model with the traceability system.

Storing OWL Ontologies in SQL Relational Databases

Relational databases are often used as a basis for the persistent storage of ontologies to facilitate rapid operations such as search and retrieval, and to utilize the benefits of relational database management systems such as transaction management, security, and integrity control. At the same time, more and more OWL files containing ontologies are becoming available. Therefore, this paper proposes to extract ontologies from OWL files and then store them in relational databases. A prerequisite for this storage is the transformation of ontologies into relational databases, which is the purpose of this paper.
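
As a minimal illustration of the storage idea (not the paper's actual relational schema), the sketch below parses an OWL file into RDF triples with rdflib and stores them in a single SQLite triple table; a real transformation would map classes, properties, and individuals into dedicated tables. The file name and the example URI are placeholders.

```python
import sqlite3
from rdflib import Graph   # pip install rdflib

# Parse an OWL file (RDF/XML) into triples.  "ontology.owl" is a placeholder.
g = Graph()
g.parse("ontology.owl")

# A single triple table; a fuller schema would separate classes,
# object/data properties, and individuals into their own tables.
conn = sqlite3.connect("ontology.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS triples ("
    "  subject TEXT NOT NULL,"
    "  predicate TEXT NOT NULL,"
    "  object TEXT NOT NULL)"
)
conn.executemany(
    "INSERT INTO triples (subject, predicate, object) VALUES (?, ?, ?)",
    ((str(s), str(p), str(o)) for s, p, o in g),
)
conn.commit()

# Example retrieval: everything asserted about one resource (placeholder URI).
for row in conn.execute("SELECT predicate, object FROM triples WHERE subject = ?",
                        ("http://example.org/SomeClass",)):
    print(row)
conn.close()
```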

A Dynamically Reconfigurable Arithmetic Circuit for Complex Number and Double Precision Number

This paper proposes an architecture for a dynamically reconfigurable arithmetic circuit. Dynamic reconfiguration is a technique for realizing required functions by changing the hardware construction during operation. The proposed circuit is based on a complex number multiply-accumulate circuit, which is used frequently in the field of digital signal processing. In addition, the proposed circuit performs real number double-precision arithmetic operations. The data formats are single- and double-precision floating point numbers based on IEEE 754. The proposed circuit is designed in VHDL, and its correct operation is verified by simulations and experiments.
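
To make the datapath-sharing idea concrete, the Python sketch below models the two operating modes such a circuit switches between: a complex multiply-accumulate, which decomposes into four real multiplications and additions, and an ordinary real double-precision multiply-accumulate reusing the same multiplier and adder resources. This is a behavioral model for illustration only, not the authors' VHDL design.

```python
def complex_mac(acc, a, b):
    """Complex multiply-accumulate: acc + a*b with a, b, acc as (re, im)
    pairs.  The four real products and the additions are the operations
    the reconfigurable datapath provides in complex mode."""
    ar, ai = a
    br, bi = b
    re = acc[0] + (ar * br - ai * bi)
    im = acc[1] + (ar * bi + ai * br)
    return (re, im)

def real_mac(acc, a, b):
    """Real double-precision multiply-accumulate using the same multiplier
    and adder resources in the alternative configuration."""
    return acc + a * b

print(complex_mac((0.0, 0.0), (1.0, 2.0), (3.0, 4.0)))   # (-5.0, 10.0)
print(real_mac(0.0, 1.5, 2.0))                            # 3.0
```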

A Set Theory Based Factoring Technique and Its Use for Low Power Logic Design

Factoring Boolean functions is one of the basic operations in algorithmic logic synthesis. A novel algebraic factorization heuristic for single-output combinational logic functions is presented in this paper, developed on the basis of the set theory paradigm. The impact of factoring is analyzed mainly from a low-power design perspective for standard cell based digital designs. The physical implementations of a number of MCNC/IWLS combinational benchmark functions and sub-functions are compared before and after factoring, based on a simple technology mapping procedure utilizing only standard gate primitives (readily available as standard cells in a technology library) and not cells corresponding to optimized complex logic. The power results were obtained at the gate level by means of an industry-standard power analysis tool from Synopsys, targeting a 130 nm (0.13 μm) UMC CMOS library for the typical case. The wire loads were inserted automatically and the simulations were performed with maximum input activity. The gate-level simulations demonstrate the advantage of the proposed factoring technique in comparison with other existing methods from a low-power perspective, for arbitrary examples. Though the benchmark experiments report mixed results, the mean savings in total power and dynamic power for the factored solution over a non-factored solution were 6.11% and 5.85%, respectively. In terms of leakage power, the average savings for the factored forms was significant, to the tune of 23.48%. The factored solution is expected to better its non-factored counterpart in terms of the power-delay product, as it is well known that factoring, in general, yields a delay-efficient multi-level solution.
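
As a toy illustration of set-based algebraic factoring (not the paper's heuristic), the sketch below represents a sum-of-products expression as a list of cubes, each cube a set of literals, extracts the shared literals by set intersection, and rewrites f = ab + ac + ad as a(b + c + d). A real factoring algorithm must of course also handle kernels and co-kernels, which this sketch omits.

```python
def factor_common_literals(cubes):
    """cubes: iterable of sets of literal names, e.g. a*b -> {'a', 'b'}.
    Returns (common, remainders): the shared literals and what is left of
    each cube once they are divided out."""
    cubes = [frozenset(c) for c in cubes]
    common = frozenset.intersection(*cubes)
    remainders = [c - common for c in cubes]
    return common, remainders

# f = a*b + a*c + a*d  ->  a * (b + c + d)
common, rest = factor_common_literals([{"a", "b"}, {"a", "c"}, {"a", "d"}])
factored = "*".join(sorted(common)) + " * (" + " + ".join(
    "*".join(sorted(r)) if r else "1" for r in rest) + ")"
print(factored)   # a * (b + c + d)
```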

Multi Switched Split Vector Quantizer

Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called Multi Switched Split Vector Quantization, a hybrid of two product-code vector quantization techniques, namely the multistage vector quantization technique and the switched split vector quantization technique. The Multi Switched Split Vector Quantization technique quantizes the linear predictive coefficients in terms of line spectral frequencies. The results show that Multi Switched Split Vector Quantization provides a better trade-off between bit rate, spectral distortion performance, computational complexity, and memory requirements when compared to Switched Split Vector Quantization, multistage vector quantization, and Split Vector Quantization techniques. By employing the switching technique at each stage of the vector quantizer, the spectral distortion, computational complexity, and memory requirements were greatly reduced. Spectral distortion was measured in dB, computational complexity was measured in floating point operations (flops), and memory requirements were measured in floats.
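
As a structural illustration of the product-code idea (not the authors' trained quantizer), the sketch below quantizes an LSF-like vector by splitting it into sub-vectors, choosing one of several switched codebooks per stage, doing a nearest-codeword search in each split, and passing the residual to the next stage. The codebooks here are random placeholders; in practice they would be trained, for example with the LBG algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_stage(n_switches, n_splits, split_dim, codewords=8):
    """One stage: n_switches candidate codebooks, each a list of per-split
    codebooks of shape (codewords, split_dim).  Random placeholders."""
    return [[rng.standard_normal((codewords, split_dim)) * 0.5
             for _ in range(n_splits)] for _ in range(n_switches)]

def quantize_stage(x, stage):
    """Try every switch, quantize each split by nearest codeword, and keep
    the switch with the lowest overall error.  Returns the reconstruction."""
    splits = np.split(x, len(stage[0]))
    best = None
    for codebooks in stage:                       # switched codebooks
        recon = np.concatenate([
            cb[np.argmin(np.sum((cb - s) ** 2, axis=1))]
            for cb, s in zip(codebooks, splits)])
        err = np.sum((x - recon) ** 2)
        if best is None or err < best[0]:
            best = (err, recon)
    return best[1]

x = rng.standard_normal(10)                       # stand-in for a 10-dim LSF vector
stages = [make_stage(n_switches=2, n_splits=2, split_dim=5) for _ in range(3)]
residual, recon = x.copy(), np.zeros_like(x)
for stage in stages:                              # multistage: quantize the residual
    q = quantize_stage(residual, stage)
    recon += q
    residual -= q
print("squared error:", float(np.sum((x - recon) ** 2)))
```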

Application of Wireless Visual Sensor for Semi-Autonomous Mine Navigation System

The present paper describes the efforts undertaken to develop a semi-autonomous robot that may be used for post-disaster rescue operation planning and subsequent execution, using one-way communication of video and data, from the robot to the controller and from the controller to the robot, respectively. Wireless communication is used so that the robot may easily access unapproachable places without any difficulties. It is expected that the information obtained from the robot will be of definite help to the rescue team for better planning and execution of their operations.