Effect of Preheating Temperature and Chamber Pressure on the Properties of Porous NiTi Alloy Prepared by SHS Technique

The fabrication of porous NiTi shape memory alloys (SMAs) from elemental powder compacts was conducted by self-propagating high-temperature synthesis (SHS). The effects of the preheating temperature and the chamber pressure on the combustion characteristics, as well as on the final morphology and composition of the products, were studied. Samples with porosity between 56.4% and 59.0% were obtained at preheating temperatures in the range of 200-300°C and Ar-gas chamber pressures of 138 and 201 kPa. The pore structures were found to differ only among samples processed at different preheating temperatures. The major phase in the porous product is NiTi, with small amounts of the secondary phases NiTi2 and Ni4Ti3. The preheating temperature and the chamber pressure have very little effect on the phase constitution. While the combustion temperature of the sample increased notably with increasing preheating temperature, it changed only slightly with varying chamber pressure.

Verification of the Simultaneous Local Extraction Method of Base and Thermal Resistance of Bipolar Transistors

In this paper an extensive verification is presented of the previously published extraction method that consistently accounts for self-heating and the Early effect in order to accurately extract both the base and the thermal resistance of bipolar junction transistors. The verification is demonstrated on advanced RF SiGe HBTs, where the extracted thermal resistances are compared with those from another published method that ignores the influence of the Early effect on the internal base-emitter voltage, and the extracted base resistances are compared with those determined from noise measurements. A self-consistency check of the extracted base and thermal resistances, using compact model simulation results, is also carried out in order to assess the level of accuracy of the method.

Navigation and Self Alignment of Inertial Systems using Nonlinear H∞ Filters

Microelectromechanical sensors (MEMS) play a vital role, along with global positioning devices, in the navigation of autonomous vehicles. These sensors are low cost and easily available, but exhibit colored noise and unpredictable discontinuities. Conventional filters such as Kalman filters and sigma-point filters are not able to cope with non-white noise. This research utilizes the H∞ filter in a nonlinear framework, combined both with the Kalman filter and with the Unscented filter, for the navigation and self-alignment of an airborne vehicle. The system is simulated with colored noise and discontinuities, and the results are compared with those of non-robust nonlinear filters. The results are found to be 40%-70% more robust against colored noise and discontinuities.
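
As an illustration of the underlying robust estimator, the following is a minimal sketch of a single update of a linear discrete-time H∞ filter in Simon's game-theoretic formulation; the nonlinear Kalman/Unscented embedding used in the paper is not reproduced, and the function name and interface are assumptions made only for illustration.

    import numpy as np

    def hinf_filter_step(x, P, y, F, H, Q, R, theta):
        """One discrete-time H-infinity filter update (game-theoretic form).
        theta > 0 is the robustness bound; theta -> 0 recovers the Kalman filter."""
        I = np.eye(P.shape[0])
        Rinv = np.linalg.inv(R)
        # Robustified gain: the -theta*P term guards against worst-case disturbances.
        M = np.linalg.inv(I - theta * P + H.T @ Rinv @ H @ P)
        K = P @ M @ H.T @ Rinv
        x_next = F @ x + F @ K @ (y - H @ x)
        P_next = F @ P @ M @ F.T + Q
        return x_next, P_next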

Description and Analysis of Embedded Firewall Techniques

With the turn of this century, many researchers have shown interest in Embedded Firewall (EF) implementations. These are not the usual firewalls used as checkpoints at network gateways; rather, they are applied near the hosts that need protection. By using them, individual or grouped network components can be protected from internal as well as external attacks. This paper presents a study of EFs, looking at their architecture and problems. A comparative study assesses how practical each kind is, focusing in particular on the architecture, weak points, and portability of each kind. A look at their use by different categories of users is also presented.

Decoupled Scheduling in Meta Environment

Grid scheduling is the process of mapping grid jobs to resources over multiple administrative domains. Traditionally, application-level schedulers have been tightly integrated with the application itself and could not easily be applied to other applications. This paper presents a generic design that decouples the scheduler core (the search procedure) from the application-specific components (e.g., application performance models) and the platform-specific components (e.g., collection of resource information) used by the search procedure. In this decoupled approach the application details are not revealed completely to the broker; instead, the customer gives the application to the resource provider for execution. In a decoupled approach, apart from scheduling, the resource selection can be performed independently in order to achieve scalability.
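
To make the decoupling concrete, the following is a minimal Python sketch of a scheduler core that depends only on two injected interfaces; the interface names, the greedy search procedure, and the job/resource representation are illustrative assumptions rather than the paper's actual design.

    from dataclasses import dataclass
    from typing import Protocol, Sequence, Dict

    class PerformanceModel(Protocol):
        """Application-specific component: predicts the runtime of a job on a resource."""
        def predict_runtime(self, job: str, resource: str) -> float: ...

    class ResourceInformation(Protocol):
        """Platform-specific component: reports the currently available resources."""
        def available_resources(self) -> Sequence[str]: ...

    @dataclass
    class DecoupledScheduler:
        """Scheduler core: a generic search procedure that knows nothing about the
        application or the platform beyond the two injected interfaces."""
        model: PerformanceModel
        info: ResourceInformation

        def schedule(self, jobs: Sequence[str]) -> Dict[str, str]:
            mapping: Dict[str, str] = {}
            for job in jobs:
                # Greedy placeholder for the search procedure: best predicted runtime.
                mapping[job] = min(self.info.available_resources(),
                                   key=lambda r: self.model.predict_runtime(job, r))
            return mapping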

What Attributes Determine Housing Affordability?

The concept of housing affordability is a contested issue, but a pressing and widespread problem for many countries. Simple ratio measures based on housing expenditure and income are habitually used to define and assess housing affordability. However, conceptualising and measuring affordability in this manner focuses only on financial attributes and fails to deal with wider issues such as housing quality, location, and access to services and facilities. The research is based on the notion that the housing affordability problem encompasses more than the financial costs of housing and a household's ability to meet such costs, and must address larger issues such as social and environmental sustainability and the welfare of households. Therefore, the need arises for a broader and more encompassing set of attributes by which housing affordability can be assessed. This paper presents a system of criteria by which the affordability of different housing locations can be assessed in a comprehensive and sustainable manner. Moreover, the paper explores the way in which such criteria could be measured.

Regret, Choice, and Outcome

In two studies we challenge the well-consolidated position in the regret literature according to which the necessary condition for the emergence of regret is a bad outcome ensuing from a free decision. Without free choice and, consequently, personal responsibility, other emotions, such as disappointment, but not regret, are supposed to be elicited. In our opinion, a main source of regret is being obliged by circumstances outside our control to choose an undesired option. We tested the hypotheses that regret resulting from a forced choice is more intense than regret derived from a free choice, and that the outcome affects the latter but not the former. In addition, we investigated whether two other variables, the perceived level of freedom of the choice and the justifiability of the choice, mediated the relationships between choice and regret, as well as the four other emotions we examined: satisfaction, anger toward oneself, disappointment, and anger toward circumstances. The two studies were based on the scenario methodology and used a 2 x 2 (choice x outcome) between-subjects design. In the first study the foreseen short-term effects of the choice were assessed; in the second study the experienced long-term effects of the choice were assessed. In each study 160 students of the Second University of Naples participated. The results largely corroborated our hypotheses and are discussed in the light of the main theories of regret and decision making.

A Comparison of SVM-based Criteria in Evolutionary Method for Gene Selection and Classification of Microarray Data

An evolutionary method whose selection and recombination operations are based on generalization error bounds of the support vector machine (SVM) can select a subset of potentially informative genes for an SVM classifier very efficiently [7]. In this paper, we use the derivative of an error bound (first-order criterion) to select and recombine gene features in the evolutionary process, and compare its performance with that of the error bound itself (zero-order criterion). We also investigate several error bounds and their derivatives to compare their performance and to find the best criterion for gene selection and classification. We use 7 cancer-related human gene expression datasets to evaluate the performance of the zero-order and first-order criteria of the error bounds. Although both kinds of criteria follow the same strategy in theory, the experimental results identify the best criterion for microarray gene expression data.
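
As a concrete example of a zero-order criterion, the following hypothetical sketch evaluates Vapnik's radius-margin bound R^2 * ||w||^2 for a candidate gene subset using scikit-learn; the paper studies several bounds and their derivatives, which are not reproduced here, and the crude centroid-based radius estimate is an assumption made only for illustration.

    import numpy as np
    from sklearn.svm import SVC

    def radius_margin_bound(X, y, genes):
        """Zero-order criterion: radius-margin bound R^2 * ||w||^2 for a gene subset
        (linear kernel; a large C approximates a hard margin)."""
        Xs = X[:, genes]
        svm = SVC(kernel="linear", C=100.0).fit(Xs, y)
        w = svm.coef_.ravel()
        # Radius of the data approximated by the largest distance to the centroid.
        R2 = np.max(np.sum((Xs - Xs.mean(axis=0)) ** 2, axis=1))
        return R2 * float(np.dot(w, w))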

A Study of Color Transformation on Website Images for the Color Blind

In this paper, we study color transformation methods for website images for the color blind. The most common category of color blindness is red-green color blindness, in which the affected colors are perceived as beige. By transforming the colors of the images, color visibility for the color blind can be improved, so that they have a better view when browsing websites. To transform the colors of website images, we study two algorithms: conversion from the RGB color space to the HSV color space, and a self-organizing color transformation. The comparative study focuses on criteria based on ease of use, quality, accuracy, and efficiency. The outcome of the study leads to the enhancement of website images to meet the vision requirements of the color blind in perceiving image detail.
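
For reference, the RGB-to-HSV step can be sketched in a few lines with Python's standard colorsys module; the hue shift shown afterwards is only an illustrative recoloring and is not the self-organizing transformation studied in the paper.

    import colorsys

    def rgb_to_hsv_pixels(pixels):
        """Convert (r, g, b) pixels in [0, 255] to HSV triples in [0, 1]."""
        return [colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0) for r, g, b in pixels]

    def shift_hue(hsv_pixels, shift):
        """Illustrative hue rotation (shift in [0, 1)) followed by conversion back to RGB."""
        out = []
        for h, s, v in hsv_pixels:
            r, g, b = colorsys.hsv_to_rgb((h + shift) % 1.0, s, v)
            out.append((round(r * 255), round(g * 255), round(b * 255)))
        return out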

Preparation and Characterization of Self Assembled Gold Nanoparticles on Amino Functionalized SiO2 Dielectric Core

Wet chemistry methods were used to prepare SiO2/Au nanoshells. The purpose of this research was to synthesize gold-coated SiO2 nanoshells for biomedical applications. Tunable nanoshells were prepared by using different colloidal concentrations. The nanoshells were characterized by FTIR, XRD, UV-Vis spectroscopy, and atomic force microscopy (AFM). The FTIR results confirmed the functionalization of the surfaces of the silica nanoparticles with NH2 terminal groups. A tunable absorption was observed between 470 and 600 nm, with maxima in the range of 530-560 nm. Based on the XRD results, three main Au peaks, (111), (200) and (220), were identified. AFM results showed that the silica core diameter was about 100 nm and the thickness of the gold shell about 10 nm.

Valorization of Lignocellulosic Wastes – Evaluation of Its Toxicity When Used in Adsorption Systems

Agricultural lignocellulosic by-products are receiving increased attention, namely in the search for filter materials that retain contaminants from water. These by-products, specifically almond and hazelnut shells, are abundant in Portugal, since almond and hazelnut production is an important local activity. Hazelnut and almond shells have as main constituents lignin, cellulose and hemicelluloses, water-soluble extractives, and tannins. During the adsorption of heavy metals from contaminated waters, water-soluble compounds can leach from the shells and have a negative impact on the environment. Usually, the chemical characterization of treated water by itself may not reveal the environmental impact of discharges when the parameters comply with legal quality standards for water; only biological systems can detect the toxic effects of the water constituents. Therefore, the evaluation of toxicity by biological tests is very important when deciding on suitability for safe water discharge or for irrigation applications. The main purpose of the present work was to assess, with short-term acute toxicity tests, the potential impacts of waters after being treated for heavy metal removal by hazelnut and almond shell adsorption systems. To conduct the study, water at pH 6 containing 25 mg.L-1 of lead was treated with 10 g of shell per litre of wastewater for 24 hours; this procedure was followed for each type of shell. Afterwards, the water was collected for toxicological assays, namely bacterial resistance, seed germination, the Lemna minor L. test, and plant growth. The effect on isolated bacterial strains was determined by the disc diffusion method, and the seed germination index was evaluated using lettuce, with temperature and humidity control during germination for 7 days. For higher aquatic organisms, Lemna minor was used with a 4-day contact time with the shell solutions under controlled light and temperature. For higher terrestrial plants, biomass production was evaluated 14 days after tomato germination in soil, with controlled humidity, light and temperature. The toxicity tests of water treated with the shells revealed effects on the tested organisms only to some extent, with the test assays showing behaviour close to that of the control, leading to the conclusion that further utilization of the shells is not considered to pose a serious risk to the environment.

Promoting Complex Systems Learning through the use of Computer Modeling

This paper describes part of a project on Learning-by-Modeling (LbM). Studying complex systems is increasingly important in teaching and learning many science domains. Many features of complex systems make it difficult for students to develop deep understanding. Previous research indicates that involvement with modeling scientific phenomena and complex systems can play a powerful role in science learning. Some researchers dispute this view, arguing that models and modeling do not contribute to understanding complexity concepts, since they increase the cognitive load on students. This study investigates the effect of different modes of involvement in exploring scientific phenomena with computer simulation tools on students' mental models, from the perspective of structure, behavior and function. Quantitative and qualitative methods are used to report on 121 freshman students who engaged in participatory simulations of complex phenomena exhibiting emergent, self-organized and decentralized patterns. Results show that LbM plays a major role in students' concept formation about complexity concepts.

Multidimensional Visualization Tools for Analysis of Expression Data

Expression data analysis is based mostly on statistical approaches that are indispensable for the study of biological systems. Large amounts of multidimensional data resulting from high-throughput technologies are not completely served by biostatistical techniques and are usually complemented with visual, knowledge-discovery and other computational tools. In many cases, in biological systems we can only speculate on the processes that are causing the changes, and it is during the visual explorative analysis of data that a hypothesis is formed. We would like to show the usability of multidimensional visualization tools and promote their use in the life sciences. We survey and demonstrate some of the multidimensional visualization tools used in the process of data exploration, such as parallel coordinates and RadViz, and we extend them by combining them with the self-organizing map algorithm. We use a time-course data set of transitional cell carcinoma of the bladder in our examples. Analysis of data with these tools has the potential to uncover additional relationships and non-trivial structures.
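
As a pointer to how such views are produced in practice, the following is a minimal parallel-coordinates sketch using pandas and matplotlib on a toy data frame; the column names, values and cluster labels are invented for illustration and do not come from the bladder carcinoma data set used in the paper.

    import pandas as pd
    import matplotlib.pyplot as plt
    from pandas.plotting import parallel_coordinates

    # Toy expression matrix: rows are genes, columns are time points, plus a
    # cluster label (e.g., one that a self-organizing map could assign).
    df = pd.DataFrame({
        "t0":  [0.1, 2.3, 1.1, 0.2],
        "t6":  [0.4, 2.0, 1.3, 0.1],
        "t12": [1.8, 0.5, 1.2, 0.3],
        "cluster": ["A", "B", "A", "A"],
    })

    parallel_coordinates(df, class_column="cluster")
    plt.ylabel("expression (log ratio)")
    plt.show()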

Analysis of an Electrical Transformer: A Bond Graph Approach

Bond graph models of an electrical transformer including nonlinear saturation are presented. These models determine the relation between the self and mutual inductances, and the leakage and magnetizing inductances, of power transformers with two and three windings using the properties of a bond graph. The modelling and analysis using this methodology can be extended to three-phase power transformers or to transformers with internal incipient faults.
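
For orientation, the classical linear relation between these quantities for a two-winding transformer (the T-equivalent circuit) can be written as a short sketch; this is the textbook relation only, not the bond-graph derivation or the nonlinear saturation model of the paper, and the turns ratio is assumed known.

    import math

    def t_equivalent(L11, L22, M, a):
        """Split the self (L11, L22) and mutual (M) inductances into magnetizing
        and leakage terms referred to the primary; a = N1/N2 is the turns ratio."""
        Lm  = a * M                     # magnetizing inductance
        Ll1 = L11 - a * M               # primary leakage inductance
        Ll2 = a * a * L22 - a * M       # secondary leakage referred to the primary
        k   = M / math.sqrt(L11 * L22)  # coupling coefficient
        return Lm, Ll1, Ll2, k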

Interoperability in Component Based Software Development

Interoperability is the ability of information systems to operate in conjunction with each other, encompassing communication protocols, hardware, software, application, and data compatibility layers. There has been considerable work in industry on the development of component interoperability models, such as CORBA, (D)COM and JavaBeans. These models are intended to reduce the complexity of software development and to facilitate the reuse of off-the-shelf components. The focus of these models is syntactic interface specification, component packaging, inter-component communication, and bindings to a runtime environment. What these models lack is a consideration of architectural concerns: specifying systems of communicating components, explicitly representing loci of component interaction, and exploiting architectural styles that provide well-understood global design solutions. The development of complex business applications is now focused on the assembly of components available on a local area network or on the net. These components must be located and identified in terms of available services and communication protocols before any request. The first part of the article introduces the basic concepts of components and middleware, the following sections describe the different up-to-date models of communication and interaction, and the last section shows how the different models can communicate among themselves.

A Fast Replica Placement Methodology for Large-scale Distributed Computing Systems

Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, at certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit, since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game-theory-based mechanism is a study of what happens when independent agents act selfishly and of how to control them to maximize the overall performance. A bidding mechanism asks how one can design systems so that the agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well-known techniques such as greedy, branch and bound, game-theoretical auctions, and genetic algorithms.
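
To convey the flavour of such a mechanism, the following toy sketch runs a sealed-bid second-price (Vickrey) auction for hosting one replica, a classical way of making truthful bidding a dominant strategy for selfish agents; the paper's actual mechanism and payment rule are not reproduced, and the site names and bid values are invented.

    from typing import Dict, Tuple

    def vickrey_replica_auction(bids: Dict[str, float]) -> Tuple[str, float]:
        """Award one replica to the highest bidder, who pays the second-highest bid."""
        ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
        winner, _ = ranked[0]
        payment = ranked[1][1] if len(ranked) > 1 else 0.0
        return winner, payment

    # Example: three sites bid their benefit from hosting a frequently accessed object.
    print(vickrey_replica_auction({"siteA": 9.0, "siteB": 7.5, "siteC": 4.0}))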

Self-Adaptive Differential Evolution Based Power Economic Dispatch of Generators with Valve-Point Effects and Multiple Fuel Options

This paper presents the solution of the power economic dispatch (PED) problem for generating units with valve-point effects and multiple fuel options using the Self-Adaptive Differential Evolution (SDE) algorithm. Obtaining the global optimal solution by mathematical approaches is difficult for realistic PED problems in power systems. The Differential Evolution (DE) algorithm has been found to be a powerful evolutionary algorithm for global optimization in many real problems. In this paper, the key control parameters of the DE algorithm, the crossover constant CR and the weight F applied to the random differential, are self-adapted. The PED problem formulation takes into consideration the non-smooth fuel cost function due to valve-point effects and the multiple fuel options of the generators. The proposed approach has been examined and tested on PED problems with thirteen generating units including valve-point effects, ten generating units with multiple fuel options neglecting valve-point effects, and ten generating units including both valve-point effects and multiple fuel options. The test results are promising and show the effectiveness of the proposed approach for solving PED problems.
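
Two ingredients of such an approach can be sketched briefly: the standard valve-point fuel cost term for one unit, and a jDE-style self-adaptation rule in which each individual carries its own F and CR. The cost expression is the usual formulation from the literature, while the self-adaptation rule follows the well-known jDE scheme and may differ from the exact SDE variant used in the paper; all parameter values are illustrative.

    import math
    import random

    def valve_point_cost(P, a, b, c, e, f, Pmin):
        """Non-smooth fuel cost of one generating unit with valve-point loading:
        a + b*P + c*P^2 + |e * sin(f * (Pmin - P))|."""
        return a + b * P + c * P * P + abs(e * math.sin(f * (Pmin - P)))

    def self_adapt(F, CR, tau1=0.1, tau2=0.1):
        """jDE-style self-adaptation: occasionally resample the mutation weight F
        and crossover constant CR instead of tuning them by hand."""
        if random.random() < tau1:
            F = 0.1 + 0.9 * random.random()
        if random.random() < tau2:
            CR = random.random()
        return F, CR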

Intelligent Agent Approach to the Control of Critical Infrastructure Networks

In this paper we propose an intelligent agent approach to controlling the electric power grid at a smaller granularity in order to give it self-healing capabilities. We develop a method using the influence model to transform transmission substations into information processing, analyzing and decision making (intelligent behavior) units. We also develop a wireless communication method to deliver real-time uncorrupted information to an intelligent controller in a power system environment. A combined networking and information-theoretic approach is adopted to meet both the delay and the error probability requirements. We use a mobile agent approach to optimize the achievable information rate vector and the distribution of rates to users (sensors). We develop the concepts and the quantitative tools required for the creation of cooperating semi-autonomous subsystems, which puts the electric grid on the path towards an intelligent and self-healing system.

On Analysis of Boundness Property for ECATNets by Using Rewriting Logic

To analyze the behavior of Petri nets, the accessibility (reachability) graph and Model Checking are widely used. However, if the analyzed Petri net is unbounded, the accessibility graph becomes infinite and Model Checking cannot be used, even for small Petri nets. ECATNets [2] are a category of algebraic Petri nets. The main feature of ECATNets is their sound and complete semantics based on rewriting logic [8] and its language Maude [9]. ECATNet analysis may be done by using the techniques of accessibility analysis and Model Checking defined in Maude. However, these two techniques supported by Maude do not work with infinite-state systems either. As a category of Petri nets, ECATNets can be unbounded and thus infinite-state systems. In order to know whether accessibility analysis and Model Checking in Maude can be applied to an ECATNet, we propose in this paper an algorithm that detects whether the ECATNet is bounded or not. Moreover, we propose a rewriting logic based tool implementing this algorithm. We show that the development of this tool using the Maude system is facilitated by the reflectivity of rewriting logic: the self-interpretation of this logic allows us both to model an ECATNet and to act on it.
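
For readers unfamiliar with boundedness detection, the following sketch implements the classical covering-based test (in the spirit of the Karp-Miller construction) for an ordinary place/transition net: a net is unbounded exactly when a reachable marking strictly covers one of its ancestors. This is the textbook criterion only; the ECATNet-specific algorithm and its Maude implementation described in the paper are not reproduced here, and the net encoding is an assumption for illustration.

    def is_bounded(initial, transitions):
        """Boundedness test for an ordinary place/transition net.
        initial: tuple of token counts per place.
        transitions: list of (consume, produce) tuples of the same length.
        Returns False as soon as a marking strictly covers an ancestor."""
        def fire(m, t):
            cons, prod = t
            if all(m[i] >= cons[i] for i in range(len(m))):
                return tuple(m[i] - cons[i] + prod[i] for i in range(len(m)))
            return None

        stack = [(tuple(initial), [])]   # (marking, ancestors on the search path)
        seen = set()
        while stack:
            m, ancestors = stack.pop()
            if m in seen:
                continue
            seen.add(m)
            for t in transitions:
                m2 = fire(m, t)
                if m2 is None:
                    continue
                # If m2 covers an ancestor and is strictly larger, a place is unbounded.
                for anc in ancestors + [m]:
                    if m2 != anc and all(x >= y for x, y in zip(m2, anc)):
                        return False
                stack.append((m2, ancestors + [m]))
        return True

    # A transition that only produces tokens makes the single place unbounded.
    print(is_bounded((0,), [((0,), (1,))]))   # False
    print(is_bounded((1,), [((1,), (1,))]))   # True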

Shot Detection Using Modified Dugad Model

In this paper we present a modification of an existing threshold model for shot-cut detection that is able to adapt itself to the sequence statistics and to operate in real time, because it uses only previously evaluated frames in its calculation. The efficiency of the proposed modified adaptive threshold scheme was verified through extensive test experiments with several similarity metrics, and the achieved results were compared with those of the original model. According to the results, the proposed threshold scheme reaches higher accuracy than the existing original model.
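
The general idea of such a causal adaptive threshold can be sketched as follows: a cut is declared when the current inter-frame difference exceeds the mean plus a multiple of the standard deviation of the differences in a window of previously evaluated frames. This is a generic illustration only; the exact Dugad model and the modification proposed in the paper are not reproduced, and the window size, multiplier and example values are assumptions.

    import statistics

    def detect_cuts(frame_diffs, window=10, alpha=3.0):
        """Causal adaptive threshold: report frame i as a cut when its difference
        exceeds mean + alpha * std of the previous `window` differences."""
        cuts = []
        for i, d in enumerate(frame_diffs):
            history = frame_diffs[max(0, i - window):i]
            if len(history) >= 2:
                mu = statistics.mean(history)
                sigma = statistics.pstdev(history)
                if d > mu + alpha * sigma:
                    cuts.append(i)
        return cuts

    # Example with a hypothetical sequence of histogram differences.
    print(detect_cuts([0.10, 0.12, 0.09, 0.11, 0.95, 0.10, 0.13, 0.12]))   # [4]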