Multi-agent Data Fusion Architecture for Intelligent Web Information Retrieval

In this paper, we propose a multi-agent architecture for web information retrieval with a fuzzy-logic-based result fusion mechanism. The model is built in the JADE framework and takes advantage of the JXTA agent communication method to allow agents to communicate through firewalls and network address translators. This approach enables developers to build and deploy P2P applications through a unified medium and to manage agent-based document retrieval from multiple sources.
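
As an illustration of the kind of result fusion the abstract describes, the sketch below fuses scored results from several retrieval agents through fuzzy low/medium/high relevance memberships. The membership functions, output levels, and averaging rule are assumptions for illustration, not the paper's actual mechanism.

```python
# Minimal sketch of fuzzy-weighted fusion of results from multiple
# retrieval agents. Membership functions and output levels are
# illustrative assumptions, not the paper's design.

def relevance_membership(score):
    """Map a raw agent score in [0, 1] to fuzzy {low, medium, high} degrees."""
    low = max(0.0, 1.0 - 2.0 * score)
    high = max(0.0, 2.0 * score - 1.0)
    medium = max(0.0, 1.0 - abs(2.0 * score - 1.0))
    return {"low": low, "medium": medium, "high": high}

def fuse(agent_results):
    """agent_results: list of {doc_id: score} dicts, one per agent.
    Fuses scores by averaging defuzzified memberships per document."""
    fused = {}
    for results in agent_results:
        for doc, score in results.items():
            mu = relevance_membership(score)
            # Defuzzify with fixed output levels for low/medium/high.
            num = 0.1 * mu["low"] + 0.5 * mu["medium"] + 0.9 * mu["high"]
            den = mu["low"] + mu["medium"] + mu["high"]
            fused.setdefault(doc, []).append(num / den if den else 0.0)
    return {doc: sum(v) / len(v) for doc, v in fused.items()}

ranking = fuse([{"d1": 0.9, "d2": 0.3}, {"d1": 0.8, "d3": 0.6}])
```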

Investigation of Some Ergonomic and Psychological Strains of Common Military Protective Clothing

Protective clothing limits heat transfer and hampers task performance due to its increased weight. Military protective clothing enables humans to operate in adverse environments. In the selection and evaluation of military protective clothing, attention should be given to heat strain, ergonomic, and fit issues in addition to the actual protection it offers. Fifty healthy male subjects participated in the study. The subjects were dressed in shorts, T-shirts, socks, sneakers, and four different kinds of military protective clothing: CS, CSB, CS with NBC protection, and CS with NBC protection added. The ergonomic and psychological strains of each of the four outfits were investigated while subjects walked on a treadmill (7 km/h) with a 19.7 kg backpack. The tests showed that the highest heart rate was found wearing the NBC-protection-added outfit; the highest temperatures were observed wearing NBC protection added, followed by CS with NBC protection, CSB, and CS, respectively; and the highest value for thermal comfort (implying the worst thermal comfort) was observed wearing NBC protection added.

Potential Effects of Human Bone Marrow Non-Mesenchymal Mononuclear Cells on Neuronal Differentiation

Bone marrow-derived stem cells have been widely studied as an alternative source of stem cells. Mesenchymal stem cells (MSCs) have been the most investigated, and studies have shown that MSCs can promote neurogenesis. Little is known about the non-mesenchymal mononuclear cell fraction, which contains both hematopoietic and non-hematopoietic cells, including monocytes and endothelial progenitor cells. This study focused on unfractionated bone marrow mononuclear cells (BMMCs), which remained 72 h after MSCs had adhered to the culture plates. We showed that BMMC-conditioned medium promoted morphological changes of human SH-SY5Y neuroblastoma cells from an epithelial-like phenotype towards a neuron-like phenotype, as indicated by an increase in neurite outgrowth, similar to that observed in retinoic acid (RA)-treated cells. This result could be explained by the effects of trophic factors released from BMMCs, as the RT-PCR results showed that BMMCs expressed nerve growth factor (NGF), brain-derived neurotrophic factor (BDNF), and ciliary neurotrophic factor (CNTF). Similar cell proliferation rates were also observed between RA-treated cells and cells cultured in BMMC-conditioned medium, suggesting that the cells ceased proliferating and differentiated into a neuronal phenotype. Using real-time RT-PCR, a significantly increased expression of tyrosine hydroxylase (TH) mRNA in SH-SY5Y cells indicated that BMMC-conditioned medium induced catecholaminergic identities in differentiated SH-SY5Y cells.

Analysis and Design of a Novel Active Soft-Switched Phase-Shifted Full Bridge Converter

This paper proposes an active soft-switching circuit for bridge converters aimed at improving power conversion efficiency. The proposed circuit achieves lossless switching for both the main and auxiliary switches without increasing the main switch's current/voltage rating. A winding coupled to the primary of the power transformer ensures ZCS for the auxiliary switches during their turn-off. A 350 W, 100 kHz phase-shifted full bridge (PSFB) converter is built to validate the analysis and design. Theoretical loss calculations for the proposed circuit are presented. The proposed circuit is compared with a passive soft-switched PSFB converter in terms of efficiency and loss of duty cycle.

Hierarchies Based on the Number of Cooperating Systems of Finite Automata on Four-Dimensional Input Tapes

In theoretical computer science, the Turing machine has played a number of important roles in understanding and exploiting basic concepts and mechanisms in computing and information processing [20]. It is a simple mathematical model of computers [9]. Later, M. Blum and C. Hewitt first proposed two-dimensional automata as a computational model of two-dimensional pattern processing and investigated their pattern recognition abilities in 1967 [7]. Since then, many researchers in this field have investigated properties of automata on two- or three-dimensional tapes. On the other hand, the question of whether processing four-dimensional digital patterns is much more difficult than processing two- or three-dimensional ones is of great interest from both theoretical and practical standpoints. Thus, the study of four-dimensional automata as a computational model of four-dimensional pattern processing has been meaningful [8]-[19], [21]. This paper introduces a cooperating system of four-dimensional finite automata as one model of four-dimensional automata. A cooperating system of four-dimensional finite automata consists of a finite number of four-dimensional finite automata and a four-dimensional input tape on which these finite automata work independently (in parallel). Those finite automata whose input heads scan the same cell of the input tape can communicate with each other; that is, every finite automaton is allowed to know the internal states of the other finite automata on the cell it is scanning at the moment. In this paper, we mainly investigate the accepting powers of cooperating systems of eight- and seven-way four-dimensional finite automata. A seven-way four-dimensional finite automaton is an eight-way four-dimensional finite automaton whose input head can move east, west, south, north, up, down, or in the future, but not in the past, on a four-dimensional input tape.

Bottom Up Text Mining through Hierarchical Document Representation

Most existing text mining approaches were proposed with the transaction database model in mind. Thus, the mined dataset is structured using just one concept, the “transaction”, while the whole dataset is modeled using the “set” abstract type. In such cases, the structure of the whole dataset and the relationships among the transactions themselves are not modeled and, consequently, not considered in the mining process. We believe that taking the structural properties of hierarchically structured information (e.g., textual documents) into account in the mining process can lead to better results. For this purpose, a hierarchical association rule mining approach for textual documents is proposed in this paper, and the classical set-oriented mining approach is reconsidered in favor of a Directed Acyclic Graph (DAG)-oriented approach. Natural language processing techniques are used to obtain the DAG structure. Based on this graph model, a hierarchical bottom-up algorithm is proposed. The main idea is that each node is mined together with its parent node.
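
The bottom-up idea, each node mined together with its parent, can be sketched as follows. The node/term representation, topological ordering, and fold-upward rule are illustrative assumptions, not the paper's algorithm.

```python
# Illustrative sketch of bottom-up mining over a document DAG: each
# node's term counts are folded into its parent, so every node ends up
# with supports covering all of its descendants.
from collections import Counter

def mine_bottom_up(dag, terms, order):
    """dag: {node: parent or None}; terms: {node: list of terms};
    order: nodes listed children-before-parents (topological)."""
    support = {n: Counter(terms.get(n, [])) for n in dag}
    for node in order:
        parent = dag[node]
        if parent is not None:
            # Mine the node with its parent: fold its supports upward.
            support[parent].update(support[node])
    return support

dag = {"doc": None, "sec1": "doc", "sec2": "doc", "par1": "sec1"}
terms = {"par1": ["mining", "text"], "sec1": ["text"], "sec2": ["mining"]}
supports = mine_bottom_up(dag, terms, order=["par1", "sec1", "sec2", "doc"])
```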

BIP-Based Alarm Declaration and Clearing in SONET Networks Employing Automatic Protection Switching

This paper examines the performance of bit-interleaved parity (BIP) methods in error rate monitoring and in the declaration and clearing of alarms in transport networks that employ automatic protection switching (APS). BIP-based error rate monitoring is attractive for its simplicity and ease of implementation. The BIP-based results are compared with exact results and are found to declare alarms too late and to clear them too early. It is concluded that standards development and systems implementation should take this late declaration and early clearing of alarms into account. The window parameters defining the detection and clearing thresholds should be set so as to build sufficient hysteresis into the system, ensuring that BIP-based implementations yield acceptable performance.
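
For reference, a BIP-8 check as used in SONET overhead bytes can be computed as below: bit i of the parity byte gives even parity over bit i of every byte in the covered frame. One reason BIP undercounts errors, consistent with the late declaration and early clearing described above, is that two bit errors in the same bit position cancel.

```python
# Minimal BIP-8 computation: XOR over all bytes accumulates even
# parity for each of the eight bit positions.
def bip8(frame: bytes) -> int:
    parity = 0
    for byte in frame:
        parity ^= byte
    return parity

def bip8_violations(frame: bytes, received_bip: int) -> int:
    """Count bit positions (0..8) where the received BIP-8 disagrees
    with the parity recomputed over the frame."""
    return bin(bip8(frame) ^ received_bip).count("1")
```

Note that `bip8_violations` bounds, but does not equal, the true number of bit errors in the frame.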

Effect of Superplasticizer and NaOH Molarity on Workability, Compressive Strength and Microstructure Properties of Self-Compacting Geopolymer Concrete

This research investigates the effects of superplasticizer dosage and the molarity of the sodium hydroxide alkaline solution on the workability, microstructure, and compressive strength of self-compacting geopolymer concrete (SCGC). SCGC is an improved method of concrete placement that requires no compaction and completely eliminates ordinary Portland cement. The parameters studied were superplasticizer (SP) dosage and the molarity of the NaOH solution. SCGC mixes were synthesized from low-calcium fly ash, activated by combinations of sodium hydroxide and sodium silicate solutions, and incorporated superplasticizer for self-compactability. Workability properties such as filling ability, passing ability, and resistance to segregation were assessed using the slump flow, T-50, V-funnel, L-box, and J-ring test methods. It was found that the essential workability requirements for self-compactability according to EFNARC were satisfied. Results showed that workability and compressive strength improved with increasing superplasticizer dosage. An increase in strength and a decrease in workability were observed as the molarity of the NaOH solution increased from 8 M to 14 M. Improvement of the interfacial transition zone (ITZ) and microstructure with increasing SP dosage and with an increase in concentration from 8 M to 12 M was also identified.

Scrum as the Method Supporting the Implementation of Knowledge Management in an Organization

Many companies have switched to project-oriented processes in recent years. This brings new possibilities and effectiveness not only in external processes connected with product delivery but in internal processes as well. However, centralized project organization, which is based on the role of the project manager in the team, has proved insufficient in some cases. Agile methods of project organization try to solve this problem by bringing a new view of project organization, roles, processes, and competences. Scrum is one of these methods; it builds on the principles of knowledge management to drive projects to effectiveness from all angles. Using this method to organize internal and delivery projects helps the organization create and share knowledge throughout the company. It also supports the formation of unique competences of individuals and project teams and drives innovation in the company.

A Power Reduction Technique for Built-In Self-Test Using a Modified Linear Feedback Shift Register

A linear feedback shift register (LFSR) is proposed that aims to reduce power consumption from within. It reduces power consumption during testing of a Circuit Under Test (CUT) at two stages. At the first stage, Control Logic (CL) deactivates the clocks of the register's switching units for the period during which their output would be the same as before, thus avoiding unnecessary switching of the flip-flops. At the second stage, the LFSR reorders the test vectors by interchanging each bit with its nearest neighbor bit. This keeps the fault coverage of the vectors unchanged but reduces the Total Hamming Distance (THD), so that power is reduced during the shifting operation.
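
A simplified stand-in for the second stage is sketched below: per vector, one adjacent bit pair is swapped greedily when that lowers the Hamming distance to the previous vector, reducing THD and hence switching activity. The greedy selection rule is an illustrative assumption, not the paper's exact reordering.

```python
# Sketch of THD reduction by neighbor-bit interchange in a test set.

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def total_hamming_distance(vectors):
    return sum(hamming(vectors[i], vectors[i + 1])
               for i in range(len(vectors) - 1))

def reorder_bits(vectors):
    """Greedily swap one adjacent bit pair per vector to reduce the
    Hamming distance to the previous vector."""
    out = [list(vectors[0])]
    for vec in vectors[1:]:
        best, prev = list(vec), out[-1]
        for i in range(len(vec) - 1):
            cand = list(vec)
            cand[i], cand[i + 1] = cand[i + 1], cand[i]
            if hamming(prev, cand) < hamming(prev, best):
                best = cand
        out.append(best)
    return out

vectors = [[0, 0, 1, 1], [0, 1, 0, 1]]
reordered = reorder_bits(vectors)
```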

Testing Object-Oriented Framework Applications Using FIST2 Tool: A Case Study

An application framework provides a reusable design and implementation for a family of software systems. Frameworks are introduced to reduce the cost of a product line (i.e., a family of products that share common features). Software testing is a time-consuming and costly ongoing activity during the application software development process. Generating reusable test cases for framework applications during the framework development stage, and providing and using those test cases to test part of the framework application whenever the framework is used, reduces application development time and cost considerably. This paper introduces the Framework Interface State Transition Tester (FIST2), a tool for automated unit testing of Java framework applications. During the framework development stage, given the formal descriptions of the framework hooks, the specifications of the methods of the framework's extensible classes, and the illegal behavior description of the Framework Interface Classes (FICs), FIST2 generates unit-level test cases for the classes. At the framework application development stage, given the customized method specifications of the implemented FICs, FIST2 automates the use, execution, and evaluation of the already generated test cases to test the implemented FICs. The paper illustrates the use of the FIST2 tool by testing several applications that use the SalesPoint framework.
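
In the spirit of state-transition test generation from interface specifications, the sketch below emits one call sequence per transition of a class's state machine, each reachable from the start state. The spec format is an assumption for illustration; FIST2's actual input languages and generation strategy are described in the paper.

```python
# Illustrative transition-coverage test generation from a state machine.

def generate_tests(transitions, start):
    """transitions: {(state, method): next_state}. Returns one method-call
    sequence per transition, each reaching that transition from start."""
    # BFS: shortest method-call path to every reachable state.
    paths = {start: []}
    frontier = [start]
    while frontier:
        state = frontier.pop(0)
        for (s, method), nxt in transitions.items():
            if s == state and nxt not in paths:
                paths[nxt] = paths[state] + [method]
                frontier.append(nxt)
    return [paths[s] + [m] for (s, m) in transitions if s in paths]

spec = {("new", "open"): "open", ("open", "add"): "open",
        ("open", "close"): "closed"}
tests = generate_tests(spec, "new")
```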

Formal Analysis of a Public-Key Algorithm

In this article, a formal specification and verification of the Rabin public-key scheme in a formal proof system is presented. The idea is to combine the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. A major objective of this article is the presentation of the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Moreover, we explicate a (computer-proven) formalization of correctness as well as a computer verification of security properties using a straightforward computation model in Isabelle/HOL. The analysis uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as efficient computer proofs of security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. Consequently, we obtain reliable proofs with a minimal error rate while augmenting the database used, which provides a formal basis for further computer proof constructions in this area.
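
The scheme under analysis can be sketched executably as follows (the paper's formalization is in Isabelle/HOL; this Python version is only an illustration). The toy primes p, q ≡ 3 (mod 4) are assumptions; a real deployment needs large random primes and disambiguation padding to pick the right root.

```python
# Textbook Rabin scheme: encryption is squaring mod n; decryption
# recovers all four square roots via CRT when p, q ≡ 3 (mod 4).

p, q = 7, 11          # private key (toy values for illustration)
n = p * q             # public key

def encrypt(m, n):
    return (m * m) % n

def decrypt(c, p, q):
    """Return the set of four square roots of c modulo n = p*q."""
    n = p * q
    mp = pow(c, (p + 1) // 4, p)   # square root mod p (p ≡ 3 mod 4)
    mq = pow(c, (q + 1) // 4, q)   # square root mod q
    inv_p = pow(p, -1, q)          # p^{-1} mod q (Python 3.8+)
    inv_q = pow(q, -1, p)          # q^{-1} mod p
    roots = set()
    for a in (mp, p - mp):         # combine ±mp, ±mq with the CRT
        for b in (mq, q - mq):
            roots.add((a * q * inv_q + b * p * inv_p) % n)
    return roots
```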

Distributed Case Based Reasoning for Intelligent Tutoring System: An Agent Based Student Modeling Paradigm

Online learning with an Intelligent Tutoring System (ITS) is becoming very popular: the system models the student's learning behavior and presents the learning material (content, questions and answers, assignments) to the student accordingly. In today's distributed computing environment, the tutoring system can take advantage of networking to apply the model built for one student to students from other similar groups. In this paper, we present a methodology in which, using Case Based Reasoning (CBR), the ITS provides student modeling for online learning in a distributed environment with the help of agents. The paper describes the approach, the architecture, and the agent characteristics of such a system. This concept can be deployed to develop an ITS where the tutor authors and the students learn locally, while the ITS models the students' learning globally in a distributed environment. The advantage of such an approach is that both the learning material (domain knowledge) and the student model can be globally distributed, enhancing the efficiency of the ITS while reducing the bandwidth requirement and complexity of the system.
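
The CBR retrieve step at the heart of such student modeling can be sketched minimally: find the stored case (student profile) most similar to the new learner and reuse its plan. The profile features, weights, and similarity measure are illustrative assumptions; in the distributed setting each agent would run this over its local case base and share the winner.

```python
# Minimal nearest-case retrieval over student-profile cases.

def similarity(case, query, weights):
    """Weighted similarity over normalized features in [0, 1]."""
    return sum(w * (1.0 - abs(case[f] - query[f]))
               for f, w in weights.items())

def retrieve(case_base, query, weights):
    """Return the best-matching stored case for a new student profile."""
    return max(case_base,
               key=lambda c: similarity(c["profile"], query, weights))

cases = [
    {"profile": {"pace": 0.2, "score": 0.9}, "plan": "advanced"},
    {"profile": {"pace": 0.8, "score": 0.4}, "plan": "remedial"},
]
weights = {"pace": 0.5, "score": 0.5}
best = retrieve(cases, {"pace": 0.7, "score": 0.5}, weights)
```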

Effect of Increasing Road Light Luminance on Night Driving Performance of Older Adults

The main objective of this study was to determine whether a minimal increase in road light level (luminance) could improve driving performance among older adults. Older, middle-aged, and younger adults were tested in a driving simulator following vision and cognitive screening. Comparisons were made of simulated night driving performance under two road light conditions (0.6 and 2.5 cd/m2). At each light level, the effects of self-reported night driving avoidance were examined along with vision/cognitive performance. It was found that increasing the road light level from 0.6 cd/m2 to 2.5 cd/m2 resulted in improved recognition of signage on straight highway segments. The improvement depends on driver-related factors such as vision, cognitive abilities, and confidence. On curved road sections, the results showed that drivers' performance worsened. It is concluded that while increased road lighting may be helpful to older adults, especially for sign recognition, it may also result in increased driving confidence and thus reduced attention in some driving situations.

Comparative Kinetic Study on Alkylation of p-cresol with Tert-butyl Alcohol Using Different SO3H-Functionalized Ionic Liquid Catalysts

Ionic liquids are well known as green solvents, reaction media, and catalysts. Here, three different sulfonic acid functionalized ionic liquids prepared in the laboratory are used as catalysts in the alkylation of p-cresol with tert-butyl alcohol. The kinetics on each of the catalysts were compared, and a kinetic model was developed based on the product distribution over these catalysts. The kinetic parameters were estimated using Marquardt's algorithm to minimize the error function. The Arrhenius plots show a curvature that is best interpreted by the extended Arrhenius equation.
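
For reference, the extended (modified) Arrhenius equation mentioned above can be stated as:

```latex
% Extended Arrhenius equation: the temperature-dependent pre-exponential
% term T^{m} introduces curvature in the ln k versus 1/T plot, which the
% classical form (m = 0) cannot capture.
k(T) = A\, T^{m} \exp\!\left(-\frac{E_a}{RT}\right),
\qquad
\ln k = \ln A + m \ln T - \frac{E_a}{RT}
```

With m = 0 this reduces to the classical Arrhenius law and ln k is linear in 1/T; a nonzero m reproduces the curvature observed in the plots.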

Association between Serum Concentrations of Anabolic Hormones and their Binding Proteins in Response to Graded Exercise in Male Athletes

We investigated the response of testosterone (T), growth hormone (GH), cortisol (C), sex hormone binding globulin (SHBG), insulin-like growth factor (IGF-1), insulin-like growth factor binding protein-3 (IGFBP-3), and some anabolic-catabolic indexes, i.e., T/C, T/SHBG, and IGF-1/IGFBP-3, to maximal exercise in endurance-trained athletes (TREN) and untrained subjects (CG). The baseline concentration of IGF-1 was higher in athletes (TREN) when compared to the CG (p

Development and Assessment of the Competence Creativity Applied to Technical Drawing

The results obtained after incorporating the competence “creativity” into the subject Technical Drawing in the first year of the Degree in Forestry, Technical University of Madrid, are presented in this study. First, learning activities that could serve two functions at the same time, developing students' creativity and developing other specific competences of the subject, were considered. In addition, changes in the assessment procedure were made, and a method that analyzes two aspects of the assessment of the competence creativity was established. On the one hand, the products are evaluated by analyzing the outcomes obtained by students in the suggested essays and by establishing a parameter to assess the creativity expressed in them. On the other hand, the student is assessed directly through a psychometric test previously chosen by the team. Moreover, these results can be applied to similar subjects or could be of general application.

Mutation Rate for Evolvable Hardware

Evolvable hardware (EHW) refers to a self-reconfigurable hardware design in which the configuration is under the control of an evolutionary algorithm (EA). A lot of research has been done in this area, and several different EAs have been introduced. Every time a specific EA is chosen for solving a particular problem, all its components, such as population size, initialization, selection mechanism, mutation rate, and genetic operators, should be selected in order to achieve the best results. In the last three decades, a lot of research has been carried out to identify the best parameters of the EA's components for different “test problems”. However, different researchers propose different solutions. In this paper, the behaviour of the mutation rate in a (1+λ) evolution strategy (ES) for designing logic circuits, which has not been analyzed before, is deeply analyzed. The mutation rate for an EHW system modifies the values of the logic cell inputs, the cell type (for example, from AND to NOR), and the circuit output. The behaviour of the mutation has been analyzed based on the number of generations, genotype redundancy, and the number of logic gates used in the evolved circuits. The experimental results provide the mutation rate behaviour to be used during evolution for the design and optimization of logic circuits. Research on the best mutation rate during the last 40 years is also summarized.
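
The (1+λ) ES loop the paper analyzes can be sketched on a bitstring genotype. A OneMax-style fitness stands in for circuit evaluation here, and the parameter values are illustrative assumptions; in EHW the genotype would encode cell inputs, cell types, and outputs.

```python
# Minimal (1+λ) evolution strategy with per-bit mutation rate.
# Ties are accepted, allowing neutral drift, as is common in EHW.
import random

def one_plus_lambda(fitness, parent, lam=4, rate=0.05,
                    generations=200, seed=1):
    rng = random.Random(seed)
    for _ in range(generations):
        children = [[b ^ (1 if rng.random() < rate else 0) for b in parent]
                    for _ in range(lam)]
        best = max(children, key=fitness)
        if fitness(best) >= fitness(parent):
            parent = best
    return parent

# Stand-in fitness: number of set bits (OneMax), not a circuit evaluator.
evolved = one_plus_lambda(sum, [0] * 16)
```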

DRE - A Quality Metric for Component-Based Software Products

The overriding goal of software engineering is to provide a high-quality system, application, or product. To achieve this goal, software engineers must apply effective methods coupled with modern tools within the context of a mature software process [2]. In addition, it is also necessary to assure that high quality is realized. Although many quality measures can be collected at the project level, the most important measures are errors and defects. Deriving a quality measure for reusable components has proven to be a challenging task. The results obtained from this study are based on empirical evidence of reuse practices, as it emerged from the analysis of industrial projects. Both large and small companies, working in a variety of business domains and using object-oriented and procedural development approaches, contributed to this study. This paper proposes a quality metric that provides benefits at both the project and process level, namely defect removal efficiency (DRE).
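
DRE is commonly defined as the fraction of defects removed before delivery:

```python
# Defect removal efficiency: E = errors found before delivery,
# D = defects found after delivery; DRE = E / (E + D).
def dre(errors_before: int, defects_after: int) -> float:
    return errors_before / (errors_before + defects_after)
```

A DRE approaching 1.0 means the process filters out nearly all defects before release; for example, finding 90 errors before delivery and 10 defects after gives a DRE of 0.9.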

Design Techniques and Implementation of Low Power High-Throughput Discrete Wavelet Transform Filters for the JPEG 2000 Standard

In this paper, the implementation of low-power, high-throughput convolutional filters for the one-dimensional Discrete Wavelet Transform (DWT) and its inverse is presented. The analysis filters have already been used for the implementation of a high-performance DWT encoder [15] with minimum memory requirements for the JPEG 2000 standard. This paper presents the design techniques and the implementation of the convolutional filters included in the JPEG 2000 standard for the forward and inverse DWT, achieving low-power operation, high performance, and reduced memory accesses. Moreover, the filters have the ability to perform progressive computations so as to minimize buffering between the decomposition and reconstruction phases. The experimental results illustrate the filters' low-power, high-throughput characteristics as well as their memory-efficient operation.
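
A single convolutional DWT stage of the kind these filters implement can be sketched as below. The Haar filter pair is used as a stand-in for the 9/7 and 5/3 filters of JPEG 2000, which would need longer taps and boundary extension; the structure (convolve, then downsample by two, then the mirrored synthesis stage) is the same.

```python
# One-dimensional analysis/synthesis DWT stage with the Haar pair.
import math

H0 = [1 / math.sqrt(2), 1 / math.sqrt(2)]   # low-pass analysis filter
H1 = [1 / math.sqrt(2), -1 / math.sqrt(2)]  # high-pass analysis filter

def analysis(signal):
    """Convolve with the filter pair and downsample by two."""
    lo = [sum(H0[k] * signal[2 * i + k] for k in range(2))
          for i in range(len(signal) // 2)]
    hi = [sum(H1[k] * signal[2 * i + k] for k in range(2))
          for i in range(len(signal) // 2)]
    return lo, hi

def synthesis(lo, hi):
    """Inverse stage: perfect reconstruction for the Haar pair."""
    out = []
    for a, d in zip(lo, hi):
        out.append((a + d) / math.sqrt(2))
        out.append((a - d) / math.sqrt(2))
    return out

signal = [1.0, 2.0, 3.0, 4.0]
lo, hi = analysis(signal)
rec = synthesis(lo, hi)
```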