BEM Formulations Based on Kirchhoff's Hypothesis to Perform Linear Bending Analysis of Plates Reinforced by Beams

In this work, two formulations of the boundary element method (BEM) for linear bending analysis of plates reinforced by beams are discussed. Both formulations are based on Kirchhoff's hypothesis and are obtained from the reciprocity theorem applied to zoned plates, where each sub-region defines a beam or a slab. In the first model the problem values are defined along the interfaces and the external boundary. Then, in order to reduce the number of degrees of freedom, kinematic hypotheses are assumed along the beam cross-section, leading to a second formulation in which the collocation points are defined along the beam skeleton instead of being placed on the interfaces. In these formulations no approximation of the generalized forces along the interface is required; moreover, compatibility and equilibrium conditions along the interface are automatically imposed by the integral equation. Thus, these formulations require fewer approximations and the total number of degrees of freedom is reduced. The numerical examples discuss the differences between the two BEM formulations and compare the results with a well-known finite element code.

Using a Semantic Self-Organising Web Page-Ranking Mechanism for Public Administration and Education

In the proposed method for Web page-ranking, a novel theoretical model is introduced and tested using examples of order relationships among IP addresses. Ranking is induced using a convexity feature, which is learned from these examples using a self-organizing procedure. We consider the problem of self-organizing learning from IP data to be represented by a semi-random convex polygon procedure, in which the vertices correspond to IP addresses. Based on recent developments in our regularization theory for convex polygons and corresponding Euclidean-distance-based methods for classification, we develop an algorithmic framework for learning ranking functions based on computational geometric theory. We show that our algorithm is generic, and present experimental results demonstrating the potential of our approach. In addition, we explain the generality of our approach by showing its possible use as a visualization tool for data obtained from diverse domains, such as Public Administration and Education.
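
As a rough illustration of ranking through a convexity feature, the sketch below ranks points (standing in for IP addresses mapped to the plane) by convex-hull peeling, so that vertices on outer hulls receive earlier ranks; the planar embedding, random data, and peeling rule are assumptions for illustration only and not the paper's exact self-organizing procedure.

```python
# Rank points by convex-hull peeling: outer-hull vertices get lower (earlier) layers.
import numpy as np
from scipy.spatial import ConvexHull

def hull_peel_rank(points):
    remaining = list(range(len(points)))
    rank, layer = {}, 0
    while len(remaining) > 2:
        hull = ConvexHull(points[remaining])
        for v in hull.vertices:
            rank[remaining[v]] = layer
        keep = set(int(v) for v in hull.vertices)
        remaining = [i for k, i in enumerate(remaining) if k not in keep]
        layer += 1
    for i in remaining:          # any leftover interior points share the last layer
        rank[i] = layer
    return rank

pts = np.random.default_rng(2).random((20, 2))   # 2-D stand-in for 20 addresses
print(hull_peel_rank(pts))
```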

Working Memory Capacity in Australian Sign Language (Auslan)/English Interpreters and Deaf Signers

Little research has examined working memory capacity (WMC) in signed language interpreters and deaf signers. This paper presents the findings of a study that investigated WMC in professional Australian Sign Language (Auslan)/English interpreters and deaf signers. Thirty-one professional Auslan/English interpreters (14 hearing native signers and 17 hearing non-native signers) completed an English listening span task and then an Auslan working memory span task, which tested their English WMC and their Auslan WMC, respectively. Moreover, 26 deaf signers (6 deaf native signers and 20 deaf non-native signers) completed the Auslan working memory span task. The results revealed a non-significant difference between the hearing native signers and the hearing non-native signers in their English WMC, and a non-significant difference between the hearing native signers and the hearing non-native signers in their Auslan WMC. Moreover, the results yielded a non-significant difference between the hearing native signers' English WMC and their Auslan WMC, and a non-significant difference between the hearing non-native signers' English WMC and their Auslan WMC. Furthermore, a non-significant difference was found between the deaf native signers and the deaf non-native signers in their Auslan WMC.

New Methods for Designing E-Commerce Databases in Semantic Web Systems (Modern Systems)

The purpose of this paper is to study database models and their efficient use in e-commerce websites. We aim to find a method for storing and retrieving information in e-commerce websites that semantic web applications can work with, and we also study the different technologies used for e-commerce databases. Since one of the most important deficits of the semantic web is the shortage of semantic data (most information is still stored in relational databases), we present an approach to map legacy data stored in relational databases into the Semantic Web using virtually any modern RDF query language, as long as it is closed within RDF. To achieve this goal, we study XML structures for the relational databases of older websites and then move one level above XML, looking for a mapping from the relational model (RDM) to RDF. Given that a large number of semantic web applications rely on the relational model, opening up ways to convert it to XML and RDF in modern (semantic web) systems is important.
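
A minimal sketch of the relational-to-RDF mapping idea is given below, assuming a hypothetical product table and namespace: each row becomes a resource whose primary key forms the URI and whose remaining columns become predicates. The table, columns, and URIs are illustrative and not the paper's actual mapping vocabulary.

```python
# Map a relational table to simple RDF triples (N-Triples-style strings).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, price REAL)")
conn.executemany("INSERT INTO product VALUES (?, ?, ?)",
                 [(1, "laptop", 899.0), (2, "mouse", 19.9)])

NS = "http://example.org/shop/"          # hypothetical namespace

def table_to_triples(cursor, table):
    cursor.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cursor.description]
    for row in cursor.fetchall():
        subject = f"<{NS}{table}/{row[0]}>"      # primary key becomes the resource URI
        for col, value in zip(cols[1:], row[1:]):
            yield f'{subject} <{NS}{col}> "{value}" .'

for triple in table_to_triples(conn.cursor(), "product"):
    print(triple)
```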

Novel Hybrid Method for Gene Selection and Cancer Prediction

Microarray data profiles gene expression on a whole genome scale, therefore, it provides a good way to study associations between gene expression and occurrence or progression of cancer. More and more researchers realized that microarray data is helpful to predict cancer sample. However, the high dimension of gene expressions is much larger than the sample size, which makes this task very difficult. Therefore, how to identify the significant genes causing cancer becomes emergency and also a hot and hard research topic. Many feature selection algorithms have been proposed in the past focusing on improving cancer predictive accuracy at the expense of ignoring the correlations between the features. In this work, a novel framework (named by SGS) is presented for stable gene selection and efficient cancer prediction . The proposed framework first performs clustering algorithm to find the gene groups where genes in each group have higher correlation coefficient, and then selects the significant genes in each group with Bayesian Lasso and important gene groups with group Lasso, and finally builds prediction model based on the shrinkage gene space with efficient classification algorithm (such as, SVM, 1NN, Regression and etc.). Experiment results on real world data show that the proposed framework often outperforms the existing feature selection and prediction methods, say SAM, IG and Lasso-type prediction model.
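
A minimal sketch of an SGS-style pipeline on synthetic data is given below. For illustration, an ordinary Lasso per gene cluster stands in for the Bayesian Lasso and group Lasso steps described above, and the cluster count, penalty, and classifier settings are assumptions.

```python
# Cluster genes, select within each cluster by Lasso, then classify on the kept genes.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Lasso
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))                   # 60 samples, 500 genes (synthetic)
y = (X[:, :5].sum(axis=1) > 0).astype(int)       # labels driven by a few "significant" genes

# Step 1: group correlated genes by clustering the gene (column) profiles.
groups = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(X.T)

# Step 2: within each group, keep genes with non-zero Lasso coefficients.
selected = []
for g in np.unique(groups):
    idx = np.where(groups == g)[0]
    coef = Lasso(alpha=0.05).fit(X[:, idx], y).coef_
    selected.extend(idx[np.abs(coef) > 1e-6])

# Step 3: build the classifier on the shrunken gene space.
Xs = X[:, sorted(selected)] if selected else X
print("genes kept:", Xs.shape[1])
print("CV accuracy:", cross_val_score(SVC(kernel="linear"), Xs, y, cv=5).mean())
```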

Mapping Complex, Large-Scale Spiking Networks on Neural VLSI

Traditionally, VLSI implementations of spiking neural nets have featured large neuron counts for fixed computations or small exploratory, configurable nets. This paper presents the system architecture of a large configurable neural net system employing a dedicated mapping algorithm for projecting the targeted biology-analog nets and dynamics onto the hardware with its attendant constraints.

Investigations into the Effect of Neural Network Predictive Control of UPFC for Improving the Transient Stability Performance of a Multimachine Power System

The paper presents an investigation into the effect of neural network predictive control of a UPFC on the transient stability performance of a multimachine power system. The proposed controller consists of a neural network model of the test system. This model is used to predict the future control inputs using the damped Gauss-Newton method, which employs ‘backtracking’ as the line search method for step selection. The benchmark two-area, four-machine system that mimics the behavior of large power systems is taken as the test system for the study and is subjected to three-phase short-circuit faults at different locations over a wide range of operating conditions. The simulation results clearly establish the robustness of the proposed controller to the fault location, an increase in the critical clearing time for the circuit breakers, and improved damping of the power oscillations as compared to the conventional PI controller.
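
The sketch below illustrates a damped Gauss-Newton step with a backtracking line search, the optimizer named above for computing the future control inputs. The residual function is a toy least-squares problem, not the UPFC or plant model.

```python
# Damped Gauss-Newton with backtracking line search on a toy residual function.
import numpy as np

def residuals(x):
    # Hypothetical residual vector r(x); in the controller this would be the
    # difference between predicted and desired plant outputs over the horizon.
    return np.array([x[0] - 1.0, 10.0 * (x[1] - x[0] ** 2)])

def jacobian(x):
    return np.array([[1.0, 0.0], [-20.0 * x[0], 10.0]])

def damped_gauss_newton(x, mu=1e-3, iters=50):
    for _ in range(iters):
        r, J = residuals(x), jacobian(x)
        # Damped normal equations: (J^T J + mu I) dx = -J^T r
        dx = np.linalg.solve(J.T @ J + mu * np.eye(len(x)), -J.T @ r)
        step, cost = 1.0, r @ r
        # Backtracking: halve the step until the squared residual norm decreases.
        while residuals(x + step * dx) @ residuals(x + step * dx) > cost and step > 1e-8:
            step *= 0.5
        x = x + step * dx
    return x

print(damped_gauss_newton(np.array([-1.0, 1.0])))   # converges near [1, 1]
```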

An Off-the-Shelf Scheme for Dependable Grid Systems Using Virtualization

Recently, grid computing has attracted wide attention in science, industry, and business, fields that require a vast amount of computing. Grid computing provides an environment in which many nodes (i.e., many computers) are connected to each other through a local or global network and made available to many users. In this environment, to achieve data processing among nodes for any application, each node performs mutual authentication using certificates issued by the Certificate Authority (CA). However, if a failure or fault occurs in the CA, no new certificates can be issued, and as a result a new node cannot participate in the grid environment. In this paper, an off-the-shelf scheme for dependable grid systems using virtualization techniques is proposed and its implementation is verified. The proposed approach uses virtualization techniques to restart an application, e.g., the CA, if it has failed, so the system can tolerate a failure or fault occurring in the CA. Since the proposed scheme is easily implemented at the application level, its implementation cost for the system builder is low compared with other methods. Simulation results show that the CA in the system can recover from its failure or fault.
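
A minimal sketch of the application-level restart idea is shown below: a watchdog re-launches the CA process whenever it exits, standing in for restarting the CA service inside a virtual machine. The command and polling interval are placeholders, not the scheme's actual components.

```python
# Application-level watchdog that restarts a process when it stops.
import subprocess
import time

CA_COMMAND = ["openssl", "version"]   # placeholder for the real CA service command

def run_with_watchdog(poll_seconds=1.0):
    proc = subprocess.Popen(CA_COMMAND)
    while True:
        if proc.poll() is not None:           # process has exited (failure or fault)
            print("CA process stopped, restarting...")
            proc = subprocess.Popen(CA_COMMAND)
        time.sleep(poll_seconds)

# run_with_watchdog()   # uncomment to run; loops forever by design
```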

Analysing the Elementary Science and Technology Coursebook and Student Workbook in Terms of Constructivism

The curriculum of the primary school science course was redesigned on the basis of constructivism in the 2005-2006 academic year in Turkey. In this context, the name of the course was changed to "Science and Technology", and both the content and the coursebooks and student workbooks for the course were redesigned in light of constructivism. The aim of this study is to determine whether the 5th grade primary school Science and Technology coursebook and student workbook are appropriate for constructivism by evaluating them against its fundamental principles. In this study, the documentation technique (i.e., document analysis), a qualitative research method, is applied; criterion sampling, a purposeful sampling technique, is used to select the samples. When the 5th grade Science and Technology coursebook and workbook are examined, it is seen that both books complement each other in certain areas. Consequently, it can be claimed that, in spite of some inadequate and missing points in the 5th grade primary school Science and Technology coursebook and workbook, an attempt has been made to design these books according to the principles of constructivism. To overcome the inadequacies in the books, redesigning them can be suggested. In addition, so that the technology dimension of the course is not ignored, activities that encourage students to prepare projects using the technology cycle should be included.

Decision Support for the Selection of Electric Power Plants Generated from Renewable Sources

Decision support based on risk analysis for comparing electricity generation from different renewable energy technologies can provide information about their effects on the environment and society. The aim of this paper is to develop an assessment framework covering the risks to health and the environment, and the benefits to society, of electric power plants generating from different renewable sources. A multicriteria framework based on the multiattribute risk analysis technique and the decision analysis interview technique is applied to support the decision-making process for implementing renewable energy projects in the Bangkok case study. Having analysed the local conditions and appropriate technologies, five renewable power plants are postulated as options. As this work demonstrates, the analysis can provide a tool to aid decision-makers in achieving targets related to promoting a sustainable energy system.
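
As a simple illustration of the multiattribute step, the sketch below scores hypothetical renewable options with a weighted sum over risk and benefit attributes. The attributes, weights, and scores are invented for illustration and are not the study's elicited values.

```python
# Weighted multiattribute scoring of illustrative power-plant options.
options = {
    "solar PV": {"health risk": 0.2, "environmental risk": 0.3, "social benefit": 0.8},
    "wind":     {"health risk": 0.1, "environmental risk": 0.2, "social benefit": 0.7},
    "biomass":  {"health risk": 0.5, "environmental risk": 0.6, "social benefit": 0.6},
}
weights = {"health risk": -0.4, "environmental risk": -0.3, "social benefit": 0.3}  # risks weighted negatively

def overall_score(attrs):
    return sum(weights[a] * v for a, v in attrs.items())

for name, attrs in sorted(options.items(), key=lambda kv: overall_score(kv[1]), reverse=True):
    print(f"{name}: {overall_score(attrs):+.2f}")
```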

X-ray Pulse Profiles of PSR J0538+2817

This paper reports our analysis of 163 ks of observations of PSR J0538+2817 with the Rossi X-Ray Timing Explorer (RXTE). The pulse profiles, detected up to 60 keV, show a single peak, as in the radio band. The profile is well described by a single Gaussian function with a full width at half maximum (FWHM) of 0.04794. We compared the arrival times of the radio and X-ray pulse profiles for the first time; it turns out that the radio pulse phase precedes the X-ray one by 8.7 ± 4.5 ms. Furthermore, we obtained the pulse profiles in the energy ranges 2.29-6.18 keV, 6.18-12.63 keV, and 12.63-17.36 keV. The pulse intensity decreases as the energy range increases. We discuss the emission geometry in our work.
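
A minimal sketch of fitting a single Gaussian to a folded pulse profile and converting its width to FWHM is shown below, on a synthetic profile; the RXTE data themselves are not reproduced.

```python
# Fit one Gaussian to a synthetic folded pulse profile and report its FWHM.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(phase, amp, mean, sigma, base):
    return amp * np.exp(-0.5 * ((phase - mean) / sigma) ** 2) + base

phase = np.linspace(0.0, 1.0, 128)
true = gaussian(phase, 50.0, 0.5, 0.04794 / 2.355, 10.0)     # FWHM ~ 2.355 * sigma
counts = np.random.default_rng(1).poisson(true).astype(float)

popt, _ = curve_fit(gaussian, phase, counts, p0=[40.0, 0.5, 0.03, 10.0])
fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(popt[2])
print(f"fitted FWHM in phase units: {fwhm:.5f}")
```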

A GA-Based Role Assignment Approach for Web-based Cooperative Learning Environments

Web-based cooperative learning focuses on (1) the interaction and collaboration of community members, and (2) the sharing and distribution of knowledge and expertise through network technology to enhance learning performance. Numerous research studies on web-based cooperative learning have demonstrated that cooperative scripts have a positive impact on specifying, sequencing, and assigning cooperative learning activities. In addition, the literature indicates that role-play in web-based cooperative learning environments helps two or more students work together toward the completion of a common goal. Since students generally do not know each other and lack the face-to-face contact that is necessary for negotiating the assignment of group roles in web-based cooperative learning environments, this paper extends the application of the genetic algorithm (GA) and proposes a GA-based algorithm to tackle the problem of role assignment in web-based cooperative learning environments, which not only saves communication costs but also reduces conflict between group members in negotiating role assignments.
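
A minimal sketch of a GA for role assignment is given below. The fitness function uses a random student-role preference matrix with a balance penalty as a stand-in for the paper's objective of saving communication costs and reducing conflict; all parameters are illustrative.

```python
# Genetic algorithm that assigns one of N_ROLES roles to each of N_STUDENTS students.
import random

random.seed(0)
N_STUDENTS, N_ROLES = 8, 4
preference = [[random.random() for _ in range(N_ROLES)] for _ in range(N_STUDENTS)]

def fitness(chrom):
    # Sum of preferences, penalizing unbalanced role counts.
    score = sum(preference[s][r] for s, r in enumerate(chrom))
    counts = [chrom.count(r) for r in range(N_ROLES)]
    return score - 0.5 * sum(abs(c - N_STUDENTS / N_ROLES) for c in counts)

def crossover(a, b):
    cut = random.randrange(1, N_STUDENTS)
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.1):
    return [random.randrange(N_ROLES) if random.random() < rate else g for g in chrom]

population = [[random.randrange(N_ROLES) for _ in range(N_STUDENTS)] for _ in range(30)]
for _ in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    population = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                            for _ in range(20)]

best = max(population, key=fitness)
print("best role assignment:", best, "fitness:", round(fitness(best), 3))
```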

An Analysis of Blackouts for Electric Power Transmission Systems

In this paper, an analysis of blackouts in electric power transmission systems is carried out using a model and studied in simple networks with a regular topology. The proposed model describes load demand and network improvements evolving on a slow timescale, as well as the fast dynamics of cascading overloads and outages.
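
A minimal sketch of the fast cascading-overload dynamics on a toy ring network is given below; the capacities and load-redistribution rule are assumptions and not the paper's exact model.

```python
# Trip overloaded lines and redistribute part of their load to intact neighbours.
def cascade(loads, capacity, redistribution=0.5):
    failed = set()
    changed = True
    while changed:
        changed = False
        for i, load in enumerate(loads):
            if i not in failed and load > capacity:
                failed.add(i)
                changed = True
                # Shed load onto the nearest intact neighbours (ring topology).
                for j in (i - 1, (i + 1) % len(loads)):
                    if j not in failed:
                        loads[j] += redistribution * load
    return failed

loads = [0.8, 0.9, 1.2, 0.7, 0.85, 0.95]   # per-unit line loadings, one line overloaded
print("lines tripped in the cascade:", sorted(cascade(loads, capacity=1.0)))
```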

Levenberg-Marquardt Algorithm for Karachi Stock Exchange Share Rates Forecasting

Financial forecasting is an example of a signal processing problem. A number of ways to train the network are available. We have used the Levenberg-Marquardt algorithm with error back-propagation for weight adjustment. Pre-processing of the data has rescaled much of the large-scale variation to a smaller scale, reducing the variation of the training data.
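
A minimal sketch of Levenberg-Marquardt weight adjustment for a tiny one-hidden-layer network forecasting the next value of a synthetic, rescaled series is shown below; the actual Karachi Stock Exchange data and the paper's network architecture are not reproduced.

```python
# Fit a small feed-forward network to a lagged series with Levenberg-Marquardt.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(0, 0.1, 200)) + 10.0     # synthetic price series
series = (series - series.mean()) / series.std()        # pre-processing: rescale

LAGS, HIDDEN = 3, 4
X = np.array([series[i:i + LAGS] for i in range(len(series) - LAGS)])
y = series[LAGS:]

def unpack(w):
    w1 = w[:LAGS * HIDDEN].reshape(LAGS, HIDDEN)
    b1 = w[LAGS * HIDDEN:LAGS * HIDDEN + HIDDEN]
    w2 = w[LAGS * HIDDEN + HIDDEN:-1]
    b2 = w[-1]
    return w1, b1, w2, b2

def residuals(w):
    w1, b1, w2, b2 = unpack(w)
    hidden = np.tanh(X @ w1 + b1)
    return hidden @ w2 + b2 - y

w0 = rng.normal(0, 0.1, LAGS * HIDDEN + HIDDEN + HIDDEN + 1)
fit = least_squares(residuals, w0, method="lm")          # Levenberg-Marquardt
print("training RMSE:", np.sqrt(np.mean(fit.fun ** 2)))
```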

Using Partnerships to Achieve National Goals

Ireland developed a National Strategy 2030 that argued for the creation of a new form of higher education institution, a Technological University. The research reported here reviews the first stage of this partnership development. The study found that national policy can create system capacity and change, but that individual partners may have more to gain or lose in collaborating. When collaboration is presented as a zero-sum activity, fear among partners is high. The level of knowledge and networking within the higher education system possessed by each partner contributed to decisions to participate or not in a joint proposal for collaboration. Greater success resulted when there were gains for all partners. This research concludes that policy mandates can provide motivation to collaborate, but that the partnership needs to be built on shared values rather than coercion by mandates.

A New Scheduling Algorithm Based on Traffic Classification Using Imprecise Computation

Wireless channels are characterized by more serious bursty and location-dependent errors. Many packet scheduling algorithms have been proposed for wireless networks to guarantee fairness and delay bounds. However, most existing schemes do not consider the differences in traffic nature among packet flows. This causes the delay-weight coupling problem; in particular, serious queuing delays may be incurred for real-time flows. In this paper, a scheduling algorithm is proposed that takes the traffic types of flows into consideration when scheduling packets and provides scheduling flexibility by trading off video quality to meet the playback deadline.
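
A minimal sketch of the traffic-aware scheduling idea with imprecise computation is given below: real-time packets are served earliest-deadline-first, and their optional, quality-enhancing part is dropped when the playback deadline is at risk. The packet fields and timings are hypothetical, not the paper's exact algorithm.

```python
# Earliest-deadline-first service for real-time flows with an optional part that
# can be dropped (imprecise computation) when the deadline would be missed.
import heapq

def schedule(packets, now=0.0, link_time_per_unit=1.0):
    """packets: list of dicts with keys flow, kind, deadline, mandatory, optional."""
    queue = [(p["deadline"] if p["kind"] == "realtime" else float("inf"), i, p)
             for i, p in enumerate(packets)]
    heapq.heapify(queue)
    order = []
    while queue:
        _, _, p = heapq.heappop(queue)
        size = p["mandatory"]
        # Send the optional part only if the deadline still allows it.
        if p["kind"] != "realtime" or now + (size + p["optional"]) * link_time_per_unit <= p["deadline"]:
            size += p["optional"]
        now += size * link_time_per_unit
        order.append((p["flow"], size))
    return order

packets = [
    {"flow": "video", "kind": "realtime",   "deadline": 5.0,          "mandatory": 2, "optional": 2},
    {"flow": "ftp",   "kind": "besteffort", "deadline": float("inf"), "mandatory": 3, "optional": 0},
    {"flow": "voice", "kind": "realtime",   "deadline": 3.0,          "mandatory": 1, "optional": 1},
]
print(schedule(packets))
```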

New Product Development Process on High-Tech Innovation Life Cycle

This work provides a new perspective for exploring the theme of innovation. It reveals that radical and incremental innovations are complementary during the innovation life cycle and are accomplished through distinct ways of developing new products. Each new product development process is constructed according to the nature of each innovation and the state of the product development. This paper proposes including the organizational functional areas that influence new product development in the new product development process.

Pervasive Differentiated Services: A QoS Model for Pervasive Systems

In this article, we introduce a mechanism by which the concept of differentiated services used in network transmission can be applied to provide quality-of-service levels to pervasive systems applications. The components of the classical DiffServ model, including marking and classification, assured forwarding, and expedited forwarding, are all utilized to create quality-of-service guarantees for various pervasive applications requiring different levels of quality of service. Through a collection of sensors, personal devices, and data sources, the transmission of context-sensitive data can automatically occur within a pervasive system with a given quality-of-service level. Triggers, initiators, sources, and receivers are the four entities defined in our mechanism. An explanation of the role of each is provided, along with how quality of service is guaranteed.
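
A minimal sketch of DiffServ-style marking for pervasive context data is shown below; the class names map to expedited forwarding, assured forwarding, and best effort, while the data sources and urgency thresholds are illustrative assumptions rather than the article's actual mechanism.

```python
# Mark pervasive context messages with a DiffServ-style per-hop behaviour class.
from dataclasses import dataclass

@dataclass
class ContextMessage:
    source: str        # e.g., "heart-rate sensor", "room thermostat"
    urgency: float     # 0.0 (background) .. 1.0 (critical)

def classify(msg: ContextMessage) -> str:
    """Return the forwarding class a DiffServ-style marker would assign."""
    if msg.urgency >= 0.8:
        return "EF"        # expedited forwarding: e.g., fall-detection alarms
    if msg.urgency >= 0.4:
        return "AF"        # assured forwarding: routine vital-sign updates
    return "BE"            # best effort: background environment logging

for m in [ContextMessage("fall detector", 0.95),
          ContextMessage("heart-rate sensor", 0.5),
          ContextMessage("room thermostat", 0.1)]:
    print(m.source, "->", classify(m))
```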

Innovative Teaching in Systems Analysis and Design - an Action Research Project

Systems Analysis and Design is a key subject in Information Technology courses, but students do not find it easy to cope with, since it is not "precise" like programming and not exact like Mathematics. It is a subject that works with many concepts, modelling ideas into visual representations and then translating the pictures into a real-life system. To complicate matters, users who are not necessarily familiar with computers need to give their input to ensure that they get the system they need. Systems Analysis and Design also covers two fields, namely Analysis, focusing on the analysis of the existing system, and Design, focusing on the design of the new system. To be able to test the analysis and design of a system, it is necessary to develop a system, or at least a prototype of it, to test the validity of the analysis and design. The skills necessary for each aspect differ vastly: project management skills, database knowledge, and object-oriented principles are all necessary. In the context of a developing country where students enter tertiary education underprepared and the digital divide is alive and well, students need to be motivated to learn the necessary skills and get an opportunity to test them in a "live" but protected environment, within the framework of a university. The purpose of this article is to improve the learning experience in Systems Analysis and Design by reviewing the underlying teaching principles used, the teaching tools implemented, the observations made, and the reflections that will influence future developments in Systems Analysis and Design. Action research principles allow the focus to be on a few problematic aspects during a particular semester.

Functionalization and Characterization of Carbon Nanotubes/Polypropylene Nanocomposite

Chemical and physical functionalization of multiwalled carbon nanotubes (MWCNTs) has commonly been practised to achieve better dispersion of carbon nanotubes (CNTs) in a polymer matrix. This work describes various functionalization methods (acid treatment, non-ionic surfactant treatment with Triton X-100), the fabrication of MWCNT/PP nanocomposites via melt blending, and the characterization of their mechanical properties. Microscopy and spectroscopy analyses (FESEM, TEM, XPS) showed effective purification of the MWCNTs under acid treatment, and better dispersion when the chemical and physical functionalization techniques were combined, in their respective order. Tensile tests showed an increase in tensile strength for the nanocomposites containing up to 2 wt% MWCNTs. A decrease in tensile strength was seen in samples containing 4 wt% MWCNTs for both the raw and Triton X-100 functionalized materials, signifying MWCNT degradation/re-bundling at compositions with higher MWCNT content. For the acid-treated MWCNTs, however, the tensile results showed a slight improvement even at 4 wt%, indicating effective dispersion of the MWCNTs.