Tuning Neurons to Interaural Intensity Differences Using Spike Timing-Dependent Plasticity

Mammals are known to use the Interaural Intensity Difference (IID) to determine the azimuthal position of high-frequency sounds. In the Lateral Superior Olive (LSO), neurons have firing behaviours that vary systematically with IID. Those neurons receive excitatory inputs from the ipsilateral ear and inhibitory inputs from the contralateral one. The IID sensitivity of an LSO neuron is thought to arise from delay differences between the two ears, caused both by differing synaptic delays and by intensity-dependent delays. In this paper we model the auditory pathway up to the LSO. Inputs to LSO neurons are initially numerous and differ in their relative delays. Spike Timing-Dependent Plasticity is then used to prune those connections. We compare the pruned neuron responses with physiological data and analyse the relationship between the IIDs of teacher stimuli and the IID sensitivities of trained LSO neurons.
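
For illustration, a minimal sketch of a pair-based STDP rule of the kind that could prune a bank of delayed inputs; the time constants, learning rates, and pruning threshold below are assumptions for illustration, not the parameters used in the paper.

```python
import numpy as np

# Pair-based STDP: potentiate when the presynaptic spike precedes the
# postsynaptic spike, depress otherwise; then prune weak synapses.
A_PLUS, A_MINUS = 0.01, 0.012      # potentiation / depression amplitudes (assumed)
TAU_PLUS, TAU_MINUS = 10.0, 10.0   # time constants in ms (assumed)
PRUNE_THRESHOLD = 0.05             # weights below this are removed (assumed)

def stdp_update(w, dt):
    """Update synaptic weight w given dt = t_post - t_pre in ms."""
    if dt >= 0:                                    # pre before post -> potentiate
        w += A_PLUS * np.exp(-dt / TAU_PLUS)
    else:                                          # post before pre -> depress
        w -= A_MINUS * np.exp(dt / TAU_MINUS)
    return float(np.clip(w, 0.0, 1.0))

def prune(weights):
    """Keep only connections whose weight survived training."""
    return {syn: w for syn, w in weights.items() if w >= PRUNE_THRESHOLD}
```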

The Reliability of the Improved e-N Method for Transition Prediction as Checked by PSE Method

Transition prediction of boundary layers has always been an important problem in fluid mechanics, both theoretically and practically, yet notwithstanding the great effort made by many investigators, there is no satisfactory answer to this problem. The most popular method available is the so-called e-N method, which is heavily dependent on experiments and experience. The author has proposed improvements to the e-N method so as to reduce its dependence on experiments and experience to a certain extent. One of the key assumptions is that transition occurs whenever the velocity amplitude of the disturbance reaches 1-2% of the free-stream velocity. However, the reliability of this assumption needs to be verified. In this paper, transition prediction on a flat plate is investigated using both the improved e-N method and the parabolized stability equations (PSE) method. The results show that the transition locations predicted by the two methods agree reasonably well with each other under the above assumption. For the supersonic case, the critical velocity amplitude in the improved e-N method should be taken as 0.013, whereas in the subsonic case it should be 0.018; both values lie within the 1-2% range.
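
For reference, a sketch of the standard e-N formulation underlying this criterion (the notation is assumed here, not taken from the paper): the disturbance amplitude grows as

\[
A(x) = A_0 \exp\bigl(N(x)\bigr), \qquad N(x) = \int_{x_0}^{x} (-\alpha_i)\,\mathrm{d}x ,
\]

where \(\alpha_i\) is the local spatial amplification rate from linear stability theory, and the improved criterion declares transition at the station \(x_{tr}\) where \(A(x_{tr})/U_\infty = A_c\), with \(A_c \approx 0.013\) in the supersonic case and \(0.018\) in the subsonic case.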

Modular Workflow System for HPC Applications

Nowadays, HPC, Grid, and Cloud systems are evolving very rapidly. However, the development of infrastructure solutions related to HPC is lagging behind. While the existing infrastructure is sufficient for simple cases, many computational problems have more complex requirements. Such computational experiments use different resources simultaneously to start a large number of computational jobs. These resources are heterogeneous: they differ in purpose, architecture, performance, and installed software. Users need a convenient tool that allows them to describe and run complex computational experiments in an HPC environment. This paper introduces a modular workflow system called SEGL, which makes it possible to run complex computational experiments within a real HPC organization. The system can be used in a wide range of organizations that provide HPC resources. Key requirements for the system are high efficiency and interoperability with an organization's existing HPC infrastructure without requiring any changes to it.

Red Diode Laser in the Treatment of Epidermal Diseases in PDT

The absorption of laser light in the skin during irradiation is a critical factor in medical treatments. Delivering the correct amount of laser light is a critical element in photodynamic therapy (PDT): too much light can damage skin tissue, while too little fails to enhance the PDT procedure. Knowledge of the skin-tone-dependent distribution of 635 nm radiation and of its penetration depth in skin is an important precondition for investigating beneficial laser-induced effects in PDT for epidermal diseases such as psoriasis. The aim of this work was to estimate the optimum effect of a 635 nm diode laser in the treatment of epidermal diseases for different skin colors and, furthermore, to improve the safety of laser use in PDT for such treatments. The Advanced Systems Analysis Program (ASAP), a new approach to investigating PDT that depends on the optical properties of different skin colors, was used in the present work. A two-layered Realistic Skin Model (RSM), comprising stratum corneum and epidermis and irradiated with a red laser (635 nm, 10 mW), was used for radiative transfer to study fluence and absorbance at different penetration depths for various human skin colors. Several skin tones (very fair, fair, light, medium, and dark) were included in the radiative transfer calculations. This investigation involved the principles of laser-tissue interaction when the skin is optically injected with light from a red laser diode. The results demonstrate that the power characteristics of a 635 nm laser diode can affect the treatment of epidermal disease in various skin colors. The power absorption of the various human skin types was recorded and analyzed in order to determine the influence of melanin on PDT treatment of epidermal disease. The two-layered RSM shows that the change in penetration depth in the epidermal layer of colored skin has a large effect on the distribution of absorbed laser light in the skin, owing to the variation in melanin concentration for each color.
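
ASAP itself is a ray-tracing code; the one-dimensional Beer-Lambert sketch below is only a crude stand-in to illustrate how a higher melanin-related absorption coefficient lowers fluence with depth. All coefficients and the layer thickness are hypothetical, not values from the paper.

```python
import numpy as np

# Crude 1-D attenuation sketch: fluence decays exponentially with depth,
# faster for darker skin tones (larger assumed absorption coefficient).
mu_a = {"very fair": 0.5, "fair": 1.0, "light": 2.0, "medium": 4.0, "dark": 8.0}  # 1/mm, assumed
depth_mm = np.linspace(0.0, 0.3, 7)   # through stratum corneum + epidermis (assumed thickness)
phi0 = 10.0                           # incident power of the 635 nm diode, mW

for tone, mu in mu_a.items():
    fluence = phi0 * np.exp(-mu * depth_mm)
    print(f"{tone:10s}", fluence.round(2))
```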

Extrapolation of Clinical Data from an Oral Glucose Tolerance Test Using a Support Vector Machine

To extract the important physiological factors related to diabetes from an oral glucose tolerance test (OGTT) by mathematical modeling, highly informative yet convenient protocols are required. Current models require a large number of samples and an extended period of testing, which is not practical for daily use. The purpose of this study is to make model assessments possible even from a reduced number of samples taken over a relatively short period. For this purpose, test values were extrapolated using a support vector machine. A good correlation was found between reference and extrapolated values in the 741 OGTTs evaluated. This result indicates that a reduction in the number of clinical tests is possible through a computational approach.
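
A minimal sketch of the extrapolation idea using support vector regression; the feature layout (early OGTT samples as inputs, a late sample as the target), the kernel, and the placeholder data are assumptions, not the paper's protocol.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# X_early: one row per subject, columns = glucose/insulin values at early time points.
# y_late:  the reference late value (e.g. 120 min glucose) taken from full-length tests.
rng = np.random.default_rng(0)
X_early = rng.normal(size=(100, 4))                                   # placeholder data
y_late = X_early @ np.array([0.5, 0.3, -0.2, 0.1]) + rng.normal(scale=0.1, size=100)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_early, y_late)
predicted_late = model.predict(X_early[:5])   # extrapolated test values for new subjects
```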

Automatic Visualization Pipeline Formation for Medical Datasets on Grid Computing Environment

Distance visualization of large datasets often takes the direction of remote viewing and zooming techniques of stored static images. However, the continuous increase in the size of datasets and visualization operation causes insufficient performance with traditional desktop computers. Additionally, the visualization techniques such as Isosurface depend on the available resources of the running machine and the size of datasets. Moreover, the continuous demand for powerful computing powers and continuous increase in the size of datasets results an urgent need for a grid computing infrastructure. However, some issues arise in current grid such as resources availability at the client machines which are not sufficient enough to process large datasets. On top of that, different output devices and different network bandwidth between the visualization pipeline components often result output suitable for one machine and not suitable for another. In this paper we investigate how the grid services could be used to support remote visualization of large datasets and to break the constraint of physical co-location of the resources by applying the grid computing technologies. We show our grid enabled architecture to visualize large medical datasets (circa 5 million polygons) for remote interactive visualization on modest resources clients.
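
As an illustration of the isosurface-extraction stage of such a pipeline (the grid-service wrapping and the medical dataset are not shown; a synthetic volume stands in), a marching-cubes sketch:

```python
import numpy as np
from skimage import measure

# Build a placeholder scalar volume and extract a triangulated isosurface.
x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = x**2 + y**2 + z**2                          # synthetic stand-in for a CT/MRI volume

verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
print(f"{len(verts)} vertices, {len(faces)} triangles")   # payload sent to the rendering client
```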

Electronic Transactions: Jurisdictional Issues in the European Union

One of the main consequences of the ubiquitous use of the Internet as a means to conduct business has been the progressive internationalization of the contracts created to support such transactions. As electronic commerce becomes international commerce, the reality is that commercial disputes will occur, raising such questions as: "In which country do I bring proceedings?" and "Which law is to be applied to solve disputes?" The decentralized and global structure of the Internet and its decentralized operation have given e-commerce a transnational element that affects two questions essential to any transaction: applicable law and jurisdiction in the event of a dispute. The allocation of applicable law and jurisdiction among States in respect of international transactions has traditionally been based on the use of contact factors, generally of a territorial nature (the place where real estate is located, customary residence, principal establishment, place of shipment of goods). The characteristics of the Internet as a new space sometimes make it difficult to apply these rules, and may make them inoperative or lead to results that are surprising or totally foreign to the contracting parties and the other elements and circumstances of the case.

Flood Hazard Mapping in Dikrong Basin of Arunachal Pradesh (India)

Flood zoning studies have become more efficient in recent years because of the availability of advanced computational facilities and the use of Geographic Information Systems (GIS). In the present study, flood-inundated areas were mapped using GIS for the Dikrong river basin of Arunachal Pradesh, India, corresponding to different return periods (2, 5, 25, 50, and 100 years). Further, the developed inundation maps corresponding to the 25, 50, and 100 year return period floods were compared with the corresponding maps developed by conventional methods, as reported in the Brahmaputra Board Master Plan for the Dikrong basin. It was found that the average deviation of the modelled flood inundation areas from the reported inundation areas is below 5% (4.52%). Therefore, the modelled flood inundation areas match the reported inundation areas satisfactorily. Hence, GIS techniques proved successful in extracting the flood inundation extent in a time- and cost-effective manner for the remotely located hilly Dikrong basin, where conducting conventional surveys is very difficult.
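
A small sketch of how such an average deviation figure is computed; the area values below are placeholders, not the study's actual inundation areas.

```python
# Percent deviation of modelled vs. reported inundation areas, averaged over return periods.
return_periods = [25, 50, 100]
modelled_area = {25: 95.0, 50: 110.0, 100: 123.0}    # km^2, hypothetical
reported_area = {25: 100.0, 50: 105.0, 100: 118.0}   # km^2, hypothetical

deviations = [abs(modelled_area[t] - reported_area[t]) / reported_area[t] * 100.0
              for t in return_periods]
average_deviation = sum(deviations) / len(deviations)
print(f"average deviation = {average_deviation:.2f}%")
```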

EEG Spikes Detection, Sorting, and Localization

This study introduces a new method for detecting, sorting, and localizing spikes from multiunit EEG recordings. The method combines the wavelet transform, which localizes distinctive spike features, with the Super-Paramagnetic Clustering (SPC) algorithm, which allows automatic classification of the data without assumptions such as low variance or Gaussian distributions. Moreover, the method is capable of setting amplitude thresholds for spike detection. The method was applied to several real EEG data sets, in which the spikes were detected and clustered and their firing times determined.
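
A sketch of the detection-and-sorting pipeline described above. The threshold rule, wavelet choice, and snippet length are assumptions, and k-means stands in for super-paramagnetic clustering (which has no standard Python implementation) purely for illustration.

```python
import numpy as np
import pywt
from sklearn.cluster import KMeans

def detect_spikes(signal, fs, k=4.0, window=32):
    """Amplitude-threshold detection; returns spike times (s) and waveform snippets."""
    thr = k * np.median(np.abs(signal)) / 0.6745      # robust noise-based threshold
    candidates = np.where(signal > thr)[0]
    times, snippets = [], []
    for i in candidates:
        if times and i - times[-1] < window:          # skip crossings of the same spike
            continue
        if i + window <= len(signal):
            times.append(i)
            snippets.append(signal[i:i + window])
    return np.array(times) / fs, np.array(snippets)

def sort_spikes(snippets, n_units=3, wavelet="haar", level=3):
    """Wavelet coefficients as features per snippet, then clustering into putative units."""
    feats = [np.concatenate(pywt.wavedec(s, wavelet, level=level)) for s in snippets]
    return KMeans(n_clusters=n_units, n_init=10).fit_predict(np.array(feats))
```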

Performance Appraisal System using Multifactorial Evaluation Model

Performance appraisal of employees is important in managing the human resources of an organization. With the change towards knowledge-based capitalism, retaining talented knowledge workers is critical. However, classifying performance as "outstanding", "poor", or "average" may not be an easy decision. Besides that, superiors may tend to judge the work performance of their subordinates informally and arbitrarily, especially in the absence of an appraisal system. In this paper, we propose a performance appraisal system using a multifactorial evaluation model to deal with appraisal grades, which are often expressed vaguely in linguistic terms. The proposed model evaluates staff performance based on specific performance appraisal criteria. The project was a collaboration with an Information and Communication Technology company in Malaysia, with reference to its performance appraisal process.
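
A minimal sketch of a multifactorial (fuzzy) evaluation: criteria weights are combined with membership grades for each linguistic term and the dominant grade is selected. The criteria, weights, and membership values below are hypothetical examples, not the company's appraisal data.

```python
import numpy as np

criteria = ["quality of work", "teamwork", "punctuality"]     # hypothetical criteria
grades = ["outstanding", "average", "poor"]                   # linguistic appraisal grades

W = np.array([0.5, 0.3, 0.2])           # criteria weights, summing to 1 (assumed)
R = np.array([[0.6, 0.3, 0.1],          # membership of each criterion
              [0.2, 0.6, 0.2],          # in each appraisal grade (assumed)
              [0.3, 0.5, 0.2]])

E = W @ R                               # fuzzy evaluation vector over the grades
print(dict(zip(grades, E.round(2))), "->", grades[int(E.argmax())])
```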

Dynamic Stability of Beams with Piezoelectric Layers Located on a Continuous Elastic Foundation

This paper studies the dynamic stability of homogeneous beams with piezoelectric layers that are simply supported at both ends, rest on a continuous elastic foundation, and are subjected to a periodic axial compressive load. The displacement field of the beam is assumed according to Bernoulli-Euler beam theory. Applying Hamilton's principle, the governing dynamic equation is established. The influences of the applied voltage, the foundation coefficient, and the piezoelectric layer thickness on the unstable regions are presented. To investigate the accuracy of the present analysis, a comparison study is carried out with known data.
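
For orientation, a sketch of the standard governing form consistent with this setting (the notation and the reduction step are assumptions, not taken from the paper): for a Bernoulli-Euler beam with piezoelectric layers on a Winkler-type foundation under a pulsating axial load,

\[
(EI)_{\mathrm{eq}}\,\frac{\partial^{4} w}{\partial x^{4}}
+ \bigl[P_{0} + P_{t}\cos\theta t - N_{p}(V)\bigr]\,\frac{\partial^{2} w}{\partial x^{2}}
+ k_{f}\, w
+ (\rho A)_{\mathrm{eq}}\,\frac{\partial^{2} w}{\partial t^{2}} = 0 ,
\]

where \(N_{p}(V)\) is the axial force induced by the applied voltage and \(k_{f}\) is the foundation coefficient. Substituting \(w(x,t) = f(t)\sin(n\pi x/L)\) for the simply supported beam reduces this to a Mathieu-type equation for \(f(t)\), whose unstable regions shift with the applied voltage, the foundation coefficient, and the piezoelectric layer thickness.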

Knowledge Sharing based on Semantic Nets and Mereology to Avoid Risks in Manufacturing

Having the right information at the right time influences both enterprise and technical success. Sharing knowledge among the members of a large organization can be a complex activity, and as long as knowledge is not shared, it cannot be exploited by the organization. Certain mechanisms can initiate knowledge sharing; in this paper, we intend to trigger these mechanisms by using semantic nets. Moreover, the intersection and overlapping of terms and sub-terms, as well as their relationships, are described through mereology for the whole knowledge-sharing system. A knowledge system is proposed to supply operators with the right information about a specific process and its possible risks, e.g. during the assembly process, at the right time in an automated manufacturing environment such as the automotive industry.
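
A toy sketch of a semantic net with mereological (part-of) relations and a risk annotation; the assembly terms and the relation labels are hypothetical examples, not the paper's ontology.

```python
import networkx as nx

net = nx.DiGraph()
net.add_edge("door module", "car body", relation="part_of")
net.add_edge("window lifter", "door module", relation="part_of")
net.add_edge("window lifter", "pinch hazard", relation="has_risk")

def parts_of(whole):
    """Direct and indirect parts of `whole`, following part_of edges."""
    direct = [u for u, v, d in net.edges(data=True)
              if v == whole and d["relation"] == "part_of"]
    return direct + [p for part in direct for p in parts_of(part)]

print(parts_of("car body"))   # ['door module', 'window lifter']
```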

Technology Adoption among Small and Medium Enterprises (SMEs): A Research Agenda

This paper presents a research agenda proposed to develop an integrated model explaining technology adoption by SMEs in Malaysia. SMEs form over 90% of all business entities in Malaysia and have been contributing to the development of the nation. Technology adoption has been a thorny issue among SMEs, as it requires a large outlay that might not be available to them. Although resources have been an issue for SMEs, they cannot lie low and ignore the technological advancements taking place at a rapid pace. With that in mind, this paper proposes a model to explain technology adoption among SMEs.

Robust Probabilistic Online Change Detection Algorithm Based On the Continuous Wavelet Transform

In this article we present a change point detection algorithm based on the continuous wavelet transform. At the beginning of the article we describe a necessary transformation of the signal which has to be made for the purpose of change detection. Then a case study related to iron ore sinter production, which can be solved using the proposed technique, is discussed. After that we describe a probabilistic algorithm which can be used to find changes in the transformed signal. It is shown that our algorithm works well in the presence of noise and abnormal random bursts.
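
A sketch of the transform step followed by a simple detector. The robust z-score rule below is a plain stand-in for the paper's probabilistic algorithm, and the wavelet, scale range, and threshold are assumptions for illustration only.

```python
import numpy as np
import pywt

def cwt_change_scores(signal, scales=np.arange(1, 32), wavelet="mexh"):
    """Per-sample score: RMS of continuous wavelet coefficients across scales."""
    coefs, _ = pywt.cwt(signal, scales, wavelet)
    return np.sqrt((coefs ** 2).mean(axis=0))

def detect_changes(signal, k=4.0):
    """Flag samples whose score is an outlier by a robust (median/MAD) z-score."""
    scores = cwt_change_scores(signal)
    med = np.median(scores)
    mad = np.median(np.abs(scores - med)) + 1e-12
    return np.where((scores - med) / (1.4826 * mad) > k)[0]

# Example: a step change at sample 500 buried in noise
x = np.concatenate([np.zeros(500), np.ones(500)]) + 0.1 * np.random.randn(1000)
print(detect_changes(x)[:5])
```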

Oxidation of Selected Pharmaceuticals in Water Matrices by Bromine and Chlorine

The bromination of five selected pharmaceuticals (metoprolol, naproxen, amoxicillin, hydrochlorothiazide, and phenacetin) in ultrapure water and in three water matrices (a groundwater, a surface water from a public reservoir, and a secondary effluent from a WWTP) was investigated. The apparent rate constants for the bromination reaction were determined as a function of pH, and the sequence obtained for the reaction rate was amoxicillin > naproxen >> hydrochlorothiazide ≈ phenacetin ≈ metoprolol. The proposal of a kinetic mechanism, which accounts for the dissociation of bromine and of each pharmaceutical according to their pKa values and the pH, allowed the determination of the intrinsic rate constants for every elementary reaction. The influence of the main operating conditions (pH, initial bromine dose, and the water matrix) on the degradation of the pharmaceuticals was established. In addition, the effect of the presence of bromide in chlorination experiments was investigated. The presence of bromide in wastewaters and drinking waters in the range of 10 to several hundred μg L-1 slightly accelerated the oxidation of the selected pharmaceuticals during chlorine disinfection.
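
As a sketch of the speciation treatment described (the symbols here are generic, not the paper's notation): with HOBr/OBr- and the neutral and ionized forms of a pharmaceutical P distributed according to pH and the respective pKa values, the observed second-order rate law

\[
-\frac{d[\mathrm{P}]_{T}}{dt} = k_{\mathrm{app}}(\mathrm{pH})\,[\mathrm{Br(I)}]_{T}\,[\mathrm{P}]_{T},
\qquad
k_{\mathrm{app}} = \sum_{i}\sum_{j} k_{ij}\,\alpha_{i}\,\beta_{j},
\]

expresses the apparent rate constant as the mole-fraction-weighted sum of the intrinsic constants \(k_{ij}\) of the elementary reactions between bromine species \(i\) (mole fraction \(\alpha_{i}\)) and pharmaceutical species \(j\) (mole fraction \(\beta_{j}\)).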

Graphical Programming of Programmable Logic Controllers - Case Study for a Punching Machine

The Programmable Logic Controller (PLC) plays a vital role in automation and process control. Grafcet is used for representing the control logic, while traditional programming languages are used for describing the pure algorithms. Grafcet divides the process to be automated into elementary sequences that can be easily implemented. Each sequence represents a step with associated actions that are programmed using textual or graphical languages, as appropriate. The programming task is simplified by using a set of subroutines that are shared by several steps. The paper presents an example implementation for a punching machine for sheets and plates. Using graphical languages is a necessary solution for programming such a complex sequential process. The Grafcet state can be used for debugging and for determining malfunctions. Using this method, combined with knowledge acquisition for the process application, reduces machine downtime and improves productivity.
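
To illustrate the step/transition logic that Grafcet expresses, a minimal Python sketch of one scan cycle; the step names, transition conditions, and actions for the punching machine are hypothetical, and a real implementation would live in a PLC language such as structured text or ladder diagram.

```python
# Grafcet-style steps: each has an action, a transition condition, and a successor step.
steps = {
    "WAIT_SHEET": {"action": "clamp_off",  "condition": "sheet_present", "next": "CLAMP"},
    "CLAMP":      {"action": "clamp_on",   "condition": "clamp_closed",  "next": "PUNCH"},
    "PUNCH":      {"action": "punch_down", "condition": "punch_bottom",  "next": "RETRACT"},
    "RETRACT":    {"action": "punch_up",   "condition": "punch_top",     "next": "WAIT_SHEET"},
}

def scan_cycle(active_step, read_input, write_output):
    """One PLC scan: execute the active step's action, then test its transition."""
    step = steps[active_step]
    write_output(step["action"])          # actions associated with the active step
    if read_input(step["condition"]):     # transition condition satisfied
        return step["next"]               # activate the following step
    return active_step                    # otherwise the step stays active
```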

An Empirical Analysis of the Board Composition Concerning Logistics Competencies

Empirical insights into the implementation of logistics competencies at the top management level are scarce. This paper addresses this issue with an explorative approach based on a dataset of 872 observations from the years 2000, 2004, and 2008, obtained by quantitative content analysis of the annual reports of the 500 publicly listed firms with the highest global research and development expenditures according to the British Department for Business, Innovation and Skills. We find that logistics competencies are more pronounced in Asian companies than in their European or American counterparts. At the industry level the results are quite mixed. Using partial point-biserial correlations, we show that logistics competencies are positively related to financial performance.
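
An illustrative sketch of a point-biserial correlation between a binary board-level indicator and a continuous performance measure; the variable names and placeholder data are assumptions, and the partialling-out step is only noted in a comment.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
has_logistics_board = rng.integers(0, 2, size=200)             # 0/1: logistics competence on the board
roa = 0.02 * has_logistics_board + rng.normal(0, 0.05, 200)    # e.g. return on assets (placeholder)

r, p = stats.pointbiserialr(has_logistics_board, roa)
print(f"point-biserial r = {r:.3f}, p = {p:.4f}")

# A partial point-biserial correlation can be approximated by first regressing both
# variables on control covariates (firm size, industry dummies) and correlating the residuals.
```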

A Hybrid Neural Network and Gravitational Search Algorithm (HNNGSA) Method to Solve the Well-Known Wessinger's Equation

This study presents a hybrid neural network and gravitational search algorithm (HNNGSA) method to solve the well-known Wessinger's equation. To this end, the gravitational search algorithm (GSA) is applied to train a multi-layer perceptron neural network, which is used as an approximate solution of Wessinger's equation. A trial solution of the differential equation is written as the sum of two parts. The first part satisfies the initial/boundary conditions and contains no adjustable parameters, while the second part is constructed so as not to affect the initial/boundary conditions and involves the adjustable parameters (the weights and biases) of the multi-layer perceptron neural network. To demonstrate the presented method, its results are compared with those of some known numerical methods. The results show that the presented method yields a solution closer to the analytic solution than the other numerical methods, and it can easily be extended to solve a wide range of problems.
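
A minimal sketch of the two-part trial solution for a first-order initial-value problem dy/dx = f(x, y), y(0) = y0. The tiny network, the finite-difference residual, and the generic ODE are illustrative assumptions; the paper trains the network parameters with GSA, which is not reproduced here.

```python
import numpy as np

def mlp(x, params):
    """Tiny one-hidden-layer perceptron N(x; params) with params = (W1, b1, W2, b2)."""
    W1, b1, W2, b2 = params
    h = np.tanh(np.outer(x, W1) + b1)
    return h @ W2 + b2

def trial_solution(x, params, y0=1.0):
    # First part (y0) satisfies the initial condition and has no adjustable parameters;
    # the second part (x * N) vanishes at x = 0, so it cannot disturb that condition.
    return y0 + x * mlp(x, params)

def residual_loss(params, x, f):
    """Mean squared ODE residual, minimized over the network parameters (e.g. by GSA)."""
    eps = 1e-4
    y = trial_solution(x, params)
    dy = (trial_solution(x + eps, params) - trial_solution(x - eps, params)) / (2 * eps)
    return np.mean((dy - f(x, y)) ** 2)
```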

An Agent-Based Scheduling Framework for Flexible Manufacturing Systems

The concept of flexible manufacturing is highly appealing in gaining a competitive edge in the market by quickly adapting to the changing customer needs. Scheduling jobs on flexible manufacturing systems (FMSs) is a challenging task of managing the available flexibility on the shop floor to react to the dynamics of the environment in real-time. In this paper, an agent-oriented scheduling framework that can be integrated with a real or a simulated FMS is proposed. This framework works in stochastic environments with a dynamic model of job arrival. It supports a hierarchical cooperative scheduling that builds on the available flexibility of the shop floor. Testing the framework on a model of a real FMS showed the capability of the proposed approach to overcome the drawbacks of the conventional approaches and maintain a near optimal solution despite the dynamics of the operational environment.

The Impact of Video Games in Children's Learning of Mathematics

This paper describes a research project on Year 3 primary school students in Malaysia and their use of a computer-based video game to enhance the learning of multiplication facts (times tables) in Mathematics. The study investigates whether video games actually contribute a positive effect to children's learning or otherwise. In conducting the study, the researchers assumed a neutral stand, as an unbiased outcome would provide a reliable answer on the impact of video games in education, contributing both to the literature on technology-based education and to the pedagogical aspect of formal education. A subject (Mathematics) with a specific topic area (multiplication facts) was chosen, and the study adopts a causal-comparative design to investigate the impact of including a computer-based video game designed to teach multiplication facts to primary-level students. The sample of 100 students was divided into two groups: group A, taught conventionally, and group B, taught conventionally but aided by video games. The conventional group (A) was taught multiplication facts (times tables) and skills conventionally. The other group (B) underwent the same lessons but with a supplementary activity: a computer-based video game on multiplication called Timez Attack. Marks from the pre-test were compared with those from the post-test using comparisons of means, t-tests, and ANOVA to investigate the impact of computer games as an added learning activity. The findings revealed that video games as a supplementary activity to classroom learning have a significant and positive effect on students' retention and mastery of multiplication tables compared with students who rely only upon formal classroom instruction.
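
An illustrative sketch of the between-group comparison of pre-/post-test gains; the score arrays are random placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
gain_conventional = rng.normal(loc=5.0, scale=3.0, size=50)   # group A: post-test minus pre-test
gain_with_game = rng.normal(loc=8.0, scale=3.0, size=50)      # group B: post-test minus pre-test

t, p = stats.ttest_ind(gain_with_game, gain_conventional)     # independent-samples t-test
f, p_anova = stats.f_oneway(gain_with_game, gain_conventional) # one-way ANOVA (equivalent here)
print(f"t = {t:.2f}, p = {p:.4f}; F = {f:.2f}, p = {p_anova:.4f}")
```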