Burst on Hurst Algorithm for Detecting Activity Patterns in Networks of Cortical Neurons

Electrophysiological signals were recorded from primary cultures of dissociated rat cortical neurons coupled to Micro-Electrode Arrays (MEAs). The neuronal discharge patterns may change under varying physiological and pathological conditions. For this reason, we developed a new burst detection method able to identify bursts with peculiar features under different experimental conditions (i.e. spontaneous activity and under the effect of specific drugs). The main feature of our algorithm (i.e. Burst On Hurst), which is based on the auto-similarity, or fractal, property of the recorded signal, is its independence from the chosen spike detection method, since it works directly on the raw data.
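
The abstract does not give the algorithm's details; purely as an illustration of the self-similarity idea it relies on, the sketch below estimates the Hurst exponent of a raw signal with a simple rescaled-range (R/S) analysis. The window sizes, the white-noise test signal, and the use of R/S analysis itself are illustrative assumptions, not the published Burst On Hurst procedure.

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent of signal x by rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        if n > len(x):
            break
        # split the signal into non-overlapping windows of length n
        windows = x[: (len(x) // n) * n].reshape(-1, n)
        rs_values = []
        for w in windows:
            dev = np.cumsum(w - w.mean())      # cumulative deviation from the window mean
            r = dev.max() - dev.min()          # range of the cumulative deviation
            s = w.std(ddof=0)                  # standard deviation of the window
            if s > 0:
                rs_values.append(r / s)
        if rs_values:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs_values)))
    # Hurst exponent = slope of log(R/S) versus log(n)
    return np.polyfit(log_n, log_rs, 1)[0]

# toy sanity check: uncorrelated white noise should give H close to 0.5
rng = np.random.default_rng(0)
signal = rng.standard_normal(4096)
print("estimated H:", hurst_rs(signal))
```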

B-VIS Service-oriented Middleware for RFID Sensor Network

One of the most important intelligent in-car and roadside systems is the cooperative vehicle-infrastructure system. In Thailand, ITS technologies are growing rapidly, and real-time vehicle information is considerably needed for ITS applications; for example, vehicle fleet tracking and control and road traffic monitoring systems. This paper defines the communication protocols and software design for middleware components of B-VIS (Burapha Vehicle-Infrastructure System). The proposed B-VIS middleware architecture serves the needs of a distributed RFID sensor network and simplifies some intricate details of several communication standards.

Evaluating the Interactions of CO2-Ionic Liquid Systems through Molecular Modeling

Owing to stringent environmental legislation, CO2 capture and sequestration is one of the viable solutions to reduce CO2 emissions from various sources. In this context, ionic liquids (ILs) are being investigated as suitable absorption media for CO2 capture. Due to their non-evaporative, non-toxic, and non-corrosive nature, these ILs have the potential to replace existing solvents, such as aqueous amine solutions, in CO2 separation technologies. Thus, the present work aims at studying important aspects such as the interactions of the CO2 molecule with different anions (F-, Br-, Cl-, NO3-, BF4-, PF6-, Tf2N-, and CF3SO3-) that are commonly used in ILs, through molecular modeling. The minimum energy structures have been obtained using ab initio calculations at the MP2 (Moller-Plesset perturbation) level. Results revealed various degrees of distortion of the CO2 molecule (from its linearity) with the anions studied, most likely due to Lewis acid-base interactions between CO2 and the anion. Furthermore, binding energies for the anion-CO2 complexes were also calculated. The implication of anion-CO2 interactions for the solubility of CO2 in ionic liquids is also discussed.
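
The binding energies referred to here are conventionally obtained from the supermolecular approach, i.e. the energy of the optimized complex minus the energies of the isolated fragments; the abstract does not state whether a basis-set superposition error (counterpoise) correction was applied, so the expression below is only the generic definition:

```latex
\Delta E_{\mathrm{bind}} \;=\; E_{\mathrm{anion\text{-}CO_2}} \;-\; \left( E_{\mathrm{anion}} + E_{\mathrm{CO_2}} \right)
```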

Development of a Support Tool for Cost and Schedule Integration Management at the Program Level

There has been gradual progress of late in construction projects, particularly in large-scale megaprojects. Due to the long construction period, the large-scale budget investment, the lack of construction management technologies, and the growing uncertainty in project schedule management, however, a plan to conduct efficient operations and to ensure business safety is required. In particular, as the project management information system (PMIS) is meant for managing a single project centering on the construction phase, there is a limitation in the management of program-scale businesses like megaprojects. Thus, a program management information system (PgMIS) that includes program-level management technologies is needed to manage multiple projects. In this study, a support tool was developed for managing the cost and schedule information occurring in the construction phase at the program level. In addition, a case study on the developed support tool was conducted to verify the usability of the system. With the use of the developed support tool, construction managers can monitor the progress of the entire project and of the individual subprojects in real time.

Increase Success by Decreasing Admission for Maths - Fairytale or Reality?

South Africa is facing a crisis in not being able to produce enough graduates in the scarce skills areas to sustain economic growth. The crisis is fuelled by a school system that does not produce enough potential students with Mathematics, Accounting and Science. Since the introduction of the new school curriculum in 2008, there is no longer an option to take pure maths at a standard grade level. Instead, only two mathematical subjects are offered: pure maths (which is on par with higher grade maths) and mathematical literacy. It is compulsory to take one or the other. As a result, fewer students finish Grade 12 with pure mathematics every year. This national problem needs urgent attention if South Africa is to make any headway in critical skills development, as mathematics is a gateway to scarce skills professions. Higher education institutions launched several initiatives in an attempt to address the above, including preparatory courses, bridging programmes and extended curricula with foundation provisions. In view of the above, and government policy directives to broaden access in the scarce skills areas to increase student throughput, foundation provision was introduced for Commerce and Information Technology programmes at the Vaal Triangle Campus (VTC) of North-West University (NWU) in 2010. Students enrolling for extended programmes do not comply with the minimum prerequisites for the normal programmes. The question then arises as to whether these programmes have the intended impact. This paper reports the results of a two-year longitudinal study tracking the first-year academic achievement of the two cohorts of enrolments since 2010. The results provide valuable insight into the structuring of an extended programme and its potential impact.

A Decision Boundary based Discretization Technique using Resampling

Many supervised induction algorithms require discrete data, even though real data often comes in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy and understandability of the induction models. Usually, discretization and other types of statistical processes are applied to subsets of the population, as the entire population is practically inaccessible. For this reason we argue that the discretization performed on a sample of the population is only an estimate of that for the entire population. Most of the existing discretization methods partition the attribute range into two or several intervals using a single cut point or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points and thus improves the discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is to observe whether this resampling technique can lead to better discretization points, which opens up a new paradigm for the construction of soft decision trees.
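
As a hedged sketch of the resampling idea (not the authors' exact procedure), the code below draws bootstrap samples of a labelled continuous attribute, finds the best entropy-based binary cut point in each resample, and returns the pooled candidates; aggregating the candidates (here by their mean) is assumed to give a more stable estimate of the population cut point.

```python
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_cut_point(values, labels):
    """Best binary cut point by information gain over midpoints of the sorted values."""
    order = np.argsort(values)
    v, y = values[order], labels[order]
    base = entropy(y)
    best_gain, best_cut = -1.0, None
    for i in range(1, len(v)):
        if v[i] == v[i - 1]:
            continue
        cut = (v[i] + v[i - 1]) / 2.0
        left, right = y[:i], y[i:]
        cond = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        gain = base - cond
        if gain > best_gain:
            best_gain, best_cut = gain, cut
    return best_cut

def bootstrap_cut_points(values, labels, n_boot=50, seed=0):
    """Candidate discretization points from bootstrap resamples of the data."""
    rng = np.random.default_rng(seed)
    candidates = []
    n = len(values)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)       # bootstrap sample with replacement
        cut = best_cut_point(values[idx], labels[idx])
        if cut is not None:
            candidates.append(cut)
    return np.array(candidates)

# toy usage: two classes separated around 5.0
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(3, 1, 100), rng.normal(7, 1, 100)])
y = np.array([0] * 100 + [1] * 100)
cuts = bootstrap_cut_points(x, y)
print("mean candidate cut:", cuts.mean())      # expected to lie near 5.0
```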

Simulating a Discrete Time Model Reference Adaptive Control System with Large Initial Error

This article is based on the Discrete Parameter Tracking (DPT) technique, first introduced by A. A. Azab [8], which is applicable to a lower-order reference model. The order of the reference model is (n-1), where n is the number of adjustable parameters in the physical plant. The technique utilizes a modified gradient method [9] in which knowledge of the exact order of the nonadaptive system is not required, so as to eliminate the identification problem. The applicability of the DPT technique was examined through the solution of several problems. This article presents the solution of a third-order system with three adjustable parameters, controlled according to a second-order reference model. The adjustable parameters have large initial errors, which represents the condition under study. Computer simulations for the solution and analysis are provided to demonstrate the simplicity and feasibility of the technique.
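
The DPT details are in the cited references; purely as an illustration of gradient-based model-reference adaptation in discrete time, the sketch below applies the classic MIT rule to a first-order plant with a single adjustable feedforward gain started far from its correct value. It is not the DPT algorithm of [8]; the plant, reference model, and adaptation gain are all assumptions.

```python
# First-order plant y[k+1] = a*y[k] + b*u[k] with unknown gain b,
# reference model ym[k+1] = a*ym[k] + bm*uc[k].
a, b, bm = 0.9, 2.0, 1.0
gamma = 1e-4             # adaptation gain (assumed)
theta = 5.0              # adjustable feedforward gain, deliberately far from bm/b = 0.5

y = ym = 0.0
uc = 1.0                 # constant reference command
for k in range(2000):
    u = theta * uc                     # control law: feedforward gain only
    y = a * y + b * u                  # plant update
    ym = a * ym + bm * uc              # reference model update
    e = y - ym                         # tracking error
    # MIT rule: move theta against the gradient of e^2/2, using ym as the sensitivity proxy
    theta -= gamma * e * ym

print("adapted gain:", theta, "ideal gain:", bm / b)
```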

Investigation of 5,10,15,20-Tetrakis(3,5-Di-tert-butylphenyl)porphyrinatocopper(II) for Electronics Applications

In this work, an organic compound, 5,10,15,20-tetrakis(3,5-di-tert-butylphenyl)porphyrinatocopper(II) (TDTBPPCu), is studied as an active material for thin film electronic devices. To investigate the electrical properties of TDTBPPCu, a junction of TDTBPPCu with heavily doped n-Si and Al is fabricated: the TDTBPPCu film is sandwiched between Al and n-Si electrodes. Various electrical parameters of TDTBPPCu are determined. The current-voltage characteristics of the junction are nonlinear and asymmetric and show rectification behavior, which indicates the formation of a depletion region. This behavior indicates the potential of TDTBPPCu for electronics applications. The current-voltage and capacitance-voltage techniques are used to extract the different electronic parameters.

H-ARQ Techniques for Wireless Systems with Punctured Non-Binary LDPC as FEC Code

This paper presents a comparison of H-ARQ techniques for OFDM systems with a new family of non-binary LDPC codes which has been developed within the EU FP7 DAVINCI project. The punctured NB-LDPC codes have been used in a simulated model of the transmission system. The link level performance has been evaluated in terms of spectral efficiency, codeword error rate and average number of retransmissions. The NB-LDPC codes can be easily and effectively combined with different retransmission methods, which are needed if correct decoding of a codeword fails. Here the Optimal Symbol Selection method is proposed as a Chase Combining technique.
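
Chase combining itself is simple to state: soft information from each (re)transmission of the same codeword is accumulated before decoding is retried. The fragment below sketches that accumulation for BPSK log-likelihood ratios over an AWGN channel; the Optimal Symbol Selection strategy and the NB-LDPC decoder of the paper are outside this sketch, and the codeword length and noise variance are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma2 = 64, 1.0                      # codeword length and noise variance (assumed)
bits = rng.integers(0, 2, n)
tx = 1 - 2 * bits                        # BPSK mapping: 0 -> +1, 1 -> -1

combined_llr = np.zeros(n)
for attempt in range(1, 4):              # initial transmission + up to 2 retransmissions
    rx = tx + rng.normal(0, np.sqrt(sigma2), n)
    llr = 2 * rx / sigma2                # per-bit LLR for BPSK over AWGN
    combined_llr += llr                  # Chase combining: accumulate soft information
    hard = (combined_llr < 0).astype(int)
    errors = np.count_nonzero(hard != bits)
    print(f"after transmission {attempt}: {errors} bit errors")
    if errors == 0:                      # stand-in for 'decoding succeeded'
        break
```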

A Review of Coverage and Routing for Wireless Sensor Networks

The special constraints of sensor networks impose a number of technical challenges for employing them. In this review, we study the issues and existing protocols in three areas: coverage, routing, and reliable data transport. We present two types of coverage problems: determining the minimum number of sensor nodes that need to perform active sensing in order to monitor a certain area, and deciding the quality of service that can be provided by a given sensor network. While most routing protocols in sensor networks are data-centric, there are other types of routing protocols as well, such as hierarchical, location-based, and QoS-aware. We describe and compare several protocols in each group. We present several multipath routing protocols and single-path routing protocols with local repair, which are proposed for recovering from sensor node crashes. We also discuss some transport layer schemes for reliable data transmission in lossy wireless channels.

Simulation of a Process Design Model for Anaerobic Digestion of Municipal Solid Wastes

Anaerobic digestion has become a promising technology for the biological transformation of the organic fraction of municipal solid waste (MSW). In order to represent the kinetic behavior of such a biological process, and thereby to design a reactor system, the development of a mathematical model is essential. Addressing this issue, a simple mathematical model has been developed for anaerobic digestion of MSW in a continuous flow reactor unit under homogeneous steady state conditions. Following hydrolysis, biomass growth and substrate utilization are assumed to follow first-order reaction kinetics. Simulation of the model has been conducted by studying the sensitivity of various process variables. The model was simulated using typical kinetic data for anaerobic digestion of MSW and typical MSW characteristics of Kolkata. The hydraulic retention time (HRT) and solid retention time (SRT) were estimated by varying different model parameters, such as reactor efficiency, influent substrate concentration and biomass concentration. Consequently, design tables and charts have also been prepared for ready use in actual plant operation.
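
The abstract does not reproduce the model equations. Under the stated assumptions (homogeneous steady state, first-order substrate utilization with rate constant k, and a completely mixed flow-through unit with no solids recycle so that HRT equals SRT), a steady-state substrate mass balance of the usual form would relate the hydraulic retention time to the removal efficiency E = (S_0 - S)/S_0 as follows; this is a generic illustration, not necessarily the authors' exact formulation:

```latex
\frac{S_0 - S}{\theta_H} = k\,S
\quad\Longrightarrow\quad
\theta_H = \frac{S_0 - S}{k\,S} = \frac{E}{k\,(1 - E)}
```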

A Hybrid Feature Selection by Resampling, Chi squared and Consistency Evaluation Techniques

In this paper a combined feature selection method is proposed which takes advantage of sample domain filtering, resampling and feature subset evaluation methods to reduce the dimensionality of large datasets and select reliable features. This method utilizes both the feature space and the sample domain to improve the process of feature selection, and uses a combination of Chi squared and Consistency attribute evaluation methods to seek reliable features. The method consists of two phases. The first phase filters and resamples the sample domain, and the second phase adopts a hybrid procedure to find the optimal feature space by applying Chi squared, Consistency subset evaluation methods and genetic search. Experiments on various sized datasets from the UCI Repository of Machine Learning databases show that the performance of five classifiers (Naïve Bayes, Logistic, Multilayer Perceptron, Best First Decision Tree and JRIP) improves simultaneously and the classification error for these classifiers decreases considerably. The experiments also show that this method outperforms other feature selection methods.
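
A minimal sketch of the two-phase idea, assuming scikit-learn: phase one resamples the instances, phase two scores features with the Chi-squared statistic and keeps the top-ranked subset. The Consistency subset evaluation and genetic search of the actual method are not reproduced here, and the dataset, k, and resample size are placeholders.

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.utils import resample

X, y = load_iris(return_X_y=True)   # placeholder dataset (non-negative features, as chi2 requires)

# Phase 1: resample the sample domain (here, a simple bootstrap of the instances)
X_rs, y_rs = resample(X, y, replace=True, n_samples=len(y), random_state=0)

# Phase 2: rank features with the Chi-squared statistic and keep the top k
selector = SelectKBest(score_func=chi2, k=2)
X_sel = selector.fit_transform(X_rs, y_rs)

print("chi2 scores:", selector.scores_)
print("selected feature indices:", selector.get_support(indices=True))
print("reduced shape:", X_sel.shape)
```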

Developing Pedotransfer Functions for Estimating Some Soil Properties using Artificial Neural Network and Multivariate Regression Approaches

The study of soil properties such as field capacity (F.C.) and permanent wilting point (P.W.P.) plays an important role in the study of the soil moisture retention curve. Although these parameters can be measured directly, their measurement is difficult and expensive. Pedotransfer functions (PTFs) provide an alternative by estimating soil parameters from more readily available soil data. In this investigation, 70 soil samples were collected from different horizons of 15 soil profiles located in the Ziaran region, Qazvin province, Iran. The data set was divided into two subsets for calibration (80%) and testing (20%) of the models, and their normality was tested by the Kolmogorov-Smirnov method. Both multivariate regression and artificial neural network (ANN) techniques were employed to develop appropriate PTFs for predicting soil parameters using the easily measurable characteristics of clay, silt, O.C., S.P., B.D. and CaCO3. The performance of the multivariate regression and ANN models was evaluated using an independent test data set. The models were evaluated using the root mean square error (RMSE) and R2. The comparison of RMSE for the two models showed that the ANN model gives better estimates of F.C. and P.W.P. than the multivariate regression model. The RMSE and R2 values derived by the ANN model for F.C. and P.W.P. were (2.35, 0.77) and (2.83, 0.72), respectively. The corresponding values for the multivariate regression model were (4.46, 0.68) and (5.21, 0.64), respectively. The results showed that the ANN with five neurons in the hidden layer performed better in predicting soil properties than multivariate regression.
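
As a hedged sketch of the ANN side (assuming scikit-learn rather than whatever toolbox the authors used), a single-hidden-layer network with five neurons, as described, can be fitted and scored with RMSE and R2 as follows; the synthetic data merely stands in for the clay, silt, O.C., S.P., B.D. and CaCO3 predictors and an F.C. or P.W.P. target.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
X = rng.uniform(0, 60, size=(70, 6))                       # 70 samples, 6 predictors (placeholder values)
y = 0.5 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 2, 70)   # stand-in for F.C. (or P.W.P.)

# 80% calibration / 20% test split, as in the abstract
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

scaler = StandardScaler().fit(X_tr)
model = MLPRegressor(hidden_layer_sizes=(5,), max_iter=5000, random_state=0)
model.fit(scaler.transform(X_tr), y_tr)

pred = model.predict(scaler.transform(X_te))
rmse = mean_squared_error(y_te, pred) ** 0.5
print("RMSE:", rmse, "R2:", r2_score(y_te, pred))
```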

Image Magnification Using Adaptive Interpolation by Pixel Level Data-Dependent Geometrical Shapes

The world has entered the 21st century. Computer graphics technology and digital cameras are prevalent, and high-resolution displays and printers are available. Therefore, high-resolution images are needed in order to produce high-quality display images and high-quality prints. However, since high-resolution images are not usually provided, there is a need to magnify the original images. One common difficulty in previous magnification techniques is preserving details, i.e. edges, while at the same time smoothing the data so as not to introduce spurious artefacts. A definitive solution to this is still an open issue. In this paper, an image magnification method using adaptive interpolation by pixel level data-dependent geometrical shapes is proposed that tries to take into account information about the edges (sharp luminance variations) and the smoothness of the image. It calculates a threshold, classifies the interpolation region in the form of geometrical shapes, and then assigns suitable values to the undefined pixels inside the interpolation region while preserving the sharp luminance variations and smoothness at the same time. The results of the proposed technique have been compared qualitatively and quantitatively with five other techniques. The qualitative results show that the proposed method clearly outperforms nearest neighbour (NN), bilinear (BL) and bicubic (BC) interpolation. The quantitative results are competitive and consistent with NN, BL, BC and the others.
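
The proposed shape-based interpolation is not specified in enough detail in the abstract to reproduce; for reference, the bilinear (BL) baseline it is compared against can be sketched as follows, assuming a grey-level image stored as a NumPy array and an integer zoom factor.

```python
import numpy as np

def bilinear_upscale(img, factor):
    """Magnify a 2-D grey-level image by an integer factor using bilinear interpolation."""
    h, w = img.shape
    out_h, out_w = h * factor, w * factor
    # coordinates of each output pixel mapped back into the source image
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]                 # vertical interpolation weights
    wx = (xs - x0)[None, :]                 # horizontal interpolation weights
    img = img.astype(float)
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bottom = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bottom * wy

# toy usage: magnify a small ramp image 4x
small = np.arange(16, dtype=float).reshape(4, 4)
print(bilinear_upscale(small, 4).shape)     # (16, 16)
```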

Using Multi-Thread Technology to Realize a Shortest-Path Parallel Algorithm

The shortest path problem is a classic problem in graph theory, and it is applied in many fields. The shortest-path problem can be divided into two kinds: the single-source shortest path problem and the all-pairs shortest path problem. This article mainly addresses the all-pairs shortest path problem, and gives a new parallel algorithm for it based on the Dijkstra algorithm. Finally, the paper implements the parallel algorithm using C# multithreading.
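
The paper's implementation is in C#; as a language-neutral sketch of the same decomposition (one single-source Dijkstra run per task, results combined into an all-pairs table), the following uses a thread pool in Python. The graph and pool size are placeholders, and note that Python threads do not give true CPU parallelism the way C# threads do; the structure of the decomposition is the point here.

```python
import heapq
from concurrent.futures import ThreadPoolExecutor

def dijkstra(graph, source):
    """Single-source shortest path distances on a weighted adjacency-list graph."""
    dist = {v: float("inf") for v in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                      # stale heap entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist[v]:
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return source, dist

# placeholder graph: vertex -> list of (neighbour, edge weight)
graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("A", 1), ("C", 2), ("D", 5)],
    "C": [("A", 4), ("B", 2), ("D", 1)],
    "D": [("B", 5), ("C", 1)],
}

# all-pairs shortest paths: run one Dijkstra per source vertex on a thread pool
with ThreadPoolExecutor(max_workers=4) as pool:
    all_pairs = dict(pool.map(lambda s: dijkstra(graph, s), graph))

print(all_pairs["A"])   # distances from A to every other vertex
```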

A Study on Dogme 95 in Korean Films

Many new experimental films that were free from conventional movie forms have appeared since the Nouvelle Vague movement of the late 1950s. Forty years after the movement started, on March 13th, 1995, on the 100th anniversary of the birth of film, the declaration called Dogme 95 was issued in Copenhagen, Denmark. It aimed to create a new style of avant-garde film, and showed a tendency toward being anti-Hollywood and anti-genre, in opposition to the highly popular Hollywood trend of movies based on large-scale investment. The main idea of Dogme 95 is opposition to 'the writer's doctrine', the view that a film should be the artist's individual work, and to 'the overuse of technology' in film. The key figures declared ten principles called the 'Vow of Chastity', by which new movie forms were to be produced. Interview (2000), directed by Byunhyuk, was made five years after Dogme 95 was declared, and was recognized as the first Asian Dogme film. This study will survey the relationship between Korean film and the Vow of Chastity through the Korean films released in theaters, from the viewpoint of technology and content. It will also call attention to its effects on and significance for Korean film in modern society.

Space Vector PWM Simulation for Three Phase DC/AC Inverter

Space Vector Pulse Width Modulation (SVPWM) is one of the most widely used techniques to generate sinusoidal voltages and currents, due to its simplicity and efficiency with low harmonic distortion. This algorithm is especially used in power electronic applications. This paper describes a simulation algorithm for SVPWM and SPWM using the MATLAB/Simulink environment. It also implements a closed-loop three-phase DC-AC converter in MATLAB, controlling its output voltage amplitude and frequency. A comparison between the SVPWM and SPWM results is also given.
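
The core per-sample calculation in SVPWM for a two-level inverter is the choice of sector and of dwell times for the two adjacent active vectors plus the zero vector. A minimal sketch of that standard calculation follows; the DC-link voltage, switching period, and reference vector values are assumed inputs, and the paper's Simulink model is not reproduced.

```python
import math

def svpwm_dwell_times(v_ref, theta, v_dc, t_s):
    """Sector number and dwell times (T1, T2, T0) for a two-level SVPWM inverter.

    v_ref : magnitude of the reference voltage vector [V]
    theta : angle of the reference vector [rad], 0 .. 2*pi
    v_dc  : DC-link voltage [V]
    t_s   : switching (sampling) period [s]
    """
    theta = theta % (2 * math.pi)
    sector = int(theta // (math.pi / 3)) + 1        # sectors 1..6, each 60 degrees wide
    alpha = theta - (sector - 1) * math.pi / 3      # angle inside the sector
    m = math.sqrt(3) * v_ref / v_dc                 # modulation index
    t1 = t_s * m * math.sin(math.pi / 3 - alpha)    # dwell time of the first adjacent vector
    t2 = t_s * m * math.sin(alpha)                  # dwell time of the second adjacent vector
    t0 = t_s - t1 - t2                              # remaining time shared by the zero vectors
    return sector, t1, t2, t0

# example: 400 V DC link, 10 kHz switching, reference vector of 200 V at 45 degrees
print(svpwm_dwell_times(200.0, math.radians(45), 400.0, 1e-4))
```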

Study of Measures to Secure Video Phone Service Safety through a Preliminary Evaluation of the Information Security of the New IT Service

The rapid advance of communication technology is evolving the network environment into the broadband convergence network. Likewise, the IT services operated in individual networks are also quickly being converged into the broadband convergence network environment. VoIP and IPTV are two examples of such new services. Efforts are being made to develop the video phone service, which is an advanced form of the voice-oriented VoIP service. However, the new IT services will be subject to stability and reliability vulnerabilities if the relevant security issues are not resolved during the convergence of the existing IT services, currently operated in individual networks, into the wider broadband network environment. To resolve such problems, this paper attempts to analyze the possible threats and identify the necessary security measures before the deployment of the new IT services. Furthermore, it measures the quality impact of an example encryption algorithm application in order to identify an appropriate algorithm, so as to present security technology that will have no negative impact on the quality of the video phone service.

Mammogram Image Size Reduction Using a 16-to-8 Bit Conversion Technique

Two algorithms are proposed to reduce the storage requirements for mammogram images. The input image goes through a shrinking process that converts the 16-bit image to 8 bits using a pixel-depth conversion algorithm, followed by an enhancement process. The performance of the algorithms is evaluated objectively and subjectively. A 50% reduction in size is obtained with no loss of significant data in the breast region.
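
The exact pixel-depth conversion and enhancement steps are described in the paper; a minimal sketch of the general idea, assuming a NumPy array of 16-bit pixel values, linearly rescales the occupied intensity range into 8 bits, which by itself yields the 50% size reduction (2 bytes per pixel down to 1).

```python
import numpy as np

def to_8bit(img16):
    """Linearly rescale a 16-bit image into the 0-255 range of an 8-bit image."""
    img16 = img16.astype(np.float64)
    lo, hi = img16.min(), img16.max()
    if hi == lo:                                    # flat image: nothing to scale
        return np.zeros(img16.shape, dtype=np.uint8)
    scaled = (img16 - lo) / (hi - lo) * 255.0
    return np.round(scaled).astype(np.uint8)        # 2 bytes/pixel -> 1 byte/pixel

# toy usage with a synthetic 16-bit image standing in for a mammogram
img16 = (np.random.default_rng(0).random((64, 64)) * 65535).astype(np.uint16)
img8 = to_8bit(img16)
print(img16.nbytes, "bytes ->", img8.nbytes, "bytes")
```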

IFC-Based Construction Engineering Domain Ontology Development

The essence of the 21st century is the knowledge economy. Knowledge has become the key resource of economic growth and social development, and the construction industry is no exception. Because of the complexity of construction projects, a project manager cannot depend only on information management. The only way to improve the level of construction project management is to set up an effective knowledge accumulation mechanism. This paper first introduces the IFC standard and the concept of ontology, then puts forward a construction method for an IFC-based architectural engineering domain ontology, and finally builds up the concepts, properties and relationships between the concepts of the ontology. The limitations of this paper are also pointed out.