Parameter Analysis for Intersection Collision Avoidance Systems Based on Radar Sensors

This paper studies the parameters of an intersection collision avoidance (ICA) system based on radar sensors. The parameters include the positioning errors, the repetition period of the radar sensor, and the conditions for potential collisions of two cross-path vehicles. Analyzing these parameters yields the requirements, limitations, and specifications of the ICA system. The analyses show that the positioning errors increase as the measured vehicle approaches the intersection. In addition, mounting the radar sensor at a greater height is unnecessary, since the positioning sensitivity worsens as the sensor height increases. A concept of safety buffer distances for the front and rear of the measured vehicle is also proposed, and the conditions for potential collisions of two cross-path vehicles are presented to facilitate the computation algorithm.
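
The abstract does not state the collision conditions explicitly. As a rough illustration only, the sketch below flags a potential collision when the time windows during which two cross-path vehicles (extended by front and rear safety buffer distances) occupy the conflict zone overlap; all function names, parameters, and numbers are hypothetical assumptions, not the paper's algorithm.

```python
def occupancy_window(distance, speed, length, buf_front, buf_rear):
    """Time interval in which a vehicle, widened by its safety buffers,
    occupies the conflict zone (constant speed, point-like zone assumed)."""
    t_enter = (distance - buf_front) / speed
    t_exit = (distance + length + buf_rear) / speed
    return t_enter, t_exit

def potential_collision(win_a, win_b):
    """A collision is possible when the two occupancy intervals overlap."""
    return win_a[0] <= win_b[1] and win_b[0] <= win_a[1]

# hypothetical vehicles 40 m and 35 m from the conflict zone
a = occupancy_window(40.0, 12.0, 4.5, 2.0, 2.0)
b = occupancy_window(35.0, 10.0, 4.5, 2.0, 2.0)
print(potential_collision(a, b))  # True: the windows overlap
```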

Exploring the Determinants of Successful Collaboration of SMEs

The goal of this research is to identify the determinants of the success or failure of external cooperation in small and medium enterprises (SMEs). To this end, a survey was administered to 190 SMEs that had engaged in external cooperation within the last three years. A logistic regression model was used to derive the organizational and strategic characteristics that significantly influence whether the external collaboration of domestic SMEs succeeds. The results suggest that R&D-oriented general characteristics (both idea creation and the discovery of market opportunities) and innovation strategies that focus on and emphasize indirect-market stakeholders (such as complementary companies and affiliates) raise the probability of successful external cooperation. These findings can be used to build policies or strategies for inducing successful external cooperation and to understand the innovation of SMEs.
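
As an illustration of the kind of model used, the following sketch fits a logistic regression on synthetic survey data; the predictors, coefficients, and data are invented stand-ins for the paper's organizational and strategic characteristics.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# hypothetical binary traits per SME: R&D idea creation, market-opportunity
# discovery, indirect-market stakeholder focus, innovation strategy
X = rng.integers(0, 2, size=(190, 4)).astype(float)
logit = 1.5 * X[:, 0] + 0.8 * X[:, 2] - 0.5         # invented true relationship
y = (rng.uniform(size=190) < 1 / (1 + np.exp(-logit))).astype(int)  # 1 = success

model = LogisticRegression().fit(X, y)
print(model.coef_)  # positive coefficients raise the success probability
```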

Artificial Neural Network-Based Modeling of Evaporation Losses in Reservoirs

An artificial neural network (ANN) based modeling technique has been used to study the influence of different combinations of meteorological parameters on evaporation from a reservoir. The data set is taken from an earlier reported study. Several input combinations were tried in order to determine the importance of different input parameters in predicting evaporation. The prediction accuracy of the ANN was also compared with that of linear regression, and the comparison demonstrated the superior performance of the ANN over the linear regression approach. The findings also revealed that all input parameters must be considered together, rather than individually one at a time as reported in earlier studies, when predicting evaporation. The highest correlation coefficient (0.960) and the lowest root mean square error (0.865) were obtained with the input combination of air temperature, wind speed, sunshine hours, and mean relative humidity. A plot of actual against predicted evaporation values shows that, with all input parameters, most values lie within a scatter of ±15%. These findings suggest the usefulness of the ANN technique in predicting evaporation losses from reservoirs.
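
A minimal sketch of the comparison described, using scikit-learn and synthetic data in place of the study's meteorological records (all values and the hidden-layer size are assumptions):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# columns: air temperature, wind speed, sunshine hours, mean relative humidity
X = rng.uniform([10, 0, 0, 20], [40, 10, 12, 90], size=(500, 4))
evap = (0.05*X[:, 0] + 0.3*X[:, 1] + 0.2*X[:, 2] - 0.02*X[:, 3]
        + rng.normal(0, 0.3, 500))                 # invented evaporation response

X_tr, X_te, y_tr, y_te = train_test_split(X, evap, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X_tr, y_tr)
lin = LinearRegression().fit(X_tr, y_tr)
print(ann.score(X_te, y_te), lin.score(X_te, y_te))  # held-out R^2: ANN vs. regression
```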

Preparation of Computer Model of the Aircraft for Numerical Aeroelasticity Tests – Flutter

This article presents a procedure for reconstructing the geometry and structure of an aircraft model for flutter research (based on the I22-IRYDA aircraft). Reverse engineering techniques and advanced surface-modeling CAD tools are used for the reconstruction. The authors discuss all stages of the data acquisition process and the computation and analysis of the measured data. A three-dimensional structured-light scanner was used for acquisition. Subsequent sections present the details of the reconstruction process. The geometry reconstruction procedure transforms the measured input data (a point cloud) into a three-dimensional parametric computer model (a NURBS solid model) compatible with CAD systems. In parallel with the aircraft geometry, the internal structure (structural model) is extracted and modeled. The last section discusses the evaluation of the obtained models.
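
The point-cloud-to-surface step can be illustrated with a smoothing B-spline fit; note that SciPy fits polynomial B-spline surfaces, whereas NURBS additionally carry rational weights, so this is only a simplified stand-in with synthetic points.

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

rng = np.random.default_rng(0)
# hypothetical scattered scanner points (x, y, z) on a wing-like skin patch
x = rng.uniform(0, 1, 500)
y = rng.uniform(0, 1, 500)
z = 0.1 * np.sin(2*np.pi*x) * np.cos(np.pi*y) + rng.normal(0, 1e-3, 500)

# bicubic smoothing spline; s trades fit accuracy against surface smoothness
surf = SmoothBivariateSpline(x, y, z, kx=3, ky=3, s=0.01)
print(surf.ev(0.5, 0.5))  # evaluate the reconstructed surface at one point
```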

Radio and Television Supreme Council as a Regulatory Board

Broadcasting has changed rapidly in parallel with changes in the wider world, and it has been influenced and reshaped by the emergence of new communication technologies. These developments have had many economic and social consequences. The most important of these concern governments' power to control the means of communication and the control mechanisms addressing newly defined issues. For this purpose, autonomous and independent regulatory bodies have been established by the state. One of these regulatory bodies is the Radio and Television Supreme Council, established in 1994 under Code No. 3984. Today the Radio and Television Supreme Council, which is responsible for regulating radio and television broadcasts across Turkey, holds an important and effective position as an autonomous and independent regulatory body. On one hand, the Council acts as a notable regulator of the sensitive field of radio and television broadcasting; on the other, as one of the central organs of media policy, it sets principles for broadcasting oversight that keep in view democratic and liberal values and the concept of the public interest. This study examines the role of the Radio and Television Supreme Council under Code No. 3984 in controlling communication and its control mechanisms, as well as the changes to its duties under Code No. 6112 of 2011.

CFD Simulation of SO2 Removal from Gas Mixtures using Ceramic Membranes

This work deals with the modeling and simulation of SO2 removal in a ceramic membrane by means of the finite element method (FEM). A mass transfer model was developed to predict the performance of SO2 absorption into a chemical solvent. The model is based on solving the conservation equations for the gas component in the membrane. Computational fluid dynamics (CFD) techniques for mass and momentum transfer were used to solve the model equations. The simulations aimed to obtain the distribution of gas concentration in the absorption process, and the effect of the operating parameters on the efficiency of the ceramic membrane was evaluated. The modeling results showed that the gas-phase velocity has a significant effect on gas removal, whereas the liquid phase does not affect SO2 removal significantly. They also indicate that the main mass transfer resistance lies in the membrane and the gas phase, owing to the high tortuosity of the ceramic membrane.
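
The conservation equations are not stated in the abstract; a standard steady-state convection-diffusion balance of the kind such membrane models solve for a species concentration C_i is

$$\nabla \cdot \left( -D_i \nabla C_i + C_i \mathbf{u} \right) = R_i,$$

where D_i is the effective diffusivity (reduced inside the membrane by tortuosity), \mathbf{u} is the velocity field obtained from the momentum equations, and R_i is the reaction term due to the chemical solvent.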

Theoretical Considerations for Software Component Metrics

We have defined two suites of metrics covering the static and dynamic aspects of component assembly. The static metrics measure the complexity and criticality of component assembly, where complexity is measured using the Component Packing Density and Component Interaction Density metrics. Further, four criticality conditions, namely Link, Bridge, Inheritance, and Size criticalities, have been identified and quantified. The complexity and criticality metrics are combined to form a Triangular Metric, which can be used to classify the type and nature of applications. The dynamic metrics are collected during the runtime of a complete application; they are useful for identifying super-components and for evaluating the degree of utilisation of various components. In this paper both static and dynamic metrics are evaluated using Weyuker's set of properties. The results show that the metrics provide a valid means to measure issues in component assembly. We relate our metrics suite to McCall's Quality Model and illustrate its impact on product quality and on the management of component-based product development.
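
Sketches of the two density metrics, using the generic ratio definitions their names suggest (the paper's precise formulas may differ):

```python
def component_packing_density(n_constituents, n_components):
    """CPD: constituents of a given type (e.g. lines of code, classes,
    modules) packed per component in the assembly."""
    return n_constituents / n_components

def component_interaction_density(actual, available):
    """CID: fraction of the available component interactions actually used."""
    return actual / available

print(component_packing_density(120, 8))     # hypothetical assembly: 15.0
print(component_interaction_density(9, 24))  # 0.375
```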

Confronting the Uncertainty of Systemic Innovation in Public Welfare Services

Faced with social and health system capacity constraints and rising and changing demand for welfare services, governments and welfare providers are increasingly relying on innovation to help support and enhance services. However, the evidence reported by several studies indicates that the realization of that potential is not an easy task. Innovations can be deemed inherently complex to implement and operate, because many of them involve a combination of technological and organizational renewal within an environment featuring a diversity of stakeholders. Many public welfare service innovations are markedly systemic in their nature, which means that they emerge from, and must address, the complex interplay between political, administrative, technological, institutional and legal issues. This paper suggests that stakeholders dealing with systemic innovation in welfare services must deal with ambiguous and incomplete information in circumstances of uncertainty. Employing a literature review methodology and case study, this paper identifies, categorizes and discusses different aspects of the uncertainty of systemic innovation in public welfare services, and argues that uncertainty can be classified into eight categories: technological uncertainty, market uncertainty, regulatory/institutional uncertainty, social/political uncertainty, acceptance/legitimacy uncertainty, managerial uncertainty, timing uncertainty and consequence uncertainty.

Multiobjective Optimal Power Flow Using Hybrid Evolutionary Algorithm

This paper solves the environmental/economic dispatch power system problem using the Non-dominated Sorting Genetic Algorithm-II (NSGA-II) and its hybrid with a Convergence Accelerator Operator (CAO), called NSGA-II/CAO. These multiobjective evolutionary algorithms were applied to the standard IEEE 30-bus six-generator test system, and several optimization runs were carried out on cases of differing problem complexity. Different quality measures comparing the performance of the two solution techniques were considered. The results demonstrate that including the CAO in the original NSGA-II improves its convergence while preserving the diversity of the solution set.
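
At the core of NSGA-II is the non-dominated comparison of objective vectors (here fuel cost versus emission); a minimal sketch of that building block on synthetic data, leaving aside the genetic operators and the CAO:

```python
import numpy as np

def pareto_front(points):
    """Boolean mask of non-dominated rows; all objectives are minimized."""
    keep = np.ones(len(points), dtype=bool)
    for i in range(len(points)):
        if not keep[i]:
            continue
        # rows dominated by row i: no better in any objective, worse in one
        dominated = (np.all(points >= points[i], axis=1)
                     & np.any(points > points[i], axis=1))
        keep &= ~dominated
    return keep

rng = np.random.default_rng(2)
sols = rng.uniform(size=(50, 2))  # hypothetical (cost, emission) pairs
print(sols[pareto_front(sols)])   # the trade-off front of the sample
```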

Nutrient Removal from Municipal Wastewater Treatment Plant Effluent Using Eichhornia crassipes

Water hyacinth has been used in aquatic systems for wastewater purification for many years worldwide. The role of the water hyacinth (Eichhornia crassipes) in polishing nitrate and phosphorus concentrations from municipal wastewater treatment plant effluent by phytoremediation was evaluated. The objective of this project is to determine the removal efficiency of water hyacinth in polishing nitrate and phosphorus, as well as chemical oxygen demand (COD) and ammonia. Water hyacinth is considered the most efficient aquatic plant for removing a vast range of pollutants, such as organic matter, nutrients, and heavy metals. The water hyacinths, also referred to as macrophytes, were cultivated in the treatment house in a reactor tank of approximately 90 (L) x 40 (W) x 25 (H) in dimension, built with three compartments. Three water hyacinth plants were placed in each compartment, and water samples from each compartment were collected every two days. Plant observation covered weight measurement, plant uptake, and the development of new young shoots. Water hyacinth effectively removed approximately 49% of COD, 81% of ammonia, 67% of phosphorus, and 92% of nitrate. The plants also showed a significant growth rate starting from day 6, at 0.33 shoots/day, and kept developing up to 0.38 shoots/day by the end of day 24. The studies conducted showed that water hyacinth is capable of polishing municipal wastewater effluent containing undesirable nitrate and phosphorus concentrations.
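
The removal percentages follow from the usual influent/effluent comparison; a minimal sketch with hypothetical concentrations chosen to reproduce the reported ~92% nitrate removal:

```python
def removal_efficiency(c_in, c_out):
    """Percent removal between influent and effluent concentrations."""
    return (c_in - c_out) / c_in * 100.0

# hypothetical nitrate values (mg/L); not the study's measurements
print(removal_efficiency(25.0, 2.0))  # -> 92.0
```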

An Advanced Time-Frequency Domain Method for PD Extraction with Non-Intrusive Measurement

Partial discharge (PD) detection is an important method for evaluating the insulation condition of metal-clad apparatus. Non-intrusive sensors, which are easy to install and do not interrupt operation, are preferred in on-site PD detection. However, non-intrusive detection often lacks accuracy owing to interference in the PD signals. This paper introduces a novel PD extraction method that uses frequency analysis and entropy-based time-frequency (TF) analysis. The repetitive pulses from the converter are first removed via frequency analysis. Then the relative entropy and relative peak frequency of each pulse (i.e., of its time-indexed TF spectrum vector) are calculated, and all pulses with similar parameters are grouped. Based on the characteristics of the non-intrusive sensor and the frequency distribution of PDs, the PD pulses are separated from the interference. Finally, the PD signal and the interference are recovered via the inverse TF transform. De-noising results on noisy PD data demonstrate that the combination of frequency and time-frequency techniques can discriminate PDs from interference with various frequency distributions.
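
A sketch of the per-pulse feature extraction and grouping described above, using spectral entropy and peak frequency computed from a spectrogram; the pulse model, the parameters, and the use of k-means for the grouping are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.cluster import KMeans

def pulse_features(pulse, fs):
    """Spectral entropy (bits) and peak frequency (MHz) of one pulse's TF spectrum."""
    f, t, S = spectrogram(pulse, fs=fs, nperseg=64)
    p = S.ravel() / S.sum()
    entropy = -np.sum(p * np.log2(p + 1e-12))
    peak = f[np.unravel_index(S.argmax(), S.shape)[0]] / 1e6
    return entropy, peak

fs = 10e6
t = np.arange(512) / fs
# hypothetical pulses: damped oscillations at two distinct centre frequencies
pulses = [np.exp(-2e5*t) * np.sin(2*np.pi*f0*t) for f0 in (4e5, 4.2e5, 2e6, 2.1e6)]
feats = np.array([pulse_features(p, fs) for p in pulses])
print(KMeans(n_clusters=2, n_init=10).fit_predict(feats))  # similar pulses group together
```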

GPT Onto: A New Beginning for Malaysia Gross Pollutant Trap Ontology

Ontologies are widely used as a tool for organizing information and creating relations between subjects within a defined knowledge domain. Fields as varied as civil engineering, biology, and management have successfully integrated ontologies into decision support systems for managing domain knowledge and assisting decision makers. Gross pollutant traps (GPTs) are devices that trap large items or hazardous particles and prevent them from polluting and entering our waterways. However, selecting a suitable GPT is a challenge in Malaysia, as too few GPT data repositories are captured and shared. An ontology is therefore needed to capture, organize, and represent this knowledge as meaningful information that can contribute to efficient GPT selection in Malaysian urbanization. A GPT ontology framework is built as the first step towards capturing GPT knowledge, which will then be integrated into the decision support system. This paper provides several examples from the GPT ontology and explains how it is constructed using the Protégé tool.
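
The paper builds the ontology in Protégé; purely to illustrate the same kind of class hierarchy programmatically, here is a sketch using the Python owlready2 library, with the IRI, class names, and property all hypothetical.

```python
from owlready2 import get_ontology, Thing, DataProperty

# hypothetical IRI; the real GPT Onto IRI is not given in the abstract
onto = get_ontology("http://example.org/gpt.owl")

with onto:
    class GrossPollutantTrap(Thing):       # root concept
        pass
    class TrashRack(GrossPollutantTrap):   # one illustrative GPT subtype
        pass
    class designFlowRate(DataProperty):    # illustrative selection attribute
        domain = [GrossPollutantTrap]
        range = [float]

onto.save(file="gpt.owl", format="rdfxml")
```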

The Effect of Stress Biaxiality on Crack Shape Development

The development of the shape and size of a crack in a pressure vessel under uniaxial and biaxial loading is important in fitness-for-service evaluations such as leak-before-break. In this work, finite element modelling was used to evaluate the mean stress and the J-integral along the front of a surface-breaking crack. A procedure based on the ductile tearing resistance curves of high- and low-constraint fracture mechanics geometries was developed to estimate the amount of ductile crack extension for surface-breaking cracks and to show the evolution of the initial crack shape. The results showed non-uniform constraint levels and crack driving forces along the crack front at large deformation levels. It was also shown that initially semi-elliptical surface cracks under biaxial load develop higher constraint levels around the crack front than under uniaxial tension. However, similar crack shapes were observed, with greater extension associated with cracks under biaxial loading.
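
For reference, the J-integral evaluated along the crack front is the standard Rice contour integral (the paper's finite element implementation details are not specified):

$$J = \int_{\Gamma} \left( W \, \mathrm{d}y - T_i \frac{\partial u_i}{\partial x} \, \mathrm{d}s \right),$$

where \Gamma is a contour around the crack tip, W is the strain energy density, T_i the traction vector, u_i the displacement vector, and x the direction of crack advance.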

Plug and Play Interferometer Configuration using Single Modulator Technique

We demonstrate single-photon interference over 10 km using a plug-and-play system for quantum key distribution. The quality of the interferometer is measured by its visibility. The signal is phase-coded, and the visibility is derived from the interference effect, which results in a photon count. The setup gives full control of the polarization inside the interferometer. The quality of the interferometer is assessed from the number of counts per second, and the system achieves 94% visibility in one of the detectors.
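
Visibility is conventionally computed from the maximum and minimum count rates of the interference fringe; a minimal sketch with hypothetical counts chosen to give roughly the reported 94%:

```python
def visibility(n_max, n_min):
    """Fringe visibility from maximum and minimum detector count rates."""
    return (n_max - n_min) / (n_max + n_min)

# hypothetical counts per second at the fringe maximum and minimum
print(visibility(3300, 100))  # ~0.94
```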

Semi-Automatic Approach for Semantic Annotation

The third phase of the web, the semantic web, requires many web pages annotated with metadata. A crucial question is therefore where to acquire these metadata. In this paper we propose a semi-automatic method that annotates the text of documents and web pages, employing a comprehensive knowledge base to categorize instances with respect to an ontology. The approach is evaluated against manual annotations and against one of the most popular annotation tools of the same kind. The approach is implemented on the .NET Framework and uses WordNet as its knowledge base, yielding an annotation tool for the Semantic Web.
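
The paper's tool is built on the .NET Framework; purely to illustrate the kind of WordNet lookup involved in categorizing instances, here is a Python/NLTK sketch (the helper and the example term are hypothetical):

```python
from nltk.corpus import wordnet as wn  # requires nltk.download('wordnet') once

def broader_categories(term):
    """Walk one step up the WordNet hypernym hierarchy for a term."""
    synsets = wn.synsets(term)
    return [h.name() for h in synsets[0].hypernyms()] if synsets else []

print(broader_categories("document"))  # broader concepts usable as categories
```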

Decision Tree-based Feature Ranking using Manhattan Hierarchical Cluster Criterion

Feature selection is gaining importance because it can reduce classification cost in terms of time and computational load. One way to search for essential features is via a decision tree, which acts as an intermediate feature-space inducer for choosing essential features. In decision tree-based feature selection, some studies use the decision tree as a feature ranker with a direct threshold measure, while others retain the decision tree but use a pruning condition as the threshold mechanism for choosing features. This paper proposes a threshold measure based on the Manhattan hierarchical cluster distance, to be used in feature ranking to choose relevant features as part of the feature selection process. The results are promising, and the method can be improved in the future by including test cases with a larger number of attributes.
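
One plausible reading of the proposed threshold, sketched below: rank features by decision tree importance, hierarchically cluster the importance scores with the Manhattan (cityblock) metric, and keep the cluster containing the top-ranked feature. The dataset and the cluster count are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from scipy.cluster.hierarchy import linkage, fcluster

X, y = load_iris(return_X_y=True)
imp = DecisionTreeClassifier(random_state=0).fit(X, y).feature_importances_

# hierarchical clustering of the 1-D importances, Manhattan (cityblock) metric
Z = linkage(imp.reshape(-1, 1), method="average", metric="cityblock")
groups = fcluster(Z, t=2, criterion="maxclust")

selected = np.where(groups == groups[np.argmax(imp)])[0]
print(selected)  # features clustered with the highest-ranked one are retained
```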

A Simplified Adaptive Decision Feedback Equalization Technique for π/4-DQPSK Signals

We present a simplified equalization technique for a π/4 differential quadrature phase shift keying (π/4-DQPSK) modulated signal in a multipath fading environment. The proposed equalizer is realized as a fractionally spaced adaptive decision feedback equalizer (FS-ADFE), employing the exponential step-size least mean square (LMS) algorithm as the adaptation technique. The main advantage of the scheme stems from the use of the exponential step-size LMS algorithm in the equalizer, which achieves convergence behavior similar to that of a recursive least squares (RLS) algorithm at significantly reduced computational complexity. To investigate the finite-precision performance of the proposed equalizer along with the π/4-DQPSK modem, the entire system is evaluated in a 16-bit fixed-point digital signal processor (DSP) environment. The proposed scheme is found to be attractive even when equalization must be performed within a restricted number of training samples.
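
A sketch of the exponential step-size LMS adaptation at the heart of the scheme, applied here to a plain linear equalizer on synthetic QPSK training data; the decay constant and channel are assumptions, and the feedback filter of the full FS-ADFE is omitted.

```python
import numpy as np

def lms_exp_step(x, d, n_taps=8, mu0=0.1, decay=0.999):
    """LMS with exponentially decaying step size mu(n) = mu0 * decay**n."""
    w = np.zeros(n_taps, dtype=complex)
    for n in range(n_taps, len(d)):
        u = x[n - n_taps:n][::-1]             # regressor, newest sample first
        e = d[n] - np.dot(w.conj(), u)        # error against the training symbol
        w += (mu0 * decay**n) * e.conj() * u  # exponential step-size update
    return w

rng = np.random.default_rng(0)
sym = (rng.choice([1, -1], 400) + 1j*rng.choice([1, -1], 400)) / np.sqrt(2)
rx = np.convolve(sym, [1.0, 0.4])[:len(sym)]  # hypothetical 2-tap channel
print(np.round(lms_exp_step(rx, sym), 2))     # approximates the channel inverse
```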

Design and Simulation of Low Speed Axial Flux Permanent Magnet (AFPM) Machine

This paper presents the initial design of a low-speed axial flux permanent magnet (AFPM) machine with a non-slotted TORUS topology, obtained using a given design algorithm (Appendix). The design algorithm is validated against selected data from an initial prototype machine. The analytical design calculations were carried out by means of the design algorithm, and the results obtained were compared with those of the finite element method (FEM).
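
One classical analytical step in AFPM sizing, shown here only as an illustration (the paper's algorithm is in its Appendix and is not reproduced): with the electric loading referred to the inner radius, torque varies as λ(1 − λ²) in the inner-to-outer diameter ratio λ, which peaks at λ = 1/√3.

```python
import numpy as np

lam = np.linspace(0.05, 0.95, 500)  # inner/outer diameter ratio candidates
torque = lam * (1 - lam**2)         # relative torque under the stated assumption
print(lam[np.argmax(torque)])       # ~0.577, i.e. 1/sqrt(3)
```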

Simulation Games in Business Process Management Education

Business process management (BPM) has become widely accepted within the business community as a means of improving business performance. However, it is of the highest importance to incorporate BPM into university-level curricula in order to achieve appropriate acceptance of the method. The goal of this paper is to determine the current state of BPM education at Croatian universities and abroad. It investigates the forms of instruction and teaching methods applied and makes several proposals for improving BPM courses. Since the majority of undergraduate and postgraduate students have a limited understanding of business processes and lack practical experience, new teaching approaches are needed. We therefore offer suggestions for further improvement, among which the introduction of a simulation-game environment into BPM education is strongly recommended.

The Future Regulatory Challenges of Liquidity Risk Management

Liquidity risk management ranks among the key concepts applied in finance. Liquidity is defined as the capacity to obtain funding when needed, while liquidity risk is the threat to this capacity to generate cash at fair cost. In this paper we present the challenges of liquidity risk management arising from the 2007-2009 global financial upheaval. We see five main regulatory liquidity risk management issues requiring revision in the coming years: liquidity measurement; intra-day and intra-group liquidity management; contingency planning and liquidity buffers; liquidity systems, controls, and governance; and, finally, the testing of the viability of business liquidity models.