Research on Weakly Hard Real-Time Constraints and Their Boolean Combination to Support Adaptive QoS

Advances in computing applications in recent years have increased the demand for more flexible scheduling models for QoS. Moreover, in practical applications, occasional violations of temporal constraints can be tolerated provided the violations follow a certain distribution, so the traditional Liu and Layland model needs to be extended to cover these circumstances. Two such extensions are the (m, k)-firm model and the Window-Constrained model. This paper studies weakly hard real-time constraints and their Boolean combination to support QoS. The fact that a practical application can tolerate some violations of its temporal constraints under a certain distribution is exploited to support adaptive QoS in open real-time systems. The experimental results show that these approaches are effective compared to traditional scheduling algorithms.
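
As a minimal illustration of a weakly hard constraint, the following Python sketch checks whether a task's deadline history satisfies an (m, k)-firm requirement, i.e., at least m deadlines met in every window of k consecutive jobs; the trace and parameter values are hypothetical.

    from collections import deque

    def satisfies_mk_firm(deadline_history, m, k):
        """Return True if every window of k consecutive jobs meets at least m deadlines.

        deadline_history: iterable of booleans, True = deadline met (hypothetical trace).
        """
        window = deque(maxlen=k)
        for met in deadline_history:
            window.append(met)
            if len(window) == k and sum(window) < m:
                return False  # found a window of k jobs with fewer than m met deadlines
        return True

    # Hypothetical trace: 1 = deadline met, 0 = deadline missed.
    trace = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
    print(satisfies_mk_firm(trace, m=3, k=4))  # True: every 4-job window meets >= 3 deadlines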

Product Ecodesign Approaches in ISO 14001 Certified Companies

The aim of the study was to investigate whether adopting ISO 14001 certification promotes product ecodesign measures in manufacturing companies in the Republic of Slovenia. Companies devoted most of their product development attention to waste and energy reduction during the manufacturing process and to reducing material consumption per unit of product. Regarding the importance of different ecodesign criteria, the reduction of material consumption per unit of product was reported as the most important criterion. Less attention is paid to end-of-life issues such as recycling or packaging. Most manufacturing enterprises considered the ISO 14001 standard a very useful, or at least a useful, tool helping them to establish and accelerate product ecodesign activities. The two most frequently considered ecodesign drivers are increased competitive advantage and legal requirements, and the two most important barriers are high development costs and insufficient market demand.

Improving Injection Moulding Processes Using Experimental Design

Moulded parts account for more than 70% of the components in products. However, plastic injection moulding in particular suffers from common defects such as warpage, shrinkage, sink marks, and weld lines. In this paper, Taguchi experimental design methods are applied to reduce warpage in thin Acrylonitrile Butadiene Styrene (ABS) plates, demonstrated in two stages: Taguchi orthogonal arrays and the Analysis of Variance (ANOVA). Eight trials were run, from which the optimal parameters minimizing warpage in the factorial experiment were obtained. The ANOVA results, compared with those derived from MINITAB, identify the factors that contribute most significantly to warpage in the injection moulding process. Moreover, the ANOVA approach is more accurate than approaches such as the S/N ratio, and by accounting for factor interactions it makes it possible to achieve better outcomes.
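
For reference, the smaller-the-better signal-to-noise ratio used in Taguchi analysis of a defect such as warpage is S/N = -10·log10(mean(y^2)); the sketch below computes it for hypothetical warpage measurements from replicated runs of one trial.

    import numpy as np

    def sn_smaller_the_better(y):
        """Taguchi smaller-the-better S/N ratio: -10 * log10(mean(y^2))."""
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(y ** 2))

    # Hypothetical warpage measurements (mm) from replicated runs of one trial.
    warpage = [0.52, 0.48, 0.55]
    print(round(sn_smaller_the_better(warpage), 2))  # higher S/N means less warpage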

Automatic Authentication of Handwritten Documents via Low Density Pixel Measurements

We introduce an effective approach for automatic offline authentication of handwritten samples where the forgeries are skillfully done, i.e., the genuine and forged samples look almost alike. Subtle details of the temporal information used in online verification are not available offline and are also hard to recover robustly. Thus, spatial dynamic information such as pen-tip pressure characteristics is considered, with emphasis on the extraction of low density pixels. These points result from the ballistic rhythm of a genuine signature, which a forgery, however skillful, always lacks. Ten effective features, including these low density points and the density ratio, are proposed to distinguish a genuine sample from a forgery. An adaptive decision criterion is also derived for better verification judgements.
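
As a rough illustration of the kind of feature involved (not the paper's exact definitions), the sketch below extracts "low density" (faint-ink) pixels from a grayscale signature patch by thresholding and computes a density ratio; the thresholds and the example array are hypothetical.

    import numpy as np

    def low_density_features(gray, ink_thresh=200, low_density_thresh=160):
        """Count stroke pixels, 'low density' (faint) stroke pixels, and their ratio.

        gray: 2-D uint8 array, 0 = black ink, 255 = white paper (hypothetical convention).
        """
        stroke = gray < ink_thresh                            # any pixel carrying ink
        low_density = stroke & (gray >= low_density_thresh)   # faint ink: light stroke pixels
        n_stroke = int(stroke.sum())
        n_low = int(low_density.sum())
        ratio = n_low / n_stroke if n_stroke else 0.0
        return n_stroke, n_low, ratio

    # Tiny hypothetical patch: 255 = paper, 60 = heavy ink, 180 = faint ink.
    patch = np.array([[255, 180,  60],
                      [255,  60,  60],
                      [180, 180, 255]], dtype=np.uint8)
    print(low_density_features(patch))  # (6, 3, 0.5)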

Multicast Optimization Techniques using Best Effort Genetic Algorithms

Multicast network technology has pervaded our lives and underlies many of the networking techniques and routing devices we use. Multicast offers applications such as high-speed voice and high-speed data services, a space presently dominated by conventional networking, cable systems, and digital subscriber line (DSL) technologies, and it has several advantages over other routing techniques. Most multicast applications require QoS (Quality of Service) guarantees. For the bandwidth-delay constrained optimization problem, we use a multi-objective model and a routing approach based on a genetic algorithm (GA) that optimizes multiple QoS parameters simultaneously. The proposed approach yields non-dominated routes with the high efficiency of the GA, and its improvement has been verified. We also compare the results of the multicast GA with broadband wireless routing to minimize the path delay.
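
As a small illustration of the multi-objective idea, the sketch below filters a set of candidate multicast routes down to the non-dominated (Pareto) set under two hypothetical QoS objectives, end-to-end delay and bandwidth cost, both to be minimized; the candidate values are made up.

    def dominates(a, b):
        """Route a dominates route b if it is no worse in all objectives and better in one.

        Objectives are (delay, bandwidth_cost), both to be minimized (hypothetical)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_routes(candidates):
        """Keep only the non-dominated candidate routes."""
        return {name: obj for name, obj in candidates.items()
                if not any(dominates(other, obj)
                           for oname, other in candidates.items() if oname != name)}

    # Hypothetical candidate multicast trees: (end-to-end delay in ms, bandwidth cost).
    candidates = {"route_A": (40, 12), "route_B": (55, 8), "route_C": (60, 15)}
    print(pareto_routes(candidates))  # route_C is dominated by route_A and is dropped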

Identification of Industrial Health Using ANN

The customary practice of identifying industrial sickness relies on a set of traditional techniques based on manual monitoring and the compilation of financial records. This makes the process tedious, time consuming and often susceptible to manipulation. Therefore, readily available tools are required which can deal with the uncertain situations arising out of industrial sickness. This is all the more significant for a country like India, where the fruits of development are rarely distributed equally. In this paper, we propose an approach based on an Artificial Neural Network (ANN) to deal with industrial sickness, with specific focus on a few such units taken from Assam, a less developed north-east (NE) Indian state. The proposed system provides decisions regarding industrial sickness using eight different parameters which are directly related to the stages of sickness of such units. The mechanism primarily uses certain signals and symptoms of industrial health to decide upon the state of a unit. Specifically, we formulate an ANN based block with data obtained from a few selected units of Assam so that the required decisions related to industrial health can be taken. The system thus formulated could become an important part of planning and development. It can also contribute towards the computerization of decision support systems related to industrial health and help in better management.
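
A minimal sketch of the kind of classifier described, assuming eight numeric health indicators per unit and a binary sick/healthy label; the features, data and network size below are hypothetical, not those used in the paper.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # Hypothetical training data: 8 indicators per unit (e.g. liquidity ratio,
    # capacity utilisation, debt ratio, ...), label 1 = sick, 0 = healthy.
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(60, 8))
    y = (X[:, 0] + X[:, 3] < 0.8).astype(int)   # made-up rule standing in for real labels

    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    clf.fit(X, y)

    new_unit = rng.uniform(0.0, 1.0, size=(1, 8))   # indicators of a new unit
    print(clf.predict(new_unit))                    # 1 = likely sick, 0 = likely healthy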

Multi-Objective Optimization of Gas Turbine Power Cycle

Because of the importance of energy, optimization of power generation systems is necessary. Gas turbine cycles are suitable for fast power generation, but their efficiency is relatively low. To achieve higher efficiencies, several measures are employed, such as recovery of heat from the exhaust gases in a regenerator, use of an intercooler in a multistage compressor, and steam injection into the combustion chamber. However, thermodynamic optimization of the gas turbine cycle, even with the above components, is still necessary. In this article, multi-objective genetic algorithms are employed for Pareto-approach optimization of the Regenerative-Intercooling-Gas Turbine (RIGT) cycle. In multi-objective optimization, a number of conflicting objective functions are optimized simultaneously. The objective functions considered for optimization are the entropy generation of the RIGT cycle (Ns), derived using exergy analysis and the Gouy-Stodola theorem, the thermal efficiency, and the net output power of the RIGT cycle. These objectives usually conflict with each other. The design variables consist of thermodynamic parameters such as the compressor pressure ratio (Rp), excess air in combustion (EA), turbine inlet temperature (TIT) and inlet air temperature (T0). In the first stage, single-objective optimization is investigated; the Non-dominated Sorting Genetic Algorithm (NSGA-II) is then used for multi-objective optimization. Optimization procedures are performed for two and three objective functions and the results are compared for the RIGT cycle. To investigate the optimal thermodynamic behavior of pairs of objectives, different sets, each including two of the output objectives, are considered individually. For each set, the Pareto front is depicted. The decision variables selected on this Pareto front yield the best possible combinations of the corresponding objective functions. No point on the Pareto front is superior to another, but all of them are superior to any point off the front. For the three-objective optimization, the results are given in tables.
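
A minimal sketch of the non-dominated sorting step at the heart of NSGA-II, applied to hypothetical (entropy generation, negative thermal efficiency) pairs; all objectives are cast as minimization and the values are illustrative only.

    def dominates(a, b):
        """a dominates b when a is no worse in every objective and better in at least one
        (all objectives cast as minimization)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def non_dominated_fronts(points):
        """Partition points into ranked Pareto fronts, as in NSGA-II's sorting step."""
        remaining = dict(enumerate(points))
        fronts = []
        while remaining:
            front = [i for i, p in remaining.items()
                     if not any(dominates(q, p) for j, q in remaining.items() if j != i)]
            fronts.append(front)
            for i in front:
                del remaining[i]
        return fronts

    # Hypothetical designs: (entropy generation Ns, -thermal efficiency), both minimized.
    designs = [(0.30, -0.42), (0.25, -0.38), (0.35, -0.45), (0.32, -0.40)]
    print(non_dominated_fronts(designs))  # [[0, 1, 2], [3]]: design 3 is dominated by design 0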

A Critical Study of Neural Networks Applied to the Ion-Exchange Process

This paper presents a critical study of the application of Neural Networks to the ion-exchange process. Ion exchange is a complex non-linear process involving many factors that influence the ion uptake mechanism from the pregnant solution. The following step is the elution. Published data present empirical isotherm equations with definite shortcomings that result in unreliable predictions. Although the Neural Network simulation technique has a number of disadvantages, including its “black box” nature and a limited ability to explicitly identify possible causal relationships, it has the advantage of implicitly handling complex nonlinear relationships between dependent and independent variables. In the present paper, a Neural Network model based on the Levenberg-Marquardt back-propagation algorithm was developed using a three-layer architecture with a tangent sigmoid transfer function (tansig) in a hidden layer of 11 neurons and a linear transfer function (purelin) in the output layer. This approach was used to test the effectiveness of Neural Networks in simulating ion-exchange processes. The modeling results show excellent agreement between the experimental data and the predicted values of copper ions removed from aqueous solutions.
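
The sketch below shows the forward pass of the architecture described (a tanh hidden layer of 11 neurons and a linear output) in plain NumPy; the weights are random placeholders and the Levenberg-Marquardt training step is not reproduced.

    import numpy as np

    rng = np.random.default_rng(0)
    n_inputs, n_hidden, n_outputs = 4, 11, 1   # 4 hypothetical process variables

    # Random placeholder weights; in the paper these would come from
    # Levenberg-Marquardt back-propagation training.
    W1, b1 = rng.normal(size=(n_hidden, n_inputs)), np.zeros(n_hidden)
    W2, b2 = rng.normal(size=(n_outputs, n_hidden)), np.zeros(n_outputs)

    def forward(x):
        """Three-layer network: tanh ('tansig') hidden layer, linear ('purelin') output."""
        h = np.tanh(W1 @ x + b1)
        return W2 @ h + b2

    x = np.array([0.5, 0.2, 0.8, 0.1])   # hypothetical normalized inputs
    print(forward(x))                    # predicted copper removal (placeholder output)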

Mapping SOA and Outsourcing on NEBIC: A Dynamic Capabilities Perspective Approach

This article is an extension and a practical application of Wheeler's NEBIC theory (Net Enabled Business Innovation Cycle). NEBIC theory is a new approach in IS research and can be used for dynamic environments related to new technology. Firms can follow market changes rapidly with the support of IT resources. Flexible firms adapt their market strategies and respond more quickly to customers' changing behaviors. When every leading firm in an industry has access to the same IT resources, the way these IT resources are managed determines a firm's competitive advantage or disadvantage. From the Dynamic Capabilities Perspective and from the NEBIC theory newly introduced by Wheeler, we know that IT resources alone cannot deliver customer value; only a good configuration of those resources can guarantee customer value, by choosing the right emerging technology and grasping economic opportunities through business innovation and growth. We found evidence in the literature that SOA (Service Oriented Architecture) is a promising emerging technology which can deliver the desired economic opportunity through modularity, flexibility and loose coupling. SOA can also help firms to connect in networks, which can open a new window of opportunity to collaborate in innovation and the right kind of outsourcing.

Development of a Pipeline Monitoring System by Bio-mimetic Robots

Pipeline exploration is one of the many applications of bio-mimetic robots. The robot may work in ordinary buildings, for example between ceilings and in ducts, as well as in the complicated and massive pipeline systems of large industrial plants. The bio-mimetic robot finds any troubled area or malfunction and then reports its data. Importantly, it can not only prepare for but also react to any abnormal routes in the pipeline. Pipeline monitoring tasks require special types of mobile robots. For effective movement along a pipeline, the robot moves in a manner similar to that of insects or crawling animals. While moving along the pipelines, a pipeline monitoring robot has the important task of recognizing the shape of the approaching path on the pipes. In this paper, we propose an effective solution to this pipeline pattern recognition problem, based on fuzzy classification rules applied to the measured IR distance data.
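
As an illustration of the general idea rather than the paper's actual rule base, the sketch below classifies a hypothetical IR distance reading as near, medium or far using triangular fuzzy membership functions and a max-membership rule.

    def triangular(x, a, b, c):
        """Triangular membership function with feet at a and c and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Hypothetical fuzzy sets over IR distance readings in centimetres.
    fuzzy_sets = {
        "near":   (0.0, 5.0, 15.0),
        "medium": (10.0, 20.0, 30.0),
        "far":    (25.0, 40.0, 60.0),
    }

    def classify_distance(reading_cm):
        """Max-membership classification of a single IR distance reading."""
        memberships = {label: triangular(reading_cm, *params)
                       for label, params in fuzzy_sets.items()}
        return max(memberships, key=memberships.get), memberships

    print(classify_distance(12.0))  # ('near', {...}) for this made-up reading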

New Features for Specific JPEG Steganalysis

We present in this paper a new approach for specific JPEG steganalysis and propose studying the statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both and also control the alteration of the compressed data. We have noticed a deviation of the entropy of the compressed data after a first embedding. This deviation is greater when the image is a cover medium than when the image is a stego image. To observe this deviation, we introduce new statistical features and combine them with the Multiple Embedding Method. This approach is motivated by the Avalanche Criterion of the JPEG lossless compression step. This criterion makes it possible to design detectors whose detection rates are independent of the payload. Finally, we design a Fisher-discriminant-based classifier for the well-known steganographic algorithms Outguess, F5 and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work with low embedding rates (below 10^-5), and, according to the avalanche criterion of the RLE and Huffman compression steps, its efficiency is independent of the quantity of hidden information.
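
A minimal two-class Fisher linear discriminant on made-up feature vectors, as a sketch of the kind of classifier used; the actual features (entropy deviations of the compressed data) are not reproduced here.

    import numpy as np

    def fisher_discriminant(X_cover, X_stego):
        """Return the Fisher projection vector w and a decision threshold for two classes."""
        m0, m1 = X_cover.mean(axis=0), X_stego.mean(axis=0)
        # Within-class scatter matrix.
        Sw = np.cov(X_cover, rowvar=False) + np.cov(X_stego, rowvar=False)
        w = np.linalg.solve(Sw, m1 - m0)          # w proportional to Sw^{-1} (m1 - m0)
        threshold = 0.5 * (w @ m0 + w @ m1)       # midpoint of the projected class means
        return w, threshold

    # Hypothetical 2-D feature vectors (e.g. entropy-based statistics) per image.
    rng = np.random.default_rng(1)
    covers = rng.normal([0.0, 0.0], 0.3, size=(50, 2))
    stegos = rng.normal([1.0, 0.8], 0.3, size=(50, 2))

    w, t = fisher_discriminant(covers, stegos)
    sample = np.array([0.9, 0.7])                 # made-up test feature vector
    print("stego" if w @ sample > t else "cover")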

Power Efficient OFDM Signals with Reduced Symbol's Aperiodic Autocorrelation

Three new algorithms, based on minimizing the autocorrelation of the transmitted symbols and on the SLM approach, are proposed; they are computationally less demanding. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which results in a reduction of the PAPR. The second algorithm generates multiple random sequences from the sequence produced by the first algorithm, all with the same autocorrelation value of 1. Of these, the sequence with the minimum PAPR is transmitted. The third algorithm is an extension of the second and requires minimal side information to be transmitted. Multiple sequences are generated by modifying a fixed number of complex numbers in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving the minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that a significant reduction in PAPR is achieved using the proposed algorithms.
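
As context, the sketch below computes the PAPR of an OFDM symbol via the IFFT and keeps the lowest-PAPR candidate among randomly phase-rotated versions of the same data, in the spirit of SLM; it is not the paper's autocorrelation-based method, and the parameters are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 256                                         # number of subcarriers

    def papr_db(freq_symbols):
        """PAPR (dB) of the time-domain OFDM symbol obtained by IFFT."""
        x = np.fft.ifft(freq_symbols)
        p = np.abs(x) ** 2
        return 10 * np.log10(p.max() / p.mean())

    # Random QPSK data on N subcarriers.
    data = (rng.choice([1, -1], N) + 1j * rng.choice([1, -1], N)) / np.sqrt(2)

    # SLM-style selection: try several random phase rotations of the same data
    # and keep the one with the lowest PAPR (the phase sequence is the side information).
    candidates = [data * np.exp(1j * rng.uniform(0, 2 * np.pi, N)) for _ in range(8)]
    best = min(candidates, key=papr_db)
    print(round(papr_db(data), 2), "->", round(papr_db(best), 2))  # PAPR before/after (dB)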

Application of Multi-objective Optimization Packages in Design of an Evaporator Coil

A novel methodology has been used to design a refrigerant evaporator coil. The methodology follows a complete Computer Aided Design / Computer Aided Engineering approach, using a Computational Fluid Dynamics / Finite Element Analysis model that is executed many times by a commercial optimizer for the thermal-fluid exploration of several design configurations. Hence the design is carried out automatically by parallel computations, with an optimization package taking the decisions rather than the design engineer. The engineer instead takes decisions regarding the physical settings and initialization of the computational models, the number and range of the geometrical parameters of the coil fins, and the optimization tools to be employed. The final coil geometry was found to be better than the initial design.

Interactive Methods of Design Education as the Principles of Social Implications of Modern Communities

The term interactive education refers to the multidisciplinary aspects of distance education that follow contemporary means around a common basis with different functional requirements. The aim of this paper is to reflect on new techniques in education together with new methods and inventions. These methods are better supported by interactivity. The integration of interactive facilities into distance learning is not a new concept, but the application of these methods to design is only now being adapted to design education. This paper presents the general approach of this method and, after the analysis of different samples, identifies the advantages and disadvantages of these approaches. The method of this paper is to evaluate the related samples and then analyze the main hypothesis. The main focus is on the formation processes of this kind of education. Technological developments in education should be filtered through the necessities of design education, and the structure of the system can then be formed or renewed. The conclusion indicates that interactive design education captures not only technical and computational intelligence aspects but also aesthetic and artistic approaches, brought together around the same purpose.

Lodging Business Management in Nakhon Pathom with a Sufficiency Economy Approach

The objectives of this research are to identify the management patterns of Nakhon Pathom lodging entrepreneurs following the sufficiency economy approach, to understand the threats that affect this sector, and to design a suitable management model to sustain their businesses in the Nakhon Pathom style. What will happen if they do not use this approach? Will they face a financial crisis? The data and information were collected through informal discussions with 12 managers and 400 questionnaires. A mixed method combining qualitative and quantitative research is used. Bent Flyvbjerg's concept of phronesis is utilized for the analysis. Our research will show that the sufficiency economy can help small business firms solve their problems. We believe that the results of our research will provide a financial model for solving many of the entrepreneurs' problems and can serve as a model for other provinces of Thailand.

Recursive Wiener-Khintchine Theorem

The Power Spectral Density (PSD) computed by taking the Fourier transform of the auto-correlation function (Wiener-Khintchine theorem) gives better results for noisy data than the Periodogram approach. However, the computational complexity of the Wiener-Khintchine approach is higher than that of the Periodogram approach. For the computation of the short-time Fourier transform (STFT), this problem becomes even more prominent, since the PSD must be recomputed after every shift of the analysis window. In this paper, a recursive version of the Wiener-Khintchine theorem is derived using the sliding DFT approach meant for STFT computation. The computational complexity of the proposed recursive Wiener-Khintchine algorithm, for a window size of N, is O(N).
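
For reference, the non-recursive Wiener-Khintchine estimate that the paper makes recursive can be sketched as follows: the PSD is obtained as the Fourier transform of the sample autocorrelation and compared against the periodogram; the test signal is hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 256
    n = np.arange(N)
    x = np.sin(2 * np.pi * 0.1 * n) + 0.5 * rng.standard_normal(N)  # noisy test tone

    # Periodogram: squared magnitude of the DFT.
    periodogram = np.abs(np.fft.fft(x)) ** 2 / N

    # Wiener-Khintchine route: Fourier transform of the full (linear) autocorrelation.
    autocorr = np.correlate(x, x, mode="full")       # lags -(N-1) .. (N-1)
    psd_wk = np.abs(np.fft.fft(autocorr))            # length 2N-1 PSD estimate

    # Both estimates peak at the tone frequency (about 0.1 cycles/sample).
    print(np.argmax(periodogram[:N // 2]) / N)
    print(np.argmax(psd_wk[:(2 * N - 1) // 2]) / (2 * N - 1))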

A Subjectively Influenced Router for Vehicles in a Four-Junction Traffic System

A subjectively influenced router for vehicles in a four-junction traffic system is presented. The router is based on a 3-layer Backpropagation Neural Network (BPNN) and a greedy routing procedure. The BPNN determines the priorities of vehicles based on subjective criteria. The subjective criteria and the routing procedure depend on the user's routing plan for the vehicles. The routing procedure selects vehicles from their junctions based on their priorities and routes them concurrently through the traffic system. That is, when the router is provided with the desired vehicle selection criteria and routing procedure, it routes vehicles with a reasonable junction clearing time. The cost evaluation of the router determines its efficiency. In the case of a routing conflict, the router routes the vehicles in consecutive order and quarantines faulty vehicles. The simulations presented indicate that this approach is an effective strategy for structuring a subjective vehicle router.
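
As a simple illustration of the greedy step (the BPNN itself is not reproduced), the sketch below releases vehicles from four hypothetical junctions in descending order of a priority score that a trained network would normally supply.

    # Hypothetical queues: junction -> list of (vehicle_id, priority from a trained BPNN).
    junction_queues = {
        "J1": [("v1", 0.9), ("v4", 0.2)],
        "J2": [("v2", 0.7)],
        "J3": [("v3", 0.5), ("v6", 0.8)],
        "J4": [("v5", 0.3)],
    }

    def greedy_route(queues):
        """Release vehicles across all junctions in descending priority order."""
        pending = [(prio, vid, junction)
                   for junction, vehicles in queues.items()
                   for vid, prio in vehicles]
        schedule = []
        for prio, vid, junction in sorted(pending, reverse=True):
            schedule.append((vid, junction, prio))   # highest-priority vehicle goes first
        return schedule

    for vid, junction, prio in greedy_route(junction_queues):
        print(f"route {vid} from {junction} (priority {prio})")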

A Metametadata Architecture for Pedagogic Data Description

This paper focuses on a novel method for semantic searching and retrieval of information about learning materials. Metametadata encapsulate metadata instances by using the properties and attributes provided by ontologies, rather than describing learning objects directly. A novel metametadata taxonomy has been developed which provides the basis for a semantic search engine to extract, match and map queries to retrieve relevant results. The use of ontological views is the foundation for viewing the pedagogical content of metadata extracted from learning objects, using the pedagogical attributes from the metametadata taxonomy. Using the ontological approach and metametadata (based on the metametadata taxonomy), we present a novel semantic searching mechanism. These three strands – the taxonomy, the ontological views, and the search algorithm – are incorporated into a novel architecture (OMESCOD) which has been implemented.

ILMI Approach for Robust Output Feedback Control of Induction Machine

In this note, the robust static output feedback stabilisation of an induction machine is addressed. The machine is described by a non-homogeneous bilinear model with structural uncertainties, and the feedback gain is computed via an iterative LMI (ILMI) algorithm.
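
As background only, the sketch below solves the basic Lyapunov LMI feasibility problem that ILMI schemes solve repeatedly at each iteration, for a hypothetical stable system matrix using cvxpy; it is not the paper's ILMI procedure for the induction machine model.

    import numpy as np
    import cvxpy as cp

    # Hypothetical state matrix of a stable linear system (stands in for one ILMI iterate).
    A = np.array([[0.0, 1.0],
                  [-2.0, -3.0]])
    n = A.shape[0]

    P = cp.Variable((n, n), symmetric=True)
    eps = 1e-6
    constraints = [P >> eps * np.eye(n),                 # P positive definite
                   A.T @ P + P @ A << -eps * np.eye(n)]  # Lyapunov inequality
    problem = cp.Problem(cp.Minimize(0), constraints)
    problem.solve(solver=cp.SCS)

    print(problem.status)   # 'optimal' indicates the LMI is feasible, i.e. A is stable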

Spatial Thinking Issues: Towards a Rural Sociological Research Agenda in the Third Millennium

Does the spatial perspective provide a common thread for rural sociology? Have rural sociologists succeeded in bringing order to their data using spatial analysis models and techniques? A tentative answer to such questions, which serve as touchstones of theoretical and applied sociological studies in rural areas, is the point at issue in the present paper. Spatial analyses have changed the way rural sociologists approach scientific problems. Rural sociology is spatial by nature because many, if not most, of its research topics have a spatial “awareness.” However, such spatial awareness is not quite the same as spatial analysis, because it is not typically associated with underlying theories and hypotheses about spatial patterns that are designed to be tested for their specific spatial content. This paper presents pressing issues for future research to reintroduce mainstream rural sociology to the concept of space.