An Adverse Model for Price Discrimination in the Case of Monopoly

We consider a Principal-Agent model in which the Principal is a seller who does not know precisely how much the buyer (the Agent) is willing to pay for the good. The buyer's preferences are hence his private information. The model corresponds to the nonlinear pricing problem of Maskin and Riley. We assume there are three types of Agents. The model is solved using "informational rents" as variables. In the last section we present the main characteristics of the optimal contracts under asymmetric information and some possible extensions of the model.
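
For readers unfamiliar with the rent-based formulation, the block below sketches a standard three-type nonlinear-pricing program written with informational rents as the choice variables. The notation (types theta_1 < theta_2 < theta_3 with probabilities p_i, gross utility theta_i v(q_i), transfer t_i, rent U_i = theta_i v(q_i) - t_i, production cost C(q)) is a common textbook specification and is illustrative only, not necessarily the paper's.

```latex
% Standard three-type screening program with informational rents as variables.
\max_{\{q_i,\,U_i\}}\;\sum_{i=1}^{3} p_i\bigl[\theta_i v(q_i) - C(q_i) - U_i\bigr]
\quad\text{s.t.}\quad
U_1 \ge 0,\qquad
U_2 \ge U_1 + (\theta_2-\theta_1)\,v(q_1),\qquad
U_3 \ge U_2 + (\theta_3-\theta_2)\,v(q_2),
```

together with the monotonicity condition q_1 <= q_2 <= q_3. At the optimum the participation constraint of the lowest type and the downward incentive constraints bind, which is what makes the informational rents convenient solution variables.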

Frame Texture Classification Method (FTCM) Applied on Mammograms for Detection of Abnormalities

Texture classification is an important image processing task with a broad application range. Many different techniques for texture classification have been explored. Using sparse approximation as a feature extraction method for texture classification is a relatively new approach, and Skretting et al. recently presented the Frame Texture Classification Method (FTCM), showing very good results on classical texture images. As an extension of that work, the FTCM is here tested on a real-world application: the detection of abnormalities in mammograms. Some extensions to the original FTCM that are useful in certain applications are implemented: two different smoothing techniques and a vector augmentation technique. Both the detection of microcalcifications (as a primary detection technique and as the last stage of a detection scheme) and the detection of soft-tissue lesions in mammograms are explored. All the results are interesting, and especially the results of using FTCM on regions of interest as the last stage in a detection scheme for microcalcifications are promising.
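
As a rough illustration of classification by sparse approximation (not the authors' implementation), the sketch below assigns a vectorized image patch to the class whose frame (dictionary) gives the smallest sparse-approximation error; the per-class frames are assumed to have been trained beforehand, and random matrices stand in for them here.

```python
# Illustrative sketch of frame-based texture classification by sparse approximation.
import numpy as np

def omp(F, x, s):
    """Orthogonal Matching Pursuit: approximate x with s atoms of frame F, return residual."""
    residual, support = x.copy(), []
    for _ in range(s):
        support.append(int(np.argmax(np.abs(F.T @ residual))))
        coeffs, *_ = np.linalg.lstsq(F[:, support], x, rcond=None)
        residual = x - F[:, support] @ coeffs
    return residual

def classify_patch(x, frames, sparsity=4):
    """Return the index of the class frame with the smallest approximation error."""
    errors = [np.linalg.norm(omp(F, x, sparsity)) for F in frames]
    return int(np.argmin(errors))

# Toy usage with random frames standing in for trained class dictionaries.
rng = np.random.default_rng(0)
frames = [rng.standard_normal((64, 128)) for _ in range(2)]   # two texture classes
patch = rng.standard_normal(64)                                # 8x8 patch, vectorized
print(classify_patch(patch, frames))
```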

Handover Strategies Challenges in Wireless ATM Networks

To support user mobility in a wireless network, new mechanisms such as paging, location updating, routing, and handover are needed and fundamental. Another key feature is the mobile QoS offered by WATM. Several ATM network protocols should be updated to implement mobility management and to maintain the QoS already provided by ATM over wireless ATM networks. A survey of the various schemes and types of handover is provided. The handover procedure guarantees re-establishment of the terminal's connection when it moves between areas covered by different base stations; it allows the user's radio link to be transferred without interrupting a connection. However, failure to offer efficient solutions will result in significant packet loss during handover, severe delays, and degradation of the QoS offered to applications. This paper reviews the requirements, characteristics, and open issues of wireless ATM, particularly with regard to handover. It introduces key aspects of WATM and the mobility extensions added to the fixed ATM network. We propose a flexible approach for handover management that minimizes QoS deterioration. The functional entities of this flexible approach are discussed in order to achieve minimum impact on connection quality when a mobile terminal (MT) crosses base station (BS) boundaries.
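
As a generic illustration of the handover-triggering idea (not the flexible approach proposed in the paper), the sketch below uses a hysteresis margin so that a mobile terminal switches to a candidate base station only when its signal exceeds the serving base station's by a margin, limiting ping-pong handovers.

```python
# Generic hysteresis-based handover trigger (illustrative values and names only).
def handover_decision(serving_rssi_dbm, candidate_rssi_dbm, hysteresis_db=3.0):
    """Return True if a handover to the candidate base station should be triggered."""
    return candidate_rssi_dbm > serving_rssi_dbm + hysteresis_db

print(handover_decision(-82.0, -80.5))   # False: margin not reached
print(handover_decision(-82.0, -78.0))   # True: candidate is 4 dB stronger
```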

Fuzzy Ideology based Long Term Load Forecasting

Load forecasting plays a paramount role in the operation and management of power systems. Accurate estimation of future power demands for various lead times facilitates the task of generating power reliably and economically. The forecasting of future loads for a relatively large lead time (months to a few years) is studied here (long-term load forecasting). Among the various techniques used in load forecasting, artificial intelligence techniques provide greater forecast accuracy than conventional techniques. Fuzzy logic, a very robust artificial intelligence technique, is used in this paper to forecast load on a long-term basis. The paper gives a general algorithm for long-term load forecasting. The algorithm is an extension of a short-term load forecasting method to long-term load forecasting and concentrates not only on the forecast values of load but also on the errors incorporated into the forecast. Hence, by correcting the errors in the forecast, forecasts with very high accuracy have been achieved. The algorithm is demonstrated with data collected for the residential sector (LT2(a) type load: domestic consumers). Load is determined for three consecutive years (April 2006 to March 2009) in order to demonstrate the efficiency of the algorithm, and the load is then forecast for the next two years (April 2009 to March 2011).
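
A hedged sketch of the general idea (not the paper's rule base or membership functions): a crisp long-term forecast is corrected by a simple Mamdani-style fuzzy inference on the recent forecast error, using triangular membership functions. All numeric values below are made up for illustration.

```python
# Fuzzy correction of a crisp load forecast based on recent forecast error.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_correction(error_pct):
    """Map a recent forecast error (%) to a correction (%) via three rules."""
    mu_neg  = tri(error_pct, -20, -10, 0)    # forecast was too high
    mu_zero = tri(error_pct, -5, 0, 5)       # forecast was about right
    mu_pos  = tri(error_pct, 0, 10, 20)      # forecast was too low
    centroids = np.array([-8.0, 0.0, 8.0])   # consequents, combined by weighted average
    weights = np.array([mu_neg, mu_zero, mu_pos])
    return float(weights @ centroids / (weights.sum() + 1e-12))

base_forecast_mw = 120.0
print(base_forecast_mw * (1 + fuzzy_correction(error_pct=7.0) / 100))
```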

Flexible Workplaces Fostering Knowledge Workers Informal Learning: The Flexible Office Case

Organizations face challenges in supporting knowledge workers because of the particular requirements these workers have for an environment that supports their self-guided learning activities, which are important for increasing their productivity and for developing creative solutions to non-routine problems. Face-to-face knowledge sharing remains crucial in spite of the large number of knowledge management instruments that aim at supporting a more impersonal transfer of knowledge. This paper first describes the main criteria for a conceptual and technical solution targeted at flexible management of office space, which aims at assigning to the same room those knowledge workers who are most likely to thrive when brought together, thus enhancing their knowledge-work productivity. The paper then reflects on lessons learned from the implementation and operation of such a solution in a project-focused organization and derives several implications for future extensions that aim to foster problem solving, informal learning, and personal development.

A Probability based Pair Extension Method in Protein 2-DE Gel Image Analysis

The two-dimensional gel electrophoresis (2-DE) method is widely used in proteomics to separate thousands of proteins in a sample. By comparing the expression levels of proteins in a normal sample with those in a diseased one, it is possible to identify a meaningful set of marker proteins for the targeted disease. The major shortcomings of this approach are the inherent noise and the irregular geometric distortions of the spots observed in 2-DE images. Various experimental conditions are the major causes of these problems, which eventually lead to incorrect conclusions in the protein analysis of samples. In order to minimize their influence, this paper proposes a partition-based pair extension method that performs spot matching on a set of gel images multiple times and segregates the more reliable mapping results, which improves the accuracy of gel image analysis. The improved accuracy of the proposed method is demonstrated through various experiments on real 2-DE images of human liver tissue.
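
The sketch below illustrates one plausible reading of the repeated-matching idea (it is not the authors' algorithm): spots of two gels are matched repeatedly under small random perturbations, and only pairs matched consistently across runs are kept as reliable matches. The thresholds and coordinates are invented for the example.

```python
# Repeated nearest-neighbour spot matching with agreement voting.
import numpy as np

def nearest_neighbour_match(ref, tgt, max_dist=5.0):
    """Match each reference spot to its nearest target spot within max_dist."""
    pairs = {}
    for i, p in enumerate(ref):
        d = np.linalg.norm(tgt - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_dist:
            pairs[i] = j
    return pairs

def reliable_pairs(ref, tgt, runs=20, jitter=1.0, min_agreement=0.8):
    """Keep pairs that appear in at least min_agreement of the perturbed runs."""
    rng = np.random.default_rng(0)
    votes = {}
    for _ in range(runs):
        noisy = tgt + rng.normal(0.0, jitter, tgt.shape)
        for pair in nearest_neighbour_match(ref, noisy).items():
            votes[pair] = votes.get(pair, 0) + 1
    return {p for p, v in votes.items() if v / runs >= min_agreement}

ref = np.array([[10.0, 10.0], [40.0, 25.0], [70.0, 60.0]])
tgt = ref + np.array([[1.0, -0.5], [0.5, 1.0], [-1.0, 0.5]])
print(reliable_pairs(ref, tgt))
```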

Perfect Plastic Deformation of a Circular Thin Bronze Plate due to the Growth and Collapse of a Vapour Bubble

The dynamics of a vapour bubble generated by a high local energy input near a circular thin bronze plate, in the absence of buoyancy forces, is numerically investigated in this paper. The bubble is generated near the thin bronze plate, and during its growth and collapse it deforms the nearby plate. The Boundary Integral Equation Method is employed for numerical simulation of the problem. The fluid is assumed to be incompressible and inviscid, the flow irrotational, and the surface tension on the bubble boundary is neglected; the fluid flow around the vapour bubble can therefore be treated as a potential flow. Furthermore, the thin bronze plate is assumed to have perfectly plastic behaviour. Results show that the displacement of the circular thin bronze plate has a considerable effect on the dynamics of the nearby vapour bubble. It is found that by decreasing the thickness of the thin bronze plate, the growth and collapse rate of the bubble becomes higher and consequently the lifetime of the bubble becomes shorter.
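
The paper's simulation is a Boundary Integral Equation Method with a deformable boundary; the sketch below is only a much simpler stand-in that integrates the spherically symmetric Rayleigh equation for a bubble with a polytropic gas inside, in an inviscid incompressible liquid with no surface tension, to show the growth-collapse dynamics that drive the plate loading. All parameter values are illustrative.

```python
# Growth and collapse of a spherical bubble (simplified stand-in, not BIEM).
import numpy as np
from scipy.integrate import solve_ivp

rho, p_inf = 1000.0, 1.0e5            # liquid density (kg/m^3), ambient pressure (Pa)
R0, p0, gamma = 1.0e-4, 5.0e5, 1.4    # initial radius, initial bubble pressure, polytropic index

def rayleigh(t, y):
    R, Rdot = y
    p_bubble = p0 * (R0 / R) ** (3 * gamma)              # polytropic gas inside the bubble
    Rddot = ((p_bubble - p_inf) / rho - 1.5 * Rdot**2) / R
    return [Rdot, Rddot]

sol = solve_ivp(rayleigh, (0.0, 3.0e-5), [R0, 0.0], max_step=1.0e-8, rtol=1e-8)
R = sol.y[0]
print(f"max radius {R.max()*1e6:.1f} um, min radius {R.min()*1e6:.1f} um")
```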

Performance Evaluation of AOMDV-PAMAC Protocols for Ad Hoc Networks

Power consumption of nodes in ad hoc networks is a critical issue, as the nodes predominantly operate on batteries. In order to improve the lifetime of an ad hoc network, all the nodes must be utilized evenly and the power required for connections must be minimized. In this project a link-layer algorithm known as the Power Aware Medium Access Control (PAMAC) protocol is proposed, which enables the network layer to select the route with the minimum total power requirement among the possible routes between a source and a destination, provided all nodes in the route have battery capacity above a threshold. When the battery capacity of a node goes below the predefined threshold, routes going through that node are avoided and the node acts only as a source or destination. Further, the first few nodes whose battery power drains to the set threshold value are pushed to the exterior part of the network and the nodes in the exterior are brought to the interior, since less total power is then required to forward packets for each connection. The network-layer protocol AOMDV is basically an extension of the AODV routing protocol; AOMDV is designed to form multiple routes to the destination and it also avoids loop formation, thereby reducing unnecessary congestion on the channel. In this project, the performance of AOMDV is evaluated using PAMAC as the MAC-layer protocol; the average power consumption, throughput, and average end-to-end delay of the network are calculated and the results are compared with those of the other network-layer protocol, AODV.
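
A hedged sketch of the route-selection idea described above (not the PAMAC/AOMDV code): among routes whose intermediate nodes all have battery capacity above a threshold, pick the route with minimum total transmission power, here via Dijkstra on per-link power costs. The topology, costs, and battery levels are invented for the example.

```python
# Minimum-total-power route selection with a battery-capacity threshold.
import heapq

def min_power_route(links, battery, src, dst, threshold=0.2):
    """links: {(u, v): power_cost}; battery: {node: level in [0, 1]}."""
    graph = {}
    for (u, v), cost in links.items():
        graph.setdefault(u, []).append((v, cost))
        graph.setdefault(v, []).append((u, cost))
    heap, best = [(0.0, src, [src])], {}
    while heap:
        power, node, path = heapq.heappop(heap)
        if node == dst:
            return power, path
        if node in best and best[node] <= power:
            continue
        best[node] = power
        for nxt, cost in graph.get(node, []):
            # Intermediate nodes below the battery threshold are avoided;
            # they may still act as source or destination.
            if nxt != dst and battery[nxt] < threshold:
                continue
            heapq.heappush(heap, (power + cost, nxt, path + [nxt]))
    return None

links = {("S", "A"): 1.0, ("A", "D"): 1.0, ("S", "B"): 0.8, ("B", "D"): 0.7}
battery = {"S": 0.9, "A": 0.1, "B": 0.6, "D": 0.8}
print(min_power_route(links, battery, "S", "D"))   # avoids A, picks S-B-D
```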

Covering-based Rough sets Based on the Refinement of Covering-element

Covering-based rough sets are an extension of rough sets based on a covering instead of a partition of the universe, and they are therefore more powerful than rough sets in describing some practical problems. However, by extending rough sets, covering-based rough sets can increase the roughness of each model in recognizing objects. How to obtain better approximations from the models of covering-based rough sets is therefore an important issue. In this paper, two concepts, determinate elements and indeterminate elements in a universe, are proposed and given precise definitions. This research makes a reasonable refinement of the covering elements from a new viewpoint, and the refinement may generate better approximations in covering-based rough set models. To support this claim, the refinement is applied to eight major covering-based rough set models adapted from the literature. In all these models the lower approximation increases effectively, and correspondingly the upper approximation decreases, with the exception of two models in some special situations. Therefore, the roughness in recognizing objects is reduced. This research provides a new approach to the study and application of covering-based rough sets.
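
As a hedged illustration, the sketch below uses one common pair of covering-based approximation operators (lower approximation = union of covering blocks contained in X, upper approximation = union of blocks that meet X) to show how refining a covering element can tighten both approximations on a toy universe. The particular refinement shown is illustrative only and is not the paper's refinement procedure.

```python
# Covering-based lower/upper approximations before and after refining one block.
def lower(cover, X):
    """Union of covering blocks entirely contained in X."""
    return set().union(*[K for K in cover if K <= X])

def upper(cover, X):
    """Union of covering blocks that intersect X."""
    return set().union(*[K for K in cover if K & X])

X = {1, 2, 3}
cover   = [frozenset({1, 2}), frozenset({2, 3, 4}), frozenset({4, 5})]
refined = [frozenset({1, 2}), frozenset({2, 3}), frozenset({4}), frozenset({4, 5})]  # {2,3,4} split

print(lower(cover, X), upper(cover, X))       # {1, 2} and {1, 2, 3, 4}
print(lower(refined, X), upper(refined, X))   # {1, 2, 3} and {1, 2, 3}: tighter both ways
```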

Reliable One-Dimensional Model of Two-Dimensional Insulated Oval Duct Considering Heat Radiation

Reliable results for an insulated oval duct considering heat radiation are obtained based on the accurate oval perimeter computed by an integral method together with the one-dimensional Plane Wedge Thermal Resistance (PWTR) model. This is an extension of a former study of an insulated oval duct neglecting heat radiation. It is found that in the practical situations with long-short-axes ratio a/b 4.5% while t/R2
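
As a hedged illustration of how radiation enters a one-dimensional insulation calculation (a simplified plane-wall stand-in, not the paper's PWTR model or geometry), the sketch below balances conduction through the insulation against combined convection and radiation at the outer surface, solved by fixed-point iteration. All property values are illustrative.

```python
# Outer surface temperature of an insulation layer with convection + radiation.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/m^2 K^4

def surface_temperature(T_in=400.0, T_amb=300.0, k=0.05, t=0.03, h=8.0, eps=0.9):
    """All temperatures in kelvin; k: insulation conductivity (W/m K), t: thickness (m)."""
    Ts = T_amb + 10.0                      # initial guess
    for _ in range(200):
        q_cond = k * (T_in - Ts) / t                              # W/m^2 through insulation
        h_rad = eps * SIGMA * (Ts**2 + T_amb**2) * (Ts + T_amb)   # linearized radiation coefficient
        Ts_new = T_amb + q_cond / (h + h_rad)
        if abs(Ts_new - Ts) < 1e-6:
            break
        Ts = 0.5 * (Ts + Ts_new)           # damped update for stability
    return Ts

print(f"outer surface temperature ~ {surface_temperature():.1f} K")
```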

Acoustic Analysis with Consideration of Damping Effects of Air Viscosity in Sound Pathway

Sound pathways in the enclosures of small earphones are very narrow. In such narrow pathways, the speed of sound propagation and the phase of sound waves change because of air viscosity. We have developed a new finite element method that includes the damping effects of air viscosity for modeling the sound pathway. The method is developed as an extension of an existing finite element method for porous sound-absorbing materials. The numerical results obtained with the proposed finite element method are validated against existing calculation methods.
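
The sketch below is a generic damped one-dimensional acoustic finite element model, not the authors' viscothermal formulation: the Helmholtz equation for a narrow duct is assembled with a proportional damping term standing in for air-viscosity losses, and the complex system (K + i*omega*C - (omega/c)^2 M) p = f is solved directly. The damping coefficient alpha and all dimensions are assumed values for illustration.

```python
# 1-D acoustic FEM of a narrow duct with a damping term standing in for viscosity.
import numpy as np

def duct_response(freq=2000.0, length=0.02, n_el=100, c=343.0, alpha=0.002):
    """Return |p| at the closed end when unit pressure is prescribed at the inlet."""
    omega, h, n = 2 * np.pi * freq, length / n_el, n_el + 1
    K = np.zeros((n, n))
    M = np.zeros((n, n))
    ke = np.array([[1.0, -1.0], [-1.0, 1.0]]) / h
    me = np.array([[2.0, 1.0], [1.0, 2.0]]) * h / 6.0
    for e in range(n_el):                       # assemble global matrices
        idx = np.ix_([e, e + 1], [e, e + 1])
        K[idx] += ke
        M[idx] += me
    C = alpha * M                               # proportional damping stand-in for viscosity
    A = K + 1j * omega * C - (omega / c) ** 2 * M
    # Prescribe p = 1 at node 0 (inlet); the far end is left rigid (natural condition).
    f = -A[1:, 0] * 1.0
    p = np.linalg.solve(A[1:, 1:], f)
    return abs(p[-1])

print(f"|p| at closed end: {duct_response():.3f}")
```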

Migration and Accumulation of Artificial Radionuclides in the System Water-Soil-Plants Depending on Polymers Applying

The possibility of radionuclide-related contamination of agricultural lands makes it necessary to apply special protective measures in plant growing. The aim of this research is to elucidate the influence of polymer application on the biological migration of the man-made radionuclides 90Sr and 137Cs in the water-soil-plant system. The tests are being carried out under field conditions, with and without application of polymers in the root-inhabited media, in a zone of higher radioecological tension (within a radius of 7 km from the Armenian Nuclear Power Plant). Polymers based on K+, Ca2+, and mixed K+-Ca2+ ions were tested. The productivity of pepper, depending on the presence and type of polymer material, and the content of artificial radionuclides in water, soil, and plant material have been determined. The character of the influence of the different polymers on the migration and accumulation of artificial radionuclides in the water-soil-plant system and on their accumulation in the plants has been clarified.

LOWL: Logic and OWL, an Extension

Current research on the semantic web aims at making web pages meaningful for machines, and ontologies play a primary role in this effort. We believe that logic can help ontology languages (such as OWL) to be more fluent and efficient. In this paper we combine logic with OWL to reduce some disadvantages of this language: we extend OWL with logic and show how logic can satisfy our future expectations of an ontology language.
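
An illustrative example of the kind of gap a logic layer can fill (not taken from the paper): the simple Horn rule below cannot be stated directly in OWL 1 DL, which lacks role composition, but it is trivial to express once rules are layered on top of the ontology.

```latex
% A rule combining two ontology properties into a third.
\mathit{hasParent}(x,y)\ \wedge\ \mathit{hasBrother}(y,z)\ \rightarrow\ \mathit{hasUncle}(x,z)
```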

Application of Smooth Ergodic Hidden Markov Model in Text to Speech Systems

In developing a text-to-speech system, it is well known that the accuracy of the information extracted from a text is crucial for producing high-quality synthesized speech. In this paper, a new scheme for converting text into its equivalent phonetic spelling is introduced and developed. The method is applicable to many applications in text-to-speech conversion systems and has several advantages over other methods; it can also complement other methods in order to improve their performance. The proposed method is a probabilistic model based on a Smooth Ergodic Hidden Markov Model, which can be considered an extension of the HMM. The method is applied to the Persian language and its accuracy in converting text to phonetics is evaluated using simulations.
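
For context, the sketch below shows standard Viterbi decoding on a small, fully connected (ergodic) HMM with phoneme classes as hidden states and letters as observations. It illustrates only the decoding step of such a model; the paper's "smooth" variant and its Persian-specific parameters are not reproduced, and all probabilities are made up.

```python
# Viterbi decoding on a toy ergodic HMM (letters -> phoneme states).
import numpy as np

def viterbi(obs, pi, A, B):
    """Return the most likely state sequence for the observation indices `obs`."""
    n_states, T = A.shape[0], len(obs)
    delta = np.zeros((T, n_states))
    psi = np.zeros((T, n_states), dtype=int)
    delta[0] = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, T):
        scores = delta[t - 1][:, None] + np.log(A)    # score of each previous -> current state
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

# Toy model: 2 phoneme states, 3 letter symbols (all parameters invented).
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3], [0.4, 0.6]])               # ergodic: all transitions allowed
B  = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
print(viterbi([0, 1, 2, 2], pi, A, B))
```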

Creating Customer Value through SOA and Outsourcing: A NEBIC Approach

This article is an extension and a practical application of Wheeler's NEBIC theory (Net Enabled Business Innovation Cycle). NEBIC theory is a new approach in IS research and can be used for dynamic environments related to new technology. Firms can follow market changes rapidly with the support of IT resources. Flexible firms adapt their market strategies and respond more quickly to customers' changing behaviors. When every leading firm in an industry has access to the same IT resources, the way these IT resources are managed will determine the competitive advantages or disadvantages of a firm. From the dynamic capabilities perspective and from Wheeler's newly introduced NEBIC theory, we know that IT resources alone cannot deliver customer value; a good configuration of those resources can, however, deliver customer value by choosing the right emerging technology and grasping the right economic opportunities through business innovation and growth. We found evidence in the literature that SOA (Service Oriented Architecture) is a promising emerging technology that can deliver the desired economic opportunities through modularity, flexibility, and loose coupling. SOA can also help firms connect in networks, which can open a new window of opportunity for collaboration in innovation and the right kind of outsourcing. Many articles and research reports indicate that the failure rate in outsourcing is very high, but at the same time research shows that successful outsourcing projects add tangible and intangible benefits for the service consumer. Business executives and policy makers in the West should not be afraid of outsourcing; rather, they should choose the right strategy and use emerging technology to significantly reduce the outsourcing failure rate.

Identification of the Causes of Construction Delay in Malaysia

Construction delay is unavoidable in developing countries, including Malaysia. It is defined as time overrun or extension of time for completion of a project. The purpose of the study is to determine the causes of delay in the Malaysian construction industry based on previous worldwide research. The field survey conducted includes experienced developers, consultants, and contractors in Malaysia. Thirty-four causes of construction delay were identified and 24 were selected using Rasch model analysis. The analysis results will be used as the baseline for subsequent research on the causes of delay in Malaysian construction projects at higher learning institutions.

PTFE Capillary-Based DNA Amplification within an Oscillatory Thermal Cycling Device

This study describes a capillary-based device integrated with heating and cooling modules for the polymerase chain reaction (PCR). The device consists of a polytetrafluoroethylene (PTFE) reaction capillary and aluminum blocks, and is equipped with two cartridge heaters, a thermoelectric (TE) cooler, a fan, and several thermocouples for temperature control. The cartridge heaters are placed in the heating blocks and maintained at two different temperatures to achieve the denaturation and extension steps. Thermocouples inserted into the capillary are used to obtain the transient temperature profiles of the reaction sample during the thermal cycles. A 483-bp DNA template is amplified successfully in both the designed system and a traditional thermal cycler. This work should be of interest to those involved in high-temperature reactions, genomics, or cell analysis.

Elastic Strain-Concentration Factor of Notched Bars under Combined Loading of Static Tension and Pure Bending

The effect of notch depth on the new elastic strain-concentration factor (SNCF) of rectangular bars with a single-edge U-notch under combined loading is studied here. The finite element method (FEM) and a superposition technique are used in the current study. The new elastic SNCF under combined loading of static tension and pure bending has been defined under a triaxial stress state. The employed specimens have a constant gross thickness of 16.7 mm, and the net-section thickness is varied to give net-to-gross thickness ratios ho/Ho from 0.2 to 0.95. The results indicate that the elastic SNCF for combined loading increases with increasing notch depth up to ho/Ho = 0.7 and then decreases sharply with further increase in notch depth. They also indicate that the elastic SNCF for combined loading is greater than that for pure bending and less than that for static tension for 0.2 ≤ ho/Ho ≤ 0.7, whereas for shallow notches (i.e. 0.8 ≤ ho/Ho ≤ 0.95) it exceeds the elastic SNCF for static tension and is less than that for pure bending.
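
For context only, the block below recalls the classical definition of a strain-concentration factor and the assumption, consistent with the superposition technique mentioned above, that the nominal net-section strain under combined loading is the sum of the tension and bending contributions; the paper's "new" SNCF refines this definition for the triaxial stress state at the notch root, and the exact form is given in the paper.

```latex
% Classical strain-concentration factor (assumed background, not the paper's new definition).
K_{\varepsilon} = \frac{\varepsilon_{\max}}{\varepsilon_{\mathrm{nom}}},
\qquad
\varepsilon_{\mathrm{nom}} \approx \varepsilon_{\mathrm{nom}}^{\,\mathrm{tension}}
 + \varepsilon_{\mathrm{nom}}^{\,\mathrm{bending}}
 \quad\text{(superposition under combined loading).}
```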

Parallel Joint Channel Coding and Cryptography

A method of Parallel Joint Channel Coding and Cryptography is analyzed and simulated in this paper. The method is an extension of Soft Input Decryption with feedback, which is used to improve the channel decoding of secured messages. Parallel Joint Channel Coding and Cryptography improves the coding gain of channel decoding by more than 2 dB. These results stem from the combination of receiver components and their interoperability.
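
A minimal sketch of the soft-input decryption idea that the method builds on (not the authors' simulator or their parallel scheme): the channel decoder's reliability values (|L-values|) are used to flip the least reliable bits of a decoded, cryptographically protected block until its check value verifies. A SHA-256 digest stands in for the cryptographic check, and all data are toy values.

```python
# Soft input decryption sketch: flip least reliable bits until the hash verifies.
import hashlib
from itertools import combinations

def bits_to_bytes(bits):
    return bytes(int("".join(map(str, bits[i:i + 8])), 2) for i in range(0, len(bits), 8))

def soft_input_check(bits, l_values, digest, max_flips=2):
    """Try flipping up to max_flips of the least reliable bits until the hash matches."""
    order = sorted(range(len(bits)), key=lambda i: abs(l_values[i]))[:8]  # 8 weakest bits
    for k in range(max_flips + 1):
        for idx in combinations(order, k):
            trial = list(bits)
            for i in idx:
                trial[i] ^= 1
            if hashlib.sha256(bits_to_bytes(trial)).digest() == digest:
                return trial
    return None

# Toy usage: protect 16 bits with a digest, then corrupt one low-reliability bit.
true_bits = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1]
digest = hashlib.sha256(bits_to_bytes(true_bits)).digest()
received = list(true_bits)
received[5] ^= 1                                  # bit 5 got flipped by the channel
l_values = [4.0] * 16
l_values[5] = 0.3                                 # ...and is the least reliable one
print(soft_input_check(received, l_values, digest) == true_bits)
```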

Improved K-Modes for Categorical Clustering Using Weighted Dissimilarity Measure

K-Modes is an extension of the K-Means clustering algorithm, developed to cluster categorical data, in which the mean is replaced by the mode. The similarity measure proposed by Huang is the simple matching or mismatching measure. The weights of attribute values contribute much to clustering; thus, in this paper we propose a new weighted dissimilarity measure for K-Modes based on the ratio of the frequency of attribute values in the cluster to that in the data set. The new weighted measure is evaluated on data sets obtained from the UCI data repository. The results are compared with those of K-Modes and K-representative, and show that the new measure generates clusters with higher purity.
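
The sketch below shows one plausible instantiation of the frequency-ratio weighting described above (the exact formula is defined in the paper): a mismatch on attribute j is weighted by the ratio of the mode value's relative frequency in the cluster to its relative frequency in the whole data set, so attributes that are especially characteristic of the cluster contribute more to the dissimilarity. The toy data are invented.

```python
# Weighted dissimilarity between a record and a cluster mode for categorical data.
from collections import Counter

def weighted_dissimilarity(x, mode, cluster, dataset):
    """x, mode: tuples of categorical values; cluster, dataset: lists of such tuples."""
    d = 0.0
    for j, (xj, mj) in enumerate(zip(x, mode)):
        if xj == mj:
            continue                                   # matches contribute nothing
        f_cluster = Counter(rec[j] for rec in cluster)[mj] / len(cluster)
        f_dataset = Counter(rec[j] for rec in dataset)[mj] / len(dataset)
        d += f_cluster / f_dataset if f_dataset > 0 else 1.0
    return d

dataset = [("red", "s"), ("red", "m"), ("blue", "m"), ("blue", "l"), ("red", "l")]
cluster = [("red", "s"), ("red", "m"), ("red", "l")]
mode = ("red", "m")
print(weighted_dissimilarity(("blue", "m"), mode, cluster, dataset))
```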