Determinants of Brand Equity: Offering a Model to Chocolate Industry

This study examined the underlying dimensions of brand equity in the chocolate industry. For this purpose, the researchers developed a model to identify which factors are influential in building brand equity. The second purpose was to assess the mediating effect of brand loyalty and brand image between brand attitude, brand personality, and brand association on the one hand and brand equity on the other. The study employed structural equation modeling to investigate the causal relationships between the dimensions of brand equity and brand equity itself. It specifically measured the way in which consumers' perceptions of the dimensions of brand equity affected overall brand equity evaluations. Data were collected from a sample of chocolate consumers in Iran. The results of this empirical study indicate that brand loyalty and brand image are important components of brand equity in this industry. Moreover, the role of brand loyalty and brand image as mediating factors in the formation of brand equity is supported. The principal contribution of the present research is that it provides empirical evidence of the multidimensionality of consumer-based brand equity, supporting Aaker's and Keller's conceptualizations of brand equity. The present research also enriches brand equity building by incorporating brand personality and brand image, as recommended by previous researchers. Moreover, the creation of a brand equity index for the Iranian chocolate industry in particular is novel.

Enhancing Retrieval Effectiveness of Malay Documents by Exploiting Implicit Semantic Relationship between Words

Phrases have a long history in information retrieval, particularly in commercial systems. Implicit semantic relationships between words in the form of base noun phrases (BaseNPs) have shown significant improvements in precision in many IR studies. Our research focuses on linguistic phrases, which are language dependent. Our results show that using BaseNPs can improve performance even though more than 62% of word formation in the Malay language is based on derivational affixes and suffixes.

A Critical Study of Neural Networks Applied to the Ion-Exchange Process

This paper presents a critical study of the application of Neural Networks to the ion-exchange process. Ion exchange is a complex non-linear process involving many factors that influence the ion uptake mechanisms from the pregnant solution; the following step is elution. Published data present empirical isotherm equations with definite shortcomings that result in unreliable predictions. Although the Neural Network simulation technique has a number of disadvantages, including its "black box" nature and a limited ability to explicitly identify possible causal relationships, it has the advantage of implicitly handling complex non-linear relationships between dependent and independent variables. In the present paper, a Neural Network model based on the Levenberg-Marquardt back-propagation algorithm was developed using a three-layer approach, with a tangent sigmoid transfer function (tansig) in a hidden layer of 11 neurons and a linear transfer function (purelin) in the output layer. The above-mentioned approach was used to test its effectiveness in simulating ion-exchange processes. The modeling results show excellent agreement between the experimental data and the predicted values of copper ions removed from aqueous solutions.
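The network topology described above (a tansig hidden layer of 11 neurons feeding a purelin output layer) can be sketched as a forward pass. This is an illustrative toy with arbitrary random weights and an assumed three-variable input, not the authors' trained model or their Levenberg-Marquardt training procedure.

```python
import math
import random

def tansig(x):
    # Hyperbolic-tangent sigmoid transfer function (hidden layer).
    return math.tanh(x)

def forward(x, w_hidden, b_hidden, w_out, b_out):
    # One tansig hidden layer followed by a linear (purelin) output layer.
    hidden = [tansig(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

random.seed(0)
n_inputs, n_hidden = 3, 11   # assumed: e.g. process conditions -> ion removal
w_hidden = [[random.uniform(-1, 1) for _ in range(n_inputs)]
            for _ in range(n_hidden)]
b_hidden = [random.uniform(-1, 1) for _ in range(n_hidden)]
w_out = [random.uniform(-1, 1) for _ in range(n_hidden)]
b_out = random.uniform(-1, 1)

y = forward([0.5, 0.2, 0.8], w_hidden, b_hidden, w_out, b_out)
```

In practice such a model would be trained (e.g. with Levenberg-Marquardt, as in the paper) on measured isotherm data rather than initialized randomly.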

A Survey of Business Component Identification Methods and Related Techniques

With the deepening of software reuse, component-related technologies have been widely applied in the development of large-scale complex applications. Component identification (CI) is one of the primary research problems in software reuse: domain business models are analyzed to obtain a set of business components with high reuse value and good reuse performance that support effective reuse. Based on the concept and classification of CI, its technical stack is briefly discussed from four views, i.e., the form of input business models, identification goals, identification strategies, and identification processes. Various CI methods presented in the literature are then classified into four types, i.e., domain analysis based methods, cohesion-coupling based clustering methods, CRUD matrix based methods, and other methods, and these methods are compared with respect to their advantages and disadvantages. Additionally, some insufficiencies of CI research are discussed and their causes explained. Finally, the survey concludes with some promising directions for research on this problem.

Integrating LCA into PDM for Ecodesign

Product Data Management (PDM) systems for Computer Aided Design (CAD) file management are widely established in design processes. Such a management system is indispensable for design collaboration or when design tasks are distributed. It is thus surprising that engineering design curricula have paid little attention to the education of PDM systems. The same holds for the education of ecodesign and the environmental evaluation of products. With the rise of sustainability as a strategic aspect in companies, environmental concerns are becoming a key issue in design. This paper discusses the establishment of a PDM platform to be used among technical and vocational schools in Austria. The PDM system facilitates design collaboration among these schools. Further, it is discussed how the PDM system has been prepared to facilitate the environmental evaluation of parts, components, and subassemblies of a product. By integrating a Business Intelligence solution, environmental Life Cycle Assessment and the communication of its results are enabled.

Kazakhstani Humanism: Challenges and Prospects

This article examines the emergence and development of the Kazakhstani variety of humanism. The biggest challenge for Kazakhstan in terms of humanism is connected with advocating human values in parallel with promoting national interests and preserving the continuity of traditions in various spheres of life, business, and culture. This should be a common goal for the entire society, the main direction for the national intelligentsia, and a platform for state policy. An idea worth considering is the formation of a national humanist tradition model; the challenges are adapting people to life under new industrial and innovative economic conditions, keeping the balance during the country's intensive economic development, and ensuring social harmony.

Development of a Pipeline Monitoring System by Bio-mimetic Robots

Pipeline exploration is one of many bio-mimetic robot applications. The robot may work in common buildings, for instance between ceilings and ducts, in addition to the complicated and massive pipeline systems of large industrial plants. The bio-mimetic robot finds any troubled area or malfunction and then reports its data. Importantly, it can not only prepare for but also react to abnormal routes in the pipeline. Pipeline monitoring tasks require special types of mobile robots. For effective movement along a pipeline, the robot's motion should be similar to that of insects or crawling animals. While moving along the pipelines, a pipeline monitoring robot has the important task of identifying the shape of the approaching path on the pipes. In this paper we propose an effective solution to this pipeline pattern recognition problem, based on fuzzy classification rules applied to measured IR distance data.
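Fuzzy classification of IR distance readings, as used above, can be sketched with triangular membership functions. The class names, distance ranges, and rule (take the class with maximal membership) are illustrative assumptions for a single sensor reading, not the paper's actual rule base.

```python
def tri(x, a, b, c):
    # Triangular membership function peaking at b, with support (a, c).
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(distance_cm):
    # Fuzzy memberships for hypothetical path shapes ahead of the robot;
    # the distance ranges below are invented for illustration.
    memberships = {
        "straight": tri(distance_cm, 20, 30, 40),  # pipe continues ahead
        "bend":     tri(distance_cm, 8, 15, 22),   # wall approaching: curve
        "blocked":  tri(distance_cm, 0, 4, 10),    # obstacle or closed end
    }
    # Defuzzify by picking the class with the highest membership degree.
    return max(memberships, key=memberships.get)

label = classify(14.0)
```

A real system would fuse several IR beams and smooth the readings over time before applying such rules.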

Integrated Learning in Engineering Services: A Conceptual Framework

This study explores how the mechanics of learning pave the way to engineering innovation. Theories related to learning in new product/service innovation are reviewed from organizational, behavioral, and engineering perspectives. From this, an engineering team's external interactions for knowledge brokering and internal composition for skill balance are examined from learning and innovation viewpoints. As a result, an integrated learning model is developed by reconciling the theoretical perspectives and developing propositions that emphasize the centrality of learning, and its drivers, in engineering product/service development. The paper also provides a review and partial validation of the propositions using the results of a previously published field study in the aerospace industry.

Theoretical Background of Dividend Taxation

The article deals with dividends and their distribution to investors from a theoretical point of view. Some studies analyze the market's reaction to dividend announcements and find that a change in dividend policy is associated with abnormal returns around the announcement date. Other research directly questions investors about their dividend preferences and beliefs. Investors want dividends for many reasons: for example, some explain the dividend preference by the existence of transaction costs; some investors prefer dividends today because they are less risky; and managers have private information about the firm. The most controversial theory of dividend policy was developed by Modigliani and Miller (1961), who demonstrated that in perfect and complete capital markets dividend policy is irrelevant and the value of the company is independent of its payout policy. Nevertheless, real-world capital markets are imperfect because of asymmetric information, transaction costs, incomplete contracting possibilities, and taxes.

New Features for Specific JPEG Steganalysis

We present in this paper a new approach to specific JPEG steganalysis and propose studying the statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both while also controlling the alteration of the compressed data. We have noticed a deviation of the entropy of the compressed data after a first embedding. This deviation is greater when the image is a cover medium than when the image is a stego image. To observe this deviation, we introduce new statistical features and combine them with the Multiple Embedding Method. This approach is motivated by the Avalanche Criterion of the JPEG lossless compression step. This criterion makes it possible to design detectors whose detection rates are independent of the payload. Finally, we designed a Fisher discriminant based classifier for the well-known steganographic algorithms Outguess, F5, and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work with low embedding rates (< 10^-5) and, according to the avalanche criterion of the RLE and Huffman compression steps, its efficiency is independent of the quantity of hidden information.
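The entropy statistic whose deviation is exploited above can be computed as Shannon entropy over a compressed byte stream. The sketch below uses `zlib` merely as a stand-in for "compressed data" (the paper works on the JPEG RLE/Huffman stream); it is not the authors' feature set.

```python
import math
import zlib
from collections import Counter

def byte_entropy(data: bytes) -> float:
    # Shannon entropy, in bits per byte, of the byte-value distribution.
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy illustration: measure the entropy of a compressed stream. In the
# steganalysis setting one would compare this value before and after a
# test embedding and look at how strongly it deviates.
plain = b"steganalysis " * 200
compressed = zlib.compress(plain)
h = byte_entropy(compressed)
```

Entropy is bounded by 8 bits per byte; a well-compressed stream sits near that bound, which is why small perturbations from an embedding become measurable.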

Finite Element Investigation of Transmission Conditions for Non-Monotonic Temperature Interphases

Imperfect transmission conditions modeling a thin reactive 2D interphase layer between two dissimilar bonded strips have been derived. In this paper, the soundness of these transmission conditions for heat conduction problems is examined by the finite element method for a strongly temperature-dependent source or sink and non-monotonic temperature distributions around the interphase faces.

Egyptian Electronic Government: The University Enrolment Case Study

E-government projects have the potential for greater efficiency and effectiveness of government operations. For this reason, many developing countries' governments have invested heavily in this agenda, and an increasing number of e-government projects are being implemented. However, there is a lack of clear case material describing the potentialities and consequences experienced by organizations trying to manage this change. The Ministry of State for Administrative Development (MSAD) has been the organization responsible for the e-Government program in Egypt since early 2004. This paper presents a case study of the process of admission to public universities and institutions in Egypt, which is led by MSAD. Underlining the key benefits resulting from the initiative, explaining the strategies and development steps used to implement it, and highlighting the main obstacles encountered and how they were overcome will help repeat the experience in other useful e-government projects.

A Hybrid Tabu Search Algorithm for the Cell Formation Problem and Its Variants

Cell formation is the first step in the design of cellular manufacturing systems. In this study, a general-purpose computational scheme employing a hybrid tabu search algorithm as its core is proposed to solve the cell formation problem and its variants. In the proposed scheme, great flexibility is left to the users. The core solution-searching algorithm embedded in the scheme can easily be changed to any other meta-heuristic algorithm, such as simulated annealing or a genetic algorithm, based on the characteristics of the problems to be solved or the preferences the users might have. In addition, several counters are designed to interactively control the timing of intensified and diversified solution-searching strategies.
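The core loop of a tabu search with a counter-driven diversification step, as described above, can be sketched generically. The toy objective (minimizing the number of 1-bits), the tenure, and the diversification trigger are placeholder assumptions, not the paper's cell-formation scheme.

```python
import random

def tabu_search(cost, neighbors, start, iters=200, tenure=7, seed=1):
    # Generic tabu-search core: move to the best non-tabu neighbor, keep a
    # short-term tabu list, and diversify when the best solution stagnates.
    rng = random.Random(seed)
    current, best = start, start
    tabu = []
    stagnation = 0                    # counter driving diversification
    for _ in range(iters):
        candidates = [s for s in neighbors(current) if s not in tabu]
        if not candidates:
            break
        current = min(candidates, key=cost)
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)
        if cost(current) < cost(best):
            best, stagnation = current, 0
        else:
            stagnation += 1
        if stagnation >= 20:          # diversify with a random jump
            current = tuple(rng.randint(0, 1) for _ in range(len(start)))
            stagnation = 0
    return best

def cost(s):
    # Toy objective: number of 1-bits in a binary vector.
    return sum(s)

def neighbors(s):
    # All single-bit flips of s.
    return [s[:i] + (1 - s[i],) + s[i + 1:] for i in range(len(s))]

best = tabu_search(cost, neighbors, start=(1, 1, 0, 1, 1, 0, 1, 1))
```

Swapping `tabu_search` for simulated annealing or a genetic algorithm while keeping `cost` and `neighbors` fixed illustrates the plug-in flexibility the scheme claims.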

Landslide and Debris Flow Characteristics during Extreme Rainfall in Taiwan

As the global climate changes, the threat from landslides and debris flows increases. Learning how a watershed initiates landslides under abnormal rainfall conditions and predicting landslide magnitude and frequency distribution are thus important. Landslides show a power law in their frequency-area distribution. The distribution curve shows an exponent gradient of 1.0 in the Sandpile model test. Will the landslide frequency-area statistics show a distribution similar to the Sandpile model under extreme rainfall conditions? The purpose of this study is to identify the extreme rainfall-induced landslide frequency-area distribution in the Laonong River Basin in southern Taiwan. Results of the analysis show that a lower gradient of the landslide frequency-area distribution can be attributed to the transportation and deposition areas of debris flows that are included in the landslide area.
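Estimating the "exponent gradient" of a frequency-area distribution amounts to fitting a slope on a log-log plot. The sketch below uses synthetic data that obeys f(A) ~ A^-1 exactly (the Sandpile-model gradient of 1.0 mentioned above); it is not the Laonong River Basin inventory.

```python
import math

def loglog_slope(areas, freqs):
    # Least-squares slope of log10(freq) against log10(area): the
    # power-law exponent of a frequency-area distribution.
    xs = [math.log10(a) for a in areas]
    ys = [math.log10(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic inventory with f(A) ~ A^-1 (gradient 1.0).
areas = [10.0 ** k for k in range(1, 6)]   # landslide areas, e.g. in m^2
freqs = [1000.0 / a for a in areas]        # frequency density
slope = loglog_slope(areas, freqs)
```

A real inventory would show scatter and a roll-over at small areas, so robust fitting over the power-law tail is needed before comparing gradients across rainfall events.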

Realization of an Electronically Tunable Current-mode First-order Allpass Filter and Its Application

This article presents a resistorless current-mode first-order allpass filter based on second-generation current controlled current conveyors (CCCIIs). The features of the circuit are that the pole frequency can be electronically controlled via the input bias current, and that the circuit description is very simple, consisting of 2 CCCIIs and a single grounded capacitor, without any external resistors or component-matching requirements. Consequently, the proposed circuit is very appropriate for further development into an integrated circuit. The low input and high output impedances of the proposed configuration enable the circuit to be cascaded in current mode without additional current buffers. PSpice simulation results are presented and agree well with the theoretical anticipation. An application example as a current-mode quadrature oscillator is included.

Power Efficient OFDM Signals with Reduced Symbols' Aperiodic Autocorrelation

Three new algorithms, based on minimizing the autocorrelation of transmitted symbols and on the SLM approach, are proposed; they are computationally less demanding. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which results in a reduction of the PAPR. The second algorithm generates multiple random sequences, each with the same autocorrelation value of 1, from the sequence generated by the first algorithm. Of these, the sequence with the minimum PAPR is transmitted. The third algorithm is an extension of the second and requires minimal side information to be transmitted. Multiple sequences are generated by modifying a fixed number of complex values in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving the minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that a significant reduction in PAPR is achieved using the proposed algorithms.
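The PAPR metric and the SLM-style "generate candidates, transmit the best" step underlying the algorithms above can be sketched as follows. The naive IDFT, the +/-1 phase sequences, and the 16-subcarrier toy size are illustrative assumptions, not the three proposed algorithms themselves.

```python
import cmath
import math
import random

def papr_db(symbols):
    # PAPR (in dB) of the time-domain OFDM signal obtained via a naive IDFT.
    n = len(symbols)
    time = [sum(symbols[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]
    powers = [abs(x) ** 2 for x in time]
    return 10 * math.log10(max(powers) / (sum(powers) / n))

def slm_select(symbols, n_candidates=8, seed=3):
    # SLM: apply random +/-1 phase sequences to the subcarriers and keep
    # the candidate with the lowest PAPR.
    rng = random.Random(seed)
    best = symbols
    for _ in range(n_candidates):
        phases = [rng.choice((1, -1)) for _ in symbols]
        cand = [s * p for s, p in zip(symbols, phases)]
        if papr_db(cand) < papr_db(best):
            best = cand
    return best

rng = random.Random(0)
data = [complex(rng.choice((-1, 1)), rng.choice((-1, 1)))  # QPSK-like symbols
        for _ in range(16)]
chosen = slm_select(data)
```

The receiver needs the index of the chosen phase sequence as side information, which is exactly the overhead the third algorithm above tries to minimize.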

A Feature-based Invariant Watermarking Scheme Using Zernike Moments

In this paper, a novel feature-based image watermarking scheme is proposed. Zernike moments, which have invariance properties, are adopted in the scheme. In the proposed scheme, feature points are first extracted from the host image and several circular patches centered on these points are generated. The patches are used as carriers of watermark information because they can be regenerated to locate watermark embedding positions even when the watermarked image is severely distorted. The Zernike transform is then applied to the patches to calculate local Zernike moments. Dither modulation is adopted to quantize the magnitudes of the Zernike moments, followed by a false-alarm analysis. Experimental results show that the quality degradation of the watermarked image is visually imperceptible and that the proposed scheme is very robust against image processing operations and geometric attacks.
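The dither modulation step above quantizes a moment magnitude onto one of two interleaved lattices, one per watermark bit. The quantization step size and the scalar setting are illustrative assumptions; the paper applies this to local Zernike moment magnitudes.

```python
def dm_embed(magnitude, bit, step=2.0):
    # Dither modulation: snap the value to the lattice offset by 0 (bit 0)
    # or by step/2 (bit 1).
    dither = 0.0 if bit == 0 else step / 2.0
    return round((magnitude - dither) / step) * step + dither

def dm_extract(magnitude, step=2.0):
    # Decode by choosing the lattice whose nearest point is closer.
    d0 = abs(magnitude - dm_embed(magnitude, 0, step))
    d1 = abs(magnitude - dm_embed(magnitude, 1, step))
    return 0 if d0 <= d1 else 1

m = 7.3                         # e.g. a local Zernike moment magnitude
watermarked = dm_embed(m, 1)    # embed bit 1
```

The minimum-distance decoder tolerates perturbations up to step/4, which is what makes the quantized moments survive moderate distortion.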

Could a Thermal Oceanic Hotspot Increase Climate Change Activity in the North Tropical Atlantic? The Example of the 2005 Caribbean Coral Bleaching Hotspot and Hurricane Katrina Interaction

This paper reviews recent studies of the effects of climate change in the North Tropical Atlantic by examining the atmospheric conditions that prevailed in 2005: the Coral Bleaching HotSpot and Hurricane Katrina. To better understand and estimate the impact of the physical phenomenon, i.e., the Thermal Oceanic HotSpot (TOHS), isotopic studies of δ18O and δ13C on marine animals from Guadeloupe (a French Caribbean island) were carried out. The recorded measurements show Sea Surface Temperatures (SST) of up to 35°C in August, much higher than the 32°C recorded by NOAA satellites. After reviewing the process that led to the formation of Hurricane Katrina, which hit New Orleans on August 29, 2005, it is shown that the climatic conditions in the Caribbean from August to October 2005 influenced Katrina's evolution. This TOHS is the combined effect of various phenomena and represents an additional factor to consider when estimating future climate change.

Analysis and Simulation of Automotive Interleaved Buck Converter

This paper focuses on the modeling, analysis, and simulation of a 42V/14V dc/dc converter based architecture. This architecture is considered a technically viable solution for the automotive dual-voltage power system of passenger cars in the near future. An interleaved dc/dc converter system is chosen as the automotive converter topology for its advantages regarding filter reduction, dynamic response, and power management. Presented herein is a model of a one-kilowatt interleaved six-phase buck converter designed to operate in Discontinuous Conduction Mode (DCM). The control strategy of the converter is based on voltage-mode-controlled Pulse Width Modulation (PWM) with a Proportional-Integral-Derivative (PID) controller. The effectiveness of the interleaved step-down converter is verified through simulation results using the control-oriented simulator MatLab/Simulink.

Software Architecture and Support for Patient Tracking Systems in Critical Scenarios

In this work a new platform for mobile-health systems is presented. The system's target application is providing decision support to rescue corps or military medical personnel in combat areas. The software architecture relies on a distributed client-server system that manages a hierarchy of wireless ad-hoc networks in which several different types of clients operate. Each client is characterized by different hardware and software requirements. The lower hierarchy levels rely on a network of fully custom devices that store clinical information and patient status; these devices are designed to form an ad-hoc network operating in the 2.4 GHz ISM band and complying with the IEEE 802.15.4 standard (ZigBee). Medical personnel may interact with these devices, called MICs (Medical Information Carriers), by means of a PDA (Personal Digital Assistant) or an MDA (Medical Digital Assistant), and may transmit the information stored in their local databases as well as issue service requests to the upper hierarchy levels using the IEEE 802.11 a/b/g standard (WiFi). The server acts as a repository that stores both medical evacuation forms and associated events (e.g., a teleconsulting request). All the actors participating in the diagnostic or evacuation process may asynchronously access this repository and update its content or generate new events. The designed system aims to optimize and improve information spreading and flow among all the system components, with the goal of improving both diagnostic quality and the evacuation process.