Post-ERP Feral Systems and the Use of 'Feral Systems' as a Coping Mechanism

A number of studies have highlighted problems related to ERP systems, yet most of them focus on problems during the project and implementation stages rather than on the post-implementation use process. Problems encountered while using ERP hinder its effective exploitation, its extended and continued use, and its value to organisations. This paper investigates the different types of problems users (operational, supervisory and managerial) face in using ERP and how 'feral systems' are used as a coping mechanism. The paper adopts a qualitative method and uses data collected from two cases and 26 interviews to inductively develop a causal network model of ERP usage problems and their coping mechanisms. This model classifies post-ERP usage problems as data quality, system quality, interface and infrastructure problems. The model also categorises the different coping mechanisms enabled through the use of 'feral systems', including feral information systems, feral data and feral use of technology.

Enterprise Resource Planning (ERP) System in Higher Education: A Literature Review and Implications

ERP systems are among the largest software applications adopted by universities, and their implementation involves quite significant investments. However, unlike other applications, little research has been conducted on these systems in a university environment. This paper aims at providing a critical review of previous research on ERP systems in higher education, with a special focus on higher education in Australia. The research not only forms the basis of an evaluation of previous research and research needs, it also makes inroads into identifying the payoff of ERPs in the sector from different perspectives, with particular reference to the user. The paper is divided into two parts: the first focuses on the ERP literature in higher education at large, while the second focuses on the ERP literature in higher education in Australia.

Analysis of Permanence and Extinction of Enterprise Cluster Based On Ecology Theory

This paper is concerned with the permanence and extinction problem of an enterprise cluster constituted by m satellite enterprises and a dominant enterprise. We present a model involving impulsive effects based on ecology theory, which effectively describes the competition and cooperation within an enterprise cluster in a real economic environment. Applying the comparison theorem for impulsive differential equations, we establish sufficient conditions which ultimately determine the fate of the enterprises: permanence, extinction, and co-existence. Finally, we present numerical examples to explain the economic significance of the mathematical results.
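
The abstract does not state the governing equations; purely as a hedged illustration of the kind of impulsive, ecology-based system referred to, a Lotka-Volterra-type cluster model with m satellite enterprises x_i and a dominant enterprise y, perturbed impulsively at times t_k, might be written as

\dot{x}_i(t) = x_i(t)\big(r_i - a_i x_i(t) - b_i y(t)\big), \quad t \neq t_k,
\dot{y}(t) = y(t)\big(r - a\, y(t) + \sum_{i=1}^{m} c_i x_i(t)\big), \quad t \neq t_k,
x_i(t_k^+) = (1+\theta_{ik})\, x_i(t_k), \qquad y(t_k^+) = (1+\theta_k)\, y(t_k),

where the r's are intrinsic growth rates, a, a_i, b_i, c_i are assumed competition/cooperation coefficients, and \theta_{ik}, \theta_k are impulse magnitudes; sufficient conditions for permanence or extinction would then follow from bounding the solutions via the comparison theorem.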

Blending Processing of Industrial Residues: A Specific Case of an Enterprise Located in the Municipality of Belo Horizonte, MG, Brazil

Residues are produced at all stages of human activity; their composition and volume vary according to consumption practices and production methods. Significant harm to the environment is associated with the volume of generated material as well as with the improper disposal of solid wastes, whose negative effects are noticed more frequently in the long term. The solution to this problem constitutes a challenge to government, industry and society, because it involves economic, social and environmental factors and, especially, the awareness of the population in general. The main concerns are the impacts such waste can have on human health and on the environment (soil, water, air and landscape). Hazardous wastes, produced mainly by industry, are particularly worrisome because, when improperly managed, they become a serious threat to the environment. In view of this issue, this study aimed to evaluate the solid waste management system of an industrial waste co-processing company in order to propose improvements to the management of rejects generated in a specific step of the blending production process.

Underpricing of IPOs during Hot and Cold Market Periods on the South African Stock Exchange (JSE)

Underpricing is one anomaly in the initial public offering (IPO) literature that has been widely observed across different stock markets, with different trends emerging over different time periods. This study seeks to determine how IPOs on the JSE performed on the first day, first week and first month over the period 1996-2011. Underpricing trends are documented for both hot and cold market periods in terms of four main sectors (cyclical, defensive, growth stock and interest rate sensitive stocks). Using a sample of 360 companies listed on the JSE, the empirical findings establish that IPOs on the JSE are significantly underpriced, with an average market-adjusted first-day return of 62.9%. It is also established that hot market IPOs on the JSE are more underpriced than cold market IPOs. Also observed is the fact that as the offer price per share increases above the median price for any given period, the level of underpricing decreases substantially. While significant differences exist in the level of underpricing of IPOs in the four sectors during hot and cold market periods, interest rate sensitive stocks showed a different trend from the other sectors and thus require further investigation to explain this pattern.
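
For readers outside the IPO literature, the market-adjusted first-day return reported above is typically computed as the raw first-day return minus the market return over the same interval; a minimal sketch follows (variable names are illustrative, not taken from the paper):

def market_adjusted_return(offer_price, close_price, index_at_offer, index_at_close):
    """Market-adjusted initial return: raw IPO return minus the market (index) return
    over the same interval -- a standard underpricing measure."""
    raw_return = close_price / offer_price - 1.0
    market_return = index_at_close / index_at_offer - 1.0
    return raw_return - market_return

# Example: a share offered at 10.00 closing its first day at 16.50
# while the market index moves from 1000 to 1010.
print(market_adjusted_return(10.00, 16.50, 1000.0, 1010.0))  # ~0.64, i.e. 64% underpricing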

Core Issues Affecting Software Architecture in Enterprise Projects

In this paper we analyze the core issues affecting software architecture in enterprise projects, where a large number of people from different backgrounds are involved and complex business, management and technical problems exist. We first give the general features of typical enterprise projects and then present the foundations of software architecture. A detailed analysis of the core issues affecting software architecture in each software development phase is given. We focus on three main areas in each development phase: people, process and management related issues; structural (product) issues; and technology related issues. After pointing out the core issues and problems in these areas, we give recommendations for designing good architectures. We have observed these core issues, and the importance of following best software development practices, in many large enterprise commercial and military projects over about 10 years of experience, and have also developed some novel practices.

LabVIEW with Fuzzy Logic Controller Simulation Panel for Condition Monitoring of Oil and Dry Type Transformer

Condition monitoring of electrical power equipment has attracted considerable attention for many years. The aim of this paper is to use LabVIEW with a Fuzzy Logic controller to build a simulation system that diagnoses transformer faults and monitors transformer condition. The front panel of the system was designed in LabVIEW to enable the computer to act as a custom-designed instrument. The dissolved gas-in-oil analysis (DGA) method was used as the diagnosis technique for the oil type transformer, while terminal voltage and current analysis was used for the dry type transformer. Fuzzy Logic was used as the expert system that assesses all information keyed in at the front panel to diagnose and predict the condition of the transformer. The outcome of the Fuzzy Logic interpretation is displayed on the LabVIEW front panel to show the user the condition of the transformer at any time.
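
The actual rule base, gas ratios and thresholds used in the LabVIEW system are not given in the abstract; as a purely hypothetical illustration of the kind of fuzzy evaluation the expert system performs on a single DGA gas ratio, one might write:

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def diagnose(c2h2_to_c2h4):
    """Toy rule base on a single DGA ratio; memberships and thresholds are illustrative only."""
    low    = tri(c2h2_to_c2h4, -0.1, 0.0, 0.5)
    medium = tri(c2h2_to_c2h4,  0.1, 1.0, 3.0)
    high   = tri(c2h2_to_c2h4,  1.0, 3.0, 6.0)
    # Each rule strength maps to a candidate transformer condition.
    return {"normal": low, "thermal fault suspected": medium, "arcing suspected": high}

print(diagnose(1.2))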

Environmental Management System According to ISO 14001 as a Source of Eco-Innovations in Enterprises - A Case of Podkarpackie Voivodeship

This paper presents the results of empirical studies conducted in enterprises from Podkarpackie Voivodeship (Poland). It shows the experiences of those enterprises resulting from implementing and improving eco-innovation management, namely a formal Environmental Management System (EMS). The study shows the expected and obtained internal benefits which are the effects of a functioning EMS. The aim of this paper is to determine whether the information included in international theoretical studies concerning the benefits of implementing, operating and improving a formal EMS (based on the international standard ISO 14001) is confirmed by the effects of the enterprises' activities.

Mobile Phone as a Tool for Data Collection in Field Research

The need for accurate and timely field data is shared among organizations engaged in fundamentally different activities, public services or commercial operations. Basically, there are three major components in the qualitative research process: data collection, interpretation and organization of data, and the analytic process. Significant technological advancements have been made in mobile devices (mobile phones, PDAs, tablets, laptops, etc.), resources that can potentially be applied to the data collection activity of field research in order to improve this process. This paper presents and discusses the main features of a mobile phone based solution for field data collection, composed of three modules: a survey editor, a server web application and a client mobile application. The data gathering process begins with the survey creation module, which enables the production of tailored questionnaires. The field workforce receives the questionnaire(s) on their mobile phones, collects the interview responses and sends them back to a server for immediate analysis.

An Algorithm Proposed for FIR Filter Coefficients Representation

Finite impulse response (FIR) filters have the advantages of linear phase, guaranteed stability, fewer finite precision errors, and efficient implementation. In contrast, they have the major disadvantage of requiring a higher order (more coefficients) than an IIR counterpart with comparable performance. The high order demand imposes more hardware requirements, arithmetic operations, area usage, and power consumption when designing and fabricating the filter. Therefore, minimizing or reducing these parameters is a major goal in the digital filter design task. This paper presents an algorithm for modifying the values and the number of non-zero coefficients used to represent the FIR digital pulse shaping filter response. With this algorithm, the FIR filter frequency and phase response can be represented with a minimum number of non-zero coefficients, thereby reducing the arithmetic complexity needed to compute the filter output. Consequently, the system characteristics, i.e. power consumption, area usage, and processing time, are also reduced. The proposed algorithm is most powerful when integrated with multiplierless techniques such as distributed arithmetic (DA) in designing high order digital FIR filters. Here DA eliminates the need for multipliers when implementing the multiply and accumulate (MAC) unit, and the proposed algorithm reduces the number of adders and addition operations needed by minimizing the number of non-zero coefficients used to compute the filter output.
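
The abstract does not specify the modification rule; a minimal sketch of the general idea, zeroing small-magnitude taps of a prototype pulse-shaping FIR filter and checking the effect on its frequency response (the 1% threshold is an assumption), could look like:

import numpy as np
from scipy.signal import firwin, freqz

h = firwin(numtaps=101, cutoff=0.25)          # example low-pass / pulse-shaping FIR prototype
threshold = 0.01 * np.max(np.abs(h))          # assumed pruning rule: drop taps below 1% of the peak tap
h_pruned = np.where(np.abs(h) >= threshold, h, 0.0)

print("non-zero taps:", np.count_nonzero(h), "->", np.count_nonzero(h_pruned))
w, H = freqz(h)
_, Hp = freqz(h_pruned)
print("max magnitude deviation (dB):",
      np.max(np.abs(20*np.log10(np.abs(Hp) + 1e-12) - 20*np.log10(np.abs(H) + 1e-12))))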

Ranking Genes from DNA Microarray Data of Cervical Cancer by a Local Tree Comparison

The major objective of this paper is to introduce a new method to select genes from DNA microarray data. As the selection criterion, we suggest measuring the local changes in the correlation graph of each gene and selecting those genes whose local changes are largest. More precisely, we calculate correlation networks from DNA microarray data of cervical cancer, where each network represents a tissue of a certain tumor stage and each node in the network represents a gene. From these networks we extract one tree for each gene by a local decomposition of the correlation network. The interpretation of a tree is that it represents the n-nearest neighbor genes on the n-th level of the tree, measured by the Dijkstra distance, and hence gives the local embedding of a gene within the correlation network. For the obtained trees we measure the pairwise similarity between trees rooted at the same gene from normal to cancerous tissues. This evaluates the modification of the tree topology due to tumor progression. Finally, we rank the obtained similarity values from all tissue comparisons and select the top ranked genes. For these genes, the local neighborhood in the correlation networks changes most between normal and cancerous tissues. As a result we find that the top ranked genes are candidates suspected to be involved in tumor growth. This indicates that our method captures essential information from the underlying DNA microarray data of cervical cancer.
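
A condensed sketch of this pipeline, assuming a simple correlation threshold for edge creation and a Jaccard measure for comparing the local neighborhoods (neither is fixed by the abstract), might look like the following, using networkx:

import numpy as np
import networkx as nx

def correlation_network(expr, threshold=0.7):
    """expr: genes x samples matrix for one tissue/stage. Edges join strongly correlated genes."""
    corr = np.corrcoef(expr)
    g = nx.Graph()
    g.add_nodes_from(range(expr.shape[0]))
    for i in range(expr.shape[0]):
        for j in range(i + 1, expr.shape[0]):
            if abs(corr[i, j]) >= threshold:
                g.add_edge(i, j, weight=1.0 - abs(corr[i, j]))
    return g

def local_tree(g, gene, depth=3):
    """Shortest-path (Dijkstra) neighborhood of a gene, i.e. its local embedding."""
    dist = nx.single_source_dijkstra_path_length(g, gene, cutoff=depth)
    return set(dist) - {gene}

def neighborhood_change(g_normal, g_tumor, gene, depth=3):
    """Jaccard distance between the gene's local neighborhoods in two networks (assumed similarity measure)."""
    a, b = local_tree(g_normal, gene, depth), local_tree(g_tumor, gene, depth)
    return 1.0 - len(a & b) / max(len(a | b), 1)

# Genes are then ranked by neighborhood_change; the top-ranked genes are the candidates.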

Enhancing K-Means Algorithm with Initial Cluster Centers Derived from Data Partitioning along the Data Axis with the Highest Variance

In this paper, we propose an algorithm to compute initial cluster centers for K-means clustering. Data in a cell are partitioned using a cutting plane that divides the cell into two smaller cells. The plane is perpendicular to the data axis with the highest variance and is designed to reduce the sum of squared errors of the two cells as much as possible, while at the same time keeping the two cells as far apart as possible. Cells are partitioned one at a time until the number of cells equals the predefined number of clusters, K. The centers of the K cells become the initial cluster centers for K-means. The experimental results suggest that the proposed algorithm is effective and converges to better clustering results than those of the random initialization method. The research also indicates that the proposed algorithm greatly improves the likelihood of every cluster containing some data.
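
A compact sketch of the partitioning idea follows; the exact cut point that balances the within-cell squared error against cell separation is simplified here to a median cut along the highest-variance axis, so treat it as an approximation of the proposed criterion:

import numpy as np

def initial_centers(X, k):
    """Repeatedly split the cell with the largest total squared deviation along its
    highest-variance axis until k cells exist; the cell means become the K-means seeds."""
    cells = [X]
    while len(cells) < k:
        idx = int(np.argmax([np.var(c, axis=0).sum() * len(c) for c in cells]))
        cell = cells.pop(idx)
        axis = int(np.argmax(np.var(cell, axis=0)))      # data axis with the highest variance
        cut = np.median(cell[:, axis])                   # simplified cut point (the paper optimizes SSE vs. separation)
        left, right = cell[cell[:, axis] <= cut], cell[cell[:, axis] > cut]
        if len(left) == 0 or len(right) == 0:            # degenerate cut: fall back to an even split
            order = np.argsort(cell[:, axis])
            left, right = cell[order[:len(cell) // 2]], cell[order[len(cell) // 2:]]
        cells.extend([left, right])
    return np.array([c.mean(axis=0) for c in cells])

# The returned centers can then seed K-means, e.g. sklearn's KMeans(n_clusters=k, init=centers, n_init=1).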

A Web Services based Architecture for NGN Services Delivery

The notion of the Next Generation Network (NGN) is based on the network convergence concept, which refers to the integration of services (such as IT and communication services) over the IP layer. As the most popular implementation of Service Oriented Architecture (SOA), Web Services technology is known to be the basis for service integration. In this paper, we present a platform to deliver communication services as web services. We also implement a sample service to show the simplicity of composing web and communication services using this platform. A Service Logic Execution Environment (SLEE) is used to implement the communication services. The proposed architecture is in agreement with SOA and can also be integrated with an Enterprise Service Bus to form the basis of an NGN Service Delivery Platform (SDP).

The Safety of WiMAX in Solid Propellant Rocket Production

With the advances in wireless networking, IEEE 802.16 WiMAX technology has been widely deployed for several applications such as "last mile" broadband service, cellular backhaul, and high-speed enterprise connectivity. As a result, the military has employed WiMAX for many years as a high-speed wireless data-link because of its point-to-multipoint and non-line-of-sight (NLOS) capability. However, the risk of using WiMAX is a critical factor in some sensitive areas of military applications, especially in ammunition manufacturing such as solid propellant rocket production. US DoD policy states that the following certification requirements must be met for WiMAX: electromagnetic environmental effects (E3) and Hazards of Electromagnetic Radiation to Ordnance (HERO). This paper discusses the recommended power densities and safe separation distance (SSD) for HERO for WiMAX systems deployed in solid propellant rocket production. This research found that WiMAX is safe to operate at close proximity to the rocket production facility, based on the AF Guidance Memorandum immediately changing AFMAN 91-201.

Fractal Dimension of Breast Cancer Cell Migration in a Wound Healing Assay

Migration in a breast cancer cell wound healing assay has been studied using image fractal dimension analysis. The migration of MDA-MB-231 cells (highly motile) in a wound healing assay was captured using time-lapse phase contrast video microscopy and compared to MDA-MB-468 cell migration (moderately motile). The Higuchi fractal method was used to compute the fractal dimension of the image intensity fluctuation along a single-pixel-width region parallel to the wound. The near-wound region fractal dimension was found to decrease three times faster initially in the MDA-MB-231 cells than in the less cancerous MDA-MB-468 cells. The inner region fractal dimension was found to be fairly constant over time for both cell types, suggesting a wound influence range of about 15 cell layers. The box-counting fractal dimension method was also used to study regions of interest (ROI). The MDA-MB-468 ROI area fractal dimension was found to decrease continuously up to 7 hours. The MDA-MB-231 ROI area fractal dimension was found to increase, consistent with the behavior of an HGF-treated MDA-MB-231 wound healing assay posted in the public domain. A fractal dimension based capacity index has been formulated to quantify the invasiveness of the MDA-MB-231 cells in the perpendicular-to-wound direction. Our results suggest that image intensity fluctuation fractal dimension analysis can be used as a tool to quantify cell migration in terms of cancer severity and treatment response.
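
For reference, the Higuchi method applied to a one-dimensional intensity profile (such as the single-pixel-width region described above) can be implemented as below; the choice of kmax is an assumption:

import numpy as np

def higuchi_fd(x, kmax=10):
    """Higuchi fractal dimension of a 1-D signal (e.g. image intensity along a pixel row)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # Curve length of the sub-series starting at m with step k, normalized as in Higuchi (1988).
            lengths.append(np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / (len(idx) - 1) / (k * k))
        lk.append(np.mean(lengths))
    k_vals = np.arange(1, kmax + 1)
    # The fractal dimension is the slope of log L(k) against log(1/k).
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(lk), 1)
    return slope

print(higuchi_fd(np.random.rand(1024)))   # white noise should give a value close to 2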

CT Reconstruction from a Limited Number of X-Ray Projections

X-ray computed tomography (CT) is a well established visualization technique in medicine and nondestructive testing. However, since CT scanning requires sampling of radiographic projections from different viewing angles, common CT systems with mechanically moving parts are too slow for dynamic imaging, for instance of multiphase flows or live animals. A large number of X-ray projections are needed to reconstruct CT images, so the collection and processing of the projection data consume too much time and are harmful to the patient. To address this problem, in this study we propose a method for tomographic reconstruction of a sample from a limited number of X-ray projections using linear interpolation. In simulation, we present a reconstruction from an experimental X-ray CT scan of an aluminum phantom that follows two steps: the X-ray projections are first interpolated using the linear interpolation method and then used for CT reconstruction based on the Ordered Subsets Expectation Maximization (OSEM) method.
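
A minimal sketch of the first step, densifying the measured sinogram by linear interpolation over the projection angle before passing it to an OSEM reconstructor (array shapes and names are illustrative):

import numpy as np

def interpolate_projections(sinogram, measured_angles, target_angles):
    """sinogram: (num_measured_angles, num_detector_bins) projections acquired at measured_angles (degrees).
    Returns projections linearly interpolated onto the denser target_angles grid, detector bin by bin."""
    dense = np.empty((len(target_angles), sinogram.shape[1]))
    for bin_idx in range(sinogram.shape[1]):
        dense[:, bin_idx] = np.interp(target_angles, measured_angles, sinogram[:, bin_idx])
    return dense

# e.g. 18 measured projections expanded to a 180-angle sinogram, which is then reconstructed with OSEM.
measured = np.linspace(0, 170, 18)
target = np.arange(0, 180, 1.0)
dense_sino = interpolate_projections(np.random.rand(18, 256), measured, target)
print(dense_sino.shape)   # (180, 256)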

Adaptive Path Planning for Mobile Robot Obstacle Avoidance

Generally speaking, a mobile robot is capable of sensing its surrounding environment, interpreting the sensed information to obtain knowledge of its location and the environment, and planning a real-time trajectory to reach its goal. In this process, obstacle avoidance is a fundamental challenge. Thus, an adaptive path-planning control scheme that requires no detailed environmental information, large memory size or heavy computation burden is designed in this study for the obstacle avoidance of a mobile robot. In this scheme, the robot gradually approaches its goal by switching among a motion tracking mode, an obstacle avoidance mode, a self-rotation mode, and robot state selection. The effectiveness of the proposed adaptive path-planning control scheme is verified by numerical simulations of a differential-drive mobile robot under various possible obstacle shapes.
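
The abstract does not detail the switching logic; purely as a hypothetical skeleton of the state selection among the modes listed above, with assumed thresholds:

def select_mode(distance_to_goal, obstacle_ahead, heading_error, goal_tolerance=0.05):
    """Hypothetical state selection for a mode-switching scheme; thresholds are assumptions."""
    if distance_to_goal < goal_tolerance:
        return "stop"
    if obstacle_ahead:
        return "obstacle_avoidance"
    if abs(heading_error) > 0.5:          # radians; large heading error -> rotate in place first
        return "self_rotation"
    return "motion_tracking"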

A “Greedy” Czech Manufacturing Case

The article describes a case study of one of the Czech Republic's medium-sized manufacturing enterprises (ME), where, due to the European financial crisis, production lines had to be redesigned and optimized in order to minimize the total cost of producing the goods. We consider an optimization problem of minimizing the total cost of the workload, according to the costs of the possible locations of the workplaces, with an application of the greedy algorithm and a partial analogy to the Set Packing Problem. The displacement of working tables in the company should be a one-to-one monotone increasing function in order for the total cost of producing the goods to be at a minimum. We use a heuristic approach with the greedy algorithm for solving this linear optimization problem, regardless of the possible greediness which may appear, and we apply it in a Czech ME.
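
A toy sketch of the greedy placement idea, pairing workplaces with candidate table locations in ascending cost order under a one-to-one constraint (the real cost data and the monotonicity requirement of the case are not reproduced here):

def greedy_assignment(cost):
    """cost[w][l]: cost of placing workplace w at location l. Greedily pick the cheapest
    remaining (workplace, location) pair until every workplace is placed one-to-one."""
    pairs = sorted(
        ((c, w, l) for w, row in enumerate(cost) for l, c in enumerate(row)),
        key=lambda t: t[0],
    )
    placed, used, total = {}, set(), 0.0
    for c, w, l in pairs:
        if w not in placed and l not in used:
            placed[w] = l
            used.add(l)
            total += c
    return placed, total

# Example with 3 workplaces and 4 candidate locations (illustrative costs).
print(greedy_assignment([[4, 2, 7, 3], [3, 6, 1, 5], [8, 2, 4, 6]]))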

Value of Sharing: Viral Advertisement

This study examines consumers' motivations for sharing viral advertisements and the impact of these advertisements on brand perception. Three fundamental questions are answered: individuals' motivations for watching and sharing advertisements, the criteria for liking a viral advertisement, and the impact of individuals' attitudes toward a viral advertisement on brand perception. The study is carried out via a viral advertisement that ran in Turkey. The data are collected by the survey method, and the sample consists of individuals who experienced the sample advertisement. Data are collected through an online survey and analyzed using the SPSS statistical package. Recently, the traditional advertising mindset has been changing, and new advertising approaches with significant impacts on consumers have been discussed. Viral advertising is a modern advertising approach that offers brands significant advantages beyond traditional advertising channels such as television, radio and magazines. Viral advertising, also known as electronic word-of-mouth (eWOM), consists of the free spread of convincing messages originated by brands through interpersonal communication. Compared to traditional advertising, a more provocative thematic approach is adopted. The foundation of this approach is to create advertisements that consumers consider worth sharing with others. From this perspective, it can be said that viral advertising is, in a manner of speaking, media engineering. Content worth sharing turns people into volunteer spokespeople for a brand and strengthens the emotional bonds between brand and consumer. Especially for some sectors in countries with limitations on traditional advertising channels, viral advertising creates vital advantages.

Fuzzy Rules Generation and Extraction from Support Vector Machine Based on Kernel Function Firing Signals

Our study proposes an alternative method of building a Fuzzy Rule-Based system (FRB) from a Support Vector Machine (SVM). The first set of fuzzy IF-THEN rules is obtained through an equivalence between the SVM decision network and the zero-order Sugeno FRB type of the Adaptive Network Fuzzy Inference System (ANFIS). The second set of rules is generated by combining the first set based on the strength of the firing signals of support vectors under a Gaussian kernel. The final set of rules is then obtained from the second set through input scatter partitioning. A distinctive advantage of our method is the guarantee that the number of final fuzzy IF-THEN rules is no more than the number of support vectors in the trained SVM. The final FRB system obtained is capable of performing classification with results comparable to its SVM counterpart, but it has an advantage over the black-box SVM in that it may reveal human comprehensible patterns.
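
A small sketch of the first step, reading the support vectors of a trained RBF-kernel SVM as zero-order Sugeno rules (one rule per support vector, a Gaussian membership centered at the support vector as the antecedent, and the signed dual coefficient as the constant consequent); the scikit-learn names below are illustrative, since the paper does not specify an implementation:

import numpy as np
from sklearn.svm import SVC

gamma = 2.0
X = np.random.rand(100, 2)
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)
svm = SVC(kernel="rbf", gamma=gamma).fit(X, y)

def sugeno_output(x):
    """Zero-order Sugeno FRB read off the trained SVM: one rule per support vector,
    Gaussian firing strength as antecedent, signed dual coefficient as consequent."""
    firing = np.exp(-gamma * np.sum((svm.support_vectors_ - x) ** 2, axis=1))
    return float(np.dot(svm.dual_coef_[0], firing) + svm.intercept_[0])

x_test = np.array([0.7, 0.6])
print(sugeno_output(x_test), svm.decision_function([x_test])[0])   # the two outputs coincide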