Antioxidant and Antimicrobial Properties of Peptides as Bioactive Components in Beef Burger

Dried soy protein hydrolysate powder was added to beef burger to enhance oxidative stability and reduce microbial spoilage. The soybean bioactive compounds (soy protein hydrolysate), acting as antioxidant and antimicrobial agents, were added at levels of 1, 2 and 3%. Chemical analysis and physical properties were affected by the protein hydrolysate addition. TBA values were significantly affected (P < 0.05) by the storage period and the level of soy protein hydrolysate. All the tested soybean protein hydrolysate additives showed strong antioxidant properties, and samples containing hydrolysate showed the lowest (P < 0.05) TBA values at each storage time. The counts of all determined microbiological indicators were significantly (P < 0.05) affected by the addition of the soybean protein hydrolysate. Decreasing trends of differing extent were also observed in the treated samples for total viable counts, coliforms, Staphylococcus aureus, and yeasts and molds. The storage period also significantly (P < 0.05) affected microbial counts in all samples. Staphylococcus aureus was the most sensitive microbe, followed by the coliform group, in samples containing protein hydrolysate, while yeast and mold counts showed a decreasing but non-significant (P > 0.05) trend until the end of the storage period compared with the control sample. Sensory evaluation was also performed; the added protein hydrolysate imparted a beany flavor, which was noticeable in the samples with 3% protein hydrolysate.

Simulation of PM10 Source Apportionment at An Urban Site in Southern Taiwan by a Gaussian Trajectory Model

This study applied the Gaussian trajectory transfer-coefficient model (GTx) to simulate particulate matter concentrations and source apportionments at the Nanzih Air Quality Monitoring Station in southern Taiwan from November 2007 to February 2008. The correlation coefficient between the observed and calculated daily PM10 concentrations is 0.5 and the absolute bias of the PM10 concentrations is 24%; the simulated PM10 concentrations matched the observed data well. Although the PM10 emission rate was dominated by area sources (58%), the source apportionment results indicated that the primary contributors to PM10 at Nanzih Station were point sources (42%), followed by area sources (20%) and the upwind boundary concentration (14%). The clearest difference in PM10 source apportionment between episode and non-episode days was the upwind boundary concentration, which contributed 20% and 11% of PM10, respectively. Gas-to-particle conversion of secondary aerosol and long-range transport played crucial roles in the PM10 contribution at the receptor.
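As background, the transfer coefficients in a Gaussian model derive from the standard Gaussian plume relation. The sketch below shows that textbook formula with simple linear dispersion-coefficient growth; it is not the GTx implementation itself, and all parameter values (`ay`, `az`, the emission rate, stack height) are illustrative assumptions.

```python
import math

def gaussian_plume(q, u, x, y, z, h, ay=0.08, az=0.06):
    """Standard Gaussian plume concentration (g/m^3) with ground reflection.

    q: emission rate (g/s), u: wind speed (m/s),
    x, y, z: downwind, crosswind, vertical receptor coordinates (m),
    h: effective stack height (m).
    ay, az: illustrative linear dispersion coefficients (sigma = a * x).
    """
    sy, sz = ay * x, az * x              # simple linear sigma growth (assumption)
    lateral = math.exp(-y**2 / (2 * sy**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sz**2))
                + math.exp(-(z + h)**2 / (2 * sz**2)))  # image source at -h
    return q / (2 * math.pi * u * sy * sz) * lateral * vertical
```

A trajectory model such as GTx applies this kind of kernel along back-trajectories to build source-receptor transfer coefficients.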

A Methodology for Creating a Conceptual Model Under Uncertainty

This article deals with conceptual modeling under uncertainty. First, a classification of information systems is described, with definitions, focusing on those for which constructing a conceptual model is appropriate when designing the database of a future information system. The disadvantages of the traditional approach to creating a conceptual model and database design are then analyzed. A comprehensive methodology for creating a conceptual model, based on the analysis of client requirements and the selection of a suitable domain model, is proposed. The article also presents an expert system used for constructing a conceptual model, which is a suitable tool for database designers.

Automating the Testing of Object Behaviour: A Statechart-Driven Approach

The evolution of current modeling specifications gives rise to the problem of generating automated test cases from a variety of application tools. Past endeavours on behavioural testing of UML statecharts have not systematically leveraged existing graph theory for testing objects. There is therefore a need for a simple, tool-independent, and effective method for automatic test generation. An architecture, codenamed ACUTE-J (Automated stateChart Unit Testing Engine for Java), for automating the unit test generation process is presented. A sequential approach for converting UML statechart diagrams to JUnit test classes, applying existing graph theory, is described. Research byproducts, such as a universal XML Schema and an API for statechart-driven testing, are also proposed. Results from a Java implementation of ACUTE-J are briefly discussed, and the Chinese Postman algorithm is used to illustrate a run-through of the ACUTE-J architecture.
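When the statechart's transition graph is Eulerian (every state has equal in- and out-degree), the Chinese Postman tour reduces to an Euler circuit, which Hierholzer's algorithm finds in linear time. A minimal sketch under that assumption, with a hypothetical two-state statechart (the state and event names are invented for illustration, not taken from ACUTE-J):

```python
from collections import defaultdict

def euler_test_sequence(transitions, start):
    """Return an edge-covering event sequence via Hierholzer's algorithm.

    transitions: (source_state, event, target_state) triples from a
    statechart whose transition graph is assumed Eulerian; for such
    graphs the Chinese Postman tour is simply an Euler circuit that
    fires every transition exactly once.
    """
    out = defaultdict(list)
    for src, event, dst in transitions:
        out[src].append((event, dst))
    stack, circuit = [(start, None)], []
    while stack:
        state, _ = stack[-1]
        if out[state]:                     # unused outgoing transition left
            event, nxt = out[state].pop()
            stack.append((nxt, event))
        else:                              # dead end: retire onto the circuit
            circuit.append(stack.pop())
    circuit.reverse()
    return [event for _, event in circuit if event is not None]

# hypothetical statechart: Idle --start--> Active --stop--> Idle
seq = euler_test_sequence(
    [("Idle", "start", "Active"), ("Active", "stop", "Idle")], "Idle")
```

Each returned sequence corresponds to one generated test case: a JUnit method would fire the events in order and assert the expected state after each transition.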

Intrapreneurship as a Unique Competitive Advantage

Intrapreneurship, a term used to describe entrepreneurship within existing organizations, has been acknowledged in international literature and practice as a vital element of economic and organizational growth, success and competitiveness, and can be considered a unique competitive advantage. The purpose of the paper is, first, to provide a comprehensive analysis of the concept of intrapreneurship and, second, to highlight the need for a different approach in research on the field of intrapreneurship. In conclusion, the paper suggests directions for future research.

Domestic Tourist Behaviour at Tourism Places in Bangkok and the Greater Area

This research studies the tourism preferences of domestic tourists in Bangkok and nearby areas of Thailand, and the factors involved in their choice of destination. Data were collected using 1,249 questionnaires in mid-August 2012. The results illustrate that religious destinations are the most preferred places among these tourists, and the average expense per trip is approximately 47 USD. Travellers rely on television and internet advertising, and their decisions are based on the reputation of the destination. With respect to the place dimension, the neatness and good management of a location play a crucial role in destination choice. Gender, age, marital status and origin affect spending and travelling behaviour. The researchers suggest that providing arcade areas, selling souvenirs and promoting tourism among young professionals would be important measures under the income distribution policy, along with managing destinations to welcome family groups, which the results identify as the highest-spending segment.

The Traffic Prediction Multi-path Energy-aware Source Routing (TP-MESR) in Ad hoc Networks

The purpose of this study is to suggest energy-efficient routing for ad hoc networks, which are composed of nodes with limited energy. Among the diverse problems in such networks, the limited energy supply of each node makes node energy management essential, and a number of protocols have been proposed for energy conservation and efficiency. In this study, the critical limitation of EA-MPDSR, an energy-efficient routing scheme that uses only two paths, is addressed. The proposed TP-MESR uses a multi-path routing technique and a traffic prediction function to increase the number of paths beyond two, and its efficiency is verified against EA-MPDSR using the network simulator NS-2. To ground the work academically and present the protocol systematically, the research guidelines suggested by Hevner (2004) are applied. The proposed TP-MESR solves the existing multi-path routing problems related to overhead, radio interference and packet reassembly, and its contribution to the effective use of energy in ad hoc networks is confirmed.

Limiting Fiber Extensibility as Parameter for Damage in Venous Wall

An inflation-extension test on a human vena cava inferior was performed with the aim of fitting a material model. The vein was modeled as a thick-walled tube loaded by internal pressure and axial force. The material was assumed to be an incompressible hyperelastic fiber-reinforced continuum, with fibers arranged in two families of anti-symmetric helices; the considered anisotropy corresponds to local orthotropy. The strain energy density function used was based on the concept of limiting strain extensibility. The pressurization comprised four pre-cycles under physiological venous loading (0–4 kPa) and four cycles under nonphysiological loading (0–21 kPa), each overloading cycle performed with a different axial weight. The overloading data were used in a regression analysis to fit the material model. The considered model did not fit the experimental data well; in particular, predictions of the axial force failed. It was hypothesized that, due to the nonphysiological loading pressures and differing axial weights, the material was not sufficiently preconditioned and some damage occurred inside the wall. The limiting fiber extensibility parameter Jm was assumed to be related to this supposed damage. Each overloading cycle was therefore fitted separately with a different value of Jm, with the other parameters held constant. This approach turned out to be successful: a variable Jm can describe changes in the axial force–axial stretch response while simultaneously satisfying the pressure–radius dependence.
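The limiting-extensibility concept behind the parameter Jm is commonly realized by a Gent-type strain energy. The sketch below shows the generic isotropic Gent form only to illustrate how the energy (and hence stiffness) blows up as the stretch invariant approaches its limit; the paper's actual model is anisotropic and fiber-reinforced, and the parameter values here are illustrative, not the fitted vein parameters.

```python
import math

def gent_energy(I1, mu=1.0, Jm=2.3):
    """Gent-type strain energy with limiting extensibility parameter Jm.

    W = -(mu * Jm / 2) * ln(1 - (I1 - 3) / Jm), where I1 is the first
    strain invariant. As I1 -> 3 + Jm the logarithm diverges, modelling
    finite (fiber) extensibility; mu and Jm here are placeholder values.
    """
    if I1 - 3.0 >= Jm:
        raise ValueError("stretch exceeds limiting extensibility")
    return -0.5 * mu * Jm * math.log(1.0 - (I1 - 3.0) / Jm)
```

Refitting each overloading cycle with a smaller Jm, as the abstract describes, corresponds to the material reaching its extensibility limit at progressively smaller stretches, which is consistent with accumulating fiber damage.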

Methodology for Realizing a Supervisor and Simulator Dedicated to a Semiconductor Research and Production Factory

In the micro- and nano-technology industry, the «clean rooms» dedicated to chip manufacturing are equipped with the most sophisticated equipment and tools. These use a large number of resources according to strict specifications for optimal operation and results. The distribution of «utilities» to production is ensured by teams using a supervision tool. Studies show the value of controlling the various production and/or distribution parameters in real time through a reliable and effective supervision tool. This document examines a large part of the functions that the supervisor must provide, together with complementary diagnosis and simulation functionalities that prove very useful in our case, where the supervised installations are complex and in constant evolution.

Numerical Analysis and Experimental Validation of a Downhole Stress/Strain Measurement Tool

Real-time measurement of applied forces, such as tension, compression, torsion, and bending moment, identifies the energies being transferred to the bottomhole assembly (BHA). These forces are highly detrimental to measurement/logging-while-drilling tools and downhole equipment. Real-time measurement of dynamic downhole behavior, including weight, torque, bending on bit, and vibration, establishes a feedback loop between the downhole drilling system and the drilling team at the surface. This paper describes the numerical analysis of the strain data acquired by the measurement tool at different locations on the strain pockets. The strain values obtained by FEA for various loading conditions (tension, compression, torque, and bending moment) are compared against experimental results obtained from an identical setup. The numerical results agree with the experimental data to within 8%, thereby substantiating and validating the FEA model, which can then be used to analyze the combined loading conditions that reflect the actual drilling environment.

Investigation of Novel Metaheuristic Algorithms for Combinatorial Optimization Problems in Ad Hoc Networks

Routing in a MANET is extremely challenging because of its dynamic features: limited bandwidth, frequent topology changes caused by node mobility, and constrained energy. To transmit data to destinations efficiently, suitable routing algorithms must be implemented in mobile ad hoc networks, and developing routing algorithms that satisfy Quality of Service (QoS) parameters can increase routing efficiency. Algorithms inspired by the principles of natural biological evolution and the distributed collective behavior of social colonies have shown excellence in dealing with complex optimization problems and are becoming increasingly popular. This paper presents a survey of several meta-heuristic and nature-inspired algorithms.

Estimating the Renewal Function with Heavy-tailed Claims

We develop a new estimator of the renewal function for heavy-tailed claim amounts. Our approach is based on the peaks-over-threshold method, estimating the tail of the distribution with a generalized Pareto distribution. The asymptotic normality of an appropriately centered and normalized estimator is established, and its performance is illustrated in a simulation study.
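To illustrate the peaks-over-threshold step, the sketch below fits a generalized Pareto distribution to threshold excesses by the method of moments; this is a simple stand-in for the tail-estimation step, and the paper's renewal-function estimator itself is not reproduced. The simulated exponential claims (true shape xi = 0) and threshold are arbitrary choices for the demonstration.

```python
import random
import statistics

def gpd_fit_mom(sample, threshold):
    """Method-of-moments GPD fit to excesses over a threshold (POT).

    For excesses Y = X - u with sample mean m and variance s2, the
    moment estimators are xi = (1 - m^2/s2)/2 and sigma = m*(1 - xi),
    valid for xi < 1/2.
    """
    excesses = [x - threshold for x in sample if x > threshold]
    m = statistics.fmean(excesses)
    s2 = statistics.variance(excesses)
    xi = 0.5 * (1.0 - m * m / s2)
    sigma = m * (1.0 - xi)
    return xi, sigma

random.seed(0)
claims = [random.expovariate(1.0) for _ in range(20000)]
xi_hat, sigma_hat = gpd_fit_mom(claims, threshold=2.0)  # true xi = 0
```

For exponential claims the excesses are again exponential, so the fitted shape parameter should be close to zero, which makes the example easy to sanity-check.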

Block Cipher Based on Randomly Generated Quasigroups

Quasigroups are algebraic structures closely related to Latin squares and have many different applications. This article describes a block cipher based on quasigroup string transformations using a quasigroup of order 256, suitable for fast software encryption of messages written in universal ASCII code. The novelty of this cipher lies in the fact that every time the cipher is invoked, a new set of two randomly generated quasigroups is used, from which a pair of quasigroups with dual operations is created. The cryptographic strength of the block cipher is examined by calculating XOR-distribution tables. In this approach, algebraic operations allow quasigroups of huge order to be used without any need to store them.
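The core quasigroup string transformation can be sketched as follows. The construction below (a random permutation composed with modular addition) is one cheap algebraic way to obtain an order-256 quasigroup without storing a full 256 x 256 table; it is illustrative and not necessarily the generator used in the paper, and it is a toy, not a secure cipher on its own.

```python
import random

def random_quasigroup(n=256, seed=None):
    """Build a quasigroup q(a, b) = perm[(a + b) % n] of order n.

    Composing modular addition with a random permutation always yields
    a Latin square, so the operation is defined algebraically rather
    than stored as an n x n table.
    """
    rng = random.Random(seed)
    perm = list(range(n))
    rng.shuffle(perm)
    inv = [0] * n
    for i, p in enumerate(perm):
        inv[p] = i
    q = lambda a, b: perm[(a + b) % n]
    # left parastrophe (dual operation): the unique m with q(a, m) = c
    q_inv = lambda a, c: (inv[c] - a) % n
    return q, q_inv

def e_transform(q, leader, message):
    """Quasigroup e-transformation: c0 = q(l, m0), ci = q(c_{i-1}, mi)."""
    out, prev = [], leader
    for m in message:
        prev = q(prev, m)
        out.append(prev)
    return out

def d_transform(q_inv, leader, cipher):
    """Inverse (d-) transformation using the left parastrophe."""
    out, prev = [], leader
    for c in cipher:
        out.append(q_inv(prev, c))
        prev = c
    return out

q, q_inv = random_quasigroup(seed=42)
msg = list(b"QUASIGROUP")
ct = e_transform(q, 7, msg)
pt = d_transform(q_inv, 7, ct)
```

The pairing of `q` with its left parastrophe `q_inv` mirrors the paper's "pair of quasigroups of dual operations": one drives encryption, the other decryption.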

A Model of Technological Platform for the Knowledge Management Organization

This paper describes an experience of research, development and innovation in the naval industry at COTECMAR (the Science and Technology Corporation for the Development of the Shipbuilding Industry in Colombia), particularly through processes of research, innovation and technological development based on theoretical models of organizational knowledge management, technology management, human talent management and the integration of technology platforms. It seeks ways to facilitate the establishment of environments rich in information, knowledge and content, supported by collaborative strategies built on the organization's dynamic mission processes, with the aim of further developing research, development and innovation in naval engineering in Colombia and making it a distinctive basis for generating knowledge assets at COTECMAR. The integration of information and communication technologies, supported by emerging technologies (mobile and wireless technologies, digital content via PDA, and content delivery services on Web 2.0 and Web 3.0), as one of the strategic thrusts of any organization, facilitates the redefinition of processes for managing information and knowledge: it enables the redesign of workflows and the adaptation of new forms of organization (preferably in networks), supports the internal creation of knowledge, and promotes the development of new skills, knowledge and attitudes in the knowledge worker.

Designing the Heating Process for Fiber-Reinforced Thermoplastics with Middle-Wave Infrared Radiators

Manufacturing components from fiber-reinforced thermoplastics requires three steps: heating the matrix, forming and consolidating the composite, and finally cooling the matrix. For the heating process, a pre-determined temperature distribution through the layers and the thickness of the pre-consolidated sheets is required to enable the forming mechanism; a design of the heating process for forming composites with thermoplastic matrices is therefore necessary. To obtain a constant temperature through the thickness and width of the sheet, the heating process was analyzed with the help of the finite element method. The simulation models were validated by experiments with resistance thermometers as well as with an infrared camera. Based on the finite element simulation, heating methods for infrared radiators were developed. Since many iteration loops of the numerical simulation are required to determine the process parameters, a model for calculating the relevant process parameters using regression functions was initiated.
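As a much-simplified illustration of the through-thickness heating problem, the sketch below solves 1D transient conduction with an absorbed radiator flux on one face by explicit finite differences. It is not the paper's finite element model; the material data, flux and sheet thickness are placeholder values chosen only to make the physics visible.

```python
def heat_through_thickness(n=50, thickness=4e-3, alpha=2.2e-7,
                           t_init=20.0, flux=20e3, k=0.3, steps=2000):
    """Explicit 1D finite-difference sketch of infrared sheet heating.

    One surface receives a constant absorbed radiator flux (W/m^2) via
    a ghost-node boundary; the back face is adiabatic. alpha is the
    thermal diffusivity (m^2/s), k the conductivity (W/(m*K)); all
    values are illustrative, not the paper's measured material data.
    """
    dx = thickness / (n - 1)
    dt = 0.4 * dx * dx / alpha          # stable: dt <= dx^2 / (2 * alpha)
    T = [t_init] * n
    for _ in range(steps):
        new = T[:]
        for i in range(1, n - 1):
            new[i] = T[i] + alpha * dt / dx**2 * (T[i-1] - 2*T[i] + T[i+1])
        # heated face: ghost node carries the imposed flux gradient
        new[0] = T[0] + alpha * dt / dx**2 * 2 * (T[1] - T[0] + flux * dx / k)
        # adiabatic back face: mirrored ghost node
        new[-1] = T[-1] + alpha * dt / dx**2 * 2 * (T[-2] - T[-1])
        T = new
    return T

profile = heat_through_thickness()
```

Running such a model repeatedly for different radiator settings is exactly the iteration loop the abstract describes, which motivates replacing it with regression functions for the process parameters.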

Investigation of Dimethyl Ether Solubility in Liquid Hexadecane by UNIFAC Method

It is shown that a modified UNIFAC model can be applied to predict the solubility of hydrocarbon gases and vapors in hydrocarbon solvents, with very good agreement with experimental data. In this work, we seek the best way to predict dimethyl ether solubility in liquid paraffin using group contribution theory.

A Markov Chain Model for Load-Balancing Based and Service Based RAT Selection Algorithms in Heterogeneous Networks

The Next Generation Wireless Network (NGWN) is expected to be a heterogeneous network that integrates all the different Radio Access Technologies (RATs) through a common platform. A major challenge is how to allocate users to the RAT most suitable for them; an optimized solution can maximize the efficient use of radio resources, achieve better performance for service providers, and provide Quality of Service (QoS) at low cost to users. Currently, Radio Resource Management (RRM) is implemented efficiently only for the RAT for which it was developed and is not suitable for a heterogeneous network; Common RRM (CRRM) has been proposed to manage radio resource utilization across such a network. This paper presents a user-level Markov model for a network of three co-located RATs. Load-balancing based and service based CRRM algorithms are studied using the presented Markov model, and their performance is compared in terms of traffic distribution, new call blocking probability, vertical handover (VHO) call dropping probability and throughput.
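The paper's Markov model is not reproduced here; as a small illustration of the underlying birth-death machinery, the sketch below computes new-call blocking via the Erlang-B formula (the stationary blocking probability of an M/M/c/c chain) together with a toy least-utilization rule in the spirit of load-balancing CRRM. All traffic values and capacities are hypothetical.

```python
def erlang_b(traffic, channels):
    """Erlang-B blocking probability of an M/M/c/c loss system.

    traffic: offered load in Erlangs; channels: number of servers.
    Uses the numerically stable inverse recurrence
    1/B(c) = 1 + (c / A) * 1/B(c-1), with 1/B(0) = 1.
    """
    inv_b = 1.0
    for c in range(1, channels + 1):
        inv_b = 1.0 + c / traffic * inv_b
    return 1.0 / inv_b

def load_balancing_rat(loads, capacities):
    """Toy load-balancing CRRM rule: admit on the least-utilized RAT."""
    utilization = [l / c for l, c in zip(loads, capacities)]
    return utilization.index(min(utilization))

# hypothetical three-RAT scenario: current loads and channel capacities
chosen = load_balancing_rat([5, 2, 8], [10, 10, 10])
blocking = erlang_b(10.0, 10)
```

A service based algorithm would instead rank RATs per service class (e.g. voice versus data) before applying the admission check, which is where the two schemes' traffic distributions diverge.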

Suitability of Requirements Abstraction Model (RAM) Requirements for High-Level System Testing

The Requirements Abstraction Model (RAM) helps in managing abstraction in requirements by organizing them at four levels (product, feature, function and component). The RAM is adaptable and can be tailored to meet the needs of various organizations. Because software requirements are an important source of information for developing high-level tests, organizations willing to adopt the RAM need to know how suitable RAM requirements are for developing such tests. To investigate this suitability, test cases from twenty randomly selected requirements were developed, analyzed and graded. The requirements were selected from the requirements document of a Course Management System, a web-based software system that supports teachers and students in performing course-related tasks. This paper describes the results of the requirements document analysis. The results show that requirements at the lower levels of the RAM are suitable for developing executable tests, whereas it is hard to develop such tests from requirements at the higher levels.

Computational Algorithm for Obtaining Abelian Subalgebras in Lie Algebras

The set of all abelian subalgebras is computationally obtained for any given finite-dimensional Lie algebra, starting from the nonzero brackets in its law. More concretely, an algorithm is described and implemented to compute a basis for each nontrivial abelian subalgebra with the help of the symbolic computation package MAPLE. Finally, a brief computational study of this implementation is presented, considering both computing time and memory usage.
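The flavor of the computation can be sketched as follows. This simplification enumerates only abelian subalgebras spanned by subsets of a fixed basis (for which "abelian" automatically implies "subalgebra", since all internal brackets vanish); the paper's MAPLE algorithm handles arbitrary bases, which requires symbolic linear algebra beyond this sketch.

```python
from itertools import combinations

def abelian_basis_subalgebras(dim, brackets):
    """Enumerate abelian subalgebras spanned by subsets of a fixed basis.

    brackets maps a pair (i, j) with i < j to the structure-constant
    vector of [e_i, e_j]; pairs absent from the dict commute. Returns
    all basis-index subsets whose pairwise brackets all vanish.
    """
    def commute(i, j):
        return all(c == 0 for c in brackets.get((min(i, j), max(i, j)), ()))
    found = []
    for r in range(1, dim + 1):
        for subset in combinations(range(dim), r):
            if all(commute(i, j) for i, j in combinations(subset, 2)):
                found.append(subset)
    return found

# Heisenberg algebra: [e0, e1] = e3... no: [e0, e1] = e2, all else zero
heis = {(0, 1): (0, 0, 1)}
ab = abelian_basis_subalgebras(3, heis)
```

For the Heisenberg algebra the sketch finds the five abelian coordinate subalgebras, including the maximal ones spanned by {e0, e2} and {e1, e2}, and correctly rejects {e0, e1}.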

Lattice Boltzmann Simulation of Binary Mixture Diffusion Using Modern Graphics Processors

A highly optimized implementation of binary mixture diffusion with no initial bulk velocity on graphics processors is presented. The lattice Boltzmann model is employed for simulating the binary diffusion of oxygen and nitrogen into each other with different initial concentration distributions. Simulations have been performed using the latest proposed lattice Boltzmann model that satisfies both the indifferentiability principle and the H-theorem for multi-component gas mixtures. Contemporary numerical optimization techniques such as memory alignment and increasing the multiprocessor occupancy are exploited along with some novel optimization strategies to enhance the computational performance on graphics processors using the C for CUDA programming language. Speedup of more than two orders of magnitude over single-core processors is achieved on a variety of Graphical Processing Unit (GPU) devices ranging from conventional graphics cards to advanced, high-end GPUs, while the numerical results are in excellent agreement with the available analytical and numerical data in the literature.
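A drastically reduced, single-component analogue of the lattice Boltzmann diffusion scheme can be sketched as follows: a D1Q3 lattice with BGK collisions relaxing toward a zero-velocity equilibrium, for which the diffusivity is D = cs^2 (tau - 1/2) in lattice units. This toy omits the mixture coupling, the indifferentiability/H-theorem model and all GPU kernels; lattice size, tau and the step profile are arbitrary demonstration choices.

```python
def lbm_diffusion(n=64, steps=500, tau=0.8):
    """D1Q3 lattice Boltzmann sketch of pure diffusion (periodic domain).

    A step concentration profile relaxes via BGK collisions toward the
    zero-velocity equilibrium w_k * rho, then streams one lattice site
    per step. Returns the final concentration field.
    """
    w = [2/3, 1/6, 1/6]                  # D1Q3 weights: rest, +1, -1
    f = [[w[k] * (1.0 if i < n // 2 else 0.0) for i in range(n)]
         for k in range(3)]
    for _ in range(steps):
        rho = [f[0][i] + f[1][i] + f[2][i] for i in range(n)]
        # BGK collision: relax each population toward w_k * rho
        for k in range(3):
            for i in range(n):
                f[k][i] += (w[k] * rho[i] - f[k][i]) / tau
        # streaming with periodic boundaries
        f[1] = [f[1][(i - 1) % n] for i in range(n)]
        f[2] = [f[2][(i + 1) % n] for i in range(n)]
    return [f[0][i] + f[1][i] + f[2][i] for i in range(n)]

rho_final = lbm_diffusion()
```

The collide-then-stream loop over independent lattice sites is exactly the structure that maps well onto CUDA thread blocks, which is what the memory-alignment and occupancy optimizations in the paper exploit.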