Identifying Impact Factors in Technology Transfer with the Aim of Technology Localization

Technology transfer is a common method for companies to acquire new technology and presents both challenges and substantial benefits. In some cases, especially in developing countries, the mere possession of technology does not guarantee a competitive advantage if the appropriate infrastructure is not in place. In this paper, we identify the localization factors needed to provide a better understanding of the conditions necessary for localization, in order to benefit from future technology developments. Our theoretical and empirical analyses allow us to identify several factors in the technology transfer process that affect localization and provide leverage in enhancing capabilities and absorptive capacity. The impact factors are categorized into the groups of government, firms, institutes, and market, and are verified through an empirical survey of a technology transfer experience. Moreover, statistical analysis has allowed a deeper understanding of the importance of each factor and has enabled each group to prioritize its organizational policies to effectively localize the technology.

Texture Feature Extraction of Infrared River Ice Images using Second-Order Spatial Statistics

Ice cover has a significant impact on rivers: ice melting can cause flooding, restrict navigation, and modify the ecosystem and microclimate. River ice is composed of different ice types of varying thickness, so surveillance of river ice plays an important role. River ice types are captured using an infrared imaging camera, which can capture images even at night. In this paper, infrared texture images of river ice are analysed using first-order and second-order statistical methods. The second-order statistical methods considered are the spatial gray level dependence method, the gray level run length method, and the gray level difference method. The performance of the feature extraction methods is evaluated using a Probabilistic Neural Network classifier, and it is found that the first-order and second-order statistical methods alone yield low accuracy. The features extracted from the first-order and second-order statistical methods are therefore combined, and it is observed that these combined features (first-order statistical method + gray level run length method) provide higher accuracy than the features from either the first-order or second-order statistical method alone.
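
As an illustration of the run-length statistics used above, the sketch below computes two classic gray level run length features, short run emphasis and long run emphasis, on a toy image. The function names and the restriction to the horizontal direction are illustrative assumptions, not the paper's implementation.

```python
def horizontal_runs(image):
    """Collect horizontal gray-level runs as (value, length) pairs."""
    runs = []
    for row in image:
        current, length = row[0], 1
        for v in row[1:]:
            if v == current:
                length += 1
            else:
                runs.append((current, length))
                current, length = v, 1
        runs.append((current, length))
    return runs

def glrlm_features(image):
    """Short run emphasis (SRE) and long run emphasis (LRE)."""
    runs = horizontal_runs(image)
    n = len(runs)
    sre = sum(1.0 / (length * length) for _, length in runs) / n
    lre = sum(float(length * length) for _, length in runs) / n
    return sre, lre

img = [[0, 0, 1, 1],
       [2, 2, 2, 2],
       [0, 1, 0, 1]]
sre, lre = glrlm_features(img)
print(round(sre, 3), round(lre, 3))   # → 0.652 4.0
```

Textures dominated by long uniform runs (like smooth ice sheets) push LRE up, while fragmented textures push SRE up, which is why these features help separate ice types.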

The Acaricidal and Repellent Effect of Cinnamon Essential Oil against House Dust Mite

The major source of allergy in the home is the house dust mite (Dermatophagoides farinae, Dermatophagoides pteronyssinus), which causes allergic symptoms including atopic dermatitis, asthma, perennial rhinitis, and even sudden infant death syndrome. Control of these mite species depends on chemical methods such as fumigation with methyl bromide, spraying with organophosphates such as pirimiphos-methyl, or treatment with repellents such as DEET and benzyl benzoate. Although effective, their repeated use for decades has sometimes resulted in the development of resistance and has fostered environmental and human health concerns. Both decomposing animal parts and the protein that surrounds mite fecal pellets cause mite allergy, so it is more effective to repel the mites than to kill them, because the allergen is not the living house dust mite but its dead body and fecal particles. It is important to find natural repellent materials against house dust mites in order to control them and reduce allergic reactions. Plants may be an alternative source for dust mite control because they contain a range of bioactive chemicals. The research objectives of this paper were to verify the acaricidal and repellent effects of cinnamon essential oil and to find its most effective concentrations. We found that cinnamon bark essential oil was a very effective material for controlling the house dust mite. Furthermore, it could reduce chemical resistance and the danger to human health.

Multimodal Biometric System Based on Near-Infra-Red Dorsal Hand Geometry and Fingerprints for Single and Whole Hands

Prior research has evidenced that unimodal biometric systems have several drawbacks, such as noisy data, intra-class variations, restricted degrees of freedom, non-universality, spoof attacks, and unacceptable error rates. For a biometric system to be more secure and to provide high accuracy, more than one form of biometrics is required. Hence the need arises for multimodal biometrics using combinations of different biometric modalities. This paper introduces a multimodal biometric system (MMBS) based on the fusion of whole dorsal hand geometry and fingerprints, which acquires right and left (Rt/Lt) near-infra-red (NIR) dorsal hand geometry (HG) shapes and (Rt/Lt) index and ring fingerprints (FP). A database of 100 volunteers was acquired using the designed prototype. The acquired images were found to have good quality for feature and pattern extraction across all modalities. HG features based on the anatomical landmarks of the hand shape were extracted. Robust and fast algorithms were used for FP minutia point feature extraction and matching. Feature vectors belonging to similar biometric traits were fused using feature fusion methodologies. Scores obtained from different biometric trait matchers were fused using the Min-Max transformation-based score fusion technique. Final normalized scores were merged using the sum-of-scores method to obtain a single decision about the personal identity based on multiple independent sources. The high individuality of the fused traits and the user acceptability of the designed system, along with its high experimental performance measures, show that this MMBS can be considered for medium-to-high security level biometric identification purposes.

Simultaneous Term Structure Estimation of Hazard and Loss Given Default with a Statistical Model using Credit Rating and Financial Information

The objective of this study is to propose a statistical modeling method that enables simultaneous term structure estimation of the risk-free interest rate, the hazard rate, and the loss given default, incorporating characteristics of the bond-issuing company such as credit rating and financial information. A reduced-form model is used for this purpose. Statistical techniques such as spline estimation and the Bayesian information criterion are employed for parameter estimation and model selection. An empirical analysis is conducted using Japanese bond market data, and its results confirm the usefulness of the proposed method.
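
As a minimal illustration of the model selection step, the snippet below compares spline models of different complexity using the Bayesian information criterion. The log-likelihoods and knot counts are invented for illustration only.

```python
import math

def bic(log_likelihood, n_params, n_obs):
    """Bayesian information criterion: k*ln(n) - 2*ln(L); lower is better."""
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

# Hypothetical spline fits: (log-likelihood, number of parameters)
models = {"3 knots": (-120.0, 5),
          "6 knots": (-112.0, 8),
          "12 knots": (-110.0, 14)}
n_obs = 200
scores = {name: bic(ll, k, n_obs) for name, (ll, k) in models.items()}
best = min(scores, key=scores.get)
print(best)   # → 6 knots
```

The 12-knot model fits slightly better but its extra parameters are penalized by the k·ln(n) term, which is exactly how BIC guards against overfitting the term structure.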

Static Recrystallization Behavior of Mg Alloy Single Crystals

Single crystals of magnesium alloys, namely pure Mg, Mg-1Zn-0.5Y, Mg-0.1Y, and Mg-0.1Ce, were successfully fabricated in this study by employing the modified Bridgman method. To determine the exact orientation of the crystals, pole figure measurements using X-ray diffraction were carried out on each single crystal. Hardness and compression tests were conducted, followed by recrystallization annealing, and the recrystallization kinetics of the Mg alloy single crystals were investigated. The fabricated single crystals were cut into rectangular specimens, solution treated at 400 °C for 24 h, and then deformed in compression by a 30% reduction. Annealing for recrystallization was conducted on these deformed specimens at 300 °C for times ranging from 1 to 20 min. Microstructure observation and hardness measurements on the recrystallized specimens revealed that static recrystallization of the ternary alloy single crystal was very slow, while recrystallization of the binary alloy single crystals was very fast.
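
Recrystallization kinetics of this kind are commonly summarized with the JMAK (Avrami) relation; the abstract does not state which model was fitted, so the sketch below is only an assumed illustration, with hypothetical rate constants contrasting a fast and a slow alloy.

```python
import math

def jmak_fraction(t, k, n):
    """JMAK (Avrami) recrystallized fraction: X(t) = 1 - exp(-k * t^n)."""
    return 1.0 - math.exp(-k * t ** n)

def time_for_fraction(x, k, n):
    """Invert the JMAK relation for the time to reach fraction x."""
    return (-math.log(1.0 - x) / k) ** (1.0 / n)

# Hypothetical rate constants: a "fast" binary alloy vs a "slow" ternary alloy
t50_fast = time_for_fraction(0.5, k=0.2, n=2.0)     # time to 50% recrystallized
t50_slow = time_for_fraction(0.5, k=0.002, n=2.0)
print(round(t50_fast, 2), round(t50_slow, 2))   # → 1.86 18.62
```

With the Avrami exponent fixed, a 100-fold difference in the rate constant translates into a 10-fold difference in the time to 50% recrystallization, which is the kind of fast/slow contrast the abstract reports.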

Automatic Segmentation of Thigh Magnetic Resonance Images

Purpose: To develop a method for the automatic segmentation of adipose and muscular tissue in thighs from magnetic resonance images. Materials and methods: Thirty obese women were scanned on a Siemens Impact Expert 1T resonance machine, and 1500 images were finally used in the tests. The developed segmentation method is a recursive, multilevel process that makes use of several concepts such as shaped histograms, adaptive thresholding, and connectivity. The segmentation process was implemented in Matlab and operates without the need for any user interaction. The whole set of images was segmented with the developed method. An expert radiologist segmented the same set of images following a manual procedure with the aid of the SliceOmatic software (Tomovision); these segmentations constituted our 'gold standard'. Results: The number of coincident pixels between the automatic and manual segmentations was measured, and agreement was above 90% in most of the images. Conclusions: The proposed approach allows effective automatic segmentation of thigh MRIs, comparable to expert manual performance.
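
The validation metric above, the fraction of coincident pixels between the automatic and manual segmentations, can be sketched as follows on toy binary masks (the real images are of course much larger):

```python
def pixel_agreement(auto_mask, manual_mask):
    """Fraction of coincident pixels between two masks of equal size."""
    flat_a = [p for row in auto_mask for p in row]
    flat_m = [p for row in manual_mask for p in row]
    matches = sum(1 for a, m in zip(flat_a, flat_m) if a == m)
    return matches / len(flat_a)

# Toy 2x3 masks: 1 = tissue of interest, 0 = background
auto = [[1, 1, 0],
        [1, 0, 0]]
manual = [[1, 1, 0],
          [0, 0, 0]]
print(round(pixel_agreement(auto, manual), 3))   # → 0.833
```

A threshold such as the abstract's 90% agreement would then simply be `pixel_agreement(...) > 0.9` evaluated per image against the gold standard.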

Dependence of Particle Initiated PD Characteristics on Size and Position of Metallic Particle Adhering to the Spacer Surface in GIS

It is well known that metallic particles reduce the reliability of Gas-Insulated Substation (GIS) equipment by initiating partial discharges (PDs) that can lead to breakdown and complete failure of the GIS. This paper investigates the characteristics of PDs caused by metallic particles adhering to the solid spacer. PD detection and measurement were carried out using the IEC 60270 method with particles of different sizes and at different positions on the spacer surface. The results show that a particle of a certain size at a certain position possesses a unique PD characteristic compared to those caused by particles of different sizes and/or at different positions. Therefore, PD characteristics may be useful for identifying particle size and position.

Eigenvalues of Particle Bound in Single and Double Delta Function Potentials through Numerical Analysis

This study employs the fourth-order Numerov scheme to determine the eigenstates and eigenvalues of particles, electrons in particular, in single and double delta function potentials. For the single delta potential, it is found that the eigenstates can only be attained by using specific potential depths; the depth of the delta potential well varies depending on the delta strength. These depths are then used for each well of the double delta function potential, and the eigenvalues are determined. Two bound states are found in the computation, one with a symmetric eigenstate and one with an antisymmetric eigenstate.

Roundness Deviation Measuring Strategy on Coordinate Measuring Machines and Conventional Machines

Modern technological processes make possible the surface control of manufactured parts, which is necessary to guarantee product quality. The geometrical structure of a part surface includes form, proportion, accuracy of shape, accuracy of size, alignment, and surface topography (roughness, waviness, etc.). All these parameters depend on the technology, the production machine parameters, and the material properties, but also on the human operator. Each of these parameters contributes to the total part accuracy, in particular to the accuracy of shape, and one of the most important elements of shape accuracy is roundness. This paper compares roundness deviations measured on coordinate measuring machines and on special single-purpose machines. We describe measurement by the discrete (discontinuous) method and the scanning (continuous) method on coordinate measuring machines, and compare them with the reference method used on single-purpose machines.
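
A minimal sketch of evaluating roundness deviation from discrete probe points follows. It uses a centroid reference center as a simplification of least-squares circle fitting, on simulated points carrying a 5-lobe form error; both the data and the simplification are assumptions for illustration.

```python
import math

def roundness_deviation(points):
    """Peak-to-valley roundness: max minus min radius about the centroid."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    return max(radii) - min(radii)

# Simulated discrete-method probe points, 1 degree apart, on a nominal
# 10 mm radius circle with a 5 um amplitude, 5-lobe form error
n, r = 360, 10.0
points = [((r + 0.005 * math.cos(5 * a)) * math.cos(a),
           (r + 0.005 * math.cos(5 * a)) * math.sin(a))
          for a in (2 * math.pi * i / n for i in range(n))]
print(round(roundness_deviation(points), 4))   # → 0.01 (mm, i.e. 10 um)
```

The discrete method samples such points at fixed angular steps, while the scanning method produces a dense continuous trace; both reduce to the same peak-to-valley evaluation once a reference center is chosen.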

Effects of Bay Leaves on Blood Glucose and Lipid Profiles on the Patients with Type 1 Diabetes

Bay leaves have been shown to improve insulin function in vitro, but their effects in people have not been determined. The objective of this study was to determine whether bay leaves may be important in the prevention and/or alleviation of type 1 diabetes. Methods: Fifty-five people with type 1 diabetes were divided into two groups: 45 were given capsules containing 3 g of bay leaves per day for 30 days and 10 were given placebo capsules. Results: All patients who consumed bay leaves showed reduced serum glucose, with a significant decrease of 27% after 30 days. Total cholesterol decreased by 21% after 30 days, with a larger decrease of 24% in low-density lipoprotein (LDL); high-density lipoprotein (HDL) increased by 20%, and triglycerides decreased by 26%. There were no significant changes in the placebo group. Conclusion: This study demonstrates that consumption of bay leaves, 3 g/d for 30 days, decreases risk factors for diabetes and cardiovascular diseases, and suggests that bay leaves may be beneficial for people with type 1 diabetes.

The Influence of Pad Thermal Diffusivity over Heat Transfer into the PCBs Structure

Pads have specific values of thermophysical properties (THP) that make an important contribution to heat transfer into the PCB structure. Materials with high thermal diffusivity (TD) rapidly adjust their temperature to that of their surroundings, because heat transfer is quick compared with their volumetric heat capacity (VHC). This paper presents diffusivity tests (ASTM E1461 flash method) for PCBs with different core materials. In the experiments, the multilayer structure of the PCBA was taken into consideration, and an equivalent property was measured for each experimental structure. For the entire structure, the THP emphasize the major contribution of the substrate in establishing the heat transfer requirements of the reflow soldering process (RSP). This conclusion offers a practical solution for calculating the heat-transfer time constant as a function of thickness and substrate diffusivity, with acceptable estimation error.
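
The time-constant idea above can be illustrated with the characteristic diffusion time τ = L²/α. The diffusivity values below are assumed order-of-magnitude figures for an FR-4 core and a metal-core substrate, not the paper's measurements.

```python
def thermal_time_constant(thickness_m, diffusivity_m2_s):
    """Characteristic heat-diffusion time tau = L^2 / alpha through a board."""
    return thickness_m ** 2 / diffusivity_m2_s

# Hypothetical 1.6 mm boards; alpha values are assumed, for illustration only
tau_fr4 = thermal_time_constant(1.6e-3, 3.0e-7)    # assumed alpha for FR-4
tau_metal = thermal_time_constant(1.6e-3, 7.0e-5)  # assumed alpha, metal core
print(round(tau_fr4, 3), round(tau_metal, 5))   # → 8.533 0.03657 (seconds)
```

The two-orders-of-magnitude gap in τ is what makes substrate diffusivity dominate the heat-transfer timing of a reflow profile.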

Solving an Extended Resource Leveling Problem with Multiobjective Evolutionary Algorithms

We introduce an extended resource leveling model that abstracts real-life projects by considering specific work ranges for each resource. Contrary to traditional resource leveling problems, this model considers scarce resources and multiple objectives: the minimization of the project makespan and the leveling of each resource's usage over time. We formulate this model as a multiobjective optimization problem and propose a multiobjective genetic algorithm-based solver to optimize it. This solver consists of a two-stage process: a main stage, where we obtain non-dominated solutions for all the objectives, and a post-processing stage, where we seek to specifically improve the resource leveling of these solutions. We propose an intelligent encoding for the solver that allows domain-specific knowledge to be included in the solving mechanism. The chosen encoding proves effective for solving leveling problems with scarce resources and multiple objectives. The outcomes of the proposed solver represent optimized trade-offs (alternatives) that can later be evaluated by a decision maker; this multi-solution approach represents an advantage over the traditional single-solution approach. We compare the proposed solver with state-of-the-art resource leveling methods and report competitive results.
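
A leveling objective of the kind optimized above can be illustrated by comparing two schedules of the same tasks. The task data are hypothetical, and the measure shown (sum of squared deviations from the mean usage) is one common leveling criterion, not necessarily the paper's exact formulation.

```python
def resource_profile(tasks, horizon):
    """Per-period usage of one resource; tasks are (start, duration, demand)."""
    usage = [0] * horizon
    for start, duration, demand in tasks:
        for t in range(start, start + duration):
            usage[t] += demand
    return usage

def leveling_objective(usage):
    """Sum of squared deviations from mean usage: lower means more level."""
    mean = sum(usage) / len(usage)
    return sum((u - mean) ** 2 for u in usage)

horizon = 4
peaky = [(0, 2, 3), (0, 2, 3)]   # both tasks stacked at the start
level = [(0, 2, 3), (2, 2, 3)]   # same work spread over the horizon
print(leveling_objective(resource_profile(peaky, horizon)),
      leveling_objective(resource_profile(level, horizon)))   # → 36.0 0.0
```

A multiobjective solver would evaluate such a leveling score per resource alongside the makespan, keeping schedules that are non-dominated on all objectives.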

Automated Inspection Algorithm for Thick Plate Using Dual Light Switching Lighting Method

This paper presents an automated inspection algorithm for a thick plate. Thick plates typically have various types of surface defects, such as scabs, scratches, and roller marks. These defects have individual characteristics including brightness and shape. Therefore, it is not simple to detect all the defects. In order to solve these problems and to detect defects more effectively, we propose a dual light switching lighting method and a defect detection algorithm based on Gabor filters.
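
A real (even-symmetric) Gabor kernel of the kind used in such defect detection can be generated as follows; the parameter values are illustrative, not those of the paper.

```python
import math

def gabor_kernel(size, wavelength, theta, sigma):
    """Real Gabor kernel: a Gaussian envelope times an oriented cosine."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # Rotate coordinates by the filter orientation theta
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr * xr + yr * yr) / (2.0 * sigma * sigma))
            row.append(envelope * math.cos(2.0 * math.pi * xr / wavelength))
        kernel.append(row)
    return kernel

k = gabor_kernel(size=7, wavelength=4.0, theta=0.0, sigma=2.0)
print(round(k[3][3], 3))   # → 1.0 (the kernel's center)
```

Convolving an image with a bank of such kernels at several orientations and wavelengths highlights elongated defects such as scratches and roller marks, which respond strongly at matching orientations.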

An Approximate Engineering Method for Aerodynamic Heating Solution around Blunt Body Nose

This paper is devoted to predicting laminar and turbulent heating rates around blunt re-entry spacecraft at hypersonic conditions. Heating calculations for a hypersonic body are normally performed during the critical part of its flight trajectory. The procedure is an inverse method: a shock wave is assumed, and the body shape that supports this shock, as well as the flowfield between the shock and the body, are calculated. For simplicity, the normal momentum equation is replaced with a second-order pressure relation; this simplification significantly reduces the computation time. The geometries considered in this research are paraboloids and ellipsoids, which may have conical afterbodies. Excellent agreement is observed between the results obtained in this paper and those calculated in other research. Since this method is much faster than Navier-Stokes solutions, it can be used in preliminary design and parametric studies of hypersonic vehicles.
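
For a quick engineering feel of stagnation-point heating, correlations of the Sutton-Graves type are often used. This is not the paper's inverse method, and the constant below is an assumed Earth-atmosphere value, so only the scaling behavior (heating grows with V³ and falls with nose radius) should be read from this sketch.

```python
import math

def stagnation_heating(rho, velocity, nose_radius, k=1.7415e-4):
    """Sutton-Graves-type correlation: q = k * sqrt(rho / Rn) * V^3.

    k is an assumed Earth-atmosphere constant; the absolute units are
    indicative only, the point here is the scaling with V and Rn.
    """
    return k * math.sqrt(rho / nose_radius) * velocity ** 3

q_sharp = stagnation_heating(rho=1e-4, velocity=6000.0, nose_radius=1.0)
q_blunt = stagnation_heating(rho=1e-4, velocity=6000.0, nose_radius=4.0)
print(q_sharp > q_blunt)   # blunter nose -> lower stagnation heating
```

The inverse-square-root dependence on nose radius is why re-entry vehicles use blunt noses, the very geometry class (paraboloids, ellipsoids) analyzed in the paper.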

Road Extraction Using Stationary Wavelet Transform

In this paper, a novel road extraction method using the Stationary Wavelet Transform is proposed. To detect road features from color aerial satellite imagery, Mexican hat wavelet filters are used by applying the Stationary Wavelet Transform in a multiresolution, multi-scale sense and forming products of wavelet coefficients at different scales to locate and identify road features. In addition, the shifting of road feature locations across scales is considered, for robust road extraction in the presence of asymmetric road feature profiles. The experimental results show that the proposed method provides a useful basis for road feature extraction. Moreover, the method is general and can be applied to other features in imagery.
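
The Mexican hat wavelet and the inter-scale product idea can be sketched as follows. Here the wavelet itself stands in for the stationary-wavelet detail coefficients, purely for illustration; the real method forms products of SWT coefficients computed from the image.

```python
import math

def mexican_hat(t, scale):
    """Mexican hat (Ricker) wavelet evaluated at t for a given scale."""
    x = t / scale
    return (1.0 - x * x) * math.exp(-x * x / 2.0)

def scale_product(sample_points, scales):
    """Pointwise product of wavelet responses across scales."""
    return [math.prod(mexican_hat(t, s) for s in scales)
            for t in sample_points]

ts = [-2.0, -1.0, 0.0, 1.0, 2.0]
prod = scale_product(ts, scales=[1.0, 2.0])
peak = max(range(len(prod)), key=lambda i: prod[i])
print(peak)   # → 2: the product peaks at t = 0, where every scale responds
```

Multiplying responses across scales reinforces features (like road edges) that persist over scales while suppressing noise that appears at only one scale, which is the rationale behind the multi-scale product.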

A Novel Q-algorithm for EPC Global Class-1 Generation-2 Anti-collision Protocol

This paper provides a scheme to improve the read efficiency of the anti-collision algorithm in the EPCglobal UHF Class-1 Generation-2 RFID standard. In this standard, dynamic framed slotted ALOHA is specified to solve the anti-collision problem, and the Q-algorithm with a key parameter C is adopted to dynamically adjust the frame sizes. In this paper, we split the C parameter into two parameters to increase the read speed, and we derive the optimal values of the two parameters through simulations. The results indicate that our method outperforms the original Q-algorithm.
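
The standard Q-algorithm update, with the single C split into separate collision and empty-slot increments in the spirit of the proposed scheme, can be sketched as below. The increment values are illustrative, not the optima derived in the paper.

```python
def q_algorithm_round(slot_outcomes, q_fp, c_coll, c_empty):
    """EPC Gen-2 style update of the floating-point slot-count parameter.

    Add c_coll on a collision, subtract c_empty on an empty slot, leave
    Q_fp unchanged on a single (successful) reply; clamp Q_fp to [0, 15].
    """
    for outcome in slot_outcomes:
        if outcome == "collision":
            q_fp = min(15.0, q_fp + c_coll)
        elif outcome == "empty":
            q_fp = max(0.0, q_fp - c_empty)
        # a "single" reply leaves q_fp unchanged
    q = round(q_fp)          # the next inventory round uses 2^Q slots
    return q_fp, q

q_fp, q = q_algorithm_round(
    ["collision", "collision", "single", "empty"],
    q_fp=4.0, c_coll=0.4, c_empty=0.2)
print(q, 2 ** q)   # → 5 32
```

Using separate increments lets the reader react more aggressively to collisions than to empty slots (or vice versa), which is the degree of freedom the split-C scheme tunes.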

A Parametric Study of an Inverse Elastostatics Problem (IESP) Using Simulated Annealing, Hooke & Jeeves and Sequential Quadratic Programming in Conjunction with Finite Element and Boundary Element Methods

The aim of the current work is to present a comparison among three popular optimization methods for the inverse elastostatics problem (IESP) of flaw detection within a solid. In more detail, the performance of a simulated annealing, a Hooke & Jeeves, and a sequential quadratic programming algorithm was studied for the test case of one circular flaw in a plate, solved by both the boundary element method (BEM) and the finite element method (FEM). The optimization methods use a cost function based on the displacements of the static response. The methods were ranked according to the number of iterations required to converge and to their ability to locate the global optimum, giving a clear impression of the performance of these algorithms in flaw identification problems. Furthermore, the coupling of BEM or FEM with these optimization methods was investigated in order to track differences in their performance.
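
A generic simulated annealing loop of the kind compared above can be sketched as follows. The quadratic cost is a toy stand-in for the displacement-mismatch cost; in the real IESP, each cost evaluation would run a BEM or FEM forward solve, and all parameter values here are illustrative.

```python
import math
import random

def simulated_annealing(cost, start, step, t0, cooling, iters, rng):
    """Generic simulated annealing over a continuous parameter vector."""
    current = list(start)
    c_curr = cost(current)
    best, c_best = list(current), c_curr
    temp = t0
    for _ in range(iters):
        candidate = [p + rng.uniform(-step, step) for p in current]
        c_cand = cost(candidate)
        # Accept improvements always; worse moves with Boltzmann probability
        if c_cand < c_curr or rng.random() < math.exp((c_curr - c_cand) / temp):
            current, c_curr = candidate, c_cand
            if c_curr < c_best:
                best, c_best = list(current), c_curr
        temp *= cooling
    return best, c_best

# Toy cost with its minimum at an assumed "true" flaw center (1.0, -2.0)
def cost(p):
    return (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2

rng = random.Random(42)
best, c_best = simulated_annealing(cost, start=[0.0, 0.0], step=0.5,
                                   t0=1.0, cooling=0.99, iters=2000, rng=rng)
print(round(c_best, 4))
```

Because each iteration of such a stochastic search costs one forward solve, the iteration counts by which the paper ranks the algorithms translate directly into total BEM/FEM computation time.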

Big Bang – Big Crunch Learning Method for Fuzzy Cognitive Maps

Modeling complex dynamic systems, for which it is very complicated to establish mathematical models, requires new and modern methodologies that exploit existing expert knowledge, human experience, and historical data. Fuzzy cognitive maps are very suitable, simple, and powerful tools for the simulation and analysis of such dynamic systems. However, human experts are subjective and can handle only relatively simple fuzzy cognitive maps; therefore, there is a need to develop new approaches for the automated generation of fuzzy cognitive maps from historical data. In this study, a new learning algorithm, called Big Bang-Big Crunch, is proposed for the first time in the literature for the automated generation of fuzzy cognitive maps from data. Two real-world examples, namely a process control system and a radiation therapy process, and one synthetic model are used to demonstrate the effectiveness and usefulness of the proposed methodology.
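
The Big Bang-Big Crunch search can be sketched generically as below: scatter candidates around the current center ("big bang"), then contract to a cost-weighted center of mass ("big crunch") while the scatter radius shrinks. The quadratic objective is a toy stand-in for an FCM weight-fitting error, and all parameters are illustrative.

```python
import random

def big_bang_big_crunch(cost, dim, bounds, iters, pop, rng):
    """Big Bang-Big Crunch optimization over a real-valued vector."""
    lo, hi = bounds
    center = [rng.uniform(lo, hi) for _ in range(dim)]
    for k in range(1, iters + 1):
        radius = (hi - lo) / k           # shrinking search radius
        # Big Bang: scatter a population of candidates around the center
        candidates = [[c + rng.uniform(-radius, radius) for c in center]
                      for _ in range(pop)]
        # Big Crunch: contract to the center of mass weighted by 1/cost
        weights = [1.0 / (cost(x) + 1e-12) for x in candidates]
        total = sum(weights)
        center = [sum(w * x[d] for w, x in zip(weights, candidates)) / total
                  for d in range(dim)]
    return center, cost(center)

# Toy objective with its minimum at an assumed target weight vector
def cost(p):
    return sum((v - 0.5) ** 2 for v in p)

rng = random.Random(7)
center, final_cost = big_bang_big_crunch(cost, dim=3, bounds=(-1.0, 1.0),
                                         iters=60, pop=30, rng=rng)
print(round(final_cost, 4))
```

In the FCM setting, the vector being optimized would hold the map's connection weights, and the cost would measure how well the simulated concept trajectories reproduce the historical data.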

Robust Features for Impulsive Noisy Speech Recognition Using Relative Spectral Analysis

The goal of speech parameterization is to extract, from the audio signal, the relevant information about what is being spoken. In speech recognition systems, Mel-Frequency Cepstral Coefficients (MFCC) and Relative Spectral Mel-Frequency Cepstral Coefficients (RASTA-MFCC) are the two main techniques used. This paper presents some modifications to the original MFCC method. The effectiveness of the proposed changes to MFCC, called Modified Function Cepstral Coefficients (MODFCC), was tested and compared against the original MFCC and RASTA-MFCC features. Prosodic features such as jitter and shimmer were added to the baseline spectral features. The above-mentioned techniques were tested with impulsive signals under various noisy conditions using the AURORA databases.
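
The prosodic features mentioned above can be sketched as follows, using a common "local" definition: mean absolute cycle-to-cycle variation relative to the mean, applied to pitch periods (jitter) and to peak amplitudes (shimmer). The cycle data below are hypothetical, and the paper may use a different variant of these measures.

```python
def jitter_percent(periods):
    """Local jitter: mean absolute difference of consecutive pitch periods,
    as a percentage of the mean period."""
    diffs = [abs(a - b) for a, b in zip(periods, periods[1:])]
    return 100.0 * (sum(diffs) / len(diffs)) / (sum(periods) / len(periods))

def shimmer_percent(amplitudes):
    """Local shimmer: the same measure applied to cycle peak amplitudes."""
    diffs = [abs(a - b) for a, b in zip(amplitudes, amplitudes[1:])]
    return 100.0 * (sum(diffs) / len(diffs)) / (sum(amplitudes) / len(amplitudes))

# Hypothetical pitch periods (ms) and peak amplitudes for five voice cycles
periods = [8.0, 8.2, 7.9, 8.1, 8.0]
amps = [1.00, 0.95, 1.05, 1.00, 0.98]
print(round(jitter_percent(periods), 2), round(shimmer_percent(amps), 2))
```

Appending such scalar prosodic measures to each frame's cepstral vector is the usual way of combining them with spectral features like MFCC.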