Commercializing Technology Solutions: Moving from Products to Solutions

The paper outlines the drivers behind the movement from products to solutions in hi-tech business-to-business markets. It lists the challenges in enabling the transformation from products to solutions and explores strategic and operational recommendations based on the authors' firsthand experiences with Japanese hi-tech manufacturing organizations. Organizations in hi-tech business-to-business markets are increasingly compelled to move from the conventional products model to a solutions model. Despite the added complexity of solutions, successful technology commercialization can be achieved by making prudent choices in defining a relevant solutions model, by backing the solutions model with an appropriate organizational design, and by overhauling the new product development process and supporting infrastructure.

Fracture Characterization of Plain Woven Fabric Glass-Epoxy Composites

Delamination between layers in composite materials is a major structural failure mode. Delamination resistance is quantified by the critical strain energy release rate (SERR). The present investigation deals with the strain energy release rate of two woven fabric composites. The materials used are made of two types of plain-weave glass fiber (360 gsm and 600 gsm) with epoxy as the matrix. The fracture behavior is studied using the mode I double cantilever beam (DCB) test and the mode II end-notched flexure (ENF) test in order to determine the energy required for the initiation and growth of an artificial crack. The delamination energies of the two materials are compared in order to study the effect of weave and reinforcement on mechanical properties. The fracture mechanism is also analyzed by means of scanning electron microscopy (SEM). It is observed that the plain-weave fabric composite with the smaller strand width has higher interlaminar fracture properties than the plain-weave fabric composite with the larger strand width.
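
For reference, the mode I critical SERR from a DCB specimen is commonly reduced via modified beam theory (a standard data-reduction scheme, e.g. from ASTM D5528, not a formula quoted from this abstract):

```latex
% Modified beam theory for the DCB test:
% P = critical load, \delta = load-point displacement,
% b = specimen width, a = delamination length,
% \Delta = crack-length correction from the compliance fit.
G_{IC} = \frac{3 P \delta}{2 b \left( a + |\Delta| \right)}
```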

Potential Effects of Human Bone Marrow Non-Mesenchymal Mononuclear Cells on Neuronal Differentiation

Bone marrow-derived stem cells have been widely studied as an alternative source of stem cells. Mesenchymal stem cells (MSCs) have been the most investigated, and studies have shown that MSCs can promote neurogenesis. Little is known about the non-mesenchymal mononuclear cell fraction, which contains both hematopoietic and nonhematopoietic cells, including monocytes and endothelial progenitor cells. This study focused on unfractionated bone marrow mononuclear cells (BMMCs), which remained 72 h after MSCs had adhered to the culture plates. We showed that BMMC-conditioned medium promoted morphological changes of human SH-SY5Y neuroblastoma cells from an epithelial-like phenotype towards a neuron-like phenotype, as indicated by an increase in neurite outgrowth similar to that observed in retinoic acid (RA)-treated cells. This result could be explained by the effects of trophic factors released from BMMCs, as RT-PCR showed that BMMCs expressed nerve growth factor (NGF), brain-derived neurotrophic factor (BDNF), and ciliary neurotrophic factor (CNTF). Similar cell proliferation rates were also observed between RA-treated cells and cells cultured in BMMC-conditioned medium, suggesting that the cells ceased proliferating and differentiated into a neuronal phenotype. Using real-time RT-PCR, a significantly increased expression of tyrosine hydroxylase (TH) mRNA in SH-SY5Y cells indicated that BMMC-conditioned medium induced catecholaminergic identities in the differentiated SH-SY5Y cells.

Labeling Method in Steganography

In this paper, a method of hiding a text message in a grayscale image (steganography) is presented. The method first obtains the binary value of each character of the text message. In the next stage, it finds the dark (black) regions of the grayscale image by converting the original image to a binary image and labeling each object of the image using 8-connectivity. These images are then converted to RGB images in order to locate the dark regions, because in this way each gray level maps to an RGB color and the dark level of the gray image can be identified; if the gray image is very light, the histogram must be adjusted manually so that only dark regions are selected. In the final stage, each group of 8 pixels in the dark regions is treated as a byte, and the binary value of each character is placed in the low-order bit of each such byte; constructing these bytes from dark-region pixels increases the security of the basic least significant bit (LSB) steganography method.
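
The embedding stage can be sketched as follows. This is a simplified sketch, not the paper's exact procedure: a brightness threshold (an assumed value) stands in for the connected-component labeling of dark objects, and the message length is assumed known at extraction time.

```python
import numpy as np

def embed_in_dark_pixels(gray, message, dark_threshold=40):
    """Hide a text message in the LSBs of dark pixels of a grayscale image.

    Dark pixels are selected by a brightness threshold (an assumption;
    the paper labels 8-connected dark objects instead). The LSB is masked
    out before thresholding so that writing a bit cannot move a pixel
    across the dark/non-dark boundary.
    """
    flat = gray.flatten().astype(np.uint8)
    dark_idx = np.flatnonzero((flat & 0xFE) <= dark_threshold)
    bits = [int(b) for ch in message for b in format(ord(ch), "08b")]
    if len(bits) > len(dark_idx):
        raise ValueError("message too long for the available dark pixels")
    for bit, i in zip(bits, dark_idx):
        flat[i] = (flat[i] & 0xFE) | bit  # overwrite the low-order bit
    return flat.reshape(gray.shape)

def extract_from_dark_pixels(stego, n_chars, dark_threshold=40):
    """Recover n_chars characters from the LSBs of the dark pixels."""
    flat = stego.flatten()
    dark_idx = np.flatnonzero((flat & 0xFE) <= dark_threshold)[: n_chars * 8]
    bits = "".join(str(flat[i] & 1) for i in dark_idx)
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))
```

Masking with `0xFE` on both sides guarantees that the embedder and the extractor select the same pixel positions.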

Elastic Failure of Web-Cracked Plate Girder

The presence of a vertical fatigue crack in the web of a plate girder subjected to pure bending influences the bending moment capacity of the girder. The growth of the crack may lead to premature elastic failure due to flange local yielding, flange local buckling, or web local buckling. Approximate expressions for the bending moment capacities corresponding to these failure modes were formulated. Finite element analyses were then used to validate the expressions. The expressions were employed to assess the effects of crack length on the capacity. Neglecting brittle fracture, tension buckling, and ductile failure modes, it was found that typical girders are governed by the capacity associated with flange local yielding as influenced by the crack. In conclusion, a possible use of the capacity expressions in girder design was demonstrated.

LFC Design of a Deregulated Power System with TCPS Using PSO

In the LFC problem, the interconnections among areas are inputs for disturbances, and it is therefore important to suppress the disturbances through the coordination of governor systems. In contrast, tie-line power flow control by a TCPS located between two areas makes it possible to positively stabilize system frequency oscillations through the interconnection, which is also expected to provide a new ancillary service for future power systems. Thus, a control strategy based on controlling the phase angle of the TCPS is proposed in this paper to provide an active control facility for the system frequency. In addition, the optimal adjustment of the PID controller's parameters in a robust way, under a bilateral contracted scenario following large step load demands and disturbances with and without TCPS, is investigated using Particle Swarm Optimization (PSO), which has a strong ability to find near-optimal results. This newly developed control strategy combines the advantages of PSO and TCPS and has a simple structure that is easy to implement and tune. To demonstrate the effectiveness of the proposed control strategy, a three-area restructured power system is considered as a test system under different operating conditions and system nonlinearities. Analysis reveals that the TCPS is quite capable of suppressing frequency and tie-line power oscillations effectively, compared to the results obtained without TCPS, for a wide range of plant parameter changes, area load demands, and disturbances, even in the presence of system nonlinearities.
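
The optimization machinery can be sketched with a minimal global-best PSO loop tuning a three-gain vector [Kp, Ki, Kd]. This is not the authors' setup: in the paper the cost would come from simulating the three-area LFC model (e.g. an integral error criterion on frequency and tie-line deviations), and the inertia/acceleration coefficients and bounds below are common textbook values, not reported parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso(cost, dim, n_particles=20, iters=100, bounds=(0.0, 10.0)):
    """Minimal global-best PSO over a box-constrained gain vector."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # e.g. [Kp, Ki, Kd] per particle
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_cost = np.array([cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1 = rng.random(x.shape)
        r2 = rng.random(x.shape)
        # inertia 0.7 and acceleration 1.5 are illustrative textbook values
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                # keep gains inside the box
        c = np.array([cost(p) for p in x])
        better = c < pbest_cost
        pbest[better] = x[better]
        pbest_cost[better] = c[better]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, float(pbest_cost.min())
```

With the LFC simulation plugged in as `cost`, `gbest` would be the tuned PID gain set.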

Low-Cost Nano-Membrane Fabrication and Electro-Polishing System

This paper presents the development of a low-cost nano-membrane fabrication system. The system is specially designed for anodic aluminum oxide membranes and is capable of performing processes such as anodization and electro-polishing. The designed machine was successfully tested with 'mild anodization' (MA) for 48 hours and 'hard anodization' (HA) for 3 hours at a constant 0 °C. The system is digitally controlled and regulated to maintain temperature during anodization and electro-polishing. The total cost of the developed machine is 20 times less than that of the multi-cooling systems available on the market which are generally used for this purpose.

Detection and Quantification of Ozone in Screen Printing Facilities

Most often, contaminants are not taken seriously into consideration, a behavior that stems directly from the lack of monitoring and professional reporting on pollution in printing facilities in Serbia. The goal of planned and systematic ozone measurements in the ambient air of screen printing facilities in Novi Sad is to examine its impact on employees' health and to track trends in concentration. In this study, ozone concentrations were determined using discontinuous and continuous methods during the automatic and manual screen printing processes. The obtained results indicate that the average ozone concentrations measured during the automatic process were almost 3 to 28 times higher for the discontinuous method and 10 times higher for the continuous method (1.028 ppm) compared to the values prescribed by OSHA. In the manual process, average ozone concentrations were within the prescribed values for the discontinuous method and almost 3 times higher for the continuous method (0.299 ppm).

Analysis of Linear Equalizers for Cooperative Multi-User MIMO Based Reporting System

In this paper, we consider a multi-user multiple-input multiple-output (MU-MIMO) based cooperative reporting system for a cognitive radio network. In the reporting network, the secondary users forward the primary user data to a common fusion center (FC). The FC is equipped with linear equalizers and an energy detector to make the decision about the spectrum. The primary user data are considered to be a digital video broadcasting - terrestrial (DVB-T) signal. The sensing channel and the reporting channel are assumed to be additive white Gaussian noise and independent identically distributed Rayleigh fading channels, respectively. We analyze the detection probability of the MU-MIMO system with linear equalizers and arrive at a closed-form expression for the average detection probability. The system performance is also investigated under various MIMO scenarios through Monte Carlo simulations.

Genetic Algorithm for Solving Non-Convex Economic Dispatch Problem

Economic dispatch (ED) is considered one of the key functions in electric power system operation. This paper presents a new hybrid approach to economic dispatch problems based on a genetic algorithm (GA). The GA is a widely used optimization algorithm predicated on the principles of natural evolution. Using a chaotic queue with the GA generates several neighborhoods of near-optimal solutions to maintain solution variation, which helps prevent the search process from converging prematurely. For chaotic queue generation, using the tent equation instead of the logistic equation improves iterative speed. The results of the proposed approach were compared, in terms of fuel cost, with differential evolution and other methods in the literature.
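
The chaotic-queue generation can be sketched as follows. The control parameter and seed are illustrative assumptions, not values from the paper; the point is that the tent equation needs only one comparison and one multiplication per value, versus the logistic map's two multiplications.

```python
def tent_map(x, mu=1.99):
    """Tent equation: a piecewise-linear chaotic map on (0, 1).

    mu slightly below 2 keeps the orbit chaotic while avoiding the
    finite-precision collapse to 0 that occurs with mu = 2 exactly.
    """
    return mu * x if x < 0.5 else mu * (1.0 - x)

def chaotic_queue(x0=0.37, n=10):
    """Generate a queue of n chaotic values, e.g. to perturb a near-optimal
    dispatch solution into several neighboring candidate solutions."""
    seq, x = [], x0
    for _ in range(n):
        x = tent_map(x)
        seq.append(x)
    return seq
```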

Extraction of Semantic Digital Signatures from MRI Photos for Image-Identification Purposes

This paper attempts to solve the problem of searching for and retrieving similar MRI images via Internet services using morphological features derived from the original image. The study aims to serve as an additional tool for search and retrieval methods; until now, the main search mechanism has been syntactic, based on keywords. The proposed technique aims to serve the new requirements of libraries, one of which is the development of computational tools for the control and preservation of the intellectual property of digital objects, especially digital images. For this purpose, this paper proposes the use of a serial number extracted using a previously tested semantic-properties method. This method, centered on multiple layers of a set of arithmetic points, assures the following two properties: the uniqueness of the final extracted number and the semantic dependence of this number on the image used as the method's input. The major advantage of the method is that it can verify, to a reliable degree, the authentication of a published image or detect its partial modification. It also improves on the well-known hash functions used in digital signature schemes, producing alphanumeric strings for authentication checking as well as a degree of similarity between an unknown image and an original image.

Speckle Reducing Contourlet Transform for Medical Ultrasound Images

Speckle noise affects all coherent imaging systems, including medical ultrasound. In medical images, noise suppression is a particularly delicate and difficult task: a tradeoff between noise reduction and the preservation of actual image features has to be made in a way that enhances the diagnostically relevant image content. Even though wavelets have been extensively used for denoising speckled images, we have found that denoising using contourlets gives much better performance in terms of SNR, PSNR, MSE, variance, and correlation coefficient. The objective of the paper is to determine the number of levels of Laplacian pyramidal decomposition, the number of directional decompositions to perform at each pyramidal level, and the thresholding schemes that yield optimal despeckling of medical ultrasound images in particular. In the proposed method, the log-transformed original ultrasound image is subjected to the contourlet transform to obtain contourlet coefficients. The transformed image is denoised by applying thresholding techniques to the individual bandpass subbands using a Bayes shrinkage rule. We quantify the achieved performance improvement.
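
The per-subband thresholding step can be sketched generically as follows. This is the standard BayesShrink soft-thresholding rule applied to one subband's coefficients, not code from the paper; the contourlet transform itself (Laplacian pyramid plus directional filter bank) is assumed to be provided by an external library.

```python
import numpy as np

def bayes_shrink_threshold(coeffs, sigma_noise):
    """BayesShrink threshold T = sigma_n^2 / sigma_x for one subband,
    where sigma_x^2 is the signal variance estimated as the observed
    coefficient variance minus the noise variance."""
    var_y = float(np.mean(coeffs ** 2))
    var_x = max(var_y - sigma_noise ** 2, 1e-12)   # guard against negatives
    return sigma_noise ** 2 / np.sqrt(var_x)

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero by t, zeroing those below t."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

def estimate_sigma(finest_subband):
    """Robust median estimator of the noise level from the finest subband."""
    return float(np.median(np.abs(finest_subband))) / 0.6745
```

Applying `soft_threshold(subband, bayes_shrink_threshold(subband, sigma))` to each bandpass subband, then inverting the transform and exponentiating, would complete the log-domain despeckling pipeline.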

Analysis and Design of a Novel Active Soft Switched Phase-Shifted Full Bridge Converter

This paper proposes an active soft-switching circuit for bridge converters aiming to improve the power conversion efficiency. The proposed circuit achieves lossless switching for both main and auxiliary switches without increasing the main switch current/voltage rating. A winding coupled to the primary of the power transformer ensures ZCS for the auxiliary switches during their turn-off. A 350 W, 100 kHz phase-shifted full bridge (PSFB) converter is built to validate the analysis and design. Theoretical loss calculations for the proposed circuit are presented. The proposed circuit is compared with a passive soft-switched PSFB in terms of efficiency and loss in duty cycle.

Edge Detection with the Parametric Filtering Method (Comparison with Canny Method)

In this paper, a new method of image edge detection and characterization is presented. The "Parametric Filtering" method uses a judiciously defined filter, which preserves the correlation structure of the input signal in the autocorrelation of the output. This makes it possible to follow the evolution of the image correlation structure, as well as various distortion measures that quantify the deviation between two zones of the signal (the two Hamming signals), for the detection of an image edge.

Titania and Cu-Titania Composite Layer on Graphite Substrate as Negative Electrode for Li-Ion Battery

This research studies the application of an immobilized TiO2 layer and a Cu-TiO2 layer on a graphite substrate as a negative electrode (anode) for a Li-ion battery. The titania layer was produced by the chemical bath deposition method, while the Cu particles were deposited electrochemically. A material can be used as an electrode if it has the capability to intercalate Li ions into its crystal structure. The Li intercalation into TiO2/graphite and Cu-TiO2/graphite was analyzed from the changes in the XRD patterns after use as electrodes during the discharging process. The XRD patterns were refined by the Le Bail method in order to determine the crystal structure of the prepared materials. Specific capacity and cycle-ability measurements were carried out to study the performance of the prepared materials as the negative electrode of the Li-ion battery. The specific capacity was measured during the discharging process from fully charged until the cut-off voltage; a 300 load was used. The results show that the specific capacity of the Li-ion battery with TiO2/graphite as the negative electrode is 230.87 ± 1.70 mAh·g⁻¹, which is higher than that of the Li-ion battery with pure graphite as the negative electrode, i.e., 140.75 ± 0.46 mAh·g⁻¹. Meanwhile, deposition of Cu onto the TiO2 layer does not increase the specific capacity; the value is even lower than that of the battery with TiO2/graphite as the electrode. The cycle ability of the prepared battery is only two cycles, because the Li ribbon used as the cathode became fragile and easily broken.

Characterization of Corn Cobs from Microwave and Potassium Hydroxide Pretreatment

The complexity of lignocellulosic biomass requires a pretreatment step to improve the yield of fermentable sugars. The efficient pretreatment of corn cobs using microwave irradiation and potassium hydroxide, followed by enzymatic hydrolysis, was investigated. The objective of this work was to characterize the optimal conditions for pretreating corn cobs with microwave and potassium hydroxide to enhance enzymatic hydrolysis. Corn cobs were submerged in different potassium hydroxide concentrations at various temperatures and residence times. The pretreated corn cobs were hydrolyzed to produce reducing sugars for analysis. The morphology and microstructure of the samples were investigated by thermogravimetric analysis (TGA), scanning electron microscopy (SEM), and X-ray diffraction (XRD). The results showed that lignin and hemicellulose were removed by the microwave/potassium hydroxide pretreatment. The crystallinity of the pretreated corn cobs was higher than that of the untreated material. The method was compared with autoclave and conventional heating methods. The results indicated that microwave-alkali treatment is an efficient way to improve the enzymatic hydrolysis rate by increasing the accessibility of the substrate to hydrolysis enzymes.

Hierarchies Based on the Number of Cooperating Systems of Finite Automata on Four-Dimensional Input Tapes

In theoretical computer science, the Turing machine has played a number of important roles in understanding and exploiting basic concepts and mechanisms in computing and information processing [20]. It is a simple mathematical model of computers [9]. Later, M. Blum and C. Hewitt first proposed two-dimensional automata as a computational model of two-dimensional pattern processing and investigated their pattern recognition abilities in 1967 [7]. Since then, many researchers in this field have investigated properties of automata on two- or three-dimensional tapes. On the other hand, the question of whether processing four-dimensional digital patterns is much more difficult than processing two- or three-dimensional ones is of great interest from both theoretical and practical standpoints. Thus, the study of four-dimensional automata as a computational model of four-dimensional pattern processing has been meaningful [8]-[19], [21]. This paper introduces a cooperating system of four-dimensional finite automata as one model of four-dimensional automata. A cooperating system of four-dimensional finite automata consists of a finite number of four-dimensional finite automata and a four-dimensional input tape on which these finite automata work independently (in parallel). Finite automata whose input heads scan the same cell of the input tape can communicate with each other; that is, every finite automaton is allowed to know the internal states of the other finite automata on the cell it is scanning at the moment. In this paper, we mainly investigate the accepting powers of cooperating systems of eight-way and seven-way four-dimensional finite automata. A seven-way four-dimensional finite automaton is an eight-way four-dimensional finite automaton whose input head can move east, west, south, north, up, down, or in the future, but not in the past, on a four-dimensional input tape.

Improved IDR(s) Method for Gaining Very Accurate Solutions

The IDR(s) method, based on an extended IDR theorem, was proposed by Sonneveld and van Gijzen. The original IDR(s) method has excellent properties compared with conventional iterative methods in terms of efficiency and small memory requirements. The IDR(s) method, however, has the unexpected property that the relative residual 2-norm stagnates at a level of less than 10^-12. In this paper, an effective strategy for stagnation detection, stagnation avoidance using adaptive information on the parameter s, and an improvement of the convergence rate of the IDR(s) method itself are proposed in order to obtain highly accurate approximate solutions with IDR(s). Through numerical experiments, the effectiveness of the adaptively tuned IDR(s) method is verified and demonstrated.
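
One simple way to realize the stagnation-detection idea is to flag stagnation when the relative residual norm has stopped decreasing over a sliding window. This is a generic residual-history check, not the authors' exact criterion; the window length and reduction factor are illustrative choices.

```python
def detect_stagnation(residual_history, window=5, factor=0.9):
    """Flag stagnation when the most recent relative residual 2-norm has
    not dropped below `factor` times its value `window` iterations ago.

    residual_history: list of relative residual 2-norms, one per iteration.
    Returns False while there is too little history to judge.
    """
    if len(residual_history) <= window:
        return False
    return residual_history[-1] > factor * residual_history[-1 - window]
```

A solver loop would call this each iteration and, on a positive result, trigger the avoidance strategy (e.g. adapting s) instead of iterating further at a flat residual level.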

Bottom Up Text Mining through Hierarchical Document Representation

Most existing text mining approaches were designed with the transaction database model in mind. Thus, the mined dataset is structured using just one concept, the "transaction", whereas the whole dataset is modeled using the "set" abstract type. In such cases, the structure of the whole dataset and the relationships among the transactions themselves are not modeled and, consequently, not considered in the mining process. We believe that taking the structural properties of hierarchically structured information (e.g., textual documents) into account in the mining process can lead to better results. For this purpose, a hierarchical association rule mining approach for textual documents is proposed in this paper, and the classical set-oriented mining approach is reconsidered in favor of a Directed Acyclic Graph (DAG) oriented approach. Natural language processing techniques are used to obtain the DAG structure. Based on this graph model, a hierarchical bottom-up algorithm is proposed, whose main idea is that each node is mined together with its parent node.
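
The bottom-up aggregation idea can be sketched as a term-count propagation over a document graph: each node's counts are folded into its ancestors before mining. This is a simplified illustration, not the paper's algorithm; the node names, term counts, and the choice to propagate raw counts are all assumptions.

```python
from collections import defaultdict

def bottom_up_counts(children, term_counts):
    """Propagate term occurrence counts bottom-up through a document graph.

    children: node -> list of child nodes (e.g. document -> sections -> sentences).
    term_counts: node -> {term: count} for nodes that carry text directly.
    Returns a function mapping a node to its aggregated counts, so each
    node can be mined in the context of the subtree it dominates.
    """
    memo = {}
    def counts(node):
        if node in memo:
            return memo[node]
        agg = defaultdict(int)
        for term, n in term_counts.get(node, {}).items():
            agg[term] += n
        for child in children.get(node, []):
            for term, n in counts(child).items():
                agg[term] += n
        memo[node] = dict(agg)
        return memo[node]
    return counts
```

Memoization makes the traversal linear in the number of edges, so shared substructure in the graph is aggregated only once per node.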

BIP-Based Alarm Declaration and Clearing in SONET Networks Employing Automatic Protection Switching

The paper examines the performance of bit-interleaved parity (BIP) methods in error rate monitoring, and in declaration and clearing of alarms in those transport networks that employ automatic protection switching (APS). The BIP-based error rate monitoring is attractive for its simplicity and ease of implementation. The BIP-based results are compared with exact results and are found to declare the alarms too late, and to clear the alarms too early. It is concluded that the standards development and systems implementation should take into account the fact of early clearing and late declaration of alarms. The window parameters defining the detection and clearing thresholds should be set so as to build sufficient hysteresis into the system to ensure that BIP-based implementations yield acceptable performance results.
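
For illustration, the BIP-8 calculation used in SONET overhead reduces to an XOR of all covered bytes, since each of the eight bit positions carries even parity over the corresponding bit of every payload byte. This is the standard definition of BIP-8; the sketch below is not code from the paper.

```python
from functools import reduce

def bip8(payload: bytes) -> int:
    """BIP-8 over a byte stream: bit i of the result gives even parity
    over bit i of every payload byte, i.e. the XOR of all payload bytes."""
    return reduce(lambda acc, b: acc ^ b, payload, 0)

def bip8_violations(payload: bytes, received_bip: int) -> int:
    """Count of parity-check failures (0..8) against a received BIP-8 byte,
    the per-frame quantity a monitor accumulates to estimate the error rate."""
    return bin(bip8(payload) ^ received_bip).count("1")
```

Because each bit position reports only the parity of all errors in that position, an even number of errors in one position cancels out, which is one source of the under-counting that leads BIP-based monitors to clear alarms early.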