Application of Multi-Dimensional Principal Component Analysis to Medical Data

Multi-dimensional principal component analysis (PCA) extends PCA, a technique widely used for dimensionality reduction in multivariate data analysis, to multi-dimensional (tensor) data. PCA is commonly computed via the singular value decomposition (SVD) because of its numerical stability. Multi-dimensional PCA can likewise be computed with the higher-order SVD (HOSVD) proposed by De Lathauwer et al., analogously to the ordinary PCA case. In this paper, we apply multi-dimensional PCA to multi-dimensional medical data that include the functional independence measure (FIM) score, and report the results of an experimental analysis.
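As a minimal illustration of the eigen-decomposition machinery behind ordinary PCA (not the HOSVD itself), the leading principal component of a small data set can be found by power iteration on the covariance matrix; the data below are invented for the sketch:

```python
# Minimal PCA sketch: leading principal component via power iteration
# on the sample covariance matrix (a stand-in for the SVD computation).
import math

def leading_component(data, iters=200):
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    # Sample covariance matrix C = X^T X / (n - 1)
    cov = [[sum(centered[i][a] * centered[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]  # renormalize each iteration
    return v

data = [[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2],
        [3.1, 3.0], [2.3, 2.7], [2.0, 1.6], [1.0, 1.1]]
pc1 = leading_component(data)
```

For strongly correlated 2-D data like this, the first component points along the common trend of the two variables.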

Monitoring Patents Using the Statistical Process Control

Statistical process control (SPC), one of the most powerful tools developed to support effective quality control, involves collecting, organizing, and interpreting data during production. This article shows how industries can use SPC to control and continuously improve product quality by monitoring production, detecting deviations in the parameters that represent the process, and thereby reducing the amount of off-specification product and the cost of production. The study conducts a technological forecast in order to characterize the research being done on SPC. The survey was carried out in the Espacenet and WIPO databases and at the National Institute of Industrial Property (INPI). The United States is among the largest applicant countries, many applications are filed via the PCT, and classification section F is the most frequently represented.
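A core SPC computation is the X-bar control chart: subgroup means are compared against limits derived from the average range. The sketch below uses invented measurements and the standard A2 = 0.577 chart constant for subgroups of size 5:

```python
# X-bar control chart sketch (invented data): the last subgroup is
# deliberately shifted so it falls outside the control limits.
subgroups = [
    [10.1, 9.9, 10.0, 10.2, 9.8],
    [10.0, 10.1, 9.9, 10.0, 10.1],
    [9.8, 10.0, 10.2, 9.9, 10.0],
    [10.1, 10.0, 9.9, 10.0, 10.2],
    [9.9, 10.1, 10.0, 10.0, 10.1],
    [10.6, 10.7, 10.5, 10.6, 10.7],  # shifted process
]
A2 = 0.577  # standard chart constant for subgroup size n = 5

xbars = [sum(g) / len(g) for g in subgroups]       # subgroup means
ranges = [max(g) - min(g) for g in subgroups]      # subgroup ranges
xbarbar = sum(xbars) / len(xbars)                  # center line
rbar = sum(ranges) / len(ranges)                   # average range
ucl = xbarbar + A2 * rbar                          # upper control limit
lcl = xbarbar - A2 * rbar                          # lower control limit
out_of_control = [i for i, x in enumerate(xbars) if x > ucl or x < lcl]
```

Points outside [LCL, UCL] signal a deviation in the process parameters, which is exactly the detection step the abstract describes.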

Analyzing and Comparing the Architectural Specifications and the Urban Role of Scientific–Technological Parks in Iran and the World

Scientific–technological parks have been established in several countries, especially Western ones, over the past few decades, and their efficiency is still under examination. In Iran, several scientific–technological parks have been established or are being established. This study evaluates the urban role and architectural design of these parks in order to assess their efficiency and, as far as possible, to offer suggestions for improving how such parks are built in Iran. The main question is to what extent the parks in Iran meet international standards. To that end, one scientific park in Iran and one in a Western country are studied and compared.

Simulation Modeling of Fire Station Locations under Traffic Obstacles

The facility location problem involves locating a facility so as to optimize some performance measure. The location of a public facility that serves the community, such as a fire station, significantly affects its service quality. The main objective in locating a fire station is to minimize the response time, i.e., the time between receiving a call and reaching the place of the incident. In metropolitan areas, fire vehicles must cross highways and other traffic obstacles through a limited number of obstacle-overcoming points, which delays the response. In this paper, the fire station location problem is analyzed. Simulation models are developed for location problems that involve such obstacles, particular case problems are analyzed, and the results are presented.

Statistics of Exon Lengths in Animals, Plants, Fungi, and Protists

Eukaryotic protein-coding genes are interrupted by spliceosomal introns, which are removed from the RNA transcripts before translation into a protein. The exon-intron structures of different eukaryotic species are quite different from each other, and the evolution of such structures raises many questions. We try to address some of these questions using statistical analysis of whole genomes. We go through all the protein-coding genes in a genome and study correlations between the net length of all the exons in a gene, the number of exons, and the average length of an exon. We also take average values of these features for each chromosome and study correlations between those averages on the chromosomal level. Our data show universal features of exon-intron structures common to animals, plants, fungi, and protists (specifically, Arabidopsis thaliana, Caenorhabditis elegans, Drosophila melanogaster, Cryptococcus neoformans, Homo sapiens, Mus musculus, Oryza sativa, and Plasmodium falciparum). We verified a linear correlation between the number of exons in a gene and the length of the protein encoded by that gene: protein length increases in proportion to the number of exons. On the other hand, the average length of an exon always decreases as the number of exons grows. Finally, chromosome clustering based on average chromosome properties and on the parameters of the linear regression between the number of exons in a gene and the net length of those exons demonstrates that these average chromosome properties are genome-specific features.
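The linear-correlation step can be sketched with an ordinary least-squares fit; the exon counts and protein lengths below are invented toy numbers, not the genomic data of the study:

```python
# Ordinary least squares for the exon-count vs. protein-length relation
# described above (toy data; a real analysis would scan a whole genome).
def linfit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx  # slope and intercept

exon_counts = [1, 2, 3, 4, 5, 6, 8, 10]
protein_lengths = [120, 210, 330, 405, 540, 610, 820, 1010]  # residues
slope, intercept = linfit(exon_counts, protein_lengths)
```

A clearly positive slope corresponds to the reported proportionality between exon number and protein length.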

Drag Analysis of an Aircraft Wing Model with and without Bird Feather like Winglet

This work describes the aerodynamic characteristics of an aircraft wing model with and without a bird-feather-like winglet. The aerofoil used to construct the whole structure is the NACA 653-218 on a rectangular wing, and this aerofoil was chosen so the results could be compared with previous winglet research. The model of the rectangular wing with the bird-feather-like winglet was first fabricated in polystyrene, then designed in CATIA P3 V5R13 software, and finally fabricated in wood. The experimental analysis of the aerodynamic characteristics of the rectangular wing without a winglet, with a horizontal winglet, and with a winglet inclined at 60 degrees, for Reynolds numbers of 1.66×10^5, 2.08×10^5 and 2.50×10^5, was carried out in an open-loop low-speed wind tunnel at the Aerodynamics Laboratory of Universiti Putra Malaysia. The experimental results show a 25-30% reduction in drag coefficient and a 10-20% increase in lift coefficient with the bird-feather-like winglet at an angle of attack of 8 degrees.
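The drag reduction is expressed through the drag coefficient, C_D = 2D / (ρ v² S). A sketch with entirely invented force and geometry values (the paper's actual measurements are not reproduced here):

```python
# Converting a measured drag force into a drag coefficient,
# C_D = 2*D / (rho * v^2 * S); all numbers below are invented.
rho = 1.225        # air density at sea level, kg/m^3
v = 25.0           # freestream velocity, m/s (assumed)
S = 0.06           # wing planform area, m^2 (assumed model size)
drag_force = 0.85  # measured drag, N (invented)

c_d = 2 * drag_force / (rho * v ** 2 * S)
reduction = 0.275  # mid-range of the reported 25-30 % reduction
c_d_winglet = c_d * (1 - reduction)
```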

Efficacy of Anti-phishing Measures and Strategies - A Research Analysis

Statistics indicate that more than 1,000 phishing attacks are launched every month. With 57 million people hit by the fraud in America alone, how do we combat phishing? This publication discusses strategies in the war against phishing. The study examines and critiques the approaches adopted at various levels to counter the crescendo of phishing attacks, along with the new techniques being adopted for the same purpose. It analyzes the measures taken by popular mail servers and popular browsers. The work intends to increase the understanding and awareness of internet users across the globe, and discusses plausible countermeasures at both the user's and the developer's end. This conceptual paper will contribute to future research on similar topics.

Negative Emotions and Ways of Overcoming Them in Prison

The aim of this paper is to describe how prisoners perceive death and the ways they deal with it. They express indifference and coldness, are unable to accept blame, and show neither shame nor empathy; for them it is enough to perform acts verging on death. In this paper we describe the mechanisms and regularities of self-destructive behaviour in the light of the relevant literature. The explanation of the phenomenon is of a biological and socio-psychological nature. It must be clearly stated that all forms of self-destructive behaviour result from various impulses, conflicts and deficits; that is why they should be treated differently in terms of the motivations behind them and the functions they perform in a given group of people. Behind self-destruction there seems to be a motivational mechanism that forces prisoners to rebel and fight against the hated law and penitentiary systems. The imprisoned believe that the pain and suffering they inflict on themselves are better than passive acceptance of repression. The variety of self-destructive acts is wide, and some of them take strange forms. We assume that the life-death barrier is a kind of game for them: if they cannot change the degrading situation, their life loses its sense.

Biokinetics of the Coping Mechanism of Freshwater Tilapia Following Exposure to Waterborne and Dietary Copper

The purpose of this study was to identify the main sources of copper (Cu) accumulation in the target organs of tilapia (Oreochromis mossambicus) and to investigate how the organism mediates Cu accumulation under prolonged exposure. By measuring both dietary and waterborne Cu accumulation and total concentrations in tilapia with a biokinetic modeling approach, we were able to clarify the biokinetic coping mechanisms behind long-term Cu accumulation. The study showed that water and food are both major sources of Cu for the muscle and liver of tilapia, implying that controlling the Cu concentration in these two routes governs Cu bioavailability for tilapia. We found that the duration and level of waterborne Cu exposure drove Cu accumulation in tilapia, and that the Cu uptake and depuration abilities of tilapia organs were actively mediated under prolonged exposure. In general, the uptake rate, depuration rate and net bioaccumulation ability of all selected organs decreased with increasing waterborne Cu levels and longer exposure. Muscle tissue accounted for over 50% of the total accumulated Cu and played a key role in buffering the Cu burden in the initial period of exposure, whereas the liver played a more important role in Cu storage as exposure was extended. We conclude that assuming constant biokinetic rates can lead to incorrect predictions that overestimate long-term Cu accumulation in ecotoxicological risk assessments.
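A standard one-compartment biokinetic model illustrates the point about constant rates: with fixed uptake rate k_u and depuration rate k_e, dC/dt = k_u·C_w − k_e·C drives the tissue concentration toward the steady state k_u·C_w/k_e, so if the real rates decline over time the constant-rate model overshoots. All rate values below are invented for the sketch:

```python
# One-compartment biokinetic sketch with invented, constant rates:
#   dC/dt = k_u * C_w - k_e * C
k_u = 0.5   # uptake rate constant, L/g/day (assumed)
k_e = 0.05  # depuration rate constant, 1/day (assumed)
c_w = 0.02  # waterborne Cu concentration, mg/L (assumed)

dt, days = 0.1, 200.0
c = 0.0
for _ in range(int(days / dt)):        # explicit Euler integration
    c += (k_u * c_w - k_e * c) * dt

steady_state = k_u * c_w / k_e         # long-term plateau, mg/g
```

With constant rates the simulated burden plateaus at the steady state; actively down-regulated uptake, as reported in the abstract, would end below it.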

Mixed Convection in a 2D-Channel with a Co-Flowing Fluid Injection: Influence of the Jet Position

A numerical study of a plane jet issuing into a vertical heated channel is carried out. The aim is to explore the influence of the forced flow, issued from a flat nozzle located in the entry section of the channel, on the fluid rising along the channel walls. The Reynolds number, based on the nozzle width and the jet velocity, ranges between 3×10^3 and 2×10^4, whereas the Grashof number, based on the channel length and the wall temperature difference, is 2.57×10^10. Computations are carried out for a symmetrically heated channel and various nozzle positions. The system of governing equations is solved with a finite volume method. The results show that jet-wall interactions enhance the heat transfer, and that varying the nozzle position modifies it, especially at low Reynolds numbers: heat transfer is enhanced on the adjacent wall but decreased on the opposite one. The numerical velocity and temperature fields are post-processed to compute quantities of engineering interest, such as the induced mass flow rate and the Nusselt number along the plates.

Self Organizing Mixture Network in Mixture Discriminant Analysis: An Experimental Study

In recent works on mixture discriminant analysis (MDA), the expectation-maximization (EM) algorithm is used to estimate the parameters of Gaussian mixtures. However, the final parameter estimates depend on the initial values given to the EM algorithm: when EM is run twice on the same data set, it can give different parameter estimates, which affects the classification accuracy of MDA. To overcome this problem, we use the Self-Organizing Mixture Network (SOMN) algorithm to estimate the parameters of the Gaussian mixtures in MDA, since SOMN is more robust when the initial parameter values are chosen at random [5]. We show the effectiveness of this method on the popular simulated waveform data sets and on a real glass data set.
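The initialization issue can be seen in a minimal 1-D two-component EM implementation (this toy code is not the paper's MDA or SOMN; data and starting points are invented):

```python
# Minimal EM for a two-component 1-D Gaussian mixture, illustrating
# dependence on starting values.  Different starts need not agree.
import math
import random

def em_1d(data, mu0, mu1, iters=50):
    w, mu, var = [0.5, 0.5], [mu0, mu1], [1.0, 1.0]
    for _ in range(iters):
        # E-step: posterior responsibilities of each component
        resp = []
        for x in data:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return sorted(mu)

random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(200)]
        + [random.gauss(5.0, 1.0) for _ in range(200)])
means_a = em_1d(data, -1.0, 6.0)  # well-separated start
means_b = em_1d(data, 2.4, 2.6)   # near-degenerate start; may differ
```

With the well-separated start, EM recovers component means near the true 0 and 5; a poor start gives the local-optimum sensitivity the abstract describes.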

Use of Heliox during Spontaneous Ventilation: Model Study

The study deals with modelling of gas flow during heliox therapy. A special model has been developed to study the effect of helium on gas flow in the airways during spontaneous breathing. The lower density of helium compared with air decreases the Reynolds number, which improves the flow during spontaneous breathing: in cases where the flow becomes turbulent when the patient inspires air, it remains laminar when the patient inspires heliox. The use of heliox decreases the work of breathing and improves ventilation, and in some cases it makes it possible to avoid intubating the patient.
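The density effect can be sketched with the Reynolds number Re = ρvd/μ. Gas properties below are textbook-order approximations and the airway velocity and diameter are assumed, not taken from the study:

```python
# Reynolds number in a large airway for air vs. heliox (~80 % He / 20 % O2).
# Property values are approximate; velocity and diameter are assumed.
def reynolds(rho, v, d, mu):
    return rho * v * d / mu

v = 1.5    # mean inspiratory velocity, m/s (assumed)
d = 0.018  # tracheal diameter, m (assumed)

re_air = reynolds(rho=1.20, v=v, d=d, mu=1.8e-5)     # ~1800, transitional
re_heliox = reynolds(rho=0.40, v=v, d=d, mu=2.0e-5)  # ~540, laminar
```

Heliox has roughly one third the density of air but slightly higher viscosity, so Re drops by about a factor of three and the same flow stays laminar.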

64-bit Computer Architectures for Space Applications – A Study

Recent satellite projects and programs make extensive use of real-time embedded systems. 16-bit processors that meet the MIL-STD-1750 standard architecture have been used in on-board systems, and most space applications have been written in Ada. Looking ahead, 32-bit and 64-bit processors are needed for spacecraft computing, so a study and survey of 64-bit architectures for space applications is desirable. This will also drive significant technology development in VLSI and in software tools for Ada (as the legacy code is in Ada). A processor for this purpose has several basic requirements: radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, higher memory and I/O bandwidth, predictability, a real-time operating system, and manufacturability. Further considerations include the selection of FPGA devices, the selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, timing analysis, etc. This project comprises a brief study of the 32-bit and 64-bit processors readily available on the market and the design and fabrication of a 64-bit RISC processor, named RISC MicroProcessor, with an extended double-precision floating-point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using open cores (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools, such as Xilinx ISE for synthesis, are also used where appropriate.

EGCL: An Extended G-Code Language with Flow Control, Functions and Mnemonic Variables

In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), programming-language capabilities such as symbolic and intuitive programming, program portability and a rich geometrical portfolio are especially important: they save time, help avoid errors during part programming, and permit code re-use. Our updated literature review indicates that the current state of the art leaves voids in parametric programming, program portability and programming flexibility. In response, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language that allows descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, ISO-compliant elementary G-code, allowing flexibility in the choice of the executing CNC machine and portability. Our results show that readable variable names and flow-control statements make part programming simpler and more intuitive and permit program re-use. Future work includes allowing programmers to define their own functions in EGCL, in contrast to the current status of having them only as built-in library functions.
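To illustrate the general idea of compiling mnemonic variables down to plain G-code (this toy substitution pass is not the authors' compiler, and the `let`/`#name` syntax is invented for the sketch):

```python
# Toy sketch: expand mnemonic variables in an EGCL-like source into
# plain ISO G-code by recording bindings and substituting their values.
import re

def expand(source):
    env, out = {}, []
    for line in source.strip().splitlines():
        line = line.strip()
        m = re.match(r"let\s+(\w+)\s*=\s*(-?\d+(?:\.\d+)?)", line)
        if m:
            env[m.group(1)] = m.group(2)  # record variable binding
            continue
        # replace each #name with its bound value
        out.append(re.sub(r"#(\w+)", lambda mm: env[mm.group(1)], line))
    return out

program = """
let feed = 120
let depth = -2.5
G01 Z#depth F#feed
G01 X10 Y10 F#feed
"""
gcode = expand(program)
```

The output contains only elementary, machine-neutral G-code moves, which is what gives the generated programs their portability.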

Direct Democracy and Social Contract in Ancient Athens

In the present essay, a model of choice by actors is analysed, utilizing chaos theory to explain how change comes about. Then, using ancient and modern sources, the theory of the social contract is analysed as a historical phenomenon that first appeared during the Classical period of Greece. Based on the findings of this analysis, the practice of direct democracy and public choice in ancient Athens is examined through two historical cases: the political programs of Eubulus and Lycurgus in the second half of the 4th century BC. The main finding of this research is that these policies can be interpreted as the implementation of a social contract, through which citizens took decisions based on rational choice according to economic considerations.

Analysis of Textual Data Based On Multiple 2-Class Classification Models

This paper proposes a new method for analyzing textual data in which each item is described from various viewpoints. The method acquires 2-class classification models for the viewpoints by applying an inductive learning method to items annotated with multiple viewpoints, and uses those models to infer whether each viewpoint applies to new items. It then extracts expressions from the new items classified into each viewpoint and identifies characteristic expressions for a viewpoint by comparing expression frequencies across viewpoints. The paper also applies the method to questionnaire data written by hotel guests and verifies its effect through numerical experiments.
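The frequency-comparison step can be sketched with a crude rule: a term is characteristic of a viewpoint if it recurs there and is rarer in every other viewpoint. The hotel-review snippets below are invented, and this rule is a simplified stand-in for the paper's actual comparison:

```python
# Simplified frequency comparison: pick terms that recur within one
# viewpoint's items and occur less often in all other viewpoints.
from collections import Counter

def characteristic_terms(docs_by_view):
    counts = {v: Counter(w for d in docs for w in d.split())
              for v, docs in docs_by_view.items()}
    result = {}
    for v in counts:
        result[v] = sorted(
            w for w, n in counts[v].items()
            if n >= 2 and all(counts[u][w] < n for u in counts if u != v))
    return result

views = {
    "room":    ["clean spacious room", "quiet room nice view"],
    "service": ["friendly staff", "helpful staff fast checkin"],
}
terms = characteristic_terms(views)
```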

High Energy Dual-Wavelength Mid-Infrared Extracavity KTA Optical Parametric Oscillator

A high energy dual-wavelength extracavity KTA optical parametric oscillator (OPO) with excellent stability and beam quality, which is pumped by a Q-switched single-longitudinal-mode Nd:YAG laser, has been demonstrated based on a type II noncritical phase matching (NCPM) KTA crystal. The maximum pulse energy of 10.2 mJ with the output stability of better than 4.1% rms at 3.467 μm is obtained at the repetition rate of 10 Hz and pulse width of 2 ns, and the 11.9 mJ of 1.535 μm radiation is obtained simultaneously. This extracavity NCPM KTA OPO is very useful when high energy, high beam quality and smooth time domain are needed.
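The two output wavelengths are tied together by photon-energy conservation in the OPO, 1/λ_pump = 1/λ_signal + 1/λ_idler. A quick check with the 1.064 µm Nd:YAG pump and the reported 1.535 µm signal reproduces the 3.467 µm idler:

```python
# Photon-energy conservation in an OPO:
#   1/lambda_pump = 1/lambda_signal + 1/lambda_idler
lambda_pump = 1.064    # um, Nd:YAG pump
lambda_signal = 1.535  # um, reported signal wavelength
lambda_idler = 1.0 / (1.0 / lambda_pump - 1.0 / lambda_signal)  # ~3.467 um
```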

The Stigma of Mental Illness and the Way of Destigmatization: The Effects of Interactivity and Self-Construal

Some believe that stigma is the worst side effect of mental illness. Researchers have focused on the influence of the mass media on the stigmatization of people with mental illness; however, no studies have investigated the effects of interactive media, such as blogs, on this stigmatization, even though these media significantly influence people in all areas of life. The purpose of this study is to investigate the use of interactivity in the destigmatization of the mentally ill and the moderating effect of self-construal (independent versus interdependent) on the relation between interactivity and destigmatization. The findings suggest that people in the human-human interaction condition kept less social distance from people with mental illness. Additionally, participants with higher independence showed more favorable affect toward, and less social distance from, mentally ill people. Finally, direct contact with mentally ill people increased a person's positive affect toward people with mental illness. The study should provide insights for mental health practitioners by suggesting how interactive media can be used to reach the public that stigmatizes the mentally ill.

Iterative Joint Power Control and Partial Crosstalk Cancellation in Upstream VDSL

Crosstalk is the major limiting factor in very high bit-rate digital subscriber line (VDSL) systems in terms of bit rate and service coverage. At the central office side, joint signal processing combined with appropriate power allocation enables complex multiuser processors to provide near-capacity rates. Unfortunately, complexity grows with the square of the number of lines within a binder; by taking into account that only a few dominant crosstalkers contribute most of the crosstalk power, the canceller structure can be simplified, resulting in much lower run-time complexity. In this paper, a multiuser power control scheme, namely iterative waterfilling, is combined with previously proposed partial crosstalk cancellation approaches, and simulation results verify that the combination achieves the best performance reported so far.
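Iterative waterfilling itself is simple to sketch: each user repeatedly waterfills its power budget against the other users' current crosstalk-plus-noise until the allocations settle. The two-user, four-tone channel below uses invented gains and a toy crosstalk model, not the VDSL parameters of the paper:

```python
# Two-user iterative waterfilling sketch (invented channel and coupling).
def waterfill(gains, noise, budget):
    # Bisect on the water level mu, where p_k = max(0, mu - noise_k/gain_k).
    lo = 0.0
    hi = budget + max(n / g for g, n in zip(gains, noise)) + 1.0
    for _ in range(100):
        mu = (lo + hi) / 2
        p = [max(0.0, mu - n / g) for g, n in zip(gains, noise)]
        if sum(p) > budget:
            hi = mu
        else:
            lo = mu
    return p

K = 4                       # number of tones
h = [[1.0, 0.8, 0.6, 0.4],  # direct channel gains, user 0 (invented)
     [0.5, 0.7, 0.9, 1.0]]  # direct channel gains, user 1 (invented)
x = 0.05                    # crosstalk coupling factor (invented)
n0 = 0.01                   # background noise power

p = [[0.25] * K, [0.25] * K]        # initial flat allocations
for _ in range(50):                 # iterate to (near) equilibrium
    for u in (0, 1):
        v = 1 - u
        noise = [n0 + x * h[v][k] * p[v][k] for k in range(K)]
        p[u] = waterfill(h[u], noise, budget=1.0)
```

Each inner step is a single-user waterfilling; alternating the two users converges to a competitive equilibrium when the crosstalk coupling is weak.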

Comparing Transformational Leadership in Successful and Unsuccessful Companies

In this article, the problem and its importance are first described, and transformational leadership is studied in light of leadership theories. Issues such as the definition of transformational leadership and its dimensions are compared on the basis of the views of various experts, and transformational leadership is then examined in successful and unsuccessful companies. The methodology section covers the research method, hypotheses, population and statistical sample, and the research findings are analyzed with descriptive and inferential statistical methods in analytical tables. Finally, conclusions are drawn from the results of the statistical tests. The final result shows that transformational leadership is significantly higher in successful companies than in unsuccessful ones P