Mixed Convection in a 2D Channel with a Co-Flowing Fluid Injection: Influence of the Jet Position

A numerical study of a plane jet developing in a vertical heated channel is carried out. The aim is to explore the influence of the forced flow, issuing from a flat nozzle located in the entry section of the channel, on the fluid rising along the channel walls. The Reynolds number, based on the nozzle width and the jet velocity, ranges between 3×10³ and 2×10⁴, while the Grashof number, based on the channel length and the wall temperature difference, is 2.57×10¹⁰. Computations are performed for a symmetrically heated channel and various nozzle positions. The system of governing equations is solved with a finite volume method. The results show that the jet-wall interactions enhance the heat transfer, and that varying the nozzle position modifies the heat transfer, especially at low Reynolds numbers: heat transfer is enhanced on the adjacent wall but reduced on the opposite one. The computed velocity and temperature fields are post-processed to obtain quantities of engineering interest, such as the induced mass flow rate and the Nusselt number along the plates.
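
As a rough illustration of the non-dimensional groups and post-processed quantities involved, the following is a minimal sketch with hypothetical fluid properties and a synthetic near-wall temperature profile (not the paper's solver or data):

```python
import numpy as np

# Hypothetical fluid properties and geometry (illustrative only)
nu_f  = 1.5e-5   # kinematic viscosity [m^2/s]
beta  = 3.4e-3   # thermal expansion coefficient [1/K]
g     = 9.81     # gravitational acceleration [m/s^2]
b     = 0.01     # nozzle width [m]
L     = 1.0      # channel length [m]
U_jet = 5.0      # jet velocity [m/s]
dT    = 40.0     # wall temperature difference [K]

Re = U_jet * b / nu_f                  # based on nozzle width and jet velocity
Gr = g * beta * dT * L**3 / nu_f**2    # based on channel length and dT

# Local Nusselt number from a (here synthetic) near-wall temperature profile:
# Nu = -(L / dT) * dT/dy evaluated at the wall, one-sided difference.
y = np.linspace(0.0, 0.05, 50)             # wall-normal coordinate [m]
T = 300.0 + dT * np.exp(-y / 0.005)        # hypothetical temperature profile
Nu_wall = -(L / dT) * (T[1] - T[0]) / (y[1] - y[0])
print(f"Re = {Re:.2e}, Gr = {Gr:.2e}, wall Nu = {Nu_wall:.1f}")
```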

Dust Storm Prediction Using the ANN Technique (A Case Study: Zabol City)

Dust storms are among the most costly and destructive events in many desert regions. They can cause massive damage to both natural environments and human lives. This paper presents a preliminary study of dust storms as a major natural hazard in arid and semi-arid regions. As a case study, dust storm events that occurred in Zabol city, located in the Sistan region of Iran, were analyzed in order to diagnose and predict dust storms. The identification and prediction of dust storm events could contribute significantly to damage reduction. Existing models for this purpose are complicated and not appropriate for the many areas with poor data coverage. The present study explores the Gamma test for identifying the inputs of an ANN model for dust storm prediction. Results indicate that further work is needed on identifying dust storms and discriminating between the various dust storm types.
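
For readers unfamiliar with the input-selection step, below is a minimal sketch of the Gamma test as commonly formulated (a nearest-neighbour noise estimate); the data, candidate inputs, and parameter choices are hypothetical, not the study's:

```python
import numpy as np
from scipy.spatial import cKDTree

def gamma_test(X, y, p=10):
    """Gamma statistic: regress gamma(k) on delta(k) over k = 1..p near
    neighbours; the intercept estimates the variance of the output noise,
    i.e. the best MSE any smooth model could reach with these inputs."""
    dist, idx = cKDTree(X).query(X, k=p + 1)       # column 0 is the point itself
    delta = (dist[:, 1:] ** 2).mean(axis=0)        # mean squared input distances
    gamma = 0.5 * ((y[idx[:, 1:]] - y[:, None]) ** 2).mean(axis=0)
    slope, intercept = np.polyfit(delta, gamma, 1)
    return intercept

# Hypothetical use: rank candidate input subsets for the ANN; a lower Gamma
# value suggests a more informative input combination.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                  # e.g. wind speed, humidity, pressure
y = X[:, 0] ** 2 + 0.1 * rng.normal(size=500)  # noise variance 0.01
print("Gamma estimate:", gamma_test(X, y))     # should be close to 0.01
```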

Self-Organizing Mixture Network in Mixture Discriminant Analysis: An Experimental Study

In recent work on mixture discriminant analysis (MDA), the expectation-maximization (EM) algorithm is used to estimate the parameters of Gaussian mixtures. However, the final parameter estimates of the EM algorithm depend on its initial values. Consequently, when the EM algorithm is run twice on the same data set, it can give different parameter estimates, which affects the classification accuracy of MDA. To overcome this problem, we use the Self-Organizing Mixture Network (SOMN) algorithm to estimate the parameters of the Gaussian mixtures in MDA, since SOMN is more robust when the initial values of the parameters are chosen at random [5]. We show the effectiveness of this method on the popular simulated waveform data set and the real glass data set.
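
The initialization sensitivity that motivates SOMN can be reproduced with any off-the-shelf EM implementation; the sketch below uses scikit-learn's GaussianMixture on synthetic data (illustrative only, not the paper's SOMN or experimental setup):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Two EM runs with different random starts can converge to different
# parameter estimates and likelihoods on the same data.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (200, 2)),
               rng.normal( 2, 1, (200, 2)),
               rng.normal( 5, 1, (200, 2))])

for seed in (0, 1):
    gmm = GaussianMixture(n_components=3, init_params="random",
                          n_init=1, max_iter=200, random_state=seed)
    gmm.fit(X)
    print(f"seed={seed}  avg log-likelihood={gmm.score(X):.4f}")
    print("  sorted component means (x):", np.round(np.sort(gmm.means_[:, 0]), 2))
```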

Use of Heliox during Spontaneous Ventilation: Model Study

The study deals with the modelling of gas flow during heliox therapy. A special model has been developed to study the effect of helium on the gas flow in the airways during spontaneous breathing. The lower density of helium compared with air decreases the Reynolds number, which improves the flow during spontaneous breathing. In cases where the flow becomes turbulent when the patient inspires air, the flow remains laminar when the patient inspires heliox. The use of heliox decreases the work of breathing and improves ventilation, and in some cases it makes it possible to avoid intubating the patient.
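
The Reynolds-number argument can be illustrated with a back-of-the-envelope calculation (hypothetical airway dimensions and approximate gas properties, not the model's actual parameters):

```python
# Heliox lowers gas density far more than it raises viscosity, so a flow
# that is transitional with air can stay laminar with heliox.
def reynolds(rho, v, d, mu):
    """Re = rho * v * d / mu for pipe-like flow."""
    return rho * v * d / mu

v, d = 3.0, 0.015                     # mean tracheal velocity [m/s], diameter [m]
air    = dict(rho=1.20, mu=1.8e-5)    # air at roughly room temperature
heliox = dict(rho=0.50, mu=2.1e-5)    # ~78% He / 22% O2, approximate values

for name, gas in [("air", air), ("heliox", heliox)]:
    re = reynolds(gas["rho"], v, d, gas["mu"])
    regime = "turbulence-prone" if re > 2300 else "laminar"
    print(f"{name:7s} Re = {re:6.0f}  ({regime})")
```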

64-bit Computer Architectures for Space Applications – A Study

Recent satellite projects and programs make extensive use of real-time embedded systems. 16-bit processors implementing the MIL-STD-1750 standard architecture have traditionally been used in on-board systems, and most space applications have been written in Ada. Looking ahead, 32-bit and 64-bit processors are needed for spacecraft computing, so a study and survey of 64-bit architectures for space applications is worthwhile. Such an effort will also drive significant technology development in VLSI and in software tools for Ada (as the legacy code is in Ada). A special-purpose processor of this kind has several basic requirements, including radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, high memory and I/O bandwidth, predictability, a real-time operating system, and manufacturability. Further considerations include the selection of FPGA devices, the selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, and timing analysis. This project comprises a brief study of the 32-bit and 64-bit processors readily available on the market, and the design and fabrication of a 64-bit RISC processor, named RISC MicroProcessor, with an extended double-precision floating-point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using an open core (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools such as Xilinx ISE are also used for synthesis where appropriate.

EGCL: An Extended G-Code Language with Flow Control, Functions and Mnemonic Variables

In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), programming-language capabilities such as symbolic and intuitive programming, program portability, and a geometrical portfolio are especially important. They save time, help avoid errors during part programming, and permit code re-use. Our updated literature review indicates that the current state of the art leaves gaps in parametric programming, program portability, and programming flexibility. In response, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language that allows the use of descriptive variable names, geometrical functions, and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus providing flexibility in the choice of the executing CNC machine and supporting portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit re-use of the programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status in which functions are built into the library.
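
To give the flavour of the translation involved (the loop structure and example values here are hypothetical, not taken from the paper), a compiler of this kind flattens mnemonic variables and flow control into elementary ISO G-code at compile time:

```python
# Minimal sketch of an EGCL-style expansion: a descriptive parameter set and
# a while loop become a flat sequence of plain G00/G01 blocks.
def compile_drill_row(hole_spacing, n_holes, depth, feed):
    """Emit elementary G-code for a row of drilled holes."""
    gcode, x, i = [], 0.0, 0
    while i < n_holes:                 # flow control exists only at compile time
        gcode.append(f"G00 X{x:.3f} Y0.000")        # rapid move to hole position
        gcode.append(f"G01 Z{-depth:.3f} F{feed}")  # feed down to depth
        gcode.append("G00 Z2.000")                  # retract
        x += hole_spacing
        i += 1
    return gcode

print("\n".join(compile_drill_row(hole_spacing=12.5, n_holes=3,
                                  depth=5.0, feed=100)))
```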

Direct Democracy and Social Contract in Ancient Athens

In the present essay, a model of choice by actors is analysed, utilizing chaos theory to explain how change comes about. Then, using ancient and modern sources, the theory of the social contract is analysed as a historical phenomenon that first appeared during the period of Classical Greece. Based on the findings of this analysis, the practice of direct democracy and public choice in ancient Athens is then examined through two historical cases: the political programs of Eubulus and Lycurgus in the second half of the 4th century BC. The main finding of this research is that these policies can be interpreted as the implementation of a social contract, through which citizens took decisions by rational choice according to economic considerations.

A Linearization- and Decomposition-Based Approach to Minimize the Non-Productive Time in Transfer Lines

In this paper we address the balancing problem of transfer lines, seeking the line balance that minimizes the non-productive time. We focus on the tool change time and the face orientation change time, both of which influence the makespan. We consider machine capacity limitations and the technological constraints associated with the manufacturing process of automotive cylinder heads. The problem is represented by a mixed integer programming model that distributes the design features to workstations and sequences the machining processes at minimum non-productive time. The proposed model is solved by an algorithm built on linearization schemes and a Benders decomposition approach. The experiments show the efficiency of the algorithm in reaching the exact solution of small and medium problem instances in reasonable time.
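
A highly simplified sketch of the assignment core of such a model is shown below in PuLP, with hypothetical operations, tools, and times; the paper's full model also sequences operations and is solved via linearization and Benders decomposition, which are omitted here. The tool-count objective is only a proxy for tool-change time:

```python
import pulp

# Hypothetical data: four machining operations, two workstations
ops      = ["o1", "o2", "o3", "o4"]
stations = ["s1", "s2"]
tool_of  = {"o1": "t1", "o2": "t1", "o3": "t2", "o4": "t2"}
time_of  = {"o1": 4, "o2": 3, "o3": 5, "o4": 2}
tools    = sorted(set(tool_of.values()))
capacity = 8                                  # machining-time limit per station

m = pulp.LpProblem("transfer_line_sketch", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (ops, stations), cat="Binary")    # op -> station
u = pulp.LpVariable.dicts("u", (tools, stations), cat="Binary")  # tool mounted

for o in ops:                                 # every operation assigned once
    m += pulp.lpSum(x[o][s] for s in stations) == 1
for s in stations:                            # station capacity
    m += pulp.lpSum(time_of[o] * x[o][s] for o in ops) <= capacity
for o in ops:                                 # a tool must be mounted where used
    for s in stations:
        m += u[tool_of[o]][s] >= x[o][s]

# Proxy objective: fewer mounted tools per station -> fewer tool changes.
m += pulp.lpSum(u[t][s] for t in tools for s in stations)
m.solve(pulp.PULP_CBC_CMD(msg=False))
for o in ops:
    print(o, "->", next(s for s in stations if x[o][s].value() > 0.5))
```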

Analysis of Textual Data Based on Multiple 2-Class Classification Models

This paper proposes a new method for analyzing textual data. The method deals with items of textual data, where each item is described from various viewpoints. It acquires a 2-class classification model for each viewpoint by applying an inductive learning method to items annotated with multiple viewpoints. Using these models, the method infers whether or not each viewpoint applies to a new item. It then extracts expressions from the new items classified under each viewpoint, and identifies characteristic expressions for each viewpoint by comparing the frequency of expressions across viewpoints. The paper also applies the method to questionnaire data given by guests at a hotel and verifies its effect through numerical experiments.
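
A rough sketch of this pipeline using off-the-shelf components follows (scikit-learn here; the paper's own inductive learning method may differ, and the items and viewpoints below are invented for illustration):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy hotel-questionnaire items with per-viewpoint binary labels
items  = ["the room was clean and quiet",
          "breakfast was cold and late",
          "friendly staff at the front desk",
          "the bed was uncomfortable"]
labels = {"facility": [1, 0, 0, 1], "service": [0, 1, 1, 0]}

vec = CountVectorizer()
X = vec.fit_transform(items)
terms = vec.get_feature_names_out()

for viewpoint, y in labels.items():
    clf = LogisticRegression().fit(X, y)       # one 2-class model per viewpoint
    # characteristic expressions: terms weighted toward this viewpoint
    top = sorted(zip(clf.coef_[0], terms), reverse=True)[:3]
    print(viewpoint, "->", [t for _, t in top])
```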

High Energy Dual-Wavelength Mid-Infrared Extracavity KTA Optical Parametric Oscillator

A high-energy dual-wavelength extracavity KTA optical parametric oscillator (OPO) with excellent stability and beam quality, pumped by a Q-switched single-longitudinal-mode Nd:YAG laser, has been demonstrated based on a type II noncritical phase matching (NCPM) KTA crystal. A maximum pulse energy of 10.2 mJ, with an output stability better than 4.1% rms, is obtained at 3.467 μm at a repetition rate of 10 Hz and a pulse width of 2 ns, while 11.9 mJ of 1.535 μm radiation is obtained simultaneously. This extracavity NCPM KTA OPO is very useful where high energy, high beam quality, and a smooth time-domain profile are needed.
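
The two reported wavelengths are consistent with photon-energy conservation in an OPO, assuming the standard 1.064 μm line of the Nd:YAG pump (our arithmetic, not a statement from the paper):

```latex
\frac{1}{\lambda_p} = \frac{1}{\lambda_s} + \frac{1}{\lambda_i}
\qquad\Longrightarrow\qquad
\frac{1}{1.064~\mu\mathrm{m}} \approx \frac{1}{1.535~\mu\mathrm{m}}
                              + \frac{1}{3.467~\mu\mathrm{m}}
```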

The Stigma of Mental Illness and the Way of Destigmatization: The Effects of Interactivity and Self-Construal

Some consider stigma to be the worst side effect of mental illness. Mental illness researchers have focused on the influence of mass media on the stigmatization of people with mental illness. However, no studies have investigated the effects of interactive media, such as blogs, on this stigmatization, even though such media have a significant influence on people in all areas of life. The purpose of this study is to investigate the use of interactivity in the destigmatization of the mentally ill and the moderating effect of self-construal (independent versus interdependent) on the relation between interactivity and destigmatization. The findings suggest that people in the human-human interaction condition showed less social distance toward people with mental illness. Additionally, participants with higher independence showed more favorable affect and less social distance toward mentally ill people. Finally, direct contact with mentally ill people increased a person's positive affect toward people with mental illness. The study should provide insights for mental health practitioners by suggesting how they can use interactive media to reach the public that stigmatizes the mentally ill.

Investigation of Short Time Scale Variation of Solar Radiation Spectrum in UV, PAR, and NIR Bands due to Atmospheric Aerosol and Water Vapor

Long-term variation of solar insolation has been widely studied. However, parallel observations on short time scales are rather lacking. This paper investigates the short-time-scale evolution of the solar radiation spectrum (UV, PAR, and NIR bands) due to atmospheric aerosols and water vapor. A total of 25 days of global and diffuse solar spectra, ranging from air mass 2 to 6, were collected using a ground-based spectrometer with the shadowband technique. The results show that the variation of solar radiation is smallest in the UV fraction, followed by PAR, and largest in NIR. The broader variations in PAR and NIR are associated with short-time-scale fluctuations of aerosols and water vapor. The corresponding daily evolution of the UV, PAR, and NIR fractions implies that aerosol and water vapor variations could also be responsible for the deviation patterns seen in the Langley-plot analysis.
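
For context, a Langley-plot analysis regresses the logarithm of the measured signal on air mass; the sketch below uses synthetic data (hypothetical optical depth and calibration values, not the study's measurements), with short-term aerosol or water-vapor changes appearing as scatter around the fitted line:

```python
import numpy as np

m   = np.linspace(2, 6, 25)           # air-mass range used in the study
tau = 0.25                            # hypothetical band optical depth
I0  = 1000.0                          # extraterrestrial signal (arbitrary units)
rng = np.random.default_rng(0)
I   = I0 * np.exp(-tau * m) * (1 + 0.02 * rng.normal(size=m.size))

# ln(I) = ln(I0) - tau * m : slope gives -tau, intercept extrapolates to I0
slope, intercept = np.polyfit(m, np.log(I), 1)
print(f"tau estimate = {-slope:.3f},  I0 estimate = {np.exp(intercept):.1f}")
```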

Iterative Joint Power Control and Partial Crosstalk Cancellation in Upstream VDSL

Crosstalk is the major limiting factor in very-high-bit-rate digital subscriber line (VDSL) systems in terms of bit rate and service coverage. At the central office, joint signal processing accompanied by appropriate power allocation enables complex multiuser processors to provide near-capacity rates. Unfortunately, complexity grows with the square of the number of lines within a binder; however, since only a few dominant crosstalkers contribute most of the crosstalk power, the canceller structure can be simplified, resulting in much lower run-time complexity. In this paper, a multiuser power control scheme, namely iterative waterfilling, is combined with previously proposed partial crosstalk cancellation approaches to demonstrate the best performance achieved to date, as verified by simulation results.
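
The per-user building block of iterative waterfilling is the classical single-user waterfilling allocation, sketched below with hypothetical per-tone gain-to-noise ratios; in the iterative scheme each user would re-run this against the interference left by the others until the allocations converge:

```python
import numpy as np

def waterfill(gain_to_noise, p_total):
    """Allocate p_total across tones: p_k = max(0, mu - 1/g_k),
    with the water level mu found by bisection."""
    g = np.asarray(gain_to_noise, dtype=float)
    lo, hi = 0.0, p_total + (1.0 / g).max()    # hi guarantees overshoot
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        p = np.maximum(0.0, mu - 1.0 / g)
        lo, hi = (mu, hi) if p.sum() < p_total else (lo, mu)
    return p

g = np.array([5.0, 2.0, 0.5, 0.1])             # per-tone gain-to-noise ratios
p = waterfill(g, p_total=1.0)
print(np.round(p, 3), "sum =", round(p.sum(), 3))
```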

A Distributed Algorithm for Intrinsic Cluster Detection over Large Spatial Data

Clustering algorithms help to uncover the hidden information present in datasets. A dataset may contain intrinsic and nested clusters, whose detection is of utmost importance. This paper presents a distributed grid-based density clustering algorithm capable of identifying arbitrarily shaped embedded clusters as well as multi-density clusters over large spatial datasets. To handle massive datasets, we implemented our method on a 'shared-nothing' architecture in which multiple computers are interconnected over a network. Experimental results are reported to establish the superiority of the technique in terms of scale-up, speed-up, and cluster quality.
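
A sequential sketch of the grid-based density idea follows (the paper's distributed, multi-density variant over a shared-nothing cluster is not reproduced here; the cell size and density threshold are illustrative):

```python
import numpy as np
from collections import defaultdict

def grid_density_cluster(points, cell=1.0, min_pts=5):
    """Bin points into grid cells, keep dense cells, then merge
    neighbouring dense cells into clusters via flood fill."""
    cells = defaultdict(list)
    for i, (x, y) in enumerate(points):
        cells[(int(x // cell), int(y // cell))].append(i)
    dense = {c for c, ids in cells.items() if len(ids) >= min_pts}

    labels, cluster = {}, 0
    for c in dense:
        if c in labels:
            continue
        stack = [c]                    # flood fill over 8-neighbour cells
        while stack:
            cur = stack.pop()
            if cur in labels or cur not in dense:
                continue
            labels[cur] = cluster
            cx, cy = cur
            stack += [(cx + dx, cy + dy) for dx in (-1, 0, 1)
                                          for dy in (-1, 0, 1)]
        cluster += 1
    return {i: labels[c] for c, ids in cells.items() if c in labels for i in ids}

rng = np.random.default_rng(2)
pts = np.vstack([rng.normal(0, 0.8, (200, 2)), rng.normal(6, 0.8, (200, 2))])
assign = grid_density_cluster(pts, cell=1.0, min_pts=5)
print("clusters found:", len(set(assign.values())))
```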

Periodic Solutions of Recurrent Neural Networks with Distributed Delays and Impulses on Time Scales

In this paper, by using the continuation theorem of coincidence degree theory, M-matrix theory, and suitably constructed Lyapunov functions, we obtain sufficient conditions for the existence and global exponential stability of periodic solutions of recurrent neural networks with distributed delays and impulses on time scales. Because the boundedness of the activation functions g_j, h_j is not assumed, these results are less restrictive than those given in earlier references.

Comparing Transformational Leadership in Successful and Unsuccessful Companies

In this article, the problem and its importance are first described, and transformational leadership is studied in the light of leadership theories. Issues such as the definition of transformational leadership and its dimensions are compared on the basis of the views of various authorities, and transformational leadership is then examined in successful and unsuccessful companies. The methodology section covers the research method, hypotheses, population, and statistical sample, and the research findings are analyzed using descriptive and inferential statistical methods presented in analytical tables. Finally, conclusions are drawn from the results of the statistical tests. The final result shows that transformational leadership is significantly higher in successful companies than in unsuccessful ones.

Higher-Dimensional Quantum Cryptography

We report on a high-speed quantum cryptography system that utilizes simultaneous entanglement in polarization and in "time bins". With multiple degrees of freedom contributing to the secret key, we can achieve over ten bits of random entropy per detected coincidence. In addition, we collect from multiple spots on the downconversion cone to further amplify the data rate, allowing us to achieve over 10 Mbit of secure key per second.
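
As a back-of-the-envelope reading of the entropy figure (our arithmetic, not the paper's key-rate accounting): with near-uniform outcomes, more than ten bits per coincidence requires more than 2^10 jointly distinguishable polarization and time-bin states:

```latex
H = \log_2 N > 10~\text{bits}
\quad\Longrightarrow\quad
N > 2^{10} = 1024~\text{distinguishable states per detected pair}
```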

An Examination of the Factors Influencing Software Development Effort

Effective evaluation of software development effort is an important aspect of successful project management. Based on a large database of 4106 completed projects, this study statistically examines the factors that influence development effort. The factors found to be significant are project size, the average number of developers who worked on the project, the type of development, the development language, the development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Unsurprisingly, this study found that the use of CASE tools does not necessarily reduce development effort, which supports the claim that the effect of tools is subtle. As many of the current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for effort prediction that is both simpler and more accurate than previous models.
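
As an illustration of what a parsimonious parametric effort model can look like (synthetic data and hypothetical coefficients; this is not the paper's fitted model), the following fits a log-log regression on the dominant drivers:

```python
import numpy as np

# Synthetic project database: size (e.g. function points) and team size
rng  = np.random.default_rng(3)
size = rng.uniform(50, 5000, 300)
team = rng.integers(1, 15, 300)
effort = 2.0 * size**0.9 * team**0.3 * np.exp(0.2 * rng.normal(size=300))

# Parsimonious parametric form: log(effort) = a + b*log(size) + c*log(team)
X = np.column_stack([np.ones(300), np.log(size), np.log(team)])
coef, *_ = np.linalg.lstsq(X, np.log(effort), rcond=None)
a, b, c = coef
print(f"effort ~ {np.exp(a):.2f} * size^{b:.2f} * team^{c:.2f}")
```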

Correspondence Theorem for Anti L-fuzzy Normal Subgroups

In this paper the concept of the cosets of an anti L-fuzzy normal subgroup of a group is given. Furthermore, the group G/A of cosets of an anti L-fuzzy normal subgroup A of a group G is shown to be isomorphic to a factor group of G in a natural way. Finally, we prove that if f : G1 → G2 is an epimorphism of groups, then there is a one-to-one order-preserving correspondence between the anti L-fuzzy normal subgroups of G2 and those anti L-fuzzy normal subgroups of G1 that are constant on the kernel of f.

An Agent-Oriented Approach to Operational Profile Management

Software reliability, defined as the probability of a software system or application functioning without failure or error over a defined period of time, has been an important area of research for over three decades. Several research efforts aimed at developing models to improve reliability are currently underway. One of the most popular approaches to software reliability involves the use of operational profiles to predict how software applications will be used. Operational profiles are a quantification of the usage patterns of a software application. The research presented in this paper investigates an innovative multi-agent framework for the automatic creation and management of operational profiles for generic distributed systems after their release into the market. The architecture of the proposed Operational Profile MAS (Multi-Agent System) is presented, along with detailed descriptions of the models arrived at in the analysis and design phases of the proposed system. The operational profile is extended here to comprise seven different profiles. Further, the criticality of operations is defined using a new composite metric in order to organize the testing process and to decrease the time and cost involved. A prototype implementation of the proposed MAS is included as a proof of concept, and the framework is considered a step towards making distributed systems intelligent and self-managing.