Mixed Convection in a 2D Channel with a Co-Flowing Fluid Injection: Influence of the Jet Position

A numerical study of a plane jet issuing into a vertical heated channel is carried out. The aim is to explore the influence of the forced flow, issued from a flat nozzle located in the entry section of the channel, on the fluid rising along the channel walls. The Reynolds number, based on the nozzle width and the jet velocity, ranges between 3×10^3 and 2×10^4, whereas the Grashof number, based on the channel length and the wall temperature difference, is 2.57×10^10. Computations are performed for a symmetrically heated channel and various nozzle positions. The system of governing equations is solved with a finite volume method. The results show that the jet-wall interactions enhance the heat transfer, and that varying the nozzle position modifies the heat transfer, especially at low Reynolds numbers: heat transfer is improved on the wall adjacent to the jet but reduced on the opposite one. The computed velocity and temperature fields are post-processed to obtain quantities of engineering interest such as the induced mass flow rate and the Nusselt number along the plates.
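
The two governing dimensionless groups named above follow directly from their definitions. The sketch below is illustrative only; the fluid properties, velocities and lengths are placeholder values, not the paper's operating conditions.

```python
# Illustrative only: placeholder fluid properties and geometry, not the paper's values.
def reynolds(jet_velocity, nozzle_width, kinematic_viscosity):
    """Re based on the nozzle width and the jet velocity, as in the abstract."""
    return jet_velocity * nozzle_width / kinematic_viscosity

def grashof(g, beta, delta_T, channel_length, kinematic_viscosity):
    """Gr based on the channel length and the wall-to-ambient temperature difference."""
    return g * beta * delta_T * channel_length**3 / kinematic_viscosity**2

# Example with assumed air-like properties:
print(reynolds(5.0, 0.01, 1.5e-5))                 # ~3.3e3, inside the studied Re range
print(grashof(9.81, 3.4e-3, 60.0, 1.0, 1.5e-5))    # ~8.9e9, same order as the studied Gr
```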

Dust Storm Prediction Using ANNs Technique (A Case Study: Zabol City)

Dust storms are among the most costly and destructive events in many desert regions. They can cause massive damage both to natural environments and to human lives. This paper presents a preliminary study on dust storms as a major natural hazard in arid and semi-arid regions. As a case study, dust storm events that occurred in Zabol city, located in the Sistan region of Iran, were analyzed in order to diagnose and predict dust storms. The identification and prediction of dust storm events could contribute significantly to reducing damage. Existing models for this purpose are complicated and not appropriate for many data-poor regions. The present study explores the Gamma test for identifying the inputs of an artificial neural network (ANN) model for dust storm prediction. The results indicate that further work is needed on identifying dust storms and on discriminating between the various dust storm types.
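
The Gamma test used for input selection can be pictured with a short sketch. The implementation below is an illustrative simplification on toy data (the variable names and the noise model are assumed, not taken from the paper): the intercept of the regression of gamma(k) on delta(k) estimates the variance of the noise that no smooth model y = f(x) + noise can remove, so input subsets with a smaller Gamma statistic are better candidates for the ANN.

```python
# Simplified Gamma test sketch on synthetic data.
import numpy as np
from scipy.spatial import cKDTree

def gamma_test(X, y, p_max=10):
    X, y = np.asarray(X, float), np.asarray(y, float)
    tree = cKDTree(X)
    # distances/indices of the p_max nearest neighbours (excluding the point itself)
    dist, idx = tree.query(X, k=p_max + 1)
    dist, idx = dist[:, 1:], idx[:, 1:]
    delta = (dist ** 2).mean(axis=0)                          # mean squared input distance per k
    gamma = 0.5 * ((y[idx] - y[:, None]) ** 2).mean(axis=0)   # half mean squared output distance per k
    slope, intercept = np.polyfit(delta, gamma, 1)
    return intercept  # estimated noise variance for this input combination

# Smaller Gamma statistics suggest more informative input subsets (toy data):
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
print(gamma_test(X[:, :1], y), gamma_test(X, y))
```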

Self Organizing Mixture Network in Mixture Discriminant Analysis: An Experimental Study

In recent work on mixture discriminant analysis (MDA), the expectation-maximization (EM) algorithm is used to estimate the parameters of Gaussian mixtures. However, the initial values supplied to the EM algorithm affect the final parameter estimates. Moreover, when the EM algorithm is applied twice to the same data set, it can give different parameter estimates, which affects the classification accuracy of MDA. To overcome this problem, we use the Self-Organizing Mixture Network (SOMN) algorithm to estimate the parameters of the Gaussian mixtures in MDA, since SOMN is more robust when the initial parameter values are chosen at random [5]. We show the effectiveness of this method on the popular simulated waveform data set and on the real glass data set.
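
The sensitivity of EM to its starting point, which motivates the switch to SOMN, is easy to reproduce. The sketch below is illustrative only (toy data, and scikit-learn's GaussianMixture standing in for the paper's MDA implementation): different random initializations of EM can end in different local optima with different log-likelihoods and mixture weights.

```python
# Illustrative only: EM's dependence on initialization, shown with scikit-learn.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (200, 2)),
               rng.normal(4, 1, (200, 2)),
               rng.normal([0, 6], 1, (200, 2))])

for seed in (0, 1, 2):
    gm = GaussianMixture(n_components=3, n_init=1, init_params="random",
                         random_state=seed).fit(X)
    # different seeds can give different converged log-likelihoods and weights
    print(seed, round(gm.lower_bound_, 3), np.round(np.sort(gm.weights_), 3))
```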

64-bit Computer Architectures for Space Applications – A Study

Recent satellite projects and programs make extensive use of real-time embedded systems. 16-bit processors conforming to the MIL-STD-1750 standard architecture have been used in on-board systems, and most space applications have been written in Ada. Looking ahead, 32-bit and 64-bit processors are needed in the area of spacecraft computing, so an effort to study and survey 64-bit architectures for space applications is desirable. This will also drive significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada). A processor for this purpose must satisfy several basic requirements, including radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, higher memory and I/O bandwidth, predictability, a real-time operating system, and manufacturability. Further considerations include the selection of FPGA devices, the selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, timing analysis, etc. This project comprises a brief study of the 32-bit and 64-bit processors readily available in the market and the design and fabrication of a 64-bit RISC processor, named RISC MicroProcessor, with an extended double-precision floating-point unit and a 32-bit signal-processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using an open core (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus Verilog to develop FPGA-based prototypes quickly. Commercial tools such as Xilinx ISE are also used for synthesis where appropriate.

EGCL: An Extended G-Code Language with Flow Control, Functions and Mnemonic Variables

In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), programming-language capabilities such as symbolic and intuitive programming, program portability and a rich geometrical portfolio are especially important. They save time, help avoid errors during part programming, and permit code reuse. Our updated literature review indicates that the current state of the art presents gaps in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language that allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and improving portability. Our results show that readable variable names and flow-control statements permit simplified, intuitive part programming and the reuse of programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status in which such functions are provided as built-in library functions.
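
The gain from flow control and named variables can be pictured with a small sketch. The syntax, function name and parameters below are invented for illustration and are not the EGCL grammar or the paper's compiler; the point is only that a loop over descriptive variables expands into the repetitive, elementary ISO G-code that the machine consumes.

```python
# Hypothetical illustration of flow-control expansion into plain ISO G-code;
# not the actual EGCL syntax or compiler.
def drill_row(hole_count, spacing_mm, depth_mm, safe_z=5.0):
    """Expand a while-style loop with named variables into elementary G-code lines."""
    lines, x = ["G21 ; millimetres", "G90 ; absolute positioning"], 0.0
    i = 0
    while i < hole_count:                  # high-level flow control...
        lines += [f"G0 X{x:.3f} Y0.000 Z{safe_z:.3f}",
                  f"G1 Z{-depth_mm:.3f} F100",
                  f"G0 Z{safe_z:.3f}"]     # ...becomes repeated elementary moves
        x += spacing_mm
        i += 1
    lines.append("M30 ; end of program")
    return "\n".join(lines)

print(drill_row(hole_count=3, spacing_mm=10, depth_mm=2))
```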

Direct Democracy and Social Contract in Ancient Athens

In the present essay, a model of choice by actors is analysed by utilizing the theory of chaos to explain how change comes about. Then, by using ancient and modern sources of literature, the theory of the social contract is analysed as a historical phenomenon that first appeared during the period of Classical Greece. Based on the findings of this analysis, the practice of direct democracy and public choice in ancient Athens is then examined through two historical cases: the political programmes of Eubulus and Lycurgus in the second half of the 4th century BC. The main finding of this research is that these policies can be interpreted as the implementation of a social contract, through which citizens took decisions based on rational choice and economic considerations.

A Linearization and Decomposition Based Approach to Minimize the Non-Productive Time in Transfer Lines

In this paper we address the balancing problem of transfer lines, seeking the optimal line balance that minimizes the non-productive time. We focus on the tool change time and the face orientation change time, both of which influence the makespan, and we consider machine capacity limitations and the technological constraints associated with the manufacturing process of automotive cylinder heads. The problem is represented by a mixed-integer programming model that distributes the design features to workstations and sequences the machining processes at minimum non-productive time. The proposed model is solved by an algorithm built on linearization schemes and a Benders decomposition approach. The experiments show the efficiency of the algorithm in reaching the exact solution of small and medium problem instances in reasonable time.
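
A toy sketch of the assignment side of such a formulation is given below. The data, variable names and cost figures are invented, and PuLP is used only to illustrate what a mixed-integer model of this kind looks like; the paper's model, its linearization and its Benders decomposition are considerably richer.

```python
# Toy feature-to-station assignment minimizing a stand-in for non-productive time.
import pulp

features = ["f1", "f2", "f3", "f4"]
stations = ["s1", "s2"]
# assumed non-productive (tool/orientation change) cost of placing feature f on station s
cost = {("f1", "s1"): 2, ("f1", "s2"): 4, ("f2", "s1"): 3, ("f2", "s2"): 1,
        ("f3", "s1"): 5, ("f3", "s2"): 2, ("f4", "s1"): 1, ("f4", "s2"): 3}
capacity = 2  # at most two features per station (stand-in for machine capacity limits)

m = pulp.LpProblem("line_balancing_sketch", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (features, stations), cat="Binary")
m += pulp.lpSum(cost[f, s] * x[f][s] for f in features for s in stations)
for f in features:                                   # each feature assigned exactly once
    m += pulp.lpSum(x[f][s] for s in stations) == 1
for s in stations:                                   # station capacity limit
    m += pulp.lpSum(x[f][s] for f in features) <= capacity
m.solve(pulp.PULP_CBC_CMD(msg=False))
print([(f, s) for f in features for s in stations if x[f][s].value() == 1])
```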

Analysis of Textual Data Based On Multiple 2-Class Classification Models

This paper proposes a new method for analyzing textual data. The method deals with items of textual data, where each item is described from various viewpoints. The method acquires a 2-class classification model for each viewpoint by applying an inductive learning method to items labelled with multiple viewpoints, and then uses these models to infer whether each viewpoint applies to new items. It extracts expressions from the new items classified under each viewpoint and identifies characteristic expressions for a viewpoint by comparing expression frequencies across viewpoints. The paper also applies the method to questionnaire data provided by hotel guests and verifies its effectiveness through numerical experiments.
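
The pipeline of per-viewpoint binary models plus frequency comparison can be sketched briefly. The example below is illustrative only: toy reviews, assumed viewpoint labels, and scikit-learn classifiers standing in for the paper's inductive learner.

```python
# Illustrative sketch: one binary classifier per viewpoint plus a simple
# frequency comparison to surface characteristic expressions.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

texts = ["room was clean and quiet", "staff were friendly and helpful",
         "breakfast was cold", "great service at the front desk"]
labels = {"service": [0, 1, 0, 1], "room": [1, 0, 1, 0]}   # assumed viewpoint labels

vec = CountVectorizer()
X = vec.fit_transform(texts)
models = {v: LogisticRegression().fit(X, y) for v, y in labels.items()}

X_new = vec.transform(["the desk staff were very helpful"])
for v, clf in models.items():
    print(v, clf.predict(X_new))           # which viewpoints apply to the new item

# characteristic expressions: terms far more frequent under one viewpoint than the rest
counts = np.asarray(X.todense())
for v, y in labels.items():
    diff = counts[np.array(y) == 1].sum(0) - counts[np.array(y) == 0].sum(0)
    print(v, [vec.get_feature_names_out()[i] for i in diff.argsort()[-3:]])
```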

The Stigma of Mental Illness and the Way of Destigmatization: The Effects of Interactivity and Self-Construal

Stigma is sometimes considered the worst side effect of mental illness. Mental illness researchers have focused on the influence of the mass media on the stigmatization of people with mental illness. However, no studies have investigated the effects of interactive media, such as blogs, on this stigmatization, even though such media have a significant influence on people in all areas of life. The purpose of this study is to investigate the use of interactivity in the destigmatization of the mentally ill and the moderating effect of self-construal (independent versus interdependent) on the relation between interactivity and destigmatization. The findings suggest that people in the human-human interaction condition showed less social distance toward people with mental illness. Additionally, participants with higher independence showed more favorable affect toward, and less social distance from, mentally ill people. Finally, direct contact with mentally ill people increased a person's positive affect toward people with mental illness. The study should provide insights for mental health practitioners by suggesting how they can use interactive media to reach the public that stigmatizes the mentally ill.

A Distributed Algorithm for Intrinsic Cluster Detection over Large Spatial Data

Clustering algorithms help to uncover the hidden information present in datasets. A dataset may contain intrinsic and nested clusters, the detection of which is of utmost importance. This paper presents a distributed grid-based density clustering algorithm capable of identifying arbitrarily shaped embedded clusters as well as multi-density clusters over large spatial datasets. For handling massive datasets, we implemented our method on a 'shared-nothing' architecture in which multiple computers are interconnected over a network. Experimental results are reported that establish the superiority of the technique in terms of scale-up, speedup and cluster quality.
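
The core grid-based density idea can be sketched on a single node. The code below is a minimal illustration with assumed parameter names and thresholds; the paper's distributed, multi-density algorithm over a shared-nothing cluster is substantially more elaborate.

```python
# Minimal single-node sketch of grid-based density clustering (assumed parameters).
import numpy as np
from collections import defaultdict

def grid_density_cluster(points, cell_size, density_threshold):
    """Assign points to grid cells, keep dense cells, and merge neighbouring dense cells."""
    cells = defaultdict(list)
    for i, p in enumerate(points):
        cells[tuple((np.asarray(p) // cell_size).astype(int))].append(i)
    dense = {c for c, idx in cells.items() if len(idx) >= density_threshold}

    labels, cluster_id = {}, 0
    for cell in dense:                      # flood-fill over 8-connected dense cells
        if cell in labels:
            continue
        stack = [cell]
        while stack:
            c = stack.pop()
            if c in labels or c not in dense:
                continue
            labels[c] = cluster_id
            stack += [(c[0] + dx, c[1] + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
        cluster_id += 1
    return {i: labels[c] for c, idx in cells.items() if c in labels for i in idx}

rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0, 0.5, (200, 2)), rng.normal(5, 0.5, (200, 2))])
print(sorted(set(grid_density_cluster(pts, cell_size=1.0, density_threshold=5).values())))
```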

Higher-Dimensional Quantum Cryptography

We report on a high-speed quantum cryptography system that utilizes simultaneous entanglement in polarization and in time bins. With multiple degrees of freedom contributing to the secret key, we can achieve over ten bits of random entropy per detected coincidence. In addition, we collect from multiple spots on the downconversion cone to further increase the data rate, allowing us to achieve over 10 Mbits of secure key per second.
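
As a back-of-the-envelope illustration of how entropy per coincidence adds across independent degrees of freedom, consider assumed dimensions of 2 polarization states and 1024 distinguishable time bins (these numbers are placeholders, not the experiment's values).

```python
# Hypothetical dimensions: 2 polarization states and 1024 distinguishable time bins.
import math

bits_polarization = math.log2(2)       # 1 bit
bits_time_bins = math.log2(1024)       # 10 bits
print(bits_polarization + bits_time_bins)   # 11 bits of entropy per coincidence
```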

An Examination of the Factors Influencing Software Development Effort

Effective evaluation of software development effort is an important aspect of successful project management. Based on a large database of 4106 completed projects, this study statistically examines the factors that influence development effort. The factors found to be significant are project size, the average number of developers who worked on the project, the type of development, the development language, the development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Unsurprisingly, the study found that the use of CASE tools does not necessarily reduce development effort, which supports the claim that the effect of tool use is subtle. Since many current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for effort prediction that is both simple and more accurate than previous models.
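
A parsimonious parametric effort model is typically of power-law form, effort = a · size^b, fitted in log-log space. The sketch below illustrates that idea on synthetic data; the coefficients and the noise model are invented and are not the study's fitted model.

```python
# Illustrative only: a simple power-law effort model fitted on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
size = rng.uniform(10, 1000, 200)                               # e.g. size in function points
effort = 4.0 * size ** 1.1 * np.exp(rng.normal(0, 0.3, 200))    # synthetic "true" relation

b, log_a = np.polyfit(np.log(size), np.log(effort), 1)          # linear fit in log-log space
a = np.exp(log_a)
print(f"effort ~ {a:.2f} * size^{b:.2f}")
print("predicted effort for size 500:", round(a * 500 ** b, 1))
```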

An Agent Oriented Approach to Operational Profile Management

Software reliability, defined as the probability of a software system or application functioning without failure or error over a defined period of time, has been an important area of research for over three decades, and several research efforts aimed at developing models to improve reliability are currently underway. One of the most popular approaches to software reliability involves the use of operational profiles to predict how software applications will be used. An operational profile is a quantification of the usage patterns of a software application. The research presented in this paper investigates an innovative multi-agent framework for the automatic creation and management of operational profiles for generic distributed systems after their release to the market. The architecture of the proposed Operational Profile MAS (Multi-Agent System) is presented along with detailed descriptions of the models arrived at in the analysis and design phases of the proposed system. The operational profile in this paper is extended to comprise seven different profiles. Furthermore, the criticality of operations is defined using a newly composed metric in order to organize the testing process and to decrease the time and cost it involves. A prototype implementation of the proposed MAS is included as a proof of concept, and the framework is considered a step towards making distributed systems intelligent and self-managing.
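
At its simplest, an operational profile is a probability distribution over the operations a user may invoke, and weighting it by an operation-criticality score steers testing effort toward the operations that matter most. The sketch below is a toy illustration with invented operation names and numbers, not the seven profiles or the composed metric defined in the paper.

```python
# Illustrative sketch: an operational profile as occurrence probabilities over
# operations, combined with criticality to weight test selection (values invented).
import random

operational_profile = {          # probability that a user invokes each operation
    "login": 0.30, "search": 0.40, "checkout": 0.20, "admin_report": 0.10,
}
criticality = {"login": 0.9, "search": 0.3, "checkout": 0.8, "admin_report": 0.5}

def test_weight(op):
    """Frequent, critical operations receive the most testing attention."""
    return operational_profile[op] * criticality[op]

ops = list(operational_profile)
weights = [test_weight(op) for op in ops]
test_plan = random.choices(ops, weights=weights, k=10)   # 10 profile-driven test cases
print(test_plan)
```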

Surrogate-Based Evolutionary Algorithm for Design Optimization

Optimization is often a critical issue in system design. Evolutionary algorithms are population-based, stochastic search techniques widely used as efficient global optimizers. However, finding optimal solutions to complex, high-dimensional, multimodal problems often requires highly expensive function evaluations and can therefore be practically prohibitive. The Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model presented in our earlier work [14] reduces computation time by the controlled use of meta-models that partially replace actual function evaluations with approximate ones. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model; situations such as model formation involving variable input dimensions and noisy data are not covered by this assumption. In this paper we present an enhanced version of DAFHEA that incorporates a multiple-model based learning approach for the SVM approximator. DAFHEA-II, the enhanced version of the framework, also avoids the high computational expense of the additional clustering required by the original DAFHEA framework. The proposed framework has been tested on several benchmark functions, and the empirical results illustrate the advantages of the proposed technique.
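
The general surrogate-assisted idea, an inexpensive meta-model pre-screening offspring so that only promising candidates receive the expensive true evaluation, can be sketched compactly. The code below is a toy illustration with an SVR surrogate from scikit-learn and an invented test function; it is not the DAFHEA or DAFHEA-II algorithm.

```python
# Toy sketch of surrogate-assisted evolution: an SVR meta-model pre-screens
# offspring so only the most promising ones receive the expensive true evaluation.
import numpy as np
from sklearn.svm import SVR

def expensive_fitness(x):                     # stand-in for a costly simulation
    return np.sum(x ** 2)

rng = np.random.default_rng(0)
pop = rng.uniform(-5, 5, (20, 4))
fit = np.array([expensive_fitness(x) for x in pop])

for gen in range(30):
    surrogate = SVR().fit(pop, fit)                           # meta-model of the landscape
    children = pop + rng.normal(0, 0.5, pop.shape)            # mutation
    promising = np.argsort(surrogate.predict(children))[:5]   # screen with the surrogate
    for i in promising:                                       # true evaluation for a few only
        f = expensive_fitness(children[i])
        worst = np.argmax(fit)
        if f < fit[worst]:
            pop[worst], fit[worst] = children[i], f
print("best found:", fit.min())
```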

Investigation of Self-Similarity Solution for Wake Flow of a Cylinder

Mean velocity measurements have been taken in the wake of a single circular cylinder for three different diameters and two different velocities. The effects of the changes in diameter and velocity are studied in a self-similar coordinate system, and the spatial variations of the velocity defect and of the half-width have been investigated. The results are compared with those published by H. Schlichting. In the normalized coordinates it is observed that all cases, except for the first station, are self-similar. Examination of the self-similar mean-velocity profiles further shows that, for all cases, the curves at each station tend to zero at the same point.
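
Self-similarity means that profiles measured at different downstream stations collapse onto a single curve once the velocity defect is scaled by its centreline value and the transverse coordinate by the wake half-width. The sketch below demonstrates that collapse on synthetic Gaussian-type wake profiles; the profile shape and the station values are assumed, not measured data.

```python
# Illustrative normalization on synthetic wake profiles (not the measured data).
import numpy as np

def defect_profile(y, u_defect_max, half_width):
    """Classical self-similar wake shape: defect = max_defect * exp(-ln2 * (y/b)^2)."""
    return u_defect_max * np.exp(-np.log(2) * (y / half_width) ** 2)

eta = np.linspace(-3, 3, 7)                       # normalized transverse coordinate y/b
stations = [(1.0, 0.5), (0.6, 0.9), (0.4, 1.4)]   # (centreline defect, half-width) downstream
for u_max, b in stations:
    scaled = defect_profile(eta * b, u_max, b) / u_max
    print(np.round(scaled, 3))                    # identical rows -> profiles collapse
```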

Shadow Detection for Increased Accuracy of Privacy Enhancing Methods in Video Surveillance Edge Devices

Shadow detection is still considered one of the open challenges for intelligent automated video surveillance systems, since correct shadow detection and classification is a prerequisite for reliable and accurate detection and tracking. Privacy issues add further complexity and make reliable shadow detection even more important. In this work the interplay between security, accuracy, reliability and privacy is analyzed and, accordingly, a novel architecture for Privacy Enhancing Video Surveillance (PEVS) is introduced. Shadow detection and masking are handled through the simultaneous combination of two different approaches, which results in a unique privacy enhancement without affecting security. The methodology was subsequently employed successfully in a large-scale wireless video surveillance system; privacy-relevant information was stored and encrypted on the edge unit without being transferred over an untrusted network.
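
One widely used ingredient of shadow detection is a chromaticity test against a background model: a shadowed pixel is darker than the background but keeps roughly the same colour ratios, whereas a foreground object usually changes them. The sketch below illustrates that standard heuristic on a toy image; it is not the paper's combined two-approach method, and the thresholds are assumed.

```python
# Generic chromaticity-based shadow test (a common heuristic, not the paper's method).
import numpy as np

def shadow_mask(frame, background, alpha=0.4, beta=0.9, colour_tol=0.1):
    frame = frame.astype(float) + 1e-6
    background = background.astype(float) + 1e-6
    ratio = frame.sum(axis=2) / background.sum(axis=2)           # brightness attenuation
    chroma_f = frame / frame.sum(axis=2, keepdims=True)          # normalised colour
    chroma_b = background / background.sum(axis=2, keepdims=True)
    colour_diff = np.abs(chroma_f - chroma_b).sum(axis=2)
    return (ratio > alpha) & (ratio < beta) & (colour_diff < colour_tol)

# Toy 2x2 "image": one pixel is a darkened copy of the background (shadow-like).
bg = np.full((2, 2, 3), 200.0)
fr = bg.copy()
fr[0, 0] *= 0.6                    # uniformly darkened -> flagged as shadow
fr[1, 1] = [30.0, 200.0, 30.0]     # colour change -> treated as real foreground
print(shadow_mask(fr, bg))
```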

Talent Management and its Use in the Field of Human Resources Management in Organizations of the Czech Republic

The article aims to report on the scope and level of use of talent management by organizations in one region of the Czech Republic, the Moravian-Silesian Region. On the basis of data acquired through a questionnaire survey, it has been found that organizations in this region implement talent management only on a small scale: the approach is used by only 3.8% of organizations, that is, 9 out of the 237 respondents approached. The main reasons why the approach is not used are either that organizations have no knowledge of it or that they lack the financial and personnel resources. The article concludes with the author's recommendations for a wider application of talent management in Czech practice.

Structural Characteristics of Batch Processed Agro-Waste Fibres

Agro-waste fibres from Nigeria have been characterised for composite applications using X-ray diffraction (XRD) and scanning electron microscopy (SEM). Fibres extracted from groundnut shell, coconut husk, rice husk, palm fruit bunch and palm fruit stalk were processed using two novel cellulose fibre production methods developed by the authors. The apparent crystallinity of the cellulose, calculated by deconvolution of the diffractometer trace, shows that the amorphous portion of the cellulose was permeable to hydrolysis, yielding high crystallinity after treatment. All diffractograms show the typical cellulose structure with well-defined 110, 200 and 040 peaks. Palm fruit fibres had the highest 200 crystalline cellulose peak, an indication of a rich cellulose content. Surface examination of the resulting fibres using SEM indicates a regular cellulose network structure with some agglomerated, laminated layers of thin leaves of cellulose microfibrils. The surfaces were relatively smooth, indicating the removal of hemicellulose, lignin and pectin.
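
The deconvolution approach to apparent crystallinity amounts to fitting crystalline and amorphous peaks to the diffractogram and taking the ratio of crystalline area to total area. The sketch below illustrates this on a synthetic trace; the peak positions, widths and the simple two-peak model are assumed for illustration and are not the fitting model used in the paper.

```python
# Illustrative crystallinity-by-deconvolution sketch on a synthetic diffractogram.
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, amp, cen, wid):
    return amp * np.exp(-((x - cen) ** 2) / (2 * wid ** 2))

def model(x, a200, a_am):
    # assumed: 200 crystalline peak near 22.5 deg, broad amorphous hump near 18 deg
    return gauss(x, a200, 22.5, 1.0) + gauss(x, a_am, 18.0, 4.0)

two_theta = np.linspace(10, 35, 300)
intensity = model(two_theta, 100, 30) + np.random.default_rng(0).normal(0, 2, 300)

(a200, a_am), _ = curve_fit(model, two_theta, intensity, p0=[50, 50])
area_cryst, area_am = a200 * 1.0, a_am * 4.0        # Gaussian area is proportional to amp * width
print("apparent crystallinity ~", round(100 * area_cryst / (area_cryst + area_am), 1), "%")
```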

Fuzzy Hierarchical Clustering Applied for Quality Estimation in Manufacturing System

This paper develops a quality estimation method based on fuzzy hierarchical clustering. Quality estimation is essential to quality control and quality improvement, since a precise estimate supports the right decision-making and hence better quality control. Normally the quality of finished products in a manufacturing system can be differentiated by quality standards. In real-life situations, however, the collected data may be vague, which makes them difficult to classify, and they are usually represented as fuzzy numbers; estimating the quality of a product represented by a fuzzy number is not easy. In this research, trapezoidal fuzzy numbers are collected from the manufacturing process and the collected data are classified into different clusters to obtain the estimate. Since normal hierarchical clustering methods can only be applied to real numbers, fuzzy hierarchical clustering is selected to handle this problem on the basis of the quality standards.
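
The core idea, clustering measurements that are trapezoidal fuzzy numbers rather than crisp values, can be sketched with a simple vertex-wise distance. The distance measure, the toy values and the use of SciPy's hierarchical clustering below are illustrative assumptions, not the paper's formulation.

```python
# Illustrative sketch: hierarchical clustering of trapezoidal fuzzy numbers (a, b, c, d)
# using a simple vertex-wise distance between their four defining points.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# trapezoidal fuzzy numbers for measured quality characteristics (toy values)
samples = np.array([
    [1.0, 1.2, 1.4, 1.6],
    [1.1, 1.3, 1.5, 1.7],
    [3.0, 3.2, 3.4, 3.6],
    [3.1, 3.2, 3.5, 3.8],
])

dist = pdist(samples, metric="euclidean")       # vertex-wise distance between fuzzy numbers
Z = linkage(dist, method="average")
print(fcluster(Z, t=2, criterion="maxclust"))   # two quality grades -> [1 1 2 2]
```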

Parametric Analysis in the Electronic Sensor Frequency Adjustment Process

The use of electronic sensors in the electronics industry has become increasingly popular over the past few years, and the market for them is highly competitive. The frequency adjustment process is regarded as one of the most important processes in electronic sensor manufacturing. Inaccuracies in the frequency adjustment process can lead to up to 80% waste through rework; therefore, this study aims to provide a preliminary understanding of the role of the parameters used in the frequency adjustment process and to make suggestions for further improving its performance. Four parameters are considered: air pressure, dispensing time, vacuum force, and the distance between the needle tip and the product. A 2^k full factorial design of experiments was used to determine which parameters significantly affect the accuracy of the frequency adjustment process, where the deviation between the adjusted frequency and the target frequency is expected to be 0 kHz. The experiment was conducted at two levels, with two replications and five added center points, giving 37 runs in total. The results reveal that air pressure and dispensing time significantly affect the frequency adjustment process. The mathematical relationship between these two parameters was formulated, and the optimal settings for air pressure and dispensing time were found to be 0.45 MPa and 458 ms, respectively. The optimal parameters were verified by a confirmation experiment, in which an average deviation of 0.082 kHz was achieved.
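
In a 2^k factorial analysis, each main effect is the difference between the mean response at the high and low coded levels of that factor. The sketch below computes effects for a toy 2^2 slice (air pressure and dispensing time) with invented responses; the numbers are not the study's data.

```python
# Toy 2^2 factorial sketch: main and interaction effects on frequency deviation.
import numpy as np

# coded levels (-1/+1) for A = air pressure, B = dispensing time, two replicates each
A = np.array([-1, +1, -1, +1, -1, +1, -1, +1])
B = np.array([-1, -1, +1, +1, -1, -1, +1, +1])
y = np.array([0.40, 0.15, 0.30, 0.05, 0.42, 0.17, 0.28, 0.07])  # |deviation| in kHz (invented)

effect_A = y[A == +1].mean() - y[A == -1].mean()
effect_B = y[B == +1].mean() - y[B == -1].mean()
effect_AB = y[A * B == +1].mean() - y[A * B == -1].mean()
print(f"A: {effect_A:+.3f}  B: {effect_B:+.3f}  AB: {effect_AB:+.3f} kHz")
```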