Organizational De-Evolution: The Small Group or Single Actor Terrorist

Traditionally, terror groups have been formed by ideologically aligned actors who perceive a lack of options for achieving political or social change. However, terrorist attacks have been increasingly carried out by small groups of actors or lone individuals who may be only ideologically affiliated with larger, formal terrorist organizations. The formation of these groups represents the inverse of traditional organizational growth, whereby structural de-evolution within issue-based organizations leads to the formation of small, independent terror cells. Ideological franchising – the bypassing of formal affiliation to the "parent" organization – represents the de-evolution of traditional concepts of organizational structure in favor of an organic, independent, and focused unit. Traditional definitions of issue-based dark networks include focus on an identified goal, commitment to achieving this goal through unrestrained actions, and selection of symbolic targets. The next step in the de-evolution of small dark networks is the mini-organization, consisting of only a handful of actors working toward a common, violent goal. Information sharing through social media platforms, coupled with the civil liberties of democratic nations, provides the communication systems, access to information, and freedom of movement necessary for small dark networks to flourish without the aid of a parent organization. As attacks such as the 7/7 bombings demonstrate the effectiveness of small dark networks, terrorist actors will feel increasingly comfortable aligning with an ideology alone, without formally organizing. The natural result of this de-evolving organization is the single-actor event, in which an individual subscribes to a larger organization's violent ideology with little or no formal ties to it.

Software Reliability Prediction Model Analysis

Software reliability prediction offers an opportunity to measure the software failure rate at any point throughout system test, and a software reliability prediction model provides a technique for improving reliability. Software reliability is an important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main cause of software reliability problems is the high complexity of software, and various approaches can be used to improve its reliability. In this article we focus on a software reliability model, assuming that there is time redundancy, whose value (the number of repeated transmissions of basic blocks) can serve as an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be treated as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence consisting of a random number of basic blocks. We consider the system software unreliable, with the time between adjacent failures following an exponential distribution.
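
As an illustration of this setting, the following Monte Carlo sketch estimates the empirical DF of the transmission time of an instruction sequence whose number of basic blocks is random, with each block retransmitted whenever an exponentially distributed failure interrupts it. All parameter values and the geometric law for the block count are assumptions for illustration; the paper itself treats the DF analytically.

```python
import random

# Monte Carlo sketch (hypothetical parameters): estimate the distribution
# function of the transmission time of an instruction sequence made of a
# random number of basic blocks, where exponentially distributed failures
# force a repeated transmission of the affected block (time redundancy).

LAMBDA = 0.05      # failure rate (failures per time unit), assumed
BLOCK_TIME = 1.0   # transmission time of one basic block, assumed
P_CONTINUE = 0.9   # geometric law for the number of blocks, assumed

def sequence_time(rng):
    blocks = 1
    while rng.random() < P_CONTINUE:   # random number of basic blocks
        blocks += 1
    total = 0.0
    for _ in range(blocks):
        # retransmit the block until no failure occurs during it
        while rng.expovariate(LAMBDA) < BLOCK_TIME:
            total += BLOCK_TIME        # wasted transmission attempt
        total += BLOCK_TIME            # successful transmission
    return total

rng = random.Random(42)
samples = sorted(sequence_time(rng) for _ in range(10_000))
for t in (5, 10, 20, 40):              # empirical DF at a few points
    f = sum(s <= t for s in samples) / len(samples)
    print(f"F({t}) = {f:.3f}")
```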

Reconstituting Information about Discontinued Water Quality Variables in the Nile Delta Monitoring Network Using Two Record Extension Techniques

The world economic crises and budget constraints have caused authorities, especially those in developing countries, to rationalize water quality monitoring activities. Rationalization consists of reducing the number of monitoring sites, the number of samples, and/or the number of water quality variables measured. The reduction in water quality variables is usually based on correlation: if two variables exhibit high correlation, some of the information produced may be redundant. Consequently, one variable can be discontinued while the other continues to be measured. Later, the ordinary least squares (OLS) regression technique is employed to reconstitute information about the discontinued variable, using the continuously measured one as an explanatory variable. In this paper, two record extension techniques are employed to reconstitute information about discontinued water quality variables: OLS and the Line of Organic Correlation (LOC). An empirical experiment is conducted using water quality records from the Nile Delta water quality monitoring network in Egypt, and the record extension techniques are compared for their ability to predict different statistical parameters of the discontinued variables. Results show that OLS is better at estimating individual water quality records; however, it underestimates the variance of the extended records. The LOC technique is superior in preserving the characteristics of the entire distribution and avoids underestimating the variance. It is concluded from this study that OLS can be used for the substitution of missing values, while LOC is preferable for inferring statements about the probability distribution.
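
The contrast between the two techniques is easy to see in a small sketch: OLS shrinks the regression slope by the correlation coefficient and therefore underestimates the variance of the reconstituted records, while LOC uses the ratio of standard deviations and preserves it. The following Python illustration uses synthetic data in place of the actual Nile Delta records.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic stand-in for two correlated water quality variables
x = rng.normal(10, 2, 200)               # continued variable
y = 1.5 * x + rng.normal(0, 1.5, 200)    # discontinued variable

r = np.corrcoef(x, y)[0, 1]
sx, sy = x.std(ddof=1), y.std(ddof=1)

# OLS: slope shrunk by r -> variance of predictions is underestimated
b_ols = r * sy / sx
a_ols = y.mean() - b_ols * x.mean()

# LOC: slope is the ratio of standard deviations -> variance preserved
b_loc = np.sign(r) * sy / sx
a_loc = y.mean() - b_loc * x.mean()

x_new = rng.normal(10, 2, 100)            # period where y is discontinued
y_ols = a_ols + b_ols * x_new
y_loc = a_loc + b_loc * x_new
print(f"true sd {sy:.2f}  OLS sd {y_ols.std(ddof=1):.2f}  "
      f"LOC sd {y_loc.std(ddof=1):.2f}")
```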

Applying Complex Network Theory to Software Structure Analysis

Complex networks have been intensively studied across many fields, especially in Internet technology, biological engineering, and nonlinear science. Software is built up out of many interacting components at various levels of granularity, such as functions, classes, and packages, and thus represents another important class of complex networks that can be studied using complex network theory. Over the last decade, many papers on interdisciplinary research between software engineering and complex networks have been published. This research provides a different dimension to our understanding of software and is also very useful for the design and development of software systems. This paper explores how complex network theory can be used to analyze software structure and briefly reviews the main advances in the corresponding areas.
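
As a minimal illustration of the approach, the following sketch builds a toy module dependency graph (not an actual software system) with the networkx library and computes a few network measures commonly applied to software structure.

```python
import networkx as nx

# Toy dependency graph: nodes are modules, edges mean "imports/calls".
deps = {
    "ui": ["core", "utils"],
    "core": ["utils", "io"],
    "io": ["utils"],
    "plugins": ["core"],
    "utils": [],
}
G = nx.DiGraph((src, dst) for src, dsts in deps.items() for dst in dsts)

# Typical complex-network measurements applied to software structure
print("in-degree:", dict(G.in_degree()))       # how widely a module is reused
print("betweenness:", nx.betweenness_centrality(G))
print("avg clustering:", nx.average_clustering(G.to_undirected()))
```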

Synergy in Vertical Transformations of Expert Designers

The existing literature on design reasoning seems to give one-sided accounts of expert design behaviour based on internal processing. In the same way, ecological theories seem to focus one-sidedly on external elements, resulting in the lack of a unifying design cognition theory. Although current extended design cognition studies acknowledge the intellectual interaction between internal and external resources, there still seems to be insufficient understanding of the complexities involved in such interactive processes. This paper therefore proposes a novel multi-directional model for design researchers to map the complex and dynamic conduct-controlling behaviour in which both the computational and ecological perspectives are integrated in a vertical manner. A clear distinction between identified intentional and emerging physical drivers, and the relationships between them during the early phases of experts' design processes, is demonstrated through a case study in which the model was employed.

CFD Simulation of the Thermal-Hydraulic Characteristics within a Fuel Rod Bundle near Grid Spacers

This paper presents a detailed investigation of the thermal-hydraulic characteristics of the flow field in a fuel rod model, especially near the spacer. The investigated area provides information on the velocity field, vortices, and the amount of heat transferred into the coolant, all of which are critical for the design and improvement of fuel rods in nuclear power plants. The flow field investigation uses three-dimensional Computational Fluid Dynamics (CFD) with the Reynolds stress turbulence model (RSM). The fuel rod model incorporates a vertical annular channel in which three different shapes of spacers are used; each spacer shape is addressed individually. The spacers are compared with one another in terms of heat transfer between the coolant and the fuel rod model. The results are complemented by the calculated heat transfer coefficient at the location of the spacer and along the stainless-steel pipe.
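
As a sketch of that last post-processing step, the local heat transfer coefficient follows from the wall heat flux and the wall-to-bulk temperature difference, h = q'' / (T_wall - T_bulk), with the Nusselt number Nu = h * D_h / k. The values below are assumed placeholders, not results from the simulation.

```python
# Minimal post-processing sketch (assumed values, not simulation output):
# local heat transfer coefficient along the channel, h = q'' / (Tw - Tb),
# plus the corresponding Nusselt number Nu = h * D_h / k.

q_flux = 50_000.0        # wall heat flux [W/m^2], assumed constant
d_h = 0.012              # hydraulic diameter of the annulus [m], assumed
k_coolant = 0.62         # coolant thermal conductivity [W/(m K)], assumed

# (axial position [m], wall temperature [K], bulk coolant temperature [K])
profile = [(0.1, 335.0, 300.0), (0.2, 338.0, 302.0), (0.3, 333.0, 304.0)]

for z, t_wall, t_bulk in profile:
    h = q_flux / (t_wall - t_bulk)
    nu = h * d_h / k_coolant
    print(f"z={z:.2f} m  h={h:7.1f} W/(m^2 K)  Nu={nu:6.1f}")
```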

Improved Power Spectrum Estimation for RR-Interval Time Series

The RR-interval series is non-stationary and unevenly spaced in time. Estimating its power spectral density (PSD) with traditional techniques such as the FFT requires resampling at uniform intervals, and researchers have used various interpolation techniques as resampling methods. All of these resampling methods introduce a low-pass filtering effect into the power spectrum. The Lomb transform is a means of obtaining PSD estimates directly from an irregularly sampled RR-interval series, thus avoiding resampling. In this work, the superiority of the Lomb transform over the FFT-based approach, with linear and cubic spline interpolation as resampling methods, is established in terms of reproducing exact frequency locations as well as the relative magnitudes of each spectral component.
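
The difference between the two routes can be sketched as follows; the synthetic RR series and the sampling choices are assumptions for illustration, not the data used in this work. The FFT route resamples the series with a cubic spline, while scipy.signal.lombscargle works on the uneven samples directly (note that it takes angular frequencies).

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import lombscargle

# Synthetic RR series: mean 0.8 s with a 0.25 Hz (HF band) modulation.
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * np.cumsum(np.full(300, 0.8)))
t = np.cumsum(rr)                      # unevenly spaced beat times

# FFT route: resample at 4 Hz via cubic spline (low-pass filtering effect)
fs = 4.0
t_even = np.arange(t[0], t[-1], 1 / fs)
rr_even = CubicSpline(t, rr)(t_even)
fft_psd = np.abs(np.fft.rfft(rr_even - rr_even.mean())) ** 2
fft_f = np.fft.rfftfreq(len(rr_even), 1 / fs)

# Lomb route: PSD directly from the uneven samples, no resampling
f = np.linspace(0.01, 0.5, 200)        # Hz
lomb_psd = lombscargle(t, rr - rr.mean(), 2 * np.pi * f)  # angular freqs

print("FFT peak at %.3f Hz" % fft_f[np.argmax(fft_psd[1:]) + 1])
print("Lomb peak at %.3f Hz" % f[np.argmax(lomb_psd)])
```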

Treatment of Recycled Concrete Aggregates by Si-Based Polymers

The recycling of concrete, bricks, and masonry rubble as concrete aggregates is an important way to contribute to a sustainable material flow. However, various uncertainties still limit the widespread use of Recycled Concrete Aggregates (RCA). Fluctuations in the composition of graded recycled aggregates and their influence on the properties of fresh and hardened concrete are of particular concern. Most of the problems that occur when using RCA are due to its higher porosity, and hence higher water absorption, lower mechanical strength, and residual impurities on the surface of the RCA that form a weaker bond between the cement paste and the aggregate. Consequently, the reuse of RCA is still limited. An efficient polymer-based treatment is proposed to make the reuse of RCA easier. Silicon-based polymer treatments of RCA were carried out and compared. This kind of treatment can improve the properties of RCA; for example, the rate of water absorption of treated RCA is significantly reduced.

A Study on the Differential Diagnostic Model for Newborn Hearing Loss Screening

According to statistics, the prevalence of congenital hearing loss in Taiwan is approximately six per thousand; furthermore, one per thousand of infants have severe hearing impairment. Hearing ability during infancy has a significant impact on the later development of children's oral expression, language maturity, cognitive performance, educational ability, and social behavior. Although most children born with hearing impairment have sensorineural hearing loss, almost every child retains at least some residual hearing. If provided with a hearing aid or cochlear implant (a bionic ear) in time, in addition to hearing and speech training, even severely hearing-impaired children can still learn to talk. On the other hand, those who are not diagnosed, and are thus unable to begin hearing and speech rehabilitation in a timely manner, may lose an important opportunity to live a complete and healthy life. Eventually, the lack of hearing and speaking ability will affect the development of mental and physical functions, intelligence, and social adaptability. Not only does this problem cause irreparable, lifelong regret for the hearing-impaired child, but it also creates a heavy burden for the family and society. Therefore, it is necessary to establish a computer-assisted predictive model that can accurately detect and help diagnose newborn hearing loss so that early interventions can be provided in time and waste of medical resources can be eliminated. This study uses records from the neonatal database of the case hospital as subjects and adopts two different analysis approaches: using a support vector machine (SVM) directly for model prediction, and using logistic regression to screen factors prior to SVM model prediction, comparing the results. The results indicate that prediction accuracy is as high as 96.43% when the factors are screened and selected through logistic regression. Hence, the model constructed in this study can provide real help to physicians in clinical diagnosis and genuinely benefit early intervention for newborn hearing impairment.
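
A minimal sketch of the two-stage approach follows, on synthetic data since the neonatal database is not public; the coefficient-based screening cut-off below is an assumption for illustration, not the study's criterion.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the neonatal records (real database not public)
X, y = make_classification(n_samples=600, n_features=15, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: logistic regression as a factor-screening step
lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
keep = np.abs(lr.coef_[0]) > np.median(np.abs(lr.coef_[0]))  # assumed cut-off

# Step 2: SVM on the screened factors vs. on all factors
svm_all = make_pipeline(StandardScaler(), SVC()).fit(X_tr, y_tr)
svm_scr = make_pipeline(StandardScaler(), SVC()).fit(X_tr[:, keep], y_tr)
print("SVM, all factors:      %.3f" % svm_all.score(X_te, y_te))
print("SVM, screened factors: %.3f" % svm_scr.score(X_te[:, keep], y_te))
```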

Improved Root-Mean-Square-Gain-Combining for SIMO Channels

The major problem that wireless communication systems face is multipath fading caused by scattering of the transmitted signal. However, multipath propagation can be treated as multiple channels between the transmitter and receiver in order to improve the signal-to-scattering-noise ratio. In Single Input Multiple Output (SIMO) systems, diversity receivers extract multiple branches, or copies, of the same signal received over different channels and apply gain combining schemes such as Root Mean Square Gain Combining (RMSGC). RMSGC asymptotically yields performance identical to that of the theoretically optimal Maximum Ratio Combining (MRC) for values of mean Signal-to-Noise Ratio (SNR) above a certain threshold, without the need for SNR estimation. This paper introduces two improvements to RMSGC: we find that post-detection combining and de-noising of the received signals improve the performance of RMSGC and lower the threshold SNR.
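
The paper's exact combiner and detection chain are not reproduced here, but the following hedged sketch illustrates the basic idea of RMS-weighted combining against MRC on a simulated SIMO link. Coherent detection and the blind RMS weighting rule below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
L, N = 4, 20_000                       # receive branches, BPSK symbols
snr_db = 5.0
sigma = 10 ** (-snr_db / 20)           # noise std for unit signal power

bits = rng.integers(0, 2, N)
s = 2.0 * bits - 1.0                   # BPSK symbols
h = (rng.normal(size=(L, 1)) + 1j * rng.normal(size=(L, 1))) / np.sqrt(2)
n = sigma * (rng.normal(size=(L, N)) + 1j * rng.normal(size=(L, N))) / np.sqrt(2)
r = h * s + n                          # flat Rayleigh SIMO channel

# phase-align each branch (coherent detection assumed for both schemes)
aligned = (np.conj(h) / np.abs(h)) * r

w_mrc = np.abs(h)                      # needs channel/SNR knowledge
w_rms = np.sqrt(np.mean(np.abs(r) ** 2, axis=1, keepdims=True))  # blind RMS

for name, w in (("MRC", w_mrc), ("RMSGC", w_rms)):
    dec = (np.sum(w * aligned, axis=0).real > 0).astype(int)
    print(f"{name}: BER = {np.mean(dec != bits):.4f}")
```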

Device for 3D Analysis of Basic Movements of the Lower Extremity

This document details the process of developing a wireless device that captures the basic movements of the foot (plantar flexion, dorsal flexion, abduction, and adduction) and of the knee (flexion). It implements a motion capture system using hardware based on optical fiber sensors, chosen for their advantages in range, noise immunity, and speed of data transmission and reception. The operating principle of the system is the detection and transmission of joint movement by mechanical elements and their measurement by optical (in this case infrared) ones. Visual Basic software is used for the reception, analysis, and processing of the data acquired by the device, generating a real-time 3D graphical representation of each movement. The result is a boot that captures the movement, a transmission module (implementing XBee technology), and a receiver module that receives the information and sends it to the PC for processing. The main aim of this device is to support work in areas such as bioengineering and medicine by improving movement analysis and, with it, quality of life.
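
For illustration only, here is a hypothetical receiver-side sketch in Python (the original work used Visual Basic; the port name and frame layout below are assumptions) showing how the serial stream from the XBee link might be parsed before rendering.

```python
import serial  # pyserial

# Hypothetical receiver-side sketch. Assumed frame: one ASCII line per
# sample, "ankle_sagittal,ankle_frontal,knee_flexion" in degrees.
PORT, BAUD = "COM3", 9600              # assumed port settings

with serial.Serial(PORT, BAUD, timeout=1) as link:
    while True:
        line = link.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        try:
            ankle_sag, ankle_frontal, knee_flex = map(float, line.split(","))
        except ValueError:
            continue                   # skip malformed frames
        # hand the angles to whatever does the 3D rendering
        print(f"ankle {ankle_sag:+.1f}/{ankle_frontal:+.1f}  knee {knee_flex:+.1f}")
```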

A Fast Sign Localization System Using Discriminative Color Invariant Segmentation

Building intelligent traffic guidance systems has recently become an interesting subject. A good system should be able to observe all important visual information in order to analyze the context of the scene. To do so, signs in general, and traffic signs in particular, are usually taken into account because they provide rich information to such systems; many researchers have therefore put effort into the field of sign recognition. Sign localization, or sign detection, is the most important step in the sign recognition process: it filters out non-informative areas of the scene and provides location candidates for later steps. In this paper, we apply a new approach to detecting sign locations using a new color invariant model. Experiments are carried out on datasets introduced in other works whose authors noted the difficulty of detecting signs under unfavorable imaging conditions. Our method is simple and fast and, most importantly, gives a high detection rate in locating signs.
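
The paper's specific color invariant model is not given here, so the following sketch substitutes a generic normalized-rgb chromaticity model, which discounts illumination intensity, to show the general shape of such a localization step. The thresholds and minimum blob size are assumptions.

```python
import numpy as np
from scipy import ndimage

def red_sign_candidates(img):
    """img: HxWx3 uint8 RGB. Returns bounding boxes of red candidate regions.

    Sketch only: uses normalized-rgb chromaticity r = R/(R+G+B), which is
    invariant to illumination intensity, with assumed red thresholds; this
    is not the color invariant model proposed in the paper.
    """
    rgb = img.astype(float) + 1e-6
    chroma = rgb / rgb.sum(axis=2, keepdims=True)     # intensity-invariant
    mask = (chroma[..., 0] > 0.45) & (chroma[..., 1] < 0.30)  # assumed

    labels, n = ndimage.label(mask)                   # connected components
    boxes = ndimage.find_objects(labels)
    # keep only blobs large enough to be plausible sign candidates
    return [b for b in boxes if mask[b].sum() > 100]
```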

Power of Involvement over Rewards for Retention Likelihood in IT Professionals

Retention in the IT profession is critical for organizations to stay competitive and operate reliably in a dynamic business environment. Most organizations rely on compensation and rewards as their primary tools for enhancing employee retention. In this quantitative, survey-based study conducted at a large global bank, we analyze the perceptions of 575 information technology (IT) software professionals in India and Malaysia and find that fairness of rewards has very little impact on retention likelihood. It is far more important to actively involve employees in organizational activities. In addition, our findings indicate that involvement matters far more than information flow, i.e., the typical organizational communication that keeps employees informed.

An Exhaustive Review of Die Sinking Electrical Discharge Machining Process and Scope for Future Research

Electrical Discharge Machining (EDM) is used especially for manufacturing parts with complex 3-D geometry and hard materials that are extremely difficult to machine by conventional processes. In this paper, the authors review the research carried out on the development of die-sinking EDM over the past decades to improve machining characteristics such as Material Removal Rate, Surface Roughness, and Tool Wear Ratio. The techniques reported by EDM researchers for improving these machining characteristics are categorized as process parameter optimization, the multi-spark technique, powder-mixed EDM, servo control systems, and pulse discrimination. Finally, a flexible machine controller is suggested for die-sinking EDM to enhance the machining characteristics and to achieve a high level of automation, so that die-sinking EDM can be integrated into a Computer Integrated Manufacturing environment, as required by agile manufacturing systems.

Effects of Photovoltaic System Introduction in Detached Houses with All-Electrified Residential Equipment in Japan

In this paper, two analyses were carried out to investigate the effects of introducing photovoltaic systems into detached houses in Japan. Firstly, the hourly generation of a 4.2 kW photovoltaic system was simulated for 46 cities, using a simulation model of the photovoltaic system, to investigate its potential in different regions of Japan. Secondly, based on the simulated electricity generation, the energy-saving, environmental, and economic effects of the photovoltaic system were examined on timescales from hourly to annual, based on calculations of typical electricity, heating, cooling, and hot water supply load profiles for Japanese dwellings. The analysis used standard-year hourly weather data for each city provided by the Expanded AMeDAS Weather Data issued by the Architectural Institute of Japan (AIJ).
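
A minimal sketch of the kind of hourly PV output calculation such simulations rest on is shown below; the coefficients are common illustrative values and assumptions, not the model actually used in the study.

```python
# Hourly PV output sketch (assumed coefficients, not the study's model):
# rated power scaled by irradiance, with a temperature derating.

P_RATED = 4.2        # kW, as in the study
G_STC = 1000.0       # W/m^2, standard test condition irradiance
GAMMA = -0.004       # power temperature coefficient [1/K], assumed
K_SYS = 0.85         # inverter/wiring/soiling derating, assumed

def pv_output_kw(g_hour, t_air):
    """g_hour: global irradiance on the array [W/m^2]; t_air: air temp [C]."""
    t_cell = t_air + 0.03 * g_hour     # crude cell-temperature model, assumed
    return P_RATED * (g_hour / G_STC) * (1 + GAMMA * (t_cell - 25.0)) * K_SYS

# e.g. a clear summer noon hour: ~900 W/m^2 at 30 C
print(f"{pv_output_kw(900, 30):.2f} kW")
```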

Evaluating Service Quality of Online Auction by Fuzzy MCDM

This paper applies fuzzy set theory to evaluate the service quality of online auctions. Service quality is a composite of various criteria, many of which are intangible attributes that are difficult to measure. This characteristic creates obstacles for respondents replying to a survey, so we introduce fuzzy set theory into the measurement of performance to overcome it. Using AHP to obtain the criteria and TOPSIS for ranking, we found that the dimension of service quality of greatest concern is the Transaction Safety Mechanism and the least is Charge Item. The attributes of greatest concern are information security and information accuracy.
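
For the ranking step, a compact TOPSIS sketch with made-up alternatives, criteria scores, and AHP-style weights (none of which are the study's actual data) looks like this:

```python
import numpy as np

# TOPSIS sketch with made-up data: 3 auction sites scored on 4 criteria,
# with assumed AHP-derived weights. All criteria treated as benefit type.
scores = np.array([[7, 8, 6, 5],
                   [6, 9, 7, 6],
                   [8, 6, 8, 4]], dtype=float)
weights = np.array([0.4, 0.3, 0.2, 0.1])       # e.g. safety weighted highest

norm = scores / np.linalg.norm(scores, axis=0) # vector normalization
v = norm * weights                             # weighted normalized matrix
ideal, anti = v.max(axis=0), v.min(axis=0)     # ideal / anti-ideal solutions
d_pos = np.linalg.norm(v - ideal, axis=1)
d_neg = np.linalg.norm(v - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)            # higher = better
print("ranking (best first):", np.argsort(-closeness))
```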

Evaluation of the Immunoregulatory Activity of rFip-gts Purified from Baculovirus-infected Insect Cells

Fip-gts, an immunomodulatory protein purified from Ganoderma tsugae, has been reported to possess therapeutic effects in the treatment of cancer and autoimmune disease. For medicinal applications, recombinant Fip-gts was successfully expressed and purified in Sf21 insect cells in our previous work, and it is important to evaluate its immunomodulatory activity. To assess the immunomodulatory potential of rFip-gts, T lymphocytes from murine splenocytes were used in the present study. Results revealed that rFip-gts induced the formation of cellular aggregates. Additionally, the expression of IL-2 and IFN-γ was up-regulated after treatment with rFip-gts, with a corresponding dose-dependent increase in IL-2 and IFN-γ production. These results show that rFip-gts has immunomodulatory activity, inducing Th1 lymphocytes from murine splenocytes to release IL-2 and IFN-γ, and suggest that rFip-gts may have therapeutic potential in vivo as an immune modulator.

The Correlation between Peer Aggression and Peer Victimization: Are Aggressors Victims Too?

To investigate the possible correlation between peer aggression and peer victimization, 148 sixth-graders were asked to respond to the Reduced Aggression and Victimization Scales (RAVS). The RAVS measures the frequency of self-reported aggressive behaviors or of being victimized during the week prior to the survey. The scales are composed of six items each, and each point represents one instance of aggression or victimization. The Pearson Product-Moment Correlation Coefficient (PMCC) was used to determine the correlations between the sixth-graders' scores on the two scales, both for individual items and for total scores. Positive correlations were established, and the correlations were significant at the 0.01 level.
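
For reference, the PMCC computation itself is a one-liner; the scores below are made up for illustration, not the study's data.

```python
from scipy.stats import pearsonr

# Illustrative PMCC computation on made-up RAVS total scores
# (six items per scale, so totals are small integers).
aggression    = [3, 0, 5, 2, 1, 4, 6, 2, 3, 1]
victimization = [2, 1, 6, 3, 0, 5, 5, 2, 4, 1]

r, p = pearsonr(aggression, victimization)
print(f"r = {r:.2f}, p = {p:.4f}")   # significant if p < 0.01
```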

Chip Formation during Turning of Multiphase Microalloyed Steel

Machining by turning was carried out on a lathe to study the chip formation of multiphase ferrite (F-B-M) microalloyed steel. A Taguchi orthogonal array was employed to plan the machining runs. Continuous and discontinuous chips were formed for different cutting parameters, namely speed, feed, and depth of cut. Optical and scanning electron microscopy were employed to identify the chip morphology.
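
For readers unfamiliar with the design, here is a sketch of a standard L9(3^3) orthogonal array applied to the three cutting parameters; the levels shown are illustrative assumptions, not the study's actual settings.

```python
# Standard Taguchi L9(3^3) orthogonal array: 9 runs cover 3 factors at
# 3 levels each, with every pair of levels balanced across factors.
levels = {
    "speed_m_min": (100, 150, 200),    # assumed cutting speeds
    "feed_mm_rev": (0.10, 0.15, 0.20), # assumed feeds
    "doc_mm":      (0.5, 1.0, 1.5),    # assumed depths of cut
}
L9 = [(0, 0, 0), (0, 1, 1), (0, 2, 2),
      (1, 0, 1), (1, 1, 2), (1, 2, 0),
      (2, 0, 2), (2, 1, 0), (2, 2, 1)]  # standard L9 column assignment

for run, (a, b, c) in enumerate(L9, 1):
    s = levels["speed_m_min"][a]
    f = levels["feed_mm_rev"][b]
    d = levels["doc_mm"][c]
    print(f"run {run}: speed={s} m/min, feed={f} mm/rev, depth={d} mm")
```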

Imaging Methods for Classifying the Crispiness of Freeze-Dried Durian Using Fuzzy Logic

In the quality control of freeze-dried durian, crispiness is a key quality index of the product. Generally, crispiness testing is done by a destructive method; a non-destructive test is desirable because the samples can then be reused for other kinds of testing. This paper proposes a crispiness classification method for freeze-dried durians that uses fuzzy logic for decision making. The physical changes of a freeze-dried durian include the pores appearing in its images. Three physical features are considered to contribute to the crispiness: (1) the diameters of the pores, (2) the ratio of the pore area to the remaining area, and (3) the distribution of the pores. Fuzzy logic is applied to make the decision. Experimental results compared with food expert opinion showed that the accuracy of the proposed classification method is 83.33 percent.
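
A sketch of the fuzzy decision step follows, with assumed triangular membership functions and a single min-AND rule; the paper's actual membership functions and rule base are not given here.

```python
# Fuzzy decision sketch with assumed membership functions: the three pore
# features are fuzzified and combined into a crispiness grade.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def crispiness(pore_diameter_mm, pore_area_ratio, pore_spread):
    # fuzzify each feature into a "favourable for crispy" degree (assumed ranges)
    d = tri(pore_diameter_mm, 0.2, 0.8, 1.4)
    a = tri(pore_area_ratio, 0.2, 0.5, 0.8)
    s = tri(pore_spread, 0.3, 0.7, 1.0)
    degree = min(d, a, s)              # fuzzy AND over the three rules
    return ("crispy" if degree > 0.5 else "not crispy"), degree

print(crispiness(0.7, 0.45, 0.65))
```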