Effect of Temperature on Specific Retention Volumes of Selected Volatile Organic Compounds Using the Gas-Liquid Chromatographic Technique Revisited

This paper continues our interest in the influence of temperature on specific retention volumes and the resulting infinite-dilution activity coefficients, quantities that directly affect the design of absorption and stripping columns for the abatement of volatile organic compounds. The interaction of 13 volatile organic compounds (VOCs) with polydimethylsiloxane (PDMS) at varying temperatures was studied by gas-liquid chromatography (GLC). The infinite-dilution activity coefficients and specific retention volumes obtained in this study agree with those obtained from static headspace and group-contribution methods by the authors, as well as with literature values for similar systems. Temperature variation also allows transport calculations for different seasons. The results of this work confirm that PDMS is well suited for scrubbing VOCs from waste-gas streams. Plots of specific retention volumes against temperature gave linear van't Hoff plots.
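A van't Hoff plot is linear in ln(Vg) versus 1/T, with slope -ΔHs/R, where ΔHs is the partial molar enthalpy of solution. The sketch below is illustrative only (the temperatures, retention volumes and ΔHs are synthetic, not the authors' data) and shows how such a fit recovers the enthalpy:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

def vant_hoff_fit(T_kelvin, Vg):
    """Fit ln(Vg) vs 1/T; return (slope, intercept, delta_H in kJ/mol)."""
    x = 1.0 / np.asarray(T_kelvin, dtype=float)
    y = np.log(np.asarray(Vg, dtype=float))
    slope, intercept = np.polyfit(x, y, 1)   # linear van't Hoff fit
    dH_sol = -slope * R / 1000.0             # slope = -dH/R
    return slope, intercept, dH_sol

# Synthetic retention data generated from a known dH of -35 kJ/mol
T = np.array([303.0, 313.0, 323.0, 333.0, 343.0])
true_dH = -35.0e3
Vg = np.exp(-true_dH / (R * T) + 1.2)

slope, intercept, dH = vant_hoff_fit(T, Vg)
print(round(dH, 1))  # recovers -35.0
```

A negative ΔHs, as here, corresponds to exothermic sorption: retention volumes fall as temperature rises.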

Moving From Problem Space to Solution Space

Extracting and elaborating software requirements and transforming them into a viable software architecture remain intricate tasks. This paper defines a solution architecture based on a blurred amalgamation of the problem space and the solution space. The dependencies between domain constraints, requirements and architecture, and their importance, are described; these must be considered collectively while evolving from problem space to solution space. The paper proposes a revised version of the Twin Peaks Model, named the Win Peaks Model, that reconciles software requirements and architecture in a more consistent and adaptable manner. Further, conflicts between stakeholders' win-requirements are resolved by the proposed voting methodology, a simple adaptation of the WinWin requirements negotiation model and QARCC.

Molecular Mechanism of Amino Acid Discrimination for the Editing Reaction of E. coli Leucyl-tRNA Synthetase

Certain aminoacyl-tRNA synthetases (aaRSs) have developed highly accurate molecular machinery to discriminate their cognate amino acids. These aaRSs achieve this via an editing reaction in the Connective Polypeptide 1 (CP1) domain. Recent mutagenesis studies have revealed the critical importance of residues in the CP1 domain for editing activity, and X-ray structures have shown the binding mode of noncognate amino acids in the editing domain. To pursue the molecular mechanism of amino acid discrimination, molecular modeling studies were performed. Our results suggest that the aaRS binds the noncognate amino acid more tightly than the cognate one. Finally, by comparing the binding conformations of the amino acids in three systems, the amino acid binding mode was elucidated and a discrimination mechanism proposed. The results strongly suggest that the conserved threonines are responsible for amino acid discrimination, achieved through side-chain interactions between T252 and T247/T248 as well as between those threonines and the incoming amino acid.

Investigating Transformations in the Cartesian Plane Using Spreadsheets

The link between coordinate transformations in the plane and their effects on the graph of a function can be difficult for students of college-level mathematics to grasp. To solidify this conceptual link in the mind of the student, Microsoft Excel can serve as a convenient graphing tool and pedagogical aid. The authors describe how various transformations and their related functional symmetry properties can be displayed graphically with an Excel spreadsheet.
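Outside a spreadsheet, the same idea of tabulating a base function next to its transformed graphs (as one would in adjacent Excel columns) can be sketched in a few lines of Python; the base function here is an arbitrary illustrative choice:

```python
import numpy as np

def f(x):
    return x**2 - 2*x              # base function (hypothetical example)

x = np.linspace(-4.0, 4.0, 81)     # symmetric grid about x = 0
shift_right = f(x - 3)             # horizontal shift: graph moves right by 3
shift_up    = f(x) + 5             # vertical shift: graph moves up by 5
reflect_y   = f(-x)                # reflection about the y-axis
reflect_x   = -f(x)                # reflection about the x-axis

# Symmetry check: g(x) = f(x) + f(-x) is always an even function,
# so its tabulated values read the same forwards and backwards.
g = f(x) + f(-x)
print(np.allclose(g, g[::-1]))     # True
```

Plotting each column against `x` (or charting the columns in Excel) makes the geometric effect of each transformation immediately visible.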

e-Plagiarism Detection at Glamorgan

Plagiarism offences by students in higher education are increasing in the digital educational world. At the same time, various competing online assessment and plagiarism detection tools are available on the market. Taking the University of Glamorgan as a case study, this paper describes an institutional journey in electronic plagiarism detection, reporting initial experience with an innovative tool and method that could be explored further in future research. A comparative study and the system workflow for the e-plagiarism detection tool are discussed, and the benefits for both academics and students are presented: electronic plagiarism detection tools brought substantial benefits to both groups at Glamorgan. The debates raised by this initial experience are also discussed.

Radio and Television Supreme Council as a Regulatory Board

Broadcasting has changed rapidly in parallel with the changing world, and it has been influenced and reshaped by the emergence of new communication technologies. These developments have had many economic and social consequences, the most important of which concern the power of governments to control the means of communication and the control mechanisms devised for newly arising issues. For this purpose, autonomous and independent regulatory bodies have been established by the state. One of these is the Radio and Television Supreme Council, established in 1994 by Code No. 3984. Today the Radio and Television Supreme Council, which is responsible for regulating radio and television broadcasts across Turkey, holds an important and effective position as an autonomous and independent regulatory body. On the one hand, the Council acts as a notable organizer of the sensitive area of radio and television broadcasting; on the other, as one of the central organs of media policy, it sets principles for the functioning of broadcasting in a democratic and liberal framework while keeping the concept of the public interest in mind. In this study, the role of the Radio and Television Supreme Council in controlling communication and its control mechanisms is examined under Code No. 3984, together with the changes in its duties introduced by Code No. 6112 of 2011.

Recovery of Copper and DCA from a Simulated Micellar-Enhanced Ultrafiltration (MEUF) Waste Stream

Simultaneous recovery of copper and DCA from a simulated MEUF concentrate stream was investigated. The effects of surfactant (DCA) and metal (copper) concentrations, surfactant-to-metal molar ratio (S/M ratio), electroplating voltage, EDTA concentration, solution pH, and salt concentration on metal recovery and current efficiency were studied. An applied voltage of -0.5 V was shown to be the optimum operating condition in terms of Cu recovery, current efficiency, and surfactant recovery. Cu recovery and current efficiency increased with increasing Cu concentration when the DCA concentration was kept constant. However, increasing both Cu and DCA concentrations at a constant S/M ratio of 2.5 had a detrimental effect on Cu recovery at DCA concentrations above 15 mM. Cu recovery decreased with increasing pH, while current efficiency showed the opposite trend; conductivity is believed to be the main cause of this discrepancy between Cu recovery and current efficiency at different pH values. Finally, EDTA had an adverse effect on both Cu recovery and current efficiency, while the addition of NaCl salt had a negative impact on current efficiency at concentrations above 8,000 mg/L.

An Advanced Time-Frequency Domain Method for PD Extraction with Non-Intrusive Measurement

Partial discharge (PD) detection is an important method for evaluating the insulation condition of metal-clad apparatus. Non-intrusive sensors, which are easy to install and do not interrupt operation, are preferred for on-site PD detection; however, such measurements often lack accuracy because of interference in the PD signals. This paper introduces a novel PD extraction method that combines frequency analysis with entropy-based time-frequency (TF) analysis. Repetitive pulses from the converter are first removed via frequency analysis. Then the relative entropy and relative peak frequency of each pulse (i.e., its time-indexed TF spectrum vector) are calculated, and all pulses with similar parameters are grouped. Based on the characteristics of the non-intrusive sensor and the frequency distribution of PDs, the PD pulses are separated from the interference. Finally, the PD signal and the interference are recovered via the inverse TF transform. De-noising results on noisy PD data demonstrate that the combination of frequency and time-frequency techniques can discriminate PDs from interference with various frequency distributions.
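The grouping step can be sketched with relative entropy alone. The fragment below is a toy illustration on synthetic pulses (a damped oscillation standing in for a PD pulse, a low-frequency sinusoid for interference; all waveforms and the 0.5 threshold are assumptions, not the paper's data): pulses whose normalized spectra have low relative entropy against a reference fall into one group.

```python
import numpy as np

def norm_spectrum(pulse):
    """Normalized magnitude spectrum, usable as a probability vector."""
    mag = np.abs(np.fft.rfft(pulse)) + 1e-12
    return mag / mag.sum()

def rel_entropy(p, q):
    """Relative entropy (KL divergence) between two spectra."""
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
t = np.linspace(0, 1e-6, 256)

def pd_pulse():      # damped 30 MHz oscillation + noise (synthetic PD)
    return np.exp(-t / 2e-7) * np.sin(2*np.pi*3e7*t) + 0.01*rng.standard_normal(t.size)

def noise_pulse():   # 5 MHz sinusoid + noise (synthetic interference)
    return np.sin(2*np.pi*5e6*t) + 0.01*rng.standard_normal(t.size)

pulses = [pd_pulse(), pd_pulse(), noise_pulse(), pd_pulse(), noise_pulse()]
ref = norm_spectrum(pulses[0])                  # first pulse as reference
d = [rel_entropy(norm_spectrum(p), ref) for p in pulses]
groups = [0 if di < 0.5 else 1 for di in d]     # simple threshold grouping
print(groups)  # PD-like pulses in group 0, interference in group 1
```

In the paper's method this distance is combined with the relative peak frequency before grouping; the sketch keeps only the entropy part for brevity.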

An Improved Method to Watermark Images Sensitive to Blocking Artifacts

A new digital watermarking technique for images that are sensitive to blocking artifacts is presented. Experimental results show that the proposed MDCT-based approach produces highly imperceptible watermarked images and is robust to attacks such as compression, noise, filtering and geometric transformations. The proposed MDCT watermarking technique is applied to fingerprints to ensure security, with the face image and demographic text data of an individual used as multiple watermarks. An automated fingerprint identification system (AFIS) was used to quantitatively evaluate the matching performance of the MDCT-watermarked fingerprints; the high matching scores show that the MDCT approach is resilient to blocking artifacts. The quality of the extracted face and text images was computed using two human visual system metrics, and the results show that the image quality was high.
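To illustrate the general idea of transform-domain watermark embedding, the sketch below hides bits in mid-frequency coefficients by quantization index modulation. Note the assumptions: an ordinary 2-D DCT stands in for the authors' MDCT, and the step size `Q` and coefficient positions are hypothetical choices, not values from the paper.

```python
import numpy as np
from scipy.fft import dctn, idctn

Q = 8.0  # hypothetical quantization step

def embed(img, bits, coords):
    """Push each chosen coefficient into a bit-dependent quantizer cell."""
    c = dctn(img.astype(float), norm='ortho')
    for b, (i, j) in zip(bits, coords):
        base = Q * np.floor(c[i, j] / Q)
        c[i, j] = base + (Q / 4 if b else 3 * Q / 4)
    return idctn(c, norm='ortho')

def extract(img, coords):
    """Read each bit back from the coefficient's position within its cell."""
    c = dctn(img, norm='ortho')
    return [1 if (c[i, j] % Q) < Q / 2 else 0 for (i, j) in coords]

rng = np.random.default_rng(1)
image = rng.uniform(0, 255, (32, 32))            # synthetic host image
bits = [1, 0, 1, 1, 0]
coords = [(4, 5), (5, 4), (6, 3), (3, 6), (5, 5)]  # mid-band positions
marked = embed(image, bits, coords)
print(extract(marked, coords) == bits)           # True: round trip recovers bits
```

The Q/4-wide margin around each quantizer cell is what buys robustness to small perturbations such as mild compression noise.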

Construction of a cDNA Library and EST Analysis of Tenebrio molitor Larvae

To further advance research on immune-related genes from T. molitor, we constructed a cDNA library and analyzed expressed sequence tag (EST) sequences from 1,056 clones. After removing vector sequences and quality checking through the Phred program (trim_alt 0.05, P-score > 20), 1,039 sequences were generated. The average insert length was 792 bp. In addition, we identified 162 clusters, 167 contigs and 391 singletons after the clustering and assembly process using the TGICL package. EST sequences were searched against the NCBI nr database by local BLAST (blastx, E

Response Surface Based Optimization of Toughness of Hybrid Polyamide 6 Nanocomposites

Toughening of polyamide 6 (PA6)/nanoclay (NC) nanocomposites with styrene-ethylene/butadiene-styrene copolymer (SEBS), using maleated SEBS (mSEBS) as a compatibilizer, was investigated by blending the components in a co-rotating twin-screw extruder. A response surface method of experimental design was used to optimize the material and processing parameters. The effects of four factors on the toughness of the hybrid nanocomposites were studied: SEBS, mSEBS and NC contents as material variables, and the order of mixing as a processing factor. All the prepared samples showed ductile behavior, and the low-temperature Izod impact toughness of some of the hybrid nanocomposites improved by 900% compared to the PA6 matrix, while the modulus showed a maximum enhancement of 20% over the pristine PA6 resin.
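The response-surface step fits a second-order polynomial to the measured responses and locates its stationary point. The sketch below does this for two coded factors with synthetic data (the design points, the "true" surface and its optimum are invented for illustration, not taken from the study):

```python
import numpy as np

def design_matrix(x1, x2):
    """Second-order RSM model: intercept, linear, interaction, quadratic."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])

# Synthetic central-composite-style design points (coded units)
x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.4, 1.4, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0, -1.4, 1.4])
# "True" toughness surface with its maximum at (0.3, -0.2)
y = 50 - 4*(x1 - 0.3)**2 - 6*(x2 + 0.2)**2

beta, *_ = np.linalg.lstsq(design_matrix(x1, x2), y, rcond=None)
b0, b1, b2, b12, b11, b22 = beta

# Stationary point: solve grad(fitted quadratic) = 0
A = np.array([[2*b11, b12], [b12, 2*b22]])
x_opt = np.linalg.solve(A, [-b1, -b2])
print(np.round(x_opt, 2))  # [ 0.3 -0.2]: the planted optimum is recovered
```

With real data the fitted surface would also be checked for lack of fit and the stationary point classified (maximum, minimum or saddle) from the signs of the eigenvalues of `A`.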

Italians' Social and Emotional Loneliness: The Results of Five Studies

Subjective loneliness describes people who feel a disagreeable or unacceptable lack of meaningful social relationships, at both the quantitative and the qualitative level. The studies presented here tested an Italian 18-item self-report loneliness measure that included items adapted from previously developed scales, namely a short version of the UCLA scale (Russell, Peplau and Cutrona, 1980) and the 11-item Loneliness Scale by De Jong-Gierveld and Kamphuis (JGLS; 1985). The studies aimed at testing the developed scale and at verifying whether loneliness is better conceptualized as a unidimensional construct (so-called 'general loneliness') or as a bidimensional one comprising the distinct facets of social and emotional loneliness. The loneliness questionnaire included two single-item criterion measures of sad mood and social contact, and asked participants to supply information on a number of socio-demographic variables. Factor analyses of responses obtained in two preliminary studies, with 59 and 143 Italian participants respectively, showed good factor loadings and subscale reliability and confirmed that perceived loneliness clearly has two components, a social and an emotional one, the latter measured by two subscales: a 7-item 'general' loneliness subscale derived from the UCLA scale and a 6-item 'emotional' scale included in the JGLS. Results further showed that the type and amount of loneliness are related negatively to the frequency of social contacts and positively to sad mood. In a third study, data were obtained from a nationwide sample of 9,097 Italian subjects, aged 12 to about 70, who completed the test online on the Italian website of the large-audience magazine Focus. The results again confirmed the reliability of the component subscales, namely social, emotional and 'general' loneliness, and showed that they were highly correlated with each other, especially the latter two.
Loneliness scores were significantly predicted by sex, age, education level, sad mood and social contact, and, to a lesser extent, by other variables such as geographical area and profession. The validity of the scale was confirmed by the results of a fourth study, with elderly men and women (N = 105) living at home or in residential care units. The three subscales were significantly related, among other variables, to depression and to various measures of the extension of, and satisfaction with, social contacts with relatives and friends. Finally, a fifth study with 315 career-starters showed that social and emotional loneliness correlate with life satisfaction and with measures of emotional intelligence. Altogether, the results showed good validity and reliability, in the tested samples, of the entire scale and of its components.

Location Management in Cellular Networks

Cellular networks provide voice and data services to users with mobility. To deliver services to mobile users, a cellular network must be able to track the locations of its users and to allow user movement during conversations; these capabilities are achieved by location management, which is concerned with the network functions necessary to allow users to be reached wherever they are in the network coverage area. In a cellular network, the service coverage area is divided into smaller areas of hexagonal shape, referred to as cells; the cellular concept was introduced to reuse radio frequencies. Continued expansion of cellular networks, coupled with an increasingly restricted mobile spectrum, has made the reduction of signalling overhead a highly important issue. Much of this traffic is used in determining the precise location of individual users when relaying calls, and the field of location management aims to reduce this overhead through prediction of user location. This paper describes and compares various location management schemes in cellular networks.
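The update-versus-paging trade-off at the heart of location management can be made concrete with a toy cost model for a movement-based scheme: the terminal reports its location after every d cell crossings, so an incoming call may have to page up to d-1 rings of hexagonal cells around the last reported cell. All cost parameters below are hypothetical, chosen only to show how an optimal threshold emerges.

```python
def cells_in_rings(d):
    """Cells within d-1 hexagonal rings of the last-updated cell."""
    return 1 + sum(6 * r for r in range(1, d))

def total_cost(d, update_cost=10.0, page_cost=1.0, moves_per_call=8.0):
    """Expected signalling cost per call for movement threshold d."""
    updates = moves_per_call / d            # expected updates between calls
    return update_cost * updates + page_cost * cells_in_rings(d)

costs = {d: round(total_cost(d), 1) for d in range(1, 7)}
best = min(costs, key=costs.get)
print(costs, best)  # a moderate threshold balances the two costs
```

Small d pays heavily for frequent updates; large d pays for paging many cells; the minimum sits in between, which is the trade-off the surveyed schemes manage in different ways.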

Tidal Data Analysis using ANN

The design of a complete expansion that allows compact representation of relevant classes of signals is a central problem in signal processing applications. Achieving such a representation means capturing the signal's features for the purposes of denoising, classification, interpolation and forecasting. Multilayer neural networks are a relatively new class of techniques that are mathematically proven to approximate any continuous function arbitrarily well; radial basis function (RBF) networks, which use Gaussian activation functions, are likewise universal approximators. In this age of ever-increasing digitization in the storage, processing, analysis and communication of information, there are numerous applications in which one needs to construct a continuously defined function or numerical algorithm to approximate, represent and reconstruct discretely sampled signal data. Often one wishes to manipulate the data in a way that requires information not included explicitly in the data, which is done through interpolation and/or extrapolation. Tidal records are a prime example of a time series, and many statistical techniques have been applied to their analysis and representation; ANNs are a recent addition to these techniques. Tidal data analysis and representation is an important requirement in marine science for forecasting. In this paper we describe the time-series representation capabilities of a special type of ANN, the radial basis function network, and present results of tidal data representation using RBFs.
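With fixed centres and widths, training an RBF network reduces to a linear least-squares problem for the output weights. The sketch below fits a Gaussian RBF network to a synthetic tide-like record built from two sinusoidal constituents (the record, centre spacing and width are illustrative assumptions, not the paper's data):

```python
import numpy as np

def rbf_design(t, centres, width):
    """Gaussian basis matrix: one column per centre."""
    return np.exp(-((t[:, None] - centres[None, :]) / width) ** 2)

t = np.linspace(0.0, 48.0, 200)                    # time, hours
# Two synthetic tidal constituents (periods roughly M2 and O1)
tide = 1.2 * np.sin(2*np.pi*t/12.42) + 0.4 * np.sin(2*np.pi*t/25.82)

centres = np.linspace(0.0, 48.0, 25)               # fixed, evenly spaced
Phi = rbf_design(t, centres, width=3.0)
w, *_ = np.linalg.lstsq(Phi, tide, rcond=None)     # linear output weights
fit = Phi @ w

rmse = float(np.sqrt(np.mean((fit - tide) ** 2)))
print(round(rmse, 4))
```

Because the weights enter linearly, no iterative backpropagation is needed, which is one practical attraction of RBF networks for time-series representation.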

Coordinated Q–V Controller for Multi-machine Steam Power Plant: Design and Validation

This paper discusses coordinated reactive power-voltage (Q-V) control in a multi-machine steam power plant. The drawbacks of manual Q-V control are briefly listed, and the design requirements for a coordinated Q-V controller are specified. The theoretical background and mathematical model of the new controller are presented next, followed by validation of the Matlab/Simulink model through comparison with responses recorded in a real steam power plant, and by a description of the practical realisation of the controller. Finally, the performance of the commissioned controller is illustrated with several examples of coordinated Q-V control in a real steam power plant and compared with manual control.
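One common ingredient of such coordination, sketched here as an assumption rather than the paper's actual control law, is sharing a plant-level reactive-power demand among the units in proportion to their remaining reactive capability, so no single machine is driven to its limit:

```python
def share_q(q_total, q_now, q_max):
    """Split an extra reactive-power demand in proportion to headroom."""
    headroom = [qm - q for q, qm in zip(q_now, q_max)]
    total_headroom = sum(headroom)
    return [q_total * h / total_headroom for h in headroom]

# Three units with unequal loadings and limits (all values in Mvar, hypothetical)
q_now = [20.0, 35.0, 10.0]
q_max = [60.0, 50.0, 40.0]
dq = share_q(30.0, q_now, q_max)    # plant voltage loop asks for 30 Mvar more
print([round(x, 1) for x in dq])    # [14.1, 5.3, 10.6], summing to 30
```

In a full controller this allocation would sit inside the plant voltage loop, with limiters and the generator AVR dynamics around it.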

A Retrospective Analysis of a Professional Learning Community: How Teachers' Capacities Shaped It

The purpose of this paper is to describe the process of setting up a learning community within an elementary school in Ontario, Canada. The description is provided through reflection on and examination of field notes taken during the year-long training and implementation process. Of specific interest was the impact of teachers' capacity on the creation of a learning community. This paper is intended to inform and add to the debate around the tensions that exist in implementing a bottom-up professional development model, such as the learning community, within a top-down organizational structure. My reflections on the process illustrate that implementing the learning community professional development model may be difficult and yet transformative in the professional lives of the teachers, students and administrators involved in the change process. I conclude by suggesting the need for a new model of professional development that requires a transformative shift in power dynamics and in the view of what constitutes effective professional learning.

Predictions Using Data Mining and Case-based Reasoning: A Case Study for Retinopathy

Diabetes is one of the most prevalent diseases worldwide, with an increasing number of complications, retinopathy being among the most common. This paper describes how data mining and case-based reasoning were integrated to predict retinopathy prevalence among diabetes patients in Malaysia. The required knowledge base was built from literature reviews and interviews with medical experts. Data from 140 diabetes patients were used to train the prediction system. A voting mechanism selects the best prediction result from the two techniques used. It has been demonstrated that both data mining and case-based reasoning can be used for retinopathy prediction, with an improved accuracy of 85%.
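One plausible shape for such a voting mechanism, sketched here as an assumption (the paper does not specify its rule, and the function names, labels and confidences below are hypothetical), is to keep the label when the two techniques agree and otherwise defer to the more confident one:

```python
def vote(dm_result, cbr_result):
    """Combine (label, confidence) pairs from two predictors."""
    (dm_label, dm_conf), (cbr_label, cbr_conf) = dm_result, cbr_result
    if dm_label == cbr_label:            # agreement: keep label, take best confidence
        return dm_label, max(dm_conf, cbr_conf)
    # disagreement: fall back to whichever technique is more confident
    return (dm_label, dm_conf) if dm_conf >= cbr_conf else (cbr_label, cbr_conf)

print(vote(("retinopathy", 0.80), ("retinopathy", 0.70)))
# ('retinopathy', 0.8)
print(vote(("retinopathy", 0.60), ("no_retinopathy", 0.75)))
# ('no_retinopathy', 0.75)
```

Such a rule lets the combined system beat either predictor alone whenever their errors are not perfectly correlated.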

Model Order Reduction of Discrete-Time Systems Using Fuzzy C-Means Clustering

A computationally simple approach to model order reduction for single-input single-output (SISO), linear time-invariant discrete systems modeled in the frequency domain is proposed in this paper. The denominator of the reduced-order model is determined using fuzzy C-means clustering, while the numerator parameters are found by matching the time moments and Markov parameters of the high-order system.
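The denominator step can be illustrated with a minimal one-dimensional fuzzy C-means routine applied to a hypothetical set of real poles (the poles, the cluster count and the fuzzifier m = 2 are illustrative assumptions): the cluster centres become the poles of the reduced denominator.

```python
import numpy as np

def fcm(x, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy C-means in 1-D: returns sorted cluster centres."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                           # fuzzy membership matrix
    for _ in range(iters):
        um = u ** m
        centres = um @ x / um.sum(axis=1)        # membership-weighted means
        d = np.abs(x[None, :] - centres[:, None]) + 1e-12
        # Standard FCM update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = 1.0 / (d ** (2/(m-1)) * np.sum(d ** (-2/(m-1)), axis=0))
    return np.sort(centres)

# High-order poles falling into two tight groups inside the unit circle
poles = np.array([0.29, 0.30, 0.31, 0.69, 0.70, 0.71])
centres = fcm(poles, c=2)
den = np.poly(centres)              # reduced 2nd-order denominator from centres
print(np.round(centres, 2), np.round(den, 2))  # [0.3 0.7] [ 1. -1. 0.21]
```

The numerator would then be fitted separately by matching time moments and Markov parameters, which the sketch omits.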

Combining the Description Features of UML-RT and CSP+T Specifications Applied to a Complete Design of Real-Time Systems

UML is a collection of notations for capturing a software system specification. These notations have a specific syntax defined by the Object Management Group (OMG), but many of their constructs have only informal semantics; they are primarily graphical, with textual annotation. The inadequacies of standard UML as a vehicle for the complete specification and implementation of real-time embedded systems have led to a variety of competing and complementary proposals. The Real-Time UML profile (UML-RT), developed and standardized by the OMG, defines a unified framework for expressing the time, scheduling and performance aspects of a system. In this paper we present a framework aimed at deriving a complete specification of a real-time system. To that end, we combine two methods: a semi-formal one, UML-RT, which allows the visual modeling of a real-time system, and a formal one, CSP+T, a design language that includes the specification of real-time requirements. To show the applicability of the approach, a correct design of a real-time system with hard real-time constraints is obtained by applying a set of mapping rules.