Delay and Energy Consumption Analysis of Conventional SRAM

The energy consumption and delay of the read and write operations of conventional SRAM are investigated analytically as well as by simulation. Explicit analytical expressions for the energy consumption and delay of the read and write operations are derived as functions of device parameters and supply voltage. These expressions are useful for predicting the effect of parameter changes on energy consumption and speed, as well as for optimizing the design of conventional SRAM. HSPICE simulation in a standard 0.25 μm CMOS technology confirms the accuracy of the analytical expressions derived in this paper.
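
The closed-form expressions themselves are given in the paper rather than in this abstract; as a rough illustration of the kind of first-order relations they capture, the sketch below estimates read energy and delay from the bitline capacitance, supply voltage, sense swing, and cell current. All parameter values are hypothetical and only indicative of a 0.25 μm-class design.

```python
# First-order SRAM read energy/delay estimate (illustrative only; the paper's
# actual closed-form expressions are not reproduced here).

def bitline_read_energy(c_bl, v_dd, dv_swing):
    """Energy drawn from the supply to restore one bitline after a dv_swing
    discharge (J). Approximation: E ~ C_BL * V_DD * dV."""
    return c_bl * v_dd * dv_swing

def bitline_read_delay(c_bl, dv_swing, i_cell):
    """Time for the cell current to develop the sense swing:
    t ~ C_BL * dV / I_cell (s)."""
    return c_bl * dv_swing / i_cell

# Hypothetical 0.25 um-class values (assumptions, not the paper's data)
c_bl = 250e-15      # bitline capacitance, F
v_dd = 2.5          # supply voltage, V
dv = 0.2            # sense-amplifier input swing, V
i_cell = 100e-6     # cell read current, A

print(f"E_read ~ {bitline_read_energy(c_bl, v_dd, dv)*1e15:.1f} fJ per bitline")
print(f"t_read ~ {bitline_read_delay(c_bl, dv, i_cell)*1e9:.2f} ns to develop swing")
```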

Biodiesel as an Alternative Fuel for Diesel Engines

There is growing interest in biodiesel (fatty acid methyl ester, or FAME) because of the similarity of its properties to those of diesel fuels. Diesel engines operated on biodiesel produce lower emissions of carbon monoxide, unburned hydrocarbons, particulate matter, and air toxics than when operated on petroleum-based diesel fuel. In this work, the production of FAME from rapeseed (non-edible oil) fatty acid distillate with a high free fatty acid (FFA) content was investigated. The esterification of the rapeseed oil was carried out with 1.8 % H2SO4 as catalyst, a MeOH/oil molar ratio of 2 : 0.1, and a reaction temperature of 65 °C for a period of 3 h. The yield of methyl ester exceeded 90 % within 1 h. The FFA content was reduced from 93 wt% to less than 2 wt% by the end of the esterification process. The FAME was purified by neutralization with a 1 M aqueous sodium hydroxide solution at a reaction temperature of 62 °C. The final FAME product met the ASTM D 6751 biodiesel quality standard.

Gypseous Soil Improvement using Fuel Oil

This research investigates the suitability of fuel oil for improving gypseous soil. Detailed laboratory tests were carried out on two soils obtained from the Al-Therthar site (Al-Anbar Province, Iraq): soil I, a sandy soil with a gypsum content of 51.6%, and soil II, a clayey soil with a gypsum content of 26.55%. The aim is to minimize the effect of moisture on these soils by treating them with fuel oil, a locally available, low-cost material. The test program included measurements of permeability, compressibility, collapse behavior, and shear strength, as well as the weight loss of fuel oil due to drying. These tests were conducted on both treated and untreated soils to observe the effect on the engineering properties of mixing in varying amounts of fuel oil equivalent to the water content. The results show that fuel oil is a good material for modifying the collapsibility and permeability of gypseous soil, which are its main problems, while retaining an amount of cohesion in the soil adequate for carrying structural loads.

Development of a Kinetic Model for the Photodegradation of 4-Chlorophenol using a XeBr Excilamp

Excilamps are new UV sources with great potential for application in wastewater treatment. In the present work, a XeBr excilamp emitting radiation at 283 nm has been used for the photodegradation of 4-chlorophenol over a range of concentrations from 50 to 500 mg L-1. Total removal of 4-chlorophenol was achieved for all concentrations assayed. The two main photoproduct intermediates formed during photodegradation, benzoquinone and hydroquinone, although not completely removed, remained at very low residual concentrations that are insignificant compared with the initial 4-chlorophenol concentrations and are non-toxic. In order to simulate the process and support its scale-up, a kinetic model has been developed and validated against the experimental data.
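
A minimal kinetic sketch of this type of scheme, assuming pseudo-first-order photolysis of 4-chlorophenol followed by first-order decay of a lumped intermediate, is shown below; the rate constants and initial concentration are hypothetical, not the fitted values of the paper.

```python
# Illustrative series-reaction kinetic scheme for UV photodegradation
# (4-CP -> intermediate -> products), assuming pseudo-first-order steps.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 0.05, 0.12        # min^-1, assumed pseudo-first-order rate constants

def rhs(t, y):
    c_4cp, c_int = y
    return [-k1 * c_4cp,               # 4-chlorophenol photolysis
            k1 * c_4cp - k2 * c_int]   # lumped intermediate balance

c0 = 100.0                              # initial 4-CP concentration, mg/L (assumed)
sol = solve_ivp(rhs, (0.0, 120.0), [c0, 0.0], t_eval=np.linspace(0, 120, 25))

for t, c, i in zip(sol.t[::6], sol.y[0][::6], sol.y[1][::6]):
    print(f"t={t:5.1f} min   [4-CP]={c:6.2f}   [intermediate]={i:5.2f} mg/L")
```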

U.S. Supreme Court Justices and Partisanship: Support for the President and Solicitor General

This paper analyzes the extent to which the justices of the U.S. Supreme Court cast votes that support the positions of the president or, more generally, the Executive Branch. Can presidents count on such deference from the justices they nominate, or from those nominated by other presidents of the same party? Or do the justices demonstrate judicial independence and impartiality, such that they are not predisposed to vote in favor of the arguments of their nominating president's party? The results suggest that while the justices in general do not exhibit any marked tendency toward partisan support of presidents, more recent and more conservative Supreme Court justices are significantly more likely to support Republican presidents.

A Note on the Global GMRES for Solving the Matrix Equation AXB = F

In the present work, we propose a new projection method for solving the matrix equation AXB = F. To implement the new method, generalized forms of the block Krylov subspace and the global Arnoldi process are presented. The new method can be considered an extended form of the well-known global generalized minimum residual (Gl-GMRES) method for solving multiple linear systems, and it is therefore called the extended Gl-GMRES (EGl-GMRES) method. Some new theoretical results are established for the proposed method by employing the Schur complement. Finally, numerical results are given to illustrate the efficiency of the new method.
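
For readers who want a baseline to validate such a projection method against, the matrix equation AXB = F can always be rewritten as the linear system (B^T ⊗ A) vec(X) = vec(F) and handed to an ordinary GMRES solver. The sketch below does exactly that; it is not the EGl-GMRES method itself, and the matrices are randomly generated test data.

```python
# Baseline check for AXB = F: vectorize to (B^T kron A) vec(X) = vec(F) and
# apply ordinary GMRES. This is a small reference solver, not EGl-GMRES.
import numpy as np
from scipy.sparse.linalg import gmres

rng = np.random.default_rng(0)
n, p = 30, 20
A = np.eye(n) + 0.05 * rng.standard_normal((n, n))
B = np.eye(p) + 0.05 * rng.standard_normal((p, p))
X_true = rng.standard_normal((n, p))
F = A @ X_true @ B

K = np.kron(B.T, A)                      # (B^T kron A) vec(X) = vec(F)
f = F.flatten(order="F")                 # column-stacking vectorization
x, info = gmres(K, f, atol=1e-10)
X = x.reshape((n, p), order="F")

print("GMRES converged:", info == 0)
print("relative residual:", np.linalg.norm(A @ X @ B - F) / np.linalg.norm(F))
```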

Virtual Assembly in a Semi-Immersive Environment

Virtual Assembly (VA) is one of the key technologies in the advanced manufacturing field and a promising application of virtual reality in design and manufacturing. It has drawn much interest from industry and research institutes over the last two decades. This paper describes a process for integrating an interactive virtual-reality-based assembly simulation of a digital mock-up with the CAD/CAM infrastructure. The necessary hardware and software preconditions for the process are explained so that it can easily be adopted by non-VR experts. The article outlines how assembly simulation can improve CAD/CAM procedures and structures, how CAD model preparation has to be carried out, and which virtual environment requirements have to be fulfilled. The issue of data transfer is also addressed, together with further challenges and requirements such as anti-aliasing and collision detection. Finally, a VA simulation has been carried out for a ball valve assembly and a car door assembly with the help of the Vizard virtual reality toolkit in a semi-immersive environment, and a performance analysis has been conducted on different workstations to evaluate the importance of the graphics processing unit (GPU) in the field of VA.

Memory Effects in Randomly Perturbed Nematic Liquid Crystals

We study the typical domain size and configuration character of a randomly perturbed system exhibiting continuous symmetry breaking. As a model system we use rod-like objects within a cubic lattice interacting via a Lebwohl–Lasher-type interaction, and we describe their local direction with a headless unit director field. Examples of such systems are nematic liquid crystals and nanotubes. We further introduce impurities of concentration p, which impose random-anisotropy-field-type disorder on the directors. We study the domain-type pattern of the molecules as a function of p, the anchoring strength w between a neighboring director and an impurity, the temperature, and the history of the samples. In the simulations the directors are quenched either from a random or from a homogeneous initial configuration. Our results show that the history of the system strongly influences: i) the average domain coherence length; and ii) the range of ordering in the system. In the random case the obtained order is always short-ranged (SR). In the homogeneous case, by contrast, SR order is obtained only for strong enough anchoring and large enough concentration p; in the other cases the ordering is either quasi-long-ranged (QLR) or long-ranged (LR). We further study memory effects for the random initial configuration, where with an increasing external ordering field B either QLR or LR order is realized.
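
A minimal Metropolis sketch of this kind of model is given below: a Lebwohl–Lasher lattice of headless directors with a fraction p of quenched impurity sites carrying random easy axes of anchoring strength w. The lattice size, temperature, and run length are illustrative only and far smaller than in a production simulation.

```python
# Minimal Metropolis sketch of a Lebwohl-Lasher lattice with quenched random
# impurities (random anisotropy field of strength w on a fraction p of sites).
# Parameters and lattice size are illustrative, not those of the paper.
import numpy as np

rng = np.random.default_rng(1)
N, p, w, T = 8, 0.1, 1.0, 0.5            # lattice size, impurity conc., anchoring, temperature

def p2(c):                               # second Legendre polynomial
    return 1.5 * c * c - 0.5

def rand_units(shape):
    v = rng.standard_normal(shape + (3,))
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

n = rand_units((N, N, N))                # headless directors, random initial quench
imp = rng.random((N, N, N)) < p          # impurity sites
easy = rand_units((N, N, N))             # random easy axes at impurities

def site_energy(i, j, k, u):
    e = 0.0
    for d in ((1,0,0), (0,1,0), (0,0,1), (-1,0,0), (0,-1,0), (0,0,-1)):
        nb = n[(i+d[0]) % N, (j+d[1]) % N, (k+d[2]) % N]
        e -= p2(np.dot(u, nb))           # Lebwohl-Lasher pair term
    if imp[i, j, k]:
        e -= w * p2(np.dot(u, easy[i, j, k]))   # random-field anchoring
    return e

for _ in range(30):                      # short run; real studies need far more sweeps
    for _ in range(N**3):
        i, j, k = rng.integers(0, N, 3)
        old = n[i, j, k]
        trial = old + 0.3 * rng.standard_normal(3)
        trial /= np.linalg.norm(trial)
        dE = site_energy(i, j, k, trial) - site_energy(i, j, k, old)
        if dE < 0 or rng.random() < np.exp(-dE / T):
            n[i, j, k] = trial

order = np.mean([p2(np.dot(n[i, j, k], n[(i + 1) % N, j, k]))
                 for i in range(N) for j in range(N) for k in range(N)])
print("mean nearest-neighbour P2 (rough short-range order estimate):", order)
```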

Flexible, Adaptable and Scalable Business Rules Management System for Data Validation

The policies governing the business of any organization are reflected in its business rules. Business rules are implemented through data validation techniques coded during the software development process, so any change in business policy results in a change to the validation code that enforces it. Implementing changes in business rules without changing the code is the objective of this paper. The proposed approach enables users to create rule sets at run time, after the software has been developed. The rule sets defined by end users are associated with the data variables for which validation is required, and rules can be expressed using all the comparison and Boolean operators. Multithreading is used to validate the data entered by the end user against the applied business rules: the evaluation is performed by a newly created thread using an enhanced form of the RPN (Reverse Polish Notation) algorithm.
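
As an illustration of the evaluation step, the sketch below shows a plain (non-enhanced) RPN evaluation of a rule containing comparison and Boolean operators against a data record; the field names and the example rule are hypothetical.

```python
# Plain RPN evaluation of a business rule over a record of data values.
# The field names, operators, and rule below are hypothetical examples.
OPS = {
    ">":  lambda a, b: a > b,    "<":  lambda a, b: a < b,
    ">=": lambda a, b: a >= b,   "<=": lambda a, b: a <= b,
    "==": lambda a, b: a == b,   "!=": lambda a, b: a != b,
    "AND": lambda a, b: a and b, "OR": lambda a, b: a or b,
}

def eval_rpn(tokens, record):
    """Evaluate a rule given in postfix (RPN) order against a data record."""
    stack = []
    for tok in tokens:
        if tok in OPS:
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[tok](a, b))
        elif tok in record:
            stack.append(record[tok])      # data variable
        else:
            stack.append(float(tok))       # numeric literal
    return stack.pop()

# Infix rule "age >= 18 AND salary > 3000" written in postfix:
rule = ["age", "18", ">=", "salary", "3000", ">", "AND"]
print(eval_rpn(rule, {"age": 25, "salary": 2500}))   # False
```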

CSR of top Portuguese Companies: Relation between Social Performance and Economic Performance

Modern times call on organizations to play an active role in the social arena through Corporate Social Responsibility (CSR). The objective of this research was to test the hypothesis that there is a positive relation between social performance and economic performance, and whether there is a positive correlation between social performance and financial-economic performance. To test these hypotheses, a measure of social performance based on the Green Book of the Commission of the European Community was applied to a group of nineteen top Portuguese companies listed on the PSI 20 index over a period of five years, from 2005 to 2009. A cluster analysis was applied to group companies by their social performance and to compare and correlate their economic performance. The results indicate that the companies with better social performance are not the ones with better economic performance, and suggest that a middle path might provide a good CSR-economic performance relation, as a basis for sustainable development.

A Kernel Classifier using Linearised Bregman Iteration

In this paper we introduce a novel kernel classifier based on an iterative shrinkage algorithm developed for compressive sensing. We adopt Bregman iteration with soft and hard shrinkage functions and a generalized hinge loss to solve the l1-norm minimization problem for classification. Our experimental results on face recognition and digit classification, using SVM as the benchmark, show that our method achieves an error rate close to that of SVM but does not outperform it. We also find that the soft shrinkage method gives higher accuracy and, in some situations, more sparseness than the hard shrinkage method.
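
For reference, the sketch below shows the basic linearised Bregman iteration with soft shrinkage applied to the generic sparse-recovery problem min ||u||_1 subject to Au = b, which is the compressive-sensing building block the classifier adapts; it does not include the kernel or the generalized hinge loss, and the problem sizes and thresholds are made up.

```python
# Basic linearised Bregman iteration with soft shrinkage for the generic
# sparse-recovery problem min ||u||_1 s.t. Au = b (not the kernel/hinge-loss
# classifier of the paper; all sizes and thresholds below are assumptions).
import numpy as np

def soft(x, mu):
    """Soft-shrinkage (soft-thresholding) operator."""
    return np.sign(x) * np.maximum(np.abs(x) - mu, 0.0)

rng = np.random.default_rng(0)
m, n, k = 60, 200, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)
u_true = np.zeros(n)
u_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ u_true

mu = 50.0                                 # shrinkage threshold (assumed)
delta = 1.0 / np.linalg.norm(A @ A.T, 2)  # step size keeping the iteration stable
u, v = np.zeros(n), np.zeros(n)
for _ in range(5000):
    v += A.T @ (b - A @ u)                # Bregman accumulation step
    u = delta * soft(v, mu)               # linearised step: soft shrinkage

print("nonzeros in u:", np.count_nonzero(u), "of", n)
print("relative recovery error:", np.linalg.norm(u - u_true) / np.linalg.norm(u_true))
```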

Linear Phase High Pass FIR Filter Design using Improved Particle Swarm Optimization

This paper presents an optimal design of a linear-phase digital high-pass finite impulse response (FIR) filter using Improved Particle Swarm Optimization (IPSO). In the design process, the filter length, pass-band and stop-band frequencies, and feasible pass-band and stop-band ripple sizes are specified. FIR filter design is a multi-modal optimization problem, and an iterative method is introduced to find its optimal solution. Evolutionary algorithms such as the real-coded genetic algorithm (RGA), particle swarm optimization (PSO), and improved particle swarm optimization (IPSO) are used in this work for the design of the linear-phase high-pass FIR filter. IPSO is an improved PSO that proposes a new definition for the velocity vector and swarm updating, and hence the solution quality is improved. A comparison of simulation results reveals the optimization efficacy of the algorithm over the prevailing optimization techniques for the solution of the multi-modal, non-differentiable, highly non-linear, and constrained FIR filter design problem.
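
The sketch below illustrates the optimization set-up with a plain PSO (not the IPSO variant with the modified velocity update) fitting a short linear-phase high-pass FIR filter to an ideal magnitude response; the filter length, cut-off, and swarm constants are assumptions.

```python
# Plain PSO (not the paper's IPSO variant) fitting a length-21 linear-phase
# high-pass FIR filter; cut-off, swarm size, and PSO constants are illustrative.
import numpy as np
from scipy.signal import freqz

N = 21                                   # filter length (odd -> Type I linear phase)
half = (N + 1) // 2
wc = 0.6 * np.pi                         # assumed pass-band edge (rad/sample)
w_grid = np.linspace(0, np.pi, 256)
desired = (w_grid >= wc).astype(float)   # ideal high-pass magnitude

def fitness(x):
    h = np.concatenate([x, x[-2::-1]])   # mirror -> symmetric (linear phase)
    _, H = freqz(h, worN=w_grid)
    return np.sum((np.abs(H) - desired) ** 2)

rng = np.random.default_rng(0)
n_particles, iters = 30, 200
pos = rng.uniform(-0.5, 0.5, (n_particles, half))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

inertia, c1, c2 = 0.7, 1.5, 1.5          # inertia weight and acceleration constants
for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    f = np.array([fitness(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("best squared error on the frequency grid:", pbest_f.min())
```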

A Systematic Mapping Study on Software Engineering Education

An inadequate software engineering curriculum is considered to be one of the most common software risks. A number of solutions for improving Software Engineering Education (SEE) have been reported in the literature, but there is a need to present these solutions collectively in one place. We have performed a mapping study to present a broad view of the literature published on improving the current state of SEE. Our aim is to give academicians, practitioners, and researchers an international view of the current state of SEE. Our study identified 70 primary studies that met our selection criteria, which we further classified and categorized within a well-defined software engineering educational framework. We found that the most researched category within the SE educational framework is Innovative Teaching Methods, whereas the least researched is Student Learning and Assessment. Our future work is to conduct a systematic literature review on SEE.

Application of Spreadsheet and Queuing Network Model to Capacity Optimization in Product Development

Modeling a manufacturing system enables one to identify the effects of key design parameters on system performance and, as a result, to make correct decisions. This paper proposes a manufacturing system modeling approach using a spreadsheet model based on queuing network theory, in which a static capacity planning model and a stochastic queuing model are integrated. The model was used to improve the utilization of an existing system in relation to product design. The model incorporates parameters such as utilization, cycle time, throughput, and batch size. The study also showed that the validity of the developed model is good enough for practical application: the maximum relative error is 10%, far below the limit value of 32%. Therefore, the model developed in this study is a valuable alternative for evaluating a manufacturing system.
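
The sketch below shows the kind of single-station queueing formulas such a spreadsheet model is built from, here for a simple M/M/1 station with utilization, cycle time, and WIP via Little's law; the arrival and service rates are made up.

```python
# Generic single-station queueing calculation (M/M/1-style), the kind of
# relations a spreadsheet capacity model encodes. Rates are assumptions.
lam = 4.0            # arrival rate, jobs per hour (assumed)
mu = 5.0             # service rate, jobs per hour (assumed)

rho = lam / mu                        # utilization
wq = rho / (mu * (1.0 - rho))         # mean waiting time in queue (hours)
ct = wq + 1.0 / mu                    # cycle time = waiting + processing time
wip = lam * ct                        # work in process, via Little's law
throughput = lam                      # stable station: throughput = arrival rate

print(f"utilization = {rho:.0%}, cycle time = {ct:.2f} h, "
      f"WIP = {wip:.2f} jobs, throughput = {throughput:.1f} jobs/h")
```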

Simulation of a Multi-Component Transport Model for the Chemical Reaction of a CVD-Process

In this paper we present discretization and decomposition methods for a multi-component transport model of a chemical vapor deposition (CVD) process. CVD processes are used to manufacture deposition layers or bulk materials; in our transport model we simulate the deposition of thin layers. The microscopic model is based on heavy particles, whose behavior is derived by approximately solving a linearized multi-component Boltzmann equation. For the drift process of the particles we propose diffusion-reaction equations, as well as for the effects of heat conduction. We concentrate on solving the diffusion-reaction equation with analytical and numerical methods. For the chemical processes, modeled with reaction equations, we propose decomposition methods that decouple the multi-component model into simpler systems of differential equations. In the numerical experiments we present the computational results of the proposed models.
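
A minimal sketch of sequential (Lie) operator splitting for a 1D two-species diffusion-reaction system is given below, in which the diffusion step is advanced explicitly and the linear reaction step is solved exactly; the coefficients and grid are illustrative and unrelated to the CVD parameters of the paper.

```python
# Lie (sequential) operator splitting for a 1D two-species diffusion-reaction
# system: u_t = D u_xx - k u, v_t = D v_xx + k u. Coefficients and grid are
# illustrative only.
import numpy as np

nx, L = 101, 1.0
dx = L / (nx - 1)
D, k = 1e-3, 5.0                      # diffusion coefficient, reaction rate (assumed)
dt = 0.4 * dx * dx / D                # within the explicit diffusion stability limit
steps = 500

u = np.zeros(nx)
u[nx // 2] = 1.0                      # reactant pulse in the middle of the domain
v = np.zeros(nx)                      # product

def diffuse(c):
    cn = c.copy()
    cn[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    return cn                          # end points held at zero (Dirichlet)

for _ in range(steps):
    u, v = diffuse(u), diffuse(v)      # step 1: transport (diffusion) operator
    decay = np.exp(-k * dt)            # step 2: reaction operator, solved exactly
    v += u * (1.0 - decay)
    u *= decay

print("mass of reactant:", u.sum() * dx, "  mass of product:", v.sum() * dx)
```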

Flexible Laser-Reduced Graphene Oxide/MnO2 Electrode for Supercapacitor Applications

We have succeeded in producing a high-performance, flexible graphene/manganese dioxide (G/MnO2) electrode coated on a flexible polyethylene terephthalate (PET) substrate. The graphene film is initially synthesized by drop-casting a graphene oxide (GO) solution on the PET substrate, followed by simultaneous reduction and patterning of the dried film using a carbon dioxide (CO2) laser beam with a power of 1.8 W. The potentiostatic anodic deposition method was used to deposit thin MnO2 films with different loading masses of 10-50 and 100 μg cm-2 on the pre-prepared graphene film. The electrodes were fully characterized in terms of structure, morphology, and electrochemical performance. A maximum specific capacitance of 973 F g-1 was obtained when depositing 50 μg cm-2 of MnO2 on the laser-reduced graphene oxide (rGO), i.e. the G/50MnO2 electrode, and over 92% of its initial capacitance was retained after 1000 cycles. The good electrochemical performance and long-term cycling stability make the proposed approach a promising candidate for supercapacitor applications.

Comparative Study of View Point Types on Landscape Evaluation

The purpose of this study was to examine viewpoints at varying distances and levels and thereby comparatively analyze visual sensitivity to the elements of natural views. A questionnaire survey was conducted separately for experts and non-experts. In summary, it was confirmed that the visual sensitivity to the elements of the same natural views differed significantly depending on the subjects' professionalism and on changes in viewpoint level and distance, whereas the visual sensitivity to 'openness of visual/view axes' did not differ significantly when only the viewpoint distances were varied. In addition, the visual sensitivity to visual/view axes differed between experts and ordinary people when the viewpoint levels were varied, while the visual sensitivity to 'damaged natural view resources' differed between the two groups when the viewpoint distances were varied.

Computer Software Applicable in Rehabilitation, Cardiology and Molecular Biology

We have developed a computer program consisting of six subtests assessing children's hand dexterity, applicable in rehabilitation medicine. We carried out a normative study on a representative sample of 285 children aged 7 to 15 (mean age 11.3) and propose clinical standards for three age groups (7-9, 9-11, and 12-15 years). We show statistically significant differences among the corresponding mean task completion times and a strong correlation between task completion time and the age of the subjects. Test-retest reliability checks in a sample of 84 children gave high Pearson coefficients for the dominant and non-dominant hand, in the ranges 0.74-0.97 and 0.62-0.93, respectively. A new MATLAB-based programming tool aimed at the analysis of cardiologic RR intervals and blood pressure descriptors has also been developed. For each data set, ten different parameters are extracted: two in the time domain, four in the frequency domain, and four from Poincaré plot analysis. In addition, twelve different baroreflex sensitivity parameters are calculated. All these data sets can be visualized in the time domain together with their power spectra and Poincaré plots; if available, the respiratory oscillation curves can also be plotted for comparison. A third application processes biological data obtained from BLAST analysis.
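
By way of illustration, standard time-domain and Poincaré descriptors of the kind mentioned above can be computed from an RR-interval series as in the Python sketch below (the described tool itself is MATLAB-based; the synthetic RR series here is made up, and the frequency-domain and baroreflex parameters are omitted).

```python
# Standard time-domain and Poincaré HRV descriptors from an RR-interval series
# (well-known formulas; not the MATLAB tool described above, and the RR series
# is synthetic).
import numpy as np

rng = np.random.default_rng(0)
rr = 800 + 50 * np.sin(np.linspace(0, 20, 300)) + 20 * rng.standard_normal(300)  # ms

sdnn = np.std(rr, ddof=1)                            # overall variability
diff_rr = np.diff(rr)
rmssd = np.sqrt(np.mean(diff_rr ** 2))               # short-term variability
sd1 = rmssd / np.sqrt(2)                             # Poincaré cloud width
sd2 = np.sqrt(max(2 * sdnn ** 2 - sd1 ** 2, 0.0))    # Poincaré cloud length

print(f"SDNN={sdnn:.1f} ms  RMSSD={rmssd:.1f} ms  SD1={sd1:.1f} ms  SD2={sd2:.1f} ms")
```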

Dynamic Performance Indicators for Aged-Care Construction Projects

Key performance indicators (KPIs) are used for post-result evaluation in the construction industry and normally make no provision for change. This paper proposes a set of dynamic key performance indicators (d-KPIs) which predict the future performance of the activity being measured and thus present the opportunity to change practice accordingly. Critical to the predictability of a construction project is the ability to achieve automated data collection, and this paper proposes an effective way to collect process and engineering management data from an integrated construction management system. The d-KPI matrix developed in this study, consisting of various indicators under seven categories, can be applied to the close monitoring of aged-care facility development projects. The d-KPI matrix also enables performance measurement and comparison at both the project and organization levels.

An Improvement of PDLZW Implementation with a Modified WSC Updating Technique on FPGA

In this paper, an improvement of a PDLZW implementation with a new dictionary updating technique is proposed. A single dictionary is partitioned into hierarchical variable-word-width dictionaries, which allows the dictionaries to be searched in parallel. Moreover, a barrel shifter is adopted for loading a new input string into the shift register in order to achieve a higher speed. The original PDLZW uses a simple FIFO update strategy, which is not efficient; therefore, a new window-based updating technique is implemented to better distinguish how often each particular address in the window is referenced. A freezing policy is applied to the most often referenced address, which is not updated until all the other addresses in the window have the same priority. This guarantees that the more often referenced addresses are not updated until their time comes. This updating policy improves the compression efficiency of the proposed algorithm while keeping the architecture low in complexity and easy to implement.
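
One possible software rendering of the windowed freezing idea, in which the most-referenced slot in the window is protected from replacement until all slots reach equal priority, is sketched below; it is an interpretation for illustration, not the exact WSC hardware policy of the paper.

```python
# Illustrative interpretation of a windowed dictionary update with a freezing
# policy: the most-referenced slot is frozen, and the least-referenced slot is
# chosen as the replacement victim. Not the exact WSC hardware scheme.
class WindowDict:
    def __init__(self, window_addrs):
        self.refs = {a: 0 for a in window_addrs}     # reference count per address
        self.entries = {a: None for a in window_addrs}

    def touch(self, addr):
        """Record a dictionary hit at this address."""
        self.refs[addr] += 1

    def victim(self):
        """Pick the slot to overwrite: never the most-referenced (frozen) slot
        unless every slot in the window has equal priority."""
        counts = self.refs.values()
        if max(counts) == min(counts):
            return next(iter(self.refs))             # all equal: nothing is frozen
        frozen = max(self.refs, key=self.refs.get)
        candidates = {a: c for a, c in self.refs.items() if a != frozen}
        return min(candidates, key=candidates.get)

    def insert(self, string):
        a = self.victim()
        self.entries[a] = string
        self.refs[a] = 0                             # new entry starts unreferenced
        return a

d = WindowDict(range(4))
d.touch(2); d.touch(2); d.touch(1)
print("replace slot:", d.victim())                   # slot 0 (least referenced, not frozen)
```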