Modified Scaling-Free CORDIC Based Pipelined Parallel MDC FFT and IFFT Architecture for Radix 2^2 Algorithm

An innovative approach to developing modified scaling-free CORDIC based two-parallel pipelined Multipath Delay Commutator (MDC) FFT and IFFT architectures for the radix-2^2 FFT algorithm is presented. Multipliers and adders are the most important data paths in FFT and IFFT architectures; multipliers occupy a large area and consume considerable power. To reduce this area and power overhead, a modified scaling-free CORDIC based complex multiplier is used in the proposed design. Twiddle factor values are generally stored in a RAM block; in the proposed work, a modified scaling-free CORDIC based twiddle factor generator unit produces them instead, together with efficient switching units. In addition, the four-point FFT operations are performed without complex multiplication, which reduces area and power in the last two stages of the pipelined architectures. The proposed design is based on the multipath delay commutator method and can be extended to any radix-2^n FFT/IFFT algorithm to improve throughput. The work is synthesized with Synopsys Design Compiler using the TSMC 90-nm library. The proposed method proves better than the reference design in terms of area, throughput and power consumption. A comparative analysis of the proposed design on the Xilinx FPGA platform is also discussed.
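CORDIC replaces the complex multiplier with an iterative sequence of shift-and-add micro-rotations. As a point of reference, the minimal Python sketch below shows the classic rotation-mode CORDIC iteration, in which the constant gain K must be divided out at the end; the paper's modified scaling-free variant restructures the iterations precisely so that this gain correction (and hence the scaling multiplier) is unnecessary.

```python
import math

def cordic_rotate(x, y, angle, iterations=16):
    """Rotate (x, y) by `angle` radians using only shifts and adds.

    Classic (scaled) CORDIC: each micro-rotation multiplies the vector
    length by sqrt(1 + 2^-2i), so the accumulated gain K is divided out
    at the end.  Scaling-free variants avoid exactly this step.
    """
    K = 1.0
    for i in range(iterations):
        K *= math.sqrt(1 + 2.0 ** (-2 * i))   # accumulated CORDIC gain
    z = angle                                  # residual angle to rotate
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0            # micro-rotation direction
        x, y = x - d * y * 2.0 ** (-i), y + d * x * 2.0 ** (-i)
        z -= d * math.atan(2.0 ** (-i))
    return x / K, y / K
```

A twiddle-factor multiplication by e^{j\theta} is then just `cordic_rotate(re, im, theta)`, which is how a CORDIC unit can stand in for a complex multiplier in an FFT data path.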

A Conceptual Framework and a Mathematical Equation for Managing Construction-Material Waste and Cost Overruns

The problem of construction material waste remains unresolved, as a significant percentage of the materials delivered to some project sites end up as waste, which can add to project cost. Cost overrun is a problem that affects 90% of completed projects worldwide. The debate on how to eliminate it has been ongoing for the past 70 years, yet there has been neither substantial improvement nor a significant solution for mitigating its detrimental effects. Previous research has proposed various approaches to managing construction cost overruns and material waste; nonetheless, these studies did not give a clear indication of a framework and an equation for managing construction material waste and cost overruns. Hence, this research aims to develop a conceptual framework and a mathematical equation for managing material waste and cost overrun in the construction industry. The paper adopts a desktop methodological approach, comparing the causes of material waste with those of cost overruns reported in the literature to determine their possible relationship. The review revealed a relationship between material waste and cost overrun: an increase in material waste results in a corresponding increase in the amount of cost overrun at both the pre-contract and post-contract stages of a project. It was found from the equation that effective construction material waste management requires a “good quality of planning, estimating and design management” and a “good quality of construction, procurement and site management”, together with a decrease in “design complexity”; this would reduce material waste and subsequently reduce the amount of cost overrun by 86.74%. The conceptual framework and the mathematical equation developed in this study are recommended to professionals of the construction industry.

Influence of the Low Frequency Ultrasound on the Cadmium (II) Biosorption by an Ecofriendly Biocomposite (Extraction Solid Waste of Ammi visnaga / Calcium Alginate): Kinetic Modeling

In the present study, an ecofriendly biocomposite, namely calcium alginate immobilized Ammi visnaga (Khella) extraction waste (SWAV/CA), was prepared by the electrostatic extrusion method and used for cadmium biosorption from the aqueous phase, with and without the assistance of ultrasound, under batch conditions. The influence of low frequency ultrasound (37 and 80 kHz) on the cadmium biosorption kinetics was studied. The results show that ultrasonic irradiation significantly improves the efficiency of cadmium removal. The pseudo-first-order, pseudo-second-order, intraparticle diffusion, and Elovich models were evaluated using non-linear curve fitting. Modeling of the kinetic results shows that the biosorption process is best described by the pseudo-second-order and Elovich models, in both the absence and presence of ultrasound.
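The non-linear fitting described above can be sketched briefly. The snippet below fits the pseudo-second-order model q(t) = k2·qe²·t / (1 + k2·qe·t) with `scipy.optimize.curve_fit`; the uptake data here are synthetic stand-ins for illustration, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def pseudo_second_order(t, qe, k2):
    """Pseudo-second-order uptake: q(t) = k2*qe^2*t / (1 + k2*qe*t)."""
    return (k2 * qe ** 2 * t) / (1.0 + k2 * qe * t)

# Synthetic sorbed-amount data (mg/g) vs. contact time (min),
# standing in for measured biosorption kinetics.
t = np.array([5, 10, 20, 40, 60, 90, 120], dtype=float)
q = np.array([12.0, 17.9, 23.7, 28.3, 30.2, 31.7, 32.4])

params, _ = curve_fit(pseudo_second_order, t, q, p0=[35.0, 0.01])
qe_fit, k2_fit = params   # equilibrium capacity and rate constant
```

The same pattern applies to the pseudo-first-order, intraparticle diffusion, and Elovich models: swap the model function and compare goodness-of-fit (e.g. R² or residual sum of squares) across models.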

The Appropriateness of Antibiotic Prescribing within Dundee Dental Hospital

Background: The societal impact of antibiotic resistance is a major public health concern, and the rising incidence of resistant bacteria can ultimately be fatal. Objective: To analyse the appropriateness of antibiotic prescribing in Dundee Dental Hospital, ultimately improving the safety and quality of patient care. Methods: Two examiners independently cross-checked approximately fifty consecutive prescriptions, and the corresponding patient case notes, for each of three data collection cycles between August 2014 and September 2015. The Scottish Dental Clinical Effectiveness Programme (SDCEP) Drug Prescribing for Dentistry guidelines were the standard utilised. Three criteria (clinical justification, regime justification and review arrangements) were measured and compared to the standard. Results: Cycle one revealed that 42% of antibiotic prescriptions were appropriate. Interventions included multiple staff meetings, the introduction of a checklist attached to the prescription pack, and the production of patient leaflets explaining the indications for antibiotics. Cycles two and three revealed 44% and 30% compliance, respectively. Conclusion: The results of the audit have yet to meet the target standards set out in prescribing guidelines. However, steps are being taken and change has occurred at a cultural level.

Simulation Modeling and Analysis of In-Plant Logistics at a Cement Manufacturing Plant in India

This paper presents the findings of a successful implementation of Business Process Reengineering (BPR) of cement dispatch activities in a cement manufacturing plant located in India. A simulation model was developed to identify and analyze the areas for improvement. The company was facing a low throughput rate and subsequent forced stoppages of the plant, leading to a high production loss of 15,000 MT per month. The study found that the existing systems and procedures for in-plant logistics required significant changes. The major recommendations included process improvement at the entry gate, reducing the cycle time at the security gate, and installing an additional weigh bridge. This paper demonstrates how BPR can be implemented to improve the in-plant logistics process. The recommendations helped the plant increase its throughput by 14%.

Out-of-Plane Bending Properties of Out-of-Autoclave Thermosetting Prepregs during Forming Processes

In order to predict and model wrinkling, which is caused by out-of-plane deformation due to compressive loading in the plane of the material during forming of composite prepregs, it is necessary to quantitatively understand the relative magnitude of the bending stiffness. This study examines the bending properties of out-of-autoclave (OOA) thermosetting prepreg under vertical cantilever test conditions. A direct method for characterizing the bending behavior of composite prepregs was developed. The results from direct measurement were compared with results derived from an image-processing procedure that analyzes the images captured during the vertical bending test. A numerical simulation was performed using ABAQUS to confirm the bending stiffness value.

A Discrete Element Method Centrifuge Model of Monopile under Cyclic Lateral Loads

This paper presents data from a series of two-dimensional Discrete Element Method (DEM) simulations of a large-diameter rigid monopile subjected to cyclic loading under a high gravitational force. Monopile foundations are widely used to support tall, heavy wind turbines, which are also subjected to significant loads from wind and wave actions. A safe design must address issues such as rotations and changes in soil stiffness under these loading conditions. Design guidance on the issue is limited, as is the availability of laboratory and field test data. The interpretation of such results in sand, such as the relation between loading and displacement, relies mainly on empirical correlations to pile properties. Regarding numerical models, most available data come from the Finite Element Method (FEM); they are not comprehensive, and most FEM results are sensitive to input parameters. The micro-scale behaviour could change the mechanism of the soil-structure interaction. A DEM model was therefore used in this paper to study behaviour under cyclic lateral loads. A non-dimensional framework is presented and applied to interpret the simulation results. The DEM data compare well with various sets of published experimental centrifuge model test data in terms of lateral deflection. The accumulated permanent lateral pile displacements induced by the cyclic lateral loads were found to depend on the characteristics of the applied cyclic load, such as the loading magnitudes and directions.

Neighborhood Sustainability Assessment Tools: A Conceptual Framework for Their Use in Building Adaptive Capacity to Climate Change

Climate change remains a challenging matter for humans and the built environment in the 21st century, and the need to consider adaptation to climate change in the development process is paramount. However, there remains a lack of information regarding how we should prepare responses to this issue, such as through developing organized and sophisticated tools enabling the adaptation process. This study aims to build a systematic framework for investigating the potential that Neighborhood Sustainability Assessment (NSA) tools might offer in enabling the analysis and promotion of adaptive capacity to climate change. The framework presented in this paper addresses the issue in three main phases. The first part links sustainability and climate change in the context of adaptive capacity. It is argued that in deciding to promote sustainability in the context of climate change, both resilience and vulnerability become central. However, there is still a gap in the current literature regarding how the sustainable development process can respond to climate change, and how the resilience of practical strategies might be evaluated. It is suggested that integrating sustainability assessment processes with both resilience thinking and vulnerability might provide important components for addressing adaptive capacity to climate change. A critical review of the existing literature illustrates the current lack of work integrating these three concepts in the context of adaptive capacity to climate change. The second part identifies the most appropriate scale at which to address the built environment for climate change adaptation; it is suggested that the neighborhood scale is more suitable than either the building or urban scales.
It then presents the example of NSAs and discusses the need to explore their potential role in promoting adaptive capacity to climate change. The third part of the framework presents a comparison among three example NSAs: BREEAM Communities, LEED-ND, and CASBEE-UD. These three tools were selected as the most developed and comprehensive assessment tools currently available at the neighborhood scale. This study concludes that NSAs are likely to provide the basis for an organized framework to address the practical process of analyzing and promoting adaptive capacity to climate change. It is further argued that vulnerability (exposure and sensitivity) and resilience (interdependence and recovery) form essential aspects to be addressed in future assessment of NSAs’ capability to adapt to both short- and long-term climate change impacts. Finally, it is acknowledged that further work is required to understand impact assessment in terms of the range of physical sectors (water, energy, transportation, building, land use and ecosystems) and actor and stakeholder engagement, as well as a detailed evaluation of the NSA indicators, together with a barriers diagnosis process.

Quality of Bali Beef and Broiler after Immersion in Liquid Smoke on Different Concentrations and Storage Times

The aim of this study was to improve the durability and quality of Bali beef (M. longissimus dorsi) and broiler carcass through the addition of liquid smoke as a natural preservative. The study used the longissimus dorsi muscle from male Bali beef aged 3 years and broiler breast and thigh aged 40 days. The three types of meat were marinated in liquid smoke at concentrations of 0, 5, and 10% for 30 minutes, at a level of 20% of the sample weight (w/w), and the samples were stored at 2-5°C for 1 month. The study was designed as a 3 x 3 x 4 factorial experiment based on a completely randomized design with 5 replications: the first factor was meat type (beef, chicken breast and chicken thigh), the second factor was liquid smoke concentration (0, 5, and 10%), and the third factor was storage duration (1, 2, 3, and 4 weeks). The parameters measured were TBA value, total bacterial colonies, water holding capacity (WHC), shear force value both before and after cooking (80°C for 15 min), and cooking loss. The results showed that WHC, shear force value, cooking loss and TBA differed between the three types of meat. The higher the concentration of liquid smoke, the lower the WHC, shear force value, TBA, and total bacterial colonies; at a liquid smoke concentration of 10%, total bacterial colonies decreased by 57.3% compared with samples untreated with liquid smoke. With longer storage, total bacterial colonies and WHC increased, while shear force value and cooking loss decreased. It can be concluded that a 10% concentration of liquid smoke was able to inhibit fat oxidation and bacterial growth in Bali beef and in chicken breast and thigh.

A State-Of-The-Art Review on Web Services Adaptation

Web service adaptation involves the creation of adapters that resolve Web service incompatibilities known as mismatches. Since the importance of Web service adaptation is increasing with the frequent implementation and use of online Web services, this paper presents a literature review of Web service adaptation to investigate the main adaptation methods, their theoretical underpinnings, and the metrics used to measure adapter performance. Eighteen publications were reviewed independently by two researchers. We found that adaptation techniques are needed to solve different types of problems that may arise from incompatibilities in Web service interfaces, including protocols, messages, data and semantics that affect the interoperability of the services. Although adapters are non-invasive methods that can improve Web service interoperability, and current approaches for service adaptation exist, there is not yet one solution that fits all types of mismatches. Our results also show that only a few research projects incorporate theoretical frameworks and that metrics to measure adapters’ performance are very limited. We conclude that further research on software adaptation should improve current adaptation methods across the different layers of service interoperability, and that an adaptation framework incorporating a theoretical underpinning and qualitative and quantitative performance measures needs to be created.

Estimation of Time Loss and Costs of Traffic Congestion: The Contingent Valuation Method

The reduction of road congestion, which is inherent to the use of vehicles, is an obvious priority for public authorities. Assessing an individual’s willingness to pay to save trip time is therefore akin to estimating the change in price resulting from a new transport policy designed to increase network fluidity and improve social welfare. This study takes an innovative perspective: it initiates an economic calculation aimed at estimating the monetized value of time spent on trips made in Sfax. The aims of this study are to i) estimate the monetized value of an hour dedicated to trips, ii) determine whether or not consumers consider the environmental variables significant, and iii) analyze the impact of public congestion management through the taxation of city tolls on urban dwellers. This article is built upon a rich field survey conducted in the city of Sfax. Using the contingent valuation method, we analyze the “declared time preferences” of 450 drivers during rush hours. Taking into account the biases attributed to the method, we highlight the delicacy of this approach with regard to the revelation mode and the interrogation techniques, following the NOAA panel recommendations, with the exception of the valorization point, and other similar studies on the estimation of transport externalities.

Statistical Feature Extraction Method for Wood Species Recognition System

Effective statistical feature extraction and classification are important in image-based automatic inspection and analysis. An automatic wood species recognition system is designed to perform wood inspection at customs checkpoints to avoid mislabeling of timber, which results in a loss of income to the timber industry. The system focuses on analyzing the statistical properties of pores in wood images. This paper proposes a fuzzy-based feature extractor which mimics experts’ knowledge of wood texture to extract the properties of pore distribution from the wood surface texture. The proposed feature extractor consists of two steps, namely pore extraction and fuzzy pore management, and produces 38 statistical features from each wood image. A backpropagation neural network is then used to classify the wood species based on these statistical features. A comprehensive set of experiments on a database of 5,200 macroscopic images from 52 tropical wood species was used to evaluate the performance of the proposed feature extractor. The advantage of the proposed technique is that it mimics experts’ interpretation of wood texture, which allows human involvement when analyzing the texture. Experimental results show the efficiency of the proposed method.
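The pore-extraction step can be illustrated with a simplified sketch. The snippet below is not the paper's fuzzy extractor; it only shows the underlying idea of segmenting pores (assumed here to appear as dark regions) and summarizing their distribution as statistical features. The threshold value and the tiny synthetic image are arbitrary assumptions for illustration.

```python
import numpy as np
from scipy import ndimage

def pore_features(gray, threshold=0.35):
    """Illustrative pore statistics from a grayscale wood-surface image.

    Pores are assumed to appear as regions darker than `threshold`;
    connected components of the binary mask are treated as pores.
    """
    pores = gray < threshold                  # binary pore mask
    labeled, count = ndimage.label(pores)     # connected-component labelling
    sizes = ndimage.sum(pores, labeled, range(1, count + 1))
    return {
        "pore_count": count,
        "mean_size": float(np.mean(sizes)) if count else 0.0,
        "size_std": float(np.std(sizes)) if count else 0.0,
        "pore_fraction": float(pores.mean()),
    }

# Synthetic 8x8 "image" with two dark pore regions
img = np.ones((8, 8))
img[1:3, 1:3] = 0.1
img[5:7, 4:7] = 0.2
feats = pore_features(img)
```

A full feature vector of this kind (the paper uses 38 such features) would then feed a backpropagation neural network classifier.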

Analysis of Diverse Cluster Ensemble Techniques

Data mining is the procedure of discovering interesting patterns in large amounts of data. One of the most important supporting processes for accessing data faster is clustering: the process of identifying similarity between data according to the characteristics present in the data and grouping associated data objects into clusters. A cluster ensemble combines various runs of different clustering algorithms to obtain a general partition of the original dataset, aiming to consolidate the outcomes of a collection of individual clusterings. The performance of a clustering ensemble is mainly affected by two principal factors: diversity and quality. This paper presents an overview of different cluster ensemble algorithms, along with the methods they use to improve diversity and quality, provides a comparative analysis of different cluster ensembles, and summarizes various cluster ensemble methods. This analysis should be useful to clustering experts and should help in deciding on the most appropriate technique for the problem at hand.
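One common way to combine multiple clustering runs, mentioned in the ensemble literature as evidence accumulation, is the co-association matrix: for each pair of points, count the fraction of base clusterings that put them in the same cluster. The sketch below (illustrative toy partitions, not from any surveyed paper) shows the construction; a consensus partition is then typically obtained by re-clustering this matrix.

```python
import numpy as np

def coassociation(labelings):
    """Evidence-accumulation matrix: entry (i, j) is the fraction of
    base clusterings that assign points i and j to the same cluster."""
    n = len(labelings[0])
    co = np.zeros((n, n))
    for labels in labelings:
        labels = np.asarray(labels)
        # Pairwise same-cluster indicator for this base clustering
        co += (labels[:, None] == labels[None, :]).astype(float)
    return co / len(labelings)

# Three base partitions of six points (toy example)
runs = [
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1, 1],
    [0, 0, 0, 0, 1, 1],
]
C = coassociation(runs)
```

Diverse base partitions spread the co-association values away from 0 and 1, which is exactly where ensemble diversity interacts with consensus quality.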

Electronic Health Record System: A Perspective to Improve the Value of Services Rendered to Patients in Healthcare Organization in Rwanda, Case of CHUB and Hopital De Nemba

In Rwanda, many healthcare organizations still use a paper-based patient data record system, although it presents weaknesses in sharing patients’ health information across different services when necessary. In developed countries, the EHR has been put in place to replace the paper-based record system, but the EHR still has challenges related to privacy, security, and interoperability. The purpose of this research was to assess the existing patient data record system in the healthcare sector in Rwanda, identify what an EHR could improve over the system in place, assess the acceptance of an EHR that is interoperable and very secure, and determine whether stakeholders are ready to adopt such a system. A case-based methodology was used, with the TAM theoretical framework guiding the design of the survey questionnaire. A judgmental sample across two cases, CHUB and Hopital de Nemba, was selected, and SPSS was used for descriptive statistics. After a qualitative analysis, the findings showed that the paper-based record is useful, gives complete information about the patient, and protects patient privacy, but it is still less secure and less interoperable. The respondents indicated that they are ready to use the proposed EHR system and want it to be secure and capable of enforcing privacy, but they are not all ready for interoperability. A conclusion has been formulated, and recommendations and further research have been proposed.

Upgraded Rough Clustering and Outlier Detection Method on Yeast Dataset by Entropy Rough K-Means Method

Rough set theory handles uncertainty and incomplete information by means of two approximation sets, the lower approximation and the upper approximation. In this paper, rough clustering algorithms are improved by adopting similarity, dissimilarity-similarity and entropy based initial centroid selection. Three clustering algorithms, namely Entropy based Rough K-Means (ERKM), Similarity based Rough K-Means (SRKM) and Dissimilarity-Similarity based Rough K-Means (DSRKM), were developed and executed on the yeast dataset. The rough clustering algorithms were validated using cluster validity indexes, namely the Rand and Adjusted Rand indexes. Experimental results show that the ERKM clustering algorithm performs effectively and delivers better results than the other clustering methods. Outlier detection is an important task in data mining; outliers are objects very different from the rest of the objects in the clusters. The Entropy based Rough Outlier Factor (EROF) method is suitable for detecting outliers effectively in the yeast dataset. In the rough K-Means method, tuning the epsilon (ε) value from 0.8 to 1.08 allows outliers in the boundary region to be detected, and the RKM algorithm delivers better results when the value of epsilon (ε) is chosen in this range. Experimental results show that the EROF method performed very well on the clustering algorithms and is suitable for detecting outliers effectively for all datasets. Further, the experimental readings show that the ERKM clustering method outperformed the other methods.
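The lower/boundary split at the heart of rough K-Means can be sketched as follows. This is a generic Lingras-West-style assignment step, not the paper's ERKM/SRKM/DSRKM variants: a point whose two nearest centroids are nearly equidistant (distance ratio within a threshold, playing the role of the epsilon parameter above) is placed in the boundary region of both clusters rather than in any lower approximation, and such boundary points are natural outlier candidates. The threshold value and toy data are assumptions for illustration.

```python
import numpy as np

def rough_assign(points, centroids, eps=1.08):
    """One rough K-Means assignment step.

    If the second-nearest centroid is almost as close as the nearest
    (ratio <= eps), the point is ambiguous and goes to the boundary
    region (upper approximations of both clusters); otherwise it goes
    to the lower approximation of its nearest cluster.
    """
    lower = {i: [] for i in range(len(centroids))}
    boundary = []
    for idx, x in enumerate(points):
        d = np.linalg.norm(centroids - x, axis=1)
        order = np.argsort(d)
        nearest, second = order[0], order[1]
        if d[second] / max(d[nearest], 1e-12) <= eps:
            boundary.append(idx)        # ambiguous: boundary region
        else:
            lower[nearest].append(idx)  # confident: lower approximation
    return lower, boundary

pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [2.5, 2.4]])
cents = np.array([[0.0, 0.0], [5.0, 5.0]])
low, bnd = rough_assign(pts, cents, eps=1.5)
```

In a full rough K-Means, centroids are then recomputed as a weighted combination of lower-approximation and boundary members before the next iteration.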

Improvement of Parallel Compressor Model in Dealing with Unequal Outlet Pressure Distribution

The Parallel Compressor Model (PCM) is a simplified approach to predicting compressor performance with inlet distortions. In the PCM calculation, the sub-compressors’ outlet static pressure is assumed to be uniform, which simplifies the calculation procedure. However, if the compressor’s outlet duct is not long and straight, this assumption frequently induces errors ranging from 10% to 15%. This paper provides a revised PCM calculation method that corrects the error. The revised method employs the energy, momentum and continuity equations to acquire the needed parameters, replacing the equal static pressure assumption. Based on the revised method, PCM is applied to two compression systems with different blade types, and their performance under non-uniform inlet conditions is predicted with the revised calculation method to evaluate its efficiency. Validating the results against experimental data shows that, although small deviations occur, the calculated results agree well with the experimental data, with errors ranging from 0.1% to 3%. This proves that the revised PCM calculation method has great advantages in predicting the performance of a distorted compressor with a limited exhaust duct.

Zinc Sorption by Six Agricultural Soils Amended with Municipal Biosolids

Anthropogenic sources of zinc (Zn), including industrial emissions and effluents, Zn-rich fertilizer materials and pesticides containing Zn, can raise the concentration of soluble Zn to levels toxic to plants in acid sandy soils. The application of municipal sewage sludge or biosolids (MBS), which contain metal-immobilizing agents, to coarse-textured soils could improve the metal sorption capacity of these low-CEC soils. The purpose of this experiment was to evaluate the sorption of Zn in surface samples (0-15 cm) of six Quebec (Canada) soils amended with MBS (pH 6.9) from Val d’Or (Quebec, Canada). Soil samples amended with increasing amounts (0 to 20%) of MBS were equilibrated with various amounts of Zn as ZnCl2 in 0.01 M CaCl2 for 48 hours at room temperature. Sorbed Zn was calculated from the difference between the initial and final Zn concentrations in solution. The Zn sorption data conformed to the linear form of the Freundlich equation. The amount of sorbed Zn increased considerably with increasing MBS rate. Analysis of variance revealed a highly significant effect (p ≤ 0.001) of soil texture and MBS rate on the amount of sorbed Zn. The average Zn-sorption capacities of MBS-amended coarse-textured soils were lower than those of MBS-amended fine-textured soils; the two sandy soils (86-99% sand) amended with MBS retained 2- to 5-fold more Zn than those without MBS (control). Significant Pearson correlation coefficients were obtained between the Freundlich sorption isotherm parameter (KF) and commonly measured physical and chemical soil properties. Among all the soil properties measured, soil pH gave the best significant correlation coefficients (p ≤ 0.001) for soils receiving 0, 5 and 10% MBS. Furthermore, KF values were positively correlated with soil clay content, exchangeable basic cations (Ca, Mg or K), CEC, and the clay content to CEC ratio.
From these results, it can be concluded that (i) municipal biosolids provide sorption sites with a strong affinity for Zn, (ii) soil texture, especially clay content, and soil pH are the main factors controlling anthropogenic Zn sorption in biosolids-amended soils, and (iii) the effect of municipal biosolids on Zn sorption will be more pronounced in a sandy soil than in a clay soil.
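Fitting the linear form of the Freundlich equation, as done above, amounts to a straight-line fit in log-log space: from q = KF·C^(1/n) one gets log₁₀(q) = log₁₀(KF) + (1/n)·log₁₀(C), so the intercept recovers KF and the slope gives 1/n. The sketch below uses synthetic equilibrium data for illustration, not the study's measurements.

```python
import numpy as np

# Freundlich isotherm: q = KF * C^(1/n)
# Linear form:         log10(q) = log10(KF) + (1/n) * log10(C)

# Synthetic equilibrium data generated with KF = 50, n = 2
C = np.array([1.0, 2.0, 5.0, 10.0, 20.0])   # equilibrium concentration
q = 50.0 * C ** (1.0 / 2.0)                  # sorbed amount

# Least-squares straight line through the log-log data
slope, intercept = np.polyfit(np.log10(C), np.log10(q), 1)
KF = 10 ** intercept   # Freundlich capacity parameter
n = 1.0 / slope        # Freundlich intensity parameter
```

The fitted KF values are what the study correlates with soil pH, clay content, and CEC across the amended soils.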

Knowledge Management and Tourism: An Exploratory Study Applied to Travel Agents in Egypt

Knowledge management focuses on the development, storage, retrieval, and dissemination of information and expertise. It has become an important tool for improving performance in tourism enterprises, including improving decision-making, developing customer services, and increasing sales and profits. Knowledge management adoption depends on human, organizational and technological factors. This study aims to explore the concept of knowledge management among travel agents in Egypt, examining the requirements for adoption and its impact on performance in these agencies. The study targets Category A travel agents in Egypt, and its population encompasses Category A travel agents with an online presence. An online questionnaire was used to collect data from managers of travel agents. This study is useful for travel agents who urgently need to restructure their intermediary role and support their survival in the global travel market. It sheds light on the requirements for adoption and the expected impact on performance, which could help travel agents assess their situation and determine the extent to which they are ready to adopt knowledge management. This study contributes to knowledge by providing insights from the tourism sector in a developing country where the concept of knowledge management is still in its infancy.

Architectural Approaches to a Sustainable Community with Floating Housing Units Adapting to Climate Change and Sea Level Rise in Vietnam

Climate change and sea level rise are among the greatest challenges facing human beings in the 21st century. Because of sea level rise, several low-lying coastal areas around the globe are at risk of being completely submerged and disappearing under water. In Viet Nam in particular, the rise in sea level is predicted to result in more frequent and even permanent inundation of coastal plains. As a result, the land reserves of coastal cities will shrink in the near future, while construction ground is becoming increasingly limited due to rapid population growth. Faced with this reality, solutions are being discussed not only in the traditional view, in which accommodation is raised or moved to higher areas or residents adopt “living with the water”, but also looking forward to “living on the water”. The concept of a sustainable floating community with floating houses, based on the precious value of the long historical tradition of water dwellings in Viet Nam, would therefore be a sustainable solution for adapting to climate change and sea level rise in coastal areas. The sustainable floating community comprises sustainability in four components: architecture, environment, socio-economics and living quality. This research paper focuses on the architectural component of the floating community. Through a detailed architectural analysis of current floating houses and floating communities in Viet Nam, this research not only collects the precious values of traditional architecture that need to be preserved and developed in the proposed concept, but also illustrates weaknesses that need to be addressed for optimal design of future sustainable floating communities. Based on these studies, the research provides guidelines and appropriate architectural solutions for the concept of a sustainable floating community with floating housing units adapted to climate change and sea level rise in Viet Nam.

Futuristic Black Box Design Considerations and Global Networking for Real Time Monitoring of Flight Performance Parameters

The aim of this research paper is to conceptualize, discuss, analyze and propose alternative design methodologies for a futuristic black box for flight safety. The proposal also includes global networking concepts for real-time surveillance and monitoring of flight performance parameters, including GPS parameters. It is expected that this proposal will serve as a failsafe real-time diagnostic tool for accident investigation and for locating debris in real time. In this paper, an attempt is made to improve existing flight data recording techniques and the design considerations for a futuristic FDR, to overcome the trauma of being unable to locate the black box. Since modern communication and information technologies with large bandwidth are now available, coupled with faster computer processing techniques, the failsafe recording technique developed in this paper is feasible. Furthermore, data fusion and data warehousing technologies are available for exploitation.