Utilization of Cement Kiln Dust in Adsorption Technology

This paper presents a study of heavy metal pollution, caused by emitted cement kiln dust (CKD), in the soils around the Suk-Alkhameas cement plant in Libya and the surrounding urban areas. Soil samples were collected from sites in four directions around the cement factory, at distances of 250 m, 1000 m, and 3000 m from the factory and at a depth of 0-10 cm. These samples were analyzed for Fe(III), Zn(II), and Pb(II) as major pollutants, and the values were compared with soils 25 km from the factory as reference (control) samples. The results show that the concentration of Fe ions in the surface soil was within the acceptable range of 1000 ppm. However, the concentrations of Zn and Pb ions on the east and north sides of the factory were found to be sixfold higher than the benchmark level. This high value was attributed to the wind, which usually blows from south to north and from west to east. This work also investigates the adsorption isotherms and adsorption efficiency of CKD as an adsorbent of heavy metal ions (Fe(III), Zn(II), and Pb(II)) from the polluted soils of Suk-Alkhameas city. The investigation was conducted in batch and fixed-bed column flow modes. The adsorption efficiency of the studied heavy metal ions onto CKD depends on the pH of the solution; the optimum pH values were found to be in the range of 8-10, with efficiency decreasing at lower pH values. The removal efficiencies were 93% for Pb, 94% for Zn, and 98% for Fe ions at an adsorbent concentration of 10 g/L. The maximum removal efficiency of these ions was achieved at contact times of 50-60 minutes, at which equilibrium is reached. Fixed-bed column measurements were also made to evaluate CKD as an adsorbent for the heavy metals. The results obtained are in good agreement with the Langmuir and Drachsal assumptions of multilayer formation on the adsorbent surface.
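As an illustration of the kind of isotherm analysis the abstract mentions, the sketch below fits the linearized Langmuir isotherm, Ce/qe = Ce/qmax + 1/(KL·qmax), to batch adsorption data by least squares. The data points are synthetic placeholders, not the paper's measurements, and the constants qmax and KL are illustrative only.

```python
# Illustrative sketch: fitting the linearized Langmuir isotherm
# Ce/qe = Ce/qmax + 1/(KL*qmax) to batch adsorption data.
# The data below are hypothetical, not the paper's measurements.

def fit_langmuir(Ce, qe):
    """Least-squares fit of y = a*x + b with x = Ce, y = Ce/qe.
    Returns (qmax, KL) where qmax = 1/a and KL = a/b."""
    x = Ce
    y = [c / q for c, q in zip(Ce, qe)]
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return 1.0 / a, a / b

# Synthetic equilibrium data generated from qmax = 25 mg/g, KL = 0.4 L/mg
Ce = [1.0, 2.0, 5.0, 10.0, 20.0]
qe = [25 * 0.4 * c / (1 + 0.4 * c) for c in Ce]
qmax, KL = fit_langmuir(Ce, qe)
print(round(qmax, 2), round(KL, 2))  # recovers 25.0 and 0.4
```

Because the synthetic data obey the Langmuir form exactly, the fit recovers the generating constants; with real soil data the scatter of points around the fitted line indicates how well the isotherm describes the system.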

Design Patterns for Emergency Management Processes

Natural or human-made disasters have a significant negative impact on the environment. At the same time, there is an extensive effort to support management and decision making in emergency situations with information technologies. The purpose of this paper is therefore to propose design patterns applicable in emergency management, enabling better analysis and design of emergency management processes and thus easier development and deployment of information systems in this field. This is achieved by a detailed analysis of existing emergency management legislation, contingency plans, and information systems. The result is a set of design patterns focused on emergency management processes that enable easier design of emergency plans and development of new information systems. These results should have a major impact on the development of new information systems, as well as on more effective and faster resolution of emergencies.

Effect of Inlet Valve Variable Timing in the Spark Ignition Engine on Achieving Greener Transport

Current emission legislation and the widespread concern about the environment place numerous constraints on both governments and car manufacturers. The increasing cost of energy also means that a reduction in fuel consumption must be achieved without greatly affecting current engine production and performance. This work intends to contribute to ongoing development efforts, among them variable valve timing (VVT), for improving engine performance. The effect of inlet valve opening (IVO) and inlet valve closing (IVC) timing on engine torque and volumetric efficiency was investigated for different engine speeds. Power, brake mean effective pressure (BMEP), and brake specific fuel consumption (BSFC) were calculated and presented to show the effect of varying inlet valve timing in all cases. A special program was used to carry out the calculations. The analysis of the results shows that a 10% reduction in the IVO angle gave an improvement of around 1.3% in torque, BSFC, and volumetric efficiency, while a 10% decrease in IVC caused a 0.1% reduction in power, torque, and volumetric efficiency.
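The brake quantities named in the abstract follow from standard textbook relations: brake power P = 2πNT, four-stroke BMEP = 4πT/Vd, and BSFC = fuel mass flow / brake power. The sketch below evaluates them for an illustrative operating point; the numeric inputs are placeholders, not values from the paper.

```python
# Hedged sketch of the brake performance quantities mentioned in the
# abstract, for a four-stroke SI engine; inputs are illustrative.
import math

def brake_power_kw(torque_nm, rpm):
    # P = 2*pi*N*T, with N in rev/s
    return 2 * math.pi * rpm / 60 * torque_nm / 1000.0

def bmep_bar(torque_nm, displacement_l):
    # Four-stroke engine: BMEP = 4*pi*T / Vd
    return 4 * math.pi * torque_nm / (displacement_l * 1e-3) / 1e5

def bsfc_g_per_kwh(fuel_kg_per_h, power_kw):
    return fuel_kg_per_h * 1000.0 / power_kw

P = brake_power_kw(150.0, 3000)            # 150 N*m at 3000 rpm
print(round(P, 1),                          # ~47.1 kW
      round(bmep_bar(150.0, 2.0), 2),       # ~9.42 bar for a 2.0 L engine
      round(bsfc_g_per_kwh(12.0, P), 1))    # ~254.6 g/kWh at 12 kg/h fuel
```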

Cloud Computing Cryptography "State-of-the-Art"

Cloud computing technology is very useful in present-day life; it uses the internet and central remote servers to provide and maintain data as well as applications. Such applications can in turn be used by end users via cloud communications without any installation. Moreover, end users' data files can be accessed and manipulated from any other computer using internet services. Despite the flexibility of data and application access and usage that cloud computing environments provide, many questions remain about how to achieve a trusted environment that protects data and applications in clouds from hackers and intruders. This paper surveys the key generation and management mechanisms and the encryption/decryption algorithms used in cloud computing environments, and proposes a new security architecture for the cloud that addresses the various security gaps as much as possible. A new cryptographic environment that applies quantum mechanics in order to achieve more trusted cloud communications with less computation is also presented.

Optimization of Electrospinning Parameters by Employing a Genetic Algorithm in order to Produce the Desired Nanofiber Diameter

A numerical optimization of all electrospinning processing parameters to obtain the smallest nanofiber diameter was performed by employing a genetic algorithm (GA). The fitness function in the genetic algorithm, which was different for each parameter, was determined by a simulation approach based on Reneker's model. Moreover, the other genetic algorithm parameters, namely population size, crossover, and mutation, were tuned to obtain the optimum electrospinning processing parameters. A minimum fiber diameter of 32 nm was achieved in a simulation applying the optimum electrospinning parameters. This finding may be useful for process control and prediction of electrospun fiber production. In this paper, the predicted parameters are also compared with some experimental results.
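A minimal genetic algorithm of the kind the abstract describes can be sketched as follows. The fitness function here is a toy quadratic surrogate for fiber diameter, not Reneker's model, and the two parameters (stand-ins for voltage and flow rate), their bounds, and the GA settings are all hypothetical.

```python
# Minimal elitist GA sketch: selection, averaging crossover, Gaussian
# mutation.  The fitness is a toy surrogate, NOT Reneker's model.
import random

random.seed(0)

def fitness(p):
    v, q = p  # stand-ins for voltage (kV) and flow rate (mL/h)
    return (v - 15.0) ** 2 + 10 * (q - 0.5) ** 2  # minimum at (15, 0.5)

def evolve(pop_size=30, generations=60):
    pop = [[random.uniform(5, 30), random.uniform(0.1, 2.0)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # crossover
            if random.random() < 0.3:                     # mutation
                i = random.randrange(2)
                child[i] += random.gauss(0, 0.5)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print([round(x, 2) for x in best], round(fitness(best), 4))
```

With elitism the best individual improves monotonically, so the returned parameters land near the surrogate's optimum at (15, 0.5).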

Increased Signal to Noise Ratio in P300 Potentials by the Method of Coherent Self-Averaging in BCI Systems

Coherent Self-Averaging (CSA) is a new method proposed in this work. It is applied to simulated event-related potential (ERP) signals to find the P300 wave, which is useful in brain-computer interface (BCI) systems. The CSA method removes white noise from a signal in the time domain through successive averaging of a single signal. The method is compared with the traditional method, coherent (synchronized) averaging (CA), and shows excellent results in improving the signal-to-noise ratio (SNR). The CSA method is easy to implement, robust, and applicable to any physiological time series contaminated with white noise.
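For reference, the baseline the paper compares against (classical coherent averaging) works as sketched below: averaging N time-locked epochs of a repeated waveform buried in white noise reduces the residual noise by roughly a factor of sqrt(N). The toy "P300" template and noise level are synthetic; the paper's CSA variant itself is not reproduced here.

```python
# Classical coherent averaging (CA) baseline on synthetic ERP epochs:
# averaging 64 noisy epochs should improve SNR by about sqrt(64) = 8.
import math, random

random.seed(1)
T = 200                                                   # samples/epoch
p300 = [math.exp(-((t - 120) ** 2) / 200.0) for t in range(T)]  # toy wave

def noisy_epoch(sigma=1.0):
    return [s + random.gauss(0, sigma) for s in p300]

def average(epochs):
    n = len(epochs)
    return [sum(e[t] for e in epochs) / n for t in range(T)]

def noise_rms(x):
    # RMS of the residual after subtracting the known template
    return math.sqrt(sum((xi - si) ** 2 for xi, si in zip(x, p300)) / T)

single = noise_rms(noisy_epoch())
avg64 = noise_rms(average([noisy_epoch() for _ in range(64)]))
print(round(single / avg64, 1))  # close to sqrt(64) = 8
```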

Optimization of Deglet-Nour Date (Phoenix dactylifera L.) Phenol Extraction Conditions

The objective of this study was to optimize the extraction conditions for phenolic compounds, total flavonoids, and antioxidant activity from the Deglet-Nour date variety. The extraction of active components from natural sources depends on different factors, and knowledge of the effects of the extraction parameters is useful both for optimizing the process and for predicting the extraction yield. The effects of the extraction variables, namely the type of solvent (methanol, ethanol, and acetone) and the extraction time (1 h, 6 h, 12 h, and 24 h), on the phenolics extraction yield were evaluated. It was shown that the extraction time and the type of solvent have a statistically significant influence on the extraction of phenolic compounds from the Deglet-Nour variety. The optimized conditions, methanol as solvent and an extraction time of 6 hours, yielded 80.19 ± 6.37 mg GAE/100 g FW for total phenolic content (TPC), 2.34 ± 0.27 mg QE/100 g FW for total flavonoid content (TFC), and 90.20 ± 1.29% for antioxidant activity. According to the results obtained in this study, the Deglet-Nour variety can be considered a natural source of phenolic compounds with good antioxidant capacity.

Recommender Systems Using Ensemble Techniques

This study proposes a novel recommender system that uses data mining and multi-model ensemble techniques to enhance recommendation performance by reflecting users' precise preferences. The proposed model consists of two steps. In the first step, the study uses logistic regression, decision trees, and artificial neural networks to identify customers with a high likelihood of purchasing products in each product group, and then combines the results of each predictor using multi-model ensemble techniques such as bagging and bumping. In the second step, the study uses market basket analysis to extract association rules for co-purchased products. Finally, through these two steps, the system selects customers with a high likelihood of purchasing products in each product group and recommends suitable products from the same or different product groups to them. We test the usability of the proposed system using a prototype and real-world transaction and profile data. In addition, we survey user satisfaction with the product list recommended by the proposed system versus randomly selected product lists. The results show that the proposed system may be useful in real-world online shopping stores.
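The bagging step of such an ensemble can be sketched in miniature: train several weak predictors on bootstrap resamples and combine them by majority vote. The "model" here is a one-feature threshold stump and the data are synthetic stand-ins for customer purchase records, not the study's predictors or data.

```python
# Toy bagging sketch: bootstrap resamples + majority vote over stumps.
import random

random.seed(2)
# (feature, purchased) pairs: higher feature value -> more likely to buy
data = [(random.random(), None) for _ in range(200)]
data = [(x, 1 if x + random.gauss(0, 0.15) > 0.5 else 0) for x, _ in data]

def train_stump(sample):
    # pick the threshold minimizing training error over a small grid
    best = min((sum((1 if x > t else 0) != y for x, y in sample), t)
               for t in [i / 20 for i in range(1, 20)])
    return best[1]

def bagged_predict(x, thresholds):
    votes = sum(1 if x > t else 0 for t in thresholds)
    return 1 if votes * 2 > len(thresholds) else 0

# 15 stumps, each trained on a bootstrap resample of the data
thresholds = [train_stump(random.choices(data, k=len(data)))
              for _ in range(15)]
acc = sum(bagged_predict(x, thresholds) == y for x, y in data) / len(data)
print(round(acc, 2))
```

The vote over resampled stumps smooths out the variance of any single stump's threshold choice, which is the point of bagging.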

Optical Flow Based Moving Object Detection and Tracking for Traffic Surveillance

Automated motion detection and tracking is a challenging task in traffic surveillance. In this paper, a system is developed to gather useful information from stationary cameras for detecting moving objects in digital videos. The motion detection and tracking system is based on optical flow estimation combined with various relevant computer vision and image processing techniques that enhance the process. To remove noise, a median filter is used, and unwanted objects are removed by applying thresholding algorithms within morphological operations. Object-type restrictions are also set using blob analysis. The results show that the proposed system successfully detects and tracks moving objects in urban videos.
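A drastically simplified version of those pipeline stages can be sketched on a tiny synthetic grayscale sequence: frame differencing as a stand-in for optical flow, a 3x3 median filter for noise suppression, and a threshold to produce a binary motion mask. The frame size, object, and threshold are all hypothetical.

```python
# Tiny motion-detection pipeline sketch: difference -> median -> threshold.
W = H = 8

def frame(ox, oy):
    f = [[10] * W for _ in range(H)]          # dark background
    for y in range(oy, oy + 2):
        for x in range(ox, ox + 2):
            f[y][x] = 200                      # bright 2x2 moving object
    return f

def absdiff(a, b):
    return [[abs(a[y][x] - b[y][x]) for x in range(W)] for y in range(H)]

def median3x3(img):
    out = [[0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            win = [img[j][i]
                   for j in range(max(0, y - 1), min(H, y + 2))
                   for i in range(max(0, x - 1), min(W, x + 2))]
            win.sort()
            out[y][x] = win[len(win) // 2]
    return out

# object moves from x=1 to x=3 between the two frames
d = median3x3(absdiff(frame(1, 1), frame(3, 1)))
mask = [[1 if v > 50 else 0 for v in row] for row in d]
moving = sum(map(sum, mask))
print(moving)   # number of pixels flagged as moving
```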

Evaluation Factors of Clinical Decision Support System in u_Healthcare Service

Automated, intelligent clinical decision support systems are generally intended to assist physicians and patients in the prevention of diseases and the treatment of illnesses using computer-represented knowledge and information. In this paper, assessment factors affecting the proper design of a clinical decision support system were investigated. The procedural steps required for gathering data from clinical trials and extracting information from large healthcare repositories were listed; these are necessary for the validation and verification of an evidence-based implementation of a clinical decision support system. The goal of this paper is to extract useful evaluation factors affecting the quality of the clinical decision support system in the design, development, and implementation of a computer-based decision support system.

Fung’s Model Constants for Human Intracranial Blood Vessels Using Biaxial Tensile Test Results

The mechanical properties of cerebral arteries are of clinical worth because of their relationship with cerebrovascular diseases. To acquire these properties, eight samples were obtained from the middle cerebral arteries of human cadavers whose deaths were not due to injuries or diseases of the cerebral vessels. The samples were tested within twelve hours after resection using a precise biaxial tensile test device specially developed for the present study, considering the dimensions, sensitivity, and anisotropic nature of the samples. The resulting stress-stretch curves were plotted and subsequently fitted to a hyperelastic three-parameter Fung model. It was found that the arteries were noticeably stiffer in the circumferential than in the axial direction. It was also demonstrated that multi-parameter hyperelastic constitutive models are useful for the mathematical description of the behavior of cerebral vessel tissue. The reported material properties are a proper reference for numerical modeling of cerebral arteries and computational analysis of healthy or diseased intracranial arteries.
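A common three-parameter Fung-type form for arteries is the strain-energy function W = (c/2)(exp(Q) - 1) with Q = a1·Ett² + a2·Ezz² + 2·a4·Ett·Ezz (t: circumferential, z: axial), from which the second Piola-Kirchhoff stresses follow as S = dW/dE. The sketch below evaluates these stresses; the material constants are illustrative placeholders, not the fitted values reported in the paper.

```python
# Fung-type strain-energy evaluation; constants are hypothetical.
import math

c, a1, a2, a4 = 20.0, 1.2, 0.8, 0.5   # placeholder constants (c in kPa)

def fung_stress(Ett, Ezz):
    """Second Piola-Kirchhoff stresses S = dW/dE for
    W = (c/2)*(exp(Q) - 1), Q = a1*Ett^2 + a2*Ezz^2 + 2*a4*Ett*Ezz."""
    Q = a1 * Ett ** 2 + a2 * Ezz ** 2 + 2 * a4 * Ett * Ezz
    common = c * math.exp(Q)
    Stt = common * (a1 * Ett + a4 * Ezz)   # circumferential
    Szz = common * (a2 * Ezz + a4 * Ett)   # axial
    return Stt, Szz

Stt, Szz = fung_stress(0.2, 0.2)
print(round(Stt, 2), round(Szz, 2))  # 7.67 and 5.86 for these constants
```

With a1 > a2, the circumferential stress exceeds the axial one at equal strains, qualitatively matching the stiffer circumferential response noted in the abstract.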

Quality Management in Public e-Administration

Since the late 1970s, quality management has become an important tool for achieving high quality in public e-administration services in many countries. A very important part of quality management in e-administration is the measurement of quality indicators related to this sector. This paper therefore gives a description of e-administration, including statistics and examples from many countries worldwide, as well as an explanation of quality management in public e-administration. The paper also lists and describes quality indicators relevant to e-administration as part of quality management within it. Through a literature review and best practices, the paper aims to analyze the measurement of quality indicators and other elements of good quality management in public e-administration, and consequently to show the usefulness of quality management in providing services of high quality.

A Consideration of the Achievement of Productive Level Parallel Programming Skills

This paper considers the achievement of productive-level parallel programming skills, based on data from graduation studies at the Polytechnic University of Japan. The data show that most students can acquire parallel programming skills during the graduation study (about 600 to 700 hours) only if the programming environment is limited to GPGPUs. However, the data also show that achieving productive-level parallel programming skills during the graduation study alone is a very demanding task for a student. In addition, the data suggest that parallel programming environments for GPGPUs, such as CUDA and OpenCL, may be more suitable for parallel computing education than other environments such as MPI on a cluster system or the Cell B.E. These results should be useful not only for software development, but also for the development of hardware products using computer technologies.

Reliability Approximation through the Discretization of Random Variables using Reversed Hazard Rate Function

It is sometimes difficult to determine the exact reliability of complex systems by analytical procedures. An approximate solution to this problem can be obtained through the discretization of random variables. In this paper, we describe the usefulness of discretizing a random variable using the reversed hazard rate function of its continuous version. The discretization of the exponential distribution is demonstrated, and applications of this approach are cited. Numerical calculations indicate that the proposed approach gives a very good approximation of the reliability of complex systems under a stress-strength set-up, and its performance is better than that of the existing discrete concentration method of discretization. The approach is conceptually simple, handles analytic intractability, and reduces computational time. It can be applied in manufacturing industries for producing highly reliable items.
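One way to build a discrete analogue of a continuous distribution from its reversed hazard rate r(x) = f(x)/F(x) is sketched below for the exponential case: the discrete reversed hazard rate is r_k = p_k / F_d(k), so on a finite support {1, ..., N} with F_d(N) = 1 one can recurse F_d(k-1) = F_d(k)·(1 - r(k)). This is an illustrative construction under that matching assumption, not necessarily the exact scheme of the paper; the rate and truncation point are arbitrary.

```python
# Hedged sketch: discretizing Exp(lam) via its reversed hazard rate.
import math

lam = 1.0                                  # exponential rate (illustrative)

def reversed_hazard(x):
    # r(x) = f(x) / F(x) for the exponential distribution
    F = 1.0 - math.exp(-lam * x)
    f = lam * math.exp(-lam * x)
    return f / F

N = 10                                     # truncation point of the support
Fd = {N: 1.0}
for k in range(N, 1, -1):
    # match the discrete reversed hazard to the continuous one at k
    Fd[k - 1] = Fd[k] * (1.0 - reversed_hazard(k))
p = {k: Fd[k] - Fd.get(k - 1, 0.0) for k in range(1, N + 1)}

print(round(sum(p.values()), 6))           # masses telescope to 1.0
```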

Issues in the User Interface Design of a Content Rich Vocational Training Application for Digitally Illiterate Users

This paper discusses our preliminary experiences in designing the user interface of computerized, content-rich vocational training courseware meant for users with little or no computer experience. In targeting a growing population with limited access to skills training of any sort, we faced numerous challenges, including language and cultural differences, resource limits, gender boundaries and, in many cases, a simple lack of trainee motivation. With the unskilled population increasing much more rapidly than the number of sufficiently skilled teachers, there is little choice but to develop teaching techniques that take advantage of emerging computer-based training technologies. However, in striving to serve populations with minimal computer literacy, one must carefully design the user interface to accommodate their cultural, social, educational, motivational and other differences. Our work, which uses computer-based and haptic simulation technologies to deliver training to these populations, has provided some useful insights into potential user interface design approaches.

Creative Teaching of New Product Development to Operations Managers

New Product Development (NPD) has its roots in an engineering background. Thus, one might wonder about the interest, opportunity, contents, and delivery process if students from the soft sciences were involved. This paper addresses «What to teach?» and «How to do it?» as the preliminary research questions that originated the propositions introduced here. The curriculum-developer model, purposefully chosen to adapt the coursebook by pursuing macro/micro strategies, was found significant by an exploratory qualitative case study. Moreover, learning was developed and value created by implementing the institutional curriculum through a creative, hands-on, experiential, problem-solving, problem-based but organized teamwork approach. The product design of an orange squeezer complying with ill-defined requirements, including drafts, sketches, prototypes, CAD simulations and a business plan, plus a website, written reports and presentations, were the deliverables that confirmed an innovative contribution to the research and practice of teaching and learning engineering subjects to non-specialist operations manager candidates.

Enhanced Gram-Schmidt Process for Improving the Stability in Signal and Image Processing

The Gram-Schmidt Process (GSP) is used to convert a non-orthogonal basis (a set of linearly independent vectors) into an orthonormal basis (a set of orthogonal, unit-length vectors). The process takes each vector in turn and subtracts from it its projections onto the previously processed vectors. This paper introduces an Enhanced version of the Gram-Schmidt Process (EGSP) with inverse, which is useful for signal and image processing applications.
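The classical process described above can be sketched directly; note this is plain Gram-Schmidt, not the paper's enhanced (EGSP) variant, and the input vectors are arbitrary examples.

```python
# Classical Gram-Schmidt orthonormalization: subtract projections onto
# the previously accepted vectors, then normalize.
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:
            proj = dot(w, q)                     # component along q
            w = [wi - proj * qi for wi, qi in zip(w, q)]
        norm = math.sqrt(dot(w, w))
        if norm > 1e-12:                         # skip dependent vectors
            basis.append([wi / norm for wi in w])
    return basis

Q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
print(len(Q))  # 3 orthonormal vectors
```

In floating point this classical form can lose orthogonality for ill-conditioned inputs, which is precisely the stability concern that motivates modified or enhanced variants such as the one the paper proposes.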

The Digital Filing Cabinet: A GIS-Based Management Solution Tool for the Land Surveyor and Engineer

This paper explains how the New Jersey Institute of Technology surveying student team members designed and created an interactive GIS map, the purpose of which is to be useful to the land surveyor and engineer for project management. This was achieved by building a research and storage database that can be easily integrated into any land surveyor's current operations through the use of ArcGIS 10, ArcCatalog, and AutoCAD. This GIS database allows visual representation and information querying for multiple job sites, and simple access to uploaded data, which is geospatially referenced to each individual job site or project. It can also be utilized by engineers to determine design criteria, or to store important files. This cost-effective approach to a surveying map not only saves time, but also saves physical storage space and paper resources.

A Special Algorithm to Approximate the Square Root of a Positive Integer

This paper concerns a special algorithm for approximating the square root of a specific positive integer, built using the properties of the positive integer solutions of Pell's equation together with some elementary theorems of matrices. The algorithm is compared with the commonly used Newton's method, and a practical numerical example and error analysis are given. Unexpectedly, the algorithm has a special property: the significant figures of the approximate value of the square root increase by one digit at each step. It is quite useful on some occasions.
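The underlying idea can be sketched as follows: if (x1, y1) solves Pell's equation x² - n·y² = 1, then powers of the matrix [[x1, n·y1], [y1, x1]] generate further solutions (xk, yk) with xk/yk converging to sqrt(n). The example below uses n = 2 with its fundamental solution (3, 2); the paper's exact construction and error analysis are not reproduced.

```python
# Pell-equation-based rational approximations to sqrt(2).
from fractions import Fraction
import math

n = 2
x, y = 3, 2                       # fundamental solution: 3^2 - 2*2^2 = 1

approximations = []
a, b = x, y
for _ in range(4):
    approximations.append(Fraction(a, b))
    # one step of multiplying by the matrix [[x, n*y], [y, x]]
    a, b = x * a + n * y * b, y * a + x * b

for frac in approximations:
    err = abs(float(frac) - math.sqrt(n))
    print(frac, f"{err:.2e}")     # 3/2, 17/12, 99/70, 577/408, shrinking error
```

Each iterate is again a solution of the Pell equation, and the error shrinks rapidly from step to step, consistent with the digit-by-digit gain in significant figures noted in the abstract.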

Absorbed Dose Measurement in Gonads of Men during Abdominal and Pelvic Radiotherapy

Two different testicular tissues have to be distinguished with regard to radiation damage: first, the seminiferous tubules, the sites of spermatogenesis, which are extremely radiosensitive; second, the testosterone-secreting Leydig cells, which are considered less radiosensitive. This study aims to estimate the testicular dose, and the associated risks of infertility and hereditary effects, from abdominal and pelvic irradiation. Radiotherapy was simulated on a humanoid phantom using a 15 MV photon beam. The testicular dose was measured for various field sizes and tissue thicknesses along the beam axis using an ionization chamber and TLDs. For the transmission factor, a common method of measuring the absorbed dose distribution and electron contamination in the build-up region of high-energy radiotherapy beams, parallel-plate ionisation chambers, was used. The gonadal dose was reduced by placing lead cups around the testes, supplemented by a field-edge block. For a tumor dose of 100 cGy, the testicular dose was 2.96-8.12 cGy, depending upon the field size and the distance from the inferior field edge. The treatment parameters, the presence of a gonad shield, and the somatometric characteristics determine whether the testicular dose can exceed 1 Gy, the threshold below which a complete recovery of spermatogenesis is possible.