IFC-Based Construction Engineering Domain Ontology Development

The essence of the 21st century is the knowledge economy: knowledge has become a key resource for economic growth and social development, and the construction industry is no exception. Because construction projects are inherently complex, project managers cannot rely on information management alone; the only way to raise the level of construction project management is to establish an effective knowledge accumulation mechanism. This paper first introduces the IFC standard and the concept of an ontology, then puts forward a construction method for an IFC-based architectural engineering domain ontology, and finally builds up the ontology's concepts, properties and the relationships between the concepts. The limitations of this work are also pointed out.
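
As a rough illustration of what such a concept/property/relationship fragment could look like in machine-readable form (a minimal sketch, not the paper's actual ontology; the namespace and property names are assumptions, while class names like IfcWall follow the IFC schema):

```python
# Minimal sketch of declaring a few IFC-flavoured concepts, properties and
# relationships with rdflib. Namespace URI and property names are illustrative.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

IFC = Namespace("http://example.org/ifc-ontology#")  # illustrative namespace
g = Graph()
g.bind("ifc", IFC)

# Concepts: a small class hierarchy.
for cls in ("IfcRoot", "IfcProduct", "IfcWall", "IfcProject"):
    g.add((IFC[cls], RDF.type, OWL.Class))
g.add((IFC.IfcProduct, RDFS.subClassOf, IFC.IfcRoot))
g.add((IFC.IfcWall, RDFS.subClassOf, IFC.IfcProduct))

# A relationship between concepts (object property).
g.add((IFC.isContainedIn, RDF.type, OWL.ObjectProperty))
g.add((IFC.isContainedIn, RDFS.domain, IFC.IfcProduct))
g.add((IFC.isContainedIn, RDFS.range, IFC.IfcProject))

# A datatype property attached to a concept.
g.add((IFC.globalId, RDF.type, OWL.DatatypeProperty))
g.add((IFC.globalId, RDFS.domain, IFC.IfcRoot))

print(g.serialize(format="turtle"))
```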

Specifying Strict Serializability of Iterated Transactions in Propositional Temporal Logic

We present an operator for a propositional linear temporal logic over infinite schedules of iterated transactions which, when applied to a formula, asserts that any schedule satisfying the formula is serializable. The resulting logic is suitable for specifying and verifying consistency properties of concurrent transaction management systems that can be defined in terms of serializability, as well as other general safety and liveness properties. A strict form of serializability is used, requiring that whenever the read and write steps of one transaction occurrence precede the read and write steps of another transaction occurrence in a schedule, the first transaction must precede the second in an equivalent serial schedule. This work improves on previous work by providing a propositional temporal logic with a serializability operator that has the same PSPACE-complete computational complexity as standard propositional linear temporal logic without a serializability operator.
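
To suggest the flavour of such specifications (the operator symbol SER and the atomic propositions below are illustrative notation, not the paper's), one might combine the serializability assertion with ordinary LTL safety and liveness conjuncts:

\[
\mathrm{SER}(\varphi_{\mathrm{sched}}) \;\wedge\; \Box\big(\mathit{write}_1(x) \rightarrow \Diamond\,\mathit{commit}_1\big) \;\wedge\; \Box\Diamond\,\mathit{commit}_2
\]

Here \(\varphi_{\mathrm{sched}}\) characterizes the schedules the transaction manager can generate; the formula asserts that every such schedule is strictly serializable, that transaction 1's writes are always eventually committed, and that the iterated transaction 2 commits infinitely often (a liveness property).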

Personalization and the Universal Communications Identifier Concept

As communications systems and technology become more advanced and complex, it will be increasingly important to focus on users' individual needs. Personalization and effective user profile management will be necessary to ensure the uptake and success of new services and devices, so it is important to focus on users' requirements in this area and define solutions that meet them. The work on personalization and user profiles emerged from earlier ETSI work on a Universal Communications Identifier (UCI), a unique identifier of the user rather than a range of identifiers of the user's many communication devices or services (e.g. home/work fixed phone numbers, mobile phone numbers, fax numbers and email addresses). This paper describes work on personalization, including standardized information and preferences, and an architectural framework describing how personalization, together with the UCI concept, can be integrated into Next Generation Networks.

Determining Optimal Demand Rate and Production Decisions: A Geometric Programming Approach

In this paper a nonlinear model is presented to capture the relation between the production and marketing departments. By introducing functions such as a pricing cost function and a market-share loss function, we try to show aspects of market modelling that have not been considered before. The proposed model is a constrained signomial geometric programming model. To solve it, after a change of variables, an iterative technique based on the concept of the geometric mean is introduced for the resulting non-standard posynomial model; the technique can be applied to a wide variety of models in non-standard posynomial geometric programming form. Finally, a numerical analysis is presented to validate the proposed model.
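
For orientation, a textbook statement of the underlying problem class (not the paper's specific model): a posynomial geometric program in standard form is

\[
\begin{aligned}
\min_{t>0}\quad & g_0(t)=\sum_{k} c_{0k}\prod_{j=1}^{n} t_j^{a_{0kj}}\\
\text{s.t.}\quad & g_i(t)=\sum_{k} c_{ik}\prod_{j=1}^{n} t_j^{a_{ikj}}\le 1,\qquad c_{ik}>0,\; i=1,\dots,m,
\end{aligned}
\]

while a signomial program allows some coefficients \(c_{ik}<0\). A classical way to exploit the geometric mean in this setting, consistent with the iterative technique mentioned above, is arithmetic-geometric mean condensation: a posynomial is bounded below by a monomial, \(\sum_k u_k(t) \ge \prod_k \big(u_k(t)/\delta_k\big)^{\delta_k}\) for weights \(\delta_k>0\), \(\sum_k \delta_k = 1\), with the weights updated iteratively from the current iterate.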

eLearning Tools Evaluation based on Quality Concept Distance Computing. A Case Study

Despite the extensive use of eLearning systems, there is no consensus on a standard framework for evaluating the quality of such systems; hence there is only a minimal set of tools that can support this judgment and give information about the value of course content. This paper presents two kinds of quality evaluation indicators for eLearning courses based on the computation of three well-known metrics: the Euclidean, Hamming and Levenshtein distances. The "distance" calculus is applied to standard evaluation templates (i.e. the European Commission Programme procedures vs. the AFNOR Z 76-001 standard), determining a reference point for evaluating e-learning course quality against the optimal concept(s). The case study, based on the results of projects developed in the framework of the European "Leonardo da Vinci" Programme with Romanian contractors, tries to demonstrate the benefits of such a method.
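
A minimal sketch of the three distances named above, applied to templates encoded as numeric score vectors or answer strings (the encodings are illustrative; the paper's templates are richer):

```python
# The three template distances: Euclidean for numeric score vectors,
# Hamming for equal-length sequences, Levenshtein for free-form strings.
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length numeric score vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def hamming(a, b):
    """Number of positions at which two equal-length sequences differ."""
    return sum(x != y for x, y in zip(a, b))

def levenshtein(a, b):
    """Minimum insertions, deletions and substitutions turning a into b."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (x != y)))  # substitution
        prev = cur
    return prev[-1]

# e.g. compare a course's evaluation vector/answers against an "optimal" reference
print(euclidean([4, 3, 5], [5, 5, 5]), hamming("ABBA", "ABCA"),
      levenshtein("quality", "qualify"))
```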

Adaptive Kernel Principal Component Analysis for Online Feature Extraction

Their batch nature limits standard kernel principal component analysis (KPCA) methods in numerous applications, especially for dynamic or large-scale data. In this paper, an efficient adaptive approach is presented for online extraction of the kernel principal components (KPC). The contribution of this paper is twofold. First, the kernel covariance matrix is updated to correctly adapt to the changing characteristics of the data. Second, the KPC are recursively formulated to overcome the batch nature of standard KPCA. This formulation is derived from the recursive eigen-decomposition of the kernel covariance matrix and captures the KPC variation caused by new data. The proposed method not only alleviates the sub-optimality of KPCA for non-stationary data, but also maintains a constant update speed and memory usage as the data size increases. Experiments on simulated data and real applications demonstrate that our approach yields improvements in terms of both computational speed and approximation accuracy.
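
For context, a standard recursive form of such an update (offered as an illustration; the paper's exact formulation may differ): with \(\phi(\cdot)\) the implicit feature map, the feature-space covariance after sample \(t+1\) satisfies

\[
\hat{C}_{t+1} \;=\; \frac{t}{t+1}\,\hat{C}_t \;+\; \frac{1}{t+1}\,\phi(x_{t+1})\,\phi(x_{t+1})^{\top},
\]

so if \(\hat{C}_t = V_t \Lambda_t V_t^{\top}\) is the current eigen-decomposition, the new decomposition can be obtained by a rank-one modification of \((V_t, \Lambda_t)\) rather than by re-solving the full Gram-matrix eigenproblem; this is what keeps update time and memory constant as data arrive.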

Performance of Soft Handover Algorithm in Varied Propagation Environments

CDMA cellular networks support soft handover, which guarantees the continuity of wireless services and enhances communication quality. Cellular networks support multimedia services under varied propagation environmental conditions. In this paper, we show the effect of the characteristic parameters of the cellular environment on soft handover performance. We consider the path loss exponent, the standard deviation of shadow fading and the correlation coefficient of shadow fading as the characteristic parameters of the radio propagation environment. A very useful statistical measure for characterizing the performance of a mobile radio system is the probability of outage. It is shown through numerical results that the above parameters have a decisive effect on the probability of outage and hence on the overall performance of the soft handover algorithm.
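
As a hedged illustration (a textbook formulation with log-normal shadowing, not necessarily the paper's exact model), the outage probability at distance \(d\) from a base station can be written as

\[
P_{\text{out}}(d) \;=\; \Pr\big[P_r(d) < P_{\min}\big] \;=\; Q\!\left(\frac{\bar{P}_r(d) - P_{\min}}{\sigma}\right),
\qquad
\bar{P}_r(d) = P_t - 10\,n\,\log_{10}\!\frac{d}{d_0},
\]

where \(n\) is the path loss exponent, \(\sigma\) the standard deviation of the shadow fading in dB, and \(Q(\cdot)\) the Gaussian tail function. In soft handover the levels received from multiple base stations are combined, and the correlation coefficient of their shadowing enters through the joint distribution of the received powers.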

Development of Heterogeneous Parallel Genetic Simulated Annealing Using Multi-Niche Crowding

In this paper, a new hybrid of the genetic algorithm (GA) and simulated annealing (SA), referred to as GSA, is presented. In this algorithm, SA is incorporated into the GA to escape from local optima. The concept of the hierarchical parallel GA is employed to parallelize GSA for the optimization of multimodal functions. In addition, multi-niche crowding is used to maintain diversity in the population of the parallel GSA (PGSA). The performance of the proposed algorithms is evaluated against a standard set of multimodal benchmark functions. The multi-niche crowding PGSA and the normal PGSA show remarkable improvement in comparison with the conventional parallel genetic algorithm and the breeder genetic algorithm (BGA).
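
A minimal sketch of the core GA-with-SA-acceptance idea (a serial toy version under assumed operators and parameters; the paper's algorithm adds hierarchical parallelism and multi-niche crowding on top):

```python
# Hybrid GA/SA sketch: offspring replace parents via a simulated annealing
# (Metropolis) test, letting worse offspring survive while the temperature
# is high, which helps the population escape local optima.
import math
import random

def gsa(fitness, dim, pop_size=30, generations=200, t0=1.0, cooling=0.97):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    temp = t0
    for _ in range(generations):
        nxt = []
        for parent in pop:
            mate = random.choice(pop)
            # uniform crossover + Gaussian mutation
            child = [random.choice(pair) + random.gauss(0, 0.1)
                     for pair in zip(parent, mate)]
            d = fitness(child) - fitness(parent)   # minimisation
            # SA acceptance: keep improvements, sometimes keep worse offspring
            if d < 0 or random.random() < math.exp(-d / temp):
                nxt.append(child)
            else:
                nxt.append(parent)
        pop = nxt
        temp *= cooling                            # geometric cooling schedule
    return min(pop, key=fitness)

# e.g. a standard multimodal benchmark: the Rastrigin function
rastrigin = lambda x: 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v)
                                        for v in x)
print(gsa(rastrigin, dim=2))
```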

Region-Based Segmentation of Generic Video Scenes for Indexing

In this work we develop an object extraction method and propose efficient algorithms for object motion characterization. The set of proposed tools serves as a basis for developing object-based functionalities for the manipulation of video content. The estimators produced by the different algorithms are compared in terms of quality and performance and tested on real video sequences. The proposed method will be useful for the latest standards for encoding and describing multimedia content, MPEG-4 and MPEG-7.

The Impact Factors of Environmental Pollution and Workers' Health in the Printing Industry

This paper presents a study of the parameters affecting environmental protection in the printing industry. The paper also compares LCA studies performed within the printing industry in order to identify common practices, limitations, areas for improvement and opportunities for standardization. This comparison focuses on the data sources and methodologies used in the printing pollutants register. The presented concepts, methodology and results represent a contribution to sustainable development management. Furthermore, the paper analyzes the results of the quantitative identification of hazardous substances emitted by the printing industry of Novi Sad.

Creep Constitutive Equation for Two Materials of a Weldment: 304L Stainless Steel

In this paper, creep constitutive equations for the base (parent) and weld materials of a weldment in cold-drawn 304L stainless steel have been obtained experimentally. For this purpose, test samples were prepared from cold-drawn bars and weld material according to the ASTM standard. The creep behavior and properties of these materials were examined by conducting uniaxial creep tests. Constant-temperature, constant-load uniaxial creep tests were carried out at two high temperatures, 680 and 720 °C, under constant loads producing initial stresses ranging from 240 to 360 MPa. The experimental data were used to obtain the creep constitutive parameters using numerical optimization techniques.
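
A commonly fitted constitutive form for such tests (a standard power-law model given for illustration; the abstract does not specify the paper's exact equation) is the Norton-Bailey law

\[
\dot{\varepsilon}_c \;=\; A\,\sigma^{\,n}\,t^{\,m}\exp\!\left(-\frac{Q}{RT}\right),
\]

where \(\sigma\) is the applied stress, \(T\) the absolute temperature, and the constant \(A\), the stress exponent \(n\), the time-hardening exponent \(m\) and the activation energy \(Q\) are the material parameters identified separately for the base and weld metals, e.g. by least-squares fitting of the measured creep curves.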

Evaluating Complexity – Ethical Challenges in Computational Design Processes

Complexity, as a theoretical background, has made it easier to understand and explain the features and dynamic behavior of various complex systems. As the common theoretical background has confirmed, borrowing terminology for design from the natural sciences has helped to control and understand urban complexity. Phenomena like self-organization, evolution and adaptation are appropriate for describing the formerly inaccessible characteristics of the complex environment in unpredictable bottom-up systems. Increased computing capacity has been a key element in capturing the chaotic nature of these systems. A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control in the urban environment, and consequently to seek novel methods for steering its development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism. A complex system and its features are as such beyond human ethics. Self-organization or evolution is neither good nor bad; their mechanisms are by nature devoid of reason. They are common in urban dynamics as in natural processes. They are features of a complex system, and they cannot be prevented, yet their dynamics can be studied and supported. The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the ethics of the theory and tools, which are in themselves ethically neutral, from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on the grounds of being "non-human". In this paper the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models and the literature. It is suggested that ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.

Design of the Production Line Based On RFID through 3D Modeling

Radio-frequency identification (RFID) has emerged as a beneficial means, conforming to GS1 standards, of providing the best solutions in the manufacturing area. It competes with other automated identification technologies, e.g. barcodes and smart cards, with regard to high-speed scanning, reliability and accuracy. The purpose of this study is to improve a production line's performance by implementing an RFID system in the manufacturing area, using 3D modeling in Cinema 4D R13, which provides clear graphical scenes for users to portray their applications. Finally, with regard to improving system performance, it shows how RFID proves a well-suited technology, in comparison with the barcode scanner, for handling different kinds of raw materials in the production line based on a logical process.

Face Recognition using Features Combination and a New Non-linear Kernel

To improve the classification rate of face recognition, a feature combination and a novel non-linear kernel are proposed. The feature vector concatenates local binary patterns computed at three different radii with Gabor wavelet features; the Gabor features are the mean, standard deviation and skew of the response at each scale and orientation. The aim of the new kernel is to combine the power of kernel methods with an optimal balance between the features. To verify the effectiveness of the proposed method, numerous methods are tested on four datasets consisting of various emotions, orientations, configurations, expressions and lighting conditions. Empirical results show the superiority of the proposed technique compared to other methods.
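
A minimal sketch of this feature combination (radii, frequencies, orientations and bin counts are illustrative assumptions, not the paper's settings; scikit-image and SciPy supply the operators):

```python
# Multi-radius LBP histograms concatenated with per-filter Gabor statistics
# (mean, standard deviation, skew), as described above.
import numpy as np
from scipy.stats import skew
from skimage.feature import local_binary_pattern
from skimage.filters import gabor

def face_features(gray):
    feats = []
    # LBP at three radii (8 sampling points per unit radius, uniform patterns)
    for radius in (1, 2, 3):
        lbp = local_binary_pattern(gray, P=8 * radius, R=radius, method="uniform")
        hist, _ = np.histogram(lbp, bins=8 * radius + 2, density=True)
        feats.extend(hist)
    # Gabor responses over a grid of scales (frequencies) and orientations
    for freq in (0.1, 0.2, 0.3):
        for theta in np.arange(0, np.pi, np.pi / 4):
            real, _ = gabor(gray, frequency=freq, theta=theta)
            feats.extend([real.mean(), real.std(), skew(real.ravel())])
    return np.asarray(feats)
```

A kernel classifier would then be trained on these vectors; the "optimal balance between the features" mentioned above would enter through the kernel's weighting of the LBP and Gabor groups.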

Research on the Survivability of Embedded Real-time System

Introducing survivability into an embedded real-time system (ERTS) can improve the system's power to survive. This paper mainly discusses the survivability of ERTS. The first topic is the origin of survivability for ERTS. The second is survivability analysis: based on a definition of survivability grounded in a survivability specification, and on a division of the entire survivability analysis process for ERTS, a survivability analysis profile is presented. The quantitative analysis model of this profile is emphasized and explained in detail; quantitative analysis proved helpful for evaluating system survivability more accurately. The third is the platform design for survivability analysis: following the profile, the analysis process is encapsulated and assembled into one platform, on which quantification, standardization and simplification of survivability analysis are all achieved. The fourth is survivability design: according to the character of ERTS, a strengthened design method is selected to realize system survivability design. Through the analysis of an embedded mobile video-on-demand system, intrusion-tolerant technology is introduced into the whole survivability design.

Environmental Management System According to ISO 14001 as a Source of Eco-Innovations in Enterprises - A Case of Podkarpackie Voivodeship

This paper presents the results of empirical studies conducted in enterprises from the Podkarpackie Voivodeship (Poland). It shows the experiences of those enterprises in implementing and improving eco-innovation management in the form of a formal Environmental Management System (EMS). The study presents the expected and obtained internal benefits arising from a functioning EMS. The aim of this paper is to determine whether the findings of international theoretical studies concerning the benefits of implementing, operating and improving a formal EMS (based on the international standard ISO 14001) are confirmed by the effects of the enterprises' activities.

Mechanisms of Organic Contaminants Uptake and Degradation in Plants

As a result of urbanization, the unpredictable growth of industry and transport, the production of chemicals, military activities, etc., the concentration of anthropogenic toxicants spread in nature exceeds all permissible standards. The most dangerous among these contaminants are organic compounds of great persistence, bioaccumulation potential and toxicity, notable for their prominent occurrence in the environment and the food chain. Among natural ecological tools, plants, which still occupy above 40% of the world's land, were until recently considered organisms of only limited remediation potential, merely accumulating contaminants of different structures in their biomass and partially volatilizing them. However, analysis of the experimental data of the last two decades has revealed the essential role of plants in environmental remediation, owing to their ability to carry out intracellular degradation processes leading to partial or complete decomposition of the carbon skeletons of structurally diverse contaminants. Though phytoremediation technologies are still in research and development, various applications have already been used successfully. The paper aims to analyze the mechanisms of organic contaminant uptake and detoxification in plants, the least studied issue in evaluating and exploiting plants' potential for environmental remediation.

A Perceptually Optimized Foveation-Based Wavelet Embedded Zerotree Image Coding

In this paper, we propose a Perceptually Optimized Foveation-based Embedded ZeroTree Image Coder (POEFIC) that applies perceptual weighting to the wavelet coefficients prior to SPIHT encoding in order to reach a targeted bit rate with improved perceptual quality around a given fixation point, which determines the region of interest (ROI). The paper also introduces a new objective quality metric based on a psychovisual model that integrates the properties of the human visual system (HVS); this metric plays an important role in our POEFIC quality assessment. Our POEFIC coder is based on a vision model that incorporates various masking effects of HVS perception: the coder weights the wavelet coefficients according to that model and attempts to increase the perceptual quality for a given bit rate and observation distance. The perceptual weights for all wavelet subbands are computed based on 1) foveation masking, to remove or reduce high frequencies in peripheral regions, 2) luminance and contrast masking, and 3) the contrast sensitivity function (CSF), to achieve the perceptual decomposition weighting. The new perceptually optimized codec has the same complexity as the original SPIHT technique, and experimental results show that our coder achieves very good performance in terms of quality measurement.
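
A minimal sketch of per-subband perceptual weighting applied before a zerotree-style encoder (the weight values are illustrative placeholders, not the paper's CSF/foveation model; PyWavelets supplies the transform):

```python
# Scale each wavelet detail subband by a perceptual weight before encoding.
import numpy as np
import pywt

def weight_subbands(image, weights, wavelet="bior4.4", levels=3):
    """Apply per-subband weights to a 2-D wavelet decomposition.

    weights maps (level, orientation) -> weight, with orientation in
    {"H", "V", "D"} and level 1 = coarsest detail band, `levels` = finest;
    unlisted subbands keep weight 1.0.
    """
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    out = [coeffs[0]]                          # approximation band untouched
    for lvl, bands in enumerate(coeffs[1:], start=1):
        out.append(tuple(weights.get((lvl, o), 1.0) * band
                         for o, band in zip(("H", "V", "D"), bands)))
    return out   # weighted coefficients, ready for SPIHT-style encoding

img = np.random.rand(64, 64)                   # stand-in for a real image
w = {(3, "H"): 0.5, (3, "V"): 0.5, (3, "D"): 0.25}  # attenuate finest details
weighted = weight_subbands(img, w)
```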

Improving Fault Resilience and Reconstruction of Overlay Multicast Tree Using Leaving Time of Participants

Network layer multicast, i.e. IP multicast, even after many years of research, development and standardization, is not deployed on a large scale, due to both technical (e.g. upgrading of routers) and political (e.g. policy making and negotiation) issues. Researchers have looked for alternatives and proposed application/overlay multicast, where multicast functions are handled by end hosts rather than network layer routers. Member hosts wishing to receive multicast data form a multicast delivery tree, and the intermediate hosts in the tree also act as routers, forwarding data to the hosts below them. Unlike IP multicast, where a router cannot leave the tree until all members below it leave, in overlay multicast any member can leave the tree at any time, disjoining the tree and disrupting data dissemination; all the disrupted hosts then have to rejoin the tree. This characteristic of overlay multicast makes the multicast tree unstable and causes data loss and rejoin overhead. In this paper, we propose that each node set its leaving time from the tree and send a join request to a number of nodes in the tree. A node in the tree rejects the request if its leaving time is earlier than that of the requesting node; otherwise it accepts the request, and the requesting node can join at one of the accepting nodes. This makes the tree more stable, as nodes join the tree according to their leaving times, with the earliest-leaving nodes at the leaves of the tree. Some intermediate nodes may not respect their declared leaving times and leave earlier, thus disrupting the tree; for this, we propose a proactive recovery mechanism so that disrupted nodes can rejoin the tree at predetermined nodes immediately. We show by simulation that there is less overhead when joining the multicast tree and that the recovery time of the disrupted nodes is much smaller than in previous works.
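
A minimal sketch of the leaving-time admission rule described above (the class and method names are illustrative, not the paper's protocol messages):

```python
# A tree node accepts a join request only if it will outlive the requester,
# so parents always outlive (or match) their children and churn starts at
# the leaves, keeping the overlay tree stable.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    leaving_time: float              # announced departure time from the tree
    children: list = field(default_factory=list)

    def handle_join(self, requester: "Node") -> bool:
        """Accept iff our leaving time is no earlier than the requester's."""
        if self.leaving_time < requester.leaving_time:
            return False             # we would leave first and disrupt the subtree
        self.children.append(requester)
        return True

root = Node("root", leaving_time=1000)
a = Node("a", leaving_time=400)
print(root.handle_join(a))                             # True: root outlives a
print(a.handle_join(Node("b", leaving_time=700)))      # False: b outlives a
```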

Performance Analysis of HSDPA Systems using Low-Density Parity-Check (LDPC) Coding as Compared to Turbo Coding

HSDPA is a new feature introduced in the Release-5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay as compared to the Release-99 DSCH. Until now, HSDPA systems have used turbo coding, a coding technique that approaches the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for some applications like satellite communications, since the transmission distance itself introduces latency due to the limited speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity, although LDPC coding increases the encoding complexity. Though the complexity of the transmitter increases at the NodeB, the end user benefits in terms of receiver complexity and bit error rate. In this paper the LDPC encoder is implemented using a sparse parity-check matrix H to generate the codewords, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases, for a small increase in Eb/No, which is not possible with turbo coding; the same BER was also achieved using fewer iterations, hence the latency and receiver complexity are reduced with LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better-quality, more reliable and more robust data services; in other words, while realistic data rates are only a few Mbps, the actual quality and number of users achieved will improve significantly.
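
A minimal numpy sketch of the sum-product ("belief propagation") decoder mentioned above, using a small dense 0/1 parity-check matrix for clarity (a real HSDPA-scale H would be sparse; the LLR sign convention and the toy matrix are assumptions):

```python
# Sum-product belief propagation decoding for a binary LDPC code.
import numpy as np

def bp_decode(H, llr, max_iters=50):
    """H: (m, n) 0/1 parity-check matrix; llr: channel LLRs, >0 favours bit 0."""
    mask = H.astype(bool)
    msg = np.where(mask, llr, 0.0)               # variable-to-check messages
    hard = (llr < 0).astype(int)
    for _ in range(max_iters):
        # Check-node update (tanh rule), excluding each message's own edge.
        t = np.tanh(np.clip(msg, -20, 20) / 2.0)
        t = np.where(mask, t, 1.0)
        t = np.where(np.abs(t) < 1e-12, 1e-12, t)    # guard the division below
        ext = 2.0 * np.arctanh(np.clip(t.prod(axis=1, keepdims=True) / t,
                                       -0.999999, 0.999999))
        ext = np.where(mask, ext, 0.0)           # check-to-variable messages
        # Variable-node update: total belief minus the edge's own contribution.
        total = llr + ext.sum(axis=0)
        msg = np.where(mask, total - ext, 0.0)
        hard = (total < 0).astype(int)           # hard decision per bit
        if not (H @ hard % 2).any():             # all parity checks satisfied
            break
    return hard

# Toy example: (7,4) Hamming code as an LDPC-style H, all-zero codeword sent.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
noisy_llr = np.array([2.1, -0.4, 1.8, 2.5, 1.2, 0.9, 1.7])  # one weak bit
print(bp_decode(H, noisy_llr))                   # expect all zeros
```

The early-exit syndrome check is what produces the behaviour the abstract describes: once the channel quality (Eb/No) crosses the decoding threshold, most frames satisfy all parity checks after only a few iterations, so latency drops.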