On Fractional (k,m)-Deleted Graphs with Constraint Conditions

Let G be a graph of order n, and let k ≥ 2 and m ≥ 0 be two integers. Let h : E(G) → [0, 1] be a function. If Σ_{e∋x} h(e) = k holds for each x ∈ V(G), then we call G[F_h] a fractional k-factor of G with indicator function h, where F_h = {e ∈ E(G) : h(e) > 0}. A graph G is called a fractional (k,m)-deleted graph if there exists a fractional k-factor G[F_h] of G with indicator function h such that h(e) = 0 for any e ∈ E(H), where H is any subgraph of G with m edges. In this paper, it is proved that G is a fractional (k,m)-deleted graph if δ(G) ≥ k + m + m/(k+1), n ≥ 4k² + 2k − 6 + ((4k² + 6k − 2)m − 2)/(k − 1), and max{d_G(x), d_G(y)} ≥ n/2 for any vertices x and y of G with d_G(x, y) = 2. Furthermore, it is shown that the result in this paper is best possible in some sense.
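
For readability, the hypotheses of the theorem stated above can be collected in display form; here δ(G) denotes the minimum degree of G and d_G(x, y) the distance between x and y (the standard reading of the conditions):

```latex
\[
\delta(G) \ge k + m + \frac{m}{k+1}, \qquad
n \ge 4k^{2} + 2k - 6 + \frac{(4k^{2}+6k-2)m - 2}{k-1},
\]
\[
\max\{d_G(x),\, d_G(y)\} \ge \frac{n}{2}
\quad \text{for all } x, y \in V(G) \text{ with } d_G(x, y) = 2 .
\]
```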

A 3.125Gb/s Clock and Data Recovery Circuit Using 1/4-Rate Technique

This paper describes the design and fabrication of a clock and data recovery (CDR) circuit. We propose a new clock and data recovery circuit based on a 1/4-rate frequency detector (QRFD). The proposed frequency detector helps reduce the VCO frequency and is thus advantageous for high-speed applications. It can achieve low-jitter operation and extend the pull-in range without using a reference clock. The proposed CDR was implemented using a 1/4-rate bang-bang type phase detector (PD) and a ring voltage controlled oscillator (VCO). The CDR circuit has been fabricated in a standard 0.18-µm CMOS technology. It occupies an active area of 1 × 1 mm² and consumes 90 mW from a single 1.8 V supply.

Detecting Email Forgery using Random Forests and Naïve Bayes Classifiers

As email communications have no consistent authentication procedure to ensure authenticity, we present an investigative analysis approach for detecting forged emails based on Random Forests and Naïve Bayes classifiers. Instead of investigating the email headers, we use the body content to extract a unique writing style for each of the possible suspects. Our approach consists of four main steps: (1) the cybercrime investigator extracts different effective features, including structural, lexical, linguistic, and syntactic evidence, from previous emails of all the possible suspects; (2) the extracted feature vectors are normalized to increase the accuracy rate; (3) the normalized features are then used to train the learning engine; (4) upon receiving an anonymous email M, we apply the feature extraction process to produce a feature vector. Finally, using the machine learning classifiers, the email is assigned to the suspect whose writing style most closely matches M. Experimental results on real data sets show the improved performance of the proposed method and its ability to identify the authors with a very limited number of features.
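
As a rough illustration of steps (2)-(4) above, and not the authors' implementation, the following sketch trains Random Forest and naive Bayes classifiers on pre-extracted stylometric feature vectors; the feature matrix `X`, the suspect labels `y`, and the query vector for the anonymous email are placeholders.

```python
# Sketch of an authorship-attribution pipeline with Random Forest and
# Gaussian naive Bayes on stylometric features (hypothetical data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.preprocessing import MinMaxScaler

# Placeholder stylometric feature vectors (structural/lexical/syntactic
# counts) for known emails of 3 suspects, plus their labels.
rng = np.random.default_rng(0)
X = rng.random((60, 12))            # 60 emails, 12 features each
y = rng.integers(0, 3, size=60)     # suspect id 0, 1, or 2

# Step (2): normalize the feature vectors.
scaler = MinMaxScaler()
X_norm = scaler.fit_transform(X)

# Step (3): train the learning engines.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_norm, y)
nb = GaussianNB().fit(X_norm, y)

# Step (4): classify an anonymous email M from its feature vector.
m_features = scaler.transform(rng.random((1, 12)))
print("Random Forest suspect:", rf.predict(m_features)[0])
print("Naive Bayes suspect:  ", nb.predict(m_features)[0])
```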

Auspicious Meaning for Community Souvenir Products

The objective of this research was to find the relationship between auspicious meaning in Eastern wisdom and its interpretation, as a guideline for the design and development of community souvenirs. The sample group included 400 customers in Bangkok who had previously bought community souvenir products. The findings were applied to the design of souvenirs, which were then assessed for appropriateness by five design specialists. The data were analyzed for frequency, percentage, and standard deviation, with the following results. 1) The factor that best conveys auspicious meaning is color; applying auspicious meaning can add value to the product and bring good fortune to the recipient. 2) The effectiveness of integrating auspicious meaning into the design of community souvenir products was at a high level. Considering each aspect, the interpretation aspect was at a high level, and the congruency between the auspicious meaning and the utility of the product was at a high level. The attractiveness and quality of the design were at a very high level, while the potential for added value in the product design was at a high level. The suitability of the application to community souvenir product design was at a high level.

Information Measures Based on Sampling Distributions

Information theory and statistics play an important role in the biological sciences when information measures are used to study diversity and equitability. In this communication, we develop the link among the three disciplines and prove that sampling distributions can be used to construct new information measures. The study is interdisciplinary and finds applications in biological systems.
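
For context, these are the classical measures the abstract alludes to, not the new sampling-distribution-based measures introduced in the paper: diversity and equitability are commonly quantified by the Shannon index and Pielou's evenness,

```latex
\[
H' = -\sum_{i=1}^{S} p_i \ln p_i ,
\qquad
J' = \frac{H'}{\ln S},
\]
```

where S is the number of species and p_i is the relative abundance of species i.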

Improvising Intrusion Detection for Malware Activities on Dual-Stack Network Environment

Malware is software designed to cause harm to computers, and it is becoming a significant threat in computer networks. Malware attacks not only cause financial loss but can also lead to fatal errors that, in some cases, may cost lives. With the emergence of Internet Protocol version 6 (IPv6), many people believe this protocol could solve most malware propagation issues thanks to its broader addressing scheme. Because IPv6 is still new compared to native IPv4, several transition mechanisms have been introduced to promote a smoother migration. Unfortunately, these transition mechanisms allow some malware to propagate attacks from IPv4 to IPv6 network environments. In this paper, a proof of concept is presented to show that some existing IPv4 malware detection techniques need to be improved in order to detect malware attacks in dual-stack networks more efficiently. A testbed dual-stack network environment was deployed and genuine malware samples were released in it to observe their behaviors. The results from the different scenarios are analyzed and discussed in terms of malware behaviors and propagation methods. They show that malware behaves differently on IPv6 than on the IPv4 network protocol in the dual-stack network environment. A new detection technique is called for to address this problem in the near future.

Latent Topic Based Medical Data Classification

This paper discusses the classification process for medical data. We use the data from the ACM KDD Cup 2008 to demonstrate a classification process based on latent topic discovery. In this data set, the target set and the outliers are quite different in nature: the target set makes up only 0.6% of the data, while the outliers constitute the remaining 99.4%. We use this data set as an example to show how we handle such an extremely biased data set with latent topic discovery and noise reduction techniques. Our experiment faces two major challenges: (1) the outliers are extremely widely distributed, and (2) the positive samples are far fewer than the negative ones. We propose a suitable process flow to deal with these issues and obtain a best AUC result of 0.98.
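
A minimal sketch of the general idea, latent topic features feeding an imbalance-aware classifier scored by AUC, using placeholder bag-of-words data rather than the KDD Cup 2008 set; it is not the authors' pipeline.

```python
# Sketch: latent topic features + class-imbalance-aware classifier + AUC.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Placeholder "documents" as bag-of-words counts; ~0.6% positive labels.
rng = np.random.default_rng(0)
X_counts = rng.poisson(1.0, size=(5000, 100))
y = (rng.random(5000) < 0.006).astype(int)

# Latent topic discovery: compress each sample into a topic mixture.
lda = LatentDirichletAllocation(n_components=10, random_state=0)
X_topics = lda.fit_transform(X_counts)

# Imbalance-aware classifier on the topic features.
X_tr, X_te, y_tr, y_te = train_test_split(
    X_topics, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(
    n_estimators=300, class_weight="balanced", random_state=0).fit(X_tr, y_tr)

print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```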

Detente and Power - Conceptual Determination, Forms and Means of Education at the Preteen Age

From a scientific perspective, the practice of physical education and sports activities improves power capacity in all its forms of expression and is a generator of research topics. Today, the theories claiming that strength training slows down young athletes' development and impairs their speed and flexibility have been discredited. On the other hand, there are sports disciplines and/or events whose results depend to a great extent on the manifestation of power as a combination of force and velocity, relying on the systematic and continuous development of both of these bio-motor capacities. Strength training for children was and still is controversial. Fear of injury or of prematurely halting the growth process meant that, in the past, children were kept away from working with weights. Recent studies have shown that the risk of injury is relatively small and that strength training can even help prevent injuries. For example, most injuries in athletics occur at the level of ligaments and tendons. From this point of view, a progressive, optimally designed program of strength training supports the training process, leaving the athlete much better prepared to meet the demands of training and competition. Strength preparation thus provides a solid basis for the later stages of high performance.

Conversion of Modified Commercial Polyacrylonitrile Fibers to Carbon Fibers

Carbon fibers are fabricated from different precursor materials, such as special polyacrylonitrile (PAN) fibers, rayon fibers and pitch. Among these three groups of materials, PAN fibers are the most widely used precursor for the manufacture of carbon fibers. The process of fabricating carbon fibers from special PAN fibers includes two steps: oxidative stabilization at low temperature and carbonization at high temperature in an inert atmosphere. Due to the high price of the raw material (special PAN fibers), carbon fibers are still expensive. The main goal of the present work is to make carbon fibers from low-priced commercial PAN fibers with modified chemical compositions. The results show that, provided a complete stabilization process is carried out, it is possible to produce carbon fibers with the desired tensile strength from this type of PAN fiber. To this end, the thermal characteristics of the commercial PAN fibers were investigated and, based on the obtained results, the conditions for complete stabilization were achieved by modifying the temperature and time variables of the conventional stabilization procedure.

Engineering Geological Characteristics of Soil Materials, East Nile Delta, Egypt

This paper is concerned with the mineralogy and engineering characteristics of soil materials derived from the eastern part of the Nile Delta. X-ray diffraction shows that the clay minerals of the studied soil are mainly illite (average 72.6%) and kaolinite (average 2.6%), with an expandable portion in the illite-smectite mixed layer (average 7%). Smectite is more abundant in fluviatile clays, whereas kaolinite is more abundant in lagoonal clays. On the other hand, illite and illite-smectite are more abundant in marine clays. The geotechnical results show that the soil under study consists, on average, of about 0.3% gravel, 5% sand, 51.5% silt and 42.2% clay. The average shrinkage limit is 11%, whereas the average value of the plasticity index is 23.4%. The free swelling ranges from 40% to 75%, with an average value of 55%, which indicates the inadequacy of such soil under foundations. From a construction point of view, the soil under investigation poses many problems even under light foundations because of swelling and shrinkage. Such swelling and shrinkage is due to the high content of the expandable clay minerals illite and smectite in the soil. Based on the results of the present and earlier studies, trial application of soil stabilisation is recommended.

Analysis of Highway Slope Failure by an Application of the Stereographic Projection

Mountain road slope failures triggered by earthquake activity and torrential rain frequently create disasters. Province Road No. 24 is a main route to Wutai Township, and the study area is located between mileages 46K and 47K along this road. The road has suffered frequent damage as a result of landslides and slope failures during typhoon seasons, so an understanding of the sliding behavior in the area is necessary. Slope failures triggered by earthquake activity and heavy rainfall occur there frequently. The aim of the study is to understand the mechanism of these slope failures and to look for ways to deal with the situation. To achieve these objectives, the paper relies on the interpretation of theoretical and structural geology data to assess the potential sliding behavior of the slope. The study shows a close relationship between the landslide behavior of the slopes and the stratum materials; structural geology analysis methods are used to analyze slope stability, and the slope safety coefficient is determined in order to predict the location of the failure layers. According to the case study and the parameter analyses, the main slip direction of the slope is toward the southeast of the site. Rainfall-induced rise of the groundwater level is found to be the main driver of the landslide mechanism. In the future, effective horizontal drains should be installed at appropriate locations; this can effectively restrain mountain road slope failures and increase slope stability.
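
For reference, a standard textbook expression rather than a result from this case study: the safety coefficient (factor of safety) for planar sliding along a discontinuity dipping at angle ψ is the ratio of resisting to driving forces; for a dry slope with cohesion c, sliding-surface area A, block weight W and friction angle φ it takes the familiar form

```latex
\[
FS \;=\; \frac{\text{resisting forces}}{\text{driving forces}}
      \;=\; \frac{cA + W\cos\psi \,\tan\varphi}{W\sin\psi},
\qquad FS > 1 \;\Rightarrow\; \text{stable}.
\]
```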

Specifying a Timestamp-based Protocol For Multi-step Transactions Using LTL

Most concurrent transactional protocols consider serializability as the correctness criterion for transaction execution. Usually, the proof of serializability relies on mathematical proofs for a fixed, finite number of transactions. In this paper, we introduce a protocol that deals with an infinite number of transactions which are iterated infinitely often. We specify the serializability of the transactions and the protocol using a specification language based on temporal logic. It is worthwhile to use temporal logics such as LTL (Linear-time Temporal Logic) to specify transactions, in order to gain fully automatic verification by using model checkers.
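
As an illustration of the kind of property such a specification language can express, a generic example rather than the formulas used in the paper: one can state in LTL that every multi-step transaction that begins is eventually followed by a commit or an abort, and that two conflicting writes are never granted at the same time (G, F, ∧ denote "always", "eventually" and conjunction):

```latex
\[
\mathbf{G}\bigl(\mathit{start}_i \rightarrow \mathbf{F}\,(\mathit{commit}_i \lor \mathit{abort}_i)\bigr),
\qquad
\mathbf{G}\,\neg\bigl(\mathit{write}_i(x) \land \mathit{write}_j(x)\bigr)\quad (i \neq j).
\]
```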

Multi-models Approach for Describing and Verifying Constraints Based Interactive Systems

Requirements analysis, modeling, and simulation have consistently been among the main challenges in the development of complex systems. Scenarios and state machines are two successful models for describing the behavior of an interactive system. Scenarios represent examples of system execution in the form of sequences of messages exchanged between objects and offer only a partial view of the system. In contrast, state machines can represent the overall system behavior. Automating the transformation of scenarios into state machines provides answers to various problems such as system behavior validation and scenario consistency checking. In this paper, we propose a method for translating scenarios into state machines represented with the Discrete EVent system Specification (DEVS) formalism, together with a procedure for detecting implied scenarios. Each induced DEVS model represents the behavior of one object of the system. The global system behavior is described by coupling the atomic DEVS models and is validated through simulation. We improve the validation process by integrating formal methods to eliminate logical inconsistencies in the global model. To that end, we use the Z notation.
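
To make the DEVS vocabulary used above concrete, here is a minimal, self-contained sketch of an atomic DEVS model (state set, time-advance, internal/external transition and output functions) for a hypothetical two-state object derived from a scenario; it illustrates the formalism only and is not the authors' translation procedure.

```python
# Minimal atomic DEVS model: a hypothetical object that waits for a
# "request" event and then emits a "reply" after a fixed delay.
INFINITY = float("inf")

class AtomicDEVS:
    """Atomic DEVS = (X, Y, S, delta_int, delta_ext, lambda, ta)."""

    def __init__(self):
        self.state = "IDLE"          # S: {"IDLE", "BUSY"}

    def time_advance(self):          # ta(s): lifetime of the current state
        return 2.0 if self.state == "BUSY" else INFINITY

    def external_transition(self, event):   # delta_ext(s, e, x)
        if self.state == "IDLE" and event == "request":
            self.state = "BUSY"

    def output(self):                # lambda(s): emitted just before delta_int
        return "reply" if self.state == "BUSY" else None

    def internal_transition(self):   # delta_int(s)
        if self.state == "BUSY":
            self.state = "IDLE"

# Tiny hand-driven simulation of one request/reply cycle.
m = AtomicDEVS()
m.external_transition("request")          # scenario message arrives
print("ta =", m.time_advance())           # -> 2.0 time units in BUSY
print("output =", m.output())             # -> "reply"
m.internal_transition()
print("state =", m.state)                 # -> back to IDLE
```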

A Secure Semi-Fragile Watermarking Scheme for Authentication and Recovery of Images Based On Wavelet Transform

Authentication of multimedia content has gained much attention in recent times. In this paper, we propose a secure semi-fragile watermarking scheme with a choice of two watermarks to be embedded. The technique operates in the integer wavelet domain and makes use of semi-fragile watermarks to achieve better robustness. A self-recovery algorithm is employed that hides an image digest in some wavelet subbands in order to detect possible malevolent object manipulation of the image (object replacement and/or deletion). The semi-fragility makes the scheme tolerant to JPEG lossy compression down to a quality factor of 70%, while locating the tampered areas accurately. In addition, the system ensures greater security because the embedded watermarks are protected with private keys. The computational complexity is reduced by using a parameterized integer wavelet transform. Experimental results show that the proposed scheme guarantees the security of the watermark, image recovery, and accurate localization of the tampered areas.
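
As a rough sketch of wavelet-domain embedding in general, using PyWavelets' ordinary Haar DWT rather than the parameterized integer wavelet transform of the paper, and a toy quantization rule instead of the actual two-watermark scheme, one might hide a bit string in a detail subband like this:

```python
# Toy wavelet-domain watermark embedding (not the paper's IWT scheme):
# quantize coefficients of the HL subband to encode one bit each.
import numpy as np
import pywt

def embed(image, bits, q=8.0):
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), "haar")
    flat = cH.flatten()
    for i, b in enumerate(bits):                 # quantization-index style rule
        flat[i] = q * (np.floor(flat[i] / q) + (0.25 if b == 0 else 0.75))
    cH = flat.reshape(cH.shape)
    return pywt.idwt2((cA, (cH, cV, cD)), "haar")

def extract(watermarked, n_bits, q=8.0):
    _, (cH, _, _) = pywt.dwt2(watermarked.astype(float), "haar")
    flat = cH.flatten()[:n_bits]
    return [0 if (c / q - np.floor(c / q)) < 0.5 else 1 for c in flat]

img = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)
bits = [1, 0, 1, 1, 0, 0, 1, 0]
wm_img = embed(img, bits)
print("recovered:", extract(wm_img, len(bits)))   # should match `bits`
```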

Bioclimatic Principles and Urban Open Spaces: The Case of Xanthi

Open urban public spaces are an important element for the development of the social, cultural and economic activities of the population in modern cities. These spaces also act as regulators of the region's climatic conditions, providing better thermal, visual and acoustic conditions, which can be optimized by applying appropriate bioclimatic design strategies. The paper focuses on the analysis and evaluation, from a bioclimatic perspective, of the recent unification of the open spaces in the centre of Xanthi, a medium-size city in northern Greece, as well as on the creation of a suitable methodology. It is based both on qualitative observation of the interventions through fieldwork research and assessment, and on quantitative analysis and modeling of the research area.

Reentry Trajectory Optimization Based on Differential Evolution

Reentry trajectory optimization is a multi-constraint optimal control problem that is hard to solve. To tackle it, we propose a new algorithm named CDEN (Constrained Differential Evolution Newton-Raphson algorithm), based on Differential Evolution (DE) and the Newton-Raphson method. We transform the infinite-dimensional optimal control problem into a finite-dimensional parameter optimization problem by discretizing the control parameters. In order to simplify the problem, we determine the range of the control parameters from the process constraints. To handle the constraints, we propose a parameter-free constraint-handling procedure. After a comprehensive analysis of the problem, we use the new algorithm, which integrates DE and Newton-Raphson, to solve it. The approach is validated on the X-33 reentry vehicle; simulation results indicate that the algorithm is effective and robust.
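
For readers unfamiliar with the DE component, the sketch below shows a plain DE/rand/1/bin iteration on a placeholder objective; the constraint handling, Newton-Raphson refinement and reentry dynamics of CDEN are not reproduced here.

```python
# Plain DE/rand/1/bin on a placeholder objective (sphere function);
# the CDEN constraint handling and Newton-Raphson steps are omitted.
import numpy as np

def differential_evolution(f, bounds, pop_size=30, F=0.7, CR=0.9, gens=200):
    rng = np.random.default_rng(0)
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)          # mutation
            cross = rng.random(dim) < CR                        # binomial crossover
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            if (ft := f(trial)) < fit[i]:                       # greedy selection
                pop[i], fit[i] = trial, ft
    return pop[fit.argmin()], fit.min()

best_x, best_f = differential_evolution(lambda x: float(np.sum(x ** 2)),
                                        bounds=[(-5, 5)] * 4)
print("best x:", best_x, "f:", best_f)
```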

Exact Evaluation Method for Error Performance Analysis of Arbitrary 2-D Modulation OFDM Systems with CFO

Orthogonal frequency division multiplexing (OFDM) has developed into a popular scheme for wideband digital communications, used in consumer applications such as digital broadcasting, wireless networking and broadband internet access. In an OFDM system, carrier frequency offset (CFO) causes intercarrier interference (ICI), which significantly degrades the system error performance. In this paper, we provide an exact evaluation method for the error performance analysis of arbitrary 2-D modulation OFDM systems with CFO, and we analyze the effect of CFO on the error performance.
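
As background, the standard frequency-domain model of CFO-induced ICI, not the exact evaluation method derived in the paper: a normalized offset ε attenuates and rotates the desired subcarrier and leaks the other subcarriers onto it,

```latex
\[
Y_k = H_k X_k S_0 + \underbrace{\sum_{l \ne k} H_l X_l S_{l-k}}_{\text{ICI}} + N_k,
\qquad
S_l = \frac{\sin\bigl(\pi(l+\varepsilon)\bigr)}{N\sin\bigl(\pi(l+\varepsilon)/N\bigr)}
      \, e^{\,j\pi(l+\varepsilon)(N-1)/N},
\]
```

where N is the number of subcarriers, H_k and X_k are the channel gain and transmitted symbol on subcarrier k, and N_k is noise.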

Application of Build-up and Wash-off Models for an East-Australian Catchment

Estimation of stormwater pollutants is a prerequisite for the protection and improvement of the aquatic environment and for the selection of appropriate management options. The usual practice for stormwater quality prediction is water quality modeling. However, the accuracy of the predictions depends on the proper estimation of the model parameters. This paper presents the estimation of model parameters for a catchment water quality model developed for the continuous simulation of stormwater pollutants from a catchment to its outlet. The model is capable of simulating the accumulation and transportation of the stormwater pollutants suspended solids (SS), total nitrogen (TN) and total phosphorus (TP) from a particular catchment. Rainfall and water quality data were collected for the Hotham Creek Catchment (HTCC), Gold Coast, Australia. Runoff calculations from the developed model were compared with the discharges calculated by the widely used hydrological models WBNM and DRAINS. Based on the measured water quality data, the water quality parameters of the model were calibrated for the above-mentioned catchment. The calibrated parameters are expected to be helpful for best management practices (BMPs) in the region. Sensitivity analyses of the estimated parameters were performed to assess the impact of the model parameters on the overall model estimates of runoff water quality.
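
For orientation, these are the commonly used exponential forms of the two processes named in the title, not necessarily the exact formulation calibrated in the paper: pollutant build-up over dry days and wash-off during a runoff event are often written as

```latex
\[
B(t) = B_{\max}\bigl(1 - e^{-k_b t}\bigr),
\qquad
W(t) = B_0\bigl(1 - e^{-k_w\, q\, t}\bigr),
\]
```

where B is the accumulated pollutant mass, q is the runoff rate, and k_b and k_w are the build-up and wash-off coefficients whose values are estimated through calibration.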

Power Line Carrier for Power Telemetering

This paper presents an application of power line carrier (PLC) communication to electrical power telemetering. The system has the special capability of transmitting measured values to a centralized computer via the power lines. The PLC modem, acting as a passive high-pass filter, is designed for transmitting and receiving information. Its function is to send the information carrier, together with the transmitted data, by superimposing it on the 50 Hz power frequency signal. A microcontroller is employed as the main processing unit of the modem. It is programmed for PLC control and for interfacing with other devices. Each power meter, connected via a PLC modem, is assigned a unique identification number (address) to distinguish the devices from one another.
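
To illustrate the addressing idea in the last sentence, here is a hypothetical frame layout, not the format used by the authors' modem: each reading could be wrapped in a small frame that carries the meter's address so the central computer can tell the devices apart.

```python
# Hypothetical telemetering frame: [address][value][checksum] (not the
# authors' format), illustrating per-meter addressing on a shared line.
import struct

def build_frame(address: int, value: float) -> bytes:
    payload = struct.pack(">Bf", address, value)     # 1-byte address + float value
    checksum = sum(payload) & 0xFF                   # simple modulo-256 checksum
    return payload + bytes([checksum])

def parse_frame(frame: bytes):
    payload, checksum = frame[:-1], frame[-1]
    if (sum(payload) & 0xFF) != checksum:
        raise ValueError("corrupted frame")
    address, value = struct.unpack(">Bf", payload)
    return address, value

frame = build_frame(address=0x2A, value=230.4)       # meter 0x2A reports 230.4 V
print(parse_frame(frame))                            # -> (42, 230.399...)
```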

Security Architecture for At-Home Medical Care Using Sensor Network

This paper proposes a novel architecture for at-home medical care which enables senior citizens, patients with chronic ailments and patients requiring post-operative care to be remotely monitored in the comfort of their homes. The architecture is implemented using sensors and wireless networking to transmit patient data to hospitals and health-care centers for monitoring by medical professionals. Patients are equipped with sensors that measure physiological parameters such as blood pressure and pulse rate, and a Wearable Data Acquisition Unit is used to transmit the patient sensor data. Medical professionals can be alerted to any abnormal variations in these values for diagnosis and suitable treatment. Security threats and challenges inherent to wireless communication and sensor networks are discussed, and a security mechanism to ensure data confidentiality and source authentication is proposed. The symmetric-key algorithm AES is used for encrypting the data, and a patent-free, two-pass block cipher mode, CCFB, is used to provide semantic security.
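
As a rough illustration of symmetric-key protection of a sensor reading: the sketch below uses AES-GCM from the `cryptography` package as a stand-in, since the CCFB mode mentioned in the abstract is not available in common libraries, and the device name and reading are placeholders.

```python
# Illustrative AES authenticated encryption of a sensor reading.
# AES-GCM is used here as a stand-in for the CCFB mode cited in the paper.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)        # shared symmetric key
aesgcm = AESGCM(key)

reading = b'{"patient": "P-007", "bp": "118/76", "pulse": 72}'
nonce = os.urandom(12)                           # unique per message
sender_id = b"wearable-unit-01"                  # authenticated, not encrypted

ciphertext = aesgcm.encrypt(nonce, reading, sender_id)

# Receiver side: decryption fails if the data or the sender id were tampered
# with, giving both confidentiality and source/data authentication.
plaintext = aesgcm.decrypt(nonce, ciphertext, sender_id)
print(plaintext.decode())
```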