“Blood Family” Activity with Respect to the Comprehensive Guidance School Program

Children and adolescents growing up in today's world are facing an expanding array of new and old challenges. School counselling is developing rapidly in contemporary education systems around the world, while the school counselling system in Turkey is still relatively new. In this study, the “Family of the Blood” activity is developed with respect to the comprehensive guidance school program. The sample included 22 adolescents who were high school students. The activity was carried out in 4 sessions, each of which lasted 45 minutes. In the first session, the students' personal-social needs were determined. In the second session, in order to warm up, the students were asked three questions concerning the constructional aspect. In the third session, the counselor and the teacher shared the results of the students' responses obtained in the previous session. In the fourth session, the tables formed by the students were presented in the classroom. In order to evaluate the activity, three questions were asked of the teacher and the counselor. According to the results, both the lesson aims and the counselling aims of the curriculum were attained. The results were discussed in the light of the literature and some suggestions were made. Since the activity was found to be beneficial in many respects, similar studies should be carried out in the near future.

Testing Object-Oriented Framework Applications Using FIST2 Tool: A Case Study

An application framework provides a reusable design and implementation for a family of software systems. Frameworks are introduced to reduce the cost of a product line (i.e., a family of products that share common features). Software testing is a time-consuming and costly ongoing activity during the application software development process. Generating reusable test cases for framework applications during the framework development stage, and providing and using these test cases to test part of the framework application whenever the framework is used, reduces the application development time and cost considerably. This paper introduces the Framework Interface State Transition Tester (FIST2), a tool for automated unit testing of Java framework applications. During the framework development stage, given the formal descriptions of the framework hooks, the specifications of the methods of the framework's extensible classes, and the illegal behavior description of the Framework Interface Classes (FICs), FIST2 generates unit-level test cases for the classes. At the framework application development stage, given the customized method specifications of the implemented FICs, FIST2 automates the use, execution, and evaluation of the already generated test cases to test the implemented FICs. The paper illustrates the use of the FIST2 tool for testing several applications that use the SalesPoint framework.

Identification of Factors Influencing Company's Competitiveness

The fast development of technologies, economic globalization and many other external circumstances stimulate a company's competitiveness. One of the major trends in today's business is the shift to the exploitation of the Internet and the electronic environment for entrepreneurial needs. Recent research confirms that the e-environment provides a range of possibilities and opportunities for companies, especially for micro-, small- and medium-sized companies, which have limited resources. The usage of e-tools raises the effectiveness and the profitability of an organization, as well as its competitiveness. In the electronic market, as in the classic one, there are factors, such as globalization, the development of new technology, price-sensitive consumers, the Internet, and new distribution and communication channels, that influence entrepreneurship. As a result of e-environment development, e-commerce and e-marketing grow as well. Objective of the paper: to describe and identify the factors influencing a company's competitiveness in the e-environment. Research methodology: the authors employ well-established quantitative and qualitative methods of research: grouping, analysis, the statistical method, factor analysis in the SPSS 20 environment, etc. The theoretical and methodological background of the research is formed by scientific research and publications, such as those from mass media and professional literature, statistical information from legal institutions, and information collected by the authors during the surveying process. Research result: the authors detected and classified the factors influencing competitiveness in the e-environment. In this paper, the authors present their findings based on theoretical, scientific, and field research. The authors conducted a survey on e-environment utilization among Latvian enterprises.
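
The abstract names factor analysis in SPSS 20 but does not reproduce the survey instrument or data. The sketch below only illustrates how a comparable factor analysis could be run programmatically; the number of respondents, the number of survey items, and the choice of two factors are illustrative assumptions, not details from the study.

```python
# Minimal factor-analysis sketch on survey-style data (all values illustrative).
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical Likert-scale answers (1..5) from 120 firms to 6 survey items.
X = rng.integers(1, 6, size=(120, 6)).astype(float)

# Standardize the items before extracting factors, as is common practice.
Xz = StandardScaler().fit_transform(X)

fa = FactorAnalysis(n_components=2, random_state=0)  # assumed two factors
scores = fa.fit_transform(Xz)                        # factor scores per firm

# The loadings indicate which items group into which competitiveness factor.
print(np.round(fa.components_, 2))
```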

One Scheme of Transition Probability Evaluation

In the present work, a scheme for evaluating the transition probability in a quantum system is considered. It is based on the path-integral representation of the transition probability amplitude and its evaluation by means of a saddle-point method applied to part of the integration variables. The whole integration process is reduced to solving initial value problems for the Hamilton equations with a random initial phase point. The scheme is related to the semiclassical initial value representation approaches, which use a great number of trajectories. In contrast to them, in the Monte Carlo process only one path is selected from the total set of generated phase paths for each initial coordinate value.
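
To make the numerical core concrete, the sketch below propagates Hamilton's equations from a randomly sampled initial phase point, which is the building block the scheme reduces to. The one-dimensional harmonic oscillator Hamiltonian, the sampling distribution, and all numerical parameters are illustrative assumptions, not the system treated in the paper.

```python
# Sketch: one initial-value trajectory of Hamilton's equations from a
# randomly sampled initial phase point (toy H = p^2/2 + q^2/2).
import numpy as np

def propagate(q, p, t_final, dt=1e-3):
    # Leapfrog integration of dq/dt = p, dp/dt = -q.
    for _ in range(int(t_final / dt)):
        p -= 0.5 * dt * q          # half kick
        q += dt * p                # drift
        p -= 0.5 * dt * q          # half kick
    return q, p

rng = np.random.default_rng(1)
q0 = 1.0                           # fixed initial coordinate value
# One randomly sampled initial momentum per initial coordinate value,
# mimicking the "one path per initial coordinate" selection.
p0 = rng.normal(0.0, 1.0)
qT, pT = propagate(q0, p0, t_final=2.0)
print(qT, pT)
```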

A New Analytical Approach for Free Vibration of Membrane from Wave Standpoint

In this paper, an analytical approach for the free vibration analysis of rectangular and circular membranes is presented. The method is based on the wave approach. From the wave standpoint, vibrations propagate, reflect and transmit in a structure. Firstly, the propagation and reflection matrices for rectangular and circular membranes are derived. Then, these matrices are combined to provide a concise and systematic approach to the free vibration analysis of membranes. Subsequently, the eigenvalue problem for the free vibration of a membrane is formulated and the equation for the membrane's natural frequencies is constructed. Finally, the effectiveness of the approach is shown by comparison of the results with the existing classical solution.
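
For reference, the classical benchmark against which such results are typically compared is the closed-form natural frequency of a rectangular membrane with fixed edges, f_mn = (c/2)·sqrt((m/a)^2 + (n/b)^2). The short sketch below only evaluates that classical formula; the side lengths and wave speed are illustrative values, not data from the paper.

```python
# Classical natural frequencies of a fixed-edge rectangular membrane:
# f_mn = (c/2) * sqrt((m/a)^2 + (n/b)^2).  All numbers are illustrative.
import math

def rect_membrane_freq(m, n, a, b, c):
    return 0.5 * c * math.sqrt((m / a) ** 2 + (n / b) ** 2)

a, b, c = 1.0, 0.8, 340.0          # side lengths [m] and wave speed [m/s] (assumed)
for m in range(1, 4):
    for n in range(1, 4):
        print(m, n, round(rect_membrane_freq(m, n, a, b, c), 2))
```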

Approximation Approach to Linear Filtering Problem with Correlated Noise

The (sub)optimal solution of the linear filtering problem with correlated noise is considered. The special recursive form of the class of filters and the criteria for selecting the best estimator are the essential elements of the design method. The properties of the proposed filter are studied. In particular, for Markovian observation noise, the approximate filter becomes the optimal Gevers-Kailath filter subject to a special choice of the parameter within the given class of linear recursive filters.

1-Skeleton Resolution of Free Simplicial Algebras with a Given CW-Basis

In this paper, we use the definition of a CW-basis of a free simplicial algebra. Using the free simplicial algebra, we show how to construct free or totally free 2-crossed modules on suitable construction data from a given CW-basis of the free simplicial algebra. We give applications to free crossed squares, free squared complexes and free 2-crossed complexes by using the 1-skeleton resolution of a step-by-step construction of the free simplicial algebra with a given CW-basis.

Target Detection with Improved Image Texture Feature Coding Method and Support Vector Machine

An approach to image texture analysis and target recognition using an improved image texture feature coding method (TFCM) and a Support Vector Machine (SVM) for target detection is presented. With the proposed target detection framework, targets of interest can be detected accurately. A cascade sliding-window technique was also developed for automated target localization. Application to mammograms showed that over 88% of normal mammograms and 80% of abnormal mammograms can be correctly identified. The approach was also successfully applied to Synthetic Aperture Radar (SAR) and Ground Penetrating Radar (GPR) images for target detection.
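
The TFCM features themselves are not reproduced here; the following is only a minimal sketch of the general sliding-window detection loop around a trained SVM, with generic window statistics standing in for the TFCM-derived texture features. The window size, stride, and feature function are assumptions, not the paper's choices.

```python
# Sketch of sliding-window target detection with an SVM classifier.
# Generic patch statistics stand in for the TFCM texture features.
import numpy as np
from sklearn.svm import SVC

def window_features(patch):
    # Placeholder features; the paper derives its features from TFCM instead.
    return [patch.mean(), patch.std(), np.abs(np.diff(patch, axis=0)).mean()]

def detect(image, clf, win=32, stride=16):
    hits = []
    for r in range(0, image.shape[0] - win + 1, stride):
        for c in range(0, image.shape[1] - win + 1, stride):
            patch = image[r:r + win, c:c + win]
            if clf.predict([window_features(patch)])[0] == 1:
                hits.append((r, c))            # top-left corner of a detection
    return hits

# Training on labeled patches would precede detection, e.g.:
# clf = SVC(kernel="rbf").fit(train_features, train_labels)
```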

Speech Encryption and Decryption Using Linear Feedback Shift Register (LFSR)

This paper considers the problem of cryptanalysis of stream ciphers. Some attempts are made to improve the existing attacks on stream ciphers and to distinguish the portions of ciphertext obtained by the encryption of plaintext in which some parts of the text are random and the rest are non-random. This paper presents a tutorial introduction to symmetric cryptography. The basic information-theoretic and computational properties of classic and modern cryptographic systems are presented, followed by an examination of the application of cryptography to the security of VoIP systems in computer networks using the LFSR algorithm. The implementation program is developed in Java 2. The LFSR algorithm is appropriate for the encryption and decryption of online streaming data, e.g. VoIP (voice chatting over IP). This paper implements the module that encrypts speech signals into ciphertext and the module that decrypts ciphertext back into speech signals.
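
As a minimal sketch of the keystream idea, the following implements a small Fibonacci LFSR and XORs its output bits with sample bytes; because XOR is its own inverse, the same routine encrypts and decrypts. The register length, tap positions, and seed are illustrative choices, not the parameters used in the paper.

```python
# Minimal Fibonacci LFSR keystream sketch (illustrative length, taps, seed).
def lfsr_bits(seed, taps, length, nbits):
    state = seed & ((1 << length) - 1)
    for _ in range(nbits):
        out = state & 1                          # output bit
        fb = 0
        for t in taps:                           # feedback = XOR of tap bits
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (length - 1))
        yield out

def xor_stream(data, seed=0b1010110011100001, taps=(0, 2, 3, 5), length=16):
    # Encryption and decryption are the same operation: XOR with the keystream.
    bits = lfsr_bits(seed, taps, length, 8 * len(data))
    out = bytearray()
    for byte in data:
        k = 0
        for _ in range(8):
            k = (k << 1) | next(bits)
        out.append(byte ^ k)
    return bytes(out)

cipher = xor_stream(b"speech samples")
assert xor_stream(cipher) == b"speech samples"   # XOR twice restores the plaintext
```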

Hazard Identification and Sensitivity of Potential Resource of Emergency Water Supply

The paper presents a case study of the hazard identification and sensitivity of a potential resource of emergency water supply, as part of the application of a methodology for classifying drinking water resources for the emergency supply of the population. The case study has been carried out on a selected resource of emergency water supply in one region of the Czech Republic. The hazard identification and sensitivity analysis of the potential resource of emergency water supply is based on a unique procedure and on developed general registers of selected types of hazards and sensitivities. The registers have been developed with the help of the “Fault Tree Analysis” method in combination with the “What if” method. The identified hazards for the assessed resource include hailstorms and torrential rains, drought, soil erosion, accidents of farm machinery, and agricultural production. The developed registers of hazards and vulnerabilities and a semi-quantitative assessment of hazards for individual parts of the hydrological structure and technological elements of the presented drilled wells are the basis for a semi-quantitative risk assessment of the potential resource of emergency supply of the population and the subsequent classification of such a resource within the system of crisis planning.

Investigation of a Transition from Steady Convection to Chaos in Porous Media Using Piecewise Variational Iteration Method

In this paper, a new dependable algorithm based on an adaptation of the standard variational iteration method (VIM) is used for analyzing the transition from steady convection to chaos for low-to-intermediate Rayleigh number convection in porous media. The solution trajectories show the transition from steady convection to chaos that occurs at a slightly subcritical value of the Rayleigh number, the critical value being associated with the loss of linear stability of the steady convection solution. The VIM is treated as an algorithm applied over a sequence of intervals for finding accurate approximate solutions to the considered model and other dynamical systems. We shall call this technique the piecewise VIM. Numerical comparisons between the piecewise VIM and the classical fourth-order Runge–Kutta (RK4) numerical solutions reveal that the proposed technique is a promising tool for nonlinear chaotic and non-chaotic systems.
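
For context, the RK4 reference solution named above is straightforward to set up for a Lorenz-type three-mode reduction, which is a common low-order model of convection in a porous layer; the sketch below shows only that baseline, with illustrative parameter values rather than those of the paper, and notes in a comment where a piecewise series method such as the piecewise VIM would differ.

```python
# RK4 baseline for a Lorenz-like three-mode convection model
# (parameter values and initial condition are illustrative).
import numpy as np

def rhs(state, alpha=10.0, R=25.0):
    # Lorenz-like reduced system; R plays the role of a scaled Rayleigh number.
    x, y, z = state
    return np.array([alpha * (y - x), R * x - y - x * z, x * y - z])

def rk4_step(f, s, h):
    k1 = f(s)
    k2 = f(s + 0.5 * h * k1)
    k3 = f(s + 0.5 * h * k2)
    k4 = f(s + h * k3)
    return s + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

s = np.array([0.9, 0.9, 0.9])
h, steps = 1e-3, 50_000
traj = np.empty((steps, 3))
for i in range(steps):
    s = rk4_step(rhs, s, h)
    traj[i] = s
# A piecewise series method (such as the piecewise VIM) would instead restart
# an analytic approximation of the solution on each small subinterval.
```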

A Diagnostic Fuzzy Rule-Based System for Congenital Heart Disease

In this study, a fuzzy rule-based classifier is used for the diagnosis of congenital heart disease. Congenital heart diseases are defined as structural or functional heart diseases. The medical data sets were obtained from the Pediatric Cardiology Department at Selcuk University and cover the years 2000 to 2003. Firstly, fuzzy rules were generated from the medical data. Then the weights of the fuzzy rules were calculated. Two different reasoning methods, the “weighted vote method” and the “single winner method”, were used in this study. The results of the fuzzy classifiers were compared.
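
To make the difference between the two reasoning methods concrete, the following is a minimal sketch assuming each fuzzy rule has already been assigned a predicted class, a certainty weight, and a degree of compatibility with the input pattern; the example rules and numbers are hypothetical, not taken from the study.

```python
# Sketch of the two fuzzy reasoning methods, given rules that each carry a
# predicted class, a certainty weight, and a compatibility grade with the
# input pattern (all values below are hypothetical).
rules = [
    {"cls": "CHD",     "weight": 0.8, "compat": 0.6},
    {"cls": "CHD",     "weight": 0.5, "compat": 0.4},
    {"cls": "healthy", "weight": 0.9, "compat": 0.5},
]

def single_winner(rules):
    # The class of the single rule with the largest weight*compatibility wins.
    best = max(rules, key=lambda r: r["weight"] * r["compat"])
    return best["cls"]

def weighted_vote(rules):
    # Every rule votes for its class with strength weight*compatibility.
    votes = {}
    for r in rules:
        votes[r["cls"]] = votes.get(r["cls"], 0.0) + r["weight"] * r["compat"]
    return max(votes, key=votes.get)

print(single_winner(rules), weighted_vote(rules))
```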

A Web Text Mining Flexible Architecture

Text Mining is an important step of the Knowledge Discovery process. It is used to extract hidden information from unstructured or semi-structured data. This aspect is fundamental because much of the Web information is semi-structured due to the nested structure of HTML code, much of it is linked, and much of it is redundant. Web Text Mining supports the whole knowledge mining process in the mining, extraction and integration of useful data, information and knowledge from Web page contents. In this paper, we present a Web Text Mining process able to discover knowledge in a distributed and heterogeneous multi-organization environment. The Web Text Mining process is based on a flexible architecture and is implemented in four steps able to examine web content and to extract useful hidden information through mining techniques. Our Web Text Mining prototype starts from the retrieval of Web job offers, from which, through a Text Mining process, useful information for their fast classification is extracted; this information essentially consists of the job offer location and the required skills.

Flow Characteristics of Pulp Liquid in Straight Ducts

An experimental investigation was performed on pulp liquid flow in straight ducts with a square cross section. Fully developed steady flow was visualized and the fiber concentration was obtained using a light-section method developed by the authors. The obtained results reveal quantitatively, in a definite form, the distribution of the fiber concentration. From these results and from measurements of the pressure loss, it is found that the flow characteristics of pulp liquid in ducts can be classified into five patterns. The relationships among the distributions of the mean and fluctuation of the fiber concentration, the pressure loss and the flow velocity are discussed, and the features of each pattern are extracted. The degree of nonuniformity of the fiber concentration, which is indicated by the standard deviation of its distribution, decreases from 0.3 to 0.05 with an increase in the velocity of the tested pulp liquid of 0.4 to 0.8% concentration.

Synchronization for Impulsive Fuzzy Cohen-Grossberg Neural Networks with Time Delays under Noise Perturbation

In this paper, we investigate a class of fuzzy Cohen-Grossberg neural networks with time delays and impulsive effects. By virtue of stochastic analysis and the Halanay inequality for stochastic differential equations, we find sufficient conditions for the global exponential square-mean synchronization of the FCGNNs under noise perturbation. In particular, the traditional assumption on the differentiability of the time-varying delays is no longer needed. Finally, a numerical example is given to show the effectiveness of the results of this paper.

Density Clustering Based On Radius of Data (DCBRD)

Clustering algorithms are attractive for the task of class identification in spatial databases. However, the application to large spatial databases raises the following requirements for clustering algorithms: minimal requirements of domain knowledge to determine the input parameters, discovery of clusters with arbitrary shape, and good efficiency on large databases. The well-known clustering algorithms offer no solution to the combination of these requirements. In this paper, a density-based clustering algorithm (DCBRD) is presented, relying on knowledge acquired from the data by dividing the data space into overlapped regions. The proposed algorithm discovers arbitrarily shaped clusters, requires no input parameters and uses the same definitions as the DBSCAN algorithm. We performed an experimental evaluation of its effectiveness and efficiency and compared the results with those of DBSCAN. The results of our experiments demonstrate that the proposed algorithm is significantly efficient in discovering clusters of arbitrary shape and size.
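
DCBRD itself derives its density thresholds from the data, which is not reproduced here; the sketch below only illustrates the DBSCAN-style core-point and cluster-expansion definitions that the abstract says are reused, with eps and min_pts supplied by hand in this toy version rather than inferred from the data radius.

```python
# Toy DBSCAN-style density clustering illustrating the core-point and
# expansion definitions; unlike DCBRD, eps and min_pts are given by hand.
import numpy as np

def region_query(X, i, eps):
    # Indices of all points within distance eps of point i (including i).
    return np.where(np.linalg.norm(X - X[i], axis=1) <= eps)[0]

def dbscan_like(X, eps, min_pts):
    labels = np.full(len(X), -1)          # -1 means noise / unassigned
    cluster = 0
    for i in range(len(X)):
        if labels[i] != -1:
            continue
        seeds = list(region_query(X, i, eps))
        if len(seeds) < min_pts:
            continue                      # i is not a core point
        labels[i] = cluster
        while seeds:                      # expand the cluster from core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster
                neighbors = region_query(X, j, eps)
                if len(neighbors) >= min_pts:
                    seeds.extend(neighbors)
        cluster += 1
    return labels

# Usage: labels = dbscan_like(X, eps=0.3, min_pts=5)
```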

Data Mining Using Learning Automata

In this paper, a data miner based on learning automata, called the LA-miner, is proposed. The LA-miner extracts classification rules from data sets automatically. The proposed algorithm is established on the basis of function optimization using learning automata. The experimental results on three benchmarks indicate that the performance of the proposed LA-miner is comparable with (and sometimes better than) that of Ant-Miner (a data miner based on the Ant Colony Optimization algorithm) and CN2 (a well-known data mining algorithm for classification).
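
The rule-extraction machinery of the LA-miner is not described in enough detail here to reproduce, so the sketch below shows only the underlying building block: a linear reward-inaction learning automaton whose action probabilities are reinforced by environment feedback. The reward function is a hypothetical stand-in for rule quality, and the step size and action count are assumptions.

```python
# Linear reward-inaction (L_RI) learning automaton sketch.  In a rule miner,
# the actions could correspond to candidate rule terms; here the environment
# is a hypothetical stand-in for rule quality.
import numpy as np

rng = np.random.default_rng(0)
r = 4                                # number of actions (candidate terms)
p = np.full(r, 1.0 / r)              # action probability vector
a_learn = 0.1                        # reward step size (assumed)

def environment(action):
    # Hypothetical feedback: action 2 is rewarded most often.
    return rng.random() < (0.9 if action == 2 else 0.3)

for _ in range(2000):
    action = rng.choice(r, p=p)
    if environment(action):          # reward: shift probability toward the action
        p = p + a_learn * (np.eye(r)[action] - p)
    # L_RI: no update on penalty (inaction)

print(np.round(p, 3))                # probability mass concentrates on action 2
```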

Object Localization in Medical Images Using Genetic Algorithms

We present a genetic algorithm application to the problem of object registration (i.e., object detection, localization and recognition) in a class of medical images containing various types of blood cells. The genetic algorithm approach taken here is seen to be most appropriate for this type of image, due to the characteristics of the objects. Successful cell registration results on real-life microscope images of blood cells show the potential of the proposed approach.
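
The abstract does not give the chromosome encoding or fitness function used, so the following is only a minimal sketch of a genetic algorithm that searches an image for the (x, y) position best matching a template, with tournament selection, crossover, and mutation; the population size, rates, and the mean-squared-error fitness are illustrative choices.

```python
# GA sketch for object localization: a chromosome is a candidate (x, y)
# position, fitness is similarity of the local patch to a template.
import numpy as np

rng = np.random.default_rng(0)

def fitness(image, template, pos):
    h, w = template.shape
    x, y = pos
    patch = image[y:y + h, x:x + w]
    if patch.shape != template.shape:
        return -np.inf
    return -np.mean((patch - template) ** 2)       # higher is better

def ga_locate(image, template, pop=40, gens=60, mut=5):
    H, W = image.shape
    h, w = template.shape
    popn = np.column_stack([rng.integers(0, W - w, pop), rng.integers(0, H - h, pop)])
    for _ in range(gens):
        scores = np.array([fitness(image, template, p) for p in popn])
        # Tournament selection of parents.
        idx = np.array([max(rng.integers(0, pop, 2), key=lambda i: scores[i])
                        for _ in range(pop)])
        parents = popn[idx]
        # Crossover: take x from one parent and y from another.
        children = np.column_stack([parents[:, 0], np.roll(parents[:, 1], 1)])
        # Mutation: small random jitter, clipped to the image bounds.
        children = children + rng.integers(-mut, mut + 1, children.shape)
        children[:, 0] = np.clip(children[:, 0], 0, W - w)
        children[:, 1] = np.clip(children[:, 1], 0, H - h)
        popn = children
    scores = np.array([fitness(image, template, p) for p in popn])
    return tuple(popn[np.argmax(scores)])
```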

Standard Deviation of Mean and Variance of Rows and Columns of Images for CBIR

This paper describes a novel and effective approach to content-based image retrieval (CBIR) that represents each image in the database by a vector of feature values called the “standard deviation of mean vectors of color distribution of rows and columns of images for CBIR”. In many areas of commerce, government, academia, and hospitals, large collections of digital images are being created. This paper describes an approach that uses image content as the feature vector for the retrieval of similar images. There are several classes of features that are used to specify queries: color, texture, shape, and spatial layout. Color features are often easily obtained directly from the pixel intensities. In this paper, feature extraction is done for the texture descriptors 'variance' and 'variance of variances'. First, the standard deviation of the row means and of the column means is calculated for the R, G, and B planes; these six values form one feature vector for an image. Secondly, we calculate the variance of each row and column of the R, G and B planes of an image; then the six standard deviations of these variance sequences are calculated to form a feature vector of dimension six. We applied our approach to a database of 300 BMP images. We have determined the capability of automatic indexing by analyzing image content, using color and texture as features and applying the Euclidean distance as a similarity measure.
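
The two six-dimensional descriptors described above can be stated compactly in code. The sketch below computes both per color plane (standard deviation of row and column means, and standard deviation of per-row and per-column variances) and compares two images with the Euclidean distance, as described; the function names and the assumption that images are H x W x 3 arrays are our own.

```python
# Feature vectors as described: per R/G/B plane, (1) the standard deviation
# of row means and of column means, and (2) the standard deviation of the
# per-row and per-column variances.  `img` is assumed to be an H x W x 3 array.
import numpy as np

def mean_based_features(img):
    feats = []
    for c in range(3):
        plane = img[:, :, c].astype(float)
        feats.append(plane.mean(axis=1).std())   # std of row means
        feats.append(plane.mean(axis=0).std())   # std of column means
    return np.array(feats)                       # 6-dimensional vector

def variance_based_features(img):
    feats = []
    for c in range(3):
        plane = img[:, :, c].astype(float)
        feats.append(plane.var(axis=1).std())    # std of row variances
        feats.append(plane.var(axis=0).std())    # std of column variances
    return np.array(feats)                       # 6-dimensional vector

def distance(img_a, img_b):
    # Euclidean distance between feature vectors; smaller means more similar.
    fa = np.concatenate([mean_based_features(img_a), variance_based_features(img_a)])
    fb = np.concatenate([mean_based_features(img_b), variance_based_features(img_b)])
    return np.linalg.norm(fa - fb)
```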

A Fuzzy Approach for Delay Proportion Differentiated Service

There are two paradigms proposed to provide QoS for Internet applications: Integrated Services (IntServ) and Differentiated Services (DiffServ). IntServ is not appropriate for a large network like the Internet because it is very complex. Therefore, to reduce the complexity of QoS management, DiffServ was introduced to provide QoS within a domain using aggregation of flows and per-class service. In these networks, the QoS between classes is constant, which allows low-priority traffic to be affected by high-priority traffic; this is not suitable. In this paper, we propose a fuzzy controller, which reduces the effect of the low-priority class on higher-priority ones. Our simulations show that our approach reduces the latency dependency of the low-priority class on higher-priority ones in an effective manner.
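
The membership functions and rule base of the proposed controller are not given in the abstract, so the following is only a minimal sketch of the underlying control idea for delay proportion differentiation: a two-class scheduler whose high-priority service weight is nudged by coarse fuzzy-like rules so that the measured delay ratio between the classes stays near a target proportion. All thresholds, step sizes, and the target ratio are assumed values.

```python
# Sketch: nudge the high-priority scheduling weight so that the measured
# delay ratio between classes stays near a target proportion
# (thresholds, step sizes, and target ratio are assumed values).
def adjust_weight(w_high, delay_high, delay_low, target_ratio=0.5):
    ratio = delay_high / max(delay_low, 1e-9)    # measured delay proportion
    error = ratio - target_ratio
    # Coarse fuzzy-like rules: the larger the error, the bigger the adjustment.
    if error > 0.2:
        w_high += 0.10
    elif error > 0.05:
        w_high += 0.02
    elif error < -0.2:
        w_high -= 0.10
    elif error < -0.05:
        w_high -= 0.02
    return min(max(w_high, 0.5), 0.95)           # keep some capacity for low priority

w = 0.7
for d_high, d_low in [(12.0, 20.0), (15.0, 20.0), (9.0, 22.0)]:
    w = adjust_weight(w, d_high, d_low)
    print(round(w, 2))
```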