Experimental Study on Recycled Aggregate Pervious Concrete

Concrete is the most widely used building material in the world, while the world also produces a large amount of construction waste each year. Processing waste concrete and using the recycled aggregate to make pervious concrete allows this construction waste to be recycled. Pervious concrete has many advantages, such as water permeability and protection of water resources. This paper tests recycled aggregate obtained by crushing high-strength waste concrete ("TOU") and low-strength waste concrete ("PU"), and analyzes the effects of porosity, cement content, mineral admixture, and recycled aggregate on the strength of pervious concrete. Strength decreases as porosity increases and increases with cement content. The mineral admixture effectively improves the workability of the mixture. The quality of the recycled aggregate has a significant effect on strength: compared with concrete using "PU" aggregate, the 7-day and 28-day strengths of concrete using "TOU" aggregate increased by 69.0% and 73.3%, respectively. Therefore, the quality of recycled aggregate should be strictly controlled during production, and the mix proportion should be designed according to the service environment and usage requirements. The test produced a recycled aggregate pervious concrete with a compressive strength of 35.8 MPa, which can be used for lightly loaded roads and provides a reference for engineering applications.

The Influence of Disturbances Generated by Arc Furnaces on the Power Quality

The paper presents the impact of electric arc furnace operation on power quality. Arc equipment is among the largest loads supplied by the power system. Disturbances arising from the electric arc during the melting process cause abrupt changes in the reactive power of the furnaces. The currents drawn by these devices therefore change abruptly, which in turn causes voltage fluctuations and light flicker. Quantitative evaluation of voltage fluctuations is now the basic criterion for assessing the influence of a disturbing load on the supply network. The paper presents a method for determining the range of voltage fluctuations and light flicker under parallel operation of arc devices. Measurements of voltage fluctuation and light flicker indicators recorded in the power supply networks of steelworks, with different numbers of arc devices operating in parallel, are presented. These measurements of power quality parameters were aimed at verifying the proposed method in practice. Changes in other power quality parameters were also analyzed: the content of higher harmonics, voltage asymmetry, and voltage dips.
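Combining the flicker contributions of several arc devices operating in parallel is commonly illustrated with the summation law of IEC/TR 61000-3-7. The snippet below is a generic illustration of that law, not the specific method proposed in the paper, and the Pst values are invented:

```python
# Combining short-term flicker severity (Pst) contributions of several
# arc devices at a common coupling point. A commonly used summation law
# (IEC/TR 61000-3-7) is:
#   Pst_total = (sum_i Pst_i ** alpha) ** (1 / alpha)
# with alpha close to 3 often assumed for arc furnaces.
def combined_pst(pst_values, alpha=3.0):
    """Combined flicker severity at the point of common coupling."""
    return sum(p ** alpha for p in pst_values) ** (1.0 / alpha)

# Two identical furnaces, each producing Pst = 1.0 on its own, do not
# simply double the flicker:
print(round(combined_pst([1.0, 1.0]), 3))  # ≈ 1.26, not 2.0
```

The cubic exponent reflects the low probability that independent arc devices produce their worst fluctuations simultaneously.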

The Creative Unfolding of “Reduced Descriptive Structures” in Musical Cognition: Technical and Theoretical Insights Based on the OpenMusic and PWGL Long-Term Feedback

We describe here the theoretical and philosophical understanding gained from the long-term use and development of algorithmic computer-based tools applied to music composition. The findings of our research lead us to interrogate specific processes and systems of communication engaged in the discovery of particular cultural artworks: artistic creation in the sono-musical domain. Our hypothesis is that patterns of auditory learning cannot be understood only in terms of social transmission, but would benefit from being questioned in the way they rely on various ranges of acoustic stimuli and modes of consciousness, and in how the different types of memory engaged in the percept-action expressive systems of our cultural communities also rely on the shadowy conscious entities we have named “Reduced Descriptive Structures”.

An Approach to Consumption of Exhaustible Resources Based on Islamic Justice and Hartwick Criteria

Nowadays, increasing attention is being paid to resource scarcity issues. Because present patterns for allocating exhaustible resources between generations have failed, and because of the challenges of providing economic justice, this essay proposes a pattern from the Islamic perspective. Using content analysis of religious texts, we conclude that governments should close the gap that exists between the per capita income of the poor and their minimum (necessary) consumption. In order to preserve exhaustible resources for poor people (not for all) across generations, the government should invest the proceeds of exhaustible resources in renewable ("endless") resources according to Hartwick's criteria and spend the returns on poor people. If these returns do not cover the gap between minimum consumption and the per capita income of the poor in a given generation, the government is responsible for covering this gap through direct consumption of exhaustible resources. To answer the question 'How much of the exhaustible resources should be expended to maintain justice between generations?' precisely, theoretical and mathematical modeling has been used and an appropriate function has been derived. The resulting consumption pattern is presented for economic policymakers in Muslim countries, and it can be useful even in non-Muslim ones.
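As a rough illustration of the Hartwick criterion invoked above, the following toy simulation (all quantities are invented and do not come from the essay's model) invests the rent from an extracted exhaustible resource in a permanent fund and consumes only the fund's return:

```python
# Toy Hartwick-style rule: resource rents are reinvested, consumption is
# limited to the return on the accumulated fund. Numbers are illustrative.
def simulate(stock, extraction, price, rate, periods):
    fund = 0.0
    consumption = []
    for _ in range(periods):
        q = min(extraction, stock)     # cannot extract more than remains
        stock -= q
        fund += q * price              # invest the resource rent
        consumption.append(fund * rate)  # consume only the return
    return stock, fund, consumption

stock, fund, c = simulate(stock=100.0, extraction=10.0, price=1.0,
                          rate=0.05, periods=10)
```

Once the resource is exhausted, the fund (here 100.0) keeps yielding a sustainable flow (5.0 per period at a 5% return) that, in the essay's pattern, would be earmarked for the poor.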

Comparative Advantage of Mobile Agent Application in Procuring Software Products on the Internet

This paper brings to the fore the inherent advantages of applying mobile agents to procure software products rather than downloading software content over the Internet. It proposes a system in which the product is delivered on a compact disk with a mobile agent as part of the deliverable. The client/user purchases a software product but must connect to the remote server of the software developer before installation. The user provides an activation code that activates the mobile agent included with the software product on the compact disk. The validity of the activation code is checked at the developer's end on connection to ascertain authenticity and prevent piracy. The system is evaluated by downloading two different software products and comparing this with installing the same products from compact disk using the mobile agent application. Downloading software content from the developer's database, as in the traditional method, requires a continuously open connection between the client and the developer's end, which is not always economically or technically feasible over a fixed network. A mobile agent, after being dispatched into the network, becomes independent of the creating process and can operate asynchronously and autonomously; it can reconnect later, after completing its task, and return to deliver results. Response time and network load are very low when the mobile agent is applied.
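A minimal sketch of the server-side activation-code check described above; the hashing scheme, salt, and function names are our own illustrative assumptions, not the paper's implementation:

```python
# Hypothetical server-side validation: the server stores only salted
# hashes of issued activation codes and checks codes presented by the
# mobile agent on connection.
import hashlib
import hmac

SALT = b"example-salt"  # assumption: a per-deployment secret salt

def code_digest(code: str) -> str:
    return hashlib.sha256(SALT + code.encode()).hexdigest()

# Registry of issued (still valid) activation codes, stored as digests:
issued = {code_digest("ABCD-1234-EFGH")}

def is_valid(code: str) -> bool:
    # hmac.compare_digest avoids timing side channels in the comparison
    return any(hmac.compare_digest(code_digest(code), d) for d in issued)

print(is_valid("ABCD-1234-EFGH"))  # True
print(is_valid("WRONG-CODE"))      # False
```

Storing digests rather than plain codes means a leaked registry does not directly reveal usable activation codes.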

An Approach to Secure Mobile Agent Communication in Multi-Agent Systems

An inter-agent communication manager facilitates communication among mobile agents via a message-passing mechanism. To date, all Foundation for Intelligent Physical Agents (FIPA) compliant agent systems can exchange messages following the standard format for sending and receiving messages. Previous work has tended to secure the messages exchanged among a community of collaborative agents commissioned to perform specific tasks by using cryptosystems. However, that approach incurs computational complexity due to the encryption and decryption required at both ends. The proposed approach to securing agent communication allows only agents created by the host agent server to communicate via the agent communication channel provided by the host agent platform; these agents are assumed to be harmless. Therefore, to secure the communication of legitimate agents from intrusion by external agents, a two-phase policy enforcement system was developed. The first phase constrains an external agent to run only on the network server, while the second phase confines the activities of the external agent to its execution environment. To implement the proposed policy, a controller agent is charged with screening any external agent entering the local area network and preventing it from migrating to the agent execution host where the legitimate agents are running. On arrival of the external agent at the host network server, an introspector agent monitors and restrains its activities. This approach secures legitimate agent communication against Man-in-the-Middle and replay attacks.

Effects of Selected Plant-Derived Nutraceuticals on the Quality and Shelf-Life Stability of Frankfurter Type Sausages during Storage

The application of natural plant extracts rich in promising antioxidant and antimicrobial ingredients in the production of frankfurter-type sausages addresses consumer demand for healthier, more functional meat products. The effects of olive leaf, green tea, and Urtica dioica L. extracts on the physicochemical, microbiological, and sensory characteristics of frankfurter-type sausage were investigated during 45 days of storage at 4 °C. The results revealed that pH and phenolic compounds decreased significantly (P < 0.05) in all samples during storage. Sausages containing 500 ppm green tea extract showed the lowest TBARS value (1.78 mg/kg) compared with the olive leaf (2.01 mg/kg) and Urtica dioica L. (2.26 mg/kg) extracts and the control (2.74 mg/kg). Plant extracts significantly (P < 0.05) reduced the counts of total mesophilic bacteria, yeasts, and molds by at least 2 log cycles (CFU/g) relative to the control samples. Sensory evaluation of texture showed no difference (P > 0.05) between sausage samples, but the sausage containing Urtica dioica L. extract had the highest scores for flavor, freshness odor, and overall acceptability. Based on these results, plant extracts can have a significant impact on the antimicrobial activity, antioxidant capacity, sensory scores, and shelf-life stability of frankfurter-type sausage.

Clash of Civilizations without Civilizational Groups: Revisiting Samuel P. Huntington´s Clash of Civilizations Theory

This paper is largely a response to, and critique of, Samuel P. Huntington's Clash of Civilizations thesis. The overriding argument is that Huntington's thesis fails to distinguish between 'groups' and 'categories'. A foundational assumption of his theory is that multinational civilizations overcome their internal collective action problems, which would enable them to pursue a unified strategy vis-à-vis the West. Without devoting sufficient intellectual attention to the processes through which multinational civilizations may gain the capacity for concerted action, i.e. become a group, he contended that the post-Cold-War world would be shaped in large measure by interactions among seven or eight major civilizations. Thus, the failure to provide a convincing analysis of multinational civilizations' transition from categories to groups is a significant weakness in Huntington's clash theory. It is also suggested that so-called Islamic terrorism and the war on terror should not be taken as expressions of a clash between a Western and an Islamic civilization, since terrorist organizations would be superfluous in a world characterized by a clash of civilizations. The consequences of multinational civilizations becoming groups are discussed in relation to contemporary Western superiority.

Carcass Characteristics and Qualities of Philippine White Mallard (Anas boschas L.) and Pekin (Anas platyrhynchos L.) Duck

The Philippine White Mallard duck was compared with the Pekin duck for its potential in meat production. A total of 50 ducklings were randomly assigned to five (5) pens per treatment after one month of brooding. Each pen containing five (5) ducks was considered a replicate. The ducks were raised until 12 weeks of age and slaughtered at the end of the growing period. Meat from both breeds was analyzed, and the data were subjected to the independent-sample t-test at the 5% level of significance. Results showed that post-mortem pH (at 0 minutes, 20 minutes, 50 minutes, 1 hour and 20 minutes, 1 hour and 50 minutes, and 24 hours) did not differ significantly (P>0.05) between breeds. However, Pekin ducks (89.84±0.71) had a significantly higher water-holding capacity than Philippine White Mallard ducks (87.93±0.63). Meat color did not differ significantly (P>0.05) except for the yellowness of the lean muscles of the Pekin duck breast. Pekin duck meat (1.15±0.04) had a significantly higher crude fat content than Philippine White Mallard meat (0.47±0.58). The study clearly showed that breed is a factor with pronounced effects on several parameters. However, these results should be considered preliminary information on the meat quality of the Philippine White Mallard duck; further studies are needed to understand and fully utilize this breed for meat production and to develop different meat products from it.

Quality Evaluation of Grape Seed Oils of the Ionian Islands Based on GC-MS and Other Spectroscopic Techniques

Grape seeds are a waste product of wineries and are often referred to as an important agricultural and industrial waste with potential pharmaceutical, food, and cosmetic applications. In this study, grape seed oil from traditional Ionian varieties was examined to determine the quality and characteristics of each variety. First, the fatty acid methyl ester (FAME) profiles were analyzed using gas chromatography-mass spectrometry after transesterification. Further quality parameters of the grape seed oils were determined by spectroscopic techniques, including UV-Vis and Raman spectroscopy. Moreover, the antioxidant capacity of the oils was measured by the 2,2'-azino-bis-3-ethylbenzothiazoline-6-sulfonic acid (ABTS) and 2,2-diphenyl-1-picrylhydrazyl (DPPH) assays and expressed in Trolox equivalents. The K and ΔΚ indices were measured at 232, 268, and 270 nm as oil quality indices. The results indicate that the total oil content of the air-dried grape seeds ranged from 5.26 to 8.77% w/w, in accordance with other grape seed varieties tested in similar studies. The composition of the grape seed oil is dominated by linoleic and oleic fatty acids, with linoleic acid ranging from 53.68 to 69.95% and the two together totaling 78-82% of FAMEs, which is analogous to the fatty acid composition of safflower oil. The ABTS and DPPH antioxidant assays scored high, showing that the oils have potential in the cosmetic and culinary businesses. Beyond that, our results demonstrate that Ionian grape seed oils have prospects that extend past cosmetic or culinary use into the pharmaceutical industry. Finally, the reclamation of grape seeds from the winery waste stream is in accordance with the bio-economy strategic framework and contributes to environmental protection.

A Preliminary Literature Review of Digital Transformation Case Studies

While struggling to succeed in today's complex market environment and to provide better customer experience and services, enterprises embrace digital transformation as a means of achieving competitiveness and fostering value creation. A digital transformation process consists of information technology implementation projects together with organizational factors such as top management support, a digital transformation strategy, and organizational changes. However, to the best of our knowledge, there is little evidence about digital transformation endeavors in organizations and how organizations perceive them: is digital transformation only about adopting digital technologies, or is a true organizational shift needed? To address this issue, and as the first step in our research project, a literature review was conducted. The analysis included case study papers from the Scopus and Web of Science databases. The following attributes were considered for the classification and analysis of papers: time component; country of case origin; case industry; and comprehension of the digital transformation concept, i.e. its focus. The research showed that organizations, public as well as private, are aware of the necessity of change and undertake digital transformation projects. The changes associated with digital transformation affect both manufacturing and service-based industries. Furthermore, we found that organizations understand that, besides implementing technologies, organizational changes must also be adopted. However, with only 29 relevant papers identified, the research positions digital transformation as an unexplored, emerging phenomenon in information systems research. The scarcity of evidence-based papers calls for further examination of this topic through cases from practice.

The Role of Business Process Management in Driving Digital Transformation: Insurance Company Case Study

Digital transformation is one of the latest trends in the global market. In order to maintain competitive advantage and sustainability, an increasing number of organizations are undertaking digital transformation processes, changing their business processes and creating new business models with the help of digital technologies. In that context, one should also examine the role of business process management (BPM) and its maturity in driving digital transformation. Therefore, the goal of this paper is to investigate the role of BPM in the digital transformation process within one organization. Since experience from practice shows that organizations in the financial sector can be regarded as leaders in digital transformation, an insurance company was selected to participate in the study. That company was selected for the high level of its BPM maturity and the fact that it had previously been through a digital transformation process. To fulfill the goals of the paper, several interviews, as well as questionnaires, were conducted within the selected company, and the results are presented in the form of a case study. The results indicate that the digital transformation process within the observed company has been successful, with particular emphasis on the development of a digital strategy, BPM, and change management. The role of BPM in the digital transformation of the observed company is discussed further in the paper.

A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm

Feature selection and attribute reduction are crucial problems and widely used techniques in machine learning, data mining, and pattern recognition for overcoming the well-known curse of dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we employ the fuzzy-rough dependency degree (FRDD) of lower approximation-based fuzzy-rough feature selection (L-FRFS), owing to its effectiveness in feature selection. For the search strategy, a modified binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers were employed to compare the proposed approach with several existing methods over twenty-two datasets from the UCI repository, including nine high-dimensional and large ones. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of both the number of selected features and classification accuracy.
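The binary search component can be sketched as follows. This is a toy reconstruction, not the authors' B-SFLA: the fitness function below is an invented stand-in for the fuzzy-rough dependency degree, and the "leap" rule is a simplified bit-copying move toward the memeplex best:

```python
# Toy binary shuffled frog leaping search over feature masks (bit vectors).
import random

def fitness(mask, relevant={0, 2, 5}):
    # Invented objective: reward covering "relevant" features, penalize
    # subset size (a crude proxy for dependency-degree-style evaluation).
    hits = sum(mask[i] for i in relevant)
    return hits - 0.1 * sum(mask)

def bsfla(n_features=8, frogs=20, memeplexes=4, iters=50, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(frogs)]
    for _ in range(iters):
        pop.sort(key=fitness, reverse=True)
        for m in range(memeplexes):
            plex = pop[m::memeplexes]       # shuffle frogs into memeplexes
            best, worst = plex[0], plex[-1]
            # Move the worst frog toward the memeplex best: copy each bit
            # from the best with probability 1/2 (a simple binary "leap").
            new = [b if rng.random() < 0.5 else w for b, w in zip(best, worst)]
            if fitness(new) > fitness(worst):
                worst[:] = new              # accept only improving moves
    return max(pop, key=fitness)

best = bsfla()
```

In the paper's method the fitness evaluation would instead query the FRDD of the candidate feature subset on the training data.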

Quantifying Uncertainties in an Archetype-Based Building Stock Energy Model by Use of Individual Building Models

Focus on reducing energy consumption in existing buildings at large scale, e.g. in cities or countries, has been increasing in recent years. In order to reduce energy consumption in existing buildings, political incentive schemes are put in place and large-scale investments are made by utility companies. Prioritising these investments requires a comprehensive overview of the energy consumption in the existing building stock, as well as of the potential energy savings. However, a building stock comprises thousands of buildings with different characteristics, making it difficult to model energy consumption accurately. Moreover, the complexity of the building stock makes it difficult to convey model results to policymakers and other stakeholders. In order to manage this complexity, building archetypes are often employed in building stock energy models (BSEMs). Building archetypes are formed by segmenting the building stock according to specific characteristics. Segmenting by building type and building age is common, among other reasons because this information is often readily available, and this segmentation makes it easy to convey results to non-experts. However, using a single archetypical building to represent all buildings in a segment of the building stock entails a loss of detail: thermal characteristics are aggregated, while other characteristics that could affect the energy efficiency of a building are disregarded. Thus, using a simplified representation of the building stock could come at the expense of model accuracy. The present study evaluates the accuracy of a conventional archetype-based BSEM that segments the building stock according to building type and age. The accuracy is evaluated in terms of the archetypes' ability to emulate the average energy demands of the buildings they are meant to represent.
This is done for the buildings' energy demands as a whole as well as for relevant sub-demands, both evaluated in relation to the type and age of the building. This should provide researchers who use archetypes in BSEMs with an indication of the expected accuracy of the conventional archetype model, as well as of the accuracy lost in specific parts of the calculation due to use of the archetype method.
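The accuracy question can be made concrete with a toy calculation (all demand values below are invented): the archetype error for one segment is the deviation of the single archetype value from the segment's mean demand.

```python
# Hypothetical heat demands of individual buildings in one segment
# (e.g. single-family houses built 1961-1972), in kWh/m2 per year:
heat_demand_kwh_m2 = [95, 120, 88, 140, 105, 132, 99, 151]
archetype_value = 110.0  # single archetypical building for the segment

segment_mean = sum(heat_demand_kwh_m2) / len(heat_demand_kwh_m2)
error_pct = 100.0 * (archetype_value - segment_mean) / segment_mean
print(f"segment mean: {segment_mean:.1f} kWh/m2, "
      f"archetype error: {error_pct:+.1f}%")
```

The same comparison, applied per sub-demand (space heating, hot water, etc.) and per type/age segment, is what the study performs at stock scale.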

Multiscale Syntheses of Knee Collateral Ligament Stresses: Aggregate Mechanics as a Function of Molecular Properties

Knee collateral ligaments play a significant role in restraining excessive frontal-plane motion (varus/valgus rotations). In this investigation, a multiscale framework was developed based on the structural hierarchies of the collateral ligaments, from the bottom (the tropocollagen molecule) up to the fibre-reinforced structure. Experimental data from failure tensile tests served as the principal driver of the developed model. The model was calibrated statistically using Bayesian calibration owing to the large number of unknown parameters. It was then scaled up to fit the real structure of the collateral ligaments and simulated under realistic boundary conditions. Predictions successfully described the observed transient response of the collateral ligaments during tensile testing under pre- and post-damage loading conditions. The maximum stresses and strengths of the collateral ligaments were observed near the femoral insertions, a result in good agreement with experimental investigations. Also, for the first time, damage initiation and propagation were documented with this model as a function of the cross-link density between tropocollagen molecules.

Normalizing Scientometric Indicators of Individual Publications Using Local Cluster Detection Methods on Citation Networks

One of the major shortcomings of widely used scientometric indicators is that different disciplines cannot be compared with each other. The issue of cross-disciplinary normalization has long been discussed, but even the classification of publications into scientific domains poses problems. Structural properties of citation networks offer new possibilities; however, the large size and constant growth of these networks call for caution. Here we present a new tool that relies on the structural properties of citation networks to perform cross-field normalization of scientometric indicators of individual publications. Because of the large size of the networks, a systematic procedure for identifying scientific domains based on a local community detection algorithm is proposed. The algorithm is tested on different benchmark and real-world networks. Then, using this algorithm, the mechanism of the indicator normalization process is shown for a few indicators, such as the citation count, the P-index, and a local version of the PageRank indicator. The fat-tailed distribution of the article indicators enables us to perform the normalization process successfully.
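The basic normalization step can be sketched as follows (our simplified construction, not the paper's exact procedure): divide each paper's indicator by the mean indicator of the local cluster (field) that the community detection step assigned it to.

```python
# Toy cross-field normalization of citation counts. Cluster labels stand
# in for the output of a local community detection algorithm; all values
# are invented.
citations = {"p1": 40, "p2": 10, "p3": 5, "p4": 2, "p5": 3}
cluster_of = {"p1": "bio", "p2": "bio", "p3": "math", "p4": "math", "p5": "math"}

def normalized(citations, cluster_of):
    clusters = {}
    for p, c in cluster_of.items():
        clusters.setdefault(c, []).append(p)
    means = {c: sum(citations[p] for p in ps) / len(ps)
             for c, ps in clusters.items()}
    # Each paper's count is rescaled by its own field's mean:
    return {p: citations[p] / means[cluster_of[p]] for p in citations}

norm = normalized(citations, cluster_of)
print(norm["p3"], norm["p1"])  # comparable across fields after rescaling
```

After rescaling, a modestly cited paper in a low-citation field ("p3") scores close to a well-cited paper in a high-citation field ("p1"), which is the point of cross-field normalization.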

Determination of the Quality of the Machined Surface Using Fuzzy Logic

This paper deals with measuring and modelling the quality of the machined surface in a metal machining process. The average surface roughness (Ra), which represents the quality of the machined part, was measured during dry turning of AISI 4140 steel. A large number of factors with unknown relations among them influence this parameter, which makes mathematical modelling extremely complicated. Different values of cutting speed, feed rate, depth of cut (the cutting regime) and workpiece hardness cause different surface roughness values. Modelling with soft computing techniques can be very useful in such cases. This paper presents the use of a fuzzy logic-based system for determining metal machining process parameters in order to find proper values for the cutting regime.
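The kind of fuzzy system described above can be sketched as follows. This is a toy Sugeno-style example with invented membership ranges and rules, only to illustrate the mechanism; the paper's actual rule base and ranges are not reproduced here:

```python
# Toy fuzzy estimate of surface roughness Ra from feed rate and cutting
# speed. All membership ranges and rule consequents are assumptions.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def predict_ra(feed_mm_rev, speed_m_min):
    # Fuzzify the inputs (ranges are invented):
    feed_low = tri(feed_mm_rev, 0.0, 0.1, 0.25)
    feed_high = tri(feed_mm_rev, 0.1, 0.3, 0.5)
    speed_low = tri(speed_m_min, 0, 100, 250)
    speed_high = tri(speed_m_min, 100, 300, 500)
    # Two illustrative rules with crisp (Sugeno-style) consequents:
    #   IF feed is low  AND speed is high THEN Ra ~ 0.8 um
    #   IF feed is high AND speed is low  THEN Ra ~ 3.2 um
    w1 = min(feed_low, speed_high)
    w2 = min(feed_high, speed_low)
    if w1 + w2 == 0:
        return None  # no rule fires for these inputs
    return (w1 * 0.8 + w2 * 3.2) / (w1 + w2)  # weighted-average defuzzification
```

A full system would add rules covering depth of cut and workpiece hardness, the other cutting-regime factors named in the abstract.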

Absorbed Dose Estimation of 177Lu-DOTATOC in Adenocarcinoma Breast Cancer Bearing Mice

In this study, the absorbed doses of human organs after injection of 177Lu-DOTATOC were estimated based on the biodistribution of the complex in adenocarcinoma breast cancer bearing mice. For this purpose, the biodistribution of the radiolabelled complex was studied, and compartmental modeling was applied to calculate the absorbed dose with high precision. As expected, 177Lu-DOTATOC showed notable specific uptake in the tumor and pancreas, organs with a high level of somatostatin receptor on their surface, indicating the effectiveness of the radio-conjugate for targeting breast adenocarcinoma tumors. The modeling yielded exponential time-activity equations, which were integrated to obtain the cumulated activity data. The results also showed that the absorbed doses of non-target organs such as the liver, spleen and pancreas were approximately 0.008, 0.004, and 0.039, respectively. Since these values are much lower than the target (tumor) absorbed dose, this low toxicity suggests that the complex is a good agent for therapy.
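The cumulated-activity step can be illustrated with a single-compartment example: if the fitted time-activity curve of an organ is A(t) = A0·exp(-λt), its integral from 0 to infinity is A0/λ. The numbers below are invented, not the study's fitted parameters:

```python
# Cumulated activity from a mono-exponential time-activity curve.
import math

A0 = 5.0    # fitted initial activity in the organ (MBq), assumed
lam = 0.12  # effective decay constant (1/h), assumed

# Analytic integral of A0 * exp(-lam * t) from 0 to infinity:
cumulated_analytic = A0 / lam  # MBq·h

# Numerical cross-check with a simple Riemann sum over a long horizon:
dt, t_max = 0.01, 400.0
cumulated_numeric = sum(A0 * math.exp(-lam * (i * dt)) * dt
                        for i in range(int(t_max / dt)))
```

Multi-exponential fits integrate term by term the same way; the cumulated activity then feeds the absorbed-dose calculation via the relevant S-values.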

Spectral Mixture Model Applied to Cannabis Parcel Determination

Many research projects require accurate delineation of the different land cover types of agricultural areas. This is especially critical for identifying specific plants such as cannabis. However, the complexity of vegetation stand structure, the abundance of vegetation species, and the smooth transitions between different secondary succession stages make vegetation classification difficult with traditional approaches such as the maximum likelihood classifier, which most of the time distinguishes only between trees and annual crops or grain; accurately determining cannabis mixed with other plants has been difficult. In this paper, a mixture distribution model approach is applied to classify pure and mixed cannabis parcels using Worldview-2 imagery in the Lakes region of Turkey. Five land use types, including sunflower, maize, bare soil, and cannabis, were identified in the image. A constrained Gaussian mixture discriminant analysis (GMDA) was used to unmix the image. In the study, 255 reflectance ratios derived from the spectral signatures of seven bands (Blue, Green, Yellow, Red, Red-edge, NIR1, NIR2) were randomly split into 80% training and 20% test data. The Gaussian mixture distribution model approach proved to be an effective and convenient way to exploit very high spatial resolution imagery for distinguishing cannabis vegetation. Based on the overall classification accuracies, the Gaussian mixture distribution model was very successful at the image classification task. The approach is sensitive enough to capture illegal cannabis planting areas in large plains and can also be used for monitoring and detecting illegal cannabis planting areas through their spectral reflectance.
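The discriminant idea can be sketched in a heavily simplified form (our simplification of the constrained GMDA, with invented parameters): model each class by a Gaussian over a single reflectance ratio and assign a pixel to the class with the highest likelihood.

```python
# Per-class Gaussian discriminant over one reflectance ratio. The class
# means and standard deviations below are hypothetical training estimates.
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

classes = {
    "cannabis":  (0.72, 0.05),
    "maize":     (0.55, 0.06),
    "bare_soil": (0.20, 0.08),
}

def classify(ratio):
    # Maximum-likelihood assignment over the class-conditional Gaussians:
    return max(classes, key=lambda c: gaussian_pdf(ratio, *classes[c]))

print(classify(0.70))  # "cannabis" under these invented parameters
```

The study's approach generalizes this to 255 band ratios and to mixtures of Gaussians per class, which is what allows mixed (sub-pixel) cannabis parcels to be unmixed rather than hard-classified.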

An Efficient Motion Recognition System Based on LMA Technique and a Discrete Hidden Markov Model

Interest in human motion recognition has increased greatly in recent years due to its importance in a wide range of applications, such as human-computer interaction, intelligent surveillance, augmented reality, and content-based video compression and retrieval. However, it is still regarded as a challenging task, especially in realistic scenarios. It can be seen as a general machine learning problem requiring an effective human motion representation and an efficient learning method. In this work, we introduce a descriptor based on the Laban Movement Analysis (LMA) technique, a formal and universal language for human movement, to capture both quantitative and qualitative aspects of movement. We use a Discrete Hidden Markov Model (DHMM) for training and classifying motions. We improve the classification algorithm by proposing two DHMMs for each motion class to process the motion sequence in two different directions, forward and backward. This modification avoids the misclassifications that can occur when recognizing similar motions. Two experiments were conducted. In the first, we evaluated our method on a public dataset, the Microsoft Research Cambridge-12 Kinect gesture dataset (MSRC-12), which is widely used for evaluating action/gesture recognition methods. In the second, we built a dataset composed of 10 gestures (introduce yourself, waving, dance, move, turn left, turn right, stop, sit down, increase velocity, decrease velocity) performed by 20 persons. The evaluation of the system includes testing the efficiency of our LMA-based descriptor vector with the basic DHMM method and comparing the recognition results of the modified DHMM with the original one. Experimental results demonstrate that our method outperforms most existing methods evaluated on the MSRC-12 dataset and achieves a near-perfect classification rate on our own dataset.
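The likelihood computation underlying DHMM classification is the standard forward algorithm; the sketch below uses a tiny invented model (two states, two observation symbols) rather than the paper's trained gesture models:

```python
# Forward algorithm: P(obs | model) for a discrete HMM.
def forward_likelihood(obs, pi, A, B):
    """pi: start probabilities, A: state transition matrix,
    B: emission matrix (B[state][symbol])."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[q] * A[q][s] for q in range(n)) * B[s][o]
                 for s in range(n)]
    return sum(alpha)

# Invented model parameters:
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.5, 0.5], [0.1, 0.9]]

# Classification picks the motion class whose DHMM maximizes this
# likelihood over the quantized descriptor sequence:
print(forward_likelihood([0, 1, 1], pi, A, B))
```

The paper's forward/backward pairing evaluates each gesture's sequence under two such models per class, one trained on the sequence read in each direction, and combines the scores to separate similar motions.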