Abstract: Information generated by various computerization processes is a potentially rich source of knowledge for its designated community. Passing this information from generation to generation without altering its meaning is a challenging activity. To preserve and archive data for future generations, it is essential to prove the authenticity of the data. This can be achieved by extracting metadata from the data, which can prove authenticity and establish trust in the archived data. A subsequent challenge is technology obsolescence; metadata extraction and standardization can be used effectively to tackle this problem. Metadata can broadly be categorized at two levels: technical and domain. Technical metadata provides the information needed to understand and interpret a data record, but this level of metadata alone is not sufficient to establish trustworthiness. We have developed a tool that extracts and standardizes both technical and domain-level metadata. This paper describes the different features of the tool and how it was developed.
Abstract: In the present study, response surface methodology has been used to optimize the turn-assisted deep cold rolling process of AISI 4140 steel. A regression model is developed to predict surface hardness and surface roughness using response surface methodology and a central composite design. In developing the predictive model, the deep cold rolling force, ball diameter, initial roughness of the workpiece, and number of tool passes are considered as model variables. The rolling force and ball diameter are found to be significant factors for surface hardness, while the ball diameter and number of tool passes are significant for surface roughness. The predicted surface hardness and surface roughness values and the subsequent verification experiments under the optimal operating conditions confirmed the validity of the predictive model. The absolute average error between the experimental and predicted values at the optimal combination of parameter settings is 0.16% for surface hardness and 1.58% for surface roughness. Using the optimal processing parameters, the surface hardness is improved from 225 to 306 HV, an increase in near-surface hardness of about 36%, and the surface roughness is improved from 4.84 µm to 0.252 µm, a decrease of about 95%. The depth of compression is found to be more than 300 µm from the microstructure analysis, which correlates with the results obtained from the microhardness measurements. A Taylor Hobson Talysurf tester, a micro Vickers hardness tester, optical microscopy and X-ray diffractometry are used to characterize the modified surface layer.
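As context for the regression model above, the sketch below shows how a second-order response surface of the kind produced by a central composite design can be fitted by least squares. The factor settings and responses are invented toy values under an assumed quadratic model, not the paper's data.

```python
import numpy as np

def design_matrix(X):
    """Expand factor settings into a full quadratic model matrix:
    intercept, linear terms, pure quadratic terms, two-factor interactions."""
    n, k = X.shape
    cols = [np.ones(n)]                       # intercept
    cols += [X[:, i] for i in range(k)]       # linear terms
    cols += [X[:, i] ** 2 for i in range(k)]  # pure quadratic terms
    cols += [X[:, i] * X[:, j]                # interactions
             for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

def fit_response_surface(X, y):
    """Least-squares estimate of the regression coefficients."""
    A = design_matrix(X)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

# Toy data: 2 coded factors (think rolling force, ball diameter), 9 runs
# of a 3x3 factorial; the response follows an assumed quadratic surface.
X = np.array([[f, d] for f in (-1, 0, 1) for d in (-1, 0, 1)], float)
y = 250 + 30 * X[:, 0] + 10 * X[:, 1] - 5 * X[:, 0] ** 2
beta = fit_response_surface(X, y)
print(np.round(beta, 2))
```

The fitted coefficients recover the assumed surface exactly here because the toy data is noise-free; with experimental data, significance of each term would be judged from the residuals, as in the study's ANOVA.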
Abstract: Given the limited research on Small and Medium-sized
Enterprises' (SMEs) contribution to Corporate Social
Responsibility (CSR) and even scarcer research on Swiss SMEs, this
paper helps to fill these gaps by enabling the identification of supranational
SME parameters. Thus, the paper investigates the current
state of SME practices in Switzerland and across 15 other countries.
Combining the degree to which SMEs demonstrate an explicit (or
business case) approach or see CSR as an implicit moral activity with
the assessment of their attributes for “variety of capitalism” defines
the framework of this comparative analysis. To outline Swiss small
business CSR patterns in particular, 40 SME owner-managers were
interviewed. A secondary data analysis of studies from different
countries laid groundwork for this comparative overview of small
business CSR. The paper identifies Swiss small business CSR as
driven by norms, values, and by the aspiration to contribute to
society, thus, as an implicit part of the day-to-day business. Similar to
most Central European, Mediterranean, Nordic, and Asian countries,
explicit CSR is still very rare in Swiss SMEs. Astonishingly, British
and American SMEs also follow this pattern in spite of their strong
and distinctly liberal market economies. Though other findings show
that nationality matters, this research concludes that SME culture and
an informal CSR agenda are strongly formative and superseding even
forces of market economies, nationally cultural patterns, and
language. Hence, classifications of countries by their market system,
as found in the comparative capitalism literature, do not match the
CSR practices in SMEs as they do not mirror the peculiarities of their
business. This raises questions on the universality and
generalisability of unmediated, explicit management concepts,
especially in the context of small firms.
Abstract: In this study, three robust prediction methods, namely the artificial neural network (ANN), the adaptive neuro-fuzzy inference system (ANFIS) and the support vector machine (SVM), were used for computing the resonant frequency of A-shaped compact microstrip antennas (ACMAs) operating in the UHF band. Firstly, the resonant frequencies of 144 ACMAs with various dimensions and electrical parameters were simulated with the help of IE3D™, based on the method of moments (MoM). The ANN, ANFIS and SVM models for computing the resonant frequency were then built from the simulation data. 124 simulated ACMAs were utilized for training and the remaining 20 ACMAs were used for testing the ANN, ANFIS and SVM models. The performance of the ANN, ANFIS and SVM models is compared in the training and test processes. The average percentage errors (APE) of the computed resonant frequencies in training were obtained as 0.457%, 0.399% and 0.600% for the ANN, ANFIS and SVM, respectively. The constructed models were then tested, and APE values of 0.601% for the ANN, 0.744% for the ANFIS and 0.623% for the SVM were achieved. The results obtained here show that the ANN, ANFIS and SVM methods can be successfully applied to compute the resonant frequency of ACMAs, since they are useful and versatile methods that yield accurate results.
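The APE figures above can be reproduced with a simple formula. The sketch below assumes the usual definition APE = mean(|f_sim − f_pred| / f_sim) × 100; the frequency values are invented for illustration, not taken from the 144 simulated ACMAs.

```python
import numpy as np

def average_percentage_error(f_measured, f_predicted):
    """APE = mean(|f_measured - f_predicted| / f_measured) * 100."""
    f_measured = np.asarray(f_measured, dtype=float)
    f_predicted = np.asarray(f_predicted, dtype=float)
    return np.mean(np.abs(f_measured - f_predicted) / f_measured) * 100.0

# Example: simulated vs. model-predicted resonant frequencies in GHz.
simulated = [0.50, 0.62, 0.75, 0.90]
predicted = [0.502, 0.617, 0.754, 0.896]

print(f"APE = {average_percentage_error(simulated, predicted):.3f}%")
```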
Abstract: This paper proposes a novel heuristic algorithm that aims to determine the best size and location of distributed generators in unbalanced distribution networks. The proposed heuristic algorithm can deal with planning cases where power loss is to be optimized without violating the system's practical constraints. The distributed generation units in the proposed algorithm are modeled as voltage-controlled nodes with the flexibility to be converted to constant-power-factor nodes in case of reactive power limit violation. The proposed algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results obtained show the effectiveness of the proposed algorithm.
Abstract: Nowadays, cloud environments are becoming a necessity for companies; this technology makes it possible to access data anywhere and at any time. It also provides optimized and secured access to resources and better security for the data stored on the platform. However, some companies do not trust cloud providers, believing that providers can access and modify confidential data such as bank accounts. Many works have been done in this context; they conclude that the encryption methods applied by providers ensure confidentiality, but they overlook the fact that cloud providers can decrypt the confidential resources. The best solution is to apply some operations to the data before sending it to the cloud provider, with the objective of making it unreadable. The principal idea is to allow users to protect their data with their own methods. In this paper, we demonstrate our approach and prove that it is more efficient in terms of execution time than some existing methods. This work aims at enhancing the quality of service of providers and ensuring the trust of customers.
Abstract: This paper reviews the model-based qualitative and
quantitative Operations Management research in the context of
Construction Supply Chain Management (CSCM). Construction
industry has been traditionally blamed for low productivity, cost and
time overruns, waste, high fragmentation and adversarial
relationships. The construction industry has been slower than other
industries to employ the Supply Chain Management (SCM) concept
and develop models that support decision-making and planning.
However, over the last decade there has been a distinct shift from a
project-based to a supply-based approach to construction management.
CSCM has emerged as a promising new management tool for construction
operations, improving the performance of construction projects in
terms of cost, time and quality. Modeling the Construction Supply
Chain (CSC) offers the means to reap the benefits of SCM, make
informed decisions and gain competitive advantage. Different
modeling approaches and methodologies have been applied in the
multi-disciplinary and heterogeneous research field of CSCM. The
literature review reveals that a considerable percentage of the CSC
modeling research accommodates conceptual or process models
which present general management frameworks and do not relate to
acknowledged soft Operations Research methods. We particularly
focus on the model-based quantitative research and categorize the
CSCM models depending on their scope, objectives, modeling
approach, solution methods and software used. Although over the last
few years there has clearly been an increase in research papers on
quantitative CSC models, we find that the relevant literature is
very fragmented with limited applications of simulation,
mathematical programming and simulation-based optimization. Most
applications are project-specific or study only parts of the supply
system. Thus, some complex interdependencies within construction
are neglected and the implementation of the integrated supply chain
management is hindered. We conclude this paper by giving future
research directions and emphasizing the need to develop optimization
models for integrated CSCM. We stress that CSC modeling needs a
multi-dimensional, system-wide and long-term perspective. Finally,
prior applications of SCM to other industries have to be taken into
account in order to model CSCs, but not without translating the
generic concepts to the context of construction industry.
Abstract: This paper focuses on a critical component of the
situational awareness (SA), the control of autonomous vertical flight
for a vectored thrust aerial vehicle (VTAV). Within the SA strategy, we
propose a neural-network motion control procedure to address the
dynamics variation and differing performance requirements along the
flight trajectory of a VTAV. This control strategy, using a NARMA-L2
neurocontroller for the chosen VTAV model, has been verified by
simulating take-off and forward maneuvers in the Simulink software
package, and demonstrated good performance with fast stabilization of
the motors; consequently, fast SA with economy of energy can be
achieved during search-and-rescue operations.
Abstract: Designing cost-efficient, secure network protocols for
Wireless Sensor Networks (WSNs) is a challenging problem because
sensors are resource-limited wireless devices. Security services such
as authentication and improved pairwise key establishment are
critical to highly efficient sensor networks. For sensor
nodes to communicate securely and efficiently with each other, the use
of cryptographic techniques is necessary. In this paper, we propose two
key predistribution schemes that enable a mobile sink to establish a
secure data-communication link, on the fly, with any sensor node.
The intermediate nodes along the path to the sink are able to verify
the authenticity and integrity of the incoming packets using a
predicted value of the key generated by the sender’s essential power.
The proposed schemes are based on pairwise keys with the mobile
sink. Our analytical results clearly show that our schemes perform
better in terms of network resilience to node capture than existing
schemes if used in wireless sensor networks with mobile sinks.
Abstract: This paper presents an approach for the classification of
an unstructured format description for identification of file formats.
The main contribution of this work is the employment of data mining
techniques to support file format selection with just the unstructured
text description that comprises the most important format features for
a particular organisation. Subsequently, the file format identification
method employs a file format classifier and associated configurations to
support digital preservation experts with an estimation of the required
file format. Our goal is to make use of a format specification knowledge
base aggregated from different Web sources in order to select a file
format for a particular institution. Using the naive Bayes method,
the decision support system recommends a file format to an expert for
his or her institution. The proposed methods facilitate the selection of
a file format and improve the quality of the digital preservation process. The
presented approach is meant to facilitate decision making for the
preservation of digital content in libraries and archives using domain
expert knowledge and specifications of file formats. To facilitate
decision-making, the aggregated information about the file formats is
presented as a file format vocabulary that comprises most common
terms that are characteristic for all researched formats. The goal is to
suggest a particular file format based on this vocabulary for analysis
by an expert. The sample file format calculation and the calculation
results including probabilities are presented in the evaluation section.
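The naive Bayes recommendation step described above can be sketched as scoring candidate formats against the terms of an unstructured description. The format names, term counts and priors below are invented for illustration, not drawn from the paper's knowledge base.

```python
from collections import Counter
import math

# Hypothetical term counts per format vocabulary, as if aggregated
# from Web sources; a Counter returns 0 for unseen terms.
vocabulary = {
    "PDF/A": Counter({"archival": 8, "document": 10, "embedded": 4, "fonts": 3}),
    "TIFF":  Counter({"image": 9, "lossless": 6, "scanned": 5, "archival": 2}),
}
priors = {"PDF/A": 0.5, "TIFF": 0.5}

def score(description_terms, fmt):
    """Log-probability of the format given the terms (Laplace smoothing)."""
    counts = vocabulary[fmt]
    total = sum(counts.values())
    vocab_size = len({t for c in vocabulary.values() for t in c})
    s = math.log(priors[fmt])
    for term in description_terms:
        s += math.log((counts[term] + 1) / (total + vocab_size))
    return s

def suggest(description_terms):
    """Recommend the format with the highest posterior score."""
    return max(vocabulary, key=lambda f: score(description_terms, f))

print(suggest(["archival", "document", "fonts"]))  # → PDF/A
```

The recommended format is then presented to the domain expert for confirmation, as in the decision support workflow above.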
Abstract: Experts, enterprises and operators expect that
bandwidth demand will increase to rates of 100 to 1,000 Mbps
within several years. Therefore the most important question is which
technology shall satisfy future consumer broadband demands.
Currently the consensus is that fiber technology has the best
technical characteristics to achieve such high bandwidth rates.
But fiber technology is thus far very cost-intensive and
resource-consuming. To avoid these investments, operators are
concentrating on upgrading the existing copper and hybrid fiber-coax
infrastructures.
This work presents a comparison of the copper and fiber
technologies including an overview about the current German
broadband market. Both technologies are reviewed in terms of
demand, willingness to pay and economic efficiency in connection
with the technical characteristics.
Abstract: In this work, the estimated available physical habitat for
the Ictalurus punctatus species is compared with the physical habitat
estimated for the same river reach after modification, with the aim of
creating a linear park along a length of 5,500 m.
To determine the effect of ecological park construction, on
physical habitat of the Lerma river stretch of study, first, the available
habitat for the Ictalurus punctatus species was estimated through the
simulation of the physical habitat, using surveying, hydraulics,
and habitat information obtained at the river reach in its current
state. Second, the available habitat for the same species was estimated
by simulating the physical habitat under the proposed modification for
the ecological park creation. Third, a comparison between both
scenarios is presented in terms of the available habitat
estimated for Ictalurus punctatus species, concluding that in cases of
adult and spawning life stages, changes in the channel to create an
ecological park would produce a considerable loss of potentially
usable habitat (PUH), while in the case of the juvenile life stage PUH
remains virtually unchanged, and in the case of life stage fry the PUH
would increase due to the presence of velocities and depths of lesser
magnitude, due to the presence of minor flow rates and lower volume
of the wet channel.
It is expected that habitat modification for the linear park construction
may jeopardize the conservation of the Ictalurus punctatus species at
the river reach of the study.
Abstract: Any signal transmitted over a channel is corrupted by noise and interference. A host of channel coding techniques has been proposed to alleviate the effect of such noise and interference. Among these, Turbo codes are recommended because of their increased capacity at higher transmission rates and superior performance over convolutional codes. Multimedia elements, which involve ample amounts of data, are best protected by Turbo codes. The Turbo decoder employs the Maximum A-posteriori Probability (MAP) and Soft Output Viterbi Algorithm (SOVA) decoding algorithms. Conventional Turbo coded systems employ Equal Error Protection (EEP), in which the protection of all the data in an information message is uniform. Some applications involve Unequal Error Protection (UEP), in which the level of protection is higher for important information bits than for other bits. In this work, the traditional Log MAP decoding algorithm is enhanced by using optimized scaling factors for both decoders. The error-correcting performance in the presence of UEP over the Additive White Gaussian Noise (AWGN) and Rayleigh fading channels is analyzed for the transmission of images with the Discrete Cosine Transform (DCT) as the source coding technique. This paper compares the performance of the Log MAP, Modified Log MAP (MlogMAP) and Enhanced Log MAP (ElogMAP) algorithms used for image transmission. The MlogMAP algorithm is found to be best for lower Eb/N0 values, but for higher Eb/N0 ElogMAP performs better with optimized scaling factors. The performance comparison of the AWGN and fading channels indicates the robustness of the proposed algorithm. According to the performance of the three different message classes, class 3 is more protected than the other two classes. From the performance analysis, it is observed that the ElogMAP algorithm with UEP is best for the transmission of an image compared to the Log MAP and MlogMAP decoding algorithms.
Abstract: A sensor network consists of multiple detection
locations called sensor nodes, each of which is tiny, lightweight
and portable. Single-path routing protocols in wireless sensor
networks can lead to holes in the network, since only the nodes
present in the single path are used for data transmission. Apart
from advantages such as reduced computation, complexity and
resource utilization, there are drawbacks such as reduced throughput,
increased traffic load and delay in data delivery. Therefore, multipath
routing protocols are preferred for WSN. Distributing the traffic
among multiple paths increases the network lifetime. We propose a
scheme, for the data to be transmitted through a dominant path to
save energy. In order to obtain a high delivery ratio, a basic route
reconstruction protocol is utilized to reconstruct the path whenever a
failure is detected. A basic reconstruction routing (BRR) algorithm is
proposed, in which a node can bypass a path failure by using the
existing routing information from its neighbourhood while
the collected data is transmitted from the source to the sink. In order
to save the energy and attain high data delivery ratio, data is
transmitted along a multiple path, which is achieved by BRR
algorithm whenever a failure is detected. Further, the analysis of
how the proposed protocol overcomes the drawback of the existing
protocols is presented. The performance of our protocol is compared
to AOMDV and energy efficient node-disjoint multipath routing
protocol (EENDMRP). The system is implemented using NS-2.34.
The simulation results show that the proposed protocol has high
delivery ratio with low energy consumption.
Abstract: The main function of Medium Access Control (MAC) is to share the channel efficiently among all nodes. In a real-time scenario, a certain amount of bandwidth is wasted due to back-off periods. More bandwidth is wasted in the idle state if the back-off period is very high, while collisions may occur if the back-off period is small. An optimization is therefore needed for this problem. The main objective of this work is to reduce the delay due to the back-off period, thereby reducing collisions and increasing throughput. A method called the virtual back-off algorithm (VBA) is used to optimize the back-off period, thereby increasing throughput and reducing collisions. The main idea is to optimize the number of transmissions for every node. A counter is introduced at each node to implement this idea, where the counter value represents the sequence number. VBA is classified into two types: VBA with counter sharing (VBA-CS) and VBA with no counter sharing (VBA-NCS). These two variants of VBA are compared for various parameters. Simulation is done in the NS-2 environment. The results obtained are found to be promising.
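The abstract sketches VBA only at a high level, so the toy simulation below is a hedged illustration of the underlying idea rather than the authors' implementation: each node holds a counter (its sequence number) that maps it to a unique transmission slot, avoiding the collisions that random back-off can produce. The node count, window size and number of rounds are invented.

```python
import random

def random_backoff_collisions(n_nodes, window, rounds, seed=1):
    """Count slots chosen by more than one node under random back-off."""
    rng = random.Random(seed)
    collisions = 0
    for _ in range(rounds):
        slots = [rng.randrange(window) for _ in range(n_nodes)]
        collisions += sum(1 for s in set(slots) if slots.count(s) > 1)
    return collisions

def vba_collisions(n_nodes, rounds):
    """Counter-ordered transmissions: node i always uses slot i."""
    # Each node transmits in the slot given by its unique counter value,
    # so no two nodes ever contend for the same slot.
    return 0

# Ten nodes, a 16-slot contention window, 100 contention rounds.
print(random_backoff_collisions(10, 16, 100), vba_collisions(10, 100))
```

Under this interpretation, the counter removes contention entirely, at the cost of the coordination that the VBA-CS/VBA-NCS variants handle differently.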
Abstract: Venture capital is becoming an increasingly advanced
and effective source of financing for innovation projects, which carry
a high level of risk. In developed countries, it plays a key role
in transforming innovation projects into successful businesses and
creating the prosperity of the modern economy. In Russia, there are
many necessary preconditions for creating an effective venture
investment system: a network of public institutions for innovation
financing operates, and there is a significant number of small and
medium-sized enterprises capable of selling products with good
market potential. However, the current system does not demonstrate the
necessary level of efficiency in practice, which can be substantially
explained by the absence of a clear plan of action to form the
national venture model and by the lack of experience with successful
venture deals and profitable exits in the Russian economy. This paper
studies the influence of various factors on the venture industry
development by the example of the IT-sector in Russia. The choice of
the sector is based on the fact, that this segment is the main driver of
the venture capital market growth in Russia, and the necessary set of
data exists. The size of investment of the second round is used as the
dependent variable. To analyse the influence of the previous round,
such determinant as the volume of the previous (first) round
investments is used. A dummy variable is also included in the
regression to examine whether the participation of an investor with high
reputation and experience in the previous round influences the size
of the next investment round. The regression analysis of short-term
interrelations between the studied variables reveals the prevailing
influence of the volume of first-round investments on the volume of
second-round venture investments. The most important
determinant of the value of the second-round investment is the value
of the first-round investment, meaning that the most competitive
start-up teams on the Russian market are those that can attract more
money at the start, while target market growth is not a factor of
crucial importance. This supports the view that VC in Russia is
driven by endogenous factors and not by exogenous ones that are
based on global market growth.
Abstract: To solve these problems, we investigated the management system of a heating enterprise, including strategic planning based on the balanced scorecard (BSC), quality management in accordance with the Quality Management System (QMS) standard ISO 9001, and analysis of the system based on expert judgment using fuzzy inference. To carry out this work we used the theory of fuzzy sets, a QMS in accordance with ISO 9001, the BSC, the IDEF0 notation for modeling business processes, simulation tools in Matlab, and graphical programming in LabVIEW. The results of the work are as follows: we identified possibilities for improving the management of a heat-supply plant based on the QMS; after justification and adaptation, a software tool was used to automate a series of management functions, to reduce resource consumption, and to keep the system up to date; and an application for the analysis of the QMS based on fuzzy inference was created, with a novel organization of communication between the software and the application, enabling the analysis of relevant enterprise management system data.
Abstract: Based on an indoor environmental quality (IEQ) index established by previous work, which indicates the overall IEQ acceptance from the perspective of an occupant of a residential building in terms of four IEQ factors - thermal comfort, indoor air quality, and visual and aural comfort - this study develops a user-friendly IEQ calculator for iOS and Android users to calculate occupant acceptance and compare the relative IEQ performance of apartments. The “IEQ calculator” is easy to use and preliminarily illustrates the overall indoor environmental quality on the spot. Users simply input indoor parameters such as temperature, the number of people, and whether windows are open or closed, and the mobile application calculates scores in four areas: the comfort of temperature, brightness, noise and indoor air quality. The calculator allows the prediction of the best IEQ scenario on a quantitative scale. Any indoor environment under specific IEQ conditions can be benchmarked against the predicted IEQ acceptance range. The calculator can also suggest how to achieve the best IEQ acceptance among a group of residents.
Abstract: Doxorubicin (DOX) is an anthracycline drug used to treat many cancers. Like other cytostatic drugs, DOX has serious side effects, the biggest obstacle being its cardiotoxicity. With the aim of lowering the negative side effects and targeting DOX to the tumor tissue, different nanoparticles (NPs) are being studied. The aim of this work was to synthesize different NPs, conjugate them with DOX, and determine the binding capacity of the NPs. For this experiment, carbon nanotubes (CNTs), graphene oxide (GO), fullerene (FUL) and liposomes (LIP) were used. The highest binding capacity was observed for GO (85%). Subsequently, the toxicity of the NPs and NP-DOX conjugates was analyzed in an in vivo system (chicken embryos). Some NPs (GO) can increase the toxicity of DOX, whereas others (LIP, CNTs) decrease it.
Abstract: Surfing is an increasingly popular sport and its performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for the detection of wave rides, computing the number of waves ridden in a surfing session, the starting time of each wave and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that allow identifying the start and end of each wave ride. The second approach adds information from the smartphone's Inertial Measurement Unit (IMU) to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated against similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, their start times and durations. This paper shows that it is feasible to use smartphones to quantify performance metrics during surfing. In particular, the waves ridden and their durations can be accurately determined using the smartphone GPS and IMU.
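A minimal sketch of the first (GPS-only) approach is given below: a wave ride starts wherever the GPS-derived speed exceeds a start threshold and ends when the speed drops below an end threshold. The threshold values and the sample speed trace are invented for illustration, not the values used in the paper.

```python
def detect_wave_rides(speeds, times, start_thr=2.5, end_thr=1.0):
    """Return (start_time, duration) for each detected wave ride."""
    rides, start = [], None
    for t, v in zip(times, speeds):
        if start is None and v >= start_thr:
            start = t                          # ride begins
        elif start is not None and v < end_thr:
            rides.append((start, t - start))   # ride ends
            start = None
    return rides

# Speeds in m/s sampled once per second: two bursts of fast motion.
speeds = [0.5, 0.8, 3.0, 4.2, 3.8, 0.6, 0.4, 2.8, 3.1, 0.7]
times = list(range(len(speeds)))
print(detect_wave_rides(speeds, times))  # → [(2, 3), (7, 2)]
```

The second approach described above would additionally gate these detections with IMU features before accepting a ride, reducing false positives from paddling bursts.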