Abstract: This study aimed to determine the possible protective effects of L-carnitine against gentamicin-induced nephrotoxicity. Forty male albino rats were divided into 4 groups (10 rats each): group 1, normal control; group 2, induced nephrotoxicity (gentamicin 50 mg/kg/day s.c. for 8 days); group 3, treated with L-carnitine (40 mg/kg/day s.c. for 12 days); and group 4, treated with L-carnitine for 4 days before and then for 8 days concomitantly with gentamicin. Gentamicin-induced nephrotoxicity (group 2) caused a significant increase in serum urea, creatinine, urinary N-acetyl-β-D-glucosaminidase (NAG), gamma-glutamyl transpeptidase (GGT), urinary total protein and kidney tissue malondialdehyde (MDA), with a significant decrease in serum superoxide dismutase (SOD), serum catalase and creatinine clearance, and marked tubular necrosis in the proximal convoluted tubules with interruption of the basement membrane around the necrotic tubules, compared to the normal control group. L-carnitine given for 4 days before and then for 8 days concomitantly with gentamicin (group 4) produced a marked decrease in serum urea, serum creatinine, urinary NAG, urinary GGT, urinary proteins and kidney tissue MDA, with a marked increase in serum SOD, serum catalase and creatinine clearance, and a marked improvement in the tubular damage compared to the gentamicin-induced nephrotoxicity group. L-carnitine administered for 12 days produced no change in the parameters mentioned above as compared to the normal control group. In conclusion, L-carnitine could ameliorate most of the biochemical alterations and also improve the histopathological features of the kidney associated with gentamicin-induced nephrotoxicity.
Abstract: Information generated by various computerization processes is a potentially rich source of knowledge for its designated community. Passing this information from generation to generation without modifying its meaning is a challenging activity. To preserve and archive the data for future generations, it is essential to prove the authenticity of the data. This can be achieved by extracting from the data the metadata that can prove its authenticity and create trust in the archived data. A subsequent challenge is technology obsolescence; metadata extraction and standardization can be used effectively to tackle this problem. Metadata can be broadly categorized at two levels, technical and domain. Technical metadata provides the information needed to understand and interpret a data record, but this level of metadata alone is not sufficient to create trustworthiness. We have developed a tool that extracts and standardizes both technical and domain-level metadata. This paper describes the different features of the tool and how we developed it.
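The abstract does not specify how the authors' tool is implemented; as a minimal illustrative sketch (not the authors' tool), technical metadata of the kind described, file size, timestamps, MIME type and a fixity checksum, can be extracted with the Python standard library alone:

```python
import hashlib
import mimetypes
import os
from datetime import datetime, timezone

def extract_technical_metadata(path: str) -> dict:
    """Collect basic technical metadata for one file: size, modification
    timestamp, guessed MIME type, and a SHA-256 fixity checksum."""
    stat = os.stat(path)
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            sha256.update(chunk)
    mime_type, _ = mimetypes.guess_type(path)
    return {
        "filename": os.path.basename(path),
        "size_bytes": stat.st_size,
        "modified_utc": datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat(),
        "mime_type": mime_type or "application/octet-stream",
        "sha256": sha256.hexdigest(),  # supports later authenticity checks
    }

print(extract_technical_metadata(__file__))
```

A checksum of this kind is what makes the archived record verifiable later: any bit-level change to the file breaks the stored digest.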
Abstract: It is a well-established fact that terrorism is one of the foremost threats to present-day international security. The creation of tools or mechanisms for confronting it in an effective and efficient manner will only be possible by way of an objective assessment of the phenomenon. To achieve this, the paper pursues three main objectives. Firstly, it sets out to find the reasons that have prevented the establishment of a universally accepted definition of terrorism, and consequently tries to outline the main features defining the face of the terrorist threat, in order to discover the fundamental goals of what is now a serious blight on world society. Secondly, it tries to explain the differences between a terrorist movement and a terrorist organisation, and the reasons for which a terrorist movement can be led to transform itself into an organisation. After analysing these motivations and the characteristics of a terrorist organisation, an example of the latter is succinctly analysed to help the reader understand the ideas expressed. Lastly, it identifies and exposes the factors that can lead to the appearance of terrorist tendencies, and discusses the most efficient and effective responses that can be given to this global security threat.
Abstract: This paper presents an approach for the classification of an unstructured format description for the identification of file formats. The main contribution of this work is the employment of data mining techniques to support file format selection using just the unstructured text description that comprises the most important format features for a particular organisation. Subsequently, the file format identification method employs a file format classifier and associated configurations to provide digital preservation experts with an estimate of the required file format. Our goal is to make use of a format specification knowledge base, aggregated from different Web sources, in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends to an expert the file format for his institution. The proposed methods improve the selection of file formats and the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and specifications of file formats. To facilitate decision making, the aggregated information about the file formats is presented as a file format vocabulary that comprises the most common terms characteristic of all researched formats. The goal is to suggest a particular file format based on this vocabulary for analysis by an expert. A sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
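The abstract names the naive Bayes method over a bag-of-words format vocabulary but gives no implementation; a minimal sketch of the idea (the training snippets below are hypothetical placeholders, not the paper's knowledge base) in Python with scikit-learn:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical training data: unstructured description snippets
# paired with the file format they describe.
descriptions = [
    "lossless raster image supports transparency and palette",
    "page layout fixed document embedded fonts printable",
    "plain text markup headings links lightweight",
    "compressed raster image lossy photographic",
]
formats = ["PNG", "PDF", "Markdown", "JPEG"]

# Bag-of-words vocabulary feeding a multinomial naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(descriptions, formats)

query = "fixed layout document with embedded fonts"
print(model.predict([query])[0])                           # most probable format
print(dict(zip(model.classes_, model.predict_proba([query])[0])))  # class probabilities
```

The `predict_proba` output corresponds to the per-format probabilities an expert would review, in the spirit of the sample calculation described in the evaluation section.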
Abstract: With the evolution of technology, the expression of opinions has shifted towards the digital world. The domain of politics, one of the hottest topics of opinion mining research, has merged with behavior analysis for determining affiliation in texts, which constitutes the subject of this paper. This study aims to classify the text in news/blogs as either Republican or Democrat with the minimum number of features. As an initial set, 68 features, 64 of which were Linguistic Inquiry and Word Count (LIWC) features, were tested against 14 benchmark classification algorithms. In later experiments, the dimensionality of the feature vector was reduced using 7 feature selection algorithms. The results show that the "Decision Tree", "Rule Induction" and "M5 Rule" classifiers, when used with the "SVM" and "IGR" feature selection algorithms, performed best, with up to 82.5% accuracy on the given dataset. Further tests on a single feature and on linguistics-based feature sets showed similar results. The feature "Function", an aggregate feature of the linguistic category, was found to be the most differentiating feature among the 68 features, with an accuracy of 81% in classifying articles as either Republican or Democrat.
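The paper's exact pipeline is not given in the abstract; as an illustrative sketch (with randomly generated stand-ins for the LIWC feature matrix and labels), combining a univariate feature-selection step with a decision tree, in the spirit of the reported experiments, might look like:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Hypothetical stand-in for the 68-dimensional LIWC-style feature matrix:
# 200 articles, labels 0 = Republican, 1 = Democrat.
X = rng.random((200, 68))
y = rng.integers(0, 2, size=200)

# Mutual information is used here as a stand-in for the IGR criterion
# named in the abstract, followed by a decision tree classifier.
model = make_pipeline(
    SelectKBest(score_func=mutual_info_classif, k=10),
    DecisionTreeClassifier(random_state=0),
)

scores = cross_val_score(model, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

On real LIWC features, inspecting which columns survive `SelectKBest` is how a single dominant feature such as "Function" would be identified.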
Abstract: Rapidly changing factors that affect daily life also affect the operational environment and the way military leaders fulfill their missions. With the help of technological developments, the traditional linearity of conflict and war has started to fade away. Furthermore, the mission domain has broadened to include traditional threats, hybrid threats and the new challenges of cyber and space. Considering the future operational environment, future military leaders need to adapt themselves to the new challenges of the future battlefield. But how do we decide which features of leadership are required to operate and accomplish missions on the new, complex battlefield? The main aim of this article is to answer this question. To find the right answers, leadership and its components are first defined, and then the characteristics of the future operational environment are analyzed. Finally, the leadership features required to be successful on the redefined battlefield are explained.
Abstract: In this paper we propose a computer-aided solution based on Genetic Algorithms to reduce the effort of drafting the reports, the FMEA analysis and the Control Plan, required at product launch in manufacturing, and to improve the knowledge available to development teams for future projects. The solution allows the design team to enter the data required for the FMEA. The actual analysis is performed using Genetic Algorithms to find an optimum between the RPN risk factor and the cost of production. A feature of Genetic Algorithms is that they can be used to find solutions for multi-criteria optimization problems; in our case, the three specific FMEA risk factors are considered together with the reduction of production cost. The analysis tool generates final reports for all FMEA processes. The data obtained in the FMEA reports are automatically integrated with other entered parameters in the Control Plan. The solution is implemented as an application running on an intranet on two servers: one containing the analysis and plan generation engine, and the other containing the database where the initial parameters and results are stored. The results can then be used as starting solutions in the synthesis of other projects. The solution was applied to the welding, laser cutting and bending processes used to manufacture chassis for buses. The advantages of the solution are the efficient elaboration of documents in the current project, by automatically generating the FMEA and Control Plan reports using multi-criteria optimization of production, and the building of a solid knowledge base for future projects. The solution we propose is a cheap alternative to other solutions on the market, as it uses Open Source tools in its implementation.
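The abstract does not disclose the encoding or fitness function used; a minimal hand-rolled sketch of the general technique, a genetic algorithm trading off RPN (severity × occurrence × detection, each assumed on a 1-10 scale) against a hypothetical per-action mitigation cost, might look like:

```python
import random

random.seed(42)

# Hypothetical candidate: for each of 5 failure modes, choose a mitigation
# level 0-3; higher levels cut occurrence/detection scores but cost more.
N_MODES, LEVELS = 5, 4
SEVERITY = [8, 6, 9, 5, 7]          # fixed severity per failure mode
COST_PER_LEVEL = 100                # assumed cost unit per mitigation level

def fitness(genome):
    """Lower is better: weighted sum of total RPN and mitigation cost."""
    rpn = sum(s * max(1, 9 - 2 * g) * max(1, 8 - 2 * g)
              for s, g in zip(SEVERITY, genome))
    cost = COST_PER_LEVEL * sum(genome)
    return rpn + 0.5 * cost         # assumed weighting of the two criteria

def evolve(pop_size=40, generations=100, p_mut=0.1):
    pop = [[random.randrange(LEVELS) for _ in range(N_MODES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_MODES)      # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:             # point mutation
                child[random.randrange(N_MODES)] = random.randrange(LEVELS)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print("best mitigation levels:", best, "fitness:", fitness(best))
```

A weighted scalar fitness is only one way to combine the criteria; a Pareto-based selection would be the natural alternative for a true multi-criteria formulation.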
Abstract: The nanoindentation behaviour and phase transformation of annealed single-crystal silicon wafers are examined. The silicon specimens are annealed at temperatures of 250, 350 and 450 °C for 15 minutes and are then indented to maximum loads of 30, 50 and 70 mN. The phase changes induced in the indented specimens are observed using transmission electron microscopy (TEM) and micro-Raman scattering spectroscopy (RSS). For all annealing temperatures, an elbow feature is observed in the unloading curve following indentation to a maximum load of 30 mN. Under the higher loads of 50 mN and 70 mN, the elbow feature is replaced by a pop-out event. The elbow feature reveals a complete amorphous phase transformation within the indented zone, whereas the pop-out event indicates the formation of Si XII and Si III phases. The experimental results show that the formation of these crystalline silicon phases increases with increasing annealing temperature and indentation load. The hardness and Young's modulus both decrease as the annealing temperature and indentation load are increased.
Abstract: A novel design technique employing a CMOS Current Feedback Operational Amplifier (CFOA) is presented. The very low power consumption of the pseudo-OTA is exploited to decrease the total power consumption of the proposed CFOA. This design approach applies a pseudo-OTA as the input stage, cascaded with a buffer stage. Moreover, the DC input offset voltage and harmonic distortion (HD) of the proposed CFOA are very low compared with the conventional CMOS CFOA, owing to the symmetrical input stage. P-Spice simulation results are obtained using 0.18 μm MIETEC CMOS process parameters, a supply voltage of ±1.2 V and a 50 μA biasing current. The P-Spice simulation shows an excellent improvement of the proposed CFOA over the existing CMOS CFOA. Some of these performance parameters, for example, are a DC gain of 62 dB, an open-loop gain bandwidth product of 108 MHz, a slew rate (SR+) of +71.2 V/μs, a THD of -63 dB and a DC power consumption (PC) of 2 mW.
Abstract: In this paper, the dependence of soliton pulses on phase in a 10 Gbps, single-channel, dispersion-uncompensated telecommunication system was studied. The characteristic feature of periodic soliton interaction was noted at the interaction point (I = 6202.5 km) within one collision length of L = 12405.1 km. The interaction point is located for the 10 Gbps system with an initial relative soliton spacing (q0) of 5.28 using perturbation theory. It is shown that when two in-phase solitons are launched, they interact at the point I = 6202.5 km, but the interaction can be restricted by introducing a phase difference initially. As the phase of the input solitons increases, the deviation of the soliton pulses at 'I' also increases. We have demonstrated this effect in a telecommunication set-up in terms of the Quality factor (Q), where Q = 0 for in-phase solitons. The Q was noted to be 125.9, 38.63, 47.53, 59.60, 161.37 and 78.04 for phases of 10°, 20°, 30°, 45°, 60° and 90°, respectively, at the interaction point (I).
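The abstract invokes perturbation theory without restating the result; a standard perturbation-theory estimate (e.g. as given in Agrawal's Nonlinear Fiber Optics) for the collision length of two equal-amplitude, in-phase solitons with normalized initial spacing q0 is:

```latex
% Collision period z_p (in units of the dispersion length L_D) for two
% in-phase solitons with normalized initial half-spacing q_0:
\[
  z_p \approx \frac{\pi}{2}\, e^{q_0}, \qquad
  L_{\mathrm{col}} = z_p\, L_D, \qquad
  L_D = \frac{T_0^2}{|\beta_2|},
\]
% where T_0 is the soliton width and \beta_2 the group-velocity dispersion.
```

Consistent with the figures quoted above, the first interaction point I then lies near half the collision length L.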
Abstract: Any variation in the environmental characteristics of geomorphosites can destabilise their geotouristic values anywhere on the planet. Urmia Lake, with an area of approximately 5,500 km² and a catchment area of 51,876 km², has, for various reasons, undergone a sharp decline over time, especially in the last fifty years, shrinking by about 93% in the last two decades. These variations are not only driving significant changes in the morphology and ecology of the present lake landscape, but are at the same time shaping newly formed morphologies, which have caused some valuable geomorphosites to vanish or to shrink into smaller geomorphosites of significant value from a scientific and cultural point of view. This paper analyses and discusses the features and evolution of several representative coastal and island geomorphosites. For this purpose, a total of 23 geomorphosites were studied in two data series (1963 and 2015), and the respective data were compared and analysed. The results showed that the total loss of geomorphosite area over half a century amounted to more than 90% of the valuable geomorphosites. Moreover, the comparison between the mean yearly value of coastal area lost over the entire period and the yearly average calculated for the shorter period (1998-2014) clearly indicates a pattern of acceleration. This acceleration in the rate of reduction of the lake area was seen in most of the southern half of the lake. In this region, the general fall in water level is not only causing the loss of a significant water resource, with major impacts on regional ecosystems, but is also driving the most marked recent (last century) changes in the geotouristic landscapes. In fact, the disappearance of geomorphosites means the loss of the tourism phenomenon. In this context, attention must be paid to the question of conservation. The actions needed to safeguard geomorphosites include: 1) preventive action, 2) corrective action, and 3) sharing knowledge.
Abstract: Here, we study the characteristic features of a conventional (ON-OFF keying) and a soliton-based transmission system. We consider a 20 Gbps transmission system implemented with Conventional Single Mode Fiber (C-SMF) to examine the role of the Gaussian pulse, which is characteristic of conventional propagation, and of the hyperbolic-secant pulse, which is characteristic of soliton propagation. We note the influence of these pulses with respect to different dispersion lengths and soliton periods in the conventional and soliton systems, respectively, and evaluate the system performance in terms of the Quality factor. From the analysis, we show that the soliton pulse maintains consistent performance, even over long distances without dispersion compensation, compared with the conventional system, as it is robust to dispersion. For a transmission length of 200 km, the soliton system yielded a Q of 33.958, while the conventional system collapsed completely, with Q = 0.
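For reference (standard fibre-optics definitions, not restated in the abstract), the two pulse shapes compared here and the length scales mentioned, dispersion length and soliton period, are commonly written as:

```latex
% Gaussian (conventional) and hyperbolic-secant (soliton) input pulses
% of peak power P_0 and width T_0:
\[
  A_{\mathrm{Gauss}}(0,t) = \sqrt{P_0}\, \exp\!\left(-\frac{t^2}{2T_0^2}\right),
  \qquad
  A_{\mathrm{sol}}(0,t) = \sqrt{P_0}\, \operatorname{sech}\!\left(\frac{t}{T_0}\right).
\]
% Dispersion length and soliton period for group-velocity dispersion \beta_2:
\[
  L_D = \frac{T_0^2}{|\beta_2|}, \qquad z_0 = \frac{\pi}{2}\, L_D .
\]
```

The sech pulse is the fundamental soliton solution of the nonlinear Schrödinger equation, which is why it resists the dispersive broadening that degrades the Gaussian pulse over multiples of L_D.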
Abstract: In this research article on modeling Underwater Wireless Sensor Network (UWSN) simulators, we provide a comprehensive overview of the various currently available simulators used in UWSN modeling. We compare their working environments, software platforms, simulation languages, key features, limitations and corresponding applications. Based on extensive experimentation and performance analysis, we assess their efficiency for specific applications. We also provide guidelines for developing protocols in the different layers of the protocol stack, and finally these parameters are compared and tabulated. This analysis is significant for researchers and designers seeking the right simulator for their research activities.
Abstract: In this paper we propose a novel methodology for extracting a road network and its nodes from satellite images of Algeria. The technique developed is a continuation of our previous research work. It is founded on information theory and mathematical morphology, which are combined to extract and link road segments into a road network with its nodes. We therefore define objects as sets of pixels and study the shape of these objects and the relations that exist between them. In this approach, the geometric and radiometric features of roads are integrated through a cost function and a set of selected points on crossing roads. The method's performance was tested on satellite images of Algeria.
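The abstract does not detail the morphological operators used; a minimal sketch of the general idea (not the authors' method), extracting elongated bright structures from a grey-level image with mathematical morphology and thinning them to a one-pixel-wide network, could be written with scikit-image:

```python
import numpy as np
from skimage import filters, morphology

def extract_road_mask(image: np.ndarray) -> np.ndarray:
    """Rough road-network sketch: threshold bright, road-like pixels,
    clean the mask morphologically, then skeletonize to centrelines.
    `image` is a 2-D grey-level array; roads are assumed brighter
    than their surroundings (an assumption, not always true)."""
    # Global threshold as a crude radiometric criterion.
    binary = image > filters.threshold_otsu(image)
    # Opening removes small bright clutter; closing bridges small gaps.
    binary = morphology.binary_opening(binary, morphology.disk(1))
    binary = morphology.binary_closing(binary, morphology.disk(2))
    # Drop blobs too small to be road segments.
    binary = morphology.remove_small_objects(binary, min_size=64)
    # One-pixel-wide centrelines; junction pixels approximate the nodes.
    return morphology.skeletonize(binary)

# Usage with a synthetic image containing a bright "road" cross:
img = np.zeros((128, 128))
img[60:68, :] = 1.0     # horizontal road
img[:, 60:68] = 1.0     # vertical road
noisy = img + 0.05 * np.random.default_rng(0).random(img.shape)
print(extract_road_mask(noisy).sum(), "centreline pixels")
```

In a full pipeline of the kind described, a cost function over geometric and radiometric features would then score and link the skeleton branches into a connected network.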
Abstract: The scheduling and mapping of tasks on a set of processors is considered a critical problem in parallel and distributed computing systems. This paper deals with the problem of dynamic scheduling on a special type of multiprocessor architecture known as the Linear Crossed Cube (LCQ) network. This multiprocessor is a hybrid network which combines the features of both linear and cube-based architectures. Two standard dynamic scheduling schemes, namely Minimum Distance Scheduling (MDS) and Two Round Scheduling (TRS), are implemented on the LCQ network. Parallel tasks are mapped and the load imbalance is evaluated on different sets of processors in the LCQ network. The simulation results are evaluated and, through a thorough analysis of the results, an effort is made to obtain the best solution for the given network in terms of the residual load imbalance and the execution time. Other performance metrics, such as speedup and efficiency, are also evaluated with the given dynamic algorithms.
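The abstract does not define MDS formally; as an illustrative sketch of the minimum-distance idea (not the paper's implementation), each arriving task can be placed on the least-loaded processor, breaking ties in favour of processors nearest, in hop count, to the task's source node. The 8-node ring below is a hypothetical stand-in for the LCQ interconnect:

```python
from collections import deque

# Hypothetical 8-node ring as a stand-in for the LCQ interconnect.
N = 8
neighbours = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}
load = [0] * N  # current load (task count) per processor

def hop_distance(src: int, dst: int) -> int:
    """BFS hop count between two processors in the topology."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, d = queue.popleft()
        if node == dst:
            return d
        for nxt in neighbours[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return -1  # unreachable (cannot happen in a connected ring)

def schedule_min_distance(src: int) -> int:
    """Assign a task to the least-loaded processor, preferring those
    closest to the source node, and update that processor's load."""
    target = min(range(N), key=lambda p: (load[p], hop_distance(src, p)))
    load[target] += 1
    return target

# Ten tasks all originating at node 0 spread over its neighbourhood.
for _ in range(10):
    schedule_min_distance(0)
print("per-processor load:", load)
```

The residual load imbalance mentioned in the abstract would then be measured as the spread (e.g. max minus min) of the `load` vector after all tasks are placed.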
Abstract: Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises within budget and under strict deadlines is always challenging for software engineers. Therefore, we introduce an approach to designing mid-size enterprise software using the Waterfall model, one of the SDLC (Software Development Life Cycle) models, in a cost-effective way. To fulfill the research objectives, in this study we developed mid-size enterprise software named "BSK Management System", which assists enterprise software clients with information resource management and performs complex organizational tasks. The phases of the Waterfall model have been applied to ensure that all functions, user requirements, strategic goals and objectives are met. In addition, a Rich Picture, Structured English and a Data Dictionary have been implemented and investigated properly in an engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system features simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.
Abstract: In the past few years, the amount of malicious software has increased exponentially and, therefore, machine learning algorithms have become instrumental in distinguishing clean from malware files through (semi-)automated classification. When working with very large datasets, the major challenge is to reach both a very high malware detection rate and a very low false positive rate. Another challenge is to minimize the time the machine learning algorithm needs to do so. This paper presents a comparative study of different machine learning techniques, such as linear classifiers, ensembles, decision trees and various hybrids thereof. The training dataset consists of approximately 2 million clean files and 200,000 infected files, which is a realistic quantitative mixture. The paper investigates the above-mentioned methods with respect to both their performance (detection rate and false positive rate) and their practicability.
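The two figures of merit named in the abstract are straightforward to compute from a confusion matrix; a minimal sketch (with a hypothetical toy dataset, not the paper's 2-million-file corpus) of evaluating a linear classifier by detection rate and false positive rate:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-in features: 10,000 clean (0) and 1,000 malware (1)
# samples, mimicking the heavy class imbalance described in the abstract.
X_clean = rng.normal(0.0, 1.0, size=(10_000, 20))
X_mal = rng.normal(0.8, 1.0, size=(1_000, 20))
X = np.vstack([X_clean, X_mal])
y = np.array([0] * 10_000 + [1] * 1_000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
detection_rate = tp / (tp + fn)        # fraction of malware caught
false_positive_rate = fp / (fp + tn)   # clean files wrongly flagged
print(f"detection rate: {detection_rate:.3f}, FPR: {false_positive_rate:.4f}")
```

With such imbalanced classes, raw accuracy is misleading, which is exactly why the study reports these two rates separately.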
Abstract: Chrome tannery wastewater poses a serious environmental hazard because of its high pollution potential. As a result, rigorous treatment is necessary to abate the pollution from this type of wastewater. There are many research studies on chrome tannery wastewater treatment covering physical, chemical and biological methods. In general, biological treatment processes are found to be ineffective for direct application because of the adverse effects of toxic chromium, sulphide, chloride, etc. However, biological methods have been employed mainly for a few sub-processes that generate significant amounts of organic matter and are free of chromium, chlorides, etc. In this context, the present paper reviews the characteristic features and pollution potential of the wastewater generated by chrome tannery units and its treatment. The different biological processes used earlier, and their chronological development for the treatment of chrome tannery wastewater, are thoroughly reviewed in this paper. In this regard, the scope of the hybrid bioreactor, an advanced technology option, has also been explored, as this kind of treatment is well suited to wastewater containing inhibitory substances.
Abstract: Reflux condensation occurs in vertical channels and tubes when there is an upward core flow of vapour (or gas-vapour mixture) and a downward flow of the liquid film. An understanding of this condensation configuration is crucial in the design of reflux condensers and distillation columns, and in loss-of-coolant safety analyses of nuclear power plant steam generators. The unique feature of this flow is the upward flow of the vapour-gas mixture (or pure vapour), which retards the liquid flow via shear at the liquid-mixture interface. The present model solves the full elliptic governing equations in both the film and the gas-vapour core flow. The computational mesh is non-orthogonal and dynamically adapts to the phase interface, thus producing a sharp and accurate interface. Shear forces and heat and mass transfer at the interface are accounted for from first principles. This model is a significant step beyond current capabilities, as it removes the limitations of previous reflux condensation models, which inherently cannot account for the detailed local balances of shear, mass and heat transfer at the interface. Discretisation is based on the finite volume method with a co-located variable storage scheme. An in-house computer code was developed to implement the numerical solution scheme. Detailed results are presented for laminar reflux condensation from steam-air mixtures flowing in vertical parallel-plate channels. The results include velocity and gas mass fraction profiles, as well as axial variations of the film thickness.
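The abstract refers to the full elliptic governing equations without writing them out; in generic form (a standard statement for steady, laminar, two-phase flow with a noncondensable species, not necessarily the paper's exact formulation), the conservation equations solved in each phase are:

```latex
% Steady conservation of mass, momentum, energy and noncondensable gas
% mass fraction w, solved in both the liquid film and the gas-vapour core:
\begin{align*}
  \nabla \cdot (\rho \mathbf{u}) &= 0, \\
  \nabla \cdot (\rho \mathbf{u}\mathbf{u}) &= -\nabla p
      + \nabla \cdot \left(\mu \nabla \mathbf{u}\right) + \rho \mathbf{g}, \\
  \nabla \cdot (\rho c_p \mathbf{u}\, T) &= \nabla \cdot (k \nabla T), \\
  \nabla \cdot (\rho \mathbf{u}\, w) &= \nabla \cdot (\rho D \nabla w),
\end{align*}
% with continuity of velocity, shear stress, temperature and energy flux
% (including the latent heat of the condensing vapour) at the interface.
```

Solving these in both phases simultaneously, rather than imposing empirical interfacial correlations, is what allows the local balances of shear, mass and heat transfer described above.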
Abstract: All software engineering research and best industry practices aim at providing software products with a high degree of quality and functionality at low cost and in less time. These requirements are also addressed by Component-Based Software Engineering (CBSE). CBSE, which deals with software construction by the assembly of components, is a revolutionary extension of Software Engineering. CBSE must define and describe processes that assure the timely completion of high-quality software systems composed of a variety of pre-built software components. Though these features provide distinct and visible benefits in software design and programming, they also raise some challenging problems. The aim of this work is to summarize the pertinent issues and considerations in CBSE into concepts and observations that may lead to the development of new ways of dealing with the problems and challenges of CBSE.