Abstract: In the 2000s, a new migration trend of highly skilled Indian professionals towards Japan appeared. This paper examines the factors that triggered the inflow of highly skilled Indian professionals, mainly IT professionals, into Japan, and the reasons for the increase in their numbers. It investigates the influence of four factors: the Japanese immigration policy, the bilateral relations between India and Japan, the higher education system in India, and the American H-1B visa policy with its cap system. This study concludes that the increased and continuous supply of highly skilled Indian professionals has intensified the competition for migration to traditional destinations like the USA, which has led Indian professionals to consider other options such as Japan.
Abstract: The rapid depletion of high-grade iron ore (Fe2O3) has drawn attention to the use of other sources of iron ore. Titaniferous magnetite ore (TMO) is a special type of magnetite ore with a high titania content (23.23% TiO2 in the present case). Due to its high TiO2 content and high density, TMO cannot be treated by conventional smelting reduction. In the present work, the TMO has been collected from the high-grade metamorphic terrain of the Precambrian Chotanagpur gneissic complex situated in the eastern part of India (Shaltora area, Bankura district, West Bengal), and the hematite ore has been collected from Visakhapatnam Steel Plant (VSP), Visakhapatnam. At VSP, iron ore is received from the Bailadila mines, Chattisgarh, of M/s. National Mineral Development Corporation. The preliminary characterization of the TMO and hematite ore (HMO) has been carried out by WDXRF, XRD and FESEM analyses. Similarly, good-quality coal (mainly coking coal) is also being depleted fast. The basic purpose of this work is to find how lean-grade coal can be utilised along with TMO for smelting to produce pig iron. The lean-grade coal has been characterised using TG/DTA, proximate and ultimate analyses. The boiler-grade coal has been found to contain 28.08% fixed carbon and 28.31% volatile matter. TMO fines (below 75 μm) and HMO fines (below 75 μm) have been separately agglomerated with lean-grade coal fines (below 75 μm) in the form of briquettes, using binders like bentonite and molasses. These green briquettes were first dried in an oven at 423 K for 30 min and then reduced isothermally in a tube furnace at 1323 K, 1373 K and 1423 K for 30 min and 60 min. After reduction, the reduced briquettes were characterized by XRD and FESEM analyses. The best-reduced TMO and HMO samples were taken and blended in three different weight ratios of 1:4, 1:8 and 1:12 of TMO:HMO.
The chemical analysis of the three blended samples was carried out, and the degree of metallisation of iron was found to be 89.38%, 92.12% and 93.12%, respectively. These three blended samples were briquetted using binders like bentonite and lime. Thereafter, the blended briquettes were separately smelted in a raising hearth furnace at 1773 K for 30 min. The pig iron formed was characterized using XRD and microscopic analyses. It can be concluded that a 90% yield of pig iron can be achieved when the blend ratio of TMO:HMO is 1:4.5. This means that for a 90% yield, the maximum TMO that could be used in the blend is about 18%.
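The quoted 18% figure follows directly from the 1:4.5 blend ratio; a minimal check of the arithmetic:

```python
# The abstract's blend ratio of TMO:HMO = 1:4.5 means TMO contributes
# 1 part out of (1 + 4.5) total parts of the blend.
tmo_parts, hmo_parts = 1.0, 4.5
tmo_fraction = tmo_parts / (tmo_parts + hmo_parts)
print(f"TMO share of blend: {tmo_fraction:.1%}")  # about 18%
```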
Abstract: Cyclone Hudhud, which battered the city of Visakhapatnam on 12th October 2014, damaged many buildings, public amenities and infrastructure facilities along the Visakha-Bheemili coastal corridor. More than half the green cover of the city was wiped out. The majority of the trees along the coastal corridor suffered complete or partial damage. In order to understand the different ways in which trees incurred damage during the cyclone, a damage assessment study was carried out by the author. The areas covered by this study included two university campuses and several parks and residential colonies which bore the brunt of the cyclone. Post-disaster, attempts have been made to restore many of the trees that suffered partial or complete damage from the effects of extreme winds. This paper examines the various ways in which trees incurred damage from cyclone Hudhud and presents some examples of the restoration efforts carried out by educational institutions, public parks and religious institutions of the city of Visakhapatnam in the aftermath of the devastating cyclone.
Abstract: Project Portfolio Management (PPM) is an essential component of an organisation's strategic procedures; it requires attention to several factors in order to envisage a range of long-term outcomes and support strategic project portfolio decisions. To evaluate overall efficiency at the portfolio level, it is essential to evaluate the performance of specific projects as well as to aggregate those findings in a mathematically meaningful manner that indicates the strategic significance of the associated projects at a number of levels of abstraction. PPM success is directly associated with the quality of the decisions made, and poor judgment increases portfolio costs. Hence, various Multi-Criteria Decision Making (MCDM) techniques have been designed and employed to support decision-making functions. This paper reviews possible options for enhancing decision-making outcomes in organisational portfolio management processes using the Analytic Hierarchy Process (AHP), from both academic and practical perspectives, and examines the usability, certainty and quality of the technique. The results of the study also provide insight into the technical risk associated with the current decision-making model, to underpin initiative tracking and strategic portfolio management.
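The core AHP step the abstract refers to is deriving priority weights from pairwise comparisons of criteria. A minimal sketch, using invented portfolio criteria and comparison values (not from the paper) and the geometric-mean approximation of the priority vector:

```python
import math

# Hypothetical pairwise comparison matrix for three portfolio criteria:
# A[i][j] = how much more important criterion i is than criterion j
# (Saaty's 1-9 scale; reciprocals below the diagonal).
criteria = ["strategic fit", "risk", "cost"]
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

# Geometric-mean (logarithmic least squares) approximation of the
# principal-eigenvector priority weights.
geo_means = [math.prod(row) ** (1 / len(row)) for row in A]
total = sum(geo_means)
weights = [g / total for g in geo_means]

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
```

With these example judgments, "strategic fit" receives the largest weight; in practice the weights would then score and rank the candidate projects.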
Abstract: The paper shows that, in transferring sense from the SL to the TL, the translator's reading against the grain determines the creation of a faulty pattern of rendering the original meaning in the receiving culture, one which reflects the use of misleading transformative codes. In this case, the translator is a writer per se who decides what goes in and out of the book, how the style is to be ciphered and what elements of ideology are to be highlighted. The paper also argues that figurative language must not be flattened for the sake of clarity or naturalness. The missing figurative elements make the translated text less interesting, less challenging and less vivid, which reflects poorly on the writer. There is a close connection between style and the writer's person: if the writer's style is much altered in a translation, the translation is useless, as the original writer and his or her imaginative world can no longer be discovered.
The purpose of the paper is to prove that adaptation is a dangerous tool which leads to variants that sometimes reflect the original less than the reader would wish. It contradicts the very essence of the process of translation, which is that of making an original work available in a foreign language. If the adaptive transformative codes are so flexible that they encourage the translator to repeatedly leave out parts of the original work, then a subversive pattern emerges which changes the entire book.
In conclusion, as a result of using adaptation, manipulative or subversive effects are created in the translated work. This is generally achieved by adding new words or connotations, creating new figures of speech or using explicitations. The additional meanings of the original work are neglected, and the translator creates new meanings, implications, emphases and contexts; again, s/he turns into a new author who enjoys the freedom of expressing his or her own ideas without the constraints of the original text. Reading against the grain is inadvisable during the process of translation; consequently, following common sense becomes essential in the field of translation, as everywhere else, so that translation does not become a source of fantasy.
Abstract: Floorplanning plays a vital role in the physical design process of Very Large Scale Integrated (VLSI) chips. It is an essential design step for estimating the chip area prior to the optimized placement of digital blocks and their interconnections. Since VLSI floorplanning is an NP-hard problem, many optimization techniques have been adopted in the literature. In this work, the music-inspired Harmony Search (HS) algorithm is used for fixed die-outline-constrained floorplanning, with the aim of reducing the total chip area. HS draws inspiration from the musical improvisation process of searching for a perfect state of harmony. Initially, a B*-tree is used to generate the primary floorplan for the given rectangular hard modules, and then the HS algorithm is applied to obtain an optimal solution for an efficient floorplan. Experimental results for the HS algorithm are obtained on the MCNC benchmark circuits.
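The improvisation loop behind HS can be sketched generically. This is an illustrative sketch only, with a stand-in cost function and assumed parameter values; in the paper's setting the cost would be the chip area of a B*-tree-encoded floorplan:

```python
import random

# Generic Harmony Search sketch: memory consideration (hmcr),
# pitch adjustment (par, bw) and random improvisation.
def harmony_search(cost, dim, bounds, hms=10, hmcr=0.9, par=0.3,
                   bw=0.1, iters=2000, seed=42):
    rng = random.Random(seed)
    lo, hi = bounds
    # Harmony memory: hms random candidate solutions
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:                 # draw from memory
                x = rng.choice(memory)[d]
                if rng.random() < par:              # pitch adjustment
                    x += rng.uniform(-bw, bw)
            else:                                   # random improvisation
                x = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))
        worst = max(memory, key=cost)
        if cost(new) < cost(worst):                 # replace worst harmony
            memory[memory.index(worst)] = new
    return min(memory, key=cost)

# Stand-in cost: sphere function with minimum at the origin.
best = harmony_search(lambda v: sum(x * x for x in v), dim=3, bounds=(-5, 5))
print(best)  # values close to the optimum at the origin
```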
Abstract: Out-migration is an important issue for Georgia, which since independence has lost one fifth of its population to emigration. During the Soviet period, out-migration from the USSR was almost impossible, and one of the most important instruments for regulating population movement within the Soviet Union was the system of compulsory residential registration, the so-called "propiska". Since independence, there has been hardly any regulation of migration from Georgia. The majority of Georgian migrants go abroad on tourist visas and then overstay, becoming irregular labor migrants. The official migration statistics published for this period, based on the administrative system of population registration, were insignificant in terms of numbers and did not represent the real scope of these migration movements. This paper discusses the data quality and methodology of migration statistics in Georgia and answers the question: what is the real reason for the increase in immigration flows shown in the official numbers since the 2000s?
Abstract: For the composer Myriam Marbe, musical time and memory represent two complementary phenomena with a conclusive impact on the settlement of new musical ontologies. Summarizing the most important achievements of contemporary techniques of composition, her vision of the microform presented in The Concert for Daniel Kientzy, for saxophone and orchestra, transcends linear and unidirectional time in favour of a flexible, multi-vectorial discourse with spiral developments, where the sound substance is auto(re)generated by analogy with the fundamental processes of memory. The conceptual model is of an archetypal essence, the composer being concerned with identifying the mechanisms of the creation process, especially those specific to collective creation (of oral tradition). Hence the spontaneity of expression, the improvisational tint, free rhythm, micro-interval intonation, and a coloristic-timbral universe dominated by multiphonics and unique sound effects; hence the atmosphere of ritual, purged, however, of its primary connotations and reprojected into a wonderful spectacular space. The Concert is a work of artistic maturity and commands respect, among other things, for the timbral diversity of the three species of saxophone required by the composer (baritone, sopranino and alto); in Part III, Daniel Kientzy performs on two saxophones concomitantly. Myriam Marbe's score contains a deeply spiritualized music, full of archetypal symbols, a music whose drama suggests a truly cinematographic movement.
Abstract: Non-contact evaluation of the thickness of paint coatings can be attempted by different destructive and nondestructive methods, such as cross-section microscopy, gravimetric mass measurement, magnetic gauges, eddy current, ultrasound or terahertz techniques. Infrared thermography is a nondestructive and non-invasive method that can be envisaged as a useful tool to measure surface thickness variations by analyzing the temperature response. In this paper, the thermal quadrupole method for two-layered samples heated by a pulsed excitation is first used. By analyzing the thermal responses as a function of the thermal properties and thicknesses of both layers, optimal parameters for the excitation source can be identified. Simulations show that a pulsed excitation with a duration of ten milliseconds allows obtaining a substrate-independent thermal response. Based on this result, an experimental setup consisting of a near-infrared laser diode and an infrared camera was then used to evaluate the variation of paint coating thickness between 60 μm and 130 μm on two samples. Results show that the parameters extracted from the thermal images are correlated with the thicknesses estimated by the eddy current method. Laser pulsed thermography is thus an interesting alternative nondestructive method that can, moreover, be used on nonconductive substrates.
Abstract: The effects of hypertension are often lethal; thus, its early detection and prevention are very important for everybody. In this paper, a neural network (NN) model was developed and trained on a dataset of hypertension causative parameters in order to forecast the likelihood of occurrence of hypertension in patients. Our research goal was to analyze the potential of the presented NN to predict, for a period of time, the risk of hypertension or the risk of developing this disease for patients who are or are not currently hypertensive. The results of the analysis for a given patient can support doctors in taking proactive measures to avert the occurrence of hypertension, such as recommendations regarding the patient's behavior in order to lower his or her hypertension risk. Moreover, the paper envisages a set of three example scenarios: to determine the age at which the patient becomes hypertensive, i.e. the threshold hypertensive age; to analyze what happens if the threshold hypertensive age is set to a certain age while the weight of the patient is varied; and to set the ideal weight for the patient and analyze what happens to the threshold hypertensive age.
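The kind of risk model the abstract describes can be illustrated in miniature. The features, synthetic data and single-neuron size below are all invented for illustration; the paper trains a larger NN on a real dataset of causative parameters:

```python
import math

# Single-neuron (logistic) sketch: predict hypertension risk from two
# invented features, age (decades) and BMI/10, on synthetic labels.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic records: label 1 (hypertensive) when a linear score is high.
data = [((a, b), 1 if 0.8 * a + 0.5 * b > 6.0 else 0)
        for a in [2, 3, 4, 5, 6, 7] for b in [2.0, 2.5, 3.0, 3.5]]

w, bias, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(3000):                      # plain stochastic gradient descent
    for (x1, x2), y in data:
        p = sigmoid(w[0] * x1 + w[1] * x2 + bias)
        g = p - y                          # gradient of the log loss
        w[0] -= lr * g * x1
        w[1] -= lr * g * x2
        bias -= lr * g

# Query the trained neuron for a hypothetical older, heavier patient.
risk = sigmoid(w[0] * 7 + w[1] * 3.5 + bias)
print(f"predicted risk: {risk:.2f}")       # high risk for this profile
```

Sweeping the age input while holding weight fixed would reproduce, in toy form, the abstract's "threshold hypertensive age" scenario.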
Abstract: Composites, depending on the nature of their constituents and mode of production, are regarded as one of the advanced materials that drive today's technology. This paper attempts a short review of the subject matter, with the general aim of pushing the frontier of knowledge to the next level as it impacts the technology of nano-particle manufacturing. The objectives entail an effort to aggregate recent research efforts in this field, analyse research findings and observations, streamline research efforts, and support industry in taking decisions on areas of fund deployment. It is envisaged that this work will serve as a quick hands-on compendium for researchers in this field and a guide to relevant government departments wishing to fund research whose outcomes have the potential of improving the nation's GDP.
Abstract: The Port of Townsville conducts regular annual maintenance dredging to maintain the depths of its harbor basin and approach channels for the navigational safety of vessels against the natural accumulation of marine sediments. In addition to the regular maintenance dredging, the port undertakes emergency dredging in cases where large quantities of sediment are mobilized and deposited in port waters by cyclones or major flood events. The maintenance dredging material derived from the port may be disposed of at sea or on land in accordance with relevant state and commonwealth regulations. For land disposal, the dredged mud slurry is hydraulically placed into containment ponds and left to undergo sedimentation and self-weight consolidation to form fill material for land reclamation. This paper provides an overview of the maintenance dredging at the Port of Townsville, with emphasis on maintenance dredging requirements, sediment quality, bathymetry, dredging methods used, and dredged material disposal options.
Abstract: This paper deals with advanced state estimation algorithms for the estimation of biomass concentration and specific growth rate in a typical fed-batch biotechnological process. The biotechnological process was represented by a nonlinear mass-balance-based process model. An Extended Kalman Filter (EKF) and a Particle Filter (PF) were used to estimate the unmeasured state variables from oxygen uptake rate (OUR) and base consumption (BC) measurements. To obtain more general results, a simplified process model was used in the EKF and PF estimation algorithms. This model does not require any special growth kinetic equations and could be applied for state estimation in various bioprocesses. The focus of this investigation was the comparison of the estimation quality of the EKF and PF estimators under different measurement noise levels. The simulation results show that the Particle Filter algorithm requires significantly more computation time for state estimation but gives lower estimation errors for both biomass concentration and specific growth rate. The tuning procedure for the Particle Filter is also simpler than for the EKF. Consequently, the Particle Filter should be preferred in real applications, especially for the monitoring of industrial bioprocesses, where simplified implementation procedures are always desirable.
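The bootstrap PF the abstract compares against the EKF can be sketched generically. This is an illustrative scalar example with an assumed linear state model, not the paper's mass-balance bioprocess model (which estimates biomass and growth rate from OUR and BC measurements):

```python
import math
import random

# Bootstrap particle filter: predict, weight by likelihood, resample.
# Toy model: x_k = 0.95 * x_{k-1} + process noise, observed as
# y_k = x_k + measurement noise.
def particle_filter(ys, n=500, q=0.1, r=0.5, seed=1):
    rng = random.Random(seed)
    # Broad prior over plausible initial states
    particles = [rng.uniform(0.0, 10.0) for _ in range(n)]
    estimates = []
    for y in ys:
        # Predict: propagate each particle through the state model
        particles = [0.95 * p + rng.gauss(0.0, q) for p in particles]
        # Update: weight particles by the Gaussian measurement likelihood
        weights = [math.exp(-0.5 * ((y - p) / r) ** 2) for p in particles]
        total = sum(weights) or 1e-300
        weights = [w / total for w in weights]
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # Resample (multinomial) to avoid weight degeneracy
        particles = rng.choices(particles, weights=weights, k=n)
    return estimates

# Simulate a trajectory and noisy measurements, then filter.
rng = random.Random(2)
true_x, xs, ys = 5.0, [], []
for _ in range(50):
    true_x = 0.95 * true_x + rng.gauss(0.0, 0.1)
    xs.append(true_x)
    ys.append(true_x + rng.gauss(0.0, 0.5))
est = particle_filter(ys)
print(f"final estimate {est[-1]:.2f} vs true state {xs[-1]:.2f}")
```

The per-step cost is O(n) in the particle count, which is the computation-time penalty the abstract notes relative to the EKF.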
Abstract: This investigation develops a revised method for estimating the value of the equivalent 10 Hz voltage flicker (DV10) of a DC Electric Arc Furnace (EAF). This study also examines three 161 kV DC EAFs by field measurement, with the results indicating that the estimated DV10 value is significantly smaller than the surveyed value. The key point is that the conventional means of estimating DV10 is inappropriate; the main cause is that the assumed Qmax is too small.
Although a DC EAF is normally operated in a constant-MVA mode, the reactive power variation in the Main Transformer (MT) is more significant than that in the Furnace Transformer (FT). A substantial difference exists between the estimated maximum reactive power fluctuation (DQmax) and the value surveyed from actual DC EAF operations. The revised method proposed in this study therefore obtains a more accurate DV10 estimate than the conventional method.
Abstract: Early 20th century functionalism aimed at generalising living and rationalising construction, thus laying the foundation for the standardisation of construction components and products. From the 1930s onwards, all measurement and quality instructions for building products, the different types of building components, descriptions of working methods complying with advisable building practice, planning, measurement and calculation guidelines, terminology, etc. were called standards. Standardisation was regarded as a necessary prerequisite for the mass production of housing.
This article examines the early stages of standardisation in Finland in the 1940s and 1950s, as reflected in the working history of an individual architect, Erkki Koiso-Kanttila (1914-2006). In 1950, Koiso-Kanttila was appointed Head of Design of the Finnish Association of Architects' Building Standards Committee, a position he held until 1958. His main responsibilities were the development of the RT Building Information File and the compilation of the files.
Abstract: As the main teaching medium and a major source of comprehensible target-language input, teacher talk plays an important role in learners' second-language acquisition. Under the trend of the "learner-centered" teaching mode, some researchers hold that the best teacher talk is the least. The author argues, however, that in the Chinese second-language classroom it is not advisable to lay too much stress on formal student participation, which requires the teacher to say as little as possible and the students to say as much as possible. The emphasis should instead be placed on how to raise the quality of teacher talk.
Abstract: This paper proposes a bioprocess optimization procedure based on Relevance Vector Regression models and an evolutionary programming technique. The Relevance Vector Regression scheme allows the development of a compact and stable data-based process model, avoiding time-consuming modeling expenses. The model building and process optimization procedure can be carried out in a semi-automated way and repeated after every new cultivation run. The proposed technique was tested on a simulated mammalian cell cultivation process. The results obtained are promising and could be attractive for the optimization of industrial bioprocesses.
Abstract: The continuous growth in the size of the World Wide Web has resulted in intricate Web sites, demanding enhanced user skills and more sophisticated tools to help the Web user find the desired information. In order to make the Web more user-friendly, it is necessary to provide personalized services and recommendations to the Web user. Many Web usage mining techniques have been applied to discover interesting and frequent navigation patterns from Web server logs. The recommendation accuracy of usage-based techniques can be improved by integrating Web site content and site structure into the personalization process.
Herein, we propose a semantically enriched Web Usage Mining method for Personalization (SWUMP), an extension of the solely usage-based technique. This approach combines the fields of Web Usage Mining and the Semantic Web. In the proposed method, we enrich the undirected graph derived from usage data with rich semantic information extracted from the Web pages and the Web site structure. The experimental results show that SWUMP generates accurate recommendations and achieves 10-20% better accuracy than the solely usage-based model. SWUMP also addresses the new-item problem inherent to solely usage-based techniques.
Abstract: The Automated Teller Machine (ATM) has become an important tool among commercial banks, and customers of banks have come to depend on and trust the ATM to conveniently meet their banking needs. Although the overwhelming advantages of the ATM cannot be over-emphasized, its alarming fraud rate has become a bottleneck in its full adoption in Nigeria. This study examined the menace of ATM fraud in society and the cost of running ATM services by banks in the country. The researcher developed a prototype of an enhanced Automated Teller Machine authentication using Short Message Service (SMS) verification. The developed prototype was tested by ten (10) respondents who are users of ATM cards in the country, and the data collected were analyzed using the Statistical Package for the Social Sciences (SPSS). Based on the results of the analysis, it is envisaged that the developed prototype will go a long way in reducing the alarming rate of ATM fraud in Nigeria.
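The verification step such a prototype adds can be sketched as a one-time code sent by SMS and checked server-side. The names, digest storage and time-to-live below are assumptions for illustration, not the paper's implementation:

```python
import hashlib
import hmac
import secrets
import time

CODE_TTL = 120  # seconds a one-time code stays valid (assumed)

def issue_code():
    """Generate a 6-digit code and a record to store server-side."""
    code = f"{secrets.randbelow(10**6):06d}"
    record = {"digest": hashlib.sha256(code.encode()).hexdigest(),
              "issued": time.time()}
    return code, record  # the code would be sent to the customer by SMS

def verify_code(entered, record):
    """Check the entered code against the stored digest within the TTL."""
    if time.time() - record["issued"] > CODE_TTL:
        return False  # code expired
    digest = hashlib.sha256(entered.encode()).hexdigest()
    # Constant-time comparison to avoid leaking information via timing
    return hmac.compare_digest(digest, record["digest"])

code, record = issue_code()
print(verify_code(code, record))      # True: fresh, correct code
print(verify_code("wrong!", record))  # False: code does not match
```

Only the digest is stored, so a leaked record does not reveal the code; the ATM would prompt for the SMS code after the usual card-and-PIN step.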
Abstract: The paper describes software for remote control and measurement, with a new Graphical User Interface, for Rohde & Schwarz instruments. The software allows remote control through Ethernet and supports basic and advanced functions for controlling various types of instruments, such as network and spectrum analyzers, power meters, signal generators and oscilloscopes. Standard Commands for Programmable Instruments (SCPI) and the Virtual Instrument Software Architecture (VISA) are used for the remote control and setup of the instruments. The developed software is modular, with a user-friendly graphical user interface for each instrument and automatic identification of instruments.
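The automatic identification the abstract mentions typically rests on the standard SCPI `*IDN?` query. A minimal sketch of that query/response pattern over a raw Ethernet socket; the in-process fake instrument and its reply string are stand-ins so the exchange can run without hardware (the described software itself uses VISA):

```python
import socket
import threading

def fake_instrument(server):
    """Stand-in instrument: answers the standard *IDN? query once."""
    conn, _ = server.accept()
    with conn:
        cmd = conn.recv(1024).decode().strip()
        if cmd == "*IDN?":  # SCPI identification query
            conn.sendall(b"Rohde&Schwarz,DEMO,0,1.0\n")

# Start the fake instrument on an ephemeral loopback port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=fake_instrument, args=(server,), daemon=True).start()

# Controller side: send the query, read the identification string.
with socket.create_connection(server.getsockname()) as sock:
    sock.sendall(b"*IDN?\n")
    reply = sock.recv(1024).decode().strip()
print(reply)  # the instrument's vendor,model,serial,firmware string
```

In the real software this exchange would go through a VISA resource string (e.g. a TCPIP instrument address) rather than a raw socket, and the parsed reply would select the matching instrument module.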