4D Modelling of Low Visibility Underwater Archaeological Excavations Using Multi-Source Photogrammetry in the Bulgarian Black Sea

This paper demonstrates the applicability of underwater photogrammetric survey in challenging conditions as the main tool to enhance and enrich the documentation of archaeological excavation through the creation of 4D models. Photogrammetry has been attempted on underwater archaeological sites since at least the 1970s, and today the production of traditional 3D models is becoming common practice within the discipline. Underwater photogrammetry is more often implemented to record exposed underwater archaeological remains and less often as a dynamic interpretative tool. It therefore tends to be applied in bright environments with underwater visibility of > 1 m, limiting its use on the many submerged archaeological sites found in more turbid conditions. Recent years have seen the development of significantly better digital photographic sensors and improved optical technology well suited to darker environments. These developments, in tandem with powerful computing systems for processing, have allowed this research to use underwater photogrammetry as a standard recording and interpretative tool. Using multi-source photogrammetry (five GoPro Hero5 Black cameras), this paper presents the accumulation of daily (4D) underwater surveys carried out at the Early Bronze Age (3,300 BC) to Late Ottoman (17th century AD) archaeological site of Ropotamo in the Bulgarian Black Sea under challenging conditions (< 0.5 m visibility). It demonstrates that underwater photogrammetry can and should be used as one of the main recording methods, even in low light and poor underwater conditions, as a way to better understand the complexity of the underwater archaeological record.

Effect of Fire Retardant Painting Product on Smoke Optical Density of Burning Natural Wood Samples

Natural wood is used in many applications in Jordan, such as furniture, partition construction, and cupboards. In this work, the smoke produced by the combustion of selected wood samples was studied experimentally. Smoke generated from burning natural wood is considered a major cause of death in furniture fires. The critical parameter for life safety in fires is the time available for escape, so the visual obscuration caused by smoke released during a fire is taken into consideration. The amount of smoke released in a fire directly affects the time available for occupants to escape. To help protect the lives of building occupants during fire growth, fire retardant painting products are tested. The tested samples of natural wood include Beech, Ash, Beech Pine, and White Beech Pine. A smoke density chamber manufactured by Fire Testing Technology was used to measure smoke properties, following the ISO 5659 test procedure. The wood samples were exposed, in a horizontal orientation, to a non-flaming radiant heat flux of 25 kW/m². The main objective of the current study is to carry out experimental tests on samples of natural woods to evaluate the capability to escape in case of fire and the fire safety requirements. Specific optical density, transmittance, thermal conductivity, and mass loss are the main measured parameters. Comparisons between painted and unpainted specimens of the selected woods are also carried out.
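
For reference, ISO 5659 derives the specific optical density Ds from the measured light transmittance and the chamber geometry; a minimal sketch of that calculation follows (the default geometry values are the standard single-chamber figures, given for illustration rather than taken from this study):

```python
import math

# Specific optical density per ISO 5659-2 / ASTM E662:
#   Ds = (V / (A * L)) * log10(100 / T)
# where T is the light transmittance (%), V the chamber volume (m^3),
# A the exposed specimen area (m^2) and L the optical path length (m).
# The defaults below give the standard geometric factor V/(A*L) ~ 132;
# treat them as illustrative, not values reported in the paper.

def specific_optical_density(T_percent, V=0.51, A=0.004225, L=0.914):
    return (V / (A * L)) * math.log10(100.0 / T_percent)

# Example: 10 % transmittance gives Ds ~ 132.
print(round(specific_optical_density(10.0), 1))
```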

Risk Management Approach for Lean, Agile, Resilient and Green Supply Chain

Implementation of LARG (Lean, Agile, Resilient, Green) practices in supply chain management is a complex task, mainly because ecological, economic, and operational goals are usually in conflict. To implement these LARG practices successfully, companies need relevant decision-making tools that allow control of process performance and visibility of improvement strategies. To contribute to this issue, this work addresses the following research question: how can performance be mastered and problems anticipated when implementing LARG practices in the supply chain? To answer this question, a risk management approach (RMA) is adopted. The proposed RMA aims to assess the ability of a supply chain, guided by “Lean, Green and Achievement” performance goals, to face “agility and resilience risk” factors. To demonstrate its relevance, a simulation-based academic logistics case study is used to illustrate all of its stages. In particular, it shows how to build the “LARG risk map”, which is the main output of this approach.

Validation of Visibility Data from Road Weather Information Systems by Comparing Three Data Resources: Case Study in Ohio

Adverse weather conditions, particularly those with low visibility, are critical to the driving task. However, the direct relationship between visibility distance and traffic flow or roadway safety is uncertain due to the limited availability of visibility data. The recent growth in deployment of Road Weather Information Systems (RWIS) makes segment-specific visibility information available, which can be integrated with other Intelligent Transportation Systems, such as automated warning systems and variable speed limits, to improve mobility and safety. Before applying RWIS visibility measurements in traffic studies and operations, it is critical to validate the data. Therefore, this paper examines the validity and viability of RWIS visibility data by comparing visibility measurements among RWIS, airport weather stations, and weather information recorded by police in crash reports, based on Ohio data. The results indicated that RWIS visibility measurements were significantly different from airport visibility data in Ohio, but no conclusion regarding the reliability of RWIS visibility could be drawn because no verified ground truth was available for the comparisons. It is suggested that more objective methods are needed to validate RWIS visibility measurements, such as continuous in-field measurements during various weather events using calibrated visibility sensors.
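
As a sketch of the kind of paired comparison such a validation implies, the snippet below contrasts hypothetical co-located RWIS and airport visibility readings; the data and the choice of tests are illustrative assumptions, since the abstract does not specify the statistical procedure used:

```python
import numpy as np
from scipy import stats

# Hypothetical co-located, time-matched visibility readings (miles).
rwis    = np.array([0.25, 0.50, 1.00, 2.00, 5.00, 10.0, 10.0, 3.00])
airport = np.array([0.50, 0.75, 1.25, 2.50, 6.00, 9.50, 9.00, 4.00])

# Paired test: do the two sources report systematically different values?
t, p = stats.ttest_rel(rwis, airport)
print(f"paired t = {t:.2f}, p = {p:.3f}")

# Wilcoxon signed-rank test is a distribution-free alternative,
# safer for skewed visibility data.
w, p_w = stats.wilcoxon(rwis, airport)
print(f"Wilcoxon W = {w:.1f}, p = {p_w:.3f}")
```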

A Ground Observation Based Climatology of Winter Fog: Study over the Indo-Gangetic Plains, India

Every year, fog formation over the Indo-Gangetic Plains (IGP) of India during the winter months of December and January is believed to create numerous hazards, inconvenience, and economic loss for the inhabitants of this densely populated region of the Indian subcontinent. The aim of this paper is to analyze the spatial and temporal variability of winter fog over the IGP. Long-term ground observations of visibility and other meteorological parameters (1971-2010) have been analyzed to understand the fog formation phenomenon and its relevance during the peak winter months over the IGP. To examine the temporal variability, time series and trend analyses were carried out using the Mann-Kendall statistical test. The test accepts the alternative hypothesis at the 95% confidence level, indicating that a trend exists. The Kendall tau statistic showed a positive correlation between time and fog frequency, and the Theil-Sen median slope estimate showed that the magnitude of the trend is positive. The magnitude is higher in January than in December across the IGP, except over the western IGP, where it is higher in December. Decade-wise time series analysis revealed a continuous increase in fog days, with a net overall increase of 99% observed over the IGP in the last four decades. Diurnal variability and average daily persistence were computed using descriptive statistical techniques. Geostatistical analysis was carried out to understand the spatial variability of fog; it revealed that the IGP is a high fog-prone zone, with fog occurring on more than 66% of days during the study period. Diurnal variability indicates that the peak occurrence of fog is between 06:00 and 10:00 local time, and average daily fog persistence extends to 5 to 7 hours during the peak winter season. The results offer a new perspective for taking proactive measures to reduce the irreparable damage that could be caused by changing fog trends.
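
The trend procedure described, Kendall's tau paired with a Theil-Sen median slope, can be sketched as follows on a synthetic fog-day series (the actual station data are not reproduced here):

```python
import numpy as np
from scipy import stats

# Hypothetical annual fog-day counts for 1971-2010; the paper's
# real station observations are not reproduced here.
years = np.arange(1971, 2011)
rng = np.random.default_rng(0)
fog_days = 20 + 0.5 * (years - 1971) + rng.normal(0, 3, years.size)

# Mann-Kendall-style trend test via Kendall's tau between time and series:
tau, p = stats.kendalltau(years, fog_days)
print(f"tau = {tau:.2f}, p = {p:.4f}")   # p < 0.05 -> significant trend

# Theil-Sen median slope: fog days gained per year, with 95% bounds.
slope, intercept, lo, hi = stats.theilslopes(fog_days, years)
print(f"slope = {slope:.2f} days/year (95% CI {lo:.2f} to {hi:.2f})")
```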

Spatial Integration at the Room-Level of 'Sequina' Slum Area in Alexandria, Egypt

The social logic of the 'Sequina' slum area in Alexandria is examined through the space syntax integration measure at room level for a sample of twenty buildings. In essence, the spatial structure integrates the central 'visitor' domain with the 'living' frontage of the 'children' zone, against the segregated privacy of the opposite 'parent' depth. Meanwhile, the multifunctioning of shallow rooms optimizes the integrated 'visitor' structure through graph and visibility dimensions, in contrast to the 'inhabitant' structure of graph-tails out of sight. A common theme is that layout integration increases to compensate for decreasing room visibility. Despite the 'pheno-type' of collective integration, the individual layouts observe a 'geno-type' structure of spatial diversity per room adjacency. In this regard, layout integration alternates the cross-correlation of the 'kitchen' and 'living' rooms with the 'inhabitant' and 'visitor' domains of the 'motherhood' dynamic structure. Moreover, an added 'grandparent' room restructures the integration measure to become the deepest space, yet one that opens onto the 'living' room of the 'household' integrity. Some isomorphic layouts change the integration structure merely through the 'balcony' extension of access and the visual or ignored 'ringiness' of space syntax. However, the most integrated or segregated layouts invert the 'geno-type' into a shallow 'inhabitant' centrality versus a remote 'visitor' structure. An overview of the multivariate social logic of spatial integration could never be clarified without micro-data analysis.
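
For readers unfamiliar with the integration measure used, a minimal sketch of the standard space syntax calculation on a room-adjacency graph follows; the small dwelling layout is hypothetical, not one of the twenty surveyed buildings:

```python
import networkx as nx

# Hypothetical room-adjacency (access) graph of a small dwelling.
G = nx.Graph([
    ("entrance", "visitor"), ("visitor", "living"),
    ("living", "kitchen"), ("living", "children"),
    ("living", "parent"),
])

# Space-syntax integration: mean depth (MD) from each room gives the
# relative asymmetry RA = 2*(MD - 1)/(k - 2), where k is the number of
# rooms; lower RA means a more integrated room.
k = G.number_of_nodes()
for room in G:
    depths = nx.shortest_path_length(G, room)
    md = sum(depths.values()) / (k - 1)
    ra = 2 * (md - 1) / (k - 2)
    print(f"{room:9s} MD={md:.2f} RA={ra:.2f}")
```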

A New Tool for Global Optimization Problems: Cuttlefish Algorithm

This paper presents a new meta-heuristic, bio-inspired optimization algorithm called the Cuttlefish Algorithm (CFA). The algorithm mimics the color-changing mechanism of the cuttlefish to solve numerical global optimization problems. The colors and patterns of the cuttlefish are produced by light reflected from three different layers of cells. The proposed algorithm considers two main processes: reflection and visibility. The reflection process simulates the light reflection mechanism used by these layers, while the visibility process simulates the visibility of matching patterns of the cuttlefish. To show the effectiveness of the algorithm, it is compared with other popular bio-inspired optimization algorithms previously proposed in the literature, such as Genetic Algorithms (GA), Particle Swarm Optimization (PSO), and the Bees Algorithm (BA). Simulation results indicate that the proposed CFA is superior to these algorithms.
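
A minimal sketch of the reflection-plus-visibility update at the heart of the CFA is given below, assuming the commonly cited form in which a new solution combines a reflected copy of the current cell with the visibility of the best pattern; the parameter ranges and the toy objective are illustrative assumptions:

```python
import random

def sphere(x):                      # toy objective to minimize
    return sum(xi * xi for xi in x)

def cfa_update(cell, best, r1=1.0, r2=-1.0, v1=1.5, v2=-1.5):
    # Reflection stretches/shrinks the current cell; visibility pulls
    # it toward the best pattern found so far.
    R = random.uniform(r2, r1)      # degree of reflection
    V = random.uniform(v2, v1)      # degree of pattern visibility
    return [R * c + V * (b - c) for c, b in zip(cell, best)]

# One generation over a tiny population, keeping the better of old/new.
pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(10)]
best = min(pop, key=sphere)
pop = [min((cfa_update(c, best), c), key=sphere) for c in pop]
print(min(map(sphere, pop)))
```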

Trial Development of an Evaluation Method for Quantifying the Feeling of Obstructed Visibility Caused by the Front A-Pillar

Many drivers feel that the right A-pillar of a Japanese right-hand-drive car obstructs visibility when turning right or left at an intersection. Moreover, it has been reported that most pedestrian accidents are caused by drivers' delay in noticing the pedestrian, and that this delay can be detected from drivers' eye movements. We therefore developed a quantitative evaluation method based on least-squares estimation of drivers' eye-movement data, and applied it to a commercial vehicle to evaluate its visibility. The results suggest that vehicle visibility can be quantified and estimated by a linear model fitted to experimental eye-fixation data and vehicle dimension information.
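
A minimal sketch of the kind of least-squares fit described, relating a fixation-derived obstruction score to vehicle dimensions, follows; the predictor names and all numbers are hypothetical, not the paper's actual variables:

```python
import numpy as np

# Hypothetical design matrix: one row per vehicle, columns are
# [pillar_width_mm, pillar_angle_deg, eye_to_pillar_distance_mm].
X = np.array([
    [80, 25, 600], [95, 28, 580], [70, 22, 640],
    [110, 30, 550], [85, 26, 610], [100, 27, 570],
], dtype=float)
y = np.array([3.1, 4.0, 2.4, 4.8, 3.3, 4.2])  # fixation-derived scores

# Add an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and weights:", np.round(coef, 4))
```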

Security in Crosswalks

Lighting is important not only for traffic safety but also for the protection of pedestrians. Improving long-distance visibility, lighting, and signing considerably reduces the risk of accidents at crosswalks. This paper evaluates different aspects of crosswalks, including signing and lighting, with the aim of improving road safety.

A New Approach to Steganography using Sinc-Convolution Method

Both image steganography and image encryption have advantages and disadvantages. Steganography allows us to hide a desired image containing confidential information in a cover or host image, while image encryption transforms the desired image into a non-readable, incomprehensible form. Encryption methods are usually much more robust than steganographic ones. However, they have high visibility and can easily provoke attackers, since it is usually obvious from an encrypted image that something is hidden. Combining steganography and encryption covers both of their weaknesses and therefore increases security. In this paper, an image encryption method based on sinc-convolution, using an encryption key of 128-bit length, is introduced. The encrypted image is then hidden in a host image using a modified version of the JSteg steganography algorithm. This method can be applied to almost all image formats, including TIF, BMP, GIF, and JPEG. Experimental results show that our method is able to hide a desired image with high security and low visibility.
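
The JSteg family embeds message bits in the least significant bits of quantized DCT coefficients, skipping coefficients equal to 0 or 1; a minimal sketch of that core step follows (the paper's specific modification to JSteg, and the sinc-convolution cipher, are not shown):

```python
import numpy as np
from scipy.fftpack import dct, idct

def embed(block8x8, bits):
    # 2D DCT of one 8x8 block, rounded to integer coefficients.
    coeffs = np.round(dct(dct(block8x8.T, norm='ortho').T, norm='ortho'))
    coeffs = coeffs.astype(int)
    flat = coeffs.ravel()            # view: edits write back into coeffs
    it = iter(bits)
    for i in range(1, flat.size):    # skip the DC coefficient
        if flat[i] in (0, 1):        # JSteg leaves 0/1 untouched
            continue
        try:
            flat[i] = (flat[i] & ~1) | next(it)   # set the LSB
        except StopIteration:
            break
    # Inverse 2D DCT reconstructs the stego block.
    return idct(idct(coeffs.T.astype(float), norm='ortho').T, norm='ortho')

block = np.random.default_rng(1).integers(0, 256, (8, 8)).astype(float)
stego = embed(block, [1, 0, 1, 1, 0])
```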

A Study of Color Transformation on Website Images for the Color Blind

In this paper, we study color transformation methods applied to website images for the color blind. The most common category of color blindness is red-green color blindness, in which the affected colors are perceived as beige. By transforming the colors of the images, the color visibility experienced by the color blind can be improved, giving them a better view when browsing websites. To transform the colors of website images, we study two algorithms: conversion from the RGB color space to the HSV color space, and self-organizing color transformation. The comparative study focuses on criteria of ease of use, quality, accuracy, and efficiency. The outcome of the study leads to the enhancement of website images to meet the vision requirements of the color blind in perceiving image detail.
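
A minimal sketch of the RGB-to-HSV route follows: convert each pixel to HSV, rotate hues in the bands that red-green color-blind users tend to confuse, and convert back. The shift angle and band edges are illustrative choices, not values from the paper:

```python
import colorsys

def daltonize_pixel(r, g, b, shift_deg=40):
    # Convert 8-bit RGB to HSV (colorsys works in the 0..1 range).
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    hue_deg = h * 360
    # Rotate hues in the red and green bands, where confusion occurs.
    if hue_deg < 30 or hue_deg > 330 or 90 < hue_deg < 150:
        hue_deg = (hue_deg + shift_deg) % 360
    r2, g2, b2 = colorsys.hsv_to_rgb(hue_deg / 360, s, v)
    return round(r2 * 255), round(g2 * 255), round(b2 * 255)

print(daltonize_pixel(200, 30, 30))  # a red tone shifted toward orange
```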

Plug and Play Interferometer Configuration using Single Modulator Technique

We demonstrate single-photon interference over 10 km using a plug and play system for quantum key distribution. The quality of the interferometer is measured by its visibility. The signal is phase-coded, and the visibility is determined from the interference effect, which produces a photon count. The setup gives full control of the polarization inside the interferometer. The quality measurement of the interferometer is based on the number of counts per second, and the system produces 94% visibility in one of the detectors.
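
Interference visibility is conventionally computed from the maximum and minimum detector count rates as the interferometer phase is scanned; a small worked example follows (the count values are illustrative, not the paper's data):

```python
# Visibility V = (N_max - N_min) / (N_max + N_min), from counts per
# second at constructive and destructive interference. The numbers
# below are hypothetical; 94% visibility corresponds to roughly a
# 32:1 contrast between the bright and dark fringes.
n_max, n_min = 33_000, 1_020       # hypothetical counts per second
V = (n_max - n_min) / (n_max + n_min)
print(f"visibility = {V:.1%}")      # ~94.0%
```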

Unified Method to Block Pornographic Images in Websites

This paper proposes a technique to block adult images displayed on websites. The filter is designed to perform even in exceptional cases, such as when face detection is not possible or face visibility is poor. This is achieved by using an alternative phase that extracts the MFC (Most Frequent Color) from human body regions, which are estimated using a biometric of anthropometric distances between fixed, rigidly connected body locations. The logical results generated can be protected from being overridden by a firewall or intrusion by encrypting the result in an SSH data packet.
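
A minimal sketch of MFC extraction follows: quantize the pixels of an estimated body region and take the modal color bin. The random region array and the 32-level quantization are illustrative assumptions; the anthropometric region estimation itself is not shown:

```python
import numpy as np

def most_frequent_color(region_rgb, levels=32):
    # Quantize each channel to `levels` bins to merge near-identical colors.
    step = 256 // levels
    quant = (region_rgb.reshape(-1, 3) // step).astype(np.int64)
    # Encode each quantized (r, g, b) triple as a single bin index.
    bins = quant[:, 0] * levels * levels + quant[:, 1] * levels + quant[:, 2]
    mfc = np.bincount(bins).argmax()
    r, g, b = mfc // (levels * levels), (mfc // levels) % levels, mfc % levels
    # Return the center of the winning bin as the representative color.
    return (r * step + step // 2, g * step + step // 2, b * step + step // 2)

region = np.random.default_rng(2).integers(0, 256, (64, 64, 3))
print(most_frequent_color(region))  # dominant quantized color of the region
```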