VDGMSISS: A Verifiable and Detectable Multi-Secret Images Sharing Scheme with General Access Structure

A secret image sharing scheme protects an image by dispersing it into numerous shadow images. A scheme that can withstand impersonation attacks and achieve the highly practical property of multi-use is all the more valuable in practice. This paper therefore proposes a verifiable and detectable secret image sharing scheme, called VDGMSISS, that resists impersonation attacks and provides further properties such as encrypting multiple secret images at one time and multi-use. Moreover, our scheme can be applied to any general access structure.
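For readers unfamiliar with the underlying idea, the polynomial-based sharing that schemes of this kind build on can be illustrated with a minimal sketch. The Python snippet below is not the VDGMSISS construction itself; it is a (k, n) Shamir-style split of a single pixel byte over the prime field GF(257), with the prime, threshold and share count chosen purely for illustration.

```python
# Minimal (k, n) Shamir-style sharing of one secret byte over GF(257).
# Illustrative only; VDGMSISS adds verifiability, cheater detection,
# multi-secret images and general access structures on top of ideas like this.
import random

P = 257  # small prime > 255, so any pixel value fits in the field

def make_shares(secret, k, n):
    """Split `secret` (0..255) into n shares, any k of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from k shares."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = (num * -xm) % P
                den = (den * (xj - xm)) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

if __name__ == "__main__":
    pixel = 173
    shares = make_shares(pixel, k=3, n=5)         # 5 shadow values, threshold 3
    print(reconstruct(random.sample(shares, 3)))  # -> 173
```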

Fundamental Theory of the Evolution Force: Gene Engineering utilizing Synthetic Evolution Artificial Intelligence

The effects of the evolution force are observable in nature at all structural levels, ranging from small molecular systems to enormous biospheric systems. However, the evolution force and the work associated with the formation of biological structures have yet to be described mathematically or theoretically. In addressing this conundrum, we consider evolution from a unique perspective and in doing so introduce the “Fundamental Theory of the Evolution Force” (FTEF). We utilized synthetic evolution artificial intelligence (SYN-AI) to identify genomic building blocks and to engineer 14-3-3 ζ docking proteins by transforming gene sequences into time-based DNA codes derived from protein hierarchical structural levels. These codes served as templates for random DNA hybridizations and genetic assembly. The application of hierarchical DNA codes allowed us to fast-forward evolution while dampening the effect of point mutations. Natural selection was performed at each hierarchical structural level, and mutations were screened using BLOSUM80 mutation frequency-based algorithms. Notably, SYN-AI engineered a set of three architecturally conserved docking proteins that retained the motion and vibrational dynamics of native Bos taurus 14-3-3 ζ.
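As a purely illustrative aside (not SYN-AI's actual screening pipeline), a frequency-derived substitution matrix such as BLOSUM80 can be used to rank candidate point mutations. The sketch below assumes Biopython is installed and simply looks up BLOSUM80 scores for hypothetical substitutions.

```python
# Illustrative BLOSUM80-based scoring of candidate point mutations.
# This is NOT the SYN-AI screening algorithm; it only shows how a
# frequency-derived substitution matrix can rank mutations.
from Bio.Align import substitution_matrices  # requires Biopython

blosum80 = substitution_matrices.load("BLOSUM80")

def mutation_score(wild_type_aa, mutant_aa):
    """Log-odds score for replacing one amino acid with another."""
    return blosum80[wild_type_aa, mutant_aa]

# Hypothetical substitutions in a 14-3-3 zeta-like sequence:
for wt, mut in [("L", "I"), ("L", "V"), ("L", "D")]:
    print(f"{wt} -> {mut}: {mutation_score(wt, mut):+.0f}")
# Conservative swaps (L->I, L->V) score higher than disruptive ones (L->D).
```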

The Need to Enhance Online Consumer Protection in KSA

E-commerce has evolved into a functional and mainstream tool of global trading, including in the Kingdom of Saudi Arabia. Consequently, online consumers need protection just as much as consumers in the offline world. In 2019, the Ministry of Commerce in Saudi Arabia introduced an e-commerce law; however, this law does not cover the court enforcement of contracts entered into with international vendors, so it is not applicable in cross-border situations. The purpose of this paper is to identify the gaps in this new e-commerce law in Saudi Arabia.

Churn Prediction for Telecommunication Industry Using Artificial Neural Networks

Telecommunication service providers demand accurate and precise prediction of customer churn probabilities to increase the effectiveness of their customer relation services. The large amount of customer data owned by the service providers is suitable for analysis by machine learning methods. In this study, expenditure data of customers are analyzed using an artificial neural network (ANN). The ANN model is applied to the data of customers with different billing durations. The proposed model predicts churn probabilities with 83% accuracy using only three months of expenditure data, and the prediction accuracy increases to 89% when nine months of data are used. The experiments also show that the accuracy of the ANN model increases on an extended feature set that includes information on changes in the bill amounts.
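As a rough illustration of the modeling setup (not the authors' exact architecture or data), the following scikit-learn sketch trains a small feed-forward ANN on synthetic monthly-expenditure features, where the feature count stands in for three or nine months of billing data and the appended trend feature mimics the extended feature set mentioned above.

```python
# Minimal churn-prediction ANN sketch on synthetic expenditure data.
# Feature count (e.g., 3 vs 9 monthly bills) stands in for billing duration;
# architecture and data are illustrative, not the paper's exact setup.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_customers, n_months = 5000, 9
bills = rng.gamma(shape=2.0, scale=30.0, size=(n_customers, n_months))
trend = bills[:, -1] - bills[:, 0]                 # change in bill amount
churn = (trend + rng.normal(0, 20, n_customers) < -15).astype(int)

X = np.hstack([bills, trend.reshape(-1, 1)])       # extended feature set
X_tr, X_te, y_tr, y_te = train_test_split(X, churn, test_size=0.25, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
```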

Study of Photonic Crystal Band Gap and Hexagonal Microcavity Based on Elliptical Shaped Holes

In this paper, we present a numerical study of the optical properties of a triangular periodic lattice of elliptical air holes. We report the influence of the ratio of the semi-major axis length of the elliptical holes to the filling ratio on the photonic band gap. Using the finite-difference time-domain (FDTD) algorithm, we then show that the resonant wavelength of point-defect microcavities in a two-dimensional photonic crystal (PC) shifts towards lower wavelengths as the filling ratio increases significantly. The Q factor also increases gradually with the filling ratio, owing to the increased reflectivity of the PC mirror. We further investigate theoretically the H1 cavity, where the semi-major axis (Rx) of the six holes surrounding the cavity is fixed at 0.5a and the Rx of the two edge air holes is fixed at the optimum value of 0.52a. The highest Q factor of 4.1359 × 10^6 is achieved at the resonant mode located at λ = 1.4970 µm.
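For context, the quality factor of such a microcavity is related to the resonant wavelength and the linewidth of the resonance by Q = λ0/Δλ. The snippet below uses the resonant wavelength and Q factor quoted above and back-computes the implied linewidth purely to illustrate the relation; it is not extracted from the paper's FDTD data.

```python
# Q factor from resonant wavelength and linewidth: Q = lambda_0 / delta_lambda.
# lambda_0 and Q are taken from the abstract; the linewidth is back-computed
# here only to illustrate the relation, not measured from the FDTD results.
lambda_0 = 1.4970e-6          # resonant wavelength (m)
Q = 4.1359e6                  # reported quality factor
delta_lambda = lambda_0 / Q   # implied full width at half maximum
print(f"implied linewidth: {delta_lambda * 1e12:.3f} pm")  # ~0.362 pm
```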

Model of Obstacle Avoidance on Hard Disk Drive Manufacturing with Distance Constraint

Obstacle avoidance is a key capability for robot systems operating in unknown environments. The robots should be able to know their position and maintain a safety region. This research starts from path planning using SLAM and AMCL in the ROS system. In addition, the best parameters of the obstacle avoidance function must be determined. In hard disk drive manufacturing, the distance between robots and obstacles is critical because of the manufacturing constraints. The simulations are accomplished using SLAM and AMCL with adaptive velocity and safety-region calculation.
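The idea of adapting velocity to the distance from the nearest obstacle can be sketched as below; the thresholds and the linear scaling law are hypothetical placeholders, not the tuned obstacle-avoidance parameters used in this study.

```python
# Illustrative adaptive-velocity rule with a distance (safety-region) constraint.
# Thresholds and the linear ramp are hypothetical, not the study's tuned values.
def safe_velocity(v_max, d_nearest, d_stop=0.3, d_safe=1.0):
    """Scale the commanded speed by the distance to the nearest obstacle (m)."""
    if d_nearest <= d_stop:          # inside the hard safety region: stop
        return 0.0
    if d_nearest >= d_safe:          # far from obstacles: full speed
        return v_max
    # linear ramp between the stop distance and the safe distance
    return v_max * (d_nearest - d_stop) / (d_safe - d_stop)

for d in (0.2, 0.5, 0.8, 1.5):
    print(f"d = {d:.1f} m -> v = {safe_velocity(0.6, d):.2f} m/s")
```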

The Explanation for Dark Matter and Dark Energy

The following assumptions of the Big Bang theory are challenged and found to be false: the cosmological principle, the assumption that all matter formed at the same time and the assumption regarding the cause of the cosmic microwave background radiation. The evolution of the universe is described based on the conclusion that the universe is finite with a space boundary. This conclusion is reached by ruling out the possibility of an infinite universe or a universe which is finite with no boundary. In a finite universe, the centre of the universe can be located with reference to our home galaxy (The Milky Way) using the speed relative to the Cosmic Microwave Background (CMB) rest frame and Hubble's law. This places our home galaxy at a distance of approximately 26 million light years from the centre of the universe. Because we are making observations from a point relatively close to the centre of the universe, the universe appears to be isotropic and homogeneous but this is not the case. The CMB is coming from a source located within the event horizon of the universe. There is sufficient mass in the universe to create an event horizon at the Schwarzschild radius. Galaxies form over time due to the energy released by the expansion of space. Conservation of energy must consider total energy which is mass (+ve) plus energy (+ve) plus spacetime curvature (-ve) so that the total energy of the universe is always zero. The predominant position of galaxy formation moves over time from the centre of the universe towards the boundary so that today the majority of new galaxy formation is taking place beyond our horizon of observation at 14 billion light years.
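The distance estimate quoted above follows from applying Hubble's law to the Milky Way's speed relative to the CMB rest frame. With commonly cited round numbers (assumed here for illustration, not taken from the paper: v_CMB ≈ 550 km/s and H_0 ≈ 70 km/s/Mpc), the arithmetic is

d = \frac{v_{\mathrm{CMB}}}{H_0} \approx \frac{550\ \mathrm{km\,s^{-1}}}{70\ \mathrm{km\,s^{-1}\,Mpc^{-1}}} \approx 7.9\ \mathrm{Mpc} \approx 2.6\times 10^{7}\ \mathrm{light\ years},

which is consistent with the approximately 26 million light years stated above.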

A Research on Determining the Viability of a Job Board Website for Refugees in Kenya

The Refugee Job Board Website is a web-based application that provides a platform for organizations to post jobs specifically for refugees. Organizations upload job opportunities and refugees can view them on the website. The website also allows refugees to input their skills and qualifications. The system was developed using the waterfall (traditional) methodology. The software development tools include Brackets, used to code the website, and phpMyAdmin, used to manage the database that stores all the data.

A Study of the Trade-off Energy Consumption-Performance-Schedulability for DVFS Multicore Systems

Dynamic Voltage and Frequency Scaling (DVFS) multicore platforms are promising execution platforms that enable high computational performance, lower energy consumption and flexibility in scheduling the system processes. However, the resulting interleaving and memory interference, together with per-core frequency tuning, make real-time guarantees hard to deliver. Besides, energy consumption represents a strong constraint for the deployment of such systems in energy-limited settings. Identifying the system configurations that achieve high performance and consume less energy while guaranteeing system schedulability is a complex task in the design of modern embedded systems. This work studies the trade-off between energy consumption, core utilization and memory bottleneck, and their impact on the schedulability of DVFS multicore time-critical systems with a hierarchy of shared memories. We build a model-based framework using the parametrized timed automata of UPPAAL to analyze the mutual impact of performance, energy consumption and schedulability of DVFS multicore systems, and demonstrate the trade-off on an actual case study.
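The core trade-off can be caricatured with a simple dynamic-power model; the constants below are arbitrary placeholders and the model deliberately ignores the memory interference analysed in the paper, so it only illustrates why frequency scaling trades execution time (and hence schedulability margin) against energy.

```python
# Toy DVFS trade-off: dynamic power ~ C * V^2 * f, execution time ~ cycles / f.
# Voltage is assumed to scale roughly linearly with frequency; all constants
# are placeholders, and memory interference is deliberately ignored here.
def energy_and_time(freq_ghz, cycles=2e9, capacitance=1e-9, v_per_ghz=0.6):
    voltage = v_per_ghz * freq_ghz                       # crude V-f relation
    power = capacitance * voltage**2 * freq_ghz * 1e9    # watts
    time = cycles / (freq_ghz * 1e9)                     # seconds
    return power * time, time

for f in (0.8, 1.2, 1.6, 2.0):
    e, t = energy_and_time(f)
    print(f"{f:.1f} GHz: {t*1e3:7.1f} ms, {e*1e3:7.1f} mJ")
# Higher frequency meets tighter deadlines (schedulability) but costs roughly
# f^2 more energy per task under this model, which is the trade-off at stake.
```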

Einstein’s General Equation of the Gravitational Field

The generalization of the relativistic theory of gravity is based essentially on the principle of equivalence, which stipulates that for all bodies the gravitational mass is equal to the inertial mass. This leads us to believe that gravitation is not a property of the bodies themselves but of space, and to the conclusion that the gravitational field must curve space-time. This in turn requires abandoning Minkowski space (since Minkowski space-time has zero curvature) and adopting Riemannian geometry as the mathematical framework in which to determine the curvature. The work presented in this paper therefore begins with the evolution of the concept of gravity, and then develops the tensor field formalism, expressed through Riemannian geometry, to formulate the general equation of the gravitational field.
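For reference, the general equation of the gravitational field that this formulation leads to is Einstein's field equation, relating the curvature of space-time to its energy-momentum content:

R_{\mu\nu} - \tfrac{1}{2}\, g_{\mu\nu} R = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}

(with an additional cosmological-constant term \Lambda g_{\mu\nu} on the left-hand side if required).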

The Benefits of End-To-End Integrated Planning from the Mine to Client Supply for Minimizing Penalties

Control over the characteristics of the delivered iron ore blend is one of the most important aspects of the mining business. The iron ore price is a function of its composition, which is the outcome of the beneficiation process. End-to-end integrated planning of mine operations can therefore reduce the risk of penalties on the iron ore price. In a standard iron mining company, the production chain is composed of mining, ore beneficiation, and client supply. When mine planning and client supply decisions are made in an uncoordinated way, the beneficiation plant struggles to deliver the best possible blend. Technological improvements in several fields have made it possible to bridge the gap between departments and boost integrated decision-making processes. Clustering and classification algorithms applied to historical production data generate reasonable predictions of the quality and volume of iron ore produced for each pile of run-of-mine (ROM) processed. Mathematical modeling can use those deterministic relations to propose iron ore blends that better fit specifications within a delivery schedule. Additionally, a model capable of representing the whole production chain can clearly compare the overall impact of different decisions in the process. This study shows how flexibilization, combined with a planning optimization model spanning the mine and the ore beneficiation processes, can reduce the risk of out-of-specification deliveries. The model's capabilities are illustrated on a hypothetical iron ore mine with a magnetic separation process. Finally, this study shows ways of reducing costs or increasing profit by optimizing process indicators across the production chain and integrating the different planning stages with the sales decisions.
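A minimal sketch of the blend-optimization idea is given below, with hypothetical piles, grades, specification limits and order size, solved with SciPy's linear programming routine rather than the authors' full production-chain model.

```python
# Toy iron-ore blending LP: choose tonnage from each ROM pile to maximise
# delivered tonnes while keeping the blended Fe and silica within spec.
# Piles, grades, spec limits, capacities and demand are hypothetical.
import numpy as np
from scipy.optimize import linprog

fe    = np.array([0.65, 0.58, 0.62])   # Fe fraction of piles A, B, C
sio2  = np.array([0.04, 0.09, 0.06])   # silica fraction
avail = np.array([80.0, 120.0, 60.0])  # available tonnes per pile (kt)

fe_min, sio2_max, demand = 0.62, 0.06, 150.0   # blend spec and order size (kt)

c = -np.ones(3)  # maximise total tonnes shipped -> minimise the negative sum
A_ub = np.vstack([
    fe_min - fe,        # sum((fe_min - fe_i) * x_i) <= 0  -> blend Fe >= spec
    sio2 - sio2_max,    # sum((sio2_i - max) * x_i) <= 0   -> blend SiO2 <= spec
    np.ones(3),         # total shipped <= demand
])
b_ub = np.array([0.0, 0.0, demand])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=list(zip(np.zeros(3), avail)), method="highs")
print("tonnes from each pile (kt):", np.round(res.x, 1))
```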

Probabilistic Approach of Dealing with Uncertainties in Distributed Constraint Optimization Problems and Situation Awareness for Multi-agent Systems

In this paper, we describe how Bayesian inferential reasoning contributes to obtaining satisfactory predictions for Distributed Constraint Optimization Problems (DCOPs) with uncertainties. We also demonstrate how DCOPs can be merged with multi-agent knowledge understanding and prediction (i.e., Situation Awareness). The DCOP functions were merged with a Bayesian Belief Network (BBN) in the form of situation, awareness, and utility nodes. We describe how the uncertainties can be represented in the BBN and how effective predictions can be made using the expectation-maximization algorithm or the conjugate gradient descent algorithm. The idea of variable prediction using Bayesian inference may reduce the number of variables in the agents' sampling domain and also allows missing variables to be estimated. Experimental results show that the BBN produces more compelling predictions on samples containing uncertainties than on perfect samples. That is, Bayesian inference can help in handling the uncertainties and dynamism of DCOPs, which is a current issue in the DCOP community. We show how Bayesian inference can be formalized with Distributed Situation Awareness (DSA) using uncertain and missing agents' data. The whole framework was tested on a multi-UAV mission for forest fire searching. Future work will focus on augmenting the existing architecture to deal with dynamic DCOP algorithms and multi-agent information merging.
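A stripped-down illustration of the inference step is shown below: a single discrete variable updated from noisy evidence with numpy, rather than the paper's full BBN with situation, awareness and utility nodes. States, priors and likelihoods are made up for the example.

```python
# Minimal Bayesian update for one uncertain/missing agent variable.
# States, priors and likelihoods are illustrative; the paper's framework uses
# a Bayesian Belief Network with situation, awareness and utility nodes.
import numpy as np

states = ["no_fire", "smoulder", "active_fire"]
prior = np.array([0.70, 0.20, 0.10])

# P(sensor reads "smoke" | state): a noisy observation model
lik_smoke = np.array([0.05, 0.60, 0.90])

posterior = prior * lik_smoke
posterior /= posterior.sum()

for s, p in zip(states, posterior):
    print(f"P({s} | smoke) = {p:.3f}")
# The posterior can stand in for a missing variable, shrinking the agents'
# sampling domain before the DCOP algorithm assigns values.
```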

Impact of Weather Conditions on Generalized Frequency Division Multiplexing over Gamma Gamma Channel

Generalized frequency division multiplexing (GFDM), applied over the free-space optical channel, can be a good option for implementing free-space optical communication systems. This technique has several strengths, e.g., good spectral efficiency, low peak-to-average power ratio (PAPR), adaptability and low co-channel interference. In this paper, the impact of weather conditions such as haze, rain and fog on GFDM over the gamma-gamma channel model is discussed. The trade-off between link distance and system performance under intense weather conditions is also analysed. The symbol error probability (SEP) of GFDM over the gamma-gamma turbulence channel is derived and verified with computer simulations.
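For context, gamma-gamma turbulence is commonly modeled as the product of two independent Gamma-distributed irradiance factors. The sketch below generates such samples and checks the scintillation index 1/α + 1/β + 1/(αβ); the α and β values are chosen arbitrarily rather than taken from the paper's haze, rain or fog scenarios.

```python
# Gamma-gamma turbulence samples as the product of two unit-mean Gamma RVs.
# alpha/beta values are illustrative, not the paper's weather-condition settings.
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, n = 4.0, 2.0, 200_000

x = rng.gamma(shape=alpha, scale=1.0 / alpha, size=n)   # large-scale factor
y = rng.gamma(shape=beta, scale=1.0 / beta, size=n)     # small-scale factor
irradiance = x * y                                      # unit-mean gamma-gamma

si_empirical = irradiance.var() / irradiance.mean() ** 2
si_theory = 1 / alpha + 1 / beta + 1 / (alpha * beta)
print(f"scintillation index: {si_empirical:.3f} (theory {si_theory:.3f})")
```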

Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models

The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models, because the models have non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve the American option pricing problem under jump-diffusion models by using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M^2) to O(M log M). The partial-fraction decomposition technique is applied to rational approximation schemes to overcome the complexity of inverting matrix polynomials. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
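The FFT trick mentioned above relies on the fact that the discretized jump operator is Toeplitz-structured, so it can be embedded in a circulant matrix whose action is a circular convolution computable in O(M log M). A minimal numpy sketch of that embedding follows, with a random Toeplitz matrix standing in for the discretized jump integral.

```python
# O(M log M) Toeplitz matrix-vector product via circulant embedding and FFT.
# A random Toeplitz matrix stands in for the discretized jump-integral operator.
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(0)
M = 1024
col, row = rng.standard_normal(M), rng.standard_normal(M)
row[0] = col[0]
T = toeplitz(col, row)
v = rng.standard_normal(M)

# Embed T in a 2M-point circulant: its first column is
# [col, 0, reversed tail of row]; its action is a circular convolution.
c = np.concatenate([col, [0.0], row[1:][::-1]])
w = np.fft.ifft(np.fft.fft(c) * np.fft.fft(np.concatenate([v, np.zeros(M)])))
fast = w[:M].real

print(np.allclose(fast, T @ v))   # True: same product, computed in O(M log M)
```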

Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme

The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. Modeling option pricing with Black-Scholes models with jumps ensures that market movements are taken into account. However, this model can only be solved numerically. Furthermore, not all numerical methods are efficient for solving these models, because the models have non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve the partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M^2) to O(M log M). A partial-fraction form of Padé schemes is used to overcome the complexity of inverting matrix polynomials. These two tools guarantee efficient and accurate numerical solutions. We construct a parallel, easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is.
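Under Merton's model, a European call also has a classical series solution (a Poisson-weighted sum of Black-Scholes prices), which is often used as an accuracy benchmark for PIDE solvers of this kind. The sketch below implements that standard Merton (1976) formula with arbitrary illustrative parameters; it is not the ETD-Padé scheme itself.

```python
# Merton (1976) jump-diffusion European call via the Poisson-weighted
# Black-Scholes series; handy as an accuracy benchmark for PIDE solvers.
# Parameter values are arbitrary illustrative choices.
import numpy as np
from math import factorial
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def merton_call(S, K, T, r, sigma, lam, mu_j, delta_j, n_terms=50):
    kappa = np.exp(mu_j + 0.5 * delta_j**2) - 1.0      # mean relative jump size
    lam_p = lam * (1.0 + kappa)
    price = 0.0
    for n in range(n_terms):
        sigma_n = np.sqrt(sigma**2 + n * delta_j**2 / T)
        r_n = r - lam * kappa + n * np.log(1.0 + kappa) / T
        weight = np.exp(-lam_p * T) * (lam_p * T) ** n / factorial(n)
        price += weight * bs_call(S, K, T, r_n, sigma_n)
    return price

print(round(merton_call(S=100, K=100, T=1.0, r=0.05,
                        sigma=0.2, lam=0.5, mu_j=-0.1, delta_j=0.2), 4))
```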

Assessment of Psychomotor Development of Preschool Children: A Review of Eight Psychomotor Developmental Tools

The assessment of psychomotor development allows us to identify children with motor delays, helps us to monitor progress over time and to prepare suitable intervention programs. The foundation of psychomotor development lies in the preschool years and is crucial for a child's further cognitive and social development. Many assessment tools of psychomotor development have been developed over the years. Some of them are simple screening tools; others are more complex and sophisticated. The purpose of this review is to describe the history of psychomotor assessment, to characterize the psychomotor evaluation of preschool children, and to review eight psychomotor development assessment tools for preschool children (Denver II, DEMOST-PRE, TGMD-2/3, BOT-2, MABC-2, PDMS-2, KTK, MOT 4-6). The selection of a test depends on the purpose and the context in which the assessment is planned.

BIM Application Research Based on the Main Entrance and Garden Area Project of Shanghai Disneyland

Based on the main entrance and garden area (ME&G) project of Shanghai Disneyland, this paper introduces the application of BIM technology in this kind of low-rise comprehensive building with complex facade, electromechanical and decoration systems. BIM technology was applied throughout the design, construction and completion of the project. Through the construction of a BIM application framework for the whole project, the key points of the BIM modeling methods for the different systems and the integration and coordination of the BIM models are elaborated in detail. The specific application methods of BIM technology in similar complex low-rise building projects are set out. Finally, the paper summarizes the benefits of applying BIM technology and puts forward suggestions for the BIM management mode and practical application of similar projects in the future.

A Deep-Learning Based Prediction of Pancreatic Adenocarcinoma with Electronic Health Records from the State of Maine

Predicting the risk of pancreatic adenocarcinoma (PA) in advance can benefit the quality of care and potentially reduce population mortality and morbidity. The aim of this study was to develop and prospectively validate a risk prediction model to identify patients at risk of new incident PA as early as 3 months before the onset of PA in a statewide, general population in Maine. The PA prediction model was developed using deep neural networks, a deep learning algorithm, with a 2-year electronic-health-record (EHR) cohort. Prospective results showed that our model identified 54.35% of all inpatient episodes of PA, and 91.20% of all PA cases that required subsequent chemoradiotherapy, with a lead time of up to 3 months and a true alert rate of 67.62%. The risk assessment tool attained an improved discriminative ability. It can be immediately deployed in the health system to provide automatic early warnings to adults at risk of PA, and it has the potential to identify personalized risk factors to facilitate customized PA interventions.
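For readers unfamiliar with the reported metrics, the 54.35% figure is a sensitivity-style capture rate, and one plausible reading of the 67.62% "true alert" rate is the precision of the alerts. The toy confusion-matrix computation below uses hypothetical counts (not the study's data) chosen only to reproduce metrics of the same flavour.

```python
# Sensitivity (fraction of true PA episodes captured) and precision
# ("true alert" rate) from a confusion matrix. Counts are hypothetical,
# chosen only to yield metrics of the same flavour as those reported.
true_positives, false_negatives, false_positives = 50, 42, 24

sensitivity = true_positives / (true_positives + false_negatives)
precision = true_positives / (true_positives + false_positives)

print(f"sensitivity: {sensitivity:.2%}")            # ~54% of PA episodes flagged
print(f"true-alert (precision): {precision:.2%}")   # ~68% of alerts are correct
```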

Teaching Attentive Literature Reading in Higher Education French as a Foreign Language: A Pilot Study of a Flipped Classroom Teaching Model

Teaching French as a foreign language usually implies teaching French literature, especially in higher education. Training university students in literary reading in a foreign language requires addressing several aspects at the same time: the (foreign) language, the poetic language, the aesthetic aspects of the studied works, and various interpretations of them. A pilot study sought to test a teaching model that would support students in learning to perform competent readings and short analyses of French literary works, in a rather independent manner. This shared practice paper describes the use of a flipped classroom method in two French literature courses, a campus course and an online course, and suggests that the teaching model may provide efficient tools for teaching literary reading and analysis in a foreign language. The teaching model builds on a high level of student activity and focuses on attentive reading, meta-perspectives such as theoretical concepts, individual analyses by students where said concepts are applied, and group discussions of the studied texts and of possible interpretations.

Needs Analysis Survey of Hearing Impaired Students’ Teachers in Elementary Schools for Designing Curriculum Plans and Improving Human Resources

This paper presents a needs analysis of teachers of hearing-impaired students in elementary schools across Iran. The subjects of this study were 275 teachers who were teaching hearing-impaired students in elementary schools. The participants were selected by a quota sampling method. To collect the data, training-needs questionnaires consisting of 41 knowledge items and 31 performance items were used. The collected data were analyzed using SPSS software with descriptive analyses (frequency and mean) and inferential analyses (one-sample t-test, paired t-test, independent t-test, and Pearson correlation coefficient). The findings of the study indicated that the teachers generally have considerable needs in both the knowledge and the performance domains. In 32 of the 41 knowledge-domain items and 27 of the 31 performance-domain items, the teachers had considerable needs. Quantitatively, the needs in the performance domain were greater than those in the knowledge domain, so they should be treated as the first priority in training these teachers. There was no difference between the level of needs of male and female teachers. There was a significant relationship between the teachers' teaching experience and their knowledge- and performance-domain needs (coefficients of 0.354 and 0.322, respectively). The teachers who had been trained in working with hearing-impaired students expressed more training needs (in both the knowledge and the performance domains).