In the present paper, the analysis of the turning capability of the naval supply vessel presented in Part I (Broglia et al., 2015) is continued with different stern appendages, namely a twin rudder and a centreline skeg. The main purpose of the analysis is to assess the capability of an in-house CFD tool to capture the different manoeuvring characteristics of the ship hulls; the test case is challenging, as the difference between the two configurations lies in the complex flow structure related to rudder-propeller interactions. Moreover, although the twin rudder solution slightly improves the poor course keeping ability of the original vessel, the course stability remains poor and, consequently, large lateral motions and drift angles have to be expected during the manoeuvre. The manoeuvring capabilities of the new configuration are discussed and compared with those of the single rudder configuration, focusing on the nature of the hydrodynamic forces and moments acting on the main hull and appendages during the transient and stabilized phases of the manoeuvre. Emphasis is also given to the different contributions of the propulsion system in the twin rudder configuration, which result from the different rudder-propeller interaction.
Graphics processing units (GPUs) are increasingly common on desktops, servers, and embedded platforms. In this article, we report on new security issues related to CUDA, which is the most widespread platform for GPU computing. In particular, details and proofs-of-concept are provided about novel vulnerabilities to which CUDA architectures are subject. We show how such vulnerabilities can be exploited to cause severe information leakage. As a case study, we experimentally show how to exploit one of these vulnerabilities on a GPU implementation of the AES encryption algorithm. Finally, we also suggest software patches and alternative approaches to tackle the presented vulnerabilities.
Offloading computing to distributed and possibly mobile nodes is increasingly popular thanks to the convenience and availability of cloud resources. However, trusted mobile computing is not presently viable due to a number of issues in both the mobile platform architectures and in the cloud service implementations. The complexity of such systems potentially exposes them to malicious and/or selfish behavior. This chapter describes the state-of-the-art research on theoretical advancements and practical implementations of trusted computing on a mobile cloud. Further, mobile distributed cloud computing security and reliability issues are introduced. The solutions discussed feature different levels of resiliency against malicious and misbehaving nodes.
We analyze the stability of the zero solution to Volterra equations on time scales with respect to two classes of bounded perturbations. We obtain sufficient conditions on the kernel which include some known results for continuous and for discrete equations. In order to check the applicability of these conditions, we apply the theory to a test example.
Assessment of two techniques to merge ground-based and TRMM rainfall measurements: a case study about Brazilian Amazon Rainforest
Mateus Pedro; Borma Laura S; da Silva Ricardo D; Nico Giovanni; Catalao Joao
The availability of accurate rainfall data with high spatial resolution, especially in vast watersheds with a low density of ground measurements, is critical for the planning and management of water resources and can increase the quality of hydrological modeling predictions. In this study, we used two classical methods, optimal interpolation and the successive correction method (SCM), for merging ground measurements and satellite rainfall estimates. Cressman and Barnes schemes have been used in the SCM to define the error covariance matrices. The correction of bias in the satellite rainfall data has been assessed using four different algorithms: (1) mean bias correction, (2) the regression equation, (3) the distribution transformation, and (4) the spatial transformation. The satellite rainfall data were provided by the Tropical Rainfall Measuring Mission (TRMM) over the Brazilian Amazon Rainforest. The performances of the two merging techniques are compared qualitatively, by visual inspection, and quantitatively, by a statistical analysis of data collected from January 1999 to December 2010. The computed statistical indices show that the SCM with the Cressman scheme provides slightly better results.
rainfall interpolation
TRMM
statistical data merging
bias correction
remote sensing
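To make the successive correction idea concrete, here is a minimal sketch of an SCM pass with Cressman weights: a gridded background field (e.g. a satellite estimate) is iteratively corrected toward gauge observations over a schedule of shrinking influence radii. Function names, the nearest-node interpolation, and the radius schedule are illustrative assumptions, not the exact scheme of the study.

```python
import numpy as np

def cressman_weights(dist, radius):
    """Cressman weight: (R^2 - d^2) / (R^2 + d^2) inside the radius, else 0."""
    w = (radius**2 - dist**2) / (radius**2 + dist**2)
    return np.where(dist < radius, w, 0.0)

def successive_correction(grid_xy, background, obs_xy, obs, radii):
    """Correct a background field with point observations (SCM sketch)."""
    analysis = background.copy()
    for R in radii:  # shrinking influence radii: coarse to fine scales
        # observation-minus-analysis increments at the gauge locations
        # (the nearest grid node stands in for interpolation to the gauge)
        incr = []
        for (ox, oy), o in zip(obs_xy, obs):
            k = np.argmin((grid_xy[:, 0] - ox)**2 + (grid_xy[:, 1] - oy)**2)
            incr.append(o - analysis[k])
        incr = np.asarray(incr)
        # distance of every grid node to every observation: shape (n, m)
        d = np.sqrt(((grid_xy[:, None, :] - obs_xy[None, :, :])**2).sum(-1))
        w = cressman_weights(d, R)
        wsum = w.sum(axis=1)
        corr = np.where(wsum > 0,
                        (w * incr).sum(axis=1) / np.maximum(wsum, 1e-12),
                        0.0)
        analysis = analysis + corr
    return analysis
```

A Barnes-style pass would replace the Cressman weight with a Gaussian kernel; successive passes with smaller radii recover progressively finer scales.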
This paper studies the assimilation of precipitable water vapor (PWV), estimated by synthetic aperture radar interferometry, using the three-dimensional variational (3D-Var) system of the Weather Research and Forecasting Data Assimilation (WRFDA) model. The experiment is designed to assess the impact of the PWV assimilation on the hydrometeors and the rainfall predictions during the 12 h after the assimilation time. A methodology to obtain calibrated maps of PWV and to estimate their precision is also presented. The forecasts are compared with GPS estimates of PWV and with rainfall observations from a meteorological radar. Results show that, after data assimilation, there is a correction of the bias in the PWV prediction and an improvement in the prediction of weak to moderate rainfall up to 9 h after the assimilation time.
A new methodology for the mapping of intertidal terrain morphology is presented. It is based on the use of synthetic aperture radar (SAR) images and the temporal correlation between the SAR backscatter intensity and the water level in the intertidal zone. The proposed methodology does not require manual editing and provides a set of geolocated pixels that can be used to generate a digital elevation model of the intertidal zone. The methodology is validated using TerraSAR-X SAR images acquired over the Tagus estuary. It can support the regular updating of intertidal bathymetric models for both flood hazard mitigation and morphodynamic modeling.
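The temporal-correlation principle can be illustrated with a toy per-pixel estimator (names and the scoring rule are illustrative assumptions, not the paper's exact algorithm): a pixel floods whenever the water level exceeds its elevation, so its elevation can be taken as the candidate level that best separates its backscatter time series into wet and dry states.

```python
import numpy as np

def estimate_pixel_elevation(sigma0_series, water_levels, candidates):
    """Pick the candidate elevation whose wet/dry partition of the
    water-level series best separates the backscatter time series."""
    best_h, best_score = None, -np.inf
    for h in candidates:
        wet = water_levels > h
        if wet.all() or (~wet).all():
            continue  # the partition must split the series in two
        # score: separation of mean backscatter between wet and dry epochs
        score = abs(sigma0_series[wet].mean() - sigma0_series[~wet].mean())
        if score > best_score:
            best_h, best_score = h, score
    return best_h
```

Applied pixel by pixel over a geolocated SAR stack, such estimates would yield the set of elevation points from which a digital elevation model can be interpolated.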
Global Navigation Satellite System (GNSS) tomography provides 3-D reconstructions of the atmospheric wet refractivity, which is related to water vapor. A simulated analysis of the integration of Global Positioning System and future Galileo data is presented. The atmospheric refractivity is derived from radiosonde data acquired over the Lisbon area, and the impact of Galileo data on the tomographic reconstruction is assessed. Furthermore, horizontal anomalies are added to a reference vertical profile of atmospheric refractivity to reproduce low-level dry or wet air intrusions, a phenomenon commonly observed in meteorological data acquired by both radiosondes and satellites. The dependence of the tomographic solution on the GNSS network density is also analyzed: better reconstruction capabilities in the lower layers are observed when the network density is increased.
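A minimal sketch of the inversion step at the core of GNSS tomography, assuming the atmosphere has been discretized into voxels and each row of the design matrix holds the path lengths of one slant ray through the voxels. The Tikhonov regularization and all names are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def tomographic_inversion(A, slant_delays, alpha):
    """Regularized least squares: minimize ||A x - d||^2 + alpha ||x||^2.

    A            : (n_rays, n_voxels) matrix of ray path lengths per voxel
    slant_delays : (n_rays,) observed slant wet delays
    alpha        : Tikhonov damping, needed because rays never sample
                   every voxel (the problem is typically ill-posed)
    Returns the (n_voxels,) wet refractivity estimate.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ slant_delays)
```

Adding Galileo rays amounts to appending rows to `A`, which improves the conditioning of `A.T @ A` and hence the reconstruction, consistent with the network-density effect discussed above.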
We study the effects of a controlled gas flow on the dynamics of electrified jets in the electrospinning process. The main idea is to model the air drag effects of the gas flow by using a nonlinear Langevin-like approach. The model is employed to investigate the dynamics of electrified polymer jets at different conditions of air drag force, showing that a controlled gas counterflow can lead to a decrease of the average diameter of electrospun fibers, and potentially to an improvement of the quality of electrospun products. We probe the influence of air drag effects on the bending instabilities of the jet and on its angular fluctuations during the process. The insights provided by this study might prove useful for the design of future electrospinning experiments and polymer nanofiber materials.
A Derivative-Free Riemannian Powell's Method, Minimizing Hartley-Entropy-Based ICA Contrast
Chattopadhyay Amit; Selvan Suviseshamuthu Easter; Amato Umberto
Even though the Hartley-entropy-based contrast function guarantees an unmixing local minimum, the nonsmooth optimization techniques reported to minimize this nondifferentiable function encounter computational bottlenecks. To address this, Powell's derivative-free optimization method has been extended to a Riemannian manifold, namely the oblique manifold, for the recovery of quasi-correlated sources by minimizing this contrast function. The proposed scheme is demonstrated to converge faster than related algorithms in the literature, in addition to yielding impressive source separation results in simulations involving synthetic sources with finite-support distributions and correlated images.
A model is proposed to describe the spike-frequency adaptation observed in many neuronal systems. We assume that adaptation is mainly due to a calcium-activated potassium current, and we consider two coupled stochastic differential equations for which an analytical approach, combined with simulation techniques and numerical methods, allows us to obtain both qualitative and quantitative results about the asymptotic mean firing rate, the mean calcium concentration, and the firing probability density. A related algorithm, based on the hazard rate method, is also devised and described.
Calcium-activated potassium current
Fast-slow analysis
Hazard rate method
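The flavor of two coupled equations of this kind can be sketched with a toy leaky integrate-and-fire neuron coupled to a slow calcium variable that drives an afterhyperpolarization (AHP) current, integrated with the Euler-Maruyama method. All parameter values and names are illustrative assumptions, not those of the paper's model.

```python
import numpy as np

def simulate_adapting_neuron(T=1.0, dt=1e-4, mu=2.0, sigma=0.1,
                             tau_m=0.02, g_ahp=0.5, tau_ca=0.3,
                             d_ca=0.05, v_th=1.0, seed=0):
    """Euler-Maruyama integration of a noisy integrate-and-fire neuron
    with a calcium-activated potassium (AHP) current.  Each spike
    injects calcium c; the term -g_ahp*c*v then opposes depolarization,
    lengthening later interspike intervals (spike-frequency adaptation)."""
    rng = np.random.default_rng(seed)
    v, c, spikes = 0.0, 0.0, []
    for i in range(int(T / dt)):
        dv = ((mu - v - g_ahp * c * v) / tau_m * dt
              + sigma * np.sqrt(dt) * rng.standard_normal())
        dc = (-c / tau_ca) * dt          # calcium decays between spikes
        v, c = v + dv, c + dc
        if v >= v_th:                    # threshold crossing: a spike
            spikes.append(i * dt)
            v = 0.0                      # reset membrane potential
            c += d_ca                    # per-spike calcium influx
    return np.asarray(spikes)
```

Comparing a run with `g_ahp > 0` against one with `g_ahp = 0` shows fewer spikes and progressively longer interspike intervals, the qualitative signature of adaptation that the analytical approach quantifies.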
The likelihood of a subglacial lake beneath the Amundsenisen Plateau at Southern Spitzbergen, Svalbard, pointed out by the flat signal within the Ground Penetrating Radar (GPR) remote survey of the area, is justified here via numerical simulation. This investigation has been developed under the assumption that the icefield thickness does not change on average, as confirmed by recently published physical measurements taken over the past 40 years. As a consequence, we have considered it admissible to assume the temperature and density in-depth profiles, snow and firn layers included, to be stationary. The upper icefield surface and the rocky bed surface are known in detail. The mathematical-numerical model is based on an unsteady Stokes formulation of the ice flow and on a Large Eddy Simulation formulation of the lake water flow. Following the numerical sensitivity results that we presented in a recent issue of this journal, we have here upgraded the model by improving the description of critical aspects of icefield thermo-mechanics, such as the local water release within temperate ice as a strain heating effect and ice sliding on the bedrock. The first issue impacts the ice texture, i.e. its constitutive equation, while the second one drives icefield surging. We have obtained a 13% enhancement of the numerical value of the ice top surface velocity relative to the measured one, and physically consistent numerical ice sliding velocity values at the rocky bottom. Adopting new, physically sound initial fields for the subglacial lake water temperature and velocity, we present the numerical simulation of the whole system, icefield and conjectured subglacial lake, over a time slot of 20,000 d (physical time), by which its evolution trend was clearly captured.
By then, although the maximum value of the water temperature remains rather low, metastability appears to be overcome on more than half of the conjectured basin, with a progressive trend in time in support of the subglacial lake's existence. We stress that the numerical subglacial lake surface converges to the GPR flat signal spot with a tolerance equal to the GPR measuring error. Finally, we observe that the numerical simulation results meet, quantitatively and qualitatively, the fundamental aspects of the conjecture, so that further on-site investigations of the subglacial lake (e.g. drilling operations) appear fully justified.
Arctic
Continuum mechanics
Finite volumes
Phase-change
Subglacial lake
Temperate ice
We derive and test a formal explicit approximated rule for the reconstruction of a damaged inaccessible portion of the boundary of a thin conductor from thermal data collected on the opposite accessible face.
In this paper, to complete the global dynamics of a multi-strain SIS epidemic model, we establish a precise coexistence result for the cases of partially and completely duplicated multiple largest reproduction ratios for this model.
multi-strain SIS epidemic model
global attractivity
Lyapunov function
coexistence
MIPAS on ENVISAT performed almost continuous measurements of atmospheric composition for approximately 10 years, from June 2002 to April 2012. The ESA processor, based on the Optimized Retrieval Model (ORM) algorithm, originally designed for near-real-time analysis and developed by a European consortium led by IFAC, is currently used for the reanalysis of the full MIPAS mission. The maintenance and upgrade of the ESA processor are carried out within the Quality Working Group, where a fruitful collaboration among the Level 1, Level 2, and validation teams can be exploited. This collaboration is essential to pursue improvements in the accuracy of the products and their characterization. This paper describes the most recent upgrades of the ESA processor performed to improve the quality of the ESA products. In particular, the full mission was recently reprocessed with the L1 V7 and L2 V7 processors, which contain significant improvements with respect to the previous version V6; further improvements are in preparation and will be collected in version 8 of the ESA processor. The improvements involve both the L1 and L2 processors, as well as the auxiliary data. Improvements in the L1 processor consist of a correction of the instrumental drift, an improved spike detection algorithm, a new instrument line shape, and the use of the measured daily gain instead of the weekly gain. Improvements in the L2 processor include a different approach for retrieving the atmospheric continuum, the use of an a posteriori regularization with an altitude-dependent constraint, a better approach for handling interfering species, a reduced bias in CFC-11, the handling of horizontal inhomogeneities, and the use of the ECMWF altitude/pressure relation for determining more accurate altitudes.
Improvements in the auxiliary data consist of the use of microwindows with larger information content, a new spectroscopic database, and a diurnally varying climatological dataset.
Furthermore, with each new version additional species are provided, bringing the number of species retrieved by the L2 V8 processor to 20.
The improvements in the V7 products will be revised in the light of the results of the validation with correlative measurements, and a preliminary assessment of the performance of the new V8 processor will be carried out by comparing the first new L2 V8 products with the L2 V7 ones.
As part of the workshop inaugurating the academic year of the Associazione Matematica & Realtà, this presentation offers an audience of upper secondary school mathematics teachers a concretely implemented example of study and research activity in applied mathematics. It clearly outlines the fundamental steps leading from the real-world problem to the proposed numerical solution, showing the need for interdisciplinary knowledge in order to achieve a critical and effective use of mathematical tools. The specific problem considered is the validation of the hypothesis of the existence of a subglacial lake at Spitzbergen, an island of the Svalbard archipelago.
The segmentation of speckled images, such as synthetic aperture radar (SAR) images, is usually recognized as a very complex problem because of speckle, a multiplicative noise that produces granular images. In segmentation approaches based on the level set method, the evolution of the curve is determined by a speed function, which is fundamental to achieving a good segmentation. In this paper we propose a study of a new speed function obtained as the linear combination of the image average intensity and image gradient speed functions. The aim is thus to tune the combined speed in the segmentation process. We segmented synthetic images by tuning the parameters of the new speed function and evaluated the best computed results. We then applied this experimental setup to real SAR images, namely PRecision Images acquired during the European Remote Sensing (ERS) mission and a COSMO-SkyMed image. In particular, we are interested in monitoring complex areas with low illumination and cloud cover, such as coastlines and polar regions. In Earth observation, the acquisition of SAR data is fundamental, since the SAR sensor can operate day and night and in all weather conditions.
Image
Level set method
SAR
Segmentation
Speed evolution
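A minimal sketch of such a combined speed field, blending a region term based on the average intensities inside and outside the zero level set (in the style of Chan-Vese) with an edge term based on the inverse gradient magnitude. The exact definitions of the two terms in the paper may differ; names and the blending rule here are illustrative assumptions.

```python
import numpy as np

def combined_speed(image, phi, alpha):
    """Speed field F = alpha * F_region + (1 - alpha) * F_edge for a
    level set function phi (negative inside the evolving contour)."""
    inside, outside = phi < 0, phi >= 0
    c_in = image[inside].mean() if inside.any() else 0.0
    c_out = image[outside].mean() if outside.any() else 0.0
    # region term: pushes each pixel toward the closer of the two means
    f_region = (image - c_out) ** 2 - (image - c_in) ** 2
    # edge term: small speed on strong gradients, so the front stops at edges
    gy, gx = np.gradient(image.astype(float))
    f_edge = 1.0 / (1.0 + gx**2 + gy**2)
    return alpha * f_region + (1.0 - alpha) * f_edge
```

Tuning `alpha` trades the robustness of the region term against speckle for the localization accuracy of the edge term, which is the balance studied on the synthetic and SAR test images.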