Objective: Drug delivery from a drug-loaded device into an adjacent tissue is a complex process involving drug transport through diffusion and advection, coupled with drug binding kinetics responsible for drug uptake in the tissue. This work presents a theoretical model to predict drug delivery from a device into a multilayer tissue, assuming linear reversible drug binding in the tissue layers. Methods: The governing mass conservation equations based on diffusion, advection and drug binding in a multilayer cylindrical geometry are written and solved using Laplace transformation. The model is used to understand the impact of various non-dimensional parameters on the bound and unbound drug concentrations as functions of time. Results: Good agreement with special cases considered in past work is demonstrated. The effect of forward and reverse binding reaction rates on the multilayer drug binding process is studied in detail. The effect of the nature of the external boundary condition on drug binding and drug loss is also studied. For typical parameter values, results indicate that only a small fraction of the delivered drug binds in the tissue. Additionally, the amount of bound drug rises rapidly with time due to early dominance of the forward reaction, reaches a maximum and then decays due to the reverse reaction. Conclusions: The general model presented here can account for other possible effects such as drug absorption within the device. Besides generalizing past work on drug delivery modeling, this work also offers analytical tools to understand and optimize practical drug delivery devices.
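A minimal illustrative sketch (not taken from the paper) of the type of governing system described above, written for a single tissue layer in radial cylindrical coordinates, with unbound concentration c, bound concentration b, diffusivity D, radial advection velocity v, and forward/reverse binding rate constants k_f and k_r:

```latex
% Illustrative diffusion-advection-reversible-binding system (single layer, radial coordinate r)
\begin{align}
\frac{\partial c}{\partial t} &= \frac{D}{r}\,\frac{\partial}{\partial r}\!\left(r\,\frac{\partial c}{\partial r}\right)
  - v\,\frac{\partial c}{\partial r} - k_f\,c + k_r\,b,\\[2pt]
\frac{\partial b}{\partial t} &= k_f\,c - k_r\,b .
\end{align}
```

The model in the paper couples equations of this kind across multiple tissue layers and solves them by Laplace transformation.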
drug delivery
linear reversible drug binding
theoretical modeling
Laplace transformation
Layer-by-layer assembly of nanotheranostic particles for simultaneous delivery of docetaxel and doxorubicin to target osteosarcoma
Desmond L.
;
Margini S.
;
Barchiesi E.
;
Pontrelli G.
;
Phan A. N.
;
Gentile P.
Osteosarcoma (OS) is a rare form of primary bone cancer, impacting approximately 3.4 × 10^6 individuals worldwide each year, primarily afflicting children. Given the limitations of existing cancer therapies, the emergence of nanotheranostic platforms has generated considerable research interest in recent decades. These platforms seamlessly integrate the therapeutic potential of drug compounds with the diagnostic capabilities of imaging probes within a single construct. This innovation has opened avenues for enhanced drug delivery to targeted sites while concurrently enabling real-time monitoring of the vehicle's trajectory. In this study, we developed a nanotheranostic system employing the layer-by-layer (LbL) technique on a core containing doxorubicin (DOXO) and in-house synthesized carbon quantum dots. By utilizing chitosan and chondroitin sulfate as polyelectrolytes, we constructed a multilayered coating to encapsulate DOXO and docetaxel, achieving a coordinated co-delivery of both drugs. The LbL-functionalized nanoparticles exhibited an approximate size of 150 nm, manifesting a predominantly uniform and spherical morphology, with an encapsulation efficiency of 48% for both drugs. The presence of seven layers in these systems facilitated controlled drug release over time, as evidenced by in vitro release tests. Finally, the impact of the LbL-functionalized nanoparticles was evaluated on U2OS and Saos-2 osteosarcoma cells. The synergistic effect of the two drugs was found to be crucial in inducing cell death, particularly in Saos-2 cells treated with nanoparticles at concentrations higher than 10 μg/ml. Transmission electron microscopy analysis confirmed the internalization of the nanoparticles into both cell types through endocytic mechanisms, revealing an underlying mechanism of necrosis-induced cell death.
drug delivery, mathematical modelling, osteosarcoma
The aim of this paper is to describe a Matlab package for computing the simultaneous Gaussian quadrature rules associated with a variety of multiple orthogonal polynomials. Multiple orthogonal polynomials can be considered as a generalization of classical orthogonal polynomials, satisfying orthogonality constraints with respect to r different measures, with r ≥ 2. Moreover, they satisfy (r+2)-term recurrence relations. In this manuscript, without loss of generality, r is considered equal to 2. The so-called simultaneous Gaussian quadrature rules associated with multiple orthogonal polynomials can be computed by solving a banded lower Hessenberg eigenvalue problem. Unfortunately, computing the eigendecomposition of such a matrix turns out to be strongly ill-conditioned, and the Matlab function balance.m does not improve the condition of the eigenvalue problem. Therefore, most procedures for computing simultaneous Gaussian quadrature rules are implemented with variable precision arithmetic. Here, we propose a Matlab package that allows one to reliably compute the simultaneous Gaussian quadrature rules in floating point arithmetic. It makes use of a variant of a new balancing procedure, recently developed by the authors of the present manuscript, that drastically reduces the condition of the Hessenberg eigenvalue problem.
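For orientation only, the sketch below shows the classical single-measure analogue of the eigenvalue approach mentioned above: in the Golub-Welsch algorithm, Gauss-Legendre nodes are the eigenvalues of the symmetric tridiagonal Jacobi matrix built from the three-term recurrence coefficients, and the weights come from the first components of the eigenvectors. The paper generalizes this idea to a banded lower Hessenberg matrix (r = 2), which is where the conditioning and balancing issues arise; the code below is not the package being described.

```python
import numpy as np

def gauss_legendre_golub_welsch(n):
    """Classical Golub-Welsch: nodes and weights of the n-point Gauss-Legendre rule.

    Single-measure (r = 1) illustration of the eigenvalue-based approach that the
    paper extends to simultaneous rules via a banded lower Hessenberg matrix.
    """
    k = np.arange(1, n)
    beta = k / np.sqrt(4.0 * k**2 - 1.0)        # off-diagonal recurrence coefficients
    J = np.diag(beta, -1) + np.diag(beta, 1)    # symmetric tridiagonal Jacobi matrix
    nodes, V = np.linalg.eigh(J)                # eigenvalues are the quadrature nodes
    weights = 2.0 * V[0, :]**2                  # mu_0 = 2 for the Legendre weight on [-1, 1]
    return nodes, weights

nodes, weights = gauss_legendre_golub_welsch(5)
print(np.sum(weights * nodes**4))               # should be close to 2/5, the integral of x^4
```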
In this paper, we derive a new method to compute the nodes and weights of simultaneous n-point Gaussian quadrature rules. The method is based on the eigendecomposition of the banded lower Hessenberg matrix that contains the coefficients of the recurrence relations for the corresponding multiple orthogonal polynomials. The novelty of the approach is that it uses the property of total nonnegativity of this matrix associated with the particular considered multiple orthogonal polynomials, in order to compute its eigenvalues and eigenvectors in a numerically stable manner. The overall complexity of the computation of all the nodes and weights is O(n^2).
Gaussian quadrature, Multiple orthogonal polynomials, Total nonnegativity, Numerical stability
The present work focuses on a non-local integro-differential model reproducing Cancer-on-chip experiments where tumor cells, treated with chemotherapy drugs, secrete chemical signals stimulating the immune response. The reliability of the model in reproducing the phenomenon of interest is investigated through a global, rather than local, sensitivity analysis, in order to obtain global information about the role of the parameters and to examine potential non-linear effects in greater detail. Focusing on a region in the parameter space, the effect of 13 model parameters on the in silico outcome is investigated by considering 11 different target outputs, properly selected to monitor the spatial distribution and the dynamics of immune cells along the period of observation. In order to cope with the large number of model parameters to be investigated and the computational cost of each numerical simulation, a two-step global sensitivity analysis is performed. First, the Morris screening method is applied to rank the effect of the 13 model parameters on each target output, and it emerges that all the target outputs are mainly affected by the same 6 parameters. The extended Fourier Amplitude Sensitivity Test (eFAST) method is then used to quantify the role of these 6 parameters. As a result, the proposed analysis highlights the feasibility of the considered parameter space and indicates that the most relevant parameters are those related to the chemical field and to cell-substrate adhesion. In turn, it suggests how to possibly improve the model description as well as the calibration procedure, in order to better capture the observed phenomena and, at the same time, reduce the complexity of the simulation algorithm. On the one hand, the model could be simplified by neglecting cell-cell alignment effects unless clear empirical evidence of their importance emerges. On the other hand, the best way to increase the accuracy and reliability of our model predictions would be to have experimental data/information to reduce the uncertainty of the most relevant parameters.
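A library-free toy sketch of the Morris elementary-effects idea used in the screening step above (this is a generic illustration with a made-up model and bounds, not the authors' implementation): one-at-a-time perturbations along random trajectories yield, for each parameter, a mean absolute effect (overall influence) and a standard deviation (non-linearity/interactions), which are then used to rank the parameters.

```python
import numpy as np

def morris_screening(model, bounds, n_traj=50, delta=0.25, seed=0):
    """Toy Morris elementary-effects screening for a scalar-output model.

    bounds: sequence of (low, high) ranges for the k parameters;
    model: callable mapping a length-k parameter vector to a scalar output.
    Returns mu_star (mean |elementary effect|) and sigma (std of effects).
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    k = lo.size
    effects = np.zeros((n_traj, k))
    for t in range(n_traj):
        x = rng.uniform(0.0, 1.0 - delta, size=k)        # base point in the unit cube
        y0 = model(lo + x * (hi - lo))
        for i in rng.permutation(k):                      # one-at-a-time moves
            x[i] += delta
            y1 = model(lo + x * (hi - lo))
            effects[t, i] = (y1 - y0) / delta             # elementary effect of parameter i
            y0 = y1
    return np.abs(effects).mean(axis=0), effects.std(axis=0)

# Example: the third parameter is inert, so its mu_star should be ~0
mu_star, sigma = morris_screening(lambda p: p[0]**2 + 2.0 * p[1], [(0, 1), (0, 1), (0, 1)])
print(mu_star, sigma)
```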
Cancer-on-chip, Global sensitivity analysis, Discrete and continuous mathematical model
We address the problem of user fast revocation in the lattice-based Ciphertext Policy Attribute-Based Encryption (CP-ABE) by extending the scheme originally introduced by Zhang and Zhang [Zhang J, Zhang Z. A ciphertext policy attribute-based encryption scheme without pairings. In: International Conference on Information Security and Cryptology. Springer; 2011. p. 324–40. doi: https://doi.org/10.1007/978-3-642-34704-7_23.]. While a lot of work exists on the construction of revocable schemes for CP-ABE based on pairings, works based on lattices are not so common, and – to the best of our knowledge – we introduce the first server-aided revocation scheme in a lattice-based CP-ABE scheme, hence being embedded in a post-quantum secure environment. In particular, we rely on semi-trusted “mediators” to provide a multi-step decryption capable of handling mediation without re-encryption. We comment on the scheme and its application, and we provide performance experiments on a prototype implementation in the Attribute-Based Encryption spin-off library of Palisade to evaluate the overhead compared with the original scheme.
The growth potential of a crypto project, typically sustained by an associated cryptocurrency, can be measured by the use cases spurred by the underlying technology. However, these projects are implemented through decentralized applications, with a weak (if any) feedback scheme. Hence, a metric that is widely used as a proxy for the healthiness of such projects is the number of transactions and the related volumes. Nevertheless, such a metric can be subject to manipulation, and the crypto market, being an unregulated one, magnifies such a risk. To address the cited gap, in this paper we design a comprehensive methodology to process large cryptocurrency transaction graphs that, after clustering user addresses of interest, derives a compact representation of the network that highlights interactions among clusters. The analysis of these interactions provides insights into the strength of the project. To show the quality and viability of our solution, we bring forward a use case centered on Polkadot. The Polkadot network, a cutting-edge cryptocurrency platform, has gained significant attention in the digital currency landscape due to its pioneering approach to interoperability and scalability. However, little is known about how many and to what extent its wide range of enabled use cases have been adopted by end-users so far. Answering this type of question means mapping Polkadot (or any analyzed crypto project) on a palette that ranges from a thriving ecosystem to a speculative coin without compelling use cases. Our findings, rooted in extensive experimental results (we have parsed more than 12.5 million blocks), demonstrate that crypto exchanges exert considerable influence on the Polkadot network, owning nearly 40% of all addresses in the ledger and absorbing at least 80% of all transactions. In addition, the high volume of inter-exchange transactions (more than 20%) underscores the strong interconnections among just a couple of prominent exchanges, prompting further investigations into the behavior of these actors to uncover potential unethical activities, such as wash trading. These results are a testament to the quality and viability of the proposed solution which, while characterized by a high level of scalability and adaptability, is at the same time immune from the drawbacks of currently used metrics.
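As an illustration of the compaction step described above (the addresses, clusters, and amounts below are invented, and this is not the authors' pipeline), once addresses have been assigned to clusters, per-address transactions can be aggregated into a cluster-level graph whose edge weights are the total transferred volumes:

```python
from collections import defaultdict

# Hypothetical address -> cluster labels, e.g. produced by a prior clustering step
cluster_of = {"addr1": "exchange_A", "addr2": "exchange_A",
              "addr3": "exchange_B", "addr4": "retail"}

# Hypothetical transactions: (sender, receiver, amount)
transactions = [("addr1", "addr3", 120.0),
                ("addr2", "addr3", 30.0),
                ("addr3", "addr4", 5.0)]

# Collapse the address-level graph into a compact cluster-level graph
cluster_flows = defaultdict(float)
for sender, receiver, amount in transactions:
    edge = (cluster_of[sender], cluster_of[receiver])
    if edge[0] != edge[1]:                     # keep inter-cluster flows only
        cluster_flows[edge] += amount

for (src, dst), volume in cluster_flows.items():
    print(f"{src} -> {dst}: {volume}")         # e.g. exchange_A -> exchange_B: 150.0
```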
2023 has replaced 2016 as the warmest year on record since 1850, bringing us closer to the 1.5 °C limit set by the Paris Agreement. High temperatures increase the likelihood of extreme events, with heatwaves and droughts being prominent among them. Climate change has led to a rise in the frequency of droughts, affecting countries that never experienced them before. Assessing drought events is crucial, and satellite data can provide significant assistance due to their large spatial coverage and continuous data supply. Based on the Infrared Atmospheric Sounding Interferometer (IASI), we designed a new Water Deficit Index (wdi) that we have already proven useful in detecting drought events. Unfortunately, infrared sensors such as IASI cannot penetrate thick cloud layers, so observations are blind to surface emission under cloudy conditions, resulting in sparse and inhomogeneously distributed data over a given spatial region. To reconstruct a model of the field of interest for the entire surface on a regular grid mesh, interpolation techniques and spatial statistics able to deal with huge data sets are mandatory. In this paper, we exploited the capability of two machine learning algorithms, i.e., gradient boosting and random forest, to convert IASI L2 scattered data to a regular L3 grid. Specifically, we trained a model that can predict the wdi over a 0.05° regular grid, using data from other sensors as a proxy together with vegetation products, soil indices, and territorial and geographic information as covariates. We applied the methodology over the Po Valley region, which experienced an intense drought in the last three years causing high vegetation and soil water stress. Overall, we found that these methods can yield good results and allow simultaneous regular-grid conversion and downscaling.
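A hedged sketch of the scattered-to-grid strategy described above, not the authors' code: a random forest is trained on scattered samples with spatial coordinates and an auxiliary covariate, then evaluated on a regular 0.05° grid. The bounding box, covariate, and target below are synthetic placeholders for the IASI data and ancillary products mentioned in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Scattered "L2-like" training samples: longitude, latitude and one auxiliary covariate
n = 2000
lon, lat = rng.uniform(7, 13, n), rng.uniform(44, 46, n)      # rough Po Valley bounding box
covariate = rng.normal(size=n)                                 # stand-in for a vegetation/soil index
target = np.sin(lon) + 0.5 * lat + 0.3 * covariate             # synthetic stand-in for the wdi

X = np.column_stack([lon, lat, covariate])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, target)

# Regular 0.05-degree grid over the same box; in practice the grid covariates
# would come from other sensors and ancillary products
glon, glat = np.meshgrid(np.arange(7, 13, 0.05), np.arange(44, 46, 0.05))
grid_cov = np.zeros(glon.size)                                 # placeholder covariate field
grid_X = np.column_stack([glon.ravel(), glat.ravel(), grid_cov])
wdi_grid = model.predict(grid_X).reshape(glon.shape)           # gridded, L3-style field
print(wdi_grid.shape)
```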
Infrared radiative transfer, Vegetation and soil water stress, Drought, IASI, Surface Temperature, Dew point temperature, Machine Learning, Downscaling
Clustering univariate functional data is mostly based on projecting the curves onto an adequate basis and applying some distance or similarity measure to the coefficients. The basis functions should be chosen depending on the features of the function being estimated. Commonly used bases are Fourier, polynomial and spline bases, but these may not be well suited for curves that exhibit inhomogeneous behavior. Wavelets, on the contrary, are well suited for identifying highly discriminant local time and scale features, and are able to adapt to the smoothness of the data. In recent years, a few methods relying on wavelet-based similarity measures have been proposed for clustering curves observed on equidistant points. In this work, we present a non-equidistant-design, wavelet-based method for non-parametrically estimating and clustering a large number of curves. The method consists of several crucial stages: fitting the functional data by non-equispaced-design wavelet regression, screening out nearly flat curves, denoising the remaining curves with wavelet thresholding, and finally clustering the denoised curves. Simulation studies compare our proposed method with some other functional clustering methods. The method is applied to clustering some real functional data profiles.
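A minimal sketch of the denoise-then-cluster stages described above, under simplifying assumptions (equispaced sampling and the PyWavelets/scikit-learn APIs; the paper itself targets non-equispaced designs): each curve is soft-thresholded in the wavelet domain and the curves are then clustered on their denoised coefficients.

```python
import numpy as np
import pywt
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 128)

# Two groups of synthetic noisy curves, stand-ins for the observed profiles
curves = np.vstack([np.sin(2 * np.pi * t) + 0.3 * rng.normal(size=t.size) for _ in range(10)] +
                   [np.sign(t - 0.5) + 0.3 * rng.normal(size=t.size) for _ in range(10)])

def denoised_coeffs(y, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients (universal threshold) and return them flattened."""
    coeffs = pywt.wavedec(y, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745              # noise estimate from the finest scale
    thr = sigma * np.sqrt(2 * np.log(y.size))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return np.concatenate(coeffs)

features = np.vstack([denoised_coeffs(y) for y in curves])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels)    # the two synthetic groups should separate into the two clusters
```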
The Mediterranean basin is one of those areas where the impact of climate change is showing its most alarming consequences. Many regions in this area, both woodlands and croplands, have been suffering from droughts and water deficits due to the intense summer heatwaves of the last decades. Monitoring these phenomena is key to understanding how they are evolving and what could be done to mitigate their effects. Emissivity is a useful parameter in identifying the presence (or absence) of water. Surface and dew point temperatures are extremely useful not only in measuring the intensity of the heatwave but also in accounting for how much water content the surface is losing as humidity to the atmosphere. This paper presents a climatological study of Southern Italy’s water loss for the period 2015-2023 based on daily observations acquired by the Infrared Atmospheric Sounding Interferometer (IASI), mounted on top of EUMETSAT’s MetOp satellites. The Water Deficit Index (WDI) and the Emissivity Contrast Index (ECI) were estimated: monthly averages of each quantity were produced for the period of interest. Moreover, a validation with in situ measurements was conducted to better understand how these heatwave-induced droughts have been impacting the surface on different types of land covers.
climate change, remote sensing, droughts, heat waves, vegetation, water deficit, emissivity, land cover
The brain-related phenotypes observed in 22q11.2 deletion syndrome (22q11.2DS) patients are highly variable, and their origin is poorly understood. Changes in brain metabolism might contribute to these phenotypes, as many of the deleted genes are involved in metabolic processes, but this possibility has not been explored. This study shows for the first time that Tbx1 haploinsufficiency causes brain metabolic imbalance. We studied two mouse models of 22q11.2DS using mass spectrometry, nuclear magnetic resonance spectroscopy, and transcriptomics. We found that Tbx1 +/- mice and Df1/+ mice, which carry a multigenic deletion that includes Tbx1, have elevated brain methylmalonic acid, which is highly brain-toxic. Focusing on Tbx1 mutants, we found that they also have a more general brain metabolomic imbalance that affects key metabolic pathways, such as glutamine-glutamate and fatty acid metabolism. We provide transcriptomic evidence of a genotype-vitamin B12 treatment interaction. In addition, vitamin B12 treatment rescued a behavioural anomaly in Tbx1 +/- mice. Further studies will be required to establish whether the specific metabolites affected by Tbx1 haploinsufficiency are potential biomarkers of brain disease status in 22q11.2DS patients.
Modern ICT infrastructures, i.e., cyber-physical systems and critical infrastructures relying on interconnected IT (Information Technology) and OT (Operational Technology) components and (sub-)systems, raise complex challenges in tackling security and safety issues. Many security controls and mechanisms are nowadays available to solve specific security needs but, when dealing with very complex and heterogeneous systems, a methodology is needed on top of the selection of individual security controls: one that allows the designer/maintainer to drive her/his choices so as to build and keep the system secure as a whole, leaving the choice of the specific security controls to the last step of the system design/development. This paper aims at providing a comprehensive methodological approach to design and preliminarily implement an Open Platform Architecture (OPA) to secure the cyber-physical systems of critical infrastructures. Here, the OPA depicts how an already existing or under-design target system (TS) can be equipped with technologies that are modern or currently under development, to monitor and timely detect possibly dangerous situations and to react in an automatic way by putting in place suitable countermeasures. A multifaceted use case (UC), able to illustrate the OPA from the security and safety requirements to the fully designed system, will be developed step by step to show the feasibility and the effectiveness of the proposed methodology.
Cybersecurity
Monitoring
Firewalling
Rule distribution
Slow DoS attack
Denial of service
Industrial security
Critical infrastructure protection
Security investments
Current research directions indicate that vehicles with autonomous capabilities will become increasingly common in traffic. Starting from the data analyzed in R. E. Stern et al. (2018), this paper shows the benefits of the traffic control exerted by a single autonomous vehicle circulating on a ring track together with more than 20 human-driven vehicles. Considering different traffic experiments with pronounced stop-and-go waves and using a general microscopic model for emissions, it is first shown that emissions are reduced by about 25%. Then, concentrations of pollutants at street level are obtained by numerically solving a system of differential equations with source terms derived from the emission model. The results show that ozone and nitrogen oxides can decrease, depending on the analyzed experiment, by about 10% and 30%, respectively. Such findings suggest possible management strategies for traffic control, with emphasis on the environmental impact of vehicular flows.
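To make the second step concrete, here is a deliberately crude toy sketch (not the paper's model; all rate constants and emission values are invented): street-level pollutant concentrations evolve according to ODEs whose source terms come from a traffic emission model, and a reduction of the emission source propagates to the concentrations.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative emission source terms derived from a traffic/emission model (placeholder values)
E_NOX, E_VOC = 2.0, 1.0                        # emission rates, arbitrary units per hour

def street_box_model(t, y, e_nox):
    """Toy street-level box model for NOx and O3 with simple production/loss terms."""
    nox, o3 = y
    d_nox = e_nox - 0.5 * nox                                      # traffic emission minus removal/dilution
    d_o3 = 0.3 * E_VOC - 0.2 * o3 - 0.1 * nox * o3 / (1.0 + nox)   # crude NOx titration of ozone
    return [d_nox, d_o3]

# Baseline scenario vs. a scenario with 25% lower NOx emissions (smoother traffic)
for label, e_nox in [("baseline", E_NOX), ("controlled", 0.75 * E_NOX)]:
    sol = solve_ivp(street_box_model, (0.0, 24.0), [1.0, 0.5], args=(e_nox,))
    print(label, sol.y[:, -1])                                     # final NOx and O3 concentrations
```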
road traffic modeling, traffic waves, emissions, nitrogen oxides, ozone production
The mathematical modeling of various real-life phenomena often leads to the formulation of positive and conservative Production-Destruction differential Systems (PDS). Here we address a general finite-horizon Optimal Control Problem (OCP) for PDS and delve into the properties of its continuous-time solution. Leveraging the dynamic programming approach, we recast the OCP as a backward-in-time Hamilton-Jacobi-Bellman (HJB) equation, whose unique viscosity solution corresponds to the value function [1]. We then propose a parallel-in-space semi-Lagrangian approximation scheme for the HJB equation [3] and derive the optimal control in feedback form. Finally, to reconstruct the optimal trajectories of the controlled PDS, we employ unconditionally positive and conservative modified Patankar linear multistep methods [2]. [1] CRANDALL, M. G.; ISHII, H.; LIONS, P.-L. User's guide to viscosity solutions of second order partial differential equations. Bull. Amer. Math. Soc., 1992, 27.1: 1-67. [2] IZZO, G.; MESSINA, E.; PEZZELLA, M.; VECCHIO, A. Title to be confirmed. In preparation. [3] FALCONE, M.; FERRETTI, R. Semi-Lagrangian approximation schemes for linear and Hamilton-Jacobi equations. SIAM, 2013.
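For orientation only, the generic form of the backward-in-time HJB equation referred to above (standard notation, not taken from the abstract): for controlled dynamics x' = f(x, a), running cost l, terminal cost g, control set A and horizon T, the value function v satisfies

```latex
% Generic finite-horizon HJB equation for the value function v(x,t)
\begin{equation}
-\,\partial_t v(x,t) \;+\; \sup_{a \in A}\bigl\{ -f(x,a)\cdot \nabla v(x,t) \;-\; \ell(x,a) \bigr\} \;=\; 0,
\qquad v(x,T) = g(x),
\end{equation}
```

which is discretized backward in time by the semi-Lagrangian scheme and then used to extract the control in feedback form.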
Investigating the impact of the regularization parameter on EEG resting-state source reconstruction and functional connectivity using real and simulated data
Leone F.
;
Caporali A.
;
Pascarella A.
;
Perciballi C.
;
Maddaluno O.
;
Basti A.
;
Belardinelli P.
;
Marzetti L.
;
Di Lorenzo G.
;
Betti V.
Accurate EEG source localization is crucial for mapping resting-state network dynamics and plays a key role in estimating source-level functional connectivity. However, EEG source estimation techniques encounter numerous methodological challenges, a key one being the selection of the regularization parameter in minimum norm estimation. This choice is particularly intricate because the optimal amount of regularization for EEG source estimation may not align with the requirements of EEG connectivity analysis, highlighting a nuanced trade-off. In this study, we employed a methodological approach to determine the regularization coefficient that yields the most effective reconstruction outcomes across all simulations involving varying signal-to-noise ratios for synthetic EEG signals. To this aim, we considered three resting-state networks: the Motor Network, the Visual Network, and the Dorsal Attention Network. The performance was assessed at different regularization parameters using three metrics: the Region Localization Error, source extension, and source fragmentation. The results were validated using real functional connectivity data. We show that the best estimate of functional connectivity is obtained using a regularization parameter of 10^-2, while 10^-1 is to be preferred when only source localization is targeted.
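A minimal numpy sketch of where the regularization parameter enters the minimum norm estimate (the textbook whitened-noise formula, not necessarily the exact pipeline used in the study); the lead field, data, and source index below are synthetic:

```python
import numpy as np

def minimum_norm_estimate(L, y, lam):
    """Minimum norm source estimate with Tikhonov regularization.

    L   : (n_sensors, n_sources) lead field matrix
    y   : (n_sensors,) or (n_sensors, n_times) EEG data, assumed noise-whitened
    lam : regularization parameter (e.g. 1e-2 vs 1e-1, as compared in the study)
    """
    n_sensors = L.shape[0]
    K = L.T @ np.linalg.inv(L @ L.T + lam * np.eye(n_sensors))   # regularized inverse operator
    return K @ y

# Toy example: random lead field, one active source, mild sensor noise
rng = np.random.default_rng(0)
L = rng.normal(size=(32, 500))
x_true = np.zeros(500)
x_true[123] = 1.0
y = L @ x_true + 0.05 * rng.normal(size=32)

for lam in (1e-2, 1e-1):
    x_hat = minimum_norm_estimate(L, y, lam)
    print(lam, int(np.argmax(np.abs(x_hat))))    # peak source index for each regularization level
```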
Introduction: The formation and functioning of neural networks hinge critically on the balance between structurally homologous areas in the hemispheres. This balance, reflecting their physiological relationship, is fundamental for learning processes. In our study, we explore this functional homology in the resting state, employing a complexity measure that accounts for the temporal patterns in neurodynamics. Methods: We used the Normalized Compression Distance (NCD) to assess the similarity over time, i.e., the neurodynamics, of the somatosensory areas associated with hand perception (S1). This assessment was conducted using magnetoencephalography (MEG) in conjunction with Functional Source Separation (FSS). Our primary hypothesis posited that neurodynamic similarity would be more pronounced within individual subjects than across different individuals. Additionally, we investigated whether this similarity is influenced by hemisphere or age at the population level. Results: Our findings validate the hypothesis, indicating that NCD is a robust tool for capturing balanced functional homology between hemispheric regions. Notably, we observed a higher degree of neurodynamic similarity in the population within the left hemisphere compared to the right. Also, we found that intra-subject functional homology displayed greater variability in older individuals than in younger ones. Discussion: Our approach could be instrumental in investigating chronic neurological conditions marked by imbalances in brain activity, such as depression, addiction, fatigue, and epilepsy. It holds potential for aiding in the development of new therapeutic strategies tailored to these complex conditions, though further research is needed to fully realize this potential.
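The Normalized Compression Distance used above has a standard definition, NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)) for a compressor C; a minimal sketch with a generic compressor (zlib, purely for illustration; the study's compressor and signal preprocessing may differ):

```python
import zlib
import numpy as np

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance with zlib as the compressor C."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

def to_bytes(signal: np.ndarray, n_levels: int = 64) -> bytes:
    """Quantize a 1-D time course into a byte string so that it can be compressed."""
    s = (signal - signal.min()) / (np.ptp(signal) + 1e-12)
    return np.floor(s * (n_levels - 1)).astype(np.uint8).tobytes()

# Toy example: two similar time courses versus a dissimilar one
t = np.linspace(0, 1, 1000)
a, b = np.sin(10 * t), np.sin(10 * t + 0.1)
c = np.random.default_rng(0).normal(size=t.size)
print(ncd(to_bytes(a), to_bytes(b)), ncd(to_bytes(a), to_bytes(c)))   # first value should be smaller
```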
functional source separation
neurodynamics
normalized compression distance
resting state
temporal course of the neuronal electrical activity
In recent years, the use of techniques based on electromagnetic radiation as an investigative tool in the agri-food industry has grown considerably, and among these, the application of imaging and THz spectroscopy has gained significance in the field of food quality control. This study presents the development of an experimental setup operating in transmission mode within the frequency range of 18 to 40 GHz, specifically designed for assessing various quality parameters of hazelnuts. The THz measurements were conducted to distinguish between healthy and rotten hazelnut samples. Two different data analysis techniques were employed and compared: a traditional approach based on data matrix manipulation and curve fitting for parameter extrapolation, and the use of a Self-Organizing Map (SOM), implemented with a Kohonen neural network, which is recognized for its efficacy in analyzing THz measurement data. The classification of hazelnuts based on their quality was performed using these techniques. The results obtained from the comparative analysis of coding efforts, analysis times, and outcomes shed light on the potential applications of each method. The findings demonstrate that THz spectroscopy is an effective technique for quality assessment of hazelnuts, and this research serves to clarify the suitability of each analysis technique.
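For context, a tiny self-contained sketch of the Kohonen self-organizing map idea mentioned above (a pure-numpy toy on synthetic data, not the authors' implementation): input spectra are mapped onto a small grid of prototype vectors that self-organize during training and can afterwards be labeled for classification.

```python
import numpy as np

def train_som(data, grid=(4, 4), n_iter=2000, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal Kohonen SOM; returns prototypes of shape (grid_x, grid_y, n_features)."""
    rng = np.random.default_rng(seed)
    gx, gy = grid
    W = rng.normal(size=(gx, gy, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy), indexing="ij"), axis=-1)
    for it in range(n_iter):
        x = data[rng.integers(len(data))]
        # Best-matching unit: prototype closest to the presented sample
        bmu = np.unravel_index(np.argmin(((W - x) ** 2).sum(axis=2)), (gx, gy))
        # Linearly decaying learning rate and neighbourhood radius
        frac = it / n_iter
        lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 1e-3
        dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
        h = np.exp(-dist2 / (2 * sigma**2))[..., None]           # neighbourhood function
        W += lr * h * (x - W)
    return W

# Two synthetic "spectral" classes, stand-ins for healthy vs. rotten samples
rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 0.1, size=(50, 20))
rotten = rng.normal(1.0, 0.1, size=(50, 20))
W = train_som(np.vstack([healthy, rotten]))
print(W.shape)   # (4, 4, 20): prototypes to be labeled, e.g. by majority vote of mapped samples
```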
A thorough experimental assessment of THz-TDS plasma diagnostic techniques for nuclear fusion applications
Teka G. G.
;
Peng K.
;
Alonzo M.
;
Bombarda F.
;
Koch-Dandolo C. L.
;
Senni L.
;
Taschin A.
;
Zerbini M.
In this paper, the study of a plasma diagnostic system based on the THz time domain spectroscopy technique is presented. Such a system could potentially probe, in a single measurement, a large part of the electromagnetic spectrum currently covered by several other diagnostics. This feature, together with the basic requirements for plasma diagnostics in nuclear fusion experiments, such as robustness and harsh-environment applicability, as well as durability and low maintenance, makes the diagnostic of great interest. A conceptual design of the THz-TDS diagnostic has been developed, starting from the well-established classical microwave and far infrared plasma diagnostics landscape. The physical constraints and required instrumental characteristics have been studied and are described in detail here, together with the solutions available for each type of plasma measurement. Specific laboratory tests of the different experimental configurations have been carried out, evaluating the capacity and potential of the novel diagnostic, together with the instrumental constraints, within the diagnostic parameter space.
The project of the Visible Spectroscopy diagnostics for the Zeff radial profile measurement and for the divertor visible imaging spectroscopy, designed for the new tokamak DTT (Divertor Tokamak Test), is presented. To deal with the geometrical constraints of DTT and to minimize the diagnostics volume inside the access port, an integrated and compact solution hosting the two systems has been proposed. The Zeff radial profile will be evaluated from the Bremsstrahlung radiation measurement in the visible spectral range, acquiring light along ten Lines of Sight (LoS) in the upper part of the poloidal plane. The plasma emission will be focused on optical fibers, which will carry it to the spectroscopy laboratory. A second system, with a single toroidal LoS crossing the plasma centre and lying on the equatorial plane, will measure the average Zeff on a longer path, minimizing the incidental continuum spectrum contaminations by lines/bands emitted from the plasma edge. The divertor imaging system is designed to measure impurity and main gas influxes, to monitor the plasma position and the kinetics of impurities, and to follow the plasma detachment evolution. The project aims at obtaining the maximum coverage of the divertor region. The collected light can be shared among different spectrometers and interferential filter devices placed outside the torus hall, so that their setup can be easily changed. The system is composed of two telescopes, an upper and a lower one, allowing both a perpendicular and a tangential view of the DTT divertor region. This diagnostic offers a unique and compact solution designed to cope with the demanding constraints of this next-generation tokamak fusion device, integrating essential tools for wide-ranging impurity characterization and versatile investigation of divertor physics.
The study of materials for space exploration is one of the most interesting targets of international space agencies. An essential tool for realizing light junctions is epoxy adhesive (EA), which provides an elastic and robust material with a complex mesh of polymeric chains and crosslinks. In this work, a study of the structural and chemical modification of a commercial two-part flexible EA (3M™ Scotch-Weld™ EC-2216 B/A Gray) induced by ⁶⁰Co gamma radiation is presented. Combining different spectroscopic techniques, such as Fourier transform infrared (FTIR) spectroscopy, THz time-domain spectroscopy (TDS), and electron paramagnetic resonance (EPR), a characterization of the EA response in different regions of the electromagnetic spectrum is performed, providing valuable information about the structural and chemical properties of the polymers before and after irradiation. A simultaneous dissociation of polymeric chains and formation of crosslinks is observed. The polymer is not subject to structural modification at an absorbed dose of 10 kGy, at which only transient free radicals are observed. In contrast, between 100 and 500 kGy, a gradual chemical degradation of the samples is observed, together with the appearance of a broad and long-lived EPR signal. This study also provides a microscopic characterization of the material, useful for evaluating the mechanisms of system degradation.