Luis Velasco, Full Professor, Universitat Politècnica de Catalunya (UPC)
Considerable effort has recently been devoted to applying Machine Learning techniques to optical transport networks. Applications include Quality of Transmission (QoT) estimation, failure and attack detection, and network automation, to mention just a few. In this regard, the development of Optical Layer Digital Twins able to accurately model the optical layer, reproduce scenarios, and generate expected signals is of paramount importance. In this paper, we overview several applications of Digital Twins, including QoT estimation, failure management, and anomaly detection.
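As an illustration of the kind of modeling an optical-layer twin performs, the following is a minimal, self-contained sketch of one building block: an ASE-limited GSNR estimate for a lightpath, checked against per-format thresholds. All numerical values (noise figure, gain, thresholds) are illustrative assumptions, not results from the paper.

```python
import math

def gsnr_db(n_spans, p_ch_dbm=0.0, nf_db=5.0, gain_db=20.0):
    """GSNR after n_spans identical amplified spans (ASE noise only)."""
    h_nu_bref_dbm = -58.0  # h*nu*Bref for Bref = 12.5 GHz at 1550 nm (approx.)
    p_ase_span_dbm = nf_db + gain_db + h_nu_bref_dbm  # ASE added per span
    # ASE accumulates linearly in the linear domain across spans
    p_ase_total_dbm = 10 * math.log10(n_spans * 10 ** (p_ase_span_dbm / 10))
    return p_ch_dbm - p_ase_total_dbm

# Illustrative GSNR thresholds (dB) per modulation format
THRESHOLDS = {"QPSK": 10.0, "16QAM": 17.0, "64QAM": 23.0}

def feasible_formats(n_spans):
    """Modulation formats whose threshold the estimated GSNR still clears."""
    snr = gsnr_db(n_spans)
    return [fmt for fmt, th in THRESHOLDS.items() if snr >= th]
```

A twin trained or calibrated against field telemetry would replace the toy analytical model above, but the interface is the same: path parameters in, QoT estimate and feasible configurations out.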
Andrea Sgambelluri, Assistant Professor at TeCIP Institute, Scuola Superiore Sant’Anna, Pisa, Italy
To deliver high-data-rate connectivity, super-channel transmission is becoming a suitable strategy. The Optical Software Defined Networking (OSDN) architecture leverages the NETCONF protocol for the configuration and management of optical devices. To support a vendor-neutral approach, OpenConfig YANG models are adopted in the NETCONF communication. In OpenConfig, all the proprietary parameters (e.g., FEC, bit rate, modulation format) are mapped to operational modes, maintaining basic compatibility between vendors. In this work, an experimental analysis of automatic super-channel optimization is presented. In particular, for each established super-channel connection, the sub-carriers are partially overlapped and tightly filtered, achieving spectrum savings through margin reduction while guaranteeing an adequate level of Quality of Transmission (QoT). The procedure has been demonstrated using an SDN controller over an SDN-based optical network including OpenConfig 600 Gbit/s transponders and emulated ROADMs. After the setup of a lightpath, the optimization procedure is activated at the transponder agents, without involving the SDN controller, to find the optimal super-channel configuration according to the reach and the modulation formats to be used. A spectrum saving of 25% is achieved, while still guaranteeing a good QoT level for the channels involved.
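The agent-side loop described above can be sketched as follows: tighten the sub-carrier spacing step by step and keep the tightest spacing whose estimated QoT penalty still fits within the available SNR margin. The penalty model, baud rate, grid step, and margin below are illustrative assumptions, not the values used in the demonstration.

```python
def qot_penalty_db(spacing_ghz, baud_gbd=64.0):
    """Toy model: filtering/crosstalk penalty grows as the sub-carrier
    spacing approaches the symbol rate (Nyquist limit)."""
    margin_ghz = spacing_ghz - baud_gbd
    return 0.0 if margin_ghz >= 10.0 else (10.0 - margin_ghz) * 0.3

def optimize_spacing(baud_gbd=64.0, nominal_ghz=75.0,
                     snr_margin_db=3.0, step_ghz=1.25):
    """Return the tightest feasible sub-carrier spacing, sweeping down
    from the nominal grid in fixed steps."""
    spacing, best = nominal_ghz, nominal_ghz
    while spacing > baud_gbd:
        if qot_penalty_db(spacing, baud_gbd) <= snr_margin_db:
            best = spacing  # tightest spacing seen so far that still meets QoT
        spacing -= step_ghz
    return best
```

In the real system the penalty is not a closed-form model but is measured at the receiver (e.g., pre-FEC BER) after each reconfiguration, which is why the loop can run at the transponder agents without the SDN controller.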
John P. Eason, Network Modeling and Optimization Engineer, Meta, USA
Large-scale transport backbone networks must be robust against hardware failure events and must perform well under a range of uncertain traffic matrices. Reliability concerns are balanced against efficiency concerns, leading to tradeoffs in many aspects of the network design, such as the degree and prevalence of optical express, optical/IP node sizing, etc. In this work, we present our experience with large-scale distributed optimization methods for solving the cross-layer network design problem under hardware failures and demand uncertainty. Compared to a baseline approach in which failure states and traffic matrices are planned for sequentially, the distributed optimization method finds globally optimal designs across the uncertain parameters. This leads to significantly improved network efficiency without sacrificing reliability metrics, while also helping the network planning process itself scale with increasing traffic and network complexity.
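A toy, pure-Python sketch of the scenario-decomposition idea: each failure/traffic scenario s imposes a minimum capacity L[s] on a shared design variable, per-scenario subproblems are solved independently, and a consensus step (ADMM-style) drives them toward one common design. Everything here, including the cost, loads, and penalty parameter rho, is an illustrative assumption, not the production formulation.

```python
def plan_capacity(loads, cost=0.1, rho=1.0, iters=500):
    """min cost*z  s.t.  z >= L_s for every scenario s, via consensus ADMM."""
    n = len(loads)
    z, u = 0.0, [0.0] * n  # shared design and per-scenario scaled duals
    for _ in range(iters):
        # Local (per-scenario) step: stay feasible for this scenario
        x = [max(loads[s], z - u[s]) for s in range(n)]
        # Consensus step: average the local designs, pay the capacity cost
        z = sum(x[s] + u[s] for s in range(n)) / n - cost / (rho * n)
        # Dual step: price the disagreement between local and shared design
        u = [u[s] + x[s] - z for s in range(n)]
    return z
```

For this convex toy the answer is simply max(loads), but the decomposed structure is what matters at scale: each scenario's subproblem can run on a separate worker, so the planning process grows with the scenario set instead of requiring one monolithic solve.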
Paulo P. Monteiro, Associate Professor, University of Aveiro, Institute of Telecommunications, Portugal
Next-generation wireless systems, 6G and beyond, will require support for peak bit rates in the Terabit-per-second range, which will challenge the physical limits of standalone radio-frequency communications. To unlock the capacity of next-generation wireless systems, novel free-space optical communications are currently under development. In this talk, we review our latest work on visible-light and near-infrared optical wireless transmission delivering bit rates of up to 1 Tbps, enabled by advanced modulation/coding, digital signal processing, channel modelling/estimation, and machine learning techniques, with the aim of supporting the foreseen 6G requirements of ultra-high capacity and reliability at the physical layer.
Marina Settembre, Senior Researcher, Scientific Direction, Fondazione Ugo Bordoni, Rome, Italy
The digital threat landscape poses ever-greater challenges to protecting information systems and communication networks against different types of attacks, including Advanced Persistent Threats (APTs). New approaches and tools for risk and vulnerability assessment and threat modeling are needed to handle growing complexity, an increasing number of heterogeneous sub-systems that change over time (e.g., through updates, substitution, or integration of components), and a higher level of automation. Cybersecurity ontology is not a new concept, but it has recently gained renewed interest, as witnessed by a growing number of papers in the scientific literature and by initiatives carried out by several agencies and organizations (e.g., ENISA, NIST, MITRE). A common, controlled taxonomy and machine-readable yet human-understandable conceptual models, including entities, attributes, and relationships, can be useful for a real-time view of network services and network elements, and for knowledge reasoning aimed at continuous, adaptive vulnerability and risk assessment, knowledge re-use in threat intelligence, and guiding decision-making processes. Currently proposed ontologies focus on specific but still fragmented issues, and there is an ongoing debate on whether and where ontology-driven approaches can be most useful or even necessary. In this paper, without claiming to be exhaustive or definitive, the state of the art and the perspectives of ontology-driven approaches will be presented, together with preliminary insights into the 5G security domain, referring to a proposed threat taxonomy for 5G networks and the location of exploitation targets in 5G networks.
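A deliberately minimal sketch of what "machine-readable and human-understandable conceptual models" can mean in practice: entities with attributes, typed relationships stored as triples, and one trivial reasoning query over them. The class names, predicates, and example instances ("APT-group", "5G-AMF", "example-vuln") are hypothetical placeholders, not drawn from any standard taxonomy.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Entity:
    name: str
    kind: str  # e.g. "Threat", "Vulnerability", "Asset"

@dataclass(frozen=True)
class Relation:
    subject: Entity
    predicate: str  # e.g. "exploits", "affects", "mitigates"
    obj: Entity

class Ontology:
    def __init__(self):
        self.relations = []

    def add(self, subject, predicate, obj):
        self.relations.append(Relation(subject, predicate, obj))

    def targets_of(self, threat_name):
        """Tiny reasoning step: assets reachable from a threat
        via exploits -> affects chains."""
        vulns = {r.obj for r in self.relations
                 if r.predicate == "exploits" and r.subject.name == threat_name}
        return sorted(r.obj.name for r in self.relations
                      if r.predicate == "affects" and r.subject in vulns)

# Illustrative knowledge-base fragment
apt = Entity("APT-group", "Threat")
vuln = Entity("example-vuln", "Vulnerability")
amf = Entity("5G-AMF", "Asset")
kb = Ontology()
kb.add(apt, "exploits", vuln)
kb.add(vuln, "affects", amf)
```

Real ontologies would use richer representations (e.g., OWL/RDF with class hierarchies and inference rules), but even this triple-store view shows how a shared vocabulary lets threat intelligence be queried and re-used programmatically.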