IEEE CIS SSoCIA 2022

Summer School 1

Gerardo Rubino

Random Neural Networks and applications

Random Neural Networks are a class of Neural Networks coming from Stochastic Processes and, in particular, from Queuing Models. They have some nice properties and have achieved good performance in several application areas. They are, in fact, queuing systems seen as neural machines, and the two uses (probabilistic models for the performance evaluation of systems, or learning machines similar to other, more standard families of Neural Networks) refer to the same mathematical objects. They have the appealing property that, like other special models that are unknown to most experts in Machine Learning, their testing in and/or adaptation to the many areas where standard Machine Learning techniques have obtained great successes is totally open.

In the talk we will introduce Random Neurons and the networks we can build with them, plus some details about the numerical techniques needed to learn with them. We will also underline the reasons that make them, at the very least, extremely interesting, and we will describe some of their successful applications, including examples of our own. We will focus on learning, but we will also mention other uses of these models in performance evaluation, in the analysis of biological systems, and in optimization.
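To make the queueing-based neuron concrete, here is a minimal sketch, assuming a standard Gelenbe-style random neuron model: each neuron i fires at rate r[i], and its steady-state activity q[i] solves a nonlinear fixed point driven by excitatory and inhibitory signal rates. All rates and weights in the example are illustrative assumptions, not values from the talk.

```python
import numpy as np

# Sketch of the steady-state equations of a Gelenbe-style Random Neural
# Network. Neuron i fires at rate r[i]; it receives external positive
# (excitatory) signals at rate lam_plus[i] and negative (inhibitory)
# signals at rate lam_minus[i]. W_plus[j, i] / W_minus[j, i] are the
# rates at which a firing of neuron j sends positive/negative signals
# to neuron i. The activity q[i] solves the nonlinear fixed point
#     q[i] = T_plus[i] / (r[i] + T_minus[i]),
# with T_plus[i]  = lam_plus[i]  + sum_j q[j] * W_plus[j, i]
# and  T_minus[i] = lam_minus[i] + sum_j q[j] * W_minus[j, i].

def rnn_activity(r, lam_plus, lam_minus, W_plus, W_minus,
                 tol=1e-10, max_iter=10_000):
    """Solve the RNN fixed point by plain successive substitution."""
    q = np.zeros_like(r)
    for _ in range(max_iter):
        t_plus = lam_plus + q @ W_plus      # total excitatory arrival rates
        t_minus = lam_minus + q @ W_minus   # total inhibitory arrival rates
        q_new = np.minimum(t_plus / (r + t_minus), 1.0)  # clip for stability
        if np.max(np.abs(q_new - q)) < tol:
            return q_new
        q = q_new
    return q

# Tiny 3-neuron feed-forward example with arbitrary (assumed) parameters.
r = np.array([1.0, 1.0, 1.0])
lam_plus = np.array([0.5, 0.2, 0.0])
lam_minus = np.array([0.0, 0.1, 0.1])
W_plus = np.array([[0.0, 0.3, 0.2],
                   [0.0, 0.0, 0.4],
                   [0.0, 0.0, 0.0]])
W_minus = np.zeros((3, 3))
print(rnn_activity(r, lam_plus, lam_minus, W_plus, W_minus))
```

Learning algorithms for these networks work with this same fixed point, which is what links the queueing view and the neural view of the model.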

Summer School 2

Rosangela Ballini

Fuzzy Rule-Based Model: theory and applications

A modelling strategy based on the application of fuzzy inference systems is shown to provide a powerful and efficient method for the identification of non-linear and linear economic and financial relationships. The procedure is particularly suitable for the estimation of ill-defined systems in which there is considerable uncertainty about the nature and range of key input variables. In addition, no prior knowledge is required about the form of the underlying relationships. Here, we will discuss the advances in the theory and the applications proposed in the literature in recent years.
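As a concrete illustration of the kind of machinery involved, here is a minimal sketch of a first-order Takagi-Sugeno fuzzy rule-based model with a single input. The membership parameters and rule consequents are assumptions made up for the example, and the talk is not restricted to this particular variant.

```python
import numpy as np

# Sketch of a first-order Takagi-Sugeno fuzzy model with one input.
# Rule k: IF x is A_k THEN y_k = a_k * x + b_k, with Gaussian membership
# functions A_k. The model output is the firing-strength-weighted
# average of the rule consequents. All parameters below are assumed.

def gauss(x, c, s):
    """Gaussian membership degree of x in a fuzzy set centered at c."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def ts_predict(x, centers, sigmas, a, b):
    w = gauss(x, centers, sigmas)           # firing strength of each rule
    y_rules = a * x + b                     # local linear consequents
    return np.sum(w * y_rules) / np.sum(w)  # normalized weighted average

# Two rules covering "low" and "high" regions of the input.
centers = np.array([0.0, 1.0])
sigmas = np.array([0.4, 0.4])
a = np.array([0.5, 2.0])   # slopes of the local models
b = np.array([0.0, -0.5])  # intercepts of the local models

for x in [0.0, 0.5, 1.0]:
    print(x, ts_predict(x, centers, sigmas, a, b))
```

The smooth interpolation between local linear models is what lets such systems fit non-linear relationships without assuming their functional form in advance.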

Summer School 3

Leslie Pérez Cáceres

Algorithm Configuration Survival Guide

Algorithm configuration is an essential task that enables adapting the search behaviour of algorithms to the problem at hand. Algorithm performance often depends strongly on the parameter settings, and the choice of these settings is commonly entrusted to experience with the problem and the algorithm. In this talk, we will discuss algorithm configuration and the elements that are relevant for defining a configuration task. We will introduce the irace configurator and show how this tool can be used to adjust parameter settings automatically to obtain the best performance out of an algorithm.
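As an illustration of the core idea behind configurators such as irace, here is a minimal sketch of racing: candidate configurations are evaluated instance by instance, and clearly worse ones are discarded early. This is not irace's actual API; the target algorithm, the single parameter, and the simple mean-based elimination rule are simplified assumptions for the example.

```python
import random
import statistics

# Sketch of the racing idea: run all surviving candidate configurations
# on one instance at a time and eliminate poor performers early, so the
# evaluation budget concentrates on promising configurations.

def target_runner(config, instance, rng):
    """Hypothetical target algorithm: lower returned cost is better.
    Here the (assumed) best setting is around alpha = 0.7."""
    noise = rng.gauss(0, 0.05)
    return abs(config["alpha"] - 0.7) + 0.01 * instance + noise

def race(candidates, instances, keep=2, seed=42):
    rng = random.Random(seed)
    scores = {i: [] for i in range(len(candidates))}
    alive = set(scores)
    for t, instance in enumerate(instances):
        for i in alive:
            scores[i].append(target_runner(candidates[i], instance, rng))
        # After a few instances, eliminate one survivor per round.
        if t >= 2 and len(alive) > keep:
            worst = max(alive, key=lambda i: statistics.mean(scores[i]))
            alive.remove(worst)
    best = min(alive, key=lambda i: statistics.mean(scores[i]))
    return candidates[best]

candidates = [{"alpha": round(a, 1)} for a in [0.1, 0.3, 0.5, 0.7, 0.9]]
print(race(candidates, instances=range(10)))
```

irace adds, among other things, statistical tests for elimination and the iterated sampling of new candidates around the survivors; the sketch only shows the racing skeleton.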

Summer School 4

Graph adjacency spectral embeddings: Algorithmic advances and applications

The random dot product graph (RDPG) is a tractable yet expressive generative graph model for relational data that subsumes the simple Erdős-Rényi and stochastic block model ensembles as particular cases. RDPGs postulate that there exists a latent position for each node and specify the edge formation probabilities via the inner product of the corresponding latent vectors. In this talk, we first focus on the embedding task of estimating these latent positions from observed graphs. The workhorse adjacency spectral embedding (ASE) offers an approximate solution obtained via the eigendecomposition of the adjacency matrix, which enjoys solid statistical guarantees but can be computationally intensive and formally solves a surrogate problem. To address these challenges, we bring to bear recent non-convex optimization advances and demonstrate their impact on RDPG inference. We show that the proposed algorithms are scalable, robust to missing network data, and can track the latent positions over time when the graphs are acquired in a streaming fashion, even for dynamic networks subject to node additions and deletions. We also discuss extensions of the vanilla RDPG to accommodate directed and weighted graphs. Unlike previous proposals, our non-parametric RDPG model for weighted networks does not require a priori specification of the weights’ distribution to perform inference and estimation in a provably consistent fashion. Finally, we discuss the problem of online monitoring and detection of changes in the underlying data distribution of a graph sequence. Our idea is to endow sequential change-point detection (CPD) techniques with a graph representation learning substrate based on the versatile RDPG model. We share an open-source implementation of the proposed node embedding and online CPD algorithms, whose effectiveness is demonstrated via synthetic and real network data experiments.
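As a concrete companion to the abstract, here is a minimal sketch of the workhorse ASE estimator: embed each node using the d leading eigenpairs of the adjacency matrix, here applied to a sampled two-block stochastic block model (a special case of the RDPG). Graph size, embedding dimension, and block probabilities are assumptions for the example.

```python
import numpy as np

# Sketch of adjacency spectral embedding (ASE): estimate d-dimensional
# latent positions X such that the edge probabilities are P = X X^T.

def ase(A, d):
    """Top-d eigenvectors of A, scaled by the square roots of the
    corresponding eigenvalue magnitudes."""
    vals, vecs = np.linalg.eigh(A)             # A symmetric (undirected)
    idx = np.argsort(np.abs(vals))[::-1][:d]   # d leading eigenpairs
    return vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

# Sample a 2-block stochastic block model, a particular case of the RDPG.
rng = np.random.default_rng(0)
n = 200
z = rng.integers(0, 2, size=n)                 # community labels
B = np.array([[0.6, 0.1], [0.1, 0.4]])         # block connection probs
P = B[z][:, z]                                 # n x n edge probabilities
A = rng.random((n, n)) < P
A = np.triu(A, 1)
A = (A + A.T).astype(float)                    # symmetric, hollow adjacency

X_hat = ase(A, d=2)                            # estimated latent positions
print(X_hat.shape)                             # (200, 2); the two blocks
                                               # separate into two clusters
```

The non-convex alternatives discussed in the talk replace this eigendecomposition with direct optimization over the latent positions, which is what enables streaming updates and handling of missing entries.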

Summer School (talk cancelled due to health-related travel problems)

Sanaz Mostaghim

Recent Advances in Evolutionary Multi-Objective Optimization

In our daily lives we are inevitably involved in optimization problems. How to get to the university in the least possible time is a simple optimization problem that we encounter every morning. Just looking around us, we can see many other examples of optimization problems, even ones with conflicting objectives and higher complexity. It is natural to want everything to be as good as possible, in other words optimal, and any new development in optimization that can lead to a better solution of a particular problem is of considerable value to science and industry. The difficulty arises when there are conflicts between the different goals and objectives. Indeed, there are many real-world optimization problems with multiple conflicting objectives in science and industry, and they can be of great complexity. We call them Multi-objective Optimization Problems.
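To fix ideas, here is a minimal sketch of the ordering concept that underlies multi-objective optimization, Pareto dominance; the objective vectors in the example (route cost and travel time, echoing the commuting example above) are made-up assumptions.

```python
# Sketch of Pareto dominance and of extracting the Pareto front from a
# finite set of candidate solutions (all objectives are minimized).

def dominates(a, b):
    """True if a is at least as good as b in every objective and
    strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep the points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Example: (cost, travel time) of four routes; no single route wins both.
routes = [(3.0, 10.0), (5.0, 7.0), (4.0, 12.0), (8.0, 6.0)]
print(pareto_front(routes))  # (4.0, 12.0) drops out: (3.0, 10.0) beats it
```

Because the front is a set of mutually incomparable trade-offs rather than a single optimum, population-based evolutionary algorithms are a natural fit, which motivates the EMO methods discussed next.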

Over the past decade, many new ideas have been investigated and studied to solve such optimization problems. Among these methods, Evolutionary Multi-objective Optimization (EMO) algorithms have been shown to be quite successful and have been applied in many domains.

This talk addresses the recent advanced topics in the area of evolutionary multi-objective optimization and contains the following content:

  • Classical approaches for solving multi-objective optimization (MO) problems, definitions and theoretical foundations for EMO
  • State-of-the-art Evolutionary multi-objective algorithms
  • Large-scale EMO: large-scale decision spaces and many-objective optimization, with corresponding test benchmarks such as the Distance Minimization Problem and its variants for dynamic MO problems
  • Evaluation mechanisms (Design of experiments, test problems, metrics, visualization)

Summer School 5

Gabriela Ochoa

Introduction to Fitness Landscapes and Local Optima Networks

Fitness landscape analysis can be used to characterize optimization problems by analyzing the underlying search space in terms of the objective function to be optimized. It provides methods and measures that have been shown to be effective in understanding complex problems, algorithm behaviour, and algorithm selection. The goal of this tutorial is to describe the complexities of search spaces and how these impact the performance of algorithms. We will cover these two topics:

  • Fundamental concepts of fitness landscapes
  • Local optima networks (LONs) for understanding the global structure and modelling high-dimensional spaces (a small sketch of the local-optimum notion follows this list)
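As a tiny illustration of the building blocks above, here is a minimal sketch that enumerates the local optima of a small binary landscape under single-bit-flip moves; in a LON these optima become the nodes, and transitions between their basins become the edges. The fitness function is an arbitrary assumption, chosen only so that the landscape has more than one local optimum.

```python
from itertools import product

# Sketch: exhaustively find the local optima of a small binary
# fitness landscape under the single-bit-flip neighborhood.

def fitness(x):
    """Toy pseudo-boolean fitness (maximized): rewards strings close to
    all-zeros or all-ones, with a small bonus when the endpoints match."""
    ones = sum(x)
    return max(ones, len(x) - ones) + 0.5 * (x[0] == x[-1])

def neighbors(x):
    """All strings at Hamming distance 1 from x."""
    for i in range(len(x)):
        y = list(x)
        y[i] = 1 - y[i]
        yield tuple(y)

def is_local_optimum(x):
    return all(fitness(y) <= fitness(x) for y in neighbors(x))

n = 5
optima = [x for x in product((0, 1), repeat=n) if is_local_optimum(x)]
print(optima)  # the all-zeros and all-ones strings, i.e. two optima
```

Real landscape studies replace the exhaustive enumeration with sampled hill-climbing runs, since realistic search spaces are far too large to enumerate.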

Summer School 6

Federico Larroca

A tutorial on Graph Neural Networks with Applications

Data obtained from networks may be naturally modeled as graph signals, where we associate a value to each node of a graph describing the underlying network structure. In this context, there are several interesting learning problems whose main challenge is the irregular structure of the network: deciding whether a social network profile is actually a bot, recommending new content to a user of a streaming platform, or predicting the solubility of a certain molecule are some illustrative examples. In the last few years, Graph Neural Networks (GNNs), a learning architecture that may be regarded as mimicking the more traditional Convolutional Neural Networks (CNNs) in the context of networks, have become extremely popular for addressing these problems. In this talk, we will present the fundamentals of GNNs, defining them as extensions of convolutions on graphs, which will then allow us to present and understand two key properties that explain their excellent performance (and identify in which cases using GNNs would not be a good idea): permutation equivariance and stability to perturbations. These concepts will be illustrated through example applications.
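To make the "convolutions on graphs" viewpoint concrete, here is a minimal sketch of one graph convolutional layer built from powers of a graph shift operator; the sizes, weights, and example graph are assumptions, and actual GNN libraries differ in the details.

```python
import numpy as np

# Sketch of a single graph convolutional layer: filter a graph signal
# with powers of a graph shift operator (here the adjacency matrix),
# then apply a pointwise nonlinearity.

def graph_conv_layer(A, X, W_taps):
    """One layer: Y = relu( sum_k A^k X W_k ).

    A      : (n, n) adjacency matrix (graph shift operator)
    X      : (n, f_in) input graph signal, one feature row per node
    W_taps : list of (f_in, f_out) filter-tap weight matrices W_0..W_K
    """
    Z = np.zeros((X.shape[0], W_taps[0].shape[1]))
    S = np.eye(A.shape[0])        # A^0
    for W_k in W_taps:
        Z += S @ X @ W_k          # aggregate k-hop neighborhood information
        S = S @ A                 # next power of the shift operator
    return np.maximum(Z, 0.0)     # ReLU

# Tiny 4-node path graph, a 2-feature signal, and 2 filter taps.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(1)
X = rng.normal(size=(4, 2))
W_taps = [rng.normal(size=(2, 3)) for _ in range(2)]  # taps k = 0 and 1
print(graph_conv_layer(A, X, W_taps).shape)           # (4, 3)
```

Because the layer only uses matrix powers of A, relabeling the nodes permutes the output rows in the same way, which is exactly the permutation equivariance property mentioned in the abstract.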

Organizers

IEEE Computational Intelligence Society

https://cis.ieee.org/

IEEE

https://www.ieee.org/

Instituto de Computación

https://www.fing.edu.uy/inco

Facultad de Ingeniería

https://www.fing.edu.uy/

Universidad de la República

https://www.universidad.edu.uy