Sitemap
A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.
Pages
Posts
Future Blog Post
Published:
This post will show up by default. To disable scheduling of future posts, edit config.yml and set future: false.
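A minimal sketch of the relevant setting, assuming the standard Jekyll future option (the file is config.yml as referenced above; in many Jekyll setups it is named _config.yml):

```yaml
# Site configuration (config.yml / _config.yml)
# When false, Jekyll does not publish posts dated in the future.
future: false
```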
Blog Post number 4
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 3
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 2
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 1
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Publications
Dynamic Graph Echo State Networks
Published in ESANN, 2021
Preliminary experiments on temporal graph classification with DynGESN, a novel reservoir computing model for dynamic graphs.
Recommended citation: D. Tortorella, A. Micheli (2021). "Dynamic Graph Echo State Networks." Proceedings of the 29th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2021), pp. 99-104.
Download Paper
Discrete-Time Dynamic Graph Echo State Networks
Published in Neurocomputing, 2022
DynGESN is introduced as a novel reservoir computing model for temporal graphs. More efficient than temporal graph kernels and 100x faster than temporal GNNs.
Recommended citation: A. Micheli, D. Tortorella (2022). "Discrete-Time Dynamic Graph Echo State Networks." Neurocomputing, vol. 496, pp. 85-95.
Download Paper
Spectral Bounds for Graph Echo State Network Stability
Published in IJCNN, 2022
More accurate stability bounds for GESN based on graph spectral properties.
Recommended citation: D. Tortorella, C. Gallicchio, A. Micheli (2022). "Spectral Bounds for Graph Echo State Network Stability." Proceedings of the 2022 International Joint Conference on Neural Networks.
Download Paper
Hierarchical Dynamics in Deep Echo State Networks
Published in ICANN, 2022
An in-depth theoretical analysis of asymptotic dynamics in Deep ESNs with different contractivity hierarchies.
Recommended citation: D. Tortorella, C. Gallicchio, A. Micheli (2022). "Hierarchical Dynamics in Deep Echo State Networks." Proceedings of the 31st International Conference on Artificial Neural Networks (ICANN 2022), pp. 668-679.
Download Paper
Beyond Homophily with Graph Echo State Networks
Published in ESANN, 2022
Preliminary experiments on heterophilic node classification with GESN, showing the effectiveness of going beyond stability constraints.
Recommended citation: D. Tortorella, A. Micheli (2022). "Beyond Homophily with Graph Echo State Networks." Proceedings of the 30th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2022), pp. 491-496.
Download Paper
Leave Graphs Alone: Addressing Over-Squashing without Rewiring
Published in LoG, 2022
GESN achieves significantly better accuracy on six heterophilic node classification tasks by tuning Lipschitz constants instead of resorting to graph rewiring.
Recommended citation: D. Tortorella, A. Micheli (2022). "Leave Graphs Alone: Addressing Over-Squashing without Rewiring" (Extended Abstract). Presented at the First Learning on Graphs Conference (LoG 2022), Virtual Event, December 9–12, 2022.
Download Paper
Addressing Heterophily in Node Classification with Graph Echo State Networks
Published in Neurocomputing, 2023
Node-level GESN is highly effective for heterophilic node classification tasks, while also being efficient and resilient to over-smoothing.
Recommended citation: A. Micheli, D. Tortorella (2023). "Addressing Heterophily in Node Classification with Graph Echo State Networks." Neurocomputing, vol. 550, 126506.
Download Paper
Minimum Spanning Set Selection in Graph Kernels
Published in GbRPR, 2023
Minimizing the number of support vectors in SVMs without any loss of accuracy via an RRQR factorization of the kernel matrix.
Recommended citation: D. Tortorella, A. Micheli (2023). "Minimum Spanning Set Selection in Graph Kernels." Graph-Based Representations in Pattern Recognition. GbRPR 2023. LNCS vol. 14121, pp. 15-24.
Download Paper
Richness of Node Embeddings in Graph Echo State Networks
Published in ESANN, 2023
Preliminary analysis of GESN’s node embedding richness via entropy and numerical analysis metrics.
Recommended citation: D. Tortorella, A. Micheli (2023). "Richness of Node Embeddings in Graph Echo State Networks." Proceedings of the 31st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2023), pp. 11-16.
Download Paper
Entropy Based Regularization Improves Performance in the Forward-Forward Algorithm
Published in ESANN, 2023
Adding a representation entropy term to the loss of Hinton’s FFA improves accuracy.
Recommended citation: M. Pardi, D. Tortorella, A. Micheli (2023). "Entropy Based Regularization Improves Performance in the Forward-Forward Algorithm." Proceedings of the 31st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2023), pp. 393-398.
Download Paper
Designs of Graph Echo State Networks for Node Classification
Published in Neurocomputing, 2024
Analysis of dense and sparse reservoir designs for node-level GESN via topology-dependent and topology-agnostic richness measures for node embeddings.
Recommended citation: A. Micheli, D. Tortorella (2024). "Designs of Graph Echo State Networks for Node Classification." Neurocomputing, vol. 597, 127965.
Download Paper
Continuously Deep Recurrent Neural Networks
Published in ECML PKDD, 2024
A continuous-depth ESN is proposed, where a smooth depth hyperparameter regulates the extent of local connections.
Recommended citation: A. Ceni, P. F. Dominey, C. Gallicchio, A. Micheli, L. Pedrelli, D. Tortorella (2024). "Continuously Deep Recurrent Neural Networks." Machine Learning and Knowledge Discovery in Databases: Research Track. ECML PKDD 2024. LNCS vol. 14947, pp. 59-73.
Download Paper
Onion Echo State Networks: A Preliminary Analysis of Dynamics
Published in ICANN, 2024
A preliminary analysis of the dynamical properties of Onion ESN, a novel reservoir with groups of units presenting an annular spectrum.
Recommended citation: D. Tortorella, A. Micheli (2024). "Onion Echo State Networks: A Preliminary Analysis of Dynamics." Proceedings of the 33rd International Conference on Artificial Neural Networks (ICANN 2024), LNCS vol. 15025, pp. 117-128.
Download Paper
Continual Learning with Graph Reservoirs: Preliminary experiments in graph classification
Published in ESANN, 2024
GESN alleviates part of the catastrophic forgetting in the continual learning setting, since its representations for graph classification require no training.
Recommended citation: D. Tortorella, A. Micheli (2024). "Continual Learning with Graph Reservoirs: Preliminary experiments in graph classification." Proceedings of the 32nd European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2024), pp. 35-40.
Download Paper
Analyzing Explanations of Deep Graph Networks through Node Centrality and Connectivity
Published in Discovery Science, 2024
We analyze the alignment of DGN explanations with node centrality and graph connectivity, highlighting the presence of different inductive biases.
Recommended citation: M. Fontanesi, A. Micheli, M. Podda, D. Tortorella (2024). "Analyzing Explanations of Deep Graph Networks through Node Centrality and Connectivity." Discovery Science 2024, to appear.
Research
Deep Learning on Graphs
Short description of portfolio item number 1
Reservoir Computing
Short description of portfolio item number 2
Talks
Talk 1 on Relevant Topic in Your Field
Published:
This is a description of your talk, which is a markdown file that can be markdown-ified like any other post. Yay markdown!
Conference Proceeding talk 3 on Relevant Topic in Your Field
Published:
This is a description of your conference proceedings talk; note the different value of the type field. You can put anything in this field.
Teaching
TA for the Machine Learning course (fall 2022)
Master's degree course, University of Pisa, Department of Computer Science, 2022
Supporting the students in matters concerning the course project.
TA for the Machine Learning course (fall 2024)
Master's degree course, University of Pisa, Department of Computer Science, 2024
Supporting the students in matters concerning the course project.