PEOPLE@HES-SO – Directory and Competence Repository

Monti Matteo

Associate HES Professor

Main competences

Solar thermal

Materials science

Energy storage

Project setup and management

Research projects

Thin Films and Surface Treatment

  • Contact

  • Teaching

  • Conferences

Main contract

Associate HES Professor

Phone: +41 24 557 75 77

Office: S06a

Haute école d'Ingénierie et de Gestion du Canton de Vaud
Route de Cheseaux 1, 1400 Yverdon-les-Bains, CH
HEIG-VD
Institute
IE - Institut des énergies
MSc HES-SO in Engineering - HES-SO Master
  • Solar Thermal
BSc HES-SO in Energy and Environmental Technology - Haute école d'Ingénierie et de Gestion du Canton de Vaud
  • Solar Thermal

2023

Byzantine-Resilient learning beyond gradients: distributing evolutionary search
ArODES conference

Andrei Kucharavy, Matteo Monti

GECCO '23 Companion: Proceedings of the Companion Conference on Genetic and Evolutionary Computation

Link to the conference

Abstract:

Modern machine learning (ML) models are capable of impressive performances. However, their prowess is due not only to improvements in their architecture and training algorithms but also to a drastic increase in the computational power used to train them. Such a drastic increase led to a growing interest in distributed ML, which in turn made worker failures and adversarial attacks an increasingly pressing concern. While distributed byzantine-resilient algorithms have been proposed in a differentiable setting, none exist in a gradient-free setting. The goal of this work is to address this shortcoming. For that, we introduce a more general definition of byzantine resilience in ML, the model-consensus, which extends the definition of the classical distributed consensus. We then leverage this definition to show that a general class of gradient-free ML algorithms, (1, 𝜆)-Evolutionary Search, can be combined with classical distributed consensus algorithms to generate gradient-free byzantine-resilient distributed learning algorithms. We provide proofs and pseudo-code for two specific cases: the Total Order Broadcast and proof-of-work leader election. To our knowledge, this is the first time byzantine resilience in gradient-free ML has been defined and algorithms to achieve it have been proposed.
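The abstract describes combining (1, 𝜆)-Evolutionary Search with a classical consensus primitive so that honest workers agree on the same next model. The following is a minimal, illustrative Python sketch of that idea under simplifying assumptions, not the authors' implementation: a coordinate-wise median stands in for the paper's consensus layer (Total Order Broadcast or proof-of-work leader election), and every name here (sphere_fitness, es_step, model_consensus, distributed_es) is hypothetical.

```python
import random

LAMBDA = 8   # offspring per generation in (1, lambda)-ES
SIGMA = 0.1  # mutation step size
DIM = 5      # toy model dimensionality


def sphere_fitness(x):
    """Toy objective (lower is better); stands in for a training loss."""
    return sum(v * v for v in x)


def es_step(parent, rng):
    """One (1, lambda)-ES step: sample lambda offspring, keep the best."""
    offspring = [
        [p + rng.gauss(0, SIGMA) for p in parent]
        for _ in range(LAMBDA)
    ]
    return min(offspring, key=sphere_fitness)


def model_consensus(proposals):
    """Stand-in for the paper's model-consensus: coordinate-wise median,
    which tolerates a minority of byzantine (arbitrary) proposals."""
    n = len(proposals)
    return [
        sorted(p[d] for p in proposals)[n // 2]
        for d in range(len(proposals[0]))
    ]


def distributed_es(n_workers=5, n_byzantine=1, generations=50, seed=0):
    """Run a toy distributed (1, lambda)-ES where all honest workers adopt
    the same agreed-upon parent at every generation."""
    rng = random.Random(seed)
    parent = [rng.uniform(-1, 1) for _ in range(DIM)]
    for _ in range(generations):
        proposals = []
        for w in range(n_workers):
            if w < n_byzantine:
                # A byzantine worker proposes arbitrary parameters.
                proposals.append([rng.uniform(-10, 10) for _ in range(DIM)])
            else:
                proposals.append(es_step(parent, rng))
        parent = model_consensus(proposals)
    return parent


if __name__ == "__main__":
    final = distributed_es()
    print("final fitness:", sphere_fitness(final))
```

In the paper's setting, agreement among honest workers comes from the consensus protocol itself rather than from a robust aggregation rule; the median is used here only so the toy example stays self-contained and runnable.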

