
PEOPLE@HES-SO
Directory of staff and competencies


Monti Matteo

Associate HES Professor

Main competencies

  • Solar thermal
  • Materials science
  • Energy storage
  • Project setup and management
  • Research projects
  • Thin Films and Surface Treatment

  • Contact
  • Teaching
  • Conferences

Main contract

Associate HES Professor

Phone: +41 24 557 75 77

Office: S06a

Haute école d'Ingénierie et de Gestion du Canton de Vaud
Route de Cheseaux 1, 1400 Yverdon-les-Bains, CH
HEIG-VD
Institute
IE - Institut des énergies
MSc HES-SO in Engineering - HES-SO Master
  • Solar Thermal
BSc HES-SO in Energy and Environmental Technology - Haute école d'Ingénierie et de Gestion du Canton de Vaud
  • Solar Thermal

2023

Byzantine-Resilient learning beyond gradients: distributing evolutionary search
Conference ArODES

Andrei Kucharavy, Matteo Monti

GECCO '23 Companion: Proceedings of the Companion Conference on Genetic and Evolutionary Computation

Link to the conference

Abstract:

Modern machine learning (ML) models are capable of impressive performance. However, their prowess is not due only to improvements in their architecture and training algorithms, but also to a drastic increase in the computational power used to train them. This increase has led to growing interest in distributed ML, which in turn has made worker failures and adversarial attacks an increasingly pressing concern. While distributed byzantine-resilient algorithms have been proposed in a differentiable setting, none exist in a gradient-free setting. The goal of this work is to address this shortcoming. To that end, we introduce a more general definition of byzantine resilience in ML, model-consensus, which extends the definition of classical distributed consensus. We then leverage this definition to show that a general class of gradient-free ML algorithms, (1, λ)-Evolutionary Search, can be combined with classical distributed consensus algorithms to obtain gradient-free byzantine-resilient distributed learning algorithms. We provide proofs and pseudo-code for two specific cases: Total Order Broadcast and proof-of-work leader election. To our knowledge, this is the first time byzantine resilience in gradient-free ML has been defined and algorithms to achieve it have been proposed.
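For intuition, the gradient-free primitive named in the abstract, (1, λ)-Evolutionary Search, can be sketched in a few lines. The Python snippet below is a minimal, generic single-node illustration only, not the authors' byzantine-resilient distributed algorithm; in the distributed setting described above, workers would agree on the selected offspring through a consensus primitive such as Total Order Broadcast before starting the next generation. The function name and parameters are illustrative assumptions.

```python
import numpy as np

def one_comma_lambda_es(fitness, dim, lam=16, sigma=0.1, generations=200, seed=0):
    """Minimal (1, lambda)-Evolutionary Search (illustrative sketch).

    Each generation, a single parent spawns `lam` Gaussian-perturbed
    offspring; the best offspring replaces the parent (comma selection,
    so the parent itself is never carried over)."""
    rng = np.random.default_rng(seed)
    parent = rng.normal(size=dim)                      # initial parameter vector
    for _ in range(generations):
        offspring = parent + sigma * rng.normal(size=(lam, dim))
        scores = np.array([fitness(x) for x in offspring])
        parent = offspring[np.argmin(scores)]          # keep the best offspring (minimisation)
    return parent

# Toy usage: minimise a quadratic "loss" in 5 dimensions.
best = one_comma_lambda_es(lambda x: float(np.sum(x ** 2)), dim=5)
print(np.round(best, 3))
```

Replacing the purely local argmin with a selection step that all honest workers agree on is where the consensus machinery described in the paper comes in.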

Achievements
