Scalable Artificial Intelligence

  • type: Lecture (V)
  • chair: KIT-Fakultäten - KIT-Fakultät für Informatik - Institut für Telematik - ITM Streit
  • semester: WS 21/22
  • time: Th 10:00 - 11:30 (weekly): 2021-10-21, 2021-10-28, 2021-11-04, 2021-11-11,
    2021-11-18, 2021-11-25, 2021-12-02, 2021-12-09, 2021-12-16, 2021-12-23, 2022-01-13,
    2022-01-20, 2022-01-27, 2022-02-03, 2022-02-10
    Tu 14:00 - 15:30: 2021-10-26, 2021-11-09, 2021-11-23, 2021-12-07, 2021-12-21,
    2022-01-18, 2022-02-01

  • lecturer: Dr. Charlotte Debus
    Dr. Markus Götz
    Marie Weiel-Potyagaylo
  • sws: 3
  • lv-no.: 2400004
  • information: Online
Content

Over the last decade, artificial intelligence (AI) methods have significantly advanced the state of the art in science and engineering. One of the most prominent trends is the ever-increasing amount of analyzed (training) data, which necessitates the use of parallel and distributed computational resources. A well-known example is the large language model Generative Pre-trained Transformer 3 (GPT-3) [2]. With a total of 175 billion parameters trained on 285,000 processor cores as well as 10,000 GPUs, this model exceeds the capabilities of traditional AI hardware. In this lecture, students will learn about parallelization and scaling approaches for different AI algorithms. Emphasis is put on the advantages of parallel computing for AI, available software packages for implementation, and, most importantly, the algorithmic design challenges. In line with this, examples from the following algorithmic classes will illustrate the potential of scalable AI:

* unsupervised learning
* supervised learning
* neural networks
* ensemble methods
* search

In conjunction with the course topic, students will also learn about supporting data formats, machine models, and the use of novel hardware such as quantum computers or neuromorphic devices.
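To give a flavor of the parallelization approaches covered in the lecture, the sketch below simulates data-parallel training, the pattern behind large-scale model training: each worker computes a gradient on its own data shard, the gradients are averaged (an all-reduce in a real distributed setting), and all workers apply the same update. This is an illustrative example, not part of the course materials; the toy linear-regression problem and all names in it are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-regression problem: y = X @ w_true + noise.
w_true = np.array([2.0, -1.0])
X = rng.normal(size=(64, 2))
y = X @ w_true + 0.01 * rng.normal(size=64)

def gradient(w, X_shard, y_shard):
    """Mean-squared-error gradient on one worker's data shard."""
    residual = X_shard @ w - y_shard
    return 2.0 * X_shard.T @ residual / len(y_shard)

# Split the training data into equally sized shards, one per worker.
n_workers = 4
shards = list(zip(np.array_split(X, n_workers), np.array_split(y, n_workers)))

w = np.zeros(2)
lr = 0.1
for _ in range(100):
    # Each worker computes its local gradient (in parallel on a real system).
    grads = [gradient(w, X_s, y_s) for X_s, y_s in shards]
    # All-reduce: average the gradients so every replica applies the same step.
    g = np.mean(grads, axis=0)
    w -= lr * g

print(np.round(w, 2))
```

Because the shards are equally sized, the average of the per-shard gradients equals the full-batch gradient, so the parallel run recovers weights close to `w_true`; in practice, frameworks such as PyTorch or Horovod hide the all-reduce behind a wrapper around the optimizer step.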

Language of instruction: German/English
Bibliography

[1] Ben-Nun, Tal, and Torsten Hoefler. "Demystifying parallel and distributed deep learning: An in-depth concurrency analysis." ACM Computing Surveys (CSUR) 52.4 (2019): 1-43.

[2] Brown, Tom B., et al. "Language models are few-shot learners." arXiv preprint arXiv:2005.14165 (2020).