
33:40
Dear All, we are working on an OOD bible titled "Oodles of Odds". This is a work-in-progress and we plan to disseminate the paper this coming week. Here's the project URL: https://matthew-mcateer.github.io/oodles-of-oods/

37:29
Also, should you be using MNIST in your OOD experiments, we have curated two 'semantically relevant' OOD datasets that you might find useful:
- https://www.kaggle.com/c/Kannada-MNIST
- https://github.com/Daniel-Wu/AfroMNIST

46:28
Follow-up question: Do you consider a blank image to be OOD? Almost every generative model will assign extremely high log p(x) to a blank image.

46:52
I’m assuming all of the results are asymmetric, right? I.e. training on SVHN won’t lead to high log p(x) on CIFAR, and training on MNIST won’t lead to high log p(x) on Fashion MNIST?

48:39
@Andrew - good question - I don't want to stop Balaji now, but please hold on to it and ask it later

49:50
Answered now :)

50:19
Yep :)

01:11:03
How much easier does this problem become if you consider the problem of detecting whether a *batch* of samples is iid from the training distribution, rather than an individual example?

01:11:14
(Maybe a question for the end)

01:15:00
(I guess you’re getting to it right now :)
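One concrete way the batch version of the problem gets easier is a typicality-style test: compare the batch's average per-example score against the distribution of that average over same-sized batches drawn from the training set. The sketch below is a minimal illustration under my own assumptions - the function and variable names are made up, the data is synthetic 1-D Gaussian, and the toy log-density score is only a stand-in for a trained model's log p(x).

```python
import numpy as np

def batch_ood_pvalue(score_fn, train_x, batch_x, n_boot=2000, seed=0):
    """Bootstrap two-sided test: is the batch's mean score typical of
    same-size batches drawn from the training data?

    score_fn: per-example statistic (with a trained generative model this
              would be log p(x); here it is whatever you pass in).
    """
    rng = np.random.default_rng(seed)
    m = len(batch_x)
    batch_stat = np.mean([score_fn(x) for x in batch_x])

    train_scores = np.array([score_fn(x) for x in train_x])
    boot_stats = np.array([
        rng.choice(train_scores, size=m, replace=True).mean()
        for _ in range(n_boot)
    ])
    # Two-sided p-value: how extreme is the batch mean under the bootstrap?
    p_low = np.mean(boot_stats <= batch_stat)
    p_high = np.mean(boot_stats >= batch_stat)
    return 2.0 * min(p_low, p_high)

# Toy usage with synthetic 1-D data and a Gaussian log-density as the score
# (stand-ins for real data and a trained model):
rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=5000)
score = lambda x: -0.5 * x**2                # log N(0,1) up to a constant
in_batch = rng.normal(0.0, 1.0, size=64)
ood_batch = rng.normal(0.0, 3.0, size=64)    # wider distribution -> lower mean score
print(batch_ood_pvalue(score, train, in_batch))   # typically large
print(batch_ood_pvalue(score, train, ood_batch))  # typically tiny
```

With a batch of size 1 this reduces to the usual per-example threshold test; the extra power comes entirely from averaging the statistic over the batch, which is why the batch formulation is so much easier.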

01:16:59
@Andrew you seem to be good at bits/dimension compression of this talk :)

01:17:05
Hehe :)

01:31:56
When you say ensembles consistently perform the best, are all the models in the ensemble trained on the exact same training data, or different data distributions?

01:33:46
Can you clarify what kind of ensembles you are referring to? Are they simple MLPs, ConvNets, or other common architectures? Also, when training the ensembles, is that just simple standard training - SGD and probably some regularisation, but nothing else?
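For context on what is usually meant here: a "deep ensemble" is typically several copies of the same architecture trained independently on the exact same data with standard SGD, differing only in random initialization and data shuffling, with the entropy of the averaged predictive distribution used as the uncertainty / OOD score. The sketch below is my own minimal illustration of that recipe, using sklearn's MLPClassifier and the digits dataset as stand-ins rather than anything from the talk.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Same data, same architecture, same objective; members differ only in the seed.
ensemble = [
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=seed).fit(X_train, y_train)
    for seed in range(5)
]

def ensemble_probs(X):
    # Average the members' predictive distributions.
    return np.mean([m.predict_proba(X) for m in ensemble], axis=0)

def predictive_entropy(probs):
    # Entropy of the averaged prediction: a common ensemble OOD / uncertainty score.
    return -np.sum(probs * np.log(probs + 1e-12), axis=1)

noise = np.random.default_rng(0).uniform(0, 16, size=(200, 64))  # junk inputs as a crude OOD set
print("in-dist entropy:", predictive_entropy(ensemble_probs(X_test)).mean())
print("noise entropy  :", predictive_entropy(ensemble_probs(noise)).mean())  # usually clearly higher
```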

01:39:11
Have you tried using Anchored Ensembles (https://arxiv.org/pdf/1810.05546.pdf) in such experiments?
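For readers unfamiliar with the reference: anchored ensembles (Pearce et al.) change only the regularizer relative to a standard deep ensemble - each member's weights are pulled toward its own fixed random "anchor" draw (typically from the initialization / prior distribution) rather than toward zero. Below is a rough PyTorch sketch of that loss with my own simplified scaling; the paper derives the exact regularization strength from the prior variance.

```python
import torch
import torch.nn as nn

def make_member(in_dim=64, hidden=64, n_classes=10):
    # Any ordinary network works; the anchoring scheme is architecture-agnostic.
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, n_classes))

def anchored_loss(model, anchor, logits, targets, lam=1e-3):
    # Cross-entropy plus a pull toward this member's own fixed anchor weights.
    # lam is a stand-in for the prior-derived scaling in the paper; treat it
    # as a regularisation strength to tune.
    ce = nn.functional.cross_entropy(logits, targets)
    reg = sum(((p - a) ** 2).sum() for p, a in zip(model.parameters(), anchor))
    return ce + lam * reg

# One anchor per member, drawn once (here: the member's own random init) and frozen.
members = [make_member() for _ in range(5)]
anchors = [[p.detach().clone() for p in m.parameters()] for m in members]

# Toy training step on random data, only to show how the pieces fit together.
x, y = torch.randn(32, 64), torch.randint(0, 10, (32,))
for member, anchor in zip(members, anchors):
    opt = torch.optim.SGD(member.parameters(), lr=1e-2)
    opt.zero_grad()
    anchored_loss(member, anchor, member(x), y).backward()
    opt.step()
```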

01:39:56
I think he's close to the Q&A part so let's wait till then?

02:04:08
Please continue if possible

02:04:10
Thanks for the great talk!

02:04:18
Thank you for having this!! This was great.