Alex Dimakis   @AlexGDimakis

UT Austin Professor. Researcher in Machine Learning and Information Theory.


Alex Dimakis    @AlexGDimakis    3 hours      

NeurIPS best paper awards: congratulations to all the recipients!
  



Alex Dimakis    @AlexGDimakis    11/24/2021      

Honored to be selected as an IEEE Fellow 'for contributions to distributed coding and learning.' Congratulations to the whole Fellows class of 2022! https://t.co/T1PQBs6EJI
  



Alex Dimakis    @AlexGDimakis    11/23/2021      

Shocked to learn that Alexander Graham Bell designed a Sierpinski-triangle airplane before Sierpinski.
  



Alex Dimakis    @AlexGDimakis    11/23/2021      

If you had to identify the top 1 percent of papers and I had to independently identify the top 1 percent, and we agreed on half of the papers, that would be quite strong alignment of our rankings.
  
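A quick back-of-the-envelope check of why 50% overlap is far beyond chance, as a minimal Python simulation (the submission count and list size below are made-up illustration values, not real conference numbers): under independent random selection, the expected overlap between two top-1% lists is only about 1%.

    import random

    # Hypothetical illustration values: 10,000 submissions, top 1% = 100 papers.
    n_papers = 10_000
    k = n_papers // 100            # size of each "top 1 percent" list
    n_trials = 1_000

    overlaps = []
    for _ in range(n_trials):
        mine = set(random.sample(range(n_papers), k))
        yours = set(random.sample(range(n_papers), k))
        overlaps.append(len(mine & yours) / k)

    # Expected overlap under independent random picks is ~k/n_papers = 1%,
    # so agreeing on half the papers is a very strong signal of alignment.
    print(f"mean overlap under random selection: {sum(overlaps) / len(overlaps):.1%}")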



Alex Dimakis    @AlexGDimakis    11/17/2021      

While waiting for the #CVPR2022 CMT site to come back up, I would like to propose a simple cryptographic solution to the submission-deadline crunch: we upload only a SHA256 hash of our to-be-submitted PDF now, and then upload the committed PDF any time next week.
  
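A minimal sketch of the commit-then-reveal idea, using Python's standard hashlib (the file name is a placeholder): upload only the digest by the deadline, upload the PDF later, and anyone can re-hash it to verify it matches the commitment.

    import hashlib

    def commit(pdf_path: str) -> str:
        """Return the SHA256 hex digest of the to-be-submitted PDF."""
        h = hashlib.sha256()
        with open(pdf_path, "rb") as f:
            # Hash in 1 MiB chunks so large PDFs don't need to fit in memory.
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Placeholder file name, for illustration only:
    # print(commit("cvpr2022_submission.pdf"))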



Alex Dimakis    @AlexGDimakis    11/11/2021      

We have multiple postdoc openings at the AI Institute for the Foundations of Machine Learning (IFML). Fellows can work with all IFML groups at UT Austin, the University of Washington, and Microsoft Research https://t.co/k33XqTtX8R (1/3)
  



Alex Dimakis    @AlexGDimakis    11/11/2021      

I cannot understand people who use spaces in file names. And don't get me started on non-ASCII characters ®.
  



Alex Dimakis    @AlexGDimakis    11/11/2021      

Tried to get a photo of a pony but only got a tiny part of the legs? No problem, this conditional diffusion generative model will imagine a pony for you.
  



Alex Dimakis    @AlexGDimakis    11/8/2021      

Proud advisor moment: congrats to @Qi_Lei_ on being selected as a rising star in Machine Learning!
  



Alex Dimakis    @AlexGDimakis    11/6/2021      

One more thing on Shannon-Nyquist: the fact that you can roughly approximate any smooth function by sampling and interpolating is obvious. The theorem is that you can *perfectly reconstruct* a continuous function if it is band-limited and you interpolate with sinc.
  
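A small numerical illustration of the Whittaker-Shannon interpolation formula x(t) = sum_n x(nT) sinc((t - nT)/T), using NumPy. The signal and rates below are made-up illustration values, and with a finite window of samples the reconstruction is only approximately exact away from the window edges.

    import numpy as np

    fs = 10.0                          # sampling rate (Hz); Nyquist frequency is 5 Hz
    T = 1.0 / fs
    n = np.arange(-200, 201)           # finite window of sample indices

    def x(t):
        # Band-limited test signal: all frequency content is below 5 Hz.
        return np.sin(2 * np.pi * 1.3 * t) + 0.5 * np.cos(2 * np.pi * 3.7 * t)

    samples = x(n * T)
    t = np.linspace(-1.0, 1.0, 501)    # off-grid points, far from the window edges

    # Sinc interpolation: recon(t) = sum_n samples[n] * sinc((t - n*T) / T).
    recon = samples @ np.sinc((t[None, :] - n[:, None] * T) / T)

    print("max reconstruction error:", np.max(np.abs(recon - x(t))))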



Alex Dimakis    @AlexGDimakis    11/6/2021      

Here is a very good reason why the Nyquist-Shannon sampling theorem requires your signal to be low-pass filtered before you sub-sample to downscale. If you just sub-sample without smoothing, a bad guy can place another image exactly on the pixels you sub-sample. Adversarial aliasing.
  
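A minimal NumPy sketch of the attack, with random arrays standing in for images: tamper only with the pixels that a stride-8 subsampler reads, so naive subsampling returns the hidden image while an averaging (low-pass) downscale barely changes.

    import numpy as np

    rng = np.random.default_rng(0)
    stride = 8
    benign = rng.uniform(size=(256, 256))   # stand-in for the innocent-looking image
    hidden = rng.uniform(size=(32, 32))     # stand-in for the attacker's hidden image

    # Overwrite only the pixels that naive stride-8 subsampling will read (1 in 64).
    attacked = benign.copy()
    attacked[::stride, ::stride] = hidden

    def avg_pool(img):
        # Crude low-pass downscale: average each 8x8 block.
        return img.reshape(32, stride, 32, stride).mean(axis=(1, 3))

    naive = attacked[::stride, ::stride]    # sub-sample without smoothing
    print("naive downscale equals the hidden image:", np.allclose(naive, hidden))
    print("low-pass downscale barely moved:",
          np.max(np.abs(avg_pool(attacked) - avg_pool(benign))))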



Alex Dimakis    @AlexGDimakis    11/5/2021      

Founding YouTube email: YouTube started as a video version of Hot or Not.
  



Alex Dimakis    @AlexGDimakis    11/5/2021      

Had a great time at the joint @MLFoundations @SimonsInstitute workshop! Check out the great program and videos available online.
  



Alex Dimakis    @AlexGDimakis    11/4/2021      

A 3.5% reduction in NeurIPS submissions from 2020 to 2021. Seems we're past peak NeurIPS.
  



Alex Dimakis    @AlexGDimakis    10/9/2021      

The causal inference and information theory AAAI workshop deadline is approaching.
  



Alex Dimakis    @AlexGDimakis    10/8/2021      

On the difference between (classical) Statistics and Machine Learning, I found this gem by Leo Breiman: 'Statistical Modeling: The Two Cultures' https://t.co/nm5c54rrOz
  



Alex Dimakis    @AlexGDimakis    10/4/2021      

Of course! A ranking based on student placement would be great. A related trick I recommend to students: read dissertation acknowledgements from recent grads. There is interesting signal in how advisors are thanked. 🤓
  



Alex Dimakis    @AlexGDimakis    9/30/2021      

Fitting theoretical predictions to explain a real phenomenon can always be achieved by tuning free parameters in the theory. Famously, "with four parameters I can fit an elephant, and with five I can make him wiggle his trunk."
  
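In the same spirit, a tiny NumPy sketch of how free parameters absorb arbitrary data (the observations below are made up): a cubic has four coefficients, so it passes exactly through any four points.

    import numpy as np

    t = np.array([0.0, 1.0, 2.0, 3.0])
    y = np.array([2.0, -1.0, 4.0, 0.5])      # made-up "observations"

    coeffs = np.polyfit(t, y, deg=3)         # four free parameters: a cubic
    residual = np.max(np.abs(np.polyval(coeffs, t) - y))
    print("number of parameters:", len(coeffs), "max residual:", residual)  # residual ~ 0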