I was involved in machine learning and AI a few years ago, mainly before the rise of diffusion models, large transformers (GPT*), graph neural networks, and neural ODEs. I'm comfortable with autograd/computation graphs, PyTorch, "classic" neural nets and the ones used in vision applications, as well as the basics of Transformer networks (I've trained a few smaller ones myself) and RNNs.

Do you know of any good resources to slowly get back into the loop? So far I plan to read through the original diffusion/GPT papers and go from there, but I'd love to hear what you consider good sources. I would especially appreciate Jupyter notebooks to fiddle with, since I find I learn best when I get to play around with the code. Thank you!
