Dillon Pulliam

Deep Learning Researcher

Deep Learning Paper Recap - Diffusion and Transformer Models

This week’s Deep Learning Paper Reviews are Diffusion-LM Improves Controllable Text Generation and Sparsifying Transformer Models with Trainable Representation Pooling.

Review – TOXIGEN & Knowledge Distillation Meets Open-Set Semi-Supervised Learning

This week’s Deep Learning Paper Reviews are TOXIGEN: A Large-Scale Machine-Generated Dataset for Adversarial and Implicit Hate Speech Detection and Knowledge Distillation Meets Open-Set Semi-Supervised Learning.

Review - Perceiver: General Perception with Iterative Attention

This week’s Deep Learning Paper Review is Perceiver: General Perception with Iterative Attention.

Review - SimCLS and RefSum - Summarization Techniques

This week’s Deep Learning research papers are SimCLS: A Simple Framework for Contrastive Learning of Abstractive Summarization and RefSum: Refactoring Neural Summarization.

New: Improved Topic Detection and IAB Classification

We’re excited to release a major accuracy improvement to our Topic Detection feature, which can accurately predict the topics spoken in audio/video files.

Fine-Tuning Transformers for NLP

Since the Attention Is All You Need paper, Transformers have completely redefined the field of Natural Language Processing. In this blog, we show you how to quickly fine-tune Transformers for numerous downstream tasks, where they often perform remarkably well out of the box!
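To give a flavor of how little code such fine-tuning takes, here is a minimal sketch (not the post's exact code) using the Hugging Face transformers and datasets libraries; the distilbert-base-uncased checkpoint, the IMDB dataset, and the training settings are illustrative assumptions, not prescriptions from the post.

```python
# Hedged sketch: fine-tuning a pretrained Transformer on a downstream
# text-classification task. Model checkpoint, dataset, and hyperparameters
# below are illustrative assumptions.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "distilbert-base-uncased"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tokenize a small sentiment dataset (IMDB used here purely as an example).
dataset = load_dataset("imdb")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=256)

tokenized = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="finetune-out",
        num_train_epochs=1,
        per_device_train_batch_size=16,
    ),
    # Small subsets keep this sketch quick to run end to end.
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=tokenized["test"].select(range(500)),
)
trainer.train()
```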