Dequan Wang @ UC Berkeley
Selected Publications
GACT: Activation Compressed Training for Generic Network Architectures
GACT is an activation compressed training (ACT) framework that supports a broad range of machine learning tasks for generic neural network architectures with limited domain knowledge. By analyzing a linearized version of ACT's approximate gradient, we prove the convergence of GACT without prior knowledge of the operator type or model architecture.
Xiaoxuan Liu, Lianmin Zheng, Dequan Wang, Yukuo Cen, Weize Chen, Xu Han, Jianfei Chen, Zhiyuan Liu, Jie Tang, Joseph Gonzalez, Michael Mahoney, Alvin Cheung
PDF
Cite
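The idea behind activation compression can be sketched in a few lines: activations are stored in low precision during the forward pass and dequantized when the backward pass needs them. This is a hypothetical NumPy illustration of per-tensor affine quantization, not the GACT implementation (all function names are illustrative):

```python
import numpy as np

def quantize(x, bits=4):
    # map activations into [0, 2^bits - 1] integers with an affine transform
    lo, hi = float(x.min()), float(x.max())
    scale = (hi - lo) / (2**bits - 1)
    q = np.round((x - lo) / (scale + 1e-12)).astype(np.uint8)
    return q, lo, scale

def dequantize(q, lo, scale):
    # recover an approximation of the original activations for the backward pass
    return q.astype(np.float32) * scale + lo

x = np.random.default_rng(0).normal(size=(4, 16)).astype(np.float32)
q, lo, scale = quantize(x, bits=4)
x_hat = dequantize(q, lo, scale)
err = float(np.abs(x - x_hat).max())  # rounding error is bounded by the scale
```

Storing `q` instead of `x` cuts activation memory roughly 8x at 4 bits, at the cost of the approximation error that the convergence analysis above accounts for.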
Tent: Fully Test-time Adaptation by Entropy Minimization
Tent equips a model to adapt itself to new and different data during testing ☀️ 🌧 ❄️. Tented models adapt online and batch-by-batch to reduce error on dataset shifts like corruptions, simulation-to-real discrepancies, and other differences between training and testing data.
Dequan Wang, Evan Shelhamer, Shaoteng Liu, Bruno Olshausen, Trevor Darrell
PDF
Cite
Code
Video
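The entropy-minimization objective at the heart of test-time adaptation can be sketched independently of any framework. This is a hypothetical NumPy illustration, not the released Tent code; it descends the analytic gradient of the mean softmax prediction entropy with respect to the logits:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def mean_entropy(z):
    p = softmax(z)
    return float(-(p * np.log(p + 1e-12)).sum(axis=1).mean())

def entropy_grad(z):
    # gradient of per-row entropy H = -sum_i p_i log p_i w.r.t. the logits:
    # dH/dz_j = -p_j * (log p_j + H)
    p = softmax(z)
    logp = np.log(p + 1e-12)
    H = -(p * logp).sum(axis=1, keepdims=True)
    return -p * (logp + H) / z.shape[0]  # mean over the batch

logits = np.random.default_rng(0).normal(size=(8, 10))  # uncertain predictions
before = mean_entropy(logits)
for _ in range(50):                     # a few entropy-minimization steps
    logits -= 1.0 * entropy_grad(logits)
after = mean_entropy(logits)            # predictions become more confident
```

In Tent the update is applied to normalization parameters of the network rather than to the logits directly, but the loss being minimized is the same.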
Objects as Points
We represent objects by a single point at their bounding box center. Other properties, such as object size, dimension, 3D extent, orientation, and pose are then regressed directly from image features at the center location.
Xingyi Zhou, Dequan Wang, Philipp Krähenbühl
PDF
Cite
Code
Video
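The decoding step implied by the abstract, finding peaks in a center heatmap and reading box sizes from regression maps at those locations, can be sketched as follows. This is a hypothetical NumPy illustration, not the released code:

```python
import numpy as np

def decode_centers(heatmap, wh, k=1):
    # heatmap: (H, W) center scores; wh: (2, H, W) regressed width/height maps
    idx = np.argsort(heatmap.ravel())[::-1][:k]   # top-k center scores
    ys, xs = np.unravel_index(idx, heatmap.shape)
    boxes = []
    for x, y in zip(xs, ys):
        w, h = wh[0, y, x], wh[1, y, x]           # size read at the center pixel
        boxes.append((x - w / 2, y - h / 2, x + w / 2, y + h / 2))
    return boxes

# toy example: a single center at (x=5, y=3) with a 4x2 box
hm = np.zeros((8, 8)); hm[3, 5] = 1.0
wh = np.zeros((2, 8, 8)); wh[:, 3, 5] = (4.0, 2.0)
boxes = decode_centers(hm, wh, k=1)
```

Because each object is a single point, this decoding needs no anchor matching or non-maximum suppression over overlapping proposals.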