Search Results for author: Yuiko Sakuma

Found 4 papers, 0 papers with code

Mixed-precision Supernet Training from Vision Foundation Models using Low Rank Adapter

no code implementations • 29 Mar 2024 • Yuiko Sakuma, Masakazu Yoshimura, Junji Otsuka, Atsushi Irie, Takeshi Ohashi

To tackle these challenges, first, we study the effective search space design for fine-tuning a VFM by comparing different operators (such as resolution, feature size, width, depth, and bit-widths) in terms of performance and BitOPs reduction.

Neural Architecture Search
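
The abstract above compares candidate operators (resolution, feature size, width, depth, bit-widths) by performance versus BitOPs reduction. As a rough, hedged illustration (not the paper's code), the BitOPs of a layer are often estimated as its MAC count weighted by the weight and activation bit-widths; the layer shape and bit-width choices below are invented for the example.

```python
# Illustrative sketch: estimating the BitOPs saving of a mixed-precision
# choice for one convolution layer. The formula (MACs * w_bits * a_bits) and
# all shapes/bit-widths are assumptions, not values from the paper.

def conv_macs(out_h, out_w, in_ch, out_ch, k):
    """Multiply-accumulate count of a standard convolution."""
    return out_h * out_w * in_ch * out_ch * k * k

def bitops(macs, w_bits, a_bits):
    """BitOPs: MACs weighted by weight and activation bit-widths."""
    return macs * w_bits * a_bits

macs = conv_macs(out_h=56, out_w=56, in_ch=64, out_ch=128, k=3)
baseline = bitops(macs, w_bits=8, a_bits=8)   # uniform 8-bit layer
candidate = bitops(macs, w_bits=4, a_bits=8)  # 4-bit weights, 8-bit activations
print(f"BitOPs reduction: {1 - candidate / baseline:.1%}")  # 50.0%
```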

DetOFA: Efficient Training of Once-for-All Networks for Object Detection Using Path Filter

no code implementations • 23 Mar 2023 • Yuiko Sakuma, Masato Ishii, Takuya Narihira

We address the challenge of training a large supernet for the object detection task, using a relatively small amount of training data.

Neural Architecture Search • object-detection • +2
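
The "path filter" in the title suggests pruning unpromising candidate sub-networks before once-for-all supernet training, so that a limited amount of training data is spent on useful paths. The sketch below is a hedged illustration of that general idea, not the authors' method; the search space, cost proxy, and budget are invented for the example.

```python
# Illustrative sketch: filter candidate paths (sub-networks) of a supernet
# by a rough compute proxy before training. All values are assumptions.
import itertools

search_space = {
    "depth": [2, 3, 4],
    "width_mult": [0.5, 0.75, 1.0],
    "kernel": [3, 5],
}

def relative_cost(path):
    """Rough relative-compute proxy for a candidate path (assumption)."""
    return path["depth"] * path["width_mult"] ** 2 * path["kernel"] ** 2

all_paths = [dict(zip(search_space, combo))
             for combo in itertools.product(*search_space.values())]

# "Path filter": keep only candidates within a target compute budget so the
# supernet's limited training signal concentrates on promising sub-networks.
budget = 50.0
kept = [p for p in all_paths if relative_cost(p) <= budget]
print(f"kept {len(kept)} of {len(all_paths)} candidate paths")
```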

n-hot: Efficient bit-level sparsity for powers-of-two neural network quantization

no code implementations • 22 Mar 2021 • Yuiko Sakuma, Hiroshi Sumihiro, Jun Nishikawa, Toshiki Nakamura, Ryoji Ikegaya

Moreover, we use a two-stage fine-tuning algorithm to recover the accuracy lost by introducing bit-level sparsity.

Object Detection • +1
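
The title describes powers-of-two quantization with bit-level ("n-hot") sparsity, i.e., each weight is represented by only a few signed power-of-two terms so that multiplications reduce to shifts and adds. The greedy residual decomposition below is a minimal sketch of that kind of representation, not the paper's algorithm; the number of terms and the exponent range are illustrative assumptions.

```python
# Illustrative sketch: approximate each weight by a sum of at most n signed
# powers of two (greedy residual fitting). Not the paper's algorithm.
import numpy as np

def n_hot_quantize(w, n=2, min_exp=-6, max_exp=0):
    """Greedily approximate scalar w by a sum of n signed powers of two."""
    approx = 0.0
    for _ in range(n):
        r = w - approx
        if r == 0.0:
            break
        exp = int(np.clip(np.round(np.log2(abs(r))), min_exp, max_exp))
        approx += np.sign(r) * 2.0 ** exp
    return approx

weights = np.array([0.37, -0.81, 0.05, 0.62])
quantized = np.array([n_hot_quantize(float(w), n=2) for w in weights])
print(quantized)  # e.g. [ 0.375 -0.75 0.046875 0.625 ]
```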
