Search Results for author: Shinichi Shirakawa

Found 18 papers, 8 papers with code

Probabilistic Model-Based Dynamic Architecture Search

no code implementations • ICLR 2019 • Nozomu Yoshinari, Kento Uchida, Shota Saito, Shinichi Shirakawa, Youhei Akimoto

The experimental results show that the proposed architecture search method is fast and achieves performance comparable to existing methods.

Image Classification Neural Architecture Search

CMA-ES with Adaptive Reevaluation for Multiplicative Noise

no code implementations • 19 May 2024 • Kento Uchida, Kenta Nishihara, Shinichi Shirakawa

We derive that the set of maximizers of the noise-independent utility, which is used in the reevaluation technique, certainly contains the optimal solution, while the noise-dependent utility, which is used in the population size and learning rate adaptations, does not satisfy this property under multiplicative noise.

CMA-ES for Safe Optimization

no code implementations • 17 May 2024 • Kento Uchida, Ryoki Hamano, Masahiro Nomura, Shota Saito, Shinichi Shirakawa

This optimization setting is known as safe optimization and formulated as a specialized type of constrained optimization problem with constraints for safety functions.

Bayesian Optimization

(1+1)-CMA-ES with Margin for Discrete and Mixed-Integer Problems

no code implementations • 1 May 2023 • Yohei Watanabe, Kento Uchida, Ryoki Hamano, Shota Saito, Masahiro Nomura, Shinichi Shirakawa

The margin correction has been applied to ($\mu/\mu_\mathrm{w}$,$\lambda$)-CMA-ES, while this paper introduces the margin correction into (1+1)-CMA-ES, an elitist version of CMA-ES.
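
The elitist (1+1) scheme this excerpt refers to can be illustrated with a minimal (1+1)-ES skeleton: one parent, one offspring, and acceptance only on improvement. This sketch uses the classic 1/5 success rule for step-size adaptation and omits the covariance adaptation and margin correction of the actual paper; the constants are illustrative.

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, iters=500, seed=1):
    # (1+1)-ES skeleton: one parent, one offspring per generation; the
    # offspring replaces the parent only if it is no worse (elitism).
    # Step size is adapted with the classic 1/5 success rule.
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy <= fx:          # elitist acceptance
            x, fx = y, fy
            sigma *= 1.22     # expand step size on success
        else:
            sigma *= 0.95     # shrink on failure (~1/5 rule)
    return x, fx
```

On a smooth unimodal function such as the sphere, this loop contracts steadily toward the optimum because a worse offspring can never replace the parent.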

Simple Domain Generalization Methods are Strong Baselines for Open Domain Generalization

1 code implementation • 31 Mar 2023 • Masashi Noguchi, Shinichi Shirakawa

In real-world applications, a machine learning model is required to handle an open-set recognition (OSR), where unknown classes appear during the inference, in addition to a domain shift, where the distribution of data differs between the training and inference phases.

Data Augmentation Domain Generalization +3

Marginal Probability-Based Integer Handling for CMA-ES Tackling Single- and Multi-Objective Mixed-Integer Black-Box Optimization

1 code implementation • 19 Dec 2022 • Ryoki Hamano, Shota Saito, Masahiro Nomura, Shinichi Shirakawa

If the CMA-ES is applied to the MI-BBO with straightforward discretization, however, the variance corresponding to the integer variables becomes much smaller than the granularity of the discretization before reaching the optimal solution, which leads to the stagnation of the optimization.

Efficient Search of Multiple Neural Architectures with Different Complexities via Importance Sampling

no code implementations • 21 Jul 2022 • Yuhei Noda, Shota Saito, Shinichi Shirakawa

The proposed method allows us to obtain multiple architectures with different complexities in a single architecture search, thereby reducing the search cost.

Neural Architecture Search

CMA-ES with Margin: Lower-Bounding Marginal Probability for Mixed-Integer Black-Box Optimization

2 code implementations • 26 May 2022 • Ryoki Hamano, Shota Saito, Masahiro Nomura, Shinichi Shirakawa

If the CMA-ES is applied to the MI-BBO with straightforward discretization, however, the variance corresponding to the integer variables becomes much smaller than the granularity of the discretization before reaching the optimal solution, which leads to the stagnation of the optimization.
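
The stagnation described here and the lower-bounding idea in the title can be sketched in one dimension: if the Gaussian's standard deviation shrinks far below the rounding granularity, almost every sample rounds to the same integer and no progress is possible. A margin correction keeps the probability of sampling a different integer above a threshold. This is only an illustration of the mechanism (the paper corrects the mean vector and covariance elements, not just a scalar step size); `alpha` and the inflation factor are illustrative.

```python
import math

def norm_cdf(x, mu, sigma):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def prob_other_integer(mu, sigma):
    # Probability that a sample from N(mu, sigma^2) rounds to a different
    # integer than round(mu), i.e. mass outside [round(mu)-0.5, round(mu)+0.5].
    lo, hi = round(mu) - 0.5, round(mu) + 0.5
    return 1.0 - (norm_cdf(hi, mu, sigma) - norm_cdf(lo, mu, sigma))

def margin_correction(mu, sigma, alpha=0.2):
    # If the probability of leaving the current integer plateau has fallen
    # below the margin alpha, inflate sigma until it is restored.
    while prob_other_integer(mu, sigma) < alpha:
        sigma *= 1.1
    return sigma
```

With a tiny sigma the plateau-escape probability is essentially zero, which is exactly the stagnation the excerpt describes; the correction restores it to at least `alpha`.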

A Two-phase Framework with a Bézier Simplex-based Interpolation Method for Computationally Expensive Multi-objective Optimization

1 code implementation • 29 Mar 2022 • Ryoji Tanabe, Youhei Akimoto, Ken Kobayashi, Hiroshi Umeki, Shinichi Shirakawa, Naoki Hamada

The first phase in TPB aims to approximate a few Pareto optimal solutions by optimizing a sequence of single-objective scalar problems.
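
The idea of approximating a few Pareto-optimal solutions through a sequence of single-objective scalar problems can be sketched with a weighted-sum scalarization. This is a generic illustration under assumed toy objectives, not TPB's exact scalarization or solver; the grid-search minimizer is a stand-in for a real single-objective optimizer.

```python
def minimize_1d(g, lo=-1.0, hi=3.0, steps=4001):
    # Crude grid search for the minimizer of g on [lo, hi]
    # (a placeholder for a proper single-objective optimizer).
    best_x, best_v = lo, g(lo)
    for i in range(1, steps):
        x = lo + (hi - lo) * i / (steps - 1)
        v = g(x)
        if v < best_v:
            best_x, best_v = x, v
    return best_x

def pareto_phase(f1, f2, weights):
    # Phase-1 sketch: approximate a few Pareto-optimal solutions by
    # solving a sequence of weighted-sum scalarized problems.
    return [minimize_1d(lambda x, w=w: w * f1(x) + (1.0 - w) * f2(x))
            for w in weights]
```

For the convex pair f1(x) = x^2 and f2(x) = (x-2)^2, the weighted-sum minimizer is x = 2(1-w), so each weight recovers one Pareto-optimal point.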

NAS-HPO-Bench-II: A Benchmark Dataset on Joint Optimization of Convolutional Neural Network Architecture and Training Hyperparameters

1 code implementation • 19 Oct 2021 • Yoichi Hirose, Nozomu Yoshinari, Shinichi Shirakawa

Building the benchmark dataset for joint optimization of architecture and training hyperparameters is essential to further NAS research.

4k Benchmarking +1

Controlling Model Complexity in Probabilistic Model-Based Dynamic Optimization of Neural Network Structures

no code implementations • 15 Jul 2019 • Shota Saito, Shinichi Shirakawa

We focus on the probabilistic model-based dynamic neural network structure optimization that considers the probability distribution of structure parameters and simultaneously optimizes both the distribution parameters and connection weights based on gradient methods.

Neural Architecture Search

Adaptive Stochastic Natural Gradient Method for One-Shot Neural Architecture Search

1 code implementation • 21 May 2019 • Youhei Akimoto, Shinichi Shirakawa, Nozomu Yoshinari, Kento Uchida, Shota Saito, Kouhei Nishida

It accepts an arbitrary search space (widely applicable) and enables gradient-based simultaneous optimization of weights and architecture (fast).

Image Classification Neural Architecture Search

Parameterless Stochastic Natural Gradient Method for Discrete Optimization and its Application to Hyper-Parameter Optimization for Neural Network

no code implementations • 18 Sep 2018 • Kouhei Nishida, Hernan Aguirre, Shota Saito, Shinichi Shirakawa, Youhei Akimoto

This paper proposes a parameterless BBDO algorithm based on information geometric optimization, a recent framework for black box optimization using stochastic natural gradient.

Sample Reuse via Importance Sampling in Information Geometric Optimization

no code implementations • 31 May 2018 • Shinichi Shirakawa, Youhei Akimoto, Kazuki Ouchi, Kouzou Ohara

The experimental results show that the sample reuse helps to reduce the number of function evaluations on many benchmark functions for both the PBIL and the pure rank-$\mu$ update CMA-ES.

Evolutionary Algorithms Incremental Learning
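
The sample-reuse mechanism the excerpt evaluates can be sketched with self-normalized importance sampling: samples drawn from the previous search distribution are reused to estimate an expectation under the updated distribution, weighted by the density ratio. A minimal one-dimensional Gaussian sketch, with illustrative names (the paper applies this inside the IGO update for PBIL and the rank-mu CMA-ES):

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    # Density of N(mu, sigma^2) at x.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def reuse_estimate(old_samples, old_mu, old_sigma, new_mu, new_sigma, f):
    # Self-normalized importance sampling: reuse samples drawn from the
    # previous distribution p_old to estimate E_{p_new}[f(x)], weighting
    # each sample by the density ratio p_new(x) / p_old(x).
    weights = [gauss_pdf(x, new_mu, new_sigma) / gauss_pdf(x, old_mu, old_sigma)
               for x in old_samples]
    total = sum(weights)
    return sum(w * f(x) for w, x in zip(weights, old_samples)) / total
```

This is why reuse saves function evaluations: old samples, for which f has already been evaluated, still contribute to the new update through the weights.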

Dynamic Optimization of Neural Network Structures Using Probabilistic Modeling

no code implementations • 23 Jan 2018 • Shinichi Shirakawa, Yasushi Iwata, Youhei Akimoto

We consider a probability distribution that generates network structures, and optimize the parameters of the distribution instead of directly optimizing the network structure.
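
The distribution-over-structures idea can be sketched with independent Bernoulli variables over connections, updated by a log-likelihood (REINFORCE-style) estimate of the natural gradient of the expected loss. This is a simplified sketch under assumed names and constants, not the paper's exact algorithm (which also trains the connection weights jointly):

```python
import random

def sample_structure(theta):
    # Sample a binary connection mask from independent Bernoulli(theta_i).
    return [1 if random.random() < t else 0 for t in theta]

def update_distribution(theta, loss_fn, lr=0.1, n_samples=8):
    # One stochastic natural-gradient step on the Bernoulli parameters:
    # for Bernoulli variables, the natural gradient estimate of the
    # expected loss reduces to E[(loss - baseline) * (m_i - theta_i)].
    samples = [sample_structure(theta) for _ in range(n_samples)]
    losses = [loss_fn(m) for m in samples]
    baseline = sum(losses) / n_samples
    new_theta = []
    for i, t in enumerate(theta):
        grad = sum((l - baseline) * (m[i] - t)
                   for l, m in zip(losses, samples)) / n_samples
        # Descend the expected loss; clip to keep a valid probability.
        new_theta.append(min(0.99, max(0.01, t - lr * grad)))
    return new_theta
```

Repeating this update shifts probability mass toward structures that achieve lower loss, without ever differentiating through the discrete structure itself.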

A Genetic Programming Approach to Designing Convolutional Neural Network Architectures

5 code implementations • 3 Apr 2017 • Masanori Suganuma, Shinichi Shirakawa, Tomoharu Nagao

To evaluate the proposed method, we constructed a CNN architecture for the image classification task with the CIFAR-10 dataset.

General Classification Image Classification +1
