Search Results for author: Juyang Weng

Found 5 papers, 0 papers with code

On "Deep Learning" Misconduct

no code implementations · 23 Nov 2022 · Juyang Weng

This paper establishes a theorem: a simple method called Pure-Guess Nearest Neighbor (PGNN) reaches any required error on the validation set and the test set, including a zero-error requirement, through the same misconduct, as long as the test set is in the possession of the authors and both the amount of storage space and the training time are finite but unbounded.
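The flaw the abstract points at can be illustrated with a minimal sketch (an assumption for illustration, not the paper's actual PGNN construction): if the labeled test set is in the author's possession and storage is unbounded, a nearest-neighbor lookup table built directly on the test inputs reports zero test error while learning nothing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "test set": inputs with arbitrary (even random) labels.
X_test = rng.normal(size=(100, 8))
y_test = rng.integers(0, 10, size=100)

class PureGuessNN:
    """Illustrative sketch only: with the test set in hand, a lookup
    table over the test inputs reaches zero test error using finite
    but unbounded storage."""

    def fit_on_test(self, X, y):
        self.X, self.y = X, y  # memorize the test set verbatim
        return self

    def predict(self, X):
        # Nearest-neighbor lookup; an exact match returns its stored label.
        d = ((X[:, None, :] - self.X[None, :, :]) ** 2).sum(axis=-1)
        return self.y[d.argmin(axis=1)]

model = PureGuessNN().fit_on_test(X_test, y_test)
error = (model.predict(X_test) != y_test).mean()
print(error)  # → 0.0 on the memorized test set
```

The point of the sketch is that a zero reported test error, by itself, certifies nothing about generalization.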

Why Deep Learning's Performance Data Are Misleading

no code implementations · 23 Aug 2022 · Juyang Weng

A theorem is established that the NNWT method reaches zero error on any validation set and any test set using the two misconducts, as long as the test set is in the possession of the author and both the amount of storage space and the training time are finite but unbounded, as with many deep learning methods.
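One of the misconducts the abstract refers to, post-selection on the test set, can be sketched as follows (a hedged illustration, not the paper's NNWT construction): train many candidate networks, here stand-ins that are purely random labelings, and report only the luckiest one's test error.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical binary test set with random labels: no predictor can
# beat chance in expectation.
y_test = rng.integers(0, 2, size=20)

# Stand-in for "many trained networks": each candidate is just a
# random labeling of the 20 test points.
candidates = [rng.integers(0, 2, size=20) for _ in range(10_000)]
errors = [(c != y_test).mean() for c in candidates]

# Post-selection misconduct: report only the luckiest candidate.
best = min(errors)
print(f"reported (post-selected) error: {best:.2f}")
print(f"average error over all candidates: {np.mean(errors):.2f}")
```

With enough candidates, the post-selected error drifts far below the honest average of roughly 0.5, even though every candidate is pure noise.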

Developmental Network Two, Its Optimality, and Emergent Turing Machines

no code implementations · 4 Aug 2022 · Juyang Weng, Zejia Zheng, Xiang Wu

By dynamic, we mean that the automatic selection of features, while disregarding distractors, is not static but is instead based on dynamic statistics (e.g., because of the instability of shadows in the context of landmarks).

Vocal Bursts Valence Prediction

Post-Selections in AI and How to Avoid Them

no code implementations · 19 Jun 2021 · Juyang Weng

To avoid future pitfalls in AI competitions, this paper proposes a new AI metric, called developmental errors, reported for all networks trained under Three Learning Conditions: (1) an incremental learning architecture (due to a "big data" flaw), (2) a training experience, and (3) a limited amount of computational resources.
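The proposed remedy of reporting errors for all networks trained, rather than only the luckiest one, can be sketched as a small reporting function (the function name and the error values are hypothetical, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical experiment: 20 networks trained (e.g., 20 random
# seeds), each yielding an error rate on the same evaluation data.
errors = rng.uniform(0.10, 0.30, size=20)

def report_developmental(errors):
    """Sketch of the proposed reporting discipline: publish the
    errors of ALL trained networks (mean, worst, full list), never
    only the luckiest one."""
    errors = np.asarray(errors)
    return {
        "all": np.sort(errors).round(3).tolist(),
        "mean": float(errors.mean()),
        "worst": float(errors.max()),
        "luckiest_only": float(errors.min()),  # the misleading number
    }

summary = report_developmental(errors)
print(summary["mean"], summary["luckiest_only"])
```

Reporting the mean and worst alongside the minimum makes the gap between the luckiest network and the typical one visible to the reader.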

Incremental Learning

A Model for Auto-Programming for General Purposes

no code implementations · 12 Oct 2018 · Juyang Weng

This theoretical work shows how the Developmental Network (DN) can accomplish this.
