no code implementations • PANDL (COLING) 2022 • Remo Nitschke, Yuwei Wang, Chen Chen, Adarsh Pyarelal, Rebecca Sharp
Natural language (as opposed to structured communication modes such as Morse code) is by far the most common mode of communication between humans, and can thus provide significant insight into both individual mental states and interpersonal dynamics.
no code implementations • 16 Apr 2024 • Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Bo Gao, Tianliu He, Wen Wang
On-device intelligence (ODI) enables artificial intelligence (AI) applications to run on end devices, providing real-time and customized AI inference without relying on remote servers.
no code implementations • 12 Mar 2024 • Yao Liang, Yuwei Wang, Yang Li, Yi Zeng
Inspired by the idea that the functions of the brain are shaped by its geometric structure, this paper integrates this idea into LoRA and proposes a new matrix-transformation-based reparameterization method for efficient fine-tuning, named Matrix-Transformation based Low-Rank Adaptation (MTLoRA).
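For context, here is a minimal numpy sketch of the general idea: low-rank adaptation with an extra matrix transformation applied to the update. The dimensions, the identity initialization of `T`, and where the transformation is applied are all assumptions, since the snippet does not specify them.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 64, 64, 8, 16  # hypothetical sizes

# Frozen pretrained weight (standard LoRA setup).
W = rng.normal(size=(d_out, d_in))

# Trainable low-rank factors, as in vanilla LoRA.
A = rng.normal(scale=0.01, size=(r, d_in))
B = np.zeros((d_out, r))

# Hypothetical transformation matrix applied to the low-rank update;
# the snippet only says the update is reparameterized via a matrix
# transformation, so this placement is an assumption.
T = np.eye(d_out)

def adapted_forward(x):
    """Forward pass with the transformed low-rank update added in."""
    delta = T @ (B @ A)  # transformed low-rank update
    return (W + (alpha / r) * delta) @ x

x = rng.normal(size=(d_in,))
print(adapted_forward(x).shape)  # (64,)
```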
1 code implementation • 10 Mar 2024 • Xinhao Cai, Qiuxia Lai, Yuwei Wang, Wenguan Wang, Zeren Sun, Yazhou Yao
Object detection in remote sensing images (RSIs) often suffers from several growing challenges, including large variations in object scale and widely varying context.
no code implementations • 29 Feb 2024 • Yi Zeng, Feifei Zhao, Yuxuan Zhao, Dongcheng Zhao, Enmeng Lu, Qian Zhang, Yuwei Wang, Hui Feng, Zhuoya Zhao, Jihang Wang, Qingqun Kong, Yinqian Sun, Yang Li, Guobin Shen, Bing Han, Yiting Dong, Wenxuan Pan, Xiang He, Aorigele Bao, Jin Wang
In this paper, we introduce a Brain-inspired and Self-based Artificial Intelligence (BriSe AI) paradigm.
no code implementations • 12 Jan 2024 • Yuwei Wang, Yi Zeng
Concept learning is a fundamental aspect of human cognition and plays a critical role in mental processes such as categorization, reasoning, memory, and decision-making.
no code implementations • 8 Jan 2024 • Yuhan Tang, Zhiyuan Wu, Bo Gao, Tian Wen, Yuwei Wang, Sheng Sun
Federated Distillation (FD) is a novel and promising distributed machine learning paradigm in which knowledge distillation is leveraged to enable more efficient and flexible cross-device knowledge transfer in federated learning.
2 code implementations • 1 Jan 2024 • Zhiyuan Wu, Tianliu He, Sheng Sun, Yuwei Wang, Min Liu, Bo Gao, Xuefeng Jiang
Federated Learning (FL) enables collaborative model training among participants while guaranteeing the privacy of raw data.
1 code implementation • 7 Dec 2023 • Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Tian Wen, Wen Wang
ALU drastically decreases the frequency of communication in federated distillation, thereby significantly reducing the communication overhead during the training process.
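As a rough illustration of lower-frequency communication in federated distillation (the snippet does not expand the abbreviation ALU or give its details), a client might accumulate its local knowledge over several rounds and upload only the aggregate; all names and sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
num_classes, proxy_size, local_rounds = 10, 32, 5  # hypothetical sizes

def local_logits(round_idx):
    # Stand-in for the client's logits on a shared proxy set this round.
    return rng.normal(size=(proxy_size, num_classes))

# Accumulate local knowledge across several rounds, then communicate
# once, instead of uploading after every round.
accumulator = np.zeros((proxy_size, num_classes))
for t in range(local_rounds):
    accumulator += local_logits(t)

upload = accumulator / local_rounds  # one upload per `local_rounds` rounds
print(upload.shape)
```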
1 code implementation • 1 Dec 2023 • Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Bo Gao, Quyang Pan, Tianliu He, Xuefeng Jiang
Federated Learning (FL) enables training Artificial Intelligence (AI) models over end devices without compromising their privacy.
no code implementations • 14 Nov 2023 • Yuwei Wang, Runhan Li, Hao Tan, Xuefeng Jiang, Sheng Sun, Min Liu, Bo Gao, Zhiyuan Wu
By fusing the logits of the two models, the private weak learner can capture the variance of different data, regardless of their category.
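A minimal sketch of logit fusion, assuming a simple convex combination (the snippet does not specify the fusion rule, so the weight `w` is hypothetical):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def fuse_logits(logits_a, logits_b, w=0.5):
    """Convex combination of two models' logits; w is a hypothetical
    fusion weight."""
    return w * logits_a + (1.0 - w) * logits_b

rng = np.random.default_rng(2)
la, lb = rng.normal(size=(4, 10)), rng.normal(size=(4, 10))
probs = softmax(fuse_logits(la, lb))
print(probs.sum(axis=-1))  # each row sums to 1
```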
no code implementations • 8 Nov 2023 • Gechun Liang, Moris S. Strub, Yuwei Wang
We consider a new framework of predictable relative forward performance processes (PRFPP) to study portfolio management within a competitive environment.
no code implementations • 9 Oct 2023 • Yuwei Wang, Enmeng Lu, Zizhe Ruan, Yao Liang, Yi Zeng
This paper presents the Social data and knowledge collective intelligence platform for TRaining Ethical AI Models (STREAM), which addresses the challenge of aligning AI models with human moral values and provides ethics datasets and knowledge bases to help AI models "follow good advice as naturally as a stream follows its course".
no code implementations • 14 Jul 2023 • Jingjing Xue, Min Liu, Sheng Sun, Yuwei Wang, Hui Jiang, Xuefeng Jiang
In this paper, we propose Federated learning with Bayesian Inference-based Adaptive Dropout (FedBIAD), which regards weight rows of local models as probability distributions and adaptively drops partial weight rows based on importance indicators correlated with the trend of local training loss.
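A rough sketch of the adaptive-dropout idea as described: drop weight rows whose importance is low, with the drop rate tied to the trend of the local training loss. The importance proxy and the schedule below are illustrative assumptions, not the paper's Bayesian indicator.

```python
import numpy as np

rng = np.random.default_rng(3)
W = rng.normal(size=(16, 8))  # one local weight matrix, rows = units

def drop_rate_from_loss_trend(losses, base=0.3):
    """Hypothetical schedule: drop more aggressively when local loss
    is falling (training is stable), less when it is rising."""
    trend = losses[-1] - losses[0]
    return base if trend < 0 else base / 2

def adaptive_row_dropout(W, importance, rate):
    """Zero out the least important rows; `importance` stands in for
    the paper's importance indicator, which the snippet does not
    fully specify."""
    k = int(rate * W.shape[0])
    dropped = np.argsort(importance)[:k]  # least important rows
    W_out = W.copy()
    W_out[dropped] = 0.0
    return W_out, dropped

losses = [0.9, 0.7, 0.6]             # recent local training losses
importance = np.abs(W).mean(axis=1)  # proxy importance per row
W_sparse, dropped = adaptive_row_dropout(
    W, importance, drop_rate_from_loss_trend(losses))
print(f"dropped rows: {sorted(dropped.tolist())}")
```

Only the surviving rows (or their updates) would then need to be uploaded, which is how the dropout translates into reduced communication.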
no code implementations • 17 Feb 2023 • Qingxiang Liu, Sheng Sun, Min Liu, Yuwei Wang, Bo Gao
In this paper, we perform the first study of traffic flow forecasting in an Online Learning (OL) manner within the FL framework, and propose a novel prediction method named Online Spatio-Temporal Correlation-based Federated Learning (FedOSTC), which aims to guarantee performance gains regardless of traffic fluctuations.
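For intuition, here is a minimal sketch of the online-learning ingredient: a predictor updated immediately on each arriving traffic observation. The linear autoregressive model and step size are illustrative assumptions; in the federated setting each client's weights would additionally be aggregated by the server.

```python
import numpy as np

rng = np.random.default_rng(4)
window, lr = 12, 1e-2
w = np.zeros(window)  # weights of a simple autoregressive predictor

def online_step(w, history, target):
    """One OL update: predict from the recent window, then adjust
    immediately on the observed flow (squared-error gradient step)."""
    pred = w @ history
    grad = (pred - target) * history
    return w - lr * grad, pred

# Synthetic traffic stream standing in for real flow measurements.
stream = np.sin(np.linspace(0, 20, 300)) + 0.1 * rng.normal(size=300)
for t in range(window, len(stream)):
    w, pred = online_step(w, stream[t - window:t], stream[t])
print(w.round(3))
```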
1 code implementation • 14 Jan 2023 • Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Xuefeng Jiang, Runhan Li, Bo Gao
The increasing demand for intelligent services and privacy protection of mobile and Internet of Things (IoT) devices motivates the wide application of Federated Edge Learning (FEL), in which devices collaboratively train on-device Machine Learning (ML) models without sharing their private data.
1 code implementation • 1 Jan 2023 • Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Xuefeng Jiang, Bo Gao
Federated Multi-task Learning (FMTL) is proposed to train related but personalized ML models for different devices, whereas previous works suffer from excessive communication overhead during training and neglect the model heterogeneity among devices in MEC.
1 code implementation • 25 Aug 2022 • Xuefeng Jiang, Sheng Sun, Yuwei Wang, Min Liu
Federated learning (FL) aims to learn joint knowledge from a large number of decentralized devices with labeled data in a privacy-preserving manner.
no code implementations • 18 Jul 2022 • Yi Zeng, Dongcheng Zhao, Feifei Zhao, Guobin Shen, Yiting Dong, Enmeng Lu, Qian Zhang, Yinqian Sun, Qian Liang, Yuxuan Zhao, Zhuoya Zhao, Hongjian Fang, Yuwei Wang, Yang Li, Xin Liu, Chengcheng Du, Qingqun Kong, Zizhe Ruan, Weida Bi
These brain-inspired AI models have been effectively validated on various supervised, unsupervised, and reinforcement learning tasks, and they can be used to endow AI models with multiple brain-inspired cognitive functions.
no code implementations • 11 Jul 2022 • Hongjian Fang, Yi Zeng, Jianbo Tang, Yuwei Wang, Yao Liang, Xin Liu
For the fields of neuroscience and cognitive science, this work provides a computational-modeling foundation for further exploration of how the human brain represents commonsense knowledge.
no code implementations • 27 Apr 2022 • Zeqian Li, Yuwei Wang, Kexun Chen, Zhibin Yu
To demonstrate the practicality of the pruning method, we select the YOLOv5 model for experiments and provide a dataset of outdoor obstacles to show the effect of the pruned model.
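A minimal PyTorch sketch of channel pruning on a single convolution, assuming an L1-magnitude criterion; the snippet does not state which criterion the paper actually uses.

```python
import torch
import torch.nn as nn

def prune_conv_channels(conv: nn.Conv2d, keep_ratio: float) -> nn.Conv2d:
    """Keep the output channels with the largest L1 weight norm.
    The L1 criterion here is an assumption for illustration."""
    norms = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    k = max(1, int(keep_ratio * conv.out_channels))
    keep = torch.argsort(norms, descending=True)[:k]
    pruned = nn.Conv2d(conv.in_channels, k, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep].clone()
    return pruned

conv = nn.Conv2d(3, 32, 3, padding=1)
print(prune_conv_channels(conv, 0.5))  # Conv2d(3, 16, ...)
```

In a full network, downstream layers must also be re-indexed to match the surviving channels, which is the main bookkeeping cost of structured pruning.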
2 code implementations • 14 Apr 2022 • Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Junbo Zhang, Zeju Li, Qingxiang Liu
Federated distillation (FD) is proposed to simultaneously address the above two problems, which exchanges knowledge between the server and clients, supporting heterogeneous local models while significantly reducing communication overhead.
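A minimal sketch of the knowledge exchange described: clients share predictions on a common proxy set rather than model weights, so local models can be heterogeneous. The proxy set, sizes, and mean aggregation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
num_clients, proxy_size, num_classes = 3, 16, 10  # hypothetical sizes

# Each client's logits on a shared public/proxy dataset; models can be
# heterogeneous because only logits (not weights) are exchanged.
client_logits = [rng.normal(size=(proxy_size, num_classes))
                 for _ in range(num_clients)]

# Server-side knowledge aggregation: average the clients' predictions.
global_knowledge = np.mean(client_logits, axis=0)

# Each client would then distill toward `global_knowledge` locally,
# e.g. with a KL term between its own logits and the aggregate.
print(global_knowledge.shape)  # (16, 10)
```

Because a logit table is typically far smaller than a model's weights, this exchange is what drives the communication savings the snippet mentions.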
no code implementations • 24 Jan 2022 • Ruyi Qu, Yi Yang, Yuwei Wang
The representative models are Faster R-CNN and the YOLO series.
no code implementations • 17 Oct 2021 • Gechun Liang, Moris S. Strub, Yuwei Wang
We study discrete-time predictable forward processes when trading times do not coincide with performance evaluation times in a binomial tree model for the financial market.
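For background, these are the standard binomial tree dynamics usually assumed in such market models; the paper's exact parametrization may differ.

```latex
% One-period binomial dynamics for the risky asset S, with up/down
% factors u, d, riskless rate r, and no-arbitrage condition.
\[
  S_{t+1} = S_t\,\xi_{t+1}, \qquad
  \xi_{t+1} =
  \begin{cases}
    u & \text{with probability } p,\\
    d & \text{with probability } 1-p,
  \end{cases}
  \qquad 0 < d < 1 + r < u .
\]
```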
no code implementations • 4 Dec 2020 • Bo Hu, Keping Qiu, Yue Cao, Junhao Liu, Yuwei Wang, Guangxing Li, Zhiqiang Shen, Juan Li, Junzhi Wang, Bin Li, Jian Dong
The DR21 south filament (DR21SF) is a unique component of the giant network of filamentary molecular clouds in the northern region of the Cygnus X complex.
no code implementations • 13 Oct 2019 • Yuwei Wang, Yan Zheng, Yanqing Peng, Chin-Chia Michael Yeh, Zhongfang Zhuang, Mahashweta Das, Mangesh Bendre, Feifei Li, Wei Zhang, Jeff M. Phillips
Embeddings are already essential tools for large language models and image analysis, and their use is being extended to many other research domains.
no code implementations • 17 Sep 2019 • Xiaoyu Yu, Yuwei Wang, Jie Miao, Ephrem Wu, Heng Zhang, Yu Meng, Bo Zhang, Biao Min, Dewei Chen, Jianlin Gao
Intensive computation is entering data centers in the form of multiple deep learning workloads.