
MIR Special Issue Call for Papers | Large-scale Pre-training: Data, Models, and Fine-tuning (Submission Deadline: June 30)


Machine Intelligence Research (MIR) is now openly soliciting original manuscripts for the special issue "Large-scale Pre-training: Data, Models, and Fine-tuning". The submission deadline is June 30, 2022. Guest editors of the special issue: Prof. Ji-Rong Wen (Renmin University of China), Prof. Zi Huang (The University of Queensland), and Prof. Hanwang Zhang (Nanyang Technological University). Submissions are welcome!




About the Special Issue


Machine Intelligence Research (ISSN 2731-538X) seeks original manuscripts for a special issue on "Large-scale Pre-training: Data, Models, and Fine-tuning".


Recent years have witnessed rapidly growing interest in, and fast development of, large-scale pretrained models, driven by the explosion of massive data and model parameters. Large-scale pretrained models have achieved milestone, exemplary performance on a broad range of practical problems, not only in computer science areas such as natural language processing, computer vision, and recommender systems, but also in other research areas such as biology, meteorology, and art. Unlike early non-neural models and small models, which rely heavily on hand-crafted features, statistical methods, and accurate human annotations, neural models can automatically learn low-level distributed representations and high-level latent semantic information from data. Because deep neural models with huge numbers of parameters tend to overfit and generalize poorly, massive efforts have been devoted to exploring how to pre-train large-scale models on large-scale data. Since large-scale human annotation is labor-intensive and time-consuming, fully supervised large-scale pre-training is impractical. Considering this issue, the AI community has recently directed its efforts toward self-supervised learning algorithms and theories, large-scale pre-training paradigms tailored to data formats, large-scale model architecture designs, and fine-tuning pre-trained models for downstream applications.
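To make the pre-train/fine-tune paradigm described above concrete, the following is a minimal illustrative sketch (not part of this call for papers), assuming the Hugging Face Transformers and PyTorch libraries; the checkpoint name and toy data are placeholders:

```python
# Illustrative sketch of fine-tuning a large-scale pretrained model on a
# downstream task. Checkpoint and data below are hypothetical examples.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load a pretrained language model; a fresh classification head is
# attached for the downstream task.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Fine-tuning: update the pretrained weights with a small learning rate
# on (much smaller) downstream data, instead of training from scratch.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
batch = tokenizer(["a great movie", "a dull movie"],
                  return_tensors="pt", padding=True)
labels = torch.tensor([1, 0])

model.train()
loss = model(**batch, labels=labels).loss  # cross-entropy on the new head
loss.backward()
optimizer.step()
```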



Scope of Submissions (including but not limited to)

This special issue seeks original and novel contributions towards advancing the theory, architecture, and algorithmic design of large-scale pre-trained models, as well as their downstream applications. The special issue will provide a timely collection of recent advances to benefit researchers and practitioners working in the broad research fields of deep learning, natural language processing, computer vision, and machine intelligence. Topics of interest include (but are not limited to):

- Language Pre-training

- Visual Pre-training

- Multi-modal Pre-training

- Multi-lingual Pre-training

- Large-scale Pre-training Theories

- Large-scale Pre-training Algorithms and Architectures

- Efficient Large-scale Pre-training

- Fine-tuning Pre-trained Models

- Pre-training Applications

- Survey of Large-scale Pre-training



Submission Guidelines

Submission deadline: June 30, 2022

Submission site (now open):

https://mc03.manuscriptcentral.com/mir

When submitting, please select in the submission system:

"Step 6 Details & Comments: Special Issue and Special Section" → "Special Issue on Large-scale Pre-training: Data, Models, and Fine-tuning"


Guest Editors

Prof. Ji-Rong Wen, Renmin University of China, China (jrwen@ruc.edu.cn)

Prof. Zi Huang, The University of Queensland, Australia (huang@itee.uq.edu.au)

Prof. Hanwang Zhang, Nanyang Technological University, Singapore (hanwangzhang@ntu.edu.sg)


Click the link below to download the call for papers:

CFP - Special Issue on Large-scale Pre-training Data, Models, and Fine-tuning.pdf




Machine Intelligence Research (MIR, formerly International Journal of Automation and Computing) is sponsored by the Institute of Automation, Chinese Academy of Sciences, and was officially launched in 2022. Rooted in China and facing the world, MIR aims to serve national strategic needs by publishing the latest original research papers, surveys, and commentaries in machine intelligence, comprehensively reporting fundamental theories and cutting-edge research results in the field worldwide, promoting international academic exchange and disciplinary development, and supporting national progress in artificial intelligence. The journal has been selected for the China Science and Technology Journal Excellence Action Plan and is indexed by ESCI, EI, Scopus, the China Science and Technology Core Journals list, and CSCD.



AI Frontiers | Focus on knowledge mining, 5G, reinforcement learning, and more; from teams at Lenovo Research, the Institute of Automation (CAS), and others
Lenovo CTO Yong Rui's team | Knowledge mining: a cross-domain survey
With a message from Editor-in-Chief Tieniu Tan, the first issue of MIR is officially published!
Zhi-Hui Zhan's team at South China University of Technology | Survey: evolutionary computation for expensive optimization
Xu-Cheng Yin's team at University of Science and Technology Beijing | Few-shot image classification with weakly-correlated knowledge integration
Min-Ling Zhang's team at Southeast University | Multi-dimensional classification via selective feature augmentation




Good news | MIR is indexed in ESCI!

Good news | MIR is indexed in the EI and Scopus databases
New Year good news! MIR is selected as a "China Science and Technology Core Journal"




