ICOIN Workshop on Next Infrastructure : Cloud Continuum (NICC)

Date: Jan. 17-19, 2024
 Important Dates
Same as ICOIN 2024 regular paper

 Call for Workshop Papers
Most major cloud service providers are now launching AI services on top of their conventional cloud offerings (PaaS and IaaS). In addition, many heterogeneous hardware resources (such as CPUs, VPUs, TPUs, GPUs, and FPGAs) in cloud data centers are used to train large-scale AI models that process huge amounts of data, including multimodal information. Many companies have also been trying to adopt AI services quickly, using diverse cloud computing infrastructures to gain a marketing advantage over their competitors. Cloud computing therefore faces several challenges: providing energy-efficient AI services, supporting hyper-scale federated learning and model delivery, and managing highly scalable, heterogeneous compute and storage resources. The cloud continuum, an emerging infrastructure that spans diverse cloud resources for future AI services, including real-time training-data collection for hyperscale AI model training and serving, is attracting strong research interest around the world.

The direct link for paper submission is

Topics of interest include, but are not limited to the following.
  • Energy-aware data centers
  • Memory-centric computing for ML
  • Heterogeneous resource management
  • Federated learning on edge computing
  • Microservices for model delivery
  • Container-based serverless computing
  • Data condensation & privacy in AI
  • Data lakes on multi-cloud
  • AI model surgery and verification
  • Parallel & distributed ML
  • ML on multi-cloud
 Workshop Organizer
  • Eui-Nam Huh (Kyung Hee University)
  • Yeonmook Nah (Dan-Kook University)
  • Young Han Kim (Soongsil University)
  • Yangwoo Kim (Dongguk University)