Physical AI company RealWorld (RLWRL) said on Sunday it will join the government’s cooperation ecosystem for an “independent AI foundation model” (Dokpamo) through a consortium led by Upstage.
RealWorld said it will work through the consortium to link the independent AI model's multimodal expansion to on-site robot demonstrations. It will focus on three areas: defining vision-language model (VLM) technical requirements to optimise robot control and interface with a robotics foundation model (RFM); identifying domain-specific tasks, in sectors such as hotels, logistics and retail, that offer economic feasibility and technical implementability; and designing detailed RFM demonstration scenarios and test protocols. It plans to verify Upstage's model for use in physical AI and to focus on extending it to actual industrial sites.
RealWorld, a physical AI company founded in 2024, has built a data pipeline that collects and accumulates "4D+" multimodal data using sensing tools such as cameras and tactile gloves in real work environments, including manufacturing and logistics. Building on that data, it is developing a robotics foundation model called "RLDX" to enable robots to perform precise manual tasks, and is preparing to launch its own model in the first half of this year.
Under the cooperation, RealWorld and Upstage will flesh out a demonstration design that connects the independent AI model's multimodal expansion flow to robots' "see-understand-act" process. In the early stages, they plan to jointly review the feasibility of on-site application and validation. In designing test protocols, they plan to prioritise experimental conditions, reproducibility standards and documentation in a form that industrial customers can verify.
Ryu Joong-hee (류중희), chief executive of RealWorld, said that to earn industry trust, an independent AI model must show not only what it can do, but also under what conditions it works and how reproducibly it performs. He said RealWorld will join the Dokpamo cooperation ecosystem to refine VLM requirements, RFM demonstration scenarios and test protocols, and to create a validation path linking K-AI outcomes to on-site robot automation.