CoCreatAR

Enhancing Authoring of Outdoor Augmented Reality Experiences Through Asymmetric Collaboration

CHI 2025

*Work done during Nels Numan's internship at Niantic.

This figure illustrates our system, CoCreatAR, being used by one user ex-situ on a desktop computer and another user in-situ on a mobile phone. It shows various meshes and visualizations overlaid on the real-world location, including RGB-D point clouds, coarse meshes, and a virtual character representing the in-situ user on the ex-situ user's screen. For example, some coarse meshes are shown as rough planes outside of the original location mesh.

Abstract


Authoring site-specific outdoor augmented reality (AR) experiences requires a nuanced understanding of real-world contexts to create immersive and relevant content. Existing ex-situ authoring tools typically rely on static 3D models to represent spatial information. However, our formative study (n=25) identifies key limitations of this approach: models are often outdated, incomplete, or insufficient for capturing critical factors such as safety considerations, user flow, and dynamic environmental changes. These issues necessitate frequent on-site visits and additional iterations, making the authoring process more time-consuming and resource-intensive.

To mitigate these challenges, we introduce CoCreatAR, an asymmetric collaborative authoring system that integrates the flexibility of ex-situ workflows with the immediate contextual awareness of in-situ authoring. We conducted an exploratory study (n=32) comparing CoCreatAR to an asynchronous workflow baseline, finding that it enhances user engagement and confidence in the authored output while also providing preliminary insights into its impact on task load. We conclude by discussing the implications of our findings for integrating real-world context into site-specific AR authoring systems.


Video


System Overview


CoCreatAR is a collaborative authoring system for outdoor AR experiences, designed to facilitate real-time interaction between ex-situ developers and in-situ collaborators. The system enables ex-situ creators, who typically design and develop the experience asynchronously within Unity, to collaborate synchronously with in-situ users who experience the AR content directly in the field. By integrating real-time communication, contextual reference tools, and spatial data capture, CoCreatAR aims to reduce the need for repeated on-site visits during the iterative design of site-specific AR experiences.
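The core idea behind this kind of asymmetric collaboration is a single shared scene state that both clients mutate through broadcast messages: the ex-situ author edits object transforms, while the in-situ user contributes annotations and captures. The sketch below is purely illustrative and is not the authors' implementation (CoCreatAR is built in Unity); the names `SceneState`, `SyncChannel`, and `Client` are hypothetical, and the in-process "channel" stands in for real networking.

```python
from dataclasses import dataclass, field


@dataclass
class SceneState:
    """Shared scene: object transforms plus in-situ annotations."""
    transforms: dict = field(default_factory=dict)   # object id -> (x, y, z)
    annotations: list = field(default_factory=list)  # e.g. surface drawings, notes


class Client:
    def __init__(self, role, channel):
        self.role = role          # "ex-situ" or "in-situ"
        self.channel = channel
        self.state = SceneState()

    def move_object(self, obj_id, pos):
        # Typically performed by the ex-situ author in the editor.
        self.channel.broadcast({"op": "move", "id": obj_id, "pos": pos})

    def annotate(self, note):
        # Typically performed by the in-situ user on the phone.
        self.channel.broadcast({"op": "annotate", "note": note})

    def apply(self, msg):
        # Every client applies every message, so all copies converge.
        if msg["op"] == "move":
            self.state.transforms[msg["id"]] = msg["pos"]
        elif msg["op"] == "annotate":
            self.state.annotations.append(msg["note"])


class SyncChannel:
    """Delivers each message to all connected clients, sender included."""
    def __init__(self):
        self.clients = []

    def connect(self, client):
        self.clients.append(client)

    def broadcast(self, msg):
        for c in self.clients:
            c.apply(msg)


channel = SyncChannel()
ex_situ = Client("ex-situ", channel)
in_situ = Client("in-situ", channel)
channel.connect(ex_situ)
channel.connect(in_situ)

# Ex-situ author repositions an object; in-situ user adds feedback.
ex_situ.move_object("boombox", (1.0, 0.0, 2.5))
in_situ.annotate("boombox is floating; lower it to the pavement")
```

After both operations, the two clients hold identical copies of the scene, which is the property that lets an ex-situ edit appear immediately on the in-situ device and vice versa.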

In-Situ Interface

Images of phone screens illustrating CoCreatAR's in-situ UI. A: Coarse meshes are depicted in a different color overlaid on the ground plane, and there is a UI button to stop creating such coarse meshes. B: The 3D snapshot is depicted as a lighter point cloud overlaid on the real world, and there is a button to capture another 3D snapshot. C: The main menu has buttons for Capture, Annotate, and Settings. D: A line drawing is directly drawn on the side of a building, and there are buttons for choosing different colors. E: A floating line is shown coming from a lamppost in the distance towards the user's position, and there is a button to start 3D World Drawing.
In-situ user interface of CoCreatAR. (A) Users can scan the environment to obtain a Coarse Mesh of the surroundings; (B) Users can tap Capture to take a 3D Snapshot; (C) Users can tap Capture or Annotate to access feature sub-menus from the main menu and tap and hold on the screen to spawn a 3D Cursor; (D) Users can create drawings projected onto surfaces with Surface Draw; (E) Users can create a trajectory or a 3D drawing with Air Draw by moving their smartphone.

Ex-Situ Interface

A screen capture of Unity with additional panels described in the caption. This is CoCreatAR's UI for the ex-situ user. A: A separate Unity panel with the scene hierarchy. B: A 3D point cloud within the main Scene panel. C: A 3D mesh within the main Scene panel. D: Planes (coarse meshes) within the main Scene panel. E: A separate panel with a real-world view from the in-situ user. F: A blue sphere within the In-Situ View panel. G: A blue sphere within the main Scene panel. H: An extension of the scene hierarchy panel. I: Assets in the Project panel, including various materials and standard prisms. J: Sliders within a Scene View Control panel, allowing for the adjustment of the meshes' and 3D snapshots' transparencies.
Ex-situ user interface of CoCreatAR. (A) All objects under NetworkedScene are automatically synchronized between ex-situ and in-situ users; (B) 3D Snapshots captured by the in-situ user; (C) The location mesh of Location A; (D) Coarse Mesh captured by the in-situ user; (E) Live feed of the in-situ user's screen, including AR content; (F) The 3D Cursor of the ex-situ user, projected into world space; (G) Close-up of the 3D Cursor of the ex-situ user as seen in the scene view; (H) List of annotations and spatial captures, persistently saved in the scene for later review; (I) Sample assets that can be added to the scene at runtime; (J) Scene View Controls with sliders to adjust the transparency of meshes and 3D Snapshots.

Example Usage Figure

Four screenshots of the ex-situ participant adjusting virtual objects relative to the location mesh. A: In-situ: “Hmm no [the boombox] is still floating. Let me capture… [in-situ uses 3D Snapshot]”. Ex-situ, highlighted: “Ah now I can see it all. Should I put it on the pavement like this? [ex-situ moves boombox]”. In-situ: “That's good, yeah.” B: In-situ: “Maybe we can put [the map] here! [in-situ uses Surface Draw]”. Ex-situ, highlighted: “Perfect [...] Is this upside down? I can't really tell.” In-situ: “No that's the right way. [ex-situ aligns map with wall] Yeah, great.” C: Ex-situ: “Capture behind you so I can get more of the walls there so I can align [the garlands].” In-situ: “Okay, do you need 3D image or coarse mesh?” Ex-situ, highlighted: “Coarse mesh, for now. I don't need, like, full color, I just need a bit of the geometry [in-situ uses Coarse 3D Mesh]”. D: In-situ: “Can you put the wine bottle like, here? [in-situ places 3D Cursor on the table]” Ex-situ, highlighted: “Hmm let me see, it's a little tricky [ex-situ moves wine bottle]” In-situ: “Yeah that's good!”
Overview of CoCreatAR feature usage during Phase 1, shown as ex-situ perspective screenshots. Participant conversations are shown in color-coded speech bubbles: in-situ (green) and ex-situ (blue). Speech bubbles with a glow indicate utterances made at the moment of the screenshot. (A) Alignment of a boombox based on spatial context captured using the 3D Snapshot feature; (B) The ex-situ participant moving the map to a position on the wall as specified by the in-situ participant through Surface Draw; (C) Alignment of a garland to a previously unmapped region of the street using the Coarse 3D Mesh feature; (D) Alignment of misplaced food items based on in-situ input using the 3D Cursor.

Resources


Paper


Supplemental


BibTeX

If you find this work useful for your research, please cite:

@inproceedings{numanCoCreatAREnhancingAuthoring2025,
    title = {{CoCreatAR}: Enhancing Authoring of Outdoor Augmented Reality Experiences Through Asymmetric Collaboration},
    shorttitle = {{CoCreatAR}: Enhancing Authoring of Outdoor {AR} Experiences Through Asymmetric Collaboration},
    booktitle = {{Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems}},
    author = {Numan, Nels and Brostow, Gabriel and Park, Suhyun and Julier, Simon and Steed, Anthony and Van Brummelen, Jessica},
    year = {2025},
    month = apr,
    series = {{{CHI}} '25},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    doi = {10.1145/3706598.3714274},
    isbn = {979-8-4007-1394-1/25/04},
}

Acknowledgements


Special thanks to Charlie Houseago, KP Papangelis, Filipe Gaspar, Kelly Cho, George Ash, Adam Hegedus, Stanimir Vichev, Thomas Hall, and Victor Adrian Prisacariu for their feedback and support during the development of CoCreatAR. We also thank our study participants for their time and insights, and Isabel Kraus-Liang for assistance with study logistics. Our user study (Sec. 5) was partially supported by the European Union's Horizon 2020 Research and Innovation program as part of project RISE under grant agreement No. 739578.

This webpage was inspired by this template.