This special Web-based AR Hackathon will feature at least three challenges. Participants are invited to focus primarily on the first challenge and also to explore projects that highlight or demonstrate the second and third. New challenges may be added as we approach the hackathon.

Location-based data sets πŸ“Œ

This hackathon is organized in conjunction with the OGC Technical Committee meeting, November 18-22 in Toulouse, to showcase how AR can increase the value of geospatial data by delivering information registered with the real world (user context) using Web technologies.

We recommend that, in advance of the event, teams browse the list of data sets gathered and made available on this page, and download or get acquainted with them.

Multi-source experiences 🀝

One of the exciting benefits of using a user’s Web browser for viewing AR experiences is that the context of the user, and/or the target which they want to discover or explore, can trigger multiple concurrent AR experiences.

More than one experience about the same target may be designed by a single team using multiple data sets, or by multiple publishers/sources of data. For example, a road intersection may have static information showing the names of the streets which cross there, as well as a real-time feed from a traffic-detecting sensor (e.g., a motion or pressure sensor in the road). Without changing experiences, the user would be able to see the names of the streets and the number of vehicles passing through the intersection.
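The intersection example above could be sketched as follows in a browser context. Everything here is hypothetical (the target ID, labels, and data shapes are illustrative only); a real project would fetch the static data set over HTTP and subscribe to the sensor over a WebSocket or similar channel.

```javascript
// Static source: street names for one target, e.g. from an open data set.
// (Names and IDs are made up for illustration.)
const staticSource = {
  target: "intersection-42",
  annotations: [{ label: "Rue A" }, { label: "Rue B" }],
};

// Live source: the latest reading from a traffic sensor at the same target.
function liveSource(vehicleCount) {
  return {
    target: "intersection-42",
    annotations: [{ label: `${vehicleCount} vehicles/min` }],
  };
}

// Merge all annotations that refer to one target so the browser can render
// them in a single AR view, without the user switching experiences.
function mergeForTarget(target, ...sources) {
  return sources
    .filter((s) => s.target === target)
    .flatMap((s) => s.annotations.map((a) => a.label));
}

const labels = mergeForTarget("intersection-42", staticSource, liveSource(12));
// labels now holds both street names plus the live traffic reading.
```

The key design point is that each source stays independent; the merge step is the only place where the browser combines them, which is what makes concurrent multi-source experiences possible in one window.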

Projects that can demonstrate multi-source experiences being triggered (and controlled) in the same browser window will receive special attention.

Sharing/stringing together experiences into a narrative (story) ⛓️

Since AR users are able to move (e.g., by walking or by another chosen form of transportation) through space in real time, they will be able to discover and experience geospatially anchored AR based on one or multiple data sets. In this challenge, teams working alone or in collaboration with other teams will demonstrate the ability to help the user continue learning or discovering a narrative thread while moving. For example, the user may see and hear AR experiences related to indoor spaces until they pass through an exterior door. Upon reaching an outdoor trigger or environment with AR experiences, they would continue to learn about a topic or search for additional clues in a treasure hunt.
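One way to string chapters of a story together is to anchor each one to a geofence and activate whichever fence the user is inside as they move. The sketch below assumes circular geofences with made-up coordinates and names; a real Web app would feed positions in from the browser Geolocation API (`navigator.geolocation.watchPosition`).

```javascript
// Each chapter of the narrative is anchored to a circular geofence:
// latitude/longitude in degrees, radius in meters. (Values are illustrative.)
const chapters = [
  { name: "indoor-intro", lat: 43.6045, lon: 1.444, radiusM: 30 },
  { name: "outdoor-clue", lat: 43.6150, lon: 1.455, radiusM: 50 },
];

// Approximate distance in meters between two lat/lon points
// (equirectangular projection; adequate at city scale).
function distanceM(lat1, lon1, lat2, lon2) {
  const R = 6371000; // Earth radius in meters
  const toRad = (d) => (d * Math.PI) / 180;
  const x = toRad(lon2 - lon1) * Math.cos(toRad((lat1 + lat2) / 2));
  const y = toRad(lat2 - lat1);
  return Math.sqrt(x * x + y * y) * R;
}

// As the user moves, the first geofence containing their position decides
// which chapter of the story continues; null means no chapter is active.
function activeChapter(lat, lon) {
  return (
    chapters.find((c) => distanceM(lat, lon, c.lat, c.lon) <= c.radiusM) || null
  );
}
```

For example, a position at the first anchor would activate `indoor-intro`, and crossing into the second fence would hand the narrative over to `outdoor-clue`, matching the indoor-to-outdoor handoff described above.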