Imagine one spacecraft chasing another satellite and steadily closing the gap—with each vehicle traveling more than 16,000 miles per hour in the darkness of space.
The satellite that's being chased, the client, is a multi-ton craft that is running out of fuel.
The chaser, the fully robotic OSAM-1 servicer, follows in hot pursuit, carrying life-saving propellant and tools. For now, it is controlled by humans on the ground.
Everything hinges on the servicer's ability to accurately locate, catch up to, and match its speed with the client satellite.
But such a rendezvous isn't easy.
The client was not designed to be serviced: it carries no markings that would make it easy for the servicer to find and track. The servicer must do this on its own, using a machine vision system.
To complicate matters, OSAM-1 operates far from Earth. There is a delay getting data down to the ground and commands back up to space. In the last few feet of the rendezvous, humans on the ground cannot command the servicer quickly and accurately enough to prevent a crash.
This means OSAM-1 must not only perform relative navigation with its client, it must do so autonomously (by itself, with no human guidance) and in real time, as the scenario unfolds.
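The core of relative navigation is estimating the target's position and closing rate from a stream of noisy sensor measurements. As a purely illustrative sketch (not NASA's flight software, which fuses camera, infrared, and lidar data in three dimensions), a one-dimensional constant-velocity Kalman filter shows the idea: each new range measurement refines the estimate of where the client is and how fast the gap is closing.

```python
import numpy as np

def relative_nav_kalman(ranges, dt=1.0, meas_var=25.0, accel_var=1e-3):
    """Estimate relative position and closing rate from noisy range
    measurements with a 1-D constant-velocity Kalman filter.

    Illustrative only: real relative-navigation systems estimate a full
    3-D relative pose from multiple sensors.
    """
    # State: [relative position (m), relative velocity (m/s)]
    x = np.array([ranges[0], 0.0])
    P = np.diag([meas_var, 100.0])           # initial uncertainty
    F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity dynamics
    Q = accel_var * np.array([[dt**4 / 4, dt**3 / 2],
                              [dt**3 / 2, dt**2]])  # process noise
    H = np.array([[1.0, 0.0]])               # we measure position only
    R = np.array([[meas_var]])
    for z in ranges[1:]:
        # Predict: propagate the state and its uncertainty forward
        x = F @ x
        P = F @ P @ F.T + Q
        # Update: correct the prediction with the new measurement
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x  # [estimated range (m), estimated closing rate (m/s)]

# Simulated approach: start 1000 m out, close at 2 m/s, noisy ranging
rng = np.random.default_rng(0)
true_ranges = 1000.0 - 2.0 * np.arange(60)
measured = true_ranges + rng.normal(0.0, 5.0, size=60)
est = relative_nav_kalman(measured)
```

Even with 5 m of measurement noise, the filter converges on the true range and closing rate after a few dozen updates, which is what lets an autonomous system act on the estimate in real time.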
NASA is developing this type of ground-breaking relative navigation system—not just for OSAM-1, but for missions for decades to come, including the Journey to Mars.
Key parts of the system are being proven on the International Space Station through a test bed called Raven.
Raven is a technology-filled module that will help NASA test key elements of a new spacecraft autopilot system. Through Raven, NASA moves one step closer to a relative navigation capability it can take "off the shelf" and use with minimal modifications across many missions for decades to come.
Within its silver frame, Raven contains a carefully curated system that includes visible, infrared, and lidar sensors, along with the avionics needed to process their data in real time.
As spacecraft approach the International Space Station, Raven's components join forces to independently image and track them.
The Raven images above show a Cygnus spacecraft visiting the International Space Station, viewed in visible (left), infrared (center), and lidar (right).
Raven tests foundational technologies that will serve NASA for decades. The OSAM-1 servicing mission will draw on them when its robotic servicer navigates to Landsat 7 to refuel it on orbit. Other NASA missions, including the Journey to Mars, could also draw on Raven technologies when rendezvousing.
Raven was developed and integrated by the Satellite Servicing Projects Division (SSPD) at NASA's Goddard Space Flight Center.
Raven launched to the space station in 2017 aboard the Space Test Program-Houston 5 (STP-H5), a complement of 13 unique experiments from seven different agencies.
Before there was Raven, there was Argon: a NASA-developed, ground-based demonstration module that helped rapidly mature, as an integrated system, the individual sensors, algorithms, and system technologies a spacecraft would need to perform rendezvous and proximity operations (RPO) at multiple ranges.