If you’re unfamiliar with Spot, I suggest you head over to our Spot 101 - Getting started with our robot post, then come back here. If you’re already familiar with Spot, keep reading.
Many companies use photos within their operational workflows for proof of task completion, remote verification, and more. Some of these companies hope to use Spot to help with this, but there are no out-of-the-box solutions for easily traversing the image data Spot collects during an Autowalk mission. As a result, they spend precious time scanning through unorganized photos and manually tracking issues, which, if not done well, can have serious consequences.
In this post, we discuss Spot’s Autowalk feature, common pain points for teams who monitor workplace locations, and the simple yet powerful tool we developed for our Spot customers that gets them up and running fast. If you’re interested in discussing this topic further, please reach out!
Spot’s Autowalk feature is one of the platform’s more powerful systems. Its purpose is to automate data collection. To achieve this, the operator guides Spot step by step through a mission that Spot will later perform autonomously, visiting different locations and performing various routine tasks, in a process called “recording.”
Once the mission has been successfully recorded, Spot can repeat it by itself with little to no effort on the operator’s part. While recording, Spot registers waypoints (or checkpoints) at points in the journey where it will have to make an adjustment (such as rotating for a turn, or stepping up or down to accommodate a change in elevation). These waypoints help keep Spot (literally) on track. The operator can also stop Spot and have it perform an action, which is registered along with a waypoint. During subsequent runs of an Autowalk mission, Spot follows the recorded path; if it encounters an obstacle, it autonomously walks around it and regains its path.
When an Autowalk mission is replayed, a goal radius is set (the default is 0.2 m, but it can be increased up to 2 m or reduced to 0.1 m) that determines how close Spot must be to a waypoint before it performs the associated action. With this target in mind, Spot can achieve precision within 10 cm of the original location of the recorded action.
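Conceptually, the goal-radius check boils down to a distance test. The sketch below is illustrative only, not the Spot SDK; the function name and (x, y) tuple layout are our own assumptions:

```python
import math

def within_goal_radius(position, waypoint, goal_radius=0.2):
    """Return True once Spot is close enough to a waypoint to perform
    the associated action. Positions are (x, y) tuples in metres;
    goal_radius mirrors the configurable 0.1 m to 2 m range."""
    dx = position[0] - waypoint[0]
    dy = position[1] - waypoint[1]
    return math.hypot(dx, dy) <= goal_radius
```

A wider radius triggers actions sooner at the cost of repeatability; the tighter the radius, the closer the replayed photo is to the control shot.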
Let’s say you’re tasked with monitoring a workplace location to ensure it’s left in a certain condition at the end of the day. This likely requires a human to visit the location, make the rounds, record any issues, or perhaps take photos of key areas for future or remote reference. This sounds like a lot of work, with potential for human error.
Spot can help you with this.
An inspection mission can be recorded for Spot that involves taking various photos every few feet. On the first walk-through, where Spot is guided by an operator, control data is recorded, resulting in the designated number of photos (anywhere from zero to a few hundred). For each inspection mission that follows, the idea is that the same number of near-identical photos are collected at the same waypoints.
Because Spot performs with a high level of accuracy, the data collection is likely to be more consistent between runs of the inspection mission than one executed by a human with a tripod and camera (and certainly better than a quick smartphone snap).
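Because every run revisits the same waypoints, pairing a later run’s photos with their control images is essentially a lookup by action name. A minimal sketch, where the `Capture` structure and its field names are our own assumptions rather than AME’s actual data model:

```python
from dataclasses import dataclass

@dataclass
class Capture:
    action: str       # name of the waypoint action, e.g. "loading-dock-east"
    image_path: str   # where the photo was stored

def pair_with_control(control_run, later_run):
    """Match each photo from a later mission run with its control image
    from the first, operator-guided walk-through."""
    controls = {c.action: c for c in control_run}
    return [(controls.get(shot.action), shot) for shot in later_run]
```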
As a custom software development firm that strives to make the world a more efficient and enjoyable place, we wanted to build a simple solution for this opportunity. Enter AME, the Autowalk Mission Evaluator: a web-based workflow tool with a simple UI for walking through a mission’s data and sharing feedback with other team members.
Next, we’ll walk you through three features (Mission Index, Mission Walk-through, and Mission Report) of our baseline offering for those who find themselves in situations similar to the one we discussed above. And like everything we offer at Osedea, AME can be fully customized and tailored to fit your company’s needs.
First, AME lets you see every mission Spot has ever recorded, so you can easily pull up missions from yesterday, last week, or even last month. A list alone isn’t very helpful, so each mission shows basic information, such as the number of recorded actions and the number of approved data points, alongside tools like the walk-through and report views.
Pagination lets you cycle through a few missions at a time, and you can sort them by mission data. You can expand a mission’s row to reveal more information, or step through its data via the “Walk-through” flow.
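Under the hood, the index view amounts to sorting and slicing mission records. A sketch under assumed field names (not AME’s actual schema):

```python
def mission_page(missions, page=1, per_page=5, sort_key="recorded_at", newest_first=True):
    """Return one page of mission records, sorted by the chosen field.

    `missions` is a list of dicts; `page` is 1-indexed."""
    ordered = sorted(missions, key=lambda m: m[sort_key], reverse=newest_first)
    start = (page - 1) * per_page
    return ordered[start:start + per_page]
```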
You can walk through each recorded data point of a mission and easily compare the photo taken with its corresponding control image. From this view, you can see the action or waypoint of the specific data point, the recorded time and, in this case, a camera angle description. Each action can be named or given a description to better identify the location of the data capture.
To the right, you have “next” and “previous” arrow buttons to easily navigate through the images.
By default, you will view the control image next to the current image. Alternatively, you can select to view these images in a comparison slider or simply look at the current image if that’s all you’re interested in.
Once the image has been viewed, you can provide feedback, which is recorded for future reference. A user can simply approve the data point and move on to the next, or reject it. Upon rejecting an image or data point, we ask the user to input their reasoning, as seen in the following example.
In both cases, the user is brought to the following data point once they’ve completed their interaction. This fluid walkthrough allows a user to:
- efficiently scan through many images
- effectively record their findings for other members of the team or their future selves.
Upon revisiting any given data point, a user will see whether it was approved or rejected, by whom, and for what reason.
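The review state itself is small: a verdict, a reviewer, and, for rejections, a mandatory reason. A sketch under assumed names (this is not AME’s actual API):

```python
def review_datapoint(datapoint, reviewer, approved, reason=None):
    """Attach a reviewer's verdict to a data point; rejections must
    carry a reason so teammates (and future selves) know why."""
    if not approved and not reason:
        raise ValueError("a rejection requires a reason")
    datapoint["review"] = {"reviewer": reviewer, "approved": approved, "reason": reason}
    return datapoint
```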
Alternatively, a user can review a mission in the report flow, where each action (or physical location) is separated and each data point can be previewed as a small thumbnail.
A user can select each of them individually and see the same important data point information as in the walkthrough flow.
Here we can see the control vs. current image slide comparison UI which can be used to easily identify differences between the images.
We see AME as the first step toward a universal evaluator tool for viewing the visual data collected by Spot. The system can be adapted to other sources of data, such as scan results, audio, video clips, or any other snippets of data that can be displayed in the browser.
In this simple form of the tool, the data is seen by one user type, a moderator of sorts, but the system can be built to accommodate different user flows and roles to help a team work more efficiently. In the example above, someone was tasked with ensuring a location was left in proper condition at the end of each day. If that same user wanted to inform the appropriate staff member of a detected issue, or task them with fixing it, a simple notification service could be introduced to help.
As machine learning grows stronger and more reliable, another level of autonomy can be added to this system by evaluating the collected data and letting the system detect issues itself. A human would then spend less time analyzing the data, only validating the system’s findings until the model becomes fully independent. This way, we humans can focus on tasks that add more value for our organizations.
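As a deliberately naive first step in that direction, a pixel-level difference score could route only suspicious shots to a human. The sketch below uses plain 2-D lists of grayscale values; a production system would use a trained model or at least a perceptual similarity metric:

```python
def mean_abs_diff(control, current):
    """Mean absolute pixel difference between two same-sized grayscale
    images, each represented as a 2-D list of 0-255 values."""
    total = sum(abs(a - b)
                for ctrl_row, cur_row in zip(control, current)
                for a, b in zip(ctrl_row, cur_row))
    return total / (len(control) * len(control[0]))

def needs_human_review(control, current, threshold=20.0):
    """Auto-approve near-identical shots; flag the rest for a person."""
    return mean_abs_diff(control, current) > threshold
```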
Spot is pretty cool in and of itself, but one of its true powers is how it can be extended. We built AME to help free you from routine evaluation tasks so you can focus on your business.
If you’re interested in hearing more about how Spot’s Autowalk feature can help you, or how our AME tool can be integrated into your workflow to automate data collection, contact us.