

Spot Showcased at the Smart Factory by Deloitte @ Montreal

Robin Kurtz

The beginning of this year brought us a great partnership with Deloitte to showcase Spot's capabilities alongside Osedea's. With the help of their Smart Factory here in Montreal, Spot now performs Autowalk Missions in a factory setting, demonstrating to potential partners the advantages of using Boston Dynamics' Spot at their own facilities.

The initial premise was simple: record an Autowalk Mission and let Spot do its thing. Out of the box, this is not a problem for the ever-so-impressive Spot platform from Boston Dynamics. However, we wanted our audience to see the value of the collected data instantly, and for that requirement, we opted to build a dashboard.

The Dashboard

Our partners at Deloitte offered us some prime real estate: a 2-by-2 grid of TVs to display our live data on. So we took to the whiteboard to conceptualize a cohesive and informative UI showing Spot's progression within its Autowalk Mission, along with the most recently captured images, live sensor data, and, as a cherry on top, some basic object detection on its body cameras.

It was my turn to build the functionality and get the data from Spot into a UI. To achieve this, I broke the project down into a few parts. While I worked on the data pipeline and backend aspects of our dashboard, our talented UI/UX team designed a UI to improve our audience's experience.

Data fetching

Internally, Spot uses a Data Acquisition Service (read more on it here) to store the data it collects, and it offers us methods to retrieve that data. Essentially, at each key point of interest in our Autowalk Mission, we perform an action that captures some sort of data, such as RGB or thermal images. Since this system is a closed loop, there is no way for Spot to notify another service or push the data out as it is collected. Our solution to this problem is to periodically (constantly, and annoyingly) poll Spot for data.

Some pseudo code will help illustrate this:

import time

import schedule


def job():
    query_params = make_time_query_params(start_time_secs, end_time_secs, robot)

    success = download_data_REST(
        query_params,
        ROBOT_IP,
        robot.user_token,
        destination_folder=DESTINATION_PATH,
    )


# Schedule the job outside the function, then keep the scheduler running
schedule.every(1).seconds.do(job)

while True:
    schedule.run_pending()
    time.sleep(0.1)

Great! The above “code” will request data from Spot and save it to a local directory. Now we need to parse the data.

Data parsing & storing

By default, when we ask Spot for data, we get back a directory that contains some standard content, such as a metadata.json file, as well as any images we've collected during our Autowalk Mission. Within metadata.json we have a representation of the data collected; we can parse this data and save it to a database. For this project, we went with MongoDB, a NoSQL database, as our different types of actions have varying data structures.

We created a DAQParser class that takes in a directory of data and returns us a MissionData instance which offers us some pretty decently structured data that we can then save to our database.
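As a rough sketch of that flow (the metadata.json structure and field names here are illustrative, not the actual classes or file format):

```python
import json
from pathlib import Path


def parse_mission_directory(directory: str) -> dict:
    """Parse a downloaded data directory into a mission document.

    The metadata.json layout shown here is a made-up stand-in for the
    real file produced by the Data Acquisition Service.
    """
    metadata = json.loads(Path(directory, "metadata.json").read_text())
    actions = [
        {
            "name": action["name"],
            "timestamp": action["timestamp"],
            "images": action.get("images", []),
        }
        for action in metadata["actions"]
    ]
    return {"mission": metadata["mission_name"], "actions": actions}


# Saving the resulting document is then a single insert with pymongo:
# from pymongo import MongoClient
# collection = MongoClient()["spot"]["missions"]
# collection.insert_one(parse_mission_directory("/path/to/data"))
```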

The DAQParser also offers us a good place to transform data when needed, for instance converting the raw thermal data captured by Boston Dynamics' Spot CAM+IR payload to a colour map that is easily understood by our seeing audience. While the raw data contains enough information for us to alert different parties if temperatures fall outside of their predefined thresholds, having a visual representation is important for operators who have non-machine eyes.

Something like:

from typing import Optional

import cv2


def get_colormap_image_from_raw_thermal(
    path: str,
    save_file_path: Optional[str] = None,
    cv2_colormap: int = cv2.COLORMAP_INFERNO,
):
    """
    Pull data from ir_raw.raw file
    Save (if save_file_path is provided)
    Return colored image with provided cv2_colormap
    """
    # Open the raw IR file and read it into a numpy array
    data = read_thermal_data(path)

    # Pre-process data to convert from decikelvin to Celsius
    processed_data = preprocess_data(data)

    # Apply colormap to create a colored image
    colored_image = cv2.applyColorMap(
        cv2.convertScaleAbs(processed_data, alpha=255.0 / processed_data.max()),
        cv2_colormap,
    )

    if save_file_path:
        # Save the colored image
        cv2.imwrite(save_file_path, colored_image)

    return colored_image

(The irony of spelling color without a “u” in code and converting the units to Celsius over Fahrenheit is not lost on me.)

This class also contains additional logic to upload files captured by Spot to other destinations such as an AWS S3 Bucket, but that was not needed for this project. 

Once all parsed and packed into a nice entity, we create our MongoDB document for our front end to display.

The backend

So at this stage, we have our data stored in a database (and on a local file system for images), but now we need to make it consumable by our frontend application. A simple REST API, built with Flask, reads our database and serves the data we're interested in.
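A minimal sketch of such an endpoint (the route, function names, and sample payload are made up for illustration):

```python
from flask import Flask, jsonify

app = Flask(__name__)


def latest_actions():
    # Stand-in for the database query; in the real app this would
    # read the mission documents from MongoDB via pymongo.
    return [{"name": "inspect_gauge", "temperature_c": 41.2}]


@app.route("/api/actions/latest")
def get_latest_actions():
    # Serve the most recent action data for the dashboard to render.
    return jsonify(latest_actions())
```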

Great, so now time for a frontend? Not quite! Because we want to see our data as soon as possible, we enabled WebSocket communication with Socket.IO; this gives us an event that pushes the most recent data from Spot. Note that in this case, the data is a representation of an action, as described above.

The frontend

It was now time to dust off my React knowledge and build a simple dashboard to give our audience something else to look at. We knew this would be hard, because everyone just wants to watch Spot walk around and avoid obstacles (mainly people intentionally getting in its way) with ease. This is where our UI/UX team's work was extremely important: we wanted the result to be visually impactful and to represent Spot's Autowalk Mission as a whole, not just display the latest data. We wanted the dashboard to stay interesting for the whole time Spot is walking around, which added a bit of complexity.

Since we were given a 2-by-2 TV grid, we broke the dashboard into four quadrants: a timeline of actions within the Autowalk Mission, the images (and alerts) captured by Spot, graphed live sensor data, and a live feed from one of Spot's body cameras with basic object detection.

As new data comes in from our backend, we update the top two quadrants to provide instant insight into where Spot is in its Autowalk Mission. The bottom left contains live data captured by additional Dracel sensors, displayed in Grafana; this is in fact a service running on Spot and accessed remotely. In the bottom right, a simple API pulls the image from Spot's body camera and runs basic object detection with a pretrained YOLOv5 model via PyTorch.
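Loading the pretrained model is a one-liner with torch.hub (`torch.hub.load('ultralytics/yolov5', 'yolov5s', pretrained=True)`). The helper below, a sketch with made-up names, shows the kind of post-processing involved: turning raw detection rows (x1, y1, x2, y2, confidence, class index), the layout of a YOLOv5 `results.xyxy` tensor converted to lists, into labelled results the dashboard can display:

```python
def label_detections(rows, class_names, min_confidence=0.5):
    """Convert raw YOLOv5-style detection rows into labelled dicts.

    Each row is (x1, y1, x2, y2, confidence, class_index), matching
    the layout of a YOLOv5 results.xyxy tensor converted to lists.
    """
    labelled = []
    for x1, y1, x2, y2, confidence, class_index in rows:
        if confidence < min_confidence:
            continue  # Drop low-confidence detections before display.
        labelled.append(
            {
                "label": class_names[int(class_index)],
                "confidence": round(float(confidence), 2),
                "box": [x1, y1, x2, y2],
            }
        )
    return labelled
```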

Next steps

Awesome! So at this stage, we have a cool dashboard that shows us live data from Spot and allows our potential customers to easily see what Spot could be programmed to see. However, it does not end there!

We will be integrating Fluke's SV600 into the Smart Factory's Autowalk Mission to show off more of Spot's capabilities. Air leak detection is a massive problem for industries that leverage compressed air, from both an economic and an ecological perspective. This can extend to gas leaks and other problems around your facilities.

We will also be implementing various data extraction solutions leveraging computer vision. Examples include reading an analogue gauge to ensure values are within an expected threshold, monitoring tank and reservoir levels, and detecting safety equipment.

Stay tuned for another blog post on this subject with some examples!

In conclusion

Boston Dynamics has created a very versatile dynamic sensing platform that can bring instant return on investment. After integrating it into your workflow, more value can be obtained by using machine learning techniques to extract data for business insights and predictive maintenance.

Osedea can also offer proofs of concept with Spot to see if it's the right tool for you and your business. Send us a message; we'd love to discuss how we can collaborate!

Did this article start to give you some ideas? We’d love to work with you! Get in touch and let’s discover what we can do together.

