Open Robotics Blog

For blog posts prior to April 2018, visit our old site. You might also enjoy the ROS blog and the Gazebo blog.

ARIAC 2018 Finals results announced

We are happy to announce the final results of the 2018 Agile Robotics for Industrial Automation Competition (ARIAC), hosted by the National Institute of Standards and Technology (NIST).

Now in its second year, ARIAC is a simulation-based competition designed to address a critical limitation of robots used in industrial environments: they are not as agile as they need to be. Many robots cannot quickly detect failures or recover from them, and they cannot sense changes in their environment and modify their actions accordingly. The goal of ARIAC is to enable industrial robots on workshop floors to be more productive, more autonomous, and more responsive to the needs of shop floor workers by utilizing the latest advances in artificial intelligence and robot planning.

While autonomously completing order fulfillment tasks, teams were presented with various agility challenges developed based on input from industry representatives. These challenges include:

  • Failing suction grippers, requiring teams to determine if products dropped from the gripper should be retrieved or re-positioned,

  • Reception of updated/high-priority orders, prompting teams to decide whether to continue filling in-progress shipments or to start new ones from scratch (a decision sketched in code after this list),

  • Notification of faulty products, requiring teams to replace the faulty products and plan ahead to ensure that enough non-faulty products remain available for high-priority orders,

  • Products requested to be flipped from their original orientation, requiring teams to complete a two-step process to place each product, and

  • Failing sensors, requiring teams to have a high-level model of the environment to continue working through a complete sensor “blackout.”
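
To make the order-update challenge concrete, here is a minimal sketch (in Python, using rospy) of a node that listens for newly announced orders and decides whether to keep filling the shipment already in progress or to start over. The topic name, message type, and the overlap heuristic are assumptions made for illustration, not necessarily the official competition interface.

    #!/usr/bin/env python
    # Minimal sketch of order-update handling for ARIAC. The topic name,
    # message type, and the overlap heuristic are illustrative assumptions,
    # not necessarily the official competition interface.
    import rospy
    from osrf_gear.msg import Order  # assumed message type for announced orders


    class OrderManager(object):
        def __init__(self):
            self.current_order = None
            # Assumed topic on which new and updated orders are announced.
            rospy.Subscriber('/ariac/orders', Order, self.on_order)

        def on_order(self, order):
            if self.current_order is None:
                rospy.loginfo('Starting work on the first order')
            elif self.reusable_fraction(self.current_order, order) > 0.5:
                # Enough of the in-progress shipment matches the new order:
                # keep it and swap only the differing products.
                rospy.loginfo('Reusing the in-progress shipment for the new order')
            else:
                rospy.loginfo('Starting a fresh shipment for the new order')
            self.current_order = order

        @staticmethod
        def reusable_fraction(old_order, new_order):
            # Hypothetical heuristic: fraction of already-placed products that
            # also appear in the new order. Placeholder value for this sketch.
            return 0.0


    if __name__ == '__main__':
        rospy.init_node('order_manager')
        OrderManager()
        rospy.spin()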

Teams had control over their system's suite of sensors positioned throughout the workcell, made up of laser scanners, depth cameras, intelligent vision sensors, quality control sensors, and interruptible photoelectric break-beams. Each team chose a unique sensor configuration, with varying associated costs and impact on the team's strategy. Teams that chose sensors requiring more of their own processing (for example, depth cameras in place of intelligent vision sensors) gained a points boost for their lower overall system cost. This trade-off was on full display in the Finals: the top two teams chose sensor configurations with only a single sensor in common.
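
To illustrate the sensor-configuration trade-off, here is a small sketch that builds a hypothetical configuration as a Python dictionary and dumps it to YAML. The sensor names, types, and pose format are assumptions rather than the exact competition schema; the point is that a team mixing cheap break-beams and depth cameras with few (or no) intelligent vision sensors lowers its system cost at the price of more processing on its own side.

    # Illustrative sketch of a team's sensor configuration, written as a
    # Python dict and dumped to YAML. The sensor names, types, and pose
    # format are assumptions, not the exact ARIAC configuration schema.
    import yaml  # PyYAML

    sensor_config = {
        'sensors': {
            # Cheaper sensors that require more of the team's own processing.
            'break_beam_1': {
                'type': 'break_beam',
                'pose': {'xyz': [1.2, 2.5, 0.9], 'rpy': [0.0, 0.0, 3.14]},
            },
            'depth_camera_1': {
                'type': 'depth_camera',
                'pose': {'xyz': [0.3, 1.5, 1.6], 'rpy': [0.0, 1.57, 0.0]},
            },
            # A pricier "intelligent vision" style sensor that reports object
            # poses directly, trading higher system cost for less processing.
            'logical_camera_1': {
                'type': 'logical_camera',
                'pose': {'xyz': [0.0, 1.0, 1.8], 'rpy': [0.0, 1.57, 0.0]},
            },
        }
    }

    if __name__ == '__main__':
        print(yaml.safe_dump(sensor_config, default_flow_style=False))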

The virtual nature of the competition enabled participation by teams affiliated with companies and research institutions from a range of countries. The diversity of the teams' strategies for solving the agility challenges can be seen in the video of highlights from the Finals:

Scoring was based on a combination of efficiency, performance, and cost metrics over 15 trials. Additionally, judges awarded points for novel, industry-implementable approaches to solving the agility challenges. The overall standings of the finalist teams are as follows.

The top three eligible teams will receive cash prizes.

Top-performing teams will be invited to present at an upcoming workshop that will be open to all, including those who did not participate in ARIAC. In addition to showcasing the various approaches used in the competition, we will also be exploring plans for future competitions. If you are interested in giving a presentation about agility challenges you would like to see in future competitions, please contact Craig Schlenoff (craig.schlenoff@nist.gov).

Congratulations to all teams that participated in the competition!

Deanna Hood
Service Robot Simulator

Over the past few months, Open Robotics has worked with Hitachi on a virtual robotics competition called ServiceSim, which focuses on human-robot interaction in an office environment. Competitors must control the ServiceBot robot to perform tasks such as finding a human guest and guiding them to their destination while making sure the guest doesn't get lost.

The competition runs on Gazebo 8, and ServiceBot is controlled using ROS Kinetic. All the competition software, including the SDF models of the office, from cubicles to bathrooms and coffee makers, has been released as open source, so anyone is welcome to reuse these models in their own environments. The competition environment itself is also versatile, and competitors can customize the tasks and practice in various scenarios.

Several improvements have been made to the simulation of human characters in Gazebo: they can now collide with objects in simulation, and new plugins make the human actors run or walk along given trajectories or follow a given target, with plenty of configuration options.

ServiceBot was modeled from scratch, and its URDF description and ROS controllers are available along with the competition software. In addition, a reference solution, which uses the ROS navigation stack and exercises the competition's ROS API by reading sensor data and controlling the robot, is offered as a starting point for competitors.
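
To give a flavor of what exercising the ROS API looks like, here is a minimal sketch, in the spirit of the reference solution, that reads laser scan data and sends a single navigation goal through the ROS navigation stack's move_base action. The scan topic name and the goal coordinates are assumptions for illustration; the actual topic names are defined by the competition's ROS API.

    #!/usr/bin/env python
    # Minimal sketch of a competitor node: read sensor data and drive the
    # robot via the ROS navigation stack. The scan topic name and the goal
    # pose are illustrative assumptions, not the competition's actual API.
    import actionlib
    import rospy
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
    from sensor_msgs.msg import LaserScan


    def on_scan(scan):
        # Sensor feedback: report the closest return seen by the laser.
        rospy.loginfo_throttle(5.0, 'Closest obstacle: %.2f m' % min(scan.ranges))


    if __name__ == '__main__':
        rospy.init_node('servicesim_sketch')
        # Assumed scan topic; the real name is defined by the competition API.
        rospy.Subscriber('/servicebot/scan', LaserScan, on_scan)

        # Send a single navigation goal through move_base.
        client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
        client.wait_for_server()

        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = 'map'
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = 2.0  # illustrative goal in the office
        goal.target_pose.pose.position.y = 1.0
        goal.target_pose.pose.orientation.w = 1.0
        client.send_goal(goal)
        client.wait_for_result()

        rospy.spin()

In a full solution the goal pose would come from the competition tasks themselves (for example, the guest's destination) rather than a hard-coded point.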

Hitachi and Open Robotics invite the community to try out the competition software, develop their own solutions to the tasks, and try completing the competition with their own robots! Debian packages are available for Ubuntu Xenial, or you can use Docker for convenience. Take a look at the tutorials to get started!

Louise Poubel
Going Underground

If you haven't already heard of the DARPA Subterranean Challenge (or "SubT"), it's time to start paying attention.

With SubT, DARPA "aims to develop innovative technologies that would augment operations underground. The SubT Challenge will explore new approaches to rapidly map, navigate, search, and exploit complex underground environments, including human-made tunnel systems, urban underground, and natural cave networks."

DARPA announced SubT back in December, but Program Manager Dr. Timothy Chung only recently revealed that Open Robotics has been charged with creating and running the simulated track of the challenge.

Unlike our involvement in the DARPA Robotics Challenge, for which we created a simulated environment in Gazebo for a single robot (the Atlas), SubT allows for a wide variety of robot participants.

You can read more about it in today's Wired: DARPA's Next Challenge? A Grueling Underground Journey

DARPA will be announcing more details at the SubT Challenge kickoff in Fall 2018.

Nathan Koenig
Getting ready in Singapore

As we announced back in March, we're opening a new office in Singapore. And now we're happy to report that the office itself is coming together, with furniture and some rugs:

[Photos: the new office space, with desks, a conference table, and rugs]

We're located in Block 81, which is part of a complex specifically designed for startups and other small companies, complete with a faux-repurposed-industrial look to the buildings and the nearby Timbre+, which is a kind of mash-up of hawker market, food truck, and hipster bar. We're looking forward to getting to know our neighbors in the other companies operating nearby.

We'll be posting open positions for the Singapore office soon, so if you're in that part of the world and want to join the Open Robotics team, stay tuned!

Nathan Koenig
ROSCon JP coming to Tokyo in September 2018

After several years of holding ROSCon in various locations around the world, we've received inquiries from groups that want to hold their own versions of ROSCon, in the local language and designed for the local audience. We're happy to announce that the first of these events will be held September 14, 2018 in Tokyo: ROSCon JP 2018. If you're a Japanese-speaking ROS user or developer, please make plans to attend!

Guest speakers will include Min Ling Chan from ROS-Industrial APAC and Brian Gerkey from Open Robotics.

We look forward to seeing exciting new results from the Japanese ROS community!

Nathan Koenig