FANUC Robot Demo | AC’s Booth at ATX West 2024

Behind the Scenes with AC's FANUC Robot Demonstration

Join us for a behind-the-scenes look at the development of our FANUC robot demo at ATX West 2024 in Anaheim, CA. We sit down with Automation Controls Engineer, Chace Hutchins, to learn how he approached the integration and programming of this uniquely purposed FANUC robot workcell.

Meet Chace Hutchins, Automation Controls Engineer

Chace Hutchins | AC Controls Engineer, Manufacturing Automation

CHACE HUTCHINS

ENGINEERING GROUP

  • AC Manufacturing Automation


EXPERTISE

  • Control systems
  • Robot integration & programming
  • Coordinated motion & vision


ROBOT DEMO OBJECTIVE

  • Develop the robot operation with our design team’s project manager (PM), mechanical engineer (ME), and electrical engineer (EE)
  • Support robot demonstration at the trade show
  • Connect with trade show visitors and customers searching for robot integrators

Tell us about AC’s FANUC robot demo. What was the project’s objective?

“This build was one of two robot demonstrations in AC’s booth this year at ATX West 2024, in Anaheim, CA.

We wanted to highlight our relationship with FANUC and our role as an Authorized System Integrator of FANUC Robotics.

Our team thought placing board game letter tiles would be a fun demo. So, our Automation team designed a guarded workcell for the trade show using a FANUC articulated robot. The robot would pick and place letters onto a tile display rack, spelling out the phrase: AC AT ATX.”

Image showing a scattered group of board game letter tiles the robot would use to spell out a phrase during the trade show demonstration

How did the project come together? What technologies does the design include?

“Our demo is a fairly straightforward feed, pick, and place application, but with some smart nuances.

For our vision system, we integrated an Asyril EYE+ vision feeder system on loan from a generous local partner, Jacob Stock, President of M6 Revolutions. M6 was incredibly supportive and let us leverage their components for this project.

For our robot, our mechanical engineers repurposed one of our FANUC LR Mate 200iD articulated robots.

And, as always, FANUC support and sales really shined in setting us up with the right resources to flex our robot integration muscles.”


Can you walk us through the steps of this FANUC robot demonstration?

“Sure, there’s a series of coordinated vision and motion technologies integrated into the demo:

(A) Asyril smart hopper to deliver letter tiles to the part feeder and vision system

(B) Asyril Asycube 240 backlit flexible feeder with microelectromechanical systems (MEMS) vibration for part shuffling and reorientation

(C) Asyril EYE+ camera for high-resolution, high-contrast images of the letter tiles in the feeder

(D) FANUC LR Mate 200iD articulated robot

(E) End-of-arm tool (EOAT) with vacuum suction for the robot to pick and place letter tiles

(F) 3D-printed tile display rack


Similar to a ‘hangman’ puzzle, the robot selects all of the A’s, then all of the T’s, then the C, and finally the X.

The vision system captures images of the letters, so the robot can go to the coordinates of the next best letter available.

Then, the robot retrieves the letter tile and places it on the display rack in the correct position to complete the phrase: AC AT ATX.

Using a secondary tool on the same EOAT, the robot picks up the entire tile display rack, moves to the hopper, dumps the tiles, then returns the empty display rack to the front-right of the workcell.

That completes the cycle, which then resets and repeats continuously throughout the trade show demonstration.”
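For readers who like to see the flow in code form, here is a minimal Python sketch of the spell-and-reset cycle described above. It is purely illustrative: the real cell runs as a program on the FANUC controller, and the object and function names here (vision, robot, rack, locate_tile, and so on) are hypothetical placeholders, not names from AC’s actual program.

```python
# Purely illustrative sketch of the demo cycle described above.
# The real cell runs on the FANUC R-30iB controller; `vision`, `robot`,
# `rack`, and their methods are hypothetical placeholders.

PHRASE = "AC AT ATX"
PICK_ORDER = ["A", "T", "C", "X"]   # hangman-style: all A's, then T's, the C, then the X


def run_demo_cycle(vision, robot, rack):
    """One full spell-and-reset cycle of the trade show demonstration."""
    for letter in PICK_ORDER:
        # Display-rack slots in the phrase that need this letter
        slots = [i for i, ch in enumerate(PHRASE) if ch == letter]
        for slot in slots:
            x, y, angle = vision.locate_tile(letter)   # coordinates of the next good tile
            robot.pick(x, y, angle)                     # vacuum pick from the feeder surface
            robot.place(rack.slot_pose(slot))           # place into the display rack
    # Secondary tool on the same EOAT: dump the tiles back into the hopper,
    # then return the empty rack so the cycle can repeat.
    robot.dump_rack_into_hopper(rack)
    robot.return_rack(rack)
```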

Image showing the FANUC Robot Workcell integrated with Asyril EYE+ vision and feeder system
Integrated workcell featuring the FANUC LR Mate 200iD robot and the Asyril EYE+ camera with smart hopper and flexible feeder system

How was the system configured? What technologies did you use?

“Going into the project, I knew our system required speed and accuracy to pick and place the letter tiles correctly, consistently, and continuously during the demo at the trade show.

I’ve worked with FANUC before, so I knew the robot would perform reliably; I just needed to get the systems talking to each other first.

Thankfully, by enabling the KAREL and User Socket Messaging robot packages, I didn’t need any extra hardware. This meant I could quickly establish communication between the vision and feeder system and the robot over TCP/IP.

Bypassing a PLC, I programmed the system directly through the FANUC R-30iB controller. That, combined with Asyril’s pre-built FANUC plugin, saved me a ton of time when it came to programming the exchanges between the devices.”
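As an illustration of the kind of exchange involved, here is a minimal Python sketch of a request/response over a raw TCP socket. On the actual cell this communication is handled by a KAREL program with the User Socket Messaging option on the R-30iB; the host address, port, and message format below are assumptions made for the sketch, not the Asyril EYE+ protocol.

```python
import socket

# Minimal sketch of a TCP/IP request/response, analogous in spirit to what the
# KAREL User Socket Messaging program does on the R-30iB controller. The host,
# port, and message format are illustrative assumptions, NOT the EYE+ protocol.

EYE_PLUS_HOST = "192.168.0.50"   # hypothetical IP of the vision controller
EYE_PLUS_PORT = 7171             # hypothetical TCP port


def request_next_part():
    """Ask the vision side for the next pickable tile and parse its pose."""
    with socket.create_connection((EYE_PLUS_HOST, EYE_PLUS_PORT), timeout=2.0) as sock:
        sock.sendall(b"GET_PART\r\n")                # hypothetical command string
        reply = sock.recv(1024).decode().strip()     # e.g. "OK;123.4;56.7;90.0"
    if not reply.startswith("OK"):
        raise RuntimeError(f"Vision system replied: {reply}")
    _, x, y, angle = reply.split(";")
    return float(x), float(y), float(angle)
```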

Image showing Chace Hutchins programming the FANUC Robot Demo
Chace programming the integrated FANUC robot with Asyril vision and feeder system.

EXPLORE

Learn more about Asyril’s integrated vision and part feeder system – a crucial feature of an accurate and rapid pick-and-place robot application.

Want to know more about FANUC? See our related blog post:
Robot Integration Advantages.

RESOURCE CREDIT: Asyril | EYE+ Plugin for FANUC Robot

How are the correct letters identified each time?

“Our FANUC robot needed to be able to pick a good part from a random position in the cube feeder and then reorient and place it correctly and consistently on the tile display rack. To achieve that, I first needed to teach the vision system to distinguish good parts from bad parts.”

Teaching letter identification

“In the first teaching phase, I capture images of the letter tiles. Then, using a visual interface, I separate the ‘good’ images (the correct letter for a specific job recipe) from the ‘bad’ images (the remaining letters to ignore for that specific job recipe).

The EYE+ pattern training starts with an AI learning feature that is trained on populations of ‘good’ and ‘not good’ example images. This teaches the system the difference between an A tile and a T tile, and so on.”

What about detecting part position?

Image showing the backlit Asyril Asycube 240 feeder illuminated for vision system imaging of the letter tiles
The Asyril vision system captures a high-contrast image of the letter tiles scattered across the Asycube 240 MEMS feeder.

“To refine the position data communicated to the robot, though, I needed to go a couple of steps further.”

Detecting part shape and size

“At this stage, the backlight feature of the Asycube 240 feeder really helps. It creates burst lighting for the camera to capture a high-contrast black-and-white image of each letter tile, making the part size and shape stand out to the vision system.

To determine the center of those parts, I masked the space within the rectangular bounds of the tile and then used the high-contrast edges.”
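The general idea can be sketched with off-the-shelf tools. Below is a short Python/OpenCV example of extracting a tile’s center from a backlit, high-contrast image. It is not the EYE+ tooling Chace configured, just an illustration of the same principle: the backlight turns each tile into a dark silhouette on a bright background, so its outline and centroid fall out easily.

```python
import cv2
import numpy as np

# Generic OpenCV sketch (not the EYE+ tooling) of finding a tile's center in a
# backlit, high-contrast image: each tile is a dark silhouette on a bright
# background, so its outline and centroid are straightforward to extract.


def tile_center(gray_image: np.ndarray):
    """Return the (cx, cy) pixel centroid of the largest tile silhouette, plus its contour."""
    # Tiles appear dark on the bright backlit surface, so invert the threshold.
    _, mask = cv2.threshold(gray_image, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    tile = max(contours, key=cv2.contourArea)          # largest blob = a full tile
    m = cv2.moments(tile)
    return m["m10"] / m["m00"], m["m01"] / m["m00"], tile
```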

Identifying part and letter orientation

“To determine the orientation of the tile in question, and consequently the rotation the robot needs to pick it correctly, I had to invert the strategy and mask the edges. From there, I had to be selective about which parts of the letter were masked and unmasked to establish an X- and Y-axis orientation. Each letter is vastly different, so this process is unique to each job.”
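Again as an illustration rather than the EYE+ masking recipe Chace describes, the sketch below shows one generic way to estimate a tile’s rotation with OpenCV: take the coarse angle of the square outline, then use the top/bottom asymmetry of the letter’s dark “ink” to decide which way the glyph faces. It reuses the tile contour returned by the previous sketch.

```python
import cv2
import numpy as np

# Generic OpenCV sketch (not the EYE+ masking recipe) of estimating a tile's
# rotation: the square outline gives a coarse angle, then the top/bottom
# asymmetry of the letter's dark "ink" decides which way the glyph faces.
# `tile_contour` is the contour returned by the previous sketch.


def tile_orientation(gray_image: np.ndarray, tile_contour) -> float:
    (cx, cy), (w, h), angle = cv2.minAreaRect(tile_contour)   # coarse angle of the square
    # De-rotate the image about the tile center so the glyph sits roughly upright
    rot = cv2.getRotationMatrix2D((cx, cy), angle, 1.0)
    h_img, w_img = gray_image.shape[:2]
    upright = cv2.warpAffine(gray_image, rot, (w_img, h_img))
    x0, y0 = max(int(cx - w / 2), 0), max(int(cy - h / 2), 0)
    crop = upright[y0:y0 + int(h), x0:x0 + int(w)]
    # Most letter glyphs are not symmetric top-to-bottom; check where the ink sits
    ink = crop < 128
    top, bottom = ink[: ink.shape[0] // 2].sum(), ink[ink.shape[0] // 2:].sum()
    return angle if top >= bottom else angle + 180.0
```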

How does the FANUC robot use the good part data?

Image showing the robot's vacuum end effector holding a tile over the cube feeder

Bringing it all together: picking and placing good parts!

“Finally, by combining all of this data for each letter, the vision system can rapidly identify the correct letter for the job recipe. The EYE+ feeds all this information very quickly to the robot as coordinates.

Using this positional data, I was able to program the FANUC robot’s movement and position.

The motion had to be accurate so that the center of the vacuum end effector would correspond to the center of the correct letter tile every time.”
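To picture what that positional data does, here is a small, hypothetical Python sketch that maps camera pixel coordinates into robot-frame millimetres with a calibrated affine transform and passes the tile angle along as a wrist rotation. On the actual cell this logic lives in the robot program and its position registers; the calibration numbers below are made-up placeholders, not measured values.

```python
import numpy as np

# Hypothetical sketch of turning vision coordinates into a pick pose.
# 2x3 affine transform mapping camera pixels -> robot-frame millimetres,
# as would come from a simple hand-eye calibration (placeholder numbers).
CAM_TO_ROBOT = np.array([[0.125,  0.000, 310.0],
                         [0.000, -0.125, 142.0]])


def pick_pose(px: float, py: float, tile_angle_deg: float):
    """Return (x_mm, y_mm, rz_deg) for the vacuum pick over the feeder."""
    x_mm, y_mm = CAM_TO_ROBOT @ np.array([px, py, 1.0])
    # Rotate the wrist by the tile's angle so the letter lands square on the rack
    return float(x_mm), float(y_mm), tile_angle_deg
```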

Were there any helpful features of the FANUC robot controller interface?

“Yes, I think FANUC makes it easy to teach the robot its positions and to move to those positions quickly.

Additionally, having two, or even three, separate viewing panels on the teach pendant makes it easy to bounce between Position Registers, code, and robot I/O.

FANUC robots are not only solid performers; they also come with technologies designed for simplicity. Their approach is intuitive, with a plug-and-play appeal geared toward getting up and running quickly. We had a narrow window of development time for this project, so those time savings were much appreciated.”

Image showing FANUC robot teach interface

How did the FANUC robot workcell perform at the trade show?

Chace setting up the FANUC robot workcell at ATX West 2024

“Once we finished the on-site assembly of the workcell at the trade show, the FANUC robot was ready to start spelling!

Over the three days of the trade show, with the robot operating continuously throughout each day, we had no major issues. The FANUC robot performed as expected, and our integrated Asyril vision system also shined.

On behalf of AC, I want to thank FANUC for their support of this project, Asyril, who came by to support us at the show, and M6 Revolutions for lending us their feeder system for this demonstration. We couldn’t have done it without each of you.

Below is a link to view our timelapse video of this robot and vision system build, which includes a view of the demo performing at the trade show. Enjoy!”

Ready to integrate FANUC robotics technology?

AC is a FANUC Authorized System Integrator; our engineers are experts in choosing, integrating, and configuring automated robotic workcells. We can help you select the right robot, vision, and other technologies for your unique application. How can we help you Automate the Impossible?

Special thanks to AC contributors: Timelapse configuration courtesy of AC Automation; behind-the-scenes robot video and photo assets captured by Chris Jaramillo, AC Technical Writer; trade show video footage and photo assets captured by Lindsay Fritz, AC Marketing Manager; video editing by Francesca Weeks, AC Sr. Technical Writer.