Automatic date printing on packages

DEC logo

By Yuk Ho Chung, Bauke Hendriks, Schelte van der Horst and Tim Onstwedder

Dynamic Ear Company (DEC) is a company from Delft, the Netherlands. DEC develops and manufactures professional hearing protection, audio and related products for musicians, industry, and other areas where sound levels may cause hearing damage.

Assignment

The production volume of Dynamic Ear Company is growing, and due to changes in regulations an expiry date will need to be present on the packaging. To reduce stock at the company, DEC aims to print the expiry date on the packaging on the day the product is packaged.

The packages are currently assembled manually, without the expiry date printed on the packaging, and DEC is investigating whether it is possible to automate this date printing process. This will eventually be implemented for all package types, but for this project the scope is limited to two types, the “Clamshell” and the “Ziploc”, shown below.

Packages

Clamshell package on the left and Ziploc package on the right

The solution

The packages are still assembled by the operators, who place them on the conveyor belt. The conveyor belt moves the packages towards the robot arm, where the vision system detects them.

The vision system has to detect whether a package is a Ziploc or a clamshell package. It looks for blobs and checks whether their area lies within a specific range (between the area of a Ziploc and that of a clamshell). Once blobs have been detected, the system has to determine the package type, the location where the robot has to pick the package up, and the orientation of the package, because the expiry date has to be printed within a 1 mm margin in a specific area of the package.

After the packages have been detected and identified, the robot arm picks each one up and moves it along the printer head, which prints the expiry date on the package. After printing, the robot moves to the correct box, puts the package in it, and repeats these steps until all detected packages have been picked up. If no packages are detected, the conveyor belt moves until the vision system detects packages again.

When a package is not detected or recognized, it moves off the end of the conveyor belt into a box. If desired, these packages can be placed back on the conveyor belt to go through the process again.
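This whole cycle can be summarized as a simple control loop. The outline below is a minimal sketch in Python; the helper functions (detect_packages, advance_conveyor, pick_up, print_date, place_in_box) are placeholder names used for illustration and are not the project's actual code.

# Minimal sketch of the pick-print-place cycle described above.
# All helper functions and package fields are hypothetical placeholders.
def run_cycle():
    while True:
        packages = detect_packages()            # vision system: type, pick-up position, orientation
        if not packages:
            advance_conveyor()                  # nothing in view: move the belt and look again
            continue
        for package in packages:
            if package.type is None:            # not recognized: leave it, it runs off the belt into a box
                continue
            pick_up(package.position, package.orientation)   # robot picks the package up with the suction cup
            print_date(package.type)            # move the package along the printer head
            place_in_box(package.type)          # drop it in the box for this package type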

The system

The expiry date printing process is implemented with the following components:

  • Conveyor belt
  • EOAT (End-of-arm-tool)
  • UR10 (collaborative robot arm)
  • Printing unit
  • Vision system
  • UP Board
  • GUI (Graphical User Interface)

Conveyor belt

The conveyor belt transports the packages that have been assembled by the operator to the robot arm. The printing unit and the vision system are also mounted on the conveyor belt frame; because the frames are mounted together, all components have a fixed position relative to each other.

EOAT

The end-of-arm tool (EOAT) is used to pick up the packages and to move them along the printer head, where the date is printed on the package. The EOAT is a suction cup that is controlled with air. The design of the EOAT is shown below.

EOAT

End-of-arm tool, consisting of the suction cup and the 3D-printed holder

UR10

The robot arm is used to pick up the packages, move them along the printer head and put them in the correct box. Dynamic Ear Company would like to use a collaborative robot: this type of robot has built-in safety features, which makes it safer when operators are working in the same area.

UR10

Universal Robot 10 or “UR10”
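As an illustration of how a UR10 can be commanded from Python, the sketch below uses the third-party python-urx library. This is only one possible way to drive the robot and not necessarily what was used in this project; the IP address and the pose are placeholder values.

import urx  # third-party library for Universal Robots; an assumption, not necessarily used in the project

robot = urx.Robot("192.168.0.10")   # placeholder IP address of the UR10 controller
try:
    # Move linearly to a pick pose above a package: (x, y, z, rx, ry, rz) in meters/radians, placeholder values
    robot.movel((0.40, -0.20, 0.15, 0.0, 3.14, 0.0), acc=0.5, vel=0.25)
finally:
    robot.close()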

Printing unit

Dynamic Ear Company has already invested in a Hitachi inkjet printer. The printer has a print head that can be mounted in any orientation. The nozzle is mounted on the conveyor belt frame so that it has a fixed position relative to the vision system.

Printer

Hitachi UX-D160W printer with the printing nozzle at the bottom right

Vision system

The vision system detects the packages on the conveyor belt. It uses a webcam, and a box is placed above the camera to prevent it from seeing light reflections, as the conveyor surface is reflective. As mentioned before, the camera frame is also mounted on the conveyor belt so that the different components keep a fixed position relative to each other.

Lightbox

Box on top of the camera frame, with the camera mounted on the frame; the printer nozzle and the EOAT are also visible

UP Board

The UP Board is a small computer that also has input/output pins. It connects all the different systems with each other: the conveyor, the suction and the robot can all be controlled from the UP Board.

upboard

The UP Board
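As a rough illustration, switching the conveyor and the suction on and off from the UP Board could look like the sketch below. It assumes an RPi.GPIO-compatible library and two digital output pins; the pin numbers and the library choice are assumptions, not the project's actual wiring.

import RPi.GPIO as GPIO   # assumption: an RPi.GPIO-compatible library is available on the UP Board

CONVEYOR_PIN = 17         # placeholder pin numbers, not the actual wiring
SUCTION_PIN = 27

GPIO.setmode(GPIO.BCM)
GPIO.setup(CONVEYOR_PIN, GPIO.OUT)
GPIO.setup(SUCTION_PIN, GPIO.OUT)

def set_conveyor(running):
    # Drive the conveyor output high to run the belt, low to stop it
    GPIO.output(CONVEYOR_PIN, GPIO.HIGH if running else GPIO.LOW)

def set_suction(on):
    # Switch the vacuum for the suction cup on or off
    GPIO.output(SUCTION_PIN, GPIO.HIGH if on else GPIO.LOW)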

GUI

The Graphical User Interface (GUI) is made in Python using the Tkinter library and is used to start and stop the whole system. It is also possible to change the speed of the conveyor belt and to see whether the system is running.

Screenshot from 2018-06-20 10-31-32

Graphical User Interface made with the Tkinter Python library
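A minimal Tkinter sketch with a start/stop button, a conveyor speed slider and a status label is shown below. The callbacks are placeholders; the real GUI also has to talk to the UP Board and the robot.

import tkinter as tk

def start_system():
    status_label.config(text="System running")   # placeholder: would start the conveyor and robot

def stop_system():
    status_label.config(text="System stopped")   # placeholder: would stop the whole system

root = tk.Tk()
root.title("Expiry date printing")

tk.Button(root, text="Start", command=start_system).pack(fill="x")
tk.Button(root, text="Stop", command=stop_system).pack(fill="x")

speed = tk.Scale(root, from_=0, to=100, orient="horizontal", label="Conveyor speed (%)")
speed.pack(fill="x")

status_label = tk.Label(root, text="System stopped")
status_label.pack()

root.mainloop()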

Major challenges

These are the major challenges we encountered during the project:

Vision system – hardware

At the start of the project, one of the largest challenges was isolating the packages from the background in the camera images. At first we tried a matte black background, but since one of the package types is black as well, this proved not to be the right solution. Because of this, the conveyor was switched to one with a light blue belt, which gave good contrast with the packages.

After switching the conveyor, another issue appeared: the surface of the blue belt was shiny, which caused the lights in the room to reflect off the conveyor belt into the camera.

This was fixed by adding a “roof” above the camera, which stopped the fluorescent ceiling lighting from shining onto the relevant part of the belt and reflecting into the camera.

Vision system – software

The switch to the blue conveyor made segmenting the packages from the background significantly easier. At first, we did this by analyzing the images from the camera in RGB space (that is, by looking at the standard red, green and blue pixel values). Later we realized segmentation would be much easier in HSV space, where HSV stands for Hue, Saturation, Value. By looking at the hue, it was very easy to filter the blue conveyor out of the images, regardless of shadows and lighting.

color spaces

“Red Green Blue” color space vs. “Hue Saturation Value” color space
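The sketch below shows this HSV approach with OpenCV: convert a camera frame to HSV, threshold on the hue of the blue belt, and invert the mask so that only the packages remain. The hue, saturation and value bounds are assumed values, not the thresholds that were tuned for the actual setup.

import cv2
import numpy as np

frame = cv2.imread("frame.png")                  # one camera image (placeholder file name)
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)     # OpenCV loads images as BGR

# Hue/saturation/value range of the blue belt: assumed values
lower_blue = np.array([90, 60, 40])
upper_blue = np.array([130, 255, 255])

belt_mask = cv2.inRange(hsv, lower_blue, upper_blue)   # white where the belt is visible
package_mask = cv2.bitwise_not(belt_mask)              # everything that is not belt, i.e. the packages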

Apart from this, recognizing the type of package proved to be a challenge. At first, we compared histograms (graphs of how often each color occurs in an image) of the packages.

This caused problems because the amount of background color in the image depends on the angle of the package. We solved this by looking at the area of the package instead, since the two package types have different sizes.
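In code, this area check can be done on the contours of the package mask, as in the sketch below (continuing from the HSV sketch above). The pixel-area thresholds are placeholder values, since the real limits depend on the camera height and resolution.

# Continues from the HSV sketch above: package_mask is the binary mask of the packages.
contours, _ = cv2.findContours(package_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for contour in contours:
    area = cv2.contourArea(contour)
    if 15000 < area < 30000:          # placeholder area range for one package type
        package_type = "Ziploc"
    elif 30000 <= area < 60000:       # placeholder area range for the other type
        package_type = "Clamshell"
    else:
        package_type = None           # too small or too large: not a known package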

Integration

There were challenges integrating the complete system on the UP Board. In the beginning we used a RealSense camera to capture images for the vision system, but we were not able to get it working on the UP Board. That took a couple of days to troubleshoot, after which we decided to use a webcam instead. Connecting the conveyor belt to the UP Board also took a few days to get working properly, and getting all the modules and libraries installed took some time as well. Using the UP Board was quite time-consuming, but worth it in the end, since all components can be controlled from it.

Conclusion

Finding and picking up the packages and printing the date both work. However, getting the pick-up location and the angle out of the vision system still takes around 10 seconds, because of the heavy calculations that have to be done on the relatively small computer.

The vision system does not work flawlessly: some errors still occur. For example, on the “Clamshell” package an incorrect part of the package is sometimes identified as a “marker”, resulting in an incorrect print. Moreover, the system will still try to pick up other objects if they have features similar to the Ziploc or Clamshell packages. We believe that if the type recognition were done by comparing histograms of the current image to a reference image, it would probably work better. The accuracy of the printing is not within the target margin of ±1 mm yet, but with some more effort the repeatability can be improved.

In conclusion, with more time the system could still be improved a lot, but the basic functionality of automating the printing of the expiry date works.