Team:Marburg/Colony Picking

C O L O N Y   P I C K I N G


Modify your OT-2 into a colony picker for less than $300

Figure 1 - The OT-2 equipped with the CPU (Colony Picking Unit).

The Opentrons OT-2 is currently the cheapest liquid handling robot on the market, and each year many iGEM teams get their hands on one through the Opentrons Robot Challenge. This pipetting robot is interesting not only for iGEM teams but also for start-up companies. To exploit the full potential of the robot and to facilitate the cloning workflow, we decided to turn the OT-2 into a fully functioning, open-source colony picker at a relatively low price.

Upgrade your OT-2 with the CPU (Colony Picking Unit)

To enable colony picking with the OT-2, we developed a package of custom-made modular hardware called the Colony Picking Unit (CPU), which fits perfectly into the slots of the OT-2. All devices are simple to install, easy to use, and low-cost, while maintaining high quality.

Figure 2 - The Raspberry Pi 4 and the ArduCAM fixed in the 3D printed case attached to the arm of the OT-2.

Figure 3 - Our custom-made light table in action while taking a picture of a 90 mm agar plate with colonies.

The CPU consists of two parts: the light table, which contains an Arduino and LEDs so the light can be switched on and off automatically while pictures of the plate are taken, and the mount for the OT-2 arm, which holds the Raspberry Pi 4 Model B with an ArduCAM that takes and subsequently analyses the corresponding picture.
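
To give an idea of how the two parts play together, here is a minimal sketch of the capture sequence as it could run on the Raspberry Pi 4. The serial port, baud rate, command bytes, and the use of the pyserial and picamera libraries are illustrative assumptions, not our exact implementation (see the GitHub repository for that):

import time

import serial                  # pyserial: talks to the Arduino over USB
from picamera import PiCamera  # assumes a CSI-connected ArduCAM module

arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=2)
time.sleep(2)                  # give the Arduino time to reset after connecting

camera = PiCamera()

arduino.write(b"1")            # hypothetical command: light table LEDs on
time.sleep(0.5)                # let the illumination stabilize
camera.capture("plate.jpg")    # photograph the agar plate
arduino.write(b"0")            # hypothetical command: LEDs off

camera.close()
arduino.close()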

Of course, all the files required for 3D printing and instructions on how to install the electronic devices are provided open-source (description and Github repository).

Marburg Colony Identification Neural Network (MCoINN) powered colony detection

To analyze the pictures and identify colonies, we decided to use a neural-network-based artificial intelligence (AI) instead of a hard-coded image processing approach. This ensures a flexible solution and makes it possible to extend the functionality even further in the future.

To train MCoINN, many pictures of agar plates containing E. coli colonies were taken, not only by our team but also by other iGEM teams in the context of a collaboration. Each individual colony that seemed suitable for picking was manually labelled, serving as training data for MCoINN to define the characteristics of a suitable colony.
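
As an illustration of what such labels can look like, one common representation (an assumption here; the exact format depends on the training pipeline) is one bounding box per suitable colony, in image pixels:

import csv

# Hypothetical manual labels: one row per colony deemed suitable for picking.
labels = [
    # filename, xmin, ymin, xmax, ymax, class
    ("plate_001.jpg", 310, 142, 338, 170, "colony"),
    ("plate_001.jpg", 512, 480, 544, 512, "colony"),
]

with open("colony_labels.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["filename", "xmin", "ymin", "xmax", "ymax", "class"])
    writer.writerows(labels)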

Figure 4 - Picture of the agar plate that was taken with the ArduCAM before (left) and after (right) being processed by the AI. The identified colonies are marked in green.

MCoINN was trained with around 200 pictures, each containing multiple colonies. The resulting colony predictions closely matched the manually labelled training data. This surprised us: we had expected to need far more training data (~1000 pictures), yet we did not even need image augmentation. With more time, it would be interesting to retrain the algorithm with augmented images, which would multiply the quantity of data by at least a factor of four. This is a very welcome development, because it lowers the threshold for a custom AI algorithm even further: a new user should only need around 50 colony pictures and an image augmentation algorithm to train a functional AI. Moreover, other users who are interested in building their own AI will also benefit from our dataset, which can provide basic structures for the AI regardless of what their colonies look like. Because the pictures came from our colony picking picture collaboration, the AI was exposed to various kinds of real lab conditions, which benefits the training. More details on the implementation of the AI can be found in our AI documentation and Docker documentation.
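
To make the factor-of-four concrete, here is a sketch of the kind of image augmentation we have in mind: flips and a 180° rotation turn each picture into four training images while preserving colony shapes. The folder names and the use of the Pillow library are illustrative assumptions:

from pathlib import Path

from PIL import Image

SRC = Path("plates")            # hypothetical folder of labelled pictures
DST = Path("plates_augmented")
DST.mkdir(exist_ok=True)

for path in SRC.glob("*.jpg"):
    img = Image.open(path)
    variants = {
        "orig": img,
        "hflip": img.transpose(Image.FLIP_LEFT_RIGHT),
        "vflip": img.transpose(Image.FLIP_TOP_BOTTOM),
        "rot180": img.rotate(180),
    }
    for name, variant in variants.items():
        variant.save(DST / f"{path.stem}_{name}.jpg")

Note that the bounding-box labels belonging to each picture would have to be transformed in the same way.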

Figure 5 - Visualization of the training via TensorBoard, with the run number on the x-axis and the loss value on the y-axis.

Let the OT-2 pick them for you

Combining the designed hardware with the trained AI turns the OT-2 into a low-budget colony picker. The only things we had to do were to place the light table in the OT-2 and connect it to the robot, mount the Raspberry Pi 4 and the ArduCAM onto the arm, put an agar plate onto the light table, and take a picture of that plate. The trained AI does the rest: it identifies the colonies and translates their coordinates from the picture into coordinates of the OT-2 system, so the robot can move to each position and pick the identified colony.

Figure 6 - Colony picking in action - the OT-2 in the process of picking a colony with the P10.

Via the GUI, the OT-2 takes a picture of the plate lying on the light table, which is then processed by MCoINN. After analysis, the coordinates of the colonies are extracted from the picture and translated into coordinates intelligible to the OT-2. Via the move_to function from the robot library provided by Opentrons, the pipette is moved to the target colony, which is then picked.
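
The coordinate translation can be pictured as a simple linear mapping from camera pixels to deck millimetres, as in the sketch below. The calibration constants are illustrative assumptions (the real calibration happens via the GUI), and the sketch uses the current Opentrons Python API, whereas the text above refers to the move_to function of the older robot library:

from opentrons import protocol_api, types

metadata = {"apiLevel": "2.13"}

MM_PER_PX = 90.0 / 1800.0      # assumed scale: plate diameter in mm / in pixels
PLATE_ORIGIN = (120.0, 150.0)  # assumed deck position (mm) of the image origin
PICK_HEIGHT = 2.0              # mm above the deck when touching a colony

def pixel_to_deck(px, py):
    # Map a colony's pixel coordinates to an OT-2 deck coordinate.
    return types.Point(
        x=PLATE_ORIGIN[0] + px * MM_PER_PX,
        y=PLATE_ORIGIN[1] - py * MM_PER_PX,  # image y grows downwards
        z=PICK_HEIGHT,
    )

def run(protocol: protocol_api.ProtocolContext):
    tips = protocol.load_labware("opentrons_96_tiprack_10ul", 1)
    pipette = protocol.load_instrument("p10_single", "right", tip_racks=[tips])
    pipette.pick_up_tip()
    # Move onto a colony that MCoINN detected at pixel (812, 640).
    pipette.move_to(types.Location(pixel_to_deck(812, 640), None))
    pipette.drop_tip()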

All done on the Graphical User Interface for Directed Engineering (GUIDE)

We at iGEM Marburg are serious about bringing the AI and robotics revolution to biology labs. For this, the system has to be user friendly, and therefore we created a GUI. It increases the accessibility of the software: with the GUI, people with less technical experience can also operate the colony picker. Moreover, the AI training is packaged into the GUI, allowing people to retrain their colony picker according to their own individual needs. Using AI as a backend makes our system flexible, and using a GUI makes it more accessible.

The GUI is written in Python using the Kivy library. On the backend it connects to a server/client on both the camera module and the Raspberry Pi inside the OT-2, acting as a man-in-the-middle that relays communication between the two. The GUI enables calibration of the OT-2, a labware deck overview, and colony picking within a few clicks, guaranteeing an accurate and intuitive colony picking experience.
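
As a minimal sketch of this architecture, the following hypothetical Kivy app offers a single button that sends a capture command to the camera module over a TCP socket. The address, port, and wire protocol are assumptions; the real GUI is documented in our repository:

import socket

from kivy.app import App
from kivy.uix.button import Button

CAMERA_ADDR = ("192.168.0.10", 5000)  # hypothetical address of the Raspberry Pi

class PickerGUI(App):
    def build(self):
        return Button(text="Take picture", on_press=self.capture)

    def capture(self, _button):
        # Relay a capture request to the camera module and print its reply.
        with socket.create_connection(CAMERA_ADDR, timeout=5) as conn:
            conn.sendall(b"capture\n")
            reply = conn.recv(1024)
        print("camera replied:", reply.decode())

if __name__ == "__main__":
    PickerGUI().run()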

Figure 7 - The Graphical User Interface for Directed Engineering (GUIDE).

Build an AI for Your Own Needs

Our AI is trained to detect white E. coli colonies, but we understand that other people’s needs will differ. That is why we chose AI instead of a hard-coded algorithm: so that other potential users can build one according to their needs. Some would like to pick bigger, smaller, or green colonies; you name it. We designed our GUI so that other users can easily train their own AI to suit their needs.

We packaged all of our AI work in a Docker container, which makes training a new custom AI as easy as running a few commands. Users just need to provide a folder containing training and test images according to their own needs. The AI can then be trained on a local computer and outputs a frozen inference graph at the end, which can be utilized according to each user’s needs. More details on the implementation of the AI can be found in our AI documentation and Docker documentation.
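
To show what using that output could look like, here is a sketch that loads a frozen inference graph with TensorFlow 1.x-style APIs and runs it on a dummy image. The file name and tensor names follow TensorFlow Object Detection API conventions and are assumptions; consult our AI documentation for the exact pipeline:

import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Load the frozen graph produced by training (assumed file name).
with tf.gfile.GFile("frozen_inference_graph.pb", "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")

with tf.Session(graph=graph) as sess:
    image = np.zeros((1, 300, 300, 3), dtype=np.uint8)  # dummy input image
    boxes, scores = sess.run(
        ["detection_boxes:0", "detection_scores:0"],  # assumed tensor names
        feed_dict={"image_tensor:0": image},
    )
    print(boxes.shape, scores.shape)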

How does it all work?

Here is a flowchart that explains our project:

Figure 8 - Conceptual flowchart of the CPU.
