Alive Cell Tracking Alpha
An alpha version of a tracking algorithm targeting live cells with strong deformability, without exposing them to fluorescence.
This is an iGEM 2019 product from team SUSTech_Shenzhen.
- The design was discussed by Peter S and Eric Huang.
- The coding was mainly completed by Peter S.
- The GUI interface was implemented by Huang Chaoxin.
For researchers and iGEMers: feel free to use this tool if you want to keep track of live cells over a relatively long period of time, without using dyes or fluorescence.
However, be aware that this is only a prototype, meaning that in certain situations it may not work as well as expected. More tests and adjustments are needed at the current stage.
Features
- Customizable Parameters
- Simple GUI Interface with File Chooser
- Visualized Cell Selectors
- Real-time Tracking View
- Precise Missing-Cell Detection and Multi-tracker Reinitialization
- Scalable Output Dataset
- Additional Data Analysis Script
Motivation
In our project, we need to track HL60 cells to see whether their movement is affected by IL-8. Specifically, we need to keep the cells alive for a certain amount of time, so we cannot expose them to fluorescence all the time. The other problem is that each tracking run produces more than 1500 frames, which makes it a nightmare for researchers who want to mark each cell manually. If we can build a simple piece of software that tracks as many cells per frame as possible, while controlling the precision at a specified level, the problems above can be solved.
Current Situation
For the time being, most of the cell trackers available online can only detect the positions of dead cells, rather than track moving cells.
There are also plenty of tools dedicated to tracking molecules in the cellular environment, but those molecules are labeled with fluorescence as well, which does not meet our needs. In general, we need an approach to track cells that are:
- Alive
- Deformable (their morphology keeps changing), and
- Not treated with dyes or fluorescent labels.
Roadblocks
We encountered many problems during implementation, listed below:
- The cells we are targeting are deformable; their shapes are not fixed.
- Due to limited time, it is not feasible to implement a tracker from scratch, and existing trackers have a fair chance of failing on cells.
- With the OpenCV Multi-Tracker API, the cell coordinates must be handled dynamically as NumPy arrays, and that dynamic bookkeeping is the hard part (see the sketch after this list).
- With existing trackers such as CSRT, learned filter sets for cells are not abundant. Before detection is fully functional, the tracking process can still be easily interrupted when certain cells go missing.
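To illustrate the NumPy bookkeeping mentioned above, here is a minimal sketch (the variable and function names are ours, not taken from the actual codebase) of how per-frame bounding boxes returned by the multi-tracker can be turned into cell centers and accumulated:

```python
import numpy as np

# Illustrative sketch only -- names do not come from the project code.
# `boxes` is the per-frame output of MultiTracker.update(): one (x, y, w, h) row per cell.
trajectory = []  # one (num_cells, 2) array of centers per frame

def record_frame(boxes):
    boxes = np.asarray(boxes, dtype=float)        # shape (num_cells, 4)
    centers = boxes[:, :2] + boxes[:, 2:] / 2.0   # (x + w/2, y + h/2) for each cell
    trajectory.append(centers)

# Because cells can be added or dropped mid-run, the width of the per-frame
# arrays can change, which is exactly the "dynamic" handling that is awkward
# to fit into a single fixed-shape array.
```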
Ideas and Design
- There are 8 standard trackers in the OpenCV package: BOOSTING, MIL, KCF, TLD, MEDIANFLOW, GOTURN, MOSSE, and CSRT.
After testing each tracker, we found that the CSRT tracker has the best tracking precision on cells.
CSRT is also known as CSR-DCF (Discriminative Correlation Filter Tracker with Channel and Spatial Reliability). It uses learned filter sets to detect candidate objects, weighted by channel and spatial reliability. With this tracker, we are able to locate each cell with good precision; a minimal usage sketch is shown below.
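As a hedged usage sketch (not the project's actual code; the file name is illustrative, and on some OpenCV builds the constructor lives under cv2.legacy), a single CSRT tracker is created and updated like this:

```python
import cv2

# Sketch: follow one manually selected cell with a CSRT tracker.
cap = cv2.VideoCapture("cells.avi")          # illustrative input video
ok, frame = cap.read()

box = cv2.selectROI("Select a cell", frame, showCrosshair=True)
tracker = cv2.TrackerCSRT_create()           # or cv2.legacy.TrackerCSRT_create()
tracker.init(frame, box)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, box = tracker.update(frame)
    if found:
        x, y, w, h = (int(v) for v in box)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Tracking", frame)
    if cv2.waitKey(30) & 0xFF == 27:         # press Esc to stop
        break

cap.release()
cv2.destroyAllWindows()
```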
- For a period of time, we tried to implement the platform in C++, but we later switched to Python, because the plotting tool (Matplotlib) and the file chooser (tkinter) can be easily integrated into the platform; a sketch of the file-chooser call is shown below.
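As a small sketch, the file chooser boils down to tkinter's standard dialog (the file-type filter here is illustrative):

```python
import tkinter as tk
from tkinter import filedialog

# Sketch: open a native dialog and return the chosen video path.
root = tk.Tk()
root.withdraw()                                # hide the empty root window
video_path = filedialog.askopenfilename(
    title="Choose a source video",
    filetypes=[("Video files", "*.avi *.mp4"), ("All files", "*.*")],
)
print("Selected:", video_path)
```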
The next thing to consider seriously is the missing judgment: deciding when a cell counts as lost.
A cell can get lost under two conditions:
- The multi-tracker reports that it has lost track of the cell.
- The cell may no longer be alive, so it just "sits" in the video, or, under slight turbulence, it seems to "dawdle" around. When this happens for more than a certain number of consecutive frames, it is reasonable to doubt whether the cell is still alive; a sketch of this check follows the list.
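Below is a hedged sketch of the second condition: a cell counts as "dawdling" if its center moves less than a small pixel threshold, and becomes suspect once that happens for too many consecutive frames. The threshold names and values are illustrative, not the actual parameters from Config.py:

```python
import numpy as np

DAWDLE_DIST = 2.0    # illustrative: max displacement (pixels) still counted as dawdling
DAWDLE_FRAMES = 30   # illustrative: consecutive dawdling frames before a cell is doubted

def update_dawdle_count(prev_center, curr_center, dawdle_count):
    """Return the updated consecutive-dawdle counter for one cell."""
    displacement = np.linalg.norm(np.subtract(curr_center, prev_center))
    return dawdle_count + 1 if displacement < DAWDLE_DIST else 0

def is_suspect(dawdle_count):
    """A cell that has dawdled for too many frames may no longer be alive."""
    return dawdle_count >= DAWDLE_FRAMES
```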
Using the ideas above, we drew a flow chart to guide the coding. After setting the parameters in the GUI, the user selects and reads a source video, then selects enough cells to track. The program automatically assigns random colors to the cells and initializes a multi-tracker. While the video has not ended, the program judges whether each cell is dawdling; if a cell has dawdled for too long, or too many cells are missing, the missing cells are removed and the multi-tracker is reinitialized; otherwise the program reads the next frame. Once the video ends, only the tracks that last longer than vt are kept. If the user then wants to track more cells, the program goes back to the cell-selection step; otherwise all the retained data are saved into a file and plotted as a graph.
Implementation
The main algorithm follows this pseudocode (a condensed code sketch follows the steps):
1. Handle all parameters in the GUI platform.
2. Start the program by reading a source video.
3. Select cells to track.
4. Assign random colors to the newly selected cells.
5. Initialize a multi-tracker.
6. Keep tracking the cells until the tracker finishes its life cycle or the number of missing cells exceeds the threshold.
7. If the user wants to continue adding new cells, jump back to step 3 (the selection process); otherwise, go to the next step.
8. Save the tracked cell positions into a file.
9. Use the data to plot a graph.
10. Finish.
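A condensed, hedged sketch of that loop is given below. The real implementation lives in the repository; the file names, the vt handling, and the exact OpenCV constructors (cv2.legacy on newer builds, plain cv2 on older ones) are assumptions made for illustration:

```python
import random
import cv2
import numpy as np
import matplotlib.pyplot as plt

cap = cv2.VideoCapture("cells.avi")                 # illustrative source video (step 2)
ok, frame = cap.read()

boxes = cv2.selectROIs("Select cells", frame)       # step 3: draw a box around each cell
colors = [tuple(random.randint(0, 255) for _ in range(3)) for _ in boxes]  # step 4

multi_tracker = cv2.legacy.MultiTracker_create()    # step 5 (cv2.MultiTracker_create() pre-4.5)
for box in boxes:
    multi_tracker.add(cv2.legacy.TrackerCSRT_create(), frame, tuple(float(v) for v in box))

tracks = [[] for _ in boxes]                        # per-cell list of (x, y) centers

while True:                                         # step 6
    ok, frame = cap.read()
    if not ok:
        break
    found, new_boxes = multi_tracker.update(frame)
    for i, (x, y, w, h) in enumerate(new_boxes):
        tracks[i].append((x + w / 2, y + h / 2))
        cv2.rectangle(frame, (int(x), int(y)), (int(x + w), int(y + h)), colors[i], 2)
    cv2.imshow("Tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:                 # Esc aborts the run
        break

cap.release()
cv2.destroyAllWindows()

np.save("tracks.npy", np.array(tracks, dtype=object), allow_pickle=True)   # step 8

for track in tracks:                                # step 9: one trajectory per cell
    if track:
        xs, ys = zip(*track)
        plt.plot(xs, ys)
plt.gca().invert_yaxis()                            # image coordinates: y grows downward
plt.title("Cell trajectories")
plt.show()
```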
To check the detailed implementation, visit the GitHub page of our project.
The timeline of our implementation is as follows:
- Implement basic multi-tracker function.
- Dig out the parameters and collect them into a separate file, Config.py (an illustrative sketch of such a file is given after this list).
- Add a file chooser to let the user choose the source video in a more visual way.
- Implement the plotting function to plot the results.
- Handle the data saving process.
- Further implement the missing-cell detection procedure to detect missing cells and delete them from the dataset.
- When cells go missing, their previous data may still be valuable, so we check whether the longest run of continuous valid frames these cells achieved exceeds the threshold.
- Implement a simple GUI to manage the parameters.
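For reference, a parameter file of this kind might look like the sketch below; the names and default values are illustrative guesses and are not taken from the actual Config.py in the repository:

```python
# Config.py -- illustrative sketch of a centralized parameter file.
# None of these names or values are guaranteed to match the real project.

VALID_TIME_THRESHOLD = 50   # "vt": minimum number of valid frames for a track to be kept
DAWDLE_DIST = 2.0           # max per-frame displacement (pixels) counted as dawdling
DAWDLE_FRAMES = 30          # consecutive dawdling frames before a cell is considered missing
MAX_MISSING = 5             # reinitialize the multi-tracker when this many cells go missing
OUTPUT_FILE = "tracks.csv"  # where the tracked positions are saved
```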
A Simple Demo of our Platform
GUI Interface
File Chooser
Cell Selection & Multi-tracking
Missing Cases
Results
Data Analysis with a Provided Script
License
The license we use for this project is CC BY 4.0.