GSOC 2021 Progress Updates


Joined the BeagleBoard community and submitted a GSOC project proposal


Week 1 (June 7-13): Received hardware, got organized with mentors and the community. (Also passed my final exams!)

Week 2 (June 14-20): Now is the time to really get to work. I am starting by setting up this status update blog, reading some relevant materials, discussing project deliverables with mentors, and making an intro video about the scope of the project.

Reading items: BeagleBone Cookbook, Exploring BeagleBone, and TinyML

Week goals:

  • set up the hardware
  • flesh out the work plan for the rest of the summer
  • speak with mentors and make a plan for future meetings
  • record the intro video

Week 3 (June 21-27):

This week I read a lot of the books I started last week. I also bought one more book (eek! I promise myself I will give some of them away once I learn their contents). This additional book, Embedded Linux Primer, is really fabulous and is demystifying a lot of what I was doing last week to get set up. I’m beginning to understand what I was doing when I followed the steps in this guide: first installing the Linaro cross compiler, then setting up U-Boot and compiling a specific kernel for my BeagleBone. This week I also got that working and was able to boot my BeagleBone AI from the SD card I’d formatted and install additional TIDL packages etc.
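
As a rough sketch of that flow (the toolchain triple and the U-Boot defconfig name are assumptions from my setup, so double-check them against whatever guide you follow):

```shell
# Point builds at the ARM cross toolchain (Linaro-style triple assumed):
export ARCH=arm
export CROSS_COMPILE=arm-linux-gnueabihf-
# Then, roughly:
#   in the U-Boot tree:  make am57xx_evm_defconfig && make
#   in the kernel tree:  make <board_defconfig> && make zImage dtbs modules
# and finally copy MLO/u-boot.img, zImage, and the dtbs onto the
# formatted SD card so the BeagleBone AI can boot from it.
```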

By the end of this week, I had met all the mentors and figured out how to ssh into my BeagleBone. With the mentors we talked about trying TFLite on Arduino to compare how it works there, and about the TI Deep Learning (TIDL) library.

Week 4 (June 28-July 4):

Kept reading. Wasn’t able to work as much as I’d hoped, but still made some (slow) progress at understanding my own workflow, and decided to play around with the pre-existing TIDL examples. Unfortunately I have not been able to compile them, and since I think I have all the components of OpenCV installed, I’m not sure why this is happening. I’m still getting the error:

In file included from classification.tidl.cpp:46:
/usr/share/ti/tidl/tidl_api/inc/imgutil.h:32:10: fatal error: opencv2/core.hpp: No such file or directory
 #include "opencv2/core.hpp"

even after doing

sudo apt update ; sudo apt install libopencv-core-dev
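
What I’m using to check where (or whether) the header actually landed, as a debugging step rather than a fix:

```shell
# Look for core.hpp anywhere under /usr/include; if this prints nothing,
# the package I installed simply doesn't ship the header, and no -I flag
# will save the build.
find /usr/include -name 'core.hpp' 2>/dev/null
```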



Week 5 (July 5-11):

During a mentor meeting, we decided to investigate how to control the IPU on the BBAI directly, so that we don’t have to use TIDL and rely on TI to maintain that package. So far this is a “try and see” project, and the plan is to apt-get source for the ipumm-dra7xx-installer package using the TI Debian 4.19 kernel (IoT, armhf) image. I’m having some trouble with the new image, and am working on that now.
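
Sketching the plan as commands (the package name is from our meeting notes; enabling deb-src entries is my assumption about why apt-get source might fail out of the box):

```shell
# apt-get source only works once deb-src entries are enabled; this shows
# which entries the image has (commented out or not):
grep -h 'deb-src' /etc/apt/sources.list /etc/apt/sources.list.d/*.list 2>/dev/null || true
# then, on the board:
#   sudo apt update
#   apt-get source ipumm-dra7xx-installer
```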

The main resources we’ve found so far for the IPU are written in Rust.

I’m trying out the examples from TensorFlow Lite for Microcontrollers on Arduino. The hello_world one just produces a sine wave.
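
The model in that example just learns to approximate y = sin(x) over one period; here’s a TFLite-free peek at the target function (plain awk, so the printed values are exact math, not model output):

```shell
# Print the curve hello_world is trained to approximate:
awk 'BEGIN {
  pi = atan2(0, -1)
  for (i = 0; i < 8; i++) {
    x = 2 * pi * i / 8
    printf "x=%.2f  sin(x)=%+.3f\n", x, sin(x)
  }
}'
```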

Week 6 (July 12-18):

I am working with a new image, 4.19.94-ti-r64, and in parallel I’m trying to get ti-tidl set up nicely so I can compile the on-board examples, and also investigating the ipumm package.

Neither of these has worked out yet.

Week 7 (July 19-25):

My mentor is also working on getting TIDL set up and is having issues too. I am going to investigate some of the Rust examples.

Week 8 (July 26-Aug 1):

Followed the TensorFlow Lite directions to cross compile for the Armv7 processor. Also worked through the TinyML book example of training and testing a basic model. In case anyone is wondering, TFLite Micro is now its own git repo.

Another thing I figured out: the way to install OpenCV that actually works is:

sudo apt install libopencv-dev



Week 9 (Aug 2 -8):

We’re finally zeroing in on what needs to be done. Not sure if there is new TFLite Micro documentation (the whole repo moved on July 25 to be separate from the rest of TensorFlow), or if I’m just seeing it and understanding it better. The documentation explains that we can run TFLite Micro on any processor from which we can get debug output. We had thought the IPUs make the most sense, but maybe it’s easier to get that output from the DSP?

Another option is to go back to the TI SDK Yocto build and use TFLite on that.

In the meantime, while figuring this out, I’m also making progress on the TFLite Arduino examples in the TinyML book and in the TFLite repo.

Week 10 (Aug 16-22):

GSOC is coming to a close and the initial problem for my project isn’t solved yet: how to get TFLite working on the BeagleBone AI. I went away last week, then spoke to my mentor again this week. It seems like Yocto is not going to solve all of our problems, but another thought is to install the Arduino IDE directly on the BeagleBone AI and connect a monitor to the board so that I can run things through the Arduino IDE. School starts again next week, but I will give this a try when I get a chance. I do want to find a working solution, since there are many users out there who would like to use TFLite on the BeagleBone AI.