I hadn’t really heard of the TPU chips until a couple of weeks ago, when my boss told me how he uses the USB versions for at-home ML processing of his closed-network camera feeds. At first I thought he was using NVIDIA GPUs in some sort of desktop unit and just burning energy… but I looked the USB devices up and they’re wildly efficient, and he says they work just fine for his applications. I was impressed.
Yeah, they’re pretty impressive for at-home stuff, and they’re not even that costly.
The Coral is fantastic for use cases that don’t need large models. Object recognition for security cameras (with Blue Iris or Frigate) is the most common one, but you can also do object tracking (following where individual objects move through a video), pose estimation, keyphrase detection, sound classification, and more.
It runs TensorFlow Lite, so you can also build your own models.
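If you want to roll your own, inference is only a few lines with the PyCoral library. Rough sketch below; the model, label, and image file names are placeholders, and the .tflite model has to be compiled for the Edge TPU first:

    # Rough sketch: image classification on a Coral Edge TPU via PyCoral.
    # Model/label/image paths are placeholders; the model must be an
    # Edge TPU-compiled .tflite file.
    from PIL import Image
    from pycoral.adapters import classify, common
    from pycoral.utils.dataset import read_label_file
    from pycoral.utils.edgetpu import make_interpreter

    interpreter = make_interpreter("mobilenet_v2_edgetpu.tflite")
    interpreter.allocate_tensors()

    # Resize the input image to whatever size the model expects.
    image = Image.open("frame.jpg").resize(common.input_size(interpreter))
    common.set_input(interpreter, image)
    interpreter.invoke()

    labels = read_label_file("labels.txt")
    for c in classify.get_classes(interpreter, top_k=3):
        print(labels.get(c.id, c.id), round(c.score, 2))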
Pretty good for a $25 device!