HuskyLens is an easy-to-use AI machine vision sensor. It is equipped with multiple functions, such as face recognition, object tracking, object recognition, line tracking, color recognition, and tag (QR code) recognition.
#42489
I'm working with Huskylens tag recognition, polling to see whether there are new tags or changes to tag parameters. I'd like to know how long I should wait between calls to the requestBlocks() and read() functions. Is 20ms between calls too short a time?

Thanks.
#42498
james.stanley wrote:
Mon Jun 29, 2020 4:26 am
Huskylens tag recognition, polling to see whether there are new tags or changes to tag parameters. ...
Have you tested (polling rate = xx ms):
- time from last no tag visible, to first tag_id_0 (single tag, none learned yet) returned?
- time from last no tag visible, to first tag_id_1 (single learned tag) returned?
- time from one tag_id_1 at position x0,y0 to first tag_id_1 at position x1,y1 returned?
- time from last no tag visible, to first tag_id_2 (single learned tag from multiple learned tags) returned?
- time from last no tag visible, to first multiple_learned_tags returned for two tags in view?

And to determine whether the polling rate affects the recognition timing, did you repeat one or more of these tests with the polling rate set at twice, half, and one-third the rate of the corresponding test above? If the time goes up at twice the rate and down at half the rate, that is evidence that polling too often affects the recognition rate. If the time does not change by more than the variance across a set of tests, then the polling rate does not affect the recognition rate.

Additionally, did you test whether there is any effect on the recognition rate between the "basic firmware" and the "object classification firmware"?

All this is to suggest that your question about the optimum polling rate may not have a single answer, and the answer may vary from release to release.

The community would benefit greatly if you or someone would post their findings.
#42501
Hi James,
Thank you for reaching out.
Please note that requestBlocks() requests all block data from the HuskyLens and stores it in your mainboard's buffer, so once the request finishes you can call the read() function directly; there is no need to wait.
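In code, that request-then-read pattern looks roughly like the sketch below. It assumes the official DFRobot HUSKYLENS Arduino library over I2C; the baud rate and the choice of request() (all results) rather than requestBlocks() are illustrative choices, not taken from the original posts.

```cpp
#include <Wire.h>
#include "HUSKYLENS.h"

HUSKYLENS huskylens;

void setup() {
    Serial.begin(115200);
    Wire.begin();
    // begin() returns false until the sensor answers on the I2C bus.
    while (!huskylens.begin(Wire)) {
        Serial.println("HuskyLens not found, retrying...");
        delay(100);
    }
}

void loop() {
    // request() pulls every current result from the HuskyLens into the
    // mainboard's local buffer in one transaction; read() then only walks
    // that buffer, so no delay is needed between the two calls.
    if (huskylens.request()) {
        while (huskylens.available()) {
            HUSKYLENSResult r = huskylens.read();
            Serial.print("ID ");
            Serial.print(r.ID);
            Serial.print(" at (");
            Serial.print(r.xCenter);
            Serial.print(", ");
            Serial.print(r.yCenter);
            Serial.println(")");
        }
    }
}
```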
#42503
Thanks for your detailed reply. I can do some of these tests fairly readily.

I'm most concerned about placement accuracy -- I need to know when a tag is in the center of the screen. It's not clear what might interfere with that, whether it's polling frequency, the speed of the tag moving across the screen, etc. These are the things I'll be looking at.
#42513
nana.wang, thanks for your reply. My initial posting was not clear enough -- I wanted to know how long I need to wait between calls to request().

I've since noticed that the sample code in the Huskylens documentation does not have any delay in the loop() where request is called.

I've run my own code in a loop with different delay times between calls to request() (20ms up to 200ms, running on an Adafruit Feather M4 Express with SAMD51 processor). The delays don't make any difference -- using the serial interface, it takes about 4ms to perform the request and the read when there is a tag on the screen.

The time doesn't change (within the resolution of the millis() timer) if I move the camera to move the tag around on the screen.

Apparently Huskylens isn't being overloaded no matter how rapidly I call request().
#42527
Hello James,
When the mainboard calls request(), HuskyLens sends all of its data until the transfer is complete, so it does not matter whether there is a delay between two calls to request().
By the way, could you tell us something about your project based on tag recognition? What will you do with it? Thanks.
#42530
yuyouliang2012 --

Thanks for your reply. I was worried that I was calling request() too often, but it looks like that's not a problem.

I am using tags as landmarks for a mobile robot to navigate and build a map of its surroundings. The Huskylens is mounted on a servo on the robot chassis to scan the area looking for tags. When it sees a tag it uses an IR eye pointed the same way as the Huskylens to tell the distance to the tag (the IR eye is also used to navigate around walls and obstacles). That distance and the angle the servo and chassis are pointed give the tag location. From there the robot can correlate the tag location with odometry from the wheels to build a map of where it's been and correct odometry errors in its position estimation.

I'm now working out the best way to combine the Huskylens and the IR eye to find the tag location; my software still has some kinks to work out.

Thanks for your interest.