
Motion estimation lib (ESP32cam)

More information in Doxygen documentation


Contents :

  • Introduction
  • Basic usage
  • Macros (optional)
  • Example project
  • TODOs
  • Refs

Introduction

The purpose of this library is to implement robust motion estimation algorithms for the ESP32cam and other embedded chips. Motion estimation is the process of finding motion vectors that describe the translation from one image to another. This can be solved by different approaches (see the block-matching sketch after this list):

  • block matching algorithms (ES, TSS, ARPS, EPZS)
  • optical flow (Lucas-Kanade, Horn-Schunck)
  • pixel recursive algorithm (RANSAC)
  • phase correlation (FFT based)
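
To make the block-matching idea concrete, here is a minimal, self-contained exhaustive-search (ES) sketch. It is not taken from this library and is purely illustrative: for each block of the current image it scans a search window in the previous image and keeps the displacement with the lowest sum of absolute differences (SAD).

#include <stdint.h>
#include <stdlib.h>

// Illustrative exhaustive-search block matching (not this library's API).
// Finds the displacement (dx, dy) within +/- range that best matches the
// bsize x bsize block at (bx, by) of `cur` against `prev`, by minimizing SAD.
static void match_block(const uint8_t *prev, const uint8_t *cur,
                        int w, int h, int bx, int by, int bsize, int range,
                        int *best_dx, int *best_dy)
{
    uint32_t best_sad = UINT32_MAX;
    *best_dx = 0;
    *best_dy = 0;
    for (int dy = -range; dy <= range; ++dy) {
        for (int dx = -range; dx <= range; ++dx) {
            // Skip candidates that fall outside the previous frame
            if (bx + dx < 0 || by + dy < 0 ||
                bx + dx + bsize > w || by + dy + bsize > h)
                continue;
            uint32_t sad = 0;
            for (int y = 0; y < bsize; ++y)
                for (int x = 0; x < bsize; ++x)
                    sad += abs((int)cur[(by + y) * w + (bx + x)] -
                               (int)prev[(by + dy + y) * w + (bx + dx + x)]);
            if (sad < best_sad) {   // keep the best candidate so far
                best_sad = sad;
                *best_dx = dx;
                *best_dy = dy;
            }
        }
    }
}

Smarter algorithms such as TSS, ARPS or EPZS reduce the number of candidates that have to be evaluated.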

At the moment, I have implemented Lucas-Kanade, ARPS and EPZS (FFmpeg + AVC/MPEG4 paper).

Basic usage

Declaration and initialisation :

First things first, create a motion estimation context :

MotionEstContext me_ctx = {.method = LK_OPTICAL_FLOW,    // algorithm used
                           .width  = 96,  .height = 96 // size of your image
                           };

MotionEstContext me_ctx2 = {.method = BLOCK_MATCHING_ARPS, // algorithm used
                            .mbSize = 8,                   // block size for block matching algo
                            .search_param = 7,             // search parameter value for block matching algo
                            .width = 640, .height = 480    // image size
                           };

Table of correspondence :

macro                   val   function called
LK_OPTICAL_FLOW_8BIT     0    Lucas-Kanade (out 8-bit uchar)
LK_OPTICAL_FLOW          1    Lucas-Kanade (out 16-bit vector)
BLOCK_MATCHING_ARPS      2    ARPS (out 16-bit vector)
BLOCK_MATCHING_EPZS      3    EPZS (out 16-bit vector)

Next allocate motion vectors:

init_context(&me_ctx);

Estimate motion :

Now you can call the motion_estimation method, passing the previous and current image buffers.

if(!motion_estimation(&me_ctx, (uint8_t *)img_prev, (uint8_t *)img_cur))
    Serial.println("motion estimation failed!");

The motion vectors are now stored in me_ctx.mv_table[0], with the maximum stored in me_ctx.max.
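
For example, a simple motion trigger can compare me_ctx.max against a threshold. This is only a sketch: the threshold value below is a made-up, scene-dependent number, not something defined by the library.

// Hypothetical threshold, tune it for your scene (not defined by the library)
#define MOTION_THRESHOLD 10

if (motion_estimation(&me_ctx, (uint8_t *)img_prev, (uint8_t *)img_cur)
        && me_ctx.max > MOTION_THRESHOLD) {
    Serial.println("motion detected!");
}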

Note : in the case of the EPZS algorithm, mv_table acts as a FIFO, which means each estimation pushes a new entry into the FIFO :

graph LR;
motion_estimation --> mv_table0 --> mv_table1 --> mv_table2 --> NULL;


The EPZS algorithm needs the previous motion vectors as predictors for the next estimation.
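
As a sketch of what that means in practice, the same context is simply reused across frames, so each new estimation pushes the older vectors down the FIFO where EPZS reads its predictors. Here capture_frame() is a hypothetical placeholder for your grayscale frame grabber; only the MotionEstContext calls come from this library.

MotionEstContext me_ctx = {.method = BLOCK_MATCHING_EPZS,
                           .mbSize = 8, .search_param = 7,
                           .width = 640, .height = 480};
init_context(&me_ctx);

uint8_t *img_prev = capture_frame();          // placeholder frame grabber
for (int frame = 0; frame < 100; ++frame) {
    uint8_t *img_cur = capture_frame();       // placeholder frame grabber
    // Each call fills me_ctx.mv_table[0]; the previous results shift to
    // mv_table[1] and mv_table[2], where EPZS looks up its predictors.
    if (!motion_estimation(&me_ctx, img_prev, img_cur))
        Serial.println("motion estimation failed!");
    img_prev = img_cur;
}
// when you are done, free the context (see below)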

Free memory :

uninit(&me_ctx);

Macros (optional)

In epzs.c, changing #define FFMPEG 0 to 1 will use the FFmpeg version of EPZS instead of the paper version.

In lucas_kanade_optical_flow.c, changing #define NOSMOOTH 1 to 0 will enable isotropic smoothing, at the cost of increased latency.
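
For reference, the two switches look like this in the sources, shown here with the non-default values described above:

// In epzs.c: use the FFmpeg-style EPZS instead of the paper version
#define FFMPEG 1

// In lucas_kanade_optical_flow.c: enable isotropic smoothing (adds latency)
#define NOSMOOTH 0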

Example project

TODOs

  • Implement a skip for the first frame in EPZS, which uses predictions from the previous frame. This might be the reason why EPZS can give poor results at the start.

Refs
