Our Solution

Here we briefly describe the technologies used to register and match fish images in our solution.  The process is divided into two parts:  initial processing of the images, followed by the matching process.  The initial processing stage is usually executed on our patented registration apparatus.  The matching steps are always executed on our server once the initial processing data is uploaded.

The final part of this explanation will elaborate on how data is catalogued for later search and audit purposes. 

Step #1

Initial Processing

All fish images are first preprocessed to detect spots.  A tailored blob-detection algorithm identifies spot-like shapes in the image.  Each candidate spot is then classified as relevant or not by a Deepnet applied to the spot image.  The Deepnet is trained on manually labelled data.

As an example, the figure illustrates all the detected spots in black and the classified fish spots in red.  As can be seen, the system is rather accurate in identifying the relevant spots.
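The two-stage pipeline above (detect candidate blobs, then classify them) can be sketched as follows.  The production blob detector and the trained Deepnet are proprietary, so this sketch substitutes a simple intensity threshold with flood fill for detection and a size-range check as a stand-in classifier; all names and thresholds are illustrative only.

```python
# Minimal sketch of the two-stage spot pipeline: detect candidate
# blobs, then classify each one.  Threshold and size limits are
# illustrative, not the production values.

def detect_blobs(image, threshold=128):
    """Find connected components of bright pixels (candidate spots).

    image: list of rows of grey values (0-255).
    Returns a list of blobs, each a list of (row, col) pixels.
    """
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                # Flood fill to collect this connected component.
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append(blob)
    return blobs

def classify_spot(blob, min_size=2, max_size=50):
    """Stand-in for the Deepnet: accept blobs in a plausible size range."""
    return min_size <= len(blob) <= max_size

image = [
    [0,   0,   0, 0,   0, 0],
    [0, 200, 200, 0,   0, 0],
    [0, 200, 200, 0,   0, 0],
    [0,   0,   0, 0, 255, 0],  # single-pixel blob: rejected as too small
    [0,   0,   0, 0,   0, 0],
]
candidates = detect_blobs(image)
spots = [b for b in candidates if classify_spot(b)]
```

In the real system the classifier is a trained network rather than a size rule, but the control flow — every candidate blob passes through a relevance check — is the same.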

Step #1 Continued

Validated spot locations and their sizes are stored for registration of a new fish.

These spots are then aggregated into clusters, each described by a shape ‘metric’.  We term these clusters “feature points”.  The figure below illustrates how they are created:  a point’s immediate neighbourhood is characterised as a series of angles (A, B, C, D), which specifies the shape metric.  These features are also stored with the registration image of the fish.
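The idea of characterising a point’s neighbourhood as a series of angles can be illustrated with a minimal sketch: each spot is described by the angular gaps between its nearest neighbouring spots, viewed from the spot itself.  The choice of four neighbours and this exact formulation are assumptions for illustration, not the production definition of the metric.

```python
import math

def feature_point(centre, spots, k=4):
    """Characterise one spot by the angular gaps between its k nearest
    neighbouring spots, viewed from the spot itself.  The gaps sum to
    2*pi and are unchanged if the whole image is rotated."""
    neighbours = sorted((s for s in spots if s != centre),
                        key=lambda s: math.dist(centre, s))[:k]
    bearings = sorted(math.atan2(y - centre[1], x - centre[0])
                      for x, y in neighbours)
    n = len(bearings)
    return [(bearings[(i + 1) % n] - bearings[i]) % (2 * math.pi)
            for i in range(n)]

# Four neighbours arranged in a square around the centre spot:
spots = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1)]
gaps = feature_point((0, 0), spots)
```

Because only relative angles are kept, the resulting metric is insensitive to the orientation of the fish in the photograph, which is what makes it useful as a search key.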

Step #2

Matching Process

When a candidate image is submitted for matching, the system extracts the spots and then calculates feature clusters as discussed above.  The database of registered fish images is searched for similar feature sets, using the feature shape metrics.  Each potential match image is then checked with the following process:

Features in the two images with similar shape metrics are paired, as shown in the image below.  (For the technically minded: the Euclidean distance between the shape metrics is used to decide a match.)
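A minimal sketch of this pairing step, assuming the shape metrics are fixed-length tuples of angles; the greedy strategy and the acceptance tolerance are illustrative, not the production values:

```python
import math

def match_features(feats_a, feats_b, tolerance=0.1):
    """Greedily pair features whose shape metrics are closest in
    Euclidean distance.  `tolerance` is an assumed acceptance
    threshold, not a value from the production system."""
    pairs, used = [], set()
    for i, fa in enumerate(feats_a):
        best, best_d = None, tolerance
        for j, fb in enumerate(feats_b):
            if j not in used:
                d = math.dist(fa, fb)
                if d < best_d:
                    best, best_d = j, d
        if best is not None:
            used.add(best)
            pairs.append((i, best))
    return pairs

# Two images whose feature metrics nearly coincide, in a different order:
feats_a = [(1.0, 2.0), (3.0, 0.5)]
feats_b = [(3.02, 0.5), (1.0, 2.01)]
pairs = match_features(feats_a, feats_b)
```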

Step #2 Continued

If the two sets of features agree geometrically overall and contain at least 11 matching pairs, the fish is declared a match.  A search is conducted over the candidate feature pairings to find such an agreement.  The threshold of 11 is sometimes raised to strengthen confidence in a match.  The image below shows a strong match.

The registration image with the highest count of matching pairs is taken as the match.
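The decision rule above (at least 11 geometrically consistent pairs, highest count wins) can be sketched as follows; the function and parameter names are illustrative:

```python
MIN_MATCHING_PAIRS = 11  # threshold from the text; may be raised for confidence

def best_registration(pair_counts, threshold=MIN_MATCHING_PAIRS):
    """pair_counts maps a registration image id to its number of
    geometrically consistent matching feature pairs.  Returns the id
    with the highest count at or above the threshold, or None when no
    candidate qualifies."""
    eligible = {rid: n for rid, n in pair_counts.items() if n >= threshold}
    return max(eligible, key=eligible.get) if eligible else None
```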

Step #3

Data Management

As discussed above, on registration an image is stored together with its features:  their shape metrics, sizes and locations.  Registration images are labelled with a GPS location, the identity of the fisherman or vessel, and a date and time.  A measure of the fish’s head width is also stored for later use.
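The stored registration record might be modelled as below.  The field names and types are assumptions made for illustration, not the production schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SpotFeature:
    shape_metric: tuple   # e.g. the angles (A, B, C, D)
    size: float
    location: tuple       # (x, y) position in the image

@dataclass
class Registration:
    features: list        # list of SpotFeature
    gps: tuple            # (latitude, longitude) of registration
    vessel_id: str        # fisherman or vessel identity
    timestamp: datetime   # date and time of registration
    head_width: float     # head-width measure, stored for later use
```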

Registration image data are stored for rapid search and access for 3 months.  After this period, the registration is archived in a compressed form for audit purposes if necessary.  Data is stored on both a main production server and a backup server for matching purposes.

When a user matches with the mobile app, the features extracted from the fish image are sent to the server.  These features are used to search the registration data in reverse chronological order.  (The search scans from the most recent registration images back to the end of the 3-month window.)  Images with similar features are compared against the match features using the deeper method discussed in section III.  When a strong match is found, it is returned to the user along with the zone the fish was caught in, the date, and some of the vessel’s identification.
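The reverse-direction search within the 3-month rapid-access window can be sketched as follows; the record layout and the `is_strong_match` predicate are illustrative stand-ins for the deeper matching method:

```python
from datetime import datetime, timedelta

def reverse_search(registrations, is_strong_match,
                   now=None, window_days=90):
    """Scan registrations from newest to oldest, stopping at the end
    of the 3-month rapid-access window.  `registrations` are dicts
    with a 'timestamp' key purely for illustration."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=window_days)
    for reg in sorted(registrations,
                      key=lambda r: r["timestamp"], reverse=True):
        if reg["timestamp"] < cutoff:
            break  # older registrations are archived, not searched
        if is_strong_match(reg):
            return reg
    return None
```

Searching newest-first means a recently registered fish, the most likely case in practice, is found with the fewest comparisons.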

Each match statistic is stored on the system for accounting and record-keeping purposes. Billing is determined from this data.

Multiple Match Fallback

Should the system find several registration matches to the fish sample, the app will contact a consultant to intervene and assist in validating the possible matches.  This fallback ensures that a user does not receive an invalid match early in the rollout process, while the system is still undergoing testing.

No Match Situation

If the system detects more than a minimum number of validated fish spots but finds no match, the image and match-attempt data are sent to a consultant for appraisal and recommendation.  The user will receive a message stating that they will be contacted shortly.  This situation may arise if the fish is not from a supported vessel or when the algorithm fails to match.  Human assistance will be required to address these situations in the early rollout. Once sufficient confidence in the system is reached, a “no match” message will be returned by default.

No Spots

If a user takes multiple images of a fish and no spots are detected, this may indicate that a new variant of fish is being photographed or that a different, unknown camera is being used. In these situations the user receives a “no match” response, and the data is submitted to a consultant to assess and potentially use as training data. The consultant may message the user to inform them of the issue.

All Registration and Matching Data is Stored

The system stores all data submitted to the server, including images on registration and images when matching fails, as well as some random image data used to check system performance on matching.  No data can be deleted from the system at any point.  The history of training data for the spot Deepnet is continuously grown to ensure high performance of the system.