DeepFish: A realistic fish-habitat dataset to evaluate algorithms for underwater visual analysis

The dataset consists of approximately 40,000 images collected underwater from 20 habitats in the marine environments of tropical Australia.

The dataset originally contained only classification labels, so we added point-level and segmentation labels to create a more comprehensive fish-analysis benchmark.

Videos for DeepFish were collected from 20 habitats in remote coastal marine environments of tropical Australia. The footage was acquired with cameras mounted on metal frames that were deployed over the side of a vessel and lowered to the seabed, where they recorded the natural fish community while the vessel maintained a distance of 100 m. The depth and map coordinates of the cameras were recorded with an acoustic depth sounder and a GPS, respectively. Video recording was carried out during daylight hours and in periods of relatively low turbidity. The clips were captured in full HD resolution (1920 × 1080 pixels) with a digital camera. In total, 39,766 video frames were extracted.

The DeepFish dataset and code are publicly available at https://alzayats.github.io/DeepFish/ and https://github.com/alzayats/DeepFish, respectively.

The full methodology is available in the Open Access publication from the Related publications link below.
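
For readers who want to work with the data directly, the sketch below shows one way an image and its segmentation mask might be loaded in Python. This is a minimal illustration rather than the official loader: the directory names, file extensions, and subset path below are assumptions, so consult the GitHub repository above for the actual layout.

    # Minimal loading sketch (assumed layout, NOT the official DeepFish loader).
    import os

    import numpy as np
    from PIL import Image

    DATA_ROOT = "DeepFish/Segmentation"  # hypothetical path to the segmentation subset

    def load_sample(name):
        """Return one full-HD (1920 x 1080) frame and its binary fish mask."""
        image_path = os.path.join(DATA_ROOT, "images", name + ".jpg")
        mask_path = os.path.join(DATA_ROOT, "masks", name + ".png")
        image = np.array(Image.open(image_path).convert("RGB"))   # (1080, 1920, 3) uint8
        mask = np.array(Image.open(mask_path).convert("L")) > 0   # (1080, 1920) bool
        return image, mask

A point-level annotation can then be derived from such a mask, for example by taking the centroid of each connected fish region.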

    Data Record Details

  • Data type dataset
  • Keywords
    • fish datasets
    • DeepFish
    • deep learning
    • computer vision
    • fish segmentation
    • ARC Centre of Excellence for Coral Reef Studies
  • Research themes
    • Tropical Ecosystems, Conservation and Climate Change
    • Industries and Economies in the Tropics
    • Tropical Health, Medicine and Biosecurity
    Spatial (location) coverage
  • Locations
    • Palm Islands, Queensland, Australia
    • Western Australia
  • Related publications
    • Name Saleh, Alzayat, Laradji, Issam H., Konovalov, Dmitry A., Bradley, Michael, Vazquez, David, and Sheaves, Marcus (2020) A realistic fish-habitat dataset to evaluate algorithms for underwater visual analysis. Scientific Reports, 10, 14671.
    • URL https://doi.org/10.1038/s41598-020-71639-x
    • Notes Open Access
    Citation Saleh, Alzayat; Bradley, Michael; Sheaves, Marcus (2020): DeepFish: A realistic fish-habitat dataset to evaluate algorithms for underwater visual analysis. James Cook University. https://doi.org/10.25903/5f617fb6d6e0e