WildTrax User Guide

Published

September 8, 2025


🚧 This Guide is in Beta! 🚧

Please pardon our dust as we continue adding content and improvements.

1 The Basics 🔰

WildTrax is an online platform for managing, storing, processing and sharing environmental sensor data. With WildTrax, you can:

  • Manage all components of environmental sensor and biological data from field metadata, to media, to species observations
  • Store data safely and securely while still making it readily accessible to users
  • Process environmental sensor data to a high degree of data quality and integrity
  • Share environmental sensor and biological data with other WildTrax users, collaborators and the public
  • Discover data in your study area or across the WildTrax system

WildTrax acknowledges that it was conceived and is developed on the territory of the Néhiyaw (Cree), Niitsitapi (Blackfoot), Métis, Nakoda (Stoney), Dene, Haudenosaunee (Iroquois) and Anishinaabe (Ojibway/Saulteaux), lands that are now known as part of Treaties 6, 7 and 8 and homeland of the Métis. We respect the sovereignty, lands, histories, languages, knowledge systems and cultures of all First Nations, Métis and Inuit nations.

WildTrax is continuously improved based on user needs and stakeholder engagement processes. Sign up for the newsletter in User settings or check out the News page to get the most up-to-date feature releases. Each sensor in WildTrax is supported by organizations that have helped pave the way for a multi-sensor experience in WildTrax. Visit our full list of Partners and Sponsors.

1.1 Why use WildTrax?

The WildTrax platform was developed by the Alberta Biodiversity Monitoring Institute (ABMI) and the University of Alberta. The ABMI is an arm’s length, not-for-profit scientific organization that has been providing scientifically credible tools and information products on Alberta’s biodiversity and human footprint to provincial government, industry, environmental decision-makers, and Albertans since 2003. The ABMI has since become a global leader in the application and development of biodiversity monitoring.

Environmental sensors (such as autonomous recording units [ARUs] or remote cameras) are an increasingly common monitoring method used to measure biological, environmental and ecological attributes across broad geographic scales. These sensors allow for automated collection of data over an extended period and can generate large amounts of valuable biological data.

Biological data, such as counts of animals, their behaviour, or other attributes, can be derived from environmental sensors. WildTrax seamlessly integrates such data across multiple sensors, with the additional capacity to incorporate data from point counts, a commonly used method for evaluating species’ relative abundance, especially birds.

Open data is data that can be accessed, re-used or redistributed by anyone and is freely available in a usable and convenient format. Open data benefits the scientific community and society. Data accessibility allows users (e.g., researchers, conservation practitioners and the public) to find, manipulate and analyze data, as well as link data to other types of information. Open data can lead directly to conservation knowledge and action. This requires data to be usable, compatible with other datasets, and reliable.

1.2 Create an account

You can explore WildTrax data for free anytime. While the Data Discover tool lets you browse and map public data without an account, creating an account unlocks roles and privileges to fully access and use the system. Visit wildtrax.ca and click on the button in the top ribbon.

You can create a WildTrax account using either a social media account, such as Google, or a username and password linked to your email address. After creating your account, you’ll need to verify it through an email sent to your inbox. This step is required before you can start using your account. For a detailed walkthrough, check out the video tutorial below.

1.3 Data hierarchy and permissions

Your individual user account ensures data security and assigns you specific roles within the system. Access to data is determined by several factors, including your role, location settings, and the publication status of projects. WildTrax uses a structured data hierarchy, with permissions set at the following levels to maintain controlled access. As roles are assigned in the system, data is kept secure and accessible only to the right people: each user has their own account, and your role determines the tools, features, and datasets you can see and use, while organization and project data availability also depends on location settings and whether the data has been published, ensuring it is shared responsibly and securely.

  • Organizations bring together groups of researchers and users collaborating on diverse research initiatives. Organizations oversee equipment, manage locations and media, and have the ability to create projects using the media they own.

Organizations


  • Locations are geographic sites where environmental sensors are deployed or biological data is collected. They link media and equipment metadata in WildTrax and are managed by organizations.

Locations


  • Projects are groups of specific media assembled to address research questions or implement study designs. Projects can use one of three sensors: ARUs for acoustic data, remote cameras, or point counts.

Projects


  • Tasks are specific assignments given to users to process media, such as audio recordings or image sets, into biological data. Within each task, species tags are applied to extract meaningful information, such as species presence or abundance. In the point count sensor, surveys with their species observations are the equivalent of tasks, contributing further to the available biological data.

ARU Processing

Camera Processing


  • After processing, projects can be published to share with other WildTrax users or the public. Projects will then become available in the dashboard and within Data Discover.

Data Discover Landing Page


Check out this video tutorial on WildTrax’s Hierarchy.

Check out this video tutorial on WildTrax’s Privacy Settings.
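To tie these levels together, here is a minimal illustrative sketch in R of how the hierarchy nests, using a plain nested list; all names (the organization, locations, project, recording and tags) are hypothetical.

```r
# Illustrative only: a nested list mirroring the WildTrax hierarchy described above.
# Every name below (my_org, ARU-OKA-01, etc.) is made up for the example.
wildtrax_hierarchy <- list(
  organization = "my_org",                        # owns locations, media and projects
  locations    = c("ARU-OKA-01", "CAM-BUF-02"),   # geographic sites managed by the organization
  projects     = list(
    list(
      name   = "Spring Songbird Survey 2024",     # a project built from the organization's media
      sensor = "ARU",                             # one of: ARU, camera, point count
      tasks  = list(                              # a task = one recording + method + observer
        list(recording = "ARU-OKA-01_20240601_050000.wav",
             observer  = "jdoe",
             tags      = c("OVEN", "TEWA"))       # species tags extracted during processing
      )
    )
  )
)
str(wildtrax_hierarchy, max.level = 2)
```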

1.4 Upload and process data

WildTrax provides tools to manage and process environmental sensor data. Following these steps will help you integrate your data efficiently and make full use of the platform. The general data life cycle in WildTrax includes:

  • Create an Organization: Set up an organization to store and oversee your data.
  • Create a Project: Define a project to process data and address your research questions.
  • Add Users: Invite collaborators with specific roles to your organization or project.
  • Upload Recordings or Images: Organize and upload your media files for processing.
  • Create or Upload Tasks/Surveys: Provide tasks or surveys to guide processing using your chosen methodology.
  • Process or Upload Tags/Observations: Generate tags during processing or import existing observations.
  • Verify and Apply Quality Control: Review and validate tags to ensure accuracy.
  • Publish the Project: Finalize and make your project available for access and use.

1.4.1 Acoustic data

WildTrax supports a wide range of audio file types (WAC, WAV, W4V, MP3, FLAC) and recognizes naming conventions from most major autonomous recording unit models. Files should include a location name and a date-time stamp and can be uploaded directly to acoustic projects for immediate task creation, or to organizations, where you can generate tasks for projects based on other criteria, such as weather conditions or geographic regions, or take advantage of automated pre-processing with the BirdNET and HawkEars acoustic classifiers.
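As an illustration of the naming convention, the following rough check in R flags files that are missing a location prefix or a date-time stamp; the file names and the regular expression are examples only, and the exact patterns WildTrax accepts may be broader.

```r
# Hypothetical file names; the LOCATION_YYYYMMDD_HHMMSS convention is described above.
files <- c("OKA-01_20240601_050000.wav",   # valid: location prefix + date-time stamp
           "20240601_050000.wav",          # missing the location prefix
           "OKA-01_recording1.mp3")        # missing the date-time stamp

# Location prefix, 8-digit date, 6-digit time, and a supported extension.
pattern <- "^.+_\\d{8}_\\d{6}\\.(wav|wac|w4v|mp3|flac)$"
data.frame(file = files, looks_valid = grepl(pattern, files, ignore.case = TRUE))
```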

1.4.2 Remote camera data

For camera projects, choose one of three folder-level options for uploading, depending on how your locations and deployments are organized for each image set and to align with when cameras are routinely checked in the field. Select the location-level folder to upload, and WildTrax will automatically pre-process images with MegaDetector V6. Then move quickly on to species tagging by choosing from image- and tag-based filters in both Full and Series Tagging views.

1.4.3 Point count data

For point count projects, create projects and download the CSV templates to begin synchronizing your data, or contact the Boreal Avian Modelling Centre for help with data standardization. Then, publish the project or share your surveys and observations with others.

1.5 Publish, share and download data

When the project is complete and all data processed, the project can be changed to a published status. Project publication allows other WildTrax users who are not project members to access the media, metadata or species detections from the project, either through the project dashboard or Data Discover. The publication status controls how data becomes visible across the system. Project publication also locks species detections from further editing and is considered the final version of the data. You can change the status of a project at any time. Location and project membership settings will also determine what you and others can see, so ensure these are set correctly before publishing a project. Here are in-depth descriptions of each published status and what it means.

Need to understand more about how to publish a project? Check out the video tutorial below.

Active Projects

  • Active: Active projects are currently being processed or designed, or are in the preliminary stages of data uploading. Use this status for any general use or if the project is actively being worked on. This is the default project status when it is first created.
  • Test: Just getting started with an environmental sensor program? Or have some media you want to upload to test WildTrax’s functionalities? Use the Test status in these cases. Project data will not be visible in Data Downloads or Data Discover except to project members.

Published Projects

  • Published – Private: Project data will only be available to project members. Users will need to request access to the project in order to view any details such as species or locations.
  • Published – Map Only: Project data will be accessible through Data Discover but the media and report are not accessible to users who are not project members. If you’re not a project or organization member, the location buffering and visibility settings will apply.
  • Published – Map + Report Only: Project data become available to all WildTrax users through Data Downloads and Data Discover, however, the media is not accessible. Use this setting if you want to make your data publicly available but there are privacy concerns with your media. If you’re not a project or organization member, the location buffering and visibility settings will apply. This status does not exist for point counts since the sensor does not contain media.
  • Published – Public: All of the project data become available to any WildTrax user as well as the public in Data Discover. If you’re not a project or organization member, location buffering and visibility settings will still apply.

See more about sharing data within an open data network like WildTrax in the Open Data section.

1.5.1 Download data

Reports provide both high- and low-level detail about project data and everything you need for downstream analysis and decision-making. Right-click, select multiple rows, or click the pencil icon and select . As previously mentioned, the reports you can see within the dashboard will be limited by your membership in the organizations and projects or by the publication status of the project.


2 Accounts and Membership 👥

WildTrax operates under role-based access control, meaning users can collaborate to manage or share data to answer broader scientific questions by assigning appropriate privileges to the data. Access to certain features, tools, or data may vary based on your permission level. For example, access to project data can be restricted by location settings or the project’s publication status. Here’s an overview of the key permission levels:

  • Organization membership allows you to see or govern access to all projects and data under an Organization. Permissions at the Organization level include:
    • Administrator
    • Read-Only Member
  • Location visibility and buffering controls whether locations are visible or hidden for privacy purposes
  • Project membership dictates what you can do within specific projects, such as uploading media, processing tasks, or verifying species tags. Roles include:
    • Administrator
    • Tagger
    • Read-Only Member
  • Project Status dictates whether members or the public can access a project or its data. Active and Test projects are accessible only to members of the respective Organization or Project. Once a project is published, the visibility and access to its data depend on the permission level assigned to the user.

The interaction of these permission levels defines the scope of data you can view and how you can engage with it on WildTrax. For example, since Organizations are parent entities to projects, Organization administrators automatically inherit Project Administrator privileges. However, users assigned read-only access at the project level will not gain access to the parent Organization.

If components are unavailable or options are greyed out, you likely lack access to the data, organization, or project. To gain access, request it by clicking the next to the organization name, selecting and completing the form. Administrators will review your request and either approve or deny it. Be sure to specify the level of access you need, such as read-only or administrator.

2.1 User settings

The user settings dashboard can be accessed by clicking on your username in the top right corner of the top ribbon when you’re logged into the system.

The user settings dashboard controls the following properties related to your account:

  • Name: your full name
  • Initials: an acronym or set of initials you can use to define an observer or user
  • Subscribe to Newsletter: a toggle that will opt you in for occasional WildTrax newsletters delivered to your email
  • Language: your default language. Currently available in English and French.
  • Affiliation (optional): the organization, institution or group of which you’re a member or user in the system.

Once you’ve made your desired changes, click the button.

The JWT token is a unique authentication token, associated with your email, used for accessing WildTrax via the wildrtrax package. It also enables secure and flexible integration with WildTrax APIs for building custom applications.
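For example, if you use the wildrtrax R package, a typical session authenticates with your WildTrax credentials before requesting data. The sketch below assumes the package's wt_auth() and wt_download_report() functions and environment-variable credentials; verify the current function names and arguments against the package documentation before relying on it.

```r
# A minimal sketch of an authenticated wildrtrax session (verify against the package docs).
# install.packages("remotes"); remotes::install_github("ABbiodiversity/wildrtrax")
library(wildrtrax)

# Credentials are typically supplied as environment variables rather than hard-coded.
Sys.setenv(WT_USERNAME = "you@example.org",   # hypothetical account
           WT_PASSWORD = "your-password")

wt_auth()  # obtains an authentication token for the session

# Download the main report for a project you can access (hypothetical project_id).
dat <- wt_download_report(project_id = 123, sensor_id = "ARU",
                          reports = "main", weather_cols = FALSE)
head(dat)
```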

You can change the language of the site at any time using the toggle in the top ribbon. If you want to permanently change the language, go to User settings and change it to your preferred language.


3 Data Discover 🌍

Data Discover is the central hub for exploring environmental sensor data in WildTrax. Data Discover allows you to see which organizations have published data on WildTrax and which species were detected, and explore media elements such as images and sounds captured in the environment. In Data Discover, you can search for data from ARUs, cameras, and point counts, using a variety of attribute filters, and create summary statistics within a dynamic mapping interface where you can gain a comprehensive understanding of environmental sensor data in an area that interests you.

After using Data Discover, you may have found an organization or project that you’re interested in – so what’s next? Head over to either the Organization or Project dashboard to find out more about the data owners or the project or to request access to the data. Once you are granted access by the project administrators, proceed to Data Downloads to acquire the data.

3.1 Filters

Use attribute filters or select a specific sensor to search available data within Data Discover. On the left side of the interface, the Filter Panel houses various filters for refining your search. Results will be displayed on the map and in a table below. Ensure locations with spatial coordinates are visible on the map, and toggle between different base maps (light or satellite) in the top right corner.

Data Discover Filters


You can search by:

  • Taxonomy: Classify data based on class, order, family, and genus.
  • Species: Search for individual species or add multiple species to your selection.
  • Organizations
  • Projects by sensor, either ARU, camera, or point count
  • Dates and times (also months and hours) within set intervals or with start and end dates

Note that one sensor must be selected before you can proceed with the other filters. You can clear the selected options at the bottom left of the filter panel using Delete Layer.

3.2 Layers

Explore data in depth with up to five customizable layers. Create a new layer by clicking on the icon. The colours of the points correspond to the layer in the filter panel. By hovering over the layer number, you can duplicate an existing layer to preserve its results and refine your exploration further, use the garbage can icon to delete the current layer, or control the visibility of the layer on the map. Each summary insight in the bottom-right corner is also colour-coded to correspond with the layer’s filter.

Data Discover Layers


3.3 Searching an area of interest

Refine your selection to an area of interest using the polygon tool in the top-right corner . One polygon per layer is supported. To draw a polygon, click the tool, then click on the map to define points, completing the shape by clicking back to the original point. To remove a polygon, select it on the map and click the Garbage can icon.

Data Discover Polygon Tool

Data Discover one polygon per layer

3.4 Summaries and insights

Click the Layer Summary icon in the bottom-right corner to view a visual representation of the organizations, projects, species, and tag counts in your layer. This action opens the Summary Window, where you can explore detailed insights. Within the Summary Window, the Summary tab provides an overview, while the Media tab offers media-specific details.

  • Summary Tab: View pie charts detailing the number of organizations, projects, and species for your selected area. Scroll down for bar charts representing tag counts across months and hours.
  • Media Tab: Tiles correspond to species tags. Play audio clips or view images. Observe the minimum and maximum frequency of an audio clip in its ARU spectrogram. Note that point counts do not include any media.


4 Organizations 🏢

Organizations sit at the top of the WildTrax hierarchy and are the central entity to which environmental sensor data, biological data and metadata are associated. When in doubt, if you’re looking for any information in WildTrax, you can likely find it under the organization. Organizations represent groups of users who collect data, design and publish projects, manage equipment and survey locations. Organizations allow you to coordinate efforts with multiple WildTrax users to create a structured, standardized dataset. Examples of organizations in WildTrax include government branches, industry, research labs, communities, non-profits and NGOs and citizen scientists.

You can jump to any Organization or Project at any time using CTRL + /. This will bring up a search prompt allowing you to navigate to any Organization or Project you have access to.

4.1 Create an Organization

Click on in the top ribbon, followed by My Organizations. This will take you to the organization dashboard. Click the button and the Organization Settings form will appear.

Fill in the fields in the form and click Save. A WildTrax administrator will need to confirm your identity before approving your new Organization request. If you’re having any technical difficulties creating an Organization, contact WildTrax Support at support@wildtrax.ca.

4.1.1 Data storage

WildTrax offers multiple choices for data storage designed to support the platform’s growing user base and evolving data management needs. Each Organization must select the storage location where it will store its media. This allows you to quickly access an image or recording at a moment’s notice. The options as of September 8, 2025 include:

  • WildTrax Live Servers: Hosted at the University of Alberta in Edmonton, Alberta, Canada. These servers provide a cost-effective and sustainable solution with live and off-site backups using Amazon Deep Glacier, ensuring secure and reliable data storage for Canadian and international organizations.
  • Amazon Web Services (AWS): Retained as an option for users requiring cloud-based storage with global accessibility and high-performance capabilities with international storage base options. Current locations include Oregon and Montreal.

The selection of a default storage location now balances data sovereignty, cost efficiency, and upload/download performance based on geographic proximity. For detailed guidance, refer to WildTrax’s Terms and Conditions of Use and Data Access Policies, or contact info@wildtrax.ca for additional support.

Note that each option comes with a cost-recovery model when Organizations exceed a certain limit of data. Organizations will be charged for these storage fees on an annual basis starting in 2026.

4.1.2 Default privacy settings

Organizations can manage the privacy of locations and images within the system to control what other users can access when data becomes published and more widely available. Location privacy can also be managed individually; see Location privacy settings for more information.

For location privacy, two approaches are available within the Default Location Buffering field.

  • True locations: Upload exact coordinates into the system. If a buffer is added, non-members or public viewers will only see buffered locations. This allows you to access and manage the true locations within WildTrax while simultaneously allowing users to gain access to the data without knowing the true coordinates.
  • Buffered locations: Supply pre-buffered coordinates to the system, specifying the radius used. This ensures the displayed locations are already obscured as intended (one approach to generating buffered coordinates is sketched below).
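If you choose to supply pre-buffered coordinates yourself, one common approach is to displace each true coordinate by a random distance and bearing within a circle of the stated radius. The base R sketch below is illustrative only; the displacement method and the 1 km radius are assumptions, not WildTrax requirements.

```r
# Randomly offset a true coordinate within a circular buffer (illustrative only).
buffer_location <- function(lat, lon, radius_m = 1000) {
  bearing  <- runif(1, 0, 2 * pi)             # random direction
  distance <- radius_m * sqrt(runif(1))       # uniform over the circle's area
  dlat <- (distance * cos(bearing)) / 111320  # approx. metres per degree of latitude
  dlon <- (distance * sin(bearing)) / (111320 * cos(lat * pi / 180))
  c(latitude = lat + dlat, longitude = lon + dlon)
}

set.seed(42)
buffer_location(53.5461, -113.4938, radius_m = 1000)  # hypothetical true location
```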

The Allow Location Reports feature enables you to create shareable links for external audiences, such as collaborators or landowners, without granting them privileges or exposing unnecessary details about your organization. The Default Location Photo Access field governs global permissions for accessing location photos.

  • Private: Only members of the organization or project can view the location photos even if the project is published
  • Project-level Access: members of the project can view the location photos, but non-members cannot even if the project is published
  • Publicly Viewable: if a project is published, non-members can view the location photos within the tasks

Human Blurring applies organization-wide settings to blur humans detected in images. The available options include:

  • Blur for anonymous users (if applicable): images of humans will be blurred for all non-read only/non-admin WildTrax users if project data is visible based on the project status.
  • Blur for non-admins: images of humans will be blurred for all WildTrax users regardless of their organization or project membership.
  • Blur for everyone: images of humans will be blurred for all read-only WildTrax users.

Choose the option that best aligns with your needs and data privacy policies. For example, if you opt out of Human Blurring, you acknowledge that uploaded images—and those potentially shared publicly—may include humans, accepting the associated risks.

4.1.3 Organization membership

After your organization is approved, the option will appear in the menu. This will allow you to search for and add any WildTrax user to your organization either as an administrator or read-only member.

Organization administrators collaboratively manage the media and metadata of the organization and have the ability to:

  • Enjoy administrator privileges by default on all projects belonging to the Organization
  • Add WildTrax users to the organization or its projects
  • Read and write to organizational locations
  • Read and write to the visit, equipment, deployment and media metadata

Organization read-only members can:

  • Read the unbuffered locations, i.e., read-only members can see the true locations if they are buffered but cannot modify them
  • Read the visit, equipment, and media metadata
  • Enjoy read-only access to all organizational projects

The organization dashboard lists all organizations in WildTrax. The View Only My Organizations toggle in the top-right filters the list to only organizations you’re a part of.

If the organization is greyed out, you are not a member of that organization. Click the drop-down arrow beside the organization name and then click and fill in the Request Access form to request membership. Administrators of the organization will receive a notification and will either approve or deny your membership request.

The principal investigators of the organization are the users who respond to access requests related to the organization or its projects. Without a principal or secondary investigator, all organization and project access requests will default to organization and then project administrators, in that order. Once you create an Organization, you can add yourself as one of the principal investigators, or add more users to your Organization and then assign them those privileges.

Get started using your first Organization by doing any of the following next:

Manage Organization Metadata

Optimize your organization’s metadata by:

Create a Project

  • Define Your Research Questions
    Clearly outline the objectives or questions your project will address. This will help guide data setup and analysis.
  • Decide on an Upload Strategy
    Determine if recordings will be uploaded at the project level or organization level to align with your workflow.
  • Assign Membership Privileges
    Define roles for project participants (e.g., read-only, tagger, admin) based on access requirements.

Need more help? See Accounts, Membership, and Permissions for detailed guidance.

4.2 Locations

Locations refer to the physical, geographic places at which environmental sensors were deployed and/or biological data was collected on the landscape. They are one of the most important components in WildTrax as media and metadata are linked by the location.

All locations are stored exclusively as latitude and longitude coordinates in the WGS84 format in decimal degrees.
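If your field coordinates were recorded in another coordinate reference system (for example, UTM), convert them to WGS84 decimal degrees before entering them into WildTrax. A minimal sketch using the sf package follows; the UTM zone and example coordinates are hypothetical.

```r
# Convert UTM coordinates (here NAD83 / UTM zone 12N, EPSG:26912) to WGS84 decimal degrees.
library(sf)

utm <- data.frame(location = "ARU-OKA-01",            # hypothetical location name
                  easting  = 334573, northing = 5938589)

pts <- st_as_sf(utm, coords = c("easting", "northing"), crs = 26912)
st_coordinates(st_transform(pts, crs = 4326))          # longitude/latitude in decimal degrees
```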

4.2.1 Create locations

Click the Locations tab in your organization dashboard. Clicking will open the location form, where you can add the spatial metadata to the location or configure the location’s settings. Only location name, latitude and longitude are required fields. When you’ve filled in the form click Save. The map tab will appear allowing you to visualize the point on the landscape. The location will also be visible on maps across WildTrax.

Did you know that locations are automatically created in the organization when you upload media either via the organization or a project? If a location name already exists in the organization, WildTrax will append the media to it; if it is a new location, WildTrax will prompt you during the upload process to optionally add spatial coordinates. This allows background processes, such as retrieving weather conditions, to run while your tasks are being created.

4.2.2 Location privacy settings

You can manage a location’s privacy settings individually by clicking the icon next to the location name. These settings control what information users can access about a location and its associated data, ensuring secure and customizable access across the system. WildTrax offers organization administrators multiple tools to define how members view locations and their data, providing robust privacy options to protect sensitive information as needed.

The location visibility setting is used to hide locations and data from WildTrax users who are not part of the organization or project the location belongs to.

  • Use Hidden – Location if you want to hide only the location—only organization and project administrators will see the location on maps across the system. Species data can still be downloaded by non-members, but without coordinates, so non-members will not know where the data comes from.
  • Use Hidden – Location + Data if you want to hide both the location and the species information. This setting will effectively hide everything from users who are not organization or project members.
  • Use Visible if you want the location and data to be visible to everyone once the project is published

If you are using Hidden – Location + Data, refer to location reporting to learn how to share data with users who are not members of the organization or project. This may include landowners, collaborators, or other users to whom you do not wish to grant privileges but with whom you still want to share your data.

Location buffering is another way to mask sensitive locations. You can use the location buffering toggle in two ways:

  • True locations: Upload exact coordinates into the system. If a buffer is added, non-members or public viewers will only see buffered locations. This allows you to access and manage the true locations within WildTrax while simultaneously allowing users to gain access to the data without knowing the true coordinates.
  • Buffered locations: Supply pre-buffered coordinates to the system, specifying the radius used. This ensures the displayed locations are already obscured as intended.

The link to the location reports is also available at the bottom of the form.

4.2.3 Sync locations

WildTrax provides the flexibility to sync location data in batch by uploading and downloading location information and metadata. This feature allows you to manage and edit location data outside of WildTrax and sync it back with your modifications. To upload data, you need organization administrator privileges, but read-only members can still download data. To sync locations, first go to the menu and select :

  • Download Location Data: Click the to download the current list of all locations and metadata in your organization. If no metadata exists, a template CSV will be provided with column headers.
  • Edit the CSV File: Open the downloaded CSV file and make any necessary changes or edits. You can modify existing entries or add new ones. Do not modify the fields beginning with internal_ as they are for WildTrax use only.
  • Upload the Edited CSV: Click the button to upload your edited CSV. This will take you to the Upload CSV form. Select your local CSV file and click Preview Changes to review the updates.

Batch upload processes in WildTrax support add and update operations only; deletions are not allowed. For example, if you accidentally upload an empty CSV, no existing data will be deleted.
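If you prefer to script the edit step, the sketch below shows one way to update the downloaded CSV in R before re-uploading. The file name and the location, latitude and longitude column names are assumptions; check them against your downloaded template, and leave the internal_ columns untouched.

```r
# Edit a downloaded location CSV outside WildTrax, then save it for re-upload.
locs <- read.csv("my_organization_locations.csv", check.names = FALSE)  # hypothetical file name

# Example edit (hypothetical column names): correct the coordinates of one location.
locs$latitude[locs$location == "OKA-01"]  <- 53.5461
locs$longitude[locs$location == "OKA-01"] <- -113.4938

# Columns beginning with internal_ are for WildTrax use only, so leave them exactly as downloaded.
write.csv(locs, "my_organization_locations_edited.csv", row.names = FALSE, na = "")
```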

4.2.4 Merge locations

WildTrax allows you to merge multiple locations into one. This feature is particularly useful if a location has been visited multiple times but was assigned different names for each visit or if the locations represent the same physical place on the landscape. To merge locations:

  • Select the source location (the location you want to modify) and click the , or right-click the location row and select .
  • The location merge form will appear. From the dropdown list, select the target location (the location to merge into).
  • Confirm your selection: the media from both locations will be combined, but only the metadata from the target location will be preserved.
  • Click to complete the process. After merging, all data will be consolidated under the target location.

Merging locations is an irreversible process. Double-check your selections before proceeding.

Merging locations is also only possible when one or both locations lack spatial coordinates (latitude and longitude). If only one location has coordinates, those will be retained during the merge. However, merging is not allowed if both locations contain different spatial coordinates.

4.2.5 Delete locations

You can delete locations using the option, accessible via the next to any location name. However, if any data—such as visits, media, or tags—are associated with a location, deletion will not be allowed until those dependencies are removed (the delete button will appear greyed out). Detailed instructions for managing these dependencies can be found in the sections on projects and sensor types (ARU, camera, point count). Since locations serve as the foundation for all associated data, WildTrax has implemented safeguards to prevent accidental data loss, ensuring secure and reliable management of cascading data.

Note that deleting locations is an irreversible process.

4.2.6 Mapping locations

For each individual location, or a group of selected locations using the feature, you can visualize the locations on a map within Data Discover, gaining further insight into the species and media at the location. Click the or right-click the location and select . A new tab will open, visualizing the location in Data Discover.

4.2.7 Location reports

If you’re an organization administrator, you can enable location reports for individual locations within your organization. This feature allows users to share specific location reports with collaborators without granting full project or organizational access—ideal for sharing information with landowners, leaseholders, partners, or collaborators. To use this feature, navigate to the bottom of the location settings, where you’ll find the Report Link field. Copy the link into your browser, or share it with the intended recipient.

4.3 Location photos

Location photos are photos taken as a means to record the landscape around where a sensor was deployed. WildTrax has the ability to upload, store and manage location photos and attribute basic metadata for them. You can also filter and sort through your photos within the location photos tab.

4.3.1 Uploading location photos

Organize your location photos by ensuring each folder on your computer is named to correspond with the specific location where the photos were taken. During the upload process, the system will automatically search within each folder and assign the images to the matching location based on the folder name. You do not need to rename any of the images, but you should keep track of the direction and angle at which each photo was taken so that you can add the corresponding metadata.

The ideal structure for organizing your location photos before uploading is to create folders named after each location, with the corresponding photos placed within those folders.

Location Photo Structure
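As a quick reference, the layout sketched below (folder and file names are hypothetical) is the kind of structure WildTrax can match automatically, with one folder per location.

```r
# Illustrative directory layout: one folder per location, with that location's photos inside.
#   location_photos/
#   ├── OKA-01/
#   │   ├── IMG_0001.JPG
#   │   └── IMG_0002.JPG
#   └── OKA-02/
#       └── IMG_0001.JPG
list.dirs("location_photos", recursive = FALSE)  # each folder name must match a location name
```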

Go to the Location photos tab and click on . Each individual location with the files will be in view. When you are ready, click

If the location doesn’t exist in the organization yet, you’ll have to add it first otherwise you’ll get a warning during the upload process. Go to the Location tab and click .

4.3.2 Adding location photo metadata

Each location photo has the following properties that can be assigned to it. You can do this for each individual image with the pencil icon below the image, or in batch, by selecting multiple images and then clicking then or as needed.

  • Direction: cardinal direction of the location broken down into sub-cardinal units
  • Vertical Angle: angle at which the image was taken relative to cardinal north
  • Access: the accessibility of the image; this is also determined by organization administrators (see Default privacy settings)
  • Comments: any useful information about the image

4.4 Visits

A visit occurs when an observer goes to a location to collect environmental sensor or biological data. If equipment is placed at the location during a visit, this is referred to in WildTrax as a deployment. Visits allow standard field data collection fields to be added, as they link field activities to equipment, media, and biological data. Visits also document general landscape details where sensors are placed or data is collected, such as landscape features, distance to water, or clutter percent. Visits typically include one of two main activities: 1) using an environmental sensor, or 2) conducting a point count; however, a visit can refer to any case where a human has gone to a location, regardless of sensor data.

4.4.1 Create visits

Navigate to the Visits tab in your organization. Click and add the required location, visit date and bait fields. Click when you’ve added the metadata you’re interested in and view your new visit in the summary table.

4.4.2 Sync visits

Visit data can be synced in batch by uploading and downloading visit information and metadata. This feature allows you to manage and edit visit data outside of WildTrax and sync it back with your modifications. To upload data, you need organization administrator privileges, but read-only members can still download data. To sync visits, first go to the menu and select :

  • Download Visit Data: Click the to download the current list of all visits and metadata in your organization. If no metadata exists, a template CSV will be provided with column headers.
  • Edit the CSV File: Open the downloaded CSV file and make any necessary changes or edits. You can modify existing entries or add new ones. Do not modify the fields beginning with internal_ as they are for WildTrax use only.
  • Upload the Edited CSV: Click the button to upload your edited CSV. This will take you to the Upload CSV form. Select your local CSV file and click Preview Changes to review the updates.

Batch upload processes for visits support add and update operations only; deletions are not allowed. For example, if you accidentally upload an empty CSV, no existing data will be deleted.

4.4.3 Delete visits

You can delete visits using the option, accessible via the beside any visit row.

Note that deleting visits is an irreversible process.

4.5 Equipment

The Equipment tab helps manage an organization’s equipment inventory. Here, you can maintain records for ARUs, remote cameras, and additional gear such as SD cards and microphones. These records can later be used to populate deployments. The Equipment metadata also includes a Status field, allowing you to select from various status types to track and alert users about equipment that is malfunctioning, on loan, or retired.

Equipment tab


4.5.1 Add equipment

Click the Equipment tab in your organization dashboard. Clicking will open the equipment form, where you can add, for example, the equipment’s serial number, make and model. When you’ve filled in the form click . The equipment will then appear in the summary and be available for addition to deployments and other summaries across the system.

4.5.2 Sync equipment

Equipment data can be synced in batch by uploading and downloading equipment information and metadata. This feature allows you to manage and edit equipment data outside of WildTrax and sync it back with your modifications. To upload data, you need organization administrator privileges, but read-only members can still download data. To sync equipment, first go to the menu and select :

  • Download Equipment Data: Click the to download the current list of all equipment and metadata in your organization. If no metadata exists, a template CSV will be provided with column headers.
  • Edit the CSV File: Open the downloaded CSV file and make any necessary changes or edits. You can modify existing entries or add new ones. Do not modify the fields beginning with internal_ as they are for WildTrax use only.
  • Upload the Edited CSV: Click the button to upload your edited CSV. This will take you to the Upload CSV form. Select your local CSV file and click Preview Changes to review the updates.

Batch upload processes for equipment support add and update operations only; deletions are not allowed. For example, if you accidentally upload an empty CSV, no existing data will be deleted.

4.5.3 Delete equipment

You can delete equipment using the option, accessible via the beside any equipment row or by right-clicking the row.

4.6 Deployments

Deployments associate equipment with a location and visit during deployment and/or retrieval from the field. Once locations and visits are created and equipment inventories added to the Equipment table, users can take advantage of the Deployments tab to track the history of what has happened at a location or with a piece of equipment over time.

4.6.1 Add deployments

Click the Deployments tab in your organization dashboard. Clicking will open the deployment form. Here you can select from the list of locations, visits and equipment already in your inventory to begin generating the deployment. When you’ve completed the first half of the form, you can add additional equipment, such as microphones or SD cards, to the parent sensor equipment; click the button and select from the inventory of available equipment. Once you’ve filled in the form click . The deployment will then appear in the summary and be available in other summaries across the system.

The deployment form also supports adding locations, visits, and equipment directly. For example, if you haven’t entered a location or equipment yet, you can click Add Location or Add Equipment to open the respective form, enter the details, and then seamlessly continue creating your deployment.

4.6.2 Sync deployments

Deployment data can be synced in batch by uploading and downloading deployment information and metadata. This feature allows you to manage and edit deployment data outside of WildTrax and sync it back with your modifications. To upload data, you need organization administrator privileges, but read-only members can still download data. To sync deployments, first go to the menu and select :

  • Download Deployment Data: Click the to download the current list of all deployments and metadata in your organization. If no metadata exists, a template CSV will be provided with column headers.
  • Edit the CSV File: Open the downloaded CSV file and make any necessary changes or edits. You can modify existing entries or add new ones. Do not modify the fields beginning with internal_ as they are for WildTrax use only.
  • Upload the Edited CSV: Click the button to upload your edited CSV. This will take you to the Upload CSV form. Select your local CSV file and click Preview Changes to review the updates.

Batch upload processes for deployments support add and update operations only; deletions are not allowed. For example, if you accidentally upload an empty CSV, no existing data will be deleted.

4.6.3 Delete deployments

You can delete deployments using the option, accessible via the beside any deployment row or by right-clicking the row.

4.7 Recordings

The Recordings tab in the organization serves not only as a repository for all recordings but also provides tools to upload recordings and generate tasks based on over 40 different criteria—including weather conditions, machine-learning acoustic classifications, and more—allowing you to create the customized projects you need.

Recordings tab


4.7.1 Upload recordings

  • For a stable and secure upload process, it’s strongly recommended to use an ethernet connection before proceeding. Large uploads can take a while, so start with smaller batches to gauge your upload time, and then proceed with larger batches. Supported formats are listed below; each recording uploaded to WildTrax is compressed to the FLAC file type for storage:
    • WAC and W4V are proprietary, lossless compressed file formats developed by Wildlife Acoustics
    • WAV is the standard, ubiquitous uncompressed audio file format
    • MP3 is a lossy compressed audio file format; it works by reducing the accuracy of certain sound components and eliminating others
    • FLAC is a lossless compressed audio file format
  • Each recording must include a location prefix to be accepted. These are included in Wildlife Acoustics ARUs but check your make and model type. The location name can be changed later if the prefix is incorrect and you do not have a way of changing the name on your local media.
  • Each recording must include a date and time in an accepted format (e.g., YYYYMMDD_HHMMSS). Once uploaded, the recording’s date and time cannot be modified, so ensure they are accurate before uploading. If errors are found after upload:
    • You must delete the recording.
    • Correct the name or metadata in your local copy.
    • Re-upload the corrected recording.
  • The maximum supported length for an individual recording is 30 minutes (1800 seconds) or 320 MB. The R package wildrtrax provides functions to segment larger files into smaller parts (see the sketch below).

Please review all details carefully to avoid upload issues.
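As one way to handle files that exceed the limit, the sketch below splits a long WAV file into 10-minute segments locally before upload using the tuneR package. The file names and segment length are hypothetical, and the wildrtrax package provides purpose-built functions for this, so prefer its documented workflow where possible.

```r
# Split a long WAV recording into 10-minute segments before upload (illustrative only).
library(tuneR)

infile  <- "OKA-01_20240601_050000.wav"   # hypothetical long recording
seg_len <- 600                            # segment length in seconds (10 minutes)

hdr   <- readWave(infile, header = TRUE)
total <- hdr$samples / hdr$sample.rate    # total duration in seconds

starts <- seq(0, total, by = seg_len)
for (s in starts[starts < total]) {
  seg <- readWave(infile, from = s, to = min(s + seg_len, total), units = "seconds")
  # In practice, name each segment with its own location prefix and start date-time.
  writeWave(seg, sprintf("OKA-01_segment_%04d.wav", s))
}
```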

To upload recordings for storage and processing in WildTrax, navigate to the Recordings tab and then click the upload icon. This opens the recording upload window, where you can configure settings before uploading your files.

WildTrax provides several options to optimize your upload process:

  • Including Subdirectories: Useful if your media is organized hierarchically.
  • Removing Leading Zeros: Uncheck this option to retain leading zeros in location names.
  • Trigger Marking: Distinguishes triggered recordings from schedule-based ones. See Setting up an ultrasonic project.
  • Pre-Scanning: Scans for and displays sample rate and recording length during the upload process.
  • Uploading to a project both uploads the recordings and generates a task for each recording. The recording is still stored in the organization at its full length even if a task was generated at a shorter length.

To begin, click Choose a Folder to Upload, select the directory containing your recordings, and let WildTrax scan the files. Once scanning is complete, optionally enter spatial coordinates for new locations or update missing coordinates. This can also be done later using location sync. If a location doesn’t already exist in the organization, WildTrax will create it through the upload process.

Once you’ve reviewed the queue, click the Begin Upload button to start the upload. You can follow along with the uploads in the interface. After the uploads are complete, you can download and review the Log to ensure that all files were successfully uploaded.

Final confirmation of a successful upload will appear in the interface.

4.7.2 Acoustic classifiers

An acoustic classifier is a computer algorithm designed to analyze audio recordings and identify the species producing the sounds. These classifiers interpret spectrograms, visual representations of sound frequencies over time, to determine the presence of specific bird species or other acoustic taxa. BirdNET and HawkEars are two examples of such classifiers used by WildTrax, trained to process audio recordings and classify the species captured within them. For each recording uploaded to WildTrax, BirdNET and HawkEars are automatically run on the recording, and the results are made available in the Recordings tab, allowing users to view or further process the recordings in a project. Similarly, when recordings are uploaded directly through a project, the classifier overlays are accessible in the processing interface. The algorithms behind these classifiers differ, which can result in varying levels of accuracy depending on the species and recording conditions.

  • BirdNET is a worldwide bioacoustic classifier, which has been trained to classify thousands of species
  • HawkEars is a regional classifier that has been trained to classify hundreds of species, mostly Canadian and northern US species

BirdNET and HawkEars analyze 3-second windows (i.e., periods of time) of the spectrograms, providing confidence scores from 0 to 1 for each species in their model. In other words, a detection of an American Robin with a score of 0.9 is more likely to be correct than one with a score of 0.2.

See Wood and Kahl for more information on score thresholds.
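As a simple illustration of applying a score threshold, the sketch below filters a table of classifier hits to keep only detections at or above a chosen confidence. The column names and values are hypothetical; the columns in WildTrax classifier reports may differ.

```r
# Hypothetical classifier output: one row per 3-second window per species.
hits <- data.frame(
  recording  = "OKA-01_20240601_050000.wav",
  start_s    = c(0, 3, 6, 9),
  species    = c("AMRO", "AMRO", "OVEN", "TEWA"),
  confidence = c(0.91, 0.18, 0.73, 0.42)
)

threshold <- 0.7                        # choose a threshold suited to your species and error tolerance
hits[hits$confidence >= threshold, ]    # retain only the higher-confidence detections
```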

4.7.3 Filter recordings

The filter panel in the Recordings tab offers you a way to refine what recordings you’re looking for across your organization. Choose from over 40 different filters that can be applied simultaneously to filter down the recordings you want.

Filter panel


4.7.4 Generate tasks

To generate tasks for a project, navigate to the Recordings tab and open the task generation interface. This allows you to create tasks for selected recordings, assign users, and configure processing criteria such as project, task method, and processing length.

  • Select or filter the recordings you want to include
  • Use the checkboxes to select your desired recordings
  • Click and then to open the task generation pop-up
  • In the pop-up, select your project, task method, assigned users, and processing length.
  • Click to generate the tasks

Generate Tasks interface

If you haven’t created a project yet, the Recordings tab will prompt you to create one directly from the Organization. Once your first project is created, additional projects can be created from the Project Dashboard:

  • Navigate to My Data .
  • Select the appropriate sensor type for your project.
  • Follow the prompts to configure project details such as location, project name, and monitoring objectives.

Create a New Project interface

4.7.5 Delete recordings

You can delete one or many recordings using the option, accessible via the beside any recordings row, by right-clicking the row, or by multi-selection via the button then Remove Recordings.

You can also delete recordings in batch through a project. Note that you still must be an organization administrator in order to delete.

4.8 Image sets

You can access image set metadata from the image sets tab. Note that this information differs from what is shown in the project’s image sets tab.

Image sets tab


Here you will find summaries of the following information:

  • Location: the name of the location
  • Image set start date/time: the date and time of the first image collected in the image set (in the format YYYY-MM-DD HH:MM:SS)
  • Total image count: the total number of images
  • Motion image count: the total number of images where the camera was triggered due to heat or motion (i.e., Trigger mode = “Motion Detection” or “M”) in an image set
  • Task count: the total number of tasks
  • Details drop-down: clicking on the drop-down arrow will show the projects the image set is associated with, the observer who tagged the task, the number of unique species detected, and the series gap used in the project.

5 ARU Projects 🎧

The goal of an ARU project is to upload media, process tasks, verify tags and publish the results. Projects belong to organizations and use organization media in order to generate and report on a certain set of results designed by the project administrators. Clicking on the ARU sensor on the main project dashboard will show which projects you have access to. The list of projects you see is determined by your organization membership, project membership and the status of the project. You can filter and sort by project attributes to find what you’re looking for.

5.1 Acoustic data concepts

Acoustic recordings capture sound pressure waves from the environment. Two key parameters when recording are sampling rate (the number of samples per second in hertz) and bit depth (the precision of pressure measurements).

Raw waveforms, which represent the acoustic signal over time, are often difficult to interpret directly for species identification. To make recordings more intelligible, waveforms can be transformed into spectrograms using algorithms such as the Fourier transform. Spectrograms visually represent acoustic energy, with time on the x-axis, frequency on the y-axis, and amplitude or loudness represented by colour intensity.

In WildTrax, spectrograms are generated using the command-line software SoX, with audio file types calibrated to produce clear, high-quality visualizations. These spectrograms reveal patterns that humans or software can use to identify biological signals, such as animal vocalizations.
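If you want to produce a comparable spectrogram locally, SoX can render one from the command line; the sketch below calls SoX from R with its default spectrogram settings. The file names are hypothetical, and the exact settings WildTrax uses are not reproduced here.

```r
# Generate a spectrogram PNG from a WAV file using SoX (SoX must be installed and on the PATH).
infile  <- "OKA-01_20240601_050000.wav"   # hypothetical recording
outfile <- "OKA-01_20240601_050000.png"

system2("sox", args = c(infile, "-n", "spectrogram", "-o", outfile))
```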

Animals communicate using acoustic signals for mate attraction, territorial defense, identification, and alarm. These signals are valuable to biologists, providing permanent, unbiased, and analyzable data for studying populations or community assemblages. Compared to traditional point counts, acoustic data allow for reproducible analyses and broader temporal coverage.

  • ARUs can be deployed for extended periods, offering flexibility in study design and enabling optimized data collection. WildTrax provides recommendations for processing acoustic data based on published research and reports

  • The optimal recording duration and amount of data processed depend on study objectives. Different taxa, such as amphibians, owls, or nocturnal species, may require distinct sampling strategies. For songbirds, shorter survey durations (e.g., 1-minute recordings) increase detection rates and allow more days of recordings to be processed, capturing more species efficiently. Short-duration surveys may have higher detection error per visit but achieve higher cumulative detection over time (e.g., 10 × 1-minute point counts vs. 1 × 10-minute point count).

  • While there are no firm guidelines on total recording time per location (e.g., 3, 5, or 10 minutes), processing recordings in 1-minute time blocks within longer intervals provides flexibility and maximizes processing efficiency. This approach also supports estimating parameters such as song rate and occurrence, improving the utility of the dataset. WildTrax supports task lengths to

  • Single-day recordings are generally insufficient to estimate occupancy or probability of occurrence. Deploying ARUs for several days strikes a balance, increasing detections of species with small territories and rarer species with larger home ranges. Extending deployments for a month may be less effective if it reduces the number of locations sampled.

5.2 Acoustic project management

The goal of an ARU project is to upload media, process tasks, verify tags, and publish results. Projects belong to organizations and use the organization’s media, such as recordings, to generate reports based on the objectives set by project administrators. From the main project dashboard, clicking on the ARU sensor will show the projects you have access to, which you can filter and sort by project attributes to help find specific projects quickly. The list of visible projects is determined by your organization membership, your project membership, and the status of the project.

5.2.1 Create an ARU project

To create a new project, you must first be a member of an organization in order to have privileges to upload media. Once you have the correct membership, click . The Project Settings form will open, allowing you to add details, with required fields marked with an asterisk (*). Once saved, the project will appear in the ARU project dashboard. You can access the project’s settings, users, or species assignments at any time using the project menu.

Project Context Menu

 

You can further refine your projects by setting custom dynamic settings for all tasks. These settings do more than customize the look of the spectrogram based on user preferences; they can also be used to isolate specific frequency ranges, which is especially useful when targeting particular species or focusing on species-specific project goals. The spectrogram can always be changed later in the acoustic settings of the task.

 

Next, the Assign Species tab allows you to select which species are included in your project. This helps control tagging accuracy and focus tagging efforts on relevant species. To add a species, click a species in the Not Included column (the row turns blue when selected), then click the arrow button to move it to the Included column. You can also Select All or Unselect All in either column. Preset groups are also an easy way to add species to a project in bulk; presets are based on geographic and taxonomic categories (e.g., birds, amphibians). Click Apply Preset, search for and select one or more species groups, then click Submit to add all selected species to your project.

Note: Once a species has been tagged in the project, it cannot be removed. The row will lock automatically to prevent accidental changes.

Tip

If you would like to request a new preset group, email info@wildtrax.ca with the details, including at minimum the genus, species name, a common name, and preferably a representative short code (e.g., TEWA = Tennessee Warbler, White-tailed Deer (Odocoileus virginianus) = ODOVIR).

Finally, you can manage users in the User Assignment tab. This will define users with roles and permissions across the project. Users inherit their membership level from the organization.

  • Project Administrators: Manage project settings, upload data, assign users and tasks, and edit all tags and data
  • Project Taggers: Create and edit assigned tasks and species validations
  • Project Read Access Members: View project details and tasks only

5.2.2 Uploading recordings and creating tasks

To process data in WildTrax, you must generate and assign tasks. A task is a unique combination of a recording, a processing method and duration, and an observer. This structure allows the same recording to be processed by multiple users or with different methods. To upload recordings to a project, which automatically generates the recordings as tasks, navigate to the Recordings tab and click the upload icon, which opens the configuration settings for the upload.

  • For a stable and secure upload process, it’s strongly recommended to use an ethernet connection before proceeding. Large uploads can take a while, so start with smaller batches to gauge your upload time, and then proceed with larger batches. The supported formats are listed below; each recording uploaded to WildTrax is ultimately stored as a compressed FLAC file:
    • WAC and W4V are proprietary, lossless compressed file formats developed by Wildlife Acoustics
    • WAV is the standard, ubiquitous uncompressed audio file format
    • MP3 is a lossy compressed audio file format; it works by reducing the accuracy of certain sound components and eliminating others
    • FLAC is a lossless compressed audio file format
  • Each recording must include a location prefix to be accepted. Location prefixes are included by default in Wildlife Acoustics ARUs, but check your make and model. The location name can be changed later if the prefix is incorrect and you do not have a way of changing the name on your local media.
  • Each recording must include a date and time in an accepted format (e.g., YYYYMMDD_HHMMSS). Once uploaded, the recording’s date and time cannot be modified, so ensure they are accurate before uploading. If errors are found after upload:
    • You must delete the recording.
    • Correct the name or metadata in your local copy.
    • Re-upload the corrected recording.
  • The maximum supported length for an individual recording is 30 minutes (1800 seconds) or 320 MB. The R package wildrtrax provides functions to segment larger files into smaller parts (see the sketch below).

Please review all details carefully to avoid upload issues.
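For example, you can scan a folder of recordings and split any overly long files before uploading. The following is a minimal sketch using the wildrtrax functions wt_audio_scanner() and wt_chop(); the paths, column names and argument values are illustrative, so check the package documentation before running.

# A minimal sketch; paths, column names and arguments are illustrative.
library(wildrtrax)

# Scan a local folder of recordings to confirm that location prefixes and
# date-times are parsed correctly before uploading.
recs <- wt_audio_scanner(path = "D:/ARU_2024", file_type = "wav", extra_cols = TRUE)

# Identify recordings longer than the 30-minute upload limit
# (assumes a length_seconds column in the scanner output).
too_long <- recs[recs$length_seconds > 1800, ]

# Split long files into shorter segments (e.g., 10 minutes each);
# segment_length and output_folder are illustrative values.
wt_chop(too_long, segment_length = 600, output_folder = "D:/ARU_2024_chopped")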

To upload recordings for storage and processing in WildTrax, navigate to the Recordings tab and then click the upload icon. This opens the recording upload window, where you can configure settings before uploading your files.

WildTrax provides several options to optimize your upload process:

  • Including Subdirectories: Useful if your media is organized hierarchically.
  • Removing Leading Zeros: Uncheck this option to retain leading zeros in location names.
  • Trigger Marking: Distinguishes triggered recordings from schedule-based ones. See Setting up an ultrasonic project.
  • Pre-Scanning: Scans for and displays sample rate and recording length during the upload process.

To begin, click Choose a Folder to Upload, select the directory containing your recordings, and let WildTrax scan the files. Once scanning is complete, optionally enter spatial coordinates for new locations or update missing coordinates. This can also be done later using location sync. If a location doesn’t already exist in the organization, WildTrax will create it through the upload process.

Once you’ve reviewed the queue, click the Begin Upload button to start the upload. You can follow the upload progress in the interface. After the uploads are complete, you can download and review the Log to ensure that all files were successfully uploaded.

Final confirmation of a successful upload will appear in the interface.

5.2.3 Syncing data to ARU projects

The Manage menu allows you to make bulk changes to your project data for locations, tasks and tags. This is especially useful for large updates such as renaming locations, adjusting visibility, reassigning tasks, or syncing tags from external sources.

Tip

Batch changes for location, task and tag sync only support add and update actions. Existing data will not be deleted even if you upload an empty CSV.

Project-level location management only includes locations linked to that specific project, unlike organization-level management which shows all locations. To add or update locations:

  1. Go to Manage → Download Location CSV.
    • If no locations exist yet, a template CSV will be provided.
  2. Edit or add location details in the CSV.
  3. Re-upload the file using Manage → Upload Location CSV.
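As a minimal sketch, a downloaded location CSV can be edited in R before re-upload. The file name and column names (location, latitude, longitude) are assumptions for illustration; check the downloaded CSV or template for the exact headers.

# A minimal sketch; file and column names are assumptions.
locs <- read.csv("my_project_locations.csv", stringsAsFactors = FALSE)

# Example: fill in missing coordinates for one location.
locs$latitude[locs$location == "ABMI-0001-NE"]  <- 55.123
locs$longitude[locs$location == "ABMI-0001-NE"] <- -113.456

# Write the file back out for re-upload via Manage -> Upload Location CSV.
write.csv(locs, "my_project_locations.csv", row.names = FALSE, na = "")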

Tasks represent the assignments for processing recordings. You can manually generate tasks if they weren’t created during the initial upload or if you need to create tasks from organization-level recordings. To add or update tasks:

  1. Go to Manage → Upload Tasks.
  2. Download one of the following:
    • Current Tasks CSV – If you want to update existing tasks.
    • Template CSV – If starting from scratch.
  3. Fill in the required fields:
    • location
    • recording_date_time (YYYY-MM-DD HH:MM:SS)
    • task_method (1SPM, 1SPT, or None)
    • recording_sample_frequency (recording sampling frequency in hertz, e.g. 44100)
    • task_duration (in seconds)
    • task_is_complete (t or f whether the task has been completed for processing)
    • observer (name of the user who is assigned the task)
    • task_comments (any additional information about the task, a maximum of 1000 characters)
  4. Re-upload the CSV using Manage → Upload Tasks.
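As a minimal sketch, a task CSV matching the required fields above could be assembled in R before upload (the values are illustrative).

# A minimal sketch of a task sync CSV; values are illustrative.
tasks <- data.frame(
  location                   = "ABMI-0001-NE",
  recording_date_time        = "2024-06-12 05:30:00",
  task_method                = "1SPM",
  recording_sample_frequency = 44100,   # hertz
  task_duration              = 180,     # seconds
  task_is_complete           = "f",
  observer                   = "Jane Observer",
  task_comments              = "First visit of the season",
  stringsAsFactors           = FALSE
)

write.csv(tasks, "aru_tasks.csv", row.names = FALSE)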

Tags represent species detections and other annotations made on recordings. These can be synced in bulk, especially when importing data from external sources like classifiers. To upload tags:

  1. Go to Manage → Upload Tags.
  2. Download a Template CSV if needed.
  3. Fill in the required fields:
    • Location Name
    • Recording Date/Time (YYYY-MM-DD HH:MM:SS)
    • Task Method (SPM, SPT, or NONE)
    • Observer ID
    • Species Common Name
    • Individual ID (e.g., 1, 2 for multiple individuals)
    • Vocalization Type (Song, Call, or Non-vocal Signal)
    • Start Time (in seconds)
    • Tag Duration (in seconds)
    • Minimum Frequency and Maximum Frequency (in Hz)
  4. Select Choose a CSV File to Upload.
  5. Click QA Tag Data to validate the formatting before finalizing.
Warning

When importing classifier outputs, ensure that tag start times align properly. Some classifiers use windowed data, which may not match WildTrax tag timing.
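As a minimal sketch, a tag CSV could be assembled in R before upload. The column headers below simply mirror the field list above and are illustrative; download the template CSV for the exact headers WildTrax expects.

# A minimal sketch of a tag sync CSV; column names and values are illustrative.
tags <- data.frame(
  location            = "ABMI-0001-NE",
  recording_date_time = "2024-06-12 05:30:00",
  task_method         = "1SPM",
  observer            = "Jane Observer",
  species_common_name = "Ovenbird",
  individual          = 1,
  vocalization_type   = "Song",
  start_time          = 34.2,   # seconds from the start of the recording
  tag_duration        = 2.5,    # seconds
  min_frequency       = 2800,   # Hz
  max_frequency       = 8200,   # Hz
  stringsAsFactors    = FALSE
)

write.csv(tags, "aru_tags.csv", row.names = FALSE)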

5.2.4 Setting up an ultrasonic project 🩇

WildTrax natively supports ultrasonic recordings, which are typically trigger-based recordings of bats and other ultrasonically vocalizing species. To use WildTrax for these data, a few specific steps are needed to set up a project appropriately.

WildTrax supports syncing ultrasonic data from external recognizers. If you have processed your recordings with a tool like Kaleidoscope, you can upload the resulting tags to WildTrax using the wildrtrax R package.

  • Download and install the wildrtrax package in R.
  • Use the function wt_kaleidoscope_tags() to convert Kaleidoscope output into WildTrax-compatible tags.
  • Once you have a CSV with your WildTrax tags, navigate to your project in WildTrax.
  • Click the Manage button, then select Upload Tags.

Select your CSV file and follow the prompts to run checks that ensure all tags are linked to the correct recordings. If there are no errors, your classifier tags will be synchronized with the media on the recordings in each task.
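A minimal sketch of the conversion step in R is shown below; the file paths are illustrative, and the argument names should be checked against the wt_kaleidoscope_tags() documentation.

# A minimal sketch; file paths and argument names are illustrative.
library(wildrtrax)

wt_kaleidoscope_tags(
  input  = "kaleidoscope_output.csv",        # output file from Kaleidoscope
  output = "wildtrax_ultrasonic_tags.csv"    # WildTrax-ready tag CSV
)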

5.2.5 Merge projects

You may need to use environmental sensors across a number of years or for specific questions related to study design. WildTrax offers an unlimited number of projects that can be created at any time in order to cater to this need. However, there are times when these questions and multi-year projects can be unified to collate and make the data more cohesive. WildTrax gives you this ability with the project merging tool. Clicking on the drop-down arrow beside the project name on the dashboard will also show the Merge Project button. This function allows you to merge a source project to a target project.

5.2.6 Delete a project

5.2.7 Publishing a project

Go to the pencil icon or right-click on the project within the dashboard. Go to Status and change the project status. Note that publishing the project will lock it for taggers, read-only members, etc. Choose your publication status wisely (see Publishing and sharing data).

5.2.8 Downloading data
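Project data can also be retrieved programmatically. Below is a minimal sketch using the wildrtrax R package; the project ID is illustrative and the available report types vary by sensor, so check the package documentation and the Publishing and sharing data section for details.

# A minimal sketch; the project ID and report name are illustrative.
library(wildrtrax)

# Authenticate using WildTrax credentials stored as environment variables.
Sys.setenv(WT_USERNAME = "my_username", WT_PASSWORD = "my_password")
wt_auth()

# Download the main report for an ARU project you have access to.
main_report <- wt_download_report(
  project_id   = 12345,
  sensor_id    = "ARU",
  reports      = "main",
  weather_cols = FALSE
)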

5.3 Acoustic tagging

This section is designed to provide a comprehensive overview of the tools and features available within the acoustic tagging page. Jump ahead to the section on methods if you’re looking for more details on tagging methods for species identification.

5.3.1 Controls and settings

WildTrax implements audio editing controls using hotkeys for easy navigation through a task. Hover over the panel above the spectrogram at any time to see the controls.

Hotkeys:
- Z: Jump back 10 seconds
- M: Jump forward 10 seconds
- B: Toggle display for boxes
- T: Toggle display for species / individual text
- L: Show/hide tags on the left channel
- R: Show/hide tags on the right channel
- Left Arrow: Previous marker
- Right Arrow: Next marker
- Up Arrow: Go to the first marker
- Down Arrow: Go to the last marker
- Spacebar: Pause / Play
- Tab: Move between fields
- 1: Create a marker
- Enter: Select within a field

You can dynamically adjust audio settings that control spectrogram and audio playback parameters. These settings default at the project level but can be modified while processing a task. Access them by clicking the gear icon below the spectrogram or via Manage > Audio Settings. Click Apply to save or Reset to revert to project defaults.

Key audio settings include:
- Channel: Display left, right, or both channels.
- Amplify: Increase gain to hear faint sounds.
- Minimum and Maximum Frequency (Hz): Spectrogram display bounds. Audio plays for the full spectrum unless Generate Audio is enabled.
- X-Scale: Adjust the time window displayed. Lower values show more time; higher values show less.
- Y-Height: Height of the spectrogram in pixels.
- Use Monochrome: Toggle black-and-white or color display.
- Generate Audio: Play audio only in selected frequency ranges.
- Noise Filter: Apply a filter to highlight signals over background noise.
- Contrast: Sets dynamic range in dBFS (default 120). Lowering increases spectrogram contrast.
- Brightness: Sets the upper limit of the Z-axis in dBFS. Negative numbers increase perceived brightness.


5.3.2 Noise, weather, and malfunctions 🌧️

ARUs capture all sounds, including geophonic (e.g., wind, rain) and anthropogenic (e.g., traffic, machinery) noise. Noise assessment is important because it can interfere with species detection, depending on frequency and amplitude.

  • Industrial noise, such as motors or fans, can be nearly constant.
  • Traffic or nearby equipment produces intermittent signals.
  • Biotic noise (e.g., insects) may also appear in recordings.

Noise can affect detection by amplitude or frequency. Even faint broad-spectrum sounds may mask biotic signals, and loud low-frequency noise can reduce detection accuracy.

ARUs produce some internal static, which can indicate equipment issues, especially in older models with lower signal-to-noise ratios.

WildTrax allows you to avoid processing tasks where wind, rain, or industrial noise exceeds project-defined thresholds. The weather panel shows conditions from the nearest weather station at the closest timestamp to the recording. You can open detailed weather information to view hourly and daily conditions from multiple stations within a 155 km radius.


5.3.3 Map and location photos

If the task has spatial metadata or photos, the Map and Location Photos tabs appear next to the Audit tab.

  • Map: View where the task was conducted.
  • Location Photos: See seasonal habitat changes. Images are sorted by visit date to compare conditions across seasons.

These features assist with species identification by providing context about habitat and site conditions.


5.3.4 Acoustic tagging methods

WildTrax uses acoustic processing methods based on avian point count recommendations from:
- Bioacoustic Unit
- Boreal Avian Modelling Project
- Environment and Climate Change Canada

The processing method is set during upload or task generation and depends on two factors: processing length and count-removal type.

Count-removal defines how individual detections are handled:
- 1SPM: One tag per species per minute
- 1SPT: One tag per species per task
- None: Unlimited tags

WildTrax tracks detections at the individual level (e.g., OVEN 1, OVEN 2).

For additional processing methods, contact WildTrax Support.

5.3.5 Tags

You can add tags (click and drag a box) or markers (Hotkey: 1) on spectrograms.

The boxes are a fundamental way WildTrax differs from many other audio tagging systems. They allow you to enclose a species detection in real-time while providing information about vocalization frequency, length, and amplitude. After drawing a box, WildTrax pauses playback and requires you to enter species metadata.

Markers indicate where a signal is detected without stopping playback. You can return later to draw a box and enter species metadata. This dual system allows flexible workflows for rapid annotation.

To create a tag click and drag on the spectrogram to create a box around the signal. Include harmonic frequencies if relevant to your study. The Add New Tag window will appear on the right side of the spectrogram. Follow project-specific rules and constraints for tag creation:

  • Tags must be 0.1–20 seconds long.

  • Minimum frequency: 0 Hz, maximum frequency: 12,000 Hz.

  • Minimum tag dimensions: 0.1 sec × 200 Hz (5×5 pixels minimum).

  • Tag should tightly enclose the signal, capturing key vocalization patterns.

  • Some methods may restrict the number of tags per species-individual.

The tag metadata also includes the following fields:

  • Species: Dropdown using short-hand codes or common names

  • Individual: Unique identifier for each individual of a species (e.g., OVEN 1, OVEN 2). Automatically updated if left blank.

  • Needs Review: Flag for verification or uncertainty.

  • Abundance: Estimate of number of individuals. For amphibians, see Calling Intensity (CI) below.

  • Vocalization Type: Song, call, non-vocal, or call+feeding buzz for ultrasonic species.

  • Comments: Notes about songs, calls, or unknowns.

All tag information is saved in the logged sounds table below the spectrogram. You can edit tags by clicking the box or in the table. Hover over cells for time details, and click to jump directly to the signal.

5.3.6 Avian individuals and abundance 🐩

  • Individual: Tracks distinct animals.
  • Abundance: Number of individuals within the tag.
  • Example: Tennessee Warbler (TEWA) - Individual = 1, Abundance = 1.

Factors that improve accuracy for individual tagging:

  • Territorial or sedentary individuals
  • Unique song or behavior
  • Low species richness
  • Low abiotic noise

When individuals cannot be distinguished, use abundance to reflect multiple animals or TMTT (“Too Many To Tag”) for uncountable numbers.

5.3.7 Amphibian abundance 🐾

Amphibians congregate during the breeding season. Calling intensity (CI) is used instead of counting individuals, adapted from the NAAMP Amphibian Calling Index (ACI, 2005):

  • CI 1: Individuals countable, calls separated
  • CI 2: Overlapping calls, individuals distinguishable
  • CI 3: Full chorus, continuous overlapping calls

5.3.8 Confidence and unknowns

  • Tag uncertain signals as Needs Review.
  • Unknowns are categorized conservatively (e.g., UNTL instead of UNKN).
  • Factors affecting identification: amplitude, signal complexity, masking, observer skill.
  • Project replication standards help account for faint or generic call notes.

5.3.9 Vocalization types

Categories in WildTrax:

  • Song: Territorial/mate-attracting vocalizations (e.g., passerine males).
    • Examples: Ovenbird (Seiurus aurocapillus) song, Black-throated Green Warbler (Setophaga virens) multiple songs.
  • Call: Non-mate-attracting vocalizations; sex not distinguished.
    • Examples: Alarm calls, begging calls, simpler non-passerine calls.
    • Exceptions: Corvids (calls), Yellow Rail (song for territoriality).
  • Non-vocal: Mechanical sounds produced by a species.
    • Examples:
      • Wilson’s Snipe (Gallinago delicata) - winnowing
      • Yellow-bellied Sapsucker (Sphyrapicus varius) - drumming
  • Call+Feeding Buzz: Rapid calls used for prey capture by echolocating species.
    • Example: Little Brown Bat (Myotis lucifugus) feeding buzz

WildTrax is flexible in categorization, but these standards help maintain consistency. Suggestions for improving vocalization standards can be sent to WildTrax Info.

5.3.10 Ultrasonic species 🩇

Because most recordings from ultrasonic projects are trigger-based, tasks are typically about the length of the tag itself, or only slightly longer.

5.3.11 Acoustic classifier overlays

If enabled in the project, each processing task includes the ability to overlay acoustic classifier detections. Within WildTrax, the goal is to help promote efficient processing by drawing attention to potential detections.

5.3.12 Species verification

The species verification tab houses a set of tools to assist in the quality control of acoustic tags. In species verification, administrators assign users as validators to verify all tags grouped from a species-vocalization type in a project. Species-vocalization type is used for the grouping to allow the user to focus on one signal type at a time, e.g., Wilson’s Snipe (WISN) calling vs. winnowing. Species verification is important because it allows you to efficiently check all the tags produced in a project before publishing it, thereby dramatically increasing data quality output.

ARU species verification in WildTrax is a two-stage approach:

  1. Create or import tags in a project

  2. Verify all the tags of a species-vocalization type

This allows a WildTrax user to have the first pass at processing the acoustic data and then use the verification tools to target and verify tags. Verification tags are populated once the task is completed (or switched to Transcribed). Clicking on the Species tab in the project page displays a list of species-vocalization types, summaries of the total number of verified tags, and tools to assign users for verification, in order to help you manage the verification process. Click on any of the species to enter the verification page.

This is the standard practice currently used in WildTrax. Use this route if you have trained taggers who will be generating tags from assigned tasks.

  1. Upload recordings to a project and generate tasks by checking Create a new task
  2. Assign users as taggers to the tasks
  3. Tag all of the tasks in a project
  4. Assign validators under the Species tab
  5. Verify all of the tags for each species-vocalization type where desired; return to the tasks to change or delete tags where needed
  6. Filter by verified tags and mark high-quality ones as Nice, or mark as Nice as you verify
  7. Proceed to publishing your verified project

If you have outputs from a recognizer and wish to verify the hits and share the results in WildTrax:

  1. Upload recordings to a project and create tasks
  2. Upload tags from the automated classifier
  3. Assign validators under the Species tab
  4. Verify all of the tags for each species-vocalization type where desired; return to the tasks to change or delete tags where needed
  5. Rate the tags to help determine recognizer performance
  6. Proceed to publishing your verified project

The verification page header designates the species-vocalization type you’re verifying and the filters are metadata you can use to filter the list of tags you see in the single tag panels. The single tag panels allow you to individually access and manipulate the audio and spectrogram properties, and take actions in order to verify the tag.

You can listen to any of the audio in the single tag panels by clicking the Play button in the top-left hand corner. The icons below the spectrogram indicate the following:

  • Verify: when checked green, the tag has been verified. The background of the single tag panel will also change to green.
  • Rating: when checked yellow, the tag has been rated.
  • Task link: opens a new tab to the task where the tag was created; the tag will be coloured black in the task.
  • BirdNET probability: maximum probability (0-1) returned from BirdNET. The number is the maximum value found in all of the 3-second windows where BirdNET also positively detected the species and intersected the tag.
  • Amplitude: peak amplitude (in dBFS) of the tag.
  • Abundance: abundance of the species recorded in the tag. If the abundance is TMTT, a distinct icon is displayed.

WildTrax uses the BirdNET API and returns the maximum probability from all the 3-second windows that intersected the tag. Note, results from other species are not returned.

WildTrax returns the maximum value among all of the 3-second windows generated by BirdNET that intersect the tag. The value indicates the probability of BirdNET having detected the species in that interval. If BirdNET doesn’t detect the same species, the probability will be 0. You can also click the help button, which opens the Help menu describing everything from the legend to keyboard shortcuts and tag selection methods, so you can customize your verification workflow the way you want.
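The intersection logic described above can be illustrated with a short R sketch; the data frame, windows and values below are illustrative and are not WildTrax output.

# A minimal sketch of the window-intersection logic; all values are illustrative.
birdnet <- data.frame(
  window_start = c(0, 3, 6, 9),
  window_end   = c(3, 6, 9, 12),
  species      = rep("Ovenbird", 4),
  confidence   = c(0.12, 0.81, 0.64, 0.05)
)

tag_start <- 4.2   # seconds
tag_end   <- 7.1   # seconds

# Windows that overlap the tag interval.
overlaps <- birdnet$window_start < tag_end & birdnet$window_end > tag_start

# Maximum probability among the overlapping windows (0.81 in this example).
max(birdnet$confidence[overlaps])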

Clicking on the top-right corner of the single tag panel opens the detailed verification window. This window allows you to manipulate the audio and spectrogram parameters in order to verify the tag. The main sections of the detailed verification window include:

  • Task link (in the header)
  • Action buttons
  • Tag details
  • Spectrogram
  • Filters

The task link will open a new tab and jump back to the task to view where the tag took place, highlighting the tag in black. This is useful if you need more context beyond what is available in the detailed verification window.

Actions are where you can quickly validate the tag after you’ve done any filtering or audio manipulation.

  • Verify: the icon will turn green when the tag is verified.
  • Rate: rate the tag following eBird guidelines.
  • Delete: deletes the tag from the system. The changes will also be tracked in the audit table of the task.

Tag details in the left column summarize other useful information about the media and tag in order to help make a decision on the verification of the tag.

  • Minimum frequency: minimum frequency of the tag
  • Maximum frequency: maximum frequency of the tag
  • Length: length of the recording (seconds)
  • Default channel: indicates the channel used for default verification. If one of the channels was malfunctioning, the better channel will appear by default.
  • BirdNET probability: maximum value returned from BirdNET
  • Peak dBFS: maximum amplitude of the tag

You can manipulate the audio and spectrogram using the different filters and editors located below the tag. These include the following settings that you can combine in any way you’d like to generate the best-looking and -sounding spectrogram to verify the tag.

  • Amplify: increases the gain, or amplitude, of the media.
  • Noise filter: runs a noise profile on the tag and attempts to eliminate noise.
  • Channel filter: select the channel you want to display and listen to (Left or Right). By default, the left channel is visible while both channels are audible.
  • Z Contrast: contrast range in dB. Default is 120 dB. This sets the dynamic range of the spectrogram from the selected value in dBFS to 0 dBFS and may range from 20 to 180 dBFS. Decreasing the dynamic range effectively increases the ‘contrast’ of the spectrogram display, and vice versa.
  • Z Brightness: sets the upper limit of the Z-axis in dBFS. A negative number effectively increases the ‘brightness’ of the spectrogram display, and vice versa.
  • Y Scale: expands the number of pixels displayed on the Y axis. Default is 1x. A larger number stretches the spectrogram vertically.
  • Frequency filter toggle: when turned on, limits audio playback to the frequency bounds of the tag. Helpful for eliminating other bandwidths to more clearly hear the signal.

6 Camera Projects 📷

The goal of a camera project is to upload image sets (i.e., tasks), process tasks (i.e., tag species and individuals within images), verify tags, and publish the results. The main interface you will interact with when processing image sets is the camera project dashboard. Projects are purposeful groupings of an organization’s media and metadata to answer specific questions or implement a study design.

6.1 Camera data concepts

A remote camera can be deployed at a location for a short or long period of time. One or several units can be deployed at a single location for years, swapping out the batteries and SD cards every so often depending on usage, or the unit(s) can be moved to a new location. This flexibility provides great benefit to a user or researcher, as the only limitation to data collection is storage space and battery life. Depending on the length of time these units will be in the field prior to being serviced, camera settings can be changed to optimize battery life. When developing a remote camera sampling design for questions related to density estimation, relative abundance, occupancy modeling, etc., strong considerations should be made regarding the length of time in the field, number of units to install and the distance between units. Resources on camera deployment methods, sampling protocols and analytical approaches can be found in the Resources section.

6.2 Camera project management

Projects are purposeful groupings of an organization’s media and metadata to answer specific questions or implement a study design. Projects can be one of three sensors: ARUs, cameras or point counts. For each of these sensor types, a collection of tasks (ARU), image sets (camera) or surveys (point counts) are processed or uploaded in order to retrieve species data.

WildTrax requires at least one administrator per project. Project administrators manage the upload of media, user management and assignments, auditing and species verification, and project publication. Organization administrators create the project and assign a project administrator to manage it. These administrators also add read-only members or share location reports with other WildTrax users.

The general life cycle of a project involves:

  1. Creating a project
  2. Adding users to the project
  3. Uploading media or data depending on the sensor
  4. Creating or uploading tasks or surveys
  5. Creating (processing) or uploading tags or observations
  6. Verifying and quality controlling tags

6.2.1 Create a camera project

6.2.2 Camera project settings

6.2.3 Species assignment

6.2.4 User assignment

6.2.5 Uploading images and creating tasks

6.2.6 Image sets

6.2.7 Image auto-classification

The use of remote cameras can lead to the capture of hundreds, thousands or even hundreds of thousands of images in a single image set. The large data sets collected are a benefit to users; however, image processing is also usually a bottleneck to producing meaningful data in a timely manner. The time required for humans to process each image and categorize animals or humans can be incredibly time-consuming and inefficient. WildTrax’s auto-tagger features allow you to reduce the time required to review remote camera images by applying tags to images, or auto-tagging before you begin manually tagging. WildTrax’s current auto-tagging classification includes:

  • MegaDetector Version 6 (MDV6-yolov10-e) from Pytorch Wildlife.
  • The STAFF/SETUP tagger

The STAFF/SETUP tagger is a setting used to select the application of the STAFF/SETUP tag automatically to image sets. The auto-tagger will automatically tag images of humans that occur within the first or last series as “STAFF/SETUP” (using a 5-minute series gap), unless there are <2 images auto-tagged as human, or the STAFF/SETUP tag has already been applied.

Once you’ve applied the auto-tagger setting in Camera project settings, any image data uploaded into the project will be run through the auto-tagger based on your selection of classifier categories before it becomes available for tagging.

6.2.8 Syncing data to camera projects

From the project page, administrators manage only the locations associated with the media in the project.

To sync location information for locations present in the project:

  1. Click on the “Manage” button > “Download Location CSV” to obtain a CSV with the current list of locations in the project.
  2. Make the changes to location information you’d like. Note that you cannot add or delete locations using the Manage button from the project page; however, you can add or delete locations using the Manage button on the Organization > Location page. Attempting to add, change, or remove data in the location column will result in an error on re-upload into WildTrax.
  3. Re-upload the modified CSV by clicking “Manage” > “Upload Location CSV.”

Batch upload processes in WildTrax support add and update operations only; deletions are not allowed. For example, if you accidentally upload an empty CSV, no existing data will be deleted.

6.2.9 Downloading reports and media

6.2.10 Merge projects

6.2.11 Delete a project

6.2.12 Publishing a project

6.3 Image Tagging

Tagging images in WildTrax is relatively flexible in that users can be as general or as detailed as their question requires. Tagging entails the application of one or more tags, composed of a species, sex, age and number of individuals, to each image. In the future, WildTrax will allow for the application of additional tags such as coat colour, snow depth, etc.

From the tagging screen, select one or more images for tagging in multiple ways:

  • Click on the image to select it.
  • Click and drag your cursor over groups of images.
  • Using Shift, click on the first image; then, holding Shift, click on the last image to select all images in between.
  • Using Ctrl, select multiple images that are not in consecutive order. This includes being able to drag boxes around multiple subsets of images.
  • Select images in the panel on the left-hand side of the screen.

Note: Selected images will be highlighted in teal in the number panel. Press ESC at any time to deselect highlighted images.

Apply tag(s) to selected image(s) using the tagging window. With image(s) selected, click on “Tag Selected X,” where X represents the number of images you have highlighted for tagging, and fill in the applicable information.

The tagging form differs when tagging a single image (Single image tagging form) compared to when tagging multiple images (Batch image tagging form):

Single image tagging form:

6.3.1 Tagging page controls and settings

6.3.2 Individual-level tags

The individual(s)-level tags refer to a tag applied to one or more individuals with the same combination of characteristics (i.e., all are adult males displaying the same behaviour). Note that what constitutes the “same combination” will depend upon the tagging field options selected within camera project settings. Individual(s)-level tags appear in the upper gray portion of the tagging page.

The following fields can be optionally selected in the camera project settings to classify characteristics of one or more individuals (if they have the same characteristics) in a tag.

Some of the tagging field options are “one-to-many fields,” meaning you can select multiple options that apply. In the tag report, one-to-many fields will occur as a comma-separated list.

Since more than one individual-level tag may occur for a single image, it’s important to note that each tag corresponds with a unique row in the tag report (see section 7.1 Data Downloads).

Species: The species menu is divided into Mammals, Birds, and Human tags. Common and frequently used species appear at the top of the drop-down menu to facilitate quick tagging.

Special species tags are also used:

  • Unidentified: a species tag used if the individual in the image cannot be identified based on visible features. This is often used when the only images of an individual are blurs, blotches of fur, etc.
  • NONE: a species tag used for motion-activated images with no individual(s) present.
  • STAFF/SETUP: a species tag used for the series of photos taken while staff are setting up or taking down the camera (humans that occur within the first or last series, using a 5-minute series gap). This tag is applied automatically if the STAFF/SETUP auto-tagger is selected in the project settings.

Count (camera): Count is the number of individuals of a particular species, age, sex, behaviour, etc. (i.e., it applies to a specific tag where all other fields remain the same, rather than to an image). For example, if adults and juveniles occurred together in an image, the count would not equate to the number of both adults and juveniles; rather, a separate tag for juveniles should be applied, and the counts in the separate tags should include the number of adults and the number of juveniles, respectively.

The default count for all wild animals is ‘1.’ The number can be changed if the count is greater than one. The default count for domestic animals, birds, vehicles, and humans is ‘VNA.’ Users can also input VNA if they do not want to collect information in this field.

Age class: the categorical age class (choose one) of individual(s) in a tag (when identifiable).

Age class options:

  • Adult (default for mammals): an animal that is old enough to breed.
  • Juv (Juvenile): an animal during its first summer [mammals older than neonates but still requiring parental care]. The juvenile tag is only used for an animal’s first summer, when they have apparent juvenile features such as spots.
  • UNKN (Unknown): the age class of the individual is unclear.
  • VNA (Variable not applicable; default for domestic animals, birds, and humans): the tag does not apply, or the user is not interested in collecting information for this field.

Sex class: the categorical sex class (choose one) of individual(s) in a tag (when identifiable). For example, ungulate species such as deer, elk, and moose can be identified by sex class based on the presence of antlers, but antlers are not manifested year-round. Therefore, it is recommended that antler-less ungulates are only tagged as female between May 15 and October 1. Outside of these dates, if antlers are not present, the default of UNKN should be used. Some species, such as bears, are often photographed with their young. When an adult mammal is with a juvenile, it can be assumed to be female and tagged as such.

Sex class options include: Male, Female, Unknown (default), and VNA (variable not applicable). Users can select VNA if they are not interested in collecting this information.

Behaviour: a one-to-many field (choose all that apply) used to classify behaviour(s) of mammals (reported as a comma-separated list when syncing tags).

Behaviour options include: Travelling, Standing, Running, Feeding/Foraging, Drinking, Bedding, Inspecting, Inspecting Camera, Vigilant, Territorial Display, Rutting/Matting, Unknown, Other, and VNA (variable not applicable; default).

Health/Disease: a one-to-many field (choose all that apply) used to classify descriptors of health and/or disease status (reported as a comma-separated list when syncing tags).

Health/Disease options include: Poor Condition, Discolouration, Hair loss, Lumps, Scarring, Injured, Malformed (environmental and/or genetic), Diseased, Ticks, Mange, Dead, and Other.

Direction of travel: a categorical field (choose one) used to classify the direction of travel of moving individual(s). The 12 categories represent the 12 positions of a clock. Assuming the camera always faces the 12 o’clock position, the option entered for a moving individual should represent the clock position the animal moves towards in relation to the direction the camera faces. For example, if the animal travels from left to right, and the movement is perpendicular to where the camera faces, the direction of travel would be “3 -o- Clock.”

Direction of travel options include: 1 -o- Clock, 2 -o- Clock, 3 -o- Clock, 4 -o- Clock, 5 -o- Clock, 6 -o- Clock, 7 -o- Clock, 8 -o- Clock, 9 -o- Clock, 10 -o- Clock, 11 -o- Clock, and 12 -o- Clock.

Coat colour: a one-to-many field (choose all that apply) used to classify the coat colour(s) of mammals (reported as a comma-separated list when syncing tags).

Coat colour options include: Beige, Cream, Brown, Chocolate Brown, Dark Brown, Black, Blonde, Cinnamon, Grey, Red, Orange, Yellow, White, Melanistic, and Other.

Coat attributes: a one-to-many field (choose all that apply) used to classify attributes of mammals’ coats (reported as a comma-separated list when syncing tags).

Coat attributes tag options include: Spots, Speckled, Salt-pepper, Stripes, Cross-Phase, Chest Blaze, and Other.

Antler tine attributes: a combined field (choose one combination) used to document information on antler tine attributes, including antler position (the side of the rack being categorized), tine count (the number of antler tines present) and tine count precision (the precision of the tine count). Users will only be able to apply this tag to mammal species with antlers (e.g., Moose, White-tailed deer, etc.).

Antler tine attributes:

  • Antler position: the side of the rack being categorized (options include: Left, Right, and Symmetrical)
  • Tine count: the number of antler tines present
  • Tine count precision: the precision of the tine count (options include: Exact, Approximate, and At least)

Note that the three antler tine attributes are concatenated into one value in the resulting tag report.

Collar: flags used to identify individuals affixed with a collar (e.g., a GPS collar; may be interpreted as “off-leash” for Domestic Dogs) (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Ear tag: flags used to identify individuals with ear tags (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Interacting with human feature (IHF): flags used to indicate when individual mammals use or interact with human features (e.g., an animal walking in the adjacent forest vs. along the fence, or digging in a compost pile) (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Tag needs review: flags applied when species attributes are unclear and need to be checked. Each individual-level tag in an image is associated with a review flag, so it is clear which tag needs to be reviewed (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Tag comments: any comments entered by the observer to describe additional details about the individual(s)-level tag. Note there is also a field to document image comments.

Field of View (FOV) is the extent of a scene that is visible in an image (Wearn & Glover-Kapfer, 2017).

For remote cameras, the camera’s FOV is influenced by how the camera is set up (i.e., camera height and angle, slope, etc.), and it often remains largely unchanged throughout a deployment. However, a camera’s FOV can change during deployment, such as when snow blocks the lens or when the camera is nudged by an animal moving past it (e.g., it now faces another tree rather than an open area). When the camera’s FOV has changed compared to the setup view, users should use the image FOV tags to document images that shouldn’t be included as part of the observation period (and thus the sampling effort). It is important to use the image FOV tags correctly since they allow for the correct estimation of each camera’s sampling effort.

FOV: Field of View tags are only used when the camera’s FOV has changed significantly (for 3+ hours) compared to the FOV at setup. When this occurs, the images are considered to be ‘Out of Range,’ and the observation period ends for the camera. The image(s) may be motion- or time-lapse-triggered images.

FOV ‘Out of Range’ criteria:

  • The camera’s FOV changed for 3+ hours.
  • The change in FOV was “significant”, which may occur due to:
    • Loss of visibility – e.g., the lens is more than 50% covered (by snow, vegetation, fallen trees, etc.). Discretion is used where (e.g.) cattle are leaning on a post and making the camera go in and out of position repeatedly; in such cases, the camera is considered to be not working properly the whole time this is happening.
    • Major changes in the roll, pitch, and yaw of the camera:
      • Roll – the tilt is more than 30 degrees from level.
      • Pitch – the camera’s angle shifted upwards or downwards such that the pole (if used) is now beyond the bottom of the image or above the center of the image.
      • Yaw – the bottom of the pole (if used) is out of view beyond the right or left side of the image.

Application of Image FOV tags: There are four potential FOV tags, “Within,” “Out of Range,” “END – Last good image in FOV,” and “START – First good image in FOV”. Importantly, the Out of Range tag is automatically applied to the images between the END and START tags (if applied) after a FOV review has been completed.

If the FOV remains unchanged (or is altered, but for less than 3 hours) compared to the setup view, the camera’s FOV is assumed to be “Within” the normal range (the default), and the observation period includes the entire deployment. Suppose, however, that you are tagging an image set with ~2,500 images and there was a period in the middle of the deployment (e.g., images 1001-1551) where snow covered the lens of the camera for more than 3 hours; these images should be considered “Out of Range.” Users do not need to apply the Out of Range tag manually, since WildTrax will populate it automatically after the Field of View review if the END and/or START tags are applied. For the first 999 images, the user can leave the image FOV as the default (“WITHIN”) since the FOV remained the same as the setup view, and thus the observation period began with the first image and continued until the FOV changed. Users should apply the END tag to image 1000 since it is the last image with the correct FOV and signifies the end of the observational period. Since the snow melted after image 1551, the user should apply the START tag to image 1552 to signify that the observation period has recommenced.

To summarize, follow these instructions to use the FOV tags correctly:

  • WITHIN (default): applied to images where the camera is assumed to have the camera setup FOV (i.e., is “WITHIN” the normal range). The Field of View field defaults to ‘WITHIN’ as images are assumed to be within range.
  • Out of Range: applied to images where the camera’s FOV has changed significantly (for 3+ hours) compared to the setup FOV. This tag does not need to be applied manually; WildTrax will complete the process automatically during the Field of View review. Images may be motion or time-lapse images.
  • END – Last good image in FOV: if the FOV has changed significantly (for 3+ hours), apply to the last image with the correct FOV (the camera setup FOV) before the FOV changed (e.g., the last image before snow covers the lens or a cow leans against the camera) to signify the end of the observational period. Note that the ‘END’ and ‘START’ tags are only used if the view changes compared to the set-up images.
  • START – First good image in FOV: applied if a) the END tag was applied, and b) the camera’s FOV returns to the correct FOV (the camera setup FOV); applied to the first image captured with the corrected FOV (e.g., snow melts or the cow stops leaning against the camera) to signify that the observation period has recommenced. Don’t apply a ‘START’ tag at the beginning of an image set to indicate that the camera has been successfully set up.

After the Field of View review is complete (see the section on Field of View review), the ‘Out of Range’ tag will be automatically applied to the images between the END tag (end of the observation period) and the START tag (beginning of a new observation period).

Snow presence: flags applied to images where snow is present on the ground (reported as ‘TRUE’ or ‘FALSE’ when syncing tags). If snow is in the air (i.e., it is snowing) but not on the ground, snow presence should not be flagged.

Image snow depth (m): the depth of snow (in metres) at the distance at which the camera detects motion, at the ground/snow surface level.

Image water depth (m): the depth of water (in metres) at the distance at which the camera detects motion, at the water’s surface.

Fire presence: flags applied to images where the camera was clearly triggered by fire. Note: animals may or may not be present in these images (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Malfunction: flags applied to images when it appears that the camera is not working properly (e.g., images are completely black or pixelated) (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Nice: flags used to highlight high-quality images so they are easier to find at a later date (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Image comments: comments entered by the observer to describe any additional details about the image. Note: there is also a field to document individual(s)-level comments.

6.3.3 Image-level tags

Image-level tags are applied based on information that can only occur once per image (e.g., snow or fire is either present in an image or it is not). Image-level tags are displayed in the larger grey box at the bottom of the tagging page. Note that this box will shift down as new tags are added.

6.3.4 Filter panel

Filters panel (b): The filters panel can be used to search for tagged images or filter for certain parameters within an image set. You can filter for tagged images containing any of the tagging field options (see the sections on individual-level and image-level tags for more information), such as tags of a specific species, sex, or age, as well as a variety of descriptive tags, date ranges and field of view. From this window, you can view all hidden images (time-lapse and auto-tagged) as well as select a date and time of images you want to view.

Number of images per page (e): where you can define how many images are displayed per page.

Bounding box display threshold (f): the minimum classifier confidence level for which bounding boxes will be displayed for the selected detection categories. The default is managed in the camera project settings.

Page numbers (g): depending on the number of images you display per page, your corresponding number of pages will increase or decrease.

Image information icon (h): The icon at the bottom right-hand corner of an image can be used to see the image metadata (e.g., Equipment make, Equipment model, Flash mode, etc.). The icon will also display the image URL, which can be used to download a specific image.

6.3.5 Image classifiers

6.3.6 Tagging scenarios

There are four main tagging scenarios you may encounter:

  1. ≄ 1 images / 1 individual / 1 species: if you select one or more images with a single individual of a species, then a single tag is applied. Once the tag fields are completed, click “Save All and Close” to apply the tag.

  2. 1 image / >1 individuals / 1 species: if you select a single image that contains > 1 individual of a single species, but age and/or sex differ among these individuals, then multiple tags are applied. Once you have completed the tag fields for the first individual, click “Add Tag” to create a new tag for the next individual. Example: an image with a female moose and a calf. In this case, you would create a unique tag for each individual.

  3. 1 image / ≄ 2 species: if you select a single image that contains > 1 species, then multiple tags are applied. Once you have completed the tag fields for the first species/individual, click “Add Tag” to create a new tag for the next species/individual. Example: an image where a deer triggered the camera and a coyote was also captured in the background. In this case, you would tag the deer and then click “Add Tag” to create a second tag for the coyote (or vice versa).

  4. >1 images / >1 individuals / 1 species / tags differ: if you select multiple images that contain > 1 individual of a single species, and age, sex, or any other tags (e.g., behaviour) differ among these individuals, then multiple tags are applied. Once you have completed the tag fields for the first individual, click “Add Tag” to create a new tag for the next individual.

Update all untagged: Images with abundant species that will all have the same tag, such as Domestic Cows, may be left untagged until the end. Once all other images in the image set (including NONE) have been tagged, you can select the Update All Untagged button and enter the Domestic Cow tag in the tagging window. Doing so will tag all remaining untagged images (on all pages) as Domestic Cow. The Update All Untagged button can be used for any species whose tag attributes (e.g., individual count, sex class, or age class) default to ‘VNA.’ Thus, the button can be used for all domestic animals, humans, or NONE. This button cannot be used if the tag attributes vary.

6.3.7 Field of view review

A Field of View review is completed only if an ‘Out of Range,’ ‘END,’ and/or ‘START’ tag was previously applied during the tagging process, and only after all images in an image set have been tagged.

6.3.8 Species verification

Species verification is completed as part of a quality control step within WildTrax to ensure the accurate application of tags. This step is only carried out when all image sets within a project have a status of “Tagging Complete”. In general, all wild mammals are double-checked. Domestic and bird species are verified at the discretion of the project administrator, who assigns species to taggers.

The main objectives for species verification are:

  • Ensure manually-applied species tags are correct
  • Ensure context-tagged species tags are correct
  • Ensure auto-tagged species tags are correct (if applicable)
  • Conduct additional analyses on verified images

To complete species verification:

Access your assigned species through the Verify Species tab of the project page.

Checks to complete during species verification: correct species ID tag(s) and correct use of attribute tags. If errors are found, open the tagging form either by clicking on the image and selecting the “Tag Selected” button in the top right-hand corner of the screen, or by clicking on the tag below the image. In both cases, edit the tag in the tagging form and click “Save All and Close.” Once all images on a page have been checked, click on the “Verify Species” button (e.g., Verify Canada Lynx) at the bottom of the screen. The next page of images to verify will automatically appear. When all images on a page are verified, the images and page navigation boxes turn green to indicate they are complete.

Notes on species verification

  • If a tag was given a “Tag Needs Review” tag during the first round of tagging, that image will have an orange border around it in the side panel to emphasize that this image’s species tag needs to be reviewed with extra care.
  • ‘Unidentified’ and ‘Tag Needs Review’ tags: Images labelled as ‘Unidentified’ and ‘Tag Needs Review’ are either unidentifiable species and shapes that triggered the camera or identifications that taggers are not 100% sure of, respectively. These tags should be double-checked to verify that they have been applied correctly.
  • Tags with the ‘Tag Needs Review’ flag that cannot be identified but still possess identifying features should remain as ‘Tag Needs Review.’ However, if the species in the tag cannot be identified and there are no identifiable features, re-tag the species as ‘Unidentified’ and remove the ‘Tag Needs Review’ tag.

6.3.9 Location verification

Double-checking the location information is an easy but important task. It should be completed for all applicable tasks where reference signs were used during field set-up and/or pick-up activities. If this does not apply, please continue to the section on Tagging images.

If applicable, you can review the initial STAFF/SETUP photos and compare the location name on the reference sign against the photo labels on WildTrax. If the STAFF/SETUP portion of the auto-tagger and ‘Hide Auto-’ were selected in the camera project settings, STAFF/SETUP photos will be automatically hidden from view in the image tagging page. If these settings apply, the filter will need to be adjusted to complete location validation.

  1. Click on a task to be taken to the tagging page.

  2. On the filter panel, uncheck the filter that hides auto-tagged images so the STAFF/SETUP photos become visible.

Notify the project admin if there are any mismatches.

7 Point Count Projects 🐩

Point counts are a methodology used to survey animals, mainly birds. It involves an observer standing at a pre-determined location for a specific period of time, counting the individuals they detect. For birds specifically, detections can be either aural or visual. To account for error in detecting a species, either because it didn’t sing or wasn’t observed during the survey, or because the observer misses or misidentifies it, distance estimation and duration intervals are used. All these attributes are what define a survey in WildTrax. When the observer detects a species, the detection is assigned a distance band, the duration interval in which it was detected, a species and an abundance (or count of individuals). Each detection then becomes an observation.

The majority of the point count data available in WildTrax comes through a collaboration with the Boreal Avian Modelling Centre, which supports the conservation of North America’s boreal birds through high-quality, data-driven and collaborative science and helps users harmonize their point count data to a standard that can be used with many other data sets, including ARUs.

7.1 Point count project management

To create a new project, click the Add Point Count Project button in the project dashboard under the point count tab. You must be an administrator of at least one organization within WildTrax in order to create a project. The main settings panel contains general information fields and a description of the project.

The general life cycle of a point count project includes:

  • Creating the project and adding users
  • Uploading surveys
  • Publishing the project

  • Project Title: The full name used to identify the project, as defined by the admin.
  • Organization: The name of the organization.
  • Year: The year the data was collected.
  • Data Source: A field used to specify the original source of the data; this field should be entered if you are importing data from an outside source (e.g., previously tagged data).
  • Organization description: A short description of the organization.
  • Status: The status of the project (i.e., “Active,” “Test Only,” “Published – Private,” “Published – Public,” “Published – Map+Report Only,” or “Published – Map Only”).

Once you save your project details, you will be able to manage your users. Select the User Assignment tab to add and view current administrator and read-only project members.

7.1.1 Syncing point count surveys

You can synchronize your point count data with WildTrax in order to batch upload surveys, or to download, modify and re-upload surveys within the system. The format of the sync columns can be found in Table 1.

Table 1
wt_get_sync_columns("download-point-count-by-project-id", project = 804)

Batch upload processes in WildTrax support add and update operations only; deletions are not allowed. For example, if you accidentally upload an empty CSV, no existing data will be deleted.

You can upload your point count data directly to a project in WildTrax. Here are the required CSV fields:

# DT provides the datatable() function used below
library(DT)

point_count_data <- data.frame(
  Field = c(
    "Location", "Point Count Date/Time", "Task Method", "Task Distance", 
    "Observer ID", "Species Common Name", "Distance Detection", 
    "Detection Time", "Abundance", "Detection Seen", 
    "Detection Heard", "Task Comments"
  ),
  Description = c(
    "Name of the location.",
    "The date and time of the point count in the format YYYY-MM-DD HH:MM:SS.",
    "Method used for processing recordings: SPM = 1 tag per individual per species per minute, SPT = 1 tag per individual per species per task, NONE = Other methods.",
    "Distance method used during the point count.",
    "Unique numeric identifier for the observer who processed the image, recording, or conducted the point count.",
    "Common name of the observed species.",
    "Distance at which the individual/species was detected.",
    "Time(s) within a recording or point count when a tagged individual/species was detected, formatted as HH:MM:SS.",
    "Number of individuals observed",
    "Was the observation made visually?",
    "Was the observation made acoustically?",
    "Observer's comments on the task or point count."
  )
)

# Render the interactive table
datatable(
  point_count_data,
  options = list(pageLength = 12, autoWidth = TRUE),
  rownames = FALSE,
  caption = "Key Fields for Point Count Data"
)

To upload and sync your survey data:

  • Click the Manage button and select Upload Surveys.
  • In the pop-up window, select Choose a CSV file to upload your survey data.
  • Once uploaded, the QA Project Data button will turn green. Click it to verify that your data adheres to WildTrax standards. Note: This process may take a few minutes depending on the file size.

Survey data automatically creates new locations if they don’t already exist in the organization and links existing locations to spatial metadata if names match. Locations can be merged as needed, and visits for single-visit surveys can be generated directly within the organization, ensuring seamless integration of survey data into WildTrax while maintaining metadata consistency.
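As a minimal sketch, a survey CSV can be assembled in R before upload; the column headers and values below are illustrative (download the template CSV from the project for the exact headers WildTrax expects).

# A minimal sketch of a survey upload CSV; headers and values are illustrative.
surveys <- data.frame(
  location              = "ABMI-0001-NE",
  point_count_date_time = "2024-06-12 05:30:00",
  task_method           = "SPT",
  task_distance         = "0m-INF",
  observer_id           = 123,
  species_common_name   = "Ovenbird",
  distance_detection    = "50-100m",
  detection_time        = "00:02:15",
  abundance             = 1,
  detection_seen        = "f",
  detection_heard       = "t",
  task_comments         = "",
  stringsAsFactors      = FALSE
)

write.csv(surveys, "point_count_surveys.csv", row.names = FALSE, na = "")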

7.1.2 Adding location information

7.2 Viewing point count data

The point count survey page outlines the observations and many other attributes of the survey. The header contains information on the duration method, distance method, location name, and date and time of the survey. It also contains the weather panel indicating the local weather conditions. You can also use the project menu at any time to access project-level functions. Species are listed on the left side of the page with the distance and time interval bins in which they were detected. A map of the location where the survey took place is also available.

7.3 Publishing a point count project

To publish a point count project, first ensure that all surveys have been added to the project and that all associated metadata is prepared as intended (e.g., check location spatial coordinates to ensure everything is there). Next, open the project settings and go to the Status field. Refer to Publishing and sharing data for which data privacy level is best for you.