Release notes for Label Studio Enterprise
This is a list of noteworthy new features and bug fixes for the latest version of Label Studio Enterprise.
For Label Studio Community Edition, see the release notes on GitHub.
New features
This version of Label Studio Enterprise introduces the following new features:
- Write a custom agreement metric to evaluate annotator and model performance according to metrics that you define. See how to write a custom agreement metric for more.
- Export a snapshot of your data labeling project, specifying what to include in the export. See more details in the export documentation.
- Set up webhooks to send events to a configured URL so that you can take automated action in your model pipeline. See how to set up webhooks, and the receiver sketch after this list.
- Perform dynamic ML-assisted labeling with interactive preannotation. See more about ML-assisted labeling with interactive preannotations.
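As a rough illustration of the webhooks item above, here is a minimal sketch of a receiver that accepts Label Studio event payloads. The port, URL, and handling of the action field are assumptions for the example, and the retraining step is hypothetical; see the webhook setup and event reference pages for the actual payload format.

```python
# Minimal sketch of a webhook receiver for Label Studio events, assuming
# Label Studio is configured to POST JSON payloads to http://localhost:8080.
# The "action" field name follows the webhook event reference; any downstream
# step (for example, kicking off retraining) is a hypothetical placeholder.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        action = payload.get("action", "unknown")
        print(f"Received Label Studio event: {action}")
        if action in ("ANNOTATION_CREATED", "ANNOTATION_UPDATED"):
            # Placeholder: trigger the next step in your model pipeline here,
            # for example by enqueueing the payload for retraining.
            pass
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()
```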
This release also includes other important improvements.
Data manager improvements
- Create annotations from predictions by selecting tasks and using the drop-down menu options.
- Added the ability to remove reviewer and annotator assignments from selected tasks using the drop-down menu options.
- Enhanced filtering behavior to be more robust and to support filtering by reviewers, annotators, and fields in your task data.
- Added the ability to label tasks in the order they are displayed, so you can filter and sort your data and then label tasks in that order. See more in Filter or sort project data.
- Improved performance by reducing the time it takes to load tasks.
Labeling and tag improvements
- Added the ability to manipulate regions when labeling images, such as selecting and moving multiple regions, duplicating regions, and more. See Advanced image labeling.
- Added new hotkeys to accelerate labeling.
- Added a dedicated <Video> tag and a <Number> tag.
- Added a hint parameter for the <Label> tag so that you can provide additional guidance to annotators (see the example configuration after this list).
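As a rough sketch of how a labeling configuration might use the new <Number> tag and the hint parameter on <Label>, the snippet below creates a project through the REST API. The host, API token, project title, and the specific attribute values are placeholders chosen for illustration; check the tags reference for the full attribute set, and note that the <Video> tag follows the same pattern with video data.

```python
# Sketch: create a project whose labeling config uses the new <Number> tag and
# the hint parameter on <Label>. The host, token, and attribute values are
# placeholders for illustration only.
import requests

API_URL = "http://localhost:8080"
API_TOKEN = "YOUR_API_TOKEN"  # placeholder

LABEL_CONFIG = """
<View>
  <Image name="image" value="$image"/>
  <RectangleLabels name="bbox" toName="image">
    <Label value="Defect" hint="Only mark visible surface defects"/>
    <Label value="Scratch" hint="Thin linear marks only"/>
  </RectangleLabels>
  <Number name="severity" toName="image" min="1" max="5"/>
</View>
"""

response = requests.post(
    f"{API_URL}/api/projects",
    headers={"Authorization": f"Token {API_TOKEN}"},
    json={"title": "Defect review", "label_config": LABEL_CONFIG},
)
response.raise_for_status()
print("Created project with ID:", response.json().get("id"))
```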
Import and export improvements
- Remove annotations from a synced S3 bucket when an annotation is deleted.
- Manually sync annotations to a target storage bucket.
- Added the review status for annotations synced to a target storage bucket.
- Export specific tasks by ID (see the export sketch after this list).
- Scan bucket prefixes recursively to account for nested dataset storage.
- Specify Google Cloud Storage (GCS) credentials for target storage connections.
See more details in the cloud storage setup documentation.
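As a rough sketch of exporting specific tasks through the API, the snippet below calls the project export endpoint. The ids[] query parameter, host, token, and project ID are assumptions for the example, so check the export documentation for the exact parameters your version supports.

```python
# Sketch: export two specific tasks as JSON via the project export endpoint.
# Assumption: an ids[] query parameter filters the export to those task IDs;
# PROJECT_ID, host, and token are placeholders.
import requests

API_URL = "http://localhost:8080"
API_TOKEN = "YOUR_API_TOKEN"  # placeholder
PROJECT_ID = 1

response = requests.get(
    f"{API_URL}/api/projects/{PROJECT_ID}/export",
    headers={"Authorization": f"Token {API_TOKEN}"},
    params={"exportType": "JSON", "ids[]": [123, 456]},  # ids[] is an assumption
)
response.raise_for_status()
with open("export.json", "wb") as f:
    f.write(response.content)
```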
Assorted other improvements
- Improved LDAP user management to support syncing permissions at the workspace level and to disallow manual management of project members. See how to set up LDAP authentication for more.
- See information about your active organization on your user account and settings page, such as your organization ID and an overview of your project and annotation activity.
- Added the ability to switch between mean time and median time when reviewing how much time annotators took to annotate a task. See more details in the dashboard documentation.
- Added pagination for the projects page.
- Updated the result format of the API call for /api/project/id/tasks to be consistent with the format returned by /api/tasks/id.
- Added the option to hide the skip button from annotators.
- Added the ability to retrieve predictions for all tasks in a project using the API (see the sketch after this list).
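As a rough sketch of reading predictions for a project's tasks through the API, the snippet below pages through the tasks endpoint mentioned above and collects the predictions attached to each task. The pagination parameters and the shape of the response are assumptions here, so check the Backend API reference for your version.

```python
# Sketch: collect predictions for every task in a project.
# Assumptions: /api/projects/<id>/tasks returns task objects that include a
# "predictions" field and supports page/page_size pagination; host, token,
# and project ID are placeholders.
import requests

API_URL = "http://localhost:8080"
API_TOKEN = "YOUR_API_TOKEN"  # placeholder
PROJECT_ID = 1

headers = {"Authorization": f"Token {API_TOKEN}"}
predictions, page = [], 1
while True:
    resp = requests.get(
        f"{API_URL}/api/projects/{PROJECT_ID}/tasks",
        headers=headers,
        params={"page": page, "page_size": 100},
    )
    if resp.status_code == 404:  # past the last page (assumption)
        break
    resp.raise_for_status()
    tasks = resp.json()
    if not tasks:
        break
    for task in tasks:
        predictions.extend(task.get("predictions", []))
    page += 1

print(f"Collected {len(predictions)} predictions")
```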
Bug fixes
This release of Label Studio Enterprise includes fixes for the following bugs and other improvements.
- Instructions were not appearing for annotators before labeling in some cases.
- Annotators logging in for the first time with LDAP accounts saw a runtime error.
- The “Review Finished” modal appeared before annotation reviews were complete.
- Using the Naive matching metric for brush annotation projects led to unexpected behavior when saving annotations.
- Creating users using the API did not work as expected.
- Specifying the completed_by user when creating an annotation with the API did not work as expected.
- There was an issue with the Paragraphs tag.
- There was an issue with labeling only selected tasks.
- Brush strokes did not change size when zooming in on an image for labeling.
- Improved the cards that display to reviewers and annotators.
- Submitting predictions as annotations did not work as expected.
- Hotkeys stopped working after submitting an annotation.
- The selected parameter did not work properly for the <Label> tag.
- There was an issue submitting and loading annotations with relations.
- Updating an annotation in the review stream did not work as expected.
Couldn't find what you were looking for? Let us know on Slack. If you found an error, you can file an issue on GitHub.
