Release notes for Label Studio Enterprise
This is a list of noteworthy new features and bug fixes for the latest version of Label Studio Enterprise.
For the release notes for Label Studio Community Edition, see the release notes on GitHub.
This version of Label Studio Enterprise introduces the following new features:
- Write a custom agreement metric to evaluate annotator and model performance according to metrics that you define. See how to write a custom agreement metric for more.
- Export a snapshot of your data labeling project, specifying what to include in the export. See more details in the export documentation.
- Set up webhooks to send events to a configured URL to take automated action in your model pipeline. See how to set up webhooks.
- Perform dynamic ML-assisted labeling with interactive preannotations. See more about ML-assisted labeling with interactive preannotations.
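As an illustration of the custom agreement metric feature, the sketch below computes a simple exact-match agreement between the choices selected in two annotations. The function name, signature, and annotation dictionary shape are assumptions for illustration; consult the custom agreement metric documentation for the exact interface your version expects.

```python
# Hypothetical sketch of a custom agreement metric; the function name and
# annotation format here are illustrative, not Label Studio's exact API.

def choice_agreement(annotation_a, annotation_b):
    """Return the Jaccard overlap of the choice values selected in two
    annotations: 1.0 for identical selections, 0.0 for disjoint ones."""
    def choices(annotation):
        # Collect every selected choice value from the annotation's results.
        values = set()
        for result in annotation.get("result", []):
            if result.get("type") == "choices":
                values.update(result["value"]["choices"])
        return values

    a, b = choices(annotation_a), choices(annotation_b)
    if not a and not b:
        return 1.0  # both empty: treat as perfect agreement
    return len(a & b) / len(a | b)
```

A metric like this could weight partial overlap differently, or compare other result types (labels, regions) instead of choices, depending on what "agreement" means for your project.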
This release also includes other important improvements.
- Create annotations from predictions by selecting tasks and using the drop-down menu options.
- Added the ability to remove reviewer and annotator assignments from tasks using the drop-down menu options for selected tasks.
- Enhanced filtering behavior to be more robust and to support filtering by reviewers, annotators, and fields in your task data.
- Added the ability to label tasks as they are displayed, allowing you to filter and sort your data and label tasks accordingly. See more in Filter or sort project data.
- Improved performance by reducing the time it takes to load tasks.
- Added the ability to manipulate regions when labeling images, such as selecting and moving multiple regions, duplicating regions, and more. See Advanced image labeling.
- Added new hotkeys to accelerate labeling.
- Added a dedicated `<Video>` tag and a …
- Added a `hint` parameter for the `<Label>` tag so that you can provide additional guidance to annotators.
- Remove annotations from a synced S3 bucket when an annotation is deleted.
- Manually sync annotations to a target storage bucket.
- Added the review status for annotations synced to a target storage bucket.
- Export specific tasks by ID.
- Scan bucket prefixes recursively to account for nested dataset storage.
- Specify Google Cloud Storage (GCS) credentials for target storage connections.
See more details in the cloud storage setup documentation.
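To clarify what recursive prefix scanning means: object keys under nested "directories" below the configured bucket prefix are picked up as well, not just keys directly under it. A minimal sketch of the idea over an in-memory key listing (the function and key layout are illustrative, not Label Studio's implementation):

```python
def scan_prefix(keys, prefix, recursive=True):
    """Return the object keys under `prefix`. In non-recursive mode, keep
    only keys directly under the prefix (no further '/' in the remainder)."""
    matches = []
    for key in keys:
        if not key.startswith(prefix):
            continue
        remainder = key[len(prefix):].lstrip("/")
        if recursive or "/" not in remainder:
            matches.append(key)
    return matches
```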
- Improved LDAP user management to support permission syncing at the workspace level and to disallow manual management of project members. See how to set up LDAP authentication for more.
- See information about your active organization on your user account and settings page, such as your organization ID and an overview of your project and annotation activity.
- Added the ability to switch between mean time and median time when reviewing how much time annotators took to annotate a task. See more details in the dashboard documentation.
- Added pagination for the projects page.
- Updated the result format of the API call for `/api/project/id/tasks` to be consistent with the format returned by …
- Added the option to hide the skip button from annotators.
- Added the ability to retrieve predictions for all tasks in a project using the API.
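A sketch of how retrieving predictions for a project might look from Python using only the standard library. The endpoint path and query parameter are assumptions for illustration; check the Label Studio API reference for the exact route and authentication scheme your version exposes.

```python
import json
import urllib.request

# Hypothetical sketch: the route and query parameter below are assumptions,
# not a confirmed Label Studio endpoint.

def predictions_url(base_url, project_id):
    """Build the URL for listing predictions in a project."""
    return f"{base_url.rstrip('/')}/api/predictions?project={project_id}"

def fetch_predictions(base_url, project_id, token):
    """Fetch predictions for all tasks in a project as parsed JSON."""
    request = urllib.request.Request(
        predictions_url(base_url, project_id),
        headers={"Authorization": f"Token {token}"},  # API token auth
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())
```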
This release of Label Studio Enterprise includes fixes for the following bugs and other improvements.
- In some cases, instructions did not appear for annotators before labeling.
- Annotators logging in for the first time with LDAP accounts saw a runtime error.
- The “Review Finished” modal appeared before annotation reviews were complete.
- Using the Naive matching metric for brush annotation projects led to unexpected behavior when saving annotations.
- Creating users using the API did not work as expected.
- Specifying the `completed_by` user when creating an annotation with the API did not work as expected.
- There was an issue with the …
- There was an issue with labeling only selected tasks.
- Brush strokes did not change size when zooming in on an image for labeling.
- Improved the cards that display to reviewers and annotators.
- Submitting predictions as annotations did not work as expected.
- Hotkeys stopped working after submitting an annotation.
- The `selected` parameter did not work properly for the …
- There was an issue submitting and loading annotations with relations.
- Updating an annotation in the review stream did not work as expected.