For information on how to configure value stream analytics (VSA) in GitLab, see our [analytics documentation](../user/analytics/value_stream_analytics.md).
|Feature|Group level (licensed)|Project level (licensed)|Project level (FOSS)|
|-|-|-|-|
|Create custom value streams|Yes|No, only one value stream (default) is present with the default stages|No, only one value stream (default) is present with the default stages|
|Cycle time and lead time summary (Key metrics)|Yes|Yes|No|
|New issues, commits, and deploys (Key metrics)|Yes, excluding commits|Yes|Yes|
|Uses aggregated backend|Yes|No|No|
|Date filter behavior|Filters items [finished within the date range](https://gitlab.com/groups/gitlab-org/-/epics/6046)|Filters items by creation date.|Filters items by creation date.|
|Authorization|At least reporter|At least reporter|Can be public.|
## VSA core domain objects
### Stages
A stage represents an event pair (start and end events) with additional metadata, such as the name
of the stage. Stages are configurable by the user within the pairing rules defined in the backend.
For example, a stage that measures the time between merge request creation and merge could be configured as follows:

- Start event identifier: Merge request creation time.
- Start event column: uses the `merge_requests.created_at` timestamp column.
- End event identifier: Merge request merge time.
- End event column: uses the `merge_request_metrics.merged_at` timestamp column.
- Stage event hash ID: a calculated hash for the pair of start and end event identifiers (see the sketch after this list).
  - If two stages have the same configuration of start and end events, then their stage event hash IDs are identical.
  - The stage event hash ID is later used to store the aggregated data in partitioned database tables.
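As a minimal sketch of the idea (the actual calculation lives in the backend models and may differ), the hash ID can be pictured as a digest of the two event identifiers:

```ruby
require 'digest'

# Illustrative only: derive a stable identifier from the start/end event pair.
# Any two stages configured with the same pair produce the same hash, so their
# aggregated rows can be stored under the same stage event hash ID.
def stage_events_hash(start_event_identifier, end_event_identifier)
  Digest::SHA256.hexdigest("#{start_event_identifier}-#{end_event_identifier}")
end

stage_events_hash(:merge_request_created, :merge_request_merged)
# Returns the same value for every stage that uses this event pair.
```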
Historically, value stream analytics defined [7 stages](https://gitlab.com/gitlab-org/gitlab/-/blob/master/lib/gitlab/analytics/cycle_analytics/default_stages.rb), which are always available to end users regardless of the subscription.
### Value streams
Value streams are container objects for the stages. There can be multiple value streams per group.
In some cases, defining the `timestamp_projection` method is not enough. The calculation query should know which table contains the timestamp expression. Each `Event` class is responsible for making modifications to the calculation query to make the `timestamp_projection` work. This usually means joining an additional table.
For example, an event that uses the `first_mentioned_in_commit_at` column on the `issue_metrics` table as its timestamp expression might be implemented along these lines (a sketch; the `apply_query_customization` hook name stands for the backend's query customization step and may differ):
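```ruby
# Sketch of an Event class whose timestamp lives on the issue_metrics table.
def object_type
  Issue
end

def timestamp_projection
  IssueMetrics.arel_table[:first_mentioned_in_commit_at]
end

def apply_query_customization(query)
  # Join the issue_metrics table so the projection above can be resolved.
  query.joins(:metrics)
end
```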
Some start/end event pairs are not "compatible" with each other. For example:
- "Issue created" to "Merge Request created": The event classes are defined on different domain models, the `object_type` method is different.
- "Issue closed" to "Issue created": Issue must be created first before it can be closed.
- "Issue closed" to "Issue closed": Duration is always 0.
The `StageEvents` module describes the allowed `start_event` and `end_event` pairings (`PAIRING_RULES` constant). If a new event is added, it needs to be registered in this module.
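For illustration, the pairing rules can be pictured as a hash that maps each start event class to the end event classes it may be combined with. The class names below are abbreviated and illustrative:

```ruby
# Illustrative shape only; the real constant lists the actual event classes.
PAIRING_RULES = {
  StageEvents::IssueCreated => [
    StageEvents::IssueClosed,
    StageEvents::IssueFirstMentionedInCommit
  ],
  StageEvents::MergeRequestCreated => [
    StageEvents::MergeRequestMerged
  ]
}.freeze
```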
The [original implementation](https://gitlab.com/gitlab-org/gitlab/-/issues/847) of value stream analytics defined 7 stages. These stages are always available for each parent (group or project); however, altering them is not possible.
To make things efficient and reduce the number of records created, the default stages are expressed as in-memory objects (not persisted). When the user creates a custom stage for the first time, all the stages are persisted. This behavior is implemented in the value stream analytics service objects.
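A simplified sketch of that flow, assuming hypothetical `create_custom_stage!` and `default_stage_params` helpers (the real logic lives in the service objects):

```ruby
# Illustrative only: default stages exist purely in memory until the first
# custom stage is created, at which point they are persisted so that ordering
# and later edits have database rows to work with.
def create_custom_stage!(value_stream, custom_stage_params)
  if value_stream.stages.empty?
    default_stage_params.each { |params| value_stream.stages.create!(params) }
  end

  value_stream.stages.create!(custom_stage_params)
end
```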
`DataCollector` is the central point where the data is queried from the database. The class always operates on a single stage and bundles the query building, the duration calculations (such as the median), and the loading of stage records.
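As a rough sketch, exercising the data collector for a single in-memory stage could look like the following (class names and the exact parameter list may differ slightly):

```ruby
stage = Analytics::CycleAnalytics::Stage.new(
  start_event_identifier: :merge_request_created,
  end_event_identifier: :merge_request_merged
)

data_collector = Gitlab::Analytics::CycleAnalytics::DataCollector.new(
  stage: stage,
  params: { from: 1.year.ago, current_user: current_user }
)

data_collector.median.seconds                     # median duration for the stage
data_collector.records_fetcher.serialized_records # individual stage records
```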
To support the aggregated value stream analytics backend, these classes were reimplemented within the [`Aggregated`](https://gitlab.com/gitlab-org/gitlab/-/tree/master/lib/gitlab/analytics/cycle_analytics/aggregated) namespace.
- Rails Controller (`Analytics::CycleAnalytics` module): Value stream analytics exposes its data via JSON endpoints, implemented within the `analytics` workspace. Configuring the stages is also implemented as JSON endpoints (CRUD).
[Project VSA](../user/analytics/value_stream_analytics.md) is available for all users and:
- Includes a mixture of key and DORA metrics based on the tier.
- Uses the set of [default stages](#default-stages).
[Group VSA](../user/group/value_stream_analytics/index.md) is only available for licensed users and extends project VSA to include:
- An [overview stage](https://gitlab.com/gitlab-org/gitlab/-/issues/321438).
- The ability to create custom value streams.
The group and project level VSA frontends are both built with Vue and Vuex and follow a similar pattern:
- The `index.js` file extracts any URL query parameters, creates the Vue app and Vuex store, and dispatches an `initialize` Vuex action.
- The `base.vue` file is used to render the main components for each page: metrics, filters, charts, and the stage table.
The group VSA Vuex store makes use of [Vuex modules](https://vuex.vuejs.org/guide/modules.html) to separate some of the state and logic used for rendering the charts.
### Shared components
Parts of the UI, such as the stage table and path, are shared between project VSA and group VSA. These shared components live in the project VSA directory `app/assets/javascripts/cycle_analytics/components` and are included in the group-level VSA where needed.
All the frontend code for group-level features is located in `ee/app/assets/javascripts/analytics/cycle_analytics/components`.
Since we have many events and possible pairings, testing each pairing is not possible. The rule is to have at least one test case for each `Event` class.
Writing a test case for a stage using a new `Event` can be challenging, since data must be created for both events. To make this a bit simpler, each test case must be implemented in `data_collector_spec.rb`, where the stage is tested through the `DataCollector`. Each test case is turned into multiple tests that cover the different calculations the `DataCollector` implements (such as the median duration and the stage record list).
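A hedged sketch of such a test case, using hypothetical factories and setup (the real spec relies on its own shared examples and helpers):

```ruby
# Illustrative only: create data for both events, then assert the calculated
# duration through the DataCollector.
it 'measures the time from issue creation to first mention in a commit' do
  issue = create(:issue, project: project, created_at: 5.days.ago)
  issue.metrics.update!(first_mentioned_in_commit_at: 1.day.ago)

  data_collector = Gitlab::Analytics::CycleAnalytics::DataCollector.new(
    stage: stage, # configured with the issue_created / issue_first_mentioned_in_commit pair
    params: { from: 1.month.ago, current_user: user }
  )

  expect(data_collector.median.seconds).to be_within(1.hour).of(4.days)
end
```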
The VSA frontend is tested extensively on two different levels (integration, unit):
- End-to-end integration tests using a real backend via Capybara and RSpec.
- Jest frontend tests with pre-generated data fixtures.
## Development setup and testing
You can run value stream analytics locally with the [GDK](https://gitlab.com/gitlab-org/gitlab-development-kit). By default, you can view the project-level (FOSS) version of the feature.
If your GDK is up and running, you can run the seed script to generate some data: