New upstream version 13.0.3

Pirate Praveen 2020-05-30 21:06:31 +05:30
parent 44f8ec8b25
commit 5736b18dda
74 changed files with 1316 additions and 265 deletions

View file

@ -1,5 +1,344 @@
Please view this file on the master branch; on stable branches it's out of date.
## 13.0.2 (2020-05-28)
- No changes.
## 13.0.1 (2020-05-27)
### Security (3 changes)
- Change the mirror user along with pull mirror settings.
- Allow only users with a verified email to be member of a group when the group has restricted membership based on email domain.
- Do not auto-confirm email in Trial registration.
## 13.0.0 (2020-05-22)
### Security (1 change)
- Apply CODEOWNERS validations to web requests. !31283
### Removed (1 change)
- Remove deprecated route for serving full-size Design Management design files. !30917
### Fixed (77 changes, 5 of them are from the community)
- Enforce CODEOWNER rules for renaming of files. !26513
- Preserve date filters in value stream and productivity analytics. !27102
- Fix 404s when clicking links in full code quality report. !27138
- Remove Admin > Settings > Templates link from sidenav when insufficient license. !27172
- Geo - Does not write to DB when using 2FA via OTP for admin mode. !27450
- Use group parent for subgroup labels requests. !27564
- Hidden stages should not appear in duration chart. !27568
- Maven packages API allows HEAD requests to package files when using Amazon S3 as an object storage backend. !27612
- Prevent renaming a locked file. !27623
- Use absolute URLs to ensure links to the source of SAST vulnerabilities resolve. !27747
- Prevent issues from being promoted twice. !27837
- Display error message in custom metric form validation when prometheus URL is blocked. !27863
- Append inapplicable rules when creating MR. !27886
- Fix passing project_ids to Value Stream Analytics. !27895
- Support inapplicable rules when creating MR. !27971
- Fix VSA label dropdown limit. !28073
- Fix analytics group and project loading spinners styling. !28094
- Include subgroups when populating group security dashboard. !28154
- Hide ability to create alert on custom metrics dashboard. !28180
- Fix epics not preserving their state on Group Import. !28203
- Fix invalid scoped label documentation URL when rendered in Markdown. !28268
- Fix create epic in tree after selecting Add issue. !28300
- Fix HTTP status code for Todos API when an epic cannot be found. !28310
- Allow SCIM to create an identity for an existing user. !28379
- Perform Geo actions on projects only. !28421
- Allow unsetting "Required pipeline configuration" for CI/CD. !28432
- Geo: Self-service framework does not associate geo_event_log row to geo_event. !28457
- Return overridden property in API response. !28521
- Fix billed users id and count from shared group. !28645
- Hide Request subtitle when Threat Monitoring has no data. !28760
- Fix repository settings page loading issues for some imported projects with pull mirroring. !28879
- Fix group_plan in group member data for system hooks. !29013
- Fix imageLoading bug when scrolling back to design. !29223
- Fix Elasticsearch rollout stale cache bug. !29233
- Resolve "Creating an annotation on the design that is bigger than screen size is broken". !29351
- Fix incorrect dropdown styling for embedded metric charts. !29380 (Gilang Gumilar)
- Close all other sidebars when the boards list settings sidebar is opened. !29456
- Fix incorrect repositioning of design comment pins when mouse leaves viewport. !29464
- Fix board edit weight values 0 or None. !29606
- Fix lack of padding on design view notes sidebar. !29654
- Remove duplicate QA attribute for burndown charts. !29719
- Add validation to Conan recipe that conforms with Conan.io. !29739 (Bola Ahmed Buari)
- Fix caching performance and some cache bugs with elasticsearch enabled check. !29751
- Fix wiki indexing for imported projects. !29952
- Fix filter todos by design. !29972 (Gilang Gumilar)
- Fix the error message on Security & Operations dashboard by fixing the API response. !30047
- Fix showing New requirement button to unauthenticated user. !30085 (Gilang Gumilar)
- Ignore invalid license_scanning reports. !30114
- Sort dependency scanning reports before merging. !30190
- Fix typos on Threat Management page. !30218
- Fix infinite loading spinner when visiting non-existent design. !30263
- Fix third party advisory links. !30271
- Fix dismissal state not being updated. !30503
- Add sort and order for policy violations. !30568
- If a parent group enforces SAML SSO, when adding a member to that group, its subgroups, or its projects, the autocomplete shows only users who have an identity with the SAML provider of the parent group. !30607
- Geo: Fix empty synchronisation status when nothing is synchronised. !30710
- Don't hide Open/Closed Icon for Issue. !30819
- No seat link sync for licenses without expiration. !30874
- Fix project insights page when projects.only parameter is used. !30988
- Fixes styling on vulnerability counts. !31076
- Hide child epic icon on Roadmap for accounts without child epics support. !31250
- Add license check for the 'send emails from Admin area' feature. !31434
- Avoid saving author object in audit_events table. !31456
- Fix 500 errors caused by globally searching for scopes which cannot be used without Elasticsearch. !31508
- Show .Net license scan results as links. !31552
- Fix incorrect notice text on insights page. !31570
- Add license restriction to HealthStatus. !31571
- Fix issues search to include the epic filter ANY. !31614
- Reduce 413 errors when making bulk indexing requests to Elasticsearch. !31653
- Fix CI minutes notification when unauthenticated. !31724
- Add LFS workhorse upload path to allowed upload paths. !31794
- Relax force pull mirror update restriction. !32075
- Show correct last successful update timestamp. !32078
- Fix missing dismissed_by field. !32147
- Fix empty unit display of time metrics. !32388
- Fixes file row commits not showing for certain projects.
- Validates Elasticsearch URLs are valid HTTP(S) URLs. (mbergeron)
### Deprecated (2 changes)
- Document planned deprecation of 'marked_for_deletion_at' attribute in Projects API in GitLab 13.0. !28993
- Document planned deprecation of 'projects' and 'shared_projects' attributes in Groups API in GitLab 13.0. !29113
### Changed (64 changes, 1 of them is from the community)
- Allow Value Stream Analytics custom stages to be manually ordered. !26074
- Create more intuitive Popover information for Geo Node Sync Status. !27033
- Make "Value Stream" the default page that appears when clicking the group-level "Analytics" sidebar item. !27277 (Gilang Gumilar)
- Update renewal banner messaging and styling. !27530
- Hide company question for new account signups. !27563
- Create more intuitive Verification Popover for Geo Node Syncs. !27624
- Disabled Primary Node Removal button when removal is not allowed. !27836
- Move issue/epic hierarchy items to a tooltip. !27969
- Update copy for Adding Licenses. !27970
- Sort events alphabetically on Value Stream Analytics Stage form. !28005
- Return 202 Accepted status code when fetching an incomplete Vulnerability Export from the API. !28314
- Use dropdown to change health status. !28547
- Change GraphQL arguments in group.timelogs query to use startTime and endTime. !28560
- Enable issue board focus mode for all tiers on Enterprise Edition. !28597
- Clarify detected license results in merge request: Group licenses by status. !28631
- Deleting packages from a deeper paginated page will no longer return you to the first page. !28638
- Sort Dependency List by severity by default. !28654
- Make Dependency List pipeline subheading more succinct. !28665
- Enable export issues feature for all tiers on Enterprise Edition. !28675
- Remove scoped labels documentation link. !28701
- Change summary text copy for license-compliance MR widget. !28732
- Update License Compliance docs url. !28853
- Differentiate between empty and disabled Geo sync. !28963
- Move deploy keys section back to repository settings. !29184
- Hide Pipeline Security tab from reporters. !29334
- Make Geo Selective Sync More Clear from the Node Details view. !29596
- Fix checkbox alignment and responsive word-break for secure table rows. !29659
- Add tracking to Subscription banner. !29735
- SSO Enforcement requires sign in within 7 days. !29786
- Geo - Better Out of Date Errors. !29800
- Replace non-standard term in Issues Analytics chart labels. !29810
- Change default icon for neutral-state items in merge request widget. !30008
- Remove tasks_by_type_chart feature flag. !30034
- Modify existing out of runner minutes banner to handle 3 different warning levels. !30088
- Resolve "Allow multiple screenshots to be uploaded in copy paste". !30152
- Improve the running out of CI minutes email notification. !30188
- Change 'What's New' dropdown item URL. !30198
- Link Buy additional minutes button straight to funnel. !30248
- Improve design management image loads by avoiding refetching image urls for designs. !30280
- Change logic to find the current license. !30296
- Link license management button to license compliance policies section. !30344
- Sort license policy violations first. !30564
- Change UI requirements route from project/requirements to project/requirements_management/requirements. !30583
- Change default concurrency of merge trains to twenty. !30599
- Clarify security report findings in merge request widget. !30688
- Consolidate epic tree buttons. !30816
- Increase ProcessBookkeepingService batch to 10_000. !30817
- Modify GraphQL mutation for adding projects to Instance Security Dashboard to support only single project id. !30865
- Add subscription banner to group/subgroup pages. !30883
- Geo - Add Empty States. !31010
- Remove customizable_cycle_analytics feature flag. !31189
- Geo - Bring settings UI in-line with other Geo Views. !31257
- Cards in Epic Tree have two lines of content. !31300
- Add request information to vulnerability-detail modal. !31422
- Move registrations progress bar to a generic place and make it configurable. !31484
- Reduce epic health status noise in epic tree. !31555
- Modify GraphQL mutation for removing project from Instance Security Dashboard to use id instead of projectId. !31575
- Enable onboarding issues experiment on Welcome screen. !31656
- Expiring Subscription banner has plan-specific text. !31777
- Geo - Better visualize type of node on form. !31784
- Improve tooltips for compliance framework badges. !31883
- Allow developers to see CI minutes notifications. !31937
- Log Audit Event when Group SAML adds a user to a group. !32333
- Audit logs now use filtered search.
### Performance (13 changes, 1 of them is from the community)
- Geo - Improve query to retrieve Job Artifacts, LFS Objects, and Uploads with files stored locally. !24891
- Geo - Use bulk insert to improve performance of RegistryBackfillService. !27720
- Improve Group Security Dashboard performance. !27959
- Fix N+1 queries in Audit Events controllers. !28399
- Move Clusters Application CertManager to batch counting. !28771
- Move Clusters Application Helm to batch counting. !28995
- Move Clusters Application Ingress to batch counting. !28999
- Move Clusters Application Knative to batch counting. !29003
- Preload path locks for TreeSummary. !29949
- Debounce pull mirror invocation. !30157
- Advanced Search API: Eager load more issue associations to reduce N+1 queries. !30233
- Make the ElasticCommitIndexer idempotent to enable job de-duplication. !31500 (mbergeron)
- Use data-interchange format based on .ndjson for Project import and export. !31601
### Added (129 changes, 13 of them are from the community)
- Support setting threshold for browser performance degradation through CI config. !21824
- Restrict page access when restricted level is public. !22522 (briankabiro)
- Add ability to expand epics in roadmap. !23600
- Prefer smaller image size for design cards in Design Management. !24828
- Warn users when they close a blocked issue. !25089
- Added setting to use a custom service desk email. !25240
- Add ability to explore zoomed in designs via click-and-drag. !25405
- Allow GMA groups to specify their own PAT expiry setting. !25963
- Add vulnerabilities field to QueryType. !26348
- On-demand Evidence creation. !26591
- Add API endpoint for new members' count in Group Activity Analytics. !26601
- Allow multiple root certificates for smartcard auth. !26812
- Add scanned URL count and link to scanned resources in DAST reports. !26825
- Add a button to export vulnerabilities in CSV format. !26838
- Add #resolved_on_default_branch to Vulnerability. !26906
- Migrate SAML to SCIM Identities. !26940
- Expose smaller sized Design Management design images in GraphQL. !26947
- Add Snowplow tracking for Container Registry events. !27001
- Geo - Support git clone/pull operations for repositories that are not yet replicated. !27072
- Add an indicator icon to issues on subepics when filtering by epic. !27212
- Add a count of the scanned resources to Security::Scan. !27260
- Track primary package file checksum counts. !27271
- Anonymize GitLab user/group names on Status Detail Pages. !27273
- Enable NetworkPolicy Statistics by default. !27365
- Separate approval setting entities into own class files. !27423 (Rajendra Kadam)
- Show custom 'media broken' icon for broken images in Design Management. !27460
- Display loading spinner when Design Card images are loading. !27475
- Updated link to Status Page docs. !27500
- Adds support for storing notes for vulnerabilities. !27515
- Add endpoints to fetch notes/discussions for vulnerability. !27585
- Support PyPi package upload. !27632
- Separate conan package entities into own class files. !27642 (Rajendra Kadam)
- Separate model only entities into own class files. !27665 (Rajendra Kadam)
- Add terraform report to merge request widget. !27700
- Add a DB column to allow GMA groups to specify their PAT expiry setting. !27769
- Separate user, project and entity helper modules into own class files. !27771 (Rajendra Kadam)
- Add rake task for reindexing Elasticsearch. !27772
- Separate group, member, group_detail and identity modules into own class files. !27797 (Rajendra Kadam)
- Indicate whether the alert is firing. !27825
- Support PyPi package installation. !27827
- Separate code review, design and group modules into own class files. !27860 (Rajendra Kadam)
- Add Health Status badge in Epic tree. !27869
- Separate nuget package entities into own class files. !27898 (Rajendra Kadam)
- Create issue link when creating issue from vulnerability. !27899
- Geo: Proxy SSH git operations for repositories that are not yet replicated. !27994
- Add pipeline statuses to Compliance Dashboard. !28001
- Add Usage Ping For Status Page. !28002
- Adds additional pipeline information to packages API result. !28040
- Allow to save issue health status. !28146
- Add a compliance framework setting to project. !28182
- Add project template for HIPAA Audit Protocol. !28187
- Create system note when health status is updated. !28232
- Pre-populate prometheus alert modal. !28291 (Gilang Gumilar)
- Usage ping for customized Value Stream Management stage count. !28308
- Add evidence to vulnerability details. !28315
- Added confidential column to epics table. !28428
- Geo GraphQL API: Add geoNode at root. !28454
- Refactor duplicate specs in ee user model. !28513 (Rajendra Kadam)
- Refactor duplicate specs in ee group model. !28538 (Rajendra Kadam)
- Lazy-loading design images with IntersectionObserver. !28555
- Create group-level audit event for Group SAML SSO sign in. !28575
- Renewal banner has auto-renew specific messaging. !28579
- Automatically embed metrics in issues for alerts from manually configured Prometheus instances. !28622
- Epic tree move child with drag and drop. !28629
- Show the number of scanned resources in the DAST vulnerability report. !28718
- Expose 'marked_for_deletion_on' attribute in Projects API. !28754
- Show policy violations in license compliance. !28862
- Migrate project snippets to the ghost user when the user is deleted. !28870 (George Thomas @thegeorgeous)
- Provide instance level setting to enable or disable 'default branch protection' at the group level for group owners. !28997
- Add spentAt field to TimelogType and deprecate date field. !29024
- Add hierarchy depth to roadmaps. !29105
- Add mutation to Dismiss Vulnerability GraphQL API. !29150
- Add "whats new" item to help dropdown. !29183
- Expose hasParent GraphQL field on epic. !29214
- Display indicator to rule name in approval rules. !29315
- Add status page url to setting form in UI. !29359
- Enable Standalone Vulnerabilities feature for improving Vulnerability Management. !29431
- Add `group_id` column into vulnerability_exports table. !29498
- Enable creation on custom index with rake. !29598 (mbergeron)
- Replace pipeline quota with usage quotas for user namespaces. !29806
- Add GraphQL mutation to update limit metric settings on board lists. !29897
- Adds PyPi installation instructions to package details page. !29935
- Survey Responses landing page. !29951
- Add limit metric to list type. !30028
- Add GraphQL query for Instance Security Dashboard projects. !30064
- Adds PyPi tab to the packages list page. !30078
- Export an instance-level vulnerabilities report. !30079
- Add GraphQL mutation for adding projects to Instance Security Dashboard. !30092
- Add GraphQL mutation for removing projects from Instance Security Dashboard. !30100
- Show specific success message when uploading a future license. !30161
- Show all licenses and highlight current license in license history. !30172
- Adds pipeline project name and link to package title. !30275
- Enable expiring subscription banner. !30304
- Handle subscription purchase flow via GitLab. !30324
- Add Approved By in filtered MR search. !30335
- Add autocompletion to Design View comment form. !30347
- Add confidential flag in epic create. !30370
- Add API endpoints to generate instance level security report exports. !30397
- Add scanner name, version and URL to Vulnerability Modal. !30458
- Introduce negative filters for code review analytics. !30506
- Add NuGet dependencies extraction to the GitLab Packages Repository. !30618
- Add vulnerability fields to GraphQL project, group, and global scope. !30663
- Add vulnerability history to GraphQL. !30674
- Add package tags support to the GitLab NuGet Packages Repository. !30726
- Add `Group Name` and `Project Name` into vulnerability export CSV file. !30755
- Display banner to admins when there are users over license. !30813
- Geo Replication View. !30890
- Record impersonator information on audit event. !30970
- Measure project import and export service execution. !30977
- Adds package_name filter to group packages API. !30980
- Add Nuget metadatum support in GitLab Packages. !30994
- Show the "You're running out of CI minutes" warnings on relevant pages in GitLab. !31012
- Add spinner to roadmap. !31080
- Add standaloneVulnerabilitiesEnabled filter to group projects resolver for GraphQL. !31081
- Add push rules configuration for groups - frontend. !31085
- Add project import to group as audit event. !31103
- Add Web IDE terminal usage counter. !31158
- Add versions array to the package API payload. !31231
- Rate limit the 'Send emails from Admin Area' feature. !31308
- Add graphql endpoint for project packages list. !31344
- Add project filter. !31444
- Show specific content for future-dated licenses. !31463
- Add lead time and cycle time to value stream analytics. !31559
- Geo - Enable geo_file_registry_ssot_sync feature flag by default. !31671
- Geo - Enable geo_job_artifact_registry_ssot_sync feature flag by default. !31672
- Introduce a new API endpoint to generate group-level vulnerability exports. !31889
- Add number of projects with compliance framework to usage ping. !31923
- Adds versions tab and additional versions list to packages details page. !31940
- Add link to customer portal from license dashboard.
### Other (12 changes, 3 of them are from the community)
- Add verification related fields to packages_package_files table. !25411
- Refactor expected_paginated_array_response. !25500 (George Thomas @thegeorgeous)
- Monitoring for Elasticsearch incremental updates buffer queue. !27384
- Use warning icon for alert widget in monitoring dashboard. !27545
- Improved tests by removing duplicated specs. !28525 (Leandro Silva)
- Move service desk from Premium to Starter plan. !29980
- Change license history wording. !30148
- Make active_users param mandatory for SyncSeatLinkRequestWorker#perform. !30810
- Track group wiki storage in DB. !31121
- Replace undefined confidence with unknown severity for occurrences. !31200
- Replace undefined confidence with unknown severity for vulnerabilities. !31593
- Translate unauthenticated user string for Audit Event. !31856 (Sashi Kumar)
## 12.10.5 (2020-05-13)
### Fixed (1 change)

View file

@ -2,6 +2,42 @@
documentation](doc/development/changelog.md) for instructions on adding your own
entry.
## 13.0.3 (2020-05-29)
### Fixed (8 changes, 1 of them is from the community)
- Fixed redirection to project snippets. !32530
- Fix Geo replication for design thumbnails. !32703
- Fix 404s downloading build artifacts. !32741
- Fix Auto DevOps manual rollout jobs not being allowed to fail. !32865
- Update deprecated routes in irker integration. !32923 (Marc Jeanmougin)
- Change format of variables parameter in Prometheus proxy API for metrics dashboard. !33062
- Fix issue and MR API performance regression when Markdown cache is stale. !33235
- Fix closing an issue when the user created the issue. !33294
## 13.0.2 (2020-05-28)
- No changes.
## 13.0.1 (2020-05-27)
### Security (12 changes)
- Add an extra validation to Static Site Editor payload.
- Hide EKS secret key in admin integrations settings.
- Added data integrity check before updating a deploy key.
- Display only verified emails on notifications and profile page.
- Require confirmed email address for GitLab OAuth authentication.
- Kubernetes cluster details page no longer exposes Service Token.
- Fix confirming unverified emails with soft email confirmation flow enabled.
- Disallow users from controlling the PUT request using Mermaid Markdown in an issue description.
- Check forked project permissions before allowing fork.
- Limit memory footprint of a command that generates ZIP artifacts metadata.
- Fix file enumeration using Group Import.
- Prevent XSS in the monitoring dashboard.
## 13.0.0 (2020-05-22)
### Removed (20 changes, 5 of them are from the community)

View file

@ -1 +1 @@
13.0.0
13.0.3

View file

@ -1 +1 @@
8.31.0
8.31.1

View file

@ -1 +1 @@
13.0.0
13.0.3

View file

@ -108,7 +108,6 @@ export default class Clusters {
});
this.installApplication = this.installApplication.bind(this);
this.showToken = this.showToken.bind(this);
this.errorContainer = document.querySelector('.js-cluster-error');
this.successContainer = document.querySelector('.js-cluster-success');
@ -119,7 +118,6 @@ export default class Clusters {
);
this.errorReasonContainer = this.errorContainer.querySelector('.js-error-reason');
this.successApplicationContainer = document.querySelector('.js-cluster-application-notice');
this.showTokenButton = document.querySelector('.js-show-cluster-token');
this.tokenField = document.querySelector('.js-cluster-token');
this.ingressDomainHelpText = document.querySelector('.js-ingress-domain-help-text');
this.ingressDomainSnippet =
@ -258,7 +256,6 @@ export default class Clusters {
}
addListeners() {
if (this.showTokenButton) this.showTokenButton.addEventListener('click', this.showToken);
eventHub.$on('installApplication', this.installApplication);
eventHub.$on('updateApplication', data => this.updateApplication(data));
eventHub.$on('saveKnativeDomain', data => this.saveKnativeDomain(data));
@ -275,7 +272,6 @@ export default class Clusters {
}
removeListeners() {
if (this.showTokenButton) this.showTokenButton.removeEventListener('click', this.showToken);
eventHub.$off('installApplication', this.installApplication);
eventHub.$off('updateApplication', this.updateApplication);
eventHub.$off('saveKnativeDomain');
@ -344,18 +340,6 @@ export default class Clusters {
}
}
showToken() {
const type = this.tokenField.getAttribute('type');
if (type === 'password') {
this.tokenField.setAttribute('type', 'text');
this.showTokenButton.textContent = s__('ClusterIntegration|Hide');
} else {
this.tokenField.setAttribute('type', 'password');
this.showTokenButton.textContent = s__('ClusterIntegration|Show');
}
}
hideAll() {
this.errorContainer.classList.add('hidden');
this.successContainer.classList.add('hidden');

View file

@ -89,9 +89,10 @@ export default class Issue {
initIssueBtnEventListeners() {
const issueFailMessage = __('Unable to update this issue at this time.');
return $(document).on(
// NOTE: data attribute seems unnecessary but is actually necessary
return $('.js-issuable-buttons[data-action="close-reopen"]').on(
'click',
'.js-issuable-actions a.btn-close, .js-issuable-actions a.btn-reopen, a.btn-close-anyway',
'a.btn-close, a.btn-reopen, a.btn-close-anyway',
e => {
e.preventDefault();
e.stopImmediatePropagation();
@ -109,6 +110,7 @@ export default class Issue {
this.disableCloseReopenButton($button);
const url = $button.attr('href');
return axios
.put(url)
.then(({ data }) => {

View file

@ -1,6 +1,7 @@
<script>
import { __, s__, sprintf } from '~/locale';
import { GlFormGroup, GlFormInput, GlFormRadioGroup, GlFormTextarea } from '@gitlab/ui';
import { escape as esc } from 'lodash';
const defaultFileName = dashboard => dashboard.path.split('/').reverse()[0];
@ -42,7 +43,7 @@ export default {
html: sprintf(
__('Commit to %{branchName} branch'),
{
branchName: `<strong>${this.defaultBranch}</strong>`,
branchName: `<strong>${esc(this.defaultBranch)}</strong>`,
},
false,
),

View file

@ -218,13 +218,16 @@ export const fetchPrometheusMetric = (
{ commit, state, getters },
{ metric, defaultQueryParams },
) => {
const queryParams = { ...defaultQueryParams };
let queryParams = { ...defaultQueryParams };
if (metric.step) {
queryParams.step = metric.step;
}
if (Object.keys(state.promVariables).length > 0) {
queryParams.variables = getters.getCustomVariablesArray;
queryParams = {
...queryParams,
...getters.getCustomVariablesParams,
};
}
commit(types.REQUEST_METRIC_RESULT, { metricId: metric.metricId });

View file

@ -1,5 +1,5 @@
import { flatMap } from 'lodash';
import { NOT_IN_DB_PREFIX } from '../constants';
import { addPrefixToCustomVariableParams } from './utils';
const metricsIdsInPanel = panel =>
panel.metrics.filter(metric => metric.metricId && metric.result).map(metric => metric.metricId);
@ -116,13 +116,27 @@ export const filteredEnvironments = state =>
* Maps a variables object to an array along with stripping
* the variable prefix.
*
* This method outputs an object in the below format
*
* {
* variables[key1]=value1,
* variables[key2]=value2,
* }
*
* This is done so that the backend can identify the custom
* user-defined variables coming through the URL and differentiate
* from other variables used for Prometheus API endpoint.
*
* @param {Object} variables - Custom variables provided by the user
* @returns {Array} The custom variables array to be sent to the API
* in the format of [variable1, variable1_value]
* in the format of {variables[key1]=value1, variables[key2]=value2}
*/
export const getCustomVariablesArray = state =>
flatMap(state.promVariables, (variable, key) => [key, variable.value]);
export const getCustomVariablesParams = state =>
Object.keys(state.promVariables).reduce((acc, variable) => {
acc[addPrefixToCustomVariableParams(variable)] = state.promVariables[variable]?.value;
return acc;
}, {});
// prevent babel-plugin-rewire from generating an invalid default during karma tests
export default () => {};
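
To make the shape of the new getter output concrete, here is a minimal, self-contained sketch of the same reduce logic outside of Vuex; the sample `promVariables` values are hypothetical, and the prefix helper is copied from the `utils.js` hunk shown further below.

```javascript
// Standalone sketch of getCustomVariablesParams; not GitLab source.
// The prefix helper mirrors addPrefixToCustomVariableParams from utils.js.
const addPrefixToCustomVariableParams = key => `variables[${key}]`;

// Hypothetical dashboard variables, as they would sit in state.promVariables.
const state = {
  promVariables: {
    pod: { value: 'runner-gitlab-runner' },
    environment: { value: 'production' },
  },
};

const getCustomVariablesParams = state =>
  Object.keys(state.promVariables).reduce((acc, variable) => {
    acc[addPrefixToCustomVariableParams(variable)] = state.promVariables[variable]?.value;
    return acc;
  }, {});

console.log(getCustomVariablesParams(state));
// => { 'variables[pod]': 'runner-gitlab-runner', 'variables[environment]': 'production' }
```

These prefixed keys are what the action spreads into `queryParams` in the `fetchPrometheusMetric` hunk above, so the backend proxy receives `variables[...]` query parameters instead of a flat array.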

View file

@ -229,3 +229,19 @@ export const normalizeQueryResult = timeSeries => {
return normalizedResult;
};
/**
* Custom variables defined in the dashboard yml file are
* eventually passed over the wire to the backend Prometheus
* API proxy.
*
* This method adds a prefix to the URL param keys so that
* the backend can differentiate these variables from the other
* variables.
*
* This is currently only used by getters/getCustomVariablesParams
*
* @param {String} key Variable key that needs to be prefixed
* @returns {String}
*/
export const addPrefixToCustomVariableParams = key => `variables[${key}]`;
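
As a quick illustration of the prefixing described in the comment above (the key name is an arbitrary example):

```javascript
addPrefixToCustomVariableParams('environment');
// => 'variables[environment]'
```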

View file

@ -40,7 +40,9 @@ export default class Profile {
bindEvents() {
$('.js-preferences-form').on('change.preference', 'input[type=radio]', this.submitForm);
$('.js-group-notification-email').on('change', this.submitForm);
$('#user_notification_email').on('change', this.submitForm);
$('#user_notification_email').on('select2-selecting', event => {
setTimeout(this.submitForm.bind(event.currentTarget));
});
$('#user_notified_of_own_activity').on('change', this.submitForm);
this.form.on('submit', this.onSubmitForm);
}

View file

@ -191,8 +191,10 @@ class Admin::ApplicationSettingsController < Admin::ApplicationController
params[:application_setting][:import_sources]&.delete("")
params[:application_setting][:restricted_visibility_levels]&.delete("")
params[:application_setting].delete(:elasticsearch_aws_secret_access_key) if params[:application_setting][:elasticsearch_aws_secret_access_key].blank?
params[:application_setting][:required_instance_ci_template] = nil if params[:application_setting][:required_instance_ci_template].blank?
remove_blank_params_for!(:elasticsearch_aws_secret_access_key, :eks_secret_access_key)
# TODO Remove domain_blacklist_raw in APIv5 (See https://gitlab.com/gitlab-org/gitlab-foss/issues/67204)
params.delete(:domain_blacklist_raw) if params[:domain_blacklist_file]
params.delete(:domain_blacklist_raw) if params[:domain_blacklist]
@ -261,6 +263,10 @@ class Admin::ApplicationSettingsController < Admin::ApplicationController
render action
end
def remove_blank_params_for!(*keys)
params[:application_setting].delete_if { |setting, value| setting.to_sym.in?(keys) && value.blank? }
end
# overridden in EE
def valid_setting_panels
VALID_SETTING_PANELS

View file

@ -53,10 +53,16 @@ module MembershipActions
end
def request_access
membershipable.request_access(current_user)
access_requester = membershipable.request_access(current_user)
redirect_to polymorphic_path(membershipable),
notice: _('Your request for access has been queued for review.')
if access_requester.persisted?
redirect_to polymorphic_path(membershipable),
notice: _('Your request for access has been queued for review.')
else
redirect_to polymorphic_path(membershipable),
alert: _("Your request for access could not be processed: %{error_meesage}") %
{ error_meesage: access_requester.errors.full_messages.to_sentence }
end
end
def approve_access_request

View file

@ -4,6 +4,8 @@ class Oauth::AuthorizationsController < Doorkeeper::AuthorizationsController
include Gitlab::Experimentation::ControllerConcern
include InitializesCurrentUserMode
before_action :verify_confirmed_email!, only: [:new]
layout 'profile'
# Overridden from Doorkeeper::AuthorizationsController to
@ -21,4 +23,13 @@ class Oauth::AuthorizationsController < Doorkeeper::AuthorizationsController
render "doorkeeper/authorizations/error"
end
end
private
def verify_confirmed_email!
return if current_user&.confirmed?
pre_auth.error = :unconfirmed_email
render "doorkeeper/authorizations/error"
end
end

View file

@ -113,7 +113,7 @@ class Projects::ArtifactsController < Projects::ApplicationController
def build
@build ||= begin
build = build_from_id || build_from_ref
build = build_from_id || build_from_sha || build_from_ref
build&.present(current_user: current_user)
end
end
@ -127,7 +127,8 @@ class Projects::ArtifactsController < Projects::ApplicationController
project.builds.find_by_id(params[:job_id]) if params[:job_id]
end
def build_from_ref
def build_from_sha
return if params[:job].blank?
return unless @ref_name
commit = project.commit(@ref_name)
@ -136,6 +137,13 @@ class Projects::ArtifactsController < Projects::ApplicationController
project.latest_successful_build_for_sha(params[:job], commit.id)
end
def build_from_ref
return if params[:job].blank?
return unless @ref_name
project.latest_successful_build_for_ref(params[:job], @ref_name)
end
def artifacts_file
@artifacts_file ||= build&.artifacts_file_for_type(params[:file_type] || :archive)
end

View file

@ -37,6 +37,8 @@ class Projects::DeployKeysController < Projects::ApplicationController
end
def update
access_denied! unless deploy_key
if deploy_key.update(update_params)
flash[:notice] = _('Deploy key was successfully updated.')
redirect_to_repository
@ -85,10 +87,12 @@ class Projects::DeployKeysController < Projects::ApplicationController
end
def update_params
permitted_params = [deploy_keys_projects_attributes: [:id, :can_push]]
permitted_params = [deploy_keys_projects_attributes: [:can_push]]
permitted_params << :title if can?(current_user, :update_deploy_key, deploy_key)
params.require(:deploy_key).permit(*permitted_params)
key_update_params = params.require(:deploy_key).permit(*permitted_params)
key_update_params.dig(:deploy_keys_projects_attributes, '0')&.merge!(id: deploy_keys_project.id)
key_update_params
end
def authorize_update_deploy_key!

View file

@ -102,7 +102,7 @@ module CacheMarkdownField
def updated_cached_html_for(markdown_field)
return unless cached_markdown_fields.markdown_fields.include?(markdown_field)
refresh_markdown_cache if attribute_invalidated?(cached_markdown_fields.html_field(markdown_field))
refresh_markdown_cache! if attribute_invalidated?(cached_markdown_fields.html_field(markdown_field))
cached_html_for(markdown_field)
end

View file

@ -14,6 +14,7 @@ class NotificationSetting < ApplicationRecord
validates :user_id, uniqueness: { scope: [:source_type, :source_id],
message: "already exists in source",
allow_nil: true }
validate :owns_notification_email, if: :notification_email_changed?
scope :for_groups, -> { where(source_type: 'Namespace') }
@ -97,6 +98,13 @@ class NotificationSetting < ApplicationRecord
def event_enabled?(event)
respond_to?(event) && !!public_send(event) # rubocop:disable GitlabSecurity/PublicSend
end
def owns_notification_email
return if user.temp_oauth_email?
return if notification_email.empty?
errors.add(:notification_email, _("is not an email you own")) unless user.verified_emails.include?(notification_email)
end
end
NotificationSetting.prepend_if_ee('EE::NotificationSetting')

View file

@ -237,9 +237,10 @@ class User < ApplicationRecord
if previous_changes.key?('email')
# Grab previous_email here since previous_changes changes after
# #update_emails_with_primary_email and #update_notification_email are called
previous_confirmed_at = previous_changes.key?('confirmed_at') ? previous_changes['confirmed_at'][0] : confirmed_at
previous_email = previous_changes[:email][0]
update_emails_with_primary_email(previous_email)
update_emails_with_primary_email(previous_confirmed_at, previous_email)
update_invalid_gpg_signatures
if previous_email == notification_email
@ -755,15 +756,15 @@ class User < ApplicationRecord
end
def owns_notification_email
return if temp_oauth_email?
return if new_record? || temp_oauth_email?
errors.add(:notification_email, _("is not an email you own")) unless all_emails.include?(notification_email)
errors.add(:notification_email, _("is not an email you own")) unless verified_emails.include?(notification_email)
end
def owns_public_email
return if public_email.blank?
errors.add(:public_email, _("is not an email you own")) unless all_emails.include?(public_email)
errors.add(:public_email, _("is not an email you own")) unless verified_emails.include?(public_email)
end
def owns_commit_email
@ -811,13 +812,15 @@ class User < ApplicationRecord
# By using an `after_commit` instead of `after_update`, we avoid the recursive callback
# scenario, though it then requires us to use the `previous_changes` hash
# rubocop: disable CodeReuse/ServiceClass
def update_emails_with_primary_email(previous_email)
def update_emails_with_primary_email(previous_confirmed_at, previous_email)
primary_email_record = emails.find_by(email: email)
Emails::DestroyService.new(self, user: self).execute(primary_email_record) if primary_email_record
# the original primary email was confirmed, and we want that to carry over. We don't
# have access to the original confirmation values at this point, so just set confirmed_at
Emails::CreateService.new(self, user: self, email: previous_email).execute(confirmed_at: confirmed_at)
Emails::CreateService.new(self, user: self, email: previous_email).execute(confirmed_at: previous_confirmed_at)
update_columns(confirmed_at: primary_email_record.confirmed_at) if primary_email_record&.confirmed_at
end
# rubocop: enable CodeReuse/ServiceClass
@ -1209,18 +1212,20 @@ class User < ApplicationRecord
all_emails
end
def all_public_emails
all_emails(include_private_email: false)
end
def verified_emails
def verified_emails(include_private_email: true)
verified_emails = []
verified_emails << email if primary_email_verified?
verified_emails << private_commit_email
verified_emails << private_commit_email if include_private_email
verified_emails.concat(emails.confirmed.pluck(:email))
verified_emails
end
def public_verified_emails
emails = verified_emails(include_private_email: false)
emails << email unless temp_oauth_email?
emails.uniq
end
def any_email?(check_email)
downcased = check_email.downcase

View file

@ -10,6 +10,12 @@ module Clusters
def execute(cluster)
if validate_params(cluster)
token = params.dig(:platform_kubernetes_attributes, :token)
if token.blank?
params[:platform_kubernetes_attributes]&.delete(:token)
end
cluster.update(params)
else
false

View file

@ -32,8 +32,8 @@ module Prometheus
def validate_variables(_result)
return success unless variables
unless variables.is_a?(Array) && variables.size.even?
return error(_('Optional parameter "variables" must be an array of keys and values. Ex: [key1, value1, key2, value2]'))
unless variables.is_a?(ActionController::Parameters)
return error(_('Optional parameter "variables" must be a Hash. Ex: variables[key1]=value1'))
end
success
@ -88,12 +88,7 @@ module Prometheus
end
def variables_hash
# .each_slice(2) converts ['key1', 'value1', 'key2', 'value2'] into
# [['key1', 'value1'], ['key2', 'value2']] which is then converted into
# a hash by to_h: {'key1' => 'value1', 'key2' => 'value2'}
# to_h will raise an ArgumentError if the number of elements in the original
# array is not even.
variables&.each_slice(2).to_h
variables.to_h
end
def query(result)

View file

@ -26,6 +26,6 @@
= f.text_field :eks_access_key_id, class: 'form-control'
.form-group
= f.label :eks_secret_access_key, 'Secret access key', class: 'label-bold'
= f.password_field :eks_secret_access_key, value: @application_setting.eks_secret_access_key, class: 'form-control'
= f.password_field :eks_secret_access_key, autocomplete: 'off', class: 'form-control'
= f.submit 'Save changes', class: "btn btn-success"

View file

@ -25,16 +25,10 @@
label: s_('ClusterIntegration|CA Certificate'), label_class: 'label-bold',
input_group_class: 'gl-field-error-anchor', append: copy_ca_cert_btn
- show_token_btn = (platform_field.button s_('ClusterIntegration|Show'),
type: 'button', class: 'js-show-cluster-token btn btn-default')
- copy_token_btn = clipboard_button(text: platform.token, title: s_('ClusterIntegration|Copy Service Token'),
class: 'input-group-text btn-default') if cluster.read_only_kubernetes_platform_fields?
= platform_field.text_field :token, type: 'password', class: 'js-select-on-focus js-cluster-token',
required: true, title: s_('ClusterIntegration|Service token is required.'),
readonly: cluster.read_only_kubernetes_platform_fields?,
label: s_('ClusterIntegration|Service Token'), label_class: 'label-bold',
input_group_class: 'gl-field-error-anchor', append: show_token_btn + copy_token_btn
= platform_field.password_field :token, type: 'password', class: 'js-select-on-focus js-cluster-token',
readonly: cluster.read_only_kubernetes_platform_fields?, autocomplete: 'new-password',
label: s_('ClusterIntegration|Enter new Service Token'), label_class: 'label-bold',
input_group_class: 'gl-field-error-anchor'
= platform_field.form_group :authorization_type do
= platform_field.check_box :authorization_type, { disabled: true, label: s_('ClusterIntegration|RBAC-enabled cluster'),

View file

@ -5,7 +5,7 @@
- help_text = email_change_disabled ? s_("Your account uses dedicated credentials for the \"%{group_name}\" group and can only be updated through SSO.") % { group_name: @user.managing_group.name } : read_only_help_text
= form.text_field :email, required: true, class: 'input-lg', value: (@user.email unless @user.temp_oauth_email?), help: help_text.html_safe, readonly: readonly || email_change_disabled
= form.select :public_email, options_for_select(@user.all_public_emails, selected: @user.public_email),
= form.select :public_email, options_for_select(@user.public_verified_emails, selected: @user.public_email),
{ help: s_("Profiles|This email will be displayed on your public profile"), include_blank: s_("Profiles|Do not show on profile") },
control_class: 'select2 input-lg', disabled: email_change_disabled
- commit_email_link_url = help_page_path('user/profile/index', anchor: 'commit-email', target: '_blank')

View file

@ -1,6 +1,6 @@
- form = local_assigns.fetch(:form)
.form-group
= form.label :notification_email, class: "label-bold"
= form.select :notification_email, @user.all_public_emails, { include_blank: false }, class: "select2", disabled: local_assigns.fetch(:email_change_disabled, nil)
= form.select :notification_email, @user.public_verified_emails, { include_blank: false }, class: "select2", disabled: local_assigns.fetch(:email_change_disabled, nil)
.help-block
= local_assigns.fetch(:help_text, nil)

View file

@ -13,4 +13,4 @@
.table-section.section-30
= form_for setting, url: profile_notifications_group_path(group), method: :put, html: { class: 'update-notifications' } do |f|
= f.select :notification_email, @user.all_public_emails, { include_blank: 'Global notification email' }, class: 'select2 js-group-notification-email'
= f.select :notification_email, @user.public_verified_emails, { include_blank: 'Global notification email' }, class: 'select2 js-group-notification-email'

View file

@ -1,9 +1,9 @@
- page_title 'Edit Deploy Key'
%h3.page-title Edit Deploy Key
%h3.page-title= _('Edit Deploy Key')
%hr
%div
= form_for [@project.namespace.becomes(Namespace), @project, @deploy_key], html: { class: 'js-requires-input' } do |f|
= form_for [@project.namespace.becomes(Namespace), @project, @deploy_key], include_id: false, html: { class: 'js-requires-input' } do |f|
= render partial: 'shared/deploy_keys/form', locals: { form: f, deploy_key: @deploy_key }
.form-actions
= f.submit 'Save changes', class: 'btn-success btn'

View file

@ -32,7 +32,7 @@
%a.btn.btn-default.float-right.d-block.d-sm-none.gutter-toggle.issuable-gutter-toggle.js-sidebar-toggle{ href: "#" }
= icon('angle-double-left')
.detail-page-header-actions.js-issuable-actions
.detail-page-header-actions.js-issuable-actions.js-issuable-buttons{ data: { "action": "close-reopen" } }
.clearfix.issue-btn-group.dropdown
%button.btn.btn-default.float-left.d-md-none.d-lg-none.d-xl-none{ type: "button", data: { toggle: "dropdown" } }
Options

View file

@ -70,7 +70,7 @@ class IrkerWorker # rubocop:disable Scalability/IdempotentWorker
def send_new_branch(project, repo_name, committer, branch)
repo_path = project.full_path
newbranch = "#{Gitlab.config.gitlab.url}/#{repo_path}/branches"
newbranch = "#{Gitlab.config.gitlab.url}/#{repo_path}/-/branches"
newbranch = "\x0302\x1f#{newbranch}\x0f" if @colors
privmsg = "[#{repo_name}] #{committer} has created a new branch " \
@ -124,7 +124,7 @@ class IrkerWorker # rubocop:disable Scalability/IdempotentWorker
def compare_url(data, repo_path)
sha1 = Commit.truncate_sha(data['before'])
sha2 = Commit.truncate_sha(data['after'])
compare_url = "#{Gitlab.config.gitlab.url}/#{repo_path}/compare" \
compare_url = "#{Gitlab.config.gitlab.url}/#{repo_path}/-/compare" \
"/#{sha1}...#{sha2}"
colorize_url compare_url
end

View file

@ -36,6 +36,7 @@ en:
access_denied: 'The resource owner or authorization server denied the request.'
invalid_scope: 'The requested scope is invalid, unknown, or malformed.'
server_error: 'The authorization server encountered an unexpected condition which prevented it from fulfilling the request.'
unconfirmed_email: 'Verify the email address in your account profile before you sign in.'
temporarily_unavailable: 'The authorization server is currently unable to handle the request due to a temporary overloading or maintenance of the server.'
#configuration error messages

View file

@ -1198,7 +1198,7 @@ PUT /projects/:id
| `approvals_before_merge` | integer | no | **(STARTER)** How many approvers should approve merge request by default |
| `external_authorization_classification_label` | string | no | **(PREMIUM)** The classification label for the project |
| `mirror` | boolean | no | **(STARTER)** Enables pull mirroring in a project |
| `mirror_user_id` | integer | no | **(STARTER)** User responsible for all the activity surrounding a pull mirror event |
| `mirror_user_id` | integer | no | **(STARTER)** User responsible for all the activity surrounding a pull mirror event. Can only be set by admins. |
| `mirror_trigger_builds` | boolean | no | **(STARTER)** Pull mirroring triggers builds |
| `only_mirror_protected_branches` | boolean | no | **(STARTER)** Only mirror protected branches |
| `mirror_overwrites_diverged_branches` | boolean | no | **(STARTER)** Pull mirror overwrites diverged branches |

View file

@ -453,7 +453,7 @@ DAST can be [configured](#customizing-the-dast-settings) using environment varia
| `DAST_FULL_SCAN_DOMAIN_VALIDATION_REQUIRED` | no | Requires [domain validation](#domain-validation) when running DAST full scans. Boolean. `true`, `True`, or `1` are considered as true value, otherwise false. Defaults to `false`. Not supported for API scans. |
| `DAST_AUTO_UPDATE_ADDONS` | no | By default the versions of ZAP add-ons are pinned to those provided with the DAST image. Set to `true` to allow ZAP to download the latest versions. |
| `DAST_API_HOST_OVERRIDE` | no | Used to override domains defined in API specification files. |
| `DAST_EXCLUDE_RULES` | no | Set to a comma-separated list of Vulnerability Rule IDs to exclude them from scans. Rule IDs are numbers and can be found from the DAST log or on the [ZAP project](https://github.com/zaproxy/zaproxy/blob/master/docs/scanners.md). For example, `HTTP Parameter Override` has a rule ID of `10026`. |
| `DAST_EXCLUDE_RULES` | no | Set to a comma-separated list of Vulnerability Rule IDs to exclude them from the scan report. Currently, excluded rules will get executed but the alerts from them will be suppressed. Rule IDs are numbers and can be found from the DAST log or on the [ZAP project](https://github.com/zaproxy/zaproxy/blob/develop/docs/scanners.md). For example, `HTTP Parameter Override` has a rule ID of `10026`. |
| `DAST_REQUEST_HEADERS` | no | Set to a comma-separated list of request header names and values. For example, `Cache-control: no-cache,User-Agent: DAST/1.0` |
| `DAST_ZAP_USE_AJAX_SPIDER` | no | Use the AJAX spider in addition to the traditional spider, useful for crawling sites that require JavaScript. Boolean. `true`, `True`, or `1` are considered as true value, otherwise false. Defaults to `false`. |

View file

@ -88,6 +88,23 @@ or more users or by the `@name` of one or more groups that should
be owners of the file. Groups must be added as [members of the project](members/index.md),
or they will be ignored.
Starting in [GitLab 13.0](https://gitlab.com/gitlab-org/gitlab/-/issues/32432), you can now specify
groups or subgroups from the project's group hierarchy as potential code owners.
For example, consider the following hierarchy for a given project:
```text
group >> sub-group >> sub-subgroup >> myproject >> file.md
```
Any of the following groups would be eligible to be specified as code owners:
- `@group`
- `@group/sub-group`
- `@group/sub-group/sub-subgroup`
In addition, any groups that have been invited to the project using the **Settings > Members** tool will also be recognized as eligible code owners.
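As an illustration, a CODEOWNERS file in `myproject` could then reference those groups directly (the path patterns below are only examples):
```text
# Example CODEOWNERS entries using groups from the hierarchy above
*.rb @group/sub-group
docs/ @group/sub-group/sub-subgroup
```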
The order in which the paths are defined is significant: the last
pattern that matches a given path will be used to find the code
owners.

View file

@ -4,6 +4,8 @@ module API
class GroupImport < Grape::API
MAXIMUM_FILE_SIZE = 50.megabytes.freeze
helpers Helpers::FileUploadHelpers
helpers do
def parent_group
find_group!(params[:parent_id]) if params[:parent_id].present?
@ -48,29 +50,20 @@ module API
params do
requires :path, type: String, desc: 'Group path'
requires :name, type: String, desc: 'Group name'
requires :file, type: ::API::Validations::Types::WorkhorseFile, desc: 'The group export file to be imported'
optional :parent_id, type: Integer, desc: "The ID of the parent group that the group will be imported into. Defaults to the current user's namespace."
optional 'file.path', type: String, desc: 'Path to locally stored body (generated by Workhorse)'
optional 'file.name', type: String, desc: 'Real filename as sent in Content-Disposition (generated by Workhorse)'
optional 'file.type', type: String, desc: 'Real content type as sent in Content-Type (generated by Workhorse)'
optional 'file.size', type: Integer, desc: 'Real size of file (generated by Workhorse)'
optional 'file.md5', type: String, desc: 'MD5 checksum of the file (generated by Workhorse)'
optional 'file.sha1', type: String, desc: 'SHA1 checksum of the file (generated by Workhorse)'
optional 'file.sha256', type: String, desc: 'SHA256 checksum of the file (generated by Workhorse)'
end
post 'import' do
authorize_create_group!
require_gitlab_workhorse!
uploaded_file = UploadedFile.from_params(params, :file, ImportExportUploader.workhorse_local_upload_path)
bad_request!('Unable to process group import file') unless uploaded_file
validate_file!
group_params = {
path: params[:path],
name: params[:name],
parent_id: params[:parent_id],
visibility_level: closest_allowed_visibility_level,
import_export_upload: ImportExportUpload.new(import_file: uploaded_file)
import_export_upload: ImportExportUpload.new(import_file: params[:file])
}
group = ::Groups::CreateService.new(current_user, group_params).execute

View file

@ -444,6 +444,8 @@ module API
not_found!("Source Project") unless fork_from_project
authorize! :fork_project, fork_from_project
result = ::Projects::ForkService.new(fork_from_project, current_user).execute(user_project)
if result

View file

@ -86,6 +86,7 @@ staging:
canary:
extends: .auto-deploy
stage: canary
allow_failure: true
script:
- auto-deploy check_kube_domain
- auto-deploy download_chart
@ -176,6 +177,7 @@ production_manual:
.manual_rollout_template: &manual_rollout_template
<<: *rollout_template
stage: production
allow_failure: true
rules:
- if: '$CI_KUBERNETES_ACTIVE == null || $CI_KUBERNETES_ACTIVE == ""'
when: never

View file

@ -21,7 +21,7 @@ module Gitlab
project_id: project.id,
project: project.path,
namespace: project.namespace.path,
return_url: return_url,
return_url: sanitize_url(return_url),
is_supported_content: supported_content?.to_s,
base_url: Gitlab::Routing.url_helpers.project_show_sse_path(project, full_path)
}
@ -52,6 +52,10 @@ module Gitlab
def full_path
"#{ref}/#{file_path}"
end
def sanitize_url(url)
url if Gitlab::UrlSanitizer.valid_web?(url)
end
end
end
end

View file

@ -3,6 +3,7 @@
module Gitlab
class UrlSanitizer
ALLOWED_SCHEMES = %w[http https ssh git].freeze
ALLOWED_WEB_SCHEMES = %w[http https].freeze
def self.sanitize(content)
regexp = URI::DEFAULT_PARSER.make_regexp(ALLOWED_SCHEMES)
@ -12,17 +13,21 @@ module Gitlab
content.gsub(regexp, '')
end
def self.valid?(url)
def self.valid?(url, allowed_schemes: ALLOWED_SCHEMES)
return false unless url.present?
return false unless url.is_a?(String)
uri = Addressable::URI.parse(url.strip)
ALLOWED_SCHEMES.include?(uri.scheme)
allowed_schemes.include?(uri.scheme)
rescue Addressable::URI::InvalidURIError
false
end
def self.valid_web?(url)
valid?(url, allowed_schemes: ALLOWED_WEB_SCHEMES)
end
def initialize(url, credentials: nil)
%i[user password].each do |symbol|
credentials[symbol] = credentials[symbol].presence if credentials&.key?(symbol)

View file

@ -4622,9 +4622,6 @@ msgstr ""
msgid "ClusterIntegration|Copy Kubernetes cluster name"
msgstr ""
msgid "ClusterIntegration|Copy Service Token"
msgstr ""
msgid "ClusterIntegration|Could not load IAM roles"
msgstr ""
@ -4703,6 +4700,9 @@ msgstr ""
msgid "ClusterIntegration|Enabled stack"
msgstr ""
msgid "ClusterIntegration|Enter new Service Token"
msgstr ""
msgid "ClusterIntegration|Enter the details for your Amazon EKS Kubernetes cluster"
msgstr ""
@ -4787,9 +4787,6 @@ msgstr ""
msgid "ClusterIntegration|Helm streamlines installing and managing Kubernetes applications. Tiller runs inside of your Kubernetes Cluster, and manages releases of your charts."
msgstr ""
msgid "ClusterIntegration|Hide"
msgstr ""
msgid "ClusterIntegration|If you are setting up multiple clusters and are using Auto DevOps, %{help_link_start}read this first%{help_link_end}."
msgstr ""
@ -5183,9 +5180,6 @@ msgstr ""
msgid "ClusterIntegration|Set the global mode for the WAF in this cluster. This can be overridden at the environmental level."
msgstr ""
msgid "ClusterIntegration|Show"
msgstr ""
msgid "ClusterIntegration|Something went wrong on our end."
msgstr ""
@ -14887,7 +14881,7 @@ msgstr ""
msgid "Optional"
msgstr ""
msgid "Optional parameter \"variables\" must be an array of keys and values. Ex: [key1, value1, key2, value2]"
msgid "Optional parameter \"variables\" must be a Hash. Ex: variables[key1]=value1"
msgstr ""
msgid "Optionally, you can %{link_to_customize} how FogBugz email addresses and usernames are imported into GitLab."
@ -22244,9 +22238,6 @@ msgstr ""
msgid "This user will be the author of all events in the activity feed that are the result of an update, like new branches being created or new commits being pushed to existing branches."
msgstr ""
msgid "This user will be the author of all events in the activity feed that are the result of an update, like new branches being created or new commits being pushed to existing branches. Upon creation or when reassigning you can only assign yourself to be the mirror user."
msgstr ""
msgid "This variable can not be masked."
msgstr ""
@ -25014,6 +25005,9 @@ msgstr ""
msgid "You will be removed from existing projects/groups"
msgstr ""
msgid "You will be the author of all events in the activity feed that are the result of an update, like new branches being created or new commits being pushed to existing branches."
msgstr ""
msgid "You will first need to set up Jira Integration to use this feature."
msgstr ""
@ -25272,6 +25266,9 @@ msgstr ""
msgid "Your projects"
msgstr ""
msgid "Your request for access could not be processed: %{error_meesage}"
msgstr ""
msgid "Your request for access has been queued for review."
msgstr ""
@ -25710,6 +25707,9 @@ msgstr ""
msgid "email '%{email}' does not match the allowed domain of '%{email_domain}'"
msgstr ""
msgid "email '%{email}' is not a verified email."
msgstr ""
msgid "enabled"
msgstr ""

View file

@ -155,6 +155,46 @@ describe Admin::ApplicationSettingsController do
end
end
describe 'PATCH #integrations' do
before do
stub_feature_flags(instance_level_integrations: false)
sign_in(admin)
end
describe 'EKS integration' do
let(:application_setting) { ApplicationSetting.current }
let(:settings_params) do
{
eks_integration_enabled: '1',
eks_account_id: '123456789012',
eks_access_key_id: 'dummy access key',
eks_secret_access_key: 'dummy secret key'
}
end
it 'updates EKS settings' do
patch :integrations, params: { application_setting: settings_params }
expect(application_setting.eks_integration_enabled).to be_truthy
expect(application_setting.eks_account_id).to eq '123456789012'
expect(application_setting.eks_access_key_id).to eq 'dummy access key'
expect(application_setting.eks_secret_access_key).to eq 'dummy secret key'
end
context 'secret access key is blank' do
let(:settings_params) { { eks_secret_access_key: '' } }
it 'does not update the secret key' do
application_setting.update!(eks_secret_access_key: 'dummy secret key')
patch :integrations, params: { application_setting: settings_params }
expect(application_setting.reload.eks_secret_access_key).to eq 'dummy secret key'
end
end
end
end
describe 'PUT #reset_registration_token' do
before do
sign_in(admin)

View file

@ -3,7 +3,6 @@
require 'spec_helper'
describe Oauth::AuthorizationsController do
let(:user) { create(:user) }
let!(:application) { create(:oauth_application, scopes: 'api read_user', redirect_uri: 'http://example.com') }
let(:params) do
{
@@ -19,53 +18,68 @@ describe Oauth::AuthorizationsController do
end
describe 'GET #new' do
context 'without valid params' do
it 'returns 200 code and renders error view' do
get :new
context 'when the user is confirmed' do
let(:user) { create(:user) }
context 'without valid params' do
it 'returns 200 code and renders error view' do
get :new
expect(response).to have_gitlab_http_status(:ok)
expect(response).to render_template('doorkeeper/authorizations/error')
end
end
context 'with valid params' do
render_views
it 'returns 200 code and renders view' do
get :new, params: params
expect(response).to have_gitlab_http_status(:ok)
expect(response).to render_template('doorkeeper/authorizations/new')
end
it 'deletes session.user_return_to and redirects when skip authorization' do
application.update(trusted: true)
request.session['user_return_to'] = 'http://example.com'
get :new, params: params
expect(request.session['user_return_to']).to be_nil
expect(response).to have_gitlab_http_status(:found)
end
context 'when there is already an access token for the application' do
context 'when the request scope matches any of the created token scopes' do
before do
scopes = Doorkeeper::OAuth::Scopes.from_string('api')
allow(Doorkeeper.configuration).to receive(:scopes).and_return(scopes)
create :oauth_access_token, application: application, resource_owner_id: user.id, scopes: scopes
end
it 'authorizes the request and redirects' do
get :new, params: params
expect(request.session['user_return_to']).to be_nil
expect(response).to have_gitlab_http_status(:found)
end
end
end
end
end
context 'when the user is unconfirmed' do
let(:user) { create(:user, confirmed_at: nil) }
it 'returns 200 and renders error view' do
get :new, params: params
expect(response).to have_gitlab_http_status(:ok)
expect(response).to render_template('doorkeeper/authorizations/error')
end
end
context 'with valid params' do
render_views
it 'returns 200 code and renders view' do
get :new, params: params
expect(response).to have_gitlab_http_status(:ok)
expect(response).to render_template('doorkeeper/authorizations/new')
end
it 'deletes session.user_return_to and redirects when skip authorization' do
application.update(trusted: true)
request.session['user_return_to'] = 'http://example.com'
get :new, params: params
expect(request.session['user_return_to']).to be_nil
expect(response).to have_gitlab_http_status(:found)
end
context 'when there is already an access token for the application' do
context 'when the request scope matches any of the created token scopes' do
before do
scopes = Doorkeeper::OAuth::Scopes.from_string('api')
allow(Doorkeeper.configuration).to receive(:scopes).and_return(scopes)
create :oauth_access_token, application: application, resource_owner_id: user.id, scopes: scopes
end
it 'authorizes the request and redirects' do
get :new, params: params
expect(request.session['user_return_to']).to be_nil
expect(response).to have_gitlab_http_status(:found)
end
end
end
end
end
end

View file

@@ -5,8 +5,8 @@ require 'spec_helper'
describe Profiles::NotificationsController do
let(:user) do
create(:user) do |user|
user.emails.create(email: 'original@example.com')
user.emails.create(email: 'new@example.com')
user.emails.create(email: 'original@example.com', confirmed_at: Time.current)
user.emails.create(email: 'new@example.com', confirmed_at: Time.current)
user.notification_email = 'original@example.com'
user.save!
end

View file

@@ -3,6 +3,8 @@
require 'spec_helper'
describe Projects::ArtifactsController do
include RepoHelpers
let(:user) { project.owner }
let_it_be(:project) { create(:project, :repository, :public) }
@@ -481,6 +483,22 @@ describe Projects::ArtifactsController do
expect(response).to redirect_to(path)
end
end
context 'with a failed pipeline on an updated master' do
before do
create_file_in_repo(project, 'master', 'master', 'test.txt', 'This is test')
create(:ci_pipeline,
project: project,
sha: project.commit.sha,
ref: project.default_branch,
status: 'failed')
get :latest_succeeded, params: params_from_ref(project.default_branch)
end
it_behaves_like 'redirect to the job'
end
end
end
end

View file

@@ -256,7 +256,7 @@ describe Projects::DeployKeysController do
end
def deploy_key_params(title, can_push)
deploy_keys_projects_attributes = { '0' => { id: deploy_keys_project, can_push: can_push } }
deploy_keys_projects_attributes = { '0' => { can_push: can_push } }
{ deploy_key: { title: title, deploy_keys_projects_attributes: deploy_keys_projects_attributes } }
end
@@ -300,6 +300,42 @@ describe Projects::DeployKeysController do
expect { subject }.to change { deploy_keys_project.reload.can_push }.from(false).to(true)
end
end
context 'when a different deploy key id param is injected' do
let(:extra_params) { deploy_key_params('updated title', '1') }
let(:hacked_params) do
extra_params.reverse_merge(id: other_deploy_key_id,
namespace_id: project.namespace,
project_id: project)
end
subject { put :update, params: hacked_params }
context 'and that deploy key id exists' do
let(:other_project) { create(:project) }
let(:other_deploy_key) do
key = create(:deploy_key)
project.deploy_keys << key
key
end
let(:other_deploy_key_id) { other_deploy_key.id }
it 'does not update the can_push attribute' do
expect { subject }.not_to change { deploy_key.deploy_keys_project_for(project).can_push }
end
end
context 'and that deploy key id does not exist' do
let(:other_deploy_key_id) { 9999 }
it 'returns 404' do
subject
expect(response).to have_gitlab_http_status(:not_found)
end
end
end
end
context 'with admin as project maintainer' do

View file

@@ -84,12 +84,12 @@ describe Projects::Environments::PrometheusApiController do
before do
expected_params[:query] = %{up{pod_name="#{pod_name}"}}
expected_params[:variables] = ['pod_name', pod_name]
expected_params[:variables] = { 'pod_name' => pod_name }
end
it 'replaces variables with values' do
get :proxy, params: environment_params.merge(
query: 'up{pod_name="{{pod_name}}"}', variables: ['pod_name', pod_name]
query: 'up{pod_name="{{pod_name}}"}', variables: { 'pod_name' => pod_name }
)
expect(response).to have_gitlab_http_status(:success)

View file

@@ -48,6 +48,10 @@ FactoryBot.define do
after(:build) { |user, _| user.block! }
end
trait :unconfirmed do
confirmed_at { nil }
end
trait :with_avatar do
avatar { fixture_file_upload('spec/fixtures/dk.png') }
end

View file

@@ -39,7 +39,7 @@ describe 'User Cluster', :js do
expect(page.find_field('cluster[platform_kubernetes_attributes][api_url]').value)
.to have_content('http://example.com')
expect(page.find_field('cluster[platform_kubernetes_attributes][token]').value)
.to have_content('my-token')
.to be_empty
end
end

View file

@@ -0,0 +1,21 @@
# frozen_string_literal: true
require 'spec_helper'
describe 'OAuth Provider' do
describe 'Standard OAuth Authorization' do
let(:application) { create(:oauth_application, scopes: 'read_user') }
before do
sign_in(user)
visit oauth_authorization_path(client_id: application.uid,
redirect_uri: application.redirect_uri.split.first,
response_type: 'code',
state: 'my_state',
scope: 'read_user')
end
it_behaves_like 'Secure OAuth Authorizations'
end
end

View file

@@ -32,5 +32,11 @@ describe "User downloads artifacts" do
it_behaves_like "downloading"
end
context "via SHA" do
let(:url) { latest_succeeded_project_artifacts_path(project, "#{pipeline.sha}/download", job: job.name) }
it_behaves_like "downloading"
end
end
end

View file

@@ -46,7 +46,7 @@ describe 'User Cluster', :js do
expect(page.find_field('cluster[platform_kubernetes_attributes][api_url]').value)
.to have_content('http://example.com')
expect(page.find_field('cluster[platform_kubernetes_attributes][token]').value)
.to have_content('my-token')
.to be_empty
end
it 'user sees RBAC is enabled by default' do

View file

@@ -82,28 +82,6 @@ describe('Clusters', () => {
});
});
describe('showToken', () => {
it('should update token field type', () => {
cluster.showTokenButton.click();
expect(cluster.tokenField.getAttribute('type')).toEqual('text');
cluster.showTokenButton.click();
expect(cluster.tokenField.getAttribute('type')).toEqual('password');
});
it('should update show token button text', () => {
cluster.showTokenButton.click();
expect(cluster.showTokenButton.textContent).toEqual('Hide');
cluster.showTokenButton.click();
expect(cluster.showTokenButton.textContent).toEqual('Show');
});
});
describe('checkForNewInstalls', () => {
const INITIAL_APP_MAP = {
helm: { status: null, title: 'Helm Tiller' },

View file

@@ -0,0 +1,82 @@
<div class="description" updated-at="">
<div class="md issue-realtime-trigger-pulse">
<svg
id="mermaid-1587752414912"
width="100%"
xmlns="http://www.w3.org/2000/svg"
style="max-width: 185.35000610351562px;"
viewBox="0 0 185.35000610351562 50.5"
class="mermaid"
>
<g transform="translate(0, 0)">
<g class="output">
<g class="clusters"></g>
<g class="edgePaths"></g>
<g class="edgeLabels"></g>
<g class="nodes">
<g
class="node js-issuable-buttons btn-close clickable"
style="opacity: 1;"
id="A"
transform="translate(92.67500305175781,25.25)"
title="click to PUT"
>
<a
class="js-issuable-buttons btn-close clickable"
href="https://invalid"
rel="noopener"
>
<rect
rx="0"
ry="0"
x="-84.67500305175781"
y="-17.25"
width="169.35000610351562"
height="34.5"
class="label-container"
></rect>
<g class="label" transform="translate(0,0)">
<g transform="translate(-74.67500305175781,-7.25)">
<text style="">
<tspan xml:space="preserve" dy="1em" x="1">Click to send a PUT request</tspan>
</text>
</g>
</g>
</a>
</g>
</g>
</g>
</g>
<text class="source" display="none">
Click to send a PUT request
</text>
</svg>
</div>
<textarea
data-update-url="/h5bp/html5-boilerplate/-/issues/35.json"
dir="auto"
class="hidden js-task-list-field"
></textarea>
<div class="modal-open recaptcha-modal js-recaptcha-modal" style="display: none;">
<div role="dialog" tabindex="-1" class="modal d-block">
<div role="document" class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h4 class="modal-title float-left">Please solve the reCAPTCHA</h4>
<button type="button" data-dismiss="modal" aria-label="Close" class="close float-right">
<span aria-hidden="true">×</span>
</button>
</div>
<div class="modal-body">
<div>
<p>We want to be sure it is you, please confirm you are not a robot.</p>
<div></div>
</div>
</div>
<!---->
</div>
</div>
</div>
<div class="modal-backdrop fade show"></div>
</div>
</div>

View file

@@ -18,6 +18,7 @@ describe('Issue', () => {
preloadFixtures('issues/closed-issue.html');
preloadFixtures('issues/issue-with-task-list.html');
preloadFixtures('issues/open-issue.html');
preloadFixtures('static/issue_with_mermaid_graph.html');
function expectErrorMessage() {
const $flashMessage = $('div.flash-alert');
@@ -228,4 +229,30 @@ describe('Issue', () => {
});
});
});
describe('when not displaying blocked warning', () => {
describe('when clicking a mermaid graph inside an issue description', () => {
let mock;
let spy;
beforeEach(() => {
loadFixtures('static/issue_with_mermaid_graph.html');
mock = new MockAdapter(axios);
spy = jest.spyOn(axios, 'put');
});
afterEach(() => {
mock.restore();
jest.clearAllMocks();
});
it('does not make a PUT request', () => {
Issue.prototype.initIssueBtnEventListeners();
$('svg a.js-issuable-actions').trigger('click');
expect(spy).not.toHaveBeenCalled();
});
});
});
});

View file

@@ -3,9 +3,17 @@ import DuplicateDashboardForm from '~/monitoring/components/duplicate_dashboard_
import { dashboardGitResponse } from '../mock_data';
describe('DuplicateDashboardForm', () => {
let wrapper;
let wrapper;
const createMountedWrapper = (props = {}) => {
// Use `mount` to render native input elements
wrapper = mount(DuplicateDashboardForm, {
propsData: { ...props },
sync: false,
});
};
describe('DuplicateDashboardForm', () => {
const defaultBranch = 'master';
const findByRef = ref => wrapper.find({ ref });
@@ -20,14 +28,7 @@ describe('DuplicateDashboardForm', () => {
};
beforeEach(() => {
// Use `mount` to render native input elements
wrapper = mount(DuplicateDashboardForm, {
propsData: {
dashboard: dashboardGitResponse[0],
defaultBranch,
},
sync: false,
});
createMountedWrapper({ dashboard: dashboardGitResponse[0], defaultBranch });
});
it('renders correctly', () => {
@@ -146,3 +147,18 @@ describe('DuplicateDashboardForm', () => {
});
});
});
describe('DuplicateDashboardForm escapes elements', () => {
const branchToEscape = "<img/src='x'onerror=alert(document.domain)>";
beforeEach(() => {
createMountedWrapper({ dashboard: dashboardGitResponse[0], defaultBranch: branchToEscape });
});
it('should escape branch name data', () => {
const branchOptionHtml = wrapper.vm.branchOptions[0].html;
const escapedBranch = '&lt;img/src=&#39;x&#39;onerror=alert(document.domain)&gt';
expect(branchOptionHtml).toEqual(expect.stringContaining(escapedBranch));
});
});

View file

@@ -329,7 +329,7 @@ describe('Monitoring store Getters', () => {
});
});
describe('getCustomVariablesArray', () => {
describe('getCustomVariablesParams', () => {
let state;
beforeEach(() => {
@@ -340,25 +340,21 @@ describe('Monitoring store Getters', () => {
it('transforms the promVariables object to an array in the [variable, variable_value] format for all variable types', () => {
mutations[types.SET_VARIABLES](state, mockTemplatingDataResponses.allVariableTypes);
const variablesArray = getters.getCustomVariablesArray(state);
const variablesArray = getters.getCustomVariablesParams(state);
expect(variablesArray).toEqual([
'simpleText',
'Simple text',
'advText',
'default',
'simpleCustom',
'value1',
'advCustomNormal',
'value2',
]);
expect(variablesArray).toEqual({
'variables[advCustomNormal]': 'value2',
'variables[advText]': 'default',
'variables[simpleCustom]': 'value1',
'variables[simpleText]': 'Simple text',
});
});
it('transforms the promVariables object to an empty array when no keys are present', () => {
mutations[types.SET_VARIABLES](state, {});
const variablesArray = getters.getCustomVariablesArray(state);
const variablesArray = getters.getCustomVariablesParams(state);
expect(variablesArray).toEqual([]);
expect(variablesArray).toEqual({});
});
});

View file

@@ -65,5 +65,23 @@ describe Gitlab::StaticSiteEditor::Config do
it { is_expected.to include(is_supported_content: 'false') }
end
context 'when return_url is not a valid URL' do
let(:return_url) { 'example.com' }
it { is_expected.to include(return_url: nil) }
end
context 'when return_url has a javascript scheme' do
let(:return_url) { 'javascript:alert(document.domain)' }
it { is_expected.to include(return_url: nil) }
end
context 'when return_url is missing' do
let(:return_url) { nil }
it { is_expected.to include(return_url: nil) }
end
end
end

View file

@@ -60,6 +60,30 @@ describe Gitlab::UrlSanitizer do
end
end
describe '.valid_web?' do
where(:value, :url) do
false | nil
false | ''
false | '123://invalid:url'
false | 'valid@project:url.git'
false | 'valid:pass@project:url.git'
false | %w(test array)
false | 'ssh://example.com'
false | 'ssh://:@example.com'
false | 'ssh://foo@example.com'
false | 'ssh://foo:bar@example.com'
false | 'ssh://foo:bar@example.com/group/group/project.git'
false | 'git://example.com/group/group/project.git'
false | 'git://foo:bar@example.com/group/group/project.git'
true | 'http://foo:bar@example.com/group/group/project.git'
true | 'https://foo:bar@example.com/group/group/project.git'
end
with_them do
it { expect(described_class.valid_web?(url)).to eq(value) }
end
end
describe '#sanitized_url' do
context 'credentials in hash' do
where(username: ['foo', '', nil], password: ['bar', '', nil])

View file

@@ -209,8 +209,8 @@ describe CacheMarkdownField, :clean_gitlab_redis_cache do
thing.cached_markdown_version += 1
end
it 'calls #refresh_markdown_cache' do
expect(thing).to receive(:refresh_markdown_cache)
it 'calls #refresh_markdown_cache!' do
expect(thing).to receive(:refresh_markdown_cache!)
expect(thing.updated_cached_html_for(:description)).to eq(html)
end
@@ -227,8 +227,8 @@ describe CacheMarkdownField, :clean_gitlab_redis_cache do
thing.try(:save)
end
it 'does not call #refresh_markdown_cache' do
expect(thing).not_to receive(:refresh_markdown_cache)
it 'does not call #refresh_markdown_cache!' do
expect(thing).not_to receive(:refresh_markdown_cache!)
expect(thing.updated_cached_html_for(:description)).to eq(html)
end

View file

@@ -110,6 +110,11 @@ describe Group do
let(:group_notification_email) { 'user+group@example.com' }
let(:subgroup_notification_email) { 'user+subgroup@example.com' }
before do
create(:email, :confirmed, user: user, email: group_notification_email)
create(:email, :confirmed, user: user, email: subgroup_notification_email)
end
subject { subgroup.notification_email_for(user) }
context 'when both group notification emails are set' do

View file

@@ -48,6 +48,33 @@ RSpec.describe NotificationSetting do
expect(notification_setting.reopen_merge_request).to eq(false)
end
end
context 'notification_email' do
let_it_be(:user) { create(:user) }
subject { described_class.new(source_id: 1, source_type: 'Project', user_id: user.id) }
it 'allows to change email to verified one' do
email = create(:email, :confirmed, user: user)
subject.update(notification_email: email.email)
expect(subject).to be_valid
end
it 'does not allow to change email to not verified one' do
email = create(:email, user: user)
subject.update(notification_email: email.email)
expect(subject).to be_invalid
end
it 'allows to change email to empty one' do
subject.update(notification_email: '')
expect(subject).to be_valid
end
end
end
describe '#for_projects' do

View file

@@ -310,7 +310,7 @@ describe User do
end
it_behaves_like 'an object with RFC3696 compliant email-formated attributes', :public_email, :notification_email do
subject { build(:user).tap { |user| user.emails << build(:email, email: email_value) } }
subject { create(:user).tap { |user| user.emails << build(:email, email: email_value, confirmed_at: Time.current) } }
end
describe '#commit_email' do
@@ -567,6 +567,32 @@ describe User do
user = build(:user, email: "temp-email-for-oauth@example.com")
expect(user).to be_valid
end
it 'does not accept not verified emails' do
email = create(:email)
user = email.user
user.update(notification_email: email.email)
expect(user).to be_invalid
end
end
context 'owns_public_email' do
it 'accepts verified emails' do
email = create(:email, :confirmed, email: 'test@test.com')
user = email.user
user.update(public_email: email.email)
expect(user).to be_valid
end
it 'does not accept not verified emails' do
email = create(:email)
user = email.user
user.update(public_email: email.email)
expect(user).to be_invalid
end
end
context 'set_commit_email' do
@@ -916,6 +942,108 @@ describe User do
expect(@user.emails.count).to eq 1
expect(@user.emails.first.confirmed_at).not_to eq nil
end
context 'when the first email was unconfirmed and the second email gets confirmed' do
let(:user) { create(:user, :unconfirmed, email: 'should-be-unconfirmed@test.com') }
before do
user.update!(email: 'should-be-confirmed@test.com')
user.confirm
end
it 'updates user.email' do
expect(user.email).to eq('should-be-confirmed@test.com')
end
it 'confirms user.email' do
expect(user).to be_confirmed
end
it 'keeps the unconfirmed email unconfirmed' do
email = user.emails.first
expect(email.email).to eq('should-be-unconfirmed@test.com')
expect(email).not_to be_confirmed
end
it 'has only one email association' do
expect(user.emails.size).to be(1)
end
end
end
context 'when an existing email record is set as primary' do
let(:user) { create(:user, email: 'confirmed@test.com') }
context 'when it is unconfirmed' do
let(:originally_unconfirmed_email) { 'should-stay-unconfirmed@test.com' }
before do
user.emails << create(:email, email: originally_unconfirmed_email, confirmed_at: nil)
user.update!(email: originally_unconfirmed_email)
end
it 'keeps the user confirmed' do
expect(user).to be_confirmed
end
it 'keeps the original email' do
expect(user.email).to eq('confirmed@test.com')
end
context 'when the email gets confirmed' do
before do
user.confirm
end
it 'keeps the user confirmed' do
expect(user).to be_confirmed
end
it 'updates the email' do
expect(user.email).to eq(originally_unconfirmed_email)
end
end
end
context 'when it is confirmed' do
let!(:old_confirmed_email) { user.email }
let(:confirmed_email) { 'already-confirmed@test.com' }
before do
user.emails << create(:email, :confirmed, email: confirmed_email)
user.update!(email: confirmed_email)
end
it 'keeps the user confirmed' do
expect(user).to be_confirmed
end
it 'updates the email' do
expect(user.email).to eq(confirmed_email)
end
it 'moves the old email' do
email = user.reload.emails.first
expect(email.email).to eq(old_confirmed_email)
expect(email).to be_confirmed
end
end
end
context 'when unconfirmed user deletes a confirmed additional email' do
let(:user) { create(:user, :unconfirmed) }
before do
user.emails << create(:email, :confirmed)
end
it 'does not affect the confirmed status' do
expect { user.emails.confirmed.destroy_all }.not_to change { user.confirmed? } # rubocop: disable Cop/DestroyAll
end
end
describe '#update_notification_email' do
@@ -2069,6 +2197,31 @@ describe User do
end
end
describe '#public_verified_emails' do
let(:user) { create(:user) }
it 'returns only confirmed public emails' do
email_confirmed = create :email, user: user, confirmed_at: Time.current
create :email, user: user
expect(user.public_verified_emails).to contain_exactly(
user.email,
email_confirmed.email
)
end
it 'returns confirmed public emails plus main user email when user is not confirmed' do
user = create(:user, confirmed_at: nil)
email_confirmed = create :email, user: user, confirmed_at: Time.current
create :email, user: user
expect(user.public_verified_emails).to contain_exactly(
user.email,
email_confirmed.email
)
end
end
describe '#verified_email?' do
let(:user) { create(:user) }
@@ -4231,9 +4384,10 @@ describe User do
context 'when an ancestor has a level other than Global' do
let(:ancestor) { create(:group) }
let(:group) { create(:group, parent: ancestor) }
let(:email) { create(:email, :confirmed, email: 'ancestor@example.com', user: user) }
before do
create(:notification_setting, user: user, source: ancestor, level: 'participating', notification_email: 'ancestor@example.com')
create(:notification_setting, user: user, source: ancestor, level: 'participating', notification_email: email.email)
end
it 'has the same level set' do
@@ -4258,10 +4412,12 @@ describe User do
let(:grand_ancestor) { create(:group) }
let(:ancestor) { create(:group, parent: grand_ancestor) }
let(:group) { create(:group, parent: ancestor) }
let(:ancestor_email) { create(:email, :confirmed, email: 'ancestor@example.com', user: user) }
let(:grand_email) { create(:email, :confirmed, email: 'grand@example.com', user: user) }
before do
create(:notification_setting, user: user, source: grand_ancestor, level: 'participating', notification_email: 'grand@example.com')
create(:notification_setting, user: user, source: ancestor, level: 'global', notification_email: 'ancestor@example.com')
create(:notification_setting, user: user, source: grand_ancestor, level: 'participating', notification_email: grand_email.email)
create(:notification_setting, user: user, source: ancestor, level: 'global', notification_email: ancestor_email.email)
end
it 'has the same email set' do
@@ -4299,7 +4455,7 @@ describe User do
context 'when group has notification email set' do
it 'returns group notification email' do
group_notification_email = 'user+group@example.com'
create(:email, :confirmed, user: user, email: group_notification_email)
create(:notification_setting, user: user, source: group, notification_email: group_notification_email)
is_expected.to eq(group_notification_email)

View file

@@ -11,7 +11,7 @@ describe API::GroupImport do
let(:file) { File.join('spec', 'fixtures', 'group_export.tar.gz') }
let(:export_path) { "#{Dir.tmpdir}/group_export_spec" }
let(:workhorse_token) { JWT.encode({ 'iss' => 'gitlab-workhorse' }, Gitlab::Workhorse.secret, 'HS256') }
let(:workhorse_header) { { 'GitLab-Workhorse' => '1.0', Gitlab::Workhorse::INTERNAL_API_REQUEST_HEADER => workhorse_token } }
let(:workhorse_headers) { { 'GitLab-Workhorse' => '1.0', Gitlab::Workhorse::INTERNAL_API_REQUEST_HEADER => workhorse_token } }
before do
allow_next_instance_of(Gitlab::ImportExport) do |import_export|
@@ -35,7 +35,7 @@ describe API::GroupImport do
}
end
subject { post api('/groups/import', user), params: params, headers: workhorse_header }
subject { upload_archive(file_upload, workhorse_headers, params) }
shared_examples 'when all params are correct' do
context 'when user is authorized to create new group' do
@@ -151,7 +151,7 @@ describe API::GroupImport do
params[:file] = file_upload
expect do
post api('/groups/import', user), params: params, headers: workhorse_header
upload_archive(file_upload, workhorse_headers, params)
end.not_to change { Group.count }.from(1)
expect(response).to have_gitlab_http_status(:bad_request)
@@ -171,7 +171,7 @@ describe API::GroupImport do
context 'without a file from workhorse' do
it 'rejects the request' do
subject
upload_archive(nil, workhorse_headers, params)
expect(response).to have_gitlab_http_status(:bad_request)
end
@@ -179,7 +179,7 @@ describe API::GroupImport do
context 'without a workhorse header' do
it 'rejects request without a workhorse header' do
post api('/groups/import', user), params: params
upload_archive(file_upload, {}, params)
expect(response).to have_gitlab_http_status(:forbidden)
end
@@ -189,9 +189,7 @@ describe API::GroupImport do
let(:params) do
{
path: 'test-import-group',
name: 'test-import-group',
'file.path' => file_upload.path,
'file.name' => file_upload.original_filename
name: 'test-import-group'
}
end
@@ -229,9 +227,7 @@ describe API::GroupImport do
{
path: 'test-import-group',
name: 'test-import-group',
file: fog_file,
'file.remote_id' => file_name,
'file.size' => fog_file.size
file: fog_file
}
end
@@ -245,10 +241,21 @@ describe API::GroupImport do
include_examples 'when some params are missing'
end
end
def upload_archive(file, headers = {}, params = {})
workhorse_finalize(
api('/groups/import', user),
method: :post,
file_key: :file,
params: params.merge(file: file),
headers: headers,
send_rewritten_field: true
)
end
end
describe 'POST /groups/import/authorize' do
subject { post api('/groups/import/authorize', user), headers: workhorse_header }
subject { post api('/groups/import/authorize', user), headers: workhorse_headers }
it 'authorizes importing group with workhorse header' do
subject
@@ -258,7 +265,7 @@ describe API::GroupImport do
end
it 'rejects requests that bypassed gitlab-workhorse' do
workhorse_header.delete(Gitlab::Workhorse::INTERNAL_API_REQUEST_HEADER)
workhorse_headers.delete(Gitlab::Workhorse::INTERNAL_API_REQUEST_HEADER)
subject

View file

@@ -19,7 +19,7 @@ describe API::NotificationSettings do
end
describe "PUT /notification_settings" do
let(:email) { create(:email, user: user) }
let(:email) { create(:email, :confirmed, user: user) }
it "updates global notification settings for the current user" do
put api("/notification_settings", user), params: { level: 'watch', notification_email: email.email }

View file

@@ -1891,6 +1891,17 @@ describe API::Projects do
expect(project_fork_target).to be_forked
end
it 'fails without permission from forked_from project' do
project_fork_source.project_feature.update_attribute(:forking_access_level, ProjectFeature::PRIVATE)
post api("/projects/#{project_fork_target.id}/fork/#{project_fork_source.id}", user)
expect(response).to have_gitlab_http_status(:forbidden)
expect(project_fork_target.forked_from_project).to be_nil
expect(project_fork_target.fork_network_member).not_to be_present
expect(project_fork_target).not_to be_forked
end
it 'denies project to be forked from a private project' do
post api("/projects/#{project_fork_target.id}/fork/#{private_project_fork_source.id}", user)

View file

@@ -9,15 +9,11 @@ describe 'OpenID Connect requests' do
name: 'Alice',
username: 'alice',
email: 'private@example.com',
emails: [public_email],
public_email: public_email.email,
website_url: 'https://example.com',
avatar: fixture_file_upload('spec/fixtures/dk.png')
)
end
let(:public_email) { build :email, email: 'public@example.com' }
let(:access_grant) { create :oauth_access_grant, application: application, resource_owner_id: user.id }
let(:access_token) { create :oauth_access_token, application: application, resource_owner_id: user.id }
@@ -37,7 +33,7 @@ describe 'OpenID Connect requests' do
'name' => 'Alice',
'nickname' => 'alice',
'email' => 'public@example.com',
'email_verified' => false,
'email_verified' => true,
'website' => 'https://example.com',
'profile' => 'http://localhost/alice',
'picture' => "http://localhost/uploads/-/system/user/avatar/#{user.id}/dk.png",
@@ -62,6 +58,11 @@ describe 'OpenID Connect requests' do
get '/oauth/userinfo', params: {}, headers: { 'Authorization' => "Bearer #{access_token.token}" }
end
before do
email = create(:email, :confirmed, email: 'public@example.com', user: user)
user.update!(public_email: email.email)
end
context 'Application without OpenID scope' do
let(:application) { create :oauth_application, scopes: 'api' }
@@ -123,7 +124,7 @@ describe 'OpenID Connect requests' do
end
it 'has false in email_verified claim' do
expect(json_response['email_verified']).to eq(false)
expect(json_response['email_verified']).to eq(true)
end
end

View file

@@ -5,8 +5,8 @@ require 'spec_helper'
describe 'view user notifications' do
let(:user) do
create(:user) do |user|
user.emails.create(email: 'original@example.com')
user.emails.create(email: 'new@example.com')
user.emails.create(email: 'original@example.com', confirmed_at: Time.current)
user.emails.create(email: 'new@example.com', confirmed_at: Time.current)
user.notification_email = 'original@example.com'
user.save!
end

View file

@@ -47,6 +47,39 @@ describe Clusters::UpdateService do
expect(cluster.platform.namespace).to eq('custom-namespace')
end
end
context 'when service token is empty' do
let(:params) do
{
platform_kubernetes_attributes: {
token: ''
}
}
end
it 'does not update the token' do
current_token = cluster.platform.token
is_expected.to eq(true)
cluster.platform.reload
expect(cluster.platform.token).to eq(current_token)
end
end
context 'when service token is not empty' do
let(:params) do
{
platform_kubernetes_attributes: {
token: 'new secret token'
}
}
end
it 'updates the token' do
is_expected.to eq(true)
expect(cluster.platform.token).to eq('new secret token')
end
end
end
context 'when invalid params' do

View file

@@ -2457,6 +2457,8 @@ describe NotificationService, :mailer do
group = create(:group)
project.update(group: group)
create(:email, :confirmed, user: u_custom_notification_enabled, email: group_notification_email)
create(:notification_setting, user: u_custom_notification_enabled, source: group, notification_email: group_notification_email)
end
@@ -2491,6 +2493,7 @@ describe NotificationService, :mailer do
group = create(:group)
project.update(group: group)
create(:email, :confirmed, user: u_member, email: group_notification_email)
create(:notification_setting, user: u_member, source: group, notification_email: group_notification_email)
end
@@ -2584,6 +2587,7 @@ describe NotificationService, :mailer do
group = create(:group)
project.update(group: group)
create(:email, :confirmed, user: u_member, email: group_notification_email)
create(:notification_setting, user: u_member, source: group, notification_email: group_notification_email)
end

View file

@@ -64,7 +64,7 @@ describe Prometheus::ProxyVariableSubstitutionService do
let(:params_keys) do
{
query: 'up{pod_name="{{pod_name}}"}',
variables: ['pod_name', pod_name]
variables: { 'pod_name' => pod_name }
}
end
@@ -76,7 +76,7 @@ describe Prometheus::ProxyVariableSubstitutionService do
let(:params_keys) do
{
query: 'up{pod_name="{{pod_name}}",env="{{ci_environment_slug}}"}',
variables: ['pod_name', pod_name, 'ci_environment_slug', 'custom_value']
variables: { 'pod_name' => pod_name, 'ci_environment_slug' => 'custom_value' }
}
end
@@ -95,8 +95,7 @@ describe Prometheus::ProxyVariableSubstitutionService do
}
end
it_behaves_like 'error', 'Optional parameter "variables" must be an ' \
'array of keys and values. Ex: [key1, value1, key2, value2]'
it_behaves_like 'error', 'Optional parameter "variables" must be a Hash. Ex: variables[key1]=value1'
end
context 'with nil variables' do

View file

@@ -0,0 +1,19 @@
# frozen_string_literal: true
RSpec.shared_examples 'Secure OAuth Authorizations' do
context 'when user is confirmed' do
let(:user) { create(:user) }
it 'asks the user to authorize the application' do
expect(page).to have_text "Authorize #{application.name} to use your account?"
end
end
context 'when user is unconfirmed' do
let(:user) { create(:user, confirmed_at: nil) }
it 'displays an error' do
expect(page).to have_text I18n.t('doorkeeper.errors.messages.unconfirmed_email')
end
end
end

View file

@@ -28,6 +28,7 @@ RSpec.shared_examples 'an email sent to a user' do
it 'is sent to user\'s group notification email' do
group_notification_email = 'user+group@example.com'
create(:email, :confirmed, user: recipient, email: group_notification_email)
create(:notification_setting, user: recipient, source: group, notification_email: group_notification_email)
expect(subject).to deliver_to(group_notification_email)

View file

@@ -80,25 +80,21 @@ RSpec.shared_examples 'a mentionable' do
context 'when there are cached markdown fields' do
before do
if subject.is_a?(CacheMarkdownField)
subject.refresh_markdown_cache
end
skip unless subject.is_a?(CacheMarkdownField)
end
it 'sends in cached markdown fields when appropriate' do
if subject.is_a?(CacheMarkdownField) && subject.extractors[author].blank?
expect_next_instance_of(Gitlab::ReferenceExtractor) do |ext|
attrs = subject.class.mentionable_attrs.collect(&:first) & subject.cached_markdown_fields.markdown_fields
attrs.each do |field|
expect(ext).to receive(:analyze).with(subject.send(field), hash_including(rendered: anything))
end
subject.extractors[author] = nil
expect_next_instance_of(Gitlab::ReferenceExtractor) do |ext|
attrs = subject.class.mentionable_attrs.collect(&:first) & subject.cached_markdown_fields.markdown_fields
attrs.each do |field|
expect(ext).to receive(:analyze).with(subject.send(field), hash_including(rendered: anything))
end
expect(subject).not_to receive(:refresh_markdown_cache)
expect(subject).to receive(:cached_markdown_fields).at_least(:once).and_call_original
subject.all_references(author)
end
expect(subject).to receive(:cached_markdown_fields).at_least(:once).and_call_original
subject.all_references(author)
end
end
@@ -126,26 +122,40 @@ RSpec.shared_examples 'an editable mentionable' do
context 'when there are cached markdown fields' do
before do
if subject.is_a?(CacheMarkdownField)
subject.refresh_markdown_cache
end
skip unless subject.is_a?(CacheMarkdownField)
subject.save!
end
it 'refreshes markdown cache if necessary' do
subject.save!
set_mentionable_text.call('This is a text')
if subject.is_a?(CacheMarkdownField) && subject.extractors[author].blank?
expect_next_instance_of(Gitlab::ReferenceExtractor) do |ext|
subject.cached_markdown_fields.markdown_fields.each do |field|
expect(ext).to receive(:analyze).with(subject.send(field), hash_including(rendered: anything))
end
subject.extractors[author] = nil
expect_next_instance_of(Gitlab::ReferenceExtractor) do |ext|
subject.cached_markdown_fields.markdown_fields.each do |field|
expect(ext).to receive(:analyze).with(subject.send(field), hash_including(rendered: anything))
end
end
expect(subject).to receive(:refresh_markdown_cache)
expect(subject).to receive(:cached_markdown_fields).at_least(:once).and_call_original
expect(subject).to receive(:refresh_markdown_cache).and_call_original
expect(subject).to receive(:cached_markdown_fields).at_least(:once).and_call_original
subject.all_references(author)
end
context 'when the markdown cache is stale' do
before do
expect(subject).to receive(:latest_cached_markdown_version).at_least(:once) do
(Gitlab::MarkdownCache::CACHE_COMMONMARK_VERSION + 1) << 16
end
end
it 'persists the refreshed cache so that it does not have to be refreshed every time' do
expect(subject).to receive(:refresh_markdown_cache).once.and_call_original
subject.all_references(author)
subject.reload
subject.all_references(author)
end
end

View file

@@ -0,0 +1,34 @@
# frozen_string_literal: true
require 'spec_helper'
describe 'admin/application_settings/_eks' do
let_it_be(:admin) { create(:admin) }
let(:page) { Capybara::Node::Simple.new(rendered) }
before do
assign(:application_setting, application_setting)
allow(view).to receive(:current_user) { admin }
allow(view).to receive(:expanded) { true }
end
shared_examples 'EKS secret access key input' do
it 'renders an empty password field' do
render
expect(rendered).to have_field('Secret access key', type: 'password')
expect(page.find_field('Secret access key').value).to be_blank
end
end
context 'when eks_secret_access_key is not set' do
let(:application_setting) { build(:application_setting) }
include_examples 'EKS secret access key input'
end
context 'when eks_secret_access_key is set' do
let(:application_setting) { build(:application_setting, eks_secret_access_key: 'eks_secret_access_key') }
include_examples 'EKS secret access key input'
end
end