Update upstream source from tag 'upstream/15.11.11+ds1'

Update to upstream version '15.11.11+ds1' with Debian dir aa157ebbd9

Commit 9b799e0416: 83 changed files with 2187 additions and 641 deletions
@@ -1254,6 +1254,8 @@
  rules:
    - <<: *if-not-canonical-namespace
      when: never
    - <<: *if-security-merge-request
      when: never
    - <<: *if-merge-request-targeting-stable-branch
      when: always
@@ -363,7 +363,6 @@ Gitlab/StrongMemoizeAttr:
    - 'ee/app/models/vulnerabilities/finding.rb'
    - 'ee/app/presenters/approval_rule_presenter.rb'
    - 'ee/app/presenters/ci/minutes/usage_presenter.rb'
    - 'ee/app/presenters/merge_request_approver_presenter.rb'
    - 'ee/app/serializers/dashboard_operations_project_entity.rb'
    - 'ee/app/serializers/ee/member_user_entity.rb'
    - 'ee/app/services/app_sec/dast/pipelines/find_latest_service.rb'
@@ -1069,7 +1069,6 @@ RSpec/MissingFeatureCategory:
    - 'ee/spec/models/approval_merge_request_rule_spec.rb'
    - 'ee/spec/models/approval_state_spec.rb'
    - 'ee/spec/models/approval_wrapped_any_approver_rule_spec.rb'
    - 'ee/spec/models/approval_wrapped_code_owner_rule_spec.rb'
    - 'ee/spec/models/approval_wrapped_rule_spec.rb'
    - 'ee/spec/models/approvals/scan_finding_wrapped_rule_set_spec.rb'
    - 'ee/spec/models/approvals/wrapped_rule_set_spec.rb'
CHANGELOG.md (63 changes)
@@ -2,6 +2,69 @@
documentation](doc/development/changelog.md) for instructions on adding your own
entry.

## 15.11.11 (2023-07-04)

### Security (1 change)

- [Add authorization to the subscriptions group controller](gitlab-org/security/gitlab@a6cb78e56a0e6b08ee7bbabd1687fb43a9f8703b) ([merge request](gitlab-org/security/gitlab!3381))

## 15.11.10 (2023-06-28)

### Security (10 changes)

- [Revert 'security-leaked-ci-job-token-permission-15-11' from '15-11'"](gitlab-org/security/gitlab@19f73bf5494d34b43eb8c807f860d545acae0c32) ([merge request](gitlab-org/security/gitlab!3375))
- [Use fully qualified ref when loading code owner file](gitlab-org/security/gitlab@d7ffb4cca68373bff38bd05f0b8afc868cda9e04) ([merge request](gitlab-org/security/gitlab!3354))
- [Maintainer can leak masked webhook secrets by manipulating URL masking](gitlab-org/security/gitlab@3a7ccdac5e41870fdce362c38d0a1d1437906fbd) ([merge request](gitlab-org/security/gitlab!3361))
- [Remove approvals when the only commit gets amended](gitlab-org/security/gitlab@f8a4ad8be7e5fdf752f525ed58b94b1ce625b9a1) ([merge request](gitlab-org/security/gitlab!3368))
- [Fix for fork permissions check in compare controller](gitlab-org/security/gitlab@8edf44b13e55ffe0c912f98134d0341a5a6bcd28) ([merge request](gitlab-org/security/gitlab!3344))
- [Webhook token leaked in Sidekiq logs if log format is 'default'](gitlab-org/security/gitlab@02b58237085930c62ee277c9ebd89a0560f44a98) ([merge request](gitlab-org/security/gitlab!3347))
- [Mitigate epic reference filter ReDOS](gitlab-org/security/gitlab@4c2cd6e5f7c994aca554be37d9ea9e5e114341f1) ([merge request](gitlab-org/security/gitlab!3339))
- [Increasing security for CI_JOB_TOKEN on public and internal projects](gitlab-org/security/gitlab@4f8a00b2499e876df5b65eca921812fbb3215800) ([merge request](gitlab-org/security/gitlab!3319))
- [Sanitize user email addresses in admin confirm user dialog](gitlab-org/security/gitlab@608c8001c349b0a62aae81850de669d3af02ab60) ([merge request](gitlab-org/security/gitlab!3332))
- [Obfuscate email of service desk issue creator in issue REST API](gitlab-org/security/gitlab@a092ebc54cce4492f87f8ed2bf67c31793b0bd0e) ([merge request](gitlab-org/security/gitlab!3316))

## 15.11.9 (2023-06-15)

### Changed (1 change)

- [Make MigrateSharedVulnerabilityIdentifiers use slow iteration](gitlab-org/gitlab@1d91c7b295b22e844b81fb665748c447028525cd) ([merge request](gitlab-org/gitlab!122856)) **GitLab Enterprise Edition**

## 15.11.8 (2023-06-06)

### Fixed (2 changes)

- [Fix memory leak in CI config includes entry](gitlab-org/gitlab@3e367e614c855352295e3bcab25bf5af4ec66bf5) ([merge request](gitlab-org/gitlab!122540))
- [Fix serialization of pull_requests in Bitbucket Server Import](gitlab-org/gitlab@201ad93dfdf8f4acfb6d6eee32e2bd6a4ff68157) ([merge request](gitlab-org/gitlab!122396))

### Security (1 change)

- [Validate description length in labels](gitlab-org/gitlab@2c821ee0823e37a57a6dc049591097232c933713) ([merge request](gitlab-org/gitlab!122697))

### Performance (1 change)

- [LFS: Serve pre-signed URLs in `/lfs/objects/batch`](gitlab-org/gitlab@df3a9655a0e8f0bdab3433cefcbd37acfb8ebcff) ([merge request](gitlab-org/gitlab!122348))

## 15.11.7 (2023-06-05)

### Security (16 changes)

- [Fix DoS on test report artifacts](gitlab-org/security/gitlab@76133e75ad38326bf971e2d913263349781aecbe) ([merge request](gitlab-org/security/gitlab!3200))
- [Fix XSS in Abuse Reports form action](gitlab-org/security/gitlab@e9f9b656b34bb30a7bd66ce82a9d8f6ac43c1ba8) ([merge request](gitlab-org/security/gitlab!3290))
- [Escape the source branch link correctly](gitlab-org/security/gitlab@77ed3e8c2ef51e7bcc89ad1c8c549424a69e3478) ([merge request](gitlab-org/security/gitlab!3288))
- [Import source owners with maintainer access if importer is a maintainer](gitlab-org/security/gitlab@98f939c9ba3efd5e51807adbaee189f180131544) ([merge request](gitlab-org/security/gitlab!3283))
- [Filter inaccessible issuable notes when exporting project](gitlab-org/security/gitlab@29fe6582dd81855cdb263e118459aba370a7c7eb) ([merge request](gitlab-org/security/gitlab!3274))
- [Block tag names that are prepended with refs/tags/, due to conflicts](gitlab-org/security/gitlab@2b39d58eb21cf2ecc581bc5e1bd4dd48dcfd20bc) ([merge request](gitlab-org/security/gitlab!3262))
- [Set IP in ActionContoller filter before IP enforcement is evaluated](gitlab-org/security/gitlab@8d6e83ff72564f3fa3b24e2040072024b715a073) ([merge request](gitlab-org/security/gitlab!3279))
- [Prevent primary email returned as verified on unsaved change](gitlab-org/security/gitlab@8ac9a3e3efea62d9b3e6d758ab3f1c43f2354ea5) ([merge request](gitlab-org/security/gitlab!3223))
- [Use UntrustedRegexp to protect FrontMatter filter](gitlab-org/security/gitlab@5d300c3af9c37a7607d795868ae2b4e51b8802c5) ([merge request](gitlab-org/security/gitlab!3257))
- [Improve ambiguous_ref? logic to include heads and tags](gitlab-org/security/gitlab@f478b7673efd183971e9375be84ad06af641893f) ([merge request](gitlab-org/security/gitlab!3247))
- [Use UntrustedRegexp to protect InlineDiff filter](gitlab-org/security/gitlab@4056d6ccc5b592029ea92ebb90b6e6a66c6eb157) ([merge request](gitlab-org/security/gitlab!3254))
- [Ignore user-defined diff paths in diff notes](gitlab-org/security/gitlab@b21208f4c10e8bd4e6754f9bfd2cc5fc96c8be20) ([merge request](gitlab-org/security/gitlab!3267))
- [Reject NPM metadata requests with invalid package_name](gitlab-org/security/gitlab@c4d0f6256bae18d9bb9f6afc87afeeb3ff971335) ([merge request](gitlab-org/security/gitlab!3285))
- [Use UntrustedRegexp to protect MathFilter regex](gitlab-org/security/gitlab@fd8298b140dba65ac77ed340a5f78e1fc8032db6) ([merge request](gitlab-org/security/gitlab!3251))
- [Resolve Overall Project Vulnerability Disclosure](gitlab-org/security/gitlab@199048eb1c61063409e25d3433e7276faf95709b) ([merge request](gitlab-org/security/gitlab!3230))
- [Validate description length in labels](gitlab-org/security/gitlab@208342903aabd7c4b78c24c0b9b173dfbd62e405) ([merge request](gitlab-org/security/gitlab!3242))

## 15.11.6 (2023-05-24)

### Changed (1 change)
@@ -1 +1 @@
-15.11.6
+15.11.11

@@ -1 +1 @@
-15.11.6
+15.11.11

VERSION (2 changes)

@@ -1 +1 @@
-15.11.6
+15.11.11
@@ -24,7 +24,9 @@ export default class SingleFileDiff
    this.content = $('.diff-content', this.file);
    this.$chevronRightIcon = $('.diff-toggle-caret .chevron-right', this.file);
    this.$chevronDownIcon = $('.diff-toggle-caret .chevron-down', this.file);
    this.diffForPath = this.content.find('[data-diff-for-path]').data('diffForPath');
    this.diffForPath = this.content
      .find('div:not(.note-text)[data-diff-for-path]')
      .data('diffForPath');
    this.isOpen = !this.diffForPath;
    if (this.diffForPath) {
      this.collapsedContent = this.content;
@@ -1,17 +1,10 @@
# frozen_string_literal: true

class AbuseReportsController < ApplicationController
  before_action :set_user, only: [:new, :add_category]
  before_action :set_user, only: [:add_category]

  feature_category :insider_threat

  def new
    @abuse_report = AbuseReport.new(
      user_id: @user.id,
      reported_from_url: params.fetch(:ref_url, '')
    )
  end

  def add_category
    @abuse_report = AbuseReport.new(
      user_id: @user.id,
@@ -89,10 +89,14 @@ class Projects::CompareController < Projects::ApplicationController
  # target == start_ref == from
  def target_project
    strong_memoize(:target_project) do
      next source_project.default_merge_request_target unless compare_params.key?(:from_project_id)
      next source_project if compare_params[:from_project_id].to_i == source_project.id

      target_project = target_projects(source_project).find_by_id(compare_params[:from_project_id])
      target_project =
        if !compare_params.key?(:from_project_id)
          source_project.default_merge_request_target
        elsif compare_params[:from_project_id].to_i == source_project.id
          source_project
        else
          target_projects(source_project).find_by_id(compare_params[:from_project_id])
        end

      # Just ignore the field if it points at a non-existent or hidden project
      next source_project unless target_project && can?(current_user, :read_code, target_project)
@@ -6,6 +6,10 @@ module Repositories
  include Gitlab::Utils::StrongMemoize

  LFS_TRANSFER_CONTENT_TYPE = 'application/octet-stream'
  # Downloading directly with presigned URLs via batch requests
  # require longer expire time.
  # The 1h should be enough to download 100 objects.
  LFS_DIRECT_BATCH_EXPIRE_IN = 3600.seconds

  skip_before_action :lfs_check_access!, only: [:deprecated]
  before_action :lfs_check_batch_operation!, only: [:batch]

@@ -22,7 +26,11 @@ module Repositories
  end

  if download_request?
    render json: { objects: download_objects! }, content_type: LfsRequest::CONTENT_TYPE
    if Feature.enabled?(:lfs_batch_direct_downloads, project)
      render json: { objects: download_objects! }, content_type: LfsRequest::CONTENT_TYPE
    else
      render json: { objects: legacy_download_objects! }, content_type: LfsRequest::CONTENT_TYPE
    end
  elsif upload_request?
    render json: { objects: upload_objects! }, content_type: LfsRequest::CONTENT_TYPE
  else
@@ -52,11 +60,34 @@ module Repositories
  end

  def download_objects!
    existing_oids = project.lfs_objects
      .for_oids(objects_oids)
      .index_by(&:oid)

    objects.each do |object|
      if lfs_object = existing_oids[object[:oid]]
        object[:actions] = download_actions(object, lfs_object)

        if Guest.can?(:download_code, project)
          object[:authenticated] = true
        end
      else
        object[:error] = {
          code: 404,
          message: _("Object does not exist on the server or you don't have permissions to access it")
        }
      end
    end

    objects
  end

  def legacy_download_objects!
    existing_oids = project.lfs_objects_oids(oids: objects_oids)

    objects.each do |object|
      if existing_oids.include?(object[:oid])
        object[:actions] = download_actions(object)
        object[:actions] = proxy_download_actions(object)

        if Guest.can?(:download_code, project)
          object[:authenticated] = true
@@ -85,7 +116,26 @@ module Repositories
    objects
  end

  def download_actions(object)
  def download_actions(object, lfs_object)
    if lfs_object.file.file_storage? || lfs_object.file.class.proxy_download_enabled?
      proxy_download_actions(object)
    else
      direct_download_actions(lfs_object)
    end
  end

  def direct_download_actions(lfs_object)
    {
      download: {
        href: lfs_object.file.url(
          content_type: "application/octet-stream",
          expire_at: LFS_DIRECT_BATCH_EXPIRE_IN.since
        )
      }
    }
  end

  def proxy_download_actions(object)
    {
      download: {
        href: "#{project.http_url_to_repo}/gitlab-lfs/objects/#{object[:oid]}",
@@ -243,13 +243,13 @@ module MergeRequestsHelper
  end

  branch = if merge_request.for_fork?
             _('%{fork_icon} %{source_project_path}:%{source_branch}').html_safe % { fork_icon: fork_icon.html_safe, source_project_path: merge_request.source_project_path.html_safe, source_branch: merge_request.source_branch.html_safe }
             html_escape(_('%{fork_icon} %{source_project_path}:%{source_branch}')) % { fork_icon: fork_icon.html_safe, source_project_path: merge_request.source_project_path, source_branch: merge_request.source_branch }
           else
             merge_request.source_branch
           end

  branch_title = if merge_request.for_fork?
                   _('%{source_project_path}:%{source_branch}').html_safe % { source_project_path: merge_request.source_project_path.html_safe, source_branch: merge_request.source_branch.html_safe }
                   html_escape(_('%{source_project_path}:%{source_branch}')) % { source_project_path: merge_request.source_project_path, source_branch: merge_request.source_branch }
                 else
                   merge_request.source_branch
                 end
@@ -136,7 +136,7 @@ module UsersHelper

  def confirm_user_data(user)
    message = if user.unconfirmed_email.present?
                _('This user has an unconfirmed email address (%{email}). You may force a confirmation.') % { email: user.unconfirmed_email }
                _('This user has an unconfirmed email address (%{email}). You may force a confirmation.').html_safe % { email: user.unconfirmed_email }
              else
                _('This user has an unconfirmed email address. You may force a confirmation.')
              end
@@ -9,6 +9,7 @@ module Ci

  STORE_COLUMN = :file_store
  NotSupportedAdapterError = Class.new(StandardError)

  FILE_FORMAT_ADAPTERS = {
    # While zip is a streamable file format, performing streaming
    # reads requires that each entry in the zip has certain headers

@@ -41,6 +42,9 @@ module Ci
    raise NotSupportedAdapterError, 'This file format requires a dedicated adapter'
  end

  ::Gitlab::Ci::Artifacts::DecompressedArtifactSizeValidator
    .new(file: file, file_format: file_format.to_sym).validate!

  log_artifacts_filesize(file.model)

  file.open do |stream|
@@ -3,19 +3,6 @@
module Exportable
  extend ActiveSupport::Concern

  def readable_records(association, current_user: nil)
    association_records = try(association)
    return unless association_records.present?

    if has_many_association?(association)
      DeclarativePolicy.user_scope do
        association_records.select { |record| readable_record?(record, current_user) }
      end
    else
      readable_record?(association_records, current_user) ? association_records : nil
    end
  end

  def exportable_association?(association, current_user: nil)
    return false unless respond_to?(association)
    return true if has_many_association?(association)

@@ -30,8 +17,17 @@ module Exportable
    exportable_restricted_associations & keys
  end

  def has_many_association?(association_name)
    self.class.reflect_on_association(association_name)&.macro == :has_many
  def to_authorized_json(keys_to_authorize, current_user, options)
    modified_options = filtered_associations_opts(options, keys_to_authorize)
    record_hash = as_json(modified_options).with_indifferent_access

    keys_to_authorize.each do |key|
      next unless record_hash.key?(key)

      record_hash[key] = authorized_association_records(key, current_user, options)
    end

    record_hash.to_json
  end

  private

@@ -47,4 +43,47 @@ module Exportable
    record.readable_by?(user)
  end
end

  def has_many_association?(association_name)
    self.class.reflect_on_association(association_name)&.macro == :has_many
  end

  def readable_records(association, current_user: nil)
    association_records = try(association)
    return unless association_records.present?

    if has_many_association?(association)
      DeclarativePolicy.user_scope do
        association_records.select { |record| readable_record?(record, current_user) }
      end
    else
      readable_record?(association_records, current_user) ? association_records : nil
    end
  end

  def authorized_association_records(key, current_user, options)
    records = readable_records(key, current_user: current_user)
    empty_assoc = has_many_association?(key) ? [] : nil
    return empty_assoc unless records.present?

    assoc_opts = association_options(key, options)&.dig(key)
    records.as_json(assoc_opts)
  end

  def filtered_associations_opts(options, associations)
    options_copy = options.deep_dup

    associations.each do |key|
      assoc_opts = association_options(key, options_copy)
      next unless assoc_opts

      assoc_opts[key] = { only: [:id] }
    end

    options_copy
  end

  def association_options(key, options)
    options[:include].find { |assoc| assoc.key?(key) }
  end
end
@@ -27,6 +27,7 @@ module Issuable
  include ClosedAtFilterable
  include VersionedDescription
  include SortableTitle
  include Exportable

  TITLE_LENGTH_MAX = 255
  TITLE_HTML_LENGTH_MAX = 800

@@ -226,6 +227,10 @@ module Issuable
    issuable_severity&.severity || IssuableSeverity::DEFAULT
  end

  def exportable_restricted_associations
    super + [:notes]
  end

  private

  def validate_description_length?
@@ -135,6 +135,7 @@ class WebHook < ApplicationRecord

    return if url_variables_were.blank? || interpolated_url_was == interpolated_url

    self.url_variables = {} if url_variables_were.keys.intersection(url_variables.keys).any?
    self.url_variables = {} if url_changed? && url_variables_were.to_a.intersection(url_variables.to_a).any?
  end
@@ -25,7 +25,6 @@ class Issue < ApplicationRecord
  include FromUnion
  include EachBatch
  include PgFullTextSearchable
  include Exportable

  extend ::Gitlab::Utils::Override
@@ -13,6 +13,7 @@ class Label < ApplicationRecord
  cache_markdown_field :description, pipeline: :single_line

  DEFAULT_COLOR = ::Gitlab::Color.of('#6699cc')
  DESCRIPTION_LENGTH_MAX = 512.kilobytes

  attribute :color, ::Gitlab::Database::Type::Color.new, default: DEFAULT_COLOR

@@ -31,6 +32,10 @@ class Label < ApplicationRecord
  validates :title, uniqueness: { scope: [:group_id, :project_id] }
  validates :title, length: { maximum: 255 }

  # we validate the description against DESCRIPTION_LENGTH_MAX only for labels being created and on updates if
  # the description changes to avoid breaking the existing labels which may have their descriptions longer
  validates :description, bytesize: { maximum: -> { DESCRIPTION_LENGTH_MAX } }, if: :validate_description_length?

  default_scope { order(title: :asc) } # rubocop:disable Cop/DefaultScope

  scope :templates, -> { where(template: true, type: [Label.name, nil]) }

@@ -277,6 +282,16 @@ class Label < ApplicationRecord

  private

  def validate_description_length?
    return false unless description_changed?

    previous_description = changes['description'].first
    # previous_description will be nil for new records
    return true if previous_description.blank?

    previous_description.bytesize <= DESCRIPTION_LENGTH_MAX || description.bytesize > previous_description.bytesize
  end

  def issues_count(user, params = {})
    params.merge!(subject_foreign_key => subject.id, label_name: title, scope: 'all')
    IssuesFinder.new(user, params.with_indifferent_access).execute.count
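Editor's note: a rough, hypothetical illustration of when the new description validation above fires. `Label`, `DESCRIPTION_LENGTH_MAX`, and `512.kilobytes` come from the hunk; the records themselves are made up.

    # Illustrative only; not part of the diff.
    label = Label.new(title: 'bug', description: 'a' * (512.kilobytes + 1))
    label.valid?
    # => false: new records always validate the description bytesize

    legacy = Label.find_by(title: 'legacy-label') # hypothetical label created before the limit
    legacy.valid?
    # => true while its (possibly oversized) description is left unchanged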
@@ -117,12 +117,14 @@ class ProjectTeam
    owners.include?(user)
  end

  def import(source_project, current_user = nil)
  def import(source_project, current_user)
    target_project = project

    source_members = source_project.project_members.to_a
    target_user_ids = target_project.project_members.pluck_user_ids

    importer_access_level = max_member_access(current_user.id)

    source_members.reject! do |member|
      # Skip if user already present in team
      !member.invite? && target_user_ids.include?(member.user_id)

@@ -132,6 +134,8 @@ class ProjectTeam
    new_member = member.dup
    new_member.id = nil
    new_member.source = target_project
    # So that a maintainer cannot import a member with owner access
    new_member.access_level = [new_member.access_level, importer_access_level].min
    new_member.created_by = current_user
    new_member
  end
@@ -1538,7 +1538,9 @@ class User < ApplicationRecord
  # rubocop: enable CodeReuse/ServiceClass

  def primary_email_verified?
    confirmed? && !temp_oauth_email?
    return false unless confirmed? && !temp_oauth_email?

    !email_changed? || new_record?
  end

  def accept_pending_invitations!
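Editor's note: a sketch, not from the diff, of the behaviour change in `primary_email_verified?` above; the user record is hypothetical.

    # Illustrative only; not part of the diff.
    user = User.find_by(username: 'alice') # hypothetical confirmed user
    user.primary_email_verified?           # => true
    user.email = 'new@example.com'         # unsaved email change
    user.primary_email_verified?           # => false until the change is saved and confirmed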
@@ -0,0 +1,8 @@
---
name: lfs_batch_direct_downloads
introduced_by_url: https://gitlab.com/gitlab-org/gitlab/-/merge_requests/122221
rollout_issue_url: https://gitlab.com/gitlab-org/gitlab/-/issues/413684
milestone: '16.1'
type: development
group: group::tenant scale
default_enabled: false
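Editor's note: a minimal sketch of how this flag gates the LFS batch controller change earlier in the diff. `download_objects!` and `legacy_download_objects!` come from that hunk; the `Feature.enable` call is an assumption about typical rollout from a Rails console.

    # Illustrative only; not part of the diff.
    Feature.enable(:lfs_batch_direct_downloads, project) # opt a project into direct downloads

    if Feature.enabled?(:lfs_batch_direct_downloads, project)
      download_objects!        # pre-signed URLs, expiring after LFS_DIRECT_BATCH_EXPIRE_IN
    else
      legacy_download_objects! # proxied /gitlab-lfs/objects/:oid URLs
    end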
@@ -207,7 +207,7 @@ InitializerConnections.raise_if_new_database_connection do
  end

  # Spam reports
  resources :abuse_reports, only: [:new, :create] do
  resources :abuse_reports, only: [:create] do
    collection do
      post :add_category
    end
db/15_11_migration_fixes.txt (new file, 299 lines)
@ -0,0 +1,299 @@
|
|||
20211202041233
|
||||
20211202094944
|
||||
20211202135508
|
||||
20211202145237
|
||||
20211203091642
|
||||
20211203160952
|
||||
20211203161149
|
||||
20211203161840
|
||||
20211203161942
|
||||
20211204010826
|
||||
20211206073851
|
||||
20211206074547
|
||||
20211206161271
|
||||
20211207081708
|
||||
20211207090503
|
||||
20211207125331
|
||||
20211207135331
|
||||
20211207154413
|
||||
20211207154414
|
||||
20211207165508
|
||||
20211207173510
|
||||
20211207173511
|
||||
20211208111425
|
||||
20211208122200
|
||||
20211208122201
|
||||
20211208171402
|
||||
20211209093636
|
||||
20211209093828
|
||||
20211209093923
|
||||
20211209094222
|
||||
20211209103048
|
||||
20211209203820
|
||||
20211209203821
|
||||
20211209230042
|
||||
20211210025754
|
||||
20211210031721
|
||||
20211210140000
|
||||
20211210140629
|
||||
20211210173137
|
||||
20211213064821
|
||||
20211213102111
|
||||
20211213130324
|
||||
20211213142344
|
||||
20211213154259
|
||||
20211213154704
|
||||
20211214012507
|
||||
20211214110307
|
||||
20211215090620
|
||||
20211215182006
|
||||
20211216133107
|
||||
20211216134134
|
||||
20211216135651
|
||||
20211216220939
|
||||
20211217050753
|
||||
20211217120000
|
||||
20211217145923
|
||||
20211217174331
|
||||
20211220064757
|
||||
20211220120402
|
||||
20211220123956
|
||||
20211220174504
|
||||
20211223125921
|
||||
20211224112937
|
||||
20211224114539
|
||||
20211229023654
|
||||
20211230112517
|
||||
20211230113031
|
||||
20220104060049
|
||||
20220104174445
|
||||
20220105020514
|
||||
20220105082217
|
||||
20220105121325
|
||||
20220105152547
|
||||
20220105153149
|
||||
20220106111958
|
||||
20220106112043
|
||||
20220106112085
|
||||
20220106141756
|
||||
20220106163326
|
||||
20220106185033
|
||||
20220106230629
|
||||
20220106230712
|
||||
20220106231518
|
||||
20220106233459
|
||||
20220106235626
|
||||
20220107064845
|
||||
20220107091629
|
||||
20220107165036
|
||||
20220109133006
|
||||
20220109134455
|
||||
20220110170953
|
||||
20220110171049
|
||||
20220110224913
|
||||
20220110231420
|
||||
20220110233155
|
||||
20220111002756
|
||||
20220111023852
|
||||
20220111093534
|
||||
20220111095006
|
||||
20220111095007
|
||||
20220111101421
|
||||
20220111102314
|
||||
20220111154950
|
||||
20220111154951
|
||||
20220111200254
|
||||
20220111221516
|
||||
20220112015940
|
||||
20220112090556
|
||||
20220112115413
|
||||
20220112205111
|
||||
20220112230642
|
||||
20220112232037
|
||||
20220112232605
|
||||
20220112232723
|
||||
20220113013319
|
||||
20220113014438
|
||||
20220113015830
|
||||
20220113035519
|
||||
20220113040447
|
||||
20220113111440
|
||||
20220113125401
|
||||
20220113135449
|
||||
20220113135924
|
||||
20220113164801
|
||||
20220113164901
|
||||
20220114105525
|
||||
20220114131950
|
||||
20220116175851
|
||||
20220117034056
|
||||
20220117082611
|
||||
20220117225936
|
||||
20220118015633
|
||||
20220118020026
|
||||
20220118141950
|
||||
20220118155846
|
||||
20220118155847
|
||||
20220118155848
|
||||
20220118204039
|
||||
20220119094023
|
||||
20220119094503
|
||||
20220119141407
|
||||
20220119141736
|
||||
20220119143130
|
||||
20220119144253
|
||||
20220119144458
|
||||
20220119151221
|
||||
20220119153706
|
||||
20220119154442
|
||||
20220119170426
|
||||
20220119193130
|
||||
20220119201340
|
||||
20220119203119
|
||||
20220119220620
|
||||
20220120033115
|
||||
20220120085655
|
||||
20220120094340
|
||||
20220120123700
|
||||
20220120123800
|
||||
20220120160625
|
||||
20220120211831
|
||||
20220120211832
|
||||
20220121214752
|
||||
20220121214753
|
||||
20220121221651
|
||||
20220124130028
|
||||
20220124145019
|
||||
20220124151456
|
||||
20220124151949
|
||||
20220124152824
|
||||
20220124153233
|
||||
20220124153234
|
||||
20220124180704
|
||||
20220124180705
|
||||
20220124184338
|
||||
20220124200927
|
||||
20220124204046
|
||||
20220124214131
|
||||
20220124215857
|
||||
20220124221521
|
||||
20220125083520
|
||||
20220125084127
|
||||
20220125084348
|
||||
20220125122228
|
||||
20220125122640
|
||||
20220125122725
|
||||
20220125230538
|
||||
20220126191624
|
||||
20220126201752
|
||||
20220126202654
|
||||
20220126203421
|
||||
20220126210021
|
||||
20220126210022
|
||||
20220126210657
|
||||
20220127112243
|
||||
20220127112412
|
||||
20220127132200
|
||||
20220127132201
|
||||
20220128093756
|
||||
20220128103042
|
||||
20220128155251
|
||||
20220128155814
|
||||
20220128194722
|
||||
20220131000000
|
||||
20220131000001
|
||||
20220131135725
|
||||
20220131192643
|
||||
20220201034731
|
||||
20220201141705
|
||||
20220201173212
|
||||
20220201193033
|
||||
20220201205300
|
||||
20220202034409
|
||||
20220202105733
|
||||
20220202115350
|
||||
20220203074916
|
||||
20220203091304
|
||||
20220203123333
|
||||
20220203133652
|
||||
20220203134942
|
||||
20220204053655
|
||||
20220204093120
|
||||
20220204095121
|
||||
20220204110725
|
||||
20220204154220
|
||||
20220204193000
|
||||
20220204194347
|
||||
20220207080758
|
||||
20220207083129
|
||||
20220208080921
|
||||
20220208115439
|
||||
20220208170445
|
||||
20220208171826
|
||||
20220209111007
|
||||
20220211090920
|
||||
20220211125954
|
||||
20220211214605
|
||||
20220212120735
|
||||
20220213100000
|
||||
20220213103859
|
||||
20220213104531
|
||||
20220215164709
|
||||
20220215190020
|
||||
20220216110023
|
||||
20220216201949
|
||||
20220217100008
|
||||
20220217113058
|
||||
20220217135229
|
||||
20220221102333
|
||||
20220221214928
|
||||
20220222072536
|
||||
20220222191845
|
||||
20220222192524
|
||||
20220222192525
|
||||
20220223112304
|
||||
20220223124428
|
||||
20220224000000
|
||||
20220224204415
|
||||
20220225133705
|
||||
20220301002101
|
||||
20220301003502
|
||||
20220301091503
|
||||
20220301093434
|
||||
20220301175104
|
||||
20220301175426
|
||||
20220302110724
|
||||
20220302114046
|
||||
20220302203410
|
||||
20220303190555
|
||||
20220303191047
|
||||
20220304052335
|
||||
20220304061631
|
||||
20220304062107
|
||||
20220304152729
|
||||
20220304165107
|
||||
20220304201847
|
||||
20220305223212
|
||||
20220307192534
|
||||
20220307192610
|
||||
20220307192645
|
||||
20220307192725
|
||||
20220307203458
|
||||
20220307203459
|
||||
20220308000205
|
||||
20220308115219
|
||||
20220308115502
|
||||
20220309084838
|
||||
20220309084954
|
||||
20220309100648
|
||||
20220309154855
|
||||
20220310011530
|
||||
20220310011613
|
||||
20220310095341
|
||||
20220310101118
|
||||
20220310134207
|
||||
20220310141349
|
||||
20220311010352
|
||||
20220314094841
|
||||
20220314154235
|
||||
20220314162342
|
|
@@ -15434,7 +15434,7 @@ four standard [pagination arguments](#connection-pagination-arguments):

Represents vulnerable project counts for each grade.

Returns [`[VulnerableProjectsByGrade!]!`](#vulnerableprojectsbygrade).
Returns [`[VulnerableProjectsByGrade!]`](#vulnerableprojectsbygrade).

###### Arguments
@@ -2381,6 +2381,11 @@ curl --request DELETE --header "PRIVATE-TOKEN: <your_access_token>" "https://git

Import members from another project.

If the importing member's role in the target project is:

- Maintainer, then members with the Owner role in the source project are imported with the Maintainer role.
- Owner, then members with the Owner role in the source project are imported with the Owner role.

```plaintext
POST /projects/:id/import_project_members/:project_id
```
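Editor's note: a hypothetical call to the endpoint documented above, using Ruby's standard library. The host, token, and project IDs are placeholders, not values from the diff.

    # Illustrative only; not part of the diff.
    require 'net/http'
    require 'uri'

    uri = URI('https://gitlab.example.com/api/v4/projects/5/import_project_members/7')
    request = Net::HTTP::Post.new(uri)
    request['PRIVATE-TOKEN'] = '<your_access_token>'

    response = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) do |http|
      http.request(request)
    end
    puts response.code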
@@ -200,6 +200,11 @@ Prerequisite:

- You must have the Maintainer or Owner role.

If the importing member's role in the target project is:

- Maintainer, then members with the Owner role in the source project are imported with the Maintainer role.
- Owner, then members with the Owner role in the source project are imported with the Owner role.

To import users:

1. On the top bar, select **Main menu > Projects** and find your project.
@@ -27,6 +27,11 @@ module API
  end

  helpers do
    params :package_name do
      requires :package_name, type: String, file_path: true, desc: 'Package name',
        documentation: { example: 'mypackage' }
    end

    def redirect_or_present_audit_report
      redirect_registry_request(
        forward_to_registry: true,

@@ -172,7 +177,7 @@ module API
    tags %w[npm_packages]
  end
  params do
    requires :package_name, type: String, desc: 'Package name'
    use :package_name
  end
  route_setting :authentication, job_token_allowed: true, deploy_token_allowed: true
  get '*package_name', format: false, requirements: ::API::Helpers::Packages::Npm::NPM_ENDPOINT_REQUIREMENTS do
@@ -55,7 +55,14 @@ module API
  end

  expose :moved_to_id
  expose :service_desk_reply_to
  expose :service_desk_reply_to do |issue|
    issue.present(
      current_user: options[:current_user],
      # We need to pass it explicitly to account for the case where `issue`
      # is a `WorkItem` which doesn't have a presenter yet.
      presenter_class: IssuePresenter
    ).service_desk_reply_to
  end
end
end
end
@@ -17,19 +17,21 @@ module Banzai
  # encoded and will therefore not interfere with the detection of the dollar syntax.

  # Corresponds to the "$...$" syntax
  DOLLAR_INLINE_PATTERN = %r{
    (?<matched>\$(?<math>(?:\S[^$\n]*?\S|[^$\s]))\$)(?:[^\d]|$)
  }x.freeze
  DOLLAR_INLINE_UNTRUSTED =
    '(?P<matched>\$(?P<math>(?:\S[^$\n]*?\S|[^$\s]))\$)(?:[^\d]|$)'
  DOLLAR_INLINE_UNTRUSTED_REGEX =
    Gitlab::UntrustedRegexp.new(DOLLAR_INLINE_UNTRUSTED, multiline: false)

  # Corresponds to the "$$...$$" syntax
  DOLLAR_DISPLAY_INLINE_PATTERN = %r{
    (?<matched>\$\$\ *(?<math>[^$\n]+?)\ *\$\$)
  }x.freeze
  DOLLAR_DISPLAY_INLINE_UNTRUSTED =
    '(?P<matched>\$\$\ *(?P<math>[^$\n]+?)\ *\$\$)'
  DOLLAR_DISPLAY_INLINE_UNTRUSTED_REGEX =
    Gitlab::UntrustedRegexp.new(DOLLAR_DISPLAY_INLINE_UNTRUSTED, multiline: false)

  # Order dependent. Handle the `$$` syntax before the `$` syntax
  DOLLAR_MATH_PIPELINE = [
    { pattern: DOLLAR_DISPLAY_INLINE_PATTERN, style: :display },
    { pattern: DOLLAR_INLINE_PATTERN, style: :inline }
    { pattern: DOLLAR_DISPLAY_INLINE_UNTRUSTED_REGEX, style: :display },
    { pattern: DOLLAR_INLINE_UNTRUSTED_REGEX, style: :inline }
  ].freeze

  # Do not recognize math inside these tags

@@ -46,16 +48,18 @@ module Banzai
  next if has_ancestor?(node, IGNORED_ANCESTOR_TAGS)

  node_html = node.to_html
  next unless node_html.match?(DOLLAR_INLINE_PATTERN) ||
    node_html.match?(DOLLAR_DISPLAY_INLINE_PATTERN)
  next unless DOLLAR_INLINE_UNTRUSTED_REGEX.match?(node_html) ||
    DOLLAR_DISPLAY_INLINE_UNTRUSTED_REGEX.match?(node_html)

  temp_doc = Nokogiri::HTML.fragment(node_html)

  DOLLAR_MATH_PIPELINE.each do |pipeline|
    temp_doc.xpath('child::text()').each do |temp_node|
      html = temp_node.to_html
      temp_node.content.scan(pipeline[:pattern]).each do |matched, math|
        html.sub!(matched, math_html(math: math, style: pipeline[:style]))

      pipeline[:pattern].scan(temp_node.content).each do |match|
        math = pipeline[:pattern].extract_named_group(:math, match)
        html.sub!(match.first, math_html(math: math, style: pipeline[:style]))
      end

      temp_node.replace(html)
@@ -6,13 +6,13 @@ module Banzai
  def call
    lang_mapping = Gitlab::FrontMatter::DELIM_LANG

    html.sub(Gitlab::FrontMatter::PATTERN) do |_match|
      lang = $~[:lang].presence || lang_mapping[$~[:delim]]
    Gitlab::FrontMatter::PATTERN_UNTRUSTED_REGEX.replace_gsub(html) do |match|
      lang = match[:lang].presence || lang_mapping[match[:delim]]

      before = $~[:before]
      before = "\n#{before}" if $~[:encoding].presence
      before = match[:before]
      before = "\n#{before}" if match[:encoding].presence

      "#{before}```#{lang}:frontmatter\n#{$~[:front_matter]}```\n"
      "#{before}```#{lang}:frontmatter\n#{match[:front_matter]}```\n"
    end
  end
end
@@ -6,6 +6,14 @@ module Banzai
  class InlineDiffFilter < HTML::Pipeline::Filter
    IGNORED_ANCESTOR_TAGS = %w(pre code tt).to_set

    INLINE_DIFF_DELETION_UNTRUSTED = '(?:\[\-(.*?)\-\]|\{\-(.*?)\-\})'
    INLINE_DIFF_DELETION_UNTRUSTED_REGEX =
      Gitlab::UntrustedRegexp.new(INLINE_DIFF_DELETION_UNTRUSTED, multiline: false)

    INLINE_DIFF_ADDITION_UNTRUSTED = '(?:\[\+(.*?)\+\]|\{\+(.*?)\+\})'
    INLINE_DIFF_ADDITION_UNTRUSTED_REGEX =
      Gitlab::UntrustedRegexp.new(INLINE_DIFF_ADDITION_UNTRUSTED, multiline: false)

    def call
      doc.xpath('descendant-or-self::text()').each do |node|
        next if has_ancestor?(node, IGNORED_ANCESTOR_TAGS)

@@ -21,8 +29,13 @@ module Banzai
  end

  def inline_diff_filter(text)
    html_content = text.gsub(/(?:\[\-(.*?)\-\]|\{\-(.*?)\-\})/, '<span class="idiff left right deletion">\1\2</span>')
    html_content.gsub(/(?:\[\+(.*?)\+\]|\{\+(.*?)\+\})/, '<span class="idiff left right addition">\1\2</span>')
    html_content = INLINE_DIFF_DELETION_UNTRUSTED_REGEX.replace_gsub(text) do |match|
      %(<span class="idiff left right deletion">#{match[1]}#{match[2]}</span>)
    end

    INLINE_DIFF_ADDITION_UNTRUSTED_REGEX.replace_gsub(html_content) do |match|
      %(<span class="idiff left right addition">#{match[1]}#{match[2]}</span>)
    end
  end
end
end
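Editor's note: a rough illustration of the transformation the rewritten `inline_diff_filter` above performs; the output is shown approximately, not verbatim.

    # Illustrative only; not part of the diff.
    inline_diff_filter('Use {-foo-} then {+bar+}')
    # => roughly:
    #    'Use <span class="idiff left right deletion">foo</span> then
    #     <span class="idiff left right addition">bar</span>'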
@@ -29,15 +29,19 @@ module Banzai
  @references_per_parent[parent_type] ||= begin
    refs = Hash.new { |hash, key| hash[key] = Set.new }

    prepare_doc_for_scan.to_enum(:scan, regex).each do
      parent_path = if parent_type == :project
                      full_project_path($~[:namespace], $~[:project])
                    else
                      full_group_path($~[:group])
                    end
    [filter.object_class.link_reference_pattern, filter.object_class.reference_pattern].each do |pattern|
      next unless pattern

      ident = filter.identifier($~)
      refs[parent_path] << ident if ident
      prepare_doc_for_scan.to_enum(:scan, pattern).each do
        parent_path = if parent_type == :project
                        full_project_path($~[:namespace], $~[:project])
                      else
                        full_group_path($~[:group])
                      end

        ident = filter.identifier($~)
        refs[parent_path] << ident if ident
      end
    end

    refs

@@ -171,15 +175,6 @@ module Banzai

  delegate :project, :group, :parent, :parent_type, to: :filter

  def regex
    strong_memoize(:regex) do
      [
        filter.object_class.link_reference_pattern,
        filter.object_class.reference_pattern
      ].compact.reduce { |a, b| Regexp.union(a, b) }
    end
  end

  def refs_cache
    Gitlab::SafeRequestStore["banzai_#{parent_type}_refs".to_sym] ||= {}
  end
@@ -80,6 +80,7 @@ module BitbucketServer
  state: state,
  title: title,
  source_branch_name: source_branch_name,
  source_branch_sha: source_branch_sha,
  target_branch_name: target_branch_name,
  target_branch_sha: target_branch_sha
}
@@ -157,11 +157,11 @@ module ExtractsRef
  end

  def ambiguous_ref?(project, ref)
    return false unless ref
    return true if project.repository.ambiguous_ref?(ref)
    return false unless ref.starts_with?(%r{(refs|heads|tags)/})

    return false unless ref&.starts_with?('refs/')

    unprefixed_ref = ref.sub(%r{^refs/(heads|tags)/}, '')
    unprefixed_ref = ref.sub(%r{^(refs/)?(heads|tags)/}, '')
    project.repository.commit(unprefixed_ref).present?
  end
end
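Editor's note: a loose sketch of what the reworked check treats as ambiguous. The behaviour is inferred from the lines above, so treat it as an assumption rather than documented semantics.

    # Illustrative only; not part of the diff.
    ambiguous_ref?(project, 'refs/heads/main') # true when stripping the prefix still resolves to a commit
    ambiguous_ref?(project, 'heads/main')      # now also checked: the (refs/)?(heads|tags)/ prefix is stripped
    ambiguous_ref?(project, 'main')            # plain names fall back to repository.ambiguous_ref?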
@@ -3,16 +3,16 @@
  Dear GitLab user,

%p
  As part of our commitment to keeping GitLab secure, we have identified and addressed a vulnerability in GitLab that allowed some users to bypass the email verification process in a #{link_to("recent security release", "https://about.gitlab.com/releases/2020/05/27/security-release-13-0-1-released", target: '_blank')}.
  As part of our commitment to keeping GitLab secure, we have identified and addressed a vulnerability in GitLab that allowed some users to bypass the email verification process in a #{link_to('recent security release', 'https://about.gitlab.com/releases/2020/05/27/security-release-13-0-1-released', target: '_blank')}.

%p
  As a precautionary measure, you will need to re-verify some of your account's email addresses before continuing to use GitLab. Sorry for the inconvenience!

%p
  We have already sent the re-verification email with a subject line of "Confirmation instructions" from #{@verification_from_mail}. Please feel free to contribute any questions or comments to #{link_to("this issue", "https://gitlab.com/gitlab-com/www-gitlab-com/-/issues/7942", target: '_blank')}.
  We have already sent the re-verification email with a subject line of 'Confirmation instructions' from #{@verification_from_mail}. Please feel free to contribute any questions or comments to #{link_to('this issue', 'https://gitlab.com/gitlab-com/www-gitlab-com/-/issues/7942', target: '_blank')}.

%p
  If you are not "#{@user.username}", please #{link_to 'report this to our administrator', new_abuse_report_url(user_id: @user.id)}
  If you are not "#{@user.username}", please report abuse from the user's #{link_to('profile page', user_url(@user.id), target: '_blank', rel: 'noopener noreferrer')}. #{link_to('Learn more.', help_page_url('user/report_abuse', anchor: 'report-abuse-from-the-users-profile-page', target: '_blank', rel: 'noopener noreferrer'))}

%p
  Thank you for being a GitLab user!
|
|||
We have already sent the re-verification email with a subject line of "Confirmation instructions" from <%= @verification_from_mail %>.
|
||||
Please feel free to contribute any questions or comments to this issue: https://gitlab.com/gitlab-com/www-gitlab-com/-/issues/7942
|
||||
|
||||
If you are not "<%= @user.username %>", please report this to our administrator. Report link: <%= new_abuse_report_url(user_id: @user.id) %>
|
||||
If you are not "<%= @user.username %>", please report abuse from the user's profile page: <%= user_url(@user.id) %>.
|
||||
|
||||
Learn more: <%= help_page_url('user/report_abuse', anchor: 'report-abuse-from-the-users-profile-page') %>
|
||||
|
||||
Thank you for being a GitLab user!
|
||||
|
|
|
@@ -10,7 +10,8 @@ module Gitlab
  'Only a project maintainer or owner can delete a protected tag.',
  delete_protected_tag_non_web: 'You can only delete protected tags using the web interface.',
  create_protected_tag: 'You are not allowed to create this tag as it is protected.',
  default_branch_collision: 'You cannot use default branch name to create a tag'
  default_branch_collision: 'You cannot use default branch name to create a tag',
  prohibited_tag_name: 'You cannot create a tag with a prohibited pattern.'
}.freeze

LOG_MESSAGES = {

@@ -29,11 +30,20 @@ module Gitlab
  end

  default_branch_collision_check
  prohibited_tag_checks
  protected_tag_checks
end

private

def prohibited_tag_checks
  return if deletion?

  if tag_name.start_with?("refs/tags/") # rubocop: disable Style/GuardClause
    raise GitAccess::ForbiddenError, ERROR_MESSAGES[:prohibited_tag_name]
  end
end

def protected_tag_checks
  logger.log_timed(LOG_MESSAGES[__method__]) do
    return unless ProtectedTag.protected?(project, tag_name) # rubocop:disable Cop/AvoidReturnFromBlocks
@@ -0,0 +1,60 @@
# frozen_string_literal: true

module Gitlab
  module Ci
    module Artifacts
      class DecompressedArtifactSizeValidator
        DEFAULT_MAX_BYTES = 4.gigabytes.freeze

        FILE_FORMAT_VALIDATORS = {
          gzip: ::Gitlab::Ci::DecompressedGzipSizeValidator
        }.freeze

        FileDecompressionError = Class.new(StandardError)

        def initialize(file:, file_format:, max_bytes: DEFAULT_MAX_BYTES)
          @file = file
          @file_path = file&.path
          @file_format = file_format
          @max_bytes = max_bytes
        end

        def validate!
          validator_class = FILE_FORMAT_VALIDATORS[file_format.to_sym]

          return if file_path.nil?
          return if validator_class.nil?

          if file.respond_to?(:object_store) && file.object_store == ObjectStorage::Store::REMOTE
            return if valid_on_storage?(validator_class)
          elsif validator_class.new(archive_path: file_path, max_bytes: max_bytes).valid?
            return
          end

          raise(FileDecompressionError, 'File decompression error')
        end

        private

        attr_reader :file_path, :file, :file_format, :max_bytes

        def valid_on_storage?(validator_class)
          temp_filename = "#{SecureRandom.uuid}.gz"

          is_valid = false
          Tempfile.open(temp_filename, '/tmp') do |tempfile|
            tempfile.binmode
            ::Faraday.get(file.url) do |req|
              req.options.on_data = proc { |chunk, _| tempfile.write(chunk) }
            end

            is_valid = validator_class.new(archive_path: tempfile.path, max_bytes: max_bytes).valid?
            tempfile.unlink
          end

          is_valid
        end
      end
    end
  end
end
@@ -7,7 +7,7 @@ module Gitlab
  ##
  # Entry that represents a list of include.
  #
  class Includes < ::Gitlab::Config::Entry::Node
  class Includes < ::Gitlab::Config::Entry::ComposableArray
    include ::Gitlab::Config::Entry::Validatable

    validations do

@@ -23,16 +23,8 @@
    end
  end

  def self.aspects
    super.append -> do
      @config = Array.wrap(@config)

      @config.each_with_index do |config, i|
        @entries[i] = ::Gitlab::Config::Entry::Factory.new(Entry::Include)
          .value(config || {})
          .create!
      end
    end
  def composable_class
    Entry::Include
  end
end
lib/gitlab/ci/decompressed_gzip_size_validator.rb (new file, 79 lines)
@@ -0,0 +1,79 @@
# frozen_string_literal: true

module Gitlab
  module Ci
    class DecompressedGzipSizeValidator
      DEFAULT_MAX_BYTES = 4.gigabytes.freeze
      TIMEOUT_LIMIT = 210.seconds

      ServiceError = Class.new(StandardError)

      def initialize(archive_path:, max_bytes: DEFAULT_MAX_BYTES)
        @archive_path = archive_path
        @max_bytes = max_bytes
      end

      def valid?
        validate
      end

      private

      def validate
        pgrps = nil
        valid_archive = true

        validate_archive_path

        Timeout.timeout(TIMEOUT_LIMIT) do
          stderr_r, stderr_w = IO.pipe
          stdout, wait_threads = Open3.pipeline_r(*command, pgroup: true, err: stderr_w)

          # When validation is performed on a small archive (e.g. 100 bytes)
          # `wait_thr` finishes before we can get process group id. Do not
          # raise exception in this scenario.
          pgrps = wait_threads.map do |wait_thr|
            Process.getpgid(wait_thr[:pid])
          rescue Errno::ESRCH
            nil
          end
          pgrps.compact!

          status = wait_threads.last.value

          if status.success?
            result = stdout.readline

            valid_archive = false if result.to_i > max_bytes
          else
            valid_archive = false
          end

        ensure
          stdout.close
          stderr_w.close
          stderr_r.close
        end

        valid_archive
      rescue StandardError
        pgrps.each { |pgrp| Process.kill(-1, pgrp) } if pgrps

        false
      end

      def validate_archive_path
        Gitlab::Utils.check_path_traversal!(archive_path)

        raise(ServiceError, 'Archive path is a symlink') if File.lstat(archive_path).symlink?
        raise(ServiceError, 'Archive path is not a file') unless File.file?(archive_path)
      end

      def command
        [['gzip', '-dc', archive_path], ['wc', '-c']]
      end

      attr_reader :archive_path, :max_bytes
    end
  end
end
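Editor's note: an illustrative call to the new validator; the archive path is made up. This is how the DecompressedArtifactSizeValidator earlier in the diff drives it for gzip artifacts.

    # Illustrative only; not part of the diff.
    Gitlab::Ci::DecompressedGzipSizeValidator
      .new(archive_path: '/tmp/artifacts.gz', max_bytes: 4.gigabytes)
      .valid?
    # => false when the decompressed size exceeds max_bytes, or on timeout/path errors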
@@ -8,15 +8,35 @@ module Gitlab
  ';;;' => 'json'
}.freeze

DELIM = Regexp.union(DELIM_LANG.keys)
DELIM_UNTRUSTED = "(?:#{Gitlab::FrontMatter::DELIM_LANG.keys.map { |x| RE2::Regexp.escape(x) }.join('|')})".freeze

PATTERN = %r{
  \A(?<encoding>[^\r\n]*coding:[^\r\n]*\R)? # optional encoding line
  (?<before>\s*)
  ^(?<delim>#{DELIM})[ \t]*(?<lang>\S*)\R # opening front matter marker (optional language specifier)
  (?<front_matter>.*?) # front matter block content (not greedy)
  ^(\k<delim> | \.{3}) # closing front matter marker
  [^\S\r\n]*(\R|\z)
}mx.freeze
# Original pattern:
# \A(?<encoding>[^\r\n]*coding:[^\r\n]*\R)? # optional encoding line
# (?<before>\s*)
# ^(?<delim>#{DELIM})[ \t]*(?<lang>\S*)\R # opening front matter marker (optional language specifier)
# (?<front_matter>.*?) # front matter block content (not greedy)
# ^(\k<delim> | \.{3}) # closing front matter marker
# [^\S\r\n]*(\R|\z)
# rubocop:disable Style/StringConcatenation
# rubocop:disable Style/LineEndConcatenation
PATTERN_UNTRUSTED =
  # optional encoding line
  "\\A(?P<encoding>[^\\r\\n]*coding:[^\\r\\n]*#{::Gitlab::UntrustedRegexp::BACKSLASH_R})?" +
  '(?P<before>\s*)' +

  # opening front matter marker (optional language specifier)
  "^(?P<delim>#{DELIM_UNTRUSTED})[ \\t]*(?P<lang>\\S*)#{::Gitlab::UntrustedRegexp::BACKSLASH_R}" +

  # front matter block content (not greedy)
  '(?P<front_matter>(?:\n|.)*?)' +

  # closing front matter marker
  "^((?P<delim_closing>#{DELIM_UNTRUSTED})|\\.{3})" +
  "[^\\S\\r\\n]*(#{::Gitlab::UntrustedRegexp::BACKSLASH_R}|\\z)"
# rubocop:enable Style/LineEndConcatenation
# rubocop:enable Style/StringConcatenation

PATTERN_UNTRUSTED_REGEX =
  Gitlab::UntrustedRegexp.new(PATTERN_UNTRUSTED, multiline: true)
end
end
@@ -121,29 +121,10 @@ module Gitlab
  def authorized_record_json(record, options)
    include_keys = options[:include].flat_map(&:keys)
    keys_to_authorize = record.try(:restricted_associations, include_keys)

    return record.to_json(options) if keys_to_authorize.blank?

    record_hash = record.as_json(options).with_indifferent_access
    filtered_record_hash(record, keys_to_authorize, record_hash).to_json(options)
  end

  def filtered_record_hash(record, keys_to_authorize, record_hash)
    keys_to_authorize.each do |key|
      next unless record_hash[key].present?

      readable = record.try(:readable_records, key, current_user: current_user)
      if record.has_many_association?(key)
        readable_ids = readable.pluck(:id)

        record_hash[key].keep_if do |association_record|
          readable_ids.include?(association_record[:id])
        end
      else
        record_hash[key] = nil unless readable.present?
      end
    end

    record_hash
    record.to_authorized_json(keys_to_authorize, current_user, options)
  end

  def batch(relation, key)
@@ -6,7 +6,8 @@ module Gitlab
  include Sidekiq::ServerMiddleware

  def call(worker, job, queue)
    logger.info "arguments: #{Gitlab::Json.dump(job['args'])}"
    loggable_args = Gitlab::ErrorTracking::Processor::SidekiqProcessor.loggable_arguments(job['args'], job['class'])
    logger.info "arguments: #{Gitlab::Json.dump(loggable_args)}"
    yield
  end
end
@@ -13,6 +13,10 @@ module Gitlab
  class UntrustedRegexp
    require_dependency 're2'

    # recreate Ruby's \R metacharacter
    # https://ruby-doc.org/3.2.2/Regexp.html#class-Regexp-label-Character+Classes
    BACKSLASH_R = '(\n|\v|\f|\r|\x{0085}|\x{2028}|\x{2029}|\r\n)'

    delegate :===, :source, to: :regexp

    def initialize(pattern, multiline: false)
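Editor's note: a minimal sketch of the RE2-backed wrapper that the Banzai filters in this diff switch to. Only methods that appear in the diff (`new`, `match?`, `replace_gsub`, `?P<name>` captures) are used; the pattern and input are made up.

    # Illustrative only; not part of the diff.
    pattern = Gitlab::UntrustedRegexp.new('(?P<word>\w+)', multiline: false)

    pattern.match?('hello world') # => true; RE2 matching avoids catastrophic backtracking
    pattern.replace_gsub('hello world') do |match|
      match[:word].upcase # named captures use the ?P<name> syntax
    end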
@@ -53,7 +53,7 @@ module Gitlab
  include Gitlab::Utils::StrongMemoize

  def initialize(delim = nil, lang = '', text = nil)
    @lang = lang.downcase.presence || Gitlab::FrontMatter::DELIM_LANG[delim]
    @lang = lang&.downcase.presence || Gitlab::FrontMatter::DELIM_LANG[delim]
    @text = text&.strip!
  end

@@ -109,11 +109,17 @@ module Gitlab

  def parse_front_matter_block
    wiki_content.match(Gitlab::FrontMatter::PATTERN) { |m| Block.new(m[:delim], m[:lang], m[:front_matter]) } || Block.new
    if match = Gitlab::FrontMatter::PATTERN_UNTRUSTED_REGEX.match(wiki_content)
      Block.new(match[:delim], match[:lang], match[:front_matter])
    else
      Block.new
    end
  end

  def strip_front_matter_block
    wiki_content.gsub(Gitlab::FrontMatter::PATTERN, '')
    Gitlab::FrontMatter::PATTERN_UNTRUSTED_REGEX.replace_gsub(wiki_content) do
      ''
    end
  end
end
end
lib/tasks/gitlab/db/migration_fix_15_11.rake (new file, 30 lines)
@@ -0,0 +1,30 @@
# frozen_string_literal: true

desc 'db | migration_fix_15_11'
task migration_fix_15_11: [:environment] do
  next if Gitlab.com?

  only_on = %i[main ci].select { |db| Gitlab::Database.has_database?(db) }
  Gitlab::Database::EachDatabase.each_database_connection(only: only_on) do |conn, database|
    begin
      first_migration = conn.execute('SELECT * FROM schema_migrations ORDER BY version ASC LIMIT 1')
    rescue ActiveRecord::StatementInvalid
      # Uninitialized DB, skip
      next
    end
    next if first_migration.none? # No migrations have been run yet
    # If we are affected, the first migration in the schema_migrations table
    # will be 20220314184009
    next unless first_migration.first['version'] == '20220314184009'

    puts "Running 15.11 migration fix for #{database}"
    fixes = File.readlines(Rails.root.join('db/15_11_migration_fixes.txt')).map(&:chomp)
    conn.transaction do
      fixes.each do |version|
        conn.execute("INSERT INTO schema_migrations (version) VALUES ('#{version}')")
      end
    end
  end
end

Rake::Task['db:migrate'].enhance(['migration_fix_15_11'])
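Editor's note: because of the `enhance` call above, the fix runs as a prerequisite of `db:migrate`. As an assumption (not shown in the diff), it could also be invoked directly:

    # Illustrative only; not part of the diff.
    Rake::Task['migration_fix_15_11'].invoke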
@@ -55,12 +55,13 @@ RSpec.describe Admin::HooksController do
  hook.update!(url_variables: { 'foo' => 'bar', 'baz' => 'woo' })

  hook_params = {
    url: 'http://example.com/{baz}?token={token}',
    url: 'http://example.com/{bar}?token={token}',
    enable_ssl_verification: false,
    url_variables: [
      { key: 'token', value: 'some secret value' },
      { key: 'baz', value: 'qux' },
      { key: 'foo', value: nil }
      { key: 'baz', value: nil },
      { key: 'foo', value: nil },
      { key: 'bar', value: 'qux' }
    ]
  }

@@ -72,7 +73,7 @@ RSpec.describe Admin::HooksController do
  expect(flash[:notice]).to include('was updated')
  expect(hook).to have_attributes(hook_params.except(:url_variables))
  expect(hook).to have_attributes(
    url_variables: { 'token' => 'some secret value', 'baz' => 'qux' }
    url_variables: { 'token' => 'some secret value', 'bar' => 'qux' }
  )
end
end
@@ -2,7 +2,7 @@

require 'spec_helper'

RSpec.describe Projects::CompareController do
RSpec.describe Projects::CompareController, feature_category: :source_code_management do
  include ProjectForksHelper

  using RSpec::Parameterized::TableSyntax

@@ -211,6 +211,36 @@ RSpec.describe Projects::CompareController do
  end
end

context 'when the target project is the default source but hidden to the user' do
  let(:project) { create(:project, :repository, :private) }
  let(:from_ref) { 'improve%2Fmore-awesome' }
  let(:to_ref) { 'feature' }
  let(:whitespace) { nil }

  let(:request_params) do
    {
      namespace_id: project.namespace,
      project_id: project,
      from: from_ref,
      to: to_ref,
      w: whitespace,
      page: page,
      straight: straight
    }
  end

  it 'does not show the diff' do
    allow(controller).to receive(:source_project).and_return(project)
    expect(project).to receive(:default_merge_request_target).and_return(private_fork)

    show_request

    expect(response).to be_successful
    expect(assigns(:diffs)).to be_empty
    expect(assigns(:commits)).to be_empty
  end
end

context 'when the source ref does not exist' do
  let(:from_project_id) { nil }
  let(:from_ref) { 'non-existent-source-ref' }
@ -193,37 +193,78 @@ RSpec.describe ProjectsController, feature_category: :projects do
|
|||
end
|
||||
end
|
||||
|
||||
context 'when the default branch name can resolve to another ref' do
|
||||
let!(:project_with_default_branch) do
|
||||
create(:project, :public, :custom_repo, files: ['somefile']).tap do |p|
|
||||
p.repository.create_branch("refs/heads/refs/heads/#{other_ref}", 'master')
|
||||
p.change_head("refs/heads/#{other_ref}")
|
||||
end.reload
|
||||
context 'when the default branch name is ambiguous' do
|
||||
let_it_be(:project_with_default_branch) do
|
||||
create(:project, :public, :custom_repo, files: ['somefile'])
|
||||
end
|
||||
|
||||
let(:other_ref) { 'branch-name' }
|
||||
|
||||
context 'but there is no other ref' do
|
||||
it 'responds with ok' do
|
||||
get :show, params: { namespace_id: project_with_default_branch.namespace, id: project_with_default_branch }
|
||||
expect(response).to be_ok
|
||||
end
|
||||
end
|
||||
|
||||
context 'and that other ref exists' do
|
||||
let(:tree_with_default_branch) do
|
||||
branch = project_with_default_branch.repository.find_branch(project_with_default_branch.default_branch)
|
||||
project_tree_path(project_with_default_branch, branch.target)
|
||||
end
|
||||
shared_examples 'ambiguous ref redirects' do
|
||||
let(:project) { project_with_default_branch }
|
||||
let(:branch_ref) { "refs/heads/#{ref}" }
|
||||
let(:repo) { project.repository }
|
||||
|
||||
before do
|
||||
project_with_default_branch.repository.create_branch(other_ref, 'master')
|
||||
repo.create_branch(branch_ref, 'master')
|
||||
repo.change_head(ref)
|
||||
end
|
||||
|
||||
it 'redirects to tree view for the default branch' do
|
||||
get :show, params: { namespace_id: project_with_default_branch.namespace, id: project_with_default_branch }
|
||||
expect(response).to redirect_to(tree_with_default_branch)
|
||||
after do
|
||||
repo.change_head('master')
|
||||
repo.delete_branch(branch_ref)
|
||||
end
|
||||
|
||||
subject do
|
||||
get(
|
||||
:show,
|
||||
params: {
|
||||
namespace_id: project.namespace,
|
||||
id: project
|
||||
}
|
||||
)
|
||||
end
|
||||
|
||||
context 'when there is no conflicting ref' do
|
||||
let(:other_ref) { 'non-existent-ref' }
|
||||
|
||||
it { is_expected.to have_gitlab_http_status(:ok) }
|
||||
end
|
||||
|
||||
context 'and that other ref exists' do
|
||||
let(:other_ref) { 'master' }
|
||||
|
||||
let(:project_default_root_tree_path) do
|
||||
sha = repo.find_branch(project.default_branch).target
|
||||
project_tree_path(project, sha)
|
||||
end
|
||||
|
||||
it 'redirects to tree view for the default branch' do
|
||||
is_expected.to redirect_to(project_default_root_tree_path)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
context 'when ref starts with ref/heads/' do
|
||||
let(:ref) { "refs/heads/#{other_ref}" }
|
||||
|
||||
include_examples 'ambiguous ref redirects'
|
||||
end
|
||||
|
||||
context 'when ref starts with ref/tags/' do
|
||||
let(:ref) { "refs/tags/#{other_ref}" }
|
||||
|
||||
include_examples 'ambiguous ref redirects'
|
||||
end
|
||||
|
||||
context 'when ref starts with heads/' do
|
||||
let(:ref) { "heads/#{other_ref}" }
|
||||
|
||||
include_examples 'ambiguous ref redirects'
|
||||
end
|
||||
|
||||
context 'when ref starts with tags/' do
|
||||
let(:ref) { "tags/#{other_ref}" }
|
||||
|
||||
include_examples 'ambiguous ref redirects'
|
||||
end
|
||||
end
|
||||
end
|
||||
|
|
|
@@ -20,6 +20,10 @@ FactoryBot.define do
      public_email { email }
    end

    trait :notification_email do
      notification_email { email }
    end

    trait :private_profile do
      private_profile { true }
    end
@@ -501,7 +501,9 @@ RSpec.describe 'Admin::Users::User', feature_category: :user_management do
    end

    context 'when user has an unconfirmed email', :js do
      let(:unconfirmed_user) { create(:user, :unconfirmed) }
      # Email address contains HTML to ensure email address is displayed in an HTML safe way.
      let_it_be(:unconfirmed_email) { "#{generate(:email)}<h2>testing<img/src=http://localhost:8000/test.png>" }
      let_it_be(:unconfirmed_user) { create(:user, :unconfirmed, unconfirmed_email: unconfirmed_email) }

      where(:path_helper) do
        [

@@ -521,7 +523,9 @@ RSpec.describe 'Admin::Users::User', feature_category: :user_management do

      within_modal do
        expect(page).to have_content("Confirm user #{unconfirmed_user.name}?")
        expect(page).to have_content('This user has an unconfirmed email address. You may force a confirmation.')
        expect(page).to have_content(
          "This user has an unconfirmed email address (#{unconfirmed_email}). You may force a confirmation."
        )

        click_button 'Confirm user'
      end
|
@@ -67,6 +67,21 @@ describe('SingleFileDiff', () => {
    expect(mock.history.get.length).toBe(1);
  });

  it('ignores user-defined diff path attributes', () => {
    setHTMLFixture(`
      <div class="diff-file">
        <div class="diff-content">
          <div class="diff-viewer" data-type="simple">
            <div class="note-text"><a data-diff-for-path="test/note/path">Test note</a></div>
            <div data-diff-for-path="${blobDiffPath}">MOCK CONTENT</div>
          </div>
        </div>
      </div>
    `);
    const { diffForPath } = new SingleFileDiff(document.querySelector('.diff-file'));
    expect(diffForPath).toEqual(blobDiffPath);
  });

  it('does not load diffs via axios for already expanded diffs', async () => {
    setHTMLFixture(`
      <div class="diff-file">
@@ -153,10 +153,18 @@ RSpec.describe MergeRequestsHelper, feature_category: :code_review_workflow do
  end

  describe '#merge_request_source_branch' do
    let_it_be(:project) { create(:project) }
    let(:forked_project) { fork_project(project) }
    let(:merge_request_forked) { create(:merge_request, source_project: forked_project, target_project: project) }
    let(:malicious_branch_name) { 'name<script>test</script>' }
    let(:project) { create(:project) }
    let(:merge_request) { create(:merge_request, source_project: project, target_project: project) }
    let(:forked_project) { fork_project(project) }
    let(:merge_request_forked) do
      create(
        :merge_request,
        source_project: forked_project,
        source_branch: malicious_branch_name,
        target_project: project
      )
    end

    context 'when merge request is a fork' do
      subject { merge_request_source_branch(merge_request_forked) }

@@ -164,6 +172,10 @@ RSpec.describe MergeRequestsHelper, feature_category: :code_review_workflow do
      it 'does show the fork icon' do
        expect(subject).to match(/fork/)
      end

      it 'escapes properly' do
        expect(subject).to include(html_escape(malicious_branch_name))
      end
    end

    context 'when merge request is not a fork' do
spec/lib/api/entities/issue_spec.rb (new file, 47 lines)
@@ -0,0 +1,47 @@
# frozen_string_literal: true

require 'spec_helper'

RSpec.describe ::API::Entities::Issue, feature_category: :team_planning do
  let_it_be(:project) { create(:project) }
  let(:issue) { build_stubbed(:issue, project: project) }
  let(:current_user) { build_stubbed(:user) }
  let(:options) { { current_user: current_user }.merge(option_addons) }
  let(:option_addons) { {} }
  let(:entity) { described_class.new(issue, options) }

  subject(:json) { entity.as_json }

  describe '#service_desk_reply_to', feature_category: :service_desk do
    # Setting to true (default) doesn't play nice with stubs
    let(:option_addons) { { include_subscribed: false } }
    let(:issue) { build_stubbed(:issue, project: project, service_desk_reply_to: email) }
    let(:email) { 'creator@example.com' }
    let(:role) { :developer }

    subject { json[:service_desk_reply_to] }

    context 'as developer' do
      before do
        stub_member_access_level(issue.project, developer: current_user)
      end

      it { is_expected.to eq(email) }
    end

    context 'as guest' do
      before do
        stub_member_access_level(issue.project, guest: current_user)
      end

      it { is_expected.to eq('cr*****@e*****.c**') }
    end

    context 'without email' do
      let(:email) { nil }

      specify { expect(json).to have_key(:service_desk_reply_to) }
      it { is_expected.to eq(nil) }
    end
  end
end
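A short sketch of what the new entity spec above asserts, reusing its option names; the obfuscated value shown for guests is taken directly from the spec (the developer and guest users are illustrative):

# include_subscribed: false mirrors the spec's option_addons.
API::Entities::Issue.new(issue, current_user: developer, include_subscribed: false)
  .as_json[:service_desk_reply_to] # => "creator@example.com"

API::Entities::Issue.new(issue, current_user: guest, include_subscribed: false)
  .as_json[:service_desk_reply_to] # => "cr*****@e*****.c**"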
|
@ -189,19 +189,29 @@ RSpec.describe Banzai::Filter::FrontMatterFilter, feature_category: :team_planni
|
|||
end
|
||||
end
|
||||
|
||||
it 'fails fast for strings with many spaces' do
|
||||
content = "coding:" + " " * 50_000 + ";"
|
||||
describe 'protects against malicious backtracking' do
|
||||
it 'fails fast for strings with many spaces' do
|
||||
content = "coding:" + " " * 50_000 + ";"
|
||||
|
||||
expect do
|
||||
Timeout.timeout(3.seconds) { filter(content) }
|
||||
end.not_to raise_error
|
||||
end
|
||||
expect do
|
||||
Timeout.timeout(3.seconds) { filter(content) }
|
||||
end.not_to raise_error
|
||||
end
|
||||
|
||||
it 'fails fast for strings with many newlines' do
|
||||
content = "coding:\n" + ";;;" + "\n" * 10_000 + "x"
|
||||
it 'fails fast for strings with many newlines' do
|
||||
content = "coding:\n" + ";;;" + "\n" * 10_000 + "x"
|
||||
|
||||
expect do
|
||||
Timeout.timeout(3.seconds) { filter(content) }
|
||||
end.not_to raise_error
|
||||
expect do
|
||||
Timeout.timeout(3.seconds) { filter(content) }
|
||||
end.not_to raise_error
|
||||
end
|
||||
|
||||
it 'fails fast for strings with many `coding:`' do
|
||||
content = "coding:" * 120_000 + "\n" * 80_000 + ";"
|
||||
|
||||
expect do
|
||||
Timeout.timeout(3.seconds) { filter(content) }
|
||||
end.not_to raise_error
|
||||
end
|
||||
end
|
||||
end
|
||||
|
|
|
@@ -67,4 +67,12 @@ RSpec.describe Banzai::Filter::InlineDiffFilter do
    doc = "<tt>START {+something added+} END</tt>"
    expect(filter(doc).to_html).to eq(doc)
  end

  it 'protects against malicious backtracking' do
    doc = '[-{-' * 250_000

    expect do
      Timeout.timeout(3.seconds) { filter(doc) }
    end.not_to raise_error
  end
end
@@ -215,6 +215,14 @@ RSpec.describe Banzai::Filter::MathFilter, feature_category: :team_planning do
    expect(doc.search('.js-render-math').count).to eq(2)
  end

  it 'protects against malicious backtracking' do
    doc = pipeline_filter("$$#{' ' * 1_000_000}$")

    expect do
      Timeout.timeout(3.seconds) { filter(doc) }
    end.not_to raise_error
  end

  def pipeline_filter(text)
    context = { project: nil, no_sourcepos: true }
    doc = Banzai::Pipeline::PreProcessPipeline.call(text, {})
@@ -115,6 +115,7 @@ RSpec.describe BitbucketServer::Representation::PullRequest, feature_category: :
      author: "root",
      description: "Test",
      source_branch_name: "refs/heads/root/CODE_OF_CONDUCTmd-1530600625006",
      source_branch_sha: "074e2b4dddc5b99df1bf9d4a3f66cfc15481fdc8",
      target_branch_name: "refs/heads/master",
      target_branch_sha: "839fa9a2d434eb697815b8fcafaecc51accfdbbc",
      title: "Added a new line"
@@ -7,6 +7,6 @@ RSpec.describe Gitlab::BackgroundMigration::Mailers::UnconfirmMailer do
  let(:subject) { described_class.unconfirm_notification_email(user) }

  it 'contains abuse report url' do
    expect(subject.body.encoded).to include(Rails.application.routes.url_helpers.new_abuse_report_url(user_id: user.id))
    expect(subject.body.encoded).to include(Gitlab::Routing.url_helpers.user_url(user.id))
  end
end
|
@ -2,7 +2,7 @@
|
|||
|
||||
require 'spec_helper'
|
||||
|
||||
RSpec.describe Gitlab::Checks::TagCheck do
|
||||
RSpec.describe Gitlab::Checks::TagCheck, feature_category: :source_code_management do
|
||||
include_context 'change access checks context'
|
||||
|
||||
describe '#validate!' do
|
||||
|
@ -14,6 +14,29 @@ RSpec.describe Gitlab::Checks::TagCheck do
|
|||
expect { subject.validate! }.to raise_error(Gitlab::GitAccess::ForbiddenError, 'You are not allowed to change existing tags on this project.')
|
||||
end
|
||||
|
||||
context "prohibited tags check" do
|
||||
it "prohibits tag names that include refs/tags/ at the head" do
|
||||
allow(subject).to receive(:tag_name).and_return("refs/tags/foo")
|
||||
|
||||
expect { subject.validate! }.to raise_error(Gitlab::GitAccess::ForbiddenError, "You cannot create a tag with a prohibited pattern.")
|
||||
end
|
||||
|
||||
it "doesn't prohibit a nested refs/tags/ string in a tag name" do
|
||||
allow(subject).to receive(:tag_name).and_return("fix-for-refs/tags/foo")
|
||||
|
||||
expect { subject.validate! }.not_to raise_error
|
||||
end
|
||||
|
||||
context "deleting a refs/tags headed tag" do
|
||||
let(:newrev) { "0000000000000000000000000000000000000000" }
|
||||
let(:ref) { "refs/tags/refs/tags/267208abfe40e546f5e847444276f7d43a39503e" }
|
||||
|
||||
it "doesn't prohibit the deletion of a refs/tags/ tag name" do
|
||||
expect { subject.validate! }.not_to raise_error
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
context 'with protected tag' do
|
||||
let!(:protected_tag) { create(:protected_tag, project: project, name: 'v*') }
|
||||
|
||||
|
|
|
@ -0,0 +1,95 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
require 'spec_helper'
|
||||
|
||||
RSpec.describe Gitlab::Ci::Artifacts::DecompressedArtifactSizeValidator, feature_category: :build_artifacts do
|
||||
include WorkhorseHelpers
|
||||
|
||||
let_it_be(:file_path) { File.join(Dir.tmpdir, 'decompressed_archive_size_validator_spec.gz') }
|
||||
let(:file) { File.open(file_path) }
|
||||
let(:file_format) { :gzip }
|
||||
let(:max_bytes) { 20 }
|
||||
let(:gzip_valid?) { true }
|
||||
let(:validator) { instance_double(::Gitlab::Ci::DecompressedGzipSizeValidator, valid?: gzip_valid?) }
|
||||
|
||||
before(:all) do
|
||||
Zlib::GzipWriter.open(file_path) do |gz|
|
||||
gz.write('Hello World!')
|
||||
end
|
||||
end
|
||||
|
||||
after(:all) do
|
||||
FileUtils.rm(file_path)
|
||||
end
|
||||
|
||||
before do
|
||||
allow(::Gitlab::Ci::DecompressedGzipSizeValidator)
|
||||
.to receive(:new)
|
||||
.and_return(validator)
|
||||
end
|
||||
|
||||
subject { described_class.new(file: file, file_format: file_format, max_bytes: max_bytes) }
|
||||
|
||||
shared_examples 'when file does not exceed allowed compressed size' do
|
||||
let(:gzip_valid?) { true }
|
||||
|
||||
it 'passes validation' do
|
||||
expect { subject.validate! }.not_to raise_error
|
||||
end
|
||||
end
|
||||
|
||||
shared_examples 'when file exceeds allowed decompressed size' do
|
||||
let(:gzip_valid?) { false }
|
||||
|
||||
it 'raises an exception' do
|
||||
expect { subject.validate! }
|
||||
.to raise_error(Gitlab::Ci::Artifacts::DecompressedArtifactSizeValidator::FileDecompressionError)
|
||||
end
|
||||
end
|
||||
|
||||
describe '#validate!' do
|
||||
it_behaves_like 'when file does not exceed allowed compressed size'
|
||||
|
||||
it_behaves_like 'when file exceeds allowed decompressed size'
|
||||
end
|
||||
|
||||
context 'when file is not provided' do
|
||||
let(:file) { nil }
|
||||
|
||||
it 'passes validation' do
|
||||
expect { subject.validate! }.not_to raise_error
|
||||
end
|
||||
end
|
||||
|
||||
context 'when the file is located in the cloud' do
|
||||
let(:remote_path) { File.join(remote_store_path, remote_id) }
|
||||
|
||||
let(:file_url) { "http://s3.amazonaws.com/#{remote_path}" }
|
||||
let(:file) do
|
||||
instance_double(JobArtifactUploader,
|
||||
path: file_path,
|
||||
url: file_url,
|
||||
object_store: ObjectStorage::Store::REMOTE)
|
||||
end
|
||||
|
||||
let(:remote_id) { 'generated-remote-id-12345' }
|
||||
let(:remote_store_path) { ObjectStorage::TMP_UPLOAD_PATH }
|
||||
|
||||
before do
|
||||
stub_request(:get, %r{s3.amazonaws.com/#{remote_path}})
|
||||
.to_return(status: 200, body: File.read('spec/fixtures/build.env.gz'))
|
||||
end
|
||||
|
||||
it_behaves_like 'when file does not exceed allowed compressed size'
|
||||
|
||||
it_behaves_like 'when file exceeds allowed decompressed size'
|
||||
end
|
||||
|
||||
context 'when file_format is not on the list' do
|
||||
let_it_be(:file_format) { 'rar' }
|
||||
|
||||
it 'passes validation' do
|
||||
expect { subject.validate! }.not_to raise_error
|
||||
end
|
||||
end
|
||||
end
|
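A hedged usage sketch for the validator exercised in the spec above, keeping to the constructor keywords and error class that appear there (the file object and byte limit are illustrative):

validator = Gitlab::Ci::Artifacts::DecompressedArtifactSizeValidator.new(
  file: artifact.file,    # illustrative artifact file
  file_format: :gzip,
  max_bytes: 20.megabytes # illustrative limit
)
validator.validate! # raises FileDecompressionError when the decompressed size exceeds max_bytes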
spec/lib/gitlab/ci/config/entry/includes_spec.rb (new file, 16 lines)
@@ -0,0 +1,16 @@
# frozen_string_literal: true

require 'fast_spec_helper'
require_dependency 'active_model'

RSpec.describe ::Gitlab::Ci::Config::Entry::Includes, feature_category: :pipeline_composition do
  subject(:include_entry) { described_class.new(config) }

  describe '#initialize' do
    let(:config) { 'test.yml' }

    it 'does not increase aspects' do
      2.times { expect { described_class.new(config) }.not_to change { described_class.aspects.count } }
    end
  end
end
127
spec/lib/gitlab/ci/decompressed_gzip_size_validator_spec.rb
Normal file
127
spec/lib/gitlab/ci/decompressed_gzip_size_validator_spec.rb
Normal file
|
@ -0,0 +1,127 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
require 'spec_helper'
|
||||
|
||||
RSpec.describe Gitlab::Ci::DecompressedGzipSizeValidator, feature_category: :importers do
|
||||
let_it_be(:filepath) { File.join(Dir.tmpdir, 'decompressed_gzip_size_validator_spec.gz') }
|
||||
|
||||
before(:all) do
|
||||
create_compressed_file
|
||||
end
|
||||
|
||||
after(:all) do
|
||||
FileUtils.rm(filepath)
|
||||
end
|
||||
|
||||
subject { described_class.new(archive_path: filepath, max_bytes: max_bytes) }
|
||||
|
||||
describe '#valid?' do
|
||||
let(:max_bytes) { 20 }
|
||||
|
||||
context 'when file does not exceed allowed decompressed size' do
|
||||
it 'returns true' do
|
||||
expect(subject.valid?).to eq(true)
|
||||
end
|
||||
|
||||
context 'when the waiter thread no longer exists due to being terminated or crashing' do
|
||||
it 'gracefully handles the absence of the waiter without raising exception' do
|
||||
allow(Process).to receive(:getpgid).and_raise(Errno::ESRCH)
|
||||
|
||||
expect(subject.valid?).to eq(true)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
context 'when file exceeds allowed decompressed size' do
|
||||
let(:max_bytes) { 1 }
|
||||
|
||||
it 'returns false' do
|
||||
expect(subject.valid?).to eq(false)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when exception occurs during header readings' do
|
||||
shared_examples 'raises exception and terminates validator process group' do
|
||||
let(:std) { instance_double(IO, close: nil) }
|
||||
let(:wait_thr) { double }
|
||||
let(:wait_threads) { [wait_thr, wait_thr] }
|
||||
|
||||
before do
|
||||
allow(Process).to receive(:getpgid).and_return(2)
|
||||
allow(Open3).to receive(:pipeline_r).and_return([std, wait_threads])
|
||||
allow(wait_thr).to receive(:[]).with(:pid).and_return(1)
|
||||
allow(wait_thr).to receive(:value).and_raise(exception)
|
||||
end
|
||||
|
||||
it 'terminates validator process group' do
|
||||
expect(Process).to receive(:kill).with(-1, 2).twice
|
||||
expect(subject.valid?).to eq(false)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when timeout occurs' do
|
||||
let(:exception) { Timeout::Error }
|
||||
|
||||
include_examples 'raises exception and terminates validator process group'
|
||||
end
|
||||
|
||||
context 'when exception occurs' do
|
||||
let(:error_message) { 'Error!' }
|
||||
let(:exception) { StandardError.new(error_message) }
|
||||
|
||||
include_examples 'raises exception and terminates validator process group'
|
||||
end
|
||||
end
|
||||
|
||||
describe 'archive path validation' do
|
||||
let(:filesize) { nil }
|
||||
|
||||
context 'when archive path is traversed' do
|
||||
let(:filepath) { '/foo/../bar' }
|
||||
|
||||
it 'does not pass validation' do
|
||||
expect(subject.valid?).to eq(false)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
context 'when archive path is not a string' do
|
||||
let(:filepath) { 123 }
|
||||
|
||||
it 'returns false' do
|
||||
expect(subject.valid?).to eq(false)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when archive path is a symlink' do
|
||||
let(:filepath) { File.join(Dir.tmpdir, 'symlink') }
|
||||
|
||||
before do
|
||||
FileUtils.ln_s(filepath, filepath, force: true)
|
||||
end
|
||||
|
||||
it 'returns false' do
|
||||
expect(subject.valid?).to eq(false)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when archive path is not a file' do
|
||||
let(:filepath) { Dir.mktmpdir }
|
||||
let(:filesize) { File.size(filepath) }
|
||||
|
||||
after do
|
||||
FileUtils.rm_rf(filepath)
|
||||
end
|
||||
|
||||
it 'returns false' do
|
||||
expect(subject.valid?).to eq(false)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
def create_compressed_file
|
||||
Zlib::GzipWriter.open(filepath) do |gz|
|
||||
gz.write('Hello World!')
|
||||
end
|
||||
end
|
||||
end
|
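The spec above also covers the lower-level gzip size check; a minimal sketch of its call shape, with an illustrative path and limit (constructor keywords and valid? are taken from the spec):

Gitlab::Ci::DecompressedGzipSizeValidator
  .new(archive_path: '/tmp/artifact.gz', max_bytes: 100.megabytes)
  .valid? # => false for traversed paths, symlinks, non-files, or archives above the limit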
|
@ -7,6 +7,8 @@ RSpec.describe Gitlab::ImportExport::Project::TreeSaver, :with_license, feature_
|
|||
let_it_be(:exportable_path) { 'project' }
|
||||
let_it_be(:user) { create(:user) }
|
||||
let_it_be(:group) { create(:group) }
|
||||
let_it_be(:private_project) { create(:project, :private, group: group) }
|
||||
let_it_be(:private_mr) { create(:merge_request, source_project: private_project, project: private_project) }
|
||||
let_it_be(:project) { setup_project }
|
||||
|
||||
shared_examples 'saves project tree successfully' do
|
||||
|
@ -118,6 +120,13 @@ RSpec.describe Gitlab::ImportExport::Project::TreeSaver, :with_license, feature_
|
|||
expect(reviewer).not_to be_nil
|
||||
expect(reviewer['user_id']).to eq(user.id)
|
||||
end
|
||||
|
||||
it 'has merge requests system notes' do
|
||||
system_notes = subject.first['notes'].select { |note| note['system'] }
|
||||
|
||||
expect(system_notes.size).to eq(1)
|
||||
expect(system_notes.first['note']).to eq('merged')
|
||||
end
|
||||
end
|
||||
|
||||
context 'with snippets' do
|
||||
|
@ -492,6 +501,9 @@ RSpec.describe Gitlab::ImportExport::Project::TreeSaver, :with_license, feature_
|
|||
create(:milestone, project: project)
|
||||
discussion_note = create(:discussion_note, noteable: issue, project: project)
|
||||
mr_note = create(:note, noteable: merge_request, project: project)
|
||||
create(:system_note, noteable: merge_request, project: project, author: user, note: 'merged')
|
||||
private_system_note = "mentioned in merge request #{private_mr.to_reference(project)}"
|
||||
create(:system_note, noteable: merge_request, project: project, author: user, note: private_system_note)
|
||||
create(:note, noteable: snippet, project: project)
|
||||
create(:note_on_commit,
|
||||
author: user,
|
||||
|
|
|
@ -36,6 +36,21 @@ RSpec.describe Ci::Artifactable do
|
|||
expect { |b| artifact.each_blob(&b) }.to yield_control.exactly(3).times
|
||||
end
|
||||
end
|
||||
|
||||
context 'when decompressed artifact size validator fails' do
|
||||
let(:artifact) { build(:ci_job_artifact, :junit) }
|
||||
|
||||
before do
|
||||
allow_next_instance_of(Gitlab::Ci::DecompressedGzipSizeValidator) do |instance|
|
||||
allow(instance).to receive(:valid?).and_return(false)
|
||||
end
|
||||
end
|
||||
|
||||
it 'fails on blob' do
|
||||
expect { |b| artifact.each_blob(&b) }
|
||||
.to raise_error(::Gitlab::Ci::Artifacts::DecompressedArtifactSizeValidator::FileDecompressionError)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
context 'when file format is raw' do
|
||||
|
|
|
@ -9,6 +9,7 @@ RSpec.describe Exportable, feature_category: :importers do
|
|||
let_it_be(:issue) { create(:issue, project: project, milestone: milestone) }
|
||||
let_it_be(:note1) { create(:system_note, project: project, noteable: issue) }
|
||||
let_it_be(:note2) { create(:system_note, project: project, noteable: issue) }
|
||||
let_it_be(:options) { { include: [{ notes: { only: [:note] }, milestone: { only: :title } }] } }
|
||||
|
||||
let_it_be(:model_klass) do
|
||||
Class.new(ApplicationRecord) do
|
||||
|
@ -28,19 +29,27 @@ RSpec.describe Exportable, feature_category: :importers do
|
|||
|
||||
subject { model_klass.new }
|
||||
|
||||
describe '.readable_records' do
|
||||
describe '.to_authorized_json' do
|
||||
let_it_be(:model_record) { model_klass.new }
|
||||
|
||||
context 'when model does not respond to association name' do
|
||||
it 'returns nil' do
|
||||
expect(subject.readable_records(:foo, current_user: user)).to be_nil
|
||||
context 'when key to authorize is not an association name' do
|
||||
it 'returns string without given key' do
|
||||
expect(subject.to_authorized_json([:foo], user, options)).not_to include('foo')
|
||||
end
|
||||
end
|
||||
|
||||
context 'when model does respond to association name' do
|
||||
context 'when key to authorize is an association name' do
|
||||
let(:key_to_authorize) { :notes }
|
||||
|
||||
subject(:record_json) { model_record.to_authorized_json([key_to_authorize], user, options) }
|
||||
|
||||
context 'when there are no records' do
|
||||
it 'returns nil' do
|
||||
expect(model_record.readable_records(:notes, current_user: user)).to be_nil
|
||||
before do
|
||||
allow(model_record).to receive(:notes).and_return(Note.none)
|
||||
end
|
||||
|
||||
it 'returns string including the empty association' do
|
||||
expect(record_json).to include("\"notes\":[]")
|
||||
end
|
||||
end
|
||||
|
||||
|
@ -57,8 +66,9 @@ RSpec.describe Exportable, feature_category: :importers do
|
|||
end
|
||||
end
|
||||
|
||||
it 'returns collection of readable records' do
|
||||
expect(model_record.readable_records(:notes, current_user: user)).to contain_exactly(note1, note2)
|
||||
it 'returns string containing all records' do
|
||||
expect(record_json)
|
||||
.to include("\"notes\":[{\"note\":\"#{note1.note}\"},{\"note\":\"#{note2.note}\"}]")
|
||||
end
|
||||
end
|
||||
|
||||
|
@ -70,8 +80,19 @@ RSpec.describe Exportable, feature_category: :importers do
|
|||
end
|
||||
end
|
||||
|
||||
it 'returns collection of readable records' do
|
||||
expect(model_record.readable_records(:notes, current_user: user)).to eq([])
|
||||
it 'returns string including the empty association' do
|
||||
expect(record_json).to include("\"notes\":[]")
|
||||
end
|
||||
end
|
||||
|
||||
context 'when user can read some records' do
|
||||
before do
|
||||
allow(model_record).to receive(:readable_records).with(:notes, current_user: user)
|
||||
.and_return([note1])
|
||||
end
|
||||
|
||||
it 'returns string containing readable records only' do
|
||||
expect(record_json).to include("\"notes\":[{\"note\":\"#{note1.note}\"}]")
|
||||
end
|
||||
end
|
||||
end
|
||||
|
@ -87,13 +108,15 @@ RSpec.describe Exportable, feature_category: :importers do
|
|||
it 'calls #readable_by?' do
|
||||
expect(note1).to receive(:readable_by?).with(user)
|
||||
|
||||
model_record.readable_records(:notes, current_user: user)
|
||||
record_json
|
||||
end
|
||||
end
|
||||
|
||||
context 'with single relation' do
|
||||
let(:key_to_authorize) { :milestone }
|
||||
|
||||
before do
|
||||
allow(model_record).to receive(:try).with(:milestone).and_return(issue.milestone)
|
||||
allow(model_record).to receive(:milestone).and_return(issue.milestone)
|
||||
end
|
||||
|
||||
context 'when user can read the record' do
|
||||
|
@ -101,8 +124,8 @@ RSpec.describe Exportable, feature_category: :importers do
|
|||
allow(milestone).to receive(:readable_by?).with(user).and_return(true)
|
||||
end
|
||||
|
||||
it 'returns collection of readable records' do
|
||||
expect(model_record.readable_records(:milestone, current_user: user)).to eq(milestone)
|
||||
it 'returns string including association' do
|
||||
expect(record_json).to include("\"milestone\":{\"title\":\"#{milestone.title}\"}")
|
||||
end
|
||||
end
|
||||
|
||||
|
@ -111,8 +134,8 @@ RSpec.describe Exportable, feature_category: :importers do
|
|||
allow(milestone).to receive(:readable_by?).with(user).and_return(false)
|
||||
end
|
||||
|
||||
it 'returns collection of readable records' do
|
||||
expect(model_record.readable_records(:milestone, current_user: user)).to be_nil
|
||||
it 'returns string with null association' do
|
||||
expect(record_json).to include("\"milestone\":null")
|
||||
end
|
||||
end
|
||||
end
|
||||
|
@ -211,26 +234,4 @@ RSpec.describe Exportable, feature_category: :importers do
|
|||
end
|
||||
end
|
||||
end
|
||||
|
||||
describe '.has_many_association?' do
|
||||
let(:model_associations) { [:notes, :labels] }
|
||||
|
||||
context 'when association type is `has_many`' do
|
||||
it 'returns true' do
|
||||
expect(subject.has_many_association?(:notes)).to eq(true)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when association type is `has_one`' do
|
||||
it 'returns true' do
|
||||
expect(subject.has_many_association?(:milestone)).to eq(false)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when association type is `belongs_to`' do
|
||||
it 'returns true' do
|
||||
expect(subject.has_many_association?(:project)).to eq(false)
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
|
|
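A hedged sketch of the Exportable API the rewritten spec above targets: to_authorized_json serializes only the associations the given user can read. Keys and options mirror the spec; the issue and user objects are illustrative:

issue.to_authorized_json([:notes], current_user, include: [{ notes: { only: [:note] } }])
# => JSON string whose "notes" array contains only the notes readable by current_user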
|
@ -1081,4 +1081,22 @@ RSpec.describe Issuable do
|
|||
end
|
||||
end
|
||||
end
|
||||
|
||||
context 'with exportable associations' do
|
||||
let_it_be(:project) { create(:project, group: create(:group, :private)) }
|
||||
|
||||
context 'for issues' do
|
||||
let_it_be_with_reload(:resource) { create(:issue, project: project) }
|
||||
|
||||
it_behaves_like 'an exportable'
|
||||
end
|
||||
|
||||
context 'for merge requests' do
|
||||
let_it_be_with_reload(:resource) do
|
||||
create(:merge_request, source_project: project, project: project)
|
||||
end
|
||||
|
||||
it_behaves_like 'an exportable'
|
||||
end
|
||||
end
|
||||
end
|
||||
|
|
|
@@ -258,6 +258,13 @@ RSpec.describe WebHook, feature_category: :integrations do
      expect(hook.url_variables).to eq({})
    end

    it 'resets url variables if url variables are overwritten' do
      hook.url_variables = hook.url_variables.merge('abc' => 'baz')

      expect(hook).not_to be_valid
      expect(hook.url_variables).to eq({})
    end

    it 'does not reset url variables if both url and url variables are changed' do
      hook.url = 'http://example.com/{one}/{two}'
      hook.url_variables = { 'one' => 'foo', 'two' => 'bar' }
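A short sketch of the WebHook behaviour asserted above, with illustrative values: overwriting url_variables without also changing the URL invalidates the hook and resets the variables, so both must be updated together:

hook.url_variables = { 'abc' => 'baz' } # variables changed without the URL
hook.valid?                             # => false, url_variables reset to {}

hook.url = 'http://example.com/{one}/{two}' # change both together
hook.url_variables = { 'one' => 'foo', 'two' => 'bar' }
# per the example title above, the variables are kept in this case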
@ -44,6 +44,122 @@ RSpec.describe Label do
|
|||
is_expected.to allow_value("customer's request").for(:title)
|
||||
is_expected.to allow_value('s' * 255).for(:title)
|
||||
end
|
||||
|
||||
describe 'description length' do
|
||||
let(:invalid_description) { 'x' * (::Label::DESCRIPTION_LENGTH_MAX + 1) }
|
||||
let(:valid_description) { 'short description' }
|
||||
let(:label) { build(:label, project: project, description: description) }
|
||||
|
||||
let(:error_message) do
|
||||
format(
|
||||
_('is too long (%{size}). The maximum size is %{max_size}.'),
|
||||
size: ActiveSupport::NumberHelper.number_to_human_size(invalid_description.bytesize),
|
||||
max_size: ActiveSupport::NumberHelper.number_to_human_size(::Label::DESCRIPTION_LENGTH_MAX)
|
||||
)
|
||||
end
|
||||
|
||||
subject(:validate) { label.validate }
|
||||
|
||||
context 'when label is a new record' do
|
||||
context 'when description exceeds the maximum size' do
|
||||
let(:description) { invalid_description }
|
||||
|
||||
it 'adds a description too long error' do
|
||||
validate
|
||||
|
||||
expect(label.errors[:description]).to contain_exactly(error_message)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when description is within the allowed limits' do
|
||||
let(:description) { valid_description }
|
||||
|
||||
it 'does not add a validation error' do
|
||||
validate
|
||||
|
||||
expect(label.errors).not_to have_key(:description)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
context 'when label is an existing record' do
|
||||
before do
|
||||
label.description = existing_description
|
||||
label.save!(validate: false)
|
||||
label.description = description
|
||||
end
|
||||
|
||||
context 'when record already had a valid description' do
|
||||
let(:existing_description) { 'small difference so it triggers description_changed?' }
|
||||
|
||||
context 'when new description exceeds the maximum size' do
|
||||
let(:description) { invalid_description }
|
||||
|
||||
it 'adds a description too long error' do
|
||||
validate
|
||||
|
||||
expect(label.errors[:description]).to contain_exactly(error_message)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when new description is within the allowed limits' do
|
||||
let(:description) { valid_description }
|
||||
|
||||
it 'does not add a validation error' do
|
||||
validate
|
||||
|
||||
expect(label.errors).not_to have_key(:description)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
context 'when record existed with an invalid description' do
|
||||
let(:existing_description) { "#{invalid_description} small difference so it triggers description_changed?" }
|
||||
|
||||
context 'when description is not changed' do
|
||||
let(:description) { existing_description }
|
||||
|
||||
it 'does not add a validation error' do
|
||||
validate
|
||||
|
||||
expect(label.errors).not_to have_key(:description)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when new description exceeds the maximum size' do
|
||||
context 'when new description is shorter than existing description' do
|
||||
let(:description) { invalid_description }
|
||||
|
||||
it 'allows updating descriptions that already existed above the limit' do
|
||||
validate
|
||||
|
||||
expect(label.errors).not_to have_key(:description)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when new description is longer than existing description' do
|
||||
let(:description) { "#{existing_description}1" }
|
||||
|
||||
it 'adds a description too long error' do
|
||||
validate
|
||||
|
||||
expect(label.errors[:description]).to contain_exactly(error_message)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
context 'when new description is within the allowed limits' do
|
||||
let(:description) { valid_description }
|
||||
|
||||
it 'does not add a validation error' do
|
||||
validate
|
||||
|
||||
expect(label.errors).not_to have_key(:description)
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
describe 'scopes' do
|
||||
|
|
|
@ -106,36 +106,6 @@ RSpec.describe ProjectMember do
|
|||
end
|
||||
end
|
||||
|
||||
describe '.import_team' do
|
||||
before do
|
||||
@project_1 = create(:project)
|
||||
@project_2 = create(:project)
|
||||
|
||||
@user_1 = create :user
|
||||
@user_2 = create :user
|
||||
|
||||
@project_1.add_developer(@user_1)
|
||||
@project_2.add_reporter(@user_2)
|
||||
|
||||
@status = @project_2.team.import(@project_1)
|
||||
end
|
||||
|
||||
it { expect(@status).to be_truthy }
|
||||
|
||||
describe 'project 2 should get user 1 as developer. user_2 should not be changed' do
|
||||
it { expect(@project_2.users).to include(@user_1) }
|
||||
it { expect(@project_2.users).to include(@user_2) }
|
||||
|
||||
it { expect(Ability.allowed?(@user_1, :create_project, @project_2)).to be_truthy }
|
||||
it { expect(Ability.allowed?(@user_2, :read_project, @project_2)).to be_truthy }
|
||||
end
|
||||
|
||||
describe 'project 1 should not be changed' do
|
||||
it { expect(@project_1.users).to include(@user_1) }
|
||||
it { expect(@project_1.users).not_to include(@user_2) }
|
||||
end
|
||||
end
|
||||
|
||||
describe '.truncate_teams' do
|
||||
before do
|
||||
@project_1 = create(:project)
|
||||
|
|
|
@ -164,6 +164,57 @@ RSpec.describe ProjectTeam, feature_category: :subgroups do
|
|||
end
|
||||
end
|
||||
|
||||
describe '#import_team' do
|
||||
let_it_be(:source_project) { create(:project) }
|
||||
let_it_be(:target_project) { create(:project) }
|
||||
let_it_be(:source_project_owner) { source_project.first_owner }
|
||||
let_it_be(:source_project_developer) { create(:user) { |user| source_project.add_developer(user) } }
|
||||
let_it_be(:current_user) { create(:user) { |user| target_project.add_maintainer(user) } }
|
||||
|
||||
subject(:import) { target_project.team.import(source_project, current_user) }
|
||||
|
||||
it { is_expected.to be_truthy }
|
||||
|
||||
it 'target project includes source member with the same access' do
|
||||
import
|
||||
|
||||
imported_member_access = target_project.members.find_by!(user: source_project_developer).access_level
|
||||
expect(imported_member_access).to eq(Gitlab::Access::DEVELOPER)
|
||||
end
|
||||
|
||||
it 'does not change the source project members' do
|
||||
import
|
||||
|
||||
expect(source_project.users).to include(source_project_developer)
|
||||
expect(source_project.users).not_to include(current_user)
|
||||
end
|
||||
|
||||
shared_examples 'imports source owners with correct access' do
|
||||
specify do
|
||||
import
|
||||
|
||||
source_owner_access_in_target = target_project.members.find_by!(user: source_project_owner).access_level
|
||||
expect(source_owner_access_in_target).to eq(target_access_level)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when importer is a maintainer in target project' do
|
||||
it_behaves_like 'imports source owners with correct access' do
|
||||
let(:target_access_level) { Gitlab::Access::MAINTAINER }
|
||||
end
|
||||
end
|
||||
|
||||
context 'when importer is an owner in target project' do
|
||||
before do
|
||||
target_project.add_owner(current_user)
|
||||
end
|
||||
|
||||
it_behaves_like 'imports source owners with correct access' do
|
||||
let(:target_access_level) { Gitlab::Access::OWNER }
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
describe '#find_member' do
|
||||
context 'personal project' do
|
||||
let(:project) do
|
||||
|
|
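A hedged usage sketch matching the #import_team examples above; per those examples, source project owners are imported at the importing user's own access level (maintainer or owner), and the source project's members are left unchanged:

target_project.team.import(source_project, current_user) # => truthy when the copy succeeds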
|
@ -668,34 +668,116 @@ RSpec.describe User, feature_category: :user_profile do
|
|||
end
|
||||
end
|
||||
|
||||
describe '#commit_email=' do
|
||||
subject(:user) { create(:user) }
|
||||
shared_examples 'for user notification, public, and commit emails' do
|
||||
context 'when confirmed primary email' do
|
||||
let(:user) { create(:user) }
|
||||
let(:email) { user.email }
|
||||
|
||||
it 'can be set to a confirmed email' do
|
||||
confirmed = create(:email, :confirmed, user: user)
|
||||
user.commit_email = confirmed.email
|
||||
it 'can be set' do
|
||||
set_email
|
||||
|
||||
expect(user).to be_valid
|
||||
expect(user).to be_valid
|
||||
end
|
||||
|
||||
context 'when primary email is changed' do
|
||||
before do
|
||||
user.email = generate(:email)
|
||||
end
|
||||
|
||||
it 'can not be set' do
|
||||
set_email
|
||||
|
||||
expect(user).not_to be_valid
|
||||
end
|
||||
end
|
||||
|
||||
context 'when confirmed secondary email' do
|
||||
let(:email) { create(:email, :confirmed, user: user).email }
|
||||
|
||||
it 'can be set' do
|
||||
set_email
|
||||
|
||||
expect(user).to be_valid
|
||||
end
|
||||
end
|
||||
|
||||
context 'when unconfirmed secondary email' do
|
||||
let(:email) { create(:email, user: user).email }
|
||||
|
||||
it 'can not be set' do
|
||||
set_email
|
||||
|
||||
expect(user).not_to be_valid
|
||||
end
|
||||
end
|
||||
|
||||
context 'when invalid confirmed secondary email' do
|
||||
let(:email) { create(:email, :confirmed, :skip_validate, user: user, email: 'invalid') }
|
||||
|
||||
it 'can not be set' do
|
||||
set_email
|
||||
|
||||
expect(user).not_to be_valid
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
it 'can not be set to an unconfirmed email' do
|
||||
unconfirmed = create(:email, user: user)
|
||||
user.commit_email = unconfirmed.email
|
||||
context 'when unconfirmed primary email ' do
|
||||
let(:user) { create(:user, :unconfirmed) }
|
||||
let(:email) { user.email }
|
||||
|
||||
expect(user).not_to be_valid
|
||||
it 'can not be set' do
|
||||
set_email
|
||||
|
||||
expect(user).not_to be_valid
|
||||
end
|
||||
end
|
||||
|
||||
it 'can not be set to a non-existent email' do
|
||||
user.commit_email = 'non-existent-email@nonexistent.nonexistent'
|
||||
context 'when new record' do
|
||||
let(:user) { build(:user, :unconfirmed) }
|
||||
let(:email) { user.email }
|
||||
|
||||
expect(user).not_to be_valid
|
||||
it 'can not be set' do
|
||||
set_email
|
||||
|
||||
expect(user).not_to be_valid
|
||||
end
|
||||
|
||||
context 'when skipping confirmation' do
|
||||
before do
|
||||
user.skip_confirmation = true
|
||||
end
|
||||
|
||||
it 'can be set' do
|
||||
set_email
|
||||
|
||||
expect(user).to be_valid
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
it 'can not be set to an invalid email, even if confirmed' do
|
||||
confirmed = create(:email, :confirmed, :skip_validate, user: user, email: 'invalid')
|
||||
user.commit_email = confirmed.email
|
||||
describe 'notification_email' do
|
||||
include_examples 'for user notification, public, and commit emails' do
|
||||
subject(:set_email) do
|
||||
user.notification_email = email
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
expect(user).not_to be_valid
|
||||
describe 'public_email' do
|
||||
include_examples 'for user notification, public, and commit emails' do
|
||||
subject(:set_email) do
|
||||
user.public_email = email
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
describe 'commit_email' do
|
||||
include_examples 'for user notification, public, and commit emails' do
|
||||
subject(:set_email) do
|
||||
user.commit_email = email
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
|
@ -3461,15 +3543,40 @@ RSpec.describe User, feature_category: :user_profile do
|
|||
|
||||
describe '#verified_emails' do
|
||||
let(:user) { create(:user) }
|
||||
let!(:confirmed_email) { create(:email, :confirmed, user: user) }
|
||||
|
||||
before do
|
||||
create(:email, user: user)
|
||||
end
|
||||
|
||||
it 'returns only confirmed emails' do
|
||||
email_confirmed = create :email, user: user, confirmed_at: Time.current
|
||||
create :email, user: user
|
||||
|
||||
expect(user.verified_emails).to contain_exactly(
|
||||
user.email,
|
||||
user.private_commit_email,
|
||||
email_confirmed.email
|
||||
confirmed_email.email
|
||||
)
|
||||
end
|
||||
|
||||
it 'does not return primary email when primary email is changed' do
|
||||
original_email = user.email
|
||||
user.email = generate(:email)
|
||||
|
||||
expect(user.verified_emails).to contain_exactly(
|
||||
user.private_commit_email,
|
||||
confirmed_email.email,
|
||||
original_email
|
||||
)
|
||||
end
|
||||
|
||||
it 'does not return unsaved primary email even if skip_confirmation is enabled' do
|
||||
original_email = user.email
|
||||
user.skip_confirmation = true
|
||||
user.email = generate(:email)
|
||||
|
||||
expect(user.verified_emails).to contain_exactly(
|
||||
user.private_commit_email,
|
||||
confirmed_email.email,
|
||||
original_email
|
||||
)
|
||||
end
|
||||
end
|
||||
|
|
|
@ -19,43 +19,6 @@ RSpec.describe AbuseReportsController, feature_category: :insider_threat do
|
|||
sign_in(reporter)
|
||||
end
|
||||
|
||||
describe 'GET new' do
|
||||
let(:ref_url) { 'http://example.com' }
|
||||
|
||||
it 'sets the instance variables' do
|
||||
get new_abuse_report_path(user_id: user.id, ref_url: ref_url)
|
||||
|
||||
expect(assigns(:abuse_report)).to be_kind_of(AbuseReport)
|
||||
expect(assigns(:abuse_report)).to have_attributes(
|
||||
user_id: user.id,
|
||||
reported_from_url: ref_url
|
||||
)
|
||||
end
|
||||
|
||||
context 'when the user has already been deleted' do
|
||||
it 'redirects the reporter to root_path' do
|
||||
user_id = user.id
|
||||
user.destroy!
|
||||
|
||||
get new_abuse_report_path(user_id: user_id)
|
||||
|
||||
expect(response).to redirect_to root_path
|
||||
expect(flash[:alert]).to eq(_('Cannot create the abuse report. The user has been deleted.'))
|
||||
end
|
||||
end
|
||||
|
||||
context 'when the user has already been blocked' do
|
||||
it 'redirects the reporter to the user\'s profile' do
|
||||
user.block
|
||||
|
||||
get new_abuse_report_path(user_id: user.id)
|
||||
|
||||
expect(response).to redirect_to user
|
||||
expect(flash[:alert]).to eq(_('Cannot create the abuse report. This user has been blocked.'))
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
describe 'POST add_category', :aggregate_failures do
|
||||
subject(:request) { post add_category_abuse_reports_path, params: request_params }
|
||||
|
||||
|
|
|
@@ -13,11 +13,12 @@ RSpec.describe API::NpmInstancePackages, feature_category: :package_registry do
  describe 'GET /api/v4/packages/npm/*package_name' do
    let(:url) { api("/packages/npm/#{package_name}") }

    subject { get(url) }

    it_behaves_like 'handling get metadata requests', scope: :instance
    it_behaves_like 'rejects invalid package names'

    context 'with a duplicate package name in another project' do
      subject { get(url) }

      let_it_be(:project2) { create(:project, :public, namespace: namespace) }
      let_it_be(:package2) do
        create(:npm_package,
@@ -23,6 +23,9 @@ RSpec.describe API::NpmProjectPackages, feature_category: :package_registry do

    it_behaves_like 'handling get metadata requests', scope: :project
    it_behaves_like 'accept get request on private project with access to package registry for everyone'
    it_behaves_like 'rejects invalid package names' do
      subject { get(url) }
    end
  end

  describe 'GET /api/v4/projects/:id/packages/npm/-/package/*package_name/dist-tags' do
|
@ -323,6 +323,62 @@ RSpec.describe 'Git LFS API and storage', feature_category: :source_code_managem
|
|||
|
||||
it_behaves_like 'process authorization header', renew_authorization: renew_authorization
|
||||
end
|
||||
|
||||
context 'when downloading an LFS object that is stored on object storage' do
|
||||
before do
|
||||
stub_lfs_object_storage
|
||||
lfs_object.file.migrate!(LfsObjectUploader::Store::REMOTE)
|
||||
end
|
||||
|
||||
context 'when lfs.object_store.proxy_download=true' do
|
||||
before do
|
||||
stub_lfs_object_storage(proxy_download: true)
|
||||
end
|
||||
|
||||
it_behaves_like 'LFS http 200 response'
|
||||
|
||||
it 'does return proxied address URL' do
|
||||
expect(json_response['objects'].first).to include(sample_object)
|
||||
expect(json_response['objects'].first['actions']['download']['href']).to eq(objects_url(project, sample_oid))
|
||||
end
|
||||
end
|
||||
|
||||
context 'when "lfs.object_store.proxy_download" is "false"' do
|
||||
before do
|
||||
stub_lfs_object_storage(proxy_download: false)
|
||||
end
|
||||
|
||||
it_behaves_like 'LFS http 200 response'
|
||||
|
||||
it 'does return direct object storage URL' do
|
||||
expect(json_response['objects'].first).to include(sample_object)
|
||||
expect(json_response['objects'].first['actions']['download']['href']).to start_with("https://lfs-objects.s3.amazonaws.com/")
|
||||
expect(json_response['objects'].first['actions']['download']['href']).to include("X-Amz-Expires=3600&")
|
||||
end
|
||||
|
||||
context 'when feature flag "lfs_batch_direct_downloads" is "false"' do
|
||||
before do
|
||||
stub_feature_flags(lfs_batch_direct_downloads: false)
|
||||
end
|
||||
|
||||
it_behaves_like 'LFS http 200 response'
|
||||
|
||||
it 'does return proxied address URL' do
|
||||
expect(json_response['objects'].first).to include(sample_object)
|
||||
expect(json_response['objects'].first['actions']['download']['href']).to eq(objects_url(project, sample_oid))
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
context 'when sending objects=[]' do
|
||||
let(:body) { download_body([]) }
|
||||
|
||||
it_behaves_like 'LFS http expected response code and message' do
|
||||
let(:response_code) { 404 }
|
||||
let(:message) { 'Not found.' }
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
context 'when user is authenticated' do
|
||||
|
|
|
@@ -351,14 +351,6 @@ RSpec.describe InvitesController, 'routing' do
  end
end

RSpec.describe AbuseReportsController, 'routing' do
  let_it_be(:user) { create(:user) }

  it 'to #new' do
    expect(get("/-/abuse_reports/new?user_id=#{user.id}")).to route_to('abuse_reports#new', user_id: user.id.to_s)
  end
end

RSpec.describe SentNotificationsController, 'routing' do
  it 'to #unsubscribe' do
    expect(get("/-/sent_notifications/4bee17d4a63ed60cf5db53417e9aeb4c/unsubscribe"))
@@ -6,7 +6,6 @@ RSpec.shared_examples 'reportable note' do |type|

  let(:comment) { find("##{ActionView::RecordIdentifier.dom_id(note)}") }
  let(:more_actions_selector) { '.more-actions.dropdown' }
  let(:abuse_report_path) { new_abuse_report_path(user_id: note.author.id, ref_url: noteable_note_url(note)) }

  it 'has an edit button' do
    expect(comment).to have_selector('.js-note-edit')
|
@ -1,27 +1,28 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
RSpec.shared_examples 'resource with exportable associations' do
|
||||
before do
|
||||
stub_licensed_features(stubbed_features) if stubbed_features.any?
|
||||
end
|
||||
RSpec.shared_examples 'an exportable' do |restricted_association: :project|
|
||||
let_it_be(:user) { create(:user) }
|
||||
|
||||
describe '#exportable_association?' do
|
||||
let(:association) { single_association }
|
||||
let(:association) { restricted_association }
|
||||
|
||||
subject { resource.exportable_association?(association, current_user: user) }
|
||||
|
||||
it { is_expected.to be_falsey }
|
||||
|
||||
context 'when user can read resource' do
|
||||
context 'when user can only read resource' do
|
||||
before do
|
||||
group.add_developer(user)
|
||||
allow(Ability).to receive(:allowed?).and_call_original
|
||||
allow(Ability).to receive(:allowed?)
|
||||
.with(user, :"read_#{resource.to_ability_name}", resource)
|
||||
.and_return(true)
|
||||
end
|
||||
|
||||
it { is_expected.to be_falsey }
|
||||
|
||||
context "when user can read resource's association" do
|
||||
before do
|
||||
other_group.add_developer(user)
|
||||
allow(resource).to receive(:readable_record?).with(anything, user).and_return(true)
|
||||
end
|
||||
|
||||
it { is_expected.to be_truthy }
|
||||
|
@ -31,41 +32,48 @@ RSpec.shared_examples 'resource with exportable associations' do
|
|||
|
||||
it { is_expected.to be_falsey }
|
||||
end
|
||||
|
||||
context 'for an unauthenticated user' do
|
||||
let(:user) { nil }
|
||||
|
||||
it { is_expected.to be_falsey }
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
describe '#readable_records' do
|
||||
subject { resource.readable_records(association, current_user: user) }
|
||||
describe '#to_authorized_json' do
|
||||
let(:options) { { include: [{ notes: { only: [:id] } }] } }
|
||||
|
||||
subject { resource.to_authorized_json(keys, user, options) }
|
||||
|
||||
before do
|
||||
group.add_developer(user)
|
||||
allow(Ability).to receive(:allowed?).and_call_original
|
||||
allow(Ability).to receive(:allowed?)
|
||||
.with(user, :"read_#{resource.to_ability_name}", resource)
|
||||
.and_return(true)
|
||||
end
|
||||
|
||||
context 'when association not supported' do
|
||||
let(:association) { :foo }
|
||||
let(:keys) { [:foo] }
|
||||
|
||||
it { is_expected.to be_nil }
|
||||
it { is_expected.not_to include('foo') }
|
||||
end
|
||||
|
||||
context 'when association is `:notes`' do
|
||||
let(:association) { :notes }
|
||||
let_it_be(:readable_note) { create(:system_note, noteable: resource, project: project, note: 'readable') }
|
||||
let_it_be(:restricted_note) { create(:system_note, noteable: resource, project: project, note: 'restricted') }
|
||||
|
||||
it { is_expected.to match_array([readable_note]) }
|
||||
let(:restricted_note_access) { false }
|
||||
let(:keys) { [:notes] }
|
||||
|
||||
context 'when user have access' do
|
||||
before do
|
||||
other_group.add_developer(user)
|
||||
end
|
||||
before do
|
||||
allow(Ability).to receive(:allowed?).and_call_original
|
||||
allow(Ability).to receive(:allowed?).with(user, :read_note, readable_note).and_return(true)
|
||||
allow(Ability).to receive(:allowed?).with(user, :read_note, restricted_note).and_return(restricted_note_access)
|
||||
end
|
||||
|
||||
it 'returns all records' do
|
||||
is_expected.to match_array([readable_note, restricted_note])
|
||||
it { is_expected.to include("\"notes\":[{\"id\":#{readable_note.id}}]") }
|
||||
|
||||
context 'when user have access to all notes' do
|
||||
let(:restricted_note_access) { true }
|
||||
|
||||
it 'string includes all notes' do
|
||||
is_expected.to include("\"notes\":[{\"id\":#{readable_note.id}},{\"id\":#{restricted_note.id}}]")
|
||||
end
|
||||
end
|
||||
end
|
||||
|
|
|
@ -849,3 +849,14 @@ RSpec.shared_examples 'handling different package names, visibilities and user r
|
|||
it_behaves_like example_name, status: status
|
||||
end
|
||||
end
|
||||
|
||||
RSpec.shared_examples 'rejects invalid package names' do
|
||||
let(:package_name) { "%0d%0ahttp:/%2fexample.com" }
|
||||
|
||||
it do
|
||||
subject
|
||||
|
||||
expect(response).to have_gitlab_http_status(:bad_request)
|
||||
expect(Gitlab::Json.parse(response.body)).to eq({ 'error' => 'package_name should be a valid file path' })
|
||||
end
|
||||
end
|
||||
|
|
66
spec/tasks/gitlab/db/migration_fix_15_11_rake_spec.rb
Normal file
66
spec/tasks/gitlab/db/migration_fix_15_11_rake_spec.rb
Normal file
|
@ -0,0 +1,66 @@
|
|||
# frozen_string_literal: true
|
||||
|
||||
require 'rake_helper'
|
||||
|
||||
RSpec.describe 'migration_fix_15_11', :reestablished_active_record_base, feature_category: :database do
|
||||
let(:db) { ApplicationRecord.connection }
|
||||
let(:target_init_schema) { '20220314184009' }
|
||||
let(:earlier_init_schema) { '20210101010101' }
|
||||
|
||||
before :all do
|
||||
Rake.application.rake_require 'active_record/railties/databases'
|
||||
Rake.application.rake_require 'tasks/gitlab/db/migration_fix_15_11'
|
||||
|
||||
Rake::Task.define_task :environment
|
||||
end
|
||||
|
||||
describe 'migration_fix_15_11' do
|
||||
context 'when fix is needed' do
|
||||
it 'patches init_schema' do
|
||||
db.execute('DELETE FROM schema_migrations')
|
||||
db.execute("INSERT INTO schema_migrations (version) VALUES ('#{target_init_schema}')")
|
||||
run_rake_task(:migration_fix_15_11)
|
||||
result = db.execute('SELECT * FROM schema_migrations')
|
||||
expect(result.count).to eq(300)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when fix is not needed because no migrations have been run' do
|
||||
it 'does nothing' do
|
||||
db.execute('DELETE FROM schema_migrations')
|
||||
run_rake_task(:migration_fix_15_11)
|
||||
result = db.execute('SELECT * FROM schema_migrations')
|
||||
expect(result.count).to eq(0)
|
||||
end
|
||||
end
|
||||
|
||||
context 'when fix is not needed because DB has not been initialized' do
|
||||
it 'does nothing' do
|
||||
db.execute('DROP TABLE schema_migrations')
|
||||
expect { run_rake_task(:migration_fix_15_11) }.not_to raise_error
|
||||
end
|
||||
end
|
||||
|
||||
context 'when fix is not needed because there is an earlier init_schema' do
|
||||
it 'does nothing' do
|
||||
db.execute('DELETE FROM schema_migrations')
|
||||
db.execute("INSERT INTO schema_migrations (version) VALUES ('#{earlier_init_schema}')")
|
||||
run_rake_task(:migration_fix_15_11)
|
||||
result = db.execute('SELECT * FROM schema_migrations')
|
||||
expect(result.pluck('version')).to match_array [earlier_init_schema]
|
||||
end
|
||||
end
|
||||
|
||||
context 'when fix is not needed because the fix has been run already' do
|
||||
it 'does not affect the schema_migrations table' do
|
||||
db.execute('DELETE FROM schema_migrations')
|
||||
db.execute("INSERT INTO schema_migrations (version) VALUES ('#{target_init_schema}')")
|
||||
run_rake_task(:migration_fix_15_11)
|
||||
fixed_table = db.execute('SELECT version FROM schema_migrations').pluck('version')
|
||||
run_rake_task(:migration_fix_15_11)
|
||||
test_fixed_table = db.execute('SELECT version FROM schema_migrations').pluck('version')
|
||||
expect(fixed_table).to match_array test_fixed_table
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|