Compare commits


36 commits

Author | SHA1 | Message | Date
Pirate Praveen | 5dacee4226 | Upload to bookworm-fasttrack | 2023-07-11 01:24:39 +05:30
Pirate Praveen | ad7d6a3bb3 | Fix last version in mainstcript | 2023-07-11 01:23:41 +05:30
Pirate Praveen | 0dcb62a93e | Upload to bookworm-fasttrack | 2023-07-10 21:33:27 +05:30
Pirate Praveen | 9118f0c2df | Merge tag 'debian/16.0.7+ds1-3' into bookworm-fasttrack (gitlab Debian release 16.0.7+ds1-3) | 2023-07-10 21:33:11 +05:30
Pirate Praveen | ba0a95abfe | Upload to bookworm-fasttrack | 2023-07-10 19:22:08 +05:30
Pirate Praveen | 602ea3e45b | Merge tag 'debian/16.0.7+ds1-2' into bookworm-fasttrack (gitlab Debian release 16.0.7+ds1-2) | 2023-07-10 19:21:34 +05:30
Pirate Praveen | 13ecc019b9 | Upload to bookworm-fasttrack | 2023-07-07 13:50:43 +05:30
Pirate Praveen | ec15c6c1e9 | Merge tag 'debian/15.11.11+ds1-1' into bookworm-fasttrack (gitlab Debian release 15.11.11+ds1-1) | 2023-07-07 13:50:13 +05:30
Pirate Praveen | b8846ca4cb | Upload to bookworm-fasttrack | 2023-07-03 13:47:45 +05:30
Pirate Praveen | 8d07393c28 | Merge tag 'debian/15.11.6+ds1-1' into bookworm-fasttrack (gitlab Debian release 15.11.6+ds1-1) | 2023-07-03 13:46:18 +05:30
Mohammed Bilal | 7eeaeee39c | Upload to bookworm-fasttrack | 2023-06-13 17:26:47 +05:30
Mohammed Bilal | d108af0c58 | Merge tag 'debian/15.10.8+ds1-1' into bookworm-fasttrack (gitlab Debian release 15.10.8+ds1-1) | 2023-06-13 14:11:31 +05:30
Pirate Praveen | 853cfa7372 | Upload to bookworm-fasttrack | 2023-05-30 21:45:07 +05:30
Pirate Praveen | bf28b97a0a | Relax the immutable option to yarnpkg install | 2023-05-30 21:45:07 +05:30
Pirate Praveen | e2fca5a2ed | Upload to bookworm-fasttrack | 2023-05-30 15:00:36 +05:30
Pirate Praveen | 0fc9f8ca58 | Switch order of links for package.json | 2023-05-30 15:00:35 +05:30
Pirate Praveen | 9036fc3e42 | Fix yarn.lock link login (reverse the order in links) | 2023-05-30 13:17:45 +05:30
Pirate Praveen | 60f59a997e | Upload to bookworm-fasttrack | 2023-05-29 22:38:07 +05:30
Pirate Praveen | 64b54c0bc0 | Merge tag 'debian/15.10.7+ds1-1' into bookworm-fasttrack (gitlab Debian release 15.10.7+ds1-1) | 2023-05-29 22:37:47 +05:30
Pirate Praveen | 8b9b9c4567 | Upload to bookworm-fasttrack | 2023-05-24 20:13:10 +05:30
Pirate Praveen | 5ab75b5a94 | Merge tag 'debian/15.9.8+ds1-3' into bookworm-fasttrack (gitlab Debian release 15.9.8+ds1-3) | 2023-05-24 20:12:46 +05:30
Pirate Praveen | 2c5ccc23ee | Upload to bookworm-fasttrack | 2023-05-23 02:06:54 +05:30
Pirate Praveen | 1a20f090d9 | Update minimum version of ruby-gitlab-markup to 1.9~ | 2023-05-23 02:06:21 +05:30
Pirate Praveen | e3cafbfa66 | Update minimum version of ruby-json to 2.6.3 | 2023-05-23 01:53:53 +05:30
Pirate Praveen | fe5bb584cb | Add ruby-google-apis-container-v1 as dependency | 2023-05-23 01:38:54 +05:30
Pirate Praveen | 8643b56a93 | Upload to bookworm-fasttrack | 2023-05-20 13:13:19 +05:30
Pirate Praveen | 7bd430c0da | Merge tag 'debian/15.9.8+ds1-1' into bookworm-fasttrack (gitlab Debian release 15.9.8+ds1-1) | 2023-05-20 13:12:33 +05:30
Pirate Praveen | 7ae5a99329 | Upload to bookworm-fasttrack | 2023-03-25 19:10:18 +05:30
Pirate Praveen | ab2cb256ac | Add libssl-dev to Depends for building openssl 3.0.2 gem | 2023-03-25 19:09:32 +05:30
Pirate Praveen | ba766e97ca | Upload to bookworm-fasttrack | 2023-03-24 20:41:01 +05:30
Pirate Praveen | 8611e16a6c | Force ruby upgrade by adding conflict with libruby2.7 | 2023-03-24 20:40:21 +05:30
Pirate Praveen | 1cb44ffae2 | Upload to bookworm-fasttrack | 2023-03-24 19:58:50 +05:30
Pirate Praveen | c9c08cc59a | Use openssl 3.0.2 from rubygems.org as workaround for #1032070 | 2023-03-24 19:57:54 +05:30
Pirate Praveen | f651e93aa9 | Upload to bookworm-fasttrack | 2023-03-24 18:00:07 +05:30
Pirate Praveen | 72fcf92d05 | Update minimum versions of nodejs and other node packages to force an upgrade | 2023-03-24 17:59:09 +05:30
Pirate Praveen | 5618e90737 | Upload to bookworm-fasttrack | 2023-03-24 12:38:53 +05:30
68 changed files with 659 additions and 1365 deletions


@ -5126,6 +5126,7 @@ RSpec/MissingFeatureCategory:
- 'spec/policies/ci/bridge_policy_spec.rb'
- 'spec/policies/ci/build_policy_spec.rb'
- 'spec/policies/ci/pipeline_policy_spec.rb'
- 'spec/policies/ci/pipeline_schedule_policy_spec.rb'
- 'spec/policies/ci/trigger_policy_spec.rb'
- 'spec/policies/clusters/agent_policy_spec.rb'
- 'spec/policies/clusters/agent_token_policy_spec.rb'


@ -2,28 +2,6 @@
documentation](doc/development/changelog.md) for instructions on adding your own
entry.
## 16.0.8 (2023-08-01)
### Fixed (1 change)
- [Disable IAT verification by default](gitlab-org/security/gitlab@6d17a50539b8518da18bc68accc03b48d73173a0)
### Security (13 changes)
- [Prevent leaking emails of newly created users](gitlab-org/security/gitlab@b2872b398599cd7ee20c4119ae4c8e6ba2a6882d) ([merge request](gitlab-org/security/gitlab!3451))
- [Added redirect to filtered params](gitlab-org/security/gitlab@49ffc2cc98af0e66305c8a653c74e0b92ee06ce8) ([merge request](gitlab-org/security/gitlab!3443))
- [Relocate PlantUML config and disable SVG support](gitlab-org/security/gitlab@c6ded17a7d17ec8c3ed55cb94b8e6e524b6bbd5e) ([merge request](gitlab-org/security/gitlab!3440))
- [Sanitize multiple hardlinks from import archives](gitlab-org/security/gitlab@9dabd8ebca50d8ea3781a0c4955a40cd07c453e7) ([merge request](gitlab-org/security/gitlab!3437))
- [Validates project path availability](gitlab-org/security/gitlab@97e6ce4d15c8f4bcc7f60a560b789a023d391531) ([merge request](gitlab-org/security/gitlab!3428))
- [Fix policy project assign](gitlab-org/security/gitlab@c1cca8ce8f24f6466563a50463e3254c5c423e97) ([merge request](gitlab-org/security/gitlab!3425))
- [Fix pipeline schedule authorization for protected branch/tag](gitlab-org/security/gitlab@0c7017d993a33ef9fc693d4435505a4aea0141d1) ([merge request](gitlab-org/security/gitlab!3363))
- [Mitigate autolink filter ReDOS](gitlab-org/security/gitlab@9072c630608a81645548b64b32d9f81bd258ba06) ([merge request](gitlab-org/security/gitlab!3432))
- [Fix XSS vector in Web IDE](gitlab-org/security/gitlab@2832d1ae3b3e1bfc42bbeaeb29841a1e5fecac8a) ([merge request](gitlab-org/security/gitlab!3411))
- [Mitigate project reference filter ReDOS](gitlab-org/security/gitlab@9c73619acaad3eb3605bf632f066bcee59b86566) ([merge request](gitlab-org/security/gitlab!3429))
- [Add a stricter regex for the Harbor search param](gitlab-org/security/gitlab@c27e5e48a02d3411e84617b4fb7fd3f0fb49b618) ([merge request](gitlab-org/security/gitlab!3396))
- [Update pipeline user to the last policy MR author](gitlab-org/security/gitlab@b1e9bcb33106ba7e279d5fd42c4f2c1727629f63) ([merge request](gitlab-org/security/gitlab!3393))
- [Prohibit 40 character hex plus a hyphen if branch name is path](gitlab-org/security/gitlab@66c81ff6b50d0e53fc1f1b153439ad95614c9d09) ([merge request](gitlab-org/security/gitlab!3406))
## 16.0.7 (2023-07-04)
### Security (1 change)


@ -1 +1 @@
16.0.8
16.0.7


@ -1 +1 @@
16.0.8
16.0.7


@ -1 +1 @@
16.0.8
16.0.7


@ -21,6 +21,7 @@ class Projects::PipelineSchedulesController < Projects::ApplicationController
end
def new
@schedule = project.pipeline_schedules.new
end
def create
@ -101,15 +102,6 @@ class Projects::PipelineSchedulesController < Projects::ApplicationController
variables_attributes: [:id, :variable_type, :key, :secret_value, :_destroy])
end
def new_schedule
# We need the `ref` here for `authorize_create_pipeline_schedule!`
@schedule ||= project.pipeline_schedules.new(ref: params.dig(:schedule, :ref))
end
def authorize_create_pipeline_schedule!
return access_denied! unless can?(current_user, :create_pipeline_schedule, new_schedule)
end
def authorize_play_pipeline_schedule!
return access_denied! unless can?(current_user, :play_pipeline_schedule, schedule)
end


@ -584,8 +584,6 @@ class Project < ApplicationRecord
validates :max_artifacts_size, numericality: { only_integer: true, greater_than: 0, allow_nil: true }
validates :suggestion_commit_message, length: { maximum: MAX_SUGGESTIONS_TEMPLATE_LENGTH }
validate :path_availability, if: :path_changed?
# Scopes
scope :pending_delete, -> { where(pending_delete: true) }
scope :without_deleted, -> { where(pending_delete: false) }
@ -3182,15 +3180,6 @@ class Project < ApplicationRecord
end
strong_memoize_attr :frozen_outbound_job_token_scopes?
def path_availability
base, _, host = path.partition('.')
return unless host == Gitlab.config.pages&.dig('host')
return unless ProjectSetting.where(pages_unique_domain: base).exists?
errors.add(:path, s_('Project|already in use'))
end
private
def pages_unique_domain_enabled?
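The `path_availability` validation shown above keys off `String#partition`: the first dot splits a project path into a candidate unique-domain base and a host. A minimal runnable sketch, assuming a hypothetical Pages host of `example.io` and an in-memory stand-in for the `ProjectSetting` lookup:

```ruby
# Hypothetical stand-ins: the real code reads Gitlab.config.pages['host']
# and queries ProjectSetting.where(pages_unique_domain: base).
PAGES_HOST = "example.io"
TAKEN_UNIQUE_DOMAINS = ["ns1"]

def path_conflicts_with_pages_domain?(path)
  base, _, host = path.partition(".")  # "ns1.example.io" -> ["ns1", ".", "example.io"]
  return false unless host == PAGES_HOST
  TAKEN_UNIQUE_DOMAINS.include?(base)
end

puts path_conflicts_with_pages_domain?("ns1.example.io")  # true: collides with a unique domain
puts path_conflicts_with_pages_domain?("myproject")       # false: no Pages host suffix
```

Note `partition` splits on the first dot only, so `"ns1.example.io"` yields base `"ns1"` and host `"example.io"`.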


@ -52,8 +52,6 @@ class ProjectSetting < ApplicationRecord
validate :validates_mr_default_target_self
validate :pages_unique_domain_availability, if: :pages_unique_domain_changed?
attribute :legacy_open_source_license_available, default: -> do
Feature.enabled?(:legacy_open_source_license_available, type: :ops)
end
@ -104,15 +102,6 @@ class ProjectSetting < ApplicationRecord
pages_unique_domain_enabled ||
pages_unique_domain_in_database.present?
end
def pages_unique_domain_availability
host = Gitlab.config.pages&.dig('host')
return if host.blank?
return unless Project.where(path: "#{pages_unique_domain}.#{host}").exists?
errors.add(:pages_unique_domain, s_('ProjectSetting|already in use'))
end
end
ProjectSetting.prepend_mod


@ -5,18 +5,7 @@ module Ci
alias_method :pipeline_schedule, :subject
condition(:protected_ref) do
if full_ref?(@subject.ref)
is_tag = Gitlab::Git.tag_ref?(@subject.ref)
ref_name = Gitlab::Git.ref_name(@subject.ref)
else
# NOTE: this block should not be removed
# until the full ref validation is in place
# and all old refs are updated and validated
is_tag = @subject.project.repository.tag_exists?(@subject.ref)
ref_name = @subject.ref
end
ref_protected?(@user, @subject.project, is_tag, ref_name)
ref_protected?(@user, @subject.project, @subject.project.repository.tag_exists?(@subject.ref), @subject.ref)
end
condition(:owner_of_schedule) do
@ -42,15 +31,6 @@ module Ci
enable :take_ownership_pipeline_schedule
end
rule { protected_ref }.policy do
prevent :play_pipeline_schedule
prevent :create_pipeline_schedule
end
private
def full_ref?(ref)
Gitlab::Git.tag_ref?(ref) || Gitlab::Git.branch_ref?(ref)
end
rule { protected_ref }.prevent :play_pipeline_schedule
end
end


@ -49,7 +49,11 @@ module BulkImports
end
def validate_symlink
raise(BulkImports::Error, 'Invalid file') if Gitlab::Utils::FileInfo.linked?(filepath)
raise(BulkImports::Error, 'Invalid file') if symlink?(filepath)
end
def symlink?(filepath)
File.lstat(filepath).symlink?
end
def extract_archive


@ -53,7 +53,7 @@ module BulkImports
end
def validate_symlink(filepath)
raise(ServiceError, 'Invalid file') if Gitlab::Utils::FileInfo.linked?(filepath)
raise(ServiceError, 'Invalid file') if File.lstat(filepath).symlink?
end
def decompress_file


@ -171,7 +171,6 @@ module Gitlab
# - Any parameter containing `password`
# - Any parameter containing `secret`
# - Any parameter ending with `key`
# - Any parameter named `redirect`, filtered for security concerns of exposing sensitive information
# - Two-factor tokens (:otp_attempt)
# - Repo/Project Import URLs (:import_url)
# - Build traces (:trace)
@ -214,7 +213,6 @@ module Gitlab
variables
content
sharedSecret
redirect
)
# Enable escaping HTML in JSON.

debian/changelog vendored (124 changes)

@ -1,28 +1,14 @@
gitlab (16.0.8+ds1-1) UNRELEASED; urgency=medium
[ Pirate Praveen ]
* Stop installing google-api-client in postinst
[ Aravinth Manivannan ]
* New upstream version 16.0.8+ds1
* New upstream version 16.0.8+ds1
-- Aravinth Manivannan <realaravinth@batsense.net> Sat, 09 Sep 2023 12:36:47 +0000
gitlab (16.0.7+ds1-5) unstable; urgency=medium
* Use ruby-app-store-connect (now packaged)
* Revert "Remove ruby-whitequark-parser dependency (no longer used in
migrations)" - we can only remove in 16.1+
-- Pirate Praveen <praveen@debian.org> Tue, 11 Jul 2023 20:10:37 +0530
gitlab (16.0.7+ds1-4) unstable; urgency=medium
gitlab (16.0.7+ds1-3~fto12+2) bookworm-fasttrack; urgency=medium
* Fix last version in mainstcript
* Remove ruby-whitequark-parser dependency (no longer used in migrations)
-- Pirate Praveen <praveen@debian.org> Tue, 11 Jul 2023 19:23:06 +0530
-- Pirate Praveen <praveen@debian.org> Tue, 11 Jul 2023 01:24:32 +0530
gitlab (16.0.7+ds1-3~fto12+1) bookworm-fasttrack; urgency=medium
* Rebuild for bookworm-fasttrack.
-- Pirate Praveen <praveen@debian.org> Mon, 10 Jul 2023 21:33:15 +0530
gitlab (16.0.7+ds1-3) unstable; urgency=medium
@ -32,6 +18,12 @@ gitlab (16.0.7+ds1-3) unstable; urgency=medium
-- Pirate Praveen <praveen@debian.org> Mon, 10 Jul 2023 20:23:03 +0530
gitlab (16.0.7+ds1-2~fto12+1) bookworm-fasttrack; urgency=medium
* Rebuild for bookworm-fasttrack.
-- Pirate Praveen <praveen@debian.org> Mon, 10 Jul 2023 19:21:41 +0530
gitlab (16.0.7+ds1-2) unstable; urgency=medium
* Reupload to unstable
@ -52,6 +44,12 @@ gitlab (16.0.7+ds1-1) experimental; urgency=medium
-- Pirate Praveen <praveen@debian.org> Sun, 09 Jul 2023 16:17:27 +0530
gitlab (15.11.11+ds1-1~fto12+1) bookworm-fasttrack; urgency=medium
* Rebuild for bookworm-fasttrack.
-- Pirate Praveen <praveen@debian.org> Fri, 07 Jul 2023 13:50:17 +0530
gitlab (15.11.11+ds1-1) unstable; urgency=medium
* Update minimum version of ruby-ruby-parser to 3.20
@ -59,6 +57,12 @@ gitlab (15.11.11+ds1-1) unstable; urgency=medium
-- Pirate Praveen <praveen@debian.org> Fri, 07 Jul 2023 11:08:26 +0530
gitlab (15.11.6+ds1-1~fto12+1) bookworm-fasttrack; urgency=medium
* Rebuild for bookworm-fasttrack.
-- Pirate Praveen <praveen@debian.org> Mon, 03 Jul 2023 13:46:43 +0530
gitlab (15.11.6+ds1-1) experimental; urgency=medium
[ Vinay Keshava ]
@ -79,6 +83,12 @@ gitlab (15.10.8+ds1-2) unstable; urgency=medium
-- Pirate Praveen <praveen@debian.org> Sun, 11 Jun 2023 22:24:28 +0530
gitlab (15.10.8+ds1-1~fto12+1) bookworm-fasttrack; urgency=medium
* Rebuild for bookworm-fasttrack
-- Mohammed Bilal <mdbilal@disroot.org> Tue, 13 Jun 2023 14:11:51 +0530
gitlab (15.10.8+ds1-1) experimental; urgency=medium
[ Pirate Praveen ]
@ -93,6 +103,25 @@ gitlab (15.10.8+ds1-1) experimental; urgency=medium
-- Mohammed Bilal <mdbilal@disroot.org> Fri, 09 Jun 2023 10:55:28 +0530
gitlab (15.10.7+ds1-1~fto12+3) bookworm-fasttrack; urgency=medium
* Relax the immutable option to yarnpkg install
-- Pirate Praveen <praveen@debian.org> Tue, 30 May 2023 21:01:23 +0530
gitlab (15.10.7+ds1-1~fto12+2) bookworm-fasttrack; urgency=medium
* Fix yarn.lock link login (reverse the order in links)
* Switch order of links for package.json
-- Pirate Praveen <praveen@debian.org> Tue, 30 May 2023 13:18:21 +0530
gitlab (15.10.7+ds1-1~fto12+1) bookworm-fasttrack; urgency=medium
* Rebuild for bookworm-fasttrack.
-- Pirate Praveen <praveen@debian.org> Mon, 29 May 2023 22:37:52 +0530
gitlab (15.10.7+ds1-1) experimental; urgency=medium
* New upstream version 15.10.7+ds1
@ -108,6 +137,12 @@ gitlab (15.10.7+ds1-1) experimental; urgency=medium
-- Pirate Praveen <praveen@debian.org> Sat, 27 May 2023 22:28:27 +0530
gitlab (15.9.8+ds1-3~fto12+1) bookworm-fasttrack; urgency=medium
* Rebuild for bookworm-fasttrack.
-- Pirate Praveen <praveen@debian.org> Wed, 24 May 2023 20:12:50 +0530
gitlab (15.9.8+ds1-3) experimental; urgency=medium
* Update minimum version of ruby-zeitwerk to 2.6.1~ (this fixes gitlab-puma
@ -126,6 +161,20 @@ gitlab (15.9.8+ds1-2) experimental; urgency=medium
-- Pirate Praveen <praveen@debian.org> Tue, 23 May 2023 19:46:42 +0530
gitlab (15.9.8+ds1-1~fto12+2) bookworm-fasttrack; urgency=medium
* Add ruby-google-apis-container-v1 as dependency
* Update minimum version of ruby-json to 2.6.3
* Update minimum version of ruby-gitlab-markup to 1.9~
-- Pirate Praveen <praveen@debian.org> Tue, 23 May 2023 02:06:43 +0530
gitlab (15.9.8+ds1-1~fto12+1) bookworm-fasttrack; urgency=medium
* Rebuild for bookworm-fasttrack.
-- Pirate Praveen <praveen@debian.org> Sat, 20 May 2023 13:12:52 +0530
gitlab (15.9.8+ds1-1) experimental; urgency=medium
[ Vinay Keshava ]
@ -168,6 +217,37 @@ gitlab (15.8.5+ds1-1) experimental; urgency=medium
-- Vinay Keshava <vinaykeshava@disroot.org> Mon, 03 Apr 2023 14:16:49 +0530
gitlab (15.8.4+ds1-3~fto12+5) bookworm-fasttrack; urgency=medium
* Add libssl-dev to Depends for building openssl 3.0.2 gem
-- Pirate Praveen <praveen@debian.org> Sat, 25 Mar 2023 19:10:09 +0530
gitlab (15.8.4+ds1-3~fto12+4) bookworm-fasttrack; urgency=medium
* Force ruby upgrade by adding conflict with libruby2.7
-- Pirate Praveen <praveen@debian.org> Fri, 24 Mar 2023 20:40:54 +0530
gitlab (15.8.4+ds1-3~fto12+3) bookworm-fasttrack; urgency=medium
* Use openssl 3.0.2 from rubygems.org as workaround for #1032070
-- Pirate Praveen <praveen@debian.org> Fri, 24 Mar 2023 19:58:34 +0530
gitlab (15.8.4+ds1-3~fto12+2) bookworm-fasttrack; urgency=medium
* Update minimum versions of nodejs and other node packages to force an
upgrade when upgrading from bullseye
-- Pirate Praveen <praveen@debian.org> Fri, 24 Mar 2023 17:59:40 +0530
gitlab (15.8.4+ds1-3~fto12+1) bookworm-fasttrack; urgency=medium
* Rebuild for bookworm-fasttrack.
-- Pirate Praveen <praveen@debian.org> Fri, 24 Mar 2023 12:37:48 +0530
gitlab (15.8.4+ds1-3) experimental; urgency=medium
* Drop ruby-omniauth-shibboleth dependency (dropped in 15.9)

debian/control vendored (16 changes)

@ -83,8 +83,11 @@ Architecture: all
XB-Ruby-Versions: ${ruby:Versions}
Depends: ${shlibs:Depends}, ${misc:Depends},
gitlab-common (>= 16~),
ruby (>= 1:2.7~),
ruby (>= 1:3.1~),
rubygems-integration (>= 1.18~),
# for building openssl 3.0.2, see #1032070
ruby-dev,
libssl-dev,
lsb-base (>= 3.0-6),
rake (>= 12.3.0~),
bundler (>= 1.17.3~),
@ -94,7 +97,7 @@ Depends: ${shlibs:Depends}, ${misc:Depends},
bc,
redis-server (>= 5:6.0.12~),
# See https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=956211
nodejs (>= 12~),
nodejs (>= 16.15~),
nginx | httpd,
default-mta | postfix | exim4 | mail-transport-agent,
openssh-client,
@ -454,7 +457,6 @@ Depends: ${shlibs:Depends}, ${misc:Depends},
ruby-cvss-suite,
# Apple plist parsing
ruby-cfpropertylist (>= 3.0~),
ruby-app-store-connect,
# For phone verification
ruby-telesign (>= 2.2.4~),
# for ruby 3.1
@ -466,7 +468,7 @@ Depends: ${shlibs:Depends}, ${misc:Depends},
node-rails-actioncable,
node-autosize (>= 4.0.2~dfsg1-5~),
node-axios (>= 0.17.1~),
node-babel7,
node-babel7 (>= 7.20~),
node-babel-loader (>= 8.0~),
node-babel-plugin-lodash,
node-bootstrap,
@ -475,7 +477,9 @@ Depends: ${shlibs:Depends}, ${misc:Depends},
node-clipboard (>= 2.0.8~),
node-compression-webpack-plugin (>= 3.0.1~),
node-copy-webpack-plugin (>= 5.0~),
node-core-js (>= 3.2.1~),
node-core-js (>= 3.26~),
node-core-js-compat (>= 3.26~),
node-core-js-pure (>= 3.26~),
node-cron-validator,
node-css-loader (>= 5.0~),
# node-d3 includes d3-sankey
@ -549,7 +553,7 @@ Depends: ${shlibs:Depends}, ${misc:Depends},
Recommends: certbot,
gitaly (>= 16~),
openssh-server
Conflicts: libruby2.5
Conflicts: libruby2.5, libruby2.7
Description: git powered software platform to collaborate on code (non-omnibus)
gitlab provides web based interface to host source code and track issues.
It allows anyone for fork a repository and send merge requests. Code review


@ -76,16 +76,30 @@ runuser -u ${gitlab_user} -- sh -c "if ! gem list -i -v '~> 1.10' "^faraday$" >/
runuser -u ${gitlab_user} -- sh -c "if ! gem list -i -v '~> 0.0.12' "^arr-pm$" >/dev/null; then gem install -v '~> 0.0.12' arr-pm; fi"
# we have a newer incompatible version in the archive
runuser -u ${gitlab_user} -- sh -c "if ! gem list -i -v '~> 0.6.1' "^omniauth_openid_connect$" >/dev/null; then gem install -v '~> 0.6.1' omniauth_openid_connect; fi"
runuser -u ${gitlab_user} -- sh -c "if ! gem list -i "^app_store_connect$" >/dev/null; then gem install app_store_connect; fi"
# Packaged version is probably buggy - task lists on issues broken
runuser -u ${gitlab_user} -- sh -c "if ! gem list -i -v 2.3.2 "^deckar01-task_list$" >/dev/null; then gem install -v 2.3.2 deckar01-task_list; fi"
# We need this version to use newer googleauth
runuser -u ${gitlab_user} -- sh -c "if ! gem list -i -v 0.53.0 "^google-api-client$" >/dev/null; then gem install -v 0.53.0 google-api-client; fi"
# We have a newer incompatible version in the archive
runuser -u ${gitlab_user} -- sh -c "if ! gem list -i -v 0.10.0 "^google-apis-core$" >/dev/null; then gem install -v 0.10.0 google-apis-core; fi"
# archive has gitaly 16.0
runuser -u ${gitlab_user} -- sh -c "if ! gem list -i -v '~> 15.9.0.pre.rc3' "^gitaly$" >/dev/null; then gem install -v '~> 15.9.0.pre.rc3' gitaly; fi"
# We need a newer openssl in ruby 3.1, see #1032070
ruby_version=$(ruby -e 'print "#{RUBY_VERSION}"')
if [ $(printf %.1s "${ruby_version}") -ge 3 ]; then
runuser -u ${gitlab_user} -- sh -c "if ! gem list -i -v 3.0.2 "^openssl$" >/dev/null; then gem install -v 3.0.2 openssl; fi"
fi
# Uninstall rack 3.x
runuser -u ${gitlab_user} -- sh -c "if ! gem list -i -v '~>3.0' "^rack$" >/dev/null; then gem uninstall -v '~>3.0' rack; fi"
# Uninstall gitlab-labkit since it require jaeger-client '~> 1.1.0'
if [ "$(gem which gitlab-labkit)" = "/var/lib/gitlab/.gem/gems/gitlab-labkit-0.29.0/lib/gitlab-labkit.rb" ]; then
runuser -u ${gitlab_user} -- sh -c "gem uninstall -v '~> 0.29.0' gitlab-labkit"
fi
# Gitlab needs this specific version due to
# https://github.com/fog/fog-google/issues/421
runuser -u ${gitlab_user} -- sh -c "if ! gem list -i -v 2.1.0 "^fog-core$" >/dev/null; then gem install -v 2.1.0 fog-core; fi"
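The postinst snippet above gates the openssl 3.0.2 gem install on the first character of `RUBY_VERSION`, which works for single-digit majors but is fragile. A sketch of the same check done with `Gem::Version` (the helper name is ours, not part of the maintainer script):

```ruby
require "rubygems"

# Robust version comparison instead of inspecting the first character of
# RUBY_VERSION: Gem::Version understands multi-component version strings.
def needs_openssl_gem?(ruby_version)
  Gem::Version.new(ruby_version) >= Gem::Version.new("3.1")
end

puts needs_openssl_gem?("3.1.2")  # true: ruby 3.1 needs the newer openssl gem
puts needs_openssl_gem?("2.7.4")  # false
```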


@ -34,13 +34,8 @@ module Banzai
# https://github.com/vmg/rinku/blob/v2.0.1/ext/rinku/autolink.c#L65
#
# Rubular: http://rubular.com/r/nrL3r9yUiq
# Note that it's not possible to use Gitlab::UntrustedRegexp for LINK_PATTERN,
# as `(?<!` is unsupported in `re2`, see https://github.com/google/re2/wiki/Syntax
LINK_PATTERN = %r{([a-z][a-z0-9\+\.-]+://[^\s>]+)(?<!\?|!|\.|,|:)}.freeze
ENTITY_UNTRUSTED = '((?:&[\w#]+;)+)\z'
ENTITY_UNTRUSTED_REGEX = Gitlab::UntrustedRegexp.new(ENTITY_UNTRUSTED, multiline: false)
# Text matching LINK_PATTERN inside these elements will not be linked
IGNORE_PARENTS = %w(a code kbd pre script style).to_set
@ -90,14 +85,10 @@ module Banzai
# Remove any trailing HTML entities and store them for appending
# outside the link element. The entity must be marked HTML safe in
# order to be output literally rather than escaped.
dropped = ''
match = ENTITY_UNTRUSTED_REGEX.replace_gsub(match) do |entities|
dropped = entities[1].html_safe
match.gsub!(/((?:&[\w#]+;)+)\z/, '')
dropped = (Regexp.last_match(1) || '').html_safe
''
end
# To match the behavior of Rinku, if the matched link ends with a
# To match the behaviour of Rinku, if the matched link ends with a
# closing part of a matched pair of punctuation, we remove that trailing
# character unless there are an equal number of closing and opening
# characters in the link.
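The entity-stripping change above swaps a plain `String#gsub!` for `Gitlab::UntrustedRegexp` (RE2) as part of the autolink ReDoS mitigation. The trailing-entity logic itself can be sketched with a plain Ruby `Regexp` (assumption: we use `sub` here, where the real filter goes through the RE2 wrapper):

```ruby
# One or more HTML entities anchored at the end of the matched link.
ENTITY_UNTRUSTED = /((?:&[\w#]+;)+)\z/

def strip_trailing_entities(match)
  dropped = ""
  stripped = match.sub(ENTITY_UNTRUSTED) do
    dropped = Regexp.last_match(1)  # keep the entities for re-appending outside the <a>
    ""
  end
  [stripped, dropped]
end

link, dropped = strip_trailing_entities("http://example.com&gt;&amp;")
puts link     # "http://example.com"
puts dropped  # "&gt;&amp;"
```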


@ -11,7 +11,7 @@ module Banzai
def call
return doc unless settings.plantuml_enabled? && doc.at_xpath(lang_tag)
Gitlab::Plantuml.configure
plantuml_setup
doc.xpath(lang_tag).each do |node|
img_tag = Nokogiri::HTML::DocumentFragment.parse(
@ -38,6 +38,15 @@ module Banzai
def settings
Gitlab::CurrentSettings.current_application_settings
end
def plantuml_setup
Asciidoctor::PlantUml.configure do |conf|
conf.url = settings.plantuml_url
conf.png_enable = settings.plantuml_enabled
conf.svg_enable = false
conf.txt_enable = false
end
end
end
end
end


@ -24,7 +24,7 @@ module BulkImports
return if tar_filepath?(file_path)
return if lfs_json_filepath?(file_path)
return if File.directory?(file_path)
return if Gitlab::Utils::FileInfo.linked?(file_path)
return if File.lstat(file_path).symlink?
size = File.size(file_path)
oid = LfsObject.calculate_oid(file_path)


@ -24,7 +24,7 @@ module BulkImports
# Validate that the path is OK to load
Gitlab::Utils.check_allowed_absolute_path_and_path_traversal!(file_path, [Dir.tmpdir])
return if File.directory?(file_path)
return if Gitlab::Utils::FileInfo.linked?(file_path)
return if File.lstat(file_path).symlink?
avatar_path = AVATAR_PATTERN.match(file_path)
return save_avatar(file_path) if avatar_path


@ -32,7 +32,7 @@ module BulkImports
end
def validate_symlink
return unless Gitlab::Utils::FileInfo.linked?(filepath)
return unless File.lstat(filepath).symlink?
File.delete(filepath)
raise_error 'Invalid downloaded file'


@ -26,7 +26,7 @@ module BulkImports
return unless portable.lfs_enabled?
return unless File.exist?(bundle_path)
return if File.directory?(bundle_path)
return if Gitlab::Utils::FileInfo.linked?(bundle_path)
return if File.lstat(bundle_path).symlink?
portable.design_repository.create_from_bundle(bundle_path)
end


@ -26,7 +26,7 @@ module BulkImports
return unless File.exist?(bundle_path)
return if File.directory?(bundle_path)
return if Gitlab::Utils::FileInfo.linked?(bundle_path)
return if File.lstat(bundle_path).symlink?
portable.repository.create_from_bundle(bundle_path)
end


@ -77,11 +77,20 @@ module Gitlab
context[:pipeline] = :ascii_doc
context[:max_includes] = [MAX_INCLUDES, context[:max_includes]].compact.min
Gitlab::Plantuml.configure
plantuml_setup
html = ::Asciidoctor.convert(input, asciidoc_opts)
html = Banzai.render(html, context)
html.html_safe
end
def self.plantuml_setup
Asciidoctor::PlantUml.configure do |conf|
conf.url = Gitlab::CurrentSettings.plantuml_url
conf.svg_enable = Gitlab::CurrentSettings.plantuml_enabled
conf.png_enable = Gitlab::CurrentSettings.plantuml_enabled
conf.txt_enable = false
end
end
end
end


@ -42,7 +42,7 @@ module Gitlab
def prohibited_branch_checks
return if deletion?
if branch_name =~ %r{\A\h{40}(-/|/|\z)}
if branch_name =~ %r{\A\h{40}(/|\z)}
raise GitAccess::ForbiddenError, ERROR_MESSAGES[:prohibited_hex_branch_name]
end
end
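The tightened pattern additionally rejects branch names that are a 40-character hex string followed by `-/`, per the "Prohibit 40 character hex plus a hyphen if branch name is path" security change. A quick sketch of what each variant matches:

```ruby
# \h matches a hex digit; the alternation covers a bare SHA-like name,
# a SHA-like path prefix, and the new "-/" form.
PROHIBITED_HEX_BRANCH = %r{\A\h{40}(-/|/|\z)}

sha_like = "a" * 40

puts PROHIBITED_HEX_BRANCH.match?(sha_like)          # true: bare 40-hex name
puts PROHIBITED_HEX_BRANCH.match?("#{sha_like}/x")   # true
puts PROHIBITED_HEX_BRANCH.match?("#{sha_like}-/x")  # true: the newly covered case
puts PROHIBITED_HEX_BRANCH.match?("feature/login")   # false
```

The old pattern `%r{\A\h{40}(/|\z)}` would have let the `-/` form through.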


@ -65,7 +65,7 @@ module Gitlab
def validate_archive_path
Gitlab::Utils.check_path_traversal!(archive_path)
raise(ServiceError, 'Archive path is a symlink or hard link') if Gitlab::Utils::FileInfo.linked?(archive_path)
raise(ServiceError, 'Archive path is a symlink') if File.lstat(archive_path).symlink?
raise(ServiceError, 'Archive path is not a file') unless File.file?(archive_path)
end


@ -25,7 +25,7 @@ module Gitlab
message: 'params invalid'
}, allow_blank: true
validates :search, format: {
with: /\A(name=[a-zA-Z0-9\-:]+(?:,name=[a-zA-Z0-9\-:]+)*)\z/,
with: /\A([a-z\_]*=[a-zA-Z0-9\- :]*,*)*\z/,
message: 'params invalid'
}, allow_blank: true
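The stricter Harbor pattern only admits comma-separated `name=` terms, whereas the old pattern's nested repetition accepted almost anything, including stray spaces and empty terms. A runnable comparison:

```ruby
# New, stricter pattern vs. the pattern it replaces (both from the diff above).
STRICT = /\A(name=[a-zA-Z0-9\-:]+(?:,name=[a-zA-Z0-9\-:]+)*)\z/
LOOSE  = /\A([a-z\_]*=[a-zA-Z0-9\- :]*,*)*\z/

puts STRICT.match?("name=web,name=api:v1")  # true: only name= terms allowed
puts STRICT.match?("q=x y,=,")              # false
puts LOOSE.match?("q=x y,=,")               # true: accepted by the old pattern
```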


@ -5,11 +5,8 @@ module Gitlab
module CommandLineUtil
UNTAR_MASK = 'u+rwX,go+rX,go-w'
DEFAULT_DIR_MODE = 0700
CLEAN_DIR_IGNORE_FILE_NAMES = %w[. ..].freeze
CommandLineUtilError = Class.new(StandardError)
FileOversizedError = Class.new(CommandLineUtilError)
HardLinkError = Class.new(CommandLineUtilError)
FileOversizedError = Class.new(StandardError)
def tar_czf(archive:, dir:)
tar_with_options(archive: archive, dir: dir, options: 'czf')
@ -93,7 +90,7 @@ module Gitlab
def untar_with_options(archive:, dir:, options:)
execute_cmd(%W(tar -#{options} #{archive} -C #{dir}))
execute_cmd(%W(chmod -R #{UNTAR_MASK} #{dir}))
clean_extraction_dir!(dir)
remove_symlinks(dir)
end
# rubocop:disable Gitlab/ModuleWithInstanceVariables
@ -125,27 +122,17 @@ module Gitlab
true
end
# Scans and cleans the directory tree.
# Symlinks are considered legal but are removed.
# Files sharing hard links are considered illegal and the directory will be removed
# and a `HardLinkError` exception will be raised.
#
# @raise [HardLinkError] if there multiple hard links to the same file detected.
# @return [Boolean] true
def clean_extraction_dir!(dir)
def remove_symlinks(dir)
ignore_file_names = %w[. ..]
# Using File::FNM_DOTMATCH to also delete symlinks starting with "."
Dir.glob("#{dir}/**/*", File::FNM_DOTMATCH).each do |filepath|
next if CLEAN_DIR_IGNORE_FILE_NAMES.include?(File.basename(filepath))
raise HardLinkError, 'File shares hard link' if Gitlab::Utils::FileInfo.shares_hard_link?(filepath)
FileUtils.rm(filepath) if Gitlab::Utils::FileInfo.linked?(filepath)
end
Dir.glob("#{dir}/**/*", File::FNM_DOTMATCH)
.reject { |f| ignore_file_names.include?(File.basename(f)) }
.each do |filepath|
FileUtils.rm(filepath) if File.lstat(filepath).symlink?
end
true
rescue HardLinkError
FileUtils.remove_dir(dir)
raise
end
end
end
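The `clean_extraction_dir!` shown above both deletes symlinks (including dotfile symlinks, via `FNM_DOTMATCH`) and aborts on files sharing hard links. A self-contained sketch, where a plain `RuntimeError` stands in for `HardLinkError` and the temp-dir setup is ours:

```ruby
require "fileutils"
require "tmpdir"

CLEAN_DIR_IGNORE_FILE_NAMES = %w[. ..].freeze

def clean_extraction_dir!(dir)
  # FNM_DOTMATCH also matches entries starting with "."
  Dir.glob("#{dir}/**/*", File::FNM_DOTMATCH).each do |filepath|
    next if CLEAN_DIR_IGNORE_FILE_NAMES.include?(File.basename(filepath))
    stat = File.lstat(filepath)
    raise "File shares hard link" if stat.file? && stat.nlink > 1
    FileUtils.rm(filepath) if stat.symlink?
  end
  true
end

Dir.mktmpdir do |dir|
  File.write(File.join(dir, "data.txt"), "ok")
  File.symlink(File.join(dir, "data.txt"), File.join(dir, ".sneaky"))
  clean_extraction_dir!(dir)
  puts File.symlink?(File.join(dir, ".sneaky"))  # false: symlink removed
  puts File.exist?(File.join(dir, "data.txt"))   # true: regular file kept
end
```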


@ -87,7 +87,7 @@ module Gitlab
def validate_archive_path
Gitlab::Utils.check_path_traversal!(@archive_path)
raise(ServiceError, 'Archive path is a symlink or hard link') if Gitlab::Utils::FileInfo.linked?(@archive_path)
raise(ServiceError, 'Archive path is a symlink') if File.lstat(@archive_path).symlink?
raise(ServiceError, 'Archive path is not a file') unless File.file?(@archive_path)
end


@ -23,7 +23,7 @@ module Gitlab
mkdir_p(@shared.export_path)
mkdir_p(@shared.archive_path)
clean_extraction_dir!(@shared.export_path)
remove_symlinks(@shared.export_path)
copy_archive
wait_for_archived_file do
@ -35,7 +35,7 @@ module Gitlab
false
ensure
remove_import_file
clean_extraction_dir!(@shared.export_path)
remove_symlinks(@shared.export_path)
end
private


@ -21,9 +21,7 @@ module Gitlab
# This reads from `tree/project.json`
path = file_path("#{importable_path}.json")
if !File.exist?(path) || Gitlab::Utils::FileInfo.linked?(path)
raise Gitlab::ImportExport::Error, 'Invalid file'
end
raise Gitlab::ImportExport::Error, 'Invalid file' if !File.exist?(path) || File.symlink?(path)
data = File.read(path, MAX_JSON_DOCUMENT_SIZE)
json_decode(data)
@ -36,7 +34,7 @@ module Gitlab
# This reads from `tree/project/merge_requests.ndjson`
path = file_path(importable_path, "#{key}.ndjson")
next if !File.exist?(path) || Gitlab::Utils::FileInfo.linked?(path)
next if !File.exist?(path) || File.symlink?(path)
File.foreach(path, MAX_JSON_DOCUMENT_SIZE).with_index do |line, line_num|
documents << [json_decode(line), line_num]


@ -57,7 +57,7 @@ module Gitlab
source_child = File.join(source_path, child)
target_child = File.join(target_path, child)
next if Gitlab::Utils::FileInfo.linked?(source_child)
next if File.lstat(source_child).symlink?
if File.directory?(source_child)
FileUtils.mkdir_p(target_child, mode: DEFAULT_DIR_MODE) unless File.exist?(target_child)


@ -10,12 +10,13 @@ module Gitlab
def execute
return if host.blank?
gitlab_host = ::Gitlab.config.pages.host.downcase.prepend(".")
gitlab_host = ::Settings.pages.host.downcase.prepend(".")
if host.ends_with?(gitlab_host)
name = host.delete_suffix(gitlab_host)
by_unique_domain(name) || by_namespace_domain(name)
by_namespace_domain(name) ||
by_unique_domain(name)
else
by_custom_domain(host)
end
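The lookup above strips the configured Pages host suffix to recover the candidate namespace before the order-sensitive `by_unique_domain`/`by_namespace_domain` checks. A sketch, assuming a hypothetical Pages host of `example.io` (the real code reads `::Gitlab.config.pages.host` and uses ActiveSupport's `ends_with?`; stdlib `end_with?` behaves the same here):

```ruby
def namespace_from_pages_host(host, pages_host: "example.io")
  suffix = ".#{pages_host.downcase}"
  return nil unless host.end_with?(suffix)
  host.delete_suffix(suffix)  # "ns1.example.io" -> "ns1"
end

puts namespace_from_pages_host("ns1.example.io").inspect    # "ns1"
puts namespace_from_pages_host("shop.example.org").inspect  # nil: falls through to by_custom_domain
```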


@ -130,7 +130,7 @@ module Gitlab
# `NAMESPACE_FORMAT_REGEX`, with the negative lookbehind assertion removed. This means that the client-side validation
# will pass for usernames ending in `.atom` and `.git`, but will be caught by the server-side validation.
PATH_START_CHAR = '[a-zA-Z0-9_\.]'
PATH_REGEX_STR = PATH_START_CHAR + '[a-zA-Z0-9_\-\.]' + "{0,#{Namespace::URL_MAX_LENGTH - 1}}"
PATH_REGEX_STR = PATH_START_CHAR + '[a-zA-Z0-9_\-\.]*'
NAMESPACE_FORMAT_REGEX_JS = PATH_REGEX_STR + '[a-zA-Z0-9_\-]|[a-zA-Z0-9_]'
NO_SUFFIX_REGEX = /(?<!\.git|\.atom)/.freeze
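The change replaces the unbounded `*` with a bounded `{0,URL_MAX_LENGTH - 1}` quantifier, so the client-side pattern itself caps the path length. A sketch, assuming `URL_MAX_LENGTH = 255` (the constant's value is not shown in this diff):

```ruby
URL_MAX_LENGTH = 255  # assumption: value of Namespace::URL_MAX_LENGTH
PATH_START_CHAR = '[a-zA-Z0-9_\.]'
BOUNDED   = /\A#{PATH_START_CHAR}[a-zA-Z0-9_\-\.]{0,#{URL_MAX_LENGTH - 1}}\z/
UNBOUNDED = /\A#{PATH_START_CHAR}[a-zA-Z0-9_\-\.]*\z/

long_path = "a" * 300

puts UNBOUNDED.match?(long_path)   # true: old pattern has no length cap
puts BOUNDED.match?(long_path)     # false: capped at 255 characters total
puts BOUNDED.match?("my-project")  # true
```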


@ -1,20 +0,0 @@
# frozen_string_literal: true
require "asciidoctor_plantuml/plantuml"
module Gitlab
module Plantuml
class << self
def configure
Asciidoctor::PlantUml.configure do |conf|
conf.url = Gitlab::CurrentSettings.plantuml_url
conf.png_enable = Gitlab::CurrentSettings.plantuml_enabled
conf.svg_enable = false
conf.txt_enable = false
conf
end
end
end
end
end


@ -1,35 +0,0 @@
# frozen_string_literal: true
module Gitlab
module Utils
module FileInfo
class << self
# Returns true if:
# - File or directory is a symlink.
# - File shares a hard link.
def linked?(file)
stat = to_file_stat(file)
stat.symlink? || shares_hard_link?(stat)
end
# Returns:
# - true if file shares a hard link with another file.
# - false if file is a directory, as directories cannot be hard linked.
def shares_hard_link?(file)
stat = to_file_stat(file)
stat.file? && stat.nlink > 1
end
private
def to_file_stat(filepath_or_stat)
return filepath_or_stat if filepath_or_stat.is_a?(File::Stat)
File.lstat(filepath_or_stat)
end
end
end
end
end
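The `FileInfo` helper above reduces to two `File.lstat` checks. A self-contained sketch of the same logic:

```ruby
require "fileutils"
require "tmpdir"

# Same checks as Gitlab::Utils::FileInfo above: a path is "linked" if it
# is a symlink, or if it is a regular file whose inode has more than one
# directory entry (hard-link count > 1). Directories cannot be hard
# linked, which is why stat.file? guards the nlink test.
def linked?(path)
  stat = File.lstat(path)
  stat.symlink? || (stat.file? && stat.nlink > 1)
end

Dir.mktmpdir do |dir|
  plain = File.join(dir, "plain.txt")
  FileUtils.touch(plain)
  linked?(plain) # => false, a fresh file has nlink == 1

  FileUtils.link(plain, File.join(dir, "hard_link.txt"))
  linked?(plain) # => true, nlink is now 2

  FileUtils.ln_s(plain, File.join(dir, "sym"))
  linked?(File.join(dir, "sym")) # => true, lstat sees the symlink itself
end
```

Using `lstat` rather than `stat` is the important design choice: it inspects the link itself instead of following it.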

View file

@@ -4,7 +4,7 @@ require 'jwt'
module JSONWebToken
class HMACToken < Token
LEEWAY = 60
IAT_LEEWAY = 60
JWT_ALGORITHM = 'HS256'
def initialize(secret)
@@ -13,7 +13,7 @@ module JSONWebToken
@secret = secret
end
def self.decode(token, secret, leeway: LEEWAY, verify_iat: false)
def self.decode(token, secret, leeway: IAT_LEEWAY, verify_iat: true)
JWT.decode(token, secret, true, leeway: leeway, verify_iat: verify_iat, algorithm: JWT_ALGORITHM)
end
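The decode change above turns on `iat` verification while keeping a 60-second leeway for clock skew between issuer and verifier. As a toy illustration of what that leeway means for an HS256 token, hand-rolled with stdlib only (the real class delegates to the `jwt` gem, as shown in the diff):

```ruby
require "openssl"
require "base64"
require "json"

LEEWAY = 60 # seconds of tolerated clock skew, mirroring IAT_LEEWAY above

def b64url(data)
  Base64.urlsafe_encode64(data, padding: false)
end

def hs256_encode(payload, secret)
  input = "#{b64url(JSON.generate({ alg: 'HS256', typ: 'JWT' }))}.#{b64url(JSON.generate(payload))}"
  "#{input}.#{b64url(OpenSSL::HMAC.digest('SHA256', secret, input))}"
end

# Verifies the signature, then rejects tokens "issued" further in the
# future than the leeway allows -- the verify_iat half of the change.
def hs256_decode(token, secret, leeway: LEEWAY)
  head, body, sig = token.split(".")
  input = "#{head}.#{body}"
  raise "bad signature" unless b64url(OpenSSL::HMAC.digest("SHA256", secret, input)) == sig

  payload = JSON.parse(Base64.urlsafe_decode64(body))
  raise "iat too far in the future" if payload["iat"] && payload["iat"] > Time.now.to_i + leeway

  payload
end

token = hs256_encode({ "sub" => "ci", "iat" => Time.now.to_i }, "secret")
hs256_decode(token, "secret")["sub"] # => "ci"
```

Production code should prefer the gem's `JWT.decode`, which also uses a constant-time signature comparison.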

View file

@@ -35713,9 +35713,6 @@ msgstr ""
msgid "ProjectSettings|With GitLab Pages you can host your static websites on GitLab. GitLab Pages uses a caching mechanism for efficiency. Your changes may not take effect until that cache is invalidated, which usually takes less than a minute."
msgstr ""
msgid "ProjectSetting|already in use"
msgstr ""
msgid "ProjectTemplates|.NET Core"
msgstr ""
@@ -36010,9 +36007,6 @@ msgstr ""
msgid "ProjectsNew|Your project will be created at:"
msgstr ""
msgid "Project|already in use"
msgstr ""
msgid "PrometheusAlerts|exceeded"
msgstr ""
@@ -53174,6 +53168,9 @@ msgstr ""
msgid "eligible users"
msgstr ""
msgid "email '%{email}' is not a verified email."
msgstr ""
msgid "email address settings"
msgstr ""
@@ -53479,9 +53476,6 @@ msgstr ""
msgid "is not valid. The iteration group has to match the iteration cadence group."
msgstr ""
msgid "is not verified."
msgstr ""
msgid "is one of"
msgstr ""

View file

@@ -59,7 +59,7 @@
"@gitlab/svgs": "3.46.0",
"@gitlab/ui": "62.10.0",
"@gitlab/visual-review-tools": "1.7.3",
"@gitlab/web-ide": "0.0.1-dev-20230713160749-patch-1",
"@gitlab/web-ide": "0.0.1-dev-20230511143809",
"@mattiasbuelens/web-streams-adapter": "^0.1.0",
"@popperjs/core": "^2.11.2",
"@rails/actioncable": "6.1.4-7",

View file

@@ -4,7 +4,6 @@ require 'spec_helper'
RSpec.describe Projects::PipelineSchedulesController, feature_category: :continuous_integration do
include AccessMatchersForController
using RSpec::Parameterized::TableSyntax
let_it_be(:user) { create(:user) }
let_it_be(:project) { create(:project, :public, :repository) }
@@ -46,43 +45,6 @@ RSpec.describe Projects::PipelineSchedulesController, feature_category: :continu
end
end
shared_examples 'protecting ref' do
where(:branch_access_levels, :tag_access_level, :maintainer_accessible, :developer_accessible) do
[:no_one_can_push, :no_one_can_merge] | :no_one_can_create | \
:be_denied_for | :be_denied_for
[:maintainers_can_push, :maintainers_can_merge] | :maintainers_can_create | \
:be_allowed_for | :be_denied_for
[:developers_can_push, :developers_can_merge] | :developers_can_create | \
:be_allowed_for | :be_allowed_for
end
with_them do
context 'when branch is protected' do
let(:ref_prefix) { 'heads' }
let(:ref_name) { 'master' }
before do
create(:protected_branch, *branch_access_levels, name: ref_name, project: project)
end
it { expect { go }.to try(maintainer_accessible, :maintainer).of(project) }
it { expect { go }.to try(developer_accessible, :developer).of(project) }
end
context 'when tag is protected' do
let(:ref_prefix) { 'tags' }
let(:ref_name) { 'v1.0.0' }
before do
create(:protected_tag, tag_access_level, name: ref_name, project: project)
end
it { expect { go }.to try(maintainer_accessible, :maintainer).of(project) }
it { expect { go }.to try(developer_accessible, :developer).of(project) }
end
end
end
describe 'GET #index' do
render_views
@@ -196,9 +158,7 @@ RSpec.describe Projects::PipelineSchedulesController, feature_category: :continu
end
describe 'security' do
let(:schedule) { attributes_for(:ci_pipeline_schedule, ref: "refs/#{ref_prefix}/#{ref_name}") }
let(:ref_prefix) { 'heads' }
let(:ref_name) { "master" }
let(:schedule) { attributes_for(:ci_pipeline_schedule) }
it 'is allowed for admin when admin mode enabled', :enable_admin_mode do
expect { go }.to be_allowed_for(:admin)
@@ -217,8 +177,6 @@ RSpec.describe Projects::PipelineSchedulesController, feature_category: :continu
it { expect { go }.to be_denied_for(:user) }
it { expect { go }.to be_denied_for(:external) }
it { expect { go }.to be_denied_for(:visitor) }
it_behaves_like 'protecting ref'
end
def go
@@ -469,7 +427,7 @@ RSpec.describe Projects::PipelineSchedulesController, feature_category: :continu
end
describe 'POST #play', :clean_gitlab_redis_rate_limiting do
let(:ref_name) { 'master' }
let(:ref) { 'master' }
before do
project.add_developer(user)
@@ -485,7 +443,7 @@ RSpec.describe Projects::PipelineSchedulesController, feature_category: :continu
it 'does not allow pipeline to be executed' do
expect(RunPipelineScheduleWorker).not_to receive(:perform_async)
go
post :play, params: { namespace_id: project.namespace.to_param, project_id: project, id: pipeline_schedule.id }
expect(response).to have_gitlab_http_status(:not_found)
end
@@ -495,14 +453,16 @@ RSpec.describe Projects::PipelineSchedulesController, feature_category: :continu
it 'executes a new pipeline' do
expect(RunPipelineScheduleWorker).to receive(:perform_async).with(pipeline_schedule.id, user.id).and_return('job-123')
go
post :play, params: { namespace_id: project.namespace.to_param, project_id: project, id: pipeline_schedule.id }
expect(flash[:notice]).to start_with 'Successfully scheduled a pipeline to run'
expect(response).to have_gitlab_http_status(:found)
end
it 'prevents users from scheduling the same pipeline repeatedly' do
2.times { go }
2.times do
post :play, params: { namespace_id: project.namespace.to_param, project_id: project, id: pipeline_schedule.id }
end
expect(flash.to_a.size).to eq(2)
expect(flash[:alert]).to eq _('You cannot play this scheduled pipeline at the moment. Please wait a minute.')
@@ -510,14 +470,17 @@ RSpec.describe Projects::PipelineSchedulesController, feature_category: :continu
end
end
describe 'security' do
let!(:pipeline_schedule) { create(:ci_pipeline_schedule, project: project, ref: "refs/#{ref_prefix}/#{ref_name}") }
context 'when a developer attempts to schedule a protected ref' do
it 'does not allow pipeline to be executed' do
create(:protected_branch, project: project, name: ref)
protected_schedule = create(:ci_pipeline_schedule, project: project, ref: ref)
it_behaves_like 'protecting ref'
end
expect(RunPipelineScheduleWorker).not_to receive(:perform_async)
def go
post :play, params: { namespace_id: project.namespace.to_param, project_id: project, id: pipeline_schedule.id }
post :play, params: { namespace_id: project.namespace.to_param, project_id: project, id: protected_schedule.id }
expect(response).to have_gitlab_http_status(:not_found)
end
end
end

View file

@@ -226,22 +226,6 @@ RSpec.describe Banzai::Filter::AutolinkFilter, feature_category: :team_planning
end
end
it 'protects against malicious backtracking' do
doc = "http://#{'&' * 1_000_000}x"
expect do
Timeout.timeout(30.seconds) { filter(doc) }
end.not_to raise_error
end
it 'does not timeout with excessively long scheme' do
doc = "#{'h' * 1_000_000}://example.com"
expect do
Timeout.timeout(30.seconds) { filter(doc) }
end.not_to raise_error
end
# Rinku does not escape these characters in HTML attributes, but content_tag
# does. We don't care about that difference for these specs, though.
def unescape(html)
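The two removed specs above are regression guards against catastrophic regex backtracking (ReDoS): a megabyte-scale hostile input must not hang the autolink filter. The removed code wraps the call in `Timeout.timeout(30.seconds)` (a Rails duration); the generic pattern in plain Ruby looks like this, with an assumed linear-time stand-in pattern rather than Banzai's real one:

```ruby
require "timeout"

# Hostile input modeled on the removed specs (scaled down here).
hostile = "http://#{'&' * 100_000}x"

# A guard like the specs': the match must finish well inside the budget.
result = Timeout.timeout(5) do
  # Single character-class quantifiers like \S+ backtrack linearly;
  # nested quantifiers such as /(a+)+$/ are what blow up on inputs
  # like the one above.
  hostile.match?(%r{\Ahttps?://\S+\z})
end
result # => true
```

Ruby 3.2+ also offers `Regexp.timeout=` as a process-wide guard against pathological patterns.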

View file

@@ -39,7 +39,6 @@ RSpec.describe Banzai::Filter::References::ProjectReferenceFilter, feature_categ
it_behaves_like 'fails fast', 'A' * 50000
it_behaves_like 'fails fast', '/a' * 50000
it_behaves_like 'fails fast', "mailto:#{'a-' * 499_000}@aaaaaaaa..aaaaaaaa.example.com"
end
it 'allows references with text after the > character' do

View file

@@ -2,7 +2,7 @@
require 'spec_helper'
RSpec.describe BulkImports::Common::Pipelines::LfsObjectsPipeline, feature_category: :importers do
RSpec.describe BulkImports::Common::Pipelines::LfsObjectsPipeline do
let_it_be(:portable) { create(:project) }
let_it_be(:oid) { 'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855' }
@@ -118,22 +118,13 @@ RSpec.describe BulkImports::Common::Pipelines::LfsObjectsPipeline, feature_categ
context 'when file path is symlink' do
it 'returns' do
symlink = File.join(tmpdir, 'symlink')
FileUtils.ln_s(lfs_file_path, symlink)
expect(Gitlab::Utils::FileInfo).to receive(:linked?).with(symlink).and_call_original
FileUtils.ln_s(File.join(tmpdir, lfs_file_path), symlink)
expect { pipeline.load(context, symlink) }.not_to change { portable.lfs_objects.count }
end
end
context 'when file path shares multiple hard links' do
it 'returns' do
FileUtils.link(lfs_file_path, File.join(tmpdir, 'hard_link'))
expect(Gitlab::Utils::FileInfo).to receive(:linked?).with(lfs_file_path).and_call_original
expect { pipeline.load(context, lfs_file_path) }.not_to change { portable.lfs_objects.count }
end
end
context 'when path is a directory' do
it 'returns' do
expect { pipeline.load(context, Dir.tmpdir) }.not_to change { portable.lfs_objects.count }

View file

@@ -105,7 +105,6 @@ RSpec.describe BulkImports::Common::Pipelines::UploadsPipeline, feature_category
it 'returns' do
path = File.join(tmpdir, 'test')
FileUtils.touch(path)
expect { pipeline.load(context, path) }.not_to change { portable.uploads.count }
end
end
@@ -119,22 +118,13 @@ RSpec.describe BulkImports::Common::Pipelines::UploadsPipeline, feature_category
context 'when path is a symlink' do
it 'does not upload the file' do
symlink = File.join(tmpdir, 'symlink')
FileUtils.ln_s(upload_file_path, symlink)
expect(Gitlab::Utils::FileInfo).to receive(:linked?).with(symlink).and_call_original
FileUtils.ln_s(File.join(tmpdir, upload_file_path), symlink)
expect { pipeline.load(context, symlink) }.not_to change { portable.uploads.count }
end
end
context 'when path has multiple hard links' do
it 'does not upload the file' do
FileUtils.link(upload_file_path, File.join(tmpdir, 'hard_link'))
expect(Gitlab::Utils::FileInfo).to receive(:linked?).with(upload_file_path).and_call_original
expect { pipeline.load(context, upload_file_path) }.not_to change { portable.uploads.count }
end
end
context 'when path traverses' do
it 'does not upload the file' do
path_traversal = "#{uploads_dir_path}/avatar/../../../../etc/passwd"

View file

@@ -2,7 +2,7 @@
require 'spec_helper'
RSpec.describe BulkImports::Projects::Pipelines::DesignBundlePipeline, feature_category: :importers do
RSpec.describe BulkImports::Projects::Pipelines::DesignBundlePipeline do
let_it_be(:design) { create(:design, :with_file) }
let(:portable) { create(:project) }
@@ -125,9 +125,9 @@ RSpec.describe BulkImports::Projects::Pipelines::DesignBundlePipeline, feature_c
context 'when path is symlink' do
it 'returns' do
symlink = File.join(tmpdir, 'symlink')
FileUtils.ln_s(design_bundle_path, symlink)
expect(Gitlab::Utils::FileInfo).to receive(:linked?).with(symlink).and_call_original
FileUtils.ln_s(File.join(tmpdir, design_bundle_path), symlink)
expect(portable.design_repository).not_to receive(:create_from_bundle)
pipeline.load(context, symlink)
@@ -136,19 +136,6 @@ RSpec.describe BulkImports::Projects::Pipelines::DesignBundlePipeline, feature_c
end
end
context 'when path has multiple hard links' do
it 'returns' do
FileUtils.link(design_bundle_path, File.join(tmpdir, 'hard_link'))
expect(Gitlab::Utils::FileInfo).to receive(:linked?).with(design_bundle_path).and_call_original
expect(portable.design_repository).not_to receive(:create_from_bundle)
pipeline.load(context, design_bundle_path)
expect(portable.design_repository.exists?).to eq(false)
end
end
context 'when path is not under tmpdir' do
it 'returns' do
expect { pipeline.load(context, '/home/test.txt') }

View file

@@ -2,7 +2,7 @@
require 'spec_helper'
RSpec.describe BulkImports::Projects::Pipelines::RepositoryBundlePipeline, feature_category: :importers do
RSpec.describe BulkImports::Projects::Pipelines::RepositoryBundlePipeline do
let_it_be(:source) { create(:project, :repository) }
let(:portable) { create(:project) }
@@ -123,9 +123,9 @@ RSpec.describe BulkImports::Projects::Pipelines::RepositoryBundlePipeline, featu
context 'when path is symlink' do
it 'returns' do
symlink = File.join(tmpdir, 'symlink')
FileUtils.ln_s(bundle_path, symlink)
expect(Gitlab::Utils::FileInfo).to receive(:linked?).with(symlink).and_call_original
FileUtils.ln_s(File.join(tmpdir, bundle_path), symlink)
expect(portable.repository).not_to receive(:create_from_bundle)
pipeline.load(context, symlink)
@@ -134,19 +134,6 @@ RSpec.describe BulkImports::Projects::Pipelines::RepositoryBundlePipeline, featu
end
end
context 'when path has multiple hard links' do
it 'returns' do
FileUtils.link(bundle_path, File.join(tmpdir, 'hard_link'))
expect(Gitlab::Utils::FileInfo).to receive(:linked?).with(bundle_path).and_call_original
expect(portable.repository).not_to receive(:create_from_bundle)
pipeline.load(context, bundle_path)
expect(portable.repository.exists?).to eq(false)
end
end
context 'when path is not under tmpdir' do
it 'returns' do
expect { pipeline.load(context, '/home/test.txt') }

View file

@@ -32,12 +32,6 @@ RSpec.describe Gitlab::Checks::BranchCheck do
expect { subject.validate! }.to raise_error(Gitlab::GitAccess::ForbiddenError, "You cannot create a branch with a 40-character hexadecimal branch name.")
end
it "prohibits 40-character hexadecimal branch names followed by a dash as the start of a path" do
allow(subject).to receive(:branch_name).and_return("267208abfe40e546f5e847444276f7d43a39503e-/test")
expect { subject.validate! }.to raise_error(Gitlab::GitAccess::ForbiddenError, "You cannot create a branch with a 40-character hexadecimal branch name.")
end
it "doesn't prohibit a nested hexadecimal in a branch name" do
allow(subject).to receive(:branch_name).and_return("267208abfe40e546f5e847444276f7d43a39503e-fix")

View file

@@ -105,16 +105,6 @@ RSpec.describe Gitlab::Ci::DecompressedGzipSizeValidator, feature_category: :imp
end
end
context 'when archive path has multiple hard links' do
before do
FileUtils.link(filepath, File.join(Dir.mktmpdir, 'hard_link'))
end
it 'returns false' do
expect(subject).not_to be_valid
end
end
context 'when archive path is not a file' do
let(:filepath) { Dir.mktmpdir }
let(:filesize) { File.size(filepath) }

View file

@@ -2,7 +2,7 @@
require 'spec_helper'
RSpec.describe Gitlab::GithubImport::AttachmentsDownloader, feature_category: :importers do
RSpec.describe Gitlab::GithubImport::AttachmentsDownloader do
subject(:downloader) { described_class.new(file_url) }
let_it_be(:file_url) { 'https://example.com/avatar.png' }
@@ -39,26 +39,6 @@ RSpec.describe Gitlab::GithubImport::AttachmentsDownloader, feature_category: :i
end
end
context 'when file shares multiple hard links' do
let(:tmpdir) { Dir.mktmpdir }
let(:hard_link) { File.join(tmpdir, 'hard_link') }
before do
existing_file = File.join(tmpdir, 'file.txt')
FileUtils.touch(existing_file)
FileUtils.link(existing_file, hard_link)
allow(downloader).to receive(:filepath).and_return(hard_link)
end
it 'raises expected exception' do
expect(Gitlab::Utils::FileInfo).to receive(:linked?).with(hard_link).and_call_original
expect { downloader.perform }.to raise_exception(
described_class::DownloadError,
'Invalid downloaded file'
)
end
end
context 'when filename is malicious' do
let_it_be(:file_url) { 'https://example.com/ava%2F..%2Ftar.png' }

View file

@@ -3,8 +3,6 @@
require 'spec_helper'
RSpec.describe Gitlab::Harbor::Query do
using RSpec::Parameterized::TableSyntax
let_it_be(:harbor_integration) { create(:harbor_integration) }
let(:params) { {} }
@@ -113,20 +111,19 @@ RSpec.describe Gitlab::Harbor::Query do
end
context 'search' do
where(:search_param, :is_valid) do
"name=desc" | true
"name=value1,name=value-2" | true
"name=value1,name=value_2" | false
"name=desc,key=value" | false
"name=value1, name=value2" | false
"name" | false
context 'with valid search' do
let(:params) { { search: 'name=desc' } }
it 'initializes successfully' do
expect(query.valid?).to eq(true)
end
end
with_them do
let(:params) { { search: search_param } }
context 'with invalid search' do
let(:params) { { search: 'blabla' } }
it "validates according to the regex" do
expect(query.valid?).to eq(is_valid)
it 'fails to initialize' do
expect(query.valid?).to eq(false)
end
end
end
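The restored table-syntax spec above enumerates which `search` values the Harbor query accepts. The table implies an allow-list regex along these lines (a guess at the shape; the exact GitLab constant is not shown in this diff):

```ruby
# Hypothetical reconstruction of the search-param allow-list the table
# exercises: comma-separated name=value pairs, values limited to
# alphanumerics and dashes (underscores, spaces, and other keys fail).
SEARCH_PARAM = /\A(?:name=[a-zA-Z0-9\-]+)(?:,name=[a-zA-Z0-9\-]+)*\z/

"name=desc".match?(SEARCH_PARAM)                # => true
"name=value1,name=value-2".match?(SEARCH_PARAM) # => true
"name=value1,name=value_2".match?(SEARCH_PARAM) # => false (underscore)
"name=desc,key=value".match?(SEARCH_PARAM)      # => false (unknown key)
"name=value1, name=value2".match?(SEARCH_PARAM) # => false (space)
"name".match?(SEARCH_PARAM)                     # => false (no value)
```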

View file

@@ -5,16 +5,13 @@ require 'spec_helper'
RSpec.describe Gitlab::ImportExport::CommandLineUtil, feature_category: :importers do
include ExportFileHelper
let(:path) { "#{Dir.tmpdir}/symlink_test" }
let(:archive) { 'spec/fixtures/symlink_export.tar.gz' }
let(:shared) { Gitlab::ImportExport::Shared.new(nil) }
# Separate where files are written during this test by their kind, to avoid them interfering with each other:
# - `source_dir` Dir to compress files from.
# - `target_dir` Dir to decompress archived files into.
# - `archive_dir` Dir to write any archive files to.
let(:source_dir) { Dir.mktmpdir }
let(:target_dir) { Dir.mktmpdir }
let(:tmpdir) { Dir.mktmpdir }
let(:archive_dir) { Dir.mktmpdir }
subject(:mock_class) do
subject do
Class.new do
include Gitlab::ImportExport::CommandLineUtil
@@ -28,59 +25,38 @@ RSpec.describe Gitlab::ImportExport::CommandLineUtil, feature_category: :importe
end
before do
FileUtils.mkdir_p(source_dir)
FileUtils.mkdir_p(path)
end
after do
FileUtils.rm_rf(source_dir)
FileUtils.rm_rf(target_dir)
FileUtils.rm_rf(path)
FileUtils.rm_rf(archive_dir)
FileUtils.remove_entry(tmpdir)
end
shared_examples 'deletes symlinks' do |compression, decompression|
it 'deletes the symlinks', :aggregate_failures do
Dir.mkdir("#{source_dir}/.git")
Dir.mkdir("#{source_dir}/folder")
FileUtils.touch("#{source_dir}/file.txt")
FileUtils.touch("#{source_dir}/folder/file.txt")
FileUtils.touch("#{source_dir}/.gitignore")
FileUtils.touch("#{source_dir}/.git/config")
File.symlink('file.txt', "#{source_dir}/.symlink")
File.symlink('file.txt', "#{source_dir}/.git/.symlink")
File.symlink('file.txt', "#{source_dir}/folder/.symlink")
archive_file = File.join(archive_dir, 'symlink_archive.tar.gz')
subject.public_send(compression, archive: archive_file, dir: source_dir)
subject.public_send(decompression, archive: archive_file, dir: target_dir)
Dir.mkdir("#{tmpdir}/.git")
Dir.mkdir("#{tmpdir}/folder")
FileUtils.touch("#{tmpdir}/file.txt")
FileUtils.touch("#{tmpdir}/folder/file.txt")
FileUtils.touch("#{tmpdir}/.gitignore")
FileUtils.touch("#{tmpdir}/.git/config")
File.symlink('file.txt', "#{tmpdir}/.symlink")
File.symlink('file.txt', "#{tmpdir}/.git/.symlink")
File.symlink('file.txt', "#{tmpdir}/folder/.symlink")
archive = File.join(archive_dir, 'archive')
subject.public_send(compression, archive: archive, dir: tmpdir)
expect(File).to exist("#{target_dir}/file.txt")
expect(File).to exist("#{target_dir}/folder/file.txt")
expect(File).to exist("#{target_dir}/.gitignore")
expect(File).to exist("#{target_dir}/.git/config")
expect(File).not_to exist("#{target_dir}/.symlink")
expect(File).not_to exist("#{target_dir}/.git/.symlink")
expect(File).not_to exist("#{target_dir}/folder/.symlink")
end
end
subject.public_send(decompression, archive: archive, dir: archive_dir)
shared_examples 'handles shared hard links' do |compression, decompression|
let(:archive_file) { File.join(archive_dir, 'hard_link_archive.tar.gz') }
subject(:decompress) { mock_class.public_send(decompression, archive: archive_file, dir: target_dir) }
before do
Dir.mkdir("#{source_dir}/dir")
FileUtils.touch("#{source_dir}/file.txt")
FileUtils.touch("#{source_dir}/dir/.file.txt")
FileUtils.link("#{source_dir}/file.txt", "#{source_dir}/.hard_linked_file.txt")
mock_class.public_send(compression, archive: archive_file, dir: source_dir)
end
it 'raises an exception and deletes the extraction dir', :aggregate_failures do
expect(FileUtils).to receive(:remove_dir).with(target_dir).and_call_original
expect(Dir).to exist(target_dir)
expect { decompress }.to raise_error(described_class::HardLinkError)
expect(Dir).not_to exist(target_dir)
expect(File.exist?("#{archive_dir}/file.txt")).to eq(true)
expect(File.exist?("#{archive_dir}/folder/file.txt")).to eq(true)
expect(File.exist?("#{archive_dir}/.gitignore")).to eq(true)
expect(File.exist?("#{archive_dir}/.git/config")).to eq(true)
expect(File.exist?("#{archive_dir}/.symlink")).to eq(false)
expect(File.exist?("#{archive_dir}/.git/.symlink")).to eq(false)
expect(File.exist?("#{archive_dir}/folder/.symlink")).to eq(false)
end
end
@@ -236,8 +212,6 @@ RSpec.describe Gitlab::ImportExport::CommandLineUtil, feature_category: :importe
end
describe '#gzip' do
let(:path) { source_dir }
it 'compresses specified file' do
tempfile = Tempfile.new('test', path)
filename = File.basename(tempfile.path)
@@ -255,16 +229,14 @@ RSpec.describe Gitlab::ImportExport::CommandLineUtil, feature_category: :importe
end
describe '#gunzip' do
let(:path) { source_dir }
it 'decompresses specified file' do
filename = 'labels.ndjson.gz'
gz_filepath = "spec/fixtures/bulk_imports/gz/#{filename}"
FileUtils.copy_file(gz_filepath, File.join(path, filename))
FileUtils.copy_file(gz_filepath, File.join(tmpdir, filename))
subject.gunzip(dir: path, filename: filename)
subject.gunzip(dir: tmpdir, filename: filename)
expect(File.exist?(File.join(path, 'labels.ndjson'))).to eq(true)
expect(File.exist?(File.join(tmpdir, 'labels.ndjson'))).to eq(true)
end
context 'when exception occurs' do
@@ -278,7 +250,7 @@ RSpec.describe Gitlab::ImportExport::CommandLineUtil, feature_category: :importe
it 'archives a folder without compression' do
archive_file = File.join(archive_dir, 'archive.tar')
result = subject.tar_cf(archive: archive_file, dir: source_dir)
result = subject.tar_cf(archive: archive_file, dir: tmpdir)
expect(result).to eq(true)
expect(File.exist?(archive_file)).to eq(true)
@@ -298,35 +270,29 @@ RSpec.describe Gitlab::ImportExport::CommandLineUtil, feature_category: :importe
end
describe '#untar_zxf' do
let(:tar_archive_fixture) { 'spec/fixtures/symlink_export.tar.gz' }
it_behaves_like 'deletes symlinks', :tar_czf, :untar_zxf
it_behaves_like 'handles shared hard links', :tar_czf, :untar_zxf
it 'has the right mask for project.json' do
subject.untar_zxf(archive: tar_archive_fixture, dir: target_dir)
subject.untar_zxf(archive: archive, dir: path)
expect(file_permissions("#{target_dir}/project.json")).to eq(0755) # originally 777
expect(file_permissions("#{path}/project.json")).to eq(0755) # originally 777
end
it 'has the right mask for uploads' do
subject.untar_zxf(archive: tar_archive_fixture, dir: target_dir)
subject.untar_zxf(archive: archive, dir: path)
expect(file_permissions("#{target_dir}/uploads")).to eq(0755) # originally 555
expect(file_permissions("#{path}/uploads")).to eq(0755) # originally 555
end
end
describe '#untar_xf' do
let(:tar_archive_fixture) { 'spec/fixtures/symlink_export.tar.gz' }
it_behaves_like 'deletes symlinks', :tar_cf, :untar_xf
it_behaves_like 'handles shared hard links', :tar_cf, :untar_xf
it 'extracts archive without decompression' do
filename = 'archive.tar.gz'
archive_file = File.join(archive_dir, 'archive.tar')
FileUtils.copy_file(tar_archive_fixture, File.join(archive_dir, filename))
FileUtils.copy_file(archive, File.join(archive_dir, filename))
subject.gunzip(dir: archive_dir, filename: filename)
result = subject.untar_xf(archive: archive_file, dir: archive_dir)

View file

@@ -2,7 +2,7 @@
require 'spec_helper'
RSpec.describe Gitlab::ImportExport::DecompressedArchiveSizeValidator, feature_category: :importers do
RSpec.describe Gitlab::ImportExport::DecompressedArchiveSizeValidator do
let_it_be(:filepath) { File.join(Dir.tmpdir, 'decompressed_archive_size_validator_spec.gz') }
before(:all) do
@@ -121,7 +121,7 @@ RSpec.describe Gitlab::ImportExport::DecompressedArchiveSizeValidator, feature_c
context 'when archive path is a symlink' do
let(:filepath) { File.join(Dir.tmpdir, 'symlink') }
let(:error_message) { 'Archive path is a symlink or hard link' }
let(:error_message) { 'Archive path is a symlink' }
before do
FileUtils.ln_s(filepath, filepath, force: true)
@@ -132,19 +132,6 @@ RSpec.describe Gitlab::ImportExport::DecompressedArchiveSizeValidator, feature_c
end
end
context 'when archive path shares multiple hard links' do
let(:filesize) { 32 }
let(:error_message) { 'Archive path is a symlink or hard link' }
before do
FileUtils.link(filepath, File.join(Dir.mktmpdir, 'hard_link'))
end
it 'returns false' do
expect(subject).not_to be_valid
end
end
context 'when archive path is not a file' do
let(:filepath) { Dir.mktmpdir }
let(:filesize) { File.size(filepath) }

View file

@@ -2,7 +2,7 @@
require 'spec_helper'
RSpec.describe Gitlab::ImportExport::FileImporter, feature_category: :importers do
RSpec.describe Gitlab::ImportExport::FileImporter do
include ExportFileHelper
let(:shared) { Gitlab::ImportExport::Shared.new(nil) }
@@ -113,73 +113,28 @@ RSpec.describe Gitlab::ImportExport::FileImporter, feature_category: :importers
end
context 'error' do
subject(:import) { described_class.import(importable: build(:project), archive_file: '', shared: shared) }
before do
allow_next_instance_of(described_class) do |instance|
allow(instance).to receive(:wait_for_archived_file).and_raise(StandardError, 'foo')
allow(instance).to receive(:wait_for_archived_file).and_raise(StandardError)
end
described_class.import(importable: build(:project), archive_file: '', shared: shared)
end
it 'removes symlinks in root folder' do
import
expect(File.exist?(symlink_file)).to be false
end
it 'removes hidden symlinks in root folder' do
import
expect(File.exist?(hidden_symlink_file)).to be false
end
it 'removes symlinks in subfolders' do
import
expect(File.exist?(subfolder_symlink_file)).to be false
end
it 'does not remove a valid file' do
import
expect(File.exist?(valid_file)).to be true
end
it 'returns false and sets an error on shared' do
result = import
expect(result).to eq(false)
expect(shared.errors.join).to eq('foo')
end
context 'when files in the archive share hard links' do
let(:hard_link_file) { "#{shared.export_path}/hard_link_file.txt" }
before do
FileUtils.link(valid_file, hard_link_file)
end
it 'returns false and sets an error on shared' do
result = import
expect(result).to eq(false)
expect(shared.errors.join).to eq('File shares hard link')
end
it 'removes all files in export path' do
expect(Dir).to exist(shared.export_path)
expect(File).to exist(symlink_file)
expect(File).to exist(hard_link_file)
expect(File).to exist(valid_file)
import
expect(File).not_to exist(symlink_file)
expect(File).not_to exist(hard_link_file)
expect(File).not_to exist(valid_file)
expect(Dir).not_to exist(shared.export_path)
end
end
end
context 'when file exceeds acceptable decompressed size' do
@@ -202,10 +157,8 @@ RSpec.describe Gitlab::ImportExport::FileImporter, feature_category: :importers
allow(Gitlab::ImportExport::DecompressedArchiveSizeValidator).to receive(:max_bytes).and_return(1)
end
it 'returns false and sets an error on shared' do
result = subject.import
expect(result).to eq(false)
it 'returns false' do
expect(subject.import).to eq(false)
expect(shared.errors.join).to eq('Decompressed archive size validation failed.')
end
end

View file

@@ -35,22 +35,16 @@ RSpec.describe Gitlab::ImportExport::Json::NdjsonReader, feature_category: :impo
expect(subject).to eq(root_tree)
end
context 'when project.json is symlink or hard link' do
using RSpec::Parameterized::TableSyntax
context 'when project.json is symlink' do
it 'raises an error' do
Dir.mktmpdir do |tmpdir|
FileUtils.touch(File.join(tmpdir, 'passwd'))
File.symlink(File.join(tmpdir, 'passwd'), File.join(tmpdir, 'project.json'))
where(:link_method) { [:link, :symlink] }
ndjson_reader = described_class.new(tmpdir)
with_them do
it 'raises an error' do
Dir.mktmpdir do |tmpdir|
FileUtils.touch(File.join(tmpdir, 'passwd'))
FileUtils.send(link_method, File.join(tmpdir, 'passwd'), File.join(tmpdir, 'project.json'))
ndjson_reader = described_class.new(tmpdir)
expect { ndjson_reader.consume_attributes(importable_path) }
.to raise_error(Gitlab::ImportExport::Error, 'Invalid file')
end
expect { ndjson_reader.consume_attributes(importable_path) }
.to raise_error(Gitlab::ImportExport::Error, 'Invalid file')
end
end
end
@@ -103,24 +97,18 @@ RSpec.describe Gitlab::ImportExport::Json::NdjsonReader, feature_category: :impo
end
end
context 'when relation file is a symlink or hard link' do
using RSpec::Parameterized::TableSyntax
context 'when relation file is a symlink' do
it 'yields nothing to the Enumerator' do
Dir.mktmpdir do |tmpdir|
Dir.mkdir(File.join(tmpdir, 'project'))
File.write(File.join(tmpdir, 'passwd'), "{}\n{}")
File.symlink(File.join(tmpdir, 'passwd'), File.join(tmpdir, 'project', 'issues.ndjson'))
where(:link_method) { [:link, :symlink] }
ndjson_reader = described_class.new(tmpdir)
with_them do
it 'yields nothing to the Enumerator' do
Dir.mktmpdir do |tmpdir|
Dir.mkdir(File.join(tmpdir, 'project'))
File.write(File.join(tmpdir, 'passwd'), "{}\n{}")
FileUtils.send(link_method, File.join(tmpdir, 'passwd'), File.join(tmpdir, 'project', 'issues.ndjson'))
result = ndjson_reader.consume_relation(importable_path, 'issues')
ndjson_reader = described_class.new(tmpdir)
result = ndjson_reader.consume_relation(importable_path, 'issues')
expect(result.to_a).to eq([])
end
expect(result.to_a).to eq([])
end
end
end
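The parameterized specs above exercise both `FileUtils.link` and `FileUtils.symlink` variants of a planted `project.json` / relation file. In plain Ruby, without RSpec's `TableSyntax`, the property under test can be sketched like this (the `safe_to_read?` helper is a hypothetical stand-in for the reader's internal validity check):

```ruby
require "fileutils"
require "tmpdir"

# A file is safe to parse only if it is a regular, non-linked file:
# not a symlink, and its inode has exactly one directory entry.
def safe_to_read?(path)
  stat = File.lstat(path)
  stat.file? && !stat.symlink? && stat.nlink == 1
end

# Both link flavors must be rejected, matching the spec's where-table.
results = [:link, :symlink].map do |link_method|
  Dir.mktmpdir do |tmpdir|
    FileUtils.touch(File.join(tmpdir, "passwd"))
    target = File.join(tmpdir, "project.json")
    FileUtils.send(link_method, File.join(tmpdir, "passwd"), target)
    safe_to_read?(target)
  end
end
results # => [false, false]
```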

View file

@@ -4,17 +4,15 @@ require 'spec_helper'
RSpec.describe Gitlab::ImportExport::RecursiveMergeFolders do
describe '.merge' do
it 'merges folder and ignores symlinks and files that share hard links' do
it 'merges folder and ignores symlinks' do
Dir.mktmpdir do |tmpdir|
source = "#{tmpdir}/source"
FileUtils.mkdir_p("#{source}/folder/folder")
FileUtils.touch("#{source}/file1.txt")
FileUtils.touch("#{source}/file_that_shares_hard_links.txt")
FileUtils.touch("#{source}/folder/file2.txt")
FileUtils.touch("#{source}/folder/folder/file3.txt")
FileUtils.ln_s("#{source}/file1.txt", "#{source}/symlink-file1.txt")
FileUtils.ln_s("#{source}/folder", "#{source}/symlink-folder")
FileUtils.link("#{source}/file_that_shares_hard_links.txt", "#{source}/hard_link.txt")
target = "#{tmpdir}/target"
FileUtils.mkdir_p("#{target}/folder/folder")

View file

@@ -9,10 +9,6 @@ RSpec.describe Gitlab::Pages::VirtualHostFinder, feature_category: :pages do
project.update_pages_deployment!(create(:pages_deployment, project: project))
end
before do
stub_pages_setting(host: 'example.com')
end
it 'returns nil when host is empty' do
expect(described_class.new(nil).execute).to be_nil
expect(described_class.new('').execute).to be_nil
@@ -73,7 +69,7 @@ RSpec.describe Gitlab::Pages::VirtualHostFinder, feature_category: :pages do
end
it 'returns the virtual domain with no lookup_paths' do
virtual_domain = described_class.new("#{project.namespace.path}.example.com").execute
virtual_domain = described_class.new("#{project.namespace.path}.#{Settings.pages.host}").execute
expect(virtual_domain).to be_an_instance_of(Pages::VirtualDomain)
expect(virtual_domain.cache_key).to match(/pages_domain_for_namespace_#{project.namespace.id}_/)
@@ -86,7 +82,7 @@ RSpec.describe Gitlab::Pages::VirtualHostFinder, feature_category: :pages do
end
it 'returns the virtual domain with no lookup_paths' do
virtual_domain = described_class.new("#{project.namespace.path}.example.com".downcase).execute
virtual_domain = described_class.new("#{project.namespace.path}.#{Settings.pages.host}".downcase).execute
expect(virtual_domain).to be_an_instance_of(Pages::VirtualDomain)
expect(virtual_domain.cache_key).to be_nil
@@ -108,7 +104,7 @@ RSpec.describe Gitlab::Pages::VirtualHostFinder, feature_category: :pages do
end
it 'returns the virtual domain when there are pages deployed for the project' do
virtual_domain = described_class.new("#{project.namespace.path}.example.com").execute
virtual_domain = described_class.new("#{project.namespace.path}.#{Settings.pages.host}").execute
expect(virtual_domain).to be_an_instance_of(Pages::VirtualDomain)
expect(virtual_domain.cache_key).to match(/pages_domain_for_namespace_#{project.namespace.id}_/)
@@ -117,7 +113,7 @@ RSpec.describe Gitlab::Pages::VirtualHostFinder, feature_category: :pages do
end
it 'finds domain case-insensitively' do
virtual_domain = described_class.new("#{project.namespace.path}.Example.com").execute
virtual_domain = described_class.new("#{project.namespace.path}.#{Settings.pages.host.upcase}").execute
expect(virtual_domain).to be_an_instance_of(Pages::VirtualDomain)
expect(virtual_domain.cache_key).to match(/pages_domain_for_namespace_#{project.namespace.id}_/)
@@ -131,7 +127,7 @@ RSpec.describe Gitlab::Pages::VirtualHostFinder, feature_category: :pages do
end
it 'returns the virtual domain when there are pages deployed for the project' do
virtual_domain = described_class.new("#{project.namespace.path}.example.com").execute
virtual_domain = described_class.new("#{project.namespace.path}.#{Settings.pages.host}").execute
expect(virtual_domain).to be_an_instance_of(Pages::VirtualDomain)
expect(virtual_domain.cache_key).to be_nil
@@ -147,7 +143,7 @@ RSpec.describe Gitlab::Pages::VirtualHostFinder, feature_category: :pages do
project.project_setting.update!(pages_unique_domain: 'unique-domain')
end
subject(:virtual_domain) { described_class.new('unique-domain.example.com').execute }
subject(:virtual_domain) { described_class.new("unique-domain.#{Settings.pages.host.upcase}").execute }
context 'when pages unique domain is enabled' do
before_all do
@@ -175,19 +171,6 @@ RSpec.describe Gitlab::Pages::VirtualHostFinder, feature_category: :pages do
expect(virtual_domain.lookup_paths.first.project_id).to eq(project.id)
end
context 'when a project path conflicts with a unique domain' do
it 'prioritizes the unique domain project' do
group = create(:group, path: 'unique-domain')
other_project = build(:project, path: 'unique-domain.example.com', group: group)
other_project.save!(validate: false)
other_project.update_pages_deployment!(create(:pages_deployment, project: other_project))
other_project.mark_pages_as_deployed
expect(virtual_domain).to be_an_instance_of(Pages::VirtualDomain)
expect(virtual_domain.lookup_paths.first.project_id).to eq(project.id)
end
end
context 'when :cache_pages_domain_api is disabled' do
before do
stub_feature_flags(cache_pages_domain_api: false)
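The hunks above swap the hardcoded `example.com` for the configured `Settings.pages.host`, including case-insensitive matching. A minimal sketch of that host-to-namespace mapping (a hypothetical simplification, not GitLab's actual `VirtualHostFinder`):

```ruby
# Hypothetical simplification: recover the namespace path by stripping the
# configured Pages host suffix, comparing case-insensitively as the specs do.
def namespace_for(host, pages_host)
  suffix = ".#{pages_host}".downcase
  host = host.downcase
  host.end_with?(suffix) ? host.delete_suffix(suffix) : nil
end

puts namespace_for('MyGroup.Example.COM', 'example.com')       # => mygroup
puts namespace_for('mygroup.other.com', 'example.com').inspect # => nil
```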


@@ -1,59 +0,0 @@
# frozen_string_literal: true
require "spec_helper"
RSpec.describe Gitlab::Plantuml, feature_category: :shared do
describe ".configure" do
subject { described_class.configure }
let(:plantuml_url) { "http://plantuml.foo.bar" }
before do
allow(Gitlab::CurrentSettings).to receive(:plantuml_url).and_return(plantuml_url)
end
context "when PlantUML is enabled" do
before do
allow(Gitlab::CurrentSettings).to receive(:plantuml_enabled).and_return(true)
end
it "configures the endpoint URL" do
expect(subject.url).to eq(plantuml_url)
end
it "enables PNG support" do
expect(subject.png_enable).to be_truthy
end
it "disables SVG support" do
expect(subject.svg_enable).to be_falsey
end
it "disables TXT support" do
expect(subject.txt_enable).to be_falsey
end
end
context "when PlantUML is disabled" do
before do
allow(Gitlab::CurrentSettings).to receive(:plantuml_enabled).and_return(false)
end
it "configures the endpoint URL" do
expect(subject.url).to eq(plantuml_url)
end
it "enables PNG support" do
expect(subject.png_enable).to be_falsey
end
it "disables SVG support" do
expect(subject.svg_enable).to be_falsey
end
it "disables TXT support" do
expect(subject.txt_enable).to be_falsey
end
end
end
end


@@ -1,88 +0,0 @@
# frozen_string_literal: true
require 'fast_spec_helper'
RSpec.describe Gitlab::Utils::FileInfo, feature_category: :shared do
let(:tmpdir) { Dir.mktmpdir }
let(:file_path) { "#{tmpdir}/test.txt" }
before do
FileUtils.touch(file_path)
end
after do
FileUtils.rm_rf(tmpdir)
end
describe '.linked?' do
it 'raises an error when file does not exist' do
expect { subject.linked?('foo') }.to raise_error(Errno::ENOENT)
end
shared_examples 'identifies a linked file' do
it 'returns false when file or dir is not a link' do
expect(subject.linked?(tmpdir)).to eq(false)
expect(subject.linked?(file)).to eq(false)
end
it 'returns true when file or dir is symlinked' do
FileUtils.symlink(tmpdir, "#{tmpdir}/symlinked_dir")
FileUtils.symlink(file_path, "#{tmpdir}/symlinked_file.txt")
expect(subject.linked?("#{tmpdir}/symlinked_dir")).to eq(true)
expect(subject.linked?("#{tmpdir}/symlinked_file.txt")).to eq(true)
end
it 'returns true when file has more than one hard link' do
FileUtils.link(file_path, "#{tmpdir}/hardlinked_file.txt")
expect(subject.linked?(file)).to eq(true)
expect(subject.linked?("#{tmpdir}/hardlinked_file.txt")).to eq(true)
end
end
context 'when file is a File::Stat' do
let(:file) { File.lstat(file_path) }
it_behaves_like 'identifies a linked file'
end
context 'when file is path' do
let(:file) { file_path }
it_behaves_like 'identifies a linked file'
end
end
describe '.shares_hard_link?' do
it 'raises an error when file does not exist' do
expect { subject.shares_hard_link?('foo') }.to raise_error(Errno::ENOENT)
end
shared_examples 'identifies a file that shares a hard link' do
it 'returns false when file or dir does not share hard links' do
expect(subject.shares_hard_link?(tmpdir)).to eq(false)
expect(subject.shares_hard_link?(file)).to eq(false)
end
it 'returns true when file has more than one hard link' do
FileUtils.link(file_path, "#{tmpdir}/hardlinked_file.txt")
expect(subject.shares_hard_link?(file)).to eq(true)
expect(subject.shares_hard_link?("#{tmpdir}/hardlinked_file.txt")).to eq(true)
end
end
context 'when file is a File::Stat' do
let(:file) { File.lstat(file_path) }
it_behaves_like 'identifies a file that shares a hard link'
end
context 'when file is path' do
let(:file) { file_path }
it_behaves_like 'identifies a file that shares a hard link'
end
end
end
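The deleted `Gitlab::Utils::FileInfo` spec above covered symlink and hard-link detection. The behaviour it exercised can be sketched with plain `File::Stat` (a hypothetical helper, not the removed implementation):

```ruby
require 'fileutils'
require 'tmpdir'

# Hypothetical helper mirroring the removed spec's expectations: a path is
# "linked" if it is a symlink or its inode has more than one hard link.
module FileInfoSketch
  module_function

  def shares_hard_link?(path)
    File.stat(path).nlink > 1 # File.stat follows symlinks
  end

  def linked?(path)
    stat = File.lstat(path)   # File.lstat does not follow symlinks
    stat.symlink? || stat.nlink > 1
  end
end

Dir.mktmpdir do |dir|
  file = File.join(dir, 'test.txt')
  FileUtils.touch(file)
  puts FileInfoSketch.linked?(file)            # false: regular file, one link
  FileUtils.link(file, File.join(dir, 'hard')) # add a second hard link
  puts FileInfoSketch.linked?(file)            # true
end
```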


@@ -25,7 +25,7 @@ RSpec.describe JSONWebToken::HMACToken do
end
describe '.decode' do
let(:leeway) { described_class::LEEWAY }
let(:leeway) { described_class::IAT_LEEWAY }
let(:decoded_token) { described_class.decode(encoded_token, secret, leeway: leeway) }
context 'with an invalid token' do


@@ -77,36 +77,12 @@ RSpec.describe ProjectSetting, type: :model, feature_category: :projects do
expect(project_setting).not_to be_valid
expect(project_setting.errors.full_messages).to include("Pages unique domain has already been taken")
end
it "validates if the pages_unique_domain already exist as a project path" do
stub_pages_setting(host: 'example.com')
create(:project, path: "random-unique-domain.example.com")
project_setting = build(:project_setting, pages_unique_domain: "random-unique-domain")
expect(project_setting).not_to be_valid
expect(project_setting.errors.full_messages_for(:pages_unique_domain))
.to match(["Pages unique domain already in use"])
end
context "when updating" do
it "validates if the pages_unique_domain already exist as a project path" do
stub_pages_setting(host: 'example.com')
project_setting = create(:project_setting)
create(:project, path: "random-unique-domain.example.com")
expect(project_setting.update(pages_unique_domain: "random-unique-domain")).to eq(false)
expect(project_setting.errors.full_messages_for(:pages_unique_domain))
.to match(["Pages unique domain already in use"])
end
end
end
describe 'target_platforms=' do
it 'stringifies and sorts' do
project_setting = build(:project_setting, target_platforms: [:watchos, :ios])
expect(project_setting.target_platforms).to eq %w[ios watchos]
expect(project_setting.target_platforms).to eq %w(ios watchos)
end
end
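The `target_platforms=` example above expects symbol inputs to be stringified and sorted. A minimal sketch of that normalization (hypothetical, mirroring only the spec's expectation):

```ruby
# Hypothetical normalization matching the expectation in the spec above:
# symbol inputs become strings, returned in sorted order.
def normalize_target_platforms(values)
  values.map(&:to_s).sort
end

puts normalize_target_platforms([:watchos, :ios]).inspect # => ["ios", "watchos"]
```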


@@ -830,37 +830,6 @@ RSpec.describe Project, factory_default: :keep, feature_category: :projects do
expect(project).to be_valid
end
context 'when validating if path already exist as pages unique domain' do
before do
stub_pages_setting(host: 'example.com')
end
it 'rejects paths that match pages unique domain' do
create(:project_setting, pages_unique_domain: 'some-unique-domain')
project = build(:project, path: 'some-unique-domain.example.com')
expect(project).not_to be_valid
expect(project.errors.full_messages_for(:path)).to match(['Path already in use'])
end
it 'accepts path when the host does not match' do
create(:project_setting, pages_unique_domain: 'some-unique-domain')
project = build(:project, path: 'some-unique-domain.another-example.com')
expect(project).to be_valid
end
it 'accepts path when the domain does not match' do
create(:project_setting, pages_unique_domain: 'another-unique-domain')
project = build(:project, path: 'some-unique-domain.example.com')
expect(project).to be_valid
end
end
context 'path is unchanged' do
let_it_be(:invalid_path_project) do
project = create(:project, :repository, :public)
@@ -4856,33 +4825,6 @@ RSpec.describe Project, factory_default: :keep, feature_category: :projects do
project.update!(visibility_level: Gitlab::VisibilityLevel::INTERNAL)
end
context 'when validating if path already exist as pages unique domain' do
before do
stub_pages_setting(host: 'example.com')
end
it 'rejects paths that match pages unique domain' do
stub_pages_setting(host: 'example.com')
create(:project_setting, pages_unique_domain: 'some-unique-domain')
expect(project.update(path: 'some-unique-domain.example.com')).to eq(false)
expect(project.errors.full_messages_for(:path)).to match(['Path already in use'])
end
it 'accepts path when the host does not match' do
create(:project_setting, pages_unique_domain: 'some-unique-domain')
expect(project.update(path: 'some-unique-domain.another-example.com')).to eq(true)
end
it 'accepts path when the domain does not match' do
stub_pages_setting(host: 'example.com')
create(:project_setting, pages_unique_domain: 'another-unique-domain')
expect(project.update(path: 'some-unique-domain.example.com')).to eq(true)
end
end
it 'does not validate the visibility' do
expect(project).not_to receive(:visibility_level_allowed_as_fork).and_call_original
expect(project).not_to receive(:visibility_level_allowed_by_group).and_call_original


@@ -2,13 +2,10 @@
require 'spec_helper'
RSpec.describe Ci::PipelineSchedulePolicy, :models, :clean_gitlab_redis_cache, feature_category: :continuous_integration do
using RSpec::Parameterized::TableSyntax
RSpec.describe Ci::PipelineSchedulePolicy, :models, :clean_gitlab_redis_cache do
let_it_be(:user) { create(:user) }
let_it_be_with_reload(:project) { create(:project, :repository, create_tag: tag_ref_name) }
let_it_be_with_reload(:pipeline_schedule) { create(:ci_pipeline_schedule, :nightly, project: project) }
let_it_be(:tag_ref_name) { "v1.0.0" }
let_it_be(:project) { create(:project, :repository) }
let_it_be(:pipeline_schedule, reload: true) { create(:ci_pipeline_schedule, :nightly, project: project) }
let(:policy) do
described_class.new(user, pipeline_schedule)
@@ -16,143 +13,51 @@ RSpec.describe Ci::PipelineSchedulePolicy, :models, :clean_gitlab_redis_cache, f
describe 'rules' do
describe 'rules for protected ref' do
context 'for branch' do
%w[refs/heads/master master].each do |branch_ref|
context "with #{branch_ref}" do
let_it_be(:branch_ref_name) { "master" }
let_it_be(:branch_pipeline_schedule) do
create(:ci_pipeline_schedule, :nightly, project: project, ref: branch_ref)
end
before do
project.add_developer(user)
end
where(:push_access_level, :merge_access_level, :project_role, :accessible) do
:no_one_can_push | :no_one_can_merge | :owner | :be_disallowed
:no_one_can_push | :no_one_can_merge | :maintainer | :be_disallowed
:no_one_can_push | :no_one_can_merge | :developer | :be_disallowed
:no_one_can_push | :no_one_can_merge | :reporter | :be_disallowed
:no_one_can_push | :no_one_can_merge | :guest | :be_disallowed
context 'when no one can push or merge to the branch' do
before do
create(:protected_branch, :no_one_can_push, name: pipeline_schedule.ref, project: project)
end
:maintainers_can_push | :no_one_can_merge | :owner | :be_allowed
:maintainers_can_push | :no_one_can_merge | :maintainer | :be_allowed
:maintainers_can_push | :no_one_can_merge | :developer | :be_disallowed
:maintainers_can_push | :no_one_can_merge | :reporter | :be_disallowed
:maintainers_can_push | :no_one_can_merge | :guest | :be_disallowed
:developers_can_push | :no_one_can_merge | :owner | :be_allowed
:developers_can_push | :no_one_can_merge | :maintainer | :be_allowed
:developers_can_push | :no_one_can_merge | :developer | :be_allowed
:developers_can_push | :no_one_can_merge | :reporter | :be_disallowed
:developers_can_push | :no_one_can_merge | :guest | :be_disallowed
:no_one_can_push | :maintainers_can_merge | :owner | :be_allowed
:no_one_can_push | :maintainers_can_merge | :maintainer | :be_allowed
:no_one_can_push | :maintainers_can_merge | :developer | :be_disallowed
:no_one_can_push | :maintainers_can_merge | :reporter | :be_disallowed
:no_one_can_push | :maintainers_can_merge | :guest | :be_disallowed
:maintainers_can_push | :maintainers_can_merge | :owner | :be_allowed
:maintainers_can_push | :maintainers_can_merge | :maintainer | :be_allowed
:maintainers_can_push | :maintainers_can_merge | :developer | :be_disallowed
:maintainers_can_push | :maintainers_can_merge | :reporter | :be_disallowed
:maintainers_can_push | :maintainers_can_merge | :guest | :be_disallowed
:developers_can_push | :maintainers_can_merge | :owner | :be_allowed
:developers_can_push | :maintainers_can_merge | :maintainer | :be_allowed
:developers_can_push | :maintainers_can_merge | :developer | :be_allowed
:developers_can_push | :maintainers_can_merge | :reporter | :be_disallowed
:developers_can_push | :maintainers_can_merge | :guest | :be_disallowed
:no_one_can_push | :developers_can_merge | :owner | :be_allowed
:no_one_can_push | :developers_can_merge | :maintainer | :be_allowed
:no_one_can_push | :developers_can_merge | :developer | :be_allowed
:no_one_can_push | :developers_can_merge | :reporter | :be_disallowed
:no_one_can_push | :developers_can_merge | :guest | :be_disallowed
:maintainers_can_push | :developers_can_merge | :owner | :be_allowed
:maintainers_can_push | :developers_can_merge | :maintainer | :be_allowed
:maintainers_can_push | :developers_can_merge | :developer | :be_allowed
:maintainers_can_push | :developers_can_merge | :reporter | :be_disallowed
:maintainers_can_push | :developers_can_merge | :guest | :be_disallowed
:developers_can_push | :developers_can_merge | :owner | :be_allowed
:developers_can_push | :developers_can_merge | :maintainer | :be_allowed
:developers_can_push | :developers_can_merge | :developer | :be_allowed
:developers_can_push | :developers_can_merge | :reporter | :be_disallowed
:developers_can_push | :developers_can_merge | :guest | :be_disallowed
end
with_them do
before do
create(:protected_branch, push_access_level, merge_access_level, name: branch_ref_name,
project: project)
project.add_role(user, project_role)
end
context 'for create_pipeline_schedule' do
subject(:policy) { described_class.new(user, new_branch_pipeline_schedule) }
let(:new_branch_pipeline_schedule) { project.pipeline_schedules.new(ref: branch_ref) }
it { expect(policy).to try(accessible, :create_pipeline_schedule) }
end
context 'for play_pipeline_schedule' do
subject(:policy) { described_class.new(user, branch_pipeline_schedule) }
it { expect(policy).to try(accessible, :play_pipeline_schedule) }
end
end
end
it 'does not include ability to play pipeline schedule' do
expect(policy).to be_disallowed :play_pipeline_schedule
end
end
context 'for tag' do
%w[refs/tags/v1.0.0 v1.0.0].each do |tag_ref|
context "with #{tag_ref}" do
let_it_be(:tag_pipeline_schedule) do
create(:ci_pipeline_schedule, :nightly, project: project, ref: tag_ref)
end
context 'when developers can push to the branch' do
before do
create(:protected_branch, :developers_can_merge, name: pipeline_schedule.ref, project: project)
end
where(:access_level, :project_role, :accessible) do
:no_one_can_create | :owner | :be_disallowed
:no_one_can_create | :maintainer | :be_disallowed
:no_one_can_create | :developer | :be_disallowed
:no_one_can_create | :reporter | :be_disallowed
:no_one_can_create | :guest | :be_disallowed
it 'includes ability to update pipeline' do
expect(policy).to be_allowed :play_pipeline_schedule
end
end
:maintainers_can_create | :owner | :be_allowed
:maintainers_can_create | :maintainer | :be_allowed
:maintainers_can_create | :developer | :be_disallowed
:maintainers_can_create | :reporter | :be_disallowed
:maintainers_can_create | :guest | :be_disallowed
context 'when no one can create the tag' do
let(:tag) { 'v1.0.0' }
:developers_can_create | :owner | :be_allowed
:developers_can_create | :maintainer | :be_allowed
:developers_can_create | :developer | :be_allowed
:developers_can_create | :reporter | :be_disallowed
:developers_can_create | :guest | :be_disallowed
end
before do
pipeline_schedule.update!(ref: tag)
with_them do
before do
create(:protected_tag, access_level, name: tag_ref_name, project: project)
project.add_role(user, project_role)
end
create(:protected_tag, :no_one_can_create, name: pipeline_schedule.ref, project: project)
end
context 'for create_pipeline_schedule' do
subject(:policy) { described_class.new(user, new_tag_pipeline_schedule) }
it 'does not include ability to play pipeline schedule' do
expect(policy).to be_disallowed :play_pipeline_schedule
end
end
let(:new_tag_pipeline_schedule) { project.pipeline_schedules.new(ref: tag_ref) }
context 'when no one can create the tag but it is not a tag' do
before do
create(:protected_tag, :no_one_can_create, name: pipeline_schedule.ref, project: project)
end
it { expect(policy).to try(accessible, :create_pipeline_schedule) }
end
context 'for play_pipeline_schedule' do
subject(:policy) { described_class.new(user, tag_pipeline_schedule) }
it { expect(policy).to try(accessible, :play_pipeline_schedule) }
end
end
end
it 'includes ability to play pipeline schedule' do
expect(policy).to be_allowed :play_pipeline_schedule
end
end
end


@@ -50,17 +50,6 @@ RSpec.describe API::Internal::Base, feature_category: :system_access do
expect(response).to have_gitlab_http_status(:ok)
end
it 'authenticates using a jwt token with an IAT from 10 seconds in the future' do
headers =
travel_to(Time.now + 10.seconds) do
gitlab_shell_internal_api_request_header
end
perform_request(headers: headers)
expect(response).to have_gitlab_http_status(:ok)
end
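The removed example above exercised acceptance of a JWT whose `iat` lies a few seconds in the future. The clock-skew tolerance it relied on can be sketched as follows (a hypothetical check, not the actual token validation code):

```ruby
# Hypothetical sketch: accept an issued-at (iat) timestamp up to `leeway`
# seconds in the future, tolerating clock skew between issuer and verifier.
def iat_valid?(iat, leeway: 60, now: Time.now.to_i)
  iat <= now + leeway
end

now = Time.now.to_i
puts iat_valid?(now + 10,  leeway: 60, now: now) # true: within the leeway
puts iat_valid?(now + 120, leeway: 60, now: now) # false: beyond the leeway
```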
it 'returns 401 when jwt token is expired' do
headers = gitlab_shell_internal_api_request_header


@@ -43,21 +43,13 @@ RSpec.describe BulkImports::ArchiveExtractionService, feature_category: :importe
context 'when archive file is a symlink' do
it 'raises an error' do
FileUtils.ln_s(filepath, File.join(tmpdir, 'symlink'))
FileUtils.ln_s(File.join(tmpdir, filename), File.join(tmpdir, 'symlink'))
expect { described_class.new(tmpdir: tmpdir, filename: 'symlink').execute }
.to raise_error(BulkImports::Error, 'Invalid file')
end
end
context 'when archive file shares multiple hard links' do
it 'raises an error' do
FileUtils.link(filepath, File.join(tmpdir, 'hard_link'))
expect { subject.execute }.to raise_error(BulkImports::Error, 'Invalid file')
end
end
context 'when filepath is being traversed' do
it 'raises an error' do
expect { described_class.new(tmpdir: File.join(Dir.mktmpdir, 'test', '..'), filename: 'name').execute }


@@ -3,8 +3,6 @@
require 'spec_helper'
RSpec.describe BulkImports::FileDecompressionService, feature_category: :importers do
using RSpec::Parameterized::TableSyntax
let_it_be(:tmpdir) { Dir.mktmpdir }
let_it_be(:ndjson_filename) { 'labels.ndjson' }
let_it_be(:ndjson_filepath) { File.join(tmpdir, ndjson_filename) }
@@ -72,68 +70,39 @@ RSpec.describe BulkImports::FileDecompressionService, feature_category: :importe
end
end
shared_examples 'raises an error and removes the file' do |error_message:|
specify do
expect { subject.execute }
.to raise_error(BulkImports::FileDecompressionService::ServiceError, error_message)
expect(File).not_to exist(file)
end
end
shared_context 'when compressed file' do
let_it_be(:file) { File.join(tmpdir, 'file.gz') }
subject { described_class.new(tmpdir: tmpdir, filename: 'file.gz') }
before do
FileUtils.send(link_method, File.join(tmpdir, gz_filename), file)
end
end
shared_context 'when decompressed file' do
let_it_be(:file) { File.join(tmpdir, 'file.txt') }
subject { described_class.new(tmpdir: tmpdir, filename: gz_filename) }
before do
original_file = File.join(tmpdir, 'original_file.txt')
FileUtils.touch(original_file)
FileUtils.send(link_method, original_file, file)
subject.instance_variable_set(:@decompressed_filepath, file)
end
end
context 'when compressed file is a symlink' do
let(:link_method) { :symlink }
let_it_be(:symlink) { File.join(tmpdir, 'symlink.gz') }
include_context 'when compressed file'
before do
FileUtils.ln_s(File.join(tmpdir, gz_filename), symlink)
end
include_examples 'raises an error and removes the file', error_message: 'File decompression error'
end
subject { described_class.new(tmpdir: tmpdir, filename: 'symlink.gz') }
context 'when compressed file shares multiple hard links' do
let(:link_method) { :link }
it 'raises an error and removes the file' do
expect { subject.execute }
.to raise_error(BulkImports::FileDecompressionService::ServiceError, 'File decompression error')
include_context 'when compressed file'
include_examples 'raises an error and removes the file', error_message: 'File decompression error'
expect(File.exist?(symlink)).to eq(false)
end
end
context 'when decompressed file is a symlink' do
let(:link_method) { :symlink }
let_it_be(:symlink) { File.join(tmpdir, 'symlink') }
include_context 'when decompressed file'
before do
FileUtils.ln_s(File.join(tmpdir, ndjson_filename), symlink)
include_examples 'raises an error and removes the file', error_message: 'Invalid file'
end
subject.instance_variable_set(:@decompressed_filepath, symlink)
end
context 'when decompressed file shares multiple hard links' do
let(:link_method) { :link }
subject { described_class.new(tmpdir: tmpdir, filename: gz_filename) }
include_context 'when decompressed file'
it 'raises an error and removes the file' do
expect { subject.execute }.to raise_error(described_class::ServiceError, 'Invalid file')
include_examples 'raises an error and removes the file', error_message: 'Invalid file'
expect(File.exist?(symlink)).to eq(false)
end
end
end
end


@@ -10,7 +10,7 @@ RSpec.describe BulkImports::FileDownloadService, feature_category: :importers do
let_it_be(:content_type) { 'application/octet-stream' }
let_it_be(:content_disposition) { nil }
let_it_be(:filename) { 'file_download_service_spec' }
let_it_be(:tmpdir) { Dir.mktmpdir }
let_it_be(:tmpdir) { Dir.tmpdir }
let_it_be(:filepath) { File.join(tmpdir, filename) }
let_it_be(:content_length) { 1000 }
@@ -247,36 +247,6 @@ RSpec.describe BulkImports::FileDownloadService, feature_category: :importers do
end
end
context 'when file shares multiple hard links' do
let_it_be(:hard_link) { File.join(tmpdir, 'hard_link') }
before do
existing_file = File.join(Dir.mktmpdir, filename)
FileUtils.touch(existing_file)
FileUtils.link(existing_file, hard_link)
end
subject do
described_class.new(
configuration: config,
relative_url: '/test',
tmpdir: tmpdir,
filename: 'hard_link',
file_size_limit: file_size_limit,
allowed_content_types: allowed_content_types
)
end
it 'raises an error and removes the file' do
expect { subject.execute }.to raise_error(
described_class::ServiceError,
'Invalid downloaded file'
)
expect(File.exist?(hard_link)).to eq(false)
end
end
context 'when dir is not in tmpdir' do
subject do
described_class.new(


@@ -6,9 +6,7 @@ RSpec.describe Ci::PipelineSchedules::UpdateService, feature_category: :continuo
let_it_be(:user) { create(:user) }
let_it_be(:reporter) { create(:user) }
let_it_be(:project) { create(:project, :public, :repository) }
let_it_be(:pipeline_schedule) do
create(:ci_pipeline_schedule, project: project, owner: user, ref: 'master')
end
let_it_be(:pipeline_schedule) { create(:ci_pipeline_schedule, project: project, owner: user) }
before_all do
project.add_maintainer(user)


@@ -1134,10 +1134,10 @@
resolved "https://registry.yarnpkg.com/@gitlab/visual-review-tools/-/visual-review-tools-1.7.3.tgz#9ea641146436da388ffbad25d7f2abe0df52c235"
integrity sha512-NMV++7Ew1FSBDN1xiZaauU9tfeSfgDHcOLpn+8bGpP+O5orUPm2Eu66R5eC5gkjBPaXosNAxNWtriee+aFk4+g==
"@gitlab/web-ide@0.0.1-dev-20230713160749-patch-1":
version "0.0.1-dev-20230713160749-patch-1"
resolved "https://registry.yarnpkg.com/@gitlab/web-ide/-/web-ide-0.0.1-dev-20230713160749-patch-1.tgz#6420b55aae444533f9a4bd6269503d98a72aaa2e"
integrity sha512-Dh8XQyPwDY6fkd/A+hTHCqrD23u5qnlaxKu5myyxDEgBNGgu4SGblFU9B6NHNm8eGUZk6Cs5MuMk+NUvWRKbmA==
"@gitlab/web-ide@0.0.1-dev-20230511143809":
version "0.0.1-dev-20230511143809"
resolved "https://registry.yarnpkg.com/@gitlab/web-ide/-/web-ide-0.0.1-dev-20230511143809.tgz#c13dfb4d1edab2e020d4a102d4ec18048917490f"
integrity sha512-caP5WSaTuIhPrPGUWyvPT4np6swkKQHM1Pa9HiBnGhiOhhQ1+3X/+J9EoZXUhnhwiBzS7sp32Uyttam4am/sTA==
"@graphql-eslint/eslint-plugin@3.18.0":
version "3.18.0"