Update upstream source from tag 'upstream/15.6.4+ds2'

Update to upstream version '15.6.4+ds2'
with Debian dir ca2d232154
commit 615ac8dd1f
Author: Mohammed Bilal
Date: 2023-01-13 21:01:02 +05:30
28639 changed files with 0 additions and 15045731 deletions



@@ -1,9 +0,0 @@
# Microsoft Open Source Code of Conduct
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
Resources:
- [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/)
- [Microsoft Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/)
- Contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with questions or concerns


@@ -1,88 +0,0 @@
# Azure SDK for Go Contributing Guide
Thank you for your interest in contributing to Azure SDK for Go.
- For reporting bugs, requesting features, or asking for support, please file an issue in the [issues](https://github.com/Azure/azure-sdk-for-go/issues) section of the project.
- If you would like to become an active contributor to this project please follow the instructions provided in [Microsoft Azure Projects Contribution Guidelines](https://azure.github.io/azure-sdk/policies_opensource.html).
- To make code changes, or contribute something new, please follow the [GitHub Forks / Pull requests model](https://help.github.com/articles/fork-a-repo/): Fork the repo, make the change and propose it back by submitting a pull request.
## Pull Requests
- **DO** follow the API design and implementation [Go Guidelines](https://azure.github.io/azure-sdk/golang_introduction.html).
- When submitting large changes or features, **DO** have an issue or spec doc that describes the design, usage, and motivating scenario.
- **DO** submit all code changes via pull requests (PRs) rather than through a direct commit. PRs will be reviewed and potentially merged by the repo maintainers after a peer review that includes at least one maintainer.
- **DO** review your own PR to make sure there are no unintended changes or commits before submitting it.
- **DO NOT** submit "work in progress" PRs. A PR should only be submitted when it is considered ready for review and subsequent merging by the contributor.
- If the change is work-in-progress or an experiment, **DO** start off as a temporary draft PR.
- **DO** give PRs short-but-descriptive names (e.g. "Improve code coverage for Azure.Core by 10%", not "Fix #1234") and add a description which explains why the change is being made.
- **DO** refer to any relevant issues, and include [keywords](https://help.github.com/articles/closing-issues-via-commit-messages/) that automatically close issues when the PR is merged.
- **DO** tag any users that should know about and/or review the change.
- **DO** ensure each commit successfully builds. The entire PR must pass all tests in the Continuous Integration (CI) system before it'll be merged.
- **DO** address PR feedback in follow-up commits rather than by amending the existing commits, and only rebase/squash them when necessary. This makes it easier for reviewers to track changes.
- **DO** assume that ["Squash and Merge"](https://github.com/blog/2141-squash-your-commits) will be used to merge your commit unless you request otherwise in the PR.
- **DO NOT** mix independent, unrelated changes in one PR. Separate real product/test code changes from larger code formatting/dead code removal changes. Separate unrelated fixes into separate PRs, especially if they are in different modules or files that otherwise wouldn't be changed.
- **DO** comment your code focusing on "why", where necessary. Otherwise, aim to keep it self-documenting with appropriate names and style.
- **DO** add [GoDoc style comments](https://azure.github.io/azure-sdk/golang_introduction.html#documentation-style) when adding new APIs or modifying header files.
- **DO** make sure there are no typos or spelling errors, especially in user-facing documentation.
- **DO** verify if your changes have impact elsewhere. For instance, do you need to update other docs or existing markdown files that might be impacted?
- **DO** add relevant unit tests to ensure CI will catch future regressions.
## Merging Pull Requests (for project contributors with write access)
- **DO** use ["Squash and Merge"](https://github.com/blog/2141-squash-your-commits) by default for individual contributions unless requested by the PR author.
  Do so even if the PR contains only one commit; it creates a simpler history than "Create a Merge Commit".
  Reasons that PR authors may request "Create a Merge Commit" include (but are not limited to):
  - The change is easier to understand as a series of focused commits. Each commit in the series must be buildable so as not to break `git bisect`.
  - The contributor is using an e-mail address other than the primary GitHub address and wants that preserved in the history. The contributor must be willing to squash the commits manually before acceptance.
## Developer Guide
### Repo structure
Most packages under the `services` directory in the SDK are generated from [Azure API specs][azure_rest_specs]
using [Azure/autorest.go][] and [Azure/autorest][]. These generated packages depend on the HTTP client implemented at [Azure/go-autorest][]. Therefore, when contributing, please make sure you do not change anything under the `services` directory.
[azure_rest_specs]: https://github.com/Azure/azure-rest-api-specs
[azure/autorest]: https://github.com/Azure/autorest
[azure/autorest.go]: https://github.com/Azure/autorest.go
[azure/go-autorest]: https://github.com/Azure/go-autorest
For bugs or feature requests, you can submit them using the [GitHub issues page][issues], filling in the appropriate template.
### Codespaces
Codespaces is a new technology that allows you to use a container as your development environment. This repo provides a Codespaces container which is supported by both GitHub Codespaces and VS Code Codespaces.
#### GitHub Codespaces
1. From the Azure SDK GitHub repo, click on the "Code -> Open with Codespaces" button.
1. Open a Terminal. The development environment will be ready for you. Continue to [Building and Testing](https://github.com/Azure/azure-sdk-for-go/blob/main/CONTRIBUTING.md#building-and-testing).
#### VS Code Codespaces
1. Install the [VS Code Remote Extension Pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack)
1. When you open the Azure SDK for Go repo in VS Code, it will prompt you to open the project in the Dev Container. If it does not prompt you, then hit CTRL+P, and select "Remote-Containers: Open Folder in Container..."
1. Open a Terminal. The development environment will be ready for you. Continue to [Building and Testing](https://github.com/Azure/azure-sdk-for-go/blob/main/CONTRIBUTING.md#building-and-testing).
### Building and Testing
#### Building
SDKs are either old (track 1) or new (track 2):
- Old (Track 1) SDKs are found in the services/ and profiles/ top level folders.
- CI is in /azure-pipelines.yml
- New (Track 2) SDKs are found in the sdk/ top level folder.
- CI is in /eng/pipelines/templates/steps/build.yml
To build, run `go build` from the respective SDK directory.
There currently is not a repository wide way to build or regenerate code.
#### Testing
To test, run `go test` from the respective directory.
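For example, a minimal test file that `go test` would discover and run (the package and test here are hypothetical, for illustration only):

```go
package example // hypothetical package, for illustration only

import "testing"

// TestAdd is a minimal sketch of a test function that `go test` discovers and runs.
func TestAdd(t *testing.T) {
	if got := 1 + 1; got != 2 {
		t.Fatalf("expected 2, got %d", got)
	}
}
```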


@@ -1,391 +0,0 @@
# This file is autogenerated, do not edit; changes may be undone by the next 'dep ensure'.
[[projects]]
branch = "master"
digest = "1:ec784146416ff958c6dbde13b831b2a1155e3d719702347a1b9bd255664b8d9c"
name = "github.com/Azure/go-autorest"
packages = [
".",
"autorest",
"autorest/adal",
"autorest/azure",
"autorest/azure/auth",
"autorest/azure/cli",
"autorest/date",
"autorest/to",
"autorest/validation",
"logger",
"tracing",
]
pruneopts = "UT"
revision = "c7f947c0610de1bc279f76e6d453353f95cd1bfa"
[[projects]]
digest = "1:acd3946bd57e665227c8ebbe07fbd378422780999e332e9c3748a6c9144d020f"
name = "github.com/Masterminds/semver"
packages = ["."]
pruneopts = "UT"
revision = "7bb0c843b53d6ad21a3f619cb22c4b442bb3ef3e"
version = "v3.1.1"
[[projects]]
digest = "1:66d043a2967e0aa8c3d66a4c150b5f1eb65c8f3126cc083208bd58a0ca2ecaca"
name = "github.com/dimchansky/utfbom"
packages = ["."]
pruneopts = "UT"
revision = "6ae8f945ca96f30defc7e8ab12ec5d10cf86ded4"
version = "v1.1.1"
[[projects]]
branch = "master"
digest = "1:8eeffdeea12263d2f33a1a915db373be27a8b198b6e9526899b25446aee0984b"
name = "github.com/dnaeon/go-vcr"
packages = [
"cassette",
"recorder",
]
pruneopts = "UT"
revision = "7e02de29c04550ab0fa86feab15f1c0c13425e28"
[[projects]]
digest = "1:3be7b7206a3fd8686175fd974eb33a1c9dd109c149bdf19801e32b8f48733530"
name = "github.com/form3tech-oss/jwt-go"
packages = ["."]
pruneopts = "UT"
revision = "9162a5abdbc046b7c8b03ee90052cee67e25caa7"
version = "v3.2.2"
[[projects]]
branch = "master"
digest = "1:78102ee4d536347316bc42e818340cc50902e45dbd7fdd524c5a1fc0cb07b588"
name = "github.com/globalsign/mgo"
packages = [
".",
"bson",
"internal/json",
"internal/sasl",
"internal/scram",
]
pruneopts = "UT"
revision = "eeefdecb41b842af6dc652aaea4026e8403e62df"
[[projects]]
digest = "1:b4c37f161f9abc89aede53a621205cda38d2df6659dff7b8044d8295ed24291f"
name = "github.com/gofrs/uuid"
packages = ["."]
pruneopts = "UT"
revision = "4b36aa0ea796d49630bbae624f8f05ab7013af77"
version = "v4.0.0"
[[projects]]
digest = "1:9e62e8886ca549ad17aa4db1783e469f10e424652595304fdc2c8ecda5d25476"
name = "github.com/golang/protobuf"
packages = ["proto"]
pruneopts = "UT"
revision = "ae97035608a719c7a1c1c41bed0ae0744bdb0c6f"
version = "v1.5.2"
[[projects]]
digest = "1:d06fbb4c29d4266211e7fa20d29b61469db346e6df1bea40a70cea066cf25019"
name = "github.com/hashicorp/errwrap"
packages = ["."]
pruneopts = "UT"
revision = "7b00e5db719c64d14dd0caaacbd13e76254d02c0"
version = "v1.1.0"
[[projects]]
digest = "1:71c646a40ec9bcfcc93ce31cde043356cb95670203d219c7edede05779081fe4"
name = "github.com/hashicorp/go-multierror"
packages = ["."]
pruneopts = "UT"
revision = "9974e9ec57696378079ecc3accd3d6f29401b3a0"
version = "v1.1.1"
[[projects]]
digest = "1:870d441fe217b8e689d7949fef6e43efbc787e50f200cb1e70dbca9204a1d6be"
name = "github.com/inconshreveable/mousetrap"
packages = ["."]
pruneopts = "UT"
revision = "76626ae9c91c4f2a10f34cad8ce83ea42c93bb75"
version = "v1.0"
[[projects]]
branch = "master"
digest = "1:fbb258160f42c1534d260e05ff1748ae24cd250b27775aa02c33b4fa4846c752"
name = "github.com/jongio/azidext"
packages = ["go/azidext"]
pruneopts = "UT"
revision = "0df38e7e4890b066a2e6d5782e947789de945d23"
[[projects]]
digest = "1:759375161ead34b8a830e14e75196bb51a999f125531d2faf0a1a944551a586f"
name = "github.com/kr/pretty"
packages = ["."]
pruneopts = "UT"
revision = "ead452280cd055b2ae8a7f0db5eb37a878d902f7"
version = "v0.2.1"
[[projects]]
digest = "1:7218fd69ff5436d016101bbc6183cdc289aa45ac37b48e78846318e4ef389bea"
name = "github.com/kr/text"
packages = ["."]
pruneopts = "UT"
revision = "702c74938df48b97370179f33ce2107bd7ff3b3e"
version = "v0.2.0"
[[projects]]
digest = "1:5d231480e1c64a726869bc4142d270184c419749d34f167646baa21008eb0a79"
name = "github.com/mitchellh/go-homedir"
packages = ["."]
pruneopts = "UT"
revision = "af06845cf3004701891bf4fdb884bfe4920b3727"
version = "v1.1.0"
[[projects]]
branch = "master"
digest = "1:3df1aab3f223cf94c6d7f566cc248903dff5bd7677769d5198ffd66cf7e9753e"
name = "github.com/pkg/browser"
packages = ["."]
pruneopts = "UT"
revision = "ce105d075bb4b07d775ac3e595dececaa077eab1"
[[projects]]
digest = "1:ae3c4c486b4de2c71f6e1b9e53d125d13c80aaecc549e9a93342e8274ec761f4"
name = "github.com/shopspring/decimal"
packages = ["."]
pruneopts = "UT"
revision = "2568a29459476f824f35433dfbef158d6ad8618c"
version = "v1.2.0"
[[projects]]
digest = "1:aa19e3e16e7c19e615a088d41b801a441eef93651fca6913e7a2a46598129326"
name = "github.com/spf13/cobra"
packages = ["."]
pruneopts = "UT"
revision = "8380ddd3132bdf8fd77731725b550c181dda0aa8"
version = "v1.1.3"
[[projects]]
digest = "1:524b71991fc7d9246cc7dc2d9e0886ccb97648091c63e30eef619e6862c955dd"
name = "github.com/spf13/pflag"
packages = ["."]
pruneopts = "UT"
revision = "2e9d26c8c37aae03e3f9d4e90b7116f5accb7cab"
version = "v1.0.5"
[[projects]]
branch = "master"
digest = "1:3c9c113a5e5475facaa56d1690def66c4b5077d1c57c190c180835cae6440118"
name = "golang.org/x/crypto"
packages = [
"pkcs12",
"pkcs12/internal/rc2",
]
pruneopts = "UT"
revision = "4f45737414dc36c02b62c3adac99418bd1fa70db"
[[projects]]
digest = "1:719c9561433767f273d18f7508bb0f00c098caee9af48cbb7d02c9e6d2354b32"
name = "golang.org/x/mod"
packages = [
"module",
"semver",
]
pruneopts = "UT"
revision = "d6ab96f2441f9631f81862375ef66782fc4a9c12"
version = "v0.4.2"
[[projects]]
branch = "master"
digest = "1:60397b4be0f6f80cf41b1b29863778b5afcb7d9cf4ccfd2d00bf438738caa888"
name = "golang.org/x/net"
packages = [
"context",
"context/ctxhttp",
"http/httpguts",
"http2",
"http2/hpack",
"idna",
]
pruneopts = "UT"
revision = "e915ea6b2b7d7f3955e2d6d432eaebd7cf5921e7"
[[projects]]
branch = "master"
digest = "1:f61fe67341c0aa7358c3050eb03d66e4484721526eef17c74e43bd648518492e"
name = "golang.org/x/oauth2"
packages = [
".",
"internal",
]
pruneopts = "UT"
revision = "f6687ab2804cbebdfdeef385bee94918b1ce83de"
[[projects]]
branch = "master"
digest = "1:2e2b5a89dfa972ce677ab8673bee4fc57699b431f43ca9fdb45fedddbe685691"
name = "golang.org/x/sys"
packages = [
"execabs",
"internal/unsafeheader",
"windows",
]
pruneopts = "UT"
revision = "66c3f260301cac915959651293c11c8207d331e8"
[[projects]]
digest = "1:d343bfab7897dd73ee9bf8172d1fc83d935b7512067856ed596aaf57a9345f3c"
name = "golang.org/x/text"
packages = [
"collate",
"collate/build",
"internal/colltab",
"internal/gen",
"internal/language",
"internal/language/compact",
"internal/tag",
"internal/triegen",
"internal/ucd",
"language",
"secure/bidirule",
"transform",
"unicode/bidi",
"unicode/cldr",
"unicode/norm",
"unicode/rangetable",
]
pruneopts = "UT"
revision = "e328d63cff14134669501e0e154e4f141c784322"
version = "v0.3.6"
[[projects]]
branch = "master"
digest = "1:880fe1e6068d71dc4bc55bb29b5e388d296b76b2598153025a02041699b26e83"
name = "golang.org/x/tools"
packages = [
"go/ast/astutil",
"imports",
"internal/event",
"internal/event/core",
"internal/event/keys",
"internal/event/label",
"internal/fastwalk",
"internal/gocommand",
"internal/gopathwalk",
"internal/imports",
]
pruneopts = "UT"
revision = "07295caad09cddc39fc3b5631934f4780c2014de"
[[projects]]
branch = "master"
digest = "1:918a46e4a2fb83df33f668f5a6bd51b2996775d073fce1800d3ec01b0a5ddd2b"
name = "golang.org/x/xerrors"
packages = [
".",
"internal",
]
pruneopts = "UT"
revision = "5ec99f83aff198f5fbd629d6c8d8eb38a04218ca"
[[projects]]
digest = "1:15bbb120d95283019f8891f2762fab3e735075b2cf9b378497277d2fe6c4abca"
name = "google.golang.org/appengine"
packages = [
"internal",
"internal/base",
"internal/datastore",
"internal/log",
"internal/remote_api",
"internal/urlfetch",
"urlfetch",
]
pruneopts = "UT"
revision = "5d1c1d03f8703c2e81478d9a30e9afa2d3e4bd8a"
version = "v1.6.7"
[[projects]]
digest = "1:bc8a58ba53d55c6c60aba097ba2981a37cf074c8c19e852763529dbc0ae0eb05"
name = "google.golang.org/protobuf"
packages = [
"encoding/prototext",
"encoding/protowire",
"internal/descfmt",
"internal/descopts",
"internal/detrand",
"internal/encoding/defval",
"internal/encoding/messageset",
"internal/encoding/tag",
"internal/encoding/text",
"internal/errors",
"internal/filedesc",
"internal/filetype",
"internal/flags",
"internal/genid",
"internal/impl",
"internal/order",
"internal/pragma",
"internal/set",
"internal/strs",
"internal/version",
"proto",
"reflect/protodesc",
"reflect/protoreflect",
"reflect/protoregistry",
"runtime/protoiface",
"runtime/protoimpl",
"types/descriptorpb",
]
pruneopts = "UT"
revision = "f2d1f6cbe10b90d22296ea09a7217081c2798009"
version = "v1.26.0"
[[projects]]
branch = "v1"
digest = "1:b39f3febddd151ee9993a0a10c20393efeddb1d31c0c706c9d03d054dbafd308"
name = "gopkg.in/check.v1"
packages = ["."]
pruneopts = "UT"
revision = "10cb98267c6cb43ea9cd6793f29ff4089c306974"
[[projects]]
digest = "1:5054a1f394226de9e6ddc47b0ba77e35092a4112f4a1cd9cb94aba1f5bdc3ec6"
name = "gopkg.in/yaml.v2"
packages = ["."]
pruneopts = "UT"
revision = "7649d4548cb53a614db133b2a8ac1f31859dda8c"
version = "v2.4.0"
[solve-meta]
analyzer-name = "dep"
analyzer-version = 1
input-imports = [
"github.com/Azure/go-autorest/autorest",
"github.com/Azure/go-autorest/autorest/adal",
"github.com/Azure/go-autorest/autorest/azure",
"github.com/Azure/go-autorest/autorest/azure/auth",
"github.com/Azure/go-autorest/autorest/date",
"github.com/Azure/go-autorest/autorest/to",
"github.com/Azure/go-autorest/autorest/validation",
"github.com/Azure/go-autorest/tracing",
"github.com/Masterminds/semver",
"github.com/dnaeon/go-vcr/cassette",
"github.com/dnaeon/go-vcr/recorder",
"github.com/globalsign/mgo",
"github.com/gofrs/uuid",
"github.com/hashicorp/go-multierror",
"github.com/jongio/azidext/go/azidext",
"github.com/pkg/browser",
"github.com/shopspring/decimal",
"github.com/spf13/cobra",
"github.com/spf13/pflag",
"golang.org/x/crypto/pkcs12",
"golang.org/x/net/http/httpguts",
"golang.org/x/net/http2",
"golang.org/x/oauth2",
"golang.org/x/tools/imports",
"gopkg.in/check.v1",
]
solver-name = "gps-cdcl"
solver-version = 1


@@ -1,58 +0,0 @@
# Gopkg.toml example
#
# Refer to https://github.com/golang/dep/blob/master/docs/Gopkg.toml.md
# for detailed Gopkg.toml documentation.
#
# required = ["github.com/user/thing/cmd/thing"]
# ignored = ["github.com/user/project/pkgX", "bitbucket.org/user/project/pkgA/pkgY"]
#
# [[constraint]]
# name = "github.com/user/project"
# version = "1.0.0"
#
# [[constraint]]
# name = "github.com/user/project2"
# branch = "dev"
# source = "github.com/myfork/project2"
#
# [[override]]
# name = "github.com/x/y"
# version = "2.4.0"
ignored = ["github.com/ahmetb/go-linq/*", "github.com/google/go-github/*", "github.com/go-git/go-git/*", "github.com/Azure/azure-sdk-for-go/sdk/*", "github.com/Azure/azure-amqp-common-go/*"]
[prune]
unused-packages = true
go-tests = true
[[constraint]]
branch = "master"
name = "github.com/Azure/go-autorest"
[[constraint]]
branch = "master"
name = "github.com/dnaeon/go-vcr"
[[constraint]]
branch = "master"
name = "github.com/globalsign/mgo"
[[constraint]]
name = "github.com/gofrs/uuid"
version = "4.0.0"
[[constraint]]
name = "github.com/shopspring/decimal"
version = "1.0.0"
[[constraint]]
branch = "master"
name = "golang.org/x/crypto"
[[constraint]]
branch = "master"
name = "golang.org/x/tools"
[[constraint]]
branch = "v1"
name = "gopkg.in/check.v1"


@@ -1,21 +0,0 @@
The MIT License (MIT)
Copyright (c) Microsoft Corporation.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.


@@ -1,29 +0,0 @@
NOTICES AND INFORMATION
Do Not Translate or Localize
This software incorporates material from third parties. Microsoft makes certain
open source code available at https://3rdpartysource.microsoft.com, or you may
send a check or money order for US $5.00, including the product name, the open
source component name, and version number, to:
Source Code Compliance Team
Microsoft Corporation
One Microsoft Way
Redmond, WA 98052
USA
Notwithstanding any other terms, you may reverse engineer this software to the
extent required to debug changes to any libraries licensed under the GNU Lesser
General Public License.
------------------------------------------------------------------------------
Azure SDK for Go uses third-party libraries or other resources that may be
distributed under licenses different than the Azure SDK for Go software.
In the event that we accidentally failed to list a required notice, please
bring it to our attention. Post an issue or email us:
azgosdkhelp@microsoft.com
The attached notices are provided for information only.


@@ -1,100 +0,0 @@
# Azure SDK for Go
[![godoc](https://godoc.org/github.com/Azure/azure-sdk-for-go?status.svg)](https://godoc.org/github.com/Azure/azure-sdk-for-go)
[![Build Status](https://dev.azure.com/azure-sdk/public/_apis/build/status/go/Azure.azure-sdk-for-go?branchName=main)](https://dev.azure.com/azure-sdk/public/_build/latest?definitionId=640&branchName=main)
This repository is for active development of the Azure SDK for Go. If you are a consumer of the SDK, you can follow the links below to visit the documentation you are interested in:
* [Overview of Azure SDK for Go](https://docs.microsoft.com/azure/developer/go/)
* [SDK Reference](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go)
* [Code Samples for Azure Go SDK](https://github.com/azure-samples/azure-sdk-for-go-samples)
* [Azure REST API Docs](https://docs.microsoft.com/rest/api/)
* [General Azure Docs](https://docs.microsoft.com/azure)
## Getting Started
To get started with a library, see the README.md file located in the library's project folder. You can find these library folders grouped by service in the `/sdk` directory.
> NOTE: Go **1.18** or later is required.
## Packages available
Each service might have a number of libraries available from each of the following categories:
* [Client - New Releases](#client-new-releases)
* [Client - Previous Versions](#client-previous-versions)
* [Management - New Releases](#management-new-releases)
* [Management - Previous Versions](#management-previous-versions)
### Client: New Releases
We have a new wave of packages that are being announced as **stable** and several that are currently released in **beta**. These libraries allow you to use, consume, and interact with existing resources, for example, uploading a blob. These libraries share a number of core functionalities including retries, logging, transport protocols, authentication protocols, etc. that can be found in the [azcore](https://github.com/Azure/azure-sdk-for-go/blob/main/sdk/azcore) library. You can learn more about these libraries by reading about the [Azure SDK Go guidelines](https://azure.github.io/azure-sdk/golang_introduction.html).
You can find the most up-to-date list of new packages on our [latest page](https://azure.github.io/azure-sdk/releases/latest/index.html#go). These new libraries can be identified by locating them under the [`sdk`](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk) directory in the repository.
> NOTE: If you need to ensure your code is ready for production, use one of the stable, non-beta libraries.
### Client: Previous Versions
The last stable versions of packages that have been provided for usage with Azure are production-ready. These libraries might not implement the [Azure Go SDK guidelines](https://azure.github.io/azure-sdk/golang_introduction.html) or have the same feature set as the New releases, however they do offer a wider coverage of services.
Previous Go SDK packages are located under [/services folder](https://github.com/Azure/azure-sdk-for-go/tree/master/services), and you can see the full list [on this page](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/services).
### Management: New Releases
A new set of management libraries that follow the [Azure SDK Design Guidelines for Go](https://azure.github.io/azure-sdk/golang_introduction.html) are available at `sdk/resourcemanagement`. These new libraries provide a number of core capabilities that are shared amongst all Azure SDKs, including the intuitive Azure Identity library, an HTTP Pipeline with custom policies, error-handling, distributed tracing, and much more.
To get started, please follow the [quickstart guide here](https://aka.ms/azsdk/go/mgmt). To see the benefits of migrating and how to migrate to the new libraries, please visit the [migration guide](https://aka.ms/azsdk/go/mgmt/migration).
You can find the [most up-to-date list of all of the new packages on our page](https://azure.github.io/azure-sdk/releases/latest/mgmt/go.html).
> NOTE: If you need to ensure your code is ready for production, use one of the stable, non-beta libraries. Also, if you are experiencing authentication issues with the management libraries after upgrading certain packages, it's possible that you upgraded to the new versions of the SDK without changing the authentication code. Please refer to the migration guide for proper instructions.
* [Quickstart tutorial for new releases](https://aka.ms/azsdk/go/mgmt). Documentation is also available at each readme file of the individual module (Example: [Readme for Compute Module](https://github.com/Azure/azure-sdk-for-go/tree/main/sdk/resourcemanager/compute/armcompute))
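For illustration, creating a client with the new management libraries generally follows the sketch below, assuming the `azidentity` and `armresources` modules described above (a minimal sketch, not a complete program):

```go
// Authenticate with azidentity, then create a typed resource-management client.
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
	log.Fatal(err)
}
client, err := armresources.NewResourceGroupsClient("<SubscriptionId>", cred, nil)
if err != nil {
	log.Fatal(err)
}
```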
### Management: Previous Versions
For a complete list of management libraries which enable you to provision and manage Azure resources, please [check here](https://azure.github.io/azure-sdk/releases/latest/all/go.html). They might not have the same feature set as the new releases but they do offer wider coverage of services.
Previous packages are located under [/services folder](https://github.com/Azure/azure-sdk-for-go/tree/master/services), and you can see the full list [on this page](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/services).
* [Quickstart tutorial for previous versions](https://aka.ms/azsdk/go/mgmt/previous)
## Other Azure Go Packages
Azure provides several other packages for using services from Go, listed below. These packages do NOT follow the New Release guidelines.
| Service | Import Path/Repo |
| -------------------- | -------------------------------------------------------------------------------------------------- |
| Storage - Blobs | [github.com/Azure/azure-storage-blob-go](https://github.com/Azure/azure-storage-blob-go) |
| Storage - Files | [github.com/Azure/azure-storage-file-go](https://github.com/Azure/azure-storage-file-go) |
| Storage - Queues | [github.com/Azure/azure-storage-queue-go](https://github.com/Azure/azure-storage-queue-go) |
| Event Hubs | [github.com/Azure/azure-event-hubs-go](https://github.com/Azure/azure-event-hubs-go) |
| Application Insights | [github.com/Microsoft/ApplicationInsights-go](https://github.com/Microsoft/ApplicationInsights-go) |
## Samples
More code samples for using the management library for Go SDK can be found in the following locations:
- [Go SDK Code Samples Repo (new version)](https://aka.ms/azsdk/go/mgmt/samples)
- [Go SDK Code Samples Repo](https://github.com/azure-samples/azure-sdk-for-go-samples)
- Example files under each package. For example, examples for Network packages can be [found here](https://github.com/Azure/azure-sdk-for-go/blob/main/sdk/resourcemanager/network/armnetwork/ze_generated_example_loadbalancernetworkinterfaces_client_test.go)
## Reporting security issues and security bugs
Security issues and bugs should be reported privately, via email, to the Microsoft Security Response Center (MSRC) <secure@microsoft.com>. You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Further information, including the MSRC PGP key, can be found in the [Security TechCenter](https://www.microsoft.com/msrc/faqs-report-an-issue).
## Need help?
* File an issue via [GitHub Issues](https://github.com/Azure/azure-sdk-for-go/issues)
* Check [previous questions](https://stackoverflow.com/questions/tagged/azure+go) or ask new ones on StackOverflow using `azure` and `go` tags.
## Community
* Chat with us in the **[#Azure SDK channel](https://gophers.slack.com/messages/CA7HK8EEP)** on the [Gophers Slack](https://gophers.slack.com/). Sign up [here](https://invite.slack.golangbridge.org) first if necessary.
## Contribute
See [CONTRIBUTING.md](https://github.com/Azure/azure-sdk-for-go/blob/main/CONTRIBUTING.md).
## Trademarks
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow [Microsoft's Trademark & Brand Guidelines](https://www.microsoft.com/legal/intellectualproperty/trademarks/usage/general). Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.


@@ -1,41 +0,0 @@
<!-- BEGIN MICROSOFT SECURITY.MD V0.0.5 BLOCK -->
## Security
Microsoft takes the security of our software products and services seriously, which includes all source code repositories managed through our GitHub organizations, which include [Microsoft](https://github.com/Microsoft), [Azure](https://github.com/Azure), [DotNet](https://github.com/dotnet), [AspNet](https://github.com/aspnet), [Xamarin](https://github.com/xamarin), and [our GitHub organizations](https://opensource.microsoft.com/).
If you believe you have found a security vulnerability in any Microsoft-owned repository that meets [Microsoft's definition of a security vulnerability](https://docs.microsoft.com/previous-versions/tn-archive/cc751383(v=technet.10)), please report it to us as described below.
## Reporting Security Issues
**Please do not report security vulnerabilities through public GitHub issues.**
Instead, please report them to the Microsoft Security Response Center (MSRC) at [https://msrc.microsoft.com/create-report](https://msrc.microsoft.com/create-report).
If you prefer to submit without logging in, send email to [secure@microsoft.com](mailto:secure@microsoft.com). If possible, encrypt your message with our PGP key; please download it from the [Microsoft Security Response Center PGP Key page](https://www.microsoft.com/msrc/pgp-key-msrc).
You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Additional information can be found at [microsoft.com/msrc](https://www.microsoft.com/msrc).
Please include the requested information listed below (as much as you can provide) to help us better understand the nature and scope of the possible issue:
* Type of issue (e.g. buffer overflow, SQL injection, cross-site scripting, etc.)
* Full paths of source file(s) related to the manifestation of the issue
* The location of the affected source code (tag/branch/commit or direct URL)
* Any special configuration required to reproduce the issue
* Step-by-step instructions to reproduce the issue
* Proof-of-concept or exploit code (if possible)
* Impact of the issue, including how an attacker might exploit the issue
This information will help us triage your report more quickly.
If you are reporting for a bug bounty, more complete reports can contribute to a higher bounty award. Please visit our [Microsoft Bug Bounty Program](https://microsoft.com/msrc/bounty) page for more details about our active programs.
## Preferred Languages
We prefer all communications to be in English.
## Policy
Microsoft follows the principle of [Coordinated Vulnerability Disclosure](https://www.microsoft.com/msrc/cvd).
<!-- END MICROSOFT SECURITY.MD BLOCK -->


@@ -1,28 +0,0 @@
# Support
## How to file issues and get help
Customers with an [Azure support plan](https://azure.microsoft.com/support/options/) can open an [Azure support ticket](https://azure.microsoft.com/support/create-ticket/).
**We recommend this option if your problem requires immediate attention.**
### Github issues
We use [GitHub Issues](https://github.com/Azure/azure-sdk-for-go/issues/new/choose) to track bugs, questions, and feature requests.
GitHub issues are free, but **response time is not guaranteed.** See [GitHub issues support process](https://devblogs.microsoft.com/azure-sdk/github-issue-support-process/) for more details.
### Community resources
- Search for similar issues in [our GitHub repository](https://github.com/Azure/azure-sdk-for-go/issues)
- Ask a question on [StackOverflow](https://stackoverflow.com/questions/tagged/azure-sdk+go) and tag it with "azure-sdk" and "go"
- Share or upvote feature requests on [Feedback Page](https://feedback.azure.com/forums/34192--general-feedback).
- Take a look at the [Azure SDK blog](https://devblogs.microsoft.com/azure-sdk/).
- Ask a question on [Twitter](https://twitter.com/AzureSDK)
- Ask a question at [Microsoft Q&A](https://docs.microsoft.com/answers/products/azure?WT.mc_id=Portal-Microsoft_Azure_Support&product=all)
- Ask a question at [Microsoft Tech Community](https://techcommunity.microsoft.com/t5/azure/ct-p/Azure)
### Security bugs
Security issues and bugs should be reported privately, via email, to the Microsoft Security Response Center (secure@microsoft.com).
You should receive a response within 24 hours.
Further information, including the MSRC PGP key, can be found in the [Security TechCenter](https://www.microsoft.com/msrc/faqs-report-an-issue?rtc=1).
## Microsoft Support Policy
Please refer to [Azure SDK Support and Lifecycle information](https://azure.github.io/azure-sdk/policies_support.html).


@@ -1,130 +0,0 @@
# This builds only track 1 SDKs. See eng\pipelines\templates\steps\build.yml for track 2.
trigger:
  paths:
    exclude:
      - sdk/
      - eng/tools/
      - eng/config.json

pr:
  paths:
    exclude:
      - sdk/
      - eng/tools/
      - eng/config.json

jobs:
  - job: Build_Test
    strategy:
      matrix:
        Linux_Go117:
          pool.name: azsdk-pool-mms-ubuntu-2004-general
          go.version: '1.17.8'
        Linux_Go118:
          pool.name: azsdk-pool-mms-ubuntu-2004-general
          go.version: '1.18'
    pool:
      name: $(pool.name)
    variables:
      - template: /eng/pipelines/templates/variables/globals.yml
      - name: GOPATH
        value: '$(system.defaultWorkingDirectory)/work'
      - name: sdkPath
        value: '$(GOPATH)/src/github.com/$(build.repository.name)'
      - name: GO111MODULE
        value: 'off'
      - name: IGNORE_BREAKING_CHANGES
        value: 'true'
      - name: go.list.filter
        value: 'profiles services storage'
      - name: go.test.filter
        value: '-path ./vendor -prune -o -path ./sdk -prune -o -path ./eng -prune -o'
    steps:
      - task: GoTool@0
        inputs:
          version: '$(go.version)'
        displayName: "Select Go Version"
      - script: |
          set -e
          mkdir -p '$(GOPATH)/bin'
          mkdir -p '$(sdkPath)'
          shopt -s dotglob extglob
          mv !(work) '$(sdkPath)'
          echo '##vso[task.prependpath]$(GOROOT)/bin'
          echo '##vso[task.prependpath]$(GOPATH)/bin'
        displayName: 'Create Go Workspace'
      - script: |
          set -e
          go version
          curl -sSL https://raw.githubusercontent.com/golang/dep/master/install.sh | sh
          dep ensure -v
          go get -u golang.org/x/lint/golint
        workingDirectory: '$(sdkPath)'
        displayName: 'Install Dependencies'
      - script: |
          for dd in $(go.list.filter); do
            cd $(sdkPath)/$dd
            go vet -v ./...
          done
        workingDirectory: '$(sdkPath)'
        displayName: 'Vet'
      - script: |
          for dd in $(go.list.filter); do
            cd $(sdkPath)/$dd
            go build -v ./...
          done
        workingDirectory: '$(sdkPath)'
        displayName: 'Build'
      - script: go test $(dirname $(find . $(go.test.filter) -name '*_test.go' -print) | sort -u)
        workingDirectory: '$(sdkPath)'
        displayName: 'Run Tests'
      - template: /eng/common/pipelines/templates/steps/verify-links.yml
        parameters:
          ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
            Urls: ($(sdkPath)/eng/common/scripts/get-markdown-files-from-changed-files.ps1)
          ${{ if ne(variables['Build.Reason'], 'PullRequest') }}:
            Urls: $(Get-ChildItem -Path '$(sdkPath)/*.md' -Recurse | Where {$_.FullName -notlike "*/vendor/*" -and $_.FullName -notlike "*/sdk/*"})
          Directory: ''
          ScriptDirectory: '$(sdkPath)/eng/common/scripts'
          WorkingDirectory: '$(sdkPath)'
      - script: go run ./eng/tools/apidiff/main.go packages ./services FETCH_HEAD~1 FETCH_HEAD --copyrepo --breakingchanges || $IGNORE_BREAKING_CHANGES
        workingDirectory: '$(sdkPath)'
        displayName: 'Display Breaking Changes'
      - script: go run ./eng/tools/pkgchk/main.go ./services --exceptions ./eng/tools/pkgchk/exceptions.txt
        workingDirectory: '$(sdkPath)'
        displayName: 'Verify Package Directory'
      - script: grep -L -r --include *.go --exclude-dir vendor -P "Copyright (\d{4}|\(c\)) Microsoft" ./ | tee >&2
        workingDirectory: '$(sdkPath)'
        displayName: 'Copyright Header Check'
        failOnStderr: true
        condition: succeededOrFailed()
      - script: |
          for dd in $(go.list.filter); do
            cd $(sdkPath)/$dd
            gofmt -s -l -d $(find . -name '*.go' -print) >&2
          done
        workingDirectory: '$(sdkPath)'
        displayName: 'Format Check'
        failOnStderr: true
        condition: and(succeededOrFailed(), startsWith(variables['go.version'], '1.16'))
      - script: |
          golint ./storage/... >&2
        workingDirectory: '$(sdkPath)'
        displayName: 'Linter Check'
        failOnStderr: true
        condition: succeededOrFailed()


@@ -1,15 +0,0 @@
/*
Package sdk provides Go packages for managing and using Azure services.
GitHub repo: https://github.com/Azure/azure-sdk-for-go
Official documentation: https://docs.microsoft.com/azure/go
API reference: https://godoc.org/github.com/Azure/azure-sdk-for-go
Samples: https://github.com/Azure-Samples/azure-sdk-for-go-samples
*/
package sdk
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License. See License.txt in the project root for license information.


@@ -1,195 +0,0 @@
## Guide for migrating to `sdk/resourcemanager/**/arm**` from `services/**/mgmt/**`
This document is intended for users who are familiar with the previous version of the Azure SDK for Go for management modules (`services/**/mgmt/**`) and wish to migrate their applications to the next version of the Azure resource management libraries (`sdk/resourcemanager/**/arm**`).
**For users new to the Azure SDK for Go for resource management modules, please see the [README for `sdk/azcore`](https://github.com/Azure/azure-sdk-for-go/tree/main/sdk/azcore) and the README for every individual package.**
## Table of contents
* [Prerequisites](#prerequisites)
* [General Changes](#general-changes)
* [Breaking Changes](#breaking-changes)
* [Authentication](#authentication)
* [Error Handling](#error-handling)
* [Long Running Operations](#long-running-operations)
* [Pagination](#pagination)
* [Customized Policy](#customized-policy)
* [Custom HTTP Client](#custom-http-client)
## Prerequisites
- Go 1.18
- Latest version of resource management modules
## General Changes
The latest Azure SDK for Go management modules use [Go Modules](https://github.com/golang/go/wiki/Modules) to manage dependencies. We ship every RP (resource provider) as an individual module to create a more flexible user experience.
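For example, a consumer's `go.mod` might pull in only the modules it needs; a minimal sketch (the version numbers here are illustrative only):

```go
module example.com/myapp

go 1.18

require (
	github.com/Azure/azure-sdk-for-go/sdk/azcore v1.0.0
	github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.0.0
	github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/resources/armresources v1.0.0
)
```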
## Breaking Changes
### Authentication
In the previous version (`services/**/mgmt/**`), `autorest.Authorizer` is used in authentication process.
In the latest version (`sdk/resourcemanager/**/arm**`), in order to provide a unified authentication based on Azure Identity for all Azure Go SDKs, the authentication mechanism has been re-designed and improved to offer a simpler interface.
To show the change in code snippets:
**Previous version (`services/**/mgmt/**`)**
```go
authorizer, err := adal.NewServicePrincipalToken(oAuthToken, "<ClientId>", "<ClientSecret>", endpoint)
client := resources.NewGroupsClient("<SubscriptionId>")
client.Authorizer = authorizer
```
**Latest version (`sdk/resourcemanager/**/arm**`)**
```go
credential, err := azidentity.NewClientSecretCredential("<TenantId>", "<ClientId>", "<ClientSecret>", nil)
client, err := armresources.NewResourceGroupsClient("<SubscriptionId>", credential, nil)
```
For detailed information on the benefits of using the new authentication types, please refer to [this page](https://github.com/Azure/azure-sdk-for-go/blob/main/sdk/azidentity/README.md).
### Error Handling
There are some minor changes in the error handling.
- When there is an error in the SDK request, in the previous version (`services/**/mgmt/**`) all the return values are non-nil, and you can get the raw HTTP response from the response value. In the latest version (`sdk/resourcemanager/**/arm**`), the first return value will be empty and you need to convert the error to the `azcore.ResponseError` type to get the raw HTTP response. When the request is successful and there is no error returned, you can capture the raw HTTP response via the request context.
**Previous version (`services/**/mgmt/**`)**
```go
resp, err := resourceGroupsClient.CreateOrUpdate(context.TODO(), resourceGroupName, resourceGroupParameters)
if err != nil {
	log.Fatalf("Status code: %d", resp.Response().StatusCode)
}
```
**Latest version (`sdk/resourcemanager/**/arm**`)**
```go
var rawResponse *http.Response
ctxWithResp := runtime.WithCaptureResponse(context.TODO(), &rawResponse)
resp, err := resourceGroupsClient.CreateOrUpdate(ctxWithResp, resourceGroupName, resourceGroupParameters, nil)
if err != nil {
	var respErr *azcore.ResponseError
	if errors.As(err, &respErr) {
		log.Fatalf("Status code: %d", respErr.RawResponse.StatusCode)
	} else {
		log.Fatalf("Other error: %+v", err)
	}
}
log.Printf("Status code: %d", rawResponse.StatusCode)
```
### Long Running Operations
In the previous version, if a request is a long-running operation, a struct `**Future` will be returned, which is an extension of the interface `azure.FutureAPI`. You need to invoke the `future.WaitForCompletionRef` to wait until it finishes.
In the latest version, if a request is a long-running operation, the function name will start with `Begin` to indicate this function will return a poller type which contains the polling methods.
**Previous version (`services/**/mgmt/**`)**
```go
future, err := virtualMachinesClient.CreateOrUpdate(context.TODO(), "<resource group name>", "<virtual machine name>", param)
if err != nil {
log.Fatal(err)
}
if err := future.WaitForCompletionRef(context.TODO(), virtualMachinesClient.Client); err != nil {
log.Fatal(err)
}
vm, err := future.Result(virtualMachinesClient)
if err != nil {
log.Fatal(err)
}
log.Printf("virtual machine ID: %v", *vm.ID)
```
**Latest version (`sdk/resourcemanager/**/arm**`)**
```go
poller, err := client.BeginCreateOrUpdate(context.TODO(), "<resource group name>", "<virtual machine name>", param, nil)
if err != nil {
	log.Fatal(err)
}
resp, err := poller.PollUntilDone(context.TODO(), 30*time.Second)
if err != nil {
	log.Fatal(err)
}
log.Printf("virtual machine ID: %v", *resp.VirtualMachine.ID)
```
### Pagination
In the previous version, if a request is a paginated operation, a struct `**ResultPage` will be returned. This struct has some paging methods, but no interface is defined for them.
In the latest version, if a request is a paginated operation, a struct `**Pager` will be returned that contains the paging methods.
**Previous version (`services/**/mgmt/**`)**
```go
pager, err := resourceGroupsClient.List(context.TODO(), "", nil)
if err != nil {
	log.Fatal(err)
}
for pager.NotDone() {
	for _, v := range pager.Values() {
		log.Printf("resource group ID: %s\n", *v.ID)
	}
	if err := pager.NextWithContext(context.TODO()); err != nil {
		log.Fatal(err)
	}
}
}
```
**Latest version (`sdk/resourcemanager/**/arm**`)**
```go
pager := resourceGroupsClient.NewListPager(nil)
for pager.More() {
	nextResult, err := pager.NextPage(ctx)
	if err != nil {
		log.Fatalf("failed to advance page: %v", err)
	}
	for _, rg := range nextResult.Value {
		log.Printf("resource group ID: %s\n", *rg.ID)
	}
}
}
```
### Customized Policy
Because Azure Core (`azcore`) is now a shared library across all Azure SDKs, there is also a minor change in how customized policies are configured.
In the previous version (`services/**/mgmt/**`), we use the `(autorest.Client).Sender`, `(autorest.Client).RequestInspector` and `(autorest.Client).ResponseInspector` properties in the `github.com/Azure/go-autorest/autorest` module to provide customized interceptors for the HTTP traffic.
In the latest version (`sdk/resourcemanager/**/arm**`), we use `arm.ClientOptions.PerCallPolicies` and `arm.ClientOptions.PerRetryPolicies` in the `github.com/Azure/azure-sdk-for-go/sdk/azcore/arm` package instead to inject customized policies into the pipeline.
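As a minimal sketch (assuming the `policy` package from `sdk/azcore/policy`; the policy type and header below are invented for illustration), a custom per-call policy is any type implementing `policy.Policy`:

```go
// headerPolicy is a hypothetical policy that stamps a header on every request.
type headerPolicy struct{}

func (headerPolicy) Do(req *policy.Request) (*http.Response, error) {
	req.Raw().Header.Set("x-ms-example", "demo") // illustrative header only
	return req.Next()                            // continue down the pipeline
}

options := &arm.ClientOptions{
	ClientOptions: policy.ClientOptions{
		PerCallPolicies: []policy.Policy{headerPolicy{}},
	},
}
client, err := armresources.NewResourceGroupsClient("<SubscriptionId>", credential, options)
```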
### Custom HTTP Client
Similar to the customized policy, there are changes regarding how the custom HTTP client is configured as well. You can now use the `arm.ClientOptions.Transport` option in the `github.com/Azure/azure-sdk-for-go/sdk/azcore/arm` package to plug in your own implementation of the HTTP client. The HTTP client must implement the `policy.Transporter` interface.
**Previous version (`services/**/mgmt/**`)**
```go
httpClient := NewYourOwnHTTPClient{}
client := resources.NewGroupsClient("<SubscriptionId>")
client.Sender = &httpClient
```
**Latest version (`sdk/resourcemanager/**/arm**`)**
```go
httpClient := NewYourOwnHTTPClient{}
options := &arm.ClientOptions{
	ClientOptions: policy.ClientOptions{
		Transport: &httpClient,
	},
}
client, err := armresources.NewResourceGroupsClient("<SubscriptionId>", credential, options)
```
## Need help?
If you have encountered an issue during migration, please file an issue via [GitHub Issues](https://github.com/Azure/azure-sdk-for-go/issues) and make sure you add the "Preview" label to the issue.


@@ -1,10 +0,0 @@
# Developer Documentation
Note: this documentation is for developers of SDKs. If you need documentation for using SDKs, refer to the individual READMEs in each package's root. For example, for help with the `azidentity` package refer to the [README here](https://github.com/Azure/azure-sdk-for-go/blob/main/sdk/azidentity/README.md).
- If you are onboarding an entirely new service and starting from scratch please refer to the [new service onboarding documentation][new_service_docs].
- If you are ready to release a package, follow the [release documentation][release].
<!-- LINKS -->
[new_service_docs]: https://github.com/Azure/azure-sdk-for-go/blob/main/documentation/developer_setup.md
[release]: https://github.com/Azure/azure-sdk-for-go/blob/main/documentation/release.md


@@ -1,96 +0,0 @@
# Generate code
## Generate SDK packages
### Generate an Azure-SDK-for-Go service package
1. [Install AutoRest](https://github.com/Azure/autorest#installing-autorest).
1. Call autorest with the following arguments...
``` cmd
autorest path/to/readme/file --go --go-sdk-folder=<your/gopath/src/github.com/Azure/azure-sdk-for-go> --package-version=<version> --user-agent=<Azure-SDK-For-Go/version services> [--tag=choose/a/tag/in/the/readme/file]
```
For example...
``` cmd
autorest C:/azure-rest-api-specs/specification/advisor/resource-manager/readme.md --go --go-sdk-folder=C:/goWorkspace/src/github.com/Azure/azure-sdk-for-go --tag=package-2016-07-preview --package-version=v11.2.0-beta --user-agent='Azure-SDK-For-Go/v11.2.0-beta services'
```
- If you are looking to generate code based on a specific swagger file, you can replace `path/to/readme/file` with `--input-file=path/to/swagger/file`.
- If the readme file you want to use as input does not have golang tags yet, you can call autorest like this...
``` cmd
autorest path/to/readme/file --go --license-header=<MICROSOFT_APACHE_NO_VERSION> --namespace=<packageName> --output-folder=<your/gopath/src/github.com/Azure/azure-sdk-for-go/services/serviceName/mgmt/APIversion/packageName> --package-version=<version> --user-agent=<Azure-SDK-For-Go/version services> --clear-output-folder --can-clear-output-folder --tag=<choose/a/tag/in/the/readme/file>
```
For example...
``` cmd
autorest --input-file=https://raw.githubusercontent.com/Azure/azure-rest-api-specs/current/specification/network/resource-manager/Microsoft.Network/2017-10-01/loadBalancer.json --go --license-header=MICROSOFT_APACHE_NO_VERSION --namespace=lb --output-folder=C:/goWorkspace/src/github.com/Azure/azure-sdk-for-go/services/network/mgmt/2017-09-01/network/lb --package-version=v11.2.0-beta --clear-output-folder --can-clear-output-folder
```
1. Run `go fmt` on the generated package folder.
1. To make sure the SDK has been generated correctly, also run `golint`, `go build` and `go vet`.
### Generate Azure SDK for Go service packages in bulk
All services, all API versions.
1. [Install AutoRest](https://github.com/Azure/autorest#installing-autorest).
This repo contains a tool to generate the SDK, which depends on the golang tags from the readme files in the Azure REST API specs repo. The tool assumes you have a clone of the [Azure REST API specs](https://github.com/Azure/azure-rest-api-specs) repo and that [golint](https://github.com/golang/lint) is installed.
1. `cd eng/tools/generator`
1. `go install`
1. Add `GOPATH/bin` to your `PATH`, in case it was not already there.
1. Call the generator tool like this...
``` cmd
generator r [v] [l=logs/output/folder] version=<version> path/to/your/swagger/repo/clone
```
For example...
``` cmd
generator r v l=temp version=v11.2.0-beta C:/azure-rest-api-specs
```
The generator tool already runs `go fmt`, `golint`, `go build` and `go vet`; so running them is not necessary.
#### Use the generator tool to generate a single package
1. Just call the generator tool specifying the service to be generated in the input folder.
``` cmd
generator r [v] [l=logs/output/folder] version=<version> path/to/your/swagger/repo/clone/specification/service
```
For example...
``` cmd
generator r v l=temp version=v11.2.0-beta C:/azure-rest-api-specs/specification/network
```
## Include a new package in the SDK
1. Submit a pull request to the Azure REST API specs repo adding the golang tags for the service and API versions in the service readme file, if the needed tags are not there yet.
1. Once the tags are available in the Azure REST API specs repo, generate the SDK.
1. In the changelog file, document the new generated SDK. Include the [autorest.go extension](https://github.com/Azure/autorest.go) version used, and the Azure REST API specs repo commit from where the SDK was generated.
1. Install [dep](https://github.com/golang/dep).
1. Run `dep ensure`.
1. Submit a pull request to this repo, and we will review it.
## Generate Azure SDK for Go profiles
Take a look at the [profile generator documentation](https://github.com/Azure/azure-sdk-for-go/tree/main/eng/tools/profileBuilder).


@@ -1,332 +0,0 @@
# Developer Set Up
* [Installing Go](#installing-go)
* [Create a Client](#create-a-client)
* [Documenting Code](#documenting-code)
* [Constructors](#constructors)
* [Defining Methods](#defining-methods)
* [Write Tests](#write-tests)
## Installing Go
The azure-sdk-for-go team supports the latest and latest-1 Go versions; to see the exact versions we support, check the pipeline definitions [here][pipeline_definitions]. The CI pipelines test the latest and latest-1 versions on both Windows and Linux virtual machines. If you do not already have Go installed, refer to this [workspace setup][workspace_setup] article for a more in-depth tutorial on setting up your Go environment (there is also an MSI at the [Go download page][go_download] if you are developing on Windows). After installing Go and configuring your workspace, fork the `azure-sdk-for-go` repository and clone it to a directory that looks like: `<GO HOME>/src/github.com/Azure/azure-sdk-for-go`.
## Create a Client
After you have the generated code from Autorest, the next step is to wrap this generated code in a "convenience layer" that the customers will use directly to interact with the service. Go is not an object-oriented language like C#, Java, or Python. There is no type hierarchy in Go. Clients and models will be defined as `struct`s and methods will be defined on these structs to interact with the service.
In other languages, types can be explicitly marked "public" or "private"; in Go, exported types and methods are those whose names start with a capital letter. Methods on structs follow the same rule: if a method is meant for use outside the package, its name must start with a capital letter.
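A minimal illustration of the convention:

```golang
type Client struct{}     // exported: visible to users of the package
type tokenCache struct{} // unexported: only visible within the package
```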
### Documenting Code
Code is documented directly inline, and the documentation can be viewed with the `go doc` tool, which is part of the Go toolchain. To document a type, variable, constant, function, or package, write a regular comment directly preceding its declaration (with no intervening blank line). As an example, here is the documentation for the `fmt.Fprint` function:
```golang
// Fprint formats using the default formats for its operands and writes to w.
// Spaces are added between operands when neither is a string.
// It returns the number of bytes written and any write error encountered.
func Fprint(w io.Writer, a ...interface{}) (n int, err error) {
```
Each package needs to include a `doc.go` file, which should not be part of a service version. For more details about this file there is a detailed write-up in the [repo wiki][doc_go_template]. In the `doc.go` file you should include a short service overview, basic examples, and (if they exist) a link to samples in the [`azure-sdk-for-go-samples` repository][go_azsdk_samples].
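A hypothetical `doc.go` sketch following that guidance (the package name and client are invented for illustration):

```golang
// Package azexample provides access to the (hypothetical) Example service.
//
// Getting started:
//
//	client, err := azexample.NewClient(endpoint, cred, nil)
//
// See the azure-sdk-for-go-samples repository for runnable samples.
package azexample
```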
### Constructors
All clients should be constructible directly by the user, and constructor names should begin with `New`. For example, to define a constructor for a new client for the Tables service, we start by defining the struct `ServiceClient`:
```golang
// A ServiceClient represents a client to the table service. It can be used to query the available tables, add/remove tables, and various other service level operations.
type ServiceClient struct {
	client  *tableClient
	service *serviceClient
	cred    SharedKeyCredential
}
```
Note that there are no exported fields on the `ServiceClient` struct, and as a rule of thumb, generated clients and credentials should be private.
Constructors for clients are standalone functions that are not associated with the struct. The constructor for the `ServiceClient` is as follows:
```golang
// NewServiceClient creates a ServiceClient struct using the specified serviceURL, credential, and options.
func NewServiceClient(serviceURL string, cred azcore.TokenCredential, options *ClientOptions) (ServiceClient, error) {
	conOptions := getConnectionOptions(serviceURL, options)
	conOptions.PerRetryPolicies = append(conOptions.PerRetryPolicies, runtime.NewBearerTokenPolicy(cred, []string{"https://storage.azure.com/.default"}, nil))
	con := generated.NewConnection(serviceURL, conOptions)
	return ServiceClient{
		client:  generated.NewTableClient(con, generated.Enum0TwoThousandNineteen0202),
		service: generated.NewServiceClient(con, generated.Enum0TwoThousandNineteen0202),
	}, nil
}
```
In Go, the method parameters are enclosed in parentheses immediately following the method name, with each parameter name preceding its type. The return values follow the parameters; if a method has more than one return value, the types must be enclosed in parentheses. Note that a `*` before a type indicates a pointer to that type. All methods that create a new client or interact with the service should return an `error` as the last value.
This client takes three parameters. The first is the service URL for the specific account. The second is an [`interface`][go_interfaces], a type that defines a set of method signatures; in the case of `azcore.TokenCredential`, the `GetToken(context.Context, options policy.TokenRequestOptions)` method must be implemented to satisfy the interface. The final argument to methods that create clients or interact with the service should be a pointer to an `Options` struct. This options struct should embed `azcore.ClientOptions` plus any service-specific options. Making this final parameter a pointer allows the customer to pass in `nil` if there are no specific options they want to change.
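A sketch of that options pattern (the `ServiceVersion` field is a hypothetical service-specific option):

```golang
// ClientOptions contains optional settings for configuring a client.
type ClientOptions struct {
	// Embedding azcore.ClientOptions picks up transport, retry, logging, and other core options.
	azcore.ClientOptions

	// ServiceVersion is a hypothetical service-specific option.
	ServiceVersion string
}
```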
### Defining Methods
Defining a method follows the format:
```golang
// Create creates the table with the tableName specified when NewClient was called.
func (t *Client) Create(ctx context.Context, options *CreateTableOptions) (CreateTableResponse, error) {
	if options == nil {
		options = &CreateTableOptions{}
	}
	resp, err := t.client.Create(ctx, generated.Enum1Three0, generated.TableProperties{TableName: &t.name}, options.toGenerated(), &generated.QueryOptions{})
	return createTableResponseFromGen(&resp), err
}
```
The `(t *Client)` portion is the "receiver". Methods can be defined with either a pointer (with a `*`) or a value (without a `*`) receiver. Pointer receivers do not copy the receiving struct on method calls and allow the method to mutate it. Client methods should use a pointer receiver.
All methods that perform I/O of any kind, sleep, or perform a significant amount of CPU-bound work must take a [`context.Context`][golang_context] as their first parameter; this allows the customer to carry a deadline, cancellation signal, and other values across API boundaries. The remaining parameters should be specific to that method. Methods should return a "Response" object first and an `error` second.
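Putting these conventions together, a call site might look like the following sketch (reusing the `client` and `Create` names from the example above):
```golang
// Bound the call with a deadline; cancellation propagates through the SDK.
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
defer cancel()

// Passing nil options means "use the defaults".
resp, err := client.Create(ctx, nil)
if err != nil {
	// handle error
}
_ = resp // the "Response" object comes first, the error second
```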
## Write Tests
Testing is built into the Go toolchain with the `testing` library. The testing infrastructure located in the `sdk/internal/recording` directory takes care of generating recordings, establishing the mode a test is being run in ("record", "playback", or "live"), and reading environment variables. The HTTP traffic is intercepted by a custom [test-proxy][test_proxy_docs] in both the "record" and "playback" cases to either persist or read HTTP interactions from a file. There is one small step that needs to be added to your client creation to route traffic to this test proxy. All three of these modes are specified in the `AZURE_RECORD_MODE` environment variable:
| Mode | PowerShell Command | Usage |
| ---- | ------------------ | ----- |
| record | `$ENV:AZURE_RECORD_MODE="record"` | Running against a live service and recording HTTP interactions |
| playback | `$ENV:AZURE_RECORD_MODE="playback"` | Running tests against recorded HTTP interactions |
| live | `$ENV:AZURE_RECORD_MODE="live"` | Bypassing test proxy, running against live service, and not recording HTTP interactions (used by live pipelines) |
To get started, first install [`docker`][get_docker]. Then, to start the proxy, run `./eng/common/testproxy/docker-start-proxy.ps1 start` from the root of the repository. This command takes care of pulling the pinned docker image and running it in the background.
It is not required to run the test-proxy from within the docker container, but this is how the proxy is run in the Azure DevOps pipelines. If you would like to run the test-proxy in a different manner the [documentation][test_proxy_docs] has more information.
### Test Mode Options
There are three test modes: "record", "playback", and "live", each with its own purpose.
Record mode is for testing against a live service and recording the HTTP interactions in a JSON file for later use. This is helpful for developers because subsequent runs do not have to go through the service, which makes tests run much quicker. It also allows us to run tests in public pipelines without fear of leaking secrets from our developer subscriptions.
In playback mode, the JSON file of saved HTTP interactions is used in place of real HTTP calls. This is quicker and is most often used to verify that you did not change the behavior of your library.
Live mode is used by the internal pipelines to test directly against a service (similar to how a customer would do so). This mode bypasses any interactions with the test proxy.
### Routing Requests to the Proxy
All clients take an options struct as the last parameter of the constructor function. In this options struct you need a way to provide a custom HTTP transport object. In your tests, you replace the default HTTP transport object with a custom one from the `internal/recording` library that takes care of routing requests to the proxy. Here is an example:
```golang
package aztables
import (
...
"github.com/Azure/azure-sdk-for-go/sdk/internal/recording"
)
var pathToPackage = "sdk/data/aztables/testdata"
func createClientForRecording(t *testing.T, tableName string, serviceURL string, cred SharedKeyCredential) (*Client, error) {
transport, err := recording.NewRecordingHTTPClient(t)
require.NoError(t, err)
options := &ClientOptions{
ClientOptions: azcore.ClientOptions{
			Transport: transport,
		},
	}
	// Ensure the serviceURL ends with a "/" before appending the table name
if !strings.HasSuffix(serviceURL, "/") && tableName != "" {
serviceURL += "/"
}
serviceURL += tableName
return NewClientWithSharedKey(serviceURL, &cred, options)
}
```
Including this in a file for test helper methods will ensure that before each test the developer simply has to add
```golang
func TestExample(t *testing.T) {
	err := recording.Start(t, "path/to/package", nil)
	require.NoError(t, err)
	defer recording.Stop(t, nil)
	client, err := createClientForRecording(t, "myTableName", "myServiceUrl", myCredential)
require.NoError(t, err)
...
<test code>
}
```
The first two methods (`Start` and `Stop`) tell the proxy when an individual test starts and stops, so it knows when to begin recording HTTP interactions and when to persist them to disk. `Start` takes three parameters: the `t *testing.T` parameter of the test, the path to where the recordings live for a package (this should be the path to the package), and an optional options struct. `Stop` takes just the `t *testing.T` and an options struct.
### Writing Tests
A simple test for `aztables` is shown below:
```golang
import (
	"context"
	"fmt"
	"os"
	"testing"

	"github.com/stretchr/testify/require"

	"github.com/Azure/azure-sdk-for-go/sdk/internal/recording"
)
var (
	accountName = os.Getenv("TABLES_PRIMARY_ACCOUNT_NAME")
	accountKey  = os.Getenv("TABLES_PRIMARY_ACCOUNT_KEY")
)
// Test creating a single table
func TestCreateTable(t *testing.T) {
err := recording.Start(t, pathToPackage, nil)
require.NoError(t, err)
defer func() {
err := recording.Stop(t, nil)
require.NoError(t, err)
}()
	serviceUrl := fmt.Sprintf("https://%v.table.core.windows.net", accountName)
	cred, err := NewSharedKeyCredential(accountName, accountKey)
	require.NoError(t, err)
	client, err := createClientForRecording(t, "tableName", serviceUrl, *cred)
	require.NoError(t, err)
	resp, err := client.Create(context.Background(), nil)
	require.NoError(t, err)
	require.Equal(t, resp.TableResponse.TableName, "tableName")
	defer client.Delete(context.Background(), nil) // Clean up resources
	// ... more test functionality
}
```
The first part of the test above retrieves the secrets needed for authentication from your environment; best practice is to store test secrets in environment variables.
The rest of the snippet shows a test that creates a single table and "requires" (similar to assertions in other languages) that the response from the service has the same table name as the supplied parameter. Every test function in Go takes exactly one parameter, the `t *testing.T` object, and its name must begin with `Test`. After making a service call or creating an object, you can make assertions on it using the external `testify/require` library. In the example above, we "require" that the returned error is `nil`, meaning the call was successful, and then we require that the response object has the same table name as supplied.
Check out the docs for more information about the methods available in the [`require`][require_package] package.
If you set the environment variable `AZURE_RECORD_MODE` to "record" and run `go test` with this code and the proper environment variables, the test will pass and you will be left with a new directory and file. Test recordings are saved to a `recording` directory alongside your test code; running the above test creates the file `recording/TestCreateTable.json` with the HTTP interactions persisted on disk. Now you can set `AZURE_RECORD_MODE` to "playback" and run `go test` again; the test will have the same output but without reaching the service.
### Scrubbing Secrets
The recording files eventually live in the main repository (`github.com/Azure/azure-sdk-for-go`), and we need to make sure that all of these recordings are free from secrets. To do this we use sanitizers, which perform regular-expression replacements. All of the available sanitizers are exposed as methods in the `recording` package. As the table below shows, they generally take three parameters: the replacement value, a regular expression matching the value to be removed (e.g. an account name or key), and an options struct.
| Sanitizer Type | Method |
| -------------- | ------ |
| Body Key Sanitizer | `AddBodyKeySanitizer(jsonPath, value, regex string, options *RecordingOptions)` |
| Body Regex Sanitizer | `AddBodyRegexSanitizer(value, regex string, options *RecordingOptions)` |
| Continuation Sanitizer | `AddContinuationSanitizer(key, method string, resetAfterFirst bool, options *RecordingOptions)` |
| General Regex Sanitizer | `AddGeneralRegexSanitizer(value, regex string, options *RecordingOptions)` |
| Header Regex Sanitizer | `AddHeaderRegexSanitizer(key, value, regex string, options *RecordingOptions)` |
| OAuth Response Sanitizer | `AddOAuthResponseSanitizer(options *RecordingOptions)` |
| Remove Header Sanitizer | `AddRemoveHeaderSanitizer(headersForRemoval []string, options *RecordingOptions)` |
| URI Sanitizer | `AddURISanitizer(value, regex string, options *RecordingOptions)` |
| URI Subscription ID Sanitizer | `AddURISubscriptionIDSanitizer(value string, options *RecordingOptions)` |
To add a scrubber that replaces the URL of your account, use the `TestMain()` function to register sanitizers before any tests run:
```golang
func TestMain(m *testing.M) {
	// Initialize
	if recording.GetRecordMode() == "record" {
		// start all tests with a proxy using its defaults
		err := recording.ResetProxy(nil)
		if err != nil {
			panic(err)
		}

		// fakeKvURL is the illustrative replacement value written into the
		// recordings in place of the real vault URL
		fakeKvURL := "https://fakekvurl.vault.azure.net/"
		vaultUrl := os.Getenv("AZURE_KEYVAULT_URL")
		err = recording.AddURISanitizer(fakeKvURL, vaultUrl, nil)
		if err != nil {
			panic(err)
		}
	}

	exitVal := m.Run()

	if recording.GetRecordMode() == recording.PlaybackMode || recording.GetRecordMode() == recording.RecordingMode {
		// reset the proxy to its defaults
		err := recording.ResetProxy(nil)
		if err != nil {
			panic(err)
		}
	}

	// propagate the test result as the process exit code
	os.Exit(exitVal)
}
```
Note that removing the names of accounts and other values from your recordings can have side effects when running your tests in playback. To handle this, the `internal/recording` module provides helpers that read an environment variable and fall back to its sanitized recording value. For example, an `aztables` test that constructs a client and "requires" the account name to match the provided one could look like this:
```golang
func TestClient(t *testing.T) {
accountName := recording.GetEnvVariable(t, "TABLES_PRIMARY_ACCOUNT_NAME", "fakeAccountName")
// If running in playback, the value is "fakeAccountName". If running in "record" the value is the environment variable
accountKey := recording.GetEnvVariable(t, "TABLES_PRIMARY_ACCOUNT_KEY", "fakeAccountKey")
cred, err := NewSharedKeyCredential(accountName, accountKey)
require.NoError(t, err)
client, err := NewClient("someTableName", someServiceURL, cred, nil)
require.NoError(t, err)
require.Equal(t, accountName, client.AccountName())
}
```
### Using `azidentity` Credentials In Tests
The credentials in `azidentity` are not automatically configured to run in playback mode. To make sure your tests run in playback even with `azidentity` credentials, the best practice is to use a simple `FakeCredential` type that inserts a fake Authorization header to mock a credential. An example of swapping in a fake for `DefaultAzureCredential` via a helper function is shown below in the context of `aztables`:
```golang
type FakeCredential struct {}
func NewFakeCredential() *FakeCredential {
return &FakeCredential{}
}
func (f *FakeCredential) GetToken(ctx context.Context, options policy.TokenRequestOptions) (*azcore.AccessToken, error) {
return &azcore.AccessToken{Token: "***", ExpiresOn: time.Now().Add(time.Hour)}, nil
}
func getAADCredential() (azcore.TokenCredential, error) {
if recording.GetRecordMode() == recording.PlaybackMode {
return NewFakeCredential(), nil
}
	return azidentity.NewDefaultAzureCredential(nil)
}
func TestClientWithAAD(t *testing.T) {
accountName := recording.GetEnvVariable(t, "TABLES_PRIMARY_ACCOUNT_NAME", "fakeAccountName")
cred, err := getAADCredential()
require.NoError(t, err)
...
...run tests...
}
```
The `FakeCredential` shown here implements the `azcore.TokenCredential` interface and can be used anywhere an `azcore.TokenCredential` is expected.
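A usage sketch, reusing the `NewServiceClient` constructor from earlier (the account URL is a placeholder):
```golang
cred, err := getAADCredential()
require.NoError(t, err)

client, err := NewServiceClient("https://fakeAccountName.table.core.windows.net/", cred, nil)
require.NoError(t, err)
```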
## Create Pipelines
When you create the first PR for your library, create it against a `track2-<package>` branch rather than `main`. Submitting PRs to the `main` branch should only be done once your package is close to being released. Treating `track2-<package>` as your main development branch allows nightly CI and live pipeline runs to pick up issues as soon as they are introduced. After creating this PR, add a comment with the following:
```
/azp run prepare-pipelines
```
This creates the pipelines that will verify future PRs. The `azure-sdk-for-go` repository is tested against the latest and latest-1 versions of Go on Windows and Linux. All of your future PRs (regardless of whether they are made to `track2-<package>` or another branch) will be tested against these versions. For more information about the individual checks run by CI and troubleshooting common issues, check out the `eng_sys.md` file.
<!-- LINKS -->
[doc_go_template]: https://github.com/Azure/azure-sdk-for-go/wiki/doc.go-template
[get_docker]: https://docs.docker.com/get-docker/
[go_azsdk_samples]: https://github.com/azure-samples/azure-sdk-for-go-samples
[go_download]: https://golang.org/dl/
[go_interfaces]: https://gobyexample.com/interfaces
[pipeline_definitions]: https://github.com/Azure/azure-sdk-for-go/blob/main/eng/pipelines/templates/jobs/archetype-sdk-client.yml
[require_package]: https://pkg.go.dev/github.com/stretchr/testify/require
[test_proxy_docs]: https://github.com/Azure/azure-sdk-tools/tree/main/tools/test-proxy
[workspace_setup]: https://www.digitalocean.com/community/tutorials/how-to-install-go-and-set-up-a-local-programming-environment-on-windows-10
View file
@ -1,51 +0,0 @@
# Engineering System Checks
* [Build and Test](#build-and-test)
* [Analyze Stages](#analyze-stages)
## Build and Test
Our build system runs PR changes against the latest two versions of Go on both Windows and Linux.
## Analyze
### Link Verification Check
Verifies that all of the links in your README files are valid. This step also checks that locale codes in links are removed. If this check is failing, first look for locale codes (e.g. `en-us`) in your links, then check whether the link works locally.
If you are trying to add a link that will only exist after the next PR (e.g. you are adding a samples README or migration guide), you can use an `aka.ms` link or a temporary link (e.g. `https://microsoft.com`) and create a follow-up PR to correct it.
### Lint
Some of the most common linting errors are listed below (a small example follows the list):
* `errcheck`: An error was returned but it was not checked to be `nil`
* `varcheck`: A variable is unused
* `deadcode`: A struct or method is unused
* `ineffassign`: An ineffectual assignment; the variable is not used after declaration
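As a quick illustration, here is a hypothetical snippet that `errcheck` would flag, followed by a version that passes:
```golang
// flagged by errcheck: the error returned by os.Remove is silently dropped
os.Remove("state.json")

// passes: the error is checked
if err := os.Remove("state.json"); err != nil {
	log.Printf("failed to remove state file: %v", err)
}
```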
For more information about the linters that run, check out the [golangci website][golangci_website].
To run this locally, first install the tool with:
```bash
go install github.com/golangci/golangci-lint/cmd/golangci-lint@v1.41.1
```
Then run the linter against your package:
```bash
golangci-lint run -c <path_to_root>/eng/.golangci.yml <path_to_my_package>
```
### Copyright Header Check
Every source file must begin with the MIT copyright header, followed by a blank line before the package definition:
```golang
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.

package <mypackage>
```
### Format Check
Your package should follow Go's default formatting, which you can apply locally with the command:
```bash
go fmt
```
<!-- LINKS -->
[golangci_website]: https://golangci-lint.run/usage/linters/
View file
@ -1,372 +0,0 @@
Getting Started - New Azure Go SDK
=============================================================
We are excited to announce that a new set of management libraries is
now production-ready. Those packages share a number of new features
such as Azure Identity support, an HTTP pipeline, and improved error
handling, and they also follow the new Azure SDK guidelines, which
produce easy-to-use APIs that are idiomatic, compatible, and dependable.
You can find the full list of those new libraries
[here](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk).
In this basic quickstart guide, we will walk you through how to
authenticate to Azure and start interacting with Azure resources. There are several possible approaches to
authentication. This document illustrates the most common scenario.
Migration from older versions of Azure management libraries for Go
------------------------------------------------------------------
If you are an existing user of the older version of Azure management library for Go (packages that are located under [`/services`](https://github.com/Azure/azure-sdk-for-go/tree/main/services)), and you are looking for a migration guide to upgrade to the latest version of the SDK, please refer to [this migration guide](https://aka.ms/azsdk/go/mgmt/migration) for detailed instructions.
Prerequisites
-------------
You will need Go 1.18 or later and the latest versions of the resource management modules.
You will need the following values to authenticate to Azure:
- **Subscription ID**
- **Client ID**
- **Client Secret**
- **Tenant ID**
These values can be obtained from the Azure portal; here are the instructions:
### Get Subscription ID
1. Log in to your Azure account
2. Select Subscriptions in the left sidebar
3. Select whichever subscription is needed
4. Click on Overview
5. Copy the Subscription ID
### Get Client ID / Client Secret / Tenant ID
For information on how to get Client ID, Client Secret, and Tenant ID,
please refer to [this document](https://docs.microsoft.com/azure/active-directory/develop/howto-create-service-principal-portal)
### Setting Environment Variables
After you have obtained the values, you need to set the following
environment variables:
- `AZURE_CLIENT_ID`
- `AZURE_CLIENT_SECRET`
- `AZURE_TENANT_ID`
- `AZURE_SUBSCRIPTION_ID`
To set these environment variables on your development system:
Windows (Note: Administrator access is required)
1. Open the Control Panel
2. Click System and Security, then System
3. Click Advanced system settings on the left
4. Inside the System Properties window, click the `Environment Variables…` button.
5. Click on the property you would like to change, then click the `Edit…` button. If the property name is not listed, then click the `New…` button.
Linux-based OS:
```sh
export AZURE_CLIENT_ID="__CLIENT_ID__"
export AZURE_CLIENT_SECRET="__CLIENT_SECRET__"
export AZURE_TENANT_ID="__TENANT_ID__"
export AZURE_SUBSCRIPTION_ID="__SUBSCRIPTION_ID__"
```
Install the package
-------------------
This project uses Go modules for versioning and dependency management.
As an example, to install the Azure Compute module, you would run:
```sh
go get github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/compute/armcompute
```
We also recommend installing other packages for authentication and core functionality:
```sh
go get github.com/Azure/azure-sdk-for-go/sdk/azcore
go get github.com/Azure/azure-sdk-for-go/sdk/azidentity
```
Authentication
--------------
Once the environment is set up, all you need to do is create an authenticated client. Before creating a client, you will first need to authenticate to Azure; specifically, you will need to provide a credential for authenticating with the Azure service. The `azidentity` module provides facilities for various ways of authenticating with Azure, including client/secret, certificate, managed identity, and more.
Our default option is to use **DefaultAzureCredential**, which will make use of the environment variables we set earlier and take care of the authentication flow for us.
```go
cred, err := azidentity.NewDefaultAzureCredential(nil)
```
For more details on how authentication works in `azidentity`, please see the documentation for `azidentity` at [pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk/azidentity](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk/azidentity).
Creating a Resource Management Client
-------------------------------------
Once you have a credential, you will need to decide what service to use and create a client to connect to that service. In this section, we will use `Compute` as our target service. The Compute modules consist of one or more clients. A client groups a set of related APIs, providing access to its functionality within the specified subscription. You will need to create one or more clients to access the APIs you require using your `azcore.TokenCredential`.
To show an example, we will create a client to manage Virtual Machines. The code to achieve this task would be:
```go
client, err := armcompute.NewVirtualMachinesClient("<subscription ID>", credential, nil)
```
You can use the same pattern to connect with other Azure services that you are using. For example, in order to manage Virtual Network resources, you would install the Network package and create a `VirtualNetwork` Client:
```go
client, err := armnetwork.NewVirtualNetworksClient("<subscription ID>", credential, nil)
```
Interacting with Azure Resources
--------------------------------
Now that we are authenticated and have created our sub-resource clients, we can use them to make API calls. For resource management scenarios, most use cases center around creating, updating, reading, and deleting Azure resources. Those scenarios correspond to what we call "operations" in Azure. Once you are sure which operations you want to call, you can implement them using the management client we just created in the previous section.
To write the concrete code for the API call, you might need to look up information about request parameters, types, and response bodies for a certain operation. We recommend using the following site for SDK reference:
- [Official Go docs for new Azure Go SDK packages](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk) - This site contains the complete SDK reference for each released package, as well as embedded code snippets for some operations
To see the reference for a certain package, you can either click into each package on the site or add the SDK path directly to the end of the URL. For example, to see the reference for the Azure Compute package, you can use [https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/compute/armcompute](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/compute/armcompute). Certain development tools and IDEs also have features that let you look up API definitions directly.
Let's illustrate SDK usage with a few quick examples. In the following sample, we are going to create a resource group using the SDK. To achieve this, we can take the following steps:
- **Step 1** : Decide which client we want to use, in our case, we know that it's related to Resource Group so our choice is the [ResourceGroupsClient](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/resources/armresources#ResourceGroupsClient)
- **Step 2** : Find out which operation is responsible for creating a resource group. By locating the client in the previous step, we are able to see all the functions under `ResourceGroupsClient`, and we can see [the `CreateOrUpdate` function](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/resources/armresources#ResourceGroupsClient.CreateOrUpdate) is what we need.
- **Step 3** : Using the information about this operation, we can then fill in the required parameters, and implement it using the Go SDK. If we need extra information on what those parameters mean, we can also use the [Azure service documentation](https://docs.microsoft.com/azure/?product=featured) on Microsoft Docs
Let's see what our final code looks like:
Example: Creating a Resource Group
---------------------------------
***Import the packages***
```go
import (
"context"
"log"
"os"
"time"
"github.com/Azure/azure-sdk-for-go/sdk/azcore"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/resources/armresources"
)
```
***Define some global variables***
```go
var (
ctx = context.TODO()
subscriptionId = os.Getenv("AZURE_SUBSCRIPTION_ID")
location = "westus2"
resourceGroupName = "resourceGroupName"
interval = 5 * time.Second
)
```
***Write a function to create a resource group***
```go
func createResourceGroup(ctx context.Context, credential azcore.TokenCredential) (*armresources.ResourceGroupsClientCreateOrUpdateResponse, error) {
rgClient, err := armresources.NewResourceGroupsClient(subscriptionId, credential, nil)
if err != nil {
return nil, err
}
param := armresources.ResourceGroup{
Location: to.Ptr(location),
}
resp, err := rgClient.CreateOrUpdate(ctx, resourceGroupName, param, nil)
return &resp, err
}
```
***Invoking the `createResourceGroup` function in main***
```go
func main() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("authentication failure: %+v", err)
}
resourceGroup, err := createResourceGroup(ctx, cred)
if err != nil {
log.Fatalf("cannot create resource group: %+v", err)
}
log.Printf("Resource Group %s created", *resourceGroup.ResourceGroup.ID)
}
```
Let's demonstrate the management client's usage with some additional samples.
Example: Managing Resource Groups
---------------------------------
***Update a resource group***
```go
func updateResourceGroup(ctx context.Context, credential azcore.TokenCredential) (*armresources.ResourceGroupsClientUpdateResponse, error) {
rgClient, err := armresources.NewResourceGroupsClient(subscriptionId, credential, nil)
if err != nil {
return nil, err
}
update := armresources.ResourceGroupPatchable{
Tags: map[string]*string{
"new": to.Ptr("tag"),
},
}
	resp, err := rgClient.Update(ctx, resourceGroupName, update, nil)
return &resp, err
}
```
***List all resource groups***
```go
func listResourceGroups(ctx context.Context, credential azcore.TokenCredential) ([]*armresources.ResourceGroup, error) {
rgClient, err := armresources.NewResourceGroupsClient(subscriptionId, credential, nil)
if err != nil {
return nil, err
}
pager := rgClient.NewListPager(nil)
var resourceGroups []*armresources.ResourceGroup
for pager.More() {
nextResult, err := pager.NextPage(ctx)
if err != nil {
return nil, err
}
if nextResult.ResourceGroupListResult.Value != nil {
resourceGroups = append(resourceGroups, nextResult.ResourceGroupListResult.Value...)
}
}
return resourceGroups, nil
}
```
***Delete a resource group***
```go
func deleteResourceGroup(ctx context.Context, credential azcore.TokenCredential) error {
rgClient, err := armresources.NewResourceGroupsClient(subscriptionId, credential, nil)
if err != nil {
return err
}
poller, err := rgClient.BeginDelete(ctx, resourceGroupName, nil)
if err != nil {
return err
}
_, err = poller.PollUntilDone(ctx, interval)
return err
}
```
***Invoking the update, list and delete of resource group in the main function***
```go
func main() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("authentication failure: %+v", err)
}
resourceGroup, err := createResourceGroup(ctx, cred)
if err != nil {
log.Fatalf("cannot create resource group: %+v", err)
}
log.Printf("Resource Group %s created", *resourceGroup.ResourceGroup.ID)
updatedRG, err := updateResourceGroup(ctx, cred)
if err != nil {
log.Fatalf("cannot update resource group: %+v", err)
}
log.Printf("Resource Group %s updated", *updatedRG.ResourceGroup.ID)
rgList, err := listResourceGroups(ctx, cred)
if err != nil {
log.Fatalf("cannot list resource group: %+v", err)
}
log.Printf("We totally have %d resource groups", len(rgList))
if err := deleteResourceGroup(ctx, cred); err != nil {
log.Fatalf("cannot delete resource group: %+v", err)
}
log.Printf("Resource Group deleted")
}
```
Example: Managing Virtual Machines
---------------------------------
In addition to resource groups, we will also use Virtual Machines as an example and show how to create a Virtual Machine, which involves three Azure services (Resource Group, Network, and Compute).
Due to the complexity of this scenario, please [click here](https://aka.ms/azsdk/go/mgmt/samples) for the complete sample.
Long Running Operations
-----------------------
In the samples above, you might notice that some operations have a ``Begin`` prefix (for example, ``BeginDelete``). This indicates that the operation is a long-running operation (LRO). For resource management libraries this kind of operation is quite common, since certain resource operations may take a while to finish. When you use these LROs, you will need a poller, and you keep polling for the result until the operation is done. To illustrate this pattern, here is an example:
```go
ctx := context.TODO()
poller, err := client.BeginCreate(ctx, "resource_identifier", "additional_parameter")
if err != nil {
// handle error...
}
resp, err := poller.PollUntilDone(ctx, 5*time.Second)
if err != nil {
// handle error...
}
log.Printf("LRO done")
// dealing with `resp`
```
Note that you will need to pass a polling interval to `PollUntilDone` to tell the poller how often it should try to get the status. This number is usually small, but it's best to consult the [Azure service documentation](https://docs.microsoft.com/azure/?product=featured) for best practices and recommended intervals for your specific use cases.
For more advanced usage of LROs and LRO design guidelines, please visit [this documentation](https://azure.github.io/azure-sdk/golang_introduction.html#methods-invoking-long-running-operations).
## Code Samples
More code samples for using the management libraries of the Go SDK can be found in the following locations:
- [Go SDK Code Samples](https://aka.ms/azsdk/go/mgmt/samples)
- Example files under each package. For example, examples for Network packages can be [found here](https://github.com/Azure/azure-sdk-for-go/blob/main/sdk/resourcemanager/network/armnetwork/ze_generated_example_loadbalancernetworkinterfaces_client_test.go)
Need help?
----------
- File an issue via [GitHub Issues](https://github.com/Azure/azure-sdk-for-go/issues)
- Check [previous questions](https://stackoverflow.com/questions/tagged/azure+go) or ask new ones on Stack Overflow using the `azure` and `go` tags.
Contributing
------------
For details on contributing to this repository, see the contributing
guide.
This project welcomes contributions and suggestions. Most contributions
require you to agree to a Contributor License Agreement (CLA) declaring
that you have the right to, and actually do, grant us the rights to use
your contribution. For details, visit <https://cla.microsoft.com>.
When you submit a pull request, a CLA-bot will automatically determine
whether you need to provide a CLA and decorate the PR appropriately
(e.g., label, comment). Simply follow the instructions provided by the
bot. You will only need to do this once across all repositories using
our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For
more information see the Code of Conduct FAQ or contact
[opencode@microsoft.com](mailto:opencode@microsoft.com) with any questions or comments.
View file
@ -1,531 +0,0 @@
# Azure SDK for Go - Previous Versions
This guide is for developers who are using the older versions of the Azure Go SDK. Those SDKs are located under the
[services folder](https://github.com/Azure/azure-sdk-for-go/tree/master/services).
## Package Updates
Most packages in the SDK are generated from [Azure API specs][azure_rest_specs]
using [Azure/autorest.go][] and [Azure/autorest][]. These generated packages
depend on the HTTP client implemented at [Azure/go-autorest][].
[azure_rest_specs]: https://github.com/Azure/azure-rest-api-specs
[azure/autorest]: https://github.com/Azure/autorest
[azure/autorest.go]: https://github.com/Azure/autorest.go
[azure/go-autorest]: https://github.com/Azure/go-autorest
The SDK codebase adheres to [semantic versioning](https://semver.org) and thus
avoids breaking changes other than at major (x.0.0) releases. Because Azure's
APIs are updated frequently, we release a **new major version at the end of
each month** with a full changelog. For more details and background see [SDK Update
Practices](https://github.com/Azure/azure-sdk-for-go/wiki/SDK-Update-Practices).
To more reliably manage dependencies like the Azure SDK in your applications we
recommend [golang/dep](https://github.com/golang/dep).
Packages that are still in public preview can be found under the ./services/preview
directory. Please be aware that since these packages are in preview they are subject
to change, including breaking changes outside of a major semver bump.
# Install and Use:
## Install
```sh
$ go get -u github.com/Azure/azure-sdk-for-go/...
```
and you should also make sure to include the minimum version of [`go-autorest`](https://github.com/Azure/go-autorest) that is specified in the `Gopkg.toml` file.
Or if you use dep, within your repo run:
```sh
$ dep ensure -add github.com/Azure/azure-sdk-for-go
```
If you need to install Go, follow [the official instructions](https://golang.org/dl/).
## Use
For many more scenarios and examples see
[Azure-Samples/azure-sdk-for-go-samples][samples_repo].
Apply the following general steps to use packages in this repo. For more on
authentication and the `Authorizer` interface see [the next
section](#authentication).
1. Import a package from the [services][services_dir] directory.
2. Create and authenticate a client with a `New*Client` func, e.g.
`c := compute.NewVirtualMachinesClient(...)`.
3. Invoke API methods using the client, e.g.
`res, err := c.CreateOrUpdate(...)`.
4. Handle responses and errors.
[services_dir]: https://github.com/Azure/azure-sdk-for-go/tree/master/services
For example, to create a new virtual network (substitute your own values for
strings in angle brackets):
```go
package main
import (
"context"
"github.com/Azure/azure-sdk-for-go/services/network/mgmt/2017-09-01/network"
"github.com/Azure/go-autorest/autorest/azure/auth"
"github.com/Azure/go-autorest/autorest/to"
)
func main() {
// create a VirtualNetworks client
vnetClient := network.NewVirtualNetworksClient("<subscriptionID>")
	// create an authorizer from env vars or Azure Managed Service Identity
authorizer, err := auth.NewAuthorizerFromEnvironment()
if err == nil {
vnetClient.Authorizer = authorizer
}
// call the VirtualNetworks CreateOrUpdate API
vnetClient.CreateOrUpdate(context.Background(),
"<resourceGroupName>",
"<vnetName>",
network.VirtualNetwork{
Location: to.StringPtr("<azureRegion>"),
VirtualNetworkPropertiesFormat: &network.VirtualNetworkPropertiesFormat{
AddressSpace: &network.AddressSpace{
AddressPrefixes: &[]string{"10.0.0.0/8"},
},
Subnets: &[]network.Subnet{
{
Name: to.StringPtr("<subnet1Name>"),
SubnetPropertiesFormat: &network.SubnetPropertiesFormat{
AddressPrefix: to.StringPtr("10.0.0.0/16"),
},
},
{
Name: to.StringPtr("<subnet2Name>"),
SubnetPropertiesFormat: &network.SubnetPropertiesFormat{
AddressPrefix: to.StringPtr("10.1.0.0/16"),
},
},
},
},
})
}
```
## Authentication
Typical SDK operations must be authenticated and authorized. The _Authorizer_
interface allows use of any auth style in requests, such as inserting an OAuth2
Authorization header and bearer token received from Azure AD.
The SDK itself provides a simple way to get an authorizer which first checks
for OAuth client credentials in environment variables and then falls back to
Azure's [Managed Service Identity](https://github.com/Azure/azure-sdk-for-go/) when available, e.g. when on an Azure
VM. The following snippet from [the previous section](#use) demonstrates
this helper.
```go
import "github.com/Azure/go-autorest/autorest/azure/auth"
// create a VirtualNetworks client
vnetClient := network.NewVirtualNetworksClient("<subscriptionID>")
// create an authorizer from env vars or Azure Managed Service Identity
authorizer, err := auth.NewAuthorizerFromEnvironment()
if err == nil {
vnetClient.Authorizer = authorizer
}
// call the VirtualNetworks CreateOrUpdate API
vnetClient.CreateOrUpdate(context.Background(),
// ...
```
The following environment variables help determine authentication configuration:
- `AZURE_ENVIRONMENT`: Specifies the Azure Environment to use. If not set, it
defaults to `AzurePublicCloud`. Not applicable to authentication with Managed
Service Identity (MSI).
- `AZURE_AD_RESOURCE`: Specifies the AAD resource ID to use. If not set, it
defaults to `ResourceManagerEndpoint` for operations with Azure Resource
Manager. You can also choose an alternate resource programmatically with
  `auth.NewAuthorizerFromEnvironmentWithResource(resource string)`, as sketched below.
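A sketch of the programmatic form (the Key Vault audience is just an example, and `keysClient` stands in for any client value):
```go
// request an authorizer scoped to Azure Key Vault instead of ARM
authorizer, err := auth.NewAuthorizerFromEnvironmentWithResource("https://vault.azure.net")
if err != nil {
	// handle error
}
keysClient.Authorizer = authorizer // keysClient: any generated client value
```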
### More Authentication Details
The previous is the first and most recommended of several authentication
options offered by the SDK because it allows seamless use of both service
principals and [Azure Managed Service Identity][]. Other options are listed
below.
> Note: If you need to create a new service principal, run `az ad sp create-for-rbac -n "<app_name>" --role Contributor --scopes /subscriptions/<subscription_id>` in the
> [azure-cli](https://github.com/Azure/azure-cli). See [these
> docs](https://docs.microsoft.com/cli/azure/create-an-azure-service-principal-azure-cli?view=azure-cli-latest)
> for more info. Copy the new principal's ID, secret, and tenant ID for use in
> your app, or consider the `--sdk-auth` parameter for serialized output.
[azure managed service identity]: https://docs.microsoft.com/azure/active-directory/msi-overview
- The `auth.NewAuthorizerFromEnvironment()` described above creates an authorizer
from the first available of the following configuration:
1. **Client Credentials**: Azure AD Application ID and Secret.
- `AZURE_TENANT_ID`: Specifies the Tenant to which to authenticate.
- `AZURE_CLIENT_ID`: Specifies the app client ID to use.
- `AZURE_CLIENT_SECRET`: Specifies the app secret to use.
2. **Client Certificate**: Azure AD Application ID and X.509 Certificate.
- `AZURE_TENANT_ID`: Specifies the Tenant to which to authenticate.
- `AZURE_CLIENT_ID`: Specifies the app client ID to use.
- `AZURE_CERTIFICATE_PATH`: Specifies the certificate Path to use.
- `AZURE_CERTIFICATE_PASSWORD`: Specifies the certificate password to use.
3. **Resource Owner Password**: Azure AD User and Password. This grant type is *not
recommended*, use device login instead if you need interactive login.
- `AZURE_TENANT_ID`: Specifies the Tenant to which to authenticate.
- `AZURE_CLIENT_ID`: Specifies the app client ID to use.
- `AZURE_USERNAME`: Specifies the username to use.
- `AZURE_PASSWORD`: Specifies the password to use.
4. **Azure Managed Service Identity**: Delegate credential management to the
platform. Requires that code is running in Azure, e.g. on a VM. All
configuration is handled by Azure. See [Azure Managed Service
Identity](https://docs.microsoft.com/azure/active-directory/msi-overview)
for more details.
- The `auth.NewAuthorizerFromFile()` method creates an authorizer using
credentials from an auth file created by the [Azure CLI][]. Follow these
steps to utilize:
1. Create a service principal and output an auth file using `az ad sp create-for-rbac --role Contributor --scopes /subscriptions/<subscription_id> --sdk-auth > client_credentials.json`.
2. Set environment variable `AZURE_AUTH_LOCATION` to the path of the saved
output file.
3. Use the authorizer returned by `auth.NewAuthorizerFromFile()` in your
client as described above.
- The `auth.NewAuthorizerFromCLI()` method creates an authorizer which
uses [Azure CLI][] to obtain its credentials.
The default audience being requested is `https://management.azure.com` (Azure ARM API).
To specify your own audience, export `AZURE_AD_RESOURCE` as an environment variable.
This is read by `auth.NewAuthorizerFromCLI()` and passed to Azure CLI to acquire the access token.
For example, to request an access token for Azure Key Vault, export
```
AZURE_AD_RESOURCE="https://vault.azure.net"
```
- `auth.NewAuthorizerFromCLIWithResource(AUDIENCE_URL_OR_APPLICATION_ID)` - this method is self-contained and does
not require exporting environment variables. For example, to request an access token for Azure Key Vault:
```
auth.NewAuthorizerFromCLIWithResource("https://vault.azure.net")
```
To use `NewAuthorizerFromCLI()` or `NewAuthorizerFromCLIWithResource()`, follow these steps:
1. Install [Azure CLI v2.0.12](https://docs.microsoft.com/cli/azure/install-azure-cli) or later. Upgrade earlier versions.
2. Use `az login` to sign in to Azure.
If you receive an error, use `az account get-access-token` to verify access.
If Azure CLI is not installed to the default directory, you may receive an error
reporting that `az` cannot be found.
Use the `AzureCLIPath` environment variable to define the Azure CLI installation folder.
If you are signed in to Azure CLI using multiple accounts or your account has
access to multiple subscriptions, you need to specify the specific subscription
to be used. To do so, use:
```
az account set --subscription <subscription-id>
```
To verify the current account settings, use:
```
az account list
```
[azure cli]: https://github.com/Azure/azure-cli
- Finally, you can use OAuth's [Device Flow][] by calling
`auth.NewDeviceFlowConfig()` and extracting the Authorizer as follows:
```go
config := auth.NewDeviceFlowConfig(clientID, tenantID)
a, err := config.Authorizer()
```
[device flow]: https://oauth.net/2/device-flow/
# Versioning
azure-sdk-for-go provides at least a basic Go binding for every Azure API. To
provide maximum flexibility to users, the SDK even includes previous versions of
Azure APIs which are still in use. This enables us to support users of the
most updated Azure datacenters, regional datacenters with earlier APIs, and
even on-premises installations of Azure Stack.
**SDK versions** apply globally and are tracked by git
[tags](https://github.com/Azure/azure-sdk-for-go/tags). These are in x.y.z form
and generally adhere to [semantic versioning](https://semver.org) specifications.
**Service API versions** are generally represented by a date string and are
tracked by offering separate packages for each version. For example, to choose the
latest API versions for Compute and Network, use the following imports:
```go
import (
"github.com/Azure/azure-sdk-for-go/services/compute/mgmt/2017-12-01/compute"
"github.com/Azure/azure-sdk-for-go/services/network/mgmt/2017-09-01/network"
)
```
Occasionally service-side changes require major changes to existing versions.
These cases are noted in the changelog, and for this reason `Service API versions`
cannot be used alone to ensure backwards compatibility.
All available services and versions are listed under the `services/` path in
this repo and in [GoDoc][services_godoc]. Run `find ./services -type d -mindepth 3` to list all available service packages.
[services_godoc]: https://godoc.org/github.com/Azure/azure-sdk-for-go/services
### Profiles
Azure **API profiles** specify subsets of Azure APIs and versions. Profiles can provide:
- **stability** for your application by locking to specific API versions; and/or
- **compatibility** for your application with Azure Stack and regional Azure datacenters.
In the Go SDK, profiles are available under the `profiles/` path and their
component API versions are aliases to the true service package under
`services/`. You can use them as follows:
```go
import "github.com/Azure/azure-sdk-for-go/profiles/2017-03-09/compute/mgmt/compute"
import "github.com/Azure/azure-sdk-for-go/profiles/2017-03-09/network/mgmt/network"
import "github.com/Azure/azure-sdk-for-go/profiles/2017-03-09/storage/mgmt/storage"
```
The following profiles are available for hybrid Azure and Azure Stack environments.
- 2017-03-09
- 2018-03-01
In addition to versioned profiles, we also provide two special profiles
`latest` and `preview`. The `latest` profile contains the latest API version
of each service, excluding any preview versions and/or content. The `preview`
profile is similar to the `latest` profile but includes preview API versions.
The `latest` and `preview` profiles can help you stay up to date with API
updates as you build applications. Since they are by definition not stable,
however, they **should not** be used in production apps. Instead, choose the
latest specific API version (or an older one if necessary) from the `services/`
path.
As an example, to automatically use the most recent Compute APIs, use one of
the following imports:
```go
import "github.com/Azure/azure-sdk-for-go/profiles/latest/compute/mgmt/compute"
import "github.com/Azure/azure-sdk-for-go/profiles/preview/compute/mgmt/compute"
```
### Avoiding Breaking Changes
To avoid breaking changes, when specifying imports you should specify a `Service API Version` or `Profile`, as well as lock (using [dep](https://github.com/golang/dep) and soon with [Go Modules](https://github.com/golang/go/wiki/Modules)) to a specific SDK version.
For example, in your source code imports, use a `Service API Version` (`2017-12-01`):
```go
import "github.com/Azure/azure-sdk-for-go/services/compute/mgmt/2017-12-01/compute"
```
or `Profile` version (`2017-03-09`):
```go
import "github.com/Azure/azure-sdk-for-go/profiles/2017-03-09/compute/mgmt/compute"
```
As well as, for dep, a `Gopkg.toml` file with:
```toml
[[constraint]]
name = "github.com/Azure/azure-sdk-for-go"
version = "21.0.0"
```
Combined, these techniques will ensure that breaking changes should not occur. If you are extra sensitive to changes, adding an additional [version pin](https://golang.github.io/dep/docs/Gopkg.toml.html#version-rules) in your SDK Version should satisfy your needs:
```toml
[[constraint]]
name = "github.com/Azure/azure-sdk-for-go"
version = "=21.3.0"
```
## Inspecting and Debugging
### Built-in Basic Request/Response Logging
Starting with `go-autorest v10.15.0` you can enable basic logging of requests and responses through setting environment variables.
Setting `AZURE_GO_SDK_LOG_LEVEL` to `INFO` will log request/response without their bodies. To include the bodies set the log level to `DEBUG`.
By default the logger writes to stderr, however it can also write to stdout or a file
if specified in `AZURE_GO_SDK_LOG_FILE`. Note that if the specified file already exists it will be truncated.
**IMPORTANT:** by default the logger will redact the Authorization and Ocp-Apim-Subscription-Key
headers. Any other secrets will _not_ be redacted.
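As a sketch, a program could set these before its first SDK call (whether the logger re-reads the environment after package initialization is an assumption here; exporting the variables in your shell before launch is the guaranteed route):
```go
// Log requests and responses with their bodies, to a file instead of stderr.
os.Setenv("AZURE_GO_SDK_LOG_LEVEL", "DEBUG")
os.Setenv("AZURE_GO_SDK_LOG_FILE", "/tmp/azure-sdk.log") // truncated if it exists
```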
### Writing Custom Request/Response Inspectors
All clients implement some handy hooks to help inspect the underlying requests being made to Azure.
- `RequestInspector`: View and manipulate the go `http.Request` before it's sent
- `ResponseInspector`: View the `http.Response` received
Here is an example of how these can be used with `net/http/httputil` to see requests and responses.
```go
vnetClient := network.NewVirtualNetworksClient("<subscriptionID>")
vnetClient.RequestInspector = LogRequest()
vnetClient.ResponseInspector = LogResponse()
// ...
func LogRequest() autorest.PrepareDecorator {
return func(p autorest.Preparer) autorest.Preparer {
return autorest.PreparerFunc(func(r *http.Request) (*http.Request, error) {
r, err := p.Prepare(r)
if err != nil {
log.Println(err)
}
dump, _ := httputil.DumpRequestOut(r, true)
log.Println(string(dump))
return r, err
})
}
}
func LogResponse() autorest.RespondDecorator {
return func(p autorest.Responder) autorest.Responder {
return autorest.ResponderFunc(func(r *http.Response) error {
err := p.Respond(r)
if err != nil {
log.Println(err)
}
dump, _ := httputil.DumpResponse(r, true)
log.Println(string(dump))
return err
})
}
}
```
## Tracing and Metrics
All packages and the runtime are instrumented using [OpenCensus](https://opencensus.io/).
### Enable
By default, no tracing provider will be compiled into your program, and the legacy approach of setting `AZURE_SDK_TRACING_ENABLED` environment variable will no longer take effect.
To enable tracing, you must now add the following include to your source file.
``` go
import _ "github.com/Azure/go-autorest/tracing/opencensus"
```
To hook up a tracer simply call `tracing.Register()` passing in a type that satisfies the `tracing.Tracer` interface.
**Note**: In future major releases of the SDK, tracing may become enabled by default.
### Usage
Once enabled, all SDK calls will emit traces and metrics, and the traces will correlate the SDK calls with the raw HTTP calls made to Azure APIs. To consume those traces, if you are not doing so already, you need to register an exporter of your choice, such as [Azure App Insights](https://docs.microsoft.com/azure/application-insights/opencensus-local-forwarder) or [Zipkin](https://opencensus.io/quickstart/go/tracing/#exporting-traces).
To correlate the SDK calls between themselves and with the rest of your code, pass in a context that has a span initiated using the [opencensus-go library](https://github.com/census-instrumentation/opencensus-go) via the `trace.StartSpan(ctx context.Context, name string, o ...StartOption)` function. Here is an example:
```go
func doAzureCalls() {
// The resulting context will be initialized with a root span as the context passed to
// trace.StartSpan() has no existing span.
ctx, span := trace.StartSpan(context.Background(), "doAzureCalls", trace.WithSampler(trace.AlwaysSample()))
defer span.End()
// The traces from the SDK calls will be correlated under the span inside the context that is passed in.
zone, _ := zonesClient.CreateOrUpdate(ctx, rg, zoneName, dns.Zone{Location: to.StringPtr("global")}, "", "")
zone, _ = zonesClient.Get(ctx, rg, *zone.Name)
for i := 0; i < rrCount; i++ {
rr, _ := recordsClient.CreateOrUpdate(ctx, rg, zoneName, fmt.Sprintf("rr%d", i), dns.CNAME, rdSet{
RecordSetProperties: &dns.RecordSetProperties{
TTL: to.Int64Ptr(3600),
CnameRecord: &dns.CnameRecord{
Cname: to.StringPtr("vladdbCname"),
},
},
},
"",
"",
)
}
}
```
## Request Retry Policy
The SDK provides a baked in retry policy for failed requests with default values that can be configured.
Each [client](https://godoc.org/github.com/Azure/go-autorest/autorest#Client) object contains the following fields.
- `RetryAttempts` - the number of times to retry a failed request
- `RetryDuration` - the duration to wait between retries
For async operations the following values are also used.
- `PollingDelay` - the duration to wait between polling requests
- `PollingDuration` - the total time to poll an async request before timing out
Please see the [documentation](https://godoc.org/github.com/Azure/go-autorest/autorest#pkg-constants) for the default values used.
Changing one or more values will affect all subsequent API calls.
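For example, a sketch of tightening these knobs on a client (values are arbitrary):
```go
vnetClient := network.NewVirtualNetworksClient("<subscriptionID>")
vnetClient.RetryAttempts = 5                  // retry failed requests up to 5 times
vnetClient.RetryDuration = 2 * time.Second    // wait 2s between retries
vnetClient.PollingDelay = 10 * time.Second    // async: poll every 10s
vnetClient.PollingDuration = 15 * time.Minute // async: give up after 15m
```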
The default policy is to call `autorest.DoRetryForStatusCodes()` from an API's `Sender` method. Example:
```go
func (client OperationsClient) ListSender(req *http.Request) (*http.Response, error) {
sd := autorest.GetSendDecorators(req.Context(), autorest.DoRetryForStatusCodes(client.RetryAttempts, client.RetryDuration, autorest.StatusCodesForRetry...))
return autorest.SendWithSender(client, req, sd...)
}
```
Details on how `autorest.DoRetryForStatusCodes()` works can be found in the [documentation](https://godoc.org/github.com/Azure/go-autorest/autorest#DoRetryForStatusCodes).
The slice of `SendDecorators` used in a `Sender` method can be customized per API call by smuggling them in the context. Here's an example.
```go
ctx := context.Background()
ctx = autorest.WithSendDecorators(ctx, []autorest.SendDecorator{
	autorest.DoRetryForStatusCodesWithCap(client.RetryAttempts,
		client.RetryDuration, time.Duration(0),
		autorest.StatusCodesForRetry...)})
client.List(ctx)
```
This will replace the default slice of `SendDecorators` with the provided slice.
The `PollingDelay` and `PollingDuration` values are used exclusively by [WaitForCompletionRef()](https://godoc.org/github.com/Azure/go-autorest/autorest/azure#Future.WaitForCompletionRef) when blocking on an async call until it completes.
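A sketch of that blocking pattern with a long-running call (parameters elided):
```go
future, err := vnetClient.CreateOrUpdate(ctx, "<resourceGroupName>", "<vnetName>", network.VirtualNetwork{ /* ... */ })
if err != nil {
	// handle error
}
// Polls at the client's PollingDelay interval and gives up once
// PollingDuration elapses without completion.
if err := future.WaitForCompletionRef(ctx, vnetClient.Client); err != nil {
	// handle error
}
```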
# Resources
- SDK docs are at [godoc.org](https://godoc.org/github.com/Azure/azure-sdk-for-go/).
- SDK samples are at [Azure-Samples/azure-sdk-for-go-samples](https://github.com/Azure-Samples/azure-sdk-for-go-samples).
- SDK notifications are published via the [Azure update feed](https://azure.microsoft.com/updates/).
- Azure API docs are at [docs.microsoft.com/rest/api](https://docs.microsoft.com/rest/api/).
- General Azure docs are at [docs.microsoft.com/azure](https://docs.microsoft.com/azure).
## Reporting security issues and security bugs
Security issues and bugs should be reported privately, via email, to the Microsoft Security Response Center (MSRC) <secure@microsoft.com>. You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Further information, including the MSRC PGP key, can be found in the [Security TechCenter](https://www.microsoft.com/msrc/faqs-report-an-issue).
View file
@ -1,25 +0,0 @@
# Releasing Packages
After going through a minimal architecture board review and preparing your package for release, verify you are ready for release by following the "Release Checklist", and finally release your package by following the "Release Process".
## Release Checklist
- [ ] Verify there are no replace directives in the go.mod file
- [ ] Verify the package has a LICENSE file
- [ ] Verify documentation is present and accurate for all public methods and types. Reference the [content guidelines](https://review.docs.microsoft.com/help/contribute-ref/contribute-ref-how-to-document-sdk?branch=master#api-reference) for best practices. You can start the `godoc` server by running `godoc -http=:6060` from the module home and navigating to `localhost:6060` in the browser.
- [ ] Verify there are no broken links
- [ ] Verify all links are non-localized (no "en-us" in links)
- [ ] Check that the package manager link goes to the correct package
- [ ] Verify samples
- [ ] Verify samples are visible in the [sample browser](https://docs.microsoft.com/samples/browse/)
- [ ] Verify release notes follow the [general guidelines](https://azure.github.io/azure-sdk/policies_releasenotes.html)
- [ ] Verify the troubleshooting section of the README contains information about how to enable logging
- [ ] Verify the CHANGELOG follows [current guidance](https://azure.github.io/azure-sdk/policies_releases.html#changelog-guidance)
- [ ] Verify all champion scenarios have a getting started scenario
## Release Process
1. Complete all steps of the Release Checklist shown above
2. Mark the package as 'in-release' by running the `./eng/common/scripts/Prepare-Release.ps1` script and following the prompts. The script may update the version and/or `CHANGELOG.md` of the package. If changes are made, these changes need to be committed and merged before continuing with the release process.
3. Run the pipeline from the `internal` Azure DevOps. This will require you to approve the release after both the live and recorded test pipelines pass.
4. Validate the package was released properly by running `go get <your-package>@<your-version>` (e.g. `go get github.com/Azure/azure-sdk-for-go/sdk/azcore@v0.20.0`) and validating that pkg.go.dev has updated with the latest version.
View file
@ -1,5 +0,0 @@
#reference https://github.com/golangci/golangci-lint#config-file for more options
run:
# default is true. Enables skipping of directories:
# vendor$, third_party$, testdata$, examples$, Godeps$, builtin$
skip-dirs-use-default: true
View file
@ -1,60 +0,0 @@
format: v0.1-alpha
minimumCheckRuns: 1
timeout: 10
message: >
This pull request is protected by [Check Enforcer](https://aka.ms/azsdk/check-enforcer).
# What is Check Enforcer?
Check Enforcer helps ensure all pull requests are covered by at least one
check-run (typically an Azure Pipeline). When all check-runs associated
with this pull request pass then Check Enforcer itself will pass.
# Why am I getting this message?
You are getting this message because Check Enforcer did not detect any
check-runs being associated with this pull request within five minutes. This
may indicate that your pull request is not covered by any pipelines and so
Check Enforcer is correctly blocking the pull request from being merged.
# What should I do now?
If the **check-enforcer** check-run is not passing and all other check-runs
associated with this PR are passing (excluding _license-cla_) then you could
try telling _Check Enforcer_ to evaluate your pull request again. You can
do this by adding a comment to this pull request as follows:
```
/check-enforcer evaluate
```
Typically evaluation only takes a few seconds. If you know that your pull
request is not covered by a pipeline and this is expected you can override
Check Enforcer using the following command:
```
/check-enforcer override
```
Note that using the override command triggers alerts so that follow-up
investigations can occur (PRs still need to be approved as normal).
# What if I am onboarding a new service?
Often, new services do not have validation pipelines associated with them.
In order to bootstrap pipelines for a new service, please perform the following steps:
## For track 2 SDKs
Issue the following command as a pull request comment:
```
/azp run prepare-pipelines
```
This will run a pipeline that analyzes the source tree and creates the
pipelines necessary to build and validate your pull request. Once the pipeline
has been created you can trigger the pipeline using the following comment:
```
/azp run go - [service] - ci
```

View file

@ -1,30 +0,0 @@
{
"init": {
"initScript": {
"path": "./eng/scripts/automation_init.sh",
"logPrefix": "[GO]",
"stderr":{
"storeAllLog": true
}
}
},
"generateAndBuild": {
"generateAndBuildScript": {
"path": "generator automation-v2",
"logPrefix": "[GO-Generate]",
"stderr":{
"storeLogByFilter": "[error|Error|Exception]"
}
}
},
"mockTest": {
"mockTestScript": {
"path": "./eng/scripts/automation/Invoke-MockTest.ps1",
"script": "pwsh",
"logPrefix": "[GO-MockTest]",
"stderr":{
"storeLogByFilter": "[error|Error|Exception]"
}
}
}
}

View file

@ -1,356 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<title>Interdependency Graph</title>
<meta charset="utf-8">
<script src="https://cdn.jsdelivr.net/npm/cytoscape@3.11.0/dist/cytoscape.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/dagre/0.8.4/dagre.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/cytoscape-dagre@2.2.2/cytoscape-dagre.min.js"></script>
<script type="application/javascript">
const renderGraph = (data) => {
const config = {
container: document.getElementById('cy'),
elements: [],
autounselectify: true,
layout: {
name: 'dagre',
ranker: 'tight-tree',
nodeSep: 10,
rankSep: 400,
padding: 10
},
style: [
{
selector: '.hidden',
style: {
'display': 'none'
}
},
{
selector: 'node',
style: {
'background-color': '#fff',
'border-color': '#333',
'border-width': '1px',
'height': 'label',
'label': 'data(label)',
'padding': '8px',
'shape': 'round-rectangle',
'text-halign': 'center',
'text-valign': 'center',
'text-wrap': 'wrap',
'width': 'label'
}
},
{
selector: 'node.internal',
style: {
'background-color': '#7f7'
}
},
{
selector: 'node.internalbinary',
style: {
'background-color': '#fb7'
}
},
{
selector: 'node.collapsed',
style: {
'background-color': '#b7f'
}
},
{
selector: 'node.search',
style: {
'background-color': '#ff7',
'border-width': '6px',
'display': 'element'
}
},
{
selector: 'node.highlight',
style: {
'background-color': '#fff',
'border-width': '6px',
'display': 'element'
}
},
{
selector: 'node.highlight.in',
style: {
'border-color': '#7bf'
}
},
{
selector: 'node.highlight.out',
style: {
'border-color': '#f77'
}
},
{
selector: 'node.highlight.source',
style: {
'border-color': '#f77'
}
},
{
selector: 'node.highlight.internal',
style: {
'background-color': '#7f7'
}
},
{
selector: 'node.highlight.internalbinary',
style: {
'background-color': '#fb7'
}
},
{
selector: 'node.highlight.collapsed',
style: {
'background-color': '#b7f'
}
},
{
selector: 'node.highlight.search',
style: {
'background-color': '#ff7'
}
},
{
selector: 'edge',
style: {
'curve-style': 'bezier',
'label': 'data(label)',
'line-color': '#333',
'target-arrow-color': '#333',
'target-arrow-shape': 'triangle',
'width': '1.5px'
}
},
{
selector: 'edge.highlight',
style: {
'display': 'element',
'width': '6px'
}
},
{
selector: 'edge.highlight.in',
style: {
'line-color': '#7bf',
'target-arrow-color': '#7bf'
}
},
{
selector: 'edge.highlight.out',
style: {
'line-color': '#f77',
'target-arrow-color': '#f77'
}
}
]
}
// Add the nodes
for (const pkg of Object.keys(data)) {
config.elements.push({
data: {
id: pkg,
label: `${data[pkg].name}\n${data[pkg].version}`
},
classes: data[pkg].type
})
}
// Add the edges
for (const pkg of Object.keys(data)) {
for (const dep of data[pkg].deps) {
const dest = `${dep.name}:${dep.version}`
const edge = {
data: {
id: `${pkg}:${dest}`,
source: pkg,
target: dest,
label: dep.label || ''
}
}
config.elements.push(edge)
}
}
const cy = cytoscape(config)
cy.on('mouseover', 'node', event => {
const element = event.target
if (element.hasClass('pinned')) { return }
element.addClass('highlight source')
element.outgoers().addClass('highlight out')
element.incomers().addClass('highlight in')
})
cy.on('mouseout', 'node', event => {
const element = event.target
if (element.hasClass('pinned')) { return }
element.removeClass('source')
if (!element.hasClass('in') && !element.hasClass('out')) {
element.removeClass('highlight')
}
element.outgoers().forEach(e => {
e.removeClass('out')
if (!e.hasClass('in') && !e.hasClass('source')) {
e.removeClass('highlight')
}
})
element.incomers().forEach(e => {
e.removeClass('in')
if (!e.hasClass('out') && !e.hasClass('source')) {
e.removeClass('highlight')
}
})
})
cy.on('cxttap', 'node', event => {
const element = event.target
if (!element.hasClass('pinned')) {
element.addClass('pinned')
} else {
element.removeClass('pinned')
}
})
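// Keyboard shortcuts: '-' collapses all internal nodes, '=' expands them.
// Ignored while the search box has focus.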
document.addEventListener('keydown', event => {
if (document.activeElement.id === 'search') { return }
if (event.key === '-') {
cy.nodes('.internal').forEach(node => {
if (!node.hasClass('hidden')) {
triggerCollapse(cy, node, true)
}
})
} else if (event.key === '=') {
cy.nodes('.internal').forEach(node => {
triggerCollapse(cy, node, false)
})
}
})
let searchTerm = ''
document.getElementById('search').addEventListener('input', event => {
const newValue = event.target.value
if (searchTerm !== newValue) {
searchTerm = newValue
cy.nodes().removeClass('search')
if (searchTerm.length > 0) {
const matches = cy.nodes(`[label *= '${searchTerm}']`)
matches.addClass('search')
document.getElementById('matches').innerText = `Matches: ${matches.length}`
} else {
document.getElementById('matches').innerText = ''
}
}
})
cy.on('tap', 'node', event => {
const element = event.target
const collapse = !element.hasClass('collapsed')
triggerCollapse(cy, element, collapse)
element.emit('mouseout')
element.emit('mouseover')
})
}
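// Collapsing a node hides its outgoing edges and any external nodes left with
// no visible incoming edges; expanding reverses this. Toggled by tapping a node.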
const triggerCollapse = (cy, element, collapse) => {
if (element.outgoers().length === 0) { return }
if (collapse) {
element.addClass('collapsed')
} else {
element.removeClass('collapsed')
}
if (collapse) {
element.outgoers('edge').addClass('hidden')
const orphans = cy.filter(e => {
return e.isNode() &&
!e.hasClass('internal') &&
!e.incomers('edge').some(g => !g.hasClass('hidden'))
})
orphans.forEach(o => {
o.addClass('hidden')
o.successors().addClass('hidden') // no-op when only one tier of external nodes are present
})
} else {
element.outgoers().removeClass('hidden')
}
}
</script>
<style>
body {
margin: 10px auto;
color: #333;
font-weight: 300;
font-family: "Helvetica Neue", Helvetica, Arial, sans-serif;
pointer-events: none;
}
h1 {
font-size: 3em;
font-weight: 300;
z-index: -2;
}
#cy {
width: 100%;
height: 100%;
position: absolute;
left: 0;
top: 0;
z-index: -1;
pointer-events: all;
}
.panel {
display: inline-block;
}
.panel div {
margin: 4px auto;
}
.panel input {
pointer-events: all;
}
</style>
</head>
<body>
<div class="panel">
<h1>Dependency Graph</h1>
<label for="search">Search:</label>
<input id="search" type="search" autocomplete="off" size="64" />
<div id="matches"></div>
</div>
<div id="cy"></div>
<script type="application/javascript">
const params = new URLSearchParams(window.location.search);
const src = params.get("data") || "data.js";
const script = document.createElement("script");
script.src = src;
script.async = false;
script.addEventListener("load", () => renderGraph(data));
script.addEventListener("error", e => {
const dest = document.getElementsByClassName("panel")[0];
dest.innerText = `Failed to load ${src}`;
});
document.head.appendChild(script);
</script>
</body>
</html>

View file

@ -1,3 +0,0 @@
# Common Engineering System
Updates under this directory should only be made in the `azure-sdk-tools` repo as any changes under this directory outside of that repo will end up getting overwritten with future updates. For information about making updates see [common engineering system docs](https://github.com/Azure/azure-sdk-tools/blob/main/doc/common/common_engsys.md)

View file

@ -1,17 +0,0 @@
@echo off
REM Copyright (c) Microsoft Corporation. All rights reserved.
REM Licensed under the MIT License.
setlocal
for /f "usebackq delims=" %%i in (`where pwsh 2^>nul`) do (
set _cmd=%%i
)
if "%_cmd%"=="" (
echo Error: PowerShell not found. Please visit https://github.com/powershell/powershell for install instructions.
exit /b 2
)
call "%_cmd%" -NoLogo -NoProfile -File "%~dpn0.ps1" %*

View file

@ -1,618 +0,0 @@
---
external help file: -help.xml
Module Name:
online version:
schema: 2.0.0
---
# New-TestResources.ps1
## SYNOPSIS
Deploys live test resources defined for a service directory to Azure.
## SYNTAX
### Default (Default)
```
New-TestResources.ps1 [-BaseName <String>] [-ResourceGroupName <String>] [-ServiceDirectory] <String>
[-TestApplicationId <String>] [-TestApplicationSecret <String>] [-TestApplicationOid <String>]
[-SubscriptionId <String>] [-DeleteAfterHours <Int32>] [-Location <String>] [-Environment <String>]
[-ArmTemplateParameters <Hashtable>] [-AdditionalParameters <Hashtable>] [-EnvironmentVariables <Hashtable>]
[-CI] [-Force] [-OutFile] [-SuppressVsoCommands] [-WhatIf] [-Confirm] [<CommonParameters>]
```
### Provisioner
```
New-TestResources.ps1 [-BaseName <String>] [-ResourceGroupName <String>] [-ServiceDirectory] <String>
[-TestApplicationId <String>] [-TestApplicationSecret <String>] [-TestApplicationOid <String>]
-TenantId <String> [-SubscriptionId <String>] -ProvisionerApplicationId <String>
-ProvisionerApplicationSecret <String> [-DeleteAfterHours <Int32>] [-Location <String>]
[-Environment <String>] [-ArmTemplateParameters <Hashtable>] [-AdditionalParameters <Hashtable>]
[-EnvironmentVariables <Hashtable>] [-CI] [-Force] [-OutFile] [-SuppressVsoCommands] [-WhatIf] [-Confirm]
[<CommonParameters>]
```
## DESCRIPTION
Deploys live test resources specified in test-resources.json or test-resources.bicep
files to a new resource group.
This script searches the directory specified in $ServiceDirectory recursively
for files named test-resources.json or test-resources.bicep.
All found test-resources.json
and test-resources.bicep files will be deployed to the test resource group.
If no test-resources.json or test-resources.bicep files are located, the script
exits without making changes to the Azure environment.
A service principal may optionally be passed to $TestApplicationId and $TestApplicationSecret.
Test resources will grant this service principal access to the created resources.
If no service principal is specified, a new one will be created and assigned the
'Owner' role for the resource group associated with the test resources.
This script runs in the context of credentials already specified in Connect-AzAccount
or those specified in $ProvisionerApplicationId and $ProvisionerApplicationSecret.
## EXAMPLES
### EXAMPLE 1
```
Connect-AzAccount -Subscription 'REPLACE_WITH_SUBSCRIPTION_ID'
New-TestResources.ps1 keyvault
```
Run this in a desktop environment to create a new AAD application and Service Principal
for running live tests against the test resources created.
The principal will have ownership
rights to the resource group and the resources that it contains, but no other resources in
the subscription.
Requires PowerShell 7 to use ConvertFrom-SecureString -AsPlainText or convert
the SecureString to plaintext by another means.
### EXAMPLE 2
```
Connect-AzAccount -Subscription 'REPLACE_WITH_SUBSCRIPTION_ID'
New-TestResources.ps1 `
-BaseName 'azsdk' `
-ServiceDirectory 'keyvault' `
-SubscriptionId 'REPLACE_WITH_SUBSCRIPTION_ID' `
-ResourceGroupName 'REPLACE_WITH_NAME_FOR_RESOURCE_GROUP' `
-Location 'eastus'
```
Run this in a desktop environment to specify the name and location of the resource
group that test resources are being deployed to.
This will also create a new AAD
application and Service Principal for running live tests against the test resources
created.
The principal will have ownership rights to the resource group and the
resources that it contains, but no other resources in the subscription.
Requires PowerShell 7 to use ConvertFrom-SecureString -AsPlainText or convert
the SecureString to plaintext by another means.
### EXAMPLE 3
```
Connect-AzAccount -Subscription 'REPLACE_WITH_SUBSCRIPTION_ID'
New-TestResources.ps1 `
-BaseName 'azsdk' `
-ServiceDirectory 'keyvault' `
-SubscriptionId 'REPLACE_WITH_SUBSCRIPTION_ID' `
-ResourceGroupName 'REPLACE_WITH_NAME_FOR_RESOURCE_GROUP' `
-Location 'eastus' `
-TestApplicationId 'REPLACE_WITH_TEST_APPLICATION_ID' `
-TestApplicationSecret 'REPLACE_WITH_TEST_APPLICATION_SECRET'
```
Run this in a desktop environment to specify the name and location of the resource
group that test resources are being deployed to.
This will grant ownership rights
to the 'TestApplicationId' for the resource group and the resources that it contains,
without altering its existing permissions.
### EXAMPLE 4
```
New-TestResources.ps1 `
-BaseName 'azsdk' `
-ServiceDirectory 'keyvault' `
-SubscriptionId 'REPLACE_WITH_SUBSCRIPTION_ID' `
-ResourceGroupName 'REPLACE_WITH_NAME_FOR_RESOURCE_GROUP' `
-Location 'eastus' `
-ProvisionerApplicationId 'REPLACE_WITH_PROVISIONER_APPLICATION_ID' `
-ProvisionerApplicationSecret 'REPLACE_WITH_PROVISIONER_APPLICATION_SECRET' `
-TestApplicationId 'REPLACE_WITH_TEST_APPLICATION_ID' `
-TestApplicationOid 'REPLACE_WITH_TEST_APPLICATION_OBJECT_ID' `
-TestApplicationSecret 'REPLACE_WITH_TEST_APPLICATION_SECRET'
```
Run this in a desktop environment to specify the name and location of the resource
group that test resources are being deployed to.
The script will be executed in the
context of the 'ProvisionerApplicationId' rather than the caller.
Depending on the permissions of the Provisioner Application principal, the script may
grant ownership rights to 'TestApplicationId' for the resource group and the resources
that it contains, or may emit a message indicating that it was unable to perform the grant.
For the Provisioner Application principal to perform the grant, it will need the
permission 'Application.ReadWrite.OwnedBy' for the Microsoft Graph API.
Requires PowerShell 7 to use ConvertFrom-SecureString -AsPlainText or convert
the SecureString to plaintext by another means.
### EXAMPLE 5
```
New-TestResources.ps1 `
-ServiceDirectory '$(ServiceDirectory)' `
-TenantId '$(TenantId)' `
-ProvisionerApplicationId '$(ProvisionerId)' `
-ProvisionerApplicationSecret '$(ProvisionerSecret)' `
-TestApplicationId '$(TestAppId)' `
-TestApplicationSecret '$(TestAppSecret)' `
-DeleteAfterHours 24 `
-CI `
-Force `
-Verbose
```
Run this in an Azure DevOps CI (with appropriate variables configured) before
executing live tests.
The script will output variables as secrets (to enable
log redaction).
## PARAMETERS
### -BaseName
A name to use in the resource group and passed to the ARM template as 'baseName'.
Limit $BaseName to few enough characters that, once the prefixes specified in
the ARM template are added, names stay under Azure's resource name length limits.
See also https://docs.microsoft.com/azure/architecture/best-practices/resource-naming
Note: The value specified for this parameter will be overridden and generated
by New-TestResources.ps1 if $CI is specified.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ResourceGroupName
Set this value to deploy directly to a Resource Group that has already been
created or to create a new resource group with this name.
If not specified, the $BaseName will be used to generate a name for the resource
group that will be created.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ServiceDirectory
A directory under 'sdk' in the repository root - optionally with subdirectories
specified - in which to discover ARM templates named 'test-resources.json' and
Bicep templates named 'test-resources.bicep'.
This can also be an absolute path, or a path that includes parent directories.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: True
Position: 1
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -TestApplicationId
Optional Azure Active Directory Application ID to authenticate the test runner
against deployed resources.
Passed to the ARM template as 'testApplicationId'.
If not specified, a new AAD Application will be created and assigned the 'Owner'
role for the resource group associated with the test resources.
No permissions
will be granted to the subscription or other resources.
For those specifying a Provisioner Application principal as 'ProvisionerApplicationId',
it will need the permission 'Application.ReadWrite.OwnedBy' for the Microsoft Graph API
in order to create the Test Application principal.
This application is used by the test runner to execute tests against the
live test resources.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -TestApplicationSecret
Optional service principal secret (password) to authenticate the test runner
against deployed resources.
Passed to the ARM template as
'testApplicationSecret'.
This application is used by the test runner to execute tests against the
live test resources.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -TestApplicationOid
Service Principal Object ID of the AAD Test Application.
This is used to assign
permissions to the AAD application so it can access tested features on the live
test resources (e.g.
Role Assignments on resources).
It is passed to the ARM template as 'testApplicationOid'.
If not specified, an attempt will be made to query it from the Azure Active Directory
tenant.
For those specifying a service principal as 'ProvisionerApplicationId',
it will need the permission 'Application.Read.All' for the Microsoft Graph API
in order to query AAD.
For more information on the relationship between AAD Applications and Service
Principals see: https://docs.microsoft.com/azure/active-directory/develop/app-objects-and-service-principals
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -TenantId
The tenant ID of a service principal when a provisioner is specified.
The same
Tenant ID is used for Test Application and Provisioner Application.
This value is passed to the ARM template as 'tenantId'.
```yaml
Type: String
Parameter Sets: Provisioner
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -SubscriptionId
Optional subscription ID to use for new resources when logging in as a
provisioner.
You can also use Set-AzContext if not provisioning.
If you do not specify a SubscriptionId and are not logged in, one will be
automatically selected for you by the Connect-AzAccount cmdlet.
Once you are logged in (or were previously), the selected SubscriptionId
will be used for subsequent operations that are specific to a subscription.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ProvisionerApplicationId
Optional Application ID of the Azure Active Directory service principal to use for
provisioning the test resources.
If not specified, New-TestResources.ps1 uses the
context of the caller to provision.
If specified, the Provisioner Application principal would benefit from the following
permissions to the Microsoft Graph API:
- 'Application.Read.All' in order to query AAD to obtain the 'TestApplicationOid'
- 'Application.ReadWrite.OwnedBy' in order to create the Test Application principal
or grant an existing principal ownership of the resource group associated with
the test resources.
If the provisioner does not have these permissions, it can still be used with
New-TestResources.ps1 by specifying an existing Test Application principal, including
its Object ID, and managing permissions to the resource group manually.
This value is not passed to the ARM template.
```yaml
Type: String
Parameter Sets: Provisioner
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ProvisionerApplicationSecret
A service principal secret (password) used to provision test resources when a
provisioner is specified.
This value is not passed to the ARM template.
```yaml
Type: String
Parameter Sets: Provisioner
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DeleteAfterHours
Positive integer number of hours from the current time to set the
'DeleteAfter' tag on the created resource group.
The computed value is a
timestamp of the form "2020-03-04T09:07:04.3083910Z".
An optional cleanup process can delete resource groups whose "DeleteAfter"
timestamp is less than the current time.
This is used for CI automation.
```yaml
Type: Int32
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: 120
Accept pipeline input: False
Accept wildcard characters: False
```
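As a rough sketch of the mechanics (an assumption for illustration, not the script's exact code), the computed tag value corresponds to:
```
# UTC now plus -DeleteAfterHours, serialized in round-trip format,
# e.g. 2020-03-04T09:07:04.3083910Z
$deleteAfter = [DateTime]::UtcNow.AddHours($DeleteAfterHours).ToString('o')
```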
### -Location
Optional location where resources should be created.
If left empty, the default
is based on the cloud to which the template is being deployed:
* AzureCloud -\> 'westus2'
* AzureUSGovernment -\> 'usgovvirginia'
* AzureChinaCloud -\> 'chinaeast2'
* Dogfood -\> 'westus'
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Environment
Optional name of the cloud environment.
The default is the Azure Public Cloud
('AzureCloud')
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: AzureCloud
Accept pipeline input: False
Accept wildcard characters: False
```
### -ArmTemplateParameters
Optional key-value pairs of parameters to pass to the ARM template(s).
```yaml
Type: Hashtable
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -AdditionalParameters
Optional key-value pairs of parameters to pass to the ARM template(s) and pre-/post-scripts.
```yaml
Type: Hashtable
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -EnvironmentVariables
Optional key-value pairs of parameters to set as environment variables to the shell.
```yaml
Type: Hashtable
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: @{}
Accept pipeline input: False
Accept wildcard characters: False
```
### -CI
Indicates the script is run as part of a Continuous Integration / Continuous
Deployment (CI/CD) build (only Azure Pipelines is currently supported).
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: ($null -ne $env:SYSTEM_TEAMPROJECTID)
Accept pipeline input: False
Accept wildcard characters: False
```
### -Force
Force creation of resources instead of being prompted.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -OutFile
Save test environment settings into a .env file next to test resources template.
The contents of the file are protected via the .NET Data Protection API (DPAPI).
This is supported only on Windows.
The environment file is scoped to the current
service directory.
The environment file will be named for the test resources template that it was
generated for.
For ARM templates, it will be test-resources.json.env.
For
Bicep templates, test-resources.bicep.env.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -SuppressVsoCommands
By default, the -CI parameter will print out secrets to logs with Azure Pipelines log
commands that cause them to be redacted.
For CI environments that don't support this (like
stress test clusters), this flag can be set to avoid printing these secrets to the logs.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: ($null -eq $env:SYSTEM_TEAMPROJECTID)
Accept pipeline input: False
Accept wildcard characters: False
```
### -WhatIf
Shows what would happen if the cmdlet runs.
The cmdlet is not run.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: wi
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Confirm
Prompts you for confirmation before running the cmdlet.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: cf
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](https://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
## OUTPUTS
## NOTES
## RELATED LINKS

View file

@ -1,191 +0,0 @@
# Live Test Resource Management
Running and recording live tests often requires first creating some resources
in Azure. Service directories that include a `test-resources.json` or `test-resources.bicep`
file require running [New-TestResources.ps1][] to create these resources and output
environment variables you must set.
The following scripts can be used both on your desktop for developer
scenarios and on hosted agents for continuous integration testing.
* [New-TestResources.ps1][] - Creates new test resources for a given service.
* [Remove-TestResources.ps1][] - Deletes previously created resources.
## Prerequisites
1. Install [PowerShell][] version 7.0 or newer.
2. Install the [Azure PowerShell][PowerShellAz].
## On the Desktop
To set up your Azure account to run live tests, you'll need to log into Azure,
and create the resources defined in your `test-resources.json` or `test-resources.bicep`
template as shown in the following example using Azure Key Vault. The script will create
a service principal automatically, or you may create a service principal that can be reused
subsequently.
Note that `-Subscription` is an optional parameter but recommended if your account
is a member of multiple subscriptions. If you didn't specify it when logging in,
you should select your desired subscription using `Select-AzSubscription`. The
default can be saved using `Set-AzDefault` for future sessions.
```powershell
Connect-AzAccount -Subscription 'YOUR SUBSCRIPTION ID'
eng\common\TestResources\New-TestResources.ps1 keyvault
```
The `OutFile` switch will be set by default if you are running this for a .NET project on Windows.
This will save test environment settings into a `test-resources.json.env` file next to `test-resources.json`
or a `test-resources.bicep.env` file next to `test-resources.bicep`. The file is protected via DPAPI.
The environment file is scoped to the current repository directory and avoids the need to
set environment variables or restart your IDE to recognize them.
Along with some log messages, this will output environment variables based on
your current shell like in the following example:
```powershell
${env:KEYVAULT_TENANT_ID} = '<<secret>>'
${env:KEYVAULT_CLIENT_ID} = '<<secret>>'
${env:KEYVAULT_CLIENT_SECRET} = '<<secret>>'
${env:KEYVAULT_SUBSCRIPTION_ID} = 'YOUR SUBSCRIPTION ID'
${env:KEYVAULT_RESOURCE_GROUP} = 'rg-myusername'
${env:KEYVAULT_LOCATION} = 'westus2'
${env:KEYVAULT_SKU} = 'premium'
${env:AZURE_KEYVAULT_URL} = '<<url>>'
```
For security reasons we do not set these environment variables automatically
for either the current process or persistently for future sessions. You must
do that yourself based on your current platform and shell.
If your current shell was detected properly, you should be able to copy and
paste the output directly into your terminal, or add it to your profile script.
For example, in PowerShell on Windows you can copy the output above and paste
it back into the terminal to set those environment variables for the current
process. To persist these variables for future terminal sessions or for
applications started outside the terminal, you could copy and paste the
following commands:
```powershell
setx KEYVAULT_TENANT_ID ${env:KEYVAULT_TENANT_ID}
setx KEYVAULT_CLIENT_ID ${env:KEYVAULT_CLIENT_ID}
setx KEYVAULT_CLIENT_SECRET ${env:KEYVAULT_CLIENT_SECRET}
setx KEYVAULT_SUBSCRIPTION_ID ${env:KEYVAULT_SUBSCRIPTION_ID}
setx KEYVAULT_RESOURCE_GROUP ${env:KEYVAULT_RESOURCE_GROUP}
setx KEYVAULT_LOCATION ${env:KEYVAULT_LOCATION}
setx KEYVAULT_SKU ${env:KEYVAULT_SKU}
setx AZURE_KEYVAULT_URL ${env:AZURE_KEYVAULT_URL}
```
### Pre- and Post- Scripts
Sometimes creating test resources requires work to be done either before or after the main test-resources.json deployment is executed.
For these scenarios a `test-resources-pre.ps1` or `test-resources-post.ps1`, respectively, can be created in the same folder as the `test-resources.json` file.
For example, it may be necessary to create artifacts prior to provisioning the actual resource, such as a certificate.
Typically the created artifact will need to be passed to `test-resources.json` to be used in the ARM template or as output (or both).
Below is an example of how `$templateFileParameters` can be used to pass data from the `pre-` script to `test-resources.json`.
**Snippet from `test-resources-pre.ps1`**
```powershell
$cert = New-X509Certificate2 -SubjectName 'E=opensource@microsoft.com, CN=Azure SDK, OU=Azure SDK, O=Microsoft, L=Frisco, S=TX, C=US' -ValidDays 3652
# Create new entries in $templateFileParameters
$templateFileParameters['ConfidentialLedgerPrincipalPEM'] = Format-X509Certificate2 -Certificate $cert
$templateFileParameters['ConfidentialLedgerPrincipalPEMPK'] = Format-X509Certificate2 -Type Pkcs8 -Certificate $cert
```
**Snippet from the corresponding `test-resources.json`.**
Note that the values present in `$templateFileParameters` will map to parameters of the same name.
```json
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"_comment": "Other required parameters would go here... (this is not part of the actual test-resources.json)",
"ConfidentialLedgerPrincipalPEM": {
"type": "string",
"metadata": {
"description": "The certificate to configure as a certBasedSecurityPrincipal."
}
},
"ConfidentialLedgerPrincipalPEMPK": {
"type": "string",
"metadata": {
"description": "The certificate to configure as a certBasedSecurityPrincipal."
}
}
}
}
```
### Cleaning up Resources
By default, resource groups are tagged with a `DeleteAfter` value and date according to the default or specified
value for the `-DeleteAfterHours` switch. You can use this tag in scheduled jobs to remove older resources based
on that date.
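For example, a scheduled cleanup job could find expired groups with a query like the following hypothetical snippet (for illustration only; it is not one of the scripts in this directory):
```powershell
# List resource groups whose DeleteAfter tag is in the past.
Get-AzResourceGroup | Where-Object {
    $_.Tags -and $_.Tags['DeleteAfter'] -and ([DateTime]$_.Tags['DeleteAfter'] -lt [DateTime]::UtcNow)
}
```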
If you are not ready for the resources to be deleted, you can update the resource group by running [Update-TestResources.ps1][]:
```powershell
Update-TestResources.ps1 keyvault
```
This will extend the expiration time by the default interval (48 hours) from now.
Alternatively, after running or recording live tests, if you do not plan on further testing
you can immediately remove the test resources you created above by running [Remove-TestResources.ps1][]:
```powershell
Remove-TestResources.ps1 keyvault -Force
```
If you persisted environment variables, you should also remove those as well.
### Passing Additional Arguments
Some test-resources.json templates utilize the `AdditionalParameters` parameter to control additional resource configuration options. For example:
```powershell
New-TestResources.ps1 keyvault -AdditionalParameters @{enableHsm = $true}
```
## In CI
Test pipelines should include deploy-test-resources.yml and
remove-test-resources.yml like in the following examples:
```yml
- template: /eng/common/TestResources/deploy-test-resources.yml
parameters:
ServiceDirectory: '${{ parameters.ServiceDirectory }}'
# Run tests
- template: /eng/common/TestResources/remove-test-resources.yml
```
Be sure to link the **Secrets for Resource Provisioner** variable group
into the test pipeline for these scripts to work.
## Documentation
To regenerate documentation for scripts within this directory, you can install
[platyPS][] and run it like in the following example:
```powershell
Install-Module platyPS -Scope CurrentUser -Force
New-MarkdownHelp -Command .\New-TestResources.ps1 -OutputFolder . -Force
```
After the markdown files are generated, please make sure all "http" URIs use "https".
PowerShell markdown documentation created with [platyPS][].
[New-TestResources.ps1]: https://aka.ms/azsdk/tools/New-TestResources
[Update-TestResources.ps1]: https://aka.ms/azsdk/tools/Update-TestResources
[Remove-TestResources.ps1]: https://aka.ms/azsdk/tools/Remove-TestResources
[PowerShell]: https://github.com/PowerShell/PowerShell
[PowerShellAz]: https://docs.microsoft.com/powershell/azure/install-az-ps
[platyPS]: https://github.com/PowerShell/platyPS

View file

@ -1,17 +0,0 @@
@echo off
REM Copyright (c) Microsoft Corporation. All rights reserved.
REM Licensed under the MIT License.
setlocal
for /f "usebackq delims=" %%i in (`where pwsh 2^>nul`) do (
set _cmd=%%i
)
if "%_cmd%"=="" (
echo Error: PowerShell not found. Please visit https://github.com/powershell/powershell for install instructions.
exit /b 2
)
call "%_cmd%" -NoLogo -NoProfile -File "%~dpn0.ps1" %*

View file

@ -1,321 +0,0 @@
#!/usr/bin/env pwsh
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.
#Requires -Version 6.0
#Requires -PSEdition Core
#Requires -Modules @{ModuleName='Az.Accounts'; ModuleVersion='1.6.4'}
#Requires -Modules @{ModuleName='Az.Resources'; ModuleVersion='1.8.0'}
[CmdletBinding(DefaultParameterSetName = 'Default', SupportsShouldProcess = $true, ConfirmImpact = 'Medium')]
param (
# Limit $BaseName so that, with the prefixes specified in the ARM template, names stay under Azure's length limits; see https://docs.microsoft.com/azure/architecture/best-practices/resource-naming.
[Parameter(ParameterSetName = 'Default')]
[Parameter(ParameterSetName = 'Default+Provisioner', Mandatory = $true, Position = 0)]
[ValidatePattern('^[-a-zA-Z0-9\.\(\)_]{0,80}(?<=[a-zA-Z0-9\(\)])$')]
[string] $BaseName,
[Parameter(ParameterSetName = 'ResourceGroup')]
[Parameter(ParameterSetName = 'ResourceGroup+Provisioner')]
[string] $ResourceGroupName,
[Parameter(ParameterSetName = 'Default+Provisioner', Mandatory = $true)]
[Parameter(ParameterSetName = 'ResourceGroup+Provisioner', Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[string] $TenantId,
[Parameter()]
[ValidatePattern('^[0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}$')]
[string] $SubscriptionId,
[Parameter(ParameterSetName = 'Default+Provisioner', Mandatory = $true)]
[Parameter(ParameterSetName = 'ResourceGroup+Provisioner', Mandatory = $true)]
[ValidatePattern('^[0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}$')]
[string] $ProvisionerApplicationId,
[Parameter(ParameterSetName = 'Default+Provisioner', Mandatory = $true)]
[Parameter(ParameterSetName = 'ResourceGroup+Provisioner', Mandatory = $true)]
[string] $ProvisionerApplicationSecret,
[Parameter(ParameterSetName = 'Default', Position = 0)]
[Parameter(ParameterSetName = 'Default+Provisioner')]
[Parameter(ParameterSetName = 'ResourceGroup')]
[Parameter(ParameterSetName = 'ResourceGroup+Provisioner')]
[string] $ServiceDirectory,
[Parameter()]
[ValidateSet('AzureCloud', 'AzureUSGovernment', 'AzureChinaCloud', 'Dogfood')]
[string] $Environment = 'AzureCloud',
[Parameter(ParameterSetName = 'ResourceGroup')]
[Parameter(ParameterSetName = 'ResourceGroup+Provisioner')]
[switch] $CI,
[Parameter()]
[switch] $Force,
# Captures any arguments not declared here (no parameter errors)
[Parameter(ValueFromRemainingArguments = $true)]
$RemoveTestResourcesRemainingArguments
)
# By default stop for any error.
if (!$PSBoundParameters.ContainsKey('ErrorAction')) {
$ErrorActionPreference = 'Stop'
}
# Support actions to invoke on exit.
$exitActions = @({
if ($exitActions.Count -gt 1) {
Write-Verbose 'Running registered exit actions.'
}
})
trap {
# Like using try..finally in PowerShell, but without keeping track of more braces or tabbing content.
$exitActions.Invoke()
}
. $PSScriptRoot/SubConfig-Helpers.ps1
# Source helpers to purge resources.
. "$PSScriptRoot\..\scripts\Helpers\Resource-Helpers.ps1"
function Log($Message) {
Write-Host ('{0} - {1}' -f [DateTime]::Now.ToLongTimeString(), $Message)
}
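# Invoke $Action up to $Attempts times, doubling the delay between tries (10s, 20s, ...).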
function Retry([scriptblock] $Action, [int] $Attempts = 5) {
$attempt = 0
$sleep = 5
while ($attempt -lt $Attempts) {
try {
$attempt++
return $Action.Invoke()
} catch {
if ($attempt -lt $Attempts) {
$sleep *= 2
Write-Warning "Attempt $attempt failed: $_. Trying again in $sleep seconds..."
Start-Sleep -Seconds $sleep
} else {
Write-Error -ErrorRecord $_
}
}
}
}
if ($ProvisionerApplicationId) {
$null = Disable-AzContextAutosave -Scope Process
Log "Logging into service principal '$ProvisionerApplicationId'"
$provisionerSecret = ConvertTo-SecureString -String $ProvisionerApplicationSecret -AsPlainText -Force
$provisionerCredential = [System.Management.Automation.PSCredential]::new($ProvisionerApplicationId, $provisionerSecret)
# Use the given subscription ID if provided.
$subscriptionArgs = if ($SubscriptionId) {
@{SubscriptionId = $SubscriptionId}
}
$provisionerAccount = Retry {
Connect-AzAccount -Force:$Force -Tenant $TenantId -Credential $provisionerCredential -ServicePrincipal -Environment $Environment @subscriptionArgs
}
$exitActions += {
Write-Verbose "Logging out of service principal '$($provisionerAccount.Context.Account)'"
$null = Disconnect-AzAccount -AzureContext $provisionerAccount.Context
}
}
$context = Get-AzContext
if (!$ResourceGroupName) {
if ($CI) {
if (!$ServiceDirectory) {
Write-Warning "ServiceDirectory parameter is empty, nothing to remove"
exit 0
}
$envVarName = (BuildServiceDirectoryPrefix (GetServiceLeafDirectoryName $ServiceDirectory)) + "RESOURCE_GROUP"
$ResourceGroupName = [Environment]::GetEnvironmentVariable($envVarName)
if (!$ResourceGroupName) {
Write-Error "Could not find resource group name environment variable '$envVarName'. This is likely due to an earlier failure in the 'Deploy Test Resources' step above."
exit 0
}
} else {
if (!$BaseName) {
$UserName = GetUserName
$BaseName = GetBaseName $UserName $ServiceDirectory
Log "BaseName was not set. Using default base name '$BaseName'"
}
# Format the resource group name like in New-TestResources.ps1.
$ResourceGroupName = "rg-$BaseName"
}
}
# If no subscription was specified, try to select the Azure SDK Developer Playground subscription.
# Ignore errors to leave the automatically selected subscription.
if ($SubscriptionId) {
$currentSubscriptionId = $context.Subscription.Id
if ($currentSubscriptionId -ne $SubscriptionId) {
Log "Selecting subscription '$SubscriptionId'"
$null = Select-AzSubscription -Subscription $SubscriptionId
$exitActions += {
Log "Selecting previous subscription '$currentSubscriptionId'"
$null = Select-AzSubscription -Subscription $currentSubscriptionId
}
# Update the context.
$context = Get-AzContext
}
} else {
Log "Attempting to select subscription 'Azure SDK Developer Playground (faa080af-c1d8-40ad-9cce-e1a450ca5b57)'"
$null = Select-AzSubscription -Subscription 'faa080af-c1d8-40ad-9cce-e1a450ca5b57' -ErrorAction Ignore
# Update the context.
$context = Get-AzContext
$SubscriptionId = $context.Subscription.Id
$PSBoundParameters['SubscriptionId'] = $SubscriptionId
}
# Use cache of well-known team subs without having to be authenticated.
$wellKnownSubscriptions = @{
'faa080af-c1d8-40ad-9cce-e1a450ca5b57' = 'Azure SDK Developer Playground'
'a18897a6-7e44-457d-9260-f2854c0aca42' = 'Azure SDK Engineering System'
'2cd617ea-1866-46b1-90e3-fffb087ebf9b' = 'Azure SDK Test Resources'
}
# Print which subscription is currently selected.
$subscriptionName = $context.Subscription.Id
if ($wellKnownSubscriptions.ContainsKey($subscriptionName)) {
$subscriptionName = '{0} ({1})' -f $wellKnownSubscriptions[$subscriptionName], $subscriptionName
}
Log "Selected subscription '$subscriptionName'"
if ($ServiceDirectory) {
$root = [System.IO.Path]::Combine("$PSScriptRoot/../../../sdk", $ServiceDirectory) | Resolve-Path
$preRemovalScript = Join-Path -Path $root -ChildPath 'remove-test-resources-pre.ps1'
if (Test-Path $preRemovalScript) {
Log "Invoking pre resource removal script '$preRemovalScript'"
if (!$PSCmdlet.ParameterSetName.StartsWith('ResourceGroup')) {
$PSBoundParameters.Add('ResourceGroupName', $ResourceGroupName);
}
&$preRemovalScript @PSBoundParameters
}
# Make sure environment files from New-TestResources -OutFile are removed.
Get-ChildItem -Path $root -Filter test-resources.json.env -Recurse | Remove-Item -Force:$Force
}
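# Script block passed to Retry below to confirm the group is gone or actively deleting.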
$verifyDeleteScript = {
try {
$group = Get-AzResourceGroup -name $ResourceGroupName
} catch {
if ($_.ToString().Contains("Provided resource group does not exist")) {
Write-Verbose "Resource group '$ResourceGroupName' not found. Continuing..."
return
}
throw $_
}
if ($group.ProvisioningState -ne "Deleting")
{
throw "Resource group is in '$($group.ProvisioningState)' state, expected 'Deleting'"
}
}
# Get any resources that can be purged after the resource group is deleted, coerced into a collection even if empty.
$purgeableResources = Get-PurgeableGroupResources $ResourceGroupName
Log "Deleting resource group '$ResourceGroupName'"
if ($Force -and !$purgeableResources) {
Remove-AzResourceGroup -Name "$ResourceGroupName" -Force:$Force -AsJob
Write-Verbose "Running background job to delete resource group '$ResourceGroupName'"
Retry $verifyDeleteScript 3
} else {
# Don't swallow interactive confirmation when Force is false
Remove-AzResourceGroup -Name "$ResourceGroupName" -Force:$Force
}
# Now purge the resources that should have been deleted with the resource group.
Remove-PurgeableResources $purgeableResources
$exitActions.Invoke()
<#
.SYNOPSIS
Deletes the resource group deployed for a service directory from Azure.
.DESCRIPTION
Removes a resource group and all its resources previously deployed using
New-TestResources.ps1.
If you are not currently logged into an account in the Az PowerShell module,
you will be asked to log in with Connect-AzAccount. Alternatively, you (or a
build pipeline) can pass $ProvisionerApplicationId and
$ProvisionerApplicationSecret to authenticate a service principal with access to
create resources.
.PARAMETER BaseName
A name to use in the resource group and passed to the ARM template as 'baseName'.
This will delete the resource group named 'rg-<baseName>'
.PARAMETER ResourceGroupName
The name of the resource group to delete.
.PARAMETER TenantId
The tenant ID of a service principal when a provisioner is specified.
.PARAMETER SubscriptionId
Optional subscription ID to use when deleting resources when logging in as a
provisioner. You can also use Set-AzContext if not provisioning.
If you do not specify a SubscriptionId and are not logged in, one will be
automatically selected for you by the Connect-AzAccount cmdlet.
Once you are logged in (or were previously), the selected SubscriptionId
will be used for subsequent operations that are specific to a subscription.
.PARAMETER ProvisionerApplicationId
A service principal ID to provision test resources when a provisioner is specified.
.PARAMETER ProvisionerApplicationSecret
A service principal secret (password) to provision test resources when a provisioner is specified.
.PARAMETER ServiceDirectory
A directory under 'sdk' in the repository root - optionally with subdirectories
specified - in which to discover a pre-removal script named 'remove-test-resources-pre.ps1'.
.PARAMETER Environment
Name of the cloud environment. The default is the Azure Public Cloud
('AzureCloud').
.PARAMETER CI
Run script in CI mode. Infers various environment variable names based on CI convention.
.PARAMETER Force
Force removal of resource group without asking for user confirmation
.EXAMPLE
Remove-TestResources.ps1 keyvault -Force
Use the currently logged-in account to delete the resources created for Key Vault testing.
.EXAMPLE
Remove-TestResources.ps1 `
-ResourceGroupName "${env:AZURE_RESOURCEGROUP_NAME}" `
-TenantId '$(TenantId)' `
-ProvisionerApplicationId '$(AppId)' `
-ProvisionerApplicationSecret '$(AppSecret)' `
-Force `
-Verbose
When run in the context of an Azure DevOps pipeline, this script removes the
resource group whose name is stored in the environment variable
AZURE_RESOURCEGROUP_NAME.
#>

View file

@ -1,308 +0,0 @@
---
external help file: -help.xml
Module Name:
online version:
schema: 2.0.0
---
# Remove-TestResources.ps1
## SYNOPSIS
Deletes the resource group deployed for a service directory from Azure.
## SYNTAX
### Default (Default)
```
Remove-TestResources.ps1 [-BaseName <String>] [-SubscriptionId <String>] [-ServiceDirectory] <String>
[-Environment <String>] [-Force] [-RemoveTestResourcesRemainingArguments <Object>] [-WhatIf] [-Confirm]
[<CommonParameters>]
```
### Default+Provisioner
```
Remove-TestResources.ps1 -BaseName <String> -TenantId <String> [-SubscriptionId <String>]
-ProvisionerApplicationId <String> -ProvisionerApplicationSecret <String> [[-ServiceDirectory] <String>]
[-Environment <String>] [-Force] [-RemoveTestResourcesRemainingArguments <Object>] [-WhatIf] [-Confirm]
[<CommonParameters>]
```
### ResourceGroup+Provisioner
```
Remove-TestResources.ps1 -ResourceGroupName <String> -TenantId <String> [-SubscriptionId <String>]
-ProvisionerApplicationId <String> -ProvisionerApplicationSecret <String> [[-ServiceDirectory] <String>]
[-Environment <String>] [-CI] [-Force] [-RemoveTestResourcesRemainingArguments <Object>] [-WhatIf] [-Confirm]
[<CommonParameters>]
```
### ResourceGroup
```
Remove-TestResources.ps1 -ResourceGroupName <String> [-SubscriptionId <String>] [[-ServiceDirectory] <String>]
[-Environment <String>] [-CI] [-Force] [-RemoveTestResourcesRemainingArguments <Object>] [-WhatIf] [-Confirm]
[<CommonParameters>]
```
## DESCRIPTION
Removes a resource group and all its resources previously deployed using
New-TestResources.ps1.
If you are not currently logged into an account in the Az PowerShell module,
you will be asked to log in with Connect-AzAccount.
Alternatively, you (or a
build pipeline) can pass $ProvisionerApplicationId and
$ProvisionerApplicationSecret to authenticate a service principal with access to
create resources.
## EXAMPLES
### EXAMPLE 1
```
Remove-TestResources.ps1 keyvault -Force
Use the currently logged-in account to delete the resources created for Key Vault testing.
```
### EXAMPLE 2
```
Remove-TestResources.ps1 `
-ResourceGroupName "${env:AZURE_RESOURCEGROUP_NAME}" `
-TenantId '$(TenantId)' `
-ProvisionerApplicationId '$(AppId)' `
-ProvisionerApplicationSecret '$(AppSecret)' `
-Force `
-Verbose
When run in the context of an Azure DevOps pipeline, this script removes the
resource group whose name is stored in the environment variable
AZURE_RESOURCEGROUP_NAME.
```
## PARAMETERS
### -BaseName
A name to use in the resource group and passed to the ARM template as 'baseName'.
This will delete the resource group named 'rg-\<baseName\>'
```yaml
Type: String
Parameter Sets: Default
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
```yaml
Type: String
Parameter Sets: Default+Provisioner
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ResourceGroupName
The name of the resource group to delete.
```yaml
Type: String
Parameter Sets: ResourceGroup+Provisioner, ResourceGroup
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -TenantId
The tenant ID of a service principal when a provisioner is specified.
```yaml
Type: String
Parameter Sets: Default+Provisioner, ResourceGroup+Provisioner
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -SubscriptionId
Optional subscription ID to use when deleting resources when logging in as a
provisioner.
You can also use Set-AzContext if not provisioning.
If you do not specify a SubscriptionId and are not logged in, one will be
automatically selected for you by the Connect-AzAccount cmdlet.
Once you are logged in (or were previously), the selected SubscriptionId
will be used for subsequent operations that are specific to a subscription.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ProvisionerApplicationId
A service principal ID to provision test resources when a provisioner is specified.
```yaml
Type: String
Parameter Sets: Default+Provisioner, ResourceGroup+Provisioner
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ProvisionerApplicationSecret
A service principal secret (password) to provision test resources when a provisioner is specified.
```yaml
Type: String
Parameter Sets: Default+Provisioner, ResourceGroup+Provisioner
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ServiceDirectory
A directory under 'sdk' in the repository root - optionally with subdirectories
specified - in which to discover a pre-removal script named 'remove-test-resources-pre.ps1'.
```yaml
Type: String
Parameter Sets: Default
Aliases:
Required: True
Position: 1
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
```yaml
Type: String
Parameter Sets: Default+Provisioner, ResourceGroup+Provisioner, ResourceGroup
Aliases:
Required: False
Position: 1
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Environment
Name of the cloud environment.
The default is the Azure Public Cloud
('AzureCloud').
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: AzureCloud
Accept pipeline input: False
Accept wildcard characters: False
```
### -CI
Run script in CI mode. Infers various environment variable names based on CI convention.
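```yaml
Type: SwitchParameter
Parameter Sets: ResourceGroup+Provisioner, ResourceGroup
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```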
### -Force
Force removal of resource group without asking for user confirmation
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -RemoveTestResourcesRemainingArguments
Captures any arguments not declared here (no parameter errors)
```yaml
Type: Object
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -WhatIf
Shows what would happen if the cmdlet runs.
The cmdlet is not run.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: wi
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Confirm
Prompts you for confirmation before running the cmdlet.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: cf
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](https://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
## OUTPUTS
## NOTES
## RELATED LINKS

View file

@ -1,120 +0,0 @@
function BuildServiceDirectoryPrefix([string]$serviceName) {
$serviceName = $serviceName -replace '[\./\\]', '_'
return $serviceName.ToUpperInvariant() + "_"
}
# If the ServiceDirectory has multiple segments use the last directory name
# e.g. D:\foo\bar -> bar or foo/bar -> bar
function GetServiceLeafDirectoryName([string]$serviceDirectory) {
return $serviceDirectory ? (Split-Path -Leaf $serviceDirectory) : ""
}
function GetUserName() {
$UserName = $env:USER ?? $env:USERNAME
# Remove spaces, etc. that may be in $UserName
$UserName = $UserName -replace '\W'
return $UserName
}
function GetBaseName([string]$user, [string]$serviceDirectoryName) {
# Handle service directories in nested directories, e.g. `data/aztables`
$serviceDirectorySafeName = $serviceDirectoryName -replace '[\./\\]', ''
return "$user$serviceDirectorySafeName".ToLowerInvariant()
}
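# Returns $false when the key (raw, or with the service/AZURE_ prefix stripped)
# is a known non-secret name, or when the value is explicitly allowed; otherwise
# returns $true so the value is marked secret and redacted in pipeline logs.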
function ShouldMarkValueAsSecret([string]$serviceName, [string]$key, [string]$value, [array]$allowedValues = @())
{
$logOutputNonSecret = @(
# Environment Variables
"RESOURCEGROUP_NAME",
# Deployment Outputs
"CLIENT_ID",
"TENANT_ID",
"SUBSCRIPTION_ID",
"RESOURCE_GROUP",
"LOCATION",
"ENVIRONMENT",
"AUTHORITY_HOST",
"RESOURCE_MANAGER_URL",
"SERVICE_MANAGEMENT_URL",
"ENDPOINT_SUFFIX",
"SERVICE_DIRECTORY",
# This is used in many places and is harder to extract from the base subscription config, so hardcode it for now.
"STORAGE_ENDPOINT_SUFFIX",
# Parameters
"Environment",
"SubscriptionId",
"TenantId",
"TestApplicationId",
"TestApplicationOid",
"ProvisionerApplicationId"
)
$serviceDirectoryPrefix = BuildServiceDirectoryPrefix $serviceName
$suffix1 = $key -replace $serviceDirectoryPrefix, ""
$suffix2 = $key -replace "AZURE_", ""
$variants = @($key, $suffix1, $suffix2)
if ($variants | Where-Object { $logOutputNonSecret -contains $_ }) {
return $false
}
if ($allowedValues -contains $value) {
return $false
}
return $true
}
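# Example: ShouldMarkValueAsSecret 'keyvault' 'KEYVAULT_CLIENT_SECRET' $value
# returns $true, while 'KEYVAULT_TENANT_ID' matches the allow-list and returns $false.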
function SetSubscriptionConfiguration([object]$subscriptionConfiguration)
{
foreach($pair in $subscriptionConfiguration.GetEnumerator()) {
if ($pair.Value -is [Hashtable]) {
foreach($nestedPair in $pair.Value.GetEnumerator()) {
# Mark values as secret so we don't print json blobs containing secrets in the logs.
# Prepend underscore to the variable name, so we can still access the variable names via environment
# variables if they get set subsequently.
if (ShouldMarkValueAsSecret "AZURE_" $nestedPair.Name $nestedPair.Value) {
Write-Host "##vso[task.setvariable variable=_$($nestedPair.Name);issecret=true;]$($nestedPair.Value)"
}
}
} else {
if (ShouldMarkValueAsSecret "AZURE_" $pair.Name $pair.Value) {
Write-Host "##vso[task.setvariable variable=_$($pair.Name);issecret=true;]$($pair.Value)"
}
}
}
Write-Host ($subscriptionConfiguration | ConvertTo-Json)
$serialized = $subscriptionConfiguration | ConvertTo-Json -Compress
Write-Host "##vso[task.setvariable variable=SubscriptionConfiguration;]$serialized"
}
function UpdateSubscriptionConfiguration([object]$subscriptionConfigurationBase, [object]$subscriptionConfiguration)
{
foreach ($pair in $subscriptionConfiguration.GetEnumerator()) {
if ($pair.Value -is [Hashtable]) {
if (!$subscriptionConfigurationBase.ContainsKey($pair.Name)) {
$subscriptionConfigurationBase[$pair.Name] = @{}
}
foreach($nestedPair in $pair.Value.GetEnumerator()) {
# Mark values as secret so we don't print json blobs containing secrets in the logs.
# Prepend underscore to the variable name, so we can still access the variable names via environment
# variables if they get set subsequently.
if (ShouldMarkValueAsSecret "AZURE_" $nestedPair.Name $nestedPair.Value) {
Write-Host "##vso[task.setvariable variable=_$($nestedPair.Name);issecret=true;]$($nestedPair.Value)"
}
$subscriptionConfigurationBase[$pair.Name][$nestedPair.Name] = $nestedPair.Value
}
} else {
if (ShouldMarkValueAsSecret "AZURE_" $pair.Name $pair.Value) {
Write-Host "##vso[task.setvariable variable=_$($pair.Name);issecret=true;]$($pair.Value)"
}
$subscriptionConfigurationBase[$pair.Name] = $pair.Value
}
}
$serialized = $subscriptionConfigurationBase | ConvertTo-Json -Compress
Write-Host ($subscriptionConfigurationBase | ConvertTo-Json)
Write-Host "##vso[task.setvariable variable=SubscriptionConfiguration;]$serialized"
}

View file

@ -1,17 +0,0 @@
@echo off
REM Copyright (c) Microsoft Corporation. All rights reserved.
REM Licensed under the MIT License.
setlocal
for /f "usebackq delims=" %%i in (`where pwsh 2^>nul`) do (
set _cmd=%%i
)
if "%_cmd%"=="" (
echo Error: PowerShell not found. Please visit https://github.com/powershell/powershell for install instructions.
exit /b 2
)
call "%_cmd%" -NoLogo -NoProfile -File "%~dpn0.ps1" %*


@ -1,205 +0,0 @@
#!/usr/bin/env pwsh
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.
#Requires -Version 6.0
#Requires -PSEdition Core
#Requires -Modules @{ModuleName='Az.Accounts'; ModuleVersion='1.6.4'}
#Requires -Modules @{ModuleName='Az.Resources'; ModuleVersion='1.8.0'}
[CmdletBinding(DefaultParameterSetName = 'Default')]
param (
[Parameter(ParameterSetName = 'Default', Position = 0)]
[string] $ServiceDirectory,
[Parameter(ParameterSetName = 'Default')]
[ValidatePattern('^[-a-zA-Z0-9\.\(\)_]{0,80}(?<=[a-zA-Z0-9\(\)])$')]
[string] $BaseName,
[Parameter(ParameterSetName = 'ResourceGroup')]
[ValidatePattern('^[-\w\._\(\)]+$')]
[string] $ResourceGroupName,
[Parameter()]
[ValidatePattern('^[0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}$')]
[string] $SubscriptionId,
[Parameter()]
[ValidateRange(1, 7*24)]
[int] $DeleteAfterHours = 48
)
. $PSScriptRoot/SubConfig-Helpers.ps1
# By default, stop on any error.
if (!$PSBoundParameters.ContainsKey('ErrorAction')) {
$ErrorActionPreference = 'Stop'
}
function Log($Message) {
Write-Host ('{0} - {1}' -f [DateTime]::Now.ToLongTimeString(), $Message)
}
function Retry([scriptblock] $Action, [int] $Attempts = 5) {
$attempt = 0
$sleep = 5
while ($attempt -lt $Attempts) {
try {
$attempt++
return $Action.Invoke()
} catch {
if ($attempt -lt $Attempts) {
$sleep *= 2
Write-Warning "Attempt $attempt failed: $_. Trying again in $sleep seconds..."
Start-Sleep -Seconds $sleep
} else {
Write-Error -ErrorRecord $_
}
}
}
}
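# Usage sketch: the base 5 second delay doubles before each wait, so a failing
# action run with -Attempts 3 sleeps 10 and then 20 seconds before giving up:
#   Retry { Get-AzResourceGroup -Name $ResourceGroupName } -Attempts 3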
# Support actions to invoke on exit.
$exitActions = @({
if ($exitActions.Count -gt 1) {
Write-Verbose 'Running registered exit actions'
}
})
# Make sure $ResourceGroupName is set.
if (!$ResourceGroupName) {
# Make sure $BaseName is set.
if (!$BaseName) {
$UserName = GetUserName
$BaseName = GetBaseName $UserName $ServiceDirectory
Log "BaseName was not set. Using default base name '$BaseName'"
}
$ResourceGroupName = "rg-$BaseName"
}
# This script is intended for interactive users. Make sure they are logged in or fail.
$context = Get-AzContext
if (!$context) {
throw "You must be already logged in to use this script. Run 'Connect-AzAccount' and try again."
}
# If no subscription was specified, try to select the Azure SDK Developer Playground subscription.
# Ignore errors to leave the automatically selected subscription.
if ($SubscriptionId) {
$currentSubscriptionId = $context.Subscription.Id
if ($currentSubscriptionId -ne $SubscriptionId) {
Log "Selecting subscription '$SubscriptionId'"
$null = Select-AzSubscription -Subscription $SubscriptionId
$exitActions += {
Log "Selecting previous subscription '$currentSubscriptionId'"
$null = Select-AzSubscription -Subscription $currentSubscriptionId
}
# Update the context.
$context = Get-AzContext
}
} else {
Log "Attempting to select subscription 'Azure SDK Developer Playground (faa080af-c1d8-40ad-9cce-e1a450ca5b57)'"
$null = Select-AzSubscription -Subscription 'faa080af-c1d8-40ad-9cce-e1a450ca5b57' -ErrorAction Ignore
# Update the context.
$context = Get-AzContext
$SubscriptionId = $context.Subscription.Id
$PSBoundParameters['SubscriptionId'] = $SubscriptionId
}
# Use a cache of well-known team subscription names so they can be shown without extra authentication.
$wellKnownSubscriptions = @{
'faa080af-c1d8-40ad-9cce-e1a450ca5b57' = 'Azure SDK Developer Playground'
'a18897a6-7e44-457d-9260-f2854c0aca42' = 'Azure SDK Engineering System'
'2cd617ea-1866-46b1-90e3-fffb087ebf9b' = 'Azure SDK Test Resources'
}
# Print which subscription is currently selected.
$subscriptionName = $context.Subscription.Id
if ($wellKnownSubscriptions.ContainsKey($subscriptionName)) {
$subscriptionName = '{0} ({1})' -f $wellKnownSubscriptions[$subscriptionName], $subscriptionName
}
Log "Selected subscription '$subscriptionName'"
# try..finally will also trap Ctrl+C.
try {
Log "Getting resource group '$ResourceGroupName'"
$resourceGroup = Get-AzResourceGroup -Name $ResourceGroupName
# Update DeleteAfter
$deleteAfter = [DateTime]::UtcNow.AddHours($DeleteAfterHours).ToString('o')
Log "Updating DeleteAfter to '$deleteAfter'"
Write-Warning "Any clean-up scripts running against subscription '$SubscriptionId' may delete resource group '$ResourceGroupName' after $DeleteAfterHours hours."
$resourceGroup.Tags['DeleteAfter'] = $deleteAfter
Log "Updating resource group '$ResourceGroupName'"
Retry {
# Allow the resource group to write to output.
Set-AzResourceGroup -Name $ResourceGroupName -Tag $resourceGroup.Tags
}
} finally {
$exitActions.Invoke()
}
<#
.SYNOPSIS
Updates a resource group previously deployed for a service directory.
.DESCRIPTION
Updates a resource group that was created using New-TestResources.ps1.
You can use this, for example, to update the `DeleteAfterHours` property
to keep an existing resource group deployed for a longer period of time.
.PARAMETER ServiceDirectory
A directory under 'sdk' in the repository root - optionally with subdirectories
specified - in which to discover ARM templates named 'test-resources.json'.
This can also be an absolute path or specify parent directories.
.PARAMETER BaseName
A name to use in the resource group and passed to the ARM template as 'baseName'.
This will update the resource group named 'rg-<baseName>'.
.PARAMETER ResourceGroupName
The name of the resource group to update.
.PARAMETER SubscriptionId
Optional subscription ID to use when updating resources when logged in as a
provisioner. You can also use Set-AzContext if not provisioning.
If you do not specify a SubscriptionId and are not logged in, one will be
automatically selected for you by the Connect-AzAccount cmdlet.
Once you are logged in (or were previously), the selected SubscriptionId
will be used for subsequent operations that are specific to a subscription.
.PARAMETER DeleteAfterHours
Positive integer number of hours from the current time to set the
'DeleteAfter' tag on the created resource group. The computed value is a
timestamp of the form "2020-03-04T09:07:04.3083910Z".
An optional cleanup process can delete resource groups whose "DeleteAfter"
timestamp is less than the current time.
.EXAMPLE
Update-TestResources.ps1 keyvault -DeleteAfterHours 24
Update the 'rg-${USERNAME}keyvault' resource group to be deleted after 24
hours from now if a clean-up script is running against the current subscription.
.EXAMPLE
Update-TestResources.ps1 -ResourceGroupName rg-userkeyvault -SubscriptionId fa9c6912-f641-4226-806c-5139584b89ca
Update the 'rg-userkeyvault' resource group to be deleted after 48
hours from now if a clean-up script is running against the subscription 'fa9c6912-f641-4226-806c-5139584b89ca'.
#>


@ -1,153 +0,0 @@
---
external help file: -help.xml
Module Name:
online version:
schema: 2.0.0
---
# Update-TestResources.ps1
## SYNOPSIS
Updates a resource group previously deployed for a service directory.
## SYNTAX
### Default (Default)
```
Update-TestResources.ps1 [-ServiceDirectory] <String> [-BaseName <String>] [-SubscriptionId <String>]
[-DeleteAfterHours <Int32>] [<CommonParameters>]
```
### ResourceGroup
```
Update-TestResources.ps1 [-ResourceGroupName <String>] [-SubscriptionId <String>] [-DeleteAfterHours <Int32>]
[<CommonParameters>]
```
## DESCRIPTION
Updates a resource group that was created using New-TestResources.ps1.
You can use this, for example, to update the \`DeleteAfterHours\` property
to keep an existing resource group deployed for a longer period of time.
## EXAMPLES
### EXAMPLE 1
```
Update-TestResources.ps1 keyvault -DeleteAfterHours 24
```
Update the 'rg-${USERNAME}keyvault' resource group to be deleted after 24
hours from now if a clean-up script is running against the current subscription.
### EXAMPLE 2
```
Update-TestResources.ps1 -ResourceGroupName rg-userkeyvault -SubscriptionId fa9c6912-f641-4226-806c-5139584b89ca
```
Update the 'rg-userkeyvault' resource group to be deleted after 48
hours from now if a clean-up script is running against the subscription 'fa9c6912-f641-4226-806c-5139584b89ca'.
## PARAMETERS
### -ServiceDirectory
A directory under 'sdk' in the repository root - optionally with subdirectories
specified - in which to discover ARM templates named 'test-resources.json'.
This can also be an absolute path or specify parent directories.
```yaml
Type: String
Parameter Sets: Default
Aliases:
Required: True
Position: 1
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -BaseName
A name to use in the resource group and passed to the ARM template as 'baseName'.
This will update the resource group named 'rg-\<baseName\>'
```yaml
Type: String
Parameter Sets: Default
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ResourceGroupName
The name of the resource group to update.
```yaml
Type: String
Parameter Sets: ResourceGroup
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -SubscriptionId
Optional subscription ID to use when updating resources when logged in as a
provisioner.
You can also use Set-AzContext if not provisioning.
If you do not specify a SubscriptionId and are not logged in, one will be
automatically selected for you by the Connect-AzAccount cmdlet.
Once you are logged in (or were previously), the selected SubscriptionId
will be used for subsequent operations that are specific to a subscription.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DeleteAfterHours
Positive integer number of hours from the current time to set the
'DeleteAfter' tag on the created resource group.
The computed value is a
timestamp of the form "2020-03-04T09:07:04.3083910Z".
An optional cleanup process can delete resource groups whose "DeleteAfter"
timestamp is less than the current time.
```yaml
Type: Int32
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: 48
Accept pipeline input: False
Accept wildcard characters: False
```
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](https://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
## OUTPUTS
## NOTES
## RELATED LINKS


@ -1,38 +0,0 @@
parameters:
- name: SubscriptionConfiguration
type: string
default: $(sub-config-azure-cloud-test-resources)
- name: SubscriptionConfigurations
type: object
default: null
steps:
- ${{ if parameters.SubscriptionConfiguration }}:
- pwsh: |
$config = @'
${{ parameters.SubscriptionConfiguration }}
'@ | ConvertFrom-Json -AsHashtable
. ./eng/common/TestResources/SubConfig-Helpers.ps1
SetSubscriptionConfiguration $config
displayName: Initialize SubscriptionConfiguration variable
- ${{ if parameters.SubscriptionConfigurations }}:
- pwsh: |
Write-Host "##vso[task.setvariable variable=SubscriptionConfiguration;]{}"
displayName: Initialize SubscriptionConfiguration variable for merging
condition: eq(variables['SubscriptionConfiguration'], '')
- ${{ each config in parameters.SubscriptionConfigurations }}:
- pwsh: |
$configBase = @'
$(SubscriptionConfiguration)
'@ | ConvertFrom-Json -AsHashtable
$config = @'
${{ config }}
'@ | ConvertFrom-Json -AsHashtable
. ./eng/common/TestResources/SubConfig-Helpers.ps1
UpdateSubscriptionConfiguration $configBase $config
displayName: Merge Test Resource Configurations


@ -1,15 +0,0 @@
{
"azConfigEndpointSuffix": ".azconfig.azure.cn",
"azureAuthorityHost": "https://login.chinacloudapi.cn/",
"cognitiveServicesEndpointSuffix": ".cognitiveservices.azure.cn",
"containerRegistryEndpointSuffix": ".azurecr.cn",
"cosmosEndpointSuffix": "cosmos.azure.cn",
"enableStorageVersioning": false,
"keyVaultDomainSuffix": ".vault.azure.cn",
"keyVaultEndpointSuffix": ".vault.azure.cn",
"keyVaultSku": "standard",
"searchEndpointSuffix": "search.azure.cn",
"serviceBusEndpointSuffix": ".servicebus.chinacloudapi.cn",
"storageEndpointSuffix": "core.chinacloudapi.cn",
"textAnalyticsSku": "S"
}


@ -1,8 +0,0 @@
{
"azureAuthorityHost": "https://login.microsoftonline.com/",
"cognitiveServicesEndpointSuffix": ".cognitiveservices.azure.com",
"communicationServicesEndpointSuffix": ".communication.azure.com",
"keyVaultDomainSuffix": ".vault.azure.net",
"keyVaultEndpointSuffix": ".vault.azure.net",
"storageEndpointSuffix": "core.windows.net"
}


@ -1,15 +0,0 @@
{
"azConfigEndpointSuffix": ".azconfig.azure.us",
"azureAuthorityHost": "https://login.microsoftonline.us/",
"cognitiveServicesEndpointSuffix": ".cognitiveservices.azure.us",
"containerRegistryEndpointSuffix": ".azurecr.us",
"cosmosEndpointSuffix": "cosmos.azure.us",
"enableStorageVersioning": false,
"formRecognizerLocation": "usgovvirginia",
"keyVaultDomainSuffix": ".vault.usgovcloudapi.net",
"keyVaultEndpointSuffix": ".vault.usgovcloudapi.net",
"keyVaultSku": "premium",
"searchEndpointSuffix": "search.azure.us",
"serviceBusEndpointSuffix": ".servicebus.usgovcloudapi.net",
"storageEndpointSuffix": "core.usgovcloudapi.net"
}


@ -1,61 +0,0 @@
parameters:
ServiceDirectory: not-set
ArmTemplateParameters: '@{}'
DeleteAfterHours: 8
Location: ''
SubscriptionConfiguration: $(sub-config-azure-cloud-test-resources)
# SubscriptionConfiguration will be splatted into the parameters of the test
# resources script. It should be JSON in the form:
# {
# "SubscriptionId": "<subscription id>",
# "TenantId": "<tenant id>",
# "TestApplicationId": "<test app id>",
# "TestApplicationSecret": "<test app secret>",
# "ProvisionerApplicationId": "<provisioner app id>",
# "ProvisionerApplicationSecret": "<provisioner app secret>",
# "Environment": "AzureCloud | AzureGov | AzureChina | <other environment>"
# "EnvironmentVariables": {
# "SERVICE_MANAGEMENT_URL": "<service management url>",
# "STORAGE_ENDPOINT_SUFFIX": "<storage endpoint suffix>",
# "RESOURCE_MANAGER_URL": "<resource manager url>",
# "SEARCH_ENDPOINT_SUFFIX": "<search endpoint suffix>",
# "COSMOS_TABLES_ENDPOINT_SUFFIX": "<cosmos tables endpoint suffix>"
# },
# "ArmTemplateParameters": {
# "keyVaultDomainSuffix": "<keyVaultDomainSuffix>",
# "storageEndpointSuffix": "<storageEndpointSuffix>",
# "endpointSuffix": "<endpointSuffix>",
# "azureAuthorityHost": "<azureAuthorityHost>",
# "keyVaultEndpointSuffix": "<keyVaultEndpointSuffix>"
# }
# }
steps:
- template: /eng/common/pipelines/templates/steps/cache-ps-modules.yml
- template: /eng/common/TestResources/setup-environments.yml
- pwsh: |
eng/common/scripts/Import-AzModules.ps1
$subscriptionConfiguration = @'
${{ parameters.SubscriptionConfiguration }}
'@ | ConvertFrom-Json -AsHashtable;
# The subscriptionConfiguration may have ArmTemplateParameters defined, so
# pass those in via the ArmTemplateParameters flag, and handle any
# additional parameters from the pipelines via AdditionalParameters
eng/common/TestResources/New-TestResources.ps1 `
-ServiceDirectory '${{ parameters.ServiceDirectory }}' `
-Location '${{ parameters.Location }}' `
-DeleteAfterHours '${{ parameters.DeleteAfterHours }}' `
@subscriptionConfiguration `
-AdditionalParameters ${{ parameters.ArmTemplateParameters }} `
-CI `
-Force `
-Verbose | Out-Null
displayName: Deploy test resources
env:
TEMP: $(Agent.TempDirectory)


@ -1,38 +0,0 @@
# Assumes the steps in deploy-test-resources.yml were run previously. Requires the
# <ServiceDirectory>_RESOURCE_GROUP environment variable and the Az PowerShell module.
parameters:
ServiceDirectory: ''
SubscriptionConfiguration: $(sub-config-azure-cloud-test-resources)
# SubscriptionConfiguration will be splatted into the parameters of the test
# resources script. It should be JSON in the form:
# {
# "SubscriptionId": "<subscription id>",
# "TenantId": "<tenant id>",
# "TestApplicationId": "<test app id>",
# "TestApplicationSecret": "<test app secret>",
# "ProvisionerApplicationId": "<provisioner app id>",
# "ProvisionerApplicationSecret": "<provisioner app secret>",
# "Environment": "AzureCloud | AzureGov | AzureChina | <other environment>"
# }
# The Remove-TestResources.ps1 script accommodates extra parameters, so it will
# not error when it is passed parameters it does not use.
steps:
- pwsh: |
eng/common/scripts/Import-AzModules.ps1
$subscriptionConfiguration = @"
${{ parameters.SubscriptionConfiguration }}
"@ | ConvertFrom-Json -AsHashtable;
eng/common/TestResources/Remove-TestResources.ps1 `
@subscriptionConfiguration `
-ServiceDirectory "${{ parameters.ServiceDirectory }}" `
-CI `
-Force `
-Verbose
displayName: Remove test resources
condition: eq(variables['CI_HAS_DEPLOYED_RESOURCES'], 'true')
continueOnError: true


@ -1,34 +0,0 @@
# Cloud Configuration will be splatted into the parameters of `Add-AzEnvironment`. It
# should be JSON in the form (not all fields are required):
# {
# "Name": "<environment name>",
# "PublishSettingsFileUrl": "<publish settings file url>",
# "ServiceEndpoint": "<service endpoint>",
# "ManagementPortalUrl": "<management portal url>",
# "ActiveDirectoryEndpoint": "<active directory endpoint>",
# "ActiveDirectoryServiceEndpointResourceId": "<active directory service endpoint resource id>",
# "ResourceManagerEndpoint": "<resource manager endpoint>",
# "GalleryEndpoint": "<gallery endpoint>",
# "GraphEndpoint": "<graph endpoint>",
# "GraphAudience": "<graph audience>",
# "AzureKeyVaultDnsSuffix": "<key vault suffix>",
# "AzureKeyVaultServiceEndpointResourceId": "<key vault service endpoint resource id>"
# }
steps:
- bash: sudo chown -R runner ~/.Azure
displayName: (MacOS) Grant access to ~/.Azure
condition: contains(variables['OSVmImage'], 'mac')
- task: Powershell@2
displayName: Register Dogfood environment
inputs:
targetType: inline
pwsh: true
script: |
eng/common/scripts/Import-AzModules.ps1
$environmentSpec = @"
$(env-config-dogfood)
"@ | ConvertFrom-Json -AsHashtable;
Add-AzEnvironment @environmentSpec


@ -1,206 +0,0 @@
# Generates an index page for cataloging different versions of the Docs
[CmdletBinding()]
Param (
$DocFx,
$RepoRoot,
$DocGenDir,
$DocOutDir = "${RepoRoot}/docfx_project",
$DocfxJsonPath = "${PSScriptRoot}\docfx.json",
$MainJsPath = "${PSScriptRoot}\templates\matthews\styles\main.js"
)
. "${PSScriptRoot}\..\scripts\common.ps1"
# Given the github.io blob storage URL and a language regex, this helper
# returns a list of artifact names.
function Get-BlobStorage-Artifacts($blobStorageUrl, $blobDirectoryRegex, $blobArtifactsReplacement) {
LogDebug "Reading artifact from storage blob ..."
$returnedArtifacts = @()
$pageToken = ""
Do {
$resp = ""
if (!$pageToken) {
# First page call.
$resp = Invoke-RestMethod -Method Get -Uri $blobStorageUrl
}
else {
# Next page call
$blobStorageUrlPageToken = $blobStorageUrl + "&marker=$pageToken"
$resp = Invoke-RestMethod -Method Get -Uri $blobStorageUrlPageToken
}
# Convert the response to an XML document.
$xmlDoc = [xml](removeBomFromString $resp)
foreach ($elem in $xmlDoc.EnumerationResults.Blobs.BlobPrefix) {
# The service returns names like "dotnet/Azure.AI.Anomalydetector/"; extract "Azure.AI.Anomalydetector".
$artifact = $elem.Name -replace $blobDirectoryRegex, $blobArtifactsReplacement
$returnedArtifacts += $artifact
}
# Fetch page token
$pageToken = $xmlDoc.EnumerationResults.NextMarker
} while ($pageToken)
return $returnedArtifacts
}
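# Illustrative call (the URL and regex are assumptions, not taken from this
# file): list the "dotnet/<package>/" blob prefixes and keep only the package names.
#   $artifacts = Get-BlobStorage-Artifacts `
#       -blobStorageUrl 'https://azuresdkdocs.blob.core.windows.net/$web?restype=container&comp=list&prefix=dotnet%2F&delimiter=%2F' `
#       -blobDirectoryRegex '^dotnet/(.*)/$' `
#       -blobArtifactsReplacement '$1'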
# The BOM byte sequence differs by encoding. This helper strips only the UTF-8
# BOM, since that is what the blob storage list API emits, and returns the
# original string if it does not start with that sequence.
function RemoveBomFromString([string]$bomAwareString) {
if ($bomAwareString.length -le 3) {
return $bomAwareString
}
$bomPatternByteArray = [byte[]] (0xef, 0xbb, 0xbf)
# Convert the first three characters back to bytes using ISO-8859-1 (code page 28591), which maps each character to a single byte.
$bomAwareBytes = [Text.Encoding]::GetEncoding(28591).GetBytes($bomAwareString.Substring(0, 3))
if (@(Compare-Object $bomPatternByteArray $bomAwareBytes -SyncWindow 0).Length -eq 0) {
return $bomAwareString.Substring(3)
}
return $bomAwareString
}
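# Quick check of the BOM stripping (values illustrative):
#   $bytes   = [byte[]]((0xef, 0xbb, 0xbf) + [Text.Encoding]::ASCII.GetBytes('<xml/>'))
#   $withBom = [Text.Encoding]::GetEncoding(28591).GetString($bytes)
#   RemoveBomFromString $withBom    # returns '<xml/>'
#   RemoveBomFromString '<xml/>'    # returned unchanged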
function Get-TocMapping {
Param (
[Parameter(Mandatory = $true)] [Object[]] $metadata,
[Parameter(Mandatory = $true)] [String[]] $artifacts
)
# Used for sorting the toc display order
$orderServiceMapping = @{}
foreach ($artifact in $artifacts) {
$packageInfo = $metadata | ? { $_.Package -eq $artifact -and $_.Hide -ne "true" }
$serviceName = ""
if (!$packageInfo) {
LogDebug "There is no service name for artifact $artifact or it is marked as hidden. Please check csv of Azure/azure-sdk/_data/release/latest repo if this is intended. "
continue
}
elseif (!$packageInfo[0].ServiceName) {
LogWarning "There is no service name for artifact $artifact. Please check csv of Azure/azure-sdk/_data/release/latest repo if this is intended. "
# If no service name retrieved, print out warning message, and put it into Other page.
$serviceName = "Other"
}
else {
if ($packageInfo.Length -gt 1) {
LogWarning "There are more than 1 packages fetched out for artifact $artifact. Please check csv of Azure/azure-sdk/_data/release/latest repo if this is intended. "
}
$serviceName = $packageInfo[0].ServiceName.Trim()
}
# Define the sort order for "New" and "Type"; values not in the list sort last (the index falls back to the array length).
$CustomOrder_New = "true", "false", ""
$newIndex = $CustomOrder_New.IndexOf($packageInfo[0].New.ToLower())
$newIndex = $newIndex -eq -1 ? $CustomOrder_New.Count : $newIndex
$CustomOrder_Type = "client", "mgmt", "compat", "spring", ""
$typeIndex = $CustomOrder_Type.IndexOf($packageInfo[0].Type.ToLower())
$typeIndex = $typeIndex -eq -1 ? $CustomOrder_Type.Count : $typeIndex
$orderServiceMapping[$artifact] = [PSCustomObject][ordered]@{
NewIndex = $newIndex
TypeIndex = $typeIndex
ServiceName = $serviceName
DisplayName = $packageInfo[0].DisplayName.Trim()
Artifact = $artifact
}
}
return $orderServiceMapping
}
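# Sketch of the expected metadata shape (assumed from the property accesses
# above): each CSV row needs Package, ServiceName, DisplayName, New, Type, and
# an optional Hide flag.
#   $metadata = @([pscustomobject]@{ Package = 'azcore'; ServiceName = 'Core'; DisplayName = 'Core'; New = 'true'; Type = 'client'; Hide = '' })
#   $toc = Get-TocMapping -metadata $metadata -artifacts @('azcore')
#   $toc['azcore'].ServiceName    # 'Core'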
function GenerateDocfxTocContent([Hashtable]$tocContent, [String]$lang, [String]$campaignId = "UA-62780441-46") {
LogDebug "Start generating the docfx toc and build docfx site..."
LogDebug "Initializing Default DocFx Site..."
& $($DocFx) init -q -o "${DocOutDir}"
# The commented line below can be used instead when testing locally.
#docfx init -q -o "${DocOutDir}"
LogDebug "Copying template and configuration..."
New-Item -Path "${DocOutDir}" -Name "templates" -ItemType "directory" -Force
Copy-Item "${DocGenDir}/templates/*" -Destination "${DocOutDir}/templates" -Force -Recurse
$headerTemplateLocation = "${DocOutDir}/templates/matthews/partials/head.tmpl.partial"
if ($campaignId -and (Test-Path $headerTemplateLocation)){
$headerTemplateContent = Get-Content -Path $headerTemplateLocation -Raw
$headerTemplateContent = $headerTemplateContent -replace "GA_CAMPAIGN_ID", $campaignId
Set-Content -Path $headerTemplateLocation -Value $headerTemplateContent -NoNewline
}
Copy-Item "${DocGenDir}/docfx.json" -Destination "${DocOutDir}/" -Force
$YmlPath = "${DocOutDir}/api"
New-Item -Path $YmlPath -Name "toc.yml" -Force
$visitedService = @{}
# Sort the ToC by service name alphabetically, then order artifacts by NewIndex, TypeIndex, DisplayName, and Artifact.
$sortedToc = $tocContent.Values | Sort-Object ServiceName, NewIndex, TypeIndex, DisplayName, Artifact
foreach ($serviceMapping in $sortedToc) {
$artifact = $serviceMapping.Artifact
$serviceName = $serviceMapping.ServiceName
$displayName = $serviceMapping.DisplayName
# Handle spaces in service names, e.g. "Confidential Ledger", and "/" in service
# names, e.g. "Database for MySQL/PostgreSQL"; leaving a "/" in place would generate a bad link location.
$fileName = ($serviceName -replace '\s', '').Replace("/","").ToLower().Trim()
if ($visitedService.ContainsKey($serviceName)) {
if ($displayName) {
Add-Content -Path "$($YmlPath)/${fileName}.md" -Value "#### $artifact`n##### ($displayName)"
}
else {
Add-Content -Path "$($YmlPath)/${fileName}.md" -Value "#### $artifact"
}
}
else {
Add-Content -Path "$($YmlPath)/toc.yml" -Value "- name: ${serviceName}`r`n href: ${fileName}.md"
New-Item -Path $YmlPath -Name "${fileName}.md" -Force
if ($displayName) {
Add-Content -Path "$($YmlPath)/${fileName}.md" -Value "#### $artifact`n##### ($displayName)"
}
else {
Add-Content -Path "$($YmlPath)/${fileName}.md" -Value "#### $artifact"
}
$visitedService[$serviceName] = $true
}
}
# Generate toc homepage.
LogDebug "Creating Site Title and Navigation..."
New-Item -Path "${DocOutDir}" -Name "toc.yml" -Force
Add-Content -Path "${DocOutDir}/toc.yml" -Value "- name: Azure SDK for $lang APIs`r`n href: api/`r`n homepage: api/index.md"
LogDebug "Copying root markdowns"
Copy-Item "$($RepoRoot)/README.md" -Destination "${DocOutDir}/api/index.md" -Force
Copy-Item "$($RepoRoot)/CONTRIBUTING.md" -Destination "${DocOutDir}/api/CONTRIBUTING.md" -Force
LogDebug "Building site..."
& $($DocFx) build "${DocOutDir}/docfx.json"
# The commented line below can be used instead when testing locally.
#docfx build "${DocOutDir}/docfx.json"
Copy-Item "${DocGenDir}/assets/logo.svg" -Destination "${DocOutDir}/_site/" -Force
}
function UpdateDocIndexFiles {
Param (
[Parameter(Mandatory=$false)] [String]$appTitleLang = $Language,
[Parameter(Mandatory=$false)] [String]$lang = $Language,
[Parameter(Mandatory=$false)] [String]$packageRegex = "`"`"",
[Parameter(Mandatory=$false)] [String]$regexReplacement = ""
)
# Update docfx.json
$docfxContent = Get-Content -Path $DocfxJsonPath -Raw
$docfxContent = $docfxContent -replace "`"_appTitle`": `"`"", "`"_appTitle`": `"Azure SDK for $appTitleLang`""
$docfxContent = $docfxContent -replace "`"_appFooter`": `"`"", "`"_appFooter`": `"Azure SDK for $appTitleLang`""
Set-Content -Path $DocfxJsonPath -Value $docfxContent -NoNewline
# Update the SELECTED_LANGUAGE variable in main.js
$mainJsContent = Get-Content -Path $MainJsPath -Raw
$mainJsContent = $mainJsContent -replace "var SELECTED_LANGUAGE = ''", "var SELECTED_LANGUAGE = '$lang'"
# Update main.js package regex and replacement
$mainJsContent = $mainJsContent -replace "var PACKAGE_REGEX = ''", "var PACKAGE_REGEX = $packageRegex"
$mainJsContent = $mainJsContent -replace "var PACKAGE_REPLACEMENT = ''", "var PACKAGE_REPLACEMENT = `"$regexReplacement`""
Set-Content -Path $MainJsPath -Value $mainJsContent -NoNewline
}
if ($GetGithubIoDocIndexFn -and (Test-Path "function:$GetGithubIoDocIndexFn"))
{
&$GetGithubIoDocIndexFn
}
else
{
LogWarning "The function for 'GetGithubIoDocIndexFn' was not found.`
Make sure it is present in eng/scripts/Language-Settings.ps1 and referenced in eng/common/scripts/common.ps1.`
See https://github.com/Azure/azure-sdk-tools/blob/main/doc/common/common_engsys.md#code-structure"
}


@ -1,76 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 16.0.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.0//EN" "http://www.w3.org/TR/2001/REC-SVG-20010904/DTD/svg10.dtd">
<svg version="1.0" id="logo" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
width="46.92px" height="46.315px" viewBox="0 0 46.92 46.315" style="enable-background:new 0 0 46.92 46.315;"
xml:space="preserve">
<style type="text/css">
<![CDATA[
.st0{fill:#DDDDDD;}
]]>
</style>
<g>
<g>
<path class="st0" d="M15.297,27.558l-0.47-1.53h-2.736l-0.526,1.53H9.922l2.649-7.418h1.798l2.614,7.418H15.297z M13.453,21.807
h-0.012l-0.948,2.948h1.889L13.453,21.807z"/>
<path class="st0" d="M17.341,27.558v-1.116l2.804-3.229h-2.613v-1.15h4.47v1.173l-2.805,3.139h2.883v1.185L17.341,27.558
L17.341,27.558z"/>
<path class="st0" d="M26.249,27.558v-0.77c-0.373,0.609-0.944,0.914-1.709,0.914c-0.276,0-0.529-0.047-0.756-0.144
c-0.226-0.097-0.424-0.233-0.585-0.415c-0.165-0.178-0.293-0.389-0.387-0.637c-0.094-0.245-0.14-0.521-0.14-0.826v-3.618h1.453
v3.396c0,0.686,0.311,1.028,0.929,1.028c0.37,0,0.651-0.119,0.842-0.353c0.192-0.235,0.286-0.534,0.286-0.899v-3.173h1.443v5.495
H26.249z"/>
<path class="st0" d="M28.429,27.558v-5.495h1.363v0.658c0.12-0.186,0.243-0.332,0.373-0.435c0.131-0.105,0.264-0.186,0.403-0.241
c0.138-0.056,0.278-0.091,0.419-0.107c0.141-0.013,0.287-0.022,0.434-0.022h0.189v1.486c-0.133-0.023-0.266-0.031-0.401-0.031
c-0.887,0-1.33,0.441-1.33,1.328v2.859H28.429z"/>
<path class="st0" d="M33.212,25.189c0.021,0.418,0.142,0.749,0.361,0.996c0.22,0.244,0.509,0.366,0.867,0.366
c0.237,0,0.448-0.056,0.631-0.164c0.181-0.106,0.3-0.257,0.353-0.45h1.496c-0.172,0.565-0.467,1.001-0.895,1.307
c-0.424,0.308-0.932,0.457-1.52,0.457c-1.831,0-2.747-0.996-2.747-2.992c0-0.424,0.058-0.81,0.178-1.151
c0.12-0.344,0.293-0.637,0.52-0.883c0.228-0.245,0.506-0.433,0.832-0.563c0.328-0.131,0.703-0.194,1.129-0.194
c0.85,0,1.494,0.272,1.927,0.814c0.436,0.544,0.655,1.364,0.655,2.459L33.212,25.189L33.212,25.189z M35.502,24.272
c-0.008-0.2-0.045-0.376-0.107-0.529s-0.146-0.28-0.25-0.38c-0.104-0.102-0.225-0.175-0.36-0.225
c-0.133-0.048-0.271-0.071-0.412-0.071c-0.291,0-0.542,0.106-0.753,0.319c-0.213,0.212-0.335,0.51-0.363,0.888h2.246V24.272z"/>
<path class="st0" d="M17.923,30.788c-0.038-0.38-0.175-0.661-0.413-0.85c-0.239-0.188-0.596-0.28-1.073-0.28
c-0.879,0-1.318,0.301-1.318,0.907c0,0.216,0.089,0.396,0.274,0.542c0.183,0.144,0.474,0.266,0.877,0.359
c0.477,0.117,0.89,0.219,1.24,0.317c0.35,0.095,0.668,0.207,0.96,0.332c0.165,0.068,0.319,0.152,0.464,0.253
c0.146,0.099,0.269,0.222,0.37,0.373c0.1,0.15,0.18,0.328,0.241,0.538c0.06,0.208,0.089,0.464,0.089,0.758
c0,0.355-0.074,0.677-0.223,0.949c-0.15,0.277-0.352,0.508-0.603,0.699c-0.254,0.19-0.553,0.333-0.895,0.434
c-0.342,0.101-0.704,0.152-1.083,0.152c-1.081,0-1.892-0.207-2.436-0.628c-0.545-0.418-0.832-1.032-0.861-1.854h1.497
c0.006,0.38,0.163,0.682,0.465,0.9c0.301,0.219,0.68,0.329,1.133,0.329c0.492,0,0.866-0.089,1.117-0.269
c0.254-0.178,0.38-0.421,0.38-0.728c0-0.119-0.015-0.228-0.044-0.326c-0.031-0.103-0.088-0.193-0.174-0.274
c-0.087-0.084-0.203-0.155-0.352-0.22c-0.15-0.064-0.342-0.119-0.581-0.174c-0.566-0.117-1.058-0.237-1.469-0.361
c-0.415-0.124-0.755-0.276-1.023-0.459c-0.269-0.184-0.466-0.403-0.592-0.665c-0.127-0.261-0.19-0.591-0.19-0.995
c0-0.305,0.056-0.594,0.167-0.86c0.111-0.269,0.283-0.5,0.515-0.697c0.23-0.197,0.515-0.354,0.855-0.471
c0.34-0.115,0.739-0.173,1.201-0.173c0.447,0,0.849,0.058,1.208,0.173c0.358,0.116,0.665,0.281,0.92,0.49
c0.256,0.212,0.457,0.469,0.598,0.773c0.142,0.303,0.222,0.632,0.235,0.998h-1.476V30.788z"/>
<path class="st0" d="M20.179,36.018v-7.42h2.874c0.498,0,0.961,0.084,1.387,0.254c0.422,0.164,0.787,0.404,1.087,0.717
c0.303,0.313,0.539,0.695,0.711,1.142c0.17,0.447,0.254,0.954,0.254,1.521c0,0.565-0.075,1.08-0.228,1.549
c-0.153,0.465-0.372,0.861-0.654,1.192c-0.281,0.331-0.623,0.59-1.014,0.775c-0.398,0.183-0.834,0.271-1.312,0.271H20.179z
M22.974,34.756c0.677,0,1.17-0.209,1.476-0.632c0.306-0.419,0.459-1.052,0.459-1.894c0-0.42-0.039-0.776-0.113-1.073
c-0.074-0.298-0.194-0.542-0.361-0.733c-0.167-0.19-0.383-0.327-0.643-0.415c-0.261-0.085-0.579-0.128-0.949-0.128H21.69v4.872
h1.283V34.756z"/>
<path class="st0" d="M31.689,36.018l-2.202-3.342l-0.823,0.817v2.524h-1.53v-7.42h1.53v3.018l2.838-3.018h2.01l-2.925,2.927
l2.991,4.492L31.689,36.018L31.689,36.018z"/>
</g>
<g>
<path class="st0" d="M14.765,39.33c-2.16,0-3.927-0.006-5.563-0.016c-2.536-0.016-4.625-1.017-6.209-2.979
c-1.229-1.524-1.793-3.27-1.678-5.191c0.176-2.971,1.701-5.256,4.41-6.608c0.397-0.199,0.433-0.294,0.44-0.688
c0.028-1.434,0.518-2.787,1.452-4.025c1.007-1.33,2.369-2.237,4.05-2.699c0.903-0.25,1.907-0.292,2.953-0.128
c0.584,0.089,1.137,0.287,1.668,0.476c0.066,0.022,0.13,0.045,0.195,0.068c0.517-1.254,1.284-2.365,2.293-3.313
c1.06-1,2.309-1.791,3.708-2.346c1.645-0.653,3.423-0.951,5.268-0.884c3.103,0.11,5.763,1.16,7.917,3.121
c1.729,1.579,2.735,3.528,2.992,5.793c0.004,0.038,0.012,0.074,0.017,0.108c0.02,0.095,0.042,0.211,0.042,0.343l-0.002,0.104
c0,0.004,0,0.009,0,0.01c0.057,0.019,0.121,0.038,0.198,0.063c1.579,0.522,2.956,1.4,4.094,2.609
c1.193,1.264,1.985,2.746,2.359,4.407c0.35,1.553,0.333,3.063-0.051,4.49c-0.707,2.635-2.27,4.654-4.643,6.007
c-1.473,0.837-3.111,1.264-4.876,1.269c-1.92,0-3.837,0-5.754,0h-7.479c-1.3,0-2.603,0-3.901,0
C17.368,39.327,16.067,39.33,14.765,39.33z M13.397,17.959c-0.509,0-0.995,0.062-1.446,0.187c-1.452,0.401-2.624,1.18-3.483,2.318
c-0.799,1.055-1.214,2.199-1.238,3.403c-0.016,0.815-0.296,1.253-1.029,1.62c-2.384,1.192-3.671,3.118-3.826,5.72
c-0.099,1.649,0.389,3.152,1.446,4.462c1.375,1.702,3.188,2.574,5.389,2.587c1.632,0.008,3.398,0.01,5.556,0.01
c1.302,0,2.602,0,3.902-0.002c1.301,0,2.603-0.003,3.904-0.003h7.478c1.917,0,3.834,0,5.752,0c1.577,0,3.042-0.381,4.352-1.131
c2.119-1.203,3.51-3.007,4.143-5.356c0.337-1.258,0.353-2.6,0.043-3.98c-0.332-1.476-1.039-2.789-2.095-3.913
c-1.017-1.078-2.248-1.862-3.659-2.329c-0.078-0.025-0.146-0.046-0.203-0.064c-0.622-0.19-0.729-0.436-0.723-1.031l0.002-0.102
c0-0.024-0.009-0.082-0.021-0.133c-0.012-0.063-0.022-0.125-0.03-0.187c-0.229-2.004-1.118-3.729-2.651-5.131
c-1.959-1.788-4.397-2.743-7.237-2.845c-1.7-0.058-3.333,0.213-4.84,0.811c-1.273,0.504-2.408,1.223-3.371,2.131
c-0.954,0.897-1.667,1.955-2.117,3.143c-0.18,0.469-0.519,0.537-0.706,0.537c-0.104,0-0.21-0.019-0.335-0.062
c-0.137-0.047-0.274-0.096-0.413-0.143c-0.511-0.183-0.993-0.354-1.479-0.429C14.095,17.987,13.739,17.959,13.397,17.959z"/>
</g>
</g>
</svg>



@ -1,72 +0,0 @@
{
"metadata": [
{
"src": [
{
"files": [
"src/**.csproj"
]
}
],
"dest": "api",
"disableGitFeatures": false,
"disableDefaultFilter": false
}
],
"build": {
"content": [
{
"files": [
"api/**.yml",
"api/**.md",
"api/index.md"
]
},
{
"files": [
"toc.yml",
"*.md"
]
}
],
"resource": [
{
"files": [
"images/**"
]
}
],
"overwrite": [
{
"files": [
"apidoc/**.md"
],
"exclude": [
"obj/**",
"_site/**"
]
}
],
"dest": "_site",
"globalMetadataFiles": [],
"fileMetadataFiles": [],
"template": [
"default",
"templates/matthews"
],
"postProcessors": [],
"markdownEngineName": "markdig",
"noLangKeyword": false,
"keepFileLink": false,
"cleanupCacheHistory": false,
"disableGitFeatures": false,
"globalMetadata": {
"_appTitle": "",
"_appFooter": "",
"_enableSearch": false,
"_enableNewTab": true,
"_appFaviconPath": "https://c.s-microsoft.com/favicon.ico?v2",
"_disableContribution": true
}
}
}


@ -1,17 +0,0 @@
{{^_disableContribution}}
<div class="contribution-panel mobile-hide">
{{#docurl}}
<a href="{{docurl}}" title="{{__global.improveThisDoc}}" class="fab btn-warning pull-right"><i class="glyphicon glyphicon-pencil"></i></a>
{{/docurl}}
{{#sourceurl}}
<a href="{{sourceurl}}" title="{{__global.viewSource}}" class="fab btn-info pull-right"><i class="fa fa-code"></i></a>
{{/sourceurl}}
</div>
{{/_disableContribution}}
<div class="hidden-sm col-md-2" role="complementary">
<div class="sideaffix">
<nav class="bs-docs-sidebar hidden-print hidden-xs hidden-sm affix" id="affix">
</nav>
</div>
</div>


@ -1,100 +0,0 @@
<h1 id="{{id}}" data-uid="{{uid}}">{{>partials/title}}</h1>
<div class="markdown level0 summary">{{{summary}}}</div>
<div class="markdown level0 conceptual">{{{conceptual}}}</div>
{{#inClass}}
<div class="inheritance">
<h5>{{__global.inheritance}}</h5>
{{#inheritance}}
<div class="level{{index}}">{{{specName.0.value}}}</div>
{{/inheritance}}
<div class="level{{level}}"><span class="xref">{{name.0.value}}</span></div>
</div>
{{/inClass}}
{{#derivedClasses}}
<div class="level{{index}}">{{{specName.0.value}}}</div>
{{/derivedClasses}}
{{#inheritedMembers.0}}
<div class="inheritedMembers">
<h5>{{__global.inheritedMembers}}</h5>
{{/inheritedMembers.0}}
{{#inheritedMembers}}
<div>
{{#definition}}
<xref uid="{{definition}}" text="{{nameWithType.0.value}}" alt="{{fullName.0.value}}"/>
{{/definition}}
{{^definition}}
<xref uid="{{uid}}" text="{{nameWithType.0.value}}" alt="{{fullName.0.value}}"/>
{{/definition}}
</div>
{{/inheritedMembers}}
{{#inheritedMembers.0}}
</div>
{{/inheritedMembers.0}}
<h6><strong>{{__global.namespace}}</strong>: {{namespace}}</h6>
<h6><strong>{{__global.assembly}}</strong>: {{assemblies.0}}.dll</h6>
<h5 id="{{id}}_syntax">{{__global.syntax}}</h5>
<div class="codewrapper">
<pre><code class="lang-{{_lang}} hljs">{{syntax.content.0.value}}</code></pre>
</div>
{{#syntax.parameters.0}}
<h5 class="parameters">{{__global.parameters}}</h5>
<table>
{{/syntax.parameters.0}}
{{#syntax.parameters}}
<tr>
<td>
<span class="pull-right">{{{type.specName.0.value}}}</span>
<span class="parametername">{{{id}}}</span>
<p>{{{description}}}</p>
</td>
</tr>
{{/syntax.parameters}}
{{#syntax.parameters.0}}
</table>
{{/syntax.parameters.0}}
{{#syntax.return}}
<h5 class="returns">{{__global.returns}}</h5>
<table>
<tr>
<td>
{{{type.specName.0.value}}}
<p>{{{description}}}</p>
</td>
</tr>
</table>
{{/syntax.return}}
{{#syntax.typeParameters.0}}
<h5 class="typeParameters">{{__global.typeParameters}}</h5>
<table>
{{/syntax.typeParameters.0}}
{{#syntax.typeParameters}}
<tr>
<td>
<span class="parametername">{{{id}}}</span>
<p>{{{description}}}</p>
</td>
</tr>
{{/syntax.typeParameters}}
{{#syntax.typeParameters.0}}
</table>
{{/syntax.typeParameters.0}}
{{#remarks}}
<h5 id="{{id}}_remarks"><strong>{{__global.remarks}}</strong></h5>
<div class="markdown level0 remarks">{{{remarks}}}</div>
{{/remarks}}
{{#example.0}}
<h5 id="{{id}}_examples"><strong>{{__global.examples}}</strong></h5>
{{/example.0}}
{{#example}}
{{{.}}}
{{/example}}


@ -1,210 +0,0 @@
{{>partials/class.header}}
{{#children}}
<h3 id="{{id}}">{{>partials/classSubtitle}}</h3>
{{#children}}
{{^_disableContribution}}
{{#docurl}}
<span class="small pull-right mobile-hide">
<span class="divider"> </span>
<a href="{{docurl}}" title="{{__global.improveThisDoc}}"><i class="fa fa-pencil"></i></a>
</span>
{{/docurl}}
{{#sourceurl}}
<span class="small pull-right mobile-hide">
<a href="{{sourceurl}}" title="{{__global.viewSource}}"><i class="fa fa-code"></i></a>
</span>
{{/sourceurl}}
{{/_disableContribution}}
{{#overload}}
<a id="{{id}}" data-uid="{{uid}}"></a>
{{/overload}}
<h4 id="{{id}}" data-uid="{{uid}}"><a href="#collapsible-{{id}}" class="expander" data-toggle="collapse">{{name.0.value}}</a></h4>
<div id="collapsible-{{id}}" class="collapse in">
<div class="markdown level1 summary">{{{summary}}}</div>
<div class="markdown level1 conceptual">{{{conceptual}}}</div>
<h5 class="decalaration">{{__global.declaration}}</h5>
{{#syntax}}
<div class="codewrapper">
<pre><code class="lang-{{_lang}} hljs">{{syntax.content.0.value}}</code></pre>
</div>
{{#parameters.0}}
<h5 class="parameters">{{__global.parameters}}</h5>
<table>
{{/parameters.0}}
{{#parameters}}
<tr>
<td>
<span class="pull-right">{{{type.specName.0.value}}}</span>
<span class="parametername">{{{id}}}</span>
<p>{{{description}}}</p>
</td>
</tr>
{{/parameters}}
{{#parameters.0}}
</table>
{{/parameters.0}}
{{#return}}
<h5 class="returns">{{__global.returns}}</h5>
<table>
<tr>
<td>
{{{type.specName.0.value}}}
<p>{{{description}}}</p>
</td>
</tr>
</table>
{{/return}}
{{#typeParameters.0}}
<h5 class="typeParameters">{{__global.typeParameters}}</h5>
<table>
{{/typeParameters.0}}
{{#typeParameters}}
<tr>
<td>
<span class="parametername">{{{id}}}</span>
<p>{{{description}}}</p>
</td>
</tr>
{{/typeParameters}}
{{#typeParameters.0}}
</table>
{{/typeParameters.0}}
{{#fieldValue}}
<h5 class="fieldValue">{{__global.fieldValue}}</h5>
<table>
<tr>
<td>
{{{type.specName.0.value}}}
<p>{{{description}}}</p>
</td>
</tr>
</table>
{{/fieldValue}}
{{#propertyValue}}
<h5 class="propertyValue">{{__global.propertyValue}}</h5>
<table>
<tr>
<td>
{{{type.specName.0.value}}}
<p>{{{description}}}</p>
</td>
</tr>
</table>
{{/propertyValue}}
{{#eventType}}
<h5 class="eventType">{{__global.eventType}}</h5>
<table>
<tr>
<td>
{{{type.specName.0.value}}}
<p>{{{description}}}</p>
</td>
</tr>
</table>
{{/eventType}}
{{/syntax}}
{{#overridden}}
<h5 class="overrides">{{__global.overrides}}</h5>
<div><xref uid="{{uid}}" altProperty="fullName" displayProperty="nameWithType"/></div>
{{/overridden}}
{{#implements.0}}
<h5 class="implements">{{__global.implements}}</h5>
{{/implements.0}}
{{#implements}}
{{#definition}}
<div><xref uid="{{definition}}" altProperty="fullName" displayProperty="nameWithType"/></div>
{{/definition}}
{{^definition}}
<div><xref uid="{{uid}}" altProperty="fullName" displayProperty="nameWithType"/></div>
{{/definition}}
{{/implements}}
{{#remarks}}
<h5 id="{{id}}_remarks">{{__global.remarks}}</h5>
<div class="markdown level1 remarks">{{{remarks}}}</div>
{{/remarks}}
{{#example.0}}
<h5 id="{{id}}_examples">{{__global.examples}}</h5>
{{/example.0}}
{{#example}}
{{{.}}}
{{/example}}
{{#exceptions.0}}
<h5 class="exceptions">{{__global.exceptions}}</h5>
<table>
{{/exceptions.0}}
{{#exceptions}}
<tr>
<td>
{{{type.specName.0.value}}}
<p>{{{description}}}</p>
</td>
</tr>
{{/exceptions}}
{{#exceptions.0}}
</table>
{{/exceptions.0}}
{{#seealso.0}}
<h5 id="{{id}}_seealso">{{__global.seealso}}</h5>
<div class="seealso">
{{/seealso.0}}
{{#seealso}}
{{#isCref}}
<div>{{{type.specName.0.value}}}</div>
{{/isCref}}
{{^isCref}}
<div>{{{url}}}</div>
{{/isCref}}
{{/seealso}}
{{#seealso.0}}
</div>
{{/seealso.0}}
</div>
{{/children}}
{{/children}}
{{#extensionMethods.0}}
<h3 id="extensionmethods">{{__global.extensionMethods}}</h3>
{{/extensionMethods.0}}
{{#extensionMethods}}
<div>
{{#definition}}
<xref uid="{{definition}}" altProperty="fullName" displayProperty="nameWithType"/>
{{/definition}}
{{^definition}}
<xref uid="{{uid}}" altProperty="fullName" displayProperty="nameWithType"/>
{{/definition}}
</div>
{{/extensionMethods}}
{{#seealso.0}}
<h3 id="seealso">{{__global.seealso}}</h3>
<div class="seealso">
{{/seealso.0}}
{{#seealso}}
{{#isCref}}
<div>{{{type.specName.0.value}}}</div>
{{/isCref}}
{{^isCref}}
<div>{{{url}}}</div>
{{/isCref}}
{{/seealso}}
{{#seealso.0}}
</div>
{{/seealso.0}}


@ -1,24 +0,0 @@
{{>partials/class.header}}
{{#children}}
{{#children}}
<h4 id="{{id}}"><a href="#collapsible-{{id}}" class="expander" data-toggle="collapse">{{name.0.value}}</a></h4>
<div id="collapsible-{{id}}" class="collapse in">
<p>{{{summary}}}</p>
</div>
{{/children}}
{{/children}}
{{#extensionMethods.0}}
<h3 id="extensionmethods">{{__global.extensionMethods}}</h3>
{{/extensionMethods.0}}
{{#extensionMethods}}
<div>
{{#definition}}
<xref uid="{{definition}}" fullName="{{fullName.0.value}}" name="{{nameWithType.0.value}}"/>
{{/definition}}
{{^definition}}
<xref uid="{{uid}}" fullName="{{fullName.0.value}}" name="{{nameWithType.0.value}}"/>
{{/definition}}
</div>
{{/extensionMethods}}


@ -1,28 +0,0 @@
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
<title>{{#title}}{{title}}{{/title}}{{^title}}{{>partials/title}}{{/title}} {{#_appTitle}}| {{_appTitle}} {{/_appTitle}}</title>
<meta name="viewport" content="width=device-width">
<meta name="title" content="{{#title}}{{title}}{{/title}}{{^title}}{{>partials/title}}{{/title}} {{#_appTitle}}| {{_appTitle}} {{/_appTitle}}">
<meta name="generator" content="docfx {{_docfxVersion}}">
{{#_description}}<meta name="description" content="{{_description}}">{{/_description}}
<link rel="shortcut icon" href="{{_rel}}{{{_appFaviconPath}}}{{^_appFaviconPath}}favicon.ico{{/_appFaviconPath}}">
<link rel="stylesheet" href="{{_rel}}styles/docfx.vendor.css">
<link rel="stylesheet" href="{{_rel}}styles/docfx.css">
<link rel="stylesheet" href="{{_rel}}styles/main.css">
<meta property="docfx:navrel" content="{{_navRel}}">
<meta property="docfx:tocrel" content="{{_tocRel}}">
{{#_noindex}}<meta name="searchOption" content="noindex">{{/_noindex}}
{{#_enableSearch}}<meta property="docfx:rel" content="{{_rel}}">{{/_enableSearch}}
{{#_enableNewTab}}<meta property="docfx:newtab" content="true">{{/_enableNewTab}}
<!-- Global site tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_CAMPAIGN_ID"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'GA_CAMPAIGN_ID');
</script>
</head>


@ -1,17 +0,0 @@
<h1 id="{{id}}" data-uid="{{uid}}">{{>partials/title}}</h1>
<div class="markdown level0 summary">{{{summary}}}</div>
<div class="markdown level0 conceptual">{{{conceptual}}}</div>
<div class="markdown level0 remarks">{{{remarks}}}</div>
{{#children}}
<h3 id="{{id}}">{{>partials/namespaceSubtitle}}</h3>
<table>
{{#children}}
<tr>
<td>
<p><xref uid="{{uid}}" altProperty="fullName" displayProperty="name"/></p>
<p>{{{summary}}}</p>
</td>
</tr>
{{/children}}
</table>
{{/children}}


@ -1,311 +0,0 @@
@import url("https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.6.3/css/font-awesome.min.css");
/* Clickability fix for selector on sm devices */
@media (min-width: 768px) and (max-width: 991px) {
article h1:first-of-type:before {
height: 0;
margin-top: 0;
}
}
#search {
border: none;
}
.fa-code {
font-size: 19px;
}
.sidetoc,
body .toc,
.sidefilter,
.sidetoggle {
background-color: #f9fbe7;
}
.sidenav,
.toc-toggle {
padding: 0;
}
.sidetoggle {
padding-bottom: 15px;
}
/* Remove center align from Navbar and Collapsible section */
.collapse.in,
.collapsing {
text-align: unset;
}
article h4 {
border-bottom: none;
line-height: normal;
}
@media (min-width: 768px) {
.sidetoc, .sidefilter {
margin-left: -15px;
}
}
@media (max-width: 767px) {
.navbar-collapse {
text-align: center !important;
}
.navbar-collapse li .active {
border-radius: 20px;
}
}
/* Collapsible Sections
------------------------------------------------------- */
.expander:after {
font-family: 'Glyphicons Halflings';
content: "\e260";
margin-left: 5px;
color: grey;
font-size: small;
}
.expander.collapsed:after {
content: "\e259";
}
/* Floating buttons
------------------------------------------------------- */
.fab {
width: 40px;
height: 40px;
text-align: center;
padding: 11px 0 0 0;
border: none;
outline: none;
color: #FFF;
border-radius: 100%;
box-shadow: 0 3px 6px rgba(0,0,0,0.16), 0 3px 6px rgba(0,0,0,0.23);
transition:.3s;
}
.fab:hover {
transform: scale(1.1);
}
.fab + .fab {
margin-right: 15px;
}
.contribution-panel {
z-index: 1000;
position: fixed;
right: 30px;
top: 70px;
}
/* Bootstrap docs like sidebar
------------------------------------------------------- */
.affix h5 {
display: none;
}
/* active & hover links */
.affix ul > li > a:hover,
.affix ul > li.active > a,
.affix ul > li > a:focus {
color: #563d7c;
text-decoration: none;
background-color: transparent;
border-left-color: #563d7c;
}
/* all active links */
.affix ul > li.active > a,
.affix ul > li.active:hover > a,
.affix ul > li.active:focus >a {
font-weight: 700;
}
/* nested active links */
.affix ul ul > li.active > a,
.affix ul ul > li.active:hover > a,
.affix ul ul > li.active:focus > a {
font-weight: 500;
}
/* all links */
.affix ul > li > a {
color: #999;
border-left: 2px solid transparent;
padding: 4px 20px;
font-size: 13px;
font-weight: 400;
}
/* nested links */
.affix ul ul > li > a {
padding-top: 1px;
padding-bottom: 1px;
padding-left: 30px;
font-size: 12px;
}
/* hide inactive nested list */
.affix ul ul {
display: none;
}
/* show active nested list */
.affix ul > li.active > ul {
display: block;
}
.affix > ul > li > a:before {
content: '';
}
.affix ul ul > li > a:before {
content: '';
}
/* Style Buttons
------------------------------------------------------- */
.btn-warning {
background-color: #0071c5;
}
.btn-info {
background-color: #0071c5;
}
/* Navbar Hamburger
------------------------------------------------------- */
.icon-bar {
transition: 0.4s;
}
/* Rotate first bar */
.change .icon-bar:nth-of-type(2) {
transform: rotate(-45deg) translate(-4px, 5px) ;
}
/* Fade out the second bar */
.change .icon-bar:nth-of-type(3) {
opacity: 0;
}
/* Rotate last bar */
.change .icon-bar:nth-of-type(4) {
transform: rotate(45deg) translate(-4px, -5px) ;
}
/* Custom Navbar
------------------------------------------------------- */
.navbar-inverse {
background-color: #0071c5;
opacity: 0.95;
border-color: #0071c5;
}
.navbar-inverse .navbar-brand {
color: #ffffff;
}
.navbar-inverse .navbar-brand:hover,
.navbar-inverse .navbar-brand:focus {
color: #ecdbff;
}
.navbar-inverse .navbar-text {
color: #ffffff;
}
.navbar-inverse .navbar-nav > li > a {
color: #ffffff;
}
.navbar-inverse .navbar-nav > li > a:hover,
.navbar-inverse .navbar-nav > li > a:focus {
color: #ecdbff;
}
.navbar-inverse .navbar-nav > .active > a,
.navbar-inverse .navbar-nav > .active > a:hover,
.navbar-inverse .navbar-nav > .active > a:focus {
color: #ecdbff;
background-color: #0071c5;
}
.navbar-inverse .navbar-nav > .open > a,
.navbar-inverse .navbar-nav > .open > a:hover,
.navbar-inverse .navbar-nav > .open > a:focus {
color: #ecdbff;
background-color: #0071c5;
}
.navbar-inverse .navbar-toggle {
border-color: #0071c5;
}
.navbar-inverse .navbar-toggle:hover,
.navbar-inverse .navbar-toggle:focus {
background-color: #0071c5;
}
.navbar-inverse .navbar-toggle .icon-bar {
background-color: #ffffff;
}
.navbar-inverse .navbar-collapse,
.navbar-inverse .navbar-form {
border: none;
}
.navbar-inverse .navbar-link {
color: #ffffff;
}
.navbar-inverse .navbar-link:hover {
color: #ecdbff;
}
.versionarrow {
margin-left: 0.8em;
margin-top: -1.5em;
margin-bottom: -1em;
padding: 1em;
}
.versionarrow::before {
position: absolute;
content: '';
width: 0;
height: 0;
border: .5em solid transparent;
border-left-color: gray;
transform-origin: 0 50%;
transition: transform .1s;
margin-top: 0.2em;
}
.versionarrow.disable {
text-decoration: line-through;
}
.versionarrow.down::before {
transform: rotate(90deg);
margin-top: 0em;
transition: transform .1s;
}
@media (max-width: 767px) {
.navbar-inverse .navbar-nav .open .dropdown-menu > li > a {
color: #ffffff;
}
.navbar-inverse .navbar-nav .open .dropdown-menu > li > a:hover,
.navbar-inverse .navbar-nav .open .dropdown-menu > li > a:focus {
color: #ecdbff;
}
.navbar-inverse .navbar-nav .open .dropdown-menu > .active > a,
.navbar-inverse .navbar-nav .open .dropdown-menu > .active > a:hover,
.navbar-inverse .navbar-nav .open .dropdown-menu > .active > a:focus {
color: #ecdbff;
background-color: #0071c5;
}
}
.navbar-version-select {
padding: 2px;
border: none;
border-radius: 2px;
box-shadow: none;
-webkit-appearance: media-time-remaining-display;
margin-top: 14px;
}


@ -1,238 +0,0 @@
// Use container fluid
var containers = $(".container");
containers.removeClass("container");
containers.addClass("container-fluid");
var WINDOW_CONTENTS = window.location.href.split('/')
var SELECTED_LANGUAGE = ''
var PACKAGE_REGEX = ''
var PACKAGE_REPLACEMENT = ''
var ATTR1 = '[<span class="hljs-meta">System.ComponentModel.EditorBrowsable</span>]\n<'
// Navbar Hamburger
$(function () {
$(".navbar-toggle").click(function () {
$(this).toggleClass("change");
})
})
// Select list to replace affix on small screens
$(function () {
var navItems = $(".sideaffix .level1 > li");
if (navItems.length == 0) {
return;
}
var selector = $("<select/>");
selector.addClass("form-control visible-sm visible-xs");
var form = $("<form/>");
form.append(selector);
form.prependTo("article");
selector.change(function () {
window.location = $(this).find("option:selected").val();
})
function work(item, level) {
var link = item.children('a');
var text = link.text();
for (var i = 0; i < level; ++i) {
text = '&nbsp;&nbsp;' + text;
}
selector.append($('<option/>', {
'value': link.attr('href'),
'html': text
}));
var nested = item.children('ul');
if (nested.length > 0) {
nested.children('li').each(function () {
work($(this), level + 1);
});
}
}
navItems.each(function () {
work($(this), 0);
});
})
$(function () {
// Inject line breaks and spaces into the code sections
$(".lang-csharp").each(function () {
var text = $(this).html();
text = text.replace(/, /g, ",<br/>&#09;&#09;");
text = text.replace(ATTR1, '<');
$(this).html(text);
});
// Add text to empty links
$("p > a").each(function () {
var link = $(this).attr('href')
if ($(this).text() === "" && $(this).children().attr("src") === "") {
$(this).html(link)
}
});
})
function httpGetAsync(targetUrl, callback) {
var xmlHttp = new XMLHttpRequest();
xmlHttp.onreadystatechange = function () {
if (xmlHttp.readyState == 4 && xmlHttp.status == 200)
callback(xmlHttp.responseText);
}
xmlHttp.open("GET", targetUrl, true); // true for asynchronous
xmlHttp.send(null);
}
function httpGetAsyncFallbackOnFail(targetUrl, successCallback, failureCallback) {
var xmlHttp = new XMLHttpRequest();
xmlHttp.onreadystatechange = function () {
if (xmlHttp.readyState == 4) {
if (xmlHttp.status == 200) {
successCallback(xmlHttp.responseText);
} else {
failureCallback(xmlHttp.status)
}
}
}
xmlHttp.open("GET", targetUrl, true); // true for asynchronous
xmlHttp.send(null);
}
function populateOptions(selector, packageName) {
var versionRequestUrl = "https://azuresdkdocs.blob.core.windows.net/$web/" + SELECTED_LANGUAGE + "/" + packageName + "/versioning/versions"
httpGetAsync(versionRequestUrl, function (responseText) {
var versionselector = document.createElement("select")
var cv = WINDOW_CONTENTS[6]
versionselector.className = 'navbar-version-select'
if (responseText) {
options = responseText.match(/[^\r\n]+/g)
for (var i in options) {
$(versionselector).append('<option value="' + options[i] + '">' + options[i] + '</option>')
}
}
if(cv === 'latest')
{
$(versionselector).selectedIndex = 0
}
else {
$(versionselector).val(cv);
}
$(selector).append(versionselector)
$(versionselector).change(function () {
targetVersion = $(this).val()
url = WINDOW_CONTENTS.slice()
url[6] = targetVersion
var targetUrl = url.join('/')
httpGetAsyncFallbackOnFail(targetUrl, (unused) => window.location.href = url.join('/'),
(failureStatus) => window.location.href = getPackageUrl(SELECTED_LANGUAGE, packageName, targetVersion))
});
})
}
function httpGetLatestAsync(targetUrl, latestVersions, packageName) {
httpGetAsync(targetUrl, function (responseText) {
if (responseText) {
version = responseText.match(/[^\r\n]+/g)
$(latestVersions).append('<li><a href="' + getPackageUrl(SELECTED_LANGUAGE, packageName, version) + '" target="_blank">' + version + '</a></li>')
}
})
}
function loadedOtherVersions(url, latestVersions, publishedVersions, selector, collapsible, packageName) {
var hasAdded = function (currentVersion) {
return $(publishedVersions).children('li').filter(function() {
return $(this).text() === currentVersion
}).length || $(latestVersions).children('li').filter(function() {
return $(this).text() === currentVersion
}).length
}
httpGetAsync(url, function (responseText) {
if (responseText) {
options = responseText.match(/[^\r\n]+/g)
for (var i in options) {
if (!hasAdded(options[i])) {
$(publishedVersions).append('<li><a href="' + getPackageUrl(SELECTED_LANGUAGE, packageName, options[i]) + '" target="_blank">' + options[i] + '</a></li>')
}
}
}
else {
$(publishedVersions).append('<li>No discovered versions present in blob storage.</li>')
}
$(selector).addClass("loaded")
if ($(publishedVersions).children('li').length < 1) {
$(collapsible).remove()
}
})
}
function populateIndexList(selector, packageName) {
var url = "https://azuresdkdocs.blob.core.windows.net/$web/" + SELECTED_LANGUAGE + "/" + packageName + "/versioning/versions"
var latestGAUrl = "https://azuresdkdocs.blob.core.windows.net/$web/" + SELECTED_LANGUAGE + "/" + packageName + "/versioning/latest-ga"
var latestPreviewUrl = "https://azuresdkdocs.blob.core.windows.net/$web/" + SELECTED_LANGUAGE + "/" + packageName + "/versioning/latest-preview"
var latestVersions = document.createElement("ul")
httpGetLatestAsync(latestGAUrl, latestVersions, packageName)
httpGetLatestAsync(latestPreviewUrl, latestVersions, packageName)
var publishedVersions = $('<ul style="display: none;"></ul>')
var collapsible = $('<div class="versionarrow">&nbsp;&nbsp;&nbsp;Other versions</div>')
// Check whether the package entry has a display name tag (an h5 immediately after the header).
if ($(selector).next().is('h5')) {
$(selector).next().after(latestVersions)
} else {
$(selector).after(latestVersions)
}
$(latestVersions).after(collapsible)
$(collapsible).after(publishedVersions)
// Add collapsible arrows on versioned docs.
$(collapsible).on('click', function(event) {
event.preventDefault();
if (collapsible.hasClass('disable')) {
return
}
$(this).toggleClass('down')
if ($(this).hasClass('down')) {
if (!$(selector).hasClass('loaded')){
loadedOtherVersions(url, latestVersions, publishedVersions, selector, collapsible, packageName)
}
$(publishedVersions).show()
} else {
$(publishedVersions).hide()
}
});
}
function getPackageUrl(language, package, version) {
return "https://azuresdkdocs.blob.core.windows.net/$web/" + language + "/" + package + "/" + version + "/index.html"
}
// Populate Versions
$(function () {
if (WINDOW_CONTENTS.length < 7 && WINDOW_CONTENTS[WINDOW_CONTENTS.length - 1] != 'index.html') {
console.log("Run PopulateList")
$('h4').each(function () {
var pkgName = $(this).text().replace(PACKAGE_REGEX, PACKAGE_REPLACEMENT)
populateIndexList($(this), pkgName)
})
}
if (WINDOW_CONTENTS.length > 7) {
var pkgName = WINDOW_CONTENTS[5]
populateOptions($('#navbar'), pkgName)
}
})

View file

@@ -1,116 +0,0 @@
parameters:
- name: AdditionalParameters
type: object
- name: DependsOn
type: object
default: null
- name: CloudConfig
type: object
default: {}
- name: MatrixConfigs
type: object
default: []
- name: MatrixFilters
type: object
default: []
- name: MatrixReplace
type: object
default: {}
- name: JobTemplatePath
type: string
# Set this to false to do a full checkout for private repositories with the Azure Pipelines service connection
- name: SparseCheckout
type: boolean
default: true
- name: SparseCheckoutPaths
type: object
default: []
- name: Pool
type: string
default: azsdk-pool-mms-ubuntu-2004-general
- name: OsVmImage
type: string
default: MMSUbuntu20.04
# This parameter is only necessary if there are multiple invocations of this template within the SAME STAGE.
# When that occurs, provide a name other than the default value.
- name: GenerateJobName
type: string
default: 'generate_job_matrix'
- name: PreGenerationSteps
type: stepList
default: []
jobs:
- job: ${{ parameters.GenerateJobName }}
variables:
skipComponentGovernanceDetection: true
displayNameFilter: $[ coalesce(variables.jobMatrixFilter, '.*') ]
pool:
name: ${{ parameters.Pool }}
vmImage: ${{ parameters.OsVmImage }}
${{ if parameters.DependsOn }}:
dependsOn: ${{ parameters.DependsOn }}
steps:
# Skip sparse checkout for the `azure-sdk-for-<lang>-pr` private mirrored repositories
# as we require the GitHub service connection to be loaded.
- ${{ if and(parameters.SparseCheckout, not(contains(variables['Build.DefinitionName'], '-pr - '))) }}:
- template: /eng/common/pipelines/templates/steps/sparse-checkout.yml
parameters:
${{ if ne(length(parameters.SparseCheckoutPaths), 0) }}:
Paths: ${{ parameters.SparseCheckoutPaths }}
${{ if and(eq(length(parameters.SparseCheckoutPaths), 0), ne(parameters.AdditionalParameters.ServiceDirectory, '')) }}:
Paths:
- "sdk/${{ parameters.AdditionalParameters.ServiceDirectory }}"
- ${{ parameters.PreGenerationSteps }}
- ${{ each config in parameters.MatrixConfigs }}:
- ${{ if eq(config.GenerateVMJobs, 'true') }}:
- task: Powershell@2
inputs:
pwsh: true
filePath: eng/common/scripts/job-matrix/Create-JobMatrix.ps1
arguments: >
-ConfigPath ${{ config.Path }}
-Selection ${{ config.Selection }}
-DisplayNameFilter '$(displayNameFilter)'
-Filters '${{ join(''',''', parameters.MatrixFilters) }}','container=^$','SupportedClouds=^$|${{ parameters.CloudConfig.Cloud }}'
-Replace '${{ join(''',''', parameters.MatrixReplace) }}'
-NonSparseParameters '${{ join(''',''', config.NonSparseParameters) }}'
displayName: Generate VM Job Matrix ${{ config.Name }}
name: generate_vm_job_matrix_${{ config.Name }}
- ${{ if eq(config.GenerateContainerJobs, 'true') }}:
- task: Powershell@2
inputs:
pwsh: true
filePath: eng/common/scripts/job-matrix/Create-JobMatrix.ps1
arguments: >
-ConfigPath ${{ config.Path }}
-Selection ${{ config.Selection }}
-DisplayNameFilter '$(displayNameFilter)'
-Filters '${{ join(''',''', parameters.MatrixFilters) }}', 'container=.*', 'SupportedClouds=^$|${{ parameters.CloudConfig.Cloud }}'
-NonSparseParameters '${{ join(''',''', config.NonSparseParameters) }}'
displayName: Generate Container Job Matrix
name: generate_container_job_matrix_${{ config.Name }}
- ${{ each config in parameters.MatrixConfigs }}:
- ${{ if eq(config.GenerateVMJobs, 'true') }}:
- template: ${{ parameters.JobTemplatePath }}
parameters:
UsePlatformContainer: false
Matrix: dependencies.${{ parameters.GenerateJobName }}.outputs['generate_vm_job_matrix_${{ config.Name }}.matrix']
DependsOn: ${{ parameters.GenerateJobName }}
CloudConfig: ${{ parameters.CloudConfig }}
${{ each param in parameters.AdditionalParameters }}:
${{ param.key }}: ${{ param.value }}
- ${{ if eq(config.GenerateContainerJobs, 'true') }}:
- template: ${{ parameters.JobTemplatePath }}
parameters:
UsePlatformContainer: true
Matrix: dependencies.${{ parameters.GenerateJobName }}.outputs['generate_container_job_matrix_${{ config.Name }}.matrix']
DependsOn: ${{ parameters.GenerateJobName }}
CloudConfig: ${{ parameters.CloudConfig }}
${{ each param in parameters.AdditionalParameters }}:
${{ param.key }}: ${{ param.value }}
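For orientation, a caller would wire this matrix-generation template up roughly as in the sketch below. The template path, job template, matrix config location, and service directory are illustrative assumptions, not values taken from this change:

jobs:
- template: /eng/common/pipelines/templates/jobs/archetype-sdk-tests-generate.yml  # assumed path
  parameters:
    JobTemplatePath: /eng/pipelines/templates/jobs/ci.tests.yml  # assumed caller-owned job template
    AdditionalParameters:
      ServiceDirectory: core  # illustrative
    MatrixConfigs:
      - Name: base
        Path: eng/pipelines/platform-matrix.json  # assumed config location
        Selection: sparse
        GenerateVMJobs: true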

View file

@@ -1,71 +0,0 @@
jobs:
- job: CreateDocIndex
pool:
vmImage: windows-2019
steps:
- task: UsePythonVersion@0
displayName: 'Use Python 3.6'
inputs:
versionSpec: '3.6'
- pwsh: |
Invoke-WebRequest -Uri "https://github.com/dotnet/docfx/releases/download/v2.43.2/docfx.zip" `
-OutFile "docfx.zip" | Wait-Process; Expand-Archive -Path "docfx.zip" -DestinationPath ./docfx
echo "##vso[task.setvariable variable=docfxPath]$(Build.BinariesDirectory)/docfx/docfx.exe"
workingDirectory: $(Build.BinariesDirectory)
displayName: Download and Extract DocFX
- task: PowerShell@2
displayName: 'Generate Doc Index'
inputs:
pwsh: true
filePath: $(Build.SourcesDirectory)/eng/common/docgeneration/Generate-DocIndex.ps1
arguments: >
-Docfx $(docfxPath)
-RepoRoot $(Build.SourcesDirectory)
-DocGenDir "$(Build.SourcesDirectory)/eng/common/docgeneration"
-DocOutDir "$(Build.ArtifactStagingDirectory)/docfx_project"
-verbose
- task: UsePythonVersion@0
displayName: 'Use Python 3.6'
inputs:
versionSpec: '3.6'
- template: /eng/common/pipelines/templates/steps/mashup-doc-index.yml
parameters:
SourceDirectory: $(Build.ArtifactStagingDirectory)
- pwsh: |
Copy-Item -Path $(Build.SourcesDirectory)/eng/* -Destination ./ -Recurse -Force
echo "##vso[task.setvariable variable=toolPath]$(Build.BinariesDirectory)"
workingDirectory: $(Build.BinariesDirectory)
displayName: Move eng/common to Tool Directory
- task: PublishPipelineArtifact@0
condition: succeeded()
inputs:
artifactName: "Doc.Index"
targetPath: $(Build.ArtifactStagingDirectory)/docfx_project/_site
- pwsh: |
git checkout -b gh-pages-local --track origin/gh-pages-root -f
workingDirectory: $(Build.SourcesDirectory)
displayName: Git pull GH pages branch
- pwsh: |
Copy-Item -Path $(Build.ArtifactStagingDirectory)/docfx_project/_site/* -Destination ./ -Recurse -Force
git add -A
workingDirectory: $(Build.SourcesDirectory)
displayName: Copy the latest changes
- task: PowerShell@2
displayName: Push the Docs to GH-Pages
condition: succeeded()
inputs:
pwsh: true
workingDirectory: $(Build.SourcesDirectory)
filePath: $(toolPath)/common/scripts/git-branch-push.ps1
arguments: >
-PRBranchName "gh-pages"
-CommitMsg "Auto-generated docs from SHA(s) $(Build.SourceVersion)"
-GitUrl "https://$(azuresdk-github-pat)@github.com/$(Build.Repository.Name).git"
-PushArgs "--force"

View file

@@ -1,11 +0,0 @@
steps:
# https://github.com/actions/virtual-environments/issues/798
- script: sudo ln -sf /run/systemd/resolve/resolv.conf /etc/resolv.conf
displayName: Bypass local DNS server to work around an issue resolving cognitiveservices names
condition: |
and(
succeededOrFailed(),
contains(variables['OSVmImage'], 'ubuntu'),
eq(variables['Container'], '')
)

View file

@@ -1,14 +0,0 @@
steps:
- pwsh: |
. ./eng/common/scripts/Helpers/PSModule-Helpers.ps1
Write-Host "##vso[task.setvariable variable=CachedPSModulePath]$global:CurrentUserModulePath"
displayName: Set PS Modules Cache Directory
# Containers should bake modules into the image to save on pipeline time
condition: and(succeeded(), eq(variables['Container'], ''))
- task: Cache@2
inputs:
key: 'PSModulePath | $(CacheSalt) | $(Agent.OS) | $(Build.SourcesDirectory)/eng/common/scripts/Import-AzModules.ps1'
path: $(CachedPSModulePath)
displayName: Cache PS Modules
# Containers should bake modules into the image to save on pipeline time
condition: and(succeeded(), eq(variables['Container'], ''))

View file

@@ -1,29 +0,0 @@
# Checks spelling of files that changed between the current state of the repo
# and some ref (branch, tag, etc.) or commit hash. Only runs on PRs.
# ContinueOnError - true: Pipeline warns on spelling error
# false: Pipeline fails on spelling error
# TargetBranch - Target ref (e.g. main) to diff against when building the changed-file list.
# CspellConfigPath - Path to cspell.json config location
parameters:
ContinueOnError: true
CspellConfigPath: ./.vscode/cspell.json
steps:
- ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
- task: NodeTool@0
inputs:
versionSpec: 16.x
displayName: Use Node.js 16.x
- task: PowerShell@2
displayName: Check spelling (cspell)
continueOnError: ${{ parameters.ContinueOnError }}
inputs:
targetType: filePath
filePath: eng/common/scripts/check-spelling-in-changed-files.ps1
arguments: >-
-CspellConfigPath ${{ parameters.CspellConfigPath }}
-ExitWithError:(!$${{ parameters.ContinueOnError }})
pwsh: true
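A consuming pipeline would include the spelling check roughly as follows; the include path is an assumption, since file names are not shown in this listing:

steps:
- template: /eng/common/pipelines/templates/steps/check-spelling.yml  # assumed path
  parameters:
    ContinueOnError: false  # fail the pipeline on spelling errors instead of warning
    CspellConfigPath: ./.vscode/cspell.json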

View file

@@ -1,36 +0,0 @@
parameters:
EmulatorMsiUrl: "https://aka.ms/cosmosdb-emulator"
StartParameters: ''
steps:
- task: Powershell@2
inputs:
filePath: $(Build.SourcesDirectory)/eng/common/scripts/Cosmos-Emulator.ps1
arguments: >
-EmulatorMsiUrl "${{ parameters.EmulatorMsiUrl }}"
-StartParameters "${{ parameters.StartParameters }}"
-Stage "Install"
pwsh: true
displayName: Install Public Cosmos DB Emulator
- task: Powershell@2
inputs:
filePath: $(Build.SourcesDirectory)/eng/common/scripts/Cosmos-Emulator.ps1
arguments: >
-EmulatorMsiUrl "${{ parameters.EmulatorMsiUrl }}"
-StartParameters "${{ parameters.StartParameters }}"
-Stage "Launch"
pwsh: true
displayName: Launch Public Cosmos DB Emulator
continueOnError: true
- task: Powershell@2
inputs:
filePath: $(Build.SourcesDirectory)/eng/common/scripts/Cosmos-Emulator.ps1
arguments: >
-EmulatorMsiUrl "${{ parameters.EmulatorMsiUrl }}"
-StartParameters "${{ parameters.StartParameters }}"
-Stage "Launch"
pwsh: true
displayName: Retry Launch of Public Cosmos DB Emulator
condition: failed()
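A minimal usage sketch, assuming a template path and some illustrative emulator start flags:

steps:
- template: /eng/common/pipelines/templates/steps/cosmos-emulator.yml  # assumed path
  parameters:
    StartParameters: '/NoExplorer /DisableRateLimiting'  # illustrative flags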

View file

@@ -1,27 +0,0 @@
parameters:
ArtifactPath: $(Build.ArtifactStagingDirectory)
Artifacts: []
ConfigFileDir: $(Build.ArtifactStagingDirectory)/PackageInfo
steps:
# Ideally this should be done as an initial step of a job in the caller template.
# We can remove this step once it has been added to the callers.
- template: /eng/common/pipelines/templates/steps/set-default-branch.yml
- ${{ each artifact in parameters.Artifacts }}:
- task: Powershell@2
inputs:
filePath: $(Build.SourcesDirectory)/eng/common/scripts/Create-APIReview.ps1
arguments: >
-ArtifactPath ${{parameters.ArtifactPath}}
-APIViewUri $(azuresdk-apiview-uri)
-APIKey $(azuresdk-apiview-apikey)
-APILabel "Auto Review - $(Build.SourceVersion)"
-PackageName ${{artifact.name}}
-SourceBranch $(Build.SourceBranchName)
-DefaultBranch $(DefaultBranch)
-ConfigFileDir '${{parameters.ConfigFileDir}}'
pwsh: true
workingDirectory: $(Pipeline.Workspace)
displayName: Create API Review for ${{ artifact.name}}
condition: and(succeededOrFailed(), ne(variables['Skip.CreateApiReview'], 'true') , ne(variables['Build.Reason'],'PullRequest'), eq(variables['System.TeamProject'], 'internal'))

View file

@@ -1,59 +0,0 @@
# Expects azuresdk-github-pat is set to the PAT for azure-sdk
# Expects the buildtools to be cloned
parameters:
BaseBranchName: $(Build.SourceBranch)
PRBranchName: not-specified
PROwner: azure-sdk
CommitMsg: not-specified
RepoOwner: Azure
RepoName: $(Build.Repository.Name)
PushArgs:
WorkingDirectory: $(System.DefaultWorkingDirectory)
PRTitle: not-specified
PRBody: ''
ScriptDirectory: eng/common/scripts
GHReviewersVariable: ''
GHTeamReviewersVariable: ''
GHAssignessVariable: ''
# Multiple labels separated by comma, e.g. "bug, APIView"
PRLabels: ''
SkipCheckingForChanges: false
CloseAfterOpenForTesting: false
OpenAsDraft: false
steps:
- template: /eng/common/pipelines/templates/steps/git-push-changes.yml
parameters:
BaseRepoBranch: ${{ parameters.PRBranchName }}
BaseRepoOwner: ${{ parameters.PROwner }}
CommitMsg: ${{ parameters.CommitMsg }}
TargetRepoOwner: ${{ parameters.RepoOwner }}
TargetRepoName: ${{ parameters.RepoName }}
PushArgs: ${{ parameters.PushArgs }}
WorkingDirectory: ${{ parameters.WorkingDirectory }}
ScriptDirectory: ${{ parameters.ScriptDirectory }}
SkipCheckingForChanges: ${{ parameters.SkipCheckingForChanges }}
- task: PowerShell@2
displayName: Create pull request
condition: and(succeeded(), eq(variables['HasChanges'], 'true'))
inputs:
pwsh: true
workingDirectory: ${{ parameters.WorkingDirectory }}
filePath: ${{ parameters.ScriptDirectory }}/Submit-PullRequest.ps1
arguments: >
-RepoOwner "${{ parameters.RepoOwner }}"
-RepoName "$(RepoNameWithoutOwner)"
-BaseBranch "${{ parameters.BaseBranchName }}"
-PROwner "${{ parameters.PROwner }}"
-PRBranch "${{ parameters.PRBranchName }}"
-AuthToken "$(azuresdk-github-pat)"
-PRTitle "${{ parameters.PRTitle }}"
-PRBody "${{ coalesce(parameters.PRBody, parameters.CommitMsg, parameters.PRTitle) }}"
-PRLabels "${{ parameters.PRLabels }}"
-UserReviewers "$(${{ parameters.GHReviewersVariable }})"
-TeamReviewers "$(${{ parameters.GHTeamReviewersVariable }})"
-Assignees "$(${{ parameters.GHAssignessVariable }})"
-CloseAfterOpenForTesting $${{ coalesce(parameters.CloseAfterOpenForTesting, 'false') }}
-OpenAsDraft $${{ parameters.OpenAsDraft }}
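A hedged sketch of a caller; the template path, branch, and label values below are illustrative assumptions:

steps:
- template: /eng/common/pipelines/templates/steps/create-pull-request.yml  # assumed path
  parameters:
    PRBranchName: sync-tooling-update  # illustrative
    CommitMsg: "Update generated files"
    PRTitle: "Update generated files"
    PRLabels: "auto-merge"  # illustrative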

View file

@@ -1,23 +0,0 @@
parameters:
ArtifactLocation: 'not-specified'
PackageRepository: 'not-specified'
ReleaseSha: 'not-specified'
RepoId: $(Build.Repository.Name)
WorkingDirectory: ''
ScriptDirectory: eng/common/scripts
steps:
- task: PowerShell@2
displayName: 'Verify Package Tags and Create Git Releases'
inputs:
filePath: ${{ parameters.ScriptDirectory }}/create-tags-and-git-release.ps1
arguments: >
-artifactLocation ${{ parameters.ArtifactLocation }}
-packageRepository ${{ parameters.PackageRepository }}
-releaseSha ${{ parameters.ReleaseSha }}
-repoId ${{ parameters.RepoId }}
-workingDirectory '${{ parameters.WorkingDirectory }}'
pwsh: true
timeoutInMinutes: 5
env:
GH_TOKEN: $(azuresdk-github-pat)

View file

@@ -1,53 +0,0 @@
# cSpell:ignore changedfiles
# cSpell:ignore credscan
# cSpell:ignore securedevelopmentteam
# cSpell:ignore postanalysis
parameters:
SuppressionFilePath: 'eng/CredScanSuppression.json'
BaselineFilePath: ''
SourceDirectory: $(Build.SourcesDirectory)
ServiceDirectory: ''
steps:
- pwsh: |
if ("$(Build.Reason)" -eq 'PullRequest') {
$changedFiles = & "eng/common/scripts/get-changedfiles.ps1"
$changedFiles | ForEach-Object { Add-Content -Path "${{ parameters.SourceDirectory }}/credscan.tsv" -Value "${{ parameters.SourceDirectory }}/$_"}
}
else {
$scanFolder = ""
if ("${{ parameters.ServiceDirectory }}" -ne '') {
$scanFolder = "sdk/${{ parameters.ServiceDirectory }}"
}
Set-Content "${{ parameters.SourceDirectory }}/credscan.tsv" -Value "${{ parameters.SourceDirectory }}/$scanFolder"
}
if(Test-Path "${{ parameters.SourceDirectory }}/credscan.tsv") {
Get-Content "${{ parameters.SourceDirectory }}/credscan.tsv"
}
else {
Write-Host "##vso[task.setvariable variable=SKIP_CREDSCAN]true"
}
displayName: CredScan setup
- task: securedevelopmentteam.vss-secure-development-tools.build-task-credscan.CredScan@3
displayName: CredScan running
condition: and(succeededOrFailed(), ne(variables['SKIP_CREDSCAN'], true))
inputs:
toolVersion: 2.2.7.8
scanFolder: "${{ parameters.SourceDirectory }}/credscan.tsv"
suppressionsFile: ${{ parameters.SuppressionFilePath }}
- task: securedevelopmentteam.vss-secure-development-tools.build-task-postanalysis.PostAnalysis@2
displayName: CredScan result analysis
condition: and(succeededOrFailed(), ne(variables['SKIP_CREDSCAN'], true))
inputs:
GdnBreakBaselineFiles: ${{ parameters.BaselineFilePath }}
GdnBreakAllTools: false
GdnBreakGdnToolCredScan: true
GdnBreakGdnToolCredScanSeverity: Error
GdnBreakBaselines: baseline
# Used for generating baseline file.
# GdnBreakOutputBaselineFile: baseline
# GdnBreakOutputBaseline: baseline
- pwsh: |
Write-Host "Please check https://aka.ms/azsdk/credscan for more information about the cred scan failure."
displayName: CredScan troubleshooting guide
condition: and(failed(), ne(variables['SKIP_CREDSCAN'], true))
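Scoping the scan to a single service might look like the sketch below (assumed template path; "core" is an illustrative service, scanned only outside of PR builds per the setup logic above):

steps:
- template: /eng/common/pipelines/templates/steps/credscan.yml  # assumed path
  parameters:
    ServiceDirectory: core  # illustrative; scans sdk/core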

View file

@@ -1,24 +0,0 @@
# This script fragment is used across our repos to set a variable "SetDevVersion" which
# is used when this pipeline is going to be generating and publishing daily dev builds.
parameters:
ServiceDirectory: ''
steps:
- ${{if ne(parameters.ServiceDirectory, '')}}:
- task: Powershell@2
inputs:
filePath: $(Build.SourcesDirectory)/eng/common/scripts/Save-Package-Properties.ps1
arguments: >
-ServiceDirectory ${{parameters.ServiceDirectory}}
-OutDirectory $(Build.ArtifactStagingDirectory)/PackageInfo
pwsh: true
workingDirectory: $(Pipeline.Workspace)
displayName: Dump Package properties
condition: succeeded()
- pwsh: |
$setDailyDevBuild = "false"
if (('$(Build.Reason)' -eq 'Schedule') -and ('$(System.TeamProject)' -eq 'internal')) {
$setDailyDevBuild = "true"
}
echo "##vso[task.setvariable variable=SetDevVersion]$setDailyDevBuild"
displayName: "Setup Versioning Properties"
condition: and(succeeded(), eq(variables['SetDevVersion'], ''))

View file

@@ -1,25 +0,0 @@
parameters:
ArtifactPath: $(Build.ArtifactStagingDirectory)
Artifacts: []
steps:
- pwsh: |
$apiChangeDetectRequestUrl = "https://apiview.dev/PullRequest/DetectApiChanges"
echo "##vso[task.setvariable variable=ApiChangeDetectRequestUrl]$apiChangeDetectRequestUrl"
displayName: "Set API change detect request URL"
condition: eq(variables['ApiChangeDetectRequestUrl'], '')
- task: Powershell@2
inputs:
filePath: $(Build.SourcesDirectory)/eng/common/scripts/Detect-Api-Changes.ps1
arguments: >
-ArtifactList ('${{ convertToJson(parameters.Artifacts) }}' | ConvertFrom-Json | Select-Object Name)
-ArtifactPath ${{parameters.ArtifactPath}}
-CommitSha '$(Build.SourceVersion)'
-BuildId $(Build.BuildId)
-PullRequestNumber $(System.PullRequest.PullRequestNumber)
-RepoFullName $(Build.Repository.Name)
-APIViewUri $(ApiChangeDetectRequestUrl)
pwsh: true
displayName: Detect API changes
condition: and(succeededOrFailed(), eq(variables['Build.Reason'],'PullRequest'))

View file

@@ -1,20 +0,0 @@
parameters:
- name: Variables
type: object
default: []
- name: ContinueOnError
type: boolean
default: false
steps:
- pwsh: |
$rawVariables = @"
${{ convertToJson(parameters.Variables) }}
"@
$variables = ConvertFrom-Json $rawVariables -AsHashtable
foreach ($key in $variables.Keys) {
Write-Host "Clearing: $key"
Write-Host "##vso[task.setvariable variable=$key]"
}
continueOnError: ${{ parameters.ContinueOnError }}
displayName: Clear DevOps Variables

View file

@@ -1,21 +0,0 @@
parameters:
- name: Variables
type: object
default: []
- name: ContinueOnError
type: boolean
default: false
steps:
- pwsh: |
$rawVariables = @"
${{ convertToJson(parameters.Variables) }}
"@
$variables = ConvertFrom-Json $rawVariables -AsHashtable
foreach ($key in $variables.Keys) {
$value = $variables[$key]
Write-Host "Setting: $key = $value"
Write-Host "##vso[task.setvariable variable=$key]$value"
}
continueOnError: ${{ parameters.ContinueOnError }}
displayName: Set DevOps Variables
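A usage sketch for the variable-setting template above; the path and the key/value pairs are illustrative assumptions:

steps:
- template: /eng/common/pipelines/templates/steps/set-devops-variables.yml  # assumed path
  parameters:
    Variables:
      SetDevVersion: true  # illustrative
      Skip.CreateApiReview: true  # illustrative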

View file

@@ -1,15 +0,0 @@
parameters:
- name: ContainerRegistryClientId
type: string
- name: ContainerRegistryClientSecret
type: string
- name: ImageId
type: string
steps:
- pwsh: |
$containerRegistry = ("${{parameters.ImageId}}" -split "\/")[0]
docker login $containerRegistry -u "${{ parameters.ContainerRegistryClientId }}" -p "${{ parameters.ContainerRegistryClientSecret }}"
displayName: Login container registry
- pwsh: |
docker pull '${{ parameters.ImageId}}'
displayName: Pull docker image ${{ parameters.ImageId }}
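This pull template is consumed later in this listing with registry credentials drawn from pipeline secrets; a sketch with an illustrative image id:

steps:
- template: /eng/common/pipelines/templates/steps/docker-pull-image.yml
  parameters:
    ContainerRegistryClientId: $(azuresdkimages-cr-clientid)
    ContainerRegistryClientSecret: $(azuresdkimages-cr-clientsecret)
    ImageId: 'azuresdkimages.azurecr.io/docvalidation:latest'  # illustrative image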

View file

@@ -1,119 +0,0 @@
# intended to be used as part of a release process
parameters:
- name: ArtifactLocation
type: string
default: 'not-specified'
- name: PackageRepository
type: string
default: 'not-specified'
- name: ReleaseSha
type: string
default: 'not-specified'
- name: RepoId
type: string
default: $(Build.Repository.Name)
- name: WorkingDirectory
type: string
default: ''
- name: ScriptDirectory
type: string
default: eng/common/scripts
- name: TargetDocRepoName
type: string
default: ''
- name: TargetDocRepoOwner
type: string
default: ''
- name: PRBranchName
type: string
default: 'main-rdme'
- name: PRLabels
type: string
default: 'auto-merge'
- name: ArtifactName
type: string
default: ''
- name: Language
type: string
default: ''
- name: DocRepoDestinationPath
type: string
default: '' #usually docs-ref-services/
- name: CIConfigs
type: string
default: '[]'
- name: GHReviewersVariable
type: string
default: ''
- name: GHTeamReviewersVariable
type: string
default: '' # externally set, as eng-common does not have the identity-resolver. Run as pre-step
- name: OnboardingBranch
type: string
default: ''
- name: CloseAfterOpenForTesting
type: boolean
default: false
- name: SkipPackageJson
type: object
default: false
- name: SparseCheckoutPaths
type: object
default: null
steps:
- pwsh: |
if ($IsWindows) {
REG ADD HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem /f /v LongPathsEnabled /t REG_DWORD /d 1
git config --system core.longpaths true
}
else {
Write-Host "This script is not executing on Windows, skipping registry modification."
}
displayName: Enable Long Paths if Necessary
- ${{ if not(parameters.SparseCheckoutPaths) }}:
- pwsh: |
git clone https://github.com/${{ parameters.TargetDocRepoOwner }}/${{ parameters.TargetDocRepoName }} ${{ parameters.WorkingDirectory }}/repo
displayName: Clone Documentation Repository
ignoreLASTEXITCODE: false
- ${{ if parameters.SparseCheckoutPaths }}:
- template: /eng/common/pipelines/templates/steps/sparse-checkout.yml
parameters:
SkipDefaultCheckout: true
Repositories:
- Name: ${{ parameters.TargetDocRepoOwner }}/${{ parameters.TargetDocRepoName }}
WorkingDirectory: ${{ parameters.WorkingDirectory }}/repo
Paths: ${{ parameters.SparseCheckoutPaths }}
- template: /eng/common/pipelines/templates/steps/set-default-branch.yml
parameters:
WorkingDirectory: ${{ parameters.WorkingDirectory }}/repo
- task: PowerShell@2
displayName: 'Apply Documentation Updates From Artifact'
inputs:
targetType: filePath
filePath: ${{ parameters.ScriptDirectory }}/update-docs-metadata.ps1
arguments: >
-ArtifactLocation ${{ parameters.ArtifactLocation }}
-Repository ${{ parameters.PackageRepository }}
-ReleaseSHA ${{ parameters.ReleaseSha }}
-RepoId ${{ parameters.RepoId }}
-WorkDirectory "${{ parameters.WorkingDirectory }}"
-DocRepoLocation "${{ parameters.WorkingDirectory }}/repo"
-Language "${{parameters.Language}}"
-Configs "${{ parameters.CIConfigs }}"
pwsh: true
env:
GH_TOKEN: $(azuresdk-github-pat)
- template: /eng/common/pipelines/templates/steps/git-push-changes.yml
parameters:
BaseRepoBranch: $(DefaultBranch)
BaseRepoOwner: ${{ parameters.TargetDocRepoOwner }}
CommitMsg: "Update docs metadata and targeting for release of ${{ parameters.ArtifactName }}"
TargetRepoName: ${{ parameters.TargetDocRepoName }}
TargetRepoOwner: ${{ parameters.TargetDocRepoOwner }}
WorkingDirectory: ${{ parameters.WorkingDirectory }}/repo
ScriptDirectory: ${{ parameters.WorkingDirectory }}/${{ parameters.ScriptDirectory }}

View file

@@ -1,10 +0,0 @@
steps:
- pwsh: |
if ($IsWindows) {
REG ADD HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem /f /v LongPathsEnabled /t REG_DWORD /d 1
git config --system core.longpaths true
}
else {
Write-Host "This script is not executing on Windows, skipping registry modification."
}
displayName: Enable long path support if necessary

View file

@@ -1,23 +0,0 @@
# cSpell:ignore changedfiles
# cSpell:ignore Committish
# cSpell:ignore LASTEXITCODE
steps:
- template: /eng/common/pipelines/templates/steps/set-default-branch.yml
- ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
- pwsh: |
# Find the default branch of the repo. The variable value is set in the build step.
Write-Host "Default Branch: $(DefaultBranch)"
if ((!"$(System.PullRequest.SourceBranch)".StartsWith("sync-eng/common")) -and "$(System.PullRequest.TargetBranch)" -match "^(refs/heads/)?$(DefaultBranch)$")
{
$filesInCommonDir = & "eng/common/scripts/get-changedfiles.ps1" -DiffPath 'eng/common/*'
if (($LASTEXITCODE -eq 0) -and ($filesInCommonDir.Count -gt 0))
{
Write-Host "##vso[task.LogIssue type=error;]Changes to files under 'eng/common' directory should not be made in this Repo`n${filesInCommonDir}"
Write-Host "##vso[task.LogIssue type=error;]Please follow workflow at https://github.com/Azure/azure-sdk-tools/blob/main/doc/common/common_engsys.md"
exit 1
}
}
displayName: Prevent changes to eng/common outside of azure-sdk-tools repo
condition: and(succeeded(), ne(variables['Skip.EngCommonWorkflowEnforcer'], 'true'), not(endsWith(variables['Build.Repository.Name'], '-pr')))

View file

@@ -1,54 +0,0 @@
parameters:
BaseRepoBranch: not-specified
BaseRepoOwner: azure-sdk
CommitMsg: not-specified
TargetRepoOwner: Azure
TargetRepoName: $(Build.Repository.Name)
PushArgs:
WorkingDirectory: $(System.DefaultWorkingDirectory)
ScriptDirectory: eng/common/scripts
SkipCheckingForChanges: false
steps:
- pwsh: |
echo "git add -A"
git add -A
echo "git diff --name-status --cached --exit-code"
git diff --name-status --cached --exit-code
if ($LastExitCode -ne 0) {
echo "##vso[task.setvariable variable=HasChanges]$true"
echo "Changes detected so setting HasChanges=true"
}
else {
echo "##vso[task.setvariable variable=HasChanges]$false"
echo "No changes so skipping code push"
}
displayName: Check for changes
condition: and(succeeded(), eq(${{ parameters.SkipCheckingForChanges }}, false))
workingDirectory: ${{ parameters.WorkingDirectory }}
ignoreLASTEXITCODE: true
- pwsh: |
# Remove the repo owner from the front of the repo name if it exists there
$repoName = "${{ parameters.TargetRepoName }}" -replace "^${{ parameters.TargetRepoOwner }}/", ""
echo "##vso[task.setvariable variable=RepoNameWithoutOwner]$repoName"
echo "RepoName = $repoName"
displayName: Remove Repo Owner from Repo Name
condition: succeeded()
workingDirectory: ${{ parameters.WorkingDirectory }}
- task: PowerShell@2
displayName: Push changes
condition: and(succeeded(), eq(variables['HasChanges'], 'true'))
inputs:
pwsh: true
workingDirectory: ${{ parameters.WorkingDirectory }}
filePath: ${{ parameters.ScriptDirectory }}/git-branch-push.ps1
arguments: >
-PRBranchName "${{ parameters.BaseRepoBranch }}"
-CommitMsg "${{ parameters.CommitMsg }}"
-GitUrl "https://$(azuresdk-github-pat)@github.com/${{ parameters.BaseRepoOwner }}/$(RepoNameWithoutOwner).git"
-PushArgs "${{ parameters.PushArgs }}"
-SkipCommit $${{ parameters.SkipCheckingForChanges }}

View file

@@ -1,16 +0,0 @@
parameters:
ToolPath: $(Pipeline.Workspace)/pipeline-generator
steps:
- script: >
mkdir pipeline-generator
workingDirectory: $(Pipeline.Workspace)
displayName: Setup working directory for pipeline generator.
- script: >
dotnet tool install
Azure.Sdk.Tools.PipelineGenerator
--version 1.0.2-dev.20220504.1
--add-source https://pkgs.dev.azure.com/azure-sdk/public/_packaging/azure-sdk-for-net/nuget/v3/index.json
--tool-path ${{parameters.ToolPath}}
workingDirectory: $(Pipeline.Workspace)/pipeline-generator
displayName: 'Install pipeline generator tool'

View file

@@ -1,81 +0,0 @@
parameters:
SourceDirectory: ''
steps:
- task: PythonScript@0
displayName: MashUp Generated Index Site so it's served from the default site location
inputs:
scriptSource: inline
script: |
import argparse
import os
import logging
import re
import shutil
from io import open
SITE_INDEX = r'${{ parameters.SourceDirectory }}\docfx_project\_site'
TOC_HTML_REGEX = r"\.\./toc.html"
NAV_TOC_HTML_REGEX = r"api/"
PREV_DIR_REGEX = r"\.\./"
def locate_htmlfiles(directory):
html_set = []
for root, dirs, files in os.walk(directory):
for file in files:
html_set.append(os.path.join(root, file))
return html_set
def process_html(content):
content = re.sub(TOC_HTML_REGEX, 'navtoc.html', content)
content = re.sub(PREV_DIR_REGEX, '', content)
return content
def process_navtoc(content):
content = re.sub(NAV_TOC_HTML_REGEX, '', content)
return content
if __name__ == "__main__":
html_files = locate_htmlfiles(os.path.join(SITE_INDEX, 'api'))
navtoc_location = os.path.join(SITE_INDEX, 'toc.html')
# Process the main toc.html and rename it to navtoc.html
try:
logging.info(
"Process {}.".format(navtoc_location)
)
with open(navtoc_location, "r", encoding="utf8") as navtoc_stream:
navtoc_content = navtoc_stream.read()
new_navtoc_content = process_navtoc(navtoc_content)
logging.info("Process {}.".format(navtoc_content))
with open(navtoc_location, "w", encoding="utf8") as html_stream:
html_stream.write(new_navtoc_content)
except Exception as e:
logging.error(e)
exit(1)
# Rename main toc.html to navtoc.html
os.rename(navtoc_location, os.path.join(SITE_INDEX, 'navtoc.html'))
# Process all html in api directory
for html_location in html_files:
try:
logging.info(
"Process {}.".format(html_location)
)
with open(html_location, "r", encoding="utf8") as html_stream:
html_content = html_stream.read()
new_content = process_html(html_content)
logging.info("Process {}.".format(html_location))
with open(html_location, "w", encoding="utf8") as html_stream:
html_stream.write(new_content)
except Exception as e:
logging.error(e)
exit(1)
# Move all files from api to main site home directory
for html_location in html_files:
shutil.copy(html_location, SITE_INDEX)
# Delete API Directory
shutil.rmtree(os.path.join(SITE_INDEX, 'api'))

View file

@@ -1,36 +0,0 @@
parameters:
ExclusionDataBaseFileName: ''
TargetDirectory: ''
PublishAnalysisLogs: false
PoliCheckBlobSAS: "$(azuresdk-policheck-blob-SAS)"
ExclusionFilePath: "$(Build.SourcesDirectory)/eng/guardian-tools/policheck/PolicheckExclusions.xml"
steps:
- pwsh: |
azcopy copy "https://azuresdkartifacts.blob.core.windows.net/policheck/${{ parameters.ExclusionDataBaseFileName }}.mdb?${{ parameters.PoliCheckBlobSAS }}" `
"$(Build.BinariesDirectory)"
displayName: 'Download PoliCheck Exclusion Database'
- task: securedevelopmentteam.vss-secure-development-tools.build-task-policheck.PoliCheck@2
displayName: 'Run PoliCheck'
inputs:
targetType: F
targetArgument: "$(Build.SourcesDirectory)/${{ parameters.TargetDirectory }}"
result: PoliCheck.sarif
optionsFC: 0
optionsXS: 1
optionsPE: 1|2|3|4
optionsRulesDBPath: "$(Build.BinariesDirectory)/${{ parameters.ExclusionDataBaseFileName }}.mdb"
optionsUEPATH: ${{ parameters.ExclusionFilePath }}
- task: securedevelopmentteam.vss-secure-development-tools.build-task-postanalysis.PostAnalysis@2
displayName: 'Post Analysis (PoliCheck)'
inputs:
GdnBreakAllTools: false
GdnBreakGdnToolPoliCheck: true
GdnBreakGdnToolPoliCheckSeverity: Warning
continueOnError: true
- ${{ if eq(parameters.PublishAnalysisLogs, 'true') }}:
- task: securedevelopmentteam.vss-secure-development-tools.build-task-publishsecurityanalysislogs.PublishSecurityAnalysisLogs@3
displayName: 'Publish Security Analysis Logs'

View file

@@ -1,187 +0,0 @@
parameters:
- name: Repository
type: string
default: $(Build.Repository.Name)
- name: Prefix
type: string
- name: CIConventionOptions
type: string
default: ''
- name: UPConventionOptions
type: string
default: ''
- name: TestsConventionOptions
type: string
default: ''
- name: GenerateUnifiedWeekly
type: boolean
default: false
steps:
- template: install-pipeline-generation.yml
- template: /eng/common/pipelines/templates/steps/set-default-branch.yml
# This covers our public repos.
- ${{ if not(endsWith(parameters.Repository, '-pr'))}}:
- script: >
$(Pipeline.Workspace)/pipeline-generator/pipeline-generator
--organization https://dev.azure.com/azure-sdk
--project public
--prefix ${{parameters.Prefix}}
--devopspath "\${{parameters.Prefix}}"
--path $(System.DefaultWorkingDirectory)/sdk
--endpoint Azure
--repository ${{parameters.Repository}}
--convention ci
--agentpool Hosted
--branch refs/heads/$(DefaultBranch)
--patvar PATVAR
--set-managed-variables
--debug
${{parameters.CIConventionOptions}}
displayName: Create CI Pipelines for Public Repository
env:
PATVAR: $(azuresdk-azure-sdk-devops-pipeline-generation-pat)
- script: >
$(Pipeline.Workspace)/pipeline-generator/pipeline-generator
--organization https://dev.azure.com/azure-sdk
--project internal
--prefix ${{parameters.Prefix}}
--devopspath "\${{parameters.Prefix}}"
--path $(System.DefaultWorkingDirectory)/sdk
--endpoint Azure
--repository ${{parameters.Repository}}
--convention up
--agentpool Hosted
--branch refs/heads/$(DefaultBranch)
--patvar PATVAR
--set-managed-variables
--debug
${{parameters.UPConventionOptions}}
displayName: Create UP Pipelines for Public Repository
env:
PATVAR: $(azuresdk-azure-sdk-devops-pipeline-generation-pat)
- script: >
$(Pipeline.Workspace)/pipeline-generator/pipeline-generator
--organization https://dev.azure.com/azure-sdk
--project internal
--prefix ${{parameters.Prefix}}
--devopspath "\${{parameters.Prefix}}"
--path $(System.DefaultWorkingDirectory)/sdk
--endpoint Azure
--repository ${{parameters.Repository}}
--convention tests
--agentpool Hosted
--branch refs/heads/$(DefaultBranch)
--patvar PATVAR
--set-managed-variables
--debug
${{parameters.TestsConventionOptions}}
displayName: Create Live Test Pipelines for Public Repository
condition: and(succeeded(), ne('${{parameters.TestsConventionOptions}}',''))
env:
PATVAR: $(azuresdk-azure-sdk-devops-pipeline-generation-pat)
- script: >
$(Pipeline.Workspace)/pipeline-generator/pipeline-generator
--organization https://dev.azure.com/azure-sdk
--project internal
--prefix ${{parameters.Prefix}}
--devopspath "\${{parameters.Prefix}}"
--path $(System.DefaultWorkingDirectory)/sdk
--endpoint Azure
--repository ${{parameters.Repository}}
--convention testsweekly
--agentpool Hosted
--branch refs/heads/$(DefaultBranch)
--patvar PATVAR
--set-managed-variables
--debug
${{parameters.TestsConventionOptions}}
displayName: Create Weekly (Multi-Cloud) Live Test Pipelines for Public Repository
condition: and(succeeded(), ne('${{parameters.TestsConventionOptions}}',''))
env:
PATVAR: $(azuresdk-azure-sdk-devops-pipeline-generation-pat)
- script: >
$(Pipeline.Workspace)/pipeline-generator/pipeline-generator
--organization https://dev.azure.com/azure-sdk
--project internal
--prefix ${{parameters.Prefix}}
--devopspath "\${{parameters.Prefix}}"
--path $(System.DefaultWorkingDirectory)/sdk
--endpoint Azure
--repository ${{parameters.Repository}}
--convention upweekly
--agentpool Hosted
--branch refs/heads/$(DefaultBranch)
--patvar PATVAR
--set-managed-variables
--debug
${{parameters.UPConventionOptions}}
displayName: Create Weekly (Multi-Cloud) Unified Test Pipelines for Public Repository
condition: and(succeeded(), eq(${{parameters.GenerateUnifiedWeekly}},true))
env:
PATVAR: $(azuresdk-azure-sdk-devops-pipeline-generation-pat)
# This covers our -pr repositories.
- ${{ if endsWith(parameters.Repository, '-pr')}}:
- script: >
$(Pipeline.Workspace)/pipeline-generator/pipeline-generator
--organization https://dev.azure.com/azure-sdk
--project internal
--prefix ${{parameters.Prefix}}-pr
--devopspath "\${{parameters.Prefix}}\pr"
--path $(System.DefaultWorkingDirectory)/sdk
--endpoint Azure
--repository ${{parameters.Repository}}
--convention ci
--agentpool Hosted
--branch refs/heads/$(DefaultBranch)
--patvar PATVAR
--set-managed-variables
--debug
--no-schedule
${{parameters.CIConventionOptions}}
displayName: Create CI Pipelines for Private Repository
env:
PATVAR: $(azuresdk-azure-sdk-devops-pipeline-generation-pat)
- script: >
$(Pipeline.Workspace)/pipeline-generator/pipeline-generator
--organization https://dev.azure.com/azure-sdk
--project internal
--prefix ${{parameters.Prefix}}-pr
--devopspath "\${{parameters.Prefix}}\pr"
--path $(System.DefaultWorkingDirectory)/sdk
--endpoint Azure
--repository ${{parameters.Repository}}
--convention up
--agentpool Hosted
--branch refs/heads/$(DefaultBranch)
--patvar PATVAR
--set-managed-variables
--debug
--no-schedule
${{parameters.UPConventionOptions}}
displayName: Create UP Pipelines for Private Repository
env:
PATVAR: $(azuresdk-azure-sdk-devops-pipeline-generation-pat)
- script: >
$(Pipeline.Workspace)/pipeline-generator/pipeline-generator
--organization https://dev.azure.com/azure-sdk
--project internal
--prefix ${{parameters.Prefix}}-pr
--devopspath "\${{parameters.Prefix}}\pr"
--path $(System.DefaultWorkingDirectory)/sdk
--endpoint Azure
--repository ${{parameters.Repository}}
--convention tests
--agentpool Hosted
--branch refs/heads/$(DefaultBranch)
--patvar PATVAR
--set-managed-variables
--debug
--no-schedule
${{parameters.TestsConventionOptions}}
displayName: Create Live Test Pipelines for Private Repository
condition: and(succeeded(), ne('${{parameters.TestsConventionOptions}}',''))
env:
PATVAR: $(azuresdk-azure-sdk-devops-pipeline-generation-pat)

View file

@@ -1,27 +0,0 @@
# This step is used to prevent duplication of artifact publishes when there is an issue that would prevent the overall success of the job.
# Ensuring that we only publish when successful (and to a differently named artifact otherwise) will allow easy retry on a build pipeline
# without running into the "cannot override artifact" failure when we finally do get a passing run.
# ArtifactName - The name of the artifact in the "successful" case.
# ArtifactPath - The path we will be publishing.
# CustomCondition - Used if there is additional logic necessary to prevent an attempted publish.
parameters:
ArtifactName: ''
ArtifactPath: ''
CustomCondition: true
steps:
- task: PublishPipelineArtifact@1
condition: and(succeeded(), ${{ parameters.CustomCondition }})
displayName: 'Publish ${{ parameters.ArtifactName }} Artifacts'
inputs:
artifactName: '${{ parameters.ArtifactName }}'
path: '${{ parameters.ArtifactPath }}'
- task: PublishPipelineArtifact@1
condition: failed()
displayName: 'Publish failed ${{ parameters.ArtifactName }} Artifacts'
inputs:
artifactName: '${{ parameters.ArtifactName }}-FailedAttempt$(System.JobAttempt)'
path: '${{ parameters.ArtifactPath }}'
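A sketch of publishing with the retry-friendly naming described above; the template path is an assumption:

steps:
- template: /eng/common/pipelines/templates/steps/publish-artifact-with-retry.yml  # assumed path
  parameters:
    ArtifactName: packages  # illustrative
    ArtifactPath: $(Build.ArtifactStagingDirectory)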

View file

@@ -1,33 +0,0 @@
parameters:
FolderForUpload: ''
BlobSASKey: ''
TargetLanguage: ''
BlobName: ''
ScriptPath: ''
ArtifactLocation: ''
RepoId: $(Build.Repository.Name)
steps:
- template: /eng/common/pipelines/templates/steps/set-default-branch.yml
- pwsh: |
if (!(Test-Path '$(Build.BinariesDirectory)/azcopy/azcopy_windows_amd64_*/azcopy.exe')) {
Invoke-WebRequest -MaximumRetryCount 10 -Uri "https://aka.ms/downloadazcopy-v10-windows" -OutFile "azcopy.zip" | Wait-Process;
Expand-Archive -Path "azcopy.zip" -DestinationPath "$(Build.BinariesDirectory)/azcopy/" -Force
}
workingDirectory: $(Build.BinariesDirectory)
displayName: Download and Extract azcopy Zip
- task: Powershell@2
inputs:
filePath: ${{ parameters.ScriptPath }}
arguments: >
-AzCopy $(Resolve-Path "$(Build.BinariesDirectory)/azcopy/azcopy_windows_amd64_*/azcopy.exe")[0]
-DocLocation "${{ parameters.FolderForUpload }}"
-SASKey "${{ parameters.BlobSASKey }}"
-BlobName "${{ parameters.BlobName }}"
-PublicArtifactLocation "${{ parameters.ArtifactLocation }}"
-RepoReplaceRegex "(https://github.com/${{ parameters.RepoId }}/(?:blob|tree)/)$(DefaultBranch)"
pwsh: true
workingDirectory: $(Pipeline.Workspace)
displayName: Copy Docs to Blob
continueOnError: false

View file

@@ -1,217 +0,0 @@
parameters:
TargetFolder: ''
RootFolder: ''
BuildSHA: ''
RepoId: $(Build.Repository.Name)
steps:
- task: PythonScript@0
displayName: Replace Relative Readme Links with Absolute References
inputs:
scriptSource: inline
script: |
import argparse
import sys
import os
import logging
import glob
import re
import fnmatch
from io import open
try:
from pathlib import Path
except:
from pathlib2 import Path
# This script is intended to be run against a single folder. All readme.md files (regardless of casing) will have the relative links
# updated with appropriate full reference links. This is a recursive update.
logging.getLogger().setLevel(logging.INFO)
RELATIVE_LINK_REPLACEMENT_SYNTAX = (
"https://github.com/{repo_id}/tree/{build_sha}/{target_resource_path}"
)
LINK_DISCOVERY_REGEX = r"\[([^\]]*)\]\(([^)]+)\)"
PREDEFINED_LINK_DISCOVERY_REGEX = r"(\[[^\]]+]\:)\s*([^\s]+)"
IMAGE_FILE_EXTENSIONS = ['.jpeg', '.jpg', '.png', '.gif', '.tiff']
RELATIVE_LINK_REPLACEMENT_SYNTAX_FOR_IMAGE = (
"https://github.com/{repo_id}/raw/{build_sha}/{target_resource_path}"
)
def locate_readmes(directory):
readme_set = []
for root, dirs, files in os.walk(directory):
for file in files:
if file.lower() == "readme.md":
readme_set.append(os.path.join(root, file))
return readme_set
def is_relative_link(link_value, readme_location):
link_without_location = link_value
if link_without_location.find('#') > 0:
link_without_location = link_without_location[0:link_without_location.find('#')]
try:
return os.path.exists(
os.path.abspath(os.path.join(os.path.dirname(readme_location), link_without_location))
)
except:
return False
def replace_relative_link(match, readme_location, root_folder, build_sha, repo_id):
link_path = match.group(2).strip()
if is_relative_link(link_path, readme_location):
# if it is a relative reference, we need to find the path from the root of the repository
resource_absolute_path = os.path.abspath(
os.path.join(os.path.dirname(readme_location), link_path)
)
placement_from_root = os.path.relpath(resource_absolute_path, root_folder)
suffix = Path(placement_from_root).suffix
if (suffix in IMAGE_FILE_EXTENSIONS):
updated_link = RELATIVE_LINK_REPLACEMENT_SYNTAX_FOR_IMAGE.format(
repo_id=repo_id,
build_sha=build_sha,
target_resource_path=placement_from_root,
).replace("\\", "/")
else:
updated_link = RELATIVE_LINK_REPLACEMENT_SYNTAX.format(
repo_id=repo_id,
build_sha=build_sha,
target_resource_path=placement_from_root,
).replace("\\", "/")
return "[{}]({})".format(match.group(1), updated_link)
else:
return match.group(0)
def replace_predefined_relative_links(match, readme_location, root_folder, build_sha, repo_id):
link_path = match.group(2).strip()
if is_relative_link(link_path, readme_location):
# if it is a relative reference, we need to find the path from the root of the repository
resource_absolute_path = os.path.abspath(
os.path.join(os.path.dirname(readme_location), link_path)
)
placement_from_root = os.path.relpath(resource_absolute_path, root_folder)
suffix = Path(placement_from_root).suffix
if (suffix in IMAGE_FILE_EXTENSIONS):
updated_link = RELATIVE_LINK_REPLACEMENT_SYNTAX_FOR_IMAGE.format(
repo_id=repo_id,
build_sha=build_sha,
target_resource_path=placement_from_root,
).replace("\\", "/")
else:
updated_link = RELATIVE_LINK_REPLACEMENT_SYNTAX.format(
repo_id=repo_id,
build_sha=build_sha,
target_resource_path=placement_from_root,
).replace("\\", "/")
return "{} {}".format(match.group(1), updated_link)
else:
return match.group(0)
def transfer_content_to_absolute_references(
root_folder, build_sha, repo_id, readme_location, content
):
content = re.sub(
LINK_DISCOVERY_REGEX,
lambda match, readme_location=readme_location, root_folder=root_folder, build_sha=build_sha, repo_id=repo_id: replace_relative_link(
match, readme_location, root_folder, build_sha, repo_id
),
content,
)
content = re.sub(
PREDEFINED_LINK_DISCOVERY_REGEX,
lambda match, readme_location=readme_location, root_folder=root_folder, build_sha=build_sha, repo_id=repo_id: replace_predefined_relative_links(
match, readme_location, root_folder, build_sha, repo_id
),
content,
)
return content
if __name__ == "__main__":
parser = argparse.ArgumentParser(
description="Replaces relative links for any README.md under the target folder. Given any discovered relative link, will replace with the provided repoId and SHA. Case insensitive"
)
parser.add_argument(
"-t",
"--target",
dest="target_folder",
help="The target folder that contains a README ",
default="${{ parameters.TargetFolder }}",
)
parser.add_argument(
"-i",
"--repoid",
dest="repo_id",
help='The target repository used as the base for the path replacement. Full Id, example: "Azure/azure-sdk-for-net"',
default="${{ parameters.RepoId }}",
)
parser.add_argument(
"-r",
"--root",
dest="root_folder",
help="The root directory of the repository. This gives us the ability to rationalize links in situations where a relative link traverses UPWARDS from the readme.",
default="${{ parameters.RootFolder }}",
)
parser.add_argument(
"-s",
"--sha",
dest="build_sha",
help="The commit hash associated with this change. Using this will mean that links will never be broken.",
default="${{ parameters.BuildSHA }}",
)
args = parser.parse_args()
logging.info("Root Folder: {}".format(args.root_folder))
logging.info("Target Folder: {}".format(args.target_folder))
logging.info("Repository Id: {}".format(args.repo_id))
logging.info("Build SHA: {}".format(args.build_sha))
readme_files = locate_readmes(args.target_folder)
for readme_location in readme_files:
try:
logging.info(
"Running Relative Link Replacement on {}.".format(readme_location)
)
with open(readme_location, "r", encoding="utf-8") as readme_stream:
readme_content = readme_stream.read()
new_content = transfer_content_to_absolute_references(
args.root_folder,
args.build_sha,
args.repo_id,
readme_location,
readme_content,
)
with open(readme_location, "w", encoding="utf-8") as readme_stream:
readme_stream.write(new_content)
except Exception as e:
logging.error(e)
exit(1)
- script: |
git diff -U0
displayName: Highlight Readme Updates

View file

@@ -1,21 +0,0 @@
parameters:
- name: DaysValid
default: 731
type: number
steps:
- task: PowerShell@2
displayName: Retain pipeline run
env:
SYSTEM_ACCESSTOKEN: $(System.AccessToken)
inputs:
pwsh: true
filePath: $(Build.SourcesDirectory)/eng/common/scripts/Add-RetentionLease.ps1
arguments: >
-Organization azure-sdk
-Project $(System.TeamProject)
-DefinitionId $(System.DefinitionId)
-RunId $(Build.BuildId)
-DaysValid ${{ parameters.DaysValid }}
-AccessToken $env:SYSTEM_ACCESSTOKEN
-Debug
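A usage sketch (assumed template path); DaysValid here overrides the default lease of 731 days:

steps:
- template: /eng/common/pipelines/templates/steps/retain-run.yml  # assumed path
  parameters:
    DaysValid: 365  # illustrative override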

View file

@@ -1,14 +0,0 @@
parameters:
- name: DailyBranchVariableName
type: string
default: TargetBranchName
steps:
- pwsh: |
$branchName = $env:DAILYDOCSBRANCHNAMEOVERRIDE
if (!$branchName) {
$branchName = "daily/$(Get-Date -Format 'yyyy-MM-dd')"
}
Write-Host "Daily Branch Name: $branchName"
Write-Host "##vso[task.setvariable variable=${{ parameters.DailyBranchVariableName }};]$branchName"
displayName: Set daily docs branch name in $(${{ parameters.DailyBranchVariableName }})

View file

@@ -1,16 +0,0 @@
parameters:
WorkingDirectory: '$(System.DefaultWorkingDirectory)'
RemoteRepo: 'origin'
DefaultBranchVariableName: DefaultBranch
steps:
- pwsh: |
$setDefaultBranch = (git remote show ${{ parameters.RemoteRepo }} | Out-String) -replace "(?ms).*HEAD branch: (\w+).*", '$1'
if ($LASTEXITCODE -ne 0) {
Write-Host "Not able to fetch the default branch from git command. Set to main."
$setDefaultBranch = 'main'
}
Write-Host "Setting ${{ parameters.DefaultBranchVariableName }}=$setDefaultBranch"
Write-Host "##vso[task.setvariable variable=${{ parameters.DefaultBranchVariableName }}]$setDefaultBranch"
displayName: "Setup Default Branch"
workingDirectory: ${{ parameters.WorkingDirectory }}
ignoreLASTEXITCODE: true

View file

@@ -1,15 +0,0 @@
parameters:
PackageName: ''
ServiceDirectory: ''
TestPipeline: false
steps:
- ${{ if eq(parameters.TestPipeline, 'true') }}:
- task: PowerShell@2
displayName: Prep template pipeline for release
condition: and(succeeded(), ne(variables['Skip.SetTestPipelineVersion'], 'true'))
inputs:
pwsh: true
workingDirectory: $(Build.SourcesDirectory)
filePath: $(Build.SourcesDirectory)/eng/common/scripts/SetTestPipelineVersion.ps1
arguments: '-BuildID $(Build.BuildId) -PackageName ${{ parameters.PackageName }} -ServiceDirectory ${{ parameters.ServiceDirectory }}'

View file

@@ -1,80 +0,0 @@
parameters:
- name: Paths
type: object
default: []
- name: Repositories
type: object
default:
- Name: $(Build.Repository.Name)
Commitish: $(Build.SourceVersion)
WorkingDirectory: $(System.DefaultWorkingDirectory)
- name: SkipDefaultCheckout
type: boolean
default: false
steps:
- ${{ if not(parameters.SkipDefaultCheckout) }}:
- checkout: none
- task: PowerShell@2
displayName: 'Sparse checkout repositories'
inputs:
targetType: inline
# Define this inline, because of the chicken/egg problem with loading a script when nothing
# has been checked out yet.
script: |
function SparseCheckout([Array]$paths, [Hashtable]$repository)
{
$dir = $repository.WorkingDirectory
if (!$dir) {
$dir = "./$($repository.Name)"
}
New-Item $dir -ItemType Directory -Force
Push-Location $dir
if (Test-Path .git/info/sparse-checkout) {
$hasInitialized = $true
Write-Host "Repository $($repository.Name) has already been initialized. Skipping this step."
} else {
Write-Host "Repository $($repository.Name) is being initialized."
Write-Host "git clone --no-checkout --filter=tree:0 https://github.com/$($repository.Name) ."
git clone --no-checkout --filter=tree:0 https://github.com/$($repository.Name) .
Write-Host "git sparse-checkout init"
git sparse-checkout init
Write-Host "git sparse-checkout set '/*' '!/*/' '/eng'"
git sparse-checkout set '/*' '!/*/' '/eng'
}
# Prevent wildcard expansion in Invoke-Expression (e.g. for checkout path '/*')
$quotedPaths = $paths | ForEach-Object { "'$_'" }
$gitsparsecmd = "git sparse-checkout add $quotedPaths"
Write-Host $gitsparsecmd
Invoke-Expression -Command $gitsparsecmd
Write-Host "Set sparse checkout paths to:"
Get-Content .git/info/sparse-checkout
# sparse-checkout commands after initial checkout will auto-checkout again
if (!$hasInitialized) {
Write-Host "git checkout $($repository.Commitish)"
git checkout $($repository.Commitish) # this will use the default branch if repo.Commitish is empty
} else {
Write-Host "Skipping checkout as repo has already been initialized"
}
Pop-Location
}
# Paths may be sourced as a yaml object literal OR a dynamically generated variable json string.
# If the latter, convertToJson will wrap the 'string' in quotes, so remove them.
$paths = '${{ convertToJson(parameters.Paths) }}'.Trim('"') | ConvertFrom-Json
# Replace windows backslash paths, as Azure Pipelines default directories are sometimes formatted like 'D:\a\1\s'
$repositories = '${{ convertToJson(parameters.Repositories) }}' -replace '\\', '/' | ConvertFrom-Json -AsHashtable
foreach ($repo in $Repositories) {
SparseCheckout $paths $repo
}
pwsh: true
workingDirectory: $(System.DefaultWorkingDirectory)
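The sparse-checkout template is referenced by this path several times elsewhere in this listing; a minimal self-contained sketch with illustrative paths:

steps:
- template: /eng/common/pipelines/templates/steps/sparse-checkout.yml
  parameters:
    Paths:
      - sdk/core  # illustrative
      - eng/common  # illustrative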

View file

@@ -1,118 +0,0 @@
parameters:
- name: PackageInfoLocations
type: object
default: []
- name: RepoId
type: string
default: $(Build.Repository.Name)
- name: WorkingDirectory
type: string
default: ''
- name: ScriptDirectory
type: string
default: eng/common/scripts
- name: TargetDocRepoName
type: string
default: ''
- name: TargetDocRepoOwner
type: string
- name: Language
type: string
default: ''
- name: DailyDocsBuild
type: boolean
default: false
- name: SparseCheckoutPaths
type: object
default:
- '**'
- name: PackageSourceOverride
type: string
default: ''
- name: DocValidationImageId
type: string
default: ''
steps:
- ${{ if eq(length(parameters.PackageInfoLocations), 0) }}:
- checkout: none
- pwsh: |
Write-Host "Skipping DocsMS Update because package list was empty."
displayName: Skip DocsMS Update
- ${{ else }}:
- template: /eng/common/pipelines/templates/steps/enable-long-path-support.yml
- pwsh: |
Write-Host "###vso[task.setvariable variable=DocRepoLocation]${{ parameters.WorkingDirectory }}/doc"
displayName: Set $(DocRepoLocation)
- template: /eng/common/pipelines/templates/steps/sparse-checkout.yml
parameters:
SkipDefaultCheckout: true
Repositories:
- Name: ${{ parameters.TargetDocRepoOwner }}/${{ parameters.TargetDocRepoName }}
WorkingDirectory: $(DocRepoLocation)
Paths: ${{ parameters.SparseCheckoutPaths }}
# If performing a daily docs build, set $(TargetBranchName) to a daily branch
# name and attempt to check out the daily docs branch. If the branch doesn't
# exist, create it.
- ${{ if eq(parameters.DailyDocsBuild, 'true') }}:
- template: /eng/common/pipelines/templates/steps/set-daily-docs-branch-name.yml
- pwsh: |
$ErrorActionPreference = "Continue"
$RemoteName = "origin"
$BranchName = "$(TargetBranchName)"
# Fetch and checkout remote branch if it already exists otherwise create a new branch.
git ls-remote --exit-code --heads $RemoteName $BranchName
if ($LASTEXITCODE -eq 0) {
Write-Host "git fetch $RemoteName $BranchName"
git fetch $RemoteName $BranchName
Write-Host "git checkout $BranchName."
git checkout $BranchName
} else {
Write-Host "git checkout -b $BranchName."
git checkout -b $BranchName
}
displayName: Checkout daily docs branch if it exists
workingDirectory: $(DocRepoLocation)
# If NOT performing a daily docs build, set the $(TargetBranchName) to the
# default branch of the documentation repository.
- ${{ if ne(parameters.DailyDocsBuild, 'true') }}:
- template: /eng/common/pipelines/templates/steps/set-default-branch.yml
parameters:
WorkingDirectory: $(DocRepoLocation)
DefaultBranchVariableName: TargetBranchName
# Pull and build the docker image.
- ${{ if ne(parameters.DocValidationImageId, '') }}:
- template: /eng/common/pipelines/templates/steps/docker-pull-image.yml
parameters:
ContainerRegistryClientId: $(azuresdkimages-cr-clientid)
ContainerRegistryClientSecret: $(azuresdkimages-cr-clientsecret)
ImageId: '${{ parameters.DocValidationImageId }}'
- pwsh: |
$packageInfoJson = '${{ convertToJson(parameters.PackageInfoLocations) }}'.Trim('"').Replace("\\", "/")
# Without -NoEnumerate, a single element array[T] gets unwrapped as a single item T.
$packageInfoLocations = ConvertFrom-Json $packageInfoJson -NoEnumerate
${{ parameters.ScriptDirectory }}/Update-DocsMsMetadata.ps1 `
-PackageInfoJsonLocations $packageInfoLocations `
-DocRepoLocation "$(DocRepoLocation)" `
-Language '${{parameters.Language}}' `
-RepoId '${{ parameters.RepoId }}' `
-DocValidationImageId '${{ parameters.DocValidationImageId }}' `
-PackageSourceOverride '${{ parameters.PackageSourceOverride }}' `
-TenantId '$(opensource-aad-tenant-id)' `
-ClientId '$(opensource-aad-app-id)' `
-ClientSecret '$(opensource-aad-secret)'
displayName: Apply Documentation Updates
- template: /eng/common/pipelines/templates/steps/git-push-changes.yml
parameters:
BaseRepoBranch: $(TargetBranchName)
BaseRepoOwner: ${{ parameters.TargetDocRepoOwner }}
CommitMsg: "Update docs metadata"
TargetRepoName: ${{ parameters.TargetDocRepoName }}
TargetRepoOwner: ${{ parameters.TargetDocRepoOwner }}
WorkingDirectory: $(DocRepoLocation)
ScriptDirectory: ${{ parameters.WorkingDirectory }}/${{ parameters.ScriptDirectory }}

View file

@@ -1,18 +0,0 @@
parameters:
- name: ScriptDirectory
type: string
default: 'eng/common/scripts'
- name: AgentImage
type: string
steps:
- task: PowerShell@2
displayName: Verify agent OS
inputs:
pwsh: true
workingDirectory: $(System.DefaultWorkingDirectory)
filePath: ${{ parameters.ScriptDirectory }}/Verify-AgentOS.ps1
arguments: >
-AgentImage "${{ parameters.AgentImage }}"
- template: /eng/common/pipelines/templates/steps/bypass-local-dns.yml

View file

@@ -1,26 +0,0 @@
parameters:
- name: PackageName
type: string
default: 'not-specified'
- name: ServiceName
type: string
default: ''
- name: ServiceDirectory
type: string
default: ''
- name: ForRelease
type: boolean
default: false
steps:
- task: Powershell@2
inputs:
filePath: $(Build.SourcesDirectory)/eng/common/scripts/Verify-ChangeLog.ps1
arguments: >
-PackageName '${{ parameters.PackageName }}'
-ServiceDirectory '${{ coalesce(parameters.ServiceDirectory, parameters.ServiceName) }}'
-ForRelease $${{ parameters.ForRelease }}
pwsh: true
workingDirectory: $(Pipeline.Workspace)
displayName: Verify ChangeLogEntry for ${{ parameters.PackageName }}
continueOnError: false

View file

@@ -1,31 +0,0 @@
parameters:
Directory: 'not-specified'
IgnoreLinksFile: '$(Build.SourcesDirectory)/eng/ignore-links.txt'
WorkingDirectory: '$(System.DefaultWorkingDirectory)'
ScriptDirectory: 'eng/common/scripts'
Recursive: $false
CheckLinkGuidance: $true
Urls: '(Get-ChildItem -Path ./ -Recurse -Include *.md)'
BranchReplaceRegex: "^(${env:SYSTEM_PULLREQUEST_SOURCEREPOSITORYURI}/(?:blob|tree)/)$(DefaultBranch)(/.*)$"
BranchReplacementName: "${env:SYSTEM_PULLREQUEST_SOURCECOMMITID}"
Condition: succeeded() # If you want to run on failure for the link checker, set it to `Condition: succeededOrFailed()`.
steps:
- template: /eng/common/pipelines/templates/steps/set-default-branch.yml
- task: PowerShell@2
displayName: Link verification check
condition: ${{ parameters.Condition }}
inputs:
pwsh: true
workingDirectory: '${{ parameters.WorkingDirectory }}/${{ parameters.Directory }}'
filePath: ${{ parameters.ScriptDirectory }}/Verify-Links.ps1
arguments: >
-urls ${{ parameters.Urls }}
-rootUrl "file://${{ parameters.WorkingDirectory }}/${{ parameters.Directory }}"
-recursive: ${{ parameters.Recursive }}
-ignoreLinksFile ${{ parameters.IgnoreLinksFile }}
-branchReplaceRegex "${{ parameters.BranchReplaceRegex }}"
-branchReplacementName ${{ parameters.BranchReplacementName }}
-devOpsLogging: $true
-checkLinkGuidance: ${{ parameters.CheckLinkGuidance }}
-inputCacheFile "https://azuresdkartifacts.blob.core.windows.net/verify-links-cache/verify-links-cache.txt"
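A hedged invocation sketch; the template path and target directory are assumptions:

steps:
- template: /eng/common/pipelines/templates/steps/verify-links.yml  # assumed path
  parameters:
    Directory: sdk/template  # illustrative
    CheckLinkGuidance: $true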

View file

@@ -1,50 +0,0 @@
# Template for all Python Scripts in this repository
parameters:
SourceDirectory: ''
BasePathLength: 49
steps:
- task: PythonScript@0
displayName: Analyze Path Lengths
inputs:
scriptSource: inline
script: |
# Verifies the length of the file path for every file in the SourceDirectory.
# File paths and directory paths must be less than 260 and 248 characters respectively on Windows.
# Repo users get a limited number of characters for the repo clone path, as specified by the BasePathLength parameter.
# The script makes sure that paths in the repo stay under 260 characters for files and 248 for directories after adding the BasePathLength.
import os
import sys
source_directory = r'${{ parameters.SourceDirectory }}'
break_switch = False
long_file_paths = []
long_dir_paths = []
def pluralize(string, plural_string, count):
return plural_string if count > 1 else string
print('Analyzing length of paths...')
for root, dirs, files in os.walk(source_directory):
for file in files:
file_path = os.path.relpath(os.path.join(root, file), source_directory)
if ((len(file_path) + ${{ parameters.BasePathLength }}) > 260):
long_file_paths.append(file_path)
dir_path = os.path.relpath(root, source_directory)
if ((len(dir_path) + ${{ parameters.BasePathLength }}) > 248):
long_dir_paths.append(dir_path)
if (len(long_file_paths) > 0):
print('With a base path length of {0} the following file path{1} exceed the allowed path length of 260 characters'.format(${{ parameters.BasePathLength }}, pluralize('', 's', len(long_file_paths))))
print(*long_file_paths, sep = "\n")
break_switch = True
if (len(long_dir_paths) > 0):
print('With a base path length of {0} the following directory path{1} exceed the allowed path length of 248 characters'.format(${{ parameters.BasePathLength }}, pluralize('', 's', len(long_dir_paths))))
print(*long_dir_paths, sep = "\n")
break_switch = True
if break_switch:
print("Some file paths are too long. Please reduce path lengths.")
sys.exit(1)
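The same check can be approximated locally before pushing. A rough PowerShell sketch under the same assumptions (a 49-character base path, limits of 260 characters for files and 248 for directories):
# Hypothetical local spot-check; $baseLength mirrors the BasePathLength parameter above.
$baseLength = 49
$source = (Get-Location).Path
Get-ChildItem -Path $source -Recurse | ForEach-Object {
$relative = $_.FullName.Substring($source.Length + 1)
$limit = if ($_.PSIsContainer) { 248 } else { 260 }
if (($relative.Length + $baseLength) -gt $limit) {
Write-Host "Path too long ($($relative.Length) + $baseLength): $relative"
}
}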

View file

@ -1,17 +0,0 @@
parameters:
ScanPath: $(Build.SourcesDirectory)
RepoRoot: $(Build.SourcesDirectory)
SettingsPath: '$(Build.SourcesDirectory)/eng/.docsettings.yml'
DocWardenVersion : '0.7.2'
steps:
- task: PowerShell@2
displayName: "Verify Readmes"
inputs:
filePath: "eng/common/scripts/Verify-Readme.ps1"
arguments: >
-DocWardenVersion ${{ parameters.DocWardenVersion }}
-ScanPath ${{ parameters.ScanPath }}
-RepoRoot ${{ parameters.RepoRoot }}
-SettingsPath ${{ parameters.SettingsPath }}
pwsh: true

View file

@ -1,15 +0,0 @@
parameters:
- name: ServiceDirectory
type: string
default: not-specified
- name: ScriptDirectory
type: string
default: eng/common/scripts
steps:
- pwsh: |
# If the last path segment is an absolute path it will be used entirely.
$root = [System.IO.Path]::Combine('$(Build.SourcesDirectory)', 'sdk', '${{ parameters.ServiceDirectory }}')
Get-ChildItem $root -Filter *.md -Recurse | ${{ parameters.ScriptDirectory }}/Test-SampleMetadata.ps1 -AllowParentProducts
displayName: Verify sample metadata
workingDirectory: $(Build.SourcesDirectory)

View file

@ -1,7 +0,0 @@
steps:
- task: Powershell@2
inputs:
filePath: $(Build.SourcesDirectory)/eng/common/scripts/Write-FileSystemMetrics.ps1
pwsh: true
displayName: Write filesystem metrics
continueOnError: true

View file

@ -1,28 +0,0 @@
[CmdletBinding(SupportsShouldProcess = $true)]
param(
[Parameter(Mandatory = $true)]
[string]$RepoOwner,
[Parameter(Mandatory = $true)]
[string]$RepoName,
[Parameter(Mandatory = $true)]
[string]$IssueNumber,
[Parameter(Mandatory = $true)]
[string]$Comment,
[Parameter(Mandatory = $true)]
[string]$AuthToken
)
. (Join-Path $PSScriptRoot common.ps1)
try {
Add-GithubIssueComment -RepoOwner $RepoOwner -RepoName $RepoName `
-IssueNumber $IssueNumber -Comment $Comment -AuthToken $AuthToken
}
catch {
LogError "Add-GithubIssueComment failed with exception:`n$_"
exit 1
}
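A hypothetical local invocation, assuming the script lives under eng/common/scripts and that a GitHub token is available in an environment variable (the filename and variable name are assumptions):
./eng/common/scripts/Add-IssueComment.ps1 `
-RepoOwner "Azure" -RepoName "azure-sdk-for-go" `
-IssueNumber "1234" -Comment "Thanks for the report!" `
-AuthToken $env:GITHUB_TOKEN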

View file

@ -1,28 +0,0 @@
[CmdletBinding(SupportsShouldProcess = $true)]
param(
[Parameter(Mandatory = $true)]
[string]$RepoOwner,
[Parameter(Mandatory = $true)]
[string]$RepoName,
[Parameter(Mandatory = $true)]
[string]$IssueNumber,
[Parameter(Mandatory = $true)]
[string]$Labels,
[Parameter(Mandatory = $true)]
[string]$AuthToken
)
. (Join-Path $PSScriptRoot common.ps1)
try {
Add-GithubIssueLabels -RepoOwner $RepoOwner -RepoName $RepoName `
-IssueNumber $IssueNumber -Labels $Labels -AuthToken $AuthToken
}
catch {
LogError "Add-GithubIssueLabels failed with exception:`n$_"
exit 1
}

View file

@ -1,58 +0,0 @@
[CmdletBinding(SupportsShouldProcess = $true)]
param(
[Parameter(Mandatory = $true)]
[string]$Organization,
[Parameter(Mandatory = $true)]
[string]$Project,
[Parameter(Mandatory = $true)]
[int]$DefinitionId,
[Parameter(Mandatory = $true)]
[int]$RunId,
[Parameter(Mandatory = $true)]
[int]$DaysValid,
[Parameter(Mandatory = $false)]
[string]$OwnerId = "azure-sdk-pipeline-automation",
[Parameter(Mandatory = $false)]
[string]$AccessToken = $env:DEVOPS_PAT
)
Set-StrictMode -Version 3
. (Join-Path $PSScriptRoot common.ps1)
$unencodedAuthToken = "nobody:$AccessToken"
$unencodedAuthTokenBytes = [System.Text.Encoding]::UTF8.GetBytes($unencodedAuthToken)
$encodedAuthToken = [System.Convert]::ToBase64String($unencodedAuthTokenBytes)
if ($isDevOpsRun) {
# We are doing this here so that there is zero chance that this token is emitted in Azure Pipelines
# build logs. Azure Pipelines will see this text and register the secret as a value it should *** out
# before it is transmitted to the server (and shown in logs). This means that if the value is accidentally
# leaked anywhere else, it won't be visible. The downside is that when the script is executed
# on a local development box, it will be visible.
Write-Host "##vso[task.setvariable variable=_throwawayencodedaccesstoken;issecret=true;]$($encodedAuthToken)"
}
LogDebug "Checking for existing leases on run: $RunId"
$existingLeases = Get-RetentionLeases -Organization $Organization -Project $Project -DefinitionId $DefinitionId -RunId $RunId -OwnerId $OwnerId -Base64EncodedAuthToken $encodedAuthToken
if ($existingLeases.count -ne 0) {
LogDebug "Found $($existingLeases.count) leases, will delete them first."
foreach ($lease in $existingLeases.value) {
LogDebug "Deleting lease: $($lease.leaseId)"
Delete-RetentionLease -Organization $Organization -Project $Project -LeaseId $lease.leaseId -Base64EncodedAuthToken $encodedAuthToken
}
}
LogDebug "Creating new lease on run: $RunId"
$lease = Add-RetentionLease -Organization $Organization -Project $Project -DefinitionId $DefinitionId -RunId $RunId -OwnerId $OwnerId -DaysValid $DaysValid -Base64EncodedAuthToken $encodedAuthToken
LogDebug "Lease ID is: $($lease.value.leaseId)"

View file

@ -1,402 +0,0 @@
# Common Changelog Operations
. "${PSScriptRoot}\logging.ps1"
. "${PSScriptRoot}\SemVer.ps1"
$RELEASE_TITLE_REGEX = "(?<releaseNoteTitle>^\#+\s+(?<version>$([AzureEngSemanticVersion]::SEMVER_REGEX))(\s+(?<releaseStatus>\(.+\))))"
$SECTION_HEADER_REGEX_SUFFIX = "##\s(?<sectionName>.*)"
$CHANGELOG_UNRELEASED_STATUS = "(Unreleased)"
$CHANGELOG_DATE_FORMAT = "yyyy-MM-dd"
$RecommendedSectionHeaders = @("Features Added", "Breaking Changes", "Bugs Fixed", "Other Changes")
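# For orientation, these patterns expect a changelog shaped roughly like the following
# (an illustrative sample, not drawn from any specific package):
#
#   # Release History
#
#   ## 1.2.3 (2022-04-12)
#
#   ### Features Added
#   - Added a widget.
#
# The release title regex captures "1.2.3" as the version and "(2022-04-12)" as the
# release status; section headers such as "### Features Added" are collected per entry.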
# Returns a collection of changeLogEntry objects containing changelog info for all versions present in the given CHANGELOG
function Get-ChangeLogEntries {
param (
[Parameter(Mandatory = $true)]
[String]$ChangeLogLocation
)
if (!(Test-Path $ChangeLogLocation)) {
LogError "ChangeLog[${ChangeLogLocation}] does not exist"
return $null
}
LogDebug "Extracting entries from [${ChangeLogLocation}]."
return Get-ChangeLogEntriesFromContent (Get-Content -Path $ChangeLogLocation)
}
function Get-ChangeLogEntriesFromContent {
param (
[Parameter(Mandatory = $true)]
$changeLogContent
)
if ($changeLogContent -is [string])
{
$changeLogContent = $changeLogContent.Split("`n")
}
elseif($changeLogContent -isnot [array])
{
LogError "Invalid ChangelogContent passed"
return $null
}
$changelogEntry = $null
$sectionName = $null
$changeLogEntries = [Ordered]@{}
$initialAtxHeader= "#"
if ($changeLogContent[0] -match "(?<HeaderLevel>^#+)\s.*")
{
$initialAtxHeader = $matches["HeaderLevel"]
}
$sectionHeaderRegex = "^${initialAtxHeader}${SECTION_HEADER_REGEX_SUFFIX}"
$changeLogEntries | Add-Member -NotePropertyName "InitialAtxHeader" -NotePropertyValue $initialAtxHeader
$releaseTitleAtxHeader = $initialAtxHeader + "#"
try {
# walk the document, finding where the version specifiers are and creating lists
foreach ($line in $changeLogContent) {
if ($line -match $RELEASE_TITLE_REGEX) {
$changeLogEntry = [pscustomobject]@{
ReleaseVersion = $matches["version"]
ReleaseStatus = $matches["releaseStatus"]
ReleaseTitle = "$releaseTitleAtxHeader {0} {1}" -f $matches["version"], $matches["releaseStatus"]
ReleaseContent = @()
Sections = @{}
}
$changeLogEntries[$changeLogEntry.ReleaseVersion] = $changeLogEntry
}
else {
if ($changeLogEntry) {
if ($line.Trim() -match $sectionHeaderRegex)
{
$sectionName = $matches["sectionName"].Trim()
$changeLogEntry.Sections[$sectionName] = @()
$changeLogEntry.ReleaseContent += $line
continue
}
if ($sectionName)
{
$changeLogEntry.Sections[$sectionName] += $line
}
$changeLogEntry.ReleaseContent += $line
}
}
}
}
catch {
Write-Error "Error parsing Changelog."
Write-Error $_
}
return $changeLogEntries
}
# Returns single changeLogEntry object containing the ChangeLog for a particular version
function Get-ChangeLogEntry {
param (
[Parameter(Mandatory = $true)]
[String]$ChangeLogLocation,
[Parameter(Mandatory = $true)]
[String]$VersionString
)
$changeLogEntries = Get-ChangeLogEntries -ChangeLogLocation $ChangeLogLocation
if ($changeLogEntries -and $changeLogEntries.Contains($VersionString)) {
return $changeLogEntries[$VersionString]
}
return $null
}
# Returns the changelog for a particular version as a string
function Get-ChangeLogEntryAsString {
param (
[Parameter(Mandatory = $true)]
[String]$ChangeLogLocation,
[Parameter(Mandatory = $true)]
[String]$VersionString
)
$changeLogEntry = Get-ChangeLogEntry -ChangeLogLocation $ChangeLogLocation -VersionString $VersionString
return ChangeLogEntryAsString $changeLogEntry
}
function ChangeLogEntryAsString($changeLogEntry) {
if (!$changeLogEntry) {
return "[Missing change log entry]"
}
[string]$releaseTitle = $changeLogEntry.ReleaseTitle
[string]$releaseContent = $changeLogEntry.ReleaseContent -Join [Environment]::NewLine
return $releaseTitle, $releaseContent -Join [Environment]::NewLine
}
function Confirm-ChangeLogEntry {
param (
[Parameter(Mandatory = $true)]
[String]$ChangeLogLocation,
[Parameter(Mandatory = $true)]
[String]$VersionString,
[boolean]$ForRelease = $false,
[Switch]$SantizeEntry
)
$changeLogEntries = Get-ChangeLogEntries -ChangeLogLocation $ChangeLogLocation
$changeLogEntry = $changeLogEntries[$VersionString]
if (!$changeLogEntry) {
LogError "ChangeLog[${ChangeLogLocation}] does not have an entry for version ${VersionString}."
return $false
}
if ($SantizeEntry)
{
Remove-EmptySections -ChangeLogEntry $changeLogEntry -InitialAtxHeader $changeLogEntries.InitialAtxHeader
Set-ChangeLogContent -ChangeLogLocation $ChangeLogLocation -ChangeLogEntries $changeLogEntries
}
Write-Host "Found the following change log entry for version '${VersionString}' in [${ChangeLogLocation}]."
Write-Host "-----"
Write-Host (ChangeLogEntryAsString $changeLogEntry)
Write-Host "-----"
if ([System.String]::IsNullOrEmpty($changeLogEntry.ReleaseStatus)) {
LogError "Entry does not have a correct release status. Please ensure the status is set to a date '($CHANGELOG_DATE_FORMAT)' or '$CHANGELOG_UNRELEASED_STATUS' if not yet released. See https://aka.ms/azsdk/guideline/changelogs for more info."
return $false
}
if ($ForRelease -eq $True)
{
LogDebug "Verifying as a release build because ForRelease parameter is set to true"
return Confirm-ChangeLogForRelease -changeLogEntry $changeLogEntry -changeLogEntries $changeLogEntries
}
# If the release status is a valid date then verify as if it's about to be released
$status = $changeLogEntry.ReleaseStatus.Trim().Trim("()")
if ($status -as [DateTime])
{
LogDebug "Verifying like it's a release build because the changelog entry has a valid date."
return Confirm-ChangeLogForRelease -changeLogEntry $changeLogEntry -changeLogEntries $changeLogEntries
}
return $true
}
function New-ChangeLogEntry {
param (
[Parameter(Mandatory = $true)]
[ValidateNotNullOrEmpty()]
[String]$Version,
[String]$Status=$CHANGELOG_UNRELEASED_STATUS,
[String]$InitialAtxHeader="#",
[String[]]$Content
)
# Validate ReleaseStatus
$Status = $Status.Trim().Trim("()")
if ($Status -ne "Unreleased") {
try {
$Status = ([DateTime]$Status).ToString($CHANGELOG_DATE_FORMAT)
}
catch {
LogWarning "Invalid date [ $Status ] passed as status for Version [$Version]. Please use a valid date in the format '$CHANGELOG_DATE_FORMAT' or use '$CHANGELOG_UNRELEASED_STATUS'"
return $null
}
}
$Status = "($Status)"
# Validate Version
try {
$Version = ([AzureEngSemanticVersion]::ParseVersionString($Version)).ToString()
}
catch {
LogWarning "Invalid version [ $Version ]."
return $null
}
if (!$Content) {
$Content = @()
$Content += ""
$sectionsAtxHeader = $InitialAtxHeader + "##"
foreach ($recommendedHeader in $RecommendedSectionHeaders)
{
$Content += "$sectionsAtxHeader $recommendedHeader"
$Content += ""
}
}
$releaseTitleAtxHeader = $initialAtxHeader + "#"
$newChangeLogEntry = [pscustomobject]@{
ReleaseVersion = $Version
ReleaseStatus = $Status
ReleaseTitle = "$releaseTitleAtxHeader $Version $Status"
ReleaseContent = $Content
}
return $newChangeLogEntry
}
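# Illustrative usage (hypothetical values):
#   New-ChangeLogEntry -Version "1.2.3"                       # unreleased entry with the recommended sections
#   New-ChangeLogEntry -Version "1.2.3" -Status "2022-04-12"  # dated release entry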
function Set-ChangeLogContent {
param (
[Parameter(Mandatory = $true)]
[String]$ChangeLogLocation,
[Parameter(Mandatory = $true)]
$ChangeLogEntries
)
$changeLogContent = @()
$changeLogContent += "$($ChangeLogEntries.InitialAtxHeader) Release History"
$changeLogContent += ""
$ChangeLogEntries = Sort-ChangeLogEntries -changeLogEntries $ChangeLogEntries
foreach ($changeLogEntry in $ChangeLogEntries) {
$changeLogContent += $changeLogEntry.ReleaseTitle
if ($changeLogEntry.ReleaseContent.Count -eq 0) {
$changeLogContent += @("","")
}
else {
$changeLogContent += $changeLogEntry.ReleaseContent
}
}
Set-Content -Path $ChangeLogLocation -Value $changeLogContent
}
function Remove-EmptySections {
param (
[Parameter(Mandatory = $true)]
$ChangeLogEntry,
$InitialAtxHeader = "#"
)
$sectionHeaderRegex = "^${InitialAtxHeader}${SECTION_HEADER_REGEX_SUFFIX}"
$releaseContent = $ChangeLogEntry.ReleaseContent
if ($releaseContent.Count -gt 0)
{
$parsedSections = $ChangeLogEntry.Sections
$sanitizedReleaseContent = New-Object System.Collections.ArrayList(,$releaseContent)
foreach ($key in @($parsedSections.Keys))
{
if ([System.String]::IsNullOrWhiteSpace($parsedSections[$key]))
{
for ($i = 0; $i -lt $sanitizedReleaseContent.Count; $i++)
{
$line = $sanitizedReleaseContent[$i]
if ($line -match $sectionHeaderRegex -and $matches["sectionName"].Trim() -eq $key)
{
$sanitizedReleaseContent.RemoveAt($i)
while($i -lt $sanitizedReleaseContent.Count -and [System.String]::IsNullOrWhiteSpace($sanitizedReleaseContent[$i]))
{
$sanitizedReleaseContent.RemoveAt($i)
}
$ChangeLogEntry.Sections.Remove($key)
break
}
}
}
}
$ChangeLogEntry.ReleaseContent = $sanitizedReleaseContent.ToArray()
}
}
function Get-LatestReleaseDateFromChangeLog
{
param (
[Parameter(Mandatory = $true)]
$ChangeLogLocation
)
$changeLogEntries = Get-ChangeLogEntries -ChangeLogLocation $ChangeLogLocation
$latestVersion = $changeLogEntries[0].ReleaseStatus.Trim("()")
return ($latestVersion -as [DateTime])
}
function Sort-ChangeLogEntries {
param (
[Parameter(Mandatory = $true)]
$changeLogEntries
)
try
{
$changeLogEntries = $ChangeLogEntries.Values | Sort-Object -Descending -Property ReleaseStatus, `
@{e = {[AzureEngSemanticVersion]::new($_.ReleaseVersion)}}
}
catch {
LogError "Problem sorting version in ChangeLogEntries"
exit(1)
}
return $changeLogEntries
}
function Confirm-ChangeLogForRelease {
param (
[Parameter(Mandatory = $true)]
$changeLogEntry,
[Parameter(Mandatory = $true)]
$changeLogEntries
)
$entries = Sort-ChangeLogEntries -changeLogEntries $changeLogEntries
$isValid = $true
if ($changeLogEntry.ReleaseStatus -eq $CHANGELOG_UNRELEASED_STATUS) {
LogError "Entry has no release date set. Please ensure to set a release date with format '$CHANGELOG_DATE_FORMAT'. See https://aka.ms/azsdk/guideline/changelogs for more info."
$isValid = $false
}
else {
$status = $changeLogEntry.ReleaseStatus.Trim().Trim("()")
try {
$releaseDate = [DateTime]$status
if ($status -ne ($releaseDate.ToString($CHANGELOG_DATE_FORMAT)))
{
LogError "Date must be in the format $($CHANGELOG_DATE_FORMAT). See https://aka.ms/azsdk/guideline/changelogs for more info."
$isValid = $false
}
if (@($entries.ReleaseStatus)[0] -ne $changeLogEntry.ReleaseStatus)
{
LogError "Invalid date [ $status ]. The date for the changelog being released must be the latest in the file."
$isValid = $false
}
}
catch {
LogError "Invalid date [ $status ] passed as status for Version [$($changeLogEntry.ReleaseVersion)]. See https://aka.ms/azsdk/guideline/changelogs for more info."
$isValid = $false
}
}
if ([System.String]::IsNullOrWhiteSpace($changeLogEntry.ReleaseContent)) {
LogError "Entry has no content. Please ensure to provide some content of what changed in this version. See https://aka.ms/azsdk/guideline/changelogs for more info."
$isValid = $false
}
$foundRecommendedSection = $false
$emptySections = @()
foreach ($key in $changeLogEntry.Sections.Keys)
{
$sectionContent = $changeLogEntry.Sections[$key]
if ([System.String]::IsNullOrWhiteSpace(($sectionContent | Out-String)))
{
$emptySections += $key
}
if ($RecommendedSectionHeaders -contains $key)
{
$foundRecommendedSection = $true
}
}
if ($emptySections.Count -gt 0)
{
LogError "The changelog entry has the following sections with no content ($($emptySections -join ', ')). Please ensure to either remove the empty sections or add content to the section."
$isValid = $false
}
if (!$foundRecommendedSection)
{
LogWarning "The changelog entry did not contain any of the recommended sections ($($RecommendedSectionHeaders -join ', ')), please add at least one. See https://aka.ms/azsdk/guideline/changelogs for more info."
}
return $isValid
}

View file

@ -1,141 +0,0 @@
<#
.SYNOPSIS
Script for installing and launching the Azure Cosmos DB emulator
.DESCRIPTION
This script downloads, installs, and launches the Azure Cosmos DB emulator.
.PARAMETER EmulatorMsiUrl
URI for downloading the Cosmos DB emulator
.PARAMETER StartParameters
Parameters with which to launch the Cosmos DB emulator
.PARAMETER Stage
Determines which part of the script to run. Must be either Install or Launch
#>
[CmdletBinding()]
Param (
[string] $EmulatorMsiUrl = "https://aka.ms/cosmosdb-emulator",
[string] $StartParameters,
[Parameter(Mandatory=$True)]
[ValidateSet('Install', 'Launch')]
[string] $Stage
)
$targetDir = Join-Path $Env:Temp AzureCosmosEmulator
$logFile = Join-Path $Env:Temp log.txt
$productName = "Azure Cosmos DB Emulator"
$emulator = (Join-Path $targetDir (Join-Path $productName "Microsoft.Azure.Cosmos.Emulator.exe"))
if ($Stage -eq "Install")
{
$downloadTryCount = 0
New-Item $targetDir -Type Directory
New-Item $logFile -Type File
do
{
# Download and Extract Public Cosmos DB Emulator
Write-Host "Downloading and extracting Cosmos DB Emulator - $EmulatorMsiUrl"
Write-Host "Target Directory $targetDir"
Write-Host "Log File $logFile"
$downloadTryCount++
Write-Host "Download Try Count: $downloadTryCount"
Remove-Item -Path (Join-Path $targetDir '*') -Recurse
Clear-Content -Path $logFile
$installProcess = Start-Process msiexec -Wait -PassThru -ArgumentList "/a $EmulatorMsiUrl TARGETDIR=$targetDir /qn /liew $logFile"
Get-Content $logFile
Write-Host "Exit Code: $($installProcess.ExitCode)"
}
while(($installProcess.ExitCode -ne 0) -and ($downloadTryCount -lt 3))
if(Test-Path (Join-Path $Env:LOCALAPPDATA CosmosDbEmulator))
{
Write-Host "Deleting Cosmos DB Emulator data"
Remove-Item -Recurse -Force $Env:LOCALAPPDATA\CosmosDbEmulator
}
Write-Host "Getting Cosmos DB Emulator Version"
$fileVersion = Get-ChildItem $emulator
Write-Host $emulator $fileVersion.VersionInfo
}
if ($Stage -eq "Launch")
{
Write-Host "Launching Cosmos DB Emulator"
if (!(Test-Path $emulator)) {
Write-Error "The emulator is not installed where expected at '$emulator'"
return
}
$process = Start-Process $emulator -ArgumentList "/getstatus" -PassThru -Wait
switch ($process.ExitCode) {
1 {
Write-Host "The emulator is already starting"
return
}
2 {
Write-Host "The emulator is already running"
return
}
3 {
Write-Host "The emulator is stopped"
}
default {
Write-Host "Unrecognized exit code $($process.ExitCode)"
return
}
}
$argumentList = ""
if (-not [string]::IsNullOrEmpty($StartParameters)) {
$argumentList += , $StartParameters
} else {
# Use the default params if none provided
$argumentList = "/noexplorer /noui /enablepreview /disableratelimiting /enableaadauthentication"
}
Write-Host "Starting emulator process: $emulator $argumentList"
$process = Start-Process $emulator -ArgumentList $argumentList -ErrorAction Stop -PassThru
Write-Host "Emulator process started: $($process.Name), $($process.FileVersion)"
$Timeout = 600
$result="NotYetStarted"
$complete = if ($Timeout -gt 0) {
$start = [DateTimeOffset]::Now
$stop = $start.AddSeconds($Timeout)
{
$result -eq "Running" -or [DateTimeOffset]::Now -ge $stop
}
}
else {
{
$result -eq "Running"
}
}
do {
$process = Start-Process $emulator -ArgumentList "/getstatus" -PassThru -Wait
switch ($process.ExitCode) {
1 {
Write-Host "The emulator is starting"
}
2 {
Write-Host "The emulator is running"
$result="Running"
return
}
3 {
Write-Host "The emulator is stopped"
}
default {
Write-Host "Unrecognized exit code $($process.ExitCode)"
}
}
Start-Sleep -Seconds 5
}
until ($complete.Invoke())
Write-Error "The emulator failed to reach Running status within ${Timeout} seconds"
}
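The script is intended to be run twice, once per stage. A sketch of a typical sequence, assuming it is saved as cosmosdb-emulator.ps1 (the filename is an assumption):
./cosmosdb-emulator.ps1 -Stage Install
./cosmosdb-emulator.ps1 -Stage Launch -StartParameters "/noexplorer /noui /disableratelimiting"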

View file

@ -1,175 +0,0 @@
[CmdletBinding()]
Param (
[Parameter(Mandatory=$True)]
[string] $ArtifactPath,
[Parameter(Mandatory=$True)]
[string] $APIViewUri,
[Parameter(Mandatory=$True)]
[string] $APIKey,
[Parameter(Mandatory=$True)]
[string] $APILabel,
[string] $PackageName,
[string] $SourceBranch,
[string] $DefaultBranch,
[string] $ConfigFileDir = ""
)
# Submit an API review request and return a status indicating whether the current revision is approved, pending, or failed to create a review
function Submit-APIReview($packagename, $filePath, $uri, $apiKey, $apiLabel, $releaseStatus)
{
$multipartContent = [System.Net.Http.MultipartFormDataContent]::new()
$FileStream = [System.IO.FileStream]::new($filePath, [System.IO.FileMode]::Open)
$fileHeader = [System.Net.Http.Headers.ContentDispositionHeaderValue]::new("form-data")
$fileHeader.Name = "file"
$fileHeader.FileName = $packagename
$fileContent = [System.Net.Http.StreamContent]::new($FileStream)
$fileContent.Headers.ContentDisposition = $fileHeader
$fileContent.Headers.ContentType = [System.Net.Http.Headers.MediaTypeHeaderValue]::Parse("application/octet-stream")
$multipartContent.Add($fileContent)
$stringHeader = [System.Net.Http.Headers.ContentDispositionHeaderValue]::new("form-data")
$stringHeader.Name = "label"
$StringContent = [System.Net.Http.StringContent]::new($apiLabel)
$StringContent.Headers.ContentDisposition = $stringHeader
$multipartContent.Add($stringContent)
Write-Host "Request param, label: $apiLabel"
if ($releaseStatus -and ($releaseStatus -ne "Unreleased"))
{
$compareAllParam = [System.Net.Http.Headers.ContentDispositionHeaderValue]::new("form-data")
$compareAllParam.Name = "compareAllRevisions"
$compareAllParamContent = [System.Net.Http.StringContent]::new($true)
$compareAllParamContent.Headers.ContentDisposition = $compareAllParam
$multipartContent.Add($compareAllParamContent)
Write-Host "Request param, compareAllRevisions: true"
}
$headers = @{
"ApiKey" = $apiKey;
"content-type" = "multipart/form-data"
}
try
{
$Response = Invoke-WebRequest -Method 'POST' -Uri $uri -Body $multipartContent -Headers $headers
Write-Host "API Review URL: $($Response.Content)"
$StatusCode = $Response.StatusCode
}
catch
{
Write-Host "Exception details: $($_.Exception.Response)"
$StatusCode = $_.Exception.Response.StatusCode
}
return $StatusCode
}
. (Join-Path $PSScriptRoot common.ps1)
Write-Host "Artifact path: $($ArtifactPath)"
Write-Host "Package Name: $($PackageName)"
Write-Host "Source branch: $($SourceBranch)"
Write-Host "Config File directory: $($ConfigFileDir)"
$packages = @{}
if ($FindArtifactForApiReviewFn -and (Test-Path "Function:$FindArtifactForApiReviewFn"))
{
$packages = &$FindArtifactForApiReviewFn $ArtifactPath $PackageName
}
else
{
Write-Host "The function for 'FindArtifactForApiReviewFn' was not found.`
Make sure it is present in eng/scripts/Language-Settings.ps1 and referenced in eng/common/scripts/common.ps1.`
See https://github.com/Azure/azure-sdk-tools/blob/main/doc/common/common_engsys.md#code-structure"
exit(1)
}
# Check if the package config file is present. This file has the package version, SDK type, and other info.
if (-not $ConfigFileDir)
{
$ConfigFileDir = Join-Path -Path $ArtifactPath "PackageInfo"
}
if ($packages)
{
foreach($pkgPath in $packages.Values)
{
$pkg = Split-Path -Leaf $pkgPath
$pkgPropPath = Join-Path -Path $ConfigFileDir "$PackageName.json"
if (-Not (Test-Path $pkgPropPath))
{
Write-Host " Package property file path $($pkgPropPath) is invalid."
continue
}
# Get package info from json file created before updating version to daily dev
$pkgInfo = Get-Content $pkgPropPath | ConvertFrom-Json
$version = [AzureEngSemanticVersion]::ParseVersionString($pkgInfo.Version)
if ($null -eq $version)
{
Write-Host "Version info is not available for package $PackageName, because version '$($pkgInfo.Version)' is invalid. Please check if the version follows the Azure SDK package versioning guidelines."
exit 1
}
Write-Host "Version: $($version)"
Write-Host "SDK Type: $($pkgInfo.SdkType)"
Write-Host "Release Status: $($pkgInfo.ReleaseStatus)"
# Run the create review step only if the build is triggered from the main branch or if the version is GA.
# This is to avoid invalidating review status by a build triggered from a feature branch.
if ( ($SourceBranch -eq $DefaultBranch) -or (-not $version.IsPrerelease))
{
Write-Host "Submitting API Review for package $($pkg)"
$respCode = Submit-APIReview -packagename $pkg -filePath $pkgPath -uri $APIViewUri -apiKey $APIKey -apiLabel $APILabel -releaseStatus $pkgInfo.ReleaseStatus
Write-Host "HTTP Response code: $($respCode)"
# HTTP status 200 means API is in approved status
if ($respCode -eq '200')
{
Write-Host "API review is in approved status."
}
elseif ($version.IsPrerelease)
{
# Ignore API review status for prerelease version
Write-Host "Package version is not GA. Ignoring API view approval status"
}
elseif (!$pkgInfo.ReleaseStatus -or $pkgInfo.ReleaseStatus -eq "Unreleased")
{
Write-Host "Release date is not set for current version in change log file for package. Ignoring API review approval status since package is not yet ready for release."
}
else
{
# Return an error code if the status code is 201 for a new data plane package.
# Temporarily enable API review for spring SDK types. Ideally this should be done by using the 'IsReviewRequired' method on the language side
# to override the default check of SDK type 'client'.
if (($pkgInfo.SdkType -eq "client" -or $pkgInfo.SdkType -eq "spring") -and $pkgInfo.IsNewSdk)
{
if ($respCode -eq '201')
{
Write-Host "Package version $($version) is GA and automatic API Review is not yet approved for package $($PackageName)."
Write-Host "Build and release is not allowed for GA package without API review approval."
Write-Host "You will need to queue another build to proceed further after API review is approved"
Write-Host "You can check http://aka.ms/azsdk/engsys/apireview/faq for more details on API Approval."
}
else
{
Write-Host "Failed to create API Review for package $($PackageName). Please reach out to Azure SDK engineering systems on teams channel and share this build details."
}
exit 1
}
else
{
Write-Host "API review is not approved for package $($PackageName), however it is not required for this package type so it can still be released without API review approval."
}
}
}
else
{
Write-Host "Build is triggered from $($SourceBranch) with prerelease version. Skipping API review status check."
}
}
}
else
{
Write-Host "No package is found in artifact path to submit review request"
}
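A hypothetical invocation for local testing, assuming the script is saved as Create-APIReview.ps1 (the filename, URI, key variable, and package name are all placeholders):
# Script name and all values below are illustrative, not real endpoints.
./eng/common/scripts/Create-APIReview.ps1 `
-ArtifactPath "./artifacts" `
-APIViewUri "https://example.org/apireview" `
-APIKey $env:APIVIEW_KEY `
-APILabel "manual run" `
-PackageName "azcore" `
-SourceBranch "main" -DefaultBranch "main"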

View file

@ -1,89 +0,0 @@
param(
# The repo owner: e.g. Azure
$RepoOwner,
# The repo name. E.g. azure-sdk-for-java
$RepoName,
# Please use the RepoOwner/RepoName format: e.g. Azure/azure-sdk-for-java
$RepoId="$RepoOwner/$RepoName",
[Parameter(Mandatory = $true)]
$BranchPrefix,
# Date format: e.g. Tuesday, April 12, 2022 1:36:02 PM. Other date formats are also accepted.
[AllowNull()]
[DateTime]$LastCommitOlderThan,
[Parameter(Mandatory = $true)]
$AuthToken
)
. (Join-Path $PSScriptRoot common.ps1)
LogDebug "Operating on Repo [ $RepoId ]"
try{
$responses = Get-GitHubSourceReferences -RepoId $RepoId -Ref "heads/$BranchPrefix" -AuthToken $AuthToken
}
catch {
LogError "Get-GitHubSourceReferences failed with exception:`n$_"
exit 1
}
foreach ($res in $responses)
{
if (!$res -or !$res.ref) {
LogDebug "No branch returned from the branch prefix $BranchPrefix on $Repo. Skipping..."
continue
}
$branch = $res.ref
$branchName = $branch.Replace("refs/heads/","")
try {
$head = "${RepoId}:${branchName}"
LogDebug "Operating on branch [ $branchName ]"
$pullRequests = Get-GitHubPullRequests -RepoId $RepoId -State "all" -Head $head -AuthToken $AuthToken
}
catch
{
LogError "Get-GitHubPullRequests failed with exception:`n$_"
exit 1
}
$openPullRequests = $pullRequests | ? { $_.State -eq "open" }
if ($openPullRequests.Count -gt 0)
{
LogDebug "Branch [ $branchName ] in repo [ $RepoId ] has open pull Requests. Skipping"
LogDebug $openPullRequests.url
continue
}
LogDebug "Branch [ $branchName ] in repo [ $RepoId ] has no associated open Pull Request. "
if ($LastCommitOlderThan) {
if (!$res.object -or !$res.object.url) {
LogWarning "No commit url returned from response. Skipping... "
continue
}
try {
$commitDate = Get-GithubReferenceCommitDate -commitUrl $res.object.url -AuthToken $AuthToken
if (!$commitDate)
{
LogDebug "No last commit date found. Skipping."
continue
}
if ($commitDate -gt $LastCommitOlderThan) {
LogDebug "The branch $branch last commit date [ $commitDate ] is newer than the date $LastCommitOlderThan. Skipping."
continue
}
LogDebug "Branch [ $branchName ] in repo [ $RepoId ] has a last commit date [ $commitDate ] that is older than $LastCommitOlderThan. "
}
catch {
LogError "Get-GithubReferenceCommitDate failed with exception:`n$_"
exit 1
}
}
try {
Remove-GitHubSourceReferences -RepoId $RepoId -Ref $branch -AuthToken $AuthToken
LogDebug "The branch [ $branchName ] in [ $RepoId ] has been deleted."
}
catch {
LogError "Remove-GitHubSourceReferences failed with exception:`n$_"
exit 1
}
}
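A hypothetical invocation that deletes sync branches with no open pull requests and no commits in the last 30 days (the script name and branch prefix are assumptions):
./eng/common/scripts/Delete-RemoteBranches.ps1 `
-RepoOwner "Azure" -RepoName "azure-sdk-for-go" `
-BranchPrefix "sync-eng/common" `
-LastCommitOlderThan (Get-Date).AddDays(-30) `
-AuthToken $env:GITHUB_TOKEN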

Some files were not shown because too many files have changed in this diff