The `small` machine type is the default. Your job runs on this machine type if you don't specify
a [`tags`](../../yaml/index.md#tags) keyword in your `.gitlab-ci.yml` file.

CI/CD jobs that run on the `medium` and `large` machine types consume CI/CD minutes at a different rate
than jobs that run on the `small` machine type. The [cost factor](../../../ci/pipelines/cicd_minutes.md#cost-factor)
applied to a job depends on the size of the machine type.
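For example, if a machine type has a cost factor of `2`, a job that runs on it for 10 minutes consumes
20 CI/CD minutes. The factor applied to each machine type is listed on the linked page.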
## Example of how to tag a job
To use a machine type other than `small`, add a `tags:` keyword to your job.
For example:
```yaml
stages:
  - Prebuild
  - Build
  - Unit Test

job_001:
  stage: Prebuild
  script:
    - echo "this job runs on the default (small) instance"

job_002:
  stage: Build
  tags:
    - saas-linux-medium-amd64
  script:
    - echo "this job runs on the medium instance"
```

## Pre-clone script

On GitLab.com, the `gitlab-org/gitlab` project used the `CI_PRE_CLONE_SCRIPT` variable to seed the
build directory with a cached copy of the repository before the runner fetched changes. The script
looked similar to this fragment (the download of the archive is summarized as a comment here):

```shell
(
  echo "Downloading archived master..."
  # Download the cached gitlab-master.tar.gz archive from the Google Cloud
  # Storage bucket to /tmp/gitlab.tar.gz. The exact URL is omitted here.

  if [ ! -f /tmp/gitlab.tar.gz ]; then
    echo "Repository cache not available, cloning a new directory..."
    exit
  fi

  rm -rf $CI_PROJECT_DIR
  echo "Extracting tarball into $CI_PROJECT_DIR..."
  mkdir -p $CI_PROJECT_DIR
  cd $CI_PROJECT_DIR
  tar xzf /tmp/gitlab.tar.gz
  rm -f /tmp/gitlab.tar.gz
  chmod a+w $CI_PROJECT_DIR
)
```
The first step of the script downloads `gitlab-master.tar.gz` from Google Cloud Storage.
A [GitLab CI/CD job named `cache-repo`](https://gitlab.com/gitlab-org/gitlab/-/blob/5fb40526c8c8aaafc5f92eab36d5bbddaca3893d/.gitlab/ci/cache-repo.gitlab-ci.yml)
was responsible for keeping that archive up to date. It ran on a scheduled pipeline every two hours
and did the following, as sketched below:

1. Created a fresh clone of the `gitlab-org/gitlab` repository on GitLab.com.
1. Saved the data as a `.tar.gz` archive.
1. Uploaded the archive into the Google Cloud Storage bucket.
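The linked job definition is the authoritative source. As a rough sketch only, the three steps
correspond to commands like the following; the local paths and the object name in the
`gitlab-ci-git-repo-cache` bucket are illustrative assumptions:

```shell
# Illustrative sketch of the cache-repo job's three steps, not the exact job definition.
git clone https://gitlab.com/gitlab-org/gitlab.git /tmp/gitlab-master            # 1. fresh clone
tar czf /tmp/gitlab.tar.gz -C /tmp/gitlab-master .                                # 2. save as .tar.gz
gsutil cp /tmp/gitlab.tar.gz gs://gitlab-ci-git-repo-cache/gitlab-master.tar.gz   # 3. upload to the bucket
```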
When a job ran with the pre-clone script configured, the output looked similar to:
```shell
$ eval "$CI_PRE_CLONE_SCRIPT"
Downloading archived master...
Extracting tarball into /builds/gitlab-org/gitlab...
Fetching changes...
Reinitialized existing Git repository in /builds/gitlab-org/gitlab/.git/
```
The `Reinitialized existing Git repository` message shows that
the pre-clone step worked. The runner runs `git init`, which
overwrites the Git configuration with the appropriate settings to fetch
from the GitLab repository.
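You can reproduce that message outside of CI/CD: `git init` in a directory that already contains a
`.git` directory reinitializes the existing repository instead of creating a new one. A minimal sketch,
assuming the pre-clone script has already extracted the archive into `$CI_PROJECT_DIR`:

```shell
cd "$CI_PROJECT_DIR"   # already populated by the pre-clone script
git init               # prints "Reinitialized existing Git repository in ..."
```

Because the repository data is already on disk, the fetch that follows only has to download what
changed since the archive was created.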
The `CI_REPO_CACHE_CREDENTIALS` variable must contain the Google Cloud service account
JSON used to upload to the `gitlab-ci-git-repo-cache` bucket. The bucket should be located in the
same continent as the runner, otherwise [you can incur network egress charges](https://cloud.google.com/storage/pricing).

The runner virtual machines themselves are created and deleted by a runner manager whose `config.toml`
defines the autoscaling behavior. A fragment of that configuration looks similar to the following;
non-public values are replaced with placeholders such as `PROJECT`:

```toml
[runners.docker]
  # Other Docker executor settings are omitted from this fragment.
  volumes = [
    "/dummy-sys-class-dmi-id:/sys/class/dmi/id:ro" # Make kaniko builds work on GCP.
  ]

[runners.machine]
  IdleCount = 50
  IdleTime = 3600
  MaxBuilds = 1 # For security reasons, the VM is deleted after the job finishes so it is not reused.
  MachineName = "srm-%s"
  MachineDriver = "google"
  MachineOptions = [
    "google-project=PROJECT",
    "google-disk-size=25",
    "google-machine-type=n1-standard-1",
    "google-username=core",
    "google-tags=gitlab-com,srm",
    "google-use-internal-ip",
    "google-zone=us-east1-d",
    "engine-opt=mtu=1460", # Set the MTU for the container interface. For more information, see https://gitlab.com/gitlab-org/gitlab-runner/-/issues/3214#note_82892928
    "engine-opt=ipv6", # Create IPv6 interfaces in the containers.
    "engine-opt=fixed-cidr-v6=fc00::/7",
    "google-operation-backoff-initial-interval=2" # Custom flag from a forked docker-machine. For more information, see https://github.com/docker/machine/pull/4600
  ]
```