Mirror of https://github.com/actions/cache.git, synced 2025-06-28 05:11:12 +02:00

Compare commits: v3.2.0-bet ... Phantsure- (6 commits)

SHA1:
cfa7ba9007
9b0be58822
7d403ca5c3
c17f4bf466
8a9ab1ae8c
f10295073f
@@ -6,7 +6,7 @@
 	// Use 'forwardPorts' to make a list of ports inside the container available locally.
 	// "forwardPorts": [],
 	// Use 'postCreateCommand' to run commands after the container is created.
-	"postCreateCommand": "npm install && npm run build"
+	"postCreateCommand": "npm install"
 	// Configure tool-specific properties.
 	// "customizations": {},
 	// Uncomment to connect as root instead. More info: https://aka.ms/dev-containers-non-root.
.github/workflows/codeql.yml (vendored, 40 lines changed)
@@ -8,45 +8,39 @@ on:
 
 jobs:
   CodeQL-Build:
-    # CodeQL runs on ubuntu-latest, windows-latest, and macos-latest
+    # CodeQL runs on ubuntu-latest and windows-latest
     runs-on: ubuntu-latest
 
+    permissions:
+      # required for all workflows
+      security-events: write
+
     steps:
       - name: Checkout repository
         uses: actions/checkout@v3
-        with:
-          # We must fetch at least the immediate parents so that if this is
-          # a pull request then we can checkout the head.
-          fetch-depth: 2
-
-      # If this run was triggered by a pull request event, then checkout
-      # the head of the pull request instead of the merge commit.
-      - run: git checkout HEAD^2
-        if: ${{ github.event_name == 'pull_request' }}
 
       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL
-        uses: github/codeql-action/init@v1
+        uses: github/codeql-action/init@v2
         # Override language selection by uncommenting this and choosing your languages
         # with:
-        #   languages: go, javascript, csharp, python, cpp, java
+        #   languages: go, javascript, csharp, python, cpp, java, ruby
 
-      # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
-      # If this step fails, then you should remove it and run the build manually (see below)
+      # Autobuild attempts to build any compiled languages (C/C++, C#, Go, or Java).
+      # If this step fails, then you should remove it and run the build manually (see below).
       - name: Autobuild
-        uses: github/codeql-action/autobuild@v1
+        uses: github/codeql-action/autobuild@v2
 
       # ℹ️ Command-line programs to run using the OS shell.
-      # 📚 https://git.io/JvXDl
+      # 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
 
-      # ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
-      #    and modify them (or add more) to build your code if your project
-      #    uses a compiled language
+      # ✏️ If the Autobuild fails above, remove it and uncomment the following
+      #    three lines and modify them (or add more) to build your code if your
+      #    project uses a compiled language
 
       #- run: |
       #   make bootstrap
       #   make release
 
       - name: Perform CodeQL Analysis
-        uses: github/codeql-action/analyze@v1
+        uses: github/codeql-action/analyze@v2
.licenses/npm/@actions/cache.dep.yml (generated, 2 lines changed)
@@ -1,6 +1,6 @@
 ---
 name: "@actions/cache"
-version: 3.0.5
+version: 3.1.0
 type: npm
 summary:
 homepage:
@@ -27,6 +27,8 @@ See ["Caching dependencies to speed up workflows"](https://docs.github.com/en/ac
 * Fixed the download stuck problem by introducing a timeout of 1 hour for cache downloads.
 * Fix zstd not working for windows on gnu tar in issues.
 * Allowing users to provide a custom timeout as input for aborting download of a cache segment using an environment variable `SEGMENT_DOWNLOAD_TIMEOUT_MINS`. Default is 60 minutes.
+* Two new actions available for granular control over caches - [restore](restore/action.yml) and [save](save/action.yml)
+* Add support for cross os caching. For example, a cache saved on windows can be restored on ubuntu and vice versa.
 
 Refer [here](https://github.com/actions/cache/blob/v2/README.md) for previous versions
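
As context for the `SEGMENT_DOWNLOAD_TIMEOUT_MINS` bullet above: the variable is read from the step environment, so it can be set directly on the cache step. A minimal sketch, with an illustrative path, key, and 10-minute value:

```yaml
- uses: actions/cache@v3
  env:
    # Example value; aborts a stuck segment download after 10 minutes
    # instead of the 60-minute default
    SEGMENT_DOWNLOAD_TIMEOUT_MINS: 10
  with:
    path: node_modules
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
```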
@@ -52,3 +52,11 @@
 
 ### 3.2.0-beta.1
 - Added two new actions - [restore](restore/action.yml) and [save](save/action.yml) for granular control on cache.
+
+### 3.2.0
+- Released the two new actions - [restore](restore/action.yml) and [save](save/action.yml) for granular control on cache
+
+### 3.2.1
+- Update `@actions/cache` on windows to use gnu tar and zstd by default and fallback to bsdtar and zstd if gnu tar is not available. ([issue](https://github.com/actions/cache/issues/984))
+- Added support for fallback to gzip to restore old caches on windows.
+- Added logs for cache version in case of a cache miss.
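
For the granular restore and save actions released in 3.2.0, a minimal usage sketch (the path, key, and step id are illustrative):

```yaml
- uses: actions/cache/restore@v3
  id: npm-cache
  with:
    path: node_modules
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}

# ... build and test steps go here ...

# Save only on a miss, so an identical cache is not re-uploaded
- uses: actions/cache/save@v3
  if: steps.npm-cache.outputs.cache-hit != 'true'
  with:
    path: node_modules
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
```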
@@ -91,3 +91,31 @@ test("save with valid inputs uploads a cache", async () => {
 
     expect(failedMock).toHaveBeenCalledTimes(0);
 });
+
+test("save failing logs the warning message", async () => {
+    const warningMock = jest.spyOn(core, "warning");
+
+    const primaryKey = "Linux-node-bb828da54c148048dd17899ba9fda624811cfb43";
+
+    const inputPath = "node_modules";
+    testUtils.setInput(Inputs.Key, primaryKey);
+    testUtils.setInput(Inputs.Path, inputPath);
+    testUtils.setInput(Inputs.UploadChunkSize, "4000000");
+
+    const cacheId = -1;
+    const saveCacheMock = jest
+        .spyOn(cache, "saveCache")
+        .mockImplementationOnce(() => {
+            return Promise.resolve(cacheId);
+        });
+
+    await run();
+
+    expect(saveCacheMock).toHaveBeenCalledTimes(1);
+    expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, {
+        uploadChunkSize: 4000000
+    });
+
+    expect(warningMock).toHaveBeenCalledTimes(1);
+    expect(warningMock).toHaveBeenCalledWith("Cache save failed.");
+});
@@ -25,16 +25,15 @@ afterEach(() => {
 });
 
 test("StateProvider saves states", async () => {
+    const states = new Map<string, string>();
     const getStateMock = jest
         .spyOn(core, "getState")
-        .mockImplementation(name =>
-            jest.requireActual("@actions/core").getState(name)
-        );
+        .mockImplementation(key => states.get(key) || "");
 
     const saveStateMock = jest
         .spyOn(core, "saveState")
         .mockImplementation((key, value) => {
-            return jest.requireActual("@actions/core").saveState(key, value);
+            states.set(key, value);
         });
 
     const setOutputMock = jest
@@ -48,9 +47,11 @@ test("StateProvider saves states", async () => {
     const stateProvider: IStateProvider = new StateProvider();
     stateProvider.setState("stateKey", "stateValue");
     stateProvider.setState(State.CacheMatchedKey, cacheMatchedKey);
-    stateProvider.getState("stateKey");
-    stateProvider.getCacheState();
+    const stateValue = stateProvider.getState("stateKey");
+    const cacheStateValue = stateProvider.getCacheState();
 
+    expect(stateValue).toBe("stateValue");
+    expect(cacheStateValue).toBe(cacheMatchedKey);
     expect(getStateMock).toHaveBeenCalledTimes(2);
     expect(saveStateMock).toHaveBeenCalledTimes(2);
     expect(setOutputMock).toHaveBeenCalledTimes(0);
dist/restore-only/index.js (vendored, 32 lines changed)
@@ -3431,8 +3431,12 @@ function getCacheEntry(keys, paths, options) {
         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
         const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
         const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
+        // Cache not found
         if (response.statusCode === 204) {
-            // Cache not found
+            // List cache for primary key only if cache miss occurs
+            if (core.isDebug()) {
+                yield printCachesListForDiagnostics(keys[0], httpClient, version);
+            }
             return null;
         }
         if (!requestUtils_1.isSuccessStatusCode(response.statusCode)) {
@@ -3451,6 +3455,22 @@ function getCacheEntry(keys, paths, options) {
     });
 }
 exports.getCacheEntry = getCacheEntry;
+function printCachesListForDiagnostics(key, httpClient, version) {
+    return __awaiter(this, void 0, void 0, function* () {
+        const resource = `caches?key=${encodeURIComponent(key)}`;
+        const response = yield requestUtils_1.retryTypedResponse('listCache', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
+        if (response.statusCode === 200) {
+            const cacheListResult = response.result;
+            const totalCount = cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.totalCount;
+            if (totalCount && totalCount > 0) {
+                core.debug(`No matching cache found for cache key '${key}', version '${version} and scope ${process.env['GITHUB_REF']}. There exist one or more cache(s) with similar key but they have different version or scope. See more info on cache matching here: https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#matching-a-cache-key \nOther caches with similar key:`);
+                for (const cacheEntry of (cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.artifactCaches) || []) {
+                    core.debug(`Cache Key: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheKey}, Cache Version: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheVersion}, Cache Scope: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.scope}, Cache Created: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.creationTime}`);
+                }
+            }
+        }
+    });
+}
 function downloadCache(archiveLocation, archivePath, options) {
     return __awaiter(this, void 0, void 0, function* () {
         const archiveUrl = new url_1.URL(archiveLocation);
@@ -38329,7 +38349,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
         case constants_1.CompressionMethod.Zstd:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -d --long=30 -o',
+                    'zstd -d --long=30 --force -o',
                     constants_1.TarFilename,
                     archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
                 ]
@@ -38340,7 +38360,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
         case constants_1.CompressionMethod.ZstdWithoutLong:
            return BSD_TAR_ZSTD
                 ? [
-                    'zstd -d -o',
+                    'zstd -d --force -o',
                     constants_1.TarFilename,
                     archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
                 ]
@@ -38366,7 +38386,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
         case constants_1.CompressionMethod.Zstd:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -T0 --long=30 -o',
+                    'zstd -T0 --long=30 --force -o',
                     cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
                     constants_1.TarFilename
                 ]
@@ -38377,7 +38397,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
         case constants_1.CompressionMethod.ZstdWithoutLong:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -T0 -o',
+                    'zstd -T0 --force -o',
                     cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
                     constants_1.TarFilename
                 ]
@@ -47256,7 +47276,7 @@ function restoreCache(paths, primaryKey, restoreKeys, options) {
             if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
                 return undefined;
             }
-            core.debug("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
+            core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
         }
         else {
             // Cache not found
dist/restore/index.js (vendored, 32 lines changed)
@@ -3431,8 +3431,12 @@ function getCacheEntry(keys, paths, options) {
         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
         const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
         const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
+        // Cache not found
         if (response.statusCode === 204) {
-            // Cache not found
+            // List cache for primary key only if cache miss occurs
+            if (core.isDebug()) {
+                yield printCachesListForDiagnostics(keys[0], httpClient, version);
+            }
             return null;
         }
         if (!requestUtils_1.isSuccessStatusCode(response.statusCode)) {
@@ -3451,6 +3455,22 @@ function getCacheEntry(keys, paths, options) {
     });
 }
 exports.getCacheEntry = getCacheEntry;
+function printCachesListForDiagnostics(key, httpClient, version) {
+    return __awaiter(this, void 0, void 0, function* () {
+        const resource = `caches?key=${encodeURIComponent(key)}`;
+        const response = yield requestUtils_1.retryTypedResponse('listCache', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
+        if (response.statusCode === 200) {
+            const cacheListResult = response.result;
+            const totalCount = cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.totalCount;
+            if (totalCount && totalCount > 0) {
+                core.debug(`No matching cache found for cache key '${key}', version '${version} and scope ${process.env['GITHUB_REF']}. There exist one or more cache(s) with similar key but they have different version or scope. See more info on cache matching here: https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#matching-a-cache-key \nOther caches with similar key:`);
+                for (const cacheEntry of (cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.artifactCaches) || []) {
+                    core.debug(`Cache Key: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheKey}, Cache Version: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheVersion}, Cache Scope: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.scope}, Cache Created: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.creationTime}`);
+                }
+            }
+        }
+    });
+}
 function downloadCache(archiveLocation, archivePath, options) {
     return __awaiter(this, void 0, void 0, function* () {
         const archiveUrl = new url_1.URL(archiveLocation);
@@ -38242,7 +38262,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
         case constants_1.CompressionMethod.Zstd:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -d --long=30 -o',
+                    'zstd -d --long=30 --force -o',
                     constants_1.TarFilename,
                     archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
                 ]
@@ -38253,7 +38273,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
         case constants_1.CompressionMethod.ZstdWithoutLong:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -d -o',
+                    'zstd -d --force -o',
                     constants_1.TarFilename,
                     archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
                 ]
@@ -38279,7 +38299,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
         case constants_1.CompressionMethod.Zstd:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -T0 --long=30 -o',
+                    'zstd -T0 --long=30 --force -o',
                     cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
                     constants_1.TarFilename
                 ]
@@ -38290,7 +38310,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
         case constants_1.CompressionMethod.ZstdWithoutLong:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -T0 -o',
+                    'zstd -T0 --force -o',
                     cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
                     constants_1.TarFilename
                 ]
@@ -47227,7 +47247,7 @@ function restoreCache(paths, primaryKey, restoreKeys, options) {
             if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
                 return undefined;
             }
-            core.debug("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
+            core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
         }
         else {
             // Cache not found
dist/save-only/index.js (vendored, 65 lines changed)
@@ -1043,6 +1043,29 @@ class ExecState extends events.EventEmitter {
 
 "use strict";
 
+var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
+    if (k2 === undefined) k2 = k;
+    var desc = Object.getOwnPropertyDescriptor(m, k);
+    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
+        desc = { enumerable: true, get: function() { return m[k]; } };
+    }
+    Object.defineProperty(o, k2, desc);
+}) : (function(o, m, k, k2) {
+    if (k2 === undefined) k2 = k;
+    o[k2] = m[k];
+}));
+var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
+    Object.defineProperty(o, "default", { enumerable: true, value: v });
+}) : function(o, v) {
+    o["default"] = v;
+});
+var __importStar = (this && this.__importStar) || function (mod) {
+    if (mod && mod.__esModule) return mod;
+    var result = {};
+    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
+    __setModuleDefault(result, mod);
+    return result;
+};
 var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
     function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
     return new (P || (P = Promise))(function (resolve, reject) {
@@ -1056,11 +1079,15 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
     return (mod && mod.__esModule) ? mod : { "default": mod };
 };
 Object.defineProperty(exports, "__esModule", { value: true });
+const core = __importStar(__webpack_require__(470));
 const saveImpl_1 = __importDefault(__webpack_require__(471));
 const stateProvider_1 = __webpack_require__(309);
 function run() {
     return __awaiter(this, void 0, void 0, function* () {
-        yield (0, saveImpl_1.default)(new stateProvider_1.NullStateProvider());
+        const cacheId = yield (0, saveImpl_1.default)(new stateProvider_1.NullStateProvider());
+        if (cacheId === -1) {
+            core.warning(`Cache save failed.`);
+        }
     });
 }
 run();
@@ -3460,8 +3487,12 @@ function getCacheEntry(keys, paths, options) {
         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
         const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
         const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
+        // Cache not found
         if (response.statusCode === 204) {
-            // Cache not found
+            // List cache for primary key only if cache miss occurs
+            if (core.isDebug()) {
+                yield printCachesListForDiagnostics(keys[0], httpClient, version);
+            }
             return null;
         }
         if (!requestUtils_1.isSuccessStatusCode(response.statusCode)) {
@@ -3480,6 +3511,22 @@ function getCacheEntry(keys, paths, options) {
     });
 }
 exports.getCacheEntry = getCacheEntry;
+function printCachesListForDiagnostics(key, httpClient, version) {
+    return __awaiter(this, void 0, void 0, function* () {
+        const resource = `caches?key=${encodeURIComponent(key)}`;
+        const response = yield requestUtils_1.retryTypedResponse('listCache', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
+        if (response.statusCode === 200) {
+            const cacheListResult = response.result;
+            const totalCount = cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.totalCount;
+            if (totalCount && totalCount > 0) {
+                core.debug(`No matching cache found for cache key '${key}', version '${version} and scope ${process.env['GITHUB_REF']}. There exist one or more cache(s) with similar key but they have different version or scope. See more info on cache matching here: https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#matching-a-cache-key \nOther caches with similar key:`);
+                for (const cacheEntry of (cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.artifactCaches) || []) {
+                    core.debug(`Cache Key: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheKey}, Cache Version: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheVersion}, Cache Scope: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.scope}, Cache Created: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.creationTime}`);
+                }
+            }
+        }
+    });
+}
 function downloadCache(archiveLocation, archivePath, options) {
     return __awaiter(this, void 0, void 0, function* () {
         const archiveUrl = new url_1.URL(archiveLocation);
@@ -38266,7 +38313,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
         case constants_1.CompressionMethod.Zstd:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -d --long=30 -o',
+                    'zstd -d --long=30 --force -o',
                     constants_1.TarFilename,
                     archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
                 ]
@@ -38277,7 +38324,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
         case constants_1.CompressionMethod.ZstdWithoutLong:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -d -o',
+                    'zstd -d --force -o',
                     constants_1.TarFilename,
                     archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
                 ]
@@ -38303,7 +38350,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
         case constants_1.CompressionMethod.Zstd:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -T0 --long=30 -o',
+                    'zstd -T0 --long=30 --force -o',
                     cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
                     constants_1.TarFilename
                 ]
@@ -38314,7 +38361,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
         case constants_1.CompressionMethod.ZstdWithoutLong:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -T0 -o',
+                    'zstd -T0 --force -o',
                     cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
                     constants_1.TarFilename
                 ]
@@ -41092,6 +41139,7 @@ const utils = __importStar(__webpack_require__(443));
 process.on("uncaughtException", e => utils.logWarning(e.message));
 function saveImpl(stateProvider) {
     return __awaiter(this, void 0, void 0, function* () {
+        let cacheId = -1;
         try {
             if (!utils.isCacheFeatureAvailable()) {
                 return;
@@ -41118,7 +41166,7 @@ function saveImpl(stateProvider) {
         const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, {
             required: true
         });
-        const cacheId = yield cache.saveCache(cachePaths, primaryKey, {
+        cacheId = yield cache.saveCache(cachePaths, primaryKey, {
             uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize)
         });
         if (cacheId != -1) {
@@ -41128,6 +41176,7 @@ function saveImpl(stateProvider) {
         catch (error) {
             utils.logWarning(error.message);
         }
+        return cacheId;
     });
 }
 exports.default = saveImpl;
@@ -47340,7 +47389,7 @@ function restoreCache(paths, primaryKey, restoreKeys, options) {
             if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
                 return undefined;
             }
-            core.debug("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
+            core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
         }
         else {
             // Cache not found
dist/save/index.js (vendored, 36 lines changed)
@@ -3431,8 +3431,12 @@ function getCacheEntry(keys, paths, options) {
         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
         const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
         const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
+        // Cache not found
         if (response.statusCode === 204) {
-            // Cache not found
+            // List cache for primary key only if cache miss occurs
+            if (core.isDebug()) {
+                yield printCachesListForDiagnostics(keys[0], httpClient, version);
+            }
             return null;
         }
         if (!requestUtils_1.isSuccessStatusCode(response.statusCode)) {
@@ -3451,6 +3455,22 @@ function getCacheEntry(keys, paths, options) {
     });
 }
 exports.getCacheEntry = getCacheEntry;
+function printCachesListForDiagnostics(key, httpClient, version) {
+    return __awaiter(this, void 0, void 0, function* () {
+        const resource = `caches?key=${encodeURIComponent(key)}`;
+        const response = yield requestUtils_1.retryTypedResponse('listCache', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
+        if (response.statusCode === 200) {
+            const cacheListResult = response.result;
+            const totalCount = cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.totalCount;
+            if (totalCount && totalCount > 0) {
+                core.debug(`No matching cache found for cache key '${key}', version '${version} and scope ${process.env['GITHUB_REF']}. There exist one or more cache(s) with similar key but they have different version or scope. See more info on cache matching here: https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#matching-a-cache-key \nOther caches with similar key:`);
+                for (const cacheEntry of (cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.artifactCaches) || []) {
+                    core.debug(`Cache Key: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheKey}, Cache Version: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheVersion}, Cache Scope: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.scope}, Cache Created: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.creationTime}`);
+                }
+            }
+        }
+    });
+}
 function downloadCache(archiveLocation, archivePath, options) {
     return __awaiter(this, void 0, void 0, function* () {
         const archiveUrl = new url_1.URL(archiveLocation);
@@ -38237,7 +38257,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
         case constants_1.CompressionMethod.Zstd:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -d --long=30 -o',
+                    'zstd -d --long=30 --force -o',
                     constants_1.TarFilename,
                     archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
                 ]
@@ -38248,7 +38268,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
         case constants_1.CompressionMethod.ZstdWithoutLong:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -d -o',
+                    'zstd -d --force -o',
                     constants_1.TarFilename,
                     archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
                 ]
@@ -38274,7 +38294,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
         case constants_1.CompressionMethod.Zstd:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -T0 --long=30 -o',
+                    'zstd -T0 --long=30 --force -o',
                     cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
                     constants_1.TarFilename
                 ]
@@ -38285,7 +38305,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
         case constants_1.CompressionMethod.ZstdWithoutLong:
             return BSD_TAR_ZSTD
                 ? [
-                    'zstd -T0 -o',
+                    'zstd -T0 --force -o',
                     cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
                     constants_1.TarFilename
                 ]
@@ -41063,6 +41083,7 @@ const utils = __importStar(__webpack_require__(443));
 process.on("uncaughtException", e => utils.logWarning(e.message));
 function saveImpl(stateProvider) {
     return __awaiter(this, void 0, void 0, function* () {
+        let cacheId = -1;
         try {
             if (!utils.isCacheFeatureAvailable()) {
                 return;
@@ -41089,7 +41110,7 @@ function saveImpl(stateProvider) {
         const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, {
             required: true
         });
-        const cacheId = yield cache.saveCache(cachePaths, primaryKey, {
+        cacheId = yield cache.saveCache(cachePaths, primaryKey, {
             uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize)
         });
         if (cacheId != -1) {
@@ -41099,6 +41120,7 @@ function saveImpl(stateProvider) {
         catch (error) {
             utils.logWarning(error.message);
         }
+        return cacheId;
     });
 }
 exports.default = saveImpl;
@@ -47340,7 +47362,7 @@ function restoreCache(paths, primaryKey, restoreKeys, options) {
             if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
                 return undefined;
             }
-            core.debug("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
+            core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
         }
         else {
             // Cache not found
package-lock.json (generated, 18 lines changed)
@@ -1,15 +1,15 @@
 {
   "name": "cache",
-  "version": "3.2.0-beta.1",
+  "version": "3.2.1",
   "lockfileVersion": 2,
   "requires": true,
   "packages": {
     "": {
       "name": "cache",
-      "version": "3.2.0-beta.1",
+      "version": "3.2.1",
       "license": "MIT",
       "dependencies": {
-        "@actions/cache": "3.1.0-beta.3",
+        "@actions/cache": "^3.1.0",
         "@actions/core": "^1.10.0",
         "@actions/exec": "^1.1.1",
         "@actions/io": "^1.1.2"
@@ -36,9 +36,9 @@
       }
     },
     "node_modules/@actions/cache": {
-      "version": "3.1.0-beta.3",
-      "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0-beta.3.tgz",
-      "integrity": "sha512-71S1vd0WKLbC2lAe04pCYqTLBjSa8gURtiqnVBCYAt8QVBjOfwa2D3ESf2m8K2xjUxman/Yimdp7CPJDyFnxZg==",
+      "version": "3.1.0",
+      "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0.tgz",
+      "integrity": "sha512-wKGJkpK3uFTgwy+KA0fxz0H3/ZPymdi0IlyhMmyoMeWd+CIv8xVPWdGlrPDDdN9bFgve2yvEPZVaKRb43Uwtyg==",
       "dependencies": {
         "@actions/core": "^1.10.0",
         "@actions/exec": "^1.0.1",
@@ -9722,9 +9722,9 @@
     },
     "dependencies": {
       "@actions/cache": {
-        "version": "3.1.0-beta.3",
-        "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0-beta.3.tgz",
-        "integrity": "sha512-71S1vd0WKLbC2lAe04pCYqTLBjSa8gURtiqnVBCYAt8QVBjOfwa2D3ESf2m8K2xjUxman/Yimdp7CPJDyFnxZg==",
+        "version": "3.1.0",
+        "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0.tgz",
+        "integrity": "sha512-wKGJkpK3uFTgwy+KA0fxz0H3/ZPymdi0IlyhMmyoMeWd+CIv8xVPWdGlrPDDdN9bFgve2yvEPZVaKRb43Uwtyg==",
         "requires": {
           "@actions/core": "^1.10.0",
           "@actions/exec": "^1.0.1",
@@ -1,6 +1,6 @@
 {
   "name": "cache",
-  "version": "3.2.0-beta.1",
+  "version": "3.2.1",
   "private": true,
   "description": "Cache dependencies and build outputs",
   "main": "dist/restore/index.js",
@@ -23,7 +23,7 @@
   "author": "GitHub",
   "license": "MIT",
   "dependencies": {
-    "@actions/cache": "3.1.0-beta.3",
+    "@actions/cache": "^3.1.0",
     "@actions/core": "^1.10.0",
     "@actions/exec": "^1.1.1",
     "@actions/io": "^1.1.2"
@@ -10,7 +10,8 @@ import * as utils from "./utils/actionUtils";
 // throw an uncaught exception. Instead of failing this action, just warn.
 process.on("uncaughtException", e => utils.logWarning(e.message));
 
-async function saveImpl(stateProvider: IStateProvider): Promise<void> {
+async function saveImpl(stateProvider: IStateProvider): Promise<number | void> {
+    let cacheId = -1;
     try {
         if (!utils.isCacheFeatureAvailable()) {
             return;
@@ -51,7 +52,7 @@ async function saveImpl(stateProvider: IStateProvider): Promise<void> {
             required: true
         });
 
-        const cacheId = await cache.saveCache(cachePaths, primaryKey, {
+        cacheId = await cache.saveCache(cachePaths, primaryKey, {
             uploadChunkSize: utils.getInputAsInt(Inputs.UploadChunkSize)
         });
 
@@ -61,6 +62,7 @@ async function saveImpl(stateProvider: IStateProvider): Promise<void> {
     } catch (error: unknown) {
         utils.logWarning((error as Error).message);
     }
+    return cacheId;
 }
 
 export default saveImpl;
@@ -1,8 +1,13 @@
+import * as core from "@actions/core";
+
 import saveImpl from "./saveImpl";
 import { NullStateProvider } from "./stateProvider";
 
 async function run(): Promise<void> {
-    await saveImpl(new NullStateProvider());
+    const cacheId = await saveImpl(new NullStateProvider());
+    if (cacheId === -1) {
+        core.warning(`Cache save failed.`);
+    }
 }
 
 run();
@@ -19,24 +19,6 @@ A cache today is immutable and cannot be updated. But some use cases require the
 ## Use cache across feature branches
 Reusing cache across feature branches is not allowed today to provide cache [isolation](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache). However if both feature branches are from the default branch, a good way to achieve this is to ensure that the default branch has a cache. This cache will then be consumable by both feature branches.
 
-## Improving cache restore performance on Windows/Using cross-os caching
-Currently, cache restore is slow on Windows due to tar being inherently slow and the compression algorithm `gzip` in use. `zstd` is the default algorithm in use on linux and macos. It was disabled on Windows due to issues with bsd tar(libarchive), the tar implementation in use on Windows.
-
-To improve cache restore performance, we can re-enable `zstd` as the compression algorithm using the following workaround. Add the following step to your workflow before the cache step:
-
-```yaml
-    - if: ${{ runner.os == 'Windows' }}
-      name: Use GNU tar
-      shell: cmd
-      run: |
-        echo "Adding GNU tar to PATH"
-        echo C:\Program Files\Git\usr\bin>>"%GITHUB_PATH%"
-```
-
-The `cache` action will use GNU tar instead of bsd tar on Windows. This should work on all Github Hosted runners as it is. For self-hosted runners, please ensure you have GNU tar and `zstd` installed.
-
-The above workaround is also needed if you wish to use cross-os caching since difference of compression algorithms will result in different cache versions for the same cache key. So the above workaround will ensure `zstd` is used for caching on all platforms thus resulting in the same cache version for the same cache key.
-
 ## Force deletion of caches overriding default cache eviction policy
 Caches have [branch scope restriction](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache) in place. This means that if caches for a specific branch are using a lot of storage quota, it may result into more frequently used caches from `default` branch getting thrashed. For example, if there are many pull requests happening on a repo and are creating caches, these cannot be used in default branch scope but will still occupy a lot of space till they get cleaned up by [eviction policy](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#usage-limits-and-eviction-policy). But sometime we want to clean them up on a faster cadence so as to ensure default branch is not thrashing. In order to achieve this, [gh-actions-cache cli](https://github.com/actions/gh-actions-cache/) can be used to delete caches for specific branches.
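
A sketch of the cleanup described above as a workflow step, assuming the [gh-actions-cache](https://github.com/actions/gh-actions-cache/) extension's `list` and `delete` subcommands (the branch ref is illustrative):

```yaml
- name: Clean up caches for a merged pull request
  run: |
    gh extension install actions/gh-actions-cache
    BRANCH=refs/pull/${{ github.event.pull_request.number }}/merge
    # Delete every cache created in the pull request's branch scope
    for key in $(gh actions-cache list -R ${{ github.repository }} -B $BRANCH | cut -f 1); do
      gh actions-cache delete "$key" -R ${{ github.repository }} -B $BRANCH --confirm
    done
  env:
    GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```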