Compare commits


3 Commits

Author SHA1 Message Date
914fcb3068 Update license 2022-12-23 06:12:09 +00:00
95a9c77002 Release compression related changes for windows 2022-12-23 05:59:01 +00:00
c17f4bf466 GA for granular cache (#1035)
* Add example for Haskell Stack

* Revert "Add example for Haskell Stack"

* Basic implementation

* Updated variable name

* Adding wrapper class

* Changed logs to warnings

* added debug logs

* experimenting

* Test

* test

* new try

* test

* Impl separated

* Reverted wrapper changes

* Added test cases

* Some cleanup

* Formatted document

* Fixed test cases issues

* Slight modification for test cases check

* Updated new actions' input descriptions

* Reverted custom asks implemented and added wrapper

* refactor into a generic outputter

* Readme draft for new actions

* Generated dist

* Fixed breaking test case

* Removed return type in promise

* Removed commented lines

* Calling methods from same file

* dist

* update save as well

* fix merge

* Changes for beta release

* Update dist folder

* Fixed formatting

* dist

* Add support for gzip fallback for restore of old cache on windows

* Fixed test cases

* Fixed test cases

* Added restore only and save only test cases

* Updated new actions dist files

* Removed comments

* Fixed inputs

* Renamed variables and added tests

* Fixed breaking test case

* Fixed review comments and tests

* added stateprovider changes

* Deleted stateprovider tests until added

* Added stateprovider test cases

* Fixed breaking test case

* Updated outputs of restore action

* Changes for beta release

* Update dist folder

* Add support for gzip fallback for restore of old cache on windows

* update for new beta release

* Update save/action.yml

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>

* Update restore/action.yml

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>

* Update restore/action.yml

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>

* Update restore/action.yml

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>

* Update restore/action.yml

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>

* Added more assertions as values can't be checked

* Removed unused code

* Merged beta branch and resolved conflicts

* Added save readme

* Updates to save readme

* Renamed output

* Added cache hit info in readme

* Update restore/README.md

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>

* Update restore/README.md

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>

* Update restore/README.md

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>

* Update save/README.md

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>

* Update save/README.md

Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>

* Removed verbose statements

* Repositioned new actions introduction

* Added test case for restore state

* Addressed review comments

* nit

* nit: added language to code blocks

* Updated beta version to 3.2.0-beta.1

* Added stateprovider mock implementations

* Linting errors fixed

* Save-only warning added

* Updated return ID to -2

* Removed -2 error code

* Removed comment

* Updated cache npm lib version

* Updated license version

* Updated releases.md

* Updated readme with the new actions in what's new

Co-authored-by: Malo Bourgon <mbourgon@gmail.com>
Co-authored-by: Vipul <vsvipul@github.com>
Co-authored-by: Bishal Prasad <bishal-pdmsft@github.com>
Co-authored-by: Tanuj Kumar Mishra <tanuj077@users.noreply.github.com>
Co-authored-by: Sampark Sharma <phantsure@github.com>
2022-12-21 19:38:44 +05:30
15 changed files with 206 additions and 67 deletions

View File

@ -6,7 +6,7 @@
// Use 'forwardPorts' to make a list of ports inside the container available locally.
// "forwardPorts": [],
// Use 'postCreateCommand' to run commands after the container is created.
"postCreateCommand": "npm install && npm run build"
"postCreateCommand": "npm install"
// Configure tool-specific properties.
// "customizations": {},
// Uncomment to connect as root instead. More info: https://aka.ms/dev-containers-non-root.

View File

@ -1,6 +1,6 @@
---
name: "@actions/cache"
version: 3.0.5
version: 3.1.0
type: npm
summary:
homepage:

View File

@ -27,6 +27,8 @@ See ["Caching dependencies to speed up workflows"](https://docs.github.com/en/ac
* Fixed the stuck-download problem by introducing a timeout of 1 hour for cache downloads.
* Fixed zstd not working on Windows with GNU tar (reported in issues).
* Allow users to provide a custom timeout for aborting the download of a cache segment, via the environment variable `SEGMENT_DOWNLOAD_TIMEOUT_MINS`. The default is 60 minutes.
* Two new actions are available for granular control over caches - [restore](restore/action.yml) and [save](save/action.yml) (see the sketch below).
* Added support for cross-OS caching: for example, a cache saved on Windows can be restored on Ubuntu, and vice versa.
Refer [here](https://github.com/actions/cache/blob/v2/README.md) for previous versions
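A usage sketch for the granular actions (the `path` and `key` values are illustrative placeholders; the input and output names come from the linked [restore/action.yml](restore/action.yml) and [save/action.yml](save/action.yml)):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      # Restore a cache without scheduling a save at the end of the job.
      - uses: actions/cache/restore@v3
        id: npm-cache
        with:
          path: node_modules
          key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}

      # Rebuild only on a cache miss.
      - if: steps.npm-cache.outputs.cache-hit != 'true'
        run: npm ci
```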

View File

@ -52,3 +52,11 @@
### 3.2.0-beta.1
- Added two new actions - [restore](restore/action.yml) and [save](save/action.yml) - for granular control over caches.
### 3.2.0
- Released the two new actions - [restore](restore/action.yml) and [save](save/action.yml) - for granular control over caches.
### 3.2.1
- Updated `@actions/cache` on Windows to use GNU tar and zstd by default, falling back to bsdtar and zstd if GNU tar is not available. ([issue](https://github.com/actions/cache/issues/984))
- Added support for falling back to gzip to restore old caches on Windows.
- Added logs listing the cache version on a cache miss (emitted only when debug logging is enabled).

View File

@ -91,3 +91,31 @@ test("save with valid inputs uploads a cache", async () => {
expect(failedMock).toHaveBeenCalledTimes(0);
});
test("save failing logs the warning message", async () => {
const warningMock = jest.spyOn(core, "warning");
const primaryKey = "Linux-node-bb828da54c148048dd17899ba9fda624811cfb43";
const inputPath = "node_modules";
testUtils.setInput(Inputs.Key, primaryKey);
testUtils.setInput(Inputs.Path, inputPath);
testUtils.setInput(Inputs.UploadChunkSize, "4000000");
const cacheId = -1;
const saveCacheMock = jest
.spyOn(cache, "saveCache")
.mockImplementationOnce(() => {
return Promise.resolve(cacheId);
});
await run();
expect(saveCacheMock).toHaveBeenCalledTimes(1);
expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, {
uploadChunkSize: 4000000
});
expect(warningMock).toHaveBeenCalledTimes(1);
expect(warningMock).toHaveBeenCalledWith("Cache save failed.");
});

View File

@ -25,16 +25,15 @@ afterEach(() => {
});
test("StateProvider saves states", async () => {
const states = new Map<string, string>();
const getStateMock = jest
.spyOn(core, "getState")
.mockImplementation(name =>
jest.requireActual("@actions/core").getState(name)
);
.mockImplementation(key => states.get(key) || "");
const saveStateMock = jest
.spyOn(core, "saveState")
.mockImplementation((key, value) => {
return jest.requireActual("@actions/core").saveState(key, value);
states.set(key, value);
});
const setOutputMock = jest
@ -48,9 +47,11 @@ test("StateProvider saves states", async () => {
const stateProvider: IStateProvider = new StateProvider();
stateProvider.setState("stateKey", "stateValue");
stateProvider.setState(State.CacheMatchedKey, cacheMatchedKey);
stateProvider.getState("stateKey");
stateProvider.getCacheState();
const stateValue = stateProvider.getState("stateKey");
const cacheStateValue = stateProvider.getCacheState();
expect(stateValue).toBe("stateValue");
expect(cacheStateValue).toBe(cacheMatchedKey);
expect(getStateMock).toHaveBeenCalledTimes(2);
expect(saveStateMock).toHaveBeenCalledTimes(2);
expect(setOutputMock).toHaveBeenCalledTimes(0);

View File

@ -3431,8 +3431,12 @@ function getCacheEntry(keys, paths, options) {
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
// Cache not found
if (response.statusCode === 204) {
// Cache not found
// List cache for primary key only if cache miss occurs
if (core.isDebug()) {
yield printCachesListForDiagnostics(keys[0], httpClient, version);
}
return null;
}
if (!requestUtils_1.isSuccessStatusCode(response.statusCode)) {
@ -3451,6 +3455,22 @@ function getCacheEntry(keys, paths, options) {
});
}
exports.getCacheEntry = getCacheEntry;
function printCachesListForDiagnostics(key, httpClient, version) {
return __awaiter(this, void 0, void 0, function* () {
const resource = `caches?key=${encodeURIComponent(key)}`;
const response = yield requestUtils_1.retryTypedResponse('listCache', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
if (response.statusCode === 200) {
const cacheListResult = response.result;
const totalCount = cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.totalCount;
if (totalCount && totalCount > 0) {
core.debug(`No matching cache found for cache key '${key}', version '${version}' and scope ${process.env['GITHUB_REF']}. There exist one or more cache(s) with similar key but they have different version or scope. See more info on cache matching here: https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#matching-a-cache-key \nOther caches with similar key:`);
for (const cacheEntry of (cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.artifactCaches) || []) {
core.debug(`Cache Key: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheKey}, Cache Version: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheVersion}, Cache Scope: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.scope}, Cache Created: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.creationTime}`);
}
}
}
});
}
function downloadCache(archiveLocation, archivePath, options) {
return __awaiter(this, void 0, void 0, function* () {
const archiveUrl = new url_1.URL(archiveLocation);
@ -38329,7 +38349,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
case constants_1.CompressionMethod.Zstd:
return BSD_TAR_ZSTD
? [
'zstd -d --long=30 -o',
'zstd -d --long=30 --force -o',
constants_1.TarFilename,
archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
]
@ -38340,7 +38360,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
case constants_1.CompressionMethod.ZstdWithoutLong:
return BSD_TAR_ZSTD
? [
'zstd -d -o',
'zstd -d --force -o',
constants_1.TarFilename,
archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
]
@ -38366,7 +38386,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
case constants_1.CompressionMethod.Zstd:
return BSD_TAR_ZSTD
? [
'zstd -T0 --long=30 -o',
'zstd -T0 --long=30 --force -o',
cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
constants_1.TarFilename
]
@ -38377,7 +38397,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
case constants_1.CompressionMethod.ZstdWithoutLong:
return BSD_TAR_ZSTD
? [
'zstd -T0 -o',
'zstd -T0 --force -o',
cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
constants_1.TarFilename
]
@ -47256,7 +47276,7 @@ function restoreCache(paths, primaryKey, restoreKeys, options) {
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
return undefined;
}
core.debug("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
}
else {
// Cache not found

dist/restore/index.js vendored
View File

@ -3431,8 +3431,12 @@ function getCacheEntry(keys, paths, options) {
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
// Cache not found
if (response.statusCode === 204) {
// Cache not found
// List cache for primary key only if cache miss occurs
if (core.isDebug()) {
yield printCachesListForDiagnostics(keys[0], httpClient, version);
}
return null;
}
if (!requestUtils_1.isSuccessStatusCode(response.statusCode)) {
@ -3451,6 +3455,22 @@ function getCacheEntry(keys, paths, options) {
});
}
exports.getCacheEntry = getCacheEntry;
function printCachesListForDiagnostics(key, httpClient, version) {
return __awaiter(this, void 0, void 0, function* () {
const resource = `caches?key=${encodeURIComponent(key)}`;
const response = yield requestUtils_1.retryTypedResponse('listCache', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
if (response.statusCode === 200) {
const cacheListResult = response.result;
const totalCount = cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.totalCount;
if (totalCount && totalCount > 0) {
core.debug(`No matching cache found for cache key '${key}', version '${version}' and scope ${process.env['GITHUB_REF']}. There exist one or more cache(s) with similar key but they have different version or scope. See more info on cache matching here: https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#matching-a-cache-key \nOther caches with similar key:`);
for (const cacheEntry of (cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.artifactCaches) || []) {
core.debug(`Cache Key: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheKey}, Cache Version: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheVersion}, Cache Scope: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.scope}, Cache Created: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.creationTime}`);
}
}
}
});
}
function downloadCache(archiveLocation, archivePath, options) {
return __awaiter(this, void 0, void 0, function* () {
const archiveUrl = new url_1.URL(archiveLocation);
@ -38242,7 +38262,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
case constants_1.CompressionMethod.Zstd:
return BSD_TAR_ZSTD
? [
'zstd -d --long=30 -o',
'zstd -d --long=30 --force -o',
constants_1.TarFilename,
archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
]
@ -38253,7 +38273,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
case constants_1.CompressionMethod.ZstdWithoutLong:
return BSD_TAR_ZSTD
? [
'zstd -d -o',
'zstd -d --force -o',
constants_1.TarFilename,
archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
]
@ -38279,7 +38299,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
case constants_1.CompressionMethod.Zstd:
return BSD_TAR_ZSTD
? [
'zstd -T0 --long=30 -o',
'zstd -T0 --long=30 --force -o',
cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
constants_1.TarFilename
]
@ -38290,7 +38310,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
case constants_1.CompressionMethod.ZstdWithoutLong:
return BSD_TAR_ZSTD
? [
'zstd -T0 -o',
'zstd -T0 --force -o',
cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
constants_1.TarFilename
]
@ -47227,7 +47247,7 @@ function restoreCache(paths, primaryKey, restoreKeys, options) {
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
return undefined;
}
core.debug("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
}
else {
// Cache not found

View File

@ -1043,6 +1043,29 @@ class ExecState extends events.EventEmitter {
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
__setModuleDefault(result, mod);
return result;
};
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
return new (P || (P = Promise))(function (resolve, reject) {
@ -1056,11 +1079,15 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
const core = __importStar(__webpack_require__(470));
const saveImpl_1 = __importDefault(__webpack_require__(471));
const stateProvider_1 = __webpack_require__(309);
function run() {
return __awaiter(this, void 0, void 0, function* () {
yield (0, saveImpl_1.default)(new stateProvider_1.NullStateProvider());
const cacheId = yield (0, saveImpl_1.default)(new stateProvider_1.NullStateProvider());
if (cacheId === -1) {
core.warning(`Cache save failed.`);
}
});
}
run();
@ -3460,8 +3487,12 @@ function getCacheEntry(keys, paths, options) {
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
// Cache not found
if (response.statusCode === 204) {
// Cache not found
// List cache for primary key only if cache miss occurs
if (core.isDebug()) {
yield printCachesListForDiagnostics(keys[0], httpClient, version);
}
return null;
}
if (!requestUtils_1.isSuccessStatusCode(response.statusCode)) {
@ -3480,6 +3511,22 @@ function getCacheEntry(keys, paths, options) {
});
}
exports.getCacheEntry = getCacheEntry;
function printCachesListForDiagnostics(key, httpClient, version) {
return __awaiter(this, void 0, void 0, function* () {
const resource = `caches?key=${encodeURIComponent(key)}`;
const response = yield requestUtils_1.retryTypedResponse('listCache', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
if (response.statusCode === 200) {
const cacheListResult = response.result;
const totalCount = cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.totalCount;
if (totalCount && totalCount > 0) {
core.debug(`No matching cache found for cache key '${key}', version '${version}' and scope ${process.env['GITHUB_REF']}. There exist one or more cache(s) with similar key but they have different version or scope. See more info on cache matching here: https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#matching-a-cache-key \nOther caches with similar key:`);
for (const cacheEntry of (cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.artifactCaches) || []) {
core.debug(`Cache Key: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheKey}, Cache Version: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheVersion}, Cache Scope: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.scope}, Cache Created: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.creationTime}`);
}
}
}
});
}
function downloadCache(archiveLocation, archivePath, options) {
return __awaiter(this, void 0, void 0, function* () {
const archiveUrl = new url_1.URL(archiveLocation);
@ -38266,7 +38313,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
case constants_1.CompressionMethod.Zstd:
return BSD_TAR_ZSTD
? [
'zstd -d --long=30 -o',
'zstd -d --long=30 --force -o',
constants_1.TarFilename,
archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
]
@ -38277,7 +38324,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
case constants_1.CompressionMethod.ZstdWithoutLong:
return BSD_TAR_ZSTD
? [
'zstd -d -o',
'zstd -d --force -o',
constants_1.TarFilename,
archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
]
@ -38303,7 +38350,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
case constants_1.CompressionMethod.Zstd:
return BSD_TAR_ZSTD
? [
'zstd -T0 --long=30 -o',
'zstd -T0 --long=30 --force -o',
cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
constants_1.TarFilename
]
@ -38314,7 +38361,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
case constants_1.CompressionMethod.ZstdWithoutLong:
return BSD_TAR_ZSTD
? [
'zstd -T0 -o',
'zstd -T0 --force -o',
cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
constants_1.TarFilename
]
@ -41092,6 +41139,7 @@ const utils = __importStar(__webpack_require__(443));
process.on("uncaughtException", e => utils.logWarning(e.message));
function saveImpl(stateProvider) {
return __awaiter(this, void 0, void 0, function* () {
let cacheId = -1;
try {
if (!utils.isCacheFeatureAvailable()) {
return;
@ -41118,7 +41166,7 @@ function saveImpl(stateProvider) {
const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, {
required: true
});
const cacheId = yield cache.saveCache(cachePaths, primaryKey, {
cacheId = yield cache.saveCache(cachePaths, primaryKey, {
uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize)
});
if (cacheId != -1) {
@ -41128,6 +41176,7 @@ function saveImpl(stateProvider) {
catch (error) {
utils.logWarning(error.message);
}
return cacheId;
});
}
exports.default = saveImpl;
@ -47340,7 +47389,7 @@ function restoreCache(paths, primaryKey, restoreKeys, options) {
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
return undefined;
}
core.debug("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
}
else {
// Cache not found

dist/save/index.js vendored
View File

@ -3431,8 +3431,12 @@ function getCacheEntry(keys, paths, options) {
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
// Cache not found
if (response.statusCode === 204) {
// Cache not found
// List cache for primary key only if cache miss occurs
if (core.isDebug()) {
yield printCachesListForDiagnostics(keys[0], httpClient, version);
}
return null;
}
if (!requestUtils_1.isSuccessStatusCode(response.statusCode)) {
@ -3451,6 +3455,22 @@ function getCacheEntry(keys, paths, options) {
});
}
exports.getCacheEntry = getCacheEntry;
function printCachesListForDiagnostics(key, httpClient, version) {
return __awaiter(this, void 0, void 0, function* () {
const resource = `caches?key=${encodeURIComponent(key)}`;
const response = yield requestUtils_1.retryTypedResponse('listCache', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
if (response.statusCode === 200) {
const cacheListResult = response.result;
const totalCount = cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.totalCount;
if (totalCount && totalCount > 0) {
core.debug(`No matching cache found for cache key '${key}', version '${version}' and scope ${process.env['GITHUB_REF']}. There exist one or more cache(s) with similar key but they have different version or scope. See more info on cache matching here: https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#matching-a-cache-key \nOther caches with similar key:`);
for (const cacheEntry of (cacheListResult === null || cacheListResult === void 0 ? void 0 : cacheListResult.artifactCaches) || []) {
core.debug(`Cache Key: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheKey}, Cache Version: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.cacheVersion}, Cache Scope: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.scope}, Cache Created: ${cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.creationTime}`);
}
}
}
});
}
function downloadCache(archiveLocation, archivePath, options) {
return __awaiter(this, void 0, void 0, function* () {
const archiveUrl = new url_1.URL(archiveLocation);
@ -38237,7 +38257,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
case constants_1.CompressionMethod.Zstd:
return BSD_TAR_ZSTD
? [
'zstd -d --long=30 -o',
'zstd -d --long=30 --force -o',
constants_1.TarFilename,
archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
]
@ -38248,7 +38268,7 @@ function getDecompressionProgram(tarPath, compressionMethod, archivePath) {
case constants_1.CompressionMethod.ZstdWithoutLong:
return BSD_TAR_ZSTD
? [
'zstd -d -o',
'zstd -d --force -o',
constants_1.TarFilename,
archivePath.replace(new RegExp(`\\${path.sep}`, 'g'), '/')
]
@ -38274,7 +38294,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
case constants_1.CompressionMethod.Zstd:
return BSD_TAR_ZSTD
? [
'zstd -T0 --long=30 -o',
'zstd -T0 --long=30 --force -o',
cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
constants_1.TarFilename
]
@ -38285,7 +38305,7 @@ function getCompressionProgram(tarPath, compressionMethod) {
case constants_1.CompressionMethod.ZstdWithoutLong:
return BSD_TAR_ZSTD
? [
'zstd -T0 -o',
'zstd -T0 --force -o',
cacheFileName.replace(new RegExp(`\\${path.sep}`, 'g'), '/'),
constants_1.TarFilename
]
@ -41063,6 +41083,7 @@ const utils = __importStar(__webpack_require__(443));
process.on("uncaughtException", e => utils.logWarning(e.message));
function saveImpl(stateProvider) {
return __awaiter(this, void 0, void 0, function* () {
let cacheId = -1;
try {
if (!utils.isCacheFeatureAvailable()) {
return;
@ -41089,7 +41110,7 @@ function saveImpl(stateProvider) {
const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, {
required: true
});
const cacheId = yield cache.saveCache(cachePaths, primaryKey, {
cacheId = yield cache.saveCache(cachePaths, primaryKey, {
uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize)
});
if (cacheId != -1) {
@ -41099,6 +41120,7 @@ function saveImpl(stateProvider) {
catch (error) {
utils.logWarning(error.message);
}
return cacheId;
});
}
exports.default = saveImpl;
@ -47340,7 +47362,7 @@ function restoreCache(paths, primaryKey, restoreKeys, options) {
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
return undefined;
}
core.debug("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
}
else {
// Cache not found

package-lock.json generated
View File

@ -1,15 +1,15 @@
{
"name": "cache",
"version": "3.2.0-beta.1",
"version": "3.2.1",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "cache",
"version": "3.2.0-beta.1",
"version": "3.2.1",
"license": "MIT",
"dependencies": {
"@actions/cache": "3.1.0-beta.3",
"@actions/cache": "^3.1.0",
"@actions/core": "^1.10.0",
"@actions/exec": "^1.1.1",
"@actions/io": "^1.1.2"
@ -36,9 +36,9 @@
}
},
"node_modules/@actions/cache": {
"version": "3.1.0-beta.3",
"resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0-beta.3.tgz",
"integrity": "sha512-71S1vd0WKLbC2lAe04pCYqTLBjSa8gURtiqnVBCYAt8QVBjOfwa2D3ESf2m8K2xjUxman/Yimdp7CPJDyFnxZg==",
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0.tgz",
"integrity": "sha512-wKGJkpK3uFTgwy+KA0fxz0H3/ZPymdi0IlyhMmyoMeWd+CIv8xVPWdGlrPDDdN9bFgve2yvEPZVaKRb43Uwtyg==",
"dependencies": {
"@actions/core": "^1.10.0",
"@actions/exec": "^1.0.1",
@ -9722,9 +9722,9 @@
},
"dependencies": {
"@actions/cache": {
"version": "3.1.0-beta.3",
"resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0-beta.3.tgz",
"integrity": "sha512-71S1vd0WKLbC2lAe04pCYqTLBjSa8gURtiqnVBCYAt8QVBjOfwa2D3ESf2m8K2xjUxman/Yimdp7CPJDyFnxZg==",
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0.tgz",
"integrity": "sha512-wKGJkpK3uFTgwy+KA0fxz0H3/ZPymdi0IlyhMmyoMeWd+CIv8xVPWdGlrPDDdN9bFgve2yvEPZVaKRb43Uwtyg==",
"requires": {
"@actions/core": "^1.10.0",
"@actions/exec": "^1.0.1",

View File

@ -1,6 +1,6 @@
{
"name": "cache",
"version": "3.2.0-beta.1",
"version": "3.2.1",
"private": true,
"description": "Cache dependencies and build outputs",
"main": "dist/restore/index.js",
@ -23,7 +23,7 @@
"author": "GitHub",
"license": "MIT",
"dependencies": {
"@actions/cache": "3.1.0-beta.3",
"@actions/cache": "^3.1.0",
"@actions/core": "^1.10.0",
"@actions/exec": "^1.1.1",
"@actions/io": "^1.1.2"

View File

@ -10,7 +10,8 @@ import * as utils from "./utils/actionUtils";
// throw an uncaught exception. Instead of failing this action, just warn.
process.on("uncaughtException", e => utils.logWarning(e.message));
async function saveImpl(stateProvider: IStateProvider): Promise<void> {
async function saveImpl(stateProvider: IStateProvider): Promise<number | void> {
let cacheId = -1;
try {
if (!utils.isCacheFeatureAvailable()) {
return;
@ -51,7 +52,7 @@ async function saveImpl(stateProvider: IStateProvider): Promise<void> {
required: true
});
const cacheId = await cache.saveCache(cachePaths, primaryKey, {
cacheId = await cache.saveCache(cachePaths, primaryKey, {
uploadChunkSize: utils.getInputAsInt(Inputs.UploadChunkSize)
});
@ -61,6 +62,7 @@ async function saveImpl(stateProvider: IStateProvider): Promise<void> {
} catch (error: unknown) {
utils.logWarning((error as Error).message);
}
return cacheId;
}
export default saveImpl;

View File

@ -1,8 +1,13 @@
import * as core from "@actions/core";
import saveImpl from "./saveImpl";
import { NullStateProvider } from "./stateProvider";
async function run(): Promise<void> {
await saveImpl(new NullStateProvider());
const cacheId = await saveImpl(new NullStateProvider());
if (cacheId === -1) {
core.warning(`Cache save failed.`);
}
}
run();
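With the change above, a standalone save that fails now surfaces as a step warning instead of passing silently. A hedged usage sketch of the save-only action (the path and key are placeholders):

```yaml
# Save the cache even if earlier steps failed; if saveCache returns -1,
# the run now emits the "Cache save failed." warning.
- uses: actions/cache/save@v3
  if: always()
  with:
    path: node_modules
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
```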

View File

@ -19,24 +19,6 @@ A cache today is immutable and cannot be updated. But some use cases require the
## Use cache across feature branches
Reusing caches across feature branches is not allowed today, to provide cache [isolation](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache). However, if both feature branches are created from the default branch, a good way to achieve this is to ensure that the default branch has a cache. That cache will then be restorable from both feature branches; one way to seed it is shown in the sketch below.
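A minimal sketch of a workflow that seeds the default-branch cache (assumes `main` is the default branch; the path and key are illustrative):

```yaml
on:
  push:
    branches: [main]  # assumption: 'main' is the default branch

jobs:
  seed-cache:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/cache@v3
        with:
          path: node_modules
          key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
      # Populate the path on a miss; the post step of actions/cache saves it.
      - run: npm ci
```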
## Improving cache restore performance on Windows / Using cross-OS caching
Currently, cache restore is slow on Windows because tar is inherently slow there and `gzip` is the compression algorithm in use. `zstd` is the default algorithm on Linux and macOS, but it was disabled on Windows due to issues with BSD tar (libarchive), the tar implementation in use on Windows.
To improve cache restore performance, you can re-enable `zstd` as the compression algorithm using the following workaround. Add the following step to your workflow before the cache step:
```yaml
- if: ${{ runner.os == 'Windows' }}
name: Use GNU tar
shell: cmd
run: |
echo "Adding GNU tar to PATH"
echo C:\Program Files\Git\usr\bin>>"%GITHUB_PATH%"
```
The `cache` action will then use GNU tar instead of BSD tar on Windows. This should work as-is on all GitHub-hosted runners. For self-hosted runners, please ensure GNU tar and `zstd` are installed.
The above workaround is also needed for cross-OS caching, since differing compression algorithms produce different cache versions for the same cache key. The workaround ensures that `zstd` is used on all platforms, resulting in the same cache version for the same cache key; a two-job sketch follows below.
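For example, a hedged two-job sketch where a cache saved on Windows is restored on Ubuntu (the job names, build step, path, and key are illustrative; the GNU tar step from above is included on the Windows side):

```yaml
jobs:
  save-on-windows:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v3
      - name: Use GNU tar
        shell: cmd
        run: echo C:\Program Files\Git\usr\bin>>"%GITHUB_PATH%"
      - run: ./build.cmd   # hypothetical step that produces shared-artifacts/
      - uses: actions/cache/save@v3
        with:
          path: shared-artifacts
          key: cross-os-${{ github.sha }}

  restore-on-ubuntu:
    needs: save-on-windows
    runs-on: ubuntu-latest
    steps:
      - uses: actions/cache/restore@v3
        with:
          path: shared-artifacts
          key: cross-os-${{ github.sha }}
```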
## Force deletion of caches overriding default cache eviction policy
Caches have [branch scope restrictions](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache) in place. This means that if caches for a specific branch are using a lot of the storage quota, more frequently used caches from the `default` branch may get thrashed. For example, if many pull requests on a repository are creating caches, those caches cannot be used in the default branch scope but will still occupy a lot of space until they are cleaned up by the [eviction policy](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#usage-limits-and-eviction-policy). Sometimes we want to clean them up on a faster cadence to ensure the default branch is not thrashed. To achieve this, the [gh-actions-cache CLI](https://github.com/actions/gh-actions-cache/) can be used to delete caches for specific branches, as in the sketch below.
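A hedged sketch of a cleanup workflow using that CLI (the `list` and `delete` subcommands and their `-R`/`-B`/`--confirm` flags come from the linked extension; the trigger and refs are illustrative):

```yaml
name: Cleanup PR caches
on:
  pull_request:
    types: [closed]

permissions:
  actions: write  # required to delete caches

jobs:
  cleanup:
    runs-on: ubuntu-latest
    steps:
      - name: Delete caches scoped to the closed PR's merge ref
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          BRANCH: refs/pull/${{ github.event.pull_request.number }}/merge
        run: |
          gh extension install actions/gh-actions-cache
          # Delete every cache whose scope is the PR's merge ref.
          for key in $(gh actions-cache list -R ${{ github.repository }} -B "$BRANCH" | cut -f 1); do
            gh actions-cache delete "$key" -R ${{ github.repository }} -B "$BRANCH" --confirm
          done
```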