Compare commits

..

2 Commits

SHA1 Message Date
914fcb3068 Update license 2022-12-23 06:12:09 +00:00
95a9c77002 Release compression related changes for windows 2022-12-23 05:59:01 +00:00
29 changed files with 232 additions and 460 deletions


@ -8,39 +8,45 @@ on:
jobs: jobs:
CodeQL-Build: CodeQL-Build:
# CodeQL runs on ubuntu-latest, windows-latest, and macos-latest
runs-on: ubuntu-latest
permissions: # CodeQL runs on ubuntu-latest and windows-latest
# required for all workflows runs-on: ubuntu-latest
security-events: write
steps: steps:
- name: Checkout repository - name: Checkout repository
uses: actions/checkout@v3 uses: actions/checkout@v3
with:
# We must fetch at least the immediate parents so that if this is
# a pull request then we can checkout the head.
fetch-depth: 2
# If this run was triggered by a pull request event, then checkout
# the head of the pull request instead of the merge commit.
- run: git checkout HEAD^2
if: ${{ github.event_name == 'pull_request' }}
# Initializes the CodeQL tools for scanning. # Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL - name: Initialize CodeQL
uses: github/codeql-action/init@v2 uses: github/codeql-action/init@v1
# Override language selection by uncommenting this and choosing your languages # Override language selection by uncommenting this and choosing your languages
# with: # with:
# languages: go, javascript, csharp, python, cpp, java, ruby # languages: go, javascript, csharp, python, cpp, java
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, or Java). # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
# If this step fails, then you should remove it and run the build manually (see below). # If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild - name: Autobuild
uses: github/codeql-action/autobuild@v2 uses: github/codeql-action/autobuild@v1
# Command-line programs to run using the OS shell. # Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun # 📚 https://git.io/JvXDl
# ✏️ If the Autobuild fails above, remove it and uncomment the following # ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
# three lines and modify them (or add more) to build your code if your # and modify them (or add more) to build your code if your project
# project uses a compiled language # uses a compiled language
#- run: | #- run: |
# make bootstrap # make bootstrap
# make release # make release
- name: Perform CodeQL Analysis - name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2 uses: github/codeql-action/analyze@v1


@ -1,6 +1,6 @@
--- ---
name: "@actions/cache" name: "@actions/cache"
version: 3.1.2 version: 3.1.0
type: npm type: npm
summary: summary:
homepage: homepage:


@ -28,7 +28,7 @@ See ["Caching dependencies to speed up workflows"](https://docs.github.com/en/ac
* Fix zstd not working for windows on gnu tar in issues. * Fix zstd not working for windows on gnu tar in issues.
* Allowing users to provide a custom timeout as input for aborting download of a cache segment using an environment variable `SEGMENT_DOWNLOAD_TIMEOUT_MINS`. Default is 60 minutes. * Allowing users to provide a custom timeout as input for aborting download of a cache segment using an environment variable `SEGMENT_DOWNLOAD_TIMEOUT_MINS`. Default is 60 minutes.
* Two new actions available for granular control over caches - [restore](restore/action.yml) and [save](save/action.yml) * Two new actions available for granular control over caches - [restore](restore/action.yml) and [save](save/action.yml)
* Support cross-os caching as an opt-in feature. See [Cross OS caching](./tips-and-workarounds.md#cross-os-cache) for more info. * Add support for cross os caching. For example, a cache saved on windows can be restored on ubuntu and vice versa.
Refer [here](https://github.com/actions/cache/blob/v2/README.md) for previous versions Refer [here](https://github.com/actions/cache/blob/v2/README.md) for previous versions
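The changelog bullets above introduce cross-OS caching as an opt-in feature alongside the `SEGMENT_DOWNLOAD_TIMEOUT_MINS` timeout. A minimal workflow sketch of how the two might be combined; the job names, paths, keys, and build command are illustrative assumptions, not taken from this compare:

```yaml
# Hypothetical workflow: a cache saved on a Windows runner is restored on Ubuntu.
on: push
jobs:
  build-windows:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/cache@v3
        with:
          path: build-output                # illustrative path
          key: cross-os-${{ github.sha }}   # illustrative key
          enableCrossOsArchive: true        # opt in to the platform-neutral archive
      - run: ./build.ps1                    # hypothetical build step that creates build-output

  test-ubuntu:
    needs: build-windows
    runs-on: ubuntu-latest
    env:
      SEGMENT_DOWNLOAD_TIMEOUT_MINS: 10     # abort a stalled segment download after 10 minutes
    steps:
      - uses: actions/cache@v3
        with:
          path: build-output
          key: cross-os-${{ github.sha }}
          enableCrossOsArchive: true        # kept symmetric; the flag only changes behaviour on Windows runners
      - run: ls build-output                # hypothetical step consuming the restored files
```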
@ -44,10 +44,9 @@ If you are using this inside a container, a POSIX-compliant `tar` needs to be in
* `path` - A list of files, directories, and wildcard patterns to cache and restore. See [`@actions/glob`](https://github.com/actions/toolkit/tree/main/packages/glob) for supported patterns. * `path` - A list of files, directories, and wildcard patterns to cache and restore. See [`@actions/glob`](https://github.com/actions/toolkit/tree/main/packages/glob) for supported patterns.
* `key` - An explicit key for restoring and saving the cache * `key` - An explicit key for restoring and saving the cache
* `restore-keys` - An ordered list of prefix-matched keys to use for restoring stale cache if no cache hit occurred for key. * `restore-keys` - An ordered list of prefix-matched keys to use for restoring stale cache if no cache hit occurred for key.
* `enableCrossOsArchive` - An optional boolean when enabled, allows Windows runners to save or restore caches that can be restored or saved respectively on other platforms. Default: false
#### Environment Variables #### Environment Variables
* `SEGMENT_DOWNLOAD_TIMEOUT_MINS` - Segment download timeout (in minutes, default `60`) to abort download of the segment if not completed in the defined number of minutes. [Read more](https://github.com/actions/cache/blob/main/tips-and-workarounds.md#cache-segment-restore-timeout) * `SEGMENT_DOWNLOAD_TIMEOUT_MINS` - Segment download timeout (in minutes, default `60`) to abort download of the segment if not completed in the defined number of minutes. [Read more](https://github.com/actions/cache/blob/main/workarounds.md#cache-segment-restore-timeout)
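As a concrete illustration of the inputs listed above (the cached path and key pattern are assumptions for the example, not part of this compare):

```yaml
- name: Cache npm dependencies
  uses: actions/cache@v3
  with:
    path: ~/.npm                 # example path; see @actions/glob for supported patterns
    key: npm-${{ runner.os }}-${{ hashFiles('**/package-lock.json') }}
    # Prefix-matched fallback used when the exact key misses.
    restore-keys: npm-${{ runner.os }}-
    enableCrossOsArchive: false  # default; set to true to share caches across platforms
```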
### Outputs ### Outputs
@ -123,7 +122,6 @@ See [Examples](examples.md) for a list of `actions/cache` implementations for us
- [Swift, Objective-C - Carthage](./examples.md#swift-objective-c---carthage) - [Swift, Objective-C - Carthage](./examples.md#swift-objective-c---carthage)
- [Swift, Objective-C - CocoaPods](./examples.md#swift-objective-c---cocoapods) - [Swift, Objective-C - CocoaPods](./examples.md#swift-objective-c---cocoapods)
- [Swift - Swift Package Manager](./examples.md#swift---swift-package-manager) - [Swift - Swift Package Manager](./examples.md#swift---swift-package-manager)
- [Swift - Mint](./examples.md#swift---mint)
## Creating a cache key ## Creating a cache key
@ -247,7 +245,7 @@ Following are some of the known practices/workarounds which community has used t
- [Cache segment restore timeout](./tips-and-workarounds.md#cache-segment-restore-timeout) - [Cache segment restore timeout](./tips-and-workarounds.md#cache-segment-restore-timeout)
- [Update a cache](./tips-and-workarounds.md#update-a-cache) - [Update a cache](./tips-and-workarounds.md#update-a-cache)
- [Use cache across feature branches](./tips-and-workarounds.md#use-cache-across-feature-branches) - [Use cache across feature branches](./tips-and-workarounds.md#use-cache-across-feature-branches)
- [Cross OS cache](./tips-and-workarounds.md#cross-os-cache) - [Improving cache restore performance on Windows/Using cross-os caching](./tips-and-workarounds.md#improving-cache-restore-performance-on-windows-using-cross-os-caching)
- [Force deletion of caches overriding default cache eviction policy](./tips-and-workarounds.md#force-deletion-of-caches-overriding-default-cache-eviction-policy) - [Force deletion of caches overriding default cache eviction policy](./tips-and-workarounds.md#force-deletion-of-caches-overriding-default-cache-eviction-policy)
#### Windows environment variables #### Windows environment variables


@ -59,11 +59,4 @@
### 3.2.1 ### 3.2.1
- Update `@actions/cache` on windows to use gnu tar and zstd by default and fallback to bsdtar and zstd if gnu tar is not available. ([issue](https://github.com/actions/cache/issues/984)) - Update `@actions/cache` on windows to use gnu tar and zstd by default and fallback to bsdtar and zstd if gnu tar is not available. ([issue](https://github.com/actions/cache/issues/984))
- Added support for fallback to gzip to restore old caches on windows. - Added support for fallback to gzip to restore old caches on windows.
- Added logs for cache version in case of a cache miss. - Added logs for cache version in case of a cache miss.
### 3.2.2
- Reverted the changes made in 3.2.1 to use gnu tar and zstd by default on windows.
### 3.2.3
- Support cross os caching on Windows as an opt-in feature.
- Fix issue with symlink restoration on Windows for cross-os caches.
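The 3.2.1 entry above depends on detecting whether GNU tar is available on the Windows runner. A rough sketch of how such a check can be written with `@actions/exec`, not the exact helper used by the action:

```typescript
import { exec } from "@actions/exec";

// GNU tar identifies itself in `tar --version`; BSD tar prints "bsdtar" instead.
async function isGnuTar(): Promise<boolean> {
    let versionOutput = "";
    await exec("tar --version", [], {
        ignoreReturnCode: true,
        silent: true,
        listeners: {
            stdout: (data: Buffer) => (versionOutput += data.toString())
        }
    });
    return versionOutput.toLowerCase().includes("gnu tar");
}
```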


@ -174,26 +174,6 @@ test("getInputAsInt throws if required and value missing", () => {
).toThrowError(); ).toThrowError();
}); });
test("getInputAsBool returns false if input not set", () => {
expect(actionUtils.getInputAsBool("undefined")).toBe(false);
});
test("getInputAsBool returns value if input is valid", () => {
testUtils.setInput("foo", "true");
expect(actionUtils.getInputAsBool("foo")).toBe(true);
});
test("getInputAsBool returns false if input is invalid or NaN", () => {
testUtils.setInput("foo", "bar");
expect(actionUtils.getInputAsBool("foo")).toBe(false);
});
test("getInputAsBool throws if required and value missing", () => {
expect(() =>
actionUtils.getInputAsBool("undefined2", { required: true })
).toThrowError();
});
test("isCacheFeatureAvailable for ac enabled", () => { test("isCacheFeatureAvailable for ac enabled", () => {
jest.spyOn(cache, "isFeatureAvailable").mockImplementation(() => true); jest.spyOn(cache, "isFeatureAvailable").mockImplementation(() => true);


@ -27,17 +27,9 @@ beforeAll(() => {
return actualUtils.getInputAsArray(name, options); return actualUtils.getInputAsArray(name, options);
} }
); );
jest.spyOn(actionUtils, "getInputAsBool").mockImplementation(
(name, options) => {
const actualUtils = jest.requireActual("../src/utils/actionUtils");
return actualUtils.getInputAsBool(name, options);
}
);
}); });
beforeEach(() => { beforeEach(() => {
jest.restoreAllMocks();
process.env[Events.Key] = Events.Push; process.env[Events.Key] = Events.Push;
process.env[RefKey] = "refs/heads/feature-branch"; process.env[RefKey] = "refs/heads/feature-branch";
@ -58,8 +50,7 @@ test("restore with no cache found", async () => {
const key = "node-test"; const key = "node-test";
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -74,7 +65,7 @@ test("restore with no cache found", async () => {
await run(); await run();
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []);
expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
expect(stateMock).toHaveBeenCalledTimes(1); expect(stateMock).toHaveBeenCalledTimes(1);
@ -93,8 +84,7 @@ test("restore with restore keys and no cache found", async () => {
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key,
restoreKeys: [restoreKey], restoreKeys: [restoreKey]
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -109,13 +99,7 @@ test("restore with restore keys and no cache found", async () => {
await run(); await run();
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith( expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]);
[path],
key,
[restoreKey],
{},
false
);
expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
expect(stateMock).toHaveBeenCalledTimes(1); expect(stateMock).toHaveBeenCalledTimes(1);
@ -132,8 +116,7 @@ test("restore with cache found for key", async () => {
const key = "node-test"; const key = "node-test";
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -149,7 +132,7 @@ test("restore with cache found for key", async () => {
await run(); await run();
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []);
expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
expect(stateMock).toHaveBeenCalledWith("CACHE_RESULT", key); expect(stateMock).toHaveBeenCalledWith("CACHE_RESULT", key);
@ -169,8 +152,7 @@ test("restore with cache found for restore key", async () => {
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key,
restoreKeys: [restoreKey], restoreKeys: [restoreKey]
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -186,13 +168,7 @@ test("restore with cache found for restore key", async () => {
await run(); await run();
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith( expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]);
[path],
key,
[restoreKey],
{},
false
);
expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
expect(stateMock).toHaveBeenCalledWith("CACHE_RESULT", restoreKey); expect(stateMock).toHaveBeenCalledWith("CACHE_RESULT", restoreKey);


@ -28,17 +28,9 @@ beforeAll(() => {
return actualUtils.getInputAsArray(name, options); return actualUtils.getInputAsArray(name, options);
} }
); );
jest.spyOn(actionUtils, "getInputAsBool").mockImplementation(
(name, options) => {
const actualUtils = jest.requireActual("../src/utils/actionUtils");
return actualUtils.getInputAsBool(name, options);
}
);
}); });
beforeEach(() => { beforeEach(() => {
jest.restoreAllMocks();
process.env[Events.Key] = Events.Push; process.env[Events.Key] = Events.Push;
process.env[RefKey] = "refs/heads/feature-branch"; process.env[RefKey] = "refs/heads/feature-branch";
@ -105,8 +97,7 @@ test("restore on GHES with AC available ", async () => {
const key = "node-test"; const key = "node-test";
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -122,7 +113,7 @@ test("restore on GHES with AC available ", async () => {
await run(new StateProvider()); await run(new StateProvider());
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []);
expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1); expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1);
@ -161,20 +152,13 @@ test("restore with too many keys should fail", async () => {
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key,
restoreKeys, restoreKeys
enableCrossOsArchive: false
}); });
const failedMock = jest.spyOn(core, "setFailed"); const failedMock = jest.spyOn(core, "setFailed");
const restoreCacheMock = jest.spyOn(cache, "restoreCache"); const restoreCacheMock = jest.spyOn(cache, "restoreCache");
await run(new StateProvider()); await run(new StateProvider());
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith( expect(restoreCacheMock).toHaveBeenCalledWith([path], key, restoreKeys);
[path],
key,
restoreKeys,
{},
false
);
expect(failedMock).toHaveBeenCalledWith( expect(failedMock).toHaveBeenCalledWith(
`Key Validation Error: Keys are limited to a maximum of 10.` `Key Validation Error: Keys are limited to a maximum of 10.`
); );
@ -185,14 +169,13 @@ test("restore with large key should fail", async () => {
const key = "foo".repeat(512); // Over the 512 character limit const key = "foo".repeat(512); // Over the 512 character limit
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key
enableCrossOsArchive: false
}); });
const failedMock = jest.spyOn(core, "setFailed"); const failedMock = jest.spyOn(core, "setFailed");
const restoreCacheMock = jest.spyOn(cache, "restoreCache"); const restoreCacheMock = jest.spyOn(cache, "restoreCache");
await run(new StateProvider()); await run(new StateProvider());
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []);
expect(failedMock).toHaveBeenCalledWith( expect(failedMock).toHaveBeenCalledWith(
`Key Validation Error: ${key} cannot be larger than 512 characters.` `Key Validation Error: ${key} cannot be larger than 512 characters.`
); );
@ -203,14 +186,13 @@ test("restore with invalid key should fail", async () => {
const key = "comma,comma"; const key = "comma,comma";
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key
enableCrossOsArchive: false
}); });
const failedMock = jest.spyOn(core, "setFailed"); const failedMock = jest.spyOn(core, "setFailed");
const restoreCacheMock = jest.spyOn(cache, "restoreCache"); const restoreCacheMock = jest.spyOn(cache, "restoreCache");
await run(new StateProvider()); await run(new StateProvider());
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []);
expect(failedMock).toHaveBeenCalledWith( expect(failedMock).toHaveBeenCalledWith(
`Key Validation Error: ${key} cannot contain commas.` `Key Validation Error: ${key} cannot contain commas.`
); );
@ -221,8 +203,7 @@ test("restore with no cache found", async () => {
const key = "node-test"; const key = "node-test";
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -237,7 +218,7 @@ test("restore with no cache found", async () => {
await run(new StateProvider()); await run(new StateProvider());
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []);
expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
expect(failedMock).toHaveBeenCalledTimes(0); expect(failedMock).toHaveBeenCalledTimes(0);
@ -254,8 +235,7 @@ test("restore with restore keys and no cache found", async () => {
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key,
restoreKeys: [restoreKey], restoreKeys: [restoreKey]
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -270,13 +250,7 @@ test("restore with restore keys and no cache found", async () => {
await run(new StateProvider()); await run(new StateProvider());
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith( expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]);
[path],
key,
[restoreKey],
{},
false
);
expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
expect(failedMock).toHaveBeenCalledTimes(0); expect(failedMock).toHaveBeenCalledTimes(0);
@ -291,8 +265,7 @@ test("restore with cache found for key", async () => {
const key = "node-test"; const key = "node-test";
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -308,7 +281,7 @@ test("restore with cache found for key", async () => {
await run(new StateProvider()); await run(new StateProvider());
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []);
expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1); expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1);
@ -325,8 +298,7 @@ test("restore with cache found for restore key", async () => {
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key,
restoreKeys: [restoreKey], restoreKeys: [restoreKey]
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -342,13 +314,7 @@ test("restore with cache found for restore key", async () => {
await run(new StateProvider()); await run(new StateProvider());
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith( expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]);
[path],
key,
[restoreKey],
{},
false
);
expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key);
expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1); expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1);


@ -27,18 +27,9 @@ beforeAll(() => {
return actualUtils.getInputAsArray(name, options); return actualUtils.getInputAsArray(name, options);
} }
); );
jest.spyOn(actionUtils, "getInputAsBool").mockImplementation(
(name, options) => {
return jest
.requireActual("../src/utils/actionUtils")
.getInputAsBool(name, options);
}
);
}); });
beforeEach(() => { beforeEach(() => {
jest.restoreAllMocks();
process.env[Events.Key] = Events.Push; process.env[Events.Key] = Events.Push;
process.env[RefKey] = "refs/heads/feature-branch"; process.env[RefKey] = "refs/heads/feature-branch";
@ -59,8 +50,7 @@ test("restore with no cache found", async () => {
const key = "node-test"; const key = "node-test";
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -75,7 +65,7 @@ test("restore with no cache found", async () => {
await run(); await run();
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []);
expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key); expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key);
expect(outputMock).toHaveBeenCalledTimes(1); expect(outputMock).toHaveBeenCalledTimes(1);
@ -93,8 +83,7 @@ test("restore with restore keys and no cache found", async () => {
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key,
restoreKeys: [restoreKey], restoreKeys: [restoreKey]
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -109,13 +98,7 @@ test("restore with restore keys and no cache found", async () => {
await run(); await run();
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith( expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]);
[path],
key,
[restoreKey],
{},
false
);
expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key); expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key);
expect(failedMock).toHaveBeenCalledTimes(0); expect(failedMock).toHaveBeenCalledTimes(0);
@ -130,8 +113,7 @@ test("restore with cache found for key", async () => {
const key = "node-test"; const key = "node-test";
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -146,7 +128,7 @@ test("restore with cache found for key", async () => {
await run(); await run();
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []);
expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key); expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key);
expect(outputMock).toHaveBeenCalledWith("cache-hit", "true"); expect(outputMock).toHaveBeenCalledWith("cache-hit", "true");
@ -165,8 +147,7 @@ test("restore with cache found for restore key", async () => {
testUtils.setInputs({ testUtils.setInputs({
path: path, path: path,
key, key,
restoreKeys: [restoreKey], restoreKeys: [restoreKey]
enableCrossOsArchive: false
}); });
const infoMock = jest.spyOn(core, "info"); const infoMock = jest.spyOn(core, "info");
@ -181,13 +162,7 @@ test("restore with cache found for restore key", async () => {
await run(); await run();
expect(restoreCacheMock).toHaveBeenCalledTimes(1); expect(restoreCacheMock).toHaveBeenCalledTimes(1);
expect(restoreCacheMock).toHaveBeenCalledWith( expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]);
[path],
key,
[restoreKey],
{},
false
);
expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key); expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key);
expect(outputMock).toHaveBeenCalledWith("cache-hit", "false"); expect(outputMock).toHaveBeenCalledWith("cache-hit", "false");


@ -35,14 +35,6 @@ beforeAll(() => {
} }
); );
jest.spyOn(actionUtils, "getInputAsBool").mockImplementation(
(name, options) => {
return jest
.requireActual("../src/utils/actionUtils")
.getInputAsBool(name, options);
}
);
jest.spyOn(actionUtils, "isExactKeyMatch").mockImplementation( jest.spyOn(actionUtils, "isExactKeyMatch").mockImplementation(
(key, cacheResult) => { (key, cacheResult) => {
return jest return jest
@ -103,14 +95,9 @@ test("save with valid inputs uploads a cache", async () => {
await run(); await run();
expect(saveCacheMock).toHaveBeenCalledTimes(1); expect(saveCacheMock).toHaveBeenCalledTimes(1);
expect(saveCacheMock).toHaveBeenCalledWith( expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, {
[inputPath], uploadChunkSize: 4000000
primaryKey, });
{
uploadChunkSize: 4000000
},
false
);
expect(failedMock).toHaveBeenCalledTimes(0); expect(failedMock).toHaveBeenCalledTimes(0);
}); });


@ -32,14 +32,6 @@ beforeAll(() => {
} }
); );
jest.spyOn(actionUtils, "getInputAsBool").mockImplementation(
(name, options) => {
return jest
.requireActual("../src/utils/actionUtils")
.getInputAsBool(name, options);
}
);
jest.spyOn(actionUtils, "isExactKeyMatch").mockImplementation( jest.spyOn(actionUtils, "isExactKeyMatch").mockImplementation(
(key, cacheResult) => { (key, cacheResult) => {
return jest return jest
@ -55,7 +47,6 @@ beforeAll(() => {
}); });
beforeEach(() => { beforeEach(() => {
jest.restoreAllMocks();
process.env[Events.Key] = Events.Push; process.env[Events.Key] = Events.Push;
process.env[RefKey] = "refs/heads/feature-branch"; process.env[RefKey] = "refs/heads/feature-branch";
@ -164,14 +155,9 @@ test("save on GHES with AC available", async () => {
await run(new StateProvider()); await run(new StateProvider());
expect(saveCacheMock).toHaveBeenCalledTimes(1); expect(saveCacheMock).toHaveBeenCalledTimes(1);
expect(saveCacheMock).toHaveBeenCalledWith( expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, {
[inputPath], uploadChunkSize: 4000000
primaryKey, });
{
uploadChunkSize: 4000000
},
false
);
expect(failedMock).toHaveBeenCalledTimes(0); expect(failedMock).toHaveBeenCalledTimes(0);
}); });
@ -265,8 +251,7 @@ test("save with large cache outputs warning", async () => {
expect(saveCacheMock).toHaveBeenCalledWith( expect(saveCacheMock).toHaveBeenCalledWith(
[inputPath], [inputPath],
primaryKey, primaryKey,
expect.anything(), expect.anything()
false
); );
expect(logWarningMock).toHaveBeenCalledTimes(1); expect(logWarningMock).toHaveBeenCalledTimes(1);
@ -312,8 +297,7 @@ test("save with reserve cache failure outputs warning", async () => {
expect(saveCacheMock).toHaveBeenCalledWith( expect(saveCacheMock).toHaveBeenCalledWith(
[inputPath], [inputPath],
primaryKey, primaryKey,
expect.anything(), expect.anything()
false
); );
expect(logWarningMock).toHaveBeenCalledWith( expect(logWarningMock).toHaveBeenCalledWith(
@ -355,8 +339,7 @@ test("save with server error outputs warning", async () => {
expect(saveCacheMock).toHaveBeenCalledWith( expect(saveCacheMock).toHaveBeenCalledWith(
[inputPath], [inputPath],
primaryKey, primaryKey,
expect.anything(), expect.anything()
false
); );
expect(logWarningMock).toHaveBeenCalledTimes(1); expect(logWarningMock).toHaveBeenCalledTimes(1);
@ -395,14 +378,9 @@ test("save with valid inputs uploads a cache", async () => {
await run(new StateProvider()); await run(new StateProvider());
expect(saveCacheMock).toHaveBeenCalledTimes(1); expect(saveCacheMock).toHaveBeenCalledTimes(1);
expect(saveCacheMock).toHaveBeenCalledWith( expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, {
[inputPath], uploadChunkSize: 4000000
primaryKey, });
{
uploadChunkSize: 4000000
},
false
);
expect(failedMock).toHaveBeenCalledTimes(0); expect(failedMock).toHaveBeenCalledTimes(0);
}); });


@ -35,14 +35,6 @@ beforeAll(() => {
} }
); );
jest.spyOn(actionUtils, "getInputAsBool").mockImplementation(
(name, options) => {
return jest
.requireActual("../src/utils/actionUtils")
.getInputAsBool(name, options);
}
);
jest.spyOn(actionUtils, "isExactKeyMatch").mockImplementation( jest.spyOn(actionUtils, "isExactKeyMatch").mockImplementation(
(key, cacheResult) => { (key, cacheResult) => {
return jest return jest
@ -93,14 +85,9 @@ test("save with valid inputs uploads a cache", async () => {
await run(); await run();
expect(saveCacheMock).toHaveBeenCalledTimes(1); expect(saveCacheMock).toHaveBeenCalledTimes(1);
expect(saveCacheMock).toHaveBeenCalledWith( expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, {
[inputPath], uploadChunkSize: 4000000
primaryKey, });
{
uploadChunkSize: 4000000
},
false
);
expect(failedMock).toHaveBeenCalledTimes(0); expect(failedMock).toHaveBeenCalledTimes(0);
}); });
@ -125,14 +112,9 @@ test("save failing logs the warning message", async () => {
await run(); await run();
expect(saveCacheMock).toHaveBeenCalledTimes(1); expect(saveCacheMock).toHaveBeenCalledTimes(1);
expect(saveCacheMock).toHaveBeenCalledWith( expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, {
[inputPath], uploadChunkSize: 4000000
primaryKey, });
{
uploadChunkSize: 4000000
},
false
);
expect(warningMock).toHaveBeenCalledTimes(1); expect(warningMock).toHaveBeenCalledTimes(1);
expect(warningMock).toHaveBeenCalledWith("Cache save failed."); expect(warningMock).toHaveBeenCalledWith("Cache save failed.");


@ -14,10 +14,6 @@ inputs:
upload-chunk-size: upload-chunk-size:
description: 'The chunk size used to split up large files during upload, in bytes' description: 'The chunk size used to split up large files during upload, in bytes'
required: false required: false
enableCrossOsArchive:
description: 'An optional boolean when enabled, allows windows runners to save or restore caches that can be restored or saved respectively on other platforms'
default: 'false'
required: false
outputs: outputs:
cache-hit: cache-hit:
description: 'A boolean value to indicate an exact match was found for the primary key' description: 'A boolean value to indicate an exact match was found for the primary key'
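For context, the `cache-hit` output shown above is typically consumed by a later step; a small illustrative fragment (the step id and install command are assumptions):

```yaml
- uses: actions/cache@v3
  id: npm-cache
  with:
    path: ~/.npm
    key: npm-${{ hashFiles('**/package-lock.json') }}

# Reinstall dependencies only when the primary key did not produce an exact hit.
- if: steps.npm-cache.outputs.cache-hit != 'true'
  run: npm ci
```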


@ -3383,6 +3383,7 @@ const crypto = __importStar(__webpack_require__(417));
const fs = __importStar(__webpack_require__(747)); const fs = __importStar(__webpack_require__(747));
const url_1 = __webpack_require__(414); const url_1 = __webpack_require__(414);
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const constants_1 = __webpack_require__(931);
const downloadUtils_1 = __webpack_require__(251); const downloadUtils_1 = __webpack_require__(251);
const options_1 = __webpack_require__(538); const options_1 = __webpack_require__(538);
const requestUtils_1 = __webpack_require__(899); const requestUtils_1 = __webpack_require__(899);
@ -3412,17 +3413,10 @@ function createHttpClient() {
const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token); const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token);
return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions()); return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions());
} }
function getCacheVersion(paths, compressionMethod, enableCrossOsArchive = false) { function getCacheVersion(paths, compressionMethod) {
const components = paths; const components = paths.concat(!compressionMethod || compressionMethod === constants_1.CompressionMethod.Gzip
// Add compression method to cache version to restore ? []
// compressed cache as per compression method : [compressionMethod]);
if (compressionMethod) {
components.push(compressionMethod);
}
// Only check for windows platforms if enableCrossOsArchive is false
if (process.platform === 'win32' && !enableCrossOsArchive) {
components.push('windows-only');
}
// Add salt to cache version to support breaking changes in cache entry // Add salt to cache version to support breaking changes in cache entry
components.push(versionSalt); components.push(versionSalt);
return crypto return crypto
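Both variants of `getCacheVersion` above finish by hashing the accumulated components; a simplified TypeScript sketch of the variant that accepts `enableCrossOsArchive` (the salt value is a placeholder, not the real one):

```typescript
import * as crypto from "crypto";

const versionSalt = "1.0"; // placeholder; the real salt lives in the cache package

function getCacheVersion(
    paths: string[],
    compressionMethod?: string,
    enableCrossOsArchive = false
): string {
    const components = [...paths];
    // Record the compression method so a zstd archive is never matched
    // against a gzip-only version, and vice versa.
    if (compressionMethod) {
        components.push(compressionMethod);
    }
    // Windows caches stay platform-scoped unless cross-OS archives are enabled.
    if (process.platform === "win32" && !enableCrossOsArchive) {
        components.push("windows-only");
    }
    components.push(versionSalt);
    return crypto.createHash("sha256").update(components.join("|")).digest("hex");
}
```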
@ -3434,7 +3428,7 @@ exports.getCacheVersion = getCacheVersion;
function getCacheEntry(keys, paths, options) { function getCacheEntry(keys, paths, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
const httpClient = createHttpClient(); const httpClient = createHttpClient();
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`; const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); })); const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
// Cache not found // Cache not found
@ -3497,7 +3491,7 @@ exports.downloadCache = downloadCache;
function reserveCache(key, paths, options) { function reserveCache(key, paths, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
const httpClient = createHttpClient(); const httpClient = createHttpClient();
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const reserveCacheRequest = { const reserveCacheRequest = {
key, key,
version, version,
@ -4977,8 +4971,7 @@ var Inputs;
Inputs["Key"] = "key"; Inputs["Key"] = "key";
Inputs["Path"] = "path"; Inputs["Path"] = "path";
Inputs["RestoreKeys"] = "restore-keys"; Inputs["RestoreKeys"] = "restore-keys";
Inputs["UploadChunkSize"] = "upload-chunk-size"; Inputs["UploadChunkSize"] = "upload-chunk-size"; // Input for cache, save action
Inputs["EnableCrossOsArchive"] = "enableCrossOsArchive"; // Input for cache, restore, save action
})(Inputs = exports.Inputs || (exports.Inputs = {})); })(Inputs = exports.Inputs || (exports.Inputs = {}));
var Outputs; var Outputs;
(function (Outputs) { (function (Outputs) {
@ -10074,7 +10067,7 @@ var __importStar = (this && this.__importStar) || function (mod) {
return result; return result;
}; };
Object.defineProperty(exports, "__esModule", { value: true }); Object.defineProperty(exports, "__esModule", { value: true });
exports.isCacheFeatureAvailable = exports.getInputAsBool = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; exports.isCacheFeatureAvailable = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0;
const cache = __importStar(__webpack_require__(692)); const cache = __importStar(__webpack_require__(692));
const core = __importStar(__webpack_require__(470)); const core = __importStar(__webpack_require__(470));
const constants_1 = __webpack_require__(196); const constants_1 = __webpack_require__(196);
@ -10117,11 +10110,6 @@ function getInputAsInt(name, options) {
return value; return value;
} }
exports.getInputAsInt = getInputAsInt; exports.getInputAsInt = getInputAsInt;
function getInputAsBool(name, options) {
const result = core.getInput(name, options);
return result.toLowerCase() === "true";
}
exports.getInputAsBool = getInputAsBool;
function isCacheFeatureAvailable() { function isCacheFeatureAvailable() {
if (cache.isFeatureAvailable()) { if (cache.isFeatureAvailable()) {
return true; return true;
@ -38229,14 +38217,12 @@ var __importStar = (this && this.__importStar) || function (mod) {
}; };
Object.defineProperty(exports, "__esModule", { value: true }); Object.defineProperty(exports, "__esModule", { value: true });
const exec_1 = __webpack_require__(986); const exec_1 = __webpack_require__(986);
const core_1 = __webpack_require__(470);
const io = __importStar(__webpack_require__(1)); const io = __importStar(__webpack_require__(1));
const fs_1 = __webpack_require__(747); const fs_1 = __webpack_require__(747);
const path = __importStar(__webpack_require__(622)); const path = __importStar(__webpack_require__(622));
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const constants_1 = __webpack_require__(931); const constants_1 = __webpack_require__(931);
const IS_WINDOWS = process.platform === 'win32'; const IS_WINDOWS = process.platform === 'win32';
core_1.exportVariable('MSYS', 'winsymlinks:nativestrict');
// Returns tar path and type: BSD or GNU // Returns tar path and type: BSD or GNU
function getTarPath() { function getTarPath() {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
@ -47209,6 +47195,7 @@ const path = __importStar(__webpack_require__(622));
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const cacheHttpClient = __importStar(__webpack_require__(114)); const cacheHttpClient = __importStar(__webpack_require__(114));
const tar_1 = __webpack_require__(434); const tar_1 = __webpack_require__(434);
const constants_1 = __webpack_require__(931);
class ValidationError extends Error { class ValidationError extends Error {
constructor(message) { constructor(message) {
super(message); super(message);
@ -47255,10 +47242,9 @@ exports.isFeatureAvailable = isFeatureAvailable;
* @param primaryKey an explicit key for restoring the cache * @param primaryKey an explicit key for restoring the cache
* @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key * @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key
* @param downloadOptions cache download options * @param downloadOptions cache download options
* @param enableCrossOsArchive an optional boolean enabled to restore on windows any cache created on any platform
* @returns string returns the key for the cache hit, otherwise returns undefined * @returns string returns the key for the cache hit, otherwise returns undefined
*/ */
function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArchive = false) { function restoreCache(paths, primaryKey, restoreKeys, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
checkPaths(paths); checkPaths(paths);
restoreKeys = restoreKeys || []; restoreKeys = restoreKeys || [];
@ -47271,17 +47257,31 @@ function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArch
for (const key of keys) { for (const key of keys) {
checkKey(key); checkKey(key);
} }
const compressionMethod = yield utils.getCompressionMethod(); let cacheEntry;
let compressionMethod = yield utils.getCompressionMethod();
let archivePath = ''; let archivePath = '';
try { try {
// path are needed to compute version // path are needed to compute version
const cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, {
compressionMethod, compressionMethod
enableCrossOsArchive
}); });
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
// Cache not found // This is to support the old cache entry created by gzip on windows.
return undefined; if (process.platform === 'win32' &&
compressionMethod !== constants_1.CompressionMethod.Gzip) {
compressionMethod = constants_1.CompressionMethod.Gzip;
cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, {
compressionMethod
});
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
return undefined;
}
core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
}
else {
// Cache not found
return undefined;
}
} }
archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod)); archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod));
core.debug(`Archive Path: ${archivePath}`); core.debug(`Archive Path: ${archivePath}`);
@ -47324,11 +47324,10 @@ exports.restoreCache = restoreCache;
* *
* @param paths a list of file paths to be cached * @param paths a list of file paths to be cached
* @param key an explicit key for restoring the cache * @param key an explicit key for restoring the cache
* @param enableCrossOsArchive an optional boolean enabled to save cache on windows which could be restored on any platform
* @param options cache upload options * @param options cache upload options
* @returns number returns cacheId if the cache was saved successfully and throws an error if save fails * @returns number returns cacheId if the cache was saved successfully and throws an error if save fails
*/ */
function saveCache(paths, key, options, enableCrossOsArchive = false) { function saveCache(paths, key, options) {
var _a, _b, _c, _d, _e; var _a, _b, _c, _d, _e;
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
checkPaths(paths); checkPaths(paths);
@ -47359,7 +47358,6 @@ function saveCache(paths, key, options, enableCrossOsArchive = false) {
core.debug('Reserving Cache'); core.debug('Reserving Cache');
const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, { const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, {
compressionMethod, compressionMethod,
enableCrossOsArchive,
cacheSize: archiveFileSize cacheSize: archiveFileSize
}); });
if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) { if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) {
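The `restoreCache` and `saveCache` signatures compared above differ only in the trailing `enableCrossOsArchive` argument; a short usage sketch of the programmatic API (paths, keys, and chunk size are illustrative):

```typescript
import * as cache from "@actions/cache";

async function cacheExample(): Promise<void> {
    const paths = ["node_modules"];               // illustrative path
    const primaryKey = `npm-${process.platform}`; // illustrative key
    const restoreKeys = ["npm-"];

    // With the extended signature the flag follows the download options;
    // with the older signature the last two arguments are simply omitted.
    const hitKey = await cache.restoreCache(paths, primaryKey, restoreKeys, {}, true);
    if (!hitKey) {
        await cache.saveCache(paths, primaryKey, { uploadChunkSize: 4000000 }, true);
    }
}
```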
@ -50494,8 +50492,7 @@ function restoreImpl(stateProvider) {
const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, { const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, {
required: true required: true
}); });
const enableCrossOsArchive = utils.getInputAsBool(constants_1.Inputs.EnableCrossOsArchive); const cacheKey = yield cache.restoreCache(cachePaths, primaryKey, restoreKeys);
const cacheKey = yield cache.restoreCache(cachePaths, primaryKey, restoreKeys, {}, enableCrossOsArchive);
if (!cacheKey) { if (!cacheKey) {
core.info(`Cache not found for input keys: ${[ core.info(`Cache not found for input keys: ${[
primaryKey, primaryKey,

dist/restore/index.js vendored

@ -3383,6 +3383,7 @@ const crypto = __importStar(__webpack_require__(417));
const fs = __importStar(__webpack_require__(747)); const fs = __importStar(__webpack_require__(747));
const url_1 = __webpack_require__(414); const url_1 = __webpack_require__(414);
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const constants_1 = __webpack_require__(931);
const downloadUtils_1 = __webpack_require__(251); const downloadUtils_1 = __webpack_require__(251);
const options_1 = __webpack_require__(538); const options_1 = __webpack_require__(538);
const requestUtils_1 = __webpack_require__(899); const requestUtils_1 = __webpack_require__(899);
@ -3412,17 +3413,10 @@ function createHttpClient() {
const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token); const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token);
return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions()); return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions());
} }
function getCacheVersion(paths, compressionMethod, enableCrossOsArchive = false) { function getCacheVersion(paths, compressionMethod) {
const components = paths; const components = paths.concat(!compressionMethod || compressionMethod === constants_1.CompressionMethod.Gzip
// Add compression method to cache version to restore ? []
// compressed cache as per compression method : [compressionMethod]);
if (compressionMethod) {
components.push(compressionMethod);
}
// Only check for windows platforms if enableCrossOsArchive is false
if (process.platform === 'win32' && !enableCrossOsArchive) {
components.push('windows-only');
}
// Add salt to cache version to support breaking changes in cache entry // Add salt to cache version to support breaking changes in cache entry
components.push(versionSalt); components.push(versionSalt);
return crypto return crypto
@ -3434,7 +3428,7 @@ exports.getCacheVersion = getCacheVersion;
function getCacheEntry(keys, paths, options) { function getCacheEntry(keys, paths, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
const httpClient = createHttpClient(); const httpClient = createHttpClient();
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`; const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); })); const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
// Cache not found // Cache not found
@ -3497,7 +3491,7 @@ exports.downloadCache = downloadCache;
function reserveCache(key, paths, options) { function reserveCache(key, paths, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
const httpClient = createHttpClient(); const httpClient = createHttpClient();
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const reserveCacheRequest = { const reserveCacheRequest = {
key, key,
version, version,
@ -4977,8 +4971,7 @@ var Inputs;
Inputs["Key"] = "key"; Inputs["Key"] = "key";
Inputs["Path"] = "path"; Inputs["Path"] = "path";
Inputs["RestoreKeys"] = "restore-keys"; Inputs["RestoreKeys"] = "restore-keys";
Inputs["UploadChunkSize"] = "upload-chunk-size"; Inputs["UploadChunkSize"] = "upload-chunk-size"; // Input for cache, save action
Inputs["EnableCrossOsArchive"] = "enableCrossOsArchive"; // Input for cache, restore, save action
})(Inputs = exports.Inputs || (exports.Inputs = {})); })(Inputs = exports.Inputs || (exports.Inputs = {}));
var Outputs; var Outputs;
(function (Outputs) { (function (Outputs) {
@ -38137,14 +38130,12 @@ var __importStar = (this && this.__importStar) || function (mod) {
}; };
Object.defineProperty(exports, "__esModule", { value: true }); Object.defineProperty(exports, "__esModule", { value: true });
const exec_1 = __webpack_require__(986); const exec_1 = __webpack_require__(986);
const core_1 = __webpack_require__(470);
const io = __importStar(__webpack_require__(1)); const io = __importStar(__webpack_require__(1));
const fs_1 = __webpack_require__(747); const fs_1 = __webpack_require__(747);
const path = __importStar(__webpack_require__(622)); const path = __importStar(__webpack_require__(622));
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const constants_1 = __webpack_require__(931); const constants_1 = __webpack_require__(931);
const IS_WINDOWS = process.platform === 'win32'; const IS_WINDOWS = process.platform === 'win32';
core_1.exportVariable('MSYS', 'winsymlinks:nativestrict');
// Returns tar path and type: BSD or GNU // Returns tar path and type: BSD or GNU
function getTarPath() { function getTarPath() {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
@ -38602,7 +38593,7 @@ var __importStar = (this && this.__importStar) || function (mod) {
return result; return result;
}; };
Object.defineProperty(exports, "__esModule", { value: true }); Object.defineProperty(exports, "__esModule", { value: true });
exports.isCacheFeatureAvailable = exports.getInputAsBool = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; exports.isCacheFeatureAvailable = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0;
const cache = __importStar(__webpack_require__(692)); const cache = __importStar(__webpack_require__(692));
const core = __importStar(__webpack_require__(470)); const core = __importStar(__webpack_require__(470));
const constants_1 = __webpack_require__(196); const constants_1 = __webpack_require__(196);
@ -38645,11 +38636,6 @@ function getInputAsInt(name, options) {
return value; return value;
} }
exports.getInputAsInt = getInputAsInt; exports.getInputAsInt = getInputAsInt;
function getInputAsBool(name, options) {
const result = core.getInput(name, options);
return result.toLowerCase() === "true";
}
exports.getInputAsBool = getInputAsBool;
function isCacheFeatureAvailable() { function isCacheFeatureAvailable() {
if (cache.isFeatureAvailable()) { if (cache.isFeatureAvailable()) {
return true; return true;
@ -47180,6 +47166,7 @@ const path = __importStar(__webpack_require__(622));
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const cacheHttpClient = __importStar(__webpack_require__(114)); const cacheHttpClient = __importStar(__webpack_require__(114));
const tar_1 = __webpack_require__(434); const tar_1 = __webpack_require__(434);
const constants_1 = __webpack_require__(931);
class ValidationError extends Error { class ValidationError extends Error {
constructor(message) { constructor(message) {
super(message); super(message);
@ -47226,10 +47213,9 @@ exports.isFeatureAvailable = isFeatureAvailable;
* @param primaryKey an explicit key for restoring the cache * @param primaryKey an explicit key for restoring the cache
* @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key * @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key
* @param downloadOptions cache download options * @param downloadOptions cache download options
* @param enableCrossOsArchive an optional boolean enabled to restore on windows any cache created on any platform
* @returns string returns the key for the cache hit, otherwise returns undefined * @returns string returns the key for the cache hit, otherwise returns undefined
*/ */
function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArchive = false) { function restoreCache(paths, primaryKey, restoreKeys, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
checkPaths(paths); checkPaths(paths);
restoreKeys = restoreKeys || []; restoreKeys = restoreKeys || [];
@ -47242,17 +47228,31 @@ function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArch
for (const key of keys) { for (const key of keys) {
checkKey(key); checkKey(key);
} }
const compressionMethod = yield utils.getCompressionMethod(); let cacheEntry;
let compressionMethod = yield utils.getCompressionMethod();
let archivePath = ''; let archivePath = '';
try { try {
// path are needed to compute version // path are needed to compute version
const cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, {
compressionMethod, compressionMethod
enableCrossOsArchive
}); });
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
// Cache not found // This is to support the old cache entry created by gzip on windows.
return undefined; if (process.platform === 'win32' &&
compressionMethod !== constants_1.CompressionMethod.Gzip) {
compressionMethod = constants_1.CompressionMethod.Gzip;
cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, {
compressionMethod
});
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
return undefined;
}
core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
}
else {
// Cache not found
return undefined;
}
} }
archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod)); archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod));
core.debug(`Archive Path: ${archivePath}`); core.debug(`Archive Path: ${archivePath}`);
@ -47295,11 +47295,10 @@ exports.restoreCache = restoreCache;
* *
* @param paths a list of file paths to be cached * @param paths a list of file paths to be cached
* @param key an explicit key for restoring the cache * @param key an explicit key for restoring the cache
* @param enableCrossOsArchive an optional boolean enabled to save cache on windows which could be restored on any platform
* @param options cache upload options * @param options cache upload options
* @returns number returns cacheId if the cache was saved successfully and throws an error if save fails * @returns number returns cacheId if the cache was saved successfully and throws an error if save fails
*/ */
function saveCache(paths, key, options, enableCrossOsArchive = false) { function saveCache(paths, key, options) {
var _a, _b, _c, _d, _e; var _a, _b, _c, _d, _e;
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
checkPaths(paths); checkPaths(paths);
@ -47330,7 +47329,6 @@ function saveCache(paths, key, options, enableCrossOsArchive = false) {
core.debug('Reserving Cache'); core.debug('Reserving Cache');
const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, { const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, {
compressionMethod, compressionMethod,
enableCrossOsArchive,
cacheSize: archiveFileSize cacheSize: archiveFileSize
}); });
if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) { if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) {
@ -50494,8 +50492,7 @@ function restoreImpl(stateProvider) {
const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, { const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, {
required: true required: true
}); });
const enableCrossOsArchive = utils.getInputAsBool(constants_1.Inputs.EnableCrossOsArchive); const cacheKey = yield cache.restoreCache(cachePaths, primaryKey, restoreKeys);
const cacheKey = yield cache.restoreCache(cachePaths, primaryKey, restoreKeys, {}, enableCrossOsArchive);
if (!cacheKey) { if (!cacheKey) {
core.info(`Cache not found for input keys: ${[ core.info(`Cache not found for input keys: ${[
primaryKey, primaryKey,


@ -3439,6 +3439,7 @@ const crypto = __importStar(__webpack_require__(417));
const fs = __importStar(__webpack_require__(747)); const fs = __importStar(__webpack_require__(747));
const url_1 = __webpack_require__(835); const url_1 = __webpack_require__(835);
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const constants_1 = __webpack_require__(931);
const downloadUtils_1 = __webpack_require__(251); const downloadUtils_1 = __webpack_require__(251);
const options_1 = __webpack_require__(538); const options_1 = __webpack_require__(538);
const requestUtils_1 = __webpack_require__(899); const requestUtils_1 = __webpack_require__(899);
@ -3468,17 +3469,10 @@ function createHttpClient() {
const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token); const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token);
return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions()); return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions());
} }
function getCacheVersion(paths, compressionMethod, enableCrossOsArchive = false) { function getCacheVersion(paths, compressionMethod) {
const components = paths; const components = paths.concat(!compressionMethod || compressionMethod === constants_1.CompressionMethod.Gzip
// Add compression method to cache version to restore ? []
// compressed cache as per compression method : [compressionMethod]);
if (compressionMethod) {
components.push(compressionMethod);
}
// Only check for windows platforms if enableCrossOsArchive is false
if (process.platform === 'win32' && !enableCrossOsArchive) {
components.push('windows-only');
}
// Add salt to cache version to support breaking changes in cache entry // Add salt to cache version to support breaking changes in cache entry
components.push(versionSalt); components.push(versionSalt);
return crypto return crypto
@ -3490,7 +3484,7 @@ exports.getCacheVersion = getCacheVersion;
function getCacheEntry(keys, paths, options) { function getCacheEntry(keys, paths, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
const httpClient = createHttpClient(); const httpClient = createHttpClient();
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`; const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); })); const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
// Cache not found // Cache not found
@ -3553,7 +3547,7 @@ exports.downloadCache = downloadCache;
function reserveCache(key, paths, options) { function reserveCache(key, paths, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
const httpClient = createHttpClient(); const httpClient = createHttpClient();
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const reserveCacheRequest = { const reserveCacheRequest = {
key, key,
version, version,
@ -5033,8 +5027,7 @@ var Inputs;
Inputs["Key"] = "key"; Inputs["Key"] = "key";
Inputs["Path"] = "path"; Inputs["Path"] = "path";
Inputs["RestoreKeys"] = "restore-keys"; Inputs["RestoreKeys"] = "restore-keys";
Inputs["UploadChunkSize"] = "upload-chunk-size"; Inputs["UploadChunkSize"] = "upload-chunk-size"; // Input for cache, save action
Inputs["EnableCrossOsArchive"] = "enableCrossOsArchive"; // Input for cache, restore, save action
})(Inputs = exports.Inputs || (exports.Inputs = {})); })(Inputs = exports.Inputs || (exports.Inputs = {}));
var Outputs; var Outputs;
(function (Outputs) { (function (Outputs) {
@ -38188,14 +38181,12 @@ var __importStar = (this && this.__importStar) || function (mod) {
}; };
Object.defineProperty(exports, "__esModule", { value: true }); Object.defineProperty(exports, "__esModule", { value: true });
const exec_1 = __webpack_require__(986); const exec_1 = __webpack_require__(986);
const core_1 = __webpack_require__(470);
const io = __importStar(__webpack_require__(1)); const io = __importStar(__webpack_require__(1));
const fs_1 = __webpack_require__(747); const fs_1 = __webpack_require__(747);
const path = __importStar(__webpack_require__(622)); const path = __importStar(__webpack_require__(622));
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const constants_1 = __webpack_require__(931); const constants_1 = __webpack_require__(931);
const IS_WINDOWS = process.platform === 'win32'; const IS_WINDOWS = process.platform === 'win32';
core_1.exportVariable('MSYS', 'winsymlinks:nativestrict');
// Returns tar path and type: BSD or GNU // Returns tar path and type: BSD or GNU
function getTarPath() { function getTarPath() {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
@ -38653,7 +38644,7 @@ var __importStar = (this && this.__importStar) || function (mod) {
return result; return result;
}; };
Object.defineProperty(exports, "__esModule", { value: true }); Object.defineProperty(exports, "__esModule", { value: true });
exports.isCacheFeatureAvailable = exports.getInputAsBool = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; exports.isCacheFeatureAvailable = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0;
const cache = __importStar(__webpack_require__(692)); const cache = __importStar(__webpack_require__(692));
const core = __importStar(__webpack_require__(470)); const core = __importStar(__webpack_require__(470));
const constants_1 = __webpack_require__(196); const constants_1 = __webpack_require__(196);
@ -38696,11 +38687,6 @@ function getInputAsInt(name, options) {
return value; return value;
} }
exports.getInputAsInt = getInputAsInt; exports.getInputAsInt = getInputAsInt;
function getInputAsBool(name, options) {
const result = core.getInput(name, options);
return result.toLowerCase() === "true";
}
exports.getInputAsBool = getInputAsBool;
function isCacheFeatureAvailable() { function isCacheFeatureAvailable() {
if (cache.isFeatureAvailable()) { if (cache.isFeatureAvailable()) {
return true; return true;
@ -41180,8 +41166,9 @@ function saveImpl(stateProvider) {
const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, { const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, {
required: true required: true
}); });
const enableCrossOsArchive = utils.getInputAsBool(constants_1.Inputs.EnableCrossOsArchive); cacheId = yield cache.saveCache(cachePaths, primaryKey, {
cacheId = yield cache.saveCache(cachePaths, primaryKey, { uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize) }, enableCrossOsArchive); uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize)
});
if (cacheId != -1) { if (cacheId != -1) {
core.info(`Cache saved with key: ${primaryKey}`); core.info(`Cache saved with key: ${primaryKey}`);
} }
@ -47321,6 +47308,7 @@ const path = __importStar(__webpack_require__(622));
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const cacheHttpClient = __importStar(__webpack_require__(114)); const cacheHttpClient = __importStar(__webpack_require__(114));
const tar_1 = __webpack_require__(434); const tar_1 = __webpack_require__(434);
const constants_1 = __webpack_require__(931);
class ValidationError extends Error { class ValidationError extends Error {
constructor(message) { constructor(message) {
super(message); super(message);
@ -47367,10 +47355,9 @@ exports.isFeatureAvailable = isFeatureAvailable;
* @param primaryKey an explicit key for restoring the cache * @param primaryKey an explicit key for restoring the cache
* @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key * @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key
* @param downloadOptions cache download options * @param downloadOptions cache download options
* @param enableCrossOsArchive an optional boolean enabled to restore on windows any cache created on any platform
* @returns string returns the key for the cache hit, otherwise returns undefined * @returns string returns the key for the cache hit, otherwise returns undefined
*/ */
function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArchive = false) { function restoreCache(paths, primaryKey, restoreKeys, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
checkPaths(paths); checkPaths(paths);
restoreKeys = restoreKeys || []; restoreKeys = restoreKeys || [];
@ -47383,17 +47370,31 @@ function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArch
for (const key of keys) { for (const key of keys) {
checkKey(key); checkKey(key);
} }
const compressionMethod = yield utils.getCompressionMethod(); let cacheEntry;
let compressionMethod = yield utils.getCompressionMethod();
let archivePath = ''; let archivePath = '';
try { try {
// path are needed to compute version // path are needed to compute version
const cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, {
compressionMethod, compressionMethod
enableCrossOsArchive
}); });
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
// Cache not found // This is to support the old cache entry created by gzip on windows.
return undefined; if (process.platform === 'win32' &&
compressionMethod !== constants_1.CompressionMethod.Gzip) {
compressionMethod = constants_1.CompressionMethod.Gzip;
cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, {
compressionMethod
});
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
return undefined;
}
core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
}
else {
// Cache not found
return undefined;
}
} }
archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod)); archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod));
core.debug(`Archive Path: ${archivePath}`); core.debug(`Archive Path: ${archivePath}`);
@ -47436,11 +47437,10 @@ exports.restoreCache = restoreCache;
* *
* @param paths a list of file paths to be cached * @param paths a list of file paths to be cached
* @param key an explicit key for restoring the cache * @param key an explicit key for restoring the cache
* @param enableCrossOsArchive an optional boolean enabled to save cache on windows which could be restored on any platform
* @param options cache upload options * @param options cache upload options
* @returns number returns cacheId if the cache was saved successfully and throws an error if save fails * @returns number returns cacheId if the cache was saved successfully and throws an error if save fails
*/ */
function saveCache(paths, key, options, enableCrossOsArchive = false) { function saveCache(paths, key, options) {
var _a, _b, _c, _d, _e; var _a, _b, _c, _d, _e;
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
checkPaths(paths); checkPaths(paths);
@ -47471,7 +47471,6 @@ function saveCache(paths, key, options, enableCrossOsArchive = false) {
core.debug('Reserving Cache'); core.debug('Reserving Cache');
const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, { const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, {
compressionMethod, compressionMethod,
enableCrossOsArchive,
cacheSize: archiveFileSize cacheSize: archiveFileSize
}); });
if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) { if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) {

dist/save/index.js (vendored)

@ -3383,6 +3383,7 @@ const crypto = __importStar(__webpack_require__(417));
const fs = __importStar(__webpack_require__(747)); const fs = __importStar(__webpack_require__(747));
const url_1 = __webpack_require__(835); const url_1 = __webpack_require__(835);
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const constants_1 = __webpack_require__(931);
const downloadUtils_1 = __webpack_require__(251); const downloadUtils_1 = __webpack_require__(251);
const options_1 = __webpack_require__(538); const options_1 = __webpack_require__(538);
const requestUtils_1 = __webpack_require__(899); const requestUtils_1 = __webpack_require__(899);
@ -3412,17 +3413,10 @@ function createHttpClient() {
const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token); const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token);
return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions()); return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions());
} }
function getCacheVersion(paths, compressionMethod, enableCrossOsArchive = false) { function getCacheVersion(paths, compressionMethod) {
const components = paths; const components = paths.concat(!compressionMethod || compressionMethod === constants_1.CompressionMethod.Gzip
// Add compression method to cache version to restore ? []
// compressed cache as per compression method : [compressionMethod]);
if (compressionMethod) {
components.push(compressionMethod);
}
// Only check for windows platforms if enableCrossOsArchive is false
if (process.platform === 'win32' && !enableCrossOsArchive) {
components.push('windows-only');
}
// Add salt to cache version to support breaking changes in cache entry // Add salt to cache version to support breaking changes in cache entry
components.push(versionSalt); components.push(versionSalt);
return crypto return crypto
@ -3434,7 +3428,7 @@ exports.getCacheVersion = getCacheVersion;
function getCacheEntry(keys, paths, options) { function getCacheEntry(keys, paths, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
const httpClient = createHttpClient(); const httpClient = createHttpClient();
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`; const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`;
const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); })); const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); }));
// Cache not found // Cache not found
@ -3497,7 +3491,7 @@ exports.downloadCache = downloadCache;
function reserveCache(key, paths, options) { function reserveCache(key, paths, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
const httpClient = createHttpClient(); const httpClient = createHttpClient();
const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod);
const reserveCacheRequest = { const reserveCacheRequest = {
key, key,
version, version,
@ -4977,8 +4971,7 @@ var Inputs;
Inputs["Key"] = "key"; Inputs["Key"] = "key";
Inputs["Path"] = "path"; Inputs["Path"] = "path";
Inputs["RestoreKeys"] = "restore-keys"; Inputs["RestoreKeys"] = "restore-keys";
Inputs["UploadChunkSize"] = "upload-chunk-size"; Inputs["UploadChunkSize"] = "upload-chunk-size"; // Input for cache, save action
Inputs["EnableCrossOsArchive"] = "enableCrossOsArchive"; // Input for cache, restore, save action
})(Inputs = exports.Inputs || (exports.Inputs = {})); })(Inputs = exports.Inputs || (exports.Inputs = {}));
var Outputs; var Outputs;
(function (Outputs) { (function (Outputs) {
@ -38132,14 +38125,12 @@ var __importStar = (this && this.__importStar) || function (mod) {
}; };
Object.defineProperty(exports, "__esModule", { value: true }); Object.defineProperty(exports, "__esModule", { value: true });
const exec_1 = __webpack_require__(986); const exec_1 = __webpack_require__(986);
const core_1 = __webpack_require__(470);
const io = __importStar(__webpack_require__(1)); const io = __importStar(__webpack_require__(1));
const fs_1 = __webpack_require__(747); const fs_1 = __webpack_require__(747);
const path = __importStar(__webpack_require__(622)); const path = __importStar(__webpack_require__(622));
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const constants_1 = __webpack_require__(931); const constants_1 = __webpack_require__(931);
const IS_WINDOWS = process.platform === 'win32'; const IS_WINDOWS = process.platform === 'win32';
core_1.exportVariable('MSYS', 'winsymlinks:nativestrict');
// Returns tar path and type: BSD or GNU // Returns tar path and type: BSD or GNU
function getTarPath() { function getTarPath() {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
@ -38597,7 +38588,7 @@ var __importStar = (this && this.__importStar) || function (mod) {
return result; return result;
}; };
Object.defineProperty(exports, "__esModule", { value: true }); Object.defineProperty(exports, "__esModule", { value: true });
exports.isCacheFeatureAvailable = exports.getInputAsBool = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; exports.isCacheFeatureAvailable = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0;
const cache = __importStar(__webpack_require__(692)); const cache = __importStar(__webpack_require__(692));
const core = __importStar(__webpack_require__(470)); const core = __importStar(__webpack_require__(470));
const constants_1 = __webpack_require__(196); const constants_1 = __webpack_require__(196);
@ -38640,11 +38631,6 @@ function getInputAsInt(name, options) {
return value; return value;
} }
exports.getInputAsInt = getInputAsInt; exports.getInputAsInt = getInputAsInt;
function getInputAsBool(name, options) {
const result = core.getInput(name, options);
return result.toLowerCase() === "true";
}
exports.getInputAsBool = getInputAsBool;
function isCacheFeatureAvailable() { function isCacheFeatureAvailable() {
if (cache.isFeatureAvailable()) { if (cache.isFeatureAvailable()) {
return true; return true;
@ -41124,8 +41110,9 @@ function saveImpl(stateProvider) {
const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, { const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, {
required: true required: true
}); });
const enableCrossOsArchive = utils.getInputAsBool(constants_1.Inputs.EnableCrossOsArchive); cacheId = yield cache.saveCache(cachePaths, primaryKey, {
cacheId = yield cache.saveCache(cachePaths, primaryKey, { uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize) }, enableCrossOsArchive); uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize)
});
if (cacheId != -1) { if (cacheId != -1) {
core.info(`Cache saved with key: ${primaryKey}`); core.info(`Cache saved with key: ${primaryKey}`);
} }
@ -47294,6 +47281,7 @@ const path = __importStar(__webpack_require__(622));
const utils = __importStar(__webpack_require__(15)); const utils = __importStar(__webpack_require__(15));
const cacheHttpClient = __importStar(__webpack_require__(114)); const cacheHttpClient = __importStar(__webpack_require__(114));
const tar_1 = __webpack_require__(434); const tar_1 = __webpack_require__(434);
const constants_1 = __webpack_require__(931);
class ValidationError extends Error { class ValidationError extends Error {
constructor(message) { constructor(message) {
super(message); super(message);
@ -47340,10 +47328,9 @@ exports.isFeatureAvailable = isFeatureAvailable;
* @param primaryKey an explicit key for restoring the cache * @param primaryKey an explicit key for restoring the cache
* @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key * @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key
* @param downloadOptions cache download options * @param downloadOptions cache download options
* @param enableCrossOsArchive an optional boolean enabled to restore on windows any cache created on any platform
* @returns string returns the key for the cache hit, otherwise returns undefined * @returns string returns the key for the cache hit, otherwise returns undefined
*/ */
function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArchive = false) { function restoreCache(paths, primaryKey, restoreKeys, options) {
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
checkPaths(paths); checkPaths(paths);
restoreKeys = restoreKeys || []; restoreKeys = restoreKeys || [];
@ -47356,17 +47343,31 @@ function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArch
for (const key of keys) { for (const key of keys) {
checkKey(key); checkKey(key);
} }
const compressionMethod = yield utils.getCompressionMethod(); let cacheEntry;
let compressionMethod = yield utils.getCompressionMethod();
let archivePath = ''; let archivePath = '';
try { try {
// path are needed to compute version // path are needed to compute version
const cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, {
compressionMethod, compressionMethod
enableCrossOsArchive
}); });
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
// Cache not found // This is to support the old cache entry created by gzip on windows.
return undefined; if (process.platform === 'win32' &&
compressionMethod !== constants_1.CompressionMethod.Gzip) {
compressionMethod = constants_1.CompressionMethod.Gzip;
cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, {
compressionMethod
});
if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) {
return undefined;
}
core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression.");
}
else {
// Cache not found
return undefined;
}
} }
archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod)); archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod));
core.debug(`Archive Path: ${archivePath}`); core.debug(`Archive Path: ${archivePath}`);
@ -47409,11 +47410,10 @@ exports.restoreCache = restoreCache;
* *
* @param paths a list of file paths to be cached * @param paths a list of file paths to be cached
* @param key an explicit key for restoring the cache * @param key an explicit key for restoring the cache
* @param enableCrossOsArchive an optional boolean enabled to save cache on windows which could be restored on any platform
* @param options cache upload options * @param options cache upload options
* @returns number returns cacheId if the cache was saved successfully and throws an error if save fails * @returns number returns cacheId if the cache was saved successfully and throws an error if save fails
*/ */
function saveCache(paths, key, options, enableCrossOsArchive = false) { function saveCache(paths, key, options) {
var _a, _b, _c, _d, _e; var _a, _b, _c, _d, _e;
return __awaiter(this, void 0, void 0, function* () { return __awaiter(this, void 0, void 0, function* () {
checkPaths(paths); checkPaths(paths);
@ -47444,7 +47444,6 @@ function saveCache(paths, key, options, enableCrossOsArchive = false) {
core.debug('Reserving Cache'); core.debug('Reserving Cache');
const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, { const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, {
compressionMethod, compressionMethod,
enableCrossOsArchive,
cacheSize: archiveFileSize cacheSize: archiveFileSize
}); });
if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) { if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) {


@ -38,7 +38,6 @@
- [Swift, Objective-C - Carthage](#swift-objective-c---carthage) - [Swift, Objective-C - Carthage](#swift-objective-c---carthage)
- [Swift, Objective-C - CocoaPods](#swift-objective-c---cocoapods) - [Swift, Objective-C - CocoaPods](#swift-objective-c---cocoapods)
- [Swift - Swift Package Manager](#swift---swift-package-manager) - [Swift - Swift Package Manager](#swift---swift-package-manager)
- [Swift - Mint](#swift---mint)
## C# - NuGet ## C# - NuGet
@ -642,18 +641,3 @@ whenever possible:
restore-keys: | restore-keys: |
${{ runner.os }}-spm- ${{ runner.os }}-spm-
``` ```
## Swift - Mint
```yaml
env:
MINT_PATH: .mint/lib
MINT_LINK_PATH: .mint/bin
steps:
- uses: actions/cache@v3
with:
path: .mint
key: ${{ runner.os }}-mint-${{ hashFiles('**/Mintfile') }}
restore-keys: |
${{ runner.os }}-mint-
```

package-lock.json (generated)

@ -1,15 +1,15 @@
{ {
"name": "cache", "name": "cache",
"version": "3.2.3", "version": "3.2.1",
"lockfileVersion": 2, "lockfileVersion": 2,
"requires": true, "requires": true,
"packages": { "packages": {
"": { "": {
"name": "cache", "name": "cache",
"version": "3.2.3", "version": "3.2.1",
"license": "MIT", "license": "MIT",
"dependencies": { "dependencies": {
"@actions/cache": "^3.1.2", "@actions/cache": "^3.1.0",
"@actions/core": "^1.10.0", "@actions/core": "^1.10.0",
"@actions/exec": "^1.1.1", "@actions/exec": "^1.1.1",
"@actions/io": "^1.1.2" "@actions/io": "^1.1.2"
@ -36,9 +36,9 @@
} }
}, },
"node_modules/@actions/cache": { "node_modules/@actions/cache": {
"version": "3.1.2", "version": "3.1.0",
"resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.2.tgz", "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0.tgz",
"integrity": "sha512-3XeKcXIonfIbqvW7gPm/VLOhv1RHQ1dtTgSBCH6OUhCgSTii9bEVgu0PIms7UbLnXeMCKFzECfpbud8fJEvBbQ==", "integrity": "sha512-wKGJkpK3uFTgwy+KA0fxz0H3/ZPymdi0IlyhMmyoMeWd+CIv8xVPWdGlrPDDdN9bFgve2yvEPZVaKRb43Uwtyg==",
"dependencies": { "dependencies": {
"@actions/core": "^1.10.0", "@actions/core": "^1.10.0",
"@actions/exec": "^1.0.1", "@actions/exec": "^1.0.1",
@ -9722,9 +9722,9 @@
}, },
"dependencies": { "dependencies": {
"@actions/cache": { "@actions/cache": {
"version": "3.1.2", "version": "3.1.0",
"resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.2.tgz", "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0.tgz",
"integrity": "sha512-3XeKcXIonfIbqvW7gPm/VLOhv1RHQ1dtTgSBCH6OUhCgSTii9bEVgu0PIms7UbLnXeMCKFzECfpbud8fJEvBbQ==", "integrity": "sha512-wKGJkpK3uFTgwy+KA0fxz0H3/ZPymdi0IlyhMmyoMeWd+CIv8xVPWdGlrPDDdN9bFgve2yvEPZVaKRb43Uwtyg==",
"requires": { "requires": {
"@actions/core": "^1.10.0", "@actions/core": "^1.10.0",
"@actions/exec": "^1.0.1", "@actions/exec": "^1.0.1",


@ -1,6 +1,6 @@
{ {
"name": "cache", "name": "cache",
"version": "3.2.3", "version": "3.2.1",
"private": true, "private": true,
"description": "Cache dependencies and build outputs", "description": "Cache dependencies and build outputs",
"main": "dist/restore/index.js", "main": "dist/restore/index.js",
@ -23,7 +23,7 @@
"author": "GitHub", "author": "GitHub",
"license": "MIT", "license": "MIT",
"dependencies": { "dependencies": {
"@actions/cache": "^3.1.2", "@actions/cache": "^3.1.0",
"@actions/core": "^1.10.0", "@actions/core": "^1.10.0",
"@actions/exec": "^1.1.1", "@actions/exec": "^1.1.1",
"@actions/io": "^1.1.2" "@actions/io": "^1.1.2"


@ -120,7 +120,7 @@ steps:
#### Reusing primary key and restored key in the save action #### Reusing primary key and restored key in the save action
Usually you may want to use same `key` in both `actions/cache/restore` and `actions/cache/save` action. To achieve this, use `outputs` from the restore action to reuse the same primary key (or the key of the cache that was restored). Usually you may want to use same `key` in both actions/cache/restore` and `actions/cache/save` action. To achieve this, use `outputs` from the restore action to reuse the same primary key (or the key of the cache that was restored).
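As a minimal sketch of that pattern (the step `id`, paths, and key format here are illustrative, and the `key` output name is taken from the save example further down in this change rather than from the action's documentation):

```yaml
steps:
  - name: Restore cache
    id: restore-cache
    uses: actions/cache/restore@v3
    with:
      path: path/to/dependencies
      key: ${{ runner.os }}-deps-${{ hashFiles('**/lockfiles') }}

  # ... steps that rebuild path/to/dependencies on a cache miss ...

  - name: Save cache
    uses: actions/cache/save@v3
    with:
      path: path/to/dependencies
      key: ${{ steps.restore-cache.outputs.key }}
```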
#### Using restore action outputs to make save action behave just like the cache action #### Using restore action outputs to make save action behave just like the cache action


@ -11,10 +11,6 @@ inputs:
restore-keys: restore-keys:
description: 'An ordered list of keys to use for restoring stale cache if no cache hit occurred for key. Note `cache-hit` returns false in this case.' description: 'An ordered list of keys to use for restoring stale cache if no cache hit occurred for key. Note `cache-hit` returns false in this case.'
required: false required: false
enableCrossOsArchive:
description: 'An optional boolean when enabled, allows windows runners to restore caches that were saved on other platforms'
default: 'false'
required: false
outputs: outputs:
cache-hit: cache-hit:
description: 'A boolean value to indicate an exact match was found for the primary key' description: 'A boolean value to indicate an exact match was found for the primary key'
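A common way to consume that `cache-hit` output is to skip installation work when an exact match was restored; this is only a sketch, with an assumed step `id`, path, and key:

```yaml
steps:
  - id: deps-cache
    uses: actions/cache/restore@v3
    with:
      path: node_modules
      key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}

  - name: Install dependencies
    if: steps.deps-cache.outputs.cache-hit != 'true'
    run: npm ci
```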


@ -54,7 +54,7 @@ Case 1: Where an user would want to reuse the key as it is
```yaml ```yaml
uses: actions/cache/save@v3 uses: actions/cache/save@v3
with: with:
key: ${{ steps.restore-cache.outputs.key }} key: steps.restore-cache.output.key
``` ```
Case 2: Where the user would want to re-evaluate the key Case 2: Where the user would want to re-evaluate the key
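A hedged sketch of what re-evaluating the key could look like: recompute it from the lockfiles at save time instead of reusing the restore step's output (the key format and glob are illustrative):

```yaml
uses: actions/cache/save@v3
with:
  key: ${{ runner.os }}-cache-${{ hashFiles('**/lockfiles') }}
```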


@ -11,10 +11,6 @@ inputs:
upload-chunk-size: upload-chunk-size:
description: 'The chunk size used to split up large files during upload, in bytes' description: 'The chunk size used to split up large files during upload, in bytes'
required: false required: false
enableCrossOsArchive:
description: 'An optional boolean when enabled, allows windows runners to save caches that can be restored on other platforms'
default: 'false'
required: false
runs: runs:
using: 'node16' using: 'node16'
main: '../dist/save-only/index.js' main: '../dist/save-only/index.js'


@ -2,8 +2,7 @@ export enum Inputs {
Key = "key", // Input for cache, restore, save action Key = "key", // Input for cache, restore, save action
Path = "path", // Input for cache, restore, save action Path = "path", // Input for cache, restore, save action
RestoreKeys = "restore-keys", // Input for cache, restore action RestoreKeys = "restore-keys", // Input for cache, restore action
UploadChunkSize = "upload-chunk-size", // Input for cache, save action UploadChunkSize = "upload-chunk-size" // Input for cache, save action
EnableCrossOsArchive = "enableCrossOsArchive" // Input for cache, restore, save action
} }
export enum Outputs { export enum Outputs {


@ -31,16 +31,11 @@ async function restoreImpl(
const cachePaths = utils.getInputAsArray(Inputs.Path, { const cachePaths = utils.getInputAsArray(Inputs.Path, {
required: true required: true
}); });
const enableCrossOsArchive = utils.getInputAsBool(
Inputs.EnableCrossOsArchive
);
const cacheKey = await cache.restoreCache( const cacheKey = await cache.restoreCache(
cachePaths, cachePaths,
primaryKey, primaryKey,
restoreKeys, restoreKeys
{},
enableCrossOsArchive
); );
if (!cacheKey) { if (!cacheKey) {


@ -52,16 +52,9 @@ async function saveImpl(stateProvider: IStateProvider): Promise<number | void> {
required: true required: true
}); });
const enableCrossOsArchive = utils.getInputAsBool( cacheId = await cache.saveCache(cachePaths, primaryKey, {
Inputs.EnableCrossOsArchive uploadChunkSize: utils.getInputAsInt(Inputs.UploadChunkSize)
); });
cacheId = await cache.saveCache(
cachePaths,
primaryKey,
{ uploadChunkSize: utils.getInputAsInt(Inputs.UploadChunkSize) },
enableCrossOsArchive
);
if (cacheId != -1) { if (cacheId != -1) {
core.info(`Cache saved with key: ${primaryKey}`); core.info(`Cache saved with key: ${primaryKey}`);


@ -52,14 +52,6 @@ export function getInputAsInt(
return value; return value;
} }
export function getInputAsBool(
name: string,
options?: core.InputOptions
): boolean {
const result = core.getInput(name, options);
return result.toLowerCase() === "true";
}
export function isCacheFeatureAvailable(): boolean { export function isCacheFeatureAvailable(): boolean {
if (cache.isFeatureAvailable()) { if (cache.isFeatureAvailable()) {
return true; return true;


@ -13,7 +13,6 @@ interface CacheInput {
path: string; path: string;
key: string; key: string;
restoreKeys?: string[]; restoreKeys?: string[];
enableCrossOsArchive?: boolean;
} }
export function setInputs(input: CacheInput): void { export function setInputs(input: CacheInput): void {
@ -21,11 +20,6 @@ export function setInputs(input: CacheInput): void {
setInput(Inputs.Key, input.key); setInput(Inputs.Key, input.key);
input.restoreKeys && input.restoreKeys &&
setInput(Inputs.RestoreKeys, input.restoreKeys.join("\n")); setInput(Inputs.RestoreKeys, input.restoreKeys.join("\n"));
input.enableCrossOsArchive !== undefined &&
setInput(
Inputs.EnableCrossOsArchive,
input.enableCrossOsArchive.toString()
);
} }
export function clearInputs(): void { export function clearInputs(): void {
@ -33,5 +27,4 @@ export function clearInputs(): void {
delete process.env[getInputName(Inputs.Key)]; delete process.env[getInputName(Inputs.Key)];
delete process.env[getInputName(Inputs.RestoreKeys)]; delete process.env[getInputName(Inputs.RestoreKeys)];
delete process.env[getInputName(Inputs.UploadChunkSize)]; delete process.env[getInputName(Inputs.UploadChunkSize)];
delete process.env[getInputName(Inputs.EnableCrossOsArchive)];
} }


@ -19,11 +19,6 @@ A cache today is immutable and cannot be updated. But some use cases require the
## Use cache across feature branches ## Use cache across feature branches
Reusing cache across feature branches is not allowed today to provide cache [isolation](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache). However if both feature branches are from the default branch, a good way to achieve this is to ensure that the default branch has a cache. This cache will then be consumable by both feature branches. Reusing cache across feature branches is not allowed today to provide cache [isolation](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache). However if both feature branches are from the default branch, a good way to achieve this is to ensure that the default branch has a cache. This cache will then be consumable by both feature branches.
## Cross OS cache
From `v3.2.3` the cache is cross-OS compatible when the `enableCrossOsArchive` input is passed as true. This means that a cache created on `ubuntu-latest` or `macos-latest` can be used by `windows-latest` and vice versa, provided the workflow that runs on `windows-latest` also sets the `enableCrossOsArchive` input to true. This is useful for caching dependencies that are independent of the runner platform, which reduces cache quota consumption and lets multiple platforms build from the same cache. Things to keep in mind while using this feature (see the sketch after this list):

- Only cache files that are compatible across operating systems.
- Caching symlinks might cause issues during restoration, as they behave differently across operating systems.
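A minimal sketch of a cross-OS cache step under those constraints; the path is illustrative, and the key deliberately omits `runner.os` so all platforms compute the same key:

```yaml
- uses: actions/cache@v3
  with:
    path: path/to/platform-independent-deps
    key: cross-os-deps-${{ hashFiles('**/lockfiles') }}
    enableCrossOsArchive: true
```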
## Force deletion of caches overriding default cache eviction policy ## Force deletion of caches overriding default cache eviction policy
Caches have [branch scope restriction](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache) in place. This means that if caches for a specific branch are using a lot of storage quota, it may result into more frequently used caches from `default` branch getting thrashed. For example, if there are many pull requests happening on a repo and are creating caches, these cannot be used in default branch scope but will still occupy a lot of space till they get cleaned up by [eviction policy](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#usage-limits-and-eviction-policy). But sometime we want to clean them up on a faster cadence so as to ensure default branch is not thrashing. In order to achieve this, [gh-actions-cache cli](https://github.com/actions/gh-actions-cache/) can be used to delete caches for specific branches. Caches have [branch scope restriction](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache) in place. This means that if caches for a specific branch are using a lot of storage quota, it may result into more frequently used caches from `default` branch getting thrashed. For example, if there are many pull requests happening on a repo and are creating caches, these cannot be used in default branch scope but will still occupy a lot of space till they get cleaned up by [eviction policy](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#usage-limits-and-eviction-policy). But sometime we want to clean them up on a faster cadence so as to ensure default branch is not thrashing. In order to achieve this, [gh-actions-cache cli](https://github.com/actions/gh-actions-cache/) can be used to delete caches for specific branches.
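As a sketch only, a scheduled cleanup workflow step could drive that CLI as below; the extension name comes from the link above, but the subcommands and flags are assumptions and should be verified against `gh actions-cache --help`:

```yaml
- name: Clean up caches for this branch
  env:
    GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  run: |
    gh extension install actions/gh-actions-cache
    # List caches scoped to the branch, then delete a specific key (flags assumed)
    gh actions-cache list -R ${{ github.repository }} -B ${{ github.ref }}
    gh actions-cache delete "<cache-key>" -R ${{ github.repository }} -B ${{ github.ref }} --confirm
```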