mirror of https://github.com/actions/upload-artifact.git synced 2025-12-06 16:37:53 +01:00

Compare commits


8 Commits

Author SHA1 Message Date
Josh Soref
8011b13033 Merge c11cddaf41 into 4cec3d8aa0 2025-02-28 18:01:07 -05:00
Josh Soref
c11cddaf41 Improve trashcan documentation
Reorder the paragraph so the messaging is aimed only at users with write permissions, instead of getting people's hopes up that there might be a trash can only to dash them a sentence later.
2024-12-27 15:08:28 -05:00
Josh Soref
6ee0aee05d Improve Zip archives documentation grammar
As the format is `Zip`, it's wrong to precede it with the article `a`, which would be more appropriate for an instance of the format.
2024-12-27 15:08:28 -05:00
Josh Soref
0387148d8c Improve outputs documentation
Use `the` for `artifact-id` as it's a precise instance and not an amorphous artifact.
Fix boolean grammar for `artifact-url` -- all conditions must be true in order for the `artifact-url` to be valid.
2024-12-27 15:08:28 -05:00
Josh Soref
694ad020c0 Fix include-hidden-files documentation grammar 2024-12-27 15:05:56 -05:00
Josh Soref
9a6f90fdff Improve compression-level documentation
Wrap text so that it isn't clipped when rendered in:
https://github.com/actions/upload-artifact?tab=readme-ov-file#inputs
2024-12-27 15:05:56 -05:00
Josh Soref
a44bc71eb1 Improve retention-days documentation
Clarify that `retention-days: 0` is a valid configuration even though it is numerically less than 1, and that `1` is the shortest retention value (it is _not_ a minimum in the mathematical sense, just the shortest retention that can be requested); see the sketch below the commit list.
Clarify that the maximum `retention-days` value is configured per repository (which implicitly inherits from organization settings) and that the default for repositories is 90 days.
2024-12-27 15:05:56 -05:00
Josh Soref
319676eefa Improve if-no-files-found documentation
If `if-no-files-found: ignore`, `overwrite: false`, and an artifact with the same name already exists, the action will still fail, so clarify that `ignore` doesn't guarantee the action won't fail (sketched below the commit list).
2024-12-27 15:05:56 -05:00
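
The retention and no-files-found semantics clarified in these commits can be sketched in a minimal workflow. The job, build step, paths, and artifact names below are assumptions for illustration, not taken from this repository:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: make dist              # hypothetical build step

      # retention-days: 0 means "use the repository's default
      # retention" (90 days unless changed in repository settings);
      # 1 is the shortest retention that can be requested.
      - uses: actions/upload-artifact@v4
        with:
          name: nightly-build       # hypothetical artifact name
          path: dist/
          retention-days: 0

      # if-no-files-found: ignore suppresses the "no files" failure,
      # but the action can still fail for other reasons -- with
      # overwrite: false, it fails if another job in this run has
      # already uploaded an artifact named 'flaky-logs'.
      - uses: actions/upload-artifact@v4
        with:
          name: flaky-logs          # hypothetical artifact name
          path: maybe-missing/      # hypothetical, possibly empty
          if-no-files-found: ignore
          overwrite: false
```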
7 changed files with 85 additions and 843 deletions


@@ -1,6 +1,6 @@
---
name: "@actions/artifact"
version: 2.3.2
version: 2.2.2
type: npm
summary: Actions artifact lib
homepage: https://github.com/actions/toolkit/tree/main/packages/artifact

.vscode/launch.json vendored

@@ -1,36 +0,0 @@
{
"version": "0.2.0",
"configurations": [
{
"type": "node",
"request": "launch",
"name": "Debug Jest Tests",
"program": "${workspaceFolder}/node_modules/jest/bin/jest.js",
"args": [
"--runInBand",
"--testTimeout",
"10000"
],
"cwd": "${workspaceFolder}",
"console": "integratedTerminal",
"internalConsoleOptions": "neverOpen",
"disableOptimisticBPs": true
},
{
"type": "node",
"request": "launch",
"name": "Debug Current Test File",
"program": "${workspaceFolder}/node_modules/jest/bin/jest.js",
"args": [
"--runInBand",
"--testTimeout",
"10000",
"${relativeFile}"
],
"cwd": "${workspaceFolder}",
"console": "integratedTerminal",
"internalConsoleOptions": "neverOpen",
"disableOptimisticBPs": true
}
]
}

README.md

@@ -68,24 +68,6 @@ There is also a new sub-action, `actions/upload-artifact/merge`. For more info,
For assistance with breaking changes, see [MIGRATION.md](docs/MIGRATION.md).
## Note
Thank you for your interest in this GitHub repo, however, right now we are not taking contributions.
We continue to focus our resources on strategic areas that help our customers be successful while making developers' lives easier. While GitHub Actions remains a key part of this vision, we are allocating resources towards other areas of Actions and are not taking contributions to this repository at this time. The GitHub public roadmap is the best place to follow along for any updates on features we're working on and what stage they're in.
We are taking the following steps to better direct requests related to GitHub Actions, including:
1. We will be directing questions and support requests to our [Community Discussions area](https://github.com/orgs/community/discussions/categories/actions)
2. High Priority bugs can be reported through Community Discussions or you can report these to our support team https://support.github.com/contact/bug-report.
3. Security Issues should be handled as per our [security.md](SECURITY.md).
We will still provide security updates for this project and fix major breaking changes during this time.
You are welcome to still raise bugs in this repo.
## Usage
### Inputs
@@ -105,19 +87,21 @@ You are welcome to still raise bugs in this repo.
# Available Options:
# warn: Output a warning but do not fail the action
# error: Fail the action with an error message
# ignore: Do not output any warnings or errors, the action does not fail
# ignore: Do not output any warnings or errors, does not fail the action
# Optional. Default is 'warn'
if-no-files-found:
# Duration after which artifact will expire in days. 0 means using default retention.
# Minimum 1 day.
# Maximum 90 days unless changed from the repository settings page.
# Duration after which artifact will expire in days.
# 0 means use default retention.
# 1 is the shortest retention.
# Maximum is based on repository settings (the default is 90 days).
# Optional. Defaults to repository settings.
retention-days:
# The level of compression for Zlib to be applied to the artifact archive.
# The value can range from 0 to 9.
# For large files that are not easily compressed, a value of 0 is recommended for significantly faster uploads.
# For large files that are not easily compressed, a value of 0 is recommended for
# significantly faster uploads.
# Optional. Default is '6'
compression-level:
@@ -127,9 +111,9 @@ You are welcome to still raise bugs in this repo.
# Optional. Default is 'false'
overwrite:
# Whether to include hidden files in the provided path in the artifact
# Whether to include hidden files in the provided path in the artifact.
# The file contents of any hidden files in the path should be validated before
# enabled this to avoid uploading sensitive information.
# enabling this to avoid uploading sensitive information.
# Optional. Default is 'false'
include-hidden-files:
```
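
A hedged sketch of the compression and hidden-file inputs documented above; the artifact name and glob are assumptions:

```yaml
- uses: actions/upload-artifact@v4
  with:
    name: raw-captures            # hypothetical artifact name
    path: captures/**/*.mp4       # hypothetical glob
    # Already-compressed media gains little from Zlib, so level 0
    # trades archive size for significantly faster uploads.
    compression-level: 0
    # Leave hidden files out unless their contents have been
    # validated to contain nothing sensitive.
    include-hidden-files: false
```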
@@ -138,8 +122,8 @@ You are welcome to still raise bugs in this repo.
| Name | Description | Example |
| - | - | - |
| `artifact-id` | GitHub ID of an Artifact, can be used by the REST API | `1234` |
| `artifact-url` | URL to download an Artifact. Can be used in many scenarios such as linking to artifacts in issues or pull requests. Users must be logged-in in order for this URL to work. This URL is valid as long as the artifact has not expired or the artifact, run or repository have not been deleted | `https://github.com/example-org/example-repo/actions/runs/1/artifacts/1234` |
| `artifact-id` | GitHub ID of the Artifact, can be used by the REST API | `1234` |
| `artifact-url` | URL to download the Artifact. Can be used in many scenarios such as linking to artifacts in issues or pull requests. Users must be logged-in in order for this URL to work. This URL is valid as long as the artifact has not expired and the artifact, run, and repository have not been deleted. | `https://github.com/example-org/example-repo/actions/runs/1/artifacts/1234` |
| `artifact-digest` | SHA-256 digest of an Artifact | 0fde654d4c6e659b45783a725dc92f1bfb0baa6c2de64b34e814dc206ff4aaaf |
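
A minimal sketch of consuming these outputs in a later step; the step id `upload`, artifact name, and path are illustrative assumptions:

```yaml
- uses: actions/upload-artifact@v4
  id: upload
  with:
    name: my-artifact             # hypothetical artifact name
    path: out/
- run: |
    # artifact-id feeds the REST API; artifact-url is a browser link
    # that works for logged-in users while the artifact, run, and
    # repository all still exist.
    echo "url=${{ steps.upload.outputs.artifact-url }}"
    gh api "/repos/${GITHUB_REPOSITORY}/actions/artifacts/${{ steps.upload.outputs.artifact-id }}"
  env:
    GH_TOKEN: ${{ github.token }}
```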
## Examples
@@ -468,7 +452,7 @@ You may also be limited by Artifacts if you have exceeded your shared storage qu
### Zip archives
When an Artifact is uploaded, all the files are assembled into an immutable Zip archive. There is currently no way to download artifacts in a format other than a Zip or to download individual artifact contents.
When an Artifact is uploaded, all the files are assembled into an immutable Zip archive. There is currently no way to download artifacts in a format other than Zip or to download individual artifact contents.
### Permission Loss
@@ -491,9 +475,8 @@ If you must preserve permissions, you can `tar` all of your files together befor
At the bottom of the workflow summary page, there is a dedicated section for artifacts. Here's a screenshot of something you might see:
<img src="https://github.com/user-attachments/assets/bcb7120f-f445-4a3e-9596-77f85f7e0af0" width="700" height="300">
<img src="https://user-images.githubusercontent.com/16109154/103645952-223c6880-4f59-11eb-8268-8dca6937b5f9.png" width="700" height="300">
For users who have write permissions to the repository, there is a trashcan icon that can be used to delete the artifact.
There is a trashcan icon that can be used to delete the artifact. This icon will only appear for users who have write permissions to the repository.
The size of the artifact is denoted in bytes. The displayed artifact size denotes the size of the zip that `upload-artifact` creates during upload. The Digest column will display the SHA256 digest of the artifact being uploaded.
The size of the artifact is denoted in bytes. The displayed artifact size denotes the size of the zip that `upload-artifact` creates during upload.
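
The `tar` workaround referenced in the Permission Loss hunk above might look like the following; the paths and artifact name are assumptions:

```yaml
# Zip handling drops POSIX permissions; tar records them inside the
# archive, so permissions survive the upload/download round trip.
- run: tar -cvf my-files.tar ./my-output    # hypothetical path
- uses: actions/upload-artifact@v4
  with:
    name: my-files-with-permissions
    path: my-files.tar
```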

dist/merge/index.js vendored

@@ -824,7 +824,7 @@ __exportStar(__nccwpck_require__(49773), exports);
"use strict";
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.ArtifactService = exports.DeleteArtifactResponse = exports.DeleteArtifactRequest = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = exports.FinalizeMigratedArtifactResponse = exports.FinalizeMigratedArtifactRequest = exports.MigrateArtifactResponse = exports.MigrateArtifactRequest = void 0;
exports.ArtifactService = exports.DeleteArtifactResponse = exports.DeleteArtifactRequest = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = void 0;
// @generated by protobuf-ts 2.9.1 with parameter long_type_string,client_none,generate_dependencies
// @generated from protobuf file "results/api/v1/artifact.proto" (package "github.actions.results.api.v1", syntax proto3)
// tslint:disable
@@ -838,236 +838,6 @@ const wrappers_1 = __nccwpck_require__(8626);
const wrappers_2 = __nccwpck_require__(8626);
const timestamp_1 = __nccwpck_require__(54622);
// @generated message type with reflection information, may provide speed optimized methods
class MigrateArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.MigrateArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "expires_at", kind: "message", T: () => timestamp_1.Timestamp }
]);
}
create(value) {
const message = { workflowRunBackendId: "", name: "" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string name */ 2:
message.name = reader.string();
break;
case /* google.protobuf.Timestamp expires_at */ 3:
message.expiresAt = timestamp_1.Timestamp.internalBinaryRead(reader, reader.uint32(), options, message.expiresAt);
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, runtime_1.WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string name = 2; */
if (message.name !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.name);
/* google.protobuf.Timestamp expires_at = 3; */
if (message.expiresAt)
timestamp_1.Timestamp.internalBinaryWrite(message.expiresAt, writer.tag(3, runtime_1.WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactRequest
*/
exports.MigrateArtifactRequest = new MigrateArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class MigrateArtifactResponse$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.MigrateArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "signed_upload_url", kind: "scalar", T: 9 /*ScalarType.STRING*/ }
]);
}
create(value) {
const message = { ok: false, signedUploadUrl: "" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* string signed_upload_url */ 2:
message.signedUploadUrl = reader.string();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, runtime_1.WireType.Varint).bool(message.ok);
/* string signed_upload_url = 2; */
if (message.signedUploadUrl !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.signedUploadUrl);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactResponse
*/
exports.MigrateArtifactResponse = new MigrateArtifactResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class FinalizeMigratedArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.FinalizeMigratedArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "size", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value) {
const message = { workflowRunBackendId: "", name: "", size: "0" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string name */ 2:
message.name = reader.string();
break;
case /* int64 size */ 3:
message.size = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, runtime_1.WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string name = 2; */
if (message.name !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.name);
/* int64 size = 3; */
if (message.size !== "0")
writer.tag(3, runtime_1.WireType.Varint).int64(message.size);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactRequest
*/
exports.FinalizeMigratedArtifactRequest = new FinalizeMigratedArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class FinalizeMigratedArtifactResponse$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.FinalizeMigratedArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "artifact_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value) {
const message = { ok: false, artifactId: "0" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* int64 artifact_id */ 2:
message.artifactId = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, runtime_1.WireType.Varint).bool(message.ok);
/* int64 artifact_id = 2; */
if (message.artifactId !== "0")
writer.tag(2, runtime_1.WireType.Varint).int64(message.artifactId);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactResponse
*/
exports.FinalizeMigratedArtifactResponse = new FinalizeMigratedArtifactResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class CreateArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.CreateArtifactRequest", [
@@ -1449,8 +1219,7 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
{ no: 3, name: "database_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ },
{ no: 4, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 5, name: "size", kind: "scalar", T: 3 /*ScalarType.INT64*/ },
{ no: 6, name: "created_at", kind: "message", T: () => timestamp_1.Timestamp },
{ no: 7, name: "digest", kind: "message", T: () => wrappers_2.StringValue }
{ no: 6, name: "created_at", kind: "message", T: () => timestamp_1.Timestamp }
]);
}
create(value) {
@@ -1483,9 +1252,6 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
case /* google.protobuf.Timestamp created_at */ 6:
message.createdAt = timestamp_1.Timestamp.internalBinaryRead(reader, reader.uint32(), options, message.createdAt);
break;
case /* google.protobuf.StringValue digest */ 7:
message.digest = wrappers_2.StringValue.internalBinaryRead(reader, reader.uint32(), options, message.digest);
break;
default:
let u = options.readUnknownField;
if (u === "throw")
@@ -1516,9 +1282,6 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
/* google.protobuf.Timestamp created_at = 6; */
if (message.createdAt)
timestamp_1.Timestamp.internalBinaryWrite(message.createdAt, writer.tag(6, runtime_1.WireType.LengthDelimited).fork(), options).join();
/* google.protobuf.StringValue digest = 7; */
if (message.digest)
wrappers_2.StringValue.internalBinaryWrite(message.digest, writer.tag(7, runtime_1.WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
@@ -1760,9 +1523,7 @@ exports.ArtifactService = new runtime_rpc_1.ServiceType("github.actions.results.
{ name: "FinalizeArtifact", options: {}, I: exports.FinalizeArtifactRequest, O: exports.FinalizeArtifactResponse },
{ name: "ListArtifacts", options: {}, I: exports.ListArtifactsRequest, O: exports.ListArtifactsResponse },
{ name: "GetSignedArtifactURL", options: {}, I: exports.GetSignedArtifactURLRequest, O: exports.GetSignedArtifactURLResponse },
{ name: "DeleteArtifact", options: {}, I: exports.DeleteArtifactRequest, O: exports.DeleteArtifactResponse },
{ name: "MigrateArtifact", options: {}, I: exports.MigrateArtifactRequest, O: exports.MigrateArtifactResponse },
{ name: "FinalizeMigratedArtifact", options: {}, I: exports.FinalizeMigratedArtifactRequest, O: exports.FinalizeMigratedArtifactResponse }
{ name: "DeleteArtifact", options: {}, I: exports.DeleteArtifactRequest, O: exports.DeleteArtifactResponse }
]);
//# sourceMappingURL=artifact.js.map
@@ -2159,8 +1920,6 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.downloadArtifactInternal = exports.downloadArtifactPublic = exports.streamExtractExternal = void 0;
const promises_1 = __importDefault(__nccwpck_require__(73292));
const crypto = __importStar(__nccwpck_require__(6113));
const stream = __importStar(__nccwpck_require__(12781));
const github = __importStar(__nccwpck_require__(21260));
const core = __importStar(__nccwpck_require__(42186));
const httpClient = __importStar(__nccwpck_require__(96255));
@@ -2197,7 +1956,8 @@ function streamExtract(url, directory) {
let retryCount = 0;
while (retryCount < 5) {
try {
return yield streamExtractExternal(url, directory);
yield streamExtractExternal(url, directory);
return;
}
catch (error) {
retryCount++;
@@ -2217,18 +1977,12 @@ function streamExtractExternal(url, directory) {
throw new Error(`Unexpected HTTP response from blob storage: ${response.message.statusCode} ${response.message.statusMessage}`);
}
const timeout = 30 * 1000; // 30 seconds
let sha256Digest = undefined;
return new Promise((resolve, reject) => {
const timerFn = () => {
response.message.destroy(new Error(`Blob storage chunk did not respond in ${timeout}ms`));
};
const timer = setTimeout(timerFn, timeout);
const hashStream = crypto.createHash('sha256').setEncoding('hex');
const passThrough = new stream.PassThrough();
response.message.pipe(passThrough);
passThrough.pipe(hashStream);
const extractStream = passThrough;
extractStream
response.message
.on('data', () => {
timer.refresh();
})
@@ -2240,12 +1994,7 @@ function streamExtractExternal(url, directory) {
.pipe(unzip_stream_1.default.Extract({ path: directory }))
.on('close', () => {
clearTimeout(timer);
if (hashStream) {
hashStream.end();
sha256Digest = hashStream.read();
core.info(`SHA256 digest of downloaded artifact is ${sha256Digest}`);
}
resolve({ sha256Digest: `sha256:${sha256Digest}` });
resolve();
})
.on('error', (error) => {
reject(error);
@@ -2258,7 +2007,6 @@ function downloadArtifactPublic(artifactId, repositoryOwner, repositoryName, tok
return __awaiter(this, void 0, void 0, function* () {
const downloadPath = yield resolveOrCreateDirectory(options === null || options === void 0 ? void 0 : options.path);
const api = github.getOctokit(token);
let digestMismatch = false;
core.info(`Downloading artifact '${artifactId}' from '${repositoryOwner}/${repositoryName}'`);
const { headers, status } = yield api.rest.actions.downloadArtifact({
owner: repositoryOwner,
@@ -2279,20 +2027,13 @@ function downloadArtifactPublic(artifactId, repositoryOwner, repositoryName, tok
core.info(`Redirecting to blob download url: ${scrubQueryParameters(location)}`);
try {
core.info(`Starting download of artifact to: ${downloadPath}`);
const extractResponse = yield streamExtract(location, downloadPath);
yield streamExtract(location, downloadPath);
core.info(`Artifact download completed successfully.`);
if (options === null || options === void 0 ? void 0 : options.expectedHash) {
if ((options === null || options === void 0 ? void 0 : options.expectedHash) !== extractResponse.sha256Digest) {
digestMismatch = true;
core.debug(`Computed digest: ${extractResponse.sha256Digest}`);
core.debug(`Expected digest: ${options.expectedHash}`);
}
}
}
catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`);
}
return { downloadPath, digestMismatch };
return { downloadPath };
});
}
exports.downloadArtifactPublic = downloadArtifactPublic;
@@ -2300,7 +2041,6 @@ function downloadArtifactInternal(artifactId, options) {
return __awaiter(this, void 0, void 0, function* () {
const downloadPath = yield resolveOrCreateDirectory(options === null || options === void 0 ? void 0 : options.path);
const artifactClient = (0, artifact_twirp_client_1.internalArtifactTwirpClient)();
let digestMismatch = false;
const { workflowRunBackendId, workflowJobRunBackendId } = (0, util_1.getBackendIdsFromToken)();
const listReq = {
workflowRunBackendId,
@@ -2323,20 +2063,13 @@ function downloadArtifactInternal(artifactId, options) {
core.info(`Redirecting to blob download url: ${scrubQueryParameters(signedUrl)}`);
try {
core.info(`Starting download of artifact to: ${downloadPath}`);
const extractResponse = yield streamExtract(signedUrl, downloadPath);
yield streamExtract(signedUrl, downloadPath);
core.info(`Artifact download completed successfully.`);
if (options === null || options === void 0 ? void 0 : options.expectedHash) {
if ((options === null || options === void 0 ? void 0 : options.expectedHash) !== extractResponse.sha256Digest) {
digestMismatch = true;
core.debug(`Computed digest: ${extractResponse.sha256Digest}`);
core.debug(`Expected digest: ${options.expectedHash}`);
}
}
}
catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`);
}
return { downloadPath, digestMismatch };
return { downloadPath };
});
}
exports.downloadArtifactInternal = downloadArtifactInternal;
@@ -2442,17 +2175,13 @@ function getArtifactPublic(artifactName, workflowRunId, repositoryOwner, reposit
name: artifact.name,
id: artifact.id,
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: artifact.digest
createdAt: artifact.created_at ? new Date(artifact.created_at) : undefined
}
};
});
}
exports.getArtifactPublic = getArtifactPublic;
function getArtifactInternal(artifactName) {
var _a;
return __awaiter(this, void 0, void 0, function* () {
const artifactClient = (0, artifact_twirp_client_1.internalArtifactTwirpClient)();
const { workflowRunBackendId, workflowJobRunBackendId } = (0, util_1.getBackendIdsFromToken)();
@@ -2479,8 +2208,7 @@ function getArtifactInternal(artifactName) {
size: Number(artifact.size),
createdAt: artifact.createdAt
? generated_1.Timestamp.toDate(artifact.createdAt)
: undefined,
digest: (_a = artifact.digest) === null || _a === void 0 ? void 0 : _a.value
: undefined
}
};
});
@@ -2534,7 +2262,7 @@ function listArtifactsPublic(workflowRunId, repositoryOwner, repositoryName, tok
};
const github = (0, github_1.getOctokit)(token, opts, plugin_retry_1.retry, plugin_request_log_1.requestLog);
let currentPageNumber = 1;
const { data: listArtifactResponse } = yield github.request('GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts', {
const { data: listArtifactResponse } = yield github.rest.actions.listWorkflowRunArtifacts({
owner: repositoryOwner,
repo: repositoryName,
run_id: workflowRunId,
@@ -2553,18 +2281,14 @@ function listArtifactsPublic(workflowRunId, repositoryOwner, repositoryName, tok
name: artifact.name,
id: artifact.id,
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: artifact.digest
createdAt: artifact.created_at ? new Date(artifact.created_at) : undefined
});
}
// Move to the next page
currentPageNumber++;
// Iterate over any remaining pages
for (currentPageNumber; currentPageNumber < numberOfPages; currentPageNumber++) {
currentPageNumber++;
(0, core_1.debug)(`Fetching page ${currentPageNumber} of artifact list`);
const { data: listArtifactResponse } = yield github.request('GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts', {
const { data: listArtifactResponse } = yield github.rest.actions.listWorkflowRunArtifacts({
owner: repositoryOwner,
repo: repositoryName,
run_id: workflowRunId,
@@ -2578,8 +2302,7 @@ function listArtifactsPublic(workflowRunId, repositoryOwner, repositoryName, tok
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: artifact.digest
: undefined
});
}
}
@@ -2602,18 +2325,14 @@ function listArtifactsInternal(latest = false) {
workflowJobRunBackendId
};
const res = yield artifactClient.ListArtifacts(req);
let artifacts = res.artifacts.map(artifact => {
var _a;
return ({
let artifacts = res.artifacts.map(artifact => ({
name: artifact.name,
id: Number(artifact.databaseId),
size: Number(artifact.size),
createdAt: artifact.createdAt
? generated_1.Timestamp.toDate(artifact.createdAt)
: undefined,
digest: (_a = artifact.digest) === null || _a === void 0 ? void 0 : _a.value
});
});
: undefined
}));
if (latest) {
artifacts = filterLatest(artifacts);
}
@@ -2725,7 +2444,6 @@ const generated_1 = __nccwpck_require__(49960);
const config_1 = __nccwpck_require__(74610);
const user_agent_1 = __nccwpck_require__(85164);
const errors_1 = __nccwpck_require__(38182);
const util_1 = __nccwpck_require__(63062);
class ArtifactHttpClient {
constructor(userAgent, maxAttempts, baseRetryIntervalMilliseconds, retryMultiplier) {
this.maxAttempts = 5;
@@ -2778,7 +2496,6 @@ class ArtifactHttpClient {
(0, core_1.debug)(`[Response] - ${response.message.statusCode}`);
(0, core_1.debug)(`Headers: ${JSON.stringify(response.message.headers, null, 2)}`);
const body = JSON.parse(rawBody);
(0, util_1.maskSecretUrls)(body);
(0, core_1.debug)(`Body: ${JSON.stringify(body, null, 2)}`);
if (this.isSuccessStatusCode(statusCode)) {
return { response, body };
@@ -3095,11 +2812,10 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.maskSecretUrls = exports.maskSigUrl = exports.getBackendIdsFromToken = void 0;
exports.getBackendIdsFromToken = void 0;
const core = __importStar(__nccwpck_require__(42186));
const config_1 = __nccwpck_require__(74610);
const jwt_decode_1 = __importDefault(__nccwpck_require__(84329));
const core_1 = __nccwpck_require__(42186);
const InvalidJwtError = new Error('Failed to get backend IDs: The provided JWT token is invalid and/or missing claims');
// uses the JWT token claims to get the
// workflow run and workflow job run backend ids
@@ -3148,74 +2864,6 @@ function getBackendIdsFromToken() {
throw InvalidJwtError;
}
exports.getBackendIdsFromToken = getBackendIdsFromToken;
/**
* Masks the `sig` parameter in a URL and sets it as a secret.
*
* @param url - The URL containing the signature parameter to mask
* @remarks
* This function attempts to parse the provided URL and identify the 'sig' query parameter.
* If found, it registers both the raw and URL-encoded signature values as secrets using
* the Actions `setSecret` API, which prevents them from being displayed in logs.
*
* The function handles errors gracefully if URL parsing fails, logging them as debug messages.
*
* @example
* ```typescript
* // Mask a signature in an Azure SAS token URL
* maskSigUrl('https://example.blob.core.windows.net/container/file.txt?sig=abc123&se=2023-01-01');
* ```
*/
function maskSigUrl(url) {
if (!url)
return;
try {
const parsedUrl = new URL(url);
const signature = parsedUrl.searchParams.get('sig');
if (signature) {
(0, core_1.setSecret)(signature);
(0, core_1.setSecret)(encodeURIComponent(signature));
}
}
catch (error) {
(0, core_1.debug)(`Failed to parse URL: ${url} ${error instanceof Error ? error.message : String(error)}`);
}
}
exports.maskSigUrl = maskSigUrl;
/**
* Masks sensitive information in URLs containing signature parameters.
* Currently supports masking 'sig' parameters in the 'signed_upload_url'
* and 'signed_download_url' properties of the provided object.
*
* @param body - The object should contain a signature
* @remarks
* This function extracts URLs from the object properties and calls maskSigUrl
* on each one to redact sensitive signature information. The function doesn't
* modify the original object; it only marks the signatures as secrets for
* logging purposes.
*
* @example
* ```typescript
* const responseBody = {
* signed_upload_url: 'https://example.com?sig=abc123',
* signed_download_url: 'https://example.com?sig=def456'
* };
* maskSecretUrls(responseBody);
* ```
*/
function maskSecretUrls(body) {
if (typeof body !== 'object' || body === null) {
(0, core_1.debug)('body is not an object or is null');
return;
}
if ('signed_upload_url' in body &&
typeof body.signed_upload_url === 'string') {
maskSigUrl(body.signed_upload_url);
}
if ('signed_url' in body && typeof body.signed_url === 'string') {
maskSigUrl(body.signed_url);
}
}
exports.maskSecretUrls = maskSecretUrls;
//# sourceMappingURL=util.js.map
/***/ }),
@@ -3322,7 +2970,7 @@ function uploadZipToBlobStorage(authenticatedUploadURL, zipUploadStream) {
core.info('Finished uploading artifact content to blob storage!');
hashStream.end();
sha256Hash = hashStream.read();
core.info(`SHA256 digest of uploaded artifact zip is ${sha256Hash}`);
core.info(`SHA256 hash of uploaded artifact zip is ${sha256Hash}`);
if (uploadByteCount === 0) {
core.warning(`No data was uploaded to blob storage. Reported upload byte count is 0.`);
}
@@ -135835,7 +135483,7 @@ module.exports = index;
/***/ ((module) => {
"use strict";
module.exports = JSON.parse('{"name":"@actions/artifact","version":"2.3.2","preview":true,"description":"Actions artifact lib","keywords":["github","actions","artifact"],"homepage":"https://github.com/actions/toolkit/tree/main/packages/artifact","license":"MIT","main":"lib/artifact.js","types":"lib/artifact.d.ts","directories":{"lib":"lib","test":"__tests__"},"files":["lib","!.DS_Store"],"publishConfig":{"access":"public"},"repository":{"type":"git","url":"git+https://github.com/actions/toolkit.git","directory":"packages/artifact"},"scripts":{"audit-moderate":"npm install && npm audit --json --audit-level=moderate > audit.json","test":"cd ../../ && npm run test ./packages/artifact","bootstrap":"cd ../../ && npm run bootstrap","tsc-run":"tsc","tsc":"npm run bootstrap && npm run tsc-run","gen:docs":"typedoc --plugin typedoc-plugin-markdown --out docs/generated src/artifact.ts --githubPages false --readme none"},"bugs":{"url":"https://github.com/actions/toolkit/issues"},"dependencies":{"@actions/core":"^1.10.0","@actions/github":"^5.1.1","@actions/http-client":"^2.1.0","@azure/storage-blob":"^12.15.0","@octokit/core":"^3.5.1","@octokit/plugin-request-log":"^1.0.4","@octokit/plugin-retry":"^3.0.9","@octokit/request-error":"^5.0.0","@protobuf-ts/plugin":"^2.2.3-alpha.1","archiver":"^7.0.1","jwt-decode":"^3.1.2","unzip-stream":"^0.3.1"},"devDependencies":{"@types/archiver":"^5.3.2","@types/unzip-stream":"^0.3.4","typedoc":"^0.25.4","typedoc-plugin-markdown":"^3.17.1","typescript":"^5.2.2"}}');
module.exports = JSON.parse('{"name":"@actions/artifact","version":"2.2.2","preview":true,"description":"Actions artifact lib","keywords":["github","actions","artifact"],"homepage":"https://github.com/actions/toolkit/tree/main/packages/artifact","license":"MIT","main":"lib/artifact.js","types":"lib/artifact.d.ts","directories":{"lib":"lib","test":"__tests__"},"files":["lib","!.DS_Store"],"publishConfig":{"access":"public"},"repository":{"type":"git","url":"git+https://github.com/actions/toolkit.git","directory":"packages/artifact"},"scripts":{"audit-moderate":"npm install && npm audit --json --audit-level=moderate > audit.json","test":"cd ../../ && npm run test ./packages/artifact","bootstrap":"cd ../../ && npm run bootstrap","tsc-run":"tsc","tsc":"npm run bootstrap && npm run tsc-run","gen:docs":"typedoc --plugin typedoc-plugin-markdown --out docs/generated src/artifact.ts --githubPages false --readme none"},"bugs":{"url":"https://github.com/actions/toolkit/issues"},"dependencies":{"@actions/core":"^1.10.0","@actions/github":"^5.1.1","@actions/http-client":"^2.1.0","@azure/storage-blob":"^12.15.0","@octokit/core":"^3.5.1","@octokit/plugin-request-log":"^1.0.4","@octokit/plugin-retry":"^3.0.9","@octokit/request-error":"^5.0.0","@protobuf-ts/plugin":"^2.2.3-alpha.1","archiver":"^7.0.1","jwt-decode":"^3.1.2","unzip-stream":"^0.3.1"},"devDependencies":{"@types/archiver":"^5.3.2","@types/unzip-stream":"^0.3.4","typedoc":"^0.25.4","typedoc-plugin-markdown":"^3.17.1","typescript":"^5.2.2"}}');
/***/ }),

dist/upload/index.js vendored

@@ -824,7 +824,7 @@ __exportStar(__nccwpck_require__(49773), exports);
"use strict";
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.ArtifactService = exports.DeleteArtifactResponse = exports.DeleteArtifactRequest = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = exports.FinalizeMigratedArtifactResponse = exports.FinalizeMigratedArtifactRequest = exports.MigrateArtifactResponse = exports.MigrateArtifactRequest = void 0;
exports.ArtifactService = exports.DeleteArtifactResponse = exports.DeleteArtifactRequest = exports.GetSignedArtifactURLResponse = exports.GetSignedArtifactURLRequest = exports.ListArtifactsResponse_MonolithArtifact = exports.ListArtifactsResponse = exports.ListArtifactsRequest = exports.FinalizeArtifactResponse = exports.FinalizeArtifactRequest = exports.CreateArtifactResponse = exports.CreateArtifactRequest = void 0;
// @generated by protobuf-ts 2.9.1 with parameter long_type_string,client_none,generate_dependencies
// @generated from protobuf file "results/api/v1/artifact.proto" (package "github.actions.results.api.v1", syntax proto3)
// tslint:disable
@@ -838,236 +838,6 @@ const wrappers_1 = __nccwpck_require__(8626);
const wrappers_2 = __nccwpck_require__(8626);
const timestamp_1 = __nccwpck_require__(54622);
// @generated message type with reflection information, may provide speed optimized methods
class MigrateArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.MigrateArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "expires_at", kind: "message", T: () => timestamp_1.Timestamp }
]);
}
create(value) {
const message = { workflowRunBackendId: "", name: "" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string name */ 2:
message.name = reader.string();
break;
case /* google.protobuf.Timestamp expires_at */ 3:
message.expiresAt = timestamp_1.Timestamp.internalBinaryRead(reader, reader.uint32(), options, message.expiresAt);
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, runtime_1.WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string name = 2; */
if (message.name !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.name);
/* google.protobuf.Timestamp expires_at = 3; */
if (message.expiresAt)
timestamp_1.Timestamp.internalBinaryWrite(message.expiresAt, writer.tag(3, runtime_1.WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactRequest
*/
exports.MigrateArtifactRequest = new MigrateArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class MigrateArtifactResponse$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.MigrateArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "signed_upload_url", kind: "scalar", T: 9 /*ScalarType.STRING*/ }
]);
}
create(value) {
const message = { ok: false, signedUploadUrl: "" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* string signed_upload_url */ 2:
message.signedUploadUrl = reader.string();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, runtime_1.WireType.Varint).bool(message.ok);
/* string signed_upload_url = 2; */
if (message.signedUploadUrl !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.signedUploadUrl);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.MigrateArtifactResponse
*/
exports.MigrateArtifactResponse = new MigrateArtifactResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class FinalizeMigratedArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.FinalizeMigratedArtifactRequest", [
{ no: 1, name: "workflow_run_backend_id", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 2, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 3, name: "size", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value) {
const message = { workflowRunBackendId: "", name: "", size: "0" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* string workflow_run_backend_id */ 1:
message.workflowRunBackendId = reader.string();
break;
case /* string name */ 2:
message.name = reader.string();
break;
case /* int64 size */ 3:
message.size = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* string workflow_run_backend_id = 1; */
if (message.workflowRunBackendId !== "")
writer.tag(1, runtime_1.WireType.LengthDelimited).string(message.workflowRunBackendId);
/* string name = 2; */
if (message.name !== "")
writer.tag(2, runtime_1.WireType.LengthDelimited).string(message.name);
/* int64 size = 3; */
if (message.size !== "0")
writer.tag(3, runtime_1.WireType.Varint).int64(message.size);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactRequest
*/
exports.FinalizeMigratedArtifactRequest = new FinalizeMigratedArtifactRequest$Type();
// @generated message type with reflection information, may provide speed optimized methods
class FinalizeMigratedArtifactResponse$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.FinalizeMigratedArtifactResponse", [
{ no: 1, name: "ok", kind: "scalar", T: 8 /*ScalarType.BOOL*/ },
{ no: 2, name: "artifact_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ }
]);
}
create(value) {
const message = { ok: false, artifactId: "0" };
globalThis.Object.defineProperty(message, runtime_4.MESSAGE_TYPE, { enumerable: false, value: this });
if (value !== undefined)
(0, runtime_3.reflectionMergePartial)(this, message, value);
return message;
}
internalBinaryRead(reader, length, options, target) {
let message = target !== null && target !== void 0 ? target : this.create(), end = reader.pos + length;
while (reader.pos < end) {
let [fieldNo, wireType] = reader.tag();
switch (fieldNo) {
case /* bool ok */ 1:
message.ok = reader.bool();
break;
case /* int64 artifact_id */ 2:
message.artifactId = reader.int64().toString();
break;
default:
let u = options.readUnknownField;
if (u === "throw")
throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
let d = reader.skip(wireType);
if (u !== false)
(u === true ? runtime_2.UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
}
}
return message;
}
internalBinaryWrite(message, writer, options) {
/* bool ok = 1; */
if (message.ok !== false)
writer.tag(1, runtime_1.WireType.Varint).bool(message.ok);
/* int64 artifact_id = 2; */
if (message.artifactId !== "0")
writer.tag(2, runtime_1.WireType.Varint).int64(message.artifactId);
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
return writer;
}
}
/**
* @generated MessageType for protobuf message github.actions.results.api.v1.FinalizeMigratedArtifactResponse
*/
exports.FinalizeMigratedArtifactResponse = new FinalizeMigratedArtifactResponse$Type();
// @generated message type with reflection information, may provide speed optimized methods
class CreateArtifactRequest$Type extends runtime_5.MessageType {
constructor() {
super("github.actions.results.api.v1.CreateArtifactRequest", [
@@ -1449,8 +1219,7 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
{ no: 3, name: "database_id", kind: "scalar", T: 3 /*ScalarType.INT64*/ },
{ no: 4, name: "name", kind: "scalar", T: 9 /*ScalarType.STRING*/ },
{ no: 5, name: "size", kind: "scalar", T: 3 /*ScalarType.INT64*/ },
{ no: 6, name: "created_at", kind: "message", T: () => timestamp_1.Timestamp },
{ no: 7, name: "digest", kind: "message", T: () => wrappers_2.StringValue }
{ no: 6, name: "created_at", kind: "message", T: () => timestamp_1.Timestamp }
]);
}
create(value) {
@@ -1483,9 +1252,6 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
case /* google.protobuf.Timestamp created_at */ 6:
message.createdAt = timestamp_1.Timestamp.internalBinaryRead(reader, reader.uint32(), options, message.createdAt);
break;
case /* google.protobuf.StringValue digest */ 7:
message.digest = wrappers_2.StringValue.internalBinaryRead(reader, reader.uint32(), options, message.digest);
break;
default:
let u = options.readUnknownField;
if (u === "throw")
@@ -1516,9 +1282,6 @@ class ListArtifactsResponse_MonolithArtifact$Type extends runtime_5.MessageType
/* google.protobuf.Timestamp created_at = 6; */
if (message.createdAt)
timestamp_1.Timestamp.internalBinaryWrite(message.createdAt, writer.tag(6, runtime_1.WireType.LengthDelimited).fork(), options).join();
/* google.protobuf.StringValue digest = 7; */
if (message.digest)
wrappers_2.StringValue.internalBinaryWrite(message.digest, writer.tag(7, runtime_1.WireType.LengthDelimited).fork(), options).join();
let u = options.writeUnknownFields;
if (u !== false)
(u == true ? runtime_2.UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
@@ -1760,9 +1523,7 @@ exports.ArtifactService = new runtime_rpc_1.ServiceType("github.actions.results.
{ name: "FinalizeArtifact", options: {}, I: exports.FinalizeArtifactRequest, O: exports.FinalizeArtifactResponse },
{ name: "ListArtifacts", options: {}, I: exports.ListArtifactsRequest, O: exports.ListArtifactsResponse },
{ name: "GetSignedArtifactURL", options: {}, I: exports.GetSignedArtifactURLRequest, O: exports.GetSignedArtifactURLResponse },
{ name: "DeleteArtifact", options: {}, I: exports.DeleteArtifactRequest, O: exports.DeleteArtifactResponse },
{ name: "MigrateArtifact", options: {}, I: exports.MigrateArtifactRequest, O: exports.MigrateArtifactResponse },
{ name: "FinalizeMigratedArtifact", options: {}, I: exports.FinalizeMigratedArtifactRequest, O: exports.FinalizeMigratedArtifactResponse }
{ name: "DeleteArtifact", options: {}, I: exports.DeleteArtifactRequest, O: exports.DeleteArtifactResponse }
]);
//# sourceMappingURL=artifact.js.map
@@ -2159,8 +1920,6 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.downloadArtifactInternal = exports.downloadArtifactPublic = exports.streamExtractExternal = void 0;
const promises_1 = __importDefault(__nccwpck_require__(73292));
const crypto = __importStar(__nccwpck_require__(6113));
const stream = __importStar(__nccwpck_require__(12781));
const github = __importStar(__nccwpck_require__(21260));
const core = __importStar(__nccwpck_require__(42186));
const httpClient = __importStar(__nccwpck_require__(96255));
@@ -2197,7 +1956,8 @@ function streamExtract(url, directory) {
let retryCount = 0;
while (retryCount < 5) {
try {
return yield streamExtractExternal(url, directory);
yield streamExtractExternal(url, directory);
return;
}
catch (error) {
retryCount++;
@@ -2217,18 +1977,12 @@ function streamExtractExternal(url, directory) {
throw new Error(`Unexpected HTTP response from blob storage: ${response.message.statusCode} ${response.message.statusMessage}`);
}
const timeout = 30 * 1000; // 30 seconds
let sha256Digest = undefined;
return new Promise((resolve, reject) => {
const timerFn = () => {
response.message.destroy(new Error(`Blob storage chunk did not respond in ${timeout}ms`));
};
const timer = setTimeout(timerFn, timeout);
const hashStream = crypto.createHash('sha256').setEncoding('hex');
const passThrough = new stream.PassThrough();
response.message.pipe(passThrough);
passThrough.pipe(hashStream);
const extractStream = passThrough;
extractStream
response.message
.on('data', () => {
timer.refresh();
})
@@ -2240,12 +1994,7 @@ function streamExtractExternal(url, directory) {
.pipe(unzip_stream_1.default.Extract({ path: directory }))
.on('close', () => {
clearTimeout(timer);
if (hashStream) {
hashStream.end();
sha256Digest = hashStream.read();
core.info(`SHA256 digest of downloaded artifact is ${sha256Digest}`);
}
resolve({ sha256Digest: `sha256:${sha256Digest}` });
resolve();
})
.on('error', (error) => {
reject(error);
@@ -2258,7 +2007,6 @@ function downloadArtifactPublic(artifactId, repositoryOwner, repositoryName, tok
return __awaiter(this, void 0, void 0, function* () {
const downloadPath = yield resolveOrCreateDirectory(options === null || options === void 0 ? void 0 : options.path);
const api = github.getOctokit(token);
let digestMismatch = false;
core.info(`Downloading artifact '${artifactId}' from '${repositoryOwner}/${repositoryName}'`);
const { headers, status } = yield api.rest.actions.downloadArtifact({
owner: repositoryOwner,
@@ -2279,20 +2027,13 @@ function downloadArtifactPublic(artifactId, repositoryOwner, repositoryName, tok
core.info(`Redirecting to blob download url: ${scrubQueryParameters(location)}`);
try {
core.info(`Starting download of artifact to: ${downloadPath}`);
const extractResponse = yield streamExtract(location, downloadPath);
yield streamExtract(location, downloadPath);
core.info(`Artifact download completed successfully.`);
if (options === null || options === void 0 ? void 0 : options.expectedHash) {
if ((options === null || options === void 0 ? void 0 : options.expectedHash) !== extractResponse.sha256Digest) {
digestMismatch = true;
core.debug(`Computed digest: ${extractResponse.sha256Digest}`);
core.debug(`Expected digest: ${options.expectedHash}`);
}
}
}
catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`);
}
return { downloadPath, digestMismatch };
return { downloadPath };
});
}
exports.downloadArtifactPublic = downloadArtifactPublic;
@@ -2300,7 +2041,6 @@ function downloadArtifactInternal(artifactId, options) {
return __awaiter(this, void 0, void 0, function* () {
const downloadPath = yield resolveOrCreateDirectory(options === null || options === void 0 ? void 0 : options.path);
const artifactClient = (0, artifact_twirp_client_1.internalArtifactTwirpClient)();
let digestMismatch = false;
const { workflowRunBackendId, workflowJobRunBackendId } = (0, util_1.getBackendIdsFromToken)();
const listReq = {
workflowRunBackendId,
@@ -2323,20 +2063,13 @@ function downloadArtifactInternal(artifactId, options) {
core.info(`Redirecting to blob download url: ${scrubQueryParameters(signedUrl)}`);
try {
core.info(`Starting download of artifact to: ${downloadPath}`);
const extractResponse = yield streamExtract(signedUrl, downloadPath);
yield streamExtract(signedUrl, downloadPath);
core.info(`Artifact download completed successfully.`);
if (options === null || options === void 0 ? void 0 : options.expectedHash) {
if ((options === null || options === void 0 ? void 0 : options.expectedHash) !== extractResponse.sha256Digest) {
digestMismatch = true;
core.debug(`Computed digest: ${extractResponse.sha256Digest}`);
core.debug(`Expected digest: ${options.expectedHash}`);
}
}
}
catch (error) {
throw new Error(`Unable to download and extract artifact: ${error.message}`);
}
return { downloadPath, digestMismatch };
return { downloadPath };
});
}
exports.downloadArtifactInternal = downloadArtifactInternal;
@@ -2442,17 +2175,13 @@ function getArtifactPublic(artifactName, workflowRunId, repositoryOwner, reposit
name: artifact.name,
id: artifact.id,
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: artifact.digest
createdAt: artifact.created_at ? new Date(artifact.created_at) : undefined
}
};
});
}
exports.getArtifactPublic = getArtifactPublic;
function getArtifactInternal(artifactName) {
var _a;
return __awaiter(this, void 0, void 0, function* () {
const artifactClient = (0, artifact_twirp_client_1.internalArtifactTwirpClient)();
const { workflowRunBackendId, workflowJobRunBackendId } = (0, util_1.getBackendIdsFromToken)();
@@ -2479,8 +2208,7 @@ function getArtifactInternal(artifactName) {
size: Number(artifact.size),
createdAt: artifact.createdAt
? generated_1.Timestamp.toDate(artifact.createdAt)
: undefined,
digest: (_a = artifact.digest) === null || _a === void 0 ? void 0 : _a.value
: undefined
}
};
});
@@ -2534,7 +2262,7 @@ function listArtifactsPublic(workflowRunId, repositoryOwner, repositoryName, tok
};
const github = (0, github_1.getOctokit)(token, opts, plugin_retry_1.retry, plugin_request_log_1.requestLog);
let currentPageNumber = 1;
const { data: listArtifactResponse } = yield github.request('GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts', {
const { data: listArtifactResponse } = yield github.rest.actions.listWorkflowRunArtifacts({
owner: repositoryOwner,
repo: repositoryName,
run_id: workflowRunId,
@@ -2553,18 +2281,14 @@ function listArtifactsPublic(workflowRunId, repositoryOwner, repositoryName, tok
name: artifact.name,
id: artifact.id,
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: artifact.digest
createdAt: artifact.created_at ? new Date(artifact.created_at) : undefined
});
}
// Move to the next page
currentPageNumber++;
// Iterate over any remaining pages
for (currentPageNumber; currentPageNumber < numberOfPages; currentPageNumber++) {
currentPageNumber++;
(0, core_1.debug)(`Fetching page ${currentPageNumber} of artifact list`);
const { data: listArtifactResponse } = yield github.request('GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts', {
const { data: listArtifactResponse } = yield github.rest.actions.listWorkflowRunArtifacts({
owner: repositoryOwner,
repo: repositoryName,
run_id: workflowRunId,
@@ -2578,8 +2302,7 @@ function listArtifactsPublic(workflowRunId, repositoryOwner, repositoryName, tok
size: artifact.size_in_bytes,
createdAt: artifact.created_at
? new Date(artifact.created_at)
: undefined,
digest: artifact.digest
: undefined
});
}
}
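The list calls in this file also flip between two equivalent Octokit styles: 2.2.2 uses the typed github.rest.actions.listWorkflowRunArtifacts helper, while 2.3.2 issues the same GET route through the generic github.request form. Side by side (an illustrative sketch; token, owner, repo, and run_id are placeholders):

const { getOctokit } = require('@actions/github');

async function listArtifacts(token, owner, repo, run_id) {
  const github = getOctokit(token);
  // 2.2.2 style: typed REST helper.
  const viaRest = await github.rest.actions.listWorkflowRunArtifacts(
    { owner, repo, run_id, per_page: 100, page: 1 }
  );
  // 2.3.2 style: the same endpoint via the generic route string.
  const viaRequest = await github.request(
    'GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts',
    { owner, repo, run_id, per_page: 100, page: 1 }
  );
  // Both responses carry the same artifacts payload.
  return { viaRest: viaRest.data.artifacts, viaRequest: viaRequest.data.artifacts };
}

Both forms return { data } with the same artifacts array, so the surrounding destructuring and mapping code works with either.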
@@ -2602,18 +2325,14 @@ function listArtifactsInternal(latest = false) {
workflowJobRunBackendId
};
const res = yield artifactClient.ListArtifacts(req);
let artifacts = res.artifacts.map(artifact => {
var _a;
return ({
let artifacts = res.artifacts.map(artifact => ({
name: artifact.name,
id: Number(artifact.databaseId),
size: Number(artifact.size),
createdAt: artifact.createdAt
? generated_1.Timestamp.toDate(artifact.createdAt)
: undefined,
digest: (_a = artifact.digest) === null || _a === void 0 ? void 0 : _a.value
});
});
: undefined
}));
if (latest) {
artifacts = filterLatest(artifacts);
}
@@ -2725,7 +2444,6 @@ const generated_1 = __nccwpck_require__(49960);
const config_1 = __nccwpck_require__(74610);
const user_agent_1 = __nccwpck_require__(85164);
const errors_1 = __nccwpck_require__(38182);
const util_1 = __nccwpck_require__(63062);
class ArtifactHttpClient {
constructor(userAgent, maxAttempts, baseRetryIntervalMilliseconds, retryMultiplier) {
this.maxAttempts = 5;
@@ -2778,7 +2496,6 @@ class ArtifactHttpClient {
(0, core_1.debug)(`[Response] - ${response.message.statusCode}`);
(0, core_1.debug)(`Headers: ${JSON.stringify(response.message.headers, null, 2)}`);
const body = JSON.parse(rawBody);
(0, util_1.maskSecretUrls)(body);
(0, core_1.debug)(`Body: ${JSON.stringify(body, null, 2)}`);
if (this.isSuccessStatusCode(statusCode)) {
return { response, body };
@@ -3095,11 +2812,10 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.maskSecretUrls = exports.maskSigUrl = exports.getBackendIdsFromToken = void 0;
exports.getBackendIdsFromToken = void 0;
const core = __importStar(__nccwpck_require__(42186));
const config_1 = __nccwpck_require__(74610);
const jwt_decode_1 = __importDefault(__nccwpck_require__(84329));
const core_1 = __nccwpck_require__(42186);
const InvalidJwtError = new Error('Failed to get backend IDs: The provided JWT token is invalid and/or missing claims');
// uses the JWT token claims to get the
// workflow run and workflow job run backend ids
@@ -3148,74 +2864,6 @@ function getBackendIdsFromToken() {
throw InvalidJwtError;
}
exports.getBackendIdsFromToken = getBackendIdsFromToken;
/**
* Masks the `sig` parameter in a URL and sets it as a secret.
*
* @param url - The URL containing the signature parameter to mask
* @remarks
* This function attempts to parse the provided URL and identify the 'sig' query parameter.
* If found, it registers both the raw and URL-encoded signature values as secrets using
* the Actions `setSecret` API, which prevents them from being displayed in logs.
*
* The function handles errors gracefully if URL parsing fails, logging them as debug messages.
*
* @example
* ```typescript
* // Mask a signature in an Azure SAS token URL
* maskSigUrl('https://example.blob.core.windows.net/container/file.txt?sig=abc123&se=2023-01-01');
* ```
*/
function maskSigUrl(url) {
if (!url)
return;
try {
const parsedUrl = new URL(url);
const signature = parsedUrl.searchParams.get('sig');
if (signature) {
(0, core_1.setSecret)(signature);
(0, core_1.setSecret)(encodeURIComponent(signature));
}
}
catch (error) {
(0, core_1.debug)(`Failed to parse URL: ${url} ${error instanceof Error ? error.message : String(error)}`);
}
}
exports.maskSigUrl = maskSigUrl;
/**
* Masks sensitive information in URLs containing signature parameters.
* Currently supports masking 'sig' parameters in the 'signed_upload_url'
* and 'signed_download_url' properties of the provided object.
*
 * @param body - The response body object that may contain signed URLs
* @remarks
* This function extracts URLs from the object properties and calls maskSigUrl
* on each one to redact sensitive signature information. The function doesn't
* modify the original object; it only marks the signatures as secrets for
* logging purposes.
*
* @example
* ```typescript
* const responseBody = {
* signed_upload_url: 'https://example.com?sig=abc123',
* signed_download_url: 'https://example.com?sig=def456'
* };
* maskSecretUrls(responseBody);
* ```
*/
function maskSecretUrls(body) {
if (typeof body !== 'object' || body === null) {
(0, core_1.debug)('body is not an object or is null');
return;
}
if ('signed_upload_url' in body &&
typeof body.signed_upload_url === 'string') {
maskSigUrl(body.signed_upload_url);
}
if ('signed_url' in body && typeof body.signed_url === 'string') {
maskSigUrl(body.signed_url);
}
}
exports.maskSecretUrls = maskSecretUrls;
//# sourceMappingURL=util.js.map
/***/ }),
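The block above also strips the sig-masking utilities introduced in 2.3.2, together with their call site in ArtifactHttpClient (the maskSecretUrls(body) line removed earlier). Their intended use, in miniature (a hedged sketch; setSecret and debug are real @actions/core APIs, the sample body is illustrative):

const core = require('@actions/core');

// Register a URL's SAS `sig` value as a secret so the runner redacts it
// from later log output (the pattern the removed helpers implement).
function maskSigUrl(url) {
  if (!url) return;
  try {
    const sig = new URL(url).searchParams.get('sig');
    if (sig) {
      core.setSecret(sig);                     // raw value
      core.setSecret(encodeURIComponent(sig)); // URL-encoded form
    }
  } catch {
    // Unparseable URLs are skipped, as in the removed implementation.
  }
}

// Illustrative response body; 2.3.2 masks it before the client debug-logs it.
const body = { signed_upload_url: 'https://example.blob.core.windows.net/c/a.zip?sig=abc123' };
maskSigUrl(body.signed_upload_url);
core.debug(`Body: ${JSON.stringify(body, null, 2)}`);

Without this step, the signed blob URLs logged at debug level would include live SAS signatures.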
@@ -3322,7 +2970,7 @@ function uploadZipToBlobStorage(authenticatedUploadURL, zipUploadStream) {
core.info('Finished uploading artifact content to blob storage!');
hashStream.end();
sha256Hash = hashStream.read();
core.info(`SHA256 digest of uploaded artifact zip is ${sha256Hash}`);
core.info(`SHA256 hash of uploaded artifact zip is ${sha256Hash}`);
if (uploadByteCount === 0) {
core.warning(`No data was uploaded to blob storage. Reported upload byte count is 0.`);
}
@@ -135845,7 +135493,7 @@ module.exports = index;
/***/ ((module) => {
"use strict";
module.exports = JSON.parse('{"name":"@actions/artifact","version":"2.3.2","preview":true,"description":"Actions artifact lib","keywords":["github","actions","artifact"],"homepage":"https://github.com/actions/toolkit/tree/main/packages/artifact","license":"MIT","main":"lib/artifact.js","types":"lib/artifact.d.ts","directories":{"lib":"lib","test":"__tests__"},"files":["lib","!.DS_Store"],"publishConfig":{"access":"public"},"repository":{"type":"git","url":"git+https://github.com/actions/toolkit.git","directory":"packages/artifact"},"scripts":{"audit-moderate":"npm install && npm audit --json --audit-level=moderate > audit.json","test":"cd ../../ && npm run test ./packages/artifact","bootstrap":"cd ../../ && npm run bootstrap","tsc-run":"tsc","tsc":"npm run bootstrap && npm run tsc-run","gen:docs":"typedoc --plugin typedoc-plugin-markdown --out docs/generated src/artifact.ts --githubPages false --readme none"},"bugs":{"url":"https://github.com/actions/toolkit/issues"},"dependencies":{"@actions/core":"^1.10.0","@actions/github":"^5.1.1","@actions/http-client":"^2.1.0","@azure/storage-blob":"^12.15.0","@octokit/core":"^3.5.1","@octokit/plugin-request-log":"^1.0.4","@octokit/plugin-retry":"^3.0.9","@octokit/request-error":"^5.0.0","@protobuf-ts/plugin":"^2.2.3-alpha.1","archiver":"^7.0.1","jwt-decode":"^3.1.2","unzip-stream":"^0.3.1"},"devDependencies":{"@types/archiver":"^5.3.2","@types/unzip-stream":"^0.3.4","typedoc":"^0.25.4","typedoc-plugin-markdown":"^3.17.1","typescript":"^5.2.2"}}');
module.exports = JSON.parse('{"name":"@actions/artifact","version":"2.2.2","preview":true,"description":"Actions artifact lib","keywords":["github","actions","artifact"],"homepage":"https://github.com/actions/toolkit/tree/main/packages/artifact","license":"MIT","main":"lib/artifact.js","types":"lib/artifact.d.ts","directories":{"lib":"lib","test":"__tests__"},"files":["lib","!.DS_Store"],"publishConfig":{"access":"public"},"repository":{"type":"git","url":"git+https://github.com/actions/toolkit.git","directory":"packages/artifact"},"scripts":{"audit-moderate":"npm install && npm audit --json --audit-level=moderate > audit.json","test":"cd ../../ && npm run test ./packages/artifact","bootstrap":"cd ../../ && npm run bootstrap","tsc-run":"tsc","tsc":"npm run bootstrap && npm run tsc-run","gen:docs":"typedoc --plugin typedoc-plugin-markdown --out docs/generated src/artifact.ts --githubPages false --readme none"},"bugs":{"url":"https://github.com/actions/toolkit/issues"},"dependencies":{"@actions/core":"^1.10.0","@actions/github":"^5.1.1","@actions/http-client":"^2.1.0","@azure/storage-blob":"^12.15.0","@octokit/core":"^3.5.1","@octokit/plugin-request-log":"^1.0.4","@octokit/plugin-retry":"^3.0.9","@octokit/request-error":"^5.0.0","@protobuf-ts/plugin":"^2.2.3-alpha.1","archiver":"^7.0.1","jwt-decode":"^3.1.2","unzip-stream":"^0.3.1"},"devDependencies":{"@types/archiver":"^5.3.2","@types/unzip-stream":"^0.3.4","typedoc":"^0.25.4","typedoc-plugin-markdown":"^3.17.1","typescript":"^5.2.2"}}');
/***/ }),

package-lock.json

@@ -1,15 +1,15 @@
{
"name": "upload-artifact",
"version": "4.6.2",
"version": "4.6.1",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "upload-artifact",
"version": "4.6.2",
"version": "4.6.1",
"license": "MIT",
"dependencies": {
"@actions/artifact": "^2.3.2",
"@actions/artifact": "^2.2.2",
"@actions/core": "^1.11.1",
"@actions/github": "^6.0.0",
"@actions/glob": "^0.5.0",
@@ -34,10 +34,9 @@
}
},
"node_modules/@actions/artifact": {
"version": "2.3.2",
"resolved": "https://registry.npmjs.org/@actions/artifact/-/artifact-2.3.2.tgz",
"integrity": "sha512-uX2Mr5KEPcwnzqa0Og9wOTEKIae6C/yx9P/m8bIglzCS5nZDkcQC/zRWjjoEsyVecL6oQpBx5BuqQj/yuVm0gw==",
"license": "MIT",
"version": "2.2.2",
"resolved": "https://registry.npmjs.org/@actions/artifact/-/artifact-2.2.2.tgz",
"integrity": "sha512-UtS1kcINiPRkI3/hDKkO/XdrtKo89kn8s81J67QNBU6RRMWSSXrrfCCbQVThuxcdW2boOLv51NVCEKyo954A2A==",
"dependencies": {
"@actions/core": "^1.10.0",
"@actions/github": "^5.1.1",
@@ -7767,9 +7766,9 @@
},
"dependencies": {
"@actions/artifact": {
"version": "2.3.2",
"resolved": "https://registry.npmjs.org/@actions/artifact/-/artifact-2.3.2.tgz",
"integrity": "sha512-uX2Mr5KEPcwnzqa0Og9wOTEKIae6C/yx9P/m8bIglzCS5nZDkcQC/zRWjjoEsyVecL6oQpBx5BuqQj/yuVm0gw==",
"version": "2.2.2",
"resolved": "https://registry.npmjs.org/@actions/artifact/-/artifact-2.2.2.tgz",
"integrity": "sha512-UtS1kcINiPRkI3/hDKkO/XdrtKo89kn8s81J67QNBU6RRMWSSXrrfCCbQVThuxcdW2boOLv51NVCEKyo954A2A==",
"requires": {
"@actions/core": "^1.10.0",
"@actions/github": "^5.1.1",

package.json

@@ -1,6 +1,6 @@
{
"name": "upload-artifact",
"version": "4.6.2",
"version": "4.6.1",
"description": "Upload an Actions Artifact in a workflow run",
"main": "dist/upload/index.js",
"scripts": {
@@ -29,7 +29,7 @@
},
"homepage": "https://github.com/actions/upload-artifact#readme",
"dependencies": {
"@actions/artifact": "^2.3.2",
"@actions/artifact": "^2.2.2",
"@actions/core": "^1.11.1",
"@actions/github": "^6.0.0",
"@actions/glob": "^0.5.0",