---
sidebar_position: 8
---

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

# Get object

Gets the contents of the object from the bucket

`Function GetObject(Val Name, Val Bucket, Val BasicData, Val Version = "", Val Headers = Undefined, Val SavePath = "") Export`

| Parameter | CLI option | Type | Required | Description |
|-|-|-|-|-|
| Name | --name | String | ✔ | Name of the object in the bucket |
| Bucket | --bucket | String | ✔ | Name of the bucket in which the object is stored |
| BasicData | --basic | Structure Of KeyAndValue | ✔ | Basic request data. See GetBasicDataStructure |
| Version | --ver | String | ✖ | Token for receiving a specific version of an object |
| Headers | --headers | Map Of KeyAndValue | ✖ | Additional request headers, if necessary |
| SavePath | --out | String | ✖ | Path for writing the file directly to disk |

Returns: BinaryData, String - object content, or the file path if a save path is specified
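The optional `Version` and `Headers` parameters are not used in the example below. Here is a minimal sketch of passing them, assuming a versioned bucket; the version token and header are placeholder values, not taken from the library documentation:

```bsl
// Sketch only: the version token and header below are hypothetical placeholders

URL       = "storage-155.s3hoster.by";
AccessKey = "BRN5RKJE67...";
SecretKey = "NNhv+i9PrytpT8Tu0C1N...";
Region    = "BTC";

BasicData = OPI_S3.GetBasicDataStructure(URL, AccessKey, SecretKey, Region);

// Token of the specific object version to receive (hypothetical value)
Version = "3HL4kqtJlcpXroDTDmJ...";

// Additional request headers, if necessary (hypothetical example)
Headers = New Map;
Headers.Insert("x-amz-request-payer", "requester");

Result = OPI_S3.GetObject("picture.jpg", "opi-gpbucket3", BasicData, Version, Headers);
```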
:::tip
Method at AWS documentation: [GetObject](https://docs.aws.amazon.com/AmazonS3/latest/API/API_GetObject.html)

You can use the `ChunkSize` field in the basic data to specify the minimum file size and chunk size for a chunked download. For example, `ChunkSize=X` means that all files larger than `X` (in bytes) will be downloaded in chunks, where each chunk is of size `X`. Chunked download is used for large files.

The default `ChunkSize` is 20971520 bytes (20 MB)
:::
```bsl title="1C:Enterprise/OneScript code example"
URL       = "storage-155.s3hoster.by";
AccessKey = "BRN5RKJE67...";
SecretKey = "NNhv+i9PrytpT8Tu0C1N...";
Region    = "BTC";

BasicData = OPI_S3.GetBasicDataStructure(URL, AccessKey, SecretKey, Region);

Name   = "picture.jpg";
Bucket = "opi-gpbucket3";

Result = OPI_S3.GetObject(Name, Bucket, BasicData);

TempFile = GetTempFileName();

BasicData.Insert("ChunkSize", 200000);

Result = OPI_S3.GetObject(Name, Bucket, BasicData, , , TempFile);
```

```bash
# JSON data can also be passed as a path to a .json file
oint s3 GetObject \
    --name "bigfile.exe" \
    --bucket "newbucket2" \
    --basic "{'URL':'storage-155.s3hoster.by','AccessKey':'***','SecretKey':'***','Region':'BTC','Service':'s3','ChunkSize':200000}"
```

```batch
:: JSON data can also be passed as a path to a .json file
oint s3 GetObject ^
    --name "bigfile.exe" ^
    --bucket "newbucket2" ^
    --basic "{'URL':'storage-155.s3hoster.by','AccessKey':'***','SecretKey':'***','Region':'BTC','Service':'s3','ChunkSize':200000}"
```

```json title="Result"
NOT JSON: FF D8 FF E1 54 C1 45 78 69 66 00 00 49 49 2A 00 08 00 00 00 0B 00 0E 01 02 00 20 00 00 00 92 00 00 00 0F 01 02 00 05 00 00 00 B2 00 00 00 10 01 02 00 07 00 00 00 B8 00 00 00 12 01 03 00 01 00…
```
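Depending on whether a save path is passed, the return value is either `BinaryData` or a file path string. The handling below is an illustrative sketch, not part of the library:

```bsl
Result = OPI_S3.GetObject(Name, Bucket, BasicData);

If TypeOf(Result) = Type("BinaryData") Then
    // No save path was specified: Result is the object content
    Result.Write(GetTempFileName("jpg"));
Else
    // A save path was specified: Result is the path of the saved file
    Message(Result);
EndIf;
```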