Learn how to automate actions on Object Storage with Terraform, an open-source tool for automating infrastructure provisioning. The following actions will be automated:
- Object Storage user creation
- bucket creation
- file copy into the bucket
- S3™*-compatible policy creation and assignment
Requirements
- Have the Terraform command line installed (see this tutorial from HashiCorp, the company behind Terraform)
- Have the git command line installed
- You will need to set up an account to interact with the OVHcloud API (see this tutorial). Depending on the permissions you need (HTTP verbs PUT/GET/POST/DELETE), enter the route
/cloud/project/{serviceName}/region/{regionName}/storage/
to target Object Storage, where {serviceName} corresponds to your Public Cloud project ID and {regionName} corresponds to the region where your resources will be located. From the application keys created, you will need to export the four environment variables:
- A Public Cloud project, with the ID exported as the variable
TF_VAR_OVH_PUBLIC_CLOUD_PROJECT_ID
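For example, assuming the standard variable names read by the OVH Terraform provider (the values shown are placeholders to replace with your own keys):

```shell
# OVHcloud API credentials (placeholder values - use your own application keys)
export OVH_ENDPOINT="ovh-eu"
export OVH_APPLICATION_KEY="your_application_key"
export OVH_APPLICATION_SECRET="your_application_secret"
export OVH_CONSUMER_KEY="your_consumer_key"

# Public Cloud project ID, consumed by Terraform as a TF_VAR_* input variable
export TF_VAR_OVH_PUBLIC_CLOUD_PROJECT_ID="your_project_id"
```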
If you do not have your AWS CLI configured, you should set dummy values as follows. This is due to a limitation in Terraform's dependency graph for provider initialization (see this long-standing Terraform issue):
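For example (any non-empty strings work here; the real S3 credentials are generated by the ovh provider during the run):

```shell
# Dummy values so the AWS provider can initialize before the real
# credentials exist (they are created by the ovh provider at apply time)
export AWS_ACCESS_KEY_ID="no_need_to_define_an_access_key"
export AWS_SECRET_ACCESS_KEY="no_need_to_define_a_secret_key"
```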
Instructions
Manage an Object Storage bucket with Terraform @OVHcloud
Initialize
Clone the repository:
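For example (the URL and directory below are placeholders for the example repository referenced by this guide):

```shell
git clone <repository-url>
cd <repository-directory>
```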
Initialize Terraform:
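This downloads the providers declared in the configuration:

```shell
terraform init
```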
Plan
With the following command, you can see the actions that Terraform is going to perform:
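```shell
terraform plan
```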
Now let's have a look at the content of the main.tf file:
- The Variables block defines the region and S3™ endpoint that are used to create the bucket. You can update them according to your needs (see our Endpoints and Object Storage Geoavailability guide).
- The Providers block defines two providers: the OVHcloud one and the HashiCorp AWS one. The first is necessary to create the user whose identity/credentials will be used by the latter.
- The User/Credential block defines the user & credentials that are visible in the Settings > Users & Roles tab. They are needed to configure the HashiCorp AWS provider.
- The Bucket block defines the bucket itself.
- The Output block defines the access & secret keys that may be useful for CLI usage.
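As a rough sketch, the structure described above resembles the following (resource names, the bucket name, and default values are illustrative, not the exact contents of the repository's main.tf):

```hcl
variable "OVH_PUBLIC_CLOUD_PROJECT_ID" { type = string }

# Illustrative region/endpoint values - adjust to your region
variable "region" {
  type    = string
  default = "GRA"
}

variable "s3_endpoint" {
  type    = string
  default = "https://s3.gra.io.cloud.ovh.net/"
}

provider "ovh" {
  endpoint = "ovh-eu"
}

# User and S3 credential created via the ovh provider
resource "ovh_cloud_project_user" "s3_user" {
  service_name = var.OVH_PUBLIC_CLOUD_PROJECT_ID
  description  = "S3 user created by Terraform"
  role_name    = "objectstore_operator"
}

resource "ovh_cloud_project_user_s3_credential" "s3_cred" {
  service_name = var.OVH_PUBLIC_CLOUD_PROJECT_ID
  user_id      = ovh_cloud_project_user.s3_user.id
}

# AWS provider pointed at the OVHcloud S3-compatible endpoint,
# authenticated with the credential generated above
provider "aws" {
  region     = var.region
  access_key = ovh_cloud_project_user_s3_credential.s3_cred.access_key_id
  secret_key = ovh_cloud_project_user_s3_credential.s3_cred.secret_access_key

  endpoints {
    s3 = var.s3_endpoint
  }

  skip_credentials_validation = true
  skip_requesting_account_id  = true
  skip_region_validation      = true
}

resource "aws_s3_bucket" "bucket" {
  bucket = "my-terraform-bucket" # hypothetical name
}

output "access_key_id" {
  value = ovh_cloud_project_user_s3_credential.s3_cred.access_key_id
}

output "secret_access_key" {
  value     = ovh_cloud_project_user_s3_credential.s3_cred.secret_access_key
  sensitive = true
}
```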
Run
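Apply the configuration to create the resources (Terraform asks for confirmation before proceeding):

```shell
terraform apply
```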
Now you can go to the Console and check the "Object Storage" tab. Your bucket has been created.
Destroy
With the following command, you will return to your original state: Terraform will destroy all the resources that were previously created.
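```shell
terraform destroy
```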
- This script does not follow Terraform best practices for splitting the project into multiple files (e.g., provider.tf, main.tf, variables.tf, outputs.tf). This was done intentionally to avoid switching between multiple files for what is a really simple example.
- The secret created by this script is stored in the local state backend. If you use this backend in production, make sure to treat the state file as a secret.
Automating Object Storage policies with Terraform
Initialize
Clone the repository:
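As before (the URL and directory below are placeholders for the example repository referenced by this guide):

```shell
git clone <repository-url>
cd <repository-directory>
```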
Initialize Terraform:
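```shell
terraform init
```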
Plan
With the following command, you can see the actions that Terraform is going to perform:
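```shell
terraform plan
```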
Now let's have a look at the content of the main.tf file and compare it with the previous example:
- The User/Credential block defines three users and credentials: one user will be the administrator of the bucket and create it; the other two will have read/write and read-only access.
- In the Bucket block, we have added the upload of a file into the bucket.
- The Policy block defines two policies: one for read/write and the other for read-only access on the bucket.
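As a sketch, a read-only policy for one of the users could look like the following (the resource names, user reference, and bucket name are hypothetical; the repository's main.tf may differ):

```hcl
# Illustrative read-only policy attached to a user created elsewhere
# in the configuration (ovh_cloud_project_user.read_only_user is hypothetical)
resource "ovh_cloud_project_user_s3_policy" "read_only" {
  service_name = var.OVH_PUBLIC_CLOUD_PROJECT_ID
  user_id      = ovh_cloud_project_user.read_only_user.id

  policy = jsonencode({
    "Statement" : [{
      "Sid" : "ReadOnly",
      "Effect" : "Allow",
      "Action" : ["s3:GetObject", "s3:ListBucket"],
      "Resource" : [
        "arn:aws:s3:::my-terraform-bucket",
        "arn:aws:s3:::my-terraform-bucket/*"
      ]
    }]
  })
}
```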
Run
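Apply the configuration to create the resources (Terraform asks for confirmation before proceeding):

```shell
terraform apply
```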
Now you can go to the Console and check the "Object Storage" tab. You will see the bucket and the file.
You can also check the access rights by using the AWS CLI with the two users that have read/write and read-only access.
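For example, assuming you have configured an AWS CLI profile with the read-only user's credentials (the profile name, endpoint, bucket, and file names below are illustrative):

```shell
# Listing should succeed for the read-only user
aws --profile reader --endpoint-url https://s3.gra.io.cloud.ovh.net \
    s3 ls s3://my-terraform-bucket

# Uploading should be denied for the read-only user
aws --profile reader --endpoint-url https://s3.gra.io.cloud.ovh.net \
    s3 cp ./file.txt s3://my-terraform-bucket/
```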
Destroy
With the following command, you will return to your original state: Terraform will destroy all the resources that were previously created.
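```shell
terraform destroy
```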
Go further
For more information and tutorials, please see our other Object Storage support guides or explore the guides for other OVHcloud products and services.
If you need training or technical assistance to implement our solutions, contact your sales representative or click on this link to get a quote and ask our Professional Services experts for a custom analysis of your project.
*S3 is a trademark filed by Amazon Technologies, Inc. OVHcloud's service is not sponsored by, endorsed by, or otherwise affiliated with Amazon Technologies, Inc.