Running a script to install local object storage buckets and create connections
For convenience, run a script (provided in the following procedure) that automatically completes these tasks:
- Creates a Minio instance in your project.
- Creates two storage buckets in that Minio instance.
- Generates a random user ID and password for your Minio instance.
- Creates two connections in your project, one for each bucket and both using the same credentials.
- Installs required network policies for service mesh functionality.
The script is based on a guide for deploying Minio.
The Minio-based object storage that the script creates is not intended for production use.
If you want to connect to your own storage, see Creating connections to your own S3-compatible object storage.
You must know the OpenShift resource name for your data science project so that you run the provided script in the correct project. To get the project’s resource name:
In the OpenShift AI dashboard, select Data Science Projects and then click the ? icon next to the project name. A text box appears with information about the project, including its resource name.
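If you have cluster access from a terminal, you can also look up the resource name with the oc CLI instead of the dashboard. This is a convenience only, using a standard oc command rather than anything specific to this workshop:

oc get projects

The NAME column shows the resource name to use in the commands in this procedure; the DISPLAY NAME column shows the name that appears in the dashboard.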
The following procedure describes how to run the script from the OpenShift console. If you are knowledgeable in OpenShift and can access the cluster from the command line, instead of following the steps in this procedure, you can run the script with the following command: oc apply -n <your-project-name> -f https://github.com/rh-aiservices-bu/fraud-detection/raw/main/setup/setup-s3.yaml
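If you run the script this way, you can watch the setup job from the same terminal. These are standard oc commands; the job name create-s3-storage comes from the YAML shown later in this procedure:

oc get job create-s3-storage -n <your-project-name> -w
oc logs -f job/create-s3-storage -n <your-project-name>

The job is complete when its COMPLETIONS column shows 1/1.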
- In the OpenShift AI dashboard, click the application launcher icon and then select the OpenShift Console option.
- In the OpenShift console, click + in the top navigation bar.
- Select your project from the list of projects.
- Verify that you selected the correct project.
- Copy the following code and paste it into the Import YAML editor.
This code gets and applies the setup-s3-no-sa.yaml file.

---
apiVersion: v1
kind: ServiceAccount
metadata:
  name: demo-setup
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: demo-setup-edit
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: edit
subjects:
  - kind: ServiceAccount
    name: demo-setup
---
apiVersion: batch/v1
kind: Job
metadata:
  name: create-s3-storage
spec:
  selector: {}
  template:
    spec:
      containers:
        - args:
            - -ec
            - |-
              echo -n 'Setting up Minio instance and connections'
              oc apply -f https://github.com/rh-aiservices-bu/fraud-detection/raw/main/setup/setup-s3-no-sa.yaml
          command:
            - /bin/bash
          image: image-registry.openshift-image-registry.svc:5000/openshift/tools:latest
          imagePullPolicy: IfNotPresent
          name: create-s3-storage
      restartPolicy: Never
      serviceAccount: demo-setup
      serviceAccountName: demo-setup
- Click Create.
- In the OpenShift console, you should see a "Resources successfully created" message and the following resources listed:
  - demo-setup
  - demo-setup-edit
  - create-s3-storage
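If you also have command-line access, you can confirm the same three resources from a terminal. This optional check uses standard oc queries:

oc get serviceaccount/demo-setup rolebinding/demo-setup-edit job/create-s3-storage -n <your-project-name>

The Minio deployment, service, and buckets themselves are created by the setup-s3-no-sa.yaml manifest that the job applies, so their names are defined in that manifest rather than in the YAML you pasted.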
- In the OpenShift AI dashboard:
  - Select Data Science Projects and then click the name of your project, Fraud detection.
  - Click Connections. You should see two connections listed: My Storage and Pipeline Artifacts.
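Behind the scenes, OpenShift AI stores each connection as a Kubernetes secret in your project namespace. If you want to confirm this from a terminal, a rough check is to list the project's labeled secrets; note that the label shown here (opendatahub.io/dashboard=true) is an assumption that can vary between OpenShift AI versions:

oc get secrets -n <your-project-name> -l opendatahub.io/dashboard=true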
- If you want to complete the pipelines section of this workshop, go to Enabling data science pipelines. Otherwise, skip to Creating a workbench.
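The demo-setup service account, its role binding, and the completed job exist only to run this one-time setup. After the connections appear in the dashboard, you can optionally remove them with a standard oc deletion; this does not delete the Minio instance or the connections, which were created as separate resources:

oc delete job/create-s3-storage rolebinding/demo-setup-edit serviceaccount/demo-setup -n <your-project-name>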