
Commit 39b9f0e

feat: generate_library.sh with postprocessing (#1951)
* feat: add
* generate gapic and proto folder
* refactor utilities
* add an action to verify
* checkout googleapis-gen
* setup repo name
* add commit hash of googleapis-gen
* change secret
* change token
* change to git clone
* change user name
* add input list
* include resources folder in main
* remove grpc version in `*ServiceGrpc.java`
* change destination path
* compare generation result with googleapis-gen
* fix type in diff command
* checkout repo using checkout action
* checkout repos as nested repo
* sparse checkout googleapis
* Revert "sparse checkout googleapis". This reverts commit 3d612f8.
* change library
* change step name
* add a readme
* make grpc version optional
* make protobuf version optional
* checkout master branch, rather than a commit hash
* allow snapshot version of generator
* download snapshot of generator parent pom
* update README
* download generator and grpc using mvn
* change error message
* add maven central mirror
* add comments in utilities
* add comments
* add an integration test
* fail fast if no file is found
* do not delete google/
* get protobuf version from WORKSPACE
* add instructions on download `google/` from googleapis
* add comments
* update description of `destination_path`
* update comments
* download dependencies using `curl`
* increase download time
* remove comment
* add samples directory in readme
* remove prerequisite about `proto_path`
* add explanation in prerequisite
* add example to generate showcase
* add a comment
* wip adaptations
* add owlbot.py template
* run owlbot docker image
* fix consolidate config
* move owlbot call to its own function
* move postprocessing logic
* prepare integration test for gh workflow
* fix local dev script
* post-merge fixes
* fix test script and IT
* fix parent poms
* start fixing samples problem
* fix samples folder transfer
* cleanup, prepare IT workflow
* cleanup ii, sparse clone monorepo
* delete preserve script
* clean unnecessary lines
* infer owlbot sha
* add template file
* remove newline from owlbot template
* chore: newline correction
* use stderr for error messages
* fix script documentation
* function comments
* quoting variables
* format constant
* fix sparse checkout of monorepo
* include location to googleapis sparse clone
* remove unnecessary parent pom setting
* remove consolidate_config.sh
* exclude changelog and owlbot copy files from diff check
* fixes after merge
* include .github in monorepo sparse clone
* restore `set_parent_pom.sh`
* restore `consolidate_config.sh`
* correct parameter resolution
* use separate variable for version
* postprocessing to use separate versions
* remove old IT file
* post-merge fixes
* enable post-processing by default
* post-merge fixes
* post-merge fixes
* post merge fixes
* add script to compare poms
* post-merge fixes
* post-merge fixes ii
* fix pom comparison
* include pre-existing poms before running owlbot
* change owlbot-staging suffix folder to run owlbot.py
* fix newline removal in owlbot.py
* split git diff command
* enable tests for HW libraries
* generate all hw libs except bigtable
* all libraries passing
* fix unit tests
* repo metadata json logic cleanup
* remove new library scripts
* fix googleapis-gen tests
* fix post-processing it
* magic empty commit
* correct conflict string
* use os agnostic string replacement
* comments and cleanup on postprocessing
* cleanup of IT
* temp: use custom gapic library name
* use owl-bot-copy
* remove api_version logic
* remove custom_gapic_name in favor of owl-bot-copy
* remove unnecessary new library flag
* fix folder name test
* remove unnecessary util function
* remove unnecessary utils script dir var
* rename postprocessing folder, apply_current_versions comment
* fix postprocessing comments
* correct popd folder name to its variable name
* unnecessary sed command
* skip generation if more versions coming
* do not stage previous versions in owl-bot-staging
* do not use custom repo metadatas
* reset workspace folder
* remove unnecessary owlbot yaml copy
* modify readme
* expand README instructions
* examples for both pre and post processing
* exclude new library owlbot.py template
* do not process HW libraries
* success message, folder navigation fix
* set git author
* add docker to workflow
* lint fix
* custom docker step for macos
* do not postprocess showcase
* os-dependent pom comparison
* add python to workflow
* explicit python version
* add debugging output for compare_poms
* correct xargs for macos
* remove debug checkpoints
* clean compare_poms.py
* concise else logic
* infer destination_path
* add generation times
* remove unused transport and include_samples from postprocessing
* use versions.txt at root of owlbot postprocessor fs
* modify success message
* remove unused version processing script
* remove owlbot_sha and repo_metadata args
* use built-in docker images
* manual install of docker ii
* manual install of docker iii
* manual install of docker iv
* manual install of docker v
* manual install of docker vi
* manual install of docker vii
* manual install of docker viii
* manual install of docker ix
* versions.txt as an argument
* fix exit code in time tracking
* fix readme
* remove unused options
* fix macos docker install
* do not use cask to install docker
* test custom user id in docker run
* correct time tracking entry
* change postprocessing file structure
* move helper postprocess funcs to utilities.sh
* add unit tests for postprocess utils
* remove repository_path
* fix workspace creation logic
* fix readme
* transfer from workspace to destination path
* include folder structure for p.p. libs in readme
* omit pre-processed folders
* omit package-info.java
* fix documentation argument order
* fix preparation of copy-code source folder
* add unit test for copy_directory_if_exists
* fix wrong args to cp
* change test monorepo folder names

---------

Co-authored-by: JoeWang1127 <[email protected]>
1 parent f1ee04d commit 39b9f0e

13 files changed (+704 -66 lines)

.github/workflows/verify_library_generation.yaml (+26 -1)

@@ -14,6 +14,7 @@ jobs:
       matrix:
         java: [ 8 ]
         os: [ ubuntu-22.04, macos-12 ]
+        post_processing: [ 'true', 'false' ]
     runs-on: ${{ matrix.os }}
     steps:
       - uses: actions/checkout@v3
@@ -22,11 +23,35 @@ jobs:
           java-version: ${{ matrix.java }}
           distribution: temurin
           cache: maven
+      - uses: actions/setup-python@v4
+        with:
+          python-version: '3.11'
+      - name: install docker (ubuntu)
+        if: matrix.os == 'ubuntu-22.04'
+        run: |
+          set -x
+          # install docker
+          sudo apt install containerd -y
+          sudo apt install -y docker.io docker-compose
+
+          # launch docker
+          sudo systemctl start docker
+      - name: install docker (macos)
+        if: matrix.os == 'macos-12'
+        run: |
+          brew update --preinstall
+          brew install docker docker-compose qemu
+          brew upgrade qemu
+          colima start
+          docker run --user $(id -u):$(id -g) --rm hello-world
       - name: Run integration tests
         run: |
           set -x
+          git config --global user.email "[email protected]"
+          git config --global user.name "Github Workflow"
           library_generation/test/generate_library_integration_test.sh \
-            --googleapis_gen_url https://cloud-java-bot:${{ secrets.CLOUD_JAVA_BOT_GITHUB_TOKEN }}@github.com/googleapis/googleapis-gen.git
+            --googleapis_gen_url https://cloud-java-bot:${{ secrets.CLOUD_JAVA_BOT_GITHUB_TOKEN }}@github.com/googleapis/googleapis-gen.git \
+            --enable_postprocessing "${{ matrix.post_processing }}"
   unit_tests:
     strategy:
       matrix:
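The new matrix dimension runs the integration test both with and without post-processing. To reproduce one leg of the matrix locally, the same script can be invoked along the following lines; this is only a sketch, and the git identity and googleapis-gen URL are placeholders rather than the workflow's secret-based values.

```bash
# Local approximation of the "Run integration tests" step above.
# Placeholder values: the workflow builds an authenticated URL from the
# CLOUD_JAVA_BOT_GITHUB_TOKEN secret; any git identity works for local runs.
git config --global user.email "[email protected]"
git config --global user.name "Local Test"
library_generation/test/generate_library_integration_test.sh \
  --googleapis_gen_url "https://github.com/googleapis/googleapis-gen.git" \
  --enable_postprocessing "true"   # the matrix also covers "false"
```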

library_generation/README.md (+65 -3)

@@ -21,6 +21,13 @@ In order to generate a GAPIC library, you need to pull `google/` from [googleapi
 and put it into `output` since protos in `google/` are likely referenced by
 protos from which the library are generated.
 
+In order to generate a post-processed GAPIC library, you need to pull the
+original repository (i.e. google-cloud-java) and pass the monorepo as
+`destination_path` (e.g. `google-cloud-java/java-asset`).
+This repository will be the source of truth for pre-existing
+pom.xml files, owlbot.py and .OwlBot.yaml files. See the options below for
+custom postprocessed generations (e.g. custom `versions.txt` file).
+
 ## Parameters to run `generate_library.sh`
 
 You need to run the script with the following parameters.
@@ -40,7 +47,7 @@ Use `-d` or `--destination_path` to specify the value.
 
 Note that you do not need to create `$destination_path` beforehand.
 
-The directory structure of the generated library is
+The directory structure of the generated library _without_ postprocessing is
 ```
 $destination_path
 |_gapic-*
@@ -65,7 +72,35 @@ $destination_path
 ```
 You can't build the library as-is since it does not have `pom.xml` or `build.gradle`.
 To use the library, copy the generated files to the corresponding directory
-of a library repository, e.g., `google-cloud-java`.
+of a library repository, e.g., `google-cloud-java`, or use the
+`enable_postprocessing` flag on top of a pre-existing generated library to
+produce the necessary pom files.
+
+For `asset/v1`, the directory structure of the generated library _with_ postprocessing is
+```
+
+├── google-cloud-asset
+│   └── src
+│       ├── main
+│       │   ├── java
+│       │   └── resources
+│       └── test
+│           └── java
+├── google-cloud-asset-bom
+├── grpc-google-cloud-asset-v*
+│   └── src
+│       └── main
+│           └── java
+├── proto-google-cloud-asset-v*
+│   └── src
+│       └── main
+│           ├── java
+│           └── proto
+└── samples
+    └── snippets
+        └── generated
+
+```
 
 ### gapic_generator_version
 You can find the released version of gapic-generator-java in [maven central](https://repo1.maven.org/maven2/com/google/api/gapic-generator-java/).
@@ -150,8 +185,33 @@ Use `--include_samples` to specify the value.
 
 Choose the protoc binary type from https://github.com/protocolbuffers/protobuf/releases.
 Default is "linux-x86_64".
 
-## An example to generate a client library
+### enable_postprocessing (optional)
+Whether to enable the post-processing steps (usage of owlbot) in the generation
+of this library.
+Default is "true".
+
+### versions_file (optional)
+It must point to a versions.txt file containing the versions the post-processed
+poms will have. It is required when `enable_postprocessing` is `"true"`.
+
+
+## An example to generate a non-post-processed client library
+```bash
+library_generation/generate_library.sh \
+-p google/cloud/confidentialcomputing/v1 \
+-d google-cloud-confidentialcomputing-v1-java \
+--gapic_generator_version 2.24.0 \
+--protobuf_version 23.2 \
+--grpc_version 1.55.1 \
+--gapic_additional_protos "google/cloud/common_resources.proto google/cloud/location/locations.proto" \
+--transport grpc+rest \
+--rest_numeric_enums true \
+--enable_postprocessing false \
+--include_samples true
 ```
+
+## An example to generate a library with postprocessing
+```bash
 library_generation/generate_library.sh \
 -p google/cloud/confidentialcomputing/v1 \
 -d google-cloud-confidentialcomputing-v1-java \
@@ -161,5 +221,7 @@ library_generation/generate_library.sh \
 --gapic_additional_protos "google/cloud/common_resources.proto google/cloud/location/locations.proto" \
 --transport grpc+rest \
 --rest_numeric_enums true \
+--enable_postprocessing true \
+--versions_file "path/to/versions.txt" \
 --include_samples true
 ```
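The `versions_file` option expects a `versions.txt` in the format used by the Java monorepos. A minimal illustrative file for the post-processed example could be created as below; the module name and version numbers are made up, and the `module:released-version:current-version` layout is assumed from the google-cloud-java convention.

```bash
# Hypothetical versions.txt for the postprocessing example above.
cat > path/to/versions.txt <<'EOF'
# Format:
# module:released-version:current-version
google-cloud-confidentialcomputing:0.1.0:0.1.1-SNAPSHOT
EOF
```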

library_generation/generate_library.sh (+53 -6)

@@ -1,7 +1,6 @@
 #!/usr/bin/env bash
 
 set -eo pipefail
-set -x
 
 # parse input parameters
 while [[ $# -gt 0 ]]; do
@@ -61,10 +60,18 @@ case $key in
     include_samples="$2"
     shift
     ;;
+  --enable_postprocessing)
+    enable_postprocessing="$2"
+    shift
+    ;;
   --os_architecture)
     os_architecture="$2"
     shift
     ;;
+  --versions_file)
+    versions_file="$2"
+    shift
+    ;;
   *)
     echo "Invalid option: [$1]"
     exit 1
@@ -74,6 +81,7 @@ shift # past argument or value
 done
 
 script_dir=$(dirname "$(readlink -f "$0")")
+# source utility functions
 source "${script_dir}"/utilities.sh
 output_folder="$(get_output_folder)"
 
@@ -117,17 +125,20 @@ if [ -z "${include_samples}" ]; then
   include_samples="true"
 fi
 
+if [ -z "$enable_postprocessing" ]; then
+  enable_postprocessing="true"
+fi
+
 if [ -z "${os_architecture}" ]; then
   os_architecture=$(detect_os_architecture)
 fi
 
-
 mkdir -p "${output_folder}/${destination_path}"
 ##################### Section 0 #####################
 # prepare tooling
 #####################################################
 # the order of services entries in gapic_metadata.json is relevant to the
-# order of proto file, sort the proto files with respect to their name to
+# order of proto file, sort the proto files with respect to their bytes to
 # get a fixed order.
 folder_name=$(extract_folder_name "${destination_path}")
 pushd "${output_folder}"
@@ -137,7 +148,7 @@ case "${proto_path}" in
     find_depth="-maxdepth 1"
     ;;
 esac
-proto_files=$(find "${proto_path}" ${find_depth} -type f -name "*.proto" | sort)
+proto_files=$(find "${proto_path}" ${find_depth} -type f -name "*.proto" | LC_COLLATE=C sort)
 # include or exclude certain protos in grpc plugin and gapic generator java.
 case "${proto_path}" in
   "google/cloud")
@@ -280,5 +291,41 @@ popd # output_folder
 #####################################################
 pushd "${output_folder}/${destination_path}"
 rm -rf java_gapic_srcjar java_gapic_srcjar_raw.srcjar.zip java_grpc.jar java_proto.jar temp-codegen.srcjar
-popd
-set +x
+popd # destination path
+##################### Section 5 #####################
+# post-processing
+#####################################################
+if [ "${enable_postprocessing}" != "true" ];
+then
+  echo "post processing is disabled"
+  exit 0
+fi
+if [ -z "${versions_file}" ];then
+  echo "no versions.txt argument provided. Please provide one in order to enable post-processing"
+  exit 1
+fi
+workspace="${output_folder}/workspace"
+if [ -d "${workspace}" ]; then
+  rm -rdf "${workspace}"
+fi
+
+mkdir -p "${workspace}"
+
+bash -x "${script_dir}/postprocess_library.sh" "${workspace}" \
+  "${script_dir}" \
+  "${destination_path}" \
+  "${proto_path}" \
+  "${versions_file}" \
+  "${output_folder}"
+
+# for post-processed libraries, remove pre-processed folders
+pushd "${output_folder}/${destination_path}"
+rm -rdf "proto-${folder_name}"
+rm -rdf "grpc-${folder_name}"
+rm -rdf "gapic-${folder_name}"
+if [ "${include_samples}" == "false" ]; then
+  rm -rdf "samples"
+fi
+popd # output_folder
+# move contents of the post-processed library into destination_path
+cp -r ${workspace}/* "${output_folder}/${destination_path}"
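Since `enable_postprocessing` now defaults to `"true"`, a run that omits `--versions_file` completes generation and then exits with the error shown in Section 5. The invocations below are illustrative only; the proto path and monorepo paths are examples, not part of this change.

```bash
# Fails at Section 5: post-processing is on by default and no versions file is given.
# Prints: "no versions.txt argument provided. Please provide one in order to enable post-processing"
library_generation/generate_library.sh \
  -p google/cloud/asset/v1 \
  -d google-cloud-java/java-asset

# With a versions file, Section 5 hands the workspace off to postprocess_library.sh.
library_generation/generate_library.sh \
  -p google/cloud/asset/v1 \
  -d google-cloud-java/java-asset \
  --versions_file "google-cloud-java/versions.txt"
```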
library_generation/postprocess_library.sh (new file, +101)

@@ -0,0 +1,101 @@
+#!/bin/bash
+#
+# Main functions to interact with owlbot post-processor and postprocessing
+# scripts
+
+
+# Runs the owlbot post-processor docker image. The resulting post-processed
+# library gets stored in `${output_folder}/workspace`
+# Arguments
+# 1 - workspace: the location of the grpc, proto and gapic libraries to be
+# processed
+# 2 - scripts_root: location of the generation scripts
+# 3 - destination_path: used to transfer the raw grpc, proto and gapic libraries
+# 4 - proto_path: googleapis path of the library. This is used to prepare the
+# folder structure to run `owlbot-cli copy-code`
+# 5 - versions_file: path to file containing versions to be applied to the poms
+# 6 - output_folder: main workspace of the generation process
+
+workspace=$1
+scripts_root=$2
+destination_path=$3
+proto_path=$4
+versions_file=$5
+output_folder=$6
+
+source "${scripts_root}"/utilities.sh
+
+repository_root=$(echo "${destination_path}" | cut -d/ -f1)
+repo_metadata_json_path=$(get_repo_metadata_json "${destination_path}" "${output_folder}")
+owlbot_sha=$(get_owlbot_sha "${output_folder}" "${repository_root}")
+
+# read or infer owlbot sha
+
+cp "${repo_metadata_json_path}" "${workspace}"/.repo-metadata.json
+
+# call owl-bot-copy
+owlbot_staging_folder="${workspace}/owl-bot-staging"
+mkdir -p "${owlbot_staging_folder}"
+owlbot_postprocessor_image="gcr.io/cloud-devrel-public-resources/owlbot-java@sha256:${owlbot_sha}"
+
+
+
+# copy existing pom, owlbot and version files if the source of truth repo is present
+# pre-processed folders are omitted
+if [[ -d "${output_folder}/${destination_path}" ]]; then
+  rsync -avm \
+    --include='*/' \
+    --include='*.xml' \
+    --include='owlbot.py' \
+    --include='.OwlBot.yaml' \
+    --exclude='*' \
+    "${output_folder}/${destination_path}/" \
+    "${workspace}"
+fi
+
+echo 'Running owl-bot-copy'
+pre_processed_libs_folder="${output_folder}/pre-processed"
+# By default (thanks to generation templates), .OwlBot.yaml `deep-copy` section
+# references a wildcard pattern matching a folder
+# ending with `-java` at the leaf of proto_path.
+mkdir -p "${pre_processed_libs_folder}/${proto_path}/generated-java"
+folder_name=$(extract_folder_name "${destination_path}")
+copy_directory_if_exists "${output_folder}/${destination_path}/proto-${folder_name}" \
+  "${pre_processed_libs_folder}/${proto_path}/generated-java/proto-google-cloud-${folder_name}"
+copy_directory_if_exists "${output_folder}/${destination_path}/grpc-${folder_name}" \
+  "${pre_processed_libs_folder}/${proto_path}/generated-java/grpc-google-cloud-${folder_name}"
+copy_directory_if_exists "${output_folder}/${destination_path}/gapic-${folder_name}" \
+  "${pre_processed_libs_folder}/${proto_path}/generated-java/gapic-google-cloud-${folder_name}"
+copy_directory_if_exists "${output_folder}/${destination_path}/samples" \
+  "${pre_processed_libs_folder}/${proto_path}/generated-java/samples"
+pushd "${pre_processed_libs_folder}"
+# create an empty repository so owl-bot-copy can process this as a repo
+# (cannot process non-git repositories)
+git init
+git commit --allow-empty -m 'empty commit'
+popd # pre_processed_libs_folder
+
+docker run --rm \
+  --user $(id -u):$(id -g) \
+  -v "${workspace}:/repo" \
+  -v "${pre_processed_libs_folder}:/pre-processed-libraries" \
+  -w /repo \
+  --env HOME=/tmp \
+  gcr.io/cloud-devrel-public-resources/owlbot-cli:latest \
+  copy-code \
+  --source-repo-commit-hash=none \
+  --source-repo=/pre-processed-libraries \
+  --config-file=.OwlBot.yaml
+
+
+echo 'running owl-bot post-processor'
+versions_file_arg=""
+if [ -f "${versions_file}" ];then
+  versions_file_arg="-v ${versions_file}:/versions.txt"
+fi
+# run the postprocessor
+docker run --rm \
+  -v "${workspace}:/workspace" \
+  ${versions_file_arg} \
+  --user $(id -u):$(id -g) \
+  "${owlbot_postprocessor_image}"
