Build legacy swarm with any netplugin branch #281

Merged · 2 commits · Nov 3, 2017
1 change: 1 addition & 0 deletions .gitignore
@@ -5,4 +5,5 @@ cluster/.vagrant
cluster/export
cluster/*.log
release
artifact_staging/

38 changes: 35 additions & 3 deletions Makefile
@@ -1,13 +1,45 @@
# backwards compatibility name for CONTIV_INSTALLER_VERSION
export BUILD_VERSION ?= devbuild
# sets the version for the installer output artifacts
export CONTIV_INSTALLER_VERSION ?= $(BUILD_VERSION)
# downloaded and built assets intended to go in installer by build.sh
export CONTIV_ARTIFACT_STAGING := $(PWD)/artifact_staging
# some assets are retrieved from GitHub; this is the default version to fetch
export DEFAULT_DOWNLOAD_CONTIV_VERSION := 1.1.5
export NETPLUGIN_OWNER ?= contiv
# setting NETPLUGIN_BRANCH compiles that commit on demand,
# setting CONTIV_NETPLUGIN_VERSION will download that released version
ifeq ($(NETPLUGIN_BRANCH),)
export CONTIV_NETPLUGIN_VERSION ?= $(DEFAULT_DOWNLOAD_CONTIV_VERSION)
else
export CONTIV_NETPLUGIN_VERSION := $(NETPLUGIN_OWNER)-$(NETPLUGIN_BRANCH)
endif
export CONTIV_V2PLUGIN_VERSION ?= $(DEFAULT_DOWNLOAD_CONTIV_VERSION)
export CONTIV_NETPLUGIN_TARBALL_NAME := netplugin-$(CONTIV_NETPLUGIN_VERSION).tar.bz2
export CONTIV_ANSIBLE_COMMIT ?= 4e67f54a8042debfc3d8b504046d0a1d4ea38c37
export CONTIV_ANSIBLE_OWNER ?= contiv

# this is the classic first makefile target, and it's also the default target
# run when `make` is invoked with no specific target.
all: build
rel_ver = $(shell ./scripts/get_latest_release.sh)

# accepts CONTIV_ANSIBLE_COMMIT and CONTIV_ANSIBLE_OWNER environment vars
download-ansible-repo:
@scripts/download_ansible_repo.sh

# set NETPLUGIN_OWNER (default contiv) and NETPLUGIN_BRANCH make variables
# to compile locally
# e.g. make NETPLUGIN_OWNER=contiv NETPLUGIN_BRANCH=master
prepare-netplugin-tarball:
@scripts/prepare_netplugin_tarball.sh

assemble-build:
@bash ./scripts/build.sh

# build creates a release package for contiv.
# It uses a pre-built image specified by BUILD_VERSION.
build:
rm -rf release/
@bash ./scripts/build.sh
build: download-ansible-repo prepare-netplugin-tarball assemble-build

# ansible-image creates the docker image for ansible container
# It uses the version specified by BUILD_VERSION or creates an image with the latest tag.
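The branch-vs-release selection in the Makefile hunk above can be sketched as a standalone shell translation (a hypothetical illustration, not part of the PR; variable names follow the Makefile):

```shell
# Sketch of the Makefile's ifeq logic: with no NETPLUGIN_BRANCH set, a pinned
# release version is downloaded; with a branch set, the artifact is named
# <owner>-<branch> and built locally on demand.
DEFAULT_DOWNLOAD_CONTIV_VERSION=1.1.5
NETPLUGIN_OWNER=${NETPLUGIN_OWNER:-contiv}
if [ -z "${NETPLUGIN_BRANCH:-}" ]; then
    # no branch requested: use the pinned, downloadable release
    CONTIV_NETPLUGIN_VERSION=${CONTIV_NETPLUGIN_VERSION:-$DEFAULT_DOWNLOAD_CONTIV_VERSION}
else
    # branch requested: compile that branch, name the artifact owner-branch
    CONTIV_NETPLUGIN_VERSION="${NETPLUGIN_OWNER}-${NETPLUGIN_BRANCH}"
fi
CONTIV_NETPLUGIN_TARBALL_NAME="netplugin-${CONTIV_NETPLUGIN_VERSION}.tar.bz2"
echo "$CONTIV_NETPLUGIN_TARBALL_NAME"
```

So `make` alone targets `netplugin-1.1.5.tar.bz2`, while `make NETPLUGIN_BRANCH=master` targets `netplugin-contiv-master.tar.bz2`.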
140 changes: 49 additions & 91 deletions scripts/build.sh
@@ -1,5 +1,10 @@
#!/bin/bash

# Required environment variables:
# * CONTIV_INSTALLER_VERSION - sets the tarball artifact filenames
# * CONTIV_NETPLUGIN_VERSION - updates config files to locate contiv tarball
# * CONTIV_V2PLUGIN_VERSION - which v2plugin version to download during install

set -xeuo pipefail

# ensure this script wasn't called from the directory where this script
@@ -10,126 +15,67 @@ if [ "$script_dir" == "." ]; then
exit 1
fi

DEV_IMAGE_NAME="devbuild"
VERSION=${BUILD_VERSION-$DEV_IMAGE_NAME}

contiv_version=${CONTIV_VERSION:-"1.0.3"}
pull_images=${CONTIV_CI_HOST:-"false"}
aci_gw_version=${CONTIV_ACI_GW_VERSION:-"latest"}
ansible_image_version=${CONTIV_ANSIBLE_IMAGE_VERSION:-$contiv_version}
auth_proxy_version=${CONTIV_API_PROXY_VERSION:-$contiv_version}
ansible_image_version=${CONTIV_ANSIBLE_IMAGE_VERSION:-$DEFAULT_DOWNLOAD_CONTIV_VERSION}
auth_proxy_version=${CONTIV_API_PROXY_VERSION:-$DEFAULT_DOWNLOAD_CONTIV_VERSION}
docker_version=${CONTIV_DOCKER_VERSION:-1.12.6}
etcd_version=${CONTIV_ETCD_VERSION:-v2.3.8}
contiv_ansible_commit=${CONTIV_ANSIBLE_COMMIT:-4e67f54a8042debfc3d8b504046d0a1d4ea38c37}
contiv_ansible_owner=${CONTIV_ANSIBLE_OWNER:-contiv}

# the installer currently pulls the v2plugin image directly from Docker Hub, but
# this will change to being downloaded from the Docker Store in the future.
# because of this, the default value for this variable will become the latest
# version that is available in the Docker Store and should be considered
# independent of $contiv_version above.
v2plugin_version=${CONTIV_V2PLUGIN_VERSION:-"1.0.3"}

function usage() {
echo "Usage:"
echo "./release.sh -a <ACI gateway image> -c <contiv version> -e <etcd version> -p <API proxy image version> "
exit 1
}
v2plugin_version=${CONTIV_V2PLUGIN_VERSION}

function error_ret() {
echo ""
echo $1
exit 1
}

while getopts ":a:p:c:e:v:" opt; do
case $opt in
a)
aci_gw_version=$OPTARG
;;
c)
contiv_version=$OPTARG
;;
e)
etcd_version=$OPTARG
;;
p)
auth_proxy_version=$OPTARG
;;
v)
v2plugin_version=$OPTARG
;;
:)
echo "An argument required for $OPTARG was not passed"
usage
;;
?)
usage
;;
esac
done

release_dir="release"
output_dir="$release_dir/contiv-$VERSION/"
output_file="$release_dir/contiv-$VERSION.tgz"
tmp_output_file="contiv-$VERSION.tgz"
full_output_file="$release_dir/contiv-full-$VERSION.tgz"
tmp_full_output_file="contiv-full-$VERSION.tgz"

# Clean older dist folders and release binaries
rm -rf $output_dir
rm -rf $output_file
# where everything is assembled, always start with a clean dir and clean it up
output_tmp_dir="$(mktemp -d)"
output_dir="${output_tmp_dir}/contiv-${CONTIV_INSTALLER_VERSION}"
mkdir -p ${output_dir}
trap 'rm -rf ${output_tmp_dir}' EXIT

release_dir=release
mkdir -p $release_dir
output_file="${release_dir}/contiv-${CONTIV_INSTALLER_VERSION}.tgz"
full_output_file="$release_dir/contiv-full-${CONTIV_INSTALLER_VERSION}.tgz"

# Release files
# k8s - install.sh to take the args and construct contiv.yaml as required and to launch kubectl
# swarm - install.sh launches the container to do the actual installation
# Top level install.sh which will either take k8s/swarm install params and do the required.
mkdir -p $output_dir
cp -rf install $output_dir
cp README.md $output_dir
cp -rf install README.md $output_dir
cp -rf scripts/generate-certificate.sh $output_dir/install

# Get the ansible support files
chmod +x $output_dir/install/genInventoryFile.py
chmod +x $output_dir/install/generate-certificate.sh

# This is maybe optional - but assume we need it for
curl -sSL https://github.com/contiv/netplugin/releases/download/$contiv_version/netplugin-$contiv_version.tar.bz2 -o $output_dir/netplugin-$contiv_version.tar.bz2
pushd $output_dir
tar oxf netplugin-$contiv_version.tar.bz2 netctl
rm -f netplugin-$contiv_version.tar.bz2
popd
# add ansible repo contents where final tarball will include
mkdir $output_dir/ansible
curl -sL https://api.github.com/repos/${contiv_ansible_owner}/ansible/tarball/$contiv_ansible_commit |
tar --strip-components 1 -C $output_dir/ansible -z -x
cp -a ${CONTIV_ARTIFACT_STAGING}/ansible ${output_dir}/

# Replace versions
files=$(find $output_dir -type f -name "*.yaml" -or -name "*.sh" -or -name "*.json")
sed -i.bak 's/__ACI_GW_VERSION__/'"$aci_gw_version"'/g' $files
sed -i.bak 's/__API_PROXY_VERSION__/'"$auth_proxy_version"'/g' $files
sed -i.bak 's/__CONTIV_INSTALL_VERSION__/'"$ansible_image_version"'/g' $files
sed -i.bak 's/__CONTIV_VERSION__/'"$contiv_version"'/g' $files
sed -i.bak 's/__CONTIV_VERSION__/'"$CONTIV_NETPLUGIN_VERSION"'/g' $files
sed -i.bak 's/__DOCKER_VERSION__/'"$docker_version"'/g' $files
sed -i.bak 's/__ETCD_VERSION__/'"$etcd_version"'/g' $files
sed -i.bak 's/__CONTIV_V2PLUGIN_VERSION__/'"$v2plugin_version"'/g' $files

# Make all shell script files executable
chmod +x $(find $output_dir -type f -name "*.sh")

# Cleanup the backup files
rm -rf $output_dir/scripts
rm -rf $(find $output_dir -type f -name "*.bak")

# Clean up the Dockerfile, it is not part of the release bits.
rm -f $output_dir/install/ansible/Dockerfile

# Create the binary cache folder
binary_cache=$output_dir/contiv_cache
mkdir -p $binary_cache

# Create the minimal tar bundle
tar czf $tmp_output_file -C $release_dir contiv-$VERSION
# only build the installer that pulls artifacts over the internet if not building
# a specific commit of netplugin
if [ -z "${NETPLUGIN_BRANCH:-}" ]; then
# Create the minimal tar bundle
tar czf $output_file -C $output_tmp_dir contiv-${CONTIV_INSTALLER_VERSION}
echo -n "Contiv Installer version '$CONTIV_INSTALLER_VERSION' with "
echo "netplugin version '$CONTIV_NETPLUGIN_VERSION' is available "
echo "at '$output_file'"
fi

# Save the auth proxy & aci-gw images for packaging the full docker images with contiv install binaries
if [[ "$(docker images -q contiv/auth_proxy:$auth_proxy_version 2>/dev/null)" == "" || "$pull_images" == "true" ]]; then
@@ -147,17 +93,29 @@ curl --fail -sL -o $binary_cache/openvswitch-2.5.0-2.el7.x86_64.rpm http://cbs.c
curl --fail -sL -o $binary_cache/ovs-common.deb http://mirrors.kernel.org/ubuntu/pool/main/o/openvswitch/openvswitch-common_2.5.2-0ubuntu0.16.04.3_amd64.deb
curl --fail -sL -o $binary_cache/ovs-switch.deb http://mirrors.kernel.org/ubuntu/pool/main/o/openvswitch/openvswitch-switch_2.5.2-0ubuntu0.16.04.3_amd64.deb

# Copy the netplugin release into the binary cache for "full" installer
# Netplugin releases built locally based on a branch are named by their SHA,
# but there is a symlink pointing to the SHA-named tarball by its branch name
plugin_tball=${CONTIV_ARTIFACT_STAGING}/$CONTIV_NETPLUGIN_TARBALL_NAME
if [[ -L "${plugin_tball}" ]]; then
# copy the link (so other processes can find the tarball) and the tarball
target_plugin_tball=$(readlink ${plugin_tball})
cp -a ${plugin_tball} ${binary_cache}/
plugin_tball=${CONTIV_ARTIFACT_STAGING}/${target_plugin_tball}
fi
cp ${plugin_tball} ${binary_cache}/

env_file=$output_dir/install/ansible/env.json
sed -i.bak 's#__AUTH_PROXY_LOCAL_INSTALL__#true#g' "$env_file"
sed -i.bak 's#__CONTIV_NETWORK_LOCAL_INSTALL__#true#g' "$env_file"

echo "Ansible extra vars from env.json:"
cat $env_file
# Create the full tar bundle
tar czf $tmp_full_output_file -C $release_dir contiv-$VERSION

mv $tmp_output_file $output_file
mv $tmp_full_output_file $full_output_file
rm -rf $output_dir

echo "Success: Contiv Installer version $VERSION is available at $output_file"
tar czf $full_output_file -C $output_tmp_dir contiv-${CONTIV_INSTALLER_VERSION}
echo -n "Contiv Installer version '$CONTIV_INSTALLER_VERSION' with "
echo "netplugin version '$CONTIV_NETPLUGIN_VERSION' is available "
echo "at '$full_output_file', it includes all contiv assets "
echo "required for installation"
echo
echo -e "\nSuccess"
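The link-plus-target copy in the hunk above is subtle: the staging directory holds a SHA-named tarball plus a branch-named symlink, and the binary cache must receive both the symlink and the real file. A standalone demo (hypothetical file names, not the installer's actual paths):

```shell
# Demo of copying a symlinked artifact so consumers can find it by either name.
staging=$(mktemp -d)
cache=$(mktemp -d)
touch "$staging/netplugin-abc123.tar.bz2"    # SHA-named artifact (hypothetical)
ln -sf netplugin-abc123.tar.bz2 "$staging/netplugin-contiv-master.tar.bz2"
plugin_tball="$staging/netplugin-contiv-master.tar.bz2"
if [ -L "$plugin_tball" ]; then
    # copy the link itself (cp -a does not dereference), then retarget
    # plugin_tball at the real file the link points to
    target_plugin_tball=$(readlink "$plugin_tball")
    cp -a "$plugin_tball" "$cache/"
    plugin_tball="$staging/$target_plugin_tball"
fi
cp "$plugin_tball" "$cache/"
ls "$cache"    # both the branch-named symlink and the SHA-named tarball
```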
13 changes: 13 additions & 0 deletions scripts/download_ansible_repo.sh
@@ -0,0 +1,13 @@
#!/bin/bash

set -euo pipefail

ANSIBLE_REPO_DIR=${CONTIV_ARTIFACT_STAGING}/ansible

rm -rf $ANSIBLE_REPO_DIR

mkdir -p $ANSIBLE_REPO_DIR $CONTIV_ARTIFACT_STAGING

echo downloading ${CONTIV_ANSIBLE_OWNER}/ansible commit: $CONTIV_ANSIBLE_COMMIT
curl --fail -sL https://api.github.com/repos/${CONTIV_ANSIBLE_OWNER}/ansible/tarball/$CONTIV_ANSIBLE_COMMIT \
| tar --strip-components 1 -C $ANSIBLE_REPO_DIR -z -x
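The `tar --strip-components 1 -C` pattern used above works because GitHub tarballs wrap everything in a single `<owner>-<repo>-<sha>/` directory, which the strip removes on extraction. A self-contained demo (hypothetical directory names):

```shell
# Simulate a GitHub-style tarball and extract it flat into a target dir.
src=$(mktemp -d)
dest=$(mktemp -d)
mkdir -p "$src/contiv-ansible-4e67f54/roles"
echo "- hosts: all" > "$src/contiv-ansible-4e67f54/site.yml"
# pack with the wrapper dir, then unpack while stripping that first component
tar -czf - -C "$src" contiv-ansible-4e67f54 |
    tar --strip-components 1 -C "$dest" -z -x
ls "$dest"    # site.yml and roles/ land directly in $dest, no wrapper dir
```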
14 changes: 5 additions & 9 deletions scripts/legacy_swarm_test.sh
@@ -24,16 +24,12 @@ if [ "$ssh_key" == "" ]; then
ssh_key=$(vagrant ssh-config legacy-swarm-master | grep IdentityFile | awk '{print $2}' | xargs)
fi
popd

./scripts/unpack-installer.sh

# Extract and launch the installer
mkdir -p release
cd release
if [ ! -f "${install_version}.tgz" ]; then
# For release builds, get the build from github releases
curl -L -O https://github.com/contiv/install/releases/download/${BUILD_VERSION}/${install_version}.tgz
fi

tar oxf $install_version.tgz
cd $install_version
cd release/$install_version
./install/ansible/install_swarm.sh -f ../../cluster/.cfg_legacy-swarm.yaml -e $ssh_key -u $user -i

# Wait for CONTIV to start for up to 10 minutes
@@ -46,7 +42,7 @@ for i in {0..20}; do
cat <<EOF
NOTE: Because the Contiv Admin Console is using a self-signed certificate for this demo,
you will see a security warning when the page loads. You can safely dismiss it.

You can access the Contiv master node with:
cd cluster && vagrant ssh legacy-swarm-master
EOF
12 changes: 7 additions & 5 deletions scripts/prepare_netplugin_tarball.sh
@@ -39,13 +39,15 @@ git clone --branch ${NETPLUGIN_BRANCH} --depth 1 \

# run the build and extract the binaries
cd $netplugin_tmp_dir/netplugin
GIT_COMMIT=$(./scripts/getGitCommit.sh)
# gopath is set in the tar container, not used, but makefile requires it set
make GOPATH=${GOPATH:-.} BUILD_VERSION=${GIT_COMMIT} tar
# BUILD_VERSION (currently == devbuild) is in env, so clear it
declare +x BUILD_VERSION
# this is most likely just the SHA because we pulled only a single commit
NETPLUGIN_VERSION=$(./scripts/getGitVersion.sh)
BUILD_VERSION=${NETPLUGIN_VERSION} make tar

# move the netplugin tarball to the staging directory for the installer
mv netplugin-${GIT_COMMIT}.tar.bz2 \
mv netplugin-${NETPLUGIN_VERSION}.tar.bz2 \
${CONTIV_ARTIFACT_STAGING}/
# create a link so other scripts can find the file without knowing the SHA
cd ${CONTIV_ARTIFACT_STAGING}
ln -sf netplugin-${GIT_COMMIT}.tar.bz2 $CONTIV_NETPLUGIN_TARBALL_NAME
ln -sf netplugin-${NETPLUGIN_VERSION}.tar.bz2 $CONTIV_NETPLUGIN_TARBALL_NAME
21 changes: 14 additions & 7 deletions scripts/swarm_mode_test.sh
@@ -16,24 +16,31 @@ else
fi
user=${CONTIV_SSH_USER:-"$def_user"}

# If BUILD_VERSION is not defined, we use a local dev build, that must have been created with make release
install_version="contiv-${BUILD_VERSION:-devbuild}"
pushd cluster
ssh_key=${CONTIV_SSH_KEY:-"$def_key"}
if [ "$ssh_key" == "" ]; then
ssh_key=$(vagrant ssh-config swarm-mode-master | grep IdentityFile | awk '{print $2}' | xargs)
fi
popd

# Extract and launch the installer
mkdir -p release
cd release
if [ ! -f "${install_version}.tgz" ]; then
# For release builds, get the build from github releases
curl -L -O https://github.com/contiv/install/releases/download/${BUILD_VERSION}/${install_version}.tgz
# If BUILD_VERSION is not defined, we use a local dev build, that must have been created with make release
release_name="contiv-${BUILD_VERSION:-devbuild}"
release_tarball="${release_name}.tgz"
release_local_tarball="contiv-full-${BUILD_VERSION}.tgz"
if [ -f "${release_local_tarball}" ]; then
tar oxf "${release_local_tarball}"
Contributor:
Why are we using the full build for dev setups and the minimal build for released versions?

chrisplo (Author), Nov 2, 2017:
For installing here, it looks for the full build first (which both dev and release builds create); the fallback is that if the build wasn't made locally, then BUILD_VERSION was an actual release version that is available to download. As a side note, the dev build does not make the "small" installer because those dev versions are not downloadable.

Contributor:
Won't we always go to the else case for a release build? Unless the release build was built locally?

chrisplo (Author):
Locally, the full installer is always built by build.sh, but the small installer is built only when NETPLUGIN_BRANCH is not defined:
https://github.com/contiv/install/pull/281/files#diff-d99b26f864526f6f8f79861d49deb922R72

Anyone downloading and running this target without matching BUILD_VERSION to one that was built locally will not have a full installer, so we hit the else case; in that scenario it's expected that only release builds apply. The second -f check allows for reinstall without re-downloading the small release installer.

else
if [ ! -f "${release_tarball}" ]; then
# For release builds, get the build from github releases
curl -L -O https://github.com/contiv/install/releases/download/${BUILD_VERSION}/${release_name}.tgz
fi
tar oxf "${release_name}.tgz"
fi

tar oxf $install_version.tgz
cd $install_version
cd $release_name
./install/ansible/install_swarm.sh -f ../../cluster/.cfg_swarm-mode.yaml -e $ssh_key -u $user -p

# Wait for CONTIV to start for up to 10 minutes
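The tarball-selection fallback discussed in the review thread above can be sketched standalone: prefer a locally built "full" installer tarball, else fall back to the minimal release tarball that would be downloaded from GitHub releases (hypothetical simulation; no network access, the local build is faked):

```shell
# Simulate a dev build, then exercise the prefer-full / fall-back-to-release logic.
workdir=$(mktemp -d)
cd "$workdir"
BUILD_VERSION=devbuild
release_name="contiv-${BUILD_VERSION}"
release_local_tarball="contiv-full-${BUILD_VERSION}.tgz"
# fake the local dev build having produced the full tarball
mkdir "$release_name"
touch "$release_name/install.sh"
tar czf "$release_local_tarball" "$release_name"
rm -rf "$release_name"
if [ -f "$release_local_tarball" ]; then
    chosen=$release_local_tarball    # dev path: full tarball exists locally
else
    chosen="${release_name}.tgz"     # release path: would be curled from GitHub
fi
tar xzf "$chosen"
echo "$chosen"
```

With the fake full tarball present, the dev path is chosen and the extracted tree contains `contiv-devbuild/install.sh`.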