Linux Foundation Releng Documentation¶
Linux Foundation Release Engineering Documentation.
A collection of documentation for Linux Foundation Continuous Integration (LFCI).
Guides:
Environment Overview¶
Projects coming to The Linux Foundation (LF) for Continuous Integration (CI) services are generally given infrastructure similar to that of our other projects unless there is a good reason to deviate from this configuration.
This infrastructure gives all developers similar workflows, which makes it possible for the communities to help each other with how to work in the environment. Figure 1 shows the development workflow enabled.

Figure 1¶
The standard infrastructure configuration that we build for projects is in Figure 2.

Figure 2¶
This design allows services to be logically separated and moved around to different cloud providers based upon the needs of the project as well as costs related to operating in given clouds.
The basic configuration puts our CI systems along with artifact storage into a special DMZ Cloud which is where communities interact with the CI infrastructure itself. This cloud is then linked to a private dynamic instance cloud that has the ability to access the DMZ resources and external internet services, but not anything deeper into the core LF networks.
Services that are not dependent on being co-located with the CI infrastructure will be in a different cloud. While this cloud may be with the same provider, it may also be with a different one. We do this to ensure that the CI build infrastructure has as little capability as possible to exploit potential security issues in the code repository hosting in particular, but also in any other services hosted in the infrastructure.
Pre-formation¶
When services for a project are first being set up, the project will be in a pre-formation phase. During this phase most services allow only a non-public, restricted set of people to access the resources.
Pre-formation participants will receive invitations to one or more access groups. These groups have names that make clear they are for pre-formation access. While the group names include the demarcation of gerrit, this is an artifact of our naming of the groups that power Gerrit access rights. CI (Jenkins), issue tracking (JIRA), and wiki (Confluence) will have access granted via this same group.
Long-term storage of CI logs is not complete during this phase, as the log shipping mechanisms that we use for capturing the console logs require that the CI infrastructure be open to the public. To improve the log storage, as well as avoid potential issues with licensing for JIRA and Confluence, the recommendation for projects is to stay in pre-formation for as short a time as possible or, if possible, skip a restricted formation phase altogether.
Preparation for code seeds¶
Code contributed to a project as seed needs to meet the following criteria:
The code must pass any required Intellectual Property Review (IPR) that is in use by the project
The code must pass any licensing requirements related to the licensing used by the project
The code contribution must be a squash commit. This means losing history. The reasoning for this is that The Linux Foundation requires that any projects that we host conform to the Developer’s Certificate of Origin (DCO), indicated by a Signed-off-by commit message footer from the author of non-merge commits, which indicates that they have read and agree to the DCO.

Developer’s Certificate of Origin¶

Developer Certificate of Origin
Version 1.1

Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
1 Letterman Drive
Suite D4700
San Francisco, CA, 94129

Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.

Developer's Certificate of Origin 1.1

By making a contribution to this project, I certify that:

(a) The contribution was created in whole or in part by me and I have
    the right to submit it under the open source license indicated in
    the file; or

(b) The contribution is based upon previous work that, to the best of
    my knowledge, is covered under an appropriate open source license
    and I have the right under that license to submit that work with
    modifications, whether created in whole or in part by me, under
    the same open source license (unless I am permitted to submit
    under a different license), as indicated in the file; or

(c) The contribution was provided directly to me by some other person
    who certified (a), (b) or (c) and I have not modified it.

(d) I understand and agree that this project and the contribution are
    public and that a record of the contribution (including all
    personal information I submit with it, including my sign-off) is
    maintained indefinitely and may be redistributed consistent with
    this project or the open source license(s) involved.
Refer to https://developercertificate.org/ for the original text.
LF does not presently have, nor is it aware of, tooling that will allow us to properly scan incoming repository histories to verify that they meet this. Requiring a squash commit of seed code is the way that we can definitively enforce this.
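The squash requirement can be sketched in a throwaway repository. This is a minimal illustration, not the only way to do it; the branch name, file names, and author identity are made-up assumptions:

```shell
# Sketch: collapse an existing multi-commit history into one
# signed-off seed commit using an orphan branch.
# All names here are illustrative.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.name "Seed Author"
git config user.email seed@example.org

# Stand in for an existing multi-commit history.
echo one > a.txt && git add a.txt && git commit -q -m "first"
echo two > b.txt && git add b.txt && git commit -q -m "second"

# Re-create the same tree as a single commit on an orphan branch,
# with the DCO sign-off footer added by --signoff.
git checkout -q --orphan seed
git add -A
git commit -q --signoff -m "Seed code contribution"

git log --oneline   # the seed branch carries a single commit
```

The resulting branch contains the full tree but only one commit, whose Signed-off-by footer attests to the DCO.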
Post-formation¶
Once a project exits the pre-formation phase the following will happen:
Hosted services open to the public
The services inventory will get updated with the standard public services
Best Practices¶
Code Review¶
All patches that go into a project repo need to be code reviewed by someone other than the original author. Code review is a great way to both learn from others as well as improve code quality. Contribution to code review is highly recommended regardless of activity as a committer.
Below provides a simple checklist of common items that code reviewers should look out for (Patch submitters can use this to self-review as well to ensure that they are not hitting any of these):
General
Does the Git commit message sufficiently describe the change? (Refer to: https://chris.beams.io/posts/git-commit/ and https://fatbusinessman.com/2019/my-favourite-git-commit)
Does the Git commit subject line need to follow the Conventional Commit specification? (Refer to: https://www.conventionalcommits.org/ and https://gist.github.com/joshbuchea/6f47e86d2510bce28f8e7f42ae84c716)
If the change has an associated issue tracker, does the commit message have an ‘Issue: <someissue>’ (or any similar tag such as ‘Issue-Id:’ or ‘JIRA:’) in the footer and not in the subject line or body?
Is all meta-data in the footer? This includes the above point and any other key-value data pairings that are truly meta-data. Such as, but not limited to, Signed-off-by, Change-Id, Issue, Jira, Issue-Id, Bug, etc.
Are there any typos?
Are all code review comments addressed?
Is the code rebased onto the latest HEAD of the branch?
Does the code pull in any dependencies that might have license conflicts with this project’s license?
Is the Git commit body independent of the title? The first paragraph should not be a continued flow from the subject line but a paragraph that can stand on its own.
If the commit(s) bring important changes, does an appropriate ChangeLog file need to be created? (for example a reno YAML file for Releng)
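The footer conventions above can be demonstrated end to end in a scratch repository. The repository contents, issue ID, and wording below are illustrative assumptions, not a project convention:

```shell
# Demonstrates a commit message that follows the checklist:
# short subject, a body that stands on its own, and all
# meta-data (Issue, Signed-off-by) in the footer.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.name "Demo User"
git config user.email demo@example.org
echo change > file.txt && git add file.txt

# Commit with the message read from stdin.
git commit -q -F - <<'EOF'
Retry artifact uploads on transient failures

Uploads to the artifact server occasionally fail on network
timeouts. Retry up to three times with backoff before reporting
an error.

Issue: ABC-123
Signed-off-by: Demo User <demo@example.org>
EOF

git log -1 --format=%B
```

Note the subject does not carry the issue ID, and the body opens with a paragraph that makes sense without the subject line.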
Code
Are imports alphabetical and sectioned off by stdlib, 3rdparty, and local?
Are functions / methods organized alphabetically? (or categorized alphabetically)
Does the change need unit tests? (Yes, it probably does!)
Does the change need documentation?
Does every function added have function docs? (javadoc, pydoc, or whatever the language’s code documentation format is)
Does it pass linting?
Does the code cause backwards compatibility breakage? (If so it needs documentation)
Note
Refer to Google’s blog (google-blog-code-health) on effective code review.
Generic Linting (pre-commit)¶
pre-commit is a Git hooks management tool and is great for running linters for any language. The easiest way to run pre-commit is with python-tox, which requires Python 3 installed on the system:
tox -epre-commit
However, if you want a more automated experience, we recommend running pre-commit directly and installing the hooks so that they run automatically when you execute the git commit command. In this case, install pre-commit using your package manager, or pip install it if your distro does not have it available.
Requirements¶
Python 3
Python pre-commit
Install pre-commit¶
If pre-commit is not available from your native package manager, then you can install it via Python’s pip install command:
pip install --user pre-commit
pre-commit --help
Once installed, for every repo that you are working on, you can install the pre-commit hooks directly into your local Git hooks on a per-repo basis.
pre-commit install
Set up pre-commit for a Project¶
Requirements
Python 3
Python pre-commit
Python Tox
Configure the project with a tox.ini and a .pre-commit-config.yaml file. Below are examples of .pre-commit-config.yaml and tox.ini as defined by this project. Inside the tox.ini file the interesting bits are under [testenv:pre-commit].
.pre-commit-config.yaml
---
default_language_version:
  python: python3

repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: check-executables-have-shebangs
      - id: check-merge-conflict
      - id: end-of-file-fixer
      - id: trailing-whitespace

  - repo: https://github.com/jorisroovers/gitlint
    rev: v0.18.0
    hooks:
      - id: gitlint

  - repo: https://github.com/ambv/black
    rev: 23.3.0
    hooks:
      - id: black

  - repo: https://github.com/PyCQA/flake8
    rev: 6.0.0
    hooks:
      - id: flake8
        args: ["--max-line-length=120"]
        additional_dependencies: [Flake8-pyproject]

  - repo: https://github.com/pycqa/bandit
    rev: 1.7.5
    hooks:
      - id: bandit
        # Bandit does not need to run on test code
        exclude: tests/*

  - repo: https://github.com/pycqa/pydocstyle
    rev: 6.3.0
    hooks:
      - id: pydocstyle

  - repo: https://github.com/btford/write-good
    rev: v1.0.8
    hooks:
      - id: write-good
        files: "\\.(rst|md|markdown|mdown|mkdn)$"
        exclude: docs/infra/gerrit.rst|docs/best-practices.rst
tox.ini
[tox]
minversion = 1.6
envlist =
    check-best-practices,
    docs,
    docs-linkcheck,
    pre-commit
skipsdist=true

[testenv]
basepython = python3
install_command=python -m pip install --no-cache-dir {opts} {packages}

[testenv:check-best-practices]
basepython = python3
commands = python {toxinidir}/check-best-practices.py

[testenv:docs]
basepython = python3
deps = -rrequirements.txt
commands =
    sphinx-build -q -j auto -W -b html -n -W -d {envtmpdir}/doctrees ./docs/ {toxinidir}/docs/_build/html

[testenv:docs-linkcheck]
basepython = python3
deps = -rrequirements.txt
commands = sphinx-build -q -j auto -W -b linkcheck -d {envtmpdir}/doctrees ./docs/ {toxinidir}/docs/_build/linkcheck

[testenv:pre-commit]
basepython = python3
allowlist_externals =
    /bin/sh
deps =
    pre-commit
passenv = HOME
commands =
    pre-commit run --all-files --show-diff-on-failure
    /bin/sh -c 'if ! git config --get user.name > /dev/null; then \
        git config --global --add user.name "CI"; \
        touch .git/REMOVE_USERNAME; fi'
    /bin/sh -c 'if ! git config --get user.email > /dev/null; then \
        git config --global --add user.email "ci@example.org"; \
        touch .git/REMOVE_USEREMAIL; fi'
    /bin/sh -c "if [ -f .git/COMMIT_EDITMSG ]; then \
        cp .git/COMMIT_EDITMSG .git/COMMIT_MSGTOX; else \
        git log HEAD -n1 | tail -n +5 | cut -c 5- > .git/COMMIT_MSGTOX; fi"
    pre-commit run gitlint --hook-stage commit-msg --commit-msg-filename .git/COMMIT_MSGTOX
    /bin/sh -c "rm -f .git/COMMIT_MSGTOX"
    /bin/sh -c "if [ -f .git/REMOVE_USERNAME ]; then \
        git config --global --unset user.name; \
        rm -f .git/REMOVE_USERNAME; fi"
    /bin/sh -c "if [ -f .git/REMOVE_USEREMAIL ]; then \
        git config --global --unset user.email; \
        rm -f .git/REMOVE_USEREMAIL; fi"
Jenkins Job Builder¶
GitHub Workflow¶
When working directly on GitHub (as opposed to Gerrit systems mirrored to GitHub), you’ll need to create a fork and use branches and pull requests to get changes merged into the main repo. Here are some instructions on creating and maintaining your fork.
Forking and working¶
Fork your $PROJECT/$REPO to your personal GitHub account.
NOTE: if you are forking the ci-management repository, consider renaming your local fork to $PROJECT-ci-management after you have forked it. This has 2 benefits:
It lets you know which upstream project the ci-management repo is for.
It allows you to fork the next ci-management repository that you might need to work on.
Clone your repo
git clone git@github.com:$MYACCOUNT/$REPO
Setup an upstream remote
git remote add upstream git@github.com:$PROJECT/$REPO
Create local branch and do your work (same as with gerrit)
git checkout -b $feature
Push your local branch to your fork, preferably as a branch on your fork
git push origin $feature
Raise PR against the upstream (note: when pushing a branch from your local to your fork the CLI gives you a URL for raising the PR)
Care and feeding of your fork¶
Your fork will fall out of sync with the upstream repo, so be sure to tend to your fork before working on it.
Fetch upstream changes from the remote you’ve already added:
git fetch upstream
Switch to primary branch and refresh your master branch
git checkout master && git pull --rebase upstream master
Update Github with your synced fork:
git push origin
Ansible Guide¶
This guide documents the process to setup and manage a new Ansible role.
Ansible Roles¶
Ansible roles are a collection of Ansible vars_files, tasks, and handlers packaged into a single package for easy distribution and reuse.
Refer to the upstream Ansible Roles documentation for details.
Ansible Galaxy¶
Ansible galaxy is an online hub for finding, reusing and sharing Ansible Content. We use it to share and pull down Ansible roles.
Molecule¶
Molecule is a test framework for testing Ansible roles. We use it to ensure the role supports all supported distros.
Requirements¶
Install the requirements in a virtualenv:
pip install ansible docker-py molecule
Set up an Ansible Role¶
Create a repo to store the role
Init role using Ansible galaxy:
# Replace ROLE_NAME with the name of your role
ansible-galaxy init ROLE_NAME --force
Note
The ansible-galaxy command creates a directory named ROLE_NAME, so call it from outside the repo directory and pass it the name of the repository.
Change directory into the ROLE_NAME directory
Create a .gitignore:
.molecule/
.tox/
__pycache__/
*.pyc
Add molecule test:
molecule init scenario -r ROLE_NAME
Add molecule test to tox.ini:
[tox]
minversion = 1.6
envlist = molecule
skipsdist=true

[testenv:molecule]
basepython = python2
deps =
    ansible
    docker-py
    molecule
passenv = *
commands = ./molecule.sh
Add a molecule.sh script. Replace ROLE_NAME with the name of your role.
# SPDX-License-Identifier: EPL-1.0
#!/bin/bash
# SPDX-License-Identifier: EPL-1.0
##############################################################################
# Copyright (c) 2018 The Linux Foundation and others.
#
# All rights reserved. This program and the accompanying materials
# are made available under the terms of the Eclipse Public License v1.0
# which accompanies this distribution, and is available at
# http://www.eclipse.org/legal/epl-v10.html
##############################################################################

# If running in Jenkins we need to symlink the workspace so that
# ansible can pick up the role.
if [ ! -z "$JENKINS_URL" ]; then
    ln -sf "$WORKSPACE" ../ROLE_NAME
fi

molecule test --destroy=always
Make the molecule.sh script executable:
chmod +x molecule.sh
Run molecule test:
tox -e molecule
Note
Resolve any molecule test errors before moving on.
Edit meta information in meta/main.yml
Edit README.md with relevant information about the new role
Git commit the repo:
git add .
git commit -sm "Add role ROLE_NAME"
Git Guide¶
Git is the most commonly used distributed version control system for anyone working with a code repository. Git allows the user to create a local copy of the remote repository and sync changes onto the server.
Git is available as a set of CLI tools on different platforms to perform operations such as initialize, add, commit, pull and push on the repository and more advanced operations such as branch, merge and rebase. Git works with GitHub, GitLab and Gerrit workflows.
Prerequisites¶
Install Git.
For Debian-based systems:
sudo apt-get install git -y
For RPM-based systems:
sudo dnf install git -y
For macOS systems, install Homebrew, then install Git:
brew install git
Setup Git config¶
To change the author name or email used to sign off a commit, use the commands:
git config --local user.name "Your Name"
git config --local user.email yourname@example.com
To change the Git commit editor to vim, run the command:
git config --global core.editor "vim"
To sign commits by default:
git config --global commit.gpgsign true
To set the default gpg2 program:
git config --global gpg.program $(which gpg2)
Sample .gitconfig¶
Sample $HOME/.gitconfig
with other useful settings.
[user]
name = <User Name>
email = user@example.com
signingkey = XXXXXXXXXXXXXXXXX
[core]
editor = vim
pager = less -R
[credential]
helper = cache --timeout=3600
[gitreview]
username = askb
[color]
ui = auto
[rerere]
enabled = true
[commit]
gpgsign = true
[gpg]
program = /usr/bin/gpg2
[push]
sign = if-asked
[status]
submoduleSummary = false
[alias]
co = checkout
Clone a repository¶
To clone a Git repository.
git clone ssh://<user_name>@git.example.org:29418/<repository>.git
Note
Use the --recurse-submodules option if the repository has Git submodules.
Auto Generate Change IDs¶
To generate a change-id automatically for each patch:
git clone ssh://USERNAME@gerrit.linuxfoundation.org:29418/releng/docs
scp -p -P 29418 USERNAME@gerrit.linuxfoundation.org:hooks/commit-msg docs/.git/hooks/
Pull Down New Source¶
To pull updates from the remote repository and rebase changes on your local branch.
git pull --rebase
Repository status¶
To check the status of the repository.
git status
Create a branch¶
To create a local branch from master.
git checkout -b <branch-name> origin/master
List branches¶
To see the available list of branches
git branch
Switching between branches¶
To switch between a branch and the master within your repository.
git checkout <branch-name>
git checkout master
Delete local branch¶
To delete a local branch (not the active one). This is typically done:
when a patch has merged.
when a review has completed.
git branch -d <branch-to-delete>
If the above does not work, do a force delete. Before performing a force delete, analyze and check why the normal delete did not work.
git branch -D <branch-to-delete>
Add a file¶
To stage a file modified in your local repository.
git add <path/to/file>
Commit a change¶
To commit a change to your local repository.
git add <path/to/file>
git commit --signoff --gpg-sign
Note
The --signoff (or -s) option adds a “Signed-off-by” line in the commit footer. The --gpg-sign (or -S) option signs the commit with the GPG key.
Amend a change¶
To amend a change in your local repository.
git add <path/to/file>
git commit --amend
Note
The --signoff (or -s) option adds a “Signed-off-by” line in the commit footer. The --gpg-sign (or -S) option signs the commit with the GPG key.
Discard a change¶
To discard changes introduced in the last commit.
git reset --hard HEAD~1
Cherry-pick a commit¶
To copy a commit between branches, use the git cherry-pick command.
git checkout <from-branch>
git log # note <commit-id> from the output
git checkout <to-branch>
git cherry-pick <commit-id> # use the <commit-id> noted earlier
Stash changes¶
To stash your work temporarily and move between branches.
git stash # stash the modified files temporarily
git checkout <new-branch>
git stash pop # pop the modified changes
Log of recent changes¶
To view a log of the recent changes.
git log
To partially revert changes in a commit¶
To revert changes to one or more files in a commit.
git log # note the <commit-id>
git show <commit-id> -- <file> | git apply -R # Revert the <file> in <commit-id>
git add <file>
git commit --signoff --gpg-sign --amend
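The pipeline above can be exercised end to end in a scratch repository. The file names below are illustrative, and the GPG flag is omitted in this sketch so it runs without a signing key:

```shell
# Revert one file out of a two-file commit, then amend the commit.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.name "Demo"
git config user.email demo@example.org

echo base > keep.txt && echo base > undo.txt
git add . && git commit -q -m "base"
echo change >> keep.txt && echo change >> undo.txt
git add . && git commit -q -sm "change both files"

# Reverse-apply only undo.txt's part of the last commit,
# then fold that reversal back into the commit.
git show HEAD -- undo.txt | git apply -R
git add undo.txt
git commit -q --amend --no-edit

git show --stat HEAD   # the amended commit now touches only keep.txt
```

After the amend, undo.txt is back to its pre-commit content while keep.txt retains its change.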
Git Merge Conflicts¶
During rebase with master, a merge conflict might occur.
Open the conflicted file in an editor
Search for “<<<<”
Observe the code between “<<<<” to “>>>>” and delete wrong parts (including <<<<, ====, >>>>)
When done, add file and continue rebase.
git add <modified file>
git rebase --continue
Continue this process until the rebase has completed.
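The resolution loop above can be reproduced in a throwaway repository. The branch names and file contents here are illustrative assumptions:

```shell
# Create a rebase conflict, resolve it, and continue the rebase.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.name "Demo"
git config user.email demo@example.org
base=$(git symbolic-ref --short HEAD)   # master or main, depending on git version

echo original > file.txt && git add . && git commit -q -m "base"

git checkout -q -b feature
echo feature-change > file.txt && git commit -qam "feature edit"

git checkout -q "$base"
echo base-change > file.txt && git commit -qam "base edit"

# Rebasing feature onto the base branch now conflicts on file.txt.
git checkout -q feature
git rebase "$base" || true
grep '<<<<' file.txt > /dev/null   # conflict markers are present

# Resolve: keep the wanted content and delete the markers, then continue.
echo feature-change > file.txt
git add file.txt
GIT_EDITOR=true git rebase --continue
```

Once the last conflicted commit is replayed, the rebase completes and the branch history is linear on top of the base branch.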
Workflow Sample 1¶
Existing patch with comments in Gerrit, or a new patch.
Clone the Git repository.
Please look at Clone a repository.
Download an existing patch, or create a new one.
Download an existing patch and rebase:
git review -d <Gerrit patch number>
git fetch origin
git rebase origin/master
Create a new patch/branch.
git checkout -b my_special_fix
Correct the patch: code, unit tests, release documents, and the commit message.
Run tox locally (if applicable) to ensure unit tests and lint are passing with no errors.
tox
Go back to previous step and correct any issues reported by tox.
Add files to Git.
git add <each individual file>
Commit files. If committing for the first time:
git commit --signoff --gpg-sign --verbose
If not committing for the first time:
git commit --amend
Rebase against master.
git fetch origin
git rebase origin/master
If merge conflict occurs, solve this as in Git Merge Conflicts and repeat previous two steps.
Push changes to Gerrit.
git review
Clean up. When the patch has merged, delete the branch.
Follow instructions in Delete local branch
Workflow Sample 2¶
How to manage a big script, by submitting smaller patches which are depending on each other.
Analyze the code:
Find code blocks that are small with no dependencies.
Find code blocks that are small with dependencies on previous code.
For instance: each function by itself, common declarations, each class by itself, etc.
Key areas:
Each patch builds on the previous patch.
Each patch contains unit test code to fully test the new code in this patch.
Each patch passes all tox checks.
First patch : Do Workflow Sample 1
Next patch:
Add the code for the next patch.
Submit it according to Workflow Sample 1 (from the Correct the patch step).
Remember to do ‘git commit --signoff --gpg-sign --verbose’ to submit a new patch.
Go to the previous step, until all patches submitted.
Now you should have a set of patches, like 1, 2, 3, 4, 5, which all build on each other.
Workflow Sample 3¶
How to change a patch set.
To change the patch set (one or more).
Ensure that master is up to date.
git checkout master
git fetch origin
git rebase origin/master
Checkout, and rebase.
git review -d <my_patch_set last patch number>
git rebase origin/master
Rebase interactive.
git rebase -i
Change from ‘pick’ to ‘edit’ for the patches to be reviewed/modified.
Change files.
Add, and continue with rebase.
git add <modified file>
git rebase --continue
Repeat the previous two steps until the rebase finishes.
Rebase one final time against the latest master.
git fetch origin
git rebase origin/master
Time to submit patch.
git review
Workflow Sample 4¶
How to download an earlier version of the patch set and push it as the latest version.
git review -d <my_patch_set last patch number>,<second last patch set no>
git review
Alternative
git pull <https link to the last patch, second last patch set no>
git review
Example: There are 5 different versions of patch 13734.
Example with git review:
git review -d 13734,4
Example with git pull:
git pull https://gerrit.linuxfoundation.org/infra/releng/lftools refs/changes/34/13734/4
Gerrit Guide¶
Gerrit is an open source, web-based collaborative code review tool that integrates with Git. Gerrit provides a framework for reviewing code commits before they merge into the code base. Changes are not made part of the project until a code review completes. Gerrit is also a good collaboration tool for storing the conversations that occur around code commits.
Note
Here is more information on Gerrit
Prerequisites¶
Before you get started, you should have:
an LFID account (sign up here)
git installed (see: http://www.git-scm.com/downloads)
git configured with your name, e-mail address and editor
git config --global user.name "Firstname Lastname"
git config --global user.email "email@address.com"
git config --global core.editor "text-editor-name"
Note
Your name and e-mail address (including capitalization) must match what you entered when creating your LFID account.
an ssh public/private key pair (see the good GitHub docs on generating ssh keys)
registration on the Gerrit server (see Register your SSH key with Gerrit below for detailed instructions)
git-review installed (see: https://www.mediawiki.org/wiki/Gerrit/git-review#Installation)
Clone the code¶
Cloning the code into a local workspace can happen via HTTP or SSH.
Make sure your Gerrit settings are up to date with correct SSH and GPG keys.
In the project’s Gerrit instance, we can see the HTTP and SSH commands. From the left side menu, select Projects->List->Select any project->General.
Copy the desired clone command and paste it in your terminal.
SSH Clone¶
This option provides a more secure connection. Always use SSH for pushing code unless you are on a network that prevents SSH usage; in such a case, use HTTPS.
Note
For more information on how to generate the public/private key pair see Generating SSH keys for your system and Register your SSH key with Gerrit
Note
The SSH clone option will not appear if the settings are not updated with the correct SSH keys.
Browse for the project’s General information.
Click on the ssh tab.
Clone desired repo. For example:
git clone ssh://USERNAME@gerrit.linuxfoundation.org:29418/releng/docs
Note
Because we are constantly uploading new code into the repositories, we recommend SSH clones, since the remotes for pushing code get configured appropriately.
Anonymous HTTP Clone¶
Recommended if the intention is to view code and not make any contributions:
Browse the project and click General.
Click the anonymous http tab.
Clone the desired repo. For example:
git clone https://gerrit.linuxfoundation.org/releng/docs
Authenticated HTTP Clone¶
This works everywhere, even behind a proxy or a firewall.
Get the password by clicking on the username on the top right->Settings-> HTTP Password->Generate Password
Browse for the project and click General.
Click the http tab.
Clone the desired repo. For example:
git clone https://USERNAME@gerrit.linuxfoundation.org/infra/a/releng/docs
Follow the user/password prompts.
Note
For Gerrit < 2.14 the HTTP password is not the same as the Linux Foundation ID password.
Note
For Gerrit with HTTP configuration, the HTTP Password is in the User Name (Top right corner) -> Settings -> HTTP Password -> Generate Password.
Clone with commit-msg hook¶
Both SSH and HTTP clone options have a clone with commit-msg hook which adds a hook to handle the Change-Id field in the footer of the commit message.
Browse for the project and click General.
Click Clone with commit-msg hook. For example:
git clone ssh://USERNAME@gerrit.linuxfoundation.org:29418/releng/docs
scp -p -P 29418 USERNAME@gerrit.linuxfoundation.org:hooks/commit-msg docs/.git/hooks/
OR
curl -Lo .git/hooks/commit-msg https://gerrit.linuxfoundation.org/infra/tools/hooks/commit-msg
Note
The hook implementation is intelligent at inserting the Change-Id line before any Signed-off-by or Acked-by lines placed at the end of the commit message by the author, but if no lines are present then it will insert a blank line, and add the Change-Id at the bottom of the message.
If a Change-Id line is already present in the message footer, the script will do nothing, leaving the existing Change-Id unmodified. This permits amending an existing commit, or allows the user to insert the Change-Id manually after copying it from an existing change viewed on the web.
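For illustration, after the hook runs a commit message might look like this, with the Change-Id placed above the sign-off. The Change-Id value and wording here are made up:

```
Add retry logic to artifact upload

Retry transient upload failures before reporting an error.

Change-Id: I7e1c2a9b8d4f6e0c3a5b7d9f1e2c4a6b8d0f2e4c
Signed-off-by: Demo User <demo@example.org>
```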
(Optional). To prevent the Change-Id addition, set gerrit.createChangeId to false in the git config.
Push patches to Gerrit¶
Open a shell to the directory containing the project repo
Create a local working branch, based on the branch you would like to make changes to.
git fetch origin
git checkout -b new_feature_branch origin/master
Replace origin/master with whichever remote/branch you need to contribute to. Typically master is the latest development branch.
Make the modifications you would like to change in the project
Stage the modified files for commit. (Repeat for all files modified)
git add /path/to/file
Verify the staged files by running
git status
Commit the staged files by amending the patch
git commit -s
Note
The ‘-s’ argument signs the commit message with your name and email and is a statement that you agree to the Developer’s Certificate of Origin.
Push the patch to Gerrit using one of the 2 methods documented:
Push using git review¶
We recommend using git-review if possible as it makes working with Gerrit much easier.
Install git-review via your local package management system.
If your distro does not package git-review, or you need a newer version, install it via PyPI in a virtualenv environment:
virtualenv ~/.virtualenvs/git-review
pip install git-review
Push the patch to Gerrit
git review
We can optionally pass the parameter -t my_topic to set a topic in Gerrit. This is useful when we have related patches to organize under one topic.
Once pushed we should see some output in the terminal as described in Gerrit Push Output.
Push using git push¶
This method is a useful fallback in situations where we cannot use git-review.
Use the following command:
git push <remote> HEAD:refs/for/master
Where <remote> is the Gerrit location to push the patch to. Typically ‘origin’ but can also be ‘gerrit’ depending on how we have our local repo setup.
Note
Notice that the word “for” explicitly intends the push to go into Gerrit. Using “heads” instead will attempt to push directly into the repository, bypassing Gerrit, which can come in handy in some isolated cases (when you have force-push rights). Another variant commonly used is “refs/changes/<gerrit-number>”, which is an explicit way of updating an existing Gerrit change. In such a case, it is best to let Gerrit handle this via the Change-Id in the commit text.
More options for this command: git-push.
Once pushed we should see some output in the terminal as described in Gerrit Push Output.
Push output¶
After pushing a commit to Gerrit we should see the following output:
(releng) cjac@probook0:/usr/src/git/lf/gerrit.linuxfoundation.org/releng/docs$ git review
remote: Processing changes: updated: 1, refs: 1, done
remote:
remote: Updated Changes:
remote: https://gerrit.linuxfoundation.org/infra/7404 documentation on the topic of git-review
remote:
To ssh://gerrit.linuxfoundation.org:29418/releng/docs.git
* [new branch] HEAD -> refs/publish/master/git-review-docs
This output includes a URL to the patch. The number at the end is the patch’s change number.
Update an existing patch¶
In a healthy Open Source project code reviews will happen and we will need to amend the patches until reviewers are happy with the change. This section will walk through the process of updating a patch that is already in Gerrit Code Review.
Open a shell to the directory containing the project repo
Pull the latest version of the patch from Gerrit
git review -d ${change_number}
The change number is in the URL of the Gerrit patch. For example, if the URL is https://git.opendaylight.org/gerrit/75307 then run git review -d 75307 to pull the corresponding changes.
(Optional) View information on the latest changes made to that patch.
To view the edited files, run git show.
To view a listing of the edited files and the number of lines in those files, run git show --stat.
Rebase the patch before you start working on it
git pull --rebase
This is to ensure that the patch incorporates the latest version of the repo and is not out of date.
Make the necessary changes to the patch with your favorite editor
Check the state of the repo by running
git status
Stage the modified files for commit. (Repeat for all files modified)
git add /path/to/file
Verify the staged files by running
git status
Commit the staged files by amending the patch
git commit --amend
Update the current patch description and then save the commit request.
If you feel you contributed enough work to the patch, add your name to the commit message footer with a line
Co-Authored-By: Firstname Lastname <email>
Rebase the patch one last time
git pull --rebase
This is to ensure that the patch incorporates the latest version of the repo and is not out of date.
Submit your files for review:
git review
You will receive 2 emails from Gerrit Code Review: the first indicating that a build to incorporate your changes has started; and the second indicating whether the build passed or failed. Refer to the console logs if the build has failed and amend the patch as necessary.
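The amend steps above can be exercised in a throwaway repository with no Gerrit involved. This sketch (file names illustrative) shows that git commit --amend replaces the previous commit rather than stacking a new one, and that a Change-Id footer survives the amend so Gerrit can match the new patchset to the existing change:

```shell
#!/bin/sh
# Demonstrate that --amend rewrites the last commit instead of adding
# a new one, preserving the Change-Id footer that Gerrit keys on.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email you@example.org
git config user.name "Example User"

echo "first draft" > file.txt
git add file.txt
git commit -qm "Add file

Change-Id: I0123456789abcdef0123456789abcdef01234567"

# Address review feedback, then amend rather than commit anew.
echo "second draft" > file.txt
git add file.txt
git commit -q --amend --no-edit

git rev-list --count HEAD          # still a single commit
git log -1 --format=%B | grep Change-Id
```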
Update a patch with dependent changes¶
In the case where a patch depends on another in review Gerrit patch, we will need to rebase the commit against the latest patchset of the dependent change in Gerrit. The best way to do this is to retrieve the latest version of the dependent change and then cherry-pick our patch on top of the change.
Fetch latest parent patch set
git review -d <parent-gerrit-id>
Cherry-pick our patch on top
git review -x <patch-gerrit-id>
Push patch back up to Gerrit
git review -R
Rebasing a change against master¶
In the case that your patchset cannot be rebased via the UI (merge conflict):
git pull origin master
git review -d 12345
git rebase master
# fix conflicts
git add *
git rebase --continue
git review
Code Review¶
All contributions to Git repositories use Gerrit for code review.
The code review process provides constructive feedback about a proposed change. Committers and interested contributors will review the change, give their feedback, propose revisions and work with the change author through iterations of the patch until it’s ready for merging.
Managing and providing feedback for the change happens via Gerrit web UI.

Gerrit wide view.¶
Pre-review¶
Change authors may want to push changes to Gerrit before they are actually ready for review, and this is an encouraged good practice. Experienced community members find that pushing early and often tends to reduce the overall amount of work and keeps code reviews speedy.
Note
This is not required and in some projects, not encouraged, but the general idea of making sure patches are ready for review when submitted is a good one.
Note
While in draft state, Gerrit triggers, e.g. verify Jenkins jobs, won't run by default. You can trigger them despite the draft state by adding “Jenkins CI” (or the corresponding Jenkins automation account) as a reviewer. You may need to reply with a comment containing recheck to trigger the jobs after adding the reviewer.
To mark an uploaded change as not ready for attention by committers and interested contributors, either (in order of preference) mark the change as a draft (by adding -D to your git review command), vote -1 on it yourself, or edit the commit message to start with “WIP” (“Work in Progress”).
Do not add committers to the Reviewers list for a change while in the pre-review state, as it adds noise to their review queue.
Review¶
Once an author wants a change reviewed, they need to take some actions to put it on the radar of the committers.
If the change is a draft, you'll need to publish it. Do this from the Gerrit web UI.

Gerrit Web UI button to publish a draft change.¶
Remove your -1 vote if you've marked it with one. If you think the patch is ready for merge, vote +1. If there is no automated job to test your change and vote +1/-1 for Verified, you will need to do as much testing yourself as possible and then manually vote +1 for Verified. Reviewers can additionally vote +1 for Verified alongside automated jobs. Describing the testing you did or did not do is typically helpful.

Gerrit voting interface, exposed by the Reply button.¶
Once the change gets published and you have voted for merging, add the people who need to review/merge the change to the Gerrit Reviewers list. The auto-complete for this Gerrit UI field is somewhat flaky, but typing the full name from the start typically works.

Gerrit Reviewers list with Int/Pack committers added¶
Reviewers will give feedback via Gerrit comments or inline against the diff.

Gerrit inline feedback about a typo¶
Updated versions of the proposed change get pushed as new patchsets to the same Gerrit, either by the original submitter or other contributors. Amending proposed changes owned by others while reviewing may be more efficient than documenting the problem, -1ing, waiting for the original submitter to make the changes, re-reviewing and merging.
Download changes for local manipulation and re-upload updates via git-review.
See Update an existing patch above. Once you have re-uploaded the patch the Gerrit web UI for the proposed change will reflect the new patchset.

Gerrit history showing a patch update¶
Reviewers will use the diff between the last time they gave review and the current patchset to understand updates, speeding the code review process.

Gerrit diff menu¶
Iterative feedback continues until reaching consensus (typically: all active reviewers +1/+2 and no -1s nor -2s), at least one committer +2s and a committer merges the change.

Gerrit code review votes¶
Merge¶
Once a patch has gotten a +2 from a committer and they have clicked the submit button the project’s merge job should run and publish the project’s artifacts to Nexus. Once completed, other projects will be able to see the results of that patch.
This is important when merging dependent patches across projects. You will need to wait for the merge job to run on one patch before any patches in other projects that depend on it will successfully verify.
Set up Gerrit¶
Generating SSH keys for your system¶
You must have SSH keys for your system to register with your Gerrit account. The method for generating SSH keys is different for different types of operating systems.
The key you register with Gerrit must be identical to the one you will use later to pull or edit the code. For example, if you have a development VM which has a different UID login and keygen than that of your laptop, the SSH key you generate for the VM is different from the laptop. If you register the SSH key generated on your VM with Gerrit and do not reuse it on your laptop when using Git on the laptop, the pull fails.
Note
Here is more information on SSH keys for Ubuntu and more on generating SSH keys
For a system running Ubuntu operating system, follow the steps below:
Run the following command:
mkdir ~/.ssh
chmod 700 ~/.ssh
ssh-keygen -t rsa
Save the keys, and add a passphrase for the keys.
This passphrase protects your private key stored in the hard drive. You must use the passphrase to use the keys every time you need to login to a key-based system:
Generating public/private rsa key pair.
Enter file in which to save the key (/home/b/.ssh/id_rsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your public key is now available as .ssh/id_rsa.pub in your home folder.
Register your SSH key with Gerrit¶
Using a Google Chrome or Mozilla Firefox browser, go to gerrit.<project>.org
Click Sign In to access the repositories.
Sign into Gerrit¶
Click your name in the top right corner of the window and then click Settings.
The Settings page.
Settings page for your Gerrit account¶
Click SSH Public Keys under Settings.
Click Add Key.
In the Add SSH Public Key text box, paste the contents of your id_rsa.pub file and then click Add.
Adding your SSH key¶
To verify your SSH key, try using an SSH client to connect to Gerrit’s SSHD port:
$ ssh -p 29418 <sshusername>@gerrit.<project>.org
Enter passphrase for key '/home/cisco/.ssh/id_rsa':
**** Welcome to Gerrit Code Review ****
Submit a patch over HTTPS¶
While we recommend you submit patchsets over SSH some users may need to submit patchsets over HTTPS due to corporate network policies such as the blocking of high range ports or outgoing SSH.
To submit code to Gerrit over HTTPS follow these steps.
Note
This guide uses the Linux Foundation Gerrit server and the releng/docs project as an example. Differences may vary with other Gerrit servers.
Configure your Machine¶
Generate a HTTPS password
Note
Required when uploading patches to Gerrit servers via HTTPS.
Navigate to https://gerrit.linuxfoundation.org/infra/#/settings/http-password and click Generate Password. Write this to the file .netrc in your home directory excluding the angle brackets:
machine gerrit.linuxfoundation.org user <username> password <http-password>
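Since the .netrc file stores a password in plain text, restrict it to your own user. A minimal sketch (the machine line uses the placeholder values from above):

```shell
#!/bin/sh
# Append the Gerrit HTTPS credentials to ~/.netrc and lock the file down,
# since it contains a password. <username> and <http-password> are
# placeholders for your own values.
set -e
netrc="$HOME/.netrc"
touch "$netrc"
chmod 600 "$netrc"
cat >> "$netrc" << 'EOF'
machine gerrit.linuxfoundation.org user <username> password <http-password>
EOF
```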
Clone the repository over HTTPS using your Linux Foundation ID
git clone https://bramwelt@gerrit.linuxfoundation.org/infra/releng/docs
Download the commit-msg git hook
curl -Lo .git/hooks/commit-msg \
  https://gerrit.linuxfoundation.org/infra/tools/hooks/commit-msg && \
  chmod +x .git/hooks/commit-msg
Due to a bug in git-review, you need to download the commit-msg hook manually to the .git/hooks/ directory or
git-review -s
will fail.
Configure the Repository¶
Because git-review attempts to use SSH by default, you need to configure the git-review scheme and port through git-config in the repository.
Note
The Gerrit context path on the Linux Foundation Gerrit server is infra/. Other Gerrit servers may use gerrit/ or r/.
Perform the following commands
cd docs/
git config gitreview.scheme https
git config gitreview.port 443
git config gitreview.project infra/releng/docs
Verify the configuration by running the following command:
git review -s
If successful, the command will not print anything to the console, and you will be able to submit code with:
git review
Otherwise, git-review will still request your Gerrit username, indicating a configuration issue. You can check the configuration using verbose output:
git review -v -s
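You can also inspect the stored values directly with git config. This sketch sets the keys in a scratch repository and lists them back:

```shell
#!/bin/sh
# Set the git-review keys in a scratch repo and list them back.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config gitreview.scheme https
git config gitreview.port 443
git config gitreview.project infra/releng/docs
git config --get-regexp '^gitreview\.'
# gitreview.scheme https
# gitreview.port 443
# gitreview.project infra/releng/docs
```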
Sign Gerrit Commits¶
Generate your GPG key.
The following instructions work on a Mac, but the general approach should be the same on other OSes.
brew install gpg2
# If you don't have homebrew, get that here: http://brew.sh/
gpg2 --gen-key
# pick 1 for "RSA and RSA"
# enter 4096 to create a 4096-bit key
# enter an expiration time, I picked 2y for 2 years
# enter y to accept the expiration time
# pick O or Q to accept your name/email/comment
# enter a pass phrase twice. it seems like backspace doesn't work, so type carefully
gpg2 --fingerprint
# you'll get something like this:
# spectre:~ ckd$ gpg2 --fingerprint
# /Users/ckd/.gnupg/pubring.gpg
# -----------------------------
# pub   4096R/F566C9B1 2015-04-06 [expires: 2017-04-05]
#       Key fingerprint = 7C37 02AC D651 1FA7 9209 48D3 5DD5 0C4B F566 C9B1
# uid       [ultimate] Colin Dixon <colin at colindixon.com>
# sub   4096R/DC1497E1 2015-04-06 [expires: 2017-04-05]
# you're looking for the part after 4096R, which is your key ID
gpg2 --send-keys $KEY_ID
# in the above example, the $KEY_ID would be F566C9B1
# you should see output like this:
# gpg: sending key F566C9B1 to hkp server keys.gnupg.net
If you are collaborating in keysigning, send the output of gpg2 --fingerprint $KEY_ID to your coworkers:
gpg2 --fingerprint $KEY_ID
# in the above example, the $KEY_ID would be F566C9B1
# in my case, the output was:
# pub   4096R/F566C9B1 2015-04-06 [expires: 2017-04-05]
#       Key fingerprint = 7C37 02AC D651 1FA7 9209 48D3 5DD5 0C4B F566 C9B1
# uid       [ultimate] Colin Dixon <colin at colindixon.com>
# sub   4096R/DC1497E1 2015-04-06 [expires: 2017-04-05]
Install gpg, instead of or in addition to gpg2.
Note
you can tell Git to use gpg2 by doing:
git config --global gpg.program gpg2
but Git may then struggle to ask for your passphrase unless you have gpg-agent set up correctly.
Add your GPG key to Gerrit
Run the following at the CLI:
gpg --export -a $FINGER_PRINT
# e.g., gpg --export -a F566C9B1
# in my case the output looked like:
# -----BEGIN PGP PUBLIC KEY BLOCK-----
# Version: GnuPG v2
#
# mQINBFUisGABEAC/DkcjNUhxQkRLdfbfdlq9NlfDusWri0cXLVz4YN1cTUTF5HiW
# ...
# gJT+FwDvCGgaE+JGlmXgjv0WSd4f9cNXkgYqfb6mpji0F3TF2HXXiVPqbwJ1V3I2
# NA+l+/koCW0aMReK
# =A/ql
# -----END PGP PUBLIC KEY BLOCK-----
Browse to https://git.opendaylight.org/gerrit/#/settings/gpg-keys
Click Add Key…
Copy the output from the above command, paste it into the box, and click Add
Set up your Git to sign commits and push signatures
git config commit.gpgsign true
git config push.gpgsign true
git config user.signingkey $FINGER_PRINT
# e.g., git config user.signingkey F566C9B1
Note
We can create a signed commit with
git commit -S
and a signed push with
git push --signed
on the CLI instead of setting the config options, if we want to control manually which commits and pushes use the signature.
Create a signed commit
Change a file
Create a signed commit with
git commit -asm "test commit"
This will result in Git asking you for your passphrase. Enter it to proceed.
Push to Gerrit with a signed-push with
git review
This will result in Git asking you for your passphrase. Enter it to proceed.
Note
Signing a commit, or pushing again with a signed push, is not recognized as a “change” by Gerrit, so if you forget to do either, you need to change something about the commit to get Gerrit to accept the patch again. Slightly tweaking the commit message is a good way.
Note
This assumes you have git review set up and push.gpgsign set to true. Otherwise:
git push --signed gerrit HEAD:refs/for/master
This assumes the gerrit remote is available; if not, configure something like:
ssh://ckd@git.opendaylight.org:29418/<repo>.git
where repo is something like docs or controller
Verify the signature
To do this, navigate to Gerrit and check for a green check next to your name in the patch.
Example signed push to Gerrit.¶
Appendix¶
Developer’s Certificate of Origin (DCO)¶
Code contributions to Linux Foundation projects must have a sign-off by the author of the code which indicates that they have read and agree to the DCO.
Developer Certificate of Origin
Version 1.1
Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
1 Letterman Drive
Suite D4700
San Francisco, CA, 94129
Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.
Developer's Certificate of Origin 1.1
By making a contribution to this project, I certify that:
(a) The contribution was created in whole or in part by me and I
have the right to submit it under the open source license
indicated in the file; or
(b) The contribution is based upon previous work that, to the best
of my knowledge, is covered under an appropriate open source
license and I have the right under that license to submit that
work with modifications, whether created in whole or in part
by me, under the same open source license (unless I am
permitted to submit under a different license), as indicated
in the file; or
(c) The contribution was provided directly to me by some other
person who certified (a), (b) or (c) and I have not modified
it.
(d) I understand and agree that this project and the contribution
are public and that a record of the contribution (including all
personal information I submit with it, including my sign-off) is
maintained indefinitely and may be redistributed consistent with
this project or the open source license(s) involved.
Refer to https://developercertificate.org/ for original text.
Gerrit Topics¶
Topics are useful as search criteria in Gerrit. By entering topic:foo
as a search query we can track related commits. Use one of the following
methods to configure topics:
Directly in the Gerrit UI via the Edit Topic button
Via
git review
using the -t topic parameter
Note
git-review defaults to the local branch name as the topic if it does not match the upstream branch.
Via
git push
using one of the following methods:
git push origin HEAD:refs/for/master%topic=some-topic
git push origin HEAD:refs/for/master -o topic=some-topic
Both methods achieve the same result, so it is up to preference. Further documentation is available at Gerrit Topics.
GPG2 (GnuPG 2) Guide¶
This guide describes how to generate a GPG2 (GnuPG 2) key pair, and how to sign and verify commits on Linux and MacOS platforms using Git.
Prerequisites¶
Install GnuPG 2.
For Debian based systems:
sudo apt-get install gnupg2 -y
For rpm based systems:
sudo dnf install gnupg2 -y
For MacOS systems, install homebrew (http://brew.sh) and then install GPG2
brew install gpg2
If you are using a GPG smartcard refer to Protecting code integrity with PGP
Generate the GPG keys¶
Generate your GPG key.
Pick option 1 for “RSA and RSA”
Enter 4096 bit key size (recommended)
Set the key expiry to 2 years, use ‘2y’ for 2 years
Enter ‘y’ to confirm the expiry time
Pick ‘O’ or ‘Q’ to accept your name/email/comment
Enter a pass phrase twice.
gpg2 --gen-key
Note
The default key ring path on Linux is /home/$USER/.gnupg/pubring.kbx and MacOS is /Users/$USER/.gnupg/pubring.kbx. This path can be overridden by setting the environment variable $GNUPGHOME to point to a different directory.
View the key fingerprint.
$ gpg2 --fingerprint --keyid-format long
/home/abelur/.gnupg/pubring.kbx
-------------------------------
pub   rsa4096/0xA46800C5D9A8855E 2016-06-28 [SC]
      Key fingerprint = DBE2 4D9E 8ECC 5B29 5F33 FF61 A468 00C5 D9A8 855E
uid           [ unknown] Anil Belur <abelur@linux.com>
sub   rsa2048/0x0FAA11C1B55BFA62 2016-06-28 [S] [expires: 2022-08-24]
      Key fingerprint = 3E59 553C 2748 4079 C1A1 5DC8 0FAA 11C1 B55B FA62
sub   rsa2048/0xDC40225E6664848E 2016-06-28 [E] [expires: 2022-08-24]
      Key fingerprint = 5415 64A8 4449 4AE8 1A8D 0877 DC40 225E 6664 848E
sub   rsa2048/0x9515A6A0C2B6EDC9 2016-06-28 [A]
      Key fingerprint = 0E46 C7F1 A2A7 F3C3 9849 A56A 9515 A6A0 C2B6 EDC9
Note
In the above example, the user's long key-id is ‘0xA46800C5D9A8855E’. Use the long key-id from your own keys and substitute it for ‘<KEYID-FINGERPRINT>’ in the rest of this document. We recommend using the long key-id, since 32-bit short key-ids are subject to collision attacks.
Set up Git to sign commits and push signatures. This step updates the file ~/.gitconfig to sign commits (with your GPG2 keys) by adding the default user key fingerprint and setting the commit.gpgsign option to true. Also set push.gpgsign to true to sign all pushes.
git config --global user.signingkey <KEYID-FINGERPRINT>
git config --global commit.gpgsign true
git config --global push.gpgsign true
Set GPG2 as the default program.
git config --global gpg.program $(which gpg2)
Upload your public key to key servers.
gpg2 --send-keys <KEYID-FINGERPRINT>
...
gpg: sending key <KEYID-FINGERPRINT> to hkp server keys.gnupg.net
Note
In the above example, the $KEY_ID would be A46800C5D9A8855E
Export the GPG2 public key and add it to Gerrit.
Run the following at the CLI:
gpg --export -a <KEYID-FINGERPRINT>
Open the project’s Gerrit and go to project settings and gpg-keys.
Click the Add Key button.
Copy the output from the above command, paste it into the box, and click ‘Add’.
Setup gpg-agent¶
Install gpg-agent and pinentry-mac using brew:
brew install gpg-agent pinentry-mac
Edit ~/.gnupg/gpg.conf to contain the line “use-agent”:
echo "use-agent" > ~/.gnupg/gpg.conf
Edit ~/.gnupg/gpg-agent.conf and add the below line:
cat > ~/.gnupg/gpg-agent.conf << EOF
use-standard-socket
enable-ssh-support
default-cache-ttl 600
max-cache-ttl 7200
pinentry-program /usr/local/bin/pinentry-mac
EOF
Update ~/.bash_profile with the following:
[ -f ~/.gpg-agent-info ] && source ~/.gpg-agent-info
if [ -S "${GPG_AGENT_INFO%%:*}" ]; then
  export GPG_AGENT_INFO
else
  eval $( gpg-agent --daemon --write-env-file ~/.gpg-agent-info )
fi
Kill any stray gpg-agent daemons running:
sudo killall gpg-agent
Restart the terminal (or log in and out) to reload your ~/.bash_profile.
The next time a Git operation makes a call to gpg, it should use your gpg-agent to run a GUI window to ask for your passphrase and give you an option to save your passphrase in the keychain.
For Linux:
For MacOS:
Sign your commit¶
Commit and push a change
Change a file and save it with your favorite editor.
Add the file and sign the commit with your GPG private key.
git add <path/to/file>
git commit --gpg-sign --signoff -m 'commit message'
Note
The option --gpg-sign (-S) uses GPG for signing commits. The option --signoff (-s) adds the Signed-off-by line in the commit message footer.
Push patch to Gerrit.
git review
Note
This should result in Git asking you for your passphrase, if the SSH keys are passphrase protected.
The presence of a GPG signature or pushing of a gpg signature isn’t recognized as a “change” by Gerrit, so if you forget to do either, you need to change something about the commit to get Gerrit to accept the patch again. Tweaking the commit message is a good way.
This assumes you have git review -s set up and push.gpgsign set to true. Otherwise:
git push --signed gerrit HEAD:refs/for/master
This assumes you have your gerrit remote set up like the below, where repo is something like releng-docs:
ssh://<user-id>@git.linuxfoundation.org:29418/<repo>.git
Verify the signature of the signed commit locally.
git log --show-signature -1
commit ea26afb7d635a615547490e05a7aef2d9bcda265
gpg: Signature made Tue 28 Nov 2017 11:15:12 AM AEST
gpg:                using RSA key 0FAA11C1B55BFA62
gpg: Good signature from "Anil Belur <abelur@linux.com>" [unknown]
Primary key fingerprint: DBE2 4D9E 8ECC 5B29 5F33 FF61 A468 00C5 D9A8 855E
     Subkey fingerprint: 3E59 553C 2748 4079 C1A1 5DC8 0FAA 11C1 B55B FA62
Author: Anil Belur <abelur@linux.com>
Date:   Tue Nov 28 10:45:29 2017 +1000
A green check next to the user's name on the Gerrit change indicates a valid commit signature.
Jenkins Guide¶
The ci-management or releng/builder repos in an LF project consolidate the Jenkins jobs from project-specific VMs onto a single Jenkins server. Each Git repo in every project has a view for its jobs on the main Jenkins server. The system uses Jenkins Job Builder for the creation and management of the Jenkins jobs.

Jenkins Sandbox Guide¶
Sandbox Overview¶
Facts to keep in mind before working on the Sandbox:
Jobs are automatically deleted every Saturday at 08:00 UTC
Committers can login and configure Jenkins jobs in the Sandbox directly
Sandbox jobs cannot perform any upload/deploy tasks
Project configuration files and credentials are not loaded into the system
Sandbox jobs cannot vote on Gerrit
Jenkins nodes have the same OpenStack configuration as the production instance with minor differences.
Get access to the Jenkins Sandbox¶
The Sandbox provides a testing/experimentation environment used before pushing job templates to the production instance.
To access the Sandbox use: jenkins.example.org/sandbox
Access to the Sandbox uses the same LFID used for the production Jenkins instance, but you need to create a new Helpdesk ticket to request sandbox access.
The LF helpdesk team can add users to the appropriate group to grant permissions
to access the Sandbox via https://identity.linuxfoundation.org/.
The group that controls this access is <project>-jenkins-sandbox-access
For example:
https://identity.linuxfoundation.org/content/<project>-jenkins-sandbox-access
The requester will receive an invitation to join this group. Once accepted, the user can access the Sandbox the same way as the production Jenkins.
Push jobs to Jenkins Sandbox¶
Push jobs to the Jenkins Sandbox using one of these methods:
Method 1 is easier as it does not require installing anything on your local system. This method requires pushing the patch to Gerrit for each test. We recommend this method for quick one-off edits or if you are testing another contributor's patch.
Method 2 is more convenient for those who work on JJB templates more than once or twice.
Push jobs via Gerrit comment¶
This is the easiest and fastest way to start using the Jenkins Sandbox. This is the recommended way to use the Sandbox since this does not require a local installation of JJB.
To push jobs to the Jenkins Sandbox with the jjb-deploy job, add a comment on any Gerrit patch.
jjb-deploy <job name>
jjb-deploy ci-management-jjb-verify # Push the ciman JJB verify job.
When leaving a comment on a non-ci-management patch, the resultant job will use the latest master branch of the ci-management repo.
When leaving a comment on a ci-management patch, the resultant job's configuration will reflect the patch's code base in Gerrit.
Push jobs via JJB CLI¶
JJB CLI needs configuration first.
Note
Use this configuration if you prefer to use the JJB tool locally on your system.
Configure the file ~/.config/jenkins_jobs/jenkins_jobs.ini
as follows:
[job_builder]
ignore_cache=True
keep_descriptions=False
recursive=True
retain_anchors=True
update=jobs
[jenkins]
user=<Provide your Jenkins Sandbox user-id (LFID)>
password= <Refer below steps to get API token>
url=https://jenkins.example.org/sandbox
Note
The [jenkins] section is the default configuration section that JJB refers to when calling jenkins-jobs without passing the -s | --server option. If you work on more than one Jenkins system, configure sections like [odl], [opnfv], [onap], etc. and pass jenkins-jobs -s odl to make it convenient to switch projects.
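As a sketch, a jenkins_jobs.ini covering two servers might look like the following (the [odl] section and its URL are illustrative placeholders, not confirmed values):

```ini
[job_builder]
ignore_cache=True
recursive=True

# Default server, used when no -s/--server option is given.
[jenkins]
user=<LFID>
password=<api-token>
url=https://jenkins.example.org/sandbox

# Additional server, selected with: jenkins-jobs -s odl ...
[odl]
user=<LFID>
password=<api-token>
url=https://jenkins.opendaylight.org/sandbox
```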
How to retrieve the API token:
Login to the Jenkins Sandbox using your LFID
Go to the user page by clicking on your username on the top right
Click Configure
Click Show API Token
To start using the Sandbox, clone the ci-management or releng/builder (in the case of ODL) repo for the project. For example:
git clone ssh://<LFID>@gerrit.example.org:29418/ci-management
Make sure you also sync the global-jjb submodule using:
git submodule update --init
Install JJB (Jenkins Job Builder).
Execute the following commands to install JJB on your machine:
cd ci-management (or cd builder)
pip install --user virtualenvwrapper
mkvirtualenv jjb
pip install jenkins-job-builder
jenkins-jobs --version
jenkins-jobs test --recursive jjb/
Note
More information on Python Virtual Environments
To work on existing jobs or create new jobs, navigate to the /jjb directory where you will find all job templates for the project. Follow the below commands to test, push or delete jobs in your Sandbox environment.
Verify JJB¶
After you edit or create new job templates, test the job in the Sandbox environment before you submit it to the production CI environment.
jenkins-jobs test jjb/ <job-name>
For Example:
jenkins-jobs test jjb/ ci-management-jjb-merge
If the job you would like to test is a template with variables in its name, it must be manually expanded before use. For example, the commonly used template {project-name}-jjb-merge might expand to ci-management-jjb-merge.
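The expansion is plain string substitution; as a sketch (names taken from the example above):

```shell
#!/bin/sh
# Expand a JJB job-name template manually by substituting the variable.
template='{project-name}-jjb-merge'
project_name='ci-management'
job=$(printf '%s' "$template" | sed "s/{project-name}/$project_name/")
echo "$job"    # prints: ci-management-jjb-merge
```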
A successful test will output the XML description of the Jenkins job described by the specified JJB job name.
Execute the following command to pipe-out to a directory:
jenkins-jobs --conf jenkins.ini test jjb/ <job-name> -o target
The output directory will contain files with the XML configurations.
Push a Job¶
Ensure you have configured your jenkins.ini and verified it by outputting valid XML descriptions of Jenkins jobs. Upon successful verification, execute the following command to push the job to the Sandbox:
jenkins-jobs update -j jjb/ <job-name>
For Example:
jenkins-jobs update -j jjb/ ci-management-jjb-merge
Delete a Job¶
Execute the following command to delete a job from the Sandbox:
jenkins-jobs delete -j jjb/ <job-name>
For Example:
jenkins-jobs delete -j jjb/ ci-management-jjb-merge
You can also delete the job from the UI options in Jenkins Sandbox.
Edit Job via Web UI¶
In the Sandbox, you can directly edit the job configuration by selecting the job name and clicking on the Configure button. Click the Apply or Save (to save and exit the configuration) buttons to save the job.
This is useful in the case where you might want to test quick tweaks to a job before modifying the YAML.
Edit the job in your terminal and follow the steps described in Verify JJB and Push a Job to push any changes and have them ready to submit to Gerrit.
Important
When pushing to the Sandbox with jenkins-jobs, do not forget the <job-name> parameter. Otherwise, JJB will push all job templates into the Sandbox and will flood the system.
If that happens, use `ctrl+c` to cancel the upload.
A successful run of the desired job will look like this:
INFO:jenkins_jobs.builder:Number of jobs generated: 1
Execute jobs in the Sandbox¶
Once you push the Jenkins job configuration to the Sandbox environment, run the job from the Sandbox WebUI. Follow the below process to trigger the build:
Login into the Jenkins Sandbox WebUI
Click on the job which you want to trigger
Click “Build with parameters”
Click Build
Verify the Build Executor Status bar to check on progress.
You can click on the build number to view the job details and console output.
Quick Start¶
This section provides details on how to create jobs for new projects with minimal steps. All users who need to create or contribute to new job types should read and understand this guide.
As a new project you will mainly be interested in getting your jobs to appear in the Jenkins server by creating a <project>.yaml file in the releng/builder or ci-management project's jjb directory.
Example for releng/builder projects:
git clone --recursive https://git.opendaylight.org/gerrit/releng/builder
cd builder
mkdir jjb/<new-project>
Example for ci-management projects:
git clone --recursive https://gerrit.onap.org/gerrit/ci-management
cd ci-management
mkdir jjb/<new-project>
Where <new-project> should be the same name as your project’s Git repo in Gerrit. If your project name is “aaa” then create a new jjb/aaa directory.
Note
In a similar manner, if your project name is “aaa/bbb” then create a new jjb/aaa-bbb directory by replacing all “/” with “-”.
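The directory-name mapping described in the notes above can be scripted as a one-liner (the project name here is just the example used above):

```shell
#!/bin/sh
# Map a Gerrit project name to its jjb directory by replacing "/" with "-".
project="aaa/bbb"
dir="jjb/$(printf '%s' "$project" | tr '/' '-')"
echo "$dir"    # prints: jjb/aaa-bbb
```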
Note
builder/jjb/global-jjb and ci-management/jjb/global-jjb are submodules of the releng/builder and ci-management repositories, which require a git submodule update --init or using --recursive with git clone to fetch them.
Next we will create <new-project>.yaml as follows:
---
- project:
    name: <new-project>
    project-name: <new-project>
    project: <new-project>
    mvn-settings: <new-project>-settings
    jobs:
      - gerrit-maven-clm
      - gerrit-maven-merge
      - gerrit-maven-release
      - gerrit-maven-verify
      - gerrit-maven-verify-dependencies
    stream: master

- project:
    name: <new-project>-sonar
    jobs:
      - gerrit-maven-sonar
    build-node: centos7-builder-4c-4g
    project: <new-project>
    project-name: <new-project>
    branch: master
    mvn-settings: <new-project>-settings
Replace all instances of <new-project> with the name of your project as explained before.
The template above shows how to add each job from global-jjb. We recommend defining a local job-group for the project or defining each job needed in a list.
Add the following jobs for minimal setup on a Maven based project:
- gerrit-maven-clm
- gerrit-maven-merge
- gerrit-maven-release
- gerrit-maven-verify
- gerrit-maven-sonar
Optionally, you can add other jobs as well:
- gerrit-maven-verify-dependencies
Global-jjb defines groups of jobs recommended for CI, Maven, Python, Node, RTD and more, and is under constant improvement. If you would like to explore the options available, refer to the Global JJB Templates section.
The changes to these files get published in Gerrit and reviewed by the releng/builder or ci-management teams for the LF project. After approval, these changes get merged and the jobs published in Jenkins.
git add jjb/<new-project>
git commit -sm "Add <new-project> jobs to Jenkins"
git review
This will push the jobs to Gerrit and your jobs will appear in Jenkins once the releng/builder or ci-management teams has reviewed and merged your patch.
Build agents¶
Jenkins jobs run on build agents (executors) created on demand and deleted when the job terminates. Jenkins supports different types of dynamic build nodes and developers must know the flavors available to run their custom jobs.
Jenkins uses the OpenStack Cloud plugin to administer node templates and configuration
for the node instances. For more information on the template:
https://wiki.jenkins.io/display/JENKINS/Openstack+Cloud+Plugin
Projects requiring a specific build configuration can submit a change to the ci-management or releng repos.
Refer to the Jenkins Configuration Merge section to understand how the configuration changes get merged.
Note
Here is an example from OpenDaylight: https://github.com/opendaylight/releng-builder/tree/master/jenkins-config/clouds/openstack/odlvex
For details on how to build an image for a particular build flavor, refer to the Packer Images section.
Cloud configuration (Global Configuration)¶
This information will help developers (who do not have administrator permissions) understand how LFIT configures a cloud and build agents via the OpenStack Cloud plugin:
Log in to Jenkins and click Manage Jenkins
Scroll to the Cloud section
Click Add a new cloud, Cloud (OpenStack)
Fill in the required information for Cloud provider, URL, credentials and region
Note
Click Test Connection to make sure the parameters provided establish a connection.
Configure Default slave options...
Note
The Default slave options can be overwritten for a particular node flavor using the Template Advanced options.
Click Add template and provide a node Name and Labels
Specify a build-node in a project's yaml file:
build-node: ubuntu1604-builder-4c-4g
Note
The value should match an available Label for the node template.
Build agents flavors¶
This section points to each LF project’s build agents availability and flavors.
EdgeX Foundry: https://github.com/edgexfoundry/ci-management/tree/master/jenkins-config/clouds/openstack/Primary
FD.io: https://github.com/FDio/ci-management/tree/master/jenkins-config/clouds
ONAP: https://github.com/onap/ci-management/tree/master/jenkins-config/clouds/openstack/cattle
OpenDaylight (ODL): https://github.com/opendaylight/releng-builder/tree/master/jenkins-config/clouds/openstack/odlvex
O-RAN: https://github.com/o-ran-sc/ci-management/tree/master/jenkins-config/clouds/openstack/cattle
Zowe: https://github.com/zowe/ci-management/tree/master/jenkins-config/clouds/openstack/cattle
Managed Config Files¶
Jobs in Jenkins make extensive use of Managed Config Files for different types of files that store configuration or credentials. These files live in the ci-management repository alongside the rest of the community-configurable components, under the jenkins-config/managed-config-files directory tree.
This directory tree has the following case sensitive format:
ci-management
|- jenkins-config
   |- managed-config-files
      |- <config_type>
         |- <file_id>
            |- config-params.yaml
            |- content
            |- ??CredentialMappings.yaml
            |- ??CredentialMappings.sandbox.yaml
...
Configuration of credentials for production Jenkins systems come from the ??CredentialMappings.yaml file.
Configuration of credentials for sandbox Jenkins systems come from the ??CredentialMappings.sandbox.yaml file.
The config_type corresponds to the type under management and matches how JCasC itself defines the file type.
Common types in the LF environment are:
custom
globalMavenSettings
json
mavenSettings
openstackUserData
properties
The file_id is precisely what the ID of the file should be for reference. The LF Release Engineering practice is to always set a human readable / relatable ID.
config-params.yaml contains all the parameters related to this particular file that are _not_ the file content or the credential mappings.
The content file is the actual file that is under management. This must be a non-escaped version of the content for the field. It will be appropriately escaped when converted into the corresponding JCasC yaml.
The ?? in the names of the CredentialMappings files stands in for the appropriate mapping definition.
The mapping type will use a verbatim copy when converting to the JCasC so it should be properly configured to match the config_type.
The known breakdown of config_type to CredentialMappings is:
custom -> customizedCredentialMappings
mavenSettings -> serverCredentialMappings
properties -> propertiesCredentialMappings
The following is the layout for a custom file with the ID of example:

config-params.yaml:
---
name: "My Custom File"
comment: "An example custom file"

content:
This is just an example custom config file
The user for the EXAMPLE token is: $EXAMPLE_USR
The password for the EXAMPLE token is: $EXAMPLE_PSW
The user:pass for the EXAMPLE token is: $EXAMPLE

??CredentialMappings.yaml (production):
---
customizedCredentialMappings:
  - credentialsId: "example_username"
    tokenKey: "EXAMPLE"

??CredentialMappings.sandbox.yaml (sandbox):
---
customizedCredentialMappings:
  - credentialsId: "dummy"
    tokenKey: "EXAMPLE"
Log Server¶
While Jenkins stores console logs on the Jenkins instance, this is short term; depending on the job type, logs get purged after a week. We highly recommend the log server over the Jenkins system when viewing console logs, as it reduces load on Jenkins, and logs stored on the log server are compressed so downloads are faster.
We store log server archives for 6 months.
At the end of a build, the job ships logs to a Nexus logs repo, which you can conveniently access via the https://logs.example.org URL. The Job Build Description will contain the specific log server URL for a build log. Jobs triggered via Gerrit Trigger will have the URL to the logs left as a post-build comment.
Example Jenkins Build Description:
Build logs: https://logs.opendaylight.org/releng/vex-yul-odl-jenkins-1/distribution-check-carbon/167
Example Gerrit Comment:
jenkins-releng 03-05 16:24
Patch Set 6: Verified+1
Build Successful
https://jenkins.opendaylight.org/releng/job/builder-tox-verify-master/1066/ : SUCCESS
Logs: https://logs.opendaylight.org/releng/vex-yul-odl-jenkins-1/builder-tox-verify-master/1066
The log path pattern for the log server is
LOG_SERVER_URL/SILO/JENKINS_HOSTNAME/JOB_NAME/BUILD_NUMBER
Typically, if you know the JOB_NAME and BUILD_NUMBER, you can replace the paths before them to convert the URL between Jenkins and the log server.
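As a sketch, this conversion amounts to simple string surgery. The URLs below come from the ODL example earlier in this section; the silo and hostname prefix will differ per project:

```shell
# Sketch: derive the log server URL from a Jenkins build URL.
# The silo/hostname prefix is taken from the ODL example and will
# differ per project.
jenkins_url="https://jenkins.opendaylight.org/releng/job/builder-tox-verify-master/1066/"
job_and_build=$(echo "${jenkins_url}" | sed -E 's|.*/job/([^/]+)/([0-9]+)/?|\1/\2|')
echo "https://logs.opendaylight.org/releng/vex-yul-odl-jenkins-1/${job_and_build}"
```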
We compress and store individual log files in gzip (.gz) format on the Nexus log repository. You can access these files through the URL.
Jenkins Production:
https://logs.example.org/production
Jenkins Sandbox:
https://logs.example.org/sandbox
Log Cleanup Schedule¶
The log servers are setup with cron jobs that purge logs during regular scheduled intervals.
Jenkins Production: Delete logs older than 180 days every day at 08:00 UTC.
Jenkins Sandbox: Delete logs and jobs every week on Saturday at 08:00 UTC.
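A purely illustrative crontab sketch of such a schedule; the paths and exact commands are assumptions, not the servers' actual configuration:

```shell
# Illustrative crontab entries only -- paths and commands are assumptions.
# Production: daily at 08:00 UTC, purge logs older than 180 days.
0 8 * * * find /srv/logs/production -type f -mtime +180 -delete
# Sandbox: weekly on Saturday at 08:00 UTC.
0 8 * * 6 find /srv/logs/sandbox -mindepth 1 -delete
```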
Jenkins Job Builder¶
Jenkins Job Builder takes simple descriptions of Jenkins jobs in YAML format and uses them to configure Jenkins.
JJB Overview¶
Jenkins Job Builder translates YAML code into job configuration suitable for consumption by Jenkins. When testing new Jenkins jobs in the Jenkins Sandbox, you will need to use the jenkins-jobs executable to translate a set of jobs into their XML descriptions and upload them to the Jenkins Sandbox server.
Install JJB¶
You can install the latest version of JJB and its dependencies with pip using Python Virtual Environments, or lock a specific version of JJB in jjb/requirements.txt as a workaround for known issues. Documentation is available in the pip-assisted install section.
Virtual Environments¶
For pip-assisted installs, we recommend using Python Virtual Environments to manage JJB and its Python dependencies.
Documentation on installing virtual environments is available via virtualenvwrapper. On Linux systems with pip, run:
pip install --user virtualenvwrapper
A virtual environment is a directory into which you install Python programs; activating it updates the shell's $PATH, which allows the version installed in the virtual environment to take precedence over any system-wide versions available.
Create a new virtual environment for JJB.
virtualenv jjb
With your virtual environment active, you can install JJB; the installation is only visible while that virtual environment is active.
To activate your virtual environment:
source ./jjb/bin/activate
# or
workon jjb
To deactivate your virtual environment:
deactivate
Install JJB using pip¶
To install JJB and its dependencies, make sure you have created and activated a virtual environment for JJB.
Set up a virtualenv:
virtualenv jjb
source jjb/bin/activate
Install JJB
pip install jenkins-job-builder==2.0.5
Note
If a requirements.txt exists in the repository with the recommended JJB version, use the requirements file to install JJB by calling:
# From the root of the ci-management or builder directory
pip install -r jjb/requirements.txt
To change the version of JJB specified by jjb/requirements.txt to install from the latest commit to the master branch of JJB’s Git repository:
cat jjb/requirements.txt
-e git+https://git.openstack.org/openstack-infra/jenkins-job-builder#egg=jenkins-job-builder
Check JJB installation:
jenkins-jobs --version
Global JJB Templates¶
Global-JJB is a library project containing reusable Jenkins Job Builder templates. The intention is to save projects the time of defining their own job templates. Documentation is available via the global-jjb documentation; specific sections of interest are linked here:
Note
For infra admins, the CI Job Templates contain useful jobs for managing Jenkins and VM images. We recommend deploying these jobs to all new infra projects.
Packer Images¶
The ci-management repo contains a directory called packer which contains scripts for building images used by Jenkins to spawn builders. There are 2 files necessary for constructing a new image:
packer/templates/BUILDER.json
packer/provision/BUILDER.yaml
Replace BUILDER with the name of your desired builder image type.
The templates file contains packer configuration information for building the image. The provision file is a script for running commands inside the packer-builder to construct the image. We recommend using the Ansible provisioner as that is the standard used by LF packer builds.
While developing a new builder image type, we can use the Jenkins Sandbox to build and deploy the image for testing. Configure a Jenkins job for the new image type using the global-jjb gerrit-packer-merge job template.
Example job definition:
- project:
    name: packer-robot-jobs
    jobs:
      - gerrit-packer-merge

    project: releng/builder
    project-name: builder
    branch: master
    archive-artifacts: '**/*.log'
    build-node: centos7-builder-2c-1g
    platforms: centos-7
    templates: robot
The gerrit-packer-merge job creates jobs in the format PROJECT_NAME-packer-merge-PLATFORM-TEMPLATE, where PROJECT_NAME is the project-name field, PLATFORM is the platforms field, and TEMPLATE is the templates field in the yaml above. In this example the resultant job is builder-packer-merge-centos-7-robot.
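The naming rule can be sketched as a one-liner, using the field values from the yaml above:

```shell
# Sketch of the PROJECT_NAME-packer-merge-PLATFORM-TEMPLATE naming rule,
# using the field values from the example job definition.
project_name="builder"; platform="centos-7"; template="robot"
echo "${project_name}-packer-merge-${platform}-${template}"
# → builder-packer-merge-centos-7-robot
```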
Follow the instructions in the Pushing a patch to Gerrit section to push this job to the Sandbox.
Once the job is on the Jenkins Sandbox, run the job and it will attempt to deploy the new image and make it available. Once the job completes, look for a line in the logs that looks like:
==> vexxhost: Creating the image: ZZCI - CentOS 7 - robot - 20180301-1004
This line provides the name of the new image we built.
Jenkins Production & Jenkins Sandbox¶
The Jenkins server is the home for all project’s Jenkins jobs. Most of the job configuration gets managed through code using JJB through the ci-management or releng/builder repos.
To access the Jenkins Production URL for any project use:
https://jenkins.example.org
Similarly, the project’s corresponding Jenkins Sandbox URL would be:
https://jenkins.example.org/sandbox
Any user with an LFID can access the Jenkins Production site, but for the Jenkins Sandbox please request access. To do so, refer to the Get access to the Jenkins Sandbox section.
Project contributors do not edit the Jenkins jobs directly on the Jenkins production server. Instead, we encourage them to use the Jenkins Sandbox.
The Jenkins Sandbox has similar configuration to the production instance. Jenkins Sandbox does not publish artifacts in Nexus or Nexus3 or vote in Gerrit which makes it a safe environment to test the jobs. Users can edit and trigger the jobs directly to test the behavior.
The Jenkins Sandbox can contain dummy configuration files and dummy credentials where that helps take a test further rather than failing on the first steps because the configuration is not present. Any attempt to use the configuration files will cause the server communications to fail. To add dummy configuration files, please open a Helpdesk ticket.
Due to this limitation, merge, push, CLM, Docker or Sonar jobs only get tested to some extent. Once the job template gets merged and becomes available in Jenkins Production, we can confirm the jobs are actually making server communications as expected with Nexus-IQ, Sonar, Gerrit or Nexus.
The Sandbox has a limited number of virtual machine node instances compared to production Jenkins.
Documentation on using the Jenkins Sandbox and uploading jobs is available here.
How to test unmerged CR in global-jjb or lftools with Jenkins¶
To test one or more changes in review state on the global-jjb or lftools repository with a Jenkins job on sandbox, insert the sample code in the relevant builder section on the job. This reduces the number of regressions and/or hot fixes required post-release.
Example code for lftools changes:
git clone https://gerrit.linuxfoundation.org/infra/releng/lftools /tmp/lftools
cd /tmp/lftools
# For example replace ${GERRIT_REFSPEC} with 'refs/changes/81/15881/2'
git fetch "https://gerrit.linuxfoundation.org/infra/releng/lftools" ${GERRIT_REFSPEC} && git cherry-pick --ff --keep-redundant-commits FETCH_HEAD
git log --pretty=format:"%h%x09%an%x09%s" -n5
virtualenv --quiet -p $(which python3) "/tmp/lftools-env"
set +u
source "/tmp/lftools-env/bin/activate"
set -u
pip3 install --quiet -r requirements.txt -e .
cd ${WORKSPACE}
Example code for global-jjb changes:
cd $WORKSPACE/global-jjb
# For example replace ${GERRIT_REFSPEC} with 'refs/changes/81/15881/2'
git fetch "https://gerrit.linuxfoundation.org/infra/releng/global-jjb" ${GERRIT_REFSPEC} && git cherry-pick --ff --keep-redundant-commits FETCH_HEAD
git log --pretty=format:"%h%x09%an%x09%s" -n5
cd ${WORKSPACE}
Note
Repeat the line to fetch ${GERRIT_REFSPEC} to test one or more changes.
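The ${GERRIT_REFSPEC} value used in the snippets above follows Gerrit's refs/changes/<last two digits>/<change number>/<patch set> layout; a small sketch for deriving it:

```shell
# Sketch: build a Gerrit refspec from a change number and patch set.
# 15881 / patch set 2 matches the example in the comments above.
change=15881
patchset=2
printf 'refs/changes/%02d/%d/%d\n' "$((change % 100))" "$change" "$patchset"
# → refs/changes/81/15881/2
```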
Jenkins Build Failure Analyzer¶
The Build Failure Analyzer Jenkins plugin analyzes the causes of failed builds and presents the causes on the build page.
It does this by using a knowledge base of build failure causes maintained from scratch.
Plugin Documentation¶
Official plugin documentation: https://plugins.jenkins.io/build-failure-analyzer/
Make sure your Jenkins server has this plugin installed before proceeding.
Plugin Permissions¶
To configure the Build Failure Analyzer plugin’s permissions select:
Manage Jenkins -> Configure Global Security -> Authorization
The table under the Authorization section will show the Build Failure Analyzer column.
Users and/or groups can add or remove the following permissions:
View Causes
Update Causes
Remove Causes
View Causes¶
Depending on the permissions granted to the groups to use the Build Failure Analyzer, users will be able to see the Failure Cause Management option in the left side menu in Jenkins.
This option will display the current causes in a table with:
Name
Categories
Description
Comment
Modified
Remove Cause Icon (Depending on permissions)

Update Causes¶
The Create New option adds a new cause.
A new cause will require the following information:
Name
Description
Comment
Categories (It will autocomplete for any existing categories)
Indications (What to look for in the log. Regex pattern or text)
Modification history (Date, time and username)

To update an existing cause, click on a cause’s name from the current table.
Delete Causes¶
The last column of the causes table will show a remove icon for those groups with permissions to Remove Causes. No icon will appear if this permission is not granted.
The same Remove option will appear if the user clicks on the name of any of the causes in the table.
Project Documentation Guide¶
Documentation is an important aspect of any software project. LF-Releng provides some recommended tools for projects to get set up with their own documentation, and we will attempt to describe them in this guide.
Tools¶
The main tools recommended to generate docs are Sphinx and reStructuredText. Sphinx is a tool for generating documentation from a set of reStructuredText documents.
LF provides lfdocs-conf as a convenience package that will pull in the most common documentation dependencies and configuration for your project. global-jjb provides job templates that can build and publish the documentation.
Framework¶
Typically every project, like ONAP, OpenDaylight, OPNFV, etc., has a “documentation” project. This project provides a gateway to all documentation for the project and typically is the index page of any project’s https://docs.example.org url.
Project-specific documentation is configured as subprojects in ReadTheDocs and is available at https://docs.example.org/projects/PROJECT
Linking between projects is possible via intersphinx linking.
Bootstrap a New Project¶
Bootstrap your project with documentation by following these steps.
Setup lfdocs-conf with the Install Procedures.
Add project to ReadTheDocs following instructions here
Open a Helpdesk ticket if you require assistance here.
Create RTD Generic Webhook
Follow the steps described in the rtd-jobs documentation, then record the rtd-build-url and rtd-token for the next step.
Add the rtd jobs to your project
Open up your project.yaml in the ci-management repo and add this section:
- project:
    name: PROJECT
    jobs:
      - '{project-name}-rtd-jobs'

    project-pattern: PROJECT
    rtd-build-url: RTD_BUILD_URL
    rtd-token: RTD_TOKEN
- name:
Project name in Gerrit converting forward slashes (/) to dashes (-).
- project-pattern:
Project name as defined in Gerrit.
- rtd-build-url:
This is the generic webhook url from readthedocs.org. Refer to the above instructions to generate one. (Check Admin > Integrations > Generic API incoming webhook)
- rtd-token:
The unique token for the project Generic webhook. Refer to the above instructions to generate one. (Check Admin > Integrations > Generic API incoming webhook)
More details on rtd job template configuration and parameters are available here.
Note
If lfdocs-conf patches are already merged, issue a ‘remerge’ so the publish job can push the docs to ReadTheDocs.
Add a project to ReadTheDocs¶
In this task we will add a project to ReadTheDocs and activate it. This is necessary to let ReadTheDocs know where to pull your docs from for building.
Warning
Remember to add lf-rtd as a maintainer of the project. This is to ensure that LF staff can continue to manage this project even if the project owner stops working on the project. If you would like helpdesk to assist with creating the project for you then open a helpdesk ticket.
Login to ReadTheDocs (LFIT can use the lf-rtd account)
Click “Import a Project” on the dashboard
Click “Import Manually”
Setup Project
Import Project page¶
Give the project a name
Note
Remember this name to setup the Jenkins jobs.
Provide the Anonymous HTTP clone URL eg. https://gerrit.linuxfoundation.org/infra/releng/docs-conf
Repository type: Git
Click Next
Click Admin > Maintainers
Ensure lf-rtd is a maintainer of the project
Setup sub-project
If this project is not the main documentation project, then set it up as a sub-project of the main documentation project. This will create a subproject link for your project under https://docs.example.org/projects/YOUR_PROJECT
Note
Either the main documentation project’s committers or LF Staff will need to perform this step. If documentation project committers are not available contact the Helpdesk to have LF Staff take care of the subproject configuration.
Go to the main documentation project’s ReadTheDocs admin page
Click Sub-projects
Click Add subproject
Select the child project (the one we created above)
Give it an Alias
Note
Typically the repo name. Forward slashes are not allowed so convert them to hyphens.
Appendix¶
Activate new sub-project:¶
Select the sub-project
Select Admin > Edit Versions
Locate the version to activate in the “Activate a version” list
Activate it by pressing the “Activate” button on the right hand side of the entry
Intersphinx Linking¶
This is supplemental documentation for upstream Sphinx docs on intersphinx linking and Sphinx linking in general. Please refer to the upstream docs here:
When working with related projects that generate separate Sphinx documentation that need to be cross referenced, intersphinx linking is the recommended way to link them.
As a refresher, refer to the Sphinx documentation on linking and review the upstream docs for the :doc: and :ref: link types. :any: is a useful helper to let Sphinx guess if a link is a :doc: or a :ref: link.
In most cases folks use these link references to link to local documentation, but we can use them for intersphinx linking to another project’s public docs as well, via a namespace and configuration in conf.py.
The configuration is a dictionary containing a key, which we will refer to as a doc namespace, and a tuple with a link to the project’s public documentation. This namespace is locally significant and is a free-form word, so set it to anything, then within the local project use it to reference an external doc.
- Example:
intersphinx_mapping = {
    'python': ('https://docs.python.org/3', None),
}
conf.py configuration¶
The lfdocs-conf project already provides common LF docs related intersphinx links for projects using lfdocs-conf.
To add to the intersphinx link dictionary, define intersphinx_mapping in the local conf.py file; refer to the example above. This overrides the intersphinx_mapping variable. If using lfdocs-conf, we recommend appending to the dictionary instead by setting the following:
intersphinx_mapping['key'] = ('https://example.org/url/to/link', None)
intersphinx_mapping['netvirt'] = ('http://docs.opendaylight.org/projects/netvirt/en/latest/', None)
Since lfdocs-conf defines the intersphinx_mapping dictionary, the code above will append to it using a key-value pair. More examples of intersphinx mapping exist in the OpenDaylight conf.py.
Cross-Reference external docs¶
Using the namespace we can refer to docs and labels in external project documentation in the same way we refer to local documentation.
- Example:
* :doc:`Global JJB <global-jjb:index>`
* :ref:`CI Jobs <global-jjb:lf-global-jjb-jenkins-cfg-merge>`
- Demo:
From the example, we insert the global-jjb docs namespace, delimited by the colon (:) symbol, inside the link reference to point Sphinx at the global-jjb project docs link.
Tip
The above example highlights a bad practice in some LF Docs projects where we were namespacing label definitions using code such as .. _lf-global-jjb-jenkins-cfg-merge. This is redundant and unnecessary as the project is already namespaced by the intersphinx_mapping configuration. When defining labels, define them with locally significant names and use intersphinx_mapping to handle the namespace.
Inspect the objects.inv for links¶
Every Sphinx build produces an objects.inv. In a local build this file is where the html output is, for example docs/_build/html/objects.inv; online the file is at the html root, https://docs.releng.linuxfoundation.org/en/latest/objects.inv. We can use this file to inspect the types of reference links we can use for a project.
# In a virtualenv
pip install sphinx
python -m sphinx.ext.intersphinx path/to/objects.inv
Links listed as std:doc refer to the :doc: syntax, while links listed as std:label refer to the :ref: syntax.
std:doc
ansible Ansible Guide : ansible.html
best-practices Best Practices : best-practices.html
gerrit Gerrit Guide : gerrit.html
gpg GPG2 (GnuGP 2) Guide : gpg.html
helpdesk LF Helpdesk : helpdesk.html
index Linux Foundation Releng Documentation : index.html
infra/index Infrastructure Guide : infra/index.html
infra/inventory Inventory : infra/inventory.html
infra/jenkins Jenkins : infra/jenkins.html
infra/nexus Nexus : infra/nexus.html
infra/openstack OpenStack Management : infra/openstack.html
jenkins Jenkins Guide : jenkins.html
nexus2 Nexus 2 Guide : nexus2.html
project-bootstrap New Project Bootstrap : project-bootstrap.html
project-documentation Project Documentation Guide : project-documentation.html
std:label
dco Developer’s Certificate of Origin : gerrit.html#dco
genindex Index : genindex.html
gerrit-push-git-push Push using git push : gerrit.html#gerrit-push-git-push
gerrit-push-git-review Push using git review : gerrit.html#gerrit-push-git-review
gerrit-push-output Push output : gerrit.html#gerrit-push-output
gerrit-topics Gerrit Topics : gerrit.html#gerrit-topics
get-sandbox-access Get access to the Jenkins Sandbox : jenkins.html#get-sandbox-access
jenkins-guide Jenkins Guide : jenkins.html#jenkins-guide
jenkins-infra Jenkins : infra/jenkins.html#jenkins-infra
jenkins-sandbox-push-jobs Push jobs to Jenkins Sandbox : jenkins.html#jenkins-sandbox-push-jobs
jjb-push-cli Push jobs via JJB CLI : jenkins.html#jjb-push-cli
jjb-push-gerrit-comment Push jobs via Gerrit comment : jenkins.html#jjb-push-gerrit-comment
lfdocs-create-rtd Add a project to ReadTheDocs : project-documentation.html#lfdocs-create-rtd
lfdocs-global-jjb-templates Global JJB Templates : jenkins.html#lfdocs-global-jjb-templates
lfdocs-helpdesk LF Helpdesk : helpdesk.html#lfdocs-helpdesk
lfdocs-jenkins-sandbox Jenkins Sandbox : jenkins.html#lfdocs-jenkins-sandbox
lfdocs-packer-images Packer Images : jenkins.html#lfdocs-packer-images
lfdocs-proj-docs Project Documentation Guide : project-documentation.html#lfdocs-proj-docs
lfreleng-docs Linux Foundation Releng Documentation : index.html#lfreleng-docs
lfreleng-docs-ansible Ansible Guide : ansible.html#lfreleng-docs-ansible
lfreleng-docs-best-practices Best Practices : best-practices.html#lfreleng-docs-best-practices
lfreleng-docs-bootstrap New Project Bootstrap : project-bootstrap.html#lfreleng-docs-bootstrap
lfreleng-docs-gerrit Gerrit Guide : gerrit.html#lfreleng-docs-gerrit
lfreleng-docs-gpg GPG2 (GnuGP 2) Guide : gpg.html#lfreleng-docs-gpg
lfreleng-infra Infrastructure Guide : infra/index.html#lfreleng-infra
lfreleng-infra-inventory Inventory : infra/inventory.html#lfreleng-infra-inventory
lfreleng-infra-nexus Nexus : infra/nexus.html#lfreleng-infra-nexus
lfreleng-infra-openstack OpenStack Management : infra/openstack.html#lfreleng-infra-openstack
modindex Module Index : py-modindex.html
nexus2-guide Nexus 2 Guide : nexus2.html#nexus2-guide
push-job Push a Job : jenkins.html#push-job
register-key-gerrit Register your SSH key with Gerrit : gerrit.html#register-key-gerrit
search Search Page : search.html
verify-jjb Verify JJB : jenkins.html#verify-jjb
Nexus 2 Guide¶
LF projects use Nexus Repository Manager 2 to store Maven and Java based artifacts. It helps organize dependencies and releases.
Note
Nexus Repository Manager 2 specifics: https://help.sonatype.com/repomanager2
To access Nexus 2 for a particular project, use URL:
https://nexus.example.org

Users do not need to log in using their LFID credentials. LF admin teams and LFRE engineers should log in to access the administrator options. Other users can browse the repositories and proxies anonymously.

Alternatively, users can access the repositories outside the GUI using the URL:
https://nexus.example.org/content/repositories/

Nexus 2 communicates with the Jenkins server, which is the interface used to publish artifacts on a scheduled or on-demand basis (depending on the Jenkins JJB configuration for the particular job).
Nexus 2 Repositories¶
Nexus 2 allows users to manage different types of repositories. To learn more about how to manage them, please refer to Sonatype’s official documentation.
Most LF projects manage their Maven artifacts using the following repos:
- Releases:
(hosted) Official repository for released artifacts. Releases repositories have a Disable re-deployment policy to avoid overwriting released versions.
- Snapshots:
(hosted) Used to publish Maven SNAPSHOT builds. In the project’s pom.xml these versions have a -SNAPSHOT suffix.
Special repo namespaces:
- Public Repositories:
(group) A meta-url containing all release repos in a combined view.
- Staging Repositories:
(group) A meta-url containing all staging repos in a combined view. Beware: oldest staging repo artifacts take precedence in cases where 2 staging repos contain the same version artifact.
- Proxy:
Repositories that proxy artifacts from an upstream repository.
Each repository is accessible via URL https://nexus.example.org/content/repositories/<repo name>.
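For illustration, the per-repository URLs expand as follows; nexus.example.org is the documentation placeholder, not a real host:

```shell
# Sketch: direct-access URLs for the common hosted repos described above.
# nexus.example.org is the documentation placeholder, not a real host.
nexus="https://nexus.example.org"
for repo in releases snapshots staging; do
  echo "${nexus}/content/repositories/${repo}/"
done
```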
For continuous integration builds, Jenkins has one settings file for each Gerrit repository. Each settings file contains an entry for each accessible Nexus2 repository (ServerId).

In the Gerrit repository’s pom.xml, include the ServerIds in the following manner:
<repositories>
<repository>
<id>releases</id>
<name>Release Repository</name>
<url>${project.nexus.url}/content/repositories/releases/</url>
</repository>
<repository>
<id>staging</id>
<name>Staging Repository</name>
<url>${project.nexus.url}/content/repositories/staging/</url>
</repository>
<repository>
<id>snapshots</id>
<name>Snapshot Repository</name>
<url>${project.nexus.url}/content/repositories/snapshots/</url>
</repository>
</repositories>
Note
More information on access configuration for each Gerrit repository is available in Create Nexus2 repos with lftools.
Users, Roles and Privileges¶
Users, roles and privileges are key to managing and restricting access to Nexus repositories. Anonymous users have read permissions, while administration teams and CI accounts have write and delete permissions.
Sonatype’s documentation on creating users, roles and privileges can be found at https://help.sonatype.com/repomanager2/configuration/managing-users/ and https://help.sonatype.com/repomanager2/configuration/managing-roles/.
For LF projects, a user exists per Gerrit repository, matching the repository name.

Similarly, roles and privileges match the name of the Gerrit repository. The following privileges exist:
Repo All Repositories (Read)
<project-name> (create)
<project-name> (delete)
<project-name> (read)
<project-name> (update)
Note
Where “<project-name>” matches the Gerrit name of the repository.

Add roles required for Nexus users:
- <project-name>:
Groups the privileges mentioned above.
- LF Deployment Role:
To deploy into the Snapshots and Releases repositories.
- Staging: Deployer (autorelease):
For projects using the Staging Profile to create autoreleases.

Note
More information on configuring users, roles and privileges along with the repos using lftools is available in Create Nexus2 repos with lftools.
Nexus 3 Guide¶
LF projects use Nexus Repository Manager 3 to store Docker images. It helps organize dependencies and releases.
Note
Nexus Repository Manager 3 specifics: https://help.sonatype.com/repomanager3
To access Nexus 3 for a particular project, use URL:
https://nexus3.example.org

Users do not need to log in using their LFID credentials. LF admin teams and LFRE engineers should log in to access the administrator options. Other users can browse the repositories and proxies anonymously.

Alternatively, users can access the repositories outside the GUI using the URL:
https://nexus3.example.org/content/repository/<repository-name>/<artifact-path>/<manifest>
For example:
https://nexus3.onap.org/repository/docker.snapshot/v2/ecomp/admportal-sdnc-image/manifests/latest
Nexus 3 communicates with the Jenkins server, which is the interface used to publish docker images on a scheduled or on-demand basis (depending on the Jenkins JJB configuration for the particular job).
Nexus 3 Repositories¶
Nexus 3 allows users to manage different types of repositories. To learn more about how to manage them, please refer to Sonatype’s official documentation.
Most LF projects manage their Docker images using the following repos:
- docker.release:
(hosted/HTTP port 10002) Official repository for released images. Releases repositories have a Disable re-deployment policy to avoid overwriting released versions.
- docker.snapshot:
(hosted/HTTP port 10003) Used to publish docker snapshot images.
Special repo namespaces:
- docker.public:
(group/HTTP port 10001) A meta-url containing all release repos in a combined view.
- docker.staging:
(hosted/HTTP port 10004) Used to publish docker images produced by the scheduled jobs.
- docker.io:
Repositories that proxy artifacts from https://registry-1.docker.io.
For continuous integration builds, Jenkins has one settings file for each Gerrit repository. Each settings file contains an entry for each accessible Nexus3 repository (ServerId).

Fabric8.io plugin usage¶
Projects using the fabric8.io maven plugin to manage their docker images should make sure to define the docker registries as Maven properties. For example:
<docker.pull.registry>nexus3.onap.org:10001</docker.pull.registry>
<docker.push.registry>nexus3.onap.org:10003</docker.push.registry>
<plugin>
  <groupId>io.fabric8</groupId>
  <artifactId>docker-maven-plugin</artifactId>
  <version>0.19.1</version>
  <configuration>
    <verbose>true</verbose>
    <apiVersion>1.23</apiVersion>
    <pullRegistry>${docker.pull.registry}</pullRegistry>
    <pushRegistry>${docker.push.registry}</pushRegistry>
    <images>
      <image>
        <name>onap/vvp/jenkins</name>
        <alias>vvp-jenkins</alias>
        <build>
          <cleanup>true</cleanup>
          <tags>
            <tag>${docker.tag}</tag>
            <tag>${docker.latest.tag}</tag>
          </tags>
          <dockerFileDir>${project.basedir}</dockerFileDir>
        </build>
      </image>
    </images>
  </configuration>
</plugin>
Note
More information at https://dmp.fabric8.io
Users, Roles and Privileges¶
Users, roles and privileges are key to managing and restricting access to Nexus repositories. Anonymous users have read permissions, while administration teams and CI accounts have write and delete permissions.
For LF projects, we have created roles to help with the administration of Docker images, NPM/Pypi/Helm repositories and administrative tasks.

Nexus 3 does not require defining patterns for Repository Targets to allow a specific directory structure name to exist.
Like Nexus 2, we require one user entry per repo in Nexus 3.
Provide the following parameters for every user:
- ID:
Should match the Gerrit repository name. For example “aai-aai-common”
- First name:
Same as ID
- Last name:
We use a generic last name for users, for example “Deployment”
- Email:
Repo name + “-deploy@example.org”. For example “aai-aai-common-deploy@onap.org”
- Status:
Active. Can be “Disabled” if the Gerrit repo is no longer in use
- Roles:
docker. This role will allow the user to administer Docker images
MeetBot Guide¶
LF Project communities use MeetBot to take notes and to manage meetings on IRC.
To host a meeting, join #<project>-meeting on irc.libera.chat and take notes on the public IRC channel. We recommend that all meeting participants assist with the task of taking notes. This reduces the onus on a single person, since it's difficult to take notes and act as the chair of the meeting at the same time.
MeetBot uploads the meeting minutes and raw IRC logs to the LF IRC log server, where they are available under a directory named after the IRC channel (e.g. for #<project>-meeting).
Start and end a meeting¶
To start a meeting, use the #startmeeting command followed by the meeting name:
#startmeeting <Meeting name>
Use #chair to assign one or more meeting chairs:
#chair <username>
Meeting chairs have the ability to moderate the meeting, which allows them to use commands such as #startmeeting, #endmeeting, #topic, #startvote and #endvote.
Use #endmeeting to end the meeting. This frees up MeetBot to run other meetings in the channel and posts the links to the HTML and raw minutes to the channel.
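Putting these commands together, a minimal session might look like the following (nicks and MeetBot responses are illustrative, not verbatim bot output):

```
<alice>   #startmeeting Weekly Example Sync
<MeetBot> Meeting started. The chair is alice.
<alice>   #chair bob
<MeetBot> Current chairs: alice bob
<alice>   #topic Review Action Items
<alice>   #info carol closed all open tickets from last week
<alice>   #endmeeting
<MeetBot> Meeting ended. Links to the HTML and raw minutes posted to the channel.
```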
Take notes¶
Use #topic to set a discussion topic. This command automatically changes the topic and closes the previous discussion item.
#topic Review Action Items
Note
The chair of the meeting has to set the topic.
Use #info to record a note.
#info dneary suggested using MeetBot for meeting minutes
Use #agreed <agreement> to record agreements and document consensus.
#agreed promote the user X as committer on project Y
Note
The chair of the meeting has to record agreements.
Use #link to link to external resources in the minutes.
#link http://wiki.opnfv.org/wiki/MeetBot
Use #action to record action items. This creates a summary section at the end of the meeting minutes, summarizing the action items by assignee. Include the user names in the action to assign an action item.
Use #startvote <vote> and #endvote to start/end voting.
#startvote Do you approve a 15 minute coffee break? (+1, 0, -1)
Voters use #vote <option> to vote. Typically +1 is for approval, 0 for abstaining, and -1 for disapproval.
Use #undo to remove the last addition to the minutes. This command undoes the last command in the stack (e.g. #idea, #info, #action, #topic, etc.).
Post-meeting work¶
After the meeting, update the wiki page with the link to the HTML minutes summary along with the date. Then copy and paste MeetBot's in-channel output into an email and send the minutes to the project mailing list.
Example minutes and logs from OPNFV Test and Performance team, who met at 15:00 UTC on Thursday Jan 15, 2015:
SSH Guide¶
Ssh-keygen is a tool for creating new authentication key pairs for SSH, which are then used for automating logins, single sign-on, and for authenticating hosts.
Creating a SSH key on Windows¶
1. Check for existing SSH keys¶
You can use an existing key if you’d like, but creating a new key per service is a good security practice.
Open a command prompt, and run:
cd %userprofile%/.ssh
If you see “No such file or directory”, then there aren’t any existing keys and you’ll need to create a new one. Go to Generate a new SSH key.
Check to see if you have a key already:
dir id_*
If there are existing keys, you may want to use those.
2. Back up old SSH keys¶
If you have existing SSH keys, but you don’t want to use them when connecting to a remote server, you should back those up.
In a command prompt on your local computer, run:
mkdir key_backup
copy id_rsa* key_backup
3. Generate a new SSH key¶
If you don’t have an existing SSH key that you wish to use, generate one as follows:
Log in to your local computer as your user.
In a command prompt, run:
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
Associating the key with your email address helps you to identify the key later on.
Note that the ssh-keygen command is present and available if you have already installed Git (with Git Bash).
The command prompts you for a file location and name; press Enter to accept the default. Then enter, and re-enter, a passphrase when prompted.
You’re done!
Creating an SSH key on Linux & macOS¶
1. Check for existing SSH keys¶
You can use an existing key if you’d like, but creating a new key per service is a good security practice.
Open a terminal and run the following:
cd ~/.ssh
If you see “No such file or directory”, then there aren’t any existing keys and you’ll need to create a new one. Go to Generate a new SSH key. You can also refer to https://docs.github.com/en/enterprise/2.16/user/github/authenticating-to-github/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent.
Check to see if you have a key already:
ls id_*
If there are existing keys, you may want to use those.
2. Back up old SSH keys¶
If you have existing SSH keys, but you don’t want to use them when connecting to Bitbucket Server, you should back those up.
Do this in a terminal on your local computer, by running:
mkdir key_backup
mv id_rsa* key_backup
3. Generate a new SSH key¶
If you don’t have an existing SSH key that you wish to use, generate one as follows:
Open a terminal on your local computer and enter the following:
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
Associating the key with your email address helps you to identify the key later on.
Press <Enter> to accept the default location and file name. If the .ssh directory doesn’t exist, the system creates one for you. Enter, and re-enter, a passphrase when prompted.
You’re done!
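The steps above can also run non-interactively, which is handy for scripting a quick test. A sketch, with illustrative file name and an empty passphrase (for real keys, always set a passphrase):

```shell
# Generate a throwaway 4096-bit RSA key pair without prompts.
# -f sets the output file, -N "" sets an empty passphrase, -q keeps it quiet.
ssh-keygen -t rsa -b 4096 -C "your_email@example.com" -f ./demo_key -N "" -q

# Show the key's bit length, fingerprint, and comment.
ssh-keygen -lf ./demo_key.pub
```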
Self-Service:
Committer management¶
This is the documentation for self-serve committer management via your repository's INFO.yaml file. The purpose of the INFO file is twofold: project committers can use it to act as administrators of their project, and it provides a clear record of who the committers and project lead are, and who authorized their committer permissions.
Quick Start¶
Adding someone as a committer requires a change set against your project's INFO.yaml file. The change should add the needed information, including an approval link if your project requires it. Upon a successful merge, automation will process the change and update permissions as needed.
Note
Some projects' TSCs require approval to add committers and/or a PTL. If this is the case, append a link to the meeting minutes in the tsc: changes: section.
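For reference, a committer addition usually touches two places in the file: the committers list and the tsc: changes: section. A sketch, where the name, id, and link are illustrative placeholders:

```yaml
committers:
    - name: 'New Committer'
      email: 'new.committer@example.org'
      company: 'example'
      id: 'newcommitterid'
tsc:
    # yamllint disable rule:line-length
    changes:
        - type: 'Addition'
          name: 'New Committer'
          link: 'https://wiki.example.org/pages/ApprovalMinutes'
```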
Filling out the INFO file¶
The identity site will provide you with the values for your user.
name: ''
email: ''
company: ''
id: ''
Filling out the REPOSITORIES section¶
In this section you will list your repository: the one (1) repository which this INFO file handles. Each repository must have its own INFO file.
repositories:
- example
Note
Do not have more than one repository under the repositories heading. Below is an example of what not to do
repositories:
- example
- example2
- example3
Filling out the TSC approval section¶
In this section you list the history of PTL/committer changes. Add an entry for each committer's addition to or removal from the committer list, with one committer per type entry. Even if not required by your project, a good habit is to provide a link to the minutes of the meeting with the approval, or, if an approval is not needed, to a mail which informs of the decision.
The type can be Approval, Addition or Removal.
tsc:
    # yamllint disable rule:line-length
    approval: 'missing'
    changes:
        - type: 'approval'
          name: 'name of new committer'
          link: 'link to relevant Minutes of Meeting'
Note
Do not forget to add the yamllint option. Otherwise you will get a yamllint error:
error line too long (124 > 80 characters) (line-length)
Note
One name per change type; yamllint will return an error otherwise. If you have more names, add each name within its own type section:
changes:
    - type: 'Addition'
      name: 'Person1'
      link: 'https://wiki.example.org/pages/URL-2-PermissionMail1'
    - type: 'Addition'
      name: 'Person2'
      link: 'https://wiki.example.org/pages/URL-2-PermissionMail2'
For instance, the below faulty change section will give yamllint error:
changes:
    - type: 'Addition'
      name: 'Person1'
      name: 'Person2'
      link: 'https://wiki.example.org/pages/URL-2-PermissionMail'
error duplication of key “name” in mapping (key-duplicates)
Example
tsc:
    # yamllint disable rule:line-length
    approval: 'https://lists.example.org/pipermail/example-tsc'
    changes:
        - type: 'addition'
          name: 'John Doe'
          link: 'https://wiki.example.org/display/TOC/2019+09+18'
        - type: 'addition'
          name: 'Jane Doe'
          link: 'https://lists.example.org/g/example-TSC/message/3725'
        - type: 'removal'
          name: 'Gone Doe'
          link: 'https://lists.example.org/g/example-TSC/message/3726'
Lint check before submitting¶
It is always a good habit to perform a lint check before submitting. One tool for this is yamllint:
sudo dnf install yamllint
And then to check your INFO file
yamllint INFO.yaml
No output indicates no fault found.
To showcase how yamllint will present possible errors, see below example.
Here is an INFO file snippet with more than one name row under a type (only one name row is allowed).
- type: 'Removal'
  name: 'Person 1'
  name: 'Person 2'
  link: 'https://lists.example.org/g/message/msgnbr'
And this is the result when you do the lint check
yamllint INFO.yaml
98:11 error duplication of key "name" in mapping (key-duplicates)
99:11 error duplication of key "name" in mapping (key-duplicates)
Verify against INFO.yaml schema¶
It is also a good habit to verify that your INFO.yaml file follows the proper schema.
Download info-schema.yaml and yaml-verify-schema.py
wget -q https://raw.githubusercontent.com/lfit/releng-global-jjb/master/schema/info-schema.yaml \
https://raw.githubusercontent.com/lfit/releng-global-jjb/master/yaml-verify-schema.py
Verify INFO.yaml uses correct schema
pip install -U jsonschema
python yaml-verify-schema.py \
--yaml INFO.yaml \
--schema info-schema.yaml
No output indicates INFO.yaml file is valid against the schema. Otherwise, ensure you correct any issues before continuing.
Example INFO file¶
---
project: 'example'
project_creation_date: '2019-11-13'
project_category: ''
lifecycle_state: 'Incubation'
project_lead: &example_example_ptl
    name: ''
    email: ''
    id: ''
    company: ''
    timezone: ''
primary_contact: *example_example_ptl
issue_tracking:
    type: 'jira'
    url: 'https://jira.example.org/projects/'
    key: 'example'
mailing_list:
    type: 'groups.io'
    url: 'technical-discuss@lists.example.org'
    tag: '[]'
realtime_discussion:
    type: 'irc'
    server: 'libera.chat'
    channel: '#example'
meetings:
    - type: 'gotomeeting+irc'
      agenda: 'https://wiki.example.org/display/'
      url: ''
      server: 'libera.chat'
      channel: '#example'
      repeats: ''
      time: ''
repositories:
    - example
committers:
    - <<: *example_example_ptl
    - name: ''
      email: ''
      company: ''
      id: ''
tsc:
    # yamllint disable rule:line-length
    approval: 'missing'
    changes:
        - type: ''
          name: ''
          link: ''
INFO.yaml auto-merge job¶
The auto-merge job triggers after an INFO.yaml verify run for committer changes to an already existing repository.
The job checks if the change verified belongs to a project where either TSC or TOC members approved automatically merging changes after the INFO verify job votes +1 Verified.
Auto-merge skips changes for new project creation, as it detects a new INFO.yaml file. In that case, a RELENG engineer needs to review the change.
How to enable auto-merge¶
Get TSC or TOC approval to enable auto-merge in your project
Clone the LF ci-management repo
git clone "https://gerrit.linuxfoundation.org/infra/ci-management"
Edit info-auto-merge script in jjb/ci-management/info-auto-merge.sh
if [[ $gerrit_name == "onap" || $gerrit_name == "o-ran-sc" ]]; then
Note
Add your project to the IF block in a new OR statement. This IF block allows approved projects to auto-merge changes and skips projects that are not listed.
Push your change and wait for reviews and approval
After merging your change, the account “lf-auto-merge” will +2 Code Review and Submit INFO.yaml file changes approved by info-master-verify.
Project Creation¶
Introduction¶
Self-serve project creation: to reduce administrator time spent on project creation, automation is in place. Submitting and merging an INFO.yaml file for a new Gerrit repository now creates the project and its related resources.
Quick Start and INFO.yaml Creation¶
To drive self-serve project creation, submit an INFO.yaml for approval in the correct location in the Linux Foundation's info-master repository.
Determine the correct location via the path in the info-master repository.
At the top level, the info-master repo is a collection of directories, one for each Gerrit site.
Inside these are the top-level projects, then their child projects, and so on.
The following example is for a Gerrit named gerrit.example.org and a project named example-parent/example-child.
An example of a parent and child: gerrit.onap.org/ccsdk/dashboard, where ccsdk is the parent and dashboard is the child.
Correct clone options for your LFID are available in the info-master repo: https://gerrit.linuxfoundation.org/infra/admin/repos/releng/info-master
Example of cloning the info-master repo and creating a new repo “example-parent/example-child” on gerrit.example.org
git clone ssh://LFID@gerrit.linuxfoundation.org:29418/releng/info-master
cd info-master/gerrit.example.org/example-parent/
mkdir example-child/ && cd example-child
We are now in an empty directory whose name matches the repository we are creating.
We must create an INFO.yaml file in this directory and submit it for review. We have created an optional helper to expedite this step; it makes creating an INFO.yaml file quicker and less error prone.
lftools infofile create-info-file gerrit.example.org example-parent/example-child --empty --tsc_approval "https://link.to.meeting.minutes" > INFO.yaml
We must now pause and fill out the empty sections on the INFO.yaml
vim INFO.yaml   # (add committers and lead)
tox             # (check that you have entered valid yaml)
git add INFO.yaml
git commit -sv
cd ../../..     # back to the root path of the info-master repo
git review
Note
An LF staff member will be automatically added to review your change. If the --tsc_approval link checks out and the verify job passes, your project creation will happen on the merge of your patch set.
Jenkins Configuration¶
After merging the INFO.yaml, Jenkins updates project managed credentials (username and passwords).
Users still need to create the Jenkins Maven settings files, which are used for Nexus artifact and image deployment.
Clone the ci-management repo for your project
git clone ssh://LFID@gerrit.o-ran-sc.org:29418/ci-management
Navigate to jenkins-config/managed-config-files/mavenSettings and create your project directory and files. Name the directory “<project-name>-settings”
This folder contains:
config-params.yaml - Parameters file
content - Symbolic link to file “../../../managed-config-templates/mavenSettings-content”
serverCredentialMappings.sandbox.yaml - Symbolic link to file ../../../managed-config-templates/serverCredentialMappings.sandbox.yaml
serverCredentialMappings.yaml - Maven Server ID and Credential mappings
Note
Users can copy these files from an existing repo’s settings files to use as a guide and update them to match their repo names.
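As a rough sketch, serverCredentialMappings.yaml maps Maven server IDs to the Jenkins credentials created from the INFO.yaml merge. The server IDs and credential ID below are illustrative assumptions, not a definitive format; copy a real file from an existing repo as noted above:

```yaml
serverCredentialMappings:
  - serverId: "releases"
    credentialsId: "example-project"
  - serverId: "snapshots"
    credentialsId: "example-project"
```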
Push the change. LFIT will review the change and merge it
These jenkins-config files reference the corresponding Jenkins repo credentials created after merging the INFO.yaml file. There is a set of credentials and jenkins-config files for each repo. If a repo has subfolders for different subcomponents, they all use the credentials matching the .git repo.
jenkins-config files allow deployment of artifacts and docker images to Nexus and Nexus3 via Jenkins jobs.
Note
Please contact support.linuxfoundation.org for any questions during this process, or for any related Jenkins failures with project credentials and Nexus/Nexus3 access issues.
Tools:
Help:
LF Helpdesk / Service Desk¶
The service desk is a support platform for administrative and infrastructure issues requiring input or action from members of the Linux Foundation IT staff.
Choose “Project Services” as your support category and then start typing your question into the field provided. As you type, the system will suggest knowledge base articles that may help you find an applicable self-service solution.
If none of the suggested articles are helping you resolve your problem, please navigate the menus under the input field to select the category that best reflects the nature of your request.
We can help you faster and better if you provide a detailed explanation of the problem in the description field of your new support request. Once you submit the request, you can track its status by selecting “Requests->My Requests” in the top-right of the Service Desk interface. If there are any changes to your request, you will receive an automated email notification.
Our response time to routine support requests will vary depending on the nature of the request and we will generally take care of issues during US business hours.
Libera.Chat IRC¶
Libera.Chat is an IRC network used to discuss peer-directed projects. Libera.Chat is a popular choice for Open Source project collaboration and having your project use this network makes it easy to cross collaborate with other communities.
The Linux Foundation operates support channels to provide community help.
Important
Due to prolonged SPAM attacks, all Linux Foundation project channels now require registered accounts to join. Register your account with the instructions below.
Register your username¶
To register you must set your nick, register it and authenticate.
Set your IRC nick:
/nick <username>
Register your IRC nick:
/msg NickServ REGISTER <password> <youremail@example.com>
To Authenticate:
/msg NickServ IDENTIFY <username> <password>
Note
If you are already registered and encounter “-!- Nick YourNick is already in use” you will need to ghost your nick:
/msg NickServ ghost <username> <password>
This command kicks whoever is using your nick allowing you to take it back.
Your IRC client will have a way of automating your login identification; please refer to the docs of your IRC client for instructions.
For further details on the Libera.Chat registration process, please see https://libera.chat/guides/registration
Channel management¶
Use the ChanServ service to manage IRC Channels. Use the command
/msg chanserv help
to get detailed documentation of ChanServ commands
and more specific help by adding specific sections
/msg chanserv help [section] ...
to the end of the command.
The first person who joins a channel creates the channel and becomes op, as marked by the @ symbol next to their name. This person can choose to register the channel, in which case they become the Founder of the channel. The channel Founder has full permissions to manage the channel.
We recommend registering any channels that the project plans to use for an extended period of time.
Register a channel¶
New projects can register their project specific channel by using the REGISTER command and passing the channel name they’d like to register.
/msg chanserv register <channel>
After registering the channel, we recommend providing Founder permissions to one of the following LF Staff to ensure that the channel remains manageable by LF Staff should the original founder move on from the project. Provide the flags +F to one of:
aricg
bramwelt
tykeal
zxiiro
/msg chanserv flags <channel> <nick> +F
Once done, notify LF Staff about the new channel registration.
Linux Foundation Channels¶
The Linux Foundation operates the following channels on IRC. We recommend that project developers at least join the #lf-releng channel for releng or CI related questions.
- #lf-docs:
For cross community documentation collaboration.
- #lf-releng:
Linux Foundation Release Engineering channel for asking general support questions, as well as about LF projects such as jjb / lftools / packer / etc.
- #lf-unregistered:
Redirect channel for unauthenticated users.
IRC Best Practices¶
For users¶
Skip the formalities and ask your question¶
Avoid the unnecessary 3-way handshake when asking a question. Eg.
user1> Hi, I have a question.
user2> Hello user1, what is your question?
user1> My question is…
Asking the question upfront allows everyone watching the channel to respond. People may be away from their terminals and not see the question when you ask it, and by the time they respond, hours later, you may no longer be around, causing an unnecessary feedback loop.
Be patient¶
People who might know the answer to your question may not be available but may see it later on. If you are not in the channel when someone who can answer is around then they will not be able to answer.
Try the mailing list¶
If you cannot stick around in the channel for a response try leaving your question on the project’s mailing list. Most projects have one at lists.example.org where example.org is the domain of the project.
For channel moderators¶
DO NOT use ops unless necessary¶
Setting yourself as ops targets you to the top of the channel list, making you
the obvious choice to direct questions to. Have everyone in the channel deopped
and then use /msg chanserv
commands to administrate the channel. This
ensures anonymity when running commands in the channel.
LF Internal:
Infrastructure Guide¶
A collection of documentation describing aspects of Linux Foundation Infrastructure provided to projects.
Inventory¶
Escalation¶
Attention
This document is for LF internal release engineering. The information below references communications channels that are not all reachable by non-LF staff.
Infrastructure critical to releng:
Gerrit
Nexus
Jenkins
Priority is to make sure developers are able to continue working. This means Jenkins, Nexus, and Gerrit are reachable and can perform code builds.
Note
A project failing because of a bug or compile error in their code is not an emergency. However, if known working code fails because the job cannot fetch code from Gerrit or artifacts from Nexus, or builders are not spawning in Jenkins, that is an emergency: infrastructure is not working as expected, preventing the project from building their code.
If we are unable to perform any builds and these services are offline then we need to make sure someone is working on getting these services back online.
Look into the problem and see if we can fix it ourselves.
Failing that, ping the IT team in #it-infra for help.
Note
Use @here in the #it-infra channel to ping everyone in the channel.
Contact the emergency line: emergency@linuxfoundation.org
In the email provide these key details:
What project (FD.io / OpenDaylight / ONAP / OPNFV / etc…)
What service is failing (Gerrit / Jenkins / Nexus)
Note
The emergency line will ring the pager and contact whoever is on call.
New Infra Bootstrap¶
This document uses example.org as the domain for all examples. Please change it to point to the intended systems for your project.
Jenkins¶
Steps
Login to Jenkins at https://jenkins.example.org
Navigate to https://jenkins.example.org/pluginManager/
Update all plugins
Install required plugins as documented in global-jjb install guide
Install the following plugins:
Navigate to https://jenkins.example.org/configure
Configure Jenkins as follows:
# of executors: 0
Jenkins URL: https://jenkins.example.org
System Admin e-mail address: Jenkins <jenkins-dontreply@example.org>
Global Config user.name value: jenkins
Global Config user.email value: jenkins@example.org
If using the Message Injector plugin, set Message to inject to
Logs: https://logs.example.org/SILO/HOSTNAME/$JOB_NAME/$BUILD_NUMBER
and replace SILO and HOSTNAME as appropriate.
Click Save
Configure Jenkins security as described in Jenkins Security
Navigate to https://jenkins.example.org/configureSecurity/
Configure the following permissions for Anonymous Users:
Overall:Read
Job:ExtendedRead
Job:Read
View:Read
Note
If the project is not yet public, hold off on these permissions or adjust as necessary for the project’s case.
Setup Jenkins global environment variables as described in the global-jjb install guide
Note
Skip the ci-management step, as we will discuss that in the next section.
Setup a jobbuilder account
Setup global-jjb required Jenkins Files
Setup Job Builder account¶
The ci-jobs in global-jjb require a jobbuilder account which has permissions to login to Jenkins.
Navigate to https://identity.linuxfoundation.org/ and create an account for jobbuilder.
Note
This step mainly applies to LF projects. Use the relevant identity system as it applies to your local configuration.
Navigate to https://jenkins.example.org/configureSecurity and configure permissions for the jobbuilder account as follows:
Overall: Administer
Job: Configure
Job: Create
Job: Delete
Job: Discover
Job: Read
View: Configure
View: Create
View: Delete
View: Read
Setup Sandbox Access¶
To allow people access to the Jenkins Sandbox, we require an LDAP group to exist with the appropriate people added. Use lftools lfidapi create-group to create a group called $project-jenkins-sandbox-access and add any initial members you might need.
Go to https://jenkins.example.org/configureSecurity and add the group with:
Overall: Read
Job: Build
Job: Cancel
Job: Configure
Job: Create
Job: Delete
Job: Discover
Job: Read
Job: Workspace
View: Read
ci-management repo¶
Once Jenkins is available we can initialize a new ci-management repo.
Setup administrative files¶
Create ci-management repo in the project SCM system
Create a README.md file explaining the purpose of the repo
# ci-management

This repo contains configuration files for Jenkins jobs for the EXAMPLE project.
Setup tox/coala linting for the jjb/ and packer/ directories.
.yamllint.conf
extends: default
rules:
  empty-lines:
    max-end: 1
  line-length:
    max: 120
.coafile
[Documentation]
bears = WriteGoodLintBear
files = *.md
allow_so_beginning = False
allow_there_is = False
allow_cliche_phrases = False

[GitCommit]
bears = GitCommitBear
ignore_length_regex = Signed-off-by, Also-by, Co-authored-by, http://, https://

[JSON]
bears = JSONFormatBear
files = packer/**.json
indent_size = 2

[ShellCheck]
bears = ShellCheckBear, SpaceConsistencyBear
files = jjb/**.sh, packer/**.sh
shell = bash
indent_size = 4
use_spaces = yeah

[YAML]
bears = YAMLLintBear
files = jjb/**/*.yaml
document_start = True
yamllint_config = .yamllint.conf
tox.ini
[tox]
minversion = 1.6
envlist = coala
skipsdist = true

[testenv:coala]
basepython = python3
deps =
    coala==0.11
    coala-bears==0.11
    nodeenv~=1.3.0
commands =
    nodeenv -p
    npm install --global write-good
    python3 -m nltk.downloader punkt maxent_treebank_pos_tagger averaged_perceptron_tagger
    coala --non-interactive
Setup .gitignore
.tox/
archives/
jenkins.ini

# Packer
.galaxy/
*.retry
cloud-env.json
Commit and push the files to the repository:
git commit -asm "Setup repo administrative files"
git push
Run tox
Note
The jjb tox env will fail as the required jjb/ directory does not yet exist. This is fine and proves that tox is working before we continue to the next step.
Bootstrap common-packer and initial builder¶
Note
This section assumes the usage of an OpenStack cloud provider for Jenkins build nodes. Adjust as necessary if not using an OpenStack cloud.
Navigate to the GIT_ROOT of the ci-management repo.
Install common-packer to GIT_ROOT/packer/common-packer:
git submodule add https://github.com/lfit/releng-common-packer.git packer/common-packer
Follow the common-packer doc to set up a template.
Commit and push the files to the repository:
git commit -asm "Setup common-packer and initial builder"
git push
Upload a CentOS 7 cloudimg to use as a base for packer builds. When uploading the cloudimg, ensure its name matches the base_image name in common-packer/vars/centos-7.json.
Run:
packer build -var-file=cloud-env.json -var-file=common-packer/vars/centos-7.json templates/builder.json
Note down the image name from the packer build, as we will need it later.
Navigate to
https://jenkins.example.org/credentials/store/system/domain/_/newCredentials
Configure the OpenStack cloud credential as follows:
Kind: OpenStack auth v3
Project Domain: Default
Project Name: OPENSTACK_TENANT_ID
User Domain: Default
User Name: OPENSTACK_USERNAME
Password: OPENSTACK_PASSWORD
ID: os-cloud
Description: openstack-cloud-credential
Note
Replace ALL_CAPS instances with your Cattle account credential.
Configure an ssh keypair for the Jenkins <-> OpenStack connection.
Generate a new SSH keypair:
ssh-keygen -t rsa -C jenkins-ssh -f /tmp/jenkins
Navigate to https://jenkins.example.org/credentials/store/system/domain/_/newCredentials
Configure the Jenkins SSH key as follows:
Kind: SSH Username and private key
Scope: Global
Username: jenkins
Private Key: Enter directly
Passphrase:
ID: jenkins-ssh
Description: jenkins-ssh
Copy the contents of /tmp/jenkins into the Key field.
Navigate to https://openstack-cloud.example.org/project/key_pairs
Import the contents of /tmp/jenkins.pub into the OpenStack cloud provider account with the keypair name jenkins-ssh.
Navigate to
https://jenkins.example.org/configfiles/selectProvider
Create a
OpenStack User Data
file with the following specs:Type: OpenStack User Data ID: jenkins-init-script Name: jenkins-init-script Comment: jenkins-init-script
With the contents (change the git clone URL as necessary for the project):
#!/bin/bash until host gerrit.example.org &>/dev/null do echo "Waiting until gerrit.example.org is resolvable..." done git clone --recurse-submodules https://gerrit.example.org/r/ci-management /opt/ciman /opt/ciman/jjb/global-jjb/jenkins-init-scripts/init.sh
For Windows:
Type: OpenStack User Data ID: jenkins-init-script-windows Name: jenkins-init-script-windows Comment: jenkins-init-script-windows
With the contents (change the git clone URL as necessary for the project):
<powershell>
# Resize first partition of first disk to maximum size
Get-Partition -DiskNumber 0 -PartitionNumber 1
$size = (Get-PartitionSupportedSize -DiskNumber 0 -PartitionNumber 1)
Resize-Partition -DiskNumber 0 -PartitionNumber 1 -Size $size.SizeMax

mkdir -Force "${SLAVE_JENKINS_HOME}"
(new-object System.Net.WebClient).DownloadFile('${SLAVE_JAR_URL}','${SLAVE_JENKINS_HOME}\slave.jar')
cd "${SLAVE_JENKINS_HOME}"
java ${SLAVE_JVM_OPTIONS} -jar "slave.jar" -jnlpUrl "${SLAVE_JNLP_URL}" -secret "${SLAVE_JNLP_SECRET}"
</powershell>
Configure the
cattle
cloud
Create the cloud config directory
mkdir -p jenkins-config/clouds/openstack/cattle
Configure the OpenStack cloud connection details in
jenkins-config/clouds/openstack/cattle/cloud.cfg
Replace
<BUILD_IMAGE_NAME>
and
<NETWORK_ID>
in the below file with the details for your cloud. Find
<NETWORK_ID>
at https://dashboard.vexxhost.net/project/networks/
jenkins-config/clouds/openstack/cattle/cloud.cfg¶
# Cloud Configuration
CLOUD_CREDENTIAL_ID=os-cloud
CLOUD_URL=https://auth.vexxhost.net/v3/
CLOUD_IGNORE_SSL=false
CLOUD_ZONE=ca-ymq-1

# Default Template Configuration
IMAGE_NAME=<BUILD_IMAGE_NAME>
HARDWARE_ID=v3-standard-2
NETWORK_ID=<NETWORK_ID>
USER_DATA_ID=jenkins-init-script
INSTANCE_CAP=10
SANDBOX_CAP=4
FLOATING_IP_POOL=
SECURITY_GROUPS=default
STARTUP_TIMEOUT=600000
KEY_PAIR_NAME=jenkins-ssh
NUM_EXECUTORS=1
JVM_OPTIONS=
FS_ROOT=/w
RETENTION_TIME=0
Create
jenkins-config/clouds/openstack/cattle/centos7-builder-2c-1g.cfg
IMAGE_NAME=ZZCI - CentOS 7 - builder - 20180604-1653
HARDWARE_ID=v3-standard-2
Run global-jjb jenkins-cfg script to update Jenkins cloud config
Note
This step requires the
crudini
tool; install it from your package manager to avoid Python 2 vs 3 problems in your virtualenv.
Note
This step requires having lftools available on your path and a
~/.config/jenkins_jobs/jenkins_jobs.ini
configured with Jenkins credentials.
Set
jenkins_silos
to match the config section name in the
jenkins_jobs.ini
file.
Run the following commands:
export WORKSPACE=$(pwd)
export jenkins_silos=production
bash ./jjb/global-jjb/shell/jenkins-configure-clouds.sh

# OPTIONAL: view the created script
cat archives/groovy-inserts/production-cloud-cfg.groovy
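A minimal sketch of the jenkins_jobs.ini referenced above, assuming the silo-named section layout (the section name matches jenkins_silos; all values here are placeholders, not real credentials):

```ini
[production]
url=https://jenkins.example.org
user=jenkins-admin
password=<jenkins-api-token>
```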
Then navigate to
https://jenkins.example.org/script
and copy the contents of
archives/groovy-inserts/production-cloud-cfg.groovy
into the script console. This will initialize the OpenStack cloud configuration.
Commit the
jenkins-config
directory
git add jenkins-config/
git commit -sm "Add OpenStack cloud configuration"
git push
Navigate to
https://jenkins.example.org/configure
and verify the cloud configuration.
Setup global-jjb and ci-jobs¶
Install global-jjb to
GIT_ROOT/jjb/global-jjb
git submodule add https://github.com/lfit/releng-global-jjb.git jjb/global-jjb
Setup
jjb/defaults.yaml
- defaults:
    name: global

    gerrit-server-name: Primary
    git-url: 'ssh://jenkins-$SILO@gerrit.example.org:29418'
    jenkins-ssh-credential: jenkins-ssh
    lftools-version: '<1.0.0'
Create the CI Jobs in
jjb/ci-management/ci-jobs.yaml
- project:
    name: ci-jobs
    jobs:
      - '{project-name}-ci-jobs'

    project: ci-management
    project-name: ci-management
    build-node: centos7-builder-2c-1g
Manually push the initial ci-management jobs to Jenkins
jenkins-jobs update jjb/
Git commit the current files and push to Gerrit
git commit -sm "Setup global-jjb and ci-jobs" git push
Confirm verify jobs work
Merge the patch and confirm merge job works
Setup packer jobs¶
Create Initial CI Packer job in jjb/ci-management/ci-packer.yaml
- project:
    name: packer-verify
    jobs:
      - gerrit-packer-verify

    project: ci-management
    project-name: ci-management
    build-node: centos7-builder-2c-1g

- project:
    name: packer-builder-jobs
    jobs:
      - gerrit-packer-merge

    project: ci-management
    project-name: ci-management
    build-node: centos7-builder-2c-1g
    templates: builder
    platforms:
      - centos-7
      - ubuntu-16.04
Git commit and push the patch to ci-management for review
git commit -sm "Add packer builder job" git push ...
Confirm packer verify job passes
Merge patch and confirm merge job works
Nexus 2¶
Setup Server Config¶
Navigate to https://nexus.example.org/#nexus-config
SMTP Settings
Hostname: localhost
Port: 25
Username:
Password:
Connection: Use plain SMTP
System Email: noreply@example.org
Application Server Settings
Base URL: https://nexus.example.org/
Force base URL: true
UI Timeout: 120
PGP Key Server Information
Server 1: http://pool.sks-keyservers.net:11371
Server 2: http://pgp.mit.edu:11371
Setup LDAP¶
Navigate to https://nexus.example.org/#enterprise-ldap
Click
Add
at the top menu bar
Configure the LDAP connection as follows:
Name: ldaps://ldap.example.org:636
Protocol: ldaps
Hostname: ldap.example.org
Port: 636
Search Base: dc=example,dc=org
Authentication: Anonymous Authentication
Click on the
User & Group Settings
tab
Configure the
User & Group Settings
as follows:
Base DN: ou=Users
Object Class: inetOrgPerson
User ID Attribute: uid
Real Name Attribute: cn
E-Mail Attribute: mail
Group Type: Static Groups
Base DN: ou=groups
Object Class: groupOfNames
Group ID Attribute: cn
Group Member Attribute: member
Group Member Format: ${dn}
Setup Admin role¶
Navigate to https://nexus.example.org/#security-roles
Click
Add > External Role Mapping
Configure mapping as follows:
Realm: LDAP
Role: lf-collab-admins
Note
If not an LF project replace
lf-collab-admins
with the relevant admin group for your case.
Click
Add
and add the
Nexus Administrator Role
From this point you should be able to log in with your own account to administer the server. Do that, then set the admin user email and deactivate the default deployment account, as we will create separate deployment accounts for each individual project.
Navigate to https://nexus.example.org/#security-users
Configure the admin user email to
collab-it+PROJECT@linuxfoundation.org
Note
Replace email as necessary for your org.
Set the default deployment user account Status to
Disabled
Setup custom deployment role¶
LF projects use Nexus 2 as a server to host logs, which requires the
Nexus Unpack
plugin. Since the default Nexus Deployment Role
is not configurable, we have to create our own custom role to ensure Unpack
is available.
Navigate to https://nexus.example.org/#security-roles
Click
Add > Nexus Role
Configure the following settings:
Role Id: lf-deployment
Name: LF Deployment Role
Description: LF modified deployment role
Click
Add
and add the following roles:
Artifact Upload
Nexus Deployment Role
Unpack
Setup routing¶
Navigate to https://nexus.example.org/#routes-config
Clear all existing routes
Click
Add
to add a new route
Configure the route as follows:
URL Pattern: ^/org/example/.*
Rule Type: Inclusive
Repository Group: All Repository Groups
Ordered Route Repositories:
  * Releases
  * Snapshots
Nexus 3¶
Setup Server Config¶
Navigate to https://nexus3.example.org/#admin/system/emailserver
SMTP Settings
Enabled: true
Hostname: localhost
Port: 25
Username:
Password:
From address: noreply@example.org
Subject prefix:
Setup LDAP¶
Navigate to https://nexus3.example.org/#admin/security/ldap
Click
Create connection
Configure the LDAP connection as follows
Name: ldaps://ldap.example.org:636
Protocol: ldaps
Hostname: ldap.example.org
Port: 636
Search base: dc=example,dc=org
Authentication method: Anonymous Authentication
Click
Verify connection
and check that it works
Click
Next
Configure the
User & Group Settings
as follows:
Base DN: ou=Users
Object Class: inetOrgPerson
User ID Attribute: uid
Real Name Attribute: cn
E-Mail Attribute: mail
Map LDAP groups as roles: true
Group Type: Static Groups
Base DN: ou=groups
Object Class: groupOfNames
Group ID Attribute: cn
Group Member Attribute: member
Group Member Format: ${dn}
Click
Verify user mapping
and confirm it works
Click
Create
Setup Admin role¶
Navigate to https://nexus3.example.org/#admin/security/roles
Click
Create role > External Role Mapping
Configure mapping as follows:
Mapped Role: lf-collab-admins
Role Name: lf-collab-admins
Role description: lf-collab-admins
Privileges: nx-all
From this point you should be able to log in with your own account to administer the server. Do that, then set the admin user email and deactivate the default deployment account, as we will create separate deployment accounts for each individual project.
Navigate to https://nexus3.example.org/#admin/security/users:admin
Configure the admin user email to
collab-it+PROJECT@linuxfoundation.org
Note
Replace email as necessary for your org.
Post bootstrap¶
With the infrastructure bootstrapped, here is a list of tasks that may be useful to set up.
GitHub¶
Nexus¶
Gerrit¶
GitHub Replication Configuration¶
Initial configuration (required once)¶
Hiera configuration:
Gerrit::extra_configs:
  replication_config:
    config_file: '/opt/gerrit/etc/replication.config'
    mode: '0644'
    options:
      'remote.github':
        # ORG == the Org on GitHub
        # ${name} is literal and should exist in that format
        url: 'git@github.com:ORG/${name}.git'
        push:
          - '+refs/heads/*:refs/heads/*'
          - '+refs/tags/*:refs/tags/*'
        timeout: '5'
        threads: '5'
        authGroup: 'GitHub Replication'
        remoteNameStyle: 'dash'
If a $PROJECT-github account does not exist on GitHub, create it, set up 2-factor authentication on the account, and add the recovery tokens to LastPass. The email for the account should be collab-it+$PROJECT-github@linuxfoundation.org
Copy the public SSH key for the ‘gerrit’ user into the GitHub account
On the Gerrit Server do the following:
# create 'root' shell
sudo -i
# create 'gerrit' shell
sudo -iu gerrit
# Add the server key to gerrit's known_hosts file
ssh-keyscan -t rsa github.com >> ~/.ssh/known_hosts
# exit from 'gerrit' shell
exit
# restart Gerrit so that SSH changes are properly picked up
systemctl restart gerrit
# exit from 'root' shell
exit
Add the account to the GitHub Organization as a Member
Configure the Organization with the following options:
Members cannot create repositories
Members cannot delete or transfer repositories
Set the default repository permission to Read
Require 2FA (Two Factor Authentication) for everyone
Create a Replication team in the organization and add the $PROJECT-github account
In Gerrit create a ‘GitHub Replication’ group that is empty
Set the following ACL on the All-Projects repository
refs/* Read DENY: GitHub Replication
Repository replication setup (repeat for each repository)¶
Note
After the initial setup described above, Gerrit project creation, GitHub repo creation, and Gerrit replication are done with lftools commands.
To create_repo, clone_repo, create_groups_file and add_gitreview:
lftools gerrit create [OPTIONS] GERRIT_URL LDAP_GROUP REPO USER
To create a github repo:
lftools github create-repo --sparse ORGANIZATION REPOSITORY DESCRIPTION
To enable replication:
lftools gerrit create --enable GERRIT_URL LDAP_GROUP REPO USER
Manual Process¶
Perform the following in each repository mirrored from Gerrit
Create the repository in the GitHub organization replacing any occurrence of ‘/’ with ‘-’ as ‘/’ is an illegal character for GitHub repositories.
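The rename rule above can be sketched as a tiny helper (the function name is illustrative): GitHub repository names cannot contain '/', so every '/' in the Gerrit project path becomes '-'.

```shell
# Map a Gerrit project path to a legal GitHub repository name by
# replacing every '/' with '-'.
gerrit_to_github_name() {
  printf '%s\n' "$1" | tr '/' '-'
}

gerrit_to_github_name "releng/ci-management"   # -> releng-ci-management
```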
Add the Replication Team to the repository with write privileges
In Gerrit add the following ACL
refs/* Read ALLOW: GitHub Replication
Perform initial code drop
The initial code drop must be present before you enable Gerrit replication for a repository.
Enable repo replication
To enable replication for a single repo:
ssh -p 29418 ${youruid}@${project_gerrit} replication start --wait --url ${repo_url}
To enable replication for more than one repo:
ssh -p 29418 ${youruid}@${project_gerrit} replication start --all --wait
Watch GitHub to see if the repo starts to replicate; if not, troubleshoot by looking at ~gerrit/logs/replication*
Gerrit Prolog Filter¶
LF has automated the handling of committers and creation of repos, which makes it crucial that the INFO.yaml file is correct.
To enforce this, Gerrit needs to run some extra checks on the submitted files, in particular the INFO.yaml file. A change set that touches INFO.yaml must not contain any other files, to enable fault tracing and handling.
To summarize, below are the requirements:
Ensure that self review with +2 is not allowed.
Ensure that INFO.yaml has been automatically reviewed and approved by Jenkins.
Ensure that INFO.yaml file is alone in the change set.
A Gerrit prolog filter, located in the Gerrit All-Projects repository, implements the above requirements. The project repos inherit and apply the filter.
Note
For further information about Prolog and Gerrit refer to the Prolog Cookbook
Below are the instructions on how to install this filter.
Clone the project’s All-Projects repo
git clone "ssh://<user>@gerrit.<project>.org:29418/All-Projects"
git fetch origin refs/meta/config:config
git checkout config
Confirm rules.pl is not modified.
Verify that the rules.pl file is either missing or contains code for non-author-approval like below
submit_filter(In, Out) :-
    In =.. [submit | Ls],
    add_non_author_approval(Ls, R),
    Out =.. [submit | R].

add_non_author_approval(S1, S2) :-
    gerrit:commit_author(A),
    gerrit:commit_label(label('Code-Review', 2), R),
    R \= A, !,
    S2 = [label('Non-Author-Code-Review', ok(R)) | S1].
add_non_author_approval(S1, [label('Non-Author-Code-Review', need(_)) | S1]).
Note
If rules.pl contains something else, please confirm before continuing, since below steps will overwrite the old rules.pl.
Get the user ids for the automatic code-review users.
Go to the appropriate Gerrit’s groups page (https://gerrit.example.org/r/admin/groups)
Click on Non-Interactive Users
Click on Members
Verify these users are the correct ones. For ONAP that would be ONAP Jobbuilder, ecomp jobbuilder, and LF Jenkins CI
Click on Audit Log
Find the Added row for each user. The member column contains the userid (in parentheses). For instance, for ONAP Jobbuilder the record states Added ONAP Jobbuilder(459), where the user id is 459.
These user ids should replace the ones in the rules.pl further down in this document. Below is the relevant code area in rules.pl.
% Define who is the special Jenkins user
jenkins_user(user(459)).  % onap-jobbuilder@jenkins.onap.org
jenkins_user(user(3)).    % ecomp-jobbuilder@jenkins.openecomp.org
jenkins_user(user(4937)). % releng+lf-jobbuilder@linuxfoundation.org
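Extracting the id from the audit-log text can be sketched as below (the helper name and regex are illustrative, not part of any tool):

```shell
# Pull the numeric user id out of an audit-log member entry such as
# 'Added ONAP Jobbuilder(459)'.
extract_userid() {
  printf '%s\n' "$1" | sed -n 's/.*(\([0-9][0-9]*\)).*/\1/p'
}

extract_userid "Added ONAP Jobbuilder(459)"   # -> 459
```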
Replace/Create rules.pl with below content
submit_filter(In, Out) :-
    In =.. [submit | Ls],
    % add the non-owner code review requirement
    reject_self_review(Ls, R1),
    % Reject if multiple files and one is INFO.yaml
    ensure_info_file_is_only_file(R1, R2),
    % Reject if the INFO file has not been verified by Jenkins
    if_info_file_require_jenkins_plus_1(R2, R),
    Out =.. [submit | R].

% =============
% Filter to require all projects to have a code-reviewer other than the owner
% =============
reject_self_review(S1, S2) :-
    % set O to be the change owner
    gerrit:change_owner(O),
    % find a +2 code review, if it exists, and set R to be the reviewer
    gerrit:commit_label(label('Code-Review', 2), R),
    % if there is a +2 review from someone other than the owner,
    % then the filter has no work to do, assign S2 to S1
    R \= O,
    !, % the cut (!) predicate prevents further rules from being consulted
    S2 = S1.
reject_self_review(S1, S2) :-
    % set O to be the change owner
    gerrit:change_owner(O),
    % find a +2 code review, if it exists, and set R to be the reviewer
    gerrit:commit_label(label('Code-Review', 2), R),
    R = O, !,
    % if there is not a +2 from someone else (above rule),
    % and there is a +2 from the owner, reject with a self-reviewed label
    S2 = [label('Self-Reviewed', reject(O))|S1].
% if the above two rules did not make it to the ! predicate,
% there are not any +2s so let the default rules through unfiltered
reject_self_review(S1, S1).

% =============
% Filter to require one file to be uploaded, if file is INFO.yaml
% =============
ensure_info_file_is_only_file(S1, S2) :-
    % Ask how many files changed
    gerrit:commit_stats(ModifiedFiles, _, _),
    % Check if more than 1 file has changed
    ModifiedFiles > 1,
    % Check if one file name is INFO.yaml
    gerrit:commit_delta('^INFO.yaml$'),
    % If above two statements are true, give the cut (!) predicate.
    !,
    % set O to be the change owner
    gerrit:change_owner(O),
    % If you reached here, then reject with Label.
    S2 = [label('INFO-File-Not-Alone', reject(O))|S1].
ensure_info_file_is_only_file(S1, S1).
% =============
% Filter to require approved jenkins user to give +1 if INFO file
% =============
% Define who is the special Jenkins user
jenkins_user(user(459)).  % onap-jobbuilder@jenkins.onap.org
jenkins_user(user(3)).    % ecomp-jobbuilder@jenkins.openecomp.org
jenkins_user(user(4937)). % releng+lf-jobbuilder@linuxfoundation.org

is_it_only_INFO_file() :-
    % Ask how many files changed
    gerrit:commit_stats(ModifiedFiles, _, _),
    % Check that only 1 file is changed
    ModifiedFiles = 1,
    % Check if changed file name is INFO.yaml
    gerrit:commit_delta('^INFO.yaml$').

if_info_file_require_jenkins_plus_1(S1, S2) :-
    % Check if only INFO file is changed.
    is_it_only_INFO_file(),
    % Check that Verified is set to +1
    gerrit:commit_label(label('Verified', 1), U),
    % Confirm correct user gave the +1
    jenkins_user(U),
    !,
    % set O to be the change owner
    gerrit:change_owner(O),
    % Jenkins has verified file.
    S2 = [label('Verified-By-Jenkins', ok(O))|S1].
if_info_file_require_jenkins_plus_1(S1, S2) :-
    % Check if only INFO file is changed.
    is_it_only_INFO_file(),
    % Check if Verified failed (-1)
    gerrit:commit_label(label('Verified', -1), U),
    % Confirm correct user gave the -1
    jenkins_user(U),
    !,
    % set O to be the change owner
    gerrit:change_owner(O),
    % Jenkins failed verifying file.
    S2 = [label('Verified-By-Jenkins', reject(O))|S1].
if_info_file_require_jenkins_plus_1(S1, S2) :-
    % Check if only INFO file is changed.
    is_it_only_INFO_file(),
    !,
    % set O to be the change owner
    gerrit:change_owner(O),
    S2 = [label('Verified-By-Jenkins', need(O))|S1].
if_info_file_require_jenkins_plus_1(S1, S1).
Push it to Gerrit
git add rules.pl
git commit -m "LF initial prolog filter"
git push origin HEAD:refs/meta/config
Jenkins¶
Upgrading Jenkins¶
Regular Jenkins maintenance is necessary to ensure security patches are up to date.
Follow these steps to update Jenkins:
Notify community that maintenance is about to begin
Put Jenkins into Shutdown mode (https://jenkins.example.org/quietDown)
yum update -y --exclude=jenkins
(Do this step while waiting for jobs to clear in shutdown mode.)
yum update -y
Update Jenkins plugins via Manage Jenkins > Manage Plugins
Ensure that you click “Download now and install after restart” but DO NOT check the “Restart Jenkins when installation is complete and no jobs are running” button.
Restart the server itself
systemctl reboot
Remove Shutdown mode from Jenkins (https://jenkins.example.org/cancelQuietDown)
GitHub Configuration¶
Jenkins requires admin level configuration to work with GitHub.
Create a GitHub account for Jenkins to use
The user needs to have Full Admin access to the GitHub Organization that Jenkins will manage, so that Jenkins can automatically manage the hooks.
Navigate to
https://jenkins.example.org/configure
Under
GitHub Servers
click Advanced > Manage GitHub actions > Convert login and password to token
Choose
From login and password
and enter the github-jenkins account details
Click Create token credentials
Under
GitHub Servers
click Add GitHub Server and configure the following:
Name: <Leave blank>
API URL: https://api.github.com
Credentials: <Auto-generated token>
Manage hooks: true
GitHub client cache size (MB): 20
Click
Re-register hooks for all jobs
Security Configuration¶
Security recommendations for Jenkins.
Install the OWASP Markup Formatter Plugin
Navigate to https://jenkins.example.org/configureSecurity/
Configure the following:
Enable
CSRF Protection
with
Default Crumb Issuer
Enable
Agent -> Master Access Control
Disable
JNLP Protocol 1 - 3
Enable
JNLP Protocol 4
Set
Markup Formatter
to
Safe HTML
JIRA¶
General Setup¶
Navigate to https://jira.example.org/secure/admin/ViewApplicationProperties.jspa
Click
Edit Settings
Base URL: <set as appropriate>
Introduction:
<p>You will need a Linux Foundation ID to login here.</p>
<p style="color:red">
  You can create a Linux Foundation ID username (or request a new password
  if forgotten) at
  <a href="https://identity.linuxfoundation.org">
    https://identity.linuxfoundation.org
  </a>.
</p>
Navigate to https://jira.example.org/secure/admin/EditDefaultDashboard!default.jspa
Move the “Introduction” widget to under the “Your Company JIRA” widget
Navigate to https://jira.example.org/secure/admin/EditAnnouncementBanner!default.jspa
Configure the Announcement as follows:
<style type="text/css">
  div#publicmodeoffmsg { display: none; }
  a#forgotpassword { display: none; }
  a#login-form-cancel { display: none; }
  /* a.aui-nav-link.login-link { display: none; } */
</style>
Set Visibility Level to
Public
Navigate to https://jira.example.org/secure/admin/OutgoingMailServers.jspa
Configure outgoing email as follows:
Name: localhost
From address: jira@example.org
Email prefix: [JIRA]
Protocol: SMTP
Host Name: localhost
Click Update
LDAP¶
Navigate to https://jira.example.org/plugins/servlet/embedded-crowd/directories/list
Click
Add Directory
Choose
Internal with LDAP Authentication
and click
Next
Configure LDAP
Name: Delegated LDAP Authentication
Directory Type: OpenLDAP
Hostname: ldap.example.org
Port: 636
Use SSL: True
Copy User on Login: True
Default Group Membership: jira-software-users
Synchronize Group Memberships: True
Base DN: dc=example,dc=org
User Name Attribute: uid
Additional User DN: ou=Users
User Display Name Attribute: cn

# Group Schema Settings
Additional Group DN: ou=Groups
Group Object Class: groupOfNames
Group Object Filter: (&(objectclass=groupOfNames)(|(cn=PROJECT-*)(cn=lf-releng)(cn=lf-sysops)(cn=lf-helpdesk)))

# Membership Schema Settings
Group Members Attribute: member
User Membership Attribute: memberOf
Note
In Group Object Filter, replace PROJECT in cn=PROJECT-* with the group prefix for the project that should have group permissions on this JIRA instance, e.g. odl, onap, opnfv, etc.
Click
Save and Test
Ensure the
Internal
directory has higher precedence than
OpenLDAP
At this point we should be able to log in using our personal account to continue managing the JIRA Server. This is necessary for the LDAP admin groups to appear.
Admin Permissions¶
Navigate to https://jira.example.org/secure/admin/GlobalPermissions!default.jspa
Add
lf-collab-admins
and
lf-helpdesk
to the following groups:
JIRA System Administrators
JIRA Administrators
Browse Users
Create Shared Objects
Manage Group Filter Subscriptions
Bulk Change
Post configuration¶
Inform LF Helpdesk about new Jira instance
Create a new Helpdesk ticket with the following text:
Greetings Helpdesk,

This is a notification that a new JIRA is online at
https://jira.example.org and ready for you to take on license
management and renewals. Please install the initial trial license.

Thanks,
Releng
Nexus¶
Nexus is an artifact repository typically used in Java / Maven projects. It stores project artifacts, Javadocs, and Jenkins job logs.
File system layout¶
We recommend configuring the Nexus server to store all artifacts and logs on separate file systems, preferably a file system that allows a large number of inodes, such as XFS, for the logs storage.
- /srv:
Contains Nexus install along with storage repositories.
- /srv/sonatype-work/nexus/storage/logs:
Contains Jenkins server logs. Use a file system with a lot of inodes.
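As a rough, illustrative check of inode pressure (paths and the helper name are hypothetical), every file and directory consumes an inode, so counting entries under the log tree approximates usage; demonstrated here on a scratch directory:

```shell
# Count filesystem entries (files + directories) under a path; each
# entry consumes one inode.
count_inodes() {
  find "$1" | wc -l
}

demo=$(mktemp -d)
mkdir -p "$demo/production/job-1"
touch "$demo/production/job-1/console.log"
count_inodes "$demo"
```

In production you would compare this against `df -i` output for the filesystem holding the log storage.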
Note
OpenDaylight previously ran out of inodes due to logs. Issue documented in Jira https://jira.linuxfoundation.org/browse/RELENG-773
Scheduled Tasks¶
We recommend configuring Nexus to clear out old SNAPSHOT artifacts as well as old Staging repositories. Some projects may have specific policies set by the TSC on how long artifacts need to stick around, but the settings below make a good starting point.
Purge old SNAPSHOTs¶
For purging SNAPSHOTs we should set up 2 jobs.
The first job purges week-old artifacts but keeps 1 SNAPSHOT around in case the project has a broken merge job.
The second job purges all 3-week-old artifacts. This ensures that if a project removes a module from their build, downstream projects will notice by way of their builds failing to find the artifact.
LF: Purge week old SNAPSHOTs
Name: LF Purge week old SNAPSHOTs
Task Type: Remove Snapshots From Repository
Repository/Group: Snapshots (Repo)
Minimum snapshot count: 1
Snapshot retention (days): 7
Remove if released: True
Grace period after release (days): 21
Delete immediately: True
Recurrence: Daily
LF: Purge 3 week old SNAPSHOTs
Name: LF Purge 3 week old SNAPSHOTs
Task Type: Remove Snapshots From Repository
Repository/Group: Snapshots (Repo)
Minimum snapshot count: 0
Snapshot retention (days): 21
Remove if released: True
Grace period after release (days): 21
Delete immediately: True
Recurrence: Daily
Purge old staging¶
Name: LF Purge old staging
Task Type: Drop Inactive Staging Repositories
Inactivity duration (days): 30
Scan open repositories: True
Scan closed repositories: True
Scan promoted repositories: True
Scan released repositories: True
Recurrence: Daily
Purge trash¶
Name: LF Purge trash
Task Type: Empty Trash
Repository/Group: All Repositories
Recurrence: Daily
Rebuild metadata¶
Name: LF Rebuild metadata
Task Type: Rebuild Maven Metadata Files
Repository/Group: All Repositories
Recurrence: Daily
Use Nexus as a log server¶
One use for a Nexus server is as a log server for Jenkins. This is useful to offload logs from Jenkins and let Nexus handle longer-term storage of the logs.
We suggest following the advice from the File system layout section before configuring the log server directory here.
Create log repository¶
Navigate to https://nexus.example.org/#view-repositories
Click
Add > Hosted Repository
Configure the repository as follows:
Repository ID: logs
Repository Name: logs
Repository Type: hosted
Provider: Site
Format: site
Repository Policy: Mixed
Deployment Policy: Allow Redeploy
Allow File Browsing: True
Include in Search: False
Publish URL: True
Navigate to https://nexus.example.org/#security-privileges
Click
Add > Repository Target Privilege
Configure the privilege as follows:
Name: logs
Description: logs
Repository: All Repositories
Repository Target: All (site)
Create log role¶
Navigate to https://nexus.example.org/#security-roles
Click
Add > Nexus Role
Configure the role as follows:
Role Id: All logs repo
Name: All logs repo
Description:
Click
Add
and add the following privileges:
logs - (create)
logs - (delete)
logs - (read)
logs - (update)
logs - (view)
Note
Be careful not to include the “Logs - (read)” privilege (the one with the capitalized first letter); that one grants access to Nexus’ own logs.
Click
Save
Create log user¶
Navigate to https://nexus.example.org/#security-users
Click
Add > Nexus User
Configure the user as follows:
User ID: logs
First Name: logs
Last Name: user
Email: jenkins@example.org
Status: Active
Click
Add
and add the following roles:
All logs repo
LF Deployment Role
Configure log credential in Jenkins¶
Navigate to https://jenkins.example.org/credentials/store/system/domain/_/newCredentials
Configure the credential as follows:
Kind: Username with password
Scope: Global
Username: logs
Password: <password>
ID: jenkins-log-archives
Description: jenkins-log-archives
Navigate to https://jenkins.example.org/configfiles/editConfig?id=jenkins-log-archives-settings
Click
Add
to add a new Server Credential
Configure the credential as follows:
ServerId: logs
Credentials: jenkins-log-archives
Click
Submit
Configure global-var in ci-management¶
Edit the file
jenkins-config/global-vars-production.sh
Add
LOGS_SERVER=https://logs.example.org
as a new global-var
Repeat for all
global-vars
files as necessary
Refer to Jenkins CFG Global Variables for details on global-vars configuration.
Setup cron to cleanup old logs¶
We highly recommend setting up cron jobs to cleanup old logs periodically.
Job to clean up files 6 months old on production path every day
Job to clean up empty directories in the logs path every day
Job to clean up all sandbox logs every week
The following example shows the puppet-cron configuration used by LF to manage logs following the Jenkins Sandbox rules defined in the Jenkins Sandbox Overview.
cron::daily:
purge-logs-production:
hour: 8
user: 'nexus'
# yamllint disable-line rule:line-length
command: '/usr/bin/yes | /usr/bin/find /srv/sonatype-work/nexus/storage/logs/production -mtime +183 -delete 2>/dev/null'
purge-empty-dirs:
hour: 9
user: 'nexus'
# yamllint disable-line rule:line-length
command: '/usr/bin/yes | /usr/bin/find /srv/sonatype-work/nexus/storage/logs -type d -empty -delete 2>/dev/null'
cron::weekly:
purge-logs-sandbox:
hour: 8
weekday: 6
user: 'nexus'
# yamllint disable-line rule:line-length
command: '/bin/rm -rf /srv/sonatype-work/nexus/storage/logs/sandbox/*'
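The purge-empty-dirs logic above can be exercised safely on a scratch directory; this sketch adds `-mindepth 1` (not present in the cron job) only so the scratch root itself survives:

```shell
# Directories that still hold logs survive; empty ones get deleted.
tmp=$(mktemp -d)
mkdir -p "$tmp/keep" "$tmp/drop"
touch "$tmp/keep/build.log"

find "$tmp" -mindepth 1 -type d -empty -delete
ls "$tmp"   # only 'keep' remains
```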
Create Nexus2 repos with lftools¶
LF Tools provides an interface to Nexus 2 for creating resources or reordering staging repositories. More information on how to use the commands: LF Tools Nexus commands
The lftools nexus create repo
command needs two files as parameters:
-c, –config Configuration file containing the repos and their tree structure.
# Using ONAP as example
base_groupId: 'org.onap'
email_domain: 'onap.org'
global_privs:
  - 'LF Deployment Role'
repositories:
  appc:
    password: 'NjPAd1ZZ5RbDalZy4ROHaApb4Bk3buTU'
    extra_privs:
      - 'Staging: Deployer (autorelease)'
    repositories:
      cdt:
        password: 'NjPAd1ZZ5RbDalZy4ROHaApb4Bk3buTU'
        extra_privs:
          - 'Staging: Deployer (autorelease)'
  aaf:
    password: 'NjPAd1ZZ5RbDalZy4ROHaApb4Bk3buTU'
    extra_privs:
      - 'Staging: Deployer (autorelease)'
    repositories:
      sms:
        password: 'NjPAd1ZZ5RbDalZy4ROHaApb4Bk3buTU'
        extra_privs:
          - 'Staging: Deployer (autorelease)'
appc is the parent for cdt and aaf is the parent of sms. The projects created will be: appc, appc-cdt, aaf and aaf-sms.
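The naming scheme can be sketched as a hypothetical helper: nested repository keys join with '-', so cdt under appc becomes appc-cdt.

```shell
# Emit the repo names produced by a parent with zero or more children:
# the parent name itself, then '<parent>-<child>' for each child.
repo_names() {
  parent="$1"; shift
  echo "$parent"
  for child in "$@"; do
    echo "$parent-$child"
  done
}

repo_names appc cdt
repo_names aaf sms
```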
Note
‘Staging: Deployer (autorelease)’ in the above example is in the
extra_privs
section as an example. If it applies to all repos, it can be
part of the global_privs
section.
-s, –settings Configuration file with all the admin settings
# Using ONAP as example
nexus: 'https://nexus.onap.org'
user: 'admin'
password: 'admin123'
After running lftools nexus create repo -c <the_repo_config> -s <your_settings_config>, the script will create all repos, users, roles, and privileges. Also, the Repository Targets get set with patterns that restrict which locations projects can post artifacts to. These patterns should match the groupId in the project’s pom.xml.
Troubleshooting¶
SSL certificate does not match due to SNI¶
When using the nexus-staging-maven-plugin, the build may fail with the message below. This is due to Nexus 2 not supporting SNI, which prevents the staging plugin from uploading artifacts to Nexus.
The workaround is to use another method to upload to Nexus, such as cURL, which can ignore the failure.
Error
Refer to https://jira.linuxfoundation.org/browse/RELENG-21 for further details.
OpenStack Management¶
We use OpenStack as our primary underlying cloud for CI. Hosting most LF projects with the same vendor enables us to adopt common management practices across them.
GitHub¶
Setup DCO¶
To set up DCO enforcement we configure probot for our GitHub Organization.
Navigate to https://github.com/apps/dco
Click
Configure
at the top right of the page
Choose the Organization to deploy the DCO to
Set
All repositories
and
Save
At this point DCO configuration is complete for the organization. Next we need to configure each repository to require the DCO.
Navigate to the Settings
page and set the DCO for each repository
following these steps:
Click
Branches
Configure
Branch protection rules
for each branch which needs DCO enforcement
Set the following configurations:
Protect this branch
Require pull request reviews before merging
Dismiss stale pull request approvals when new commits are pushed
Require review from Code Owners
Require status checks to pass before merging
* DCO
* (any verify jobs)
Include administrators
Note
Status checks will not appear until a job using one of them has run at least once.
Click
Save