User Guide


  • We aim for complete C++11/14 compliance; please use this to your advantage
  • Please use the standard library and dependency libraries whenever possible

Style


  1. Read Google’s C++ Style Guide (particularly for non-formatting style reference)
  2. For files containing only new work, run clang-format with -style=file (which uses our provided .clang-format)
    $ cd kovri/ && clang-format -i -style=file src/path/to/my/file
  3. For files with mixed (existing + new) work, run clang-format selectively over only lines directly related to the new work.
    • See vim and emacs documentation for examples of configuring keybindings for clang-format plugins.
  4. Run cpplint (which uses our provided CPPLINT.cfg) to catch any issues that were missed by clang-format
    $ cd kovri/ && cpplint src/path/to/my/file && [edit file manually to apply fixes]


Amendments to Google’s proposed C++ style

  • Avoid prepended mixed-case k and MACRO_TYPE for all constants
  • Use Doxygen three-slash /// C++ comments when documenting for Doxygen
  • Try to document all your work for Doxygen as you progress
  • If anonymity is a concern, try to blend in with a present contributor’s style
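As an illustration of the Doxygen convention above, here is a minimal sketch; the helper function and its parameters are hypothetical, shown only to demonstrate the three-slash comment style:

```cpp
#include <cstddef>

/// @brief Checks whether a buffer can hold a record of the given size
///   (hypothetical helper, shown only to illustrate /// Doxygen comments)
/// @param buffer_size Size of the destination buffer in bytes
/// @param record_size Number of bytes the record requires
/// @return true if the record fits in the buffer
bool RecordFits(std::size_t buffer_size, std::size_t record_size) {
  return record_size <= buffer_size;
}
```

Documenting each parameter and return value as you go keeps the generated Doxygen output complete without a separate documentation pass.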

Optional Checks

  1. cppdep for component dependency, physical insulation, and include checks.
  2. cppcheck for static analysis (complementary to Coverity).
  3. lizard for code complexity checks.

Sending your work

To contribute your work, please proceed with the following:

  1. Fork Kovri
  2. Review the style section of this document
  3. Create a topic branch
    • We currently do not have any tags as we are in Alpha. For now, you can base your work off of master
  4. Make changes
    • Commits should be atomic when possible and diffs should be easy to read
    • Please try to not mix formatting fixes with non-formatting commits
  5. Be courteous of the git-log
    • Commit title should prepend class or aspect of project. For example, “HTTPProxy: implement User-Agent scrubber. Fixes #193.” or “Garlic: fix uninitialized padding in ElGamalBlock”
    • Commit messages should be verbose by default, consisting of a short subject line (50 chars max), a blank line, and detailed explanatory text as separate paragraph(s) - unless the title alone is self-explanatory
    • If a particular commit references another issue, please add a reference. For example: See #123 or Fixes #123. This will help us resolve issues when we merge into master
    • If a particular commit is rebased after collaboration within a pull-request, please reference the pull-request number within the commit message. For example: References #123
  6. Sign your commit(s) and, if you are a new contributor, open a new pull-request which adds your PGP key to our repository (see contrib)
  7. Send a pull-request to branch master
    • The body of the pull request should contain an accurate description of what the patch does and should also provide justification/reasoning for the patch (when appropriate). You should include references to any discussions such as other issues or chats on IRC


To contribute a proposal, please review our open issues for existing proposals. If what you propose is not there, then open a new issue.

We ask that you open a proposal for the following reasons:

  1. A proposal opens up communication
  2. A proposal shows that the contributor respects the input of all project collaborators
  3. A proposal allows seamless collaborator input in an open forum
  4. A proposal saves time if a collaborator is working on a similar feature/issue
  5. A proposal prevents catastrophes and mishaps or allows collaborators to prepare for catastrophes and mishaps

Not opening a proposal will not prevent you from contributing; we will still merge your pull request - but a proposal is highly recommended.


  • Do a quick search in the codebase for TODO(unassigned): and/or pick an issue and start patching!
  • If you create a TODO, assign it to yourself or write in TODO(unassigned):

Unit-test writing

Test writing is a well-trodden path whose process should not come as a surprise (as there are many decades of tests to study in the software repertoire). For this project, we will focus on the following when writing unit-tests as they are considered a standard good practice:

  • Err on the side of TDD (refactor when necessary)
  • Focus on modular programming / separation of concerns
  • Test the quality of code coverage, not simply quantity
  • Avoid running the same code paths across multiple tests
  • Avoid copypasting implementation into test code

Also note that the state of the data - not the context of the state - should be held paramount as a driver for unit TDD.

Now, while there are many good, working examples of how to write unit-tests, let’s look at some popular and recommended idioms as presented by our cousin Tor:

If your code is very-low level, and its behavior is easily described in terms of a relation between inputs and outputs, or a set of state transitions, then it’s a natural fit for unit tests. (If not, please consider refactoring it until most of it is a good fit for unit tests!)

If your code adds new externally visible functionality to Tor, it would be great to have a test for that functionality. That’s where integration tests more usually come in.

When writing tests, it’s not enough to just generate coverage on all the lines of the code that you’re testing: It’s important to make sure that the test really tests the code.

Remember, the purpose of a test is to succeed if the code does what it’s supposed to do, and fail otherwise. Try to design your tests so that they check for the code’s intended and documented functionality as much as possible.

Often we want to test that a function works right, but the function to be tested depends on other functions whose behavior is hard to observe, or which require a working Tor network, or something like that.

We talk above about “test coverage” – making sure that your tests visit every line of code, or every branch of code. But visiting the code isn’t enough: we want to verify that it’s correct.

So when writing tests, try to make tests that should pass with any correct implementation of the code, and that should fail if the code doesn’t do what it’s supposed to do.

You can write “black-box” tests or “glass-box” tests. A black-box test is one that you write without looking at the structure of the function. A glass-box one is one you implement while looking at how the function is implemented.

In either case, make sure to consider common cases and edge cases; success cases and failure cases.

Tests shouldn’t require a network connection.

When possible, tests should not be over-fit to the implementation. That is, the test should verify that the documented behavior is implemented, but should not break if other permissible behavior is later implemented.

In addition to not requiring a network connection, unit-tests should not require socket or filesystem access unless the test is specifically a socket/filesystem test (these are unit-tests, not integration tests).
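As a concrete sketch of these guidelines, here is a small, dependency-free unit test covering success, edge, and failure cases for a hypothetical parsing helper. Plain assert is used to keep the example self-contained; the project's actual test framework may differ:

```cpp
#include <cassert>
#include <cstdint>
#include <stdexcept>
#include <vector>

// Hypothetical function under test: parses a one-byte length prefix and
// returns the payload. Documented behavior: throws on a truncated buffer.
std::vector<std::uint8_t> ParseRecord(const std::vector<std::uint8_t>& buf) {
  if (buf.empty() || buf.size() < 1u + buf[0])
    throw std::length_error("truncated record");
  return {buf.begin() + 1, buf.begin() + 1 + buf[0]};
}

// Black-box test: checks only documented behavior, not implementation details,
// so it should pass with any correct implementation of ParseRecord.
void TestParseRecord() {
  // Success case: well-formed record
  assert((ParseRecord({2, 0xAA, 0xBB}) == std::vector<std::uint8_t>{0xAA, 0xBB}));
  // Edge case: a zero-length payload is valid
  assert(ParseRecord({0}).empty());
  // Failure cases: empty and truncated buffers must throw
  bool threw = false;
  try { ParseRecord({}); } catch (const std::length_error&) { threw = true; }
  assert(threw);
  threw = false;
  try { ParseRecord({3, 0x01}); } catch (const std::length_error&) { threw = true; }
  assert(threw);
}
```

Note that the assertions exercise the documented contract (payload returned, exception on truncation) rather than mirroring the implementation, in line with the advice against over-fitting tests.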

Other notes:

  • Though we have a Docker testnet and Boost.Python hooks, our framework for integration and system testing is a WIP. As such, the best we can do at the moment is effective unit testing.
  • For gcov output when building tests, build with make coverage. This target should also build unit-tests.
  • For existing kovri examples, see crypto/ and util/, to name a few
  • For effective unit-test writing outside of Tor, see Crypto++ and Monero unit-tests

Fuzz testing

From the libFuzzer documentation: “LibFuzzer is under active development so you will need the current (or at least a very recent) version of the Clang compiler”

Get a recent version of clang:

$ cd ~/ && mkdir TMP_CLANG && git clone https://chromium.googlesource.com/chromium/src/tools/clang.git TMP_CLANG/clang
$ ./TMP_CLANG/clang/scripts/update.py
$ cd --

Get libFuzzer:

$ git clone https://chromium.googlesource.com/chromium/llvm-project/llvm/lib/Fuzzer contrib/Fuzzer

Build kovri with fuzz testing enabled:

$ PATH="~/third_party/llvm-build/Release+Asserts/bin:$PATH" CC=clang CXX=clang++ make fuzz-tests

Usage (Example for RouterInfo):

$ find ~/.kovri/core/network_database/ -name "router_info*" -exec cp {} RI_CORPUS \;
$ ./build/kovri-util fuzz --target=routerinfo -merge=1 MIN_RI_CORPUS RI_CORPUS
$ ./build/kovri-util fuzz --target=routerinfo -jobs=2 -workers=2 MIN_RI_CORPUS
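For context, libFuzzer drives each target through the standard LLVMFuzzerTestOneInput entry point. A minimal sketch of the shape such a target takes; the parsing routine here is a hypothetical stand-in, not kovri's actual RouterInfo parser:

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical routine under test: must tolerate arbitrary input
// (returning false on malformed data rather than crashing).
static bool TryParseRouterInfo(const std::uint8_t* data, std::size_t size) {
  return size >= 2 && data[0] != 0;  // stand-in for real parsing logic
}

// Standard libFuzzer entry point: the fuzzer calls this once per
// generated input; the body must not crash, hang, or leak.
extern "C" int LLVMFuzzerTestOneInput(const std::uint8_t* data, std::size_t size) {
  TryParseRouterInfo(data, size);
  return 0;  // non-zero return values are reserved; always return 0
}
```

A crash, sanitizer report, or timeout inside the entry point is what the fuzzer reports as a finding; the corpus directories passed above seed it with realistic inputs.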

Quality Assurance

The following is a proposed model for QA workflow. While linear in nature, any phase can be worked on individually if needed - as long as all phases are eventually addressed.

Phase 1: Basic Review

  • Review open issues on our Issue Tracker
  • Review our Vulnerability Response Process
  • All code must adhere to our contributing guidelines
  • Note areas that need improving (mentally or in code)
  • Create TODOs and assign them if possible

Phase 2: Specification Review / Implementation / Code Documentation

  • Complete specification review on a per module basis; e.g., Streaming, I2PControl, etc.
    • Code must be in line with essential parts of the specification that will maintain the same (or better) level of anonymity that Java I2P provides
    • Refactor/implement/patch when/where needed
  • Ensure C++11/14 compliant implementation
    • Review phase 2 if needed
  • Resolve all related TODOs
  • Document code as much as possible with inline comments and Doxygen
    • Code should be understood by novice to experienced coders
    • Code should guide the reader to a better understanding of I2P
      • I2P is very complex, so our code should act as a sovereign replacement for the spec documentation and not simply as a supplement (this can be a tedious objective, but it is very rewarding in terms of maintenance and software lifespan)

Phase 3: Crypto Review / Security auditing

  • Ensure that crypto is up-to-date and properly implemented
  • Establish every vector for known exploitation
    • Keep these vectors in mind when writing tests
  • Break Kovri every which-way possible
    • Fix what you break
  • Always use trustworthy, well-written libraries when possible
    • Avoid homebrewed, ad-hoc, “I’m sure I know better than the community” code
  • Seek a 2nd (or more) opinion(s) from colleagues before proceeding to next phase

Phase 4: Bug squashing / Tests / Profiling

  • Resolve priority bugs/issues
  • Write unit-tests for every module
    • Run tests. Run them again
    • Full review of test results. Patch if needed. Refactor as necessary
  • Ensure that automation is running on a regular basis
    • valgrind, doxygen, clang-format
    • Patch if needed, refactor as necessary

Phase 5: Confer

  • Confer with colleagues and the community
    • Conferring should be done publicly via issues, meetings, and/or IRC
  • Accept all feedback and, in response, produce tangible results
  • If satisfied, proceed with next phase, else repeat this phase (or start from a previous phase)

Phase 6: Repeat the cycle from the beginning