ReviewWorkflows - about process and tooling on top of VCS.

WIP, notes based on discussion at the 3.8 sprint.

What is it / possible goals

  • get (early) feedback to changes
  • knowledge sharing - inform others about changes
  • catch bugs and issues before landing changes in mainline
  • let area owners maintain ownership while their areas are changed by others
  • collect approvals (human and automatic) and enforce/support change approval workflow
  • coordinate and automate landing of changes
  • clean history while landing - make/ensure it is bisectable

Different aspects / phases of the review workflow and what they can contain

We identify some main areas where there can be different solutions:

Submit

how the changes are submitted:

  • fork & push & Pull Request on live diff
  • shared repo push to branches & Merge Request on live diff
  • push to shared fork
  • upload patch(es) & Integration Request (thus some history cleaning)
  • clean snapshots with most stable first, or all/nothing?

Show

how the changes are shown to reviewers:

  • diff - unified or side by side; plain text, diff markup, syntax markup, or hyperlinked / type information
  • show individual changesets
  • aggregated diff since ancestor
  • merge diff (warning about conflicts)
  • show meta diff between iterations
  • static or moving target
  • web / mail / local native client
    • standard for review system client plugins?
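The diff presentation options above can be sketched with Python's difflib, which several review tools build on. This is a minimal illustration; the file names and content are made up.

```python
import difflib

# Hypothetical before/after content of one file under review.
old = ["def greet(name):\n", "    print 'hello', name\n"]
new = ["def greet(name):\n", "    print('hello', name)\n"]

# Unified diff - the classic inline presentation.
unified = "".join(difflib.unified_diff(old, new,
                                       fromfile="a/greet.py", tofile="b/greet.py"))
print(unified)

# The same opcodes can drive a side-by-side rendering:
# 'equal' rows show both sides, 'replace'/'delete'/'insert' highlight changes.
matcher = difflib.SequenceMatcher(None, old, new)
for tag, i1, i2, j1, j2 in matcher.get_opcodes():
    print(tag, old[i1:i2], "|", new[j1:j2])
```

difflib also ships HtmlDiff for a ready-made side-by-side HTML table, so the same diff data can feed both presentation styles.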

Comment / communicate

how reviewers can communicate:

  • forum style: web click'n'comment, mail inline replies, local native client comments
  • one-by-one or batched comments
  • chat style
  • constructive contribution - show, don't tell: make changes instead of asking for them (using the existing process of modifying history or adding changesets, thus iterating)

Approve

how reviewers can control:

  • some approval workflow can be specified and be more or less mandatory/enforced
  • possibly formalized ownership of touched files, suggested or added/informed automatically
  • different kind of reviews and approvals - architectural, ownership, catch bugs, full or partial, FYI/ACK, stylistic review, product management, release management, CI
  • may or may not be invalidated by iteration
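The "formalized ownership of touched files, suggested automatically" idea above could look like this sketch: map path patterns to owners and compute which approvals a change needs. The ownership map, names, and function are invented for illustration, not an existing API.

```python
import fnmatch

# Hypothetical ownership map: glob pattern -> owners whose approval is wanted.
OWNERS = {
    "docs/*": {"alice"},
    "hgext/*": {"bob", "carol"},
    "*": {"maintainer"},  # fallback: someone always owns the change
}

def required_approvers(touched_files):
    """Union of the owners of every pattern matching a touched file."""
    approvers = set()
    for path in touched_files:
        for pattern, owners in OWNERS.items():
            if fnmatch.fnmatch(path, pattern):
                approvers |= owners
    return approvers

print(sorted(required_approvers(["docs/intro.txt", "hgext/rebase.py"])))
# → ['alice', 'bob', 'carol', 'maintainer']
```

A review tool could treat the result as suggested reviewers (informational) or as required sign-offs (enforced), matching the mandatory/optional spectrum above.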

Land

actual landing of the changes:

  • automatic / "merge button" manual / external manual
  • collapse / rebase/linearize / merge
  • partially landing a series while continuing developing

Iterate

how changes might change during the process:

  • extra commits with changes
  • history modification
  • how to deal with how all of that can influence comments - where and whether they still apply?
    • Heuristics are tough to make accurate
      • Fuzzy match diff context around comment?
    • some sort of annotation?
    • tracking built into the VCS?
  • might invalidate reviews / approvals
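The fuzzy-match heuristic above can be sketched like this: record a few context lines around each comment when it is made, then after history modification search the rewritten file for the best-matching window. The function name, threshold, and file content are made-up illustrations, not an existing API.

```python
import difflib

def relocate_comment(context, new_lines, threshold=0.6):
    """Return the index in new_lines where the recorded context matches best,
    or None when no window is similar enough (the comment may no longer apply)."""
    width = len(context)
    best_score, best_start = 0.0, None
    for start in range(len(new_lines) - width + 1):
        window = new_lines[start:start + width]
        score = difflib.SequenceMatcher(None, "".join(context),
                                        "".join(window)).ratio()
        if score > best_score:
            best_score, best_start = score, start
    return best_start if best_score >= threshold else None

# Context recorded around a comment in iteration 1:
context = ["    if user is None:\n", "        raise ValueError\n", "    return user\n"]
# The file after history modification - the code moved and one line changed:
new_file = [
    "def lookup(name):\n",
    "    user = db.get(name)\n",
    "    if user is None:\n",
    "        raise LookupError(name)\n",
    "    return user\n",
]
print(relocate_comment(context, new_file))  # → 2
```

The threshold trades false relocations against dropped comments; tracking built into the VCS (the last bullet) could avoid the heuristic entirely by following lines through the rewrite.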

Tools and implementations

How different tools and implementations relate to this nomenclature / frame of reference:

Mercurial project

  • Submit: mainly patchbombs, automated with push gateway
  • Show: patch emails to mailing list, also snarfed by patchwork
  • Comment: inline replies on mailing list
  • Approve: trusted people collect patches from mailing list, read informal responses, reject bad parts, might tweak, approve good parts by pushing to pre-landing repo
  • Land: pre-landed changesets are (currently) re-reviewed, rebased, tweaked and pulled upstream by mpm
  • Iterate: rejected parts must be "history modified" and re-sent to the mailing list tagged v(n+1)

Kallithea

  • Submit: repo hosting, fork or shared, PR on live diff
  • Show: web with unified diff (partly side-by-side), mail listing commit descriptions, notifications on iterations
  • Comment: web inline comments, notifications
  • Approve: reviewers are assigned manually - no automatic landing and thus no enforcement
  • Land: manual offline, shows hint about what to pull and merge
  • Iterate: static view until iterating (closes old PR and creates new one with additional changes - no good support for modifications)

Unity Technologies

Tweaked implementation of Kallithea

  • Submit: shared repo with named branches
  • Approve: automatic recommendation/review of assigned reviewers based on touched files, plus release management, plus "auto land request"; CI status shows up too
  • Land: "trunk guardians" are gatekeepers and make the final meta review: prepare a batch merging all possible PRs to a verification branch, run all tests, bisect failures, merge good parts to mainline, and bounce PRs causing conflicts or failures
  • Iterate: append-only history - not clean, only bisectable on mainline; a new iteration pretty much invalidates previous approvals ... but negotiable with guardians

ReviewWorkflows (last edited 2016-03-20 19:29:21 by MadsKiilerich)