Making code review happen in 1.18

Making code review happen in 1.18

Mark A. Hershberger

The problem with our current VCS is the sort of work-flow that has
developed around it.

But we can solve the work-flow problem without introducing an entirely
new VCS and disrupting everything for a month or so while people adjust
to the new system.

The solution I'm proposing is that we branch 1.18 immediately after the
release of the 1.17 tarball.

Revisions marked “OK” (or, perhaps, tagged “118”) on the trunk could be
merged to the 1.18 branch.  Or, to make merging into 1.18 less of a
chore for a single person, we could enable those doing code review to
merge code they've reviewed into the 1.18 branch.  In this way, we
achieve Roan's (and my) goal of continuous integration.
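
For instance, pulling one reviewed revision into the branch would be a
couple of commands from a working copy of 1.18 (the revision number and
repository paths here are illustrative):

    svn update
    svn merge -c 81234 ^/trunk/phase3 .   # cherry-pick one reviewed trunk revision
    svn commit -m "MFT r81234: summary of the reviewed change"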

This also puts the onus on the individual developers to make sure that
their code gets reviewed and that problems found get fixed.

As a bonus, we could set a date *now* to make the 1.18 release and then
just release whatever is in the 1.18 branch on that day.

I think we need a few goals thrown in to make this really good — I
propose “1.18 will include a web UI for configuring MediaWiki” and “1.18
will cut the number of global variables in half” to pick two of my
personal favorites — but I think the work-flow of how MediaWiki is
prepared for release needs to be addressed.

Mark.

Re: Making code review happen in 1.18

Brion Vibber
On Sun, Feb 13, 2011 at 2:29 PM, Mark A. Hershberger <[hidden email]> wrote:

>
> The problem with our current VCS is the sort of work-flow that has
> developed around it.
>
> But we can solve the work-flow problem without introducing an entirely
> new VCS and disrupting everything for a month or so while people adjust
> to the new system.
>
> The solution I'm proposing is that we branch 1.18 immediately after the
> release of the 1.17 tarball.
>
> Revisions marked “OK” (or, perhaps, tagged “118”) on the trunk could be
> merged to the 1.18 branch.  Or, to make merging into 1.18 less of a
> chore for a single person, we could enable those doing code review to
> merge code they've reviewed into the 1.18 branch.  In this way, we
> achieve Roan's (and my) goal of continuous integration.
>

If by this you suggest that 1.18 will be Wikimedia's actual live deployment
branch, and that it should always be within a day or two's commits from
trunk, then I can get down with that.


-- brion

Re: Making code review happen in 1.18

Bryan Tong Minh
In reply to this post by Mark A. Hershberger
Hi Mark,


It is good to see people thinking about 1.18 already!

On Sun, Feb 13, 2011 at 11:29 PM, Mark A. Hershberger
<[hidden email]> wrote:
>
> The problem with our current VCS is the sort of work-flow that has
> developed around it.
>
> But we can solve the work-flow problem without introducing an entirely
> new VCS and disrupting everything for a month or so while people adjust
> to the new system.
>
Can you be a bit more specific? What problems are you implying? I can
think of some problems, but I don't know how they are consequences of
our workflow and VCS.

> The solution I'm proposing is that we branch 1.18 immediately after the
> release of the 1.17 tarball.
>
I agree... a bit. We should branch 1.18wmf1 from trunk immediately,
once things have calmed down a bit. However, this 1.18wmf1 does not
necessarily need to be the base for 1.18. We can branch 1.18wmf2 from
trunk again and so on, until we want to release 1.18, at which point we
make a final 1.18wmfN branch and the 1.18 branch.
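
As a sketch (repository paths illustrative), each such deployment branch
would just be a fresh server-side copy of trunk:

    svn copy ^/trunk/phase3 ^/branches/wmf/1.18wmf1 \
        -m "Branch 1.18wmf1 from trunk for deployment"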

> I think we need a few goals thrown in to make this really good — I
> propose “1.18 will include a web UI for configuring MediaWiki” and “1.18
> will cut the number of global variables in half” to pick two of my
> personal favorites — but I think the work-flow of how MediaWiki is
> prepared for release needs to be addressed.
>
Well, if you want 1.18 released before 2012... ;)


Bryan

Re: Making code review happen in 1.18

K. Peachey
> Well, if you want 1.18 released before 2012... ;)
> Bryan

Well, of course. Where would we be if we let the world end without a
new(ish) MediaWiki release?
-Peachey

Re: Making code review happen in 1.18

Mark A. Hershberger
In reply to this post by Bryan Tong Minh
Bryan Tong Minh <[hidden email]> writes:

>> The problem with our current VCS is the sort of work-flow that has
>> developed around it.
>
> Can you be a bit more specific? What problems are you implying? I can
> think of some problems, but I don't know how they are consequences of
> our workflow and VCS.

The introduction and growth of various DVCSes (especially git) over the
past few years has shown how a centralized VCS (like svn) ends up
affecting and even shaping the workflow.

For example, we want to release only reviewed code, but we also want to
make development access (and commit bits) available to as many people as
possible.

As a result, we have code review, but it is peripheral to the process of
getting code committed to trunk.  So, when the time came to put 1.17
together, we had this huge mound of un-reviewed code that had to be
examined.

This workflow is different from a DVCS.  Take Linux, for example.  Linus
pulls code from several lieutenants.  Anyone can set up a branch of the
Linux source code and commit to it, but to get Linus to ship their code,
they have to get a lieutenant to review it and give it to Linus.

The workflow is different in that code review is an integral, not
peripheral, process.  As a bonus, development access is available to
anyone willing to run “git clone”.
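
Roughly, the whole contributor workflow looks like this (URLs and branch
names are illustrative):

    git clone git://example.org/linux.git    # full development access
    git checkout -b my-fix                   # hack on a topic branch
    git commit -a -m "Fix the frobnicator"
    # either mail the patches to a maintainer for review:
    git format-patch origin/master
    # or publish the branch and ask the maintainer to pull it:
    git request-pull origin/master git://example.org/my-linux.git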

>> I think we need a few goals thrown in to make this really good — I
>> propose “1.18 will include a web UI for configuring MediaWiki” and “1.18
>> will cut the number of global variables in half” to pick two of my
>> personal favorites — but I think the work-flow of how MediaWiki is
>> prepared for release needs to be addressed.
>>
> Well, if you want 1.18 released before 2012... ;)

I was getting a little dreamy there.  One goal?  One SMALL goal?  A
CDB-based configuration database?  That way, work on a web UI could
happen outside of the trunk, but would still be usable and ready to
integrate after 1.19 was branched.

But, yes, I want to get 1.18 out by July.  If that means I have to be
satisfied with more prosaic goals like “getting code review working
better” and “keeping 1.18 within a day or two's commits from trunk” (to
paraphrase Brion), then I'll be happy.

Mark.

Re: Making code review happen in 1.18

Mark A. Hershberger
In reply to this post by Mark A. Hershberger
[hidden email] (Mark A. Hershberger) writes:

> The solution I'm proposing is that we branch 1.18 immediately after the
> release of the 1.17 tarball.

I want to give credit where it is due.  Although I haven't seen him
propose it here, this is, in fact, Robla's idea.  He and I were
discussing what needed to happen for 1.18 and it was his idea to branch
1.18 immediately after the release of the 1.17 tarball.

Mark.

Re: Making code review happen in 1.18

Diederik van Liere
+1 to migrate to a DVCS

--
Check out my about.me profile: http://about.me/diederik

Re: Making code review happen in 1.18

Jeroen De Dauw-2
> +1 to migrate to a DVCS

Unless I'm mistaken no one has actually suggested doing that.

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--

Re: Making code review happen in 1.18

Aryeh Gregor
In reply to this post by Mark A. Hershberger
On Sun, Feb 13, 2011 at 8:11 PM, Mark A. Hershberger
<[hidden email]> wrote:
> This workflow is different from a DVCS.  Take Linux, for example.  Linus
> pulls code from several lieutenants.  Anyone can set up a branch of the
> Linux source code and commit to it, but to get Linus to ship their code,
> they have to get a lieutenant to review it and give it to Linus.
>
> The workflow is different in that code review is an integral, not
> peripheral, process.  As a bonus development access is available to
> anyone willing to run “git clone”.

On the other hand, in Linux, it can be hard to get patches reviewed
and accepted in a timely fashion, because there's no clear chain of
command unless Linus personally intervenes.  I think Mozilla is an
excellent model to follow, in that they have a well-defined review
process (not "everyone can object to design decisions at the eleventh
hour" like Linux) and make sure that all patches get reviewed in a
timely fashion (at least as far as I've seen).  I've submitted two
patches to Mozilla, and I got immediate feedback and review on both of
them from developers responsible for the relevant areas.

What I think is important is that a) there's a formal process that
ensures all submitted code gets reviewed, and b) this process is
basically the same for everyone (with no group of people with "commit
access" who get to jump the queue).  Without (b), code by new
contributors will too easily slip through the cracks.  Mozilla has
everyone submit code on Bugzilla, which is awkward but works -- even
core developers have to file a bug, submit a patch there, and get
review from another qualified coder, essentially the same as anyone.
(Obviously with exceptions like backing out build breakage and so on.)
Mozilla does have people with commit access, but they're just the
ones tasked with the chore of checking in code once it's been reviewed
-- they aren't allowed to just check stuff in without going through
the review process first.

If we switched to git, perhaps we should take a look at Gerrit for a
review tool.  I've heard good things about it, but never used it
myself.  Whatever we use, I think it should really be "review then
commit", not "commit then review".  Cherry-picking from trunk to a
branch is possibly better than reverting things in trunk (not sure),
but in the medium term it would be much better to get rid of commit
access status entirely -- once we're sure we have a
properly-functioning review process that can keep up with changes.
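
From what I've read, Gerrit enforces "review then commit" by
construction: you never push to the branch itself, only to a magic ref
that opens a review. Something like this (host and project names are
illustrative):

    git clone ssh://gerrit.example.org:29418/mediawiki.git
    git checkout -b bug12345
    # ...commit the fix locally...
    git push origin HEAD:refs/for/master   # uploads the change for review;
                                           # nothing lands until it's approved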


Re: Making code review happen in 1.18

Brion Vibber
In reply to this post by Jeroen De Dauw-2
On Sun, Feb 13, 2011 at 5:49 PM, Jeroen De Dauw <[hidden email]> wrote:

> > +1 to migrate to a DVCS
>
> Unless I'm mistaken no one has actually suggested doing that.
>

That's been talked about a few times here already. :)

There's some notes about potential git migration plans of action at:
http://www.mediawiki.org/wiki/Git_conversion

I know Avar's looked into it to various degrees, there may be some more
notes sitting about. I'm definitely in favor of it getting done in some way,
though there remain basic issues of what to do with extensions -- roll
everything into a giant repo with core and all extensions, or do extensions
separate but make it harder to maintain a full consistent versioned set?

I can definitely say that working with branching and merging in git is
FANTASTIC -- especially for the ability to use source control as your
workspace for in-progress patches, so you get the full benefits of
versioning *and* sharing while still working on something.

-- brion

Re: Making code review happen in 1.18

Ryan Lane-2
In reply to this post by Aryeh Gregor
> On the other hand, in Linux, it can be hard to get patches reviewed
> and accepted in a timely fashion, because there's no clear chain of
> command unless Linus personally intervenes.  I think Mozilla is an
> excellent model to follow, in that they have a well-defined review
> process (not "everyone can object to design decisions at the eleventh
> hour" like Linux) and make sure that all patches get reviewed in a
> timely fashion (at least as far as I've seen).  I've submitted two
> patches to Mozilla, and I got immediate feedback and review on both of
> them from developers responsible for the relevant areas.
>
> What I think is important is that a) there's a formal process that
> ensures all submitted code gets reviewed, and b) this process is
> basically the same for everyone (with no group of people with "commit
> access" who get to jump the queue).  Without (b), code by new
> contributors will too easily slip through the cracks.  Mozilla has
> everyone submit code on Bugzilla, which is awkward but works -- even
> core developers have to file a bug, submit a patch there, and get
> review from another qualified coder, essentially the same as anyone.
> (Obviously with exceptions like backing out build breakage and so on.)
> Mozilla does have people with commit access, but they're just the
> ones tasked with the chore of checking in code once it's been reviewed
> -- they aren't allowed to just check stuff in without going through
> the review process first.
>

I've been working on OpenStack quite a bit lately, and though they use
Bazaar with Launchpad rather than git, the idea behind it is pretty
similar.

Everything that goes in needs either a bug or a blueprint. Everything
is a branch. You don't actually commit, but make a merge request,
where your branch is linked to a bug or a blueprint. Two people need
to review the change before it merges. Every bug in OpenStack is
generally expected to be a single branch. I've found this works very,
very well. They already have a fairly large community after only 6
months of existence, and have accepted a very large number of features
in a short period of time.
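
For the curious, the mechanics look roughly like this (project name and
bug number are illustrative):

    bzr branch lp:nova bug-12345         # every change starts as a branch
    cd bug-12345
    # ...hack on the fix...
    bzr commit -m "Fix frobnication" --fixes lp:12345
    bzr push lp:~youruser/nova/bug-12345
    # then propose the branch for merging on Launchpad; it only lands
    # after two reviewers approve it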

I'm very much a fan of the distributed system over the fully
centralized one, now that I've become accustomed to it.

> If we switched to git, perhaps we should take a look at Gerrit for a
> review tool.  I've heard good things about it, but never used it
> myself.  Whatever we use, I think it should really be "review then
> commit", not "commit then review".  Cherry-picking from trunk to a
> branch is possibly better than reverting things in trunk (not sure),
> but in the medium term it would be much better to get rid of commit
> access status entirely -- once we're sure we have a
> properly-functioning review process that can keep up with changes.
>

For puppet in the test/dev environment and the production environment,
I plan on keeping the configuration in git and managing it via Gerrit.
I have a Gerrit server for testing up on the same server as the
OpenStack (Nova) controller. So far it looks pretty nice.

- Ryan Lane


Re: Migrating to GIT (extensions)

Daniel Friesen-4
In reply to this post by Brion Vibber
On 11-02-13 06:15 PM, Brion Vibber wrote:

> On Sun, Feb 13, 2011 at 5:49 PM, Jeroen De Dauw <[hidden email]> wrote:
>
>>> +1 to migrate to a DVCS
>> Unless I'm mistaken no one has actually suggested doing that.
>>
> That's been talked about a few times here already. :)
>
> There's some notes about potential git migration plans of action at:
> http://www.mediawiki.org/wiki/Git_conversion
>
> I know Avar's looked into it to various degrees, there may be some more
> notes sitting about. I'm definitely in favor of it getting done in some way,
> though there remain basic issues of what to do with extensions -- roll
> everything into a giant repo with core and all extensions, or do extensions
> separate but make it harder to maintain a full consistent versioned set?
>
> I can definitely say that working with branching and merging in git is
> FANTASTIC -- especially for the ability to use source control as your
> workspace for in-progress patches, so you get the full benefits of
> versioning *and* sharing while still working on something.
>
> -- brion
On extensions there's another idea not mentioned on that page.

Right now the distinction there is everything in one git repo vs. use
git submodules.

The dilemma there is this:
- In one repo, everyone would need the whole history of all extensions
to check out one extension. Naturally this is not what we desire, so we
want to split it.
- However, if we split it, then you can no longer check out all the
extensions for mass development, bulk translation, etc. purposes.
- So we consider having a dedicated repo which simply has git submodules
pointing to all the different extensions, so you can easily check them
out.
- Unfortunately, git submodules aren't like svn externals (whether svn's
externals behavior could be considered a good thing all the time is a
side topic): a commit id is part of the reference, so you can't say "I
want the latest commit from this repo", you have to say "I want <x>
commit from this repo". To get the same behavior you have to do some
hacky things, committing a change to the repo hosting the submodule
references every time a remote repo adds a new commit.
-- ((Although on the other hand, this could be interesting for extension
branching... if we make rel1_16 style branches in the git repo they can
easily target a specific commit, and "backporting" things as compatible
for that rev simply means going into that branch and updating what
commit the submodule refers to))

Naturally, neither option is ideal. We can't have all extensions in one
repo, nor are automated commits to a repo every time a remote repo is
updated ideal.

However, a key thing to note is that git submodules aren't anything
really special. Sure, they're integrated into git, but the only really
special feature about them is that you can target a specific commit
id... and heck, we don't even want that feature; that's the whole reason
it's problematic. Git submodules don't optimize cloning the repo, and
they don't store any of the remote data in the repo, so whether you use
a submodule or not, the actual task of getting those extension repos is
still that of cloning each and every one of those remote repos
individually. The only difference submodules make is that instead of
running `git clone <x>` or `git pull <x>` over and over and over, you
just run `git submodule <update,etc...>` and it runs the clone or pull
for every repo sequentially without you having to do it explicitly. The
result is essentially the same: you end up with a bunch of git repos in
subdirectories, only with the extra ability to do a batch update of them
all. (There is also a downside: when using submodules these are attached
to a specific commit id and not a branch, so you can no longer decide
you want to `git pull` only one of the extensions instead of all of them
at once.)
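
To make the mechanics concrete, this is all a submodule buys you (repo
URLs illustrative):

    # in the meta-repo, record a pointer to an extension repo:
    git submodule add git://example.org/extensions/Cite.git extensions/Cite
    git commit -m "Track the Cite extension"

    # a consumer checking everything out:
    git clone git://example.org/mediawiki-extensions.git
    git submodule update --init    # clones each repo at its pinned commit

    # the "hacky" step: bumping a pinned commit after upstream changes
    (cd extensions/Cite && git pull origin master)
    git add extensions/Cite
    git commit -m "Update Cite to latest master"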

Hence I think there is another option. Instead of (ab)using git's
submodules to reference extension repos, we could instead write a quick
script that talks to a simple database or api (heck, it could be a tab
delimited text file committed to the git repo where the script is)
which gives a list of all extension names and the url for each git
repo, and optionally could even list what commit id was last used at
the branch point for a release (in other words, serve the same purpose
as branching extensions does), while perhaps also making it easy to
update what version of extensions is marked as compatible with certain
releases.
As a bonus, we'd retain the ability to git pull individual repos
without conflicting with submodules. We can write some extra quick
scripting to give us info like "these extensions are running off
master, while these are running off commits tagged as compatible with a
specific release", and other features we can't offer just using
submodules. We can also manage extension groups and dependencies like
"AbuseFilter requires AntiSpoof" and set it up so that if you
individually "./mwextension checkout AbuseFilter" it implicitly checks
out AntiSpoof for you too.
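
A minimal sketch of what I mean, assuming a tab-delimited extensions.tsv
of "name, repo url, optional pinned commit" (all names hypothetical):

    # extensions.tsv, one extension per line:
    #   AbuseFilter<TAB>git://example.org/AbuseFilter.git<TAB>a1b2c3d
    #   AntiSpoof<TAB>git://example.org/AntiSpoof.git

    while IFS=$'\t' read -r name url commit; do
        git clone "$url" "extensions/$name"
        if [ -n "$commit" ]; then
            (cd "extensions/$name" && git checkout "$commit")
        fi
    done < extensions.tsv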

Naturally, it would make sense to write it in php: since MediaWiki
already requires php, it's the language most likely to be available to
those checking things out. Although you could argue for users who check
out onto their own machine rather than the machine they run MW on, and
who may not have php installed. In that case you could make a case for
writing it in python, since python is probably one of the most commonly
installed scripting languages, and imho it's comparatively easier to
install than php or ruby, or than compiling something yourself in a
cross-compatible way. Though on the other hand, this almost sounds like
something we might make a maintenance script.

Ohh... if the translatewiki guys are looking for a guinea pig for
streamlining support for extensions based in git, in preparation for a
git migration if we do one, I'd be happy to offer monaco-port up as an
existing extension (well, skin) using git that could be used as a test
for streamlining git support. ;) Having monaco-port get proper i18n
while it's still not at a level where I want to commit it into svn
wouldn't be a bad thing.

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: Making code review happen in 1.18

Daniel Friesen-4
In reply to this post by Brion Vibber
On 11-02-13 06:15 PM, Brion Vibber wrote:

> On Sun, Feb 13, 2011 at 5:49 PM, Jeroen De Dauw <[hidden email]> wrote:
>
>>> +1 to migrate to a DVCS
>> Unless I'm mistaken no one has actually suggested doing that.
>>
> That's been talked about a few times here already. :)
>
> There's some notes about potential git migration plans of action at:
> http://www.mediawiki.org/wiki/Git_conversion
>
> I know Avar's looked into it to various degrees, there may be some more
> notes sitting about. I'm definitely in favor of it getting done in some way,
> though there remain basic issues of what to do with extensions -- roll
> everything into a giant repo with core and all extensions, or do extensions
> separate but make it harder to maintain a full consistent versioned set?
>
> I can definitely say that working with branching and merging in git is
> FANTASTIC -- especially for the ability to use source control as your
> workspace for in-progress patches, so you get the full benefits of
> versioning *and* sharing while still working on something.
>
> -- brion
Don't forget what git does for those of us using remote servers, too.
The difference in ease of development between working on my monaco-port
(in git for now) and making core changes (svn, of course) is fairly
noticeable.

I do my development and testing on remote servers, and multiple ones at
that (monaco-port is actually located in 4 or 5 places: my local repo,
the dragonballencyclopedia prototype and live site, another location
still set up on commonjs (I used to use it for experimenting before I
had other wikis to try it on), and my trunktest for development in a
1.18 env). I usually vary whether I work on a feature from the
dragonballencyclopedia prototype or from trunktest, sometimes doing a
little work on both, since one is a 1.16 env and the other is trunk, and
some features vary in support across that border.
The last key piece of background is that I don't commit/push things to
remote repos from these servers. I don't trust my own servers with any
of my private keys used to commit to remote repos (sure, I have separate
keys, and I could give a server just the ability to push to the repo
without accessing other servers... but I don't trust my own servers
enough to give them the ability to commit something to Wikimedia's svn
under my name). Hence any task of committing/pushing to a remote repo is
done from my physical computer, not the server, despite 99% of the
actual development happening on the servers.

So, since I work on remote machines but commit from my local machine, I
need a way to transfer the changes from the remote machine to the local
machine before submitting them. This is completely trivial with git. I
just make the actual commit on the server itself (since I don't need any
special access for that). Then I pull those changes from the server into
my local copy. And finally I push them to the public repo. It only takes
a few seconds. I even have a trivial shell script for monaco-port that
pulls changes from all my remote repos into my local one, pushes that to
my github repo, and then makes sure each of my servers using monaco-port
is up to date. (It's really just a series of git pull/push commands.)
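
Roughly, it boils down to this (host and remote names illustrative):

    # on the dev server: commit there, no push credentials involved
    ssh devserver 'cd ~/monaco-port && git commit -a -m "New feature"'

    # on my local machine: pull the server's commits, then publish them
    git pull ssh://devserver/~/monaco-port master
    git push github master    # "github" is my public remote
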
But when it comes to svn... I've taken to the habit of running `ssh
<server> "cd /path/to/wiki; svn diff" | patch -p0`. Naturally I have to
svn up both copies before that to avoid any mishaps. If I added a new
file, I have to do a separate explicit scp. And after that I have to go
back to the remote server and svn up it again so that it understands
that those changes are no longer a dirty working copy. If I added a
file, sometimes I have to deal with a slight mess. And if I forget to
svn up, I have to deal with some blech from patch. Etc...

Naturally, git is also beautiful for my disorganized working copy. Right
now trunktest has some uncommitted stuff I'm experimenting with. With
svn I usually end up transferring my changes to my local copy, then
reverting everything not relevant, and in the off case where two
experiments touch the same file, I have to manually edit one of them out
(though I've slightly streamlined that by piping svn diff <file> to a
file, running svn revert <file>, editing the patch to delete the change
I don't want to commit, and then running the patch command, rather than
manually undoing the code). Git changes that completely: since there is
an index, I just explicitly say which parts I want to stage for the
commit. It works beautifully for files where two experiments touch the
same file; I can look through the file and say exactly which diff chunks
I DO want to commit, without having to edit the file. And `git gui`
really does make picking those diff chunks easier (yay for ssh -X too).
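
(For the record, the command-line version of the same hunk-picking, with
an illustrative file name:

    git add -p includes/Skin.php   # interactively pick hunks to stage
    git commit -m "Just the one experiment"

Everything left unstaged simply stays in the working copy for later.)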

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: Migrating to GIT (extensions)

Siebrand Mazeland
In reply to this post by Daniel Friesen-4
On 14-02-11 05:01, Daniel Friesen <[hidden email]> wrote:

>Ohh... if the translatewiki guys are looking for a dummy for
>streamlining support for extensions based in git in preparation for a
>git migration if we do so, I'd be happy to offer monaco-port up as a
>existing extension (well, skin) using git that could be used as a test
>for streamlining git support. ;) having monaco-port get proper i18n
>while it's still not up to a level I believe I want to commit it into
>svn yet wouldn't be a bad thing.

With regards to i18n support, it is not clear to me how translatewiki
staff would deal with 100+1 commits to different repos every day if core
and the extensions were each in individual repos. Can you please explain
how Raymond would work with Windows and git in the proposed structure,
updating L10n for 100 extensions and MediaWiki core? How would
translatewiki.net easily manage MediaWiki updates (diff review/commits)?

I'm not particularly looking forward to having to jump through a huge
series of hoops just to keep checkouts for single extensions small. If
that is the real issue, extension distribution should get another look,
as this might indicate that ExtensionDistributor does not work as
expected. I currently have all of trunk checked out, and for
translatewiki.net we have a selective checkout of i18n files for
extensions and a checkout of core and the installed extensions. The
fragmentation and disorganisation/disharmony that would exist after
creating 450 git repos instead of the one Subversion repo we currently
have is also something I am not looking forward to.

Source code management is now centralised, and correct me if I'm wrong,
but we encourage developers to request commit access to improve the
visibility of their work and grow the community. "Going distributed" in
the proposed way would hamper that, if I'm correct. I think the
relatively lower popularity of extensions that are maintained outside of
svn.wikimedia.org is proof of this. I am not in favour of using git in
the proposed way. I think core and extensions should remain in the same
repo. Checkouts are for developers, and developers should get just all
of it.

Siebrand




Re: Making code review happen in 1.18

Roan Kattouw-2
In reply to this post by Bryan Tong Minh
2011/2/13 Bryan Tong Minh <[hidden email]>:
> I agree... a bit. We should branch 1.18wmf1 from trunk immediately,
> once things have calmed down a bit. However, this 1.18wmf1 does not
> necessarily need to be the base for 1.18. We can branch 1.18wmf2 from
> trunk again and so on, until we want to release 1.18, at which point we
> make a final 1.18wmfN branch and the 1.18 branch.
>
+1

If we want to move to continuous integration (and I think the consensus
is that we do, considering the mess we've made for ourselves by
deploying 9 months' worth of commits and not knowing which of the
~15,000 new revisions killed the cluster the other day), our first step
should be to get closer to it, i.e. bring deployment closer to trunk.
By the time we deploy 1.17, trunk will already be more than two months
ahead. Of course this is because we needed time to stabilize 1.17, which
in turn was caused by the amount of new code in it. Stabilizing and
deploying 1.18wmf1 should take considerably less time and allow us to
get much closer to a continuous integration model.

Roan Kattouw (Catrope)


Re: Migrating to GIT (extensions)

Diederik van Liere
In reply to this post by Siebrand Mazeland
If I am not mistaken, Mercurial has better support for highly
modularized open-source software projects. You can use a Mercurial
subrepository (which is very similar to an svn external or a git
submodule). According to their manual:
"Subrepositories is a feature that allows you to treat a collection of
repositories as a group. This will allow you to clone, commit to,
push, and pull projects and their associated libraries as a group."
See: http://mercurial.selenic.com/wiki/Subrepository
http://mercurial.selenic.com/wiki/NestedRepositories
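
The wiring is a one-line mapping file in the parent repository (paths
and URLs illustrative):

    # .hgsub in the parent repo:
    extensions/Cite = http://hg.example.org/extensions/Cite

    hg clone http://hg.example.org/mediawiki   # subrepos come along
    hg commit -m "..."    # also records the state of each subrepo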

just my 2 cents.
--
Check out my about.me profile: http://about.me/diederik


Re: Making code review happen in 1.18

Jay Ashworth-2
In reply to this post by Jeroen De Dauw-2
----- Original Message -----
> From: "Jeroen De Dauw" <[hidden email]>

> > +1 to migrate to a DVCS
>
> Unless I'm mistaken no one has actually suggested doing that.

0 + 1 = 1, right?  :-)

Cheers,
-- jra


Re: Making code review happen in 1.18

Mark A. Hershberger
In reply to this post by Roan Kattouw-2
Roan Kattouw <[hidden email]> writes:

> 2011/2/13 Bryan Tong Minh <[hidden email]>:
>> I agree... a bit. We should branch 1.18wmf1 from trunk immediately,
>> once things have calmed down a bit. However, this 1.18wmf1 does not
>> necessarily need to be the base for 1.18. We can branch 1.18wmf2 from
>> trunk again and so on, until we want to release 1.18, at which point we
>> make a final 1.18wmfN branch and the 1.18 branch.
[ SNIP ]
> Stabilizing and deploying 1.18wmf1 should take considerably less time
> and allow us to get much closer to a continuous integration model.

It sounds like you guys are balking at this idea.  I'm not familiar with
how the wmfN branches have worked, so some input would help.

If we have a 1.18 branch that is, as Brion has noted (and supported), a
day or two behind trunk at most, is there a reason that we couldn't
branch wmfN from the rolling 1.18 branch?  Or even just tag it when we
wanted to mark a WMF deployment?
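
In svn terms, something like this (paths illustrative):

    # keep one rolling 1.18 branch, and just tag it at each deployment:
    svn copy ^/branches/1.18 ^/tags/1.18wmf1 \
        -m "Tag 1.18 as deployed to the WMF cluster"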

Mark.


Re: Migrating to GIT (extensions)

Mark A. Hershberger
In reply to this post by Siebrand Mazeland
Siebrand Mazeland <[hidden email]> writes:

> With regards to i18n support, it is not clear to me how translatewiki
> staff would deal with 100+1 commits to different repos every day if core
> and the extensions were each in individual repos.

This is one reason I avoided suggesting that we switch to a DVCS for
1.18: the change is too big and dramatic and would impede too many
people's workflows without offering an immediate improvement.

Mark.


Re: Migrating to GIT (extensions)

Rob Lanphier
In reply to this post by Siebrand Mazeland
On Sun, Feb 13, 2011 at 11:18 PM, Siebrand Mazeland
<[hidden email]> wrote:
> With regards to i18n support, it is not clear to me how translatewiki
> staff would deal with 100+1 commits to different repos every day if core
> and the extensions were each in individual repos. Can you please explain
> how Raymond would work with Windows and git in the proposed structure,
> updating L10n for 100 extensions and MediaWiki core? How would
> translatewiki.net easily manage MediaWiki updates (diff review/commits)?

I'm also an advocate for each extension getting its own git repository,
but I'm not sure how this would work, to be honest.  It is definitely
something we'll need to consider.  We appreciate the work that you all
do, so I'd hate to make life harder for you.  If we go this route, we'll
need to make sure we figure out a mitigation strategy here.  Thankfully,
I believe there is one (more below).

> Source code management is now centralised, and correct me if I'm wrong,
> but we encourage developers to request commit access to improve the
> visibility of their work and grow the community. "Going distributed" in
> the proposed way would hamper that, if I'm correct. I think the
> relatively lower popularity of extensions that are maintained outside of
> svn.wikimedia.org is proof of this. I am not in favour of using git in
> the proposed way. I think core and extensions should remain in the same
> repo. Checkouts are for developers, and developers should get just all
> of it.

DVCS systems like Git really don't operate well on repositories as
large as the full MediaWiki repository.  Furthermore, most developers
only work with core + a few extensions, so I don't think it's fair to
force everyone to check out the full mess.

Riffing on Diederik's suggestion, Git also has the concept of submodules:
http://book.git-scm.com/5_submodules.html

I'm betting we'll be able to use submodules to make life better for
translatewiki developers.
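
For instance, a batch L10n update could loop over every submodule in one
pass; a sketch (commit message illustrative):

    git submodule foreach 'git pull origin master'
    # ...export the updated i18n files into each working tree...
    # "|| true" so repos with no changes don't abort the loop:
    git submodule foreach 'git commit -am "Localisation updates" || true'
    git submodule foreach 'git push origin master'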

Rob
(p.s. Diederik: I think Git submodules actually predate Mercurial
submodules, if memory serves me correctly)
