[crossfire] Release schedule: was maps/tags/1.10/
Mark Wedel
mwedel at sonic.net
Thu Mar 8 02:23:45 CST 2007
Kevin R. Bulgrien wrote:
>> In terms of more frequent 'build' releases, I'm not sure how that really
>> differs from a normal release.
>
> Perhaps a technicality, but having a distinction would make it "easier" to
> release something even if there was a big bug unfixed, or it might be top of
> tree so that new development gets tested quicker... and also encourages
> new development to not go too long in an unusable state.
Ok. I wasn't thinking about the head releases here, just the stable
releases/branch.
So we have the case where 1.10.0 was just released.
The next official release (in 2-3 months time) would be called 1.11.0.
So if we do these interim (every 2 week) releases, what do we call them?
My thought, also stated in the release guidelines, is that micro releases
could be done.
So in the case of maps, if you have some bugs that need to be fixed and you
want to make a new release, you could make a 1.10.1 and release it in a few weeks.
I think that is perfectly reasonable, but it should perhaps be more driven by the
bugs it fixes or other changes. It certainly doesn't make sense to make a map
release every 2 weeks if the last thing that changed from the previous release
were a few minor spelling mistakes.
But the flip side is that if serious bugs are discovered, a release right away
probably makes sense. And if a serious bug (say security) is discovered the day
after that micro release, then another micro release a day later probably makes
sense.
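To make the numbering concrete, here is a minimal sketch of the major.minor.micro scheme described above (the bump() helper and its name are mine, not part of any crossfire tooling):

```python
# Sketch of the micro-release numbering discussed above.
# bump() is a hypothetical helper, invented for illustration.

def bump(version, level):
    """Bump one component of a "major.minor.micro" version string.

    level is "major", "minor", or "micro"; components to the right
    of the bumped one reset to zero.
    """
    parts = [int(p) for p in version.split(".")]
    index = {"major": 0, "minor": 1, "micro": 2}[level]
    parts[index] += 1
    for i in range(index + 1, 3):
        parts[i] = 0
    return ".".join(str(p) for p in parts)

# 1.10.0 -> 1.10.1: a map-only bug-fix release a few weeks later
print(bump("1.10.0", "micro"))   # 1.10.1
# 1.10.1 -> 1.11.0: the next official release in 2-3 months
print(bump("1.10.1", "minor"))   # 1.11.0
```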
There is also some advantage to this - for the micro releases, there is no
need for a full release of all the components. If there are map fixes and they
will work fine with the 1.10.0 server, make a micro release of the new map set, etc.
For the stable release, this works pretty well, because big changes shouldn't
be happening in the stable branches, so the interfaces should remain stable.
It does get a little trickier with the maps, because you could get the case
where the new maps require new archetypes, yet the archetypes are part of the
server, so you'd need to make a new server release.
One could probably make a strong case that the data files from the server
should be in a separate archive from the binaries (just like the maps are
separate). Server admins are probably more willing to dump a new set of
archetypes/maps down than to have to recompile everything and make sure nothing
got messed up in the process.
>
>> If it is a semi private release (not uploaded to sourceforge), it probably
>> won't get as widespread testing, and in fact may not get any testing.
>
> Not my intent. Say Joe wants to run a server... he picks stable because he
> mostly wants to play with a bunch of friends and doesn't want a lot of risk
> that stuff will be busted. Jack though, likes to hack around, and play the
> bleeding edge, but he's not up to doing SVN checkout and build, so he goes
> for the "unstable". It's all up to the downloader. It's not a new concept as
> a number of big projects run that way... TortoiseCVS... CVS... etc. I'd not
> say it is without some downsides if not managed well.
As of now, we are not making any releases of the head branch.
As said, that should perhaps change, but there are lots of downsides to it.
There is actually a lot to think about here - doing both head and stable
releases doubles the release work. Have things drifted far enough apart for this
to make lots of sense? Do we want to wait until we are close (6 months) to a
2.0 release and start doing releases then, on the basis that we know some things
in the code right now are work in progress?
One reason for the branches was to open up the head branch for more rapid
development without the more strict model done in a stable release.
But the other issue I mentioned is incompatibility between future releases.
We pretty much know that what is in the main trunk right now is pretty far away
from 2.0, and will be incompatible in various ways.
You can imagine the reaction when Joe User downloads the snapshot of
the trunk that we provide right now, and sets up a server on it because he
wants the bleeding edge. The next snapshot 2 months from now differs in various
incompatible ways, meaning that the player files have to be wiped, etc. I think
the reaction there would be negative, not positive.
This is a bit different than most other programs, where things may change in
various incompatible ways, but you don't effectively lose 2 months of work
because of it.
So in a sense, if we make those snapshots, we almost need to put a warning
like "What, are you crazy? You almost certainly should be using the stable
release because you may be forced to start with new characters at any time with
this branch."
That said, if people think we should put these releases out there, I won't
stop it. But I personally would rather wait until the stage where we are pretty
sure things are stable enough not to need character wipes, but may still have a
fairly long train of other enhancements and bug fixes.
>>
>> If the scripts are not of release quality, there is nothing preventing us
>> right now from putting those in something like maps/scripts-nrq (non release
>> quality), and then just not include that directory when the data is tarred up.
>
> When I said release quality, I was meaning that I wouldn't necessarily want
> someone to think I thought they were polished, but since they were in SVN,
> someone could hack on them when I didn't. It seems too bad to hoard a
> script just because I don't know where junk like that goes. It seems harder
> to add a file in a maps directory, etc. if you're just hacking around a proof of
> concept that might just fizzle. Having a tools area might make it easier to
> chuck something in in case someone else might look at it and go... hey...
> that's a cool idea, but how about doing it this way... or whatever. I tend to
> feel less confident about uploading a script into say maps, or server if I'm
> not sure someone else would even use it, because it would be easy to be
> seen as clutter especially if abused. If a script matured and got heavily
> used, it could be moved to a more permanent location that was in the
> directory it was most applicable to. The other thing is, that it is hard to
> know where scripts are and what they are for right now. If there was an
> obvious top-level directory for that sort of thing, it would make it much
> easier to find out what was available to try on for size.
I see what you're saying. I'm just concerned that such directories will end up
filled with scripts in progress or otherwise stuff of questionable status, and
then we're stuck trying to figure out what to do with it (should it get removed?
Is someone going to work on it, etc). If too many misc. scripts are put in
there, then it also becomes harder to find the useful/good ones.
I suppose this could be helped somewhat by having a README file, with a
requirement that if you add a script, you add a line to the README providing a
quick summary, so someone can quickly look at the README and see what is there
and what may be useful.
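That README convention could even be checked mechanically. Here is a rough sketch, with invented script names and an assumed "name - summary" line format:

```python
# Sketch of enforcing the "one README line per script" convention.
# The script names and README format here are invented for illustration.

def missing_readme_entries(scripts, readme_text):
    """Return scripts that have no summary line in the README.

    Expects README lines of the form "name.py - short summary".
    """
    described = set()
    for line in readme_text.splitlines():
        if " - " in line:
            described.add(line.split(" - ", 1)[0].strip())
    return sorted(s for s in scripts if s not in described)

readme = """\
check_spelling.py - scan map files for spelling mistakes
pack_release.py - tar up the map set for a micro release
"""
scripts = ["check_spelling.py", "pack_release.py", "wip_experiment.py"]
print(missing_readme_entries(scripts, readme))  # ['wip_experiment.py']
```

A check like this could run before tarring up a release, so anything dropped into the scripts area without a summary gets flagged.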
Some level of coordination is still needed. I could see a script that hasn't
been modified for 6 months, which I don't think is particularly useful as-is,
and decide that with some modifications it would be. However, if someone else is
finding it useful in its current form, and I change it in incompatible ways,
that isn't a great thing either.
I'm not positive if SVN is the best thing for work in progress type of stuff
or not. But we could try it out.
>> I think it also makes sense to have script dirs in the arch and map
>> distribution - there are already a bunch of map checking scripts in the server
>> area, but that doesn't make a lot of sense - people don't necessarily know that
>> they are there, etc.
>
> Agree mostly... but still think there might be an advantage to having them
> collected in a common scripts directory that had subdirectories named
> after the arch, maps, etc. they serve. Don't know SVN well enough, but
> it could be you could have some kind of "view" that also put them in
> the directory where they were most likely to be used, but as far as
> version control went, they were controlled in the common tools area.
> Mostly this is just thinking out loud. Don't know if it fits the project, but
> since I write scripts that aren't in SVN I was thinking about how I might
> be more likely to put them up for people to look at and adapt for their
> own use.
You can do external references in SVN. So if someone does a 'svn checkout
maps', it pulls scripts/maps into maps/scripts.
However, that is done at the directory level. So this doesn't work well
if pre-production quality scripts are there (as they are now part of the map area).
Also, there isn't a major concept of controlled area. Right now, if you have
SVN write access, you have it in all areas.
So in that above example, if we have a scripts directory with maps/scripts
being an external references to it, I can go into maps/scripts, make the
changes, and do an svn commit and it would commit them just fine. It would
commit them to scripts/maps, but I don't have to manually go to scripts/maps to
do the checkin.
The external reference support basically just provides a mechanism to link in
other projects. The linked area doesn't even need to be in the same project.
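For reference, the externals setup described here might look something like the following. This is a config sketch, not a runnable recipe: the repository URL and paths are placeholders, not the real crossfire layout.

```shell
# Set the svn:externals property on the maps directory so that a
# checkout of maps/ also pulls scripts/maps into maps/scripts.
# (URL below is a placeholder; substitute the real repository root.)
svn propset svn:externals \
    "scripts http://svn.example.org/repo/scripts/maps" maps
svn commit -m "Link shared scripts area into maps/scripts" maps

# From then on, "svn checkout .../maps" also fetches maps/scripts,
# and commits made inside maps/scripts land in scripts/maps.
```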
I'm not sure what we really get by doing that, vs just having maps/scripts,
arch/scripts, etc. I suppose the one thing is that it does provide an area for
scripts outside of those areas, or which hits multiple areas (something that can
process map files looking for spelling errors could probably also parse the
archetypes files).
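As a toy illustration of such a cross-area script, something like this could walk map or archetype text looking for common misspellings (the word list and sample text are invented; a real checker would use a proper dictionary):

```python
# Toy sketch of a script that could scan both map and archetype text
# for misspellings.  BAD_WORDS and the sample text are invented.

BAD_WORDS = {"teh": "the", "recieve": "receive", "seperate": "separate"}

def find_misspellings(text):
    """Return (line_number, bad_word, suggestion) tuples."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for word in line.lower().split():
            word = word.strip(".,;:!?\"'")
            if word in BAD_WORDS:
                hits.append((lineno, word, BAD_WORDS[word]))
    return hits

sample = "You recieve teh key.\nA seperate door opens."
for lineno, bad, good in find_misspellings(sample):
    print("line %d: %r -> %r" % (lineno, bad, good))
```

Because it only cares about text, the same function could be pointed at map files or at the archetypes file without change.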
The other case would be scripts to do the release process, etc.