git-archimport
git-archive
git-bisect
+git-blame
git-branch
git-cat-file
git-check-ref-format
git-for-each-ref
git-format-patch
git-fsck-objects
+git-gc
git-get-tar-commit-id
git-grep
git-hash-object
git-read-tree
git-rebase
git-receive-pack
+git-reflog
git-relink
git-repack
git-repo-config
config.mak.autogen
config.mak.append
configure
-git-blame
Additional email headers to include in a patch to be submitted
by mail. See gitlink:git-format-patch[1].
+gc.reflogexpire::
+ `git reflog expire` removes reflog entries older than
+ this time; defaults to 90 days.
+
+gc.reflogexpireunreachable::
+ `git reflog expire` removes reflog entries older than
+	this time and not reachable from the current tip;
+ defaults to 30 days.
+
+gc.rerereresolved::
+	Records of conflicted merges you resolved earlier are
+ kept for this many days when `git rerere gc` is run.
+ The default is 60 days. See gitlink:git-rerere[1].
+
+gc.rerereunresolved::
+	Records of conflicted merges you have not resolved are
+ kept for this many days when `git rerere gc` is run.
+ The default is 15 days. See gitlink:git-rerere[1].
+
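These variables can be set per repository with gitlink:git-repo-config[1].
A small illustration, using the time formats mentioned in
gitlink:git-gc[1]; the values are arbitrary examples, not recommendations:

------------
$ git repo-config gc.reflogexpire "3 months"
$ git repo-config gc.reflogexpireunreachable "30 days"
$ git repo-config gc.rerereresolved 90
$ git repo-config gc.rerereunresolved 30
------------
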
gitcvs.enabled::
Whether the cvs pserver interface is enabled for this repository.
See gitlink:git-cvsserver[1].
* gitlink:git-repack[1] to pack loose objects for efficiency.
+ * gitlink:git-gc[1] to do common housekeeping tasks such as
+ repack and prune.
+
Examples
~~~~~~~~
+
------------
$ git fsck-objects <1>
-$ git prune
$ git count-objects <2>
$ git repack <3>
-$ git prune <4>
+$ git gc <4>
------------
+
-<1> running without "--full" is usually cheap and assures the
+<1> running without `\--full` is usually cheap and assures the
repository health reasonably well.
<2> check how many loose objects there are and how much
disk space is wasted by not repacking.
-<3> without "-a" repacks incrementally. repacking every 4-5MB
+<3> without `-a` repacks incrementally. repacking every 4-5MB
of loose objects accumulation may be a good rule of thumb.
-<4> after repack, prune removes the duplicate loose objects.
+<4> it is easier to use `git gc` than individual housekeeping commands
+such as `prune` and `repack`. This runs `repack -a -d`.
Repack a small project into single pack.::
+
* gitlink:git-checkout[1] and gitlink:git-branch[1] to switch
branches.
- * gitlink:git-add[1] and gitlink:git-update-index[1] to manage
- the index file.
+ * gitlink:git-add[1] to manage the index file.
* gitlink:git-diff[1] and gitlink:git-status[1] to see what
you are in the middle of doing.
* gitlink:git-reset[1] and gitlink:git-checkout[1] (with
pathname parameters) to undo changes.
- * gitlink:git-pull[1] with "." as the remote to merge between
- local branches.
+ * gitlink:git-merge[1] to merge between local branches.
* gitlink:git-rebase[1] to maintain topic branches.
Examples
~~~~~~~~
-Use a tarball as a starting point for a new repository:
+Use a tarball as a starting point for a new repository.::
+
------------
$ tar zxf frotz.tar.gz
$ git checkout -- curses/ux_audio_oss.c <2>
$ git add curses/ux_audio_alsa.c <3>
$ edit/compile/test
-$ git diff <4>
+$ git diff HEAD <4>
$ git commit -a -s <5>
$ edit/compile/test
$ git reset --soft HEAD^ <6>
$ git diff ORIG_HEAD <7>
$ git commit -a -c ORIG_HEAD <8>
$ git checkout master <9>
-$ git pull . alsa-audio <10>
+$ git merge alsa-audio <10>
$ git log --since='3 days ago' <11>
$ git log v2.43.. curses/ <12>
------------
+
<1> create a new topic branch.
-<2> revert your botched changes in "curses/ux_audio_oss.c".
+<2> revert your botched changes in `curses/ux_audio_oss.c`.
<3> you need to tell git if you added a new file; removal and
-modification will be caught if you do "commit -a" later.
+modification will be caught if you do `git commit -a` later.
<4> to see what changes you are committing.
<5> commit everything as you have tested, with your sign-off.
<6> take the last commit back, keeping what is in the working tree.
<8> redo the commit undone in the previous step, using the message
you originally wrote.
<9> switch to the master branch.
-<10> merge a topic branch into your master branch
+<10> merge a topic branch into your master branch. You can also use
+`git pull . alsa-audio`, i.e. pull from the local repository.
<11> review commit logs; other forms to limit output can be
-combined and include --max-count=10 (show 10 commits), --until='2005-12-10'.
-<12> view only the changes that touch what's in curses/
-directory, since v2.43 tag.
+combined and include `\--max-count=10` (show 10 commits),
+`\--until=2005-12-10`, etc.
+<12> view only the changes that touch what's in `curses/`
+directory, since `v2.43` tag.
Individual Developer (Participant)[[Individual Developer (Participant)]]
+
<1> repeat as needed.
<2> extract patches from your branch for e-mail submission.
-<3> "pull" fetches from "origin" by default and merges into the
+<3> `git pull` fetches from `origin` by default and merges into the
current branch.
<4> immediately after pulling, look at the changes done upstream
since last time we checked, only in the
<5> fetch from a specific branch from a specific repository and merge.
<6> revert the pull.
<7> garbage collect leftover objects from reverted pull.
-<8> from time to time, obtain official tags from the "origin"
-and store them under .git/refs/tags/.
+<8> from time to time, obtain official tags from the `origin`
+and store them under `.git/refs/tags/`.
Push into another repository.::
+
------------
-satellite$ git clone mothership:frotz/.git frotz <1>
+satellite$ git clone mothership:frotz frotz <1>
satellite$ cd frotz
-satellite$ cat .git/remotes/origin <2>
-URL: mothership:frotz/.git
-Pull: master:origin
-satellite$ echo 'Push: master:satellite' >>.git/remotes/origin <3>
+satellite$ git repo-config --get-regexp '^(remote|branch)\.' <2>
+remote.origin.url mothership:frotz
+remote.origin.fetch refs/heads/*:refs/remotes/origin/*
+branch.master.remote origin
+branch.master.merge refs/heads/master
+satellite$ git repo-config remote.origin.push \
+ master:refs/remotes/satellite/master <3>
satellite$ edit/compile/test/commit
satellite$ git push origin <4>
mothership$ cd frotz
mothership$ git checkout master
-mothership$ git pull . satellite <5>
+mothership$ git merge satellite/master <5>
------------
+
<1> mothership machine has a frotz repository under your home
directory; clone from it to start a repository on the satellite
machine.
-<2> clone creates this file by default. It arranges "git pull"
-to fetch and store the master branch head of mothership machine
-to local "origin" branch.
-<3> arrange "git push" to push local "master" branch to
-"satellite" branch of the mothership machine.
-<4> push will stash our work away on "satellite" branch on the
-mothership machine. You could use this as a back-up method.
+<2> clone sets these configuration variables by default.
+It arranges `git pull` to fetch and store the branches of mothership
+machine to local `remotes/origin/*` tracking branches.
+<3> arrange `git push` to push local `master` branch to
+`remotes/satellite/master` branch of the mothership machine.
+<4> push will stash our work away on `remotes/satellite/master`
+tracking branch on the mothership machine. You could use this as
+a back-up method.
<5> on mothership machine, merge the work done on the satellite
machine into the master branch.
+
<1> create a private branch based on a well known (but somewhat behind)
tag.
-<2> forward port all changes in private2.6.14 branch to master branch
+<2> forward port all changes in `private2.6.14` branch to `master` branch
without a formal "merging".
& s 2 3 4 5 ./+to-apply
& s 7 8 ./+hold-linus
& q
-$ git checkout master
+$ git checkout -b topic/one master
$ git am -3 -i -s -u ./+to-apply <4>
$ compile/test
$ git checkout -b hold/linus && git am -3 -i -s -u ./+hold-linus <5>
$ git checkout topic/one && git rebase master <6>
-$ git checkout pu && git reset --hard master <7>
-$ git pull . topic/one topic/two && git pull . hold/linus <8>
+$ git checkout pu && git reset --hard next <7>
+$ git merge topic/one topic/two && git merge hold/linus <8>
$ git checkout maint
$ git cherry-pick master~4 <9>
$ compile/test
that are not quite ready.
<4> apply them, interactively, with my sign-offs.
<5> create topic branch as needed and apply, again with my
-sign-offs.
+sign-offs.
<6> rebase internal topic branch that has not been merged to the
master, nor exposed as a part of a stable branch.
-<7> restart "pu" every time from the master.
+<7> restart `pu` every time from the next.
<8> and bundle topic branches still cooking.
<9> backport a critical fix.
<10> create a signed tag.
<11> make sure I did not accidentally rewind master beyond what I
-already pushed out. "ko" shorthand points at the repository I have
+already pushed out. `ko` shorthand points at the repository I have
at kernel.org, and looks like this:
+
------------
$ cat .git/remotes/ko
URL: kernel.org:/pub/scm/git/git.git
Pull: master:refs/tags/ko-master
+Pull: next:refs/tags/ko-next
Pull: maint:refs/tags/ko-maint
Push: master
+Push: next
Push: +pu
Push: maint
------------
+
-In the output from "git show-branch", "master" should have
-everything "ko-master" has.
+In the output from `git show-branch`, `master` should have
+everything `ko-master` has, and `next` should have
+everything `ko-next` has.
<12> push out the bleeding edge.
<13> push the tag out, too.
------------
+
<1> log-in shell is set to /usr/bin/git-shell, which does not
-allow anything but "git push" and "git pull". The users should
+allow anything but `git push` and `git pull`. The users should
get an ssh access to the machine.
<2> in many distributions /etc/shells needs to list what is used
as the login shell.
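
For example, on a typical Linux server this might look like the
following sketch; the git-shell path is the one cited above, the user
name is hypothetical, and the exact chsh invocation varies by system:

------------
# run as root on the server machine
$ echo /usr/bin/git-shell >>/etc/shells
$ chsh -s /usr/bin/git-shell alice
------------
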
-v::
In addition to the number of loose objects and disk
space consumed, it reports the number of in-pack
- objects, and number of objects that can be removed by
- running `git-prune-packed`.
+ objects, number of packs, and number of objects that can be
+ removed by running `git-prune-packed`.
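
For illustration, the verbose output would then look roughly like
this; the field names are the real ones, the numbers are made up:

------------
$ git count-objects -v
count: 137
size: 1460
in-pack: 19455
packs: 2
prune-packable: 30
garbage: 0
------------
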
Author
--- /dev/null
+git-gc(1)
+=========
+
+NAME
+----
+git-gc - Cleanup unnecessary files and optimize the local repository
+
+
+SYNOPSIS
+--------
+'git-gc'
+
+DESCRIPTION
+-----------
+Runs a number of housekeeping tasks within the current repository,
+such as compressing file revisions (to reduce disk space and increase
+performance) and removing unreachable objects which may have been
+created from prior invocations of gitlink:git-add[1].
+
+Users are encouraged to run this task on a regular basis within
+each repository to maintain good disk space utilization and good
+operating performance.
+
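As a rough sketch, the housekeeping is equivalent to running the
following commands by hand (this is what the accompanying git-gc
script runs):

------------
$ git pack-refs --prune
$ git reflog expire --all
$ git repack -a -d -l
$ git prune
$ git rerere gc
------------
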
+Configuration
+-------------
+
+The optional configuration variable 'gc.reflogExpire' can be
+set to indicate how long historical entries within each branch's
+reflog should remain available in this repository. The setting is
+expressed as a length of time, for example '90 days' or '3 months'.
+It defaults to '90 days'.
+
+The optional configuration variable 'gc.reflogExpireUnreachable'
+can be set to indicate how long historical reflog entries which
+are not part of the current branch should remain available in
+this repository. These types of entries are generally created as
+a result of using `git commit \--amend` or `git rebase` and are the
+commits prior to the amend or rebase occurring. Since these changes
+are not part of the current project, most users will want to expire
+them sooner. This option defaults to '30 days'.
+
+The optional configuration variable 'gc.rerereresolved' indicates
+how long records of conflicted merges you resolved earlier are
+kept. This defaults to 60 days.
+
+The optional configuration variable 'gc.rerereunresolved' indicates
+how long records of conflicted merges you have not resolved are
+kept. This defaults to 15 days.
+
+
+See Also
+--------
+gitlink:git-prune[1]
+gitlink:git-reflog[1]
+gitlink:git-repack[1]
+gitlink:git-rerere[1]
+
+Author
+------
+Written by Shawn O. Pearce <spearce@spearce.org>
+
+GIT
+---
+Part of the gitlink:git[7] suite
--------
[verse]
'git-merge' [-n] [--no-commit] [--squash] [-s <strategy>]...
- [--reflog-action=<action>]
-m=<msg> <remote> <remote>...
DESCRIPTION
least one <remote>. Specifying more than one <remote>
obviously means you are trying an Octopus.
---reflog-action=<action>::
- This is used internally when `git-pull` calls this command
- to record that the merge was created by `pull` command
- in the `ref-log` entry that results from the merge.
-
include::merge-strategies.txt[]
--- /dev/null
+git-reflog(1)
+=============
+
+NAME
+----
+git-reflog - Manage reflog information
+
+
+SYNOPSIS
+--------
+[verse]
+'git-reflog' expire [--dry-run]
+ [--expire=<time>] [--expire-unreachable=<time>] [--all] <refs>...
+
+
+DESCRIPTION
+-----------
+
+Reflog is a mechanism to record when the tips of branches are
+updated. This command manages the information recorded in it.
+
+The subcommand "expire" is used to prune older reflog entries.
+Entries older than `expire` time, or entries older than
+`expire-unreachable` time and not reachable from the current
+tip, are removed from the reflog. This is typically not used
+directly by the end users -- instead, see gitlink:git-gc[1].
+
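For illustration, a dry run against a single branch might look like
this; `\--dry-run` only reports what would be pruned, and the time
argument uses the same approximate format as the configuration
variables (e.g. '90 days'):

------------
$ git reflog expire --dry-run --expire='90 days' refs/heads/master
------------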
+
+
+OPTIONS
+-------
+
+--expire=<time>::
+ Entries older than this time are pruned. Without the
+ option it is taken from configuration `gc.reflogExpire`,
+ which in turn defaults to 90 days.
+
+--expire-unreachable=<time>::
+	Entries older than this time and not reachable from
+ the current tip of the branch are pruned. Without the
+ option it is taken from configuration
+ `gc.reflogExpireUnreachable`, which in turn defaults to
+ 30 days.
+
+--all::
+ Instead of listing <refs> explicitly, prune all refs.
+
+Author
+------
+Written by Junio C Hamano <junkio@cox.net>
+
+Documentation
+--------------
+Documentation by Junio C Hamano and the git-list <git@vger.kernel.org>.
+
+GIT
+---
+Part of the gitlink:git[7] suite
+
SYNOPSIS
--------
-'git-rerere' [clear|diff|status]
+'git-rerere' [clear|diff|status|gc]
DESCRIPTION
-----------
'gc'::
This command is used to prune records of conflicted merge that
-occurred long time ago.
+occurred a long time ago. By default, unresolved conflicts older
+than 15 days and resolved conflicts older than 60 days are
+pruned. These are controlled by the `gc.rerereunresolved` and
+`gc.rerereresolved` configuration variables.
DISCUSSION
gitlink:git-cvsserver[1]::
A CVS server emulator for git.
+gitlink:git-gc[1]::
+ Cleanup unnecessary files and optimize the local repository.
+
gitlink:git-lost-found[1]::
Recover lost refs that luckily have not yet been pruned.
gitlink:git-quiltimport[1]::
Applies a quilt patchset onto the current branch.
+gitlink:git-reflog[1]::
+ Manage reflog information.
+
gitlink:git-relink[1]::
Hardlink common objects in local repositories.
#
# Define NO_ICONV if your libc does not properly support iconv.
#
+# Define NO_R_TO_GCC_LINKER if your gcc does not like "-R/path/lib" that
+# tells runtime paths to dynamic libraries; "-Wl,-rpath=/path/lib"
+# is used instead.
+#
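If a platform's gcc rejects `-R`, the knob can be set on the make
command line or in config.mak; a sketch, with a hypothetical curl
install location:

------------
$ make NO_R_TO_GCC_LINKER=YesPlease CURLDIR=/opt/curl
------------
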
# Define USE_NSEC below if you want git to care about sub-second file mtimes
# and ctimes. Note that you need recent glibc (at least 2.2.4) for this, and
# it will BREAK YOUR LOCAL DIFFS! show-diff and anything using it will likely
SCRIPT_SH = \
git-bisect.sh git-checkout.sh \
git-clean.sh git-clone.sh git-commit.sh \
- git-fetch.sh \
+ git-fetch.sh git-gc.sh \
git-ls-remote.sh \
git-merge-one-file.sh git-parse-remote.sh \
git-pull.sh git-rebase.sh \
revision.o pager.o tree-walk.o xdiff-interface.o \
write_or_die.o trace.o list-objects.o grep.o \
alloc.o merge-file.o path-list.o help.o unpack-trees.o $(DIFF_OBJS) \
- color.o wt-status.o archive-zip.o archive-tar.o \
- utf8.o
+ color.o wt-status.o archive-zip.o archive-tar.o shallow.o utf8.o
BUILTIN_OBJS = \
builtin-add.o \
builtin-prune-packed.o \
builtin-push.o \
builtin-read-tree.o \
+ builtin-reflog.o \
builtin-repo-config.o \
builtin-rerere.o \
builtin-rev-list.o \
NO_FAST_WORKING_DIRECTORY = UnfortunatelyYes
# There are conflicting reports about this.
# On some boxes NO_MMAP is needed, and not so elsewhere.
- # Try uncommenting this if you see things break -- YMMV.
- # NO_MMAP = YesPlease
+ # Try commenting this out if you suspect MMAP is more efficient
+ NO_MMAP = YesPlease
NO_IPV6 = YesPlease
X = .exe
endif
endif
endif
+ifdef NO_R_TO_GCC_LINKER
+ # Some gcc does not accept and pass -R to the linker to specify
+ # the runtime dynamic library path.
+ CC_LD_DYNPATH = -Wl,-rpath=
+else
+ CC_LD_DYNPATH = -R
+endif
+
ifndef NO_CURL
ifdef CURLDIR
- # This is still problematic -- gcc does not always want -R.
+ # Try "-Wl,-rpath=$(CURLDIR)/lib" in such a case.
BASIC_CFLAGS += -I$(CURLDIR)/include
- CURL_LIBCURL = -L$(CURLDIR)/lib -R$(CURLDIR)/lib -lcurl
+ CURL_LIBCURL = -L$(CURLDIR)/lib $(CC_LD_DYNPATH)$(CURLDIR)/lib -lcurl
else
CURL_LIBCURL = -lcurl
endif
ifndef NO_OPENSSL
OPENSSL_LIBSSL = -lssl
ifdef OPENSSLDIR
- # Again this may be problematic -- gcc does not always want -R.
BASIC_CFLAGS += -I$(OPENSSLDIR)/include
- OPENSSL_LINK = -L$(OPENSSLDIR)/lib -R$(OPENSSLDIR)/lib
+ OPENSSL_LINK = -L$(OPENSSLDIR)/lib $(CC_LD_DYNPATH)$(OPENSSLDIR)/lib
else
OPENSSL_LINK =
endif
endif
ifdef NEEDS_LIBICONV
ifdef ICONVDIR
- # Again this may be problematic -- gcc does not always want -R.
BASIC_CFLAGS += -I$(ICONVDIR)/include
- ICONV_LINK = -L$(ICONVDIR)/lib -R$(ICONVDIR)/lib
+ ICONV_LINK = -L$(ICONVDIR)/lib $(CC_LD_DYNPATH)$(ICONVDIR)/lib
else
ICONV_LINK =
endif
return "";
}
-static int in_merge_bases(const unsigned char *sha1,
- struct commit *rev1,
- struct commit *rev2)
-{
- struct commit_list *bases, *b;
- int ret = 0;
-
- bases = get_merge_bases(rev1, rev2, 1);
- for (b = bases; b; b = b->next) {
- if (!hashcmp(sha1, b->item->object.sha1)) {
- ret = 1;
- break;
- }
- }
-
- free_commit_list(bases);
- return ret;
-}
-
static int delete_branches(int argc, const char **argv, int force, int kinds)
{
struct commit *rev, *head_rev = head_rev;
*/
if (!force &&
- !in_merge_bases(sha1, rev, head_rev)) {
+ !in_merge_bases(rev, head_rev)) {
error("The branch '%s' is not a strict subset of "
"your current HEAD.\n"
"If you are sure you want to delete it, "
}
if (verbose) {
struct packed_git *p;
+ unsigned long num_pack = 0;
if (!packed_git)
prepare_packed_git();
for (p = packed_git; p; p = p->next) {
if (!p->pack_local)
continue;
packed += num_packed_objects(p);
+ num_pack++;
}
printf("count: %lu\n", loose);
printf("size: %lu\n", loose_size / 2);
printf("in-pack: %lu\n", packed);
+ printf("packs: %lu\n", num_pack);
printf("prune-packable: %lu\n", packed_loose);
printf("garbage: %lu\n", garbage);
}
git-pack-objects [{ -q | --progress | --all-progress }] \n\
[--local] [--incremental] [--window=N] [--depth=N] \n\
[--no-reuse-delta] [--delta-base-offset] [--non-empty] \n\
- [--revs [--unpacked | --all]*] [--stdout | base-name] \n\
+ [--revs [--unpacked | --all]*] [--reflog] [--stdout | base-name] \n\
[<ref-list | <object-list]";
struct object_entry {
}
if (!strcmp("--unpacked", arg) ||
!strncmp("--unpacked=", arg, 11) ||
+ !strcmp("--reflog", arg) ||
!strcmp("--all", arg)) {
use_internal_rev_list = 1;
if (ARRAY_SIZE(rp_av) - 1 <= rp_ac)
}
}
+static int add_one_reflog_ent(unsigned char *osha1, unsigned char *nsha1, char *detail, void *cb_data)
+{
+ struct object *object;
+
+ object = parse_object(osha1);
+ if (object)
+ add_pending_object(&revs, object, "");
+ object = parse_object(nsha1);
+ if (object)
+ add_pending_object(&revs, object, "");
+ return 0;
+}
+
static int add_one_ref(const char *path, const unsigned char *sha1, int flag, void *cb_data)
{
struct object *object = parse_object(sha1);
if (!object)
die("bad object ref: %s:%s", path, sha1_to_hex(sha1));
add_pending_object(&revs, object, "");
+
+ for_each_reflog_ent(path, add_one_reflog_ent, NULL);
+
return 0;
}
--- /dev/null
+#include "cache.h"
+#include "builtin.h"
+#include "commit.h"
+#include "refs.h"
+#include "dir.h"
+#include "tree-walk.h"
+
+static unsigned long default_reflog_expire;
+static unsigned long default_reflog_expire_unreachable;
+
+struct expire_reflog_cb {
+ FILE *newlog;
+ const char *ref;
+ struct commit *ref_commit;
+ unsigned long expire_total;
+ unsigned long expire_unreachable;
+};
+
+static int tree_is_complete(const unsigned char *sha1)
+{
+ struct tree_desc desc;
+ void *buf;
+ char type[20];
+
+ buf = read_sha1_file(sha1, type, &desc.size);
+ if (!buf)
+ return 0;
+ desc.buf = buf;
+ while (desc.size) {
+ const unsigned char *elem;
+ const char *name;
+ unsigned mode;
+
+ elem = tree_entry_extract(&desc, &name, &mode);
+ if (!has_sha1_file(elem) ||
+ (S_ISDIR(mode) && !tree_is_complete(elem))) {
+ free(buf);
+ return 0;
+ }
+ update_tree_entry(&desc);
+ }
+ free(buf);
+ return 1;
+}
+
+static int keep_entry(struct commit **it, unsigned char *sha1)
+{
+ struct commit *commit;
+
+ *it = NULL;
+ if (is_null_sha1(sha1))
+ return 1;
+ commit = lookup_commit_reference_gently(sha1, 1);
+ if (!commit)
+ return 0;
+
+ /* Make sure everything in this commit exists. */
+ parse_object(commit->object.sha1);
+ if (!tree_is_complete(commit->tree->object.sha1))
+ return 0;
+ *it = commit;
+ return 1;
+}
+
+static int expire_reflog_ent(unsigned char *osha1, unsigned char *nsha1,
+ char *data, void *cb_data)
+{
+ struct expire_reflog_cb *cb = cb_data;
+ unsigned long timestamp;
+ char *cp, *ep;
+ struct commit *old, *new;
+
+ cp = strchr(data, '>');
+ if (!cp || *++cp != ' ')
+ goto prune;
+ timestamp = strtoul(cp, &ep, 10);
+ if (*ep != ' ')
+ goto prune;
+ if (timestamp < cb->expire_total)
+ goto prune;
+
+ if (!keep_entry(&old, osha1) || !keep_entry(&new, nsha1))
+ goto prune;
+
+ if ((timestamp < cb->expire_unreachable) &&
+ (!cb->ref_commit ||
+ (old && !in_merge_bases(old, cb->ref_commit)) ||
+ (new && !in_merge_bases(new, cb->ref_commit))))
+ goto prune;
+
+ if (cb->newlog)
+ fprintf(cb->newlog, "%s %s %s",
+ sha1_to_hex(osha1), sha1_to_hex(nsha1), data);
+ return 0;
+ prune:
+ if (!cb->newlog)
+ fprintf(stderr, "would prune %s", data);
+ return 0;
+}
+
+struct cmd_reflog_expire_cb {
+ int dry_run;
+ unsigned long expire_total;
+ unsigned long expire_unreachable;
+};
+
+static int expire_reflog(const char *ref, const unsigned char *sha1, int unused, void *cb_data)
+{
+ struct cmd_reflog_expire_cb *cmd = cb_data;
+ struct expire_reflog_cb cb;
+ struct ref_lock *lock;
+ char *newlog_path = NULL;
+ int status = 0;
+
+ if (strncmp(ref, "refs/", 5))
+ return error("not a ref '%s'", ref);
+
+ memset(&cb, 0, sizeof(cb));
+ /* we take the lock for the ref itself to prevent it from
+ * getting updated.
+ */
+ lock = lock_ref_sha1(ref + 5, sha1);
+ if (!lock)
+ return error("cannot lock ref '%s'", ref);
+ if (!file_exists(lock->log_file))
+ goto finish;
+ if (!cmd->dry_run) {
+ newlog_path = xstrdup(git_path("logs/%s.lock", ref));
+ cb.newlog = fopen(newlog_path, "w");
+ }
+
+ cb.ref_commit = lookup_commit_reference_gently(sha1, 1);
+ if (!cb.ref_commit)
+ fprintf(stderr,
+ "warning: ref '%s' does not point at a commit\n", ref);
+ cb.ref = ref;
+ cb.expire_total = cmd->expire_total;
+ cb.expire_unreachable = cmd->expire_unreachable;
+ for_each_reflog_ent(ref, expire_reflog_ent, &cb);
+ finish:
+ if (cb.newlog) {
+ if (fclose(cb.newlog))
+ status |= error("%s: %s", strerror(errno),
+ newlog_path);
+ if (rename(newlog_path, lock->log_file)) {
+ status |= error("cannot rename %s to %s",
+ newlog_path, lock->log_file);
+ unlink(newlog_path);
+ }
+ }
+ free(newlog_path);
+ unlock_ref(lock);
+ return status;
+}
+
+static int reflog_expire_config(const char *var, const char *value)
+{
+ if (!strcmp(var, "gc.reflogexpire"))
+ default_reflog_expire = approxidate(value);
+ else if (!strcmp(var, "gc.reflogexpireunreachable"))
+ default_reflog_expire_unreachable = approxidate(value);
+ else
+ return git_default_config(var, value);
+ return 0;
+}
+
+static const char reflog_expire_usage[] =
+"git-reflog expire [--dry-run] [--expire=<time>] [--expire-unreachable=<time>] [--all] <refs>...";
+
+static int cmd_reflog_expire(int argc, const char **argv, const char *prefix)
+{
+ struct cmd_reflog_expire_cb cb;
+ unsigned long now = time(NULL);
+ int i, status, do_all;
+
+ git_config(reflog_expire_config);
+
+ save_commit_buffer = 0;
+ do_all = status = 0;
+ memset(&cb, 0, sizeof(cb));
+
+ if (!default_reflog_expire_unreachable)
+ default_reflog_expire_unreachable = now - 30 * 24 * 3600;
+ if (!default_reflog_expire)
+ default_reflog_expire = now - 90 * 24 * 3600;
+ cb.expire_total = default_reflog_expire;
+ cb.expire_unreachable = default_reflog_expire_unreachable;
+
+ for (i = 1; i < argc; i++) {
+ const char *arg = argv[i];
+ if (!strcmp(arg, "--dry-run") || !strcmp(arg, "-n"))
+ cb.dry_run = 1;
+ else if (!strncmp(arg, "--expire=", 9))
+ cb.expire_total = approxidate(arg + 9);
+ else if (!strncmp(arg, "--expire-unreachable=", 21))
+ cb.expire_unreachable = approxidate(arg + 21);
+ else if (!strcmp(arg, "--all"))
+ do_all = 1;
+ else if (!strcmp(arg, "--")) {
+ i++;
+ break;
+ }
+ else if (arg[0] == '-')
+ usage(reflog_expire_usage);
+ else
+ break;
+ }
+ if (do_all)
+ status |= for_each_ref(expire_reflog, &cb);
+ while (i < argc) {
+ const char *ref = argv[i++];
+ unsigned char sha1[20];
+ if (!resolve_ref(ref, sha1, 1, NULL)) {
+ status |= error("%s points nowhere!", ref);
+ continue;
+ }
+ status |= expire_reflog(ref, sha1, 0, &cb);
+ }
+ return status;
+}
+
+static const char reflog_usage[] =
+"git-reflog (expire | ...)";
+
+int cmd_reflog(int argc, const char **argv, const char *prefix)
+{
+ if (argc < 2)
+ usage(reflog_usage);
+ else if (!strcmp(argv[1], "expire"))
+ return cmd_reflog_expire(argc - 1, argv + 1, prefix);
+ else
+ usage(reflog_usage);
+}
return write_rr(rr, fd);
}
+static int git_rerere_config(const char *var, const char *value)
+{
+ if (!strcmp(var, "gc.rerereresolved"))
+ cutoff_resolve = git_config_int(var, value);
+ else if (!strcmp(var, "gc.rerereunresolved"))
+ cutoff_noresolve = git_config_int(var, value);
+ else
+ return git_default_config(var, value);
+ return 0;
+}
+
int cmd_rerere(int argc, const char **argv, const char *prefix)
{
struct path_list merge_rr = { NULL, 0, 0, 1 };
if (stat(git_path("rr-cache"), &st) || !S_ISDIR(st.st_mode))
return 0;
+ git_config(git_rerere_config);
+
merge_rr_path = xstrdup(git_path("rr-cache/MERGE_RR"));
fd = hold_lock_file_for_update(&write_lock, merge_rr_path, 1);
read_rr(&merge_rr);
extern int cmd_prune_packed(int argc, const char **argv, const char *prefix);
extern int cmd_push(int argc, const char **argv, const char *prefix);
extern int cmd_read_tree(int argc, const char **argv, const char *prefix);
+extern int cmd_reflog(int argc, const char **argv, const char *prefix);
extern int cmd_repo_config(int argc, const char **argv, const char *prefix);
extern int cmd_rerere(int argc, const char **argv, const char *prefix);
extern int cmd_rev_list(int argc, const char **argv, const char *prefix);
#include "cache.h"
#include "tag.h"
#include "commit.h"
+#include "pkt-line.h"
#include "utf8.h"
int save_commit_buffer = 1;
return;
graft_file = get_graft_file();
read_graft_file(graft_file);
+ /* make sure shallows are read */
+ is_repository_shallow();
commit_graft_prepared = 1;
}
return commit_graft[pos];
}
+int write_shallow_commits(int fd, int use_pack_protocol)
+{
+ int i, count = 0;
+ for (i = 0; i < commit_graft_nr; i++)
+ if (commit_graft[i]->nr_parent < 0) {
+ const char *hex =
+ sha1_to_hex(commit_graft[i]->sha1);
+ count++;
+ if (use_pack_protocol)
+ packet_write(fd, "shallow %s", hex);
+ else {
+ write(fd, hex, 40);
+ write(fd, "\n", 1);
+ }
+ }
+ return count;
+}
+
+int unregister_shallow(const unsigned char *sha1)
+{
+ int pos = commit_graft_pos(sha1);
+ if (pos < 0)
+ return -1;
+ if (pos + 1 < commit_graft_nr)
+ memcpy(commit_graft + pos, commit_graft + pos + 1,
+ sizeof(struct commit_graft *)
+ * (commit_graft_nr - pos - 1));
+ commit_graft_nr--;
+ return 0;
+}
+
int parse_commit_buffer(struct commit *item, void *buffer, unsigned long size)
{
char *tail = buffer;
free(rslt);
return result;
}
+
+int in_merge_bases(struct commit *rev1, struct commit *rev2)
+{
+ struct commit_list *bases, *b;
+ int ret = 0;
+
+ bases = get_merge_bases(rev1, rev2, 1);
+ for (b = bases; b; b = b->next) {
+ if (!hashcmp(rev1->object.sha1, b->item->object.sha1)) {
+ ret = 1;
+ break;
+ }
+ }
+
+ free_commit_list(bases);
+ return ret;
+}
struct commit_graft {
unsigned char sha1[20];
- int nr_parent;
+ int nr_parent; /* < 0 if shallow commit */
unsigned char parent[FLEX_ARRAY][20]; /* more */
};
extern struct commit_list *get_merge_bases(struct commit *rev1, struct commit *rev2, int cleanup);
+extern int register_shallow(const unsigned char *sha1);
+extern int unregister_shallow(const unsigned char *sha1);
+extern int write_shallow_commits(int fd, int use_pack_protocol);
+extern int is_repository_shallow();
+extern struct commit_list *get_shallow_commits(struct object_array *heads,
+ int depth, int shallow_flag, int not_shallow_flag);
+
+int in_merge_bases(struct commit *rev1, struct commit *rev2);
#endif /* COMMIT_H */
static int quiet;
static int verbose;
static int fetch_all;
+static int depth;
static const char fetch_pack_usage[] =
-"git-fetch-pack [--all] [-q] [-v] [-k] [--thin] [--exec=upload-pack] [host:]directory <refs>...";
+"git-fetch-pack [--all] [-q] [-v] [-k] [--thin] [--exec=upload-pack] [--depth=<n>] [host:]directory <refs>...";
static const char *exec = "git-upload-pack";
#define COMPLETE (1U << 0)
packet_write(fd[1], "want %s\n", sha1_to_hex(remote));
fetching++;
}
+ if (is_repository_shallow())
+ write_shallow_commits(fd[1], 1);
+ if (depth > 0)
+ packet_write(fd[1], "deepen %d", depth);
packet_flush(fd[1]);
if (!fetching)
return 1;
+ if (depth > 0) {
+ char line[1024];
+ unsigned char sha1[20];
+ int len;
+
+ while ((len = packet_read_line(fd[0], line, sizeof(line)))) {
+ if (!strncmp("shallow ", line, 8)) {
+ if (get_sha1_hex(line + 8, sha1))
+ die("invalid shallow line: %s", line);
+ register_shallow(sha1);
+ continue;
+ }
+ if (!strncmp("unshallow ", line, 10)) {
+ if (get_sha1_hex(line + 10, sha1))
+ die("invalid unshallow line: %s", line);
+ if (!lookup_object(sha1))
+ die("object not found: %s", line);
+ /* make sure that it is parsed as shallow */
+ parse_object(sha1);
+ if (unregister_shallow(sha1))
+ die("no shallow found: %s", line);
+ continue;
+ }
+ die("expected shallow/unshallow, got %s", line);
+ }
+ }
+
flushes = 0;
retval = -1;
while ((sha1 = get_rev())) {
if (!memcmp(ref->name, "refs/", 5) &&
check_ref_format(ref->name + 5))
; /* trash */
- else if (fetch_all) {
+ else if (fetch_all &&
+ (!depth || strncmp(ref->name, "refs/tags/", 10) )) {
*newtail = ref;
ref->next = NULL;
newtail = &ref->next;
}
}
- for_each_ref(mark_complete, NULL);
- if (cutoff)
- mark_recent_complete_commits(cutoff);
+ if (!depth) {
+ for_each_ref(mark_complete, NULL);
+ if (cutoff)
+ mark_recent_complete_commits(cutoff);
+ }
/*
* Mark all complete remote refs as common refs.
int status;
get_remote_heads(fd[0], &ref, 0, NULL, 0);
+ if (is_repository_shallow() && !server_supports("shallow"))
+ die("Server does not support shallow clients");
if (server_supports("multi_ack")) {
if (verbose)
fprintf(stderr, "Server supports multi_ack\n");
char *dest = NULL, **heads;
int fd[2];
pid_t pid;
+ struct stat st;
+ struct lock_file lock;
setup_git_directory();
verbose = 1;
continue;
}
+ if (!strncmp("--depth=", arg, 8)) {
+ depth = strtol(arg + 8, NULL, 0);
+ if (stat(git_path("shallow"), &st))
+ st.st_mtime = 0;
+ continue;
+ }
usage(fetch_pack_usage);
}
dest = arg;
}
}
+ if (!ret && depth > 0) {
+ struct cache_time mtime;
+ char *shallow = git_path("shallow");
+ int fd;
+
+ mtime.sec = st.st_mtime;
+#ifdef USE_NSEC
+ mtime.usec = st.st_mtim.usec;
+#endif
+ if (stat(shallow, &st)) {
+ if (mtime.sec)
+ die("shallow file was removed during fetch");
+ } else if (st.st_mtime != mtime.sec
+#ifdef USE_NSEC
+ || st.st_mtim.usec != mtime.usec
+#endif
+ )
+ die("shallow file was changed during fetch");
+
+ fd = hold_lock_file_for_update(&lock, shallow, 1);
+ if (!write_shallow_commits(fd, 0)) {
+ unlink(shallow);
+ rollback_lock_file(&lock);
+ } else {
+ close(fd);
+ commit_lock_file(&lock);
+ }
+ }
+
return !!ret;
}
static int default_refs;
+static int fsck_handle_reflog_ent(unsigned char *osha1, unsigned char *nsha1, char *detail, void *cb_data)
+{
+ struct object *obj;
+
+ if (!is_null_sha1(osha1)) {
+ obj = lookup_object(osha1);
+ if (obj) {
+ obj->used = 1;
+ mark_reachable(obj, REACHABLE);
+ }
+ }
+ obj = lookup_object(nsha1);
+ if (obj) {
+ obj->used = 1;
+ mark_reachable(obj, REACHABLE);
+ }
+ return 0;
+}
+
static int fsck_handle_ref(const char *refname, const unsigned char *sha1, int flag, void *cb_data)
{
struct object *obj;
default_refs++;
obj->used = 1;
mark_reachable(obj, REACHABLE);
+
+ for_each_reflog_ent(refname, fsck_handle_reflog_ent, NULL);
+
return 0;
}
[--interactive] [--whitespace=<option>] <mbox>...
or, when resuming [--skip | --resolved]'
. git-sh-setup
+set_reflog_action am
git var GIT_COMMITTER_IDENT >/dev/null || exit
}
prec=4
-rloga=am
dotest=.dotest sign= utf8= keep= skip= interactive= resolved= binary= ws= resolvemsg=
while case "$#" in 0) break;; esac
--resolvemsg=*)
resolvemsg=$(echo "$1" | sed -e "s/^--resolvemsg=//"); shift ;;
- --reflog-action=*)
- rloga=`expr "z$1" : 'z-[^=]*=\(.*\)'`; shift ;;
-
--)
shift; break ;;
-*)
parent=$(git-rev-parse --verify HEAD) &&
commit=$(git-commit-tree $tree -p $parent <"$dotest/final-commit") &&
echo Committed: $commit &&
- git-update-ref -m "$rloga: $SUBJECT" HEAD $commit $parent ||
+ git-update-ref -m "$GIT_REFLOG_ACTION: $SUBJECT" HEAD $commit $parent ||
stop_here $this
if test -x "$GIT_DIR"/hooks/post-applypatch
}
usage() {
- die "Usage: $0 [--template=<template_directory>] [--reference <reference-repo>] [--bare] [-l [-s]] [-q] [-u <upload-pack>] [--origin <name>] [-n] <repo> [<dir>]"
+ die "Usage: $0 [--template=<template_directory>] [--reference <reference-repo>] [--bare] [-l [-s]] [-q] [-u <upload-pack>] [--origin <name>] [--depth <n>] [-n] <repo> [<dir>]"
}
get_repo_base() {
origin=
origin_override=
use_separate_remote=t
+depth=
while
case "$#,$1" in
0,*) break ;;
*,-u|*,--upload-pack)
shift
upload_pack="--exec=$1" ;;
+ 1,--depth) usage;;
+ *,--depth)
+ shift
+ depth="--depth=$1";;
*,-*) usage ;;
*) break ;;
esac
*)
case "$repo" in
rsync://*)
+ case "$depth" in
+ "") ;;
+ *) die "shallow over rsync not supported" ;;
+ esac
rsync $quiet -av --ignore-existing \
--exclude info "$repo/objects/" "$GIT_DIR/objects/" ||
exit
git-ls-remote "$repo" >"$GIT_DIR/CLONE_HEAD" || exit 1
;;
https://*|http://*|ftp://*)
+ case "$depth" in
+ "") ;;
+ *) die "shallow over http or ftp not supported" ;;
+ esac
if test -z "@@NO_CURL@@"
then
clone_dumb_http "$repo" "$D"
;;
*)
case "$upload_pack" in
- '') git-fetch-pack --all -k $quiet "$repo" ;;
- *) git-fetch-pack --all -k $quiet "$upload_pack" "$repo" ;;
+ '') git-fetch-pack --all -k $quiet $depth "$repo" ;;
+ *) git-fetch-pack --all -k $quiet "$upload_pack" $depth "$repo" ;;
esac >"$GIT_DIR/CLONE_HEAD" ||
die "fetch-pack from '$repo' failed."
;;
# Set up the mappings to track the remote branches.
git-repo-config remote."$origin".fetch \
- "refs/heads/*:$remote_top/*" '^$' &&
+ "+refs/heads/*:$remote_top/*" '^$' &&
rm -f "refs/remotes/$origin/HEAD"
git-symbolic-ref "refs/remotes/$origin/HEAD" \
"refs/remotes/$origin/$head_points_at" &&
USAGE='<fetch-options> <repository> <refspec>...'
SUBDIRECTORY_OK=Yes
. git-sh-setup
+set_reflog_action "fetch $*"
+
TOP=$(git-rev-parse --show-cdup)
if test ! -z "$TOP"
then
'
IFS="$LF"
-rloga=fetch
no_tags=
tags=
append=
exec=
upload_pack=
keep=
+shallow_depth=
while case "$#" in 0) break ;; esac
do
case "$1" in
-k|--k|--ke|--kee|--keep)
keep='-k -k'
;;
- --reflog-action=*)
- rloga=`expr "z$1" : 'z-[^=]*=\(.*\)'`
+ --depth=*)
+ shallow_depth="--depth=`expr "z$1" : 'z-[^=]*=\(.*\)'`"
+ ;;
+ --depth)
+ shift
+ shallow_depth="--depth=$1"
;;
-*)
usage
rref=
rsync_slurped_objects=
-rloga="$rloga $remote_nick"
-test "$remote_nick" = "$remote" || rloga="$rloga $remote"
-
if test "" = "$append"
then
: >"$GIT_DIR/FETCH_HEAD"
else
echo >&2 "* $1: updating with $3"
echo >&2 " $label_: $newshort_"
- git-update-ref -m "$rloga: updating tag" "$1" "$2"
+ git-update-ref -m "$GIT_REFLOG_ACTION: updating tag" "$1" "$2"
fi
else
echo >&2 "* $1: storing $3"
echo >&2 " $label_: $newshort_"
- git-update-ref -m "$rloga: storing tag" "$1" "$2"
+ git-update-ref -m "$GIT_REFLOG_ACTION: storing tag" "$1" "$2"
fi
;;
*,$local)
echo >&2 "* $1: fast forward to $3"
echo >&2 " old..new: $oldshort_..$newshort_"
- git-update-ref -m "$rloga: fast-forward" "$1" "$2" "$local"
+ git-update-ref -m "$GIT_REFLOG_ACTION: fast-forward" "$1" "$2" "$local"
;;
*)
false
*,t,*)
echo >&2 "* $1: forcing update to non-fast forward $3"
echo >&2 " old...new: $oldshort_...$newshort_"
- git-update-ref -m "$rloga: forced-update" "$1" "$2" "$local"
+ git-update-ref -m "$GIT_REFLOG_ACTION: forced-update" "$1" "$2" "$local"
;;
*)
echo >&2 "* $1: not updating to non-fast forward $3"
else
echo >&2 "* $1: storing $3"
echo >&2 " $label_: $newshort_"
- git-update-ref -m "$rloga: storing head" "$1" "$2"
+ git-update-ref -m "$GIT_REFLOG_ACTION: storing head" "$1" "$2"
fi
;;
esac
# There are transports that can fetch only one head at a time...
case "$remote" in
http://* | https://* | ftp://*)
+ test -n "$shallow_depth" &&
+ die "shallow clone with http not supported"
proto=`expr "$remote" : '\([^:]*\):'`
if [ -n "$GIT_SSL_NO_VERIFY" ]; then
curl_extra_args="-k"
git-http-fetch -v -a "$head" "$remote/" || exit
;;
rsync://*)
+ test -n "$shallow_depth" &&
+ die "shallow clone with rsync not supported"
TMP_HEAD="$GIT_DIR/TMP_HEAD"
rsync -L -q "$remote/$remote_name" "$TMP_HEAD" || exit 1
head=$(git-rev-parse --verify TMP_HEAD)
pack_lockfile=
IFS=" $LF"
(
- git-fetch-pack --thin $exec $keep "$remote" $rref || echo failed "$remote"
+ git-fetch-pack --thin $exec $keep $shallow_depth "$remote" $rref || echo failed "$remote"
) |
while read sha1 remote_name
do
case "$taglist" in
'') ;;
?*)
+ # do not deepen a shallow tree when following tags
+ shallow_depth=
fetch_main "$taglist" || exit ;;
esac
esac
if test "$curr_head" != "$orig_head"
then
git-update-ref \
- -m "$rloga: Undoing incorrectly fetched HEAD." \
+ -m "$GIT_REFLOG_ACTION: Undoing incorrectly fetched HEAD." \
HEAD "$orig_head"
die "Cannot fetch into the current branch."
fi
--- /dev/null
+#!/bin/sh
+#
+# Copyright (c) 2006, Shawn O. Pearce
+#
+# Cleanup unreachable files and optimize the repository.
+
+USAGE=''
+SUBDIRECTORY_OK=Yes
+. git-sh-setup
+
+git-pack-refs --prune &&
+git-reflog expire --all &&
+git-repack -a -d -l &&
+git-prune &&
+git-rerere gc || exit
# Copyright (c) 2005 Junio C Hamano
#
-USAGE='[-n] [--no-commit] [--squash] [-s <strategy>] [--reflog-action=<action>] [-m=<merge-message>] <commit>+'
+USAGE='[-n] [--no-commit] [--squash] [-s <strategy>] [-m=<merge-message>] <commit>+'
. git-sh-setup
+set_reflog_action "merge $*"
LF='
'
finish () {
if test '' = "$2"
then
- rlogm="$rloga"
+ rlogm="$GIT_REFLOG_ACTION"
else
echo "$2"
- rlogm="$rloga: $2"
+ rlogm="$GIT_REFLOG_ACTION: $2"
fi
case "$squash" in
t)
case "$#" in 0) usage ;; esac
-rloga= have_message=
+have_message=
while case "$#" in 0) break ;; esac
do
case "$1" in
die "available strategies are: $all_strategies" ;;
esac
;;
- --reflog-action=*)
- rloga=`expr "z$1" : 'z-[^=]*=\(.*\)'`
- ;;
-m=*|--m=*|--me=*|--mes=*|--mess=*|--messa=*|--messag=*|--message=*)
merge_msg=`expr "z$1" : 'z-[^=]*=\(.*\)'`
have_message=t
# All the rest are remote heads
test "$#" = 0 && usage ;# we need at least one remote head.
-test "$rloga" = '' && rloga="merge: $@"
remoteheads=
for remote
'')
case "$#" in
1)
- use_strategies="$default_twohead_strategies" ;;
+ var="`git-repo-config --get pull.twohead`"
+ if test -n "$var"
+ then
+ use_strategies="$var"
+ else
+ use_strategies="$default_twohead_strategies"
+ fi ;;
*)
- use_strategies="$default_octopus_strategies" ;;
+ var="`git-repo-config --get pull.octopus`"
+ if test -n "$var"
+ then
+ use_strategies="$var"
+ else
+ use_strategies="$default_octopus_strategies"
+ fi ;;
esac
;;
esac
USAGE='[-n | --no-summary] [--no-commit] [-s strategy]... [<fetch-options>] <repo> <head>...'
LONG_USAGE='Fetch one or more remote refs and merge it/them into the current HEAD.'
. git-sh-setup
+set_reflog_action "pull $*"
strategy_args= no_summary= no_commit= squash=
while case "$#,$1" in 0) break ;; *,-*) ;; *) break ;; esac
done
orig_head=$(git-rev-parse --verify HEAD 2>/dev/null)
-git-fetch --update-head-ok --reflog-action=pull "$@" || exit 1
+git-fetch --update-head-ok "$@" || exit 1
curr_head=$(git-rev-parse --verify HEAD 2>/dev/null)
if test "$curr_head" != "$orig_head"
echo >&2 "Cannot merge multiple branches into empty head"
exit 1
fi
- var=`git-repo-config --get pull.octopus`
- if test -n "$var"
- then
- strategy_default_args="-s $var"
- fi
- ;;
-*)
- var=`git-repo-config --get pull.twohead`
- if test -n "$var"
- then
- strategy_default_args="-s $var"
- fi
;;
esac
exit
fi
-case "$strategy_args" in
-'')
- strategy_args=$strategy_default_args
- ;;
-esac
-
merge_name=$(git-fmt-merge-msg <"$GIT_DIR/FETCH_HEAD") || exit
-git-merge "--reflog-action=pull $*" \
- $no_summary $no_commit $squash $strategy_args \
+exec git-merge $no_summary $no_commit $squash $strategy_args \
"$merge_name" HEAD $merge_head
D---E---F---G master D---E---F---G master
'
. git-sh-setup
+set_reflog_action rebase
RESOLVEMSG="
When you have resolved this problem run \"git rebase --continue\".
call_merge () {
cmt="$(cat $dotest/cmt.$1)"
echo "$cmt" > "$dotest/current"
- git-merge-$strategy "$cmt^" -- HEAD "$cmt"
+ hd=$(git-rev-parse --verify HEAD)
+ cmt_name=$(git-symbolic-ref HEAD)
+ msgnum=$(cat $dotest/msgnum)
+ end=$(cat $dotest/end)
+ eval GITHEAD_$cmt='"${cmt_name##refs/heads/}~$(($end - $msgnum))"'
+ eval GITHEAD_$hd='"$(cat $dotest/onto_name)"'
+ export GITHEAD_$cmt GITHEAD_$hd
+ git-merge-$strategy "$cmt^" -- "$hd" "$cmt"
rv=$?
case "$rv" in
0)
+ unset GITHEAD_$cmt GITHEAD_$hd
return
;;
1)
finish_rb_merge
exit
fi
- git am --resolved --3way --resolvemsg="$RESOLVEMSG" \
- --reflog-action=rebase
+ git am --resolved --3way --resolvemsg="$RESOLVEMSG"
exit
;;
--skip)
finish_rb_merge
exit
fi
- git am -3 --skip --resolvemsg="$RESOLVEMSG" \
- --reflog-action=rebase
+ git am -3 --skip --resolvemsg="$RESOLVEMSG"
exit
;;
--abort)
if test -z "$do_merge"
then
git-format-patch -k --stdout --full-index --ignore-if-in-upstream "$upstream"..ORIG_HEAD |
- git am --binary -3 -k --resolvemsg="$RESOLVEMSG" \
- --reflog-action=rebase
+ git am --binary -3 -k --resolvemsg="$RESOLVEMSG"
exit $?
fi
mkdir -p "$dotest"
echo "$onto" > "$dotest/onto"
+echo "$onto_name" > "$dotest/onto_name"
prev_head=`git-rev-parse HEAD^0`
echo "$prev_head" > "$dotest/prev_head"
esac
args="$args $local $quiet $no_reuse_delta$extra"
-name=$(git-pack-objects --non-empty --all $args </dev/null "$PACKTMP") ||
+name=$(git-pack-objects --non-empty --all --reflog $args </dev/null "$PACKTMP") ||
exit 1
if [ -z "$name" ]; then
echo Nothing new to pack.
USAGE='[--mixed | --soft | --hard] [<commit-ish>] [ [--] <paths>...]'
SUBDIRECTORY_OK=Yes
. git-sh-setup
+set_reflog_action "reset $*"
update= reset_type=--mixed
unset rev
else
rm -f "$GIT_DIR/ORIG_HEAD"
fi
-git-update-ref -m "reset $reset_type $*" HEAD "$rev"
+git-update-ref -m "$GIT_REFLOG_ACTION" HEAD "$rev"
update_ref_status=$?
case "$reset_type" in
if (!defined $from) {
$from = $author || $committer;
do {
- $_ = $term->readline("Who should the emails appear to be from? ",
- $from);
+ $_ = $term->readline("Who should the emails appear to be from? [$from] ");
} while (!defined $_);
- $from = $_;
+ $from = $_ if ($_);
print "Emails will be sent from: ", $from, "\n";
$prompting++;
}
die "Usage: $0 $USAGE"
}
+set_reflog_action() {
+ if [ -z "${GIT_REFLOG_ACTION:+set}" ]
+ then
+ GIT_REFLOG_ACTION="$*"
+ export GIT_REFLOG_ACTION
+ fi
+}
+
if [ -z "$LONG_USAGE" ]
then
LONG_USAGE="Usage: $0 $USAGE"
$_limit, $_verbose, $_incremental, $_oneline, $_l_fmt, $_show_commit,
$_version, $_upgrade, $_authors, $_branch_all_refs, @_opt_m,
$_merge, $_strategy, $_dry_run, $_ignore_nodate, $_non_recursive,
- $_username, $_config_dir, $_no_auth_cache, $_xfer_delta,
+ $_username, $_config_dir, $_no_auth_cache,
$_pager, $_color);
my (@_branch_from, %tree_map, %users, %rusers, %equiv);
my ($_svn_can_do_switch);
}
sub version {
- print "git-svn version $VERSION\n";
+ print "git-svn version $VERSION (svn $SVN::Core::VERSION)\n";
exit 0;
}
}
return if $_dry_run;
fetch();
- my @diff = command('diff-tree', $head, $gs, '--');
+ my @diff = command('diff-tree', 'HEAD', $gs, '--');
my @finish;
if (@diff) {
@finish = qw/rebase/;
push @finish, qw/--merge/ if $_merge;
push @finish, "--strategy=$_strategy" if $_strategy;
- print STDERR "W: $head and $gs differ, using @finish:\n", @diff;
+ print STDERR "W: HEAD and $gs differ, using @finish:\n", @diff;
} else {
- print "No changes between current $head and $gs\n",
+ print "No changes between current HEAD and $gs\n",
"Resetting to the latest $gs\n";
@finish = qw/reset --mixed/;
}
sub verify_ref {
my ($ref) = @_;
- eval { command_oneline([ 'rev-parse', $ref ], { STDERR => 0 }) };
+ eval { command_oneline([ 'rev-parse', '--verify', $ref ],
+ { STDERR => 0 }); };
}
sub repo_path_split {
config => $config,
pool => SVN::Pool->new,
auth_provider_callbacks => $callbacks);
-
- my $df = $ENV{GIT_SVN_DELTA_FETCH};
- if (defined $df) {
- $_xfer_delta = $df;
- } else {
- $_xfer_delta = ($url =~ m#^file://#) ? undef : 1;
- }
$ra->{svn_path} = $url;
$ra->{repos_root} = $ra->get_repos_root;
$ra->{svn_path} =~ s#^\Q$ra->{repos_root}\E/*##;
auth auth_provider_callbacks repos_root svn_path/);
}
-sub libsvn_get_file {
- my ($gui, $f, $rev, $chg, $untracked) = @_;
- $f =~ s#^/##;
- print "\t$chg\t$f\n" unless $_q;
-
- my ($hash, $pid, $in, $out);
- my $pool = SVN::Pool->new;
- defined($pid = open3($in, $out, '>&STDERR',
- qw/git-hash-object -w --stdin/)) or croak $!;
- # redirect STDOUT for SVN 1.1.x compatibility
- open my $stdout, '>&', \*STDOUT or croak $!;
- open STDOUT, '>&', $in or croak $!;
- my ($r, $props) = $SVN->get_file($f, $rev, \*STDOUT, $pool);
- $in->flush == 0 or croak $!;
- open STDOUT, '>&', $stdout or croak $!;
- close $in or croak $!;
- close $stdout or croak $!;
- $pool->clear;
- chomp($hash = do { local $/; <$out> });
- close $out or croak $!;
- waitpid $pid, 0;
- $hash =~ /^$sha1$/o or die "not a sha1: $hash\n";
-
- my $mode = exists $props->{'svn:executable'} ? '100755' : '100644';
- if (exists $props->{'svn:special'}) {
- $mode = '120000';
- my $link = `git-cat-file blob $hash`; # no chomping symlinks
- $link =~ s/^link // or die "svn:special file with contents: <",
- $link, "> is not understood\n";
- defined($pid = open3($in, $out, '>&STDERR',
- qw/git-hash-object -w --stdin/)) or croak $!;
- print $in $link;
- $in->flush == 0 or croak $!;
- close $in or croak $!;
- chomp($hash = do { local $/; <$out> });
- close $out or croak $!;
- waitpid $pid, 0;
- $hash =~ /^$sha1$/o or die "not a sha1: $hash\n";
- }
- %{$untracked->{file_prop}->{$f}} = %$props;
- print $gui $mode,' ',$hash,"\t",$f,"\0" or croak $!;
-}
-
sub uri_encode {
my ($f) = @_;
$f =~ s#([^a-zA-Z0-9\*!\:_\./\-])#uc sprintf("%%%02x",ord($1))#eg;
}
sub libsvn_fetch {
- $_xfer_delta ? libsvn_fetch_delta(@_) : libsvn_fetch_full(@_);
-}
-
-sub libsvn_fetch_delta {
my ($last_commit, $paths, $rev, $author, $date, $msg) = @_;
my $pool = SVN::Pool->new;
my $ed = SVN::Git::Fetcher->new({ c => $last_commit, q => $_q });
libsvn_log_entry($rev, $author, $date, $msg, [$last_commit], $ed);
}
-sub libsvn_fetch_full {
- my ($last_commit, $paths, $rev, $author, $date, $msg) = @_;
- my ($gui, $ctx) = command_input_pipe(qw/update-index -z --index-info/);
- my %amr;
- my $ut = { empty => {}, dir_prop => {}, file_prop => {} };
- my $p = $SVN->{svn_path};
- foreach my $f (keys %$paths) {
- my $m = $paths->{$f}->action();
- if (length $p) {
- $f =~ s#^/\Q$p\E/##;
- next if $f =~ m#^/#;
- } else {
- $f =~ s#^/##;
- }
- if ($m =~ /^[DR]$/) {
- my $t = process_rm($gui, $last_commit, $f, $_q);
- if ($m eq 'D') {
- $ut->{empty}->{$f} = 0 if $t == $SVN::Node::dir;
- next;
- }
- # 'R' can be file replacements, too, right?
- }
- my $pool = SVN::Pool->new;
- my $t = $SVN->check_path($f, $rev, $pool);
- if ($t == $SVN::Node::file) {
- if ($m =~ /^[AMR]$/) {
- $amr{$f} = $m;
- } else {
- die "Unrecognized action: $m, ($f r$rev)\n";
- }
- } elsif ($t == $SVN::Node::dir && $m =~ /^[AR]$/) {
- my @traversed = ();
- libsvn_traverse($gui, '', $f, $rev, \@traversed, $ut);
- if (@traversed) {
- foreach (@traversed) {
- $amr{$_} = $m;
- }
- } else {
- my ($dir, $file) = ($f =~ m#^(.*?)/?([^/]+)$#);
- delete $ut->{empty}->{$dir};
- $ut->{empty}->{$f} = 1;
- }
- }
- $pool->clear;
- }
- foreach (keys %amr) {
- libsvn_get_file($gui, $_, $rev, $amr{$_}, $ut);
- my ($d) = ($_ =~ m#^(.*?)/?(?:[^/]+)$#);
- delete $ut->{empty}->{$d};
- }
- unless (exists $ut->{dir_prop}->{''}) {
- my $pool = SVN::Pool->new;
- my (undef, undef, $props) = $SVN->get_dir('', $rev, $pool);
- %{$ut->{dir_prop}->{''}} = %$props;
- $pool->clear;
- }
- command_close_pipe($gui, $ctx);
- libsvn_log_entry($rev, $author, $date, $msg, [$last_commit], $ut);
-}
-
sub svn_grab_base_rev {
my $c = eval { command_oneline([qw/rev-parse --verify/,
"refs/remotes/$GIT_SVN^0"],
"Try using the command-line svn client instead\n";
}
-sub libsvn_traverse {
- my ($gui, $pfx, $path, $rev, $files, $untracked) = @_;
- my $cwd = length $pfx ? "$pfx/$path" : $path;
- my $pool = SVN::Pool->new;
- $cwd =~ s#^\Q$SVN->{svn_path}\E##;
- my $nr = 0;
- my ($dirent, $r, $props) = $SVN->get_dir($cwd, $rev, $pool);
- %{$untracked->{dir_prop}->{$cwd}} = %$props;
- foreach my $d (keys %$dirent) {
- my $t = $dirent->{$d}->kind;
- if ($t == $SVN::Node::dir) {
- my $i = libsvn_traverse($gui, $cwd, $d, $rev,
- $files, $untracked);
- if ($i) {
- $nr += $i;
- } else {
- $untracked->{empty}->{"$cwd/$d"} = 1;
- }
- } elsif ($t == $SVN::Node::file) {
- $nr++;
- my $file = "$cwd/$d";
- if (defined $files) {
- push @$files, $file;
- } else {
- libsvn_get_file($gui, $file, $rev, 'A',
- $untracked);
- my ($dir) = ($file =~ m#^(.*?)/?(?:[^/]+)$#);
- delete $untracked->{empty}->{$dir};
- }
- }
- }
- $pool->clear;
- $nr;
-}
-
sub libsvn_traverse_ignore {
my ($fh, $path, $r) = @_;
$path =~ s#^/+##g;
print STDERR "Found branch parent: ($GIT_SVN) $parent\n";
command_noisy('read-tree', $parent);
unless (libsvn_can_do_switch()) {
- return libsvn_fetch_full($parent, $paths, $rev,
- $author, $date, $msg);
+ return _libsvn_new_tree($paths, $rev, $author, $date,
+ $msg, [$parent]);
}
# do_switch works with svn/trunk >= r22312, but that is not
# included with SVN 1.4.2 (the latest version at the moment),
sub libsvn_get_log {
my ($ra, @args) = @_;
- $args[4]-- if $args[4] && $_xfer_delta && ! $_follow_parent;
+ $args[4]-- if $args[4] && ! $_follow_parent;
if ($SVN::Core::VERSION le '1.2.0') {
splice(@args, 3, 1);
}
if (my $log_entry = libsvn_find_parent_branch(@_)) {
return $log_entry;
}
- my ($paths, $rev, $author, $date, $msg) = @_;
- my $ut;
- if ($_xfer_delta) {
- my $pool = SVN::Pool->new;
- my $ed = SVN::Git::Fetcher->new({q => $_q});
- my $reporter = $SVN->do_update($rev, '', 1, $ed, $pool);
- my @lock = $SVN::Core::VERSION ge '1.2.0' ? (undef) : ();
- $reporter->set_path('', $rev, 1, @lock, $pool);
- $reporter->finish_report($pool);
- $pool->clear;
- unless ($ed->{git_commit_ok}) {
- die "SVN connection failed somewhere...\n";
- }
- $ut = $ed;
- } else {
- $ut = { empty => {}, dir_prop => {}, file_prop => {} };
- my ($gui, $ctx) = command_input_pipe(qw/update-index
- -z --index-info/);
- libsvn_traverse($gui, '', $SVN->{svn_path}, $rev, undef, $ut);
- command_close_pipe($gui, $ctx);
+ my ($paths, $rev, $author, $date, $msg) = @_; # $pool is last
+ _libsvn_new_tree($paths, $rev, $author, $date, $msg, []);
+}
+
+sub _libsvn_new_tree {
+ my ($paths, $rev, $author, $date, $msg, $parents) = @_;
+ my $pool = SVN::Pool->new;
+ my $ed = SVN::Git::Fetcher->new({q => $_q});
+ my $reporter = $SVN->do_update($rev, '', 1, $ed, $pool);
+ my @lock = $SVN::Core::VERSION ge '1.2.0' ? (undef) : ();
+ $reporter->set_path('', $rev, 1, @lock, $pool);
+ $reporter->finish_report($pool);
+ $pool->clear;
+ unless ($ed->{git_commit_ok}) {
+ die "SVN connection failed somewhere...\n";
}
- libsvn_log_entry($rev, $author, $date, $msg, [], $ut);
+ libsvn_log_entry($rev, $author, $date, $msg, $parents, $ed);
}
sub find_graft_path_commit {
my $pool = SVN::Pool->new;
my $r = defined $_revision ? $_revision : $ra->get_latest_revnum;
my ($dirent, undef, undef) = $ra->get_dir('', $r, $pool);
- foreach my $d (keys %$dirent) {
+ foreach my $d (sort keys %$dirent) {
if ($dirent->{$d}->kind == $SVN::Node::dir) {
push @ret, "$d/"; # add '/' for compat with cli svn
}
{ "prune-packed", cmd_prune_packed, RUN_SETUP },
{ "push", cmd_push, RUN_SETUP },
{ "read-tree", cmd_read_tree, RUN_SETUP },
+ { "reflog", cmd_reflog, RUN_SETUP },
{ "repo-config", cmd_repo_config },
{ "rerere", cmd_rerere, RUN_SETUP },
{ "rev-list", cmd_rev_list, RUN_SETUP },
use File::Basename qw(basename);
binmode STDOUT, ':utf8';
+BEGIN {
+ CGI->compile() if $ENV{MOD_PERL};
+}
+
our $cgi = new CGI;
our $version = "++GIT_VERSION++";
our $my_url = $cgi->url();
}
sub parse_commit_text {
- my ($commit_text) = @_;
+ my ($commit_text, $withparents) = @_;
my @commit_lines = split '\n', $commit_text;
my %co;
if (!($header =~ m/^[0-9a-fA-F]{40}/)) {
return;
}
- $co{'id'} = $header;
- my @parents;
+ ($co{'id'}, my @parents) = split ' ', $header;
while (my $line = shift @commit_lines) {
last if $line eq "\n";
if ($line =~ m/^tree ([0-9a-fA-F]{40})$/) {
$co{'tree'} = $1;
- } elsif ($line =~ m/^parent ([0-9a-fA-F]{40})$/) {
+ } elsif ((!defined $withparents) && ($line =~ m/^parent ([0-9a-fA-F]{40})$/)) {
push @parents, $1;
} elsif ($line =~ m/^author (.*) ([0-9]+) (.*)$/) {
$co{'author'} = $1;
local $/ = "\0";
open my $fd, "-|", git_cmd(), "rev-list",
+ "--parents",
"--header",
"--max-count=1",
$commit_id,
"--",
or die_error(undef, "Open git-rev-list failed");
- %co = parse_commit_text(<$fd>);
+ %co = parse_commit_text(<$fd>, 1);
close $fd;
return %co;
}
print $cgi->header(-type=>$content_type, -charset => 'utf-8',
-status=> $status, -expires => $expires);
+ my $mod_perl_version = $ENV{'MOD_PERL'} ? " $ENV{'MOD_PERL'}" : '';
print <<EOF;
<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<!-- git core binaries version $git_version -->
<head>
<meta http-equiv="content-type" content="$content_type; charset=utf-8"/>
-<meta name="generator" content="gitweb/$version git/$git_version"/>
+<meta name="generator" content="gitweb/$version git/$git_version$mod_perl_version"/>
<meta name="robots" content="index, nofollow"/>
<title>$title</title>
EOF
const char *committer;
if (log_all_ref_updates &&
- !strncmp(lock->ref_name, "refs/heads/", 11)) {
+ (!strncmp(lock->ref_name, "refs/heads/", 11) ||
+ !strncmp(lock->ref_name, "refs/remotes/", 13))) {
if (safe_create_leading_directories(lock->log_file) < 0)
return error("unable to create directory for %s",
lock->log_file);
{
const char *logfile, *logdata, *logend, *rec, *lastgt, *lastrec;
char *tz_c;
- int logfd, tz;
+ int logfd, tz, reccnt = 0;
struct stat st;
unsigned long date;
unsigned char logged_sha1[20];
lastrec = NULL;
rec = logend = logdata + st.st_size;
while (logdata < rec) {
+ reccnt++;
if (logdata < rec && *(rec-1) == '\n')
rec--;
lastgt = NULL;
if (get_sha1_hex(logdata, sha1))
die("Log %s is corrupt.", logfile);
munmap((void*)logdata, st.st_size);
- fprintf(stderr, "warning: Log %s only goes back to %s.\n",
- logfile, show_rfc2822_date(date, tz));
+ if (at_time)
+ fprintf(stderr, "warning: Log %s only goes back to %s.\n",
+ logfile, show_rfc2822_date(date, tz));
+ else
+ fprintf(stderr, "warning: Log %s only has %d entries.\n",
+ logfile, reccnt);
return 0;
}
+
+void for_each_reflog_ent(const char *ref, each_reflog_ent_fn fn, void *cb_data)
+{
+ const char *logfile;
+ FILE *logfp;
+ char buf[1024];
+
+ logfile = git_path("logs/%s", ref);
+ logfp = fopen(logfile, "r");
+ if (!logfp)
+ return;
+ while (fgets(buf, sizeof(buf), logfp)) {
+ unsigned char osha1[20], nsha1[20];
+ int len;
+
+ /* old SP new SP name <email> SP time TAB msg LF */
+ len = strlen(buf);
+ if (len < 83 || buf[len-1] != '\n' ||
+ get_sha1_hex(buf, osha1) || buf[40] != ' ' ||
+ get_sha1_hex(buf + 41, nsha1) || buf[81] != ' ')
+ continue; /* corrupt? */
+ fn(osha1, nsha1, buf+82, cb_data);
+ }
+ fclose(logfp);
+}
+
/** Reads log for the value of ref during at_time. **/
extern int read_ref_at(const char *ref, unsigned long at_time, int cnt, unsigned char *sha1);
+/* iterate over reflog entries */
+typedef int each_reflog_ent_fn(unsigned char *osha1, unsigned char *nsha1, char *, void *);
+void for_each_reflog_ent(const char *ref, each_reflog_ent_fn fn, void *cb_data);
+
/** Returns 0 if target has the right format for a ref. **/
extern int check_ref_format(const char *target);
revs->commits = newlist;
}
-static int all_flags;
-static struct rev_info *all_revs;
+struct all_refs_cb {
+ int all_flags;
+ int warned_bad_reflog;
+ struct rev_info *all_revs;
+ const char *name_for_errormsg;
+};
static int handle_one_ref(const char *path, const unsigned char *sha1, int flag, void *cb_data)
{
- struct object *object = get_reference(all_revs, path, sha1, all_flags);
- add_pending_object(all_revs, object, "");
+ struct all_refs_cb *cb = cb_data;
+ struct object *object = get_reference(cb->all_revs, path, sha1,
+ cb->all_flags);
+ add_pending_object(cb->all_revs, object, "");
return 0;
}
static void handle_all(struct rev_info *revs, unsigned flags)
{
- all_revs = revs;
- all_flags = flags;
- for_each_ref(handle_one_ref, NULL);
+ struct all_refs_cb cb;
+ cb.all_revs = revs;
+ cb.all_flags = flags;
+ for_each_ref(handle_one_ref, &cb);
+}
+
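+/*
+ * Add a commit recorded in a reflog entry as an extra starting point
+ * for the walk, warning once per ref if the reflog mentions a commit
+ * that has been pruned.
+ */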
+static void handle_one_reflog_commit(unsigned char *sha1, void *cb_data)
+{
+ struct all_refs_cb *cb = cb_data;
+ if (!is_null_sha1(sha1)) {
+ struct object *o = parse_object(sha1);
+ if (o) {
+ o->flags |= cb->all_flags;
+ add_pending_object(cb->all_revs, o, "");
+ }
+ else if (!cb->warned_bad_reflog) {
+ warn("reflog of '%s' references pruned commits",
+ cb->name_for_errormsg);
+ cb->warned_bad_reflog = 1;
+ }
+ }
+}
+
+static int handle_one_reflog_ent(unsigned char *osha1, unsigned char *nsha1, char *detail, void *cb_data)
+{
+ handle_one_reflog_commit(osha1, cb_data);
+ handle_one_reflog_commit(nsha1, cb_data);
+ return 0;
+}
+
+static int handle_one_reflog(const char *path, const unsigned char *sha1, int flag, void *cb_data)
+{
+ struct all_refs_cb *cb = cb_data;
+ cb->warned_bad_reflog = 0;
+ cb->name_for_errormsg = path;
+ for_each_reflog_ent(path, handle_one_reflog_ent, cb_data);
+ return 0;
+}
+
+static void handle_reflog(struct rev_info *revs, unsigned flags)
+{
+ struct all_refs_cb cb;
+ cb.all_revs = revs;
+ cb.all_flags = flags;
+ for_each_ref(handle_one_reflog, &cb);
}
static int add_parents_only(struct rev_info *revs, const char *arg, int flags)
handle_all(revs, flags);
continue;
}
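+		/* --reflog: also start the walk from every commit mentioned in the reflogs of all refs */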
+ if (!strcmp(arg, "--reflog")) {
+ handle_reflog(revs, flags);
+ continue;
+ }
if (!strcmp(arg, "--not")) {
flags ^= UNINTERESTING;
continue;
--- /dev/null
+#include "cache.h"
+#include "commit.h"
+#include "tag.h"
+
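+/* cached result of is_repository_shallow(): -1 = not checked yet, 0 = no, 1 = yes */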
+static int is_shallow = -1;
+
+int register_shallow(const unsigned char *sha1)
+{
+ struct commit_graft *graft =
+ xmalloc(sizeof(struct commit_graft));
+ struct commit *commit = lookup_commit(sha1);
+
+ hashcpy(graft->sha1, sha1);
+ graft->nr_parent = -1;
+ if (commit && commit->object.parsed)
+ commit->parents = NULL;
+ return register_commit_graft(graft, 0);
+}
+
+int is_repository_shallow()
+{
+ FILE *fp;
+ char buf[1024];
+
+ if (is_shallow >= 0)
+ return is_shallow;
+
+ fp = fopen(git_path("shallow"), "r");
+ if (!fp) {
+ is_shallow = 0;
+ return is_shallow;
+ }
+ is_shallow = 1;
+
+ while (fgets(buf, sizeof(buf), fp)) {
+ unsigned char sha1[20];
+ if (get_sha1_hex(buf, sha1))
+ die("bad shallow line: %s", buf);
+ register_shallow(sha1);
+ }
+ fclose(fp);
+ return is_shallow;
+}
+
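+/*
+ * Walk from the given heads, keeping the smallest known depth of each
+ * commit in commit->util.  Commits reached within "depth" steps are
+ * marked with not_shallow_flag; parents that fall beyond that limit
+ * get shallow_flag and are returned as the new shallow boundary.
+ */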
+struct commit_list *get_shallow_commits(struct object_array *heads, int depth,
+ int shallow_flag, int not_shallow_flag)
+{
+ int i = 0, cur_depth = 0;
+ struct commit_list *result = NULL;
+ struct object_array stack = {0, 0, NULL};
+ struct commit *commit = NULL;
+
+ while (commit || i < heads->nr || stack.nr) {
+ struct commit_list *p;
+ if (!commit) {
+ if (i < heads->nr) {
+ commit = (struct commit *)
+ deref_tag(heads->objects[i++].item, NULL, 0);
+ if (commit->object.type != OBJ_COMMIT) {
+ commit = NULL;
+ continue;
+ }
+ if (!commit->util)
+ commit->util = xmalloc(sizeof(int));
+ *(int *)commit->util = 0;
+ cur_depth = 0;
+ } else {
+ commit = (struct commit *)
+ stack.objects[--stack.nr].item;
+ cur_depth = *(int *)commit->util;
+ }
+ }
+ parse_commit(commit);
+ commit->object.flags |= not_shallow_flag;
+ cur_depth++;
+ for (p = commit->parents, commit = NULL; p; p = p->next) {
+ if (!p->item->util) {
+ int *pointer = xmalloc(sizeof(int));
+ p->item->util = pointer;
+ *pointer = cur_depth;
+ } else {
+ int *pointer = p->item->util;
+ if (cur_depth >= *pointer)
+ continue;
+ *pointer = cur_depth;
+ }
+ if (cur_depth < depth) {
+ if (p->next)
+ add_object_array(&p->item->object,
+ NULL, &stack);
+ else {
+ commit = p->item;
+ cur_depth = *(int *)commit->util;
+ }
+ } else {
+ commit_list_insert(p->item, &result);
+ p->item->object.flags |= shallow_flag;
+ }
+ }
+ }
+
+ return result;
+}
+
# we can test NO_OPTIMIZE_COMMITS independently of LC_ALL
full-svn-test:
- $(MAKE) $(TSVN) GIT_SVN_DELTA_FETCH=1 \
- GIT_SVN_NO_OPTIMIZE_COMMITS=1 LC_ALL=C
+ $(MAKE) $(TSVN) GIT_SVN_NO_OPTIMIZE_COMMITS=1 LC_ALL=C
$(MAKE) $(TSVN) GIT_SVN_NO_OPTIMIZE_COMMITS=0 LC_ALL=en_US.UTF-8
.PHONY: $(T) clean
pull_to_client 3rd "A" $((1*3)) # old fails
+test_expect_success "clone shallow" "git-clone --depth 2 . shallow"
+
+(cd shallow; git-count-objects -v) > count.shallow
+
+test_expect_success "clone shallow object count" \
+ "test \"in-pack: 18\" = \"$(grep in-pack count.shallow)\""
+
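+# all objects should be in the pack: ignore the in-pack/packs lines and
+# zero counters; anything left in the count-objects output is unexpected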
+count_output () {
+ sed -e '/^in-pack:/d' -e '/^packs:/d' -e '/: 0$/d' "$1"
+}
+
+test_expect_success "clone shallow object count (part 2)" '
+ test -z "$(count_output count.shallow)"
+'
+
+test_expect_success "fsck in shallow repo" \
+ "(cd shallow; git-fsck-objects --full)"
+
+#test_done; exit
+
+add B66 $B65
+add B67 $B66
+
+test_expect_success "pull in shallow repo" \
+ "(cd shallow; git pull .. B)"
+
+(cd shallow; git-count-objects -v) > count.shallow
+test_expect_success "object count after pull in shallow repo" \
+ "test \"count: 6\" = \"$(grep count count.shallow)\""
+
+add B68 $B67
+add B69 $B68
+
+test_expect_success "deepening pull in shallow repo" \
+ "(cd shallow; git pull --depth 4 .. B)"
+
+(cd shallow; git-count-objects -v) > count.shallow
+test_expect_success "object count after deepening pull" \
+ "test \"count: 12\" = \"$(grep count count.shallow)\""
+
+test_expect_success "deepening fetch in shallow repo" \
+ "(cd shallow; git fetch --depth 4 .. A:A)"
+
+(cd shallow; git-count-objects -v) > count.shallow
+test_expect_success "object count after deepening fetch" \
+ "test \"count: 18\" = \"$(grep count count.shallow)\""
+
+test_expect_failure "pull in shallow repo with missing merge base" \
+ "(cd shallow; git pull --depth 4 .. A)"
+
test_done
test_expect_success "expected conflict markers, with -L" \
"diff -u test.txt expect.txt"
+sed "s/ tu / TU /" < new1.txt > new5.txt
+test_expect_failure "conflict in removed tail" \
+ "git-merge-file -p orig.txt new1.txt new5.txt > out"
+
+cat > expect << EOF
+Dominus regit me,
+et nihil mihi deerit.
+In loco pascuae ibi me collocavit,
+super aquam refectionis educavit me;
+animam meam convertit,
+deduxit me super semitas jusitiae,
+propter nomen suum.
+<<<<<<< orig.txt
+=======
+Nam et si ambulavero in medio umbrae mortis,
+non timebo mala, quoniam TU mecum es:
+virga tua et baculus tuus ipsa me consolata sunt.
+>>>>>>> new5.txt
+EOF
+
+test_expect_success "expected conflict markers" "diff -u expect out"
+
test_done
# X \
# 2 - C - E - G
-export GIT_COMMITTER_DATE="2006-12-12 23:28:00 +0100"
-echo 1 > a1
-git add a1
-GIT_AUTHOR_DATE="2006-12-12 23:00:00" git commit -m 1 a1
-
-git checkout -b A master
-echo A > a1
-GIT_AUTHOR_DATE="2006-12-12 23:00:01" git commit -m A a1
-
-git checkout -b B master
-echo B > a1
-GIT_AUTHOR_DATE="2006-12-12 23:00:02" git commit -m B a1
-
-git checkout -b D A
-git-rev-parse B > .git/MERGE_HEAD
-echo D > a1
-git update-index a1
-GIT_AUTHOR_DATE="2006-12-12 23:00:03" git commit -m D
-
-git symbolic-ref HEAD refs/heads/other
-echo 2 > a1
-GIT_AUTHOR_DATE="2006-12-12 23:00:04" git commit -m 2 a1
-
-git checkout -b C
-echo C > a1
-GIT_AUTHOR_DATE="2006-12-12 23:00:05" git commit -m C a1
-
-git checkout -b E C
-git-rev-parse B > .git/MERGE_HEAD
-echo E > a1
-git update-index a1
-GIT_AUTHOR_DATE="2006-12-12 23:00:06" git commit -m E
-
-git checkout -b G E
-git-rev-parse A > .git/MERGE_HEAD
-echo G > a1
-git update-index a1
-GIT_AUTHOR_DATE="2006-12-12 23:00:07" git commit -m G
-
-git checkout -b F D
-git-rev-parse C > .git/MERGE_HEAD
-echo F > a1
-git update-index a1
+GIT_COMMITTER_DATE="2006-12-12 23:28:00 +0100"
+export GIT_COMMITTER_DATE
+
+test_expect_success "setup tests" '
+echo 1 > a1 &&
+git add a1 &&
+GIT_AUTHOR_DATE="2006-12-12 23:00:00" git commit -m 1 a1 &&
+
+git checkout -b A master &&
+echo A > a1 &&
+GIT_AUTHOR_DATE="2006-12-12 23:00:01" git commit -m A a1 &&
+
+git checkout -b B master &&
+echo B > a1 &&
+GIT_AUTHOR_DATE="2006-12-12 23:00:02" git commit -m B a1 &&
+
+git checkout -b D A &&
+git-rev-parse B > .git/MERGE_HEAD &&
+echo D > a1 &&
+git update-index a1 &&
+GIT_AUTHOR_DATE="2006-12-12 23:00:03" git commit -m D &&
+
+git symbolic-ref HEAD refs/heads/other &&
+echo 2 > a1 &&
+GIT_AUTHOR_DATE="2006-12-12 23:00:04" git commit -m 2 a1 &&
+
+git checkout -b C &&
+echo C > a1 &&
+GIT_AUTHOR_DATE="2006-12-12 23:00:05" git commit -m C a1 &&
+
+git checkout -b E C &&
+git-rev-parse B > .git/MERGE_HEAD &&
+echo E > a1 &&
+git update-index a1 &&
+GIT_AUTHOR_DATE="2006-12-12 23:00:06" git commit -m E &&
+
+git checkout -b G E &&
+git-rev-parse A > .git/MERGE_HEAD &&
+echo G > a1 &&
+git update-index a1 &&
+GIT_AUTHOR_DATE="2006-12-12 23:00:07" git commit -m G &&
+
+git checkout -b F D &&
+git-rev-parse C > .git/MERGE_HEAD &&
+echo F > a1 &&
+git update-index a1 &&
GIT_AUTHOR_DATE="2006-12-12 23:00:08" git commit -m F
+'
test_expect_failure "combined merge conflicts" "git merge -m final G"
echo 'define NO_SVN_TESTS to skip git-svn tests'
-mkdir import
-cd import
-
-echo foo > foo
-if test -z "$NO_SYMLINK"
-then
- ln -s foo foo.link
-fi
-mkdir -p dir/a/b/c/d/e
-echo 'deep dir' > dir/a/b/c/d/e/file
-mkdir -p bar
-echo 'zzz' > bar/zzz
-echo '#!/bin/sh' > exec.sh
-chmod +x exec.sh
-svn import -m 'import for git-svn' . "$svnrepo" >/dev/null
-
-cd ..
-rm -rf import
-
test_expect_success \
- 'initialize git-svn' \
- "git-svn init $svnrepo"
+ 'initialize git-svn' "
+ mkdir import &&
+ cd import &&
+ echo foo > foo &&
+ if test -z '$NO_SYMLINK'
+ then
+ ln -s foo foo.link
+ fi
+ mkdir -p dir/a/b/c/d/e &&
+ echo 'deep dir' > dir/a/b/c/d/e/file &&
+ mkdir bar &&
+ echo 'zzz' > bar/zzz &&
+ echo '#!/bin/sh' > exec.sh &&
+ chmod +x exec.sh &&
+ svn import -m 'import for git-svn' . $svnrepo >/dev/null &&
+ cd .. &&
+ rm -rf import &&
+ git-svn init $svnrepo"
test_expect_success \
'import an SVN revision into git' \
'git-svn fetch'
-test_expect_success "checkout from svn" "svn co $svnrepo $SVN_TREE"
+test_expect_success "checkout from svn" "svn co $svnrepo '$SVN_TREE'"
name='try a deep --rmdir with a commit'
-git checkout -f -b mybranch remotes/git-svn
-mv dir/a/b/c/d/e/file dir/file
-cp dir/file file
-git update-index --add --remove dir/a/b/c/d/e/file dir/file file
-git commit -m "$name"
-
-test_expect_success "$name" \
- "git-svn set-tree --find-copies-harder --rmdir remotes/git-svn..mybranch &&
- svn up $SVN_TREE &&
- test -d $SVN_TREE/dir && test ! -d $SVN_TREE/dir/a"
+test_expect_success "$name" "
+ git checkout -f -b mybranch remotes/git-svn &&
+ mv dir/a/b/c/d/e/file dir/file &&
+ cp dir/file file &&
+ git update-index --add --remove dir/a/b/c/d/e/file dir/file file &&
+ git commit -m '$name' &&
+ git-svn set-tree --find-copies-harder --rmdir \
+ remotes/git-svn..mybranch &&
+ svn up '$SVN_TREE' &&
+ test -d '$SVN_TREE'/dir && test ! -d '$SVN_TREE'/dir/a"
name='detect node change from file to directory #1'
-mkdir dir/new_file
-mv dir/file dir/new_file/file
-mv dir/new_file dir/file
-git update-index --remove dir/file
-git update-index --add dir/file/file
-git commit -m "$name"
-
-test_expect_failure "$name" \
- 'git-svn set-tree --find-copies-harder --rmdir remotes/git-svn..mybranch' \
- || true
+test_expect_failure "$name" "
+ mkdir dir/new_file &&
+ mv dir/file dir/new_file/file &&
+ mv dir/new_file dir/file &&
+ git update-index --remove dir/file &&
+ git update-index --add dir/file/file &&
+ git commit -m '$name' &&
+ git-svn set-tree --find-copies-harder --rmdir \
+ remotes/git-svn..mybranch" || true
name='detect node change from directory to file #1'
-rm -rf dir $GIT_DIR/index
-git checkout -f -b mybranch2 remotes/git-svn
-mv bar/zzz zzz
-rm -rf bar
-mv zzz bar
-git update-index --remove -- bar/zzz
-git update-index --add -- bar
-git commit -m "$name"
-
-test_expect_failure "$name" \
- 'git-svn set-tree --find-copies-harder --rmdir remotes/git-svn..mybranch2' \
- || true
+test_expect_failure "$name" "
+ rm -rf dir '$GIT_DIR'/index &&
+ git checkout -f -b mybranch2 remotes/git-svn &&
+ mv bar/zzz zzz &&
+ rm -rf bar &&
+ mv zzz bar &&
+ git update-index --remove -- bar/zzz &&
+ git update-index --add -- bar &&
+ git commit -m '$name' &&
+ git-svn set-tree --find-copies-harder --rmdir \
+ remotes/git-svn..mybranch2" || true
name='detect node change from file to directory #2'
-rm -f $GIT_DIR/index
-git checkout -f -b mybranch3 remotes/git-svn
-rm bar/zzz
-git-update-index --remove bar/zzz
-mkdir bar/zzz
-echo yyy > bar/zzz/yyy
-git-update-index --add bar/zzz/yyy
-git commit -m "$name"
-
-test_expect_failure "$name" \
- 'git-svn set-tree --find-copies-harder --rmdir remotes/git-svn..mybranch3' \
- || true
+test_expect_failure "$name" "
+ rm -f '$GIT_DIR'/index &&
+ git checkout -f -b mybranch3 remotes/git-svn &&
+ rm bar/zzz &&
+ git-update-index --remove bar/zzz &&
+ mkdir bar/zzz &&
+ echo yyy > bar/zzz/yyy &&
+ git-update-index --add bar/zzz/yyy &&
+ git commit -m '$name' &&
+ git-svn set-tree --find-copies-harder --rmdir \
+ remotes/git-svn..mybranch3" || true
name='detect node change from directory to file #2'
-rm -f $GIT_DIR/index
-git checkout -f -b mybranch4 remotes/git-svn
-rm -rf dir
-git update-index --remove -- dir/file
-touch dir
-echo asdf > dir
-git update-index --add -- dir
-git commit -m "$name"
-
-test_expect_failure "$name" \
- 'git-svn set-tree --find-copies-harder --rmdir remotes/git-svn..mybranch4' \
- || true
+test_expect_failure "$name" "
+ rm -f '$GIT_DIR'/index &&
+ git checkout -f -b mybranch4 remotes/git-svn &&
+ rm -rf dir &&
+ git update-index --remove -- dir/file &&
+ touch dir &&
+ echo asdf > dir &&
+ git update-index --add -- dir &&
+ git commit -m '$name' &&
+ git-svn set-tree --find-copies-harder --rmdir \
+ remotes/git-svn..mybranch4" || true
name='remove executable bit from a file'
-rm -f $GIT_DIR/index
-git checkout -f -b mybranch5 remotes/git-svn
-chmod -x exec.sh
-git update-index exec.sh
-git commit -m "$name"
-
-test_expect_success "$name" \
- "git-svn set-tree --find-copies-harder --rmdir remotes/git-svn..mybranch5 &&
- svn up $SVN_TREE &&
- test ! -x $SVN_TREE/exec.sh"
+test_expect_success "$name" "
+ rm -f '$GIT_DIR'/index &&
+ git checkout -f -b mybranch5 remotes/git-svn &&
+ chmod -x exec.sh &&
+ git update-index exec.sh &&
+ git commit -m '$name' &&
+ git-svn set-tree --find-copies-harder --rmdir \
+ remotes/git-svn..mybranch5 &&
+ svn up '$SVN_TREE' &&
+ test ! -x '$SVN_TREE'/exec.sh"
name='add executable bit back file'
-chmod +x exec.sh
-git update-index exec.sh
-git commit -m "$name"
-
-test_expect_success "$name" \
- "git-svn set-tree --find-copies-harder --rmdir remotes/git-svn..mybranch5 &&
- svn up $SVN_TREE &&
- test -x $SVN_TREE/exec.sh"
-
+test_expect_success "$name" "
+ chmod +x exec.sh &&
+ git update-index exec.sh &&
+ git commit -m '$name' &&
+ git-svn set-tree --find-copies-harder --rmdir \
+ remotes/git-svn..mybranch5 &&
+ svn up '$SVN_TREE' &&
+ test -x '$SVN_TREE'/exec.sh"
if test -z "$NO_SYMLINK"
then
name='executable file becomes a symlink to bar/zzz (file)'
- rm exec.sh
- ln -s bar/zzz exec.sh
- git update-index exec.sh
- git commit -m "$name"
- test_expect_success "$name" \
- "git-svn set-tree --find-copies-harder --rmdir remotes/git-svn..mybranch5 &&
- svn up $SVN_TREE &&
- test -L $SVN_TREE/exec.sh"
+ test_expect_success "$name" "
+ rm exec.sh &&
+ ln -s bar/zzz exec.sh &&
+ git update-index exec.sh &&
+ git commit -m '$name' &&
+ git-svn set-tree --find-copies-harder --rmdir \
+ remotes/git-svn..mybranch5 &&
+ svn up '$SVN_TREE' &&
+ test -L '$SVN_TREE'/exec.sh"
name='new symlink is added to a file that was also just made executable'
- chmod +x bar/zzz
- ln -s bar/zzz exec-2.sh
- git update-index --add bar/zzz exec-2.sh
- git commit -m "$name"
- test_expect_success "$name" \
- "git-svn set-tree --find-copies-harder --rmdir remotes/git-svn..mybranch5 &&
- svn up $SVN_TREE &&
- test -x $SVN_TREE/bar/zzz &&
- test -L $SVN_TREE/exec-2.sh"
+ test_expect_success "$name" "
+ chmod +x bar/zzz &&
+ ln -s bar/zzz exec-2.sh &&
+ git update-index --add bar/zzz exec-2.sh &&
+ git commit -m '$name' &&
+ git-svn set-tree --find-copies-harder --rmdir \
+ remotes/git-svn..mybranch5 &&
+ svn up '$SVN_TREE' &&
+ test -x '$SVN_TREE'/bar/zzz &&
+ test -L '$SVN_TREE'/exec-2.sh"
name='modify a symlink to become a file'
- echo git help > help || true
- rm exec-2.sh
- cp help exec-2.sh
- git update-index exec-2.sh
- git commit -m "$name"
-
- test_expect_success "$name" \
- "git-svn set-tree --find-copies-harder --rmdir remotes/git-svn..mybranch5 &&
- svn up $SVN_TREE &&
- test -f $SVN_TREE/exec-2.sh &&
- test ! -L $SVN_TREE/exec-2.sh &&
- diff -u help $SVN_TREE/exec-2.sh"
+ test_expect_success "$name" "
+ echo git help > help || true &&
+ rm exec-2.sh &&
+ cp help exec-2.sh &&
+ git update-index exec-2.sh &&
+ git commit -m '$name' &&
+ git-svn set-tree --find-copies-harder --rmdir \
+ remotes/git-svn..mybranch5 &&
+ svn up '$SVN_TREE' &&
+ test -f '$SVN_TREE'/exec-2.sh &&
+ test ! -L '$SVN_TREE'/exec-2.sh &&
+	 diff -u help '$SVN_TREE'/exec-2.sh"
fi
if test "$have_utf8" = t
then
name="commit with UTF-8 message: locale: $GIT_SVN_LC_ALL"
- echo '# hello' >> exec-2.sh
- git update-index exec-2.sh
- git commit -m 'éï∏'
- export LC_ALL="$GIT_SVN_LC_ALL"
- test_expect_success "$name" "git-svn set-tree HEAD"
+ LC_ALL="$GIT_SVN_LC_ALL"
+ export LC_ALL
+ test_expect_success "$name" "
+ echo '# hello' >> exec-2.sh &&
+ git update-index exec-2.sh &&
+ git commit -m 'éï∏' &&
+ git-svn set-tree HEAD"
unset LC_ALL
else
echo "UTF-8 locale not set, test skipped ($GIT_SVN_LC_ALL)"
exit
fi
-export CVSROOT=$(pwd)/cvsroot
-export CVSWORK=$(pwd)/cvswork
+CVSROOT=$(pwd)/cvsroot
+CVSWORK=$(pwd)/cvswork
+GIT_DIR=$(pwd)/.git
+export CVSROOT CVSWORK GIT_DIR
+
rm -rf "$CVSROOT" "$CVSWORK"
mkdir "$CVSROOT" &&
cvs init &&
cvs -Q co -d "$CVSWORK" . &&
-export GIT_DIR=$(pwd)/.git &&
echo >empty &&
git add empty &&
-git commit -a -m "Initial" 2>/dev/null ||
+git commit -q -a -m "Initial" 2>/dev/null ||
exit 1
test_expect_success \
return 0
}
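+# Decide whether the next test should be skipped: succeeds (and reports
+# the skip) when "<script>.<number>", e.g. "t1234.8", matches a pattern
+# listed in $GIT_SKIP_TESTS.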
+test_skip () {
+ this_test=$(expr "./$0" : '.*/\(t[0-9]*\)-[^/]*$')
+ this_test="$this_test.$(expr "$test_count" + 1)"
+ to_skip=
+ for skp in $GIT_SKIP_TESTS
+ do
+ case "$this_test" in
+ $skp)
+ to_skip=t
+ esac
+ done
+ case "$to_skip" in
+ t)
+ say >&3 "skipping test: $@"
+ test_count=$(expr "$test_count" + 1)
+ say "skip $test_count: $1"
+ : true
+ ;;
+ *)
+ false
+ ;;
+ esac
+}
+
test_expect_failure () {
test "$#" = 2 ||
error "bug in the test script: not 2 parameters to test-expect-failure"
- say >&3 "expecting failure: $2"
- test_run_ "$2"
- if [ "$?" = 0 -a "$eval_ret" != 0 -a "$eval_ret" -lt 129 ]
+ if ! test_skip "$@"
then
- test_ok_ "$1"
- else
- test_failure_ "$@"
+ say >&3 "expecting failure: $2"
+ test_run_ "$2"
+ if [ "$?" = 0 -a "$eval_ret" != 0 -a "$eval_ret" -lt 129 ]
+ then
+ test_ok_ "$1"
+ else
+ test_failure_ "$@"
+ fi
fi
echo >&3 ""
}
test_expect_success () {
test "$#" = 2 ||
error "bug in the test script: not 2 parameters to test-expect-success"
- say >&3 "expecting success: $2"
- test_run_ "$2"
- if [ "$?" = 0 -a "$eval_ret" = 0 ]
+ if ! test_skip "$@"
then
- test_ok_ "$1"
- else
- test_failure_ "$@"
+ say >&3 "expecting success: $2"
+ test_run_ "$2"
+ if [ "$?" = 0 -a "$eval_ret" = 0 ]
+ then
+ test_ok_ "$1"
+ else
+ test_failure_ "$@"
+ fi
fi
echo >&3 ""
}
test_expect_code () {
test "$#" = 3 ||
error "bug in the test script: not 3 parameters to test-expect-code"
- say >&3 "expecting exit code $1: $3"
- test_run_ "$3"
- if [ "$?" = 0 -a "$eval_ret" = "$1" ]
+ if ! test_skip "$@"
then
- test_ok_ "$2"
- else
- test_failure_ "$@"
+ say >&3 "expecting exit code $1: $3"
+ test_run_ "$3"
+ if [ "$?" = 0 -a "$eval_ret" = "$1" ]
+ then
+ test_ok_ "$2"
+ else
+ test_failure_ "$@"
+ fi
fi
echo >&3 ""
}
repo="$1"
mkdir "$repo"
cd "$repo" || error "Cannot setup test environment"
- "$GIT_EXEC_PATH/git" init-db --template=$GIT_EXEC_PATH/templates/blt/ 2>/dev/null ||
+ "$GIT_EXEC_PATH/git" init-db --template=$GIT_EXEC_PATH/templates/blt/ >/dev/null 2>&1 ||
error "cannot run git init-db -- have you built things yet?"
mv .git/hooks .git/hooks-disabled
cd "$owd"
rm -fr "$test"
test_create_repo $test
cd "$test"
+
+this_test=$(expr "./$0" : '.*/\(t[0-9]*\)-[^/]*$')
+to_skip=
+for skp in $GIT_SKIP_TESTS
+do
+	case "$this_test" in
+	$skp)
+		to_skip=t
+	esac
+done
+case "$to_skip" in
+t)
+	say >&3 "skipping test $this_test altogether"
+	say "skip all tests in $this_test"
+	test_done
+esac
#
# To enable this hook, make this file executable.
+# Uncomment the lines below to add a Signed-off-by line to the commit message.
+# SOB=$(git var GIT_AUTHOR_IDENT | sed -n 's/^\(.*>\).*$/Signed-off-by: \1/p')
+# grep -qs "^$SOB" "$1" || echo "$SOB" >> "$1"
+
# This example catches duplicate Signed-off-by lines.
test "" = "$(grep '^Signed-off-by: ' "$1" |
case "$1","$ref_type" in
refs/tags/*,commit)
echo "*** Un-annotated tags are not allowed in this repo" >&2
- echo "*** Use 'git tag [ -a | -s ]' for tags you want to propagate."
+ echo "*** Use 'git tag [ -a | -s ]' for tags you want to propagate." >&2
exit 1;;
refs/tags/*,tag)
echo "### Pushing version '${1##refs/tags/}' to the masses" >&2
#include "object.h"
#include "commit.h"
#include "exec_cmd.h"
+#include "diff.h"
+#include "revision.h"
+#include "list-objects.h"
static const char upload_pack_usage[] = "git-upload-pack [--strict] [--timeout=nn] <dir>";
#define COMMON_KNOWN (1u << 14)
#define REACHABLE (1u << 15)
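+/*
+ * SHALLOW marks a commit that becomes a new shallow boundary,
+ * NOT_SHALLOW one that stays within the requested depth, and
+ * CLIENT_SHALLOW one the client listed in a "shallow" line.
+ */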
+#define SHALLOW (1u << 16)
+#define NOT_SHALLOW (1u << 17)
+#define CLIENT_SHALLOW (1u << 18)
+
static unsigned long oldest_have;
static int multi_ack, nr_our_refs;
return safe_write(fd, data, sz);
}
+FILE *pack_pipe = NULL;
+static void show_commit(struct commit *commit)
+{
+ if (commit->object.flags & BOUNDARY)
+ fputc('-', pack_pipe);
+ if (fputs(sha1_to_hex(commit->object.sha1), pack_pipe) < 0)
+ die("broken output pipe");
+ fputc('\n', pack_pipe);
+ fflush(pack_pipe);
+ free(commit->buffer);
+ commit->buffer = NULL;
+}
+
+static void show_object(struct object_array_entry *p)
+{
+ /* An object with name "foo\n0000000..." can be used to
+ * confuse downstream git-pack-objects very badly.
+ */
+ const char *ep = strchr(p->name, '\n');
+ if (ep) {
+ fprintf(pack_pipe, "%s %.*s\n", sha1_to_hex(p->item->sha1),
+ (int) (ep - p->name),
+ p->name);
+ }
+ else
+ fprintf(pack_pipe, "%s %s\n",
+ sha1_to_hex(p->item->sha1), p->name);
+}
+
+static void show_edge(struct commit *commit)
+{
+ fprintf(pack_pipe, "-%s\n", sha1_to_hex(commit->object.sha1));
+}
+
static void create_pack_file(void)
{
/* Pipes between rev-list to pack-objects, pack-objects to us
if (!pid_rev_list) {
int i;
- int args;
- const char **argv;
- const char **p;
- char *buf;
+ struct rev_info revs;
- if (create_full_pack) {
- args = 10;
- use_thin_pack = 0; /* no point doing it */
- }
- else
- args = have_obj.nr + want_obj.nr + 5;
- p = xmalloc(args * sizeof(char *));
- argv = (const char **) p;
- buf = xmalloc(args * 45);
+ pack_pipe = fdopen(lp_pipe[1], "w");
- dup2(lp_pipe[1], 1);
- close(0);
- close(lp_pipe[0]);
- close(lp_pipe[1]);
- *p++ = "rev-list";
- *p++ = use_thin_pack ? "--objects-edge" : "--objects";
if (create_full_pack)
- *p++ = "--all";
- else {
+ use_thin_pack = 0; /* no point doing it */
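+		/* walk the revisions in-process rather than exec'ing git-rev-list */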
+ init_revisions(&revs, NULL);
+ revs.tag_objects = 1;
+ revs.tree_objects = 1;
+ revs.blob_objects = 1;
+ if (use_thin_pack)
+ revs.edge_hint = 1;
+
+ if (create_full_pack) {
+ const char *args[] = {"rev-list", "--all", NULL};
+ setup_revisions(2, args, &revs, NULL);
+ } else {
for (i = 0; i < want_obj.nr; i++) {
struct object *o = want_obj.objects[i].item;
- *p++ = buf;
- memcpy(buf, sha1_to_hex(o->sha1), 41);
- buf += 41;
+				/* why??? -- clear any UNINTERESTING flag so these wanted tips are walked */
+ o->flags &= ~UNINTERESTING;
+ add_pending_object(&revs, o, NULL);
}
- }
- if (!create_full_pack)
for (i = 0; i < have_obj.nr; i++) {
struct object *o = have_obj.objects[i].item;
- *p++ = buf;
- *buf++ = '^';
- memcpy(buf, sha1_to_hex(o->sha1), 41);
- buf += 41;
+ o->flags |= UNINTERESTING;
+ add_pending_object(&revs, o, NULL);
}
- *p++ = NULL;
- execv_git_cmd(argv);
- die("git-upload-pack: unable to exec git-rev-list");
+ setup_revisions(0, NULL, &revs, NULL);
+ }
+ prepare_revision_walk(&revs);
+ mark_edges_uninteresting(revs.commits, &revs, show_edge);
+ traverse_commit_list(&revs, show_commit, show_object);
+ exit(0);
}
if (pipe(pu_pipe) < 0)
static void receive_needs(void)
{
+ struct object_array shallows = {0, 0, NULL};
static char line[1000];
- int len;
+ int len, depth = 0;
for (;;) {
struct object *o;
len = packet_read_line(0, line, sizeof(line));
reset_timeout();
if (!len)
- return;
+ break;
+ if (!strncmp("shallow ", line, 8)) {
+ unsigned char sha1[20];
+ struct object *object;
+ use_thin_pack = 0;
+ if (get_sha1(line + 8, sha1))
+ die("invalid shallow line: %s", line);
+ object = parse_object(sha1);
+ if (!object)
+ die("did not find object for %s", line);
+ object->flags |= CLIENT_SHALLOW;
+ add_object_array(object, NULL, &shallows);
+ continue;
+ }
+ if (!strncmp("deepen ", line, 7)) {
+ char *end;
+ use_thin_pack = 0;
+ depth = strtol(line + 7, &end, 0);
+ if (end == line + 7 || depth <= 0)
+ die("Invalid deepen: %s", line);
+ continue;
+ }
if (strncmp("want ", line, 5) ||
get_sha1_hex(line+5, sha1_buf))
die("git-upload-pack: protocol error, "
add_object_array(o, NULL, &want_obj);
}
}
+ if (depth == 0 && shallows.nr == 0)
+ return;
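+	/* the client asked to deepen: compute the new shallow boundary and send "shallow"/"unshallow" notices */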
+ if (depth > 0) {
+ struct commit_list *result, *backup;
+ int i;
+ backup = result = get_shallow_commits(&want_obj, depth,
+ SHALLOW, NOT_SHALLOW);
+ while (result) {
+ struct object *object = &result->item->object;
+ if (!(object->flags & (CLIENT_SHALLOW|NOT_SHALLOW))) {
+ packet_write(1, "shallow %s",
+ sha1_to_hex(object->sha1));
+ register_shallow(object->sha1);
+ }
+ result = result->next;
+ }
+ free_commit_list(backup);
+ for (i = 0; i < shallows.nr; i++) {
+ struct object *object = shallows.objects[i].item;
+ if (object->flags & NOT_SHALLOW) {
+ struct commit_list *parents;
+ packet_write(1, "unshallow %s",
+ sha1_to_hex(object->sha1));
+ object->flags &= ~CLIENT_SHALLOW;
+ /* make sure the real parents are parsed */
+ unregister_shallow(object->sha1);
+ object->parsed = 0;
+ parse_commit((struct commit *)object);
+ parents = ((struct commit *)object)->parents;
+ while (parents) {
+ add_object_array(&parents->item->object,
+ NULL, &want_obj);
+ parents = parents->next;
+ }
+ }
+ /* make sure commit traversal conforms to client */
+ register_shallow(object->sha1);
+ }
+ packet_flush(1);
+ } else
+ if (shallows.nr > 0) {
+ int i;
+ for (i = 0; i < shallows.nr; i++)
+ register_shallow(shallows.objects[i].item->sha1);
+ }
+ free(shallows.objects);
}
static int send_ref(const char *refname, const unsigned char *sha1, int flag, void *cb_data)
{
- static const char *capabilities = "multi_ack thin-pack side-band side-band-64k ofs-delta";
+ static const char *capabilities = "multi_ack thin-pack side-band"
+ " side-band-64k ofs-delta shallow";
struct object *o = parse_object(sha1);
if (!o)
if (m->mode)
continue;
+ /* no sense refining a conflict when one side is empty */
+ if (m->chg1 == 0 || m->chg2 == 0)
+ continue;
+
/*
* This probably does not work outside git, since
* we have a very simple mmfile structure.