gitk-wish
gitweb/gitweb.cgi
test-chmtime
+test-ctype
test-date
test-delta
test-dump-cache-tree
As for more concrete guidelines, just imitate the existing code
(this is a good guideline, no matter which project you are
-contributing to). But if you must have a list of rules,
-here they are.
+contributing to). It is always preferable to match the _local_
+convention. New code added to the git suite is expected to match
+the overall style of existing code. Modifications to existing
+code are expected to match the style the surrounding code already
+uses (even if it doesn't match the overall style of existing code).
+
+But if you must have a list of rules, here they are.
For shell scripts specifically (not exhaustive):
Fixes since v1.6.1
------------------
+* "git add frotz/nitfol" when "frotz" is a submodule should have errored
+ out, but it didn't.
+
* "git apply" took file modes from the patch text and updated the mode
bits of the target tree even when the patch was not about mode changes.
+* "git bisect view" on Cygwin did not launch gitk.
+
* "git checkout $tree" did not trigger an error.
+* "git commit" tried to remove COMMIT_EDITMSG from the work tree by mistake.
+
* "git describe --all" complained when a commit is described with a tag,
which was nonsense.
+* "git diff --no-index --" did not trigger no-index (aka "use git-diff as
+ a replacement of diff on untracked files") behaviour.
+
+* "git format-patch -1 HEAD" on a root commit failed to produce patch
+ text.
+
+* "git fsck branch" did not work as advertised; instead it behaved the same
+ way as "git fsck".
+
* "git log --pretty=format:%s" did not handle a multi-line subject the
same way as built-in log listers (i.e. shortlog, --pretty=oneline, etc.)
* "git mv -k" with more than one erroneous path misbehaved.
+* "git read-tree -m -u" (and hence branch switching) incorrectly lost a
+  subdirectory in rare cases.
+
* "git rebase -i" issued an unnecessary error message upon a user error of
marking the first commit to be "squash"ed.
-Other documentation updates.
-
----
-exec >/var/tmp/1
-O=v1.6.1-47-g914186a
-echo O=$(git describe maint)
-git shortlog --no-merges $O..maint
+* "git shortlog" did not format a commit message with multi-line
+ subject correctly.
+Many documentation updates.
will enable basic rename detection. If set to "copies" or
"copy", it will detect copies, as well.
-diff.suppress-blank-empty::
+diff.suppressBlankEmpty::
A boolean to inhibit the standard behavior of printing a space
before each empty output line. Defaults to false.
+diff.wordRegex::
+ A POSIX Extended Regular Expression used to determine what is a "word"
+ when performing word-by-word difference calculations. Character
+ sequences that match the regular expression are "words", all other
+ characters are *ignorable* whitespace.
+
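+For example, one might set a word regex once in the configuration; the
+value below is only an illustration, not a recommended default:
+
+------------------------
+$ git config diff.wordRegex '[[:alnum:]_]+|[^[:space:]]'
+------------------------
+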
fetch.unpackLimit::
If the number of objects fetched over the git native
transfer is below this
--patch-with-raw::
Synonym for "-p --raw".
+--patience::
+ Generate a diff using the "patience diff" algorithm.
+
--stat[=width[,name-width]]::
Generate a diffstat. You can override the default
output width for 80-column terminal by "--stat=width".
Turn off colored diff, even when the configuration file
gives the default to color output.
---color-words::
- Show colored word diff, i.e. color words which have changed.
+--color-words[=<regex>]::
+ Show colored word diff, i.e., color words which have changed.
+ By default, words are separated by whitespace.
++
+When a <regex> is specified, every non-overlapping match of the
+<regex> is considered a word. Anything between these matches is
+considered whitespace and ignored(!) for the purposes of finding
+differences. You may want to append `|[^[:space:]]` to your regular
+expression to make sure that it matches all non-whitespace characters.
+A match that contains a newline is silently truncated(!) at the
+newline.
++
+The regex can also be set via a diff driver or configuration option, see
+linkgit:gitattributes[1] or linkgit:git-config[1]. Giving it explicitly
+overrides any diff driver or configuration setting. Diff drivers
+override configuration settings.
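++
+As a concrete illustration, such a regex can also be given directly on
+the command line; the expression below is only an example:
++
+------------------------
+$ git diff --color-words='[A-Za-z]+|[^[:space:]]'
+------------------------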
--no-renames::
Turn off rename detection, even when the configuration
[verse]
'git am' [--signoff] [--keep] [--utf8 | --no-utf8]
[--3way] [--interactive]
- [--whitespace=<option>] [-C<n>] [-p<n>]
+ [--whitespace=<option>] [-C<n>] [-p<n>] [--directory=<dir>]
+ [--reject]
[<mbox> | <Maildir>...]
'git am' (--skip | --resolved | --abort)
available locally.
--whitespace=<option>::
- This flag is passed to the 'git-apply' (see linkgit:git-apply[1])
- program that applies
- the patch.
-
-C<n>::
-p<n>::
+--directory=<dir>::
+--reject::
These flags are passed to the 'git-apply' (see linkgit:git-apply[1])
program that applies
the patch.
--------
[verse]
'git apply' [--stat] [--numstat] [--summary] [--check] [--index]
- [--apply] [--no-add] [--build-fake-ancestor <file>] [-R | --reverse]
+ [--apply] [--no-add] [--build-fake-ancestor=<file>] [-R | --reverse]
[--allow-binary-replacement | --binary] [--reject] [-z]
[-pNUM] [-CNUM] [--inaccurate-eof] [--recount] [--cached]
[--whitespace=<nowarn|warn|fix|error|error-all>]
cached data, apply the patch, and store the result in the index,
without using the working tree. This implies '--index'.
---build-fake-ancestor <file>::
+--build-fake-ancestor=<file>::
Newer 'git-diff' output has embedded 'index information'
for each blob to help identify the original version that
the patch applies to. When this flag is given, and if
Automatically implies --tags.
--abbrev=<n>::
- Instead of using the default 8 hexadecimal digits as the
+ Instead of using the default 7 hexadecimal digits as the
abbreviated object name, use <n> digits.
--candidates=<n>::
-------
include::diff-options.txt[]
--1 -2 -3 or --base --ours --theirs, and -0::
+-1 --base::
+-2 --ours::
+-3 --theirs::
+-0::
Diff against the "base" version, "our branch" or "their
branch" respectively. With these options, diffs for
merged entries are not shown.
OPTIONS
-------
--t or --tool=<tool>::
+-t <tool>::
+--tool=<tool>::
Use the merge resolution program specified by <tool>.
Valid merge tools are:
kdiff3, tkdiff, meld, xxdiff, emerge, vimdiff, gvimdiff, ecmerge, and opendiff
Otherwise, 'git-mergetool' will prompt the user to indicate the
success of the resolution after the custom tool has exited.
--y or --no-prompt::
+-y::
+--no-prompt::
Don't prompt before each invocation of the merge resolution
program.
-------
<repository>::
The "remote" repository that is destination of a push
- operation. See the section <<URLS,GIT URLS>> below.
+ operation. This parameter can be either a URL
+ (see the section <<URLS,GIT URLS>> below) or the name
+ of a remote (see the section <<REMOTES,REMOTES>> below).
<refspec>...::
The canonical format of a <refspec> parameter is
want to push. The <dst> side represents the destination location.
+
The local ref that matches <src> is used
-to fast forward the remote ref that matches <dst> (or, if no <dst> was
-specified, the same ref that <src> referred to locally). If
+to fast forward the remote ref that matches <dst>. If
the optional leading plus `+` is used, the remote ref is updated
even if it does not result in a fast forward update.
+
`tag <tag>` means the same as `refs/tags/<tag>:refs/tags/<tag>`.
+
-A parameter <ref> without a colon pushes the <ref> from the source
-repository to the destination repository under the same name.
+A lonely <src> parameter (without a colon and a destination) pushes
+the <src> to the same name in the destination repository.
+
Pushing an empty <src> allows you to delete the <dst> ref from
the remote repository.
+
The special refspec `:` (or `+:` to allow non-fast forward updates)
-directs git to push "matching" heads: for every head that exists on
-the local side, the remote side is updated if a head of the same name
+directs git to push "matching" branches: for every branch that exists on
+the local side, the remote side is updated if a branch of the same name
already exists on the remote side. This is the default operation mode
if no explicit refspec is found (that is neither on the command line
nor in any Push line of the corresponding remotes file---see below).
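++
+As a quick sketch (the remote and branch names are only placeholders),
+these refspec forms correspond to command lines such as:
++
+------------------------
+$ git push origin master                 # push "master" to the same name
+$ git push origin master:refs/heads/test # explicit <src>:<dst> refspec
+$ git push origin :test                  # empty <src> deletes the remote ref
+$ git push origin :                      # push all "matching" branches
+------------------------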
SYNOPSIS
--------
[verse]
-'git rebase' [-i | --interactive] [-v | --verbose] [-m | --merge]
- [-s <strategy> | --strategy=<strategy>] [--no-verify]
- [-C<n>] [ --whitespace=<option>] [-p | --preserve-merges]
- [--onto <newbase>] <upstream> [<branch>]
+'git rebase' [-i | --interactive] [options] [--onto <newbase>]
+ <upstream> [<branch>]
+'git rebase' [-i | --interactive] [options] --onto <newbase>
+ --root [<branch>]
+
'git rebase' --continue | --skip | --abort
DESCRIPTION
All changes made by commits in the current branch but that are not
in <upstream> are saved to a temporary area. This is the same set
-of commits that would be shown by `git log <upstream>..HEAD`.
+of commits that would be shown by `git log <upstream>..HEAD` (or
+`git log HEAD`, if --root is specified).
The current branch is reset to <upstream>, or <newbase> if the
--onto option was supplied. This has the exact same effect as
--preserve-merges::
Instead of ignoring merges, try to recreate them.
+--root::
+ Rebase all commits reachable from <branch>, instead of
+ limiting them with an <upstream>. This allows you to rebase
+ the root commit(s) on a branch. Must be used with --onto, and
+ will skip changes already contained in <newbase> (instead of
+ <upstream>). When used together with --preserve-merges, 'all'
+ root commits will be rewritten to have <newbase> as parent
+ instead.
+
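+For example, to rebase the entire history of the current branch,
+including its root commit, onto another branch (the name 'newbase' is
+only a placeholder):
+
+------------------------
+$ git rebase --onto newbase --root
+------------------------
+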
include::merge-strategies.txt[]
NOTES
The commands can be executed only by the '-c' option; the shell is not
interactive.
-Currently, only the 'git-receive-pack' and 'git-upload-pack' commands
-are permitted to be called, with a single required argument.
+Currently, only three commands are permitted to be called: 'git-receive-pack'
+and 'git-upload-pack', each with a single required argument, or 'cvs server'
+(to invoke 'git-cvsserver').
Author
------
.git/config file may be specified as an optional command-line
argument.
+--localtime;;
+ Store Git commit times in the local timezone instead of UTC. This
+ makes 'git-log' (even without --date=local) show the same times
+ that `svn log` would in the local timezone.
+
+This doesn't interfere with interoperating with the Subversion
+repository you cloned from, but if you wish for your local Git
+repository to be able to interoperate with someone else's local Git
+repository, either don't use this option, or make sure you both use it
+in the same local timezone.
+
+--ignore-paths=<regex>;;
+ This allows one to specify a Perl regular expression that will
+ cause all matching paths to be skipped when checking out from SVN.
+ Examples:
+
+ --ignore-paths="^doc" - skip "doc*" directory for every fetch.
+
+ --ignore-paths="^[^/]+/(?:branches|tags)" - skip "branches"
+ and "tags" of first level directories.
+
+ The regular expression is not persistent; you have to specify
+ it every time you fetch.
+
'clone'::
Runs 'init' and 'fetch'. It will automatically create a
directory based on the basename of the URL passed to it;
branch of the `git.git` repository.
Documentation for older releases are available here:
-* link:v1.6.1/git.html[documentation for release 1.6.1]
+* link:v1.6.1.1/git.html[documentation for release 1.6.1.1]
* release notes for
+ link:RelNotes-1.6.1.1.txt[1.6.1.1],
link:RelNotes-1.6.1.txt[1.6.1].
* link:v1.6.0.6/git.html[documentation for release 1.6.0.6]
- `bibtex` suitable for files with BibTeX coded references.
+- `cpp` suitable for source code in the C and C++ languages.
+
- `html` suitable for HTML/XHTML documents.
- `java` suitable for source code in the Java language.
- `tex` suitable for source code for LaTeX documents.
+Customizing word diff
+^^^^^^^^^^^^^^^^^^^^^
+
+You can customize the rules that `git diff --color-words` uses to
+split words in a line, by specifying an appropriate regular expression
+in the "diff.*.wordRegex" configuration variable. For example, in TeX
+a backslash followed by a sequence of letters forms a command, but
+several such commands can be run together without intervening
+whitespace. To separate them, use a regular expression such as
+
+------------------------
+[diff "tex"]
+ wordRegex = "\\\\[a-zA-Z]+|[{}]|\\\\.|[^\\{}[:space:]]+"
+------------------------
+
+A built-in pattern is provided for all languages listed in the
+previous section.
+
+
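+To take advantage of a built-in pattern, the corresponding driver still
+has to be assigned to the paths in the attributes file, for example (the
+pattern is only illustrative):
+
+------------------------
+$ echo '*.tex diff=tex' >>.gitattributes
+------------------------
+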
Performing text diffs of binary files
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
default log message, and before the editor is started.
It takes one to three parameters. The first is the name of the file
-that the commit log message. The second is the source of the commit
+that contains the commit log message. The second is the source of the commit
message, and can be: `message` (if a `-m` or `-F` option was
given); `template` (if a `-t` option was given or the
configuration option `commit.template` is set); `merge` (if the
$ echo 'hello world' > file.txt
$ git add .
$ git commit -a -m "initial commit"
-[master (root-commit)] created 54196cc: "initial commit"
+[master (root-commit) 54196cc] initial commit
1 files changed, 1 insertions(+), 0 deletions(-)
create mode 100644 file.txt
$ echo 'hello world!' >file.txt
$ git commit -a -m "add emphasis"
-[master] created c4d59f3: "add emphasis"
+[master c4d59f3] add emphasis
1 files changed, 1 insertions(+), 1 deletions(-)
------------------------------------------------
This merges the changes from Bob's "master" branch into Alice's
current branch. If Alice has made her own changes in the meantime,
-then she may need to manually fix any conflicts. (Note that the
-"master" argument in the above command is actually unnecessary, as it
-is the default.)
+then she may need to manually fix any conflicts.
The "pull" command thus performs two operations: it fetches changes
from a remote branch, then merges them into the current branch.
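+
+Spelled out as those two steps, a pull of Bob's "master" branch is
+roughly equivalent to the following (the repository path is only
+illustrative):
+
+------------------------------------------------
+$ git fetch /home/bob/myrepo master
+$ git merge FETCH_HEAD
+------------------------------------------------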
- '%Cgreen': switch color to green
- '%Cblue': switch color to blue
- '%Creset': reset color
+- '%C(...)': color specification, as described in color.branch.* config option
- '%m': left, right or boundary mark
- '%n': newline
- '%x00': print a byte from a hex code
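+
+For instance, the color placeholders can be combined with the other
+placeholders shown above; the format string here is purely illustrative:
+
+------------------------
+$ git log --pretty=format:'%C(yellow)%h%Creset %s'
+------------------------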
Wait for the completion of an asynchronous function that was
started with start_async().
+`run_hook`::
+
+ Run a hook.
+ The first argument is a pathname to an index file, or NULL
+ if the hook uses the default index file or no index is needed.
+ The second argument is the name of the hook.
+ The further arguments correspond to the hook arguments.
+ The last argument has to be NULL to terminate the arguments list.
+ If the hook does not exist or is not executable, the return
+ value will be zero.
+ If it is executable, the hook will be executed and the exit
+ status of the hook is returned.
+ On execution, .stdout_to_stderr and .no_stdin will be set.
+ (See below.)
+
Data structures
---------------
#!/bin/sh
GVF=GIT-VERSION-FILE
-DEF_VER=v1.6.0.2.GIT
+DEF_VER=v1.6.1.GIT
LF='
'
# Define NO_EXPAT if you do not have expat installed. git-http-push is
# not built, and you cannot push using http:// and https:// transports.
#
+# Define EXPATDIR=/foo/bar if your expat header and library files are in
+# /foo/bar/include and /foo/bar/lib directories.
+#
# Define NO_D_INO_IN_DIRENT if you don't have d_ino in your struct dirent.
#
# Define NO_D_TYPE_IN_DIRENT if your platform defines DT_UNKNOWN but lacks
BASIC_LDFLAGS += -L/opt/local/lib
endif
endif
+ PTHREAD_LIBS =
endif
ifndef CC_LD_DYNPATH
endif
endif
ifndef NO_EXPAT
- EXPAT_LIBEXPAT = -lexpat
+ ifdef EXPATDIR
+ BASIC_CFLAGS += -I$(EXPATDIR)/include
+ EXPAT_LIBEXPAT = -L$(EXPATDIR)/$(lib) $(CC_LD_DYNPATH)$(EXPATDIR)/$(lib) -lexpat
+ else
+ EXPAT_LIBEXPAT = -lexpat
+ endif
endif
endif
$(QUIET_AR)$(RM) $@ && $(AR) rcs $@ $(LIB_OBJS)
XDIFF_OBJS=xdiff/xdiffi.o xdiff/xprepare.o xdiff/xutils.o xdiff/xemit.o \
- xdiff/xmerge.o
+ xdiff/xmerge.o xdiff/xpatience.o
$(XDIFF_OBJS): xdiff/xinclude.h xdiff/xmacros.h xdiff/xdiff.h xdiff/xtypes.h \
xdiff/xutils.h xdiff/xprepare.h xdiff/xdiffi.h xdiff/xemit.h
### Testing rules
TEST_PROGRAMS += test-chmtime$X
+TEST_PROGRAMS += test-ctype$X
TEST_PROGRAMS += test-date$X
TEST_PROGRAMS += test-delta$X
TEST_PROGRAMS += test-genrandom$X
test: all
$(MAKE) -C t/ all
+test-ctype$X: ctype.o
+
test-date$X: date.o ctype.o
test-delta$X: diff-delta.o patch-delta.o
{ $(RM) "$$execdir/git-add$X" && \
ln git-add$X "$$execdir/git-add$X" 2>/dev/null || \
cp git-add$X "$$execdir/git-add$X"; } && \
- { $(foreach p,$(filter-out git-add$X,$(BUILT_INS)), $(RM) "$$execdir/$p" && \
- ln "$$execdir/git-add$X" "$$execdir/$p" 2>/dev/null || \
- ln -s "git-add$X" "$$execdir/$p" 2>/dev/null || \
- cp "$$execdir/git-add$X" "$$execdir/$p" || exit;) } && \
+ { for p in $(filter-out git-add$X,$(BUILT_INS)); do \
+ $(RM) "$$execdir/$$p" && \
+ ln "$$execdir/git-add$X" "$$execdir/$$p" 2>/dev/null || \
+ ln -s "git-add$X" "$$execdir/$$p" 2>/dev/null || \
+ cp "$$execdir/git-add$X" "$$execdir/$$p" || exit; \
+ done } && \
./check_bindir "z$$bindir" "z$$execdir" "$$bindir/git-add$X"
install-doc:
#include "builtin.h"
#include "string-list.h"
#include "dir.h"
+#include "parse-options.h"
/*
* --check turns on checking that the working tree matches the
static int no_add;
static const char *fake_ancestor;
static int line_termination = '\n';
-static unsigned long p_context = ULONG_MAX;
-static const char apply_usage[] =
-"git apply [--stat] [--numstat] [--summary] [--check] [--index] [--cached] [--apply] [--no-add] [--index-info] [--allow-binary-replacement] [--reverse] [--reject] [--verbose] [-z] [-pNUM] [-CNUM] [--whitespace=<nowarn|warn|fix|error|error-all>] <patch>...";
+static unsigned int p_context = UINT_MAX;
+static const char * const apply_usage[] = {
+ "git apply [options] [<patch>...]",
+ NULL
+};
static enum ws_error_action {
nowarn_ws_error,
static const char *patch_input_file;
static const char *root;
static int root_len;
+static int read_stdin = 1;
+static int options;
static void parse_whitespace_option(const char *option)
{
stream.avail_in = size;
stream.next_out = out = xmalloc(inflated_size);
stream.avail_out = inflated_size;
- inflateInit(&stream);
- st = inflate(&stream, Z_FINISH);
+ git_inflate_init(&stream);
+ st = git_inflate(&stream, Z_FINISH);
+ git_inflate_end(&stream);
if ((st != Z_STREAM_END) || stream.total_out != inflated_size) {
free(out);
return NULL;
return git_default_config(var, value, cb);
}
+static int option_parse_exclude(const struct option *opt,
+ const char *arg, int unset)
+{
+ add_name_limit(arg, 1);
+ return 0;
+}
+
+static int option_parse_include(const struct option *opt,
+ const char *arg, int unset)
+{
+ add_name_limit(arg, 0);
+ has_include = 1;
+ return 0;
+}
+
+static int option_parse_p(const struct option *opt,
+ const char *arg, int unset)
+{
+ p_value = atoi(arg);
+ p_value_known = 1;
+ return 0;
+}
+
+static int option_parse_z(const struct option *opt,
+ const char *arg, int unset)
+{
+ if (unset)
+ line_termination = '\n';
+ else
+ line_termination = 0;
+ return 0;
+}
+
+static int option_parse_whitespace(const struct option *opt,
+ const char *arg, int unset)
+{
+ const char **whitespace_option = opt->value;
+
+ *whitespace_option = arg;
+ parse_whitespace_option(arg);
+ return 0;
+}
+
+static int option_parse_directory(const struct option *opt,
+ const char *arg, int unset)
+{
+ root_len = strlen(arg);
+ if (root_len && arg[root_len - 1] != '/') {
+ char *new_root;
+ root = new_root = xmalloc(root_len + 2);
+ strcpy(new_root, arg);
+ strcpy(new_root + root_len++, "/");
+ } else
+ root = arg;
+ return 0;
+}
int cmd_apply(int argc, const char **argv, const char *unused_prefix)
{
int i;
- int read_stdin = 1;
- int options = 0;
int errs = 0;
int is_not_gitdir;
+ int binary;
+ int force_apply = 0;
const char *whitespace_option = NULL;
+ struct option builtin_apply_options[] = {
+ { OPTION_CALLBACK, 0, "exclude", NULL, "path",
+ "don't apply changes matching the given path",
+ 0, option_parse_exclude },
+ { OPTION_CALLBACK, 0, "include", NULL, "path",
+ "apply changes matching the given path",
+ 0, option_parse_include },
+ { OPTION_CALLBACK, 'p', NULL, NULL, "num",
+ "remove <num> leading slashes from traditional diff paths",
+ 0, option_parse_p },
+ OPT_BOOLEAN(0, "no-add", &no_add,
+ "ignore additions made by the patch"),
+ OPT_BOOLEAN(0, "stat", &diffstat,
+ "instead of applying the patch, output diffstat for the input"),
+ OPT_BOOLEAN(0, "allow-binary-replacement", &binary,
+ "now no-op"),
+ OPT_BOOLEAN(0, "binary", &binary,
+ "now no-op"),
+ OPT_BOOLEAN(0, "numstat", &numstat,
+ "shows number of added and deleted lines in decimal notation"),
+ OPT_BOOLEAN(0, "summary", &summary,
+ "instead of applying the patch, output a summary for the input"),
+ OPT_BOOLEAN(0, "check", &check,
+ "instead of applying the patch, see if the patch is applicable"),
+ OPT_BOOLEAN(0, "index", &check_index,
+ "make sure the patch is applicable to the current index"),
+ OPT_BOOLEAN(0, "cached", &cached,
+ "apply a patch without touching the working tree"),
+ OPT_BOOLEAN(0, "apply", &force_apply,
+ "also apply the patch (use with --stat/--summary/--check)"),
+ OPT_STRING(0, "build-fake-ancestor", &fake_ancestor, "file",
+ "build a temporary index based on embedded index information"),
+ { OPTION_CALLBACK, 'z', NULL, NULL, NULL,
+ "paths are separated with NUL character",
+ PARSE_OPT_NOARG, option_parse_z },
+ OPT_INTEGER('C', NULL, &p_context,
+ "ensure at least <n> lines of context match"),
+ { OPTION_CALLBACK, 0, "whitespace", &whitespace_option, "action",
+ "detect new or modified lines that have whitespace errors",
+ 0, option_parse_whitespace },
+ OPT_BOOLEAN('R', "reverse", &apply_in_reverse,
+ "apply the patch in reverse"),
+ OPT_BOOLEAN(0, "unidiff-zero", &unidiff_zero,
+ "don't expect at least one line of context"),
+ OPT_BOOLEAN(0, "reject", &apply_with_reject,
+ "leave the rejected hunks in corresponding *.rej files"),
+ OPT__VERBOSE(&apply_verbosely),
+ OPT_BIT(0, "inaccurate-eof", &options,
+ "tolerate incorrectly detected missing new-line at the end of file",
+ INACCURATE_EOF),
+ OPT_BIT(0, "recount", &options,
+ "do not trust the line counts in the hunk headers",
+ RECOUNT),
+ { OPTION_CALLBACK, 0, "directory", NULL, "root",
+ "prepend <root> to all filenames",
+ 0, option_parse_directory },
+ OPT_END()
+ };
+
prefix = setup_git_directory_gently(&is_not_gitdir);
prefix_length = prefix ? strlen(prefix) : 0;
git_config(git_apply_config, NULL);
if (apply_default_whitespace)
parse_whitespace_option(apply_default_whitespace);
- for (i = 1; i < argc; i++) {
+ argc = parse_options(argc, argv, builtin_apply_options,
+ apply_usage, 0);
+ if (apply_with_reject)
+ apply = apply_verbosely = 1;
+ if (!force_apply && (diffstat || numstat || summary || check || fake_ancestor))
+ apply = 0;
+ if (check_index && is_not_gitdir)
+ die("--index outside a repository");
+ if (cached) {
+ if (is_not_gitdir)
+ die("--cached outside a repository");
+ check_index = 1;
+ }
+ for (i = 0; i < argc; i++) {
const char *arg = argv[i];
- char *end;
int fd;
if (!strcmp(arg, "-")) {
errs |= apply_patch(0, "<stdin>", options);
read_stdin = 0;
continue;
- }
- if (!prefixcmp(arg, "--exclude=")) {
- add_name_limit(arg + 10, 1);
- continue;
- }
- if (!prefixcmp(arg, "--include=")) {
- add_name_limit(arg + 10, 0);
- has_include = 1;
- continue;
- }
- if (!prefixcmp(arg, "-p")) {
- p_value = atoi(arg + 2);
- p_value_known = 1;
- continue;
- }
- if (!strcmp(arg, "--no-add")) {
- no_add = 1;
- continue;
- }
- if (!strcmp(arg, "--stat")) {
- apply = 0;
- diffstat = 1;
- continue;
- }
- if (!strcmp(arg, "--allow-binary-replacement") ||
- !strcmp(arg, "--binary")) {
- continue; /* now no-op */
- }
- if (!strcmp(arg, "--numstat")) {
- apply = 0;
- numstat = 1;
- continue;
- }
- if (!strcmp(arg, "--summary")) {
- apply = 0;
- summary = 1;
- continue;
- }
- if (!strcmp(arg, "--check")) {
- apply = 0;
- check = 1;
- continue;
- }
- if (!strcmp(arg, "--index")) {
- if (is_not_gitdir)
- die("--index outside a repository");
- check_index = 1;
- continue;
- }
- if (!strcmp(arg, "--cached")) {
- if (is_not_gitdir)
- die("--cached outside a repository");
- check_index = 1;
- cached = 1;
- continue;
- }
- if (!strcmp(arg, "--apply")) {
- apply = 1;
- continue;
- }
- if (!strcmp(arg, "--build-fake-ancestor")) {
- apply = 0;
- if (++i >= argc)
- die ("need a filename");
- fake_ancestor = argv[i];
- continue;
- }
- if (!strcmp(arg, "-z")) {
- line_termination = 0;
- continue;
- }
- if (!prefixcmp(arg, "-C")) {
- p_context = strtoul(arg + 2, &end, 0);
- if (*end != '\0')
- die("unrecognized context count '%s'", arg + 2);
- continue;
- }
- if (!prefixcmp(arg, "--whitespace=")) {
- whitespace_option = arg + 13;
- parse_whitespace_option(arg + 13);
- continue;
- }
- if (!strcmp(arg, "-R") || !strcmp(arg, "--reverse")) {
- apply_in_reverse = 1;
- continue;
- }
- if (!strcmp(arg, "--unidiff-zero")) {
- unidiff_zero = 1;
- continue;
- }
- if (!strcmp(arg, "--reject")) {
- apply = apply_with_reject = apply_verbosely = 1;
- continue;
- }
- if (!strcmp(arg, "-v") || !strcmp(arg, "--verbose")) {
- apply_verbosely = 1;
- continue;
- }
- if (!strcmp(arg, "--inaccurate-eof")) {
- options |= INACCURATE_EOF;
- continue;
- }
- if (!strcmp(arg, "--recount")) {
- options |= RECOUNT;
- continue;
- }
- if (!prefixcmp(arg, "--directory=")) {
- arg += strlen("--directory=");
- root_len = strlen(arg);
- if (root_len && arg[root_len - 1] != '/') {
- char *new_root;
- root = new_root = xmalloc(root_len + 2);
- strcpy(new_root, arg);
- strcpy(new_root + root_len++, "/");
- } else
- root = arg;
- continue;
- }
- if (0 < prefix_length)
+ } else if (0 < prefix_length)
arg = prefix_filename(prefix, prefix_length, arg);
fd = open(arg, O_RDONLY);
static int post_checkout_hook(struct commit *old, struct commit *new,
int changed)
{
- struct child_process proc;
- const char *name = git_path("hooks/post-checkout");
- const char *argv[5];
+ return run_hook(NULL, "post-checkout",
+ sha1_to_hex(old ? old->object.sha1 : null_sha1),
+ sha1_to_hex(new ? new->object.sha1 : null_sha1),
+ changed ? "1" : "0", NULL);
+ /* "new" can be NULL when checking out from the index before
+ a commit exists. */
- if (access(name, X_OK) < 0)
- return 0;
-
- memset(&proc, 0, sizeof(proc));
- argv[0] = name;
- argv[1] = xstrdup(sha1_to_hex(old ? old->object.sha1 : null_sha1));
- argv[2] = xstrdup(sha1_to_hex(new->object.sha1));
- argv[3] = changed ? "1" : "0";
- argv[4] = NULL;
- proc.argv = argv;
- proc.no_stdin = 1;
- proc.stdout_to_stderr = 1;
- return run_command(&proc);
}
static int update_some(const unsigned char *sha1, const char *base, int baselen,
for (pos = 0; pos < active_nr; pos++) {
struct cache_entry *ce = active_cache[pos];
- pathspec_match(pathspec, ps_matched, ce->name, 0);
+ match_pathspec(pathspec, ce->name, ce_namelen(ce), 0, ps_matched);
}
if (report_path_error(ps_matched, pathspec, 0))
/* Any unmerged paths? */
for (pos = 0; pos < active_nr; pos++) {
struct cache_entry *ce = active_cache[pos];
- if (pathspec_match(pathspec, NULL, ce->name, 0)) {
+ if (match_pathspec(pathspec, ce->name, ce_namelen(ce), 0, NULL)) {
if (!ce_stage(ce))
continue;
if (opts->force) {
state.refresh_cache = 1;
for (pos = 0; pos < active_nr; pos++) {
struct cache_entry *ce = active_cache[pos];
- if (pathspec_match(pathspec, NULL, ce->name, 0)) {
+ if (match_pathspec(pathspec, ce->name, ce_namelen(ce), 0, NULL)) {
if (!ce_stage(ce)) {
errs |= checkout_entry(ce, &state, NULL);
continue;
struct stat buf;
const char *repo_name, *repo, *work_tree, *git_dir;
char *path, *dir;
+ int dest_exists;
const struct ref *refs, *head_points_at, *remote_head, *mapped_refs;
struct strbuf key = STRBUF_INIT, value = STRBUF_INIT;
struct strbuf branch_top = STRBUF_INIT, reflog_msg = STRBUF_INIT;
dir = guess_dir_name(repo_name, is_bundle, option_bare);
strip_trailing_slashes(dir);
- if (!stat(dir, &buf))
- die("destination directory '%s' already exists.", dir);
+ dest_exists = !stat(dir, &buf);
+ if (dest_exists && !is_empty_dir(dir))
+ die("destination path '%s' already exists and is not "
+ "an empty directory.", dir);
strbuf_addf(&reflog_msg, "clone: from %s", repo);
if (safe_create_leading_directories_const(work_tree) < 0)
die("could not create leading directories of '%s': %s",
work_tree, strerror(errno));
- if (mkdir(work_tree, 0755))
+ if (!dest_exists && mkdir(work_tree, 0755))
die("could not create work tree dir '%s': %s.",
work_tree, strerror(errno));
set_git_work_tree(work_tree);
option_upload_pack);
refs = transport_get_remote_refs(transport);
- transport_fetch_refs(transport, refs);
+ if (refs)
+ transport_fetch_refs(transport, refs);
}
- clear_extra_refs();
+ if (refs) {
+ clear_extra_refs();
- mapped_refs = write_remote_refs(refs, &refspec, reflog_msg.buf);
+ mapped_refs = write_remote_refs(refs, &refspec, reflog_msg.buf);
- head_points_at = locate_head(refs, mapped_refs, &remote_head);
+ head_points_at = locate_head(refs, mapped_refs, &remote_head);
+ } else {
+ warning("You appear to have cloned an empty repository.");
+ head_points_at = NULL;
+ remote_head = NULL;
+ option_no_checkout = 1;
+ }
if (head_points_at) {
/* Local default branch link */
struct cache_entry *ce = active_cache[i];
if (ce->ce_flags & CE_UPDATE)
continue;
- if (!pathspec_match(pattern, m, ce->name, 0))
+ if (!match_pathspec(pattern, ce->name, ce_namelen(ce), 0, m))
continue;
string_list_insert(ce->name, list);
}
return s.commitable;
}
-static int run_hook(const char *index_file, const char *name, ...)
-{
- struct child_process hook;
- const char *argv[10], *env[2];
- char index[PATH_MAX];
- va_list args;
- int i;
-
- va_start(args, name);
- argv[0] = git_path("hooks/%s", name);
- i = 0;
- do {
- if (++i >= ARRAY_SIZE(argv))
- die ("run_hook(): too many arguments");
- argv[i] = va_arg(args, const char *);
- } while (argv[i]);
- va_end(args);
-
- snprintf(index, sizeof(index), "GIT_INDEX_FILE=%s", index_file);
- env[0] = index;
- env[1] = NULL;
-
- if (access(argv[0], X_OK) < 0)
- return 0;
-
- memset(&hook, 0, sizeof(hook));
- hook.argv = argv;
- hook.no_stdin = 1;
- hook.stdout_to_stderr = 1;
- hook.env = env;
-
- return run_command(&hook);
-}
-
static int is_a_merge(const unsigned char *sha1)
{
struct commit *commit = lookup_commit(sha1);
if (!commitable && !in_merge && !allow_empty &&
!(amend && is_a_merge(head_sha1))) {
run_status(stdout, index_file, prefix, 0);
- unlink(commit_editmsg);
return 0;
}
if (wt_status_use_color == -1)
wt_status_use_color = git_use_color_default;
+ if (diff_use_color_default == -1)
+ diff_use_color_default = git_use_color_default;
+
argc = parse_and_validate_options(argc, argv, builtin_status_usage, prefix);
index_file = prepare_index(argc, argv, prefix);
{
struct rev_info rev;
struct commit *commit;
- static const char *format = "format:%h: \"%s\"";
+ static const char *format = "format:%h] %s";
unsigned char junk_sha1[20];
const char *head = resolve_ref("HEAD", junk_sha1, 0, NULL);
rev.diffopt.break_opt = 0;
diff_setup_done(&rev.diffopt);
- printf("[%s%s]: created ",
+ printf("[%s%s ",
!prefixcmp(head, "refs/heads/") ?
head + 11 :
!strcmp(head, "HEAD") ?
git_config(git_commit_config, NULL);
+ if (wt_status_use_color == -1)
+ wt_status_use_color = git_use_color_default;
+
argc = parse_and_validate_options(argc, argv, builtin_commit_usage, prefix);
index_file = prepare_index(argc, argv, prefix);
*/
#include "cache.h"
+#include "dir.h"
#include "builtin.h"
#include "parse-options.h"
const char *cp;
int bad = 0;
- if ((ent->d_name[0] == '.') &&
- (ent->d_name[1] == 0 ||
- ((ent->d_name[1] == '.') && (ent->d_name[2] == 0))))
+ if (is_dot_or_dotdot(ent->d_name))
continue;
for (cp = ent->d_name; *cp; cp++) {
int ch = *cp;
#include "tree-walk.h"
#include "fsck.h"
#include "parse-options.h"
+#include "dir.h"
#define REACHABLE 0x0001
#define SEEN 0x0002
while ((de = readdir(dir)) != NULL) {
char name[100];
unsigned char sha1[20];
- int len = strlen(de->d_name);
- switch (len) {
- case 2:
- if (de->d_name[1] != '.')
- break;
- case 1:
- if (de->d_name[0] != '.')
- break;
+ if (is_dot_or_dotdot(de->d_name))
continue;
- case 38:
+ if (strlen(de->d_name) == 38) {
sprintf(name, "%02x", i);
- memcpy(name+2, de->d_name, len+1);
+ memcpy(name+2, de->d_name, 39);
if (get_sha1_hex(name, sha1) < 0)
break;
add_sha1_list(sha1, DIRENT_SORT_HINT(de));
}
heads = 0;
- for (i = 1; i < argc; i++) {
+ for (i = 0; i < argc; i++) {
const char *arg = argv[i];
if (!get_sha1(arg, head_sha1)) {
struct object *obj = lookup_object(head_sha1);
return gc_auto_pack_limit <= cnt;
}
-static int run_hook(void)
-{
- const char *argv[2];
- struct child_process hook;
- int ret;
-
- argv[0] = git_path("hooks/pre-auto-gc");
- argv[1] = NULL;
-
- if (access(argv[0], X_OK) < 0)
- return 0;
-
- memset(&hook, 0, sizeof(hook));
- hook.argv = argv;
- hook.no_stdin = 1;
- hook.stdout_to_stderr = 1;
-
- ret = start_command(&hook);
- if (ret) {
- warning("Could not spawn %s", argv[0]);
- return ret;
- }
- ret = finish_command(&hook);
- if (ret == -ERR_RUN_COMMAND_WAITPID_SIGNAL)
- warning("%s exited due to uncaught signal", argv[0]);
- return ret;
-}
-
static int need_to_gc(void)
{
/*
else if (!too_many_loose_objects())
return 0;
- if (run_hook())
+ if (run_hook(NULL, "pre-auto-gc", NULL))
return 0;
return 1;
}
static FILE *realstdout = NULL;
static const char *output_directory = NULL;
+static int outdir_offset;
static int reopen_stdout(const char *oneline, int nr, int total)
{
strcpy(filename + len, fmt_patch_suffix);
}
- fprintf(realstdout, "%s\n", filename);
+ fprintf(realstdout, "%s\n", filename + outdir_offset);
if (freopen(filename, "w", stdout) == NULL)
return error("Cannot open patch file %s",filename);
return xmemdupz(a, z - a);
}
+static const char *set_outdir(const char *prefix, const char *output_directory)
+{
+ if (output_directory && is_absolute_path(output_directory))
+ return output_directory;
+
+ if (!prefix || !*prefix) {
+ if (output_directory)
+ return output_directory;
+ /* The user did not explicitly ask for "./" */
+ outdir_offset = 2;
+ return "./";
+ }
+
+ outdir_offset = strlen(prefix);
+ if (!output_directory)
+ return prefix;
+
+ return xstrdup(prefix_filename(prefix, outdir_offset,
+ output_directory));
+}
+
int cmd_format_patch(int argc, const char **argv, const char *prefix)
{
struct commit *commit;
if (!DIFF_OPT_TST(&rev.diffopt, TEXT) && !no_binary_diff)
DIFF_OPT_SET(&rev.diffopt, BINARY);
- if (!output_directory && !use_stdout)
- output_directory = prefix;
+ if (!use_stdout)
+ output_directory = set_outdir(prefix, output_directory);
if (output_directory) {
if (use_stdout)
* get_revision() to do the usual traversal.
*/
}
+
+ /*
+ * We cannot move this anywhere earlier because we do want to
+ * know if --root was given explicitly from the command line.
+ */
+ rev.show_root_diff = 1;
+
if (cover_letter) {
/* remember the range */
int i;
static const char *tag_killed = "";
static const char *tag_modified = "";
-
-/*
- * Match a pathspec against a filename. The first "skiplen" characters
- * are the common prefix
- */
-int pathspec_match(const char **spec, char *ps_matched,
- const char *filename, int skiplen)
-{
- const char *m;
-
- while ((m = *spec++) != NULL) {
- int matchlen = strlen(m + skiplen);
-
- if (!matchlen)
- goto matched;
- if (!strncmp(m + skiplen, filename + skiplen, matchlen)) {
- if (m[skiplen + matchlen - 1] == '/')
- goto matched;
- switch (filename[skiplen + matchlen]) {
- case '/': case '\0':
- goto matched;
- }
- }
- if (!fnmatch(m + skiplen, filename + skiplen, 0))
- goto matched;
- if (ps_matched)
- ps_matched++;
- continue;
- matched:
- if (ps_matched)
- *ps_matched = 1;
- return 1;
- }
- return 0;
-}
-
static void show_dir_entry(const char *tag, struct dir_entry *ent)
{
int len = prefix_len;
if (len >= ent->len)
die("git ls-files: internal error - directory entry not superset of prefix");
- if (pathspec && !pathspec_match(pathspec, ps_matched, ent->name, len))
+ if (!match_pathspec(pathspec, ent->name, ent->len, len, ps_matched))
return;
fputs(tag, stdout);
if (len >= ce_namelen(ce))
die("git ls-files: internal error - cache entry not superset of prefix");
- if (pathspec && !pathspec_match(pathspec, ps_matched, ce->name, len))
+ if (!match_pathspec(pathspec, ce->name, ce_namelen(ce), len, ps_matched))
return;
if (tag && *tag && show_valid_bit &&
strbuf_release(&out);
}
-static int run_hook(const char *name)
-{
- struct child_process hook;
- const char *argv[3], *env[2];
- char index[PATH_MAX];
-
- argv[0] = git_path("hooks/%s", name);
- if (access(argv[0], X_OK) < 0)
- return 0;
-
- snprintf(index, sizeof(index), "GIT_INDEX_FILE=%s", get_index_file());
- env[0] = index;
- env[1] = NULL;
-
- if (squash)
- argv[1] = "1";
- else
- argv[1] = "0";
- argv[2] = NULL;
-
- memset(&hook, 0, sizeof(hook));
- hook.argv = argv;
- hook.no_stdin = 1;
- hook.stdout_to_stderr = 1;
- hook.env = env;
-
- return run_command(&hook);
-}
-
static void finish(const unsigned char *new_head, const char *msg)
{
struct strbuf reflog_message = STRBUF_INIT;
}
/* Run a post-merge hook */
- run_hook("post-merge");
+ run_hook(NULL, "post-merge", squash ? "1" : "0", NULL);
strbuf_release(&reflog_message);
}
int st;
memset(&stream, 0, sizeof(stream));
- inflateInit(&stream);
+ git_inflate_init(&stream);
do {
in = use_pack(p, w_curs, offset, &stream.avail_in);
stream.next_in = in;
stream.next_out = fakebuf;
stream.avail_out = sizeof(fakebuf);
- st = inflate(&stream, Z_FINISH);
+ st = git_inflate(&stream, Z_FINISH);
offset += stream.next_in - in;
} while (st == Z_OK || st == Z_BUF_ERROR);
- inflateEnd(&stream);
+ git_inflate_end(&stream);
return (st == Z_STREAM_END &&
stream.total_out == expect &&
stream.total_in == len) ? 0 : -1;
#include "builtin.h"
#include "reachable.h"
#include "parse-options.h"
+#include "dir.h"
static const char * const prune_usage[] = {
"git prune [-n] [-v] [--expire <time>] [--] [<head>...]",
while ((de = readdir(dir)) != NULL) {
char name[100];
unsigned char sha1[20];
- int len = strlen(de->d_name);
- switch (len) {
- case 2:
- if (de->d_name[1] != '.')
- break;
- case 1:
- if (de->d_name[0] != '.')
- break;
+ if (is_dot_or_dotdot(de->d_name))
continue;
- case 38:
+ if (strlen(de->d_name) == 38) {
sprintf(name, "%02x", i);
- memcpy(name+2, de->d_name, len+1);
+ memcpy(name+2, de->d_name, 39);
if (get_sha1_hex(name, sha1) < 0)
break;
}
}
-static int run_hook(const char *hook_name)
+static int run_receive_hook(const char *hook_name)
{
static char buf[sizeof(commands->old_sha1) * 2 + PATH_MAX + 4];
struct command *cmd;
return;
}
- if (run_hook(pre_receive_hook)) {
+ if (run_receive_hook(pre_receive_hook)) {
while (cmd) {
cmd->error_string = "pre-receive hook declined";
cmd = cmd->next;
unlink(pack_lockfile);
if (report_status)
report(unpack_status);
- run_hook(post_receive_hook);
+ run_receive_hook(post_receive_hook);
run_update_post_hook(commands);
}
return 0;
#include "builtin.h"
#include "cache.h"
+#include "dir.h"
#include "string-list.h"
#include "rerere.h"
#include "xdiff/xdiff.h"
git_config(git_rerere_gc_config, NULL);
dir = opendir(git_path("rr-cache"));
while ((e = readdir(dir))) {
- const char *name = e->d_name;
- if (name[0] == '.' &&
- (name[1] == '\0' || (name[1] == '.' && name[2] == '\0')))
+ if (is_dot_or_dotdot(e->d_name))
continue;
- then = rerere_created_at(name);
+ then = rerere_created_at(e->d_name);
if (!then)
continue;
- cutoff = (has_resolution(name)
+ cutoff = (has_resolution(e->d_name)
? cutoff_resolve : cutoff_noresolve);
if (then < now - cutoff * 86400)
- string_list_append(name, &to_remove);
+ string_list_append(e->d_name, &to_remove);
}
for (i = 0; i < to_remove.nr; i++)
unlink_rr_item(to_remove.items[i].string);
/* .receivepack = */ "git-receive-pack",
};
+static int feed_object(const unsigned char *sha1, int fd, int negative)
+{
+ char buf[42];
+
+ if (negative && !has_sha1_file(sha1))
+ return 1;
+
+ memcpy(buf + negative, sha1_to_hex(sha1), 40);
+ if (negative)
+ buf[0] = '^';
+ buf[40 + negative] = '\n';
+ return write_or_whine(fd, buf, 41 + negative, "send-pack: send refs");
+}
+
/*
* Make a pack stream and spit it out into file descriptor fd
*/
};
struct child_process po;
int i;
- char buf[42];
if (args.use_thin_pack)
argv[4] = "--thin";
* We feed the pack-objects we just spawned with revision
* parameters by writing to the pipe.
*/
- for (i = 0; i < extra->nr; i++) {
- memcpy(buf + 1, sha1_to_hex(&extra->array[i][0]), 40);
- buf[0] = '^';
- buf[41] = '\n';
- if (!write_or_whine(po.in, buf, 42, "send-pack: send refs"))
+ for (i = 0; i < extra->nr; i++)
+ if (!feed_object(extra->array[i], po.in, 1))
break;
- }
while (refs) {
if (!is_null_sha1(refs->old_sha1) &&
- has_sha1_file(refs->old_sha1)) {
- memcpy(buf + 1, sha1_to_hex(refs->old_sha1), 40);
- buf[0] = '^';
- buf[41] = '\n';
- if (!write_or_whine(po.in, buf, 42,
- "send-pack: send refs"))
- break;
- }
- if (!is_null_sha1(refs->new_sha1)) {
- memcpy(buf, sha1_to_hex(refs->new_sha1), 40);
- buf[40] = '\n';
- if (!write_or_whine(po.in, buf, 41,
- "send-pack: send refs"))
- break;
- }
+ !feed_object(refs->old_sha1, po.in, 1))
+ break;
+ if (!is_null_sha1(refs->new_sha1) &&
+ !feed_object(refs->new_sha1, po.in, 0))
+ break;
refs = refs->next;
}
return -1;
}
+const char *format_subject(struct strbuf *sb, const char *msg,
+ const char *line_separator);
+
static void insert_one_record(struct shortlog *log,
const char *author,
const char *oneline)
size_t len;
const char *eol;
const char *boemail, *eoemail;
+ struct strbuf subject = STRBUF_INIT;
boemail = strchr(author, '<');
if (!boemail)
while (*oneline && isspace(*oneline) && *oneline != '\n')
oneline++;
len = eol - oneline;
- while (len && isspace(oneline[len-1]))
- len--;
- buffer = xmemdupz(oneline, len);
+ format_subject(&subject, oneline, " ");
+ buffer = strbuf_detach(&subject, NULL);
if (dot3) {
int dot3len = strlen(dot3);
stream.avail_out = size;
stream.next_in = fill(1);
stream.avail_in = len;
- inflateInit(&stream);
+ git_inflate_init(&stream);
for (;;) {
- int ret = inflate(&stream, 0);
+ int ret = git_inflate(&stream, 0);
use(len - stream.avail_in);
if (stream.total_out == size && ret == Z_STREAM_END)
break;
stream.next_in = fill(1);
stream.avail_in = len;
}
- inflateEnd(&stream);
+ git_inflate_end(&stream);
return buf;
}
return error("unrecognized argument: '%s'", argv[i]);
}
+ object_array_remove_duplicates(&revs.pending);
+
for (i = 0; i < revs.pending.nr; i++) {
struct object_array_entry *e = revs.pending.objects + i;
unsigned char sha1[20];
#define deflateBound(c,s) ((s) + (((s) + 7) >> 3) + (((s) + 63) >> 6) + 11)
#endif
+void git_inflate_init(z_streamp strm);
+void git_inflate_end(z_streamp strm);
+int git_inflate(z_streamp strm, int flush);
+
#if defined(DT_UNKNOWN) && !defined(NO_D_TYPE_IN_DIRENT)
#define DTYPE(de) ((de)->d_type)
#else
extern int checkout_entry(struct cache_entry *ce, const struct checkout *state, char *topath);
extern int has_symlink_leading_path(int len, const char *name);
+extern int has_symlink_or_noent_leading_path(int len, const char *name);
+extern int has_dirs_only_path(int len, const char *name, int prefix_len);
+extern void invalidate_lstat_cache(int len, const char *name);
+extern void clear_lstat_cache(void);
extern struct alternate_object_database {
struct alternate_object_database *next;
extern int ws_blank_line(const char *line, int len, unsigned ws_rule);
/* ls-files */
-int pathspec_match(const char **spec, char *matched, const char *filename, int skiplen);
int report_path_error(const char *ps_matched, const char **pathspec, int prefix_offset);
void overlay_tree_on_cache(const char *tree_name, const char *prefix);
}
void color_parse(const char *value, const char *var, char *dst)
+{
+ color_parse_mem(value, strlen(value), var, dst);
+}
+
+void color_parse_mem(const char *value, int value_len, const char *var,
+ char *dst)
{
const char *ptr = value;
+ int len = value_len;
int attr = -1;
int fg = -2;
int bg = -2;
- if (!strcasecmp(value, "reset")) {
+ if (!strncasecmp(value, "reset", len)) {
strcpy(dst, "\033[m");
return;
}
/* [fg [bg]] [attr] */
- while (*ptr) {
+ while (len > 0) {
const char *word = ptr;
- int val, len = 0;
+ int val, wordlen = 0;
- while (word[len] && !isspace(word[len]))
- len++;
+ while (len > 0 && !isspace(word[wordlen])) {
+ wordlen++;
+ len--;
+ }
- ptr = word + len;
- while (*ptr && isspace(*ptr))
+ ptr = word + wordlen;
+ while (len > 0 && isspace(*ptr)) {
ptr++;
+ len--;
+ }
- val = parse_color(word, len);
+ val = parse_color(word, wordlen);
if (val >= -1) {
if (fg == -2) {
fg = val;
}
goto bad;
}
- val = parse_attr(word, len);
+ val = parse_attr(word, wordlen);
if (val < 0 || attr != -1)
goto bad;
attr = val;
*dst = 0;
return;
bad:
- die("bad config value '%s' for variable '%s'", value, var);
+ die("bad color value '%.*s' for variable '%s'", value_len, value, var);
}
int git_config_colorbool(const char *var, const char *value, int stdout_is_tty)
va_end(args);
return r;
}
+
+/*
+ * This function splits the buffer by newlines and colors the lines individually.
+ *
+ * Returns 0 on success.
+ */
+int color_fwrite_lines(FILE *fp, const char *color,
+ size_t count, const char *buf)
+{
+ if (!*color)
+ return fwrite(buf, count, 1, fp) != 1;
+ while (count) {
+ char *p = memchr(buf, '\n', count);
+ if (p != buf && (fputs(color, fp) < 0 ||
+ fwrite(buf, p ? p - buf : count, 1, fp) != 1 ||
+ fputs(COLOR_RESET, fp) < 0))
+ return -1;
+ if (!p)
+ return 0;
+ if (fputc('\n', fp) < 0)
+ return -1;
+ count -= p + 1 - buf;
+ buf = p + 1;
+ }
+ return 0;
+}
+
+
int git_color_default_config(const char *var, const char *value, void *cb);
int git_config_colorbool(const char *var, const char *value, int stdout_is_tty);
-void color_parse(const char *var, const char *value, char *dst);
+void color_parse(const char *value, const char *var, char *dst);
+void color_parse_mem(const char *value, int len, const char *var, char *dst);
int color_fprintf(FILE *fp, const char *color, const char *fmt, ...);
int color_fprintf_ln(FILE *fp, const char *color, const char *fmt, ...);
+int color_fwrite_lines(FILE *fp, const char *color, size_t count, const char *buf);
#endif /* COLOR_H */
FREAD_READS_DIRECTORIES=@FREAD_READS_DIRECTORIES@
SNPRINTF_RETURNS_BOGUS=@SNPRINTF_RETURNS_BOGUS@
NO_PTHREADS=@NO_PTHREADS@
+THREADED_DELTA_SEARCH=@THREADED_DELTA_SEARCH@
PTHREAD_LIBS=@PTHREAD_LIBS@
#
AC_PROG_CC([cc gcc])
# which switch to pass runtime path to dynamic libraries to the linker
-AC_CACHE_CHECK([if linker supports -R], ld_dashr, [
+AC_CACHE_CHECK([if linker supports -R], git_cv_ld_dashr, [
SAVE_LDFLAGS="${LDFLAGS}"
LDFLAGS="${SAVE_LDFLAGS} -R /"
- AC_LINK_IFELSE(AC_LANG_PROGRAM([], []), [ld_dashr=yes], [ld_dashr=no])
+ AC_LINK_IFELSE(AC_LANG_PROGRAM([], []), [git_cv_ld_dashr=yes], [git_cv_ld_dashr=no])
LDFLAGS="${SAVE_LDFLAGS}"
])
-if test "$ld_dashr" = "yes"; then
+if test "$git_cv_ld_dashr" = "yes"; then
AC_SUBST(CC_LD_DYNPATH, [-R])
else
- AC_CACHE_CHECK([if linker supports -Wl,-rpath,], ld_wl_rpath, [
+ AC_CACHE_CHECK([if linker supports -Wl,-rpath,], git_cv_ld_wl_rpath, [
SAVE_LDFLAGS="${LDFLAGS}"
LDFLAGS="${SAVE_LDFLAGS} -Wl,-rpath,/"
- AC_LINK_IFELSE(AC_LANG_PROGRAM([], []), [ld_wl_rpath=yes], [ld_wl_rpath=no])
+ AC_LINK_IFELSE(AC_LANG_PROGRAM([], []), [git_cv_ld_wl_rpath=yes], [git_cv_ld_wl_rpath=no])
LDFLAGS="${SAVE_LDFLAGS}"
])
- if test "$ld_wl_rpath" = "yes"; then
+ if test "$git_cv_ld_wl_rpath" = "yes"; then
AC_SUBST(CC_LD_DYNPATH, [-Wl,-rpath,])
else
- AC_CACHE_CHECK([if linker supports -rpath], ld_rpath, [
+ AC_CACHE_CHECK([if linker supports -rpath], git_cv_ld_rpath, [
SAVE_LDFLAGS="${LDFLAGS}"
LDFLAGS="${SAVE_LDFLAGS} -rpath /"
- AC_LINK_IFELSE(AC_LANG_PROGRAM([], []), [ld_rpath=yes], [ld_rpath=no])
+ AC_LINK_IFELSE(AC_LANG_PROGRAM([], []), [git_cv_ld_rpath=yes], [git_cv_ld_rpath=no])
LDFLAGS="${SAVE_LDFLAGS}"
])
- if test "$ld_rpath" = "yes"; then
+ if test "$git_cv_ld_rpath" = "yes"; then
AC_SUBST(CC_LD_DYNPATH, [-rpath])
else
AC_MSG_WARN([linker does not support runtime path to dynamic libraries])
#
# Define NO_PTHREADS if we do not have pthreads
#
-# Define PTHREAD_LIBS to the linker flag used for Pthread support.
+# Define PTHREAD_LIBS to the linker flag used for Pthread support and define
+# THREADED_DELTA_SEARCH if Pthreads are available.
AC_LANG_CONFTEST([AC_LANG_PROGRAM(
[[#include <pthread.h>]],
[[pthread_mutex_t test_mutex;]]
${CC} -pthread conftest.c -o conftest.o > /dev/null 2>&1
if test $? -eq 0;then
PTHREAD_LIBS="-pthread"
+ THREADED_DELTA_SEARCH=YesPlease
else
${CC} -lpthread conftest.c -o conftest.o > /dev/null 2>&1
if test $? -eq 0;then
PTHREAD_LIBS="-lpthread"
+ THREADED_DELTA_SEARCH=YesPlease
else
NO_PTHREADS=UnfortunatelyYes
fi
fi
AC_SUBST(PTHREAD_LIBS)
AC_SUBST(NO_PTHREADS)
+AC_SUBST(THREADED_DELTA_SEARCH)
## Site configuration (override autodetection)
## --with-PACKAGE[=ARG] and --without-PACKAGE
done
case "${COMP_WORDS[COMP_CWORD]}" in
- --*=*) COMPREPLY=() ;;
--*)
__gitcomp "
--color --no-color --verbose --abbrev= --no-abbrev
__gitcomp "$(__git_refs)"
}
-_git_diff ()
-{
- __git_has_doubledash && return
-
- local cur="${COMP_WORDS[COMP_CWORD]}"
- case "$cur" in
- --*)
- __gitcomp "--cached --stat --numstat --shortstat --summary
+__git_diff_common_options="--stat --numstat --shortstat --summary
--patch-with-stat --name-only --name-status --color
--no-color --color-words --no-renames --check
--full-index --binary --abbrev --diff-filter=
- --find-copies-harder --pickaxe-all --pickaxe-regex
+ --find-copies-harder
--text --ignore-space-at-eol --ignore-space-change
--ignore-all-space --exit-code --quiet --ext-diff
--no-ext-diff
--no-prefix --src-prefix= --dst-prefix=
- --base --ours --theirs
--inter-hunk-context=
+ --patience
+ --raw
+"
+
+_git_diff ()
+{
+ __git_has_doubledash && return
+
+ local cur="${COMP_WORDS[COMP_CWORD]}"
+ case "$cur" in
+ --*)
+ __gitcomp "--cached --pickaxe-all --pickaxe-regex
+ --base --ours --theirs
+ $__git_diff_common_options
"
return
;;
--not --all
--cover-letter
--no-prefix --src-prefix= --dst-prefix=
+ --inline --suffix= --ignore-if-in-upstream
+ --subject-prefix=
"
return
;;
__git_complete_file
}
+__git_log_pretty_formats="oneline short medium full fuller email raw format:"
+
_git_log ()
{
__git_has_doubledash && return
local cur="${COMP_WORDS[COMP_CWORD]}"
case "$cur" in
--pretty=*)
- __gitcomp "
- oneline short medium full fuller email raw
+ __gitcomp "$__git_log_pretty_formats
" "" "${cur##--pretty=}"
return
;;
--relative-date --date=
--author= --committer= --grep=
--all-match
- --pretty= --name-status --name-only --raw
+ --pretty=
--not --all
--left-right --cherry-pick
--graph
- --stat --numstat --shortstat
- --decorate --diff-filter=
- --color-words --walk-reflogs
+ --decorate
+ --walk-reflogs
--parents --children --full-history
--merge
- --inter-hunk-context=
+ $__git_diff_common_options
+ --pickaxe-all --pickaxe-regex
"
return
;;
_git_remote ()
{
- local subcommands="add rm show prune update"
+ local subcommands="add rename rm show prune update"
local subcommand="$(__git_find_subcommand "$subcommands")"
if [ -z "$subcommand" ]; then
__gitcomp "$subcommands"
fi
case "$subcommand" in
- rm|show|prune)
+ rename|rm|show|prune)
__gitcomp "$(__git_remotes)"
;;
update)
local cur="${COMP_WORDS[COMP_CWORD]}"
case "$cur" in
--pretty=*)
- __gitcomp "
- oneline short medium full fuller email raw
+ __gitcomp "$__git_log_pretty_formats
" "" "${cur##--pretty=}"
return
;;
--*)
- __gitcomp "--pretty="
+ __gitcomp "--pretty=
+ $__git_diff_common_options
+ "
return
;;
esac
--follow-parent --authors-file= --repack=
--no-metadata --use-svm-props --use-svnsync-props
--log-window-size= --no-checkout --quiet
- --repack-flags --user-log-author $remote_opts
+ --repack-flags --user-log-author --localtime $remote_opts
"
local init_opts="
--template= --shared= --trunk= --tags=
if [ -z "$command" ]; then
case "${COMP_WORDS[COMP_CWORD]}" in
- --*=*) COMPREPLY=() ;;
--*) __gitcomp "
--paginate
--no-pager
--- /dev/null
+#!/usr/bin/env perl
+# Copyright (c) 2009 David Aguilar
+#
+# This is a wrapper around the GIT_EXTERNAL_DIFF-compatible
+# git-difftool-helper script. This script exports
+# GIT_EXTERNAL_DIFF and GIT_PAGER for use by git, and
+# GIT_DIFFTOOL_NO_PROMPT and GIT_MERGE_TOOL for use by git-difftool-helper.
+# Any arguments that are unknown to this script are forwarded to 'git diff'.
+
+use strict;
+use warnings;
+use Cwd qw(abs_path);
+use File::Basename qw(dirname);
+
+my $DIR = abs_path(dirname($0));
+
+
+sub usage
+{
+ print << 'USAGE';
+usage: git difftool [--tool=<tool>] [--no-prompt] ["git diff" options]
+USAGE
+ exit 1;
+}
+
+sub setup_environment
+{
+ $ENV{PATH} = "$DIR:$ENV{PATH}";
+ $ENV{GIT_PAGER} = '';
+ $ENV{GIT_EXTERNAL_DIFF} = 'git-difftool-helper';
+}
+
+sub exe
+{
+ my $exe = shift;
+ return defined $ENV{COMSPEC} ? "$exe.exe" : $exe;
+}
+
+sub generate_command
+{
+ my @command = (exe('git'), 'diff');
+ my $skip_next = 0;
+ my $idx = -1;
+ for my $arg (@ARGV) {
+ $idx++;
+ if ($skip_next) {
+ $skip_next = 0;
+ next;
+ }
+ if ($arg eq '-t' or $arg eq '--tool') {
+ usage() if $#ARGV <= $idx;
+ $ENV{GIT_MERGE_TOOL} = $ARGV[$idx + 1];
+ $skip_next = 1;
+ next;
+ }
+ if ($arg =~ /^--tool=/) {
+ $ENV{GIT_MERGE_TOOL} = substr($arg, 7);
+ next;
+ }
+ if ($arg eq '--no-prompt') {
+ $ENV{GIT_DIFFTOOL_NO_PROMPT} = 'true';
+ next;
+ }
+ if ($arg eq '-h' or $arg eq '--help') {
+ usage();
+ }
+ push @command, $arg;
+ }
+ return @command
+}
+
+setup_environment();
+exec(generate_command());
--- /dev/null
+#!/bin/sh
+# git-difftool-helper is a GIT_EXTERNAL_DIFF-compatible diff tool launcher.
+# It supports kdiff3, tkdiff, xxdiff, meld, opendiff, emerge, ecmerge,
+# vimdiff, gvimdiff, and custom user-configurable tools.
+# This script is typically launched by using the 'git difftool'
+# convenience command.
+#
+# Copyright (c) 2009 David Aguilar
+
+# Set GIT_DIFFTOOL_NO_PROMPT to bypass the per-file prompt.
+should_prompt () {
+ ! test -n "$GIT_DIFFTOOL_NO_PROMPT"
+}
+
+# Should we keep the backup .orig file?
+keep_backup_mode="$(git config --bool merge.keepBackup || echo true)"
+keep_backup () {
+ test "$keep_backup_mode" = "true"
+}
+
+# This function manages the backup .orig file.
+# A backup $MERGED.orig file is created if changes are detected.
+cleanup_temp_files () {
+ if test -n "$MERGED"; then
+ if keep_backup && test "$MERGED" -nt "$BACKUP"; then
+ test -f "$BACKUP" && mv -- "$BACKUP" "$MERGED.orig"
+ else
+ rm -f -- "$BACKUP"
+ fi
+ fi
+}
+
+# This is called when users Ctrl-C out of git-difftool-helper
+sigint_handler () {
+ cleanup_temp_files
+ exit 1
+}
+
+# This function prepares temporary files and launches the appropriate
+# merge tool.
+launch_merge_tool () {
+ # Merged is the filename as it appears in the work tree
+ # Local is the contents of a/filename
+ # Remote is the contents of b/filename
+ # Custom merge tool commands might use $BASE so we provide it
+ MERGED="$1"
+ LOCAL="$2"
+ REMOTE="$3"
+ BASE="$1"
+ ext="$$$(expr "$MERGED" : '.*\(\.[^/]*\)$')"
+ BACKUP="$MERGED.BACKUP.$ext"
+
+ # Create and ensure that we clean up $BACKUP
+ test -f "$MERGED" && cp -- "$MERGED" "$BACKUP"
+ trap sigint_handler INT
+
+ # $LOCAL and $REMOTE are temporary files so prompt
+ # the user with the real $MERGED name before launching $merge_tool.
+ if should_prompt; then
+ printf "\nViewing: '$MERGED'\n"
+ printf "Hit return to launch '%s': " "$merge_tool"
+ read ans
+ fi
+
+ # Run the appropriate merge tool command
+ case "$merge_tool" in
+ kdiff3)
+ basename=$(basename "$MERGED")
+ "$merge_tool_path" --auto \
+ --L1 "$basename (A)" \
+ --L2 "$basename (B)" \
+ -o "$MERGED" "$LOCAL" "$REMOTE" \
+ > /dev/null 2>&1
+ ;;
+
+ tkdiff)
+ "$merge_tool_path" -o "$MERGED" "$LOCAL" "$REMOTE"
+ ;;
+
+ meld)
+ "$merge_tool_path" "$LOCAL" "$REMOTE"
+ ;;
+
+ vimdiff)
+ "$merge_tool_path" -c "wincmd l" "$LOCAL" "$REMOTE"
+ ;;
+
+ gvimdiff)
+ "$merge_tool_path" -c "wincmd l" -f "$LOCAL" "$REMOTE"
+ ;;
+
+ xxdiff)
+ "$merge_tool_path" \
+ -X \
+ -R 'Accel.SaveAsMerged: "Ctrl-S"' \
+ -R 'Accel.Search: "Ctrl+F"' \
+ -R 'Accel.SearchForward: "Ctrl-G"' \
+ --merged-file "$MERGED" \
+ "$LOCAL" "$REMOTE"
+ ;;
+
+ opendiff)
+ "$merge_tool_path" "$LOCAL" "$REMOTE" \
+ -merge "$MERGED" | cat
+ ;;
+
+ ecmerge)
+ "$merge_tool_path" "$LOCAL" "$REMOTE" \
+ --default --mode=merge2 --to="$MERGED"
+ ;;
+
+ emerge)
+ "$merge_tool_path" -f emerge-files-command \
+ "$LOCAL" "$REMOTE" "$(basename "$MERGED")"
+ ;;
+
+ *)
+ if test -n "$merge_tool_cmd"; then
+ ( eval $merge_tool_cmd )
+ fi
+ ;;
+ esac
+
+ cleanup_temp_files
+}
+
+# Verifies that mergetool.<tool>.cmd exists
+valid_custom_tool() {
+ merge_tool_cmd="$(git config mergetool.$1.cmd)"
+ test -n "$merge_tool_cmd"
+}
+
+# Verifies that the chosen merge tool is properly setup.
+# Built-in merge tools are always valid.
+valid_tool() {
+ case "$1" in
+ kdiff3 | tkdiff | xxdiff | meld | opendiff | emerge | vimdiff | gvimdiff | ecmerge)
+ ;; # happy
+ *)
+ if ! valid_custom_tool "$1"
+ then
+ return 1
+ fi
+ ;;
+ esac
+}
+
+# Sets up the merge_tool_path variable.
+# This handles the mergetool.<tool>.path configuration.
+init_merge_tool_path() {
+ merge_tool_path=$(git config mergetool."$1".path)
+ if test -z "$merge_tool_path"; then
+ case "$1" in
+ emerge)
+ merge_tool_path=emacs
+ ;;
+ *)
+ merge_tool_path="$1"
+ ;;
+ esac
+ fi
+}
+
+# Allow the GIT_MERGE_TOOL variable to provide a default value
+test -n "$GIT_MERGE_TOOL" && merge_tool="$GIT_MERGE_TOOL"
+
+# If no merge tool was specified, use the merge.tool
+# configuration variable. If that's invalid then reset merge_tool.
+if test -z "$merge_tool"; then
+ merge_tool=$(git config merge.tool)
+ if test -n "$merge_tool" && ! valid_tool "$merge_tool"; then
+ echo >&2 "git config option merge.tool set to unknown tool: $merge_tool"
+ echo >&2 "Resetting to default..."
+ unset merge_tool
+ fi
+fi
+
+# Try to guess an appropriate merge tool if no tool has been set.
+if test -z "$merge_tool"; then
+
+ # We have a $DISPLAY so try some common UNIX merge tools
+ if test -n "$DISPLAY"; then
+ merge_tool_candidates="kdiff3 tkdiff xxdiff meld gvimdiff"
+ # If gnome then prefer meld
+ if test -n "$GNOME_DESKTOP_SESSION_ID"; then
+ merge_tool_candidates="meld $merge_tool_candidates"
+ fi
+ # If KDE then prefer kdiff3
+ if test "$KDE_FULL_SESSION" = "true"; then
+ merge_tool_candidates="kdiff3 $merge_tool_candidates"
+ fi
+ fi
+
+ # $EDITOR is emacs so add emerge as a candidate
+ if echo "${VISUAL:-$EDITOR}" | grep 'emacs' > /dev/null 2>&1; then
+ merge_tool_candidates="$merge_tool_candidates emerge"
+ fi
+
+ # $EDITOR is vim so add vimdiff as a candidate
+ if echo "${VISUAL:-$EDITOR}" | grep 'vim' > /dev/null 2>&1; then
+ merge_tool_candidates="$merge_tool_candidates vimdiff"
+ fi
+
+ merge_tool_candidates="$merge_tool_candidates opendiff emerge vimdiff"
+ echo "merge tool candidates: $merge_tool_candidates"
+
+ # Loop over each candidate and stop when a valid merge tool is found.
+ for i in $merge_tool_candidates
+ do
+ init_merge_tool_path $i
+ if type "$merge_tool_path" > /dev/null 2>&1; then
+ merge_tool=$i
+ break
+ fi
+ done
+
+ if test -z "$merge_tool" ; then
+ echo "No known merge resolution program available."
+ exit 1
+ fi
+
+else
+ # A merge tool has been set, so verify that it's valid.
+ if ! valid_tool "$merge_tool"; then
+ echo >&2 "Unknown merge tool $merge_tool"
+ exit 1
+ fi
+
+ init_merge_tool_path "$merge_tool"
+
+ if test -z "$merge_tool_cmd" && ! type "$merge_tool_path" > /dev/null 2>&1; then
+ echo "The merge tool $merge_tool is not available as '$merge_tool_path'"
+ exit 1
+ fi
+fi
+
+
+# Launch the merge tool on each path provided by 'git diff'
+while test $# -gt 6
+do
+ launch_merge_tool "$1" "$2" "$5"
+ shift 7
+done
--- /dev/null
+git-difftool(1)
+===============
+
+NAME
+----
+git-difftool - compare changes using common merge tools
+
+SYNOPSIS
+--------
+'git difftool' [--tool=<tool>] [--no-prompt] ['git diff' options]
+
+DESCRIPTION
+-----------
+'git-difftool' is a git command that allows you to compare and edit files
+between revisions using common merge tools. At its most basic level,
+'git-difftool' does what 'git-mergetool' does, but it is used in
+non-merge situations, such as when preparing commits or comparing
+changes against the index.
+
+'git difftool' is a frontend to 'git diff' and accepts the same
+arguments and options.
+
+See linkgit:git-diff[1] for the full list of supported options.
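+
+For example, assuming kdiff3 is installed, the difference between the
+working tree and the index could be reviewed with:
+
+------------
+$ git difftool --tool=kdiff3
+------------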
+
+OPTIONS
+-------
+-t <tool>::
+--tool=<tool>::
+ Use the merge resolution program specified by <tool>.
+ Valid merge tools are:
+ kdiff3, tkdiff, meld, xxdiff, emerge, vimdiff, gvimdiff, ecmerge, and opendiff
++
+If a merge resolution program is not specified, 'git-difftool'
+will use the configuration variable `merge.tool`. If the
+configuration variable `merge.tool` is not set, 'git difftool'
+will pick a suitable default.
++
+You can explicitly provide a full path to the tool by setting the
+configuration variable `mergetool.<tool>.path`. For example, you
+can configure the absolute path to kdiff3 by setting
+`mergetool.kdiff3.path`. Otherwise, 'git-difftool' assumes the
+tool is available in PATH.
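++
+For example, the kdiff3 path could be configured like this (the path
+shown is only illustrative):
++
+------------
+$ git config --global mergetool.kdiff3.path /usr/local/bin/kdiff3
+------------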
++
+Instead of running one of the known merge tool programs,
+'git-difftool' can be customized to run an alternative program
+by specifying the command line to invoke in a configuration
+variable `mergetool.<tool>.cmd`.
++
+When 'git-difftool' is invoked with this tool (either through the
+`-t` or `--tool` option or the `merge.tool` configuration variable),
+the configured command line will be invoked with the following
+variables available: `$LOCAL` is set to the name of the temporary
+file containing the contents of the diff pre-image and `$REMOTE`
+is set to the name of the temporary file containing the contents
+of the diff post-image. `$BASE` is provided for compatibility
+with custom merge tool commands and has the same value as `$LOCAL`.
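++
+For instance, a custom tool could be defined like this ("mydiff" is an
+arbitrary name used only for illustration):
++
+------------
+$ git config mergetool.mydiff.cmd 'diff -u "$LOCAL" "$REMOTE"'
+$ git difftool --tool=mydiff
+------------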
+
+--no-prompt::
+ Do not prompt before launching a diff tool.
+
+CONFIG VARIABLES
+----------------
+merge.tool::
+ The default merge tool to use.
++
+See the `--tool=<tool>` option above for more details.
+
+merge.keepBackup::
+ The original, unedited file content can be saved to a file with
+ a `.orig` extension. Defaults to `true` (i.e. keep the backup files).
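++
+For instance, keeping the backup files could be disabled with:
++
+------------
+$ git config --global merge.keepBackup false
+------------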
+
+mergetool.<tool>.path::
+ Override the path for the given tool. This is useful in case
+ your tool is not in the PATH.
+
+mergetool.<tool>.cmd::
+ Specify the command to invoke the specified merge tool.
++
+See the `--tool=<tool>` option above for more details.
+
+
+SEE ALSO
+--------
+linkgit:git-diff[1]::
+ Show changes between commits, commit and working tree, etc
+
+linkgit:git-mergetool[1]::
+ Run merge conflict resolution tools to resolve merge conflicts
+
+linkgit:git-config[1]::
+ Get and set repository or global options
+
+
+AUTHOR
+------
+Written by David Aguilar <davvid@gmail.com>.
+
+Documentation
+--------------
+Documentation by David Aguilar and the git-list <git@vger.kernel.org>.
+
+GIT
+---
+Part of the linkgit:git[1] suite
*/
#include "cache.h"
-/* Just so that no insane platform contaminate namespace with these symbols */
-#undef SS
-#undef AA
-#undef DD
-#undef GS
-
-#define SS GIT_SPACE
-#define AA GIT_ALPHA
-#define DD GIT_DIGIT
-#define GS GIT_SPECIAL /* \0, *, ?, [, \\ */
+enum {
+ S = GIT_SPACE,
+ A = GIT_ALPHA,
+ D = GIT_DIGIT,
+ G = GIT_GLOB_SPECIAL, /* *, ?, [, \\ */
+ R = GIT_REGEX_SPECIAL, /* $, (, ), +, ., ^, {, | * */
+};
unsigned char sane_ctype[256] = {
- GS, 0, 0, 0, 0, 0, 0, 0, 0, SS, SS, 0, 0, SS, 0, 0, /* 0-15 */
- 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, /* 16-15 */
- SS, 0, 0, 0, 0, 0, 0, 0, 0, 0, GS, 0, 0, 0, 0, 0, /* 32-15 */
- DD, DD, DD, DD, DD, DD, DD, DD, DD, DD, 0, 0, 0, 0, 0, GS, /* 48-15 */
- 0, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, /* 64-15 */
- AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, GS, GS, 0, 0, 0, /* 80-15 */
- 0, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, /* 96-15 */
- AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, AA, 0, 0, 0, 0, 0, /* 112-15 */
+ 0, 0, 0, 0, 0, 0, 0, 0, 0, S, S, 0, 0, S, 0, 0, /* 0.. 15 */
+ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, /* 16.. 31 */
+ S, 0, 0, 0, R, 0, 0, 0, R, R, G, R, 0, 0, R, 0, /* 32.. 47 */
+ D, D, D, D, D, D, D, D, D, D, 0, 0, 0, 0, 0, G, /* 48.. 63 */
+ 0, A, A, A, A, A, A, A, A, A, A, A, A, A, A, A, /* 64.. 79 */
+ A, A, A, A, A, A, A, A, A, A, A, G, G, 0, R, 0, /* 80.. 95 */
+ 0, A, A, A, A, A, A, A, A, A, A, A, A, A, A, A, /* 96..111 */
+ A, A, A, A, A, A, A, A, A, A, A, R, R, 0, 0, 0, /* 112..127 */
/* Nothing in the 128.. range */
};
/* Were we asked to do --no-index explicitly? */
for (i = 1; i < argc; i++) {
- if (!strcmp(argv[i], "--"))
- return;
+ if (!strcmp(argv[i], "--")) {
+ i++;
+ break;
+ }
if (!strcmp(argv[i], "--no-index"))
no_index = 1;
if (argv[i][0] != '-')
die("git diff %s takes two paths",
no_index ? "--no-index" : "[--no-index]");
- /*
- * If the user asked for our exit code then don't start a
- * pager or we would end up reporting its exit code instead.
- */
- if (!DIFF_OPT_TST(&revs->diffopt, EXIT_WITH_STATUS))
- setup_pager();
-
diff_setup(&revs->diffopt);
if (!revs->diffopt.output_format)
revs->diffopt.output_format = DIFF_FORMAT_PATCH;
int j;
if (!strcmp(argv[i], "--no-index"))
i++;
- else if (!strcmp(argv[1], "-q"))
+ else if (!strcmp(argv[i], "-q")) {
options |= DIFF_SILENT_ON_REMOVED;
+ i++;
+ }
+ else if (!strcmp(argv[i], "--"))
+ i++;
else {
j = diff_opt_parse(&revs->diffopt, argv + i, argc - i);
if (!j)
}
}
+ /*
+ * If the user asked for our exit code then don't start a
+ * pager or we would end up reporting its exit code instead.
+ */
+ if (!DIFF_OPT_TST(&revs->diffopt, EXIT_WITH_STATUS))
+ setup_pager();
+
if (prefix) {
int len = strlen(prefix);
static int diff_rename_limit_default = 200;
static int diff_suppress_blank_empty;
int diff_use_color_default = -1;
+static const char *diff_word_regex_cfg;
static const char *external_diff_cmd_cfg;
int diff_auto_refresh_index = 1;
static int diff_mnemonic_prefix;
}
if (!strcmp(var, "diff.external"))
return git_config_string(&external_diff_cmd_cfg, var, value);
+ if (!strcmp(var, "diff.wordregex"))
+ return git_config_string(&diff_word_regex_cfg, var, value);
return git_diff_basic_config(var, value, cb);
}
}
/* like GNU diff's --suppress-blank-empty option */
- if (!strcmp(var, "diff.suppress-blank-empty")) {
+ if (!strcmp(var, "diff.suppressblankempty") ||
+ /* for backwards compatibility */
+ !strcmp(var, "diff.suppress-blank-empty")) {
diff_suppress_blank_empty = git_config_bool(var, value);
return 0;
}
struct diff_words_buffer {
mmfile_t text;
long alloc;
- long current; /* output pointer */
- int suppressed_newline;
+ struct diff_words_orig {
+ const char *begin, *end;
+ } *orig;
+ int orig_nr, orig_alloc;
};
static void diff_words_append(char *line, unsigned long len,
struct diff_words_buffer *buffer)
{
- if (buffer->text.size + len > buffer->alloc) {
- buffer->alloc = (buffer->text.size + len) * 3 / 2;
- buffer->text.ptr = xrealloc(buffer->text.ptr, buffer->alloc);
- }
+ ALLOC_GROW(buffer->text.ptr, buffer->text.size + len, buffer->alloc);
line++;
len--;
memcpy(buffer->text.ptr + buffer->text.size, line, len);
buffer->text.size += len;
+ buffer->text.ptr[buffer->text.size] = '\0';
}
struct diff_words_data {
struct diff_words_buffer minus, plus;
+ const char *current_plus;
FILE *file;
+ regex_t *word_regex;
};
-static void print_word(FILE *file, struct diff_words_buffer *buffer, int len, int color,
- int suppress_newline)
+static void fn_out_diff_words_aux(void *priv, char *line, unsigned long len)
{
- const char *ptr;
- int eol = 0;
+ struct diff_words_data *diff_words = priv;
+ int minus_first, minus_len, plus_first, plus_len;
+ const char *minus_begin, *minus_end, *plus_begin, *plus_end;
- if (len == 0)
+ if (line[0] != '@' || parse_hunk_header(line, len,
+ &minus_first, &minus_len, &plus_first, &plus_len))
return;
- ptr = buffer->text.ptr + buffer->current;
- buffer->current += len;
+ /* POSIX requires that first be decremented by one if len == 0... */
+ if (minus_len) {
+ minus_begin = diff_words->minus.orig[minus_first].begin;
+ minus_end =
+ diff_words->minus.orig[minus_first + minus_len - 1].end;
+ } else
+ minus_begin = minus_end =
+ diff_words->minus.orig[minus_first].end;
- if (ptr[len - 1] == '\n') {
- eol = 1;
- len--;
+ if (plus_len) {
+ plus_begin = diff_words->plus.orig[plus_first].begin;
+ plus_end = diff_words->plus.orig[plus_first + plus_len - 1].end;
+ } else
+ plus_begin = plus_end = diff_words->plus.orig[plus_first].end;
+
+ if (diff_words->current_plus != plus_begin)
+ fwrite(diff_words->current_plus,
+ plus_begin - diff_words->current_plus, 1,
+ diff_words->file);
+ if (minus_begin != minus_end)
+ color_fwrite_lines(diff_words->file,
+ diff_get_color(1, DIFF_FILE_OLD),
+ minus_end - minus_begin, minus_begin);
+ if (plus_begin != plus_end)
+ color_fwrite_lines(diff_words->file,
+ diff_get_color(1, DIFF_FILE_NEW),
+ plus_end - plus_begin, plus_begin);
+
+ diff_words->current_plus = plus_end;
+}
+
+/* This function starts looking at *begin, and returns 0 iff a word was found. */
+static int find_word_boundaries(mmfile_t *buffer, regex_t *word_regex,
+ int *begin, int *end)
+{
+ if (word_regex && *begin < buffer->size) {
+ regmatch_t match[1];
+ if (!regexec(word_regex, buffer->ptr + *begin, 1, match, 0)) {
+ char *p = memchr(buffer->ptr + *begin + match[0].rm_so,
+ '\n', match[0].rm_eo - match[0].rm_so);
+ *end = p ? p - buffer->ptr : match[0].rm_eo + *begin;
+ *begin += match[0].rm_so;
+ return *begin >= *end;
+ }
+ return -1;
}
- fputs(diff_get_color(1, color), file);
- fwrite(ptr, len, 1, file);
- fputs(diff_get_color(1, DIFF_RESET), file);
+ /* find the next word */
+ while (*begin < buffer->size && isspace(buffer->ptr[*begin]))
+ (*begin)++;
+ if (*begin >= buffer->size)
+ return -1;
- if (eol) {
- if (suppress_newline)
- buffer->suppressed_newline = 1;
- else
- putc('\n', file);
- }
+ /* find the end of the word */
+ *end = *begin + 1;
+ while (*end < buffer->size && !isspace(buffer->ptr[*end]))
+ (*end)++;
+
+ return 0;
}
-static void fn_out_diff_words_aux(void *priv, char *line, unsigned long len)
+/*
+ * This function splits the words in buffer->text, stores the list with
+ * newline separator into out, and saves the offsets of the original words
+ * in buffer->orig.
+ */
+static void diff_words_fill(struct diff_words_buffer *buffer, mmfile_t *out,
+ regex_t *word_regex)
{
- struct diff_words_data *diff_words = priv;
+ int i, j;
+ long alloc = 0;
- if (diff_words->minus.suppressed_newline) {
- if (line[0] != '+')
- putc('\n', diff_words->file);
- diff_words->minus.suppressed_newline = 0;
- }
+ out->size = 0;
+ out->ptr = NULL;
- len--;
- switch (line[0]) {
- case '-':
- print_word(diff_words->file,
- &diff_words->minus, len, DIFF_FILE_OLD, 1);
- break;
- case '+':
- print_word(diff_words->file,
- &diff_words->plus, len, DIFF_FILE_NEW, 0);
- break;
- case ' ':
- print_word(diff_words->file,
- &diff_words->plus, len, DIFF_PLAIN, 0);
- diff_words->minus.current += len;
- break;
+ /* fake an empty "0th" word */
+ ALLOC_GROW(buffer->orig, 1, buffer->orig_alloc);
+ buffer->orig[0].begin = buffer->orig[0].end = buffer->text.ptr;
+ buffer->orig_nr = 1;
+
+ for (i = 0; i < buffer->text.size; i++) {
+ if (find_word_boundaries(&buffer->text, word_regex, &i, &j))
+ return;
+
+ /* store original boundaries */
+ ALLOC_GROW(buffer->orig, buffer->orig_nr + 1,
+ buffer->orig_alloc);
+ buffer->orig[buffer->orig_nr].begin = buffer->text.ptr + i;
+ buffer->orig[buffer->orig_nr].end = buffer->text.ptr + j;
+ buffer->orig_nr++;
+
+ /* store one word */
+ ALLOC_GROW(out->ptr, out->size + j - i + 1, alloc);
+ memcpy(out->ptr + out->size, buffer->text.ptr + i, j - i);
+ out->ptr[out->size + j - i] = '\n';
+ out->size += j - i + 1;
+
+ i = j - 1;
}
}
xdemitconf_t xecfg;
xdemitcb_t ecb;
mmfile_t minus, plus;
- int i;
+
+ /* special case: only removal */
+ if (!diff_words->plus.text.size) {
+ color_fwrite_lines(diff_words->file,
+ diff_get_color(1, DIFF_FILE_OLD),
+ diff_words->minus.text.size, diff_words->minus.text.ptr);
+ diff_words->minus.text.size = 0;
+ return;
+ }
+
+ diff_words->current_plus = diff_words->plus.text.ptr;
memset(&xpp, 0, sizeof(xpp));
memset(&xecfg, 0, sizeof(xecfg));
- minus.size = diff_words->minus.text.size;
- minus.ptr = xmalloc(minus.size);
- memcpy(minus.ptr, diff_words->minus.text.ptr, minus.size);
- for (i = 0; i < minus.size; i++)
- if (isspace(minus.ptr[i]))
- minus.ptr[i] = '\n';
- diff_words->minus.current = 0;
-
- plus.size = diff_words->plus.text.size;
- plus.ptr = xmalloc(plus.size);
- memcpy(plus.ptr, diff_words->plus.text.ptr, plus.size);
- for (i = 0; i < plus.size; i++)
- if (isspace(plus.ptr[i]))
- plus.ptr[i] = '\n';
- diff_words->plus.current = 0;
-
+ diff_words_fill(&diff_words->minus, &minus, diff_words->word_regex);
+ diff_words_fill(&diff_words->plus, &plus, diff_words->word_regex);
xpp.flags = XDF_NEED_MINIMAL;
- xecfg.ctxlen = diff_words->minus.alloc + diff_words->plus.alloc;
+ /* as only the hunk header will be parsed, we need a 0-context */
+ xecfg.ctxlen = 0;
xdi_diff_outf(&minus, &plus, fn_out_diff_words_aux, diff_words,
&xpp, &xecfg, &ecb);
free(minus.ptr);
free(plus.ptr);
+ if (diff_words->current_plus != diff_words->plus.text.ptr +
+ diff_words->plus.text.size)
+ fwrite(diff_words->current_plus,
+ diff_words->plus.text.ptr + diff_words->plus.text.size
+ - diff_words->current_plus, 1,
+ diff_words->file);
diff_words->minus.text.size = diff_words->plus.text.size = 0;
-
- if (diff_words->minus.suppressed_newline) {
- putc('\n', diff_words->file);
- diff_words->minus.suppressed_newline = 0;
- }
}
typedef unsigned long (*sane_truncate_fn)(char *line, unsigned long len);
diff_words_show(ecbdata->diff_words);
free (ecbdata->diff_words->minus.text.ptr);
+ free (ecbdata->diff_words->minus.orig);
free (ecbdata->diff_words->plus.text.ptr);
+ free (ecbdata->diff_words->plus.orig);
+ free(ecbdata->diff_words->word_regex);
free(ecbdata->diff_words);
ecbdata->diff_words = NULL;
}
return one->driver->funcname.pattern ? &one->driver->funcname : NULL;
}
+static const char *userdiff_word_regex(struct diff_filespec *one)
+{
+ diff_filespec_load_driver(one);
+ return one->driver->word_regex;
+}
+
void diff_set_mnemonic_prefix(struct diff_options *options, const char *a, const char *b)
{
if (!options->a_prefix)
ecbdata.diff_words =
xcalloc(1, sizeof(struct diff_words_data));
ecbdata.diff_words->file = o->file;
+ if (!o->word_regex)
+ o->word_regex = userdiff_word_regex(one);
+ if (!o->word_regex)
+ o->word_regex = userdiff_word_regex(two);
+ if (!o->word_regex)
+ o->word_regex = diff_word_regex_cfg;
+ if (o->word_regex) {
+ ecbdata.diff_words->word_regex = (regex_t *)
+ xmalloc(sizeof(regex_t));
+ if (regcomp(ecbdata.diff_words->word_regex,
+ o->word_regex,
+ REG_EXTENDED | REG_NEWLINE))
+ die ("Invalid regular expression: %s",
+ o->word_regex);
+ }
}
xdi_diff_outf(&mf1, &mf2, fn_out_consume, &ecbdata,
&xpp, &xecfg, &ecb);
options->xdl_opts |= XDF_IGNORE_WHITESPACE_CHANGE;
else if (!strcmp(arg, "--ignore-space-at-eol"))
options->xdl_opts |= XDF_IGNORE_WHITESPACE_AT_EOL;
+ else if (!strcmp(arg, "--patience"))
+ options->xdl_opts |= XDF_PATIENCE_DIFF;
/* flags options */
else if (!strcmp(arg, "--binary")) {
DIFF_OPT_CLR(options, COLOR_DIFF);
else if (!strcmp(arg, "--color-words"))
options->flags |= DIFF_OPT_COLOR_DIFF | DIFF_OPT_COLOR_DIFF_WORDS;
+ else if (!prefixcmp(arg, "--color-words=")) {
+ options->flags |= DIFF_OPT_COLOR_DIFF | DIFF_OPT_COLOR_DIFF_WORDS;
+ options->word_regex = arg + 14;
+ }
else if (!strcmp(arg, "--exit-code"))
DIFF_OPT_SET(options, EXIT_WITH_STATUS);
else if (!strcmp(arg, "--quiet"))
int stat_width;
int stat_name_width;
+ const char *word_regex;
/* this is set by diffcore for DIFF_FORMAT_PATCH */
int found_changes;
* is a possible size - we really should have a flag to
* say whether the size is valid or not!)
*/
- if (!src->cnt_data && diff_populate_filespec(src, 0))
+ if (!src->cnt_data && diff_populate_filespec(src, 1))
return 0;
- if (!dst->cnt_data && diff_populate_filespec(dst, 0))
+ if (!dst->cnt_data && diff_populate_filespec(dst, 1))
return 0;
max_size = ((src->size > dst->size) ? src->size : dst->size);
if (base_size * (MAX_SCORE-minimum_score) < delta_size * MAX_SCORE)
return 0;
+ if (!src->cnt_data && diff_populate_filespec(src, 0))
+ return 0;
+ if (!dst->cnt_data && diff_populate_filespec(dst, 0))
+ return 0;
+
delta_limit = (unsigned long)
(base_size * (MAX_SCORE-minimum_score) / MAX_SCORE);
if (diffcore_count_changes(src, dst,
for (;;) {
unsigned char c1 = *match;
unsigned char c2 = *name;
- if (isspecial(c1))
+ if (c1 == '\0' || is_glob_special(c1))
break;
if (c1 != c2)
return 0;
* and a mark is left in seen[] array for pathspec element that
* actually matched anything.
*/
-int match_pathspec(const char **pathspec, const char *name, int namelen, int prefix, char *seen)
+int match_pathspec(const char **pathspec, const char *name, int namelen,
+ int prefix, char *seen)
{
- int retval;
- const char *match;
+ int i, retval = 0;
+
+ if (!pathspec)
+ return 1;
name += prefix;
namelen -= prefix;
- for (retval = 0; (match = *pathspec++) != NULL; seen++) {
+ for (i = 0; pathspec[i] != NULL; i++) {
int how;
- if (retval && *seen == MATCHED_EXACTLY)
+ const char *match = pathspec[i] + prefix;
+ if (seen && seen[i] == MATCHED_EXACTLY)
continue;
- match += prefix;
how = match_one(match, name, namelen);
if (how) {
if (retval < how)
retval = how;
- if (*seen < how)
- *seen = how;
+ if (seen && seen[i] < how)
+ seen[i] = how;
}
}
return retval;
int len, dtype;
int exclude;
- if ((de->d_name[0] == '.') &&
- (de->d_name[1] == 0 ||
- !strcmp(de->d_name + 1, ".") ||
- !strcmp(de->d_name + 1, "git")))
+ if (is_dot_or_dotdot(de->d_name) ||
+ !strcmp(de->d_name, ".git"))
continue;
len = strlen(de->d_name);
/* Ignore overly long pathnames! */
for (;;) {
unsigned char c = *match++;
len++;
- if (isspecial(c))
+ if (c == '\0' || is_glob_special(c))
return len;
}
}
return get_relative_cwd(buffer, sizeof(buffer), dir) != NULL;
}
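+/*
+ * Returns 1 if 'path' is a readable directory containing no entries
+ * other than "." and ".."; returns 0 otherwise.
+ */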
+int is_empty_dir(const char *path)
+{
+ DIR *dir = opendir(path);
+ struct dirent *e;
+ int ret = 1;
+
+ if (!dir)
+ return 0;
+
+ while ((e = readdir(dir)) != NULL)
+ if (!is_dot_or_dotdot(e->d_name)) {
+ ret = 0;
+ break;
+ }
+
+ closedir(dir);
+ return ret;
+}
+
int remove_dir_recursively(struct strbuf *path, int only_empty)
{
DIR *dir = opendir(path->buf);
len = path->len;
while ((e = readdir(dir)) != NULL) {
struct stat st;
- if ((e->d_name[0] == '.') &&
- ((e->d_name[1] == 0) ||
- ((e->d_name[1] == '.') && e->d_name[2] == 0)))
- continue; /* "." and ".." */
+ if (is_dot_or_dotdot(e->d_name))
+ continue;
strbuf_setlen(path, len);
strbuf_addstr(path, e->d_name);
extern char *get_relative_cwd(char *buffer, int size, const char *dir);
extern int is_inside_dir(const char *dir);
+static inline int is_dot_or_dotdot(const char *name)
+{
+ return (name[0] == '.' &&
+ (name[1] == '\0' ||
+ (name[1] == '.' && name[2] == '\0')));
+}
+
+extern int is_empty_dir(const char *dir);
+
extern void setup_standard_excludes(struct dir_struct *dir);
extern int remove_dir_recursively(struct strbuf *path, int only_empty);
#include "cache.h"
#include "blob.h"
+#include "dir.h"
static void create_directories(const char *path, const struct checkout *state)
{
const char *slash = path;
while ((slash = strchr(slash+1, '/')) != NULL) {
- struct stat st;
- int stat_status;
-
len = slash - path;
memcpy(buf, path, len);
buf[len] = 0;
- if (len <= state->base_dir_len)
- /*
- * checkout-index --prefix=<dir>; <dir> is
- * allowed to be a symlink to an existing
- * directory.
- */
- stat_status = stat(buf, &st);
- else
- /*
- * if there currently is a symlink, we would
- * want to replace it with a real directory.
- */
- stat_status = lstat(buf, &st);
-
- if (!stat_status && S_ISDIR(st.st_mode))
+ /*
+ * For 'checkout-index --prefix=<dir>', <dir> is
+ * allowed to be a symlink to an existing directory,
+ * and we set 'state->base_dir_len' below, such that
+ * we test the path components of the prefix with the
+ * stat() function instead of the lstat() function.
+ */
+ if (has_dirs_only_path(len, buf, state->base_dir_len))
continue; /* ok, it is already a directory. */
/*
- * We know stat_status == 0 means something exists
- * there and this mkdir would fail, but that is an
- * error codepath; we do not care, as we unlink and
- * mkdir again in such a case.
+ * If this mkdir() would fail, it could be that there
+ * is already a symlink or something else exists
+ * there, therefore we then try to unlink it and try
+ * one more time to create the directory.
*/
if (mkdir(buf, 0777)) {
if (errno == EEXIST && state->force &&
*name++ = '/';
while ((de = readdir(dir)) != NULL) {
struct stat st;
- if ((de->d_name[0] == '.') &&
- ((de->d_name[1] == 0) ||
- ((de->d_name[1] == '.') && de->d_name[2] == 0)))
+ if (is_dot_or_dotdot(de->d_name))
continue;
strcpy(name, de->d_name);
if (lstat(pathbuf, &st))
git am [options] [<mbox>|<Maildir>...]
git am [options] (--resolved | --skip | --abort)
--
-d,dotest= (removed -- do not use)
i,interactive run interactively
-b,binary (historical option -- no-op)
+b,binary* (historical option -- no-op)
3,3way allow fall back on 3way merging if needed
s,signoff add a Signed-off-by line to the commit message
u,utf8 recode into utf8 (default)
k,keep pass -k flag to git-mailinfo
whitespace= pass it through git-apply
+directory= pass it through git-apply
C= pass it through git-apply
p= pass it through git-apply
+reject pass it through git-apply
resolvemsg= override error message when patch failure occurs
r,resolved to be used after a patch failure
skip skip the current patch
abort restore the original branch and abort the patching operation.
-rebasing (internal use for git-rebase)"
+rebasing* (internal use for git-rebase)"
. git-sh-setup
prefix=$(git rev-parse --show-prefix)
git var GIT_COMMITTER_IDENT >/dev/null ||
die "You need to set your committer info first"
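+# sq: single-quote each argument (with a leading space) so the result
+# can later be passed safely through 'eval' when building $git_apply_opt.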
+sq () {
+ for sqarg
+ do
+ printf "%s" "$sqarg" |
+ sed -e 's/'\''/'\''\\'\'''\''/g' -e 's/.*/ '\''&'\''/'
+ done
+}
+
stop_here () {
echo "$1" >"$dotest/next"
exit 1
;;
--resolvemsg)
shift; resolvemsg=$1 ;;
- --whitespace)
- git_apply_opt="$git_apply_opt $1=$2"; shift ;;
+ --whitespace|--directory)
+ git_apply_opt="$git_apply_opt $(sq "$1=$2")"; shift ;;
-C|-p)
- git_apply_opt="$git_apply_opt $1$2"; shift ;;
+ git_apply_opt="$git_apply_opt $(sq "$1$2")"; shift ;;
+ --reject)
+ git_apply_opt="$git_apply_opt $1" ;;
--)
shift; break ;;
*)
# unreliable -- stdin could be /dev/null for example
# and the caller did not intend to feed us a patch but
# wanted to continue unattended.
- tty -s
+ test -t 0
;;
*)
false
case "$resolved" in
'')
files=$(git diff-index --cached --name-only HEAD --) || exit
- if [ "$files" ]; then
- echo "Dirty index: cannot apply patches (dirty: $files)" >&2
- exit 1
- fi
+ test "$files" && die "Dirty index: cannot apply patches (dirty: $files)"
esac
if test "$(cat "$dotest/utf8")" = t
case "$resolved" in
'')
- git apply $git_apply_opt --index "$dotest/patch"
+ eval 'git apply '"$git_apply_opt"' --index "$dotest/patch"'
apply_status=$?
;;
t)
fi
if test $apply_status != 0
then
- echo Patch failed at $msgnum.
+ printf 'Patch failed at %s %s\n' "$msgnum" "$FIRSTLINE"
stop_here_user_resolve $this
fi
#define GIT_SPACE 0x01
#define GIT_DIGIT 0x02
#define GIT_ALPHA 0x04
-#define GIT_SPECIAL 0x08
+#define GIT_GLOB_SPECIAL 0x08
+#define GIT_REGEX_SPECIAL 0x10
#define sane_istest(x,mask) ((sane_ctype[(unsigned char)(x)] & (mask)) != 0)
#define isspace(x) sane_istest(x,GIT_SPACE)
#define isdigit(x) sane_istest(x,GIT_DIGIT)
#define isalpha(x) sane_istest(x,GIT_ALPHA)
#define isalnum(x) sane_istest(x,GIT_ALPHA | GIT_DIGIT)
-#define isspecial(x) sane_istest(x,GIT_SPECIAL)
+#define is_glob_special(x) sane_istest(x,GIT_GLOB_SPECIAL)
+#define is_regex_special(x) sane_istest(x,GIT_GLOB_SPECIAL | GIT_REGEX_SPECIAL)
#define tolower(x) sane_case((unsigned char)(x), 0x20)
#define toupper(x) sane_case((unsigned char)(x), 0)
fi
status=$?
;;
- meld|vimdiff)
+ meld)
touch "$BACKUP"
"$merge_tool_path" "$LOCAL" "$MERGED" "$REMOTE"
check_unchanged
;;
+ vimdiff)
+ touch "$BACKUP"
+ "$merge_tool_path" -c "wincmd l" "$LOCAL" "$MERGED" "$REMOTE"
+ check_unchanged
+ ;;
gvimdiff)
touch "$BACKUP"
- "$merge_tool_path" -f "$LOCAL" "$MERGED" "$REMOTE"
+ "$merge_tool_path" -c "wincmd l" -f "$LOCAL" "$MERGED" "$REMOTE"
check_unchanged
;;
xxdiff)
abort abort rebasing process and restore original branch
skip skip current patch and continue rebasing process
no-verify override pre-rebase hook from stopping the operation
+root rebase all reachable commits up to the root(s)
"
. git-sh-setup
ONTO=
VERBOSE=
OK_TO_SKIP_PRE_REBASE=
+REBASE_ROOT=
GIT_CHERRY_PICK_HELP=" After resolving the conflicts,
mark the corrected paths with 'git add <paths>', and
output git rev-parse --verify $sha1 || die "Invalid commit name: $sha1"
test -d "$REWRITTEN" &&
pick_one_preserving_merges "$@" && return
+ if test ! -z "$REBASE_ROOT"
+ then
+ output git cherry-pick "$@"
+ return
+ fi
parent_sha1=$(git rev-parse --verify $sha1^) ||
die "Could not get the parent of $sha1"
current_sha1=$(git rev-parse --verify HEAD)
# rewrite parents; if none were rewritten, we can fast-forward.
new_parents=
- pend=" $(git rev-list --parents -1 $sha1 | cut -d' ' -f2-)"
+ pend=" $(git rev-list --parents -1 $sha1 | cut -d' ' -s -f2-)"
+ if test "$pend" = " "
+ then
+ pend=" root"
+ fi
while [ "$pend" != "" ]
do
p=$(expr "$pend" : ' \([^ ]*\)')
if test -f "$DROPPED"/$p
then
fast_forward=f
- pend=" $(cat "$DROPPED"/$p)$pend"
+ replacement="$(cat "$DROPPED"/$p)"
+ test -z "$replacement" && replacement=root
+ pend=" $replacement$pend"
else
new_parents="$new_parents $p"
fi
test -d "$REWRITTEN" && PRESERVE_MERGES=t
test -f "$DOTEST"/strategy && STRATEGY="$(cat "$DOTEST"/strategy)"
test -f "$DOTEST"/verbose && VERBOSE=t
+ test -f "$DOTEST"/rebase-root && REBASE_ROOT=t
}
while test $# != 0
-i)
# yeah, we know
;;
+ --root)
+ REBASE_ROOT=t
+ ;;
--onto)
shift
ONTO=$(git rev-parse --verify "$1") ||
;;
--)
shift
- run_pre_rebase_hook ${1+"$@"}
- test $# -eq 1 -o $# -eq 2 || usage
+ test -z "$REBASE_ROOT" -a $# -ge 1 -a $# -le 2 ||
+ test ! -z "$REBASE_ROOT" -a $# -le 1 || usage
test -d "$DOTEST" &&
die "Interactive rebase already started"
git var GIT_COMMITTER_IDENT >/dev/null ||
die "You need to set your committer info first"
+ if test -z "$REBASE_ROOT"
+ then
+ UPSTREAM_ARG="$1"
+ UPSTREAM=$(git rev-parse --verify "$1") || die "Invalid base"
+ test -z "$ONTO" && ONTO=$UPSTREAM
+ shift
+ else
+ UPSTREAM=
+ UPSTREAM_ARG=--root
+ test -z "$ONTO" &&
+ die "You must specify --onto when using --root"
+ fi
+ run_pre_rebase_hook "$UPSTREAM_ARG" "$@"
+
comment_for_reflog start
require_clean_work_tree
- UPSTREAM=$(git rev-parse --verify "$1") || die "Invalid base"
- test -z "$ONTO" && ONTO=$UPSTREAM
-
- if test ! -z "$2"
+ if test ! -z "$1"
then
- output git show-ref --verify --quiet "refs/heads/$2" ||
- die "Invalid branchname: $2"
- output git checkout "$2" ||
- die "Could not checkout $2"
+ output git show-ref --verify --quiet "refs/heads/$1" ||
+ die "Invalid branchname: $1"
+ output git checkout "$1" ||
+ die "Could not checkout $1"
fi
HEAD=$(git rev-parse --verify HEAD) || die "No HEAD?"
echo "detached HEAD" > "$DOTEST"/head-name
echo $HEAD > "$DOTEST"/head
- echo $UPSTREAM > "$DOTEST"/upstream
+ case "$REBASE_ROOT" in
+ '')
+ rm -f "$DOTEST"/rebase-root ;;
+ *)
+ : >"$DOTEST"/rebase-root ;;
+ esac
echo $ONTO > "$DOTEST"/onto
test -z "$STRATEGY" || echo "$STRATEGY" > "$DOTEST"/strategy
test t = "$VERBOSE" && : > "$DOTEST"/verbose
# This ensures that commits on merged, but otherwise
# unrelated side branches are left alone. (Think "X"
# in the man page's example.)
- mkdir "$REWRITTEN" &&
- for c in $(git merge-base --all $HEAD $UPSTREAM)
- do
- echo $ONTO > "$REWRITTEN"/$c ||
+ if test -z "$REBASE_ROOT"
+ then
+ mkdir "$REWRITTEN" &&
+ for c in $(git merge-base --all $HEAD $UPSTREAM)
+ do
+ echo $ONTO > "$REWRITTEN"/$c ||
+ die "Could not init rewritten commits"
+ done
+ else
+ mkdir "$REWRITTEN" &&
+ echo $ONTO > "$REWRITTEN"/root ||
die "Could not init rewritten commits"
- done
+ fi
# No cherry-pick because our first pass is to determine
# parents to rewrite and skipping dropped commits would
# prematurely end our probe
MERGES_OPTION="--no-merges --cherry-pick"
fi
- SHORTUPSTREAM=$(git rev-parse --short $UPSTREAM)
SHORTHEAD=$(git rev-parse --short $HEAD)
SHORTONTO=$(git rev-parse --short $ONTO)
+ if test -z "$REBASE_ROOT"
+ # this is now equivalent to ! -z "$UPSTREAM"
+ then
+ SHORTUPSTREAM=$(git rev-parse --short $UPSTREAM)
+ REVISIONS=$UPSTREAM...$HEAD
+ SHORTREVISIONS=$SHORTUPSTREAM..$SHORTHEAD
+ else
+ REVISIONS=$ONTO...$HEAD
+ SHORTREVISIONS=$SHORTHEAD
+ fi
git rev-list $MERGES_OPTION --pretty=oneline --abbrev-commit \
--abbrev=7 --reverse --left-right --topo-order \
- $UPSTREAM...$HEAD | \
+ $REVISIONS | \
sed -n "s/^>//p" | while read shortsha1 rest
do
if test t != "$PRESERVE_MERGES"
echo "pick $shortsha1 $rest" >> "$TODO"
else
sha1=$(git rev-parse $shortsha1)
- preserve=t
- for p in $(git rev-list --parents -1 $sha1 | cut -d' ' -f2-)
- do
- if test -f "$REWRITTEN"/$p -a \( $p != $UPSTREAM -o $sha1 = $first_after_upstream \)
- then
- preserve=f
- fi
- done
+ if test -z "$REBASE_ROOT"
+ then
+ preserve=t
+ for p in $(git rev-list --parents -1 $sha1 | cut -d' ' -s -f2-)
+ do
+ if test -f "$REWRITTEN"/$p -a \( $p != $UPSTREAM -o $sha1 = $first_after_upstream \)
+ then
+ preserve=f
+ fi
+ done
+ else
+ preserve=f
+ fi
if test f = "$preserve"
then
touch "$REWRITTEN"/$sha1
then
mkdir "$DROPPED"
# Save all non-cherry-picked changes
- git rev-list $UPSTREAM...$HEAD --left-right --cherry-pick | \
+ git rev-list $REVISIONS --left-right --cherry-pick | \
sed -n "s/^>//p" > "$DOTEST"/not-cherry-picks
# Now all commits and note which ones are missing in
# not-cherry-picks and hence being dropped
- git rev-list $UPSTREAM..$HEAD |
+ git rev-list $REVISIONS |
while read rev
do
if test -f "$REWRITTEN"/$rev -a "$(grep "$rev" "$DOTEST"/not-cherry-picks)" = ""
# not worthwhile, we don't want to track its multiple heads,
# just the history of its first-parent for others that will
# be rebasing on top of it
- git rev-list --parents -1 $rev | cut -d' ' -f2 > "$DROPPED"/$rev
+ git rev-list --parents -1 $rev | cut -d' ' -s -f2 > "$DROPPED"/$rev
short=$(git rev-list -1 --abbrev-commit --abbrev=7 $rev)
grep -v "^[a-z][a-z]* $short" <"$TODO" > "${TODO}2" ; mv "${TODO}2" "$TODO"
rm "$REWRITTEN"/$rev
fi
done
fi
+
test -s "$TODO" || echo noop >> "$TODO"
cat >> "$TODO" << EOF
-# Rebase $SHORTUPSTREAM..$SHORTHEAD onto $SHORTONTO
+# Rebase $SHORTREVISIONS onto $SHORTONTO
#
# Commands:
# p, pick = use commit
# Copyright (c) 2005 Junio C Hamano.
#
-USAGE='[--interactive | -i] [-v] [--onto <newbase>] <upstream> [<branch>]'
+USAGE='[--interactive | -i] [-v] [--onto <newbase>] [<upstream>|--root] [<branch>]'
LONG_USAGE='git-rebase replaces <branch> with a new branch of the
same name. When the --onto option is provided the new branch starts
out with a HEAD equal to <newbase>, otherwise it is equal to <upstream>
prec=4
verbose=
git_am_opt=
+rebase_root=
continue_merge () {
test -n "$prev_head" || die "prev_head must be defined"
-C*)
git_am_opt="$git_am_opt $1"
;;
+ --root)
+ rebase_root=t
+ ;;
-*)
usage
;;
;;
esac
-# The upstream head must be given. Make sure it is valid.
-upstream_name="$1"
-upstream=`git rev-parse --verify "${upstream_name}^0"` ||
- die "invalid upstream $upstream_name"
+if test -z "$rebase_root"
+then
+ # The upstream head must be given. Make sure it is valid.
+ upstream_name="$1"
+ shift
+ upstream=`git rev-parse --verify "${upstream_name}^0"` ||
+ die "invalid upstream $upstream_name"
+ unset root_flag
+ upstream_arg="$upstream_name"
+else
+ test -z "$newbase" && die "--root must be used with --onto"
+ unset upstream_name
+ unset upstream
+ root_flag="--root"
+ upstream_arg="$root_flag"
+fi
# Make sure the branch to rebase onto is valid.
onto_name=${newbase-"$upstream_name"}
onto=$(git rev-parse --verify "${onto_name}^0") || exit
# If a hook exists, give it a chance to interrupt
-run_pre_rebase_hook ${1+"$@"}
+run_pre_rebase_hook "$upstream_arg" "$@"
# If the branch to rebase is given, that is the branch we will rebase
# $branch_name -- branch being rebased, or HEAD (already detached)
# $head_name -- refs/heads/<that-branch> or "detached HEAD"
switch_to=
case "$#" in
-2)
+1)
# Is it "rebase other $branchname" or "rebase other $commit"?
- branch_name="$2"
- switch_to="$2"
+ branch_name="$1"
+ switch_to="$1"
- if git show-ref --verify --quiet -- "refs/heads/$2" &&
- branch=$(git rev-parse -q --verify "refs/heads/$2")
+ if git show-ref --verify --quiet -- "refs/heads/$1" &&
+ branch=$(git rev-parse -q --verify "refs/heads/$1")
then
- head_name="refs/heads/$2"
- elif branch=$(git rev-parse -q --verify "$2")
+ head_name="refs/heads/$1"
+ elif branch=$(git rev-parse -q --verify "$1")
then
head_name="detached HEAD"
else
esac
orig_head=$branch
-# Now we are rebasing commits $upstream..$branch on top of $onto
+# Now we are rebasing commits $upstream..$branch (or with --root,
+# everything leading up to $branch) on top of $onto
# Check if we are already based on $onto with linear history,
# but this should be done only when upstream and onto are the same.
exit 0
fi
+if test -n "$rebase_root"
+then
+ revisions="$onto..$orig_head"
+else
+ revisions="$upstream..$orig_head"
+fi
+
if test -z "$do_merge"
then
git format-patch -k --stdout --full-index --ignore-if-in-upstream \
- "$upstream..$orig_head" |
+ $root_flag "$revisions" |
git am $git_am_opt --rebasing --resolvemsg="$RESOLVEMSG" &&
move_to_original_branch
ret=$?
echo "$head_name" > "$dotest/head-name"
msgnum=0
-for cmt in `git rev-list --reverse --no-merges "$upstream..$orig_head"`
+for cmt in `git rev-list --reverse --no-merges "$revisions"`
do
msgnum=$(($msgnum + 1))
echo "$cmt" > "$dotest/cmt.$msgnum"
$Git::SVN::_follow_parent = 1;
my %remote_opts = ( 'username=s' => \$Git::SVN::Prompt::_username,
'config-dir=s' => \$Git::SVN::Ra::config_dir,
- 'no-auth-cache' => \$Git::SVN::Prompt::_no_auth_cache );
+ 'no-auth-cache' => \$Git::SVN::Prompt::_no_auth_cache,
+ 'ignore-paths=s' => \$SVN::Git::Fetcher::_ignore_regex );
my %fc_opts = ( 'follow-parent|follow!' => \$Git::SVN::_follow_parent,
'authors-file|A=s' => \$_authors,
'repack:i' => \$Git::SVN::_repack,
\$Git::SVN::_repack_flags,
'use-log-author' => \$Git::SVN::_use_log_author,
'add-author-from' => \$Git::SVN::_add_author_from,
+ 'localtime' => \$Git::SVN::_localtime,
%remote_opts );
my ($_trunk, $_tags, $_branches, $_stdlayout);
if ($@) {
$result .= "Repository Root: (offline)\n";
}
- $result .= "Repository UUID: $uuid\n" unless $diff_status eq "A";
+ $result .= "Repository UUID: $uuid\n" unless $diff_status eq "A" &&
+ ($SVN::Core::VERSION le '1.5.4' || $file_type ne "dir");
$result .= "Revision: " . ($diff_status eq "A" ? 0 : $rev) . "\n";
$result .= "Node Kind: " .
use vars qw/$default_repo_id $default_ref_id $_no_metadata $_follow_parent
$_repack $_repack_flags $_use_svm_props $_head
$_use_svnsync_props $no_reuse_existing $_minimize_url
- $_use_log_author $_add_author_from/;
+ $_use_log_author $_add_author_from $_localtime/;
use Carp qw/croak/;
use File::Path qw/mkpath/;
use File::Copy qw/copy/;
\@out;
}
+# parse_svn_date(DATE)
+# --------------------
+# Given a date (in UTC) from Subversion, return a string in the format
+# "<TZ Offset> <local date/time>" that Git will use.
+#
+# By default the parsed date will be in UTC; if $Git::SVN::_localtime
+# is true we'll convert it to the local timezone instead.
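+#
+# For example: "+0000 2009-02-14 19:31:30" (a made-up sample timestamp).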
sub parse_svn_date {
my $date = shift || return '+0000 1970-01-01 00:00:00';
my ($Y,$m,$d,$H,$M,$S) = ($date =~ /^(\d{4})\-(\d\d)\-(\d\d)T
(\d\d)\:(\d\d)\:(\d\d).\d+Z$/x) or
croak "Unable to parse date: $date\n";
- "+0000 $Y-$m-$d $H:$M:$S";
+ my $parsed_date; # Set next.
+
+ if ($Git::SVN::_localtime) {
+ # Translate the Subversion datetime to an epoch time.
+ # Begin by switching ourselves to $date's timezone, UTC.
+ my $old_env_TZ = $ENV{TZ};
+ $ENV{TZ} = 'UTC';
+
+ my $epoch_in_UTC =
+ POSIX::strftime('%s', $S, $M, $H, $d, $m - 1, $Y - 1900);
+
+ # Determine our local timezone (including DST) at the
+ # time of $epoch_in_UTC. $Git::SVN::Log::TZ stored the
+ # value of TZ, if any, at the time we were run.
+ if (defined $Git::SVN::Log::TZ) {
+ $ENV{TZ} = $Git::SVN::Log::TZ;
+ } else {
+ delete $ENV{TZ};
+ }
+
+ my $our_TZ =
+ POSIX::strftime('%Z', $S, $M, $H, $d, $m - 1, $Y - 1900);
+
+ # This converts $epoch_in_UTC into our local timezone.
+ my ($sec, $min, $hour, $mday, $mon, $year,
+ $wday, $yday, $isdst) = localtime($epoch_in_UTC);
+
+ $parsed_date = sprintf('%s %04d-%02d-%02d %02d:%02d:%02d',
+ $our_TZ, $year + 1900, $mon + 1,
+ $mday, $hour, $min, $sec);
+
+ # Reset us to the timezone in effect when we entered
+ # this routine.
+ if (defined $old_env_TZ) {
+ $ENV{TZ} = $old_env_TZ;
+ } else {
+ delete $ENV{TZ};
+ }
+ } else {
+ $parsed_date = "+0000 $Y-$m-$d $H:$M:$S";
+ }
+
+ return $parsed_date;
}
sub check_author {
use Carp qw/croak/;
use File::Temp qw/tempfile/;
use IO::File qw//;
+use vars qw/$_ignore_regex/;
# file baton members: path, mode_a, mode_b, pool, fh, blob, base
sub new {
my ($class, $git_svn) = @_;
my $self = SVN::Delta::Editor->new;
bless $self, $class;
- $self->{c} = $git_svn->{last_commit} if exists $git_svn->{last_commit};
+ if (exists $git_svn->{last_commit}) {
+ $self->{c} = $git_svn->{last_commit};
+ $self->{empty_symlinks} = _mark_empty_symlinks($git_svn);
+ }
$self->{empty} = {};
$self->{dir_prop} = {};
$self->{file_prop} = {};
$self;
}
+# This uses the Ra object, so it must be called before do_{switch,update}
+# is invoked, not from inside them (where only the SVN::Git::Fetcher
+# object is passed around).
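+#
+# Returns a hashref whose keys are the paths of empty blobs carrying the
+# svn:special property (assumed here to be the "empty symlinks" the
+# caller works around).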
+sub _mark_empty_symlinks {
+ my ($git_svn) = @_;
+ my %ret;
+ my ($rev, $cmt) = $git_svn->last_rev_commit;
+ return {} unless ($rev && $cmt);
+
+ chomp(my $empty_blob = `git hash-object -t blob --stdin < /dev/null`);
+ my ($ls, $ctx) = command_output_pipe(qw/ls-tree -r -z/, $cmt);
+ local $/ = "\0";
+ my $pfx = $git_svn->{path};
+ $pfx .= '/' if length($pfx);
+ while (<$ls>) {
+ chomp;
+ s/\A100644 blob $empty_blob\t//o or next;
+ my $path = $_;
+ my (undef, $props) =
+ $git_svn->ra->get_file($pfx.$path, $rev, undef);
+ if ($props->{'svn:special'}) {
+ $ret{$path} = 1;
+ }
+ }
+ command_close_pipe($ls, $ctx);
+ \%ret;
+}
+
+# returns true if a given path is inside a ".git" directory
+sub in_dot_git {
+ $_[0] =~ m{(?:^|/)\.git(?:/|$)};
+}
+
+# return value: 0 -- don't ignore, 1 -- ignore
+sub is_path_ignored {
+ my ($path) = @_;
+ return 1 if in_dot_git($path);
+ return 0 unless defined($_ignore_regex);
+ return 1 if $path =~ m!$_ignore_regex!o;
+ return 0;
+}
+
sub set_path_strip {
my ($self, $path) = @_;
$self->{path_strip} = qr/^\Q$path\E(\/|$)/ if length $path;
sub delete_entry {
my ($self, $path, $rev, $pb) = @_;
+ return undef if is_path_ignored($path);
my $gpath = $self->git_path($path);
return undef if ($gpath eq '');
sub open_file {
my ($self, $path, $pb, $rev) = @_;
+ my ($mode, $blob);
+
+ goto out if is_path_ignored($path);
+
my $gpath = $self->git_path($path);
- my ($mode, $blob) = (command('ls-tree', $self->{c}, '--', $gpath)
+ ($mode, $blob) = (command('ls-tree', $self->{c}, '--', $gpath)
=~ /^(\d{6}) blob ([a-f\d]{40})\t/);
unless (defined $mode && defined $blob) {
die "$path was not found in commit $self->{c} (r$rev)\n";
}
+ if ($mode eq '100644' && $self->{empty_symlinks}->{$path}) {
+ $mode = '120000';
+ }
+out:
{ path => $path, mode_a => $mode, mode_b => $mode, blob => $blob,
pool => SVN::Pool->new, action => 'M' };
}
sub add_file {
my ($self, $path, $pb, $cp_path, $cp_rev) = @_;
- my ($dir, $file) = ($path =~ m#^(.*?)/?([^/]+)$#);
- delete $self->{empty}->{$dir};
- { path => $path, mode_a => 100644, mode_b => 100644,
+ my $mode;
+
+ if (!is_path_ignored($path)) {
+ my ($dir, $file) = ($path =~ m#^(.*?)/?([^/]+)$#);
+ delete $self->{empty}->{$dir};
+ $mode = '100644';
+ }
+ { path => $path, mode_a => $mode, mode_b => $mode,
pool => SVN::Pool->new, action => 'A' };
}
sub add_directory {
my ($self, $path, $cp_path, $cp_rev) = @_;
+ goto out if is_path_ignored($path);
my $gpath = $self->git_path($path);
if ($gpath eq '') {
my ($ls, $ctx) = command_output_pipe(qw/ls-tree
my ($dir, $file) = ($path =~ m#^(.*?)/?([^/]+)$#);
delete $self->{empty}->{$dir};
$self->{empty}->{$path} = 1;
+out:
{ path => $path };
}
sub change_dir_prop {
my ($self, $db, $prop, $value) = @_;
+ return undef if is_path_ignored($db->{path});
$self->{dir_prop}->{$db->{path}} ||= {};
$self->{dir_prop}->{$db->{path}}->{$prop} = $value;
undef;
sub absent_directory {
my ($self, $path, $pb) = @_;
+ return undef if is_path_ignored($path);
$self->{absent_dir}->{$pb->{path}} ||= [];
push @{$self->{absent_dir}->{$pb->{path}}}, $path;
undef;
sub absent_file {
my ($self, $path, $pb) = @_;
+ return undef if is_path_ignored($path);
$self->{absent_file}->{$pb->{path}} ||= [];
push @{$self->{absent_file}->{$pb->{path}}}, $path;
undef;
sub change_file_prop {
my ($self, $fb, $prop, $value) = @_;
+ return undef if is_path_ignored($fb->{path});
if ($prop eq 'svn:executable') {
if ($fb->{mode_b} != 120000) {
$fb->{mode_b} = defined $value ? 100755 : 100644;
sub apply_textdelta {
my ($self, $fb, $exp) = @_;
+ return undef if is_path_ignored($fb->{path});
my $fh = $::_repository->temp_acquire('svn_delta');
# $fh gets auto-closed() by SVN::TxDelta::apply(),
# (but $base does not,) so dup() it for reading in close_file
open my $dup, '<&', $fh or croak $!;
my $base = $::_repository->temp_acquire('git_blob');
+
if ($fb->{blob}) {
- print $base 'link ' if ($fb->{mode_a} == 120000);
- my $size = $::_repository->cat_blob($fb->{blob}, $base);
+ my ($base_is_link, $size);
+
+ if ($fb->{mode_a} eq '120000' &&
+ ! $self->{empty_symlinks}->{$fb->{path}}) {
+ print $base 'link ' or die "print $!\n";
+ $base_is_link = 1;
+ }
+ retry:
+ $size = $::_repository->cat_blob($fb->{blob}, $base);
die "Failed to read object $fb->{blob}" if ($size < 0);
if (defined $exp) {
seek $base, 0, 0 or croak $!;
my $got = ::md5sum($base);
- die "Checksum mismatch: $fb->{path} $fb->{blob}\n",
- "expected: $exp\n",
- " got: $got\n" if ($got ne $exp);
+ if ($got ne $exp) {
+ my $err = "Checksum mismatch: ".
+ "$fb->{path} $fb->{blob}\n" .
+ "expected: $exp\n" .
+ " got: $got\n";
+ if ($base_is_link) {
+ warn $err,
+ "Retrying... (possibly ",
+ "a bad symlink from SVN)\n";
+ $::_repository->temp_reset($base);
+ $base_is_link = 0;
+ goto retry;
+ }
+ die $err;
+ }
}
}
seek $base, 0, 0 or croak $!;
sub close_file {
my ($self, $fb, $exp) = @_;
+ return undef if is_path_ignored($fb->{path});
+
my $hash;
my $path = $self->git_path($fb->{path});
if (my $fh = $fb->{fh}) {
}
if ($fb->{mode_b} == 120000) {
sysseek($fh, 0, 0) or croak $!;
- sysread($fh, my $buf, 5) == 5 or croak $!;
+ my $rd = sysread($fh, my $buf, 5);
- unless ($buf eq 'link ') {
+ if (!defined $rd) {
+ croak "sysread: $!\n";
+ } elsif ($rd == 0) {
+ warn "$path has mode 120000",
+ " but it points to nothing\n",
+ "converting to an empty file with mode",
+ " 100644\n";
+ $fb->{mode_b} = '100644';
+ } elsif ($buf ne 'link ') {
warn "$path has mode 120000",
- " but is not a link\n";
+ " but is not a link\n";
} else {
my $tmp_fh = $::_repository->temp_acquire(
'svn_hash');
BEGIN {
# enforce temporary pool usage for some simple functions
no strict 'refs';
- for my $f (qw/rev_proplist get_latest_revnum get_uuid get_repos_root/) {
+ for my $f (qw/rev_proplist get_latest_revnum get_uuid get_repos_root
+ get_file/) {
my $SUPER = "SUPER::$f";
*$f = sub {
my $self = shift;
# do not call the real DESTROY since we store ourselves in $RA
}
+# get_log(paths, start, end, limit,
+# discover_changed_paths, strict_node_history, receiver)
sub get_log {
my ($self, @args) = @_;
my $pool = SVN::Pool->new;
- splice(@args, 3, 1) if ($SVN::Core::VERSION le '1.2.0');
+
+ # the limit parameter was not supported in SVN 1.1.x, so we
+ # drop it. Therefore, the receiver callback passed to it
+	# is wrapped here so that it enforces the limit itself whenever
+	# a positive limit was requested.
+ if ($SVN::Core::VERSION le '1.2.0') {
+ my $limit = splice(@args, 3, 1);
+ if ($limit > 0) {
+ my $receiver = pop @args;
+ push(@args, sub { &$receiver(@_) if (--$limit >= 0) });
+ }
+ }
my $ret = $self->SUPER::get_log(@args, $pool);
$pool->clear;
$ret;
strbuf_release(&cmd);
}
+static int run_argv(int *argcp, const char ***argv)
+{
+ int done_alias = 0;
+
+ while (1) {
+ /* See if it's an internal command */
+ handle_internal_command(*argcp, *argv);
+
+ /* .. then try the external ones */
+ execv_dashed_external(*argv);
+
+ /* It could be an alias -- this works around the insanity
+ * of overriding "git log" with "git show" by having
+ * alias.log = show
+ */
+ if (done_alias || !handle_alias(argcp, argv))
+ break;
+ done_alias = 1;
+ }
+
+ return done_alias;
+}
+
int main(int argc, const char **argv)
{
const char *cmd = argv[0] && *argv[0] ? argv[0] : "git-help";
char *slash = (char *)cmd + strlen(cmd);
- int done_alias = 0;
/*
* Take the basename of argv[0] as the command
setup_path();
while (1) {
- /* See if it's an internal command */
- handle_internal_command(argc, argv);
-
- /* .. then try the external ones */
- execv_dashed_external(argv);
-
- /* It could be an alias -- this works around the insanity
- * of overriding "git log" with "git show" by having
- * alias.log = show
- */
- if (done_alias || !handle_alias(&argc, &argv))
+ static int done_help = 0;
+ static int was_alias = 0;
+ was_alias = run_argv(&argc, &argv);
+ if (errno != ENOENT)
break;
- done_alias = 1;
- }
-
- if (errno == ENOENT) {
- if (done_alias) {
+ if (was_alias) {
fprintf(stderr, "Expansion of alias '%s' failed; "
"'%s' is not a git-command\n",
cmd, argv[0]);
exit(1);
}
- argv[0] = help_unknown_cmd(cmd);
- handle_internal_command(argc, argv);
- execv_dashed_external(argv);
+ if (!done_help) {
+ cmd = argv[0] = help_unknown_cmd(cmd);
+ done_help = 1;
+ } else
+ break;
}
fprintf(stderr, "Failed to run command '%s': %s\n",
'ctags' => {
'override' => 0,
'default' => [0]},
+
+ # The maximum number of patches in a patchset generated in patch
+ # view. Set this to 0 or undef to disable patch view, or to a
+ # negative number to remove any limit.
+
+ # To disable system wide have in $GITWEB_CONFIG
+ # $feature{'patches'}{'default'} = [0];
+ # To have project specific config enable override in $GITWEB_CONFIG
+ # $feature{'patches'}{'override'} = 1;
+ # and in project config gitweb.patches = 0|n;
+ # where n is the maximum number of patches allowed in a patchset.
+ 'patches' => {
+ 'sub' => \&feature_patches,
+ 'override' => 0,
+ 'default' => [16]},
);
sub gitweb_get_feature {
return @fmts;
}
+sub feature_patches {
+ my @val = (git_get_project_config('patches', '--int'));
+
+ if (@val) {
+ return @val;
+ }
+
+ return ($_[0]);
+}
+
# checking HEAD file with -e is fragile if the repository was
# initialized long time ago (i.e. symlink HEAD) and was pack-ref'ed
# and then pruned.
"heads" => \&git_heads,
"history" => \&git_history,
"log" => \&git_log,
+ "patch" => \&git_patch,
+ "patches" => \&git_patches,
"rss" => \&git_rss,
"atom" => \&git_atom,
"search" => \&git_search,
}
my $use_pathinfo = gitweb_check_feature('pathinfo');
- if ($use_pathinfo) {
+ if ($use_pathinfo and defined $params{'project'}) {
# try to put as many parameters as possible in PATH_INFO:
# - project name
# - action
$href =~ s,/$,,;
# Then add the project name, if present
- $href .= "/".esc_url($params{'project'}) if defined $params{'project'};
+ $href .= "/".esc_url($params{'project'});
delete $params{'project'};
# since we destructively absorb parameters, we keep this
my $paging_nav = format_paging_nav('log', $hash, $head, $page, $#commitlist >= 100);
+ my ($patch_max) = gitweb_get_feature('patches');
+ if ($patch_max) {
+ if ($patch_max < 0 || @commitlist <= $patch_max) {
+ $paging_nav .= " ⋅ " .
+ $cgi->a({-href => href(action=>"patches", -replay=>1)},
+ "patches");
+ }
+ }
+
git_header_html();
git_print_page_nav('log','', $hash,undef,undef, $paging_nav);
} @$parents ) .
')';
}
+ if (gitweb_check_feature('patches')) {
+ $formats_nav .= " | " .
+ $cgi->a({-href => href(action=>"patch", -replay=>1)},
+ "patch");
+ }
if (!defined $parent) {
$parent = "--root";
}
sub git_commitdiff {
- my $format = shift || 'html';
+ my %params = @_;
+ my $format = $params{-format} || 'html';
+
+ my ($patch_max) = gitweb_get_feature('patches');
+ if ($format eq 'patch') {
+ die_error(403, "Patch view not allowed") unless $patch_max;
+ }
+
$hash ||= $hash_base || "HEAD";
my %co = parse_commit($hash)
or die_error(404, "Unknown commit object");
$formats_nav =
$cgi->a({-href => href(action=>"commitdiff_plain", -replay=>1)},
"raw");
+ if ($patch_max) {
+ $formats_nav .= " | " .
+ $cgi->a({-href => href(action=>"patch", -replay=>1)},
+ "patch");
+ }
if (defined $hash_parent &&
$hash_parent ne '-c' && $hash_parent ne '--cc') {
open $fd, "-|", git_cmd(), "diff-tree", '-r', @diff_opts,
'-p', $hash_parent_param, $hash, "--"
or die_error(500, "Open git-diff-tree failed");
-
+ } elsif ($format eq 'patch') {
+ # For commit ranges, we limit the output to the number of
+ # patches specified in the 'patches' feature.
+ # For single commits, we limit the output to a single patch,
+ # diverging from the git-format-patch default.
+ my @commit_spec = ();
+ if ($hash_parent) {
+ if ($patch_max > 0) {
+ push @commit_spec, "-$patch_max";
+ }
+ push @commit_spec, '-n', "$hash_parent..$hash";
+ } else {
+ if ($params{-single}) {
+ push @commit_spec, '-1';
+ } else {
+ if ($patch_max > 0) {
+ push @commit_spec, "-$patch_max";
+ }
+ push @commit_spec, "-n";
+ }
+ push @commit_spec, '--root', $hash;
+ }
+ open $fd, "-|", git_cmd(), "format-patch", '--encoding=utf8',
+ '--stdout', @commit_spec
+ or die_error(500, "Open git-format-patch failed");
} else {
die_error(400, "Unknown commitdiff format");
}
print to_utf8($line) . "\n";
}
print "---\n\n";
+ } elsif ($format eq 'patch') {
+ my $filename = basename($project) . "-$hash.patch";
+
+ print $cgi->header(
+ -type => 'text/plain',
+ -charset => 'utf-8',
+ -expires => $expires,
+ -content_disposition => 'inline; filename="' . "$filename" . '"');
}
# write patch
print <$fd>;
close $fd
or print "Reading git-diff-tree failed\n";
+ } elsif ($format eq 'patch') {
+ local $/ = undef;
+ print <$fd>;
+ close $fd
+ or print "Reading git-format-patch failed\n";
}
}
sub git_commitdiff_plain {
- git_commitdiff('plain');
+ git_commitdiff(-format => 'plain');
+}
+
+# format-patch-style patches
+sub git_patch {
+ git_commitdiff(-format => 'patch', -single=> 1);
+}
+
+sub git_patches {
+ git_commitdiff(-format => 'patch');
}
sub git_history {
$cgi->a({-href => href(-replay=>1, page=>$page+1),
-accesskey => "n", -title => "Alt-n"}, "next");
}
+ my $patch_max = gitweb_check_feature('patches');
+ if ($patch_max) {
+ if ($patch_max < 0 || @commitlist <= $patch_max) {
+ $paging_nav .= " ⋅ " .
+ $cgi->a({-href => href(action=>"patches", -replay=>1)},
+ "patches");
+ }
+ }
git_header_html();
git_print_page_nav('shortlog','', $hash,$hash,$hash, $paging_nav);
}
if (defined($commitlist[0])) {
%latest_commit = %{$commitlist[0]};
- %latest_date = parse_date($latest_commit{'author_epoch'});
+ my $latest_epoch = $latest_commit{'committer_epoch'};
+ %latest_date = parse_date($latest_epoch);
+ my $if_modified = $cgi->http('IF_MODIFIED_SINCE');
+ if (defined $if_modified) {
+ my $since;
+ if (eval { require HTTP::Date; 1; }) {
+ $since = HTTP::Date::str2time($if_modified);
+ } elsif (eval { require Time::ParseDate; 1; }) {
+ $since = Time::ParseDate::parsedate($if_modified, GMT => 1);
+ }
+ if (defined $since && $latest_epoch <= $since) {
+ print $cgi->header(
+ -type => $content_type,
+ -charset => 'utf-8',
+ -last_modified => $latest_date{'rfc2822'},
+ -status => '304 Not Modified');
+ return;
+ }
+ }
print $cgi->header(
-type => $content_type,
-charset => 'utf-8',
print "<title>$title</title>\n" .
"<link>$alt_url</link>\n" .
"<description>$descr</description>\n" .
- "<language>en</language>\n";
+ "<language>en</language>\n" .
+ # project owner is responsible for 'editorial' content
+ "<managingEditor>$owner</managingEditor>\n";
+ if (defined $logo || defined $favicon) {
+ # prefer the logo to the favicon, since RSS
+ # doesn't allow both
+ my $img = esc_url($logo || $favicon);
+ print "<image>\n" .
+ "<url>$img</url>\n" .
+ "<title>$title</title>\n" .
+ "<link>$alt_url</link>\n" .
+ "</image>\n";
+ }
+ if (%latest_date) {
+ print "<pubDate>$latest_date{'rfc2822'}</pubDate>\n";
+ print "<lastBuildDate>$latest_date{'rfc2822'}</lastBuildDate>\n";
+ }
+ print "<generator>gitweb v.$version/$git_version</generator>\n";
} elsif ($format eq 'atom') {
print <<XML;
<feed xmlns="http://www.w3.org/2005/Atom">
} else {
print "<updated>$latest_date{'iso-8601'}</updated>\n";
}
+ print "<generator version='$version/$git_version'>gitweb</generator>\n";
}
# contents
sub git_opml {
my @list = git_get_projects_list();
- print $cgi->header(-type => 'text/xml', -charset => 'utf-8');
+ print $cgi->header(
+ -type => 'text/xml',
+ -charset => 'utf-8',
+ -content_disposition => 'inline; filename="opml.xml"');
+
print <<XML;
<?xml version="1.0" encoding="utf-8"?>
<opml version="1.0">
p->next = NULL;
}
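+/*
+ * Returns true when the pattern contains no regex special characters
+ * and can therefore be matched as a literal string.
+ */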
+static int is_fixed(const char *s)
+{
+ while (*s && !is_regex_special(*s))
+ s++;
+ return !*s;
+}
+
static void compile_regexp(struct grep_pat *p, struct grep_opt *opt)
{
- int err = regcomp(&p->regexp, p->pattern, opt->regflags);
+ int err;
+
+ if (opt->fixed || is_fixed(p->pattern))
+ p->fixed = 1;
+ if (opt->regflags & REG_ICASE)
+ p->fixed = 0;
+ if (p->fixed)
+ return;
+
+ err = regcomp(&p->regexp, p->pattern, opt->regflags);
if (err) {
char errbuf[1024];
char where[1024];
case GREP_PATTERN: /* atom */
case GREP_PATTERN_HEAD:
case GREP_PATTERN_BODY:
- if (!opt->fixed)
- compile_regexp(p, opt);
+ compile_regexp(p, opt);
break;
default:
opt->extended = 1;
static int match_one_pattern(struct grep_opt *opt, struct grep_pat *p, char *bol, char *eol, enum grep_context ctx)
{
int hit = 0;
- int at_true_bol = 1;
int saved_ch = 0;
regmatch_t pmatch[10];
}
again:
- if (!opt->fixed) {
+ if (!p->fixed) {
regex_t *exp = &p->regexp;
hit = !regexec(exp, bol, ARRAY_SIZE(pmatch),
pmatch, 0);
* either end of the line, or at word boundary
* (i.e. the next char must not be a word char).
*/
- if ( ((pmatch[0].rm_so == 0 && at_true_bol) ||
+ if ( ((pmatch[0].rm_so == 0) ||
!word_char(bol[pmatch[0].rm_so-1])) &&
((pmatch[0].rm_eo == (eol-bol)) ||
!word_char(bol[pmatch[0].rm_eo])) )
/* There could be more than one match on the
* line, and the first match might not be
* strict word match. But later ones could be!
+ * Forward to the next possible start, i.e. the
+ * next position following a non-word char.
*/
bol = pmatch[0].rm_so + bol + 1;
- at_true_bol = 0;
- goto again;
+ while (word_char(bol[-1]) && bol < eol)
+ bol++;
+ if (bol < eol)
+ goto again;
}
}
if (p->token == GREP_PATTERN_HEAD && saved_ch)
const char *pattern;
enum grep_header_field field;
regex_t regexp;
+ unsigned fixed:1;
};
enum grep_expr_node {
struct remote_ls_ctx *parent;
};
+/* get_dav_token_headers options */
+enum dav_header_flag {
+ DAV_HEADER_IF = (1u << 0),
+ DAV_HEADER_LOCK = (1u << 1),
+ DAV_HEADER_TIMEOUT = (1u << 2)
+};
+
+static struct curl_slist *get_dav_token_headers(struct remote_lock *lock, enum dav_header_flag options)
+{
+ struct strbuf buf = STRBUF_INIT;
+ struct curl_slist *dav_headers = NULL;
+
+ if (options & DAV_HEADER_IF) {
+ strbuf_addf(&buf, "If: (<%s>)", lock->token);
+ dav_headers = curl_slist_append(dav_headers, buf.buf);
+ strbuf_reset(&buf);
+ }
+ if (options & DAV_HEADER_LOCK) {
+ strbuf_addf(&buf, "Lock-Token: <%s>", lock->token);
+ dav_headers = curl_slist_append(dav_headers, buf.buf);
+ strbuf_reset(&buf);
+ }
+ if (options & DAV_HEADER_TIMEOUT) {
+ strbuf_addf(&buf, "Timeout: Second-%ld", lock->timeout);
+ dav_headers = curl_slist_append(dav_headers, buf.buf);
+ strbuf_reset(&buf);
+ }
+ strbuf_release(&buf);
+
+ return dav_headers;
+}
+
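get_dav_token_headers() above replaces the per-call-site xmalloc/sprintf header construction; the enum bits can be OR-ed together when a request needs more than one DAV header. A minimal sketch of a caller, assuming http-push.c's internal declarations and libcurl; demo_refresh_headers is a made-up name and not part of the patch:

/*
 * Illustrative only: a lock-refreshing request wants both the "If:" and
 * the "Timeout:" headers, so the two flags are OR-ed together.  The
 * returned curl_slist must be freed by the caller.
 */
static void demo_refresh_headers(struct remote_lock *lock, CURL *curl)
{
	struct curl_slist *dav_headers =
		get_dav_token_headers(lock, DAV_HEADER_IF | DAV_HEADER_TIMEOUT);

	curl_easy_setopt(curl, CURLOPT_HTTPHEADER, dav_headers);
	/* ... start and run the request here ... */
	curl_slist_free_all(dav_headers);
}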
static void finish_request(struct transfer_request *request);
static void release_request(struct transfer_request *request);
do {
request->stream.next_out = expn;
request->stream.avail_out = sizeof(expn);
- request->zret = inflate(&request->stream, Z_SYNC_FLUSH);
+ request->zret = git_inflate(&request->stream, Z_SYNC_FLUSH);
git_SHA1_Update(&request->c, expn,
sizeof(expn) - request->stream.avail_out);
} while (request->stream.avail_in && request->zret == Z_OK);
memset(&request->stream, 0, sizeof(request->stream));
- inflateInit(&request->stream);
+ git_inflate_init(&request->stream);
git_SHA1_Init(&request->c);
file; also rewind to the beginning of the local file. */
if (prev_read == -1) {
memset(&request->stream, 0, sizeof(request->stream));
- inflateInit(&request->stream);
+ git_inflate_init(&request->stream);
git_SHA1_Init(&request->c);
if (prev_posn>0) {
prev_posn = 0;
{
struct active_request_slot *slot;
struct slot_results results;
- char *if_header;
- char timeout_header[25];
- struct curl_slist *dav_headers = NULL;
+ struct curl_slist *dav_headers;
int rc = 0;
lock->refreshing = 1;
- if_header = xmalloc(strlen(lock->token) + 25);
- sprintf(if_header, "If: (<%s>)", lock->token);
- sprintf(timeout_header, "Timeout: Second-%ld", lock->timeout);
- dav_headers = curl_slist_append(dav_headers, if_header);
- dav_headers = curl_slist_append(dav_headers, timeout_header);
+ dav_headers = get_dav_token_headers(lock, DAV_HEADER_IF | DAV_HEADER_TIMEOUT);
slot = get_active_slot();
slot->results = &results;
lock->refreshing = 0;
curl_slist_free_all(dav_headers);
- free(if_header);
return rc;
}
if (request->http_code == 416)
fprintf(stderr, "Warning: requested range invalid; we may already have all the data.\n");
- inflateEnd(&request->stream);
+ git_inflate_end(&request->stream);
git_SHA1_Final(request->real_sha1, &request->c);
if (request->zret != Z_STREAM_END) {
unlink(request->tmpfile);
/* Make sure leading directories exist for the remote ref */
ep = strchr(url + strlen(remote->url) + 1, '/');
while (ep) {
- *ep = 0;
+ char saved_character = ep[1];
+ ep[1] = '\0';
slot = get_active_slot();
slot->results = &results;
curl_easy_setopt(slot->curl, CURLOPT_HTTPGET, 1);
free(url);
return NULL;
}
- *ep = '/';
+ ep[1] = saved_character;
ep = strchr(ep + 1, '/');
}
struct active_request_slot *slot;
struct slot_results results;
struct remote_lock *prev = remote->locks;
- char *lock_token_header;
- struct curl_slist *dav_headers = NULL;
+ struct curl_slist *dav_headers;
int rc = 0;
- lock_token_header = xmalloc(strlen(lock->token) + 31);
- sprintf(lock_token_header, "Lock-Token: <%s>",
- lock->token);
- dav_headers = curl_slist_append(dav_headers, lock_token_header);
+ dav_headers = get_dav_token_headers(lock, DAV_HEADER_LOCK);
slot = get_active_slot();
slot->results = &results;
}
curl_slist_free_all(dav_headers);
- free(lock_token_header);
if (remote->locks == lock) {
remote->locks = lock->next;
}
if (path) {
path += remote->path_len;
+ ls->dentry_name = xstrdup(path);
}
- ls->dentry_name = xmalloc(strlen(path) -
- remote->path_len + 1);
- strcpy(ls->dentry_name, path + remote->path_len);
} else if (!strcmp(ctx->name, DAV_PROPFIND_COLLECTION)) {
ls->dentry_flags |= IS_DIR;
}
}
}
+/*
+ * NEEDSWORK: remote_ls() ignores info/refs on the remote side. But it
+ * should _only_ heed the information from that file, instead of trying to
+ * determine the refs from the remote file system (badly: it does not even
+ * know about packed-refs).
+ */
static void remote_ls(const char *path, int flags,
void (*userFunc)(struct remote_ls_ctx *ls),
void *userData)
{
struct active_request_slot *slot;
struct slot_results results;
- char *if_header;
struct buffer out_buffer = { STRBUF_INIT, 0 };
- struct curl_slist *dav_headers = NULL;
+ struct curl_slist *dav_headers;
- if_header = xmalloc(strlen(lock->token) + 25);
- sprintf(if_header, "If: (<%s>)", lock->token);
- dav_headers = curl_slist_append(dav_headers, if_header);
+ dav_headers = get_dav_token_headers(lock, DAV_HEADER_IF);
strbuf_addf(&out_buffer.buf, "%s\n", sha1_to_hex(sha1));
if (start_active_slot(slot)) {
run_active_slot(slot);
strbuf_release(&out_buffer.buf);
- free(if_header);
if (results.curl_result != CURLE_OK) {
fprintf(stderr,
"PUT error: curl result=%d, HTTP code=%ld\n",
}
} else {
strbuf_release(&out_buffer.buf);
- free(if_header);
fprintf(stderr, "Unable to start PUT request\n");
return 0;
}
struct buffer buffer = { STRBUF_INIT, 0 };
struct active_request_slot *slot;
struct slot_results results;
- char *if_header;
- struct curl_slist *dav_headers = NULL;
+ struct curl_slist *dav_headers;
remote_ls("refs/", (PROCESS_FILES | RECURSIVE),
add_remote_info_ref, &buffer.buf);
if (!aborted) {
- if_header = xmalloc(strlen(lock->token) + 25);
- sprintf(if_header, "If: (<%s>)", lock->token);
- dav_headers = curl_slist_append(dav_headers, if_header);
+ dav_headers = get_dav_token_headers(lock, DAV_HEADER_IF);
slot = get_active_slot();
slot->results = &results;
results.curl_result, results.http_code);
}
}
- free(if_header);
}
strbuf_release(&buffer.buf);
}
do {
obj_req->stream.next_out = expn;
obj_req->stream.avail_out = sizeof(expn);
- obj_req->zret = inflate(&obj_req->stream, Z_SYNC_FLUSH);
+ obj_req->zret = git_inflate(&obj_req->stream, Z_SYNC_FLUSH);
git_SHA1_Update(&obj_req->c, expn,
sizeof(expn) - obj_req->stream.avail_out);
} while (obj_req->stream.avail_in && obj_req->zret == Z_OK);
memset(&obj_req->stream, 0, sizeof(obj_req->stream));
- inflateInit(&obj_req->stream);
+ git_inflate_init(&obj_req->stream);
git_SHA1_Init(&obj_req->c);
file; also rewind to the beginning of the local file. */
if (prev_read == -1) {
memset(&obj_req->stream, 0, sizeof(obj_req->stream));
- inflateInit(&obj_req->stream);
+ git_inflate_init(&obj_req->stream);
git_SHA1_Init(&obj_req->c);
if (prev_posn>0) {
prev_posn = 0;
return;
}
- inflateEnd(&obj_req->stream);
+ git_inflate_end(&obj_req->stream);
git_SHA1_Final(obj_req->real_sha1, &obj_req->c);
if (obj_req->zret != Z_STREAM_END) {
unlink(obj_req->tmpfile);
stream.avail_out = size;
stream.next_in = fill(1);
stream.avail_in = input_len;
- inflateInit(&stream);
+ git_inflate_init(&stream);
for (;;) {
- int ret = inflate(&stream, 0);
+ int ret = git_inflate(&stream, 0);
use(input_len - stream.avail_in);
if (stream.total_out == size && ret == Z_STREAM_END)
break;
stream.next_in = fill(1);
stream.avail_in = input_len;
}
- inflateEnd(&stream);
+ git_inflate_end(&stream);
return buf;
}
stream.avail_out = obj->size;
stream.next_in = src;
stream.avail_in = len;
- inflateInit(&stream);
- while ((st = inflate(&stream, Z_FINISH)) == Z_OK);
- inflateEnd(&stream);
+ git_inflate_init(&stream);
+ while ((st = git_inflate(&stream, Z_FINISH)) == Z_OK);
+ git_inflate_end(&stream);
if (st != Z_STREAM_END || stream.total_out != obj->size)
die("serious inflate inconsistency");
free(src);
objects[nr].mode = mode;
array->nr = ++nr;
}
+
+void object_array_remove_duplicates(struct object_array *array)
+{
+ int ref, src, dst;
+ struct object_array_entry *objects = array->objects;
+
+ for (ref = 0; ref < array->nr - 1; ref++) {
+ for (src = ref + 1, dst = src;
+ src < array->nr;
+ src++) {
+ if (!strcmp(objects[ref].name, objects[src].name))
+ continue;
+ if (src != dst)
+ objects[dst] = objects[src];
+ dst++;
+ }
+ array->nr = dst;
+ }
+}
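object_array_remove_duplicates() keeps the first entry for each name and drops later entries whose name compares equal; the matching declaration in object.h follows below. A minimal sketch of the intended use, assuming object.h's add_object_array(); demo_dedup and the ref names are invented for illustration:

/*
 * Illustrative only: add the same name twice, then deduplicate.
 * 'obj' is assumed to be an already-looked-up object.
 */
static void demo_dedup(struct object *obj)
{
	struct object_array array = { 0, 0, NULL };

	add_object_array(obj, "refs/heads/master", &array);
	add_object_array(obj, "refs/heads/master", &array);	/* duplicate */
	add_object_array(obj, "refs/heads/next", &array);

	object_array_remove_duplicates(&array);
	/* array.nr is now 2: the first "master" entry and "next" survive */
}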
/* Object array handling .. */
void add_object_array(struct object *obj, const char *name, struct object_array *array);
void add_object_array_with_mode(struct object *obj, const char *name, struct object_array *array, unsigned mode);
+void object_array_remove_duplicates(struct object_array *);
#endif /* OBJECT_H */
#include "string-list.h"
#include "mailmap.h"
#include "log-tree.h"
+#include "color.h"
static char *user_format;
context->commit_header_parsed = 1;
}
-static const char *format_subject(struct strbuf *sb, const char *msg,
- const char *line_separator)
+const char *format_subject(struct strbuf *sb, const char *msg,
+ const char *line_separator)
{
int first = 1;
/* these are independent of the commit */
switch (placeholder[0]) {
case 'C':
+ if (placeholder[1] == '(') {
+ const char *end = strchr(placeholder + 2, ')');
+ char color[COLOR_MAXLEN];
+ if (!end)
+ return 0;
+ color_parse_mem(placeholder + 2,
+ end - (placeholder + 2),
+ "--pretty format", color);
+ strbuf_addstr(sb, color);
+ return end - placeholder + 1;
+ }
if (!prefixcmp(placeholder + 1, "red")) {
strbuf_addstr(sb, "\033[31m");
return 4;
#include "commit.h"
#include "diff.h"
#include "revision.h"
+#include "dir.h"
static struct refspec s_tag_refspec = {
0,
static int valid_remote_nick(const char *name)
{
- if (!name[0] || /* not empty */
- (name[0] == '.' && /* not "." */
- (!name[1] || /* not ".." */
- (name[1] == '.' && !name[2]))))
+ if (!name[0] || is_dot_or_dotdot(name))
return 0;
return !strchr(name, '/'); /* no slash */
}
if (!strcmp(arg, "--all")) {
handle_refs(revs, flags, for_each_ref);
+ handle_refs(revs, flags, head_ref);
continue;
}
if (!strcmp(arg, "--branches")) {
#endif
return ret;
}
+
+int run_hook(const char *index_file, const char *name, ...)
+{
+ struct child_process hook;
+ const char **argv = NULL, *env[2];
+ char index[PATH_MAX];
+ va_list args;
+ int ret;
+ size_t i = 0, alloc = 0;
+
+ if (access(git_path("hooks/%s", name), X_OK) < 0)
+ return 0;
+
+ va_start(args, name);
+ ALLOC_GROW(argv, i + 1, alloc);
+ argv[i++] = git_path("hooks/%s", name);
+ while (argv[i-1]) {
+ ALLOC_GROW(argv, i + 1, alloc);
+ argv[i++] = va_arg(args, const char *);
+ }
+ va_end(args);
+
+ memset(&hook, 0, sizeof(hook));
+ hook.argv = argv;
+ hook.no_stdin = 1;
+ hook.stdout_to_stderr = 1;
+ if (index_file) {
+ snprintf(index, sizeof(index), "GIT_INDEX_FILE=%s", index_file);
+ env[0] = index;
+ env[1] = NULL;
+ hook.env = env;
+ }
+
+	ret = start_command(&hook);
+	if (ret) {
+		warning("Could not spawn %s", argv[0]);
+		free(argv);
+		return ret;
+	}
+	ret = finish_command(&hook);
+	if (ret == -ERR_RUN_COMMAND_WAITPID_SIGNAL)
+		warning("%s exited due to uncaught signal", argv[0]);
+
+	free(argv);
+	return ret;
+}
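run_hook() returns 0 silently when the hook is missing or not executable, expects a NULL-terminated argument list, and exports index_file as GIT_INDEX_FILE when it is non-NULL. A hedged sketch of a call site; the hook name and helper are invented for illustration and are not part of the patch:

/*
 * Illustrative only: run a hypothetical "post-frobnicate" hook with two
 * arguments against a temporary index.  The trailing NULL terminates the
 * variable argument list, as run_hook() requires.
 */
static int frobnicate_finished(const char *tmp_index,
			       const char *old_name, const char *new_name)
{
	return run_hook(tmp_index, "post-frobnicate", old_name, new_name, NULL);
}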
int finish_command(struct child_process *);
int run_command(struct child_process *);
+extern int run_hook(const char *index_file, const char *name, ...);
+
#define RUN_COMMAND_NO_STDIN 1
#define RUN_GIT_CMD 2 /*If this is to be git sub-command */
#define RUN_COMMAND_STDOUT_TO_STDERR 4
inside_git_dir = 1;
if (!work_tree_env)
inside_work_tree = 0;
- setenv(GIT_DIR_ENVIRONMENT, ".", 1);
+ if (offset != len) {
+ cwd[offset] = '\0';
+ setenv(GIT_DIR_ENVIRONMENT, cwd, 1);
+ } else
+ setenv(GIT_DIR_ENVIRONMENT, ".", 1);
check_repository_format_gently(nongit_ok);
return NULL;
}
stream->avail_out = bufsiz;
if (legacy_loose_object(map)) {
- inflateInit(stream);
- return inflate(stream, 0);
+ git_inflate_init(stream);
+ return git_inflate(stream, 0);
}
/* Set up the stream for the rest.. */
stream->next_in = map;
stream->avail_in = mapsize;
- inflateInit(stream);
+ git_inflate_init(stream);
/* And generate the fake traditional header */
stream->total_out = 1 + snprintf(buffer, bufsiz, "%s %lu",
stream->next_out = buf + bytes;
stream->avail_out = size - bytes;
while (status == Z_OK)
- status = inflate(stream, Z_FINISH);
+ status = git_inflate(stream, Z_FINISH);
}
buf[size] = 0;
if (status == Z_STREAM_END && !stream->avail_in) {
- inflateEnd(stream);
+ git_inflate_end(stream);
return buf;
}
stream.next_out = delta_head;
stream.avail_out = sizeof(delta_head);
- inflateInit(&stream);
+ git_inflate_init(&stream);
do {
in = use_pack(p, w_curs, curpos, &stream.avail_in);
stream.next_in = in;
- st = inflate(&stream, Z_FINISH);
+ st = git_inflate(&stream, Z_FINISH);
curpos += stream.next_in - in;
} while ((st == Z_OK || st == Z_BUF_ERROR) &&
stream.total_out < sizeof(delta_head));
- inflateEnd(&stream);
+ git_inflate_end(&stream);
if ((st != Z_STREAM_END) && stream.total_out != sizeof(delta_head)) {
error("delta data unpack-initial failed");
return 0;
stream.next_out = buffer;
stream.avail_out = size;
- inflateInit(&stream);
+ git_inflate_init(&stream);
do {
in = use_pack(p, w_curs, curpos, &stream.avail_in);
stream.next_in = in;
- st = inflate(&stream, Z_FINISH);
+ st = git_inflate(&stream, Z_FINISH);
curpos += stream.next_in - in;
} while (st == Z_OK || st == Z_BUF_ERROR);
- inflateEnd(&stream);
+ git_inflate_end(&stream);
if ((st != Z_STREAM_END) || stream.total_out != size) {
free(buffer);
return NULL;
status = error("unable to parse %s header", sha1_to_hex(sha1));
else if (sizep)
*sizep = size;
- inflateEnd(&stream);
+ git_inflate_end(&stream);
munmap(map, mapsize);
return status;
}
/* basic@{time or number or -number} format to query ref-log */
reflog_len = at = 0;
- if (str[len-1] == '}') {
+ if (len && str[len-1] == '}') {
for (at = len-2; at >= 0; at--) {
if (str[at] == '@' && str[at+1] == '{') {
reflog_len = (len-1) - (at+2);
#include "cache.h"
-struct pathname {
+static struct cache_def {
+ char path[PATH_MAX + 1];
int len;
- char path[PATH_MAX];
-};
+ int flags;
+ int track_flags;
+ int prefix_len_stat_func;
+} cache;
-/* Return matching pathname prefix length, or zero if not matching */
-static inline int match_pathname(int len, const char *name, struct pathname *match)
+/*
+ * Returns the length (on a path component basis) of the longest
+ * common prefix match of 'name' and the cached path string.
+ */
+static inline int longest_match_lstat_cache(int len, const char *name,
+ int *previous_slash)
{
- int match_len = match->len;
- return (len > match_len &&
- name[match_len] == '/' &&
- !memcmp(name, match->path, match_len)) ? match_len : 0;
+ int max_len, match_len = 0, match_len_prev = 0, i = 0;
+
+ max_len = len < cache.len ? len : cache.len;
+ while (i < max_len && name[i] == cache.path[i]) {
+ if (name[i] == '/') {
+ match_len_prev = match_len;
+ match_len = i;
+ }
+ i++;
+ }
+ /* Is the cached path string a substring of 'name'? */
+ if (i == cache.len && cache.len < len && name[cache.len] == '/') {
+ match_len_prev = match_len;
+ match_len = cache.len;
+ /* Is 'name' a substring of the cached path string? */
+ } else if ((i == len && len < cache.len && cache.path[len] == '/') ||
+ (i == len && len == cache.len)) {
+ match_len_prev = match_len;
+ match_len = len;
+ }
+ *previous_slash = match_len_prev;
+ return match_len;
}
-static inline void set_pathname(int len, const char *name, struct pathname *match)
+static inline void reset_lstat_cache(int track_flags, int prefix_len_stat_func)
{
- if (len < PATH_MAX) {
- match->len = len;
- memcpy(match->path, name, len);
- match->path[len] = 0;
- }
+ cache.path[0] = '\0';
+ cache.len = 0;
+ cache.flags = 0;
+ cache.track_flags = track_flags;
+ cache.prefix_len_stat_func = prefix_len_stat_func;
}
-int has_symlink_leading_path(int len, const char *name)
+#define FL_DIR (1 << 0)
+#define FL_NOENT (1 << 1)
+#define FL_SYMLINK (1 << 2)
+#define FL_LSTATERR (1 << 3)
+#define FL_ERR (1 << 4)
+#define FL_FULLPATH (1 << 5)
+
+/*
+ * Check if name 'name' of length 'len' has a symlink leading
+ * component, or if the directory exists and is real, or not.
+ *
+ * To speed up the check, some information is allowed to be cached.
+ * This can be indicated by the 'track_flags' argument, which also can
+ * be used to indicate that we should check the full path.
+ *
+ * The 'prefix_len_stat_func' parameter can be used to set the length
+ * of the prefix, where the cache should use the stat() function
+ * instead of the lstat() function to test each path component.
+ */
+static int lstat_cache(int len, const char *name,
+ int track_flags, int prefix_len_stat_func)
{
- static struct pathname link, nonlink;
- char path[PATH_MAX];
+ int match_len, last_slash, last_slash_dir, previous_slash;
+ int match_flags, ret_flags, save_flags, max_len, ret;
struct stat st;
- char *sp;
- int known_dir;
- /*
- * See if the last known symlink cache matches.
- */
- if (match_pathname(len, name, &link))
- return 1;
+ if (cache.track_flags != track_flags ||
+ cache.prefix_len_stat_func != prefix_len_stat_func) {
+ /*
+ * As a safeguard we clear the cache if the values of
+		 * track_flags and/or prefix_len_stat_func do not
+		 * match the last supplied values.
+ */
+ reset_lstat_cache(track_flags, prefix_len_stat_func);
+ match_len = last_slash = 0;
+ } else {
+ /*
+ * Check to see if we have a match from the cache for
+ * the 2 "excluding" path types.
+ */
+ match_len = last_slash =
+ longest_match_lstat_cache(len, name, &previous_slash);
+ match_flags = cache.flags & track_flags & (FL_NOENT|FL_SYMLINK);
+ if (match_flags && match_len == cache.len)
+ return match_flags;
+ /*
+ * If we now have match_len > 0, we would know that
+ * the matched part will always be a directory.
+ *
+ * Also, if we are tracking directories and 'name' is
+ * a substring of the cache on a path component basis,
+ * we can return immediately.
+ */
+ match_flags = track_flags & FL_DIR;
+ if (match_flags && len == match_len)
+ return match_flags;
+ }
/*
- * Get rid of the last known directory part
+ * Okay, no match from the cache so far, so now we have to
+ * check the rest of the path components.
*/
- known_dir = match_pathname(len, name, &nonlink);
-
- while ((sp = strchr(name + known_dir + 1, '/')) != NULL) {
- int thislen = sp - name ;
- memcpy(path, name, thislen);
- path[thislen] = 0;
-
- if (lstat(path, &st))
- return 0;
- if (S_ISDIR(st.st_mode)) {
- set_pathname(thislen, path, &nonlink);
- known_dir = thislen;
+ ret_flags = FL_DIR;
+ last_slash_dir = last_slash;
+ max_len = len < PATH_MAX ? len : PATH_MAX;
+ while (match_len < max_len) {
+ do {
+ cache.path[match_len] = name[match_len];
+ match_len++;
+ } while (match_len < max_len && name[match_len] != '/');
+ if (match_len >= max_len && !(track_flags & FL_FULLPATH))
+ break;
+ last_slash = match_len;
+ cache.path[last_slash] = '\0';
+
+ if (last_slash <= prefix_len_stat_func)
+ ret = stat(cache.path, &st);
+ else
+ ret = lstat(cache.path, &st);
+
+ if (ret) {
+ ret_flags = FL_LSTATERR;
+ if (errno == ENOENT)
+ ret_flags |= FL_NOENT;
+ } else if (S_ISDIR(st.st_mode)) {
+ last_slash_dir = last_slash;
continue;
- }
- if (S_ISLNK(st.st_mode)) {
- set_pathname(thislen, path, &link);
- return 1;
+ } else if (S_ISLNK(st.st_mode)) {
+ ret_flags = FL_SYMLINK;
+ } else {
+ ret_flags = FL_ERR;
}
break;
}
- return 0;
+
+ /*
+ * At the end update the cache. Note that max 3 different
+ * path types, FL_NOENT, FL_SYMLINK and FL_DIR, can be cached
+ * for the moment!
+ */
+ save_flags = ret_flags & track_flags & (FL_NOENT|FL_SYMLINK);
+ if (save_flags && last_slash > 0 && last_slash <= PATH_MAX) {
+ cache.path[last_slash] = '\0';
+ cache.len = last_slash;
+ cache.flags = save_flags;
+ } else if (track_flags & FL_DIR &&
+ last_slash_dir > 0 && last_slash_dir <= PATH_MAX) {
+ /*
+ * We have a separate test for the directory case,
+ * since it could be that we have found a symlink or a
+ * non-existing directory and the track_flags says
+ * that we cannot cache this fact, so the cache would
+ * then have been left empty in this case.
+ *
+ * But if we are allowed to track real directories, we
+ * can still cache the path components before the last
+ * one (the found symlink or non-existing component).
+ */
+ cache.path[last_slash_dir] = '\0';
+ cache.len = last_slash_dir;
+ cache.flags = FL_DIR;
+ } else {
+ reset_lstat_cache(track_flags, prefix_len_stat_func);
+ }
+ return ret_flags;
+}
+
+/*
+ * Invalidate the given 'name' from the cache, if 'name' matches
+ * completely with the cache.
+ */
+void invalidate_lstat_cache(int len, const char *name)
+{
+ int match_len, previous_slash;
+
+ match_len = longest_match_lstat_cache(len, name, &previous_slash);
+ if (len == match_len) {
+ if ((cache.track_flags & FL_DIR) && previous_slash > 0) {
+ cache.path[previous_slash] = '\0';
+ cache.len = previous_slash;
+ cache.flags = FL_DIR;
+ } else
+ reset_lstat_cache(cache.track_flags,
+ cache.prefix_len_stat_func);
+ }
+}
+
+/*
+ * Completely clear the contents of the cache
+ */
+void clear_lstat_cache(void)
+{
+ reset_lstat_cache(0, 0);
+}
+
+#define USE_ONLY_LSTAT 0
+
+/*
+ * Return non-zero if path 'name' has a leading symlink component
+ */
+int has_symlink_leading_path(int len, const char *name)
+{
+ return lstat_cache(len, name,
+ FL_SYMLINK|FL_DIR, USE_ONLY_LSTAT) &
+ FL_SYMLINK;
+}
+
+/*
+ * Return non-zero if path 'name' has a leading symlink component or
+ * if some leading path component does not exists.
+ */
+int has_symlink_or_noent_leading_path(int len, const char *name)
+{
+ return lstat_cache(len, name,
+ FL_SYMLINK|FL_NOENT|FL_DIR, USE_ONLY_LSTAT) &
+ (FL_SYMLINK|FL_NOENT);
+}
+
+/*
+ * Return non-zero if all path components of 'name' exist as
+ * directories. If prefix_len > 0, we will test with the stat()
+ * function instead of the lstat() function for a prefix length of
+ * 'prefix_len', thus we then allow for symlinks in the prefix part as
+ * long as those points to real existing directories.
+ */
+int has_dirs_only_path(int len, const char *name, int prefix_len)
+{
+ return lstat_cache(len, name,
+ FL_DIR|FL_FULLPATH, prefix_len) &
+ FL_DIR;
}
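The three wrappers above reduce lstat_cache()'s FL_* result bits to simple yes/no answers. A small sketch of how a caller might combine them, assuming the declarations above plus <string.h>; the helper names are invented for illustration:

/* Illustrative only, not part of the patch. */
static int can_write_entry(const char *path)
{
	/* refuse to write through a symlinked leading directory */
	return !has_symlink_leading_path(strlen(path), path);
}

static int parent_dirs_exist(const char *dir)
{
	/* every component of 'dir' must be a real, existing directory */
	return has_dirs_only_path(strlen(dir), dir, 0);
}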
if ! test -x "$LIB_HTTPD_PATH"
then
- say "skipping test, no web server found at '$LIB_HTTPD_PATH'"
- test_done
- exit
+ say "skipping test, no web server found at '$LIB_HTTPD_PATH'"
+ test_done
+ exit
fi
HTTPD_VERSION=`$LIB_HTTPD_PATH -v | \
start_httpd() {
prepare_httpd
- trap 'stop_httpd; die' exit
+ trap 'stop_httpd; die' EXIT
"$LIB_HTTPD_PATH" -d "$HTTPD_ROOT_PATH" \
-f "$TEST_PATH/apache.conf" $HTTPD_PARA \
}
stop_httpd() {
- trap 'die' exit
+ trap 'die' EXIT
"$LIB_HTTPD_PATH" -d "$HTTPD_ROOT_PATH" \
-f "$TEST_PATH/apache.conf" -k stop
ServerName dummy
PidFile httpd.pid
DocumentRoot www
+LogFormat "%h %l %u %t \"%r\" %>s %b" common
+CustomLog access.log common
ErrorLog error.log
<IfDefine SSL>
--- /dev/null
+#!/bin/sh
+
+test_description='check that the most basic functions work
+
+Verify wrappers and compatibility functions.
+'
+
+. ./test-lib.sh
+
+test_expect_success 'character classes (isspace, isalpha etc.)' '
+ test-ctype
+'
+
+test_done
test_rev_parse 'in repo.git/sub/dir' false true true sub/dir/
cd ../../../.. || exit 1
+test_expect_success 'detecting gitdir when cwd is in a subdir of gitdir' '
+ (expected=$(pwd)/repo.git &&
+ cd repo.git/refs &&
+ unset GIT_DIR &&
+ test "$expected" = "$(git rev-parse --git-dir)")
+'
+
test_expect_success 'repo finds its work tree' '
(cd repo.git &&
: > work/sub/dir/untracked &&
test_expect_success 'pre-rebase hook stops rebase (2)' '
git checkout test &&
git reset --hard side &&
- EDITOR=true test_must_fail git rebase -i master &&
+ (
+ EDITOR=:
+ export EDITOR
+ test_must_fail git rebase -i master
+ ) &&
test "z$(git symbolic-ref HEAD)" = zrefs/heads/test &&
test 0 = $(git rev-list HEAD...side | wc -l)
'
--- /dev/null
+#!/bin/sh
+
+test_description='git rebase --root
+
+Tests if git rebase --root --onto <newparent> can rebase the root commit.
+'
+. ./test-lib.sh
+
+test_expect_success 'prepare repository' '
+ echo 1 > A &&
+ git add A &&
+ git commit -m 1 &&
+ echo 2 > A &&
+ git add A &&
+ git commit -m 2 &&
+ git symbolic-ref HEAD refs/heads/other &&
+ rm .git/index &&
+ echo 3 > B &&
+ git add B &&
+ git commit -m 3 &&
+ echo 1 > A &&
+ git add A &&
+ git commit -m 1b &&
+ echo 4 > B &&
+ git add B &&
+ git commit -m 4
+'
+
+test_expect_success 'rebase --root expects --onto' '
+ test_must_fail git rebase --root
+'
+
+test_expect_success 'setup pre-rebase hook' '
+ mkdir -p .git/hooks &&
+ cat >.git/hooks/pre-rebase <<EOF &&
+#!$SHELL_PATH
+echo "\$1,\$2" >.git/PRE-REBASE-INPUT
+EOF
+ chmod +x .git/hooks/pre-rebase
+'
+cat > expect <<EOF
+4
+3
+2
+1
+EOF
+
+test_expect_success 'rebase --root --onto <newbase>' '
+ git checkout -b work &&
+ git rebase --root --onto master &&
+ git log --pretty=tformat:"%s" > rebased &&
+ test_cmp expect rebased
+'
+
+test_expect_success 'pre-rebase got correct input (1)' '
+ test "z$(cat .git/PRE-REBASE-INPUT)" = z--root,
+'
+
+test_expect_success 'rebase --root --onto <newbase> <branch>' '
+ git branch work2 other &&
+ git rebase --root --onto master work2 &&
+ git log --pretty=tformat:"%s" > rebased2 &&
+ test_cmp expect rebased2
+'
+
+test_expect_success 'pre-rebase got correct input (2)' '
+ test "z$(cat .git/PRE-REBASE-INPUT)" = z--root,work2
+'
+
+test_expect_success 'rebase -i --root --onto <newbase>' '
+ git checkout -b work3 other &&
+ GIT_EDITOR=: git rebase -i --root --onto master &&
+ git log --pretty=tformat:"%s" > rebased3 &&
+ test_cmp expect rebased3
+'
+
+test_expect_success 'pre-rebase got correct input (3)' '
+ test "z$(cat .git/PRE-REBASE-INPUT)" = z--root,
+'
+
+test_expect_success 'rebase -i --root --onto <newbase> <branch>' '
+ git branch work4 other &&
+ GIT_EDITOR=: git rebase -i --root --onto master work4 &&
+ git log --pretty=tformat:"%s" > rebased4 &&
+ test_cmp expect rebased4
+'
+
+test_expect_success 'pre-rebase got correct input (4)' '
+ test "z$(cat .git/PRE-REBASE-INPUT)" = z--root,work4
+'
+
+test_expect_success 'rebase -i -p with linear history' '
+ git checkout -b work5 other &&
+ GIT_EDITOR=: git rebase -i -p --root --onto master &&
+ git log --pretty=tformat:"%s" > rebased5 &&
+ test_cmp expect rebased5
+'
+
+test_expect_success 'pre-rebase got correct input (5)' '
+ test "z$(cat .git/PRE-REBASE-INPUT)" = z--root,
+'
+
+test_expect_success 'set up merge history' '
+ git checkout other^ &&
+ git checkout -b side &&
+ echo 5 > C &&
+ git add C &&
+ git commit -m 5 &&
+ git checkout other &&
+ git merge side
+'
+
+sed 's/#/ /g' > expect-side <<'EOF'
+* Merge branch 'side' into other
+|\##
+| * 5
+* | 4
+|/##
+* 3
+* 2
+* 1
+EOF
+
+test_expect_success 'rebase -i -p with merge' '
+ git checkout -b work6 other &&
+ GIT_EDITOR=: git rebase -i -p --root --onto master &&
+ git log --graph --topo-order --pretty=tformat:"%s" > rebased6 &&
+ test_cmp expect-side rebased6
+'
+
+test_expect_success 'set up second root and merge' '
+ git symbolic-ref HEAD refs/heads/third &&
+ rm .git/index &&
+ rm A B C &&
+ echo 6 > D &&
+ git add D &&
+ git commit -m 6 &&
+ git checkout other &&
+ git merge third
+'
+
+sed 's/#/ /g' > expect-third <<'EOF'
+* Merge branch 'third' into other
+|\##
+| * 6
+* | Merge branch 'side' into other
+|\ \##
+| * | 5
+* | | 4
+|/ /##
+* | 3
+|/##
+* 2
+* 1
+EOF
+
+test_expect_success 'rebase -i -p with two roots' '
+ git checkout -b work7 other &&
+ GIT_EDITOR=: git rebase -i -p --root --onto master &&
+ git log --graph --topo-order --pretty=tformat:"%s" > rebased7 &&
+ test_cmp expect-third rebased7
+'
+
+test_expect_success 'setup pre-rebase hook that fails' '
+ mkdir -p .git/hooks &&
+ cat >.git/hooks/pre-rebase <<EOF &&
+#!$SHELL_PATH
+false
+EOF
+ chmod +x .git/hooks/pre-rebase
+'
+
+test_expect_success 'pre-rebase hook stops rebase' '
+ git checkout -b stops1 other &&
+ (
+ GIT_EDITOR=:
+ export GIT_EDITOR
+ test_must_fail git rebase --root --onto master
+ ) &&
+	test "z$(git symbolic-ref HEAD)" = zrefs/heads/stops1 &&
+ test 0 = $(git rev-list other...stops1 | wc -l)
+'
+
+test_expect_success 'pre-rebase hook stops rebase -i' '
+ git checkout -b stops2 other &&
+ (
+ GIT_EDITOR=:
+ export GIT_EDITOR
+ test_must_fail git rebase --root --onto master
+ ) &&
+	test "z$(git symbolic-ref HEAD)" = zrefs/heads/stops2 &&
+ test 0 = $(git rev-list other...stops2 | wc -l)
+'
+
+test_expect_success 'remove pre-rebase hook' '
+ rm -f .git/hooks/pre-rebase
+'
+
+test_expect_success 'set up a conflict' '
+ git checkout master &&
+ echo conflict > B &&
+ git add B &&
+ git commit -m conflict
+'
+
+test_expect_success 'rebase --root with conflict (first part)' '
+ git checkout -b conflict1 other &&
+ test_must_fail git rebase --root --onto master &&
+ git ls-files -u | grep "B$"
+'
+
+test_expect_success 'fix the conflict' '
+ echo 3 > B &&
+ git add B
+'
+
+cat > expect-conflict <<EOF
+6
+5
+4
+3
+conflict
+2
+1
+EOF
+
+test_expect_success 'rebase --root with conflict (second part)' '
+ git rebase --continue &&
+ git log --pretty=tformat:"%s" > conflict1 &&
+ test_cmp expect-conflict conflict1
+'
+
+test_expect_success 'rebase -i --root with conflict (first part)' '
+ git checkout -b conflict2 other &&
+ (
+ GIT_EDITOR=:
+ export GIT_EDITOR
+ test_must_fail git rebase -i --root --onto master
+ ) &&
+ git ls-files -u | grep "B$"
+'
+
+test_expect_success 'fix the conflict' '
+ echo 3 > B &&
+ git add B
+'
+
+test_expect_success 'rebase -i --root with conflict (second part)' '
+ git rebase --continue &&
+ git log --pretty=tformat:"%s" > conflict2 &&
+ test_cmp expect-conflict conflict2
+'
+
+cat >expect-conflict-p <<\EOF
+commit conflict3 conflict3~1 conflict3^2
+Merge branch 'third' into other
+commit conflict3^2 conflict3~4
+6
+commit conflict3~1 conflict3~2 conflict3~1^2
+Merge branch 'side' into other
+commit conflict3~1^2 conflict3~3
+5
+commit conflict3~2 conflict3~3
+4
+commit conflict3~3 conflict3~4
+3
+commit conflict3~4 conflict3~5
+conflict
+commit conflict3~5 conflict3~6
+2
+commit conflict3~6
+1
+EOF
+
+test_expect_success 'rebase -i -p --root with conflict (first part)' '
+ git checkout -b conflict3 other &&
+ (
+ GIT_EDITOR=:
+ export GIT_EDITOR
+ test_must_fail git rebase -i -p --root --onto master
+ ) &&
+ git ls-files -u | grep "B$"
+'
+
+test_expect_success 'fix the conflict' '
+ echo 3 > B &&
+ git add B
+'
+
+test_expect_success 'rebase -i -p --root with conflict (second part)' '
+ git rebase --continue &&
+ git rev-list --topo-order --parents --pretty="tformat:%s" HEAD |
+ git name-rev --stdin --name-only --refs=refs/heads/conflict3 >out &&
+ test_cmp expect-conflict-p out
+'
+
+test_done
diff --patch-with-raw -r initial..side
diff --name-status dir2 dir
diff --no-index --name-status dir2 dir
+diff --no-index --name-status -- dir2 dir
diff master master^ side
EOF
--- /dev/null
+$ git diff --no-index --name-status -- dir2 dir
+A dir/sub
+$
# Copyright (c) 2006 Junio C Hamano
#
-test_description='Format-patch skipping already incorporated patches'
+test_description='various format-patch tests'
. ./test-lib.sh
'
+test_expect_success 'format-patch from a subdirectory (1)' '
+ filename=$(
+ rm -rf sub &&
+ mkdir -p sub/dir &&
+ cd sub/dir &&
+ git format-patch -1
+ ) &&
+ case "$filename" in
+ 0*)
+ ;; # ok
+ *)
+ echo "Oops? $filename"
+ false
+ ;;
+ esac &&
+ test -f "$filename"
+'
+
+test_expect_success 'format-patch from a subdirectory (2)' '
+ filename=$(
+ rm -rf sub &&
+ mkdir -p sub/dir &&
+ cd sub/dir &&
+ git format-patch -1 -o ..
+ ) &&
+ case "$filename" in
+ ../0*)
+ ;; # ok
+ *)
+ echo "Oops? $filename"
+ false
+ ;;
+ esac &&
+ basename=$(expr "$filename" : ".*/\(.*\)") &&
+ test -f "sub/$basename"
+'
+
+test_expect_success 'format-patch from a subdirectory (3)' '
+ here="$TEST_DIRECTORY/$test" &&
+ rm -f 0* &&
+ filename=$(
+ rm -rf sub &&
+ mkdir -p sub/dir &&
+ cd sub/dir &&
+ git format-patch -1 -o "$here"
+ ) &&
+ basename=$(expr "$filename" : ".*/\(.*\)") &&
+ test -f "$basename"
+'
+
test_done
EOF
git diff -w > out
test_expect_success 'another test, with -w' 'test_cmp expect out'
+git diff -w -b > out
+test_expect_success 'another test, with -w -b' 'test_cmp expect out'
+git diff -w --ignore-space-at-eol > out
+test_expect_success 'another test, with -w --ignore-space-at-eol' 'test_cmp expect out'
+git diff -w -b --ignore-space-at-eol > out
+test_expect_success 'another test, with -w -b --ignore-space-at-eol' 'test_cmp expect out'
tr 'Q' '\015' << EOF > expect
diff --git a/x b/x
EOF
git diff -b > out
test_expect_success 'another test, with -b' 'test_cmp expect out'
+git diff -b --ignore-space-at-eol > out
+test_expect_success 'another test, with -b --ignore-space-at-eol' 'test_cmp expect out'
+
+tr 'Q' '\015' << EOF > expect
+diff --git a/x b/x
+index d99af23..8b32fb5 100644
+--- a/x
++++ b/x
+@@ -1,6 +1,6 @@
+-whitespace at beginning
+-whitespace change
+-whitespace in the middle
++ whitespace at beginning
++whitespace change
++white space in the middle
+ whitespace at end
+ unchanged line
+ CR at endQ
+EOF
+git diff --ignore-space-at-eol > out
+test_expect_success 'another test, with --ignore-space-at-eol' 'test_cmp expect out'
test_expect_success 'check mixed spaces and tabs in indent' '
#
# Copyright (c) Jim Meyering
#
-test_description='diff honors config option, diff.suppress-blank-empty'
+test_description='diff honors config option, diff.suppressBlankEmpty'
. ./test-lib.sh
git add f &&
git commit -q -m. f &&
printf "\ny\n" > f &&
- git config --bool diff.suppress-blank-empty true &&
+ git config --bool diff.suppressBlankEmpty true &&
git diff f > actual &&
test_cmp exp actual &&
perl -i.bak -p -e "s/^\$/ /" exp &&
- git config --bool diff.suppress-blank-empty false &&
+ git config --bool diff.suppressBlankEmpty false &&
git diff f > actual &&
test_cmp exp actual &&
- git config --bool --unset diff.suppress-blank-empty &&
+ git config --bool --unset diff.suppressBlankEmpty &&
git diff f > actual &&
test_cmp exp actual
'
--- /dev/null
+#!/bin/sh
+
+test_description='patience diff algorithm'
+
+. ./test-lib.sh
+
+cat >file1 <<\EOF
+#include <stdio.h>
+
+// Frobs foo heartily
+int frobnitz(int foo)
+{
+ int i;
+ for(i = 0; i < 10; i++)
+ {
+ printf("Your answer is: ");
+ printf("%d\n", foo);
+ }
+}
+
+int fact(int n)
+{
+ if(n > 1)
+ {
+ return fact(n-1) * n;
+ }
+ return 1;
+}
+
+int main(int argc, char **argv)
+{
+ frobnitz(fact(10));
+}
+EOF
+
+cat >file2 <<\EOF
+#include <stdio.h>
+
+int fib(int n)
+{
+ if(n > 2)
+ {
+ return fib(n-1) + fib(n-2);
+ }
+ return 1;
+}
+
+// Frobs foo heartily
+int frobnitz(int foo)
+{
+ int i;
+ for(i = 0; i < 10; i++)
+ {
+ printf("%d\n", foo);
+ }
+}
+
+int main(int argc, char **argv)
+{
+ frobnitz(fib(10));
+}
+EOF
+
+cat >expect <<\EOF
+diff --git a/file1 b/file2
+index 6faa5a3..e3af329 100644
+--- a/file1
++++ b/file2
+@@ -1,26 +1,25 @@
+ #include <stdio.h>
+
++int fib(int n)
++{
++ if(n > 2)
++ {
++ return fib(n-1) + fib(n-2);
++ }
++ return 1;
++}
++
+ // Frobs foo heartily
+ int frobnitz(int foo)
+ {
+ int i;
+ for(i = 0; i < 10; i++)
+ {
+- printf("Your answer is: ");
+ printf("%d\n", foo);
+ }
+ }
+
+-int fact(int n)
+-{
+- if(n > 1)
+- {
+- return fact(n-1) * n;
+- }
+- return 1;
+-}
+-
+ int main(int argc, char **argv)
+ {
+- frobnitz(fact(10));
++ frobnitz(fib(10));
+ }
+EOF
+
+test_expect_success 'patience diff' '
+
+ test_must_fail git diff --no-index --patience file1 file2 > output &&
+ test_cmp expect output
+
+'
+
+test_expect_success 'patience diff output is valid' '
+
+ mv file2 expect &&
+ git apply < output &&
+ test_cmp expect file2
+
+'
+
+cat >uniq1 <<\EOF
+1
+2
+3
+4
+5
+6
+EOF
+
+cat >uniq2 <<\EOF
+a
+b
+c
+d
+e
+f
+EOF
+
+cat >expect <<\EOF
+diff --git a/uniq1 b/uniq2
+index b414108..0fdf397 100644
+--- a/uniq1
++++ b/uniq2
+@@ -1,6 +1,6 @@
+-1
+-2
+-3
+-4
+-5
+-6
++a
++b
++c
++d
++e
++f
+EOF
+
+test_expect_success 'completely different files' '
+
+ test_must_fail git diff --no-index --patience uniq1 uniq2 > output &&
+ test_cmp expect output
+
+'
+
+test_done
--- /dev/null
+#!/bin/sh
+
+test_description='word diff colors'
+
+. ./test-lib.sh
+
+test_expect_success setup '
+
+	git config diff.color.old red &&
+ git config diff.color.new green
+
+'
+
+decrypt_color () {
+ sed \
+ -e 's/.\[1m/<WHITE>/g' \
+ -e 's/.\[31m/<RED>/g' \
+ -e 's/.\[32m/<GREEN>/g' \
+ -e 's/.\[36m/<BROWN>/g' \
+ -e 's/.\[m/<RESET>/g'
+}
+
+word_diff () {
+ test_must_fail git diff --no-index "$@" pre post > output &&
+ decrypt_color < output > output.decrypted &&
+ test_cmp expect output.decrypted
+}
+
+cat > pre <<\EOF
+h(4)
+
+a = b + c
+EOF
+
+cat > post <<\EOF
+h(4),hh[44]
+
+a = b + c
+
+aa = a
+
+aeff = aeff * ( aaa )
+EOF
+
+cat > expect <<\EOF
+<WHITE>diff --git a/pre b/post<RESET>
+<WHITE>index 330b04f..5ed8eff 100644<RESET>
+<WHITE>--- a/pre<RESET>
+<WHITE>+++ b/post<RESET>
+<BROWN>@@ -1,3 +1,7 @@<RESET>
+<RED>h(4)<RESET><GREEN>h(4),hh[44]<RESET>
+<RESET>
+a = b + c<RESET>
+
+<GREEN>aa = a<RESET>
+
+<GREEN>aeff = aeff * ( aaa )<RESET>
+EOF
+
+test_expect_success 'word diff with runs of whitespace' '
+
+ word_diff --color-words
+
+'
+
+cat > expect <<\EOF
+<WHITE>diff --git a/pre b/post<RESET>
+<WHITE>index 330b04f..5ed8eff 100644<RESET>
+<WHITE>--- a/pre<RESET>
+<WHITE>+++ b/post<RESET>
+<BROWN>@@ -1,3 +1,7 @@<RESET>
+h(4),<GREEN>hh<RESET>[44]
+<RESET>
+a = b + c<RESET>
+
+<GREEN>aa = a<RESET>
+
+<GREEN>aeff = aeff * ( aaa<RESET> )
+EOF
+cp expect expect.letter-runs-are-words
+
+test_expect_success 'word diff with a regular expression' '
+
+ word_diff --color-words="[a-z]+"
+
+'
+
+test_expect_success 'set a diff driver' '
+ git config diff.testdriver.wordRegex "[^[:space:]]" &&
+ cat <<EOF > .gitattributes
+pre diff=testdriver
+post diff=testdriver
+EOF
+'
+
+test_expect_success 'option overrides .gitattributes' '
+
+ word_diff --color-words="[a-z]+"
+
+'
+
+cat > expect <<\EOF
+<WHITE>diff --git a/pre b/post<RESET>
+<WHITE>index 330b04f..5ed8eff 100644<RESET>
+<WHITE>--- a/pre<RESET>
+<WHITE>+++ b/post<RESET>
+<BROWN>@@ -1,3 +1,7 @@<RESET>
+h(4)<GREEN>,hh[44]<RESET>
+<RESET>
+a = b + c<RESET>
+
+<GREEN>aa = a<RESET>
+
+<GREEN>aeff = aeff * ( aaa )<RESET>
+EOF
+cp expect expect.non-whitespace-is-word
+
+test_expect_success 'use regex supplied by driver' '
+
+ word_diff --color-words
+
+'
+
+test_expect_success 'set diff.wordRegex option' '
+ git config diff.wordRegex "[[:alnum:]]+"
+'
+
+cp expect.letter-runs-are-words expect
+
+test_expect_success 'command-line overrides config' '
+ word_diff --color-words="[a-z]+"
+'
+
+cp expect.non-whitespace-is-word expect
+
+test_expect_success '.gitattributes override config' '
+ word_diff --color-words
+'
+
+test_expect_success 'remove diff driver regex' '
+ git config --unset diff.testdriver.wordRegex
+'
+
+cat > expect <<\EOF
+<WHITE>diff --git a/pre b/post<RESET>
+<WHITE>index 330b04f..5ed8eff 100644<RESET>
+<WHITE>--- a/pre<RESET>
+<WHITE>+++ b/post<RESET>
+<BROWN>@@ -1,3 +1,7 @@<RESET>
+h(4),<GREEN>hh[44<RESET>]
+<RESET>
+a = b + c<RESET>
+
+<GREEN>aa = a<RESET>
+
+<GREEN>aeff = aeff * ( aaa<RESET> )
+EOF
+
+test_expect_success 'use configured regex' '
+ word_diff --color-words
+'
+
+echo 'aaa (aaa)' > pre
+echo 'aaa (aaa) aaa' > post
+
+cat > expect <<\EOF
+<WHITE>diff --git a/pre b/post<RESET>
+<WHITE>index c29453b..be22f37 100644<RESET>
+<WHITE>--- a/pre<RESET>
+<WHITE>+++ b/post<RESET>
+<BROWN>@@ -1 +1 @@<RESET>
+aaa (aaa) <GREEN>aaa<RESET>
+EOF
+
+test_expect_success 'test parsing words for newline' '
+
+ word_diff --color-words="a+"
+
+'
+
+echo '(:' > pre
+echo '(' > post
+
+cat > expect <<\EOF
+<WHITE>diff --git a/pre b/post<RESET>
+<WHITE>index 289cb9d..2d06f37 100644<RESET>
+<WHITE>--- a/pre<RESET>
+<WHITE>+++ b/post<RESET>
+<BROWN>@@ -1 +1 @@<RESET>
+(<RED>:<RESET>
+EOF
+
+test_expect_success 'test when words are only removed at the end' '
+
+ word_diff --color-words=.
+
+'
+
+test_done
--- /dev/null
+#!/bin/sh
+
+test_description='git apply --numstat - <patch'
+
+. ./test-lib.sh
+
+test_expect_success setup '
+ echo hello >text &&
+ git add text &&
+ echo goodbye >text &&
+ git diff >patch
+'
+
+test_expect_success 'git apply --numstat - < patch' '
+ echo "1 1 text" >expect &&
+ git apply --numstat - <patch >actual &&
+ test_cmp expect actual
+'
+
+test_expect_success 'git apply --numstat - < patch patch' '
+ for i in 1 2; do echo "1 1 text"; done >expect &&
+ git apply --numstat - < patch patch >actual &&
+ test_cmp expect actual
+'
+
+test_done
test_tick &&
git commit -m second &&
- mkdir a &&
- echo ni >a/two &&
- git add a/two &&
+ git mv one ichi &&
test_tick &&
git commit -m third &&
- echo san >a/three &&
- git add a/three &&
+ cp ichi ein &&
+ git add ein &&
test_tick &&
git commit -m fourth &&
- git rm a/three &&
+ mkdir a &&
+ echo ni >a/two &&
+ git add a/two &&
+ test_tick &&
+ git commit -m fifth &&
+
+ git rm a/two &&
test_tick &&
- git commit -m fifth
+ git commit -m sixth
'
test_expect_success 'diff-filter=A' '
actual=$(git log --pretty="format:%s" --diff-filter=A HEAD) &&
- expect=$(echo fourth ; echo third ; echo initial) &&
+ expect=$(echo fifth ; echo fourth ; echo third ; echo initial) &&
test "$actual" = "$expect" || {
echo Oops
echo "Actual: $actual"
test_expect_success 'diff-filter=D' '
actual=$(git log --pretty="format:%s" --diff-filter=D HEAD) &&
- expect=$(echo fifth) &&
+ expect=$(echo sixth ; echo third) &&
+ test "$actual" = "$expect" || {
+ echo Oops
+ echo "Actual: $actual"
+ false
+ }
+
+'
+
+test_expect_success 'diff-filter=R' '
+
+ actual=$(git log -M --pretty="format:%s" --diff-filter=R HEAD) &&
+ expect=$(echo third) &&
+ test "$actual" = "$expect" || {
+ echo Oops
+ echo "Actual: $actual"
+ false
+ }
+
+'
+
+test_expect_success 'diff-filter=C' '
+
+ actual=$(git log -C -C --pretty="format:%s" --diff-filter=C HEAD) &&
+ expect=$(echo fourth) &&
+ test "$actual" = "$expect" || {
+ echo Oops
+ echo "Actual: $actual"
+ false
+ }
+
+'
+
+test_expect_success 'git log --follow' '
+
+ actual=$(git log --follow --pretty="format:%s" ichi) &&
+ expect=$(echo third ; echo second ; echo initial) &&
test "$actual" = "$expect" || {
echo Oops
echo "Actual: $actual"
test_expect_success 'setup case sensitivity tests' '
echo case >one &&
test_tick &&
+	git add one &&
git commit -a -m Second
'
#!/bin/sh
-test_description='git am not losing options'
+test_description='git am with options and not losing them'
. ./test-lib.sh
tm="$TEST_DIRECTORY/t4252"
grep "^Three$" file-2
'
+test_expect_success 'interrupted am --directory="frotz nitfol"' '
+ rm -rf .git/rebase-apply &&
+ git reset --hard initial &&
+ test_must_fail git am --directory="frotz nitfol" "$tm"/am-test-5-? &&
+ git am --skip &&
+ grep One "frotz nitfol/file-5"
+'
+
+test_expect_success 'apply to a funny path' '
+ with_sq="with'\''sq"
+ rm -fr .git/rebase-apply &&
+ git reset --hard initial &&
+ git am --directory="$with_sq" "$tm"/am-test-5-2 &&
+ test -f "$with_sq/file-5"
+'
+
+test_expect_success 'am --reject' '
+ rm -rf .git/rebase-apply &&
+ git reset --hard initial &&
+ test_must_fail git am --reject "$tm"/am-test-6-1 &&
+ grep "@@ -1,3 +1,3 @@" file-2.rej &&
+ test_must_fail git diff-files --exit-code --quiet file-2 &&
+ grep "[-]-reject" .git/rebase-apply/apply-opt
+'
+
test_done
--- /dev/null
+From: A U Thor <au.thor@example.com>
+Date: Thu Dec 4 16:00:00 2008 -0800
+Subject: Six
+
+Applying this patch with --directory='frotz nitfol' should fail
+
+diff --git i/junk/file-2 w/junk/file-2
+index 06e567b..b6f3a16 100644
+--- i/junk/file-2
++++ w/junk/file-2
+@@ -1,7 +1,7 @@
+ One
+ 2
+-3
++Three
+ 4
+ 5
+-6
++Six
+ 7
--- /dev/null
+From: A U Thor <au.thor@example.com>
+Date: Thu Dec 4 16:00:00 2008 -0800
+Subject: Six
+
+Applying this patch with --directory='frotz nitfol' should succeed
+
+diff --git i/file-5 w/file-5
+new file mode 100644
+index 000000..1d6ed9f
+--- /dev/null
++++ w/file-5
+@@ -0,0 +1,3 @@
++One
++two
++three
--- /dev/null
+From: A U Thor <au.thor@example.com>
+Date: Thu Dec 4 16:00:00 2008 -0800
+Subject: Huh
+
+Should fail and leave rejects
+
+diff --git i/file-2 w/file-2
+index 06e567b..b6f3a16 100644
+--- i/file-2
++++ w/file-2
+@@ -1,3 +1,3 @@
+-0
++One
+ 2
+ 3
+@@ -4,4 +4,4 @@
+ 4
+ 5
+-6
++Six
+ 7
--- /dev/null
+#!/bin/sh
+
+test_description='push to a repository that borrows from elsewhere'
+
+. ./test-lib.sh
+
+test_expect_success setup '
+ mkdir alice-pub &&
+ (
+ cd alice-pub &&
+ GIT_DIR=. git init
+ ) &&
+ mkdir alice-work &&
+ (
+ cd alice-work &&
+ git init &&
+ >file &&
+ git add . &&
+ git commit -m initial &&
+ git push ../alice-pub master
+ ) &&
+
+ # Project Bob is a fork of project Alice
+ mkdir bob-pub &&
+ (
+ cd bob-pub &&
+ GIT_DIR=. git init &&
+ mkdir -p objects/info &&
+ echo ../../alice-pub/objects >objects/info/alternates
+ ) &&
+ git clone alice-pub bob-work &&
+ (
+ cd bob-work &&
+ git push ../bob-pub master
+ )
+'
+
+test_expect_success 'alice works and pushes' '
+ (
+ cd alice-work &&
+ echo more >file &&
+ git commit -a -m second &&
+ git push ../alice-pub
+ )
+'
+
+test_expect_success 'bob fetches from alice, works and pushes' '
+ (
+ # Bob acquires what Alice did in his work tree first.
+ # Even though these objects are not directly in
+ # the public repository of Bob, this push does not
+ # need to send the commit Bob received from Alice
+	# to his public repository, as all the objects Alice
+ # has at her public repository are available to it
+ # via its alternates.
+ cd bob-work &&
+ git pull ../alice-pub master &&
+ echo more bob >file &&
+ git commit -a -m third &&
+ git push ../bob-pub
+ ) &&
+
+ # Check that the second commit by Alice is not sent
+ # to ../bob-pub
+ (
+ cd bob-pub &&
+ second=$(git rev-parse HEAD^) &&
+ rm -f objects/info/alternates &&
+ test_must_fail git cat-file -t $second &&
+ echo ../../alice-pub/objects >objects/info/alternates
+ )
+'
+
+test_expect_success 'clean-up in case the previous failed' '
+ (
+ cd bob-pub &&
+ echo ../../alice-pub/objects >objects/info/alternates
+ )
+'
+
+test_expect_success 'alice works and pushes again' '
+ (
+ # Alice does not care what Bob does. She does not
+ # even have to be aware of his existence. She just
+ # keeps working and pushing
+ cd alice-work &&
+ echo more alice >file &&
+ git commit -a -m fourth &&
+ git push ../alice-pub
+ )
+'
+
+test_expect_success 'bob works and pushes' '
+ (
+ # This time Bob does not pull from Alice, and
+ # the master branch at her public repository points
+ # at a commit Bob does not know about. This should
+ # not prevent the push by Bob from succeeding.
+ cd bob-work &&
+ echo yet more bob >file &&
+ git commit -a -m fifth &&
+ git push ../bob-pub
+ )
+'
+
+test_done
git clone $HTTPD_URL/test_repo.git test_repo_clone
'
-test_expect_failure 'push to remote repository' '
+test_expect_failure 'push to remote repository with packed refs' '
cd "$ROOT_PATH"/test_repo_clone &&
: >path2 &&
git add path2 &&
test_tick &&
git commit -m path2 &&
+ HEAD=$(git rev-parse --verify HEAD) &&
git push &&
- [ -f "$HTTPD_DOCUMENT_ROOT_PATH/test_repo.git/refs/heads/master" ]
+ (cd "$HTTPD_DOCUMENT_ROOT_PATH"/test_repo.git &&
+ test $HEAD = $(git rev-parse --verify HEAD))
'
-test_expect_failure 'create and delete remote branch' '
+test_expect_success 'push to remote repository with unpacked refs' '
+ (cd "$HTTPD_DOCUMENT_ROOT_PATH"/test_repo.git &&
+ rm packed-refs &&
+ git update-ref refs/heads/master \
+ 0c973ae9bd51902a28466f3850b543fa66a6aaf4) &&
+ git push &&
+ (cd "$HTTPD_DOCUMENT_ROOT_PATH"/test_repo.git &&
+ test $HEAD = $(git rev-parse --verify HEAD))
+'
+
+test_expect_success 'create and delete remote branch' '
cd "$ROOT_PATH"/test_repo_clone &&
git checkout -b dev &&
: >path3 &&
test_must_fail git show-ref --verify refs/remotes/origin/dev
'
+test_expect_success 'MKCOL sends directory names with trailing slashes' '
+
+ ! grep "\"MKCOL.*[^/] HTTP/[^ ]*\"" < "$HTTPD_ROOT_PATH"/access.log
+
+'
+
stop_httpd
test_done
'
+test_expect_success 'clone to an existing empty directory' '
+ mkdir target-3 &&
+ git clone src target-3 &&
+ T=$( cd target-3 && git rev-parse HEAD ) &&
+ S=$( cd src && git rev-parse HEAD ) &&
+ test "$T" = "$S"
+'
+
+test_expect_success 'clone to an existing non-empty directory' '
+ mkdir target-4 &&
+ >target-4/Fakefile &&
+ test_must_fail git clone src target-4
+'
+
+test_expect_success 'clone to an existing path' '
+ >target-5 &&
+ test_must_fail git clone src target-5
+'
+
test_done
git clone --bare . x &&
test "$(GIT_CONFIG=a.git/config git config --bool core.bare)" = true &&
test "$(GIT_CONFIG=x/config git config --bool core.bare)" = true
- git bundle create b1.bundle --all HEAD &&
- git bundle create b2.bundle --all &&
+ git bundle create b1.bundle --all &&
+ git bundle create b2.bundle master &&
mkdir dir &&
cp b1.bundle dir/b3
cp b1.bundle b4
test ! -e .git/refs/heads/master
'
+test_expect_success 'clone empty repository' '
+ cd "$D" &&
+ mkdir empty &&
+ (cd empty && git init) &&
+ git clone empty empty-clone &&
+ test_tick &&
+	(cd empty-clone &&
+ echo "content" >> foo &&
+ git add foo &&
+ git commit -m "Initial commit" &&
+ git push origin master &&
+ expected=$(git rev-parse master) &&
+ actual=$(git --git-dir=../empty/.git rev-parse master) &&
+ test $actual = $expected)
+'
+
test_done
test_format() {
cat >expect.$1
test_expect_success "format $1" "
-git rev-list --pretty=format:$2 master >output.$1 &&
+git rev-list --pretty=format:'$2' master >output.$1 &&
test_cmp expect.$1 output.$1
"
}
\e[31mfoo\e[32mbar\e[34mbaz\e[mxyzzy
EOF
+test_format advanced-colors '%C(red yellow bold)foo%C(reset)' <<'EOF'
+commit 131a310eb913d107dd3c09a65d1651175898735d
+\e[1;31;43mfoo\e[m
+commit 86c75cfd708a0e5868dc876ed5b8bb66c80b4873
+\e[1;31;43mfoo\e[m
+EOF
+
cat >commit-msg <<'EOF'
Test printing of complex bodies
--- /dev/null
+#!/bin/sh
+
+test_description='--all includes detached HEADs'
+
+. ./test-lib.sh
+
+
+commit () {
+ test_tick &&
+ echo $1 > foo &&
+ git add foo &&
+ git commit -m "$1"
+}
+
+test_expect_success 'setup' '
+
+ commit one &&
+ commit two &&
+ git checkout HEAD^ &&
+ commit detached
+
+'
+
+test_expect_success 'rev-list --all lists detached HEAD' '
+
+ test 3 = $(git rev-list --all | wc -l)
+
+'
+
+test_expect_success 'repack does not lose detached HEAD' '
+
+ git gc &&
+ git prune --expire=now &&
+ git show HEAD
+
+'
+
+test_done
done
'
+test_expect_failure 'packed obs in alt ODB are repacked when local repo has packs' '
+ rm -f .git/objects/pack/* &&
+ echo new_content >> file1 &&
+ git add file1 &&
+ git commit -m more_content &&
+ git repack &&
+ git repack -a -d &&
+ myidx=$(ls -1 .git/objects/pack/*.idx) &&
+ test -f "$myidx" &&
+ for p in alt_objects/pack/*.idx; do
+ git verify-pack -v $p | sed -n -e "/^[0-9a-f]\{40\}/p"
+ done | while read sha1 rest; do
+ if ! ( git verify-pack -v $myidx | grep "^$sha1" ); then
+ echo "Missing object in local pack: $sha1"
+ return 1
+ fi
+ done
+'
+
test_done
compare_mtimes ()
{
- perl -e 'my $reference = shift;
- foreach my $file (@ARGV) {
- exit(1) unless(-f $file && -M $file == -M $reference);
- }
- exit(0);
- ' -- "$@"
+ read tref rest &&
+ while read t rest; do
+		test "$tref" = "$t" || return 1
+ done
}
test_expect_success '-A without -d option leaves unreachable objects packed' '
tmppack=".git/objects/pack/tmp_pack" &&
ln "$packfile" "$tmppack" &&
git repack -A -l -d &&
- compare_mtimes "$tmppack" "$fsha1path" "$csha1path" "$tsha1path"
+ test-chmtime -v +0 "$tmppack" "$fsha1path" "$csha1path" "$tsha1path" \
+ > mtimes &&
+ compare_mtimes < mtimes
'
test_done
}
compare_svn_head_with () {
- LC_ALL=en_US.UTF-8 svn log --limit 1 `git svn info --url` | \
- sed -e 1,3d -e "/^-\{1,\}\$/d" >current &&
+ # extract just the log message and strip out committer info.
+	# don't use --limit here since svn 1.1.x doesn't have it.
+ LC_ALL=en_US.UTF-8 svn log `git svn info --url` | perl -w -e '
+ use bytes;
+ $/ = ("-"x72) . "\n";
+ my @x = <STDIN>;
+ @x = split(/\n/, $x[1]);
+ splice(@x, 0, 2);
+ $x[-1] = "";
+ print join("\n", @x);
+ ' > current &&
test_cmp current "$1"
}
--- /dev/null
+#!/bin/sh
+
+test_description='test that git handles an svn repository with empty symlinks'
+
+. ./lib-git-svn.sh
+test_expect_success 'load svn dumpfile' '
+ svnadmin load "$rawsvnrepo" <<EOF
+SVN-fs-dump-format-version: 2
+
+UUID: 60780f9a-7df5-43b4-83ab-60e2c0673ef7
+
+Revision-number: 0
+Prop-content-length: 56
+Content-length: 56
+
+K 8
+svn:date
+V 27
+2008-11-26T07:17:27.590577Z
+PROPS-END
+
+Revision-number: 1
+Prop-content-length: 111
+Content-length: 111
+
+K 7
+svn:log
+V 4
+test
+K 10
+svn:author
+V 12
+normalperson
+K 8
+svn:date
+V 27
+2008-11-26T07:18:03.511836Z
+PROPS-END
+
+Node-path: bar
+Node-kind: file
+Node-action: add
+Prop-content-length: 33
+Text-content-length: 0
+Text-content-md5: d41d8cd98f00b204e9800998ecf8427e
+Content-length: 33
+
+K 11
+svn:special
+V 1
+*
+PROPS-END
+
+Revision-number: 2
+Prop-content-length: 121
+Content-length: 121
+
+K 7
+svn:log
+V 13
+bar => doink
+
+K 10
+svn:author
+V 12
+normalperson
+K 8
+svn:date
+V 27
+2008-11-27T03:55:31.601672Z
+PROPS-END
+
+Node-path: bar
+Node-kind: file
+Node-action: change
+Text-content-length: 10
+Text-content-md5: 92ca4fe7a9721f877f765c252dcd66c9
+Content-length: 10
+
+link doink
+
+EOF
+'
+
+test_expect_success 'clone using git svn' 'git svn clone -r1 "$svnrepo" x'
+test_expect_success '"bar" is an empty file' 'test -f x/bar && ! test -s x/bar'
+test_expect_success 'get "bar" => symlink fix from svn' \
+ '(cd x && git svn rebase)'
+test_expect_success '"bar" becomes a symlink' 'test -L x/bar'
+test_done
--- /dev/null
+#!/bin/sh
+
+test_description='test that git handles an svn repository with empty symlinks'
+
+. ./lib-git-svn.sh
+test_expect_success 'load svn dumpfile' '
+ svnadmin load "$rawsvnrepo" <<EOF
+SVN-fs-dump-format-version: 2
+
+UUID: 60780f9a-7df5-43b4-83ab-60e2c0673ef7
+
+Revision-number: 0
+Prop-content-length: 56
+Content-length: 56
+
+K 8
+svn:date
+V 27
+2008-11-26T07:17:27.590577Z
+PROPS-END
+
+Revision-number: 1
+Prop-content-length: 111
+Content-length: 111
+
+K 7
+svn:log
+V 4
+test
+K 10
+svn:author
+V 12
+normalperson
+K 8
+svn:date
+V 27
+2008-11-26T07:18:03.511836Z
+PROPS-END
+
+Node-path: bar
+Node-kind: file
+Node-action: add
+Prop-content-length: 33
+Text-content-length: 4
+Text-content-md5: 912ec803b2ce49e4a541068d495ab570
+Content-length: 37
+
+K 11
+svn:special
+V 1
+*
+PROPS-END
+asdf
+
+Revision-number: 2
+Prop-content-length: 121
+Content-length: 121
+
+K 7
+svn:log
+V 13
+bar => doink
+
+K 10
+svn:author
+V 12
+normalperson
+K 8
+svn:date
+V 27
+2008-11-27T03:55:31.601672Z
+PROPS-END
+
+Node-path: bar
+Node-kind: file
+Node-action: change
+Text-content-length: 10
+Text-content-md5: 92ca4fe7a9721f877f765c252dcd66c9
+Content-length: 10
+
+link doink
+
+EOF
+'
+
+test_expect_success 'clone using git svn' 'git svn clone -r1 "$svnrepo" x'
+
+test_expect_success '"bar" is a symlink that points to "asdf"' '
+ test -L x/bar &&
+ (cd x && test xasdf = x"`git cat-file blob HEAD:bar`")
+'
+
+test_expect_success 'get "bar" => symlink fix from svn' '
+ (cd x && git svn rebase)
+'
+
+test_expect_success '"bar" remains a proper symlink' '
+ test -L x/bar &&
+ (cd x && test xdoink = x"`git cat-file blob HEAD:bar`")
+'
+
+test_done
--- /dev/null
+#!/bin/sh
+#
+# Copyright (c) 2009 Eric Wong
+#
+
+test_description='git svn property tests'
+. ./lib-git-svn.sh
+
+test_expect_success 'setup repo with a git repo inside it' '
+ svn co "$svnrepo" s &&
+ (
+ cd s &&
+ git init &&
+ test -f .git/HEAD &&
+ > .git/a &&
+ echo a > a &&
+ svn add .git a &&
+ svn commit -m "create a nested git repo" &&
+ svn up &&
+ echo hi >> .git/a &&
+ svn commit -m "modify .git/a" &&
+ svn up
+ )
+'
+
+test_expect_success 'clone an SVN repo containing a git repo' '
+ git svn clone "$svnrepo" g &&
+ echo a > expect &&
+ test_cmp expect g/a
+'
+
+test_expect_success 'SVN-side change outside of .git' '
+ (
+ cd s &&
+ echo b >> a &&
+ svn commit -m "SVN-side change outside of .git" &&
+ svn up &&
+ svn log -v | fgrep "SVN-side change outside of .git"
+ )
+'
+
+test_expect_success 'update git svn-cloned repo' '
+ (
+ cd g &&
+ git svn rebase &&
+ echo a > expect &&
+ echo b >> expect &&
+ test_cmp a expect &&
+ rm expect
+ )
+'
+
+test_expect_success 'SVN-side change inside of .git' '
+ (
+ cd s &&
+ git add a &&
+ git commit -m "add a inside an SVN repo" &&
+ git log &&
+ svn add --force .git &&
+ svn commit -m "SVN-side change inside of .git" &&
+ svn up &&
+ svn log -v | fgrep "SVN-side change inside of .git"
+ )
+'
+
+test_expect_success 'update git svn-cloned repo' '
+ (
+ cd g &&
+ git svn rebase &&
+ echo a > expect &&
+ echo b >> expect &&
+ test_cmp a expect &&
+ rm expect
+ )
+'
+
+test_expect_success 'SVN-side change in and out of .git' '
+ (
+ cd s &&
+ echo c >> a &&
+ git add a &&
+ git commit -m "add a inside an SVN repo" &&
+ svn commit -m "SVN-side change in and out of .git" &&
+ svn up &&
+ svn log -v | fgrep "SVN-side change in and out of .git"
+ )
+'
+
+test_expect_success 'update git svn-cloned repo again' '
+ (
+ cd g &&
+ git svn rebase &&
+ echo a > expect &&
+ echo b >> expect &&
+ echo c >> expect &&
+ test_cmp a expect &&
+ rm expect
+ )
+'
+
+test_done
--- /dev/null
+#!/bin/sh
+#
+# Copyright (c) 2009 Vitaly Shukela
+# Copyright (c) 2009 Eric Wong
+#
+
+test_description='git svn property tests'
+. ./lib-git-svn.sh
+
+test_expect_success 'setup test repository' '
+ svn co "$svnrepo" s &&
+ (
+ cd s &&
+ mkdir qqq www &&
+ echo test_qqq > qqq/test_qqq.txt &&
+ echo test_www > www/test_www.txt &&
+ svn add qqq &&
+ svn add www &&
+ svn commit -m "create some files" &&
+ svn up &&
+ echo hi >> www/test_www.txt &&
+ svn commit -m "modify www/test_www.txt" &&
+ svn up
+ )
+'
+
+test_expect_success 'clone an SVN repository with ignored www directory' '
+ git svn clone --ignore-paths="^www" "$svnrepo" g &&
+ echo test_qqq > expect &&
+ for i in g/*/*.txt; do cat $i >> expect2; done &&
+ test_cmp expect expect2
+'
+
+test_expect_success 'SVN-side change outside of www' '
+ (
+ cd s &&
+ echo b >> qqq/test_qqq.txt &&
+ svn commit -m "SVN-side change outside of www" &&
+ svn up &&
+ svn log -v | fgrep "SVN-side change outside of www"
+ )
+'
+
+test_expect_success 'update git svn-cloned repo' '
+ (
+ cd g &&
+ git svn rebase --ignore-paths="^www" &&
+ printf "test_qqq\nb\n" > expect &&
+ for i in */*.txt; do cat $i >> expect2; done &&
+ test_cmp expect2 expect &&
+ rm expect expect2
+ )
+'
+
+test_expect_success 'SVN-side change inside of ignored www' '
+ (
+ cd s &&
+		echo zaq >> www/test_www.txt &&
+ svn commit -m "SVN-side change inside of www/test_www.txt" &&
+ svn up &&
+ svn log -v | fgrep "SVN-side change inside of www/test_www.txt"
+ )
+'
+
+test_expect_success 'update git svn-cloned repo' '
+ (
+ cd g &&
+ git svn rebase --ignore-paths="^www" &&
+ printf "test_qqq\nb\n" > expect &&
+ for i in */*.txt; do cat $i >> expect2; done &&
+ test_cmp expect2 expect &&
+ rm expect expect2
+ )
+'
+
+test_expect_success 'SVN-side change in and out of ignored www' '
+ (
+ cd s &&
+		echo cvf >> www/test_www.txt &&
+		echo ygg >> qqq/test_qqq.txt &&
+ svn commit -m "SVN-side change in and out of ignored www" &&
+ svn up &&
+ svn log -v | fgrep "SVN-side change in and out of ignored www"
+ )
+'
+
+test_expect_success 'update git svn-cloned repo again' '
+ (
+ cd g &&
+ git svn rebase --ignore-paths="^www" &&
+ printf "test_qqq\nb\nygg\n" > expect &&
+ for i in */*.txt; do cat $i >> expect2; done &&
+ test_cmp expect2 expect &&
+ rm expect expect2
+ )
+'
+
+test_done
error () {
say_color error "error: $*"
- trap - exit
+ trap - EXIT
exit 1
}
exit 1
}
-trap 'die' exit
+trap 'die' EXIT
# The semantics of the editor variables are that of invoking
# sh -c "$EDITOR \"$@\"" files ...
say_color error "FAIL $test_count: $1"
shift
echo "$@" | sed -e 's/^/ /'
- test "$immediate" = "" || { trap - exit; exit 1; }
+ test "$immediate" = "" || { trap - EXIT; exit 1; }
}
test_known_broken_ok_ () {
}
test_done () {
- trap - exit
+ trap - EXIT
test_results_dir="$TEST_DIRECTORY/test-results"
mkdir -p "$test_results_dir"
test_results_path="$test_results_dir/${0%-*}-$$"
test="trash directory.$(basename "$0" .sh)"
test ! -z "$debug" || remove_trash="$TEST_DIRECTORY/$test"
rm -fr "$test" || {
- trap - exit
+ trap - EXIT
echo >&5 "FATAL: Cannot prepare test area"
exit 1
}
--- /dev/null
+#include "cache.h"
+
+
+static int test_isdigit(int c)
+{
+ return isdigit(c);
+}
+
+static int test_isspace(int c)
+{
+ return isspace(c);
+}
+
+static int test_isalpha(int c)
+{
+ return isalpha(c);
+}
+
+static int test_isalnum(int c)
+{
+ return isalnum(c);
+}
+
+static int test_is_glob_special(int c)
+{
+ return is_glob_special(c);
+}
+
+static int test_is_regex_special(int c)
+{
+ return is_regex_special(c);
+}
+
+#define DIGIT "0123456789"
+#define LOWER "abcdefghijklmnopqrstuvwxyz"
+#define UPPER "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
+
+static const struct ctype_class {
+ const char *name;
+ int (*test_fn)(int);
+ const char *members;
+} classes[] = {
+ { "isdigit", test_isdigit, DIGIT },
+ { "isspace", test_isspace, " \n\r\t" },
+ { "isalpha", test_isalpha, LOWER UPPER },
+ { "isalnum", test_isalnum, LOWER UPPER DIGIT },
+ { "is_glob_special", test_is_glob_special, "*?[\\" },
+ { "is_regex_special", test_is_regex_special, "$()*+.?[\\^{|" },
+ { NULL }
+};
+
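+/*
+ * Check one class: every byte value 1..255 must be classified as a
+ * member exactly when it appears in the expected "members" string;
+ * NUL is expected to be outside every class.
+ */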
+static int test_class(const struct ctype_class *test)
+{
+ int i, rc = 0;
+
+ for (i = 0; i < 256; i++) {
+ int expected = i ? !!strchr(test->members, i) : 0;
+ int actual = test->test_fn(i);
+
+ if (actual != expected) {
+ rc = 1;
+ printf("%s classifies char %d (0x%02x) wrongly\n",
+ test->name, i, i);
+ }
+ }
+ return rc;
+}
+
+int main(int argc, char **argv)
+{
+ const struct ctype_class *test;
+ int rc = 0;
+
+ for (test = classes; test->name; test++)
+ rc |= test_class(test);
+
+ return rc;
+}
int main(int argc, char **argv)
{
if (argc == 3 && !strcmp(argv[1], "normalize_absolute_path")) {
- char *buf = xmalloc(strlen(argv[2])+1);
+ char *buf = xmalloc(PATH_MAX + 1);
int rv = normalize_absolute_path(buf, argv[2]);
assert(strlen(buf) == rv);
puts(buf);
memset (&list, 0, sizeof(list));
while ((de = readdir(dir))) {
- if (de->d_name[0] == '.' && (de->d_name[1] == '\0' ||
- (de->d_name[1] == '.' &&
- de->d_name[2] == '\0')))
+ if (is_dot_or_dotdot(de->d_name))
continue;
ALLOC_GROW(list.entries, list.nr + 1, list.alloc);
list.entries[list.nr++] = xstrdup(de->d_name);
char *cp, *prev;
char *name = ce->name;
- if (has_symlink_leading_path(ce_namelen(ce), ce->name))
+ if (has_symlink_or_noent_leading_path(ce_namelen(ce), ce->name))
return;
if (unlink(name))
return;
if (o->index_only || o->reset || !o->update)
return 0;
- if (has_symlink_leading_path(ce_namelen(ce), ce->name))
+ if (has_symlink_or_noent_leading_path(ce_namelen(ce), ce->name))
return 0;
if (!lstat(ce->name, &st)) {
static int ndrivers;
static int drivers_alloc;
-#define FUNCNAME(name, pattern) \
- { name, NULL, -1, { pattern, REG_EXTENDED } }
+#define PATTERNS(name, pattern, word_regex) \
+ { name, NULL, -1, { pattern, REG_EXTENDED }, word_regex }
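+/* fields: name, external, binary, funcname { pattern, cflags }, word_regex */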
static struct userdiff_driver builtin_drivers[] = {
-FUNCNAME("html", "^[ \t]*(<[Hh][1-6][ \t].*>.*)$"),
-FUNCNAME("java",
+PATTERNS("html", "^[ \t]*(<[Hh][1-6][ \t].*>.*)$",
+ "[^<>= \t]+|[^[:space:]]|[\x80-\xff]+"),
+PATTERNS("java",
"!^[ \t]*(catch|do|for|if|instanceof|new|return|switch|throw|while)\n"
- "^[ \t]*(([ \t]*[A-Za-z_][A-Za-z_0-9]*){2,}[ \t]*\\([^;]*)$"),
-FUNCNAME("objc",
+ "^[ \t]*(([ \t]*[A-Za-z_][A-Za-z_0-9]*){2,}[ \t]*\\([^;]*)$",
+ "[a-zA-Z_][a-zA-Z0-9_]*"
+ "|[-+0-9.e]+[fFlL]?|0[xXbB]?[0-9a-fA-F]+[lL]?"
+ "|[-+*/<>%&^|=!]="
+ "|--|\\+\\+|<<=?|>>>?=?|&&|\\|\\|"
+ "|[^[:space:]]|[\x80-\xff]+"),
+PATTERNS("objc",
/* Negate C statements that can look like functions */
"!^[ \t]*(do|for|if|else|return|switch|while)\n"
/* Objective-C methods */
/* C functions */
"^[ \t]*(([ \t]*[A-Za-z_][A-Za-z_0-9]*){2,}[ \t]*\\([^;]*)$\n"
/* Objective-C class/protocol definitions */
- "^(@(implementation|interface|protocol)[ \t].*)$"),
-FUNCNAME("pascal",
+ "^(@(implementation|interface|protocol)[ \t].*)$",
+ /* -- */
+ "[a-zA-Z_][a-zA-Z0-9_]*"
+ "|[-+0-9.e]+[fFlL]?|0[xXbB]?[0-9a-fA-F]+[lL]?"
+ "|[-+*/<>%&^|=!]=|--|\\+\\+|<<=?|>>=?|&&|\\|\\||::|->"
+ "|[^[:space:]]|[\x80-\xff]+"),
+PATTERNS("pascal",
"^((procedure|function|constructor|destructor|interface|"
"implementation|initialization|finalization)[ \t]*.*)$"
"\n"
- "^(.*=[ \t]*(class|record).*)$"),
-FUNCNAME("php", "^[\t ]*((function|class).*)"),
-FUNCNAME("python", "^[ \t]*((class|def)[ \t].*)$"),
-FUNCNAME("ruby", "^[ \t]*((class|module|def)[ \t].*)$"),
-FUNCNAME("bibtex", "(@[a-zA-Z]{1,}[ \t]*\\{{0,1}[ \t]*[^ \t\"@',\\#}{~%]*).*$"),
-FUNCNAME("tex", "^(\\\\((sub)*section|chapter|part)\\*{0,1}\\{.*)$"),
+ "^(.*=[ \t]*(class|record).*)$",
+ /* -- */
+ "[a-zA-Z_][a-zA-Z0-9_]*"
+ "|[-+0-9.e]+|0[xXbB]?[0-9a-fA-F]+"
+ "|<>|<=|>=|:=|\\.\\."
+ "|[^[:space:]]|[\x80-\xff]+"),
+PATTERNS("php", "^[\t ]*((function|class).*)",
+ /* -- */
+ "[a-zA-Z_][a-zA-Z0-9_]*"
+ "|[-+0-9.e]+|0[xXbB]?[0-9a-fA-F]+"
+ "|[-+*/<>%&^|=!.]=|--|\\+\\+|<<=?|>>=?|===|&&|\\|\\||::|->"
+ "|[^[:space:]]|[\x80-\xff]+"),
+PATTERNS("python", "^[ \t]*((class|def)[ \t].*)$",
+ /* -- */
+ "[a-zA-Z_][a-zA-Z0-9_]*"
+ "|[-+0-9.e]+[jJlL]?|0[xX]?[0-9a-fA-F]+[lL]?"
+ "|[-+*/<>%&^|=!]=|//=?|<<=?|>>=?|\\*\\*=?"
+	"|[^[:space:]]|[\x80-\xff]+"),
+PATTERNS("ruby", "^[ \t]*((class|module|def)[ \t].*)$",
+ /* -- */
+ "(@|@@|\\$)?[a-zA-Z_][a-zA-Z0-9_]*"
+ "|[-+0-9.e]+|0[xXbB]?[0-9a-fA-F]+|\\?(\\\\C-)?(\\\\M-)?."
+ "|//=?|[-+*/<>%&^|=!]=|<<=?|>>=?|===|\\.{1,3}|::|[!=]~"
+	"|[^[:space:]]|[\x80-\xff]+"),
+PATTERNS("bibtex", "(@[a-zA-Z]{1,}[ \t]*\\{{0,1}[ \t]*[^ \t\"@',\\#}{~%]*).*$",
+ "[={}\"]|[^={}\" \t]+"),
+PATTERNS("tex", "^(\\\\((sub)*section|chapter|part)\\*{0,1}\\{.*)$",
+ "\\\\[a-zA-Z@]+|\\\\.|[a-zA-Z0-9\x80-\xff]+|[^[:space:]]"),
+PATTERNS("cpp",
+ /* Jump targets or access declarations */
+ "!^[ \t]*[A-Za-z_][A-Za-z_0-9]*:.*$\n"
+ /* C/++ functions/methods at top level */
+ "^([A-Za-z_][A-Za-z_0-9]*([ \t]+[A-Za-z_][A-Za-z_0-9]*([ \t]*::[ \t]*[^[:space:]]+)?){1,}[ \t]*\\([^;]*)$\n"
+ /* compound type at top level */
+ "^((struct|class|enum)[^;]*)$",
+ /* -- */
+ "[a-zA-Z_][a-zA-Z0-9_]*"
+ "|[-+0-9.e]+[fFlL]?|0[xXbB]?[0-9a-fA-F]+[lL]?"
+ "|[-+*/<>%&^|=!]=|--|\\+\\+|<<=?|>>=?|&&|\\|\\||::|->"
+ "|[^[:space:]]|[\x80-\xff]+"),
{ "default", NULL, -1, { NULL, 0 } },
};
-#undef FUNCNAME
+#undef PATTERNS
static struct userdiff_driver driver_true = {
"diff=true",
return parse_string(&drv->external, k, v);
if ((drv = parse_driver(k, v, "textconv")))
return parse_string(&drv->textconv, k, v);
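+	/* per-driver word regex, configured as diff.<driver>.wordregex */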
+ if ((drv = parse_driver(k, v, "wordregex")))
+ return parse_string(&drv->word_regex, k, v);
return 0;
}
const char *external;
int binary;
struct userdiff_funcname funcname;
+ const char *word_regex;
const char *textconv;
};
die("Unable to create temporary file: %s", strerror(errno));
return fd;
}
+
+/*
+ * zlib wrappers to make sure we don't silently miss errors
+ * at init time.
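+ *
+ * Callers use them like the underlying zlib calls, e.g. (a sketch
+ * only, with "src"/"dst" standing in for real, sufficiently large
+ * buffers):
+ *
+ *	z_stream stream;
+ *	memset(&stream, 0, sizeof(stream));
+ *	stream.next_in = src;   stream.avail_in = src_len;
+ *	stream.next_out = dst;  stream.avail_out = dst_len;
+ *	git_inflate_init(&stream);
+ *	if (git_inflate(&stream, Z_FINISH) != Z_STREAM_END)
+ *		die("corrupt or truncated deflated data");
+ *	git_inflate_end(&stream);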
+ */
+void git_inflate_init(z_streamp strm)
+{
+ const char *err;
+
+ switch (inflateInit(strm)) {
+ case Z_OK:
+ return;
+
+ case Z_MEM_ERROR:
+ err = "out of memory";
+ break;
+ case Z_VERSION_ERROR:
+ err = "wrong version";
+ break;
+ default:
+ err = "error";
+ }
+ die("inflateInit: %s (%s)", err, strm->msg ? strm->msg : "no message");
+}
+
+void git_inflate_end(z_streamp strm)
+{
+ if (inflateEnd(strm) != Z_OK)
+ error("inflateEnd: %s", strm->msg ? strm->msg : "failed");
+}
+
+int git_inflate(z_streamp strm, int flush)
+{
+ int ret = inflate(strm, flush);
+ const char *err;
+
+ switch (ret) {
+ /* Out of memory is fatal. */
+ case Z_MEM_ERROR:
+ die("inflate: out of memory");
+
+ /* Data corruption errors: we may want to recover from them (fsck) */
+ case Z_NEED_DICT:
+ err = "needs dictionary"; break;
+ case Z_DATA_ERROR:
+ err = "data stream error"; break;
+ case Z_STREAM_ERROR:
+ err = "stream consistency error"; break;
+ default:
+ err = "unknown error"; break;
+
+ /* Z_BUF_ERROR: normal, needs more space in the output buffer */
+ case Z_BUF_ERROR:
+ case Z_OK:
+ case Z_STREAM_END:
+ return ret;
+ }
+ error("inflate: %s (%s)", err, strm->msg ? strm->msg : "no message");
+ return ret;
+}
#define XDF_IGNORE_WHITESPACE (1 << 2)
#define XDF_IGNORE_WHITESPACE_CHANGE (1 << 3)
#define XDF_IGNORE_WHITESPACE_AT_EOL (1 << 4)
+#define XDF_PATIENCE_DIFF (1 << 5)
#define XDF_WHITESPACE_FLAGS (XDF_IGNORE_WHITESPACE | XDF_IGNORE_WHITESPACE_CHANGE | XDF_IGNORE_WHITESPACE_AT_EOL)
#define XDL_PATCH_NORMAL '-'
xdalgoenv_t xenv;
diffdata_t dd1, dd2;
+ if (xpp->flags & XDF_PATIENCE_DIFF)
+ return xdl_do_patience_diff(mf1, mf2, xpp, xe);
+
if (xdl_prepare_env(mf1, mf2, xpp, xe) < 0) {
return -1;
void xdl_free_script(xdchange_t *xscr);
int xdl_emit_diff(xdfenv_t *xe, xdchange_t *xscr, xdemitcb_t *ecb,
xdemitconf_t const *xecfg);
+int xdl_do_patience_diff(mmfile_t *mf1, mmfile_t *mf2, xpparam_t const *xpp,
+ xdfenv_t *env);
#endif /* #if !defined(XDIFFI_H) */
--- /dev/null
+/*
+ * LibXDiff by Davide Libenzi ( File Differential Library )
+ * Copyright (C) 2003-2009 Davide Libenzi, Johannes E. Schindelin
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Lesser General Public
+ * License as published by the Free Software Foundation; either
+ * version 2.1 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with this library; if not, write to the Free Software
+ * Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+ *
+ * Davide Libenzi <davidel@xmailserver.org>
+ *
+ */
+#include "xinclude.h"
+#include "xtypes.h"
+#include "xdiff.h"
+
+/*
+ * The basic idea of patience diff is to find lines that are unique in
+ * both files. These are intuitively the ones that we want to see as
+ * common lines.
+ *
+ * The maximal ordered sequence of such line pairs (where ordered means
+ * that the order in the sequence agrees with the order of the lines in
+ * both files) naturally defines an initial set of common lines.
+ *
+ * Now, the algorithm tries to extend the set of common lines by growing
+ * the line ranges where the files have identical lines.
+ *
+ * Between those common lines, the patience diff algorithm is applied
+ * recursively, until no unique line pairs can be found; these line ranges
+ * are handled by the well-known Myers algorithm.
+ */
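+
+/*
+ * A small illustration: when diffing
+ *
+ *     file1: a b c d        file2: a d b c
+ *
+ * every line is unique in both files; ordered by file1 they form the
+ * pairs (1,1) (2,3) (3,4) (4,2).  The longest subsequence that is
+ * ascending on both sides is (1,1) (2,3) (3,4), so "a", "b" and "c"
+ * become the initial common lines, while "d" ends up being reported
+ * as removed from one place and added in the other.
+ */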
+
+#define NON_UNIQUE ULONG_MAX
+
+/*
+ * This is a hash mapping from line hash to line numbers in the first and
+ * second file.
+ */
+struct hashmap {
+ int nr, alloc;
+ struct entry {
+ unsigned long hash;
+ /*
+ * 0 = unused entry, 1 = first line, 2 = second, etc.
+ * line2 is NON_UNIQUE if the line is not unique
+ * in either the first or the second file.
+ */
+ unsigned long line1, line2;
+ /*
+ * "next" & "previous" are used for the longest common
+ * sequence;
+ * initially, "next" reflects only the order in file1.
+ */
+ struct entry *next, *previous;
+ } *entries, *first, *last;
+ /* were common records found? */
+ unsigned long has_matches;
+ mmfile_t *file1, *file2;
+ xdfenv_t *env;
+ xpparam_t const *xpp;
+};
+
+/* The argument "pass" is 1 for the first file, 2 for the second. */
+static void insert_record(int line, struct hashmap *map, int pass)
+{
+ xrecord_t **records = pass == 1 ?
+ map->env->xdf1.recs : map->env->xdf2.recs;
+ xrecord_t *record = records[line - 1], *other;
+ /*
+ * After xdl_prepare_env() (or more precisely, due to
+ * xdl_classify_record()), the "ha" member of the records (AKA lines)
+ * is _not_ the hash anymore, but a linearized version of it. In
+ * other words, the "ha" member is guaranteed to start with 0 and
+ * the second record's ha can only be 0 or 1, etc.
+ *
+ * So we multiply ha by 2 in the hope that the hashing was
+ * "unique enough".
+ */
+ int index = (int)((record->ha << 1) % map->alloc);
+
+ while (map->entries[index].line1) {
+ other = map->env->xdf1.recs[map->entries[index].line1 - 1];
+ if (map->entries[index].hash != record->ha ||
+ !xdl_recmatch(record->ptr, record->size,
+ other->ptr, other->size,
+ map->xpp->flags)) {
+ if (++index >= map->alloc)
+ index = 0;
+ continue;
+ }
+ if (pass == 2)
+ map->has_matches = 1;
+ if (pass == 1 || map->entries[index].line2)
+ map->entries[index].line2 = NON_UNIQUE;
+ else
+ map->entries[index].line2 = line;
+ return;
+ }
+ if (pass == 2)
+ return;
+ map->entries[index].line1 = line;
+ map->entries[index].hash = record->ha;
+ if (!map->first)
+ map->first = map->entries + index;
+ if (map->last) {
+ map->last->next = map->entries + index;
+ map->entries[index].previous = map->last;
+ }
+ map->last = map->entries + index;
+ map->nr++;
+}
+
+/*
+ * This function has to be called for each recursion into the inter-hunk
+ * parts, as previously non-unique lines can become unique when being
+ * restricted to a smaller part of the files.
+ *
+ * It is assumed that env has been prepared using xdl_prepare().
+ */
+static int fill_hashmap(mmfile_t *file1, mmfile_t *file2,
+ xpparam_t const *xpp, xdfenv_t *env,
+ struct hashmap *result,
+ int line1, int count1, int line2, int count2)
+{
+ result->file1 = file1;
+ result->file2 = file2;
+ result->xpp = xpp;
+ result->env = env;
+
+ /* We know exactly how large we want the hash map */
+ result->alloc = count1 * 2;
+ result->entries = (struct entry *)
+ xdl_malloc(result->alloc * sizeof(struct entry));
+ if (!result->entries)
+ return -1;
+ memset(result->entries, 0, result->alloc * sizeof(struct entry));
+
+ /* First, fill with entries from the first file */
+ while (count1--)
+ insert_record(line1++, result, 1);
+
+ /* Then search for matches in the second file */
+ while (count2--)
+ insert_record(line2++, result, 2);
+
+ return 0;
+}
+
+/*
+ * Find the longest sequence with a smaller last element (meaning a smaller
+ * line2, as we construct the sequence with entries ordered by line1).
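+ *
+ * For instance, if the current sequences end in entries whose line2
+ * values are 2, 5 and 9, an entry with line2 == 7 yields index 1
+ * (the element with line2 == 5), so the new entry extends that
+ * length-2 sequence into one of length 3.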
+ */
+static int binary_search(struct entry **sequence, int longest,
+ struct entry *entry)
+{
+ int left = -1, right = longest;
+
+ while (left + 1 < right) {
+ int middle = (left + right) / 2;
+ /* by construction, no two entries can be equal */
+ if (sequence[middle]->line2 > entry->line2)
+ right = middle;
+ else
+ left = middle;
+ }
+ /* return the index in "sequence", _not_ the sequence length */
+ return left;
+}
+
+/*
+ * The idea is to start with the list of common unique lines sorted by
+ * the order in file1. For each of these pairs, the longest (partial)
+ * sequence whose last element's line2 is smaller is determined.
+ *
+ * For efficiency, the sequences are kept in a list containing exactly one
+ * item per sequence length: the sequence with the smallest last
+ * element (in terms of line2).
+ */
+static struct entry *find_longest_common_sequence(struct hashmap *map)
+{
+ struct entry **sequence = xdl_malloc(map->nr * sizeof(struct entry *));
+ int longest = 0, i;
+ struct entry *entry;
+
+ for (entry = map->first; entry; entry = entry->next) {
+ if (!entry->line2 || entry->line2 == NON_UNIQUE)
+ continue;
+ i = binary_search(sequence, longest, entry);
+ entry->previous = i < 0 ? NULL : sequence[i];
+ sequence[++i] = entry;
+ if (i == longest)
+ longest++;
+ }
+
+ /* No common unique lines were found */
+ if (!longest) {
+ xdl_free(sequence);
+ return NULL;
+ }
+
+ /* Iterate starting at the last element, adjusting the "next" members */
+ entry = sequence[longest - 1];
+ entry->next = NULL;
+ while (entry->previous) {
+ entry->previous->next = entry;
+ entry = entry->previous;
+ }
+ xdl_free(sequence);
+ return entry;
+}
+
+static int match(struct hashmap *map, int line1, int line2)
+{
+ xrecord_t *record1 = map->env->xdf1.recs[line1 - 1];
+ xrecord_t *record2 = map->env->xdf2.recs[line2 - 1];
+ return xdl_recmatch(record1->ptr, record1->size,
+ record2->ptr, record2->size, map->xpp->flags);
+}
+
+static int patience_diff(mmfile_t *file1, mmfile_t *file2,
+ xpparam_t const *xpp, xdfenv_t *env,
+ int line1, int count1, int line2, int count2);
+
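+/*
+ * Starting from the first element of the longest common (unique)
+ * sequence, grow each anchor into a run of identical lines in both
+ * directions, and recurse into the gaps between consecutive runs.
+ */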
+static int walk_common_sequence(struct hashmap *map, struct entry *first,
+ int line1, int count1, int line2, int count2)
+{
+ int end1 = line1 + count1, end2 = line2 + count2;
+ int next1, next2;
+
+ for (;;) {
+ /* Try to grow the line ranges of common lines */
+ if (first) {
+ next1 = first->line1;
+ next2 = first->line2;
+ while (next1 > line1 && next2 > line2 &&
+ match(map, next1 - 1, next2 - 1)) {
+ next1--;
+ next2--;
+ }
+ } else {
+ next1 = end1;
+ next2 = end2;
+ }
+ while (line1 < next1 && line2 < next2 &&
+ match(map, line1, line2)) {
+ line1++;
+ line2++;
+ }
+
+ /* Recurse */
+ if (next1 > line1 || next2 > line2) {
+ struct hashmap submap;
+
+ memset(&submap, 0, sizeof(submap));
+ if (patience_diff(map->file1, map->file2,
+ map->xpp, map->env,
+ line1, next1 - line1,
+ line2, next2 - line2))
+ return -1;
+ }
+
+ if (!first)
+ return 0;
+
+ while (first->next &&
+ first->next->line1 == first->line1 + 1 &&
+ first->next->line2 == first->line2 + 1)
+ first = first->next;
+
+ line1 = first->line1 + 1;
+ line2 = first->line2 + 1;
+
+ first = first->next;
+ }
+}
+
+static int fall_back_to_classic_diff(struct hashmap *map,
+ int line1, int count1, int line2, int count2)
+{
+ /*
+ * This probably does not work outside Git, since
+ * we have a very simple mmfile structure.
+ *
+ * Note: ideally, we would reuse the prepared environment, but
+ * the libxdiff interface does not (yet) allow for diffing only
+ * ranges of lines instead of the whole files.
+ */
+ mmfile_t subfile1, subfile2;
+ xpparam_t xpp;
+ xdfenv_t env;
+
+ subfile1.ptr = (char *)map->env->xdf1.recs[line1 - 1]->ptr;
+ subfile1.size = map->env->xdf1.recs[line1 + count1 - 2]->ptr +
+ map->env->xdf1.recs[line1 + count1 - 2]->size - subfile1.ptr;
+ subfile2.ptr = (char *)map->env->xdf2.recs[line2 - 1]->ptr;
+ subfile2.size = map->env->xdf2.recs[line2 + count2 - 2]->ptr +
+ map->env->xdf2.recs[line2 + count2 - 2]->size - subfile2.ptr;
+ xpp.flags = map->xpp->flags & ~XDF_PATIENCE_DIFF;
+ if (xdl_do_diff(&subfile1, &subfile2, &xpp, &env) < 0)
+ return -1;
+
+ memcpy(map->env->xdf1.rchg + line1 - 1, env.xdf1.rchg, count1);
+ memcpy(map->env->xdf2.rchg + line2 - 1, env.xdf2.rchg, count2);
+
+ xdl_free_env(&env);
+
+ return 0;
+}
+
+/*
+ * Recursively find the longest common sequence of unique lines,
+ * and if none was found, ask xdl_do_diff() to do the job.
+ *
+ * This function assumes that env was prepared with xdl_prepare_env().
+ */
+static int patience_diff(mmfile_t *file1, mmfile_t *file2,
+ xpparam_t const *xpp, xdfenv_t *env,
+ int line1, int count1, int line2, int count2)
+{
+ struct hashmap map;
+ struct entry *first;
+ int result = 0;
+
+ /* trivial case: one side is empty */
+ if (!count1) {
+ while(count2--)
+ env->xdf2.rchg[line2++ - 1] = 1;
+ return 0;
+ } else if (!count2) {
+ while(count1--)
+ env->xdf1.rchg[line1++ - 1] = 1;
+ return 0;
+ }
+
+ memset(&map, 0, sizeof(map));
+ if (fill_hashmap(file1, file2, xpp, env, &map,
+ line1, count1, line2, count2))
+ return -1;
+
+ /* are there any matching lines at all? */
+ if (!map.has_matches) {
+ while(count1--)
+ env->xdf1.rchg[line1++ - 1] = 1;
+ while(count2--)
+ env->xdf2.rchg[line2++ - 1] = 1;
+ xdl_free(map.entries);
+ return 0;
+ }
+
+ first = find_longest_common_sequence(&map);
+ if (first)
+ result = walk_common_sequence(&map, first,
+ line1, count1, line2, count2);
+ else
+ result = fall_back_to_classic_diff(&map,
+ line1, count1, line2, count2);
+
+ xdl_free(map.entries);
+ return result;
+}
+
+int xdl_do_patience_diff(mmfile_t *file1, mmfile_t *file2,
+ xpparam_t const *xpp, xdfenv_t *env)
+{
+ if (xdl_prepare_env(file1, file2, xpp, env) < 0)
+ return -1;
+
+ /* environment is cleaned up in xdl_diff() */
+ return patience_diff(file1, file2, xpp, env,
+ 1, env->xdf1.nrec, 1, env->xdf2.nrec);
+}
xdl_free_classifier(&cf);
- if (xdl_optimize_ctxs(&xe->xdf1, &xe->xdf2) < 0) {
+ if (!(xpp->flags & XDF_PATIENCE_DIFF) &&
+ xdl_optimize_ctxs(&xe->xdf1, &xe->xdf2) < 0) {
xdl_free_ctx(&xe->xdf2);
xdl_free_ctx(&xe->xdf1);
while (ptr + 1 < top && isspace(ptr[1])
&& ptr[1] != '\n')
ptr++;
- if (flags & XDF_IGNORE_WHITESPACE_CHANGE
+ if (flags & XDF_IGNORE_WHITESPACE)
+ ; /* already handled */
+ else if (flags & XDF_IGNORE_WHITESPACE_CHANGE
&& ptr[1] != '\n') {
ha += (ha << 5);
ha ^= (unsigned long) ' ';
}
- if (flags & XDF_IGNORE_WHITESPACE_AT_EOL
+ else if (flags & XDF_IGNORE_WHITESPACE_AT_EOL
&& ptr[1] != '\n') {
while (ptr2 != ptr + 1) {
ha += (ha << 5);