- We do not use Process Substitution <(list) or >(list).
+ - Do not write control structures on a single line with a semicolon.
+ "then" should be on the next line for if statements, and "do"
+ should be on the next line for "while" and "for" (see the example
+ below this list).
+
- We prefer "test" over "[ ... ]".
- We do not write the noiseword "function" in front of shell
functions.
+ - We prefer a space between the function name and the parentheses. The
+ opening "{" should also be on the same line.
+ E.g.: my_function () {
+
- As to use of grep, stick to a subset of BRE (namely, no \{m,n\},
[::], [==], nor [..]) for portability.
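For illustration, a small fragment that follows the shell rules above
might look like this (the function and file names are made up for the
example):

	my_function () {
		if test -f "$1"
		then
			echo "$1 exists"
		fi

		for f in one two three
		do
			echo "$f"
		done
	}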
quick-install-html: require-htmlrepo
'$(SHELL_PATH_SQ)' ./install-doc-quick.sh $(HTML_REPO) $(DESTDIR)$(htmldir)
+print-man1:
+ @for i in $(MAN1_TXT); do echo $$i; done
+
.PHONY: FORCE
NAME
----
-git-credential-cache--daemon - temporarily store user credentials in memory
+git-credential-cache--daemon - Temporarily store user credentials in memory
SYNOPSIS
--------
NAME
----
-git-credential-cache - helper to temporarily store passwords in memory
+git-credential-cache - Helper to temporarily store passwords in memory
SYNOPSIS
--------
NAME
----
-git-credential-store - helper to store credentials on disk
+git-credential-store - Helper to store credentials on disk
SYNOPSIS
--------
--all::
Instead of using only the annotated tags, use any ref
- found in `.git/refs/`. This option enables matching
+ found in the `refs/` namespace. This option enables matching
any known branch, remote-tracking branch, or lightweight tag.
--tags::
Instead of using only the annotated tags, use any tag
- found in `.git/refs/tags`. This option enables matching
+ found in the `refs/tags` namespace. This option enables matching
a lightweight (non-annotated) tag.
--contains::
useful in the future for compensating for some git bugs or such;
therefore such a usage is permitted.
-*NOTE*: This command honors `.git/info/grafts` and `.git/refs/replace/`.
+*NOTE*: This command honors the `.git/info/grafts` file and refs in
+the `refs/replace/` namespace.
If you have any grafts or replacement refs defined, running this command
will make them permanent.
An object to treat as the head of an unreachability trace.
+
If no objects are given, 'git fsck' defaults to using the
-index file, all SHA1 references in .git/refs/*, and all reflogs (unless
---no-reflogs is given) as heads.
+index file, all SHA1 references in the `refs` namespace, and all reflogs
+(unless --no-reflogs is given) as heads.
--unreachable::
Print out objects that exist but that aren't reachable from any
------------
After making sure you know which object is the tag you are looking
-for, you can reconnect it to your regular .git/refs hierarchy.
+for, you can reconnect it to your regular `refs` hierarchy by using
+the `update-ref` command.
------------
$ git cat-file -t 1ef2b196
-----------
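A minimal sketch of that reconnection step, assuming the object
examined above (1ef2b196) really is the tag you want and that
`recovered-tag` is merely a name chosen for this example:

------------
$ git update-ref refs/tags/recovered-tag 1ef2b196
------------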
Traditionally, tips of branches and tags (collectively known as
-'refs') were stored one file per ref under `$GIT_DIR/refs`
+'refs') were stored one file per ref in a (sub)directory
+under the `$GIT_DIR/refs`
directory. While many branch tips tend to be updated often,
most tags and some branch tips are never updated. When a
repository has hundreds or thousands of tags, this
performance.
This command is used to solve the storage and performance
-problem by stashing the refs in a single file,
+problem by storing the refs in a single file,
`$GIT_DIR/packed-refs`. When a ref is missing from the
-traditional `$GIT_DIR/refs` hierarchy, it is looked up in this
+traditional `$GIT_DIR/refs` directory hierarchy, it is looked
+up in this
file and used if found.
Subsequent updates to branches always create new files under
-`$GIT_DIR/refs` hierarchy.
+`$GIT_DIR/refs` directory hierarchy.
A recommended practice to deal with a repository with too many
refs is to pack its refs with `--all --prune` once, and
The command usually removes loose refs under `$GIT_DIR/refs`
hierarchy after packing them. This option tells it not to.
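As a concrete illustration of the recommended practice mentioned
above, the one-time packing would be run as:

------------
$ git pack-refs --all --prune
------------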
+
+BUGS
+----
+
+Older documentation written before the packed-refs mechanism was
+introduced may still say things like ".git/refs/heads/<branch> file
+exists" when it means "branch <branch> exists".
+
+
GIT
---
Part of the linkgit:git[1] suite
:git-pull: 1
+-r::
--rebase::
Rebase the current branch on top of the upstream branch after
fetching. If there is a remote-tracking branch corresponding to
DESCRIPTION
-----------
-Adds a 'replace' reference in `.git/refs/replace/`
+Adds a 'replace' reference in the `refs/replace/` namespace.
The name of the 'replace' reference is the SHA1 of the object that is
replaced. The content of the 'replace' reference is the SHA1 of the
replacement object.
-Unless `-f` is given, the 'replace' reference must not yet exist in
-`.git/refs/replace/` directory.
+Unless `-f` is given, the 'replace' reference must not yet exist.
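For example, the following records that `<replacement>` should be used
whenever `<object>` is asked for (both placeholders stand for object
names; `-f` overwrites an existing entry):

------------
$ git replace <object> <replacement>
$ git replace -f <object> <replacement>
------------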
Replacement references will be used by default by all git commands
except those doing reachability traversal (prune, pack transfer and
DESCRIPTION
-----------
-Add a tag reference in `.git/refs/tags/`, unless `-d/-l/-v` is given
+Add a tag reference in `refs/tags/`, unless `-d/-l/-v` is given
to delete, list or verify tags.
-Unless `-f` is given, the tag to be created must not yet exist in the
-`.git/refs/tags/` directory.
+Unless `-f` is given, the named tag must not yet exist.
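For instance, creating an annotated tag and then forcibly re-creating
it with `-f` (the tag name and messages are made up for this example):

------------
$ git tag -a v1.0 -m "Release 1.0"
$ git tag -f -a v1.0 -m "Release 1.0, take two"
------------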
If one of `-a`, `-s`, or `-u <key-id>` is passed, the command
creates a 'tag' object, and requires a tag message. Unless
and full access to internals.
See linkgit:gittutorial[7] to get started, then see
-link:everyday.html[Everyday Git] for a useful minimum set of commands, and
-"man git-commandname" for documentation of each command. CVS users may
-also want to read linkgit:gitcvs-migration[7]. See
-the link:user-manual.html[Git User's Manual] for a more in-depth
-introduction.
+link:everyday.html[Everyday Git] for a useful minimum set of
+commands. The link:user-manual.html[Git User's Manual] has a more
+in-depth introduction.
-The '<command>' is either a name of a Git command (see below) or an alias
-as defined in the configuration file (see linkgit:git-config[1]).
+After you have mastered the basic concepts, you can come back to this
+page to learn what commands git offers. You can learn more about
+individual git commands with "git help command". The linkgit:gitcli[7]
+manual page gives you an overview of the command-line command syntax.
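For example, to read the documentation for a single command:

------------
$ git help clone
------------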
-Formatted and hyperlinked version of the latest git
-documentation can be viewed at
-`http://git-htmldocs.googlecode.com/git/git.html`.
+Formatted and hyperlinked version of the latest git documentation
+can be viewed at `http://git-htmldocs.googlecode.com/git/git.html`.
ifdef::stalenotes[]
[NOTE]
linkgit:git-replace[1] for more information.
-FURTHER DOCUMENTATION
----------------------
-
-See the references above to get started using git. The following is
-probably more detail than necessary for a first-time user.
-
-The link:user-manual.html#git-concepts[git concepts chapter of the
-user-manual] and linkgit:gitcore-tutorial[7] both provide
-introductions to the underlying git architecture.
-
-See linkgit:gitworkflows[7] for an overview of recommended workflows.
-
-See also the link:howto-index.html[howto] documents for some useful
-examples.
-
-The internals are documented in the
-link:technical/api-index.html[GIT API documentation].
-
GIT COMMANDS
------------
for a given pathname. These stages are used to hold the various
unmerged versions of a file when a merge is in progress.
+FURTHER DOCUMENTATION
+---------------------
+
+See the references in the "description" section to get started
+using git. The following is probably more detail than necessary
+for a first-time user.
+
+The link:user-manual.html#git-concepts[git concepts chapter of the
+user-manual] and linkgit:gitcore-tutorial[7] both provide
+introductions to the underlying git architecture.
+
+See linkgit:gitworkflows[7] for an overview of recommended workflows.
+
+See also the link:howto-index.html[howto] documents for some useful
+examples.
+
+The internals are documented in the
+link:technical/api-index.html[GIT API documentation].
+
+Users migrating from CVS may also want to
+read linkgit:gitcvs-migration[7].
+
+
Authors
-------
Git was started by Linus Torvalds, and is currently maintained by Junio
`git log -1 HEAD` but write `git log -1 HEAD --`; the former will not work
if you happen to have a file called `HEAD` in the work tree.
+ * many commands allow a long option "--option" to be abbreviated
+ to any unique prefix (e.g. if there is no other option whose name
+ begins with "opt", you may be able to spell "--opt" to invoke the
+ "--option" flag), but you should fully spell options out when
+ writing your scripts; a later version of Git may introduce a new
+ option whose name shares the same prefix, e.g. "--optimize",
+ making a short prefix that used to be unique no longer unique
+ (see the sketch below).
+
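As a sketch of the pitfall described in the last bullet, using the
traditional placeholder command name `git frotz` (not a real command):

------------
# works today if "--option" is the only option starting with "opt"
git frotz --opt

# always safe, even if a later Git gains "--optimize"
git frotz --option
------------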
ENHANCED OPTION PARSER
----------------------
By default, the commits are shown in reverse chronological order.
---topo-order::
+--date-order::
+ Show no parent before all of its children are shown, but
+ otherwise show commits in the commit timestamp order.
- This option makes them appear in topological order (i.e.
- descendant commits are shown before their parents).
+--topo-order::
+ Show no parent before all of its children are shown, and
+ avoid showing commits on multiple lines of history
+ intermixed.
++
+For example, in a commit history like this:
++
+----------------------------------------------------------------
---date-order::
+ ---1----2----4----7
+ \ \
+ 3----5----6----8---
- This option is similar to '--topo-order' in the sense that no
- parent comes before all of its children, but otherwise things
- are still ordered in the commit timestamp order.
+----------------------------------------------------------------
++
+where the numbers denote the order of commit timestamps, `git
+rev-list` and friends with `--date-order` show the commits in the
+timestamp order: 8 7 6 5 4 3 2 1.
++
+With `--topo-order`, they would show 8 6 5 3 7 4 2 1 (or 8 7 4 2 6 5
+3 1); some older commits are shown before newer ones in order to
+avoid showing the commits from two parallel development tracks mixed
+together.
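To see the two orderings side by side on a history like the one above,
something along these lines could be used (assuming the tips of both
lines of development are reachable from the refs given, e.g. via
`--all`):

----------------------------------------------------------------
$ git rev-list --date-order --all     # 8 7 6 5 4 3 2 1
$ git rev-list --topo-order --all     # e.g. 8 6 5 3 7 4 2 1
----------------------------------------------------------------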
--reverse::
### Check documentation
#
+ALL_COMMANDS = $(ALL_PROGRAMS) $(SCRIPT_LIB) $(BUILT_INS)
+ALL_COMMANDS += git
+ALL_COMMANDS += gitk
+ALL_COMMANDS += gitweb
+ALL_COMMANDS += git-gui git-citool
check-docs::
- @(for v in $(ALL_PROGRAMS) $(SCRIPT_LIB) $(BUILT_INS) git gitk; \
+ @(for v in $(ALL_COMMANDS); \
do \
case "$$v" in \
git-merge-octopus | git-merge-ours | git-merge-recursive | \
sed -e '/^#/d' \
-e 's/[ ].*//' \
-e 's/^/listed /' command-list.txt; \
- ls -1 Documentation/git*txt | \
+ $(MAKE) -C Documentation print-man1 | \
+ grep '\.txt$$' | \
sed -e 's|Documentation/|documented |' \
-e 's/\.txt//'; \
) | while read how cmd; \
do \
- case "$$how,$$cmd" in \
- *,git-citool | \
- *,git-gui | \
- *,git-help | \
- documented,gitattributes | \
- documented,gitignore | \
- documented,gitmodules | \
- documented,gitcli | \
- documented,git-tools | \
- documented,gitcore-tutorial | \
- documented,gitcvs-migration | \
- documented,gitdiffcore | \
- documented,gitglossary | \
- documented,githooks | \
- documented,gitrepository-layout | \
- documented,gitrevisions | \
- documented,gittutorial | \
- documented,gittutorial-2 | \
- documented,git-bisect-lk2009 | \
- documented,git-remote-helpers | \
- documented,gitworkflows | \
- sentinel,not,matching,is,ok ) continue ;; \
- esac; \
- case " $(ALL_PROGRAMS) $(SCRIPT_LIB) $(BUILT_INS) git gitk " in \
+ case " $(ALL_COMMANDS) " in \
*" $$cmd "*) ;; \
*) echo "removed but $$how: $$cmd" ;; \
esac; \
int is_new, is_delete; /* -1 = unknown, 0 = false, 1 = true */
int rejected;
unsigned ws_rule;
- unsigned long deflate_origlen;
int lines_added, lines_deleted;
int score;
unsigned int is_toplevel_relative:1;
paths[1] = NULL;
diff_tree_setup_paths(paths, &diff_opts);
- if (diff_setup_done(&diff_opts) < 0)
- die("diff-setup");
+ diff_setup_done(&diff_opts);
if (is_null_sha1(origin->commit->object.sha1))
do_diff_cache(parent->tree->object.sha1, &diff_opts);
diff_opts.single_follow = origin->path;
paths[0] = NULL;
diff_tree_setup_paths(paths, &diff_opts);
- if (diff_setup_done(&diff_opts) < 0)
- die("diff-setup");
+ diff_setup_done(&diff_opts);
if (is_null_sha1(origin->commit->object.sha1))
do_diff_cache(parent->tree->object.sha1, &diff_opts);
paths[0] = NULL;
diff_tree_setup_paths(paths, &diff_opts);
- if (diff_setup_done(&diff_opts) < 0)
- die("diff-setup");
+ diff_setup_done(&diff_opts);
/* Try "find copies harder" on new path if requested;
* we do not want to use diffcore_rename() actually to
init_revisions(&rev, NULL);
rev.diffopt.flags = opts->flags;
rev.diffopt.output_format |= DIFF_FORMAT_NAME_STATUS;
- if (diff_setup_done(&rev.diffopt) < 0)
- die(_("diff_setup_done failed"));
+ diff_setup_done(&rev.diffopt);
add_pending_object(&rev, head, NULL);
run_diff_index(&rev, 0);
}
argc = setup_revisions(argc, argv, &rev, NULL);
if (!rev.diffopt.output_format) {
rev.diffopt.output_format = DIFF_FORMAT_PATCH;
- if (diff_setup_done(&rev.diffopt) < 0)
- die(_("diff_setup_done failed"));
+ diff_setup_done(&rev.diffopt);
}
DIFF_OPT_SET(&rev.diffopt, RECURSIVE);
opts.output_format |=
DIFF_FORMAT_SUMMARY | DIFF_FORMAT_DIFFSTAT;
opts.detect_rename = DIFF_DETECT_RENAME;
- if (diff_setup_done(&opts) < 0)
- die(_("diff_setup_done failed"));
+ diff_setup_done(&opts);
diff_tree_sha1(head, new_head, "", &opts);
diffcore_std(&opts);
diff_flush(&opts);
git-config ancillarymanipulators
git-count-objects ancillaryinterrogators
git-credential purehelpers
+git-credential-cache purehelpers
+git-credential-store purehelpers
git-cvsexportcommit foreignscminterface
git-cvsimport foreignscminterface
git-cvsserver foreignscminterface
git-show-branch ancillaryinterrogators
git-show-index plumbinginterrogators
git-show-ref plumbinginterrogators
+git-sh-i18n purehelpers
git-sh-setup purehelpers
git-stash mainporcelain
git-status mainporcelain common
DIFF_OPT_SET(&revs->diffopt, NO_INDEX);
revs->max_count = -2;
- if (diff_setup_done(&revs->diffopt) < 0)
- die("diff_setup_done failed");
+ diff_setup_done(&revs->diffopt);
setup_diff_pager(&revs->diffopt);
DIFF_OPT_SET(&revs->diffopt, EXIT_WITH_STATUS);
}
}
-int diff_setup_done(struct diff_options *options)
+void diff_setup_done(struct diff_options *options)
{
int count = 0;
options->output_format = DIFF_FORMAT_NO_OUTPUT;
DIFF_OPT_SET(options, EXIT_WITH_STATUS);
}
-
- return 0;
}
static int opt_arg(const char *arg, int arg_short, const char *arg_long, int *val)
extern int diff_use_color_default;
extern void diff_setup(struct diff_options *);
extern int diff_opt_parse(struct diff_options *, const char **, int);
-extern int diff_setup_done(struct diff_options *);
+extern void diff_setup_done(struct diff_options *);
#define DIFF_DETECT_RENAME 1
#define DIFF_DETECT_COPY 2
sub unquote_rfc2047 {
local ($_) = @_;
my $encoding;
- if (s/=\?([^?]+)\?q\?(.*)\?=/$2/g) {
+ s{=\?([^?]+)\?q\?(.*?)\?=}{
$encoding = $1;
- s/_/ /g;
- s/=([0-9A-F]{2})/chr(hex($1))/eg;
- }
+ my $e = $2;
+ $e =~ s/_/ /g;
+ $e =~ s/=([0-9A-F]{2})/chr(hex($1))/eg;
+ $e;
+ }eg;
return wantarray ? ($_, $encoding) : $_;
}
use Git::SVN::Log;
use Git::SVN::Migration;
-use Git::SVN::Utils qw(fatal can_compress);
+use Git::SVN::Utils qw(
+ fatal
+ can_compress
+ canonicalize_path
+ canonicalize_url
+ join_paths
+ add_path_to_url
+);
+
use Git qw(
git_cmd_try
command
my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
$gs ||= Git::SVN->new;
my $r = (defined $_revision ? $_revision : $gs->ra->get_latest_revnum);
- $gs->prop_walk($gs->{path}, $r, sub {
+ $gs->prop_walk($gs->path, $r, sub {
my ($gs, $path, $props) = @_;
print STDOUT "\n# $path\n";
my $s = $props->{'svn:ignore'} or return;
my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
$gs ||= Git::SVN->new;
my $r = (defined $_revision ? $_revision : $gs->ra->get_latest_revnum);
- $gs->prop_walk($gs->{path}, $r, sub {
+ $gs->prop_walk($gs->path, $r, sub {
my ($gs, $path, $props) = @_;
print STDOUT "\n# $path\n";
my $s = $props->{'svn:externals'} or return;
my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
$gs ||= Git::SVN->new;
my $r = (defined $_revision ? $_revision : $gs->ra->get_latest_revnum);
- $gs->prop_walk($gs->{path}, $r, sub {
+ $gs->prop_walk($gs->path, $r, sub {
my ($gs, $path, $props) = @_;
# $path is of the form /path/to/dir/
$path = '.' . $path;
$gs->mkemptydirs($_revision);
}
-sub canonicalize_path {
- my ($path) = @_;
- my $dot_slash_added = 0;
- if (substr($path, 0, 1) ne "/") {
- $path = "./" . $path;
- $dot_slash_added = 1;
- }
- # File::Spec->canonpath doesn't collapse x/../y into y (for a
- # good reason), so let's do this manually.
- $path =~ s#/+#/#g;
- $path =~ s#/\.(?:/|$)#/#g;
- $path =~ s#/[^/]+/\.\.##g;
- $path =~ s#/$##g;
- $path =~ s#^\./## if $dot_slash_added;
- $path =~ s#^/##;
- $path =~ s#^\.$##;
- return $path;
-}
-
-sub canonicalize_url {
- my ($url) = @_;
- $url =~ s#^([^:]+://[^/]*/)(.*)$#$1 . canonicalize_path($2)#e;
- return $url;
-}
-
# get_svnprops(PATH)
# ------------------
# Helper for cmd_propget and cmd_proplist below.
$path = $cmd_dir_prefix . $path;
fatal("No such file or directory: $path") unless -e $path;
my $is_dir = -d $path ? 1 : 0;
- $path = $gs->{path} . '/' . $path;
+ $path = join_paths($gs->{path}, $path);
# canonicalize the path (otherwise libsvn will abort or fail to
# find the file)
fatal("Needed URL or usable git-svn --id in ",
"the command-line\n", $usage);
}
- $url = $gs->{url};
- $svn_path = $gs->{path};
+ $url = $gs->url;
+ $svn_path = $gs->path;
}
unless (defined $_revision) {
fatal("-r|--revision is a required argument\n", $usage);
}
}
-sub escape_uri_only {
- my ($uri) = @_;
- my @tmp;
- foreach (split m{/}, $uri) {
- s/([^~\w.%+-]|%(?![a-fA-F0-9]{2}))/sprintf("%%%02X",ord($1))/eg;
- push @tmp, $_;
- }
- join('/', @tmp);
-}
-
-sub escape_url {
- my ($url) = @_;
- if ($url =~ m#^([^:]+)://([^/]*)(.*)$#) {
- my ($scheme, $domain, $uri) = ($1, $2, escape_uri_only($3));
- $url = "$scheme://$domain$uri";
- }
- $url;
-}
sub cmd_info {
my $path = canonicalize_path(defined($_[0]) ? $_[0] : ".");
# canonicalize_path() will return "" to make libsvn 1.5.x happy,
$path = "." if $path eq "";
- my $full_url = $url . ($fullpath eq "" ? "" : "/$fullpath");
+ my $full_url = canonicalize_url( add_path_to_url( $url, $fullpath ) );
if ($_url) {
- print escape_url($full_url), "\n";
+ print "$full_url\n";
return;
}
my $result = "Path: $path\n";
$result .= "Name: " . basename($path) . "\n" if $file_type ne "dir";
- $result .= "URL: " . escape_url($full_url) . "\n";
+ $result .= "URL: $full_url\n";
eval {
my $repos_root = $gs->repos_root;
Git::SVN::remove_username($repos_root);
- $result .= "Repository Root: " . escape_url($repos_root) . "\n";
+ $result .= "Repository Root: " . canonicalize_url($repos_root) . "\n";
};
if ($@) {
$result .= "Repository Root: (offline)\n";
sub complete_svn_url {
my ($url, $path) = @_;
- $path =~ s#/+$##;
+ $path = canonicalize_path($path);
+
+ # If the path is not a URL...
if ($path !~ m#^[a-z\+]+://#) {
if (!defined $url || $url !~ m#^[a-z\+]+://#) {
fatal("E: '$path' is not a complete URL ",
print STDERR "W: $switch not specified\n";
return;
}
- $repo_path =~ s#/+$##;
+ $repo_path = canonicalize_path($repo_path);
if ($repo_path =~ m#^[a-z\+]+://#) {
$ra = Git::SVN::Ra->new($repo_path);
$repo_path = '';
"and a separate URL is not specified");
}
}
- my $url = $ra->{url};
+ my $url = $ra->url;
my $gs = Git::SVN->init($url, undef, undef, undef, 1);
my $k = "svn-remote.$gs->{repo_id}.url";
my $orig_url = eval { command_oneline(qw/config --get/, $k) };
- if ($orig_url && ($orig_url ne $gs->{url})) {
+ if ($orig_url && ($orig_url ne $gs->url)) {
die "$k already set: $orig_url\n",
- "wanted to set to: $gs->{url}\n";
+ "wanted to set to: $gs->url\n";
}
- command_oneline('config', $k, $gs->{url}) unless $orig_url;
- my $remote_path = "$gs->{path}/$repo_path";
+ command_oneline('config', $k, $gs->url) unless $orig_url;
+
+ my $remote_path = join_paths( $gs->path, $repo_path );
$remote_path =~ s{%([0-9A-F]{2})}{chr hex($1)}ieg;
- $remote_path =~ s#/+#/#g;
$remote_path =~ s#^/##g;
$remote_path .= "/*" if $remote_path !~ /\*/;
my ($n) = ($switch =~ /^--(\w+)/);
# to build the base URL ourselves:
our $path_info = decode_utf8($ENV{"PATH_INFO"});
if ($path_info) {
+ # $path_info has already been URL-decoded by the web server, but
+ # $my_url and $my_uri have not. URL-decode them so we can properly
+ # strip $path_info.
+ $my_url = unescape($my_url);
+ $my_uri = unescape($my_uri);
if ($my_url =~ s,\Q$path_info\E$,, &&
$my_uri =~ s,\Q$path_info\E$,, &&
defined $ENV{'SCRIPT_NAME'}) {
opts.rename_score = o->rename_score;
opts.show_rename_progress = o->show_rename_progress;
opts.output_format = DIFF_FORMAT_NO_OUTPUT;
- if (diff_setup_done(&opts) < 0)
- die(_("diff setup failed"));
+ diff_setup_done(&opts);
diff_tree_sha1(o_tree->object.sha1, tree->object.sha1, "", &opts);
diffcore_std(&opts);
if (opts.needed_rename_limit > o->needed_rename_limit)
return newpath;
}
-static void flush_buffer(int fd, const char *buf, unsigned long size)
-{
- while (size > 0) {
- long ret = write_in_full(fd, buf, size);
- if (ret < 0) {
- /* Ignore epipe */
- if (errno == EPIPE)
- break;
- die_errno("merge-recursive");
- } else if (!ret) {
- die(_("merge-recursive: disk full?"));
- }
- size -= ret;
- buf += ret;
- }
-}
-
static int dir_in_way(const char *path, int check_working_copy)
{
int pos, pathlen = strlen(path);
fd = open(path, O_WRONLY | O_TRUNC | O_CREAT, mode);
if (fd < 0)
die_errno(_("failed to open '%s'"), path);
- flush_buffer(fd, buf, size);
+ write_in_full(fd, buf, size);
close(fd);
} else if (S_ISLNK(mode)) {
char *lnk = xmemdupz(buf, size);
diff_setup(&opt);
DIFF_OPT_SET(&opt, RECURSIVE);
opt.output_format = DIFF_FORMAT_NO_OUTPUT;
- if (diff_setup_done(&opt) < 0)
- die("diff_setup_done failed");
+ diff_setup_done(&opt);
diff_tree_sha1(base, remote, "", &opt);
diffcore_std(&opt);
diff_setup(&opt);
DIFF_OPT_SET(&opt, RECURSIVE);
opt.output_format = DIFF_FORMAT_NO_OUTPUT;
- if (diff_setup_done(&opt) < 0)
- die("diff_setup_done failed");
+ diff_setup_done(&opt);
diff_tree_sha1(base, local, "", &opt);
diffcore_std(&opt);
memset(ids, 0, sizeof(*ids));
diff_setup(&ids->diffopts);
DIFF_OPT_SET(&ids->diffopts, RECURSIVE);
- if (diff_setup_done(&ids->diffopts) < 0)
- return error("diff_setup_done failed");
+ diff_setup_done(&ids->diffopts);
return 0;
}
command_output_pipe
command_close_pipe
);
-use Git::SVN::Utils qw(fatal can_compress);
+use Git::SVN::Utils qw(
+ fatal
+ can_compress
+ join_paths
+ canonicalize_path
+ canonicalize_url
+ add_path_to_url
+);
my $can_use_yaml;
BEGIN {
} elsif (m!^(.+)\.usesvmprops=\s*(.*)\s*$!) {
$r->{$1}->{svm} = {};
} elsif (m!^(.+)\.url=\s*(.*)\s*$!) {
- $r->{$1}->{url} = $2;
+ $r->{$1}->{url} = canonicalize_url($2);
} elsif (m!^(.+)\.pushurl=\s*(.*)\s*$!) {
- $r->{$1}->{pushurl} = $2;
+ $r->{$1}->{pushurl} = canonicalize_url($2);
} elsif (m!^(.+)\.ignore-refs=\s*(.*)\s*$!) {
$r->{$1}->{ignore_refs_regex} = $2;
} elsif (m!^(.+)\.(branches|tags)=$svn_refspec$!) {
sub init_remote_config {
my ($self, $url, $no_write) = @_;
- $url =~ s!/+$!!; # strip trailing slash
+ $url = canonicalize_url($url);
my $r = read_all_remotes();
my $existing = find_existing_remote($url, $r);
if ($existing) {
print STDERR "Using higher level of URL: ",
"$url => $min_url\n";
}
- my $old_path = $self->{path};
- $self->{path} = $url;
- $self->{path} =~ s!^\Q$min_url\E(/|$)!!;
- if (length $old_path) {
- $self->{path} .= "/$old_path";
- }
+ my $old_path = $self->path;
+ $url =~ s!^\Q$min_url\E(/|$)!!;
+ $url = join_paths($url, $old_path);
+ $self->path($url);
$url = $min_url;
}
}
unless ($no_write) {
command_noisy('config',
"svn-remote.$self->{repo_id}.url", $url);
- $self->{path} =~ s{^/}{};
- $self->{path} =~ s{%([0-9A-F]{2})}{chr hex($1)}ieg;
+ my $path = $self->path;
+ $path =~ s{^/}{};
+ $path =~ s{%([0-9A-F]{2})}{chr hex($1)}ieg;
+ $self->path($path);
command_noisy('config', '--add',
"svn-remote.$self->{repo_id}.fetch",
- "$self->{path}:".$self->refname);
+ $self->path.":".$self->refname);
}
- $self->{url} = $url;
+ $self->url($url);
}
sub find_by_url { # repos_root and, path are optional
my ($class, $full_url, $repos_root, $path) = @_;
+ $full_url = canonicalize_url($full_url);
+
return undef unless defined $full_url;
remove_username($full_url);
remove_username($repos_root) if defined $repos_root;
}
$p =~ s#^\Q$z\E(?:/|$)#$prefix# or next;
}
+
+ # remote fetch paths are not URI escaped. Decode ours
+ # so they match
+ $p = uri_decode($p);
+
foreach my $f (keys %$fetch) {
next if $f ne $p;
return Git::SVN->new($fetch->{$f}, $repo_id, $f);
}
}
my $self = _new($class, $repo_id, $ref_id, $path);
- if (!defined $self->{path} || !length $self->{path}) {
+ if (!defined $self->path || !length $self->path) {
my $fetch = command_oneline('config', '--get',
"svn-remote.$repo_id.fetch",
":$ref_id\$") or
die "Failed to read \"svn-remote.$repo_id.fetch\" ",
"\":$ref_id\$\" in config\n";
- ($self->{path}, undef) = split(/\s*:\s*/, $fetch);
+ my($path) = split(/\s*:\s*/, $fetch);
+ $self->path($path);
}
- $self->{path} =~ s{/+}{/}g;
- $self->{path} =~ s{\A/}{};
- $self->{path} =~ s{/\z}{};
- $self->{url} = command_oneline('config', '--get',
- "svn-remote.$repo_id.url") or
+ {
+ my $path = $self->path;
+ $path =~ s{\A/}{};
+ $path =~ s{/\z}{};
+ $self->path($path);
+ }
+ my $url = command_oneline('config', '--get',
+ "svn-remote.$repo_id.url") or
die "Failed to read \"svn-remote.$repo_id.url\" in config\n";
+ $self->url($url);
$self->{pushurl} = eval { command_oneline('config', '--get',
"svn-remote.$repo_id.pushurl") };
$self->rebuild;
# username is of no interest
$src =~ s{(^[a-z\+]*://)[^/@]*@}{$1};
- my $replace = $ra->{url};
- $replace .= "/$path" if length $path;
+ my $replace = add_path_to_url($ra->url, $path);
my $section = "svn-remote.$self->{repo_id}";
tmp_config("$section.svm-source", $src);
}
my $r = $ra->get_latest_revnum;
- my $path = $self->{path};
+ my $path = $self->path;
my %tried;
while (length $path) {
- unless ($tried{"$self->{url}/$path"}) {
+ my $try = add_path_to_url($self->url, $path);
+ unless ($tried{$try}) {
return $ra if $self->read_svm_props($ra, $path, $r);
- $tried{"$self->{url}/$path"} = 1;
+ $tried{$try} = 1;
}
$path =~ s#/?[^/]+$##;
}
die "Path: '$path' should be ''\n" if $path ne '';
return $ra if $self->read_svm_props($ra, $path, $r);
- $tried{"$self->{url}/$path"} = 1;
+ $tried{ add_path_to_url($self->url, $path) } = 1;
- if ($ra->{repos_root} eq $self->{url}) {
+ if ($ra->{repos_root} eq $self->url) {
die @err, (map { " $_\n" } keys %tried), "\n";
}
$path = $ra->{svn_path};
$ra = Git::SVN::Ra->new($ra->{repos_root});
while (length $path) {
- unless ($tried{"$ra->{url}/$path"}) {
+ my $try = add_path_to_url($ra->url, $path);
+ unless ($tried{$try}) {
$ok = $self->read_svm_props($ra, $path, $r);
last if $ok;
- $tried{"$ra->{url}/$path"} = 1;
+ $tried{$try} = 1;
}
$path =~ s#/?[^/]+$##;
}
die "Path: '$path' should be ''\n" if $path ne '';
$ok ||= $self->read_svm_props($ra, $path, $r);
- $tried{"$ra->{url}/$path"} = 1;
+ $tried{ add_path_to_url($ra->url, $path) } = 1;
if (!$ok) {
die @err, (map { " $_\n" } keys %tried), "\n";
}
- Git::SVN::Ra->new($self->{url});
+ Git::SVN::Ra->new($self->url);
}
sub svnsync {
if (!$@ && $uuid && $uuid =~ /^([a-f\d\-]{30,})$/i) {
$self->{ra_uuid} = $uuid;
} else {
- die "ra_uuid called without URL\n" unless $self->{url};
+ die "ra_uuid called without URL\n" unless $self->url;
$self->{ra_uuid} = $self->ra->get_uuid;
tmp_config('--add', $key, $self->{ra_uuid});
}
sub ra {
my ($self) = shift;
- my $ra = Git::SVN::Ra->new($self->{url});
+ my $ra = Git::SVN::Ra->new($self->url);
$self->_set_repos_root($ra->{repos_root});
if ($self->use_svm_props && !$self->{svm}) {
if ($self->no_metadata) {
$path =~ s#^/*#/#g;
my $p = $path;
# Strip the irrelevant part of the path.
- $p =~ s#^/+\Q$self->{path}\E(/|$)#/#;
+ $p =~ s#^/+\Q@{[$self->path]}\E(/|$)#/#;
# Ensure the path is terminated by a `/'.
$p =~ s#/*$#/#;
foreach (sort keys %$dirent) {
next if $dirent->{$_}->{kind} != $SVN::Node::dir;
- $self->prop_walk($self->{path} . $p . $_, $rev, $sub);
+ $self->prop_walk($self->path . $p . $_, $rev, $sub);
}
}
sub metadata_url {
my ($self) = @_;
- ($self->rewrite_root || $self->{url}) .
- (length $self->{path} ? '/' . $self->{path} : '');
+ my $url = $self->rewrite_root || $self->url;
+ return canonicalize_url( add_path_to_url( $url, $self->path ) );
}
sub full_url {
my ($self) = @_;
- $self->{url} . (length $self->{path} ? '/' . $self->{path} : '');
+ return canonicalize_url( add_path_to_url( $self->url, $self->path ) );
}
sub full_pushurl {
my ($self) = @_;
if ($self->{pushurl}) {
- return $self->{pushurl} . (length $self->{path} ? '/' .
- $self->{path} : '');
+ return canonicalize_url( add_path_to_url( $self->{pushurl}, $self->path ) );
} else {
return $self->full_url;
}
sub match_paths {
my ($self, $paths, $r) = @_;
- return 1 if $self->{path} eq '';
- if (my $path = $paths->{"/$self->{path}"}) {
+ return 1 if $self->path eq '';
+ if (my $path = $paths->{"/".$self->path}) {
return ($path->{action} eq 'D') ? 0 : 1;
}
- $self->{path_regex} ||= qr/^\/\Q$self->{path}\E\//;
+ $self->{path_regex} ||= qr{^/\Q@{[$self->path]}\E/};
if (grep /$self->{path_regex}/, keys %$paths) {
return 1;
}
my $c = '';
- foreach (split m#/#, $self->{path}) {
+ foreach (split m#/#, $self->path) {
$c .= "/$_";
next unless ($paths->{$c} &&
($paths->{$c}->{action} =~ /^[AR]$/));
- if ($self->ra->check_path($self->{path}, $r) ==
+ if ($self->ra->check_path($self->path, $r) ==
$SVN::Node::dir) {
return 1;
}
unless (defined $paths) {
my $err_handler = $SVN::Error::handler;
$SVN::Error::handler = \&Git::SVN::Ra::skip_unknown_revs;
- $self->ra->get_log([$self->{path}], $rev, $rev, 0, 1, 1,
+ $self->ra->get_log([$self->path], $rev, $rev, 0, 1, 1,
sub { $paths = $_[0] });
$SVN::Error::handler = $err_handler;
}
return undef unless defined $paths;
# look for a parent from another branch:
- my @b_path_components = split m#/#, $self->{path};
+ my @b_path_components = split m#/#, $self->path;
my @a_path_components;
my $i;
while (@b_path_components) {
}
my $r = $i->{copyfrom_rev};
my $repos_root = $self->ra->{repos_root};
- my $url = $self->ra->{url};
- my $new_url = $url . $branch_from;
+ my $url = $self->ra->url;
+ my $new_url = canonicalize_url( add_path_to_url( $url, $branch_from ) );
print STDERR "Found possible branch point: ",
"$new_url => ", $self->full_url, ", $r\n"
unless $::_q > 1;
($base, $head) = parse_revision_argument(0, $r);
} else {
if ($r0 < $r) {
- $gs->ra->get_log([$gs->{path}], $r0 + 1, $r, 1,
+ $gs->ra->get_log([$gs->path], $r0 + 1, $r, 1,
0, 1, sub { $base = $_[1] - 1 });
}
}
# at the moment), so we can't rely on it
$self->{last_rev} = $r0;
$self->{last_commit} = $parent;
- $ed = Git::SVN::Fetcher->new($self, $gs->{path});
+ $ed = Git::SVN::Fetcher->new($self, $gs->path);
$gs->ra->gs_do_switch($r0, $rev, $gs,
$self->full_url, $ed)
or die "SVN connection failed somewhere...\n";
close $fh;
}
- my $strip = qr/\A\Q$self->{path}\E(?:\/|$)/;
+ my $strip = qr/\A\Q@{[$self->path]}\E(?:\/|$)/;
foreach my $d (sort keys %empty_dirs) {
$d = uri_decode($d);
$d =~ s/$strip//;
for my $ticket ( @tickets ) {
my ($uuid, $path, $rev) = split /:/, $ticket;
if ( $uuid eq $self->ra_uuid ) {
- my $url = $self->{url};
- my $repos_root = $url;
+ my $repos_root = $self->url;
my $branch_from = $path;
$branch_from =~ s{^/}{};
- my $gs = $self->other_gs($repos_root."/".$branch_from,
- $url,
+ my $gs = $self->other_gs(add_path_to_url( $repos_root, $branch_from ),
+ $repos_root,
$branch_from,
$rev,
$self->{ref_id});
# are now marked as merge, we can add the tip as a parent.
my @merges = split "\n", $mergeinfo;
my @merge_tips;
- my $url = $self->{url};
+ my $url = $self->url;
my $uuid = $self->ra_uuid;
my %ranges;
for my $merge ( @merges ) {
$email ||= "$author\@$uuid";
$commit_email ||= "$author\@$uuid";
} elsif ($self->use_svnsync_props) {
- my $full_url = $self->svnsync->{url};
- $full_url .= "/$self->{path}" if length $self->{path};
+ my $full_url = canonicalize_url(
+ add_path_to_url( $self->svnsync->{url}, $self->path )
+ );
remove_username($full_url);
my $uuid = $self->svnsync->{uuid};
$log_entry{metadata} = "$full_url\@$rev $uuid";
tree_b => $tree,
editor_cb => sub {
$self->set_tree_cb($log_entry, $tree, @_) },
- svn_path => $self->{path} );
+ svn_path => $self->path );
if (!Git::SVN::Editor->new(\%ed_opts)->apply_diff) {
print "No changes\nr$self->{last_rev} = $tree\n";
}
$_[3] = $path = '' unless (defined $path);
mkpath([$dir]);
- bless {
+ my $obj = bless {
ref_id => $ref_id, dir => $dir, index => "$dir/index",
- path => $path, config => "$ENV{GIT_DIR}/svn/config",
+ config => "$ENV{GIT_DIR}/svn/config",
map_root => "$dir/.rev_map", repo_id => $repo_id }, $class;
+
+ # Ensure it gets canonicalized
+ $obj->path($path);
+
+ return $obj;
+}
+
+sub path {
+ my $self = shift;
+
+ if (@_) {
+ my $path = shift;
+ $self->{path} = canonicalize_path($path);
+ return;
+ }
+
+ return $self->{path};
+}
+
+sub url {
+ my $self = shift;
+
+ if (@_) {
+ my $url = shift;
+ $self->{url} = canonicalize_url($url);
+ return;
+ }
+
+ return $self->{url};
}
# for read-only access of old .rev_db formats
chomp(my $empty_blob = `git hash-object -t blob --stdin < /dev/null`);
my ($ls, $ctx) = command_output_pipe(qw/ls-tree -r -z/, $cmt);
local $/ = "\0";
- my $pfx = defined($switch_path) ? $switch_path : $git_svn->{path};
+ my $pfx = defined($switch_path) ? $switch_path : $git_svn->path;
$pfx .= '/' if length($pfx);
while (<$ls>) {
chomp;
my $ra = Git::SVN::Ra->new($url);
# skip existing cases where we already connect to the root
- if (($ra->{url} eq $ra->{repos_root}) ||
+ if (($ra->url eq $ra->{repos_root}) ||
($ra->{repos_root} eq $repo_id)) {
- $root_repos->{$ra->{url}} = $repo_id;
+ $root_repos->{$ra->url} = $repo_id;
next;
}
my $root_ra = Git::SVN::Ra->new($ra->{repos_root});
- my $root_path = $ra->{url};
+ my $root_path = $ra->url;
$root_path =~ s#^\Q$ra->{repos_root}\E(/|$)##;
foreach my $path (keys %$fetch) {
my $ref_id = $fetch->{$path};
use strict;
use warnings;
use SVN::Client;
+use Git::SVN::Utils qw(
+ canonicalize_url
+ canonicalize_path
+ add_path_to_url
+);
+
use SVN::Ra;
BEGIN {
@ISA = qw(SVN::Ra);
\@rv;
}
-sub escape_uri_only {
- my ($uri) = @_;
- my @tmp;
- foreach (split m{/}, $uri) {
- s/([^~\w.%+-]|%(?![a-fA-F0-9]{2}))/sprintf("%%%02X",ord($1))/eg;
- push @tmp, $_;
- }
- join('/', @tmp);
-}
-
-sub escape_url {
- my ($url) = @_;
- if ($url =~ m#^(https?)://([^/]+)(.*)$#) {
- my ($scheme, $domain, $uri) = ($1, $2, escape_uri_only($3));
- $url = "$scheme://$domain$uri";
- }
- $url;
-}
sub new {
my ($class, $url) = @_;
- $url =~ s!/+$!!;
- return $RA if ($RA && $RA->{url} eq $url);
+ $url = canonicalize_url($url);
+ return $RA if ($RA && $RA->url eq $url);
::_req_svn();
$Git::SVN::Prompt::_no_auth_cache = 1;
}
} # no warnings 'once'
- my $self = SVN::Ra->new(url => escape_url($url), auth => $baton,
+
+ my $self = SVN::Ra->new(url => $url, auth => $baton,
config => $config,
pool => SVN::Pool->new,
auth_provider_callbacks => $callbacks);
- $self->{url} = $url;
+ $RA = bless $self, $class;
+
+ # Make sure it's canonicalized
+ $self->url($url);
$self->{svn_path} = $url;
$self->{repos_root} = $self->get_repos_root;
$self->{svn_path} =~ s#^\Q$self->{repos_root}\E(/|$)##;
$self->{cache} = { check_path => { r => 0, data => {} },
get_dir => { r => 0, data => {} } };
- $RA = bless $self, $class;
+
+ return $RA;
+}
+
+sub url {
+ my $self = shift;
+
+ if (@_) {
+ my $url = shift;
+ $self->{url} = canonicalize_url($url);
+ return;
+ }
+
+ return $self->{url};
}
sub check_path {
qw/copyfrom_path copyfrom_rev action/;
if ($s{'copyfrom_path'}) {
$s{'copyfrom_path'} =~ s/$prefix_regex//;
+ $s{'copyfrom_path'} = canonicalize_path($s{'copyfrom_path'});
}
$_[0]{$p} = \%s;
}
sub gs_do_update {
my ($self, $rev_a, $rev_b, $gs, $editor) = @_;
my $new = ($rev_a == $rev_b);
- my $path = $gs->{path};
+ my $path = $gs->path;
if ($new && -e $gs->{index}) {
unlink $gs->{index} or die
# svn_ra_reparent didn't work before 1.4)
sub gs_do_switch {
my ($self, $rev_a, $rev_b, $gs, $url_b, $editor) = @_;
- my $path = $gs->{path};
+ my $path = $gs->path;
my $pool = SVN::Pool->new;
- my $full_url = $self->{url};
- my $old_url = $full_url;
- $full_url .= '/' . $path if length $path;
+ my $old_url = $self->url;
+ my $full_url = add_path_to_url( $self->url, $path );
my ($ra, $reparented);
if ($old_url =~ m#^svn(\+ssh)?://# ||
($full_url =~ m#^https?://# &&
- escape_url($full_url) ne $full_url)) {
+ canonicalize_url($full_url) ne $full_url)) {
$_[0] = undef;
$self = undef;
$RA = undef;
$ra = Git::SVN::Ra->new($full_url);
$ra_invalid = 1;
} elsif ($old_url ne $full_url) {
- SVN::_Ra::svn_ra_reparent($self->{session}, $full_url, $pool);
- $self->{url} = $full_url;
+ SVN::_Ra::svn_ra_reparent(
+ $self->{session},
+ canonicalize_url($full_url),
+ $pool
+ );
+ $self->url($full_url);
$reparented = 1;
}
$ra ||= $self;
- $url_b = escape_url($url_b);
+ $url_b = canonicalize_url($url_b);
my $reporter = $ra->do_switch($rev_b, '', 1, $url_b, $editor, $pool);
my @lock = (::compare_svn_version('1.2.0') >= 0) ? (undef) : ();
$reporter->set_path('', $rev_a, 0, @lock, $pool);
if ($reparented) {
SVN::_Ra::svn_ra_reparent($self->{session}, $old_url, $pool);
- $self->{url} = $old_url;
+ $self->url($old_url);
}
$pool->clear;
my $common_max = scalar @$gsv;
foreach my $gs (@$gsv) {
- my @tmp = split m#/#, $gs->{path};
+ my @tmp = split m#/#, $gs->path;
my $p = '';
foreach (@tmp) {
$p .= length($p) ? "/$_" : $_;
my $inc = $_log_window_size;
my ($min, $max) = ($base, $head < $base + $inc ? $head : $base + $inc);
my $longest_path = longest_common_path($gsv, $globs);
- my $ra_url = $self->{url};
+ my $ra_url = $self->url;
my $find_trailing_edge;
while (1) {
my %revs;
($self->check_path($p, $r) !=
$SVN::Node::dir));
next unless $p =~ /$g->{path}->{regex}/;
- $exists->{$p} = Git::SVN->init($self->{url}, $p, undef,
+ $exists->{$p} = Git::SVN->init($self->url, $p, undef,
$g->{ref}->full_path($de), 1);
}
}
next if ($self->check_path($pathname, $r) !=
$SVN::Node::dir);
$exists->{$pathname} = Git::SVN->init(
- $self->{url}, $pathname, undef,
+ $self->url, $pathname, undef,
$g->{ref}->full_path($p), 1);
}
my $c = '';
sub minimize_url {
my ($self) = @_;
- return $self->{url} if ($self->{url} eq $self->{repos_root});
+ return $self->url if ($self->url eq $self->{repos_root});
my $url = $self->{repos_root};
my @components = split(m!/!, $self->{svn_path});
my $c = '';
do {
- $url .= "/$c" if length $c;
+ $url = add_path_to_url($url, $c);
eval {
my $ra = (ref $self)->new($url);
my $latest = $ra->get_latest_revnum;
$ra->get_log("", $latest, 0, 1, 0, 1, sub {});
};
} while ($@ && ($c = shift @components));
- $url;
+
+ return canonicalize_url($url);
}
sub can_do_switch {
unless (defined $can_do_switch) {
my $pool = SVN::Pool->new;
my $rep = eval {
- $self->do_switch(1, '', 0, $self->{url},
+ $self->do_switch(1, '', 0, $self->url,
SVN::Delta::Editor->new, $pool);
};
if ($@) {
use strict;
use warnings;
+use SVN::Core;
+
use base qw(Exporter);
-our @EXPORT_OK = qw(fatal can_compress);
+our @EXPORT_OK = qw(
+ fatal
+ can_compress
+ canonicalize_path
+ canonicalize_url
+ join_paths
+ add_path_to_url
+);
=head1 NAME
}
+=head3 canonicalize_path
+
+ my $canoncalized_path = canonicalize_path($path);
+
+Converts $path into a canonical form which is safe to pass to the SVN
+API as a file path.
+
+=cut
+
+# Turn foo/../bar into bar
+sub _collapse_dotdot {
+ my $path = shift;
+
+ 1 while $path =~ s{/[^/]+/+\.\.}{};
+ 1 while $path =~ s{[^/]+/+\.\./}{};
+ 1 while $path =~ s{[^/]+/+\.\.}{};
+
+ return $path;
+}
+
+
+sub canonicalize_path {
+ my $path = shift;
+ my $rv;
+
+ # The 1.7 way to do it
+ if ( defined &SVN::_Core::svn_dirent_canonicalize ) {
+ $path = _collapse_dotdot($path);
+ $rv = SVN::_Core::svn_dirent_canonicalize($path);
+ }
+ # The 1.6 way to do it
+ # This can return undef on subversion-perl-1.4.2-2.el5 (CentOS 5.2)
+ elsif ( defined &SVN::_Core::svn_path_canonicalize ) {
+ $path = _collapse_dotdot($path);
+ $rv = SVN::_Core::svn_path_canonicalize($path);
+ }
+
+ return $rv if defined $rv;
+
+ # No SVN API canonicalization is available, or the SVN API
+ # didn't return a successful result, do it ourselves
+ return _canonicalize_path_ourselves($path);
+}
+
+
+sub _canonicalize_path_ourselves {
+ my ($path) = @_;
+ my $dot_slash_added = 0;
+ if (substr($path, 0, 1) ne "/") {
+ $path = "./" . $path;
+ $dot_slash_added = 1;
+ }
+ $path =~ s#/+#/#g;
+ $path =~ s#/\.(?:/|$)#/#g;
+ $path = _collapse_dotdot($path);
+ $path =~ s#/$##g;
+ $path =~ s#^\./## if $dot_slash_added;
+ $path =~ s#^/##;
+ $path =~ s#^\.$##;
+ return $path;
+}
+
+
+=head3 canonicalize_url
+
+ my $canonicalized_url = canonicalize_url($url);
+
+Converts $url into a canonical form which is safe to pass to the SVN
+API as a URL.
+
+=cut
+
+sub canonicalize_url {
+ my $url = shift;
+
+ # The 1.7 way to do it
+ if ( defined &SVN::_Core::svn_uri_canonicalize ) {
+ return SVN::_Core::svn_uri_canonicalize($url);
+ }
+ # There wasn't a 1.6 way to do it, so we do it ourselves.
+ else {
+ return _canonicalize_url_ourselves($url);
+ }
+}
+
+
+sub _canonicalize_url_path {
+ my ($uri_path) = @_;
+
+ my @parts;
+ foreach my $part (split m{/+}, $uri_path) {
+ $part =~ s/([^~\w.%+-]|%(?![a-fA-F0-9]{2}))/sprintf("%%%02X",ord($1))/eg;
+ push @parts, $part;
+ }
+
+ return join('/', @parts);
+}
+
+sub _canonicalize_url_ourselves {
+ my ($url) = @_;
+ if ($url =~ m#^([^:]+)://([^/]*)(.*)$#) {
+ my ($scheme, $domain, $uri) = ($1, $2, _canonicalize_url_path(canonicalize_path($3)));
+ $url = "$scheme://$domain$uri";
+ }
+ $url;
+}
+
+
+=head3 join_paths
+
+ my $new_path = join_paths(@paths);
+
+Appends @paths together into a single path. Any empty paths are ignored.
+
+=cut
+
+sub join_paths {
+ my @paths = @_;
+
+ @paths = grep { defined $_ && length $_ } @paths;
+
+ return '' unless @paths;
+ return $paths[0] if @paths == 1;
+
+ my $new_path = shift @paths;
+ $new_path =~ s{/+$}{};
+
+ my $last_path = pop @paths;
+ $last_path =~ s{^/+}{};
+
+ for my $path (@paths) {
+ $path =~ s{^/+}{};
+ $path =~ s{/+$}{};
+ $new_path .= "/$path";
+ }
+
+ return $new_path .= "/$last_path";
+}
+
+
+=head3 add_path_to_url
+
+ my $new_url = add_path_to_url($url, $path);
+
+Appends $path onto the $url. If $path is empty, $url is returned unchanged.
+
+=cut
+
+sub add_path_to_url {
+ my($url, $path) = @_;
+
+ return $url if !defined $path or !length $path;
+
+ # Strip trailing and leading slashes so we don't
+ # wind up with http://x.com///path
+ $url =~ s{/+$}{};
+ $path =~ s{^/+}{};
+
+ # If a path has a % in it, URI escape it so it's not
+ # mistaken for a URI escape later.
+ $path =~ s{%}{%25}g;
+
+ return join '/', $url, $path;
+}
+
1;
size_t mmap_size;
struct strbuf previous_name_buf = STRBUF_INIT, *previous_name;
- errno = EBUSY;
if (istate->initialized)
return istate->cache_nr;
- errno = ENOENT;
istate->timestamp.sec = 0;
istate->timestamp.nsec = 0;
fd = open(path, O_RDONLY);
if (fstat(fd, &st))
die_errno("cannot stat the open index");
- errno = EINVAL;
mmap_size = xsize_t(st.st_size);
if (mmap_size < sizeof(struct cache_header) + 20)
die("index file smaller than expected");
mmap = xmmap(NULL, mmap_size, PROT_READ | PROT_WRITE, MAP_PRIVATE, fd, 0);
- close(fd);
if (mmap == MAP_FAILED)
die_errno("unable to map index file");
+ close(fd);
hdr = mmap;
if (verify_hdr(hdr, mmap_size) < 0)
unmap:
munmap(mmap, mmap_size);
- errno = EINVAL;
die("index file corrupt");
}
if (revs->combine_merges)
revs->ignore_merges = 0;
revs->diffopt.abbrev = revs->abbrev;
- if (diff_setup_done(&revs->diffopt) < 0)
- die("diff_setup_done failed");
+ diff_setup_done(&revs->diffopt);
compile_grep_patterns(&revs->grep_filter);
{
if (!diagnose_misspelt_rev)
die("%s: no such path in the working tree.\n"
- "Use '-- <path>...' to specify paths that do not exist locally.",
+ "Use 'git <command> -- <path>...' to specify paths that do not exist locally.",
arg);
/*
* Saying "'(icase)foo' does not exist in the index" when the
/* ... or fall back the most general message. */
die("ambiguous argument '%s': unknown revision or path not in the working tree.\n"
- "Use '--' to separate paths from revisions", arg);
+ "Use '--' to separate paths from revisions, like this:\n"
+ "'git <command> [<revision>...] -- [<file>...]'", arg);
}
if (!check_filename(prefix, arg))
return;
die("ambiguous argument '%s': both revision and filename\n"
- "Use '--' to separate filenames from revisions", arg);
+ "Use '--' to separate paths from revisions, like this:\n"
+ "'git <command> [<revision>...] -- [<file>...]'", arg);
}
/*
DIFF_OPT_SET(&diff_opts, RECURSIVE);
diff_opts.output_format |= DIFF_FORMAT_CALLBACK;
diff_opts.format_callback = submodule_collect_changed_cb;
- if (diff_setup_done(&diff_opts) < 0)
- die("diff_setup_done failed");
+ diff_setup_done(&diff_opts);
diff_tree_sha1(parent->item->object.sha1, commit->object.sha1, "", &diff_opts);
diffcore_std(&diff_opts);
diff_flush(&diff_opts);
--- /dev/null
+#!/usr/bin/env perl
+
+use strict;
+use warnings;
+
+use Test::More 'no_plan';
+
+use Git::SVN::Utils qw(
+ add_path_to_url
+);
+
+# A reference cannot be a hash key, so we use an array.
+my @tests = (
+ ["http://x.com", "bar"] => 'http://x.com/bar',
+ ["http://x.com", ""] => 'http://x.com',
+ ["http://x.com/foo/", undef] => 'http://x.com/foo/',
+ ["http://x.com/foo/", "/bar/baz/"] => 'http://x.com/foo/bar/baz/',
+ ["http://x.com", 'per%cent'] => 'http://x.com/per%25cent',
+);
+
+while(@tests) {
+ my($have, $want) = splice @tests, 0, 2;
+
+ my $args = join ", ", map { qq['$_'] } map { defined($_) ? $_ : 'undef' } @$have;
+ my $name = "add_path_to_url($args) eq $want";
+ is add_path_to_url(@$have), $want, $name;
+}
--- /dev/null
+#!/usr/bin/env perl
+
+# Test our own home rolled URL canonicalizer. Test the private one
+# directly because we can't predict what the SVN API is going to do.
+
+use strict;
+use warnings;
+
+use Test::More 'no_plan';
+
+use Git::SVN::Utils;
+my $canonicalize_url = \&Git::SVN::Utils::_canonicalize_url_ourselves;
+
+my %tests = (
+ "http://x.com" => "http://x.com",
+ "http://x.com/" => "http://x.com",
+ "http://x.com/foo/bar" => "http://x.com/foo/bar",
+ "http://x.com//foo//bar//" => "http://x.com/foo/bar",
+ "http://x.com/ /%/" => "http://x.com/%20%20/%25",
+);
+
+for my $arg (keys %tests) {
+ my $want = $tests{$arg};
+
+ is $canonicalize_url->($arg), $want, "canonicalize_url('$arg') => $want";
+}
--- /dev/null
+#!/usr/bin/env perl
+
+use strict;
+use warnings;
+
+use Test::More 'no_plan';
+
+use Git::SVN::Utils;
+my $collapse_dotdot = \&Git::SVN::Utils::_collapse_dotdot;
+
+my %tests = (
+ "foo/bar/baz" => "foo/bar/baz",
+ ".." => "..",
+ "foo/.." => "",
+ "/foo/bar/../../baz" => "/baz",
+ "deeply/.././deeply/nested" => "./deeply/nested",
+);
+
+for my $arg (keys %tests) {
+ my $want = $tests{$arg};
+
+ is $collapse_dotdot->($arg), $want, "_collapse_dotdot('$arg') => $want";
+}
--- /dev/null
+#!/usr/bin/env perl
+
+use strict;
+use warnings;
+
+use Test::More 'no_plan';
+
+use Git::SVN::Utils qw(
+ join_paths
+);
+
+# A reference cannot be a hash key, so we use an array.
+my @tests = (
+ [] => '',
+ ["/x.com", "bar"] => '/x.com/bar',
+ ["x.com", ""] => 'x.com',
+ ["/x.com/foo/", undef, "bar"] => '/x.com/foo/bar',
+ ["x.com/foo/", "/bar/baz/"] => 'x.com/foo/bar/baz/',
+ ["foo", "bar"] => 'foo/bar',
+ ["/foo/bar", "baz", "/biff"] => '/foo/bar/baz/biff',
+ ["", undef, "."] => '.',
+ [] => '',
+
+);
+
+while(@tests) {
+ my($have, $want) = splice @tests, 0, 2;
+
+ my $args = join ", ", map { qq['$_'] } map { defined($_) ? $_ : 'undef' } @$have;
+ my $name = "join_paths($args) eq '$want'";
+ is join_paths(@$have), $want, $name;
+}
Git was compiled with USE_LIBPCRE=YesPlease. Wrap any tests
that use git-grep --perl-regexp or git-grep -P in these.
+ - CASE_INSENSITIVE_FS
+
+ Test is run on a case insensitive file system.
+
+ - UTF8_NFD_TO_NFC
+
+ Test is run on a filesystem which converts decomposed utf-8 (nfd)
+ to precomposed utf-8 (nfc).
+
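For example, a test that is only meaningful on such a filesystem can
name the prerequisite as the first argument to test_expect_success; a
minimal sketch:

	test_expect_success CASE_INSENSITIVE_FS 'names differing only in case collide' '
		echo one >File &&
		echo two >file &&
		test "$(cat File)" = two
	'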
Tips for Writing Tests
----------------------
else
echo "perf $test_count - $1:"
fi
- for i in $(seq 1 $GIT_PERF_REPEAT_COUNT); do
+ for i in $(test_seq 1 $GIT_PERF_REPEAT_COUNT); do
say >&3 "running: $2"
if test_run_perf_ "$2"
then
'
-test_expect_success 'check whether FS is case-insensitive' '
- mkdir junk &&
- echo good >junk/CamelCase &&
- echo bad >junk/camelcase &&
- if test "$(cat junk/CamelCase)" != good
- then
- test_set_prereq CASE_INSENSITIVE_FS
- fi
-'
-
test_expect_success CASE_INSENSITIVE_FS 'additional case insensitivity tests' '
test_must_fail attr_check a/B/D/g "a/b/d/*" "-c core.ignorecase=0" &&
test_must_fail attr_check A/B/D/NO "a/b/d/*" "-c core.ignorecase=0" &&
auml=$(printf '\303\244')
aumlcdiar=$(printf '\141\314\210')
-case_insensitive=
-unibad=
-no_symlinks=
-test_expect_success 'see what we expect' '
-
- test_case=test_expect_success &&
- test_unicode=test_expect_success &&
- mkdir junk &&
- echo good >junk/CamelCase &&
- echo bad >junk/camelcase &&
- if test "$(cat junk/CamelCase)" != good
- then
- test_case=test_expect_failure &&
- case_insensitive=t
- fi &&
- rm -fr junk &&
- mkdir junk &&
- >junk/"$auml" &&
- case "$(cd junk && echo *)" in
- "$aumlcdiar")
- test_unicode=test_expect_failure &&
- unibad=t
- ;;
- *) ;;
- esac &&
- rm -fr junk &&
- {
- ln -s x y 2> /dev/null &&
- test -h y 2> /dev/null ||
- no_symlinks=1 &&
- rm -f y
- }
-'
-
-test "$case_insensitive" &&
+if test_have_prereq CASE_INSENSITIVE_FS
+then
say "will test on a case insensitive filesystem"
-test "$unibad" &&
+ test_case=test_expect_failure
+else
+ test_case=test_expect_success
+fi
+
+if test_have_prereq UTF8_NFD_TO_NFC
+then
say "will test on a unicode corrupting filesystem"
-test "$no_symlinks" &&
+ test_unicode=test_expect_failure
+else
+ test_unicode=test_expect_success
+fi
+
+test_have_prereq SYMLINKS ||
say "will test on a filesystem lacking symbolic links"
-if test "$case_insensitive"
+if test_have_prereq CASE_INSENSITIVE_FS
then
test_expect_success "detection of case insensitive filesystem during repo init" '
'
fi
-if test "$no_symlinks"
+if test_have_prereq SYMLINKS
then
test_expect_success "detection of filesystem w/o symlink support during repo init" '
- v=$(git config --bool core.symlinks) &&
- test "$v" = false
+ test_must_fail git config --bool core.symlinks ||
+ test "$(git config --bool core.symlinks)" = true
'
else
test_expect_success "detection of filesystem w/o symlink support during repo init" '
- test_must_fail git config --bool core.symlinks ||
- test "$(git config --bool core.symlinks)" = true
+ v=$(git config --bool core.symlinks) &&
+ test "$v" = false
'
fi
. ./test-lib.sh
+if ! test_have_prereq UTF8_NFD_TO_NFC
+then
+ skip_all="filesystem does not corrupt utf-8"
+ test_done
+fi
+
+# create utf-8 variables
Adiarnfc=`printf '\303\204'`
Adiarnfd=`printf 'A\314\210'`
-# check if the feature is compiled in
-mkdir junk &&
->junk/"$Adiarnfc" &&
-case "$(cd junk && echo *)" in
- "$Adiarnfd")
- test_nfd=1
- ;;
- *) ;;
-esac
-rm -rf junk
+Odiarnfc=`printf '\303\226'`
+Odiarnfd=`printf 'O\314\210'`
+AEligatu=`printf '\303\206'`
+Invalidu=`printf '\303\377'`
-if test "$test_nfd"
-then
- # create more utf-8 variables
- Odiarnfc=`printf '\303\226'`
- Odiarnfd=`printf 'O\314\210'`
- AEligatu=`printf '\303\206'`
- Invalidu=`printf '\303\377'`
+#Create a string with 255 bytes (decomposed)
+Alongd=$Adiarnfd$Adiarnfd$Adiarnfd$Adiarnfd$Adiarnfd$Adiarnfd$Adiarnfd #21 Byte
+Alongd=$Alongd$Alongd$Alongd #63 Byte
+Alongd=$Alongd$Alongd$Alongd$Alongd$Adiarnfd #255 Byte
+#Create a string with 254 bytes (precomposed)
+Alongc=$AEligatu$AEligatu$AEligatu$AEligatu$AEligatu #10 Byte
+Alongc=$Alongc$Alongc$Alongc$Alongc$Alongc #50 Byte
+Alongc=$Alongc$Alongc$Alongc$Alongc$Alongc #250 Byte
+Alongc=$Alongc$AEligatu$AEligatu #254 Byte
- #Create a string with 255 bytes (decomposed)
- Alongd=$Adiarnfd$Adiarnfd$Adiarnfd$Adiarnfd$Adiarnfd$Adiarnfd$Adiarnfd #21 Byte
- Alongd=$Alongd$Alongd$Alongd #63 Byte
- Alongd=$Alongd$Alongd$Alongd$Alongd$Adiarnfd #255 Byte
-
- #Create a string with 254 bytes (precomposed)
- Alongc=$AEligatu$AEligatu$AEligatu$AEligatu$AEligatu #10 Byte
- Alongc=$Alongc$Alongc$Alongc$Alongc$Alongc #50 Byte
- Alongc=$Alongc$Alongc$Alongc$Alongc$Alongc #250 Byte
- Alongc=$Alongc$AEligatu$AEligatu #254 Byte
-
- test_expect_success "detect if nfd needed" '
- precomposeunicode=`git config core.precomposeunicode` &&
- test "$precomposeunicode" = false &&
- git config core.precomposeunicode true
- '
- test_expect_success "setup" '
- >x &&
- git add x &&
- git commit -m "1st commit" &&
- git rm x &&
- git commit -m "rm x"
- '
- test_expect_success "setup case mac" '
- git checkout -b mac_os
- '
- # This will test nfd2nfc in readdir()
- test_expect_success "add file Adiarnfc" '
- echo f.Adiarnfc >f.$Adiarnfc &&
- git add f.$Adiarnfc &&
- git commit -m "add f.$Adiarnfc"
- '
- # This will test nfd2nfc in git stage()
- test_expect_success "stage file d.Adiarnfd/f.Adiarnfd" '
- mkdir d.$Adiarnfd &&
- echo d.$Adiarnfd/f.$Adiarnfd >d.$Adiarnfd/f.$Adiarnfd &&
- git stage d.$Adiarnfd/f.$Adiarnfd &&
- git commit -m "add d.$Adiarnfd/f.$Adiarnfd"
- '
- test_expect_success "add link Adiarnfc" '
- ln -s d.$Adiarnfd/f.$Adiarnfd l.$Adiarnfc &&
- git add l.$Adiarnfc &&
- git commit -m "add l.Adiarnfc"
- '
- # This will test git log
- test_expect_success "git log f.Adiar" '
- git log f.$Adiarnfc > f.Adiarnfc.log &&
- git log f.$Adiarnfd > f.Adiarnfd.log &&
- test -s f.Adiarnfc.log &&
- test -s f.Adiarnfd.log &&
- test_cmp f.Adiarnfc.log f.Adiarnfd.log &&
- rm f.Adiarnfc.log f.Adiarnfd.log
- '
- # This will test git ls-files
- test_expect_success "git lsfiles f.Adiar" '
- git ls-files f.$Adiarnfc > f.Adiarnfc.log &&
- git ls-files f.$Adiarnfd > f.Adiarnfd.log &&
- test -s f.Adiarnfc.log &&
- test -s f.Adiarnfd.log &&
- test_cmp f.Adiarnfc.log f.Adiarnfd.log &&
- rm f.Adiarnfc.log f.Adiarnfd.log
- '
- # This will test git mv
- test_expect_success "git mv" '
- git mv f.$Adiarnfd f.$Odiarnfc &&
- git mv d.$Adiarnfd d.$Odiarnfc &&
- git mv l.$Adiarnfd l.$Odiarnfc &&
- git commit -m "mv Adiarnfd Odiarnfc"
- '
- # Files can be checked out as nfc
- # And the link has been corrected from nfd to nfc
- test_expect_success "git checkout nfc" '
- rm f.$Odiarnfc &&
- git checkout f.$Odiarnfc
- '
- # Make it possible to checkout files with their NFD names
- test_expect_success "git checkout file nfd" '
- rm -f f.* &&
- git checkout f.$Odiarnfd
- '
- # Make it possible to checkout links with their NFD names
- test_expect_success "git checkout link nfd" '
- rm l.* &&
- git checkout l.$Odiarnfd
- '
- test_expect_success "setup case mac2" '
- git checkout master &&
- git reset --hard &&
- git checkout -b mac_os_2
- '
- # This will test nfd2nfc in git commit
- test_expect_success "commit file d2.Adiarnfd/f.Adiarnfd" '
- mkdir d2.$Adiarnfd &&
- echo d2.$Adiarnfd/f.$Adiarnfd >d2.$Adiarnfd/f.$Adiarnfd &&
- git add d2.$Adiarnfd/f.$Adiarnfd &&
- git commit -m "add d2.$Adiarnfd/f.$Adiarnfd" -- d2.$Adiarnfd/f.$Adiarnfd
- '
- test_expect_success "setup for long decomposed filename" '
- git checkout master &&
- git reset --hard &&
- git checkout -b mac_os_long_nfd_fn
- '
- test_expect_success "Add long decomposed filename" '
- echo longd >$Alongd &&
- git add * &&
- git commit -m "Long filename"
- '
- test_expect_success "setup for long precomposed filename" '
- git checkout master &&
- git reset --hard &&
- git checkout -b mac_os_long_nfc_fn
- '
- test_expect_success "Add long precomposed filename" '
- echo longc >$Alongc &&
- git add * &&
- git commit -m "Long filename"
- '
- # Test if the global core.precomposeunicode stops autosensing
- # Must be the last test case
- test_expect_success "respect git config --global core.precomposeunicode" '
- git config --global core.precomposeunicode true &&
- rm -rf .git &&
- git init &&
- precomposeunicode=`git config core.precomposeunicode` &&
- test "$precomposeunicode" = "true"
- '
-else
- say "Skipping nfc/nfd tests"
-fi
+test_expect_success "detect if nfd needed" '
+ precomposeunicode=`git config core.precomposeunicode` &&
+ test "$precomposeunicode" = false &&
+ git config core.precomposeunicode true
+'
+test_expect_success "setup" '
+ >x &&
+ git add x &&
+ git commit -m "1st commit" &&
+ git rm x &&
+ git commit -m "rm x"
+'
+test_expect_success "setup case mac" '
+ git checkout -b mac_os
+'
+# This will test nfd2nfc in readdir()
+test_expect_success "add file Adiarnfc" '
+ echo f.Adiarnfc >f.$Adiarnfc &&
+ git add f.$Adiarnfc &&
+ git commit -m "add f.$Adiarnfc"
+'
+# This will test nfd2nfc in git stage()
+test_expect_success "stage file d.Adiarnfd/f.Adiarnfd" '
+ mkdir d.$Adiarnfd &&
+ echo d.$Adiarnfd/f.$Adiarnfd >d.$Adiarnfd/f.$Adiarnfd &&
+ git stage d.$Adiarnfd/f.$Adiarnfd &&
+ git commit -m "add d.$Adiarnfd/f.$Adiarnfd"
+'
+test_expect_success "add link Adiarnfc" '
+ ln -s d.$Adiarnfd/f.$Adiarnfd l.$Adiarnfc &&
+ git add l.$Adiarnfc &&
+ git commit -m "add l.Adiarnfc"
+'
+# This will test git log
+test_expect_success "git log f.Adiar" '
+ git log f.$Adiarnfc > f.Adiarnfc.log &&
+ git log f.$Adiarnfd > f.Adiarnfd.log &&
+ test -s f.Adiarnfc.log &&
+ test -s f.Adiarnfd.log &&
+ test_cmp f.Adiarnfc.log f.Adiarnfd.log &&
+ rm f.Adiarnfc.log f.Adiarnfd.log
+'
+# This will test git ls-files
+test_expect_success "git lsfiles f.Adiar" '
+ git ls-files f.$Adiarnfc > f.Adiarnfc.log &&
+ git ls-files f.$Adiarnfd > f.Adiarnfd.log &&
+ test -s f.Adiarnfc.log &&
+ test -s f.Adiarnfd.log &&
+ test_cmp f.Adiarnfc.log f.Adiarnfd.log &&
+ rm f.Adiarnfc.log f.Adiarnfd.log
+'
+# This will test git mv
+test_expect_success "git mv" '
+ git mv f.$Adiarnfd f.$Odiarnfc &&
+ git mv d.$Adiarnfd d.$Odiarnfc &&
+ git mv l.$Adiarnfd l.$Odiarnfc &&
+ git commit -m "mv Adiarnfd Odiarnfc"
+'
+# Files can be checked out as nfc
+# And the link has been corrected from nfd to nfc
+test_expect_success "git checkout nfc" '
+ rm f.$Odiarnfc &&
+ git checkout f.$Odiarnfc
+'
+# Make it possible to check out files by their NFD names
+test_expect_success "git checkout file nfd" '
+ rm -f f.* &&
+ git checkout f.$Odiarnfd
+'
+# Make it possible to check out links by their NFD names
+test_expect_success "git checkout link nfd" '
+ rm l.* &&
+ git checkout l.$Odiarnfd
+'
+test_expect_success "setup case mac2" '
+ git checkout master &&
+ git reset --hard &&
+ git checkout -b mac_os_2
+'
+# This will test nfd2nfc in git commit
+test_expect_success "commit file d2.Adiarnfd/f.Adiarnfd" '
+ mkdir d2.$Adiarnfd &&
+ echo d2.$Adiarnfd/f.$Adiarnfd >d2.$Adiarnfd/f.$Adiarnfd &&
+ git add d2.$Adiarnfd/f.$Adiarnfd &&
+ git commit -m "add d2.$Adiarnfd/f.$Adiarnfd" -- d2.$Adiarnfd/f.$Adiarnfd
+'
+test_expect_success "setup for long decomposed filename" '
+ git checkout master &&
+ git reset --hard &&
+ git checkout -b mac_os_long_nfd_fn
+'
+test_expect_success "Add long decomposed filename" '
+ echo longd >$Alongd &&
+ git add * &&
+ git commit -m "Long filename"
+'
+test_expect_success "setup for long precomposed filename" '
+ git checkout master &&
+ git reset --hard &&
+ git checkout -b mac_os_long_nfc_fn
+'
+test_expect_success "Add long precomposed filename" '
+ echo longc >$Alongc &&
+ git add * &&
+ git commit -m "Long filename"
+'
+# Test if the global core.precomposeunicode stops autosensing
+# Must be the last test case
+test_expect_success "respect git config --global core.precomposeunicode" '
+ git config --global core.precomposeunicode true &&
+ rm -rf .git &&
+ git init &&
+ precomposeunicode=`git config core.precomposeunicode` &&
+ test "$precomposeunicode" = "true"
+'
test_done
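
These precompose tests all revolve around one behaviour: the filesystem hands names back in decomposed form (NFD), and with core.precomposeunicode enabled git converts what it reads back into precomposed form (NFC). The following is a minimal standalone sketch of that round trip, not part of the patch, assuming an NFD-normalizing filesystem such as HFS+ (the file name is made up; only the byte sequences matter):

	# Illustration only, not part of the patch.
	Adiarnfc=$(printf '\303\204')	# U+00C4, precomposed "A" with diaeresis (NFC)
	Adiarnfd=$(printf 'A\314\210')	# "A" followed by U+0308 combining diaeresis (NFD)

	git init scratch &&
	cd scratch &&
	git config core.precomposeunicode true &&
	>f.$Adiarnfd &&			# the filesystem stores the decomposed spelling
	git add . &&
	git ls-files | od -c		# the index entry should come out precomposed
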
test_expect_success EXPENSIVE 'create 50,000 tags in the repo' '
(
cd "$HTTPD_DOCUMENT_ROOT_PATH/repo.git" &&
- for i in `seq 50000`
+ for i in `test_seq 50000`
do
echo "commit refs/heads/too-many-refs"
echo "mark :$i"
grep "^Subject: =?UTF-8?q?utf8-s=C3=BCbj=C3=ABct?=" msgtxt1
'
+test_expect_success $PREREQ 'utf8 author is correctly passed on' '
+ clean_fake_sendmail &&
+ test_commit weird_author &&
+ test_when_finished "git reset --hard HEAD^" &&
+ git commit --amend --author "Füñný Nâmé <odd_?=mail@example.com>" &&
+ git format-patch --stdout -1 >funny_name.patch &&
+ git send-email --from="Example <nobody@example.com>" \
+ --to=nobody@example.com \
+ --smtp-server="$(pwd)/fake.sendmail" \
+ funny_name.patch &&
+ grep "^From: Füñný Nâmé <odd_?=mail@example.com>" msgtxt1
+'
+
test_expect_success $PREREQ 'detects ambiguous reference/file conflict' '
echo master > master &&
git add master &&
head=`git rev-parse --verify refs/heads/git-svn-HEAD^0`
test_expect_success 'git-svn-HEAD is a real HEAD' "test -n '$head'"
+svnrepo_escaped=`echo $svnrepo | sed 's/ /%20/'`
+
test_expect_success 'initialize old-style (v0) git svn layout' '
mkdir -p "$GIT_DIR"/git-svn/info "$GIT_DIR"/svn/info &&
echo "$svnrepo" > "$GIT_DIR"/git-svn/info/url &&
echo "$svnrepo" > "$GIT_DIR"/svn/info/url &&
git svn migrate &&
- ! test -d "$GIT_DIR"/git svn &&
+ ! test -d "$GIT_DIR"/git-svn &&
git rev-parse --verify refs/${remotes_git_svn}^0 &&
git rev-parse --verify refs/remotes/svn^0 &&
- test "$(git config --get svn-remote.svn.url)" = "$svnrepo" &&
+ test "$(git config --get svn-remote.svn.url)" = "$svnrepo_escaped" &&
test `git config --get svn-remote.svn.fetch` = \
":refs/${remotes_git_svn}"
'
start_httpd
'
+# SVN 1.7 will truncate "not-a%40{0}reflog" to just "not-a".
+# Look at what SVN wound up naming the branch and use that.
+# Be sure to escape the @ if it shows up.
+non_reflog=`svn_cmd ls "$svnrepo/pr ject/branches" | grep not-a | sed 's/\///' | sed 's/@/%40/'`
+
test_expect_success 'test clone with funky branch names' '
git svn clone -s "$svnrepo/pr ject" project &&
(
git rev-parse "refs/remotes/%2Eleading_dot" &&
git rev-parse "refs/remotes/trailing_dot%2E" &&
git rev-parse "refs/remotes/trailing_dotlock%2Elock" &&
- git rev-parse "refs/remotes/not-a%40{0}reflog"
+ git rev-parse "refs/remotes/$non_reflog"
)
'
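
Both hunks above lean on small sed substitutions to URL-escape awkward characters before comparing values: a space becomes %20 in the migrate test, and here svn's directory listing has its trailing slash stripped and any @ rewritten to %40. A quick illustration with made-up inputs, not taken from the tests:

	# Hypothetical inputs, shown only to make the substitutions concrete.
	echo "file:///tmp/svn repo" | sed 's/ /%20/'
	# prints: file:///tmp/svn%20repo
	echo "not-a@{0}reflog/" | sed 's/\///' | sed 's/@/%40/'
	# prints: not-a%40{0}reflog
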
# capital letters by convention).
test_set_prereq () {
- satisfied="$satisfied$1 "
+ satisfied_prereq="$satisfied_prereq$1 "
+}
+satisfied_prereq=" "
+lazily_testable_prereq= lazily_tested_prereq=
+
+# Usage: test_lazy_prereq PREREQ 'script'
+test_lazy_prereq () {
+ lazily_testable_prereq="$lazily_testable_prereq$1 "
+ eval test_prereq_lazily_$1=\$2
+}
+
+test_run_lazy_prereq_ () {
+ script='
+mkdir -p "$TRASH_DIRECTORY/prereq-test-dir" &&
+(
+ cd "$TRASH_DIRECTORY/prereq-test-dir" &&'"$2"'
+)'
+ say >&3 "checking prerequisite: $1"
+ say >&3 "$script"
+ test_eval_ "$script"
+ eval_ret=$?
+ rm -rf "$TRASH_DIRECTORY/prereq-test-dir"
+	if test "$eval_ret" = 0
+	then
+ say >&3 "prerequisite $1 ok"
+ else
+ say >&3 "prerequisite $1 not satisfied"
+ fi
+ return $eval_ret
}
-satisfied=" "
test_have_prereq () {
# prerequisites can be concatenated with ','
for prerequisite
do
+ case " $lazily_tested_prereq " in
+ *" $prerequisite "*)
+ ;;
+ *)
+ case " $lazily_testable_prereq " in
+ *" $prerequisite "*)
+ eval "script=\$test_prereq_lazily_$prerequisite" &&
+ if test_run_lazy_prereq_ "$prerequisite" "$script"
+ then
+ test_set_prereq $prerequisite
+ fi
+ lazily_tested_prereq="$lazily_tested_prereq$prerequisite "
+ esac
+ ;;
+ esac
+
total_prereq=$(($total_prereq + 1))
- case $satisfied in
+ case "$satisfied_prereq" in
*" $prerequisite "*)
ok_prereq=$(($ok_prereq + 1))
;;
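
Putting the pieces above together: test_lazy_prereq registers a script under a prerequisite name, and the first test that asks for that prerequisite runs the script once in a scratch directory, caches the verdict, and later checks reuse it. A short usage sketch follows; the EXECUTABLE_BIT prerequisite and its probe are invented for illustration and are not defined by this patch:

	# Illustration only; this prerequisite does not exist in the test suite.
	test_lazy_prereq EXECUTABLE_BIT '
		# runs once, in a throw-away directory under $TRASH_DIRECTORY
		>probe &&
		chmod +x probe &&
		test -x probe
	'

	test_expect_success EXECUTABLE_BIT 'filesystem keeps the executable bit' '
		>file &&
		chmod +x file &&
		test -x file
	'
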
$GIT_TEST_CMP "$@"
}
+# Print a sequence of numbers or letters in increasing order. This is
+# similar to GNU seq(1), but the latter might not be available
+# everywhere (and does not do letters). It may be used like:
+#
+#	for i in `test_seq 100`
+#	do
+#		for j in `test_seq 10 20`
+#		do
+#			for k in `test_seq a z`
+#			do
+#				echo $i-$j-$k
+#			done
+#		done
+#	done
+
+test_seq () {
+ case $# in
+ 1) set 1 "$@" ;;
+ 2) ;;
+ *) error "bug in the test script: not 1 or 2 parameters to test_seq" ;;
+ esac
+ "$PERL_PATH" -le 'print for $ARGV[0]..$ARGV[1]' -- "$@"
+}
+
# This function can be used to schedule some commands to be run
# unconditionally at the end of the test to restore sanity:
#
fi
}
-# test whether the filesystem supports symbolic links
-ln -s x y 2>/dev/null && test -h y 2>/dev/null && test_set_prereq SYMLINKS
-rm -f y
+test_lazy_prereq SYMLINKS '
+ # test whether the filesystem supports symbolic links
+ ln -s x y && test -h y
+'
+
+test_lazy_prereq CASE_INSENSITIVE_FS '
+ echo good >CamelCase &&
+ echo bad >camelcase &&
+ test "$(cat CamelCase)" != good
+'
+
+test_lazy_prereq UTF8_NFD_TO_NFC '
+ # check whether FS converts nfd unicode to nfc
+ auml=$(printf "\303\244")
+ aumlcdiar=$(printf "\141\314\210")
+ >"$auml" &&
+ case "$(echo *)" in
+ "$aumlcdiar")
+ true ;;
+ *)
+ false ;;
+ esac
+'
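
The two printf strings in this prerequisite spell the same character in its two Unicode normalization forms; dumping the bytes makes the difference the check relies on visible (illustration only, not part of the patch):

	# Compare the raw bytes of the two spellings of a-umlaut.
	printf '\303\244' | od -A n -t x1	# c3 a4    (U+00E4, precomposed, NFC)
	printf '\141\314\210' | od -A n -t x1	# 61 cc 88 ("a" + U+0308, decomposed, NFD)
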
# When the tests are run as root, permission tests will report that
# things are writable when they shouldn't be.
diff_opts.rename_score = opt->rename_score;
paths[0] = NULL;
diff_tree_setup_paths(paths, &diff_opts);
- if (diff_setup_done(&diff_opts) < 0)
- die("unable to set up diff options to follow renames");
+ diff_setup_done(&diff_opts);
diff_tree(t1, t2, base, &diff_opts);
diffcore_std(&diff_opts);
diff_tree_release_paths(&diff_opts);