/git-bundle
/git-cat-file
/git-check-attr
+/git-check-ignore
/git-check-ref-format
/git-checkout
/git-checkout-index
branch is set to integrate with that remote branch. There is a user
preference configuration variable "push.default" to change this.
+"git push $there tag v1.2.3" used to allow replacing a tag v1.2.3
+that already exists in the repository $there, if the rewritten tag
+you are pushing points at a commit that is a descendant of a commit
+that the old tag v1.2.3 points at. This was found to be error prone
+and starting with this release, any attempt to update an existing
+ref under refs/tags/ hierarchy will fail, without "--force".
+
Updates since v1.8.1
--------------------
* Scripts can ask Git that wildcard patterns in pathspecs they give do
not have any significance, i.e. take them as literal strings.
- * The pathspec code learned to grok "foo/**/bar" as a pattern that
- matches "bar" in 0-or-more levels of subdirectory in "foo".
+ * The patterns in .gitignore and .gitattributes files can have "**/"
+   as a pattern that matches zero or more levels of subdirectories.
+   E.g. "foo/**/bar" matches "bar" in "foo" itself or in a
+   subdirectory of "foo".
* "git blame" (and "git diff") learned the "--no-follow" option.
+ * "git check-ignore" command to help debugging .gitignore files has
+ been added.
+
* "git cherry-pick" can be used to replay a root commit to an unborn
branch.
* "git push" now requires "-f" to update a tag, even if it is a
fast-forward, as tags are meant to be fixed points.
+ * "git push" will stop without doing anything if the new "pre-push"
+ hook exists and exits with a failure.
+
* When "git rebase" fails to generate patches to be applied (e.g. due
to oom), it failed to detect the failure and instead behaved as if
there were nothing to do. A workaround to use a temporary file has
from a conflicted state, that we may have missed.
* The implementation of "imap-send" has been updated to reuse xml
- quoting code from http-push codepath.
+ quoting code from http-push codepath, and lost a lot of unused
+ code.
* There is a simple-minded checker for the test scripts in t/
directory to catch most common mistakes (it is not enabled by
default).
+ * You can build with USE_WILDMATCH=YesPlease to use a replacement
+   implementation of the pattern matching logic used for pathname-like
+   things, e.g. refnames and paths in the repository. This new
+   implementation is not expected to change the existing behaviour of
+   Git in this release, except for "git for-each-ref" where you can now
+   say "refs/**/master" and match both refs/heads/master and
+   refs/remotes/origin/master. We plan to use this new implementation
+   in wider places (e.g. "git ls-files '**/Makefile'" may find Makefile
+   at the top level, and "git log '**/t*.sh'" may find commits that
+   touch a shell script whose name begins with "t" at any level) in
+   future versions of Git, but we are not there yet. By building with
+   USE_WILDMATCH, using the resulting Git daily and reporting when you
+   find breakages, you can help us get closer to that goal.
+
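+For example, such a build can be made with:
+
+    $ make USE_WILDMATCH=YesPlease
+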
Also contains minor documentation updates and code clean-ups.
--- /dev/null
+git-check-ignore(1)
+===================
+
+NAME
+----
+git-check-ignore - Debug gitignore / exclude files
+
+
+SYNOPSIS
+--------
+[verse]
+'git check-ignore' [options] pathname...
+'git check-ignore' [options] --stdin < <list-of-paths>
+
+DESCRIPTION
+-----------
+
+For each pathname given via the command-line or from a file via
+`--stdin`, show the pattern from .gitignore (or other input files to
+the exclude mechanism) that decides if the pathname is excluded or
+included. Later patterns within a file take precedence over earlier
+ones.
+
+OPTIONS
+-------
+-q, --quiet::
+ Don't output anything, just set exit status. This is only
+ valid with a single pathname.
+
+-v, --verbose::
+ Also output details about the matching pattern (if any)
+ for each given pathname.
+
+--stdin::
+ Read file names from stdin instead of from the command-line.
+
+-z::
+ The output format is modified to be machine-parseable (see
+ below). If `--stdin` is also given, input paths are separated
+ with a NUL character instead of a linefeed character.
+
+OUTPUT
+------
+
+By default, any of the given pathnames which match an ignore pattern
+will be output, one per line. If no pattern matches a given path,
+nothing will be output for that path; this means that path will not be
+ignored.
+
+If `--verbose` is specified, the output is a series of lines of the form:
+
+<source> <COLON> <linenum> <COLON> <pattern> <HT> <pathname>
+
+<pathname> is the path of a file being queried, <pattern> is the
+matching pattern, <source> is the pattern's source file, and <linenum>
+is the line number of the pattern within that source. If the pattern
+contained a `!` prefix or `/` suffix, it will be preserved in the
+output. <source> will be an absolute path when referring to the file
+configured by `core.excludesfile`, or relative to the repository root
+when referring to `.git/info/exclude` or a per-directory exclude file.
+
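+For example, assuming a top-level .gitignore whose second line is the
+pattern "*.o" (a purely illustrative setup), the default and verbose
+forms of output would look like this:
+
+    $ git check-ignore foo.o
+    foo.o
+    $ git check-ignore -v foo.o
+    .gitignore:2:*.o	foo.o
+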
+If `-z` is specified, the pathnames in the output are delimited by the
+null character; if `--verbose` is also specified then null characters
+are also used instead of colons and hard tabs:
+
+<source> <NULL> <linenum> <NULL> <pattern> <NULL> <pathname> <NULL>
+
+
+EXIT STATUS
+-----------
+
+0::
+ One or more of the provided paths is ignored.
+
+1::
+ None of the provided paths are ignored.
+
+128::
+ A fatal error was encountered.
+
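+These exit codes make it easy to use the command as a predicate in a
+script; a small sketch ("process" is a placeholder for whatever should
+be done to paths that are not ignored):
+
+    # run only on paths that are not ignored; note that a fatal
+    # error (exit code 128) also takes this branch
+    if ! git check-ignore -q "$file"
+    then
+        process "$file"
+    fi
+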
+SEE ALSO
+--------
+linkgit:gitignore[5]
+linkgit:git-config[1]
+linkgit:git-ls-files[1]
+
+GIT
+---
+Part of the linkgit:git[1] suite
All the operations required for normal use are supported, including
checkout, diff, status, update, log, add, remove, commit.
+
+Most CVS command arguments that read CVS tags or revision numbers
+(typically -r) work, and also support any git refspec
+(tag, branch, commit ID, etc).
+However, CVS revision numbers for non-default branches are not well
+emulated, and cvs log does not show tags or branches at
+all. (The revision numbers reported for non-main branches superficially
+resemble CVS revision numbers, but they actually encode a git commit ID
+directly, rather than representing the number of revisions since the
+branch point.)
+
+Note that there are two ways to check out a particular branch.
+As described elsewhere on this page, the "module" parameter
+of cvs checkout is interpreted as a branch name, and it becomes
+the main branch. It remains the main branch for a given sandbox
+even if you temporarily make another branch sticky with
+cvs update -r. Alternatively, the -r argument can indicate
+some other branch to actually checkout, even though the module
+is still the "main" branch. Tradeoffs (as currently
+implemented): Each new "module" creates a new database on disk with
+a history for the given module, and after the database is created,
+operations against that main branch are fast. Or alternatively,
+-r doesn't take any extra disk space, but may be significantly slower for
+many operations, like cvs update.
+
+If you want to refer to a git refspec that has characters that are
+not allowed by CVS, you have two options. First, it may just work
+to supply the git refspec directly to the appropriate CVS -r argument;
+some CVS clients don't seem to do much sanity checking of the argument.
+Second, if that fails, you can use a special character escape mechanism
+that only uses characters that are valid in CVS tags. A sequence
+of 4 or 5 characters of the form (underscore (`"_"`), dash (`"-"`),
+one or two characters, and dash (`"-"`)) can encode various characters based
+on the one or two letters: `"s"` for slash (`"/"`), `"p"` for
+period (`"."`), `"u"` for underscore (`"_"`), or two hexadecimal digits
+for any byte value at all (typically an ASCII number, or perhaps a part
+of a UTF-8 encoded character).
+
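+For illustration only: with this encoding, a git branch named
+"my/topic" could be requested from a CVS client with something like:
+
+    cvs update -r my_-s-topic
+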
Legacy monitoring operations are not supported (edit, watch and related).
Exports and tagging (tags and branches) are not supported at this stage.
(eg: permissions/ownership, ACLS, etc). See contrib/hooks/setgitperms.perl
for an example of how to do this.
+pre-push
+~~~~~~~~
+
+This hook is called by 'git push' and can be used to prevent a push from taking
+place. The hook is called with two parameters which provide the name and
+location of the destination remote; if a named remote is not being used,
+both values will be the same.
+
+Information about what is to be pushed is provided on the hook's standard
+input with lines of the form:
+
+ <local ref> SP <local sha1> SP <remote ref> SP <remote sha1> LF
+
+For instance, if the command +git push origin master:foreign+ were run the
+hook would receive a line like the following:
+
+ refs/heads/master 67890 refs/heads/foreign 12345
+
+although the full, 40-character SHA1s would be supplied. If the foreign ref
+does not yet exist, the `<remote SHA1>` will be 40 `0`. If a ref is to be
+deleted, the `<local ref>` will be supplied as `(delete)` and the `<local
+SHA1>` will be 40 `0`. If the local commit was specified by something other
+than a name which could be expanded (such as `HEAD~` or a SHA1), it will be
+supplied as it was originally given.
+
+If this hook exits with a non-zero status, 'git push' will abort without
+pushing anything. Information about why the push is rejected may be sent
+to the user by writing to standard error.
+
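+As an illustration only (a sketch, not a canonical example), a hook
+that refuses to delete any remote ref could read the input described
+above like this:
+
+    #!/bin/sh
+    # $1 = name of the destination remote, $2 = its location (URL)
+    while read local_ref local_sha remote_ref remote_sha
+    do
+        # a deletion is signalled by "(delete)" as the local ref
+        if test "$local_ref" = "(delete)"
+        then
+            echo >&2 "pre-push: refusing to delete $remote_ref on $1"
+            exit 1
+        fi
+    done
+    exit 0
+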
[[pre-receive]]
pre-receive
~~~~~~~~~~~
SEE ALSO
--------
-linkgit:git-rm[1], linkgit:git-update-index[1],
-linkgit:gitrepository-layout[5]
+linkgit:git-rm[1],
+linkgit:git-update-index[1],
+linkgit:gitrepository-layout[5],
+linkgit:git-check-ignore[1]
GIT
---
How to maintain Git
===================
+Activities
+----------
+
The maintainer's git time is spent on three activities.
- - Communication (60%)
+ - Communication (45%)
Mailing list discussions on general design, fielding user
questions, diagnosing bug reports; reviewing, commenting on,
suggesting alternatives to, and rejecting patches.
- - Integration (30%)
+ - Integration (50%)
Applying new patches from the contributors while spotting and
correcting minor mistakes, shuffling the integration and
testing branches, pushing the results out, cutting the
releases, and making announcements.
- - Own development (10%)
+ - Own development (5%)
Scratching my own itch and sending proposed patch series out.
+The Policy
+----------
+
The policy on Integration is informally mentioned in "A Note
from the maintainer" message, which is periodically posted to
this mailing list after each feature release is made.
-The policy.
-
- Feature releases are numbered as vX.Y.Z and are meant to
contain bugfixes and enhancements in any area, including
functionality, performance and usability, without regression.
+ - One release cycle for a feature release is expected to last for
+ eight to ten weeks.
+
- Maintenance releases are numbered as vX.Y.Z.W and are meant
to contain only bugfixes for the corresponding vX.Y.Z feature
release and earlier maintenance releases vX.Y.Z.V (V < W).
- 'pu' branch is used to publish other proposed changes that do
not yet pass the criteria set for 'next'.
- - The tips of 'master', 'maint' and 'next' branches will always
- fast-forward, to allow people to build their own
- customization on top of them.
+ - The tips of 'master' and 'maint' branches will not be rewound to
+ allow people to build their own customization on top of them.
+ Early in a new development cycle, 'next' is rewound to the tip of
+ 'master' once, but otherwise it will not be rewound until the end
+ of the cycle.
- - Usually 'master' contains all of 'maint', 'next' contains all
- of 'master' and 'pu' contains all of 'next'.
+ - Usually 'master' contains all of 'maint' and 'next' contains all
+ of 'master'. 'pu' contains all the topics merged to 'next', but
+ is rebuilt directly on 'master'.
- The tip of 'master' is meant to be more stable than any
tagged releases, and the users are encouraged to follow it.
are found before new topics are merged to 'master'.
+A Typical Git Day
+-----------------
+
A typical git day for the maintainer implements the above policy
by doing the following:
- - Scan mailing list and #git channel log. Respond with review
- comments, suggestions etc. Kibitz. Collect potentially
- usable patches from the mailing list. Patches about a single
- topic go to one mailbox (I read my mail in Gnus, and type
- \C-o to save/append messages in files in mbox format).
+ - Scan mailing list. Respond with review comments, suggestions
+ etc. Kibitz. Collect potentially usable patches from the
+ mailing list. Patches about a single topic go to one mailbox (I
+ read my mail in Gnus, and type \C-o to save/append messages in
+ files in mbox format).
+
+ - Write his own patches to address issues raised on the list that
+   nobody has stepped up to solve. Send them out just like other
+   contributors do, and pick them up just like patches from other
+   contributors (see above).
- Review the patches in the saved mailboxes. Edit proposed log
message for typofixes and clarifications, and add Acks
- Obviously correct fixes that pertain to the tip of 'master'
are directly applied to 'master'.
+ - Other topics are not handled in this step.
+
This step is done with "git am".
$ git checkout master ;# or "git checkout maint"
- $ git am -3 -s mailbox
+ $ git am -sc3 mailbox
$ make test
- - Merge downwards (maint->master):
-
- $ git checkout master
- $ git merge maint
- $ make test
+ In practice, almost no patch directly goes to 'master' or
+ 'maint'.
- Review the last issue of "What's cooking" message, review the
- topics scheduled for merging upwards (topic->master and
- topic->maint), and merge.
+ topics ready for merging (topic->master and topic->maint). Use
+ "Meta/cook -w" script (where Meta/ contains a checkout of the
+ 'todo' branch) to aid this step.
+
+ And perform the merge. Use "Meta/Reintegrate -e" script (see
+ later) to aid this step.
+
+ $ Meta/cook -w last-issue-of-whats-cooking.mbox
$ git checkout master ;# or "git checkout maint"
- $ git merge ai/topic ;# or "git merge ai/maint-topic"
+ $ echo ai/topic | Meta/Reintegrate -e ;# "git merge ai/topic"
$ git log -p ORIG_HEAD.. ;# final review
$ git diff ORIG_HEAD.. ;# final review
$ make test ;# final review
- $ git branch -d ai/topic ;# or "git branch -d ai/maint-topic"
-
- - Merge downwards (maint->master) if needed:
-
- $ git checkout master
- $ git merge maint
- $ make test
-
- - Merge downwards (master->next) if needed:
-
- $ git checkout next
- $ git merge master
- $ make test
- Handle the remaining patches:
and not in 'master') is applied to a new topic branch that
is forked from the tip of 'master'. This includes both
enhancements and unobvious fixes to 'master'. A topic
- branch is named as ai/topic where "ai" is typically
- author's initial and "topic" is a descriptive name of the
- topic (in other words, "what's the series is about").
+     branch is named as ai/topic where "ai" is a two-letter string
+     named after the author's initials and "topic" is a descriptive
+     name of the topic (in other words, "what the series is about").
- An unobvious fix meant for 'maint' is applied to a new
topic branch that is forked from the tip of 'maint'. The
The above except the "replacement" are all done with:
- $ git am -3 -s mailbox
+ $ git checkout ai/topic ;# or "git checkout -b ai/topic master"
+ $ git am -sc3 mailbox
while patch replacement is often done by:
then replace some parts with the new patch, and reapplying:
+ $ git checkout ai/topic
$ git reset --hard ai/topic~$n
- $ git am -3 -s 000*.txt
+     $ git am -sc3 000*.txt
The full test suite is always run for 'maint' and 'master'
after patch application; for topic branches the tests are run
as time permits.
- - Update "What's cooking" message to review the updates to
- existing topics, newly added topics and graduated topics.
+ - Merge maint to master as needed:
- This step is helped with Meta/cook script (where Meta/ contains
- a checkout of the 'todo' branch).
-
- - Merge topics to 'next'. For each branch whose tip is not
- merged to 'next', one of three things can happen:
+ $ git checkout master
+ $ git merge maint
+ $ make test
- - The commits are all next-worthy; merge the topic to next:
+ - Merge master to next as needed:
$ git checkout next
- $ git merge ai/topic ;# or "git merge ai/maint-topic"
+ $ git merge master
$ make test
+ - Review the last issue of "What's cooking" again and see if topics
+ that are ready to be merged to 'next' are still in good shape
+   (e.g. has any new issue been identified on the list with the
+   series?)
+
+ - Prepare 'jch' branch, which is used to represent somewhere
+ between 'master' and 'pu' and often is slightly ahead of 'next'.
+
+ $ Meta/Reintegrate master..pu >Meta/redo-jch.sh
+
+ The result is a script that lists topics to be merged in order to
+ rebuild 'pu' as the input to Meta/Reintegrate script. Remove
+ later topics that should not be in 'jch' yet. Add a line that
+ consists of '### match next' before the name of the first topic
+ in the output that should be in 'jch' but not in 'next' yet.
+
+ - Now we are ready to start merging topics to 'next'. For each
+ branch whose tip is not merged to 'next', one of three things can
+ happen:
+
+ - The commits are all next-worthy; merge the topic to next;
- The new parts are of mixed quality, but earlier ones are
- next-worthy; merge the early parts to next:
+ next-worthy; merge the early parts to next;
+ - Nothing is next-worthy; do not do anything.
+
+ This step is aided with Meta/redo-jch.sh script created earlier.
+ If a topic that was already in 'next' gained a patch, the script
+   would list it as "ai/topic~1". To include the new patch in the
+ updated 'next', drop the "~1" part; to keep it excluded, do not
+ touch the line. If a topic that was not in 'next' should be
+ merged to 'next', add it at the end of the list. Then:
+
+ $ git checkout -B jch master
+ $ Meta/redo-jch.sh -c1
+
+ to rebuild the 'jch' branch from scratch. "-c1" tells the script
+ to stop merging at the first line that begins with '###'
+ (i.e. the "### match next" line you added earlier).
+
+ At this point, build-test the result. It may reveal semantic
+ conflicts (e.g. a topic renamed a variable, another added a new
+ reference to the variable under its old name), in which case
+ prepare an appropriate merge-fix first (see appendix), and
+ rebuild the 'jch' branch from scratch, starting at the tip of
+ 'master'.
+
+ Then do the same to 'next'
$ git checkout next
- $ git merge ai/topic~2 ;# the tip two are dubious
- $ make test
+ $ sh Meta/redo-jch.sh -c1 -e
- - Nothing is next-worthy; do not do anything.
+ The "-e" option allows the merge message that comes from the
+ history of the topic and the comments in the "What's cooking" to
+ be edited. The resulting tree should match 'jch' as the same set
+ of topics are merged on 'master'; otherwise there is a mismerge.
+ Investigate why and do not proceed until the mismerge is found
+ and rectified.
- - [** OBSOLETE **] Optionally rebase topics that do not have any commit
- in next yet, when they can take advantage of low-level framework
- change that is merged to 'master' already.
+ $ git diff jch next
- $ git rebase master ai/topic
+ When all is well, clean up the redo-jch.sh script with
- This step is helped with Meta/git-topic.perl script to
- identify which topic is rebaseable. There also is a
- pre-rebase hook to make sure that topics that are already in
- 'next' are not rebased beyond the merged commit.
+ $ sh Meta/redo-jch.sh -u
- - [** OBSOLETE **] Rebuild "pu" to merge the tips of topics not in 'next'.
+ This removes topics listed in the script that have already been
+   merged to 'master'. This may lose the '### match next' marker;
+   add it again to the appropriate place when it happens.
- $ git checkout pu
- $ git reset --hard next
- $ git merge ai/topic ;# repeat for all remaining topics
- $ make test
+ - Rebuild 'pu'.
- This step is helped with Meta/PU script
+ $ Meta/Reintegrate master..pu >Meta/redo-pu.sh
- - Push four integration branches to a private repository at
- k.org and run "make test" on all of them.
+   Edit the result by adding new topics that are not yet in 'pu'
+   to the script. Then
- - Push four integration branches to /pub/scm/git/git.git at
- k.org. This triggers its post-update hook which:
+ $ git checkout -B pu jch
+ $ sh Meta/redo-pu.sh
- (1) runs "git pull" in $HOME/git-doc/ repository to pull
- 'master' just pushed out;
+ When all is well, clean up the redo-pu.sh script with
- (2) runs "make doc" in $HOME/git-doc/, install the generated
- documentation in staging areas, which are separate
- repositories that have html and man branches checked
- out.
+ $ sh Meta/redo-pu.sh -u
- (3) runs "git commit" in the staging areas, and run "git
- push" back to /pub/scm/git/git.git/ to update the html
- and man branches.
+ Double check by running
- (4) installs generated documentation to /pub/software/scm/git/docs/
- to be viewed from http://www.kernel.org/
+ $ git branch --no-merged pu
- - Fetch html and man branches back from k.org, and push four
- integration branches and the two documentation branches to
- repo.or.cz and other mirrors.
+   to see that there are no unexpected leftover topics.
+ At this point, build-test the result for semantic conflicts, and
+ if there are, prepare an appropriate merge-fix first (see
+ appendix), and rebuild the 'pu' branch from scratch, starting at
+ the tip of 'jch'.
+
+ - Update "What's cooking" message to review the updates to
+ existing topics, newly added topics and graduated topics.
+
+ This step is helped with Meta/cook script.
+
+ $ Meta/cook
+
+ This script inspects the history between master..pu, finds tips
+ of topic branches, compares what it found with the current
+ contents in Meta/whats-cooking.txt, and updates that file.
+   Topics not listed in the file but found in master..pu are
+   added to the "New topics" section, topics listed in the file that
+ are no longer found in master..pu are moved to the "Graduated to
+ master" section, and topics whose commits changed their states
+ (e.g. used to be only in 'pu', now merged to 'next') are updated
+ with change markers "<<" and ">>".
+
+   Look for lines enclosed in "<<" and ">>"; they hold contents from
+   the old file that are replaced by this integration round. After
+ verifying them, remove the old part. Review the description for
+ each topic and update its doneness and plan as needed. To review
+ the updated plan, run
+
+ $ Meta/cook -w
+
+ which will pick up comments given to the topics, such as "Will
+ merge to 'next'", etc. (see Meta/cook script to learn what kind
+ of phrases are supported).
+
+ - Compile, test and install all four (five) integration branches;
+ Meta/Dothem script may aid this step.
+
+ - Format documentation if the 'master' branch was updated;
+ Meta/dodoc.sh script may aid this step.
+
+ - Push the integration branches out to public places; Meta/pushall
+ script may aid this step.
+
+Observations
+------------
Some observations to be made.
- * Each topic is tested individually, and also together with
- other topics cooking in 'next'. Until it matures, none part
- of it is merged to 'master'.
+ * Each topic is tested individually, and also together with other
+ topics cooking first in 'pu', then in 'jch' and then in 'next'.
+ Until it matures, no part of it is merged to 'master'.
* A topic already in 'next' can get fixes while still in
'next'. Such a topic will have many merges to 'next' (in
other words, "git log --first-parent next" will show many
- "Merge ai/topic to next" for the same topic.
+   "Merge branch 'ai/topic' to next" for the same topic).
* An unobvious fix for 'maint' is cooked in 'next' and then
merged to 'master' to make extra sure it is Ok and then
* Being in the 'next' branch is not a guarantee for a topic to
be included in the next feature release. Being in the
'master' branch typically is.
+
+
+Appendix
+--------
+
+Preparing a "merge-fix"
+~~~~~~~~~~~~~~~~~~~~~~~
+
+A merge of two topics may not textually conflict but still have a
+conflict at the semantic level. A classic example is for one topic
+to rename a variable and all its uses, while another topic adds a
+new use of the variable under its old name. When these two topics
+are merged together, the reference to the variable newly added by
+the latter topic will still use the old name in the result.
+
+The Meta/Reintegrate script that is used by redo-jch and redo-pu
+scripts implements a crude but usable way to work around this issue.
+When the script merges branch $X, it checks if "refs/merge-fix/$X"
+exists, and if so, the effect of it is squashed into the result of
+the mechanical merge. In other words,
+
+ $ echo $X | Meta/Reintegrate
+
+is roughly equivalent to this sequence:
+
+ $ git merge --rerere-autoupdate $X
+ $ git commit
+ $ git cherry-pick -n refs/merge-fix/$X
+ $ git commit --amend
+
+The goal of this "prepare a merge-fix" step is to come up with a
+commit that can be squashed into the result of a mechanical merge to
+correct semantic conflicts.
+
+After finding that the result of merging branch "ai/topic" to an
+integration branch had such a semantic conflict, say pu~4, check the
+problematic merge out on a detached HEAD, edit the working tree to
+fix the semantic conflict, and make a separate commit to record the
+fix-up:
+
+ $ git checkout pu~4
+ $ git show -s --pretty=%s ;# double check
+ Merge branch 'ai/topic' to pu
+ $ edit
+ $ git commit -m 'merge-fix/ai/topic' -a
+
+Then make a reference "refs/merge-fix/ai/topic" to point at this
+result:
+
+ $ git update-ref refs/merge-fix/ai/topic HEAD
+
+Then double check the result by asking Meta/Reintegrate to redo the
+merge:
+
+ $ git checkout pu~5 ;# the parent of the problem merge
+ $ echo ai/topic | Meta/Reintegrate
+ $ git diff pu~4
+
+This time, because you prepared refs/merge-fix/ai/topic, the
+resulting merge should have been tweaked to include the fix for the
+semantic conflict.
+
+Note that this assumes that the order in which conflicting branches
+are merged does not change. If the reason why merging ai/topic
+branch needs this merge-fix is because another branch merged earlier
+to the integration branch changed the underlying assumption ai/topic
+branch made (e.g. ai/topic branch added a site to refer to a
+variable, while the other branch renamed that variable and adjusted
+existing use sites), and if you changed redo-jch (or redo-pu) script
+to merge ai/topic branch before the other branch, then the above
+merge-fix should not be applied while merging ai/topic, but should
+instead be applied while merging the other branch. You would need
+to move the fix to apply to the other branch, perhaps like this:
+
+ $ mf=refs/merge-fix
+ $ git update-ref $mf/$the_other_branch $mf/ai/topic
+ $ git update-ref -d $mf/ai/topic
* Prepare `struct dir_struct dir` and clear it with `memset(&dir, 0,
sizeof(dir))`.
-* Call `add_exclude()` to add single exclude pattern,
- `add_excludes_from_file()` to add patterns from a file
- (e.g. `.git/info/exclude`), and/or set `dir.exclude_per_dir`. A
- short-hand function `setup_standard_excludes()` can be used to set up
- the standard set of exclude settings.
+* To add a single exclude pattern, call `add_exclude_list()` and then
+  `add_exclude()`.
+
+* To add patterns from a file (e.g. `.git/info/exclude`), call
+  `add_excludes_from_file()`, and/or set `dir.exclude_per_dir`. A
+ short-hand function `setup_standard_excludes()` can be used to set
+ up the standard set of exclude settings.
* Set options described in the Data Structure section above.
* Use `dir.entries[]`.
+* Call `clear_directory()` when the contained elements are no longer in use.
+
(JC)
# Define NO_FNMATCH_CASEFOLD if your fnmatch function doesn't have the
# FNM_CASEFOLD GNU extension.
#
+# Define USE_WILDMATCH if you want to use Git's wildmatch
+# implementation as fnmatch.
+#
# Define NO_GECOS_IN_PWENT if you don't have pw_gecos in struct passwd
# in the C library.
#
# apostrophes to be ASCII so that cut&pasting examples to the shell
# will work.
#
+# Define PERL_PATH to the path of your Perl binary (usually /usr/bin/perl).
+#
# Define NO_PERL_MAKEMAKER if you cannot use Makefiles generated by perl's
# MakeMaker (e.g. using ActiveState under Cygwin).
#
# Define NO_PERL if you do not want Perl scripts or libraries at all.
#
+# Define PYTHON_PATH to the path of your Python binary (often /usr/bin/python
+# but /usr/bin/python2.7 on some platforms).
+#
# Define NO_PYTHON if you do not want Python scripts or libraries at all.
#
# Define NO_TCLTK if you do not want Tcl/Tk GUI.
LIB_H += pack.h
LIB_H += parse-options.h
LIB_H += patch-ids.h
+LIB_H += pathspec.h
LIB_H += pkt-line.h
LIB_H += progress.h
LIB_H += prompt.h
LIB_OBJS += patch-delta.o
LIB_OBJS += patch-ids.o
LIB_OBJS += path.o
+LIB_OBJS += pathspec.o
LIB_OBJS += pkt-line.o
LIB_OBJS += preload-index.o
LIB_OBJS += pretty.o
BUILTIN_OBJS += builtin/bundle.o
BUILTIN_OBJS += builtin/cat-file.o
BUILTIN_OBJS += builtin/check-attr.o
+BUILTIN_OBJS += builtin/check-ignore.o
BUILTIN_OBJS += builtin/check-ref-format.o
BUILTIN_OBJS += builtin/checkout-index.o
BUILTIN_OBJS += builtin/checkout.o
COMPAT_OBJS += compat/fnmatch/fnmatch.o
endif
endif
+ifdef USE_WILDMATCH
+ COMPAT_CFLAGS += -DUSE_WILDMATCH
+endif
ifdef NO_SETENV
COMPAT_CFLAGS += -DNO_SETENV
COMPAT_OBJS += compat/setenv.o
extern int cmd_checkout(int argc, const char **argv, const char *prefix);
extern int cmd_checkout_index(int argc, const char **argv, const char *prefix);
extern int cmd_check_attr(int argc, const char **argv, const char *prefix);
+extern int cmd_check_ignore(int argc, const char **argv, const char *prefix);
extern int cmd_check_ref_format(int argc, const char **argv, const char *prefix);
extern int cmd_cherry(int argc, const char **argv, const char *prefix);
extern int cmd_cherry_pick(int argc, const char **argv, const char *prefix);
#include "cache.h"
#include "builtin.h"
#include "dir.h"
+#include "pathspec.h"
#include "exec_cmd.h"
#include "cache-tree.h"
#include "run-command.h"
return !!data.add_errors;
}
-static void fill_pathspec_matches(const char **pathspec, char *seen, int specs)
-{
- int num_unmatched = 0, i;
-
- /*
- * Since we are walking the index as if we were walking the directory,
- * we have to mark the matched pathspec as seen; otherwise we will
- * mistakenly think that the user gave a pathspec that did not match
- * anything.
- */
- for (i = 0; i < specs; i++)
- if (!seen[i])
- num_unmatched++;
- if (!num_unmatched)
- return;
- for (i = 0; i < active_nr; i++) {
- struct cache_entry *ce = active_cache[i];
- match_pathspec(pathspec, ce->name, ce_namelen(ce), 0, seen);
- }
-}
-
-static char *find_used_pathspec(const char **pathspec)
-{
- char *seen;
- int i;
-
- for (i = 0; pathspec[i]; i++)
- ; /* just counting */
- seen = xcalloc(i, 1);
- fill_pathspec_matches(pathspec, seen, i);
- return seen;
-}
-
static char *prune_directory(struct dir_struct *dir, const char **pathspec, int prefix)
{
char *seen;
*dst++ = entry;
}
dir->nr = dst - dir->entries;
- fill_pathspec_matches(pathspec, seen, specs);
+ add_pathspec_matches_against_index(pathspec, seen, specs);
return seen;
}
+/*
+ * Checks the index to see whether any path in pathspec refers to
+ * something inside a submodule. If so, dies with an error message.
+ */
static void treat_gitlinks(const char **pathspec)
{
int i;
if (!pathspec || !*pathspec)
return;
- for (i = 0; i < active_nr; i++) {
- struct cache_entry *ce = active_cache[i];
- if (S_ISGITLINK(ce->ce_mode)) {
- int len = ce_namelen(ce), j;
- for (j = 0; pathspec[j]; j++) {
- int len2 = strlen(pathspec[j]);
- if (len2 <= len || pathspec[j][len] != '/' ||
- memcmp(ce->name, pathspec[j], len))
- continue;
- if (len2 == len + 1)
- /* strip trailing slash */
- pathspec[j] = xstrndup(ce->name, len);
- else
- die (_("Path '%s' is in submodule '%.*s'"),
- pathspec[j], len, ce->name);
- }
- }
- }
+ for (i = 0; pathspec[i]; i++)
+ pathspec[i] = check_path_for_gitlink(pathspec[i]);
}
static void refresh(int verbose, const char **pathspec)
free(seen);
}
-static const char **validate_pathspec(int argc, const char **argv, const char *prefix)
+/*
+ * Normalizes argv relative to prefix, via get_pathspec(), and then
+ * runs die_if_path_beyond_symlink() on each path in the normalized
+ * list.
+ */
+static const char **validate_pathspec(const char **argv, const char *prefix)
{
const char **pathspec = get_pathspec(prefix, argv);
if (pathspec) {
const char **p;
for (p = pathspec; *p; p++) {
- if (has_symlink_leading_path(*p, strlen(*p))) {
- int len = prefix ? strlen(prefix) : 0;
- die(_("'%s' is beyond a symbolic link"), *p + len);
- }
+ die_if_path_beyond_symlink(*p, prefix);
}
}
const char **pathspec = NULL;
if (argc) {
- pathspec = validate_pathspec(argc, argv, prefix);
+ pathspec = validate_pathspec(argv, prefix);
if (!pathspec)
return -1;
}
fprintf(stderr, _("Maybe you wanted to say 'git add .'?\n"));
return 0;
}
- pathspec = validate_pathspec(argc, argv, prefix);
+ pathspec = validate_pathspec(argv, prefix);
if (read_cache() < 0)
die(_("index file corrupt"));
path_exclude_check_init(&check, &dir);
if (!seen)
- seen = find_used_pathspec(pathspec);
+ seen = find_pathspecs_matching_against_index(pathspec);
for (i = 0; pathspec[i]; i++) {
if (!seen[i] && pathspec[i][0]
&& !file_exists(pathspec[i])) {
--- /dev/null
+#include "builtin.h"
+#include "cache.h"
+#include "dir.h"
+#include "quote.h"
+#include "pathspec.h"
+#include "parse-options.h"
+
+static int quiet, verbose, stdin_paths;
+static const char * const check_ignore_usage[] = {
+"git check-ignore [options] pathname...",
+"git check-ignore [options] --stdin < <list-of-paths>",
+NULL
+};
+
+static int null_term_line;
+
+static const struct option check_ignore_options[] = {
+ OPT__QUIET(&quiet, N_("suppress progress reporting")),
+ OPT__VERBOSE(&verbose, N_("be verbose")),
+ OPT_GROUP(""),
+ OPT_BOOLEAN(0, "stdin", &stdin_paths,
+ N_("read file names from stdin")),
+ OPT_BOOLEAN('z', NULL, &null_term_line,
+ N_("input paths are terminated by a null character")),
+ OPT_END()
+};
+
+static void output_exclude(const char *path, struct exclude *exclude)
+{
+ char *bang = exclude->flags & EXC_FLAG_NEGATIVE ? "!" : "";
+ char *slash = exclude->flags & EXC_FLAG_MUSTBEDIR ? "/" : "";
+ if (!null_term_line) {
+ if (!verbose) {
+ write_name_quoted(path, stdout, '\n');
+ } else {
+ quote_c_style(exclude->el->src, NULL, stdout, 0);
+ printf(":%d:%s%s%s\t",
+ exclude->srcpos,
+ bang, exclude->pattern, slash);
+ quote_c_style(path, NULL, stdout, 0);
+ fputc('\n', stdout);
+ }
+ } else {
+ if (!verbose) {
+ printf("%s%c", path, '\0');
+ } else {
+ printf("%s%c%d%c%s%s%s%c%s%c",
+ exclude->el->src, '\0',
+ exclude->srcpos, '\0',
+ bang, exclude->pattern, slash, '\0',
+ path, '\0');
+ }
+ }
+}
+
+static int check_ignore(const char *prefix, const char **pathspec)
+{
+ struct dir_struct dir;
+ const char *path, *full_path;
+ char *seen;
+ int num_ignored = 0, dtype = DT_UNKNOWN, i;
+ struct path_exclude_check check;
+ struct exclude *exclude;
+
+ /* read_cache() is only necessary so we can watch out for submodules. */
+ if (read_cache() < 0)
+ die(_("index file corrupt"));
+
+ memset(&dir, 0, sizeof(dir));
+ dir.flags |= DIR_COLLECT_IGNORED;
+ setup_standard_excludes(&dir);
+
+ if (!pathspec || !*pathspec) {
+ if (!quiet)
+ fprintf(stderr, "no pathspec given.\n");
+ return 0;
+ }
+
+ path_exclude_check_init(&check, &dir);
+ /*
+ * look for pathspecs matching entries in the index, since these
+ * should not be ignored, in order to be consistent with
+ * 'git status', 'git add' etc.
+ */
+ seen = find_pathspecs_matching_against_index(pathspec);
+ for (i = 0; pathspec[i]; i++) {
+ path = pathspec[i];
+ full_path = prefix_path(prefix, prefix
+ ? strlen(prefix) : 0, path);
+ full_path = check_path_for_gitlink(full_path);
+ die_if_path_beyond_symlink(full_path, prefix);
+ if (!seen[i] && path[0]) {
+ exclude = last_exclude_matching_path(&check, full_path,
+ -1, &dtype);
+ if (exclude) {
+ if (!quiet)
+ output_exclude(path, exclude);
+ num_ignored++;
+ }
+ }
+ }
+ free(seen);
+ clear_directory(&dir);
+ path_exclude_check_clear(&check);
+
+ return num_ignored;
+}
+
+static int check_ignore_stdin_paths(const char *prefix)
+{
+ struct strbuf buf, nbuf;
+ char **pathspec = NULL;
+ size_t nr = 0, alloc = 0;
+ int line_termination = null_term_line ? 0 : '\n';
+ int num_ignored;
+
+ strbuf_init(&buf, 0);
+ strbuf_init(&nbuf, 0);
+ while (strbuf_getline(&buf, stdin, line_termination) != EOF) {
+ if (line_termination && buf.buf[0] == '"') {
+ strbuf_reset(&nbuf);
+ if (unquote_c_style(&nbuf, buf.buf, NULL))
+ die("line is badly quoted");
+ strbuf_swap(&buf, &nbuf);
+ }
+ ALLOC_GROW(pathspec, nr + 1, alloc);
+ pathspec[nr] = xcalloc(strlen(buf.buf) + 1, sizeof(*buf.buf));
+ strcpy(pathspec[nr++], buf.buf);
+ }
+ ALLOC_GROW(pathspec, nr + 1, alloc);
+ pathspec[nr] = NULL;
+ num_ignored = check_ignore(prefix, (const char **)pathspec);
+ maybe_flush_or_die(stdout, "attribute to stdout");
+ strbuf_release(&buf);
+ strbuf_release(&nbuf);
+ free(pathspec);
+ return num_ignored;
+}
+
+int cmd_check_ignore(int argc, const char **argv, const char *prefix)
+{
+ int num_ignored;
+
+ git_config(git_default_config, NULL);
+
+ argc = parse_options(argc, argv, prefix, check_ignore_options,
+ check_ignore_usage, 0);
+
+ if (stdin_paths) {
+ if (argc > 0)
+ die(_("cannot specify pathnames with --stdin"));
+ } else {
+ if (null_term_line)
+ die(_("-z only makes sense with --stdin"));
+ if (argc == 0)
+ die(_("no path specified"));
+ }
+ if (quiet) {
+ if (argc > 1)
+ die(_("--quiet is only valid with a single pathname"));
+ if (verbose)
+ die(_("cannot have both --quiet and --verbose"));
+ }
+
+ if (stdin_paths) {
+ num_ignored = check_ignore_stdin_paths(prefix);
+ } else {
+ num_ignored = check_ignore(prefix, argv);
+ maybe_flush_or_die(stdout, "ignore to stdout");
+ }
+
+ return !num_ignored;
+}
static const char **pathspec;
struct strbuf buf = STRBUF_INIT;
struct string_list exclude_list = STRING_LIST_INIT_NODUP;
+ struct exclude_list *el;
const char *qname;
char *seen = NULL;
struct option options[] = {
if (!ignored)
setup_standard_excludes(&dir);
+ el = add_exclude_list(&dir, EXC_CMDL, "--exclude option");
for (i = 0; i < exclude_list.nr; i++)
- add_exclude(exclude_list.items[i].string, "", 0,
- &dir.exclude_list[EXC_CMDL]);
+ add_exclude(exclude_list.items[i].string, "", 0, el, -(i+1));
pathspec = get_pathspec(prefix, argv);
return git_status_config(k, v, s);
}
-static const char post_rewrite_hook[] = "hooks/post-rewrite";
-
static int run_rewrite_hook(const unsigned char *oldsha1,
const unsigned char *newsha1)
{
int code;
size_t n;
- if (access(git_path(post_rewrite_hook), X_OK) < 0)
+ argv[0] = find_hook("post-rewrite");
+ if (!argv[0])
return 0;
- argv[0] = git_path(post_rewrite_hook);
argv[1] = "amend";
argv[2] = NULL;
static char *ps_matched;
static const char *with_tree;
static int exc_given;
+static int exclude_args;
static const char *tag_cached = "";
static const char *tag_unmerged = "";
static int option_parse_exclude(const struct option *opt,
const char *arg, int unset)
{
- struct exclude_list *list = opt->value;
+ struct string_list *exclude_list = opt->value;
exc_given = 1;
- add_exclude(arg, "", 0, list);
+ string_list_append(exclude_list, arg);
return 0;
}
int cmd_ls_files(int argc, const char **argv, const char *cmd_prefix)
{
- int require_work_tree = 0, show_tag = 0;
+ int require_work_tree = 0, show_tag = 0, i;
const char *max_prefix;
struct dir_struct dir;
+ struct exclude_list *el;
+ struct string_list exclude_list = STRING_LIST_INIT_NODUP;
struct option builtin_ls_files_options[] = {
{ OPTION_CALLBACK, 'z', NULL, NULL, NULL,
N_("paths are separated with NUL character"),
N_("show unmerged files in the output")),
OPT_BOOLEAN(0, "resolve-undo", &show_resolve_undo,
N_("show resolve-undo information")),
- { OPTION_CALLBACK, 'x', "exclude", &dir.exclude_list[EXC_CMDL], N_("pattern"),
+ { OPTION_CALLBACK, 'x', "exclude", &exclude_list, N_("pattern"),
N_("skip files matching pattern"),
0, option_parse_exclude },
{ OPTION_CALLBACK, 'X', "exclude-from", &dir, N_("file"),
argc = parse_options(argc, argv, prefix, builtin_ls_files_options,
ls_files_usage, 0);
+ el = add_exclude_list(&dir, EXC_CMDL, "--exclude option");
+ for (i = 0; i < exclude_list.nr; i++) {
+ add_exclude(exclude_list.items[i].string, "", 0, el, --exclude_args);
+ }
if (show_tag || show_valid_bit) {
tag_cached = "H ";
tag_unmerged = "M ";
OPT_BOOL(0, "progress", &progress, N_("force progress reporting")),
OPT_BIT(0, "prune", &flags, N_("prune locally removed refs"),
TRANSPORT_PUSH_PRUNE),
+ OPT_BIT(0, "no-verify", &flags, N_("bypass pre-push hook"), TRANSPORT_PUSH_NO_HOOK),
OPT_END()
};
char ref_name[FLEX_ARRAY]; /* more */
};
-static const char pre_receive_hook[] = "hooks/pre-receive";
-static const char post_receive_hook[] = "hooks/post-receive";
-
static void rp_error(const char *err, ...) __attribute__((format (printf, 1, 2)));
static void rp_warning(const char *err, ...) __attribute__((format (printf, 1, 2)));
const char *argv[2];
int code;
- if (access(hook_name, X_OK) < 0)
+ argv[0] = find_hook(hook_name);
+ if (!argv[0])
return 0;
- argv[0] = hook_name;
argv[1] = NULL;
memset(&proc, 0, sizeof(proc));
static int run_update_hook(struct command *cmd)
{
- static const char update_hook[] = "hooks/update";
const char *argv[5];
struct child_process proc;
int code;
- if (access(update_hook, X_OK) < 0)
+ argv[0] = find_hook("update");
+ if (!argv[0])
return 0;
- argv[0] = update_hook;
argv[1] = cmd->ref_name;
argv[2] = sha1_to_hex(cmd->old_sha1);
argv[3] = sha1_to_hex(cmd->new_sha1);
}
}
-static char update_post_hook[] = "hooks/post-update";
-
static void run_update_post_hook(struct command *commands)
{
struct command *cmd;
int argc;
const char **argv;
struct child_process proc;
+ char *hook;
+ hook = find_hook("post-update");
for (argc = 0, cmd = commands; cmd; cmd = cmd->next) {
if (cmd->error_string || cmd->did_not_exist)
continue;
argc++;
}
- if (!argc || access(update_post_hook, X_OK) < 0)
+ if (!argc || !hook)
return;
+
argv = xmalloc(sizeof(*argv) * (2 + argc));
- argv[0] = update_post_hook;
+ argv[0] = hook;
for (argc = 1, cmd = commands; cmd; cmd = cmd->next) {
char *p;
0, &cmd))
set_connectivity_errors(commands);
- if (run_receive_hook(commands, pre_receive_hook, 0)) {
+ if (run_receive_hook(commands, "pre-receive", 0)) {
for (cmd = commands; cmd; cmd = cmd->next) {
if (!cmd->error_string)
cmd->error_string = "pre-receive hook declined";
unlink_or_warn(pack_lockfile);
if (report_status)
report(commands, unpack_status);
- run_receive_hook(commands, post_receive_hook, 1);
+ run_receive_hook(commands, "post-receive", 1);
run_update_post_hook(commands);
if (auto_gc) {
const char *argv_gc_auto[] = {
requires_force:1,
merge:1,
nonfastforward:1,
- not_forwardable:1,
update:1,
deletion:1;
enum {
extern int git_env_bool(const char *, int);
extern int git_config_system(void);
extern int config_error_nonbool(const char *);
-#ifdef __GNUC__
+#if defined(__GNUC__) && ! defined(__clang__)
#define config_error_nonbool(s) (config_error_nonbool(s), -1)
#endif
extern const char *get_log_output_encoding(void);
git-bundle mainporcelain
git-cat-file plumbinginterrogators
git-check-attr purehelpers
+git-check-ignore purehelpers
git-checkout mainporcelain common
git-checkout-index plumbingmanipulators
git-check-ref-format purehelpers
archimport) : import;;
cat-file) : plumbing;;
check-attr) : plumbing;;
+ check-ignore) : plumbing;;
check-ref-format) : plumbing;;
checkout-index) : plumbing;;
commit-tree) : plumbing;;
X-Git-Reftype: $refname_type
X-Git-Oldrev: $oldrev
X-Git-Newrev: $newrev
+ Auto-Submitted: auto-generated
This is an automated email from the git hooks/post-receive script. It was
generated because a ref change was pushed to the repository containing
}
/*
- * Given a name and a list of pathspecs, see if the name matches
- * any of the pathspecs. The caller is also interested in seeing
- * all pathspec matches some names it calls this function with
- * (otherwise the user could have mistyped the unmatched pathspec),
- * and a mark is left in seen[] array for pathspec element that
- * actually matched anything.
+ * Given a name and a list of pathspecs, returns the nature of the
+ * closest (i.e. most specific) match of the name to any of the
+ * pathspecs.
+ *
+ * The caller typically calls this multiple times with the same
+ * pathspec and seen[] array but with different name/namelen
+ * (e.g. entries from the index) and is interested in seeing if and
+ * how each pathspec matches all the names it calls this function
+ * with. A mark is left in the seen[] array for each pathspec element
+ * indicating the closest type of match that element achieved, so if
+ * seen[n] remains zero after multiple invocations, that means the nth
+ * pathspec did not match any names, which could indicate that the
+ * user mistyped the nth pathspec.
*/
int match_pathspec(const char **pathspec, const char *name, int namelen,
int prefix, char *seen)
}
/*
- * Given a name and a list of pathspecs, see if the name matches
- * any of the pathspecs. The caller is also interested in seeing
- * all pathspec matches some names it calls this function with
- * (otherwise the user could have mistyped the unmatched pathspec),
- * and a mark is left in seen[] array for pathspec element that
- * actually matched anything.
+ * Given a name and a list of pathspecs, returns the nature of the
+ * closest (i.e. most specific) match of the name to any of the
+ * pathspecs.
+ *
+ * The caller typically calls this multiple times with the same
+ * pathspec and seen[] array but with different name/namelen
+ * (e.g. entries from the index) and is interested in seeing if and
+ * how each pathspec matches all the names it calls this function
+ * with. A mark is left in the seen[] array for each pathspec element
+ * indicating the closest type of match that element achieved, so if
+ * seen[n] remains zero after multiple invocations, that means the nth
+ * pathspec did not match any names, which could indicate that the
+ * user mistyped the nth pathspec.
*/
int match_pathspec_depth(const struct pathspec *ps,
const char *name, int namelen,
}
void add_exclude(const char *string, const char *base,
- int baselen, struct exclude_list *el)
+ int baselen, struct exclude_list *el, int srcpos)
{
struct exclude *x;
int patternlen;
x->base = base;
x->baselen = baselen;
x->flags = flags;
+ x->srcpos = srcpos;
ALLOC_GROW(el->excludes, el->nr + 1, el->alloc);
el->excludes[el->nr++] = x;
+ x->el = el;
}
static void *read_skip_worktree_file_from_index(const char *path, size_t *size)
for (i = 0; i < el->nr; i++)
free(el->excludes[i]);
free(el->excludes);
+ free(el->filebuf);
el->nr = 0;
el->excludes = NULL;
+ el->filebuf = NULL;
}
int add_excludes_from_file_to_list(const char *fname,
const char *base,
int baselen,
- char **buf_p,
struct exclude_list *el,
int check_index)
{
struct stat st;
- int fd, i;
+ int fd, i, lineno = 1;
size_t size = 0;
char *buf, *entry;
close(fd);
}
- if (buf_p)
- *buf_p = buf;
+ el->filebuf = buf;
entry = buf;
for (i = 0; i < size; i++) {
if (buf[i] == '\n') {
if (entry != buf + i && entry[0] != '#') {
buf[i - (i && buf[i-1] == '\r')] = 0;
- add_exclude(entry, base, baselen, el);
+ add_exclude(entry, base, baselen, el, lineno);
}
+ lineno++;
entry = buf + i + 1;
}
}
return 0;
}
+struct exclude_list *add_exclude_list(struct dir_struct *dir,
+ int group_type, const char *src)
+{
+ struct exclude_list *el;
+ struct exclude_list_group *group;
+
+ group = &dir->exclude_list_group[group_type];
+ ALLOC_GROW(group->el, group->nr + 1, group->alloc);
+ el = &group->el[group->nr++];
+ memset(el, 0, sizeof(*el));
+ el->src = src;
+ return el;
+}
+
+/*
+ * Used to set up core.excludesfile and .git/info/exclude lists.
+ */
void add_excludes_from_file(struct dir_struct *dir, const char *fname)
{
- if (add_excludes_from_file_to_list(fname, "", 0, NULL,
- &dir->exclude_list[EXC_FILE], 0) < 0)
+ struct exclude_list *el;
+ el = add_exclude_list(dir, EXC_FILE, fname);
+ if (add_excludes_from_file_to_list(fname, "", 0, el, 0) < 0)
die("cannot use %s as an exclude file", fname);
}
*/
static void prep_exclude(struct dir_struct *dir, const char *base, int baselen)
{
+ struct exclude_list_group *group;
struct exclude_list *el;
struct exclude_stack *stk = NULL;
int current;
(baselen + strlen(dir->exclude_per_dir) >= PATH_MAX))
return; /* too long a path -- ignore */
- /* Pop the directories that are not the prefix of the path being checked. */
- el = &dir->exclude_list[EXC_DIRS];
+ group = &dir->exclude_list_group[EXC_DIRS];
+
+ /* Pop the exclude lists from the EXCL_DIRS exclude_list_group
+ * which originate from directories not in the prefix of the
+ * path being checked. */
while ((stk = dir->exclude_stack) != NULL) {
if (stk->baselen <= baselen &&
!strncmp(dir->basebuf, base, stk->baselen))
break;
+ el = &group->el[dir->exclude_stack->exclude_ix];
dir->exclude_stack = stk->prev;
- while (stk->exclude_ix < el->nr)
- free(el->excludes[--el->nr]);
- free(stk->filebuf);
+ free((char *)el->src); /* see strdup() below */
+ clear_exclude_list(el);
free(stk);
+ group->nr--;
}
/* Read from the parent directories and push them down. */
}
stk->prev = dir->exclude_stack;
stk->baselen = cp - base;
- stk->exclude_ix = el->nr;
memcpy(dir->basebuf + current, base + current,
stk->baselen - current);
strcpy(dir->basebuf + stk->baselen, dir->exclude_per_dir);
+ /*
+ * dir->basebuf gets reused by the traversal, but we
+ * need fname to remain unchanged to ensure the src
+ * member of each struct exclude correctly
+ * back-references its source file. Other invocations
+ * of add_exclude_list provide stable strings, so we
+ * strdup() and free() here in the caller.
+ */
+ el = add_exclude_list(dir, EXC_DIRS, strdup(dir->basebuf));
+ stk->exclude_ix = group->nr - 1;
add_excludes_from_file_to_list(dir->basebuf,
dir->basebuf, stk->baselen,
- &stk->filebuf, el, 1);
+ el, 1);
dir->exclude_stack = stk;
current = stk->baselen;
}
}
return wildmatch(pattern, name,
- ignore_case ? FNM_CASEFOLD : 0) == 0;
+ WM_PATHNAME | (ignore_case ? WM_CASEFOLD : 0),
+ NULL) == 0;
}
/*
int *dtype_p)
{
int pathlen = strlen(pathname);
- int st;
+ int i, j;
+ struct exclude_list_group *group;
struct exclude *exclude;
const char *basename = strrchr(pathname, '/');
basename = (basename) ? basename+1 : pathname;
prep_exclude(dir, pathname, basename-pathname);
- for (st = EXC_CMDL; st <= EXC_FILE; st++) {
- exclude = last_exclude_matching_from_list(
- pathname, pathlen, basename, dtype_p,
- &dir->exclude_list[st]);
- if (exclude)
- return exclude;
+
+ for (i = EXC_CMDL; i <= EXC_FILE; i++) {
+ group = &dir->exclude_list_group[i];
+ for (j = group->nr - 1; j >= 0; j--) {
+ exclude = last_exclude_matching_from_list(
+ pathname, pathlen, basename, dtype_p,
+ &group->el[j]);
+ if (exclude)
+ return exclude;
+ }
}
return NULL;
}
flag = git_env_bool(GIT_LITERAL_PATHSPECS_ENVIRONMENT, 0);
return flag;
}
+
+/*
+ * Frees memory within dir which was allocated for exclude lists and
+ * the exclude_stack. Does not free dir itself.
+ */
+void clear_directory(struct dir_struct *dir)
+{
+ int i, j;
+ struct exclude_list_group *group;
+ struct exclude_list *el;
+ struct exclude_stack *stk;
+
+ for (i = EXC_CMDL; i <= EXC_FILE; i++) {
+ group = &dir->exclude_list_group[i];
+ for (j = 0; j < group->nr; j++) {
+ el = &group->el[j];
+ if (i == EXC_DIRS)
+ free((char *)el->src);
+ clear_exclude_list(el);
+ }
+ free(group->el);
+ }
+
+ stk = dir->exclude_stack;
+ while (stk) {
+ struct exclude_stack *prev = stk->prev;
+ free(stk);
+ stk = prev;
+ }
+}
#define EXC_FLAG_NEGATIVE 16
/*
- * Each .gitignore file will be parsed into patterns which are then
- * appended to the relevant exclude_list (either EXC_DIRS or
- * EXC_FILE). exclude_lists are also used to represent the list of
- * --exclude values passed via CLI args (EXC_CMDL).
+ * Each excludes file will be parsed into a fresh exclude_list which
+ * is appended to the relevant exclude_list_group (either EXC_DIRS or
+ * EXC_FILE). An exclude_list within the EXC_CMDL exclude_list_group
+ * can also be used to represent the list of --exclude values passed
+ * via CLI args.
*/
struct exclude_list {
int nr;
int alloc;
+
+ /* remember pointer to exclude file contents so we can free() */
+ char *filebuf;
+
+ /* origin of list, e.g. path to filename, or descriptive string */
+ const char *src;
+
struct exclude {
+ /*
+ * This allows callers of last_exclude_matching() etc.
+ * to determine the origin of the matching pattern.
+ */
+ struct exclude_list *el;
+
const char *pattern;
int patternlen;
int nowildcardlen;
const char *base;
int baselen;
int flags;
+
+ /*
+ * Counting starts from 1 for line numbers in ignore files,
+ * and from -1 decrementing for patterns from CLI args.
+ */
+ int srcpos;
} **excludes;
};
*/
struct exclude_stack {
struct exclude_stack *prev; /* the struct exclude_stack for the parent directory */
- char *filebuf; /* remember pointer to per-directory exclude file contents so we can free() */
int baselen;
- int exclude_ix;
+ int exclude_ix; /* index of exclude_list within EXC_DIRS exclude_list_group */
+};
+
+struct exclude_list_group {
+ int nr, alloc;
+ struct exclude_list *el;
};
struct dir_struct {
/* Exclude info */
const char *exclude_per_dir;
- struct exclude_list exclude_list[3];
+
/*
- * We maintain three exclude pattern lists:
+ * We maintain three groups of exclude pattern lists:
+ *
* EXC_CMDL lists patterns explicitly given on the command line.
* EXC_DIRS lists patterns obtained from per-directory ignore files.
- * EXC_FILE lists patterns from fallback ignore files.
+ * EXC_FILE lists patterns from fallback ignore files, e.g.
+ * - .git/info/exclude
+ * - core.excludesfile
+ *
+ * Each group contains multiple exclude lists, a single list
+ * per source.
*/
#define EXC_CMDL 0
#define EXC_DIRS 1
#define EXC_FILE 2
+ struct exclude_list_group exclude_list_group[3];
/*
* Temporary variables which are used during loading of the
char basebuf[PATH_MAX];
};
+/*
+ * The ordering of these constants is significant, with
+ * higher-numbered match types signifying "closer" (i.e. more
+ * specific) matches which will override lower-numbered match types
+ * when populating the seen[] array.
+ */
#define MATCHED_RECURSIVELY 1
#define MATCHED_FNMATCH 2
#define MATCHED_EXACTLY 3
extern int is_path_excluded(struct path_exclude_check *, const char *, int namelen, int *dtype);
+extern struct exclude_list *add_exclude_list(struct dir_struct *dir,
+ int group_type, const char *src);
extern int add_excludes_from_file_to_list(const char *fname, const char *base, int baselen,
- char **buf_p, struct exclude_list *el, int check_index);
+ struct exclude_list *el, int check_index);
extern void add_excludes_from_file(struct dir_struct *, const char *fname);
extern void parse_exclude_pattern(const char **string, int *patternlen, int *flags, int *nowildcardlen);
extern void add_exclude(const char *string, const char *base,
- int baselen, struct exclude_list *el);
+ int baselen, struct exclude_list *el, int srcpos);
extern void clear_exclude_list(struct exclude_list *el);
+extern void clear_directory(struct dir_struct *dir);
extern int file_exists(const char *);
extern int is_inside_dir(const char *dir);
#include <sys/time.h>
#include <time.h>
#include <signal.h>
+#ifndef USE_WILDMATCH
#include <fnmatch.h>
+#endif
#include <assert.h>
#include <regex.h>
#include <utime.h>
#include "compat/bswap.h"
+#ifdef USE_WILDMATCH
+#include "wildmatch.h"
+#define FNM_PATHNAME WM_PATHNAME
+#define FNM_CASEFOLD WM_CASEFOLD
+#define FNM_NOMATCH WM_NOMATCH
+static inline int fnmatch(const char *pattern, const char *string, int flags)
+{
+ return wildmatch(pattern, string, flags, NULL);
+}
+#endif
+
/* General helper functions */
extern void vreportf(const char *prefix, const char *err, va_list params);
extern void vwritef(int fd, const char *prefix, const char *err, va_list params);
* behavior. But since we're only trying to help gcc, anyway, it's OK; other
* compilers will fall back to using the function as usual.
*/
-#ifdef __GNUC__
+#if defined(__GNUC__) && ! defined(__clang__)
#define error(fmt, ...) (error((fmt), ##__VA_ARGS__), -1)
#endif
'Valid-responses' => \&req_Validresponses,
'valid-requests' => \&req_validrequests,
'Directory' => \&req_Directory,
+ 'Sticky' => \&req_Sticky,
'Entry' => \&req_Entry,
'Modified' => \&req_Modified,
'Unchanged' => \&req_Unchanged,
{
$log->info("Setting prepend to '$state->{path}'");
$state->{prependdir} = $state->{path};
+ my %entries;
foreach my $entry ( keys %{$state->{entries}} )
{
- $state->{entries}{$state->{prependdir} . $entry} = $state->{entries}{$entry};
- delete $state->{entries}{$entry};
+ $entries{$state->{prependdir} . $entry} = $state->{entries}{$entry};
}
+ $state->{entries}=\%entries;
+
+ my %dirMap;
+ foreach my $dir ( keys %{$state->{dirMap}} )
+ {
+ $dirMap{$state->{prependdir} . $dir} = $state->{dirMap}{$dir};
+ }
+ $state->{dirMap}=\%dirMap;
}
if ( defined ( $state->{prependdir} ) )
$log->debug("Prepending '$state->{prependdir}' to state|directory");
$state->{directory} = $state->{prependdir} . $state->{directory}
}
+
+ if ( ! defined($state->{dirMap}{$state->{directory}}) )
+ {
+ $state->{dirMap}{$state->{directory}} =
+ {
+ 'names' => {}
+ #'tagspec' => undef
+ };
+ }
+
$log->debug("req_Directory : localdir=$data repository=$repository path=$state->{path} directory=$state->{directory} module=$state->{module}");
}
+# Sticky tagspec \n
+# Response expected: no. Tell the server that the directory most
+# recently specified with Directory has a sticky tag or date
+# tagspec. The first character of tagspec is T for a tag, D for
+# a date, or some other character supplied by a Set-sticky
+# response from a previous request to the server. The remainder
+# of tagspec contains the actual tag or date, again as supplied
+# by Set-sticky.
+# The server should remember Static-directory and Sticky requests
+# for a particular directory; the client need not resend them each
+# time it sends a Directory request for a given directory. However,
+# the server is not obliged to remember them beyond the context
+# of a single command.
+sub req_Sticky
+{
+ my ( $cmd, $tagspec ) = @_;
+
+ my ( $stickyInfo );
+ if($tagspec eq "")
+ {
+ # nothing
+ }
+ elsif($tagspec=~/^T([^ ]+)\s*$/)
+ {
+ $stickyInfo = { 'tag' => $1 };
+ }
+ elsif($tagspec=~/^D([0-9.]+)\s*$/)
+ {
+ $stickyInfo= { 'date' => $1 };
+ }
+ else
+ {
+ die "Unknown tag_or_date format\n";
+ }
+ $state->{dirMap}{$state->{directory}}{stickyInfo}=$stickyInfo;
+
+ $log->debug("req_Sticky : tagspec=$tagspec repository=$state->{repository}"
+ . " path=$state->{path} directory=$state->{directory}"
+ . " module=$state->{module}");
+}
+
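# Illustrative sketch, not part of this patch: how req_Sticky() above maps a
# wire-format tagspec onto the stickyInfo hash it stores in $state->{dirMap}.
# The tag and date values below are made-up examples.
#
#   req_Sticky('Sticky', 'Tmybranch');
#     # => $state->{dirMap}{$state->{directory}}{stickyInfo} = { 'tag'  => 'mybranch' }
#   req_Sticky('Sticky', 'D2011.04.13.20.37.07');
#     # => $state->{dirMap}{$state->{directory}}{stickyInfo} = { 'date' => '2011.04.13.20.37.07' }
#   req_Sticky('Sticky', '');
#     # => $state->{dirMap}{$state->{directory}}{stickyInfo} = undef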
# Entry entry-line \n
# Response expected: no. Tell the server what version of a file is on the
# local machine. The name in entry-line is a name relative to the directory
tag_or_date => $data[5],
};
+ $state->{dirMap}{$state->{directory}}{names}{$data[1]} = 'F';
+
$log->info("Received entry line '$data' => '" . $state->{directory} . $data[1] . "'");
}
{
$filename = filecleanup($filename);
- my $meta = $updater->getmeta($filename);
+ # no -r, -A, or -D with add
+ my $stickyInfo = resolveStickyInfo($filename);
+
+ my $meta = $updater->getmeta($filename,$stickyInfo);
my $wrev = revparse($filename);
if ($wrev && $meta && ($wrev=~/^-/))
# this is an "entries" line
my $kopts = kopts_from_path($filename,"sha1",$meta->{filehash});
- $log->debug("/$filepart/$meta->{revision}//$kopts/");
- print "/$filepart/$meta->{revision}//$kopts/\n";
+ my $entryLine = "/$filepart/$meta->{revision}//$kopts/";
+ $entryLine .= getStickyTagOrDate($stickyInfo);
+ $log->debug($entryLine);
+ print "$entryLine\n";
# permissions
$log->debug("SEND : u=$meta->{mode},g=$meta->{mode},o=$meta->{mode}");
print "u=$meta->{mode},g=$meta->{mode},o=$meta->{mode}\n";
print "$filename\n";
my $kopts = kopts_from_path($filename,"file",
$state->{entries}{$filename}{modified_filename});
- print "/$filepart/0//$kopts/\n";
+ print "/$filepart/0//$kopts/" .
+ getStickyTagOrDate($stickyInfo) . "\n";
my $requestedKopts = $state->{opt}{k};
if(defined($requestedKopts))
next;
}
- my $meta = $updater->getmeta($filename);
+ # only from entries
+ my $stickyInfo = resolveStickyInfo($filename);
+
+ my $meta = $updater->getmeta($filename,$stickyInfo);
my $wrev = revparse($filename);
unless ( defined ( $wrev ) )
print "Checked-in $dirpart\n";
print "$filename\n";
my $kopts = kopts_from_path($filename,"sha1",$meta->{filehash});
- print "/$filepart/-$wrev//$kopts/\n";
+ print "/$filepart/-$wrev//$kopts/" . getStickyTagOrDate($stickyInfo) . "\n";
$rmcount++;
}
return 1;
}
+ my $stickyInfo = { 'tag' => $state->{opt}{r},
+ 'date' => $state->{opt}{D} };
+
my $module = $state->{args}[0];
$state->{module} = $module;
my $checkout_path = $module;
my $updater = GITCVS::updater->new($state->{CVSROOT}, $module, $log);
$updater->update();
- $checkout_path =~ s|/$||; # get rid of trailing slashes
+ my $headHash;
+ if( defined($stickyInfo) && defined($stickyInfo->{tag}) )
+ {
+ $headHash = $updater->lookupCommitRef($stickyInfo->{tag});
+ if( !defined($headHash) )
+ {
+ print "error 1 no such tag `$stickyInfo->{tag}'\n";
+ cleanupWorkTree();
+ exit;
+ }
+ }
- # Eclipse seems to need the Clear-sticky command
- # to prepare the 'Entries' file for the new directory.
- print "Clear-sticky $checkout_path/\n";
- print $state->{CVSROOT} . "/$module/\n";
- print "Clear-static-directory $checkout_path/\n";
- print $state->{CVSROOT} . "/$module/\n";
- print "Clear-sticky $checkout_path/\n"; # yes, twice
- print $state->{CVSROOT} . "/$module/\n";
- print "Template $checkout_path/\n";
- print $state->{CVSROOT} . "/$module/\n";
- print "0\n";
-
- # instruct the client that we're checking out to $checkout_path
- print "E cvs checkout: Updating $checkout_path\n";
+ $checkout_path =~ s|/$||; # get rid of trailing slashes
my %seendirs = ();
my $lastdir ='';
- # recursive
- sub prepdir {
- my ($dir, $repodir, $remotedir, $seendirs) = @_;
- my $parent = dirname($dir);
- $dir =~ s|/+$||;
- $repodir =~ s|/+$||;
- $remotedir =~ s|/+$||;
- $parent =~ s|/+$||;
- $log->debug("announcedir $dir, $repodir, $remotedir" );
-
- if ($parent eq '.' || $parent eq './') {
- $parent = '';
- }
- # recurse to announce unseen parents first
- if (length($parent) && !exists($seendirs->{$parent})) {
- prepdir($parent, $repodir, $remotedir, $seendirs);
- }
- # Announce that we are going to modify at the parent level
- if ($parent) {
- print "E cvs checkout: Updating $remotedir/$parent\n";
- } else {
- print "E cvs checkout: Updating $remotedir\n";
- }
- print "Clear-sticky $remotedir/$parent/\n";
- print "$repodir/$parent/\n";
-
- print "Clear-static-directory $remotedir/$dir/\n";
- print "$repodir/$dir/\n";
- print "Clear-sticky $remotedir/$parent/\n"; # yes, twice
- print "$repodir/$parent/\n";
- print "Template $remotedir/$dir/\n";
- print "$repodir/$dir/\n";
- print "0\n";
-
- $seendirs->{$dir} = 1;
- }
-
- foreach my $git ( @{$updater->gethead} )
+ prepDirForOutput(
+ ".",
+ $state->{CVSROOT} . "/$module",
+ $checkout_path,
+ \%seendirs,
+ 'checkout',
+ $state->{dirArgs} );
+
+ foreach my $git ( @{$updater->getAnyHead($headHash)} )
{
# Don't want to check out deleted files
next if ( $git->{filehash} eq "deleted" );
my $fullName = $git->{name};
( $git->{name}, $git->{dir} ) = filenamesplit($git->{name});
- if (length($git->{dir}) && $git->{dir} ne './'
- && $git->{dir} ne $lastdir ) {
- unless (exists($seendirs{$git->{dir}})) {
- prepdir($git->{dir}, $state->{CVSROOT} . "/$module/",
- $checkout_path, \%seendirs);
- $lastdir = $git->{dir};
- $seendirs{$git->{dir}} = 1;
- }
- print "E cvs checkout: Updating /$checkout_path/$git->{dir}\n";
- }
+ unless (exists($seendirs{$git->{dir}})) {
+ prepDirForOutput($git->{dir}, $state->{CVSROOT} . "/$module/",
+ $checkout_path, \%seendirs, 'checkout',
+ $state->{dirArgs} );
+ $lastdir = $git->{dir};
+ $seendirs{$git->{dir}} = 1;
+ }
# modification time of this file
print "Mod-time $git->{modified}\n";
# this is an "entries" line
my $kopts = kopts_from_path($fullName,"sha1",$git->{filehash});
- print "/$git->{name}/$git->{revision}//$kopts/\n";
+ print "/$git->{name}/$git->{revision}//$kopts/" .
+ getStickyTagOrDate($stickyInfo) . "\n";
# permissions
print "u=$git->{mode},g=$git->{mode},o=$git->{mode}\n";
statecleanup();
}
+# used by req_co and req_update to set up directories for files
+# recursively handles parents
+sub prepDirForOutput
+{
+ my ($dir, $repodir, $remotedir, $seendirs, $request, $dirArgs) = @_;
+
+ my $parent = dirname($dir);
+ $dir =~ s|/+$||;
+ $repodir =~ s|/+$||;
+ $remotedir =~ s|/+$||;
+ $parent =~ s|/+$||;
+
+ if ($parent eq '.' || $parent eq './')
+ {
+ $parent = '';
+ }
+ # recurse to announce unseen parents first
+ if( length($parent) &&
+ !exists($seendirs->{$parent}) &&
+ ( $request eq "checkout" ||
+ exists($dirArgs->{$parent}) ) )
+ {
+ prepDirForOutput($parent, $repodir, $remotedir,
+ $seendirs, $request, $dirArgs);
+ }
+ # Announce that we are going to modify at the parent level
+ if ($dir eq '.' || $dir eq './')
+ {
+ $dir = '';
+ }
+ if(exists($seendirs->{$dir}))
+ {
+ return;
+ }
+ $log->debug("announcedir $dir, $repodir, $remotedir" );
+ my($thisRemoteDir,$thisRepoDir);
+ if ($dir ne "")
+ {
+ $thisRepoDir="$repodir/$dir";
+ if($remotedir eq ".")
+ {
+ $thisRemoteDir=$dir;
+ }
+ else
+ {
+ $thisRemoteDir="$remotedir/$dir";
+ }
+ }
+ else
+ {
+ $thisRepoDir=$repodir;
+ $thisRemoteDir=$remotedir;
+ }
+ unless ( $state->{globaloptions}{-Q} || $state->{globaloptions}{-q} )
+ {
+ print "E cvs $request: Updating $thisRemoteDir\n";
+ }
+
+ my ($opt_r)=$state->{opt}{r};
+ my $stickyInfo;
+ if(exists($state->{opt}{A}))
+ {
+ # $stickyInfo=undef;
+ }
+ elsif( defined($opt_r) && $opt_r ne "" )
+ # || ( defined($state->{opt}{D}) && $state->{opt}{D} ne "" ) # TODO
+ {
+ $stickyInfo={ 'tag' => (defined($opt_r)?$opt_r:undef) };
+
+ # TODO: Convert -D value into the form 2011.04.10.04.46.57,
+ # similar to an entry line's sticky date, without the D prefix.
+ # It sometimes (always?) arrives as something more like
+ # '10 Apr 2011 04:46:57 -0000'...
+ # $stickyInfo={ 'date' => (defined($stickyDate)?$stickyDate:undef) };
+ }
+ else
+ {
+ $stickyInfo=getDirStickyInfo($state->{prependdir} . $dir);
+ }
+
+ my $stickyResponse;
+ if(defined($stickyInfo))
+ {
+ $stickyResponse = "Set-sticky $thisRemoteDir/\n" .
+ "$thisRepoDir/\n" .
+ getStickyTagOrDate($stickyInfo) . "\n";
+ }
+ else
+ {
+ $stickyResponse = "Clear-sticky $thisRemoteDir/\n" .
+ "$thisRepoDir/\n";
+ }
+
+ unless ( $state->{globaloptions}{-n} )
+ {
+ print $stickyResponse;
+
+ print "Clear-static-directory $thisRemoteDir/\n";
+ print "$thisRepoDir/\n";
+ print $stickyResponse; # yes, twice
+ print "Template $thisRemoteDir/\n";
+ print "$thisRepoDir/\n";
+ print "0\n";
+ }
+
+ $seendirs->{$dir} = 1;
+
+ # FUTURE: This would more accurately emulate CVS by sending
+ # another copy of sticky after processing the files in that
+ # directory. Or, as an intermediate step, perhaps send all sticky
+ # responses for $seendirs after processing all files.
+}
+
# update \n
# Response expected: yes. Actually do a cvs update command. This uses any
# previous Argument, Directory, Entry, or Modified requests, if they have
#$log->debug("update state : " . Dumper($state));
- my $last_dirname = "///";
+ my($repoDir);
+ $repoDir=$state->{CVSROOT} . "/$state->{module}/$state->{prependdir}";
+
+ my %seendirs = ();
# foreach file specified on the command line ...
- foreach my $filename ( @{$state->{args}} )
+ foreach my $argsFilename ( @{$state->{args}} )
{
- $filename = filecleanup($filename);
+ my $filename;
+ $filename = filecleanup($argsFilename);
$log->debug("Processing file $filename");
- unless ( $state->{globaloptions}{-Q} || $state->{globaloptions}{-q} )
- {
- my $cur_dirname = dirname($filename);
- if ( $cur_dirname ne $last_dirname )
- {
- $last_dirname = $cur_dirname;
- if ( $cur_dirname eq "" )
- {
- $cur_dirname = ".";
- }
- print "E cvs update: Updating $cur_dirname\n";
- }
- }
-
# if we have a -C we should pretend we never saw modified stuff
if ( exists ( $state->{opt}{C} ) )
{
$state->{entries}{$filename}{unchanged} = 1;
}
- my $meta;
- if ( defined($state->{opt}{r}) and $state->{opt}{r} =~ /^(1\.\d+)$/ )
- {
- $meta = $updater->getmeta($filename, $1);
- } else {
- $meta = $updater->getmeta($filename);
- }
+ my $stickyInfo = resolveStickyInfo($filename,
+ $state->{opt}{r},
+ $state->{opt}{D},
+ exists($state->{opt}{A}));
+ my $meta = $updater->getmeta($filename, $stickyInfo);
# If -p was given, "print" the contents of the requested revision.
if ( exists ( $state->{opt}{p} ) ) {
next;
}
+ # Directories:
+ prepDirForOutput(
+ dirname($argsFilename),
+ $repoDir,
+ ".",
+ \%seendirs,
+ "update",
+ $state->{dirArgs} );
+
+ my $wrev = revparse($filename);
+
if ( ! defined $meta )
{
$meta = {
revision => '0',
filehash => 'added'
};
+ if($wrev ne "0")
+ {
+ $meta->{filehash}='deleted';
+ }
}
my $oldmeta = $meta;
- my $wrev = revparse($filename);
-
# If the working copy is an old revision, let's get that version too for comparison.
- if ( defined($wrev) and $wrev ne $meta->{revision} )
+ my $oldWrev=$wrev;
+ if(defined($oldWrev))
{
- $oldmeta = $updater->getmeta($filename, $wrev);
+ $oldWrev=~s/^-//;
+ if($oldWrev ne $meta->{revision})
+ {
+ $oldmeta = $updater->getmeta($filename, $oldWrev);
+ }
}
#$log->debug("Target revision is $meta->{revision}, current working revision is $wrev");
if ( defined ( $wrev )
and defined($meta->{revision})
and $wrev eq $meta->{revision}
+ and $wrev ne "0"
and defined($state->{entries}{$filename}{modified_hash})
and not exists ( $state->{opt}{C} ) )
{
next;
}
- if ( $meta->{filehash} eq "deleted" )
+ if ( $meta->{filehash} eq "deleted" && $wrev ne "0" )
{
# TODO: If it has been modified in the sandbox, error out
# with the appropriate message, rather than deleting a modified
$log->debug("Updating existing file 'Update-existing $dirpart'");
} else {
# instruct client we're sending a file to put in this path as a new file
- print "Clear-static-directory $dirpart\n";
- print $state->{CVSROOT} . "/$state->{module}/$dirpart\n";
- print "Clear-sticky $dirpart\n";
- print $state->{CVSROOT} . "/$state->{module}/$dirpart\n";
$log->debug("Creating new file 'Created $dirpart'");
print "Created $dirpart\n";
# this is an "entries" line
my $kopts = kopts_from_path($filename,"sha1",$meta->{filehash});
- $log->debug("/$filepart/$meta->{revision}//$kopts/");
- print "/$filepart/$meta->{revision}//$kopts/\n";
+ my $entriesLine = "/$filepart/$meta->{revision}//$kopts/";
+ $entriesLine .= getStickyTagOrDate($stickyInfo);
+ $log->debug($entriesLine);
+ print "$entriesLine\n";
# permissions
$log->debug("SEND : u=$meta->{mode},g=$meta->{mode},o=$meta->{mode}");
my $kopts = kopts_from_path("$dirpart/$filepart",
"file",$mergedFile);
$log->debug("/$filepart/$meta->{revision}//$kopts/");
- print "/$filepart/$meta->{revision}//$kopts/\n";
+ my $entriesLine="/$filepart/$meta->{revision}//$kopts/";
+ $entriesLine .= getStickyTagOrDate($stickyInfo);
+ print "$entriesLine\n";
}
}
elsif ( $return == 1 )
print $state->{CVSROOT} . "/$state->{module}/$filename\n";
my $kopts = kopts_from_path("$dirpart/$filepart",
"file",$mergedFile);
- print "/$filepart/$meta->{revision}/+/$kopts/\n";
+ my $entriesLine = "/$filepart/$meta->{revision}/+/$kopts/";
+ $entriesLine .= getStickyTagOrDate($stickyInfo);
+ print "$entriesLine\n";
}
}
else
}
+ # prepDirForOutput() any other existing directories unless they already
+ # have the right sticky tag:
+ unless ( $state->{globaloptions}{-n} )
+ {
+ my $dir;
+ foreach $dir (keys(%{$state->{dirMap}}))
+ {
+ if( ! $seendirs{$dir} &&
+ exists($state->{dirArgs}{$dir}) )
+ {
+ my($oldTag);
+ $oldTag=$state->{dirMap}{$dir}{tagspec};
+
+ unless( ( exists($state->{opt}{A}) &&
+ defined($oldTag) ) ||
+ ( defined($state->{opt}{r}) &&
+ ( !defined($oldTag) ||
+ $state->{opt}{r} ne $oldTag ) ) )
+ # TODO?: OR sticky dir is different...
+ {
+ next;
+ }
+
+ prepDirForOutput(
+ $dir,
+ $repoDir,
+ ".",
+ \%seendirs,
+ 'update',
+ $state->{dirArgs} );
+ }
+
+ # TODO?: Consider sending a final duplicate Sticky response
+ # to more closely mimic real CVS.
+ }
+ }
+
print "ok\n";
}
my $updater = GITCVS::updater->new($state->{CVSROOT}, $state->{module}, $log);
$updater->update();
- # Remember where the head was at the beginning.
- my $parenthash = `git show-ref -s refs/heads/$state->{module}`;
- chomp $parenthash;
- if ($parenthash !~ /^[0-9a-f]{40}$/) {
- print "error 1 pserver cannot find the current HEAD of module";
- cleanupWorkTree();
- exit;
- }
-
- setupWorkTree($parenthash);
-
- $log->info("Lockless commit start, basing commit on '$work->{workDir}', index file is '$work->{index}'");
-
- $log->info("Created index '$work->{index}' for head $state->{module} - exit status $?");
-
my @committedfiles = ();
my %oldmeta;
+ my $stickyInfo;
+ my $branchRef;
+ my $parenthash;
# foreach file specified on the command line ...
foreach my $filename ( @{$state->{args}} )
next unless ( exists $state->{entries}{$filename}{modified_filename} or not $state->{entries}{$filename}{unchanged} );
- my $meta = $updater->getmeta($filename);
+ #####
+ # Figure out which branch and parenthash we are committing
+ # to, and setup worktree:
+
+ # should always come from entries:
+ my $fileStickyInfo = resolveStickyInfo($filename);
+ if( !defined($branchRef) )
+ {
+ $stickyInfo = $fileStickyInfo;
+ if( defined($stickyInfo) &&
+ ( defined($stickyInfo->{date}) ||
+ !defined($stickyInfo->{tag}) ) )
+ {
+ print "error 1 cannot commit with sticky date for file `$filename'\n";
+ cleanupWorkTree();
+ exit;
+ }
+
+ $branchRef = "refs/heads/$state->{module}";
+ if ( defined($stickyInfo) && defined($stickyInfo->{tag}) )
+ {
+ $branchRef = "refs/heads/$stickyInfo->{tag}";
+ }
+
+ $parenthash = `git show-ref -s $branchRef`;
+ chomp $parenthash;
+ if ($parenthash !~ /^[0-9a-f]{40}$/)
+ {
+ if ( defined($stickyInfo) && defined($stickyInfo->{tag}) )
+ {
+ print "error 1 sticky tag `$stickyInfo->{tag}' for file `$filename' is not a branch\n";
+ }
+ else
+ {
+ print "error 1 pserver cannot find the current HEAD of module";
+ }
+ cleanupWorkTree();
+ exit;
+ }
+
+ setupWorkTree($parenthash);
+
+ $log->info("Lockless commit start, basing commit on '$work->{workDir}', index file is '$work->{index}'");
+
+ $log->info("Created index '$work->{index}' for head $state->{module} - exit status $?");
+ }
+ elsif( !refHashEqual($stickyInfo,$fileStickyInfo) )
+ {
+ #TODO: We could split the cvs commit into multiple
+ # git commits by distinct stickyTag values, but that
+ # is lowish priority.
+ print "error 1 Committing different files to different"
+ . " branches is not currently supported\n";
+ cleanupWorkTree();
+ exit;
+ }
+
+ #####
+ # Process this file:
+
+ my $meta = $updater->getmeta($filename,$stickyInfo);
$oldmeta{$filename} = $meta;
my $wrev = revparse($filename);
}
### Emulate git-receive-pack by running hooks/update
- my @hook = ( $ENV{GIT_DIR}.'hooks/update', "refs/heads/$state->{module}",
+ my @hook = ( $ENV{GIT_DIR}.'hooks/update', $branchRef,
$parenthash, $commithash );
if( -x $hook[0] ) {
unless( system( @hook ) == 0 )
### Update the ref
if (system(qw(git update-ref -m), "cvsserver ci",
- "refs/heads/$state->{module}", $commithash, $parenthash)) {
+ $branchRef, $commithash, $parenthash)) {
$log->warn("update-ref for $state->{module} failed.");
print "error 1 Cannot commit -- update first\n";
cleanupWorkTree();
local $SIG{PIPE} = sub { die 'pipe broke' };
- print $pipe "$parenthash $commithash refs/heads/$state->{module}\n";
+ print $pipe "$parenthash $commithash $branchRef\n";
close $pipe || die "bad pipe: $! $?";
}
### Then hooks/post-update
$hook = $ENV{GIT_DIR}.'hooks/post-update';
if (-x $hook) {
- system($hook, "refs/heads/$state->{module}");
+ system($hook, $branchRef);
}
# foreach file specified on the command line ...
{
$filename = filecleanup($filename);
- my $meta = $updater->getmeta($filename);
+ my $meta = $updater->getmeta($filename,$stickyInfo);
unless (defined $meta->{revision}) {
$meta->{revision} = "1.1";
}
print "Checked-in $dirpart\n";
print "$filename\n";
my $kopts = kopts_from_path($filename,"sha1",$meta->{filehash});
- print "/$filepart/$meta->{revision}//$kopts/\n";
+ print "/$filepart/$meta->{revision}//$kopts/" .
+ getStickyTagOrDate($stickyInfo) . "\n";
}
}
next;
}
- my $meta = $updater->getmeta($filename);
- my $oldmeta = $meta;
-
my $wrev = revparse($filename);
+ my $stickyInfo = resolveStickyInfo($filename);
+ my $meta = $updater->getmeta($filename,$stickyInfo);
+ my $oldmeta = $meta;
+
# If the working copy is an old revision, lets get that
# version too for comparison.
if ( defined($wrev) and $wrev ne $meta->{revision} )
{
- $oldmeta = $updater->getmeta($filename, $wrev);
+ my($rmRev)=$wrev;
+ $rmRev=~s/^-//;
+ $oldmeta = $updater->getmeta($filename, $rmRev);
}
# TODO : All possible statuses aren't yet implemented
# same revision but there are local changes
if ( defined ( $wrev ) and defined($meta->{revision}) and
$wrev eq $meta->{revision} and
+ $wrev ne "0" and
$state->{entries}{$filename}{modified_filename} )
{
$status ||= "Locally Modified";
}
if ( defined ( $state->{entries}{$filename}{revision} ) and
- not defined ( $meta->{revision} ) )
+ ( !defined($meta->{revision}) ||
+ $meta->{revision} eq "0" ) )
{
$status ||= "Locally Added";
}
# be providing status on ...
argsfromdir($updater);
+ my($foundDiff);
+
# foreach file specified on the command line ...
- foreach my $filename ( @{$state->{args}} )
+ foreach my $argFilename ( @{$state->{args}} )
{
- $filename = filecleanup($filename);
+ my($filename) = filecleanup($argFilename);
my ( $fh, $file1, $file2, $meta1, $meta2, $filediff );
my $wrev = revparse($filename);
- # We need _something_ to diff against
- next unless ( defined ( $wrev ) );
+ # Priority for revision1:
+ # 1. First -r (missing file: check -N)
+ # 2. wrev from client's Entry line
+ # - missing line/file: check -N
+ # - "0": added file not committed (empty contents for rev1)
+ # - Prefixed with dash (to be removed): check -N
- # if we have a -r switch, use it
if ( defined ( $revision1 ) )
{
- ( undef, $file1 ) = tempfile( DIR => $TEMP_DIR, OPEN => 0 );
$meta1 = $updater->getmeta($filename, $revision1);
- unless ( defined ( $meta1 ) and $meta1->{filehash} ne "deleted" )
+ }
+ elsif( defined($wrev) && $wrev ne "0" )
+ {
+ my($rmRev)=$wrev;
+ $rmRev=~s/^-//;
+ $meta1 = $updater->getmeta($filename, $rmRev);
+ }
+ if ( !defined($meta1) ||
+ $meta1->{filehash} eq "deleted" )
+ {
+ if( !exists($state->{opt}{N}) )
{
- print "E File $filename at revision $revision1 doesn't exist\n";
+ if(defined($revision1))
+ {
+ print "E File $filename at revision $revision1 doesn't exist\n";
+ }
next;
}
- transmitfile($meta1->{filehash}, { targetfile => $file1 });
- }
- # otherwise we just use the working copy revision
- else
- {
- ( undef, $file1 ) = tempfile( DIR => $TEMP_DIR, OPEN => 0 );
- $meta1 = $updater->getmeta($filename, $wrev);
- transmitfile($meta1->{filehash}, { targetfile => $file1 });
+ elsif( !defined($meta1) )
+ {
+ $meta1 = {
+ name => $filename,
+ revision => '0',
+ filehash => 'deleted'
+ };
+ }
}
+ # Priority for revision2:
+ # 1. Second -r (missing file: check -N)
+ # 2. Modified file contents from client
+ # 3. wrev from client's Entry line
+ # - missing line/file: check -N
+ # - Prefixed with dash (to be removed): check -N
+
# if we have a second -r switch, use it too
if ( defined ( $revision2 ) )
{
- ( undef, $file2 ) = tempfile( DIR => $TEMP_DIR, OPEN => 0 );
$meta2 = $updater->getmeta($filename, $revision2);
-
- unless ( defined ( $meta2 ) and $meta2->{filehash} ne "deleted" )
- {
- print "E File $filename at revision $revision2 doesn't exist\n";
- next;
- }
-
- transmitfile($meta2->{filehash}, { targetfile => $file2 });
}
- # otherwise we just use the working copy
- else
+ elsif(defined($state->{entries}{$filename}{modified_filename}))
{
$file2 = $state->{entries}{$filename}{modified_filename};
+ $meta2 = {
+ name => $filename,
+ revision => '0',
+ filehash => 'modified'
+ };
}
-
- # if we have been given -r, and we don't have a $file2 yet, lets
- # get one
- if ( defined ( $revision1 ) and not defined ( $file2 ) )
+ elsif( defined($wrev) && ($wrev!~/^-/) )
{
- ( undef, $file2 ) = tempfile( DIR => $TEMP_DIR, OPEN => 0 );
+ if(!defined($revision1)) # no revision and no modifications:
+ {
+ next;
+ }
$meta2 = $updater->getmeta($filename, $wrev);
- transmitfile($meta2->{filehash}, { targetfile => $file2 });
+ }
+ if(!defined($file2))
+ {
+ if ( !defined($meta2) ||
+ $meta2->{filehash} eq "deleted" )
+ {
+ if( !exists($state->{opt}{N}) )
+ {
+ if(defined($revision2))
+ {
+ print "E File $filename at revision $revision2 doesn't exist\n";
+ }
+ next;
+ }
+ elsif( !defined($meta2) )
+ {
+ $meta2 = {
+ name => $filename,
+ revision => '0',
+ filehash => 'deleted'
+ };
+ }
+ }
}
- # We need to have retrieved something useful
- next unless ( defined ( $meta1 ) );
-
- # Files to date if the working copy and repo copy have the same
- # revision, and the working copy is unmodified
- if ( not defined ( $meta2 ) and $wrev eq $meta1->{revision} and
- ( ( $state->{entries}{$filename}{unchanged} and
- ( not defined ( $state->{entries}{$filename}{conflict} ) or
- $state->{entries}{$filename}{conflict} !~ /^\+=/ ) ) or
- ( defined($state->{entries}{$filename}{modified_hash}) and
- $state->{entries}{$filename}{modified_hash} eq
- $meta1->{filehash} ) ) )
+ if( $meta1->{filehash} eq $meta2->{filehash} )
{
+ $log->info("unchanged $filename");
next;
}
- # Apparently we only show diffs for locally modified files
- unless ( defined($meta2) or
- defined ( $state->{entries}{$filename}{modified_filename} ) )
+ # Retrieve revision contents:
+ ( undef, $file1 ) = tempfile( DIR => $TEMP_DIR, OPEN => 0 );
+ transmitfile($meta1->{filehash}, { targetfile => $file1 });
+
+ if(!defined($file2))
{
- next;
+ ( undef, $file2 ) = tempfile( DIR => $TEMP_DIR, OPEN => 0 );
+ transmitfile($meta2->{filehash}, { targetfile => $file2 });
}
- print "M Index: $filename\n";
+ # Generate the actual diff:
+ print "M Index: $argFilename\n";
print "M =======" . ( "=" x 60 ) . "\n";
print "M RCS file: $state->{CVSROOT}/$state->{module}/$filename,v\n";
- if ( defined ( $meta1 ) )
+ if ( defined ( $meta1 ) && $meta1->{revision} ne "0" )
{
print "M retrieving revision $meta1->{revision}\n"
}
- if ( defined ( $meta2 ) )
+ if ( defined ( $meta2 ) && $meta2->{revision} ne "0" )
{
print "M retrieving revision $meta2->{revision}\n"
}
}
}
}
- print "$filename\n";
+ print "$argFilename\n";
$log->info("Diffing $filename -r $meta1->{revision} -r " .
( $meta2->{revision} or "workingcopy" ));
- ( $fh, $filediff ) = tempfile ( DIR => $TEMP_DIR );
-
- if ( exists $state->{opt}{u} )
+ # TODO: Use --label instead of -L because -L is no longer
+ # documented and may go away someday. Not sure if there are
+ # versions that only support -L, which would make this change risky?
+ # http://osdir.com/ml/bug-gnu-utils-gnu/2010-12/msg00060.html
+ # ("man diff" should actually document the best migration strategy,
+ # [current behavior, future changes, old compatibility issues
+ # or lack thereof, etc], not just stop mentioning the option...)
+ # TODO: Real CVS seems to include a date in the label, before
+ # the revision part, without the keyword "revision". The following
+ # has minimal changes compared to original versions of
+ # git-cvsserver.perl. (Mostly tab vs space after filename.)
+
+ my (@diffCmd) = ( 'diff' );
+ if ( exists($state->{opt}{N}) )
{
- system("diff -u -L '$filename revision $meta1->{revision}'" .
- " -L '$filename " .
- ( defined($meta2->{revision}) ?
- "revision $meta2->{revision}" :
- "working copy" ) .
- "' $file1 $file2 > $filediff" );
- } else {
- system("diff $file1 $file2 > $filediff");
+ push @diffCmd,"-N";
}
+ if ( exists $state->{opt}{u} )
+ {
+ push @diffCmd,("-u","-L");
+ if( $meta1->{filehash} eq "deleted" )
+ {
+ push @diffCmd,"/dev/null";
+ } else {
+ push @diffCmd,("$argFilename\trevision $meta1->{revision}");
+ }
- while ( <$fh> )
+ if( defined($meta2->{filehash}) )
+ {
+ if( $meta2->{filehash} eq "deleted" )
+ {
+ push @diffCmd,("-L","/dev/null");
+ } else {
+ push @diffCmd,("-L",
+ "$argFilename\trevision $meta2->{revision}");
+ }
+ } else {
+ push @diffCmd,("-L","$argFilename\tworking copy");
+ }
+ }
+ push @diffCmd,($file1,$file2);
+ if(!open(DIFF,"-|",@diffCmd))
{
- print "M $_";
+ $log->warn("Unable to run diff: $!");
}
- close $fh;
+ my($diffLine);
+ while(defined($diffLine=<DIFF>))
+ {
+ print "M $diffLine";
+ $foundDiff=1;
+ }
+ close(DIFF);
}
- print "ok\n";
+ if($foundDiff)
+ {
+ print "error \n";
+ }
+ else
+ {
+ print "ok\n";
+ }
}
sub req_log
$opt = { A => 0, N => 0, P => 0, R => 0, c => 0, f => 0, l => 0, n => 0, p => 0, s => 0, r => 1, D => 1, d => 1, k => 1, j => 1, } if ( $type eq "co" );
$opt = { v => 0, l => 0, R => 0 } if ( $type eq "status" );
$opt = { A => 0, P => 0, C => 0, d => 0, f => 0, l => 0, R => 0, p => 0, k => 1, r => 1, D => 1, j => 1, I => 1, W => 1 } if ( $type eq "update" );
- $opt = { l => 0, R => 0, k => 1, D => 1, D => 1, r => 2 } if ( $type eq "diff" );
+ $opt = { l => 0, R => 0, k => 1, D => 1, r => 2, N => 0 } if ( $type eq "diff" );
$opt = { c => 0, R => 0, l => 0, f => 0, F => 1, m => 1, r => 1 } if ( $type eq "ci" );
$opt = { k => 1, m => 1 } if ( $type eq "add" );
$opt = { f => 0, l => 0, R => 0 } if ( $type eq "remove" );
}
}
-# This method uses $state->{directory} to populate $state->{args} with a list of filenames
-sub argsfromdir
+# Used by argsfromdir
+sub expandArg
{
- my $updater = shift;
-
- $state->{args} = [] if ( scalar(@{$state->{args}}) == 1 and $state->{args}[0] eq "." );
+ my ($updater,$outNameMap,$outDirMap,$path,$isDir) = @_;
- return if ( scalar ( @{$state->{args}} ) > 1 );
+ my $fullPath = filecleanup($path);
- my @gethead = @{$updater->gethead};
+ # Is it a directory?
+ if( defined($state->{dirMap}{$fullPath}) ||
+ defined($state->{dirMap}{"$fullPath/"}) )
+ {
+ # It is a directory in the user's sandbox.
+ $isDir=1;
- # push added files
- foreach my $file (keys %{$state->{entries}}) {
- if ( exists $state->{entries}{$file}{revision} &&
- $state->{entries}{$file}{revision} eq '0' )
- {
- push @gethead, { name => $file, filehash => 'added' };
- }
+ if(defined($state->{entries}{$fullPath}))
+ {
+ $log->fatal("Inconsistent file/dir type");
+ die "Inconsistent file/dir type";
+ }
}
-
- if ( scalar(@{$state->{args}}) == 1 )
+ elsif(defined($state->{entries}{$fullPath}))
{
- my $arg = $state->{args}[0];
- $arg .= $state->{prependdir} if ( defined ( $state->{prependdir} ) );
-
- $log->info("Only one arg specified, checking for directory expansion on '$arg'");
-
- foreach my $file ( @gethead )
+ # It is a file in the user's sandbox.
+ $isDir=0;
+ }
+ my($revDirMap,$otherRevDirMap);
+ if(!defined($isDir) || $isDir)
+ {
+ # Resolve version tree for sticky tag:
+ # (for now we only want the list of files for the version, not
+ # particular versions of those files: assume it is a directory
+ # for the moment; ignore the Entry's sticky tag)
+
+ # Order of precedence of sticky tags:
+ # -A [head]
+ # -r /tag/
+ # [file entry sticky tag, but that is only relevant to files]
+ # [the tag specified in dir req_Sticky]
+ # [the tag specified in a parent dir req_Sticky]
+ # [head]
+ # Also, -r may appear twice (for diff).
+ #
+ # FUTURE: When/if -j (merges) are supported, we also
+ # need to add relevant files from one or two
+ # versions specified with -j.
+
+ if(exists($state->{opt}{A}))
{
- next if ( $file->{filehash} eq "deleted" and not defined ( $state->{entries}{$file->{name}} ) );
- next unless ( $file->{name} =~ /^$arg\// or $file->{name} eq $arg );
- push @{$state->{args}}, $file->{name};
+ $revDirMap=$updater->getRevisionDirMap();
}
-
- shift @{$state->{args}} if ( scalar(@{$state->{args}}) > 1 );
- } else {
- $log->info("Only one arg specified, populating file list automatically");
-
- $state->{args} = [];
-
- foreach my $file ( @gethead )
+ elsif( defined($state->{opt}{r}) and
+ ref $state->{opt}{r} eq "ARRAY" )
{
- next if ( $file->{filehash} eq "deleted" and not defined ( $state->{entries}{$file->{name}} ) );
- next unless ( $file->{name} =~ s/^$state->{prependdir}// );
- push @{$state->{args}}, $file->{name};
+ $revDirMap=$updater->getRevisionDirMap($state->{opt}{r}[0]);
+ $otherRevDirMap=$updater->getRevisionDirMap($state->{opt}{r}[1]);
}
- }
-}
-
-# This method cleans up the $state variable after a command that uses arguments has run
+ elsif(defined($state->{opt}{r}))
+ {
+ $revDirMap=$updater->getRevisionDirMap($state->{opt}{r});
+ }
+ else
+ {
+ my($sticky)=getDirStickyInfo($fullPath);
+ $revDirMap=$updater->getRevisionDirMap($sticky->{tag});
+ }
+
+ # Is it a directory?
+ if( defined($revDirMap->{$fullPath}) ||
+ defined($otherRevDirMap->{$fullPath}) )
+ {
+ $isDir=1;
+ }
+ }
+
+ # What to do with it?
+ if(!$isDir)
+ {
+ $outNameMap->{$fullPath}=1;
+ }
+ else
+ {
+ $outDirMap->{$fullPath}=1;
+
+ if(defined($revDirMap->{$fullPath}))
+ {
+ addDirMapFiles($updater,$outNameMap,$outDirMap,
+ $revDirMap->{$fullPath});
+ }
+ if( defined($otherRevDirMap) &&
+ defined($otherRevDirMap->{$fullPath}) )
+ {
+ addDirMapFiles($updater,$outNameMap,$outDirMap,
+ $otherRevDirMap->{$fullPath});
+ }
+ }
+}
+
+# Used by argsfromdir
+# Add entries from dirMap to outNameMap. Also recurse into entries
+# that are subdirectories.
+sub addDirMapFiles
+{
+ my($updater,$outNameMap,$outDirMap,$dirMap)=@_;
+
+ my($fullName);
+ foreach $fullName (keys(%$dirMap))
+ {
+ my $cleanName=$fullName;
+ if(defined($state->{prependdir}))
+ {
+ if(!($cleanName=~s/^\Q$state->{prependdir}\E//))
+ {
+ $log->fatal("internal error stripping prependdir");
+ die "internal error stripping prependdir";
+ }
+ }
+
+ if($dirMap->{$fullName} eq "F")
+ {
+ $outNameMap->{$cleanName}=1;
+ }
+ elsif($dirMap->{$fullName} eq "D")
+ {
+ if(!$state->{opt}{l})
+ {
+ expandArg($updater,$outNameMap,$outDirMap,$cleanName,1);
+ }
+ }
+ else
+ {
+ $log->fatal("internal error in addDirMapFiles");
+ die "internal error in addDirMapFiles";
+ }
+ }
+}
+
+# This method replaces $state->{args} with a directory-expanded
+# list of all relevant filenames (recursively unless -l), based
+# on $state->{entries}, and the "current" list of files in
+# each directory. "Current" files are determined by
+# either the requested (-r/-A) or "req_Sticky" version of
+# that directory.
+# Both the input args and the new output args are relative
+# to the cvs-client's CWD, although some of the internal
+# computations are relative to the top of the project.
+sub argsfromdir
+{
+ my $updater = shift;
+
+ # Notes about requirements for specific callers:
+ # update # "standard" case (entries; a single -r/-A/default; -l)
+ # # Special case: -d to create missing directories.
+ # diff # 0 or 1 -r's: "standard" case.
+ # # 2 -r's: We could ignore entries (just use the two -r's),
+ # # but it doesn't really matter.
+ # annotate # "standard" case
+ # log # Punting: log -r has a more complex non-"standard"
+ # # meaning, and we don't currently try to support log'ing
+ # # branches at all (need a lot of work to
+ # # support CVS-consistent branch relative version
+ # # numbering).
+#HERE: But we still want to expand directories. Maybe we should
+# essentially force "-A".
+ # status # "standard", except that -r/-A/default are not possible.
+ # # (Mostly used to expand the entries list.)
+ #
+ # Don't use argsfromdir at all:
+ # add # Explicit arguments required. Directory args imply add
+ # # the directory itself, not the files in it.
+ # co # Obtain list directly.
+ # remove # HERE: TEST: MAYBE client does the recursion for us,
+ # # since it only makes sense to remove stuff already in
+ # # the sandbox?
+ # ci # HERE: Similar to remove...
+ # # Don't try to implement the confusing/weird
+ # # ci -r bug er.."feature".
+
+ if(scalar(@{$state->{args}})==0)
+ {
+ $state->{args} = [ "." ];
+ }
+ my %allArgs;
+ my %allDirs;
+ for my $file (@{$state->{args}})
+ {
+ expandArg($updater,\%allArgs,\%allDirs,$file);
+ }
+
+ # Include any entries from sandbox. Generally client won't
+ # send entries that shouldn't be used.
+ foreach my $file (keys %{$state->{entries}})
+ {
+ $allArgs{remove_prependdir($file)} = 1;
+ }
+
+ $state->{dirArgs} = \%allDirs;
+ $state->{args} = [
+ sort {
+ # Sort priority: by directory depth, then actual file name:
+ my @piecesA=split('/',$a);
+ my @piecesB=split('/',$b);
+
+ my $count=scalar(@piecesA);
+ my $tmp=scalar(@piecesB);
+ return $count<=>$tmp if($count!=$tmp);
+
+ for($tmp=0;$tmp<$count;$tmp++)
+ {
+ if($piecesA[$tmp] ne $piecesB[$tmp])
+ {
+ return $piecesA[$tmp] cmp $piecesB[$tmp]
+ }
+ }
+ return 0;
+ } keys(%allArgs) ];
+}
+
+## look up directory sticky tag, of either fullPath or a parent:
+sub getDirStickyInfo
+{
+ my($fullPath)=@_;
+
+ $fullPath=~s%/+$%%;
+ while($fullPath ne "" && !defined($state->{dirMap}{"$fullPath/"}))
+ {
+ $fullPath=~s%/?[^/]*$%%;
+ }
+
+ if( !defined($state->{dirMap}{"$fullPath/"}) &&
+ ( $fullPath eq "" ||
+ $fullPath eq "." ) )
+ {
+ return $state->{dirMap}{""}{stickyInfo};
+ }
+ else
+ {
+ return $state->{dirMap}{"$fullPath/"}{stickyInfo};
+ }
+}
+
+# Resolve precedence of various ways of specifying which version of
+# a file you want. Returns undef (for default head), or a ref to a hash
+# that contains "tag" and/or "date" keys.
+sub resolveStickyInfo
+{
+ my($filename,$stickyTag,$stickyDate,$reset) = @_;
+
+ # Order of precedence of sticky tags:
+ # -A [head]
+ # -r /tag/
+ # [file entry sticky tag]
+ # [the tag specified in dir req_Sticky]
+ # [the tag specified in a parent dir req_Sticky]
+ # [head]
+
+ my $result;
+ if($reset)
+ {
+ # $result=undef;
+ }
+ elsif( defined($stickyTag) && $stickyTag ne "" )
+ # || ( defined($stickyDate) && $stickyDate ne "" ) # TODO
+ {
+ $result={ 'tag' => (defined($stickyTag)?$stickyTag:undef) };
+
+ # TODO: Convert -D value into the form 2011.04.10.04.46.57,
+ # similar to an entry line's sticky date, without the D prefix.
+ # It sometimes (always?) arrives as something more like
+ # '10 Apr 2011 04:46:57 -0000'...
+ # $result={ 'date' => (defined($stickyDate)?$stickyDate:undef) };
+ }
+ elsif( defined($state->{entries}{$filename}) &&
+ defined($state->{entries}{$filename}{tag_or_date}) &&
+ $state->{entries}{$filename}{tag_or_date} ne "" )
+ {
+ my($tagOrDate)=$state->{entries}{$filename}{tag_or_date};
+ if($tagOrDate=~/^T([^ ]+)\s*$/)
+ {
+ $result = { 'tag' => $1 };
+ }
+ elsif($tagOrDate=~/^D([0-9.]+)\s*$/)
+ {
+ $result= { 'date' => $1 };
+ }
+ else
+ {
+ die "Unknown tag_or_date format\n";
+ }
+ }
+ else
+ {
+ $result=getDirStickyInfo($filename);
+ }
+
+ return $result;
+}
+
+# Convert a stickyInfo (ref to a hash) as returned by resolveStickyInfo into
+# a form appropriate for the sticky tag field of an Entries
+# line (field index 5, 0-based).
+sub getStickyTagOrDate
+{
+ my($stickyInfo)=@_;
+
+ my $result;
+ if(defined($stickyInfo) && defined($stickyInfo->{tag}))
+ {
+ $result="T$stickyInfo->{tag}";
+ }
+ # TODO: When/if we actually pick versions by {date} properly,
+ # also handle it here:
+ # "D$stickyInfo->{date}" (example: "D2011.04.13.20.37.07").
+ else
+ {
+ $result="";
+ }
+
+ return $result;
+}
+
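# Illustrative sketch, not part of this patch: how the two helpers above fit
# together. resolveStickyInfo() picks a version source by precedence (-A,
# then -r, then the file's Entry line, then the directory's req_Sticky value),
# and getStickyTagOrDate() renders the result for the sticky field of an
# Entries line. Filenames and tag names are made-up examples.
#
#   resolveStickyInfo('src/foo.c', 'mybranch', undef, 0)
#     # => { 'tag' => 'mybranch' }   (-r takes precedence over Entry/dir sticky)
#   resolveStickyInfo('src/foo.c', undef, undef, 1)
#     # => undef                     (-A resets to the default head)
#   getStickyTagOrDate({ 'tag' => 'mybranch' })   # => "Tmybranch"
#   getStickyTagOrDate(undef)                     # => ""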
+# This method cleans up the $state variable after a command that uses arguments has run
sub statecleanup
{
$state->{files} = [];
+ $state->{dirArgs} = {};
$state->{args} = [];
$state->{arguments} = [];
$state->{entries} = {};
+ $state->{dirMap} = {};
}
# Return working directory CVS revision "1.X" out
return ( $filepart, $dirpart );
}
+# Clean up various junk in the filename (try to canonicalize it), and
+# add prependdir to accommodate running the CVS client from a
+# subdirectory (so the output is relative to the top directory of the project).
sub filecleanup
{
my $filename = shift;
return undef;
}
+ if($filename eq ".")
+ {
+ $filename="";
+ }
$filename =~ s/^\.\///g;
+ $filename =~ s%/+%/%g;
$filename = $state->{prependdir} . $filename;
+ $filename =~ s%/$%%;
return $filename;
}
+# Remove prependdir from the path, so that it is relative to the directory
+# the CVS client was started from, rather than the top of the project.
+# Essentially the inverse of filecleanup().
+sub remove_prependdir
+{
+ my($path) = @_;
+ if(defined($state->{prependdir}) && $state->{prependdir} ne "")
+ {
+ my($pre)=$state->{prependdir};
+ $pre=~s%/$%%;
+ if(!($path=~s%^\Q$pre\E/?%%))
+ {
+ $log->fatal("internal error missing prependdir");
+ die("internal error missing prependdir");
+ }
+ }
+ return $path;
+}
+
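# Illustrative sketch, not part of this patch: filecleanup() and
# remove_prependdir() above are inverses of each other. Assuming the client
# was started in a subdirectory so that $state->{prependdir} is "sub/dir/"
# (a made-up value):
#
#   filecleanup('./foo//bar.c')              # => "sub/dir/foo/bar.c"
#   filecleanup('.')                         # => "sub/dir"
#   remove_prependdir('sub/dir/foo/bar.c')   # => "foo/bar.c"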
sub validateGitDir
{
if( !defined($state->{CVSROOT}) )
return $ret;
}
+# Test if the (deep) values of two references to a hash are the same.
+sub refHashEqual
+{
+ my($v1,$v2) = @_;
+
+ my $out;
+ if(!defined($v1))
+ {
+ if(!defined($v2))
+ {
+ $out=1;
+ }
+ }
+ elsif( !defined($v2) ||
+ scalar(keys(%{$v1})) != scalar(keys(%{$v2})) )
+ {
+ # $out=undef;
+ }
+ else
+ {
+ $out=1;
+
+ my $key;
+ foreach $key (keys(%{$v1}))
+ {
+ if( !exists($v2->{$key}) ||
+ defined($v1->{$key}) ne defined($v2->{$key}) ||
+ ( defined($v1->{$key}) &&
+ $v1->{$key} ne $v2->{$key} ) )
+ {
+ $out=undef;
+ last;
+ }
+ }
+ }
+
+ return $out;
+}
+
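# Illustrative sketch, not part of this patch: refHashEqual() above compares
# two hash references by their scalar values, which is how req_ci decides
# whether two files carry the same sticky info. Example values are made up.
#
#   refHashEqual({ tag => 'b1' }, { tag => 'b1' })   # => 1     (equal)
#   refHashEqual({ tag => 'b1' }, { tag => 'b2' })   # => undef (different)
#   refHashEqual(undef, undef)                       # => 1     (equal)
#   refHashEqual({ tag => 'b1' }, undef)             # => undef (different)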
package GITCVS::log;
die "Git repo '$self->{git_path}' doesn't exist" unless ( -d $self->{git_path} );
+ # Stores full sha1's for various branch/tag names, abbreviations, etc:
+ $self->{commitRefCache} = {};
+
$self->{dbdriver} = $cfg->{gitcvs}{$state->{method}}{dbdriver} ||
$cfg->{gitcvs}{dbdriver} || "SQLite";
$self->{dbname} = $cfg->{gitcvs}{$state->{method}}{dbname} ||
my $lastcommit = $self->_get_prop("last_commit");
if (defined $lastcommit && $lastcommit eq $commitsha1) { # up-to-date
+ # invalidate the gethead cache
+ $self->clearCommitRefCaches();
return 1;
}
push @git_log_params, $self->{module};
}
# git-rev-list is the backend / plumbing version of git-log
- open(GITLOG, '-|', 'git', 'rev-list', @git_log_params) or die "Cannot call git-rev-list: $!";
-
- my @commits;
-
- my %commit = ();
-
- while ( <GITLOG> )
- {
- chomp;
- if (m/^commit\s+(.*)$/) {
- # on ^commit lines put the just seen commit in the stack
- # and prime things for the next one
- if (keys %commit) {
- my %copy = %commit;
- unshift @commits, \%copy;
- %commit = ();
- }
- my @parents = split(m/\s+/, $1);
- $commit{hash} = shift @parents;
- $commit{parents} = \@parents;
- } elsif (m/^(\w+?):\s+(.*)$/ && !exists($commit{message})) {
- # on rfc822-like lines seen before we see any message,
- # lowercase the entry and put it in the hash as key-value
- $commit{lc($1)} = $2;
- } else {
- # message lines - skip initial empty line
- # and trim whitespace
- if (!exists($commit{message}) && m/^\s*$/) {
- # define it to mark the end of headers
- $commit{message} = '';
- next;
- }
- s/^\s+//; s/\s+$//; # trim ws
- $commit{message} .= $_ . "\n";
- }
- }
- close GITLOG;
-
- unshift @commits, \%commit if ( keys %commit );
+ open(my $gitLogPipe, '-|', 'git', 'rev-list', @git_log_params)
+ or die "Cannot call git-rev-list: $!";
+ my @commits=readCommits($gitLogPipe);
+ close $gitLogPipe;
# Now all the commits are in the @commits bucket
# ordered by time DESC. for each commit that needs processing,
}
# convert the date to CVS-happy format
- $commit->{date} = "$2 $1 $4 $3 $5" if ( $commit->{date} =~ /^\w+\s+(\w+)\s+(\d+)\s+(\d+:\d+:\d+)\s+(\d+)\s+([+-]\d+)$/ );
+ my $cvsDate = convertToCvsDate($commit->{date});
if ( defined ( $lastpicked ) )
{
while ( <FILELIST> )
{
chomp;
- unless ( /^:\d{6}\s+\d{3}(\d)\d{2}\s+[a-zA-Z0-9]{40}\s+([a-zA-Z0-9]{40})\s+(\w)$/o )
+ unless ( /^:\d{6}\s+([0-7]{6})\s+[a-f0-9]{40}\s+([a-f0-9]{40})\s+(\w)$/o )
{
die("Couldn't process git-diff-tree line : $_");
}
# $log->debug("File mode=$mode, hash=$hash, change=$change, name=$name");
- my $git_perms = "";
- $git_perms .= "r" if ( $mode & 4 );
- $git_perms .= "w" if ( $mode & 2 );
- $git_perms .= "x" if ( $mode & 1 );
- $git_perms = "rw" if ( $git_perms eq "" );
+ my $dbMode = convertToDbMode($mode);
if ( $change eq "D" )
{
revision => $head->{$name}{revision} + 1,
filehash => "deleted",
commithash => $commit->{hash},
- modified => $commit->{date},
+ modified => $cvsDate,
author => $commit->{author},
- mode => $git_perms,
+ mode => $dbMode,
};
- $self->insert_rev($name, $head->{$name}{revision}, $hash, $commit->{hash}, $commit->{date}, $commit->{author}, $git_perms);
+ $self->insert_rev($name, $head->{$name}{revision}, $hash, $commit->{hash}, $cvsDate, $commit->{author}, $dbMode);
}
elsif ( $change eq "M" || $change eq "T" )
{
revision => $head->{$name}{revision} + 1,
filehash => $hash,
commithash => $commit->{hash},
- modified => $commit->{date},
+ modified => $cvsDate,
author => $commit->{author},
- mode => $git_perms,
+ mode => $dbMode,
};
- $self->insert_rev($name, $head->{$name}{revision}, $hash, $commit->{hash}, $commit->{date}, $commit->{author}, $git_perms);
+ $self->insert_rev($name, $head->{$name}{revision}, $hash, $commit->{hash}, $cvsDate, $commit->{author}, $dbMode);
}
elsif ( $change eq "A" )
{
revision => $head->{$name}{revision} ? $head->{$name}{revision}+1 : 1,
filehash => $hash,
commithash => $commit->{hash},
- modified => $commit->{date},
+ modified => $cvsDate,
author => $commit->{author},
- mode => $git_perms,
+ mode => $dbMode,
};
- $self->insert_rev($name, $head->{$name}{revision}, $hash, $commit->{hash}, $commit->{date}, $commit->{author}, $git_perms);
+ $self->insert_rev($name, $head->{$name}{revision}, $hash, $commit->{hash}, $cvsDate, $commit->{author}, $dbMode);
}
else
{
die("Couldn't process git-ls-tree line : $_");
}
- my ( $git_perms, $git_type, $git_hash, $git_filename ) = ( $1, $2, $3, $4 );
+ my ( $mode, $git_type, $git_hash, $git_filename ) = ( $1, $2, $3, $4 );
$seen_files->{$git_filename} = 1;
$head->{$git_filename}{mode}
);
- if ( $git_perms =~ /^\d\d\d(\d)\d\d/o )
- {
- $git_perms = "";
- $git_perms .= "r" if ( $1 & 4 );
- $git_perms .= "w" if ( $1 & 2 );
- $git_perms .= "x" if ( $1 & 1 );
- } else {
- $git_perms = "rw";
- }
+ my $dbMode = convertToDbMode($mode);
# unless the file exists with the same hash, we need to update it ...
- unless ( defined($oldhash) and $oldhash eq $git_hash and defined($oldmode) and $oldmode eq $git_perms )
+ unless ( defined($oldhash) and $oldhash eq $git_hash and defined($oldmode) and $oldmode eq $dbMode )
{
my $newrevision = ( $oldrevision or 0 ) + 1;
revision => $newrevision,
filehash => $git_hash,
commithash => $commit->{hash},
- modified => $commit->{date},
+ modified => $cvsDate,
author => $commit->{author},
- mode => $git_perms,
+ mode => $dbMode,
};
- $self->insert_rev($git_filename, $newrevision, $git_hash, $commit->{hash}, $commit->{date}, $commit->{author}, $git_perms);
+ $self->insert_rev($git_filename, $newrevision, $git_hash, $commit->{hash}, $cvsDate, $commit->{author}, $dbMode);
}
}
close FILELIST;
$head->{$file}{revision}++;
$head->{$file}{filehash} = "deleted";
$head->{$file}{commithash} = $commit->{hash};
- $head->{$file}{modified} = $commit->{date};
+ $head->{$file}{modified} = $cvsDate;
$head->{$file}{author} = $commit->{author};
- $self->insert_rev($file, $head->{$file}{revision}, $head->{$file}{filehash}, $commit->{hash}, $commit->{date}, $commit->{author}, $head->{$file}{mode});
+ $self->insert_rev($file, $head->{$file}{revision}, $head->{$file}{filehash}, $commit->{hash}, $cvsDate, $commit->{author}, $head->{$file}{mode});
}
}
# END : "Detect deleted files"
);
}
# invalidate the gethead cache
- $self->{gethead_cache} = undef;
+ $self->clearCommitRefCaches();
# Ending exclusive lock here
$self->{dbh}->commit() or die "Failed to commit changes to SQLite";
}
+sub readCommits
+{
+ my $pipeHandle = shift;
+ my @commits;
+
+ my %commit = ();
+
+ while ( <$pipeHandle> )
+ {
+ chomp;
+ if (m/^commit\s+(.*)$/) {
+ # on ^commit lines put the just seen commit in the stack
+ # and prime things for the next one
+ if (keys %commit) {
+ my %copy = %commit;
+ unshift @commits, \%copy;
+ %commit = ();
+ }
+ my @parents = split(m/\s+/, $1);
+ $commit{hash} = shift @parents;
+ $commit{parents} = \@parents;
+ } elsif (m/^(\w+?):\s+(.*)$/ && !exists($commit{message})) {
+ # on rfc822-like lines seen before we see any message,
+ # lowercase the entry and put it in the hash as key-value
+ $commit{lc($1)} = $2;
+ } else {
+ # message lines - skip initial empty line
+ # and trim whitespace
+ if (!exists($commit{message}) && m/^\s*$/) {
+ # define it to mark the end of headers
+ $commit{message} = '';
+ next;
+ }
+ s/^\s+//; s/\s+$//; # trim ws
+ $commit{message} .= $_ . "\n";
+ }
+ }
+
+ unshift @commits, \%commit if ( keys %commit );
+
+ return @commits;
+}
+
+sub convertToCvsDate
+{
+ my $date = shift;
+ # Convert from: "git rev-list --pretty" formatted date
+ # Convert to: "the format specified by RFC822 as modified by RFC1123."
+ # Example: 26 May 1997 13:01:40 -0400
+ if( $date =~ /^\w+\s+(\w+)\s+(\d+)\s+(\d+:\d+:\d+)\s+(\d+)\s+([+-]\d+)$/ )
+ {
+ $date = "$2 $1 $4 $3 $5";
+ }
+
+ return $date;
+}
+
+sub convertToDbMode
+{
+ my $mode = shift;
+
+ # NOTE: The CVS protocol uses a string similar to "u=rw,g=rw,o=rw",
+ # but the database "mode" column historically (and currently)
+ # only stores the "rw" (for user) part of the string.
+ # FUTURE: It might make more sense to persist the raw
+ # octal mode (or perhaps the final full CVS form) instead of
+ # this half-converted form, but it isn't currently worth the
+ # backwards compatibility headaches.
+
+ $mode=~/^\d{3}(\d)\d{2}$/;
+ my $userBits=$1;
+
+ my $dbMode = "";
+ $dbMode .= "r" if ( $userBits & 4 );
+ $dbMode .= "w" if ( $userBits & 2 );
+ $dbMode .= "x" if ( $userBits & 1 );
+ $dbMode = "rw" if ( $dbMode eq "" );
+
+ return $dbMode;
+}
+
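# Illustrative sketch, not part of this patch: convertToDbMode() above keeps
# only the user-permission digit of the six-digit git file mode, in the
# historical database form.
#
#   convertToDbMode('100644')   # => "rw"   (regular file)
#   convertToDbMode('100755')   # => "rwx"  (executable file)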
sub insert_rev
{
my $self = shift;
return $tree;
}
+=head2 getAnyHead
+
+Returns a reference to an array of getmeta structures, one
+per file in the specified tree hash.
+
+=cut
+
+sub getAnyHead
+{
+ my ($self,$hash) = @_;
+
+ if(!defined($hash))
+ {
+ return $self->gethead();
+ }
+
+ my @files;
+ {
+ open(my $filePipe, '-|', 'git', 'ls-tree', '-z', '-r', $hash)
+ or die("Cannot call git-ls-tree : $!");
+ local $/ = "\0";
+ @files=<$filePipe>;
+ close $filePipe;
+ }
+
+ my $tree=[];
+ my($line);
+ foreach $line (@files)
+ {
+ $line=~s/\0$//;
+ unless ( $line=~/^(\d+)\s+(\w+)\s+([a-zA-Z0-9]+)\t(.*)$/o )
+ {
+ die("Couldn't process git-ls-tree line : $line");
+ }
+
+ my($mode, $git_type, $git_hash, $git_filename) = ($1, $2, $3, $4);
+ push @$tree, $self->getMetaFromCommithash($git_filename,$hash);
+ }
+
+ return $tree;
+}
+
+=head2 getRevisionDirMap
+
+A "revision dir map" contains all the plain-file filenames associated
+with a particular revision (treeish), organized by directory:
+
+ $type = $out->{$dir}{$fullName}
+
+The type of each is "F" (for ordinary file) or "D" (for directory,
+for which the map $out->{$fullName} will also exist).
+
+=cut
+
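# Illustrative sketch, not part of this patch: the shape of a revision dir
# map for a made-up tree containing "Makefile" and "src/main.c":
#
#   {
#     ''    => { 'Makefile'   => 'F', 'src' => 'D' },
#     'src' => { 'src/main.c' => 'F' },
#   }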
+sub getRevisionDirMap
+{
+ my ($self,$ver)=@_;
+
+ if(!defined($self->{revisionDirMapCache}))
+ {
+ $self->{revisionDirMapCache}={};
+ }
+
+ # Get file list (previously cached results are dependent on HEAD,
+ # but are checked early in each case):
+ my $cacheKey;
+ my (@fileList);
+ if( !defined($ver) || $ver eq "" )
+ {
+ $cacheKey="";
+ if( defined($self->{revisionDirMapCache}{$cacheKey}) )
+ {
+ return $self->{revisionDirMapCache}{$cacheKey};
+ }
+
+ my @head = @{$self->gethead()};
+ foreach my $file ( @head )
+ {
+ next if ( $file->{filehash} eq "deleted" );
+
+ push @fileList,$file->{name};
+ }
+ }
+ else
+ {
+ my ($hash)=$self->lookupCommitRef($ver);
+ if( !defined($hash) )
+ {
+ return undef;
+ }
+
+ $cacheKey=$hash;
+ if( defined($self->{revisionDirMapCache}{$cacheKey}) )
+ {
+ return $self->{revisionDirMapCache}{$cacheKey};
+ }
+
+ open(my $filePipe, '-|', 'git', 'ls-tree', '-z', '-r', $hash)
+ or die("Cannot call git-ls-tree : $!");
+ local $/ = "\0";
+ while ( <$filePipe> )
+ {
+ chomp;
+ unless ( /^(\d+)\s+(\w+)\s+([a-zA-Z0-9]+)\t(.*)$/o )
+ {
+ die("Couldn't process git-ls-tree line : $_");
+ }
+
+ my($mode, $git_type, $git_hash, $git_filename) = ($1, $2, $3, $4);
+
+ push @fileList, $git_filename;
+ }
+ close $filePipe;
+ }
+
+ # Convert to normalized form:
+ my %revMap;
+ my $file;
+ foreach $file (@fileList)
+ {
+ my($dir) = ($file=~m%^(?:(.*)/)?([^/]*)$%);
+ $dir='' if(!defined($dir));
+
+ # parent directories:
+ # ... create empty dir maps for parent dirs:
+ my($td)=$dir;
+ while(!defined($revMap{$td}))
+ {
+ $revMap{$td}={};
+
+ my($tp)=($td=~m%^(?:(.*)/)?([^/]*)$%);
+ $tp='' if(!defined($tp));
+ $td=$tp;
+ }
+ # ... add children to parent maps (now that they exist):
+ $td=$dir;
+ while($td ne "")
+ {
+ my($tp)=($td=~m%^(?:(.*)/)?([^/]*)$%);
+ $tp='' if(!defined($tp));
+
+ if(defined($revMap{$tp}{$td}))
+ {
+ if($revMap{$tp}{$td} ne 'D')
+ {
+ die "Weird file/directory inconsistency in $cacheKey";
+ }
+ last; # loop exit
+ }
+ $revMap{$tp}{$td}='D';
+
+ $td=$tp;
+ }
+
+ # file
+ $revMap{$dir}{$file}='F';
+ }
+
+ # Save in cache:
+ $self->{revisionDirMapCache}{$cacheKey}=\%revMap;
+ return $self->{revisionDirMapCache}{$cacheKey};
+}
+
=head2 getlog
See also gethistorydense().
This function takes a filename (with path) argument and returns a hashref of
metadata for that file.
+There are several ways $revision can be specified:
+
+ - A reference to a hash that contains a "tag" that is the
+ actual revision (one of the below). TODO: Also allow it to
+ specify a "date" in the hash.
+ - undef, to refer to the latest version on the main branch.
+ - Full CVS client revision number (mapped to integer in DB, without the
+ "1." prefix),
+ - Complex CVS-compatible "special" revision number for
+ non-linear history (see comment below)
+ - git commit sha1 hash
+ - branch or tag name
+
=cut
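# Illustrative sketch, not part of this patch: the $revision forms accepted by
# getmeta(), assuming $updater is the GITCVS::updater instance used elsewhere
# in this script; the file name, tag, and sha1 values are made-up examples.
#
#   $updater->getmeta('src/foo.c');                         # head of the main branch
#   $updater->getmeta('src/foo.c', '1.7');                  # CVS revision number
#   $updater->getmeta('src/foo.c', { tag => 'mybranch' });  # sticky info hash
#   $updater->getmeta('src/foo.c', $commitSha1);            # 40-char git commit sha1
#   $updater->getmeta('src/foo.c', 'v1.2.3');               # branch or tag name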
sub getmeta
my $tablename_rev = $self->tablename("revision");
my $tablename_head = $self->tablename("head");
- my $db_query;
- if ( defined($revision) and $revision =~ /^1\.(\d+)$/ )
+ if ( ref($revision) eq "HASH" )
{
- my ($intRev) = $1;
- $db_query = $self->{dbh}->prepare_cached("SELECT * FROM $tablename_rev WHERE name=? AND revision=?",{},1);
- $db_query->execute($filename, $intRev);
+ $revision = $revision->{tag};
}
- elsif ( defined($revision) and $revision =~ /^[a-zA-Z0-9]{40}$/ )
+
+ # Overview of CVS revision numbers:
+ #
+ # General CVS numbering scheme:
+ # - Basic mainline branch numbers: "1.1", "1.2", "1.3", etc.
+ # - Result of "cvs checkin -r" (possible, but not really
+ # recommended): "2.1", "2.2", etc
+ # - Branch tag: "1.2.0.n", where "1.2" is the revision it was branched
+ # from, "0" is a magic placeholder that identifies it as a
+ # branch tag instead of a version tag, and n is 2 times the
+ # branch number off of "1.2", starting with "2".
+ # - Version on a branch: "1.2.n.x", where "1.2" is branch-from, "n"
+ # is branch number off of "1.2" (like n above), and "x" is
+ # the version number on the branch.
+ # - Branches can branch off of branches: "1.3.2.7.4.1" (even number
+ # of components).
+ # - Odd "n"s are used by "vendor branches" that result
+ # from "cvs import". Vendor branches have additional
+ # strangeness in the sense that the main rcs "head" of the main
+ # branch will (temporarily until first normal commit) point
+ # to the version on the vendor branch, rather than the actual
+ # main branch. (FUTURE: This may provide an opportunity
+ # to use "strange" revision numbers for fast-forward-merged
+ # branch tip when CVS client is asking for the main branch.)
+ #
+ # git-cvsserver CVS-compatible special numbering schemes:
+ # - Currently git-cvsserver only tries to be identical to CVS for
+ # simple "1.x" numbers on the "main" branch (as identified
+ # by the module name that was originally cvs checkout'ed).
+ # - The database only stores the "x" part, for historical reasons.
+ # But most of the rest of the cvsserver preserves
+ # and thinks using the full revision number.
+ # - To handle non-linear history, it uses a version of the form
+ # "2.1.1.2000.b.b.b."..., where the 2.1.1.2000 is to help uniquely
+ # identify this as a special revision number, and there are
+ # 20 b's that together encode the sha1 git commit from which
+ # this version of this file originated. Each b is
+ # the numerical value of the corresponding byte plus
+ # 100. (An illustrative encode/decode sketch appears after this function.)
+ # - "plus 100" avoids "0"s, and also reduces the
+ # likelihood of a collision in the case that someone someday
+ # writes an import tool that tries to preserve original
+ # CVS revision numbers, and the original CVS data had done
+ # lots of branches off of branches and other strangeness to
+ # end up with a real version number that just happens to look
+ # like this special revision number form. Also, if needed
+ # there are several ways to extend/identify alternative encodings
+ # within the "2.1.1.2000" part.
+ # - Unlike real CVS revisions, you can't really reconstruct what
+ # relation a revision of this form has to other revisions.
+ # - FUTURE: TODO: Rework database somehow to make up and remember
+ # fully-CVS-compatible branches and branch version numbers.
+
+ my $meta;
+ if ( defined($revision) )
{
- $db_query = $self->{dbh}->prepare_cached("SELECT * FROM $tablename_rev WHERE name=? AND commithash=?",{},1);
- $db_query->execute($filename, $revision);
- } else {
- $db_query = $self->{dbh}->prepare_cached("SELECT * FROM $tablename_head WHERE name=?",{},1);
+ if ( $revision =~ /^1\.(\d+)$/ )
+ {
+ my ($intRev) = $1;
+ my $db_query;
+ $db_query = $self->{dbh}->prepare_cached(
+ "SELECT * FROM $tablename_rev WHERE name=? AND revision=?",
+ {},1);
+ $db_query->execute($filename, $intRev);
+ $meta = $db_query->fetchrow_hashref;
+ }
+ elsif ( $revision =~ /^2\.1\.1\.2000(\.[1-3][0-9][0-9]){20}$/ )
+ {
+ my ($commitHash)=($revision=~/^2\.1\.1\.2000(.*)$/);
+ $commitHash=~s/\.([0-9]+)/sprintf("%02x",$1-100)/eg;
+ if($commitHash=~/^[0-9a-f]{40}$/)
+ {
+ return $self->getMetaFromCommithash($filename,$commitHash);
+ }
+
+ # error recovery: fall back on head version below
+ print "E Failed to find $filename version=$revision or commit=$commitHash\n";
+ $log->warning("failed get $revision with commithash=$commitHash");
+ undef $revision;
+ }
+ elsif ( $revision =~ /^[0-9a-f]{40}$/ )
+ {
+ # Try DB first. This is mostly only useful for req_annotate(),
+ # which only calls this for stuff that should already be in
+ # the DB. It is fairly likely to be a waste of time
+ # in most other cases [unless the file happened to be
+ # modified in $revision specifically], but
+ # it is probably in the noise compared to how long
+ # getMetaFromCommithash() will take.
+ my $db_query;
+ $db_query = $self->{dbh}->prepare_cached(
+ "SELECT * FROM $tablename_rev WHERE name=? AND commithash=?",
+ {},1);
+ $db_query->execute($filename, $revision);
+ $meta = $db_query->fetchrow_hashref;
+
+ if(! $meta)
+ {
+ my($revCommit)=$self->lookupCommitRef($revision);
+ if($revCommit=~/^[0-9a-f]{40}$/)
+ {
+ return $self->getMetaFromCommithash($filename,$revCommit);
+ }
+
+ # error recovery: nothing found:
+ print "E Failed to find $filename version=$revision\n";
+ $log->warning("failed get $revision");
+ return $meta;
+ }
+ }
+ else
+ {
+ my($revCommit)=$self->lookupCommitRef($revision);
+ if($revCommit=~/^[0-9a-f]{40}$/)
+ {
+ return $self->getMetaFromCommithash($filename,$revCommit);
+ }
+
+ # error recovery: fall back on head version below
+ print "E Failed to find $filename version=$revision\n";
+ $log->warning("failed get $revision");
+ undef $revision; # Allow fallback
+ }
+ }
+
+ if(!defined($revision))
+ {
+ my $db_query;
+ $db_query = $self->{dbh}->prepare_cached(
+ "SELECT * FROM $tablename_head WHERE name=?",{},1);
$db_query->execute($filename);
+ $meta = $db_query->fetchrow_hashref;
}
- my $meta = $db_query->fetchrow_hashref;
if($meta)
{
$meta->{revision} = "1.$meta->{revision}";
return $meta;
}
+sub getMetaFromCommithash
+{
+ my $self = shift;
+ my $filename = shift;
+ my $revCommit = shift;
+
+ # NOTE: This function doesn't scale well (lots of forks), especially
+ # if you have many files that have not been modified for many commits
+ # (each git-rev-parse redoes a lot of work for each file
+ # that theoretically could be done in parallel by smarter
+ # graph traversal).
+ #
+ # TODO: Possible optimization strategies:
+ # - Solve the issue of assigning and remembering "real" CVS
+ # revision numbers for branches, and ensure the
+ # data structure can do this efficiently. Perhaps something
+ # similar to "git notes", and carefully structured to take
+	#     advantage of same-sha1-is-same-contents, to roll the same
+ # unmodified subdirectory data onto multiple commits?
+ # - Write and use a C tool that is like git-blame, but
+ # operates on multiple files with file granularity, instead
+ # of one file with line granularity. Cache
+ # most-recently-modified in $self->{commitRefCache}{$revCommit}.
+ # Try to be intelligent about how many files we do with
+ # one fork (perhaps one directory at a time, without recursion,
+ # and/or include directory as one line item, recurse from here
+ # instead of in C tool?).
+ # - Perhaps we could ask the DB for (filename,fileHash),
+ # and just guess that it is correct (that the file hadn't
+ # changed between $revCommit and the found commit, then
+ # changed back, confusing anything trying to interpret
+ # history). Probably need to add another index to revisions
+ # DB table for this.
+ # - NOTE: Trying to store all (commit,file) keys in DB [to
+	#     find "lastModifiedCommit"] (instead of
+ # just files that changed in each commit as we do now) is
+ # probably not practical from a disk space perspective.
+
+ # Does the file exist in $revCommit?
+ # TODO: Include file hash in dirmap cache.
+ my($dirMap)=$self->getRevisionDirMap($revCommit);
+ my($dir,$file)=($filename=~m%^(?:(.*)/)?([^/]*$)%);
+ if(!defined($dir))
+ {
+ $dir="";
+ }
+ if( !defined($dirMap->{$dir}) ||
+ !defined($dirMap->{$dir}{$filename}) )
+ {
+ my($fileHash)="deleted";
+
+ my($retVal)={};
+ $retVal->{name}=$filename;
+ $retVal->{filehash}=$fileHash;
+
+ # not needed and difficult to compute:
+ $retVal->{revision}="0"; # $revision;
+ $retVal->{commithash}=$revCommit;
+ #$retVal->{author}=$commit->{author};
+ #$retVal->{modified}=convertToCvsDate($commit->{date});
+ #$retVal->{mode}=convertToDbMode($mode);
+
+ return $retVal;
+ }
+
+ my($fileHash)=safe_pipe_capture("git","rev-parse","$revCommit:$filename");
+ chomp $fileHash;
+ if(!($fileHash=~/^[0-9a-f]{40}$/))
+ {
+ die "Invalid fileHash '$fileHash' looking up"
+ ." '$revCommit:$filename'\n";
+ }
+
+ # information about most recent commit to modify $filename:
+ open(my $gitLogPipe, '-|', 'git', 'rev-list',
+ '--max-count=1', '--pretty', '--parents',
+ $revCommit, '--', $filename)
+ or die "Cannot call git-rev-list: $!";
+ my @commits=readCommits($gitLogPipe);
+ close $gitLogPipe;
+ if(scalar(@commits)!=1)
+ {
+ die "Can't find most recent commit changing $filename\n";
+ }
+ my($commit)=$commits[0];
+ if( !defined($commit) || !defined($commit->{hash}) )
+ {
+ return undef;
+ }
+
+ # does this (commit,file) have a real assigned CVS revision number?
+ my $tablename_rev = $self->tablename("revision");
+ my $db_query;
+ $db_query = $self->{dbh}->prepare_cached(
+ "SELECT * FROM $tablename_rev WHERE name=? AND commithash=?",
+ {},1);
+ $db_query->execute($filename, $commit->{hash});
+ my($meta)=$db_query->fetchrow_hashref;
+ if($meta)
+ {
+ $meta->{revision} = "1.$meta->{revision}";
+ return $meta;
+ }
+
+ # fall back on special revision number
+ my($revision)=$commit->{hash};
+ $revision=~s/(..)/'.' . (hex($1)+100)/eg;
+ $revision="2.1.1.2000$revision";
+
+ # meta data about $filename:
+ open(my $filePipe, '-|', 'git', 'ls-tree', '-z',
+ $commit->{hash}, '--', $filename)
+ or die("Cannot call git-ls-tree : $!");
+ local $/ = "\0";
+ my $line;
+ $line=<$filePipe>;
+ if(defined(<$filePipe>))
+ {
+ die "Expected only a single file for git-ls-tree $filename\n";
+ }
+ close $filePipe;
+
+ chomp $line;
+ unless ( $line=~m/^(\d+)\s+(\w+)\s+([a-zA-Z0-9]+)\t(.*)$/o )
+ {
+ die("Couldn't process git-ls-tree line : $line\n");
+ }
+ my ( $mode, $git_type, $git_hash, $git_filename ) = ( $1, $2, $3, $4 );
+
+ # save result:
+ my($retVal)={};
+ $retVal->{name}=$filename;
+ $retVal->{revision}=$revision;
+ $retVal->{filehash}=$fileHash;
+ $retVal->{commithash}=$revCommit;
+ $retVal->{author}=$commit->{author};
+ $retVal->{modified}=convertToCvsDate($commit->{date});
+ $retVal->{mode}=convertToDbMode($mode);
+
+ return $retVal;
+}
+
+=head2 lookupCommitRef
+
+Convert tag/branch/abbreviation/etc into a commit sha1 hash. Caches
+the result so looking it up again is fast.
+
+=cut
+
+sub lookupCommitRef
+{
+ my $self = shift;
+ my $ref = shift;
+
+ my $commitHash = $self->{commitRefCache}{$ref};
+ if(defined($commitHash))
+ {
+ return $commitHash;
+ }
+
+ $commitHash=safe_pipe_capture("git","rev-parse","--verify","--quiet",
+ $self->unescapeRefName($ref));
+ $commitHash=~s/\s*$//;
+ if(!($commitHash=~/^[0-9a-f]{40}$/))
+ {
+ $commitHash=undef;
+ }
+
+ if( defined($commitHash) )
+ {
+ my $type=safe_pipe_capture("git","cat-file","-t",$commitHash);
+ if( ! ($type=~/^commit\s*$/ ) )
+ {
+ $commitHash=undef;
+ }
+ }
+ if(defined($commitHash))
+ {
+ $self->{commitRefCache}{$ref}=$commitHash;
+ }
+ return $commitHash;
+}
+
+=head2 clearCommitRefCaches
+
+Clears the cached commit refs (sha1s for various tags/abbreviations/etc)
+and related caches.
+
+=cut
+
+sub clearCommitRefCaches
+{
+ my $self = shift;
+ $self->{commitRefCache} = {};
+ $self->{revisionDirMapCache} = undef;
+ $self->{gethead_cache} = undef;
+}
+
=head2 commitmessage
this function takes a commithash and returns the commit message for that commit
return $result;
}
+=head2 escapeRefName
+
+Apply an escape mechanism to compensate for characters that
+git ref names can have that CVS tags can not.
+
+=cut
+sub escapeRefName
+{
+ my($self,$refName)=@_;
+
+ # CVS officially only allows [-_A-Za-z0-9] in tag names (or in
+ # many contexts it can also be a CVS revision number).
+ #
+ # Git tags commonly use '/' and '.' as well, but also handle
+ # anything else just in case:
+ #
+ # = "_-s-" For '/'.
+ # = "_-p-" For '.'.
+ # = "_-u-" For underscore, in case someone wants a literal "_-" in
+ # a tag name.
+ # = "_-xx-" Where "xx" is the hexadecimal representation of the
+ # desired ASCII character byte. (for anything else)
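+	#
+	# Example (illustrative): "my/branch.v1" escapes to
+	# "my_-s-branch_-p-v1", and unescapeRefName() reverses the mapping.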
+
+	if( !( $refName=~/^[1-9][0-9]*(\.[1-9][0-9]*)*$/ ) )
+	{
+		$refName=~s/_-/_-u--/g;
+		$refName=~s/\./_-p-/g;
+		$refName=~s%/%_-s-%g;
+		$refName=~s/([^-_a-zA-Z0-9])/sprintf("_-%02x-",ord($1))/eg;
+	}
+	return $refName;
+}
+
+=head2 unescapeRefName
+
+Undo an escape mechanism to compensate for characters that
+git ref names can have that CVS tags can not.
+
+=cut
+sub unescapeRefName
+{
+ my($self,$refName)=@_;
+
+ # see escapeRefName() for description of escape mechanism.
+
+ $refName=~s/_-([spu]|[0-9a-f][0-9a-f])-/unescapeRefNameChar($1)/eg;
+
+ # allowed tag names
+ # TODO: Perhaps use git check-ref-format, with an in-process cache of
+ # validated names?
+ if( !( $refName=~m%^[^-][-a-zA-Z0-9_/.]*$% ) ||
+ ( $refName=~m%[/.]$% ) ||
+ ( $refName=~/\.lock$/ ) ||
+ ( $refName=~m%\.\.|/\.|[[\\:?*~]|\@\{% ) ) # matching }
+ {
+ # Error:
+ $log->warn("illegal refName: $refName");
+ $refName=undef;
+ }
+ return $refName;
+}
+
+sub unescapeRefNameChar
+{
+ my($char)=@_;
+
+ if($char eq "s")
+ {
+ $char="/";
+ }
+ elsif($char eq "p")
+ {
+ $char=".";
+ }
+ elsif($char eq "u")
+ {
+ $char="_";
+ }
+ elsif($char=~/^[0-9a-f][0-9a-f]$/)
+ {
+ $char=chr(hex($char));
+ }
+ else
+ {
+ # Error case: Maybe it has come straight from user, and
+ # wasn't supposed to be escaped? Restore it the way we got it:
+ $char="_-$char-";
+ }
+
+ return $char;
+}
+
=head2 in_array()
from Array::PAT - mimics the in_array() function
{ "bundle", cmd_bundle, RUN_SETUP_GENTLY },
{ "cat-file", cmd_cat_file, RUN_SETUP },
{ "check-attr", cmd_check_attr, RUN_SETUP },
+ { "check-ignore", cmd_check_ignore, RUN_SETUP | NEED_WORK_TREE },
{ "check-ref-format", cmd_check_ref_format },
{ "checkout", cmd_checkout, RUN_SETUP | NEED_WORK_TREE },
{ "checkout-index", cmd_checkout_index,
#include <openssl/hmac.h>
#endif
-struct store_conf {
- char *name;
- const char *path; /* should this be here? its interpretation is driver-specific */
- char *map_inbox;
- char *trash;
- unsigned max_size; /* off_t is overkill */
- unsigned trash_remote_new:1, trash_only_new:1;
-};
-
-/* For message->status */
-#define M_RECENT (1<<0) /* unsyncable flag; maildir_* depend on this being 1<<0 */
-#define M_DEAD (1<<1) /* expunged */
-#define M_FLAGS (1<<2) /* flags fetched */
-
-struct message {
- struct message *next;
- size_t size; /* zero implies "not fetched" */
- int uid;
- unsigned char flags, status;
-};
-
-struct store {
- struct store_conf *conf; /* foreign */
-
- /* currently open mailbox */
- const char *name; /* foreign! maybe preset? */
- char *path; /* own */
- struct message *msgs; /* own */
- int uidvalidity;
- unsigned char opts; /* maybe preset? */
- /* note that the following do _not_ reflect stats from msgs, but mailbox totals */
- int count; /* # of messages */
- int recent; /* # of recent messages - don't trust this beyond the initial read */
-};
-
-struct msg_data {
- struct strbuf data;
- unsigned char flags;
-};
-
static const char imap_send_usage[] = "git imap-send < <mbox>";
#undef DRV_OK
static char *next_arg(char **);
-static void free_generic_messages(struct message *);
-
__attribute__((format (printf, 3, 4)))
static int nfsnprintf(char *buf, int blen, const char *fmt, ...);
NULL, /* auth_method */
};
-struct imap_store_conf {
- struct store_conf gen;
- struct imap_server_conf *server;
-};
-
-#define NIL (void *)0x1
-#define LIST (void *)0x2
-
-struct imap_list {
- struct imap_list *next, *child;
- char *val;
- int len;
-};
-
struct imap_socket {
int fd[2];
SSL *ssl;
struct imap {
int uidnext; /* from SELECT responses */
- struct imap_list *ns_personal, *ns_other, *ns_shared; /* NAMESPACE info */
unsigned caps, rcaps; /* CAPABILITY results */
/* command queue */
int nexttag, num_in_progress, literal_pending;
};
struct imap_store {
- struct store gen;
+ /* currently open mailbox */
+ const char *name; /* foreign! maybe preset? */
int uidvalidity;
struct imap *imap;
const char *prefix;
- unsigned /*currentnc:1,*/ trashnc:1;
};
struct imap_cmd_cb {
static int get_cmd_result(struct imap_store *ctx, struct imap_cmd *tcmd);
-static const char *Flags[] = {
- "Draft",
- "Flagged",
- "Answered",
- "Seen",
- "Deleted",
-};
-
#ifndef NO_OPENSSL
static void ssl_socket_perror(const char *func)
{
return ret;
}
-static void free_generic_messages(struct message *msgs)
-{
- struct message *tmsg;
-
- for (; msgs; msgs = tmsg) {
- tmsg = msgs->next;
- free(msgs);
- }
-}
-
static int nfsnprintf(char *buf, int blen, const char *fmt, ...)
{
int ret;
}
}
-static int is_atom(struct imap_list *list)
-{
- return list && list->val && list->val != NIL && list->val != LIST;
-}
-
-static int is_list(struct imap_list *list)
-{
- return list && list->val == LIST;
-}
-
-static void free_list(struct imap_list *list)
-{
- struct imap_list *tmp;
-
- for (; list; list = tmp) {
- tmp = list->next;
- if (is_list(list))
- free_list(list->child);
- else if (is_atom(list))
- free(list->val);
- free(list);
- }
-}
-
-static int parse_imap_list_l(struct imap *imap, char **sp, struct imap_list **curp, int level)
+static int skip_imap_list_l(char **sp, int level)
{
- struct imap_list *cur;
- char *s = *sp, *p;
- int n, bytes;
+ char *s = *sp;
for (;;) {
while (isspace((unsigned char)*s))
s++;
break;
}
- *curp = cur = xmalloc(sizeof(*cur));
- curp = &cur->next;
- cur->val = NULL; /* for clean bail */
if (*s == '(') {
/* sublist */
s++;
- cur->val = LIST;
- if (parse_imap_list_l(imap, &s, &cur->child, level + 1))
- goto bail;
- } else if (imap && *s == '{') {
- /* literal */
- bytes = cur->len = strtol(s + 1, &s, 10);
- if (*s != '}')
- goto bail;
-
- s = cur->val = xmalloc(cur->len);
-
- /* dump whats left over in the input buffer */
- n = imap->buf.bytes - imap->buf.offset;
-
- if (n > bytes)
- /* the entire message fit in the buffer */
- n = bytes;
-
- memcpy(s, imap->buf.buf + imap->buf.offset, n);
- s += n;
- bytes -= n;
-
- /* mark that we used part of the buffer */
- imap->buf.offset += n;
-
- /* now read the rest of the message */
- while (bytes > 0) {
- if ((n = socket_read(&imap->buf.sock, s, bytes)) <= 0)
- goto bail;
- s += n;
- bytes -= n;
- }
-
- if (buffer_gets(&imap->buf, &s))
+ if (skip_imap_list_l(&s, level + 1))
goto bail;
} else if (*s == '"') {
/* quoted string */
s++;
- p = s;
for (; *s != '"'; s++)
if (!*s)
goto bail;
- cur->len = s - p;
s++;
- cur->val = xmemdupz(p, cur->len);
} else {
/* atom */
- p = s;
for (; *s && !isspace((unsigned char)*s); s++)
if (level && *s == ')')
break;
- cur->len = s - p;
- if (cur->len == 3 && !memcmp("NIL", p, 3))
- cur->val = NIL;
- else
- cur->val = xmemdupz(p, cur->len);
}
if (!level)
goto bail;
}
*sp = s;
- *curp = NULL;
return 0;
bail:
- *curp = NULL;
return -1;
}
-static struct imap_list *parse_imap_list(struct imap *imap, char **sp)
+static void skip_list(char **sp)
{
- struct imap_list *head;
-
- if (!parse_imap_list_l(imap, sp, &head, 0))
- return head;
- free_list(head);
- return NULL;
-}
-
-static struct imap_list *parse_list(char **sp)
-{
- return parse_imap_list(NULL, sp);
+ skip_imap_list_l(sp, 0);
}
static void parse_capability(struct imap *imap, char *cmd)
*p++ = 0;
arg = next_arg(&s);
if (!strcmp("UIDVALIDITY", arg)) {
- if (!(arg = next_arg(&s)) || !(ctx->gen.uidvalidity = atoi(arg))) {
+ if (!(arg = next_arg(&s)) || !(ctx->uidvalidity = atoi(arg))) {
fprintf(stderr, "IMAP error: malformed UIDVALIDITY status\n");
return RESP_BAD;
}
for (; isspace((unsigned char)*p); p++);
fprintf(stderr, "*** IMAP ALERT *** %s\n", p);
} else if (cb && cb->ctx && !strcmp("APPENDUID", arg)) {
- if (!(arg = next_arg(&s)) || !(ctx->gen.uidvalidity = atoi(arg)) ||
+ if (!(arg = next_arg(&s)) || !(ctx->uidvalidity = atoi(arg)) ||
!(arg = next_arg(&s)) || !(*(int *)cb->ctx = atoi(arg))) {
fprintf(stderr, "IMAP error: malformed APPENDUID status\n");
return RESP_BAD;
}
if (!strcmp("NAMESPACE", arg)) {
- imap->ns_personal = parse_list(&cmd);
- imap->ns_other = parse_list(&cmd);
- imap->ns_shared = parse_list(&cmd);
+ /* rfc2342 NAMESPACE response. */
+ skip_list(&cmd); /* Personal mailboxes */
+ skip_list(&cmd); /* Others' mailboxes */
+ skip_list(&cmd); /* Shared mailboxes */
} else if (!strcmp("OK", arg) || !strcmp("BAD", arg) ||
!strcmp("NO", arg) || !strcmp("BYE", arg)) {
if ((resp = parse_response_code(ctx, NULL, cmd)) != RESP_OK)
return resp;
- } else if (!strcmp("CAPABILITY", arg))
+ } else if (!strcmp("CAPABILITY", arg)) {
parse_capability(imap, cmd);
- else if ((arg1 = next_arg(&cmd))) {
- if (!strcmp("EXISTS", arg1))
- ctx->gen.count = atoi(arg);
- else if (!strcmp("RECENT", arg1))
- ctx->gen.recent = atoi(arg);
+ } else if ((arg1 = next_arg(&cmd))) {
+ ; /*
+ * Unhandled response-data with at least two words.
+ * Ignore it.
+ *
+ * NEEDSWORK: Previously this case handled '<num> EXISTS'
+ * and '<num> RECENT' but as a probably-unintended side
+ * effect it ignores other unrecognized two-word
+ * responses. imap-send doesn't ever try to read
+ * messages or mailboxes these days, so consider
+ * eliminating this case.
+ */
} else {
fprintf(stderr, "IMAP error: unable to parse untagged response\n");
return RESP_BAD;
imap_exec(ictx, NULL, "LOGOUT");
socket_shutdown(&imap->buf.sock);
}
- free_list(imap->ns_personal);
- free_list(imap->ns_other);
- free_list(imap->ns_shared);
free(imap);
}
-static void imap_close_store(struct store *ctx)
+static void imap_close_store(struct imap_store *ctx)
{
- imap_close_server((struct imap_store *)ctx);
- free_generic_messages(ctx->msgs);
+ imap_close_server(ctx);
free(ctx);
}
return 0;
}
-static struct store *imap_open_store(struct imap_server_conf *srvc)
+static struct imap_store *imap_open_store(struct imap_server_conf *srvc)
{
struct imap_store *ctx;
struct imap *imap;
} /* !preauth */
ctx->prefix = "";
- ctx->trashnc = 1;
- return (struct store *)ctx;
+ return ctx;
bail:
- imap_close_store(&ctx->gen);
+ imap_close_store(ctx);
return NULL;
}
-static int imap_make_flags(int flags, char *buf)
-{
- const char *s;
- unsigned i, d;
-
- for (i = d = 0; i < ARRAY_SIZE(Flags); i++)
- if (flags & (1 << i)) {
- buf[d++] = ' ';
- buf[d++] = '\\';
- for (s = Flags[i]; *s; s++)
- buf[d++] = *s;
- }
- buf[0] = '(';
- buf[d++] = ')';
- return d;
-}
-
+/*
+ * Insert CR characters as necessary in *msg to ensure that every LF
+ * character in *msg is preceded by a CR.
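+ * For example, "one\ntwo\r\n" becomes "one\r\ntwo\r\n".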
+ */
static void lf_to_crlf(struct strbuf *msg)
{
- size_t new_len;
char *new;
- int i, j, lfnum = 0;
-
- if (msg->buf[0] == '\n')
- lfnum++;
- for (i = 1; i < msg->len; i++) {
- if (msg->buf[i - 1] != '\r' && msg->buf[i] == '\n')
- lfnum++;
+ size_t i, j;
+ char lastc;
+
+ /* First pass: tally, in j, the size of the new string: */
+ for (i = j = 0, lastc = '\0'; i < msg->len; i++) {
+ if (msg->buf[i] == '\n' && lastc != '\r')
+ j++; /* a CR will need to be added here */
+ lastc = msg->buf[i];
+ j++;
}
- new_len = msg->len + lfnum;
- new = xmalloc(new_len + 1);
- if (msg->buf[0] == '\n') {
- new[0] = '\r';
- new[1] = '\n';
- i = 1;
- j = 2;
- } else {
- new[0] = msg->buf[0];
- i = 1;
- j = 1;
- }
- for ( ; i < msg->len; i++) {
- if (msg->buf[i] != '\n') {
- new[j++] = msg->buf[i];
- continue;
- }
- if (msg->buf[i - 1] != '\r')
+ new = xmalloc(j + 1);
+
+ /*
+ * Second pass: write the new string. Note that this loop is
+ * otherwise identical to the first pass.
+ */
+ for (i = j = 0, lastc = '\0'; i < msg->len; i++) {
+ if (msg->buf[i] == '\n' && lastc != '\r')
new[j++] = '\r';
- /* otherwise it already had CR before */
- new[j++] = '\n';
+ lastc = new[j++] = msg->buf[i];
}
- strbuf_attach(msg, new, new_len, new_len + 1);
+ strbuf_attach(msg, new, j, j + 1);
}
/*
* Store msg to IMAP. Also detach and free the data from msg->data,
* leaving msg->data empty.
*/
-static int imap_store_msg(struct store *gctx, struct msg_data *msg)
+static int imap_store_msg(struct imap_store *ctx, struct strbuf *msg)
{
- struct imap_store *ctx = (struct imap_store *)gctx;
struct imap *imap = ctx->imap;
struct imap_cmd_cb cb;
const char *prefix, *box;
- int ret, d;
- char flagstr[128];
+ int ret;
- lf_to_crlf(&msg->data);
+ lf_to_crlf(msg);
memset(&cb, 0, sizeof(cb));
- cb.dlen = msg->data.len;
- cb.data = strbuf_detach(&msg->data, NULL);
-
- d = 0;
- if (msg->flags) {
- d = imap_make_flags(msg->flags, flagstr);
- flagstr[d++] = ' ';
- }
- flagstr[d] = 0;
+ cb.dlen = msg->len;
+ cb.data = strbuf_detach(msg, NULL);
- box = gctx->name;
+ box = ctx->name;
prefix = !strcmp(box, "INBOX") ? "" : ctx->prefix;
cb.create = 0;
- ret = imap_exec_m(ctx, &cb, "APPEND \"%s%s\" %s", prefix, box, flagstr);
+ ret = imap_exec_m(ctx, &cb, "APPEND \"%s%s\" ", prefix, box);
imap->caps = imap->rcaps;
if (ret != DRV_OK)
return ret;
- gctx->count++;
return DRV_OK;
}
int main(int argc, char **argv)
{
struct strbuf all_msgs = STRBUF_INIT;
- struct msg_data msg = {STRBUF_INIT, 0};
- struct store *ctx = NULL;
+ struct strbuf msg = STRBUF_INIT;
+ struct imap_store *ctx = NULL;
int ofs = 0;
int r;
int total, n = 0;
unsigned percent = n * 100 / total;
fprintf(stderr, "%4u%% (%d/%d) done\r", percent, n, total);
- if (!split_msg(&all_msgs, &msg.data, &ofs))
+ if (!split_msg(&all_msgs, &msg, &ofs))
break;
if (server.use_html)
- wrap_in_html(&msg.data);
+ wrap_in_html(&msg);
r = imap_store_msg(ctx, &msg);
if (r != DRV_OK)
break;
extern int optbug(const struct option *opt, const char *reason);
extern int opterror(const struct option *opt, const char *reason, int flags);
-#ifdef __GNUC__
+#if defined(__GNUC__) && ! defined(__clang__)
#define opterror(o,r,f) (opterror((o),(r),(f)), -1)
#endif
--- /dev/null
+#include "cache.h"
+#include "dir.h"
+#include "pathspec.h"
+
+/*
+ * Finds which of the given pathspecs match items in the index.
+ *
+ * For each pathspec, sets the corresponding entry in the seen[] array
+ * (which should be specs items long, i.e. the same size as pathspec)
+ * to the nature of the "closest" (i.e. most specific) match found for
+ * that pathspec in the index, if it was a closer type of match than
+ * the existing entry. As an optimization, matching is skipped
+ * altogether if seen[] already only contains non-zero entries.
+ *
+ * If seen[] has not already been written to, it may make sense
+ * to use find_pathspecs_matching_against_index() instead.
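+ *
+ * The values written to seen[] are the MATCHED_* constants defined in
+ * dir.h; more specific match types have larger values.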
+ */
+void add_pathspec_matches_against_index(const char **pathspec,
+ char *seen, int specs)
+{
+ int num_unmatched = 0, i;
+
+ /*
+ * Since we are walking the index as if we were walking the directory,
+ * we have to mark the matched pathspec as seen; otherwise we will
+ * mistakenly think that the user gave a pathspec that did not match
+ * anything.
+ */
+ for (i = 0; i < specs; i++)
+ if (!seen[i])
+ num_unmatched++;
+ if (!num_unmatched)
+ return;
+ for (i = 0; i < active_nr; i++) {
+ struct cache_entry *ce = active_cache[i];
+ match_pathspec(pathspec, ce->name, ce_namelen(ce), 0, seen);
+ }
+}
+
+/*
+ * Finds which of the given pathspecs match items in the index.
+ *
+ * This is a one-shot wrapper around add_pathspec_matches_against_index()
+ * which allocates, populates, and returns a seen[] array indicating the
+ * nature of the "closest" (i.e. most specific) matches which each of the
+ * given pathspecs achieves against all items in the index.
+ */
+char *find_pathspecs_matching_against_index(const char **pathspec)
+{
+ char *seen;
+ int i;
+
+ for (i = 0; pathspec[i]; i++)
+ ; /* just counting */
+ seen = xcalloc(i, 1);
+ add_pathspec_matches_against_index(pathspec, seen, i);
+ return seen;
+}
+
+/*
+ * Check the index to see whether path refers to a submodule, or
+ * something inside a submodule. If the former, returns the path with
+ * any trailing slash stripped. If the latter, dies with an error
+ * message.
+ */
+const char *check_path_for_gitlink(const char *path)
+{
+ int i, path_len = strlen(path);
+ for (i = 0; i < active_nr; i++) {
+ struct cache_entry *ce = active_cache[i];
+ if (S_ISGITLINK(ce->ce_mode)) {
+ int ce_len = ce_namelen(ce);
+ if (path_len <= ce_len || path[ce_len] != '/' ||
+ memcmp(ce->name, path, ce_len))
+ /* path does not refer to this
+ * submodule or anything inside it */
+ continue;
+ if (path_len == ce_len + 1) {
+ /* path refers to submodule;
+ * strip trailing slash */
+ return xstrndup(ce->name, ce_len);
+ } else {
+ die (_("Path '%s' is in submodule '%.*s'"),
+ path, ce_len, ce->name);
+ }
+ }
+ }
+ return path;
+}
+
+/*
+ * Dies if the given path refers to a file inside a symlinked
+ * directory in the index.
+ */
+void die_if_path_beyond_symlink(const char *path, const char *prefix)
+{
+ if (has_symlink_leading_path(path, strlen(path))) {
+ int len = prefix ? strlen(prefix) : 0;
+ die(_("'%s' is beyond a symbolic link"), path + len);
+ }
+}
--- /dev/null
+#ifndef PATHSPEC_H
+#define PATHSPEC_H
+
+extern char *find_pathspecs_matching_against_index(const char **pathspec);
+extern void add_pathspec_matches_against_index(const char **pathspec, char *seen, int specs);
+extern const char *check_path_for_gitlink(const char *path);
+extern void die_if_path_beyond_symlink(const char *path, const char *prefix);
+
+#endif /* PATHSPEC_H */
static int ref_entry_cmp_sslice(const void *key_, const void *ent_)
{
- struct string_slice *key = (struct string_slice *)key_;
- struct ref_entry *ent = *(struct ref_entry **)ent_;
- int entlen = strlen(ent->name);
- int cmplen = key->len < entlen ? key->len : entlen;
- int cmp = memcmp(key->str, ent->name, cmplen);
+ const struct string_slice *key = key_;
+ const struct ref_entry *ent = *(const struct ref_entry * const *)ent_;
+ int cmp = strncmp(key->str, ent->name, key->len);
if (cmp)
return cmp;
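+	/*
+	 * The first key->len bytes agree; the entry compares equal only
+	 * if its name ends exactly there, otherwise the longer entry
+	 * name sorts after the key.
+	 */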
- return key->len - entlen;
+ return '\0' - (unsigned char)ent->name[key->len];
}
/*
return 0;
}
-static inline int is_forwardable(struct ref* ref)
-{
- struct object *o;
-
- if (!prefixcmp(ref->name, "refs/tags/"))
- return 0;
-
- /* old object must be a commit */
- o = parse_object(ref->old_sha1);
- if (!o || o->type != OBJ_COMMIT)
- return 0;
-
- /* new object must be commit-ish */
- o = deref_tag(parse_object(ref->new_sha1), NULL, 0);
- if (!o || o->type != OBJ_COMMIT)
- return 0;
-
- return 1;
-}
-
void set_ref_status_for_push(struct ref *remote_refs, int send_mirror,
int force_update)
{
}
/*
- * The below logic determines whether an individual
- * refspec A:B can be pushed. The push will succeed
- * if any of the following are true:
+ * Decide whether an individual refspec A:B can be
+ * pushed. The push will succeed if any of the
+ * following are true:
*
* (1) the remote reference B does not exist
*
* (2) the remote reference B is being removed (i.e.,
* pushing :B where no source is specified)
*
- * (3) the update meets all fast-forwarding criteria:
- *
- * (a) the destination is not under refs/tags/
- * (b) the old is a commit
- * (c) the new is a descendant of the old
- *
- * NOTE: We must actually have the old object in
- * order to overwrite it in the remote reference,
- * and the new object must be commit-ish. These are
- * implied by (b) and (c) respectively.
+ * (3) the destination is not under refs/tags/, and
+ * if the old and new value is a commit, the new
+ * is a descendant of the old.
*
* (4) it is forced using the +A:B notation, or by
* passing the --force argument
*/
- ref->not_forwardable = !is_forwardable(ref);
-
ref->update =
!ref->deletion &&
!is_null_sha1(ref->old_sha1);
!has_sha1_file(ref->old_sha1)
|| !ref_newer(ref->new_sha1, ref->old_sha1);
- if (ref->not_forwardable) {
+ if (!prefixcmp(ref->name, "refs/tags/")) {
ref->requires_force = 1;
if (!force_ref_update) {
ref->status = REF_STATUS_REJECT_ALREADY_EXISTS;
#endif
}
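+/*
+ * Return the path to the hook named "name" if it exists and is
+ * executable, otherwise NULL.
+ */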
+char *find_hook(const char *name)
+{
+ char *path = git_path("hooks/%s", name);
+ if (access(path, X_OK) < 0)
+ path = NULL;
+
+ return path;
+}
+
int run_hook(const char *index_file, const char *name, ...)
{
struct child_process hook;
va_list args;
int ret;
- if (access(git_path("hooks/%s", name), X_OK) < 0)
+ p = find_hook(name);
+ if (!p)
return 0;
+ argv_array_push(&argv, p);
+
va_start(args, name);
- argv_array_push(&argv, git_path("hooks/%s", name));
while ((p = va_arg(args, const char *)))
argv_array_push(&argv, p);
va_end(args);
int finish_command(struct child_process *);
int run_command(struct child_process *);
+extern char *find_hook(const char *name);
extern int run_hook(const char *index_file, const char *name, ...);
#define RUN_COMMAND_NO_STDIN 1
return prefix_path(prefix, prefixlen, copyfrom);
}
+/*
+ * N.B. get_pathspec() is deprecated in favor of the "struct pathspec"
+ * based interface - see pathspec_magic above.
+ *
+ * Arguments:
+ * - prefix - a path relative to the root of the working tree
+ * - pathspec - a list of paths underneath the prefix path
+ *
+ * Iterates over pathspec, prepending each path with prefix,
+ * and return the resulting list.
+ *
+ * If pathspec is empty, return a singleton list containing prefix.
+ *
+ * If pathspec and prefix are both empty, return an empty list.
+ *
+ * This is typically used by built-in commands such as add.c, in order
+ * to normalize argv arguments provided to the built-in into a list of
+ * paths to process, all relative to the root of the working tree.
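+ *
+ * For example (illustrative values), prefix "sub/dir/" with pathspec
+ * { "a", "b/c", NULL } yields { "sub/dir/a", "sub/dir/b/c", NULL }.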
+ */
const char **get_pathspec(const char *prefix, const char **pathspec)
{
const char *entry = *pathspec;
--- /dev/null
+#!/bin/sh
+
+test_description=check-ignore
+
+. ./test-lib.sh
+
+init_vars () {
+ global_excludes="$(pwd)/global-excludes"
+}
+
+enable_global_excludes () {
+ init_vars &&
+ git config core.excludesfile "$global_excludes"
+}
+
+expect_in () {
+ dest="$HOME/expected-$1" text="$2"
+ if test -z "$text"
+ then
+ >"$dest" # avoid newline
+ else
+ echo "$text" >"$dest"
+ fi
+}
+
+expect () {
+ expect_in stdout "$1"
+}
+
+expect_from_stdin () {
+ cat >"$HOME/expected-stdout"
+}
+
+test_stderr () {
+	expected="$1"
+	expect_in stderr "$expected" &&
+	test_cmp "$HOME/expected-stderr" "$HOME/stderr"
+}
+
+stderr_contains () {
+ regexp="$1"
+ if grep "$regexp" "$HOME/stderr"
+ then
+ return 0
+ else
+ echo "didn't find /$regexp/ in $HOME/stderr"
+ cat "$HOME/stderr"
+ return 1
+ fi
+}
+
+stderr_empty_on_success () {
+ expect_code="$1"
+ if test $expect_code = 0
+ then
+ test_stderr ""
+ else
+ # If we expect failure then stderr might or might not be empty
+ # due to --quiet - the caller can check its contents
+ return 0
+ fi
+}
+
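+# test_check_ignore <args> [<expect_code> [<global_args>]]
+#
+# Run "git $global_args check-ignore $quiet_opt $verbose_opt $args",
+# expect exit code <expect_code> (default 0), and compare stdout with
+# the output set up earlier via expect / expect_from_stdin.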
+test_check_ignore () {
+ args="$1" expect_code="${2:-0}" global_args="$3"
+
+ init_vars &&
+ rm -f "$HOME/stdout" "$HOME/stderr" "$HOME/cmd" &&
+ echo git $global_args check-ignore $quiet_opt $verbose_opt $args \
+ >"$HOME/cmd" &&
+ test_expect_code "$expect_code" \
+ git $global_args check-ignore $quiet_opt $verbose_opt $args \
+ >"$HOME/stdout" 2>"$HOME/stderr" &&
+ test_cmp "$HOME/expected-stdout" "$HOME/stdout" &&
+ stderr_empty_on_success "$expect_code"
+}
+
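+# test_expect_success_multi [<prereq>] <testname> <expect_verbose> <code>
+#
+# Run <code> three ways: as-is (expecting the last field of
+# <expect_verbose> on stdout), with -q/--quiet (expecting no output),
+# and with -v/--verbose (expecting <expect_verbose>), via the
+# $quiet_opt and $verbose_opt variables that test_check_ignore honours.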
+test_expect_success_multi () {
+ prereq=
+ if test $# -eq 4
+ then
+ prereq=$1
+ shift
+ fi
+ testname="$1" expect_verbose="$2" code="$3"
+
+ expect=$( echo "$expect_verbose" | sed -e 's/.* //' )
+
+ test_expect_success $prereq "$testname" '
+ expect "$expect" &&
+ eval "$code"
+ '
+
+ for quiet_opt in '-q' '--quiet'
+ do
+ test_expect_success $prereq "$testname${quiet_opt:+ with $quiet_opt}" "
+ expect '' &&
+ $code
+ "
+ done
+ quiet_opt=
+
+ for verbose_opt in '-v' '--verbose'
+ do
+ test_expect_success $prereq "$testname${verbose_opt:+ with $verbose_opt}" "
+ expect '$expect_verbose' &&
+ $code
+ "
+ done
+ verbose_opt=
+}
+
+test_expect_success 'setup' '
+ init_vars &&
+ mkdir -p a/b/ignored-dir a/submodule b &&
+ if test_have_prereq SYMLINKS
+ then
+ ln -s b a/symlink
+ fi &&
+ (
+ cd a/submodule &&
+ git init &&
+ echo a >a &&
+ git add a &&
+ git commit -m"commit in submodule"
+ ) &&
+ git add a/submodule &&
+ cat <<-\EOF >.gitignore &&
+ one
+ ignored-*
+ EOF
+ for dir in . a
+ do
+ : >$dir/not-ignored &&
+ : >$dir/ignored-and-untracked &&
+ : >$dir/ignored-but-in-index
+ done &&
+ git add -f ignored-but-in-index a/ignored-but-in-index &&
+ cat <<-\EOF >a/.gitignore &&
+ two*
+ *three
+ EOF
+ cat <<-\EOF >a/b/.gitignore &&
+ four
+ five
+ # this comment should affect the line numbers
+ six
+ ignored-dir/
+ # and so should this blank line:
+
+ !on*
+ !two
+ EOF
+ echo "seven" >a/b/ignored-dir/.gitignore &&
+ test -n "$HOME" &&
+ cat <<-\EOF >"$global_excludes" &&
+ globalone
+ !globaltwo
+ globalthree
+ EOF
+ cat <<-\EOF >>.git/info/exclude
+ per-repo
+ EOF
+'
+
+############################################################################
+#
+# test invalid inputs
+
+test_expect_success_multi 'empty command line' '' '
+ test_check_ignore "" 128 &&
+ stderr_contains "fatal: no path specified"
+'
+
+test_expect_success_multi '--stdin with empty STDIN' '' '
+ test_check_ignore "--stdin" 1 </dev/null &&
+ if test -n "$quiet_opt"; then
+ test_stderr ""
+ else
+ test_stderr "no pathspec given."
+ fi
+'
+
+test_expect_success '-q with multiple args' '
+ expect "" &&
+ test_check_ignore "-q one two" 128 &&
+ stderr_contains "fatal: --quiet is only valid with a single pathname"
+'
+
+test_expect_success '--quiet with multiple args' '
+ expect "" &&
+ test_check_ignore "--quiet one two" 128 &&
+ stderr_contains "fatal: --quiet is only valid with a single pathname"
+'
+
+for verbose_opt in '-v' '--verbose'
+do
+ for quiet_opt in '-q' '--quiet'
+ do
+ test_expect_success "$quiet_opt $verbose_opt" "
+ expect '' &&
+ test_check_ignore '$quiet_opt $verbose_opt foo' 128 &&
+ stderr_contains 'fatal: cannot have both --quiet and --verbose'
+ "
+ done
+done
+
+test_expect_success_multi 'erroneous use of --' '' '
+ test_check_ignore "--" 128 &&
+ stderr_contains "fatal: no path specified"
+'
+
+test_expect_success_multi '--stdin with superfluous arg' '' '
+ test_check_ignore "--stdin foo" 128 &&
+ stderr_contains "fatal: cannot specify pathnames with --stdin"
+'
+
+test_expect_success_multi '--stdin -z with superfluous arg' '' '
+ test_check_ignore "--stdin -z foo" 128 &&
+ stderr_contains "fatal: cannot specify pathnames with --stdin"
+'
+
+test_expect_success_multi '-z without --stdin' '' '
+ test_check_ignore "-z" 128 &&
+ stderr_contains "fatal: -z only makes sense with --stdin"
+'
+
+test_expect_success_multi '-z without --stdin and superfluous arg' '' '
+ test_check_ignore "-z foo" 128 &&
+ stderr_contains "fatal: -z only makes sense with --stdin"
+'
+
+test_expect_success_multi 'needs work tree' '' '
+ (
+ cd .git &&
+ test_check_ignore "foo" 128
+ ) &&
+ stderr_contains "fatal: This operation must be run in a work tree"
+'
+
+############################################################################
+#
+# test standard ignores
+
+# First make sure that the presence of a file in the working tree
+# does not impact results, but that the presence of a file in the
+# index does.
+
+for subdir in '' 'a/'
+do
+ if test -z "$subdir"
+ then
+ where="at top-level"
+ else
+ where="in subdir $subdir"
+ fi
+
+ test_expect_success_multi "non-existent file $where not ignored" '' "
+ test_check_ignore '${subdir}non-existent' 1
+ "
+
+ test_expect_success_multi "non-existent file $where ignored" \
+ ".gitignore:1:one ${subdir}one" "
+ test_check_ignore '${subdir}one'
+ "
+
+ test_expect_success_multi "existing untracked file $where not ignored" '' "
+ test_check_ignore '${subdir}not-ignored' 1
+ "
+
+ test_expect_success_multi "existing tracked file $where not ignored" '' "
+ test_check_ignore '${subdir}ignored-but-in-index' 1
+ "
+
+ test_expect_success_multi "existing untracked file $where ignored" \
+ ".gitignore:2:ignored-* ${subdir}ignored-and-untracked" "
+ test_check_ignore '${subdir}ignored-and-untracked'
+ "
+done
+
+# Having established the above, from now on we mostly test against
+# files which do not exist in the working tree or index.
+
+test_expect_success 'sub-directory local ignore' '
+ expect "a/3-three" &&
+ test_check_ignore "a/3-three a/three-not-this-one"
+'
+
+test_expect_success 'sub-directory local ignore with --verbose' '
+ expect "a/.gitignore:2:*three a/3-three" &&
+ test_check_ignore "--verbose a/3-three a/three-not-this-one"
+'
+
+test_expect_success 'local ignore inside a sub-directory' '
+ expect "3-three" &&
+ (
+ cd a &&
+ test_check_ignore "3-three three-not-this-one"
+ )
+'
+test_expect_success 'local ignore inside a sub-directory with --verbose' '
+ expect "a/.gitignore:2:*three 3-three" &&
+ (
+ cd a &&
+ test_check_ignore "--verbose 3-three three-not-this-one"
+ )
+'
+
+test_expect_success_multi 'nested include' \
+ 'a/b/.gitignore:8:!on* a/b/one' '
+ test_check_ignore "a/b/one"
+'
+
+############################################################################
+#
+# test ignored sub-directories
+
+test_expect_success_multi 'ignored sub-directory' \
+ 'a/b/.gitignore:5:ignored-dir/ a/b/ignored-dir' '
+ test_check_ignore "a/b/ignored-dir"
+'
+
+test_expect_success 'multiple files inside ignored sub-directory' '
+ expect_from_stdin <<-\EOF &&
+ a/b/ignored-dir/foo
+ a/b/ignored-dir/twoooo
+ a/b/ignored-dir/seven
+ EOF
+ test_check_ignore "a/b/ignored-dir/foo a/b/ignored-dir/twoooo a/b/ignored-dir/seven"
+'
+
+test_expect_success 'multiple files inside ignored sub-directory with -v' '
+ expect_from_stdin <<-\EOF &&
+ a/b/.gitignore:5:ignored-dir/ a/b/ignored-dir/foo
+ a/b/.gitignore:5:ignored-dir/ a/b/ignored-dir/twoooo
+ a/b/.gitignore:5:ignored-dir/ a/b/ignored-dir/seven
+ EOF
+ test_check_ignore "-v a/b/ignored-dir/foo a/b/ignored-dir/twoooo a/b/ignored-dir/seven"
+'
+
+test_expect_success 'cd to ignored sub-directory' '
+ expect_from_stdin <<-\EOF &&
+ foo
+ twoooo
+ ../one
+ seven
+ ../../one
+ EOF
+ (
+ cd a/b/ignored-dir &&
+ test_check_ignore "foo twoooo ../one seven ../../one"
+ )
+'
+
+test_expect_success 'cd to ignored sub-directory with -v' '
+ expect_from_stdin <<-\EOF &&
+ a/b/.gitignore:5:ignored-dir/ foo
+ a/b/.gitignore:5:ignored-dir/ twoooo
+ a/b/.gitignore:8:!on* ../one
+ a/b/.gitignore:5:ignored-dir/ seven
+ .gitignore:1:one ../../one
+ EOF
+ (
+ cd a/b/ignored-dir &&
+ test_check_ignore "-v foo twoooo ../one seven ../../one"
+ )
+'
+
+############################################################################
+#
+# test handling of symlinks
+
+test_expect_success_multi SYMLINKS 'symlink' '' '
+ test_check_ignore "a/symlink" 1
+'
+
+test_expect_success_multi SYMLINKS 'beyond a symlink' '' '
+ test_check_ignore "a/symlink/foo" 128 &&
+ test_stderr "fatal: '\''a/symlink/foo'\'' is beyond a symbolic link"
+'
+
+test_expect_success_multi SYMLINKS 'beyond a symlink from subdirectory' '' '
+ (
+ cd a &&
+ test_check_ignore "symlink/foo" 128
+ ) &&
+ test_stderr "fatal: '\''symlink/foo'\'' is beyond a symbolic link"
+'
+
+############################################################################
+#
+# test handling of submodules
+
+test_expect_success_multi 'submodule' '' '
+ test_check_ignore "a/submodule/one" 128 &&
+ test_stderr "fatal: Path '\''a/submodule/one'\'' is in submodule '\''a/submodule'\''"
+'
+
+test_expect_success_multi 'submodule from subdirectory' '' '
+ (
+ cd a &&
+ test_check_ignore "submodule/one" 128
+ ) &&
+ test_stderr "fatal: Path '\''a/submodule/one'\'' is in submodule '\''a/submodule'\''"
+'
+
+############################################################################
+#
+# test handling of global ignore files
+
+test_expect_success 'global ignore not yet enabled' '
+ expect_from_stdin <<-\EOF &&
+ .git/info/exclude:7:per-repo per-repo
+ a/.gitignore:2:*three a/globalthree
+ .git/info/exclude:7:per-repo a/per-repo
+ EOF
+ test_check_ignore "-v globalone per-repo a/globalthree a/per-repo not-ignored a/globaltwo"
+'
+
+test_expect_success 'global ignore' '
+ enable_global_excludes &&
+ expect_from_stdin <<-\EOF &&
+ globalone
+ per-repo
+ globalthree
+ a/globalthree
+ a/per-repo
+ globaltwo
+ EOF
+ test_check_ignore "globalone per-repo globalthree a/globalthree a/per-repo not-ignored globaltwo"
+'
+
+test_expect_success 'global ignore with -v' '
+ enable_global_excludes &&
+ expect_from_stdin <<-EOF &&
+ $global_excludes:1:globalone globalone
+ .git/info/exclude:7:per-repo per-repo
+ $global_excludes:3:globalthree globalthree
+ a/.gitignore:2:*three a/globalthree
+ .git/info/exclude:7:per-repo a/per-repo
+ $global_excludes:2:!globaltwo globaltwo
+ EOF
+ test_check_ignore "-v globalone per-repo globalthree a/globalthree a/per-repo not-ignored globaltwo"
+'
+
+############################################################################
+#
+# test --stdin
+
+cat <<-\EOF >stdin
+ one
+ not-ignored
+ a/one
+ a/not-ignored
+ a/b/on
+ a/b/one
+ a/b/one one
+ "a/b/one two"
+ "a/b/one\"three"
+ a/b/not-ignored
+ a/b/two
+ a/b/twooo
+ globaltwo
+ a/globaltwo
+ a/b/globaltwo
+ b/globaltwo
+EOF
+cat <<-\EOF >expected-default
+ one
+ a/one
+ a/b/on
+ a/b/one
+ a/b/one one
+ a/b/one two
+ "a/b/one\"three"
+ a/b/two
+ a/b/twooo
+ globaltwo
+ a/globaltwo
+ a/b/globaltwo
+ b/globaltwo
+EOF
+cat <<-EOF >expected-verbose
+ .gitignore:1:one one
+ .gitignore:1:one a/one
+ a/b/.gitignore:8:!on* a/b/on
+ a/b/.gitignore:8:!on* a/b/one
+ a/b/.gitignore:8:!on* a/b/one one
+ a/b/.gitignore:8:!on* a/b/one two
+ a/b/.gitignore:8:!on* "a/b/one\"three"
+ a/b/.gitignore:9:!two a/b/two
+ a/.gitignore:1:two* a/b/twooo
+ $global_excludes:2:!globaltwo globaltwo
+ $global_excludes:2:!globaltwo a/globaltwo
+ $global_excludes:2:!globaltwo a/b/globaltwo
+ $global_excludes:2:!globaltwo b/globaltwo
+EOF
+
+sed -e 's/^"//' -e 's/\\//' -e 's/"$//' stdin | \
+ tr "\n" "\0" >stdin0
+sed -e 's/^"//' -e 's/\\//' -e 's/"$//' expected-default | \
+ tr "\n" "\0" >expected-default0
+sed -e 's/ "/ /' -e 's/\\//' -e 's/"$//' expected-verbose | \
+ tr ":\t\n" "\0" >expected-verbose0
+
+test_expect_success '--stdin' '
+ expect_from_stdin <expected-default &&
+ test_check_ignore "--stdin" <stdin
+'
+
+test_expect_success '--stdin -q' '
+ expect "" &&
+ test_check_ignore "-q --stdin" <stdin
+'
+
+test_expect_success '--stdin -v' '
+ expect_from_stdin <expected-verbose &&
+ test_check_ignore "-v --stdin" <stdin
+'
+
+for opts in '--stdin -z' '-z --stdin'
+do
+ test_expect_success "$opts" "
+ expect_from_stdin <expected-default0 &&
+ test_check_ignore '$opts' <stdin0
+ "
+
+ test_expect_success "$opts -q" "
+		expect '' &&
+ test_check_ignore '-q $opts' <stdin0
+ "
+
+ test_expect_success "$opts -v" "
+ expect_from_stdin <expected-verbose0 &&
+ test_check_ignore '-v $opts' <stdin0
+ "
+done
+
+cat <<-\EOF >stdin
+ ../one
+ ../not-ignored
+ one
+ not-ignored
+ b/on
+ b/one
+ b/one one
+ "b/one two"
+ "b/one\"three"
+ b/two
+ b/not-ignored
+ b/twooo
+ ../globaltwo
+ globaltwo
+ b/globaltwo
+ ../b/globaltwo
+EOF
+cat <<-\EOF >expected-default
+ ../one
+ one
+ b/on
+ b/one
+ b/one one
+ b/one two
+ "b/one\"three"
+ b/two
+ b/twooo
+ ../globaltwo
+ globaltwo
+ b/globaltwo
+ ../b/globaltwo
+EOF
+cat <<-EOF >expected-verbose
+ .gitignore:1:one ../one
+ .gitignore:1:one one
+ a/b/.gitignore:8:!on* b/on
+ a/b/.gitignore:8:!on* b/one
+ a/b/.gitignore:8:!on* b/one one
+ a/b/.gitignore:8:!on* b/one two
+ a/b/.gitignore:8:!on* "b/one\"three"
+ a/b/.gitignore:9:!two b/two
+ a/.gitignore:1:two* b/twooo
+ $global_excludes:2:!globaltwo ../globaltwo
+ $global_excludes:2:!globaltwo globaltwo
+ $global_excludes:2:!globaltwo b/globaltwo
+ $global_excludes:2:!globaltwo ../b/globaltwo
+EOF
+
+sed -e 's/^"//' -e 's/\\//' -e 's/"$//' stdin | \
+ tr "\n" "\0" >stdin0
+sed -e 's/^"//' -e 's/\\//' -e 's/"$//' expected-default | \
+ tr "\n" "\0" >expected-default0
+sed -e 's/ "/ /' -e 's/\\//' -e 's/"$//' expected-verbose | \
+ tr ":\t\n" "\0" >expected-verbose0
+
+test_expect_success '--stdin from subdirectory' '
+ expect_from_stdin <expected-default &&
+ (
+ cd a &&
+ test_check_ignore "--stdin" <../stdin
+ )
+'
+
+test_expect_success '--stdin from subdirectory with -v' '
+ expect_from_stdin <expected-verbose &&
+ (
+ cd a &&
+ test_check_ignore "--stdin -v" <../stdin
+ )
+'
+
+for opts in '--stdin -z' '-z --stdin'
+do
+ test_expect_success "$opts from subdirectory" '
+ expect_from_stdin <expected-default0 &&
+ (
+ cd a &&
+ test_check_ignore "'"$opts"'" <../stdin0
+ )
+ '
+
+ test_expect_success "$opts from subdirectory with -v" '
+ expect_from_stdin <expected-verbose0 &&
+ (
+ cd a &&
+ test_check_ignore "'"$opts"' -v" <../stdin0
+ )
+ '
+done
+
+
+test_done
fi
}
+pathmatch() {
+ if [ $1 = 1 ]; then
+ test_expect_success "pathmatch: match '$2' '$3'" "
+ test-wildmatch pathmatch '$2' '$3'
+ "
+ else
+ test_expect_success "pathmatch: no match '$2' '$3'" "
+ ! test-wildmatch pathmatch '$2' '$3'
+ "
+ fi
+}
+
# Basic wildmat features
match 1 1 foo foo
match 0 0 foo bar
match 0 0 'XXX/adobe/courier/bold/o/normal//12/120/75/75/X/70/iso8859/1' 'XXX/*/*/*/*/*/*/12/*/*/*/m/*/*/*'
match 1 0 'abcd/abcdefg/abcdefghijk/abcdefghijklmnop.txt' '**/*a*b*g*n*t'
match 0 0 'abcd/abcdefg/abcdefghijk/abcdefghijklmnop.txtz' '**/*a*b*g*n*t'
+match 0 x foo '*/*/*'
+match 0 x foo/bar '*/*/*'
+match 1 x foo/bba/arr '*/*/*'
+match 0 x foo/bb/aa/rr '*/*/*'
+match 1 x foo/bb/aa/rr '**/**/**'
+match 1 x abcXdefXghi '*X*i'
+match 0 x ab/cXd/efXg/hi '*X*i'
+match 1 x ab/cXd/efXg/hi '*/*X*/*/*i'
+match 1 x ab/cXd/efXg/hi '**/*X*/**/*i'
+
+pathmatch 1 foo foo
+pathmatch 0 foo fo
+pathmatch 1 foo/bar foo/bar
+pathmatch 1 foo/bar 'foo/*'
+pathmatch 1 foo/bba/arr 'foo/*'
+pathmatch 1 foo/bba/arr 'foo/**'
+pathmatch 1 foo/bba/arr 'foo*'
+pathmatch 1 foo/bba/arr 'foo**'
+pathmatch 1 foo/bba/arr 'foo/*arr'
+pathmatch 1 foo/bba/arr 'foo/**arr'
+pathmatch 0 foo/bba/arr 'foo/*z'
+pathmatch 0 foo/bba/arr 'foo/**z'
+pathmatch 1 foo/bar 'foo?bar'
+pathmatch 1 foo/bar 'foo[/]bar'
+pathmatch 0 foo '*/*/*'
+pathmatch 0 foo/bar '*/*/*'
+pathmatch 1 foo/bba/arr '*/*/*'
+pathmatch 1 foo/bb/aa/rr '*/*/*'
+pathmatch 1 abcXdefXghi '*X*i'
+pathmatch 1 ab/cXd/efXg/hi '*/*X*/*/*i'
+pathmatch 1 ab/cXd/efXg/hi '*Xg*i'
test_done
)
'
-test_expect_success 'push requires --force to update annotated tag' '
- mk_test heads/master &&
- mk_child child1 &&
- mk_child child2 &&
- (
- cd child1 &&
- git tag -a -m "message 1" Tag &&
- git push ../child2 Tag:refs/tmp/Tag &&
- git push ../child2 Tag:refs/tmp/Tag &&
- >file1 &&
- git add file1 &&
- git commit -m "file1" &&
- git tag -f -a -m "message 2" Tag &&
- test_must_fail git push ../child2 Tag:refs/tmp/Tag &&
- git push --force ../child2 Tag:refs/tmp/Tag &&
- git tag -f -a -m "message 3" Tag HEAD~ &&
- test_must_fail git push ../child2 Tag:refs/tmp/Tag &&
- git push --force ../child2 Tag:refs/tmp/Tag
- )
-'
-
test_expect_success 'push --porcelain' '
mk_empty &&
echo >.git/foo "To testrepo" &&
--- /dev/null
+#!/bin/sh
+
+test_description='check pre-push hooks'
+. ./test-lib.sh
+
+# Setup hook that always succeeds
+HOOKDIR="$(git rev-parse --git-dir)/hooks"
+HOOK="$HOOKDIR/pre-push"
+mkdir -p "$HOOKDIR"
+write_script "$HOOK" <<EOF
+cat >/dev/null
+exit 0
+EOF
+
+test_expect_success 'setup' '
+ git config push.default upstream &&
+ git init --bare repo1 &&
+ git remote add parent1 repo1 &&
+ test_commit one &&
+ git push parent1 HEAD:foreign
+'
+write_script "$HOOK" <<EOF
+cat >/dev/null
+exit 1
+EOF
+
+COMMIT1="$(git rev-parse HEAD)"
+export COMMIT1
+
+test_expect_success 'push with failing hook' '
+ test_commit two &&
+ test_must_fail git push parent1 HEAD
+'
+
+test_expect_success '--no-verify bypasses hook' '
+ git push --no-verify parent1 HEAD
+'
+
+COMMIT2="$(git rev-parse HEAD)"
+export COMMIT2
+
+write_script "$HOOK" <<'EOF'
+echo "$1" >actual
+echo "$2" >>actual
+cat >>actual
+EOF
+
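+# The pre-push hook receives the remote name and URL as its arguments,
+# and one line per ref to be updated on its standard input:
+#   <local ref> SP <local sha1> SP <remote ref> SP <remote sha1>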
+cat >expected <<EOF
+parent1
+repo1
+refs/heads/master $COMMIT2 refs/heads/foreign $COMMIT1
+EOF
+
+test_expect_success 'push with hook' '
+ git push parent1 master:foreign &&
+ diff expected actual
+'
+
+test_expect_success 'add a branch' '
+ git checkout -b other parent1/foreign &&
+ test_commit three
+'
+
+COMMIT3="$(git rev-parse HEAD)"
+export COMMIT3
+
+cat >expected <<EOF
+parent1
+repo1
+refs/heads/other $COMMIT3 refs/heads/foreign $COMMIT2
+EOF
+
+test_expect_success 'push to default' '
+ git push &&
+ diff expected actual
+'
+
+cat >expected <<EOF
+parent1
+repo1
+refs/tags/one $COMMIT1 refs/tags/tag1 $_z40
+HEAD~ $COMMIT2 refs/heads/prev $_z40
+EOF
+
+test_expect_success 'push non-branches' '
+ git push parent1 one:tag1 HEAD~:refs/heads/prev &&
+ diff expected actual
+'
+
+cat >expected <<EOF
+parent1
+repo1
+(delete) $_z40 refs/heads/prev $COMMIT2
+EOF
+
+test_expect_success 'push delete' '
+ git push parent1 :prev &&
+ diff expected actual
+'
+
+cat >expected <<EOF
+repo1
+repo1
+HEAD $COMMIT3 refs/heads/other $_z40
+EOF
+
+test_expect_success 'push to URL' '
+ git push repo1 HEAD &&
+ diff expected actual
+'
+
+# Test that filling pipe buffers doesn't cause failure
+# Too slow to leave enabled for general use
+if false
+then
+ printf 'parent1\nrepo1\n' >expected
+ nr=1000
+ while test $nr -lt 2000
+ do
+ nr=$(( $nr + 1 ))
+ git branch b/$nr $COMMIT3
+ echo "refs/heads/b/$nr $COMMIT3 refs/heads/b/$nr $_z40" >>expected
+ done
+
+ test_expect_success 'push many refs' '
+ git push parent1 "refs/heads/b/*:refs/heads/b/*" &&
+ diff expected actual
+ '
+fi
+
+test_done
--- /dev/null
+#!/bin/sh
+
+test_description='git-cvsserver and git refspecs
+
+tests the ability of git-cvsserver to switch between and compare
+tags, branches and other git refspecs'
+
+. ./test-lib.sh
+
+#########
+
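+# The check_* helpers compare a CVS sandbox against a git revision:
+# check_start_tree begins a new file list, check_file compares one
+# sandbox file against "git show <ver>:<file>" and records it, and
+# check_end_tree / check_end_full_tree verify that the sandbox (and,
+# for the latter, the git tree) contain exactly the recorded files.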
+check_start_tree() {
+ rm -f "$WORKDIR/list.expected"
+ echo "start $1" >>"${WORKDIR}/check.log"
+}
+
+check_file() {
+ sandbox="$1"
+ file="$2"
+ ver="$3"
+ GIT_DIR=$SERVERDIR git show "${ver}:${file}" \
+ >"$WORKDIR/check.got" 2>"$WORKDIR/check.stderr"
+ test_cmp "$WORKDIR/check.got" "$sandbox/$file"
+ stat=$?
+ echo "check_file $sandbox $file $ver : $stat" >>"$WORKDIR/check.log"
+ echo "$file" >>"$WORKDIR/list.expected"
+ return $stat
+}
+
+check_end_tree() {
+ sandbox="$1" &&
+ find "$sandbox" -name CVS -prune -o -type f -print >"$WORKDIR/list.actual" &&
+ sort <"$WORKDIR/list.expected" >expected &&
+ sort <"$WORKDIR/list.actual" | sed -e "s%cvswork/%%" >actual &&
+ test_cmp expected actual &&
+ rm expected actual
+}
+
+check_end_full_tree() {
+ sandbox="$1" &&
+ sort <"$WORKDIR/list.expected" >expected &&
+ find "$sandbox" -name CVS -prune -o -type f -print |
+ sed -e "s%$sandbox/%%" | sort >act1 &&
+ test_cmp expected act1 &&
+ git ls-tree --name-only -r "$2" | sort >act2 &&
+ test_cmp expected act2 &&
+ rm expected act1 act2
+}
+
+#########
+
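+# check_diff <diffFile> <vOld> <vNew>:
+# Apply the CVS-generated <diffFile> on top of git revision <vOld> in a
+# scratch clone and verify that the result matches revision <vNew>.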
+check_diff() {
+ diffFile="$1"
+ vOld="$2"
+ vNew="$3"
+ rm -rf diffSandbox
+ git clone -q -n . diffSandbox &&
+ (
+ cd diffSandbox &&
+ git checkout "$vOld" &&
+ git apply -p0 --index <"../$diffFile" &&
+ git diff --exit-code "$vNew"
+ ) >check_diff_apply.out 2>&1
+}
+
+#########
+
+cvs >/dev/null 2>&1
+if test $? -ne 1
+then
+ skip_all='skipping git-cvsserver tests, cvs not found'
+ test_done
+fi
+if ! test_have_prereq PERL
+then
+ skip_all='skipping git-cvsserver tests, perl not available'
+ test_done
+fi
+"$PERL_PATH" -e 'use DBI; use DBD::SQLite' >/dev/null 2>&1 || {
+ skip_all='skipping git-cvsserver tests, Perl SQLite interface unavailable'
+ test_done
+}
+
+unset GIT_DIR GIT_CONFIG
+WORKDIR=$(pwd)
+SERVERDIR=$(pwd)/gitcvs.git
+git_config="$SERVERDIR/config"
+CVSROOT=":fork:$SERVERDIR"
+CVSWORK="$(pwd)/cvswork"
+CVS_SERVER=git-cvsserver
+export CVSROOT CVS_SERVER
+
+rm -rf "$CVSWORK" "$SERVERDIR"
+test_expect_success 'setup v1, b1' '
+ echo "Simple text file" >textfile.c &&
+ echo "t2" >t2 &&
+ mkdir adir &&
+ echo "adir/afile line1" >adir/afile &&
+ echo "adir/afile line2" >>adir/afile &&
+ echo "adir/afile line3" >>adir/afile &&
+ echo "adir/afile line4" >>adir/afile &&
+ echo "adir/a2file" >>adir/a2file &&
+ mkdir adir/bdir &&
+ echo "adir/bdir/bfile line 1" >adir/bdir/bfile &&
+ echo "adir/bdir/bfile line 2" >>adir/bdir/bfile &&
+ echo "adir/bdir/b2file" >adir/bdir/b2file &&
+ git add textfile.c t2 adir &&
+ git commit -q -m "First Commit (v1)" &&
+ git tag v1 &&
+ git branch b1 &&
+ git clone -q --bare "$WORKDIR/.git" "$SERVERDIR" >/dev/null 2>&1 &&
+ GIT_DIR="$SERVERDIR" git config --bool gitcvs.enabled true &&
+ GIT_DIR="$SERVERDIR" git config gitcvs.logfile "$SERVERDIR/gitcvs.log"
+'
+
+rm -rf cvswork
+test_expect_success 'cvs co v1' '
+ cvs -f -Q co -r v1 -d cvswork master >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1 &&
+ check_file cvswork t2 v1 &&
+ check_file cvswork adir/afile v1 &&
+ check_file cvswork adir/a2file v1 &&
+ check_file cvswork adir/bdir/bfile v1 &&
+ check_file cvswork adir/bdir/b2file v1 &&
+ check_end_tree cvswork
+'
+
+rm -rf cvswork
+test_expect_success 'cvs co b1' '
+ cvs -f co -r b1 -d cvswork master >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1 &&
+ check_file cvswork t2 v1 &&
+ check_file cvswork adir/afile v1 &&
+ check_file cvswork adir/a2file v1 &&
+ check_file cvswork adir/bdir/bfile v1 &&
+ check_file cvswork adir/bdir/b2file v1 &&
+ check_end_tree cvswork
+'
+
+test_expect_success 'cvs co b1 [cvswork3]' '
+ cvs -f co -r b1 -d cvswork3 master >cvs.log 2>&1 &&
+ check_start_tree cvswork3 &&
+ check_file cvswork3 textfile.c v1 &&
+ check_file cvswork3 t2 v1 &&
+ check_file cvswork3 adir/afile v1 &&
+ check_file cvswork3 adir/a2file v1 &&
+ check_file cvswork3 adir/bdir/bfile v1 &&
+ check_file cvswork3 adir/bdir/b2file v1 &&
+ check_end_full_tree cvswork3 v1
+'
+
+test_expect_success 'edit cvswork3 and save diff' '
+ (
+ cd cvswork3 &&
+ sed -e "s/line1/line1 - data/" adir/afile >adir/afileNEW &&
+ mv -f adir/afileNEW adir/afile &&
+ echo "afile5" >adir/afile5 &&
+ rm t2 &&
+ cvs -f add adir/afile5 &&
+ cvs -f rm t2 &&
+ ! cvs -f diff -N -u >"$WORKDIR/cvswork3edit.diff"
+ )
+'
+
+test_expect_success 'setup v1.2 on b1' '
+ git checkout b1 &&
+ echo "new v1.2" >t3 &&
+ rm t2 &&
+ sed -e "s/line3/line3 - more data/" adir/afile >adir/afileNEW &&
+ mv -f adir/afileNEW adir/afile &&
+ rm adir/a2file &&
+ echo "a3file" >>adir/a3file &&
+ echo "bfile line 3" >>adir/bdir/bfile &&
+ rm adir/bdir/b2file &&
+ echo "b3file" >adir/bdir/b3file &&
+ mkdir cdir &&
+ echo "cdir/cfile" >cdir/cfile &&
+ git add -A cdir adir t3 t2 &&
+	git commit -q -m "v1.2" &&
+ git tag v1.2 &&
+ git push --tags gitcvs.git b1:b1
+'
+
+test_expect_success 'cvs -f up (on b1 adir)' '
+ ( cd cvswork/adir && cvs -f up -d ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1 &&
+ check_file cvswork t2 v1 &&
+ check_file cvswork adir/afile v1.2 &&
+ check_file cvswork adir/a3file v1.2 &&
+ check_file cvswork adir/bdir/bfile v1.2 &&
+ check_file cvswork adir/bdir/b3file v1.2 &&
+ check_end_tree cvswork
+'
+
+test_expect_success 'cvs up (on b1 /)' '
+ ( cd cvswork && cvs -f up -d ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1.2 &&
+ check_file cvswork t3 v1.2 &&
+ check_file cvswork adir/afile v1.2 &&
+ check_file cvswork adir/a3file v1.2 &&
+ check_file cvswork adir/bdir/bfile v1.2 &&
+ check_file cvswork adir/bdir/b3file v1.2 &&
+ check_file cvswork cdir/cfile v1.2 &&
+ check_end_tree cvswork
+'
+
+# Make sure "CVS/Tag" files didn't get messed up:
+test_expect_success 'cvs up (on b1 /) (again; check CVS/Tag files)' '
+ ( cd cvswork && cvs -f up -d ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1.2 &&
+ check_file cvswork t3 v1.2 &&
+ check_file cvswork adir/afile v1.2 &&
+ check_file cvswork adir/a3file v1.2 &&
+ check_file cvswork adir/bdir/bfile v1.2 &&
+ check_file cvswork adir/bdir/b3file v1.2 &&
+ check_file cvswork cdir/cfile v1.2 &&
+ check_end_tree cvswork
+'
+
+# update to another version:
+test_expect_success 'cvs up -r v1' '
+ ( cd cvswork && cvs -f up -r v1 ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1 &&
+ check_file cvswork t2 v1 &&
+ check_file cvswork adir/afile v1 &&
+ check_file cvswork adir/a2file v1 &&
+ check_file cvswork adir/bdir/bfile v1 &&
+ check_file cvswork adir/bdir/b2file v1 &&
+ check_end_tree cvswork
+'
+
+test_expect_success 'cvs up' '
+ ( cd cvswork && cvs -f up ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1 &&
+ check_file cvswork t2 v1 &&
+ check_file cvswork adir/afile v1 &&
+ check_file cvswork adir/a2file v1 &&
+ check_file cvswork adir/bdir/bfile v1 &&
+ check_file cvswork adir/bdir/b2file v1 &&
+ check_end_tree cvswork
+'
+
+test_expect_success 'cvs up (again; check CVS/Tag files)' '
+ ( cd cvswork && cvs -f up -d ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1 &&
+ check_file cvswork t2 v1 &&
+ check_file cvswork adir/afile v1 &&
+ check_file cvswork adir/a2file v1 &&
+ check_file cvswork adir/bdir/bfile v1 &&
+ check_file cvswork adir/bdir/b2file v1 &&
+ check_end_tree cvswork
+'
+
+test_expect_success 'setup simple b2' '
+ git branch b2 v1 &&
+ git push --tags gitcvs.git b2:b2
+'
+
+test_expect_success 'cvs co b2 [into cvswork2]' '
+ cvs -f co -r b2 -d cvswork2 master >cvs.log 2>&1 &&
+	check_start_tree cvswork2 &&
+	check_file cvswork2 textfile.c v1 &&
+	check_file cvswork2 t2 v1 &&
+	check_file cvswork2 adir/afile v1 &&
+	check_file cvswork2 adir/a2file v1 &&
+	check_file cvswork2 adir/bdir/bfile v1 &&
+	check_file cvswork2 adir/bdir/b2file v1 &&
+	check_end_tree cvswork2
+'
+
+test_expect_success 'root dir edit [cvswork2]' '
+ (
+ cd cvswork2 && echo "Line 2" >>textfile.c &&
+ ! cvs -f diff -u >"$WORKDIR/cvsEdit1.diff" &&
+ cvs -f commit -m "edit textfile.c" textfile.c
+ ) >cvsEdit1.log 2>&1
+'
+
+test_expect_success 'root dir rm file [cvswork2]' '
+ (
+ cd cvswork2 &&
+ cvs -f rm -f t2 &&
+ cvs -f diff -u >../cvsEdit2-empty.diff &&
+ ! cvs -f diff -N -u >"$WORKDIR/cvsEdit2-N.diff" &&
+ cvs -f commit -m "rm t2"
+ ) >cvsEdit2.log 2>&1
+'
+
+test_expect_success 'subdir edit/add/rm files [cvswork2]' '
+ (
+ cd cvswork2 &&
+ sed -e "s/line 1/line 1 (v2)/" adir/bdir/bfile >adir/bdir/bfileNEW &&
+ mv -f adir/bdir/bfileNEW adir/bdir/bfile &&
+ rm adir/bdir/b2file &&
+ cd adir &&
+ cvs -f rm bdir/b2file &&
+ echo "4th file" >bdir/b4file &&
+ cvs -f add bdir/b4file &&
+ ! cvs -f diff -N -u >"$WORKDIR/cvsEdit3.diff" &&
+ git fetch gitcvs.git b2:b2 &&
+ (
+ cd .. &&
+ ! cvs -f diff -u -N -r v1.2 >"$WORKDIR/cvsEdit3-v1.2.diff" &&
+ ! cvs -f diff -u -N -r v1.2 -r v1 >"$WORKDIR/cvsEdit3-v1.2-v1.diff"
+ ) &&
+ cvs -f commit -m "various add/rm/edit"
+ ) >cvs.log 2>&1
+'
+
+test_expect_success 'validate result of edits [cvswork2]' '
+ git fetch gitcvs.git b2:b2 &&
+ git tag v2 b2 &&
+ git push --tags gitcvs.git b2:b2 &&
+ check_start_tree cvswork2 &&
+ check_file cvswork2 textfile.c v2 &&
+ check_file cvswork2 adir/afile v2 &&
+ check_file cvswork2 adir/a2file v2 &&
+ check_file cvswork2 adir/bdir/bfile v2 &&
+ check_file cvswork2 adir/bdir/b4file v2 &&
+ check_end_full_tree cvswork2 v2
+'
+
+test_expect_success 'validate basic diffs saved during above cvswork2 edits' '
+ test $(grep Index: cvsEdit1.diff | wc -l) = 1 &&
+ test ! -s cvsEdit2-empty.diff &&
+ test $(grep Index: cvsEdit2-N.diff | wc -l) = 1 &&
+ test $(grep Index: cvsEdit3.diff | wc -l) = 3 &&
+ rm -rf diffSandbox &&
+ git clone -q -n . diffSandbox &&
+ (
+ cd diffSandbox &&
+ git checkout v1 &&
+ git apply -p0 --index <"$WORKDIR/cvsEdit1.diff" &&
+ git apply -p0 --index <"$WORKDIR/cvsEdit2-N.diff" &&
+ git apply -p0 --directory=adir --index <"$WORKDIR/cvsEdit3.diff" &&
+ git diff --exit-code v2
+ ) >"check_diff_apply.out" 2>&1
+'
+
+test_expect_success 'validate v1.2 diff saved during last cvswork2 edit' '
+ test $(grep Index: cvsEdit3-v1.2.diff | wc -l) = 9 &&
+ check_diff cvsEdit3-v1.2.diff v1.2 v2
+'
+
+test_expect_success 'validate v1.2 v1 diff saved during last cvswork2 edit' '
+ test $(grep Index: cvsEdit3-v1.2-v1.diff | wc -l) = 9 &&
+ check_diff cvsEdit3-v1.2-v1.diff v1.2 v1
+'
+
+test_expect_success 'cvs up [cvswork2]' '
+ ( cd cvswork2 && cvs -f up ) >cvs.log 2>&1 &&
+ check_start_tree cvswork2 &&
+ check_file cvswork2 textfile.c v2 &&
+ check_file cvswork2 adir/afile v2 &&
+ check_file cvswork2 adir/a2file v2 &&
+ check_file cvswork2 adir/bdir/bfile v2 &&
+ check_file cvswork2 adir/bdir/b4file v2 &&
+ check_end_full_tree cvswork2 v2
+'
+
+test_expect_success 'cvs up -r b2 [back to cvswork]' '
+ ( cd cvswork && cvs -f up -r b2 ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v2 &&
+ check_file cvswork adir/afile v2 &&
+ check_file cvswork adir/a2file v2 &&
+ check_file cvswork adir/bdir/bfile v2 &&
+ check_file cvswork adir/bdir/b4file v2 &&
+ check_end_full_tree cvswork v2
+'
+
+test_expect_success 'cvs up -r b1' '
+ ( cd cvswork && cvs -f up -r b1 ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1.2 &&
+ check_file cvswork t3 v1.2 &&
+ check_file cvswork adir/afile v1.2 &&
+ check_file cvswork adir/a3file v1.2 &&
+ check_file cvswork adir/bdir/bfile v1.2 &&
+ check_file cvswork adir/bdir/b3file v1.2 &&
+ check_file cvswork cdir/cfile v1.2 &&
+ check_end_full_tree cvswork v1.2
+'
+
+test_expect_success 'cvs up -A' '
+ ( cd cvswork && cvs -f up -A ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1 &&
+ check_file cvswork t2 v1 &&
+ check_file cvswork adir/afile v1 &&
+ check_file cvswork adir/a2file v1 &&
+ check_file cvswork adir/bdir/bfile v1 &&
+ check_file cvswork adir/bdir/b2file v1 &&
+ check_end_full_tree cvswork v1
+'
+
+test_expect_success 'cvs up (check CVS/Tag files)' '
+ ( cd cvswork && cvs -f up ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1 &&
+ check_file cvswork t2 v1 &&
+ check_file cvswork adir/afile v1 &&
+ check_file cvswork adir/a2file v1 &&
+ check_file cvswork adir/bdir/bfile v1 &&
+ check_file cvswork adir/bdir/b2file v1 &&
+ check_end_full_tree cvswork v1
+'
+
+# This is not really legal CVS, but it seems to work anyway:
+test_expect_success 'cvs up -r heads/b1' '
+ ( cd cvswork && cvs -f up -r heads/b1 ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1.2 &&
+ check_file cvswork t3 v1.2 &&
+ check_file cvswork adir/afile v1.2 &&
+ check_file cvswork adir/a3file v1.2 &&
+ check_file cvswork adir/bdir/bfile v1.2 &&
+ check_file cvswork adir/bdir/b3file v1.2 &&
+ check_file cvswork cdir/cfile v1.2 &&
+ check_end_full_tree cvswork v1.2
+'
+
+# But this should work even if CVS client checks -r more carefully:
+test_expect_success 'cvs up -r heads_-s-b2 (cvsserver escape mechanism)' '
+ ( cd cvswork && cvs -f up -r heads_-s-b2 ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v2 &&
+ check_file cvswork adir/afile v2 &&
+ check_file cvswork adir/a2file v2 &&
+ check_file cvswork adir/bdir/bfile v2 &&
+ check_file cvswork adir/bdir/b4file v2 &&
+ check_end_full_tree cvswork v2
+'
+
+v1hash=$(git rev-parse v1)
+test_expect_success 'cvs up -r $(git rev-parse v1)' '
+ test -n "$v1hash" &&
+ ( cd cvswork && cvs -f up -r "$v1hash" ) >cvs.log 2>&1 &&
+ check_start_tree cvswork &&
+ check_file cvswork textfile.c v1 &&
+ check_file cvswork t2 v1 &&
+ check_file cvswork adir/afile v1 &&
+ check_file cvswork adir/a2file v1 &&
+ check_file cvswork adir/bdir/bfile v1 &&
+ check_file cvswork adir/bdir/b2file v1 &&
+ check_end_full_tree cvswork v1
+'
+
+test_expect_success 'cvs diff -r v1 -u' '
+ ( cd cvswork && cvs -f diff -r v1 -u ) >cvsDiff.out 2>cvs.log &&
+ test ! -s cvsDiff.out &&
+ test ! -s cvs.log
+'
+
+test_expect_success 'cvs diff -N -r v2 -u' '
+ ( cd cvswork && ! cvs -f diff -N -r v2 -u ) >cvsDiff.out 2>cvs.log &&
+ test ! -s cvs.log &&
+ test -s cvsDiff.out &&
+ check_diff cvsDiff.out v2 v1 >check_diff.out 2>&1
+'
+
+test_expect_success 'cvs diff -N -r v2 -r v1.2' '
+ ( cd cvswork && ! cvs -f diff -N -r v2 -r v1.2 -u ) >cvsDiff.out 2>cvs.log &&
+ test ! -s cvs.log &&
+ test -s cvsDiff.out &&
+ check_diff cvsDiff.out v2 v1.2 >check_diff.out 2>&1
+'
+
+test_expect_success 'apply early [cvswork3] diff to b3' '
+ git clone -q . gitwork3 &&
+ (
+ cd gitwork3 &&
+ git checkout -b b3 v1 &&
+ git apply -p0 --index <"$WORKDIR/cvswork3edit.diff" &&
+ git commit -m "cvswork3 edits applied"
+ ) &&
+ git fetch gitwork3 b3:b3 &&
+ git tag v3 b3
+'
+
+test_expect_success 'check [cvswork3] diff' '
+ ( cd cvswork3 && ! cvs -f diff -N -u ) >"$WORKDIR/cvsDiff.out" 2>cvs.log &&
+ test ! -s cvs.log &&
+ test -s cvsDiff.out &&
+ test $(grep Index: cvsDiff.out | wc -l) = 3 &&
+ test_cmp cvsDiff.out cvswork3edit.diff &&
+ check_diff cvsDiff.out v1 v3 >check_diff.out 2>&1
+'
+
+test_expect_success 'merge early [cvswork3] b3 with b1' '
+ ( cd gitwork3 && git merge "message" HEAD b1 ) &&
+ git fetch gitwork3 b3:b3 &&
+ git tag v3merged b3 &&
+ git push --tags gitcvs.git b3:b3
+'
+
+# This test would fail if cvsserver properly created a ".#afile"* file
+# for the merge.
+# TODO: Validate that the .# file was saved properly, and then
+# delete/ignore it when checking the tree.
+test_expect_success 'cvs up dirty [cvswork3]' '
+ (
+ cd cvswork3 &&
+ cvs -f up &&
+ ! cvs -f diff -N -u >"$WORKDIR/cvsDiff.out"
+ ) >cvs.log 2>&1 &&
+ test -s cvsDiff.out &&
+ test $(grep Index: cvsDiff.out | wc -l) = 2 &&
+ check_start_tree cvswork3 &&
+ check_file cvswork3 textfile.c v3merged &&
+ check_file cvswork3 t3 v3merged &&
+ check_file cvswork3 adir/afile v3merged &&
+ check_file cvswork3 adir/a3file v3merged &&
+ check_file cvswork3 adir/afile5 v3merged &&
+ check_file cvswork3 adir/bdir/bfile v3merged &&
+ check_file cvswork3 adir/bdir/b3file v3merged &&
+ check_file cvswork3 cdir/cfile v3merged &&
+ check_end_full_tree cvswork3 v3merged
+'
+
+# TODO: test cvs status
+
+test_expect_success 'cvs commit [cvswork3]' '
+ (
+ cd cvswork3 &&
+ cvs -f commit -m "dirty sandbox after auto-merge"
+ ) >cvs.log 2>&1 &&
+ check_start_tree cvswork3 &&
+ check_file cvswork3 textfile.c v3merged &&
+ check_file cvswork3 t3 v3merged &&
+ check_file cvswork3 adir/afile v3merged &&
+ check_file cvswork3 adir/a3file v3merged &&
+ check_file cvswork3 adir/afile5 v3merged &&
+ check_file cvswork3 adir/bdir/bfile v3merged &&
+ check_file cvswork3 adir/bdir/b3file v3merged &&
+ check_file cvswork3 cdir/cfile v3merged &&
+ check_end_full_tree cvswork3 v3merged &&
+ git fetch gitcvs.git b3:b4 &&
+ git tag v4.1 b4 &&
+ git diff --exit-code v4.1 v3merged >check_diff_apply.out 2>&1
+'
+
+test_done
--- /dev/null
+#!/bin/sh
+
+# An example hook script to verify what is about to be pushed. Called by "git
+# push" after it has checked the remote status, but before anything has been
+# pushed. If this script exits with a non-zero status nothing will be pushed.
+#
+# This hook is called with the following parameters:
+#
+# $1 -- Name of the remote to which the push is being done
+# $2 -- URL to which the push is being done
+#
+# If pushing without using a named remote, those arguments will be equal.
+#
+# Information about the commits which are being pushed is supplied as lines to
+# the standard input in the form:
+#
+# <local ref> <local sha1> <remote ref> <remote sha1>
+#
+# This sample shows how to prevent push of commits where the log message starts
+# with "WIP" (work in progress).
+
+remote="$1"
+url="$2"
+
+z40=0000000000000000000000000000000000000000
+
+IFS=' '
+while read local_ref local_sha remote_ref remote_sha
+do
+	if [ "$local_sha" = $z40 ]
+	then
+		# Handle delete
+		:
+	else
+ if [ "$remote_sha" = $z40 ]
+ then
+ # New branch, examine all commits
+ range="$local_sha"
+ else
+ # Update to existing branch, examine new commits
+ range="$remote_sha..$local_sha"
+ fi
+
+ # Check for WIP commit
+ commit=`git rev-list -n 1 --grep '^WIP' "$range"`
+ if [ -n "$commit" ]
+ then
+ echo "Found WIP commit in $local_ref, not pushing"
+ exit 1
+ fi
+ fi
+done
+
+exit 0
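To try the sample above (assuming it ends up, like Git's other sample hooks, as
.git/hooks/pre-push.sample in a freshly initialized repository), drop the
".sample" suffix and make the file executable; "origin" and "master" below are
placeholder names and the transcript is only illustrative:

    $ cp .git/hooks/pre-push.sample .git/hooks/pre-push
    $ chmod +x .git/hooks/pre-push
    $ git commit --allow-empty -m "WIP: not ready yet"
    $ git push origin master
    Found WIP commit in refs/heads/master, not pushing

A push that has to go through anyway can bypass the hook with
"git push --no-verify", which corresponds to the TRANSPORT_PUSH_NO_HOOK flag
seen further down.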
+#ifdef USE_WILDMATCH
+#undef USE_WILDMATCH /* We need the real fnmatch implementation here */
+#endif
#include "cache.h"
#include "wildmatch.h"
+static int perf(int ac, char **av)
+{
+ struct timeval tv1, tv2;
+ struct stat st;
+ int fd, i, n, flags1 = 0, flags2 = 0;
+ char *buffer, *p;
+ uint32_t usec1, usec2;
+ const char *lang;
+ const char *file = av[0];
+ const char *pattern = av[1];
+
+ lang = getenv("LANG");
+ if (lang && strcmp(lang, "C"))
+		die("Please test it in the C locale.");
+
+ if ((fd = open(file, O_RDONLY)) == -1 || fstat(fd, &st))
+ die_errno("file open");
+
+ buffer = xmalloc(st.st_size + 2);
+ if (read(fd, buffer, st.st_size) != st.st_size)
+ die_errno("read");
+
+ buffer[st.st_size] = '\0';
+ buffer[st.st_size + 1] = '\0';
+ for (i = 0; i < st.st_size; i++)
+ if (buffer[i] == '\n')
+ buffer[i] = '\0';
+
+ n = atoi(av[2]);
+ if (av[3] && !strcmp(av[3], "pathname")) {
+ flags1 = WM_PATHNAME;
+ flags2 = FNM_PATHNAME;
+ }
+
+ gettimeofday(&tv1, NULL);
+ for (i = 0; i < n; i++) {
+ for (p = buffer; *p; p += strlen(p) + 1)
+ wildmatch(pattern, p, flags1, NULL);
+ }
+ gettimeofday(&tv2, NULL);
+
+ usec1 = (uint32_t)tv2.tv_sec * 1000000 + tv2.tv_usec;
+ usec1 -= (uint32_t)tv1.tv_sec * 1000000 + tv1.tv_usec;
+ printf("wildmatch %ds %dus\n",
+ (int)(usec1 / 1000000),
+ (int)(usec1 % 1000000));
+
+ gettimeofday(&tv1, NULL);
+ for (i = 0; i < n; i++) {
+ for (p = buffer; *p; p += strlen(p) + 1)
+ fnmatch(pattern, p, flags2);
+ }
+ gettimeofday(&tv2, NULL);
+
+ usec2 = (uint32_t)tv2.tv_sec * 1000000 + tv2.tv_usec;
+ usec2 -= (uint32_t)tv1.tv_sec * 1000000 + tv1.tv_usec;
+ if (usec2 > usec1)
+ printf("fnmatch %ds %dus or %.2f%% slower\n",
+ (int)((usec2 - usec1) / 1000000),
+ (int)((usec2 - usec1) % 1000000),
+ (float)(usec2 - usec1) / usec1 * 100);
+ else
+ printf("fnmatch %ds %dus or %.2f%% faster\n",
+ (int)((usec1 - usec2) / 1000000),
+ (int)((usec1 - usec2) % 1000000),
+ (float)(usec1 - usec2) / usec1 * 100);
+ return 0;
+}
+
int main(int argc, char **argv)
{
int i;
+
+ if (!strcmp(argv[1], "perf"))
+ return perf(argc - 2, argv + 2);
+
for (i = 2; i < argc; i++) {
if (argv[i][0] == '/')
die("Forward slash is not allowed at the beginning of the\n"
argv[i] += 3;
}
if (!strcmp(argv[1], "wildmatch"))
- return !!wildmatch(argv[3], argv[2], 0);
+ return !!wildmatch(argv[3], argv[2], WM_PATHNAME, NULL);
else if (!strcmp(argv[1], "iwildmatch"))
- return !!wildmatch(argv[3], argv[2], FNM_CASEFOLD);
+ return !!wildmatch(argv[3], argv[2], WM_PATHNAME | WM_CASEFOLD, NULL);
+ else if (!strcmp(argv[1], "pathmatch"))
+ return !!wildmatch(argv[3], argv[2], 0, NULL);
else if (!strcmp(argv[1], "fnmatch"))
return !!fnmatch(argv[3], argv[2], FNM_PATHNAME);
else
die("Aborting.");
}
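For illustration only (the helper is built as "test-wildmatch" in the source
tree, and the argument order is <mode> <text> <pattern>, matching the dispatch
above): the program exits with 0 when wildmatch() returns WM_MATCH and
non-zero otherwise, so one would expect

    $ ./test-wildmatch wildmatch 'foo/a/bar' 'foo/**/bar'; echo $?
    0
    $ ./test-wildmatch wildmatch 'foo/a/bar' 'foo/*bar'; echo $?
    1
    $ ./test-wildmatch pathmatch 'foo/a/bar' 'foo/*bar'; echo $?
    0

i.e. under WM_PATHNAME a single "*" does not cross "/", "**" spans any number
of directory levels, and "pathmatch" (no flags) lets even a single "*" match
across "/".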
+static int run_pre_push_hook(struct transport *transport,
+ struct ref *remote_refs)
+{
+ int ret = 0, x;
+ struct ref *r;
+ struct child_process proc;
+ struct strbuf buf;
+ const char *argv[4];
+
+ if (!(argv[0] = find_hook("pre-push")))
+ return 0;
+
+ argv[1] = transport->remote->name;
+ argv[2] = transport->url;
+ argv[3] = NULL;
+
+ memset(&proc, 0, sizeof(proc));
+ proc.argv = argv;
+ proc.in = -1;
+
+ if (start_command(&proc)) {
+ finish_command(&proc);
+ return -1;
+ }
+
+ strbuf_init(&buf, 256);
+
+ for (r = remote_refs; r; r = r->next) {
+ if (!r->peer_ref) continue;
+ if (r->status == REF_STATUS_REJECT_NONFASTFORWARD) continue;
+ if (r->status == REF_STATUS_UPTODATE) continue;
+
+ strbuf_reset(&buf);
+		strbuf_addf(&buf, "%s %s %s %s\n",
+ r->peer_ref->name, sha1_to_hex(r->new_sha1),
+ r->name, sha1_to_hex(r->old_sha1));
+
+ if (write_in_full(proc.in, buf.buf, buf.len) != buf.len) {
+ ret = -1;
+ break;
+ }
+ }
+
+ strbuf_release(&buf);
+
+ x = close(proc.in);
+ if (!ret)
+ ret = x;
+
+ x = finish_command(&proc);
+ if (!ret)
+ ret = x;
+
+ return ret;
+}
+
int transport_push(struct transport *transport,
int refspec_nr, const char **refspec, int flags,
unsigned int *reject_reasons)
flags & TRANSPORT_PUSH_MIRROR,
flags & TRANSPORT_PUSH_FORCE);
+ if (!(flags & TRANSPORT_PUSH_NO_HOOK))
+ if (run_pre_push_hook(transport, remote_refs))
+ return -1;
+
if ((flags & TRANSPORT_RECURSE_SUBMODULES_ON_DEMAND) && !is_bare_repository()) {
struct ref *ref = remote_refs;
for (; ref; ref = ref->next)
#define TRANSPORT_RECURSE_SUBMODULES_CHECK 64
#define TRANSPORT_PUSH_PRUNE 128
#define TRANSPORT_RECURSE_SUBMODULES_ON_DEMAND 256
+#define TRANSPORT_PUSH_NO_HOOK 512
#define TRANSPORT_SUMMARY_WIDTH (2 * DEFAULT_ABBREV + 3)
#define TRANSPORT_SUMMARY(x) (int)(TRANSPORT_SUMMARY_WIDTH + strlen(x) - gettext_width(x)), (x)
if (!core_apply_sparse_checkout || !o->update)
o->skip_sparse_checkout = 1;
if (!o->skip_sparse_checkout) {
- if (add_excludes_from_file_to_list(git_path("info/sparse-checkout"), "", 0, NULL, &el, 0) < 0)
+ if (add_excludes_from_file_to_list(git_path("info/sparse-checkout"), "", 0, &el, 0) < 0)
o->skip_sparse_checkout = 1;
else
o->el = ⪙
#define NEGATE_CLASS '!'
#define NEGATE_CLASS2 '^'
-#define FALSE 0
-#define TRUE 1
-
#define CC_EQ(class, len, litmatch) ((len) == sizeof (litmatch)-1 \
&& *(class) == *(litmatch) \
&& strncmp((char*)class, litmatch, len) == 0)
#define ISXDIGIT(c) (ISASCII(c) && isxdigit(c))
/* Match pattern "p" against "text" */
-static int dowild(const uchar *p, const uchar *text, int force_lower_case)
+static int dowild(const uchar *p, const uchar *text, unsigned int flags)
{
uchar p_ch;
const uchar *pattern = p;
int matched, match_slash, negated;
uchar t_ch, prev_ch;
if ((t_ch = *text) == '\0' && p_ch != '*')
- return ABORT_ALL;
- if (force_lower_case && ISUPPER(t_ch))
+ return WM_ABORT_ALL;
+ if ((flags & WM_CASEFOLD) && ISUPPER(t_ch))
t_ch = tolower(t_ch);
- if (force_lower_case && ISUPPER(p_ch))
+ if ((flags & WM_CASEFOLD) && ISUPPER(p_ch))
p_ch = tolower(p_ch);
switch (p_ch) {
case '\\':
/* FALLTHROUGH */
default:
if (t_ch != p_ch)
- return NOMATCH;
+ return WM_NOMATCH;
continue;
case '?':
/* Match anything but '/'. */
- if (t_ch == '/')
- return NOMATCH;
+ if ((flags & WM_PATHNAME) && t_ch == '/')
+ return WM_NOMATCH;
continue;
case '*':
if (*++p == '*') {
const uchar *prev_p = p - 2;
while (*++p == '*') {}
- if ((prev_p < pattern || *prev_p == '/') &&
+ if (!(flags & WM_PATHNAME))
+ /* without WM_PATHNAME, '*' == '**' */
+ match_slash = 1;
+ else if ((prev_p < pattern || *prev_p == '/') &&
(*p == '\0' || *p == '/' ||
(p[0] == '\\' && p[1] == '/'))) {
/*
* both foo/bar and foo/a/bar.
*/
if (p[0] == '/' &&
- dowild(p + 1, text, force_lower_case) == MATCH)
- return MATCH;
- match_slash = TRUE;
+ dowild(p + 1, text, flags) == WM_MATCH)
+ return WM_MATCH;
+ match_slash = 1;
} else
- return ABORT_MALFORMED;
+ return WM_ABORT_MALFORMED;
} else
- match_slash = FALSE;
+ /* without WM_PATHNAME, '*' == '**' */
+ match_slash = flags & WM_PATHNAME ? 0 : 1;
if (*p == '\0') {
/* Trailing "**" matches everything. Trailing "*" matches
* only if there are no more slash characters. */
if (!match_slash) {
if (strchr((char*)text, '/') != NULL)
- return NOMATCH;
+ return WM_NOMATCH;
}
- return MATCH;
+ return WM_MATCH;
+ } else if (!match_slash && *p == '/') {
+ /*
+ * _one_ asterisk followed by a slash
+ * with WM_PATHNAME matches the next
+ * directory
+ */
+ const char *slash = strchr((char*)text, '/');
+ if (!slash)
+ return WM_NOMATCH;
+ text = (const uchar*)slash;
+ /* the slash is consumed by the top-level for loop */
+ break;
}
while (1) {
if (t_ch == '\0')
break;
- if ((matched = dowild(p, text, force_lower_case)) != NOMATCH) {
- if (!match_slash || matched != ABORT_TO_STARSTAR)
+ /*
+ * Try to advance faster when an asterisk is
+ * followed by a literal. We know in this case
+				 * that the string before the literal
+ * must belong to "*".
+ * If match_slash is false, do not look past
+ * the first slash as it cannot belong to '*'.
+ */
+ if (!is_glob_special(*p)) {
+ p_ch = *p;
+ if ((flags & WM_CASEFOLD) && ISUPPER(p_ch))
+ p_ch = tolower(p_ch);
+ while ((t_ch = *text) != '\0' &&
+ (match_slash || t_ch != '/')) {
+ if ((flags & WM_CASEFOLD) && ISUPPER(t_ch))
+ t_ch = tolower(t_ch);
+ if (t_ch == p_ch)
+ break;
+ text++;
+ }
+ if (t_ch != p_ch)
+ return WM_NOMATCH;
+ }
+ if ((matched = dowild(p, text, flags)) != WM_NOMATCH) {
+ if (!match_slash || matched != WM_ABORT_TO_STARSTAR)
return matched;
} else if (!match_slash && t_ch == '/')
- return ABORT_TO_STARSTAR;
+ return WM_ABORT_TO_STARSTAR;
t_ch = *++text;
}
- return ABORT_ALL;
+ return WM_ABORT_ALL;
case '[':
p_ch = *++p;
#ifdef NEGATE_CLASS2
if (p_ch == NEGATE_CLASS2)
p_ch = NEGATE_CLASS;
#endif
- /* Assign literal TRUE/FALSE because of "matched" comparison. */
- negated = p_ch == NEGATE_CLASS? TRUE : FALSE;
+ /* Assign literal 1/0 because of "matched" comparison. */
+ negated = p_ch == NEGATE_CLASS ? 1 : 0;
if (negated) {
/* Inverted character class. */
p_ch = *++p;
}
prev_ch = 0;
- matched = FALSE;
+ matched = 0;
do {
if (!p_ch)
- return ABORT_ALL;
+ return WM_ABORT_ALL;
if (p_ch == '\\') {
p_ch = *++p;
if (!p_ch)
- return ABORT_ALL;
+ return WM_ABORT_ALL;
if (t_ch == p_ch)
- matched = TRUE;
+ matched = 1;
} else if (p_ch == '-' && prev_ch && p[1] && p[1] != ']') {
p_ch = *++p;
if (p_ch == '\\') {
p_ch = *++p;
if (!p_ch)
- return ABORT_ALL;
+ return WM_ABORT_ALL;
}
if (t_ch <= p_ch && t_ch >= prev_ch)
- matched = TRUE;
+ matched = 1;
p_ch = 0; /* This makes "prev_ch" get set to 0. */
} else if (p_ch == '[' && p[1] == ':') {
const uchar *s;
int i;
for (s = p += 2; (p_ch = *p) && p_ch != ']'; p++) {} /*SHARED ITERATOR*/
if (!p_ch)
- return ABORT_ALL;
+ return WM_ABORT_ALL;
i = p - s - 1;
if (i < 0 || p[-1] != ':') {
/* Didn't find ":]", so treat like a normal set. */
p = s - 2;
p_ch = '[';
if (t_ch == p_ch)
- matched = TRUE;
+ matched = 1;
continue;
}
if (CC_EQ(s,i, "alnum")) {
if (ISALNUM(t_ch))
- matched = TRUE;
+ matched = 1;
} else if (CC_EQ(s,i, "alpha")) {
if (ISALPHA(t_ch))
- matched = TRUE;
+ matched = 1;
} else if (CC_EQ(s,i, "blank")) {
if (ISBLANK(t_ch))
- matched = TRUE;
+ matched = 1;
} else if (CC_EQ(s,i, "cntrl")) {
if (ISCNTRL(t_ch))
- matched = TRUE;
+ matched = 1;
} else if (CC_EQ(s,i, "digit")) {
if (ISDIGIT(t_ch))
- matched = TRUE;
+ matched = 1;
} else if (CC_EQ(s,i, "graph")) {
if (ISGRAPH(t_ch))
- matched = TRUE;
+ matched = 1;
} else if (CC_EQ(s,i, "lower")) {
if (ISLOWER(t_ch))
- matched = TRUE;
+ matched = 1;
} else if (CC_EQ(s,i, "print")) {
if (ISPRINT(t_ch))
- matched = TRUE;
+ matched = 1;
} else if (CC_EQ(s,i, "punct")) {
if (ISPUNCT(t_ch))
- matched = TRUE;
+ matched = 1;
} else if (CC_EQ(s,i, "space")) {
if (ISSPACE(t_ch))
- matched = TRUE;
+ matched = 1;
} else if (CC_EQ(s,i, "upper")) {
if (ISUPPER(t_ch))
- matched = TRUE;
+ matched = 1;
} else if (CC_EQ(s,i, "xdigit")) {
if (ISXDIGIT(t_ch))
- matched = TRUE;
+ matched = 1;
} else /* malformed [:class:] string */
- return ABORT_ALL;
+ return WM_ABORT_ALL;
p_ch = 0; /* This makes "prev_ch" get set to 0. */
} else if (t_ch == p_ch)
- matched = TRUE;
+ matched = 1;
} while (prev_ch = p_ch, (p_ch = *++p) != ']');
- if (matched == negated || t_ch == '/')
- return NOMATCH;
+ if (matched == negated ||
+ ((flags & WM_PATHNAME) && t_ch == '/'))
+ return WM_NOMATCH;
continue;
}
}
- return *text ? NOMATCH : MATCH;
+ return *text ? WM_NOMATCH : WM_MATCH;
}
/* Match the "pattern" against the "text" string. */
-int wildmatch(const char *pattern, const char *text, int flags)
+int wildmatch(const char *pattern, const char *text,
+ unsigned int flags, struct wildopts *wo)
{
- return dowild((const uchar*)pattern, (const uchar*)text,
- flags & FNM_CASEFOLD ? 1 :0);
+ return dowild((const uchar*)pattern, (const uchar*)text, flags);
}
-/* wildmatch.h */
+#ifndef WILDMATCH_H
+#define WILDMATCH_H
-#define ABORT_MALFORMED 2
-#define NOMATCH 1
-#define MATCH 0
-#define ABORT_ALL -1
-#define ABORT_TO_STARSTAR -2
+#define WM_CASEFOLD 1
+#define WM_PATHNAME 2
-int wildmatch(const char *pattern, const char *text, int flags);
+#define WM_ABORT_MALFORMED 2
+#define WM_NOMATCH 1
+#define WM_MATCH 0
+#define WM_ABORT_ALL -1
+#define WM_ABORT_TO_STARSTAR -2
+
+struct wildopts;
+
+int wildmatch(const char *pattern, const char *text,
+ unsigned int flags,
+ struct wildopts *wo);
+#endif
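For callers of the new interface, a minimal sketch (the helper name and the
pattern are made up for illustration; as the implementation above shows, the
struct wildopts pointer is accepted but not yet used, so NULL is fine):

    #include "wildmatch.h"

    /* hypothetical helper: does a refname match a pattern such as "refs/**/master"? */
    static int ref_matches(const char *pattern, const char *refname)
    {
    	/*
    	 * WM_PATHNAME keeps a single "*" from crossing "/", while "**"
    	 * may span any number of path components; WM_CASEFOLD could be
    	 * OR'ed in for a case-insensitive match.
    	 */
    	return wildmatch(pattern, refname, WM_PATHNAME, NULL) == WM_MATCH;
    }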