From: Junio C Hamano Date: Thu, 10 Feb 2011 00:40:12 +0000 (-0800) Subject: Merge branch 'maint-1.7.0' into maint X-Git-Tag: v1.7.4.1~5 X-Git-Url: https://git.lorimer.id.au/gitweb.git/diff_plain/a8e4a5943a63c8fd4a3a9b70ccf4608bcc973707?hp=-c Merge branch 'maint-1.7.0' into maint * maint-1.7.0: fast-import: introduce "feature notes" command fast-import: clarify documentation of "feature" command Conflicts: Documentation/git-fast-import.txt --- a8e4a5943a63c8fd4a3a9b70ccf4608bcc973707 diff --combined Documentation/git-fast-import.txt index 4415e63635,becee8b4e7..02bb49886c --- a/Documentation/git-fast-import.txt +++ b/Documentation/git-fast-import.txt @@@ -92,11 -92,6 +92,11 @@@ OPTION --(no-)-relative-marks= with the --(import|export)-marks= options. +--cat-blob-fd=:: + Specify the file descriptor that will be written to + when the `cat-blob` command is encountered in the stream. + The default behaviour is to write to `stdout`. + --export-pack-edges=:: After creating a packfile, print a line of data to listing the filename of the packfile and the last @@@ -325,11 -320,6 +325,11 @@@ and control the current import process standard output. This command is optional and is not needed to perform an import. +`cat-blob`:: + Causes fast-import to print a blob in 'cat-file --batch' + format to the file descriptor set with `--cat-blob-fd` or + `stdout` if unspecified. + `feature`:: Require that fast-import supports the specified feature, or abort if it does not. @@@ -449,7 -439,7 +449,7 @@@ Marks must be declared (via `mark`) bef * A complete 40 byte or abbreviated commit SHA-1 in hex. * Any valid Git SHA-1 expression that resolves to a commit. See - ``SPECIFYING REVISIONS'' in linkgit:git-rev-parse[1] for details. + ``SPECIFYING REVISIONS'' in linkgit:gitrevisions[7] for details. The special case of restarting an incremental import from the current branch value should be written as: @@@ -492,11 -482,9 +492,11 @@@ External data format: 'M' SP SP SP LF .... + -Here `` can be either a mark reference (`:`) +Here usually `` must be either a mark reference (`:`) set by a prior `blob` command, or a full 40-byte SHA-1 of an -existing Git blob object. +existing Git blob object. If `` is `040000`` then +`` must be the full 40-byte SHA-1 of an existing +Git tree object or a mark reference set with `--import-marks`. Inline data format:: The data content for the file has not been supplied yet. @@@ -521,8 -509,6 +521,8 @@@ in octal. Git only supports the follow * `160000`: A gitlink, SHA-1 of the object refers to a commit in another repository. Git links can only be specified by SHA or through a commit mark. They are used to implement submodules. +* `040000`: A subdirectory. Subdirectories can only be specified by + SHA or through a tree mark set with `--import-marks`. In both formats `` is the complete path of the file to be added (if not already existing) or modified (if already existing). @@@ -542,8 -528,6 +542,8 @@@ The value of `` must be in canoni * contain the special component `.` or `..` (e.g. `foo/./bar` and `foo/../bar` are invalid). +The root of the tree can be represented by an empty string as ``. + It is recommended that `` always be encoded using UTF-8. `filedelete` @@@ -888,33 -872,6 +888,33 @@@ Placing a `progress` command immediatel inform the reader when the `checkpoint` has been completed and it can safely access the refs that fast-import updated. +`cat-blob` +~~~~~~~~~~ +Causes fast-import to print a blob to a file descriptor previously +arranged with the `--cat-blob-fd` argument. 
The command otherwise +has no impact on the current import; its main purpose is to +retrieve blobs that may be in fast-import's memory but not +accessible from the target repository. + +.... + 'cat-blob' SP LF +.... + +The `` can be either a mark reference (`:`) +set previously or a full 40-byte SHA-1 of a Git blob, preexisting or +ready to be written. + +Output uses the same format as `git cat-file --batch`: + +==== + SP 'blob' SP LF + LF +==== + +This command can be used anywhere in the stream that comments are +accepted. In particular, the `cat-blob` command can be used in the +middle of a commit but not in the middle of a `data` command. + `feature` ~~~~~~~~~ Require that fast-import supports the specified feature, or abort if @@@ -941,13 -898,12 +941,20 @@@ import-marks: second, an --import-marks= command-line option overrides any "feature import-marks" command in the stream. +cat-blob:: + Ignored. Versions of fast-import not supporting the + "cat-blob" command will exit with a message indicating so. + This lets the import error out early with a clear message, + rather than wasting time on the early part of an import + before the unsupported command is detected. + + notes:: + Require that the backend support the 'notemodify' (N) + subcommand to the 'commit' command. + Versions of fast-import not supporting notes will exit + with a message indicating so. + ++ `option` ~~~~~~~~ Processes the specified option so that git fast-import behaves in a @@@ -973,7 -929,6 +980,7 @@@ not be passed as option * date-format * import-marks * export-marks +* cat-blob-fd * force Crash Reports @@@ -1274,13 -1229,6 +1281,13 @@@ and lazy loading of subtrees, allows fa projects with 2,000+ branches and 45,114+ files in a very limited memory footprint (less than 2.7 MiB per active branch). +Signals +------- +Sending *SIGUSR1* to the 'git fast-import' process ends the current +packfile early, simulating a `checkpoint` command. The impatient +operator can use this facility to peek at the objects and refs from an +import in progress, at the cost of some added running time and worse +compression. Author ------ diff --combined fast-import.c index 60f26fe473,ff56ea2891..970d8470ed --- a/fast-import.c +++ b/fast-import.c @@@ -132,17 -132,14 +132,17 @@@ Format of STDIN stream ts ::= # time since the epoch in seconds, ascii base10 notation; tz ::= # GIT style timezone; - # note: comments may appear anywhere in the input, except - # within a data command. Any form of the data command - # always escapes the related input from comment processing. + # note: comments and cat requests may appear anywhere + # in the input, except within a data command. Any form + # of the data command always escapes the related input + # from comment processing. # # In case it is not clear, the '#' that starts the comment # must be the first character on that line (an lf # preceded it). 
# + cat_blob ::= 'cat-blob' sp (hexsha1 | idnum) lf; + comment ::= '#' not_lf* lf; not_lf ::= # Any byte that is not ASCII newline (LF); */ @@@ -159,7 -156,6 +159,7 @@@ #include "csum-file.h" #include "quote.h" #include "exec_cmd.h" +#include "dir.h" #define PACK_ID_BITS 16 #define MAX_PACK_ID ((1<idx.sha1)) return e; - p = e; e = e->next; } e = new_object(sha1); - e->next = NULL; + e->next = object_table[h]; e->idx.offset = 0; - if (p) - p->next = e; - else - object_table[h] = e; + object_table[h] = e; return e; } @@@ -1012,6 -980,29 +1012,6 @@@ static void cycle_packfile(void start_packfile(); } -static size_t encode_header( - enum object_type type, - uintmax_t size, - unsigned char *hdr) -{ - int n = 1; - unsigned char c; - - if (type < OBJ_COMMIT || type > OBJ_REF_DELTA) - die("bad type %d", type); - - c = (type << 4) | (size & 15); - size >>= 4; - while (size) { - *hdr++ = c | 0x80; - c = size & 0x7f; - size >>= 7; - n++; - } - *hdr = c; - return n; -} - static int store_object( enum object_type type, struct strbuf *dat, @@@ -1112,7 -1103,7 +1112,7 @@@ delta_count_by_type[type]++; e->depth = last->depth + 1; - hdrlen = encode_header(OBJ_OFS_DELTA, deltalen, hdr); + hdrlen = encode_in_pack_object_header(OBJ_OFS_DELTA, deltalen, hdr); sha1write(pack_file, hdr, hdrlen); pack_size += hdrlen; @@@ -1123,7 -1114,7 +1123,7 @@@ pack_size += sizeof(hdr) - pos; } else { e->depth = 0; - hdrlen = encode_header(type, dat->len, hdr); + hdrlen = encode_in_pack_object_header(type, dat->len, hdr); sha1write(pack_file, hdr, hdrlen); pack_size += hdrlen; } @@@ -1197,7 -1188,7 +1197,7 @@@ static void stream_blob(uintmax_t len, memset(&s, 0, sizeof(s)); deflateInit(&s, pack_compression_level); - hdrlen = encode_header(OBJ_BLOB, len, out_buf); + hdrlen = encode_in_pack_object_header(OBJ_BLOB, len, out_buf); if (out_sz <= hdrlen) die("impossibly large object header"); @@@ -1469,20 -1460,6 +1469,20 @@@ static void store_tree(struct tree_entr t->entry_count -= del; } +static void tree_content_replace( + struct tree_entry *root, + const unsigned char *sha1, + const uint16_t mode, + struct tree_content *newtree) +{ + if (!S_ISDIR(mode)) + die("Root cannot be a non-directory"); + hashcpy(root->versions[1].sha1, sha1); + if (root->tree) + release_tree_content_recursive(root->tree); + root->tree = newtree; +} + static int tree_content_set( struct tree_entry *root, const char *p, @@@ -1490,7 -1467,7 +1490,7 @@@ const uint16_t mode, struct tree_content *subtree) { - struct tree_content *t = root->tree; + struct tree_content *t; const char *slash1; unsigned int i, n; struct tree_entry *e; @@@ -1505,12 -1482,9 +1505,12 @@@ if (!slash1 && !S_ISDIR(mode) && subtree) die("Non-directories cannot have subtrees"); + if (!root->tree) + load_tree(root); + t = root->tree; for (i = 0; i < t->entry_count; i++) { e = t->entries[i]; - if (e->name->str_len == n && !strncmp(p, e->name->str_dat, n)) { + if (e->name->str_len == n && !strncmp_icase(p, e->name->str_dat, n)) { if (!slash1) { if (!S_ISDIR(mode) && e->versions[1].mode == mode @@@ -1563,7 -1537,7 +1563,7 @@@ static int tree_content_remove const char *p, struct tree_entry *backup_leaf) { - struct tree_content *t = root->tree; + struct tree_content *t; const char *slash1; unsigned int i, n; struct tree_entry *e; @@@ -1574,20 -1548,9 +1574,20 @@@ else n = strlen(p); + if (!root->tree) + load_tree(root); + t = root->tree; for (i = 0; i < t->entry_count; i++) { e = t->entries[i]; - if (e->name->str_len == n && !strncmp(p, e->name->str_dat, n)) { + if (e->name->str_len == n && 
!strncmp_icase(p, e->name->str_dat, n)) { + if (slash1 && !S_ISDIR(e->versions[1].mode)) + /* + * If p names a file in some subdirectory, and a + * file or symlink matching the name of the + * parent directory of p exists, then p cannot + * exist and need not be deleted. + */ + return 1; if (!slash1 || !S_ISDIR(e->versions[1].mode)) goto del_entry; if (!e->tree) @@@ -1624,7 -1587,7 +1624,7 @@@ static int tree_content_get const char *p, struct tree_entry *leaf) { - struct tree_content *t = root->tree; + struct tree_content *t; const char *slash1; unsigned int i, n; struct tree_entry *e; @@@ -1635,12 -1598,9 +1635,12 @@@ else n = strlen(p); + if (!root->tree) + load_tree(root); + t = root->tree; for (i = 0; i < t->entry_count; i++) { e = t->entries[i]; - if (e->name->str_len == n && !strncmp(p, e->name->str_dat, n)) { + if (e->name->str_len == n && !strncmp_icase(p, e->name->str_dat, n)) { if (!slash1) { memcpy(leaf, e, sizeof(*leaf)); if (e->tree && is_null_sha1(e->versions[1].sha1)) @@@ -1729,7 -1689,7 +1729,7 @@@ static void dump_marks_helper(FILE *f if (m->shift) { for (k = 0; k < 1024; k++) { if (m->data.sets[k]) - dump_marks_helper(f, (base + k) << m->shift, + dump_marks_helper(f, base + (k << m->shift), m->data.sets[k]); } } else { @@@ -1836,7 -1796,7 +1836,7 @@@ static int read_next_command(void return EOF; } - do { + for (;;) { if (unread_command_buf) { unread_command_buf = 0; } else { @@@ -1869,14 -1829,9 +1869,14 @@@ rc->prev->next = rc; cmd_tail = rc; } - } while (command_buf.buf[0] == '#'); - - return 0; + if (!prefixcmp(command_buf.buf, "cat-blob ")) { + parse_cat_blob(); + continue; + } + if (command_buf.buf[0] == '#') + continue; + return 0; + } } static void skip_optional_lf(void) @@@ -2199,7 -2154,6 +2199,7 @@@ static void file_change_m(struct branc case S_IFREG | 0644: case S_IFREG | 0755: case S_IFLNK: + case S_IFDIR: case S_IFGITLINK: /* ok */ break; @@@ -2231,12 -2185,6 +2231,12 @@@ p = uq.buf; } + /* Git does not track empty, non-toplevel directories. */ + if (S_ISDIR(mode) && !memcmp(sha1, EMPTY_TREE_SHA1_BIN, 20) && *p) { + tree_content_remove(&b->branch_tree, p, NULL); + return; + } + if (S_ISGITLINK(mode)) { if (inline_data) die("Git links cannot be specified 'inline': %s", @@@ -2251,34 -2199,25 +2251,34 @@@ * another repository. */ } else if (inline_data) { + if (S_ISDIR(mode)) + die("Directories cannot be specified 'inline': %s", + command_buf.buf); if (p != uq.buf) { strbuf_addstr(&uq, p); p = uq.buf; } read_next_command(); parse_and_store_blob(&last_blob, sha1, 0); - } else if (oe) { - if (oe->type != OBJ_BLOB) - die("Not a blob (actually a %s): %s", - typename(oe->type), command_buf.buf); } else { - enum object_type type = sha1_object_info(sha1, NULL); + enum object_type expected = S_ISDIR(mode) ? + OBJ_TREE: OBJ_BLOB; + enum object_type type = oe ? oe->type : + sha1_object_info(sha1, NULL); if (type < 0) - die("Blob not found: %s", command_buf.buf); - if (type != OBJ_BLOB) - die("Not a blob (actually a %s): %s", - typename(type), command_buf.buf); + die("%s not found: %s", + S_ISDIR(mode) ? 
"Tree" : "Blob", + command_buf.buf); + if (type != expected) + die("Not a %s (actually a %s): %s", + typename(expected), typename(type), + command_buf.buf); } + if (!*p) { + tree_content_replace(&b->branch_tree, sha1, mode, NULL); + return; + } tree_content_set(&b->branch_tree, p, sha1, mode, NULL); } @@@ -2337,13 -2276,6 +2337,13 @@@ static void file_change_cr(struct branc tree_content_get(&b->branch_tree, s, &leaf); if (!leaf.versions[1].mode) die("Path %s not in branch", s); + if (!*d) { /* C "path/to/subdir" "" */ + tree_content_replace(&b->branch_tree, + leaf.versions[1].sha1, + leaf.versions[1].mode, + leaf.tree); + return; + } tree_content_set(&b->branch_tree, d, leaf.versions[1].sha1, leaf.versions[1].mode, @@@ -2757,95 -2689,14 +2757,95 @@@ static void parse_reset_branch(void unread_command_buf = 1; } -static void parse_checkpoint(void) +static void cat_blob_write(const char *buf, unsigned long size) { + if (write_in_full(cat_blob_fd, buf, size) != size) + die_errno("Write to frontend failed"); +} + +static void cat_blob(struct object_entry *oe, unsigned char sha1[20]) +{ + struct strbuf line = STRBUF_INIT; + unsigned long size; + enum object_type type = 0; + char *buf; + + if (!oe || oe->pack_id == MAX_PACK_ID) { + buf = read_sha1_file(sha1, &type, &size); + } else { + type = oe->type; + buf = gfi_unpack_entry(oe, &size); + } + + /* + * Output based on batch_one_object() from cat-file.c. + */ + if (type <= 0) { + strbuf_reset(&line); + strbuf_addf(&line, "%s missing\n", sha1_to_hex(sha1)); + cat_blob_write(line.buf, line.len); + strbuf_release(&line); + free(buf); + return; + } + if (!buf) + die("Can't read object %s", sha1_to_hex(sha1)); + if (type != OBJ_BLOB) + die("Object %s is a %s but a blob was expected.", + sha1_to_hex(sha1), typename(type)); + strbuf_reset(&line); + strbuf_addf(&line, "%s %s %lu\n", sha1_to_hex(sha1), + typename(type), size); + cat_blob_write(line.buf, line.len); + strbuf_release(&line); + cat_blob_write(buf, size); + cat_blob_write("\n", 1); + free(buf); +} + +static void parse_cat_blob(void) +{ + const char *p; + struct object_entry *oe = oe; + unsigned char sha1[20]; + + /* cat-blob SP LF */ + p = command_buf.buf + strlen("cat-blob "); + if (*p == ':') { + char *x; + oe = find_mark(strtoumax(p + 1, &x, 10)); + if (x == p + 1) + die("Invalid mark: %s", command_buf.buf); + if (!oe) + die("Unknown mark: %s", command_buf.buf); + if (*x) + die("Garbage after mark: %s", command_buf.buf); + hashcpy(sha1, oe->idx.sha1); + } else { + if (get_sha1_hex(p, sha1)) + die("Invalid SHA1: %s", command_buf.buf); + if (p[40]) + die("Garbage after SHA1: %s", command_buf.buf); + oe = find_object(sha1); + } + + cat_blob(oe, sha1); +} + +static void checkpoint(void) +{ + checkpoint_requested = 0; if (object_count) { cycle_packfile(); dump_branches(); dump_tags(); dump_marks(); } +} + +static void parse_checkpoint(void) +{ + checkpoint_requested = 1; skip_optional_lf(); } @@@ -2879,7 -2730,6 +2879,7 @@@ static void option_import_marks(const c } import_marks_file = make_fast_import_path(marks); + safe_create_leading_directories_const(import_marks_file); import_marks_file_from_stream = from_stream; } @@@ -2895,39 -2745,21 +2895,39 @@@ static void option_date_format(const ch die("unknown --date-format argument %s", fmt); } +static unsigned long ulong_arg(const char *option, const char *arg) +{ + char *endptr; + unsigned long rv = strtoul(arg, &endptr, 0); + if (strchr(arg, '-') || endptr == arg || *endptr) + die("%s: argument must be a non-negative integer", option); + return 
rv; +} + static void option_depth(const char *depth) { - max_depth = strtoul(depth, NULL, 0); + max_depth = ulong_arg("--depth", depth); if (max_depth > MAX_DEPTH) die("--depth cannot exceed %u", MAX_DEPTH); } static void option_active_branches(const char *branches) { - max_active_branches = strtoul(branches, NULL, 0); + max_active_branches = ulong_arg("--active-branches", branches); } static void option_export_marks(const char *marks) { export_marks_file = make_fast_import_path(marks); + safe_create_leading_directories_const(export_marks_file); +} + +static void option_cat_blob_fd(const char *fd) +{ + unsigned long n = ulong_arg("--cat-blob-fd", fd); + if (n > (unsigned long) INT_MAX) + die("--cat-blob-fd cannot exceed %d", INT_MAX); + cat_blob_fd = (int) n; } static void option_export_pack_edges(const char *edges) @@@ -2983,14 -2815,14 +2983,16 @@@ static int parse_one_feature(const cha option_import_marks(feature + 13, from_stream); } else if (!prefixcmp(feature, "export-marks=")) { option_export_marks(feature + 13); + } else if (!strcmp(feature, "cat-blob")) { + ; /* Don't die - this feature is supported */ } else if (!prefixcmp(feature, "relative-marks")) { relative_marks_paths = 1; } else if (!prefixcmp(feature, "no-relative-marks")) { relative_marks_paths = 0; } else if (!prefixcmp(feature, "force")) { force_update = 1; + } else if (!strcmp(feature, "notes")) { + ; /* do nothing; we have the feature */ } else { return 0; } @@@ -3061,7 -2893,7 +3063,7 @@@ static int git_pack_config(const char * } static const char fast_import_usage[] = -"git fast-import [--date-format=f] [--max-pack-size=n] [--big-file-threshold=n] [--depth=n] [--active-branches=n] [--export-marks=marks.file]"; +"git fast-import [--date-format=] [--max-pack-size=] [--big-file-threshold=] [--depth=] [--active-branches=] [--export-marks=]"; static void parse_argv(void) { @@@ -3079,11 -2911,6 +3081,11 @@@ if (parse_one_feature(a + 2, 0)) continue; + if (!prefixcmp(a + 2, "cat-blob-fd=")) { + option_cat_blob_fd(a + 2 + strlen("cat-blob-fd=")); + continue; + } + die("unknown option %s", a); } if (i != global_argc) @@@ -3126,7 -2953,6 +3128,7 @@@ int main(int argc, const char **argv prepare_packed_git(); start_packfile(); set_die_routine(die_nicely); + set_checkpoint_signal(); while (read_next_command() != EOF) { if (!strcmp("blob", command_buf.buf)) parse_new_blob(); @@@ -3148,9 -2974,6 +3150,9 @@@ /* ignore non-git options*/; else die("Unsupported command: %s", command_buf.buf); + + if (checkpoint_requested) + checkpoint(); } /* argv hasn't been parsed yet, do so */ diff --combined t/t9301-fast-import-notes.sh index 7cf8cd8a2f,164edf0c3d..463254c727 --- a/t/t9301-fast-import-notes.sh +++ b/t/t9301-fast-import-notes.sh @@@ -120,6 -120,7 +120,7 @@@ test_expect_success 'add notes with sim test_tick cat >input < $GIT_COMMITTER_DATE data <expect <
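
For reference, a minimal shell sketch (not part of the patch itself) of the `cat-blob` command and `--cat-blob-fd=<fd>` option documented in this merge. It assumes a fast-import with this change applied and an initialized repository; the mark number, the byte count, and the `cat-blob.out` file name are illustrative only.

# Responses to `cat-blob` requests go to fd 3, redirected to a file here;
# without --cat-blob-fd they would be written to stdout.
git fast-import --cat-blob-fd=3 3>cat-blob.out <<'STREAM'
feature cat-blob
blob
mark :1
data 6
hello

cat-blob :1
STREAM

# cat-blob.out now holds "<sha1> blob 6" followed by the six blob bytes.
cat cat-blob.out

The leading `feature cat-blob` line is optional; as documented in the hunk above, its only effect is to make a fast-import that lacks the command fail early rather than partway through the stream. A `cat-blob` request may also appear in the middle of a `commit`, but not inside a `data` command.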