# HG changeset patch # User Artem Tikhomirov # Date 1376585030 -7200 # Node ID b4242b7e7dfe72eb6cc68886f3022985cc2f1df9 # Parent 7743a9c10bfa9823ce2ad620acf9965c3a0573f7 Merge command: implement conflict resolution alternatives diff -r 7743a9c10bfa -r b4242b7e7dfe cmdline/org/tmatesoft/hg/console/Merge.java --- a/cmdline/org/tmatesoft/hg/console/Merge.java Wed Aug 14 20:07:26 2013 +0200 +++ b/cmdline/org/tmatesoft/hg/console/Merge.java Thu Aug 15 18:43:50 2013 +0200 @@ -56,8 +56,8 @@ static class Dump implements HgMergeCommand.Mediator { - public void same(HgFileRevision first, HgFileRevision second, Resolver resolver) throws HgCallbackTargetException { - System.out.printf("Unchanged %s:%s", first.getPath(), first.getRevision().shortNotation()); + public void same(HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException { + System.out.printf("Unchanged %s:%s", rev.getPath(), rev.getRevision().shortNotation()); } public void onlyA(HgFileRevision base, HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException { diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/core/HgAddRemoveCommand.java --- a/src/org/tmatesoft/hg/core/HgAddRemoveCommand.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/core/HgAddRemoveCommand.java Thu Aug 15 18:43:50 2013 +0200 @@ -18,7 +18,6 @@ import java.util.LinkedHashSet; -import org.tmatesoft.hg.internal.COWTransaction; import org.tmatesoft.hg.internal.DirstateBuilder; import org.tmatesoft.hg.internal.DirstateReader; import org.tmatesoft.hg.internal.Internals; @@ -123,7 +122,7 @@ progress.worked(1); cancellation.checkCancelled(); } - Transaction.Factory trFactory = new COWTransaction.Factory(); + Transaction.Factory trFactory = implRepo.getTransactionFactory(); Transaction tr = trFactory.create(repo); try { dirstateBuilder.serialize(tr); diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/core/HgCheckoutCommand.java --- a/src/org/tmatesoft/hg/core/HgCheckoutCommand.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/core/HgCheckoutCommand.java Thu Aug 15 18:43:50 2013 +0200 @@ -231,8 +231,8 @@ lastFileMode = workingDirWriter.fmode(); lastFileModificationTime = workingDirWriter.mtime(); return true; - } catch (IOException ex) { - failure = new HgIOException("Failed to write down file revision", ex, workingDirWriter.getDestinationFile()); + } catch (HgIOException ex) { + failure = ex; } catch (HgRuntimeException ex) { failure = new HgLibraryFailureException(ex); } diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/core/HgCommitCommand.java --- a/src/org/tmatesoft/hg/core/HgCommitCommand.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/core/HgCommitCommand.java Thu Aug 15 18:43:50 2013 +0200 @@ -20,7 +20,6 @@ import java.io.IOException; -import org.tmatesoft.hg.internal.COWTransaction; import org.tmatesoft.hg.internal.CommitFacility; import org.tmatesoft.hg.internal.CompleteRepoLock; import org.tmatesoft.hg.internal.FileContentSupplier; @@ -112,7 +111,8 @@ newRevision = Nodeid.NULL; return new Outcome(Kind.Failure, "nothing to add"); } - CommitFacility cf = new CommitFacility(Internals.getInstance(repo), parentRevs[0], parentRevs[1]); + final Internals implRepo = Internals.getInstance(repo); + CommitFacility cf = new CommitFacility(implRepo, parentRevs[0], parentRevs[1]); for (Path m : status.getModified()) { HgDataFile df = repo.getFileNode(m); cf.add(df, new WorkingCopyContent(df)); @@ -131,7 +131,7 @@ } cf.branch(detectBranch()); cf.user(detectUser()); - Transaction.Factory 
trFactory = new COWTransaction.Factory(); + Transaction.Factory trFactory = implRepo.getTransactionFactory(); Transaction tr = trFactory.create(repo); try { newRevision = cf.commit(message, tr); diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/core/HgMergeCommand.java --- a/src/org/tmatesoft/hg/core/HgMergeCommand.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/core/HgMergeCommand.java Thu Aug 15 18:43:50 2013 +0200 @@ -18,13 +18,23 @@ import static org.tmatesoft.hg.repo.HgRepository.BAD_REVISION; +import java.io.File; +import java.io.FileInputStream; +import java.io.IOException; import java.io.InputStream; import org.tmatesoft.hg.internal.Callback; import org.tmatesoft.hg.internal.CsetParamKeeper; +import org.tmatesoft.hg.internal.DirstateBuilder; +import org.tmatesoft.hg.internal.DirstateReader; import org.tmatesoft.hg.internal.Experimental; +import org.tmatesoft.hg.internal.FileUtils; +import org.tmatesoft.hg.internal.Internals; import org.tmatesoft.hg.internal.ManifestRevision; +import org.tmatesoft.hg.internal.MergeStateBuilder; import org.tmatesoft.hg.internal.Pool; +import org.tmatesoft.hg.internal.Transaction; +import org.tmatesoft.hg.internal.WorkingDirFileWriter; import org.tmatesoft.hg.repo.HgChangelog; import org.tmatesoft.hg.repo.HgParentChildMap; import org.tmatesoft.hg.repo.HgRepository; @@ -62,7 +72,7 @@ return this; } - public void execute(Mediator mediator) throws HgCallbackTargetException, HgRepositoryLockException, HgLibraryFailureException, CancelledException { + public void execute(Mediator mediator) throws HgCallbackTargetException, HgRepositoryLockException, HgIOException, HgLibraryFailureException, CancelledException { if (firstCset == BAD_REVISION || secondCset == BAD_REVISION || ancestorCset == BAD_REVISION) { throw new IllegalArgumentException("Merge heads and their ancestors are not initialized"); } @@ -71,65 +81,91 @@ try { Pool cacheRevs = new Pool(); Pool cacheFiles = new Pool(); + + Internals implRepo = Internals.getInstance(repo); + final DirstateBuilder dirstateBuilder = new DirstateBuilder(implRepo); + dirstateBuilder.fillFrom(new DirstateReader(implRepo, new Path.SimpleSource(repo.getSessionContext().getPathFactory(), cacheFiles))); + final HgChangelog clog = repo.getChangelog(); + dirstateBuilder.parents(clog.getRevision(firstCset), clog.getRevision(secondCset)); + // + MergeStateBuilder mergeStateBuilder = new MergeStateBuilder(implRepo); + ManifestRevision m1, m2, ma; m1 = new ManifestRevision(cacheRevs, cacheFiles).init(repo, firstCset); m2 = new ManifestRevision(cacheRevs, cacheFiles).init(repo, secondCset); ma = new ManifestRevision(cacheRevs, cacheFiles).init(repo, ancestorCset); - ResolverImpl resolver = new ResolverImpl(); - for (Path f : m1.files()) { - Nodeid fileRevBase, fileRevA, fileRevB; - if (m2.contains(f)) { - fileRevA = m1.nodeid(f); - fileRevB = m2.nodeid(f); - fileRevBase = ma.contains(f) ? 
ma.nodeid(f) : null; - if (fileRevA.equals(fileRevB)) { - HgFileRevision fr = new HgFileRevision(repo, fileRevA, m1.flags(f), f); - mediator.same(fr, fr, resolver); - } else if (fileRevBase == fileRevA) { - assert fileRevBase != null; - HgFileRevision frBase = new HgFileRevision(repo, fileRevBase, ma.flags(f), f); - HgFileRevision frSecond= new HgFileRevision(repo, fileRevB, m2.flags(f), f); - mediator.fastForwardB(frBase, frSecond, resolver); - } else if (fileRevBase == fileRevB) { - assert fileRevBase != null; - HgFileRevision frBase = new HgFileRevision(repo, fileRevBase, ma.flags(f), f); - HgFileRevision frFirst = new HgFileRevision(repo, fileRevA, m1.flags(f), f); - mediator.fastForwardA(frBase, frFirst, resolver); + Transaction transaction = implRepo.getTransactionFactory().create(repo); + ResolverImpl resolver = new ResolverImpl(implRepo, dirstateBuilder, mergeStateBuilder); + try { + for (Path f : m1.files()) { + Nodeid fileRevBase, fileRevA, fileRevB; + if (m2.contains(f)) { + fileRevA = m1.nodeid(f); + fileRevB = m2.nodeid(f); + fileRevBase = ma.contains(f) ? ma.nodeid(f) : null; + if (fileRevA.equals(fileRevB)) { + HgFileRevision fr = new HgFileRevision(repo, fileRevA, m1.flags(f), f); + resolver.presentState(f, fr, fr); + mediator.same(fr, resolver); + } else if (fileRevBase == fileRevA) { + assert fileRevBase != null; + HgFileRevision frBase = new HgFileRevision(repo, fileRevBase, ma.flags(f), f); + HgFileRevision frSecond= new HgFileRevision(repo, fileRevB, m2.flags(f), f); + resolver.presentState(f, frBase, frSecond); + mediator.fastForwardB(frBase, frSecond, resolver); + } else if (fileRevBase == fileRevB) { + assert fileRevBase != null; + HgFileRevision frBase = new HgFileRevision(repo, fileRevBase, ma.flags(f), f); + HgFileRevision frFirst = new HgFileRevision(repo, fileRevA, m1.flags(f), f); + resolver.presentState(f, frFirst, frBase); + mediator.fastForwardA(frBase, frFirst, resolver); + } else { + HgFileRevision frBase = fileRevBase == null ? null : new HgFileRevision(repo, fileRevBase, ma.flags(f), f); + HgFileRevision frFirst = new HgFileRevision(repo, fileRevA, m1.flags(f), f); + HgFileRevision frSecond= new HgFileRevision(repo, fileRevB, m2.flags(f), f); + resolver.presentState(f, frFirst, frSecond); + mediator.resolve(frBase, frFirst, frSecond, resolver); + } } else { - HgFileRevision frBase = fileRevBase == null ? 
null : new HgFileRevision(repo, fileRevBase, ma.flags(f), f); - HgFileRevision frFirst = new HgFileRevision(repo, fileRevA, m1.flags(f), f); - HgFileRevision frSecond= new HgFileRevision(repo, fileRevB, m2.flags(f), f); - mediator.resolve(frBase, frFirst, frSecond, resolver); + // m2 doesn't contain the file, either new in m1, or deleted in m2 + HgFileRevision frFirst = new HgFileRevision(repo, m1.nodeid(f), m1.flags(f), f); + resolver.presentState(f, frFirst, null); + if (ma.contains(f)) { + // deleted in m2 + HgFileRevision frBase = new HgFileRevision(repo, ma.nodeid(f), ma.flags(f), f); + mediator.onlyA(frBase, frFirst, resolver); + } else { + // new in m1 + mediator.newInA(frFirst, resolver); + } } - } else { - // m2 doesn't contain the file, either new in m1, or deleted in m2 - HgFileRevision frFirst = new HgFileRevision(repo, m1.nodeid(f), m1.flags(f), f); + resolver.apply(); + } // for m1 files + for (Path f : m2.files()) { + if (m1.contains(f)) { + continue; + } + HgFileRevision frSecond= new HgFileRevision(repo, m2.nodeid(f), m2.flags(f), f); + // file in m2 is either new or deleted in m1 + resolver.presentState(f, null, frSecond); if (ma.contains(f)) { - // deleted in m2 + // deleted in m1 HgFileRevision frBase = new HgFileRevision(repo, ma.nodeid(f), ma.flags(f), f); - mediator.onlyA(frBase, frFirst, resolver); + mediator.onlyB(frBase, frSecond, resolver); } else { - // new in m1 - mediator.newInA(frFirst, resolver); + // new in m2 + mediator.newInB(frSecond, resolver); } + resolver.apply(); } - resolver.apply(); - } // for m1 files - for (Path f : m2.files()) { - if (m1.contains(f)) { - continue; - } - HgFileRevision frSecond= new HgFileRevision(repo, m2.nodeid(f), m2.flags(f), f); - // file in m2 is either new or deleted in m1 - if (ma.contains(f)) { - // deleted in m1 - HgFileRevision frBase = new HgFileRevision(repo, ma.nodeid(f), ma.flags(f), f); - mediator.onlyB(frBase, frSecond, resolver); - } else { - // new in m2 - mediator.newInB(frSecond, resolver); - } - resolver.apply(); + resolver.serializeChanged(transaction); + transaction.commit(); + } catch (HgRuntimeException ex) { + transaction.rollback(); + throw ex; + } catch (HgIOException ex) { + transaction.rollback(); + throw ex; } } catch (HgRuntimeException ex) { throw new HgLibraryFailureException(ex); @@ -160,18 +196,43 @@ } /** - * This is the way client code takes part in the merge process + * This is the way client code takes part in the merge process. + * It's advised to subclass {@link MediatorBase} unless special treatment for regular cases is desired */ @Experimental(reason="Provisional API. 
Work in progress") @Callback public interface Mediator { - public void same(HgFileRevision first, HgFileRevision second, Resolver resolver) throws HgCallbackTargetException; + /** + * file revisions are identical in both heads + */ + public void same(HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException; + /** + * file left in first/left/A trunk only, deleted in second/right/B trunk + */ public void onlyA(HgFileRevision base, HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException; + /** + * file left in second/right/B trunk only, deleted in first/left/A trunk + */ public void onlyB(HgFileRevision base, HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException; + /** + * file is missing in ancestor revision and second/right/B trunk, introduced in first/left/A trunk + */ public void newInA(HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException; + /** + * file is missing in ancestor revision and first/left/A trunk, introduced in second/right/B trunk + */ public void newInB(HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException; + /** + * file was changed in first/left/A trunk, unchanged in second/right/B trunk + */ public void fastForwardA(HgFileRevision base, HgFileRevision first, Resolver resolver) throws HgCallbackTargetException; + /** + * file was changed in second/right/B trunk, unchanged in first/left/A trunk + */ public void fastForwardB(HgFileRevision base, HgFileRevision second, Resolver resolver) throws HgCallbackTargetException; + /** + * File changed (or added, if base is null) in both trunks + */ public void resolve(HgFileRevision base, HgFileRevision first, HgFileRevision second, Resolver resolver) throws HgCallbackTargetException; } @@ -182,24 +243,170 @@ @Experimental(reason="Provisional API. Work in progress") public interface Resolver { public void use(HgFileRevision rev); - public void use(InputStream content); + /** + * Replace current revision with stream content. + * Note, callers are not expected to {@link InputStream#close()} this stream. + * It will be {@link InputStream#close() closed} at Hg4J's discretion + * not necessarily during invocation of this method. IOW, the library may decide to + * use this stream not right away, at some point of time later, and streams supplied + * shall respect this. + * + * @param content New content to replace current revision, shall not be null + * @throws IOException propagated exceptions from content + */ + public void use(InputStream content) throws IOException; + public void forget(HgFileRevision rev); public void unresolved(); // record the file for later processing by 'hg resolve' } + /** + * Base mediator implementation, with regular resolution + */ + @Experimental(reason="Provisional API. 
Work in progress") + public abstract class MediatorBase implements Mediator { + public void same(HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException { + resolver.use(rev); + } + public void onlyA(HgFileRevision base, HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException { + resolver.use(rev); + } + public void onlyB(HgFileRevision base, HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException { + resolver.use(rev); + } + public void newInA(HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException { + resolver.use(rev); + } + public void newInB(HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException { + resolver.use(rev); + } + public void fastForwardA(HgFileRevision base, HgFileRevision first, Resolver resolver) throws HgCallbackTargetException { + resolver.use(first); + } + public void fastForwardB(HgFileRevision base, HgFileRevision second, Resolver resolver) throws HgCallbackTargetException { + resolver.use(second); + } + } + private static class ResolverImpl implements Resolver { - void apply() { + + private final Internals repo; + private final DirstateBuilder dirstateBuilder; + private final MergeStateBuilder mergeStateBuilder; + private boolean changedDirstate; + private HgFileRevision revA; + private HgFileRevision revB; + private Path file; + // resolutions: + private HgFileRevision resolveUse, resolveForget; + private File resolveContent; + private boolean resolveMarkUnresolved; + + public ResolverImpl(Internals implRepo, DirstateBuilder dirstateBuilder, MergeStateBuilder mergeStateBuilder) { + repo = implRepo; + this.dirstateBuilder = dirstateBuilder; + this.mergeStateBuilder = mergeStateBuilder; + changedDirstate = false; + } + + void serializeChanged(Transaction tr) throws HgIOException { + if (changedDirstate) { + dirstateBuilder.serialize(tr); + } + mergeStateBuilder.serialize(tr); + } + + void presentState(Path p, HgFileRevision revA, HgFileRevision revB) { + assert revA != null || revB != null; + file = p; + this.revA = revA; + this.revB = revB; + resolveUse = resolveForget = null; + resolveContent = null; + resolveMarkUnresolved = false; + } + + void apply() throws HgIOException, HgRuntimeException { + if (resolveMarkUnresolved) { + mergeStateBuilder.unresolved(file); + } else if (resolveForget != null) { + if (resolveForget == revA) { + changedDirstate = true; + dirstateBuilder.recordRemoved(file); + } + } else if (resolveUse != null) { + if (resolveUse != revA) { + changedDirstate = true; + final WorkingDirFileWriter fw = new WorkingDirFileWriter(repo); + fw.processFile(resolveUse); + if (resolveUse == revB) { + dirstateBuilder.recordMergedFromP2(file); + } else { + dirstateBuilder.recordMerged(file, fw.fmode(), fw.mtime(), fw.bytesWritten()); + } + } // if resolution is to use revA, nothing to do + } else if (resolveContent != null) { + changedDirstate = true; + // FIXME write content to file using transaction? + InputStream is; + try { + is = new FileInputStream(resolveContent); + } catch (IOException ex) { + throw new HgIOException("Failed to read temporary content", ex, resolveContent); + } + final WorkingDirFileWriter fw = new WorkingDirFileWriter(repo); + fw.processFile(file, is, revA == null ? revB.getFileFlags() : revA.getFileFlags()); + // XXX if presentState(null, fileOnlyInB), and use(InputStream) - i.e. + // resolution is to add file with supplied content - shall I put 'Merged', MergedFromP2 or 'Added' into dirstate? 
+ if (revA == null && revB != null) { + dirstateBuilder.recordMergedFromP2(file); + } else { + dirstateBuilder.recordMerged(file, fw.fmode(), fw.mtime(), fw.bytesWritten()); + } + } else { + assert false; + } } public void use(HgFileRevision rev) { - // TODO Auto-generated method stub + if (rev == null) { + throw new IllegalArgumentException(); + } + assert resolveContent == null; + assert resolveForget == null; + resolveUse = rev; } - public void use(InputStream content) { - // TODO Auto-generated method stub + public void use(InputStream content) throws IOException { + if (content == null) { + throw new IllegalArgumentException(); + } + assert resolveUse == null; + assert resolveForget == null; + try { + // cache new contents just to fail fast if there are troubles with content + final FileUtils fileUtils = new FileUtils(repo.getLog(), this); + resolveContent = fileUtils.createTempFile(); + fileUtils.write(content, resolveContent); + } finally { + content.close(); + } + // the temporary file is deliberately not deleted on failure, to allow analysis of the issue + } + + public void forget(HgFileRevision rev) { + if (rev == null) { + throw new IllegalArgumentException(); + } + if (rev != revA && rev != revB) { + throw new IllegalArgumentException("Can't forget revision which doesn't represent actual state in either merged trunk"); + } + assert resolveUse == null; + assert resolveContent == null; + resolveForget = rev; } public void unresolved() { - // TODO Auto-generated method stub + resolveMarkUnresolved = true; } } } diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/core/HgPullCommand.java --- a/src/org/tmatesoft/hg/core/HgPullCommand.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/core/HgPullCommand.java Thu Aug 15 18:43:50 2013 +0200 @@ -21,7 +21,6 @@ import java.util.List; import org.tmatesoft.hg.internal.AddRevInspector; -import org.tmatesoft.hg.internal.COWTransaction; import org.tmatesoft.hg.internal.Internals; import org.tmatesoft.hg.internal.PhasesHelper; import org.tmatesoft.hg.internal.RepositoryComparator; @@ -78,7 +77,7 @@ // add revisions to changelog, manifest, files final Internals implRepo = HgInternals.getImplementationRepo(repo); final AddRevInspector insp; - Transaction.Factory trFactory = new COWTransaction.Factory(); + Transaction.Factory trFactory = implRepo.getTransactionFactory(); Transaction tr = trFactory.create(repo); try { incoming.inspectAll(insp = new AddRevInspector(implRepo, tr)); diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/core/HgRevertCommand.java --- a/src/org/tmatesoft/hg/core/HgRevertCommand.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/core/HgRevertCommand.java Thu Aug 15 18:43:50 2013 +0200 @@ -21,7 +21,6 @@ import java.util.LinkedHashSet; import java.util.Set; -import org.tmatesoft.hg.internal.COWTransaction; import org.tmatesoft.hg.internal.CsetParamKeeper; import org.tmatesoft.hg.internal.DirstateBuilder; import org.tmatesoft.hg.internal.DirstateReader; @@ -160,7 +159,7 @@ progress.worked(1); cancellation.checkCancelled(); } - Transaction.Factory trFactory = new COWTransaction.Factory(); + Transaction.Factory trFactory = implRepo.getTransactionFactory(); Transaction tr = trFactory.create(repo); try { // TODO same code in HgAddRemoveCommand and similar in HgCommitCommand diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/internal/CommitFacility.java --- a/src/org/tmatesoft/hg/internal/CommitFacility.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/internal/CommitFacility.java Thu Aug 15 
18:43:50 2013 +0200 @@ -212,7 +212,7 @@ } // bring dirstate up to commit state, TODO share this code with HgAddRemoveCommand final DirstateBuilder dirstateBuilder = new DirstateBuilder(repo); - dirstateBuilder.fillFrom(new DirstateReader(repo, new Path.SimpleSource())); + dirstateBuilder.fillFrom(new DirstateReader(repo, repo.getSessionContext().getPathFactory())); for (Path p : removals) { dirstateBuilder.recordRemoved(p); } diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/internal/DirstateBuilder.java --- a/src/org/tmatesoft/hg/internal/DirstateBuilder.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/internal/DirstateBuilder.java Thu Aug 15 18:43:50 2013 +0200 @@ -92,6 +92,22 @@ removed.put(fname, n); } + public void recordMerged(Path fname, int fmode, int mtime, int bytesWritten) { + forget(fname); + merged.put(fname, new HgDirstate.Record(fmode, bytesWritten,mtime, fname, null)); + } + + /** + * From DirState wiki: + *
"size is ... when the dirstate is in a merge state: -2 will *always* return dirty, it is used to mark a file that was cleanly picked from p2" + * and + *
"Additional meta status...'np2': merged from other parent (status == 'n', size == -2)" + */ + public void recordMergedFromP2(Path fname) { + forget(fname); + normal.put(fname, new HgDirstate.Record(0, -2, -1, fname, null)); + } + private HgDirstate.Record forget(Path fname) { HgDirstate.Record r; if ((r = normal.remove(fname)) != null) { diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/internal/FileUtils.java --- a/src/org/tmatesoft/hg/internal/FileUtils.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/internal/FileUtils.java Thu Aug 15 18:43:50 2013 +0200 @@ -105,7 +105,7 @@ fos.flush(); fos.close(); } - + public void closeQuietly(Closeable stream) { closeQuietly(stream, null); } @@ -126,4 +126,9 @@ } } } + + // nothing special, just a single place with common prefix + public File createTempFile() throws IOException { + return File.createTempFile("hg4j-", null); + } } diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/internal/Internals.java --- a/src/org/tmatesoft/hg/internal/Internals.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/internal/Internals.java Thu Aug 15 18:43:50 2013 +0200 @@ -513,6 +513,10 @@ public RevlogStream resolveStoreFile(Path path) { return streamProvider.getStoreFile(path, false); } + + public Transaction.Factory getTransactionFactory() { + return new COWTransaction.Factory(); + } // marker method public static IllegalStateException notImplemented() { diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/internal/MergeStateBuilder.java --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/src/org/tmatesoft/hg/internal/MergeStateBuilder.java Thu Aug 15 18:43:50 2013 +0200 @@ -0,0 +1,48 @@ +/* + * Copyright (c) 2013 TMate Software Ltd + * + * This program is free software; you can redistribute it and/or modify + * it under the terms of the GNU General Public License as published by + * the Free Software Foundation; version 2 of the License. + * + * This program is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + * GNU General Public License for more details. + * + * For information on how to redistribute this software under + * the terms of a license other than GNU General Public License + * contact TMate Software at support@hg4j.com + */ +package org.tmatesoft.hg.internal; + +import org.tmatesoft.hg.core.HgIOException; +import org.tmatesoft.hg.repo.HgMergeState; +import org.tmatesoft.hg.util.Path; + +/** + * Constructs merge/state file + * + * @see HgMergeState + * @author Artem Tikhomirov + * @author TMate Software Ltd. 
+ */ +public class MergeStateBuilder { + + private final Internals repo; + + public MergeStateBuilder(Internals implRepo) { + repo = implRepo; + } + + public void resolved() { + throw Internals.notImplemented(); + } + + public void unresolved(Path file) { + throw Internals.notImplemented(); + } + + public void serialize(Transaction tr) throws HgIOException { + } +} diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/internal/WorkingDirFileWriter.java --- a/src/org/tmatesoft/hg/internal/WorkingDirFileWriter.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/internal/WorkingDirFileWriter.java Thu Aug 15 18:43:50 2013 +0200 @@ -21,9 +21,12 @@ import java.io.File; import java.io.FileOutputStream; import java.io.IOException; +import java.io.InputStream; import java.nio.ByteBuffer; import java.nio.channels.FileChannel; +import org.tmatesoft.hg.core.HgFileRevision; +import org.tmatesoft.hg.core.HgIOException; import org.tmatesoft.hg.repo.HgDataFile; import org.tmatesoft.hg.repo.HgManifest; import org.tmatesoft.hg.repo.HgRuntimeException; @@ -65,45 +68,94 @@ * Executable bit is set if specified and filesystem supports it. * @throws HgRuntimeException */ - public void processFile(HgDataFile df, int fileRevIndex, HgManifest.Flags flags) throws IOException, HgRuntimeException { - try { - prepare(df.getPath()); - if (flags != HgManifest.Flags.Link) { - destChannel = new FileOutputStream(dest).getChannel(); - } else { - linkChannel = new ByteArrayChannel(); - } - df.contentWithFilters(fileRevIndex, this); - } catch (CancelledException ex) { - hgRepo.getSessionContext().getLog().dump(getClass(), Severity.Error, ex, "Our impl doesn't throw cancellation"); - } finally { - if (flags != HgManifest.Flags.Link) { - destChannel.close(); - destChannel = null; - // leave dest in case anyone enquires with #getDestinationFile + public void processFile(final HgDataFile df, final int fileRevIndex, HgManifest.Flags flags) throws HgIOException, HgRuntimeException { + processFile(df.getPath(), new Fetch() { + public void readInto(ByteChannel ch) { + try { + df.contentWithFilters(fileRevIndex, ch); + } catch (CancelledException ex) { + handleUnexpectedCancel(ex); + } } - } - if (linkChannel != null && symlinkCap) { - assert flags == HgManifest.Flags.Link; - fileFlagsHelper.createSymlink(dest.getParentFile(), dest.getName(), linkChannel.toArray()); - } else if (flags == HgManifest.Flags.Exec && execCap) { - fileFlagsHelper.setExecutableBit(dest.getParentFile(), dest.getName()); - } - // Although HgWCStatusCollector treats 644 (`hg manifest -v`) and 664 (my fs) the same, it's better - // to detect actual flags here - fmode = flags.fsMode(); // default to one from manifest - if (fileFlagsHelper != null) { - // if neither execBit nor link is supported by fs, it's unlikely file mode is supported, too. 
+ }, flags); + } + + public void processFile(final HgFileRevision fr) throws HgIOException, HgRuntimeException { + processFile(fr.getPath(), new Fetch() { + + public void readInto(ByteChannel ch) throws IOException, HgRuntimeException { + try { + fr.putContentTo(ch); + } catch (CancelledException ex) { + handleUnexpectedCancel(ex); + } + } + }, fr.getFileFlags()); + } + + /** + * Closes supplied content stream + */ + public void processFile(Path fname, final InputStream content, HgManifest.Flags flags) throws HgIOException, HgRuntimeException { + processFile(fname, new Fetch() { + + public void readInto(ByteChannel ch) throws IOException, HgRuntimeException { + try { + ByteBuffer bb = ByteBuffer.wrap(new byte[8*1024]); + int r; + while ((r = content.read(bb.array())) != -1) { + bb.position(0).limit(r); + for (int wrote = 0; wrote < r; ) { + r -= wrote; + wrote = ch.write(bb); + assert bb.remaining() == r - wrote; + } + } + } catch (CancelledException ex) { + handleUnexpectedCancel(ex); + } + } + }, flags); + } + + private interface Fetch { + void readInto(ByteChannel ch) throws IOException, HgRuntimeException; + } + + private void processFile(Path fname, Fetch fetch, HgManifest.Flags flags) throws HgIOException, HgRuntimeException { + try { + byte[] symlinkContent = null; try { - fmode = fileFlagsHelper.getFileMode(dest, fmode); - } catch (IOException ex) { - // Warn, we've got default value and can live with it - hgRepo.getSessionContext().getLog().dump(getClass(), Warn, ex, "Failed get file access rights"); + prepare(fname, flags); + fetch.readInto(this); + } finally { + symlinkContent = close(fname, flags); } + if (flags == HgManifest.Flags.Link && symlinkCap) { + assert symlinkContent != null; + fileFlagsHelper.createSymlink(dest.getParentFile(), dest.getName(), symlinkContent); + } else if (flags == HgManifest.Flags.Exec && execCap) { + fileFlagsHelper.setExecutableBit(dest.getParentFile(), dest.getName()); + } + // Although HgWCStatusCollector treats 644 (`hg manifest -v`) and 664 (my fs) the same, it's better + // to detect actual flags here + fmode = flags.fsMode(); // default to one from manifest + if (fileFlagsHelper != null) { + // if neither execBit nor link is supported by fs, it's unlikely file mode is supported, too. 
+ try { + fmode = fileFlagsHelper.getFileMode(dest, fmode); + } catch (IOException ex) { + // Warn, we've got default value and can live with it + hgRepo.getSessionContext().getLog().dump(getClass(), Warn, ex, "Failed to get file access rights"); + } + } + } catch (IOException ex) { + String msg = String.format("Failed to write file %s to the working directory", fname); + throw new HgIOException(msg, ex, dest); } } - public void prepare(Path fname) throws IOException { + private void prepare(Path fname, HgManifest.Flags flags) throws IOException { String fpath = fname.toString(); dest = new File(hgRepo.getRepo().getWorkingDir(), fpath); if (fpath.indexOf('/') != -1) { @@ -113,6 +165,25 @@ linkChannel = null; totalBytesWritten = 0; fmode = 0; + if (flags != HgManifest.Flags.Link) { + destChannel = new FileOutputStream(dest).getChannel(); + } else { + linkChannel = new ByteArrayChannel(); + } + } + + private byte[] close(Path fname, HgManifest.Flags flags) throws IOException { + if (flags != HgManifest.Flags.Link) { + destChannel.close(); + destChannel = null; + // leave dest in case anyone enquires with #getDestinationFile + } + if (linkChannel != null) { + final byte[] rv = linkChannel.toArray(); + linkChannel = null; + return rv; + } + return null; } public int write(ByteBuffer buffer) throws IOException, CancelledException { @@ -144,4 +215,8 @@ public int mtime() { return (int) (dest.lastModified() / 1000); } + + private void handleUnexpectedCancel(CancelledException ex) { + hgRepo.getSessionContext().getLog().dump(WorkingDirFileWriter.class, Severity.Error, ex, "Our impl doesn't throw cancellation"); + } } diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/repo/HgMergeState.java --- a/src/org/tmatesoft/hg/repo/HgMergeState.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/repo/HgMergeState.java Thu Aug 15 18:43:50 2013 +0200 @@ -218,7 +218,7 @@ /** * List of conflicts as recorded in the merge state information file. - * Note, this information is valid unless {@link #isStale()} is true. + * Note, this information is not valid if {@link #isStale()} is true. * * @return non-null list with both resolved and unresolved conflicts. */ diff -r 7743a9c10bfa -r b4242b7e7dfe src/org/tmatesoft/hg/util/Path.java --- a/src/org/tmatesoft/hg/util/Path.java Wed Aug 14 20:07:26 2013 +0200 +++ b/src/org/tmatesoft/hg/util/Path.java Thu Aug 15 18:43:50 2013 +0200 @@ -214,6 +214,7 @@ public static class SimpleSource implements Source { private final PathRewrite normalizer; private final Convertor convertor; + private final Path.Source delegate; public SimpleSource() { this(new PathRewrite.Empty(), null); @@ -224,12 +225,30 @@ } public SimpleSource(PathRewrite pathRewrite, Convertor pathConvertor) { + assert pathRewrite != null; normalizer = pathRewrite; convertor = pathConvertor; + delegate = null; + } + + public SimpleSource(Path.Source actual, Convertor pathConvertor) { + assert actual != null; + normalizer = null; + delegate = actual; + convertor = pathConvertor; } public Path path(CharSequence p) { - Path rv = Path.create(normalizer.rewrite(p)); + // in fact, it would be nicer to have a sequence of sources, with a bunch of small + // Source implementations, each responsible for a specific aspect, like Convertor + // or delegation to another Source. However, these classes are just too small + // to justify their existence + Path rv; + if (delegate != null) { + rv = delegate.path(p); + } else { + rv = Path.create(normalizer.rewrite(p)); + } if (convertor != null) { return convertor.mangle(rv); }
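
A usage sketch for the Mediator/Resolver API above (illustrative only, not part of the changeset): the mediator below keeps every unambiguous revision, mirroring what MediatorBase does, and records genuine two-sided changes for later processing by 'hg resolve' instead of guessing. Only the callback and Resolver signatures introduced in HgMergeCommand.java are taken as given; the class name, comments and the way the command instance is obtained and configured are assumptions.

import org.tmatesoft.hg.core.HgCallbackTargetException;
import org.tmatesoft.hg.core.HgFileRevision;
import org.tmatesoft.hg.core.HgMergeCommand.Mediator;
import org.tmatesoft.hg.core.HgMergeCommand.Resolver;

// Illustrative client-side mediator: keep unambiguous revisions, defer real conflicts.
public class KeepAndDeferMediator implements Mediator {

	public void same(HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException {
		resolver.use(rev); // identical in both heads
	}

	public void onlyA(HgFileRevision base, HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException {
		resolver.use(rev); // kept in the first trunk, deleted in the second
	}

	public void onlyB(HgFileRevision base, HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException {
		resolver.use(rev); // kept in the second trunk, deleted in the first
	}

	public void newInA(HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException {
		resolver.use(rev); // introduced in the first trunk
	}

	public void newInB(HgFileRevision rev, Resolver resolver) throws HgCallbackTargetException {
		resolver.use(rev); // introduced in the second trunk
	}

	public void fastForwardA(HgFileRevision base, HgFileRevision first, Resolver resolver) throws HgCallbackTargetException {
		resolver.use(first); // changed in the first trunk only
	}

	public void fastForwardB(HgFileRevision base, HgFileRevision second, Resolver resolver) throws HgCallbackTargetException {
		resolver.use(second); // changed in the second trunk only
	}

	public void resolve(HgFileRevision base, HgFileRevision first, HgFileRevision second, Resolver resolver) throws HgCallbackTargetException {
		// changed in both trunks: record the conflict rather than guess;
		// a real merge tool could instead supply merged bytes via resolver.use(InputStream)
		resolver.unresolved();
	}
}

With such a mediator, client code configures HgMergeCommand with the head to merge and calls execute(new KeepAndDeferMediator()), handling the HgCallbackTargetException, HgRepositoryLockException, HgIOException, HgLibraryFailureException and CancelledException declared on execute() above.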