Commit
Merge pull request #4 from Chaphasilor/dev
Quality-of-life improvements
Chaphasilor authored Feb 23, 2022
2 parents c6a9143 + 33f8813 commit 206726a
Showing 6 changed files with 87 additions and 54 deletions.
15 changes: 8 additions & 7 deletions README.md
@@ -14,7 +14,7 @@ This tool requires the two input videos, the one where you want to add the addit
It then tries to find the exact same frame in both videos, in order to synchronize them (in case one of them is longer or shorter than the other).
It allows you to pick the audio and subtitle tracks you want to add to the destination and specify the output file.

There's an interactive mode (simply don't pass any arguments, flags work) and a CLI mode (pass the two arguments listed at the top).
There's an interactive mode (simply don't pass any arguments, flags are fine) and a CLI mode (pass the two arguments listed at the top).

## Examples

@@ -44,20 +44,21 @@ $ video-sync -h # help page

- `-s, --subsTracks=<list>` subtitle tracks to sync over to the destination video. comma-separated list of mkvmerge IDs or ISO 639-2 language tags (track matching that language will be synced). if omitted, all subtitle tracks will be synced

- `-g, --algorithm=<simple|matching-scene>` [default: matching-scene] search algorithm to use for video syncing

- `-e, --offsetEstimate=<number>` estimated offset between the two videos (in ms) for video syncing. positive values means that the source video is ahead of the destination video

- `-f, --forceOffset` use the estimated offset as the final offset, no synching

- `-x, --exclusiveDirection=<ahead|behind>` only search the matching frame offset in one direction. 'ahead' means that the source video scene comes *before* the destination video scene. (requires algorithm=matching-scene)

- `-g, --algorithm=<simple|matching-scene>` [default: matching-scene] search algorithm to use for video syncing

- `-m, --maxOffset=<number>` [default: 120] maximum considered offset between the videos (in seconds) for video syncing.

- `-r, --searchResolution=<number>` [default: 80] resolution of the search region (in frames) for video syncing. increases accuracy at the cost of longer runtime (requires algorithm=simple)
- `--searchIncrements=<number>` [default: 3] maximum area (video duration, in seconds) to search for the next scene in any direction (forward/backward) before searching in the other direction (requires algorithm=matching-scene)
- `--sceneSearchDuration=<number>` [default: 300] maximum area (video duration, in seconds) to search for any "abrupt" scene change in the destination video before aborting (requires algorithm=matching-scene)
- `-x, --exclusiveDirection=<ahead|behind>` only search the matching frame offset in one direction. 'ahead' means that the source video scene comes *before* the destination video scene. (requires algorithm=matching-scene)

- `-i, --iterations=<number>` [default: 2] number of iterations to perform for video syncing (requires algorithm=simple)
- `-t, --threshold=<number>` [default: 0.6] minimum confidence threshold for video syncing. (requires algorithm=simple)
- `-w, --searchWidth=<number>` [default: 20] width of the search region (in seconds) for video syncing. the program will find the closest matching frame in this region, 'sourceOffset' being the center (requires algorithm=simple)
- `-r, --searchResolution=<number>` [default: 80] resolution of the search region (in frames) for video syncing. increases accuracy at the cost of longer runtime (requires algorithm=simple)

- `-y, --confirm` automatically confirm missing tracks, low confidence scores, warped videos and overwrite prompts
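
Taken together, a hypothetical CLI-mode run that combines several of these flags might look like the following sketch (the file names, language tags, and offset values are made up for illustration, and it assumes the destination and source paths are the two positional arguments, in that order):

```sh
# hypothetical example: take the Japanese audio and English subtitles from source.mkv,
# sync them against destination.mkv with the matching-scene algorithm, assume the source
# is roughly 2 s ahead, and skip the interactive confirmation prompts
$ video-sync destination.mkv source.mkv \
    --output=output.mkv \
    --audioTracks=jpn \
    --subsTracks=eng \
    --algorithm=matching-scene \
    --offsetEstimate=2000 \
    --maxOffset=60 \
    --confirm
```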

61 changes: 37 additions & 24 deletions src/index.js
@@ -78,8 +78,17 @@ class VideoSyncCommand extends Command {
}

}

console.debug(`answers.source:`, answers.source)
console.debug(`answers.destination:`, answers.destination)

let availableTracks = tracks.getTrackInfo(answers.source)

//TODO compare video duration and print warning/prompt if they are too different
// let video1Data = await ffprobe(video1)
// let video2Data = await ffprobe(video2)
// let video1Duration = Number(video1Data.format.duration) * 1000 // offset in ms
// let video2Duration = Number(video2Data.format.duration) * 1000 // offset in ms
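// one way the TODO above could be finished (hypothetical sketch, the 5-minute threshold is an assumption):
// let durationDifference = Math.abs(video1Duration - video2Duration) // in ms
// if (durationDifference > 5 * 60 * 1000) { // warn when the durations differ by more than 5 minutes
//   console.warn(`The input videos differ in duration by ~${Math.round(durationDifference / 1000)} s - they might not contain the same content`)
// }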

let selectedTracks

@@ -220,7 +229,6 @@ class VideoSyncCommand extends Command {
result = await calcOffset(answers.destination, answers.source, {
comparisonAlgorithm: ALGORITHMS.SSIM,
iterations: flags.iterations,
searchWidth: flags.searchWidth,
searchResolution: flags.searchResolution,
maxOffset: flags.maxOffset,
offsetEstimate: flags.offsetEstimate,
@@ -230,6 +238,8 @@
result = await calculateOffset(answers.destination, answers.source, {
maxOffset: flags.maxOffset * 1000,
offsetEstimate: flags.offsetEstimate,
searchLengthInSeconds: flags.sceneSearchDuration,
searchIncrementSizeInSeconds: flags.searchIncrements,
})
}

@@ -287,10 +297,10 @@ class VideoSyncCommand extends Command {
}

VideoSyncCommand.description = `video-sync - a tool for automating the process of muxing additional audio tracks into videos
This tool requires the two input videos, the one where you want to add the additional tracks *to* (the destination video) and the one where you take the additional tracks *from* (the source video).
This tool requires two input videos, one where you want to add the additional tracks *to* (the destination video) and one where you take the additional tracks *from* (the source video).
It then tries to find the exact same frame in both videos, in order to synchronize them (in case one of them is longer or shorter than the other).
It allows you to pick the audio and subtitle tracks you want to add to the destination and specify the output file.
There's an interactive mode (simply don't pass any arguments, flags work) and a CLI mode (pass the two arguments listed at the top).
There's an interactive mode (simply don't pass any arguments, flags are fine) and a CLI mode (pass the two arguments listed at the top).
`

VideoSyncCommand.args = [
@@ -314,12 +324,6 @@ VideoSyncCommand.flags = {
description: `output file path`,
required: false, // if omitted, only the offset is printed
}),
confirm: flags.boolean({
char: `y`,
description: `automatically confirm missing tracks, low confidence scores, warped videos and overwrite prompts`,
required: false, // if omitted, only the offset is printed
default: false,
}),
audioTracks: flags.string({
char: `a`,
multiple: true, // important to allow spaces in-between
@@ -340,21 +344,6 @@ VideoSyncCommand.flags = {
options: [`simple`, `matching-scene`],
default: `matching-scene`,
}),
iterations: flags.integer({
char: `i`,
description: `number of iterations to perform for video syncing (requires algorithm=simple)`,
default: 2,
}),
searchWidth: flags.integer({
char: `w`,
description: `width of the search region (in seconds) for video syncing. the program will find the closest matching frame in this region, 'sourceOffset' being the center (requires algorithm=simple)`,
default: 20,
}),
maxOffset: flags.integer({
char: `m`,
description: `maximum considered offset between the videos (in seconds) for video syncing.`,
default: 120,
}),
offsetEstimate: flags.integer({
char: `e`,
description: `estimated offset between the two videos (in ms) for video syncing. positive values means that the source video is ahead of the destination video`,
@@ -365,12 +354,30 @@ VideoSyncCommand.flags = {
description: `use the estimated offset as the final offset, no synching`,
default: false,
}),
maxOffset: flags.integer({
char: `m`,
description: `maximum considered offset between the videos (in seconds) for video syncing.`,
default: 120,
}),
searchIncrements: flags.integer({
description: `maximum area (video duration, in seconds) to search for the next scene in any direction (forward/backward) before searching in the other direction (requires algorithm=matching-scene)`,
default: 3,
}),
sceneSearchDuration: flags.integer({
description: `maximum area (video duration, in seconds) to search for any "abrupt" scene change in the destination video before aborting (requires algorithm=matching-scene)`,
default: 300,
}),
exclusiveDirection: flags.string({
char: `x`,
description: `only search the matching frame offset in one direction. 'ahead' means that the source video scene comes *before* the destination video scene. (requires algorithm=matching-scene)`,
parse: (input) => input ? (input === `ahead` ? -1 : 1) : false,
default: undefined,
}),
iterations: flags.integer({
char: `i`,
description: `number of iterations to perform for video syncing (requires algorithm=simple)`,
default: 2,
}),
threshold: flags.string({
char: `t`,
description: `minimum confidence threshold for video syncing. (requires algorithm=simple)`,
@@ -382,6 +389,12 @@
description: `resolution of the search region (in frames) for video syncing. increases accuracy at the cost of longer runtime (requires algorithm=simple)`,
default: 80,
}),
confirm: flags.boolean({
char: `y`,
description: `automatically confirm missing tracks, low confidence scores, warped videos and overwrite prompts`,
required: false, // if omitted, only the offset is printed
default: false,
}),
verbose: flags.boolean({
char: `v`,
description: `output additional logs`,
33 changes: 21 additions & 12 deletions util/calc-offset.js
@@ -73,7 +73,7 @@ module.exports.calcOffset = async function(video1Path, video2Path, options = {
// const rollingFrameOffset = parseInt(video1IsLarger ? offset2 : offset1)

// generate the static frame using the video length and a padding of 1%
console.log(`videoInfo.lengths:`, videoInfo.lengths)
console.debug(`videoInfo:`, videoInfo)
let staticLength = videoInfo.lengths[video1IsLarger ? 0 : 1]
let staticFrameOffset = Math.round(staticLength/2)
console.log(`staticFrameOffset:`, staticFrameOffset)
@@ -108,7 +108,7 @@ module.exports.calcOffset = async function(video1Path, video2Path, options = {

console.log(`iteration:`, iteration)

let searchWidth = options.searchWidth / iteration
let searchWidth = options.maxOffset*2 / iteration

console.log(`searchWidth:`, searchWidth)
console.log(`searchCenter:`, searchCenter)
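// illustration: searchWidth = options.maxOffset*2 / iteration means the window spans the full
// +/- maxOffset range on the first pass and narrows on each subsequent pass; with the default
// maxOffset of 120 s that is 240 s on iteration 1, 120 s on iteration 2, 80 s on iteration 3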
@@ -174,10 +174,10 @@ module.exports.calcOffset = async function(video1Path, video2Path, options = {
console.log(`searchCenter:`, searchCenter)

let totalOffset = (staticFrameOffset - searchCenter).toFixed(0)
console.log(`Video 2 is approx. ${Math.abs(totalOffset)} ms ${video1IsLarger && totalOffset > 0 ? `ahead` : `behind`} video 1 (${closestMatch.value})`)
console.log(`Video 2 is approx. ${Math.abs(totalOffset)} ms ${video1IsLarger && totalOffset > 0 ? `ahead of` : `behind`} video 1 (${closestMatch.value})`)

// cli.action.stop(`Done! Source video is approx. ${Math.abs(totalOffset)} ms ${video1IsLarger && totalOffset > 0 ? `ahead` : `behind`} destination video (confidence ${closestMatch.value.toFixed(5)}).`)
spinner.succeed(`Source video is approx. ${Math.abs(totalOffset)} ms ${video1IsLarger && totalOffset > 0 ? `ahead` : `behind`} destination video (confidence ${closestMatch.value.toFixed(5)}).`)
spinner.succeed(`Source video is approx. ${Math.abs(totalOffset)} ms ${video1IsLarger && totalOffset > 0 ? `ahead of` : `behind`} destination video (confidence ${closestMatch.value.toFixed(5)}).`)

return {
videoOffset: totalOffset,
@@ -192,11 +192,20 @@ async function getVideoInfo(vid1, vid2) {
let vid2Data = await probe(vid2)
console.log(`vid1Data:`, vid1Data)
console.log(`vid2Data:`, vid2Data)
const vid1VideoStreamIndex = vid1Data.streams.findIndex(x => x.codec_type === `video`)
const vid2VideoStreamIndex = vid2Data.streams.findIndex(x => x.codec_type === `video`)

console.debug(`Video 1: width: ${vid1Data.streams[0].width}, height: ${vid1Data.streams[0].height}`)
console.debug(`Video 2: width: ${vid2Data.streams[0].width}, height: ${vid2Data.streams[0].height}`)
if (vid1VideoStreamIndex < 0) {
throw new Error(`No video stream found in '${vid1}'`)
}
if (vid2VideoStreamIndex < 0) {
throw new Error(`No video stream found in '${vid2}'`)
}

console.debug(`Video 1: width: ${vid1Data.streams[vid1VideoStreamIndex].width}, height: ${vid1Data.streams[vid1VideoStreamIndex].height}`)
console.debug(`Video 2: width: ${vid2Data.streams[vid2VideoStreamIndex].width}, height: ${vid2Data.streams[vid2VideoStreamIndex].height}`)

if (vid1Data.streams[0].width > vid2Data.streams[0].width && vid1Data.streams[0].height < vid2Data.streams[0].height) {
if (vid1Data.streams[0].width > vid2Data.streams[vid2VideoStreamIndex].width && vid1Data.streams[0].height < vid2Data.streams[vid2VideoStreamIndex].height) {
console.warn(`Videos have different aspect ratios. You might get worse results.`)
}

@@ -207,15 +216,15 @@
],
dimensions: [
{
width: vid1Data.streams[0].width,
height: vid1Data.streams[0].height,
width: vid1Data.streams[vid1VideoStreamIndex].width,
height: vid1Data.streams[vid1VideoStreamIndex].height,
},
{
width: vid2Data.streams[0].width,
height: vid2Data.streams[0].height,
width: vid2Data.streams[vid2VideoStreamIndex].width,
height: vid2Data.streams[vid2VideoStreamIndex].height,
},
],
}

}
module.exports.getVideoInfo = getVideoInfo
module.exports.getVideoInfo = getVideoInfo
22 changes: 14 additions & 8 deletions util/find-offset-new.js
@@ -177,8 +177,7 @@ async function calculateOffset(video1, video2, options) {
//TODO add support for options.offsetEstimate
//TODO add flag to specify search direction (e.g. if known whether the source is ahead or behind the destination)

const video1SearchLength = 300 * 1000
const searchIncrementSize = 10000 // maximum search area to find the next scene before switching sides

const startTime = Date.now();
const spinner = ora(`Syncing the videos...`).start();

@@ -188,13 +187,19 @@
} catch (err) {
await fs.mkdir(`tmp`)
}

// search starts upwards
let direction = 1

let direction = 1 // search starts upwards
if (options.searchDirection) {
direction = options.searchDirection
}
let video1SearchLength = 300 * 1000
if (options.searchLengthInSeconds) {
video1SearchLength = options.searchLengthInSeconds * 1000
}
let searchIncrementSize = 2500 // maximum search area to find the next scene before switching sides
if (options.searchIncrementSizeInSeconds) {
searchIncrementSize = options.searchIncrementSizeInSeconds * 1000
}
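// rough illustration of the search strategy (an assumption based on the names and comments above):
// starting from the estimate, the probe alternates sides in searchIncrementSize-sized steps, e.g.
// with the 2500 ms default it checks ~0..2.5 s in one direction, then ~0..2.5 s in the other,
// then ~2.5..5 s, and so on, until a matching scene change is found or video1SearchLength
// and the video bounds are exhausted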

let video1Data = await ffprobe(video1)
let video2Data = await ffprobe(video2)
@@ -223,6 +228,7 @@

// make sure to stay within offset bounds
// continue while at least one side still within the bounds
//FIXME makes sure that a direction is blocked as soon as the max search offset is surpassed
while (
currentSearchOffsets.upper < video2Duration &&
currentSearchOffsets.lower > 0 &&
@@ -288,7 +294,7 @@
videoOffset: video1SceneChange.preSceneChangeFrame.offset - sceneComparison.video2SceneChange.preSceneChangeFrame.offset,
confidence: 1,
}
spinner.succeed(`Source video is approx. ${Math.abs(result.videoOffset)} ms ${result.videoOffset > 0 ? `ahead` : `behind`} destination video. Took ${ms(Date.now() - startTime)}`)
spinner.succeed(`Source video is approx. ${Math.abs(result.videoOffset)} ms ${result.videoOffset > 0 ? `ahead of` : `behind`} the destination video. Took ${ms(Date.now() - startTime)}`)
return result

} else {
@@ -324,7 +330,7 @@
force: true,
})

throw new Error(`Couldn't sync videos! (tried for ${ms(Date.now() - startTime)}`)
throw new Error(`Couldn't sync videos! (tried for ${ms(Date.now() - startTime)})`)

}
module.exports.calculateOffset = calculateOffset
@@ -353,4 +359,4 @@ module.exports.calculateOffset = calculateOffset

//[ ] when automating, use the previously found offset as an estimate for following videos (if videos from the same source)

//[ ] what happens when there are multiple similar scene changes?
//[ ] what happens when there are multiple similar scene changes?
6 changes: 3 additions & 3 deletions util/tracks.js
@@ -236,7 +236,7 @@ function getTrackInfo(video) {
channels: track.properties[`audio_channels`],
ids: {
mkvmerge: track.id,
}
},
}
}),
subs: subsTracks.map(track => {
@@ -246,7 +246,7 @@
codec: track.codec,
ids: {
mkvmerge: track.id,
}
},
}
})
}
@@ -270,4 +270,4 @@ function getTrackType(trackInfo) {

// matchTracksAndStreams(`/mnt/c/Users/Chaphasilor/Videos/BadBatchCopy.mkv`)
// .then(tracks => console.info(`tracks:`, tracks))
// .catch(err => console.error(`ERROR:`, err))
// .catch(err => console.error(`ERROR:`, err))
4 changes: 4 additions & 0 deletions util/warping.js
@@ -84,6 +84,8 @@ async function findClosestFrame(destinationVideo, sourceVideo, destinationTimest
similarity,
}
}

await fs.rm(fullOutputPath)

}

@@ -122,6 +124,8 @@

const videoInfo = await getVideoInfo(destinationVideo, sourceVideo)

console.debug(`videoInfo:`, videoInfo)

const mostSimilarFrameOffsets = []

for (const position of testPositions) {
