
Swift Video Composition and Editing: Learning Swift (10)

2016-06-29 18:02

Composition

Some background first: a project file contains many tracks, such as audio track 1, audio track 2, video track 1, video track 2, and so on. Each track holds pieces of media, which can be compressed, rotated, and otherwise manipulated. When a video is dragged from the media library onto the timeline, it is split into two tracks: a video track and an audio track.

Editing

The AVFoundation framework provides a rich set of classes for editing the audio and video of assets.

AVAsset: a piece of media from the library;

AVAssetTrack: a track within an asset;

The core of AVFoundation's editing API is the composition;

A composition is simply a collection of tracks taken from one or more assets;

AVMutableComposition: a project file used to compose video;

AVMutableCompositionTrack: a track in the project file (an audio track, a video track, and so on) into which the corresponding media is inserted;

AVMutableVideoCompositionLayerInstruction: one video within a video track; it can be scaled, rotated, and so on;

AVMutableVideoCompositionInstruction: a video track, covering all the video media on that track;

AVMutableVideoCompositionInstruction and AVMutableVideoCompositionLayerInstruction are generally used together, for example to watermark a video or to rotate its orientation;

AVMutableVideoComposition: manages all video tracks and determines the final video's dimensions; cropping is done here;

AVMutableAudioMix: adds and mixes audio for the video;

AVAssetExportSession: configures the render settings and performs the render.

You can edit your audio tracks with the AVMutableAudioMix class, for example specifying a maximum volume or applying a volume ramp.

You can edit video tracks directly with the AVMutableVideoComposition class.

Creating a Composition

You create a composition with the AVMutableComposition class. You then add one or more composition tracks, instances of the AVMutableCompositionTrack class, to it. The simplest composition has one video track and one audio track.

let mutableComposition = AVMutableComposition()
let mutableCompositionVideoTrack = mutableComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid) // video track
let mutableCompositionAudioTrack = mutableComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid) // audio track


Options for Initializing a Composition Track

When adding a new track to a composition, you must supply a media type and a track ID. Although audio and video are the most commonly used media types, other types such as AVMediaTypeSubtitle or AVMediaTypeText can be specified as well.
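
For example, a timed-text track can be added the same way. This is a sketch building on the composition above; note that actual subtitle/text handling depends on the platform and export preset:

```swift
// Adding a track of a less common media type (timed text) to the same composition
let mutableCompositionTextTrack = mutableComposition.addMutableTrackWithMediaType(AVMediaTypeText, preferredTrackID: kCMPersistentTrackID_Invalid)
```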

Adding Audiovisual Data to the Composition

Once your composition has one or more tracks, you can add media data to the appropriate tracks. The mutable composition interface lets you place multiple segments of the same underlying media type on a single track.

let videoAsset = AVAsset(URL: NSURL(string: "http://140.207.205.6/sohu/v1/TmwmTmxATmXG9vhNXRDFk9CBJpakEWJmopXAOCkytHrChWoIymcAr.mp4?k=mlU3Gr&p=j9lvzSwUopPGqmwCoSwGqSoCqpXWsUwIWFo7oB2svm12ZDeS0tvGRD6sWYNsfY1svmfCZMbVwmfVZD6HfYXswmNCNF2OWYdXfGN4wm6AZDNXfY1swm1BqVPcgYeSoMAARDx&r=TmI20LscWOoUgt8IS3HTaI2yT8XPLNGWVEt88eILOfkHkg9kjdxLYf9SlZEmrW49JLzSxm0KOyOHXIY&q=OpC7hW7IWJodRDbOwmfCyY2sWF1HfJ1tlG6t5G64WYo2ZDv4fFesWGNOwm4cWhbOvmscWY&cip=140.207.16.150")!) // video asset 1

let anotherAsset = AVAsset(URL: NSURL(string: "http://data.vod.itc.cn/?new=/218/117/QH0duE89EFks1QWyHBshuL.mp4&vid=3088596&plat=17&mkey=CLfbS51YewD-mLXpGv4ZyKSwDrzD1heG&ch=tv&uid=1602171411153447&SOHUSVP=EtzMYkA639BAKmIufcqZ2lpZcDeNMTd-V15MB1rYf9k&pt=5&prod=h5&pg=1&eye=0&cv=1.0.0&qd=68000&src=11060001&ca=4&cateCode=115&_c=1&appid=tv&oth=&cd=")!) // video asset 2
let videoAssetTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0] // video track of asset 1
let anotherVideoAssetTrack = anotherAsset.tracksWithMediaType(AVMediaTypeVideo)[0] // video track of asset 2

// Insert asset 1's video track at the start of the composition's video track
try! mutableCompositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration), ofTrack: videoAssetTrack, atTime: kCMTimeZero)
// Insert asset 2's video track immediately after it
try! mutableCompositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, anotherVideoAssetTrack.timeRange.duration), ofTrack: anotherVideoAssetTrack, atTime: videoAssetTrack.timeRange.duration)
// The two assets' video tracks are now combined into a single composition video track.


Retrieving Compatible Composition Tracks

Where possible, you should have only one composition track per media type. Unifying compatible asset tracks in this way keeps resource usage to a minimum. When presenting media data serially, place data of the same media type on the same composition track. You can query a mutable composition for a composition track that is compatible with a given asset track:

if let compatibleCompositionTrack = mutableComposition.mutableTrackCompatibleWithTrack(videoAssetTrack) {
    // Insert the asset track into compatibleCompositionTrack here.
}


Generating a Volume Ramp

A single AVMutableAudioMix object can perform custom audio processing on each audio track of your composition individually. You create an audio mix with the AVMutableAudioMix class, then use instances of AVMutableAudioMixInputParameters to associate the mix with specific tracks in your composition. An audio mix can be used, for example, to vary the volume of an audio track:

let mutableAudioMix = AVMutableAudioMix()
let mixParameters = AVMutableAudioMixInputParameters(track: mutableCompositionAudioTrack)
// Fade the audio from full volume to silence over the whole composition
mixParameters.setVolumeRampFromStartVolume(1.0, toEndVolume: 0.0, timeRange: CMTimeRangeMake(kCMTimeZero, mutableComposition.duration))
mutableAudioMix.inputParameters = [mixParameters]
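
Conceptually, a volume ramp is just a linear interpolation between the start and end volumes across the ramp's time range. A minimal pure-Swift sketch of that interpolation (the rampedVolume function is hypothetical, not part of AVFoundation):

```swift
// Hypothetical helper: the linear interpolation that a volume ramp describes.
// AVFoundation evaluates this continuously over the CMTimeRange; here,
// progress is the fraction of the time range that has elapsed (0.0 to 1.0).
func rampedVolume(_ start: Double, _ end: Double, _ progress: Double) -> Double {
    let t = min(max(progress, 0.0), 1.0) // clamp to the ramp's time range
    return start + (end - start) * t
}

// Fading from full volume to silence, as in the example above:
print(rampedVolume(1.0, 0.0, 0.0)) // 1.0
print(rampedVolume(1.0, 0.0, 0.5)) // 0.5
print(rampedVolume(1.0, 0.0, 1.0)) // 0.0
```

AVFoundation applies the same ramp in CMTime units over the timeRange passed to setVolumeRampFromStartVolume(_:toEndVolume:timeRange:).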


Performing Custom Video Processing

As with an audio mix, a single AVMutableVideoComposition object can perform custom video processing on each video track of your composition individually. With a video composition you can directly set the render size, scale, and frame rate of your video tracks.

Changing the Composition's Background Color

Every video composition must have an array of AVVideoCompositionInstruction objects containing at least one video composition instruction; you create your own with the AVMutableVideoCompositionInstruction class. With a video composition instruction you can change the composition's background color, specify whether post-processing is needed, and apply layer instructions.

// Change the composition's background color
let mutableVideoCompositionInstruction = AVMutableVideoCompositionInstruction()
mutableVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mutableComposition.duration)
mutableVideoCompositionInstruction.backgroundColor = UIColor.redColor().CGColor


Applying an Opacity Ramp

Video composition instructions can also be used to apply video composition layer instructions. An AVMutableVideoCompositionLayerInstruction can apply transforms and transform ramps, as well as opacity and opacity ramps, to a video track.

let mutableComposition = AVMutableComposition()
let mutableCompositionVideoTrack = mutableComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
let firstVideoAsset = AVAsset(URL: NSURL(string: "http://data.vod.itc.cn/?new=/218/117/QH0duE89EFks1QWyHBshuL.mp4&vid=3088596&plat=17&mkey=CLfbS51YewD-mLXpGv4ZyKSwDrzD1heG&ch=tv&uid=1602171411153447&SOHUSVP=EtzMYkA639BAKmIufcqZ2lpZcDeNMTd-V15MB1rYf9k&pt=5&prod=h5&pg=1&eye=0&cv=1.0.0&qd=68000&src=11060001&ca=4&cateCode=115&_c=1&appid=tv&oth=&cd=")!)
let firstVideoAssetTrack = firstVideoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
let firstVideoCompositionInstruction = AVMutableVideoCompositionInstruction()
firstVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration)
let firstVideoLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: mutableCompositionVideoTrack)
// Fade the first video from fully opaque to fully transparent over its duration
firstVideoLayerInstruction.setOpacityRampFromStartOpacity(1, toEndOpacity: 0, timeRange: CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration))

let secondVideoAsset = AVAsset(URL: NSURL(string: "http://data.vod.itc.cn/?new=/13/41/ISfD3DLpScKctFu1HKheqB.mp4&vid=3088680&plat=17&mkey=F6E2ruSK6YflEML59DnzvUYrCDunhuIQ&ch=tv&uid=1602171411153447&SOHUSVP=EtzMYkA639AXH-8Jc72Vn_bYjYnF97iVz4-oU02a_pE&pt=5&prod=h5&pg=1&eye=0&cv=1.0.0&qd=68000&src=11060001&ca=4&cateCode=115&_c=1&appid=tv&oth=&cd=")!)
let secondVideoAssetTrack = secondVideoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
let secondVideoCompositionInstruction = AVMutableVideoCompositionInstruction()
// Starts where the first video ends and lasts for the second video's duration
secondVideoCompositionInstruction.timeRange = CMTimeRangeMake(firstVideoAssetTrack.timeRange.duration, secondVideoAssetTrack.timeRange.duration)

let secondVideoLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: mutableCompositionVideoTrack)
// The second video plays at full opacity, so no ramp is applied to it

firstVideoCompositionInstruction.layerInstructions = [firstVideoLayerInstruction]
secondVideoCompositionInstruction.layerInstructions = [secondVideoLayerInstruction]

let mutableVideoComposition = AVMutableVideoComposition()
mutableVideoComposition.instructions = [firstVideoCompositionInstruction,secondVideoCompositionInstruction]
mutableVideoComposition.renderSize = CGSizeMake(300, 200) // crop to this size
mutableVideoComposition.frameDuration = CMTimeMake(1, 30) // 30 frames per second


Incorporating Core Animation Effects

A video composition can incorporate Core Animation through its animationTool property. With the animation tool you can accomplish tasks such as watermarking video and adding titles or animated overlays. Core Animation can be used in two ways: you can add a Core Animation layer as its own composition track, or you can render Core Animation effects directly into the video frames.
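
As a sketch of the watermark case (this code is not from the original post; it assumes the mutableVideoComposition built in the earlier examples, and the layer text and frames are purely illustrative):

```swift
// Sketch: overlay a watermark with Core Animation via the animationTool property.
// videoLayer is where the composited video frames are rendered; parentLayer
// stacks the watermark layer on top of the video.
let videoSize = mutableVideoComposition.renderSize

let watermarkLayer = CATextLayer()
watermarkLayer.string = "Watermark" // illustrative text
watermarkLayer.frame = CGRectMake(16, 16, 200, 40) // illustrative position

let videoLayer = CALayer()
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height)

let parentLayer = CALayer()
parentLayer.frame = videoLayer.frame
parentLayer.addSublayer(videoLayer)
parentLayer.addSublayer(watermarkLayer)

mutableVideoComposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)
```

When the composition is exported with this video composition, the watermark layer is drawn on top of every rendered frame.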

// To be continued


Example


import UIKit
import AVFoundation
import MobileCoreServices
import AssetsLibrary

// Note: this sample does not fully work yet

class CompositionViewController: FCFBaseViewController {
    var kDateFormatter: NSDateFormatter?

    override func viewDidLoad() {
        super.viewDidLoad()
        // The project file
        let mutableComposition = AVMutableComposition()
        // Video track
        let videoCompositionTrack = mutableComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        // Audio track
        let audioCompositionTrack = mutableComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

        // Add assets
        let videoAsset = AVAsset(URL: NSURL(string: "http://down.treney.com/mov/test.mp4")!)
        // Video track of asset 1
        let firstVideoAssetTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
        // Insert asset 1's video track at the start of the composition's video track
        try! videoCompositionTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration), ofTrack: firstVideoAssetTrack, atTime: kCMTimeZero)

        let anotherAsset = AVAsset(URL: NSURL(string: "http://119.6.239.159/sohu/v1/TmwioEItfBc7R8PRfDAsg4eg0MbG0FyFMTENWEKl5m47fFoGRMNiNw.mp4?k=Qn4Gfr&p=j9lvzSwUopPGqmwCoSwGqSoCqpXWsUwIWFo7oB2svm12ZDeS0tvGRD6sWYNsfY1svmfCZMbVwmfVZD6HfYXswmNCNF2OfO1XWDWOwm6AZDNXfY1swm1BqVPcgYeSoMAARDx&r=TmI20LscWOoUgt8IS3TT9qLEWqTt5YYoWqkTkjysaY24WGeaRJZcfJsXwJoSKODOfoIWDd4wmXAyBj&q=OpCUhW7IWJodRDbOwmfCyY2sWF1HfJ1tlG6t5G64WYo2ZDv4fFeOWJ6Xvm4cRY1SqF2OY&cip=140.207.16.150")!)
        // Video track of asset 2
        let secondVideoAssetTrack = anotherAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
        // Insert asset 2's video track right after asset 1's
        try! videoCompositionTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, secondVideoAssetTrack.timeRange.duration), ofTrack: secondVideoAssetTrack, atTime: firstVideoAssetTrack.timeRange.duration)

        // Lay a background audio track under the combined videos
        let audioAsset = AVAsset(URL: NSURL(string: "http://tsmusic128.tc.qq.com/37023937.mp3")!)
        let audioAssetTrack = audioAsset.tracksWithMediaType(AVMediaTypeAudio)[0]
        try! audioCompositionTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, CMTimeAdd(firstVideoAssetTrack.timeRange.duration, secondVideoAssetTrack.timeRange.duration)), ofTrack: audioAssetTrack, atTime: kCMTimeZero)

        // Refuse to combine videos with mismatched orientations
        var isFirstVideoPortrait = false
        let firstTransform = firstVideoAssetTrack.preferredTransform
        if firstTransform.a == 0 && firstTransform.d == 0 && (firstTransform.b == 1.0 || firstTransform.b == -1.0) && (firstTransform.c == 1.0 || firstTransform.c == -1.0) {
            isFirstVideoPortrait = true
        }
        var isSecondVideoPortrait = false
        let secondTransform = secondVideoAssetTrack.preferredTransform
        if secondTransform.a == 0 && secondTransform.d == 0 && (secondTransform.b == 1.0 || secondTransform.b == -1.0) && (secondTransform.c == 1.0 || secondTransform.c == -1.0) {
            isSecondVideoPortrait = true
        }
        if isFirstVideoPortrait != isSecondVideoPortrait {
            let alertController = UIAlertController(title: "Error!", message: "Cannot combine a video shot in portrait mode with a video shot in landscape mode.", preferredStyle: .Alert)
            alertController.addAction(UIAlertAction(title: "Dismiss", style: .Cancel, handler: nil))
            presentViewController(alertController, animated: true, completion: nil)
        }

        // Apply the composition instructions
        let firstVideoCompositionInstruction = AVMutableVideoCompositionInstruction()
        firstVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstVideoAssetTrack.timeRange.duration)
        let secondVideoCompositionInstruction = AVMutableVideoCompositionInstruction()
        // Starts where the first video ends and lasts for the second video's duration
        secondVideoCompositionInstruction.timeRange = CMTimeRangeMake(firstVideoAssetTrack.timeRange.duration, secondVideoAssetTrack.timeRange.duration)

        let firstVideoCompositionLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoCompositionTrack)
        firstVideoCompositionLayerInstruction.setTransform(firstTransform, atTime: kCMTimeZero)
        let secondVideoCompositionLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoCompositionTrack)
        secondVideoCompositionLayerInstruction.setTransform(secondTransform, atTime: firstVideoAssetTrack.timeRange.duration)

        firstVideoCompositionInstruction.layerInstructions = [firstVideoCompositionLayerInstruction]
        secondVideoCompositionInstruction.layerInstructions = [secondVideoCompositionLayerInstruction]

        let mutableVideoComposition = AVMutableVideoComposition()
        mutableVideoComposition.instructions = [firstVideoCompositionInstruction, secondVideoCompositionInstruction]

        // Choose a render size large enough to contain both videos
        var naturalSizeFirst: CGSize
        var naturalSizeSecond: CGSize
        if isFirstVideoPortrait {
            // Portrait video stores its natural size in landscape, so swap width and height
            naturalSizeFirst = CGSizeMake(firstVideoAssetTrack.naturalSize.height, firstVideoAssetTrack.naturalSize.width)
            naturalSizeSecond = CGSizeMake(secondVideoAssetTrack.naturalSize.height, secondVideoAssetTrack.naturalSize.width)
        } else {
            naturalSizeFirst = firstVideoAssetTrack.naturalSize
            naturalSizeSecond = secondVideoAssetTrack.naturalSize
        }
        let renderWidth = max(naturalSizeFirst.width, naturalSizeSecond.width)
        let renderHeight = max(naturalSizeFirst.height, naturalSizeSecond.height)
        mutableVideoComposition.renderSize = CGSizeMake(renderWidth, renderHeight)
        mutableVideoComposition.frameDuration = CMTimeMake(1, 30) // 30 frames per second

        if kDateFormatter == nil {
            kDateFormatter = NSDateFormatter()
            kDateFormatter?.dateStyle = .MediumStyle
            kDateFormatter?.timeStyle = .ShortStyle
        }

        // Configure the export session and render
        let exporter = AVAssetExportSession(asset: mutableComposition, presetName: AVAssetExportPresetHighestQuality)
        var outputURL: NSURL?
        let fileManager = NSFileManager.defaultManager()
        do {
            let documentsURL = try fileManager.URLForDirectory(.DocumentDirectory, inDomain: .UserDomainMask, appropriateForURL: nil, create: true)
            // Name the file after the current date, with the extension for QuickTime movies (.mov)
            let fileExtension = UTTypeCopyPreferredTagWithClass(AVFileTypeQuickTimeMovie as CFStringRef, kUTTagClassFilenameExtension)!.takeRetainedValue() as String
            outputURL = documentsURL.URLByAppendingPathComponent(kDateFormatter!.stringFromDate(NSDate())).URLByAppendingPathExtension(fileExtension)
        } catch {
            print("Could not create the output URL: \(error)")
        }

        exporter?.outputURL = outputURL
        exporter?.outputFileType = AVFileTypeQuickTimeMovie
        exporter?.shouldOptimizeForNetworkUse = true
        exporter?.videoComposition = mutableVideoComposition
        exporter?.exportAsynchronouslyWithCompletionHandler {
            dispatch_async(dispatch_get_main_queue()) {
                // On success, save the exported movie to the Saved Photos album
                if exporter?.status == AVAssetExportSessionStatus.Completed {
                    if let movieURL = outputURL {
                        let library = ALAssetsLibrary()
                        if library.videoAtPathIsCompatibleWithSavedPhotosAlbum(movieURL) {
                            library.writeVideoAtPathToSavedPhotosAlbum(movieURL, completionBlock: nil)
                        }
                    }
                }
            }
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    /*
    // MARK: - Navigation

    // In a storyboard-based application, you will often want to do a little preparation before navigation
    override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
        // Get the new view controller using segue.destinationViewController.
        // Pass the selected object to the new view controller.
    }
    */
}