After learning that iOS 8 gives programmers access to the hardware H.264 decoder, I wanted to use it right away. There is a good introduction to "Direct Access to Video Encoding and Decoding" from WWDC 2014; you can take a look here.
Based on case 1 from that session, I started developing an application that should be able to take an H264-RTP-UDP-Stream from GStreamer, push it into an "appsink" element to get direct access to the NAL units, and convert those into CMSampleBuffers, which my AVSampleBufferDisplayLayer can then display.
The interesting code that does all of this is the following:
//
// GStreamerBackend.m
//
#import "GStreamerBackend.h"
NSString * const naluTypesStrings[] = {
    @"Unspecified (non-VCL)",
    @"Coded slice of a non-IDR picture (VCL)",
    @"Coded slice data partition A (VCL)",
    @"Coded slice data partition B (VCL)",
    @"Coded slice data partition C (VCL)",
    @"Coded slice of an IDR picture (VCL)",
    @"Supplemental enhancement information (SEI) (non-VCL)",
    @"Sequence parameter set (non-VCL)",
    @"Picture parameter set (non-VCL)",
    @"Access unit delimiter (non-VCL)",
    @"End of sequence (non-VCL)",
    @"End of stream (non-VCL)",
    @"Filler data (non-VCL)",
    @"Sequence parameter set extension (non-VCL)",
    @"Prefix NAL unit (non-VCL)",
    @"Subset sequence parameter set (non-VCL)",
    @"Reserved (non-VCL)",
    @"Reserved (non-VCL)",
    @"Reserved (non-VCL)",
    @"Coded slice of an auxiliary coded picture without partitioning (non-VCL)",
    @"Coded slice extension (non-VCL)",
    @"Coded slice extension for depth view components (non-VCL)",
    @"Reserved (non-VCL)",
    @"Reserved (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
    @"Unspecified (non-VCL)",
};
static GstFlowReturn new_sample(GstAppSink *sink, gpointer user_data)
{
    GStreamerBackend *backend = (__bridge GStreamerBackend *)(user_data);
    GstSample *sample = gst_app_sink_pull_sample(sink);
    GstBuffer *buffer = gst_sample_get_buffer(sample);
    GstMemory *memory = gst_buffer_get_all_memory(buffer);
    GstMapInfo info;
    gst_memory_map(memory, &info, GST_MAP_READ);

    /* Find the 0x01 that terminates the Annex B start code (00 00 01 or 00 00 00 01) */
    int startCodeIndex = 0;
    for (int i = 0; i < 5; i++) {
        if (info.data[i] == 0x01) {
            startCodeIndex = i;
            break;
        }
    }
    int nalu_type = ((uint8_t)info.data[startCodeIndex + 1] & 0x1F);
    NSLog(@"NALU with Type \"%@\" received.", naluTypesStrings[nalu_type]);

    if (backend.searchForSPSAndPPS) {
        if (nalu_type == 7)
            backend.spsData = [NSData dataWithBytes:&(info.data[startCodeIndex + 1]) length:info.size - 4];
        if (nalu_type == 8)
            backend.ppsData = [NSData dataWithBytes:&(info.data[startCodeIndex + 1]) length:info.size - 4];
        if (backend.spsData != nil && backend.ppsData != nil) {
            const uint8_t * const parameterSetPointers[2] = { (const uint8_t *)[backend.spsData bytes], (const uint8_t *)[backend.ppsData bytes] };
            const size_t parameterSetSizes[2] = { [backend.spsData length], [backend.ppsData length] };
            CMVideoFormatDescriptionRef videoFormatDescr;
            OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(kCFAllocatorDefault, 2, parameterSetPointers, parameterSetSizes, 4, &videoFormatDescr);
            [backend setVideoFormatDescr:videoFormatDescr];
            [backend setSearchForSPSAndPPS:false];
            NSLog(@"Found all data for CMVideoFormatDescription. Creation: %@.", (status == noErr) ? @"successfully." : @"failed.");
        }
    }
    if (nalu_type == 1 || nalu_type == 5) {
        /* Wrap the mapped NALU in a CMBlockBuffer, then overwrite the
           start code with a 4-byte big-endian AVCC length header */
        CMBlockBufferRef videoBlock = NULL;
        OSStatus status = CMBlockBufferCreateWithMemoryBlock(NULL, info.data, info.size, kCFAllocatorNull, NULL, 0, info.size, 0, &videoBlock);
        NSLog(@"BlockBufferCreation: %@", (status == kCMBlockBufferNoErr) ? @"successfully." : @"failed.");
        const uint8_t sourceBytes[] = {(uint8_t)(info.size >> 24), (uint8_t)(info.size >> 16), (uint8_t)(info.size >> 8), (uint8_t)info.size};
        status = CMBlockBufferReplaceDataBytes(sourceBytes, videoBlock, 0, 4);
        NSLog(@"BlockBufferReplace: %@", (status == kCMBlockBufferNoErr) ? @"successfully." : @"failed.");
        CMSampleBufferRef sbRef = NULL;
        const size_t sampleSizeArray[] = {info.size};
        status = CMSampleBufferCreate(kCFAllocatorDefault, videoBlock, true, NULL, NULL, backend.videoFormatDescr, 1, 0, NULL, 1, sampleSizeArray, &sbRef);
        NSLog(@"SampleBufferCreate: %@", (status == noErr) ? @"successfully." : @"failed.");
        CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sbRef, YES);
        CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
        CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
        NSLog(@"Error: %@, Status:%@", backend.displayLayer.error, (backend.displayLayer.status == AVQueuedSampleBufferRenderingStatusUnknown) ? @"unknown" : ((backend.displayLayer.status == AVQueuedSampleBufferRenderingStatusRendering) ? @"rendering" : @"failed"));
        dispatch_async(dispatch_get_main_queue(), ^{
            [backend.displayLayer enqueueSampleBuffer:sbRef];
            [backend.displayLayer setNeedsDisplay];
        });
    }
    gst_memory_unmap(memory, &info);
    gst_memory_unref(memory);
    /* The buffer is owned by the sample, so release the sample rather than
       unreffing the borrowed buffer reference */
    gst_sample_unref(sample);
    return GST_FLOW_OK;
}
@implementation GStreamerBackend

- (instancetype)init
{
    if (self = [super init]) {
        self.searchForSPSAndPPS = true;
        self.ppsData = nil;
        self.spsData = nil;
        self.displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
        self.displayLayer.bounds = CGRectMake(0, 0, 300, 300);
        self.displayLayer.backgroundColor = [UIColor blackColor].CGColor;
        self.displayLayer.position = CGPointMake(500, 500);
        self.queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_async(self.queue, ^{
            [self app_function];
        });
    }
    return self;
}

- (void)start
{
    if (gst_element_set_state(self.pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
        NSLog(@"Failed to set pipeline to playing");
    }
}

- (void)app_function
{
    GstElement *udpsrc, *rtphdepay, *capsfilter;
    GMainContext *context; /* GLib context used to run the main loop */
    GMainLoop *main_loop;  /* GLib main loop */

    context = g_main_context_new();
    g_main_context_push_thread_default(context);
    g_set_application_name("appsink");

    self.pipeline = gst_pipeline_new("testpipe");

    udpsrc = gst_element_factory_make("udpsrc", "udpsrc");
    GstCaps *caps = gst_caps_new_simple("application/x-rtp",
                                        "media", G_TYPE_STRING, "video",
                                        "clock-rate", G_TYPE_INT, 90000,
                                        "encoding-name", G_TYPE_STRING, "H264", NULL);
    g_object_set(udpsrc, "caps", caps, "port", 5000, NULL);
    gst_caps_unref(caps);

    rtphdepay = gst_element_factory_make("rtph264depay", "rtph264depay");

    capsfilter = gst_element_factory_make("capsfilter", "capsfilter");
    /* Note: the caps field is spelled "stream-format" */
    caps = gst_caps_new_simple("video/x-h264",
                               "stream-format", G_TYPE_STRING, "byte-stream",
                               "alignment", G_TYPE_STRING, "nal", NULL);
    g_object_set(capsfilter, "caps", caps, NULL);
    gst_caps_unref(caps);

    self.appsink = gst_element_factory_make("appsink", "appsink");

    gst_bin_add_many(GST_BIN(self.pipeline), udpsrc, rtphdepay, capsfilter, self.appsink, NULL);
    if (!gst_element_link_many(udpsrc, rtphdepay, capsfilter, self.appsink, NULL)) {
        NSLog(@"Cannot link gstreamer elements");
        exit(1);
    }
    if (gst_element_set_state(self.pipeline, GST_STATE_READY) != GST_STATE_CHANGE_SUCCESS)
        NSLog(@"could not change to ready");

    GstAppSinkCallbacks callbacks = { NULL, NULL, new_sample, NULL, NULL };
    gst_app_sink_set_callbacks(GST_APP_SINK(self.appsink), &callbacks, (__bridge gpointer)(self), NULL);

    main_loop = g_main_loop_new(context, FALSE);
    g_main_loop_run(main_loop);

    /* Free resources */
    g_main_loop_unref(main_loop);
    main_loop = NULL;
    g_main_context_pop_thread_default(context);
    g_main_context_unref(context);
    gst_element_set_state(GST_ELEMENT(self.pipeline), GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(self.pipeline));
}
@end
What I get when running the app and starting to stream to the iOS device:
NALU with Type "Sequence parameter set (non-VCL)" received.
NALU with Type "Picture parameter set (non-VCL)" received.
Found all data for CMVideoFormatDescription. Creation: successfully..
NALU with Type "Coded slice of an IDR picture (VCL)" received.
BlockBufferCreation: successfully.
BlockBufferReplace: successfully.
SampleBufferCreate: successfully.
Error: (null), Status:unknown
NALU with Type "Coded slice of a non-IDR picture (VCL)" received.
BlockBufferCreation: successfully.
BlockBufferReplace: successfully.
SampleBufferCreate: successfully.
Error: (null), Status:rendering
[...] (repetition of the last 5 lines)
So it seems to decode the way it should, but my problem is that I cannot see anything in my AVSampleBufferDisplayLayer. It may be a problem with kCMSampleAttachmentKey_DisplayImmediately, but I have set it the way I was told here (see the 'important' note).
Any ideas are welcome ;)
Best answer
It works now. The length of each NALU does not include the length header itself, so I had to subtract 4 from my info.size before using it for my sourceBytes.
Regarding "ios - How to use AVSampleBufferDisplayLayer in iOS 8 for RTP H264 streams with GStreamer?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/25980070/