HaishinKit for iOS, macOS, tvOS, and Android.


  • Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS.
  • README.md contains unreleased content, which can be tested on the main branch.
  • API Documentation

Sponsored with 💖 by
Stream Chat
Enterprise Grade APIs for Feeds & Chat. Try the iOS Chat tutorial 💬

💬 Communication

  • If you need help making live-streaming requests with HaishinKit, use GitHub Discussions with the Q&A category.
  • If you'd like to discuss a feature request, use GitHub Discussions with the Idea category.
  • If you encounter a HaishinKit bug 🐛, open a GitHub Issue using the Bug report template.
    • The trace-level log is very useful. Please set LBLogger.with(HaishinKitIdentifier).level = .trace.
    • If you don't use an issue template, the issue will be closed immediately without comment.
  • If you want to contribute, submit a pull request using the PR template.
  • If you want e-mail based support outside of GitHub:
    • The consulting fee is $50 per incident. A response may take a few days.
  • Discord chatroom.
  • If you can communicate in Japanese, please do so in Japanese!

💖 Sponsors

Streamlabs

🌏 Related projects

  • HaishinKit for Android: Camera and Microphone streaming library via RTMP for Android. License: BSD 3-Clause "New" or "Revised" License.
  • HaishinKit for Flutter: Camera and Microphone streaming library via RTMP for Flutter. License: BSD 3-Clause "New" or "Revised" License.

🎨 Features

RTMP

  • Authentication
  • Publish and Recording
  • Playback (Beta)
  • Adaptive bitrate streaming
    • Handling (see also #1153)
  • Action Message Format
    • AMF0
    • AMF3
  • SharedObject
  • RTMPS
    • Native (RTMP over SSL/TLS)
    • Tunneled (RTMPT over SSL/TLS) (Technical Preview)
  • RTMPT (Technical Preview)
  • ReplayKit Live as a Broadcast Upload Extension
  • Supported codec
    • Audio
      • AAC
    • Video
      • H264/AVC
        • ex: stream.videoSettings.profileLevel = kVTProfileLevel_H264_Baseline_3_1 as String
      • H265/HEVC (the server side must also support Enhanced RTMP)
        • ex: stream.videoSettings.profileLevel = kVTProfileLevel_HEVC_Main_AutoLevel as String

SRT

SRT support is provided as a separate framework and is available through Swift Package Manager. A rough connection sketch follows the mode list below.

import SRTHaishinKit
  • Publish and Recording (H264/AAC)
  • Playback (Beta)
  • Modes
    • caller
    • listener
    • rendezvous
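
A minimal publish sketch, under assumptions: the SRTConnection and SRTStream class names come from the SRTHaishinKit module, but the exact method names and signatures (open/connect, publish) differ between releases, so check the SRTHaishinKit API documentation for your version before relying on this.

import AVFoundation
import SRTHaishinKit

let connection = SRTConnection()
// Assumed initializer, mirroring RTMPStream(connection:).
let stream = SRTStream(connection: connection)
stream.attachAudio(AVCaptureDevice.default(for: .audio))
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back))
// Assumed caller-mode open; listener/rendezvous are commonly selected via the srt:// URL's mode parameter.
connection.open(URL(string: "srt://192.168.1.1:9998"))
stream.publish()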

HLS

  • HTTPService
  • HLS Publish

Multi Camera

Supports two camera video sources: a picture-in-picture display that overlays the secondary camera's image on the primary camera's image, and a split display that shows the two cameras side by side, horizontally or vertically.

Screenshots: Picture-In-Picture / Split
// If you're using multi-camera functionality, please make sure to call the attachMultiCamera method first. This is required for iOS 14 and 15, among others.
if #available(iOS 13.0, *) {
  let front = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front)
  stream.attachMultiCamera(front)
}
let back = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)
stream.attachCamera(back)
stream.attachAudio(AVCaptureDevice.default(for: .audio))

Rendering

Features            HKView                        PiPHKView                     MTHKView
Engine              AVCaptureVideoPreviewLayer    AVSampleBufferDisplayLayer    Metal
Publish
Playback
VisualEffect
PictureInPicture
MultiCamera
Others

🌏 Requirements

Version   iOS     OSX      tvOS    Xcode   Swift
1.5.0+    11.0+   10.13+   10.2+   14.3+   5.7+
1.4.0+    11.0+   10.13+   10.2+   14.0+   5.7+

🐾 Examples

Example projects are available for iOS with UIKit, iOS with SwiftUI, macOS, and tvOS.

  • Camera and microphone publish.
  • RTMP Playback
git clone https://github.com/shogo4405/HaishinKit.swift.git
cd HaishinKit.swift
carthage bootstrap --use-xcframeworks
open HaishinKit.xcodeproj

☕ Cocoa Keys

Please add the following keys to your Info.plist.

iOS 10.0+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription

macOS 10.14+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription

🔧 Installation

CocoaPods

source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!

def import_pods
    pod 'HaishinKit', '~> 1.5.8'
end

target 'Your Target'  do
    platform :ios, '11.0'
    import_pods
end

Carthage

github "shogo4405/HaishinKit.swift" ~> 1.5.8

Swift Package Manager

https://github.com/shogo4405/HaishinKit.swift
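
If you declare the dependency in a Package.swift manifest instead of through Xcode's UI, an entry along these lines should work. The product name HaishinKit and the 1.5.8 version pin are assumptions here, chosen to match the CocoaPods and Carthage examples above.

// Package.swift (sketch)
dependencies: [
    .package(url: "https://github.com/shogo4405/HaishinKit.swift.git", from: "1.5.8")
],
targets: [
    .target(
        name: "YourTarget",
        dependencies: [
            // Product name assumed; adjust if the package exposes a different library product.
            .product(name: "HaishinKit", package: "HaishinKit.swift")
        ]
    )
]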

🔧 Prerequisites

Make sure you set up and activate your AVAudioSession on iOS.

import AVFoundation
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
    try session.setActive(true)
} catch {
    print(error)
}

📓 RTMP Usage

Real Time Messaging Protocol (RTMP).

let connection = RTMPConnection()
let stream = RTMPStream(connection: connection)
stream.attachAudio(AVCaptureDevice.default(for: .audio)) { error in
    // print(error)
}
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)) { error in
    // print(error)
}

let hkView = MTHKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(stream)

// Add the view to ViewController#view.
view.addSubview(hkView)

connection.connect("rtmp://localhost/appName/instanceName")
stream.publish("streamName")

RTMP URL Format

  • rtmp://server-ip-address[:port]/application/[appInstance]/[prefix:[path1[/path2/]]]streamName
    • Square brackets [] denote optional parts.
    rtmpConnection.connect("rtmp://server-ip-address[:port]/application/[appInstance]")
    rtmpStream.publish("[prefix:[path1[/path2/]]]streamName")
    
  • rtmp://localhost/live/streamName
    rtmpConneciton.connect("rtmp://localhost/live")
    rtmpStream.publish("streamName")
    

Settings

var stream = RTMPStream(connection: rtmpConnection)

stream.frameRate = 30
stream.sessionPreset = AVCaptureSession.Preset.medium

/// Specifies the video capture settings.
stream.videoCapture(for: 0).isVideoMirrored = false
stream.videoCapture(for: 0).preferredVideoStabilizationMode = .auto
// stream.videoCapture(for: 1).isVideoMirrored = false

// Specifies the audio codec settings.
stream.audioSettings = AudioCodecSettings(
  bitRate: 64 * 1000
)

// Specifies the video codec settings.
stream.videoSettings = VideoCodecSettings(
  videoSize: .init(width: 854, height: 480),
  profileLevel: kVTProfileLevel_H264_Baseline_3_1 as String,
  bitRate: 640 * 1000,
  maxKeyFrameIntervalDuration: 2,
  scalingMode: .trim,
  bitRateMode: .average,
  allowFrameReordering: nil,
  isHardwareEncoderEnabled: true
)

// Specifies the recording settings. 0" means the same of input.
stream.startRecording([
  AVMediaType.audio: [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 0,
    AVNumberOfChannelsKey: 0,
    // AVEncoderBitRateKey: 128000,
  ],
  AVMediaType.video: [
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoHeightKey: 0,
    AVVideoWidthKey: 0,
    /*
    AVVideoCompressionPropertiesKey: [
      AVVideoMaxKeyFrameIntervalDurationKey: 2,
      AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
      AVVideoAverageBitRateKey: 512000
    ]
    */
  ]
])

// Set the 2nd argument to false to disable automatic AVAudioSession configuration.
stream.attachAudio(AVCaptureDevice.default(for: .audio), automaticallyConfiguresApplicationAudioSession: false)
// Picture-in-picture settings.
stream.multiCamCaptureSettings = MultiCamCaptureSetting(
  mode: .pip,
  cornerRadius: 16.0,
  regionOfInterest: .init(
    origin: CGPoint(x: 16, y: 16),
    size: .init(width: 160, height: 160)
  )
)
// split settings.
stream.multiCamCaptureSettings = MultiCamCaptureSetting(
  mode: .split(direction: .east),
  cornerRadius: 0.0,
  regionOfInterest: .init(
    origin: .zero,
    size: .zero
  )
)

Authentication

var connection = RTMPConnection()
connection.connect("rtmp://username:password@localhost/appName/instanceName")

Screen Capture

// iOS
let screen = IOUIScreenCaptureUnit(shared: UIApplication.shared)
screen.delegate = stream
screen.startRunning()

// macOS
stream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))

📓 HTTP Usage

HTTP Live Streaming (HLS). Your iPhone/Mac becomes an IP camera. A basic snippet is shown below; you can then open http://ip.address:8080/hello/playlist.m3u8

var stream = HTTPStream()
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back))
stream.attachAudio(AVCaptureDevice.default(for: .audio))
stream.publish("hello")

var hkView = MTHKView(frame: view.bounds)
hkView.attachStream(stream)

var httpService = HLSService(domain: "", type: "_http._tcp", name: "HaishinKit", port: 8080)
httpService.addHTTPStream(stream)
httpService.startRunning()

// Add the view to ViewController#view.
view.addSubview(hkView)

💠 Sponsorship

We are looking for sponsors. Sponsorship will enable us to:

  • Purchase smartphones or peripheral devices for testing purposes.
  • Pay for testing on a specific streaming service or for testing on mobile lines.
  • Potentially personal use, so that OSS development can continue.

If you use any of our libraries for work, see if your employer would be interested in sponsorship. I have some special offers, and I would greatly appreciate it. Thank you.

  • If you request it, I will list your name or product in our README.
  • If you mention in a discussion, an issue, or a pull request that you are sponsoring us, I will prioritize helping you even more.

We are looking for sponsors. Sponsorship funds will be used to:

  • Purchase smartphones and peripheral devices for testing.
  • Pay for testing against specific streaming services and for testing over mobile network lines.
  • Possibly for personal use, so that the author can continue OSS development.

If you use this library for work on an ongoing basis, please consider asking your employer whether they would be interested in sponsoring. Several perks are available:

  • Your company logo displayed in README.md
  • Prioritized handling of Issues and Pull Requests

Sponsorship

📖 Reference

📜 License

BSD-3-Clause
