Discoverable, decentralized, cached live video streaming #31
This might be interesting to try out. |
See this related issue. Some promising options there. |
Okay! So I did some more research on this:
|
Update. I was able to get basic live streaming to work with YouPHPTube. So we have some kind of a working solution now. Onwards towards something smoother, and closer to being more decentralized. Separately, I will work on a playbook to get YouPHPTube into IIAB. |
How realistic/practical might it be for educators to "publish" short videos from their Android phones[*] (or short screencasts/tutorials from their laptops) to YouPHPTube on IIAB? [*] whether 100% homebrewed or from WhatsApp, or YouTube itself...which contains a ton of very thoughtful educational vids and visualizations (amid the swamp ;) ASIDE: many YouTube science demonstrations are more vivid than PhET science simulations...these 2 artforms should really be combined side-by-side!? |
Another update. @sboruah and I were able to test that she could livestream from her phone's camera, and I could watch it in a browser. There is one limitation: the live stream does not get recorded directly onto the site, and it seems that feature is enabled through a paid plugin. There is a workaround though, as it is easily possible for those videos to be recorded somewhere on the server, where they may be presented (manually), through a simple online form, to the encoder as part of this suite. |
@holta for what it's worth, streaming and transcoding require significant compute resources, so if IIAB is running on decent hardware (an i3 NUC or so), then it is entirely possible. |
Bumping milestone. Good progress happened on this over the week. |
One challenge with a P2P HLS/WebRTC-based solution seems to be that we might have to build apps, or hunt for existing ones, that can do a live stream. |
Let's see...maybe not. Maybe we just have to allow it through a browser by simple means. Let's see. |
I'm talking of mobile phones. People accustomed to using apps, so it might be the preferred way. Desktop will have to be browser based. |
Well, I think people are more accustomed to going to a browser than to an app, even if you count all apps combined vs. the browser. But surely the browser will win every time against a custom-made app built for a single purpose. |
Also, I have little knowledge of app development, but maybe the security model on phones also encourages apps rather than webcam+mic permissions in a browser. Anyway, I don't know this area very well, so perhaps someone more adept needs to answer this. |
It makes sense to have a protocol where we always assume that something can be done in a browser, because it is easy, it does not introduce anything else, and it allows us to focus on core capabilities, prototype fast, etc. Only if that absolutely fails should we consider introducing something new (like an app). Apps are kind of like cognitive garbage... just as we should not use new materials if we can reuse garbage, we should not use apps if we can use the browser.
Precisely, which further makes the case for the browser-based approach 100%, as it is the same regardless of device and platform. You can be on Apple or Samsung, on your laptop or tablet or whatever; it does not matter, you know exactly what to do to access the livestream (or whatever else it may be). |
No. In an app you give the permissions to the app; in a browser you give them to the browser. Access to the camera is a 17-character-long syntax and works almost universally (it does not work on Edge, I think). Once I have the setup at home, we will know much more about all of these things with certainty. |
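A minimal sketch of that browser camera/mic access, using the standard MediaDevices API (`navigator.mediaDevices.getUserMedia`); the `#preview` element id is a hypothetical placeholder:

```javascript
// Constraints for a basic camera + microphone capture.
const captureConstraints = { video: true, audio: true };

// Ask the browser for a media stream and show it in a <video> element.
function startPreview() {
  return navigator.mediaDevices.getUserMedia(captureConstraints)
    .then((stream) => {
      // '#preview' is a hypothetical <video autoplay> element on the page.
      document.querySelector('#preview').srcObject = stream;
      return stream;
    });
}

// Only meaningful in a browser; guarded so the file also loads elsewhere.
if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
  startPreview();
}
```

The browser shows its own permission prompt, so no app-level permission model is needed.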
Possibly meaningful here >> https://recordrtc.org/ |
I can't see any reason why this would not work on just the mesh node. |
@m-anish asked me to comment ... I built the dweb.archive.org site as a decentralized version of the Internet Archive, and am about halfway through a year-long project, dweb-mirror, for an offline version of the Archive (suitable for mesh networks etc.). It can currently crawl IA resources, serve them fully offline, and also act as a proxy (saving as you browse a local server). It's tested on MacOSX, RPi (NOOBs or IIAB), and Rachel3+/WorldPossible, though it still has bugs I'm working through. For video ... we tried WebRTC - it was a massive resource hog and caused browser crashes etc., though we think that was mostly because it was opening lots of connections to DHTs. Like most people in the P2P community we dropped it as far as possible. The one video system we found worked well was WebTorrent, especially because it can easily be set up to have fall-back HTTP URLs, which works well when you have a mix of popular videos (lots of peers to connect to) and unpopular ones (no other peers online), so the latter get seeded over HTTP from an origin server. I'm part way through a process where videos viewed in the dweb-mirror project will be shared by any peers using WebTorrent, in part to get around the well-known bandwidth issues on Raspberry Pis. |
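The web-seed fallback described here can be sketched roughly as follows, assuming the WebTorrent browser build is loaded as a global `WebTorrent`; the magnet link, server URL, and element selector are hypothetical:

```javascript
// HTTP web seeds (BEP 19): when no peers are online for an unpopular
// video, the origin server itself serves the data over plain HTTP.
const WEB_SEEDS = ['http://schoolserver.local/videos/lesson1.mp4'];

function playTorrent(magnetUri, selector) {
  const client = new WebTorrent();
  // urlList adds the HTTP fallback URLs alongside any peers found.
  client.add(magnetUri, { urlList: WEB_SEEDS }, (torrent) => {
    const file = torrent.files.find((f) => f.name.endsWith('.mp4'));
    file.appendTo(selector); // streams into a <video> element
  });
  return client;
}
```

Popular videos get most of their data from peers, while the same code degrades gracefully to a plain HTTP download when the swarm is empty.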
@mitra42 thx for commenting! Did you also get a chance to try HLS? We experimented with hlsjs-p2p-engine and got decent results for live streaming, and are thinking of building around that. HLS is not 'true' live (there is a lag of about 10-20 seconds), but that is perfectly acceptable for our scenario. Also, re: WebTorrent, is there a project that we can test on the mesh we have locally to do some live video streaming? How well/badly does it work on mobile phones? Would love to test. I came across this in my research, but didn't test it as it seemed to have some known limitations and limited browser support; perhaps I should rethink that decision. One reason we are considering hlsjs-p2p-engine is that it seems to be well supported across browsers through various players (video.js et al.). We are also thinking of integrating hlsjs-p2p-engine into the video.js player within Kiwix ZIM files, which are heavily used within IIAB. |
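For reference, the player side of this is plain hls.js (which hlsjs-p2p-engine wraps). A minimal playback sketch, with a hypothetical stream URL, looks like:

```javascript
// Hypothetical HLS endpoint on the school server.
const HLS_SRC = 'http://schoolserver.local/hls/stream.m3u8';

function playHls(videoEl) {
  // hls.js uses Media Source Extensions where available...
  if (typeof Hls !== 'undefined' && Hls.isSupported()) {
    const hls = new Hls();
    hls.loadSource(HLS_SRC);
    hls.attachMedia(videoEl);
    return hls;
  }
  // ...while Safari/iOS play HLS natively from the src attribute.
  videoEl.src = HLS_SRC;
  return null;
}
```

The 10-20 second lag mentioned above comes from HLS segmenting: the player buffers a few segments before starting playback.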
I haven't tried WebTorrent for livestreaming; it's not our use case. Our use case (at the Internet Archive, and in dweb-mirror) is static video, and in particular making sure the experience is always as good as a direct connection to the HTTP server. |
With just video and a relatively small number of viewers, WebRTC (which is what the p2p layer of hlsjs-p2p-engine is built on) may be what you want for live streaming. I'm just sceptical of it for viewing static videos, given all the problems and how many people have abandoned it. Have you also tried it fully disconnected? WebRTC supposedly has some single-point-of-failure issues with single rendezvous servers, but maybe hlsjs knows a way around that. |
Hmm, I haven't yet tried hlsjs-p2p-engine fully offline. It needs a signaling server and a few JS files. We ran the signaling server locally, but the JS files were still being served online. I think we should make it a priority to test it fully offline (@pdey), but I am optimistic that it will work. For VOD, we do not envision a lot of P2P, at least initially, as the number of concurrent users watching a particular video may be small; still, it might help in cases where a teacher asks pupils to load a video on classroom laptops/tablets and there is suddenly a lot of load on the server. Thanks for your comments, this is helpful info. To set some basic context, below are two blogposts about Zanskar, where we hope to test and deploy these technologies (more posts to come in the near future). https://medium.com/eka-foundation/en-meshed-in-la-la-land-part-1-34e0ce29ea1b |
Seems like IPv4 multicast could be brought into the picture in the case where a whole class needs access to the same video at the same time
If what you are doing is static video, I'd strongly recommend using WebTorrent instead, because I'm guessing that the case of a class watching recommended videos is probably more common than all watching the same video at the same place at the same time. |
Actually, the primary case we want to "optimize" for is live streaming, and server load during it. The concurrent class watching a video is something we'll have to see from usage data if it really is a pain point; it was just a guess at something that might happen in the future. But the primary focus remains to make live streaming more decentralized and efficient. Apologies if my earlier comment was confusing. |
Understood - I guess I've not seen livestreaming (i.e. a video camera capturing images, shown elsewhere) being a use case in disconnected networks, so good to hear that it's really an issue for your use cases. |
There is absolutely no issue serving js offline via static files. If we like, we can also do it all on node. |
Updates on hlsjs-p2p-engine: What works
What does not work
Based on these findings, I have been thinking about a slightly different idea. Following is an outline (I will soon share the details as I work them out)
|
Prasenjit, I also had some conversations and thought processes around this. Let's talk soon! |
bundle.zip

Install and set up NGINX

Download dependencies:
  sudo apt-get install build-essential libpcre3 libpcre3-dev libssl-dev

Extract:
  tar -zxvf nginx-1.7.5.tar.gz

Install nginx:
  cd nginx-1.7.5

Download and copy the NGINX init scripts:
  sudo wget https://raw.githubusercontent.com/JasonGiedymin/nginx-init-ubuntu/master/nginx -O /etc/init.d/nginx
  sudo chmod +x /etc/init.d/nginx

Test the upstart script:
  sudo service nginx status   # to poll for current status

NGINX service configuration:
  sudo nano /usr/local/nginx/conf/nginx.conf
Remove all lines and paste the content of the attached file. Change the

Test the config file:
  sudo /usr/local/nginx/sbin/nginx -c /usr/local/nginx/conf/nginx.conf -t

Cross-domain config:
  sudo nano /usr/local/nginx/html/crossdomain.xml

Restart:
  sudo service nginx restart

Link the stream URL to OBS Studio:
  stream server: rtmp://localhost/hls |
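The attached nginx.conf is not reproduced here, but a minimal configuration of the kind this setup relies on (nginx built with the nginx-rtmp-module) might look like the following sketch; ports, paths, and the application name are assumptions:

```nginx
rtmp {
    server {
        listen 1935;                 # default RTMP port; OBS streams here
        application hls {
            live on;
            hls on;                  # emit HLS playlists and segments
            hls_path /tmp/hls;
            hls_fragment 5s;         # segment length drives the ~10-20 s lag
        }
    }
}

http {
    server {
        listen 8080;
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;
            # let browser players on other hosts fetch the playlist
            add_header Access-Control-Allow-Origin *;
        }
    }
}
```

Browsers would then play http://server:8080/hls/streamkey.m3u8 via hls.js or a native HLS player.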
Work on this lives:
|
FWIW. something for the future. https://openvidu.io/ |
This might be interesting too - https://github.com/arslancb/clipbucket. Looks like for Zanskar 2019 we are going with Cham |
@m-anish There aren't any docs on that Cham page,
|
Hi @mitra42 Apologies for such a late reply. For Zanskar, we went ahead with Cham, but a lot of changes were made to it while in Zanskar. @pdey and @so-ale probably have those in their personal repos/storage. Also, as it turns out, there seems to be some issue with the Intel NUC (we don't know whether it is hardware or software at this point), and the network hasn't been operational for the past 3 weeks or so. I shall shortly be getting the schoolserver, which is in transport from Zanskar and which will also have the latest Cham code commits. If @pdey or @so-ale can produce them, that'd be fine too, but I should be able to respond to you soon (hopefully a week to ten days). Cham was quite simple and straightforward, and actually there may be lots of room for adding complexity :) It can be used for live streaming, and the live streams are also archived in various quality settings once the session is done. I don't understand the 'authoring podcasts' question. The way it works is very simple: some user-end software like OBS is used to compose a stream. It can be a live stream, audio, or pre-recorded, as long as OBS can handle it. OBS streams it to an RTMP endpoint, where Cham, running on nginx, produces 3 different quality levels of 'live streams' while also recording them to disk. The live streams you get as a consumer depend on your connection bandwidth to the server. All this works well, but everything is centralized; for the future, we'd like to look into making this more decentralized, or at least move it beyond a single point of failure. |
Thanks @m-anish, that sounds replicable, which would be great. I'm hoping our first deploy is going to be in a limited-internet school in Indonesia, and it looks likely they will want to integrate local content. I look forward to a bit more detail on that setup; there are a bunch of new things there (OBS, RTMP, Cham) which I can look up, but piecing them all together would be much quicker with some more details. I'm then interested in figuring out how to hook up the output for selective automated upload to the Internet Archive when the box sees the internet connecting, and/or via sneakernet. |
@m-anish I think you will get the code base soon. If you need it any earlier, I have a copy on one of my hard drives, but I am not sure if it has the latest snapshot. |
Hi @mitra42 So, I updated the cham repo to the latest codebase. But there is a catch. The installation of cham is really in two parts:
I had written a Cham playbook for IIAB that does everything needed for a working setup, but there have been many changes to IIAB since, so I no longer know if my playbook will still work there. My question to you would be: do you want this as part of IIAB or as a standalone setup? If it's the former, I'll work on updating the playbook, which I guess I need to do anyway. If it's the latter, I will pass you specific instructions to get everything working; I'll need to know what hardware you're running on and your OS version in that case. Looking forward to your reply. |
Thanks - I could see using it in either context (standalone or IIAB) so probably doing the IIAB one is best - especially since I think that is what you are using? With a working set on IIAB I can always look at how to adapt that to a different setup if required. |
hmm... okay. Let me try an IIAB install on a VM and see how it goes. Give me a couple of days to iron out the proper instructions for you and make any necessary changes in the PR. |
@mitra42 I updated the PR that adds cham to IIAB. You can try a fresh iiab install. I tried on a VM with Ubuntu 19.10 and it seemed to work. |
Will do, @holta wants me to do a test with the current version anyway, so can do both at same time. |
@m-anish if you can in coming days, please help @mitra42 with his Cham installation question @ iiab/iiab#2209 ? |
As a subset of #18 , one of the requests from Zanskar is to focus on the live video streaming aspect of the mesh network.
The current implementation involves running VLC on a Windows laptop, using its web camera and microphone to create a live video stream, and then asking users in different places to open that stream in their own VLC instances. This works well, but suffers from:
What if some kind of caching could be implemented on the mesh nodes?
I came across this project, whose readme also lists other potentially similar projects.
We can break this down into two parts: