
> As for the financial costs--you're paying someone else to build and manage your infrastructure, so yes, it will cost more than buying raw infra. Again, this is a tradeoff: is it more valuable to your product/users for you to answer pages or to build features? Depending on where you are in your product lifecycle, YMMV

First, 2-3 orders of magnitude more expensive is quite a steep price to pay for the feature set of Firebase.

Second, it’s only a good idea to use Firebase if you don’t plan on ever migrating away – because that will be hell.

And third, Firebase is a massive engineering effort – considering how much work it is to try to replicate just the features I need myself, I’m surprised that stuff actually works.

On the other hand, somehow I feel like Firebase is just an example of the hell we live in: one of the greatest development tools, and all of it secret – most of it not even patented, never to be open or free – used only to force developers into the closed ecosystem of Google’s Cloud.

It’d be a much nicer world if projects like this were open, so people could use them on their own systems (because I certainly won’t trust a company running its systems in a country ruled by Trump).



Another option to check out is Couchbase Mobile. Open source, self-hosted database with full offline functionality and a solid sync solution.

I think Firebase is pretty amazing. Have been a fan since I met some of the team (I think it was at the 2011 Launch conference). Like anything, though, there are tradeoffs.

Briefly, I'd say that there's overlap, with Couchbase Mobile tending to shine toward the more complex end (including making the 10% much easier), and also being extremely easy to use as a substitute for SQLite/Core.

(I work for Couchbase.)


> All data is stored and transmitted as JSON – the embedded database, the database server, REST APIs, stream APIs, and batch APIs.

That’s likely going to become an issue with my use case – even now, while using a custom binary format on the wire and decoding with Java NIO, we’re seeing ~80-90% CPU utilization on the latest Android phones for ~4-5 seconds during connection, while syncing the latest tens of thousands to hundreds of thousands of messages.

I doubt using Strings, and specifically JSON, will make that more performant.
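To illustrate the overhead: a fixed binary layout carries only the field bytes, while JSON repeats key names and spells numbers out as text in every single message. A rough sketch – the message layout and field names here are made up for illustration, not any actual protocol:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class EncodingOverhead {
    public static void main(String[] args) {
        // Hypothetical chat message: sender id, timestamp, short text.
        String text = "hi";
        byte[] textBytes = text.getBytes(StandardCharsets.UTF_8);

        // Fixed binary layout: int32 sender + int64 timestamp
        // + uint16 length prefix + UTF-8 payload bytes.
        ByteBuffer bin = ByteBuffer.allocate(4 + 8 + 2 + textBytes.length);
        bin.putInt(42)
           .putLong(1500000000000L)
           .putShort((short) textBytes.length)
           .put(textBytes);

        // The same message as JSON: keys and digits repeated as text.
        String json = "{\"sender\":42,\"ts\":1500000000000,\"text\":\"hi\"}";

        System.out.println(bin.capacity());                               // 16
        System.out.println(json.getBytes(StandardCharsets.UTF_8).length); // 44
    }
}
```

And on top of the size difference, parsing JSON means allocating strings and walking text character by character, while the binary fields can be read straight out of the buffer.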

(But I’ll definitely look into your code as inspiration for how to continue)


> It’d be a much nicer world if projects like this were open, so people could use them on their own systems

Check Feathers.js if you want realtime on your own system.


I have – sadly, it doesn’t really scale well enough.

I work on Quasseldroid, an Android client for the Quassel IRC bouncer, and the idea is that you can always access all messages instantly, while the client only buffers a few of them.

This works quite well, and is easy to do with Feathers.js and others, even Firebase...

...if there wasn’t the issue that the average user is in hundreds of channels and gets tens of thousands of messages per hour – in some cases even thousands of messages a second (if the user is in a channel with > 10k other users and everyone is chatting, such as during Eurovision in Quakenet’s #eurovision).

Although users self-host, so we don’t have infrastructure costs, they usually do so on a Raspberry Pi – and Feathers.js can’t easily handle thousands of changes per minute to its database while running on a Raspberry Pi, and accurately sync them to multiple clients.


Oh wow.

So how did you/are you solving this?


Well, while I only work on the Android client, the solution currently in use is a large codebase written specifically for this single purpose – in C++, compiled natively. That’s the main reason why good performance with the system is even possible.

A huge performance boost (reconnection times down from ~2 minutes to ~12 seconds on 64kbps 2G network, using a Nexus 5X) could be accomplished by using Java NIO with non-copying IO for parsing the custom binary protocol, but we’re looking into using flat protobufs for improving performance further.
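For context, the non-copying style just means handing out views into the receive buffer instead of copying payload bytes around. A minimal sketch, assuming a hypothetical int32 length-prefixed framing (not the actual Quassel wire format):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class FrameParser {
    // Returns the next complete frame as a zero-copy view into `buf`,
    // or null if only a partial frame has arrived so far.
    public static ByteBuffer nextFrame(ByteBuffer buf) {
        if (buf.remaining() < 4) return null;  // length header not yet here
        buf.mark();
        int len = buf.getInt();
        if (buf.remaining() < len) {           // payload incomplete: rewind
            buf.reset();
            return null;
        }
        ByteBuffer frame = buf.slice();        // shares memory with buf
        frame.limit(len);
        buf.position(buf.position() + len);    // consume the frame
        return frame;
    }

    public static void main(String[] args) {
        ByteBuffer wire = ByteBuffer.allocate(64);
        byte[] payload = "PRIVMSG".getBytes(StandardCharsets.UTF_8);
        wire.putInt(payload.length).put(payload).flip();

        ByteBuffer frame = nextFrame(wire);
        byte[] out = new byte[frame.remaining()];
        frame.get(out);
        System.out.println(new String(out, StandardCharsets.UTF_8)); // PRIVMSG
    }
}
```

The payload bytes are never copied until something actually needs them – exactly the kind of per-message overhead that matters when you’re chewing through tens of thousands of buffered messages on reconnect.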

But then the database layer becomes the bottleneck. SQLite is far too slow to be normally usable; most users use PostgreSQL as the backing database.

Basically, the trick is in "don’t use JSON and JS, do stuff with native code and binary protocols to reduce overhead".

But this makes maintenance basically impossible, and isn’t really ideal either.

DISCLAIMER: I do not speak for the Quassel project, all opinions represented here are solely my own.



