As an industry we really need a better way to tell what's going where than:
- someone finally reading the T&Cs
- legal drafting the T&Cs as broadly as possible
- the actual systems running at the time matching what’s in the T&Cs when legal last checked in
Maybe this is a point to make to the Persona CEO. If he wants to avoid public issues like this, some engineering effort and investment in this direction might be in his best interest.
But I’ve seen a lot of similar claims - just open LinkedIn for a second - and I always come back to the same questions:
- What value has been delivered?
- How much did you spend?*
- How long did it take _all told_?
I know you made a context management db. But if your argument is that AI is the future, then AI tooling built to build more AI tooling seems a somewhat self-referential proof.
What value has been delivered/products built outside of tooling to build products?
I’m aware you probably can’t be 100% open here - IP and all - but I feel it would go a long way to reinforcing your arguments the more concrete you can be.
* Points for being up front about the 1000-per-engineer minimum. But there's still the human cost and the actual token cost here.
"I’m trying to break off of big tech as much as I can"
I wish I could check this more ...
I've had similar needs/desires/gripes with my calendars and the terrible state of calendaring apps for a while.
So thank you for scratching your own itch and sharing it with us.
I'm curious about your statement that "[CalDAV] is an area begging for disruption".
Can you enlighten us as to what your wishlist for a better protocol/system/ecosystem might be?
(A rant about your pain points would work too.)
I've seen polylith over the years and it's always piqued my interest.
I'm curious as to what has been built (by yourselves or others) in the 4 (?) years since its release. Have the experiences held up to the initial goals and designs?
Congrats on the launch and hitting HN's front page.
Do you mind if I ask how your example site (multiplayer.dev) scaled?
I'm super curious about realtime multiplayer solutions (and I don't think I'm alone). But I find a great lack of info on what running this kind of app would cost. I come from the old-school hold-no-state, request->response->gtfo mentality, and I always have the _feeling_ that it'd be expensive to scale.
Not just holding the websocket open, but how much effort do you expend parsing the WAL? How chatty is that kind of persistence mechanism? What other 'gotchas' are there?
I'd love nothing more than to dispel that vague feeling with data.
I know it wouldn't come close to a full performance analysis, but throwing a few datapoints on a chart would help get a ballpark idea and tune my hype->action conversion.
I could nail down a pile of questions, but I'm sure you know better than I how to measure your own systems. But roughly I'm wondering:
- how many users did you have?
- how much traffic did you get?
- how much would/did it cost to host on supabase?
- how much in resources did the database/realtime components consume?
Congrats again on the launch and have a nice weekend.
But I always come to the same question with services that provide auth and user management: you pay a lot of money for someone _else_ to own critical information about your customers. What happens if you want to move away and use a different service, your own, or your customers' own?
Your customer data (at least login) lives in WorkOS' database.
How do you get it out? How much does that cost? Are there contractual guarantees around that?
The same goes for your customers' integration points. If a customer has to do any setup to integrate WorkOS for your app, then moving away would involve them making changes. Not necessarily an easy thing to manage.
Not to be negative: I'd be happy to hear that WorkOS have great processes and guarantees around this.
WorkOS doesn't really own the user management database. It's more like an agnostic API to connect with multiple IdPs through protocols like SAML and OIDC. Identity providers such as Okta, OneLogin, and Azure AD are the ones responsible for storing that data.
It would probably take months to implement SSO with all of the flexibility and ease of use they offer, mainly just because of the built-in integrations with so many providers. The price is pretty steep though, so this would really only be used by the big bucks Enterprise Software™ guys.
It's not. Implementing the OIDC flow from scratch takes half a day to get working and maybe a week to polish. Using available libraries you can do it way faster of course.
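For what it's worth, the core of that flow really is small. A minimal sketch in Python below, with hypothetical issuer/client values; a real implementation would also fetch the provider's discovery document and verify the ID token's signature against its JWKS keys:

```python
# Sketch of the two server-side steps of the OIDC authorization-code flow.
# ISSUER, CLIENT_ID, and REDIRECT_URI are placeholder assumptions.
import secrets
import urllib.parse

ISSUER = "https://idp.example.com"
CLIENT_ID = "my-client-id"
REDIRECT_URI = "https://app.example.com/callback"

def build_authorize_url(state: str) -> str:
    """Step 1: redirect the user's browser to the provider's authorize endpoint."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "openid email profile",
        "state": state,  # CSRF protection: must match on the callback
    }
    return f"{ISSUER}/authorize?{urllib.parse.urlencode(params)}"

def token_request_body(code: str, client_secret: str) -> bytes:
    """Step 2: exchange the callback's one-time code for tokens (server-to-server POST)."""
    return urllib.parse.urlencode({
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": client_secret,
    }).encode()

state = secrets.token_urlsafe(16)
url = build_authorize_url(state)
```

That's essentially it for the happy path; the polish time goes into error handling, token refresh, and signature verification.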
I only very briefly looked into rqlite. It's very interesting, but if I understand it correctly it's also not geared toward a write-heavy workload (all writes are routed to the same node).
I.e. it's leaning more toward the moderate, but reliable writes, and heavy read use cases?
Please let me know ~if I'm missing anything~ what use cases I'm missing.
That's correct. rqlite replicates SQLite for fault-tolerance and high-availability, not for performance. In fact performance takes a hit, for the reasons you state. But it's no worse (nor better) than something like, say, etcd or Consul.