Hacker News | creata's comments

What's the rationale for using Rust to write a UI? Using a scripting language (or at least a garbage-collected language) is much less restrictive, and it's not like the "what goes where" UI code is especially performance-sensitive.

This is a perfectly reasonable question, and I think there are two aspects to it.

First, one of the research questions tested by Xilem is whether it is practical to write UI in Rust. It's plausible that scripting languages do end up being better, but we don't really know that until we've explored the question more deeply. And there are other very interesting explorations of this question, including Dioxus and Leptos.

Second, even if scripting languages do turn out to be better at expressing UI design and interaction (something I find plausible, though not yet settled), it's very compelling to have a high-performance UI engine under a scriptable layer. I've done some experiments with Python bindings, and I think in an alternate universe you'd have apps like ComfyUI as high-performance desktop apps rather than web pages. Also, the layering of Xilem as the reactive layer, backed by Masonry as the widget layer, is explicitly designed to be amenable to scripting, though to my knowledge there hasn't been a lot of actual work on this.


The way Vello/Masonry/Xilem are split into separate projects is part of what got me interested (and in turn led me to post it to HN), as well as the reactive architecture of Xilem.

I do believe a garbage-collected interpreted language would work best for UIs. Something like Vala (for GTK) but with a runtime/VM.

Python + Qt has proven to be a very strong combination. My issue with such solutions is that packaging a Python application for the end user can bloat the binary size.

I also think GTK should get some credit in that space, because thanks to GObject introspection it's easy to interface with GTK from any language.


Same reason every other language has UI frameworks. It is more comfortable and nice to write the whole desktop program in the same language.

[flagged]


I have been watching people write UI frameworks in Rust for over a decade, you meanie.

The results tend to involve more dynamic allocation than you'd see in a garbage-collected language, or tons of reference counting (e.g., in Leptos) that acts as a less efficient GC. I've read many of raphlinus's posts, and while they're always interesting, the total experience in the Xilem examples just seems like much more effort than using FFI (even C FFI) to glue to something more workable.

Your comparison to assembly is very bizarre - languages of the sort I mentioned are usually at least as safe as Rust, and the "scripting language for top-level logic + compiled language for the bits that need to be fast" combination is ancient. In fact, your vague allusions to "a stable base, without infinite danger everywhere" show much less understanding of what's at stake, in my view.

I'm sorry my question wasn't enlightened enough for you.

And this is a news aggregator. Not the official discussion forums or anything. People can ask small insignificant questions here, or so I thought.

I'm so tired. You write one measly paragraph that could simply be ignored and someone calls you a "perpetuate drain". Even the chatbots have more humanity than you've got.


> I'm so tired. You write one measly paragraph that could simply be ignored and someone calls you a "perpetuate drain". Even the chatbots have more humanity than you've got.

Don't tire everyone else out by asking open ended draining questions. Show some engagement, before doing what looks like a discarding.

You've shown you have some interest or connection to the situation, with your reply. None of that was present before, in your 'just-asking-questions' "measly paragraph". It looked like just another anti-rust anti-systemd anti-pipewire/pulseaudio anti-wayland drain, only sapping energies without showing the faintest attempt at engaging. Offer something, try to have some positive sum.

We are all so tired. Why be a vacuum, why drain us, like you did? Critical review is fine! But show some engagement, offer something yourself, when doing so.


> Don't tire everyone else out by asking open ended draining questions

What are you on about? If you're so tired, don't put it on yourself to answer these "open ended draining questions". The internet continues to work the same way as before: anyone writes whatever they feel like, others engage, and if you don't want to, don't. But it's not up to you which questions are acceptable. What kind of world view is that?


It's sometimes hard to package a scripting-language project for the end user. Rust compiles to a binary. That's one benefit for me.

Tech wise? If you write your UI in Rust, you're using one of the safest and most performant languages to implement it in.

And you don't need to ship the entire web stack just to get GUI.


The good thing about iced is that you get a compact executable that runs on any OS, looks exactly the same everywhere, performs much better than a web-based UI, needs no permission management to access local files, and lets you customize the look as you need, while coming with tolerable defaults.

The price to pay is that building the UI is a bit complex, as it doesn't hold your hand; it's unforgiving, and not native.

I like iced. But Tauri is a good middle ground.


Iced is the clear number one for me, too. The only thing I'd love to see officially supported in iced in the future is mobile apps. But it looks like that ain't gonna happen anytime soon (with the most recent PRs getting rejected once again).

Have a question, what is the best way to detect if a `text_input` is focused?

Not exactly what you asked, but I recently answered a very similar question on StackOverflow: https://stackoverflow.com/questions/79345013/how-to-focus-te...

Unforgiving?

Hm, I believe my wording was a bit unclear. The trait system can get really complex as your widget layout gets more complex and you want to write reusable components, but there's no clear way to tell from the errors what type it's expecting. You really need to understand the traits to implement any reusable components, which is why I felt it's kind of unforgiving if you're not fully knowledgeable of the primitives.

But I'm still learning it, so, probably missing some details.


Afaik most UIs are built with C/C++.

If you use a different language you have to deal with some kind of FFI and that's always painful.

The part that worries me here is the diff. Does it happen in the host or in the guest? What code gets run when you run `yoloai diff`?

It actually runs git (with hooks disabled) to generate the diff. It happens on the host when using copy mode, and inside the sandbox when using overlay mode.

The above example doesn't specify workdir mounting mode, so it would be copy, not overlay.
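For the curious, "running git with hooks disabled" can be done by pointing git at an empty hooks path. A hypothetical sketch of what such a diff step might look like (this is an illustration, not yoloai's actual code; the function name `diff_workdir` is made up):

```python
import subprocess

def diff_workdir(path: str) -> str:
    # `-c core.hooksPath=/dev/null` makes git look for hooks in a
    # location with none, so repository-local hooks cannot run.
    result = subprocess.run(
        ["git", "-c", "core.hooksPath=/dev/null", "diff"],
        cwd=path, capture_output=True, text=True, check=True,
    )
    return result.stdout
```

In copy mode this would run on the host against the copied workdir; in overlay mode an equivalent command would run inside the sandbox.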


If it runs inside the sandbox and the guest is compromised, can't the guest just lie?

Absolutely. That's why overlay is not the default.

That's... uh, an interesting approach to security.

What is? Defaulting to the most secure method?

> The only practical defense is for these frontier models

Another practical defence for many of these devices would be to just disconnect them... I feel like an old man yelling at a cloud, but too much is connected to the Internet these days.


It can be easier to hack the device and patch it than to determine which device it is. This is nearly always true for the non-technical, but it is true for most technical people as well. Many of the devices in people's homes that aren't being actively patched are not that old!

Why doesn't this ATM tell me my balance anymore? Oh, we implemented creata's advice.

Why didn't this smartboard tell me my plane was delayed? Oh, we implemented creata's advice.

Ad nauseam.


Two very minor suggestions for the demo:

1. I don't know what the "Docxtemplater" button does, but it eats my document without warning and that's annoying.

2. It would be nice if the page came with some example .docx files we could see it work on.


> If you want to rotate things there are usually better ways.

Can you elaborate? If you want a representation of 2D rotations for pen-and-paper or computer calculations, unit complex numbers are to my knowledge the most common and convenient one.


For pen and paper you can hold tracing paper at an angle. Use a protractor to measure the angle. That's easier than any calculation. Or get a transparent coordinate grid, literally rotate the coordinate system and read off your new coordinates.

For computers, you could use a complex number since it's effectively a cache of sin(a) and cos(a), but you often want general affine transformations and not just rotations, so you use a matrix instead.


> For computers, you could use a complex number since it's effectively a cache of sin(a) and cos(a), but you often want general affine transformations and not just rotations, so you use a matrix instead.

That makes sense in some contexts but in, say, 2D physics simulations, you don't want general homogeneous matrices or affine transformations to represent the position/orientation of a rigid body, because you want to be able to easily update it over time without breaking the orthogonality constraint.

I guess you could say that your tuple (c, s) is a matrix [ c -s ; s c ] instead of a complex number c + si, or that it's some abstract element of SO(2), or indeed that it's "a cache of sin(a) and cos(a)", but it's simplest to just say it's a unit complex number.
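The equivalence between the unit complex number and the 2x2 rotation matrix is easy to check directly. A minimal Python sketch (complex numbers are built in):

```python
import cmath
import math

# A unit complex number c + s*i encodes the rotation matrix
# [ c  -s ]
# [ s   c ],  so rotating a point is a single complex multiply.
theta = math.radians(90)
rot = cmath.exp(1j * theta)      # cos(theta) + i*sin(theta), unit modulus
p = complex(1.0, 0.0)            # the point (1, 0)

q = rot * p                      # rotate (1, 0) by 90 degrees -> (0, 1)
assert abs(q - 1j) < 1e-12

# The same rotation written out as the 2x2 matrix acting on (x, y):
c, s = rot.real, rot.imag
x, y = p.real, p.imag
assert abs(complex(c * x - s * y, s * x + c * y) - q) < 1e-12

# In a long-running simulation, drift off the unit circle is fixed by
# renormalizing -- the complex analogue of re-orthogonalizing a matrix:
rot /= abs(rot)
```

The one-line renormalization at the end is the "easily maintain the orthogonality constraint" point from above.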


Why use a unit complex number (two numbers) instead of an angle (one number)? Maybe it optimizes out the sines and cosines better — I don't know — but a cache is not a new type of number.


There's a significant advantage in using a tuple over a scalar to represent angles.

For many operations you can get rid of calls to trigonometric functions, or reduce the number of calls necessary. These calls may not be supported by standard libraries on minimalistic hardware. Even where they are, avoiding calls to transcendental functions can be useful.
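As a small illustration of the trig-call savings: once two rotations are stored as (cos, sin) pairs, composing them is pure arithmetic via the angle-sum identities, with no further sin/cos calls.

```python
import math

def rot(theta):
    # Only this constructor ever calls trigonometric functions.
    return (math.cos(theta), math.sin(theta))

def compose(a, b):
    # Angle-sum identities: cos(a+b) = ca*cb - sa*sb,
    #                       sin(a+b) = sa*cb + ca*sb.
    ca, sa = a
    cb, sb = b
    return (ca * cb - sa * sb, sa * cb + ca * sb)

left = compose(rot(0.3), rot(0.5))
right = rot(0.8)
assert abs(left[0] - right[0]) < 1e-12
assert abs(left[1] - right[1]) < 1e-12
```

With the angle-as-scalar representation, the same composition is a cheap addition, but every subsequent use of the angle (rotating a point, building a transform) needs fresh trig calls; the tuple pays that cost once up front.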


Because rotation with complex numbers isn't just rotation; it's rotation + scaling.

The advantage of complex numbers is that rotating + scaling something (or, more generally, moving somewhere in the complex plane) is a one-step multiplication operation.
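A one-line demonstration of the combined rotate + scale: multiplying by r·e^(iθ) scales by r and rotates by θ in a single operation.

```python
import cmath
import math

z = complex(2.0, 0.0)                    # the point (2, 0)
op = 0.5 * cmath.exp(1j * math.pi / 2)   # scale by 0.5, rotate by 90 degrees
w = op * z                               # one multiply does both

assert abs(w - 1j) < 1e-12               # (2, 0) -> (0, 1)
```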


Do you find yourself scaling things a lot in 2D physics, but not translating them? I'd think translation and rotation are more common than scaling.


If you need to support zoom, scaling shows up very frequently.

I can give an example from real life. A piece of code one of my colleagues was working on required finding a point on an angular bisector. The code became a tangle of trigonometry calls, both the forward and inverse functions. The code base was Python, so there was native support for complex numbers.

So you need the angular bisector of two points p and q? Just take their geometric mean and you're done. At the Python level you only have a call to sqrt. That simplifies things.
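A sketch of the geometric-mean trick (not the colleague's actual code): for two unit complex directions, the principal square root of their product bisects the angle between them. Note the principal branch means the result can come out negated when the directions are more than 180 degrees apart.

```python
import cmath
import math

# Two directions as unit complex numbers, at 10 and 70 degrees.
p = cmath.exp(1j * math.radians(10))
q = cmath.exp(1j * math.radians(70))

# Geometric mean: sqrt(p * q) = e^{i * (10 + 70)/2 degrees}.
bisector = cmath.sqrt(p * q)

assert abs(bisector - cmath.exp(1j * math.radians(40))) < 1e-12
```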


Iirc Gauss suggested "lateral numbers". Not the worst idea, but it's too late now.


Clifford algebras are harder to philosophically motivate than complex numbers, so you've reduced a hard problem to a harder problem.


They're not objectively harder to motivate, just preferentially harder for people who aren't interested in them. But they're extremely interesting. They offer a surface for modelling all kinds of geometrical relationships very succinctly, semantically anyway.

This is also super interesting and I don't know why anyone would be uninterested in it philosophically: https://en.wikipedia.org/wiki/Classification_of_Clifford_alg...


There is such a thing as using overly simple abstractions, which can be especially tempting when there are special cases at "low n". This is common in the 1D, 2D, and 3D cases and then falls apart as soon as something like 4D special relativity comes along.

This phenomenon is not precisely named, but "low-dimensional accidents", "exceptional isomorphisms", or "dimensional exceptionalism" are close.

Something that drives me up the wall -- as someone who has studied both computer science and physics -- is that the latter has endless violations of strong typing. For example, rotations or vibrations are invariably "swept under the rug" of complex numbers, losing clarity and generality in the process.


I hate when people casually move "between" Q and Z as if a rational number with unit denominator suddenly becomes an integer, and it's all because of this terrible "a/b" notation. It's more like (a, b). You can't ever discard that second component, it's always there. ;)
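The tongue-in-cheek point has a concrete analogue in Python's `fractions` module: a rational with unit denominator compares equal to the integer, but the pair never stops being a pair.

```python
from fractions import Fraction

f = Fraction(4, 2)
assert f == 2                      # equal to the "corresponding" integer...
assert f.denominator == 1          # ...normalized to denominator 1...
assert not isinstance(f, int)      # ...but still a Fraction, not an int.
```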


Yes, you're right. You can't say your function operates in Z "but has solutions in Q". That's what people are doing when they take a real function and go "ooh look, secret complex solutions!"


Why would we expect most real numbers to be computable? It's an idealized continuum. It makes perfect sense that there are way too many points in it for us to be able to compute them all.


Maybe I'm getting hung up on words, but my beef is with the parent saying they find real numbers "completely natural".

It's a reasonable assumption that the universe is computable. Most reals aren't, which essentially puts them out of reach - not just in physical terms, but conceptually. If so, I struggle to see the concept as particularly "natural".

We could argue that computable numbers are natural, and that the rest of reals is just some sort of a fever dream.


> It's a reasonable assumption that the universe is computable

Literally every elementary particle enters the chat to disagree. Also every cloud of smoke and each wisp of dissipated heat.


It feels like less of an expectation and more of a: the "leap" from the rationals to the reals is a far larger one than the leap from the reals to the complex numbers. The complex numbers aren't even a different cardinality.

> for us to be able to compute them all

It's that if you pick a real at random, the odds are vanishingly small that you can compute that one particular number. That large of a barrier to human knowledge is the huge leap.


The idea is we can't actually prove a non-computable real number exists without purposefully having axioms that allow for deriving non-computable things. (We can't prove they don't exist either, without making some strong assumptions).


> The idea is we can't actually prove a non-computable real number exists without purposefully having axioms that allow for deriving non-computable things.

Sorry, what do you mean?

The real numbers are uncountable. (If you're talking about constructivism, I guess it's more complicated. There's some discussion at https://mathoverflow.net/questions/30643/are-real-numbers-co... . But that is very niche.)

The set of things we can compute is, for any reasonable definition of computability, countable.


I am talking about constructivism, but that's not entirely the same as saying the reals are not uncountable. One of the harder things to wrap one's head around in logic is that there is a difference between, so to speak, what a theory thinks is true vs. what is actually true in a model of that theory. It is entirely possible to have a countable model of a theory that thinks it is uncountable. (In fact, there is a theorem that countable models of first-order theories always exist, though it requires the Axiom of Choice.)


I think that what matters here (and what I think is the natural interpretation of "not every real number is computable") is what the theory thinks is true. That is, we're working with internal notions of everything.


I'd agree with that for practical purposes, but sometimes the external perspective can be enlightening philosophically.

In this case, to actually prove the statement internally that "not every real number is computable", you'd need some non-constructive principle (usually added to the logical system rather than the theory itself). But, the absence of that proof doesn't make its negation provable either ("every real number is computable"). While some schools of constructivism want the negation, others prefer to live in the ambiguity.


I hold that the discovery of computation was as significant as the set theory paradoxes and should have produced a similar shift in practice. No one does naive set theory anymore. The same should have happened with classical mathematics, but no one wanted to give up excluded middle, leading to the current situation. Computable reals are the ones that actually exist. Non-computable reals (or any other non-computable mathematical object) exist in the same way Russell's paradoxical set exists: as a string of formal symbols.

Formal reasoning is so powerful you can pretend these things actually exist, but they don’t!

I see you are already familiar with subcountability so you know the rest.


What do you really mean exists - maybe you mean has something to do with a calculation in physics, or like we can possibly map it into some physical experience?

Doesn't that formal string of symbols exist?

Seems like allowing formal strings of symbols that don't necessarily "exist" (or aren't useful for physics) can still lead you to something computable at the end of the day?

Like a meta version of what happens in programming - people often start with "infinite" objects eg `cycle [0,1] = [0,1,0,1...]` but then extract something finite out of it.


They don't exist as concepts. A rational number whose square is 2 is (convenient prose for) a formal symbol describing some object; it happens that it does not describe any object. I am claiming that many objects introduced during the explosion of mathematics that put calculus on a firmer foundation (resolving infinitesimals) do not exist.

List functions like that need to be handled carefully to ensure termination. Summations of infinite series are a better example: consider adding up a geometric series. You need to add "all" the terms to get the correct result.

Of course you don’t actually add all the terms, you use algebra to determine a value.


You can go farther and say that you can't even construct real numbers without strong enough axioms. Theories of first order arithmetic, like Peano arithmetic, can talk about computable reals but not reals in general.


> greatly lags Racket performance

This is a different implementation of Guile, though. Has Hoot (on, say, V8) been benchmarked?

