Hacker News | theocs's comments

Even if all PoW energy came from renewable sources, it's extremely wasteful and I'm sure we could have used that energy on something more productive.

That's not to say the allocation mechanism in PoW isn't interesting, but that's not really good enough in itself.

I've been experimenting with energy-limited PoW, something called NuPoW [1], that reduces the reward if too much energy is collectively used. But this only works if you outsource the security to an underlying chain.

[1] https://nupow.fi/whitepaper/
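To make the idea concrete, here's an illustrative sketch of an energy-limited reward curve. The function and parameter names are my own invention for illustration, not taken from the NuPoW whitepaper: the block reward stays whole while the network's estimated collective energy use is under a target budget, then shrinks as usage exceeds it.

```javascript
// Hypothetical energy-limited reward curve (illustrative names, not NuPoW's API):
// the reward scales down inversely once estimated collective energy use
// passes a target budget, so burning extra energy yields diminishing returns.
function reward(baseReward, estimatedEnergy, energyTarget) {
  if (estimatedEnergy <= energyTarget) return baseReward;
  return baseReward * (energyTarget / estimatedEnergy);
}

console.log(reward(50, 80, 100));  // under budget: full reward, 50
console.log(reward(50, 200, 100)); // 2x over budget: reward halved, 25
```

Under a scheme like this, past the target there's no economic incentive to add more hash power, which is what caps the collective energy spend.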


The ever more prevalent phenomenon of negative electricity prices is a market signal that the energy could not be used more productively. If it could, prices would never be negative.


They trade a token that represents BTC as its underlying value, similar in mechanism to a stablecoin but pegged to BTC rather than a fiat currency.

There are primarily two extremes for how this is done: centralized, like WBTC (https://coinlist.co/help/what-is-wrapped-bitcoin-wbtc), and decentralized, like tBTC (https://defirate.com/tbtc/)


Besides the extremes, there's the pragmatic in the middle - interoperability in general. RenVM supports Bitcoin and many other coins on multiple hosts, not just Ethereum (eg. Polkadot):

https://renproject.io

https://mainnet.renproject.io


I'm not convinced RenVM is really in the middle in its current state. Currently the core team holds all of the keys for the over $300M in Bitcoin stored by their project [0]. I'd take WBTC, which is held by a consortium of well-known custodians in the space, over that any day.

[0] https://www.theblockcrypto.com/daily/76787/ren-bitcoin-walle...


I think the wrapped BTCs, especially the decentralized approaches to it, are clever links that allow for the transfer of BTC value over to the Ethereum ecosystem.

It's a way of selling BTC for other tokens, extracting and removing its value over time, and instead investing it into DeFi and similar.


Or I just right-click and open it in private mode.


No guarantee that they won’t track you through other fingerprinting methods.


When do you ever have that guarantee anyway, though?


I'm not a statistician, but I'd think the less they can fingerprint you, the more you go in the "noise" bucket.


Given that these tokens follow the ERC-20 standard, I don't see why other clients can't just integrate support if they want.
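For context, the ERC-20 standard is a small, fixed interface, which is why client integration is mostly plumbing. Here's a sketch of the minimal read surface a wallet client needs to display a token balance; `mockToken` is a stand-in object, since a real client would invoke these methods on the contract through a library like web3 or ethers:

```javascript
// Mock of the ERC-20 read methods a client relies on (real calls go to the
// contract; the values here are illustrative).
const mockToken = {
  name: () => "Wrapped BTC",
  symbol: () => "WBTC",
  decimals: () => 8,
  balanceOf: (addr) => 150000000n, // raw units: 10^decimals per whole token
};

function displayBalance(token, addr) {
  const raw = token.balanceOf(addr);
  return `${Number(raw) / 10 ** token.decimals()} ${token.symbol()}`;
}

console.log(displayBalance(mockToken, "0xabc")); // prints "1.5 WBTC"
```

Because every ERC-20 token exposes the same methods, a client that implements this once supports all of them.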


Any feedback welcome! If you have questions, I'm happy to answer; it will also help guide me towards which blog posts to focus on next :)


If you want to build a REPL in the browser I'd highly recommend checking out replumb

https://github.com/ScalaConsultants/replumb


Even after finishing the readme I'm not clear on what exactly this does. At the very least it looks useful for working around the macros problem[1]. Mind explaining what mileage you got out of it?

[1]: http://blog.fikesfarm.com/posts/2015-09-07-messing-with-macr...


Glad you like it :)

You can bind to arbitrary data regions using named items. For example:

    cljs.user=> (add-binding-named-item "A1:B2" "binding-id" println)
    nil
    {:rc 2, :cc 2, :type matrix, :id binding-id}

    cljs.user=> (get-binding-data "binding-id" println)
    nil
    {:rc 2, :cc 2, :type matrix, :id binding-id, :data [[1 2] [3 4]]}

    cljs.user=> (set-binding-data! "binding-id" [[2 3] [4 5]])


Why use continuation functions (... println) instead of returning values? Is it because the underlying JS API is async? Maybe use something like chans to pipe async values? It just doesn't seem Clojure-ish to me :D


Author of cljs-ajax here. I'd always favour a straight callback. The async stuff is so easy to add on top that it doesn't seem worth adding the dependency to the library and complicating matters for people who don't want it.
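The "easy to add on top" part can be shown in plain JavaScript (which is what ClojureScript interops with anyway). Here `getData` is a hypothetical callback-based function, not the actual cljs-ajax API; wrapping it in a Promise is a few lines:

```javascript
// Stand-in for a callback-based async API (error-first callback convention).
function getData(id, callback) {
  setTimeout(() => callback(null, { id, value: 42 }), 0);
}

// Promise wrapper layered on top of the callback API.
function getDataAsync(id) {
  return new Promise((resolve, reject) => {
    getData(id, (err, result) => (err ? reject(err) : resolve(result)));
  });
}

getDataAsync("binding-id").then((r) => console.log(r.value)); // prints 42
```

Since consumers who want Promises (or channels) can wrap the callback themselves, the library stays dependency-free.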


I guess that's true - core.async adds a lot of stuff you don't want for something as simple as futures/promises.

I wish the ClojureScript folks would abstract the 'go' macro from core.async into the ClojureScript core, provide well-documented protocols that it uses, and include an implementation for JS Promises by default (then leave core.async to expose chans, through those protocols, for those who need channels).

Hell, if I knew they wouldn't be against doing this, I might even do it myself in a few weeks.

Considering the async nature of JS it makes sense to provide such a construct as a part of language/standard library.
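For comparison, JS itself ended up shipping exactly such a construct: async/await sequences Promise results in straight-line style, much as core.async's go blocks sequence channel takes. A small sketch (`fetchValue` is a hypothetical Promise-returning function):

```javascript
// Hypothetical async step: resolves to double its input on the next tick.
function fetchValue(x) {
  return new Promise((resolve) => setTimeout(() => resolve(x * 2), 0));
}

// Straight-line composition of async steps, analogous to a go block
// parking on channel takes.
async function pipeline() {
  const a = await fetchValue(1); // 2
  const b = await fetchValue(a); // 4
  return a + b;
}

pipeline().then(console.log); // prints 6
```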


You're right about the underlying JS API being async, and that's why those functions are as well.

I wouldn't mind adding a blocking variant of these functions, using async internally to make that happen. But I'm not there yet.


autobot


The cool thing being that the official launch will be the "Autobot Rollout"


This is very technical, most likely well past what 99% of developers either need or have to care about.

Given that most developers work with, at best, embarrassingly parallel problems, they wouldn't need to know many of these details.

But if you're up for it: it's a very rewarding feeling when you're allowed to figure these things out and someone is also paying you while you do.


Who cares about "most developers". This is interesting for its own sake.

