really digging the "local PDS mock" approach @spacecowboy17.bsky.social used for prototyping. claude is pretty good at generating those, and it's way easier to iterate on the UI and the model without worrying about the actual network
if i want to associate some data with some other record, what's the play?
1. use the target's (did, rkey) as my rkey, somehow escaped. this guarantees a 1:1 mapping
2. use a tid as the rkey, but have the content include a link to the primary record. the convention would be "most recent match wins"
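a minimal sketch of option 1, assuming `~` as an escape character (the escaping scheme and helper names here are hypothetical, not an atproto convention):

```typescript
// Hypothetical helpers: derive a deterministic rkey from a target (did, rkey)
// pair so each source record maps 1:1 to its target (option 1 above).
// atproto rkeys allow [A-Za-z0-9._:~-] (max 512 chars), so ':' and '.'
// survive as-is; anything else, including a literal '~', gets '~XX'-escaped.
function escapeForRkey(s: string): string {
  return s.replace(/[^A-Za-z0-9._:-]/g, (c) =>
    "~" + c.charCodeAt(0).toString(16).padStart(2, "0")
  );
}

function assocRkey(targetDid: string, targetRkey: string): string {
  // '~~' separates the two parts; an unescaped '~' never appears otherwise
  return `${escapeForRkey(targetDid)}~~${escapeForRkey(targetRkey)}`;
}
```

note this only sketches the encoding; a real version would also enforce the 512-char rkey limit.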
is there such a thing as atproto record etiquette? e.g. when there are two types of entities and one depends on the other, is it better to do cascading deletes, or to just leave orphaned records (and ignore them at the app view level)?
honestly i think running a graphql query over all the data from multiple social apps could be a good explanatory demo for atproto. @slices.network already allows this to some extent, but over specific slices of the network. i think at least for a demo, global would be much more compelling
this sounds corny but atproto is basically the internet of json
what's the best practice for when to reach for a strongRef, as opposed to a plain ref? is it basically "if you really want to make sure it's linking to the same version"? and is it up to the app to decide what "deserves" this guarantee? e.g. a "like" wants to be concrete, but sometimes you don't mind edits
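the shapes involved, as a sketch: `com.atproto.repo.strongRef` really does carry both a uri and a cid, so the link is pinned to one version; the validity check below is a hypothetical example of what an app view might do, not a real API:

```typescript
// A com.atproto.repo.strongRef pins both the record's at-uri and its CID
// (content hash), so editing the target changes the CID and the ref no
// longer matches. A plain at-uri ref follows whatever currently lives there.
type StrongRef = { uri: string; cid: string }; // version-pinned
type WeakRef = { uri: string };                // follows edits

// Hypothetical check when hydrating a strongRef against the current record:
function refStillValid(ref: StrongRef, currentCid: string): boolean {
  return ref.cid === currentCid; // any edit to the target changes its CID
}
```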
i keep using playground.typelex.org just to look up lexicons. is there any site that does this with good autocomplete? ideally i want a big search box like google: i type "app.bsky post", it suggests "app.bsky.feed.post" (partial match), and pressing Enter shows the lexicon definition plus an example built from its shape
that's actually a really fun benchmark, watch the next 20 seconds (timestamped): [React Conf 2025 Day 1](https://www.youtube.com/live/zyVRg2QR6LA?si=rq9xUZuZTdgDAAnS&t=20455)
would be cool if @npub10vgn...h0np had "shouts" too
honestly i think there was something about the google reader shutdown that, in retrospect, was a turning point both for google itself and for the broader internet