did one of my famous “sleep until 3 PM” things today but forgot i had milk and eggs delivered this morning. milk and eggs sitting at my door since 10 AM. oops
the reason imo that the for you feed is so good is that it seemingly prioritizes recency. i don't care about stuff that happened hours ago. i don't care about replying to someone who is offline. i want to talk to people who are active rn and are talking about stuff i like.
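to make the recency point concrete, here's a toy sketch of what "prioritizes recency" could mean: an exponential time decay on each post's score. this is not the feed's actual ranking code, and the half-life number is made up.

```python
# toy illustration of recency-weighted ranking, NOT the feed's real
# algorithm: a post's score decays exponentially with its age, so
# posts from people active right now dominate the top of the feed.
import math
import time

HALF_LIFE_S = 30 * 60  # hypothetical 30-minute half-life

def score(post_ts: float, base_score: float = 1.0) -> float:
    age = time.time() - post_ts
    return base_score * math.pow(0.5, age / HALF_LIFE_S)

# a post from 5 minutes ago easily outranks one from 3 hours ago
print(score(time.time() - 5 * 60) > score(time.time() - 3 * 3600))  # True
```

with a decay like this, anything more than a few hours old scores near zero, which matches the "people who are active rn" behavior.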
🔴 LIVE learning to play fps again
i've discovered something cursed on this website
forgot battlefield 6 was out today, there goes my weekend
probably makes sense to let you use oai or anthropic instead of local models too (e.g. if you don't have a machine that can run the models). you should be able to use either of these super cheap (like, certainly under a few dollars a month unless you're using the more expensive models or get piled) RE: "support both openai and anthro..."
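for the curious: a minimal sketch of what supporting both a cloud api and a local model could look like, assuming the openai python sdk and an openai-compatible local server like ollama. the names and config here are illustrative, not from any actual project.

```python
# minimal sketch of a provider toggle: local model vs. cloud api.
# assumes the openai python sdk; ollama (and llama.cpp's server)
# expose an openai-compatible endpoint, so one client covers both.
# an anthropic client would be analogous. names are illustrative.
from openai import OpenAI

def make_client(provider: str) -> OpenAI:
    if provider == "local":
        # ollama's openai-compatible endpoint; the api key is unused
        # locally but the client requires a non-empty value.
        return OpenAI(base_url="http://localhost:11434/v1", api_key="unused")
    return OpenAI()  # reads OPENAI_API_KEY from the environment

client = make_client("local")
reply = client.chat.completions.create(
    model="llama3.2",  # whatever you've pulled locally, or e.g. "gpt-4o-mini" for cloud
    messages=[{"role": "user", "content": "hello"}],
)
print(reply.choices[0].message.content)
```

the nice part of leaning on the openai-compatible surface is that switching providers is just a base url and a model name, so the cheap-cloud-fallback case costs almost nothing to support.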
is there any discourse happening rn? want to test something.
to be clear, this isn't to suggest that we should stop letting people run models locally. in fact, i believe that we _need_ to level the playing field as much as possible, and i'm very excited about progress being made. but we certainly need to be paying attention to the possible consequences.
there actually is a lot of fun to be had with local llms, but it's also quite concerning that it's now possible to run these with ease on consumer hardware. if you thought the amount of "bot"-type replies all over the internet was bad rn, it's probably going to get a lot worse
most of my work gets done after 3 PM