Today I received the coolest, most thoughtful gift. Twice. My day started when a friend traveled from far, far away and dropped by to bring me a package on behalf of @npub1d2ac...2px3, whom they had just visited. It was a beautiful *handmade* puzzle box with three interesting locks built into it, and a *four-page* letter of appreciation—the amount of effort they put into making me feel special is extraordinary 🥰

---

Then, a couple hours later, a package arrived from my girlfriend. Inside were two lever locks: one from the 1830s, and one from the 1920s stamped with "Mastodon" 🥰 The 4-lever lock is now the oldest in my collection, and the Mastodon lock is the first 8-lever lock in my collection.

---

And finally, a couple hours after the package arrived from my girlfriend, I got to meet a new friend from Mastodon in person for the first time—which was its own present 🥰

---

So today I got to add five unique new locks, several good memories, and a new friend to my collections. It was a good frickin' day 💝

#Gratitude #ThreeGoodThings #GetLovedNerd
So I've been fucking around, testing Google's purity filters some more—seeing what it rejects, what it accepts, and what changes it makes without instruction.

In every case, if I give it a photo with much skin showing, it rejects it. If I give it a photo with underwear or lingerie showing, it tends to cover me up more (e.g. it zips up jeans, buttons up shirts, makes fishnets opaque, etc). It almost always makes me look less like a tomboy, often enlarging breasts (even while hiding them away).

This time it accepted four photos (out of well over a dozen submitted), from three different photoshoots—two with face covered and two with face showing (which is the closest I've done to a face reveal now, I guess 😋). *more details in alt-text.

Conclusion: Google has baked a technology into its default Android photo gallery that (surprise) reinforces unrealistic ideals of beauty while simultaneously treating feminine bodies as inherently sexual and in need of censoring.

Follow-up: I'd like to see folx with other body types, gender presentations, and styles test the edges of what Google Photos' "Remix" and "AI Enhance" features will accept, and what unrequested changes they make to those images. Does it censor topless men? Does it lean into racist stereotypes? Does it make thinner femme folx more curvy? Does it make curvier folx thinner?

If this tech is going to be crammed into everything, where kids, friends, and corporations are going to be using it, we should understand the potential psychological effects and built-in biases we're likely to encounter with increasing frequency.

#AlicePics #AI #ForcedPurity #Testing
Heeeyyy Fedi!!! I have a question that I *know* a bunch of you are going to be able to answer.

For the past couple decades I've wanted an electric guitar, but my ex always shot it down. So, now that I've escaped that abusive relationship, and with the holidays nearing again, I've decided I'm finally going to get one. The last time I played guitar was in middle school (and some bass in high school), so I'm starting from scratch.

My constraints:
- 6-string electric guitar
- under $200
- used/hand-me-down is great
- pink would be awesome
- probably right-handed (because that's what I learned on)

Does anyone have a recommendation (or better yet, an old guitar near Seattle that could use the sweet, sweet love of a new owner)?

I looked at Amazon (🤢) and found something that seems like it fits my needs, but I hate supporting them, *and* I have no idea if it's garbage or not. https://a.co/d/1Daaxxw

#AskFedi #ElectricGuitar #Music #Recommendations #RepairReuseRecycle
Google's image-gen and editing apps *regularly* won't work on my photos if there's *any* hint of skin. Sometimes they'll even reject editing simple (fully clothed) selfies of mine because they determine the photos are too risqué.

I tried this photo numerous times, getting rejected over and over, until it finally accepted it. Same photo. Same edits. Different results.

This reminds me of the research I did years ago on the racial, gender, and economic biases in the three main image-gen models (at the time). I showed statistically that each model associated women with domestic-abuse victimhood, men with aggression & violence (specifically frat boys and Black men), racial minorities with poverty, and well-paid professions with white men.

Now we're seeing more sexist bias in *AI tools* that are being put into commonly used software. This makes me want to run tests on image-edit rejection in Google Photos with variously femme- and masc-presenting images in different states of undress.

#Puritanical #Corporate #AI #BuiltInBias #Research
I've been reflecting on what drives me, and I think I've got a reasonable handle on my main rules:

- Kindness is punk AF; do more of it.
- Run towards the fight & draw fire.
- Reduce harm & support others.
- Be someone you'd look up to.

Also, it's okay to break sometimes. Which is not so much a rule as it is permission to not always live up to my other rules.
Hey y'all, this is a little awkward and I wouldn't normally ask, but I have some bills I need to pay, and I'm broke. If any of y'all happen to be comfy enough to throw a buck my way, I could use a couple hundred of them before next month. Thank you 💝

https://buymeacoffee.com/alice.watson

#MutualAid #Tipping