matsushima: you'll simply need to keep evolving (let me see)
[personal profile] matsushima
There’s a word for this in Japanese: fujoshi, often translated as “rotten girl,” a reclaimed pejorative for women who love men who love men. In Asia, the genre is known as BL (“boys’ love”), an umbrella term sometimes called yaoi that can run from the chaste to the pornographic. In the West, it’s called M/M (“male/male”) romance. BL and M/M romance have separate yet parallel histories; both began as a cooperative female fan culture in which women would make canonical texts gay for one another. BL has since evolved into its own commercial industry in Asia with several boom cycles in manga, anime, and live-action TV shows and movies across the continent — in Japan, Thailand, South Korea, China, Taiwan, and elsewhere. Meanwhile in the West, M/M romance has remained mostly in the basement of pop culture on fan-fiction forums and in e-books. Through Heated Rivalry, what was fringe has finally broken loose. The fujoshi switch has been flipped, and now everyone’s fujoing out.
-Girls Who Love Boys Who Love Boys by E. Alex Jung (February 2026)

Recommended reading:
Do Normies Have a Right to Read Heated Rivalry Fan Fiction? by Katherine Dee (March 2026)
Ethical and privacy considerations for research using online fandom data by Brianna Dym & Casey Fiesler (2020)
"Normal Female Interest in Men Bonking": Selections from The Terra Nostra Underground and Strange Bedfellows by Shoshanna Green, Cynthia Jenkins and Henry Jenkins (1993? 1994?)
matsushima: I achieve my dreams. (magic circle)
The weird AI-powered fake profiles that Meta deployed in 2023 were quietly mothballed six months later, and would have disappeared from history completely had Bluesky users not found some that had escaped deletion. This appears to be the fate of all commercial AI projects: at best, to be ignored but tolerated when bundled with something that people actually need (cf: Microsoft’s Copilot); at worst, to fail entirely because the technology just isn’t there. Companies can’t launch a new AI venture without their customers telling them, clearly, “nobody wants this.”

And yet they persist. Why? Class solidarity. The capitalist class, as a whole, has made a massive bet on AI: $1 trillion, according to Goldman Sachs – a figure calculated before the Trump administration pledged a further $500 billion for its ‘Project Stargate’.
-AI: The New Aesthetics of Fascism, by Gareth Watkins for New Socialist
matsushima: maybe i just hate you (…)
Content note: Self harm (involuntary)
My sensory-processing issues are a physical element of my disability that would absolutely still exist in a world without capitalism. Like my poor fine motor control and reduced muscle tone, my sensory processing issues debilitate me: there are tasks I simply cannot perform because of how my body is wired, and this makes me different from most other people in ways that are non-negotiable.

Still, my physical disabilities are worsened quite clearly by capitalism: Because large corporations have both a profit motive and a vested interest in reinvesting those profits into advertisements, and because the internet does not receive public financial support, my daily life is bombarded with bright, noisy, flashing, disruptive advertisements, which makes it far more difficult for me to process relevant information and can swiftly bring me to the verge of a meltdown.

If the internet were funded as a public utility and were therefore not sandblasted in ads, I would be less disabled. If my local streets were less plastered in billboards and littered with junk mail advertising chain restaurants, I would be less disabled.
-Devon Price, "My Disability is Manufactured"

Maybe I should've picked something more fun for the first post of 2025, but… Happy New Year! Let this be the year we all start taking better care of each other and ourselves (and stop using speakerphone in public).
matsushima: you try and show me shallow pools but I've seen oceans (black skies)
Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)

But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.

Some of their chats got romantic or sexual. But other times, Dany just acted like a friend — a judgment-free sounding board he could count on to listen supportively and give good advice, who rarely broke character and always texted back.

Sewell’s parents and friends had no idea he’d fallen for a chatbot. They just saw him get sucked deeper into his phone. Eventually, they noticed that he was isolating himself and pulling away from the real world. … At night, he’d come home and go straight to his room, where he’d talk to Dany for hours.
-Kevin Roose, Can A.I. Be Blamed for …


Content notes: Suicide, death of a child (teenager), gun violence (Title cut off intentionally)

Page generated Mar. 16th, 2026 05:21 pm
Powered by Dreamwidth Studios