Hmm, wrong link, here's the right one: www.washingtonpost.com/business/202...
@kingjen
Expert in data privacy, algo manipulation (dark patterns), & artificial intelligence. Privacy & Data Policy Fellow, Stanford HAI. Outdoors fanatic and nuts enough to swim in the SF Bay. I bleed blue & gold (Berkeley I-School). Former trust & safety at Yahoo!
I get zero 'I told you so' pleasure out of this whole mess (at least for the users!). It's very interesting (but not good) that you're locked out--I've been wondering whether they'll continue to staff enough to honor the surge in deletion requests I assume they're getting.
If the genetic data stays in CA and another business attempts to work with the data I assume those rights to access and delete are still in force. No idea if a bankruptcy court has any discretion to consider the impact of a sale on Californians' legal rights over that data. If you know, chime in!
Talked with the @washingtonpost.com this week about the 23andMe bankruptcy. www.washingtonpost.com/business/202... One in-the-weeds question for me is the applicability of California's Genetic Information Privacy Act if the assets are sold off.
CalGIPA was written with a functional business in mind; the statute makes no mention of bankruptcy assets. What if the company is sold for parts and the genetic data leaves CA or even the US? privacyrights.org/resources-to...
And none of this excuses the core fact that using Signal in this context was way outside standard procedure!
I'm in a large group chat on the app. My only usability critique is that the app could offer a confirmation step when adding users to a group, either one by one or en masse. Even so, navigating a large group on a small mobile screen has its inherent challenges.
I talked with the @huffpost.com this week about the @signal.org mishap, making it clear that the app is not to blame here. www.huffpost.com/entry/trump-...
Now the company is bankrupt--and our Attorney General is reminding customers that you can request to delete your data before you completely lose control over it in bankruptcy proceedings: oag.ca.gov/news/press-r...
These early adopters felt very assured that there was no way the company could fail and that their data was well protected, even though at the time there were no laws that specifically protected their consumer-collected genetic data in this context.
"I read through enough of their disclosure information on their initial website that I felt comfortable that their objectives were clear and true, and they couldn't be bought."
I conducted these interviews eight years ago now.
"I trust them as a company. I trust them to keep my data secure."
"The most brilliant people are working at that company."
"I had a level of trust in their integrity."
Most could not foresee such a risk. Here are a few quotes:
"I guess that I felt like I knew that this company was related to Google, and it it wasn't a company I knew nothing about. . . it was like a it's a company next door."
"They're well-established, they have protocols to protect their services."
In 2016-2017, I interviewed a handful of 23andMe customers for my Ph.D. dissertation research on privacy, personal disclosure, and power. I asked my interviewees about their concerns should something 'bad' happen to the company--like, oh, going bankrupt--and the impact on their genetic data.
I used Twitter for work: both connecting with others in my area of expertise and finding info that would otherwise be lost to me or buried on a mailing list I might not even know about. LinkedIn doesn't facilitate conversation like Twitter or Bluesky.
LinkedIn is a job/career networking site. It works well enough for that. But as a Twitter replacement (which is how I tried to use it for the past year) it was terrible. This site is already 1000x better.
What did Gaetz post in his Venmo transaction comments? Wrong answers only.
🚨 REMINDER: Join EPIC and the Privacy Law Section of the California Lawyers Association TODAY at 12pm PT/3pm ET for a panel discussion on Risks and Risk Assessments: A Look at California's Proposed AI & Privacy Regulations.
epic.zoom.us/webinar/regi...
Is it from the 18th century?
I'm not yet on this official agenda, but I will be part of this discussion on Thursday @ noon PT.
[cw: AI-generated CSAM discussion]
New Publication: In a case study for the Partnership on AI, my HAI colleague Caroline Meinhardt & I explain why direct disclosure mechanisms (e.g. content labels) can’t serve as a panacea for all the harms of synthetic media. partnershiponai.org/hai-research...
If you ever elect to become a parent, you'll be ultra-radicalized on the topic of child care affordability.
I'm preparing to teach a class at Stanford on dark patterns/manipulative design in January. AFAIK it's the first class fully devoted to the topic at a university (at least in the US!). If you have a recent paper, a case, an example, or anything else you think I should include, feel free to LMK!
Agreed, it's not a magic health solution. I do feel better standing compared to sitting all day, which I can barely tolerate now after years of standing. I've eyed an under-desk treadmill to up the ante ...
Shout out to my Stanford colleague and friend Dan Ho, the kind of lawyer who actually reads his closing docs, and by nature of being a law professor turns them into an impactful research project. www.sfchronicle.com/bayarea/arti...
Thank you - that's a way more efficient way of populating my follow list! ;)