Poll: Attitude and behaviors regarding electronic files

Where we talk about modern advancements like the abacus and printing press.

Check all that apply...

1. I save almost every electronic project I ever created. (8 votes, 17%)
2. I save on more than one device or cloud platform so that if one storage location fails, I am less likely to ever lose the files. (10 votes, 21%)
3. I sync everything to the cloud but am not confident my stuff will be there for me for the rest of my life. (4 votes, 8%)
4. I sync everything to the cloud or a device and assume/hope that it will be there for as long as I want or need it. (2 votes, 4%)
5. I delete roughly half of my projects and save roughly half. (2 votes, 4%)
6. I delete most of the stuff I produce and just save the best stuff. (no votes)
7. I can find most things I want to find in a relatively short amount of time. (11 votes, 23%)
8. I have difficulty finding most things I want to find. (no votes)
9. I've had major losses of my work, either on personal devices or on the cloud. (5 votes, 10%)
10. The biggest loss of work I've ever experienced was my investments in MennoDiscuss before its servers failed. (6 votes, 13%)
 
Total votes: 48

Josh
Posts: 24202
Joined: Wed Oct 19, 2016 6:23 pm
Location: 1000' ASL
Affiliation: The church of God

Re: Poll: Attitude and behaviors regarding electronic files

Post by Josh »

Ken wrote: Thu Sep 29, 2022 6:52 pm
Josh wrote: Thu Sep 29, 2022 4:10 pm
joshuabgood wrote: Thu Sep 29, 2022 2:20 pm I am a cloud guy, specifically Google. I figure a billion dollar company is probably more secure than any local servers. But even if I lost everything I wouldn't have lost that much. :)
Well. Except they have zero duty to you to preserve your files, and if they don’t like your political or theological opinions, they can cancel your accounts with no recourse.
Most institutions that use Google services have institutional accounts with Google in which they pay Google a LOT of money for institutional Gmail and Google Drive (cloud) services. These are contracts and not something that Google can turn off or on based on politics or religion.
Incorrect. My organisation’s TOS says Google can shut off my service for any or no reason, including deciding they don’t like things I say.
I actually have 3 separate Google accounts, my personal one, and separate work accounts for the two different school districts that I work for that have separate Gmail and Google Drive accounts linked to them. I don't know how much money they pay Google, but it is a LOT.
It’s around $5 per month per user. You think that is a lot?
I have seen days when Google services have been down for short periods of time making it impossible to access institutional email accounts and institutional cloud drive accounts. But I have never seen a Google failure that resulted in the permanent loss of Google cloud files. Perhaps they have occurred but they don't make the news.

For my personal files I don't use Google; I use Dropbox, and I keep every file on my computers synced to it. I have two computers and my wife has another, so our important financial files and photo archives are all synced to three separate hard drives plus the cloud. I used to back them all up to external hard drives as well, but I haven't bothered to do that in a long time. Having everything in four different places seems like plenty.

For my work files I use Google Drive, which is synced to my personal laptop and my desktop at work. That means every work document I ever create or use is always available in three separate places: my work machine, my home laptop, and Google Drive.

For me to lose a personal or work file would require the simultaneous loss of multiple personal computers plus a catastrophic failure of one of the giant cloud services. That seems like an extremely unlikely combination of events.
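The redundancy reasoning above can be sketched numerically. The per-copy failure probabilities below are made-up placeholders purely for illustration, and the independence assumption is the weak point: sync is not backup, since an accidental deletion propagates to every synced copy at once.

```python
# If copies fail independently, the chance of losing a file for good is
# the product of the individual failure probabilities.
# These per-year probabilities are invented for illustration only.
from math import prod

p_fail = {
    "work desktop": 0.05,
    "home laptop": 0.05,
    "wife's computer": 0.05,
    "cloud service": 0.001,
}

p_lose_everything = prod(p_fail.values())
print(p_lose_everything)  # about 1.25e-07, roughly one in eight million
```

With even modest per-device reliability, the product shrinks fast; the real risk is the correlated failures the model ignores (ransomware, sync deletions, account lockout).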

Edit: I looked it up. Google charges educational institutions $4 per month per user (student and staff). The larger district I work for has 23,564 students and probably over 2,000 staff (probably more if you include all the subs, maintenance workers, cafeteria workers, bus drivers, etc. who all have work email accounts). So call it 26,000 Google accounts. At $4 per month per account they are paying Google $104,000 per month, or over $1.2 million per year, for all cloud services. And yes, Google has a very real contractual obligation to preserve all of those files. That is part of what the district is paying for with its $1.2 million/year. Things like institutional email accounts are discoverable in lawsuits and have to be legally preserved.
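The arithmetic in that edit checks out; a quick sketch, using the post's own estimates (which are rough assumptions, not billing records):

```python
# Back-of-envelope check of the district's Google bill, using the
# figures quoted in the post above (estimates, not real billing data).
students = 23_564
staff = 2_000            # rough estimate from the post
accounts = 26_000        # rounded total, including subs, drivers, etc.
rate_per_user_month = 4  # quoted educational rate, USD

monthly = accounts * rate_per_user_month
annual = monthly * 12
print(monthly, annual)   # 104000 per month, 1248000 per year
```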

Corporate clients pay even more than educational institutions. I think corporate rates for Google services start at $6 per user.
You may want to re-read that contract. Google has no contractable SLA (it is “best effort”, although so far they have far exceeded this SLA) and specifically disclaims any liability for data loss. Backups are the responsibility of the customer.
Ken
Posts: 16239
Joined: Thu Jun 13, 2019 12:02 am
Location: Washington State
Affiliation: former MCUSA

Re: Poll: Attitude and behaviors regarding electronic files

Post by Ken »

Josh wrote: Thu Sep 29, 2022 9:32 pm
You may want to re-read that contract. Google has no contractable SLA (it is “best effort”, although so far they have far exceeded this SLA) and specifically disclaims any liability for data loss. Backups are the responsibility of the customer.
Why don't you tell us:

1. How many cloud failures Google has experienced in say the past 5 years (business data irretrievably lost because of failures in Google cloud service backups), and

2. How many times Google has censored or deleted information in paid institutional drive accounts or institutional email accounts for political or religious reasons.

That is what we are actually talking about here, electronic file backups. And not getting your YouTube video taken down for violating content standards. I'm guessing the number on both accounts approaches zero.
A fool can throw out more questions than a wise man can answer. -RZehr
joshuabgood
Posts: 2838
Joined: Fri Oct 21, 2016 5:23 pm
Affiliation: BMA

Re: Poll: Attitude and behaviors regarding electronic files

Post by joshuabgood »

Ken wrote: Fri Sep 30, 2022 12:05 am
Why don't you tell us:

1. How many cloud failures Google has experienced in say the past 5 years (business data irretrievably lost because of failures in Google cloud service backups), and

2. How many times Google has censored or deleted information in paid institutional drive accounts or institutional email accounts for political or religious reasons.

That is what we are actually talking about here, electronic file backups. And not getting your YouTube video taken down for violating content standards. I'm guessing the number on both accounts approaches zero.
I have been using Google Drive and Suite since 2008. Zero problems...
Josh
Posts: 24202
Joined: Wed Oct 19, 2016 6:23 pm
Location: 1000' ASL
Affiliation: The church of God

Re: Poll: Attitude and behaviors regarding electronic files

Post by Josh »

Ken wrote: Fri Sep 30, 2022 12:05 am
Why don't you tell us:

1. How many cloud failures Google has experienced in say the past 5 years (business data irretrievably lost because of failures in Google cloud service backups), and

2. How many times Google has censored or deleted information in paid institutional drive accounts or institutional email accounts for political or religious reasons.

That is what we are actually talking about here, electronic file backups. And not getting your YouTube video taken down for violating content standards. I'm guessing the number on both accounts approaches zero.
If your Google account gets suspended or terminated over, for example, a YouTube video you post with standard Christian doctrines about the family, as happened to a local megachurch, you can’t get into your Google Drive files, email, etc. either. The whole account is just gone.

You won’t care about this since you don’t hold any religious beliefs Google doesn’t approve of. But some of us do.
Ken
Posts: 16239
Joined: Thu Jun 13, 2019 12:02 am
Location: Washington State
Affiliation: former MCUSA

Re: Poll: Attitude and behaviors regarding electronic files

Post by Ken »

Josh wrote: Sat Oct 01, 2022 1:38 pm If your Google account gets suspended or terminated over, for example, a YouTube video you post with standard Christian doctrines about the family, as happened to a local megachurch, you can’t get into your Google Drive files, email, etc. either. The whole account is just gone.

You won’t care about this since you don’t hold any religious beliefs Google doesn’t approve of. But some of us do.
Can you point to an actual example of a megachurch or other institution having its paid Google accounts (Gmail and Google Drive electronic file backups) suspended or canceled,

as opposed to simply having a YouTube video or YouTube channel taken down for content, copyright, or terms of service violations?
A fool can throw out more questions than a wise man can answer. -RZehr
ken_sylvania
Posts: 4092
Joined: Tue Nov 01, 2016 12:46 pm
Affiliation: CM

Re: Poll: Attitude and behaviors regarding electronic files

Post by ken_sylvania »

Josh wrote: Sat Oct 01, 2022 1:38 pm If your Google account gets suspended or terminated over, for example, a YouTube video you post with standard Christian doctrines about the family, as happened to a local megachurch, you can’t get into your Google Drive files, email, etc. either. The whole account is just gone.

You won’t care about this since you don’t hold any religious beliefs Google doesn’t approve of. But some of us do.
Which church was that? T.B. Joshua Ministries?
RZehr
Posts: 7253
Joined: Thu Oct 20, 2016 12:42 am
Affiliation: Cons. Mennonite

Re: Poll: Attitude and behaviors regarding electronic files

Post by RZehr »

Here is a story that shows the actions Google took, and what they meant for a man whom Google decided was violating its terms of service.
https://www.nytimes.com/2022/08/21/tech ... photo.html
It was a Friday night in February 2021. His wife called an advice nurse at their health care provider to schedule an emergency consultation for the next morning, by video because it was a Saturday and there was a pandemic going on. The nurse said to send photos so the doctor could review them in advance.

Mark’s wife grabbed her husband’s phone and texted a few high-quality close-ups of their son’s groin area to her iPhone so she could upload them to the health care provider’s messaging system. In one, Mark’s hand was visible, helping to better display the swelling. Mark and his wife gave no thought to the tech giants that made this quick capture and exchange of digital data possible, or what those giants might think of the images.

With help from the photos, the doctor diagnosed the issue and prescribed antibiotics, which quickly cleared it up. But the episode left Mark with a much larger problem, one that would cost him more than a decade of contacts, emails and photos, and make him the target of a police investigation. Mark, who asked to be identified only by his first name for fear of potential reputational harm, had been caught in an algorithmic net designed to snare people exchanging child sexual abuse material.

Because technology companies routinely capture so much data, they have been pressured to act as sentinels, examining what passes through their servers to detect and prevent criminal behavior. Child advocates say the companies’ cooperation is essential to combat the rampant online spread of sexual abuse imagery. But it can entail peering into private archives, such as digital photo albums — an intrusion users may not expect — that has cast innocent behavior in a sinister light in at least two cases The Times has unearthed.

Jon Callas, a technologist at the Electronic Frontier Foundation, a digital civil liberties organization, called the cases “canaries in this particular coal mine.”

“There could be tens, hundreds, thousands more of these,” he said.

Given the toxic nature of the accusations, Mr. Callas speculated that most people wrongfully flagged would not publicize what had happened.

“I knew that these companies were watching and that privacy is not what we would hope it to be,” Mark said. “But I haven’t done anything wrong.”

The police agreed. Google did not.

‘A Severe Violation’
After setting up a Gmail account in the mid-aughts, Mark, who is in his 40s, came to rely heavily on Google. He synced appointments with his wife on Google Calendar. His Android smartphone camera backed up his photos and videos to the Google cloud. He even had a phone plan with Google Fi.

Two days after taking the photos of his son, Mark’s phone made a blooping notification noise: His account had been disabled because of “harmful content” that was “a severe violation of Google’s policies and might be illegal.” A “learn more” link led to a list of possible reasons, including “child sexual abuse & exploitation.”

Mark was confused at first but then remembered his son’s infection. “Oh, God, Google probably thinks that was child porn,” he thought.

In an unusual twist, Mark had worked as a software engineer on a large technology company’s automated tool for taking down video content flagged by users as problematic. He knew such systems often have a human in the loop to ensure that computers don’t make a mistake, and he assumed his case would be cleared up as soon as it reached that person.

He filled out a form requesting a review of Google’s decision, explaining his son’s infection. At the same time, he discovered the domino effect of Google’s rejection. Not only did he lose emails, contact information for friends and former colleagues, and documentation of his son’s first years of life, his Google Fi account shut down, meaning he had to get a new phone number with another carrier. Without access to his old phone number and email address, he couldn’t get the security codes he needed to sign in to other internet accounts, locking him out of much of his digital life.

“The more eggs you have in one basket, the more likely the basket is to break,” he said.

In a statement, Google said, “Child sexual abuse material is abhorrent and we’re committed to preventing the spread of it on our platforms.”

A few days after Mark filed the appeal, Google responded that it would not reinstate the account, with no further explanation.

Mark didn’t know it, but Google’s review team had also flagged a video he made and the San Francisco Police Department had already started to investigate him.

'How Google Flags Images'
The day after Mark’s troubles started, the same scenario was playing out in Texas. A toddler in Houston had an infection in his “intimal parts,” wrote his father in an online post that I stumbled upon while reporting out Mark’s story. At the pediatrician’s request, Cassio, who also asked to be identified only by his first name, used an Android to take photos, which were backed up automatically to Google Photos. He then sent them to his wife via Google’s chat service.

Cassio was in the middle of buying a house, and signing countless digital documents, when his Gmail account was disabled. He asked his mortgage broker to switch his email address, which made the broker suspicious until Cassio’s real estate agent vouched for him.

“It was a headache,” Cassio said.

Images of children being exploited or sexually abused are flagged by technology giants millions of times each year. In 2021, Google alone filed over 600,000 reports of child abuse material and disabled the accounts of over 270,000 users as a result. Mark’s and Cassio’s experiences were drops in a big bucket.

The tech industry’s first tool to seriously disrupt the vast online exchange of so-called child pornography was PhotoDNA, a database of known images of abuse, converted into unique digital codes, or hashes; it could be used to quickly comb through large numbers of images to detect a match even if a photo had been altered in small ways. After Microsoft released PhotoDNA in 2009, Facebook and other tech companies used it to root out users circulating illegal and harmful imagery.
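The idea behind hash-based matching described above can be illustrated with a toy average-hash. This is not PhotoDNA's actual algorithm, which is proprietary; it only shows why a compact fingerprint can survive small alterations to an image.

```python
# Toy "perceptual hash" in the spirit of PhotoDNA-style matching.
# NOT the real algorithm -- just an average-hash illustration of how
# small edits leave the fingerprint nearly unchanged.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints):
    each bit records whether a pixel is above the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits; a small distance suggests a match."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [30, 220]]
brightened = [[14, 204], [34, 224]]   # slightly altered copy
different = [[200, 10], [220, 30]]    # genuinely different content

print(hamming(average_hash(original), average_hash(brightened)))  # 0
print(hamming(average_hash(original), average_hash(different)))   # 4
```

Brightening every pixel shifts the mean along with the pixels, so the bit pattern (and the match) is unchanged, while different content produces a distant hash.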

“It’s a terrific tool,” the president of the National Center for Missing and Exploited Children said at the time.

A bigger breakthrough came along almost a decade later, in 2018, when Google developed an artificially intelligent tool that could recognize never-before-seen exploitative images of children. That meant finding not just known images of abused children but images of unknown victims who could potentially be rescued by the authorities. Google made its technology available to other companies, including Facebook.

When Mark’s and Cassio’s photos were automatically uploaded from their phones to Google’s servers, this technology flagged them. Jon Callas of the E.F.F. called the scanning intrusive, saying a family photo album on someone’s personal device should be a “private sphere.” (A Google spokeswoman said the company scans only when an “affirmative action” is taken by a user; that includes when the user’s phone backs up photos to the company’s cloud.)

“This is precisely the nightmare that we are all concerned about,” Mr. Callas said. “They’re going to scan my family album, and then I’m going to get into trouble.”

A human content moderator for Google would have reviewed the photos after they were flagged by the artificial intelligence to confirm they met the federal definition of child sexual abuse material. When Google makes such a discovery, it locks the user’s account, searches for other exploitative material and, as required by federal law, makes a report to the CyberTipline at the National Center for Missing and Exploited Children.

The nonprofit organization has become the clearinghouse for abuse material; it received 29.3 million reports last year, or about 80,000 reports a day. Fallon McNulty, who manages the CyberTipline, said most of these are previously reported images, which remain in steady circulation on the internet. So her staff of 40 analysts focuses on potential new victims, so they can prioritize those cases for law enforcement.

“Generally, if NCMEC staff review a CyberTipline report and it includes exploitative material that hasn’t been seen before, they will escalate,” Ms. McNulty said. “That may be a child who hasn’t yet been identified or safeguarded and isn’t out of harm’s way.”

Ms. McNulty said Google’s astonishing ability to spot these images so her organization could report them to police for further investigation was “an example of the system working as it should.”

CyberTipline staff members add any new abusive images to the hashed database that is shared with technology companies for scanning purposes. When Mark’s wife learned this, she deleted the photos Mark had taken of their son from her iPhone, for fear Apple might flag her account. Apple announced plans last year to scan iCloud Photos for known sexually abusive depictions of children, but the rollout was delayed indefinitely after resistance from privacy groups.

In 2021, the CyberTipline reported that it had alerted authorities to “over 4,260 potential new child victims.” The sons of Mark and Cassio were counted among them.

In December 2021, Mark received a manila envelope in the mail from the San Francisco Police Department. It contained a letter informing him that he had been investigated as well as copies of the search warrants served on Google and his internet service provider. An investigator, whose contact information was provided, had asked for everything in Mark’s Google account: his internet searches, his location history, his messages and any document, photo and video he’d stored with the company.

The search, related to “child exploitation videos,” had taken place in February, within a week of his taking the photos of his son.

Mark called the investigator, Nicholas Hillard, who said the case was closed. Mr. Hillard had tried to get in touch with Mark but his phone number and email address hadn’t worked.

“I determined that the incident did not meet the elements of a crime and that no crime occurred,” Mr. Hillard wrote in his report. The police had access to all the information Google had on Mark and decided it did not constitute child abuse or exploitation.

Mark asked if Mr. Hillard could tell Google that he was innocent so he could get his account back.

“You have to talk to Google,” Mr. Hillard said, according to Mark. “There’s nothing I can do.”

Mark appealed his case to Google again, providing the police report, but to no avail. After getting a notice two months ago that his account was being permanently deleted, Mark spoke with a lawyer about suing Google and how much it might cost. “I decided it was probably not worth $7,000,” he said.

Kate Klonick, a law professor at St. John’s University who has written about online content moderation, said it can be challenging to “account for things that are invisible in a photo, like the behavior of the people sharing an image or the intentions of the person taking it.” False positives, where people are erroneously flagged, are inevitable given the billions of images being scanned. While most people would probably consider that trade-off worthwhile, given the benefit of identifying abused children, Ms. Klonick said companies need a “robust process” for clearing and reinstating innocent people who are mistakenly flagged.

“This would be problematic if it were just a case of content moderation and censorship,” Ms. Klonick said. “But this is doubly dangerous in that it also results in someone being reported to law enforcement.” It could have been worse, she said, with a parent potentially losing custody of a child. “You could imagine how this might escalate,” Ms. Klonick said.

Cassio was also investigated by the police. A detective from the Houston Police Department called in the fall of 2021, asking him to come into the station. After Cassio showed the detective his communications with the pediatrician, he was quickly cleared. But he, too, was unable to get his decade-old Google account back, despite being a paying user of Google’s web services. He now uses a Hotmail address for email, which people mock him for, and makes multiple backups of his data.

0 x
Ken
Posts: 16239
Joined: Thu Jun 13, 2019 12:02 am
Location: Washington State
Affiliation: former MCUSA

Re: Poll: Attitude and behaviors regarding electronic files

Post by Ken »

RZehr wrote: Sat Oct 01, 2022 3:35 pm Here is a story that shows the actions that Google took, and the implications that meant for a man that Google decided was violating their terms of services.
https://www.nytimes.com/2022/08/21/tech ... photo.html
It was a Friday night in February 2021. His wife called an advice nurse at their health care provider to schedule an emergency consultation for the next morning, by video because it was a Saturday and there was a pandemic going on. The nurse said to send photos so the doctor could review them in advance.

Mark’s wife grabbed her husband’s phone and texted a few high-quality close-ups of their son’s groin area to her iPhone so she could upload them to the health care provider’s messaging system. In one, Mark’s hand was visible, helping to better display the swelling. Mark and his wife gave no thought to the tech giants that made this quick capture and exchange of digital data possible, or what those giants might think of the images.

With help from the photos, the doctor diagnosed the issue and prescribed antibiotics, which quickly cleared it up. But the episode left Mark with a much larger problem, one that would cost him more than a decade of contacts, emails and photos, and make him the target of a police investigation. Mark, who asked to be identified only by his first name for fear of potential reputational harm, had been caught in an algorithmic net designed to snare people exchanging child sexual abuse material.

Because technology companies routinely capture so much data, they have been pressured to act as sentinels, examining what passes through their servers to detect and prevent criminal behavior. Child advocates say the companies’ cooperation is essential to combat the rampant online spread of sexual abuse imagery. But it can entail peering into private archives, such as digital photo albums, an intrusion users may not expect, and it has cast innocent behavior in a sinister light in at least two cases The Times has unearthed.

Jon Callas, a technologist at the Electronic Frontier Foundation, a digital civil liberties organization, called the cases canaries “in this particular coal mine.”

“There could be tens, hundreds, thousands more of these,” he said.

Given the toxic nature of the accusations, Mr. Callas speculated that most people wrongfully flagged would not publicize what had happened.

“I knew that these companies were watching and that privacy is not what we would hope it to be,” Mark said. “But I haven’t done anything wrong.”

The police agreed. Google did not.

‘A Severe Violation’
After setting up a Gmail account in the mid-aughts, Mark, who is in his 40s, came to rely heavily on Google. He synced appointments with his wife on Google Calendar. His Android smartphone camera backed up his photos and videos to the Google cloud. He even had a phone plan with Google Fi.

Two days after taking the photos of his son, Mark’s phone made a blooping notification noise: His account had been disabled because of “harmful content” that was “a severe violation of Google’s policies and might be illegal.” A “learn more” link led to a list of possible reasons, including “child sexual abuse & exploitation.”

Mark was confused at first but then remembered his son’s infection. “Oh, God, Google probably thinks that was child porn,” he thought.

In an unusual twist, Mark had worked as a software engineer on a large technology company’s automated tool for taking down video content flagged by users as problematic. He knew such systems often have a human in the loop to ensure that computers don’t make a mistake, and he assumed his case would be cleared up as soon as it reached that person.

He filled out a form requesting a review of Google’s decision, explaining his son’s infection. At the same time, he discovered the domino effect of Google’s rejection. Not only did he lose emails, contact information for friends and former colleagues, and documentation of his son’s first years of life, but his Google Fi account also shut down, meaning he had to get a new phone number with another carrier. Without access to his old phone number and email address, he couldn’t get the security codes he needed to sign in to other internet accounts, locking him out of much of his digital life.

“The more eggs you have in one basket, the more likely the basket is to break,” he said.

In a statement, Google said, “Child sexual abuse material is abhorrent and we’re committed to preventing the spread of it on our platforms.”

A few days after Mark filed the appeal, Google responded that it would not reinstate the account, with no further explanation.

Mark didn’t know it, but Google’s review team had also flagged a video he made and the San Francisco Police Department had already started to investigate him.

‘How Google Flags Images’
The day after Mark’s troubles started, the same scenario was playing out in Texas. A toddler in Houston had an infection in his “intimal parts,” wrote his father in an online post that I stumbled upon while reporting out Mark’s story. At the pediatrician’s request, Cassio, who also asked to be identified only by his first name, used an Android to take photos, which were backed up automatically to Google Photos. He then sent them to his wife via Google’s chat service.

Cassio was in the middle of buying a house, and signing countless digital documents, when his Gmail account was disabled. He asked his mortgage broker to switch his email address, which made the broker suspicious until Cassio’s real estate agent vouched for him.

“It was a headache,” Cassio said.

Images of children being exploited or sexually abused are flagged by technology giants millions of times each year. In 2021, Google alone filed over 600,000 reports of child abuse material and disabled the accounts of over 270,000 users as a result. Mark’s and Cassio’s experiences were drops in a big bucket.

The tech industry’s first tool to seriously disrupt the vast online exchange of so-called child pornography was PhotoDNA, a database of known images of abuse, converted into unique digital codes, or hashes; it could be used to quickly comb through large numbers of images to detect a match even if a photo had been altered in small ways. After Microsoft released PhotoDNA in 2009, Facebook and other tech companies used it to root out users circulating illegal and harmful imagery.
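PhotoDNA itself is proprietary, but the general idea the paragraph above describes — reduce each image to a compact fingerprint ("hash"), then compare fingerprints rather than raw pixels, so small alterations still match — can be sketched with a much simpler "average hash." Everything below (the toy 3×3 images, the matching threshold) is illustrative only, not PhotoDNA's actual algorithm:

```python
# Toy illustration of perceptual hashing (NOT PhotoDNA's real algorithm,
# which is proprietary): turn an image into a bit-string fingerprint,
# then compare fingerprints by Hamming distance instead of raw pixels.

def average_hash(pixels):
    """Hash a grayscale image (2D list of 0-255 values): one bit per
    pixel, set if that pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# A hypothetical "known" 3x3 image and a slightly altered copy
# (one pixel brightened, as recompression or a minor edit might do).
known   = [[10, 200, 30], [220, 40, 250], [60, 240, 20]]
altered = [[10, 200, 30], [220, 55, 250], [60, 240, 20]]

# Small alterations leave the fingerprint identical or nearly so, so a
# low Hamming distance (under some threshold) still counts as a match.
print(hamming(average_hash(known), average_hash(altered)))  # → 0
```

In a real system the database holds millions of such hashes of known abuse images, and each uploaded photo's hash is compared against all of them; PhotoDNA's hash is far more robust to cropping, resizing and recoloring than this average-hash toy.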

“It’s a terrific tool,” the president of the National Center for Missing and Exploited Children said at the time.

A bigger breakthrough came along almost a decade later, in 2018, when Google developed an artificially intelligent tool that could recognize never-before-seen exploitative images of children. That meant finding not just known images of abused children but images of unknown victims who could potentially be rescued by the authorities. Google made its technology available to other companies, including Facebook.

When Mark’s and Cassio’s photos were automatically uploaded from their phones to Google’s servers, this technology flagged them. Jon Callas of the E.F.F. called the scanning intrusive, saying a family photo album on someone’s personal device should be a “private sphere.” (A Google spokeswoman said the company scans only when an “affirmative action” is taken by a user; that includes when the user’s phone backs up photos to the company’s cloud.)

“This is precisely the nightmare that we are all concerned about,” Mr. Callas said. “They’re going to scan my family album, and then I’m going to get into trouble.”

A human content moderator for Google would have reviewed the photos after they were flagged by the artificial intelligence to confirm they met the federal definition of child sexual abuse material. When Google makes such a discovery, it locks the user’s account, searches for other exploitative material and, as required by federal law, makes a report to the CyberTipline at the National Center for Missing and Exploited Children.

The nonprofit organization has become the clearinghouse for abuse material; it received 29.3 million reports last year, or about 80,000 reports a day. Fallon McNulty, who manages the CyberTipline, said most of these are previously reported images, which remain in steady circulation on the internet. So her staff of 40 analysts focuses on potential new victims, so they can prioritize those cases for law enforcement.

“Generally, if NCMEC staff review a CyberTipline report and it includes exploitative material that hasn’t been seen before, they will escalate,” Ms. McNulty said. “That may be a child who hasn’t yet been identified or safeguarded and isn’t out of harm’s way.”

Ms. McNulty said Google’s astonishing ability to spot these images so her organization could report them to police for further investigation was “an example of the system working as it should.”

CyberTipline staff members add any new abusive images to the hashed database that is shared with technology companies for scanning purposes. When Mark’s wife learned this, she deleted the photos Mark had taken of their son from her iPhone, for fear Apple might flag her account. Apple announced plans last year to scan iCloud Photos for known sexually abusive depictions of children, but the rollout was delayed indefinitely after resistance from privacy groups.

In 2021, the CyberTipline reported that it had alerted authorities to “over 4,260 potential new child victims.” The sons of Mark and Cassio were counted among them.

In December 2021, Mark received a manila envelope in the mail from the San Francisco Police Department. It contained a letter informing him that he had been investigated as well as copies of the search warrants served on Google and his internet service provider. An investigator, whose contact information was provided, had asked for everything in Mark’s Google account: his internet searches, his location history, his messages and any document, photo and video he’d stored with the company.

The search, related to “child exploitation videos,” had taken place in February, within a week of his taking the photos of his son.

Mark called the investigator, Nicholas Hillard, who said the case was closed. Mr. Hillard had tried to get in touch with Mark but his phone number and email address hadn’t worked.

“I determined that the incident did not meet the elements of a crime and that no crime occurred,” Mr. Hillard wrote in his report. The police had access to all the information Google had on Mark and decided it did not constitute child abuse or exploitation.

Mark asked if Mr. Hillard could tell Google that he was innocent so he could get his account back.

“You have to talk to Google,” Mr. Hillard said, according to Mark. “There’s nothing I can do.”

Mark appealed his case to Google again, providing the police report, but to no avail. After getting a notice two months ago that his account was being permanently deleted, Mark spoke with a lawyer about suing Google and how much it might cost. “I decided it was probably not worth $7,000,” he said.

Kate Klonick, a law professor at St. John’s University who has written about online content moderation, said it can be challenging to “account for things that are invisible in a photo, like the behavior of the people sharing an image or the intentions of the person taking it.” False positives, where people are erroneously flagged, are inevitable given the billions of images being scanned. While most people would probably consider that trade-off worthwhile, given the benefit of identifying abused children, Ms. Klonick said companies need a “robust process” for clearing and reinstating innocent people who are mistakenly flagged.

“This would be problematic if it were just a case of content moderation and censorship,” Ms. Klonick said. “But this is doubly dangerous in that it also results in someone being reported to law enforcement.” It could have been worse, she said, with a parent potentially losing custody of a child. “You could imagine how this might escalate,” Ms. Klonick said.

Cassio was also investigated by the police. A detective from the Houston Police Department called in the fall of 2021, asking him to come into the station. After Cassio showed the detective his communications with the pediatrician, he was quickly cleared. But he, too, was unable to get his decade-old Google account back, despite being a paying user of Google’s web services. He now uses a Hotmail address for email, which people mock him for, and makes multiple backups of his data.

Yeah, that sounds bad. Honestly, he should have sued, if only for discovery and to uncover Google's decision-making process.

Google SHOULD be involved in screening for child pornography and child exploitation on its servers. But they should have a more transparent and responsive review process. That is probably something that needs to be addressed by legislation and regulation.
0 x
A fool can throw out more questions than a wise man can answer. -RZehr
User avatar
Josh
Posts: 24202
Joined: Wed Oct 19, 2016 6:23 pm
Location: 1000' ASL
Affiliation: The church of God

Re: Poll: Attitude and behaviors regarding electronic files

Post by Josh »

… you think any of us little people can sue Google? And get any result besides Google ruining you, when they can afford much more expensive lawyers?
0 x