Google announced the decision to shut down Google Plus on Monday as part of a broad review of how much user information it shares with third-party developers.
Google said on Monday that it would shut down Google Plus, the company’s long-struggling answer to Facebook’s giant social network, after it discovered a security vulnerability that exposed the private data of up to 500,000 users.
Google did not tell its users about the security issue when it was found in March because it didn’t appear that anyone had gained access to user information, and the company’s “Privacy & Data Protection Office” decided it was not legally required to report it, the search giant said in a blog post.
The decision to stay quiet, which raised eyebrows in the cybersecurity community, comes against the backdrop of relatively new rules in California and Europe that govern when a company must disclose a security episode.
Up to 438 applications made by other companies may have had access to the exposed data through coding links called application programming interfaces. Those outside developers could have seen user names, email addresses, occupations, genders and ages. They did not have access to phone numbers, messages, Google Plus posts or data from other Google accounts, the company said.
Google said it had found no evidence that outside developers were aware of the security flaw and no indication that any user profiles were touched. The flaw was fixed in an update made in March.
Google looked at the “type of data involved, whether we could accurately identify the users to inform, whether there was any evidence of misuse, and whether there were any actions a developer or user could take in response. None of these thresholds were met in this instance,” Ben Smith, a Google vice president for engineering, wrote in the blog post.
The disclosure made on Monday could receive additional scrutiny because of a memo to senior executives reportedly prepared by Google’s policy and legal teams that warned of embarrassment for the company — similar to what happened to Facebook this year — if it went public with the vulnerability.
The memo, according to The Wall Street Journal, warned that disclosing the problem would invite regulatory scrutiny and that Sundar Pichai, Google’s chief executive, would most likely be called to testify in front of Congress. A Google spokesman, Rob Shilkin, declined to comment on the memo. He said the company had planned to make the disclosure later but moved up the announcement.
This year, Facebook acknowledged that Cambridge Analytica, a political consulting firm, had gained access to the personal information of 87 million Facebook users. Mark Zuckerberg, Facebook’s chief executive, spent two days testifying in congressional hearings about that and other issues.
In May, Europe’s new privacy rules, the General Data Protection Regulation, went into effect. They require companies to notify regulators of a potential leak of personal information within 72 hours. Google’s security issue occurred in March, before the new rules took effect.
California recently passed a privacy law that goes into effect in 2020. It will allow consumers, in the event of a data breach, to sue for up to $750 for each violation. It also gives the state’s attorney general the right to go after companies for intentional violations of privacy.
Steven Andrés said there was no obvious legal requirement for Google to disclose the vulnerability. But Mr. Andrés added that it was troubling to see the company discussing how reporting the vulnerability might look to regulators.
There is no federal law requiring companies to disclose a security vulnerability, so companies must wade through a patchwork of state laws with different standards.
Arvind Narayanan, a computer science professor at Princeton University who is often critical of tech companies for lax privacy practices, said on Twitter that it was common for companies to fix a problem before it could be exploited. “That happens thousands of times every year. Requiring disclosure of all of these would be totally counterproductive,” Mr. Narayanan wrote.
In private meetings with lawmakers last month, Mr. Pichai promised to testify before the end of the year at a hearing about whether tech companies are filtering conservative voices in their products. Mr. Pichai was also expected to be asked whether Google plans to re-enter the Chinese market. The vulnerability discovered in March, and the company’s discussions about how regulators could react, are also likely to come up in his testimony.
Google was criticized for not sending Mr. Pichai to a hearing attended by top executives from Facebook and Twitter.