

Why the F.T.C. Is Taking a New Look at Facebook Privacy




After a yearlong string of news reports that have called Facebook’s data-sharing practices into question, federal regulators are taking a hard look at how the social media company handles the personal information of its users.

It is not the first time Facebook has drawn government scrutiny. About seven years ago, after charges were leveled by the Federal Trade Commission, the company made an agreement with the agency to overhaul its privacy practices.

That agreement, called a consent decree, provides a road map for how the F.T.C. is likely to scrutinize Facebook over the coming months.

In 2007, Facebook introduced Facebook Beacon, a program that broadcast details on users’ online purchases to their friends, initially allowing users to opt out of sharing their purchases only on a case-by-case basis.

Facebook’s chief executive, Mark Zuckerberg, apologized with what an article in The New York Times described as a “symphony of contrition.” In a Facebook post that year, Mr. Zuckerberg wrote: “I’m not proud of the way we’ve handled this situation and I know we can do better.”

At the end of 2009, a coalition of nonprofit consumer and privacy groups, led by the Electronic Privacy Information Center, petitioned the Federal Trade Commission to investigate Facebook’s handling of user data.

The groups filed a complaint saying that Facebook had repeatedly disregarded users’ expectations and diminished their privacy. The complaint argued that the company had violated a federal law prohibiting unfair and deceptive business practices.

In 2011, the F.T.C. filed charges against Facebook that said the company had deceived consumers about their privacy.

The F.T.C.’s complaint charged Facebook with a number of deceptive privacy practices. Among them:

■ Facebook shared users’ personal details with advertisers even though the company had promised not to do so, the agency said.

■ Facebook allowed third-party apps that users had installed to have access to nearly all of their personal data — even though Facebook had stated the apps could obtain only the personal information they needed to operate, the agency said.

■ In 2009, the agency said, Facebook changed its information-handling practices, making certain personal details — like users’ friends lists — public, overriding the choices of people who wanted to keep that data private. The policy change, the F.T.C.’s complaint said, exposed users’ profile information, including “potentially controversial political views or other sensitive information,” to third parties.

■ The agency said Facebook claimed it certified the security practices of apps participating in its “Verified Apps program,” but the company did not do so.

In November 2011, Facebook agreed to settle complaints that it had deceived consumers by “telling them they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public,” the F.T.C. said in a statement at the time.

The agreement, which became final in 2012, prohibited Facebook from misleading consumers about their data privacy and security. The social network committed to getting the explicit consent of users before making changes that overrode their privacy preferences.

The agency ordered Facebook to put a comprehensive privacy program in place to protect the privacy and confidentiality of users’ information and to manage the risks of existing and new products.

It also required Facebook over the next 20 years to undergo biennial audits by an independent third party to certify that the privacy program was properly protecting the information of the company’s users.

In March 2018, The New York Times reported that a voter-profiling company, Cambridge Analytica, had harvested the personal data of millions of Facebook users without their knowledge or permission.

The voter-profiling company obtained the data from a researcher who had offered a personality survey app on Facebook. Although only about 270,000 Facebook users agreed to share their data to participate in the survey, the Facebook platform enabled the app to improperly harvest the personal details of millions of those users’ friends — consumers who had not agreed to share their information with the survey app, The Times reported.

Privacy experts, law professors and at least one former F.T.C. official have argued that Facebook’s failure to prevent the survey app from obtaining the data of users’ friends violated the federal consent agreement. So did Facebook’s failure to prevent the app developer from sharing both users’ data and the data of users’ friends with Cambridge Analytica, these critics said.

They said the Cambridge Analytica episode suggested that Facebook had failed to adequately conduct the risk assessments the agreement required it to do. It also failed to obtain required, explicit consent from users’ friends for the sharing of their data with third parties, the privacy experts said.

They also argued that Facebook had failed to operate a comprehensive privacy protection program and take reasonable precautions — steps the company was obligated to take under the consent decree.

“The consent decree requires Facebook to always be vigilant to possible privacy problems and try to solve them,” said David C. Vladeck, a professor at Georgetown Law and a former director of consumer protection at the F.T.C. who oversaw the investigation that led to the consent decree. “Cambridge Analytica made clear that Facebook was not auditing third-party apps.”

On March 26, the F.T.C. said it was conducting an investigation into Facebook’s privacy practices. An agency spokeswoman declined to comment last week on the progress of the investigation.

Since then, Facebook has made other admissions about privacy problems that experts said could potentially violate the consent agreement or trigger new federal charges of deceptive privacy practices.

■ In June, the company said a software bug made public the posts of up to 14 million users who thought the posts were private.

■ Also in June, The New York Times reported that Facebook had allowed device makers like Amazon, Apple, Blackberry, Microsoft and Samsung access to the data of users’ friends without their explicit consent, even after the company said that it would no longer share such information with outsiders.

■ In September, the company said a security breach had exposed the personal data of nearly 50 million users.

■ In October, Facebook said Russian firms had scraped user data, including “matching photos from individuals’ personal social media accounts in order to identify them.”

■ In December, Facebook said a software bug had given apps access to a larger set of users’ photos than usual.

■ Also in December, The New York Times reported that Facebook had shared user data with Amazon, Microsoft, Yahoo and other companies without users’ knowledge or permission.

In addition to the F.T.C., Facebook is under investigation by the Justice Department, the Federal Bureau of Investigation, the Securities and Exchange Commission and several government agencies in Europe over Cambridge Analytica’s harvesting of user data.

Facebook said it had developed a privacy program as required by federal regulators and it had not violated the consent decree.

“We are transparent with people about how we use their information and respect people’s privacy settings,” said Sally Aldous, a Facebook spokeswoman. “We have a privacy program, which ensures we protect people’s information, which we continuously evolve to address the privacy risks of our products and services.”

Ms. Aldous said the company’s privacy program involved more than three dozen control mechanisms — including a privacy governance team and security teams that “ensure privacy risks for product launches and major changes are identified, discussed, and escalated for decisions when necessary.”

Facebook said it disagreed with The Times’s characterization of its sharing of user data with Amazon, Apple, Blackberry, Microsoft, Samsung, Yahoo and other companies.

The social network said device makers used information from Facebook to integrate certain Facebook features on their platforms and agreed not to use that information for their own purposes. The company also said Spotify and other third-party apps had access to users’ Facebook data only after users signed in with their Facebook account in the third-party apps.

“None of these partnerships or features gave companies access to information without people’s permission, nor did they violate our 2012 settlement with the F.T.C.,” Konstantinos Papamiltiadis, director of developer platforms and programs at Facebook, wrote in a company news release last week.





More groups join in support of women in STEM program at Carleton




OTTAWA — Major companies and government partners are lending their support to Carleton University’s newly established Women in Engineering and Information Technology Program.

The list of supporters includes Mississauga-based construction company EllisDon.

The latest to announce their support for the program also include BlackBerry QNX, CIRA (Canadian Internet Registration Authority), Ericsson, Nokia, Solace, Trend Micro, the Canadian Nuclear Safety Commission, CGI, Gastops, Leonardo DRS, Lockheed Martin Canada, Amdocs and Ross.

The program is officially set to launch this September.

It is being led by Carleton’s Faculty of Engineering and Design with the goal of establishing meaningful partnerships in support of women in STEM.  

The program will host events for women students to build relationships with industry and government partners, create mentorship opportunities, as well as establish a special fund to support allies at Carleton in meeting equity, diversity and inclusion goals.



VR tech to revolutionize commercial driver training




Serious Labs has found a way from tragedy to triumph. The Edmonton-based firm designs and manufactures virtual reality simulators to standardize training programs for operators of heavy equipment such as aerial lifts, cranes, forklifts, and commercial trucks. These simulators let operators acquire and practice operational skills in a risk-free virtual environment so they can work more safely and efficiently on the job.

The 2018 Humboldt bus catastrophe sent shock waves across the industry. The tragedy highlighted the need for standardized commercial driver training and testing, and it accelerated the federal government’s implementation of a Mandatory Entry-Level Training (MELT) program for Class 1 and 2 drivers, which is now being adopted across Canada. MELT is a much more rigorous standard that promotes safety and in-depth practice for new drivers.

Enter Serious Labs. By proposing to harness the power of virtual reality (VR) for driver training, the company has secured considerable funding to develop a VR commercial truck driving simulator.

The Government of Alberta has awarded $1 million, and Emissions Reduction Alberta (ERA) is contributing an additional $2 million for the simulator development. Commercial deployment is estimated to begin in 2024, with the simulator to be made available across Canada and the United States, and with the Alberta Motor Transport Association (AMTA) helping to provide simulator tests to certify that driver trainees have attained the appropriate standard. West Tech Report recently took the opportunity to chat with Serious Labs CEO, Jim Colvin, about the environmental and labour benefits of VR Driver Training, as well as the unique way that Colvin went from angel investor to CEO of the company.



Next-Gen Tech Company Pops on New COVID Detection Test




As the world emerges from the initial stages of the pandemic, COVID-19 will continue to be a threat for some time to come. Companies such as ZEN Graphene are at the forefront of technology, working on ways to detect the virus and its variants.

Nanotechnology firm ZEN Graphene Solutions Ltd. (TSX-Venture:ZEN) (OTCPK:ZENYF) is working to develop technology to help detect the COVID-19 virus and its variants. The firm signed an exclusive agreement with McMaster University to be the global commercializing partner for a newly developed aptamer-based, SARS-CoV-2 rapid detection technology.

This patent-pending technology uses clinical samples from patients and was funded by the Canadian Institutes of Health Research. The saliva-based test is considered extremely accurate, scalable and affordable, and provides results in under 10 minutes.

Shares were trading up over 5% to $3.07 in early afternoon trade.
