The Privacy Post #20

Georgia Wright Privacy

Every week we’ll be rounding up the latest news in privacy and other interesting articles.
EU gets tough on cyber security

Jean-Claude Juncker has outlined plans to expand the EU’s cybersecurity agency, ENISA. In his State of the Union speech, he announced that ENISA would introduce a cybersecurity certification scheme to help standardise ICT products and services. The President of the European Commission highlighted that cybersecurity would be a priority for the European Union. A detailed plan can be found here.

Canada’s privacy watchdog seeks stronger enforcement powers

In an annual report to the Canadian parliament, Privacy Commissioner Daniel Therrien has stated that his office will shift towards launching its own investigations instead of reacting to complaints. This will be part of a “proactive enforcement model”. The report also requested that the Office of the Privacy Commissioner be given the authority to issue binding orders and fines against companies.

Mr. Therrien criticised Canadian privacy laws, warning that they were inadequate in the face of the European Union’s General Data Protection Regulation (GDPR).

“Governments understand that for the digital economy to flourish, consumers need to have confidence that their data will be protected adequately… There is a direct link between consumer trust and growth in the digital economy.”
– Daniel Therrien, Privacy Commissioner of Canada.

Man found guilty under UK terrorism laws after refusing to reveal passwords

The international director of an advocacy group campaigning against the impact of counter-terrorism policies has been found guilty under terrorism laws. After being stopped at an airport last year, Muhammad Rabbani refused to hand over his laptop and phone passwords to police. He has called the law in question, Schedule 7 of the Terrorism Act 2000, a danger to personal privacy, describing the demand as a “digital strip-search” carried out even though at no point was he under suspicion.

Apple Acknowledges Password Security Flaw In This Week’s macOS High Sierra Release

A former NSA analyst has revealed that the latest macOS update, High Sierra, is potentially vulnerable to unapproved apps. A security flaw means that applications can access the user’s keychain, and therefore their passwords, without any user interaction. Patrick Wardle, who found the problem, still recommended that users update to the latest software, as the vulnerability affects all versions of macOS.

Like this post? Subscribe to our weekly newsletter here to be updated with all news on privacy, Cognitiv+ and more.

Cognitiv+ a finalist at The UBS Future of Finance Challenge 2017!

Georgia Wright Events

Cognitiv+ has been shortlisted as a finalist for UBS’s Future of Finance Challenge!

The competition will culminate on 24 October 2017 in London. Cognitiv+ will compete against 9 other finalists in the Northern and Western Europe and Africa region.

The competition was open to fintech companies around the world, and aimed to seek out those with innovative, potentially disruptive technological ideas and solutions that will support the transformation of the banking industry.

“As the financial industry faces disruption and rapid technology changes, UBS is keen to find more solutions through four competition challenges: Digital Ecosystem, RegTech and LegalTech, Investment Banking 4.0 and Wealth in the Digital Age.”
– UBS

The entries were judged against the ‘6 P’ criteria:

  1. Proposition: How compelling is the technology?
  2. Pioneering: How new and different is it?
  3. Potential: How big is the market opportunity?
  4. Practical: How easy is it to implement?
  5. Plan: What are the next steps to get the technology to market?
  6. Proposer: What relevant capabilities and experience do the founders have?

Read more about the competition here. Find the full list of finalists here.

Like this post? Subscribe to our weekly newsletter here to be updated with all news on privacy, Cognitiv+ and more.

The Privacy Post #19

Georgia Wright Privacy

Every week we’ll be rounding up the latest news in privacy and other interesting articles.
Uber agrees to 20 years of privacy audits following FTC charges

US regulators have found that Uber failed to protect the personal information of drivers and passengers. The Federal Trade Commission (FTC) announced that the ride-sharing company has agreed to put in place a comprehensive privacy program which will see Uber subjected to assessments by an independent auditor every two years for the next two decades.

“This case shows that, even if you’re a fast-growing company, you can’t leave consumers behind: you must honour your privacy and security promises.”
– Maureen Ohlhausen, FTC Acting Chairman.

Uber’s software program ‘God View’ allowed its employees improper access to consumer data in 2014. A data breach in 2015 affecting the personal data of 100,000 Uber drivers has also raised concerns. The FTC also found that the steps taken to mitigate improper use of consumer data were misrepresented to the public.

Apple, Google and Other Tech Giants Urge Supreme Court to Block Warrantless Cellphone Tracking

Fourteen US tech giants have filed a brief with the Supreme Court supporting stronger cellphone privacy protections under the Fourth Amendment. The friend-of-the-court filing was submitted by companies including Apple, Google, Facebook, Verizon and Twitter. They recommend that more rigorous warrant requirements be imposed on law enforcement, especially when certain cellphone data, such as location information, is sought.

“That users rely on technology companies to process their data for limited purposes does not mean that they expect their intimate data to be monitored by the government without a warrant,”

While Congress kills internet privacy, states take a stand for users

Despite the Trump administration’s undoing of new Federal Communications Commission regulations, it is reported that states across the US remain committed to protecting the privacy of their citizens.

Vacuums that pick up data as well as dirt renew privacy concerns

Artificially intelligent vacuum cleaners are the latest household gadget to raise privacy concerns. Christian Cerda, COO of iRobot, has commented that the company wouldn’t rule out selling data collected by the smart vacuums. The smart appliances market in Europe has been growing year on year, but is expected to face ongoing privacy challenges from those concerned about data sharing between different technology companies.

Like this post? Subscribe to our weekly newsletter here to be updated with all news on privacy, Cognitiv+ and more.

The Privacy Post #18

Georgia Wright Privacy

Every week we’ll be rounding up the latest news in privacy and other interesting articles.
UK data protection laws to be overhauled

The UK government has released details of the Data Protection Bill. The Bill will bring the EU’s General Data Protection Regulation into UK domestic law, as well as adding a few further protections. Find our full analysis here.

“It will give people more control over their data, require more consent for its use and prepare Britain for Brexit,”
– Matt Hancock, Digital Minister.

Hotspot Shield VPN accused of violating users’ privacy

A privacy group has recommended that the Federal Trade Commission investigate a popular virtual private network (VPN) provider. AnchorFree Inc. has been accused by the Center for Democracy and Technology of misleading users of its VPN service with claims that it ensures their online privacy. According to an analysis of Hotspot Shield’s source code, the VPN provider tracks users and doesn’t adequately secure their personal data.

Disney Faces Children’s Privacy Class Claims Over Mobile App

A total of 43 Disney mobile apps have been accused of collecting children’s data in violation of the Children’s Online Privacy Protection Act. The Walt Disney Co. is now facing a federal lawsuit, brought by a concerned mother, over tracking software contained in the apps which “exfiltrate[s] that information off the smart device for advertising and other commercial purposes.”

Windows 10 privacy: You’re happy for us to collect your data, says Microsoft

Microsoft has claimed that it has had “positive” feedback following the release of Windows 10’s personalisation options. These options include privacy-enhancing features such as an online privacy dashboard allowing users to control activity data across several Microsoft services. Other changes include more controls for location, speech recognition, diagnostics, relevant ads and recommendations. It also offers more transparency concerning Microsoft’s collection and use of data.

Like this post? Subscribe to our weekly newsletter here to be updated with all news on privacy, Cognitiv+ and more.

New Data Protection Bill: post-Brexit plans & the GDPR

Georgia Wright Privacy

UK government announces new data protection legislation

Preparations for the EU General Data Protection Regulation (GDPR) have been well underway ahead of its May 2018 enforcement date. News this week of the government’s intent to legislate new data protection laws means that these preparations will remain relevant. Moreover, it has become apparent that the new British laws will go above and beyond the GDPR in protecting privacy.

Why do we need new laws?

“It will give people more control over their data, require more consent for its use and prepare Britain for Brexit,”
– Matt Hancock, Digital Minister.

The government’s statement of intent, released on 7 August 2017, was hinted at during the Queen’s Speech. In an opening statement, Matt Hancock sets out the main purposes of the legislation:

  • Support innovation by ensuring safe data processing standards
  • Ensure data protection in a system with more accountability but less bureaucracy
  • Tougher rules on consent, rights to access, move and delete data
  • Enhanced enforcement
  • New powers for the Information Commissioner’s Office (ICO)
  • Bring domestic law into line with EU law
  • Ensure data protection within criminal investigations and law enforcement action

As the Financial Times has pointed out, the most recent UK data protection legislation (the Data Protection Act 1998) was written before the time of Facebook, Instagram and Uber. With the UK eager to retain its position as a leader in the G20 internet economy, legislation addressing technological advances is essential.

How will the UK Data Protection Bill differ from the GDPR?

The Bill will enforce the GDPR, but will also add extra stipulations and offences. Some new offences will carry an unlimited maximum fine. These include:

  • “altering records with intent to prevent disclosure following a subject access request” – applicable to all data controllers or processors;
  • “intentionally or recklessly re-identifying individuals from anonymised or pseudonymised data” – also applicable to those knowingly handling or processing such data (see the sketch below for what pseudonymisation typically involves).
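For context, pseudonymisation replaces direct identifiers with artificial values that can only be linked back to an individual using separately held information; re-identification is the reversal of that process. A minimal, hypothetical sketch, assuming a keyed hash over a single identifier (the field names and key handling are illustrative, not drawn from the Bill):

```python
import hashlib
import hmac

# Secret key held separately from the pseudonymised data set.
# If it leaks, records can be re-identified - the risk the new offence targets.
SECRET_KEY = b"keep-me-in-a-separate-key-store"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"ni_number": "QQ123456C", "postcode": "SW1A 1AA", "salary": 41000}
pseudonymised = {**record, "ni_number": pseudonymise(record["ni_number"])}
print(pseudonymised)
```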

Why is the UK Data Protection Bill different from the GDPR?

The statement of intent repeatedly reinforces the strength of the UK’s digital economy and its reputation for strong data protection standards. The legislation is part of the government’s wider digital strategy to aid the growth and development of the UK’s digital economy. This will be achieved by ensuring that businesses and the government can use data in “innovative and effective” ways:

  • implementing strong data infrastructure,
  • having a high level of regulatory compliance,
  • developing a data-literate workforce,
  • advancing data skills.

The Bill’s ultimate goal will be the implementation of three interconnected objectives: maintaining trust, supporting future trade and ensuring security.

“Our data economy will be integral to the UK’s growth and future prosperity.
Analysis predicts that data will benefit the UK economy by up to £241 billion between 2015 and 2020…
Our vision is to make the UK the safest place to live and do business online.”

Like this post? Subscribe to our weekly newsletter here to be updated with all news on privacy, Cognitiv+ and more.

The Privacy Post #17

Georgia Wright Privacy

Every week we’ll be rounding up the latest news in privacy and other interesting articles.
UK Home Secretary supports back doors while claiming ‘real people’ don’t need end-to-end encryption

In the campaign for access to encrypted messages, UK Home Secretary Amber Rudd has claimed that “real people often prefer ease of use and a multitude of features to perfect, unbreakable security.” She has further claimed that end-to-end encryption is causing problems by hampering anti-terrorism efforts. This rhetoric has been heavily criticised.

Apple files patent for screen with privacy-viewing options

A new patent application by Apple entitled “Displays With Adjustable Angles-of-View” is aimed at increasing the privacy of tablet and phone users. By utilising an electrically adjustable lens array to modify backlight illumination, Apple hopes to restrict the display’s angle of view. Read the application here.

EU privacy watchdog: Privacy Shield should be temporary

The EU-US data transfer deal will be discussed later this month in Washington. Privacy watchdogs and experts from the European Commission will assess whether Privacy Shield is doing enough to protect EU citizens in light of controversial US surveillance laws and intelligence-sharing agreements.

“In my view it’s an interim instrument for the short term. Something more robust needs to be conceived,”
– Giovanni Buttarelli, European Data Protection Supervisor

Update on FCC Privacy Rules

The Federal Communications Commission’s new order of 26 June 2017 means that the Code of Federal Regulations will be updated to cater for the reinstatement of the pre-2016 Privacy Order rules.

Like this post? Subscribe to our weekly newsletter here to be updated with all news on privacy, Cognitiv+ and more.

Big data analytics: the 9 data protection problems concerning the ICO

Georgia Wright Articles

Earlier this year, the Information Commissioner’s Office (ICO) described the combination of big data, artificial intelligence and machine learning as ‘big data analytics’. The ICO also marked this trio as distinct from traditional data processing. But which aspects of this distinction concern the ICO?

The March 2017 report, ‘Big data, artificial intelligence, machine learning and data protection’, sets out guidance and recommendations to address the privacy concerns of this type of data processing. It identifies nine areas in which data protection is threatened by the use of artificial intelligence to process big data.

1. Big data analytics must be fair

The requirement for data processing to be fair means that the effects on data subjects must be limited and unobtrusive. For example, in 2015 a female doctor was denied entry to a female gym changing room because the automated security system had assumed she was male (on account of her ‘Dr’ title). It is this sort of processing that unfairly discriminates against data subjects.
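A toy sketch of the kind of naive rule that can produce this sort of unfairness; the lookup table and function are illustrative assumptions, not the gym system’s actual logic:

```python
# Hypothetical title-to-gender lookup - exactly the kind of proxy that misfires.
TITLE_TO_GENDER = {"Mr": "male", "Mrs": "female", "Miss": "female", "Ms": "female", "Dr": "male"}

def changing_room_access(title: str, room: str) -> bool:
    """Grant access only if the gender inferred from the title matches the room."""
    inferred = TITLE_TO_GENDER.get(title, "unknown")
    return inferred == room  # rooms are labelled "male" / "female"

print(changing_room_access("Dr", "female"))  # False - the doctor is wrongly locked out
```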

“assessing fairness also involves looking at the effects of the processing on individuals, and their expectations as to how their data will be used”

The complex nature of processing might also impact the transparency requirement of the Data Protection Act (DPA) – a fair processing notice is recommended. Making consumers aware of how and when their data might be collected for processing will help build trust between businesses and consumers.

2. Permission to process

Big data inherently comes paired with consent issues – the General Data Protection Regulation (GDPR) will require “unambiguous” consent. The consent must be given in a “clear affirmative action”. This is no small feat when taking into consideration how many subjects might be involved and how the complex processing might be explained to them.

Graduated consent could provide an innovative solution here. Allowing subjects to opt in as data is collected throughout the relationship between service provider and consumer overcomes the binary nature of ‘opt in or not at all’ forms. At the exact point when an app wants to share information with a third party, the user can be given a ‘just in time’ notification to gain their consent. This targeted consent will likely be better informed too.
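A minimal sketch of what a ‘just in time’ consent check might look like in application code; the function names and in-memory consent store are illustrative assumptions rather than anything prescribed by the ICO or the GDPR:

```python
from datetime import datetime, timezone

# Illustrative in-memory consent store: (user_id, purpose) -> granted?
consent_store = {}

def ask_user(purpose: str) -> bool:
    """Stand-in for the just-in-time prompt shown at the moment of sharing."""
    answer = input(f"Share your data for '{purpose}'? [y/N] ")
    return answer.strip().lower() == "y"

def share_with_third_party(user_id: str, payload: dict, purpose: str) -> bool:
    """Only pass data on if the user has consented to this specific purpose."""
    key = (user_id, purpose)
    if key not in consent_store:
        # Ask at the exact point the data would leave the service.
        consent_store[key] = ask_user(purpose)
    if not consent_store[key]:
        return False  # no consent, no sharing
    payload = {**payload, "consented_at": datetime.now(timezone.utc).isoformat()}
    # ... hand payload to the third party here ...
    return True
```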

3. Purpose limitation

Data protection principles require that any further processing (which is not directly consented to) must not be incompatible with the original purpose. Big data analytics often leads to the finding of unexpected correlations – this may in turn lead to data being used for new purposes.

4. Holding onto data

The concept of data minimisation underpins data protection legislation. However, when artificial intelligence is applied to data, the scope of analysis is usually much greater – why analyse a sample of a data set when you could easily analyse it all?

“in a study of businesses in the UK, France and Germany, 72% said they had gathered data they did not subsequently use”

5. Accuracy

When data sets are large, incorrect data might be passed over or dismissed. Big data might also fail to represent the general population: holding all of a data set does not mean that certain groups were not excluded or underrepresented when it was collected. Finally, hidden biases might carry through into the results of big data analysis, and applying those results to individuals in order to profile them might lead to inaccurate assumptions about them.

6. Individual rights: data accessibility

The characteristics that make big data valuable – its volume, variety and complexity – once again pose data protection implications. The DPA requires that individuals should be allowed to access their personal data. However, there is one positive outcome noted here: if organisations make the move to big data, they might undertake the process of bringing together disparate data stores. This could make it easier to locate data on an individual in the event of a subject access request.
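A minimal sketch of why consolidation helps: once formerly separate stores can be queried through one interface, a subject access request reduces to a single lookup. The store names and record shapes below are illustrative assumptions:

```python
# Hypothetical data stores that have been brought together behind one interface.
stores = {
    "crm": [{"subject_id": "u42", "email": "jane@example.com"}],
    "billing": [{"subject_id": "u42", "last_invoice": "2017-08-01"}],
    "support": [{"subject_id": "u7", "ticket": "login problem"}],
}

def subject_access_request(subject_id: str) -> dict:
    """Collect every record held on one individual across all stores."""
    return {
        name: [r for r in records if r.get("subject_id") == subject_id]
        for name, records in stores.items()
    }

print(subject_access_request("u42"))
```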

7. Security measures and risks

Whilst the ICO positions big data analytics as a useful tool for analysing security risks, it also highlights the drawbacks. Large data sets and the nature of big data processing can throw up specific information security threats.

8. Accountability

The GDPR contains several additional provisions promoting accountability. The context of big data processing (in that it can be experimental, without a defined hypothesis or business need) might cause problems when complying with these provisions. For example, organisations of over 250 employees must maintain records of data processing activities. Additionally, erroneous algorithmic decisions based on biased profiling throw up accountability issues.
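To make the record-keeping obligation concrete, here is a sketch of the kind of information a record of processing activities typically captures; the field names are assumptions loosely modelled on Article 30 of the GDPR, not a prescribed schema:

```python
# Hypothetical record of a single processing activity (GDPR Article 30 style).
processing_record = {
    "controller": "Example Analytics Ltd",
    "purpose": "credit-risk scoring of loan applicants",
    "categories_of_data_subjects": ["loan applicants"],
    "categories_of_personal_data": ["income", "address history", "repayment history"],
    "recipients": ["external credit reference agency"],
    "retention_period": "6 years after account closure",
    "security_measures": ["encryption at rest", "role-based access control"],
}
```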

9. Controllers and processors

If artificial intelligence is analysing the data, who is processing it? The question isn’t as rhetorical as it appears – the difficulty lies in establishing whether a third-party provider of artificial intelligence is the processor or the controller.

“in a forthcoming article on the transfer of data from the Royal Free London NHS Foundation Trust to Google DeepMind, Julia Powles argues that, despite assertions to the contrary, DeepMind is actually a joint data controller as opposed to a data processor”

If the analytics provider has the power to decide what data to collect and how to apply analytical techniques on behalf of another, it is likely to be the data controller as well as the processor.

Like this post? Subscribe to our weekly newsletter here to be updated with all news on privacy, Cognitiv+ and more.

The Privacy Post #16

Georgia Wright Privacy

Every week we’ll be rounding up the latest news in privacy and other interesting articles.
Clouds linger over troubled transatlantic data-transfer deal

The transatlantic privacy deal, Privacy Shield, has been called out as invalid by Human Rights Watch. In a letter to the European Commission, the rights group has stated that US surveillance laws infringe the privacy rights of European citizens. These laws, as well as US intelligence-sharing arrangements, fall short of European privacy standards. Read the full letter here.

“There’s no way to get around the fact that US laws and policies allow abusive monitoring and need to be drastically overhauled before they can meet human rights standards,”

– Maria McFarland Sánchez-Moreno, co-director of the US Program at Human Rights Watch.

Trump voting commission wins right to collect state voter data

The Presidential Advisory Commission on Election Integrity has been allowed to collect data from states concerning their registered voters’ full names, political affiliation, criminal records, addresses, dates of birth and whether they have voted in past elections. The Electronic Privacy Information Center had challenged this move in a lawsuit. President Trump established the commission ostensibly to seek out voter fraud.

Privacy Crackdown: Russia Bans VPNs And Apple ‘Sides With Censorship’ In China

Virtual private networks (VPNs) are banned in Russia from 1 November 2017. VPNs have let citizens access banned websites by routing their traffic through a secure server; the connection between the user and the VPN server is encrypted, which allows private browsing.
Apple has also removed VPN apps from its Chinese app store, a move which has seen the company accused of siding with censorship.

PNR: EU Court rules that draft EU/Canada air passenger data deal is unacceptable

The Court of Justice of the European Union has confirmed that the EU/Canada deal on collection and sharing of air travellers’ data breaches European law.

Like this post? Subscribe to our weekly newsletter here to be updated with all news on privacy, Cognitiv+ and more.

UK parliament calls for information on AI

Georgia Wright Articles

The UK Parliament has invited contributions to its investigation into the economic, ethical and social implications of advances in artificial intelligence. The inquiry plans to analyse the risks and benefits of artificial intelligence’s effects on society whilst also searching for pragmatic solutions. The deadline for the submission of written evidence is 6 September 2017.

“The Committee wants to use this inquiry to understand what opportunities exist for society in the development and use of artificial intelligence, as well as what risks there might be.”

– Lord Clement-Jones, Chairman of the Select Committee on Artificial Intelligence.

The inquiry comes after the White House’s report on the future of artificial intelligence in October 2016. The US report reinforces the need for a regulatory environment which is safe yet stimulating and makes recommendations for effective retraining of workers. It has also pushed MPs to call for a new UK artificial intelligence commission. Read more about the history of UK and US policies for innovation in artificial intelligence here.

The House of Lords Select Committee on Artificial Intelligence will consider:

  • What is the current state of artificial intelligence? How is it likely to develop over the next 5, 10 and 20 years?
  • Is the current level of excitement surrounding artificial intelligence warranted?
  • How can the general public best be prepared for more widespread use of artificial intelligence?
  • Who in society is gaining the most from the development and use of artificial intelligence? Who is gaining the least?
  • Should the public’s understanding of, and engagement with, artificial intelligence be improved?
  • What are the key industry sectors that stand to benefit from the development and use of artificial intelligence?
  • How can the data-based monopolies of some large corporations, and the ‘winner-takes-all’ economics associated with them, be addressed?
  • What are the ethical implications of the development and use of artificial intelligence?
  • In what situations is a relative lack of transparency in artificial intelligence systems (so-called ‘black boxing’) acceptable?
  • What role should the Government take in the development and use of artificial intelligence in the UK?
  • Should artificial intelligence be regulated?
  • What lessons can be learnt from other countries or international organisations in their policy approach to artificial intelligence?

Find out how to submit information here.

Like this post? Subscribe to our weekly newsletter here to be updated with all news on privacy, Cognitiv+ and more.

The Privacy Post #15

Georgia Wright Privacy

Every week we’ll be rounding up the latest news in privacy and other interesting articles.
French court refers Google privacy case to ECJ

France’s Conseil d’État, the country’s highest administrative court, has asked European judges to consider whether search engines should apply the “right to be forgotten” globally. This means that search providers such as Google or Bing could have to exclude search results from non-European domains, for example Google.com. The case was initiated by Google following a €100,000 fine from France’s privacy watchdog, the Commission Nationale de l’Informatique et des Libertés. The penalty was imposed for not applying the “right to be forgotten” across its global domains. The European Court of Justice is expected to rule in 2019.

“We look forward to making our case at the European Court of Justice,”
– Peter Fleischer, Google’s global privacy counsel.

Silicon Valley mostly quiet in internet surveillance debate in Congress

Major technology firms have been surprisingly absent from the debate surrounding the Foreign Intelligence Surveillance Act (FISA) in Washington. Facebook, Google’s Alphabet and Apple have not expressed opinions on the renewal of the controversial US internet surveillance law, known as Section 702. It allows the US National Security Agency to collect and analyse the digital communications of foreigners overseas, although it is thought that data on Americans has also been collected without a warrant. The tech giants are supposedly quiet due to Section 702’s link to the ongoing debate about Privacy Shield in Europe. Read more about Privacy Shield here.

Privacy pros in the know aren’t waiting for Brexit – they’re preparing for GDPR

A study prepared by the International Association of Privacy Professionals (IAPP) has surveyed privacy professionals to reveal their plans ahead of Brexit. In spite of the fact that the UK may have to implement its own data protection laws following Brexit, privacy professionals’ primary focus is to demonstrate compliance with the General Data Protection Regulation (GDPR) from the outset. The survey has additionally found that:

  • 94% of those surveyed are preparing to comply with the GDPR.
  • 66% are implementing a new internal privacy accountability framework.
  • 58% are investing in privacy training for employees.

FBI warns parents of privacy risks associated with internet-connected toys

The FBI has issued a warning on its website, advising parents of the risks of smart toys. These toys employ new technologies in order to learn and adapt their behaviour according to user interactions. Internet-connected toys may pose privacy and safety risks if they have capabilities such as microphones, cameras, GPS, data storage and speech recognition.

Like this post? Subscribe to our weekly newsletter here to be updated with all news on privacy, Cognitiv+ and more.