Video: Harmonisation and supervision
Panel discussion from Nordic Privacy Arena 2018. With Magnus Sjögren, Bisnode, Eija Warma, Castrén & Snellman, Albine Vincent, CNIL, and Viljar Peep, The Estonian Ministry of Justice.
Video: Online advertising and privacy
Online advertising – What’s wrong and how do we fix it? Keynote from Nordic Privacy Arena 2018 by Raegan MacDonald, head of EU public policy at Mozilla.
The keynote was followed by a panel discussion with Anna Colaps, EDPS, Per Meyerdierks, Google, Annie Ståhl, Schibsted, and Daniel Westman, independent researcher and advisor.
Call for speakers – Nordic Privacy Arena 2019
Want to present at Nordic Privacy Arena 2019 (Stockholm, 23-24 September)? Let us know by filling out the form below. Please provide your name, organization, contact information, and suggested topic.
Click here for information about and photos from Nordic Privacy Arena 2018.
Nordic Privacy Arena – Day Two
Nordic Privacy Arena – Day One
Around 400 data protection experts gathered at Münchenbryggeriet in Stockholm on Monday morning. The Swedish Data Protection Forum would like to thank all of you – attendees, moderators and speakers, sponsors, and staff – for making this possible.
During the first day of the conference, we welcomed the following keynote speakers:
Sarah van Hecke, Legal Design Adviser, Houthoff
Anne Berner, Minister of Transport and Communications, Finland
Raegan MacDonald, Head of EU public policy, Mozilla
Robert Bond, Partner, Bristows
Monday also saw panel discussions on harmonisation, ad tech, consumer attitudes towards data processing, data transfers, regulation of online platforms and the job market for privacy professionals.
Selected speeches will be published shortly. Tomorrow, we’ll explore some of these themes more in-depth. We’ll also discuss health data in the insurance industry, breach reporting and loyalty schemes, among other things.
Survey: Salaries for Swedish data protection specialists
Earlier this year, The Swedish Data Protection Forum and HR Commitment gathered statistics on salaries among Swedish data protection specialists. We also asked the 172 respondents about work-life balance, whether they were satisfied with their salaries, whether they were planning to change jobs, and whether they had been contacted by recruiters in the past six months.
- 50% say they have been contacted by recruiters during the past six months
- Women earn 51 380 SEK on average, men 55 082 SEK
- About a third are dissatisfied with their salaries
- Demand for privacy professionals is increasing
- Almost half of the respondents working in the private sector state that their employer plans to recruit more privacy professionals in the coming 12 months
172 privacy professionals answered the survey, which was carried out in the spring of 2018.
He is this year’s dinner speaker
Aral Balkan is the designer, programmer, and activist behind the tracker blocker Better Blocker. He has written a manifesto for ethical design, founded the Indienet initiative, and campaigns for, among other things, reduced surveillance online.
Balkan has previously spoken at conferences such as The Conference, Thinking Digital, The Next Web Conference, Future Fest, and TED. On Monday, he will speak to around 400 Nordic data protection experts at Nordic Privacy Arena.
“Current privacy communication is not working”
Helena Haapio is an Associate Professor at the University of Vaasa, where she teaches strategic business law. She wrote her dissertation on Next Generation Contracts and has contributed to Stanford’s Legal Design Lab’s library of Privacy Design Patterns, which covers examples of visual and interactive mechanisms for making privacy policies understandable. She is also a contract counsel – or contract innovator – at her own company, Lexpert.
– I used to work as an in-house counsel and draft complex contracts – now I find myself simplifying them, says Haapio.
– Not just drafting, but also restructuring and designing: working as part of a lawyer-designer team transforming contracting processes and documents, requests for proposals, legal guidebooks, and such, converting these from text-heavy documents gathering dust in legal compliance binders to interactive business tools and toolboxes people actually use.
Helena Haapio will give a keynote speech on Legal Design and privacy communication at Nordic Privacy Arena 2018. She will also represent the Legal Design Alliance during the pitching round later in the afternoon, which also includes speakers from Facebook, PrivacyAnt, Claudette, Synch and Signatu.
What is Legal Design, and how is it relevant for privacy pros?
– Legal Design is an umbrella term for merging forward-looking legal thinking with design thinking. It puts the user at the center and applies human-centered design to prevent or solve legal problems, says Haapio.
– Let’s face it: whether we talk about privacy or other topics, the people who use legal information, documents, services, and policies are not being served well by their current design. Legal Design seeks to change this. So where privacy pros seek to provide better legal communications, systems, or solutions, they can find allies in the emerging field of Legal Design. In many areas of Legal Tech and privacy, the design choices made can have a huge impact on both the processes and systems as well as their outputs.
– There is a growing need for easy-to-use solutions, on the one hand, and for protecting the users, on the other. Legal designers can bring a new perspective by identifying and making different expectations and requirements visible early on, helping embed them from the beginning into the design specifications, building in navigation tools, affordances and signifiers, and asking questions such as: How can we make privacy communication work better? How can we visualize and simplify things? How can we secure successful implementation?
What are Legal Design patterns?
– Like engineers and architects, we lawyers, too, want to make something useful and usable for our clients. We looked at the work of engineers and architects for inspiration – and what do they do? They look for patterns. Software engineers as well as UI and UX designers collect patterns and create pattern libraries: common solutions to recurring problems.
– So we borrowed from them the idea of design patterns and started to create our own pattern library. Our very first was a collection of design patterns for contracts. We wanted not just to improve the content of contracts, but also their communication and presentation, with a focus on the needs of business users, successful business outcomes, and the avoidance of misunderstandings and disputes.
– Legal design patterns provide a way to learn from each other and other fields and to communicate complex legal information. They are not templates or strict recipes intended to be copy-pasted. Rather, they are adaptable structures that can be re-purposed and re-mixed to serve specific challenges. We expanded the idea from contracts to other domains, including privacy.
Can you give an example of legal design being put to good use in a data protection context?
– The UK startup Juro did a remarkable job creating a privacy notice that people actually read and understand. They used several legal design patterns to gain and retain the reader’s attention, communicate as clearly as possible, and avoid information overload.
– They also applied other design patterns, such as accordion, showing an overview first and details on demand, letting readers drill down later, if they want to. They used conversational language, framing headings as questions users may have, borrowing techniques from the FAQ genre. They used floating menus and visual design patterns, such as a timeline, to map out all the privacy-sensitive interactions between the user and Juro, making the whole process more tangible and transparent. So they used a number of well thought-out legal design solutions that we will hopefully see more of.
What will be the main focus of your keynote speech at Nordic Privacy Arena?
– From many contexts we know that current privacy communication is not working. Most privacy notices are written by lawyers for lawyers: way too long, overly legalistic, uninformative, and unhelpful. My keynote will be on Legal Design for better privacy communication. I will introduce some of the work we and others have done to make legal information work better and be easier to prepare. I will share examples of how different design patterns, especially visual ones, can help people navigate and make sense of complex messages. The goal of visualization, however, is not to generate visuals but to generate understanding.
– I will encourage the attendees to see their policies and terms through the users’ eyes, and, where needed, simplify content and how it is presented. I hope they will be convinced of the opportunities offered by simplification and visualization, explore resources in our design pattern libraries, start experimenting with new designs, and share new and improved design patterns with the community. If they find this the way to go, I will encourage them to join the recently launched Legal Design Alliance, LeDA: a network of lawyers, designers, technologists, academics, and other professionals who are committed to making the legal system more human-centered and effective, through the use of design.
Secretary-General, The Swedish Data Protection Forum
Are you a student with a documented interest in data protection issues?
We are giving away a number of tickets to Nordic Privacy Arena (Stockholm, 12-13 November). Send a short motivation to email@example.com.
Speaker profile: Max Bleyleben, SuperAwesome
170 000 kids go online for the first time every day. Most of them do it to watch videos, typically on YouTube – many of the most popular YouTube channels are aimed at younger audiences. They also watch ads, read comments and create content of their own.
SuperAwesome, one of the fastest growing companies in Europe, provides “kid-safe” tools that help content creators and advertisers communicate with half a billion children every month.
These tools, used by customers such as Activision, Hasbro and Nintendo, include AwesomeAds, an advertising platform, PopJam, social content tools for engagement with children and Kids Web Services, developer tools and a compliance platform for creating kid-safe apps and sites. SuperAwesome has also launched Kidfluencer YouTube Network, a program with best practices, standards and certifications for YouTubers with young audiences.
The idea is to build a “Zero Data Internet” and to outperform other platforms while guaranteeing privacy compliance, the latter achieved through contextual advertising and a mix of machine learning-based and manual content review.
Max Bleyleben, Managing Director and CPO, will give a keynote speech on ad tech and children at Nordic Privacy Arena (Stockholm 12-13 November 2018).
Bleyleben helped scale high-growth tech companies during the first Internet boom. He has since worked as an investor focusing on Internet companies, and co-founded Beamly, a social TV platform backed by Viacom, Comcast and Sky.
– I joined SuperAwesome in January 2015 with a specific remit to scale the business globally. I was excited by the market opportunity – there had been so little investment in making the Internet appropriate for kids, even though they are its fastest-growing user segment, says Bleyleben.
– With kids’ linear TV in continuing decline, brands are having to shift more of their marketing budgets to digital, and yet there are few large-scale, compliant options for them to do so. SuperAwesome was then the leading engagement platform for brands and content owners to interact with kids online at scale, and fully in compliance with data privacy laws.
How does contextual advertising work, and what is “Zero Data”?
– ‘Zero-data’ is what we call it when brands engage with kids through our platform. In fact, we believe all kids’ digital content and advertising should be ‘zero-data’. This means delivering content, enabling engagement, and delivering advertising without collecting any personal information.
– It is simply not necessary to collect IP addresses, device IDs, geo-location and similar data to engage meaningfully with kids online. But all the existing internet platforms collect this data as a matter of course. In fact, we estimate that by the time children are 12 years old some 72m data points about them will have been collected and stored and used to create targetable profiles. We aim to change that by building the zero-data Internet.
– Contextual advertising, without profile-based targeting, is extremely effective in the kids’ market. This is because kids use the Internet in a fundamentally different way than adults. Kids go straight to the site or app they want, and stay there until the iPad is pulled from their hands. There is no need to follow them around. The way to find the kids’ audience online is to understand their likes, current trends, the hottest games, etc. Our system builds highly accurate contextual audiences for our clients from a deep understanding of the best kids’ publishers. This approach to targeting yields incredible performance in kids’ digital engagement.
What is your position on the “age limits” for consent? Are platforms doing enough to verify identities (or rather, ages)?
– We believe that zero-data should be the standard for all kids’ and teens’ digital services. It has less to do with a child’s ability to give consent, and more to do with simply protecting people’s privacy for as long as possible. Today the most commonly used kids’ platforms – social media – are not doing enough to verify age and apply the right standard. They are fully aware of the kids on their platforms and could do a lot more to protect them, for example by switching off profile data collection on channels or pages that obviously carry kids’ content.
– For that we don’t even need fancy new age verification technologies, adds Bleyleben.
– But it does require the social media platforms to acknowledge that they have some understanding of the content they distribute.
How does one explain the ramifications of agreeing to processing of personal data to a teenager?
– How do you explain them effectively to an adult? Very poorly at the moment. We all need to get better at explaining and – ideally – at reducing the amount of data processing to an absolute minimum, as GDPR and GDPR-K require.
In your view, have the intentions behind the data protection reform been realized?
– Not yet, but we can see many organisations making a strong effort to comply with GDPR-K. We have seen a significant increase in interest in learning about the zero-data Internet, and in our solutions for compliant engagement with kids. Many gaps remain, however, including the need for more guidance from regulators on, for example, the definition of a ‘child-directed’ service, and on which age verification approaches are acceptable in light of the data minimisation principle.
– But what gives me optimism is that nearly everyone in the kids’ industry is aware of the principles underlying GDPR-K, and that is generating a different approach to building kids’ services – with data minimisation as a baseline.
You’re reviewing content both manually and with AI. How advanced have the algorithms become? And on the flip side, is it possible for big companies to review content and stay compliant with the GDPR without ML-based tools?
– Content moderation remains a big challenge for kids’ digital services. GDPR-K does not really affect it either way, at least until its provisions reduce the exposure of kids to adult platforms.
– Moderation can never rely on technology alone, even with the best AI. This is because effective moderation is not just about eliminating inappropriate content. Effective moderation requires community management: creating, developing, and promoting healthy communities where people interact in an appropriate manner. All our moderated platforms combine AI-based tools – which eliminate inappropriate content, flag risk factors, and help build trust scores for users – with human moderators and community managers who are present and active in the kids’ community to promote good behaviour. Only this combined approach is ultimately effective.
What do you think about Facebook’s and Google’s practices when it comes to advertising, profiling and tracking? Is the data protection reform enough, or do you expect further reforms?
– In theory, the principles and provisions of GDPR-K should be sufficient to protect kids from the practices of the large internet platforms. In practice, so far, they have been insufficient because the platforms’ approach seems to be to comply with only the minimum requirements as interpreted by them, which flouts many of the principles.
– If GDPR-K were properly observed, these platforms would switch off user profiling and personal data collection on content which they know to be consumed by kids. And they would invest in and innovate around improved age verification for the portions of their services which are meant to be for over-13s only. We have yet to see significant efforts by them to put GDPR-K into practice, and I would expect the outcome of civil lawsuits or regulator investigations to help crystallise this issue.
Secretary-General, The Swedish Data Protection Forum