Holistic Approach To Privacy and Security in Tech: Key Principles
This article explains how I address privacy and security concerns specific to large web and mobile applications, Big Tech, and social networks.
When I’m asked about privacy and security issues that the IT industry faces today, the most generic idea that comes up is “do everything with privacy in mind.” This principle can be applied to any stage and part of the development process, and the best practice is to apply it as early as possible.
In this article, I would like to explain how I tackle privacy and security issues that are specific to large-scale web and mobile applications and Big Tech.
First, let’s outline some of the biggest challenges Big Tech companies deal with in terms of privacy and security.
Security and Privacy Challenges of Big Tech
Some of these challenges, such as data privacy and cybersecurity, are common to almost any IT project. Others have a distinct Big Tech flavor, regulatory scrutiny being one example. Big Tech companies accumulate immense amounts of valuable data, so privacy concerns shared by both users and government agencies are amplified. It is vital that companies address these concerns proactively, and as a developer, it is my task to think about what solutions we can offer. Another serious challenge I have come across in my line of work is recovering from the consequences of past practices that do not meet today’s standards. Such cases in Big Tech have received a lot of media coverage and public outcry, and they can damage user trust. It is also a developer’s task to create user flows that are transparent and safe enough that users feel confident their data is not going to be manipulated or mishandled.
I would say that all security and privacy problems can be divided into two large groups by point of view: end-user issues and company issues. Let’s take a closer look at each side.
Privacy and Security for End-Users
There are several problems I can define when we need to deliver a solution that satisfies high security and privacy standards for end-users. When people use a large-scale social media platform or shop online, they share sensitive data, and it is vital that they are aware of where it is stored and how it is used. User activity data, collected en masse, is a very valuable asset: on the one hand, companies want this data; on the other hand, they want users to feel secure enough to share it safely. Here are some solutions I have worked with that address this problem:
Offer Your Users a Way to Opt Out of Unwanted Activities
When working on development projects for big social networks, the important thing is to introduce opt-out mechanisms that users can always fall back on if they feel uncomfortable providing certain information about themselves or if they change their mind later. So, whenever your users perform actions in your app that require providing information about themselves, you need to offer several options. For example, if your project features geographically targeted ads, your users should make an informed and explicit choice about allowing your application to remember and use their geolocation data. A checkbox where they can select or deselect ‘allow the application to use my geolocation data’ works well in such cases. The most important aspect here is to give your users the opportunity to refuse to submit their geolocation data and still be able to use your application.
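To make this concrete, here is a minimal TypeScript sketch of such an explicit opt-in, assuming a simple in-memory consent store; the names (ConsentState, selectAds, and so on) are hypothetical and only illustrate the pattern of degrading gracefully to non-targeted ads when consent is withheld.

```typescript
// Minimal sketch of an explicit geolocation opt-in (hypothetical names throughout).
// The key property: declining consent never blocks the user from the core feature.

interface ConsentState {
  geolocationAds: boolean; // explicit, user-controlled flag; defaults to false
  updatedAt: Date;
}

// In-memory store stands in for whatever persistence layer the app uses.
const consentByUser = new Map<string, ConsentState>();

function setGeolocationConsent(userId: string, allowed: boolean): void {
  consentByUser.set(userId, { geolocationAds: allowed, updatedAt: new Date() });
}

// Ad selection falls back to non-targeted ads when consent is absent or withdrawn.
function selectAds(userId: string, coords?: { lat: number; lon: number }): string {
  const consent = consentByUser.get(userId);
  if (consent?.geolocationAds && coords) {
    return `geo-targeted ads near ${coords.lat.toFixed(1)},${coords.lon.toFixed(1)}`;
  }
  return "generic ads (no location used)";
}

// Usage: the user deselects the checkbox and still gets a working app.
setGeolocationConsent("user-42", false);
console.log(selectAds("user-42", { lat: 52.5, lon: 13.4 })); // "generic ads (no location used)"
```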
I also consider it a generally positive user experience when an app informs the user about what happens with their data. A user has to make educated choices when being active on social media, and I, as a developer, have to provide them with the tools to make these choices.
The opt-out tools in your app should also allow users to manage personal data that is already publicly available on the Internet through your web or mobile application and to edit the permissions on it accordingly. For example, if a user posts images through your application, they should be able to change, at any given time, who can see their images, comment on them, view their metadata, or tag them in their own images.
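As an illustration, the following sketch models per-post permissions that the owner can edit at any time after publishing; the types and fields (Audience, PostPermissions, viewMetadata) are assumptions for the example, not any specific platform’s API.

```typescript
// Hypothetical per-post permission model; field names are illustrative only.
type Audience = "public" | "friends" | "only_me";

interface PostPermissions {
  view: Audience;
  comment: Audience;
  viewMetadata: Audience; // e.g. EXIF/location data attached to an image
  allowTagging: boolean;
}

interface Post {
  id: string;
  ownerId: string;
  permissions: PostPermissions;
}

// Owners can tighten (or loosen) permissions at any time after publishing.
function updatePermissions(
  post: Post,
  requesterId: string,
  changes: Partial<PostPermissions>
): Post {
  if (requesterId !== post.ownerId) {
    throw new Error("only the owner may edit permissions");
  }
  return { ...post, permissions: { ...post.permissions, ...changes } };
}

const photo: Post = {
  id: "p1",
  ownerId: "user-42",
  permissions: { view: "public", comment: "public", viewMetadata: "only_me", allowTagging: true },
};

// Later, the user restricts comments and disables tagging without deleting the post.
const restricted = updatePermissions(photo, "user-42", { comment: "friends", allowTagging: false });
console.log(restricted.permissions);
```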
Introducing and sharing unambiguous privacy policies that cover all these points is essential. It may also be good practice to prevent users from further use of the application, or of a specific feature, until they confirm acceptance of such a privacy policy.
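One way to express such a gate is sketched below, under the assumption that the app tracks an accepted policy version per user; the version strings and helper names are purely illustrative.

```typescript
// Sketch of gating a feature behind acceptance of the current policy version.
// The version string and the in-memory store are assumptions for illustration.
const CURRENT_POLICY_VERSION = "2024-06";

const acceptedPolicyVersion = new Map<string, string>(); // userId -> accepted version

function recordAcceptance(userId: string, version: string): void {
  acceptedPolicyVersion.set(userId, version);
}

function canUseFeature(userId: string): boolean {
  return acceptedPolicyVersion.get(userId) === CURRENT_POLICY_VERSION;
}

recordAcceptance("user-42", "2023-01");  // accepted an older policy
console.log(canUseFeature("user-42"));   // false -> prompt the user to re-accept
recordAcceptance("user-42", CURRENT_POLICY_VERSION);
console.log(canUseFeature("user-42"));   // true
```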
Provide Users With a Certain Level of Control Over Shared Data
End-users do not want their data leaked, their activity closely monitored, or their opinions manipulated. For one of our Big Tech customers, we work on tools that allow their users to control, and even completely disconnect, information about their activities on third-party websites and apps. This way, they can limit the data that is used for the precisely targeted ads they see.
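Below is a hedged sketch of how such a disconnect switch could work; the event shape and the idea of filtering third-party signals out of the targeting pipeline are assumptions for illustration, not a description of any particular company’s implementation.

```typescript
// Illustrative sketch: a user-level switch that excludes off-platform activity
// from ad targeting; event shape and names are assumptions.
interface OffPlatformEvent {
  userId: string;
  source: string; // e.g. a third-party site or app
  action: string;
}

const offPlatformSharingDisabled = new Set<string>(); // userIds who opted out

function disconnectOffPlatformActivity(userId: string): void {
  offPlatformSharingDisabled.add(userId);
}

// The targeting pipeline drops third-party signals for opted-out users.
function targetingSignals(userId: string, events: OffPlatformEvent[]): OffPlatformEvent[] {
  if (offPlatformSharingDisabled.has(userId)) return [];
  return events.filter((e) => e.userId === userId);
}

disconnectOffPlatformActivity("user-42");
console.log(
  targetingSignals("user-42", [{ userId: "user-42", source: "shop.example", action: "purchase" }])
); // []
```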
For example, with the recent onset of generative AI tools, user-generated data publicly available on the web is often used to train various AI models. You can provide your users with specific controls to remove their information from internal AI training pools if they so choose. You can also add a point to your privacy policy recommending that users review and, if desired, restrict publicly available information so that third-party AIs cannot access that data. This restricts user images, videos, or text posts, especially those with recognizable and distinctive styles, from being used to train generative AI models.
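Here is a small sketch of how an AI-training opt-out flag could be honored when assembling a training pool; the allowAiTraining field and the filtering step are illustrative assumptions.

```typescript
// Sketch of an AI-training opt-out applied when assembling a training pool.
// The flag and filtering step are assumptions about how such a control could work.
interface UserContent {
  userId: string;
  contentId: string;
  allowAiTraining: boolean; // user-controlled; default set by policy
}

function buildTrainingPool(allContent: UserContent[]): UserContent[] {
  // Only content whose owner has not opted out is eligible.
  return allContent.filter((c) => c.allowAiTraining);
}

const pool = buildTrainingPool([
  { userId: "user-42", contentId: "img-1", allowAiTraining: false },
  { userId: "user-7", contentId: "post-9", allowAiTraining: true },
]);
console.log(pool.map((c) => c.contentId)); // ["post-9"]
```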
Privacy and Security for Companies
Enterprise-level privacy and security is more than just the other side of the end-user privacy and security coin when it comes to Big Tech companies. On the one hand, they are responsible for end-users’ privacy and security; on the other hand, they need to address problems such as potential privacy breaches, compliance and its costs, and cybersecurity.
Legacy Code and Risk Mitigation
For example, we have a dedicated team that works on risk mitigation for new features being introduced. Identifying the potential privacy and security gaps that arise from new feature development takes a huge amount of analysis. One big source of potential privacy-breach risk, and an issue many Big Tech companies face, is the quantity of legacy code developers work with on a daily basis. We can’t simply discard or replace all of the legacy code; we have to work around it and with it. As technology constantly evolves, legacy code written years ago may contain vulnerabilities that simply weren’t identified back when it was released. My objective, as a developer, is to proactively look for such loopholes before they get exploited.
I would recommend including regular check-ups of legacy code for vulnerabilities in your day-to-day workflows. When your project runs at a big scale, many people across the team are often unaware that some of the problems they face with legacy code are not unique but recurring. In this case, a dedicated specialist or team who can analyze issue trends will help you pinpoint the problem locations and patterns. You should also carefully document and track such findings and the solutions you come up with, so that teams are educated and proactively use them to build products with the latest privacy best practices in a uniform way. This approach helps you locate potential issues in a timely manner, fix them before they present an opportunity for exploitation, and develop a framework that ensures best practices are built in by default.
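One lightweight way to support that trend analysis, sketched here as an assumption rather than a description of our actual tooling, is a shared findings log that surfaces recurring component-and-pattern combinations.

```typescript
// Sketch of a lightweight findings log used to spot recurring legacy-code issues.
// Fields and the grouping heuristic are assumptions for illustration.
interface Finding {
  component: string; // e.g. "legacy-auth"
  pattern: string;   // e.g. "outdated-crypto", "path-traversal"
  reportedBy: string;
  date: string;
}

function recurringPatterns(findings: Finding[], threshold = 2): Map<string, number> {
  const counts = new Map<string, number>();
  for (const f of findings) {
    const key = `${f.component}:${f.pattern}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  // Keep only patterns reported at least `threshold` times across teams.
  return new Map(Array.from(counts).filter(([, n]) => n >= threshold));
}

const log: Finding[] = [
  { component: "legacy-auth", pattern: "outdated-crypto", reportedBy: "team-a", date: "2024-01-10" },
  { component: "legacy-auth", pattern: "outdated-crypto", reportedBy: "team-b", date: "2024-03-02" },
  { component: "uploads", pattern: "path-traversal", reportedBy: "team-c", date: "2024-02-15" },
];
console.log(recurringPatterns(log)); // Map { "legacy-auth:outdated-crypto" => 2 }
```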
Ease Your Compliance Processes
Legacy code can cause an array of problems. Another issue is that it may not always be compliant with the latest security and privacy requirements from government regulators, such as the GDPR and CCPA. Big Tech companies with international audiences have to comply with regulators worldwide, and developers like me have to think about how our solutions can help companies cut exorbitant compliance costs. We also support FTC interactions by introducing workflows that help FTC assessors streamline review processes, making them quicker and smoother.
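As a rough illustration of the kind of plumbing this compliance work requires, here is a sketch of handling data-subject export and deletion requests; the single-map data model and request kinds are simplifying assumptions, since real deletions must cascade across derived stores and backups.

```typescript
// Minimal sketch of handling data-subject requests (GDPR/CCPA style);
// the store and request shape are assumptions, not a specific company's API.
type RequestKind = "export" | "delete";

interface UserRecord {
  userId: string;
  profile: Record<string, unknown>;
  activity: string[];
}

const userStore = new Map<string, UserRecord>();
userStore.set("user-42", { userId: "user-42", profile: { name: "A." }, activity: ["login", "post"] });

function handleDataSubjectRequest(userId: string, kind: RequestKind): UserRecord | boolean {
  const record = userStore.get(userId);
  if (!record) throw new Error("unknown user");
  if (kind === "export") {
    // Return a machine-readable copy the user can download.
    return { ...record, profile: { ...record.profile }, activity: [...record.activity] };
  }
  // "delete": remove personal data; in practice this must cascade to derived stores and backups.
  return userStore.delete(userId);
}

console.log(handleDataSubjectRequest("user-42", "export"));
console.log(handleDataSubjectRequest("user-42", "delete")); // true
```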
Follow Privacy-by-Design Principles
Big Tech companies and social media platforms are especially vulnerable to reputational damage; data leaks and privacy breaches are always bad publicity. There are several best practices I follow when my team and I are tasked with preventing such incidents at the level of coding and development. First and foremost, we follow the principles of the privacy-by-design approach. In our work for a large social media company, my team and I took part in developing tools that gave users more control over their data through available privacy and download settings. Introducing such tools helps you observe the principle of Visibility and Transparency.
Another recommendation I’d like to emphasize, one that falls in line with the principle of being proactive rather than reactive, is thorough privacy and security testing. Test and analyze any potential privacy risks, create automated user-facing tests to make sure privacy is covered, and use vulnerability assessments and penetration testing to identify potential security gaps and fix them. Creating such QA workflows is vital for adhering to the end-to-end security principle. My team and I worked on implementing privacy best practices across our customer’s development processes and the entire life cycle of data, from acquisition to complete disposal.
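For instance, an automated user-facing privacy test can assert that restricted fields never leak to other viewers; the sketch below uses a stand-in fetchProfileAs function and illustrative visibility rules, not a real test suite.

```typescript
// Sketch of an automated privacy check: verify that data a user has restricted
// never appears in responses served to other viewers. Names are illustrative.
import { strictEqual } from "node:assert";

interface ProfileView {
  userId: string;
  email?: string;
  posts: string[];
}

// Stand-in for the real API call; returns a viewer-specific projection.
function fetchProfileAs(viewerId: string, targetId: string): ProfileView {
  const emailVisibleTo = new Set([targetId]); // email visible only to the owner
  return {
    userId: targetId,
    email: emailVisibleTo.has(viewerId) ? "owner@example.com" : undefined,
    posts: ["public post"],
  };
}

// Automated assertion run in CI: a stranger's view must not contain the email.
const strangerView = fetchProfileAs("user-99", "user-42");
strictEqual(strangerView.email, undefined, "email leaked to non-owner");
console.log("privacy check passed");
```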
Be Proactive in Privacy and Security Issues
I think digital privacy and security is a sphere where development is especially fast, so for me, as a professional, it is important that the solutions I work on always stay a few steps ahead of those who want to profit from security breaches. Big social media platforms store a lot of coveted data, so I can’t rely only on flagship technology to improve security; I need to be proactive about it. That is why I make an effort to research and study not only existing risks but also potential risks, and to think ahead about their mitigation. Working for Big Tech means you are working with huge, distributed, cross-functional teams that make hundreds of commits daily, each of which could hold a potential privacy risk. To stay on top of it all, good communication across the team is essential. For example, I work on educating the members of my team and the teams we work with about the code logic and the commit and verification processes. We analyze and ensure the proper fulfillment of obligations, and together we have created workflows and processes that allow us to be as transparent as possible, introduce successful preventive measures, and establish a strong privacy-built-in mindset.
Key Takeaways and Recommendations
When developing privacy and security solutions for Big Tech companies, I think there are several key points to follow that will enhance development processes and increase efficiency:
- Observe privacy-by-design principles in everything; work with privacy in mind.
- Use a holistic, integral approach to tackle privacy and security issues from both sides: the end users’ and the company’s.
- Be proactive when working with risk mitigation by focusing on incident prevention and controlling all potential risk factors.