By: Kerry Alvarez
In today’s digital age, how can we ensure that the information stored on our devices remains private? Ideally, individuals should control who can access their data and the extent of that access. But what happens when that choice is taken away? What if, despite your efforts to protect sensitive personal information, it is exposed anyway? Imagine a personalized health app sharing your medical records with advertising companies without your consent. This scenario raises serious concerns about privacy and data security in today’s digital world. Unfortunately, it is not hypothetical.
In 2021, a lawsuit was filed against Flo Health, Inc. following the discovery that the company’s app, Flo, was using specialized Software Development Kits (“SDKs”) to transmit user information to advertising partners and major technology companies such as Google, Flurry Inc., and Meta. Despite Flo’s privacy policies assuring users that their health information—such as menstrual cycles, pregnancies, and personal intimacy details—would remain private, the application shared this data with third parties up until the lawsuit. There were no limitations on how these third-party companies could use the data; some used it for research and others for targeted advertising. The lawsuit followed a complaint and settlement agreement between Flo Health, Inc. and the Federal Trade Commission, which required Flo to obtain affirmative consent from its users before sharing personal health information with advertising companies. A class-action lawsuit is also pending in the Northern District of California against Flo Health, Inc. over similar allegations, including violations of the California Confidentiality of Medical Information Act. That suit, brought in August 2024, requests damages and injunctive relief on behalf of millions of women harmed when their private health information was exploited without their consent.
Often, companies design apps with built-in features for sharing information with third parties from the start, then add features that expand this information sharing. One common way they do so is through SDKs: collections of pre-packaged code, tools, libraries, and documentation that can be integrated into an application. Instead of building features like authentication or analytics from scratch, developers can license pre-built SDKs that come with prepared software libraries, allowing them to add specific functionality without creating it themselves.
Because integrating an SDK shortens the time it takes to develop these features, it saves companies both money and effort. It enables developers to use pre-built functions that would otherwise be complex and time-consuming to create from the ground up. For instance, SDKs can implement analytics, advertising, social media sharing, user authentication, and more. A typical SDK includes an application programming interface (API), documentation, tools, and sample code to streamline application development.
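To make the mechanism concrete, the following is a minimal, hypothetical sketch of how an app might integrate a third-party analytics SDK. The `AnalyticsSDK` class and its methods are invented for illustration and do not correspond to any real vendor’s library; real analytics and advertising SDKs expose similarly simple one-line logging calls that transmit data to the vendor’s servers behind the scenes.

```python
class AnalyticsSDK:
    """Stand-in for a hypothetical third-party analytics library."""

    def __init__(self, api_key):
        self.api_key = api_key
        # Events queued for transmission to the vendor's servers.
        self.transmitted = []

    def log_event(self, name, properties):
        # In a real SDK, this would serialize the event and send it over
        # the network to the third party; here we simply record it.
        self.transmitted.append({"event": name, "properties": properties})


# From the app developer's perspective, integration is a single call:
sdk = AnalyticsSDK(api_key="HYPOTHETICAL-KEY")
sdk.log_event("cycle_logged", {"period_started": True})

# The developer wrote one line of code, yet the SDK now holds user
# health data destined for a third party's servers.
print(len(sdk.transmitted))  # 1
```

The point of the sketch is that the data flow is invisible at the call site: nothing in `log_event` signals to the developer, let alone the user, where the event ultimately travels or how the recipient may use it.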
Issues arise when privacy assurances conflict with these information-sharing technologies. Because SDKs are complex, they can be challenging to manage and can fall behind evolving privacy policies. This lag in the technology’s adherence to privacy guidelines should create an ethical obligation for companies to be proactive in protecting their users’ data. Companies can do so through a comprehensive evaluation of each SDK’s data collection practices, including the types of data gathered, the purpose of collection, and the intended recipients. To promote transparency and establish trust, developers should also provide users with clear information about data collection practices, retention periods, and data-sharing policies.
In the case of Flo Health, Inc., the developers took no such measures to ensure data privacy. Millions of users were kept in the dark as to where their sensitive data was going. Privacy policies must be transparent to the users they govern. Individuals should also be cautious of apps that collect extensive personal data, especially health information, without being clear about how that data is used. As technology continues to evolve, so do the challenges of maintaining privacy, especially for apps that handle sensitive personal data. The several lawsuits Flo Health, Inc. faces highlight the risks of using SDKs that prioritize data sharing over user privacy without clear communication or consent. The general lack of user awareness about SDKs further complicates this issue.
Most users do not realize that apps may rely on multiple third-party tools, each with its own data collection practices. This is especially concerning for health-related apps, where users might assume a higher standard of privacy protection. Without clear communication and consent mechanisms, users are left in the dark about the true scope of data sharing. The misuse of sensitive health information underscores the need for privacy policies that are both transparent and up to date. Users should also remain mindful of the apps they use, especially those collecting health data they do not want third parties to access. While SDKs are a valuable tool for app development, they also introduce significant risks to user privacy. The case of Flo Health, Inc. illustrates the dangers of relying on third-party SDKs without adequate transparency and user consent. With greater transparency and accountability, trust can be rebuilt, ensuring the protection of privacy in a world of evolving technology.
Student Bio: Kerry Alvarez is a third-year law student at Suffolk University Law School and staff writer on the Journal of High Technology Law. Kerry received a Bachelor of Arts in English from The College of New Jersey.
Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.