We live in a world of instant information. In the age of social media, Google, and virtual assistants like Siri and Alexa, we have reached a point where we can read the news, hear a weather report, and order groceries without ever interacting with another person. This access to constant and unlimited data has created opportunities for innovation that would have been inconceivable to people living only a few decades earlier.
However, users don’t always realize they’re sharing just as much as they are receiving. As they mindlessly enter credit card information, post photos of their clothing, and send messages at the touch of a button, tech companies share, trade, and use their data. This has created a growing controversy over the ethics of digital data collection and whether the practice violates users’ privacy rights. What’s often forgotten, however, is that users choose to use these tools, and that a failure to understand how companies use their data does not legitimize their outrage.
User Apathy Is the Real Problem
Technology is now commonplace, but so is user apathy. Users of apps, email services, and social media platforms haphazardly agree to “terms and conditions,” treating the warnings as just a box to check before accessing the content. They give the same nonchalant, dismissive treatment to the pop-up windows that warn them about browser trackers and the other data collection mechanisms websites use.
Unfortunately, the people clicking “accept” without a second thought are the same ones decrying tech companies for collecting and using their information. Instead of taking the time to research the services they use, people have begun demanding legislation to stop data collection. In fact, the topic has become so widely discussed that it’s one of the nation’s top political priorities.
When reports surfaced that the political data firm Cambridge Analytica had used Facebook users’ information while working for the Trump campaign, it created a scandal so immense it spurred a federal investigation into Facebook and two days of Senate hearings. The controversy eventually forced Cambridge Analytica to close its doors.
This circus of legislative punishment resulted from information sharing that, Facebook maintained, was not a data breach: Cambridge Analytica had only taken information users had agreed to make available by signing Facebook’s user agreement. It was a prime example of citizens and legislators trying to punish a private company whose terms and conditions they hadn’t taken the time to understand.
Education, Not Legislation
The issue of digital privacy and how to regulate data collection online consistently baffles legislators. Congress, whose members’ average age is nearly 60, cannot keep pace with technological progress. By forcing regulations on an ever-changing industry, we only make it harder for new and smaller technology companies to grow, depriving ourselves of innovation that would make our lives better. Whether it’s the shows Netflix recommends, the books and clothing Amazon promotes, or Spotify playlists curated to our tastes, data collection ultimately gives us products we like.
So the onus of digital privacy shouldn’t fall solely on tech companies. Users are in charge of their own online presence just as they are in charge of their physical presence in real life. Any fair solution to data collection must recognize this reality.
Education, not regulation, must play the largest role in helping users shield their data from prying eyes. For example, in October, New York City launched its first library privacy week, a series of public workshops aimed at helping people learn digital privacy best practices. The 30 events taught users how to protect their information and browse safely on unsecured Wi-Fi networks. This kind of user-driven data security is an ideal solution for those who want more control over their online presence. Of course, for those who object to sharing any private information with online sites, there’s always the option not to use them at all.
Although it has become almost integral to our society, a digital presence is not mandatory. But by mindlessly accepting the terms and conditions, users are saying that what they gain from a digital tool is worth more to them than the privacy they may lose. Before blindly hitting “accept,” users skeptical about data mining should consider the role they have to play in protecting their own information. People can’t complain if a lack of privacy is a risk they choose to take.