Most industries are already making use of IoT monitoring devices, designed to provide better security, better safety, or more data to the companies creating them—or some combination of the three. The emergence of these devices has been a boon both to the tech companies that build them and to the businesses that adopt them. Tech companies get a blue ocean of new opportunities for innovation, and companies in other industries get new ways to collect data and improve their operations.

But there’s an interesting effect here worth exploring. By increasing your security or access to data, you’re necessarily sacrificing some degree of privacy. You, individually, might be willing to make that sacrifice, but in the near future, you may not have much of a choice.

New Tech, New Data

Data has become a top priority for many companies, for both benevolent and nefarious purposes. For example, companies like Google and Facebook want to collect as much data on your browsing habits, interests, and demographic makeup as possible so they can sell that information to advertisers, or develop products that cater specifically to you. But in the healthcare industry, data-gathering devices are crucial in learning how to better diagnose complex conditions—and avoid medical malpractice lawsuits as well.

Accordingly, newly adopted technology is encroaching on fresh territory in data collection. Companies are instituting new types of monitoring and collecting new types of data at the same time. It’s hard to blame them, since more data can lead to more innovation and more consumer demand, but it’s problematic because we’re venturing into unfamiliar territory. Our laws and regulations are decades behind the times.

Why the Law Can’t Keep Up

Some politicians have proposed major updates to our laws and consumer protections in accordance with the latest available technology, but the fact remains that the law can’t truly keep up with advancing tech. Historically, new innovations and new laws have evolved in a kind of cat-and-mouse interaction: when companies or individuals began exploiting some loophole, or some new area of development, politicians would pass legislation to limit them.

The problem with technology is that it’s advancing so quickly and in such new forms that we don’t have a structural framework we can apply to it. There are too many questions about new concepts, like “personal data,” for politicians to summarily pass new legislation restricting how, when, or why it’s collected.

For example, do people “own” their personal data? In other words, is it fair for companies to collect data on you like your driving habits or your consumer preferences without compensating you? Can they collect it without your knowledge or consent? What if these data are being used for some positive purpose, like preventing car accidents or improving medical care?

If consumers aren’t actively demanding new privacy standards and tech-focused laws, politicians are going to be less inclined to take action. Collectively, we need to consider the new questions that these new monitoring devices are raising.

The Question of Necessity

Most data collection occurs on an opt-in basis. You have to take some specific action or use a specific product or service for a company to collect your information. For example, for Facebook to collect data on you as a user, you have to have a Facebook account and stay logged in. For Google to collect data on your driving habits, you have to use Google Maps.

The simple way to avoid problems of monitoring and privacy is to avoid using apps that collect your data. But as we evolve as a society, these forms of technology become a practical necessity. If you have a modern job that requires near-constant communication through project management apps, instant messengers, and more, you might be required to have a smartphone. The benefits of GPS apps, smart speakers, and social media apps are difficult to ignore.

Accordingly, when a single company or a handful of companies share near-total control of a given, practically necessary service—like how Google controls the vast majority of all online searches—we may need to rethink how we treat that company’s data collection practices (and operations in general). As a loose analogy, imagine a scenario in which only one company provided electricity throughout the United States. That company decides to only provide electricity to consumers who allow the company to install cameras in their homes. This would be seen as a gross abuse of power, and a ruthless invasion of personal privacy.

Of course, collecting search history information is significantly less invasive than installing a camera, but the question of service necessity is still worth asking.

Is Privacy That Important?

You may wonder why data privacy is so important in the first place. After all, you may trust tech companies to keep your best interests in mind, or you may not care if they learn about your favorite movies and restaurants.

However, there are good reasons why we should be thinking more critically about data privacy. First, collecting data is an exercise of power, and companies that collect data tend to gain power, which they can then use to collect even more data. Acting early to establish and enforce data privacy helps prevent this power from spiraling out of control.

Additionally, you don’t want your data to be used against you. Even innocuous-seeming types of data could, in the wrong hands, be used to negatively impact your life. For example, could someone filing a personal lawsuit against you cite data from your search history? Is your insurance company justified in jacking up your rate because it caught you speeding, or noticed your driving patterns change?

How to Protect Your Privacy

For the time being, we don’t have a sweeping policy that protects consumer data, but we do have a plethora of internet-connected devices monitoring our daily habits. Still, there are some basic strategies and habits you can adopt to protect your own personal privacy—even as you increase your safety and security.

  • Read privacy policies. It may seem tedious and unnecessary, but it’s important for you to read the privacy policies posted by the companies that make the products you use. In these policies, you’ll discover what types of information the company collects, how it collects that information, and potentially, how it uses the information on an ongoing basis. You don’t need to be a lawyer to understand these policies (most of the time), and you can also look up simplified summaries that explain them at a high level.
  • Check your privacy settings. Most apps provide some level of privacy control within the app itself. For example, Google, Facebook, and other social media apps allow you to restrict which information you provide, and which other users can see your information. This won’t protect you from the company collecting the information, but it can protect you from other users.
  • Make use of privacy products. It’s also wise to invest in additional privacy products, or privacy modes designed to limit how much of your data can be collected. For example, you can use an incognito or “private browsing” mode to limit how much browsing data is stored, and use a virtual private network (VPN) to mask some of your internet traffic.
  • Be careful what you share and when. Finally, you can be aware of how and when you’re being tracked, and use that information to police your behavior. For example, you might be wary of what you post on social media or what you search when you’re logged into your primary accounts. There’s a limit to how much you can protect yourself in this way, but it’s worth exploring anyway.

New technology will keep emerging, and potentially at accelerating rates. We’re currently in a gray area of development where we’re constantly discovering and asking new questions, but the answers aren’t clear. We all need to raise our awareness and work toward transparency and action if we want to create an environment where consumers and tech developers can thrive.

Frank Landman

Frank is a freelance journalist who has worked in various editorial capacities for over 10 years. He covers trends in technology as they relate to business.