Perhaps the best thing about the Apple vs. FBI controversy is that it has engaged the public in a much-needed debate to a greater extent than anything since the Edward Snowden revelations.

The tension between government agencies' desire for broader, easier access and the public's evolving perception of data privacy rights is ongoing, and it is writ large in FBI vs. Apple, a case that brings the complexities of security technology such as encryption to light.

As most Americans (and iPhone users) are now aware, the government has demanded that Apple create software to help the FBI crack open an iPhone, and this week the government suggested it could use a third party to access the phone. This is the phone recovered in the aftermath of the mass shooting in San Bernardino last year. Apple objected to the FBI's demands, spawning a massive public battle between the government and technology sectors.

To take the technical issues first, I'd like to clarify some terms that have been tossed around, such as encryption. The Apple iPhone can encrypt information stored on it, meaning the data are scrambled so that nobody can make sense of them without the key, like this: xmmib fmelkb. That is "Apple iPhone" encrypted with the Caesar cipher – a highly predictable substitution cipher in which "a" is always "x" and so on. Yet this illustrates a couple of important points. The FBI is not asking Apple to reveal or break the way the encryption process works on the iPhone, nor is it asking for the key (Apple does not have the key). The FBI wants Apple to remove impediments to guessing the key.
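For readers who like to see how such a substitution works, here is a minimal Python sketch of the Caesar cipher used in the example above, shifting each letter so that "a" becomes "x" (a shift of 23 positions); this is purely illustrative and is not, of course, how the iPhone encrypts data.

```python
def caesar(text: str, shift: int = 23) -> str:
    # Shift each lowercase letter by `shift` positions through the
    # alphabet (a -> x when shift == 23); leave spaces etc. untouched.
    out = []
    for ch in text.lower():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord("a") + shift) % 26 + ord("a")))
        else:
            out.append(ch)
    return "".join(out)

print(caesar("Apple iPhone"))  # -> xmmib fmelkb
```

Because the mapping is fixed and there are only 25 possible shifts, a Caesar cipher can be broken by trial in moments – which is exactly why modern systems rely instead on strong algorithms with keys that are infeasible to guess.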

Many of the encryption systems we use on digital devices today create very complex keys using random data and very long numbers, but there is a user element as well: a passphrase or passcode that we choose ourselves. On iPhones, this began as a four-digit number, of which there are 10,000 possible combinations (10⁴). With the iOS 9 version of the operating system (OS) you can choose six digits, offering 1 million possible combinations (10⁶). On top of that, the OS includes a delay function that slows down any "brute forcing" of the passcode by a series of guesses. If that were not enough, iOS is designed to erase the data on the phone after 10 incorrect attempts.

So how could the FBI get to the contents of this phone? By updating it with a specially crafted version of iOS that removes the lockout and delay functions, permitting the FBI to execute a highly automated series of guesses at high speed. Why is it a big deal for the FBI to ask Apple to create a backdoor one time? The problem that many technology companies have with this request is, in my opinion, the strongest of Apple’s numerous objections: the “just this one device” claim.

There is no technical or legal basis for saying this case is a one-off. If Apple complies with the current court order and creates a version of the OS that facilitates access to this one iPhone, that version can be used on other iPhones. Other law enforcement agencies will join the line that is already forming to demand Apple's assistance with other iPhones, and Apple will have no basis to refuse, because that is how the legal system works: it's called precedent. While some FBI statements have seemed to imply that this case would not set a precedent, all practical (and legal) signs point against that.

A 1977 Supreme Court decision in the case of United States v. New York Telephone upheld the use of the All Writs Act to compel a phone company to help the government track telephone activity related to an illegal gambling case (at a time when there were no statutory provisions for that). The government case against Apple relies heavily on that ruling and if Apple loses in court, a further precedent will be set, one that can be used in cases impacting many aspects of our digital life. Any number of agencies will have a strong legal basis for requiring any hardware and software makers to selectively turn off security features to assist government investigations.

Right now, the outcome of FBI vs. Apple is impossible to call, with George Washington University Law School professor Orin Kerr likening it to “a crazy-hard law school exam hypothetical in which a professor gives students an unanswerable problem just to see how they do.” Yet, even as this case descends deeper into the weeds of U.S. law, there are clearly implications for banks and commerce, especially for companies with business overseas.

Consider the current negotiations to create a “Privacy Shield” in place of the Safe Harbor arrangement under which companies were allowed to process and store the personal information of European data subjects in the U.S. Last October, that agreement was deemed inadequate by the Court of Justice of the European Union (CJEU) because, in light of the Snowden revelations, the U.S. appeared incapable of adequately protecting that information from surveillance by its own intelligence services (notably the NSA but also the FBI). U.S. negotiators striving to put the new Privacy Shield in place are arguing that the U.S. understands and respects the privacy concerns of its EU trading partners. That argument is harder to make if Apple loses.

Some industries have come to an arrangement with the government over reporting suspicious activity or, in the case of telecommunications, allowing government access to traffic data under certain defined conditions. To some extent such cooperation mitigates the need for backdoors, but a reusable backdoor like the one the FBI has asked Apple to create could still be used covertly by governments and exploited by criminals. Among the many problems with allowing government backdoors is that the fallout from this case could hamper banks' ability to use secure software for communications and other tasks, and some fear that a precedent of this nature will ultimately limit the security features banks can offer their customers.

Stephen Cobb has been researching information assurance and data privacy for more than 20 years and is currently part of the ESET global research team.