As Apple and the FBI remain embroiled in a fight over whether the federal government can compel the technology company to hack the iPhone used by one of the San Bernardino terrorists, Maine’s senators waded into the debate over encryption Tuesday.

Sens. Susan Collins and Angus King announced they are co-sponsoring legislation to create a National Commission on Security and Technology Challenges. The 16-member digital security commission would be charged with clarifying when law enforcement can access people’s data on encrypted devices.

Apple argues that breaking the encryption on the iPhone would set a legal precedent enabling the government to routinely conscript companies into granting access to customer data. The FBI has countered that encryption has effectively turned iPhones into “warrant-free spaces” that hamper criminal investigations.

“The issues of ‘going dark’ and preserving personal privacy are ones that we simply must grapple with today and for the future,” Collins said Tuesday afternoon on the Senate floor.

If Congress approves the commission, which Apple supports, its members would, among other tasks, recommend revisions to the procedures for obtaining warrants that compel technology companies to help investigators access the data of alleged criminals and terrorists without unduly harming Americans’ right to privacy.

“We don’t want to compromise national security, but we also don’t want to compromise personal security,” King said on the Senate floor.

How far can the government go to compel Apple to break its encryption software?

A federal judge in California last month approved the FBI’s request for a search warrant to access the personal data on an iPhone used by one of the San Bernardino shooters.

Apple, however, has beefed up the security on its mobile devices in recent years to protect customer data from cyber thieves. The iPhone can be set to erase all of its personal data after 10 failed passcode attempts. The federal judge therefore ordered Apple to write software that disables that function, allowing the FBI to test passcode combinations until it forces its way into the phone, a brute-force process that could take five years.
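For a sense of where a figure like that comes from, here is a minimal back-of-the-envelope sketch, not drawn from the article itself: it assumes roughly 80 milliseconds per attempt, the delay Apple has attributed to tying each guess to the phone’s hardware key, and a six-character passcode of lowercase letters and digits. A short numeric PIN tried at the same rate would fall in minutes to hours.

    # Back-of-the-envelope estimate of worst-case brute-force time for an
    # iPhone passcode. Both inputs are assumptions for illustration only:
    # ~80 ms per attempt and a six-character lowercase alphanumeric passcode.
    SECONDS_PER_ATTEMPT = 0.08       # assumed hardware-enforced delay per guess
    ALPHABET_SIZE = 26 + 10          # lowercase letters plus digits
    PASSCODE_LENGTH = 6              # assumed passcode length

    combinations = ALPHABET_SIZE ** PASSCODE_LENGTH   # about 2.2 billion
    worst_case_seconds = combinations * SECONDS_PER_ATTEMPT
    worst_case_years = worst_case_seconds / (60 * 60 * 24 * 365)

    print(f"{combinations:,} combinations, about {worst_case_years:.1f} years worst case")
    # Prints roughly 5.5 years; a four-digit PIN at the same rate falls in about 13 minutes.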

In a motion filed Feb. 25, Apple argued that under the First Amendment the government can’t compel it to write software. A federal judge ruled in 1996 that computer code is a protected form of free speech, finding that government regulations barring the export of encryption software and requiring technology to be wiretap-ready violated the Constitution.

But, as NPR reported last month, when public safety is a concern, the government has latitude to compel certain forms of speech.

“There are plenty of circumstances where the government mandates people to speak. For example, you have to put the nutrition label on your can of food if you want to sell that food into the economy,” Eric Goldman, a professor at Santa Clara University School of Law, told NPR.

The FBI argues that compelling Apple to unlock the iPhone is in the interest of public safety because it may contain information about potential terrorist plots.

James Lewis, an expert in cybersecurity at the Center for Strategic and International Studies in Washington, told The New York Times that if Apple is capable of opening the iPhone or of creating software that bypasses its security, something the company has done 70 times, then it must comply with the FBI’s warrant.

“There are plenty of instances when it’s appropriate for encrypted products to be opened to law enforcement,” Lewis said. “The investigation into the San Bernardino killers is a prime example.”

Does an 18th-century law apply in a 21st-century context?

When the federal judge in California ordered Apple to work with the FBI to unlock the iPhone, she invoked the All Writs Act of 1789, which gives federal judges the authority to compel citizens, or in this case Apple, to help carry out search warrants.

Almost immediately, Apple and others called into question whether the government can rely on a 227-year-old law to enforce a search warrant connected to 21st-century technology, just one of the issues the digital security commission would examine.

“This is an issue of immense significance and public policy importance that should not be decided by a single court in California or Iowa or New Jersey or anywhere else based upon a 220-year-old law,” King said. “This is an issue of policy that should be decided here.”

But Congress hasn’t always succeeded in tailoring laws to new technology, Irina Raicu, director of the Internet Ethics Program at Santa Clara University’s Markkula Center for Applied Ethics, told NPR. For example, when Congress passed the Electronic Communications Privacy Act of 1986, lawmakers assumed no one would store emails and other electronic communications for more than six months, Raicu said. Today, the expansion of digital storage has allowed inboxes to fill with old emails.

The San Bernardino case, Raicu argued, indicates that the principles underlying the law make it adaptable to advances in technology.

“The law actually seems to be keeping up with technology by being so broad that we’re just reinterpreting it all the time,” Raicu said.

A federal judge in New York, however, ruled Monday in a similar case, this one involving an iPhone in a drug investigation, that the government had inflated its authority under the All Writs Act and that its expansive reading of the law calls into question whether the law is constitutional.

King said the ruling underscores the need for Congress to resolve the debate around encryption and how far the government can go to conscript a company’s help in the name of national security or a criminal investigation.

“It is a very, very strong argument, and it makes the case I think very straightforwardly that this decision should not stay in the hands of the court,” King said.