US Government Will No Longer Push for Users' Encrypted Data

Last year, Google and Apple (among other companies) changed the way they handle encryption. Instead of holding the encryption keys themselves, they handed the keys to their customers. As a result, law enforcement agencies can no longer ask these companies to turn over users' encrypted data.

If they want the data, they will have to convince the users themselves to give it up, something the FBI was not too happy about. For a while, the government did not give up its quest to force tech companies to turn over encrypted user data, but all of that has since changed.

According to reports, the Obama administration has finally backed down in its battle with the tech companies over encrypted data. This means that if you were worried these companies would one day be forced to install back doors supposedly reserved for government access, you no longer have to worry.

The tech companies' argument was essentially that installing back doors, even if only for the "good guys," could leave their products and services open to hacks. While this is no doubt a big victory, some are skeptical that it is the end of the matter.

According to Peter G. Neumann, one of the nation’s leading computer scientists, “This looks promising, but there’s still going to be tremendous pressure from law enforcement. The NSA is capable of dealing with the cryptography for now, but law enforcement is going to have real difficulty with this. This is never a done deal.”

Obama administration has decided not to seek a legislative remedy now

FBI Director James Comey told a congressional panel that the Obama administration won’t ask Congress for legislation requiring the tech sector to install backdoors into their products so the authorities can access encrypted data.

Comey said the administration for now will continue lobbying private industry to create backdoors to allow the authorities to open up locked devices to investigate criminal cases and terrorism.

“The administration has decided not to seek a legislative remedy now, but it makes sense to continue the conversations with industry,” Comey told a Senate panel of the Homeland Security and Governmental Affairs Committee on Thursday.

Comey’s comments come as many in the privacy community were awaiting a decision by the administration over whether it would seek such legislation. Many government officials, including Comey himself, have called for backdoors. All the while, the White House has lobbied the tech sector intensely to provide backdoor access. And Congress has remained virtually silent on an issue reminiscent of the so-called Crypto Wars.

The president’s public position on the topic, meanwhile, has been mixed. Obama had said he is a supporter and “believer in strong encryption” but also “sympathetic” to law enforcement’s need to prevent terror attacks.

The government’s lobbying efforts, at least publicly, appear to be failing to convince tech companies to build backdoors into their products. Some of the biggest names in tech, like Apple, Google, and Microsoft, have publicly opposed allowing the government a key to access their consumers’ encrypted products. All the while, some government officials, including Comey, have railed against Apple and Google for selling encrypted products where only the end-user has the decryption passcode.

The tech sector has also made its opposition clear in a letter to Obama.

The government cannot force the tech sector to build encryption end-arounds. The closest law on the books is the Communications Assistance for Law Enforcement Act of 1994, known as CALEA. The measure generally demands that telecommunication companies make their phone networks available to wiretaps.

Obama administration opts not to force firms to decrypt data — for now

After months of deliberation, the Obama administration has made a long-awaited decision on the thorny issue of how to deal with encrypted communications: It will not — for now — call for legislation requiring companies to decode messages for law enforcement.

Rather, the administration will continue trying to persuade companies that have moved to encrypt their customers’ data to create a way for the government to still peer into people’s data when needed for criminal or terrorism investigations.

“The administration has decided not to seek a legislative remedy now, but it makes sense to continue the conversations with industry,” FBI Director James Comey said at a Senate hearing Thursday of the Homeland Security and Governmental Affairs Committee.

The decision, which essentially maintains the status quo, underscores the bind the administration is in — between resolving competing pressures to help law enforcement and protecting consumer privacy.

The FBI says it is facing an increasing challenge posed by the encryption of communications of criminals, terrorists and spies. A growing number of companies have begun to offer encryption in which the only people who can read a message, for instance, are the person who sent it and the person who received it. Or, in the case of a device, only the device owner has access to the data. In such cases, the companies themselves lack “backdoors” or keys to decrypt the data for government investigators, even when served with search warrants or intercept orders.
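The key-holding arrangement described above can be sketched with a toy one-time pad in Python (standard library only; this illustrates where the key lives, not a production cipher):

```python
import secrets

def xor_bytes(data: bytes, pad: bytes) -> bytes:
    """XOR every byte of data against the pad (a toy one-time pad)."""
    return bytes(b ^ p for b, p in zip(data, pad))

# Sender and receiver share the key; the service provider never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))  # random pad, same length as the message

ciphertext = xor_bytes(message, key)    # all the provider ever stores or relays
recovered = xor_bytes(ciphertext, key)  # only a key holder can undo it

assert recovered == message
```

Served with a search warrant, a provider in this arrangement could hand over only `ciphertext`; without `key`, which exists solely on the users' devices, there is nothing it can decrypt.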

The decision was made at a Cabinet meeting Oct. 1.

“As the president has said, the United States will work to ensure that malicious actors can be held to account – without weakening our commitment to strong encryption,” National Security Council spokesman Mark Stroh said. “As part of those efforts, we are actively engaged with private companies to ensure they understand the public safety and national security risks that result from malicious actors’ use of their encrypted products and services.”

But privacy advocates are concerned that the administration’s definition of strong encryption also could include a system in which a company holds a decryption key or can retrieve unencrypted communications from its servers for law enforcement.

“The government should not erode the security of our devices or applications, pressure companies to keep and allow government access to our data, mandate implementation of vulnerabilities or backdoors into products, or have disproportionate access to the keys to private data,” said a coalition of industry and privacy groups that has launched a campaign to petition the Obama administration.

To Amie Stepanovich, the U.S. policy manager for Access, one of the groups signing the petition, the status quo isn’t good enough. “It’s really crucial that even if the government is not pursuing legislation, it’s also not pursuing policies that will weaken security through other methods,” she said.

The FBI and Justice Department have been talking with tech companies for months. On Thursday, Comey said the conversations have been “increasingly productive.” He added: “People have stripped out a lot of the venom.”

He said the tech executives “are all people who care about the safety of America and also care about privacy and civil liberties.”

Comey said the issue afflicts not just federal law enforcement but also state and local agencies investigating child kidnappings and car crashes — “cops and sheriffs … [who are] increasingly encountering devices they can’t open with a search warrant.”

One senior administration official said the administration thinks it’s making enough progress with companies that seeking legislation now is unnecessary. “We feel optimistic,” said the official, who spoke on the condition of anonymity to describe internal discussions. “We don’t think it’s a lost cause at this point.”

Legislation, said Rep. Adam Schiff (D-Calif.), is not a realistic option given the current political climate. He said he made a recent trip to Silicon Valley to talk to Twitter, Facebook and Google. “They quite uniformly are opposed to any mandate or pressure — and more than that, they don’t want to be asked to come up with a solution,” Schiff said.

Law enforcement officials know that legislation is a tough sell now. But, one senior official stressed, “it’s still going to be in the mix.”

On the other side of the debate, technology, diplomatic and commerce agencies were pressing for an outright statement by Obama to disavow a legislative mandate on companies. But their position did not prevail.

Daniel Castro, vice president of the Information Technology & Innovation Foundation, said absent any new laws, either in the United States or abroad, “companies are in the driver’s seat.” He said that if another country tried to require companies to retain an ability to decrypt communications, “I suspect many tech companies would try to pull out.”

Risk Analysis, Encryption Stressed in HITECH Act Final Rules

Two final rules for the HITECH electronic health record incentive program strongly emphasize the value of risk assessments and encryption as measures for safeguarding patient information.

A new rule establishing requirements for proving a provider is a “meaningful user” for Stage 3 of the incentive program requires protecting patient data through the implementation of appropriate technical, administrative and physical safeguards and conducting a risk analysis that includes assessing encryption of ePHI created or maintained by a certified electronic health record.

A companion final rule setting 2015 Edition standards for certifying EHR software as qualifying for the program requires the software to be capable of creating hashes using an algorithm with security strength equal to or greater than SHA-2.

The Department of Health and Human Services’ Centers for Medicare and Medicaid Services says the Stage 3 requirements are optional in 2017. Providers who choose to begin Stage 3 in 2017 will have a 90-day reporting period. However, all providers will be required to comply with Stage 3 requirements beginning in 2018 using EHR technology certified to the 2015 Edition requirements.

When it comes to privacy and security requirements included in the final rules, versus what was in the proposed rules, there were “no significant changes, no surprises,” says John Halamka, CIO of Beth Israel Deaconess Medical Center.

Some privacy and security experts, however, point out the rules spotlight the importance of safeguarding electronic protected health information through measures such as risk analysis, encryption and secure data exchange. But some observers criticize HHS for not offering more detailed guidance on risk assessments.

Risk Analysis

While conducting a risk analysis was also a requirement in Stages 1 and 2 of the meaningful use program, the final rule for Stage 3 requires that healthcare providers drill down further by “conducting or reviewing a security risk analysis … including addressing the security – to include encryption – of electronic protected health information created or maintained by certified electronic health record technology … and implement security updates as necessary and correct identified security deficiencies.”

The objective of that requirement is to protect electronic health information through the implementation of “appropriate technical, administrative and physical safeguards,” the rule states. Rulemakers stress that it covers assessing the data created or maintained by an electronic health record system, rather than conducting the more comprehensive security risk assessment required under the HIPAA Security Rule.

“Although [HHS’] Office for Civil Rights does oversee the implementation of the HIPAA Security Rule and the protection of patient health information, we believe it is important and necessary for a provider to attest to the specific actions required to protect ePHI created or maintained by CEHRT in order to meet the EHR incentive program requirements,” the rule notes. “In fact, in our audits of providers who attested to the requirements of the EHR Incentive Program, this objective and measure are failed more frequently than any other requirement.

“This objective and measure are only relevant for meaningful use and this program, and are not intended to supersede what is separately required under HIPAA and other rulemaking. We do believe it is crucial that all [eligible healthcare providers] evaluate the impact CEHRT has on their compliance with HIPAA and the protection of health information in general.”

New to the risk analysis requirement is the addition of assessing administrative and physical safeguards. “This measure enables providers to implement risk management security measures to reduce the risks and vulnerabilities identified. Administrative safeguards – for example, risk analysis, risk management, training and contingency plans – and physical safeguards – for example, facility access controls, workstation security – are also required to protect against threats and impermissible uses or disclosures to ePHI created or maintained by CEHRT.”

Missed Opportunity?

HHS should have used the final rule to offer even more helpful guidance about risk assessments, says privacy attorney David Holtzman, vice president of compliance at the security consulting firm CynergisTek.

“CMS focused significant attention to the role of risk analysis in safeguarding the privacy and security of health information created or maintained in an EHR,” he says. “However, they missed an important opportunity to … ensure that administrative and physical safeguards requirements of the HIPAA Security Rule are assessed in any security risk analysis.”

To guide healthcare providers, including smaller doctors’ offices, in conducting the Stage 3 risk analysis, the rule makes note of free tools and resources available to assist providers, including a Security Risk Assessment Tool developed by ONC and OCR.

But the use of that tool is daunting for some smaller healthcare entities, contends Keith Fricke, principal consultant at consulting firm tw-Security.

“The SRA tool is too overbearing for any organization to use, let alone small healthcare organizations, including small provider offices,” he says.

Secure Data Exchange

Besides a renewed focus on risk analysis, other privacy and security related enhancements to the meaningful use Stage 3 final rule include an emphasis on encryption and secure messaging.

“More than half of the objectives in Stage 3 starting in 2017 require EHRs to have interoperable exchange technology that is encrypted and offered to relying parties with strong identity assurance,” said David Kibbe, M.D., CEO of DirectTrust, which created and maintains a framework for secure e-mail in the healthcare sector.

“DirectTrust’s work can and will be relied upon for multiple Stage 2 and 3 objectives and criteria announced by CMS in the new rule,” he says.

For instance, secure electronic messaging to communicate with patients on relevant health information is an objective in Stage 3, with a series of measurements.

Software Certification Rule

While privacy and security are woven through the final rule for Stage 3 of the meaningful use program for healthcare providers, HHS’ Office of the National Coordinator for Health IT also raised the bar on requirements in the final rule for 2015 Edition health IT software certification. That includes phasing in requirements for more robust encryption.

“Given that the National Institute of Standards and Technology, technology companies, and health IT developers are moving away from SHA-1, we believe now is the appropriate time to move toward the more secure SHA-2 standard,” ONC wrote in its rulemaking.

The rule also states: “We note that there is no requirement obligating health IT developers to get their products certified to this requirement immediately, and we would expect health IT developers to not begin seeking certification to this criterion until later in 2016 for implementation in 2017 and 2018. We further note that certification only ensures that a health IT module can create hashes using SHA-2; it does not require the use of SHA-2. For example, users of certified health IT may find it appropriate to continue to use SHA-1 for backwards compatibility if their security risk analysis justifies the risk.”
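The distinction the rule draws, being able to create SHA-2 hashes versus being required to use them, is easy to see with Python's standard `hashlib` module (the sample record value here is illustrative):

```python
import hashlib

record = b"sample-ehr-audit-entry"  # illustrative data, not real ePHI

# Legacy algorithm the certification rule is phasing out
sha1_digest = hashlib.sha1(record).hexdigest()

# SHA-2 family members meeting the new security-strength floor
sha256_digest = hashlib.sha256(record).hexdigest()
sha512_digest = hashlib.sha512(record).hexdigest()

print(f"SHA-1:   {len(sha1_digest) * 4}-bit digest")
print(f"SHA-256: {len(sha256_digest) * 4}-bit digest")
print(f"SHA-512: {len(sha512_digest) * 4}-bit digest")
```

A certified module only needs to expose the SHA-2 calls; as the rule itself notes, a site's risk analysis may still justify continuing to use SHA-1 for backwards compatibility.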

Some other safeguard features, such as data segmentation for privacy of sensitive health information, are included in the software certification rule as optional, Halamka notes. “That’s appropriate for immature standards,” he says.

Public Input

CMS is continuing to seek public comment on the “meaningful use” rule for 60 days. This input could be considered by CMS for future policy developments for the EHR incentive program, as well as other government programs, the agency says.

However, this additional public comment period could become problematic, Holtzman contends. “The adoption of the changes in the objective and measures as a ‘final rule with comment’ could cause delays for EHR vendors and developers in producing upgrades to their technology. The uncertainty that CMS could make further changes in the months ahead might encourage these industry partners to hold off on their production process.”