Customer Headaches Could Curtail Apple’s Encryption Push


At an event held during Apple’s fight with the FBI over whether it should help unlock a dead terrorist’s iPhone, CEO Tim Cook promised that “we will not shrink” from the responsibility of protecting customer data, including from government overreach.

Yet the obvious next step for the company could be hard to take without inconveniencing customers.

Apple is currently able to read data stored in its iCloud backup service, which is at odds with Cook’s insistence that he doesn’t want his company to be capable of accessing customer data such as mobile messages.

Apple has not denied reports that it is working to change that. And the company is expected to make some mention of its security technology at its Worldwide Developers Conference next week, as it did at its iPhone event in March.

But redesigning iCloud so that only customers can unlock their own data would increase the risk of people irrevocably losing access to precious photos and messages when they lose their passwords. Apple would not be able to reset a customer’s password for them.

“That’s a really tough call for a company that says its products ‘Just work,’” says Chris Soghoian, a principal technologist with the American Civil Liberties Union—referring to a favorite line of Apple’s founder, Steve Jobs.

Cook has boasted of how the encryption built into Apple’s iPhones and iMessage system keeps people safe by ensuring that only they can access their data. FBI director James Comey has complained about it.

But the design of iCloud means that Apple can read much of its customers’ data, and help the government do so, too. The service is enabled by default (although you can opt out), and automatically backs up messages, photos, and more to the company’s servers. There the data is protected by encryption, which Apple has the key to unlock. The company’s standoff with the FBI happened only because the backups Apple handed the agency from San Bernardino shooter Syed Farook’s iPhone ended six weeks before the shooting, because he had turned them off.

Apple could lock itself and law enforcement out of iCloud data by encrypting each person’s iCloud backups using a password under that person’s control, perhaps the same one that locks their iPhone.

The company has not denied reports from the Financial Times and Wall Street Journal that it is working on such a design. Passwords and credit card details stored using an iCloud feature called Keychain are already protected in this way. But taking this approach would prevent Apple from being able to reset a person’s password if they forget it. The data would be effectively gone forever.
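Such a design would hinge on deriving the backup encryption key from a secret only the user knows, so Apple’s servers never hold anything they could surrender or reset. A minimal sketch of that principle using a standard key-derivation function (the function name and parameters here are illustrative assumptions, not Apple’s actual scheme):

```python
import hashlib
import os

def derive_backup_key(passcode: str, salt: bytes) -> bytes:
    # PBKDF2 with a high iteration count: slow to brute-force, and the
    # server can store the salt without being able to recover the key.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

salt = os.urandom(16)                       # stored alongside the backup
key = derive_backup_key("123456", salt)     # same passcode -> same key
assert derive_backup_key("123456", salt) == key
assert derive_backup_key("654321", salt) != key   # wrong passcode -> wrong key
```

Because the key exists only when the passcode is entered, a forgotten passcode means an unrecoverable key, which is precisely the trade-off described above.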

It is probably impractical for Apple to roll out that approach for everyone’s data, as the company did for the security protections built into the iPhone, says Vic Hyder, chief strategy officer with Silent Circle, which offers secure messaging, calls, and data sharing for corporations.

“It puts control on the customer but also responsibility on the customer,” he says. “This will likely be an option, not the default.”

Soghoian of the ACLU agrees. “I think they will probably offer it as an option, but be reluctant to advertise that feature much,” he says. “More people forget their passwords than get investigated by the FBI.”

Bryan Ford, an associate professor at the Swiss Federal Institute of Technology in Lausanne, says Apple could take steps to reduce the risk of accidental data loss.

The company’s FileVault disk encryption feature for Mac computers offers the option to print out a recovery key. A similar process could be used for iCloud encryption, says Ford.

Apple could also implement other safeguards, he says. For example, people could have the option of distributing extra encryption keys or passwords to several “trustees,” who could help recover data if the original password was lost. To prevent abuse it could be required that a certain number of trustees, say, three of five, came forward to unlock the data.
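The “three of five trustees” scheme Ford sketches is classically realized with Shamir’s secret sharing: the secret becomes the constant term of a random polynomial, each trustee holds one point on the curve, and any threshold-sized subset of points reconstructs it. A minimal toy implementation over a prime field (the field size and integer-encoded secret are illustrative choices):

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is in this field

def _eval_poly(coeffs, x):
    # Horner's method: coeffs[0] + coeffs[1]*x + coeffs[2]*x**2 + ... (mod PRIME)
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % PRIME
    return acc

def split_secret(secret, threshold, num_shares):
    # The secret is the polynomial's constant term; the higher coefficients
    # are random, so fewer than `threshold` points reveal nothing.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, _eval_poly(coeffs, x)) for x in range(1, num_shares + 1)]

def combine_shares(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# Five trustees, any three of whom can reconstruct the key.
backup_key = 20240601  # toy secret encoded as an integer
shares = split_secret(backup_key, threshold=3, num_shares=5)
assert combine_shares(shares[:3]) == backup_key
assert combine_shares(shares[2:]) == backup_key
```

Losing any two shares still leaves the data recoverable, which is what makes the scheme attractive as a safeguard against forgotten passwords.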

The cryptography needed for such a design is well understood, says Ford. He recently designed a similar but more complex system intended to help companies such as Apple prevent their software updates from being abused (see “How Apple Could Fed-Proof Its Software Update System”).

Alan Fairless, cofounder and CEO of SpiderOak, which offers companies fully encrypted data storage, says he thinks companies like Apple will eventually make truly secure cloud storage accessible to consumers.

Encrypted messaging was clunky and hard to use until recently, but is now widespread thanks to Apple and WhatsApp, he points out. Encrypting stored data is more challenging, but Apple has shown itself willing to spend significantly on encryption technology, for example by adding new chips to the iPhone, says Fairless.
However, he also thinks Apple and its customers aren’t yet ready for encrypted iCloud backups to be the default. “It’ll take consumer technology a while to catch up,” says Fairless.

Moot point: Judge closes iPhone encryption case in Brooklyn



The United States Justice Department said on Friday that it has withdrawn its request to compel Apple Inc to cooperate in unlocking an iPhone tied to a New York drug case, after a third party provided authorities with the passcode to access the handset.

“An individual provided the department with the passcode to the locked phone at issue in the Eastern District of New York”, Justice Department spokesman Marc Raimondi said in a statement.

On Friday, the Justice Department told a federal court in Brooklyn that it would withdraw the motion to force Apple to pull data from a drug dealer’s locked iPhone, The Washington Post reported.

Investigators have dropped the court case against Apple as they have successfully gained access to the iPhone 5s involved in the New York drug case.

There are about a dozen other All Writs Act orders seeking Apple’s assistance in opening other devices that are unresolved but not in active litigation, according to a Justice Department official. Apple, meanwhile, demanded to know in the New York case whether the government had exhausted all other options to get to the data.

The company said it “strongly supports, and will continue to support, the efforts of law enforcement in pursuing criminals”, but not through the government’s misuse of a law it wants to use as a “precedent to lodge future, more onerous requests for Apple’s assistance”.

The case dates back to 2014, when authorities seized the iPhone 5s of the suspect Jun Feng. Feng pleaded guilty in October to conspiring to distribute methamphetamine and is scheduled to be sentenced in June. Comments attributed to Apple’s attorneys also suggest that while the company isn’t aware of the method used, it’s convinced that normal product development is eventually going to plug whatever exploit was used to gain access to that iPhone.

According to the Wall Street Journal, that “individual” is Feng himself, who has already been convicted and only recently became aware that his phone was the subject of a national controversy.

The San Bernardino case began on February 16 with an order from Judge Sheri Pym and ended on March 28, when the Justice Department withdrew its legal action against Apple.

Comey’s remarks, meanwhile, strongly implied that the bureau paid at least $1.3 million to get into the phone, which had belonged to Syed Rizwan Farook, who, with his wife, killed 14 people in the December 2 terror attack in San Bernardino, Calif.

Brooklyn case takes front seat in Apple encryption fight


The Justice Department said Friday it will continue trying to force Apple to reveal an iPhone’s data in a New York drug case, putting the Brooklyn case at the center of a fight over whether a 227-year-old law gives officials wide authority to force a technology company to help in criminal probes.

The government told U.S. District Judge Margo K. Brodie in Brooklyn that it still wants an order requiring Apple’s cooperation in the drug case even though it recently dropped its fight to compel Apple to help it break into an iPhone used by a gunman in a December attack in San Bernardino that killed 14 people.

“The government’s application is not moot and the government continues to require Apple’s assistance in accessing the data that it is authorized to search by warrant,” the Justice Department said in a one-paragraph letter to Brodie.

Apple expressed disappointment, saying its lawyers will press the question of whether the FBI has tried any other means to get into the phone in Brooklyn.

Apple had sought to delay the Brooklyn case, saying that the same technique the FBI was using to get information from the phone in California might work with the drug case phone, eliminating the need for additional litigation.

Federal prosecutors told Brodie on Friday that they would not modify their March request for her to overturn a February ruling concluding that the centuries-old All Writs Act could not be used to force Apple to help the government extract information from iPhones.

Magistrate Judge James Orenstein made the ruling after inviting Apple to challenge the 1789 law, saying he wanted to know if the government requests had created a burden for the Cupertino, California-based company.

Since then, lawyers say Apple has opposed requests to help extract information from over a dozen iPhones in California, Illinois, Massachusetts and New York.

In challenging Orenstein’s ruling, the government said the jurist had overstepped his powers, creating “an unprecedented limitation on” judicial authority.

It said it did not have adequate alternatives to obtaining Apple’s assistance in the Brooklyn case, which involves a phone with a different version of the operating system than the phone at issue in the California case.

In a statement Friday, Justice Department spokeswoman Emily Pierce said the mechanism used to gain access in the San Bernardino case can only be used on a narrow category of phones.

“In this case, we still need Apple’s help in accessing the data, which they have done with little effort in at least 70 other cases when presented with court orders for comparable phones running iOS 7 or earlier operating systems,” she said.

Apple is due to file a response in the case by Thursday.

How to encrypt iPhone and Android, and why you should do it now


Apple’s fight with the FBI may be over for the time being, but this high-profile battle over user privacy and state security may have puzzled some smartphone users. When is an iPhone or Android device encrypted? And how does one go about securing the data on them?


It’s pretty simple, actually: as long as you set up a password or PIN for the iPhone or iPad’s lockscreen, the device is encrypted. Without knowing the access code, nobody can unlock it, which means your personal data, including photos, messages, mail, calendar, contacts, and data from other apps, is secured. Sure, the FBI can crack some iPhones, but only if they’re included in criminal investigations, and only if the recent hacks work on all iPhones out there.

If you don’t use a lockscreen password, you should set one right away. Go to Settings, then Touch ID & Passcode, tap Turn Passcode On and enter a strong passcode or password.


As CNET points out, things are a bit more complicated on Android.

The newer the device, the easier it is to get it done. In this category, we have Nexus devices, the Galaxy S7 series, and other new handsets that ship with Android 6.0 preloaded. Just like with the iPhone, go to the Settings app to enable a security lock for the screen, and the phone is encrypted.

With older devices, the encryption procedure is a bit more complex, as you’ll also have to encrypt the handset manually. You’ll even have to do it with some newer devices, including the Galaxy S6 and Moto X Pure. Go to Settings, then Security, then Encrypt phone. While you’re at it, you may want to encrypt your microSD card as well, so data on it can’t be read on other devices – do it from the Security menu, then Encrypt external SD card. Once that’s done, you will still need to use a password for the lockscreen.

CNET says there are reasons you should consider not encrypting your Android device, like the fact that a device might take a performance hit when encrypted. The performance drop may be barely noticeable on new devices, but older models and low-end handsets could suffer.

Forget iPhone encryption, the FBI can’t legally touch the software ISIS uses


The FBI insists that encrypted products like the iPhone and encrypted online services will put people in harm’s way, especially in light of the ISIS-connected San Bernardino shooting late last year. That’s why the Bureau has been arguing for encryption backdoors that would be available to law enforcement agencies, and why it looked to coerce Apple to add a backdoor to iOS.

However, extensive reports on the preparations ISIS made before hitting Paris and Brussels revealed the kind of encrypted products ISIS radicals used to stay in touch with central command. Unsurprisingly, these products are out of the FBI’s jurisdiction, and one in particular is among the safest encrypted communication products you can find online. In fact, its original developers are suspected to have ties to the criminal underworld.

Telling the inside story of the Paris and Brussels attacks, CNN explains that ISIS cell members used a chat program called Telegram to talk to one another in the moments ahead of the attacks. Using data obtained from official investigations, CNN learned that just hours before the Bataclan theater was hit, one of the attackers had downloaded Telegram on a Samsung smartphone.

Police never recovered communications from the messaging app. Not only is Telegram encrypted end-to-end, but it also has a self-destruct setting.

Conceived by Russian developers, the app is out of the FBI’s jurisdiction. But Telegram is the least problematic encrypted service for intelligence agencies looking to collect data and connect suspects. CNN also mentions a far more powerful app, one that hasn’t yet been cracked by law enforcement.

TrueCrypt is the app in question. One of the ISIS radicals who was captured by French police in the months leading to the mid-November Paris attacks revealed details about this program.

TrueCrypt resides on a thumb drive and is used to encrypt messages. French citizen and IT expert Reda Hame was instructed to upload the encrypted message to a Turkish file-sharing site. “An English-speaking expert on clandestine communications I met over there had the same password,” Hame told interrogators. “It operated like a dead letter drop.”

According to The New York Times, Hame was told not to send the message via email, so as to not generate any metadata that would help intelligence agencies connect him to other terrorists.

The ISIS technician also instructed Hame to transfer TrueCrypt from the USB key to a second unit once he reached Europe. “He told me to copy what was on the key and then throw it away,” Hame explained. “That’s what I did when I reached Prague.”

Hame made a long journey home from Turkey, making it look like he was a tourist visiting various cities in Europe. Whenever he reached a new place, he was to call a special number belonging to one of the masterminds behind the attacks, and he used a local SIM card to mark his location.

The Times also mentions a secondary program that was installed on flash drives. Called CCleaner, the program can be used to erase a user’s online history on any computer.

If that’s not enough to show the level of sophistication of these bloody ISIS attacks on Europe and other targets, a story from The New Yorker sheds more light on TrueCrypt, a program whose creators can’t be forced to assist the FBI.

According to the publication, TrueCrypt was launched in 2004 to replace a program called Encryption for the Masses (E4M) developed long before the iPhone existed. Interestingly, the programmer who made it is Paul Le Roux, who also happens to be a dangerous crime lord, having built a global drug, arms and money-laundering cartel out of a base in the Philippines.

E4M is open-source, and so is TrueCrypt, meaning that their creators aren’t companies motivated by a financial interest to keep their security intact.

“TrueCrypt was written by anonymous folks; it could have been Paul Le Roux writing under an assumed name, or it could have been someone completely different,” Johns Hopkins Information Security Institute computer-science professor Matthew Green told The New Yorker.

The developers stopped updating it in 2014 for fear that Le Roux’s decision to cooperate with the DEA might cripple its security. Le Roux was arrested in Liberia on drug-trafficking charges in September 2012. But Green concluded in 2015 that TrueCrypt is still backdoor-free, which explains why ISIS agents still use it.

FBI Hacks iPhone, Ending Apple Encryption Challenge


The Department of Justice said in a federal court filing Monday that it had bypassed encryption on the iPhone 5c used by a terrorist in a mass shooting last year in California and requested the court vacate its order compelling Apple to assist it in accessing the device.

The filing effectively ends a contentious legal battle between the federal government and Apple over the phone used by Syed Rizwan Farook. Farook was fatally shot by authorities along with his wife, Tashfeen Malik, after they killed 14 people in San Bernardino, California, in December.

“The government has now successfully accessed the data stored on Farook’s iPhone and therefore no longer requires the assistance from Apple Inc. mandated by Court’s Order Compelling Apple Inc. to Assist Agents in Search dated February 16, 2016,” government lawyers said in their filing in U.S. District Court for the Central District of California.

The two-page filing contains no information about the methods the government used to bypass the phone’s encryption.

A scheduled March 22 hearing was canceled last week after government lawyers said an “outside party” had proposed a possible way to unlock the phone that would not require Apple’s help. The tech giant had vowed to oppose the order in court, stating that helping the government access an encrypted iPhone would set a precedent for undermining privacy and cybersecurity.

“Our decision to conclude the litigation was based solely on the fact that, with the recent assistance of a third party, we are now able to unlock that iPhone without compromising any information on the phone,” prosecutors said in a statement.

“We sought an order compelling Apple to help unlock the phone to fulfill a solemn commitment to the victims of the San Bernardino shooting – that we will not rest until we have fully pursued every investigative lead related to the vicious attack,” the statement said. “Although this step in the investigation is now complete, we will continue to explore every lead, and seek any appropriate legal process, to ensure our investigation collects all of the evidence related to this terrorist attack. The San Bernardino victims deserve nothing less.”

Why few hackers are lining up to help FBI crack iPhone encryption


When the FBI said it couldn’t unlock the iPhone at the center of the San Bernardino shooting investigation without the help of Apple, the hackers at DriveSavers Data Recovery took it as a challenge.

Almost 200 man-hours and one destroyed iPhone later, the Bay Area company has yet to prove the FBI wrong. But an Israeli digital forensics firm reportedly has, and the FBI is testing the method.

Finding a solution to such a high-profile problem would be a major feat — with publicity, job offers and a big payday on the line. But, in fact, the specialists at DriveSavers are among only a few U.S. hackers trying to solve it. Wary of the stigma of working with the FBI, many established hackers, who can be paid handsomely by tech firms for identifying flaws, say assisting the investigation would violate their industry’s core principles.

Some American security experts say they would never help the FBI; others waver in their willingness to do so. And not all of those who would consider helping want their involvement publicized, for fear of being labeled the hacker who opened a backdoor to millions of iPhones.

“The FBI has done such a horrible job of managing this process that anybody in the hacking community, the security community or the general public who would openly work with them would be viewed as helping the bad guys,” said Adriel Desautels, chief executive of cybersecurity testing company Netragard. “It would very likely be a serious PR nightmare.”

Much of the security industry’s frustration with the FBI stems from the agency’s insistence that Apple compromise its own security. The fact that the FBI is now leaning on outside help bolsters the security industry’s belief that, given enough time and funding, investigators could find a workaround — suggesting the agency’s legal tactics had more to do with setting a precedent than cracking the iPhone 5c owned by gunman Syed Rizwan Farook.

Some like Mike Cobb, the director of engineering at DriveSavers in Novato, Calif., wanted to be the first to find a way in. Doing so could bring rewards, including new contracts and, if desired, free marketing.

“The bragging rights, the technical prowess, are going to be considerable and enhanced by the fact that it’s a very powerful case in the press,” said Shane McGee, chief privacy officer for cybersecurity software maker FireEye Inc.

Altruism could motivate others. Helping the FBI could further an inquiry into how a husband-and-wife couple managed to gun down 14 people, wound many others and briefly get away.

Another positive, McGee said, is that legal liability is low: While unauthorized tampering with gadgets has led to prison time, it’s legal as long as people meddle with iPhones they own — and the court order helps too.

But top security experts doubt the benefits are worth the risk of being seen as a black sheep within their community.

Hackers have said they don’t want to touch the San Bernardino case “with a 10-foot pole because the FBI doesn’t look like the good guy and frankly isn’t in the right asking Apple to put a back door into their program,” Desautels said. The assisting party, if ever identified, could face backlash from privacy advocates and civil liberties activists.

“They’d be tainted,” Desautels said.

The unease in the hacker community can be seen through Nicholas Allegra, a well-known iPhone hacker who most recently worked for Citrix.

Concerned an FBI victory in its legal fight with Apple would embolden authorities to force more companies to develop software at the government’s behest, Allegra had dabbled in finding a crack in iPhone 5c security. If successful, he hoped his findings would lead the FBI to drop the Apple dispute.

But he has left the project on the back burner, concerned that if he found a solution, law enforcement would use it beyond the San Bernardino case.

“I put in some work. I could have put more in,” he said. But “I wasn’t sure if I even wanted to.”

Companies including Microsoft, United Airlines and Uber encourage researchers and even hackers to target them and report problems by dangling cash rewards.

HackerOne, an intermediary for many of the companies, has collectively paid $6 million to more than 2,300 people since 2013. Boutique firms and freelancers can earn a living between such bounties and occasionally selling newly discovered hacking tools to governments or malicious hackers.

But Apple doesn’t have a bounty program, removing another incentive for tinkering with the iPhone 5c.

Still, Israeli firm Cellebrite is said to have attempted and succeeded at defeating the device’s security measures.

The company, whose technology is heavily used by law enforcement agencies worldwide to extract and analyze data from phones, declined to comment. The FBI has said only that an “outside party” presented a new idea Sunday night that will take about two weeks to verify. Apple officials said they aren’t aware of the details.

Going to the FBI before going to the company would violate standard practice in the hacking community. Security researchers almost always warn manufacturers about problems in their products and services before sharing details with anyone else. It provides time for issuing a fix before a malicious party can exploit it.

“We’ve never disclosed something to the government ahead of the company that distributed the hardware or software,” McGee said. “There could be far-reaching consequences.”

Another drawback is that an iPhone 5c vulnerability isn’t considered a hot commodity in the minds of many hackers, who seek to one-up each other by attacking newer, more widely used products. The 5c model went on sale in 2013 and lacks a fingerprint sensor. Newer iPhones are more powerful and have different security built into them. Only if the hack could be applied to contemporary iPhones would it be worth a rare $1-million bounty, experts say.

The limited scope of this case is why many hackers were taken aback by a court order asking for what they consider broadly applicable software to switch off several security measures. Instead, experts wanted the FBI to invest in going after the gunman’s specific phone with more creativity. In other words, attack the problem with technology, not the courts.

“If you have access to the hardware and you have the ability to dismantle the phone, the methodology doesn’t seem like it would be all that complex,” Desautels said.

Two years ago, his team tried to extract data from an iPad at the request of a financial services company that wanted to test the security of the tablets before offering them to employees. Netragard’s researcher failed after almost a month; he accidentally triggered a date change within the software that rendered the iPad unusable. But Desautels said cracking the iPad would have been “possible and trivial” for someone with more time and a dozen iPads to mess with.

The same, he imagines, would be true for an iPhone. The FBI, though, has said it had exhausted all known possibilities.

Taking Apple to court generated attention about the problem and “stimulated creative people around the world to see what they might be able to do,” FBI Director James Comey said in a letter to the Wall Street Journal editorial board Wednesday. Not “all technical creativity” resides within government, he said.

The plea worked, grabbing the interest of companies like DriveSavers, which gets about 2,000 gigs a month to retrieve photos, videos and notes from phones that are damaged or belong to someone who died. But despite all of the enticements in the San Bernardino case, they’ve worked to unlock an iPhone 5c only intermittently.

They’ve made progress. Cobb’s team can spot the encrypted data on an iPhone 5c memory chip. They’re exploring how to either alter that data or copy it to another chip. Both scenarios would allow them to reset software that tracks invalid password entries. Otherwise, 10 successive misfires would render the encrypted data permanently inaccessible.
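The counter-reset idea DriveSavers is exploring is often called NAND mirroring: image the chip’s contents, burn a few passcode guesses, then restore the image so the failure counter never reaches ten. A toy software model of the concept (class names and the four-digit PIN are illustrative assumptions; the real attack involves physically imaging and reflashing the flash chip):

```python
import copy

class FlashChip:
    """Toy stand-in for the phone's flash memory and retry counter."""
    def __init__(self, secret_passcode):
        self.secret = secret_passcode
        self.failed_tries = 0
        self.wiped = False

    def try_passcode(self, guess):
        if self.wiped:
            return False
        if guess == self.secret:
            return True
        self.failed_tries += 1
        if self.failed_tries >= 10:   # ten misses erase the data for good
            self.wiped = True
        return False

chip = FlashChip("4321")
snapshot = copy.deepcopy(chip)            # mirror the chip before guessing

found = None
for guess in (f"{n:04d}" for n in range(10000)):   # brute-force all PINs
    if chip.try_passcode(guess):
        found = guess
        break
    if chip.failed_tries >= 9:            # about to hit the limit:
        chip = copy.deepcopy(snapshot)    # restore the mirror; counter resets

assert found == "4321" and not chip.wiped
```

Restoring the snapshot every nine misses lets the brute force run to completion without ever triggering the wipe, which is the whole point of copying the data to another chip.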

Swapping chips requires soldering, which the iPhone isn’t built to undergo multiple times. They have an adapter that solves the issue, and about 300 old iPhones in their stockpile in case, as one already has, the device gets ruined.

Had they been first to devise a proposed solution, DriveSavers “absolutely” would have told the FBI because their method doesn’t present extraordinary security risks, Cobb said.

But whether it would want to be publicly known as the code cracker in the case, Cobb said that would be “a much bigger, wider conversation” to ponder.

Debate over tech tools’ encryption


Before the San Bernardino terror attack, Syed Rizwan Farook’s iPhone was just one fancy Apple device among hundreds of millions worldwide.

But since the California government worker and his wife shot and killed 14 people on December 2, apparently inspired by extremist group IS, his iPhone 5c has become a key witness – and the government wants Apple to make it talk.

The iPhone, WhatsApp, even social media – government authorities say some of tech fans’ favourite playthings are also some of the most powerful, and problematic, weapons in the arsenals of violent extremists.

Now, in a series of quiet negotiations and noisy legal battles, they’re trying to disarm them, as tech companies and civil liberties groups fight back.

The public debate started with a court order that Apple hack a standard encryption protocol to get at data on Farook’s iPhone, but its repercussions are being felt beyond the tech and law enforcement worlds.

“This is one of the harder questions that we will ever have to deal with,” said Albert Gidari, director of privacy at Stanford Law School’s Centre for Internet and Society.

“How far are we going to go? Where does the government power end to collect all evidence that might exist, and whether it infringes on basic rights? There’s no simple answer,” he told DPA.

It’s not new that terrorists and criminals use mainstream technology to plan and co-ordinate, or that law enforcement breaks into it to catch them. Think of criminals planning a robbery by phone, foiled by police listening in.

But as encryption technology and other next-generation data security move conversations beyond the reach of a conventional wiretap or physical search, law enforcement has demanded the industry provide “back-door” technology to access it too.

At the centre of the fray are otherwise mainstream gadgets and platforms that make private, secure and even anonymous data storage and communication commonplace.

Hundreds of millions of iPhones running iOS 8 or higher are programmed with the same auto-encryption protocol that has stymied investigators in the San Bernardino attack and elsewhere.

US authorities are struggling with how to execute a wiretap order on Facebook-owned WhatsApp’s encrypted messaging platform, used by 1 billion people, the New York Times reported.

In a similar case earlier this month, Brazilian authorities arrested a company executive for not providing WhatsApp data the company said it itself could not access.

Belgium’s interior minister Jan Jambon said in November he believed terrorists were using Sony’s PlayStation 4 gaming network to communicate, Politico reported, although media reports dispute his assertions.

In a world where much of social interaction has moved online, it’s only natural that violent extremism has made the move too.

ISIS, in particular, has integrated its real-world operations with the virtual world, using social media like Twitter and YouTube for recruitment and propaganda and end-to-end encryption for secure communication, authorities say.

Law enforcement authorities and government-aligned terror experts call it the “digital jihad”.

Under pressure from governments, social media providers have cracked down on accounts linked to extremists. Twitter reported it had closed 125,000 ISIS-linked accounts since mid-2015.

Most in the industry have drawn the line at any compromise on encryption, however, saying the benefits of secure data outweigh the costs of its abuse by criminals – leaving authorities wringing their hands.

“Something like San Bernardino” or the November 13 terror attack in Paris “can occur with virtually no indications it was about to happen,” retired general and former Obama anti-terror envoy John Allen warned an audience of techies at the South by Southwest digital conference.

Just a day before, US President Barack Obama had made an unprecedented appearance there, calling for compromise in the showdown between government and tech.

Citing examples of child pornographers, airline security and Swiss bank accounts, Obama said authorities must have the ability to search mobile devices, encrypted or not.

But Gidari called it a “Pandora’s box” too dangerous to open.

Apple CEO defends position in encryption dispute with feds



Apple CEO Tim Cook said in an interview Wednesday it was a tough decision to resist a court order directing the tech giant to override security features on the iPhone used by one of the San Bernardino gunmen who killed 14 people in a December terror attack.

However, Cook reiterated to ABC News in his first interview since the controversy erupted last week that if his company complied with the FBI’s demand to unlock Syed Rizwan Farook’s encrypted phone it would be “bad for America.”

“Some things are hard and some things are right, and some things are both. This is one of those things,” Cook said. The interview came as both sides in the dispute are courting public support, through interviews and published statements, while also mustering legal arguments in the case.

Federal authorities have insisted they’re only asking for narrow assistance in bypassing some security features on the iPhone, which they believe contains information related to the mass murders. Apple argues that doing so would make other iPhones more susceptible to hacking by authorities or criminals in the future.

The Apple chief expressed sympathy for the shooting victims’ families, and said his company provided engineers and technical advice to authorities investigating the case. But he said authorities are now asking the company “to write a piece of software that we view as sort of the equivalent of cancer.”

The software could “expose people to incredible vulnerabilities,” Cook added, arguing that smartphones contain private information about users and even their families.

“This would be bad for America,” he said. “It would also set a precedent that I believe many people in America would be offended by.”

Meanwhile, Attorney General Loretta Lynch defended the FBI’s push to access the locked phone Wednesday, saying judges at all levels have held that such companies “must assist if it is reasonably within their power to do so” – and suggesting Congress does not need to get involved, as Apple wants.

But Lynch used testimony Wednesday before a House appropriations subcommittee to lay out the DOJ position that courts already have found companies must assist in opening devices.

“If the government needs the assistance of third parties to ensure that the search is actually conducted, judges all over the country and on the Supreme Court have said that those parties must assist if it is reasonably within their power to do so,” she said, without mentioning Apple by name. “And that is what we have been asking, and we owe it to the victims and to the public whose safety we must protect to ensure that we have done everything under the law to fully investigate terrorist attacks on American soil.”

Apple also is expected to argue that the Obama administration’s request that it help hack into an iPhone in the federal investigation of the San Bernardino attack is improper under an 18th-century law, the 1789 All Writs Act, which has been used to compel companies to assist law enforcement.

Magistrate Judge Sheri Pym in California ordered Apple last week to create specialized software to help the FBI hack into a locked, county-issued iPhone used by Farook.

Here’s why the FBI forcing Apple to break into an iPhone is a big deal



When U.S. Magistrate Sheri Pym ruled that Apple must help the FBI break into an iPhone belonging to one of the killers in the San Bernardino, Calif., shootings, the tech world shuddered.

Why? The battle of encryption “backdoors” has been longstanding in Silicon Valley, where a company’s success could be made or broken based on its ability to protect customer data.

The issue came into the spotlight after Edward Snowden disclosed the extent to which technology and phone companies were letting the U.S. federal government spy on data being transmitted through their networks.

Since Snowden’s revelations, Facebook, Apple and Twitter have unilaterally said they will not create such backdoors.

So here’s the “backdoor” the FBI wants: Right now, iPhone users have the option to set a security feature that only allows a certain number of tries to guess the correct passcode to unlock the phone before all the data on the iPhone is deleted. It’s a security measure Apple put in place to keep important data out of the wrong hands.

Federal prosecutors looking for more information behind the San Bernardino shootings don’t know the phone’s passcode. If they guess incorrectly too many times, the data they hope to find will be deleted.

That’s why the FBI wants Apple to disable the security feature. Once the security is crippled, agents would be able to guess as many combinations as possible.
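To see why that retry limit is the whole ballgame, consider the arithmetic: a four-digit passcode has only 10,000 possible values. The sketch below is purely illustrative — `try_passcode` is a hypothetical stand-in, not Apple’s actual unlock interface — but it shows how quickly a short passcode falls once the auto-erase feature is out of the way.

```python
# Hypothetical sketch of why the retry limit matters: a four-digit passcode
# has only 10,000 possible values, so with auto-erase disabled an attacker
# can simply try them all. try_passcode() is a stand-in, not Apple's API.

SECRET = "0351"  # pretend this passcode is set on the device

def try_passcode(guess):
    """Stand-in for the device's real passcode check."""
    return guess == SECRET

def brute_force_four_digits():
    """Enumerate 0000 through 9999 until the check succeeds."""
    for n in range(10_000):
        guess = f"{n:04d}"
        if try_passcode(guess):
            return guess
    return None

print(brute_force_four_digits())  # -> 0351, after at most 10,000 attempts
```

In practice iPhones also impose escalating delays between wrong guesses, which is the other feature the FBI asked Apple to remove.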

Kurt Opsahl, general counsel for the Electronic Frontier Foundation, a San Francisco-based digital rights non-profit, explained that this “backdoor” means Apple will have to write brand-new code that compromises key features of the phone’s security. Apple has five business days to respond to the request.

What does Apple have to say about this? The company hasn’t commented yet today, but back in December, Apple CEO Tim Cook defended the company’s use of encryption on its mobile devices in a broad interview with 60 Minutes, saying users should not have to trade privacy for national security. In the interview, Cook stood by the company’s refusal to hand over users’ encrypted texts and messages.

Describing a user’s iPhone, he said: “There’s likely health information, there’s financial information. There are intimate conversations with your family, or your co-workers. There’s probably business secrets and you should have the ability to protect it. And the only way we know how to do that is to encrypt it. Why is that? It’s because if there’s a way to get in, then somebody will find the way in.”

Cook says Apple cooperates with law enforcement requests, but can’t access encrypted information on users’ smartphones. According to a page on Apple’s website detailing government requests, Apple says encryption data is tied to the device’s passcode.
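Apple doesn’t publish the exact scheme on that page, but the general idea of tying encrypted data to a passcode can be sketched with a standard key-derivation function: the key exists only when the passcode is entered, so the company has nothing on file to hand over. The salt, iteration count, and choice of PBKDF2-HMAC-SHA256 below are illustrative assumptions, not Apple’s actual parameters — real iPhones additionally entangle the passcode with a hardware-bound device key, which is why the derivation can’t be run off the device at all.

```python
import hashlib

def derive_key(passcode, salt, iterations=200_000):
    """Derive a 256-bit encryption key from a passcode via PBKDF2-HMAC-SHA256.

    Illustrative only: real devices also mix in a per-device hardware key,
    so this derivation is a simplification of what actually happens on-device.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations, dklen=32)

salt = b"per-device-random-salt"  # assumed value for the sketch
key = derive_key("0351", salt)

# Same passcode and salt -> same key; the key itself is never stored.
assert key == derive_key("0351", salt)
# A different passcode yields a completely different key.
assert key != derive_key("0352", salt)
print(key.hex())
```

The upshot is the company’s dilemma in miniature: without the passcode there is no key, and without the key the stored data is just noise.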

Cook also dismissed the idea that iPhone users should swap privacy for security. “We’re America. We should have both.”

What does this mean for the next time the government wants access? The order doesn’t create a precedent in the sense that other courts will be compelled to follow it, but it will give the government more ammunition.

What do digital rights experts have to say? There are two things that make this order very dangerous, Opsahl said. The first is the question it raises about who can make this type of demand: if the U.S. government can force Apple to do this, why can’t the Chinese or Russian governments?

The second is that while the government is requesting a program to break into this one specific iPhone, once the program is created it will essentially be a master key. It would be possible for the government to take this key, modify it and use it on other phones. Trusting that the government could hold this power without ever misusing it is a big risk, he said.

And the lawmakers? Well, they are torn. A key House Democrat, Rep. Adam Schiff, D-Calif., says Congress shouldn’t force tech companies to build encryption backdoors. Congress is struggling with how to handle the complex issue.

On the other side of things, Senate Intelligence Committee Chairman Richard Burr, R-N.C., and Vice Chair Dianne Feinstein, D-Calif., say they want to require tech companies to provide a backdoor into encrypted communication when law enforcement officials obtain a court order to investigate a specific person.

What now? This could push tech companies to give users access to unbreakable encryption. To some extent, it’s already happening: companies like Apple and Google — responding to consumer demands for privacy — have developed smartphones and other devices with encryption so strong that even the companies can’t break it.