Why You Need Private Browsing

If you thought browsing securely (and privately) was as easy as opening a new incognito window, think again.

Private browsing is all the rage now that it’s necessary in order to access certain websites in some countries. Luckily, there are lots of ways to access the web that don’t require Safari, Firefox, or Chrome. There are also ways to surf the internet that aren’t actually secure at all — even if they’re advertised as such. The first step to tapping into a safe connection is understanding what a safe connection is — and what it’s not.

This is not what private browsing looks like.

Google Chrome’s Incognito mode may cover your tracks online locally, but it doesn’t erase them entirely. When you choose to browse privately using a major web browser, the places you visit online will not accumulate in your computer’s history. This way, no one else who accesses your device will be able to see the websites you used during your private browsing session. (In fact, you won’t even be able to see them yourself.)

Except…you can. In fact, anyone can: that is, anyone with access to your internet bill. All it takes is calling up your internet service provider and requesting a log of the websites you visited at any given time and day. (Yes, this can include times and days when you were browsing “privately.”)

You and anyone with access to your internet bill aren’t the only ones who can see your browsing history, either. The websites you visit can also see you, even if you’re not logged into an account associated with their services. This is because your path to each website isn’t protected. Online, who you are is defined by how you arrived there.

Encryption is the Key

Truly private browsing requires an encrypted connection through a browser that has Virtual Private Network (VPN) capabilities. This isn’t your typical browser, but rather a special kind that you may have to do a bit of Googling to find (that is, unless you’re lucky enough to find yourself reading this article).

When you connect to the internet through a VPN, the true origin of your connection is concealed. This is because your traffic is routed through a remote server (some private browsers let users choose from a number of remote servers, but most don’t). Unlike a standard connection, browsing through a VPN leaves your device’s point of origin unidentifiable.

The only location visible when you’re browsing through a VPN is the one you choose to display. Private browsers with VPN capabilities let you pick from connections around the world to serve as your apparent point of origin. (If you connect through a VPN location in Switzerland, it will appear as though you are browsing the web from Switzerland, even if your physical location is Palo Alto, California.)

Encrypted Browsing in the Workplace

In the workplace, things get a bit more complicated. Although a VPN connection will encrypt your traffic, your employer’s IT department may still be able to tell that you are using an encrypted connection, especially if you’re on the company network. This may be against your company’s policy, so be aware of the consequences.

Also, if you’re on a company machine, it may already be managed by corporate IT, and your activities may already be monitored whether a VPN is on or not. The safest bet is to use a VPN on your own personal device, over mobile data rather than the company network, to keep your browsing private from your employer.

How to Choose a Private Browser

There are many private browsers out there that are completely free, which is why choosing the right one for the job can be a daunting task. Since private browsing’s rise in popularity in recent months, some have even adopted questionable means of serving their users (including feigning VPN capabilities and selling data).

The first thing to note when shopping for a private browser is what makes it private. If the only thing advertised is the ability to delete your local history, then you’re being sold a glorified incognito window. The incognito modes of today’s major browsers do not encrypt your traffic.

What should be advertised first is which VPN options the browser offers. A user-friendly, encrypted private browser will provide different servers to connect through, easy ways to switch between them, and an intuitive interface for connecting to and disconnecting from the web.

Encryption is crucial for truly private browsing because it masks information about your surfing habits such as how long you stayed on a site, how many times you visited, and what your activity log looked like for any particular website. Someone snooping on your online activity may be able to see how much data you’re using in a browsing session, but they won’t be able to see how it’s used if your connection is encrypted.

There are a number of quality private browsers out there that can be downloaded for free, but it’s important to look out for any hidden catch. When a web product or service is offered for free, sometimes that’s because you’re paying for it with your data.

Other Ways to Stay Safe Online

Browsing privately isn’t the only way to protect your data on the internet. Even without a private browser, you can start using these tools to enhance your everyday web experience and make yourself harder to track.

Start by switching your default search engine. Google’s AdSense makes a private browsing experience on Google impossible. Private search engines such as DuckDuckGo and StartPage don’t track your habits for the sake of targeting advertisements at you.

If you browse the web primarily from your phone, be sure to turn off geotagging to prevent the public caching of your physical location each time you take a photo. (If you’re using a private browser but still have this feature turned on, your browsing location will conflict with your physical location.)

There are many free password managers available that will help you generate passwords that are difficult to crack, and that will remind you when it’s time to change them.
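
Under the hood, a password generator is simple: sample characters from a large alphabet using a cryptographically secure random source. Here is a minimal sketch in Python; the function name and default length are illustrative, not taken from any particular password manager.

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Sample characters from letters, digits, and punctuation using
    the cryptographically secure `secrets` module (never `random`)."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # a fresh 20-character password each call
```

With 94 possible characters per position, a 20-character password drawn this way has far more entropy than anything a person would memorize, which is exactly why a manager is needed to store it.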

Last but not least, you can use browser security tools such as HTTPS Everywhere and Privacy Badger to protect your data even when you’re not browsing privately.

French official begins anti-encryption campaign

A French official plans to begin mobilizing a global effort — starting with Germany — against tech companies encrypting their messaging apps, according to Reuters.

Messaging apps that promote end-to-end encryption, such as Telegram and WhatsApp, are used by terrorists to organize attacks in Europe, the minister said. Although individual governments have previously explored seeking mandatory backdoors from tech companies, this is the first attempt to unify the case across countries. If successful, it could make it more difficult for these companies to resist the requests.

The debate over the use of end-to-end encryption in chat apps recently made headlines after it was revealed that terrorists might have used secure chat apps to coordinate a slew of attacks in France. Law-enforcement officials argue that the highly secure tech impedes their ability to carry out investigations relating to crimes that use the chat apps.

Tech companies argue that providing backdoor access to their apps, even to governments, creates a potential vulnerability that can be targeted by malicious players seeking access to users’ personal data. To add weight to this argument, it was revealed last week that a “golden key” built by Microsoft for developers was accidentally leaked. And while the company has sent out patches for a majority of its devices, it’s unlikely to reach those potentially affected.

FBI Chief Calls for National Talk Over Encryption vs. Safety

SAN FRANCISCO — The FBI’s director says the agency is collecting data that he will present next year in hopes of sparking a national conversation about law enforcement’s increasing inability to access encrypted electronic devices.

Speaking on Friday at the American Bar Association conference in San Francisco, James Comey says the agency was unable to access 650 of 5,000 electronic devices investigators attempted to search over the last 10 months.

Comey says encryption technology makes it impossible in a growing number of cases to search electronic devices. He says it’s up to U.S. citizens to decide whether to modify the technology.

The FBI earlier this year engaged in a high-profile fight with Apple to access data from a locked iPhone used by a shooter in the San Bernardino, California, terrorist attack.

Google finally adds HSTS encryption to google.com

Google, known for its security practices, has finally brought HTTP Strict Transport Security (HSTS) to google.com to strengthen its data encryption. HSTS helps protect against eavesdroppers, man-in-the-middle attacks, and hijackers who attempt to spoof a trusted website. Chrome, Safari, and Internet Explorer all support HSTS.

“HSTS prevents people from accidentally navigating to HTTP URLs by automatically converting insecure HTTP URLs into secure HTTPS URLs,” said Jay Brown, a senior technical program manager for security at Google, in a blog post. “Users might navigate to these HTTP URLs by manually typing a protocol-less or HTTP URL in the address bar, or by following HTTP links from other websites.”
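
The policy Brown describes is carried in a single HTTP response header, `Strict-Transport-Security`, whose syntax is defined by RFC 6797. This small helper (a hypothetical sketch, not Google code) shows what a cautious one-day policy versus a mature one-year policy looks like on the wire.

```python
def hsts_header(max_age_seconds: int, include_subdomains: bool = False) -> str:
    """Build a Strict-Transport-Security header value. A browser that
    receives this over HTTPS rewrites future http:// requests to the
    host into https:// for `max_age_seconds`."""
    parts = [f"max-age={max_age_seconds}"]
    if include_subdomains:
        parts.append("includeSubDomains")
    return "; ".join(parts)

# A cautious one-day policy, like the initial google.com rollout:
print(hsts_header(86400))              # max-age=86400
# A mature one-year policy covering subdomains:
print(hsts_header(31536000, True))     # max-age=31536000; includeSubDomains
```

This is why ramping up `max-age` matters: the protection only covers the window the header promises, so a longer duration shrinks the chance that a first request ever goes out over plain HTTP.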

Typically, implementing HSTS is a fairly simple process, Brown said. But due to Google’s complex infrastructure, the company had to address mixed content, bad HREFs, redirects to HTTP, and other issues, such as updating legacy services, which could cause problems for users as they try to access the core domain.

Brown also noted that the team accidentally broke Google’s Santa Tracker just before Christmas last year during testing.

According to Google, about 80% of requests to its servers today use encrypted connections. The use of HSTS goes a step further by preventing users from mistakenly visiting unsafe URLs.

Certain domains, including PayPal and Twitter, will be automatically configured with HSTS to keep users safe, according to Google’s HSTS Preload List.

Google is now focused on increasing the “max-age,” or the duration that the header is active. The max-age is currently set to one day to help mitigate the risk of any potential problems with the rollout. “By increasing the max-age, however, we reduce the likelihood that an initial request to www.google.com happens over HTTP,” Brown said. “Over the next few months, we will ramp up the max-age of the header to at least one year.”

Increasing encryption

Google is currently working to implement HTTPS across all of its products. In March 2014, the company announced the use of HTTPS-only for Gmail.

Increasing encryption and security around its core products will be key for Google to remain in good standing with enterprise and consumer customers as concerns over cybersecurity ramp up across verticals.

Encryption remains at the forefront of many cybersecurity discussions, especially after last year’s terrorist attack in San Bernardino, CA, and the FBI’s dispute with Apple over access to the shooter’s iPhone.

In March, Google joined Facebook, Microsoft, and others who filed in support of Apple in its refusal of a court order forcing it to unlock the shooter’s iPhone for authorities.

The Federal Bureau of Investigation is holding ongoing talks with technology companies about a range of privacy and encryption issues, according to FBI Director James Comey. The agency is also collecting statistics on the effect of encryption on its investigations.

“Encrypting data in transit helps keep our users and their data secure,” Brown said. “We’re excited to be implementing HSTS and will continue to extend it to more domains and Google products in the coming months.”

Hacker finds breach in WhatsApp’s encryption system

A security expert has found a breach in WhatsApp’s supposed ‘end-to-end’ encryption system. In early 2016, the Facebook-owned company proudly announced that messages would feature end-to-end encryption, giving users the assurance that their private conversations would remain untouched.

Jonathan Zdziarski, a digital forensic specialist and digital security expert, published an article on Thursday with bold declarations. He stated that WhatsApp does not really delete users’ messages. Zdziarski started several conversations on his WhatsApp account, using an iPhone. After a bit of chit-chat, he deleted, cleared and archived some of the conversations. Finally, he clicked the “Clear All Chats” feature.

The “deleted records” were not actually deleted: the messages still appeared in SQLite, the relational database WhatsApp uses for local storage. According to Zdziarski, the chat database gets copied every time an iPhone user does a backup, landing in desktop backups and in iCloud (Zdziarski states that this is “irrelevant to whether or not you use WhatsApp’s built-in iCloud sync”).

What are the risks?

Zdziarski stated that the “leftover” evidence in SQLite poses some risks. For example, if somebody has physical access to a smartphone, he or she could hack it and create a backup of that information. In the same way, if a hacker has physical access to a computer, he or she could enter an “unencrypted backup” and access messages.

Law enforcement could obtain clear records of conversations by giving Apple a court order. Zdziarski has been very clear in stating that he doesn’t believe WhatsApp is keeping information on purpose. He even offers some advice in the article about how the company could make the service better and safer.

Alternatives

For Zdziarski, the only way to truly delete WhatsApp messages is to remove the app entirely. However, he offered some tips to “minimize” the risks. For example, using iTunes to set a very complex backup password can help. Using Configurator to lock the smartphone is also a good idea, since it makes it harder for someone else to steal the phone’s passwords.

Finally, users would have to disable iCloud backup. For those who still feel uneasy, there are a few safer alternatives. Telegram, an app available for Android and iOS, promises end-to-end encryption. The app is popular among NGOs, not least for its “self-destruct” mode for messages.

Telegram’s founder, Pavel Durov, also founded the social networking site VK. After a dispute with Russian authorities, he left the country in self-imposed exile. VK is now owned by Mail.Ru Group, which dominates the social networking market in Russia and is allied with Putin.

After this, he decided to create an instant messaging service with the aim of giving Russians a secure app that would be unbreakable by Russian intelligence services. The BlackBerry Messenger service is also considered secure, since its PIN-to-PIN service uses the Triple Data Encryption Standard.

Do anti-encryption Democrats see the importance of encryption now?

One would certainly hope so after the turmoil that has followed the release of thousands of DNC emails by Wikileaks. But Democratic lawmakers in the past have worked to weaken encryption standards, demanding backdoors that they say can be used by law enforcement authorities to track terrorists, but also leave computers vulnerable to hackers.
Consider CISA, a bill introduced to the Senate by California Democrat Dianne Feinstein. Despite near-unanimous expert testimony opposing the bill, along with a vocal public outcry, 30 Democratic senators voted in favor of passing the bill last year. This year, Feinstein coauthored the “Compliance with Court Orders Act of 2016” with Republican Senator Richard Burr, in the name of protecting America from terrorism following the FBI’s battle with Apple over decrypting the San Bernardino shooter’s iPhone.

As encryption expert Jonathan Zdziarski wrote following the announcement of the Feinstein-Burr bill, “The reality is that there is no possible way to comply with it without intentionally backdooring the encryption in every product that may be used in the United States.” While it’s still unclear how, exactly, hackers got into the DNC’s servers, Democrats now know, in the most personal way, the kinds of embarrassments that can result from encryption vulnerabilities.

The Democrats can blame Russia all they want. The fact of the matter is that stronger encryption, like the end-to-end encryption now standard in everything from iMessage to WhatsApp, continues to be the best defense against hackers.

Facebook to add end-to-end encryption to Messenger app

Facebook has started to introduce a setting to its “Messenger” app that provides users with end-to-end encryption, meaning messages can only be read on the device to which they were sent.

The encrypted feature is currently available only in beta to a small number of users for testing, but it will roll out to all of the app’s estimated 900 million users by late summer or in the fall, the social media giant said.

The feature will be called “secret conversations”.

“That means the messages are intended just for you and the other person – not anyone else, including us,” Facebook announced in a blog post.

The feature will also allow users to set a timer, causing messages to expire after the allotted amount of time passes.

Facebook is the latest to join an ongoing trend of encryption among apps.

Back in April, WhatsApp, which is owned by Facebook and has more than a billion users, strengthened its encryption settings so that messages are visible only on the sending and receiving devices.

WhatsApp had been providing limited encryption services since 2014.

The company says it is now using a powerful form of encryption to protect the security of photos, videos, group chats and voice calls in addition to the text messages sent by more than a billion users around the globe.

Controversy

Encryption has become a hotly debated subject, with some US authorities warning that criminals and armed groups can use it to hide their tracks.

“WhatsApp has always prioritised making your data and communication as secure as possible,” a blog post by WhatsApp co-founders Jan Koum and Brian Acton said, announcing the change at the time.

Like Facebook has until now, Google and Yahoo use less extensive encryption to protect emails and messages while they are in transit, to prevent outsiders from eavesdropping.

Apple uses end-to-end encryption for its iMessage service, but some experts say WhatsApp’s method may be more secure because it provides a security code that senders and recipients can use to verify a message came from someone they know – and not from a hacker posing as a friend.
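
The verification idea is straightforward: both endpoints derive the same short code from the public identity keys in play and compare it out of band. The toy sketch below (not WhatsApp’s actual safety-number algorithm, which builds on the Signal Protocol) shows why a hacker’s substituted key is detectable.

```python
import hashlib

def fingerprint(my_key: bytes, their_key: bytes) -> str:
    """Toy fingerprint: hash both public identity keys in a canonical
    (sorted) order so each side computes the identical code."""
    material = b"".join(sorted([my_key, their_key]))
    return hashlib.sha256(material).hexdigest()[:16]

alice_key = b"alice-public-key"
bob_key = b"bob-public-key"
mallory_key = b"mallory-public-key"

# Both endpoints derive the same code to compare out of band:
assert fingerprint(alice_key, bob_key) == fingerprint(bob_key, alice_key)
# A man-in-the-middle substituting a key changes the code:
assert fingerprint(alice_key, mallory_key) != fingerprint(alice_key, bob_key)
```

If the codes the two people read to each other match, no third party has slipped its own key into the exchange; if they differ, someone is impersonating a contact.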

Full disk encryption flaw could affect millions of Android users

When it comes to vulnerabilities and security, Google’s Android has never been in the good books of security experts, or even many of its users. Now another vulnerability has surfaced that reportedly leaves millions of devices exposed. Security researcher Gal Beniamini has revealed a new flaw in Android’s encryption.

According to the Daily Mail, the researcher says that Android devices with full disk encryption and powered by Qualcomm processors are at risk of brute-force attacks, in which hackers use a persistent trial-and-error approach. Full disk encryption is available on all devices running Android 5.0 onwards. It generates a 128-bit master key protected by the user’s password. The report adds that the key is stored on the device and can be extracted by malicious actors.

“Android FDE is only as strong as the TrustZone kernel or KeyMaster. Finding a TrustZone kernel vulnerability or a vulnerability in the KeyMaster trustlet, directly leads to the disclosure of the KeyMaster keys, thus enabling off-device attacks on Android FDE,” Beniamini explains.
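
The off-device attack Beniamini describes works because, once the key material leaves the hardware, guessing the password becomes an ordinary brute-force loop. Here is a toy illustration; Android’s real scheme uses scrypt and hardware-bound KeyMaster keys, not this bare PBKDF2 sketch, and the fixed salt is purely for demonstration.

```python
import hashlib

SALT = b"\x00" * 16  # fixed here purely for illustration

def derive_key(password: str, iterations: int = 1000) -> bytes:
    """Derive a 128-bit key from a password with PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), SALT,
                               iterations, dklen=16)

# Suppose the attacker has extracted the stored key material:
target = derive_key("1234")

# Off-device brute force: a short PIN falls to simple enumeration,
# because nothing rate-limits guesses on the attacker's own hardware.
recovered = next(g for g in (f"{n:04d}" for n in range(10000))
                 if derive_key(g) == target)
print(recovered)  # 1234
```

This is the point of binding the key to TrustZone hardware: if derivation can only happen on the device, the attacker cannot parallelize or accelerate the guessing, which is exactly the property the disclosed flaw breaks.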

The vulnerability arises from the way Qualcomm processors handle security verification in combination with the Android kernel. Google and Qualcomm are working on releasing security patches, but Beniamini said that fully fixing the issue may require a hardware upgrade.

“Full disk encryption is used world-wide, and can sometimes be instrumental to ensuring the privacy of people’s most intimate pieces of information. As such, I believe the encryption scheme should be designed to be as “bullet-proof” as possible, against all types of adversaries. As we’ve seen, the current encryption scheme is far from bullet-proof, and can be hacked by an adversary or even broken by the OEMs themselves (if they are coerced to comply with law enforcement),” he adds.

Lately, the encryption debate took centre stage when Apple refused to unlock an iPhone belonging to a terrorist involved in the San Bernardino shooting. The FBI reportedly managed to break into the device without Apple’s help, and is believed to have paid a whopping $13 million to do so.

US wiretap operations encountering encryption fell in 2015

The US government has been very vocal recently about how the increase in encryption on user devices is hampering its investigations. Yet according to a report from the Administrative Office of the U.S. Courts, law enforcement agencies with court-ordered wiretaps encountered fewer encrypted devices in 2015 than in 2014.

In regards to encrypted devices, the report states: “The number of state wiretaps in which encryption was encountered decreased from 22 in 2014 to seven in 2015. In all of these wiretaps, officials were unable to decipher the plain text of the messages. Six federal wiretaps were reported as being encrypted in 2015, of which four could not be decrypted.”

This is out of 2,745 state and 1,403 federal wiretaps, for a grand total of 4,148, an increase of 17 percent over 2014. So while surveillance increased, the number of times law enforcement encountered encryption decreased.

Earlier this year the Department of Justice and FBI were locked in a court battle with Apple over an encrypted iPhone used by San Bernardino shooter Syed Rizwan Farook. The government eventually dropped the case after finding a third party to help it bypass the phone’s security.

But it started a national debate about personal devices and encryption. Tech companies want their customers to be secure while law enforcement want backdoors or keys to encrypted devices for investigations. But it looks like when it comes to wiretaps, encryption isn’t as big a problem as many would suspect.

Supreme Court rejects PIL for WhatsApp ban, but encryption debate is just beginning

WhatsApp’s end-to-end encryption might still be a contentious issue, but on Wednesday the Supreme Court refused to allow a PIL seeking a ban on the popular app and similar messenger services.

The PIL, filed by Gurugram-based RTI activist Sudhir Yadav, said these apps have complete encryption, which poses a threat to the country’s security.

A bench of Chief Justice T S Thakur and Justice A M Khanwilkar rejected the PIL, suggesting Yadav could approach the government or Telecom Regulatory Authority of India (TRAI) with his plea.
But Yadav said his applications to the department of telecommunications and the government drew the response that they did not possess information on the matter. The petitioner contended that the end-to-end 256-bit encryption WhatsApp introduced in April made all messages, chats, calls, videos, images and documents end-to-end encrypted, making it impossible for security agencies to decode them.

According to him, this could be a national security threat for India, as agencies will not be able to track terrorists, who can plan attacks without worrying that the government can access their messages. The RTI petitioner sought a balance in which police agencies can get lawful access to data while information otherwise remains private.

So what is WhatsApp’s end-to-end encryption and why has it become such an issue? For starters, WhatsApp’s end-to-end encryption ensures that a user’s messages, videos, photos sent over the app, can’t be read by anyone else — not WhatsApp, not cyber-criminals, not law-enforcement agencies. Even calls and group chats are end-to-end encrypted.

End-to-end encryption means encryption at the device level and thus your chats, messages, videos are not stored on WhatsApp’s servers at all. The only way to access this data is if your device is compromised and the messages have not been deleted. This encryption is designed to keep out man-in-the-middle attacks.
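
The core idea can be shown with a toy symmetric cipher: the two devices share a key, and any relay server in between only ever handles ciphertext. This XOR-keystream sketch is for illustration only; real apps use vetted protocols such as the Signal Protocol, not hand-rolled ciphers.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode. Illustration only."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in
                 zip(plaintext, keystream(key, nonce, len(plaintext))))

decrypt = encrypt  # XOR stream ciphers are their own inverse

key = b"shared-secret-known-only-to-the-two-devices"
nonce = b"msg-001"
ciphertext = encrypt(key, nonce, b"meet at noon")

# The relay server only ever sees `ciphertext`; without `key`, which
# never leaves the endpoints, it cannot recover the message.
assert decrypt(key, nonce, ciphertext) == b"meet at noon"
```

Everything hinges on where the key lives: because it exists only on the two devices, neither the service operator nor anyone intercepting traffic in transit can turn the ciphertext back into the message.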

Given WhatsApp has over a billion users, this end-to-end encryption is a big deal. Let’s not forget that in Brazil, a senior WhatsApp executive was jailed because the company did not hand over data in a court case. WhatsApp claimed the data is encrypted and it does not have access to it.

WhatsApp co-founder Jan Koum, in fact, is known for dedication to user privacy and this is also one of the reasons the app has never sold ads. When WhatsApp announced the end-to-end encryption, Koum wrote, “People deserve security. It makes it possible for us to connect with our loved ones. It gives us the confidence to speak our minds. It allows us to communicate sensitive information with colleagues, friends, and others. We’re glad to do our part in keeping people’s information out of the hands of hackers and cyber-criminals.”

WhatsApp has relied on the Signal Protocol, designed by Open Whisper Systems, for its end-to-end encryption. Also significant is that the feature is enabled by default on WhatsApp, unlike apps such as Telegram, where you have to enter a secret chat mode for end-to-end encrypted chats.

WhatsApp is also one of the most popular apps in India. In fact, research has consistently shown it is one of the most used apps after Facebook, and it is common for people in India to be part of various groups on the service: family, school, college friends, even office groups. End-to-end encryption means all of this data is secure and can’t be accessed by third parties, including government agencies.

For now the courts have refused to ban WhatsApp, and have instead directed Yadav towards the government. India per se doesn’t have a law on what kind of encryption third-party apps can use.

As we noted earlier, the 40-bit encryption limit, which is too low for current times, is something ISPs and TSPs have to stick to, and it doesn’t apply to apps.

Until India comes up with an encryption law, WhatsApp remains legal and we’ll have to wait and watch how the encryption versus security agency debate plays out in the country.

Greedy Bart ransomware encrypts files in ZIP archives

A new ransomware threat known as Bart is experimenting with the price it charges victims and encryption strategies.

If your PC is infected by Bart, you will be asked to pay three Bitcoin (BTC), or just under $2,000, to regain access to your files, significantly more than the usual fee of 0.5 BTC ($300) to 1.5 BTC.

Also, you won’t get a decryption key, but rather a password that opens the password-protected ZIP archives into which the files on Bart-infected machines have been copied.

While .zip is intended primarily for compression, it also offers encryption. However, as PC World recently pointed out, the program used to create and open the ZIP file determines whether the weak ZipCrypto encryption or the tougher-to-crack AES-256 is used.
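
Python’s standard library illustrates the asymmetry PC World describes: `zipfile` can read legacy ZipCrypto archives when given a password (the `pwd` argument), but it cannot write encrypted entries at all; producing AES-256 ZIPs requires third-party tools (for example, the pyzipper package). A minimal sketch:

```python
import io
import zipfile

# Build an archive in memory. The standard library can only WRITE
# unencrypted entries, though it can READ legacy ZipCrypto archives.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("note.txt", "hello")

with zipfile.ZipFile(buf) as zf:
    # For a ZipCrypto-protected archive you would add pwd=b"secret":
    data = zf.read("note.txt")

print(data)  # b'hello'
```

Which scheme a given archive uses matters enormously to victims: ZipCrypto is weak enough to attack directly, while AES-256-protected archives are only recoverable with the ransomed password.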

Security firm PhishMe noted on Friday that Bart’s use of .zip files for encryption differs from most file encrypting ransomware, which traditionally use a more sophisticated asymmetric, public-private key pair for encryption.

Another distinguishing feature of Bart is that it doesn’t rely on command-and-control infrastructure to determine which PCs the malware should encrypt or to deliver instructions for paying the ransom.

Security firm Proofpoint also reported the emergence of Bart on Friday, and said that instead of using a command and control host, it relied on a unique browser identifier in the URL.

The Bart ransomware also won’t run if it detects that the user’s system language is Russian, Ukrainian, or Belarusian, according to Proofpoint.

Proofpoint also found links between Bart and the more widely used Locky ransomware, such as a similar-looking payment page, and noted that, like Locky, Bart is being distributed via spam email. However, Proofpoint found the ransomware code itself to be “largely unique” compared with Locky.

Russia encryption grab may require chat backdoors as standard

MOOTED LEGAL CHANGES in Russia may apply a boot to the face of open and private chat messaging services and create a very cold winter for communications.

Reports from the country said that plans to require backdoors in otherwise encrypted chat services are quite advanced and will launch with a mandatory status.

Russia is often accused of messing with internet liberties, but before we get on our high horse we should remember that this is exactly the kind of ambrosia that the UK and US would like to have with their anti-terror breakfast.

Local news site CurrentTime said that companies resisting the anti-terror laws could be fined, and named WhatsApp as the kind of service that would be affected.

The report explained that senator Elena Mizulina referred to a research group of some kind, and some ill repute, called the League of Safe Internet that had uncovered evidence of unwelcome underground operations including “a number of closed groups where teenagers [are] brainwashed to kill police officers”.

She added that perhaps it is time to start nipping such activity in the bud and that Russia could “maybe go back to the idea of pre-filtering [messages] as we cannot look at it in silence”.

CurrentTime has a clip of the legislation and it does seem as though Russia will ensure that the right level of deterrent is in place.

“Failure to comply with the organiser of the dissemination of information on the internet obligation to submit to the federal executive authority in the field of safety information required for decoding the received, sent, delivered or processed by electronic communications,” said the bill.

“It is proposed to punish by a fine of ₽3,000 to ₽5,000 [£32 to £52] for citizens, ₽30,000 to ₽50,000 [£316 to £528] for officials and ₽800,000 to ₽1m [£8,450 to £10,565] for legal entities.”

Apple to expand encryption on Macs

Apple is amping up its commitment to encryption.

The company is beginning the first major overhaul of the Mac file system — the way it stores files on the hard drive — in more than 18 years. The move was quietly announced during a conference breakout session after Apple’s blockbuster unveiling of its new operating system, macOS Sierra.

Amidst other new features, including the ability to place timestamps on files accurate to fractional seconds and a more efficient mechanism to clone files, the new Apple File System (APFS) updates file encryption.
The new system allows files to be encrypted with multiple keys, providing an extra layer of security against attackers or, to the FBI’s recent chagrin, law enforcement agencies.

The shift comes after Apple faced vocal criticism over its commitment to encrypted data when it refused to unlock an iPhone used by one of the shooters in the San Bernardino, Calif., terrorist attack.

Currently, on computers using OS X’s encryption, all files are encrypted with the same key. The operating system unlocks the files on computers where a user has logged in. If an attacker compromises the key or attacks the computer while a user is logged in, the files are no longer protected.

On APFS, users will have the option to encrypt different segments of the file storage system with different keys. Access to one file wouldn’t mean access to all of them.
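The benefit of per-file keys can be shown with a short toy sketch. Here a one-time pad stands in for APFS’s real cipher, and nothing below reflects Apple’s actual implementation; the point is only that stealing one key exposes one file, not the whole disk.

```python
# Toy sketch of per-file keys: compromising one key exposes one file,
# not the whole disk. A one-time pad stands in for APFS's real cipher;
# nothing here reflects Apple's actual implementation.
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

files = {"notes.txt": b"meeting notes", "taxes.pdf": b"2015 tax return"}

# One random key per file, rather than a single disk-wide key.
keys = {name: secrets.token_bytes(len(data)) for name, data in files.items()}
encrypted = {name: xor_bytes(data, keys[name]) for name, data in files.items()}

# An attacker who steals only the notes.txt key can read that file...
assert xor_bytes(encrypted["notes.txt"], keys["notes.txt"]) == b"meeting notes"
# ...but the other files, locked with their own keys, stay protected.
assert encrypted["taxes.pdf"] != files["taxes.pdf"]
```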

APFS will also encrypt the metadata contained in each file.

The new file system will be released in 2017, months after Sierra’s release.

Apple Echoes Commitment to Encryption after Orlando Shooting

Apple used the kickoff of its Worldwide Developers Conference Monday to reaffirm the company’s stance on encryption and data monetization, one day after the most deadly mass shooting in U.S. history threatened to rekindle the debate surrounding the use of the technology.

“In every feature that we do, we carefully consider how to protect your privacy,” Apple senior vice president of software engineering Craig Federighi told conference attendees in San Francisco Monday.

Federighi said that includes the Cupertino-based company’s commitment “to use end-to-end encryption by default,” and described a new approach at Apple known as “differential privacy,” which uses machine learning on crowdsourced data to understand how people use Apple products without tracking specific data back to individual users.

Federighi’s keynote came one day after 29-year-old Omar Mateen shot and killed 49 people at a gay nightclub in Orlando early Sunday; authorities later said Mateen pledged allegiance to ISIS during the attack.

The scenario echoes last year’s shooting in San Bernardino, where two attackers later found to have made a similar pledge to the Islamic extremist terror group were found in possession of an iPhone after a shootout with police that left both dead. The FBI asked Apple to bypass the device’s encryption as part of its investigation — a request Apple refused, prompting a court battle that ended prematurely after the FBI found a third party to crack the phone’s encryption.

Investigators recovered a phone from Mateen after he died in Sunday’s attack, but have declined to identify its make. Regardless of whether the device is an Apple product, the shooting could easily become fodder for those in government pushing for a back door into encrypted communication platforms like Apple’s, especially given the increasing number and popularity of encryption applications like Telegram or the Facebook-owned WhatsApp.

“We are going through the killer’s life — especially his electronics — to understand as much as we can about his path and whether there was anyone else involved, either in directing him or in assisting him,” FBI Director James Comey said Monday.

The FBI director said investigators are confident Mateen was self-radicalized online.

Comey has repeatedly testified before Congress on the emerging issue of terrorists and criminals “going dark” online as a result of their use of communication platforms with end-to-end encryption, which in Apple’s case, not even the company itself can access without a user’s PIN.

The tug of war between privacy and security has spread from cases still pending in court against Apple and others to Congress, where lawmakers have offered several legislative proposals to discuss or even mandate law enforcement cooperation, all the way up to the 2016 presidential election, with Donald Trump calling for a “boycott” of Apple products.

Apple CEO Tim Cook opened the conference Monday by leading the crowd in a moment of silence for the victims of Sunday’s shooting.

“The Apple community is made up of people from all around the world, all different backgrounds, all different points of view,” said Cook, who came out as gay in 2014. “We celebrate our diversity.”

“We offer our deepest sympathies to everyone whose lives were touched by this violence,” he continued, “this senseless, unconscionable act of terrorism, of hate aimed at dividing and destroying.”

Cook wrote an open letter earlier this year in the wake of the San Bernardino debate pushing back against the FBI’s attempt to force the company into cooperating.

Amazon to remove encryption capabilities from its Kindle Fire; rumours say the Apple & FBI case is the reason – Lansing Technology Time

According to Amazon, removing the Kindle Fire’s onboard encryption in Fire OS 5 is not a new development, and it’s not related to the iPhone fight.

Amazon said that the Fire OS 5 update removed local device encryption support from the Kindle Fire, Fire Phone, Amazon Fire HD, and Amazon Fire TV Stick because the feature simply wasn’t being used.

Privacy advocates and some users criticized the move, which came to light on Thursday even as Apple Inc was waging an unprecedented legal battle over U.S. government demands that the iPhone maker help unlock an encrypted phone used by San Bernardino shooter Rizwan Farook.

On-device encryption scrambles data so that the device can only be accessed if the user enters the correct password. Cryptologist Bruce Schneier said Amazon’s move to remove the feature was “stupid” and called on the company to restore it.

Amazon’s move is a bad one. But it’s not a retreat in the face of Apple-FBI pressures

Among the features removed was one that allowed owners to encrypt their device with a PIN which, if entered incorrectly 30 times in a row, deletes all the data stored on it. The feature is similar to the safety feature found on the iPhone at the center of the San Bernardino shooter case, which erases all the device data if the passcode is entered incorrectly ten times.

Amazon joined other major technology companies in filing an amicus brief supporting Apple on Thursday, asking a federal judge to overturn a court order requiring Apple to create software tools to unlock Farook’s phone.

Amazon spokeswoman Robin Handaly said in an email that the company had removed the encryption feature for Kindle Fire tablets in the fall when it launched Fire OS 5, a new version of its tablet operating system.

“It was a feature few customers were actually using,” she said, adding that Kindle Fire tablets’ communication with the company’s cloud meets its “high standards for privacy and security including appropriate use of encryption.”

Encryption expert Dan Guido said that Amazon may have eliminated the feature to cut component costs for tablets that sell for as low as $50.

But digital privacy advocates and customers said those arguments were not good enough reasons for discontinuing the feature.

“Removing device encryption due to lack of customer use is an incredibly poor excuse for weakening the security of those customers that did use the feature,” said Jeremy Gillula, staff technologist with the Electronic Frontier Foundation.

“Given that the information stored on a tablet can be just as sensitive as that stored on a phone or on a computer, Amazon should instead be pushing to make device encryption the default – not removing it,” Gillula said.

David Scovetta, a security analyst who owns two Kindle e-readers as well as Amazon’s TV set-top box, said he is now wary of buying new gadgets from the company.

“Amazon could just as easily be encouraging its users to adopt it rather than remove it as a feature. That’s a massive step backwards,” he said.

Fire OS 5 is the first release to use the Android 5.0 “Lollipop” codebase, and as such it is possible that this removal is down to a technical issue (such as battery life or performance). Last year Google reported that it would allow hardware makers to decide whether or not to enable encryption-by-default because of performance issues on older devices.

People are talking about the lack of encryption today because the OS update is only now hitting older devices, like the fourth-generation Fire HD and Fire HDX 8.9. Despite how neatly the sudden forfeiture of encryption by a tech giant fits the Apple-FBI narrative, this encryption deprecation isn’t related to that battle. Instead, Amazon appears to have given up onboard encryption without any public fight at all.

UK’s lower house eases up on encryption

The United Kingdom’s House of Commons approved far-reaching authority for spy agencies to access cyber data Tuesday, but pulled back some restrictions on encryption opposed by Apple and Facebook.

The so-called “snooper’s charter,” officially the Investigatory Powers Act, codifies intelligence agencies’ ongoing use of metadata analysis and malware to hack computers in the U.K. It requires communications companies to maintain records of customers’ web browsing for a full year to assist investigations.
But the final version eased up on restrictions on encryption. Early drafts of the law mandated that encryption include backdoor access – an issue that recently sparked a battle between Apple and the FBI in the U.S. The version passed Tuesday requires only that companies help break encryption if it is reasonable in terms of cost and technology.

That would keep the kinds of encryption used on Apple phones and Facebook’s newly announced end-to-end encrypted messaging service off the table. When properly implemented, neither would be technologically possible to crack.

The changes to encryption were one of a few amendments meant to assuage concerns about the law’s effect on privacy. Civil liberties groups are still unhappy with the final product, though interior minister Theresa May called the safeguards “world leading.”

The final vote on the IPA was 444-69. It now heads to the House of Lords for their approval.

Customer Headaches Could Curtail Apple’s Encryption Push

At an event held during Apple’s fight with the FBI over whether it should help unlock a dead terrorist’s iPhone, CEO Tim Cook promised “We will not shrink” from the responsibility of protecting customer data — including from government overreach.

Yet the obvious next step for the company could be hard to take without inconveniencing customers.

Apple is currently able to read the contents of data stored in its iCloud backup service, something at odds with Cook’s claims that he doesn’t want his company to be capable of accessing customer data such as mobile messages.

Apple has not denied reports it is working to change that. And the company is expected to make some mention of its security technology at its Worldwide Developers Conference next week, as it did at its iPhone event in March.

But redesigning iCloud so that only a customer can unlock his data would increase the risk of people irrevocably losing access to precious photos and messages when they lose their passwords. Apple would not be able to reset a customer’s password for them.

“That’s a really tough call for a company that says its products ‘Just work,’” says Chris Soghoian, a principal technologist with the American Civil Liberties Union—referring to a favorite line of Apple’s founder, Steve Jobs.

Cook has boasted of how the encryption built into Apple’s iPhones and iMessage system keeps people safe by ensuring that only they can access their data. FBI director James Comey has complained about it.

But the design of iCloud means that Apple can read much of its customers’ data, and help the government do so, too. The service is enabled by default (although you can opt out), and automatically backs up messages, photos, and more to the company’s servers. There the data is protected by encryption, which Apple has the key to unlock. The company’s standoff with the FBI happened only because the backups Apple handed the agency from San Bernardino shooter Syed Farook’s iPhone ended six weeks before the shooting, because he had turned them off.

Apple could lock itself and law enforcement out of iCloud data by encrypting each person’s iCloud backups using a password under his control, perhaps the same one that locks his iPhone.
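A key locked to a user-held password is typically produced with a slow key-derivation function. Below is a minimal sketch using Python’s standard-library PBKDF2; the salt and iteration count are illustrative choices, not Apple’s actual parameters, and the design is only an analogy for the approach described above.

```python
# Sketch of deriving a backup-encryption key from a user-held password,
# using the standard library's PBKDF2. The salt and iteration count are
# illustrative choices, not Apple's actual parameters.
import hashlib
import os

password = b"correct horse battery staple"   # known only to the user
salt = os.urandom(16)                        # stored alongside the backup
iterations = 200_000                         # slows brute-force guessing

key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations)

# The provider never sees the password and so cannot re-derive the key:
# a forgotten password means the key, and therefore the data, is gone.
wrong = hashlib.pbkdf2_hmac("sha256", b"wrong guess", salt, iterations)
assert key != wrong and len(key) == 32
```

The last two lines are the whole trade-off in miniature: without the exact password there is no key, for the provider and the user alike.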

The company has not denied reports from the Financial Times and Wall Street Journal that it is working on such a design. Passwords and credit card details stored using an iCloud feature called Keychain are already protected in this way. But taking this approach would prevent Apple from being able to reset a person’s password if he forgets it. The data would be effectively gone forever.
It is probably impractical for Apple to roll out that approach for everyone’s data, as the company did for the security protections built into the iPhone, says Vic Hyder, chief strategy officer with Silent Circle, which offers secure messaging, calls, and data sharing for corporations.

“It puts control on the customer but also responsibility on the customer,” he says. “This will likely be an option, not the default.”

Soghoian of the ACLU agrees. “I think they will probably offer it as an option, but be reluctant to advertise that feature much,” he says. “More people forget their passwords than get investigated by the FBI.”

Bryan Ford, an associate professor at the Swiss Federal Institute of Technology in Lausanne, says Apple could take steps to reduce the risk of accidental data loss.

The company’s FileVault disk encryption feature for Macs offers the option to print out a recovery key. A similar process could be used for iCloud encryption, says Ford.

Apple could also implement other safeguards, he says. For example, people could have the option of distributing extra encryption keys or passwords to several “trustees,” who could help recover data if the original password was lost. To prevent abuse it could be required that a certain number of trustees, say, three of five, came forward to unlock the data.
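The threshold recovery Ford describes is classically implemented with Shamir secret sharing. Here is a minimal sketch over a prime field, assuming the 3-of-5 trustee policy from the example above; the integer secret stands in for a recovery key, and nothing here is Apple’s design.

```python
# Minimal Shamir secret-sharing sketch of the "three of five trustees"
# recovery idea described above; not Apple's design. The secret stands
# in for a recovery key, and arithmetic is over a prime field.
import random

PRIME = 2**61 - 1  # a Mersenne prime comfortably larger than the secret

def make_shares(secret: int, threshold: int = 3, total: int = 5):
    # Random polynomial of degree threshold-1 whose constant term is the
    # secret; each trustee's share is one point on that polynomial.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, total + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

secret = 123456789  # stand-in for an encryption key
shares = make_shares(secret)
assert recover(shares[:3]) == secret   # any three trustees suffice
assert recover(shares[2:]) == secret
assert recover(shares[:2]) != secret   # two alone recover garbage
```

Any three shares reconstruct the key exactly, while fewer than three reveal essentially nothing, which is precisely the abuse-resistance property Ford describes.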

The cryptography needed for such a design is well understood, says Ford. He recently designed a similar but more complex system intended to help companies such as Apple prevent their software updates from being abused (see “How Apple Could Fed-Proof Its Software Update System”).

Alan Fairless, cofounder and CEO of SpiderOak, which offers companies fully encrypted data storage, says he thinks companies like Apple will eventually make truly secure cloud storage accessible to consumers.

Encrypted messaging was clunky and hard to use until recently, but is now widespread thanks to Apple and WhatsApp, he points out. Encrypting stored data is more challenging, but Apple has shown itself willing to spend significantly on encryption technology, for example by adding new chips to the iPhone, says Fairless.
However, he also thinks Apple and its customers aren’t yet ready for encrypted iCloud backups to be the default. “It’ll take consumer technology a while to catch up,” says Fairless.

HelpSystems Fills Encryption Gap With Linoma Buy

Despite all the IBM i security vendors that HelpSystems has bought over the years–and there have been at least five of them–the company has lacked one key security capability valued by enterprises: encryption. With last week’s deal to acquire Linoma Software, the Minneapolis software vendor has finally obtained that encryption capability for IBM i.

HelpSystems has been experiencing heavy demand for IBM i encryption capabilities, says CEO Chris Heim. “I wouldn’t say we lost sales because of it, but we definitely wanted to offer a full solution to our customers and that’s why we wanted to check that encryption box,” he tells IT Jungle.

Linoma’s Crypto Complete provides a full-featured encryption solution for IBM i customers. In addition to providing the core encryption capability (by automating the use of IBM’s field-level encryption APIs), it also includes key management and audit trail capabilities that auditors are increasingly expecting companies to have.

Bob Luebbe, who is Linoma’s president and chief architect–and formerly its co-owner along with his wife Christy–says interest in encryption among IBM i shops is on the upswing.

“Most companies have already taken care of credit card data under PCI,” he says. “But now personally identifiable information [PII], such as birthdays and Social Security numbers, is really popular to protect. That’s what we’re seeing the most demand for.”

While there have been no major new federal laws mandating protection of PII, several states have passed privacy laws that address PII, while HIPAA continues to drive solutions for encrypting private health information (PHI). With the average cost of a data breach touching nearly $7 million, the cost of buying software and services to encrypt sensitive fields in a DB2 for i database doesn’t look nearly so bad.

“A lot of companies are being a lot more proactive than ever before,” Luebbe says. “It’s fairly inexpensive to implement encryption compared to getting a multi-million dollar price tag for remediation. Plus a lot of companies in the public eye want to maintain their customers’ trust, to assure them that their data is being protected and secured.”

Getting the AES algorithms to encrypt and decrypt data in a DB2 for i database is one thing. You actually don’t need a third-party tool like Crypto Complete to do that, provided you’re comfortable working with IBM’s APIs (which can be complex). But increasingly, having encryption means more than that.

“Auditors are getting a lot smarter,” Luebbe says. “An auditor, when they came into your shop, they used to ask if you’re encrypting data, and you check that box. But now they’re getting more diligent. They want to know what kind of key management you have in place, who’s authorized to work with those keys, where’s the audit trail, and who’s actually authorized to decrypt that information. They’re really expanding their requirements and putting a lot more pressure on shops to move beyond just calling APIs to encrypt information.”

HelpSystems also had its eye on GoAnywhere, Linoma’s line of managed file transfer (MFT) solutions that help to control the flow of data among file systems and databases running on IBM i, Linux, Windows, and many other on-premise and cloud platforms.

The GoAnywhere suite has been Linoma’s biggest seller lately, and HelpSystems will eagerly begin offering what Heim considers to be best-of-breed.

“I would probably say the encryption piece fills a bigger hole for us in our IBM i security portfolio,” Heim says. “But on cross-platform, it’s MFT. That’s been a dynamite product for Bob. We did a survey of a lot of the products out there and we think it’s the best in the industry.”

There will be few changes for Linoma going forward. The company will continue to operate out of its headquarters in Ashland, Nebraska indefinitely. Linoma’s 2,000 or so customers will get technical support in the same manner. All 32 Linoma employees will be retained; in fact, the company is hiring.

Heim first contacted Luebbe about a possible deal about a year ago, and Luebbe says initially he wasn’t interested. But after several meetings with the Minnesota native, Luebbe eventually came to the conclusion that he could use Help’s help to take Linoma to the next level.

“As we were growing, we were starting to feel the pain in our development [and support structure]. It’s hard to maintain that growth without some help,” Luebbe says. “We were also worried about business continuation if something were to happen to me.”

A similarity between the two companies’ cultures helped seal the deal. “It just felt like a bigger version of Linoma,” Luebbe says. “I love their motto: ‘Happy employees equal happy customers.’ That really drove it home for me. They really treat their people well. They have great customer service.”

Luebbe also likes that he will have HelpSystems’ large Minneapolis team available for brainstorming. “We were like our own little island in the middle of Nebraska,” he says. “It’s great that now we’re going to have a lot of great ideas to bounce back and forth between our sales team and R&D and support team.”

And now that HelpSystems is handling some of the more mundane aspects of running a software business, Luebbe will be free to spend more time with the customers and products.
“I love to give demos and work with the technical team and help design the next releases of the product. Those are the things I love,” he says. “I don’t especially love working with lawyers and accountants and insurance people.”

Added Heim: “We’re taking over that for him.”

Is Facebook making end-to-end encryption on Messenger opt-in only?

Facebook’s native chat is due to be silenced: Facebook’s reportedly going to kill it off, forcing users to instead use Messenger.

Rumor has it that Facebook Messenger will also offer the option of end-to-end encryption sometime in the next few months.

The Guardian, relying on input from three unnamed sources close to the project, earlier this week reported the end-to-end encryption news. Facebook hasn’t confirmed it, declining to comment on rumors or speculation.

The timing of these two things isn’t clear, but it would make sense for them to coincide – kill the native chat app just as a more privacy-protecting version of Messenger is ready to pull users in.

Ars Technica reports that some users are already getting pushed off the mobile version of Facebook’s native chat and onto the free, dedicated Messenger app.

Users of the regular Facebook mobile app were evicted a while ago. Now, it’s happening to those who access it via their phones’ web browsers or via third-party apps such as Tinfoil or Metal, Ars reports.

Some Android users are even being booted off chat automatically, shunted over to Google’s Play store to download Messenger when they try to check out their messages on the mobile site.

End-to-end encryption would shield conversations from all but the sender and receiver. That includes the prying eyes of both government surveillance outfits and the tech companies themselves.

The tradeoff: if Facebook can’t see conversations or get at users’ personal data, it can’t use artificial intelligence (AI) to chime in and do helpful things.

And, as we reported yesterday, Facebook’s on track to do a lot more language processing to figure out, for example, who’s messaging about needing a ride and therefore might want to have an Uber link pop up.

End-to-end encryption in Messenger would also put it on par with other encrypted messaging apps, including Apple iMessage, WhatsApp and Google’s new Allo messaging app.

Both Facebook and Google are trying to balance users’ demands for secure messaging with their thirst for services enhanced by the use of users’ personal data. Their solution: offer the end-to-end encryption as an opt-in feature.

As you might expect, some users are displeased with the notion of being forced onto Messenger, while some privacy experts are displeased with the idea that the speculated end-to-end encryption is opt-in rather than default.

Among those critics, The Guardian quoted Kenneth White, a security researcher and co-director of the Open Crypto Audit Project, which tests the security of encryption software.

The timing on killing native chat and releasing the Messenger crypto feature isn’t known, but The Register reports that it’s already been released for Windows 10 Mobile users.

Despite end-to-end encryption, your WhatsApp and Telegram chats can be spied on

Even though WhatsApp promises end-to-end encryption on all of its chats, and Telegram offers end-to-end encryption on secret chats, the truth is that messages on these platforms can still be hacked. The reason is that the messaging apps still rely on phone networks that use Signalling System No. 7, better known as SS7.

You might recall that back in April, we told you about SS7 when we passed along a story shown on 60 Minutes about hacking. SS7 is a protocol used to connect carriers around the world and affects all smartphone users regardless of the device they use. While SS7 can’t break the encryption employed by the two aforementioned messaging apps, it can be used to fool a wireless operator into helping the hacker open a duplicate WhatsApp and Telegram account in the name of the target.

The first step for a hacker employing SS7 is to trick the target’s carrier into believing that his phone number is the same as the target’s mobile number. Once that is accomplished, the hacker installs WhatsApp and Telegram on his phone, and uses the target’s number to set up new accounts. This allows him to receive the secret code that falsely proves he is the legitimate user of these accounts. Once all this is accomplished, the ruse is on as the hacker can send and receive messages pretending to be the target.
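The takeover works because the app treats receipt of the SMS code as proof of identity. The toy simulation below captures that trust chain; all names, numbers, and classes are invented for illustration and model no real carrier or app.

```python
# Toy simulation of the takeover described above: the messaging app
# treats receipt of the SMS code as proof of identity, so rerouting
# one SMS is enough. All names and numbers here are invented.

class Carrier:
    """Delivers SMS to whichever device the network believes owns a number."""
    def __init__(self):
        self.routes = {"+15551234": "target-handset"}

    def deliver_sms(self, number: str) -> str:
        return self.routes[number]   # device the verification code reaches

    def ss7_reroute(self, number: str, device: str) -> None:
        # SS7 trusts peer networks: a forged "subscriber has roamed"
        # message is enough to redirect a number's traffic.
        self.routes[number] = device

def register_account(carrier: Carrier, number: str) -> str:
    # Ownership is verified only by an SMS code, so the account goes to
    # whichever device the carrier delivers that code to.
    return carrier.deliver_sms(number)

carrier = Carrier()
assert register_account(carrier, "+15551234") == "target-handset"

carrier.ss7_reroute("+15551234", "attacker-handset")   # the SS7 trick
assert register_account(carrier, "+15551234") == "attacker-handset"
```

Note that the apps’ end-to-end encryption is never broken here; the attacker simply becomes a legitimate endpoint.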

You can see how this all works by watching the pair of videos below. Most security firms still prefer WhatsApp and Telegram for their end-to-end encryption, which prevents “man-in-the-middle” hacks that redirect messages to a hacker’s phone. But obviously, opening a duplicate account can allow hackers to read messages not intended for their prying eyes.

Symantec warns encryption and privacy are not the same

“Encryption and privacy is not the same thing,” said Nick Savvides, Symantec APAC cybersecurity strategy manager.

Encryption is a privacy “enhancing tool”, Savvides went on to explain, while privacy is more about handling what information is collected, how the collected information is handled, and what other data can be derived from it. The two are often confused because they are related: Encryption is used to maintain privacy.

Savvides said that unfortunately most websites do not use encryption, highlighting the company’s most recent Internet Security Threat Report, which revealed that 97 percent of active websites do not have any basic security and 75 percent have unpatched vulnerabilities, with 16 percent of those being critical.

Meanwhile, the remaining 3 percent of active websites with security are banks and corporate businesses, according to Savvides.

He said the IT security community often blames “lazy” users for the lack of encryption. However, he said the real hindrance is the complexity that is involved with encryption, and it’s often something that users expect to be provided with.

“They don’t do [encryption] because it’s hard; they only do it when they absolutely have to,” he said.

He pointed out that iMessage, Apple’s built-in instant messaging service, and more recently the mobile messaging app WhatsApp are two examples where end-to-end encryption is provided by default, rather than something users have to actively seek out.

In turn, the security company has extended to Australia its partnership program, Encryption Everywhere, which is already live in North America and Europe. The program falls under Symantec’s goal of achieving 100 percent encryption for all websites globally by 2018.

Under the Encryption Everywhere program, Symantec has initially partnered with WHMCS and cPanel to hand out domain-validated TLS/SSL certificates for free, before taking a multi-tier paid model approach.

“We’d like to see broader base encryption utilised across the world, across the internet. Whether it’s ours or somebody else’s, we’d like to see it adopted because it will make the internet a safer place, free from prying eyes,” Savvides said.

Survey findings from Norton by Symantec released on Tuesday indicated that online threats will not be slowing, particularly with the proliferation of the Internet of Things.

The survey showed that while almost two thirds of Australians use at least one mobile app to manage their finances or control other connected devices, 66 percent do not have security software on their smartphones, and 33 percent choose not to have a password or PIN on these devices.

Despite this, 61 percent of Australians admitted that they would be upset if their financial information was compromised.

According to Mark Gorrie, Norton by Symantec APAC director, as the smartphone becomes a central control hub and a “gateway” to other devices, the onus is on both the vendor and the user to ensure security is top of mind. Gorrie, however, pointed out that historically, vendors have always seen security as an afterthought, but indicated that it has improved more recently.

“Vendors should be taking seriously because it is such a big issue. We see the threats just keep growing every year, and just won’t give up because it’s a profitable business for a lot of people. There is definitely a responsibility security should rank highly on the devices vendors are releasing, but equally people have to be proactive to help themselves,” he said.

Allo doesn’t offer default end-to-end encryption setting because it would disable Google Assistant

When Google unveiled Allo — their smart messaging app coming soon to Android and iOS — one of the more interesting features they revealed was end-to-end encryption. As we later learned, the technology powering Allo’s end-to-end encryption was built upon Signal Protocol, the same open-source protocol from Open Whisper Systems that WhatsApp currently uses.

We’ve known since the announcement that E2E encryption was a feature of Allo’s Incognito mode, but now Ars Technica has confirmed exactly why this is the case. Because Google Assistant is such a huge part of Allo, enabling end-to-end encryption by default simply wouldn’t work: Google couldn’t listen in on conversations to provide smart suggestions for restaurants, or quick replies.

This came after Thai Duong, the co-leader of Google’s product security team, made it known in a blog post that he wished Allo’s E2E encryption were enabled by default (outside of Incognito mode), not left as an option for the user. The sentiment was echoed by Edward Snowden in a Twitter post advising users to avoid the app for now.

It didn’t take long for Duong’s higher-ups to get word, and the blog post was promptly revised (several times, actually). Duong did mention that it would be possible for Google to add a default encryption option where Google Assistant would only work when messaged directly, but there are currently no plans to add such a feature.

In the end, what it comes down to is whether the user values Google Assistant over the privacy of Incognito Mode. It’d be nice to have both, but for now it’s just one or the other.

Google engineer says he’ll push for default end-to-end encryption in Allo

After Google’s decision not to provide end-to-end encryption by default in its new chat app, Allo, raised questions about the balance of security and effective artificial intelligence, one of the company’s top security engineers said he’d push for end-to-end encryption to become the default in future versions of Allo.

Allo debuted with an option to turn on end-to-end encryption, dubbed “incognito mode.” Google obviously takes security seriously, but had to compromise on strong encryption in Allo in order for its AI to work. (Allo messages are encrypted in transit and at rest.)

Thai Duong, an engineer who co-leads Google’s product security team, wrote in a blog post today that he’d push for end-to-end encryption in Allo — then quietly deleted two key paragraphs from his post. Those two paragraphs have been erased from the version of Duong’s post that is currently live.

This edit probably doesn’t mean that Duong won’t continue to lobby internally for end-to-end encryption — his job is to make Google’s products as secure as possible. But Google, like most major companies, is pretty cagey about revealing its plans for future products and likely didn’t want Duong to reveal on his personal blog what’s next for Allo.

Even without the paragraphs on end-to-end encryption, Duong’s post offers interesting insight into Google’s thinking as it planned to launch Allo. For users who care about the security of their messaging apps, Duong highlights that it’s not encryption that matters most to Allo, but rather the disappearing message feature.

“Most people focus on end-to-end encryption, but I think the best privacy feature of Allo is disappearing messaging,” Duong wrote. “This is what users actually need when it comes to privacy. Snapchat is popular because they know exactly what users want.”

Duong also confirmed the likely reason Google didn’t choose to enable end-to-end encryption in Allo by default: doing so would interfere with some of the cool AI features Allo offers. For users who don’t choose to enable end-to-end encryption, Allo will run AI that offers suggestions, books dinner reservations and buys movie tickets. But the AI won’t work if it can’t scan a user’s messages, and it gets locked out if the user enables end-to-end encryption.

We reached out to Google to ask if the company asked Duong to edit his blog post and will update if we hear back. Duong stressed that the post reflected only his personal beliefs, not those of Google — and we hope his advocacy for a default incognito mode comes to fruition.

OSGP custom RC4 encryption cracked yet again

The Open Smart Grid Protocol’s (OSGP) home-grown RC4 encryption has been cracked once again, a year after the easy-to-break custom cipher was first broken.

A year ago, the OSGP Alliance advised that better security would be implemented, but the RC4 cipher remains in place, according to German researchers Linus Feiten and Matthias Sauer.

Feiten and Sauer claim to have the ability to extract the secret key used in the OSGP’s RC4 stream cipher. “Our new method comprises the modification of a known attack exploiting biases in the RC4 cipher stream output to effectively calculate the secret encryption key. Once this secret key is obtained, it can be used to decrypt all intercepted data sent in an OSCP smart grid,” Sauer and Feiten explained in their research.
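The researchers’ key-recovery code is not reproduced here, but the kind of statistical weakness such attacks exploit is easy to demonstrate with a textbook RC4 implementation. The classic Mantin-Shamir bias, for example, makes the second keystream byte zero roughly twice as often as chance; key-recovery attacks accumulate many such biases across intercepted messages. (This sketch is illustrative, not the OSGP-specific attack.)

```python
import random

def rc4_keystream(key: bytes, n: int) -> list:
    """Generate n bytes of textbook RC4 keystream for the given key."""
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA)
    i = j = 0
    out = []
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return out

# Mantin-Shamir bias: over random keys, the second output byte is 0 with
# probability about 1/128 instead of the 1/256 an ideal cipher would give.
rng = random.Random(1)  # fixed seed so the experiment is repeatable
trials = 20000
zeros = sum(
    rc4_keystream(bytes(rng.randrange(256) for _ in range(16)), 2)[1] == 0
    for _ in range(trials)
)
print(zeros / trials)  # close to 0.0078, i.e. roughly double 1/256
```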

Extracting the secret key can expose the energy consumption of an individual customer; an attacker could also forge messages reporting incorrect information to the grid operator.

Grid operators have been waiting on vendor support to protect their networks with the alliance’s OSGP-AES-128-PSK encryption specification, released in July and described as a “new work proposal for standardisation purposes”.

John McAfee claims to have hacked WhatsApp’s encrypted messages, but the real story could be different

Last month, WhatsApp enabled end-to-end encryption for its billion users to secure all communications between them — be it group chats, voice calls, personal chats or the photos and videos being shared. While WhatsApp says it is difficult even for the company itself to access the conversations, cybersecurity expert John McAfee and his team of four hackers claim to have successfully read an encrypted WhatsApp message, Cybersecurity Ventures reports. While it sounds like a bold claim, the real story could be completely different.

John McAfee, creator of one of the most popular anti-virus programs, apparently tried to trick the media into believing that he had hacked the encryption used by WhatsApp, Gizmodo reports. To convince reporters that he could read encrypted conversations, McAfee is said to have sent them two phones preinstalled with malware containing a keylogger.

According to Dan Guido, a cybersecurity expert who was contacted to verify the claim, McAfee sent two Samsung phones in sealed boxes to the reporter. The experts then took the phones out and exchanged a text on WhatsApp, which McAfee was able to read over a Skype call. Citing sources, the publication also reports that McAfee offered his story to a couple of big publications, including Russia Today and the International Business Times.

“John McAfee was offering to a different couple of news organizations to mail them some phones, have people show up, and then demonstrate with those two phones that [McAfee] in a remote location would be able to read the message as it was sent across the phones. I advised the reporter to go out and buy their own phones, because even though they come in a box it’s very easy to get some saran wrap and a hair dryer to rebox them,” Guido told the publication.

McAfee has a long history of being shifty, especially when it comes to his alleged cybersecurity exploits. Earlier this year, for instance, he claimed to have hacked into San Bernardino terrorist Syed Farook’s phone, but he never managed to prove the claim. McAfee later admitted that he had lied to get public attention.

This time, too, McAfee seems to have lied to get reporters to buy his story, and when they asked to verify the claim, he changed it. Moxie Marlinspike, who developed and implemented the encryption used in WhatsApp, told the publication about McAfee admitting his plan.

“I talked to McAfee on the phone, he reluctantly told me that it was a malware thing with pre-cooked phones, and all the outlets he’d contacted decided not to cover it after he gave them details about how it’d work,” he said.

With McAfee’s claims turning out to be false, WhatsApp’s assertion that it does not hold the ‘key’ to decrypt communications holds up so far. If someone does someday manage to break into the conversations, however, it could wreak havoc: while it might give authorities the ability to monitor conversations between terrorists, it would also breach the privacy of ordinary users.

Legal effects of encryption bills discussed at dark web event

An attorney who has worked for the U.S. Army and the Central Intelligence Agency discussed attempts to regulate encryption technologies at the Inside Dark Web conference in New York City on Thursday.

“State legislative response may be unconstitutional, because it would place a burden on interstate commerce,” said Blackstone Law Group partner Alexander Urbeis. “So they may, in fact, be a way to encourage the federal government to enact encryption legislation.” Several states, including California, Louisiana, and New York, have introduced encryption legislation recently.

California’s “Assembly Bill 1681,” which would have created a $2,500 penalty for phone manufacturers and operating-system providers that leased or sold smartphones in the state for each instance in which they did not obey a court order to decrypt a phone, was defeated last month. A similar bill proposed in New York is currently in committee.

“The economic implications would outstrip the privacy implications,” Urbeis said, discussing the effects of the encryption bill sponsored by Sen. Dianne Feinstein (D-Calif.) and Senate Intelligence Committee Chairman Richard Burr (R-N.C.). “The economic implications of these legislation have not been fully thought through. They are obviously going to become very attractive targets for hackers, criminal groups.”

Urbeis also heads Black Chambers, an information security firm that protects legal privilege. Many law firms “have lost the confidence of clients to protect their data,” he said, discussing the reaction to the Panama Papers. “Law firms have been for a long time the soft underbelly of their clients,” he said.

American ISIS Recruits Down, but Encryption Is Helping Terrorists’ Online Efforts, Says FBI Director

The number of Americans traveling to the Middle East to fight alongside Islamic State has dropped, but the terrorist group’s efforts to radicalize people online are getting a major boost from encryption technology, FBI Director James Comey said Wednesday.

Since August, just one American a month has traveled or attempted to travel to the Middle East to join the group, compared with around six to 10 a month in the preceding year and a half, Mr. Comey told reporters in a round table meeting at FBI headquarters.

However, federal authorities have their hands full trying to counter Islamic State’s social media appeal. Of around 1,000 open FBI investigations into people who may have been radicalized across the U.S., about 80% are related to Islamic State, Mr. Comey said.

The increasing use of encrypted communications is complicating law enforcement’s efforts to protect national security, said Mr. Comey, calling the technology a “huge feature of terrorist tradecraft.”

The FBI director cited Facebook Inc.’s WhatsApp texting service, which last month launched end-to-end encryption in which only the sender and receiver are able to read the contents of messages.

“WhatsApp has over a billion customers—overwhelmingly good people but in that billion customers are terrorists and criminals,” Mr. Comey said. He predicted an inevitable “collision” between law enforcement and technology companies offering such services.

Silicon Valley leaders argue that stronger encryption is necessary to protect consumers from a variety of threats.

“While we recognize the important work of law enforcement in keeping people safe, efforts to weaken encryption risk exposing people’s information to abuse from cybercriminals, hackers and rogue states,” WhatsApp CEO Jan Koum wrote last month in a blog post accompanying the rollout of the stronger encryption technology. The company Wednesday declined to comment on Mr. Comey’s remarks.

The FBI also continues to face major challenges in unlocking phones used by criminals, including terrorists, Mr. Comey said. Investigators have been unable to unlock around 500 of the 4,000 or so devices the FBI has examined in the first six months of this fiscal year, which began Oct. 1, he said.

“I expect that number just to grow as the prevalence of the technology grows with newer models,” Mr. Comey added.

A terrorist’s locked iPhone recently sparked a high-stakes legal battle between the Justice Department and Apple Inc.
After Syed Rizwan Farook and his wife killed 14 people and wounded 22 in a December shooting rampage in San Bernardino, Calif., FBI agents couldn’t unlock the phone of Mr. Farook—who, along with his wife, was killed later that day in a shootout with police.

The government tried to force Apple to write software to open the device, but the technology company resisted, saying that such an action could compromise the security of millions of other phones.

That court case came to an abrupt end in March, when the FBI said it no longer needed Apple’s help because an unidentified third party had shown it a way to bypass the phone’s security features.

Users’ interest should drive encryption policy: IAMAI

Encryption is a fundamental and necessary tool to safeguard digital communication infrastructure but the interests of Internet users should be foremost in framing any policy, the Internet and Mobile Association of India (IAMAI) said here on Tuesday.

“Trust, convenience and confidence of users are the keywords to designing an ideal encryption policy that will help in getting more people online with safe and secured internet platforms,” said IAMAI president Subho Ray.

The association, which has published a discussion paper on encryption policy, suggests that a broad-based public consultation with all stakeholders including users groups should precede making of an encryption policy.

According to the paper, the foundation of a user centric encryption policy consists of freedom of encryption, strong encryption base standard, no plaintext storage and mandatory legal monitoring or no backdoor entry.

An essential element of the paper is the suggestion that support for strong encryption is critical to countering cyber security issues around the globe; it also stresses the importance of freedom of encryption for users, organisations and business entities.

The encryption challenge

IT managers know the movies get it wrong. A teenager with a laptop cannot crack multiple layers of encryption — unless that laptop is connected to a supercomputer somewhere and the teenager can afford to wait a few billion years.
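The “few billion years” is, if anything, an understatement. A back-of-the-envelope calculation for brute-forcing a single AES-128 key, assuming a hypothetical machine testing a trillion keys per second, runs as follows:

```python
# Average-case brute force of a 128-bit key: half the keyspace must be
# searched. The guess rate below is a generous, hypothetical figure.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365
guesses_per_second = 10**12   # one trillion keys per second (assumed)
keyspace = 2**128             # AES-128

years = keyspace // 2 // guesses_per_second // SECONDS_PER_YEAR
print(f"{years:.2e} years")   # about 5e18 years, hundreds of millions of
                              # times the current age of the universe
```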

Encryption works. It works so well that even the government gets stymied, as demonstrated by the lengths to which the FBI went to access an iPhone used by one of the San Bernardino, Calif., shooters.

So in the face of ever more damaging stories about data breaches, why aren’t all government agencies encrypting everything, everywhere, all the time?

Encryption can be costly and time consuming. It can also be sabotaged by users and difficult to integrate with legacy applications.

Furthermore, according to a recent 451 Research survey of senior security executives, government agencies seem to be fighting the previous war. Instead of protecting data from hackers who’ve already gotten in, they’re still focusing on keeping the bad guys out of their systems.

Among U.S. government respondents, the top category for increased spending in the next 12 months was network defenses — at 53 percent. By comparison, spending for data-at-rest defenses such as encryption ranked dead last, with just 37 percent planning to increase their spending.

Part of the reason for those figures is that government agencies overestimate the benefits of perimeter defenses. Sixty percent said network defenses were “very” effective, a higher percentage than any other category, while government respondents ranked data-at-rest defenses as less effective than respondents in any other category.

There was a time when that attitude made sense. “Organizations used to say that they wouldn’t encrypt data in their data centers because they’re behind solid walls and require a [password] to get in,” said Steve Pate, chief architect at security firm HyTrust.

That attitude, however, runs counter to the modern reality that there is no longer a perimeter to protect. Every organization uses third-party service providers, offers mobile access or connects to the web — or a combination of all three.

A security audit at the Office of Personnel Management, for example, showed that use of multifactor authentication, such as the government’s own personal identity verification card readers, was not required for remote access to OPM applications. That made it easy for an attacker with a stolen login and password to bypass all perimeter defenses and directly log into the OPM systems.

An over-reliance on perimeter defenses also means that government agencies pay less attention to where their important data is stored than they should.

According to the 451 Research survey, government respondents were among those with the lowest confidence in the security of their sensitive data’s location. Although 50 percent of financial-sector respondents expressed confidence, only 37 percent of government respondents could say the same.

In fact, only 16 percent of all respondents cited “lack of perceived need” as a barrier to adopting data security, but 31 percent — or almost twice as many — government respondents did so.

Earlier this year, the Ponemon Institute released a report showing that 33 percent of government agencies use encryption extensively, compared to 41 percent of companies in general and far behind the financial sector at 56 percent. In that survey of more than 5,000 technology experts, 16 percent of agency respondents said they had no encryption strategy.

On a positive note, the public sector has been making headway. Last year, for example, only 25 percent of government respondents to the Ponemon survey said they were using encryption extensively.

“This is showing heightened interest in data protection,” said Peter Galvin, vice president of strategy at Thales e-Security, which sponsored the Ponemon report. High-profile data breaches have drawn public attention to the issue, he added.

ON ENCRYPTION: THERE ARE NO LOCKS ONLY “ANGELS” CAN OPEN

Despite the FBI dropping its case against Apple over whether the tech giant should supply the government agency with the ability to hack into the San Bernardino shooter’s iPhone, the argument over how our devices — especially our phones — should be encrypted continues to rage on. And regardless of how you feel about the issue, almost everybody agrees that the debate can be pretty murky, as privacy-vs.-protection debates usually are. To make the whole argument a lot easier to digest, however, we have one of the web’s best educators and entertainers, CGP Grey, who has broken it all down in one clear five-minute video.

The video, posted above, parallels physical locks with digital “locks” (encryption), noting how they relate and how they differ in order to help us better understand the encryption debate. And one of the most important points that Grey makes about digital locks is that they need to work not only against local threats, but threats from across the globe — threats coming from “internet burglars” and their “burglar bots.”

Grey touches on the scenario in which a bad guy with an armed bomb dies, leaving behind only an encrypted phone with the code to stop the bomb. In this particular case — a parallel to the San Bernardino shooter case, as there may have been information regarding further threats on his encrypted phone — Grey points out that this may be a time when we’d want the police to have access, or a “backdoor,” to the phone. But if companies were forced to build backdoors into their products so government agencies could use them for situations like these, could we trust authorities not to abuse their powers? Could we trust that “demons” (people with bad intentions) wouldn’t hijack the backdoors?

Grey argues that we couldn’t, saying that “there’s no way to build a digital lock that only angels can open and demons cannot.”

There’s also a bonus “footnote” video (below) in which Grey discusses just how intimate the data on our phones has become (do you remember where you were on April 8th at 6:02AM? No? Your phone does).

What do you think about CGP Grey’s breakdown of the encryption debate?

Moot point: Judge closes iPhone encryption case in Brooklyn

The United States Justice Department said on Friday that it has withdrawn a request compelling Apple Inc to cooperate in unlocking an iPhone related to a New York drug case, after a third party provided the authorities with a passcode to access the handset.

“An individual provided the department with the passcode to the locked phone at issue in the Eastern District of New York”, Justice Department spokesman Marc Raimondi said in a statement.

On Friday, the Justice Department told a federal court in Brooklyn that it would withdraw the motion to force Apple to pull data from a drug dealer’s locked iPhone, The Washington Post reported.

Investigators have dropped the court case against Apple as they have successfully gained access to the iPhone 5s involved in the NY drug case.

There are about a dozen other All Writs Act orders for Apple’s assistance with opening other devices that remain unresolved but are not in active litigation, according to a Justice Department official. Apple, meanwhile, demanded to know in the New York case whether the government had exhausted all other options to get to the data.

The company said it “strongly supports, and will continue to support, the efforts of law enforcement in pursuing criminals”, but not through the government’s misuse of a law it wants to use as a “precedent to lodge future, more onerous requests for Apple’s assistance”.

The case dates back to 2014, when authorities seized the iPhone 5s of the suspect Jun Feng. Feng pleaded guilty in October to conspiring to distribute methamphetamine and is scheduled to be sentenced in June. Comments attributed to Apple’s attorneys also suggest that while the company isn’t aware of the method used, it’s convinced that normal product development is eventually going to plug whatever exploit was used to gain access to that iPhone.

According to the Wall Street Journal, that “individual” is Feng himself, who has already been convicted and only recently became aware that his phone was the subject of a national controversy.

The San Bernardino case, by contrast, began on February 16 with an order from Judge Sheri Pym and ended on March 28 when the Justice Department withdrew its legal action against Apple.

Comey’s remarks strongly implied that the bureau paid at least $1.3 million to get into that phone, which had belonged to Syed Rizwan Farook, who, with his wife, killed 14 people during the December 2 terror attack in San Bernardino, Calif.

FBI: Encryption increasing problem

The FBI is facing an increasing struggle to access readable information and evidence from digital devices because of default encryption, a senior FBI official told members of Congress at a hearing on digital encryption Tuesday.

Amy Hess said officials encountered passwords in 30 percent of the phones the FBI seized during investigations in the last six months, and investigators have had “no capability” to access information in about 13 percent of the cases.

“We have seen those numbers continue to increase, and clearly that presents us with a challenge,” said Hess, the executive assistant director of the FBI branch that oversees the development of surveillance technologies.

In her testimony to a subcommittee of the House Energy and Commerce Committee, Hess defended the Justice Department’s use of a still-unidentified third party to break into the locked iPhone used by one of the two San Bernardino, California, attackers. But she said the reliance on an outside entity represented just “one potential solution” and that there’s no “one-size-fits-all” approach for recovering evidence.

Representatives from local law enforcement agencies echoed Hess’s concerns. Thomas Galati, chief of the intelligence bureau at the New York Police Department, said officials there have been unable to break open 67 Apple devices for use in 44 different investigations of violent crime — including 10 homicide cases.

Still, despite anxieties over “going dark,” a February report from the Berkman Center for Internet and Society at Harvard University said the situation was not as dire as law enforcement had described and that investigators were not “headed to a future in which our ability to effectively surveil criminals and bad actors is impossible.”

The hearing comes amid an ongoing dispute between law enforcement and Silicon Valley about how to balance consumer privacy against the need for police and federal agents to recover communications and eavesdrop on suspected terrorists and criminals. The Senate is considering a bill that would effectively prohibit unbreakable encryption and require companies to help the government access data on a computer or mobile device when a warrant is issued. Bruce Sewell, Apple’s general counsel, touted the importance of encryption.

“The best way we, and the technology industry, know how to protect your information is through the use of strong encryption. Strong encryption is a good thing, a necessary thing. And the government agrees,” Sewell testified.

How Apple makes encryption easy and invisible

Do you know how many times a day you unlock your iPhone? Every time you do, you’re participating in Apple’s user-friendly encryption scheme.

Friday, the company hosted a security “deep dive” at which it shared some interesting numbers about its security measures and philosophy as well as user habits. To be honest, we’re less concerned with how Apple’s standards work than the fact that they do and will continue to. But that’s kind of the point behind the whole system — Apple designed its encryption system so that we don’t even have to think about it.

Apple’s encryption and security protocols have faced a ton of scrutiny during its recent showdown with the government. And if anything, that debate has gotten more people thinking seriously about how data can and should be secured. And the topic is not going away for a while.

We weren’t there Friday, but Ben Bajarin from Techpinions offers some great analysis, and his piece includes some really cool stats. For one, Apple says that the average user unlocks their phone 80 times a day. We don’t know if that’s across all platforms or just iOS. It sounds a little low in my case, however, because I’m generally pretty fidgety.

But because people are checking their phones so often, it’s important for Apple developers to make encryption powerful without causing the end user frustration. Like if they could just plunk their thumb down, and their phone would unlock, for example.

Apple also says that 89 percent of people who own Touch ID-enabled devices use the feature. That’s a really impressive adoption rate, but it makes sense when you think about how much easier the biometric system is to use than a passcode.

Passcodes are great, of course, and you have to have one. But as an experiment a while ago, I turned off Touch ID and went numbers-only to unlock my phone. And guess what? It was really annoying. I switched the feature back on by the end of the day.

Apple also talked up its so-called Secure Enclave, the slightly intimidating name for the dedicated co-processor that has handled all encryption for its devices since the iPhone 5s. Each Enclave has its own unique ID (UID) that it uses to scramble all of the other data for safekeeping. Neither Apple nor other parts of your phone know what that UID is; it all just happens on its own. And that’s pretty much how we prefer it.
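Apple has not published the Enclave’s key-derivation code, so the sketch below is only a conceptual illustration, with invented names and an arbitrary iteration count: the idea is that the data-protection key is derived from both the user’s passcode and the device-unique UID, which is why passcode guessing has to happen on the device itself, at a rate the hardware controls.

```python
import hashlib
import os

# Hypothetical stand-in for the UID fused into the chip at manufacture;
# on real hardware it is never readable by software, Apple included.
DEVICE_UID = os.urandom(32)

def derive_key(passcode: str) -> bytes:
    """Entangle the passcode with the device UID (illustrative PBKDF2, not Apple's KDF)."""
    # Many slow iterations make every guess expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 200_000)

key = derive_key("1234")
assert derive_key("1234") == key   # same passcode on the same device: same key
assert derive_key("1235") != key   # wrong passcode: useless key
# The same passcode with a different UID (i.e. off-device guessing) also fails:
assert hashlib.pbkdf2_hmac("sha256", b"1234", os.urandom(32), 200_000) != key
```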

Apple, FBI set to resume encryption fight at House hearing

The encryption battle between Apple and the FBI is moving from the courtroom to Congress next week.

Representatives from the tech titan and the federal law enforcement agency are scheduled to testify Tuesday before the House Energy and Commerce Committee about the debate over how the use of encryption in tech products and services hampers law enforcement activities.

In February, Apple clashed with the FBI over whether the company would help investigators hack into the encrypted iPhone of San Bernardino shooter Syed Farook. That case ended when the FBI said it had found a way to unlock the phone without Apple’s help. The debate, however, is unresolved.

Technology companies and rights groups argue that strong encryption, which scrambles data so it can be read only by the right person, is needed to keep people safe and protect privacy. Law enforcement argues it can’t fight crimes unless it has access to information on mobile devices.

The hearing, called “Deciphering the Debate Over Encryption: Industry and Law Enforcement Perspectives,” will include two panels. The first features Amy Hess, executive assistant director for science and technology at the FBI, who will speak about law enforcement concerns along with other law enforcement officials from around the country. Apple general counsel Bruce Sewell will speak during a second panel, which will feature computer science and security professionals.

The FBI and Apple did not immediately respond to requests for comment on their testimony.

The hearing’s agenda comes just a day after a US Senate encryption bill was released that would give law enforcement and government investigators access to encrypted devices and communications. Authored by US Sens. Dianne Feinstein and Richard Burr, the bill furthers a fight that pits national security against cybersecurity.

Earlier this month, Facebook complicated things a bit further for the FBI when it announced that all communications sent on its popular WhatsApp messaging app are now encrypted.

Feinstein encryption bill sets off alarm bells

A draft version of a long-awaited encryption bill from Sens. Dianne Feinstein, D-Calif., and Richard Burr, R-N.C., was leaked online last week, and the technology industry is already calling foul.

The bill requires any company that receives a court order for information or data to “provide such information or data to such government in an intelligible format” or to “provide such technical assistance as is necessary to obtain such information or data in an intelligible format.” It doesn’t specify the terms under which a company would be forced to help, or what the parameters of “intelligible” are.

The lack of these boundaries is one of the reasons why the backlash to the bill — which isn’t even finished — has been so fast and overwhelming. Kevin Bankston, director of the Open Technology Institute, called it “easily the most ludicrous, dangerous, technically illiterate proposal I’ve ever seen.”

It’s disheartening that the senators intend to continue pressing on with this bill, especially in light of the FBI’s recent bullying of Apple. After the FBI bungled its handling of the San Bernardino shooter’s phone, it tried and failed to force Apple into creating a new program that would let it hack into not just the shooter’s phone but probably many other phones as well. When Apple resisted, the FBI mysteriously came up with a workaround. Small wonder other technology companies are reacting poorly to this Senate bill.

Feinstein’s staffers said that the issue is larger than one phone. That’s true — and it’s exactly why such a broad proposal should make everyone who uses a smartphone uneasy. Giving law enforcement such a broad mandate would inevitably lead to questionable decisions, and it would weaken Internet security for everyone.

Feinstein’s staff also said that the reason for the bill’s vagueness is that the goal is simply to clarify law, not to set a strict method for companies or to tell the court what the penalties should be should companies choose not to follow orders. That sounds good in theory. In practice, Feinstein and Burr would be well-advised to go back to the table with technology interests — and really listen to their concerns.

“Petya” ransomware encryption cracked

Utility generates unscrambling key.

Users whose data has been held to ransom by the Petya malware now have an option to decrypt the information, thanks to a new tool that generates an unscrambling key.

Petya appeared around March this year. Once executed with Windows administrator privileges, Petya rewrites the master boot record on the computer’s hard drive, crashes the operating system and, on restart, scrambles the data on the disk while masquerading as the CHKDSK file consistency utility.

The Petya attackers then demand approximately A$555 in ransom, payable in Bitcoin, to provide a decryption key for the locked system.

An anonymous security researcher using the Twitter handle leo_and_stone has now cracked the encryption Petya uses, the Salsa10 function created by D. J. Bernstein in 2004.

Decrypting hard disks scrambled with Petya using the tool is a relatively complex operation. The tool requires an eight-byte nonce (random, use-once number) file and a 512-byte sector from the hard disk to be entered into a website, which then generates the decryption key.

This means the Petya-infected hard drive has to be removed from the victim computer, and the small amount of data needed for the decryptor read and copied with low-level system utilities.

Once that is done, the scrambled hard drive has to be reinserted into a computer to bring up the Petya ransom demand screen, at which stage the decryption key can be entered.
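The extraction step above can be sketched in a few lines. This is an illustrative outline only: the sector numbers and nonce offset below are assumptions drawn from public write-ups of the original Petya, not authoritative values, and a real recovery would read them with low-level utilities from the removed drive.

```python
SECTOR_SIZE = 512

# Assumed on-disk layout (illustrative, based on public write-ups of
# the original Petya; treat these offsets as assumptions):
NONCE_SECTOR, NONCE_OFFSET = 54, 0x21   # 8-byte nonce
VERIFY_SECTOR = 55                      # 512-byte verification sector

def extract_petya_material(image_path):
    """Pull the two pieces of data the key-recovery tool asks for
    out of a raw image of the infected drive."""
    with open(image_path, "rb") as disk:
        disk.seek(NONCE_SECTOR * SECTOR_SIZE + NONCE_OFFSET)
        nonce = disk.read(8)
        disk.seek(VERIFY_SECTOR * SECTOR_SIZE)
        verification = disk.read(SECTOR_SIZE)
    return nonce, verification
```

The two byte strings would then be submitted to the researcher’s website, which computes the decryption key entered at the ransom screen.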

Tech support site Bleeping Computer, run by computer forensics specialist Lawrence Abrams, reported success with Leo Stone’s Petya decryptor, with keys being generated in just seconds.

A Windows tool to make it easier to extract the verification data and nonce was also created by researcher Fabian Wosar from security vendor Emsisoft.

WhatsApp’s encryption services are legal for now, but maybe not for long

WhatsApp introduced end-to-end encryption for all its services today. This means that all user calls, texts, video, images and other files sent can only be viewed by the intended recipient, and no one, not even WhatsApp itself, can access this data. This guarantee of user privacy creates new concerns for the government.

WhatsApp will now find it impossible to comply with government requests for data, since WhatsApp itself will not have the decryption key. In effect, WhatsApp is doing exactly what Apple did in the Apple vs FBI battle; it’s preventing government access to data, but on a much larger scale. While Apple restricted access to users of iPhones only, now practically every user of WhatsApp on any device is protected. 51% of all users of internet messaging services in India use WhatsApp, with a total number of over 70 million users (Source: TRAI’s OTT Consultation Paper, dated March 2015). WhatsApp has now prevented government access to the messages and calls of at least 70 million Indian users.
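The reason WhatsApp cannot comply is structural, not a matter of policy: in an end-to-end design, keys are negotiated between the two endpoints, and the server only relays public values. The toy Diffie-Hellman exchange below illustrates the idea; the parameters are deliberately simplified assumptions, and the Signal protocol WhatsApp actually deploys uses elliptic curves and a double ratchet, not this sketch.

```python
import secrets

# Toy Diffie-Hellman exchange showing why a relay server never holds
# a decryption key. NOT real-world parameters:
P = 2 ** 127 - 1   # a Mersenne prime; far too small for real security
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1   # secret, never leaves the phone
    return priv, pow(G, priv, P)          # public half, relayed by the server

a_priv, a_pub = keypair()   # Alice's phone
b_priv, b_pub = keypair()   # Bob's phone

# The server sees only a_pub and b_pub; each endpoint derives the same
# shared key locally, so there is nothing on the server to subpoena.
shared_a = pow(b_pub, a_priv, P)
shared_b = pow(a_pub, b_priv, P)
assert shared_a == shared_b
```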

No encryption requirements are applicable on OTTs like WhatsApp

Telecom service providers and internet service providers, like Airtel and Vodafone, have to obtain a license from the Department of Telecommunications in order to be able to provide such services in India. This license imposes several obligations, including license fees, ensuring emergency services, confidentiality of customer information, and requirements for lawful interception, monitoring and the security of the network. These include encryption requirements.

For example, the ‘License Agreement for Provision of Internet Service (Including Internet Telephony)’ for internet service providers (like Reliance and Airtel), permits the usage of up to 40-bit encryption. To employ a higher encryption standard, permission will have to be acquired and a decryption key deposited with the Telecom Authority.
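Some back-of-the-envelope arithmetic shows how low that 40-bit ceiling is compared with the 256-bit keys modern end-to-end encryption uses. The guess rate below is an assumption chosen for illustration.

```python
# Scale of a 40-bit keyspace versus a 256-bit one (illustrative).
keys_40 = 2 ** 40                 # ~1.1 trillion candidate keys
keys_256 = 2 ** 256

# At an assumed 10 million guesses per second, a single machine can
# exhaust the entire 40-bit keyspace in roughly a day and a half:
hours_to_exhaust = keys_40 / 10_000_000 / 3600
print(f"2^40 keyspace: {keys_40:,} keys, ~{hours_to_exhaust:.0f} hours to exhaust")
print(f"2^256 is 2^{256 - 40} (~10^65) times larger")
```

In other words, a 40-bit limit effectively mandates breakable encryption, which is why higher strengths require permission and a deposited key under the license.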

Apps like WhatsApp, Skype and Viber are, however, neither telecom service providers nor internet service providers. These are known as ‘Over-The-Top Services’, or OTTs. Currently, OTTs are not regulated and as such, there are no encryption requirements, nor are there any other requirements in the name of security which these have to comply with.

The Telecom Regulatory Authority of India came out with an OTT Consultation Paper in 2015. Discussions on the paper are closed, but TRAI is yet to issue regulations on the matter. In the absence of any regulations at present, it’s clear that WhatsApp’s new end-to-end encryption policy is perfectly legal, even though it presents a new dilemma for the government.

Impact of end-to-end encryption on proposed regulatory system

Other countries have adopted various approaches to resolve the issue of OTT services. For example, in France, Skype was made to register as a telecom operator. In Germany, Voice-Over-IP is subject to the same requirements as other telecom services because of the technology-neutral approach of its Telecommunications Act. In China, VOIP calls have a separate regulatory system under the head of ‘voice based calls’. These systems make voice-over-IP subject to the same security requirements as telecom providers. For the most part, however, OTT services are unregulated abroad as well.

In a detailed discussion on the issue in TRAI’s OTT Consultation Paper, TRAI notes that OTT services circumvent all regulatory requirements by providing services which are otherwise available only through a license. It has suggested the classification of OTT services either as a communication service provider or an application service provider, and to impose similar regulatory requirements as on telecom service providers.

The proposed licensing requirements include enabling ‘lawful interception’. It can be assumed that the provisions will be along the lines of those imposed on telecom service providers. Given that 40-bit encryption is a much lower standard than the one WhatsApp uses, and that WhatsApp doesn’t even possess a decryption key to deposit with the relevant authority, it remains to be seen how the government will gain access to WhatsApp messages.

Liability of WhatsApp to comply with decryption directions under IT Act

WhatsApp, being an intermediary, is expected to comply with directions to intercept, monitor and decrypt information issued under Section 69 of the Information Technology Act, 2000. Complying with such a direction will now be impossible for WhatsApp in view of its end-to-end encryption. Even before the introduction of end-to-end encryption, since WhatsApp is not a company based in India, it may have been able to refuse to comply with such directions. In fact, compliance by such companies with data requests from the Indian government has been reported to be very low.

India’s now withdrawn draft encryption policy took the first step towards overcoming these problems and obtaining access. It required service providers, from both India and abroad, which are using encryption technology, to enter into agreements with India in order to be able to provide such services. One essential requirement of these agreements was to comply with data requests as and when they’re made by the government. This will include any interception, monitoring and decryption requests made under Section 69 of the IT Act. Though it was later clarified that WhatsApp is not within the purview of this policy, this indicates the route that may be taken by the government to obtain access. If WhatsApp refuses to comply with such a regime, that would make WhatsApp illegal in India.

End-to-end encryption is not without its drawbacks. The high, unbreachable level of security and privacy it offers favours users over governments, and it will make such systems a favourite for illegal activities as well. For example, tracing voice calls made by terrorists using Voice-Over-IP is extremely difficult because of its routing over fake networks. The issue raised in the Apple vs FBI case was the same: whether an individual user’s privacy can be compromised in favour of the larger public interest. A balance between the two is needed, maintaining user privacy while allowing interception for lawful purposes.

Brooklyn case takes front seat in Apple encryption fight

The Justice Department said Friday it will continue trying to force Apple to reveal an iPhone’s data in a New York drug case, putting the Brooklyn case at the center of a fight over whether a 227-year-old law gives officials wide authority to force a technology company to help in criminal probes.

The government told U.S. District Judge Margo K. Brodie in Brooklyn that it still wants an order requiring Apple’s cooperation in the drug case even though it recently dropped its fight to compel Apple to help it break into an iPhone used by a gunman in a December attack in San Bernardino that killed 14 people.

“The government’s application is not moot and the government continues to require Apple’s assistance in accessing the data that it is authorized to search by warrant,” the Justice Department said in a one-paragraph letter to Brodie.

Apple expressed disappointment, saying its lawyers will press the question of whether the FBI has tried any other means to get into the phone in Brooklyn.

Apple had sought to delay the Brooklyn case, saying that the same technique the FBI was using to get information from the phone in California might work with the drug case phone, eliminating the need for additional litigation.

Federal prosecutors told Brodie on Friday that they would not modify their March request for her to overturn a February ruling concluding that the centuries-old All Writs Act could not be used to force Apple to help the government extract information from iPhones.

Magistrate Judge James Orenstein made the ruling after inviting Apple to challenge the 1789 law, saying he wanted to know if the government requests had created a burden for the Cupertino, California-based company.

Since then, lawyers say Apple has opposed requests to help extract information from over a dozen iPhones in California, Illinois, Massachusetts and New York.

In challenging Orenstein’s ruling, the government said the jurist had overstepped his powers, creating “an unprecedented limitation on” judicial authority.

It said it did not have adequate alternatives to obtaining Apple’s assistance in the Brooklyn case, which involves a phone running a different version of the operating system than the phone at issue in the California case.

In a statement Friday, Justice Department spokeswoman Emily Pierce said the mechanism used to gain access in the San Bernardino case can only be used on a narrow category of phones.

“In this case, we still need Apple’s help in accessing the data, which they have done with little effort in at least 70 other cases when presented with court orders for comparable phones running iOS 7 or earlier operating systems,” she said.

Apple is due to file a response in the case by Thursday.

How to encrypt iPhone and Android, and why you should do it now

Apple’s fight with the FBI may be over for the time being, but this high-profile fight about user privacy and state security may have puzzled some smartphone users. When is an iPhone or Android device encrypted? And how does one go about securing the data on them?

iPhone

It’s pretty simple, actually: as long as you set up a passcode or PIN for the iPhone or iPad’s lock screen, the device is encrypted. Without knowing the access code, nobody can unlock it, which means your personal data, including photos, messages, mail, calendar, contacts, and data from other apps, is secured. The FBI can crack some iPhones, but only as part of criminal investigations, and only if its recent hacks work on the model in question.

If you don’t use a lock-screen passcode, set one up right away. Go to Settings, Touch ID & Passcode, tap Turn Passcode On and enter a strong passcode or password.

Android

As CNET points out, things are a bit more complicated on Android.

The newer the device, the easier it is to get it done. In this category, we have Nexus devices, the Galaxy S7 series, and other new handsets that ship with Android 6.0 preloaded. Just like with the iPhone, go to the Settings app to enable a security lock for the screen, and the phone is encrypted.

With older devices, the encryption procedure is a bit more complex, as you’ll have to encrypt the handset manually. You’ll even have to do this with some newer devices, including the Galaxy S6 and Moto X Pure. Go to Settings, then Security, then Encrypt phone. While you’re at it, you may want to encrypt your microSD card as well, so data on it can’t be read on other devices – do it from the Security menu, then Encrypt external SD card. Once that’s done, you will still need to use a password for the lockscreen.

CNET says there are reasons you should consider not encrypting your Android device, like the fact that a device might take a performance hit when encrypted. The performance drop may be barely noticeable on new devices, but older models and low-end handsets could suffer.

Forget iPhone encryption, the FBI can’t legally touch the software ISIS uses

The FBI insists that encrypted products like the iPhone and encrypted online services will put people in harm’s way, especially in light of the ISIS-connected San Bernardino shooting late last year. That’s why the Bureau has been arguing for encryption backdoors that would be available to law enforcement agencies, and why it looked to coerce Apple to add a backdoor to iOS.

However, extensive reports on the preparations ISIS made before hitting Paris and Brussels reveal the kinds of encrypted products ISIS radicals used to stay in touch with central command. Unsurprisingly, these products are outside the FBI’s jurisdiction, and one in particular is among the safest encrypted communication products you can find online. In fact, its original developers are suspected to have ties to the criminal underworld.

Telling the inside story of the Paris and Brussels attacks, CNN explains that ISIS cell members used a chat program called Telegram to talk to one another in the moments ahead of the attacks. Using data obtained from official investigations, CNN learned that just hours before the Bataclan theater was hit, one of the attackers had downloaded Telegram on a Samsung smartphone.

Police never recovered communications from the messaging app. Not only is Telegram encrypted end-to-end, but it also has a self-destruct setting.

Conceived by Russian developers, the app is out of the FBI’s jurisdiction. But Telegram is the least problematic encrypted service for intelligence agencies looking to collect data and connect suspects. CNN also mentions a far more powerful app, one that hasn’t yet been cracked by law enforcement.

TrueCrypt is the app in question. One of the ISIS radicals who was captured by French police in the months leading to the mid-November Paris attacks revealed details about this program.

TrueCrypt resides on a thumb drive and is used to encrypt messages. French citizen and IT expert Reda Hame was instructed to upload the encrypted message to a Turkish file-sharing site. “An English-speaking expert on clandestine communications I met over there had the same password,” Hame told interrogators. “It operated like a dead letter drop.”

According to The New York Times, Hame was told not to send the message via email, so as to not generate any metadata that would help intelligence agencies connect him to other terrorists.

The ISIS technician also instructed Hame to transfer TrueCrypt from the USB key to a second unit once he reached Europe. “He told me to copy what was on the key and then throw it away,” Hame explained. “That’s what I did when I reached Prague.”

Hame made a long journey home from Turkey, making it look like he was a tourist visiting various cities in Europe. Whenever he reached a new place, he was to call a special number belonging to one of the masterminds behind the attacks, and he used a local SIM card to mark his location.

The Times also mentions a secondary program that was installed on flash drives. Called CCleaner, the program can be used to erase a user’s online history on any computer.

If that’s not enough to show the level of sophistication of these bloody ISIS attacks on Europe and other targets, a story from The New Yorker sheds more light on TrueCrypt, a program whose creators can’t be forced to assist the FBI.

According to the publication, TrueCrypt was launched in 2004 to replace a program called Encryption for the Masses (E4M) developed long before the iPhone existed. Interestingly, the programmer who made it is Paul Le Roux, who also happens to be a dangerous crime lord, having built a global drug, arms and money-laundering cartel out of a base in the Philippines.

E4M is open-source, and so is TrueCrypt, meaning that their creators aren’t companies with a financial interest in keeping their security intact.

“TrueCrypt was written by anonymous folks; it could have been Paul Le Roux writing under an assumed name, or it could have been someone completely different,” Johns Hopkins Information Security Institute computer-science professor Matthew Green told The New Yorker.

The developers stopped updating it in 2014 for fear that Le Roux’s decision to cooperate with the DEA might cripple its security. Le Roux was arrested in Liberia on drug-trafficking charges in September 2012. But Green concluded in 2015 that TrueCrypt is still backdoor-free, which explains why ISIS agents still use it.

How the FBI Cracked the iPhone Encryption and Averted a Legal Showdown With Apple

An urgent meeting inside FBI headquarters little more than a week ago is what convinced federal law enforcement officials that they may be able to abandon a brewing legal fight with tech giant Apple, sources told ABC News today.

In the days after the December 2015 massacre in San Bernardino, California, which killed 14 people and wounded 22 others, the iPhone left behind by one of the shooters, Syed Farook, was secretly flown to the FBI’s laboratory in Quantico, Virginia, sources said.

The FBI had been unable to review the phone’s contents due to a security feature that — after 10 failed attempts to enter the 4-digit access code — would render the phone’s files forever inaccessible.
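The security feature at issue can be sketched as a simple counter. This is illustrative only: on a real iPhone this logic is enforced in hardware with escalating delays, and "wiping" means destroying the key that encrypts the file system rather than deleting files.

```python
# Minimal sketch of a wipe-after-10-failures guard (an illustrative
# model, not Apple's implementation).
MAX_ATTEMPTS = 10

class PasscodeGuard:
    def __init__(self, passcode):
        self._passcode = passcode
        self.failures = 0
        self.wiped = False          # once True, data is unrecoverable

    def try_unlock(self, guess):
        if self.wiped:
            return False
        if guess == self._passcode:
            self.failures = 0       # a success resets the counter
            return True
        self.failures += 1
        if self.failures >= MAX_ATTEMPTS:
            self.wiped = True       # key material destroyed
        return False
```

This counter is also why, as reported elsewhere in the case, investigators explored copying the chip’s contents: restoring a saved copy would reset the failure count and allow more guesses.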

By last month, the FBI was at an impasse with Apple, which fought a court order telling the company to help authorities bypass the security feature. Apple maintained the U.S. government was asking it to create a “backdoor” into its devices that would endanger the privacy of hundreds of millions of iPhone users around the world.

“It is in our view the software equivalent of cancer,” Apple CEO Tim Cook recently told “World News Tonight” anchor David Muir.

But the FBI insisted it had a responsibility to access any data potentially relevant to the deadly terror attack in San Bernardino.

“I don’t know whether there is evidence of the identity of another terrorist on the phone, or nothing at all. But we ought to be fired in the FBI if we didn’t pursue that lead,” FBI Director James Comey told a House panel in February.

As the legal battle played out, the FBI appealed to cyber experts around the world for help.

“We’ve talked to anybody who will talk with us about it, and I welcome additional suggestions,” Comey said during a House hearing four weeks ago.

In response, countless companies and hackers — including what one source familiar with the matter called many “whackadoodles” — came forward claiming to have a way into Farook’s phone, sources said.

But nothing appeared viable. That is, until a company that the FBI has yet to identify came forward about two weeks ago. After initial contacts with the FBI, company officials flew to Washington to lay out their solution, sources told ABC News.

On Sunday, March 20, in a meeting at FBI headquarters, company officials demonstrated their technology on another iPhone. Convinced it would work, the FBI greenlighted applying it to Farook’s phone, sources said.

This past weekend — just days ago — the attempt was made, and “the FBI has now successfully retrieved the data stored on” the phone, according to the Justice Department.

Forensic examiners are now attempting to extract potential evidence from the phone. It’s unclear if anything of investigative value has been found yet.

The FBI has refused to identify the company that offered the solution, with one source citing a “mutual agreement.” Nevertheless, Apple did not play a part in finding the solution, company officials said.

As for whether the solution might be shared with Apple, it’s a decision that will be made through consultation with multiple federal agencies, sources said.

One federal law enforcement source said it’s important to emphasize that the ultimate solution identified in this case was not found despite the lawsuit filed against Apple, but because of it.

The solution was “generated as a result of the media attention,” the source said.

At the same time, the source said federal authorities believe the end to the current litigation should not end the national discussion about balancing the interests of security and privacy.

“Our need for public safety and our need for privacy are crashing into each other, and we have to sort that out as a people,” Comey said recently. “This world some people imagine where nobody can look at your stuff is a world that will have public safety costs.”

FBI Hacks iPhone, Ending Apple Encryption Challenge

The Department of Justice said in a federal court filing Monday that it had bypassed encryption on the iPhone 5c used by a terrorist in a mass shooting last year in California and requested the court vacate its order compelling Apple to assist it in accessing the device.

The filing effectively ends a contentious legal battle between the federal government and Apple over the phone used by Syed Rizwan Farook. Farook was fatally shot by authorities along with his wife, Tashfeen Malik, after they killed 14 people in San Bernardino, California, in December.

“The government has now successfully accessed the data stored on Farook’s iPhone and therefore no longer requires the assistance from Apple Inc. mandated by Court’s Order Compelling Apple Inc. to Assist Agents in Search dated February 16, 2016,” government lawyers said in their filing in U.S. District Court for the Central District of California.

The two-page filing contains no information about the methods the government used to bypass the phone’s encryption.

A scheduled March 22 hearing was canceled last week after government lawyers said an “outside party” had proposed a possible way to unlock the phone that would not require Apple’s help. The tech giant had vowed to oppose the order in court, stating that helping the government access an encrypted iPhone would set a precedent for undermining privacy and cybersecurity.

“Our decision to conclude the litigation was based solely on the fact that, with the recent assistance of a third party, we are now able to unlock that iPhone without compromising any information on the phone,” prosecutors said in a statement.

“We sought an order compelling Apple to help unlock the phone to fulfill a solemn commitment to the victims of the San Bernardino shooting – that we will not rest until we have fully pursued every investigative lead related to the vicious attack,” the statement said. “Although this step in the investigation is now complete, we will continue to explore every lead, and seek any appropriate legal process, to ensure our investigation collects all of the evidence related to this terrorist attack. The San Bernardino victims deserve nothing less.”

Why few hackers are lining up to help FBI crack iPhone encryption

When the FBI said it couldn’t unlock the iPhone at the center of the San Bernardino shooting investigation without the help of Apple, the hackers at DriveSavers Data Recovery took it as a challenge.

Almost 200 man hours and one destroyed iPhone later, the Bay Area company has yet to prove the FBI wrong. But an Israeli digital forensics firm reportedly has, and the FBI is testing the method.

Finding a solution to such a high-profile problem would be a major feat — with publicity, job offers and a big payday on the line. But, in fact, the specialists at DriveSavers are among only a few U.S. hackers trying to solve it. Wary of the stigma of working with the FBI, many established hackers, who can be paid handsomely by tech firms for identifying flaws, say assisting the investigation would violate their industry’s core principles.

Some American security experts say they would never help the FBI; others waver in their willingness to do so. And not all of those who would consider helping want their involvement publicized, for risk of being labeled the hacker who opened a backdoor to millions of iPhones.

“The FBI has done such a horrible job of managing this process that anybody in the hacking community, the security community or the general public who would openly work with them would be viewed as helping the bad guys,” said Adriel Desautels, chief executive of cybersecurity testing company Netragard. “It would very likely be a serious PR nightmare.”

Much of the security industry’s frustration with the FBI stems from the agency’s insistence that Apple compromise its own security. The fact that the FBI is now leaning on outside help bolsters the security industry’s belief that, given enough time and funding, investigators could find a workaround — suggesting the agency’s legal tactics had more to do with setting a precedent than cracking the iPhone 5c owned by gunman Syed Rizwan Farook.

Some like Mike Cobb, the director of engineering at DriveSavers in Novato, Calif., wanted to be the first to find a way in. Doing so could bring rewards, including new contracts and, if desired, free marketing.

“The bragging rights, the technical prowess, are going to be considerable and enhanced by the fact that it’s a very powerful case in the press,” said Shane McGee, chief privacy officer for cybersecurity software maker FireEye Inc.

Altruism could motivate others. Helping the FBI could further an inquiry into how a husband-and-wife couple managed to gun down 14 people, wound many others and briefly get away.

Another positive, McGee said, is that legal liability is low: While unauthorized tampering with gadgets has led to prison time, it’s legal as long as people meddle with iPhones they own — and the court order helps too.

But top security experts doubt the benefits are worth the risk of being seen as a black sheep within their community.

Hackers have said they don’t want to touch the San Bernardino case “with a 10-foot pole because the FBI doesn’t look like the good guy and frankly isn’t in the right asking Apple to put a back door into their program,” Desautels said. The assisting party, if ever identified, could face backlash from privacy advocates and civil liberties activists.

“They’d be tainted,” Desautels said.

The unease in the hacker community can be seen through Nicholas Allegra, a well-known iPhone hacker who most recently worked for Citrix.

Concerned an FBI victory in its legal fight with Apple would embolden authorities to force more companies to develop software at the government’s behest, Allegra had dabbled in finding a crack in iPhone 5c security. If successful, he hoped his findings would lead the FBI to drop the Apple dispute.

But he has left the project on the back burner, concerned that if he found a solution, law enforcement would use it beyond the San Bernardino case.

“I put in some work. I could have put more in,” he said. But “I wasn’t sure if I even wanted to.”

Companies including Microsoft, United Airlines and Uber encourage researchers and even hackers to target them and report problems by dangling cash rewards.

HackerOne, an intermediary for many of the companies, has collectively paid $6 million to more than 2,300 people since 2013. Boutique firms and freelancers can earn a living between such bounties and occasionally selling newly discovered hacking tools to governments or malicious hackers.

But Apple doesn’t have a bounty program, removing another incentive for tinkering with the iPhone 5c.

Still, Israeli firm Cellebrite is said to have attempted and succeeded at defeating the device’s security measures.

The company, whose technology is heavily used by law enforcement agencies worldwide to extract and analyze data from phones, declined to comment. The FBI has said only that an “outside party” presented a new idea Sunday night that will take about two weeks to verify. Apple officials said they aren’t aware of the details.

Going to the FBI before going to the company would violate standard practice in the hacking community. Security researchers almost always warn manufacturers about problems in their products and services before sharing details with anyone else. This gives the manufacturer time to issue a fix before a malicious party can exploit the flaw.

“We’ve never disclosed something to the government ahead of the company that distributed the hardware or software,” McGee said. “There could be far-reaching consequences.”

Another drawback is that an iPhone 5c vulnerability isn’t considered a hot commodity in the minds of many hackers, who seek to one-up each other by attacking newer, more widely used products. The 5c model went on sale in 2013 and lacks a fingerprint sensor. Newer iPhones are more powerful and have different security built into them. Only if the hack could be applied to contemporary iPhones would it be worth a rare $1-million bounty, experts say.

The limited scope of this case is why many hackers were taken aback by a court order asking for what they consider broadly applicable software to switch off several security measures. Instead, experts wanted the FBI to invest in going after the gunman’s specific phone with more creativity. In other words, attack the problem with technology, not the courts.

“If you have access to the hardware and you have the ability to dismantle the phone, the methodology doesn’t seem like it would be all that complex,” Desautels said.

Two years ago, his team tried to extract data from an iPad at the request of a financial services company that wanted to test the security of the tablets before offering them to employees. Netragard’s researcher failed after almost a month; he accidentally triggered a date change within the software that rendered the iPad unusable. But Desautels said cracking the iPad would have been “possible and trivial” for someone with more time and a dozen iPads to mess with.

The same, he imagines, would be true for an iPhone. The FBI, though, has said it had exhausted all known possibilities.

Taking Apple to court generated attention about the problem and “stimulated creative people around the world to see what they might be able to do,” FBI Director James Comey said in a letter to the Wall Street Journal editorial board Wednesday. Not “all technical creativity” resides within government, he said.

The plea worked, grabbing the interest of companies like DriveSavers, which takes on about 2,000 jobs a month retrieving photos, videos and notes from phones that are damaged or belong to someone who died. But despite all of the enticements in the San Bernardino case, they’ve worked to unlock an iPhone 5c only intermittently.

They’ve made progress. Cobb’s team can spot the encrypted data on an iPhone 5c memory chip. They’re exploring how to either alter that data or copy it to another chip. Both scenarios would allow them to reset software that tracks invalid password entries. Otherwise, 10 successive misfires would render the encrypted data permanently inaccessible.

Swapping chips requires soldering, which the iPhone isn’t built to undergo multiple times. They have an adapter that solves the issue, and about 300 old iPhones in their stockpile in case a device gets ruined, as one already has.
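The technique described above hinges on one detail: the failed-attempt counter lives in persistent storage, so restoring an earlier copy of that storage resets the count. The toy Python model below (all class and method names are invented for illustration; this is not Apple's actual implementation) sketches why such state restoration defeats a 10-attempt wipe:

```python
import copy

class ToyLockedPhone:
    """Toy model (invented names, NOT Apple's implementation) of a
    passcode counter that destroys access after 10 failed attempts."""
    MAX_ATTEMPTS = 10

    def __init__(self, passcode):
        self._passcode = passcode
        self.failed_attempts = 0   # persistent counter on the "chip"
        self.wiped = False         # True once data is inaccessible

    def try_passcode(self, guess):
        if self.wiped:
            raise RuntimeError("data permanently inaccessible")
        if guess == self._passcode:
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= self.MAX_ATTEMPTS:
            self.wiped = True
        return False

# Brute force with state restoration: snapshot the persistent state,
# guess until just before the wipe would trigger, then roll back.
phone = ToyLockedPhone("7391")
snapshot = copy.deepcopy(phone)
for guess in (f"{n:04d}" for n in range(10000)):
    if phone.try_passcode(guess):
        break  # found the passcode without ever triggering the wipe
    if phone.failed_attempts == phone.MAX_ATTEMPTS - 1:
        phone = copy.deepcopy(snapshot)  # "re-flash" the mirrored state
```

In hardware terms the "snapshot" would be a physical copy of the memory chip, but the logic is the same: the wipe only protects the data if the attempt counter itself cannot be rolled back.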

Had they been first to devise a proposed solution, DriveSavers “absolutely” would have told the FBI because their method doesn’t present extraordinary security risks, Cobb said.

But whether it would want to be publicly known as the code cracker in the case, Cobb said that would be “a much bigger, wider conversation” to ponder.

Apple-FBI fight may be first salvo in encryption war

The Apple-FBI fight may just be the opening salvo in a broader war over encryption, as technology companies continue to lock up their users’ messages, photos and other data to shield them from thieves and spies — and, incidentally, criminal investigators.

WhatsApp, the globally popular messaging system owned by Facebook, has already run into trouble on this front in Brazil. WhatsApp encrypts all user messages in “end to end” fashion, meaning no one but the sender and recipient can read them. Brazilian authorities arrested a Facebook executive this month after the company said it couldn’t unscramble encrypted messages sought by police.
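The “end to end” property means a message is encrypted on the sender’s device and decrypted only on the recipient’s; the relay in the middle stores and forwards ciphertext it cannot read. A minimal Python sketch using a toy one-time-pad cipher (WhatsApp actually uses the far more elaborate Signal protocol; everything here is illustrative only):

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy one-time-pad: XOR each byte with the key. XOR is its own
    inverse, so the same function encrypts and decrypts.
    Illustrative only -- not the Signal protocol WhatsApp uses."""
    assert len(key) >= len(data)
    return bytes(k ^ b for k, b in zip(key, data))

# Sender and recipient share a secret key; the relay server never sees it.
message = b"meet at noon"
shared_key = secrets.token_bytes(len(message))

ciphertext = xor_cipher(shared_key, message)    # leaves the sender's device
# ...the server stores and forwards only `ciphertext`...
recovered = xor_cipher(shared_key, ciphertext)  # on the recipient's device

assert recovered == message
```

Because the key exists only on the two endpoints, a provider served with a warrant has nothing useful to hand over, which is exactly the situation Brazilian authorities ran into.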

U.S. officials are debating how to enforce a similar wiretap order for WhatsApp communications in a U.S. criminal case, the New York Times reported. WhatsApp started as a way to exchange written messages over the Internet, but it has added services like photo-sharing and voice calling, while gradually building encryption into all those formats.

Spokesmen for WhatsApp and the Justice Department declined comment on the Times report, which said the wiretap order had been sealed to keep details secret. The Brazilian case is pending, although the Facebook executive was released from jail after a day.

For now, U.S. authorities and the tech industry are watching for the outcome of Apple’s legal battle with the FBI, which wants to force the company to help unlock an encrypted iPhone used by one of the San Bernardino mass shooters. But as more companies explore adding encryption, further confrontations are likely.

“I think we can say, without a doubt, there’s going to be more pressure on app-makers now,” said Nate Cardozo, staff attorney at the Electronic Frontier Foundation.

Cardozo said he’s aware of other recent cases in which U.S. authorities have approached individual companies (he wouldn’t name them) that use encryption and warned them that criminals or terrorists are using their services. Cardozo said authorities have urged those companies to redesign their apps or provide other technical solutions that would let agents read the encrypted messages.

Tech companies say they don’t want to interfere with legitimate criminal investigations or national security matters. Instead, they argue they’re concerned about hacking, privacy invasion and violations of civil rights.

“It’s the government’s job to protect public safety,” said Denelle Dixon-Thayer, chief legal and business officer at Mozilla, which makes the Firefox Web browser. “Our job in the tech sector is to support that goal by providing the best data security.”

While law enforcement authorities have chafed at tech companies’ use of encryption, national security officials have warned against weakening encryption. “We’re foursquare behind strong data security and encryption,” Defense Secretary Ash Carter told a tech audience this month.

Debate over tech tools’ encryption

Before the San Bernardino terror attack, Syed Rizwan Farook’s iPhone was just one fancy Apple device among hundreds of millions worldwide.

But since the California government worker and his wife shot and killed 14 people on December 2, apparently inspired by extremist group IS, his iPhone 5c has become a key witness – and the government wants Apple to make it talk.

The iPhone, WhatsApp, even social media – government authorities say some of tech fans’ favourite playthings are also some of the most powerful, and problematic, weapons in the arsenals of violent extremists.

Now, in a series of quiet negotiations and noisy legal battles, they’re trying to disarm them, as tech companies and civil liberties groups fight back.

The public debate started with a court order that Apple hack a standard encryption protocol to get at data on Farook’s iPhone, but its repercussions are being felt beyond the tech and law enforcement worlds.

“This is one of the harder questions that we will ever have to deal with,” said Albert Gidari, director of privacy at Stanford Law School’s Centre for Internet and Society.

“How far are we going to go? Where does the government power end to collect all evidence that might exist, and whether it infringes on basic rights? There’s no simple answer,” he told DPA.

It’s not new that terrorists and criminals use mainstream technology to plan and co-ordinate, or that law enforcement breaks into it to catch them. Think of criminals planning a robbery by phone, foiled by police listening in.

But as encryption technology and other next-generation data security move conversations beyond the reach of a conventional wiretap or physical search, law enforcement has demanded the industry provide “back-door” technology to access it too.

At the centre of the fray are otherwise mainstream gadgets and platforms that make private, secure and even anonymous data storage and communication commonplace.

Hundreds of millions of iPhones running iOS 8 or higher are programmed with the same auto-encryption protocol that has stymied investigators in the San Bernardino attack and elsewhere.

US authorities are struggling with how to execute a wiretap order on Facebook-owned WhatsApp’s encrypted messaging platform, used by 1 billion people, the New York Times reported.

In a similar case earlier this month, Brazilian authorities arrested a company executive for not providing WhatsApp data the company said it itself could not access.

Belgium’s interior minister Jan Jambon said in November he believed terrorists were using Sony’s PlayStation 4 gaming network to communicate, Politico reported, although media reports dispute his assertions.

In a world where much of social interaction has moved online, it’s only natural that violent extremism has made the move too.

ISIS, in particular, has integrated its real-world operations with the virtual world, using social media like Twitter and YouTube for recruitment and propaganda and end-to-end encryption for secure communication, authorities say.

Law enforcement authorities and government-aligned terror experts call it the “digital jihad”.

Under pressure from governments, social media providers have cracked down on accounts linked to extremists. Twitter reported it had closed 125,000 ISIS-linked accounts since mid-2015.

Most in the industry have drawn the line at any compromise on encryption, however, saying the benefits of secure data outweigh the costs of its abuse by criminals – leaving authorities wringing their hands.

“Something like San Bernardino” or the November 13 terror attack in Paris “can occur with virtually no indications it was about to happen,” retired general and former Obama anti-terror envoy John Allen warned an audience of techies at the South by Southwest digital conference.

Just a day before, US President Barack Obama had made an unprecedented appearance there, calling for compromise in the showdown between government and tech.

Citing examples of child pornographers, airline security and Swiss bank accounts, Obama said authorities must have the ability to search mobile devices, encrypted or not.

But Gidari called it a “Pandora’s box” too dangerous to open.

Google closing in on target of full encryption

Google is disclosing how much of the traffic to its search engine and other services is being protected from hackers as part of its push to encrypt all online activity.

Encryption shields 77 percent of the requests sent from around the world to Google’s data centers, up from 52 percent at the end of 2013, according to company statistics released Tuesday. The numbers cover all Google services except its YouTube video site, which has more than 1 billion users. Google plans to add YouTube to its encryption breakdown by the end of this year.

Encryption is a security measure that scrambles transmitted information so it’s unintelligible if intercepted by a third party.

Google began emphasizing the need to encrypt people’s online activities after confidential documents leaked in 2013 by former National Security Agency contractor Edward Snowden revealed that the U.S. government had been vacuuming up personal data transferred over the Internet. The surveillance programs exploited gaping holes in unencrypted websites.

While rolling out more encryption on its services, Google has been trying to use the clout of its influential search engine to prod other websites to strengthen their security.

In August 2014, Google revised its secret formula for ranking websites in its search order to boost those that automatically encrypted their services. The change meant websites risked being demoted in Google’s search results and losing visitors if they didn’t embrace encryption.

Google is highlighting its own progress on digital security while the FBI and Apple Inc. are locked in a court battle over access to an encrypted iPhone used by one of the two extremist killers behind the mass shootings in San Bernardino, California, in December.

Google joined several other major technology companies to back Apple in its refusal to honor a court order to unlock the iPhone, arguing that it would require special software that could be exploited by hackers and governments to pry their way into other encrypted devices.

In its encryption crusade, Google is trying to make it nearly impossible for government spies and other snoops to decipher personal information seized while in transit over the Internet.

The statistics show that Google’s Gmail service is completely encrypted as long as the correspondence remains confined to Gmail. Mail exchanges between Gmail and other email services aren’t necessarily encrypted.

Google’s next most frequently encrypted services are maps (83 percent of traffic) and advertising (77 percent, up from just 9 percent at the end of 2013). Encryption frequency falls off for Google’s news service (60 percent) and finance (58 percent).

Take a stand against the Obama/FBI anti-encryption charm offensive

It has been frustrating to watch as the horrific San Bernardino terrorist killing spree has been used as a cover by the FBI to achieve the anti-encryption goals they’ve been working towards for years. Much of that frustration stems from the fact that the American media has so poorly reported the facts in this case.

The real issue in play is that the FBI wants backdoor access to any and all forms of encryption and is willing to demonize Apple in order to establish an initial precedent it can then use against all other software and hardware makers, all of whom are smaller and are far less likely to even attempt to stand up against government overreach.

However, the media has constantly echoed the FBI’s blatantly false claims that it “does not really want a backdoor,” that it cares about “just this one” phone, that all that’s really involved is “Apple’s failure to cooperate in unlocking” this single device, and that there “isn’t really any precedent that would be set.” Every thread of that tapestry is completely untrue, and even the government has now admitted this repeatedly.

Representative democracy doesn’t work if the population only gets worthless information from the fourth estate.

However, in case after case journalists have penned entertainment posing as news, including a bizarre fantasy written up by Mark Sullivan for Fast Company detailing “How Apple Could Be Punished For Defying FBI.”

A purportedly respectable polling company asked the population whether Apple should cooperate with the police in a terrorism case. But that wasn’t the issue at hand. The real issue is whether the U.S. Federal Government should act to make real encryption illegal by mandating that companies break their own security so the FBI doesn’t have to do so on its own.

The Government’s Anti-Encryption Charm Offensive

Last Friday, U.S. Attorney General Loretta Lynch made an appearance on The Late Show with Stephen Colbert to again insist that this is a limited case of a single device that has nothing to do with a backdoor, and that it was really an issue of the County-owned phone asking Apple for assistance in a normal customer service call.

Over the weekend, President Obama appeared at SXSW to gain support for the FBI’s case, stating outright that citizens’ expectation that encryption should actually work is “incorrect” and “absolutist.”

He actually stated that, “If your argument is ‘strong encryption no matter what, and we can and should in fact create black boxes,’ that I think does not strike the kind of balance we have lived with for 200, 300 years. And it’s fetishizing our phone above every other value, and that can’t be the right answer.”

That’s simply technically incorrect. There’s no “balance” possible in the debate on encryption. Either we have access to real encryption or we don’t. It very much is an issue of absolutes. Real encryption means that the data is absolutely scrambled, the same way that a paper shredder absolutely obliterates documents. If you have a route to defeat encryption on a device or between two devices, it’s a backdoor, whether the government wants to play a deceptive word game or not.

If the government obtains a warrant, that means it has the legal authority to seize evidence. It does not mean that the agencies involved have unbridled rights to conscript unrelated parties into working on their behalf to decipher, translate or recreate any bits of data that are discovered.

If companies like Apple are forced to build security backdoors by the government to get around encryption, then those backdoors will also be available to criminals, to terrorists, to repressive regimes and to our own government agencies that have an atrocious record of protecting the security of data they collect, and in deciding what information they should be collecting in the first place.

For every example of a terrorist with collaborator contacts on his phone, or a criminal with photos of their crimes on their phone, or a child pornographer with smut on their computer, there are thousands of individuals who can be hurt by terrorists plotting an attack using backdoors to cover their tracks, or criminals stalking their victims’ actions and locations via backdoor exploits of their devices’ security, or criminal gangs distributing illicit content that steps around security barriers the same way that the police hope to step around encryption on devices.

Security is an absolutist position. You either have it or you don’t.

Obama was right in one respect. He noted that in a world with “strong, perfect encryption,” it could be that “what you’ll find is that after something really bad happens the politics of this will swing and it will become sloppy and rushed. And it will go through Congress in ways that have not been thought through. And then you really will have a danger to our civil liberties because [we will have] disengaged or taken a position that is not sustainable.”

However, the real answer to avoiding “sloppy, rushed” panic-driven legislation is to instead establish clear rights for citizens and their companies to create and use secure tools, even if there is some fear that secure devices may be used in a way that prevents police from gaining access to some of the evidence they might like to access in certain cases.

The United States makes no effort to abridge the use of weapons like those used in San Bernardino to actually commit the atrocity. It should similarly not insist that American encryption should only work with a backdoor open on the side, giving police full access to any data they might want.

It’s not just a bad idea, it’s one that will accomplish nothing because anyone nefarious who wants to hide their data from the police can simply use non-American encryption products that the FBI, the president and the U.S. Congress have no ability to weaken, regardless of how much easier it would make things for police.

Obama: ‘Absolutist view’ on encryption not answer

President Barack Obama said Friday that the encryption versus national security debate, currently being played out in Apple’s legal fight against the federal government, won’t be settled by taking an “absolutist view.”

Addressing an audience of tech enthusiasts meeting in the Texas capital, Obama said both values are important.

He restated his commitment to strong encryption, but also asked how government will catch child pornographers or disrupt terrorist plots if smartphones and other electronic devices are made in ways that keep law enforcement from accessing the data stored on them.

“My conclusion, so far, is you cannot take an absolutist view on this,” Obama said at the South by Southwest Interactive festival.

During a question-and-answer session with Evan Smith, CEO and editor in chief of The Texas Tribune, Smith asked Obama “where do you come down” on the question of balancing law enforcement’s needs with an individual’s right to privacy.

Obama said government shouldn’t be able to “just willy nilly” get into smartphones that are full of very personal data. But at the same time, while asserting he’s “way on the civil liberties side,” Obama said “there has to be some concession” to be able to get to the information in certain cases.

The president was not asked to comment on the litigation between Apple and the FBI. He also said he couldn’t discuss specifics.

Apple and the federal government are embroiled in a legal fight over Apple’s refusal to help the FBI access an iPhone used in last year’s terrorist attack in San Bernardino, California, in which 14 people were killed. The FBI wants Apple to create a program specifically for that particular phone to help the bureau review the data on it. Apple has refused, and says to do what the government is asking would set a terrible precedent.

Rep. Darrell Issa, R-Calif., who has sharply questioned FBI Director James Comey during congressional hearings on the matter, released a statement in which he said Obama’s comments showed his “fundamental lack of understanding of the tech community, the complexities of encryption and the importance of privacy to our safety in an increasingly digital world.”

“There’s just no way to create a special key for government that couldn’t also be taken advantage of by the Russians, the Chinese or others who want access to the sensitive information we all carry in our pockets every day,” Issa said.

Obama used his appearance at the decades-old festival to encourage the audience of tech enthusiasts to use their skills and imagination to “tackle big problems in new ways.” He said the administration already is using technology to make people’s lives better, and cited as an example the streamlining of federal applications. But he urged industry leaders and entrepreneurs to use technology to help increase voter participation.

“The reason I’m here, really, is to recruit all of you. It’s to say to you, as I’m about to leave office, how can we start coming up with new platforms and new ideas, new approaches across disciplines and across skill sets, to solve some of the big problems that we’re facing today.”

South by Southwest Interactive is part of South by Southwest, a movie, music and interactive media festival that had been held in Austin for the past 30 years. Obama’s appearance was the first by a sitting U.S. president.

After the festival, which also is known as SXSW, Obama helped raise money for Democrats at a pair of fundraisers in Austin.

Government says Apple arguments in encryption case a “diversion”, presents point-by-point rebuttal

As the Apple vs. FBI encryption debate heats up in California, the U.S. government on Thursday fired back at Apple’s oppositions to a court order compelling its assistance in an FBI investigation, and in a new motion discounted a number of arguments related to supposed backdoors, “master keys,” the All Writs Act and more.

In its letter in support of a federal magistrate judge’s original order to compel Apple’s help in unlocking an iPhone used by San Bernardino terror suspect Syed Rizwan Farook, federal prosecutors intimate the company is playing to the media in an attempt to protect its brand. The document was penned by U.S. Attorney for the Central District of California Eileen M. Decker, Chief of the Cyber and Intellectual Property Crimes Section Tracy L. Wilkison and Chief of the National Security Division Patricia A. Donahue.

“Apple and its amici try to alarm this Court with issues of network security, encryption, back doors, and privacy, invoking larger debates before Congress and in the news media. That is a diversion. Apple desperately wants—desperately needs—this case not to be ‘about one isolated iPhone,'” the letter reads. (Emphasis in original.)

The government argues Farook’s phone may contain actionable intelligence that could help shed light on last year’s terror attack. Investigators need Apple’s help in acquiring said information, if it exists, but instead of providing aid as it has done in the past, the company is waging a war of words both in court and publicly. Prosecutors classify Apple’s statements, including arguments that weakening the security of one iPhone is a slippery slope to a surveillance state, as “not only false, but also corrosive of the very institutions that are best able to safeguard our liberty and our rights.”

One of Apple’s main targets is the All Writs Act, a contingency that imbues courts with the power to issue orders if no other judicial tools are available. After being met with resistance to an initial warrant, the FBI leveraged AWA as a legal foundation to compel Apple’s assistance. If the DOJ is successful in its court action, it could pave the way for broader application of the statute in other investigations, Apple says. Indeed, the FBI is currently asserting AWA in at least nine other cases involving iOS devices.

In this case, however, the government argues its use of AWA is proper.

As for undue burden, the letter notes Apple grosses hundreds of billions of dollars each year. It would take as few as six employees plucked from Apple’s workforce of approximately 100,000 people as little as two weeks to create a workable solution to the FBI’s problem, the letter says, adding that the company is to blame for being in the position it currently finds itself.

“This burden, which is not unreasonable, is the direct result of Apple’s deliberate marketing decision to engineer its products so that the government cannot search them, even with a warrant,” according to the government.

A few interesting tidbits were also revealed in the course of dismantling Apple’s opposition, including a technical revelation that strikes at the heart of one of Apple’s key arguments. Apple has maintained that a forced iCloud backup, obtained by connecting Farook’s iPhone to a known Wi-Fi network, might contain information FBI agents are looking for. However, that option was rendered moot after the FBI ordered San Bernardino officials to reset Farook’s Apple ID password.

“The evidence on Farook’s iCloud account suggests that he had already changed his iCloud password himself on October 22, 2015—shortly after the last backup—and that the autobackup feature was disabled. A forced backup of Farook’s iPhone was never going to be successful, and the decision to obtain whatever iCloud evidence was immediately available via the password change was the reasoned decision of experienced FBI agents investigating a deadly terrorist conspiracy,” the government claims.

Finally, the letter takes issue with Apple’s assertions that the instant order violates its First and Fifth Amendment rights. Apple claims computer code should be covered by free speech protections, meaning DOJ requests to write code in an attempt to break into Farook’s iPhone amount to forced speech. Nebulous legal footing aside, Apple’s claims are “particularly weak because it does not involve a person being compelled to speak publicly, but a for-profit corporation being asked to modify commercial software that will be seen only by Apple.”

The idea of narrow investigation is mentioned multiple times. Apple is not being required to create a master key for all iOS devices, government representatives insist, but instead a piece of code applicable to one iPhone. Even if hackers or nefarious agents manage to steal said code, it would only be useful in unlocking Farook’s iPhone 5c, the government attests. This issue is under debate, however, as some experts say the flawed iOS version could be used on other devices. Creating a specialized forensics tool also acts as a proof-of-concept that iOS is vulnerable to attack.

Apple and the DOJ are set to meet in court over the matter in a hearing scheduled for March 22.

New FBI strategy wins back lost ground in encryption fight

By July 2015, FBI Director Jim Comey knew he was losing the battle against sophisticated technologies that allowed criminals to communicate without fear of government surveillance.

In back-to-back congressional hearings that month, Comey struggled to make the case that terrorists and crooks were routinely using such encryption systems to evade the authorities. He conceded that he had no real answer to the problem and agreed that all suggested remedies had major drawbacks. Pressed for specifics, he couldn’t even say how often bureau investigations had been stymied by what he called the “going dark” problem.

“We’re going to try and do that for you, but I’m not optimistic we’re going to be able to get you a great data set,” he told lawmakers.

This week, Comey was back before Congress with a retooled sales pitch. Gone were the vague allusions to ill-defined problems. In their place: a powerful tale of the FBI’s need to learn what is on an encrypted iPhone used by one of the terrorists who killed 14 people in California. “Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t,” Comey wrote shortly before testifying. “But we can’t look the survivors in the eye, or ourselves in the mirror, if we don’t follow this lead.”

The tactical shift has won Comey tangible gains. After more than a year of congressional inaction, two prominent lawmakers, Sen. Mark Warner (D-Va.) and House Homeland Security Chairman Michael McCaul (R-Texas), have proposed a federal commission that could lead to encryption legislation. Several key lawmakers, who previously hadn’t chosen sides over encryption, such as Rep. Jim Langevin (D-RI), are siding with the administration in its legal battle with Apple. Likewise, several former national security officials — such as former National Security Agency chief Gen. Michael Hayden and former Director of National Intelligence Mike McConnell — who lined up with privacy advocates in the past have returned to the government side in this case.

“The public debate was not going the FBI’s way and it appears there’s been a deliberate shift in strategy,” said Mike German, a former FBI special agent. “They realized…that the most politically tenable argument was going to be ‘we need access when we have a warrant and in a serious criminal case. All the better if it’s a terrorism case.’”

The catalyst for change has been a high-stakes legal fight in a central California courtroom where Apple seeks to overturn a judge’s order to write new software to help the FBI circumvent an iPhone passcode. Other technology companies such as Microsoft, Google, Facebook and Twitter this week rallied to Apple’s side. The Justice Department, meanwhile, has drawn supporting legal briefs from law enforcement associations as well as families of the San Bernardino victims.

Comey’s evolution may have been foreshadowed last summer. In an August email, Robert Litt, the intelligence community’s top lawyer, wrote colleagues that the mood on Capitol Hill “could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement,” according to The Washington Post.

The Dec. 2 San Bernardino attack, coming less than three weeks after a coordinated series of Islamic State shootings and bombings killed at least 130 people in Paris, reignited law enforcement concern about terrorists’ ability to shield their plotting via encryption. The San Bernardino killers, Syed Farook and his wife Tashfeen Malik, destroyed two cellphones before dying in a gun battle with police. Investigators discovered the iPhone at issue in the courtroom fight inside the Farook family’s black Lexus sedan.

To be sure, Comey’s new strategy thus far has paid only limited dividends. The Warner-McCaul commission, if it is ever formed, may or may not change U.S. encryption policy. Renewed support from former officials, such as Hayden and McConnell, extends only to the San Bernardino case.

Indeed, the FBI director’s hopes for an enduring solution to the “going dark” problem remain aspirational. The White House last fall abandoned plans to seek legislation mandating a technological fix for authorities’ encryption headaches. And since then, the Obama administration has confined itself to jawboning Silicon Valley.

But in choosing to make a fight over the iPhone used by one of the San Bernardino terrorists, Comey has selected an advantageous battlefield. Many encryption supporters say that the San Bernardino case isn’t really about encryption because the FBI is asking Apple to build custom software that bypasses the phone’s passcode, a separate though related security feature. That distinction, however, may be lost on the public and many members of Congress. Some have even speculated the FBI is using the San Bernardino massacre to revive an encryption debate that it appeared to have lost.

“It appears to me they’re using this case specifically to try to enact a policy proposal they could not get through Congress last year,” said Rep. Ted Lieu (D-Calif.), an encryption advocate. “It’s clear to me that the FBI is trying to use this case to influence the public.”

The fight with Apple not only carries the emotional heft of terrorism, but — thanks to the distinction between encryption backdoors and passcode subversion — has drawn many of Comey’s most vocal critics from the national security community back into the fold.

Hayden, the former NSA head, and McConnell, the nation’s ex-intelligence czar, opposed Congress mandating the creation of technological “back doors” for the government to exploit. Yet, on the Apple case, they side with Comey.

“The FBI made this a test case and that was very deliberate on their part, to refocus the conversation,” said Robert Cattanach, a former Justice Department prosecutor. “This is not some abstract principle of privacy versus government overreach. There are real impacts.”

The San Bernardino case could be a win-win for Comey. If Apple prevails in court, Congress might respond by intervening with legislation. Both the FBI and Apple have said Congress is better equipped to manage the issue than courts.

The legal battles also may discourage companies from building strong encryption given the risk of future legal showdowns, said German, who is now a fellow with the Brennan Center for Justice.

“This is less about Apple than about the developer who is sitting in his garage right now creating the next big thing,” he said. “The idea is to make that person realize that the stronger they build the security the harder it will be for them when they get that order to unlock it to do so. There’s an incentive to build a crack in the system.”

No room for compromise in Apple vs FBI iPhone encryption battle

As Apple’s legal battle with the FBI over encryption heads toward a showdown, there appears little hope for a compromise that would placate both sides and avert a divisive court decision.

The FBI is pressing Apple to develop a system that would allow the law enforcement agency to break into a locked iPhone used by one of the San Bernardino attackers, a demand the tech company claims would make all its devices vulnerable.

In an effort to break the deadlock, some US lawmakers are pushing for a panel of experts to study the issue of access to encrypted devices for law enforcement in order to find common ground.

Senator Mark Warner and Representative Mike McCaul on Monday proposed the creation of a 16-member “National Commission on Security and Technology Challenges.”

But digital rights activists warn that the issue provides little middle ground — that once law enforcement gains a “back door,” there would be no way to close it.

“We are concerned that the commission may focus on short-sighted solutions involving mandated or compelled back doors,” said Joseph Hall, chief technologist at the Center for Democracy & Technology.

“Make no mistake, there can be no compromise on back doors. Strong encryption makes anyone who has a cell phone or who uses the Internet far more secure.”

Kevin Bankston of the New America Foundation’s Open Technology Institute expressed similar concerns.

“We’ve already had a wide range of blue ribbon expert panels consider the issue,” he said.

“And all have concluded either that surveillance back doors are a dangerously bad idea, that law enforcement’s concerns about ‘going dark’ are overblown, or both.”

The debate had been simmering for years before the Apple-FBI row.

Last year, a panel led by Massachusetts Institute of Technology scientists warned against “special access” for law enforcement, saying such mechanisms pose “grave security risks” and “imperil innovation.”

Opening up all data

“I’m not sure there is much room for compromise from a technical perspective,” said Stephen Wicker, a Cornell University professor of computer engineering who specializes in mobile computing security.

Opening the door to the FBI effectively makes any data on any mobile device available to the government, he said.

“This is data that was not available anywhere 10 years ago, it’s a function of the smartphone,” Wicker said.

“We as a country have to ask if we want to say that anything outside our personal human memory should be available to the federal government.”

Apple has indicated it is ready for a “conversation” with law enforcement on the matter.

But FBI Director James Comey told a congressional panel that some answers are needed because “there are times when law enforcement saves our lives, rescues our children.”

Asked about the rights envisioned by the framers of the US constitution, he said, “I also doubt that they imagined there would be any place in American life where law enforcement, with lawful authority, could not go.”

A brief filed on behalf of law enforcement associations argued that because of Apple’s new encryption, criminals “have now switched to the new iPhones as the device of choice for their criminal wrongdoing.”

Ed Black, president of the Computer & Communications Industry Association, which includes major technology firms but not Apple, said that although tech firms and law enforcement have had many battles, “there are many areas where we cooperate and where we find middle ground.”

But Black said the tech sector is largely united in this case because the FBI wants Apple to create weaker software or introduce “malware” to be able to crack the locked iPhone.

“On this narrow specific issue of ‘can companies be compelled to create malware,’ I think there may not be an answer,” he said.

‘Going dark’ fears

Law enforcement fears about “going dark” in the face of new technology have been largely exaggerated, Black said.

While access to encrypted apps and smartphones is difficult and traditional wiretaps don’t work on new technology, “there are a lot of other tools for law enforcement,” he said.

“There is more information available in 2016 than in any year since the founding of the country.”

Although law enforcement has growing expectations about using technology to thwart criminals, that type of power is too broad, Black added.

“If they are seeking a level of total surveillance capability, I don’t see a compromise available,” he said.

Wicker said that to give law enforcement access, Congress could in theory mandate that devices use automatic cloud backups that could not be disabled. But that would constitute a dramatic departure from current views about privacy.

“From an individual rights standpoint,” he said, “that would take away control by the user of their personal information.”

Amazon Dropping Fire Encryption Has Nothing to Do With Apple

Today, several reports pointed out that Amazon’s Fire OS 5 does not support device encryption, drawing a connection between the company’s encryption retreat and the current Apple-FBI iPhone unlocking fracas. But Amazon’s decision to remove Fire OS 5’s onboard encryption is not a new development, and it’s not related to the iPhone fight. The real question at hand is why Amazon decided to roll back encryption protection for consumers all on its own.

Introduced last fall, Amazon’s Fire OS 5 featured a refreshing redesign that added several usability features. But Fire OS 5 also took away device encryption support, while still maintaining security features for communication between devices and Amazon’s cloud.

“In the fall when we released Fire OS 5, we removed some enterprise features that we found customers weren’t using,” Amazon spokesperson Robin Handaly told WIRED. “All Fire tablets’ communication with Amazon’s cloud meet our high standards for privacy and security, including appropriate use of encryption.”

We’ve reached out again for clarification as to what “appropriate use” of encryption entails in Amazon’s view.

To be clear, removing encryption protections of any kind from Fire tablets should be seen as a step back for consumers, and for security as a whole.

“Amazon’s decision is backward—it not only moves away from default device encryption, where other manufacturers are headed, but removes all choice by the end user to decide to encrypt it after purchase,” says Nathan White, Senior Legislative Manager at digital rights organization Access Now. “The devices themselves also become more attractive targets for thieves. Users should no longer trust these devices: If you wouldn’t post it to the internet publicly, don’t put it on a Fire Tablet.”

Further, Amazon’s insistence that it maintains a secure connection with the cloud doesn’t ease concerns over the data on the device itself that’s now vulnerable.

“Data encryption at rest and data encryption in motion are two completely different things,” says White. “They shouldn’t conflate two important issues by saying ‘we encrypt in motion, so data at rest doesn’t matter.’”
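The distinction White draws can be made concrete with a short sketch. Data in motion is what TLS protects on the wire between a tablet and the cloud; data at rest is what sits on the device itself and needs its own encryption. The toy Python below illustrates only the at-rest idea — it derives a keystream with SHA-256 in counter mode, which is an illustrative construction and not how any real device (Amazon’s or otherwise) implements storage encryption; production systems use AES with hardware-protected keys.

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. For illustration only --
    # real device encryption uses AES with hardware-backed keys.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_at_rest(key: bytes, plaintext: bytes) -> bytes:
    # A fresh nonce per file means the same plaintext never encrypts
    # the same way twice; the nonce is stored alongside the ciphertext.
    nonce = os.urandom(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt_at_rest(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

# On a phone the key would live in secure hardware, not in a variable.
key = os.urandom(32)
blob = encrypt_at_rest(key, b"email credentials, card numbers")
assert decrypt_at_rest(key, blob) == b"email credentials, card numbers"
```

The point of the sketch is White’s: without the key, the stored blob is noise to a thief who pulls the storage chip, and TLS protecting the cloud connection contributes nothing to that property.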

Even without the cloud connection, a device stores all sorts of personal information, from email credentials to credit card numbers to sensitive business information, if you happen to be an enterprise user. In fact, the lack of encryption means corporate customers aren’t able to use certain email clients on Fire tablets any longer.

Amazon’s move is a bad one. But it’s not a retreat in the face of Apple-FBI pressures. For better or worse (mostly worse), it’s been this way for months. As Handaly noted, Fire OS 5 came out last fall, on a suite of new Amazon devices. Amazon message board users have been commenting on, and complaining about, the absence of encryption since at least early January.

So why the sudden focus? Likely because of a widely shared tweet that drew fresh attention to the change.

People are talking about the lack of encryption today because the OS update is only now hitting older devices, like the fourth-generation Fire HD and Fire HDX 8.9. Despite how neatly the sudden forfeiture of encryption by a tech giant fits the Apple-FBI narrative, this encryption deprecation isn’t related to that battle. Instead, Amazon appears to have given up onboard encryption without any public fight at all.

“This move does not help users. It does not help corporate image. And it does not fit into industry trends,” says Amie Stepanovich, US Policy Manager at Access Now.

U.S. Defense Secretary Ashton Carter Doesn’t Believe in Encryption Backdoors

Secretary of Defense Ashton Carter came out against encryption backdoors at a conference panel on Wednesday.

At the RSA information security conference in San Francisco, Carter told a packed room that he supported strong encryption and considered back-door access to encrypted communication unrealistic. He shied away from the details of the Apple vs. FBI case, calling it a “law enforcement issue,” but received scattered applause from the crowd of security professionals after he said he supports strong encryption.

“I think first of all that for the Department of Defense, data security including encryption is absolutely essential to us. We are for strong encryption,” Carter says. “I’m not a believer in backdoors or a single technical approach. I don’t think it’s realistic.”

Carter joined Attorney General Loretta Lynch in supporting encryption at the RSA Conference this week. In a stage interview with Bloomberg at the Moscone Center on Tuesday, Lynch called for “a middle ground” between national security and privacy.

In the 50-odd-minute conversation with Ted Schlein, general partner at the influential venture capital firm Kleiner Perkins Caufield & Byers, Carter focused on how to bridge the gap between the Pentagon and Silicon Valley.

Carter, who was appointed to the secretary position last February by President Barack Obama, spoke about two initiatives in particular: the Defense Innovation Unit-Experimental (DIUx) and the Defense Innovation Advisory Board. Both serve to make the department more agile and tech-savvy in the age of cyberwarfare with competitors like Russia and China, Carter says.

“DIUX is a place to connect. It is down the road [from Silicon Valley]. I’ve given it a very open charter,” Carter says. “We need to be very hawkish on the idea of reform.”

Earlier on Wednesday, the Defense Department announced that former Google CEO Eric Schmidt will chair the Defense Innovation Advisory Board. “There is going to be some technical minds who come in and giving me advice to be more innovative,” Carter says. “I am so grateful to Eric Schmidt for his willingness to do this. He’s the perfect chairman for this.”

He also announced a new competition called “Hack the Pentagon,” in which ethical, or white-hat, hackers find vulnerabilities in the Pentagon’s systems and boost the overall cybersecurity of the department. “You would rather find the vulnerabilities in your networks that way than the other way of pilfering information,” Carter says. Hackers must be American citizens, Carter added.

While the Pentagon is bolstering its defenses in protecting its own data, it is also aggressively attacking ISIS, Carter says. Similar to the radio-jamming tactics during the Cold War, the Pentagon has been disrupting the terrorist group’s online channels of communications. “We will and must defeat ISIL. I’m looking for all the ways to accelerate that,” Carter says. “We are using cyber to disrupt communication and doubt the reliability of the comm. Now that enemies use cyber, that’s another way to shut them down.”

Apple’s rivals wary of taking stand on encryption issue, against the FBI

As Apple resists the US government in a high profile stand-off over privacy, rival device makers are, for now, keeping a low profile.

Most are Asian companies — the region produces eight of every 10 smartphones sold around the world — and operate in a complex legal, political and security landscape.

Only China’s Huawei has publicly backed Apple CEO Tim Cook in his fight to resist demands to unlock an encrypted iPhone belonging to one of those who went on a shooting rampage in San Bernardino, California in December.

“We put a lot of investment into privacy, and security protection is key. It is very important for the consumer,” Richard Yu, chief executive of Huawei’s consumer business group, told reporters at the Mobile World Congress earlier this week.

But Yu stopped short of saying explicitly that Huawei would adopt the same stance. “Some things the government requires from vendors we cannot do,” he said, citing an example of unlocking an encrypted Android device. “These are important things for the consumer, for privacy protection.”

Lenovo Group CEO Yang Yuanqing declined to say whether he backs the Apple position, saying the issue required time and consideration.

“Today it happens to Apple, tomorrow it could happen to Lenovo mobile phones. So we must be very serious to consider. We need to take some time,” Yang told Reuters.

Samsung Electronics Co and Chinese device maker Xiaomi declined to comment, while ZTE Corporation did not respond to requests for comments.

South Korean mobile maker LG Electronics Inc said it takes personal privacy and security very seriously, but declined to say whether it had ever worked with any government to insert so-called “backdoors” into its products or whether it had ever been asked to unlock a smartphone.

“Nobody wants to be seen as a roadblock to an investigation,” said a spokesperson for Micromax, India’s biggest local smartphone maker. “Nobody wants that kind of stigma. We have to take care of both customer security as well as (a) genuine threat to national security.”

Many Asian countries don’t have privacy laws that device makers can fall back on to resist demands from law enforcement authorities.

“As part of the evidence gathering process provided for under the law, law enforcement agencies in Singapore may request information from persons or organizations,” Singapore’s Ministry of Home Affairs Spokesperson told Reuters.
An official at India’s telecom regulator said authorities can ask technology companies for private user data, as can those in Indonesia, according to Ismail Cawidu, spokesman for Indonesia’s Communication and Information Ministry.

Eugene Tan, associate professor of law at the Singapore Management University, said he wouldn’t be surprised if technology firms were being asked for access to their devices.

“It’s just that these are not made public. You can imagine for the technology companies, they are also concerned about the publicity — if they are seen to be caving in to law enforcement agencies, there is always a fear that people may not use their products and services,” he said.

Micromax said this was commonplace in India. “I can’t say no to a law enforcement request, and every day there is one,” the company’s spokesperson said. “You have to comply with requests in the larger interest of national security.”

The Apple battle may even spur regulators in some markets to demand that device makers grant them access.

Thailand’s telecoms regulator said it is studying the possibility of having separate agreements with handset makers and social media firms such as Facebook and Naver’s LINE to help extract data from mobile phones.

“There is political pressure” for regulating devices, said Rob Bratby, manager of Olswang Asia, a technology-focused law firm based in Singapore.

He said there was no evidence of any such regulatory interest yet, but it was a matter of time.

Encryption is Not a Threat to Our Safety, But Political Correctness is

The legal battle between Apple Inc. and the US government shows no sign of abating. Tim Cook, CEO of Apple, has indicated that he is willing to fight the US government all the way to the Supreme Court. Apple just upped the ante by announcing that its engineers are working on new iPhone security features, which would make the iPhone almost impossible to hack into, even by the company itself or government agencies. On the other hand, many government officials and politicians argue that encryption deprives them of opportunities to track the activities of bad guys and stop them from doing harm. Some in Congress are working on a new law to compel technology companies to grant the US government “limited” access by circumventing encryption.

Supporters of both Apple and the US government have written extensively on privacy vs. security issues. But something else has been missing from the current debate. Let’s revisit the San Bernardino terrorist attack. It’s worth remembering that one of the San Bernardino shooters, Tashfeen Malik, didn’t encrypt her radical, anti-American thoughts and ideas on Facebook prior to her visa application; they were posted for anyone to read. But our immigration officials were prevented from reviewing her easily accessible social media postings because the Secretary of Homeland Security feared a civil-liberties backlash and bad PR. There is no legal basis for the Secretary’s concern: America has no obligation to grant a visa to any non-US citizen who expresses anti-American sentiment. It was widely reported after the San Bernardino shooting that Tashfeen Malik was responsible for radicalizing her husband, Syed. Had someone at the Department of Homeland Security done a half-hour Google search and denied Tashfeen’s fiancée visa accordingly, fourteen lives in San Bernardino could have been saved.

Failing to vet Tashfeen Malik adequately was not an outlier case. The leadership of the Department of Homeland Security has a history of “willingness to compromise the security of citizens for the ideological rigidity of political correctness.” Philip Haney, a former officer who spent 15 years at the Department of Homeland Security (DHS), wrote for The Hill recently that back in 2009 he was ordered by his supervisor at DHS “to delete or modify several hundred records of individuals tied to designated Islamist terror groups like Hamas from the important federal database, the Treasury Enforcement Communications System (TECS).”

Apple and FBI to testify before Congress next week over encryption

Over the past few days, Apple has made it abundantly clear that it will not comply with the FBI’s demand that it write a new piece of software to help bypass built-in iPhone security measures.

On the contrary, Apple has said that it wants the FBI to withdraw all of its demands while adding that the only way to move forward is to form a commission of experts on intelligence, technology, and civil liberties to discuss “the implications for law enforcement, national security, privacy, and personal freedoms.”

In the meantime, Apple has vehemently argued that Congress should be tasked with determining the fate of the shooter’s iPhone, not the courts. Come next Tuesday, Apple will finally be able to plead its case directly in front of our country’s lawmakers.

Earlier today, the House Judiciary Committee announced that it will be holding a congressional hearing on encryption on Tuesday, March 1. The hearing itself is called, “The Encryption Tightrope: Balancing Americans’ Security and Privacy.”

Slated to testify on the first panel is FBI director James Comey who, you might recall, recently penned a blog post arguing that the current debate isn’t about the implications of encryption, but rather about “the victims and justice.”

On the second panel, Apple’s top lawyer, Bruce Sewell, will appear and present Apple’s case. Appearing alongside him will be Susan Landau, a cybersecurity expert, and New York District Attorney Cyrus R. Vance, Jr.

The House Judiciary Committee also released a statement on the upcoming hearing.

This should undoubtedly make for a lively hearing.

Speaking to the seriousness with which Apple views this debate, Tim Cook yesterday said that helping the FBI would be tantamount to creating the “software equivalent of cancer.”

Apple CEO defends position in encryption dispute with feds

Apple CEO Tim Cook said in an interview Wednesday it was a tough decision to resist a court order directing the tech giant to override security features on the iPhone used by one of the San Bernardino gunmen who killed 14 people in a December terror attack.

However, Cook reiterated to ABC News in his first interview since the controversy erupted last week that if his company complied with the FBI’s demand to unlock Syed Rizwan Farook’s encrypted phone it would be “bad for America.”

“Some things are hard and some things are right, and some things are both. This is one of those things,” Cook said. The interview came as both sides in the dispute are courting public support, through interviews and published statements, while also mustering legal arguments in the case.

Federal authorities have insisted they’re only asking for narrow assistance in bypassing some security features on the iPhone, which they believe contains information related to the mass murders. Apple argues that doing so would make other iPhones more susceptible to hacking by authorities or criminals in the future.

The Apple chief expressed sympathy for the shooting victims’ families, and said his company provided engineers and technical advice to authorities investigating the case. But he said authorities are now asking the company “to write a piece of software that we view as sort of the equivalent of cancer.”

The software could “expose people to incredible vulnerabilities,” Cook added, arguing that smartphones contain private information about users and even their families.

“This would be bad for America,” he said. “It would also set a precedent that I believe many people in America would be offended by.”

Meanwhile, Attorney General Loretta Lynch defended the FBI’s push to access the locked phone Wednesday, saying judges at all levels have held that such companies “must assist if it is reasonably within their power to do so,” and suggesting Congress does not need to get involved as Apple wants.

But Lynch used testimony Wednesday before a House appropriations subcommittee to lay out the DOJ position that courts already have found companies must assist in opening devices.

“If the government needs the assistance of third parties to ensure that the search is actually conducted, judges all over the country and on the Supreme Court have said that those parties must assist if it is reasonably within their power to do so,” she said, without mentioning Apple by name. “And that is what we have been asking, and we owe it to the victims and to the public whose safety we must protect to ensure that we have done everything under the law to fully investigate terrorist attacks on American soil.”

Apple also is expected to argue that the Obama administration’s request to help it hack into an iPhone in the federal investigation of the San Bernardino attack is improper under an 18th century law, the 1789 All Writs Act, which has been used to compel companies to provide assistance to law enforcement.

Magistrate Judge Sheri Pym in California ordered Apple last week to create specialized software to help the FBI hack into a locked, county-issued iPhone used by Farook.

Why Canada isn’t having a policy debate over encryption

The legal saga between Apple and the FBI has thrust encryption into the government’s policy spotlight again – but only if you live in the United States. In Canada, you could be excused for not knowing such a debate exists.

Ever since FBI director James Comey characterized the rising tide of encrypted data as “going dark” in an October, 2014 speech, American civil liberties groups, cryptographers, private companies and politicians have argued ceaselessly about encryption’s merits and the dangers of so-called backdoors.

While most acknowledge that encryption keeps vast swaths of Internet communication and services secure, there have nonetheless been calls for legislation, “golden keys” and the formation of encryption committees in response to increasingly vocal arguments that encryption is helping criminals and terrorists operate beyond the law’s reach.

Things culminated last week with a court order, obtained by the FBI, requiring Apple Inc. to modify its software to make it easier for law enforcement to break the iPhone’s security protections – modifications that have been characterized as a backdoor for law enforcement, or criminals, to use again and again.

In Canada, however, policy discussions involving encryption and, more broadly, police powers in the digital realm – such as cellphone tracking devices and the use of hacking tools – have been “functionally non-existent,” according to Citizen Lab researcher Christopher Parsons.

“We haven’t had the kind of debate and back and forth and public positions taken that you see in the United States, you see in the United Kingdom. We just don’t do it here,” Mr. Parsons said.

Some of the reasons are familiar. There is, for example, a comparatively smaller policy community in Canada that focuses on these issues than there is in the U.S., and a smaller amount of case law – not to mention the fact that previous governments have shown more interest in expanding police powers, rather than curtailing or even detailing them.

And if past U.S. cases are any indication, the government will just as easily benefit by staying out of the debate and piggybacking on the outcome of the FBI’s case.

“They can dodge the debate and benefit from it without having to engage in it,” said Tamir Israel, a staff lawyer with the Canadian Internet Policy and Public Interest Clinic. “And then the other side to that is they often will find quieter ways to get comparable results where they can’t directly piggyback.”

By way of example, Mr. Israel pointed to the Solicitor General’s Enforcement Standards (SGES), which outline 23 technical surveillance standards that must be followed as a condition of obtaining a wireless spectrum licence in Canada. After the U.S. passed lawful surveillance legislation called the Communications Assistance for Law Enforcement Act in the 1990s, Canada used the SGES to quietly introduce similar standards.

Although the standards were introduced in the mid-1990s and updated again in 2008, details were not made public until The Globe and Mail obtained past and current versions of the documents in 2013.

Mr. Israel pointed to a wider problem preventing a successful encryption debate in Canada: a lack of transparency surrounding the government’s position and policies. He raised cellphone tracking technology called Stingrays, or IMSI catchers, as an example. “I personally find it very hard to believe that no law enforcement agencies in Canada are using these. But we can’t even get the debate going, because we can’t get past that first step where any of them admit that they’re using them.”

The RCMP would not comment on Apple’s dispute with the FBI but said in a statement: “International police agencies are all in agreement that some ability to access evidence when judicial authorization is granted is required, recognizing that secure data and communications enables commerce and social interactions in today’s reality. These are complex challenges which the RCMP continues to study.”

The statement continued: “The RCMP encourages public discourse with Canadians as public policy continues to take shape on the issue of encryption.”

The Office of the Privacy Commissioner of Canada said in an e-mail that it was not aware of any government agencies that have proposed backdoors in Canadian companies or Internet service providers, and that it is following encryption discussions “with interest.”

When reached via e-mail, Liberal MP Robert Oliphant, who chairs the standing committee on public safety and national security, wrote that, “while encryption and backdoors are of great concern to a number of people, they have not yet surfaced as issues for our committee in its early days.”

However, he added, the committee is still “sifting through all the important issues of safety and security and will be setting our work plan shortly.”

Public Safety Canada said in a statement that it is “monitoring the ongoing debate in the U.S. and other countries on the issue of government access to encrypted data” and that “no special events related to encryption” are currently planned.

NDP MP and committee vice-chair Brian Masse, echoing Mr. Oliphant’s statement, added that any proposed legislative changes involving encryption or backdoors should be handled democratically and involve both the Privacy Commissioner and Parliament.

Meanwhile, neither the chair nor vice-chairs of the standing committee on industry, science and technology responded to a request for comment.

A small comfort, Citizen Lab’s Mr. Parsons argued, is that Canadian politicians have shown themselves to be more level-headed and avoided the sky-is-falling rhetoric of their counterparts in the U.S., where Senator Dianne Feinstein, who chairs the Senate select committee on intelligence, stated earlier this month that “an Internet connection and an encrypted message application” is all Islamic State militants need to carry out an attack.

If this issue is going to be given some weight, Mr. Parsons suggested, committee meetings that “very seriously look into this while there isn’t a terror moment” would be “the ideal way of going.”

Mark Zuckerberg Defends Apple’s Stance On Encryption

The real battle for data encryption on our mobile devices has heated up considerably over the past few weeks and looks set to come to a boil relatively soon as tech companies and industry moguls alike join Apple in its defense of encryption. This all began back in 2013, when Edward Snowden blew the whistle on the US government’s PRISM domestic spying program, revealing that our mobile devices might be feeding the government more information than anyone had realized. Ever since then, tech companies have been steadily locking down our personal information, whether by adding two-step verification on logins or by actually encrypting the data stored on a mobile device. Google and Apple, among nearly 150 other big-name tech companies, petitioned President Obama to support data encryption nearly a year ago, and now it looks like that letter couldn’t have been more timely.

If you’ve been keeping up with the news lately, you’ll know that the San Bernardino shooter used an encrypted iPhone, a protection Apple has been known for since it made encryption standard some time ago. Joining Apple in its quest to secure our data is Google, which now requires phones shipping with Android 6.0 Marshmallow to be fully encrypted, keeping your data secure from, apparently, even the FBI. While the FBI and Apple fight a legal battle over the future of encryption, company after company has piled on, urging the FBI to deal with the problem another way and preserve encryption and users’ privacy as a whole.

Facebook founder, CEO and multi-billionaire Mark Zuckerberg has joined this fight alongside his company, opposing both a potential ban on encrypted phone sales in the US and the alternative the FBI has suggested: a backdoor in encryption methods for law enforcement. Such a move would eliminate the greatest security measure users have to protect their devices and would defeat the point of encryption in the first place. Zuckerberg’s support of encryption is a huge win for the tech community, and it further backs the opinions of people like John McAfee, who argue that the FBI has fallen far behind the times and needs to find other ways of dealing with encryption than trying to remove it.

San Bernardino victims to oppose Apple on iPhone encryption

Some victims of the San Bernardino attack will file a legal brief in support of the U.S. government’s attempt to force Apple Inc to unlock the encrypted iPhone belonging to one of the shooters, a lawyer representing the victims said on Sunday.

Stephen Larson, a former federal judge who is now in private practice, told Reuters that the victims he represents have an interest in the information which goes beyond the Justice Department’s criminal investigation.

“They were targeted by terrorists, and they need to know why, how this could happen,” Larson said.

Larson said he was contacted a week ago by the Justice Department and local prosecutors about representing the victims, prior to the dispute becoming public. He said he will file an amicus brief in court by early March.

A Justice Department spokesman declined to comment on the matter on Sunday.

Larson declined to say how many victims he represents. Fourteen people died and 22 others were wounded in the shooting attack by a married couple who were inspired by Islamic State militants and died in a gun battle with police.

Entry into the fray by victims gives the federal government a powerful ally in its fight against Apple, which has cast itself as trying to protect public privacy from overreach by the federal government.

An Apple spokesman declined to comment. In a letter to customers last week, Tim Cook, the company’s chief executive, said: “We mourn the loss of life and want justice for all those whose lives were affected,” saying that the company has “worked hard to support the government’s efforts to solve this horrible crime.”

Federal Bureau of Investigation Director James Comey said in a letter released on Sunday night that the agency’s request wasn’t about setting legal precedent, but rather seeking justice for the victims and investigating other possible threats.

“Fourteen people were slaughtered and many more had their lives and bodies ruined. We owe them a thorough and professional investigation under law. That’s what this is,” Comey wrote.

The FBI is seeking the tech company’s help to access shooter Syed Rizwan Farook’s phone by disabling some of its passcode protections. The company so far has pushed back, arguing that such a move would set a dangerous precedent and threaten customer security.

The clash between Apple and the Justice Department has driven straight to the heart of a long-running debate over how much law enforcement and intelligence officials should be able to monitor digital communications.

The Justice Department won an order in a Riverside, California federal court on Tuesday against Apple, without the company present in court. Apple is scheduled to file its first legal arguments on Friday, and U.S. Magistrate Judge Sheri Pym, who served as a federal prosecutor before being appointed to the bench, has set a hearing on the issue for next month.

Larson once presided over cases in Riverside, and Pym argued cases in Larson’s courtroom several times as a prosecutor while Larson was a judge, he said. Larson returned to private practice in 2009, saying at the time that a judge’s salary was not enough to provide for his seven children.

He said he is representing the San Bernardino victims for free.

What Tim Cook doesn’t want to admit about iPhones and encryption

When Hillary Clinton called for a “Manhattan-like project” to find a way for the government to spy on criminals without undermining the security of everyone else’s communications, the technology world responded with mockery.

“Also we can create magical ponies who burp ice cream while we’re at it,” snarked prominent Silicon Valley investor Marc Andreessen. Clinton’s idea “makes no sense,” added Techdirt’s Mike Masnick, because “backdooring encryption means that everyone is more exposed to everyone, including malicious hackers.”

It’s an argument that’s been echoed by Apple CEO Tim Cook, who is currently waging a legal battle with the FBI over its request to unlock the iPhone of San Bernardino terrorism suspect Syed Rizwan Farook. “You can’t have a backdoor that’s only for the good guys,” Cook said in November.

There’s just one problem: This isn’t actually true, and the fight over Farook’s iPhone proves it. Apple has tacitly admitted that it can modify the software on Farook’s iPhone to give the FBI access without damaging the security of anyone else’s iPhone.

Claiming that secure back doors are technically impossible is politically convenient. It allows big technology companies like Apple to say that they’d love to help law enforcement but don’t know how to do it without also helping criminals and hackers.

But now, faced with a case where Apple clearly can help law enforcement, Cook is in the awkward position of arguing that it shouldn’t be required to.

Apple isn’t actually worried about the privacy of a dead terrorism suspect. Cook is worried about the legal precedent — not only being forced to help crack more iPhones in the future, but conceivably being forced to build other hacking tools as well.

But by taking a hard line in a case where Apple really could help law enforcement in an important terrorism case — and where doing so wouldn’t directly endanger the security of anyone else’s iPhone — Apple risks giving the impression that tech companies’ objections aren’t being made entirely in good faith.

The San Bernardino case shows secure back doors are possible

Technologists aren’t lying when they say secure back doors are impossible. They’re just talking about something much narrower than what the term means to a layperson. Specifically, their claim is that it’s impossible to design encryption algorithms that scramble data in a way that the recipient and the government — but no one else — can read.

That’s been conventional wisdom ever since 1994, when a researcher named Matt Blaze demonstrated that a government-backed proposal for a back-doored encryption chip had fatal security flaws. In the two decades since, technologists have become convinced that this is something close to a general principle: It’s very difficult to design encryption algorithms that are vulnerable to eavesdropping by one party but provably secure against everyone else. The strongest encryption algorithms we know about are all designed to be secure against everyone.

But the fact that we don’t know how to make an encryption algorithm that can be compromised only by law enforcement doesn’t imply that we don’t know how to make a technology product that can be unlocked only by law enforcement. In fact, the iPhone 5C that Apple and the FBI are fighting about this week is a perfect example of such a technology product.

You can read about how the hack the FBI has sought would work in my previous coverage, or this even more detailed technical analysis. But the bottom line is that the technology the FBI is requesting — and that Apple has tacitly conceded it could build if forced to do so — accomplishes what many back door opponents have insisted is impossible.

Without Apple’s help, Farook’s iPhone is secure against all known attacks. With Apple’s help, the FBI will be able to crack the encryption on Farook’s iPhone. And helping the FBI crack Farook’s phone won’t help the FBI or anyone else unlock anyone else’s iPhone.

It appears, however, that more recent iPhones are not vulnerable to the same kind of attack. (Update: Apple has told Techcrunch that newer iPhones are also vulnerable.) If Farook had had an iPhone 6S instead of an iPhone 5C, it’s likely (though only Apple knows for sure) that Apple could have truthfully said it had no way to help the FBI extract the data.

That worries law enforcement officials like FBI Director James Comey, who has called on technology companies to work with the government to ensure that encrypted data can always be unscrambled. Comey hasn’t proposed a specific piece of legislation, but he is effectively calling on Apple to stop producing technology products like the iPhone 6S that cannot be hacked even with Apple’s help.

The strongest case against back doors is repressive regimes overseas

If you have a lot of faith in the US legal system (and you’re not too concerned about the NSA’s creative interpretations of surveillance law), Comey’s demand might seem reasonable. Law enforcement agencies have long had the ability to get copies of almost all types of private communication and data if they first get a warrant. There would be a number of practical problems with legally prohibiting technology products without back doors, but you might wonder why technology companies don’t just voluntarily design their products to comply with lawful warrants.

But things look different from a global perspective. Because if you care about human rights, then you should want to make sure that ordinary citizens in authoritarian countries like China, Cuba, and Saudi Arabia also have access to secure encryption.

And if technology companies provided the US government with backdoor access to smartphones — either voluntarily or under legal compulsion — it would be very difficult for them to refuse to extend the same courtesy to other, more authoritarian regimes. In practice, providing access to the US government also means providing access to the Chinese government.

And this is probably Apple’s strongest argument in its current fight with the FBI. If the US courts refuse to grant the FBI’s request, Apple might be able to tell China that it simply doesn’t have the software required to help hack into the iPhone 5Cs of Chinese suspects. But if Apple were to create the software for the FBI, the Chinese government would likely put immense pressure on Apple to extend it the same courtesy.

Google CEO Pichai Lends Apple Support on Encryption

Google Chief Executive Sundar Pichai lent support to Apple Inc.’s  pushback against a federal order to help law enforcement break into the locked iPhone of an alleged shooter in the San Bernardino, Calif., attacks.

Mr. Pichai wrote on Twitter on Wednesday that “forcing companies to enable hacking could compromise users’ privacy.”

A federal judge Tuesday ordered Apple to enable investigators to bypass the passcode of the iPhone once used by alleged shooter Syed Rizwan Farook. Apple CEO Tim Cook wrote on Apple’s website that such a move would create “a backdoor” around security measures hackers could eventually use to steal iPhone users’ data.

On Twitter, Mr. Pichai called Mr. Cook’s letter an “important post.” He said that while Alphabet Inc.’s Google provides user data to law enforcement under court orders, “that’s wholly different than requiring companies to enable hacking of customer devices and data. Could be a troubling precedent.”

Google, like Apple, has been locked in an intensifying battle with U.S. authorities over the companies’ smartphone encryption software. The firms say that the encryption is crucial to protecting users’ privacy, and keeping their trust. Law enforcement officials say such software hinders criminal investigations, including into the San Bernardino attacks.

Here’s why the FBI forcing Apple to break into an iPhone is a big deal

When U.S. Magistrate Sheri Pym ruled that Apple must help the FBI break into an iPhone belonging to one of the killers in the San Bernardino, Calif., shootings, the tech world shuddered.

Why? The battle of encryption “backdoors” has been longstanding in Silicon Valley, where a company’s success could be made or broken based on its ability to protect customer data.

The issue came into the spotlight after Edward Snowden disclosed the extent to which technology and phone companies were letting the U.S. federal government spy on data being transmitted through their network.

Since Edward Snowden’s whistleblowing revelations, Facebook, Apple and Twitter have unilaterally said they are not going to create such backdoors anymore.

So here’s the “backdoor” the FBI wants: Right now, iPhone users have the option to set a security feature that only allows a certain number of tries to guess the correct passcode to unlock the phone before all the data on the iPhone is deleted. It’s a security measure Apple put in place to keep important data out of the wrong hands.

Federal prosecutors looking for more information behind the San Bernardino shootings don’t know the phone’s passcode. If they guess incorrectly too many times, the data they hope to find will be deleted.

That’s why the FBI wants Apple to disable the security feature. Once the security is crippled, agents would be able to guess as many combinations as possible.
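The auto-erase behavior described above boils down to a simple failed-attempt counter. Here is a minimal sketch in Python; all names are invented for illustration, and Apple’s real implementation lives in firmware and secure hardware and is far more involved:

```python
# Toy model of a passcode guard with an auto-erase limit.
# Illustrative only -- not Apple's actual design.

class PasscodeGuard:
    def __init__(self, passcode: str, max_attempts: int = 10):
        self.passcode = passcode
        self.max_attempts = max_attempts
        self.failed = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # data already erased; nothing left to unlock
        if guess == self.passcode:
            self.failed = 0
            return True
        self.failed += 1
        if self.failed >= self.max_attempts:
            self.wiped = True  # auto-erase: brute force now gains nothing
        return False
```

With the guard in place, guessing every combination destroys the data; disable the `wiped` check and the counter, and unlimited guessing becomes possible, which is exactly what the FBI asked for.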

Kurt Opsahl, general counsel for the Electronic Frontier Foundation, a San Francisco-based digital rights non-profit, explained that this “backdoor” means Apple will have to write brand-new code that compromises key features of the phone’s security. Apple has five business days to respond to the request.

What does Apple have to say about this? The company hasn’t commented yet today, but back in December, Apple CEO Tim Cook defended the company’s use of encryption on its mobile devices in a broad interview with 60 Minutes, saying users should not have to trade privacy for national security. In the interview, Cook stood by the company’s refusal to provide access to users’ encrypted texts and messages.

Describing a user’s iPhone, Cook said: “There’s likely health information, there’s financial information. There are intimate conversations with your family, or your co-workers. There’s probably business secrets and you should have the ability to protect it. And the only way we know how to do that, is to encrypt it. Why is that? It’s because if there’s a way to get in, then somebody will find the way in.”

Cook says Apple cooperates with law enforcement requests, but can’t access encrypted information on users’ smartphones. According to a page on Apple’s website detailing government requests, Apple says encryption data is tied to the device’s passcode.

Cook also dismissed the idea that iPhone users should swap privacy for security. “We’re America. We should have both.”

What does this mean for the next time the government wants access? The order doesn’t create a precedent in the sense that other courts will be compelled to follow it, but it will give the government more ammunition.

What do digital rights experts have to say? There are two things that make this order very dangerous, Opsahl said. The first is the question it raises about who can make this type of demand. If the U.S. government can force Apple to do this, why can’t the Chinese or Russian governments?

The second is that while the government is requesting a program to break into this one specific iPhone, once the program is created it will essentially be a master key. It would be possible for the government to take this key, modify it and use it on other phones. That asks us to risk a lot on the hope that a government holding this power will never misuse it, he said.

And the lawmakers? Well, they are torn. Rep. Adam Schiff (D-Calif.), a key House Democrat, says Congress shouldn’t force tech companies to build encryption backdoors. Congress is struggling with how to handle the complex issue.

On the other side of things, Senate Intelligence Committee Chairman Richard Burr, R-N.C., and Vice Chair Dianne Feinstein, D-Calif., say they want to require tech companies to provide a backdoor into encrypted communication when law enforcement officials obtain a court order to investigate a specific person.

What now? This could push the tech companies to give users access to unbreakable encryption. To some extent, it’s already happening. Companies like Apple and Google — responding to consumer demands for privacy — have developed smartphones and other devices with encryption that is so strong that even the companies can’t break it.

Encryption May Hurt Surveillance, but Internet Of Things Could Open New Doors

Tech companies and privacy advocates have been in a stalemate with government officials over how encrypted communication affects the ability of federal investigators to monitor terrorists and other criminals. A new study by Harvard’s Berkman Center for Internet and Society convened experts from all sides to put the issue in context.

The report concluded that information from some apps and devices like smartphones may be harder for government investigators to intercept because of stronger encryption. But, it said, we are connecting so many more things to the Internet (light bulbs, door locks, watches, toasters) that they could create new surveillance channels.

The encryption debate has reheated recently following the attacks in Paris and to some extent San Bernardino, Calif., with CIA and FBI officials warning about their investigation channels “going dark” because of the stronger encryption placed on communications tools like WhatsApp or FaceTime.

(The distinction is this: With things like emails, Web searches, photos or social network posts, information typically gets encrypted on your phone or laptop and then decrypted and stored on a big corporate data server, where law enforcement officials have the technical and legal ability to get access to the content, for instance, with a subpoena. But with messages that are encrypted end-to-end, data gets encrypted on one device and only gets decrypted when it reaches the recipient’s device, making it inaccessible even with a subpoena.)
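That distinction can be illustrated with a deliberately toy cipher (a single-block XOR keystream derived from a hash, which is not real cryptography). In the server-side model, the provider holds a decryption key and can comply with a subpoena; in the end-to-end model, only the two devices share the key, so the relaying server carries ciphertext it cannot open. All keys and messages below are invented:

```python
import hashlib

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy XOR stream cipher for illustration only -- NOT secure.
    stream = hashlib.sha256(key).digest()
    assert len(plaintext) <= len(stream)
    return bytes(p ^ s for p, s in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XOR with the same keystream is its own inverse

# Server-side model: the provider holds the key, so it (or a warrant
# served on it) can decrypt the stored message.
provider_key = b"held-by-the-service-provider"
stored = toy_encrypt(provider_key, b"hello")
assert toy_decrypt(provider_key, stored) == b"hello"

# End-to-end model: only sender and recipient share the key; the server
# relaying the message has no key that opens it.
endpoint_key = b"shared-only-by-the-two-devices"
in_transit = toy_encrypt(endpoint_key, b"hello")
assert toy_decrypt(provider_key, in_transit) != b"hello"  # server cannot read
assert toy_decrypt(endpoint_key, in_transit) == b"hello"  # recipient can
```

The key location, not the cipher strength, is what decides whether a subpoena can produce plaintext, which is the point of the “going dark” complaint.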

The agencies have asked for “back doors” into these technologies, though the Obama administration cooled off its push for related legislation late last year over concerns that such security loopholes would also attract hackers and other governments.

But the Harvard report (which was funded by the Hewlett Foundation) argues that “going dark” is a faulty metaphor for the surveillance of the future, thanks to the raft of new technologies that are and likely will remain unencrypted — all the Web-connected home appliances and consumer electronics that sometimes get dubbed the Internet of Things.

Some of the ways the data used to be accessed will undoubtedly become unavailable to investigators, says Jonathan Zittrain, a Harvard professor who was one of the authors. “But the overall landscape is getting brighter and brighter as there are so many more paths by which to achieve surveillance,” he says.

“If you have data flowing or at rest somewhere and it’s held by somebody that can be under the jurisdiction of not just one but multiple governments, those governments at some point or another are going to get around to asking for the data,” he says.

The study team is notable for including technical experts and civil liberties advocates alongside current and former National Security Agency, Defense Department and Justice Department officials. Another chief author was Matthew Olsen, former director of the National Counterterrorism Center and NSA general counsel.

Though not all 14 core members had to agree to every word of the report, they had to approve of the thrust of its findings — with the exception of current NSA officials John DeLong and Anne Neuberger, whose jobs prevented them from signing onto the report (and Zittrain says nothing should be inferred about their views).

The results of the report are a bit ironic: It tries to close one can of worms (the debate over encryption hurting surveillance) but opens another one (the concerns about privacy in the future of Internet-connected everything).

“When you look at it over the long term,” says Zittrain, “with the breadth of ways in which stuff that used to be ephemeral is now becoming digital and stored, the opportunities for surveillance are quite bright, possibly even worryingly so.”

Weak email encryption laws put Aussie consumers at risk of fraud

A consumer alert issued by Victoria’s Legal Services Commissioner a few weeks ago raised, to our mind, an old and curious issue. Why aren’t Australian professionals required to secure their email?

Eighteen years ago, Victoria’s Law Institute Journal carried an excellent feature article on the ease with which email can be forged, the fact that it was already happening, and the gold standard technology for mitigating the risk: digital signatures and encryption. We have to say it was excellent, since we wrote it, but it did get a lot of attention. It even won an award. But it had no practical impact at all.

Fast forward to 2016, and the same state’s Legal Services Commissioner is alarmed by a UK report of an email hoax that fleeced a newly married couple of their home deposit. Just as they were waiting for instructions from their lawyers on where to transfer their hard-earned £45,000, fraudsters sent a bogus message that impersonated the lawyers and nominated a false bank account. The hapless couple complied and the scammers collected their cash.

UNSECURED SYSTEM

The Victorian Commissioner’s alert includes several good points of advice to consumers, like being cautious about links and attachments in emails from unfamiliar senders and using antivirus software. But curiously, it doesn’t canvass the key technology question raised in the UK report: Why wasn’t the lawyers’ email secured against forgery?

The newlywed groom pointed the finger right at the problem, quoted as saying: “Losing this money is bad enough. But what makes it worse is that this could have all been avoided if our emails had been encrypted. It seems crazy to ask us to transfer such huge amounts by sending a bank account number.”

The lawyers’ response? Advantage Property Lawyers said the firm was not responsible for the couple’s loss, and that while its emails were not encrypted, this was standard industry practice: “We stick to the highest industry standards in all aspects of our business.”

So non-encryption, fairly described by Joe Public as crazy, is the standard industry practice in the UK, just as it is in Australia.

There may be more to this than meets the eye. A couple of years after our 1997 article, we were asked to host a media lunch for Phil Zimmermann, the US tech wizard who created the first user-friendly email encryption and signing software. We invited a senior officer of the Law Institute, thinking the topic would be of vital interest. Apparently not.

Over lunch, Zimmermann offered to supply the Institute with free copies of the tool so it could lead the profession down the road of best practice. For reasons we didn’t understand then and still don’t, the offer created no interest.

LACK OF INTEREST

We recounted the story of that lunch in this column years later, wondering if that would spark some enquiry into the options for fighting exactly the kind of fraud that’s happening in the UK. Silence. It seems that, at the highest levels, legal eagles’ eyes glaze over when the topic of secure email arises. As long as the entire profession ignores the issue, we can all say that “our emails are not encrypted but this is standard industry practice.”

For the record, encryption can help secure email in two ways. First, it can prove that a message is from an authenticated sender, and hasn’t been tampered with in transit. Optionally, it can also scramble the contents of messages so only the intended recipient can read them. Implementing these protections requires some centralised infrastructure and a way to ensure it is used by the target audience. Australia’s law societies are ideally placed to sponsor a more secure system, especially now that a uniform national legal practice regime is in operation.
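The first of those two protections, proving who sent a message, can be sketched with Python’s standard hmac module. This uses a shared-secret message authentication code rather than the public-key signatures of PGP or S/MIME, and every name and value below is invented, but the verify-before-trusting workflow is the same one that would have caught the bogus bank-account message:

```python
import hashlib
import hmac

# Toy sketch of sender authentication using a shared-secret MAC.
# Real email signing (PGP, S/MIME) uses public-key signatures instead.

def sign_message(secret: bytes, message: bytes) -> str:
    """Produce an authentication tag the recipient can check."""
    return hmac.new(secret, message, hashlib.sha256).hexdigest()

def verify_message(secret: bytes, message: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign_message(secret, message), tag)

# Secret agreed out-of-band, e.g. at the clients' first meeting (invented).
firm_secret = b"agreed-in-person-not-by-email"
instructions = b"Transfer the deposit to account 12-345-678"
tag = sign_message(firm_secret, instructions)

# The genuine message verifies; a fraudster's substituted account does not,
# because the fraudster cannot produce a valid tag without the secret.
assert verify_message(firm_secret, instructions, tag)
forged = b"Transfer the deposit to account 99-999-999"
assert not verify_message(firm_secret, forged, tag)
```

The second protection, confidentiality, adds encryption of the message body on top of this, so that even an intercepted message reveals nothing to the fraudster.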

We used Zimmermann’s product for a couple of years, and it was simple. Using an Outlook plug-in, you clicked a button to send a signed message. You entered a password, the software worked its magic in the background, and a digital signature was applied. We gave it up when it became clear that insecure email was set to remain industry best practice for years to come.

Back in 1997, we wrapped up our article with the wildly inaccurate prediction that “in two years, all commercial documentation will be digitally signed. Lawyers have every reason to lead the way.”

Here’s hoping it doesn’t take another 18 years.

Top senator: Encryption bill may “do more harm than good”

Legislating encryption standards might “do more harm than good” in the fight against terrorism, Senate Homeland Security Committee Chairman Ron Johnson (R-Wis.) said on Thursday.

In the wake of the terrorist attacks in Paris and San Bernardino, Calif., lawmakers have been debating whether to move a bill that would force U.S. companies to decrypt data for law enforcement.

“Is it really going to solve any problems if we force our companies to do something here in the U.S.?” Johnson asked at the American Enterprise Institute, a conservative think tank. “It’s just going to move offshore. Determined actors, terrorists, are still going to be able to find a service provider that will be able to encrypt accounts.”

Investigators have said the Paris attackers used encrypted apps to communicate. It’s part of a growing trend, law enforcement says, in which criminals and terrorists are using encryption to hide from authorities.

For many, the solution has been to require that tech companies maintain the ability to decrypt data when compelled by a court order. Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.) are currently working on such a bill.

But the tech community and privacy advocates have pushed back. They warn that any type of guaranteed access to encrypted data puts all secure information at risk. Keeping a key around to unlock encryption means, they argue, that anyone, including hackers, can use that key.

Johnson said he understands the importance of strong encryption.

“Let’s face it, encryption helps protect personal information,” he said. “It’s crucial to that. I like the fact that if somebody gets my iPhone, they’re going to have a hard time getting into it.”

Capitol Hill faces a learning curve on the issue, Johnson explained.

“It really is not understanding the complexity,” he said. “And I’m not being critical here. It’s really complex, which is the biggest problem you have in terms of cyber warfare [and] cyberattacks.”

“The experts, the attackers are multiple steps ahead of the good guys trying to reel them in, trying to find them,” Johnson added.

McCaul: US playing ‘catchup’ to terrorists using encryption

The U.S. is playing “catchup” with terrorists and cyber vigilantes who coordinate via encrypted communications, according to the chairman of the House Homeland Security Committee.

“Today’s digital battlefield has many more adversaries than just nation states,” Rep. Michael McCaul (R-Texas) said in a Tuesday column for Bloomberg. “Terrorist groups such as ISIS [the Islamic State in Iraq and Syria], as well as hacktivists … are adept at using encryption technologies to communicate and carry out malicious campaigns, leaving America to play catchup.”

McCaul has been outspoken in the fight between tech companies and law enforcement over the regulation of encryption technology. He is currently prepping legislation that would establish a national commission to find ways to balance the public’s right to privacy with giving police access to encrypted information.

“I do think this is one of the greatest challenges to law enforcement that I have probably seen in my lifetime,” the former federal prosecutor told reporters last week.

Lawmakers are split over whether legislation is needed to address the growing use of technology that can prevent even a device manufacturer from decrypting data.

Tech experts argue that any guaranteed access for law enforcement weakens overall Internet security and makes online transactions such as banking and hotel bookings riskier. Privacy advocates say strong encryption provides important protection to individuals.

But law enforcement officials, along with some lawmakers, continue to argue that impenetrable encryption is a danger to public safety.

“From gang activity to child abductions to national security threats, the ability to access electronic evidence in a timely manner is often essential to successfully conducting lawful investigations and preventing harm to potential victims,” Assistant Attorney General Leslie Caldwell said at the annual State of the Net conference on Monday.

The White House has tried to engage Silicon Valley on the topic, recently meeting with top tech executives on the West Coast. But some lawmakers feel the process should move quicker.

In the upper chamber, Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.)  are working on a bill that would force companies to build their encryption so they could respond to a court order for secured data.

Both members of the Intelligence Committee have expressed a desire to move swiftly on encryption legislation and bypass the proposed national commission to study the topic.

McCaul warned that the threats the U.S. faces online “will only grow more prevalent.”

“The security of Americans’ personal information needs to keep pace with the emerging technologies of today,” McCaul said.

Half-Measures on Encryption Since Snowden

When the NSA subcontractor Edward Snowden released classified documents in June 2013 baring the U.S. intelligence community’s global surveillance programs, it revealed the lax attention to privacy and data security at major Internet companies like Apple, Google, Yahoo, and Microsoft. Warrantless surveillance was possible because data was unencrypted as it flowed between internal company data centers and service providers.

The revelations damaged technology companies’ relationships with businesses and consumers. Various estimates pegged the impact at between $35 billion and $180 billion as foreign business customers canceled service contracts with U.S. cloud computing companies in favor of foreign competitors, and as the companies poured money into PR campaigns to reassure their remaining customers.

There was a silver lining: the revelations catalyzed a movement among technology companies to use encryption to protect users’ data from spying and theft. But the results have been mixed. Major service providers including Google, Yahoo, and Microsoft—who are among the largest providers of cloud- and Web-based services like e-mail, search, storage, and messaging—have indeed encrypted user data flowing across their internal infrastructure. But the same isn’t true in other contexts, such as when data is stored on smartphones or moving across networks in hugely popular messaging apps like Skype and Google Hangouts. Apple is leading the pack: it encrypts data by default on iPhones and other devices running newer versions of its operating system, and it encrypts communications data so that only the sender and receiver have access to it.
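The end-to-end property described here, where only the sender and receiver hold the keys, generally rests on the two parties agreeing on a secret that never touches the provider’s servers. A toy Diffie-Hellman exchange sketches the idea; the prime below is a small Mersenne prime chosen for readability and is far too small for real security:

```python
# Toy Diffie-Hellman key agreement: both parties derive the same shared
# secret while the network (and any service provider relaying traffic)
# sees only the public values. Illustrative only -- real systems use
# vetted groups/curves and audited libraries.
import hashlib
import secrets

P = 2**127 - 1   # a Mersenne prime; toy-sized, NOT a secure group
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1   # private exponent, never sent
    pub = pow(G, priv, P)                 # public value sent over the wire
    return priv, pub

a_priv, a_pub = keypair()                 # sender
b_priv, b_pub = keypair()                 # receiver

# Each side combines its own private key with the other's public value.
a_secret = pow(b_pub, a_priv, P)
b_secret = pow(a_pub, b_priv, P)
assert a_secret == b_secret               # identical, yet never transmitted

session_key = hashlib.sha256(str(a_secret).encode()).digest()
print(len(session_key))  # 32-byte symmetric session key
```

Because neither private exponent ever leaves its device, a provider relaying the public values cannot reconstruct the session key, which is the property Apple’s design claims.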

But Apple products aren’t widely used in the poor world. Of the 3.4 billion smartphones in use worldwide, more than 80 percent run Google’s Android operating system. Many are low-end phones with less built-in protection than iPhones. This has produced a “digital security divide,” says Chris Soghoian, principal technologist at the American Civil Liberties Union. “The phone used by the rich is encrypted by default and cannot be surveilled, and the phone used by most people in the global south and the poor and disadvantaged in America can be surveilled,” he said at MIT Technology Review’s EmTech conference in November.

Pronouncements on new encryption plans quickly followed the Snowden revelations. In November 2013, Yahoo announced that it intended to encrypt data flowing between its data centers and said it would also encrypt traffic moving between a user’s device and its servers (as signaled by the address prefix HTTPS). Microsoft announced in November and December 2013 that it would expand encryption to many of its major products and services, meaning data would be encrypted in transit and on Microsoft’s servers. Google announced in March 2014 that connections to Gmail would use HTTPS and that it would encrypt e-mails sent to other providers who can also support encryption, such as Yahoo. And finally, in 2013 and 2014, Apple implemented the most dramatic changes of all, announcing that the latest version of iOS, the operating system that runs on all iPhones and iPads, would include built-in end-to-end encrypted text and video messaging. Importantly, Apple also announced it would store the keys to decrypt this information only on users’ phones, not on Apple’s servers—making it far more difficult for a hacker, an insider at Apple, or even government officials with a court order to gain access.

Google, Microsoft, and Yahoo don’t provide such end-to-end encryption of communications data. But users can turn to a rising crop of free third-party apps, like ChatSecure and Signal, that support such encryption and open their source code for review. Relatively few users take the extra step to learn about and use these tools. Still, secure messaging apps may play a key role in making it easier to implement wider encryption across the Internet, says Stephen Farrell, a computer scientist at Trinity College Dublin and a leader of security efforts at the Internet Engineering Task Force, which develops fundamental Internet protocols. “Large messaging providers need to get experience with deployment of end-to-end secure messaging and then return to the standards process with that experience,” he says. “That is what will be needed to really address the Internet-scale messaging security problem.”

AT&T CEO won’t join Tim Cook in fight against encryption backdoors

US politicians have been urging tech companies to weaken the security of smartphones and other products by inserting encryption backdoors that let the government access personal data.

Numerous tech companies—including Apple—have come out strongly against the idea, saying that encryption backdoors would expose the personal data of ordinary consumers, not just terrorists.

But tech company leaders aren’t all joining the fight against the deliberate weakening of encryption. AT&T CEO Randall Stephenson said this week that AT&T, Apple, and other tech companies shouldn’t have any say in the debate.

“I don’t think it is Silicon Valley’s decision to make about whether encryption is the right thing to do,” Stephenson said in an interview with The Wall Street Journal. “I understand [Apple CEO] Tim Cook’s decision, but I don’t think it’s his decision to make.”

AT&T has been criticized repeatedly for its cooperation with the US National Security Agency, but Stephenson says his company has been singled out unfairly.

“‘It is silliness to say there’s some kind of conspiracy between the US government and AT&T,’ he said, adding that the company turns over information only when accompanied by a warrant or court order,” the Journal reported yesterday.

While presidential candidate Hillary Clinton called for a “Manhattan-like project” to help law enforcement break into encrypted communications, Cook argues that it’s impossible to make an encryption backdoor that can be used only by law enforcement. “The reality is if you put a backdoor in, that backdoor’s for everybody, for good guys and bad guys,” Cook said last month.

Security researchers recently discovered a backdoor password in Juniper firewall code. Researchers also found a deliberately concealed backdoor in dozens of products sold by a company that supplies audio-visual and building control equipment to the US Army, White House, and other security-conscious organizations.

FBI Director James Comey told lawmakers in October that the Obama administration won’t ask Congress for legislation requiring tech companies to install backdoors in their products, but he said the administration would continue lobbying companies to create backdoors even though they’re not required to.

Despite AT&T sitting out the debate, plenty of tech companies balk at the idea. A letter to President Obama protesting deliberate weakening of security last year was signed by Adobe, Apple, Cisco, CloudFlare, Dropbox, Evernote, Facebook, Google, Level 3, Microsoft, Mozilla, Rackspace, Symantec, Tumblr, Twitter, and others. AT&T did not sign the letter.

Cisco Security Report: Dwell time and encryption security struggles

The 2016 Cisco Security Report highlighted the duality of cybersecurity, describing issues such as encryption and dwell time as a constant struggle between threat actors looking for more effective and efficient attack techniques and security providers responding to those changes.

One of the statistics in the report that could have been spun as a net positive for Cisco was that since May, Cisco reduced the median time to detection (or dwell time) of known threats on its networks to 17 hours. However, Jason Brvenik, principal engineer for the Security Business Group at Cisco, noted that this metric was more representative of the “push and pull” between threat actors and security and should be used more as a way to see which side is improving at a given time.

“Our point in talking about time to detection is that it’s a durable metric that organizations can use, establish and measure to help them understand how well they’re doing and what their opportunity is to improve,” Brvenik said. “And, if you don’t start paying attention to time to detection, then the attacker basically has unfettered access until you get that.”
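As a back-of-the-envelope illustration of the metric Brvenik describes, median time to detection is just the median gap between compromise and detection across incidents; the timestamps below are invented:

```python
# Median "time to detection" (dwell time) over a set of incidents --
# a minimal sketch of the durable metric described in the report.
# The incident data below is made up for illustration.
from datetime import datetime
from statistics import median

incidents = [
    # (first compromise,           detection)
    (datetime(2016, 1, 4, 8, 0),  datetime(2016, 1, 4, 22, 30)),
    (datetime(2016, 1, 5, 13, 0), datetime(2016, 1, 6, 7, 0)),
    (datetime(2016, 1, 7, 2, 0),  datetime(2016, 1, 7, 16, 0)),
]

hours = [(found - start).total_seconds() / 3600 for start, found in incidents]
print(f"median time to detection: {median(hours):.1f} hours")
```

Tracked over time, a falling median suggests defenders are gaining ground; a rising one suggests attackers have adapted, which is the “push and pull” Brvenik describes.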

Fred Kost, senior vice president at HyTrust, said that although dwell time is an important security metric, it is reactive and not preventative.

“Time to detection will vary over time as the cat and mouse game plays out between attackers and defenders,” Kost said. “Part of the challenge for enterprises is the improving ability of attackers to remain covert once they have access to the network and servers, driving the need to have better segmentation and controls on what privileges users have, especially as virtualization and cloud makes access to a greater number of systems more likely.”

Brvenik said Cisco has had some success in bringing down dwell time, but that this measurement would ultimately vary.

“I fully expect that [threat actors] are going to recognize the lack of ROI or the reduction of ROI they’re getting and they’re going to come back and try something new,” Brvenik said. “As a defender, you have to be right 100% of the time and the attacker only has to be right once.”

Cisco found this was already true in the evolution of botnets and exploit kits (EKs) like Angler. The Security Report showed that attackers using Angler ran large-scale campaigns hitting 90,000 targets per server per day, 10% of which were served exploits. Of those served exploits, 40% were compromised, and 62% of those were served ransomware. Though only a small fraction (2.9%) paid the ransom, and each payment was only a few hundred dollars, that still added up to $34 million per ransomware campaign over the course of a year.
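The report’s funnel of percentages can be checked with simple arithmetic. Taking “a few hundred dollars” as a $300 average ransom (an assumption, not a figure from the report):

```python
# Checking the Angler revenue chain from the report. The $300 average
# ransom is an assumption standing in for "a few hundred dollars".
targets_per_server_day = 90_000
served_exploits = targets_per_server_day * 0.10   # 10% served exploits
compromised     = served_exploits * 0.40          # 40% of those compromised
ransomed        = compromised * 0.62              # 62% served ransomware
payers_per_day  = ransomed * 0.029                # 2.9% pay the ransom

avg_ransom = 300                                  # assumed
per_server_per_year = payers_per_day * avg_ransom * 365
print(f"≈ ${per_server_per_year:,.0f} per server per year")
```

One server at these rates yields roughly $7.1 million a year, so under the assumed ransom amount, the $34 million-per-campaign figure implies a campaign spread across several such servers.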

Craig Williams, senior technical leader and outreach manager at Cisco, said the advancements seen in how attackers use the Angler EK and botnets can be directly attributed to the security industry getting better at its job. Williams described how five or ten years ago, botnets were simple setups of one server connecting to another, so it was easy to block the host server to take down the botnet. But, attackers have found a way to use Angler to make this much more difficult to stop.

“The way that they set up the network to host these exploits is really intelligently architected around the fact that they want to have the ability to rotate servers as we take them down. You can kind of think of it like a Hydra,” Williams said. “When the customer gets redirected to the Angler exploit kit’s landing page, to them it looks like the front-end proxy server is all there is. But, the reality is that behind the scenes it’s actually being connected to another server hosting the exploit and yet a third server that’s actually continuously pinging it to make sure it’s online. The second it goes down from an abuse ticket or blocked by a good guy, it’ll actually rotate that server out and replace it with another server with a completely different IP address. So, effectively cutting the head off the Hydra, another head pops up in place and takes over. It’s a really unique design and I think it’s one that we’ve seen and will continue to see people evolve to just because it’s a little more efficient way to be a bad guy and that’s just the nature of the game in this day and age.”
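The Hydra-like rotation Williams describes boils down to a failover pattern: a monitor watches the active front end and promotes a standby with a fresh IP address the moment the active one is taken down. A minimal sketch, with hypothetical names and documentation-range IPs:

```python
# Conceptual sketch of the proxy-rotation scheme described above:
# a monitor pings the active front-end server and swaps in a standby
# (with a different IP) as soon as the active one goes down.
# All names and addresses are hypothetical.
from collections import deque

class FrontEndPool:
    def __init__(self, addresses):
        self.standby = deque(addresses)
        self.active = self.standby.popleft()

    def health_check(self, is_up) -> str:
        """Return the current front end, rotating to the next standby
        if the monitor reports the active one as down (abuse ticket,
        blocklist entry, takedown...)."""
        if not is_up(self.active):
            self.active = self.standby.popleft()   # next head of the Hydra
        return self.active

pool = FrontEndPool(["203.0.113.10", "198.51.100.7", "192.0.2.44"])
print(pool.health_check(lambda ip: True))    # active server still up
print(pool.health_check(lambda ip: False))   # down: rotates to the standby
```

The same failover pattern appears in legitimate high-availability designs; what Williams highlights is its use to make takedowns of a single host ineffective.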

Another subject that Cisco found to have both positive and negative consequences was encryption. The report stated that encryption can create security issues for organizations, including a false sense of security. Research found that encrypted traffic, especially HTTPS, crossed a tipping point in 2015: over the year, more than 50% of bytes transferred were encrypted. But Brvenik said this is something organizations need to plan for, because it means “they’re rapidly losing visibility into some of the threats that can present there.”

Williams noted that while the push towards encryption is good from a privacy standpoint, it will also introduce “significant security issues.” Williams said the biggest misconceptions were that people tend to think if something is encrypted, it is safe, and that more encryption is always better.

“Think about what encryption was designed to be used for — only the sensitive pieces of data. That’s how encryption is intended to be used,” Williams said. “An advertisement from a website is not a sensitive piece of data and it shouldn’t be encrypted. If it is, then you’re effectively hiding any potential attacks from detection systems. So, even if your company has IPS or in-line antivirus, you’re not going to see potential attacks.”

Brvenik said the loss of visibility will have cascading impacts and organizations need to plan security strategies now.

“The impact of a lack of visibility in one layer will affect others. There are solutions that can move to the endpoint; there are solutions that can move to decapsulation; there are a lot of approaches there,” Brvenik said. “The point is — they need to start thinking about it now because they’re going to find themselves in a situation where it’s too late.”

Gur Shatz, co-founder and chief technology officer of Cato Networks, said enterprises need to be careful about how they plan security strategies because dealing with encrypted data can be resource-intensive.

“Encrypted traffic requires decryption before it can be analyzed. This is a CPU-intensive process, and could add latency,” Shatz said. “Ideally, you want to decrypt once, and do all the threat detection (multiple layers) on the decrypted traffic. When using point solutions, each one will need to decrypt the traffic separately, potentially slowing down traffic. On the flip side, some enterprises will want to choose and integrate best-of-breed point solutions, because they believe they can get better detection.”
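Shatz’s “decrypt once” point can be sketched as a pipeline in which a single (stand-in) decryption step feeds every inspection layer, instead of each point solution paying the decryption cost separately. Everything below is illustrative; the detectors are naive placeholders, not real products:

```python
# Sketch of the "decrypt once, inspect many times" design described
# above, versus point solutions that each decrypt the traffic
# themselves. The XOR "cipher" is a stand-in for TLS termination.
def decrypt(blob: bytes) -> bytes:
    # Stand-in for the CPU-intensive decryption step.
    return bytes(b ^ 0x42 for b in blob)

def ips_check(t: bytes) -> bool: return b"attack" not in t
def av_check(t: bytes) -> bool:  return b"malware" not in t
def dlp_check(t: bytes) -> bool: return b"ssn=" not in t

def inspect_once(blob: bytes) -> bool:
    plaintext = decrypt(blob)                 # one decryption...
    return all(layer(plaintext)               # ...shared by every layer
               for layer in (ips_check, av_check, dlp_check))

clean = bytes(b ^ 0x42 for b in b"hello world")
print(inspect_once(clean))   # traffic passes all layers
```

With point solutions, `decrypt` would run once per layer, which is exactly the added latency and CPU cost Shatz warns about.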

Jeff Schilling, CSO for Armor, said the latency issue could force difficult decisions.

“More complex encryption algorithms are harder to decrypt for Layer 7 inspection, looking for common web OWASP top ten application attacks. This is driving the web industry to look to CDN application inspection architectures, which can inject latency, which many of our customers can’t tolerate,” Schilling said. “We have to ask ourselves, which problem has more risk? Threat actors decrypting data or launching application layer attacks? I think there is more risk in the latter.”

One attack vector that Cisco said was being overlooked was in malicious browser extensions. Cisco’s research found that more than 85% of organizations encounter malicious extensions in the browser, which can lead to leaked data, stolen account credentials, and even attackers installing malicious software on a victim’s computer.

Williams said this is especially dangerous because the browser is the largest attack surface in an organization. But, Williams also said that this should be a very easy problem to fix, because although internal Web apps may need a specific plugin or browser version, the tools exist to secure the enterprise environment.

“The reality is in this day and age there are so many different types of browsers out there and so many different ways to install those, that you can easily have a secured browser for the Internet and another browser you use because you have to have a specific plugin or a specific variant,” Williams said. “You can determine this from the network. There is no reason companies should allow insecure browsers to access the Internet anymore. We have the technology. We have solutions that can filter out vulnerable browsers and just prevent them from connecting out.”
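The network-side filtering Williams describes can be as simple as parsing the browser version out of the User-Agent header and denying anything below a policy minimum; the browser names, version thresholds, and policy below are made-up examples:

```python
# Minimal sketch of network-side browser filtering: parse a browser
# name/version out of the User-Agent header and refuse Internet access
# to versions below a (hypothetical) policy minimum.
import re

MINIMUM_VERSION = {"Chrome": 47, "Firefox": 43}   # hypothetical policy

def allowed(user_agent: str) -> bool:
    for browser, minimum in MINIMUM_VERSION.items():
        m = re.search(rf"{browser}/(\d+)", user_agent)
        if m:
            return int(m.group(1)) >= minimum
    return False   # unknown browsers are denied by default

print(allowed("Mozilla/5.0 ... Chrome/48.0.2564.82 Safari/537.36"))  # True
print(allowed("Mozilla/5.0 ... Chrome/31.0.1650.57 Safari/537.36"))  # False
```

A real deployment would enforce this at a proxy or gateway and keep an allowlist for the legacy browser that internal apps require, which is the two-browser arrangement Williams suggests.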

Robert Hansen, vice president of WhiteHat Labs at WhiteHat Security, said enterprises should have strong policies about what browser extensions can be installed by employees.

“Browser extensions often leak data about their presence, people’s web-surfing habits, and other system level information. Sometimes this can be fairly innocuous (for instance anonymized metadata about usage) and sometimes it can be incredibly dangerous, like full URL paths of internal sensitive devices,” Hansen said. “In general, people really shouldn’t be installing their own browser extensions – that should be for IT to vet and do for them to ensure they aren’t inadvertently installing something malicious.”

British voice encryption protocol has massive weakness, researcher says

A protocol designed and promoted by the British government for encrypting voice calls has a by-design weakness built into it that could allow for mass surveillance, according to a University College London researcher.

Steven Murdoch, who works in the university’s Information Security Research Group, analyzed a protocol developed by CESG, which is part of the spy agency GCHQ.

The MIKEY-SAKKE (Sakai-Kasahara Key Encryption in Multimedia Internet KEYing) protocol calls for a master decryption key to be held by a service provider, he wrote in an analysis published Tuesday.

Cryptography engineers seeking to build secure systems avoid this approach, known as key escrow, as it makes whatever entity holding the key a target for attack. It also makes the data of users more vulnerable to legal action, such as secret court orders.
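Key escrow in miniature: if the provider derives every user’s key from one master secret, then whoever holds that secret can regenerate any user’s key on demand. The construction below is a deliberately simple HMAC-based sketch, not the actual identity-based scheme MIKEY-SAKKE uses:

```python
# Key escrow in miniature: the provider derives every user's key from
# one master secret, so whoever holds that secret can decrypt any
# user's traffic. Illustrative only -- not the MIKEY-SAKKE construction.
import hashlib
import hmac

MASTER_SECRET = b"held-by-the-service-provider"

def user_key(identity: str) -> bytes:
    # Deterministic derivation: the escrow holder can re-create this
    # key for any identity at any time (e.g. under a court order).
    return hmac.new(MASTER_SECRET, identity.encode(), hashlib.sha256).digest()

alice = user_key("alice@example.org")
recovered = user_key("alice@example.org")   # provider "recovers" the key
assert alice == recovered
print(alice.hex()[:16])
```

Contrast this with end-to-end designs, where no single party holds a secret from which all user keys can be rederived; that concentration of value is why escrow holders become targets.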

The approach taken by the British government is not surprising given that it has frequently expressed its concerns over how encryption could inhibit law enforcement and impact terrorism-related investigations.

The technology industry and governments have been embroiled in a fierce ongoing debate over encryption, with tech giants saying building intentionally weak cryptography systems could provide attack vectors for nation-state adversaries and hackers.

Murdoch wrote that CESG is well aware of the implications of its design. Interestingly, the phrase “key escrow” is never used in the protocol’s specification.

“This is presented as a feature rather than bug, with the motivating case in the GCHQ documentation being to allow companies to listen to their employees calls when investigating misconduct, such as in the financial industry,” he wrote.

The endorsement of the protocol has wide-ranging implications for technology vendors. Murdoch wrote that the British government will only certify voice encryption products that use it. The government’s recommendations also influence purchasing decisions throughout the industry.

“As a result, MIKEY-SAKKE has a monopoly over the vast majority of classified U.K. government voice communication, and so companies developing secure voice communication systems must implement it in order to gain access to this market,” he wrote.

CESG has already begun certifying products under its Commercial Product Assurance (CPA) security evaluation program. Approved products must use MIKEY-SAKKE and also Secure Chorus, an open-source code library that ensures interoperability between different devices.

There is no ‘compromise’ in encryption debate between Silicon Valley and government leaders

During last night’s Democratic debate, we were once again inundated with calls from politicians seeking compromise from Silicon Valley in the ongoing battle with terrorism. Encryption was the point of contention.

The candidates echoed previous statements regarding the dangerous world we live in. The reason for danger, or so it goes, is the inability of law enforcement to pursue threats from terrorists, both domestic and international, who are increasingly reliant on encryption to communicate.

The sentiment is true, albeit misguided, but more on that in a moment.

Currently, the argument is painted as black and white, a “you’re either for us, or against us” exchange that leaves average Americans scratching their collective heads wondering why Silicon Valley isn’t stepping up the fight against terrorism by cooperating with government.

Arguments, even this one, are rarely binary.

In fact, from a security standpoint, the compromise the government seeks is impossible.

“Technically, there is no such backdoor that only the government can access,” says cyber security expert Swati Khandelwal of The Hacker News. “If surveillance tools can exploit ‘vulnerability by design,’ then an attacker who gained access to it would enjoy the same privilege.”

Troy Hunt, a Microsoft MVP for developer security, has made the same point.

The encryption smear campaign

For all that encryption does for us, it has become a quagmire of political talking points and general misunderstanding among citizens and, I’d argue, politicians.

“The truth is that encryption is a tool that is used for good, by all of us who use the internet everyday,” says famed computer security expert Graham Cluley.

“Encryption is a tool for freedom. Freedom to express yourself. Freedom to be private. Freedom to keep your personal data out of the hands of hackers.”

The term itself has become a bit of a paradox. Numbers paint a picture of citizens who think it’s important but have no real idea of how and where it protects them.

According to a Pew Research report, fewer than 40 percent of US citizens feel their data is safe online, yet only 10 percent of adults say they’ve used encrypted phone calls, text messages or email and 9 percent have tried to cover online footprints with a proxy, VPN or TOR.

These numbers demonstrate a fundamental misunderstanding of encryption and further detail its public perception.

It is, after all, only natural to attempt to protect yourself when you can foresee a threat, yet US citizens have a rather apathetic view of the very technology that could make them safer online.

The experts I spoke with all seem to agree that there are two reasons people aren’t taking more steps to remain secure.

  1. Barrier to entry: These technologies feature a lot of jargon, and many aren’t all that user-friendly. PGP, for example, the email encryption technology Edward Snowden used to communicate with Laura Poitras and Glenn Greenwald, involves setup instructions that are relatively foreign to the average citizen.
  2. Negative connotation: Most Americans don’t realize they use encryption every day of their lives. Instead, they know encryption as the tool terrorists use to send private messages, recruit new members and spread propaganda online. This is largely due to the on-going encryption debate.

This debate, whether planned or incidental, is doubling as a smear campaign for the very suite of tools that keeps our online lives secure.

It wasn’t mentioned amongst our expert panel, but I believe there is a third reason that the general public isn’t taking action to better secure themselves online.

Silicon Valley isn’t backing down

So far, the term “debate” may be more of a misnomer. Silicon Valley isn’t debating anything. Furthermore, there may not be anything to debate in the first place.

You can’t “compromise” on weakened security; either it’s secure, or it isn’t.

This veritable Pandora’s Box the US Government wants to explore would lead to backdoor access into our personal lives not just for the government but for hackers and bad actors around the globe. And once you open it, there’s no going back.

“You can’t have secure encryption with a government backdoor. Those two things are mutually exclusive,” notes Jon Hendren, cybersecurity thought leader at ScriptRock. “‘Working with Silicon Valley’ is essentially code for ‘Giving us a backdoor into the accounts, data, and personal lives of users’ and should be totally unacceptable to anyone who even remotely values their privacy.”

There’s also the issue of trust. Even if these backdoors weren’t creating vulnerabilities for bad actors to attack, do we trust the government with our data in the first place?

Our expert panel says, no.

“Handing over such backdoor access to the government would also require an extraordinary degree of trust,” says Khandelwal, “but data breaches like OPM [Office of Personnel Management] proved that government agencies cannot be trusted to keep these backdoor keys safe from hackers.”

Compromise, in this case, is a rather contentious point of view. The compromise the government seeks isn’t a compromise at all; it’s a major loss of privacy and security as it relates to citizens around the globe.

Cluley has illustrated how this “compromise” might play out.

The third option is to keep things as they are, or to build further on efforts to secure the internet.

There really is no middle ground in this debate.

Would providing a backdoor help to fight terrorism?

Since fighting terrorism is the narrative the government is using in its attempt to stamp out encryption, you have to wonder whether encryption is truly the thorn in its side that officials claim it to be.

It’s well-known that ISIS is using encrypted chat apps, like Telegram, to plan attacks and communicate without detection, but would a backdoor have any effect on the surveillance or capture of extremists?

“A backdoor would also have a limited window of efficacy; bad guys would avoid it once that cat is out of the bag,” says Hendren. “This would have the effect of pushing bad actors further down into more subtle and unconventional ways of communication that counter-terrorists might not be aware of, or watching out for, lessening our visibility overall.”

Khandelwal agrees.

It’s naive to believe that an extremist group that recruits and grooms new members from the keyboard, not the battlefield, isn’t tech savvy enough to find a new means of communication as current ones become compromised.

The privacy debate isn’t going anywhere. Moreover, the ambient noise created by politicians spouting off about technologies they don’t understand should grow in volume as we near the primaries and then the general election.

What’s clear though, is that this isn’t a debate and that Silicon Valley has no means to compromise.

Without compromise, the government is left with but one recourse, policy. One can only hope that we have leaders and citizens who better understand the need for encryption before that day comes.

In this debate, no one is compromising — or compromised — but the end user.

What “El Chapo”, Sean Penn and BlackBerry teach us about encryption

Have you heard the one about the Mexican drug kingpin, the eccentric movie star and the Ugly Duckling smartphone that’s all of a sudden the talk of the tech town for all the wrong reasons?

No? Me neither, but recent reports about the role Sean Penn and BlackBerry phones allegedly played in the capture of two-time prison escapee and illegal-substance peddler extraordinaire Joaquin “El Chapo” Guzman have all the makings of a classic knee-slapper.

If you just came out of some sort of coma and have no idea about the connection between Penn, El Chapo, BlackBerry and the hoosegow, you’ll first want to read Penn’s exclusive interview with the wily drug lord on RollingStone.com, in which the leathery actor describes communications between him and Guzman, and between Guzman and actress Kate del Castillo, using “a web of BBM devices.” Next, check out this CNN.com story that details the most recent capture of El Chapo, and how it allegedly stemmed from intercepted BlackBerry messages sent between Guzman, his associates and del Castillo last fall.

BlackBerry texts vs. BBM messages vs. BBM Protected

Just yesterday, I received an odd tweet from some random weirdo on Twitter, and it got me thinking about BlackBerry’s role in this whole charade.

The majority of stories I found on the subject refer to the messages as “BlackBerry texts,” or something of the like. Based on Penn’s use of the term BBM (he never once writes “BlackBerry” in his many-thousand-word Rolling Stone diatribe, and likely has no idea what BBM stands for), we’ll assume they used BBM and not SMS texts sent via BlackBerry. (Why else would they think BlackBerry messages were more secure than texts?)

Señor Guzman must be a fairly intelligent man, right? I mean, could you escape prison twice and evade Mexican law enforcement for years, all while continuing to “supply more heroin, methamphetamine, cocaine and marijuana than anybody else in the world”? (His words, not mine.) So if he’s so smart, why not use the BBM Protected service, which routes messages through private BlackBerry Enterprise Server (BES) servers so that, according to BlackBerry, they are truly 100-percent secure and cannot be obtained by law enforcement, as long as recipients are also connected to the same BES? El Chapo could have simply sent Mr. Spicoli Penn and his other associates secure BlackBerrys and not had to worry. (BBM Protected also encrypts BBM messages sent via the company’s iPhone and Android apps.)

While regular BBM messages are encrypted in transit, BlackBerry holds a “global cryptographic key” that it can use to decrypt BBM messages as they pass through its relay station, according to EncryptedMobile.com. And those decrypted messages can be shared with law enforcement under the right circumstances.

The Mexican government presumably determined that Guzman and his associates were using BBM and served BlackBerry with a lawful access request that just about required the company to hand over those text records. BlackBerry wouldn’t provide a specific comment on the situation, and instead directed me to its Public Policy and Government Relations page, which details its lawful access policies.

Note to self: If I ever decide to leave the lucrative world of journalism to take control of a massive criminal syndicate, shell out the extra cash for BES, and make sure to enable BBM Protected.

Smartphone encryption yesterday and today

BlackBerry, a company that’s always been focused on enterprise security, has fought the good fight with various governments over its ability to provide encryption keys for years. BlackBerry went back and forth with the Indian government over encryption demands, for example. And in November, it pulled out of Pakistan after the country demanded access to its customers’ encrypted email and messages, though the government eventually backed down and BlackBerry returned to the market.

BlackBerry’s stance has always been that it cannot and will not provide encryption keys for BES customer data. But governments won’t take no for an answer, and today, other mobile platform providers including Apple and Google must also balance customer privacy needs with government encryption demands.

Just this week, New York State Assemblyman Matt Titone reintroduced a 2015 bill that would require encryption “backdoors” in all smartphones sold in the state, according to TechDirt.com. The bill would reportedly bar New York smartphone retailers from selling devices that lack such backdoors, a move that would only hurt New York businesses and lead the state’s residents to simply buy their phones out of state or from black-market resellers.

Titone’s bill won’t likely have legs, but it represents the latest (and definitely not the last) attempt by a U.S. lawmaker to circumvent the encryption protections mobile software companies purposefully build into products, which many organizations — legal and illegal — depend on to protect sensitive data.

Of course, unless he pulls off another great escape, it’s too late for encryption to help El Chapo.

Tim Cook pushes for strong encryption at White House summit

As expected, Apple CEO Tim Cook urged White House and government officials to come to terms with strong encryption practices that protect consumer data, at one point saying that support for such practices should be stated publicly.

Cook’s plea came during a cybersecurity summit held in San Jose, Calif., last week, where government officials met with Silicon Valley tech executives to discuss how best to stymie threats posed by non-state actors like ISIS, reports The Guardian.

According to a follow-up report from The Intercept, Cook asked the White House to take a “no backdoors” stance on encryption. Law enforcement agencies, specifically the FBI, have clamored for so-called “weak encryption” policies that would allow access to protected data through supervised software backdoors.

In response, Attorney General Loretta Lynch said a balance must be struck between personal privacy and national security. The current administration is still grappling with the issue and has yet to reach a resolution that would not tip the scales.

FBI director James Comey was among those in attendance at last week’s summit. White House Chief of Staff Denis McDonough, counterterrorism adviser Lisa Monaco, Attorney General Loretta Lynch, National Intelligence Director James Clapper and National Security Agency Director Mike Rogers were also present.

Government officials say existing strong encryption techniques employed by Apple, Google and other tech firms make it easy for criminals and terrorists to communicate in relative safety. Cook maintains a hardline stance on the issue, saying that “any backdoor means a backdoor for bad guys as well as good guys.” Apple introduced a nearly impenetrable data encryption protocol with iOS 8, one that the company itself is unable to crack even with the proper warrants.

A document obtained by The Intercept notes summit talks included questions on whether tech companies would be willing to enact “high level principles” relating to terrorists’ use of encryption, or technologies that “could make it harder for terrorists to use the internet to mobilize, facilitate, and operationalize.” Also on the docket was the potential use of unencrypted data like metadata. Such far-reaching strategies would be difficult, if not impossible, to implement without actively policing customer data.

The summit was held less than three months after the controversial Cybersecurity Information Sharing Act cleared the U.S. Senate floor in October, legislation that would allow private companies to share customer data with government agencies, including the Department of Homeland Security and the NSA. While not labeled a surveillance bill, Apple and other powerful tech companies dispute its merit, saying CISA disregards user privacy.

Congress wades into encryption debate with bill to create expert panel

WASHINGTON — Growing concern about terrorists’ use of encrypted communication is spurring Congress to act, but the first major piece of legislation is taking a cautious approach as lawmakers grapple with how to spy on suspected criminals without weakening cybersecurity and privacy.

House Homeland Security Committee Chairman Michael McCaul, R-Texas, and Sen. Mark Warner, D-Va., who serves on the Intelligence Committee, are set to brief reporters this week on a bill that would create a national commission on security and technology to come up with creative ways to solve the problem. The panel would be made up of civil liberty and privacy advocates, law enforcement and intelligence officials, professors, lawyers, tech executives, and computer science and cryptography experts.

Despite calls from some lawmakers to do so, the bill would not mandate that tech companies build “backdoors” into encrypted cellphones or Internet sites to give law enforcement access to digital communication. The U.S. tech industry strongly opposes such mandates.

“We cannot wait for the next attack before we outline our options, nor should we legislate out of fear,” McCaul and Warner wrote in a recent op-ed in the Washington Post. “Instead, Congress must be proactive and should officially convene a body of experts representing all of the interests at stake so we can evaluate and improve America’s security posture as technology — and our adversaries — evolve.”

Last month, law enforcement officials confirmed that the terrorists who struck Paris in November used encrypted apps to coordinate their attacks. The apps they used were not created by American tech companies.

Islamic State leaders have distributed a 32-page manual of tips for how their followers can conceal their messages by using encrypted devices and apps, McCaul and Warner wrote. They said similar tactics are used by drug traffickers and child predators.

Sen. Dianne Feinstein, D-Calif., vowed last month to introduce legislation with Senate Intelligence Committee Chairman Richard Burr, R-N.C., to require companies to provide encrypted data with a court order. Companies such as Apple and Google are currently unable to provide data from their most strongly encrypted cellphones and other electronic devices because the data cannot be accessed by anyone other than the user.

“I’m going to seek legislation if nobody else is,” Feinstein said during a Senate Judiciary Committee hearing last month. “I think this world is really changing in terms of people wanting the protection and wanting law enforcement, if there is conspiracy going on over the Internet, that that encryption ought to be able to be pierced.”

FBI Director James Comey said at the same hearing that he believes companies should be able to comply with court orders to provide communications between suspected terrorists or other criminals. However, he stopped short of saying that Congress should pass a law mandating that companies do so.

Representatives of the U.S. tech industry said that mandating backdoors into encrypted communication would compromise cybersecurity by allowing hackers to gain entry as well.

“A backdoor for the good guys is a backdoor for the bad guys too,” said Adora Jenkins, senior vice president of external affairs at the Information Technology Industry Council, which represents companies such as Facebook, Google, Twitter, Microsoft, Visa, and Samsung.

The council welcomed the idea of a national commission to bring all sides together.

“We think it’s the right way to go about discussing the challenges that law enforcement and technology companies are facing,” said Andy Halataei, the group’s senior vice president of governmental affairs. “In order for this to work, you have to have everybody in the room that has a stake in this issue. You really have to get the technologists and civil libertarians and law enforcement in the room together to talk about what is technically feasible.”

McCaul and Warner said there are no easy answers.

“The same tools that terrorists and criminals are using to hide their nefarious activities are those that everyday Americans rely on to safely shop online, communicate with friends and family, and run their businesses,” they wrote. “We are no longer simply weighing the costs and benefits of privacy vs. security but rather security vs. security.”

Ubuntu Touch to Support Encryption of User Data

The Ubuntu Touch operating system is also going to provide support for encryption of user data, developers have revealed.

It was never a secret that Ubuntu Touch would get encryption, but the feature isn’t listed anywhere prominent as upcoming; it’s buried in a wiki entry laying out plans for Ubuntu Touch. Still, it’s nice to see that it’s being considered, even if it’s not going to arrive anytime soon.

Ubuntu Touch is a Linux distribution first and an operating system for mobile devices second, which means that integrating encryption shouldn’t be difficult to implement. The problem is that encryption usually affects overall performance: a powerful PC can absorb the overhead easily, but a phone with limited hardware won’t be so happy.

Ubuntu Touch is getting encryption

Encryption on phones is not something new. For example, Google was supposed to make it mandatory in Android 5.x, but that didn’t happen. It’s now present in Android 6 Marshmallow, though in a limited fashion, covering the /data partition and SD card. The OS itself is not encrypted, and the reason is, of course, performance.

“We have high-level plans to support encryption of user data. It isn’t clear at this time if that will be based on LUKS, eCryptfs, or ext4/f2fs encryption. We’ll know more once we’re a bit closer to implementing the feature, but there is currently no set timeline,” developer Tyler Hicks explained on the official mailing list.

Pat McGowan, the Director of Tools and Applications at Canonical, explained that the team is tracking this feature in Launchpad, but for now it has been pushed back until after the launch of Ubuntu 16.04 LTS in April.

It’s also worth noting that Tyler said “encryption of user data,” which probably means the encryption will cover the sensitive parts of the phone but not the OS itself.

Tech big guns confront U.K. parliament on backdoors, encryption

A group of high-tech corporate powerhouses has banded together to protest a law proposed by the U.K. government that would give an array of law enforcement and intelligence agencies the ability to access computer data through backdoors and decryption.

Facebook, Google, Microsoft, Twitter and Yahoo submitted a letter, dated December 21, 2015, to the parliamentary committee charged with reviewing the Investigatory Powers Bill, saying it would have a negative impact on both the nation’s citizens and the companies’ customers.

“We believe the best way for countries to promote the security and privacy interests of their citizens, while also respecting the sovereignty of other nations, is to ensure that surveillance is targeted, lawful, proportionate, necessary, jurisdictionally bounded, and transparent. These principles reflect the perspective of global companies that offer borderless technologies to billions of people around the globe. The actions the U.K. Government takes here could have far reaching implications – for our customers, for your own citizens, and for the future of the global technology industry,” the companies wrote.

The five companies belong to a larger group, the two-year-old Reform Government Surveillance (RGS) coalition that is fighting similar legislation in the United States. The RGS website lists Apple, AOL, Dropbox, Evernote and LinkedIn as members, but these names were not included in the U.K. letter.

The group spelled out its misgivings: implementing such a policy could undermine consumer trust in their products, any legislation passed by the U.K. could be duplicated in other countries, and the resulting patchwork would make it difficult for companies to understand what is legal and what is not.

“An increasingly chaotic international legal system will leave companies in the impossible position of deciding whose laws to violate and could fuel data localization efforts,” the companies said.

The letter also strongly rejected any use of backdoors, forced decryption or any other technological method allowing government agencies to enter their products.

“The companies believe that encryption is a fundamental security tool, important to the security of the digital economy as well as crucial to ensuring the safety of web users worldwide,” the group wrote.

RGS itself wrote to the U.S. Senate in May 2015 encouraging it to pass the USA Freedom Act. However, it has not yet, as a group, confronted American legislators on the issues of encryption and backdoors.

Microsoft, Google, Facebook to U.K.: Don’t weaken encryption

Microsoft, Google and Facebook are urging U.K. officials not to undermine encryption as they work on laws that would authorize forcing communications service providers to decrypt customer traffic.

In a joint written submission to the U.K. Parliament the three U.S.-based companies lay down several areas of concern, which, if not addressed, they say could damage their businesses and leave them caught in legal crossfires among the many countries where they do business.

The companies say they don’t want the U.K. to impose restrictions and apply them to foreign service providers such as themselves because, if other countries followed suit, it would lead to a morass of laws impossible to navigate. “Conflicts of laws create an increasingly chaotic legal environment for providers, restricting the free flow of information and leaving private companies to decide whose laws to violate,” the submission says.

They staunchly support encryption without backdoors. “The companies believe that encryption is a fundamental security tool, important to the security of the digital economy as well as crucial to ensuring the safety of web users worldwide,” they write. “We reject any proposals that would require companies to deliberately weaken the security of their products via backdoors, forced decryption, or any other means.”

Despite what the U.K.’s Home Secretary Theresa May has said about not seeking encryption backdoors, they want it in writing. “We appreciate the statements in the Bill and by the Home Secretary that the Bill is not intended to weaken the use of encryption, and suggest that the Bill expressly state that nothing in the Bill should be construed to require a company to weaken or defeat its security measures.”

The Parliament is considering bills that would give government agencies access to communications across service provider networks with proper legal authorization, which would affect Microsoft, Google and Facebook, all of which operate globally and face compliance with laws in many countries.

As the U.K. is considering such laws, the Netherlands has rejected forcing providers to break encryption on demand. In the U.S., Congress has held hearings in which members say they will propose legislation to require providing cleartext versions of encrypted traffic when presented with a judge’s order.

The three companies ask that, if the U.K. does create lawful access to encrypted communications, companies based outside the U.K. not be required to comply when doing so would conflict with laws they must follow in other countries.

They urge an international agreement on how the lawful-access laws of individual countries should be observed in other countries to remove ambiguities that might prevent them from complying with all of them.

The companies want to protect customer privacy by requiring notification of those whose communications are intercepted. “While it may be appropriate to withhold or delay notice in exceptional cases, in those cases the burden should be on the Government to demonstrate that there is an overriding need to protect public safety or preserve the integrity of a criminal investigation,” they say.

They also seek to protect data stored in the cloud the same way it is protected in private data centers. The government should go to a business if it is seeking a business’s data, just as it did before cloud services existed. “This is an area where the UK can lead the rest of the world, promoting cloud adoption, protecting law enforcement’s investigative needs, and resolving jurisdictional challenges without acting extraterritorially,” they say.

They note that the draft lacks any requirement for agencies to tell providers about vulnerabilities they know of in the providers’ networks that could be exploited, or to ensure that any authorized actions agencies take do not introduce new vulnerabilities.

Microsoft, Google and Facebook seem concerned that agencies granted legal access to their networks might alter them in ways that have a negative effect on the services they deliver over those networks. “The clearest example is the authority to engage in computer network exploitation, or equipment interference,” they say. “To the extent this could involve the introduction of risks or vulnerabilities into products or services, it would be a very dangerous precedent to set, and we would urge your Government to reconsider.”

The companies want protections for their executives located within the U.K. They want warrants, when they have to be served on communications companies, to be served to officers of the companies who are located at the companies’ headquarters, not to employees of the companies located in the U.K. “We have collective experience around the world of personnel who have nothing to do with the data sought being arrested or intimidated in an attempt to force an overseas corporation to disclose user information,” they write. “We do not believe that the UK wants to legitimize this lawless and heavy-handed practice.”

They don’t want to be forced to create and retain data about customers that they don’t already collect in the normal course of business. “Some language under the retention part of the Bill suggests that a company could be required to generate data – and perhaps even reconfigure their networks or services to generate data – for the purposes of retention,” they write.

The companies think whatever judicial approvals are required to issue warrants to decrypt communications ought to apply to other U.K. orders issued to communications providers by the U.K.’s Defense Intelligence and other intelligence services. These other orders include national security notices, maintenance of technical capability orders, and modifications to equipment interference warrants.

They want the law to narrowly define bulk collection of data so it doesn’t include all traffic on a given channel, but rather is restricted to traffic specified by specific indicators such as source and destination, for example. The law should allow only necessary and proportionate amounts of data be analyzed and retained, and the rest be destroyed, they say.

Service providers should be allowed to hire attorneys and protest warrants without running the risk of violating disclosure laws or acknowledging that they actually are subject to the law, they write.

They take exception to a single word – urgent – going undefined in drafts of the law that require decryption of communications in urgent cases. “Clarity on this term – which other countries may seek to emulate and even abuse – is important,” they say.

Netherlands opposes backdoors, but encryption still under assault

The Dutch government has officially declared its opposition to any restrictions on the development or use of encryption products, even as Dutch lawmakers are weighing legislation that could mandate backdoor government access to encrypted communications.

In a 4 January 2016 letter to the Dutch parliament, the head of the Ministry of Security and Justice, Ard van der Steur, explained the government’s reasons for endorsing strong encryption, which sound quite similar to those cited by technologists such as Apple’s Tim Cook, the most high-profile critic of backdoors.

According to a translation of the letter, provided by Dutch cybersecurity consultant Matthijs R. Koot, van der Steur points to the uses of encryption for protecting the privacy of citizens, securing confidential communications by government and businesses, and ensuring the security of internet commerce and banking against cybercrime.

Privacy of communications is also a protected right under the Dutch constitution, and a fundamental right protected by the European Convention on Human Rights and the Charter of Fundamental Rights of the EU, van der Steur’s letter says.

The minister acknowledges that criminals and terrorists may also use encryption, making it difficult if not impossible for law enforcement and intelligence services to monitor their communications in defense of national security and public safety.

But van der Steur also observes that encryption is widely available and requires “little technical knowledge, because encryption is often [an] integral part of the internet services that they too can use.”

But because today’s communications products and services use unbreakable encryption, demands that technology companies hand over decrypted data would essentially require weakening encryption to provide backdoors.

Van der Steur notes that any “technical doorways” [backdoors] in encryption would undermine the security of digital systems, making them “vulnerable to criminals, terrorists and foreign intelligence services.”

As fellow Naked Security writer Paul Ducklin put it in a recent article we published about the risks of deliberately weakening cryptographic systems:

[M]andatory cryptographic backdoors will leave all of us at increased risk of data compromise, possibly on a massive scale, by crooks and terrorists…

…whose illegal activities we will be able to eavesdrop and investigate only if they too comply with the law by using backdoored encryption software themselves.

Van der Steur agrees very strongly:

[Backdoors] would have undesirable consequences for the security of communicated and stored information, and the integrity of IT systems, which are increasingly important to the functioning of society.

In his conclusion, van der Steur states:

The government endorses the importance of strong encryption for internet security, for supporting the protection of citizens’ privacy, for confidential communication by the government and companies, and for the Dutch economy.

Therefore, the government believes that it is currently not desirable to take restricting legal measures concerning the development, availability and use of encryption within the Netherlands.

A VICTORY IN THE CRYPTO WARS?

The debate over encryption backdoors goes back to the 1980s and 1990s, was revived in the past two years by law enforcement officials like FBI Director James Comey, and has intensified since the 13 November 2015 terrorist attacks in Paris.

While efforts to pass legislation in the US and UK mandating backdoors have so far been unsuccessful, some advocates fighting against backdoors are worried the Crypto Wars have gone global.

China recently passed an anti-terrorism law that compels technology companies to decrypt data upon request of the government; while in Pakistan, the government’s demand for backdoor access to BlackBerry customer data led the company to pull out of the country entirely.

Concerns over proposed surveillance legislation in the UK have led Apple to take unusually bold steps to oppose passage of the Investigatory Powers Bill.

Apple submitted a letter to the bill’s oversight committee saying language in the draft bill could force Apple to “weaken security for hundreds of millions of law-abiding customers,” in order to allow security services to eavesdrop on encrypted communications such as iMessage.

In the US, Republican Senator Richard Burr, chairman of the Senate Intelligence Committee, has indicated that he wants to propose legislation requiring companies to decrypt data at the government’s request.

Even in the Netherlands, the government’s recent pro-encryption stance is not a complete victory for opponents of backdoors.

As Koot noted on his blog, the pro-encryption policy isn’t guaranteed to remain policy in the future, and Dutch law already requires technology companies to decrypt data sought in targeted investigations.

Meanwhile, the Dutch parliament is considering updating a 2002 security and intelligence law to compel bulk decryption of communications, Koot reports.

The war over backdoors has yet to be lost or won, and it is far from over.

The Netherlands will not weaken encryption for security purposes

The Dutch government believes that confidence in secure communication and data storage is essential for the development of the Dutch economy.

The Netherlands will not follow the trend of weakening encryption for security purposes, according to a statement by the Dutch Minister of Security and Justice.

In contrast with the United Kingdom, where the Investigatory Powers Bill would bar internet firms from keeping clients’ private communications beyond the government’s reach, the Dutch government believes that strong encryption is key to the future growth of the Dutch economy.

The Daily Dot reported that Ard van der Steur, the Dutch minister of security and justice, wrote in a statement that the Dutch executive cabinet endorsed “the importance of strong encryption for Internet security to support the protection of privacy for citizens, companies, the government, and the entire Dutch economy”.

The statement continues: “Therefore, the government believes that it is currently not desirable to take legal measures against the development, availability and use of encryption within the Netherlands.”

Van der Steur added in the statement that “confidence in secure communication and storage data is essential for the future growth potential of the Dutch economy, which is mainly in the digital economy.”

The Dutch Minister also explained that weakening encryption will not lead to a safer world, as criminal organizations will have easier access to sensitive private information.

According to Daily Dot, the minister of security and justice described at length the virtues of encryption, from protecting laptops against theft to allowing the Dutch government itself to communicate online safely with its citizens about taxes and digital IDs. “Cryptography is key to security in the digital domain,” Van der Steur argued.

Microsoft may have your encryption key; here’s how to take it back

As happens from time to time, somebody has spotted a feature in Windows 10 that isn’t actually new and denounced it as a great privacy violation.

The Intercept has written that if you have bought a Windows PC recently, then Microsoft probably has your encryption key. This is a reference to Windows’ device encryption feature. We wrote about this feature when it was new, back when Microsoft introduced it in Windows 8.1 in 2013 (and, before that, in Windows RT).

Device encryption is a simplified version of the BitLocker drive encryption that made its debut in Windows Vista in 2006. The full BitLocker requires a Pro or Enterprise edition of Windows, and includes options such as integration with Active Directory, support for encrypting removable media, and the use of passwords or USB keys to unlock the encrypted disk. Device encryption is more restricted. It only supports internal system drives, and it requires the use of Secure Boot, Trusted Platform Module 2.0 (TPM), and Connected Standby-capable hardware. This is because Device encryption is designed to be automatic; it uses the TPM to store the password used to decrypt the disk, and it uses Secure Boot to ensure that nothing has tampered with the system to compromise that password.

The final constraint for Device encryption is that you must sign in to Windows with a Microsoft account or a Windows domain account to turn it on. This is because full disk encryption opens the door to all kinds of new data loss opportunities. If, for example, you have your system’s motherboard replaced due to a hardware problem, then you will lose access to the disk, because the decryption keys needed to read the disk are stored in the motherboard-mounted TPM. Some disk encryption users may feel that this is a price worth paying for security, but for an automatic feature such as device encryption, it’s an undesirable risk.

To combat that, device encryption stores a recovery key. For domain accounts, the recovery key is stored in Active Directory, but in the common consumer case, using a Microsoft account, it is instead stored in OneDrive. This recovery key can be used after, say, a motherboard replacement or when trying to recover data from a different Windows installation.
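This is a classic key-escrow pattern: a single volume key that can be unwrapped either by the TPM or by an escrowed recovery key. A minimal stdlib-only Python sketch (toy XOR wrapping and hypothetical names, nothing like BitLocker’s real on-disk format) shows why losing the TPM copy doesn’t lose the data so long as the recovery key survives:

```python
# Toy sketch of the key-escrow pattern behind device encryption:
# one volume key, wrapped twice under independent protectors.
# Hypothetical names; the real BitLocker format is far more involved.
import hashlib
import secrets

def wrap(kek: bytes, volume_key: bytes) -> bytes:
    # Toy key wrap: XOR against a hash of the key-encryption key.
    pad = hashlib.sha256(kek).digest()
    return bytes(a ^ b for a, b in zip(volume_key, pad))

unwrap = wrap  # XOR wrapping is its own inverse

volume_key = secrets.token_bytes(32)    # encrypts the disk itself
tpm_secret = secrets.token_bytes(32)    # never leaves the TPM
recovery_key = secrets.token_bytes(32)  # escrowed in OneDrive / AD

protectors = {
    "tpm": wrap(tpm_secret, volume_key),
    "recovery": wrap(recovery_key, volume_key),
}

# Normal boot: the TPM releases its secret and unwraps the volume key.
assert unwrap(tpm_secret, protectors["tpm"]) == volume_key

# Motherboard replaced: tpm_secret is gone for good, but the escrowed
# recovery key still unwraps the very same volume key.
assert unwrap(recovery_key, protectors["recovery"]) == volume_key
```

The trade-off the sketch makes visible is the one the article describes: the escrowed copy is what saves your data after a hardware failure, and it is also the copy that sits on someone else’s server.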

While device encryption is available in all versions of Windows 10, it has a particular significance in the Home version, where the full BitLocker isn’t available. Windows 10 Home also can’t use domain accounts. This means that if you enable device encryption (and on new systems that are set up to use Microsoft accounts, it may well be enabled by default) then the recovery key is necessarily stored on OneDrive.

Lawmakers push for commission on encryption

Lawmakers push for commission on encryption

Congress should create a national commission to investigate the difficulties encryption has created for law enforcement, a bipartisan pair of lawmakers argued Monday in a Washington Post op-ed.

“Congress must be proactive and should officially convene a body of experts representing all of the interests at stake so we can evaluate and improve America’s security posture as technology — and our adversaries — evolve,” said House Homeland Security Committee Chairman Michael McCaul (R-Texas) and Sen. Mark Warner (D-Va.).

It’s an idea that McCaul first floated several weeks ago, after terrorist attacks in Paris and San Bernardino, Calif.

The deadly incidents have given new urgency to a long-running debate over encryption. Lawmakers and investigators said they believe the people behind those incidents used encrypted communication to hide their plans.

“This presents an extraordinary security challenge for the United States and our allies,” McCaul and Warner said. “Because extremists are ‘going dark,’ law enforcement officials warn that we are ‘going blind’ in our efforts to track them.”

Officials looking into the Paris attacks said they have definitive evidence the terrorists used the popular encrypted apps Telegram and WhatsApp to help plan the assault that killed 130 people.

“Frustratingly, there are no easy answers,” said McCaul and Warner. “The same tools that terrorists and criminals are using to hide their nefarious activities are those that everyday Americans rely on to safely shop online, communicate with friends and family, and run their businesses.”

For some, the answer is legislation. Senate Intelligence Committee Chairman Richard Burr (R-N.C.) has called for a law that would require companies to decrypt data upon government request. But the tech community is balking at that, arguing that such a mandate would defeat the purpose of encryption.

Major tech players including Apple have even refused to comply with court orders to turn over encrypted data, arguing that they can’t access information secured by their own products. Only this type of inaccessible encryption truly protects data from hackers, technologists insist.

McCaul and Warner agreed with this assessment.

“Encryption is a bedrock of global commerce, and it has helped enhance individual privacy immeasurably,” they said. “It is also integral to our cybersecurity efforts — protecting individuals, U.S. businesses, intellectual property and our nation’s critical infrastructure.”

Yet because this same uncrackable technology is also used to hide nefarious activities, “digital innovations present us with a paradox,” they added.

A bill that would require companies to maintain a guaranteed entry point into their encrypted data would backfire, McCaul and Warner cautioned.

“Such a law could weaken Internet privacy for everyone and could have the unintended consequence of making our information systems more vulnerable to attack,” the pair said. “Moreover, in our globalized world, a U.S.-only solution would likely have only a limited impact and could encourage offenders to simply use technology developed overseas instead.”

But Congress must act, they said, suggesting a national commission of all relevant parties is the right step forward.

“We are seeking the brightest minds from the technology sector, the legal world, computer science and cryptography, academia, civil liberties and privacy advocates, law enforcement and intelligence to collaboratively explore the intersection of technology and security,” the duo said.

McCaul and Warner explained the group would be tasked with “generating much-needed data and developing a range of actionable recommendations that can protect privacy and public safety.”

The effort may have momentum in Congress. Several Capitol Hill leaders have appeared hesitant to back Burr’s legislative efforts. McCaul and Warner’s alternative may be more palatable to lawmakers and the tech community.

“We cannot wait for the next attack before we outline our options,” they said.

China Antiterror Law Doesn’t Require Encryption Code Handovers

BEIJING—China passed a new antiterrorism law that stepped back from previous language of concern to global technology firms, but which still raises questions about its scope and the potential impact on companies doing business there.

The law, passed Sunday by China’s rubber-stamp parliament, also authorized the armed forces and paramilitary police to take part in counterterrorism operations in foreign countries with the approval of those countries and Beijing’s military leadership.

Chinese authorities say the law is intended to help prevent terror attacks in China and better protect its citizens overseas, four of whom were killed by militants in Mali and Syria in November.

Beijing has blamed a series of recent attacks in China on jihadist separatists from the northwestern region of Xinjiang, where some of the mostly Muslim Uighur ethnic group have been resisting Chinese rule for decades.

The new law contains much of the language from a draft version released a year ago that U.S. officials, business groups and rights advocates criticized as having an overly broad definition of terrorism and onerous requirements for companies dealing with proprietary commercial information and private data in China.

The final version of the law requires telecom operators and Internet companies to help authorities with decryption of data and other counterterrorism efforts. Unlike the draft version, however, it leaves out some controversial language requiring tech companies to store their data locally and provide their encryption systems for review to be able to operate in China.

Still, the broad wording that tech companies must provide “technical means of support” to China’s government for counterterrorism has prompted concern among some U.S. tech firms, according to a person familiar with the matter.

“Telecommunications and Internet service providers should provide technical interfaces and technical support and assistance in terms of decryption and other techniques to the public and national security agencies in the lawful conduct of terrorism prevention and investigation,” says a final version of the law, published by the official Xinhua News Agency.

China’s law comes as data encryption has become a flash point globally between tech firms and law enforcement authorities. U.S. tech companies such as Apple Inc. and Google Inc. have been clashing with U.S. and European governments over new encryption technologies, which law-enforcement officials say hinder their ability to catch terrorists.

Apple criticized a U.K. proposal on Dec. 21 that would give national-security authorities more power to monitor communications. The proposal would require tech companies to retain “permanent interception capabilities” for communications, including “the ability to remove any encryption.”

U.S. President Barack Obama had spoken in support of the U.K. stance against encryption in January, but backed down from trying to change U.S. law in October.

U.S. Federal Bureau of Investigation Director James Comey said in November that the bureau had been stymied in tracking Islamic State’s recruiting efforts due to use of encrypted communication services.

Following Edward Snowden’s revelations that U.S. authorities inserted so-called backdoors in technology products to allow spying, U.S. tech companies have sought to distance themselves from government surveillance in order to regain the trust of consumers. Apple and Google have released software with encryption they say they are unable to unlock.

Chinese officials say they studied U.S. and European Union legislation while drawing up China’s counterterrorism law.

They have also stepped up efforts in recent months to persuade foreign governments that Uighurs resisting Chinese rule should be considered terrorists.

Beijing has long maintained that Uighur separatists have links to al Qaeda and Chinese officials have said in recent months that at least 300 ethnic Uighurs have joined Islamic State in Iraq and Syria.

Some recent attacks in China have borne the hallmarks of jihadist groups, but rights groups and Uighur activists say much of the violence is provoked by police abuses, excessive religious restrictions and a huge influx of non-Uighur migrants to Xinjiang.

The new law also restricts the right of media to report on details of terrorist attacks and the government’s response.

The counterterrorism law is part of a series of new pieces of legislation that many experts say are designed to tighten the Communist Party’s control over the economy and society, and promote a notion of rule of law that doesn’t undermine its monopoly on power.

President Obama has said he raised concerns about an early draft of the counterterrorism law directly with Chinese President Xi Jinping, saying technology companies would be unwilling to comply with its provisions.

U.S. officials and business groups have also expressed concern over a sweeping new national security law, passed in July, that the government says is needed to counter emerging threats but that critics say may be used to quash dissent and exclude foreign investment.

In May, China’s parliament also published a draft of a new law that seeks to tighten controls on foreign nongovernmental groups. Nearly four dozen U.S. business and professional groups signed a letter to the Chinese government in June urging it to modify that draft, which they said could hurt U.S.-China relations.

Senate Intel chair: “It’s time” for encryption legislation

Congress must enact a law that would require companies to decrypt data upon government request, Senate Intelligence Committee Chairman Richard Burr (R-N.C.) argued Thursday in a Wall Street Journal op-ed.

“Criminals in the U.S. have been using this technology for years to cover their tracks,” Burr said. “The time has come for Congress and technology companies to discuss how encryption — encoding messages to protect their content — is enabling murderers, pedophiles, drug dealers and, increasingly, terrorists.”

The recent terrorist attacks in Paris and San Bernardino, Calif., have reignited the debate over encryption. Lawmakers and investigators have said they believe the people behind those incidents likely used encrypted platforms to help hide their plans.

Burr has been one of Capitol Hill’s leading proponents of legislation that forces companies to crack their own encryption. But the tech community has pushed back, arguing that such a mandate would make encrypted data less secure.

Major tech players like Apple have even refused to comply with court orders for encrypted data, arguing that they can’t access their own secured information.

Burr said this has become a serious issue for law enforcement.

“Even when the government has shown probable cause under the Fourth Amendment, it cannot acquire the evidence it seeks,” he said, adding, “Technology has outpaced the law.”

Burr explained that the Communications Assistance for Law Enforcement Act of 1994 “requires telecommunications carriers — for instance, phone companies — to build into their equipment the capability for law enforcement to intercept communications in real time.”

“The problem is that it doesn’t apply to other providers of electronic communications, including those supporting encrypted applications,” he said.

It’s time for Congress to close that loophole with legislation, Burr insisted.

But it’s unclear if Burr would have the momentum to move his proposed bill. While Senate Intelligence Committee Ranking Member Dianne Feinstein (D-Calif.) has said she will work with him on his efforts, other congressional leaders seem more hesitant.

Many have suggested the government must simply do a better job of working with Silicon Valley to come up with a non-legislative solution.

House Homeland Security Committee Chairman Michael McCaul (R-Texas) recently proposed “a national commission on security and technology challenges in the digital age.” The commission, tasked with creating alternatives to legislation, would include tech companies, privacy advocates and law enforcement officials.

Burr countered that the tech community has almost forced Congress’s hand.

“I and other lawmakers in Washington would like to work with America’s leading tech companies to solve this problem, but we fear they may balk,” Burr said.

He noted that when Apple refused to comply with the court order seeking encrypted data, the company argued, “This is a matter for Congress to decide.”

“On that point, Apple and I agree,” he said. “It’s time to update the law.”

Best Disk Lock Has Been Updated to Version 2.62

The powerful data protection software Best Disk Lock has been updated to version 2.62. The improvements in this version are designed to make the program even easier to use: it improves the stability of the disk elementary-lock feature, renames the Lock Log to the Lock Record for easier unlocking, and fixes a bug in which an error message appeared when a disk was opened after being unlocked in Windows XP.

In addition, six new features are introduced in this version:

* Automatically open a disk when unlocking it.

* Option to restore unlocked disks to locked status.

* Automatically open a file or folder when unlocking it.

* Option to restore unlocked files or folders to locked status.

* Lock Record for easier unlocking.

* Forbid the use of unassigned drive letters, for more control over USB storage devices.

Change Log of Best Disk Lock 2.62:

File Name: Best Disk Lock

Version: 2.62

File Size: 3.43MB

Category: System Security Software

Language: English

License type: Trial Version

OS Support: Win2000/XP/VISTA/Win 7/Win 8

Released on: Dec.22, 2015

Download Address: http://www.dogoodsoft.com/best-disk-lock/free-download.html

What’s New in This Version:

* Changed the Lock Log to Lock Record for easier unlocking.

* Fixed a bug in which an error message appeared when a disk was opened after being unlocked in Windows XP.

Why Choose Best Disk Lock:

Best Disk Lock is a powerful utility that can completely hide disk partitions and CD-ROM drives on your PC, disable USB storage devices or set them to read-only, and forbid the use of unassigned drive letters. A partition with advanced-lock cannot be found in any environment by anyone else, ensuring the security and confidentiality of the data on that partition.

The Lock File feature changes the access permissions of a file, folder or disk in an NTFS-formatted partition, by which access to the file, folder or disk is prohibited or allowed. Besides, Best Disk Lock can configure the security of your computer system and optimize the system.

Apple CEO Tim Cook Mounts Defense of Encryption on “60 Minutes”

In a “60 Minutes” appearance Sunday, Apple CEO Tim Cook reiterated his support of encryption, in the face of renewed criticism from the U.S. intelligence community that these digital locks interfere with the ability to detect threats to national security.

Cook used an interview with CBS’s Charlie Rose to lay out his argument for why weakening encryption on consumer devices is a bad idea.

“If there’s a way to get in, then somebody will find the way in,” Cook said. “There have been people that suggest that we should have a back door. But the reality is if you put a back door in, that back door’s for everybody, for good guys and bad guys.”

Following the mass murders in Paris and San Bernardino, Apple and other technology companies have come under mounting pressure to give U.S. law enforcement access to their consumers’ encrypted messages. FBI Director James Comey complained that potential attackers are using communications platforms that authorities can’t access — even through warrants and wiretaps.

“I don’t believe that the trade-off here is privacy versus national security,” Cook said in the interview. “I think that’s an overly simplistic view. We’re America. We should have both.”

Cook said modern smartphones such as the iPhone contain sensitive information: Personal health details, financial data, business secrets and intimate conversations with family, friends or co-workers. The only way to ensure this information is kept secure is to encrypt it, turning personal data into indecipherable garble that can only be read with the right key — a key that Apple doesn’t hold.
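
The principle Cook describes can be sketched with a deliberately toy cipher. This is illustration only, not Apple’s actual scheme (real systems use vetted algorithms such as AES); it simply shows that without the right key, encrypted data is indecipherable garble:

```python
import hashlib
import itertools

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy XOR 'cipher' for illustration only -- NOT real cryptography.
    Derives a repeating keystream from the key and XORs it with the data."""
    keystream = itertools.cycle(hashlib.sha256(key).digest())
    return bytes(b ^ k for b, k in zip(plaintext, keystream))

def toy_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR is symmetric: applying the same keystream again restores the data.
    return toy_encrypt(key, ciphertext)

message = b"intimate conversation"
locked = toy_encrypt(b"the-right-key", message)

assert locked != message                                   # ciphertext is garble
assert toy_decrypt(b"the-right-key", locked) == message    # right key recovers it
assert toy_decrypt(b"a-wrong-key", locked) != message      # wrong key yields garbage
```

A real implementation would use an authenticated cipher from a vetted library rather than this XOR construction, which is trivially breakable; the point is only that the holder of the key determines who can read the data.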

Apple will comply with warrants seeking specific information, Cook said, but there’s only so much it can provide.

Moving to other topics, Cook defended Apple’s tax strategy, which has drawn criticism from Congress. He described as “total political crap” charges that Apple is engaged in an elaborate scheme to pay little or no taxes on overseas income. He also discussed the company’s use of one million Chinese workers to manufacture most of its products, saying they possess the skills that American workers now lack.

“The U.S., over time, began to stop having as many vocational kind of skills,” Cook said in the interview. “I mean, you can take every tool and die maker in the United States and probably put them in the room that we’re currently sitting in. In China, you would have to have multiple football fields.”

The television news magazine also took viewers on a tour of Apple’s headquarters. Rose talked with design guru Jony Ive about the Apple Watch inside the secret design studio, where the wooden tables were draped with covers to shield future projects from the camera.

Retail chief Angela Ahrendts escorted Rose to a mock Apple Store in an unmarked warehouse off the main Cupertino campus.

And, armed with cameras and drones, Rose ascended a giant mound of earth to visit the site of Apple’s future corporate headquarters, a building many have dubbed the “spaceship.” The $5 billion project, with 7,000 trees, fruit and vegetable gardens and a natural ventilation system, is expected to one day house 13,000 employees.

Paris attack planners used encrypted apps, investigators believe

French counterterrorism investigators believe that the men suspected in last month’s Paris attacks used widely available encryption tools to communicate with each other, officials familiar with the investigation said, raising questions about whether the men used U.S.-made tools to hide the plot from authorities.

Investigators have previously said that messaging services WhatsApp and Telegram were found on some of the phones of the men suspected in the November attacks that claimed 130 victims. But they had not previously said that the services had been used by the men to communicate with each other in connection with the attacks. The two services are free, encrypted chat apps that can be downloaded onto smartphones. Both use encryption technology that makes it difficult for investigators to monitor conversations.

The findings of the investigation were confirmed by four officials, including one in France, who are familiar with the investigation. All spoke on the condition of anonymity because they were not authorized to speak publicly about the ongoing inquiry. A spokeswoman for the Paris prosecutor’s office, which is leading the investigation, declined to comment.

The investigators’ belief that WhatsApp and Telegram had been used in connection with the attacks was first reported by CNN.

The revelation is likely to add fuel to calls in Congress to force services such as WhatsApp, which is owned by Facebook, to add a back door that would enable investigators to monitor encrypted communications. Such demands have grown stronger in the wake of the Paris attacks and after other attacks in the United States in which the suspects are believed to have communicated securely with Islamic State plotters in Syria.

Already, security hawks in Congress, citing the likelihood that the Paris attackers used encrypted communications, have called for legislation to force companies to create ways to unlock encrypted content for law enforcement. Sen. Dianne Feinstein, D-California, vice-chairman of the Senate Intelligence Committee, has begun working on possible legislation. And Sen. John McCain, R-Arizona, chairman of the Senate Armed Services Committee, has promised hearings on the issue, saying, “We’re going to have legislation.”

FBI Director James B. Comey last week cited a May shooting in Garland, Texas, in which two people with assault rifles attempted to attack an exhibit of cartoons of the prophet Muhammad. Investigators believe they were motivated by the Islamic State. Comey told the Senate Judiciary Committee that encrypted technology had prevented investigators from learning the content of communications between the shooters and an alleged foreign plotter.

“That morning, before one of those terrorists left and tried to commit mass murder, he exchanged 109 messages with an overseas terrorist,” Comey told the committee. “We have no idea what he said, because those messages were encrypted.”

Tech firms such as Apple have opposed such calls, saying that such a requirement would render their services and devices less secure and simply send users elsewhere. Apple began placing end-to-end encryption on its chat and video call features several years ago. Then last year, in the wake of revelations by former National Security Agency contractor Edward Snowden about the scope of U.S. surveillance, Apple announced it was offering stronger encryption on its latest iPhones. And more tech firms began to question what had once been routine law enforcement requests to comply with court-ordered wiretaps.

A spokesman for Facebook declined to comment about whether the attackers used WhatsApp. A representative for Germany-based Telegram did not respond to a request for comment.

The officials familiar with the Paris investigation did not say when the services were used, how frequently or for what purpose. One of the officials said investigators believe that the attackers used Telegram’s encrypted chat function more frequently than they used WhatsApp. It was not clear whether authorities were able to obtain “metadata,” information indicating the times and dates of chat messages from either company’s servers. Nor was it clear whether authorities had been able to recover the messages from the phones themselves.

Not all encrypted apps are equal. WhatsApp offers end-to-end encryption between two users on some platforms, such as Android phones. That means the chat content is not visible to Facebook but only to the sender and receiver. WhatsApp is in the process of rolling out end-to-end encryption for Apple’s iPhones. Telegram’s Secret Chat feature is end-to-end encrypted. However, a number of experts say that Telegram is not secure.

“It’s home-brew crypto style,” said Lance James, chief scientist at Flashpoint, a threat intelligence firm. The Telegram developers have “introduced unnecessary risk by making up their own cryptography rules.” He said he was “fairly certain” that advanced spy agencies could find ways around the encryption.

The group chat functions on the apps do not offer end-to-end encryption, which means anyone with access to WhatsApp or Telegram’s servers can read the chats.
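
The distinction the article draws can be sketched with a toy cipher (a hypothetical stand-in, not WhatsApp’s or Telegram’s actual protocol): what matters is who holds the key.

```python
import hashlib
import itertools

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for a real cipher, for illustration only.
    ks = itertools.cycle(hashlib.sha256(key).digest())
    return bytes(b ^ k for b, k in zip(data, ks))

# End-to-end encryption: only the sender and receiver hold chat_key;
# the server relays ciphertext it cannot decrypt.
chat_key = b"shared-by-endpoints-only"
wire = xor_stream(chat_key, b"meet at noon")
server_sees = wire                            # ciphertext only
assert xor_stream(chat_key, wire) == b"meet at noon"

# Group chat without end-to-end encryption: the service holds the key,
# so anyone with access to its servers can recover the plaintext.
server_key = b"held-by-the-service"
wire = xor_stream(server_key, b"meet at noon")
server_reads = xor_stream(server_key, wire)   # server can read the message
assert server_reads == b"meet at noon"
```

In both cases the bytes on the wire look like garble; the difference is purely whether the key lives only on the endpoints or also on the provider’s servers.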

European authorities have come under heavy criticism for failing to disrupt the Paris attacks, and it is unclear whether encrypted messaging played an important role in the plot’s success. Ringleader Abdelhamid Abaaoud, a Belgian citizen, was being monitored by European authorities but nevertheless managed to travel to Syria and back this year.

Another suspect, Salah Abdeslam, is still at large despite having been stopped by French police at the Belgian-French border hours after the attacks. He used his real identity documents, but he was not yet in a database, Belgian Interior Minister Jan Jambon told the Belgian VTM broadcaster in an interview aired this week.

“We were simply unlucky,” he said.

Then, investigators believe, Abdeslam went into hiding in a building in the Molenbeek district of Brussels, and Belgian Justice Minister Koen Geens said that a Belgian law banning police raids between 9 p.m. and 5 a.m. may have played a role in his subsequent escape.

FBI chief James Comey says Calif. killers used encrypted email, but not social media

The couple who killed 14 people and wounded nearly two dozen others this month in California chatted secretly of jihad long before they married or entered the United States, not on social media as politicians have claimed, FBI Director James Comey said Wednesday at a Manhattan law enforcement conference, where he urged the public to remain alert for signs someone close to them is being radicalized online.

Comey said those messages between Syed Rizwan Farook and Tashfeen Malik were direct, private messages well before their attack in San Bernardino, California.

“So far, in this investigation we have found no evidence of posting on social media by either of them at that period in time and thereafter reflecting their commitment to jihad or to martyrdom,” he said, referring to the reports suggesting that Malik had spoken openly on social media about jihad and that background checks had not detected those comments.

Comey made his statements at 1 Police Plaza, first at the NYPD Shield Conference, which included several hundred security personnel who work in the private sector and who collaborate with the NYPD, and again at a news conference.

“The threat comes from social media, which revolutionized terrorism,” Comey said.

Comey revealed for the first time that the shooting deaths last July of five people after attacks on two military installations in Chattanooga, Tenn., have now officially been classified as a terrorist attack. The assailant in that attack, Muhammad Youssef Abdulazeez, a naturalized U.S. citizen living in Hixson, Tenn., was killed by police gunfire after he shot and killed four Marines and a sailor and wounded three other people.

The White House on Wednesday said President Obama plans to visit San Bernardino on Friday and meet with the families of shooting victims there.

Comey said he understands Americans are jittery, but citizens should try to channel their awareness into vigilance, not panic. He said the threat from the Islamic State group, known as ISIS or ISIL, has not changed — but it’s vastly different from how terror cells operated around the time of the Sept. 11 attacks. “Your parents’ al-Qaida is a very different model and was a very different threat than what we face today,” he said.

For example, he said, some Twitter messages cannot be “unlocked” by law enforcement, making it impossible for them to track communications between terrorists.

Comey said Farook and Malik communicated via encrypted email which investigators have not been able to crack.

“The bottleneck here is there are a lot of people who have designed these products and they can’t access it themselves, because that is what the market requires,” Comey said. He said he hoped further public debate on encryption would convince the public that law enforcement needs the ability to unlock encrypted data to battle global terrorism.

Comey said the messages relayed from foreign terrorist groups are as succinct as “I will kill where I am.” Comey said such messages have inspired homegrown terrorists, who are receiving these messages on their phones daily.

Comey also urged the public not to “freak out” because they are anxious about another homegrown terrorist attack. Instead, he said, “We need the public to be aware and not to be fearful, but instead have a healthy awareness of their surroundings and report something if they see something. Tells us [law enforcement], because we need your help, and then live your life and let us do our job.”

NYPD Commissioner William Bratton echoed Comey’s comments. “To prevent crime, disorder and now terrorism we must go where it begins . . . in the minds of those who hate and feel victimized. People who see this are moms and dads.”

Bratton said the terrorists are “propagandizing messages that are slick and professional and are inspiring attacks.”

Encrypted Messages Stymied Probe of Garland Shooting — FBI Director

FBI Director James B. Comey testified at a Senate Judiciary Committee hearing on Capitol Hill in Washington on December 9, 2015. Among the lawmakers pressing the issue is Sen. John McCain (R-Ariz.), who said after the Paris attacks that the status quo was “unacceptable.”

He said the Federal Bureau of Investigation was focused intently on the threat of homegrown violent extremism, “the radicalization in place” of people who become inspired, influenced and/or directed by a terrorist group or extremists.

Though he said the Obama administration was not seeking legislation to address concerns over data encryption on smartphones, he said he remained concerned that criminals, terrorists and spies were using such technology to evade detection. This, critics argue, is why technologists must continue to dispel the myths behind the arguments against encryption.

“This isn’t going to solve the whole problem,” Comey said, adding, “I’m not questioning their motivations.”

In response, Comey appeared to counter his previous statement that there was no “technical issue,” essentially admitting he doesn’t know how companies would comply with such an order, but saying it would be their burden to figure it out. “In fact, the makers of phones that today can’t be unlocked, a year ago they could be unlocked.”

He also said tech companies should simply accept that they would be selling less secure products.

William Binney, veteran NSA codebreaker and early whistleblower, said good intelligence is much more a matter of collecting the destinations and origins of communications – the “metadata”, which will not work if encrypted – than of breaking into people’s private messages to see what’s there.
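
Binney’s point can be illustrated with a hypothetical message envelope (the field names and toy cipher here are invented for illustration): routing metadata typically stays readable even when the message body is encrypted.

```python
import hashlib
import itertools

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy cipher stand-in, for illustration only.
    ks = itertools.cycle(hashlib.sha256(key).digest())
    return bytes(b ^ k for b, k in zip(data, ks))

# A hypothetical message envelope: the network needs the routing fields in
# the clear to deliver the message, even when the body is encrypted.
envelope = {
    "from": "alice@example.org",                  # visible metadata
    "to": "bob@example.org",                      # visible metadata
    "sent": "2015-12-09T08:14:00Z",               # visible metadata
    "body": xor_stream(b"key", b"hello").hex(),   # opaque without the key
}

# An observer of the wire learns who talked to whom, and when...
observed = {k: v for k, v in envelope.items() if k != "body"}
assert observed == {"from": "alice@example.org", "to": "bob@example.org",
                    "sent": "2015-12-09T08:14:00Z"}
# ...but not what was said.
assert bytes.fromhex(envelope["body"]) != b"hello"
```

This is why analysts can map the origins and destinations of communications even when the content itself is out of reach.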

Comey said he is engaged in ongoing and productive conversations with Silicon Valley. “I promise you that’s the way we conduct ourselves,” he said. “We care about the same things.”

One of the attackers “exchanged 109 messages with an overseas terrorist” on the morning of the shooting, Comey said. “That is a big problem,” he added.

Firms have already decided that strong encryption is in their best interest, and as one senator noted, “Encryption is always going to be available to the sophisticated user.” FaceTime, Apple’s video call feature, has had end-to-end encryption since 2010.

In the wake of National Security Agency contractor Edward Snowden’s revelations about mass surveillance in 2013, there have been several discussions about governments’ need to be able to look at citizen data and individual privacy. Feinstein offered to pursue legislation herself, citing fear that her grandchildren might start communicating with terrorists over encrypted PlayStation systems.

“US tech companies do not want to be the middleman between law enforcement and their customers,” Utah Republican Orrin Hatch observed to Comey, who said he “wasn’t sure what [Hatch] meant by ‘middleman.’” Comey argued that “our ability to monitor them has not kept pace,” while conceding “the limits on what we can do legislatively; it wouldn’t necessarily fix the problem.” But law enforcement agents still have powerful tools to surveil suspects and gain information on terror plots.

FBI Director: Silicon Valley’s encryption is a “business model problem”

Leaders in both major political parties have increasingly been calling on tech companies to give law enforcement encryption backdoors in the wake of recent terror attacks in Paris and California.

Today, FBI director James Comey has suggested that Silicon Valley isn’t faced with a serious technical problem, but rather a “business model problem,” according to a report on his comments in The Intercept, based on C-SPAN video of the hearing.

On the face of it, Comey’s statement would seem to back away from earlier suggestions that tech companies can and should find a way to allow access to data when law enforcement wanted it, but provide otherwise secure services. Critics have pointed out that any encryption backdoors that can be used by the “good guys” also lead to widespread insecurity, since they can also be exploited by not-so-good guys.

At one point, Comey identified the problem as encryption “by default,” leading even unsophisticated users to have encrypted phones. The exchange looked like a veiled jab at Google and Apple.

“There are plenty of companies today that provide secure services to their customers and still comply with court orders,” said Comey. “There are plenty of folks who make good phones who are able to unlock them in response to a court order. In fact, the makers of phones that today can’t be unlocked, a year ago they could be unlocked.”

Comey also provided a specific example of a situation in which he said encryption was an obstacle for law enforcement.

“In May, when two terrorists attempted to kill a whole lot of people in Garland, Texas, and were stopped by the action of great local law enforcement,” he said. “That morning, before one of those terrorists left to try to commit mass murder, he exchanged 109 messages with an overseas terrorist. We have no idea what he said, because those messages were encrypted. That is a big problem.”

In the end, Comey didn’t really make clear exactly what measures he expects tech companies to take, or whether he’d favor legislation to force them to do it. But he made clear, in a fairly confusing way, that he’s not satisfied with the current drive to encrypt devices.

McCaul wants new commission on encryption and law enforcement

The chairman of the House Homeland Security Committee said he plans to introduce legislation that would allow the creation of a “national commission on security and technology challenges in the Digital Age.”

The legislation “would bring together the technology sector, privacy and civil liberties groups, academics, and the law enforcement community to find common ground,” Chairman Rep. Michael McCaul (R-Texas) said in a Dec. 7 speech at National Defense University. “This will not be like other blue ribbon panels, established and forgotten.”

He said the ability of terrorist groups to use encrypted applications while communicating is one of his biggest fears. “We cannot stop what we cannot see,” he said in reference to recent attacks in San Bernardino, Calif., and Paris.

McCaul described the Islamic State as not a “terrorist group on the run” but a “terrorist group on the march.” He said 19 Islamic State-connected plots in the U.S. have been thwarted by government officials. But he added that terrorist groups are using the Internet to expand.

“Americans are being recruited by terrorist groups at the speed of broadband while we are responding at the speed of bureaucracy,” he said.

FBI Director James Comey has been a vocal critic of end-to-end encryption in commercial devices, and his advocacy has received a mixed reception on Capitol Hill. During an Oct. 27 hearing, Rep. Will Hurd (R-Texas), a former CIA officer who has private-sector cybersecurity experience, criticized Comey for saying encryption thwarts counterterrorism efforts and for “throwing certain companies under the bus by saying they’re not cooperating,” a charge that Comey denied.

In an interview, Hurd welcomed McCaul’s proposed commission by saying, “I think getting a group of industry experts from all sides of this issue to talk — and to not talk past one another — is ultimately a good thing.”

Hurd, a member of the Homeland Security Committee, said he planned to speak with McCaul to make sure the commission had the “right folks in the room.”

He added that the right people would be leaders of technology firms whose encryption services have been at the center of the debate, along with law enforcement officers who might be able to identify situations in which agencies would need to get around encryption.

But those situations still seem elusive. When he was a CIA officer working on cybersecurity issues, Hurd said he did not think of encryption as an insurmountable roadblock.

“Guess what? Encryption was around back then,” he said.

Hurd pointed out that intelligence can be gleaned from the contours of encrypted channels — such as communications between IP addresses — without decrypting the communications.
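That kind of contour analysis can be sketched in a few lines: flow records carry no message content, yet they still reveal who talks to whom and how often. (The IP addresses below are documentation-range placeholders, not real endpoints.)

```python
from collections import Counter

# Each record is (source IP, destination IP): metadata an observer can see
# even when every payload is encrypted end to end.
flows = [
    ("203.0.113.7", "198.51.100.42"),
    ("203.0.113.7", "198.51.100.42"),
    ("203.0.113.9", "198.51.100.42"),
    ("203.0.113.7", "198.51.100.42"),
]

channel_activity = Counter(flows)
busiest, count = channel_activity.most_common(1)[0]
print(busiest, count)  # ('203.0.113.7', '198.51.100.42') 3
```

Nothing here decrypts anything; the frequency of contact alone singles out the most active channel.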

“I still haven’t gotten anybody to explain to me a very specific case where the investigation went cold” because of encryption, he said of his conversations with law enforcement officials.

McCaul sounded a more dire note by saying, “I have personally been briefed on cases where terrorists communicated in darkness and where we couldn’t shine a light, even with a lawful warrant.”

He said countering Islamic State’s use of encrypted messaging is “one of the greatest counterterrorism challenges of the 21st century.” At the same time, he was careful not to target encryption technology itself, which he described as “essential for privacy, data security and global commerce.”

In a Dec. 6 speech from the Oval Office, President Barack Obama announced plans to seek public/private cooperation on challenges posed by encrypted communications. He said he will “urge high-tech and law enforcement leaders to make it harder for terrorists to use technology to escape from justice.”

However, it is not clear if that message represents more than a change in tone from current policy. The administration had previously said it would not seek legislation to push companies to retain customers’ encryption keys and share them with law enforcement agencies.

U.S. CIO Tony Scott told FCW in a November interview that “at the end of the day, I think the better policy is probably not to require these backdoors” for law enforcement.

Although a new law could potentially cover U.S.-based providers and devices manufactured by U.S.-based companies, encryption applications would still be widely available beyond the country’s jurisdiction.

“All the really bad people who are highly motivated to keep their stuff secret are going to use the encryption method that doesn’t have a backdoor,” Scott said.

McCaul used the bulk of his speech to call for tighter restrictions on the Visa Waiver Program, as outlined in a bill introduced this week that would require high-risk individuals who have visited a terrorist hot spot to undergo an intensive screening process before entering the United States. He said that approach would also strengthen intelligence sharing with allies and help prevent passport fraud.

Apple, Google encryption is a blow to public safety

A November 2015 report of the Manhattan District Attorney’s Office in New York City sets forth succinctly a huge public safety problem of which most Americans are unaware:

“Most people today live their lives on smartphones, and, in this regard at least, criminals are no different. While in the past criminals may have kept evidence of their crimes in file cabinets, closets and safes, today that evidence is more often found on smartphones. Photos and videos of child sexual assault, text messages between sex traffickers and their customers, even a video of a murder victim being shot to death — these are just a few of the pieces of evidence found on smartphones and used to prosecute people committing horrific crimes.

“Last fall a decision by a single company changed the way those of us in law enforcement work to keep the public safe and bring justice to victims and their families. In September 2014 Apple announced that its new operating system for smartphones and tablets would employ, by default, what is commonly referred to as “full-disk encryption,” making data on its devices completely inaccessible without a pass code. Shortly thereafter, Google announced that it would do the same.

“Apple’s and Google’s decisions to enable full-disk encryption by default on smartphones means that law enforcement officials can no longer access evidence of crimes stored on smartphones, even though the officials have a search warrant issued by a neutral judge.

“Apple and Google are not responsible for keeping the public safe. That is the job of law enforcement. But the consequences of these companies’ actions on public safety are severe.”

Smartphone encryption will hamper many criminal investigations. E-mails, text messages, voice messages, photos and other data — all of which could lead to the perpetrator of a crime or finding an abducted victim — will now be fully encrypted simply so Apple and Google can increase their profits by advertising enticing claims of privacy.

And this is not just about domestic criminal investigations. What happens when the U.S. military captures or kills the next global terrorist, locates his phone and acquires … nothing?

This is not an issue of government overreaching into the private lives of citizens, as some make it out to be. No smartphone or other device can be accessed by law enforcement without a search warrant issued upon probable cause assessed by a neutral magistrate.

This isn’t about privacy, and it shouldn’t be about profits. It’s about the safety of American citizens and others around the world.

Congress can stop this serious public safety risk tomorrow through its powers under the Commerce Clause of the Constitution. The time to act is now.

Encrypted messaging app Signal now available for desktops

The much-lauded encryption app Signal has launched a beta program for a desktop version of the app, which will run through Google’s Chrome browser.

Signal Desktop is a Chrome app that will sync messages transmitted between it and an Android device, wrote Moxie Marlinspike, a cryptography expert who helped develop Signal, in a blog post on Wednesday.

The app comes from Open Whisper Systems, which developed Signal’s predecessors, Redphone and TextSecure, two Android applications that encrypted calls and messages. Both have since been consolidated into Signal.

Signal Desktop won’t be able to sync messages with iPhone just yet, although there are plans for iOS compatibility, Marlinspike wrote. It also won’t support voice initially.

Signal, which is free, has stood out in a crowded field of encrypted messaging applications, which are notoriously difficult to engineer, and has been endorsed by none other than former U.S. National Security Agency contractor Edward Snowden.

The mobile version of Signal for the iPhone and Android uses end-to-end encryption for voice calls, messaging and sending photos.

Open Whisper Systems itself can’t see the plain text of messages or get access to phone calls since it doesn’t store the encryption keys.
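That property rests on key agreement between the endpoints: the two phones derive a shared secret from values exchanged in public, so a relay that sees the traffic never learns the key. A toy Diffie-Hellman exchange sketches the idea (Signal’s actual protocol is far more elaborate, and the parameters here are illustrative, not secure):

```python
import secrets

# Toy Diffie-Hellman key agreement. A server relaying A_pub and B_pub
# cannot compute the shared key without a private exponent.
p = 2**127 - 1   # a Mersenne prime, far too small for real use
g = 3

a = secrets.randbelow(p - 2) + 2   # Alice's private value, never transmitted
b = secrets.randbelow(p - 2) + 2   # Bob's private value, never transmitted

A_pub = pow(g, a, p)   # passes through the server in the clear
B_pub = pow(g, b, p)

alice_key = pow(B_pub, a, p)   # computed on Alice's phone
bob_key = pow(A_pub, b, p)     # computed on Bob's phone

assert alice_key == bob_key    # identical secret, derived independently
```

Because the shared key exists only on the two endpoints, the service in the middle has nothing to hand over.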

Signal is open source, which allows developers to closely inspect its code. There has been growing concern that software vendors may have been pressured into adding capabilities in their products that would assist government surveillance programs. In theory, having open-source code means such tampering could be identified.

Why Government and Tech Can’t Agree about Encryption

Your gadgets are getting better and better at protecting your privacy. But Uncle Sam isn’t totally comfortable with that, because it’s also complicating the work of tracking criminals and potential national-security threats.

For decades, tech companies have steadily expanded the use of encryption — a data-scrambling technology that shields information from prying eyes, whether it’s sent over the Internet or stored on phones and computers. For almost as long, police and intelligence agencies have sought to poke holes in the security technology, which can thwart investigators even when they have a legal warrant for, say, possibly incriminating text messages stored on a phone.

The authorities haven’t fared well; strong encryption now keeps strangers out of everything from your iMessages to app data stored on the latest Android phones. But in the wake of the Paris attacks, U.S. officials are again pushing for limits on encryption, even though there’s still no evidence the extremists used it to safeguard their communications.

While various experts are exploring ways of resolving the impasse, none are making much headway. For now, the status quo favors civil libertarians and the tech industry, although that could change quickly — for instance, should another attack lead to mass U.S. casualties. Such a scenario could stampede Congress into passing hasty and potentially counterproductive restrictions on encryption.

“There are completely reasonable concerns on both sides,” said Yeshiva University law professor Deborah Pearlstein. The aftermath of an attack, however, “is the least practical time to have a rational discussion about these issues.”

Encryption plays a little heralded, yet crucial role in the modern economy and daily life. It protects everything from corporate secrets to the credit-card numbers of online shoppers to the communications of democracy advocates fighting totalitarian regimes.

At the same time, recent decisions by Apple and Google to encrypt smartphone data by default have rankled law enforcement officials, who complain of growing difficulty in getting access to the data they feel they need to build criminal cases and prevent attacks. For months, the Obama administration — which has steered away from legislative restrictions on encryption — has been in talks with technology companies to brainstorm ways of giving investigators legal access to encrypted information.

But technology experts and their allies say there’s no way to grant law enforcement such access without making everyone more vulnerable to cybercriminals and identity thieves. “It would put American bank accounts and their health records, and their phones, at a huge risk to hackers and foreign criminals and spies, while at the same time doing little or nothing to stop terrorists,” Sen. Ron Wyden, D-Ore., said in an interview Monday.

Lawmakers on the U.S. Senate Select Committee on Intelligence remain on what they call an “exploratory” search for options that might expand access for law enforcement, although they’re not necessarily looking at new legislation.

The FBI and police have other options even if they can’t read encrypted files and messages. So-called metadata — basically, a record of everyone an individual contacts via phone, email or text message — isn’t encrypted, and service providers will make it available when served with subpoenas. Data stored on remote computers in the cloud — for instance, on Apple’s iCloud service or Google’s Drive — is also often available to investigators with search warrants. (Apple and Google encrypt that data, but also hold the keys.)
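The content/metadata split can be sketched like this; the XOR routine is a toy stand-in for real encryption, and the addresses are documentation-range placeholders:

```python
import os

def xor_stream(key, data):
    # Toy stand-in for a real cipher: scrambles only the message content.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)                  # known only to the two endpoints
body = b"meet at the cafe at noon"

packet = {
    "src": "203.0.113.7",             # metadata must stay readable so the
    "dst": "198.51.100.42",           # network can route the message
    "length": len(body),
    "payload": xor_stream(key, body), # content the provider cannot read
}

# Without the key the payload is opaque, but src, dst and length remain
# available to a provider served with a subpoena.
assert xor_stream(key, packet["payload"]) == body
```

This is why subpoenas for metadata still work even when the message bodies themselves are unreadable.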

Some security experts suggest that should be enough. Michael Moore, chief technology officer and co-founder of the Baltimore, Maryland-based data security firm Terbium Labs, noted that police have managed to take down online criminals even without shortcuts around encryption. He pointed to the 2013 takedown of Silk Road, a massive online drug bazaar that operated on the “dark Web,” essentially the underworld of the Internet.

“The way they figured that out was through good old-fashioned police work, not by breaking cryptography,” Moore said. “I don’t think there’s a shortcut to good police work in that regard.”

Others argue that the very notion of “compromise” makes no sense where encryption is concerned. “Encryption fundamentally is about math,” said Mike McNerney, a fellow on the Truman National Security Project and a former cyber policy adviser to the Secretary of Defense. “How do you compromise on math?” He calls the idea of backdoors “silly.”

Some in law enforcement have compromise ideas of their own. The Manhattan District Attorney’s office, for instance, recently called for a federal law that would require smartphone companies to sell phones they could unlock for government searches — in essence, forcing them to hold the keys to user data.

In a report on the subject, the office called its suggestion a “limited proposal” that would only apply to data stored on smartphones and restrict searches to devices that authorities had already seized. Privacy advocates and tech companies aren’t sold, saying it would weaken security for phones that are already too vulnerable to attack.

Marcus Thomas, the chief technology officer at Subsentio and former assistant director of the FBI’s operational technology division, argued that it’s too late to turn back the clock on strong encryption, putting law enforcement in a “race against time” to obtain investigatory data whenever and wherever it can. But he urged security experts to find ways to help out investigators as they design next-generation encryption systems.

The idea of allowing law enforcement secure access to encrypted information doesn’t faze Nathan Cardozo, a staff attorney for the San Francisco-based Electronic Frontier Foundation, provided a warrant is involved. Unfortunately, he says, cryptographers agree that the prospect is a “pure fantasy.”

The secret American origins of Telegram, the encrypted messaging app favored by the Islamic State

An encrypted communications app called Telegram has been in the news a lot this week, amid fears that the Islamic State has adopted it as its preferred platform for messaging.

On Nov. 18, Telegram reportedly banned 78 ISIS-related channels, “disturbed” to learn how popular the app had become among extremists. Those extremists had used the app both to spread propaganda, according to an October report, and to crowdfund money for guns and rockets, according to Vocativ.

Telegram makes an obvious choice for both activities: In media interviews and on his Web site, the app’s founder — Pavel Durov, often called the “Zuckerberg of Russia” — has boasted that Telegram is technologically and ideologically unsurveillable. In the wake of the terrorist attacks in Paris, however, questions have begun to emerge about how trustworthy Telegram actually is.

Multiple cryptologists and security experts have claimed that Telegram is actually not all that secure: a flaw that may reflect the fact that Telegram wasn’t initially conceived as an encrypted messaging platform.

On top of that, while Telegram is typically described as a highly principled, Berlin-based nonprofit, that hasn’t always been the case: Up until about a year ago, Telegram was an opaque web of for-profit shell companies — mired in conflict and managed, in large part, from the United States.

“Pavel is really unpredictable,” said Axel Neff, the estranged co-founder and former chief information officer at the company. “His biggest drive has always been notoriety.”

Neff makes an odd protagonist in a tale of international corporate intrigue. Raised in rural ski country south of Buffalo, N.Y., and schooled in engineering, Neff was essentially working in construction when Durov founded Russia’s largest social network, Vkontakte, in 2006. Neff’s a salt-of-the-earth guy — a Bills fan and the co-owner, with his mother, of a train-themed restaurant — who seems to have stumbled into Russian tycoon circles entirely by accident. (Neither Pavel nor Telegram returned the Post’s request for comment.)

In college, one of his high school buddies studied abroad in Russia, where he was fortuitously placed in a study group with Durov and a guy named Ilya Perekopsky. Neff befriended Perekopsky when he came to Buffalo for a summer to practice English; Perekopsky went on to help found VK. Before he knew it, a random 28-year-old who drove an old Toyota and lived in rural New York state was the assistant director of international operations at one of the world’s largest social networking companies.

Neff was pretty good at his job, according to court documents made public in 2014 that shed light on the business practices and dealings of Telegram — although he did depart, that same year, under sketchy circumstances. After joining VK in 2008, Neff helped develop the site in foreign markets and transition it away from the vkontakte.com URL. By 2011, when the political situation in Russia was making business perilous for social networks and other Internet companies, Neff was good friends with both Durov and Perekopsky. In 2012, they and several other VK executives began discussing a new app; Neff began researching server space and renting a downtown Buffalo office.

At the time, Neff said, the concept for the company was simple: a series of messaging apps — of which Telegram would be the first — that relied not on cellphone carriers but on data networks.

Encryption Debate Erupts Post-Paris Attacks But Don’t Expect Any Change Soon

Despite the lack of evidence, the Obama Administration has revived the encryption debate, pointing to encryption as an aid to the terrorists behind the Nov. 13 Paris attacks.

Investigators from France and the U.S. have conceded that there has been no evidence backing up their conclusion that the terrorist behind the attacks relied on the latest, high-level encryption techniques being offered to consumers by Google and Apple.

Yet, the debate over government-grade encryption is back in high gear.

Decrypting the Encryption Debate

The Great Encryption debate kicked into full swing about a year ago, when current and former chiefs of the U.S. Department of Justice began calling on Apple and Google to create backdoors in iOS 8 and Android Lollipop.

The encryption built into the two mobile operating systems is so tough that even the world’s best forensic scientists couldn’t crack a device running the software before a seven-year statute of limitations ran out.

While it’s possible to crack the encryption in less time, each misstep pushes back the cool-down period before the software allows another attempt.
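A hypothetical sketch of that escalating cool-down logic (the delay schedule below is illustrative only, not Apple’s or Google’s actual values):

```python
# Hypothetical escalating lock-out: each failed passcode guess pushes back
# the next allowed attempt. Delay values are illustrative, not real ones.
DELAYS = [0, 0, 0, 0, 60, 300, 900, 900, 3600]  # seconds per failed attempt

def cooldown(failed_attempts):
    """Seconds to wait before the next guess is permitted."""
    if failed_attempts >= len(DELAYS):
        return DELAYS[-1]
    return DELAYS[failed_attempts]

# A few free tries, then rapidly growing delays: at one guess per hour,
# exhausting a six-digit passcode space takes on the order of a century.
print(cooldown(3), cooldown(4), cooldown(50))  # 0 60 3600
```

The point is that brute force fails not because each guess is hard, but because the device rations how often guesses are allowed.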

A few weeks before the Nov. 13 attacks on Paris, the DOJ employed a new strategy to coerce Apple into handing over the keys to iOS – and it’s a good one. The tech world is still awaiting Apple’s counterpunch.

Roughly a year ago, then-U.S. Attorney General Eric Holder framed the debate on encryption and stated the DOJ’s stance while speaking at the Global Alliance Against Child Sexual Abuse Online.

“Recent technological advances have the potential to greatly embolden online criminals, providing new methods for abusers to avoid detection,” Holder said, adding that there are those who take advantage of encryption in order to hide their identities and “conceal contraband materials and disguise their locations.”

The Information Technology Industry Council, which speaks on behalf of the high-tech industry, sees all of the above issues as reasons everyone needs encryption.

“Encryption is a security tool we rely on every day to stop criminals from draining our bank accounts, to shield our cars and airplanes from being taken over by malicious hacks, and to otherwise preserve our security and safety,” said Dean Garfield, president and CEO of ITI.

While stating the ITI’s deep “appreciation” for the work done by law enforcement and the national security community, Garfield said there is no sense in weakening security in the name of improving it.

“[W]eakening encryption or creating backdoors to encrypted devices and data for use by the good guys would actually create vulnerabilities to be exploited by the bad guys, which would almost certainly cause serious physical and financial harm across our society and our economy,” he explained.

Paris as a Talking Point

In the wake of the recent Paris Attack, U.S. officials have again reissued their call for software developers – Apple, Google and others – to provide law enforcement agencies with keys to the backdoor of operating systems with government-grade encryption.

There is still no evidence that law enforcement agencies, with encryption keys in hand, could have given police on the ground in Paris a game-changing heads-up about the attacks. Nevertheless, Paris has been turned into a talking point, said Michael Morell, a former deputy director of the CIA, who stated that the tragic events will reshape the encryption debate.

“We have, in a sense, had a public debate [on encryption],” said Morell. “That debate was defined by Edward Snowden.” Now, though, instead of being defined by the former NSA contractor and leaker, the issue of encryption will be “defined by what happened in Paris.”

Paris attacks reignite debate over encryption, surveillance and privacy

WASHINGTON — Friday’s terrorist attacks in Paris have revived the debate over whether U.S. tech companies should be required to build “backdoors” into encrypted phones, apps and Internet sites to let law enforcement conduct surveillance of suspected terrorists.

There has been widespread speculation among law enforcement authorities and the media that the Islamic State terrorists who attacked Paris were using some kind of encryption technology to communicate. However, American and French authorities have said there is no hard evidence to back up that assumption.

Still, the possibility has been enough to renew criticism of commercial encryption, putting pressure on U.S. companies that are increasingly using the technology to thwart hackers and reassure customers that their data will be kept private.

“When individuals choose to move from open means of communication to those that are encrypted, it can cause a disruption in our ability to use lawful legal process to intercept those communications and does give us concern about being able to gather the evidence that we need to continue in our mission for the protection of the American people,” Attorney General Loretta Lynch told the House Judiciary Committee Tuesday.

Lynch said the FBI and other Justice Department agencies work with Internet providers to try to find a way to enforce court orders to conduct surveillance of suspected terrorists. However, companies are increasingly employing encryption that even they cannot break to access their customers’ data.

In those cases, federal agents use other types of surveillance and intelligence-gathering tools, Lynch said.

“But it (encryption) does cause us the loss of a very valuable source of information,” she told the committee.

Despite strong criticism of encryption by the FBI, the White House announced in October that it would not seek legislation to force U.S. tech companies to build backdoors to let law enforcement get around the technology to access people’s messages and other information.

Paris attack stokes the flames in fight over US data encryption

Last week’s terrorist attack on Paris sounded a call to arms for hawkish U.S. officials seeking broad oversight of encrypted digital communications, some of whom used the opportunity to rekindle discussions with Silicon Valley technology companies.

In an interview with MSNBC on Monday, Senator Dianne Feinstein (D-Calif.) said Silicon Valley companies, particularly those marketing secure Internet messaging services, should help government agencies protect the homeland by allowing controlled access to encrypted data.

“They have apps to communicate on that cannot be pierced even with a court order, so they have a kind of secret way of being able to conduct operations and operational planning,” Feinstein said of ISIS terrorists. She hammered the point home, reminding MSNBC’s Andrea Mitchell of recent video footage showing ISIS leaders giving potential sleeper cells the go ahead to carry out attacks on U.S. soil.

Last month the Senate passed the controversial Cybersecurity Information Sharing Act, a bill that effectively allows companies to legally share customer data with the Department of Homeland Security and other government agencies. Feinstein is a co-sponsor of the bill.

As iOS and Android dominate modern mobile communications, Apple and Google have been singled out as part of the problem for providing end-to-end encrypted messaging services. For example, strong encryption in iOS 8 and above makes it virtually impossible to eavesdrop on iMessage conversations or gain physical device access, even with appropriate warrants.

“I have actually gone to Silicon Valley, I have met with the chief counsels of most of the big companies, I have asked for help and I haven’t gotten any help,” Feinstein said. “I think Silicon Valley has to take a look at their products, because if you create a product that allows evil monsters to communicate in this way, to behead children, to strike innocents, whether it’s at a game in a stadium, in a small restaurant in Paris, take down an airliner, that’s a big problem.”

Bloomberg reports other top-ranking U.S. officials, including CIA Director John Brennan, made similar comments, but stopped short of asking that new laws be enacted.

“There are a lot of technological capabilities that are available right now that make it exceptionally difficult — both technically as well as legally — for intelligence security services to have insight that they need,” Brennan said today at an event in Washington, D.C.

For its part, Apple has been a vocal advocate of consumer privacy and pushed back against CISA alongside other tech companies in October. CEO Tim Cook has repeatedly warned of the detrimental effects a back door policy would have not only on individual users, but the tech industry as a whole.

Critics of Apple’s position argue CISA lets providers share data while still maintaining privacy, a proverbial win-win situation for everyone involved. Americans could find themselves putting those claims to the test sooner rather than later, as the bill is headed to the House of Representatives and, if passed, to President Obama to be signed into law.

Microsoft releases encryption tech for bioinformatics

Allows researchers to work on data securely.

Microsoft has released tools that allow bioinformatics researchers to work on genome data sets securely to protect privacy.

Genomic data is becoming available in increasing amounts as gene sequencing becomes easier, cheaper and faster, and is used for several new applications such as predicting the occurrence and survival of cardiovascular disease.

Hospitals, clinics, companies and other institutions are faced with handling large amounts of such data securely to ensure the privacy of subjects, and doing so carries risks.

Storing the data in a cloud is one solution to handle large amounts of information, but this is subject to legal orders, data misuse, theft and insider attacks, a team of six Microsoft researchers said.

Homomorphic encryption can protect people’s sensitive genetic information and still allow researchers to work with the data.

The technique allows an unlimited number of two operations, addition and multiplication, to be performed on the scrambled material.

This means researchers are able to work on the data in encrypted form without having to decrypt it or have access to decryption keys.

Traditional encryption, in comparison, locks down data, making it impossible to use or compute on without decoding it first.
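A minimal sketch of the additive half of this idea uses the classic Paillier cryptosystem. This is not the scheme in Microsoft's library, and the primes below are far too small for real security, but it shows a server computing on data it cannot read:

```python
import math
import random

# Toy Paillier cryptosystem: multiplying two ciphertexts adds the underlying
# plaintexts, so sums can be computed without ever decrypting.

def keygen(p, q):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # modular inverse of lambda mod n
    return n, (lam, mu, n)

def encrypt(n, m):
    n2 = n * n
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:    # r must be coprime to n
        r = random.randrange(2, n)
    return pow(1 + n, m, n2) * pow(r, n, n2) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    x = pow(c, lam, n * n)
    return (x - 1) // n * mu % n

pub, priv = keygen(1009, 1013)    # toy primes; real keys are thousands of bits
c1, c2 = encrypt(pub, 40), encrypt(pub, 2)
c_sum = c1 * c2 % (pub * pub)     # homomorphic addition, done while encrypted
print(decrypt(priv, c_sum))       # 42
```

A research host could aggregate encrypted genomic counts this way and return only ciphertexts, with decryption happening back at the institution that holds the key.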

The Microsoft team of researchers has written a manual on how to use its homomorphic encryption solution, serving as a guide to applying the technique to bioinformatics and genomic computations.

Along with the manual, Microsoft will also release SEAL (Simple Encrypted Arithmetic Library) as a free download, to be used for experimentation and research purposes.

Apple’s Encryption Fight Turns To The UK

After a major victory in the United States, Apple is facing another threat to its encryption efforts on a different front: the United Kingdom.

The Cupertino-based tech giant typically shies away from taking firm stances on specific legislation, working instead through lobbying groups representing technology companies’ interests. But Apple CEO Tim Cook today told students in Dublin that the company is opposed to a new British proposal that would require it to provide law enforcement with access to encrypted data.

Cook said creating a so-called backdoor for law enforcement would expose personal data to hackers.

“If you leave a back door in the software, there is no such thing as a back door for good guys only,” Cook said, according to Reuters. “If there is a back door, anyone can come in the back door.”

Cook’s statements have been backed up by privacy and technology experts. This summer, a group at MIT reported government limits on encryption would present risks.

Cook also said the British bill in its current form is vague, adding at the same event that it is not clear how Apple would be expected to comply.

The British bill, known as the Investigatory Powers Bill, would make explicit in law for the first time that law enforcement can hack and bug computers and phones, and it would oblige companies to help officials bypass encryption.

Apple began encrypting its smartphones by default in 2014 with the introduction of iOS 8. Law enforcement in the United States has rallied against the update, claiming it would prevent them from obtaining information key to solving investigations.

However, the White House has said it will not take a firm stance against encryption. Though the debate has continued heavily in Capitol Hill hearing rooms, the U.S. Congress has not proposed any legislative solution to the encryption debate.

The danger of the U.K.’s current proposal does not lie just in the privacy and security risks it presents to British citizens, but in the global precedent such a law would set. If the U.K. passes a law that requires that law enforcement be able to access encrypted data with a warrant, what’s to stop China or Russia from passing a similar law?

Apple hasn’t backed down on encryption since this issue first bubbled up last year. Though it’s been able to hold its own in the debate over encryption, this is the first time it will have to fight a bill targeting this practice.

Snowden Never Told Us About Ransom Encryption

While Edward Snowden is the source behind the largest scandal on the internet, he never warned us that hackers would attach ransoms to their malware. A new ransomware virus has been discovered that specifically targets Linux-based systems, and it’s telling us hackers are expanding to web servers for their vicious attacks.

This malware, labeled Linux.Encoder.1, walks through specific directories and encrypts their contents: home directories, the MySQL server directory, logs, and the Web directories of the Apache and Nginx web servers. It leaves a ransom note in every directory that contains encrypted files, and those files are next to impossible to recover unless users have appropriate backups or pay the ransom.

The virus also encrypts archives whose names contain the very word ‘backup’, so getting out of the pinch without paying the ransom is extremely difficult. The team behind the discovery urges users to keep active backups and make sure their information is as secure as possible. The team also revealed that the malware likely spreads through brute-force guessing of remote access credentials or Web application exploits combined with local privilege escalation, and it probably gives Snowden himself a warm feeling in the heart.

It’s an interesting development in what we are willing to pay to keep our information secure. As anti-virus software continues to improve, ransoms may become more aggressive and more damaging. Could this have been something Snowden missed, or simply failed to inform the world about?

Investigatory Powers Bill could allow Government to ban end-to-end encryption, technology powering iMessage and WhatsApp

The new Investigatory Powers Bill could ban WhatsApp and iMessage as they currently exist and lead to the weakening of security.

Introducing the Bill this week, Home Secretary Theresa May said that it didn’t include a controversial proposal to ban the encryption that ensures that messages can’t be read as they are sent between devices. But it does include rules that could allow the Government to force companies to create technology that allows those messages to be read, weakening encryption.

The Bill gives wide-ranging powers to the Home Secretary to force companies to make services that can be more easily read by intelligence agencies.

Section 189 of the law allows the Government to impose “obligations” on companies that provide telecommunications services. That can include “the removal of electronic protection”, as well as a range of others.

It isn’t clear how that law would be used in practice. But it could allow for the breaking of encryption so that messages can be read.

Some of those powers were already available. But the new legislation repeats them – despite the suggestion that the ban on encryption has been dropped – as well as strengthening some of the ways that Government can impose such obligations.

At the moment, services including WhatsApp and Apple’s iMessage use end-to-end encryption. That means the phones exchanging messages use keys to ensure that nobody else – including WhatsApp and Apple themselves – can read them.

When end-to-end encryption is used, it isn’t possible to set up a system that allows only the breaking of messages from a specific phone, or of messages sent between two specific people. Instead, allowing for the viewing of even those messages would entail entirely re-engineering the system so that WhatsApp and Apple held the keys to unlock any message, sitting in the middle of all conversations.

Technology companies are understood to be concerned about that setup, because if they are able to read through messages then the same system could be used by members of staff or hackers to read through the messages of all of a service’s users.

Earlier this year, a report from some of the world’s leading computer experts said that weakening encryption “will open doors through which criminals and malicious nation states can attack the very individuals law enforcement seeks to defend”.

“If law enforcement’s keys guaranteed access to everything, an attacker who gained access to these keys would enjoy the same privilege,” the report argued.

Apparently partly in response to that criticism, the US Government has mostly walked back its attempts to weaken encryption.

New U.K. online surveillance proposal could have international reach

A new surveillance proposal in the United Kingdom is drawing criticism from privacy advocates and tech companies that say it gives the government far-reaching digital surveillance powers that will affect users outside the nation’s borders.

The Draft Investigatory Powers Bill released by British Home Secretary Theresa May Wednesday would force tech companies to build intercept capabilities into encrypted communications and require telecommunications companies to hold on to records of Web sites visited by citizens for 12 months so the government can access them, critics allege.

Policy changes are necessary to maintain security in a changing digital landscape, the government argued. “The means available to criminals, terrorists and hostile foreign states to co-ordinate, inspire and to execute their plans are evolving,” May wrote in a foreword to the bill. “Communications technologies that cross communications platforms and international borders increasingly allow those who would do us harm the opportunity to evade detection.”

The bill has some new judicial oversight mechanisms, but the response from privacy advocates was largely negative, with some arguing that those changes aren’t enough to compensate for the expansion of new powers.

“The law would apply to all companies doing business with the UK, which includes basically all companies that operate over the internet,” said Nathan White, senior legislative manager at digital rights group Access. “This means that even wholly domestic encrypted communications in the United States, France, or South Africa would be put at risk.”

Some tech companies themselves also raised alarm bells. “Many aspects of the draft Bill would directly impact internet users not just in the UK, but also beyond British borders,” Yahoo said in a blog post. “Of most concern to us at this stage is the UK Government’s proposal to affirm extraterritorial jurisdiction over foreign service providers.”

The U.K. government says some of the controversial aspects of the draft, including the requirement to unlock encrypted communications, date back to laws already on the books, and that the bill replaces a patchwork of powers going back to the early days of the Web. However, while a Code of Conduct for Interception Capabilities released by the British government earlier this year said communications companies were required to maintain a “permanent interception capability,” it made no mention of decrypting such content.

Privacy advocates say the government is reinterpreting earlier laws in problematic ways. “This is a major change” that would effectively outlaw end-to-end encryption, a form of digital security where only the sender and the recipient of a message can unlock it, White said.

In meetings before the draft was released, the government pressed at least one tech company to build in backdoors into encrypted communications, according to a person familiar with the issue who requested anonymity because he was not authorized to comment on the issue.

Apple’s iMessage system uses end-to-end encryption, as do an increasing number of standalone messaging and calling apps including Signal. If the proposal becomes law, critics warn, such services may be forced to alter their systems to include such “backdoors” to allow the government to access encrypted content — something encryption experts say would undermine security by making the underlying code more complex and giving hackers something new to target — or exit the market. Apple declined to comment on the bill, but chief executive Tim Cook has been a vocal opponent of government-mandated backdoors in the past.

Encryption was at the heart of a U.S. policy debate over the last year. The dialogue was triggered when Apple moved to automatically protect iOS devices with encryption so secure the company itself cannot unlock data stored on an iPhone even if faced with a warrant, assuming that a user turns off automatic back-ups to the company’s servers.

Some law enforcement officials warn that criminals and terrorists are “going dark” due to such technology. But the Obama administration decided not to press for a legislative mandate that would require companies to build ways to access such content into their products, although it has not yet come out with a full policy position on the issue.

Critics argue that has led to ambiguity which emboldened British officials. “This draft proposal from the U.K. government demonstrates the lack of leadership on encryption policy from the Obama Administration” and could lead to similar proposals in other parts of the world, said White.

If one country is able to force companies to unlock encrypted data it will be hard to fend off such requests from others including China and Russia, some inside tech companies fear.

When asked about the British proposal by The Post, National Security Council spokesperson Mark Stroh declined to weigh in. “We’d refer you to the British government on draft British legislation,” he said via e-mail.

This Snowden-Approved Encrypted-Communication App Is Coming to Android

Since it first appeared in Apple’s App Store last year, the free encrypted calling and texting app Signal has become the darling of the privacy community, recommended—and apparently used daily—by no less than Edward Snowden himself. Now its creator is bringing that same form of ultra-simple smartphone encryption to Android.

On Monday the privacy-focused nonprofit software group Open Whisper Systems announced the release of Signal for Android, the first version of its combined calling and texting encryption app to hit Google’s Play store. It’s not actually the first time Open Whisper Systems has enabled those features on Android phones; it launched an encrypted voice app called RedPhone and an encrypted texting program called TextSecure for Android back in 2010. But now the two have been combined into a single, simple Signal app, just as they are on the iPhone. “Mostly this was just about complexity. It’s easier to get people to install one app than two,” says Moxie Marlinspike, Open Whisper Systems’ founder. “We’re taking some existing things and merging them together to make the experience a little nicer.”

That streamlining of RedPhone and TextSecure into a single app, in other words, doesn’t actually make Open Whisper Systems’ encryption tools available to anyone who couldn’t already access them. But it does represent a milestone in those privacy programs’ idiot-proof interface, which in Signal is just as straightforward as normal calling and texting. As Marlinspike noted when he spoke to Wired about Signal’s initial release last year, that usability is just as important to him as the strength of Signal’s privacy protections. “In many ways the crypto is the easy part,” Marlinspike said at the time. “The hard part is developing a product that people are actually going to use and want to use. That’s where most of our effort goes.”

Open Whisper Systems’ encryption tools already have a wide footprint: According to Google Play’s stats, TextSecure has been downloaded to at least a million Android phones, all of which will now receive the Signal app in a coming update. Since 2013, TextSecure has also been integrated by default in the popular CyanogenMod version of Android. And last year WhatsApp gave it an enormous boost by integrating it by default into its Android app for Android-to-Android communications—a move that put Open Whisper Systems’ code on at least a half-billion Android users’ devices.

The security of those apps has been widely applauded by cryptographers who have audited them: As Johns Hopkins professor Matthew Green wrote in a 2013 blog post, “After reading Moxie’s RedPhone code the first time, I literally discovered a line of drool running down my face. It’s really nice.”

Open Whisper Systems, which is funded by a combination of personal donations and grants from groups like the U.S. government’s Open Technology Fund, likely doesn’t enjoy the same popularity among law enforcement agencies. FBI Director James Comey has repeatedly warned Congress over the last year of the dangers of consumer encryption programs, and British Prime Minister David Cameron even threatened to ban WhatsApp this summer based on its use of TextSecure.

All of that enmity has only bolstered Signal’s reputation within the privacy community—an affection that’s now been extended to its new Android app, too. “Every time someone downloads Signal and makes their first encrypted call, FBI Director Jim Comey cries,” wrote American Civil Liberties Union lead technologist Chris Soghoian on Twitter. “True fact.”

New UK laws ban unbreakable encryption for internet and social media companies

Companies such as Apple and Google will be banned from offering unbreakable encryption under new UK laws.

Under measures set to be unveiled on Wednesday (November 4), internet and social media companies will no longer be able to provide encryption so advanced that they cannot decipher it themselves, according to The Daily Telegraph.

It will see tech firms and service providers required to provide unencrypted communications to the police or spy agencies if requested through a warrant, and comes as David Cameron urged the public and MPs to back his new surveillance measures.

On ITV’s This Morning earlier today (November 2), the Prime Minister argued that terrorists, paedophiles and criminals must not be allowed to communicate secretly online.

“We shouldn’t allow the internet to be a safe space for them to communicate and do bad things,” he outlined.

Measures in the Investigatory Powers Bill will place a legal duty on companies to be able to access their customers’ data. The bill is also expected to keep the Home Secretary responsible for signing off requests, but with extra judicial oversight.

The bill will also require internet companies to retain the browsing history of their customers for up to a year.

Oracle hardwires encryption and SQL hastening algorithms into Sparc M7 silicon

Oracle execs used the final keynote of this week’s OpenWorld to praise their Sparc M7 processor’s ability to accelerate encryption and some SQL queries in hardware.

On Wednesday, John Fowler, veep of systems at Oracle, said the M7 microprocessor and its built-in coprocessors that speed up crypto algorithms and database requests stood apart from the generic Intel x86 servers swelling today’s data center racks.

“I don’t believe that the million-server data center powered by a hydroelectric dam is the scalable future of enterprise computing,” Fowler said. “We’ll need to keep doing it, but we also need to invest in new technology so you all don’t have to build them.”

He told the crowd that Oracle has spent the past five years working out how to build a chip that can handle some SQL database queries in hardware, offloading the job from the main processor cores.

The new Sparc has eight in-memory database acceleration engines that are capable of blitzing through up to 170 billion rows per second, apparently. The acceleration is limited by the memory subsystem, which tops out at 160GB/s. Each of the eight engines has four pipelines, which adds up to 32 processing units.

According to Oracle, an acceleration engine can read in chunks of compressed columnar databases, evaluate a query on those columns while decompressing the information, and then spit out the result. While powerful, these engines are tiny and account for less than one per cent of the M7 chip’s acreage, Fowler said.

Essentially, the hardware is tuned for performing analytics at high-speed on in-memory columnar databases. Decompression is more important than compression for handling information fast, Fowler said, and the decision to build in specific hardware to handle it all makes the M7 very speedy. Very speedy at running Oracle Database, anyway.

To access these engines, you need to use an Oracle software library that abstracts away the specifics of the hardware: the library queues up SQL queries for the coprocessors to process, much like firing graphics commands into a GPU. Naturally, Oracle Database takes advantage of this library.

Oracle has taken the same hardware approach to encryption, too. Inside the M7 are accelerators capable of running 15 crypto algorithms, including AES and Diffie-Hellman, although at least two of these – DES and SHA-1 – are considered broken by now. Hardware-accelerated crypto is standard issue in today’s microprocessors, from Intel and AMD CPUs to ARM-compatible systems-on-chips.

As a result of these accelerators, the M7 chip is 4.5 times as fast as IBM’s Power8 processors, Fowler claimed, and in Oracle systems the processor handled encrypted data only 2.8 per cent more slowly than the same data unencrypted. The cryptographic capabilities of the chip don’t just work for Oracle code, Fowler said, but also in third-party Solaris applications.

“We’ve picked up the pace of silicon development,” he concluded. “This is our sixth processor in five years, with many more to come.”

Timothy Prickett Morgan, co-editor of our sister site The Platform, said the M7 has 10 billion 20nm transistor gates, and its database analytics engines are available to any programs running on Solaris.

“The Sparc M7 processors made their debut at the Hot Chips conference in 2014, and it is one of the biggest, baddest server chips on the market,” Prickett Morgan added in his in-depth analysis on Wednesday.

“And with the two generations of ‘Bixby’ interconnects that Oracle has cooked up to create ever-larger shared memory systems, Oracle could put some very big iron with a very large footprint into the field, although it has yet to push those interconnects to their limits.”

Biometric data becomes the encryption key in Fujitsu system

Fujitsu says it has developed software that uses biometric data directly as the basis for encryption and decryption of data, simplifying and strengthening security systems that rely on biometrics such as fingerprints, retina scans and palm vein scans.

Current security systems that rely on encryption require the management of encryption keys, which are stored on secure smartcards or directly on PCs. Biometric scans can be used as a way of authenticating the user and providing access to those encryption keys in order to decrypt data.

Fujitsu’s system uses elements extracted from the biometric scan itself as a part of a procedure to encrypt the data, making the biometric scan an integral part of the encryption system and removing the need for encryption keys.

That has two big benefits, according to the company.

The lack of encryption keys means there’s no need for smartcards and hackers won’t have anything to find should they break into a network.

The second major benefit comes from biometric data use with cloud services. With current systems, a user’s biometric data is potentially vulnerable as it’s sent over the Internet to allow log-in to a service. Because Fujitsu’s new system uses random numbers to convert the biometric data as part of the encryption and decryption process, unconverted data is not transmitted over a network.

The procedure employs error correction to smooth out slight differences in successive biometric scans that are the result of variations in a user’s position or motion when the scan is taken.
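
As a toy illustration of that idea (the scheme below is a simple repetition code, not Fujitsu’s actual method), majority voting can absorb small scan-to-scan differences in a bit string before it is used for key derivation:

```python
# Toy error correction for biometric-style bit strings: each key bit is
# stored as a 3-bit repetition code, so one flipped bit per group (noise
# between successive scans) is corrected by majority vote. Real systems
# use far stronger codes; this only sketches the principle.
def encode(bits):
    return [b for b in bits for _ in range(3)]

def decode(noisy):
    groups = [noisy[i:i + 3] for i in range(0, len(noisy), 3)]
    return [1 if sum(g) >= 2 else 0 for g in groups]

key = [1, 0, 1, 1]
stored = encode(key)

noisy = stored[:]
noisy[4] ^= 1  # simulate one bit of scan-to-scan noise

print(decode(noisy) == key)  # True: the original bits are recovered
```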

At present, the system has been developed to work with palm vein authentication, a technology that Fujitsu has spent years developing and has already deployed on systems like bank ATMs in Japan. But the company said it could readily be adapted to work with other biometric data such as fingerprints or retina scans.

The software was developed by Fujitsu Laboratories and two Japanese universities, Kyushu University and Saitama University, and is being presented this week at the 8th International Symposium on Foundations and Practice of Security in Clermont-Ferrand, France.

Tech Companies and Civil Liberties Groups Force Obama To Weigh In On Encryption Debate

President Obama will now be forced to publicly describe the extent of his commitment to protecting strong encryption, after nearly 50 major technology companies, human rights groups, and civil liberties collectives—including Twitter, the ACLU, and Reddit — succeeded in getting over 100,000 signatures on a White House petition on Tuesday.

The government’s “We the People” platform, created in 2011, was designed as “a clear and easy way for the American people to petition their government.” Once a petition gains 100,000 signatures, it is guaranteed a response.

The savecrypto.org petition demands that Obama “publicly affirm your support for strong encryption” and “reject any law, policy, or mandate that would undermine our security.”

FBI director James Comey has been preaching about the dangers of end-to-end encryption for the past year, saying it blocks law enforcement from monitoring communications involving criminals and terrorists. He’s asked for special access into encrypted communications — a “back door” or “front door.”

However, technologists and privacy advocates insist that any hole in encryption for law enforcement can be exploited by hackers.

Comey testified earlier this month before the Senate Homeland Security and Governmental Affairs Committee that the White House was not seeking legislation to force companies to build backdoors into their products—at least not yet.

However, top intelligence community lawyer Robert S. Litt wrote in a leaked e-mail obtained by the Washington Post that public opinion could change “in the event of a terrorist attack or criminal event” where encryption stopped law enforcement from detecting the threat. He recommended “keeping our options open for such a situation.”

Now, the White House will have to speak for itself.

“More than 100,000 users have now spoken up to ask the Administration to make a strong statement in support of data security – no back doors, no golden keys, no exceptional access,” said Amie Stepanovich, the U.S. Policy Manager for digital rights group Access Now, one of the founding organizations of the petition along with the Electronic Frontier Foundation. “We thank those who have stood with us and look forward to President Obama’s response.”

Your self-encrypting hard drive isn’t nearly as secure as you thought

If you want to keep your information away from hackers and snoops, whether it’s your Internet use, email, hard drive data or your backup, the best thing you can do is use encryption. Encryption scrambles your data and, in theory, the only way to unscramble it is to know the password. That’s why choosing a strong password no one can guess is important.

This is also what makes a ransomware virus that encrypts your files so dangerous: without paying for the decryption password, you can’t get your files back. Unfortunately for your security, encryption isn’t always as secure as you’d hope.

Without going into too much technical detail, there are a lot of ways that encryption can happen, from the method it uses to encrypt the data to how many bits it uses. For example, you’ll see 128-bit AES and 256-bit AES show up a lot in programs and Web encryption. There’s SHA-1 and SHA-2 from the NSA. For your router, you’ll see options like WEP, WPA TKIP, WPA2 AES and more.
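
To make those bit counts concrete, Python’s standard hashlib module can produce both SHA-1 and SHA-2 family digests (a quick illustration of digest sizes, not an endorsement of SHA-1):

```python
import hashlib

# Hash the same message with SHA-1 (160-bit) and SHA-256 (a SHA-2 variant).
msg = b"the quick brown fox"

sha1 = hashlib.sha1(msg).hexdigest()
sha256 = hashlib.sha256(msg).hexdigest()

# Each hex character encodes 4 bits, so digest length in bits = chars * 4.
print(len(sha1) * 4)    # 160
print(len(sha256) * 4)  # 256
```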

Unfortunately, not all encryption is created equal. For centuries, mathematicians and cryptographers have been coming up with and breaking encryption schemes. As computers have gotten more powerful, encryption that should have taken centuries to crack can fail in seconds.

That’s why you don’t see much 64-bit encryption anymore, why using WEP on your router is the same as having no encryption, and why large organizations are moving from SHA-1 to SHA-2.

Of course, this is way more than the average person should have to think about. You should be able to trust that every company is using the best encryption possible in the products you buy and use. Unfortunately, that often isn’t the case, and we just got a fresh reminder.

Western Digital’s hard drive encryption is useless. Totally useless

The encryption systems used in Western Digital’s portable hard drives are pretty pointless, according to new research.

WD’s My Passport boxes automatically encrypt data as it is written to disk and decrypt the data as it is read back to the computer. The devices use 256-bit AES encryption, and can be password-protected: giving the correct password enables the data to be successfully accessed.

Now, a trio of infosec folks – Gunnar Alendal, Christian Kison and “modg” – have tried out six models in the WD My Passport family, and found blunders in the software designs.

For example, on some models, the drive’s encryption key can be trivially brute-forced, which is bad news if someone steals the drive: decrypting it is child’s play. And the firmware on some devices can be easily altered, allowing an attacker to silently compromise the drive and its file systems.

“We developed several different attacks to recover user data from these password-protected and fully encrypted external hard disks,” the trio’s paper [PDF] [slides PDF] states.

“In addition to this, other security threats are discovered, such as easy modification of firmware and on-board software that is executed on the user’s PC, facilitating evil maid and badUSB attack scenarios, logging user credentials, and spreading of malicious code.”

My Passport models using a JMicron JMS538S micro-controller have a pseudorandom number generator that is not cryptographically safe, and only cycles through a series of 255 32-bit values. This generator is used to create the data encryption key, and the drive firmware leaks enough information about the random number generator for this key to be recreated by brute-force, we’re told.

“An attacker can regenerate any DEK [data encryption key] generated from this vulnerable setup with a worst-case complexity of close to 2^40,” the paper states.

“Once the DEK [data encryption key] is recovered, an attacker can read and decrypt any raw disk sector, revealing decrypted user data. Note that this attack does not need, nor reveals, the user password.”
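
As a rough sketch of why so small a pool of generator outputs is fatal, the toy Python below enumerates every possible state (the key-derivation function here is a hypothetical stand-in, not the drive’s actual firmware logic):

```python
import hashlib

# Toy model: if the data-encryption key is derived from one of only 255
# possible PRNG states, an attacker can simply try them all.
def derive_key(state):
    # Hypothetical stand-in for the firmware's key derivation.
    return hashlib.sha256(state.to_bytes(4, "big")).digest()

def brute_force(matches_known_data):
    # 255 trials is effectively instantaneous on any hardware.
    for state in range(255):
        if matches_known_data(derive_key(state)):
            return state
    return None

# Suppose the drive happened to use state 123; the attacker recovers it
# by checking each candidate key against known plaintext/ciphertext.
secret_key = derive_key(123)
found = brute_force(lambda k: k == secret_key)
print(found)  # 123
```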

Drive models using a JMicron JMS569 controller – which is present in newer My Passport products – can be forcibly unlocked using commercial forensic tools that access the unencrypted system area of the drive, we’re told.

Drives using a Symwave 6316 controller store their encryption keys on the disk, encrypted with a known hardcoded AES-256 key stored in the firmware, so recovery of the data is trivial.

Meanwhile, Western Digital says it is on the case.

“WD has been in a dialogue with independent security researchers relating to their security observations in certain models of our My Passport hard drives,” spokeswoman Heather Skinner told The Register in a statement.

“We continue to evaluate the observations. We highly value and encourage this kind of responsible community engagement because it ultimately benefits our customers by making our products better. We encourage all security researchers to responsibly report potential security vulnerabilities or concerns to WD Customer Service.”

NSA, Apple Chiefs Decode Encryption Views

LAGUNA BEACH, Calif.—The heads of the National Security Agency and the world’s most valuable company appeared to try to make nice Monday night over their contrasting views on encryption—to a point.

NSA Director Adm. Michael Rogers and Apple Inc. Chief Executive Tim Cook, appearing at The Wall Street Journal’s technology conference, WSJDLive, spoke in broad terms about encryption in back-to-back interviews.

Asked about efforts by Apple and other tech firms to build products that protect user data and communications from law enforcement, Mr. Rogers said, “Strong encryption is in our nation’s best interest.”

But asked if that included impenetrable encryption, he quickly interrupted, “That’s not what I said.”

Mr. Cook, appearing later, disagreed on the latter point. “I don’t know a way to protect people without encrypting,” he said. “You can’t have a backdoor that’s only for the good guys.”

Apple and federal officials have been at odds for more than a year, since Apple issued a new version of its mobile-operating system that it said safeguards user information, even from law enforcement. But the White House signaled recently that it won’t seek new laws to force tech companies to make products that allow law enforcement to eavesdrop.

Messrs. Cook and Rogers said both sides in the encryption debate need to turn down the vitriol. “Reasonable people can have discussions and figure out how to move forward,” Mr. Cook said.

On other subjects, Mr. Cook said Apple has 15 million users on its streaming music service, including 6.5 million paying subscribers.

Apple launched Apple Music on June 30, offering every user a three-month trial period. Once the trial period ends, customers pay $9.99 a month for individual users and $14.99 for families. The first batch of customers came off the trial period at the end of September.

Mr. Cook also spoke unusually frankly about the automobile industry, although he declined to address Apple’s interest in building an electric car. The Apple CEO said he sees a “massive change” coming in the automobile industry as major technologies shift the sector away from today’s combustion-engine focus.

He said he sees software, electrification and autonomous driving technologies playing a crucial role in the cars of the future. “That industry is at an inflection point for massive change, not just evolutionary change,” he said.

The NSA may have been able to crack so much encryption thanks to a simple mistake

The NSA could have gained a significant amount of its access to the world’s encrypted communications thanks to the high-tech version of reusing passwords, according to a report from two US academics.

Computer scientists J. Alex Halderman and Nadia Heninger argue that a common mistake made with a regularly used encryption protocol leaves much encrypted traffic open to eavesdropping by a well-resourced and determined attacker such as the US National Security Agency.

The information about the NSA leaked by Edward Snowden in the summer of 2013 revealed that the NSA broke one sort of encrypted communication, virtual private networks (VPNs), by intercepting connections and passing some data to the agency’s supercomputers, which would then return the key shortly after. Until now, it was not known what those supercomputers might be doing, or how they could return a valid key so quickly, when attacking a VPN head-on should take centuries, even with the fastest computers.

The researchers say the flaw exists in the way much encryption software applies an algorithm called Diffie-Hellman key exchange, which lets two parties efficiently communicate through encrypted channels.

A form of public key cryptography, Diffie-Hellman lets users communicate by swapping “keys” and running them through an algorithm which results in a secret key that both users know, but no-one else can guess. All the future communications between the pair are then encrypted using that secret key, and would take hundreds or thousands of years to decrypt directly.

But the researchers say an attacker may not need to target it directly. Instead, the flaw lies in the exchange at the start of the process. Each person generates a public key – which they tell to their interlocutor – and a private key, which they keep secret. But they also rely on a common public parameter: a (very) large prime number which is agreed upon at the start of the process.
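In code, the exchange described above looks something like this toy Python sketch (the parameters are far too small for real security, where the 1024-bit or larger primes discussed later are used; the variable names are illustrative):

```python
import secrets

# Toy Diffie-Hellman key exchange. The prime below (2**64 - 59) is public
# and shared -- this is the "common" parameter that is often reused.
p = 0xFFFFFFFFFFFFFFC5  # shared prime, agreed upon in the open
g = 2                   # shared generator, also public

# Each party generates a random private key and keeps it secret...
a = secrets.randbelow(p - 2) + 2   # Alice's secret
b = secrets.randbelow(p - 2) + 2   # Bob's secret

# ...and publishes only g^secret mod p.
A = pow(g, a, p)  # Alice sends this to Bob
B = pow(g, b, p)  # Bob sends this to Alice

# Both sides derive the same shared secret without ever transmitting it:
# (g^b)^a = (g^a)^b = g^(ab) mod p.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```

An eavesdropper who sees p, g, A and B must solve the discrete logarithm problem to recover the secret, which is what makes precomputation against a single widely shared prime so valuable to an attacker.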

Since those prime numbers are public anyway, and since it is computationally expensive to generate new ones, many encryption systems reuse them to save effort. In fact, the researchers note, one single prime is used to encrypt two-thirds of all VPNs and a quarter of SSH servers globally, two major security protocols used by a number of businesses. A second is used to encrypt “nearly 20% of the top million HTTPS websites”.

The problem is that, while there’s no need to keep the chosen prime number secret, once a given proportion of conversations are using it as the basis of their encryption, it becomes an appealing target. And it turns out that, with enough money and time, those commonly used primes can become a weak point through which encrypted communications can be attacked.

In their paper, the two researchers, along with a further 12 co-authors, describe their process: a single, extremely computationally intensive “pre-calculation” which “cracks” the chosen prime, letting them break communications encrypted using it in a matter of minutes.

How intensive? For “shorter” primes (512 bits long, about 150 decimal digits), the precalculation takes around a week – fast enough that, after the attack was disclosed under the catchy name “Logjam”, major browsers were changed to reject shorter primes entirely. But even for the gold standard of the protocol, using a 1024-bit prime, a precalculation is possible, for a price.

The researchers write that “it would cost a few hundred million dollars to build a machine, based on special purpose hardware, that would be able to crack one Diffie-Hellman prime every year.”

They add: “Based on the evidence we have, we can’t prove for certain that NSA is doing this. However, our proposed Diffie-Hellman break fits the known technical details about their large-scale decryption capabilities better than any competing explanation.”

There are ways around the problem. Simply using a unique common prime for each connection, or even for each application, would likely reduce the payoff from the year-long precomputation enough to make it uneconomical. Similarly, switching to a newer cryptographic standard (“elliptic curve cryptography”, which uses the properties of a particular type of algebraic curve instead of large prime numbers to encrypt connections) would render the attack ineffective.

But that’s unlikely to happen fast. Some implementations of Diffie-Hellman literally hard-code the prime, making it difficult to change overnight. As a result, “it will be many years before the problems go away, even given existing security recommendations and our new findings”.

“In the meantime, other large governments potentially can implement similar attacks, if they haven’t already.”

The next steps for the White House on encryption

The Obama administration’s decision not to seek legislation requiring technology companies to give law enforcement access to encrypted communications on smartphones has a certain logic. In this age of hacking and cyberintrusion, encryption can keep most people safer. But the decision also carries risks. Encryption can give a tiny band of criminals and terrorists a safe haven. The United States must now make the most of the useful side of encryption, but without losing sight of the risks.

FBI Director James B. Comey warned last year that law enforcement might be “going dark” because technology companies, including Apple and Google, are introducing ways for users to send encrypted messages by smartphones that can be unlocked only by the users, not by the companies. Mr. Comey was alarmed this would give criminals and terrorists a place to communicate that was beyond reach even of law enforcement with a court order. Mr. Comey suggested Congress require tech companies to provide what is known as extraordinary access to encrypted information, a “lawful intercept” capability, sometimes referred to as a backdoor, or a special key for the government. We sympathized with Mr. Comey’s appeal and urged all sides to look for a compromise.

No compromise was forthcoming. The reaction to Mr. Comey’s suggestion in the technology world was a strong protest that any weakening of encryption — even a tiny bit, for a good reason — creates a vulnerability for all. The firms also made the argument that encryption can be a positive force in today’s chaotic world of cyberattacks; their customers want absolute privacy, too, for the digital lives held on the smartphones in their pockets. They also pointed out that if backdoor access is granted to the U.S. government, it will provide cover for authoritarian governments such as China and Russia to demand the same or worse.

Mr. Comey said last week that private talks with the tech companies have been “increasingly productive.” That is promising. There are methods the FBI might use to crack encryption case by case or to find the information elsewhere. The FBI and state and local law enforcement are most in need; the National Security Agency has much stronger tools for breaking encryption overseas.

Having stood up to Mr. Comey, Silicon Valley should demonstrate the same fortitude when it comes to China and Russia and absolutely refuse to allow intrusions by these and other police states. It would help, too, if President Obama articulated the principle loud and clear.

That leaves a nagging worry. The United States is a rule-of-law nation, and encryption technology is creating a space that is in some ways beyond the reach of the law. Encryption may indeed be valuable to society if it protects the majority. But what if it enables or protects the 1 percent who are engaged in criminality or terrorism? That threat has to be taken into account, and so far it remains unresolved. It will not go away.

Aadhaar encryption protects privacy, will take eons to crack

The Aadhaar system’s data collection and storage is strongly protected by sophisticated encryption processes to ensure biometric data does not leak either through private contractors running enrollment centres or at the central data servers that store the details.

The Unique Identification Authority of India’s (UIDAI) processes are intended to allay fears that biometric data collected by private contractors might fall into unauthorized hands, as the biometric detail is encrypted using the strongest available public-key cryptography.

Even if the data is stolen or lost, the encryption prevents access to the biometrics, since even the most powerful computers would need literally eons to crack the code. Similarly, at the central data centre, the encryption processes are repeated while storing the details, making attempts to access and use the data very difficult.

The government hopes that the lack of human interface in storing the data, and procedures such as data collectors being required to authenticate every entry through their own biometric verification, will help convince the Supreme Court that privacy concerns have been addressed by the UIDAI.

The UIDAI programme’s success is indicated by the lack of any credible complaints or proof of misuse of data since the ambitious scheme started almost five years ago. This is partly due to processes that make even the loss of a recording machine, or the copying of data onto a flash drive, a futile exercise.

The data are collected using Enrollment Client (EC) software, written, maintained and provided by the UIDAI, and are encrypted to prevent leaks both at the enrollment centres managed by private vendors and in transit. The private agencies on the ground use the EC software, which ensures that only authentic, approved persons can sign in for the purpose of enrolling people.

The enrollment client software used by private vendors strongly encrypts the individual electronic files containing residents’ demographic and biometric details (enrollment data packets) at the time of enrollment, even before the data is saved to any hard disk.

The encryption uses the strongest widely available standards (2048-bit public-key cryptography and AES-256), with each data record having a built-in mechanism to detect any tampering.
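The article does not describe the tamper-detection mechanism itself. A standard way to build one is a keyed message authentication code (MAC) computed over each record; the Python sketch below is an illustrative stand-in for that general pattern, not the actual UIDAI design, and all names in it are hypothetical:

```python
import hashlib
import hmac
import secrets

# Hypothetical tamper-detection tag: an HMAC-SHA256 over the record bytes,
# keyed with a secret held alongside the encryption keys.
mac_key = secrets.token_bytes(32)
record = b'{"name": "A. Resident", "biometric_ref": "..."}'

# Tag computed at enrollment time and stored with the record.
tag = hmac.new(mac_key, record, hashlib.sha256).digest()

def is_untampered(record: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(mac_key, record, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

assert is_untampered(record, tag)                 # intact record verifies
assert not is_untampered(record + b"x", tag)      # any change is detected
```

Because the tag depends on a secret key, an attacker who alters a record cannot forge a matching tag, which is what gives such a scheme its tamper-evidence.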

The data packets are always stored on disk in PKI-encrypted form and are never decrypted or modified in transit, making them completely inaccessible to any intermediate system or person.

Among other security measures, UIDAI has ensured that the Aadhaar database is not linked to any other database or to information held in other databases. Its only purpose is to verify a person’s identity at the point of receiving a service, and that too with the consent of the Aadhaar number holder.

Encrypted Smartphones Challenge Investigators

Law-enforcement officials are running up against a new hurdle in their investigations: the encrypted smartphone.

Officials say they have been unable to unlock the phones of two homicide victims in recent months, hindering their ability to learn whom those victims contacted in their final hours. Even more common, say prosecutors from New York, Boston and elsewhere, are locked phones owned by suspects, who refuse to turn over passcodes.

Manhattan District Attorney Cyrus Vance says his office had 101 iPhones that it couldn’t access as of the end of August, the latest data available.

The disclosures are the latest twist in a continuing dispute between law-enforcement officials and Apple Inc. and Google Inc., after the two tech companies released software last year that encrypted more data on new smartphones. The clash highlights the challenge of balancing the privacy of phone users with law enforcement’s ability to solve crimes.

“Law enforcement is already feeling the effects of these changes,” Hillar Moore, the district attorney in Baton Rouge, La., wrote to the Senate Judiciary Committee in July. Mr. Moore is investigating a homicide where the victim’s phone is locked. He is one of 16 prosecutors to send letters to the committee calling for back doors into encrypted devices for law enforcement.

The comments are significant because, until now, the debate over encrypted smartphones has been carried on largely by federal officials. But local police and prosecutors handle the overwhelming share of crimes in the U.S., and district attorneys say encryption gives bad guys an edge.

Encrypted phones belonging to victims further complicate the issue, because some families want investigators to have access to the phones.

“Even if people are not terribly sympathetic to law-enforcement arguments, this situation might cause them to think differently,” said Paul Ohm, a Georgetown University Law Center professor and former prosecutor.

Last week, Federal Bureau of Investigation Director James Comey told a Senate hearing that the administration doesn’t want Congress to force companies to rewrite their encryption code. “The administration is not seeking legislation at this time,” White House National Security Council spokesman Mark Stroh said in a written statement Monday.

Some independent experts say the handful of cases that have emerged so far isn’t enough to prove that phone encryption has altered the balance between law enforcement and privacy. In many cases, they say, investigators can obtain the encrypted information elsewhere, from telephone companies, or because the data was backed up on corporate computers.

“It depends on what the success rate is of getting around this technology,” said Orin Kerr, a George Washington Law professor.

Apple encrypted phones by default beginning with iOS 8, the version of its mobile-operating system released last fall. The decision came amid public pressure following former national-security contractor Edward Snowden’s revelations of tech-company cooperation with government surveillance.

With iOS 8, and the newly released iOS 9, Apple says it cannot unlock a device with a passcode. That means Apple cannot provide information to the government on users’ text messages, photos, contacts and phone calls that don’t go over a telephone network. Data that isn’t backed up elsewhere is accessible only on the password-protected phone.

“We have the greatest respect for law enforcement and by following the appropriate legal process, we provide the relevant information we have available to help,” Apple wrote in a statement to The Wall Street Journal.

Apple Chief Executive Tim Cook is an advocate of encryption. “Let me be crystal clear: Weakening encryption, or taking it away, harms good people that are using it for the right reasons,” he said at a conference earlier this year.

Only some phones running Google’s Android Lollipop system, such as the Nexus 6 and the Nexus 9, are encrypted by default. Google declined to comment about the role of encryption in police investigations.

Three of the 16 district attorneys who wrote to the Senate—from Boston, Baton Rouge and Brooklyn—told the Journal they were aware of cases where encrypted phones had hindered investigations. Investigators in Manhattan and Cook County in Illinois also have cases dealing with encrypted phones. Investigators say, however, they have no way of knowing whether or not the locked phones contain valuable evidence.

Mr. Moore, of Baton Rouge, thinks there might be important information on a victim’s phone. But he can’t access it.

Brittany Mills of Baton Rouge used her iPhone 5s for everything from sending iMessages to writing a diary, and she didn’t own a computer, her mother said. Ms. Mills, a 28-year-old patient caregiver, was shot to death at her door in April when she was eight months pregnant.

Police submitted a device and account information subpoena to Apple, which responded that it couldn’t access anything from the device because it was running iOS 8.2. Mr. Moore thinks the iCloud data Apple turned over won’t be helpful because the most recent backup was in February, two months before her death. The records he obtained of her phone calls yielded nothing.

“When something as horrible as this happens to a person, there should be no roadblock in the way for law enforcement to get in there and catch the person as quickly as possible,” said Barbara Mills, Brittany Mills’s mother.

Investigators in Evanston, Ill., are equally stumped by the death of Ray C. Owens, 27. Mr. Owens was found shot to death in June with two phones police say belonged to him, an encrypted iPhone 6 and a Samsung Galaxy S6 running Android. A police spokesman said the Samsung phone is at a forensics lab, where they are trying to determine if it is encrypted.

The records that police obtained from Apple and service providers had no useful information, he added. Now the investigation is at a standstill.

“In the past this would have been easy for us,” said Evanston Police Commander Joseph Dugan. “We would have an avenue for this information, we’d get a subpoena, obtain a record, further our investigation.”

Barbara Mills is committed to making sure more families don’t have to see cases go unsolved because of phone encryption. “Any time you have a situation of this magnitude, if you can’t depend on law enforcement, who can you depend on?”

US Government Will No Longer Push For User’s Encrypted Data

Last year Google and Apple (and other companies) made some changes to the way encryption was handled. Instead of Google and Apple holding the keys to the encryption, they gave the keys to their customers. This meant that law enforcement agencies could no longer ask these companies to turn over encrypted user data.

If they want the data, they have to convince the users themselves to give it up, something the FBI was not happy about. For a while the government did not give up its quest to force tech companies to turn over encrypted user data, but all of that has since changed.

According to reports, the Obama administration has finally backed down in its battle against the tech companies over encrypted data. This means that if you were worried these tech companies would one day be forced to install back doors that are supposedly only for government access, you won’t have to worry about that anymore.

The tech companies’ argument was basically that installing back doors, even if only for the “good guys”, could leave their products and services open to hacks. While this is no doubt a big victory, some are skeptical that this is the end of it.

According to Peter G. Neumann, one of the nation’s leading computer scientists, “This looks promising, but there’s still going to be tremendous pressure from law enforcement. The NSA is capable of dealing with the cryptography for now, but law enforcement is going to have real difficulty with this. This is never a done deal.”

Obama administration has decided not to seek a legislative remedy now

FBI Director James Comey told a congressional panel that the Obama administration won’t ask Congress for legislation requiring the tech sector to install backdoors into their products so the authorities can access encrypted data.

Comey said the administration for now will continue lobbying private industry to create backdoors to allow the authorities to open up locked devices to investigate criminal cases and terrorism.

“The administration has decided not to seek a legislative remedy now, but it makes sense to continue the conversations with industry,” Comey told a Senate panel of the Homeland Security and Governmental Affairs Committee on Thursday.

Comey’s comments come as many in the privacy community were awaiting a decision by the administration over whether it would seek such legislation. Many government officials, including Comey himself, have called for backdoors, and there has been intense lobbying by the White House to pressure the tech sector into providing one. Congress, meanwhile, has remained virtually silent on an issue that echoes the so-called Crypto Wars.

The president’s public position on the topic, meanwhile, has been mixed. Obama had said he is a supporter and “believer in strong encryption” but also “sympathetic” to law enforcement’s need to prevent terror attacks.

The government’s lobbying efforts, at least publicly, appear to be failing to convince tech companies to build backdoors into their products. Some of the biggest names in tech, like Apple, Google, and Microsoft, have publicly opposed allowing the government a key to access their consumers’ encrypted products. All the while, some government officials, including Comey, have railed against Apple and Google for selling encrypted products where only the end-user has the decryption passcode.

The tech sector pressed the same case in a letter to Obama.

The government cannot force the tech sector to build encryption end-arounds. The closest law on the books is the Communications Assistance for Law Enforcement Act of 1994, known as CALEA. The measure generally demands that telecommunication companies make their phone networks available to wiretaps.

Obama administration opts not to force firms to decrypt data — for now

After months of deliberation, the Obama administration has made a long-awaited decision on the thorny issue of how to deal with encrypted communications: It will not — for now — call for legislation requiring companies to decode messages for law enforcement.

Rather, the administration will continue trying to persuade companies that have moved to encrypt their customers’ data to create a way for the government to still peer into people’s data when needed for criminal or terrorism investigations.

“The administration has decided not to seek a legislative remedy now, but it makes sense to continue the conversations with industry,” FBI Director James Comey said at a Senate hearing Thursday of the Homeland Security and Governmental Affairs Committee.

The decision, which essentially maintains the status quo, underscores the bind the administration is in — between resolving competing pressures to help law enforcement and protecting consumer privacy.

The FBI says it is facing an increasing challenge posed by the encryption of communications of criminals, terrorists and spies. A growing number of companies have begun to offer encryption in which the only people who can read a message, for instance, are the person who sent it and the person who received it. Or, in the case of a device, only the device owner has access to the data. In such cases, the companies themselves lack “backdoors” or keys to decrypt the data for government investigators, even when served with search warrants or intercept orders.

The decision was made at a Cabinet meeting Oct. 1.

“As the president has said, the United States will work to ensure that malicious actors can be held to account – without weakening our commitment to strong encryption,” National Security Council spokesman Mark Stroh said. “As part of those efforts, we are actively engaged with private companies to ensure they understand the public safety and national security risks that result from malicious actors’ use of their encrypted products and services.”

But privacy advocates are concerned that the administration’s definition of strong encryption also could include a system in which a company holds a decryption key or can retrieve unencrypted communications from its servers for law enforcement.

“The government should not erode the security of our devices or applications, pressure companies to keep and allow government access to our data, mandate implementation of vulnerabilities or backdoors into products, or have disproportionate access to the keys to private data,” said Savecrypto.org, a coalition of industry and privacy groups that has launched a campaign to petition the Obama administration.

To Amie Stepanovich, the U.S. policy manager for Access, one of the groups signing the petition, the status quo isn’t good enough. “It’s really crucial that even if the government is not pursuing legislation, it’s also not pursuing policies that will weaken security through other methods,” she said.

The FBI and Justice Department have been talking with tech companies for months. On Thursday, Comey said the conversations have been “increasingly productive.” He added: “People have stripped out a lot of the venom.”

He said the tech executives “are all people who care about the safety of America and also care about privacy and civil liberties.”

Comey said the issue afflicts not just federal law enforcement but also state and local agencies investigating child kidnappings and car crashes — “cops and sheriffs … [who are] increasingly encountering devices they can’t open with a search warrant.”

One senior administration official said the administration thinks it’s making enough progress with companies that seeking legislation now is unnecessary. “We feel optimistic,” said the official, who spoke on the condition of anonymity to describe internal discussions. “We don’t think it’s a lost cause at this point.”

Legislation, said Rep. Adam Schiff (D-Calif.), is not a realistic option given the current political climate. He said he made a recent trip to Silicon Valley to talk to Twitter, Facebook and Google. “They quite uniformly are opposed to any mandate or pressure — and more than that, they don’t want to be asked to come up with a solution,” Schiff said.

Law enforcement officials know that legislation is a tough sell now. But, one senior official stressed, “it’s still going to be in the mix.”

On the other side of the debate, technology, diplomatic and commerce agencies were pressing for an outright statement by Obama to disavow a legislative mandate on companies. But their position did not prevail.

Daniel Castro, vice president of the Information Technology & Innovation Foundation, said absent any new laws, either in the United States or abroad, “companies are in the driver’s seat.” He said that if another country tried to require companies to retain an ability to decrypt communications, “I suspect many tech companies would try to pull out.”

Risk Analysis, Encryption Stressed in HITECH Act Final Rules

Two final rules for the HITECH electronic health record incentive program strongly emphasize the value of risk assessments and encryption as measures for safeguarding patient information.

A new rule establishing requirements for proving a provider is a “meaningful user” for Stage 3 of the incentive program requires protecting patient data through the implementation of appropriate technical, administrative and physical safeguards and conducting a risk analysis that includes assessing encryption of ePHI created or maintained by a certified electronic health record.

A companion final rule setting 2015 standards for certifying EHR software as qualifying for the program requires the software to be capable of creating hashes using an algorithm with security strength equal to or greater than SHA-2.

The Department of Health and Human Services’ Centers for Medicare and Medicaid Services says the Stage 3 requirements are optional in 2017. Providers who choose to begin Stage 3 in 2017 will have a 90-day reporting period. However, all providers will be required to comply with Stage 3 requirements beginning in 2018 using EHR technology certified to the 2015 Edition requirements.

When it comes to privacy and security requirements included in the final rules, versus what was in the proposed rules, there were “no significant changes, no surprises,” says John Halamka, CIO of Beth Israel Deaconess Medical Center.

Some privacy and security experts, however, point out the rules spotlight the importance of safeguarding electronic protected health information through measures such as risk analysis, encryption and secure data exchange. But some observers criticize HHS for not offering more detailed guidance on risk assessments.

Risk Analysis

While conducting a risk analysis was also a requirement in Stages 1 and 2 of the meaningful use program, the final rule for Stage 3 requires that healthcare providers drill down further by “conducting or reviewing a security risk analysis … including addressing the security – to include encryption – of electronic protected health information created or maintained by certified electronic health record technology … and implement security updates as necessary and correct identified security deficiencies.”

The objective of that requirement is to protect electronic health information through the implementation of “appropriate technical, administrative and physical safeguards,” the rule states. Rulemakers stress assessing the data created or maintained by an electronic health record system, versus conducting a more comprehensive security risk assessment as required under the HIPAA Security Rule.

“Although [HHS’] Office for Civil Rights does oversee the implementation of the HIPAA Security Rule and the protection of patient health information, we believe it is important and necessary for a provider to attest to the specific actions required to protect ePHI created or maintained by CEHRT in order to meet the EHR incentive program requirements,” the rule notes. “In fact, in our audits of providers who attested to the requirements of the EHR Incentive Program, this objective and measure are failed more frequently than any other requirement.

“This objective and measure are only relevant for meaningful use and this program, and are not intended to supersede what is separately required under HIPAA and other rulemaking. We do believe it is crucial that all [eligible healthcare providers] evaluate the impact CEHRT has on their compliance with HIPAA and the protection of health information in general.”

New to the risk analysis requirement is the addition of assessing administrative and technical safeguards. “This measure enables providers to implement risk management security measures to reduce the risks and vulnerabilities identified. Administrative safeguards – for example, risk analysis, risk management, training and contingency plans – and physical safeguards – for example, facility access controls, workstation security – are also required to protect against threats and impermissible uses or disclosures to ePHI created or maintained by CEHRT.”

Missed Opportunity?

HHS should have used the final rule to offer even more helpful guidance about risk assessments, says privacy attorney David Holtzman, vice president of compliance at the security consulting firm CynergisTek.

“CMS focused significant attention to the role of risk analysis in safeguarding the privacy and security of health information created or maintained in an EHR,” he says. “However, they missed an important opportunity to … ensure that administrative and physical safeguards requirements of the HIPAA Security Rule are assessed in any security risk analysis.”

To guide healthcare providers, including smaller doctors’ offices, in conducting the Stage 3 risk analysis, the rule makes note of free tools and resources available to assist providers, including a Security Risk Assessment Tool developed by ONC and OCR.

But the use of that tool is daunting for some smaller healthcare entities, contends Keith Fricke, principal consultant at consulting firm tw-Security.

“The SRA tool is too overbearing for any organization to use, let alone small healthcare organizations, including small provider offices,” he says.

Secure Data Exchange

Besides a renewed focus on risk analysis, other privacy and security related enhancements to the meaningful use Stage 3 final rule include an emphasis on encryption and secure messaging.

“More than half of the objectives in Stage 3 starting in 2017 require EHRs to have interoperable exchange technology that is encrypted and offered to relying parties with strong identity assurance,” said David Kibbe, M.D., CEO of DirectTrust, which created and maintains a framework for secure e-mail in the healthcare sector.

“DirectTrust’s work can and will be relied upon for multiple Stage 2 and 3 objectives and criteria announced by CMS in the new rule,” he says.

For instance, secure electronic messaging to communicate with patients on relevant health information is an objective in Stage 3, with a series of measurements.

Software Certification Rule

While privacy and security are woven through the final rule for Stage 3 of the meaningful use program for healthcare providers, HHS’ Office of the National Coordinator for Health IT also raised the bar on requirements in the final rule for 2015 Edition health IT software certification. That includes phasing in requirements for more robust encryption.

“Given that the National Institute of Standards and Technology, technology companies, and health IT developers are moving away from SHA-1, we believe now is the appropriate time to move toward the more secure SHA-2 standard,” ONC wrote in its rulemaking.

The rule also states: “We note that there is no requirement obligating health IT developers to get their products certified to this requirement immediately, and we would expect health IT developers to not begin seeking certification to this criterion until later in 2016 for implementation in 2017 and 2018. We further note that certification only ensures that a health IT module can create hashes using SHA-2; it does not require the use of SHA-2. For example, users of certified health IT may find it appropriate to continue to use SHA-1 for backwards compatibility if their security risk analysis justifies the risk.”
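The distinction the rule draws, certifying the capability to create SHA-2 hashes rather than mandating their use, is easy to see in code. A minimal sketch using Python's standard hashlib (the message here is illustrative):

```python
import hashlib

# Illustrative message; a real system would hash records or documents.
msg = b"ePHI audit record"

# Legacy SHA-1: 160-bit (40 hex character) digest, being phased out.
print(hashlib.sha1(msg).hexdigest())

# SHA-2 family (here SHA-256): 256-bit (64 hex character) digest.
print(hashlib.sha256(msg).hexdigest())
```

As the rule notes, a certified module merely has to be able to produce the SHA-2 digest; a site may keep using SHA-1 for backward compatibility if its risk analysis justifies the risk.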

Some other safeguard features, such as data segmentation for privacy of sensitive health information, are included in the software certification rule as optional, Halamka notes. “That’s appropriate for immature standards,” he says.

Public Input

CMS is continuing to seek public comment on the “meaningful use” rule for 60 days. This input could be considered by CMS for future policy developments for the EHR incentive program, as well as other government programs, the agency says.

However, this additional public comment period could become problematic, Holtzman contends. “The adoption of the changes in the objective and measures as a ‘final rule with comment’ could cause delays in EHR vendors and developers in producing upgrades to their technology. The uncertainty in that CMS could make further changes in the months ahead might encourage these industry partners to hold off in their production process.”

CHK File Recovery Has Been Updated to Version 1.09

CHK File Recovery is an excellent recovery tool specialized in recovering CHK files quickly and easily, and it has recently been updated to version 1.09. This new version fixes a bug that prevented one file type from being identified, and adds one more recoverable file type.

Change Log of CHK File Recovery 1.09:

File Name: CHK File Recovery

Version: 1.09

File Size: 2.64MB

Category: CHK File Recovery Software

Language: English

License type: Trial Version

OS Support: Win2000/XP/VISTA/Win 7/Win 8

Released on: Sept.30, 2015

Download Address: http://www.dogoodsoft.com/chk-file-recovery/free-download.html

What’s New in This Version:

* Improved the accuracy of judgement on Office file types.

+ Added 55 recoverable file types.

Why Choose CHK File Recovery:


CHK File Recovery is an excellent recovery tool specialized in recovering CHK files in a quick and easy way. CHK File Recovery can accurately and quickly recover more than 180 common file types, such as mp3, mp4, jpg, bmp, gif, png, avi, rm, mov, mpg, wma, wmv, doc, docx, xls, xlsx, ppt, pptx, zip, rar, exe, dll, sql, mdb, psd.

CHK File Recovery determines file types automatically by default. For file types that cannot be recognized automatically, manual identification is used to confirm the type: the tool can inspect the content of an unknown file through four methods and then recover it.
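Content-based identification of this kind generally works by matching a file's leading bytes against known signatures. A minimal sketch of the idea (the signature table and function name are illustrative, not taken from the tool):

```python
# Map of well-known "magic byte" prefixes to file types.
SIGNATURES = {
    b"\x89PNG\r\n\x1a\n": "png",
    b"\xff\xd8\xff": "jpg",
    b"PK\x03\x04": "zip/docx/xlsx",
    b"%PDF": "pdf",
}

def sniff(data: bytes) -> str:
    """Guess a file type from its leading bytes."""
    for magic, ftype in SIGNATURES.items():
        if data.startswith(magic):
            return ftype
    return "unknown"

print(sniff(b"%PDF-1.7 sample content"))  # pdf
```

A real recovery tool maintains a far larger table (the 180+ types mentioned above) and may also inspect internal structure when the prefix alone is ambiguous.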

The interface of CHK File Recovery is simple and clear, and the tool is easy to use. Simply select a drive and click Search, and CHK File Recovery scans the whole drive automatically. The CHK files found are then shown in the list on the left of the application, grouped by their original file type. You can also choose to search and scan a specific folder.

National Encryption Policy: Not just privacy, but also feasibility and security are at risk


Encryption is an important aspect that governs not just communications but also storage. When data is in motion, several methods and protocols facilitate end-to-end encryption:

1. VPN

2. Remote Server Connectivity viz. RDP, SSH

3. Internet based Voice/ Messaging Communications

4. email communication

5. Communications between Wearables and their Host devices

6. Web-Services providing encryption services viz. Etherpad, Gist

However, when it concerns data at rest, i.e. data stored on disk, there are numerous scenarios that fall under the purview of encryption:

1. On the Fly Disk Encryption which may also include the entire OS

2. Password protection of files

3. email Message Encryption

4. Full disk-encryption by Smartphones
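Scenarios 2 and 4 above typically start from a password, from which an encryption key is derived. A minimal sketch of that derivation step using PBKDF2 from Python's standard library (the password and iteration count here are illustrative):

```python
import hashlib
import os

# Deriving an encryption key from a password, the step underlying
# "password protection of files". Illustrative values only.
password = b"correct horse battery staple"
salt = os.urandom(16)  # random, per-file salt
key = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

print(key.hex())  # 32 bytes: a key suitable for a 256-bit cipher
```

The derived key, not the password itself, is what the cipher actually uses; the salt and iteration count deliberately slow down password-guessing attacks.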

Recently, the Government of India released its draft National Encryption Policy, only to withdraw it within 24 hours of release, though with a promise that the policy would be re-drafted and re-released.

In those 24 hours, members of the Indian Internet security community took up the cause of protecting user privacy, reprimanding the government for the ill-conceived draft of the National Encryption Policy. Their efforts forced the government to revoke the draft proposal and contemplate a better one.

According to the draft, the B2B, B2C and C2B sectors shall use encryption algorithms and key sizes as prescribed by the government. Moreover, the draft states:

“On demand, the user shall be able to reproduce the same Plain text and encrypted text pairs using the software/ hardware used to produce the encrypted text from the given plain text. Such plain text information shall be stored by the user/ organization/ agency for 90 days from the date of transaction and made available to Law Enforcement Agencies as and when demanded in line with the provisions of the laws of the country.”

Furthermore, the draft also issued guidelines for communication with foreign entities: “the primary responsibility of providing readable plain-text along with the corresponding Encrypted information shall rest on entity (B or C) located in India.”

The draft policy requires service providers, irrespective of their country of origin, to enter into an agreement with the Government of India, and the consumers of these services (government, business, citizens) are expected to provide the plain-text/encrypted datasets.

The question is not why, but how it would be technically feasible for a customer to maintain this information, given that encryption was used to secure the data from rogue entities in the first place. Storing anything in plain text for any period defeats the entire purpose of using encryption, save for the small solace that the channel used for transmission is secured. The draft sets impossibly high expectations: every citizen and organization, irrespective of their field of expertise, is expected to understand the internal workings of these third-party applications and, at the same time, to know how to maintain the two different datasets.

Furthermore, the draft requires that anything encrypted by an individual, be it personal documents or communication between two people (which, interestingly, the rest of the world considers a private affair), be made available for scrutiny as and when demanded.

Expecting a consumer of various services, whether an organization or an individual, to understand the internal functionality of each and every service and software product, and to take a conscious decision to maintain two separate datasets, is simply not feasible.

The government did issue a clarification, exempting the following from the policy:

1. Mass-use encryption products currently used in web applications, social media sites, and social media applications such as WhatsApp, Facebook, Twitter, etc.

2. SSL/TLS encryption products used in Internet banking and payment gateways, as directed by the Reserve Bank of India

3. SSL/TLS encryption products used for e-commerce and password-based transactions

Even so, it still raises quite a few eyebrows, especially about the intent behind the drafting of this National Encryption Policy. Not just privacy, but also feasibility and security are at risk.

The argument so far has concerned data residing on your disk; applying these very standards, what can we say about encrypted communication channels and services? One word summarizes it all: impossible. Over-the-network encryption such as VPN or SSH, and cloud-based services of any type, which have lately made inroads into our lives, would be rendered useless, and their very existence in India would be at risk. Not only would it be mandatory for all of them to enter into an agreement with the Government of India, but the consumers of these services would also have to maintain a separate copy of the content.

Applications and service providers that offer secure messaging, i.e. encrypted voice channels or self-destructing messages, in order to provide better privacy and discourage eavesdropping, would in all probability be banned, or might have to remove these features to cater to the Indian audience. Above all, how do the policy-makers expect consumers to comply?

What happens when a person from a different country uses these services in India? Wouldn’t this person be violating the Indian Law and in all probability be considered a criminal?

The draft also requires all the stakeholders to use Symmetric Cryptographic/Encryption products with AES, Triple DES and RC4 encryption algorithms and key sizes up to 256 bits.

Way back in 2011, Microsoft researchers discovered a way to break AES-based encryption faster than brute force; Triple DES is considered weak; and RC4 is simply not acceptable as an encryption algorithm to any organization. These are ageing algorithms that are rarely, if ever, considered when organizations draw up their own encryption policies.
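RC4's weakness is partly a consequence of its simplicity; the whole cipher fits in a few lines. A sketch in Python, for illustration only (RC4 should never be used to protect real data):

```python
# RC4 in full: key scheduling (KSA) followed by keystream generation
# (PRGA), with the keystream XORed against the data.
def rc4(key: bytes, data: bytes) -> bytes:
    S = list(range(256))
    j = 0
    for i in range(256):                        # key-scheduling algorithm
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out = bytearray()
    i = j = 0
    for byte in data:                           # pseudo-random generation
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

print(rc4(b"Key", b"Plaintext").hex())  # bbf316e8d940af0ad3
```

Because RC4 simply XORs data with a keystream, applying it twice with the same key returns the plaintext; statistical biases in that keystream are among the reasons it is no longer trusted.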

In this age of competition, organizations have trade secrets to guard, not just from competitors but also from rogue governments. A weakened encryption scheme and mandatory storage of encrypted data in plain-text form is nothing less than hara-kiri for these organizations. Moreover, by way of an agreement, the draft expects software and hardware vendors to comply with these encryption restrictions, thereby weakening the overall security of India’s IT infrastructure.

A National Encryption Policy should be about setting minimum encryption standards for data protection, penalizing organizations and institutions that fail to implement strong encryption standards, and protecting data from pilferage and leakage.

Encryption policy has always had a direct impact on the privacy of individuals, and when encryption is used by corporations and organizations, it affects their business and trade secrets; hence the government should also consider the various means and ways of implementing and strengthening the country’s non-existent privacy laws.

As we have been promised that the policy would be re-drafted, let us keep our fingers crossed and hope that better sense prevails.

Data encryption policy blamed on lack of talent, key changes: Report


The whole draft encryption policy episode has left netizens with a bitter-sweet taste. And now, the blame game has begun.

Soon after the government retracted the policy, saying it was simply worded badly, which led to the confusion, it blamed a junior scientist for the fiasco. An official told The Economic Times: “You think anything in the government moves without due procedure? All I can tell you is that all rules and regulations were followed.”

The report adds that some officials said the junior officer didn’t seek the advice of higher-ups, while others said they were out of the country.

Citing an official of a Big Four consultancy firm who didn’t want to reveal his identity, the report adds that DeitY has undergone several changes, which could have affected its functioning and decision-making.

The post of director general of the National Informatics Centre (NIC), which is responsible for managing the technology of the entire government machinery, has been vacant for more than a year. However, a senior officer said there are many competent people who can take on additional responsibilities.

The government had released a draft encryption policy aimed at keeping tabs on the use of technology by specifying the algorithms and encryption key lengths to be used by ‘all’. It wanted businesses, telcos and Internet companies to store all encrypted data in plain text for 90 days and present it to law enforcement agencies whenever asked. Failure to do so would invite legal action under the laws of the country.

After a huge outcry, the government put out an addendum exempting products such as social media sites (including WhatsApp, Facebook and Twitter), payment gateways, and e-commerce and password-based transactions from the draft policy. The outcry finally led the government to withdraw the draft policy.

Draft encryption policy: Frequent changes in key positions & talent crunch in DeitY led to the debacle


As the blame game for the fiasco created by the draft National Encryption Policy plays out, experts are asking if frequent changes in key positions and a talent crunch in the Department of Electronics and Information Technology (DeitY) led to the debacle.

After the government held a junior scientist responsible, officers in the department are now pointing fingers at each other, while maintaining all along that due procedure was followed.

“You think anything in the government moves without due procedure? All I can tell you is that all rules and regulations were followed,” said an official who requested anonymity. The draft policy, which proposed that social media text messages be stored for scrutiny by the government, was withdrawn after a public outcry.

Another set of officials alleged that the junior officer did not seek the advice of higher-ups before making the policy public. Some officials said they were out of the country when the policy was released online and others said they were not involved in framing it, laying the blame squarely on the junior official.

The episode has led experts to ask whether organisational instability in DeitY over the past few months led to the embarrassment. The department, which is part of the Ministry of Communications and IT, has the mandate of running the government’s ambitious Digital India project. However, several key posts have been lying vacant for many months, and DeitY has also seen several changes, including to the posts of secretary, additional secretary and joint secretary, over the last month.

“Unfortunately, DeitY has gone through a number of changes very frequently. Every change affects function and decision making,” said an official of a Big Four consultancy firm, who requested not to be identified.

While the position of the director general of the National Informatics Centre (NIC), which manages technology of the entire government machinery, has been lying vacant for over a year, the key post of director general of the Computer Emergency Response Team (CERT) has not been filled after Gulshan Rai was appointed national cyber security chief under the PMO in March.

CERT is responsible for warding off and fighting cyber attacks. While ministry officials have been given additional charge of these positions, this may be adding to instability and workloads. The nodal officer for the encryption policy is supposed to be the group coordinator for cyber law, but there is confusion in the ministry over who holds that post after Rai moved to the PMO.

Even the National e-Governance Division and the Controller of Certifying Authorities have been run by acting chiefs for months now. Appointments to the positions of additional secretary (e-governance) and joint secretary (electronics) are also awaited.

“Though vacancies and frequent changes are routine in the government, the secretary, additional secretary and joint secretary, all in charge of the same function – e-governance – should not have been changed at the same time, especially with all the focus on Digital India,” said another technology consultant. The person added that because of these vacancies, several key initiatives such as restructuring of NIC have been stuck.

Ministry officials, while conceding that there are vacancies, countered by saying that business in the government never stops. “There are lots of competent people in the department to take on additional responsibilities,” said a senior official of the department.

The first consultancy official said there is a vacuum in the department in terms of the second rung of leadership.

Encryption policy poorly worded by officer: Telecom Minister Ravi Shankar Prasad


The government has blamed a junior official, a scientist, for the encryption policy fiasco, saying he was responsible for the poor and confusing wording of the document and that he failed to seek advice from his higher-ups before making it public.

Several officials in the Communications and IT Ministry that ET spoke to admitted that the timing of the release of the draft policy, just before Prime Minister Narendra Modi’s US visit, couldn’t have been worse, prompting its immediate withdrawal.

Speaking exclusively to ET, telecom minister Ravi Shankar Prasad blamed poor wording for his directing the withdrawal of the policy, which gave the impression that subscribers could become legally liable to store messages exchanged through WhatsApp, Facebook, Google and other social media platforms for up to 90 days, and to produce them before the authorities if asked. The government’s intent was to make the social media and messaging companies liable to store information for the 90-day period.

“I read the draft. I understand that the manner in which it is written can lead to misconceptions. I have asked for the draft policy to be withdrawn and reworded,” Prasad said. “There was a misuse of word ‘users’ in the draft policy, for which the concerned officer has been taken to task.”

He explained that the wrong use of the phrase ‘users of encryption’ instead of ‘creators of encryption’ had led to all the confusion. Prasad added that the scientist, who was part of the expert committee under the Department of Electronics and Information Technology (DeitY), was responsible for the confusion. The expert panel had been tasked with framing a national policy on encryption, which is crucial for the national policy on cyber security.

Internally, senior officials in the ministry admitted the timing of the draft policy’s release was all wrong, with Modi set to travel to the US and meet, among others, Facebook CEO Mark Zuckerberg, other tech leaders, and many from the Indian diaspora.

“This is bad timing for sure. Modi would surely have faced very uncomfortable questions at what is expected to be a very high-profile visit,” one of the officials told ET. Another official said the person tasked with coordinating and putting the policy together should have shown it to the joint secretary, the secretary or someone in the minister’s office before releasing it for public consultation. “This is the basics, especially for something which could be controversial. But it was messed up,” he said, adding that reworking the policy and putting it in the public domain could take around three weeks.

On Tuesday, the government was forced to withdraw the controversial draft encryption policy just over 12 hours after making it public, after it came under severe criticism, especially on social media, for its move to make individuals legally bound to retain personal chats and messages on social networking sites for 90 days and provide them to law enforcement authorities if asked.

The draft policy was met with severe criticism citing invasion of privacy, forcing DeitY to clarify within a few hours on Monday that chats on popular social networking sites like WhatsApp and Facebook were exempted. On Tuesday, it withdrew the draft in its entirety.

Prasad urged citizens not to misunderstand the policy. “Firstly this is a draft policy not the final policy and we have sought the comments of all stakeholders. There has always been a need for a policy on encryption given the spurt in online transactions through net banking, ecommerce, and so on,” Prasad said.

“However, no attempt will ever be made to jeopardize the rights of netizens and this government’s commitment to social media and the rights of netizens is unwavering,” he added. Dismissing speculation that the government had withdrawn the policy owing to severe media backlash or political pressure, Prasad said the country needed a robust encryption policy for security reasons.

One of the officials cited above said that the essence of the reworked draft policy will remain the same, but it will be reworded. “The final policy could also require the companies to set up servers in India,” he added.

According to sources, the Intelligence Bureau (IB) had demanded that the government make it mandatory for all companies to keep data for up to one year, but the Ministry of Communications and IT brought the period down to 90 days.

The policy seeks to require all creators of ‘encryption codes’ to register with the government. Secondly, the Department of IT will from time to time notify standardized algorithms that companies may use. “We will only standardize the algorithms based on global practices; the formula of the encryption codes will remain with the creators only,” the official said.

At present, an internet service provider licence allows encryption of only up to 40 bits, but banks, e-commerce companies and communication services use much higher levels of encryption.
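The gap between those limits is easy to quantify: each extra key bit doubles the number of possible keys. A quick back-of-the-envelope comparison:

```python
# Average brute-force effort scales with keyspace size:
# each additional key bit doubles the number of possible keys.
keys_40 = 2 ** 40    # the 40-bit licence ceiling
keys_256 = 2 ** 256  # modern practice, e.g. AES-256

print(keys_40)                          # 1099511627776 (about 1.1 trillion)
print(keys_256 // keys_40 == 2 ** 216)  # True: a 2**216-fold gap
```

A 40-bit keyspace of roughly a trillion keys is searchable with commodity hardware, which is why banks and payment gateways long ago moved to far larger keys.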

National Encryption Policy: Government Issues Clarification on WhatsApp, Social Media


The government issued an addendum to clarify that “mass use encryption products, which are currently being used in web applications, social media sites, and social media applications such as WhatsApp, Facebook, Twitter etc.” are exempt from the policy. While that language is vague in itself, you can rest easy without needing to worry about having to store your WhatsApp messages for 90 days. The original text continues below.

The DeitY has posted a draft National Encryption Policy on its website inviting comments from the public on its mission, strategies, objectives, and regulatory framework, which you can send to akrishnan@deity.gov.in, until 16th October 2015. A lot of the details mentioned in the draft guidelines are worrying, and this is a topic that concerns every consumer.

While the draft encryption policy’s preamble starts by talking about improving e-governance and e-commerce through better security and privacy measures, it very quickly brings up national security as well, and that’s where things get worrying from a consumer’s perspective. It’s very reminiscent of when the Indian government was thinking about banning BBM in India unless BlackBerry (then Research in Motion) gave security agencies access to snoop on emails. The two would eventually reach an arrangement that allowed the government to intercept email.

The language of the new draft policy is quite clear on one thing – businesses and consumers may use encryption for storage and communication, but the encryption algorithms and key sizes will be prescribed by the Indian government. What’s more, vendors of encryption products would have to register in India (with the exception of mass use products, such as SSL), and citizens are allowed to use only the products registered in India.

“Would OpenPGP, a commonly-used standard for encryption of email, fall under ‘mass use’?” asks Pranesh Prakash, Policy Director at the Centre for Internet and Society, speaking to Gadgets 360. “Because if it doesn’t, I am prohibited from using it. But if it does, I am required to copy-paste all my encrypted mails into a separate document to store it in plain text, as required by the draft policy. Is that what it really intends? Has the government thought this through?”


Most people don’t explicitly use encryption, but it’s built into apps they use every day. Do the draft guidelines also extend to products and services with built-in encryption, like WhatsApp? If yes, and the language certainly suggests it does, then combined with the requirements the draft guidelines place on citizens, we could have some very worrying scenarios.

The draft guidelines read “All citizens (C), including personnel of Government/ Business (G/B) performing non-official/ personal functions, are required to store the plaintexts of the corresponding encrypted information for 90 days from the date of transaction and provide the verifiable Plain Text to Law and Enforcement Agencies as and when required as per the provision of the laws of the country.”

WhatsApp messages are now encrypted end-to-end. So do the draft guidelines mean you have to store a copy of all your WhatsApp messages for 90 days? What about Snapchat, or any other form of ephemeral messaging that is automatically deleted after being read? The consumer is expected to maintain plain-text copies of all communications for 90 days, so that these can be produced if required by the laws of the land. So will it even be legal to read a message that deletes itself, if and when the draft guidelines become law?

The draft policy document states that the vision is to create an information security environment and secure transactions. But the actual details in the draft appear to do the opposite, focusing instead on limiting encryption to technologies that could likely be intercepted by the government when required.

This is in many ways similar to the Telecom Regulatory Authority of India’s draft consultation on Net Neutrality, which instead talked about issues like cyberbullying and ‘sexting’. In the feedback period, Trai received over 1 million emails. But the Department of Telecom report on Net Neutrality also went against public sentiment on certain points, suggesting that telcos should be allowed to charge extra for specific services, such as Skype or WhatsApp voice calls in India, showing that calls for feedback aren’t necessarily taken seriously.

The draft National Encryption Policy shares another problem with the Net Neutrality discussions: vague language. The result is that there is very little clarity at this point on what will and will not be permitted by the government if the draft guidelines are adopted. We’re living in a time when the government talks about how WhatsApp and Gmail may be used by “anti-national elements”, and has even considered requiring Twitter and Facebook to establish servers in India.

With that in mind, you have to ask: will it even be legal to use WhatsApp if these guidelines are implemented? After all, WhatsApp messages have end-to-end encryption, and if the service does not register in India and comply with the algorithms prescribed by the government, then as a citizen of India you won’t be allowed to use it, because “users in India are allowed to use only the products registered in India,” as per the draft guidelines.

These are questions that don’t just affect a few people, but just about every Indian who is using the mobile Internet. In its present form, the draft actually severely limits what you can do online, and could hobble the push for a digital India. There’s almost a full month to give our feedback, but is anyone listening?

Best Disk Lock Has Been Updated to Version 2.60

Best Disk Lock, which can completely hide disk partitions, has been updated to version 2.60. In this new version, we have improved the stability of disk advanced-lock, added a check for disks unsuitable for locking, and fixed a bug that caused an error during software uninstallation.

Change Log of Best Disk Lock 2.60:

File Name: Best Disk Lock

Version: 2.60

File Size: 3.38MB

Category: System Security Software

Language: English

License type: Trial Version

OS Support: Win2000/XP/VISTA/Win 7/Win 8

Released on: Sept.21, 2015

Download Address: http://www.dogoodsoft.com/best-disk-lock/free-download.html

What’s New in This Version:

* Improved the stability of disk advanced-lock.

+ Added a check for disks unsuitable for locking.

– Fixed a bug that caused an error during software uninstallation.

Why Choose Best Disk Lock:


Best Disk Lock is a powerful utility that can completely hide disk partitions and CD-ROM drives on your PC, and disable USB storage devices or set them as read-only. A hidden partition cannot be found in any environment by anyone else, so the security and confidentiality of your data on this partition can be ensured.

Experts pick big holes in India’s encryption policy

India’s proposed encryption policy has come under heavy fire with internet experts and online activists alleging that it provides blanket backdoors to law enforcement agencies to access user data, which could be abused by hackers and spies.


The Department of Electronics and Information Technology (DeitY) has asked for public comments on the ‘Draft National Encryption Policy’ on its website until October 16. The stated mission of the policy on encryption, the practice of scrambling data to make it unintelligible even to the service providers, is to “provide confidentiality of information in cyber space for individuals, protection of sensitive or proprietary information for individuals & businesses, (and) ensuring continuing reliability and integrity of nationally critical information systems and networks”.

However, almost all the experts ET spoke to, while agreeing that a policy for encryption is a welcome move, felt that the policy document in its current form is not well thought-out and makes suggestions that could harm businesses and individuals, and thwart research and development in the field of encryption. The most contentious provision in the draft policy document is perhaps the one requiring businesses and individuals to keep a plain text copy of the data they encrypt for storage and communication, for 90 days, and make it available to law enforcement agencies “as and when demanded in line with the provisions of the laws of the country”.

“The mission of the policy is to promote national security and increase confidentiality of information, but it specifically excludes ‘sensitive departments/agencies’, which most need such protection. The content of the policy shows why they have been excluded: the policy, in fact, decreases security and confidentiality of information,” said Pranesh Prakash, policy director at the Centre for Internet and Society. “If our emails, for example, are required to be kept in plain text rather than in encrypted form, then that makes it easier for hackers and foreign agencies to spy on our government, businesses, and on all Indian citizens,” he said.

Raman Jit Chima, policy director at digital rights organisation Access, said that instead of promoting the use of encryption, the policy draft “appears to seek to heavily regulate encryption and the rules it proposes will likely impede its usage by Indian developers and startups”. “By trying to restrict and weaken the everyday usage of encryption in order to facilitate tapping demands, the everyday communications of all Indians will likely become less secure,” Chima said.

The policy seeks to promote R&D in the field of cryptography by public and private companies, government agencies and academia, but it requires all vendors of encryption products to register their products with the government and re-register when their products are upgraded.

Arun Mohan Sukumar, cyber initiative head at Observer Research Foundation, said, “The government has finally realised the need to protect its communications infrastructure from cyber intrusions. But creating a ‘license raj’ of encrypted products and services, as this draft policy aims to, will only stunt cyber security research.”

Obama edges toward full support for encryption

President Obama recently called on the best minds in government, the tech sector and academia to help develop a policy consensus around “strong encryption” — powerful technologies that can thwart hackers and provide a profound new level of cybersecurity, but also put data beyond the reach of court-approved subpoenas.

From Obama on down, government officials stressed that they are not asking the technology sector to build “back doors” that would allow law enforcement and intelligence agencies to obtain communications in the event of criminal or terrorist acts.

That prospect drew an extremely negative reaction from the techies — and is still chilling the government-industry dialogue over the issue.

Instead, the government is saying that tech and communications companies themselves should have some way to unlock encrypted messages if law enforcement shows up with a subpoena.

Access to such messages could, in theory, be vital in real-time crises. Skeptical lawmakers, however, have said federal officials have offered no empirical data suggesting this has been a problem.

“One of the big issues … that we’re focused on, is this encryption issue,” Obama said during a Sept. 16 appearance before the Business Roundtable. “And there is a legitimate tension around this issue.”

Obama explained: “On the one hand, the stronger the encryption, the better we can potentially protect our data. And so there’s an argument that says we want to turbocharge our encryption so that nobody can crack it.”

But it wasn’t as simple as that.

“On the other hand,” Obama said, “if you have encryption that doesn’t have any way to get in there, we are now empowering ISIL, child pornographers, others to essentially be able to operate within a black box in ways that we’ve never experienced before during the telecommunications age. And I’m not talking, by the way, about some of the controversies around [National Security Agency surveillance]; I’m talking about the traditional FBI going to a judge, getting a warrant, showing probable cause, but still can’t get in.”

According to the president, law enforcement, the tech community and others are engaged in “a process … to see if we can square the circle here and reconcile the need for greater and greater encryption and the legitimate needs of national security and law enforcement.”

Obama summed up: “And I won’t say that we’ve cracked the code yet, but we’ve got some of the smartest folks not just in government but also in the private sector working together to try to resolve it. And what’s interesting is even in the private sector, even in the tech community, people are on different sides of this thing.”

However, the tech sector, writ large, has shown little interest in negotiating over strong encryption.

After a recent hearing of the House Intelligence Committee, Rep. Adam Schiff, D-Calif., said technology companies want the government to spell out what it wants, and that techies simply will not craft a policy in an area that should be free from government interference.

Tech companies are deeply concerned that American-made products will be seen in the global marketplace as tainted if they reach some kind of accommodation with the government. It’s all part of the continued international blowback from the revelations by ex-NSA contractor Edward Snowden, tech groups say.

Schiff visited with several Silicon Valley-based companies over the recent summer recess. “I was impressed by the companies’ position — it’s hard to refute. But what was unusual, more than one of the companies said government should provide its [proposed] answer in order to advance the discussion,” he said.

The tech sector, Schiff said, is unlikely to advance a policy position other than its opposition to any mandated “back door.”

“But there has to be some kind of resolution, even if it is acceptance of the status quo.”

Schiff and other lawmakers, including Senate Judiciary Chairman Charles Grassley, R-Iowa, are trying to encourage a dialogue between the tech sector and law enforcement.

FBI Director James Comey testified before the House Intelligence panel that such talks are underway, and have been productive so far.

“First of all, I very much appreciate the feedback from the companies,” Comey said at the Sept. 10 Intelligence Committee hearing. “We’ve been trying to engage in dialogue with companies, because this is not a problem that’s going to be solved by the government alone; it’s going to require industry, academia, associations of all kinds and the government.”

He stressed: “I hope we can start from a place we all agree there’s a problem and that we share the same values around that problem. … We all care about safety and security on the Internet, right? I’m a big fan of strong encryption. We all care about public safety.”

It was an extremely complicated policy problem, Comey agreed, but added, “I don’t think we’ve really tried. I also don’t think there’s an ‘it’ to the solution. I would imagine there might be many, many solutions depending upon whether you’re an enormous company in this business, or a tiny company in that business. I just think we haven’t given it the shot it deserves, which is why I welcome the dialogue. And we’re having some very healthy discussions.”

Tech sources contacted after the hearing suggested that Comey was overstating the level of dialogue now taking place.

The Obama administration has signaled that it isn’t looking for a legislative solution, which is just as well, because lawmakers including Schiff and Grassley have said that is a highly unlikely prospect.

But the administration probably needs to give a clearer signal of what it would like to see at the end of this dialogue before the tech side agrees to fully engage.

Science on the Hill: For cybersecurity, in quantum encryption we trust

As everyone becomes more interconnected on the Internet, personal information like bank and investment accounts, credit card numbers, home addresses and even social security numbers becomes more vulnerable to cybertheft. The same goes for the corporate world.

Identity theft struck 16.6 million Americans in 2012, the most recent year for which figures were available. According to the U.S. Department of Justice, financial losses hit $24.7 billion — at least $10 billion more than other property crimes. PBS Newshour reported that in 2014, 783 data breaches exposed 85 million records. This spring, hackers broke into the Anthem Health System, potentially gaining access to the health records of 80 million people.

One can’t build a concrete wall around this kind of information nor post an armed guard at every portal to the Internet. Keeping information secure depends on encryption. The security of electronic messages depends on the unpredictability of the random numbers used to scramble the data. Modern data centers have very limited access to true random numbers.

Current encryption methods are based on the difficulty of finding the right numbers in the key. The Achilles’ heel is that all encryption requires unpredictable, unguessable random numbers, and computers do not (generally) do unpredictable things. Large data centers, like those used by online shopping sites, aren’t good at generating truly random numbers in sufficient quantity to offer bulletproof encryption. So to provide truly secure data communications, we need a reliable source of unpredictable numbers that aren’t generated by a set of mathematical operations, or algorithm.

Los Alamos National Laboratory has specialized for decades in security and pushed the limits of computing. With that background, it’s only natural that we made it our business to improve data security with a solution from outside traditional computing. From the physicist’s point of view, the only true unpredictability comes from quantum mechanics. That’s why Los Alamos physicists developed a quantum random number generator and a quantum communication system, both of which exploit the weird and immutable laws of quantum physics to improve cybersecurity.
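
The gap between algorithmic and true randomness is easy to demonstrate. This illustrative sketch seeds Python’s default Mersenne Twister twice with the same value and reproduces an identical “random” stream — exactly the predictability the article warns about — then draws key material from the operating system’s entropy pool via the `secrets` module instead:

```python
import random
import secrets

# A deterministic PRNG: anyone who learns the seed can reproduce
# every "random" number, and therefore any key derived from them.
rng_a = random.Random(1234)
rng_b = random.Random(1234)
assert [rng_a.getrandbits(32) for _ in range(5)] == \
       [rng_b.getrandbits(32) for _ in range(5)]

# A cryptographically secure source draws on unpredictable entropy
# gathered by the operating system; there is no seed to steal or guess.
key = secrets.token_bytes(32)  # 256 bits of key material
print(len(key))  # 32
```

Hardware sources like the Entropy Engine described below feed exactly this kind of OS-level entropy pool, replacing the algorithmic step with physical unpredictability.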

These physical laws state that events at the subatomic level cannot be predicted; random quantum events lie at the root of the universe. From that starting point, we developed a revolutionary method to generate unpredictable, theoretically unhackable random numbers. Quantum mechanics itself guards the secret. Unlike current math-based encryption keys, which are derived from random numbers generated by a potentially knowable algorithm, a quantum key can’t be determined through calculation, no matter how powerful a computer one uses.

After thorough testing, we teamed with Whitewood Encryption Systems to commercialize a quantum random number generator, called the Entropy Engine. A plug-and-play computer card that fits most network servers, the Entropy Engine creates more than 200 million random numbers each second on demand and integrates with — and greatly improves — existing cryptographic methods over networks.

At the lab, we’ve also demonstrated an impregnable quantum communication system that sends a signal of polarized pulses of light over a fiber-optic cable. Under the peculiar laws of quantum physics, the photons, or light particles, encoding a message are in two different and unpredictable physical states. Because the act of intercepting a message over this quantum system alters the state of the photons, the sender is guaranteed to find out if someone is eavesdropping. The hacker never even gets a chance to examine the key.
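
The eavesdropping guarantee described above is the core idea of quantum key distribution protocols such as BB84. The toy simulation below — a purely classical sketch of the statistics, not real quantum optics or the lab’s actual system — shows the signature: with no interception, bits measured in matching bases agree perfectly, while an intercept-and-resend attacker pushes the observed error rate toward 25%, revealing her presence:

```python
import random

def bb84_error_rate(n_photons, eavesdrop, rng):
    """Toy BB84 sketch: error rate seen after discarding mismatched bases."""
    sent_bits  = [rng.randint(0, 1) for _ in range(n_photons)]
    sent_bases = [rng.randint(0, 1) for _ in range(n_photons)]
    photons = list(zip(sent_bits, sent_bases))

    if eavesdrop:
        # Eve must pick a basis to measure in; a wrong guess
        # irreversibly randomizes the bit she re-sends.
        intercepted = []
        for bit, basis in photons:
            eve_basis = rng.randint(0, 1)
            if eve_basis != basis:
                bit = rng.randint(0, 1)
            intercepted.append((bit, eve_basis))
        photons = intercepted

    recv_bases = [rng.randint(0, 1) for _ in range(n_photons)]
    recv_bits = [bit if rb == basis else rng.randint(0, 1)
                 for (bit, basis), rb in zip(photons, recv_bases)]

    # Keep positions where sender and receiver chose the same basis,
    # then compare a sample publicly: errors expose an eavesdropper.
    kept = [(s, r) for s, sb, r, rb
            in zip(sent_bits, sent_bases, recv_bits, recv_bases)
            if sb == rb]
    return sum(s != r for s, r in kept) / len(kept)

rng = random.Random(0)
print(bb84_error_rate(20000, eavesdrop=False, rng=rng))  # 0.0
print(bb84_error_rate(20000, eavesdrop=True,  rng=rng))  # ~0.25
```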

This communication system works over distances up to 100 miles. We’re now refining it for commercial use over longer distances and possibly even through the air to satellites. Combined with technology like the Entropy Engine, it could revolutionize cybersecurity worldwide. We envision a wide range of organizations deploying these technologies, including financial institutions, government agencies, health care organizations, large data centers and cloud servers.

Encryption, unhackable digital identities and secure digital signatures are indispensable to establishing trust in the digital world. As Whitewood rolls out the Entropy Engine across the global digital landscape and more quantum-computing technology follows, we can all breathe a little easier that our information is safe.

MainOne addresses data centre security concerns

Keeping customer data secure is of utmost importance in any organization. Therefore, compliance with security standards is vital.

In today’s world, a company cannot afford to experience breaches of customer information, transactional data or other important business information given the volume of business taking place online and the consequences of such breaches.

When considering colocation at a commercial data centre, compliance with global security standards, such as the Payment Card Industry Data Security Standard (PCI DSS) and information security standards, should feature as a key selection criterion.

Speaking about the measures put in place to ensure security of data, the CEO of MainOne, Funke Opeke, said that “security in the Data Centre cannot be over emphasised and MainOne’s Tier III Data Centre, MDX-I, recently subjected its operations to rigorous process improvements and audits to ensure compliance and certification on these key security standards.”

She confirmed that MDX-I was certified following a comprehensive ISO 27001 audit carried out by the British Standards Institution (BSI) Group, a business standards company that helps organisations all over the world make excellence a habit. “The PCI DSS assessment was conducted by Digital Jewels Limited, a PCI DSS QSA and an information value chain company, which also provided end-to-end support in preparing the Data Centre for certification to both standards. The audits measure the facilities at the Data Centre against several strict criteria, including physical access controls as well as information security policies, procedures and infrastructure,” she said.

Compromised data affects customers, business partners and financial institutions as a single incident and can severely damage a company’s reputation and its ability to conduct business effectively. Data breaches lead to catastrophic loss of sales due to damaged reputation and business relationships, lawsuits, insurance claims, cancelled accounts, payment card issuer fines and possible government fines or other liabilities.

In order to avoid possible breaches, colocating equipment with a provider that is compliant with global standards provides assurance that your systems and sensitive customer information remain secure.

Opeke reaffirmed that the “PCI DSS accreditation is the most comprehensive, internationally recognised data security standard, consisting of robust and comprehensive requirements and supporting materials to enhance payment card data security. These include a framework of specifications, tools, measurements and support resources to help organizations ensure the safe handling of cardholder information at every step. It provides an actionable framework for developing a robust payment card data security process, including prevention, detection and appropriate reaction to security incidents”.

In addition, the CEO mentioned that MDX-I holds the ISO 27001 certification, a globally recognised Information Security Management System (ISMS) standard, which ensures the right information security management environment, procedures and policies are in place to provide a high degree of security and assurance to customers.

According to her, outsourcing Data Centre operations to a colocation provider that embraces global security policies and procedures to mitigate data breaches remains the best choice for any business.

“We are the first data centre in Nigeria to be PCI DSS certified. This is the same security certification banks have to obtain when they are at the highest levels of performance for the security of their customers’ financial data. So if people are comfortable enough to provide personal data to banks, why wouldn’t they have confidence in a certified data centre that is ready to submit itself to external audit and verification to show that it is in compliance with standards? This actually gives you the peace of mind that you get the secure environment you need, as opposed to one that you are building in-house but may not have adequate resources to sustain on an ongoing basis. In this challenging environment, wouldn’t you rather have a certified partner to handle your data?” she said.

Obi Ibeto, Managing Director of Epinec Nigeria Limited, an ICT company in partnership with Microsoft, was asked how secure data centres are. He said: “Access to the physical data centre is highly restricted. With biometric readers, motion sensors, 24-hour secured access, video surveillance, and many other security features, in my opinion, the warehouses look like something from a James Bond movie”.

FBI director: Ability to unlock encryption is not a ‘fatal’ security flaw

In the tug-of-war between the government and U.S. companies over whether firms should hold a key to unlock encrypted communications, a frequent argument of technologists and privacy experts is that maintaining such a key poses a security threat.

But on Thursday, FBI Director James B. Comey pointed out that a number of major Internet companies do just that “so they can read our e-mails and send us ads.”

And, he said: “I’ve never heard anybody say those companies are fundamentally insecure and fatally flawed from a security perspective.”

Comey was airing a new line of government argument in the year-old public debate over the desirability of compelling Internet companies to provide a way for law enforcement to have access to decrypted communications.

Although he didn’t name names, he was alluding to major e-mail providers Google and Yahoo, which both encrypt customers’ e-mails as they fly between servers, but decrypt them once they land in order to scan them and serve customers relevant ads.

Comey, who spoke at a cyberthreats hearing held by the House Intelligence Committee, has been a leading voice advancing the concerns of law enforcement that the growing trend of strong encryption — where devices and some communications are encrypted and companies do not hold the keys to decode them — will increasingly leave criminal investigators in the dark.

The current debate, which echoes a bitter argument over encryption in the 1990s, was triggered by Apple’s announcement last September that it would expand the use of a method of encryption on its mobile operating system in which it did not hold a key. That meant Apple could no longer unlock troves of photos and other data stored on iPhones and iPads where the user had turned off the automatic backup to Apple’s servers. Such data “at rest” is useful in criminal investigations.

Of great concern to counterterrorism officials are communications encrypted in transit, such as text and instant messages, where the companies do not hold a key and where users have turned off automatic backups. Such end-to-end encryption is a feature of Apple’s iMessage and FaceTime (a video phone-call system), as well as Open Whisper Systems’ Signal and WhatsApp (both instant-messaging platforms).

But stored commercial e-mail is largely either unencrypted, or encrypted with a key known to the provider, Christopher Soghoian, principal technologist at the American Civil Liberties Union, said in an interview. And that’s a recipe for insecurity, he said.

“Any data that’s either unencrypted or encrypted with a key known to another party is inherently more vulnerable,” he said. He added that Google and Yahoo have been criticized for their lack of e-mail security, and the Chinese breach of Gmail announced in 2010 was a case in point.

During the hearing, Comey said that the bureau was “having some very healthy discussions” with companies on the issue. “I would imagine there might be many, many solutions depending upon whether you’re an enormous company in this business, or a tiny company in that business. I just think we haven’t given it the shot it deserves.”

Rep. Adam Schiff (D-Calif.) noted that the tech firms have stiff global competition. Other companies are offering encrypted platforms that customers might choose. “So what do we achieve, apart from harming our economic interests, by insisting on a key?” he said.

Comey said he thought that part of the solution would be “an international set of norms” in which other countries join with the United States to establish a rule that companies should be able to provide law enforcement with communications in the clear. “I hear from our allies all the time,” he said. “The French want the same thing. The Germans. The British. So I think that’s something that could be done.”

Soghoian noted, however, that more and more encryption platforms are being made available on the Internet for free by individuals or groups of open-source developers in the United States and Europe, which will make it difficult to regulate them.

Encryption and privacy are priorities for tech firms

The Justice Department and Microsoft go head-to-head in the U.S. Second Circuit Court of Appeals in Manhattan on Wednesday. The battleground? Data privacy.

At issue is the question of whether U.S. law enforcement can use a search warrant — in this case, in a drug investigation — to force the U.S.-based technology company to turn over emails it has stored in a data center in Ireland. Lower courts have sided with the government and held Microsoft in contempt for refusing to comply with the search warrant. Microsoft has appealed, arguing that its data center is subject to Irish and European privacy laws and outside the jurisdiction of U.S. authorities.

Civil liberties and internet-privacy advocates are watching the case closely, as are company and law-enforcement lawyers. They’re also watching another case, also involving a drug investigation, in which Apple was served with a court order instructing it to turn over text messages between iPhone owners.

After the Edward Snowden revelations, U.S. technology and telecom companies were criticized for allegedly letting the government spy on Americans’ emails, texts and video chats.

Many companies have been fighting back, hoping to burnish their images as protectors of their clients’ data privacy. Microsoft is fighting government access to overseas data centers. Apple has been rolling out strong “end-to-end” encryption, in which only the software on the sender’s and receiver’s devices (an iPhone or iPad) holds the requisite keys to decode the message. That means there’s no “back-door key” that could unlock an email or other communication. In addition, both Apple and Google have deployed private-code locking systems that make their smartphones essentially unbreakable, except by the phone’s owner, who sets the code.

“This way, the companies don’t open up the device,” says Peter Swire, an expert on computer security at Georgia Tech who served on President Obama’s task force on surveillance and cybersecurity. “The companies don’t have access to the content between Alice and Bob.”

If the company that made the device, or is carrying the communication on its network, can’t eavesdrop on users like Alice and Bob, he says, the FBI and other outside parties can’t either.
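
How can Alice and Bob end up with a key the carrier never sees? A textbook Diffie-Hellman exchange illustrates the principle behind such key agreement. The parameters here (p=23, g=5) are toy values chosen for readability — far too small to be secure, and not what any real messaging product uses — but the structure is the point: only public values cross the network.

```python
import secrets

# Toy Diffie-Hellman key agreement (insecure demo parameters).
p, g = 23, 5

a = secrets.randbelow(p - 2) + 1   # Alice's private key, never sent
b = secrets.randbelow(p - 2) + 1   # Bob's private key, never sent

A = pow(g, a, p)   # Alice transmits this public value
B = pow(g, b, p)   # Bob transmits this public value

# Each side combines the other's public value with its own secret.
# A relay that saw only A and B cannot compute this result.
shared_alice = pow(B, a, p)
shared_bob   = pow(A, b, p)
assert shared_alice == shared_bob  # identical key, never transmitted
```

Real systems run the same idea over elliptic curves with fresh keys per session, which is why neither the device maker nor the network operator holds anything useful to hand over.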

FBI director James Comey has said these new strong encryption technologies are making communications “go dark” for law enforcement. He claims the companies deploying this kind of encryption are hampering law-enforcement investigations.

But Nate Cardozo, a staff attorney at the Electronic Frontier Foundation, says law enforcement will just have to find other ways to gather information. And, he says, with so much non-encrypted information being gathered on private citizens and consumers these days (such as GPS location, purchases, social media “likes” and contacts, web browsing habits), law enforcement still has plenty of investigative tools.

“End-to-end encryption is coming,” he says, pointing to Apple and to Facebook, which recently bought WhatsApp, a popular global messaging platform that is deploying strong encryption. “It will keep us more safe from criminals, from foreign spies, from prying eyes in general.”

CHK File Recovery Has Been Updated to Version 1.082

CHK File Recovery is an excellent recovery tool specialized in recovering CHK files quickly and easily, and it has recently been updated to version 1.082. In this new version, we fixed a bug that prevented one file type from being identified, and added one more recoverable file type.

Change Log of CHK File Recovery 1.082:

File Name: CHK File Recovery

Version: 1.082

File Size: 2.63MB

Category: CHK File Recovery Software

Language: English

License type: Trial Version

OS Support: Win2000/XP/VISTA/Win 7/Win 8

Released on: Sept.09, 2015

Download Address: http://www.dogoodsoft.com/chk-file-recovery/free-download.html

What’s New in This Version:

1. Fixed a bug that prevented one file type from being identified.

2. Added one recoverable file type.

Why Choose CHK File Recovery:

CHK File Recovery is an excellent recovery tool specialized in recovering CHK files in a quick and easy way. CHK File Recovery can accurately and quickly recover more than 120 common file types, such as mp3, mp4, jpg, bmp, gif, png, avi, rm, mov, mpg, wma, wmv, doc, docx, xls, xlsx, ppt, pptx, zip, rar, exe, dll, sql, mdb, psd.

CHK File Recovery determines the file type automatically by default. For file types that cannot be recognized automatically, manual identification lets you confirm the type: you can inspect the content of an unknown file through four methods and then recover it.
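
Content-based identification generally works by matching the “magic bytes” at the start of a file against known signatures. The sketch below is illustrative only — it is not DoGoodsoft’s actual detection logic — and covers a few of the well-known signatures for formats the tool lists:

```python
# Map of leading "magic byte" signatures to file extensions.
MAGIC = {
    b"\xFF\xD8\xFF": "jpg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
    b"PK\x03\x04": "zip",   # also docx/xlsx/pptx containers
    b"Rar!\x1a\x07": "rar",
    b"MZ": "exe",           # also dll
}

def identify(data: bytes) -> str:
    """Guess a file's type from its first bytes, longest signature first."""
    for magic, ext in sorted(MAGIC.items(), key=lambda kv: -len(kv[0])):
        if data.startswith(magic):
            return ext
    return "unknown"

print(identify(b"\x89PNG\r\n\x1a\n" + b"\x00" * 16))  # png
print(identify(b"not a known format"))                 # unknown
```

A CHK file that has lost its name and extension still carries these leading bytes, which is why recovery by original file type is possible at all.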

The interface of CHK File Recovery is simple and clear, and the program is easy to use. You only need to select a drive and click Search; CHK File Recovery then scans the whole drive automatically. The CHK files found are shown in the list on the left of the application, grouped by their original file type. You can also choose to search and scan a specific folder.

Argument over strong encryption reaches boiling point as Apple, Microsoft rebuff court orders for data access

A long-running debate concerning recent advances in consumer data encryption came to a head this summer when Apple rebuffed a Justice Department court order demanding access to iMessage transcripts, causing some in the law enforcement community to call for legal action against the company.

Argument over strong encryption reaches boiling point as Apple, Microsoft rebuff court orders for data access

Over the summer Apple was asked to furnish real-time iMessage communications sent between two suspects in an investigation involving guns and drugs, reports The New York Times. The company said it was unable to provide such access as iMessage is protected by end-to-end encryption, a stance taken in similar cases that have over the past few months punctuated a strained relationship between the tech sector and U.S. law enforcement agencies.

Sources said a court action is not in the cards for Apple just yet, but another case involving Microsoft could set precedent for future cases involving strong encryption. Microsoft is due to argue its case in a New York appellate court on Wednesday after being taken to task for refusing to serve up emails belonging to a drug trafficking suspect. As the digital correspondence was housed in servers located in Dublin, Ireland, the company said it would relinquish the emails only after U.S. authorities obtained proper documentation from an Irish court.

Government agencies have posed hypothetical scenarios in which strong encryption systems, while good for the consumer, hinder or thwart time-sensitive criminal investigations. It appears those theories are being borne out in the real world.

Further confusing matters is a seemingly non-committed White House that has yet to decide on the topic either way. Apple and other tech companies are pressing hard to stop the Obama administration from agreeing to policy that would, in their eyes, degrade the effectiveness of existing data encryption technologies.

As for Apple, while some DOJ and FBI personnel are advocating to take the company to court, other officials argue that such an action would only serve to undermine the potential for compromise. Apple and other tech firms have privately voiced interest in finding a common ground, The Times reports. To that end, the publication notes Apple did indeed hand over a limited number of messages stored in iCloud pertaining to this summer’s investigation.

For its part, Apple is standing firm against government overtures calling for it to relinquish data stored on its servers. CEO Tim Cook outlined his thoughts on data privacy in an open letter to customers last year and came down hard on unlawful government snooping earlier this year.

Best Folder Encryptor Has Updated to Version 16.83

The professional file and folder encryption software Best Folder Encryptor has been updated to version 16.83. The previous version, 16.82, fixed a bug in which encrypted files and folders were not protected from deletion, copying and removal on 64-bit operating systems, fixed a missing password-verification check during folder bulk encryption, and resolved three other bugs. It also added a check for disks unsuitable for protection when protecting disks.

In this new version 16.83, we improved the stability of advanced disk protection, fixed three minor bugs in prompt messages, and expanded the file/folder size limit for Diamond, Full and Portable encryption to 990MB.

Change Log of Best Folder Encryptor:

File Name: Best Folder Encryptor

Version: 16.83

File Size: 3.70MB

Category: Folder Encryption, File Encryption

Language: English

License: Trial version

System Requirements: Win xp/vista/Win 7/Win 8

Released on: Aug.31, 2015

Download Address: http://www.dogoodsoft.com/best-folder-encryptor/free-download.html

What’s New in This Version:

* Improved the stability of advanced disk protection.

* Fixed three minor bugs in prompt messages.

* Expanded the file/folder size limit for Diamond, Full and Portable encryption to 990MB.

Why Choose Best Folder Encryptor:

Best Folder Encryptor is professional file and folder encryption software. It is superfast, with high security and confidentiality. Using internationally advanced encryption algorithms, encryption methods and file system drivers, the encrypted files and folders cannot be decrypted without the correct password, and are protected from copying, deletion and removal.

It is convenient to open and edit an encrypted folder or file with the Open feature, and you don’t have to re-encrypt it after use.

Besides, it supports many powerful features, such as data shredding (file/folder shredding), completely hiding hard drive partitions, and disabling USB storage devices or setting them as read-only. All this makes Best Folder Encryptor a flawless encryption tool and the best helper.

Vice News fixer ‘charged over encryption software’

Three staff members from Vice News were charged with “engaging in terrorist activity” because one of the men was using an encryption system on his personal computer which is often used by the Islamic State of Iraq and the Levant (ISIL), a senior press official in the Turkish government has told Al Jazeera.

Two UK journalists, Jake Hanrahan and Philip Pendlebury, along with their Turkey-based Iraqi fixer and a driver, were arrested on Thursday in Diyarbakir while filming clashes between security forces and youth members of the outlawed and armed Kurdistan Workers’ Party (PKK).

On Monday, the three men were charged by a Turkish judge in Diyarbakir with “engaging in terrorist activity” on behalf of ISIL; the driver was released without charge.

The Turkish official, who spoke on condition of anonymity, told Al Jazeera: “The main issue seems to be that the fixer uses a complex encryption system on his personal computer that a lot of ISIL militants also utilise for strategic communications.”

Speaking to Al Jazeera, Tahir Elci, the head of the Diyarbakir lawyers association, said: “I find it ridiculous that they were taken into custody. I don’t believe there is any accuracy to what they are charged for.

“To me, it seems like an attempt by the government to get international journalists away from the area of conflict.

“These people have obviously been in contact with YDG-H members (the youth wing of the PKK) because of their jobs, because they are covering stories. This might not have been welcomed by the security forces.”

Rejecting the accusations, the Turkish press official said: “This is an unpleasant incident, but the judiciary is moving forward with the investigation independently and, contrary to claims, the government has no role in the proceedings.”

‘Freedom of expression’

In response to the charges, Kevin Sutcliffe, Vice head of news programming for Europe, said on Monday that the judge “has levelled baseless and alarmingly false charges of ‘working on behalf of a terrorist organisation’ against three VICE News reporters, in an attempt to intimidate and censor their coverage.

“Prior to being unjustly detained, these journalists were reporting and documenting the situation in the southeastern Turkish province of Diyarbakir.

“Vice News condemns in the strongest possible terms the Turkish government’s attempts to silence our reporters who have been providing vital coverage from the region.

“We continue to work with all relevant authorities to expedite the safe release of our three colleagues and friends.”

In Brussels, EU spokeswoman Maja Kocijancic said on Tuesday: “Any country negotiating EU accession needs to guarantee the respect for human rights, including freedom of expression.”

The PKK and the Turkish state were engaged in a war for almost 30 years until a 2013 ceasefire was declared after the two sides held peace talks.

There have been clashes between security forces and protesters in different parts of Turkey following the unravelling of the ceasefire and the beginning of an air campaign by Turkey against the group.

When It Comes To Encryption, Our Policy Makers Could Learn A Thing Or Two From Thomas Jefferson

Thomas Jefferson was so interested in cryptography that he may have developed his own enciphering device after his mail was inspected by postmasters when the revolution was looming. Indeed, codes and ciphers are as American as the American Revolution itself; the revolution may not have happened if confidential correspondence, both military and otherwise, had been compromised by the British. In December 1801, Jefferson received an encrypted letter from a mathematics professor (both served at the American Philosophical Society) that was so inscrutable that he was never able to decode it; it was not decoded until more than 200 years later.

The thread of cipher text runs through the very core of the history of this country. When James Madison penned a letter to Thomas Jefferson in 1789, letting him know that “a Bill of rights, incorporated perhaps into the Constitution will be proposed, with a few alterations most called for by the opponents of the Government and least objectionable to its friends,” the letter was partially enciphered, so that discussion about who might run the Department of Finance, a smattering of international politics, and a bit of gossip about the French minister to the United States, the count de Moustier, and his sister-in-law, Madame de Brehan, wouldn’t fall into the wrong hands.

It’s hard to know when the narrative shifted, moving from trying to crack your enemies’ crypto and secure your own communications to working to weaken crypto for everyone. NSA director Michael Rogers, FBI director James Comey, and others in the Obama Administration have been working hard to try to convince the public that it’s possible to have secure communications that the government can access, but that criminals and bad nation-state actors can’t circumvent. They give lip service to the need for secure communications to fuel innovation and economic growth, while simultaneously working to dismantle the very systems that make those communications secure.

It is not entirely clear which approach the government will take, but whether it tries to pursue legislation forcing companies to work on mandated backdoors that they don’t want or even need, or simply tries to coerce them with fearmongering about the threat of terrorism, one thing is clear: the government should be embracing cryptography, as it once did, rather than fighting against it.

It’s true that end-to-end encryption could thwart investigation attempts for a small number of crimes—or maybe call for more hands-on detective work—but this pales in comparison to the damage caused by government backdoors. “Cryptography was once a private game of shadows played by spy masters, but today it has become the critical foundation of our information infrastructure,” says Ethan Heilman, Research Fellow at Boston University.

A recent MIT paper written by a slew of experts makes it clear that giving the government backdoor access to secure communications would weaken the security of any system. “This report’s analysis of law enforcement demands for exceptional access to private communications and data shows that such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend. The costs would be substantial, the damage to innovation severe, and the consequences to economic growth difficult to predict. The costs to developed countries’ soft power and to our moral authority would also be considerable. Policy-makers need to be clear-eyed in evaluating the likely costs and benefits,” it reads. (Oh, and China wants backdoors, too. So there’s that.)

This isn’t the first time the government has worked to weaken encryption on purpose. It goes back as far as the 1950s, and continued in the 1970s (“…NSA tried to convince IBM to reduce the length of the key from 64-bit to 48-bit. Ultimately, they compromised on a 56-bit key,” wrote Tom Johnson in Book III: Retrenchment and Reform, an official NSA book), and the 1990s. Intentionally bad cryptography led to the Logjam bug, which can “break secure connections by tricking the browser and server to communicate using weak crypto,” Cory Doctorow explained on Boing Boing—and the government is to blame for these browsers and servers supporting weak crypto in the first place. Weak crypto, courtesy of the U.S. government, can be blamed for the FREAK SSL/TLS vulnerability as well.
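The impact of those key-length decisions is easy to quantify: every bit shaved off a key halves the brute-force search space. A quick back-of-the-envelope sketch (illustrative arithmetic only, not tied to any real cipher implementation):

```python
# Each bit of key length doubles the number of possible keys an attacker
# must try in an exhaustive brute-force search.
def keyspace(bits: int) -> int:
    """Number of possible keys for a key of the given length in bits."""
    return 2 ** bits

full = keyspace(64)   # the original 64-bit key length
des = keyspace(56)    # the compromise length mentioned above
weak = keyspace(48)   # the length the NSA reportedly pushed for

print(f"64-bit vs 56-bit keys: {full // des}x larger keyspace")
print(f"56-bit vs 48-bit keys: {des // weak}x larger keyspace")
```

Each 8-bit reduction cuts an attacker’s brute-force work by a factor of 256.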

NSA wants encryption that fends off quantum computing hacks

The National Security Agency isn’t just yearning for quantum computers that can break tough encryption — it wants encryption that can protect against quantum computers, too. Officials have begun planning a transition to “quantum resistant” encryption that can’t be cracked as quickly as conventional algorithms. As the NSA explains, even a seemingly exotic technique like elliptic curve cryptography “is not the long term solution” people thought it was. Quantum computing is advancing quickly enough that the NSA and other organizations could find themselves extremely vulnerable if they’re not completely ready when the technology becomes a practical reality.

This doesn’t mean that the NSA is asking the government or security vendors to avoid upgrading their ‘traditional’ encryption. It already has suggestions for cryptographic methods that should make it easier to adopt quantum-proof security. However, the agency doesn’t want others pouring a lot of their time and money into encryption that may well become obsolete in the “not too distant future.” Even though you aren’t likely to see a wave of quantum hacking any time soon, the prospect is real enough that the NSA is treating it as a high priority.

Whose keys are they anyway?

Google recently announced enhanced security support for its cloud customers by granting them the ability to hold the encryption keys to their data. These customer-supplied encryption keys for the Google Cloud Platform follow the example set by other cloud industry leaders such as Amazon Web Services and Box and position the tech giant as an advocate for user data privacy.

The many federal IT managers who rely on Google Cloud and AWS are now able to develop a more sound security strategy when it comes to adopting the cloud. Government security managers running Google Cloud should educate themselves on the various cloud encryption models available and also consider which complementary security solutions must also be implemented. Depending on the cloud encryption model employed, cloud data may be susceptible to unauthorized access by cloud service provider insiders or be moved to other jurisdictions that might present data sovereignty issues.

Let’s break it down.

Server-side encryption. At the most basic level of the cloud encryption models, there is server-side encryption (SSE), where the encryption is performed by the cloud service provider using keys it owns and manages itself. Server-side encryption is the most vulnerable cloud encryption model, as the key unlocking access to the data is in control of the cloud provider. While SSE provides a basic level of encryption, it does not provide enterprise security control nor does it help protect against insider attacks because service provider employees could access the data intentionally or by mistake.

Server-side encryption with customer-provided keys. What Box, AWS and now Google offer is server-side encryption with customer-provided keys (SSE-CPK). In this model, the cloud provider handles the encryption but hands the keys to the customer to own and manage. The cloud service provider runs the encryption in its underlying infrastructure and promises to only keep the keys in memory while the virtual machine is up and running. However, the keys still flow through cloud provider application programming interfaces, so it is not much of a stretch for the cloud provider to divert or intercept the keys.

Client-side encryption. The most secure solution is client-side encryption (CSE), which occurs in the cloud but is initiated and managed by the data owner. The customer selects the encryption method and provides the encryption software. Most important, the customer owns and manages the encryption keys.

This approach allows customers to store and manage the keys for the virtual machines on their own premises or in a controlled instance in the cloud. When the virtual machine boots up in the private or public cloud, it can use a pre-boot network connection to an enterprise-controlled intelligent key manager to retrieve the key.
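To make the CSE model concrete, here is a toy Python sketch of the flow described above: the customer generates and keeps the key, encrypts locally, and only ciphertext ever reaches the provider. The XOR-keystream cipher below is a stand-in for a real algorithm such as AES-GCM and must not be used for actual data; all names are illustrative.

```python
# Toy client-side encryption sketch: the key never leaves the customer,
# so the cloud provider only ever sees opaque ciphertext.
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by hashing key + nonce + counter."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    nonce = secrets.token_bytes(16)  # fresh nonce per message
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

customer_key = secrets.token_bytes(32)          # owned and held by the customer
nonce, blob = encrypt(customer_key, b"payroll records")
# Only `nonce` and `blob` are uploaded; without the key they are unreadable.
assert decrypt(customer_key, nonce, blob) == b"payroll records"
```

The design point is that decryption requires a key the provider never holds, which is exactly what distinguishes CSE from the SSE and SSE-CPK models above.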

In the announcement of SSE-CPK on Google’s blog, the company cautions, “Keep in mind, though, if you lose your encryption keys, we won’t be able to help you recover your keys or your data – with great power comes great responsibility!” The onus is indeed on the customer to not only keep the keys close, but keep them safe. The most responsible move for IT admins is to have an enterprise-controlled intelligent key management solution to manage crypto activities.

Google’s support for SSE-CPK is a step in the right direction to giving enterprises control over who accesses their data, but it still falls short of client-side encryption. Only with the CSE model – where both the encryption and keys are initiated and managed by the data owner, not the cloud provider – does the customer have the most protection and control possible in the cloud.

NCUA institutes encryption protocols for data provided to examiners

NCUA has instituted data encryption protocols as suggested by its Office of Inspector General this June following review of an examiner’s loss of a thumb drive containing credit union members’ data.

The protocols were communicated Aug. 21 in a letter from NCUA Examination and Insurance Director Larry Fazio to the chief executives of federally insured credit unions.

The letter says the agency’s examiners now will accept data files from credit unions only if the files are encrypted first by the credit union or, if the credit union is unable or does not wish to do that, via transfer to NCUA’s encrypted equipment. In either case, parties involved will sign a “chain of custody” document. The letter, in a footnote, also advises credit unions against electronically transmitting unencrypted data to examiners.

Encryption protocols outlined in the letter will remain in use until the agency acquires a secure file transfer solution that will allow credit unions and exam staff to “securely and efficiently” exchange information, Fazio wrote. That solution is expected to be in place early next year.

Reflective satellites may be the future of high-end encryption

Quantum key distribution is regularly touted as the encryption of the future. While the keys are exchanged on an insecure channel, the laws of physics provide a guarantee that two parties can exchange a secret key without knowing whether they’re being overheard. This unencrypted-but-secure form of key exchange circumvents one of the potential shortcomings of some forms of public key systems.

However, quantum key distribution (QKD) has one big downside: the two parties need to have a direct link to each other. So, for instance, banks in and around Geneva use dedicated fiber links to perform QKD, but they can only do this because the link distance is less than 100km. These fixed and short links are an expensive solution. A more flexible solution is required if QKD is going to be used for more general encryption purposes.

A group of Italian researchers have demonstrated the possibility of QKD via a satellite, which in principle (but not in practice) means that any two parties with a view of a satellite can exchange keys.

Why QKD?

We live in a world where quantum computing is looming as a viable tool, one that could make current means of encryption obsolete. More secure forms of cryptography are becoming increasingly important. Even now, researchers contemplate a world where various agencies store some intercepted encrypted communication under the assumption that one day they will have sufficient computational power to decode them.

Ars readers know that most security breaches are not due to a failure of encryption; rather they are enabled by poor security practices. However, I think it is fair to say that the exfiltrated data is more accessible due to poor encryption practices. And, once encrypted data has been exfiltrated, it simply awaits the requisite computational power to decode it.

This expectation—that encrypted data can be decrypted in the near future—comes from the fact that many cryptographic algorithms rely on an assumption of mathematical difficulty for their security. The validity of this assumption relies on some deep ideas about how mathematical problems can be solved.

Specifically, the mathematical assumptions that underlie public key exchange are under attack. The most commonly used algorithms are based on the computational complexity of finding prime factors of large numbers. But a quantum computer can solve this problem in far fewer steps than a classical computer. Indeed, the scaling of Shor’s algorithm—this is the quantum version of an algorithm for finding prime factors—is so favorable that it is expected that a practical quantum computer will render all encryption methods based on prime factors useless.
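The asymmetry these systems rely on is easy to see with small numbers: multiplying two primes is trivial, while recovering them by classical trial division takes on the order of √n steps. This toy Python sketch (illustrative primes only) shows the classical side of that asymmetry; Shor’s algorithm is what removes it on a quantum computer.

```python
# Toy demonstration of the factoring asymmetry behind RSA-style systems:
# computing n = p * q is instant, but recovering p from n by trial
# division scales with sqrt(n) -- hopeless for real 2048-bit moduli.
def trial_factor(n: int) -> int:
    """Return the smallest prime factor of n by brute-force trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

p, q = 2003, 7919            # tiny primes, for illustration only
n = p * q                    # the "public" modulus: easy to compute
assert trial_factor(n) == p  # easy here only because p and q are tiny
```

For a real modulus hundreds of digits long, this loop would not finish in the lifetime of the universe on classical hardware, which is precisely the assumption a practical quantum computer would break.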

This is one reason why QKD is so attractive for certain people: the keys are secret and are exchanged in a way that allows one to ensure that it cannot be intercepted during exchange. Thus, an attacker is always forced to guess the key (rather than use the public part of the key to compute the secret part of the key). Any brute force attack must be performed without even knowing the length of the key or how often a new key is used.

You might argue that an assumption of QKD is that the laws of physics are correct. Science makes a big deal about how we can only get an increasingly accurate approximation of the truth, so surely this assumption is as suspect as the mathematical ones made for classical cryptography? Well, no, not really. Even if we were to discover some deeper theory than quantum mechanics, that theory must still replicate all the experimental results of quantum theory, and this includes the ones on which QKD are based. So this assumption is a fairly safe one.

In space, no one can hear your key exchange

In terms of technology, QKD is very close to being suitable for widespread use—though by “use” I mean communication between data centers, rather than for home use. The hurdle, as I stated in the introduction, is that the link must be directly between two parties, which limits us to about 100km via fiber.

There has, however, been a rather strong push to develop free-space QKD, and this has now culminated in tests showing that QKD via satellite is possible. In order to do this, the researchers made use of laser ranging satellites, which have corner cube mirrors mounted on them. The corner cube mirrors are retro-reflectors, so any signal that arrives gets sent back in the direction that it came from. More importantly, corner cube reflectors normally preserve polarization, which is commonly used to carry data.

So, as long as the signal arrives at your detector, then you should be able to generate a key using lasers bounced off this satellite.

Getting a signal is, unfortunately, no easy task. First, you need a clock signal to tell you when to measure—the properties of the atmosphere and the relative motion between the sender, detector, and satellite mean that you can’t rely on local timing. The clock takes the form of a powerful, let-me-fry-your-eyes laser, emitting 10 pulses per second. The actual qubits (quantum bits) are sent at 100 MHz, with every 10⁷th pulse synchronized with the clock signal. These pulses are emitted and collected by a 1.5m telescope.

The researchers compared the polarization states they detected to the pulses of light they sent. They determined that the newer satellites did preserve polarization, while older satellites generated more errors, possibly because the coatings on the reflectors had been damaged over time (the older satellites are 15 to 20 years old). For the researchers, this showed that the error rate was low enough that a key could be shared via quantum states. But, at this point I was extremely skeptical.

QKD security is only guaranteed if the source emits single photons, since those get altered by any eavesdropping. But, in this system, the receiver gets single photons, while each pulse contains 1.3 billion photons when it exits the telescope. You would think that this renders the result useless. An eavesdropper can, by tapping a tiny fraction of the signal emitted from the telescope, obtain every bit sent without the knowledge of either sender or receiver.

The standard QKD protocol involves revealing how each measurement was performed. While only the sender knows which polarization state was sent, everyone (including an eavesdropper) knows how the measurement was performed. If only the sender and receiver know the results of the measurements, the key is secure.

It is the first and last bit of hidden knowledge—the bits sent and the measurement results—that keeps the key secret. On the face of it, in this scheme, anyone can know what polarization state was sent if they can simply snag one of those 1.3 billion photons. Everyone knows how the measurement was performed; therefore, everyone knows what the measurement results were. No secrets are kept in this situation.
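The sift-by-basis step described here is the heart of BB84-style QKD, and the classical bookkeeping is simple enough to simulate. The sketch below is a purely classical Python simulation (no real quantum states, no eavesdropper) showing how comparing bases publicly, while keeping bit values secret, leaves both parties with identical key bits:

```python
# Classical simulation of BB84 sifting: bits encoded in random bases,
# measured in random bases, and kept only where the bases matched.
import random

random.seed(7)  # fixed seed so the run is reproducible
n = 32
sender_bits  = [random.randint(0, 1) for _ in range(n)]
sender_bases = [random.choice("+x") for _ in range(n)]  # + rectilinear, x diagonal
recv_bases   = [random.choice("+x") for _ in range(n)]

# With no eavesdropper, a matching basis yields the sent bit; a mismatched
# basis yields a random result that will be discarded anyway.
recv_bits = [b if sb == rb else random.randint(0, 1)
             for b, sb, rb in zip(sender_bits, sender_bases, recv_bases)]

# Public discussion: compare bases only (never the bit values), keep matches.
sifted_key = [b for b, sb, rb in zip(recv_bits, sender_bases, recv_bases) if sb == rb]
kept       = [b for b, sb, rb in zip(sender_bits, sender_bases, recv_bases) if sb == rb]
assert sifted_key == kept  # sender and receiver now share the same key bits
```

Real QKD adds error estimation (to detect eavesdropping) and privacy amplification on top of this sifting step; the simulation only shows why revealing bases does not reveal the key.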

However, the researchers realize this and have an alternative protocol. In their approach, the satellite would contain optics that would modify the polarization of the light at the satellite. Since the reflected signal is at the single photon level, interception after this point is detectable. Therefore, all is well, right?

The key is to make sure that the polarization state sent to the satellite does not reveal the polarization state reflected from the satellite. This can be done by sending pulses of light that are circularly polarized. This can be filtered to two pairs of linearly polarized states at the satellite (under the control of the sender). Now, the sender knows which states were sent, everyone knows how the measurements were performed, and only the sender and receiver know the results of the measurements. This meets the requirements for QKD, but only under the condition that the control signal sent to the satellite remains secure.

This latter point seems like a pretty serious weakness. A solution might be to have two identical pseudo random number generators and initiate both with the same seed at the beginning of the key generation process. But you really need to ensure that the random number generator is protected or that the seed is truly obfuscated.

I guess that what this paper demonstrates is that the single photon states behind QKD are certainly preserved on reflection from a satellite and that this opens up the possibility of having non-fixed links between parties that need to share keys. But we can’t use this technique with existing satellites, and there are some very practical problems associated with controlling the satellites in a secret manner that remain unsolved.

Phone and laptop encryption guide: Protect your stuff and yourself

The worst thing about having a phone or laptop stolen isn’t necessarily the loss of the physical object itself, though there’s no question that that part sucks. It’s the amount of damage control you have to do afterward. Calling your phone company to get SIMs deactivated, changing all of your account passwords, and maybe even canceling credit cards are all good ideas, and they’re just the tip of the iceberg.

Using strong PINs or passwords and various Find My Phone features is a good place to start if you’d like to limit the amount of cleanup you need to do, but in this day and age it’s a good idea to encrypt your device’s local storage if at all possible. Full-disk or full-device encryption (that is, encrypting everything on your drive, rather than a specific folder or user profile) isn’t yet a default feature across the board, but most of the major desktop and mobile OSes support it in some fashion. In case you’ve never considered it before, here’s what you need to know.

Why encrypt?

Even if you normally protect your user account with a decent password, that doesn’t truly protect your data if someone decides to swipe your device. For many computers, the drive can simply be removed and plugged into another system, or the computer can be booted from an external drive and the data can be copied to that drive. Android phones and tablets can be booted into recovery mode and many of the files on the user partition can be accessed with freely available debug tools. And even if you totally wipe your drive, disk recovery software may still be able to read old files.

Encrypting your local storage makes all of that much more difficult, if not impossible. Anyone trying to access your data will need a key to actually mount the drive or read anything off of it, and if you wipe the drive the leftover data that can be read by that file recovery software will still be encrypted even if the new data on the drive isn’t.

There are a few downsides. If you yourself lose the key or if your drive becomes corrupted, for example, it might be more difficult or impossible to recover data. It can slow down performance, especially for devices with processors that don’t provide hardware acceleration for encrypting and decrypting data. But, by and large, the benefits outweigh the drawbacks, and the slowdown for modern devices should be tolerable-to-unnoticeable.

iOS: Don’t worry about it

As of iOS 8, as long as you set a passcode, your personal data gets encrypted. Apple’s security whitepaper (PDF) for iOS 8.3 and later specifically says that “key system apps, such as Messages, Mail, Calendar, Contacts, Photos, and Health data values use Data Protection by default, and third-party apps installed on iOS 7 or later receive this protection automatically.”

The company also claims that every current iDevice features “a dedicated AES 256 crypto engine built into the DMA path between the flash storage and main system memory,” which ought to limit the impact of this encryption on system speed.

OS X: FileVault

Starting with OS X 10.7 (Lion) in 2011, Apple began supporting full-disk encryption with FileVault 2. In more recent OS X versions, some Macs even offer to encrypt your storage as part of the first-boot setup process, though it’s not the default as it is in iOS.

To encrypt your drive after the fact, go to the Security & Privacy pane in System Preferences, and select the FileVault tab. Click Turn On FileVault and you’ll be offered a pair of options: store the key used to unlock your disk somewhere yourself, or choose to store it in your iCloud account. A local recovery key keeps that key off of another company’s servers, but leaves you without recourse if you lose it and you’re locked out of your system. If you do store your key in iCloud (or even if you don’t, for that matter), we strongly recommend enabling two-factor authentication for your Apple ID.

Encrypting your disk doesn’t drastically change the way that OS X works—you just need to put your account password in to unlock the disk before the operating system boots instead of afterward. You’ll also need to specify which local users’ logins can decrypt the disk. Otherwise, just the account that enabled FileVault will be able to turn the machine on. If you ever need to decrypt your Mac, it’s pretty easy if you can log in to the computer or if you have the key available.

Generally speaking, the performance hit from encryption is smaller on newer Macs with hardware acceleration—most Core i5s and i7s support it, but Core 2 Duo Macs do not.

Android

Despite past promises, new Android devices still aren’t being encrypted by default. Default encryption is an option for OEMs, but outside of Google’s Nexus devices few if any companies are choosing to enable the feature on their phones.

You can still encrypt any relatively modern version of Android pretty easily—these specific steps work for Nexus devices or anything running near-stock Android, but the process should be similar if your phone is using a skin.

Open the Settings app, go to Security, and then tap “encrypt phone” to get the process started. Your phone may ask you to plug it in or charge the battery to a specific level before it will give you the option to encrypt, mostly because interrupting this process at any point is likely to completely corrupt your data partition. You’ll need to protect your phone with some kind of PIN or pattern or password if you haven’t already, and as in OS X your phone will probably require it before the operating system will boot.

To confirm that your phone was encrypted, go to Settings and then Security and look for a small “Encrypted” badge under the “Encrypt phone” menu item. If your phone already says it’s encrypted, you may have one of the new post-Lollipop phones that came with encryption enabled out of the box.

Depending on your phone, encrypting your Android phone or tablet can significantly impact performance. This is the worst for older or slower devices, which can use slower flash memory and filesystems and lack hardware encryption acceleration. The experience is better on newer phones with 64-bit ARMv8 processors and higher-end, faster storage.

Additionally, if you need to decrypt the device later on, there’s no way to do it without wiping and resetting the phone. If your phone came encrypted out of the box, though, there’s no way to decrypt the device without making more extensive software modifications.

Finally, in Android Marshmallow, phones that support external storage can encrypt and protect the data on those cards as well as on internal storage.

Jeb Bush: encryption makes it too hard to catch “evildoers”

Bush, the former governor of Florida, said Tuesday that encryption “makes it harder for the American government to do its job.”

That job would be, according to Bush, “making sure that evildoers aren’t in our midst,” echoing a phrase frequently used by his brother President George W. Bush to describe the threat of radical Islamic terrorism.

If you create encryption, it makes it harder for the American government to do its job – while protecting civil liberties – to make sure that evildoers aren’t in our midst.

Governor Bush’s comments were delivered at a forum hosted by a lobbying group called Americans for Peace, Prosperity and Security (APPS) with close ties to military contractors that is pushing presidential candidates to adopt “hawkish positions,” according to The Intercept.

(APPS’s advisory board includes members of what you might call the National Security establishment – including a former national security advisor to George W. Bush and a former CEO of BAE Systems. Its honorary chair is Mike Rogers, formerly the chairman of the US Congress’s Permanent Select Committee on Intelligence.)

Bush also advocated for wide latitude for the NSA to continue collecting phone metadata, although the NSA’s surveillance powers over Americans have been curtailed by Congress.

There’s “no evidence” that the NSA abused its powers or infringed on civil liberties of Americans, Bush said.

In fact, Bush said, in the clash of surveillance and civil liberties, “the balance has actually gone the wrong way” – meaning that civil liberties have too much weight.

There’s a place to find common ground between personal civil liberties and NSA doing its job. I think the balance has actually gone the wrong way.

While some US officials have advocated for technology companies to give law enforcement backdoors to read encrypted data, many security experts and tech companies say such a move would jeopardize security for everyone.

Others have pushed for some sort of middle ground, such as a multi-part encryption key that would keep encryption safeguarded by multiple agencies or companies holding part of the key.
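The simplest version of that multi-part idea is an n-of-n XOR split: the key is divided into shares that are individually random, and every holder must contribute their share to reconstruct it. A minimal Python sketch (threshold schemes such as Shamir’s secret sharing generalize this to k-of-n, and the party names are purely illustrative):

```python
# n-of-n key splitting: each share alone is uniformly random noise, but
# XOR-ing all shares together recovers the original key.
import secrets

def split_key(key: bytes, parties: int) -> list[bytes]:
    """Split key into `parties` shares; all shares XOR back to the key."""
    shares = [secrets.token_bytes(len(key)) for _ in range(parties - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    """Reconstruct the key by XOR-ing every share together."""
    out = bytes(len(shares[0]))  # all-zero accumulator
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

key = secrets.token_bytes(16)
shares = split_key(key, 3)           # e.g. vendor, agency, court
assert combine(shares) == key        # all three together recover the key
```

Any strict subset of the shares is statistically independent of the key, which is what makes the scheme attractive to escrow proponents; critics counter that the reconstruction machinery itself becomes a single point of attack.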

Bush falls into this middle ground category, saying at the APPS forum that Silicon Valley companies (like Google and Apple) should cooperate with the government.

We need to find a new arrangement with Silicon Valley in this regard because I think this is a very dangerous kind of situation.

In response to Bush’s comments, some in tech and media suggested that Bush doesn’t really understand encryption.

Andrew Wooster, co-founder of a Seattle mobile software company, tweeted:

The presidential politics of cybersecurity

As the 2016 US presidential election contest has heated up this summer, we’re reminded that cybersecurity isn’t just about technology, it’s also about policy – and that makes it highly political.

It’s still quite early in the election cycle, but cyber issues have taken up a good bit of the debate so far.

At a 6 August Republican debate, two contenders – Governor Chris Christie and Senator Rand Paul – clashed on NSA powers, with Christie claiming that the government needs “more tools” for fighting terrorism, and Paul arguing that the US Constitution requires a warrant for collecting data from Americans.

On the Democratic side, former Secretary of State Hillary Clinton has largely avoided the issue of NSA surveillance, while her chief rival, Senator Bernie Sanders, has called the NSA activities exposed by leaker Edward Snowden “Orwellian” and “clearly unconstitutional.”

Beyond encryption and surveillance, the cyberthreat from China has also taken up a lot of air time, with Republican candidates Mike Huckabee and Marco Rubio calling for retaliation against China over its presumed involvement in cyberattacks on the US government.

Clinton didn’t go as far as Huckabee or Rubio, but talked up the threat of Chinese economic espionage in a speech last month in which she also claimed that China wants to hack “everything that doesn’t move in America.”

A lot of important policies affecting privacy and security of Americans – and others around the world – will be decided by the next US president.

If you care about any of these issues – encryption, surveillance and the powers of law enforcement; privacy rights; government oversight of the internet and telecommunications; and laws that affect everything from data breach liability, to the rights of security researchers to hack things – it’s time to tune in and make your voice heard.

Five free Android encryption tools for the paranoid user

Do your hats tend to fall into the tinfoil range? Are you afraid there is always somebody watching you? If so, rest assured that the Android ecosystem offers plenty of apps to soothe your paranoia. But which apps are the must-haves? Here are five apps you should immediately install and put to work. They’ll bring you peace in the knowledge that your mobile data is far more secure than that of those around you.

1: Orbot Proxy with Tor

Orbot Proxy with Tor (Figure A) connects your device to Tor, an open network that strives to prevent any form of data surveillance. Tor protects you by bouncing your communications around a distributed network of relays run by volunteers around the globe. Not only does this help prevent prying eyes from spying on you as you use the internet, it also keeps sites from learning your physical location.

Figure A

To use Tor on Android, your best bet is Orbot Proxy with Tor. Once you have it installed and connected, it will encrypt all internet traffic leaving your device. This is the only app that produces a truly secure and encrypted connection for your Android device. If you are really paranoid, you need Orbot Proxy with Tor. It’s free… what do you have to lose?
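
Once Orbot is running, it exposes a local SOCKS proxy that SOCKS-aware apps can route their traffic through. The sketch below shows what that configuration looks like from an HTTP client's point of view; the host, port 9050, and the `requests`-style proxies dict are common defaults and conventions, not something Orbot guarantees on every setup.

```python
# Sketch: pointing an HTTP client at a local Tor SOCKS proxy, as Orbot
# provides on-device. Port 9050 is Tor's conventional default and is an
# assumption here; Orbot installs may use a different port.

def tor_proxy_settings(host="127.0.0.1", port=9050):
    """Build proxy settings that point an HTTP client at Tor's SOCKS5 proxy.

    The socks5h scheme asks the proxy to resolve DNS as well, so name
    lookups don't leak outside the Tor network.
    """
    url = f"socks5h://{host}:{port}"
    return {"http": url, "https": url}

# With a SOCKS-capable client (e.g. the requests library with SOCKS
# support installed), you would pass this dict as the `proxies` argument.
proxies = tor_proxy_settings()
print(proxies["https"])  # socks5h://127.0.0.1:9050
```

The `socks5h` scheme (rather than plain `socks5`) matters for privacy: it delegates DNS resolution to the proxy, so the sites you visit aren't revealed through local DNS queries.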

2: CSipSimple

CSipSimple (Figure B) lets you do encrypted SIP calling via your Android device. It’s open source and free, and it offers an easy-to-use Wizard for setting up the app. You are required to have an account on a SIP server, and I highly recommend using Ostel. It works seamlessly and has its own wizard for setting up the SIP account within CSipSimple. Even the Ostel account is free—so the only cost associated with this will be any data usage from your provider. You can set up CSipSimple to only use Wi-Fi, to avoid any charges whatsoever. CSipSimple uses rewrite/filtering rules to integrate with Android and allows you to record calls.

Figure B


3: ChatSecure

ChatSecure (Figure C) offers free, unlimited encrypted chatting on your Android device. You can chat over Google Talk/Hangouts, Facebook Chat, Dukgo, Jabber, and more. ChatSecure claims 100% privacy using state-of-the-art Off the Record (OTR) encryption. If you’re concerned about ChatSecure being blocked, you can use it in conjunction with Orbot to circumvent all firewalls and monitors.

Figure C

With ChatSecure, setting up an OTR session is simple. When you start a chat with someone, you can first verify the contact and then start the encryption. This app isn’t perfect. You might run into instances where the encryption won’t start or the connection with Orbot isn’t made. But should either happen, you can restart the app and try again. It doesn’t occur often, but when you’re dealing with the need for 100% security, you don’t want to use the app without the aid of Tor.

4: K-9 Mail

K-9 Mail with APG (Figure D) encrypts email on your Android device. You must install both apps and set up APG, which will create a key pair to be used by K-9. Once you’ve created your key pair in APG, set up K-9 and it will automatically detect that you have APG installed and offer the option to sign and encrypt an outgoing email with a simple tap of a check box. This is by far the easiest means of getting encrypted email on your Android device.
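
The key pair APG creates is used PGP-style: a fresh random session key encrypts the message body, and only that small session key is encrypted to the recipient's public key. The toy sketch below shows just that structure; the hash-based keystream cipher stands in for AES and the public-key step is omitted, so this is an illustration of the hybrid design, not real OpenPGP.

```python
# Toy sketch of PGP's hybrid encryption structure: a per-message session
# key encrypts the body; in real PGP the session key would then be
# encrypted with the recipient's public key. The keystream cipher here
# is a SHA-256-based stand-in for AES, for illustration only.
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Derive a keystream by hashing key || counter, then XOR with data.
    # XOR is its own inverse, so the same function decrypts.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def pgp_style_encrypt(message: bytes):
    session_key = secrets.token_bytes(32)      # fresh random per-message key
    ciphertext = keystream_xor(session_key, message)
    return session_key, ciphertext

key, ct = pgp_style_encrypt(b"meet at noon")
assert keystream_xor(key, ct) == b"meet at noon"  # same key decrypts
```

This split is why PGP scales: the expensive public-key operation only has to cover a 32-byte session key, never the whole message or attachment.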

Figure D

One thing to remember is that all encryption keys are handled with APG—which lets you import keys created from other sources (even searching for public keys from key servers). Both apps are free. Use K-9 in conjunction with Tor and you’ll enjoy even more security.

5: Built-in device encryption

This option is for those who want to ensure the privacy of their device should it fall into the wrong hands. This built-in encryption system (Figure E) works with all data—including app data, downloaded files… everything on your device. Of course, this level of security does come with its drawbacks.

Figure E

First, older (or lower-end) devices might take a performance hit. (Newer and flagship devices shouldn’t so much as hiccup with system-wide encryption.) Second, you’ll have to enter the encryption password on every startup of the device, but that’s a small price to pay for this level of security. Pay it and be safe. Also understand that once you’ve encrypted your Android device, the only way to disable the encryption is to do a factory reset. Note: Android Lollipop defaults to device encryption.

Topping the list

Do you already feel more secure? You should. Each of these apps does a great job of keeping your data away from prying eyes. But if you only have time for one of these tools, I’d highly recommend Orbot Proxy with Tor. It will ensure all of your device traffic is routed through a far more secure network.

Pushbullet adds end-to-end encryption to its Android, Chrome and Windows desktop app

Continuing its evolution into a full-fledged messaging service, Pushbullet has added support for end-to-end encryption when using the app to mirror notifications, move text captured by the universal copy-and-paste clipboard and send SMS messages.

The feature is available to anyone using the latest version of the company’s Android, Chrome or Windows desktop app; Pushbullet promises that its iOS and Mac apps will support the feature in the near future.

Enabling end-to-end encryption is done by going to the settings menu of each device you have Pushbullet installed on and inputting the same password.

Once it’s enabled, Pushbullet won’t be able to see the data you’re sending between your devices.

“End-to-end encryption means your data is encrypted before it leaves your device, and isn’t decrypted until it is received by another of your devices. This means we at Pushbullet only forward encrypted data. By setting up end-to-end encryption, you can be confident that your data is only readable when it’s shown to you,” says the company in a blog post. “The best part of all of this is that protecting your privacy doesn’t mean giving up features. Everything you love about Pushbullet still works great even with end-to-end encryption set up!”
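
Because every device derives its encryption key from the same password, the service in the middle only ever relays ciphertext. A minimal sketch of that idea, using PBKDF2 from the standard library; the KDF choice, salt, and iteration count here are illustrative assumptions, not Pushbullet's published parameters.

```python
# Sketch: the same password typed on each device yields the same
# symmetric key, so the relay service never sees a usable key. The
# salt and iteration count are assumptions for illustration.
import hashlib

def derive_key(password: str, account_id: str) -> bytes:
    # Salting with a per-account value keeps identical passwords on
    # different accounts from producing identical keys.
    return hashlib.pbkdf2_hmac(
        "sha256", password.encode(), account_id.encode(), 100_000
    )

phone_key = derive_key("correct horse", "user@example.com")
laptop_key = derive_key("correct horse", "user@example.com")
assert phone_key == laptop_key   # same password -> same 256-bit key
assert len(phone_key) == 32
```

This also explains the warning that follows in coverage of the feature: since the password never leaves your devices, there is no server-side record to recover if you forget it.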

Download Pushbullet from the Google Play Store and the iTunes App Store.

Pushbullet adds end-to-end encryption as it continues shift into messaging

Pushbullet, once a simple tool for sending files between your various devices, has announced that it now supports end-to-end encryption for additional user privacy, as it continues its march towards becoming a fully-fledged messenger.

Announced in a blog post, the new encryption is applied across notifications that are mirrored between devices, any text captured by the universal copy-and-paste option and any SMS messages that are sent using the platform.

Once enabled (achieved by entering a password on each device), it means that data passed using Pushbullet isn’t visible to the service itself or the company – only encrypted data is passed along.

To enter a password for end-to-end encryption, you just need to go to the settings menu on each device. Don’t forget your password, though: there’s no record of it anywhere.

For now, the Pushbullet Android, Chrome and Windows desktop apps support the feature, but the company says that it’s working to bring it to iOS and Mac as “soon as possible.” Opera, Safari and Firefox support will then be added later.

While it’s a relatively small (but nonetheless important) feature for users, it’s essential for the future of the company if it’s intent on ploughing ahead into the messaging space.

Blackberry PGP Encrypted Phones With Latest BB12 Encryption Technology Released

Blackberry Encrypted Phones offers Blackberry PGP email encryption devices that provide safe and secure solutions for wireless communications.

Android phones and iPhones have proven to be unreliable when it comes to encryption and data protection. These popular devices have been relatively reduced to the status of toys when it comes to industrial- or professional-grade protection against espionage at any level. No one knows where the compromise begins and ends with these platforms, whose very hardware was designed to give access to those who demanded it.

The engineers at BBPGP.com have found that Blackberry PGP email encryption devices offer the highest level of security for wireless communications. This Blackberry PGP encryption technology allows for the highest encryption standards for email accounts. This encryption is done through BES servers.

The Blackberry PGP email encryption system is designed to be user friendly so that any level of user can conveniently protect their private information. This PGP encryption is available for private users or businesses who rely on security and privacy. It works by heavily encrypting all messages so that even if they were intercepted by a third party, they would be indecipherable.

Mark Spencer, representative for BBPGP.com, comments, “Blackberry PGP email encryption devices are the most familiar way to communicate safely. These Blackberry PGP cryptophones have been specially developed so that users can communicate without the risk of the information they send being intercepted by a third party, such as a government agency. Blackberry PGP encrypts the information in such a way that even if it is intercepted, nothing can be done with it.”

Because email is such an important communication system that is unfortunately an insecure way to transmit information, additional security measures are required to assure that privacy and sensitive information are protected. If messages are intercepted without being encrypted, personal information could become compromised. However, using technology such as the Blackberry PGP encryption from Blackberry Encrypted Phones assures that all messages are encrypted and readable only by intended recipients. File attachments such as documents and images are also heavily encrypted for further privacy protection.

Email encryption is a process whereby communications are scrambled to the point of being completely unreadable. The better the encryption, the less likely it is that a communication can be deciphered. PGP email encryption offers a heavy level of this type of security.

Private users and businesses using wireless communication methods should make sure they have an additional layer of security due to how easy it is to breach the insecure wireless environment. PGP encryption acts like a high security envelope that shields communications from prying eyes of hackers, government institutions, competitors and others.

Cryptolocker virus: Australians forced to pay as latest encryption virus is ‘unbreakable’, security expert says

Australians are paying thousands of dollars to overseas hackers to rid their computers of an unbreakable virus known as Cryptolocker.

There has been a rise in the number of people falling victim to the latest version of an encryption virus which hijacks computer files and demands a ransom to restore them.

The “ransomware” infects computers through programs and credible-looking emails, taking computer files and photographs hostage.

Cryptolocker comes in a number of versions, the latest capitalising on the release of Windows 10.

It can arrive in an email disguised as an installer of the new operating system in a zip file.

IT technician Josh Lindsay said he had been repairing computers for 15 years but the current form of the virus was “unbreakable”.

“It’s definitely the worst I have come across,” he said.

The hackers offer computer owners a chance to retrieve data – but only if they pay a ransom using the electronic currency Bitcoin.

“If it’s on Bitcoin they can use it to purchase anything online from gold bullion, to shares, to property even and it’s virtually untraceable,” Mr Lindsay said.

Virus victim Renata Eugstar said she decided not to pay the ransom price.

“I just wouldn’t pay it out of principle, I suppose there are people out there that have to, you know, if it is a business,” she said.

Michael Bailey from the Tasmanian Chamber of Commerce and Industry said when his organisation was hit, a ransom equivalent to $US350 was paid to overseas hackers.

“It was cheaper for us to just pay rather than worry about trying to fix it,” he said.

“The advice from our IT people is – some of the best in Australia – was that it would take weeks for them to work out how to unencrypt the files, if they could at all.”

The deputy chairwoman of the Australian Competition and Consumer Commission, Delia Rickard, said over the past two months there had been a spike in the number of people falling victim to the scam.

The commission has received 2,500 complaints this year and estimates about $400,000 has been paid to the hackers.

“That’s the tip of the iceberg,” she said.

Thomas King, the general manager of the Australian Cyber Emergency Response Team (AusCERT), which is part of the University of Queensland, said the number of computers infected by the virus was on the rise.

“Individuals, companies, not-for-profits, organisations of all kinds have paid and it’s a sad state of affairs that so many people do feel the need to pay because they don’t have good enough cyber security protections,” he said.

Mr King has urged people to take precautions when opening emails and to ensure good backups of any data are kept offline.

NSA-grade encryption for mobile over untrusted networks

The only term being thrown around government more than “2016 elections” these days is “cybersecurity,” particularly following a rash of damaging and high-profile data breaches. With that focus on protecting information top of mind in agencies, USMobile officials hope to find a ready market for their commercial app, which lets government workers use their personal smartphones for top-secret communications.

Called Scrambl3, the app creates a secure virtual private network that connects bring-your-own devices to an agency server to send messages using end-to-end encryption. Irvine, Calif.-based USMobile developed the Scrambl3 technology when team members worked with the National Security Agency to create “Fishbowl,” a secure phone network available only to Defense Department users via the DOD Information Network.

“We’ve implemented Fishbowl in the form of a software-defined network, so all of those typical hardware components that you’d find in a mobile network — routers, VPNs, gateways, firewalls, proxy servers — all of those components are expressed or implemented in our system in the form of software,” said Jon Hanour, USMobile’s president and CEO. “We’ve made an affordable version of Fishbowl.”

When the turnkey solution comes to market in October, it will work with Android and Apple iOS devices. It uses the Security-Enhanced Linux operating system and a defense-in-depth approach: the layered design runs an encrypted VoIP call inside a VPN connection. When an agency deploys Scrambl3 Enterprise, administrators will set up what USMobile calls Black Books, or lists of contacts that each user can communicate with via the VPN.

“A lower-level person wouldn’t necessarily have the director of that particular agency listed,” Hanour said. “Conversely, the director of that particular agency would have [a] contact list populated with people that are at the higher levels of management.”

When a user logs into the app on a smartphone, it creates a VPN that connects to the agency’s server, whether it’s in the cloud or on premises. Currently, Scrambl3 Enterprise software is deployed only on IBM Power Systems Linux servers.

A two-rack server can handle up to 3,000 concurrent calls, Hanour said, a capacity “that would handle comfortably an agency of 50,000 people.”

Once connected, users can see who in their Black Book is also logged in, as indicated by a green dot next to the name, and then select the mode of communication: email, voice call or text. Both senders and recipients would need to have Scrambl3 installed.

“Once you establish this powerful VPN, you can run anything through it,” Hanour said. “Anything that you can put on a server, you can use Scrambl3 to communicate with.”

Calls are highly encrypted until they reach the recipient, where the app decrypts them. That communication happens at a top-secret-grade level as specified by NSA. Despite that encryption/decryption process, Hanour said, latency is unnoticeable.

For additional protection, nothing is recorded – users can’t even leave voicemail – unless an agency specifies otherwise. For instance, Hanour said, some law enforcement regulations require that all communication among officers be recorded.

The law enforcement community is a prime target customer for Scrambl3 because public cell phone networks don’t meet heightened police security standards, and photographic evidence requires a secure uploading process.

To use Scrambl3, agencies don’t need mobile device management systems, but it integrates with any that might exist.

“The advantage of this architecture is that the communication that the mobile device management software would typically have with the device, that communication can now run inside the VPN, so it makes that even more secure,” Hanour said. “It creates value for the mobile device management system as well because you can protect it inside the VPN.”

Licensing fees for Scrambl3 depend on the number of users, but typically start at $5 per user per month. The most it would cost, Hanour said, is about $10 per user per month.

Right now, Scrambl3 for Android is available in beta form in the Google Play Store for testing. Scrambl3 for iOS will be available next month.

The beta version does not include all Scrambl3’s features, such as conference calling. When the release version is up and running in October, Scrambl3 will offer the only top-secret-grade conference call capability outside DOD’s network, Hanour said. Users will be able to initiate a conference call by touching a few people’s names and pressing the call button.

Besides law enforcement, Hanour sees potential customers in several types of government operations, including health care, the State Department when conducting diplomatic relations and even individual politicians, who might want to communicate in absolute privacy.

“The whole idea is to create trusted communications over untrusted networks (i.e., the Internet),” Hanour said.

Cloud encryption key management becomes table stakes

Encryption key management has become table stakes for cloud vendors, but bringing your own key isn’t always the right move.

The ability to bring your own encryption keys is fast becoming ubiquitous in public cloud, but that doesn’t mean IT pros should retain control.

Security concerns and data center oversight are two primary hang-ups for IT shops averse to adopting public cloud. Amazon became the first major infrastructure as a service (IaaS) vendor to offer bring your own key encryption in 2014 as an answer to some of those critiques. Over the past few weeks, Microsoft and Google have also advanced their cloud encryption key management capabilities.

Vendors at every layer of the cloud stack have added encryption capabilities, and, eventually, all cloud vendors will offer some form of encryption and key management, said Garrett Bekker, senior security analyst with 451 Research LLC, based in New York. Some vendors will opt to do it natively, while others will pass the control to customers so they can check off that box on their list of capabilities, Bekker said.

“It comes down to how important it is for customers to control the keys,” Bekker said. “My guess is a lot of customers will be OK with letting service providers control the keys, but it depends on what the data is, what you’re using it for, and what industry and regulatory compliance you face.”

And business considerations will affect vendor services, too: a company such as Google, which lags in the market, offers key management for free, while companies like Salesforce.com that need to generate new revenue streams offer native encryption as a premium service.

To key or not to key?

Encryption is considered central to data protection in the cloud, but who should retain its control?

SunGard Financial Systems, which partners with Google to build a big data processing prototype for the U.S. Securities and Exchange Commission, uses Customer-Supplied Encryption Keys for compute resources on Google Compute Engine. The free tool for bringing your own keys became available in beta last week, and it’s essential from a risk and regulatory control perspective for this project, said Neil Palmer, CTO at SunGard Consulting Services, based in Wayne, Pa.

All data in the cloud should be encrypted anyway, but the ability to bring your own keys is one of those additions that should help enterprise adoption and increase the ways those customers use public cloud, Palmer said. Still, SunGard doesn’t bring its own keys to every project, so it’s a matter of weighing if and when key management is the best fit.

“It’s just a question from a perspective of effort, time, integration, etc.,” Palmer said. “There’s a return on investment around key management required, so if you’re BuzzFeed or one of the big media Internet sites, maybe not so much. But if you’re healthcare or government work, you may need it.”

Microsoft Azure Key Vault, which became generally available last month, can be used as a standalone service and allows customers to import keys from their own hardware security modules (HSMs). Microsoft charges $0.03 per 10,000 operations for software-protected keys and an additional $1 per month per key for HSM protected keys.

Similarly, Amazon Web Services (AWS) Key Management Services charges $0.03 per 10,000 requests and $1 per month per each key that is created and active. Amazon also has CloudHSM, a dedicated HSM appliance that costs $5,000 for each instance, in addition to an hourly fee of $1.88 for as long as the instance is running.
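
With both vendors charging a flat per-key fee plus a metered per-request fee, a monthly estimate is simple arithmetic. A quick sketch using the figures quoted above ($1 per active key per month, $0.03 per 10,000 requests); the CloudHSM appliance charges are excluded.

```python
# Estimating a monthly key-management bill from the article's figures:
# $1 per active key per month, plus $0.03 per 10,000 requests. This
# matches the shape of both the AWS KMS and Azure software-protected
# key pricing quoted above (HSM and CloudHSM fees not included).
def monthly_kms_cost(active_keys: int, requests: int) -> float:
    return active_keys * 1.00 + (requests / 10_000) * 0.03

# 50 active keys serving two million requests in a month:
print(monthly_kms_cost(50, 2_000_000))  # 56.0
```

The request fee is almost negligible at typical volumes; the number of active keys, and especially any dedicated HSM hardware, dominates the bill.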

Cloud encryption key management is difficult, and bringing your own keys to a service someone else owns is a non-trivial endeavor that goes against one of the cloud’s main advantages of not having to worry about these sorts of things, said Adrian Sanabria, senior security analyst at 451 Research.

“You’ve got to somehow own the keys and manage to inject them into workloads without exposing them to the cloud provider,” Sanabria said. “It is a compromise, where you can’t be 100% cloud if you want to manage your own keys.”

Public perception about cloud security and regulatory environments with antiquated requirements both play a role in the need for key management, but the point could be moot in five years’ time, as customers start to trust large public cloud providers as good stewards of keys, said Leonard Law, a product manager for Google Cloud Platform.

“As people are transitioning from on-premises to the cloud, there’s this notion of control. So by managing your own custom keys that gives customers a lot of peace of mind, but ultimately, it’s just less necessary,” Law said.

SafeChats aims to give messaging an encryption edge

THE revelations from former US National Security Agency (NSA) contractor Edward Snowden that the US Government has been tapping communications have created greater awareness on the need for secure communications, which in turn has given rise to secure messaging apps such as Telegram, Wickr and Threema.

Privacy should not be a concern for just individuals, but businesses also need to be aware of how tapped communications can affect them, according to Maxim Glazov (pic above), chief executive officer of Singapore-based SafeChats.

For example, customers’ VoIP (Voice-over-Internet Protocol) calls can be intercepted and sensitive information gathered for blackmail. Hackers can gain unauthorised access to a customer’s webmail account to forge emails, and issue payment instructions to send the money to the hackers’ accounts instead.

The scenario is made worse by the fact that many businesses use unsecured mass-market services because of their ease of use.

It was this realisation that catalysed Glazov and his chief technology officer Nikita Osipov to build SafeChats, which they claim is a secure communications platform that protects collaboration as well.

The company was one of the finalists at the recent RSA Conference Asia Pacific and Japan (RSAC APJ) Innovation Sandbox startup competition in Singapore.

SafeChats began as an internal project at an undisclosed international logistics and finance company that Osipov and Glazov were part of, looking into the problem of communicating sensitive information with customers more securely and efficiently than existing methods allowed.

Glazov’s initiative to build a secure communication platform gained traction with his customers, who were eager to use the platform themselves.

The market for secure communication, whether for consumers or enterprises, is gaining traction with the entry of companies like Silent Circle, Tigertext and ArmourText.

Osipov recognises the growing maturity of the market but remains undeterred. “We keep ourselves motivated by acquiring more use cases for what is essentially a red-ocean market, and the constant validation that there is a need for such a communications platform.”

The SafeChats platform aims to encompass the entire suite of communications, from email to messaging, and from file transfers to video and voice calls. It also gives the option of using the customer’s own server infrastructure instead of SafeChats’.

“SafeChats is the only secure communications platform that also integrates collaborative features and a full suite of privacy features,” Osipov claimed.

The SafeChats messaging volume has grown 10 times in the last six months, organically from initial customers, without an official release, the startup claimed.

When asked about its customers, Osipov cryptically replied, “As a company entrenched in security and privacy, we cannot reveal our current client list … and there are some users on board that we simply don’t know who they are.”

The company’s revenue model is set to be freemium Software-as-a-Service, with different tiers of control and fees being charged for white labeling and on-premises installation.

It also charges enterprise customers on a per-user basis if they “enforce a security policy on employees or create groups of more than 15 individuals,” Osipov said.

SafeChats is currently in public beta and will be officially launched at the end of August. It is currently available for the iOS and Android platforms. There are plans to make a desktop version for Mac OS X and Windows.

The challenges

Spinning off into its own startup has brought some challenges, with Osipov (pic above) saying that the main one was building the right team.

“Once you have a great team, everything becomes so much easier,” he said.

On the technical front, coming up with the right set of technologies to use was one of the biggest challenges.

“We evaluated multiple different software solutions, protocols and algorithms that we could use before we settled on the current architecture,” said Osipov.

“All that required extensive research work – thinking of the whole system from the technical side and possible technical challenges in the future … and how to solve them … [while making sure] it remains very easy to use,” he added.

Under the hood

SafeChats uses a variety of encryption algorithms, depending on the particular function.

“We use well-known end-to-end encryption algorithms trusted by security experts as the core of our platform, which means that your data stays safe in transit and only you and the intended recipient have access to it,” Osipov said.

For instant messaging, it uses Off-the-Record messaging (OTR) and the socialist millionaire protocol. OTR messaging uses a combination of Advanced Encryption Standard (AES) algorithms with a 128-bit key strength, with a public key exchange protocol for authentication. The socialist millionaire protocol allows two parties to verify each other’s identity through a shared secret.
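
The idea behind that identity check is that two parties can confirm they hold the same secret without ever sending it over the wire. The real socialist millionaire protocol is an interactive zero-knowledge protocol; the sketch below is a much simpler stand-in that compares keyed digests of the secret, bound to a per-session nonce so a captured transcript can't be replayed.

```python
# Simplified stand-in for the socialist millionaire check: both parties
# derive a keyed digest of their secret and compare digests, so the
# secret itself never crosses the wire. (The real SMP is zero-knowledge
# and interactive; this illustrates only the goal, not the protocol.)
import hashlib
import hmac

def secret_digest(shared_secret: str, session_nonce: bytes) -> bytes:
    # Binding the digest to a session nonce prevents replaying an old
    # transcript to pass verification later.
    return hmac.new(shared_secret.encode(), session_nonce, hashlib.sha256).digest()

nonce = b"session-1234"
alice = secret_digest("our holiday spot", nonce)
bob = secret_digest("our holiday spot", nonce)
eve = secret_digest("wrong guess", nonce)

assert hmac.compare_digest(alice, bob)      # same secret: check passes
assert not hmac.compare_digest(alice, eve)  # different secret: it fails
```

Note one weakness the real SMP avoids: a digest like this lets an eavesdropper mount an offline guessing attack against a low-entropy secret, which is exactly why OTR uses the full zero-knowledge construction.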

For voice calls and file transfers, SafeChats uses 256-bit AES keys, military-grade encryption, to protect data and calls.

Future plans

SafeChats started as a bootstrapped startup, and is now on the lookout for investors who will be more than just people writing cheques.

“We are on the lookout for investors with the capacity to be strategic partners and who can provide channels for the product and its derivatives,” Osipov said.

SafeChats will be seeking pre-Series A round within the next six months, and is looking to raise over US$700,000, aiming for a valuation of US$6 million.

It intends to expand the team, especially on the marketing and technical fronts, the latter including 24/7 support.

And it will beef up its software development team “to work on enterprise features like integration with third-party services and advanced authentication options like two-factor authentication (2FA) using software and hardware tokens,” Osipov said.

Beyond expanding the platforms SafeChats works on, the company is also working on integrating the platform with other software and hardware solutions to utilise its end-to-end encryption. This will secure other software solutions as well as pave the way for Internet of Things (IoT) security.

“We won’t announce any names for now as there are many legal issues involved in this sort of integration, and with providing official software developer kits to everyone,” Osipov said.

“All we can say at the moment is that you can be sure that most popular software and hardware solutions will work with SafeChats,” he declared.

The company wants to open up its Application Program Interface (API) to others so that they can work on their own integrations as well, bringing the SafeChats level of security to other software.

“We also hope to form a community of developers to implement future integrations so everyone benefits,” Osipov claimed.

Researchers develop quantum-computing safe crypto

Practical implementation of secure key exchange for TLS.

A team of researchers claim to have developed secure, quantum computing-proof encryption that can be practically implemented today.

The paper, Post-quantum key exchange for the TLS protocol from the ring learning with errors problem [pdf] is written by Joppe Bos from NXP Semiconductors in Belgium, Craig Costello and Michael Naehrig at Microsoft Research, and mathematician Douglas Stebila from Queensland University of Technology.

Quantum computers have long been thought to be able to guess encryption keys much faster than traditional computers, which in turn would make it possible to unscramble the vast majority of internet-borne communications.

The researchers constructed ciphersuites for the Transport Layer Security protocol commonly used on the internet, providing digital key exchanges based on the ring learning with errors problem accompanied by traditional RSA and elliptic curve cryptography signatures for authentication.

Using traditional RSA and EC signatures would speed the implementation of quantum-safe key exchanges among digital certificate authorities, the researchers believe.

There is a performance penalty of 21 percent compared to the non-quantum-safe key exchange, the researchers noted. However, that is considered minimal, and demonstrates that provably secure post-quantum key exchanges are practical.
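
The idea behind an LWE-style key exchange can be illustrated (very loosely) with a toy single-integer analogue: each side publishes a noisy product of a shared public value with its secret, the two sides then compute values that differ only by a small cross-error term, and so they agree on the high-order bit. The parameters below are tiny fixed values chosen for illustration; real ring-LWE works over polynomial rings with random sampling and a reconciliation step, so this sketch is insecure and purely conceptual, not the paper's construction.

```python
# Toy, insecure single-integer analogue of an LWE-style key exchange.
q = 32768          # modulus
a = 12345          # shared public value

# Small fixed secrets and errors (a real scheme samples these randomly).
s_alice, e_alice = 7, 2
s_bob, e_bob = 5, 3

# Each side publishes a noisy product of the public value and its secret.
b_alice = (a * s_alice + e_alice) % q
b_bob = (a * s_bob + e_bob) % q

# Each side combines the other's public value with its own secret.
k_alice = (b_bob * s_alice) % q   # = a*s_alice*s_bob + e_bob*s_alice
k_bob = (b_alice * s_bob) % q     # = a*s_alice*s_bob + e_alice*s_bob

# The two values differ only by a small error term, so both sides
# can agree on the high-order bit as one shared key bit.
bit_alice = (2 * k_alice) // q
bit_bob = (2 * k_bob) // q
```

With these parameters the two computed values differ by only 11 (the cross-error term), so both sides extract the same bit; a real scheme repeats this over many coefficients with reconciliation to rule out boundary disagreements.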

A theorem published by mathematician Peter Shor in 1994 and further work by other researchers has shown that quantum computers could break public-key cryptography, something which is not feasible with today’s binary devices.

As quantum computers are under development currently, the researchers believe it is important to strengthen today’s encryption protocols against future attacks using these far more powerful devices.

DA Hillar Moore: Cellphone encryption hurting murder investigation of woman, her baby; family holds onto hope the case will be solved

Cellphone encryption practices could be keeping investigators from solving the murder of Brittney Mills and her son, East Baton Rouge Parish District Attorney Hillar Moore III said Saturday, but family members remain hopeful the truth will surface.

“By no means have we forgotten them,” said Mills’ mother, Barbara Mills, on Sunday. “This will be in the forefront until it is solved.”

Brittney Mills, 29, who was eight months pregnant, was shot and killed April 24 at her Ship Drive apartment. Authorities believe Mills opened the door for someone who wanted to use her car and was shot multiple times when she refused. Doctors delivered her baby, but the baby boy, Brenton Mills, died May 1.

Three months later, the case is still unsolved.

Investigators said the shooter likely was someone Mills knew. They have looked to her cellphone for evidence, but her phone, like many others, uses software that is said to block anyone from accessing its data, including investigators.

While they have tried to crack the phone using possible pass codes suggested by family members, investigators have been unsuccessful.

“We don’t know her code number,” Mills’ mother said. “It may very well be a very important part of the investigation.”

Even Apple, the manufacturer, claims it cannot decrypt the phone.

“From what we’re told by the company that makes the encryption, the only way we can get into a phone is if the phone subscriber gives the password to us,” Moore said. “When you’re dead, it’s hard to give that to us.”

Apple’s most recent software upgrade is a response, Moore suspects, to Edward Snowden’s decision to leak U.S. National Security Agency information surrounding a national spy program. The iOS 8 software is fully encrypted, meaning the only way to access Mills’ phone data is to enter the pass code.

“If you attempt to use (too many) false passwords, though, it shuts it down for good,” Moore said. “We are cautious about that.”

While only a few cellphones previously used this technology, Moore said, this software is installed in more than 80 percent of cellphones now.

Moore recently wrote to the U.S. Senate Committee on the Judiciary to urge representatives to address this failure of balance between public safety and privacy, citing Mills’ unsolved case.

Mills’ “family indicated that she recorded all activity on her phone and join law enforcement in their frustration due to the inability to access this phone, that would in all likelihood provide information necessary to obtain justice and remove this murderer from the street,” Moore wrote.

Moore said Manhattan District Attorney Cyrus Vance is “leading the charge” for Congress to create legislation to address this problem, specifically in Apple’s and Google’s latest encryption technology. Moore said criminals, like most citizens, use their cellphones to communicate regularly and do business, which often makes their cellphones integral to many investigations. Even so, seizing someone’s phone requires a warrant, Moore said, to protect citizens’ privacy.

“I think the way Apple, the way that community has built their operating systems, they’re beyond the law,” Moore said Saturday. “It is the only way I know that you cannot court-order information. Without us being able to get into the phone itself through a subpoena, we are really at a disadvantage and at a loss to solve crimes.”

“It’s really frustrating for us and people like the Mills family,” Moore said. “There’s a darn good chance that there is info on the phone that could be extremely helpful for us.”

As the investigation continues, the family held a memorial Friday night in memory of Mills and her son for what would have been her 30th birthday. The estimated attendance was more than 150 people.

“It’s something we wanted to do because she talked a lot about turning 30,” Barbara Mills said.

She added that Mills and her son were “so special to us” and will not soon be forgotten.

Mills’ family has stayed involved with the case, encouraging investigators to do all they can to solve it, Barbara Mills said, because the family needs closure.

“We need to find out what happened,” Mills’ mother said. “We’re wanting results.”

Barbara Mills agreed with police that the killer must be someone her daughter knew because she would not have opened the door for a stranger.

Still, neither the family nor police have any leads as to the killer’s identity.

A few days after Mills was killed, the case received heavy attention when Baton Rouge police said they wanted to question former LSU star offensive lineman La’el Collins as part of the investigation into her death.

Collins was in Chicago for the NFL draft at the time, but once national media got wind that the police wanted to speak to Collins, the first-round prospect went undrafted.

Although he was never considered a suspect in the shooting, Collins was said to have had a relationship with Mills. After meeting with police, however, a paternity test ruled Collins out as the father of Mills’ son.

Barbara Mills said the family is leaving the details of the investigation up to the police but added that investigators could seek another paternity test sometime in the future.

After he was questioned and cleared by police, Collins signed as a guard with the Dallas Cowboys.

Apple could be held liable for supporting terrorism with strong iOS encryption, experts theorize

In the second installment of a thought piece about end-to-end encryption and its effect on national security, Lawfare editor-in-chief Benjamin Wittes and co-author Zoe Bedell hypothesize a situation in which Apple is called upon to provide decrypted communications data as part of a legal law enforcement process.

Since Apple does not, and on devices running iOS 8 cannot, readily hand over decrypted user data, a terrorist might leverage the company’s messaging products to hide their agenda from government security agencies. And to deadly effect.

As The Intercept reported, the hypotheticals just made the ongoing government surveillance versus consumer protection battle “uglier.”

Wittes and Bedell lay out a worst case scenario in which an American operative is recruited by ISIS via Twitter, then switches communication methods to Apple’s encrypted platform. The person might already be subject to constant monitoring from the FBI, for example, but would “go dark” once they committed to iOS. Certain information slips through, like location information and metadata, but surveillance is blind for all intents and purposes, the authors propose. The asset is subsequently activated and Americans die.

Under the civil remedies provision of the Antiterrorism Act (18 U.S. Code §2333), victims of international terrorism can sue, Lawfare explains, adding that an act violating criminal law is required to meet section definitions. Courts have found material support crimes satisfy this criteria. Because Apple was previously warned of potential threats to national security, specifically the danger of loss of life, it could be found to have provided material support to the theoretical terrorist.

The authors point out that Apple would most likely be open to liability under §2333 for violating 18 USC §2339A, which makes it a crime to “provide[] material support or resources … knowing or intending that they are to be used in preparation for, or in carrying out” a terrorist attack or other listed criminal activity. Communications equipment is specifically mentioned in the statute.

Ultimately, it falls to the court to decide liability, willing or otherwise. Wittes and Bedell compare Apple’s theoretical contribution to that of Arab Bank’s monetary support of Hamas, a known terrorist organization. The judge in that case moved the question of criminality to Hamas, the group receiving assistance, not Arab Bank.

“The question for the jury was thus whether the bank was secondarily, rather than primarily, liable for the injuries,” Wittes and Bedell write. “The issue was not whether Arab Bank was trying to intimidate civilians or threaten governments. It was whether Hamas was trying to do this, and whether Arab Bank was knowingly helping Hamas.”

The post goes on to detail court precedent relating to Apple’s hypothetical case, as well as legal definitions of what constitutes criminal activity in such matters. Wittes and Bedell conclude, after a comprehensive rundown of possible defense scenarios, that Apple might, in some cases, be found in violation of the criminal prohibition against providing material support to a terrorist. They fall short of offering a viable solution to the potential problem. It’s also important to note that other companies, like Google and Android device makers, proffer similar safeguards and would likely be subject to the same theoretical — and arguably extreme — interpretations of national policy described above.

Apple has been an outspoken proponent of customer data privacy, openly touting strong iOS encryption and a general reluctance to hand over information unless served with a warrant. The tack landed the company in the crosshairs of law enforcement agencies wanting open access to data deemed vital to criminal investigations.

In May, Apple was one of more than 140 signatories of a letter asking President Barack Obama to reject any proposals that would materially change current policies relating to the protection of user data. For example, certain agencies want Apple and others to build software backdoors into their encrypted platforms, a move that would make an otherwise secure system inherently unsafe.

VeriFyle reveals Cellucrypt, a new multi-layer encryption key management technology

VeriFyle, the company headed by Hotmail inventor and co-founder Jack Smith, has a new encryption key management technology which it believes will “re-invent how the world thinks about secure sharing and messaging”. The major difference is that any object that is shared to the cloud using the system is encrypted for individual users rather than in bulk.

Cellucrypt offers such a high level of security that VeriFyle believes that it “makes illicit bulk-access to customer data virtually impossible.” It’s a bold claim, but Cellucrypt builds on a traditional public-key system with the addition of password-derived keys.

The encryption technique will be used by VeriFyle’s messaging and file-sharing services when they launch later in the year. Cellucrypt has been patented by VeriFyle and will be made available to customers free of charge. Introducing the new encryption technique, VeriFyle says:

The patented Cellucrypt technology assigns each data object (e.g. document, note or conversation) a unique encryption key, which is itself encrypted uniquely each time a user shares that object.  By encrypting each data object individually for users, Cellucrypt makes illicit bulk-access to customer data virtually impossible.
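
The per-object keying described above can be sketched with a toy model: each data object gets its own fresh key, and that key is wrapped separately for every user it is shared with, so compromising one wrapped key exposes only one object for one user. The sketch below is stdlib-only and illustrative; the XOR-based wrap stands in for real public-key or authenticated encryption, and Cellucrypt's actual construction is not public in detail.

```python
import hashlib
import secrets

def derive_wrap_key(user_secret: bytes, salt: bytes) -> bytes:
    # Stand-in for a user's real key material: derive a per-share
    # wrapping key from the user's secret (e.g. a password).
    return hashlib.pbkdf2_hmac("sha256", user_secret, salt, 100_000)

def wrap(key: bytes, wrap_key: bytes) -> bytes:
    # Toy wrap: XOR with the derived key. A real system would use
    # public-key or authenticated symmetric encryption here.
    return bytes(a ^ b for a, b in zip(key, wrap_key))

def share_object(user_secrets: dict):
    # One fresh 32-byte key per data object...
    object_key = secrets.token_bytes(32)
    # ...wrapped uniquely for each user the object is shared with.
    wrapped = {}
    for user, user_secret in user_secrets.items():
        salt = secrets.token_bytes(16)
        wrapped[user] = (salt, wrap(object_key, derive_wrap_key(user_secret, salt)))
    return object_key, wrapped

def unwrap_for(user_secret: bytes, salt: bytes, wrapped_key: bytes) -> bytes:
    # XOR wrapping is symmetric, so unwrapping is the same operation.
    return wrap(wrapped_key, derive_wrap_key(user_secret, salt))
```

Because every (object, user) pair carries its own independently wrapped key, there is no single master key whose theft would grant bulk access to the store, which is the property VeriFyle is claiming.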

CEO Jack Smith has high hopes for his company’s new technology:

Key management should be invisible to the end-user and it should maximize users’ security and peace of mind without burdening them with extra steps and add-on products. VeriFyle is the first all-in-one product that combines advanced key management technology with cloud sharing and messaging. The result is a significantly more secure way to share data.

Silent scanners: Emergency communications encrypted across Nova Scotia

SYDNEY — Citizens who like listening in on police, fire department and ambulance calls are out of luck, now that most emergency services communications in Cape Breton are conducted on fully encrypted radios.

The scanners have gone silent, for the most part, with the introduction of the second generation of Trunk Mobile Radio (TMR2) communications.

Being unable to monitor police traffic can be dangerous for citizens, said one longtime listener who didn’t want to be named.

“You don’t know what’s going on in the city unless you have a scanner,” said the citizen, who lives in Sydney’s north end.

“No offence to radios or newspaper, but you don’t hear everything that goes on.”

Recently, police were looking for a man seen on Dolbin Street, who reportedly had a gun. Thankfully, people listening to scanners were able to alert neighbours to stay inside, the citizen said.

“Certain things cannot be aired over the scanner, of course. It’s common logic. But they shouldn’t be blocking everything out.

“I’ve asked the police several times, and they say it’s not illegal to have a scanner. It’s illegal to follow the police cars when you have a scanner, because that’s interfering.”

The citizen said no one has yet heard of a way to crack the new encryption.

“I was hoping you would have heard,” the listener said.

Cape Breton Regional Police spokeswoman Desiree Vassallo said police haven’t heard any complaints from citizens about the encryption system.

She said police need secure communications, especially during sensitive operations when police don’t want suspects or the public to know exactly where they are.

Listeners can still occasionally hear some fire department traffic, because the Cape Breton Regional Municipality’s volunteer fire departments only have four radios each, for now.

The municipality is considering buying more radios for volunteers, but for now, fire department commanders use the TMR2 radios to talk to the dispatch centre and other emergency personnel, such as the police and ambulance services.

The commanders then communicate with individual firefighters using the older very-high-frequency (VHF) radios, which scanners can pick up.

That means listeners may hear some radio traffic, but not necessarily the most critical information, such as the location or severity of a fire or emergency scene.

Fire Chief Bernie MacKinnon said encryption is not important for fire departments, in part because fires are usually obvious and people can phone their neighbours or put messages out on social media anyway.

“TMR2 encryption is a police animal,” he said.

“When we have a raging fire, it’s not a secret. Even if we didn’t use the radios, everybody in the world is going to know, especially with the emerging technology that’s out there today.”

However, he said, maintaining clear communications with other emergency services is important.

Whether the service outfits all volunteer firefighters with the new radios is still under discussion, said MacKinnon, but it’s likely both VHF and TMR2 radios will be used for some time to come.

“To the best of my knowledge, outside of Halifax, all the other departments are running a hybrid system of using VHF in combination with TMR,” he said.

Twitter Security Pro: Encryption Isn’t Enough

Encryption can appear to be priceless when it’s absent, as it was in the recent Office of Personnel Management breach. It can appear to be costly when it’s present, as FBI director James Comey has argued. But not everything is as it appears.

Michael Coates, trust and information security officer at Twitter and global board member of the Open Web Application Security Project (OWASP), suggests encryption gets more credit than it deserves.

“Encryption is thrown around as the solution to prevent people from seeing your data,” said Coates in an interview at InformationWeek’s San Francisco office. “But if you dive into the dynamics of how data is stolen, you’ll find that encryption actually is not effective in those scenarios.”

Coates described a scenario involving a database with encrypted information. In order for a Web application to work with that database, it must decrypt the data.

“The way that data is most often compromised is through a vulnerability in the Web application … So when the attacker steals the data, that data will be unencrypted.”

Along these lines, a DHS official has asserted that encryption would not have helped in the OPM breach because the attacker had valid credentials. It may also turn out that encryption’s ability to conceal crime from the authorities is overstated.

Coates stopped by in his OWASP capacity in order to promote the OWASP Application Security Conference, which takes place Sept. 22 through 25 in San Francisco. The aim of the conference is to raise the bar for application security by helping individuals and organizations understand how to build better defended software.

“There’s a definite security talent shortage, so by educating more people we’re hopefully bringing more people into the fold,” said Coates.

Coates hopes the conference will provide companies with specific actions they can take to make their software more secure and with a roadmap to integrate best practices into their software development life cycle.

There are companies doing a good job with security, said Coates, citing Google, Facebook, Mozilla (where he used to work), Netflix, and Twitter (where he currently works). “The challenge is what do you say to the industry at large, to the companies in the Midwest that have one security person. … They can’t hire all these people and build custom solutions.”

Coates agrees with Google and other computer security professionals about the need for access to intrusion software, something that could become more difficult if proposed export controls are adopted. “I think security engineers need both [offensive and defensive] skillsets,” he said. “Training someone how to attack software that they need to defend is vital. Anything less than that is just putting blinders on their eyes.”

At the same time, Coates is focused on providing developers with the tools and knowledge to write secure code. “We can’t just run around hacking ourselves secure,” he said. “Instead, we have to say, ‘I understand the symptom, how do I build a solution that is comprehensive and stops this problem from happening again in hundreds of applications?'”

Pointing to the way Java limits buffer overflow errors through array bounds checking and the way Python’s Django framework uses templates to prevent cross-site scripting, Coates expects some help will come through advances in programming languages that limit unsafe coding practices.
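
The framework-level protection Coates points to, with Django's template auto-escaping as his example, comes down to escaping untrusted input before it reaches the page so the language itself forecloses the unsafe practice. A minimal stdlib sketch of the same idea (Django's template engine performs this escaping automatically):

```python
import html

def render_comment(comment: str) -> str:
    # Escape untrusted user input before embedding it in markup,
    # so an injected <script> tag is displayed as text, not executed.
    return "<p>" + html.escape(comment) + "</p>"
```

For example, `render_comment('<script>steal()</script>')` produces `<p>&lt;script&gt;steal()&lt;/script&gt;</p>`, neutralizing the injection attempt.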

But because each application is unique and there are still so many ways to introduce vulnerabilities, Coates is pushing for security training, and for security as part of the software life cycle. “You can’t have security be this other team where you just throw things over the wall and fix stuff,” he said. “That’s a bottleneck and the business grinds to a halt. So you have to have this integrate into the life cycle and have tools that scale, because the cost of human capital for security is really high. And that’s what I see in enterprises that are doing well. They’ve found a way to minimize the human involvement and instead use highly accurate automation.”

Coates recommends that companies implement content security policies for their Web applications to defend against cross-site scripting. He also suggests using SSL everywhere and HSTS (HTTP Strict Transport Security) as defenses against man-in-the-middle attacks. He also advises use of the X-Frame-Options header, to prevent clickjacking (UI redress attacks).
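
Coates's recommendations map directly onto a small set of HTTP response headers. The specific values below are common baseline choices, not ones prescribed in the interview:

```python
# Baseline hardening headers for a web application's responses.
SECURITY_HEADERS = {
    # Restrict where scripts and other resources may load from,
    # blunting cross-site scripting (XSS).
    "Content-Security-Policy": "default-src 'self'",
    # HSTS: tell browsers to use HTTPS for all future requests,
    # defending against man-in-the-middle downgrade attacks.
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    # Refuse to be framed by other origins, preventing clickjacking.
    "X-Frame-Options": "DENY",
}

def apply_security_headers(headers: dict) -> dict:
    # Merge the baseline into an existing response header map,
    # letting the security values win on any collision.
    merged = dict(headers)
    merged.update(SECURITY_HEADERS)
    return merged
```

Most web frameworks let middleware attach these headers to every response, which is how they are typically deployed in practice.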

“Fundamental security at the application layer and strong access controls at the enterprise layer governing who can interact with the data, those turn into the bread and butter of security,” said Coates. “And that’s where people need to spend the time.”

It’s Time to End the “Debate” on Encryption Backdoors

Yesterday, on Lawfare, FBI Director James Comey laid out his concern that the growing adoption of strong encryption technologies will frustrate law enforcement’s ability to conduct investigations — what he calls the “Going Dark” problem. The gist of Comey’s position is this: He recognizes encryption is important to security and privacy, but believes we are fast approaching an age of “universal encryption” that is in tension with the government’s investigative needs. Although he assures us he is not a “maniac,” Comey also feels it is his duty to ensure that we have a broad public debate that considers the costs as well as the benefits of widespread encryption. Comey will presumably be making the same points tomorrow afternoon at a Senate Intelligence Committee hearing where he will be the sole witness, while a broader panel of witnesses will be testifying on the same controversy tomorrow morning before the Senate Judiciary Committee.

First, credit where credit is due: James Comey is certainly not a maniac but a dedicated law enforcement official, one who has in the past put his career on the line to impose the rule of law on overreaching government surveillance. And it’s true that encryption will likely frustrate some investigations, a point I addressed directly when I testified at a House hearing on the subject in April. It’s also true that the FBI has so far failed to come up with any compelling examples of how encryption has actually stymied any investigations, and the latest wiretapping report shows that encryption is not yet a significant barrier to FBI electronic surveillance — encryption prevented law enforcement from obtaining the plaintext of communications in only four of the 3,554 criminal wiretaps authorized in 2014! Even so, it’s a given that just as ordinary citizens use encryption, so too will criminals, and that will likely pose a challenge for law enforcement in some cases.

So we are not “talking past each other” on encryption, as Comey puts it. Rather, since he first raised this issue last October, there has been an incredibly robust debate (as reflected in this massive collection of recent statements and writing on the subject), directly addressing the Director’s suggestion that companies should engineer their encrypted products and services to enable government surveillance. As that debate reflects, the broad consensus outside of the FBI is that the societal costs of such surveillance backdoors — or “front doors,” as Comey prefers to call them — far outweigh the benefits to law enforcement, and that strong encryption will ultimately prevent more crimes than it obscures.

Tech companies, privacy advocates, security experts, policy experts, all five members of President Obama’s handpicked Review Group on Intelligence and Communications Technologies, UN human rights experts, and a majority of the House of Representatives all agree: Government-mandated backdoors are a bad idea. There are countless reasons why this is true, including: They would unavoidably weaken the security of our digital data, devices, and communications even as we are in the midst of a cybersecurity crisis; they would cost the US tech industry billions as foreign customers — including many of the criminals Comey hopes to catch — turn to more secure alternatives; and they would encourage oppressive regimes that abuse human rights to demand backdoors of their own.

Most of these arguments are not new or surprising. Indeed, it was for many of the same reasons that the US government ultimately rejected the idea of encryption backdoors in the 90s, during what are now called the “Crypto Wars.” We as a nation already had the debate that Comey is demanding — we had it 20 years ago! — and the arguments against backdoors have only become stronger and more numerous with time. Most notably, the 21st century has turned out to be a “Golden Age for Surveillance” for the government. Even with the proliferation of encryption, law enforcement has access to much more information than ever before: access to cellphone location information about where we are and where we’ve been, metadata about who we communicate with and when, and vast databases of emails and pictures and more in the cloud. So, the purported law enforcement need is even less compelling than it was in the 90s. Meanwhile, the security implications of trying to mandate backdoors throughout the vast ecosystem of digital communications services have only gotten more dire in the intervening years, as laid out in an exhaustive new report issued just this morning by over a dozen heavy-hitting security experts.

Yesterday, Comey conceded that after a meaningful debate, it may be that we as a people decide that the benefits of widespread encryption outweigh the costs and that there’s no sensible, technically feasible way to guarantee government access to encrypted data. But the fact is that we had that debate 20 years ago, and we’ve been having it again for nearly a year. We are not talking past each other; a wide range of advocates, industry stakeholders, policymakers, and experts has been speaking directly to Comey’s arguments since last fall. Hopefully he will soon start listening, rather than dooming us to repeat the mistakes of the past and dragging us into another round of Crypto Wars.

We have already had the debate that Comey says he wants. All that’s left is for him to admit that he’s lost.

Encryption, Privacy, National Security And Ashley Madison

So, as about a million Australians quietly shit themselves as the Ashley Madison data breach starts to bleed data, we have the UK government talking about banning encryption. Although they have backtracked to some degree, UK Prime Minister David Cameron told his parliament the country needed to crack down on encryption in order to make it harder for terrorists to communicate.

While the Ashley Madison hack is barely surprising — mega-breaches are a fact of life in today’s world — there’s a whole level of cock up associated with not encrypting such sensitive data. And if encryption becomes harder to access, we can expect sensitive data to be not only captured but easily read and shared. And the site’s failure to actually delete the data it promised to remove with its paid-for profile removal service suggests the story will play out in the courts.

So, what’s happening in the Australian policy world when it comes to the balancing act between security and privacy? We spoke with Tobias Feakin, director of the International Cyber Policy Centre and Senior Analyst for National Security at the Australian Strategic Policy Institute, a bipartisan body through which he works with and directly advises the government on cyber security matters.

“I think that’s the problem with the discussion right now. There’s a dichotomy that governments find themselves in. What is their primary responsibility? To protect the nation from whatever serious threat might be of the day. But here are all these other responsibilities about promoting good business practice and good cyber hygiene”.

Feakin pondered whether incidents like the Ashley Madison breach would drive governments to consider mandating the use of encryption on data.

However, there’s a real balancing act in all of this. Encrypted data can be a significant barrier that hampers police investigations but there are clear benefits when it comes to protecting the privacy of individuals and companies.

“For me, it’s about having a decent public policy discussion,” says Feakin. “It’s something that needs to be nurtured… in the Australian context is a more mature conversation around national security threats. More in terms of shaping them as risks rather than just threats because there is a distinct difference”.

Feakin noted the need to provide balance to the debate.

“I’m always very careful… to say we’ve got to keep this in perspective. We live longer lives. We’re safer than at any point in human history.”

US officials target social media, encryption after Chattanooga shooting

Was the Chattanooga shooter inspired by IS propaganda? There’s no evidence to back the claim, but some officials are already calling for access to encrypted messages and social media monitoring. Spencer Kimball reports.

It’s not an unusual story in America: A man in his 20s with an unstable family life, mental health issues and access to firearms goes on a shooting spree, shattering the peace of middle class life.

This time, the shooter’s name was Muhammad Youssef Abdulazeez, a Kuwaiti-born naturalized US citizen, the son of Jordanian parents of Palestinian descent. And he targeted the military.

Abdulazeez opened fire on a recruiting center and naval reserve facility in Chattanooga, Tennessee last Thursday. Four marines and a sailor, all unarmed, died in the attack.

But the picture that’s emerged from Chattanooga over the past several days is complicated, raising questions about mental health, substance abuse, firearms, religion and modernity.

Yet elected officials have been quick to suggest that events in Chattanooga were directly inspired by “Islamic State” (also known as ISIL or ISIS) Internet propaganda, though there’s still no concrete evidence to back up that claim.

“This is a classic lone wolf terrorist attack,” Senator Dianne Feinstein told US broadcaster CBS. “Last year, 2014, ISIL put out a call for people to kill military people, police officers, government officials and do so on their own, not wait for direction.”

And according to Feinstein, part of the solution is to provide the government with greater access to digital communications.

“It is now possible for people, if they’re going to talk from Syria to the United States or anywhere else, to get on an encrypted app which cannot be decrypted by the government with a court order,” Feinstein said.

Going dark

Two years ago, former NSA contractor Edward Snowden revealed the extent of US government surveillance to the public. Responding to public outcry in the wake of the NSA revelations, companies such as Facebook, Yahoo, Google and others stepped up efforts to encrypt users’ personal data.

But the Obama administration, in particular FBI Director James Comey, has expressed growing concern about encryption technology. Law enforcement argues that even with an appropriate court order they still cannot view communications masked by such technology. They call it “going dark.”

Feinstein and others believe that Internet companies have an obligation to provide law enforcement with a way to view encrypted communications, if there’s an appropriate court order. But according to Emma Llanso, that would only create greater security risks.

“If you create a vulnerability in your encryption system, you are creating a vulnerability that can be exploited by any malicious actor anywhere in the world,” Llanso, director of the Free Expression Project at the Center for Democracy and Technology, told DW.

Monitoring social media

It’s not just an issue of encryption technology. There’s also concern about how militant groups such as the “Islamic State” are using social media, in particular Twitter.

“This is the new threat that’s out there over the Internet that’s very hard to stop,” Representative Michael McCaul told ABC’s This Week. “We have over 200,000 ISIS tweets per day that hit the United States.

“If it can happen in Chattanooga, it can happen anywhere, anytime, any place and that’s our biggest fear,” added McCaul, the chairman of the House Homeland Security committee.

In the Senate, an intelligence funding bill includes a provision that would require Internet companies to report incidents of “terrorist activity” on their networks to authorities.

According to Llanso, such activity isn't defined anywhere in the provision, which means companies would have an incentive to overreport in order to meet their obligations. And even speech clearly protected by the US First Amendment can incite violence, said Philip Seib, co-author of "Global Terrorism and New Media."

“If somebody puts something up on Facebook that says Muslims are being oppressed in the Western world, maybe that’s an incentive to somebody to undertake a violent act,” Seib told DW. “But you can’t pull that down, that is a free speech issue.”

Islamist connections?

In the case of Chattanooga, it’s unclear how government access to encrypted communications or requiring social media reporting would have stopped the shooting. One of Abdulazeez’s friends told CNN that the 24-year-old actually opposed the “Islamic State,” calling it a “stupid group” that “was completely against Islam.”

But Abdulazeez was critical of US foreign policy and expressed a desire to become a martyr in his personal writings, according to CNN sources. His father had once been placed on a terrorist watch list over alleged donations to a group tied to Hamas, but was later cleared. Abdulazeez also spent seven months in Jordan visiting family in 2014.

He also reportedly viewed content related to radical cleric Anwar al-Awlaki. An American citizen, Awlaki was killed in 2011 by a US drone strike in Yemen for alleged ties to al Qaeda in the Arabian Peninsula.

“The Guardian” reported that just hours before the shooting spree, Abdulazeez sent a text message to a friend with a verse from the Koran: “Whosoever shows enmity to a friend of Mine, then I have declared war against him.”

Guns, drugs and depression

Abdulazeez reportedly suffered from depression and had suicidal thoughts. He abused alcohol and drugs, including marijuana and caffeine pills. He had recently been arrested and charged with driving under the influence, with a court date set for July 30. He also took muscle relaxants for back pain and sleeping pills for a night shift at a manufacturing plant, according to the Associated Press.

His family life was also unstable. In 2009, Abdulazeez’s mother filed for divorce, accusing his father of abuse. The two later reconciled, according to the “New York Times.”

And he had access to guns, including an AK-47 assault rifle. Abdulazeez liked to go shooting and hunting. He also participated in mixed martial arts.

Officials told ABC News that Abdulazeez had conducted Internet research on Islamist militant justifications for violence, perhaps hoping to find religious atonement for his problems.

“The campaigns by the Western governments – the US primarily, the Brits and others – have indicated that they don’t really understand what’s going on in the minds of many young Muslims,” Seib told DW.

“The Western efforts don’t ring true amongst many people they seek to reach because on issues such as human rights the Western governments don’t have much credibility,” he added.

Passphrase.io Uses Bitcoin-level Encryption To Create A Safe Online Notepad Service

Passphrase.io – A Social Experiment With Lots of Potential

Storing sensitive data in a secure and safe environment is not an easy task for most people. Even though there are multiple guides on the internet about how to store data, and even encrypt it if needed, doing so is still a hassle for most people. After all, our society values convenience above almost anything else, even if it comes at the cost of security.

On top of that, even if a user manages to create a backup of their sensitive data, there is still the question of what type of media to use. Storing a text file with passwords in the cloud is not the best of ideas, and physical storage is subject to wear and tear. Plus, there is always the risk of physical storage being stolen or thrown away by accident. Alternative solutions have to be created, and that is exactly what Passphrase.io aims to do.

The way Passphrase.io works is rather simple: open up the website, enter your passphrase, and type the text you want to save in the notepad. It is important to remember the passphrase you entered at the beginning, as this "token" will be used to authenticate access to your notepad in the future. Rather than forcing users to create an account, a passphrase provides a more user-friendly authentication procedure.

Creating a passphrase may seem easy at first, but don't be fooled by the platform's simplicity. It is imperative to create a strong and lengthy passphrase. Shorter sentences, or combinations lifted from games, music, movies, or TV shows, have a higher chance of being stumbled upon by malicious individuals.
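
A rough way to see why length and randomness matter is to count entropy bits, assuming an attacker has to guess uniformly from the pool of possibilities. The function and numbers below are illustrative, not anything from Passphrase.io itself:

```python
import math

def entropy_bits(pool_size: int, length: int) -> float:
    """Bits of entropy for `length` independent picks from a pool of `pool_size` options."""
    return length * math.log2(pool_size)

# Four random words from a 2,000-word list already beat eight random
# lowercase letters. A quote from a movie or song, by contrast, has
# near-zero entropy: the attacker can simply look it up.
four_words = entropy_bits(2000, 4)   # ~43.9 bits
eight_chars = entropy_bits(26, 8)    # ~37.6 bits
```

Every extra random word multiplies the attacker's work by the size of the word list, which is why long passphrases win.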

As soon as such a service launches, there is the unavoidable question of how secure a platform like Passphrase.io really is. According to the developers, all of the information is encrypted in the user's browser, making it impossible for the service to see plain-text notepad content or passphrases. Once you click "Save" in your notepad, all data is encrypted with AES-256, after which an SHA-256 hash is run on the user's passphrase.
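
The general pattern described above can be sketched in a few lines. This is a minimal illustration of how a passphrase can map to an AES-256 key via SHA-256, not Passphrase.io's actual source code, which runs in the browser and is not quoted here:

```python
import hashlib

def passphrase_to_key(passphrase: str) -> bytes:
    # SHA-256 always yields 32 bytes -- exactly the key size AES-256 expects,
    # so the hash of the passphrase can serve directly as the encryption key.
    return hashlib.sha256(passphrase.encode("utf-8")).digest()

key = passphrase_to_key("my long memorable passphrase")
assert len(key) == 32  # 256 bits, ready to feed an AES-256 cipher
```

Note that production systems usually prefer a deliberately slow key-derivation function such as PBKDF2 or scrypt over a single SHA-256 pass, precisely because a fast hash makes short passphrases cheap to brute-force.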

And this is where things draw a major parallel to Bitcoin’s ideology. Similar to how Bitcoin users need to remember their private key in order to access funds, Passphrase.io users need to keep their passphrase safe at all times. There is no recovery for a Bitcoin wallet when you lose the private key, and there is no recovery process for Passphrase.io either.

Last but not least, the encrypted passphrase and hash