- Privacy -

Last update 23.09.2017 18:46:51


75 Percent of U.S. Companies Think GDPR Doesn't Apply to Them

10.7.2017 securityweek Privacy

A new report focusing on Europe's General Data Protection Regulation (GDPR) preparedness shows a worrying disconnect between Business and Security. GDPR will come into effect in May 2018, and perhaps more than any other security regulation will require close cooperation between Business, IT and Security to enable and ensure regulatory compliance across the whole organization. The penalty for failure is severe: up to €20 million or 4% of global turnover -- and the reach of the regulation is effectively global.

NTT Security interviewed 1,350 non-IT decision-makers across the globe. It sought to understand GDPR awareness across the business, and to measure how well information security policies are communicated. The results (PDF), it suggests, are mixed. While there is some improvement in general security policies, there is poor understanding of security-related regulations in general, and of GDPR in particular.

This lack of understanding starts from the very basics: only 4 in 10 respondents recognize that GDPR will affect their own organization. The reality is that it will affect any business anywhere in the world that trades with the European Union or has any customers that are citizens of the European Union.

This lack of understanding is greatest in America, where 75% of businesses do not believe that GDPR is relevant to them -- a figure similar to Australia (74%) and Hong Kong (71%).

The most promising figures, unsurprisingly, come from within Europe: Switzerland (42%), and Germany and Austria (53%). More surprising, however, is the lack of awareness in the UK, where 61% of respondents are unaware of the GDPR implications.

It's not clear why the UK has such a low level of awareness. It could be down to Brexit and a feeling that EU regulations will no longer apply in the UK -- but this would be a false assumption. The UK will still be a part of the EU when GDPR becomes part of UK law in May 2018; and even outside of Europe, UK companies -- like US companies -- will need to comply if they wish to trade in or with Europe. Or it could be a response to the traditional 'light touch' operated by the UK regulator (the Information Commissioner's Office).

The general lack of awareness is more concerning for GDPR than most other security-related regulations because GDPR is not just about security and the prevention of breaches -- it's just as much about how personally identifiable data is handled. For example, a strict requirement is that without other arrangements (such as Privacy Shield) European data must not be exported from Europe. Similarly, it may not be exported to a third party without the user's express permission.

To meet these requirements, companies will need to know exactly where the data is held, and who has access to it. This is potentially problematic given the widespread use of cloud storage and the personal use of cloud apps. To combat this, it will be important for every employee to understand what can and cannot be done with the data, and where it can and cannot be stored. However, a third of all respondents don't even know where their company data is stored. Of the two-thirds who know where it is stored, only 45% are definitely aware of how the new regulations will affect the storage.

Incident response is a second area that will require careful planning. Any breach likely to result in a risk to the rights and freedoms of individuals has to be notified to the relevant EU regulator within 72 hours (although full disclosure can then be staggered). Where there is a 'high risk' to individuals, those affected must be notified directly. Failure in this part of GDPR can result in a penalty of up to €10 million, or 2% of global turnover.
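
As a concrete illustration of the timing constraint, here is a minimal sketch of the 72-hour clock (the function name and structure are illustrative only; the clock formally starts when the controller becomes aware of the breach):

```python
from datetime import datetime, timedelta, timezone

# GDPR requires the supervisory authority to be notified within 72 hours
# of the controller becoming aware of a qualifying breach.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(became_aware_at: datetime) -> datetime:
    """Latest moment by which the relevant EU regulator must be notified."""
    return became_aware_at + NOTIFICATION_WINDOW

# Example: a breach confirmed at 09:00 UTC on 4 June 2018.
aware = datetime(2018, 6, 4, 9, 0, tzinfo=timezone.utc)
print(notification_deadline(aware))  # 2018-06-07 09:00:00+00:00
```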

To comply with disclosure requirements, companies need to have a detailed and thorough incident response plan in place; and for this to be effective, all aspects of the business (not just IT and Security) need to know exactly what must be done, and when it must be done.

Less than half (48%) of organizations have an incident response plan, although 31% are implementing one. However, a plan is only words if people do not understand it. Within the 48% of companies with an incident response plan, only 47% of the decision-maker respondents are fully aware of what that plan includes. This is particularly worrying since an effective plan can only be put in place with widespread involvement across the business.

"In an uncertain world," warns Garry Sidaway, an SVP at NTT Security, "there is one thing organizations can be sure of and that's the need to mark the date of 25 May 2018 in their calendars. While the GDPR is a European data protection initiative, the impact will be felt right across the world for anyone who collects or retains personally identifiable data from any individual in Europe. Our report clearly indicates that a significant number do not yet have it on their radar or are ignoring it. Unfortunately, many organizations see compliance as a costly exercise that delivers little or no value, however, without it, they could find themselves losing business as a result, or paying large regulatory fines."

The EU's recent fine of €2.42 billion ($2.73 billion) on Google suggests European regulators will not hesitate in levying large fines for serious and repeated GDPR transgressions.


Microsoft Forces Users to Review Windows 10 Privacy Settings

4.7.2017 securityweek Privacy

Windows 10 users who haven’t installed the Creators Update will soon be notified to review their privacy settings and to install the latest feature update to remain secure, Microsoft announced.

Microsoft has been criticized for its Windows 10 data collection practices, and the French National Data Protection Commission (CNIL) served the company a formal notice to stop collecting excessive user data. The Creators Update addressed these concerns, and as a result CNIL closed the formal notice last week.

Although it claimed the Windows 10 data collection was aimed at improving the overall user experience, Microsoft did listen to feedback and provided users with increased control over their privacy in the Creators Update. Users can now set data collection to Basic or Full, depending on how much usage statistics they want to share with Microsoft.
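
For administrators who want to pin this choice rather than rely on the Settings app, the diagnostic-data level can also be set through policy. Below is a minimal sketch using Python's standard winreg module, assuming the commonly documented AllowTelemetry policy value (1 = Basic, 3 = Full); treat the key path and values as an illustration rather than official Microsoft guidance, and note that writing to HKLM requires administrative rights:

```python
import winreg  # Windows-only module from the standard library

# Assumed policy location for the Windows 10 diagnostic data level.
KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"

def set_telemetry_level(level: int) -> None:
    """Write the AllowTelemetry policy value (1 = Basic, 3 = Full)."""
    key = winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE)
    try:
        winreg.SetValueEx(key, "AllowTelemetry", 0, winreg.REG_DWORD, level)
    finally:
        winreg.CloseKey(key)

set_telemetry_level(1)  # choose the Basic level
```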

Microsoft is now using these changes to push Windows 10 users to review their privacy settings if they haven't done so already. The company will let users postpone the process up to five times, after which the prompt will ask them to confirm their privacy settings.

“Given the Windows 10 Creators Update provides the latest security protections to help keep you safe, we want to help update your device as soon as possible. […] you will have the opportunity to review your privacy settings before your device is eligible to take the Creators Update. If you have not already taken this update, starting this week, we will prompt you to review your privacy settings,” John Cable, Director of Program Management, Windows Servicing and Delivery, notes in a Friday blog post.

The update experience, Cable says, will not change, and users will be able to choose when they want to update to the Creators Update, once their devices are ready.

Updating to the Creators Update, Cable notes, ensures that Windows 10 users benefit from the latest security and usability improvements available to them. In light of the recent WannaCry and NotPetya outbreaks, it's not surprising the tech company is playing the "stay secure" card, especially since the first version of Windows 10 is at end-of-service.

“While you can continue to use this version and your computer will still work, you will no longer receive the monthly quality updates that contain protection from the latest security threats. To remain secure your device should be updated to the latest feature update,” Cable notes.

The “latest feature update,” of course, is Windows 10 Creators Update, and Microsoft is taking steps to ensure users are more likely to update. The company will start notifying them if their devices need to be updated, Cable reveals.


French Regulator Accepts Microsoft's Data Protection Improvements to Windows 10

3.7.2017 securityweek Privacy

CNIL Accepts Microsoft's Data Protection Improvements to Windows 10

CNIL, the French data protection regulator, has closed the formal notice procedure it served on Microsoft on June 30, 2016 over privacy concerns relating to Windows 10. "Since then," says CNIL, "the company has brought itself into line with data protection rules, the formal notice procedure has therefore been closed."

In a statement emailed to SecurityWeek, Microsoft commented, "We are committed to protecting our customers' privacy and putting them in control of their information. We appreciate the French data protection authority's decision and will continue to provide clear privacy choices and easy-to-use tools in Windows 10."

The notice was served last year with three particular concerns: the excessive collection of personal data; the tracking of users' web-browsing without their consent; and a lack of security and confidentiality of users' data. Since then, Microsoft has addressed each issue to CNIL's satisfaction.

On the first, Microsoft has reduced the amount of data it collects by nearly half: "It has restricted its collection to the sole data strictly necessary for maintaining the proper functioning of its operating system and applications, and for ensuring their security," notes CNIL.

On the second concern, Microsoft now makes it clear that an advertising ID is intended to track web-browsing in order to offer personalized advertising. This now has to be activated or deactivated at installation, and users can reverse the choice at any time.

Over security concerns, Microsoft "has strengthened the robustness of the PIN code allowing users to authenticate to all company’s online services, and more specifically to their Microsoft account," notes CNIL: "too common PIN code combinations are now forbidden."

Microsoft has also addressed the other injunctions within the formal notice. It has inserted the information required under Article 32 of the French Data Protection Act; it has requested CNIL authorization for its processing of personal data; it has joined Privacy Shield; and it has ceased placing advertising cookies without obtaining users' consent.

"The Chair of the CNIL has considered that the company had complied with the French Data Protection Act and has therefore decided to proceed to the closing of the formal notice," says the CNIL announcement.

Given the size of the sanctions that will become available to CNIL when the GDPR comes into force in May 2018, it is probably a wise move by Microsoft to get compliance sorted now.


Google's $2.73 Billion Fine Demonstrates Importance of GDPR Compliance

27.6.2017 securityweek Privacy
The European Commission (EC) has levied a €2.42 billion ($2.73 billion) fine against Google because it "has abused its market dominance as a search engine by giving an illegal advantage to another Google product, its comparison shopping service."

While this is an antitrust action, it raises the possibility of similarly large fines under the General Data Protection Regulation coming into force in less than a year's time. That new regulation can set sanctions at up to 4% of a firm's annual global turnover. While this would rarely reach the level of today's fine against Google in absolute terms, it provides the potential for proportionately similar fines against a far larger number of companies than those that might be caught by antitrust regulations.
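
For scale, the GDPR ceiling for the most serious infringements is the greater of €20 million or 4% of worldwide annual turnover, so the exposure grows with the size of the company. A minimal arithmetic sketch:

```python
def gdpr_fine_cap(global_turnover_eur: float) -> float:
    """Upper bound for the most serious GDPR infringements:
    the greater of EUR 20 million or 4% of worldwide annual turnover."""
    return max(20_000_000, 0.04 * global_turnover_eur)

# Example: EUR 5 billion in turnover gives a cap of EUR 200 million.
print(gdpr_fine_cap(5_000_000_000))  # 200000000.0
```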

Today's fine was levied because the EC concluded that firstly, "Google is dominant in general internet search markets throughout the European Economic Area;" and that secondly, "Google has abused this market dominance by giving its own comparison shopping service an illegal advantage."

Google can, and almost certainly will, appeal the decision. In a statement emailed to SecurityWeek, Kent Walker, SVP and General Counsel, commented, "When you shop online, you want to find the products you're looking for quickly and easily. And advertisers want to promote those same products. That's why Google shows shopping ads, connecting our users with thousands of advertisers, large and small, in ways that are useful for both. We respectfully disagree with the conclusions announced today. We will review the Commission's decision in detail as we consider an appeal, and we look forward to continuing to make our case."

The level of the fine was calculated on the basis of a specified formula. "The Commission's fine of €2,424,495,000," explains the EC announcement, "takes account of the duration and gravity of the infringement. In accordance with the Commission's 2006 Guidelines on fines... the fine has been calculated on the basis of the value of Google's revenue from its comparison shopping service in the 13 EEA countries concerned."

It is this use of a known formula that allows us to speculate on any future GDPR fines (for any infringer and not just Google). "Does this case give us any entree as to how the Commission might behave in setting fines when GDPR is in force?" asks Brian Bandey, a Doctor of Law specializing in International IP and cyber issues. "Well we can say that the Commission followed its 2006 'Guidelines on the method of setting fines' with respect to Google."

When they came into force, competition commissioner Neelie Kroes said about them: "These revised Guidelines will better reflect the overall economic significance of the infringement... the link between the fine and the duration of the infringement, and the increase for repeat offenders -- send three clear signals to companies. Don't break the anti-trust rules; if you do, stop it as quickly as possible, and once you've stopped, don't do it again."

Bandey continues, "My personal expectation is that the same approach will be taken with respect to GDPR fines. The EU States hold the concept of individual personalty and their consequent rights very highly. In a sense, that is the moving force behind the GDPR. In the European Commission Fact Sheet on this subject (24th May 2017): 'The reform provides tools for gaining control of one's personal data, the protection of which is a fundamental right in the European Union.'

"In that sense," he adds, "I expect that they will link penalties for breaching these 'fundamental rights' to duration, effects on involved persons, and repeat offending." And as Kroes said, it would be best for companies who breach GDPR to stop as quickly as possible, and not breach it again.

Not everyone thinks that this anti-trust fine will provide a benchmark for future GDPR fines. Dr Monica Horten, a visiting fellow at the London School of Economics, stresses the fundamental difference between the laws. "With this Google fine," she said, "this is a corporation abusing its dominant market position. The underlying motivation is about deliberately seeking to gain market advantage, and simultaneously disadvantaging its competitors. It was a deliberate, proactive move to cut out competition.

"GDPR fines," she continued, "will be imposed by national regulators responsible for data protection in Member States. The GDPR gives national regulators a range of measures they can take before they resort to a fine. With GDPR, the root is more likely to lie in some form of corporate management failure, either through neglect or making false economies and cost-cutting." The implication is that the regulators will be slow to deliver the full force of the regulation.

But that doesn't mean that companies can afford to relax concern about GDPR. With this fine, explains David Flint, senior partner at law firm MacRoberts LLP, "the Commission has sent out a clear signal that it is not afraid to take on the largest entities who it perceives to be breaching EU law. With the introduction of the GDPR next year and its potential for penalties of up to 4% of worldwide turnover, there can be little doubt that US businesses need to take compliance with EU law, be it Data Protection or Antitrust, very seriously.

"Both the GDPR and the Antitrust rules envisage follow-on private actions for damages, so the potential risk, legal, financial and reputational may be significantly higher."

"But let me be absolutely clear," adds Bandey; "nobody really know. But we will do in the not-so-distant future."


Google Stops Scanning Gmail Content for Ad Targeting

26.6.2017 securityweek Privacy

Google on Friday announced plans to stop scanning the content of consumer Gmail addresses for personalizing the ads it serves to users.

Previously, the Internet giant would scan each and every email message received in consumer Gmail addresses, which allowed it to better determine what relevant ads to serve to its users. The only email accounts excluded from this practice were the Google Apps for Education and G Suite accounts.

Now, Google has decided to bring all accounts into line, and Diane Greene, SVP of Google Cloud, announced in a blog post on Friday that consumer accounts will be aligned with the G Suite ones.

“G Suite’s Gmail is already not used as input for ads personalization, and Google has decided to follow suit later this year in our free consumer Gmail service. Consumer Gmail content will not be used or scanned for any ads personalization after this change,” Greene says.

Once the change takes effect, ads displayed to users will be based entirely on their settings, in line with the manner in which the company personalizes ads for other Google products. Furthermore, users will be able to change their settings at any time, and can even disable ads personalization if they desire.

G Suite, which has gained strong traction among enterprise users and has more than doubled usage among large business customers, will continue to be ad free. According to Google, over 3 million paying companies are using G Suite today.

Google’s free email service is very popular among consumers as well, and currently serves more than 1.2 billion users. To make Gmail even more appealing, Google also focused on improving user security and privacy.

In May 2017, the Internet giant rolled out a series of business-focused improvements to Gmail, including early phishing detection capabilities and "click-time warnings" for malicious links. In January, Google announced that Gmail will stop allowing users to attach JavaScript (.js) files to emails.


Kantara Initiative Releases Consent Receipt Form for GDPR

23.6.2017 securityweek  Privacy
With less than one year before GDPR kicks in, the news has been flooded in recent months with surveys showing how ill-prepared business remains. But while there is much news, there has been little in the way of practical technology solutions. The Kantara Initiative released one on Tuesday: a global consent receipt specification that meets GDPR requirements.

'Consent' is one of the big and far-reaching elements of GDPR. Failure to abide by the new consent requirements means failure to comply with GDPR, and potential liability for the regulation's stringent sanctions -- it is no longer simply a matter of preventing breaches.

Consent now must be informed and explicit. This means that in the event of a dispute over the use of personal information, or the transfer of personal data either between applications or to third parties, businesses will need to be able to prove that consent had indeed been given. Online tick-boxes and assumed consent will not suffice.

Kantara's Consent Receipt 1.0 (CR 1.0) (PDF) allows businesses dealing with EU-based companies to demonstrate they meet the notice requirements of GDPR, scheduled to be enforced on May 25, 2018. The specification is available free for download. Its purpose is to decrease the reliance on privacy policies and enhance the ability for people to share and control personal information.
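
To make the idea concrete, a consent receipt is essentially a structured, timestamped record of who consented to what, with whom, and for which purposes. The sketch below serializes an illustrative receipt as JSON; the field names are assumptions in the spirit of CR 1.0, not a verbatim copy of the Kantara schema:

```python
import json
import time
import uuid

# Illustrative consent receipt; field names are hypothetical, loosely modeled
# on the idea behind Kantara's Consent Receipt specification.
receipt = {
    "receiptId": str(uuid.uuid4()),
    "timestamp": int(time.time()),           # when consent was captured
    "jurisdiction": "EU",
    "dataController": {"name": "Example Ltd", "contact": "privacy@example.com"},
    "piiPrincipal": "user-12345",             # the data subject
    "purposes": ["order fulfilment", "email newsletter"],
    "piiCategories": ["name", "email address", "shipping address"],
    "thirdPartySharing": False,
    "withdrawalMethod": "https://example.com/privacy/withdraw",
}

print(json.dumps(receipt, indent=2))
```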

Related: GDPR Industry Roundup - One Year to Go

The Kantara Initiative is a non-profit alliance of companies from around the world involved with digital identity. It connects a global, open, and transparent community that includes CA Technologies, Experian, ForgeRock, Digi.me, Internet Society, Nomura Research Institute and SecureKey.

The consent receipt works both ways. While the business can prove that consent was genuinely given, the user can also define exactly what consent is withdrawn, either on its own or in conjunction with the so-called 'right to be forgotten'.

"Until CR 1.0," explains Colin Wallis, executive director at the Kantara Initiative, "there was no effective privacy standard or requirement for recording consent in a common format and providing people with a receipt they can reuse for data rights. Individuals could not track their consents or monitor how their information was processed or know who to hold accountable in the event of a breach of their privacy. CR 1.0 changes the game," he added. "A consent receipt promises to put the power back into the hands of the individual and, together with its supporting API -- the consent receipt generator -- is an innovative mechanism for businesses to comply with upcoming GDPR requirements. For the first time individuals and organizations will be able to maintain and manage permissions for personal data."

There is, however, the proverbial elephant in the room. The companies that will be most affected by GDPR and consent are the big tech companies like Google, Facebook and Microsoft. It is unknown at this stage whether Europe will have the political will to fully enforce GDPR against the big American giants. If these companies prevaricate over full compliance without redress from Europe, why should other companies worry about something as esoteric as a consent receipt?

SecurityWeek asked the Kantara developers if this was a concern. It is not. "Markets evolve, technologies emerge and people get tired of the same old same old," said one of the consent receipt developers. "Given the rising anger amongst the people that pay for ads on these platforms, and the increasing creepiness of surveillance capitalism, it's not an unreasonable bet to say that both Google and Facebook's days as kings of their hills are numbered. They won't diminish as quickly as Friendster but they will diminish. Both the tech and business press are typically ahistorical and short sighted, so it's not surprising that they are continually surprised by new developments."

His point is that GDPR reflects an almost worldwide shift in attitudes, with consumers becoming more aware of and cynical towards the use of their personal data within surveillance capitalism. "Despite cartel-like market domination in their areas, the actual switching costs for users (and customers) of Facebook and Google are very low."

However, by embracing the new reality of user-centric regulations, companies that rely on user information will better maintain and indeed increase their user numbers. The same basic principles apply to all businesses. Engaging and conforming with user-centric regulations will only strengthen the relationship between business and customers. Kantara's consent receipt form provides compliance with GDPR, and reassurance to customers.


With Less Than 1 Year To Go Companies Place Different Priorities on GDPR Compliance
30.5.2017 securityaffairs Privacy

The European General Data Protection Regulation (GDPR) will take effect in one year from now, but a large number of firms are far from prepared.
It feels like Y2K all over again. We are less than one year until the impact of the GDPR is realized, no one is certain what will happen, and everyone is taking a different approach to mitigation.

In April 2016, the European Union introduced the General Data Protection Regulation (GDPR), and it goes into effect in May 2018. The GDPR aims to “create more consistent protection of consumer and personal data across EU nations.” (https://digitalguardian.com/blog/what-gdpr-general-data-protection-regulation-understanding-and-complying-gdpr-data-protection) One way to summarize the requirements is to say that companies that have operations in the EU or do business with EU citizens must know where EU citizens’ data in their care is located, ensure it is being handled appropriately, remove the data when requested, and notify citizens promptly when their data has been compromised. As an individual, this seems an obvious expectation; but anyone working in a company learns that information has a way of spreading among people and systems, and that trying to control it is very difficult.

“What’s most worrying about the findings,” comments Matt Lock, director of sales engineers at Varonis, “is that one in four organizations doesn’t have a handle on where its sensitive data resides. These companies are likely to have a nasty wake-up call in one year’s time. If they don’t have this fundamental insight into where sensitive data sits within their organizations and who can and is accessing it, then their chances of getting to first base with the regulations are miniscule and they are putting themselves firmly at the front of the queue for fines.” (http://www.securityweek.com/survey-shows-disparity-gdpr-preparedness-and-concerns)
Any company found to be in violation of the regulation faces fines and penalties of up to 4% of its global annual revenue. It is this penalty that has companies taking note and working hard to ensure compliance. But not everyone is taking it seriously, or at least not everyone has started.

A recent survey conducted on behalf of Varonis highlights a disparity between the priorities of company executives and those responsible for ensuring compliance. Among the 500 IT decision makers surveyed, 75% “face serious challenges in being compliant with the EU GDPR” by the deadline. (http://www.securityweek.com/survey-shows-disparity-gdpr-preparedness-and-concerns) That is not surprising when you learn that 42% of company executives do not view compliance by the deadline as a priority. Where does this disparity come from?

The survey included companies from the UK, Germany, France and the US. These companies undoubtedly have different experiences with regulators based on their geographic locations and their operating industries. Some regulators tend to be collaborative in finding a resolution, while others tend towards punitive actions. We don’t yet know how EU regulators will apply the GDPR penalties. Ninety-two percent of respondents expect that a specific industry “will be singled out as an example in the event of a breach” (http://www.securityweek.com/survey-shows-disparity-gdpr-preparedness-and-concerns), with 52% of UK respondents predicting banking, while respondents in France and Germany overwhelmingly expect the example to come from technology and telecommunications.

Regardless of who is first, the scale of the first penalty will be the signal to company executives on how much they should devote to compliance. And as with all business decisions, it is one of minimizing costs to maximize profitability.

56% of UK respondents believe the GDPR will increase complexity for IT teams and result in higher prices for customers, with 22% seeing no benefit to their business. (https://www.infosecurity-magazine.com/news/uk-it-leaders-gdpr-will-drive-up/) With these kinds of numbers, it will be difficult to get executive support for compliance efforts. However, 35% of companies surveyed believe GDPR compliance will be beneficial, with better protection of personal data being the biggest improvement. While the GDPR only addresses personal information, the exercise will help companies understand the effort required to manage data better, and some may see unexpected benefits.

Leading up to January 1, 2000, there were many similar stories about companies taking different approaches to Y2K remediation. Some had enormous, expensive projects running for years; others scrambled at the end of 1999; a few focused on response planning and hoped for the best. The requirements of the GDPR are well documented, but the likelihood and size of penalties are still unknown. Different companies take different approaches based on industry, geography, and individual risk tolerances. The only certainty is that everyone is watching for the first big consumer data breach in the EU in 2018 and hoping it isn’t theirs.


Survey Shows Disparity in GDPR Preparedness and Concerns

26.5.2017 securityweek Privacy
The European General Data Protection Regulation will take effect in exactly one year from today. It will affect any company that does business with the EU, whether that company is based in Europe or elsewhere (such as the US). While there have been many surveys indicating that affected firms are far from prepared, there are few that highlight the geographic disparity in readiness.

One Year Out: Views on GDPR (PDF), conducted by Vanson Bourne for Varonis, is particularly detailed. It surveyed 500 IT decision makers in organizations with more than 1,000 employees in the US (200), the UK (100), Germany (100) and France (100). Unlike many such surveys, it includes the raw data, allowing readers to dig deep into areas of interest or concern.

Unsurprisingly, given other surveys, the headline result is that 75% of respondents "face serious challenges in being compliant with the EU GDPR by 25th May 2018." This result is consistent across all four nations; but those who strongly agree range from 15% in the UK (the lowest) to 25% (the highest) in the US.

The cause of this disparity may be found in senior management's attitude towards GDPR. Overall, 42% of companies do not view compliance by the deadline as a priority. Thirteen percent of firms 'strongly agree' with this -- but the detail ranges from just 6% in the UK to 19% in the US (France and Germany are equal at 10%).

It is tempting to suggest that this is influenced by history: the UK regulator has traditionally been 'business-friendly', allowing companies to be more relaxed towards data protection than counterparts in France and Germany. US companies (apart from the major tech industries such as Google, Facebook and Microsoft), have little experience of European regulators.

But while the survey may indicate a lack of urgency at the management level, the respondents themselves indicate serious concern over the potential effect of GDPR. Overall, 75% of respondents believe that fines imposed for breaching regulations could cripple some organizations. Here, US concerns (81%) are above average, with France being the least concerned at 64%. It would appear that US practitioners are more concerned about GDPR than are their managers.

The survey also provides detail on what aspects of GDPR are most concerning. Not surprisingly, the erasure right (the right-to-be-forgotten) in Article 17 tops the list at 55% overall. Somewhat surprisingly given the apparent link between this and the American constitutional right to freedom of speech, the US respondents were the least concerned at 48%. Equally surprising, UK concern was by far the highest at 71%.

The second biggest concern is the requirement to keep records of processing activities, contained in Article 30; that is, visibility into and control over who has access to the data. Overall concern was steady at 52%, with regional variations limited to the lowest at 50% (UK) and the highest at 53% (US).
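
Article 30 effectively asks each controller to maintain a register of its processing activities. A minimal sketch of what one register entry might capture is shown below; the field choices follow the article's headings (purposes, categories, recipients, transfers, retention, security measures), but the data structure itself is an assumption for illustration:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessingActivity:
    """One entry in an Article 30-style record of processing activities."""
    purpose: str
    data_subject_categories: List[str]
    personal_data_categories: List[str]
    recipients: List[str]
    third_country_transfers: List[str] = field(default_factory=list)
    retention_period: str = "unspecified"
    security_measures: List[str] = field(default_factory=list)

register = [
    ProcessingActivity(
        purpose="payroll",
        data_subject_categories=["employees"],
        personal_data_categories=["name", "bank details", "salary"],
        recipients=["external payroll provider"],
        retention_period="7 years after employment ends",
        security_measures=["encryption at rest", "role-based access"],
    )
]
print(register[0])
```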

"What's most worrying about the findings," comments Matt Lock, director of sales engineers at Varonis, "is that one in four organizations doesn't have a handle on where its sensitive data resides. These companies are likely to have a nasty wake-up call in one year's time. If they don't have this fundamental insight into where sensitive data sits within their organizations and who can and is accessing it, then their chances of getting to first base with the regulations are miniscule and they are putting themselves firmly at the front of the queue for fines.”

The concern showing the greatest disparity is over data protection by design (Article 25). The least concern comes from France at 35%, with the highest from the US at 55% (this is the highest of all concerns for the US respondents). It seems to reflect a general concern that GDPR might impinge on innovation -- with the highest concern coming from perhaps the most entrepreneurial nation.

It would be wrong, however, to think that the respondents have only negative thoughts and worries about GDPR. Thirty-six percent of respondents believe it will be very beneficial for both consumers and organizations. This, however, ranges from a very low 12% in the UK to an encouraging 47% in the US. In purely business terms, 57% of UK respondents believe it will prove troublesome for organizations, while only 36% of US respondents think the same.

The top benefit for private citizens is that their personal data will be better protected (54%). The UK (61%) and the US (59%) lead France (45%) and Germany (47%) in this. The order is reversed, however, over whether GDPR will make it less likely that PII will be passed to third parties. The UK (24%) and the US (32%) are behind both France (35%) and Germany (36%). Confirming these views, very few respondents could see no benefits from GDPR -- and most of those seem to be in the UK (11%). Only 5% of US organizations hold a similar view.

A particularly interesting section of the report deals with expected outcomes from the GDPR, with wide variations on which regulator is expected to be the most stringent. Overall, Germany tops the list at 76%, with German respondents in the lead at 85%. The UK is second overall at 57% -- which could be surprising given the UK regulator's soft historical approach and the UK government's insistence that it will implement GDPR in as business-friendly a manner as possible. This view is distorted, however, by the UK and US respondents' score at 76% each. France (35%) and Germany (24%) are far less confident that the UK regulator will be rigorous.

Ninety-two percent of respondents suspect a particular industry will be singled out as an example in the event of a breach. Banking is seen as the most likely at 26% overall. This figure is distorted by the UK response at 52%. Both France and Germany individually believe that any example will more likely come from the technology and telecommunications industry.

A high number of respondents (82%) also believe that a particular country will be singled out if one of their organizations is in breach of GDPR. The overall favorite is the UK at 23% -- but this is distorted by the UK respondents (48%) who are perhaps concerned with the after effects of Brexit. Noticeably, only 2% of French and 11% of German respondents have a similar view.

Nevertheless, 68% of respondents believe that a UK company (as opposed to the UK in general) will be singled out and punished because of Brexit. This belief is strongest in the US (77%) and the UK (70%), and less so, but still high, in France (58%) and Germany (57%).

What this survey shows above all is that while there is a general lack of preparedness for GDPR among most organizations, specific concerns and expectations can vary widely between the different nations. The level of detail provided goes far beyond many similar surveys, and allows individual readers to dig deeper into specific areas. The value in this is that by evaluating other countries' and organizations' concerns, individual readers can rate their own preparedness.


Consent Control and eDiscovery: Devils in GDPR Detail

5.5.2017 securityweek Privacy
The European General Data Protection Regulation will be in force in just over 12 months: May 25, 2018. This is the date by which all EU nations must have enacted the regulation. Gartner predicts that "by the end of 2018, more than 50 percent of companies affected by the GDPR will not be in full compliance with its requirements."

GDPR will affect all EU-based companies, and all US companies that have any trade with the EU. Despite the threat of hefty non-compliance fines, Gartner is not alone in finding a lack of preparatory urgency among organizations.

"The Gartner data aligns with a survey Imperva recently conducted of IT security professionals at RSA," Imperva's chief product strategist Terry Ray told SecurityWeek. "Our data showed an overall lack of urgency among the IT professionals surveyed, with only 43 percent of respondents indicating that they are evaluating or implementing change in preparation for GDPR."

An April 2017 NetApp survey of 750 CIOs, IT managers and C-suite executives in France, Germany and the UK found that around 10% of companies have yet to begin preparations. Seventy-three percent of respondents have some concern over meeting the GDPR deadline.

A new report (PDF) published Wednesday by Pierre Audoin Consultants (PAC) and sponsored by Reliance acsn also supports the idea that companies do not understand the urgent need for GDPR compliance. Paul Fisher, a research analyst and cyber security lead at PAC, suggests, "The fact that compliance and more especially, GDPR, has such a low priority among our respondents is worrying. I do not believe that they are burying their hands in the sand, more that the implications and complexity of GDPR compliance have not yet fully sunk in."

It is tempting to believe the lack of preparedness is due to a misunderstanding of the nature of the regulation -- a belief that so long as personal data is kept safe, compliance will be assured. This is not true with GDPR. "The big change is that organizations will be financially punished for violations of record keeping and privacy impact assessment obligations, and not just actual data breaches," explains the PAC analysis.

"The increasingly empowered position of individual data subjects tilts the business case for compliance and should cause decision makers to re-evaluate measures to safely process personal data," warns Gartner.

It is this data subject empowerment that particularly makes GDPR different and complex. Simply installing new layers of security will not ensure compliance.

Gartner suggests organizations should focus on "five high-priority changes to help them to get up to speed with GDPR requirements." These are:

Check for GDPR applicability

Appoint a data protection officer (DPO)

Demonstrate accountability in all processing activities

Check cross-border data flows

Prepare for data subjects exercising their rights

The devil is in the detail of that final recommendation. In full, Gartner says, "Data subjects have extended rights under the GDPR. These include the right to be forgotten, to data portability and to be informed (e.g., in case of a data breach). If a business is not yet prepared to adequately handle data breach incidents and subjects exercising their rights, now is the time to start implementing additional controls." An additional right is the data subject's right to withdraw consent for personal data processing.

Compliance and security officers need to consider the effect of data subjects exercising their rights -- and in particular the two issues of withdrawal of consent and the right to be forgotten.

The first issue involves the provision and withdrawal of the data subject's consent. Implied consent and implied cessation are no longer sufficient -- consent must be explicit. Being able to prove that consent was given and continues (that is, has not been withdrawn) is new and will require completely new procedures. Gartner says, "A clear and express action is needed that will require organizations to implement streamlined techniques to obtain and document consent and consent withdrawal." One option could be the Consent Receipt Specification being developed by the Kantara Initiative -- but whatever solution is adopted, maintaining the status quo is not an option.
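
A minimal sketch of the kind of record-keeping this implies: every grant and every withdrawal is logged with a timestamp, so the current consent state for any data subject and purpose can be demonstrated later. The structure is an assumption for illustration, not the Kantara specification or any vendor's API:

```python
from datetime import datetime, timezone

class ConsentLedger:
    """Append-only log of consent grants and withdrawals per subject and purpose."""

    def __init__(self):
        self._events = []  # (timestamp, subject_id, purpose, action)

    def grant(self, subject_id, purpose):
        self._events.append((datetime.now(timezone.utc), subject_id, purpose, "granted"))

    def withdraw(self, subject_id, purpose):
        self._events.append((datetime.now(timezone.utc), subject_id, purpose, "withdrawn"))

    def has_consent(self, subject_id, purpose):
        # The latest event for this subject/purpose decides the current state.
        state = False
        for _, sid, p, action in self._events:
            if sid == subject_id and p == purpose:
                state = (action == "granted")
        return state

ledger = ConsentLedger()
ledger.grant("user-12345", "marketing email")
ledger.withdraw("user-12345", "marketing email")
print(ledger.has_consent("user-12345", "marketing email"))  # False
```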

The second issue -- the right to be forgotten -- requires that an organization should have absolute knowledge of where all EU personal data is stored, and be able to remove it. That is no simple task in the age of cloud and mobility.

The PAC report notes, "Compliance with GDPR will only be legally registered if an organization is able to identify exactly where data is, whether in its own data centres, in the cloud or with a third party. The data controller will be held responsible for data at all times."

This requirement is little different to eDiscovery; but the reality is that few organizations currently have fully effective eDiscovery. Historically, the primary motivation has been litigation and the threat of litigation -- with the implication that if you don't get sued, you don't need eDiscovery.

This will no longer be realistic. Any one of the European data subjects can request -- effectively on a whim -- that all data you hold on them be removed. Organizations will not merely be required to do that, they will need to be able to demonstrate that they can do that. A combination of data classification and eDiscovery needs to be in place by May of next year.
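
In practice that means being able to enumerate every store that may hold a given subject's records, erase them, and keep evidence of the erasure. A minimal sketch, under the assumption that each repository exposes simple find and delete operations (the interface is hypothetical):

```python
from datetime import datetime, timezone

class DataStore:
    """Hypothetical wrapper around one repository (database, archive, cloud bucket)."""
    def __init__(self, name, records):
        self.name = name
        self.records = records  # {record_id: subject_id}

    def find(self, subject_id):
        return [rid for rid, sid in self.records.items() if sid == subject_id]

    def delete(self, record_ids):
        for rid in record_ids:
            self.records.pop(rid, None)

def erase_subject(subject_id, stores):
    """Delete a subject's records everywhere and return an audit trail of what was removed."""
    audit = []
    for store in stores:
        found = store.find(subject_id)
        store.delete(found)
        audit.append({
            "store": store.name,
            "records_removed": len(found),
            "erased_at": datetime.now(timezone.utc).isoformat(),
        })
    return audit

stores = [DataStore("crm", {"r1": "user-12345"}), DataStore("mail-archive", {"r2": "user-999"})]
print(erase_subject("user-12345", stores))
```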

"One of the huge holes for GDPR compliance," Skyhigh's privacy spokesperson Nigel Hawthorn told SecurityWeek, "is third party data handling. Most organizations aren't sure how many third parties process data for them, whether that's an outsourcer or a cloud provider being used to crunch or collaborate on data. The Data Controller is ultimately responsible for data handling of all of their third-party data processors and needs to ensure that the data processor's data handling procedures are robust -- I am sure this will catch out a lot of people."

The message from Gartner, reinforced by many other surveys, is that the task is more complex, and the available time much less, than many organizations realize. Hawthorn adds, "Gartner's prediction that by the end of 2018 less than 50% of organizations will be in full compliance reminds everyone we need to accelerate our efforts now -- as the regulation will have been in force for over 6 months by the end of 2018 and the risks of non-compliance can be huge."

His advice is that "Organizations need to take an holistic approach to GDPR compliance involving teams from multiple departments, led by senior management. The Governance, Risk and Compliance teams need to lead the project but involve IT risk and security along with other teams that are heavy users of data, such as marketing and HR. Sadly, marketing, the team most likely to break the regulations, is rarely involved in the discussions."


eDiscovery - An Enterprise Issue That Can't be Ignored

4.5.2017 securityweek Privacy
eDiscovery for Enterprises

eDiscovery is a concept born from litigation. It describes the need to find and retain electronic data that might be required in litigation ― whether for the plaintiff, the defendant or a third party. In recent years, eDiscovery has become considerably more complex. Business is increasingly litigious; legal obligations such as freedom of information (FoIA) laws and Europe’s General Data Protection Regulation (GDPR) are generating new demands; and the sheer volume and diversity of corporate electronically stored information (ESI) is expanding dramatically.

eDiscovery Requirements

For Litigation
In its original sense, eDiscovery is the process of fulfilling the legal requirement to locate and present documents pertinent to a legal case; that is, litigation support. It goes beyond simple discovery to include the concept of ‘litigation hold’; that is, the safe preservation of such documents.

The need to do this is growing. A recent paper compiled by Osterman Research questioned nearly 150 decision makers from medium and large companies in North America ― and found that 60% of the respondents were somewhat or very worried that their organizations would be sued. The research also indicated that 75% of the organizations had received an average of 12 requests during the past 12 months.

The primary source of litigation obligation in the US comes from the 1938 Federal Rules of Civil Procedure. This was updated in 2006, and again in 2015. It now places greater focus on the preservation of ESI, and makes the failure to produce required documents potentially more expensive. There is effectively no source of ESI that is exempt, whether that is in the cloud, on social media, or stored on employees’ personal devices.

“In short,” notes Osterman, “any electronic information that contains a business record, regardless of the tool that was used to create it or the venue in which it is stored, will potentially be subject to eDiscovery. The amendments to the FRCP in 2006 and 2015 have, for all intents and purposes, made anything from any source potentially subject to eDiscovery.”

For FoIA
While litigation eDiscovery is governed by the Federal Rules, FoIA requests are governed directly by the Freedom of Information Act. The FoIA establishes a statutory right of public access to Executive Branch information in the federal government.

In litigation, only the parties involved can demand eDiscovery, and they can demand it only for litigation-pertinent ESI. FoIA requests, however, can come from anyone, and there are no relevancy requirements. So, while FoIA targets may be fewer (limited to government), the range of requesters is much greater and the requests can cover just about anything.

For GDPR
GDPR is a new type of eDiscovery driver that applies only to companies operating in, or doing business with (such as trading with), the European Union. It includes facets of both litigation discovery and FoIA discovery. Like FoIA, it does not require litigation; but it does require relevancy (that is, the requester must be a customer or a customer’s representative).

GDPR is a user-centric privacy law. It gives users greater control over how their personal information is used by commerce; with potentially huge sanctions on companies that break the law. Two example requirements will demonstrate the need for efficient eDiscovery: the so-called right-to-be-forgotten; and the requirement for unambiguous and revocable informed consent from the user to the company collecting and using personal data.

The only way an organization can comply with either is if it can ‘discover’ all instances of personal data that it needs to forget (remove), and can prove that it has removed those records. Similarly, to honor a withdrawal of consent, it will need a record of the initial consent that has now been revoked.

The Scope of the Difficulty

“eDiscovery is a term that seems simple in conversation ― but no one is truly ready for what it really means,” warns Drew Koenig, security solutions architect at Magenic. “Off the record, I’ve seen a 200% increase in the last 3 years with Lit Holds and eDiscovery involved cases,” commented a CISO who did not wish to be named.

There are two primary categories to the eDiscovery problem: data and organization. The data issue comprises volume, variety of data types, and physical location of that data. The organization problem is one of ownership. Who owns responsibility for eDiscovery?

Volume
The sheer volume of ESI stored by corporations is staggering. Without specific procedures able to find relevant documents, the time and cost involved would be enormous. Part of the volume problem is data classification ― the need to know what data might be relevant.

Variety
eDiscovery draws no distinction over how data is stored. It could be in structured databases and spreadsheets, or in unstructured email, voicemail, documents, presentations or CRM data. If it is stored, it is potentially discoverable.

Location
eDiscovery draws no distinction over where data is stored. It could be on in-house servers, in the cloud, on employees’ personal devices, on websites or in social media accounts ― or with a service provider.

Responsibility
eDiscovery involves multiple departments. IT is responsible for the infrastructure that holds ESI; Security is responsible for protecting it; Compliance is involved through regulations such as GDPR; and Legal is responsible for litigation aspects of discovery. With no single owner taking responsibility for eDiscovery, the danger is that no-one does.

The combination and interaction of these difficulties is a huge problem for many organizations. “Depending on legal requirements a business may have to reach out into social networks, personal home computers (BYOD), cloud services, IoT/mobile devices in addition to corporate assets,” warns Koenig.

“The ever-changing data flows makes a consistent model and applying control sets near impossible. The infrastructure to store and process is usually under-estimated. Most clients, in my experience, begin but collapse under the immense weight of data they realize they truly have. There is a growing compliance need for this, but no security tool will tell you how to be secure or how to classify data. That's up to the business to solve, then find the tools to solve them. eDiscovery is another example that security is a business problem not a technology problem. Without business security processes around data classification and use, no tool will help you fully.”

Solutions in Practice

Despite these difficulties, eDiscovery is a legal requirement that cannot be shirked. Adequate preparation is the key, so that when a discovery request or right-to-be-forgotten demand is made, it can be actioned efficiently.

“Lack of preparation for eDiscovery can expose a business to serious legal and financial risks if the organization can’t find the complete set of information requested,” warns Mike Pagani, chief evangelist at Smarsh, a provider of cloud-based information archiving solutions. “If the information wasn’t retained in an organized way for easy retrieval, or if it was altered in any significant way, that creates significant eDiscovery problems.”

The first task is that of ownership; and there is no single solution. Much will depend on the type and size of the organization.

For Samsung Research America, eDiscovery is owned by Security. Steve Lentz, CSO, explains. “Security is responsible here for anything to do with security, including eDiscovery. We work with the relevant departments, such as IT, Legal, HR, Lab, etc, to gather the data… Bottom line,” he adds, “is that you need to communicate and collaborate with the responsible departments.”

This is a good working model ― effectively a committee of relevant department heads that meets regularly, but with a specific chairman. In this organization, it is Security; in others that might be subject to a high rate of litigation, it could be Legal.

“The answer to the question,” suggests Brian Kelly, chief information security leader at Quinnipiac University, “like many legal questions, is ‘it depends’. The size of the organization is the key. In my role, I see the Information Security function as a ‘support agency’ to both Legal and Compliance (depending on the case being investigated). While I was at a health insurance company, it was part of Internal Audit. Ultimately, I think it’s a combination that works best with Legal directing and Information Security or IT completing tasks.”

The very largest corporations may require something different. “There was a prediction made not long ago,” says Martin Zinaich, information security officer at the city of Tampa, “that a new position would start to appear in larger organizations ― Chief Information Governance Officer ― a combination of Information Security and Governance. If that ever does happen, eDiscovery will have found its home.”

Technology
The volume of ESI, the diversity of data types, and its physical distribution combine to create a problem that for most companies can only be solved by technology. “eDiscovery systems are plentiful, from cloud hosted to on-site,” says Zinaich ― but choice is important. “The reality is most of the cloud based system are more record processing. They help identify, preserve, collect, review and process. The real trick is making sure everything relevant is in the eDiscovery system. On-site packages often tackle the collection of data from disparate systems and the processing of that data.”

Pagani believes the solution is in the cloud. “Modern comprehensive archiving technology can enable eDiscovery for blog posts, social media feeds, instant messages, text messages and much more, all in one platform,” he claims. The single platform is important to avoid multiple separate silos of discoverable ESI. “Furthermore,” he adds, “comprehensive archiving platforms that retain non-email electronic communications in their native, proper context (e.g. a Tweet as a Tweet and not an email representation of a Tweet) should be implemented to prevent material alteration of messages.”

The cloud becomes important, he suggests, “because it offers the scalability needed to keep up with the rapidly expanding volume of information created each day.” But technology alone is not enough ― especially where eDiscovery is based on separate silos of archived ESI.

Manual Processes
“From a pure legal standpoint,” comments Zinaich, “it is advantageous to keep records only as long as they are relevant or legally required. Yet, often that determination is based on the type of data.” An email archive, he explains, “is likely holding spam, solicitations, birthday announcements and other transitory data. Unless each email is categorized, the IT department is stuck with keeping everything, and they are not sure how long they have to keep everything.”

“Keeping track of where everything is, or could be, is daunting. In most cases email is the primary ‘place’ for evidence,” says Kelly. “Microsoft has some great Lit Hold and eDiscovery tools built in to Exchange and Office 365 (email, OneDrive, SharePoint). Collecting and searching can be done through automation and there is a host of vendors and tools out there.” But, he warns, “Making sense of the results is still a manual process that either staff lawyers, legal assistants or IT workers at counsel have the unenviable job of sifting through.”

“Another challenge is redaction,” adds Zinaich. “When information is gathered from all points, it has to be reviewed and appropriately redacted to keep security information, investigation information, intellectual property and other exempt data safe. While there are systems that help with a bit of AI, it is largely a very costly manual process.”

eDiscovery Going Forward

eDiscovery is already a complex issue, involving multiple departments and a mix of business and technology processes. It is going to get worse. Both business and society are increasingly litigious; regulations such as the FoIA and GDPR are likely to increase; and both the volume of ESI and the number of locations where it resides are expanding.

There is one other emerging area that will make matters worse: the internet of things (IoT). It is already here in some areas, and will emerge in others. Consider a company's connected car: if it is involved in an accident, access to the vehicle's logs will be required either to make or to defend a claim.

But it goes further, and even beyond business. “The IoT will move eDiscovery from the Boardroom to the Livingroom,” warns Kelly. “Every divorce lawyer will be looking for logs from NEST thermostats, webcams and maybe even the refrigerator.” Much of that could just as easily apply to the office.


European Parliament Slams Privacy Shield

7.4.2017 securityweek Privacy
The European Parliament on Thursday adopted a resolution (PDF) strongly criticizing the EU-US Privacy Shield. Privacy Shield is the mechanism jointly developed by the European Commission and the US government to replace the earlier Safe Harbor, struck down by the European Court of Justice in 2015. Its purpose is to allow the transfer of EU personal information from Europe to servers in the US.

European law requires that personal information can only be transferred to geographical locations with an equivalent or 'adequate' level of privacy protection. With very different attitudes towards privacy between the US and the EU, it is unlikely that US data protection will ever be considered adequate for EU data. Privacy Shield is designed to provide an agreement between individual US organizations and the EU that they will handle EU data in a manner acceptable to European standards.

Although Privacy Shield has been agreed between the EC and the US and is already in operation, it is not without its critics -- not the least of which is the European Parliament. The stakes are high. While this is not the only legal mechanism for the export of European data to the US, it is the primary one. Others include standard contractual clauses (SCCs); but SCCs are already being challenged by Max Schrems in the Irish High Court. Without an acceptable lawful mechanism, there can be no lawful transfer of EU personal data to the US.

It is generally considered that SCCs will eventually be declared unlawful. "There is the ongoing case in Ireland regarding Standard Contractual Clauses," European privacy consultant Alexander Hanff told SecurityWeek. "This is likely to reach the CJEU and be ruled on in a similar fashion to Safe Harbor which, although will not have a direct impact on Privacy Shield, quite clearly shows the result similar cases (including Binding Corporate Rules and Privacy Shield itself) are likely to achieve."

There is therefore a lot riding on the continuing legality of Privacy Shield. For the moment, this is not as immediately concerning as it may seem. "The EP resolution follows the statement earlier this week from the Commission indicating a review in the Fall," comments David Flint, a senior partner at the MacRoberts law firm. "At this stage, it is merely a reminder of all the matters that the Commission should take account of and noting the residual powers of national DPAs to ban transfers, whilst restating the EP's concerns."

Hanff agrees that there will be little immediate outcome from this resolution. "I am pretty sure that the Commission can ignore the motion and are likely to do so because frankly what other choice do they have at the moment -- if they agree to it, then they are basically accepting that they failed, and the Commission are really not that humble." Politically, he sees a rift in the current Commission between those focused on digital rights and those focused on the Digital Economy; with the latter in the ascendency.

This doesn't mean that there is not a problem. Individual national data protection authorities (DPAs) "do have the power to effectively shut down Privacy Shield by banning transfers based on it on the grounds that it does not meet adequacy requirements," continued Hanff. "They have not done so to date -- I suspect because they have been giving the Commission and the US Government a chance to fix it -- but it seems highly unlikely that that will ever happen."

Hanff notes that there is little actual progress on the Privacy Shield agreement from the US side. "When you consider there is still no Ombudsman and that the Privacy and Civil Liberties Oversight Board is reduced to a non-quorate position where only one of its five seats are currently occupied... even if you completely ignore the woeful inadequacies of the agreement, you cannot ignore that some of the major assurances of that agreement have quite simply not been met. I suspect it is only a matter of time now before one or more of the EU's DPAs makes a stand." The French authority, CNIL, has demonstrated that it would not be afraid to do so, with recent actions against both Google and Microsoft.

One further complication is a hardening of attitudes with the arrival of the Trump administration. "There is no detailed consideration of possible changes as a result of the new US administration, although that remains a significant concern," comments Flint. "The recent policy changes on net neutrality and ISP data sharing exacerbate the concern."

Hanff is more forthright. "One should also be asking questions with regards to the Trump administration and US Congress wiping out ISP privacy rules last week. One must understand that whereas many people focus on the transference of data to a third country when they discuss Privacy Shield (in the case of Privacy Shield, specifically the US) it is not just about the right to transfer; it stems from the right to process - so we must now consider whether a European Citizen visiting the US and using a US carrier for data and voice, have their rights undermined by these recent changes. The obvious answer is yes; however, how we deal with that is much less obvious."

The European Commission is caught in a modern Morton's Fork of its own making. It was instrumental in developing European data protection laws (for human rights reasons), but doesn't wish to abide by them (for economic reasons). Much will hinge on the EC-US talks in the Fall; but today's European Parliament resolution has indicated to the EC what it expects.

If there is no significant move by the US administration to satisfy European concerns, then a rapid legal challenge to the Privacy Shield can be expected. But it should also be noted that the national DPAs do not have to wait for a legal judgment before taking action. The Schrems case that brought down the original Safe Harbor also made it clear that DPAs cannot be bound by EC promulgations. They have, as Hanff notes, "the power to effectively shut down Privacy Shield by banning transfers based on it, on the grounds that it does not meet adequacy requirements."


Microsoft Details Data Collection in Windows 10 Creators Update

6.4.2017 securityweek Privacy
Microsoft on Wednesday revealed details on the data that the next major Windows 10 version, set to arrive next week, will collect from users' computers.

Ever since first announcing Windows 10, the tech giant has faced criticism for collecting a large amount of data on the usage of the platform and applications. In July 2016, France served notice on Microsoft to stop collecting excessive user data without consent, on civil liberty grounds.

In September 2015, the company said that the collected data was meant to improve the overall user experience. Only months before, the company had boosted data collection in Windows 7 and Windows 8.

In January this year, the company took the wraps off a privacy dashboard, meant to provide users with increased visibility and control over the data collected by Microsoft services, and even to let them clear the collected data if they want to.

At the time, Microsoft also revealed that Windows 10 Creators Update will simplify Diagnostic data levels, reduce data collected at the Basic level, and present only two data collection options to users: Basic and Full. The platform update will also bring increased privacy settings, Microsoft said in early March.

Only one week before Windows 10 Creators Update starts rolling out to users, Microsoft decided to provide specific information on the type of data it will be gathering from users’ computers based on the collection level selected.

“The Basic level gathers a limited set of information that is critical for understanding the device and its configuration including: basic device information, quality-related information, app compatibility, and Windows Store,” Microsoft’s Brian Lich explains.

Security level information is also collected as part of the Basic level, with all of the gathered information meant to help identify problems that can occur on a particular device's hardware or software configuration.

When it comes to the Full level, the type of collected data expands dramatically beyond the data gathered in the Basic level, to include device, connectivity, and configuration data; products and services usage data; software setup and inventory data; browsing, search and query data; typing and speech data; and licensing and purchase data.

Thus, users who opt in to this data collection level will allow their Windows 10 machine to send information such as OS version, user ID, Xbox user ID, device ID, device properties and capabilities, app usage, device health and crash data, device performance and reliability data, device preferences and network info, installed applications, content consumption data, and information on purchases made on the device.
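For administrators who want to check or pin the diagnostic level on managed machines, the AllowTelemetry group-policy registry value is the usual lever. The following is a minimal sketch in Python (Windows only; writing requires administrative rights); it assumes the standard policy path, and note that level 0 (Security) is honoured only on Enterprise and Education editions:

    # Minimal sketch: read or set the AllowTelemetry policy value.
    # Levels: 0 = Security (Enterprise/Education only), 1 = Basic, 3 = Full.
    import winreg

    POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"

    def get_telemetry_level():
        # Returns the policy-enforced level, or None if no policy is set.
        try:
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY) as key:
                value, _ = winreg.QueryValueEx(key, "AllowTelemetry")
                return value
        except FileNotFoundError:
            return None

    def set_telemetry_level(level):
        # Requires administrative rights; 1 (Basic) is the lowest consumer setting.
        with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY) as key:
            winreg.SetValueEx(key, "AllowTelemetry", 0, winreg.REG_DWORD, level)

    print("Current policy level:", get_telemetry_level())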

Facing increased scrutiny over its data collection practices, Microsoft appears determined to become more transparent on the matter, so as to avoid further trouble, especially in the European Union, which last year started investigating the tech giant over user privacy issues. To that end, the company has also published a privacy statement.


Microsoft Finally Reveals What Data Windows 10 Collects From Your PC
6.4.2017 thehackernews Privacy
Since the launch of Windows 10, there has been widespread concern about its data collection practices, mostly because Microsoft has been very secretive about the telemetry data it collects.
Now, this is going to be changed, as Microsoft wants to be more transparent on its diagnostics data collection practices.
Until now, Windows 10 users have had three options (Basic, Enhanced, Full) to select from under the diagnostics data collection section, with no option to opt out of sending their data to Microsoft entirely.
Also, the company has never said precisely what data it collects behind these options, which raised huge privacy concerns among privacy-conscious users.
But now for the first time, Microsoft has revealed what data Windows 10 is collecting from your computer with the release of the Windows 10 Creators Update, bringing an end to nearly two years of its mysterious data collection practices.
The Windows 10 Creators Update, which will be available from April 11 for users to download for free, comes with a revamped Privacy settings section.

During the process of upgrading to the Creators Update, you will be displayed a new Privacy Settings screen that will ask you to toggle the following features:
Location – Allow Windows and apps to request your location and share that data with Microsoft.
Speech Recognition – Allow Cortana and Windows Store apps to recognize your voice and send that data to Microsoft to improve speech recognition.
Tailored experiences with diagnostic data – Allow Microsoft to use diagnostic data from your computer to offer tips and recommendations.
Relevant ads – Allow apps to use advertising IDs to show ads more interesting to you based on your app usage.
What's more? On Wednesday, Microsoft published a massive list of diagnostics data – both the Basic and Full levels of diagnostics – on its TechNet site, showing what data gets collected.
Basic – The Basic level collects a limited set of data that is critical for understanding the device and its configuration. This data includes basic device information, quality-related information, app compatibility, and Windows Store.
Full – The Full level collects data across a broader set of categories, including: common data; software setup and inventory data; product and service usage data; browsing, search and query data; content consumption data; linking, typing, and speech utterance data; and licensing and purchase data.

Windows chief Terry Myerson said in a blog post published Wednesday that Microsoft hoped the transparency would allow users to make "more informed choices" as the company starts rolling out its new Creators update to the operating system.
This new transparency around diagnostic data, nearly two years after Windows 10's release, is likely Microsoft's response to European Union regulators, who have been publicly pressuring the company over its privacy practices for the past year.
In February, European Union regulators said they were still unsatisfied with the privacy changes announced by Microsoft and were seeking further clarification from the company.
Marisa Rogers, the privacy officer of Microsoft's Windows and Devices Group, said that the company is planning to "share more information about how [it] will ensure Windows 10 is compliant with the European Union's General Data Protection Regulation."


Kantara Initiative Assists With EU Privacy and GDPR Issues

4.4.2017 securityweek Privacy
The US-based Kantara Initiative announced today that it has joined the European Trust Foundation to help its non-EU government and corporate members engage with Europe on pan-jurisdiction federated digital identity, trust and privacy initiatives.

The advent of the General Data Protection Regulation (GDPR) turns Kantara's development of good business practices into legal requirements for any enterprise that has a single customer within the European Union. The new alliance will make it easier for US business to engage with the European Commission over such issues.

There are still fundamental misconceptions in the common understanding of the GDPR: firstly, that it only involves European companies; and secondly, that it solely concerns the protection of personal data from being hacked. Neither is true. Any company anywhere in the world that trades with Europe is affected; and data protection now involves far more than the protection of data. GDPR shifts the emphasis from company security to customer protection and involvement: secure customer relations are now a focus.

The issue is demonstrated by GDPR's 'consent' requirements. For a business to process personal data, it must now obtain consent, defined in article 4(11) as "any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her."

The detail, requiring explicit informed consent (tick boxes and obscure T&Cs are no longer sufficient), will require changes to business practices. But consent can also be withdrawn -- and that will require changes to business processes. Commercial enterprises will need to manage consent as effectively as they manage identity; and indeed, the two become woven together.

This is where Kantara comes in. Its Consent Receipt Specification is a record of consent provided to an individual at the time the consent is given. The purpose is effectively to verify a consent contract, but it also provides a mechanism for the withdrawal of that consent. Coupled with a second evolving Kantara specification, User Managed Access (UMA) -- which enables the user to control how his or her data is shared -- these new initiatives could help provide a solution to the GDPR consent requirements.
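A consent receipt is easiest to picture as a small stored record that can later be matched against a withdrawal. The sketch below is a minimal illustration in Python; the field names are assumptions loosely inspired by the consent receipt idea, not the exact Kantara schema:

    import json
    import uuid
    from datetime import datetime, timezone

    def issue_consent_receipt(subject_id, controller, purposes, pii_categories):
        # Field names are illustrative, not the Kantara specification schema.
        return {
            "receipt_id": str(uuid.uuid4()),
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "data_subject": subject_id,
            "data_controller": controller,
            "purposes": purposes,              # e.g. ["newsletter"]
            "pii_categories": pii_categories,  # e.g. ["email address"]
            "revoked": False,
        }

    def revoke_consent(receipt):
        # Keep the original receipt and mark it revoked: this preserves the
        # audit trail that a withdrawal of consent requires.
        receipt["revoked"] = True
        receipt["revoked_at"] = datetime.now(timezone.utc).isoformat()
        return receipt

    receipt = issue_consent_receipt("user-123", "Example Corp", ["newsletter"], ["email address"])
    print(json.dumps(revoke_consent(receipt), indent=2))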

Kantara's new relationship with the European Trust Foundation, which has a history of working closely with the European Commission, will help US consent mechanisms be accepted as adequate for the GDPR. But it is not just a one-way matter of compliance. It doesn't simply provide part of the legal basis for the transfer of personal data out of the EU; it is also part of the legal basis for making automated decisions relating to that personal information.

Consent receipts and user managed access are not simply a GDPR solution, they are good practices for the modern world. User trust in vendors' use of PII is low. If that can be improved so that secure customer relations can replace old-style hidden and obfuscated personal data collection, then new avenues for business will emerge.

In Kantara's own words, "When individuals are forced to sign organization-centric privacy policies/ terms of use, then this places limitations on the information that will be shared. If such constraints were removed, and capabilities built on the side of the individual, then new, rich information will flow -- including actual demand data (as opposed to derived/ predicted demand)."

But whatever solutions to GDPR requirements are chosen by US (or any non-EU) business, they will need to be accepted as adequate by the European Union -- and this is the aim of the new relationship between Kantara and the European Trust Foundation. "The European Trust Foundation aims to provide a valuable service to Kantara members located outside of Europe by helping to streamline the engagement process with the EU," said Colin Wallis, executive director, Kantara Initiative. "The foundation and organizations like Kantara act as a 'staging area' to help expedite the process of gathering information and presenting a common voice for non-EU countries to approach and engage with the EU on GDPR."


Telegram Messenger Adds AI-powered Encrypted Voice Calls
31.3.2017 thehackernews Privacy

Joining the line with rival chat apps WhatsApp, Viber, Facebook Messenger, and Signal, the Telegram instant messaging service has finally rolled out a much-awaited feature for the new beta versions of its Android app: Voice Calling.
And what's interesting? Your calls will be secured with emoji-based verification, and call quality will be improved using artificial intelligence.
No doubt the company brought the audio calling feature quite late, but it's likely because of its focus on security — the voice calls on Telegram are by default based on the same end-to-end encryption methods as its Secret Chat mode to help users make secure calls.
Unlike Signal or WhatsApp, Telegram does not support end-to-end encryption by default; instead, it offers a 'Secret Chat' mode, which users have to enable manually, to completely secure their chats from prying eyes.
However, the voice calling feature in Telegram supports end-to-end encryption by default, enabling users to make calls in a way that no one, not even Telegram or law enforcement, can intercept.
Emoji-Based Secure Key Exchange Mechanism
Telegram features an interesting key exchange mechanism to authenticate users and make sure their calls are even more secure: Users are required just to compare four emoji.
While making a call, you will see four emoji on your mobile screen, and so will the recipient. If the emoji on your screen match the recipient's, your connection is secure!
"The key verification UI we came up with in 2013 to protect against man-in-the-middle attacks served well for Telegram (and for other apps that adopted it), but for Calls, we needed something easier," Telegram said in a blog post published Thursday.
"That's why we've improved the key exchange mechanism. To make sure your call is 100% secure, you and your recipient just need to compare four emoji over the phone. No lengthy codes or complicated pictures!"
Voice Calls — Encrypted, Super-Fast and AI-Powered
What's more? Telegram assures its users that the audio quality of voice calls is kept as high as possible by using a peer-to-peer connection, the best audio codecs, and artificial intelligence.
Developers say the call audio quality is "superior to any of our competitors", thanks to an AI neural network.
So, each time you make a Voice Call, your Telegram app's AI neural network will optimize dozens of parameters based on technical information about your device and network, such as network speed, ping times and packet loss percentage, to adjust the quality of your call and improve future calls on the given device and network.
"These parameters can also be adjusted during a conversation if there's a change in your connection," the company states. "Telegram will adapt and provide excellent sound quality on stable WiFi — or use less data when you walk into a refrigerator with bad reception."
Note: AI doesn't have access to the contents of the conversation, so your calls are completely secure.
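As a greatly simplified illustration of that kind of network-aware tuning (Telegram's real system is a trained neural network over many more parameters; the thresholds and bitrates below are invented for the example):

    def pick_audio_bitrate_kbps(ping_ms, packet_loss_pct, on_wifi):
        # Invented thresholds: degrade the codec bitrate as conditions worsen.
        bitrate = 32 if on_wifi else 24
        if packet_loss_pct > 5 or ping_ms > 300:
            bitrate = 12
        elif packet_loss_pct > 1 or ping_ms > 150:
            bitrate = 16
        return bitrate

    print(pick_audio_bitrate_kbps(40, 0.2, True))    # good WiFi -> 32 kbps
    print(pick_audio_bitrate_kbps(400, 8.0, False))  # poor mobile link -> 12 kbps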
Telegram Offers Complete Control & Video Compression
Unlike WhatsApp and Facebook, Telegram lets you control "who can and who can't call you with granular precision."
If you don't want anyone bothering you, you can simply switch voice calls off altogether, blocking anyone and even everyone from calling you.
Telegram also offers users direct control over the quality of the videos they share over the platform. You can adjust the compression and see the quality of the video before sending it to your friends.
You can also set the video compression rate as the default setting for all your future video uploads.
Telegram version 3.18, which includes new features such as Voice Calling, is free to download for iPhone on the App Store and for Android phones on the Google Play Store.


Could Killing of FCC Privacy Rules Lead to End of Net Neutrality?

26.3.2017 securityweek Privacy
The Senate on Thursday voted 50-48 to overturn new FCC rules that would prevent ISPs from monetizing customers' information without their consent. The rules, passed during the Obama administration in October 2016, were due to come into force earlier this month, but were delayed by new Republican chairman Ajit Pai.

This delay provided time for Republican senators to propose a Joint Resolution to 'disapprove' the new FCC rules. S.J. Res. 34 was adopted along party lines. It 'disapproves' the FCC rule "Protecting the Privacy of Customers of Broadband and Other Telecommunications Services... and such rule shall have no force or effect."

It is expected that this will be confirmed by the House of Representatives; once enacted, the resolution would also prevent the FCC from issuing substantially similar rules in the future. However, many commentators also consider this to be the first step in dismantling the net neutrality rules imposed during the Obama administration.

The debate goes back to the Open Internet Order of 2010, and the subsequent reclassification of ISPs as common carriers in 2015. This was necessary to bring ISPs under the FCC's regulatory regime in order to enforce net neutrality -- but it also meant that the FCC was responsible for privacy enforcement.

The ensuing privacy rules were adopted on October 27, 2016, and were designed "to give broadband consumers increased choice, transparency, and security over their personal data so consumers are empowered to decide how data are used and shared by broadband providers."

In short, the FCC grabbed regulatory control of ISPs from the FTC in order to enforce net neutrality, but in doing so also became responsible for privacy. The effect was to place different internet giants (such as Comcast, Verizon and AT&T) under different regulations to others (such as Google and Facebook). The latter are allowed to monetize customer data, while the former are not.

The ISPs are not happy with this, and have been complaining and lobbying to get it reversed. "The unfortunate result of the FCC's extreme regulatory proposals," wrote Comcast in March 2016, "will be more consumer confusion and less competition -- and a bunch of collateral damage to innovation and investment along the way. This is most disappointing because it is entirely avoidable, since the Administration, the Federal Trade Commission, and others have examined this issue and marketplace for many years and have reached very different conclusions."

The Internet & Television Association trade group (NCTA) issued a new statement Thursday: "We appreciate today's Senate action to repeal unwarranted FCC rules that deny consumers consistent privacy protection online and violate competitive neutrality. The Senate's action represents a critical step towards reestablishing a balanced framework that is grounded in the long-standing and successful FTC privacy framework that applies equally to all parties operating online..."

The ISPs would like the marketplace to be unified under the regulatory control of the FTC -- or at least to have no more regulatory control than that placed on other internet service companies. But ISPs provide a completely different service, and control the internet choke points. The Electronic Frontier Foundation (EFF) points out 'Five Creepy Things Your ISP Could Do if Congress Repeals the FCC's Privacy Protections'. These include selling data to marketers, hijacking searches, inserting ads, pre-installing their own spyware on phones, and injecting 'undetectable, undeletable tracking cookies in all of your HTTP traffic'. In each case, EFF provides examples of ISPs who have already done this.

It is noticeable that new UK laws focus on using the ISPs to exert the government's new surveillance (Investigatory Powers Act) and censorship (the Digital Economy Bill) capabilities. The former ensures that the government will simply be able to take the internet data that US ISPs are likely to be able to sell, while the latter will enable the government to use the ISPs to block public access to websites it deems unsuitable (as it already does in a limited form with sites such as The Pirate Bay). Both laws would almost certainly be struck down by the European Courts as unconstitutional if the UK remained within the European Union.

What isn't yet certain is whether disapproving the FCC's privacy rule in the US is really the first step towards dismantling net neutrality. Chairman Pai can legitimately claim that he had no role in this (other than providing time for it to happen). It is the Senate rather than the FCC that has done so.

Net neutrality has been in force since the FCC's Open Internet Order and the subsequent reclassification of ISPs as common carriers. It has already stood the test of time, and would probably require congressional legislation rather than FCC action to reverse it.

Writing in the LegalMatch law blog, Jonathan Lurie comments, "If the rule was to be fully stripped away, it would most likely involve an act of Congress explicitly doing so. However, Congress and the Trump administration do not seem to be making such legislation a priority." Instead, he suggests that, "it has been implied that [chairman Pai's FCC] would likely see changes allowing ISPs to prioritize data in certain situations -- basically creating carveouts to the general rule. There's no particular indication as to what these carveouts might include, but it is easy to imagine a situation where exceptions could swallow the rule."

The irony in the current situation is that ISPs have argued that the FCC privacy rules distort the advertising market and hamper innovative new approaches, while supporters of net neutrality claim that it will enable innovative companies with new approaches to internet services.


Searching for Leaked Celebrity Photos? Don't Blindly Click that Fappening Link!
21.3.2017 thehackernews  Privacy
Are you curiously googling or searching torrents for nude photos or videos of Emma Watson, Amanda Seyfried, Rose McGowan, or any other celebrities leaked in The Fappening 2.0?
If yes, then beware: you should not click any link promising Fappening celebrity photos.
Cybercriminals often take advantage of news headlines in order to trap victims and trick them into following links that may lead to websites containing malware or survey scams.
Last week, a few private photos of Emma Watson and Amanda Seyfried — ranging from regular selfies to explicitly sexual photos — were circulating on the Internet forums, including Reddit and 4chan, with UK's TV presenter Holly Willoughby and US actor Rose McGowan among the latest alleged victims.
Now, according to security researchers from Malwarebytes, scammers are exploiting this new batch of leaked celebrity photos and videos, using the stolen selfies to lure victims on social media sites and make money.
One of the scam campaigns uncovered by Malwarebytes targets Twitter users, promising that the mentioned links lead to leaked, embarrassing private photos of British WWE star Paige – whose intimate photos and videos, along with those of other celebrities, were leaked online last week without her permission in an act dubbed "The Fappening 2.0."

The Fappening 2.0 is named after the similar leaks of 2014, when anonymous hackers flooded the Internet with private photographs of Jennifer Lawrence, Kim Kardashian, Kate Upton and hundreds of other celebrities by hacking their Apple iCloud accounts.
Don't Install Any App To View Leaked Fappening Images — It's Malware!
The latest scams spreading on Twitter read:
"VIDEO: WWE Superstar Paige Leaked Nude Pics and Videos"
"Incredible!!! Leaked Nude Pics and Videos of WWE Superstar Paige!!!!: [url] (Accept the App First)"
To access the content, scammers tell users to first install a Twitter app called "Viral News." In the hope of a glimpse of Paige's nude video, victims are tricked into giving the malicious app permission to access their Twitter account, update their profile and post tweets on their behalf.
Once the app is installed, you are then sent to a site that serves no purpose other than enabling crooks to make money from affiliate marketing and advertising link clicks.
The site quickly grays out, asking you to click yet another link that eventually lands you on a survey page that promises to reward you with an Amazon gift card as soon as you hand over your details.
"Suffice to say, filling this in hands your personal information to marketers – and there is no guarantee you will get any pictures at the end of it," said Chris Boyd, a malware intelligence analyst at Malwarebytes.
Malware Hijacks Twitter Accounts to Spread Fappening Spam

While you are looking through all these links, the creepy app spams out the same tweets from your account, leading your followers to the same Fappening 2.0 scam you fell for.
So far nearly 7,000 users have fallen victim to the latest scam.
"As freshly leaked pictures and video of celebrities continue to be dropped online, so too will scammers try to make capital out of image-hungry clickers. Apart from the fact that these images have been taken without permission so you really shouldn’t be hunting for them, anyone going digging on less than reputable sites is pretty much declaring open season on their computers," Boyd concluded.
Here are some useful tips you can follow in an effort to protect yourself from scams shared through social media:
Don't take the bait. Stay away from promotions of "exclusive," "shocking" or "sensational" photos or footage. If it sounds too outlandish to be true, it is probably a scam.
Hover over links to see their real destination. Before you click on any link, mouseover the link to see where it'll take you. Do not click on links leading to unfamiliar sites.
Don't trust your friends online. It might not actually be your friends who are liking or sharing scam links to photos. Their account may have been hijacked by scammers.
Raise your eyebrows when asked for something in return. Beware of any site that asks you to download and install software in order to view anything else, in this case, nude photos and videos of Paige. This is a known tactic for spreading scams.


WhatsApp may let you Recall Sent Messages and Track Friends Location in Realtime
2.2.2017 thehackernews Privacy
Are you the victim of sending awkward WhatsApp messages to your friends, family, and colleagues while you're drunk?
No need to panic now, as you'll soon be able to recall your drunk or mistakenly sent text messages on WhatsApp – a much-demanded feature.
Recall Unread Messages Sent Mistakenly
The most popular instant messaging service is reportedly testing the ability to edit or completely recall messages that have already been sent, allowing you to edit or delete a message from your friend's phone if it is yet to be read.
This new feature, first spotted by Twitter account @WABetaInfo, may be included in a new beta version of WhatsApp's next update before making it into a full consumer release.
If so, the update will add "Revoke" and "Edit" options for messages with gray tick marks that have not yet been viewed by the recipient. Blue ticks on WhatsApp indicate that the recipient has seen your sent messages.
If the sender clicks on the Revoke option, the message in the recipient's inbox will be replaced with "Sender revoked the message," as shown in the screenshots, telling the recipient that the sender is keeping something from them.
However, the Facebook-owned messaging service has not officially announced the edit and recall features, so it is unclear if or when they will make their way to the popular messaging app's stable release.
Track Your Friends Location In Realtime
Besides giving its users more control over their sent messages, WhatsApp is also testing a new feature called "Live Location" in group chats to make it a lot easier for users to track the location of the group members while coordinating a group meeting.
WhatsApp's Live Location Tracking feature will allow users to let other members of a group track their location in real time. The feature will be built on WhatsApp's existing send-your-location feature, and users can opt in to share their moving position for one, two or five minutes, or indefinitely.
Other upcoming features being tested in the beta version of WhatsApp include the ability to reply to status messages, as well as shaking your smartphone within a conversation to contact WhatsApp and report spam.


The Turkish Government has blocked the Tor access once again
19.12.2016 securityaffairs Privacy

The Turkish Government has applied restrictions to the Tor anonymity network; the discovery was made by the Turkey Blocks internet censorship watchdog.
“Our study indicates that service providers have successfully complied with a government order to ban VPN services.” reads a blog post published by the Turkey Blocks.

Users in the country started reporting connectivity issues around the same time, a circumstance that suggests the adoption of new measures to control access to the Internet.
Tor popularity in Turkey is increasing due to the censorship applied by the local authorities.

The Turkish Government has applied new, sophisticated "blocking measures" that prevent users from circumventing social media shutdowns in Turkey. The Government is blocking Tor and other VPN services as part of the central government's internet censorship.

The Government in Ankara recently ordered ISPs to block access to Tor and many other VPN services. Earlier in December, Turk Internet, which represents the ISPs in the country, reported heavy pressure from the Government to complete the ban; the Government has ordered ISPs to provide a weekly status update on the applied restrictions.

In recent years, the Turkish Government has blocked social media networks several times during national emergencies, political unrest and street demonstrations.

The following graph reports the number of directly connecting Tor users (source: Tor Metrics); it shows the tool's increased popularity in the country over the past year.


The experts at Turkey Blocks also noticed that Tor usage via bridges is being degraded due to the restrictions applied by the Turkish Government.

“Turkey Blocks finds that the Tor direct access mode is now restricted for most internet users throughout the country; Tor usage via bridges including obfs3 and obfs4 remains viable, although we see indications that obfs3 is being downgraded by some service providers with scope for similar on restrictions obfs4. The restrictions are being implemented in tandem with apparent degradation of commercial VPN service traffic.” continues the analysis published by the Turkey Blocks.

The study conducted by the organization corroborates user reports that Tor access with the default configuration is now widely restricted. At the time of writing, the ban was not total, and corporate or custom VPN solutions were not yet covered.
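For reference, switching Tor from direct access to obfs4 bridges is a matter of a few torrc options. The sketch below writes a sample configuration; the bridge address, fingerprint and certificate are placeholders, since real bridge lines are distributed via bridges.torproject.org:

    # The bridge address, fingerprint and cert below are placeholders only.
    torrc_bridge_config = (
        "UseBridges 1\n"
        "ClientTransportPlugin obfs4 exec /usr/bin/obfs4proxy\n"
        "Bridge obfs4 192.0.2.10:443 0123456789ABCDEF0123456789ABCDEF01234567 "
        "cert=EXAMPLECERTDATA iat-mode=0\n"
    )

    # Append these lines to the local torrc (path varies by platform) and restart Tor.
    with open("torrc.sample", "w") as f:
        f.write(torrc_bridge_config)
    print(torrc_bridge_config)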


New Privacy Rules Require ISPs to Ask You Before Sharing Your Sensitive Data
28.10.2016 thehackernews Privacy
Good news for privacy-concerned people! Now your online data will not be marketed for business without your say – at least not by your Internet Service Provider (ISP).
Yes, it's time for your ISP to ask your permission before sharing your sensitive data for marketing or advertising purposes, the FCC has ruled.
On Thursday, the United States Federal Communications Commission (FCC) imposed new privacy rules on Internet Service Providers (ISPs) that restrict them from sharing your online history with third parties without your consent.
In a 3-2 vote, the FCC approved the new rules, which pleased many privacy advocates, though some wanted the Commission to apply the same rules to web-based services like Google and Facebook as well.
Initially proposed earlier this year, the new rule says: "ISPs are required to obtain affirmative 'opt-in' consent from consumers to use and share sensitive information."
What does 'sensitive' information mean here? The rule lists the following:
Your precise geo-location
Your children's information
Information about your health
Your financial data
Social Security Numbers (SSNs)
Your Web browsing history
App usage history
The content of your communication
Note: Your broadband provider can use and share this information if you give them explicit permission. So, you need to watch out for those invites and gently worded dialog boxes.
Non-sensitive information includes things like your email address, service tier, IP address, bandwidth used and other information along those lines, but you can still officially opt out of its use.
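As a toy illustration of that opt-in versus opt-out split (the category names below come from the FCC list above; the function and flag names are hypothetical):

    SENSITIVE = {
        "geo_location", "children_info", "health", "financial", "ssn",
        "browsing_history", "app_usage", "communication_content",
    }

    def may_share(category, opted_in, opted_out):
        if category in SENSITIVE:
            return opted_in       # sensitive data needs affirmative opt-in consent
        return not opted_out      # non-sensitive data is shareable unless the customer opts out

    print(may_share("browsing_history", opted_in=False, opted_out=False))  # False
    print(may_share("email_address", opted_in=False, opted_out=False))     # True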
The new rule also requires Internet providers to tell customers with "clear, conspicuous and persistent notice" about the information they are collecting on them and how/when they share it, and the "types of entities" they share it with.
ISPs will also need to notify their customers in the event of a data breach.
The FCC aims to provide consumers an increased choice, transparency, and security online over their personal information. Here's what the Commission writes:
"ISPs serve as a consumer's "on-ramp" to the Internet. Providers have the ability to see a tremendous amount of their customers' personal information that passes over that Internet connection, including their browsing habits. Consumers deserve the right to decide how that information is used and shared — and to protect their privacy and their children's privacy online."
Meanwhile, the advertisers are, of course, not at all happy with the FCC's move. The Association of National Advertisers called the new rules "unprecedented, misguided and extremely harmful," saying the move is bad for consumers as well as the U.S. economy.
However, ISPs have a year to comply with the new rules, so they won't take full effect for at least a year.


Using VPN in the UAE? You'll Be Fined Up To $545,000 If You Get Caught!
1.8.2016 Thehackernews.com Privacy

If you get caught using a VPN (Virtual Private Network) in Abu Dhabi, Dubai or anywhere else in the United Arab Emirates (UAE), you could face temporary imprisonment and fines of up to $545,000 (~Dhs2 Million).
Yes, you heard that right.
Online privacy is one of the biggest challenges in today's interconnected world. Governments across the world have been found using the Internet to track people's information and conduct mass surveillance.
This is where VPNs and proxy servers come into play.
VPNs and proxy servers are being used by many digital activists and protesters, who are living under the most oppressive regimes, to protect their online activity from prying eyes.
However, using VPN or proxy in the UAE could land you into great difficulty.
The UAE President Sheikh Khalifa bin Zayed Al Nahyan has issued new laws for combating cyber crimes, which include a regulation that prohibits anyone in the UAE, even travelers, from using VPNs to secure their web traffic from prying eyes.
Also Read: Best VPN Services for Fast, Anonymous and Secure Browsing
According to the laws, anyone using a VPN or proxy server can be imprisoned and fined between $136,000 and $545,000 (Dhs500,000 and Dhs2 Million).
The laws have already been issued by the UAE President and have now been reported by the official government news service, WAM.
For those unfamiliar, a Virtual Private Network (VPN) securely routes your Internet traffic through a remote server, protecting your browsing, hiding your location data and letting you access restricted resources.
Nowadays, VPNs have become a valuable tool not just for large companies, but also for individuals to dodge content restrictions as well as to counter growing threat of cyber attacks.
The UAE's top two telecom companies, Etisalat and Du, have banned VoIP -- the phone calling features in popular apps like WhatsApp, Viber, Facebook Messenger and SnapChat that deliver voice calls over the Internet for free -- from within the Gulf nation.
Also Read: Opera Browser Now Offers Free and Unlimited Built-in VPN Service
However, the vast number of UAE residents who have used VPNs and proxies within the UAE for years to bypass the VoIP ban could soon be in difficulty.
Out of two new laws issued last week, one lays out fines for anyone who uses a VPN or proxy server, local news reports. The new law regarding VPNs states:
"Whoever uses a fraudulent computer network protocol address (IP address) by using a false address or a third-party address by any other means for the purpose of committing a crime or preventing its discovery, shall be punished by temporary imprisonment and a fine of no less than Dhs500,000 and not exceeding Dhs2 million, or either of these two penalties."


Employee Monitoring, a controversial topic
29.4.2016 Privacy

Employee monitoring is a complex and controversial topic that can often become the source of discontent between employers and their staff.
It is not a secret that most employees have a negative opinion about modern monitoring practices, such as PC monitoring. It is often viewed as an invasion of privacy and an example of an employer overstepping their authority.

From an employer's perspective, employee monitoring is a very useful tool, allowing them to solve a number of issues and challenges, and to raise the general health and effectiveness of an organization. It is something that employers have used for a very long time. A couple of decades ago they tapped corporate phones, checked mail and conducted video surveillance. Nowadays they monitor employee PCs, social networks, and e-mails. Most of the time such monitoring is not conducted out of maliciousness toward employees, but rather to serve a specific business-related purpose.

The question then is: can a compromise on employee monitoring be found? How do you monitor employee internet usage while keeping employees on your side and ensuring their cooperation? Practice shows that it is possible, and in this article we will give you some tips on how to monitor employee PC use ethically and without overstepping your boundaries.


Employee monitoring is necessary and here is why
First, it is important to understand why it can be very useful and often even necessary to monitor what employees are doing. Such reasons often differ from one organization to another. However, they can all be generalized into three main categories:

Compliance. Many norms and regulations regarding data security and the handling of personal data require some form of access management and activity monitoring to make sure that said data is not misused by company employees. Employee monitoring for compliance purposes is used, for example, in financial and healthcare institutions.
Security. Insider threats are a very real security issue that can result in a very damaging and costly attack if neglected. Employee monitoring is the best way to prevent and detect such threats.
Performance evaluation and improvement. Monitoring can be used to gauge employee performance and see if they spend their time productively. It is especially useful for subcontractors and employees paid by the hour.
Employee monitoring can solve these crucial issues to the benefit of an organization. However, your employees most likely will not be happy with your decision to monitor their PCs.

Why employees may seem unhappy
Most employees view monitoring negatively and usually meet it with hostility. It is rarely considered a useful business or security tool, but rather an oppressive practice imposed by an overly zealous boss.

This negative opinion is usually based on a number of legitimate concerns that can be summarized as follows:

Privacy concerns. Private matters inevitably come up during work hours. It does not necessarily mean that your employees are slacking off. Yet they can often see intrusive monitoring as an invasion of their privacy, especially when an employer monitors their emails or social network activity.
Concerns regarding trust. Monitoring can cause employees to think that their employer does not trust them. This perception can undermine the relationship between employee and employer.
Increased stress. Constant monitoring, especially for performance evaluation purposes, creates strong pressure to perform at peak productivity at all times, leading to high levels of stress. This can negatively affect an employee's morale and motivation.
Invasive monitoring often leads to lower overall job satisfaction. As a result, such monitoring can produce the opposite effect to the one intended: instead of improving employee productivity, it reduces it. However, there are ways and best practices to change the situation and conduct employee monitoring that satisfies all involved parties.

Ethical monitoring – key to remedy the problem
One of the best tips for employee computer monitoring is to approach the issue ethically and fairly, with respect for the privacy of your employees in mind. First, you need to make sure that employee monitoring is prompted by a serious business need that can be clearly formulated and easily communicated to your employees. You should not monitor your employees beyond your direct business needs, and should not collect data that is not necessary for business purposes.

It is important to create a clear formal monitoring policy based on your needs and stick to it. Make sure that your employees are familiar with it and understand it. You need to clearly communicate what employee actions are being monitored and in what way, and how this information will be used to help your organization.

One of the best practices for monitoring employees' computer usage is to notify them when they are being monitored. While it is not required by federal law, it will show your concern for the privacy of your employees and will help to build a relationship of trust between you.

Use appropriate software
Another important point that can help you make employee monitoring more effective is the right software selection. It is very important to use the right tools for the right job. You need to clearly define what type of information you want to collect and why, and choose the tool that will allow you to do just that.

There are many simple solutions for recording certain types of user activity, such as keystroke recorders, network monitors and employee tracking software. These solutions are easily available and inexpensive, often even free. They can be used to track employee productivity, monitor internet and social network usage, and serve as a basic security precaution.
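At the very simple end of that spectrum, an activity logger can be a few lines of script. The sketch below records which applications are running once a minute; it assumes the third-party psutil package is installed and is offered only as an illustration, to be deployed (if at all) under a clear, communicated policy:

    import time
    from datetime import datetime

    import psutil  # third-party package: pip install psutil

    def snapshot_processes(logfile="activity.log", interval_s=60):
        # Once per interval, append a timestamped list of running application names.
        while True:
            names = sorted({p.info["name"] for p in psutil.process_iter(attrs=["name"])
                            if p.info["name"]})
            with open(logfile, "a") as f:
                f.write(datetime.now().isoformat() + " " + ", ".join(names) + "\n")
            time.sleep(interval_s)

    snapshot_processes()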

However, if your goal is to organize centralized monitoring across a number of endpoints, monitor compliance, or reliably detect and prevent insider threats, then you need to employ more sophisticated professional software. Such a solution needs to be heavily protected, configurable, and capable of collecting a large amount of important data regarding network and application usage.

Agent-based user monitoring software is able to create a video recording of everything an employee sees on their screen, coupled with large amounts of relevant metadata. Such solutions allow you to comply with regulations, thoroughly protect your organization from insider threats, and provide all the necessary data for employee performance evaluation. They can be configured to record either the whole user session or only specific data, allowing you to collect only the information that is needed.

The downside is that most of these business solutions are very expensive and can be cost-prohibitive for small companies, although there are some offers with flexible licensing. Therefore, it is important to carefully evaluate your needs and consider all available options when deciding what employee monitoring software to use.


Facebook uses Artificial Intelligence to Describe Photos to Blind Users
6.4.2016 Privacy

Today the Internet has become dominated by images, and it’s the major feature that got Facebook to a Billion daily users.
We cannot imagine Facebook without photos, but for millions of blind and visually impaired people, Facebook without photos has been the reality since its launch.
But not now! Facebook has launched a system, dubbed Automatic Alternative Text, which describes the contents of pictures by telling blind and visually-impaired users what appears in them.
Blind and visually-impaired people use sophisticated navigation software known as screen readers to make their computers usable. The software turns the contents of the screen into speech, but it can't "read" pictures.
However, Facebook's Automatic Alternative Text, or AAT, uses object recognition technology powered by artificial intelligence to decode and describe photos uploaded to the social network, and then provides the descriptions in a form that can be read aloud by a screen reader.
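Conceptually, the pipeline ends with an alt attribute that the screen reader can speak. The sketch below is not Facebook's implementation; it simply shows how a list of recognized concepts with confidence scores might be turned into alt text (the labels and wording are invented for the example):

    import html

    def compose_alt_text(labels, confidence_threshold=0.8):
        # Keep only confidently detected concepts and join them into a sentence.
        confident = [name for name, score in labels if score >= confidence_threshold]
        if not confident:
            return "Image"
        return "Image may contain: " + ", ".join(confident)

    detected = [("outdoor", 0.97), ("tree", 0.91), ("two people", 0.88), ("pizza", 0.42)]
    alt = compose_alt_text(detected)
    print('<img src="photo.jpg" alt="' + html.escape(alt) + '">')
    # -> <img src="photo.jpg" alt="Image may contain: outdoor, tree, two people">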
Video Demonstration

The AAT tool, built by Facebook's five-year-old accessibility team, has already made its way to iOS devices and will soon be available for Android and the Web as well.
Facebook says that the more images its AAT tool scans, the more sophisticated the software will become. While still in its early stages, the AAT technology can reliably identify objects and activities in categories including:
Appearance - baby, eyeglasses, smiling, beard, jewellery, shoes and selfie
Environment - outdoor, sky, grass, tree, mountain, snow, ocean, beach, water, wave, sun
Food - pizza, ice cream, dessert, sushi, coffee
Transport - aeroplane, train, bus, boat, car, motorcycle, bicycle, road
Sports - tennis, basketball, baseball, golf, swimming, stadium
The move by the social network giant is a big step forward for blind and visually impaired users, although it currently only works in English.
To see the AAT technology in action for yourself on iOS, go to Settings → General → Accessibility and activate VoiceOver, the built-in screen reader.
The company will soon bring the new functionality to other mobile platforms as well as languages. You can see the video demonstration to know how AAT tool works for someone using a screen reader.


German intelligence Agency BND spied on Netanyahu
4.4.2016 Privacy

The German intelligence agency BND has intercepted communications of the Office of the Israeli Prime Minister Benjamin Netanyahu, among others.
According to the German weekly Der Spiegel, the German intelligence agency BND (Bundesnachrichtendienst) has reportedly been spying on Israel for years. Prime Minister Benjamin Netanyahu's Office is one of the main targets of the espionage activity; the German intelligence service also targeted the British Ministry of Defence, the Organization of the Petroleum Exporting Countries (OPEC), the International Monetary Fund (IMF), and the interior ministers of Austria and Belgium.

“The Federal Intelligence Service has intercepted friendly countries and international organizations. SPIEGEL's information confirms that the Office of the Israeli Prime Minister and the US State Department were among its targets,” reported Der Spiegel.

The BND gathered emails, phone calls and faxes from embassies and consulates belonging to the US, UK, France, Sweden, Spain and other countries.

The Prime Minister’s Office has declined to comment on the news.


The news follows revelations made by Der Spiegel in November 2015, when the magazine reported that the German intelligence agency BND had “systematically spied” on its allies and several international organizations.

According to Der Spiegel, the BND has also been spying on several US Government organizations, including NASA, the US State Department, the US Air Force, and American diplomats across Europe.

In November 2015, RBB Radio and Spiegel Online claimed that the BND had conducted cyber espionage, on its own initiative, against several embassies and administrations of “European states and allies”.

“the BND had systematically spied on ‘allies’ across the world, including on the interior ministries of the United States, Poland, Austria, Denmark and Croatia.” states the Spiegel.
According to the Der Spiegel, the German Secret Service spied on the US delegation at the European Union in Brussels and the UN in New York, the US Treasury, and several embassies in Germany, including those of the US, France, Britain, Sweden, Portugal, Greece, Spain, Italy, Switzerland, Austria and the Vatican.

The German intelligence appears very active, the German spies also spied on the Geneva-based International Committee of the Red Cross and Oxfam.

Following the above events, in May the BND stopped sharing surveillance information with the NSA. The data had been collected at a surveillance station located in Bad Aibling in Bavaria, the same center used by German intelligence to monitor events in the Middle East.