Why Rio 2016 is both a problem and an opportunity for your data project

The data visualisations that we all see online and on social media are making us more open to using this kind of information at work.

Our daily lives are now drenched in data, delivered to us from our televisions, our computers, and the smartphones in our hands. Charts and tables are commonplace, but the online world and social media have also made infographics a powerful tool for the presentation of this information, especially when coupled with images, animations, video clips and written commentary. This data can come from a huge number and variety of sources, brought in from places all around the world.

A perfect example of this is what we have seen in the coverage of the Rio Olympics. Online news providers have taken great leaps of the imagination in how data can be delivered to us, harnessing the tools at their disposal. The Guardian is a prime example, with their coverage of Great Britain’s cycling success over Australia that saw Sir Bradley Wiggins win his fifth Olympic gold medal (1). The newspaper created an animated version of the race that showed what happened every second of the way, with the reader being able to click through each stage at their own pace.

Not to be left trailing in second place, The New York Times has drawn up an extensive set of data visualisations that shows exactly how well each country has done at each Olympics since the games began (2). The graphics are a brilliant mixture of aesthetics and information, delivering a huge amount of complicated data at a glance.

These high-tech ways of accessing data are becoming everyday experiences for many people, but how does this affect businesses beyond the mass media outlets, and should companies strive to access and make use of these new tools?

There is a danger that if some data analytics projects are at a fairly embryonic stage they could seem outdated by the time they’re implemented. After all, with online trends changing day-by-day, what seemed like a great idea just a few months ago could be old-fashioned by now.

This poses problems for businesses, which are pushed to keep up with the latest innovations but don’t want to shake up their operations. However, there is a good chance that your staff are using better, newer technology at home than they have access to at work.

There are echoes of the BYOD (Bring Your Own Device) phenomenon, where the smartphones that people were buying with their own money were far more advanced than the ones they were being given by their employers. BYOD was a clever way of working around this without companies having to regularly shell out for new phones.

Now your staff will be using the data analytics power of social media such as Twitter and LinkedIn in their personal lives, along with gathering data on themselves with mobile apps such as Runkeeper. It is commonplace to have data at our fingertips, and people will be happy to use equivalent tools at work.

It would be very easy for anyone in the position of running a company that is making use of data visualisations to look at the sort of tools that are being used elsewhere and become despondent at what they have at their disposal. Adopting new technologies can be expensive, and many employers could worry that the changes this can bring to a workplace could have a negative effect.

But what needs to be remembered is that any new data technology that is brought in will probably not be unfamiliar to your colleagues, and may be something they are already using in their everyday lives. With so much technology, and so much data, now available to each and every one of us, that new piece of software that you’re apprehensive about buying may not have the disruptive effect on your team’s way of working that you think, and could give a massive boost to your profit margins.

What’s also very important to remember is that data visualisations are only the endgame of a very long process, one that begins with gathering good quality data itself. While new tools designed to present this data are emerging all the time, the basic foundation that they build upon is information. And if you’re looking at new ways of visualising data then you probably have a good bedrock of this information at your disposal already.

There’s an opportunity here, a massive one, that could see your company pushing itself to the forefront of the way in which data is presented and getting everyone involved in its use, not just data scientists and your IT team.

What’s needed to make the most of this is the realisation that the apps, websites and social media that your staff are using on a daily basis are indicative of a wider acceptance among them of how data now works in our world, and how it touches every aspect of our lives. People are now comfortable with digesting huge amounts of information, and even expect it to be delivered to them. If they do this in their own time, they’ll have no trouble doing it at work.



Image provided courtesy of Ian Burt


Are You on the Data Offensive or Defence?

Understanding the different types of data positions – data offensive or data defence.


Companies are either on the data offensive or the data defence – and organisations need to move to the offensive if they are to actively take hold of their data and make tangible use of it.

There is a huge amount of data that any company will gather over time. This can be deliberate – something you have set out to obtain – or it can simply accumulate as a result of the IT systems that we all use.

There are two ways a business can approach this data, and it’s a choice between a defensive and an offensive position. One could hold you back, but the other is much more positive, allowing you to push your company into new areas and target your approach so that you achieve exactly what you need to.


The defensive approach

Data defence is the traditional approach to managing the information that your company holds. It’s all the regular things that have to be done with large amounts of data, such as maintaining security to make sure none of it leaks or is compromised. It’s the governance of data, the everyday handling of it and the processes around it.

This also includes ensuring privacy and making sure that the quality of the data is up to scratch. These are certainly things that have to be done with data that is gained in a commercial context, and many of them are done to make sure that your business falls in line with whichever set of regulations you have to adhere to.  It’s a case of preventing data from becoming a problem – rather than seeing it as a valuable asset.

This is the approach that many companies take towards data, and the one that can seem to be sensible and correct. That is, until you look further than the data defence attitude and closer at what could otherwise be done. There are opportunities to take the data that you have and use it to push your business on to the next step.


Go on the offensive

Being on the data offensive is about taking the wealth of information that you have at your disposal and exploring the possibilities of what it can do for your business in a proactive sense. Whereas data defence is about making sure that everything is in order, data offensive sees you pushing the boundaries and creating new opportunities.

The data that you have at your disposal can open doors for your business that were closed before. This information can support marketing and help to target outbound campaigns, making sure you are reaching the right people in the right way. In turn, this helps to build new revenue, all of which can lead to further data being gathered as time goes on.

Data management can be at the forefront of your company’s strategy rather than being something that simply has to be done. In the modern, digital world the companies that are using data well are those that are harnessing its power and using it to change their behaviour and the way they work. Data is driving their behaviour and they are allowing it to take the lead rather than letting their existing behaviour govern the way data is collected and protected.


A light in the dark

There is another kind of data out there that might not seem so full of opportunity until it is put under the microscope and given a closer look.

Dark data, as it is known, is the information that tends to be ignored by businesses and just builds up in the background over time. This could be server logs, data about old employees, and outdated login information, for example. In his book Dark Data: A Business Definition, Isaac Sacolick describes it as “data that is kept ‘just in case’ but hasn’t (so far) found a proper usage.” (1)

Much of this data will be seen as having little or no value to your firm, and simply something that is given the minimum amount of attention to make sure it is secure and stored correctly. But harnessing this data can be a big step in the process of moving towards data offensive and taking your company forward.

Any business that finds itself in possession of a significant amount of dark data needs to look at how to harness the opportunities it can create, and how to capitalise on that information and turn it into something proactive rather than letting it drain the business’s resources.

While dark data can be turned to good use and create opportunities, the failure to do this could pose a risk to your company. Instead of letting it become a burden on your business, why not turn dark data into something positive?

Most companies are currently stuck in the data defence approach, but there are new solutions to this problem that can put you on the offensive. Dark data could be the key to where you go next, helping you to explore new avenues that you hadn’t thought of before. This approach will become even more effective as data analytics tools become standardised and the ability to pull information from the unlikeliest of sources increases through technology such as IoT sensors.

There is a wealth of information that any company builds up over time, and the choices are either to let it become a drain on what you do or harness the power that it can give you and allow it to take you forward.





Image provided courtesy of KamiPhuc


4.5% of data breaches in 2014/15 related to the legal sector

The Information Commissioner’s Office is alerting legal professionals to the risk of data breaches in the sector, especially concerning paper files that can be easily lost, stolen or copied – with no form of encryption.

The ICO (Information Commissioner’s Office) is the body responsible for the enforcement of the Data Protection Act 1998.  It has the power to serve a monetary penalty of up to half a million pounds for serious breaches of the Act, so it is not an organisation to ignore.

This is especially true if you work in the legal sector, where sensitive information is handled, posted, transferred, copied and destroyed on a daily basis.

In a three-month period in 2014 alone, 15 data security incidents linked to the legal sector were reported to the ICO.  This may not sound like many, but due to the highly confidential data handled by the legal profession, such incidents are often more likely to generate a fine.

In the ICO’s 2015 annual report, the ICO also highlighted that they were receiving “a significant amount of complaints about the legal profession and (were) working closely with the Bar Standards Board and other regulatory and representative groups within the legal profession to improve information rights compliance”, suggesting that the sector will come under increasing scrutiny for how it handles data – especially as more and more legal work is conducted online and transferred by email or web portals.

Across all industry sectors, the health sector still stands out as having the most reported data security incidents.  Between October and December 2015, 204 incidents in the health sector were reported to the ICO, compared to just 19 in the legal sector.

In the same three-month period of 2015, the Crown Prosecution Service received a penalty of £200,000 from the ICO (one of only two monetary penalties issued in that quarter, the other being a £250 fine for the Bloomsbury Patient Network).  Some of the increase in incidents is thought to be related to the prevalence of cyber-attacks, and the ICO is now creating specialist teams to focus on cyber security as it becomes an increasing cause of data loss and misuse.

In 35% of all cases reported to the ICO in 2015, no action was required; in 22% of cases, however, data controller action was needed.  In other cases, further information was requested or an investigation was carried out.

Across the entire year, the ICO received reports of 14,268 incidents across a range of categories, such as inaccurate data, fair processing, retention of data and excessive data.  It’s important to remember that the ICO also handles spam call incidents, so not all of the 14,268 reports are data breaches.

Data breaches can also result in prosecutions.  In 2015, some notable cases prosecuted by the ICO include a company director who was fined for accessing Everything Everywhere’s customer databases to sell his own telecoms services, and a pharmacist who unlawfully accessed the medical records of his family and work colleagues.

In 2014, a Freedom of Information request revealed that 173 legal firms were investigated by the ICO during the year (29% of investigations relating to security and 26% to the disclosure of data).

According to an infographic from the ICO, the majority of 2014/15 breaches within the legal sector were attributed to the loss or theft of paperwork.  Some commentators have suggested this is due to the large folders and quantities of files that lawyers regularly carry from court to office to home – increasing the opportunity for information to be lost or stolen.

  • The second most common type of breach in the sector was data posted or faxed to an incorrect recipient.
  • The fifth most common breach was data sent by email to the incorrect recipient. Over time, as more firms move their communications to a digital format, I would expect these two breach types to swap places, with more email incidents occurring and less information being faxed.
  • The least common breach was hacking into insecure webpages – however, with a higher number of firms choosing to interact with clients via portals (for uploading files, receiving updates or reporting), this type of breach is expected to increase.

The types of data most vulnerable to breaches in the legal sector were basic personal identifiers, followed by clinical data and then criminal records.  The sensitivity of the information that legal professionals deal in means that the loss of even a few items of data can significantly affect the individual concerned.  An extreme example is how some individuals resorted to suicide following the Ashley Madison ‘affair website’ hacking incident, demonstrating the extent to which data can affect people’s lives if it is used in an unintended way.

A specific example of how the legal sector has been on the receiving end of ICO penalties can be seen in the case of Stoke City Council, which received a penalty of £120,000 after a solicitor working on behalf of the council mistakenly sent 11 emails about a child protection case to the wrong email address, some of which contained highly confidential information.  This was the second time the council had received a fine from the ICO, the first following the loss of a USB stick containing sensitive information relating to childcare cases.

Evidently, data is important to the legal sector, but data security is critical for the safeguarding of clients, lawyers and law firms.  The legal sector is particularly interesting as many of their data breaches relate to the high use of paper files that can’t be encrypted, unlike other industries that often fall foul of data protection laws due to cyber-attacks, hacking incidents or lax IT policies around data management.


For further reading and our references on the above statistics, please see the below links:

ICO: Data Security Trends

ICO: Legal Sector Data Breaches

ICO: Data Breaches within the Legal Profession

ICO: 2014/15 Annual Report

Computer Weekly: Stoke City Council Data Fine


Image courtesy of Beth Cortez-Neavel

How Do We Keep You Safe?

The BBC has reported in the past few days, in its Technology of Business section, that cybersecurity will be the main issue concerning global businesses this year, and that the Internet of Things will only increase this growing threat to security.

It’s obvious, really: as more devices and systems become connected, the threat of security attacks will inevitably increase as hackers find more inventive ways to get at data.  The BBC suggests the cause is the ‘development of the hyper-connected world’.  As more devices are created and more ‘things’ become connected, we interact more digitally, and the opportunity to hack or attack increases.  More data is digitised, and more actions are taken on a digital platform rather than in person, by phone or on paper.


Internet of Things

The Internet of Things has seen its share of attacks over the past year.  A report from Trend Micro highlighted that 2015 saw baby monitors, smart TVs and connected cars become the focus of cyber attacks.

Security isn’t just about averting attacks as they occur; it’s about proactively attaining high standards so you can stave off attacks before they cause issues for your technology environment and, consequently, your business.  The EU is also introducing new data protection laws which will come into effect in 2018, further increasing the responsibility on business owners for how they manage their data and systems in the future.

This is partly due to the changing nature of attacks: industry experts forecast that 2016 will see a dramatic increase in ransomware attacks, where hackers break into systems, encrypt the data and then demand ransoms to decrypt the information.


How we keep our customers safe

At C24, we take security very seriously.  It has changed how we run our business, how we build our systems and the choices we make about infrastructure and software solutions.

For instance, we elected not to build our own datacentres and instead rent space from Six Degrees Group (who recently joined with C24), who have state-of-the-art datacentres designed for mission-critical systems.  Alone, we could have built a datacentre that was leading edge today.  However, would it still be leading edge in a year’s time, and then in three years’ time, and so on?  Partnering with a specialist datacentre provider was a way for us to ensure we were housing our hosting infrastructure within the most current, enterprise-grade datacentres possible.

We split security considerations into three levels: datacentre, network and data.


Datacentre security

We house our hosting infrastructure within the Six Degrees Group datacentre facility in the Midlands.  We initially chose 6DG due to the high level of security around the exterior of the datacentre facility, as hosters very often think about security within their IT systems but don’t extend their thinking to the physical datacentre.  Putting your systems within your own office leaves you open to potential attacks from disgruntled staff onsite, or the potential for a vehicle to ‘ram’ the building and break into the facility.  This may sound far-fetched, but many resellers have fallen prey to attacks in which warehouses have been broken into using vehicles to gain entry.

Our datacentres have anti-ram bollards to prevent unauthorised vehicles entering the site, and there is 24/7 CCTV monitoring all around the site to control access.  Perimeter fencing and guards ensure that only people permitted to enter the wider site (not just the building) do so, while all visitors have to prebook access requests and bring along government-issued identification, otherwise they will not be permitted inside the datacentre or surrounding offices.  Unfortunately, we sometimes have to turn away customers who have come to visit the site but haven’t brought the necessary ID with them.


Network security

Within our datacentre ‘pod’, we ensure the security extends across all of the network layers.  We have intrusion detection and prevention software in place to continually monitor the network for unusual activity.  We perform routine tests to interrogate the network in order to check for potential holes where attackers could gain entry; this ensures we proactively manage network security before issues occur.  Our technical team also monitor the network around the clock to ensure systems are operating as they should and no unusual activity is being reported.

When setting up a client’s hosted infrastructure, we split our network into VLANs so that each client has their own private network that cannot be accessed by other customers.  This separation adds increased security to our hosted infrastructure delivery.  We also use VRF (virtual routing and forwarding) technology to segment network paths.

To reduce the possibility of malicious attacks on clients’ websites, we employ Webscreen technology to guard against flood and application-layer distributed denial of service (DDoS) attacks.  Hackers sometimes use bots to ‘flood’ websites to bring them down or slow the service, making it unusable for real users.  Our DDoS security technology averts these attacks by having the system ‘learn’ which IP addresses to trust and which to drop in the event of an attack.
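The ‘learn which IP addresses to trust’ behaviour described above can be sketched in a few lines of Python. This is an illustrative toy only, not a description of how Webscreen’s appliance actually works – the class name, the fixed rate limit and the trusted-set logic are all assumptions made for the example.

```python
# Toy sketch of a "learn then filter" DDoS mitigation idea: during normal
# operation the filter records which source IPs it has seen; under attack,
# requests from unknown or flooding sources are dropped.
from collections import Counter

class IpTrustFilter:
    def __init__(self, rate_limit=100):
        self.trusted = set()        # IPs observed during normal traffic
        self.counts = Counter()     # per-IP request counts in current window
        self.rate_limit = rate_limit

    def learn(self, ip):
        """Record an IP seen while no attack is in progress."""
        self.trusted.add(ip)

    def allow(self, ip, under_attack=False):
        """Decide whether to pass a request through."""
        self.counts[ip] += 1
        if self.counts[ip] > self.rate_limit:
            return False            # flooding: drop regardless of trust
        if under_attack and ip not in self.trusted:
            return False            # unknown source during an attack: drop
        return True

f = IpTrustFilter(rate_limit=3)
f.learn("10.0.0.5")
print(f.allow("10.0.0.5", under_attack=True))    # True: trusted source
print(f.allow("203.0.113.9", under_attack=True)) # False: unknown during attack
```

In practice, real mitigation layers combine far more signals – geography, protocol behaviour, request patterns – than a simple seen-before set.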


Data security

Whilst we deliver hosting to the infrastructure level, we do not directly handle clients’ data.  However, we do have a number of internal processes to ensure that, as a company, we adhere to data protection guidelines and keep our customer data safe.

We have data protection policies in place that govern how we consume, process, collect and store customer data.  Our use of Citrix desktop technology across the firm means we are able to take data away from the individual PC or laptop in the event of the device being stolen or lost, and keep information at the datacentre level where it can be managed and monitored centrally.


You can’t always be prepared for every type of attack, but when you speak with a hoster or Software-as-a-Service provider, it’s important to ensure they have covered off the issue of security across a range of areas – not just within the systems themselves, but also the physical infrastructure that prevents the outside world getting to your data.

Security often only becomes important after you’ve been through an attack and resolve never to let it happen again.  But with the Internet of Things increasing the array of devices that can be accessed by hackers, from within your home to the datacentre, IT managers will need to look at how this new trend could affect their day-to-day operations, and how to control the devices and appliances across the workplace that IT may not have visibility of.


Image provided courtesy of Holly Victoria Norval.


During a recent visit to Brazil, I encountered many customers and partners who faced a similar challenge: providing their clients with a safe, secure and genuinely easy way to share files and collaborate with data.  All faced a number of barriers, and none were happy with the current offerings of cloud-based file sharing solutions.  Generally speaking:

  • All required a secure way to share files with internal and external people – partners, vendors and employees
  • All tried to block access to file sharing sites and no one thought they were successful in doing so
  • All were concerned about the additional resource requirements to manage and control cloud file shares
  • Many wanted the same user experience and processes for internal and external collaboration
  • Not one had a plan to fulfill these requirements
  • All were required by the business areas to provide a solution in the near term

The following 5 criteria summarize their requirements, which are not currently fulfilled by cloud-based file sharing solutions:

1. Ongoing guarantee of rightful access

Customers clearly state that the security of cloud-based file sharing solutions is a primary concern.  They require a comprehensive audit trail of all usage activity, the ability to ensure permissions are granted and revoked at the appropriate times by the appropriate people, and the ability to develop different profiles for different data and people based on data sensitivity, customer location and role.
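As a rough sketch of the audit-trail requirement, the snippet below records every grant and revoke in a hash-chained, append-only log, so tampering with an earlier entry is detectable on verification. The field names and API are hypothetical, not taken from any particular product; a real solution would add signing, durable storage and integration with access control.

```python
# Minimal hash-chained audit log: each entry embeds the hash of the previous
# entry, so an assessor can verify that no record was altered or removed.
import hashlib
import json
import time

class AuditLog:
    def __init__(self):
        self.entries = []

    def record(self, actor, action, target, ts=None):
        """Append one event (e.g. a permission grant or revoke)."""
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"actor": actor, "action": action, "target": target,
                 "ts": ts if ts is not None else time.time(), "prev": prev}
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)

    def verify(self):
        """Re-walk the chain; False if any entry was tampered with."""
        prev = "0" * 64
        for e in self.entries:
            if e["prev"] != prev:
                return False
            body = {k: v for k, v in e.items() if k != "hash"}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if digest != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.record("admin", "grant", "folder:/contracts", ts=1)
log.record("admin", "revoke", "folder:/contracts", ts=2)
print(log.verify())  # True: chain intact
```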

2. Ability to leverage existing infrastructure and processes

Customers want to leverage their existing infrastructure and processes instead of purchasing a new solution, and have no wish to reinvent their processes for managing data on a third-party cloud solution.  Customers already have processes and applications to perform backup, archival, provisioning and management of existing infrastructure, and they are unsure how to perform these functions within a cloud-based file sharing solution.

3. Ensuring reliability with accountability

IT organizations have defined service levels for their internal clients, and are accountable for the delivery of each service. If they don’t deliver, there is no question about whose responsibility it is.  Service levels associated with cloud-based file sharing must be negotiated like other third-party services – there are typically few guarantees of performance, and remedies for non-performance are limited.

4. Providing an intuitively simple user experience

Regardless of the solution, IT managers are very concerned about introducing a new user experience for their clients.  Most indicate that a different user experience will require training, increase the number of support calls, and reduce productivity, at least temporarily.  Ultimately, IT managers would like to leverage the user experience that their user population has already mastered.

5. Predictable expense

Typical cloud-based file sharing solutions are priced based on the amount of storage used, and storage requirements often grow at a surprising rate. Customers may need to negotiate storage costs with cloud providers on an ongoing basis.
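A quick back-of-the-envelope calculation illustrates why storage-based pricing is hard to budget: even modest monthly growth compounds quickly over a year. The starting size, growth rate and price per GB below are invented for illustration, not any vendor’s figures.

```python
# Project a year of storage-based billing under compound monthly growth.
def annual_cost(start_gb, monthly_growth, price_per_gb_month):
    total, gb = 0.0, start_gb
    for _ in range(12):
        total += gb * price_per_gb_month  # this month's bill
        gb *= 1 + monthly_growth          # storage grows before next month
    return total, gb

cost, final_gb = annual_cost(start_gb=500, monthly_growth=0.05,
                             price_per_gb_month=0.10)
print(round(final_gb))  # 898 -- 5% monthly growth nearly doubles storage in a year
```

Here the year-one bill (about $796 under these assumptions) comes out roughly a third higher than a flat projection from the starting size ($600) would suggest.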


To get a sense of where the PCI Data Security Standard (DSS) is heading, it helps to take a look beyond the actual language in the requirements. In August, PCI published a DSS 3.0 best practices document that provided additional context for the 12 DSS requirements and their almost 300 sub-controls. It’s well worth looking at. The key point is that PCI compliance is not a project you do once a year just for the official assessments.

The best practice is for DSS compliance to be a continual process: the controls should be well-integrated into daily IT operations and they should be monitored.

Hold that thought.

Clear and Present Dangers

One criticism of DSS is that it doesn’t take into account real-world threats. There’s some truth to this, though the standard has addressed the most common threats at least since version 2.0—these are the injection-style attacks we’ve written about.

In Requirement 6, “develop and maintain secure systems and applications,” there are sub-controls devoted to SQL and OS injection (6.5.1), buffer overflows (6.5.2), cross-site scripting (6.5.7), and cryptographic storage vulnerabilities (6.5.3)—think Pass the Hash. By my count, they’ve covered all the major bases—with one exception, which I’ll get to below.

The deeper problems are that these checks aren’t done on a more regular basis—as part of “business as usual”—and the official standard is not clear about what constitutes an adequate sample size when testing.

While it’s a PCI best practice to perform automated scanning for vulnerabilities and try to cover every port, file, URL, etc., this may not be practical in many scenarios, especially for large enterprises. Companies will then have to adopt a more selective testing regimen.

If you can’t test it all, then what constitutes an adequate sample?

This question is taken up in some detail in the PCI best practices. The answer they give is that the “samples must be sufficiently large to provide assurance that controls are implemented as expected.” Fair enough.
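One simple way to turn “sufficiently large” plus a risk profile into a concrete sampling decision is to allocate the testing budget in proportion to each system’s risk score. The sketch below is an illustration of that idea only – PCI does not prescribe this formula, and the system names and scores are invented.

```python
# Allocate a fixed testing budget across systems in proportion to risk,
# guaranteeing every system gets at least one test.
def risk_weighted_sample(systems, budget):
    """systems: dict of name -> risk score; budget: total tests to allocate."""
    total = sum(systems.values())
    return {name: max(1, round(budget * score / total))
            for name, score in systems.items()}

alloc = risk_weighted_sample(
    {"payment-gateway": 9, "web-frontend": 5, "hr-portal": 1}, budget=30)
print(alloc)  # {'payment-gateway': 18, 'web-frontend': 10, 'hr-portal': 2}
```

The point is simply that the allocation is driven by where cardholder data actually lives, rather than spread evenly across the estate.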

The other criterion that’s supposed to inform the sampling decision is an organization’s own risk profile.

Content at Risk

In other words, companies are supposed to know where cardholder data is located at all times, minimize what’s stored if possible, and make sure it’s protected. This information then should guide IT in deciding those apps and software on which to focus the testing efforts.

Not only should testing be performed more frequently; it’s also critical, according to PCI, to have a current inventory of the data that’s potentially hackable—let’s call it data at risk—and of the users who have access to it.

For Metadata Era readers, this is basically the Varonis “know your data” mantra. It becomes even more important because of a new attack vector that has not (yet) been directly addressed by PCI DSS. I’m referring to phishing and social engineering, which has been implicated in at least one of the major retail incidents in the last year.

Unlike the older style of injection attacks that targeted web and other back-end servers, phishing now opens the potential entry points to include every user’s desktop or laptop.

Effectively, any employee receiving a mail—an intern or the CEO—is at risk. Phishing obviously increases the chances of hackers getting inside, and therefore raises the stakes for knowing and monitoring your data at all times, not just once a year.


There’s been a lot of news lately in the health and fitness wearables space. Apple just announced it’s releasing an app called “Health”, as well as a cloud-based platform, “HealthKit”. Somewhat related, Nike recently pulled the plug on its activity-tracking FuelBand. The conventional wisdom is that fitness trackers are on the decline, while the wearables market in general—think Google Glass and the upcoming iWatch—is still waiting for its defining moment.

And on the privacy front?  In fact, there’s been a lot of movement there as well, and the FTC is all over it! They recently hosted a “Consumer Generated and Controlled Health Data” event, and all the speakers – the FTC Commissioner, technologists, attorneys, privacy experts – agreed that the potential of health-based wearables is huge, but since health data is so sensitive, it needs special protection.

I’ve distilled their privacy wisdom into 5 key things privacy experts want you to know about health data, the data generated from your wearables, your privacy and why it’s so hard to create one law that protects all.

1. Transparency and trust are essential

If health and fitness wearable makers create privacy policies that are ambiguous and don’t require consumer consent for data sharing, they may limit the benefits of these services for many people, especially those who are privacy conscious.  Why upload your health data when there’s no guarantee it will be kept private?

Some experts suggest short, clear-cut notices about the safety and protection of your data—something akin to a data nutrition label.


 2. Your health data gets around

Latanya Sweeney, the FTC Chief Technologist and Professor of Government and Technology at Harvard University, attempted to document and map all flows of data between patients, hospitals, insurance companies, etc. She learned that it’s not really clear where data is going and it’s difficult to know all the places it might wind up.

Inspired by Sweeney, I checked whether some healthcare data may find its way outside the medical ecosystem. It does! The recent FTC report on the data broker industry (see appendix B) shows that brokers collect some sensitive patient data points.


Source: DataMap

3. Discharge data in disarray

Information about your hospital visit is also known as discharge data. State law requires that this data be sent to whatever agency the law designates to receive it.

What do states do with your discharge data? It turns out that 33 states sell or share it, and of those 33, only 3 do so in a HIPAA-compliant way.


Source: DataMap

4. Geolocation is not to be overlooked

One very important privacy matter mentioned at the FTC event was geolocation. Many health and fitness apps and wearables mine data about your running routes or when you’re at the gym.  Some apps may also be able to predict where you’re going to be at a certain time or predict when you’re not home.
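To see how little it takes, here is a minimal sketch of how timestamped location pings could reveal a user’s home and their usual away-from-home hours. The data and field layout are hypothetical; real apps work with far richer signals.

```python
from collections import Counter

# Hypothetical location pings: (hour_of_day, rounded lat/lon "cell").
pings = [
    (23, (40.71, -74.00)), (2, (40.71, -74.00)), (3, (40.71, -74.00)),
    (9, (40.75, -73.99)), (10, (40.75, -73.99)), (18, (40.76, -73.98)),
    (22, (40.71, -74.00)), (1, (40.71, -74.00)), (11, (40.75, -73.99)),
]

# Guess "home" as the most common cell seen during nighttime hours.
night = Counter(cell for hour, cell in pings if hour >= 22 or hour <= 5)
home = night.most_common(1)[0][0]

# Hours when the user is routinely somewhere other than home.
away_hours = sorted({hour for hour, cell in pings if cell != home})
print("inferred home cell:", home)
print("routinely away at hours:", away_hours)
```

A handful of coarse pings is already enough to infer where someone sleeps and when their house is likely empty, which is exactly why geolocation deserves special care.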


5. There’s no free lunch

In exchange for a freemium health and fitness app, you are sharing A LOT of data. That’s not unusual in the free-app world, but sharing medical data is not the same as sharing your list of favorite movies.

Some users might trust the maker of their health app or device, say, Nike, but not realize that by using the product they’re consenting to having their health information sold and resold to third parties that may not be as trustworthy.

Jared Ho, an attorney in the FTC’s Mobile Technology Unit, tested 12 health and fitness apps and found that his data was sent to the developers’ websites as well as 76 third parties, mostly advertising and analytics organizations.

Here’s what he found:

  1. 18 of the 76 third parties collected device identifiers, such as a unique device ID.
  2. 14 of the 76 third parties collected consumer-specific identifiers, such as username, name, and email address.
  3. 22 of the 76 third parties received information about the consumers, such as exercise information, meal and diet information, medical symptoms, zip code, gender, and geolocation.


No one can predict what will happen in the wearables market, but emerging business practices and technologies will shape consumer privacy regulation, which remains a very hot topic. Questions about who does, and who should, have access to one’s personal health data will no doubt remain part of the discussion.


SANS Critical Security Controls (CSC) have been getting more attention over the last few years. As security experts come around to focusing on the actual techniques used by hackers, the SANS “offense informs defense” approach is resonating. And now, with the 2014 Verizon Data Breach Investigations Report (DBIR), the controls have received a new and important endorsement. The DBIR has for years acknowledged the CSC, but for the first time, the Verizon security team included a direct mapping between common attack patterns and the CSC.

One of the assumptions behind the SANS controls is that hackers will eventually get past the perimeter defenses, so you need to limit the data that’s available and accessible. You then back that up with secondary defenses that spot the unusual patterns that often indicate intruders are present, for example, by monitoring audit logs and account activity.

We’ll be discussing more about the SANS controls in a series of posts that we will be publishing soon. In the meantime, the SANS site has some interesting survey data that’s worth reviewing. In 2012, 600 IT professionals were interviewed about their organizations’ use of event logging technology.

Two results from this SANS survey got my attention and provided additional validation for ideas we’ve been talking about at the Metadata Era. First, the SANS data says that the single greatest challenge in integrating audit logs with other tools was “identification of key events from normal background activity”.

What’s stopping companies from gauging the baseline? It turns out that it’s not that easy, as readers of this blog already know, to establish what normal patterns look like using standard technology. Not easy, but not impossible: Varonis DatAdvantage learns typical file and email usage and can then alert IT staff when anomalies are detected.
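To make the baseline idea concrete, here is a minimal sketch (this is an illustration of the general approach, not Varonis code) that learns a per-user average of daily file accesses and flags days that deviate sharply from it:

```python
import statistics

# Hypothetical audit-log summary: daily file-access counts per user.
history = {
    "alice": [102, 98, 110, 95, 105, 101, 99],
    "bob":   [45, 50, 48, 52, 47, 49, 51],
}

def is_anomalous(user, todays_count, threshold=3.0):
    """Flag a count more than `threshold` standard deviations from the user's baseline."""
    baseline = history[user]
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return abs(todays_count - mean) > threshold * stdev

print(is_anomalous("alice", 104))   # a normal day's activity
print(is_anomalous("bob", 4300))    # the kind of spike bulk exfiltration produces
```

Real products model far more than a daily count (time of day, which folders, which devices), but the principle is the same: learn normal first, then alert on deviation.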

The survey suggests that IT pros are overwhelmed by the amount and complexity of data in the logs. The largest group, about 35%, said they spend none to a few hours a week looking at their logs, 10% said one day per week, and 11% said more than one day. By the way, the survey was weighted heavily towards enterprise-class companies. This second result tells us that half of the companies in the survey are not conducting log analysis in anything close to real-time.

I’m not surprised by these data points. Our own Red Alert Research Report yielded similarly dispiriting survey data: we learned that only 6% of companies had fully automated breach-notification alerts.

One of our conclusions was that IT security should limit, in the first place, the amount of sensitive information available to hackers. Of course, real-time monitoring (the security cameras) is an important SANS control and a critical part of any “plan B” defense.

But organizations still need an enforceable policy in place to prevent users from leaving valuable assets out in the open: IP and other sensitive data in poorly permissioned folders. This can be accomplished by routinely classifying files and searching them for PII, and by regularly monitoring folders to hunt down and eliminate broad permissions to this data.
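As a toy illustration of the classification step (the pattern names and regexes below are my own, not those of any commercial engine), a scan for a few common PII patterns might look like:

```python
import re

# Hypothetical regexes for a few common US PII patterns.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def classify(text):
    """Return the sorted PII categories found in a document's text."""
    return sorted(name for name, pat in PII_PATTERNS.items() if pat.search(text))

doc = "Contact jane.doe@example.com, SSN 123-45-6789, card 4111 1111 1111 1111."
print(classify(doc))
```

Production classification engines go further, using checksums (like the Luhn test for card numbers) and surrounding context to cut false positives, but even a crude scan like this surfaces folders that should never be broadly readable.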

It’s low-hanging security fruit that, by the way, the Varonis IDU Classification Engine was designed to snatch up and quickly resolve.

Thanks to http://www.varonis.com

Corporate Espionage: Hacking A Company Through A Chinese Restaurant Takeout Menu

Corporate espionage (industrial espionage) is a favorite topic of mine. I have written and presented on the subject quite a bit and, while I am never sure how my readers react when I write about this, I do carefully watch the look on my audience members’ faces when I first mention the issue. The story their eyes tell is interesting.

The story of “why should I care about this?”

At first they usually have a glazed-over look with no emotion or reaction — as if they are thinking “this is just another lawyer using fancy lawyer words, but whatever he is talking about, it doesn’t apply to anything that I do” — and they politely sit there, feigning attention.

And then, I tell them about the cases where Chinese state-sponsored groups had “insiders” planted in companies like Motorola or DuPont to steal their proprietary trade secrets. Their reaction does not change — as if they are thinking “yeah, ok, whatever, my company is not Motorola or DuPont or anything like it — we are a small shop and nobody cares that much about what we have.”

And then, trying to get their attention with something they have heard about, I mention Target and the massive and expensive Target breach. Their reaction does not change — as if they are thinking “dude, why are you telling me this? My company is nothing like Target — we could barely even be a supplier to Target, why would anyone care about us?”

And then, I ask them if they have ever heard of Fazio Mechanical Services — knowing they have no idea of who that is.

Blank stares.

So I ask them to raise their hands if they’ve ever heard of Fazio Mechanical Services — and usually no one raises their hands but at least now they are listening …

So I go on to explain that

  • Fazio Mechanical Services is (or should I say was) a vendor to Target, and it was a breach of Fazio’s computer system, through an email spear-phishing attack, that ultimately allowed the hackers to breach the Target system;
  • While no one may have cared about Fazio’s own information, Fazio’s system was very valuable to the hackers because it provided an intrusion point into the Target system — which made attacking Fazio very valuable, strategically;
  • Hackers are smart and very strategic; now that they have seen how effective indirect methods, such as going through third-party vendors, can be against a primary target, they will likely use them again;
  • Even if they do not believe their own company is a high-value target, if one of their suppliers, vendors, or other business associates is, it could be their system that becomes the intrusion point used to reach the high-value target; and
  • If that were to happen, their business would likely be the next Fazio, and they would probably be looking for new employment.

What does this have to do with hacking through a Chinese Restaurant Takeout Menu (website)?

This usually brings the abstract notion of “corporate espionage” to reality for them. I was reminded of this when I read a recent article in The New York Times titled Hackers Lurking in Vents and Soda Machines, which provides a great explanation of how hackers use this indirect method of attack on their primary targets. Here are a few poignant quotes, but you should read the whole article:

Unable to breach the computer network at a big oil company, hackers infected with malware the online menu of a Chinese restaurant that was popular with employees. When the workers browsed the menu, they inadvertently downloaded code that gave the attackers a foothold in the business’s vast computer network.

*   *   *

Hackers in the recent Target payment card breach gained access to the retailer’s records through its heating and cooling system. In other cases, hackers have used printers, thermostats and videoconferencing equipment.

Companies have always needed to be diligent in keeping ahead of hackers — email and leaky employee devices are an old problem — but the situation has grown increasingly complex and urgent as countless third parties are granted remote access to corporate systems. This access comes through software controlling all kinds of services a company needs: heating, ventilation and air-conditioning; billing, expense and human-resources management systems; graphics and data analytics functions; health insurance providers; and even vending machines.

Full Article: http://www.nytimes.com/2014/04/08/technology/the-spy-in-the-soda-machine.html?ref=technology&_r=0.

This is a serious problem, and your company needs to pay attention to it, even if no one in your company likes Chinese takeout.

Thanks to http://shawnetuma.com/2014/04/15/corporate-espionage-hacking-a-company-through-a-chinese-restaurant-takeout-menu/