Excellent video from Citrix. C24 are a specialist application hosting and delivery organisation focused on delivering your business applications at speed. The solutions we deliver enable you to log on anywhere, on any device, at any time. For further information please visit http://www.c24.co.uk
Organizations hoping that distributed denial of service (DDoS) attacks are no longer the incident du jour and are beginning to slow down can think again: there were more attacks in 2012, and they aren’t going away, according to Neustar.
A little over a third, or 35 percent, of organizations in the survey experienced some form of disruptive DDoS attack in 2012, Neustar found in its second DDoS Survey, released Wednesday. Retailers and e-commerce businesses were among the top three industry sectors targeted, with 39 percent and 41 percent, respectively, reporting attacks in 2012. Financial services organizations, many of which battled waves of attacks last fall as part of Operation Ababil, were the most targeted, at 44 percent.
Back in February, Neustar surveyed 704 IT professionals in North America about how their organizations managed DDoS attacks. When hit with a distributed denial of service attack, organizations generally go into “crisis” mode, as everyone from the IT department to customer service does whatever is necessary to get past the threat.
“The consequences of being unprepared to mitigate a DDoS attack can be crippling to businesses,” Alex Berry, a senior vice-president of enterprise services at Neustar, said in a statement.
Slightly more than a quarter of survey participants indicated that DDoS-related outages cost their organizations anywhere between $50,000 and $100,000 an hour, or up to $2.4 million a day, the study found. About 74 percent of respondents projected outage costs of $10,000 per hour, or $240,000 a day.
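The per-day figures above are just the hourly estimates multiplied out over 24 hours; a quick sketch of the arithmetic (the dollar amounts come from the survey, the function name is ours):

```python
# Illustrative arithmetic only; the hourly figures come from the Neustar survey.

def daily_outage_cost(hourly_cost):
    """Project a full day's outage cost from an hourly estimate."""
    return hourly_cost * 24

# Upper estimate reported by ~26% of respondents: $100,000/hour
print(daily_outage_cost(100_000))  # 2400000 -> the $2.4 million/day figure

# Estimate projected by ~74% of respondents: $10,000/hour
print(daily_outage_cost(10_000))   # 240000 -> the $240,000/day figure
```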
The damage isn’t just revenue loss, however, but “about erosion in trust, brand value, and reputation,” Berry said. Nearly a third of the respondents said DDoS mitigation required time and related expenses of six or more employees.
While large attacks, such as those serious enough to raise the specter of a DDoS Armageddon, grab headlines, more than 70 percent of the attacks were less than 100 Mbps in network size or less than 100 Kpps in packets, Neustar found. Only two percent of the attacks in 2012 approached SpamHaus levels, with more than 20 Gbps of malicious traffic targeting the network.
While about 63 percent of the attacks lasted less than a day, the remainder lasted more than 24 hours, with 17 percent running between one and two days. More organizations are seeing attacks that last more than a week, according to the survey.
“A well-crafted, multi-vector attack of just 2Gbps can bring most Websites to their knees,” Neustar said.
While companies are increasingly investigating DDoS protection, they aren’t investing in the right solutions or moving fast enough. Only 8 percent of IT administrators in Neustar’s survey admitted to having no protection in place, a dramatic improvement from the 25 percent who reported no protection last year.
About two-thirds of the companies use firewalls, routers, and switches to manage DDoS attacks, the survey found. In fact, Neustar found a 10 percent increase year-over-year in organizations using firewalls, switches, and routers for DDoS defenses. These networking products are not intended to filter out and block an overwhelming volume of malicious traffic, and wind up creating bottlenecks which help the attacks succeed, Neustar said.
“Few have invested in purpose-built hardware or third party expertise,” Neustar said.
We must welcome the future… and we must respect the past, remembering that it was once all that was humanly possible.
We will never know for sure what American philosopher George Santayana was referring to when he uttered his famous lesson, but we can certainly apply his insights as we attempt to comprehend the potential impact of technological advances as significant as cloud computing. In predicting the future of cloud computing, we must first take note of its past.
Each year, the world’s leading IT research and advisory firm Gartner identifies the top 10 strategic technologies that will have significant impact on organizations in the coming year. Gartner defines a strategic technology as one with the potential for significant impact on the enterprise in the next three years.
David Cearley, a Fellow at Gartner, explains that “the technologies [on these lists] will be strategic for most organizations, and IT leaders should use this list in their strategic planning process by reviewing the technologies and how they fit into their expected needs.”
I decided to do a simple tabulation of Gartner’s lists for both 2012 and 2013 (next year’s list hasn’t been released yet) to assess whether Gartner had identified cloud computing-related technologies among their top 10 strategic technologies in the last two years.
The results are tabulated below:
| Top 10 Strategic Techs for 2012 | Cloud-Related | Top 10 Strategic Techs for 2013 | Cloud-Related |
|---|---|---|---|
| Media Tablets | No | Mobile Experiences | Yes |
| Mobile Applications | No | HTML 5 | No |
| Social User Experience | No | Personal Cloud | Yes |
| Intelligence Sensors | No | Intelligence Sensors | No |
| App Stores | No | Hybrid Cloud Computing | Yes |
| Next-gen Analytics | Yes | Strategic Big Data | No |
| Data Management | No | Actionable Analytics | Yes |
| In-memory Computing | No | In-memory Computing | No |
| Low-energy Servers | No | Integrated Ecosystems | Yes |
| Cloud Computing | Yes | Enterprise App Stores | No |
| **Cloud-related techs** | **2** | **Cloud-related techs** | **5** |
The number of cloud computing-related technologies on the Gartner Strategic Technology list has more than doubled from 2 to 5 over the last two years. From these results alone, it is clear that cloud computing has had a significant influence on the global IT industry.
In February 2012, Gartner released a research paper entitled “Five Cloud Computing Trends that Will Affect Your Cloud Strategy Through 2015” (Cearley, D. and Smith, D.). Inside, there is a brilliant exposé on the anticipation surrounding cloud computing.
Here’s an excerpt from the paper:

> Cloud computing is a major technology trend that has permeated the market over the last two years. … Cloud computing has a significant potential impact on every aspect of IT and how users access applications, information and business services. Although the potential is significant, the breadth and depth of the impact and the level of adoption over time is uncertain. The trend and related technologies continue to evolve and change rapidly, and there is continuing confusion and misunderstanding as vendors increasingly hype “cloud” as a marketing term. This level of impact, confusion, uncertainty and change make cloud computing one of Gartner’s top 10 strategic technology trends to address.
It is remarkable to note the variety of descriptors applied to cloud computing in that final sentence: “impact, confusion, uncertainty and change”. Even the IT experts are unsure how this will evolve! Salesforce.com provided a comprehensive look at how data communication and management have evolved and converged towards cloud computing.
Thanks to http://risingwiththecloud.wordpress.com/
Even though Microsoft SharePoint is widely deployed throughout enterprises and SMBs as a collaboration platform, a shocking two-thirds of SharePoint-using companies in a recent survey have admitted to having ‘no active security policy’ in place for the application.
The situation translates to a smorgasbord of opportunity for a hungry information-hijacker, but one which could soon turn into an all-you-can-eat buffet. The study, carried out by Emedia and provided to Infosecurity on an exclusive basis, investigated a wide range of businesses from 25 through to 5000+ PC users. The study found that while about half (52%) of those surveyed were currently using SharePoint, the other half planned to adopt the application once its social networking enhancements were live.
“This is a data leakage time bomb,” said security specialist and UK Accounting Standards Board member Steve Bailey. “SharePoint is a very widely-used medium, and it’s growing fast, so it is remarkable that IT-savvy users are disregarding the security implications. This could be down to complacency, confusion as to where the responsibility for developing such a policy lies, or simply lack of awareness.”
Whatever the root cause, he noted that in many organizations, SharePoint use has grown organically to “become part of the fabric of the business without being subject to mainstream security controls.”
The employees themselves are part of the problem, but how to implement an IT policy that makes sense is a conundrum for many IT professionals – contributing to the lack of IT policy.
“Banning data sharing is not the solution – that’s both impractical and undesirable,” said Martin Sugden, CEO at Boldon James, which sponsored the study. “In fact, refusing to share data is inefficient and potentially dangerous. What’s important is striking the balance between the need to protect information and the need to share it.”
The survey concluded that a protective marking solution for labeling the data’s level of sensitivity needs to be implemented. Many government agencies use protective marking to minimize inadvertent disclosure of confidential information, while commercial organizations employ protective marking to control intellectual property or information containing customer data.
By clearly identifying sensitive information using a classification solution, it becomes easier to ensure that access control methodology is correctly connecting the right users to the right data, Sugden noted.
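As a rough illustration of how protective marking connects classification to access control, here is a minimal sketch. The label names and the "clearance must meet or exceed marking" rule are our own assumptions for illustration, not taken from any particular government scheme or product:

```python
# Hypothetical sketch of protective marking: every document carries a
# sensitivity label, and access checks compare a user's clearance against it.
# Label names and the ordering rule are illustrative assumptions.

from enum import IntEnum

class Marking(IntEnum):
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2
    RESTRICTED = 3

def can_access(user_clearance: Marking, doc_marking: Marking) -> bool:
    """A user may read a document only if cleared to at least its marking."""
    return user_clearance >= doc_marking

doc = {"title": "Q3 customer list", "marking": Marking.CONFIDENTIAL}
print(can_access(Marking.INTERNAL, doc["marking"]))    # False
print(can_access(Marking.RESTRICTED, doc["marking"]))  # True
```

Once every email and document carries such a label, the access-control question Sugden raises becomes mechanical: compare the label against the requester's clearance.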
Yet the study discovered that 65% of respondents are not yet marking any of their data. A very low 9% of respondents said they protectively mark all emails, and the same percentage said they do the same for all documents. Only 17% of respondents said they mark all email and documents.
“When you consider that hundreds – and even thousands – of users could be accessing your SharePoint server, it makes sense to have a solid SharePoint security policy in place,” added Sugden. “[SharePoint] is a superb tool for creating routes into your data, but you can’t let your user group have unfettered access to data without giving them some method of understanding how sensitive it is – that’s why you have to label.
Steve Bailey warned, “Any business that relies on SharePoint to store sensitive or confidential data should always ensure that its users understand their responsibilities for the safe handling of that information. With the advent of BYOD this extends to employees and associates.”
He cautioned that recent high-profile breaches should serve as object lessons. “Otherwise we’ll have more examples such as the Police email that, according to the [UK's] Information Commissioner’s Office (ICO) ‘contained 863 pieces of personal information’. Police accidentally sent the email containing the results of 10,000 checks with the Criminal Records Bureau (CRB) to a reporter when a staff member copied the wrong person into a message.”
Thanks to http://www.thethreatvector.wordpress.com
It’s no surprise that in the wake of the rapid increase in cyber attacks, governments around the world are moving towards strengthening their cyber security, and even taking steps to mandate better collaboration on security issues between the private and public sectors. Here is a sample of the most recent initiatives:
- US – Feb-2013: Obama Orders Cybersecurity Standards for Infrastructure
- European Union – Feb-2013: EU Unveils New Cybersecurity Policy
- Italy – Jan-2013: Italian Government Approves Cybersecurity Measures to strengthen online security and protect critical infrastructure from increasing cyber assaults
- India – Jan-2013: India Developing National Cybersecurity Architecture. India is in the midst of developing a national cybersecurity architecture aimed at preventing sabotage and espionage of its core IT systems and networks
- Australia – Jan-2013: Australia toughens stance on cybersecurity
- Russia – Jan-2013: The Russian Federal Security Service gets empowered to create a state system for the detection, prevention and liquidation of the effects of computer attacks on the information resources of the Russian Federation
There are important common factors in all the above:
First, a global appeal for stronger collaboration between the public and private sectors to share intelligence on cyber attacks. Under existing EU rules, telecommunication companies are already required to report significant security incidents. Wade Williamson, one of our in-house experts on cyber threats, recently wrote on this blog about “Combating Emerging Threats Through Security Collaboration”.
Secondly, a shared understanding that the global economy is highly dependent on critical infrastructure that might not be as secure as initially thought. For example, the U.S. executive order specifically mentions power grids, pipelines and water systems.
Finally, full awareness that much of the critical infrastructure supporting a thriving, modern economy relies on a set of interconnected networks and systems that must be closely monitored and protected. The proposed European directive calls out the need for resilient, safe, and stable networks and systems.
One takeaway for our customers is that network security is being more systematically called out in cybersecurity discussions worldwide and is even taking center stage. Some analysts have commented that network security will remain the largest cybersecurity submarket for the next 10 years.
Why? Even as SaaS applications, social networking, mobile devices, or cloud-based computing become mainstream and push the limit of the traditional enterprise perimeter, the network and the firewalls remain the one place where organizations in both the public and private sectors can see all traffic and actually enforce security policy.
Thanks to http://www.thethreatvector.wordpress.com
451 Group (a research analyst firm) recently posted a meaningful blog article on this topic – http://idoneous-security.blogspot.com/ . Worth taking a moment to read as it effectively highlights what the security industry faces every day.
By confessing that its mistakes led to security breaches at three customers, Bit9 has sparked debate over whether the industry is ready to block hackers that see vendors as the door to other companies.
Bit9 disclosed last week that cybercriminals stole digital code-signing certificates from its computers and then used them to drop malware in the systems of three unidentified customers. The vendor acknowledged that the theft occurred on computers that it had failed to protect with its own product, which allows only software on a whitelist to run.
The hack is troublesome because cybercriminals are increasingly attacking vendors in order to steal technology that can be used to penetrate their customers’ systems. Vendors whose computers have been breached over the last few years include RSA, Symantec, DigiNotar, Verisign and Comodo.
Bit9 revealed in a blog post on Saturday that the attackers were using the vendor as a way in to its customers. “We can only speculate, but we believe the attack on us was part of a larger campaign against a particular and narrow set of companies,” wrote Harry Sverdlove, chief technology officer for Bit9.
Customers should not expect vendors to be any more prepared to fend off attackers than most corporations. “Security vendors are no different than their customers and suffer from the same challenges: dealing with the threat landscape, and not having enough time/staffing/resources to accomplish all objectives,” said Rick Holland, an analyst at Forrester Research.
In fact, most security vendors are much smaller than enterprises and often have fewer resources available for security operations, Holland said. “Startups have even more challenges as investments in R&D, marketing and sales trump technology investments.”
Overall, vendors are more focused than their customers on implementing security best practices, experts say.
“Ultimately vendors of security need to be even more secure than the people that their technologies are protecting,” said Charles Kolodgy, an analyst at IDC. “Most security vendors understand this and I do believe they are better prepared to protect their intellectual property, but mistakes happen so those need to continue to be reduced.”
One certainty is that no organization can be 100% protected against a security breach, and everyone is struggling to secure their systems against attackers who are constantly developing new tactics and technology.
“It’s the art of war,” said Gartner analyst Lawrence Pingree.
To better serve customers, vendors need procedures in place to quickly handle the consequences of the inevitable hack, Pingree said.
Scott Crawford, analyst for Enterprise Management Associates, said customers need to question their vendors about their security practices. “The focus in the security industry has been pretty similar to the rest of the technology vendor world, which is primarily on getting products out,” Crawford said.
Vendors should be drilled on how well they understand the impact a breach would have on customers, Crawford said. Other discussions should include their secure development practices.
“Do they treat their products the same way that someone designing products for aerospace, defense and the military would?” he said.
Experts agree that the security industry as a whole could benefit from sharing more information about malware and attacks. Currently, there are competing data-sharing frameworks sponsored by government groups, security vendors and the information technology industry, Holland said. Leading groups include OpenIOC, MITRE, and IODEF.
“If I were optimistic in the federal government’s ability to understand the issues as well as execute, I’d welcome legislation in this area,” Holland said.
With 2012 coming to a close, I decided to take a look back at some of the year’s more significant hacks. Two of the largest heists involved thefts of millions of records of personal data. In March, Global Payments, a credit card processor, revealed a breach in which at least 1.5 million credit card numbers were exported. And the year began when hackers targeted Zappos, the online shoe retailer, and relieved this e-tailer of over 24 million rows of email addresses and other data.
Based on these gigantic incidents, I thought this was the year of the Big Hack and a unique turning point. For perspective, I reviewed two years’ worth of Verizon’s indispensable Data Breach Investigations Reports. The DBIR is based on data collected from the US Secret Service and the Dutch National High Tech Crime Unit. For 2011, Verizon reported 855 incidents and 174 million records compromised. Last year was the second-highest data loss recorded since Verizon began this study in 2004.
I’m not sure if 2012 hacking levels will surpass 2011, and neither of these two years will come close to the 360 million records compromised in 2008. However, there are other trends that seem to have remained relatively constant.
In recent years, the top three industry sectors breached have been hospitality (read: restaurants), retail, and financial services. No surprises here.
Another common theme in the report is that poor authorization monitoring and procedures often broaden the damage done by attackers. Verizon suggests that companies should constantly be on the lookout for new files, especially growing archive and log files, with unusual attribute settings. These often indicate an attack in progress.
The DBIR also tells us that straightforward hacking—using default passwords, stolen login credentials, or backdoor attacks—is still a very effective way to extract protected data.
One revealing stat is that most of the records hacked in the last few years have not involved credit card numbers. The winner in the most-hacked-data category instead goes to plain old PII—name, address, and social security number.
So how do Global Payments and Zappos match up with the overall trends? Depressingly, these two incidents fit like a glove. Financial or retail? Check. External attack? Yes. Straightforward hack? It seems so, and no malware was involved that we know about.
For both Global Payments and Zappos, the actual exploits used are still a little fuzzy. According to Gartner Research’s Avivah Litan, the Global Payments attacker may have been able to get through the company’s knowledge-based authentication layer by answering questions correctly. This is still just speculation. Here’s what we do know: Global Payments was PCI-DSS compliant. Visa and Mastercard have since revoked their certification.
Zappos, which is also PCI-DSS compliant, kept their credit card numbers encrypted and separated from other personal information. Hackers were not able to access the “PANs”—PCI lingo for the card numbers. Zappos has kept their certification.
The most eye-opening part of Verizon’s DBIR can be found in their conclusions. Not to put too fine a point on this, but companies are simply not making the attackers work very hard. It’s not that they are so clever; it’s that IT has been a bit lax.
Here’s some of their all-too-familiar advice:
- change default credentials
- review user accounts on a regular basis
- restrict and monitor privileged users
On that last point, I’ll quote the actual text from the DBIR:
“Don’t give users more privileges than they need (this is a biggie) and use separation of duties. Make sure they have direction (they know policies and expectations) and supervision (to make sure they adhere to them). Privileged use should be logged and generate messages to management.”
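That last recommendation, logging privileged use and surfacing it to management, can be sketched in a few lines. This is our own illustrative pattern, not Verizon's or Varonis's implementation; the alerting channel is a placeholder:

```python
# Minimal sketch of privileged-use logging: wrap each privileged action so
# that who ran it, and on what, is recorded for management review.

import functools
import logging

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("privileged-use")

def privileged(action_name):
    """Decorator that records who ran a privileged action, and with what."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(user, *args, **kwargs):
            # In a real setup this would also feed an alerting channel.
            audit_log.info("user=%s action=%s args=%r", user, action_name, args)
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@privileged("reset_password")
def reset_password(user, target_account):
    # ... actual reset logic would go here ...
    return f"password reset for {target_account} by {user}"

print(reset_password("admin_jane", "bsmith"))
```

The point is separation of duties in code form: the person performing the action is not the one reviewing the trail it leaves.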
Speaking as a Varonis blogger, I couldn’t have said it better.
Let’s hope some of this advice takes hold, and 2013 will be a more forgettable year in hacking annals.
C24 are extremely proud to hear that one of our major clients, Origin Enterprises, has recently been presented with the award ‘ICT in Manufacturing’ by the magazine The Manufacturer. It was awarded for ‘the implementation and management of an effective information and communication technology infrastructure that has brought improved competitive positioning and operational excellence through an engaged user-base’.
The judging criteria were strict, evaluating areas that included:

a) Is the ICT deployment aligned with the business aims and objectives, with clearly defined short-term and long-term goals?
b) Has the ICT infrastructure investment been effective over the last three years?
c) Is the ICT infrastructure streamlined so that it is fully integrated across all business functions?
d) The level of success the project has shown in fulfilling its objectives
e) Quantifiable returns on the ICT investment
f) Is the company assessing or applying leading and/or advanced IT solutions?
Comment from Paul Hemming, MD, C24 Ltd
This is fantastic recognition for Origin Enterprises and a reward for all the hard work their team has done over the last 18 months. The C24 and Origin Enterprises teams have been working together on various key IT projects, both on-premise and hosted, for a considerable time, and we are proud to have helped them achieve such a prestigious award.
Comment from Derek Wilson, CIO, Origin Enterprises
C24 provide Origin Enterprises, and our associated businesses, with a range of managed IT services, including the management and deployment of our Microsoft Dynamics ERP infrastructure solution and the delivery of our warehouse management system. Since our initial engagement the two companies have developed a key strategic relationship that has seen C24 not only manage the day-to-day delivery of our business applications infrastructure but also become involved in a number of other critical on-premise IT deployments.
The C24 relationship is a key element to the success of our current hosted IT infrastructure and we can only see the relationship continuing to develop in the future.
For more information about C24 please visit www.c24.co.uk
Now that we have a pretty good idea where the highest-risk data is, the question naturally turns to reducing that risk. Fixing permissions problems on Windows, SharePoint or Exchange has always been a significant operational challenge. I’ve been in plenty of situations as an admin where I know something is broken—a SharePoint site open to Authenticated Users for instance—but I’ve felt powerless to actually address the problem since any permissions change carries the risk of denying access to a user (or process) who needs it. Mistakes can have significant business impact depending on whose access you broke and on what data. Since we’re defining “at-risk” as being valuable data that’s over-exposed, that means that any accessibility problems we create will impact valuable data, and that can create more problems than we started with.
Step 3: Remediate High-Risk Data
The goal is to reduce risk by reducing permissions for those users or processes that don’t require access to the data in question.
The next step in the Varonis Operational Plan is fixing those high-risk access control issues that we’ve identified: data open to global access groups as well as concentrations of sensitive information open to either global groups or groups with many users. Since simply reducing access without any context can cause problems, we need to leverage metadata and automation through DatAdvantage.
Let’s tackle global access first. When everyone can access data, it’s very difficult to know who among the large set of potential users actually needs that access. If we know exactly who’s touching the data, we can be surgical about reducing access without causing any headaches.
DatAdvantage analyzes the data’s audit record over time in conjunction with access controls, showing folders, SharePoint sites, and other repositories that are accessible by global access groups, and those users who have been accessing that data who wouldn’t have had access without a global access group. In effect, it’s doing an environment-wide simulation to answer the question: “What if I removed every global access group from every ACL tomorrow? Who would be affected?” This report gives you some key information:
- Which data is open to global access groups
- Which part of that data is being accessed by users who wouldn’t otherwise be able to access it
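The simulation itself can be sketched in a few lines. This is a hedged illustration of the idea, with invented data shapes; DatAdvantage's internals are not public and certainly differ:

```python
# Sketch of the "what if we removed global groups" analysis: given an audit
# log of who touched which folder and the group memberships behind each ACL,
# find users whose observed access depends solely on a global group.
# Group names and data shapes are illustrative assumptions.

GLOBAL_GROUPS = {"Everyone", "Authenticated Users", "Domain Users"}

def at_risk_access(acls, memberships, audit_log):
    """
    acls: {folder: set of groups on its ACL}
    memberships: {group: set of users}
    audit_log: iterable of (user, folder) access events
    Returns {folder: set of users} who accessed the folder but would lose
    access if every global group were stripped from its ACL.
    """
    affected = {}
    for user, folder in audit_log:
        non_global = acls.get(folder, set()) - GLOBAL_GROUPS
        allowed_without_global = any(user in memberships.get(g, set())
                                     for g in non_global)
        if not allowed_without_global:
            affected.setdefault(folder, set()).add(user)
    return affected

acls = {"/finance": {"Everyone", "FinanceTeam"}}
memberships = {"FinanceTeam": {"alice"}}
audit_log = [("alice", "/finance"), ("bob", "/finance")]
print(at_risk_access(acls, memberships, audit_log))  # {'/finance': {'bob'}}
```

Here alice keeps access through FinanceTeam, so only bob would be affected by removing "Everyone"; those are the users whose access needs an explicit decision before remediation.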
And it’s not just global groups that DatAdvantage lets you do this with. Because every data touch by every user on every monitored server is logged, Varonis lets you do this kind of analysis for any user, in any group, on any file or folder. That means you can safely remediate access to all of the high-risk data without risking productivity. You can actually fix the problem without getting in anyone’s way.
The next step is to start shifting decision making from your IT staff to the people who actually should be making choices about who gets access to data: data owners.
- Data Migration a Security Threat: Varonis (c24.co.uk)
- At-Risk Exchange Data (c24.co.uk)
- 12 Tips to Prevent your Sensitive Data Becoming a Wikileaks Headline (c24.co.uk)
- PCI DSS do not become the Weakest link (c24.co.uk)